Learning DAG-structured inter-variable dependencies from observational data has wide-ranging potential, owing to the explainability that today's high-stakes applications of artificial intelligence demand. In many scientific disciplines, the data generation mechanism exhibits time-varying characteristics, which calls for effective methods for time-varying structure learning from time-series data. Moreover, many practical time-varying systems exhibit state-transitioning behavior, and state space models offer strong generalization ability and interpretable results, making it appealing to learn time-varying systems with state space models. Motivated by these observations, we study the novel problem of jointly monitoring discrete hidden states and learning dynamic structure from multivariate time-series data, and introduce the State-Regularized Vector Autoregression Model (SRVAR). SRVAR exploits a state-regularized recurrent neural network to discover the underlying finite discrete state transition pattern, while leveraging a dynamic vector autoregression model together with a recent algebraic result to learn state-dependent inter-variable dependencies. Extensive experiments on simulated data as well as a real-world dataset show that SRVAR outperforms state-of-the-art baselines at recovering unobserved state transitions and discovering state-dependent inter-variable relationships.