Long-term seismic data accumulation (known as Large-T) and ultra-dense seismic instrumentation (known as Large-N) are the two major drivers of the rapid growth in the quantity of seismic data. Long-term accumulated seismic data provide indispensable resources for seismological research. In particular, permanent high-quality seismic networks (e.g., the SoCal network, Hi-net) that have operated for decades are regularly used in seismological research and have greatly advanced our understanding of earthquake phenomena and the Earth’s structure. On the other hand, temporary seismic networks, which commonly operate for weeks to years, have become increasingly dense in order to achieve high-resolution imaging of the Earth; nodal arrays and distributed acoustic sensing (DAS) are outstanding examples. This unprecedented volume of seismic data poses great challenges for processing and analysis techniques in modern seismology. Correspondingly, machine learning now provides a large collection of tools for handling voluminous data. In this talk, I will briefly introduce the emerging global race in big seismic data and machine learning seismology, and present a holistic view of our recent efforts in 1) Large-N microseismic detection using both nodal array and DAS data; and 2) machine learning applications to Large-T data, with emphasis on earthquake early warning.