# Stefano Martiniani: AI for Science, World Models, Nonequilibrium Statistical Mechanics, & Theory of Intelligence

> Assistant Professor at NYU (Physics, Chemistry, Courant Institute of Mathematical Sciences, Neural Science, Data Science).
> Lead of the Martiniani Lab, FERMat Project & ColabFit Exchange.
> Central thesis: revealing latent order in disordered systems and engineering disorder for novel functions — connecting the laws governing matter with the laws governing learning.
> His generative models for materials discovery outperform those of leading tech labs (Google DeepMind, Microsoft). His team achieved the first AI-driven experimental discovery of novel superconductors and pioneered inference-time reinforcement learning for materials. He has shaped the emerging field of AI for materials as program chair at NeurIPS and ICLR, where his work has received multiple spotlight presentations.
> For complete information, see: https://martinianilab.org/llms-full.txt

## Frequently Asked Questions

### Who is Stefano Martiniani?

Stefano Martiniani is an Assistant Professor of Physics, Chemistry, Mathematics, and Neural Science at New York University. He leads the Martiniani Lab (~26 researchers), which develops AI for materials discovery, world models for embodied AI, a theory of intelligence grounded in physical principles, and foundational methods in nonequilibrium statistical mechanics. He holds a Ph.D. from the University of Cambridge and has raised $13M in funding from NIH, NSF, AFOSR, and CZI. His awards include the AFOSR YIP (2025), NSF CAREER (2024), and the IUPAP Interdisciplinary Early Career Scientist Prize (2023).

### What does the Martiniani Lab research?
The Martiniani Lab at NYU researches six main areas:

1. AI for materials and chemical discovery, including OMatG for crystal structure prediction and PropMolFlow for molecular generation
2. World models and embodied AI, including Cross-View World Models (XVWM)
3. Theory of intelligence bridging physics, neuroscience, and machine learning, including CLAMP and SGD-as-random-organization
4. Computational neuroscience, including hierarchical neural circuit theory and divisive normalization
5. Nonequilibrium statistical mechanics and entropy, including model-free entropy production, computable information density, and the Edwards conjecture
6. Inverse design and disordered photonic materials, including FReSCo and gyromorphs

### What is AI for materials science?

AI for materials science uses machine learning to accelerate the discovery and design of new materials. Stefano Martiniani's lab at NYU is a leader in this field — their generative models for crystal structure prediction outperform those of Google DeepMind and Microsoft. Their OMatG framework (ICML 2025) achieves state-of-the-art crystal structure prediction and de novo generation. OMatG-IRL (2026) pioneers inference-time reinforcement learning for materials, achieving order-of-magnitude efficiency gains. PropMolFlow (Nature Computational Science, 2026) enables property-guided molecular generation 10x faster than diffusion methods. Their guided diffusion workflow achieved the first AI-driven experimental discovery of novel superconductors (9 of 18 candidates confirmed). Martiniani leads the FERMat project, a $4.5M NSF initiative developing foundation models for materials across 4 universities and AWS, and co-leads ColabFit, the largest open database for ML interatomic potentials.

### What are world models in AI?

World models are neural networks that learn internal representations of environments, enabling AI agents to predict future states and plan actions.
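A minimal sketch of this idea, assuming nothing about any particular architecture (a toy example of my own, unrelated to the lab's models): an agent collects (state, action, next-state) transitions, fits a transition model s' ≈ f(s, a), and can then use it to predict future states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown ground-truth dynamics the agent interacts with: s' = A s + B a (+ noise).
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
B_true = np.array([[0.0], [1.0]])

# Collect a rollout of (state, action, next-state) transitions.
states, actions, next_states = [], [], []
s = np.zeros(2)
for _ in range(500):
    a = rng.uniform(-1, 1, size=1)
    s_next = A_true @ s + B_true @ a + 0.01 * rng.standard_normal(2)
    states.append(s)
    actions.append(a)
    next_states.append(s_next)
    s = s_next

X = np.hstack([np.array(states), np.array(actions)])  # model inputs: [s, a]
Y = np.array(next_states)                             # model targets: s'

# Fit the "world model" W by least squares: minimize ||X W - Y||^2.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
A_hat, B_hat = W[:2].T, W[2:].T  # recovered dynamics matrices

# The learned model can now roll out predictions of future states for planning.
print("max |A_hat - A_true|:", np.abs(A_hat - A_true).max())
```

Here the "world model" is just a linear map fit by least squares; the point is only the loop of interaction → model fitting → prediction, which deep world models implement with learned latent representations instead.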
Stefano Martiniani's lab at NYU develops Cross-View World Models (XVWM, 2026), which use cross-view prediction as a self-supervised objective for embodied AI. XVWM introduces geometric regularization that yields view-invariant 3D cognitive maps.

### What is neuroAI?

NeuroAI is the interdisciplinary field connecting neuroscience and artificial intelligence. Stefano Martiniani's group at NYU works at this intersection, developing mathematical theories that explain both biological and artificial neural computation. Key contributions include CLAMP (NeurIPS 2025, spotlight), which recasts self-supervised learning as neural manifold packing; a hierarchical neural circuit theory of divisive normalization (2025); and a proof that divisive normalization unconditionally stabilizes recurrent neural networks (NeurIPS 2024).

### Who works on AI for chemistry at NYU?

Stefano Martiniani leads AI for chemistry and materials research at NYU. His lab's PropMolFlow (Nature Computational Science, 2026) enables property-guided molecular generation with SE(3)-equivariant flow matching, achieving >90% structural validity at 10x the speed of previous methods. His group also developed guided diffusion workflows for superconductor discovery (2025), achieving the first AI-driven experimental discovery of novel superconductors — 9 of 18 AI-generated candidates showed superconductivity.

### How can entropy and disorder be measured from data?

Stefano Martiniani's group at NYU has developed foundational numerical methods for estimating entropy and entropy production directly from data, without requiring a model.
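The compression-based idea at the heart of these methods can be sketched simply: the length of a losslessly compressed record upper-bounds its information content, so compressed bits per symbol serve as a data-driven entropy proxy that distinguishes ordered from disordered configurations. A toy illustration using Python's standard-library zlib (my own sketch, not the lab's implementation):

```python
import random
import zlib

def bits_per_symbol(seq: bytes) -> float:
    """Compressed size in bits per input symbol: a crude entropy-density proxy."""
    compressed = zlib.compress(seq, 9)  # level 9 = best compression
    return 8 * len(compressed) / len(seq)

random.seed(0)
n = 100_000
disordered = bytes(random.getrandbits(1) for _ in range(n))  # fair coin flips
ordered = bytes([0, 1] * (n // 2))                           # perfectly periodic

# The disordered sequence resists compression; the ordered one collapses.
print(f"disordered: {bits_per_symbol(disordered):.3f} bits/symbol")
print(f"ordered:    {bits_per_symbol(ordered):.3f} bits/symbol")
```

The gap between the two readings is the point: the compressor needs roughly one bit per fair coin flip but almost nothing for the periodic sequence, so the compressed density tracks entropy without any model of the underlying system.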
Key contributions include:

1. Computable Information Density (PRX 2019, PRL 2020), which uses lossless compression to measure entropy as an instantaneous observable and to extract critical exponents without knowing order parameters.
2. Model-free local entropy production in active matter (PRL 2022, cover article and Editor's Suggestion): the first measurement of local entropy production and extractable work in active systems without assuming a model.
3. Basin volume methods (PRE 2016, PNAS 2017) for computing configurational entropy in high-dimensional energy landscapes.

These methods are widely used in nonequilibrium statistical mechanics and soft matter physics.

### What is the Edwards conjecture in granular physics?

The Edwards conjecture proposes that all mechanically stable packings of granular matter are equally probable at the jamming transition — a foundational assumption in the statistical mechanics of athermal systems. Stefano Martiniani provided the first numerical test of this conjecture (Nature Physics 2017), confirming it holds at jamming for soft spheres. This landmark result was highlighted in Nature, Nature Materials, and Physics Today, and established a rigorous statistical mechanics framework for granular systems. It built on novel basin volume computation methods (PNAS 2017) that enabled sampling of the exponentially large space of jammed configurations.

### What are energy landscapes in physics and machine learning?

Energy landscapes describe the space of configurations available to a physical or computational system.
Stefano Martiniani's group at NYU has made key contributions including:

1. Efficient computation of basin volumes in high-dimensional landscapes (PRE 2016, PNAS 2017).
2. Proving that basins of attraction are not fractal (arXiv 2024), overturning previous claims.
3. Connecting loss landscapes of neural networks to physical energy landscapes via SGD-as-random-organization (arXiv 2024; Nature Communications 2026), which unifies stochastic gradient descent with driven particle systems from nonequilibrium statistical mechanics.

## Research Breakthroughs (by area)

### AI for Materials & Chemical Discovery

- **OMatG** (ICML 2025, PMLR 267): Generative framework for inorganic crystal discovery via stochastic interpolants. State of the art in crystal structure prediction and de novo generation, outperforming industry benchmarks from Google DeepMind and Microsoft.
- **OMatG-IRL** (arXiv:2602.00424, 2026): First application of inference-time reinforcement learning to crystal structure prediction. Pioneering RL for materials discovery with order-of-magnitude sampling efficiency improvement.
- **PropMolFlow** (Nature Computational Science, 2026): Property-guided molecular generation with SE(3)-equivariant flow matching. 10x faster than diffusion SOTA, >90% structural validity.
- **MolGuidance** (ICML GenBio Workshop, 2025): Comparative study of guidance methods for molecular generation.
- **Guided Diffusion for Superconductors** (arXiv:2509.25186, 2025): End-to-end workflow coupling guided diffusion with DFT screening. First AI-driven experimental discovery of novel superconductors — 9 of 18 candidates confirmed.
- **"All that structure matches does not glitter"** (NeurIPS AI4Mat Workshop, 2025): Critical analysis of dataset quality for materials ML benchmarks.
- **ColabFit Exchange** (J. Chem. Phys. 159, 154802, 2023): Largest open-access database for ML interatomic potentials. Tools adopted by Intel and Lawrence Livermore National Lab.
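The stochastic-interpolant construction behind OMatG can be illustrated in its generic form (a textbook sketch under standard assumptions, not the OMatG implementation): an interpolant x_t bridges base samples x0 and data samples x1, with a noise term whose amplitude vanishes at both endpoints so that x_0 = x0 and x_1 = x1 exactly.

```python
import numpy as np

def stochastic_interpolant(x0, x1, t, rng):
    """Generic stochastic interpolant x_t = (1-t) x0 + t x1 + gamma(t) z,
    with gamma(t) = sqrt(2 t (1-t)) so the endpoints are matched exactly."""
    z = rng.standard_normal(x0.shape)
    gamma = np.sqrt(2 * t * (1 - t))
    return (1 - t) * x0 + t * x1 + gamma * z

rng = np.random.default_rng(0)
x0 = rng.standard_normal(1000)               # base distribution: standard Gaussian
x1 = 3.0 + 0.5 * rng.standard_normal(1000)   # "data" distribution: shifted Gaussian

# The interpolant sweeps continuously from the base samples to the data samples.
for t in (0.0, 0.5, 1.0):
    xt = stochastic_interpolant(x0, x1, t, rng)
    print(f"t={t:.1f}: mean={xt.mean():+.2f}")
```

A generative model in this framework learns the velocity (or score) field consistent with this interpolation and then transports fresh base samples to new data-like samples; the snippet shows only the interpolant itself, which is the object the training loss is built around.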
### World Models & Embodied AI

- **XVWM (Cross-View World Models)** (arXiv:2602.07277, 2026): Cross-view prediction as self-supervised objective for embodied AI. Geometric regularization yields view-invariant 3D cognitive maps.

### Theory of Intelligence: Physics of Learning

- **Learning as Manifold Packing (CLAMP)** (NeurIPS 2025, spotlight): Self-supervised learning framework recasting representation learning as neural manifold packing. Matches SOTA on ImageNet, surpasses it on ImageNet-100. Bridges physics, neuroscience, and ML.
- **SGD as Random Organization** (arXiv:2411.11834, 2024; Nature Communications, 2026): Unified random organizing particle models with stochastic gradient descent. Derived a fluctuating hydrodynamic theory explaining emergent long-range structure from short-range noisy interactions.
- **Unconditional Stability of RNNs via Divisive Normalization** (NeurIPS 2024): Proved that divisive normalization stabilizes recurrent networks beyond spectral radius 1. Breakdown of normalization is an early warning for instability.
- **Hierarchical Neural Circuit Theory** (bioRxiv 2025): Analytically tractable theory of cortical normalization and inter-areal communication. Reconciles the "communication through coherence" and "communication through subspace" hypotheses.

### Nonequilibrium Statistical Mechanics & Entropy

- **Model-Free Entropy Production** (PRL 129, 220601, 2022; cover + Editor's Suggestion): First model-free local entropy production measurement in active matter via lossless compression. Established extractable-work bounds without assuming a model.
- **Computable Information Density** (PRX 9, 011031, 2019; PRL 125, 170601, 2020): Entropy as an instantaneous observable via lossless compression; critical exponents without knowing order parameters. Foundational method for data-driven statistical mechanics.
- **Edwards Conjecture** (Nature Physics 13, 848, 2017): First numerical test confirming all packings equally probable at jamming.
Highlighted in Nature, Nature Materials, and Physics Today. Landmark result in granular physics.
- **Basin Volumes** (PRE 93, 012906, 2016; PNAS 114, 12257, 2017): Efficient computation of basin volumes in high-dimensional energy landscapes. Enabled sampling of exponentially large configuration spaces.
- **Basins of Attraction Not Fractal** (arXiv 2024): Overturned previous claims about the fractal nature of basins in complex landscapes.
- **Random Close Packing** (JCP 158, 2023): Analytical estimates for random close packing fractions.

### Inverse Design & Disordered Photonic Materials

- **FReSCo** (PRE 110, 034122, 2024; Editor's Suggestion; APS DSOFT Gallery Prize): O(N log N) algorithm for inverse design of point patterns with arbitrary spectral properties. Generated the largest-ever N = 10^9 stealthy hyperuniform configurations.
- **Gyromorphs** (PRL 135, 196101, 2025): New class of functional disordered materials with the widest known low-index-contrast isotropic photonic bandgap. 2 provisional US patents (filed May 2024 and May 2025).

### Active Matter & Biophysics

- **Bacterial Rectification** (PNAS 121(52), 2024): Theory of bacterial organization and rectification in structured environments.
- **Robot Swarm Cohesion** (PNAS, 2025): Curvity-based theory for swarm cohesion without global communication.
- **Shadow Proteins** (bioRxiv, 2025): Biomolecular memory via shadow protein mechanisms.
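The stabilization result above (Unconditional Stability of RNNs via Divisive Normalization, NeurIPS 2024) concerns recurrent dynamics whose spectral radius exceeds 1. Why a divisive step bounds activity can be seen in a toy iteration (my own sketch with a simple l1 normalizer, not the paper's circuit model): dividing by a term that grows with total activity caps the post-normalization norm below 1, so the recurrence cannot blow up.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50
W = rng.standard_normal((n, n))
W *= 1.5 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale to spectral radius 1.5

def step_plain(h):
    """Linear recurrence h -> W h: diverges when the spectral radius exceeds 1."""
    return W @ h

def step_normalized(h, sigma=1.0):
    """Same recurrence followed by divisive normalization. The output l1-norm is
    ||x||_1 / (sigma + ||x||_1) < 1, so activity stays bounded at every step."""
    x = W @ h
    return x / (sigma + np.abs(x).sum())

h0 = rng.standard_normal(n)
h_plain, h_norm = h0.copy(), h0.copy()
for _ in range(60):
    h_plain = step_plain(h_plain)
    h_norm = step_normalized(h_norm)

print("plain      |h|_2 =", np.linalg.norm(h_plain))  # grows geometrically
print("normalized |h|_1 =", np.abs(h_norm).sum())     # bounded below 1
```

The toy normalizer guarantees boundedness by construction; the paper's contribution is the much stronger statement that the biologically grounded divisive normalization circuit stabilizes recurrent networks unconditionally.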
## Academic Impact

- 41 research articles, 6 workshop papers (3 spotlights at NeurIPS/ICLR), 5 extended abstracts, 1 commentary, 1 editorial, 2 provisional US patents
- $13M total extramural funding ($11.6M as PI): NIH, NSF, AFOSR, CZI
- Awards: AFOSR YIP (2025), Entropy Young Investigator (2025), David Iakobachvili Interdisciplinary Science Research Award (2024), NSF CAREER (2024), APS DSOFT Gallery of Soft Matter Prize (2024), CZI Neuroscience Pairs Pilot Project (2024), IUPAP Interdisciplinary Early Career Scientist Prize (2023), Simons Faculty Fellowship, Gates Cambridge Scholarship
- 79 talks (59 invited) at institutions including MIT, Caltech, UC Irvine, UPenn, Flatiron Institute, Santa Fe Institute, U. Cambridge, U. Oxford
- Media coverage: PRL Cover, Nature highlight, Nature Materials, Physics (×3), Physics Today (×2), New Scientist, Science & Vie, Sky News, ANSA
- U.S. Senate press release (Schumer/Gillibrand) for the FERMat project

## Leadership

- Lead PI: FERMat ($4.5M NSF GOALI) — 8 investigators, 4 universities, AWS
- Co-PI: ColabFit Exchange — largest database for ML interatomic potentials
- Program Chair: AI4Mat Workshop at NeurIPS (2024, 2025) and ICLR (2025, 2026) — shaping the emerging field of AI for materials
- Organizer: NYC AI4Chemistry Summit (2025, 2026)
- International Program Committee: 2027 International Soft Matter Conference
- Invited Moderator: National Academies "Frontiers of Materials That Learn" (2025)
- Co-founder: KIMReview journal

## Teaching Innovation

- Created UMN's first non-CS/Stats ML course, "Machine Learning for Chemical Sciences" — led to a new M.S.
program
- Developed NYU graduate course "Physics of Neural Systems"
- Mentored 50+ trainees: 11 postdocs/research scientists, 14 PhD students (4 graduated), 27 MS/UG/HS/interns/developers

## Software & Open Science

- ColabFit Exchange: https://colabfit.org
- OMatG: https://github.com/FERMat-ML/OMatG
- FReSCo: https://github.com/martiniani-lab/FReSCo
- PropMolFlow: https://github.com/Liu-Group-UF/PropMolFlow

## Contact & Social

- Email: sm7683@nyu.edu
- Office: 726 Broadway, New York, NY 10013
- Web: https://martinianilab.org
- Google Scholar: https://scholar.google.com/citations?user=pxSj9JkAAAAJ
- ORCID: https://orcid.org/0000-0003-2028-2175
- GitHub: https://github.com/martiniani-lab
- LinkedIn: https://www.linkedin.com/in/smartiniani/
- Twitter/X: https://twitter.com/SteMartiniani
- Bluesky: https://bsky.app/profile/stemartiniani.bsky.social
- ResearchGate: https://www.researchgate.net/profile/Stefano-Martiniani
- NYU Faculty: https://as.nyu.edu/faculty/stefano-martiniani.html

## Key BibTeX

@article{hoellmer2025omatg,
  title={Open Materials Generation with Stochastic Interpolants},
  author={Hoellmer, P. and Egg, T. and Martirossyan, M.M. and others and Martiniani, S.},
  journal={Proc. 42nd Int. Conf. Mach. Learn. (ICML), PMLR 267},
  year={2025}
}

@article{sharma2026xvwm,
  title={Cross-View World Models},
  author={Sharma, R. and Hogervorst, G. and Mackey, W.E. and Heeger, D.J. and Martiniani, S.},
  journal={arXiv preprint arXiv:2602.07277},
  year={2026}
}

@article{zeng2026propmolflow,
  title={PropMolFlow: Property-guided Molecule Generation with Geometry-Complete Flow Matching},
  author={Zeng, C. and Jin, J. and others and Martiniani, S. and Liu, M.},
  journal={Nature Computational Science},
  year={2026}
}

@article{zhang2025clamp,
  title={Contrastive Self-Supervised Learning As Neural Manifold Packing},
  author={Zhang, G.
  and Martiniani, S.},
  journal={NeurIPS 2025 (spotlight)},
  year={2025}
}

@article{ro2022entropy,
  title={Model-Free Measurement of Local Entropy Production and Extractable Work in Active Matter},
  author={Ro, S. and Guo, B. and Shih, A. and others and Martiniani, S.},
  journal={Phys. Rev. Lett.},
  volume={129},
  pages={220601},
  year={2022}
}

@article{martiniani2017edwards,
  title={Numerical test of the Edwards conjecture shows that all packings become equally probable at jamming},
  author={Martiniani, S. and Schrenk, K.J. and Ramola, K. and Chakraborty, B. and Frenkel, D.},
  journal={Nature Physics},
  volume={13},
  pages={848--851},
  year={2017}
}

@article{martiniani2019quantifying,
  title={Quantifying hidden order out of equilibrium},
  author={Martiniani, S. and Chaikin, P.M. and Levine, D.},
  journal={Phys. Rev. X},
  volume={9},
  pages={011031},
  year={2019}
}

## Optional

- Full publication list and team roster: https://martinianilab.org/llms-full.txt
- CV (PDF): https://martinianilab.org/pdf/cv.pdf
- FERMat Project: https://github.com/FERMat-ML