Jiayuan Mao 茅佳源
Email: jiayuanm [at] mit [dot] edu
Jiayuan Mao is a PhD candidate at MIT EECS, advised by Prof. Josh Tenenbaum and Prof. Leslie Kaelbling. Previously, she obtained her Bachelor's degree from the Yao Class at Tsinghua University.
My long-term research goal is to build machines that can continually learn concepts (e.g., properties, relations, skills, rules, and algorithms) from their experiences and apply them for reasoning and planning in the physical world. The central theme of my research is to decompose the learning problem into learning a vocabulary of neuro-symbolic concepts: the symbolic part describes their structure and how different concepts compose; the neural part handles grounding in perception and physics. I leverage these structures to make learning more data-efficient and compositionally generalizable, and to make inference and planning faster.
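As a rough illustration (not taken from any specific paper of mine), the sketch below shows one way such a neuro-symbolic concept could be represented in PyTorch: a symbolic signature (a name and an arity) paired with a neural module that grounds the concept in object features. All class and attribute names here (Concept, arity, grounding) are illustrative assumptions, not an actual released implementation.

```python
# A minimal sketch of a neuro-symbolic concept: the symbolic part records the
# concept's name and how many objects it relates; the neural part scores how
# well perceived object features match the concept.
import torch
import torch.nn as nn


class Concept(nn.Module):
    """A concept such as 'red' (property, arity 1) or 'left-of' (relation, arity 2)."""

    def __init__(self, name: str, arity: int, feature_dim: int):
        super().__init__()
        self.name = name    # symbolic part: identifier
        self.arity = arity  # symbolic part: number of objects the concept relates
        # neural part: grounds the concept in object feature vectors
        self.grounding = nn.Linear(feature_dim * arity, 1)

    def forward(self, *object_features: torch.Tensor) -> torch.Tensor:
        # Returns a soft truth value in (0, 1) for the concept applied to the objects.
        return torch.sigmoid(self.grounding(torch.cat(object_features, dim=-1)))


# Concepts compose symbolically while being grounded neurally, e.g.,
# "a is red AND a is left of b" as a product of soft truth values.
red = Concept("red", arity=1, feature_dim=64)
left_of = Concept("left-of", arity=2, feature_dim=64)
a, b = torch.randn(64), torch.randn(64)
score = red(a) * left_of(a, b)
```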
How should we represent various types of concepts?
How can we capture the programmatic structures underlying these concepts (the Theory-Theory of concepts)?
How can we efficiently learn these concepts from natural supervision (e.g., language, videos)?
How can we leverage the structures of these concepts to make inference and planning faster?
Topics: Concept Learning and Language Acquisition / Reasoning and Planning / Scene and Activity Understanding
Past topics: Object Detection / Structured NLP
(*/† indicate equal contribution.)
NeurIPS 2024 Datasets and Benchmarks Track (Oral) Paper (Coming Soon) / Project Page / Code / Data
ECCV Human-Inspired Computer Vision Workshop 2024 Paper / Project Page / Data
CogSci 2024 (Best Undergraduate Student Paper) Paper
RSS 2024 Paper
Presented at RSS Workshop on Task Specification for General-Purpose Intelligent Robots
ICLR 2024 (Spotlight) Paper / Project Page / Code / MIT News / TechCrunch
ICLR 2024 (Spotlight) Paper / Project Page / Code
NeurIPS 2023 (Spotlight) Paper / Project Page / Code
CoRL 2023 Paper / Supplementary Videos
Presented at IROS 2023 Workshop on Leveraging Models for Contact-Rich Manipulation (Spotlight) (Slides / Video)
CVPR 2023 Paper / Project Page / Code
Presented at CVPR 2023 Workshop On Compositional 3D Vision (Oral)
ICLR 2023 (Notable Top 25%) Paper / Project Page
NeurIPS 2022 Datasets and Benchmarks Track Paper / Project Page / Code
NeurIPS 2022 Datasets and Benchmarks Track Paper / Project Page / Code
NeurIPS 2022 Datasets and Benchmarks Track Paper / Project Page / Code
IJCAI 2021 Paper / Project Page / Code
(First two authors contributed equally; order determined by coin toss.) ArXiv 2020 Paper
ICCV 2019 Paper / Project Page
(First two authors contributed equally; order determined by coin toss.) ACL 2019 (Best Paper Nomination) Paper / Project Page / Code
CVPR 2019 (Oral) Paper
Presented at NAACL 2019 SpLU-RoboNLP
ICLR 2019 (Oral) Paper / Project Page / Code / MIT News / MIT Technology Review
ArXiv Preprint Paper