CURRICULUM VITAE
ADITHYA BHASKAR • PRINCETON UNIVERSITY
EDUCATION
2023-Present
Ph.D. in Computer Science, Princeton University, USA
Advised by Prof. Danqi Chen (Specialization: Natural Language Processing)
2019-23
Bachelor of Technology in Computer Science and Engineering (Honors), IIT Bombay, India
Bachelor's Thesis advised by Prof. Sunita Sarawagi
2017-19
Senior Secondary School, Central Board of Secondary Education, India
2017
High School, Central Board of Secondary Education, India
PUBLICATIONS
2025 Unintentional Unalignment: Likelihood Displacement in Direct Preference Optimization, ICLR 2025
Noam Razin, Sadhika Malladi, Adithya Bhaskar, Danqi Chen, Sanjeev Arora, and Boris Hanin
2024 Finding Transformer Circuits With Edge Pruning, NeurIPS 2024 (Spotlight)
Adithya Bhaskar, Alexander Wettig, Dan Friedman, and Danqi Chen
2024 The Heuristic Core: Understanding Subnetwork Generalization in Pretrained Language Models, ACL 2024 (Oral)
Adithya Bhaskar, Dan Friedman, and Danqi Chen
2023 Benchmarking and Improving Text-to-SQL Generation under Ambiguity, EMNLP 2023
Adithya Bhaskar*, Tushar Tomar*, Ashutosh Sathe, and Sunita Sarawagi
2023 Prompted Opinion Summarization with GPT-3.5, ACL 2023 (Findings)
Adithya Bhaskar, Alexander R. Fabbri, and Greg Durrett
2023 Performance Bounds for LASSO under Multiplicative Noise: Applications to Pooled RT-PCR Testing, Signal Processing, Vol. 214
Richeek Das, Aaron Jerry Ninan, Adithya Bhaskar, and Ajit Rajwade
PREPRINTS
2025 Cache Me If You Can: How Many KVs Do You Need for Effective Long-Context LMs?, arXiv preprint, arXiv:2506.17121
Adithya Bhaskar*, Alexander Wettig*, Tianyu Gao, Yihe Dong, and Danqi Chen
2024 Continual Memorization of Factoids in Language Models, arXiv preprint, arXiv:2411.01715
Howard Chen, Jiayi Geng, Adithya Bhaskar, Dan Friedman, and Danqi Chen
2024 Improving Language Understanding from Screenshots, arXiv preprint, arXiv:2402.14073
Tianyu Gao, Zirui Wang, Adithya Bhaskar, and Danqi Chen
SCHOLASTIC ACHIEVEMENTS
2024 Recipient of the Hisashi and Masae Kobayashi '67 Fellowship.
2023 Recipient of the Thomas Dooie Class of 1974 Research Award.
2019 All India Rank 18 in JEE Advanced 2019 among 240,000 candidates.
2019 All India Rank 114 in JEE Mains 2019 among 1.1 million candidates.
2018 Secured a position in the top 39 ranks in the Indian National Physics Olympiad and was invited to the Orientation-cum-Selection-Camp in Physics held in May-June 2018.
2018 Secured a position in the top 49 ranks in the Indian National Chemistry Olympiad and was invited to the Orientation-cum-Selection-Camp in Chemistry held in May-June 2018.
2016 One of the 39 students to clear the Indian National Mathematical Olympiad; at age 14, among the youngest ever invited to the Orientation-cum-Selection-Camp in Mathematics.
EXPERIENCE
UT Austin, Summer 2022
Research Intern, Natural Language Processing, USA
Advisor: Prof. Greg Durrett
- Developed metrics to measure factuality, faithfulness, and specificity for multi-document summarization.
- Benchmarked GPT-3.5 and showed that hierarchical summarization of long texts performs best on faithfulness and specificity.
- Investigated pre-clustering and pre-summarization methods for improved correctness, faithfulness, and specificity.
Uppsala University, Summer 2021
Research Intern, Formal Verification
Advisor: Prof. Parosh Abdulla
- Developed a model and simulator for programs running under the ARMv8 memory model.
- Used Context Bounded Model Checking for State Reachability Analysis, achieving up to an order of magnitude speedup over existing checkers.
INVITED TALKS
April 2024 The Heuristic Core: Understanding Subnetwork Generalization in Pretrained Language Models
Host: Mathew Monfort, Amazon AWS
TEACHING
Spring 2025 Graduate Teaching Assistant, COS 484: Natural Language Processing
Instructors: Danqi Chen, Vikram Ramaswamy, and Tri Dao
Princeton University
Fall 2024 Graduate Teaching Assistant, COS 597R: Deep Dive into Large Language Models
Instructors: Danqi Chen and Sanjeev Arora
Princeton University
SERVICE
2025 NeurIPS 2025, Reviewer
2025 ICML 2025 MOSS Workshop, Reviewer
2024 NeurIPS 2024 ATTRIB Workshop, Reviewer
OTHER PROJECTS
Robust Models, Spring 2023
Bachelor's Project, Natural Language Processing, Guide: Prof. Sunita Sarawagi
- Demonstrated that training a Text-to-SQL model on partially masked inputs leads to diversity in model outputs, including in columns/tables, string literals, integers, and aggregates.
- Filtered outputs by model probabilities relative to the unmasked question.
- Furnished questions for generated queries via an SQL-to-Text model. Data augmentation with pairs led to increases in accuracy and robustness.
C Decompiler, Fall 2020
Course Project, Software Systems, Guide: Prof. Amitabh Sanyal
- Built a decompiler to convert Register Transfer Language to C for portability across architectures.
- Utilized lex and bison to parse source code in RTL and identify program elements like assignments, arithmetic operations, conditional/looping constructs, function calls and memory accesses.
- Performed local & global data flow analysis and control flow analysis to contextualize parsed code.