Room 217-219, New Orleans Convention Center, December 15, 2023


Friday, December 15, 2023

All times are in Central Standard Time (CST).

Location: Room 217-219, New Orleans Convention Center

08:55-09:00 Introduction and Opening Remarks
09:00-09:30 Invited Talk AI4Crypto
Kristin Lauter, FAIR Labs North America, Meta
09:30-10:00 Invited Talk Exploring Mathematical Conjecturing – From Heuristic Search to Large Language Models
Moa Johansson, Chalmers University of Technology
10:00-10:30 Invited Talk Axioms (and curiosity and attention) are all you need
Noah D. Goodman, Stanford University
10:30-11:00 Break
11:00-12:00 Panel Discussion
Timothy Gowers (Collège de France), Talia Ringer (UIUC), Armando Solar-Lezama (MIT), Yuhuai (Tony) Wu (xAI), Mateja Jamnik (University of Cambridge)
12:00-13:00 Break
13:00-13:15 Contributed Talk Learning the Greatest Common Divisor - Explainable Predictions in Transformers
François Charton
13:15-13:30 Contributed Talk Lemur: Integrating Large Language Models in Automated Program Verification
Nina Narodytska
13:30-13:45 Contributed Talk OpenWebMath: An Open Dataset of High-Quality Mathematical Web Text
Keiran Paster
13:45-14:00 Contributed Talk What Algorithms Can Transformers Learn? A Study in Length Generalization
Hattie Zhou
14:00-14:30 Invited Talk AI can learn from data. But can it learn to reason?
Guy Van den Broeck, UCLA
14:30-15:00 Coffee Break
15:00-16:00 Poster Session
16:00-16:30 Invited Talk Analogical Reasoning with Large Language Models
Xinyun Chen, Google DeepMind
16:30-17:00 Invited Talk Mechanisms of Symbol Processing for In-Context Learning in Transformers
Paul Smolensky, JHU, Microsoft