Research Areas

Knowledge Distillation

Teaching reasoning capabilities to smaller models via knowledge distillation.
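One common formulation of knowledge distillation matches the student's output distribution to the teacher's temperature-softened distribution. Below is a minimal, dependency-free sketch of that soft-target loss (after Hinton et al.); the function names and toy logits are illustrative, not part of any specific framework described here.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this term is combined with the usual cross-entropy loss on ground-truth labels; the sketch shows only the distillation component.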

Refinement in LLMs

A refinement framework that explores how reasoning can be improved in LLMs through resampling and selection.
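Resampling and selection can be sketched as a best-of-n loop: draw several candidate responses, then keep the one a scorer ranks highest. The generator and scorer below are hypothetical toy stand-ins for a model and a verifier, not the framework's actual components.

```python
import random

def resample_and_select(generate, score, n=8, seed=0):
    # Draw n candidate responses and return the highest-scoring one.
    rng = random.Random(seed)
    candidates = [generate(rng) for _ in range(n)]
    return max(candidates, key=score)

# Toy stand-ins (hypothetical): a "model" that guesses an integer,
# and a verifier that prefers guesses close to 42.
def toy_generate(rng):
    return rng.randint(0, 100)

def toy_score(answer):
    return -abs(answer - 42)

selected = resample_and_select(toy_generate, toy_score, n=50, seed=1)
```

Because candidates are only reranked, not edited, selection quality is bounded by what the sampler can produce; richer refinement schemes also revise candidates between rounds.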

Problem Decomposition

Can decomposing a problem into simpler sub-problems help students learn a concept better?