Why the Exponential? From Max‑Entropy RL to the Boltzmann Distribution
This blog post explores why the exponential function appears ubiquitously across modern reinforcement learning (RL), energy-based modeling, and statistical mechanics. We examine the connection between max-entropy RL and the Boltzmann distribution, uncovering the fundamental principles that make the exponential form inevitable and explaining what "temperature" actually does in these frameworks.
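As a one-formula preview of the connection (our summary of a standard result, not the post's full derivation): maximizing expected value plus a temperature-weighted entropy bonus is solved exactly by a Boltzmann distribution over Q-values, which is precisely where the exponential and the temperature enter.

```latex
% Single-state entropy-regularized objective with temperature \tau > 0:
%   \max_{\pi}\; \mathbb{E}_{a \sim \pi}\big[Q(s,a)\big]
%              + \tau\,\mathcal{H}\big(\pi(\cdot \mid s)\big)
% Its unique maximizer is the Boltzmann (softmax) policy:
\pi^{*}(a \mid s) \;=\; \frac{\exp\big(Q(s,a)/\tau\big)}{\sum_{a'} \exp\big(Q(s,a')/\tau\big)}
```

As \tau \to 0 this recovers the greedy argmax policy; as \tau \to \infty it approaches the uniform distribution, which is the trade-off "temperature" controls.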
A Unified Framework for Diffusion Distillation
The explosive growth in one-step and few-step diffusion models has taken the field deep into the weeds of complex notations. In this blog post, we cut through the confusion by proposing a coherent set of notations that reveals the connections among these methods.
On the Permutation Invariance of Graph Generative Models
This blog post discusses the permutation invariance principle of graph generative models, which has often been taken for granted in graph-related tasks. While permutation symmetry is an elegant property of graph data, there is still more to learn about its empirical implications.
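For concreteness, permutation invariance of a graph distribution means p(A) = p(PAPᵀ) for every permutation matrix P, i.e., relabeling the nodes leaves the probability of the graph unchanged. A minimal numpy sketch of the relabeling and of one invariant graph statistic (the random graph, seed, and choice of statistic are ours for illustration, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random undirected graph on n nodes, as a symmetric adjacency matrix.
n = 6
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1)
A = A + A.T

# A random node relabeling, expressed as a permutation matrix P.
perm = rng.permutation(n)
P = np.eye(n)[perm]
A_perm = P @ A @ P.T  # the same graph, with nodes relabeled

# A permutation-invariant statistic: the sorted degree sequence.
def degree_sequence(adj):
    return np.sort(adj.sum(axis=0))

assert np.array_equal(degree_sequence(A), degree_sequence(A_perm))
```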
Conditional Generative Models for Motion Prediction
In this blog post, we discuss good engineering practices and the lessons learned—sometimes the hard way—from building conditional generative models (in particular, flow matching) for motion prediction problems.
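To fix notation, here is a minimal sketch of the standard conditional flow matching objective with a linear interpolation path; the velocity network, dimensions, and placeholder tensors below are illustrative assumptions, not the post's actual architecture or data:

```python
import torch
import torch.nn as nn

dim, cond_dim, batch = 4, 8, 32

# Hypothetical velocity network: takes (x_t, t, condition) concatenated.
velocity_net = nn.Sequential(
    nn.Linear(dim + 1 + cond_dim, 128),
    nn.ReLU(),
    nn.Linear(128, dim),
)

x1 = torch.randn(batch, dim)         # target future motion (placeholder)
cond = torch.randn(batch, cond_dim)  # conditioning, e.g. past trajectory
x0 = torch.randn(batch, dim)         # noise sample
t = torch.rand(batch, 1)             # random time in [0, 1]

xt = (1 - t) * x0 + t * x1           # linear interpolation path
target_v = x1 - x0                   # its constant velocity

pred_v = velocity_net(torch.cat([xt, t, cond], dim=-1))
loss = ((pred_v - target_v) ** 2).mean()  # flow matching regression loss
loss.backward()
```

At inference time, samples are drawn by integrating the learned velocity field from noise x0 to a prediction x1, conditioned on the observed context.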
Evaluating Motion Consistency by Fréchet Video Motion Distance (FVMD)
In this blog post, we introduce a promising new metric for video generative models, Fréchet Video Motion Distance (FVMD), which focuses on the motion consistency of generated videos.
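FVMD follows the Fréchet-distance recipe familiar from FID, applied to motion features rather than frame appearance features; the motion feature extraction is the metric's contribution and is not shown here. A minimal sketch of the distance computation itself, assuming the features have already been extracted into (n_samples, dim) arrays:

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(feats_a, feats_b):
    """Squared Frechet distance between Gaussian fits of two feature sets.

    FVMD applies this to motion features of real vs. generated videos;
    here the inputs are arbitrary (n_samples, dim) feature arrays.
    """
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    covmean = sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):  # discard tiny numerical imaginary parts
        covmean = covmean.real
    diff = mu_a - mu_b
    return diff @ diff + np.trace(cov_a + cov_b - 2 * covmean)
```

Lower values indicate that the motion statistics of generated videos more closely match those of real videos.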