Scientific Computing
Before-and-after visualization of point-set alignment using the Kabsch algorithm

Kabsch-Horn Cookbook: Differentiable Alignment

A differentiable point-set alignment library implementing N-dimensional Kabsch, Horn quaternion, and Umeyama scaling algorithms with per-point weights, batch dimensions, and custom autograd across NumPy, PyTorch, JAX, TensorFlow, and MLX.
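The weighted Kabsch step at the heart of such a library can be sketched in NumPy. The function name and signature below are illustrative, not the library's actual API:

```python
import numpy as np

def weighted_kabsch(P, Q, w):
    """Optimal rotation and translation aligning P onto Q with per-point
    weights w. Illustrative sketch of the weighted Kabsch algorithm.

    P, Q: (N, D) point sets; w: (N,) nonnegative weights.
    """
    w = w / w.sum()
    # Weighted centroids
    p_bar = w @ P
    q_bar = w @ Q
    Pc, Qc = P - p_bar, Q - q_bar
    # Weighted cross-covariance and its SVD
    H = Pc.T @ (w[:, None] * Qc)
    U, S, Vt = np.linalg.svd(H)
    # Flip the last singular direction if needed to avoid a reflection
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = q_bar - R @ p_bar
    return R, t
```

All operations here (matmul, SVD, diag) have direct counterparts in PyTorch, JAX, TensorFlow, and MLX, which is what makes a multi-backend implementation tractable.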

Computational Biology
Three-panel diagram showing input point sets, SVD factorization of the cross-covariance matrix, and the aligned result

Arun et al.: SVD-Based Least-Squares Fitting of 3D Points

Presents a concise SVD-based algorithm for finding the optimal rotation and translation between two 3D point sets, with analysis of the degenerate reflection case that Umeyama later corrected.

Computational Biology
Diagram showing the polar decomposition of the cross-covariance matrix M into orthonormal factor U and positive semidefinite square root

Horn et al.: Absolute Orientation Using Orthonormal Matrices

The matrix-based companion to Horn’s 1987 quaternion method, deriving the optimal rotation as the orthonormal factor in the polar decomposition of the cross-covariance matrix via eigendecomposition of a 3x3 symmetric matrix.
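The polar-decomposition recipe can be sketched in NumPy. This is a toy illustration assuming the full-rank, proper-rotation case; the paper also analyzes the degenerate cases:

```python
import numpy as np

def rotation_from_polar(M):
    """Orthonormal factor R of the polar decomposition M = R S.

    S = (M^T M)^(1/2) is the positive semidefinite square root, obtained
    from the eigendecomposition of the symmetric 3x3 matrix M^T M; then
    R = M S^(-1). Assumes M is full rank with det(M) > 0.
    """
    lam, E = np.linalg.eigh(M.T @ M)  # eigenvalues ascending, E orthonormal
    S_inv = E @ np.diag(1.0 / np.sqrt(lam)) @ E.T
    return M @ S_inv
```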

Computational Biology
Side-by-side comparison showing naive SVD producing a reflected alignment versus Umeyama's corrected proper rotation

Umeyama's Method: Corrected SVD for Point Alignment

Corrects a flaw in prior SVD-based alignment methods (Arun et al., Horn et al.) that could produce reflections instead of rotations on noisy data, and provides a complete closed-form solution for similarity transformations in arbitrary dimensions.
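Umeyama's closed-form similarity solution can be sketched in NumPy. This is a minimal rendering of the published formulas, not any particular library's API:

```python
import numpy as np

def umeyama(P, Q):
    """Similarity transform (c, R, t) minimizing sum ||c R p + t - q||^2.

    Sketch of Umeyama's closed-form solution, including the determinant
    correction that prevents reflections.
    """
    n, d = P.shape
    mu_p, mu_q = P.mean(0), Q.mean(0)
    Pc, Qc = P - mu_p, Q - mu_q
    Sigma = Qc.T @ Pc / n                 # cross-covariance
    U, Dvals, Vt = np.linalg.svd(Sigma)
    S = np.eye(d)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[-1, -1] = -1.0                  # the reflection fix
    R = U @ S @ Vt
    var_p = (Pc ** 2).sum() / n           # variance of the source set
    c = np.trace(np.diag(Dvals) @ S) / var_p
    t = mu_q - c * R @ mu_p
    return c, R, t
```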

Generative Modeling
Diagram showing consistency models mapping points on a PF ODE trajectory to the same origin

Consistency Models: Fast One-Step Diffusion Generation

This paper introduces consistency models, a new family of generative models that map any point on a Probability Flow ODE trajectory to its origin. They support fast one-step generation by design, while allowing multi-step sampling for improved quality and zero-shot editing tasks like inpainting and colorization.

Generative Modeling
D3PM forward and reverse processes on a quantized swiss roll with uniform, Gaussian, and absorbing transition matrices

D3PM: Discrete Denoising Diffusion Probabilistic Models

This paper introduces Discrete Denoising Diffusion Probabilistic Models (D3PMs), which generalize diffusion to discrete state-spaces using structured Markov transition matrices. D3PMs include uniform, absorbing-state, and discretized Gaussian corruption processes, drawing a connection between diffusion and masked language models.
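The uniform-transition forward process can be sketched in a few lines of NumPy. The state count K and the linear beta schedule below are arbitrary toy choices, not the paper's settings:

```python
import numpy as np

def uniform_transition(K, beta):
    """One-step D3PM uniform transition matrix Q_t over K states:
    stay put with probability 1 - beta, otherwise jump to a uniformly
    random state."""
    return (1 - beta) * np.eye(K) + beta * np.ones((K, K)) / K

# The marginal q(x_t | x_0) comes from the product of per-step matrices.
K, T = 5, 50
betas = np.linspace(0.01, 0.5, T)
Qbar = np.eye(K)
for b in betas:
    Qbar = Qbar @ uniform_transition(K, b)
# Row x_0 of Qbar is the categorical distribution over x_T;
# after enough steps it approaches the uniform distribution 1/K.
```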

Computational Chemistry
GraphReco system architecture showing component extraction, atom and bond ambiguity resolution, and graph reconstruction stages

GraphReco: Probabilistic Structure Recognition (2026)

GraphReco presents a rule-based optical chemical structure recognition (OCSR) system with two key innovations: a Fragment Merging line detection algorithm for precise bond identification and a Markov network for probabilistic resolution of atom/bond ambiguity during graph assembly. It achieves 94.2% accuracy on USPTO-10K, outperforming both traditional rule-based systems and some ML-based methods.

Computational Biology
3D scatter plot showing left and right point sets with rotation axis and quaternion rotation arc

Horn's Method: Absolute Orientation via Unit Quaternions

Derives the optimal rotation between two 3D point sets as the eigenvector of a 4x4 symmetric matrix built from cross-covariance sums, using unit quaternions to enforce the orthogonality constraint.
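The quaternion construction can be sketched in NumPy. The variable names below follow common conventions rather than the paper's notation:

```python
import numpy as np

def horn_quaternion(P, Q):
    """Optimal rotation aligning P onto Q via Horn's unit-quaternion method.

    Builds the symmetric 4x4 matrix N from the 3x3 cross-covariance sums
    and returns the eigenvector of its largest eigenvalue, interpreted as
    a unit quaternion and converted to a rotation matrix.
    """
    Pc = P - P.mean(0)
    Qc = Q - Q.mean(0)
    M = Pc.T @ Qc  # cross-covariance sums S_ab
    Sxx, Sxy, Sxz = M[0]
    Syx, Syy, Syz = M[1]
    Szx, Szy, Szz = M[2]
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz],
    ])
    vals, vecs = np.linalg.eigh(N)
    w, x, y, z = vecs[:, -1]  # eigenvector of the largest eigenvalue
    # Unit quaternion (w, x, y, z) -> rotation matrix
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
```

Note that q and -q encode the same rotation, so the eigenvector's sign ambiguity is harmless.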

Computational Biology
3D scatter plot showing source points, target points, and Kabsch-aligned points overlapping the targets

Kabsch Algorithm: Optimal Rotation for Point Set Alignment

A foundational 1976 short communication presenting a direct, non-iterative method for finding the best rotation matrix between two point sets via eigendecomposition of a cross-covariance matrix.

Generative Modeling
LDM architecture diagram showing conditioning via concatenation and cross-attention

Latent Diffusion Models for High-Res Image Synthesis

This paper introduces Latent Diffusion Models (LDMs), which apply denoising diffusion in the latent space of pretrained autoencoders. By separating perceptual compression from generative learning and adding cross-attention conditioning, LDMs achieve FID 1.50 on Places inpainting and FID 3.60 on ImageNet class-conditional synthesis, with competitive text-to-image generation, at a fraction of the compute cost of pixel-space diffusion.

Machine Learning Fundamentals
Three-panel diagram showing an original sequence, its time-warped version, and the gate values derived from requiring time warping invariance

Can Recurrent Neural Networks Warp Time? (ICLR 2018)

Tallec and Ollivier show that requiring invariance to time transformations in recurrent models leads to gating mechanisms, recovering key LSTM components from first principles. They also propose the chrono initialization for gate biases, which improves learning of long-term dependencies.
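The chrono initialization itself is a one-line recipe; a sketch, assuming an LSTM whose forget- and input-gate biases can be set directly:

```python
import numpy as np

def chrono_init(hidden_size, t_max, rng=None):
    """Chrono initialization of LSTM gate biases (Tallec & Ollivier).

    Forget-gate biases b_f = log(u) with u ~ Uniform(1, T_max - 1), so the
    initial forget gates sigmoid(b_f) = u / (u + 1) correspond to memory
    timescales spread over [1, T_max]. Input-gate biases are tied as
    b_i = -b_f.
    """
    rng = rng if rng is not None else np.random.default_rng()
    b_f = np.log(rng.uniform(1.0, t_max - 1.0, size=hidden_size))
    b_i = -b_f
    return b_f, b_i
```

T_max is chosen to match the longest dependency range expected in the data.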

Machine Learning Fundamentals
Graph network block diagram showing input graph transformed through edge, node, and global update steps to produce an updated graph

Relational Inductive Biases in Deep Learning (2018)

Battaglia et al. argue that combinatorial generalization requires structured representations, systematically analyze the relational inductive biases in standard deep learning architectures (MLPs, CNNs, RNNs), and present the graph network as a unifying framework that generalizes and extends prior graph neural network approaches.
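The graph network block's update sequence can be sketched as follows; this is a toy version in which the three update functions phi_e, phi_v, phi_u are plain sums and all feature dimensions are assumed equal:

```python
import numpy as np

def gn_block(nodes, edges, senders, receivers, u):
    """One full graph network (GN) block pass (Battaglia et al.), with the
    update functions phi_e, phi_v, phi_u stubbed as simple sums.

    nodes: (Nv, D), edges: (Ne, D), senders/receivers: (Ne,) int indices,
    u: (D,) global features. Illustrative sketch only.
    """
    # 1. Edge update: phi_e(e_k, v_receiver, v_sender, u)
    edges = edges + nodes[senders] + nodes[receivers] + u
    # 2. Aggregate updated edges per receiving node: rho_e->v (sum)
    agg = np.zeros_like(nodes)
    np.add.at(agg, receivers, edges)
    # 3. Node update: phi_v(agg_i, v_i, u)
    nodes = nodes + agg + u
    # 4. Global update: phi_u(sum of edges, sum of nodes, u)
    u = u + edges.sum(0) + nodes.sum(0)
    return nodes, edges, u
```

In the framework, swapping in different phi functions (MLPs, identities, constants) recovers specific architectures such as message-passing neural networks or deep sets.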