Adscn's Blog

Unified Latents: Coupling Generation and Encoding within the Diffusion-Model Framework

This post introduces the Unified Latents architecture proposed by the Google DeepMind team, which unifies generation and representation compression under the diffusion-model framework and yields an elegant information bound. The trade-off between generation and encoding: in the generative-modeling setting (see this post for details), we start with a distribution on a high-dimensional space \[ x\sim \mu,\ p(x):=\mathrm{Law}(\mu) : \mathbb R^n \to [0,\infty) \] Historically, …
2026-02-26
Machine Learning
#Diffusion Models

nanochat Study Notes

This document records my notes from reading and studying Andrej Karpathy's open-source project nanochat. The git version studied is f5a0ea4. Project overview: nanochat is a full-stack GPT library implemented in (almost — a bit of Rust is used to train the tokenizer) pure Python, covering pretrain, midtrain, finetune, inference, and more. Its goal is maximal simplicity, to help readers understand how large language models (LLMs) work.
2026-01-10
Programming
#LLM #Open Source

Stochastic Differential Equations (SDE)

References: largely based on Bernt Øksendal's "Stochastic Differential Equations: An Introduction with Applications", plus some personal understanding and a few measure-theoretic supplements. I may also work through some of the exercises. This is mainly a quick run-through and will not go too deep into the proof details of certain theorems. Prerequisites: basic probability theory and real analysis; measure theory is helpful. 1. Measure-Theoretic Probability and Foundations of Stochastic Processes (Ch
2026-01-01
Mathematics
#Stochastic Differential Equations #Ito Calculus

QAOA

Quantum Approximate Optimization Algorithm (QAOA) Problem Setting Let \(z=z_1 z_2 \ldots z_n\) be a string of \(n\) bits, where each bit \(z_i \in \{0, 1\}\). We define a cost function \(C(z)\) that a
2025-12-15
Machine Learning
#Optimization #Quantum Computing #QAOA
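For context on where the excerpt's problem setting leads, the standard QAOA construction (per Farhi, Goldstone, and Gutmann's original formulation; the post's own notation may differ) encodes the cost function in a diagonal operator and alternates it with a mixing operator:

```latex
% Cost function as a sum of local clauses; mixing operator over Pauli-X terms
C(z) = \sum_{\alpha=1}^{m} C_{\alpha}(z), \qquad
B = \sum_{j=1}^{n} \sigma^{x}_{j}.

% Depth-p variational state prepared from the uniform superposition |s>
|\boldsymbol{\gamma},\boldsymbol{\beta}\rangle
  = e^{-i\beta_p B}\, e^{-i\gamma_p C} \cdots
    e^{-i\beta_1 B}\, e^{-i\gamma_1 C}\, |s\rangle .
```

The \(2p\) angles \(\gamma_1,\dots,\gamma_p,\beta_1,\dots,\beta_p\) are then chosen to maximize the expectation \(\langle\boldsymbol{\gamma},\boldsymbol{\beta}|\,C\,|\boldsymbol{\gamma},\boldsymbol{\beta}\rangle\).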

Reading Notes on "The Principle of Diffusion Models"

This note summarizes the key points, records the key formulae, and offers some discussion of the book "The Principle of Diffusion Models" by Song et al.  arXiv 1. Deep Generative Modeling TODO. 2. Varia
2025-11-03
Machine Learning
#Generative Models #Diffusion Models

Flow matching, Score-based Generative Model, Schrödinger Bridge and Optimal Transport

Reference Denoising Diffusion Probabilistic Models Diffusion Schrödinger Bridge Matching Simplified Diffusion Schrödinger Bridge Speed-accuracy relations for diffusion models: Wisdom from nonequilibri
2025-09-18
Machine Learning
#Generative Models #Flow Matching #Score-based Models #Optimal Transport #Stochastic Differential Equations

Variational Calculus in VAE

Given a probability distribution \(p(x)\), we want to find an encoder \(q_{\theta}(z|x)\) that approximates the true posterior \(p(z|x)\). We then resort to minimizing the Kullback-Leibler divergence bet
2025-08-25
Machine Learning
#Generative Models #VAE #Calculus of Variations
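Continuing the excerpt's setup, the standard decomposition (the identity the stated objective rests on, not part of the post's visible text) shows why minimizing this KL divergence is equivalent to maximizing the evidence lower bound (ELBO):

```latex
\begin{aligned}
D_{\mathrm{KL}}\bigl(q_{\theta}(z|x)\,\|\,p(z|x)\bigr)
  &= \mathbb{E}_{q_{\theta}(z|x)}\!\left[\log\frac{q_{\theta}(z|x)}{p(z|x)}\right] \\
  &= \log p(x)
     - \underbrace{\mathbb{E}_{q_{\theta}(z|x)}\!\left[\log\frac{p(x,z)}{q_{\theta}(z|x)}\right]}_{\text{ELBO}} .
\end{aligned}
```

Since \(\log p(x)\) does not depend on \(\theta\), minimizing the KL term over \(\theta\) is the same as maximizing the ELBO.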

Renormalization Group Flow as Optimal Transport

Reference Renormalization Group Flow as Optimal Transport Introduction I have very little knowledge of physics, so there may be many errors in this post. If you see any errors, please point them out i
2025-08-21
Physics
#Optimal Transport #Renormalization Group

Flow Matching

Reference Flow Matching For Generative Modeling Continuous Normalizing Flows Flow Straight and Fast Preliminaries The objective of generative modeling is to learn the underlying distribution \(p_{data
2025-08-09
Machine Learning
#Generative Models #Flow Matching
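As a sketch of the objective the excerpt refers to, the flow-matching loss and its tractable conditional variant from Lipman et al.'s "Flow Matching for Generative Modeling" (the post's own notation may differ) regress a learned velocity field \(v_{\theta}\) onto a target field \(u_t\):

```latex
\mathcal{L}_{\mathrm{FM}}(\theta)
  = \mathbb{E}_{t\sim\mathcal{U}[0,1],\; x\sim p_t}
    \bigl\| v_{\theta}(t,x) - u_t(x) \bigr\|^2,
\qquad
\mathcal{L}_{\mathrm{CFM}}(\theta)
  = \mathbb{E}_{t,\; x_1\sim q,\; x\sim p_t(\cdot\mid x_1)}
    \bigl\| v_{\theta}(t,x) - u_t(x\mid x_1) \bigr\|^2 .
```

The two losses have identical gradients with respect to \(\theta\), so the conditional form, whose target \(u_t(x\mid x_1)\) is available in closed form, can be trained directly.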

Differential Manifold

References John Lee, Introduction to Smooth Manifolds Loring Tu, An Introduction to Manifolds Victor Guillemin and Alan Pollack, Differential Topology For Chinese readers, you can refer to the websit
2025-08-04
Mathematics
#Differential Geometry
