Adscn's Blog

Topology

Basic Concepts Topological Space A topological space is a set \(X\) together with a collection \(\mathcal{T}\) of open sets satisfying the following properties: 1. \(\emptyset \in \mathcal{T}\) and \(X \in \mathcal{T}\) …
2025-07-30
Mathematics
#Topology
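The excerpt above is cut off mid-definition; for reference, the standard axioms it appears to be stating (as in, e.g., Munkres) are:

\[
\begin{aligned}
&\text{(1)}\ \emptyset \in \mathcal{T} \ \text{and}\ X \in \mathcal{T};\\
&\text{(2)}\ \text{arbitrary unions of members of } \mathcal{T} \text{ belong to } \mathcal{T};\\
&\text{(3)}\ \text{finite intersections of members of } \mathcal{T} \text{ belong to } \mathcal{T}.
\end{aligned}
\]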

Differential Geometry

References John Lee, Introduction to Smooth Manifolds; Loring Tu, An Introduction to Manifolds; Victor Guillemin and Alan Pollack, Differential Topology. For Chinese readers, you can refer to the websit…
2025-07-30
Mathematics
#Differential Geometry

GAN and Wasserstein GAN

Introduction Generative Adversarial Networks were invented in 2014 by Ian J. Goodfellow et al. as a generative model; the idea derives from game theory, where two players compete against one another …
2025-07-22
Machine Learning
#Generative Models #GAN #Wasserstein Distance

Wasserstein Distance and Optimal Transport

Optimal Transport Problem Definition Given two probability measures \(\mu\) and \(\nu\) on measurable spaces \((X, \mathcal{A})\) and \((Y, \mathcal{B})\), respectively, the Optimal Transport Problem …
2025-07-16
Mathematics
#Optimal Transport #Wasserstein Distance
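As a concrete aside to the excerpt above (an illustration, not code from the post): in one dimension the optimal transport plan between two equal-size empirical measures is the monotone coupling, so the 1-Wasserstein distance reduces to an average of gaps between sorted samples.

```python
import numpy as np

def wasserstein_1d(x, y):
    """1-Wasserstein distance between two equal-size empirical
    distributions on the real line: in 1D the optimal coupling is
    monotone, so sort both samples and average the absolute gaps."""
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    assert len(x) == len(y), "equal sample sizes assumed for simplicity"
    return float(np.mean(np.abs(x - y)))

# Shifting a sample by a constant c moves it exactly c in W1.
print(wasserstein_1d([0.0, 1.0, 2.0], [3.0, 4.0, 5.0]))  # → 3.0
```

This sorted-sample formula is a special 1D shortcut; in higher dimensions one has to solve the transport problem itself (e.g. as a linear program over couplings).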

Densest Subgraph

References: A Convex-Programming Approach for Efficient Directed Densest Subgraph Discovery; Efficient and Scalable Directed Densest Subgraph Discovery. UDS Problem Definition The Undirected Densest Subgraph …
2025-06-16
Mathematics
#Graph Theory #Optimization
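To get a concrete feel for the UDS objective \(|E(S)|/|S|\), here is a sketch of the classic greedy peeling 2-approximation (Charikar, 2000) — note this is a standard baseline, not the convex-programming approach the referenced papers develop.

```python
from collections import defaultdict

def densest_subgraph_peel(edges):
    """Greedy peeling for the undirected densest subgraph (maximize
    |E(S)|/|S|): repeatedly delete a minimum-degree vertex and keep
    the densest intermediate vertex set. A 2-approximation."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes = set(adj)
    m = sum(len(nbrs) for nbrs in adj.values()) // 2  # current edge count
    best_density, best_set = 0.0, set(nodes)
    while nodes:
        density = m / len(nodes)
        if density > best_density:
            best_density, best_set = density, set(nodes)
        u = min(nodes, key=lambda x: len(adj[x]))  # peel min-degree vertex
        for v in adj[u]:
            adj[v].discard(u)
        m -= len(adj[u])
        del adj[u]
        nodes.remove(u)
    return best_density, best_set

# A 4-clique {0,1,2,3} plus a pendant edge (3,4): the clique wins
# with density 6/4 = 1.5, beating the whole graph's 7/5 = 1.4.
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3), (3, 4)]
print(densest_subgraph_peel(edges))
```

The exact optimum can be found via max-flow or LP formulations; peeling trades that exactness for a simple linear-time pass.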

Random Coordinate Descent Methods

This is a reading note on the paper “Random Coordinate Descent Methods for Minimizing Decomposable Submodular Functions” by A. Ene and Huy L. Nguyen (arXiv). Knowledge prerequisites on submodular functions (i…
2025-06-14
Mathematics
#Optimization #Coordinate Descent

Submodular Functions and the Lovász Extension

References: Wikipedia – Submodular function; Lecture 7: Submodular Functions, Lovász Extension and Minimization; Learning with Submodular Functions: A Convex Optimization Perspective. Submodular Function…
2025-06-11
Mathematics
#Optimization #Submodular Functions

Common Loss Functions in Machine Learning

Regression Losses Mean Squared Error (MSE) Mean squared error is one of the most commonly used loss functions for regression tasks. It computes the average of the squared differences between predicted and actual values: \[ \text{MSE} = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 \] where \(y_i\) is the actual value and \(\hat{y}_…
2025-06-10
Machine Learning
#Loss Functions
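The MSE formula quoted in the excerpt can be sketched in a few lines of Python (an illustration, not code from the post):

```python
def mse(y_true, y_pred):
    """Mean squared error: MSE = (1/n) * sum((y_i - yhat_i)^2),
    the average squared gap between actual and predicted values."""
    n = len(y_true)
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / n

# Residuals are (0, 0, 2), so MSE = (0 + 0 + 4) / 3.
print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # → 1.3333333333333333
```

Because the error is squared, a single large residual dominates the average — one reason MSE is sensitive to outliers compared with, say, mean absolute error.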

Test Post

2025-06-07
Other
#Test

Reading Notes on Anwar Shaikh's Capitalism

Preface This book mainly answers several important questions: 1. How to explain the stark contrast around 1940: before the war, prices showed sustained rise-and-fall fluctuations, while in the postwar period the price level exhibited a new pattern of movement, namely unending growth; and how to explain the long-wave regularities once prices are corrected by the price of gold. — This is the main thread running through the book, discussed from Chapter 2 through the final chapter. 2. Modern macroeconomics is built on assumptions that are neither realistic nor necessary, such as diminishing marginal efficiency and hyper-rationality. How, from the perspective of real competition, or from emergent properties, can one derive the common …
2025-06-07
Reading Notes
#Economics


Hexo Fluid