Part of your course grade is determined by participation, which can include both in-class participation as well as discussion here on the course webpage. Therefore, your first assignment is to:
- create an account, and
- leave a comment on this post containing your favorite mathematical formula (see below).
To make things interesting, your comment should include a description of your favorite mathematical formula typeset in $\LaTeX$. If you don’t know how to use $\LaTeX$, this is a great opportunity to learn — a very basic introduction can be found here. (And if you don’t have a favorite mathematical formula, this is a great time to pick one!)
(P.S. Anyone interested in hearing about some cool “favorite theorems” should check out this podcast.)
chain rule?
$$\frac{\partial z}{\partial x} = \frac{\partial z}{\partial y}\frac{\partial y}{\partial x}$$
Continuous Fourier transform
$$ F(\omega) = \int_{-\infty}^{+\infty} f(t) e^{-i \omega t} dt $$
The Pythagorean theorem?
$$a^2 +b^2 = c^2$$
Definition of Convolution
$$ (f*g)(x) = \int f(t)g(x-t) dt$$
Euler’s Identity. So beautiful.
\[ e^{i\pi}+1=0\]
Euler’s Polyhedron Formula
\[ V - E + F = 2 \]
Collatz conjecture
$$C(n) = \begin{cases} n/2 &\text{if } n \equiv 0 \pmod{2}\\ 3n+1 & \text{if } n\equiv 1 \pmod{2} .\end{cases}$$
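The conjecture is easy to poke at numerically; a minimal Python sketch (function names are my own) that iterates the map until it reaches 1:

```python
# Quick numerical check of the Collatz map; function names are my own.
def collatz_step(n):
    """One application of C(n): halve if even, otherwise 3n + 1."""
    return n // 2 if n % 2 == 0 else 3 * n + 1

def collatz_trajectory(n):
    """Iterate C starting from n until reaching 1 (conjectured to always happen)."""
    path = [n]
    while n != 1:
        n = collatz_step(n)
        path.append(n)
    return path

print(collatz_trajectory(6))  # [6, 3, 10, 5, 16, 8, 4, 2, 1]
```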
Taylor series
$$ f(x) = \sum_{n = 0}^\infty \frac{f^{(n)}(a)}{n!}(x - a)^n $$
$$\forall \epsilon > 0 , \exists n \in \mathbb{N} : \frac{1}{n} < \epsilon$$
Policy gradient!
$$\nabla_\theta J(\theta) = \mathbb{E}_{\pi_\theta} \big[ \nabla_\theta \log \pi_\theta (a_t \vert s_t) \; Q^{\pi_\theta}(s_t, a_t) \big]$$
Mandelbrot set:
\[\begin{aligned}
f_c(z) &:= z^2 + c\\
M &:= \{c \in \mathbb{C} : \forall n \in \mathbb{N}, |f^n_c(0)| \le 2\}
\end{aligned}\]
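The definition above translates directly into an approximate membership test; a small Python sketch (the iteration cap `max_iter` is my own assumption — the actual definition quantifies over all $n$):

```python
# Approximate membership test for M; the iteration cap is an assumption,
# since true membership requires boundedness for all n.
def in_mandelbrot(c, max_iter=100):
    """Iterate f_c(z) = z^2 + c from z = 0 and check |z| stays <= 2."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False  # escaped: certainly not in M
    return True  # never escaped within max_iter steps: likely in M

print(in_mandelbrot(0))   # True  (0 stays at 0 forever)
print(in_mandelbrot(1))   # False (0, 1, 2, 5, ... escapes)
```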
Euler’s formula in complex analysis:
$$e ^{i \theta} = \cos \theta + i \sin \theta$$
The roots of any quadratic equation:
\begin{equation}
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
\end{equation}
Pythagorean Theorem:
a^2+b^2=c^2
Not sure if it’s my favorite, but this one is always fun
$$\text{median}(S) = \arg\min_x \sum_{i=1}^{|S|} | s_i - x|$$
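The claim that the median minimizes the total absolute deviation can be checked numerically; a toy Python sketch (the data set and grid search are my own illustration):

```python
# Numerical check that the median minimizes the sum of absolute deviations;
# the sample data and the grid search are just for illustration.
S = [1, 2, 3, 10, 20]  # median is 3

def total_deviation(x, values):
    """Sum of |s_i - x| over the data set."""
    return sum(abs(s - x) for s in values)

candidates = [i / 10 for i in range(0, 301)]  # grid over [0, 30]
best = min(candidates, key=lambda x: total_deviation(x, S))
print(best)  # 3.0 — the median of S
```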
Chernoff Bound:
$$Pr\left[\sum_i X_i \geq E\left[\sum_i X_i\right] + d\right] \leq e^{-2d^2/n}$$
Basel Problem:
$\displaystyle{\sum_{i=1}^{\infty}\frac{1}{i^2}} = \frac{\pi^2}{6}$
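The convergence is slow but easy to see numerically; a quick Python check (the truncation point is my own choice — the tail after $N$ terms is about $1/N$):

```python
import math

# Partial sums of the Basel series converge to pi^2/6 from below;
# the tail after N terms is about 1/N, so 100000 terms gives ~5 digits.
N = 100_000
partial = sum(1 / i**2 for i in range(1, N + 1))
print(partial, math.pi**2 / 6)
```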
Guess I liked it because I just learnt it in Set Theory yesterday, but…
The recursive set encoding of natural numbers
$0 = \emptyset $
$n + 1 = n \cup \{ n \}$
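The encoding can be modeled directly in Python with `frozenset` (so that sets can contain sets); the function name is my own:

```python
# The von Neumann encoding of naturals, modeled with frozensets.
def encode(n):
    """0 = {} and n + 1 = n ∪ {n}."""
    s = frozenset()
    for _ in range(n):
        s = s | frozenset([s])
    return s

print(len(encode(3)))          # 3: the number n has exactly n elements
print(encode(2) in encode(3))  # True: n is an element of n + 1
```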
Monte Carlo Integration?
$I=\int_\Omega f(\overline{x})d\overline{x}$
$\overline{x}_{i}\in\Omega$
$I \approx \frac{V}{N}\sum\limits_{i=1}^N f(\overline{x}_{i})$
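The estimator is a one-liner in practice; a toy Python sketch for $f(x) = x^2$ on $\Omega = [0,1]$ (so $V = 1$ and the exact integral is $1/3$ — the integrand and domain are my own example):

```python
import random

# Monte Carlo estimate of I = integral of x^2 over [0, 1], which equals 1/3.
# The integrand, domain (V = 1), and sample count are my own toy choices.
random.seed(0)  # fixed seed for reproducibility
N = 100_000
estimate = sum(random.random() ** 2 for _ in range(N)) / N
print(estimate)  # close to 1/3
```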
Experience + reflection = progress (Ray Dalio)
This is physics but
$L = T - U$ and $\frac{\partial L}{\partial q_i} - \frac{d}{dt} \frac{\partial L}{\partial \dot{q}_i} = 0$
Markov’s inequality
$ P(X \geq a) \leq \frac{E[X]}{a}$
Not exactly math, but the classic rendering equation!
$$ L_o(\textbf{x},\omega_o,\lambda,t) = L_e(\textbf{x},\omega_o,\lambda,t) + \int_{\Omega} f_r(\textbf{x},\omega_i,\omega_o,\lambda,t)L_i(\textbf{x},\omega_i,\lambda,t)(\omega_i \cdot \textbf{n})d\omega_i $$
Reflexivity of equality x=x
Cauchy’s Stress Theorem:
$\boldsymbol{T}^{(\boldsymbol{n})} = \boldsymbol{n} \cdot \boldsymbol{\sigma}$
divergence theorem
$\iiint\limits_V \, (\nabla\cdot \mathbf{F})\ dV = \oint\limits_S \, (\mathbf{F}\cdot \mathbf{n})\ dS$
Kinematic Reconstruction Equation :
$(g^{-1}\dot{g})^{\vee} = -A(\alpha)\dot{\alpha}$
I like the formula for the $n^{th}$ Catalan number.
$$C_n=\frac{(2n)!}{(n+1)!n!}$$
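The formula always produces an integer, which is easy to verify for the first few terms; a short Python sketch (function name is my own):

```python
from math import factorial

def catalan(n):
    """C_n = (2n)! / ((n+1)! n!); always an integer, so // is exact."""
    return factorial(2 * n) // (factorial(n + 1) * factorial(n))

print([catalan(n) for n in range(6)])  # [1, 1, 2, 5, 14, 42]
```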
Cauchy’s Integral Formula:
\[f(a) = \frac{1}{2\pi i}\int_{\gamma}\frac{f(z)}{z-a}dz\]
This one that I learnt in concepts of math at CMU:
$|\mathbb{R}| = |\mathcal{P}(\mathbb{N})|$
Integration by parts:
$$ \int u \, dv = uv - \int v \, du $$
An integral producing the GCD!
\[ \int_0^{\pi/2}\ln{\lvert\sin(mx)\rvert}\cdot \ln{\lvert\sin(nx)\rvert}\, dx=\frac{\pi^3}{24}\frac{\gcd^2(m,n)}{mn}+\frac{\pi\ln^2(2)}{2} \]
I should probably use this account instead. So here’s a nice Fourier series used in the derivation:
\[ \sum_{k=1}^\infty \frac{\cos(kx)}{k}=-\ln\left(2\,\bigg\lvert \sin\left(\frac{x}{2}\right) \bigg\rvert\right) \]
fresh out of 21-801 Asymptotic Convex Geometry. this is what baby geometers play with.
it is the unit ball in $(\mathbb{R}^n,\|\cdot\|_p)$, that is, $\mathbb{R}^n$ equipped with the $p$-norm $\|\cdot\|_p$:
\begin{equation}
B_{p}^{n} = \left\{x \in \mathbb{R}^n \,:\, ||x||_p \leq 1\right\}
\end{equation}
actual favorite: slicing with hyperplanes, the volume of a slice
given a centred convex body $K \subseteq \mathbb{R}^n$ of volume 1 and a direction $\theta \in \mathbb{S}^{n-1}$, consider the function giving the $(n-1)$-dimensional volume of the slice of $K$ by the hyperplane orthogonal to $\theta$ passing through $t\theta$, for $t \in \mathbb{R}$:
\begin{equation}
f(t) = |K \cap (t\theta + \theta^\bot)|
\end{equation}
Wave Equation
\begin{equation}
\frac{\partial^2 u }{\partial t^2} = c^2 \nabla^2 u
\end{equation}
Euler’s identity:
$ e^{i \pi }+1=0 $
Probability of uniformly sampled orientation with magnitude less than $\theta$
$\frac{1}{\pi} (\theta – \sin(\theta))$
Euclidean Distance!
For points $(a_{1}, b_{1})$ and $(a_{2}, b_{2})$, the distance between them is $ d = \sqrt{(a_{1} - a_{2})^2 + (b_{1} - b_{2})^2}$
I took my first theoretical PDE class last semester and used the Divergence Theorem a lot:
$$\int_{\Omega} \nabla \cdot F \, dV = \int_{\partial \Omega} F \cdot \nu \, dS$$
Pythagorean Identity
$\sin^{2} \theta + \cos^{2} \theta = 1 $
My current favorite formula is the KdV (Soliton) equation:
$$\frac{\partial u}{\partial t}+6 u \frac{\partial u}{\partial x} + \frac{\partial^3 u}{\partial x^3}=0$$
because 1) it is solvable even though it is nonlinear and third order, 2) it has really cool pattern-formation (stable traveling wave) solutions, and 3) it has a number of applications and relationships to other fields.
Stokes Theorem
$\int_{\partial S} \alpha = \int_{S} d\alpha$
Gauss’s Law, if it counts as mathematical
$$\oint_S E_n \, dA = \frac{1}{\varepsilon_0} Q_{\text{inside}}$$
wow hi, one minute apart huh?
Binomial distribution:
Probability of exactly $k$ successes in $n$ trials each with probability $p$ is ${n\choose k}p^{k}{(1-p)}^{n-k}$
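A quick sanity check of the formula is that the probabilities over $k = 0, \dots, n$ sum to 1; a small Python sketch (function name and the example parameters are my own):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(exactly k successes in n trials with success probability p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# The pmf over k = 0..n sums to 1 (binomial theorem applied to (p + (1-p))^n).
total = sum(binom_pmf(k, 10, 0.3) for k in range(11))
print(total)
```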
This definition of $e$:
$$e = \lim_{n\rightarrow \infty} \left(1 + \frac{1}{n}\right)^n$$
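The convergence is easy to see numerically; a quick Python check (the choice of $n$ is mine — the error shrinks like $e/(2n)$):

```python
import math

# (1 + 1/n)^n approaches e as n grows; error is on the order of e/(2n).
n = 10**6
approx = (1 + 1 / n) ** n
print(approx, math.e)
```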
Bayes’ rule $$P(A|B)=\frac{P(B|A)P(A)}{P(B)}$$
Euler’s formula
$e^{i\theta} = \cos\theta + i\sin\theta$
Spectral Theorem (for symmetric matrices)
\[ A = U \Lambda U^T \]
Taylor series: a great generalization of function approximation methods, a fundamental way of dealing with non-linear systems, and an inspiring idea of function decomposition.
$f(x)=\sum_{n=0}^{\infty}\frac{f^{(n)}(a)}{n!}(x-a)^{n}$
Euler’s identity:
$e^{i\pi}+1=0$
The Rendering Equation, because I feel like I need to review rendering.
\[L_o(\mathbf{x},\omega_o,\lambda,t)=L_e(\mathbf{x},\omega_o,\lambda,t)+\int_{\Omega}f_r(\mathbf{x},\omega_i,\omega_o,\lambda,t)L_i(\mathbf{x},\omega_i,\lambda,t)(\omega_i \cdot \mathbf{n})d\omega_i\]
Trig identity:
$\sin^{2}(x) + \cos^{2}(x) = 1$
Poisson’s equation:
\begin{equation}
\nabla \cdot (-D\nabla c)=S
\end{equation}
Euler’s product formula:
\[ \frac{1}{\zeta(s)} = \prod_p \left(1 - \frac{1}{p^s}\right) \]
Stewart’s Theorem (Geometry!)
$$MAN+DAD=BMB+CNC$$
To use a bit more LaTeX, $$amn+ad^2=b^2m+c^2n$$