# Assignment -1: Favorite Formula

Part of your course grade is determined by participation, which can include both in-class participation as well as discussion here on the course webpage.  Therefore, your first assignment is to:

1. create an account, and
2. leave a comment on this post.

To make things interesting, your comment should include a description of your favorite mathematical formula typeset in $\LaTeX$.  If you don’t know how to use $\LaTeX$ this is a great opportunity to learn — a very basic introduction can be found here.  (And if you don’t have a favorite mathematical formula, this is a great time to pick one!)

(P.S. Anyone interested in hearing about some cool “favorite theorems” should check out this podcast.)

## 58 thoughts on “Assignment -1: Favorite Formula”

1. Yufei Ye says:

chain rule?
$$\frac{\partial z}{\partial x} = \frac{\partial z}{\partial y}\frac{\partial y}{\partial x}$$

2. donglaix says:

Continuous Fourier transform

$$F(\omega) = \int_{-\infty}^{+\infty} f(t) e^{-i \omega t} dt$$

3. joshkalapos says:

The pythagorean theorem?
$$a^2 +b^2 = c^2$$

4. leohuang says:

Definition of Convolution

$$(f*g)(x) = \int f(t)g(x-t) dt$$
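A discrete analogue of this integral can be sketched in a few lines of Python (the helper name `convolve` is illustrative, not from the comment):

```python
def convolve(f, g):
    """Discrete analogue of (f*g)(x) = sum_t f(t) g(x - t), for finite lists.

    Equivalent to multiplying the polynomials whose coefficients are f and g.
    """
    out = [0] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            out[i + j] += fi * gj
    return out
```

For example, `convolve([1, 2], [1, 1, 1])` returns the coefficients of $(1+2x)(1+x+x^2)$.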

5. yujiew says:

Euler’s Identity. So beautiful.

$e^{i\pi}+1=0$

6. Connor says:

Euler’s Polyhedron Formula

$V - E + F = 2$

7. earthwalker says:

Collatz conjecture
$$C(n) = \begin{cases} n/2 &\text{if } n \equiv 0 \pmod{2}\\ 3n+1 & \text{if } n\equiv 1 \pmod{2} .\end{cases}$$
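The piecewise map above translates directly into Python; a small sketch (function names are illustrative), where the stopping-time helper assumes the conjecture holds for its input:

```python
def collatz_step(n):
    """One step of the Collatz map: n/2 if n is even, 3n + 1 if n is odd."""
    return n // 2 if n % 2 == 0 else 3 * n + 1

def collatz_stopping_time(n):
    """Number of steps to reach 1, assuming the conjecture holds for n."""
    steps = 0
    while n != 1:
        n = collatz_step(n)
        steps += 1
    return steps
```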

8. jyiyang says:

Taylor series

$$f(x) = \sum_{n = 0}^\infty \frac{f^{(n)}(a)}{n!}(x - a)^n$$
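As a quick numerical sketch (the function name `taylor_exp` is mine): for $f(x) = e^x$ and $a = 0$, every derivative is $e^a = 1$, so the series reduces to $\sum x^n / n!$:

```python
from math import factorial, e

def taylor_exp(x, terms=20):
    """Partial Taylor series of e^x about a = 0: sum of x^n / n! for n < terms."""
    return sum(x ** n / factorial(n) for n in range(terms))
```

Twenty terms already match $e$ to well beyond double precision at $x = 1$.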

9. Yiyuan Chen says:

$$\forall \epsilon > 0 , \exists n \in \mathbb{N} : \frac{1}{n} < \epsilon$$

10. jzhanson says:

$$\nabla_\theta J(\theta) = \mathbb{E}_{\pi_\theta} \big[ \nabla_\theta \log \pi_\theta (a_t \vert s_t) \; Q^{\pi_\theta}(s_t, a_t) \big]$$

11. sdkim1 says:

Mandelbrot set:
$$\begin{aligned} f_c(z) &:= z^2 + c\\ M &:= \{c \in \mathbb{C} : \forall n \in \mathbb{N}, |f^n_c(0)| \le 2\} \end{aligned}$$
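A hedged sketch of the membership test (truncating the "for all $n$" to a finite iteration cap, so points near the boundary may be misclassified; the function name is mine):

```python
def in_mandelbrot(c, max_iter=100):
    """Approximate test: iterate f_c(z) = z^2 + c from z = 0 and
    report False as soon as |f_c^n(0)| exceeds 2."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True
```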

12. ShumianXin says:

Euler’s formula in complex analysis:

$$e ^{i \theta} = \cos \theta + i \sin \theta$$

13. chenj47 says:

The roots of any quadratic equation:

$$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$$
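A minimal sketch of the formula in Python (the name `quadratic_roots` is illustrative); using `cmath` means a negative discriminant yields complex roots instead of an error:

```python
import cmath

def quadratic_roots(a, b, c):
    """Both roots of ax^2 + bx + c = 0 via the quadratic formula (a != 0)."""
    disc = cmath.sqrt(b * b - 4 * a * c)
    return (-b + disc) / (2 * a), (-b - disc) / (2 * a)
```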

14. leonidk says:

$$\text{median}(S) = \arg\min_x \sum_{i=1}^{|S|} | s_i - x|$$
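The objective is easy to check numerically; a small sketch (helper name mine) that evaluates $\sum_i |s_i - x|$, which the median minimizes:

```python
def abs_deviation_sum(S, x):
    """Sum of absolute deviations |s_i - x| over a list S."""
    return sum(abs(s - x) for s in S)
```

For `S = [1, 2, 9]`, the value at the median 2 is no larger than at any other point.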

15. Josh Durham says:

Chernoff Bound:

$$\Pr\left[\sum_i X_i \geq E\left[\sum_i X_i\right] + d\right] \leq e^{-2d^2/n}$$

16. dahoas says:

Basel Problem:

$\displaystyle{\sum_{i=1}^{\infty}\frac{1}{i^2}} = \frac{\pi^2}{6}$
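The convergence is slow (the tail of the partial sum is about $1/n$), which a quick sketch shows (function name mine):

```python
import math

def basel_partial_sum(n):
    """Partial sum of 1/i^2 for i = 1..n; tends to pi^2 / 6."""
    return sum(1 / i ** 2 for i in range(1, n + 1))
```

With $n = 10^5$ terms the partial sum is still only within about $10^{-5}$ of $\pi^2/6$.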

17. YuuSama says:

Guess I liked it because I just learnt it in Set Theory yesterday, but…
The recursive set encoding of natural numbers

$0 = \emptyset$
$n + 1 = n \cup \{ n \}$

18. tcl1 says:

Monte Carlo Integration?
$I=\int_\Omega f(\overline{x})d\overline{x}$
For uniformly sampled points $\overline{x}_{i}\in\Omega$, with $V$ the volume of $\Omega$:
$I \approx \frac{V}{N}\sum\limits_{i=1}^N f(\overline{x}_{i})$
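A one-dimensional sketch of the estimator (names and the fixed seed are my choices, for reproducibility):

```python
import random

def monte_carlo_integrate(f, a, b, n=100_000, seed=0):
    """Estimate the integral of f over [a, b] as (V/N) * sum f(x_i),
    with x_i uniform on [a, b] and V = b - a."""
    rng = random.Random(seed)
    volume = b - a
    return volume / n * sum(f(rng.uniform(a, b)) for _ in range(n))
```

With $10^5$ samples the estimate of $\int_0^1 x^2\,dx = 1/3$ is typically good to a few parts in a thousand.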

19. williamq says:

Experience + reflection = progress by Ray Dalio

20. ljelenak says:

This is physics but

$L = T - U$ and $\frac{\partial L}{\partial q_i} - \frac{d}{dt} \frac{\partial L}{\partial \dot{q_i}} = 0$

21. Hmahon says:

Markov’s inequality

$P(X \geq a) \leq \frac{E[X]}{a}$

22. acpatel says:

Not exactly math, but the classic rendering equation!

$$L_o(\textbf{x},\omega_o,\lambda,t) = L_e(\textbf{x},\omega_o,\lambda,t) + \int_{\Omega} f_r(\textbf{x},\omega_i,\omega_o,\lambda,t)L_i(\textbf{x},\omega_i,\lambda,t)(\omega_i \cdot \textbf{n})d\omega_i$$

23. t-nice says:

divergence theorem

$\iiint\limits_V \, (\nabla\cdot \mathbf{F})\ dV = \oint\limits_S \, (\mathbf{F}\cdot \mathbf{n})\ dS$

Kinematic Reconstruction Equation:
$(g^{-1}\dot{g})^{\vee} = -A(\alpha)\dot{\alpha}$

25. zwellner says:

I like the formula for the $n^{th}$ Catalan number.

$$C_n=\frac{(2n)!}{(n+1)!n!}$$
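The closed form translates directly; a small sketch (function name mine) using exact integer arithmetic:

```python
from math import factorial

def catalan(n):
    """n-th Catalan number: (2n)! / ((n+1)! * n!)."""
    return factorial(2 * n) // (factorial(n + 1) * factorial(n))
```

The first few values come out 1, 1, 2, 5, 14, 42, as expected.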

26. jingyangliu says:

Cauchy’s Integral Formula:
$f(a) = \frac{1}{2\pi i}\int_{\gamma}\frac{f(z)}{z-a}dz$

27. Hesper says:

This one that I learnt in concepts of math at CMU:
$|\mathbb{R}| = |\mathcal{P}(\mathbb{N})|$

28. Georgia says:

Integration by parts:
$$\int u\, dv = uv - \int v\, du$$

29. m109n6s8t8e3r says:

An integral producing the GCD!
$\int_0^{\pi/2}\ln{\lvert\sin(mx)\rvert}\cdot \ln{\lvert\sin(nx)\rvert}\, dx=\frac{\pi^3}{24}\frac{\gcd^2(m,n)}{mn}+\frac{\pi\ln^2(2)}{2}$

1. m1o9n6s8t8e3r says:

I should probably use this account instead. So here’s a nice fourier series used in the derivation:

$\sum_{k=1}^\infty \frac{\cos(kx)}{k}=-\ln\left(\bigg\lvert \sin\left(\frac{x}{2}\right) \bigg\rvert\right)$

30. Brandon Cobb says:

fresh out of 21-801 Asymptotic Convex Geometry. this is what baby geometers play with.

it is the unit ball in $(\mathbb{R}^n,||\cdot||_p)$ – that is: $\mathbb{R}^n$ equipped with the p-norm $||\cdot||_p$

$$B_{p}^{n} = \left\{x \in \mathbb{R}^n \,:\, ||x||_p \leq 1\right\}$$

1. Brandon Cobb says:

actual favorite: slicing with hyperplanes, the volume of a slice

given a centred convex body $K \subset \mathbb{R}^n$ of volume 1 and a direction $\theta \in \mathbb{S}^{n-1}$, consider the function of the (n-1)-dimensional volume of the slice of $K$ with a hyperplane orthogonal to $\theta$ passing through $t\theta$ for $t \in \mathbb{R}.$

$$f(t) = |K \cap (t\theta + \theta^\bot)|$$

Wave Equation

$$\frac{\partial^2 u }{\partial t^2} = c^2 \nabla^2 u$$

32. Brian Okorn says:

Probability of uniformly sampled orientation with magnitude less than $\theta$
$\frac{1}{\pi} (\theta – \sin(\theta))$

33. Emma Liu says:

Euclidean Distance!

For points $(a_{1}, b_{1})$ and $(a_{2}, b_{2})$, the distance between them is $d = \sqrt{(a_{1} - a_{2})^2 + (b_{1} - b_{2})^2}$
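A small sketch generalizing to $n$ dimensions (function name mine):

```python
import math

def euclidean_distance(p, q):
    """Euclidean distance between points p and q, given as equal-length tuples."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))
```

The classic 3-4-5 right triangle makes a quick sanity check.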

34. Varun Gudibanda says:

Last semester I took my first theory-oriented PDE class and used the Divergence Theorem a lot

$$\int_{\Omega} \nabla \cdot F dV = \int_{\partial \Omega} F \cdot \nu dS$$

35. faisal says:

My current favorite formula is the KdV (Soliton) equation:

$$\frac{\partial u}{\partial t}+6 u \frac{\partial u}{\partial x} + \frac{\partial^3 u}{\partial x^3}=0$$

because (1) it is solvable even though it is nonlinear and third order, (2) it has really cool pattern-formation solutions (stable traveling waves), and (3) it has a number of applications and relationships to other equations

36. gaurav82 says:

Stokes’ Theorem

$\int_{\partial S} \alpha = \int_{S} d\alpha$

37. jagu says:

Gauss’s Law, if it counts as mathematical
$$\oint_S E_n \, dA = \frac{1}{\varepsilon_0} Q_{\text{inside}}$$

38. lemur says:

Binomial distribution:
Probability of exactly $k$ successes in $n$ trials each with probability $p$ is ${n\choose k}p^{k}{(1-p)}^{n-k}$
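The pmf is a one-liner with the standard-library binomial coefficient (function name mine):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k successes in n independent trials, each with probability p)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)
```

As a sanity check, the probabilities over all $k$ from 0 to $n$ sum to 1.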

39. yaochonl says:

This definition of $e$:
$$e = \lim_{n\rightarrow \infty} \left(1 + \frac{1}{n}\right)^n$$
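The limit converges slowly (the error is roughly $e/(2n)$), which a short sketch illustrates (function name mine):

```python
import math

def e_approx(n):
    """(1 + 1/n)^n, which tends to e as n grows."""
    return (1 + 1 / n) ** n
```

Even at $n = 10^6$ the approximation agrees with $e$ to only about six digits.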

40. Wentao says:

Bayes’ rule $$P(A|B)=\frac{P(B|A)P(A)}{P(B)}$$
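A direct sketch of the rule (function name and the diagnostic-test numbers below are illustrative, not from the comment):

```python
def bayes_posterior(p_b_given_a, p_a, p_b):
    """P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Illustrative numbers: a test with 99% sensitivity and a 5% false-positive
# rate for a condition with 1% prevalence. P(B) comes from total probability.
p_positive = 0.99 * 0.01 + 0.05 * 0.99
```

Despite the accurate test, `bayes_posterior(0.99, 0.01, p_positive)` is only about 1/6: most positives come from the much larger healthy population.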

41. Eric Huang says:

Euler’s formula

$e^{i\theta} = \cos\theta + i\sin\theta$

42. samvit says:

Spectral Theorem (for symmetric matrices)

$A = U \Lambda U^T$

43. tongl1 says:

Taylor series: great generalization of function approximation methods. Fundamental view of dealing with non-linear systems. Inspiring idea of function decomposition.
$f(x)=\sum_{n=0}^{\infty}\frac{f^{(n)}(a)}{n!}(x-a)^{n}$

44. Turtle50UP says:

The Rendering Equation, because I feel like I need to review rendering.

$L_o(\mathbf{x},\omega_o,\lambda,t)=L_e(\mathbf{x},\omega_o,\lambda,t)+\int_{\Omega}f_r(\mathbf{x},\omega_i,\omega_o,\lambda,t)L_i(\mathbf{x},\omega_i,\lambda,t)(\omega_i \cdot \mathbf{n})\,d\omega_i$

45. ethanxu says:

Trig identity:

$\sin^{2}(x) + \cos^{2}(x) = 1$

46. shoheiogawa says:

Poisson’s equation:

$$\nabla \cdot (-D\nabla c)=S$$

47. eric.n says:

Euler’s product formula for the zeta function:

$\frac{1}{\zeta(s)} = \prod_p \left(1 - \frac{1}{p^s}\right)$

48. drli says:

Stewart’s Theorem (Geometry!)
$$MAN+DAD=BMB+CNC$$
To use a bit more LaTeX, $$amn+ad^2=b^2m+c^2n$$