# Assignment -1: Favorite Formula

Part of your course grade is determined by participation, which can include both in-class participation and discussion here on the course webpage.  Therefore, your first assignment is to:

1. create an account, and
2. leave a comment on this post.

To make things interesting, your comment should include a description of your favorite mathematical formula typeset in $\LaTeX$.  If you don’t know how to use $\LaTeX$, this is a great opportunity to learn — a very basic introduction can be found here.  (And if you don’t have a favorite mathematical formula, this is a great time to pick one!)
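As a minimal sketch of what such a comment might look like (assuming the comment box renders text between single dollar signs as inline math, as in the comments below):

```latex
My favorite formula is Euler's identity: $e^{i\pi} + 1 = 0$
```

Double dollar signs, `$$...$$`, produce a displayed (centered, own-line) formula instead of an inline one.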

(P.S. Anyone interested in hearing about some cool “favorite theorems” should check out this podcast.)

## 38 thoughts on “Assignment -1: Favorite Formula”

1. hawner says:

Euler’s formula: $e^{ix} = \cos{x} + i\sin{x}$

2. elu00 says:

Gotta go with the classic
$(-80538738812075974)^3 + 80435758145817515^3 + 12602123297335631^3 = 42.$

3. Pip says:

The only one that stuck in my brain all these years: the Pythagorean theorem: $a^{2} + b^{2} = c^{2}$

4. Jin Shang says:

The two equivalent definitions of $e$:
$$\lim_{n\to \infty} \left(1+\frac{1}{n}\right)^n = \sum_{n=0}^{\infty} \frac{1}{n!} = e$$

5. coffeeholic says:

Bayes’ Rule: $P(A\mid B) = \frac{P(B\mid A)P(A)}{P(B)}$

6. hawaiii says:

Fourier transform: $X(j\omega) = \int_{-\infty}^\infty x(t)\, e^{-j\omega t}\, dt$

7. kshitijgoel says:

I love squaring things in the head, so here we go.

$$(a+b)^2 = a^2 + 2ab + b^2$$

8. Lqq88888 says:

Geodesic equation:
$\frac{\mathrm{d}^2 x^k}{\mathrm{d} t^2} + {\sum_{i,j}\Gamma_{ij}^{k} \frac{\mathrm{d} x^i}{\mathrm{d} t}\frac{\mathrm{d} x^j}{\mathrm{d} t}} = 0$

9. manugopa says:

Thin Lens Equation: $\frac{1}{o} + \frac{1}{i} = \frac{1}{f}$

10. OtB_BlueBerry says:

The von Neumann entropy:
$S=-\mathrm{tr}(\rho\log\rho)$

11. yixinh says:

Good ol’ quadratic equation: $x = \frac{-b \pm \sqrt{b^2-4ac}}{2a}$

12. seanzhang says:

Stokes’ theorem:

$$\int_M d\omega = \int_{\partial M} \omega.$$

13. yitongl says:

Linear interpolation because it’s easy and I use it a lot.

\begin{equation}
\frac{y - y_0}{x - x_0} = \frac{y_1 - y_0}{x_1 - x_0}
\end{equation}

14. slzheng says:

Good ol’ Bayes: $P(A|B)=\frac{P(B|A)P(A)}{P(B)}$

15. yuepeng says:

Minkowski’s inequality: $\|f+g\|_p\le \|f\|_p + \|g\|_p$ for $p\in [1, \infty]$.

16. bahn says:

Euler’s formula: $e^{j\theta} = \cos\theta + j\sin\theta$

17. sdai says:

Principle of least action: $\delta S = 0$

18. ceviri says:

$\forall X\,(\varnothing \not\in X \implies \exists f:X\rightarrow \bigcup X \quad \forall A \in X\,(f(A) \in A))$
(amssymb pls 🙁 )
The Axiom of Choice

19. battal says:

Bayes rule!

$P(X | Y) = \frac{P(Y|X) \cdot P(X)}{\sum_{X'}P(Y|X') \cdot P(X')}$

20. Ninae says:

Heron’s formula is pretty cool! $A = \sqrt{s(s-a)(s-b)(s-c)}$ where $s$ is the semiperimeter of the triangle.

21. andrewc3 says:

Layer-cake formula:

$\int_{X} f d\mu = \int_{0}^{\infty} d_{f}(t) dt$

22. FattyAcid says:

Euler’s identity:
$e^{i\pi}+1 = 0$

23. Zhang Zetian says:

Chain rule:

$\frac{du}{dx} = \frac{du}{dv}\frac{dv}{dx}$

24. abiagiol says:

I’ll go with Wilson’s theorem.

$n \text{ is prime} \Leftrightarrow (n - 1)! \equiv -1 \pmod{n}$

25. irabazle says:

Bayes’ Theorem! $P(A\mid B) = \frac{P(B\mid A)P(A)}{P(B)}$

26. tpan496 says:

Entropy, $I = -\sum_i P_i \log P_i$

27. odadfar says:

Jacobian Transpose IK:

$\Delta \theta = J^T \Delta e$

28. svsalem says:

Difference of cubes!
$a^3 - b^3 = (a-b)(a^2 + ab + b^2)$

29. TheNumbat says:

Divergence/Stokes

$\iiint_V \left(\nabla \cdot \mathbf{F}\right) dV = \iint_S \left(\mathbf{F}\cdot \vec{n}\right) dS$
$\iint_S \left(\nabla \times \mathbf{F}\right) \cdot d\mathbf{S} = \oint_C \mathbf{F}\cdot d\mathbf{r}$

30. Haochen says:

ELBO

\begin{align}
\text{ELBO} = \log P(x) - \text{KL}(q_x \,\|\, P(z \mid x)) &= \int_z q_x(z) \log \frac{P(x, z)}{q_x(z)} \,dz \\
&= E_{z \sim q_x} \bigg[ \log \frac{P(x, z)}{q_x(z)} \bigg]
\end{align}

31. Haochen says:

ELBO

\begin{equation}
\text{ELBO} = \log P(x) - \text{KL}(q_x \,\|\, P(z \mid x)) = \int_z q_x(z) \log \frac{P(x, z)}{q_x(z)} \,dz = E_{z \sim q_x} \bigg[ \log \frac{P(x, z)}{q_x(z)} \bigg]
\end{equation}

32. fpou says:

Stirling’s formula: $n! \sim \sqrt{2\pi n} \left( \frac{n}{e} \right)^n$.

33. Wei Dong says:

Zeta function:
$\zeta(s) = \sum_{n=1}^{\infty}n^{-s}$
with
$\zeta(-1) = -1/12, \zeta(2)=\pi^2/6$, …

34. lykospirit says:

The chain rule!
$$J_{f\circ g}=(J_f\circ g)J_g$$

35. bmiller2 says:

Bayes Theorem

\begin{equation}
P(A|B) = \frac{P(B|A) P(A)}{P(B)}
\end{equation}

36. ddeangul says:

Factorial of Zero
$$0! = 1$$

37. xTheBHox says:

Binomials:
$${n\choose k}=\frac{n!}{(n-k)!\,k!}$$

38. jkcao says:

Angle between two vectors: $\cos \theta = \frac{\vec{a} \cdot \vec{b}}{|\vec{a}||\vec{b}|}$