Assignment 4 (Written): Conformal Parameterization

The written part of your next assignment, on conformal surface flattening, is now available below. Conformal flattening is important for (among other things) making the connection between processing of 3D surfaces and existing fast algorithms for 2D image processing. You’ll have the opportunity to implement one of these algorithms in the coding part of the assignment (to be announced soon).

Assignment 3 (Written): The Laplacian

These exercises will lead you through two different derivations of the cotan-Laplace operator. As we’ll discuss in class, this operator is basically the “Swiss army knife” of discrete differential geometry and digital geometry processing, opening the door to a huge number of interesting algorithms and applications. Note that this time the exercises all come from the course notes; you will need to read the accompanying material in order to familiarize yourself with the necessary background (though we’ve actually covered much of it in class already!).
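For those who want a concrete preview, here is a minimal sketch, in plain Python rather than the course framework, of how the cotan-Laplace matrix can be assembled from a triangle mesh. The function name and array conventions are just illustrative assumptions, not the setup used in the homework:

```python
import numpy as np
from scipy.sparse import coo_matrix

def cotan_laplacian(V, F):
    """Assemble the cotan-Laplace matrix of a triangle mesh.

    V: (n, 3) array of vertex positions; F: (m, 3) array of triangle
    vertex indices. Uses the convention where the off-diagonal entry
    for edge ij is -(cot(alpha_ij) + cot(beta_ij)) / 2 and each row
    sums to zero.
    """
    rows, cols, vals = [], [], []
    for f in F:
        for k in range(3):
            # Edge (i, j) lies opposite vertex l in this triangle.
            i, j, l = f[k], f[(k + 1) % 3], f[(k + 2) % 3]
            u, w = V[i] - V[l], V[j] - V[l]
            # cot(theta) = cos/sin = (u . w) / |u x w| at vertex l.
            cot = np.dot(u, w) / np.linalg.norm(np.cross(u, w))
            for r, c, s in [(i, j, -0.5), (j, i, -0.5),
                            (i, i, 0.5), (j, j, 0.5)]:
                rows.append(r); cols.append(c); vals.append(s * cot)
    n = len(V)
    # Duplicate (row, col) pairs are summed when converting to CSR.
    return coo_matrix((vals, (rows, cols)), shape=(n, n)).tocsr()
```

(A real implementation would also guard against degenerate triangles, where the cross-product norm vanishes.)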

Implementing Curvature Flow

For anyone interested in learning more about the 1D curvature flows we saw today in class, there’s an assignment (and some notes) from a previous year’s class here:

Curvature Flow

In fact, it wouldn’t be hard to implement in the same code framework we’re using for the class, since you can think of a plane curve as a “mesh” consisting of a single flat polygon with many edges.
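For instance, here is a rough standalone sketch in Python (again, not the class framework) of the simplest such flow, where each vertex of a closed polygon moves along a discrete approximation of the curvature normal:

```python
import numpy as np

def curve_shortening_step(X, dt):
    """One explicit Euler step of a discrete curve-shortening flow.

    X: (n, 2) array of vertex positions along a closed plane curve.
    The second difference of position is a simple (unweighted) stand-in
    for the curvature normal, so each vertex drifts toward the average
    of its two neighbors and the curve shrinks and smooths over time.
    """
    prev = np.roll(X, 1, axis=0)   # previous vertex (curve is closed)
    nxt = np.roll(X, -1, axis=0)   # next vertex
    return X + dt * (prev + nxt - 2.0 * X)

# Example: a noisy circle rounds out and shrinks under the flow.
t = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
X = np.stack([np.cos(t), np.sin(t)], axis=1)
X += 0.05 * np.random.default_rng(0).standard_normal(X.shape)
for _ in range(1000):
    X = curve_shortening_step(X, dt=0.1)
```

An implicit time step would permit much larger steps, but the explicit version keeps the sketch short.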

The paper I mentioned on discrete curve shortening with no crossings is:

Hass and Scott, “Shortening Curves on Surfaces,” Topology 33 (1994), 25–43.

It would be fun to see an implementation of something like this on a surface (I’ve never done it myself!).

Taking Gradients: Partial Derivatives vs. Geometric Reasoning

In your homework, you are asked to derive an expression for the gradient of the area of a triangle with respect to one of its vertices. In particular, if the triangle has vertices \(a,b,c \in \mathbb{R}^3\) and unit normal \(N = \frac{(b-a)\times(c-a)}{|(b-a)\times(c-a)|}\), then the gradient of its area \(A\) with respect to the vertex \(a\) can be expressed as
\[
\nabla_a A = \tfrac{1}{2} N \times (c-b).
\]
This formula can be obtained via a simple geometric argument, has a clear geometric meaning, and generally leads to an efficient and error-free implementation.
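As a quick sanity check, one can compare the geometric formula against centered finite differences; the snippet below is a hypothetical Python verification, not part of the assignment:

```python
import numpy as np

def area(a, b, c):
    """Area of the triangle with vertices a, b, c in R^3."""
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a))

def grad_area_a(a, b, c):
    """Gradient of triangle area with respect to vertex a."""
    n = np.cross(b - a, c - a)
    N = n / np.linalg.norm(n)          # unit normal, as defined above
    return 0.5 * np.cross(N, c - b)    # (1/2) N x (c - b)

# Centered finite differences in each coordinate of a.
rng = np.random.default_rng(0)
a, b, c = rng.standard_normal((3, 3))
h = 1e-6
fd = np.array([(area(a + h * e, b, c) - area(a - h * e, b, c)) / (2.0 * h)
               for e in np.eye(3)])
print(np.allclose(grad_area_a(a, b, c), fd, atol=1e-6))  # prints True
```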

In contrast, here’s the expression produced by taking partial derivatives via Mathematica (even after calling FullSimplify[]):

Longer expressions like these will of course produce the same values. But without further simplification (by hand), they will be less efficient and could exhibit poorer numerical behavior, owing to the longer sequence of floating-point operations. Moreover, they are far harder to understand and interpret, especially when this calculation is just one small piece of a much larger expression (as it often is).

In general, taking gradients the “geometric way” often provides greater simplicity and deeper insight than just grinding everything out in components. Your current assignment will give you some opportunity to see how this all works.

Update: As mentioned by Peter in the comments, here’s the expression for the gradient of the angle via partial derivatives (as computed by Mathematica). Hopefully by the time you’re done with your homework, you’ll realize there’s a better way!

Reading Assignment: Introduction to Curves & Surfaces (Due 10/24)

For your next reading assignment, you will read a few pages about curves and surfaces from the course notes: Chapter 2, pages 7–23. This material should be enough to get you started on the written/coding exercises NOW, rather than waiting until we are done with the full set of lectures. We will cover these topics in greater depth during lecture (especially the topic of curvature).

Assignment: Read the pages above, and write 2–3 sentences summarizing what you read, plus at least one question about something you didn’t understand, or some thought/idea that occurred to you while doing the reading.

Hand-in instructions can be found in the “Readings” section of the Assignments page. Note that you must send in your summary no later than 10am Eastern on the date of the next lecture (October 24, 2017).

Enjoy!