Course Overview
Hi! Our course aims to provide the background in Calculus necessary for subsequent Data Science courses.
The course starts with a basic introduction to concepts concerning functional mappings. Students then study limits (of sequences, and of single- and multivariate functions), differentiability (again starting from the single-variable case and moving on to several variables), and integration, thus sequentially building a foundation for basic optimisation. To connect the material with the practical skill set being taught, the course concludes with a programming project on the use of an optimisation routine in machine learning.
Additional materials provided during the course include interactive plots in the GeoGebra environment used during lectures, as well as bonus reading materials covering more general methods and a more thorough basis for the topics discussed.
Course Syllabus
Introduction: Numerical Sets, Functions, Limits
Here we introduce the basic concept a calculus course could not be imagined without: the function. To do this properly, one should say that a function is a mapping from one set to another. Thus, we start with the ideas of numerical sets and mappings, and then proceed to functions themselves. Since we are particularly interested in functions' graphs, we spend a lot of time discussing the simplest ways to produce the graph of a complex function from elementary ones. In the second part of the week we begin our calculus journey with a discrete limit, the limit of a sequence, and master the skills needed to calculate such limits.
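As a small taste of the kind of computation this week builds toward, here is a minimal sketch (the particular sequence is our choice of illustration, not a fixed course example) that watches the classical sequence a_n = (1 + 1/n)^n approach its limit e:

```python
import math

def a(n: int) -> float:
    """n-th term of the sequence (1 + 1/n)^n, which converges to e."""
    return (1 + 1 / n) ** n

# Print a few terms to watch the sequence approach its limit e ≈ 2.71828.
for n in (1, 10, 100, 1_000, 10_000, 100_000):
    print(f"a({n:>6}) = {a(n):.6f}   |a(n) - e| = {abs(a(n) - math.e):.6f}")
```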
Limits and Multivariate Functions
Now it is time to move from discrete limits to continuous ones: in other words, in this module we discuss limits of functions. We start with the basic question: does this case differ substantially from that of sequences? It turns out that it does, thanks to significant structural differences between the natural and the real numbers. One of those differences, continuity, allows us to define and calculate limits at finite points. We spend some time specifically on the famous important limits, then proceed with the idea of asymptotic comparison of functions and Big-O and little-o notation. To top off the module, we introduce functions of several variables, spend some time getting used to plotting and interpreting them conveniently, and finish with a discussion of their limits.
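As an illustration of the sort of numerical experiment that accompanies the lectures (the specific limit is chosen by us for illustration), one can probe the famous limit of sin(x)/x as x tends to 0 by evaluating the ratio ever closer to the origin:

```python
import math

# Approach x = 0 from the right and watch sin(x)/x tend to 1.
for k in range(1, 7):
    x = 10.0 ** (-k)
    print(f"x = {x:.0e}   sin(x)/x = {math.sin(x) / x:.10f}")
```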
Derivatives and Linear Approximations: Univariate Functions
Now that we know limits, let us use them to define some instantaneous characteristics of functions, starting with the slope. Thus we define the function's derivative and discuss all the machinery needed to calculate it. Since this is a purely technical matter, you are expected to be able to do it: to make sure that you can find a derivative, we provide a drill. These skills can be used for finding approximate values via linear approximation, or in the search for extremal values. To provide an understanding of the sufficient condition for an extremum, we introduce the concept of convexity.
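To make the idea of linear approximation concrete, here is a minimal sketch (the square-root function and the expansion point are our own illustrative choices): near a point a, f(x) ≈ f(a) + f'(a)(x - a).

```python
import math

a = 4.0                        # expansion point where sqrt and its derivative are easy
f_a = math.sqrt(a)             # f(a) = 2
df_a = 1 / (2 * math.sqrt(a))  # f'(a) = 1 / (2*sqrt(a)) = 0.25

x = 4.1
linear = f_a + df_a * (x - a)  # tangent-line approximation of sqrt(4.1)
print(f"linear approximation: {linear:.6f}")
print(f"exact value:          {math.sqrt(x):.6f}")
```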
Derivatives and Linear Approximations: Multivariate Functions
Having discussed all the linearity-related concepts for univariate functions, it is essential to try to generalise them to the multivariate case. Since the derivative concept is hard to extend directly, we start with the idea of linear approximation and the tangent plane; from there we introduce partial derivatives and differentiability. We separately spend some time discussing neural-network-inspired composite multivariate functions and the all-mighty chain rule. Our generalisation attempt is finalised with the idea of convexity in terms of the second partial derivatives.
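As a rough illustration of partial derivatives and the chain rule on a tiny neural-network-style composite (a toy example of our own, not a course assignment), one can compare the chain-rule gradient with finite-difference estimates:

```python
import math

def sigmoid(z: float) -> float:
    return 1 / (1 + math.exp(-z))

# A tiny neural-network-style composite: f(x, y) = sigmoid(w1*x + w2*y).
w1, w2 = 0.5, -1.5

def f(x: float, y: float) -> float:
    return sigmoid(w1 * x + w2 * y)

def grad_f(x: float, y: float) -> tuple[float, float]:
    """Chain rule: with z = w1*x + w2*y, d sigmoid(z)/dz = sigmoid(z)*(1 - sigmoid(z))."""
    s = f(x, y)
    return (s * (1 - s) * w1, s * (1 - s) * w2)

# Check the partial derivatives against central finite differences at one point.
x0, y0, h = 1.0, 2.0, 1e-6
fd_x = (f(x0 + h, y0) - f(x0 - h, y0)) / (2 * h)
fd_y = (f(x0, y0 + h) - f(x0, y0 - h)) / (2 * h)
print("chain rule: ", grad_f(x0, y0))
print("finite diff:", (fd_x, fd_y))
```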
Integrals: Anti-derivative, Area under Curve
Having introduced the operation of differentiation, it is essential to think about the inverse procedure: integration. We start the module with the basic definition of integration and, as usual, all the techniques required to calculate a wide range of indefinite integrals, stressing that a closed-form result is no longer guaranteed. Then we proceed with the idea and formal definition of the area under a curve and its relation to the indefinite case via the fundamental theorem of calculus. We finish the week with a discussion of the areas of infinite figures (improper integrals) and numerical methods for estimating the value of a definite integral.
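As one example of such numerical methods (the trapezoidal rule is our choice of illustration; the module may use others), here is a minimal sketch approximating the integral of x^2 over [0, 1], whose exact value is 1/3:

```python
def trapezoid(f, a: float, b: float, n: int = 1000) -> float:
    """Approximate the definite integral of f on [a, b] with n trapezoids."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

approx = trapezoid(lambda x: x * x, 0.0, 1.0)
print(f"trapezoidal estimate: {approx:.6f}   exact: {1/3:.6f}")
```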
Optimization: Directional Derivative, Extrema and Gradient Descent
Having built up an impressive base by introducing various measures of change and of a function's overall behaviour, it is essential to speak about the general idea of an optimisation procedure. Since we have already tackled it in the univariate case, we try to generalise our principles of necessary and sufficient conditions to multivariate functions. While this provides theoretical understanding, one should seek a faster, iterative way to find an extremal point. To that end, we start the week with the concept of the directional derivative, which provides an understanding of the desired direction of the iterative search. Thus we arrive at the idea and motivation of gradient descent, the last and final concept in the course that you are asked to master.
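To preview what this final concept looks like in code, here is a minimal sketch of plain gradient descent on a convex quadratic of our own choosing, with a hand-picked learning rate; it is an illustration only, not the specification of the course project:

```python
def grad(x: float, y: float) -> tuple[float, float]:
    """Gradient of f(x, y) = (x - 3)^2 + 2*(y + 1)^2, minimised at (3, -1)."""
    return (2 * (x - 3), 4 * (y + 1))

x, y = 0.0, 0.0   # starting point
lr = 0.1          # learning rate (step size), chosen by hand
for step in range(200):
    gx, gy = grad(x, y)
    x, y = x - lr * gx, y - lr * gy   # step against the gradient

print(f"approximate minimiser: ({x:.4f}, {y:.4f})   expected: (3, -1)")
```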