Contact

Eitan Farchi, Verification & Quality Technologies, IBM Research - Haifa

Optimization - Here we continue the optimization thread, starting with least squares for regression.

I have added an example of a discontinuous function whose partial derivatives exist everywhere. (The partial derivatives cannot all be continuous at the point of discontinuity: continuous partial derivatives would imply differentiability, and hence continuity, of the function itself.)
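A minimal sketch of such a function (my choice of example, not necessarily the one added to the notes): f(x, y) = xy / (x^2 + y^2) with f(0, 0) = 0. Its partial derivatives exist at every point, including the origin, yet the function is discontinuous at the origin.

```python
def f(x, y):
    # xy / (x^2 + y^2), patched to 0 at the origin
    if x == 0 and y == 0:
        return 0.0
    return x * y / (x**2 + y**2)

def fx_at_origin(h=1e-6):
    # Difference quotient along the x-axis; f vanishes on both axes,
    # so both partial derivatives at the origin exist and equal 0.
    return (f(h, 0) - f(0, 0)) / h

# Approaching (0, 0) along y = x gives the constant 1/2, while approaching
# along the x-axis gives 0 -- so f has no limit at the origin.
along_diagonal = [f(t, t) for t in (0.1, 0.01, 0.001)]
along_axis = [f(t, 0) for t in (0.1, 0.01, 0.001)]
print(fx_at_origin())
print(along_diagonal)   # each value is 1/2
print(along_axis)       # each value is 0
```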

Also try to answer the following questions. Consider the Regression I image below.
  1. What is the VC dimension of M?
  2. Given a continuous function, construct a neural network that approximates it.
  3. An m*n matrix A defines a function from R^n to R^m by the multiplication Ax, x in R^n. What is the Jacobian of this function?
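For question 3, a minimal numerical sketch that lets you check your answer: estimate the Jacobian of the linear map x -> Ax by finite differences. The matrix A below is an arbitrary 2x3 example of my own choosing.

```python
A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]          # 2x3: maps R^3 to R^2

def apply(A, x):
    # Matrix-vector product Ax
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def numerical_jacobian(func, x, h=1e-6):
    # J[i][j] = d func_i / d x_j, estimated by forward differences
    fx = func(x)
    J = [[] for _ in fx]
    for j in range(len(x)):
        xh = list(x)
        xh[j] += h
        fxh = func(xh)
        for i in range(len(fx)):
            J[i].append((fxh[i] - fx[i]) / h)
    return J

x0 = [0.5, -1.0, 2.0]
J = numerical_jacobian(lambda x: apply(A, x), x0)
# For a linear map the difference quotient is exact up to rounding error,
# and the estimate does not depend on the base point x0.
print(J)
```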

How to turn non-convex optimization problems into convex optimization problems, and why this is useful.
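One classical route, sketched below with an example of my own (not taken from the notes): a change of variables x = e^u, y = e^v turns some non-convex problems over positive variables into convex ones, as in geometric programming. Here f(x, y) = x/y + y/x is not convex on the positive orthant, but after the substitution it becomes g(u, v) = e^(u-v) + e^(v-u), a sum of convex functions of affine arguments.

```python
import math

def f(x, y):
    # Not convex on x, y > 0
    return x / y + y / x

def g(u, v):
    # f after the substitution x = e^u, y = e^v; convex in (u, v)
    return math.exp(u - v) + math.exp(v - u)

# Midpoint convexity fails for f at these points:
a, b = (0.9, 1.5), (1.1, 2.5)
mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
print(f(*mid), (f(*a) + f(*b)) / 2)   # midpoint value exceeds the average

# In the log variables the same comparison respects convexity:
ua = (math.log(a[0]), math.log(a[1]))
ub = (math.log(b[0]), math.log(b[1]))
umid = ((ua[0] + ub[0]) / 2, (ua[1] + ub[1]) / 2)
print(g(*umid), (g(*ua) + g(*ub)) / 2)  # midpoint value is below the average
```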

Quadratic form:
Because the second-order term of a function's Taylor expansion is a quadratic form, quadratic forms play a central role in optimization.

Positive definite matrix:
The relation between positive definiteness of the Hessian and the Taylor expansion: at a critical point the expansion reduces to f(x* + h) ~ f(x*) + (1/2) h^T H h, so a positive definite Hessian makes x* a strict local minimum.
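A minimal sketch (example mine): f(x, y) = x^2 + xy + y^2 has a critical point at the origin with Hessian H = [[2, 1], [1, 2]]. Sylvester's criterion confirms H is positive definite, and the quadratic form h^T H h stays strictly positive on sampled directions, consistent with the origin being a strict local minimum.

```python
H = [[2.0, 1.0],
     [1.0, 2.0]]   # Hessian of x^2 + x*y + y^2 (constant everywhere)

def is_positive_definite_2x2(H):
    # Sylvester's criterion: both leading principal minors are positive.
    return H[0][0] > 0 and H[0][0] * H[1][1] - H[0][1] * H[1][0] > 0

def quad_form(H, h):
    # h^T H h for a 2x2 matrix
    return (H[0][0] * h[0] * h[0]
            + (H[0][1] + H[1][0]) * h[0] * h[1]
            + H[1][1] * h[1] * h[1])

pd = is_positive_definite_2x2(H)
samples = [(1, 0), (0, 1), (1, -1), (-2, 3), (0.1, -0.1)]
all_positive = all(quad_form(H, h) > 0 for h in samples)
print(pd, all_positive)
```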