Operations Research Cheat Sheet Technical Interview
This is an operations research cheat sheet for preparing technical interviews. We will skim through the types of mathematical programs that are commonly used and the solution techniques for each.
Types of Mathematical Program¶
| Type | Objective function | Constraints | Variables | Solution approach |
|---|---|---|---|---|
| Linear Program | Linear | Linear | Real | Simplex method |
| Mixed Integer Linear Program | Linear | Linear | Integer & real | Cutting plane, branch and bound |
| Quadratic Program | Quadratic | Linear | Real | |
| Quadratically Constrained Quadratic Program | Quadratic | Quadratic | Real | |
| Unconstrained Nonlinear Program | Nonlinear | None | Real | Gradient descent, Newton's method |
| Convex Program | Convex | Convex | Convex cone | |
Unconstrained Nonlinear Optimization¶
Analytical approach¶
At a local minimum, the gradient (the vector of first-order partial derivatives) is zero and the Hessian (the matrix of second-order partial derivatives) is positive semi-definite; this is a necessary condition. The sufficient condition at a point is that the gradient is zero and the Hessian is positive definite (to determine whether a matrix is positive definite, check that all of its eigenvalues are positive). In that case, we can be sure the point is a strict local minimum. However, such a point may not be obtainable analytically; in that case, we need the numerical methods listed below.
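The sufficient condition above can be checked mechanically. A minimal sketch for a hypothetical function f(x, y) = x² + 3y², using Sylvester's criterion (all leading principal minors positive) as a stand-in for the eigenvalue test on a 2x2 Hessian:

```python
# Second-order sufficient condition at a candidate point for
# f(x, y) = x^2 + 3*y^2, checked at (0, 0) (illustrative example).
def is_positive_definite_2x2(H):
    # Sylvester's criterion for a symmetric 2x2 matrix:
    # both leading principal minors must be strictly positive.
    return H[0][0] > 0 and H[0][0] * H[1][1] - H[0][1] * H[1][0] > 0

gradient = [0.0, 0.0]                    # grad f(0, 0) = (2x, 6y) = (0, 0)
hessian = [[2.0, 0.0], [0.0, 6.0]]       # constant Hessian of f
is_strict_local_min = all(g == 0 for g in gradient) and is_positive_definite_2x2(hessian)
print(is_strict_local_min)  # → True
```

For larger matrices, checking the eigenvalues directly (e.g. with a linear algebra library) is the more common approach.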
Gradient descent¶
This method finds a minimum/maximum when the analytical form of the first derivative of the objective function is available. Start at an initial point and iterate in two steps: 1) compute a descent/ascent direction from the gradient; 2) take a step in that direction with a predefined step size. Repeat until a termination condition is satisfied, e.g. the difference between successive points is small enough.
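The two-step iteration can be sketched as follows; the objective, step size, and tolerance here are made up for illustration:

```python
# Minimal gradient descent, assuming the gradient is available in
# closed form. Minimizes f(x, y) = (x - 3)^2 + (y + 1)^2.
def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        new = [xi - step * gi for xi, gi in zip(x, g)]  # step along -gradient
        # terminate when successive iterates are close enough
        if sum((a - b) ** 2 for a, b in zip(new, x)) < tol ** 2:
            return new
        x = new
    return x

grad = lambda p: [2 * (p[0] - 3), 2 * (p[1] + 1)]
x = gradient_descent(grad, [0.0, 0.0])
print([round(v, 4) for v in x])  # → [3.0, -1.0]
```

In practice the step size is often chosen adaptively (e.g. by line search) rather than fixed.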
Newton's method¶
This method applies when the function is twice differentiable. It can be used either to find a root of a function or, applied to the first derivative, to find a minimum point.
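For minimization in one dimension, Newton's method iterates x ← x − f′(x)/f″(x), i.e. root finding applied to f′. A minimal sketch with an illustrative quadratic (for which one step converges exactly):

```python
# One-dimensional Newton's method for minimization.
# Example: f(x) = x^2 - 4x, so f'(x) = 2x - 4 and f''(x) = 2;
# the minimum is at x = 2.
def newton_minimize(df, d2f, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)   # Newton step on the first derivative
        x -= step
        if abs(step) < tol:
            break
    return x

print(newton_minimize(lambda x: 2 * x - 4, lambda x: 2.0, 10.0))  # → 2.0
```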
Mixed Integer Program¶
By introducing integrality constraints, the problem changes from P (a Linear Program is solvable in polynomial time) to NP-hard (no polynomial-time algorithm is known).
Branch and bound¶
Bounding is computing an optimistic evaluation of a subproblem by relaxation (in other words, a lower bound in a minimization problem), and branching is splitting the problem into subproblems. At each node we solve the linear (LP) relaxation. If the relaxation is no better than the best known feasible solution so far, we prune the node. If the LP relaxation has an integer solution, it becomes the new incumbent feasible solution. Otherwise, we create two subproblems by adding constraints on a variable with a fractional value (for a binary variable, fixing it to 0 in one subproblem and to 1 in the other). Branch and bound is effective when the linear relaxation is strong, i.e. its bound is close to the integer optimum, so that many nodes can be pruned.
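A minimal, self-contained sketch of this scheme on the 0-1 knapsack problem, where the optimistic bound is the classic fractional (LP) relaxation; the instance numbers are made up for illustration:

```python
# Branch and bound for 0-1 knapsack. Branching: take or skip the next
# item. Bounding: the fractional-knapsack relaxation (items by value
# density, last item split), which is an upper bound for maximization.
def knapsack_bb(values, weights, capacity):
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    n = len(v)
    best = 0

    def bound(i, cap, val):
        # optimistic evaluation: fill greedily, split the last item
        for j in range(i, n):
            if w[j] <= cap:
                cap -= w[j]
                val += v[j]
            else:
                return val + v[j] * cap / w[j]
        return val

    def dfs(i, cap, val):
        nonlocal best
        best = max(best, val)          # record incumbent
        if i == n:
            return
        if bound(i, cap, val) <= best: # prune: relaxation no better
            return
        if w[i] <= cap:                # branch: take item i...
            dfs(i + 1, cap - w[i], val + v[i])
        dfs(i + 1, cap, val)           # ...or skip it

    dfs(0, capacity, 0)
    return best

print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))  # → 220
```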
Cutting plane¶
The cutting plane method adds new "cuts" to improve the value of the linear relaxation. We look for a linear constraint that excludes the current LP solution but does not remove any feasible integer solution. Geometrically, since the feasible region of the LP relaxation is larger than the set of integer points, we want a smaller polytope, obtained by adding constraints. In theory, if we add all possible such constraints, we obtain the convex hull of the feasible integer points; over the convex hull, the relaxed LP always gives the true integer optimum. In practice, however, computing the convex hull is too expensive.
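A toy one-variable illustration of what a cut does (the problem is hypothetical): maximize x subject to 2x ≤ 5 with x integer. The LP relaxation gives x = 2.5; the cut x ≤ 2 removes that fractional point without excluding any feasible integer solution.

```python
# A single cut for: max x  s.t.  2x <= 5, x integer (toy example).
lp_optimum = 5 / 2                     # LP relaxation solution: 2.5
cut = lambda x: x <= 2                 # valid inequality for the integer hull

print(cut(lp_optimum))                 # False: the cut excludes the LP optimum
print(all(cut(x) for x in range(3)))   # True: integer points 0, 1, 2 all kept
```

Real solvers generate such cuts systematically (e.g. Gomory cuts) from the simplex tableau rather than by inspection.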
Branch and cut¶
Branch and cut is a branch and bound method that additionally applies cutting planes to tighten the LP relaxation at each node.
Big-M Transformation¶
The Big-M transformation handles a disjunction ("or", which is not directly supported in MIP) as a conjunction ("and") by introducing a binary indicator variable and a sufficiently large constant M that deactivates one branch of the disjunction at a time.
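For example, the disjunction "x ≤ 4 OR x ≥ 10" (a made-up constraint for illustration) becomes the conjunction x ≤ 4 + M(1 − z) AND x ≥ 10 − Mz with binary z. A small check that the encoding behaves as intended:

```python
# Big-M encoding of the disjunction "x <= 4 OR x >= 10" (hypothetical
# constraint). z = 1 activates the first branch, z = 0 the second;
# M is assumed large enough to make the inactive branch vacuous.
M = 1000

def satisfies_bigm(x, z):
    return x <= 4 + M * (1 - z) and x >= 10 - M * z

# x = 3 satisfies the first disjunct (feasible with z = 1)
print(any(satisfies_bigm(3, z) for z in (0, 1)))   # → True
# x = 7 satisfies neither disjunct (infeasible for both z)
print(any(satisfies_bigm(7, z) for z in (0, 1)))   # → False
```

In practice M should be as small as validity allows, since an overly large M weakens the LP relaxation.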
Constraint Programming (CP)¶
Constraint programming aims to identify feasible solutions rather than to find an optimal solution: it focuses on the constraints and variables rather than on an objective function. The goal is to narrow a very large set of candidate solutions down to a more manageable subset by adding constraints to the problem and propagating them.
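The core search mechanism can be sketched as backtracking that rejects a partial assignment as soon as some constraint is provably violated; the toy CSP here (x < y < z over {1, 2, 3}) is made up for illustration:

```python
# Minimal backtracking search for a toy CSP: x < y < z over {1, 2, 3}.
# A constraint is treated as "not yet violated" while any of its
# variables is unassigned (KeyError), so partial assignments are
# checked eagerly.
def violated(c, a):
    try:
        return not c(a)
    except KeyError:          # constraint touches an unassigned variable
        return False

def backtrack(a, variables, domain, constraints):
    if len(a) == len(variables):
        return dict(a)        # complete feasible assignment found
    var = variables[len(a)]
    for val in domain:
        a[var] = val
        if not any(violated(c, a) for c in constraints):
            result = backtrack(a, variables, domain, constraints)
            if result:
                return result
        del a[var]            # undo and try the next value
    return None

constraints = [lambda a: a["x"] < a["y"], lambda a: a["y"] < a["z"]]
print(backtrack({}, ["x", "y", "z"], [1, 2, 3], constraints))
# → {'x': 1, 'y': 2, 'z': 3}
```

Production CP solvers add much stronger propagation (e.g. domain filtering for global constraints like all-different) on top of this skeleton.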
Local search¶
Local search starts from a solution and improves it by performing local perturbations.
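A minimal hill-climbing sketch of this idea, with a made-up one-dimensional objective and ±1 perturbations as the neighborhood:

```python
# Local search (hill climbing) on a toy objective: move to the best
# neighboring solution until no neighbor improves the current one.
def local_search(f, x0):
    x = x0
    while True:
        best = min([x - 1, x + 1], key=f)  # best local perturbation
        if f(best) >= f(x):
            return x                        # local minimum reached
        x = best

print(local_search(lambda x: (x - 7) ** 2, 0))  # → 7
```

On non-convex objectives this only guarantees a local optimum, which motivates the larger neighborhoods used below.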
Large neighborhood search (LNS)¶
LNS combines constraint programming and local search. First, use CP to find a feasible solution. Second, select a neighborhood as in local search. Third, optimize the neighborhood using CP. Repeat these three steps. LNS exploits two strengths of CP: it is good both at finding feasible solutions and at optimizing small combinatorial spaces. This technique is often applied to the asymmetric TSP with time windows.
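The destroy/re-optimize loop can be sketched on a toy assignment problem (the cost matrix is made up): "destroy" frees a few positions of the current solution, and "repair" re-optimizes those positions exactly, playing the role of CP on the small subproblem.

```python
# Toy LNS for an assignment problem: minimize sum(cost[i][perm[i]])
# over permutations. Destroy = free k random positions; repair = exact
# enumeration on the freed positions (stand-in for CP on a subproblem).
import random
from itertools import permutations

def lns_assignment(cost, iters=200, k=2, seed=0):
    rng = random.Random(seed)
    n = len(cost)
    perm = list(range(n))                  # initial feasible solution

    def total(p):
        return sum(cost[i][p[i]] for i in range(n))

    for _ in range(iters):
        free = rng.sample(range(n), k)     # destroy: free k positions
        vals = [perm[i] for i in free]
        best = min(permutations(vals),     # repair: exact on subproblem
                   key=lambda vs: sum(cost[i][v] for i, v in zip(free, vs)))
        for i, v in zip(free, best):       # never worsens the solution
            perm[i] = v
    return perm, total(perm)

cost = [[4, 1, 3], [2, 0, 5], [3, 2, 2]]
print(lns_assignment(cost, k=2))
```

Because the repair step includes the current assignment among its candidates, the objective is monotonically non-increasing; on this tiny instance the loop reaches the optimum (cost 5).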