Classical analysis covers many function-constructing operations, such as addition, multiplication, composition and even integration, but not the operations that are essential in optimization, namely minimization and maximization. The simple reason is that those operations don't preserve differentiability. If a function g(x) is defined as the minimum (or the maximum) of f(x,y) with respect to y in some set Y, no amount of differentiability of f in x and y will carry over, in general, to g being differentiable in x.
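A standard illustration of this failure (chosen here for concreteness; any bilinear f on a compact Y works similarly) takes f smooth, even bilinear, yet produces a kink in g:

```latex
\text{Let } f(x,y) = xy \text{ and } Y = [-1,1]. \text{ Then}
\[
  g(x) \;=\; \min_{y \in [-1,1]} xy \;=\;
  \begin{cases}
    -x & \text{if } x \ge 0 \quad (\text{attained at } y=-1),\\
    \phantom{-}x & \text{if } x < 0 \quad (\text{attained at } y=1),
  \end{cases}
  \;=\; -|x|,
\]
\text{so } g \text{ fails to be differentiable at } x=0
\text{ although } f \text{ is } C^\infty.
```

The kink appears exactly where the minimizing y jumps from one endpoint of Y to the other, which is the typical mechanism by which minimization destroys smoothness.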
Variational analysis, as an extension of classical analysis that also encompasses convex analysis, gets around this by introducing one-sided concepts of generalized differentiability, which moreover have a basis in set convergence that is very different from the usual pointwise convergence of difference quotients. Variational geometry provides powerful support by associating one-sided tangent and normal "cones," instead of subspaces, with the points of a set in a linear space. Novel concepts of regularity, unanticipated in classical theory, then arise.
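To make the cone idea concrete, here are the standard tangent and normal cones of convex analysis (the convex case is the simplest instance; the general one-sided constructions specialize to these formulas when the set C is convex). For a convex set C and a point \bar{x} \in C:

```latex
\[
  N_C(\bar{x}) \;=\; \{\, v \;:\; \langle v,\, x - \bar{x} \rangle \le 0
    \ \ \forall x \in C \,\},
  \qquad
  T_C(\bar{x}) \;=\; \operatorname{cl}\,\{\, \lambda (x - \bar{x})
    \;:\; x \in C,\ \lambda \ge 0 \,\}.
\]
\text{For example, with } C = \mathbb{R}^2_+ \text{ and } \bar{x} = (0,0):
\quad T_C(\bar{x}) = \mathbb{R}^2_+, \qquad N_C(\bar{x}) = \mathbb{R}^2_-.
```

At a point in the interior of C both cones degenerate, with T_C the whole space and N_C = \{0\}; the one-sidedness only shows itself at boundary points, where a corner of C produces a genuinely nonlinear cone rather than a subspace.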