In mathematics, a partial derivative of a function of several variables is its derivative with respect to one of those variables, with the others held constant (as opposed to the total derivative, in which all variables are allowed to vary). Partial derivatives are used in vector calculus and differential geometry.
The partial derivative of a function f with respect to the variable x is variously denoted by

∂f/∂x, f_{x}, ∂_{x}f, or D_{x}f.
The partial derivative symbol ∂ is a rounded letter d, derived from but distinct from the Greek letter delta. The notation was introduced by Adrien-Marie Legendre and gained general acceptance after its reintroduction by Carl Gustav Jacob Jacobi.^{[1]}
Suppose that ƒ is a function of more than one variable. For instance,

z = f(x, y) = x^{2} + xy + y^{2}.
It is difficult to describe the derivative of such a function, as there are infinitely many tangent lines at every point on this surface. Partial differentiation is the act of choosing one of these lines and finding its slope. Usually, the lines of most interest are those that are parallel to the xz-plane and those that are parallel to the yz-plane.
A good way to find these parallel lines is to treat the other variable as a constant. For example, to find the slope of the line tangent to the function at (1, 1, 3) that is parallel to the xz-plane, we treat the variable y as a constant. The graph and this plane are shown on the right. On the left, we see the way the function looks on the plane y = 1. By finding the derivative of the equation while assuming that y is a constant, we discover that the slope of ƒ at the point (x, y, z) is:

∂z/∂x = 2x + y.
So at (1, 1, 3), by substitution, the slope is 3. Therefore

∂z/∂x = 3
at the point (1, 1, 3). That is, the partial derivative of z with respect to x at (1, 1, 3) is 3.
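The slope just computed can be checked numerically. The following is a minimal sketch, assuming the surface discussed above is z = f(x, y) = x^{2} + xy + y^{2}; it approximates the x-partial with a central difference quotient.

```python
# Numeric sketch: slope in the x-direction of the assumed example surface
# z = f(x, y) = x**2 + x*y + y**2, holding y fixed at 1.
def f(x, y):
    return x**2 + x*y + y**2

def partial_x(f, x, y, h=1e-6):
    # Hold y fixed and vary only x (central difference quotient).
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

slope = partial_x(f, 1.0, 1.0)  # approximates 2x + y = 3 at (1, 1)
```

Because f is quadratic in x, the central difference reproduces the exact slope up to floating-point error.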
The function f can be reinterpreted as a family of functions of one variable indexed by the other variables:
In other words, every value of x defines a function, denoted f_{x}, which is a function of one real number.^{[2]} That is,

f_{x}(y) = x^{2} + xy + y^{2}.
Once a value of x is chosen, say a, then f(x, y) determines a function f_{a} which sends y to a^{2} + ay + y^{2}:

f_{a}(y) = a^{2} + ay + y^{2}.
In this expression, a is a constant, not a variable, so f_{a} is a function of only one real variable, that being y. Consequently, the definition of the derivative for a function of one variable applies:

f_{a}'(y) = a + 2y.
The above procedure can be performed for any choice of a. Assembling the derivatives together into a function gives a function which describes the variation of f in the y direction:

∂f/∂y(x, y) = x + 2y.
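This "family of one-variable functions" view can be sketched in code. The example function x^{2} + xy + y^{2} is assumed from the discussion above; each fixed a yields an ordinary one-variable function whose ordinary derivative is taken.

```python
# Fixing x = a in f(x, y) = x**2 + x*y + y**2 yields a one-variable
# function f_a of y; its ordinary derivative is a + 2y.
def f_a(a):
    # a is a constant here; f_a sends y to a**2 + a*y + y**2.
    return lambda y: a**2 + a*y + y**2

def derivative(g, y, h=1e-6):
    # Ordinary one-variable derivative, by central difference.
    return (g(y + h) - g(y - h)) / (2 * h)

# derivative(f_a(a), y) approximates a + 2y; letting a range over all x
# reassembles the partial derivative x + 2y in the y direction.
```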
This is the partial derivative of f with respect to y. Here ∂ is a rounded d called the partial derivative symbol. To distinguish it from the letter d, ∂ is sometimes pronounced "del" or "partial" instead of "dee".
In general, the partial derivative of a function f(x_{1}, ..., x_{n}) in the direction x_{i} at the point (a_{1}, ..., a_{n}) is defined to be:

∂f/∂x_{i}(a_{1}, ..., a_{n}) = lim_{h → 0} [f(a_{1}, ..., a_{i−1}, a_{i} + h, a_{i+1}, ..., a_{n}) − f(a_{1}, ..., a_{n})] / h.
In the above difference quotient, all the variables except x_{i} are held fixed. That choice of fixed values determines a function of one variable

f_{a_{1}, ..., a_{i−1}, a_{i+1}, ..., a_{n}}(x_{i}) = f(a_{1}, ..., a_{i−1}, x_{i}, a_{i+1}, ..., a_{n}),

and by definition,

df_{a_{1}, ..., a_{i−1}, a_{i+1}, ..., a_{n}}/dx_{i}(a_{i}) = ∂f/∂x_{i}(a_{1}, ..., a_{n}).
In other words, the different choices of a index a family of one-variable functions, just as in the example above. This expression also shows that the computation of partial derivatives reduces to the computation of one-variable derivatives.
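The general definition can be sketched as a small routine: hold every variable except x_{i} fixed and take a one-variable difference quotient. The example function g is an assumed two-variable illustration, not from the original text.

```python
# Sketch of the general definition of a partial derivative: only the
# i-th coordinate is varied; all others are held fixed.
def partial(f, a, i, h=1e-6):
    a_plus = list(a); a_plus[i] += h
    a_minus = list(a); a_minus[i] -= h
    return (f(a_plus) - f(a_minus)) / (2 * h)

# Assumed example function of two variables.
g = lambda v: v[0]**2 + v[0]*v[1] + v[1]**2
```

Calling `partial(g, (1.0, 1.0), 0)` approximates ∂g/∂x at (1, 1), which is 2x + y = 3.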
An important example of a function of several variables is the case of a scalar-valued function f(x_{1}, ..., x_{n}) on a domain in Euclidean space R^{n} (e.g., on R^{2} or R^{3}). In this case f has a partial derivative ∂f/∂x_{j} with respect to each variable x_{j}. At the point a, these partial derivatives define the vector

∇f(a) = (∂f/∂x_{1}(a), ..., ∂f/∂x_{n}(a)).
This vector is called the gradient of f at a. If f is differentiable at every point in some domain, then the gradient is a vector-valued function ∇f which takes the point a to the vector ∇f(a). Consequently, the gradient produces a vector field.
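The gradient as the vector of all partial derivatives can be sketched directly from the definition above; `fn` is an assumed example function for illustration.

```python
# Minimal sketch: the gradient collects all partial derivatives at a
# point, each approximated by a central difference in one coordinate.
def gradient(f, a, h=1e-6):
    grad = []
    for i in range(len(a)):
        ap = list(a); ap[i] += h   # step forward in coordinate i only
        am = list(a); am[i] -= h   # step backward in coordinate i only
        grad.append((f(ap) - f(am)) / (2 * h))
    return grad

fn = lambda v: v[0]**2 + v[0]*v[1] + v[1]**2
grad_at_point = gradient(fn, (1.0, 2.0))  # ∇fn at (1, 2) is (2x + y, x + 2y) = (4, 5)
```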
A common abuse of notation is to define the del operator (∇) as follows in three-dimensional Euclidean space R^{3} with unit vectors x̂, ŷ, ẑ:

∇ = x̂ ∂/∂x + ŷ ∂/∂y + ẑ ∂/∂z.
Or, more generally, for n-dimensional Euclidean space R^{n} with coordinates (x_{1}, x_{2}, ..., x_{n}) and unit vectors (ê_{1}, ê_{2}, ..., ê_{n}):

∇ = Σ_{j=1}^{n} ê_{j} ∂/∂x_{j} = ê_{1} ∂/∂x_{1} + ê_{2} ∂/∂x_{2} + ... + ê_{n} ∂/∂x_{n}.
Like ordinary derivatives, the partial derivative is defined as a limit. Let U be an open subset of R^{n} and f : U → R a function. The partial derivative of f at the point a = (a_{1}, ..., a_{n}) ∈ U with respect to the i-th variable x_{i} is defined as

∂f/∂x_{i}(a) = lim_{h → 0} [f(a_{1}, ..., a_{i−1}, a_{i} + h, a_{i+1}, ..., a_{n}) − f(a_{1}, ..., a_{n})] / h.
Even if all partial derivatives ∂f/∂x_{i}(a) exist at a given point a, the function need not be continuous there. However, if all partial derivatives exist in a neighborhood of a and are continuous there, then f is totally differentiable in that neighborhood and the total derivative is continuous. In this case, it is said that f is a C^{1} function. This can be used to generalize to vector-valued functions (f : U → R^{m}) by carefully using a componentwise argument.
The partial derivative can be seen as another function defined on U and can again be partially differentiated. If all mixed second-order partial derivatives are continuous at a point (or on a set), f is termed a C^{2} function at that point (or on that set); in this case, the partial derivatives can be exchanged by Clairaut's theorem:

∂^{2}f/(∂x_{i} ∂x_{j}) = ∂^{2}f/(∂x_{j} ∂x_{i}).
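Clairaut's theorem can be illustrated numerically. The example function below is an assumed smooth polynomial (not from the original text); both orders of mixed differentiation give the same value.

```python
# Numeric illustration of Clairaut's theorem on the assumed smooth
# example f(x, y) = x**3 * y + x * y**2, whose mixed partials are
# both 3x**2 + 2y.
def f(x, y):
    return x**3 * y + x * y**2

def fx(x, y, h=1e-5):
    return (f(x + h, y) - f(x - h, y)) / (2 * h)   # df/dx

def fy(x, y, h=1e-5):
    return (f(x, y + h) - f(x, y - h)) / (2 * h)   # df/dy

def fxy(x, y, h=1e-4):
    return (fx(x, y + h) - fx(x, y - h)) / (2 * h)  # d/dy of df/dx

def fyx(x, y, h=1e-4):
    return (fy(x + h, y) - fy(x - h, y)) / (2 * h)  # d/dx of df/dy
```

At (1, 2) both mixed partials are 3·1 + 2·2 = 7, as the symmetry predicts.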
The volume V of a cone depends on the cone's height h and its radius r according to the formula

V(r, h) = (πr^{2}h)/3.
The partial derivative of V with respect to r is

∂V/∂r = (2πrh)/3,
which represents the rate at which a cone's volume changes if its radius is varied and its height is kept constant. The partial derivative with respect to h is

∂V/∂h = (πr^{2})/3,
which represents the rate at which the volume changes if its height is varied and its radius is kept constant.
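The cone-volume partials above translate directly into code; each partial is just the ordinary derivative with the other variable held constant.

```python
import math

# V = (pi * r**2 * h) / 3 and its two partial derivatives.
def V(r, h):
    return math.pi * r**2 * h / 3

def dV_dr(r, h):
    return 2 * math.pi * r * h / 3   # height h held constant

def dV_dh(r, h):
    return math.pi * r**2 / 3        # radius r held constant
```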
By contrast, the total derivatives of V with respect to r and h are, respectively,

dV/dr = (2πrh)/3 + (πr^{2})/3 · dh/dr

and

dV/dh = (πr^{2})/3 + (2πrh)/3 · dr/dh.
The difference between the total and the partial derivative is that the partial derivative eliminates indirect dependencies between the variables, while the total derivative accounts for them.
If (for some arbitrary reason) the cone's proportions have to stay the same, with the height and radius in a fixed ratio k, then

h = kr.
This gives the total derivative with respect to r:

dV/dr = (2πrh)/3 + (πr^{2})/3 · k,

which simplifies, using h = kr, to

dV/dr = πkr^{2}.
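The constrained case can be checked numerically: substituting h = kr makes V a function of r alone, and its ordinary derivative matches πkr^{2}. The value of k below is an assumed example.

```python
import math

k = 2.0  # assumed fixed height/radius ratio, so h = k * r

def V_total(r):
    # With h = k*r substituted, V = (pi * r**2 * h) / 3 depends on r alone.
    return math.pi * r**2 * (k * r) / 3

def dV_total_dr(r):
    # Total derivative: (2*pi*r*h)/3 + (pi*r**2)/3 * k, which with
    # h = k*r collapses to pi * k * r**2.
    return math.pi * k * r**2
```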
Equations involving an unknown function's partial derivatives are called partial differential equations and are common in physics, engineering, and other sciences and applied disciplines.
For the following examples, let f be a function of x, y, and z.

First-order partial derivatives:

∂f/∂x = f_{x} = ∂_{x}f.

Second-order partial derivatives:

∂^{2}f/∂x^{2} = f_{xx} = ∂_{xx}f.

Second-order mixed derivatives:

∂^{2}f/(∂y ∂x) = ∂/∂y(∂f/∂x) = f_{xy} = ∂_{yx}f.

Higher-order partial and mixed derivatives:

∂^{(i+j+k)}f/(∂x^{i} ∂y^{j} ∂z^{k}) = f^{(i, j, k)}.
When dealing with functions of multiple variables, some of these variables may be related to each other, and it may be necessary to specify explicitly which variables are being held constant. In fields such as statistical mechanics, the partial derivative of f with respect to x, holding y and z constant, is often expressed as

(∂f/∂x)_{y,z}.
There is a concept for partial derivatives that is analogous to antiderivatives for regular derivatives. Given a partial derivative, it allows for the partial recovery of the original function.
Consider the example of

∂z/∂x = 2x + y.

The "partial" integral can be taken with respect to x (treating y as constant, in a similar manner to partial differentiation):

z = ∫ (2x + y) dx = x^{2} + xy + g(y).
Here, the "constant" of integration is no longer a constant, but instead a function of all the variables of the original function except x. The reason for this is that all the other variables are treated as constant when taking the partial derivative, so any function which does not involve x will disappear when taking the partial derivative, and we have to account for this when we take the antiderivative. The most general way to represent this is to have the "constant" represent an unknown function of all the other variables.
Thus the set of functions x^{2} + xy + g(y), where g is any one-argument function, represents the entire set of functions in the variables x, y that could have produced the x-partial derivative 2x + y.
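This family can be verified in code: whatever one-argument function g is chosen, the x-partial derivative of x^{2} + xy + g(y) is the same, since g contributes nothing that varies with x. The particular choices of g below are assumed examples.

```python
import math

# Every member of the family x**2 + x*y + g(y) has x-partial 2x + y,
# because g(y) is constant with respect to x.
def member(g):
    return lambda x, y: x**2 + x*y + g(y)

def partial_x(f, x, y, h=1e-6):
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

f1 = member(lambda y: 7.0)          # g(y) = 7
f2 = member(lambda y: math.sin(y))  # g(y) = sin(y)
```

Both `f1` and `f2` give the same x-partial derivative at any point, e.g. 2·1 + 2 = 4 at (1, 2).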
If all the partial derivatives of a function are known (for example, with the gradient), then the antiderivatives can be matched via the above process to reconstruct the original function up to a constant.
