Vector calculus identities
The following are important identities involving derivatives and integrals in vector calculus.
For a function f(x, y, z) in three-dimensional Cartesian coordinate variables, the gradient is the vector field:

\nabla f = \frac{\partial f}{\partial x}\,\mathbf{i} + \frac{\partial f}{\partial y}\,\mathbf{j} + \frac{\partial f}{\partial z}\,\mathbf{k}

where i, j, k are the standard unit vectors for the x, y, z-axes. More generally, for a function f(x_1, \ldots, x_n) of n variables, also called a scalar field, the gradient is the vector field:

\nabla f = \frac{\partial f}{\partial x_1}\,\mathbf{e}_1 + \cdots + \frac{\partial f}{\partial x_n}\,\mathbf{e}_n

where \mathbf{e}_1, \ldots, \mathbf{e}_n are mutually orthogonal unit vectors.
As the name implies, the gradient is proportional to, and points in the direction of, the function's most rapid (positive) change.
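The gradient of a scalar field can be illustrated numerically. The sketch below, a hypothetical helper (not from the original text), approximates each partial derivative with central differences and checks it against a hand-computed gradient:

```python
import numpy as np

def gradient(f, p, h=1e-6):
    """Central-difference approximation of the gradient of f at point p:
    the i-th component is (f(p + h*e_i) - f(p - h*e_i)) / (2h)."""
    p = np.asarray(p, dtype=float)
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

# Example scalar field: f(x, y, z) = x^2 + y*z, so grad f = (2x, z, y).
f = lambda v: v[0] ** 2 + v[1] * v[2]
print(gradient(f, [1.0, 2.0, 3.0]))  # approximately [2. 3. 2.]
```

Evaluating f along the direction of the returned vector increases f fastest, matching the geometric description above.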
For a vector field \mathbf{A} = (A_1, \ldots, A_n), also called a tensor field of order 1, the gradient or total derivative is the n × n Jacobian matrix:

\mathbf{J}_{\mathbf{A}} = \left( \frac{\partial A_i}{\partial x_j} \right)_{ij}
For a tensor field of any order k, the gradient is a tensor field of order k + 1.
For a tensor field \mathbf{T} of order k > 0, the tensor field \nabla \mathbf{T} of order k + 1 is defined by the recursive relation

(\nabla \mathbf{T}) \cdot \mathbf{C} = \nabla (\mathbf{T} \cdot \mathbf{C})

where \mathbf{C} is an arbitrary constant vector.
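The Jacobian of a vector field can be approximated the same way, one column of partial derivatives per coordinate. This is a minimal sketch with an assumed example field, not part of the source article:

```python
import numpy as np

def jacobian(A, p, h=1e-6):
    """Central-difference Jacobian J[i, j] = dA_i/dx_j of a vector field A at p."""
    p = np.asarray(p, dtype=float)
    n = p.size
    J = np.zeros((A(p).size, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (A(p + e) - A(p - e)) / (2 * h)
    return J

# Example vector field: A(x, y) = (x*y, x + y), so J = [[y, x], [1, 1]].
A = lambda v: np.array([v[0] * v[1], v[0] + v[1]])
print(jacobian(A, [2.0, 3.0]))  # approximately [[3. 2.] [1. 1.]]
```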
In Cartesian coordinates, the divergence of a continuously differentiable vector field \mathbf{F} = F_x \mathbf{i} + F_y \mathbf{j} + F_z \mathbf{k} is the scalar-valued function:

\operatorname{div} \mathbf{F} = \nabla \cdot \mathbf{F} = \frac{\partial F_x}{\partial x} + \frac{\partial F_y}{\partial y} + \frac{\partial F_z}{\partial z}
As the name implies, the divergence is a (local) measure of the degree to which vectors in the field diverge.
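The divergence definition above (the sum of the diagonal partial derivatives ∂F_i/∂x_i) can be sketched with the same central-difference idea; the field F used here is an assumed example:

```python
import numpy as np

def divergence(F, p, h=1e-6):
    """Central-difference divergence: sum over i of dF_i/dx_i at point p."""
    p = np.asarray(p, dtype=float)
    div = 0.0
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = h
        div += (F(p + e)[i] - F(p - e)[i]) / (2 * h)
    return div

# Example field: F(x, y, z) = (x^2, y*z, z), so div F = 2x + z + 1.
F = lambda v: np.array([v[0] ** 2, v[1] * v[2], v[2]])
print(divergence(F, [1.0, 2.0, 3.0]))  # approximately 6.0
```

A positive value, as here, indicates the field locally spreads outward from the point; a negative value indicates local convergence.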