Multilinear polynomial

## Summary

In algebra, a multilinear polynomial[1] is a multivariate polynomial that is linear (meaning affine) in each of its variables separately, but not necessarily simultaneously. It is a polynomial in which no variable occurs to a power of 2 or higher; that is, each monomial is a constant times a product of distinct variables. For example, f(x, y, z) = 3xy + 2.5y − 7z is a multilinear polynomial of degree 2 (because of the monomial 3xy), whereas f(x, y, z) = x² + 4y is not (because of the x² term). The degree of a multilinear polynomial is the maximum number of distinct variables occurring in any monomial.
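A minimal sketch of "linear in each variable separately": for the example polynomial above, the second finite difference along any single coordinate vanishes, as it does for any function that is affine in that coordinate. The evaluation point is an arbitrary choice for illustration.

```python
# f(x, y, z) = 3xy + 2.5y - 7z, the multilinear example from the text.
def f(x, y, z):
    return 3 * x * y + 2.5 * y - 7 * z

def second_diff(g, point, axis, h=1.0):
    """Second finite difference of g along one coordinate axis.
    It is exactly zero when g is affine in that coordinate."""
    plus, minus = list(point), list(point)
    plus[axis] += h
    minus[axis] -= h
    return g(*plus) - 2 * g(*point) + g(*minus)

point = (1.2, -0.7, 3.0)  # arbitrary test point
print([second_diff(f, point, k) for k in range(3)])  # all (near) zero
```

By contrast, the same check applied to x² + 4y along the x-axis would give 2h² ≠ 0, which is why that polynomial is not multilinear.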

## Definition

Multilinear polynomials can be understood as a multilinear map (specifically, a multilinear form) applied to the vectors [1 x], [1 y], etc. The general form can be written as a tensor contraction:${\displaystyle f(x)=\sum _{i_{1}=0}^{1}\sum _{i_{2}=0}^{1}\cdots \sum _{i_{n}=0}^{1}a_{i_{1}i_{2}\cdots i_{n}}x_{1}^{i_{1}}x_{2}^{i_{2}}\cdots x_{n}^{i_{n}}}$

For example, in two variables:${\displaystyle f(x,y)=\sum _{i=0}^{1}\sum _{j=0}^{1}a_{ij}x^{i}y^{j}=a_{00}+a_{10}x+a_{01}y+a_{11}xy={\begin{pmatrix}1&x\end{pmatrix}}{\begin{pmatrix}a_{00}&a_{01}\\a_{10}&a_{11}\end{pmatrix}}{\begin{pmatrix}1\\y\end{pmatrix}}}$
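The two-variable matrix form above can be evaluated directly as a product (1 x) A (1 y)ᵀ. A rough sketch, where the coefficient values a_ij are arbitrary examples:

```python
def f(x, y, A):
    """Evaluate f = a00 + a10*x + a01*y + a11*x*y as the matrix
    product (1 x) A (1 y)^T, with A[i][j] = a_ij."""
    row = (1, x)
    col = (1, y)
    return sum(row[i] * A[i][j] * col[j] for i in range(2) for j in range(2))

A = [[2.0, -1.0],   # a00, a01
     [0.5,  3.0]]   # a10, a11
print(f(2.0, 4.0, A))  # 2 + 0.5*2 - 1*4 + 3*2*4 = 23.0
```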

## Properties

A multilinear polynomial ${\displaystyle f}$  is linear (affine) when varying only one variable, ${\displaystyle x_{k}}$ :${\displaystyle f(x_{1},x_{2},...,x_{k},...,x_{n})=ax_{k}+b}$ where ${\displaystyle a}$  and ${\displaystyle b}$  do not depend on ${\displaystyle x_{k}}$ . Note that ${\displaystyle b}$  is generally not zero, so ${\displaystyle f}$  is linear in the "shaped like a line" sense, but not in the "directly proportional" sense of a multilinear map.

All repeated second partial derivatives are zero:${\displaystyle {\frac {\partial ^{2}f}{\partial x_{k}^{2}}}=0}$ In other words, its Hessian matrix is a symmetric hollow matrix.

In particular, the Laplacian ${\displaystyle \nabla ^{2}f=0}$ , so ${\displaystyle f}$  is a harmonic function. This implies ${\displaystyle f}$  has maxima and minima only on the boundary of the domain.

More generally, every restriction of ${\displaystyle f}$  to a subset of its coordinates is also multilinear, so ${\displaystyle \nabla ^{2}f=0}$  still holds when one or more variables are fixed. In other words, ${\displaystyle f}$  is harmonic on every "slice" of the domain along coordinate axes.

### On a rectangular domain

When the domain is rectangular in the coordinate axes (e.g. a hypercube), ${\displaystyle f}$  will have maxima and minima only on the vertices of the domain, i.e. the finite set of ${\displaystyle 2^{n}}$  points with minimal and maximal coordinate values. The value of the function on these points completely determines the function, since the value on the edges of the boundary can be found by linear interpolation, and the value on the rest of the boundary and the interior is fixed by Laplace's equation, ${\displaystyle \nabla ^{2}f=0}$ .[1]

The value of the polynomial at an arbitrary point can be found by repeated linear interpolation along each coordinate axis. Equivalently, it is a weighted mean of the vertex values, where the weights are the Lagrange interpolation polynomials. These weights also constitute a set of generalized barycentric coordinates for the hyperrectangle. Geometrically, the point divides the domain into ${\displaystyle 2^{n}}$  smaller hyperrectangles, and the weight of each vertex is the (fractional) volume of the hyperrectangle opposite it.
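The "repeated linear interpolation along each coordinate axis" can be sketched in the two-variable case on the unit square: interpolate along x on the bottom and top edges, then along y between the two results. The vertex values are hypothetical example data.

```python
def lerp(a, b, t):
    """1-D linear interpolation between a (t=0) and b (t=1)."""
    return a + (b - a) * t

def bilinear(f00, f10, f01, f11, x, y):
    """Bilinear interpolation on [0,1]^2 by repeated 1-D interpolation:
    first along x on both horizontal edges, then along y."""
    bottom = lerp(f00, f10, x)
    top = lerp(f01, f11, x)
    return lerp(bottom, top, y)

# At the center, the result is the arithmetic mean of the vertex values.
print(bilinear(1.0, 2.0, 3.0, 4.0, 0.5, 0.5))  # 2.5
```

Interpolating along y first and then along x gives the same result, reflecting that the weights are simply products of 1-D weights.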

Algebraically, the multilinear interpolant on the hyperrectangle ${\displaystyle [a_{i},b_{i}]_{i=1}^{n}}$  is:${\displaystyle f(x)=\sum _{v}f(v)\prod _{i|v_{i}=b_{i}}{\frac {x_{i}-a_{i}}{b_{i}-a_{i}}}\prod _{i|v_{i}=a_{i}}{\frac {b_{i}-x_{i}}{b_{i}-a_{i}}}}$ where the sum is taken over the vertices ${\displaystyle v}$ . Equivalently,${\displaystyle f(x)={\frac {1}{V}}\sum _{v}f(v)\prod _{i|v_{i}=b_{i}}(x_{i}-a_{i})\prod _{i|v_{i}=a_{i}}(b_{i}-x_{i})}$ where V is the volume of the hyperrectangle.
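The vertex-weight formula above translates directly into code: each vertex's weight is a product of normalized distances from x to the opposite faces. A sketch for an arbitrary hyperrectangle, with hypothetical vertex values:

```python
from itertools import product

def multilinear_interp(a, b, vertex_values, x):
    """Evaluate the multilinear interpolant on the hyperrectangle
    [a_i, b_i]. vertex_values maps a 0/1 tuple v (0 -> a_i, 1 -> b_i)
    to f at that vertex."""
    n = len(a)
    total = 0.0
    for v in product((0, 1), repeat=n):
        w = 1.0
        for i in range(n):
            if v[i] == 1:   # vertex at b_i: weight (x_i - a_i)/(b_i - a_i)
                w *= (x[i] - a[i]) / (b[i] - a[i])
            else:           # vertex at a_i: weight (b_i - x_i)/(b_i - a_i)
                w *= (b[i] - x[i]) / (b[i] - a[i])
        total += w * vertex_values[v]
    return total

# 2-D example on [0,2] x [0,4]; at the center every weight is 1/4,
# so the result is the mean of the vertex values.
vals = {(0, 0): 1.0, (1, 0): 2.0, (0, 1): 3.0, (1, 1): 4.0}
print(multilinear_interp((0, 0), (2, 4), vals, (1.0, 2.0)))  # 2.5
```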

The value at the center is the arithmetic mean of the value at the vertices, which is also the mean over the domain boundary, and the mean over the interior. The components of the gradient at the center are proportional to the balance of the vertex values along each coordinate axis.

The vertex values and the coefficients of the polynomial are related by a linear transformation (specifically, a Möbius transform if the domain is the unit hypercube with vertices ${\displaystyle \{0,1\}^{n}}$ , and a Walsh–Hadamard transform if the domain is the symmetric hypercube with vertices ${\displaystyle \{-1,1\}^{n}}$ ).
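On the vertices {0,1}ⁿ, the Möbius transform is inclusion-exclusion over subsets: the coefficient of each monomial is an alternating sum of the vertex values below it. A brute-force sketch (the example polynomial and its vertex values are illustrative):

```python
from itertools import product

def moebius_coefficients(values, n):
    """values maps each 0/1 vertex tuple to f at that vertex.
    Returns a dict mapping S (a 0/1 tuple marking the monomial
    prod_{i: S_i = 1} x_i) to its coefficient a_S."""
    coeffs = {}
    for S in product((0, 1), repeat=n):
        a_S = 0
        for T in product((0, 1), repeat=n):
            if all(t <= s for t, s in zip(T, S)):  # T is a subset of S
                a_S += (-1) ** (sum(S) - sum(T)) * values[T]
        coeffs[S] = a_S
    return coeffs

# Vertex values of f(x, y) = 1 + 2x + 3xy on {0,1}^2.
vals = {(0, 0): 1, (1, 0): 3, (0, 1): 1, (1, 1): 6}
print(moebius_coefficients(vals, 2))
# {(0, 0): 1, (0, 1): 0, (1, 0): 2, (1, 1): 3}
```

The recovered coefficients (constant 1, x-coefficient 2, y-coefficient 0, xy-coefficient 3) match the polynomial, illustrating that the vertex values determine the polynomial completely.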

## Applications

Multilinear polynomials are the interpolants of multilinear or n-linear interpolation on a rectangular grid, a generalization of linear interpolation, bilinear interpolation and trilinear interpolation to an arbitrary number of variables. This is a specific form of multivariate interpolation, not to be confused with piecewise linear interpolation. The resulting polynomial is not a linear function of the coordinates (its degree can be higher than 1), but it is a linear function of the fitted data values.

The determinant, permanent and other immanants of a matrix are homogeneous multilinear polynomials in the elements of the matrix (and also multilinear forms in the rows or columns).

The multilinear polynomials in ${\displaystyle n}$  variables form a ${\displaystyle 2^{n}}$ -dimensional vector space; its monomial basis is also the basis used in the Fourier analysis of (pseudo-)Boolean functions. Every (pseudo-)Boolean function can be uniquely expressed as a multilinear polynomial (up to a choice of domain and codomain).
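As a concrete instance of the unique multilinear representation, XOR on {0,1} equals the multilinear polynomial x + y − 2xy (a standard identity, verified by brute force below):

```python
# XOR as a multilinear polynomial over the domain {0,1}.
def xor_poly(x, y):
    return x + y - 2 * x * y

# Verify agreement with the Boolean XOR on all four vertices.
for x in (0, 1):
    for y in (0, 1):
        assert xor_poly(x, y) == (x ^ y)
print("x + y - 2xy reproduces XOR on {0,1}^2")
```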

Multilinear polynomials are important in the study of polynomial identity testing.[2]