Page 681 - 8th European Congress of Mathematics ∙ 20-26 June 2021 ∙ Portorož, Slovenia ∙ Book of Abstracts
STATISTICS AND FINANCIAL MATHEMATICS

New insight into partial differentiation with non-independent variables

Matieyendou Lamboni, matieyendou.lamboni@gmail.com
Université de Guyane, French Guiana

Summary. We often work with models defined through a function together with equations connecting the input variables, that is, a given function subject to constraints involving the inputs. For such models, it is of interest to determine the partial derivatives with respect to each input variable that comply with the constraint equations. Since the equations connecting the input variables introduce dependency structures among them, and since probability theory allows for a precise characterization of such dependencies, in this abstract we propose new partial derivatives for functions with non-independent variables, making use of the formal description of independence or dependence through the cumulative distribution function (CDF). The proposed new partial derivatives are based on the classical gradient and the CDF. Such derivatives are uniquely defined and do not require any additional assumption. Our approach can be extended to determine cross-partial derivatives as well.

Main results. In this section, we include the distribution of the inputs in the derivation of the partial derivatives. It is to be noted that each initial input $X_j$ lies in a given domain $\Omega_j \subseteq \mathbb{R}$ with $j = 1, \ldots, d$, and we assume that we are able to attribute a distribution to $X_j$. It is common to attribute a normal distribution with a large variance when we do not have much information about the variable, which comes down to making use of a uniform distribution for a bounded domain $\Omega_j$.

Therefore, the input variables $X = (X_1, \ldots, X_d)$ have a given distribution $F$, and we are interested in a function given by $f(X)$ and $h(X) = 0$. In what follows, we assume that $F = \prod_{j=1}^d F_j$ with $F_j$ the CDF of $X_j$, which means that the initial input variables are independent. The equation $h(X) = 0$ introduces dependencies, and this yields new dependent variables $X^c \sim F^c$. It is worth noting that the inputs $X^c$ must satisfy $h(X^c) = 0$, and we have
$$
Y = f(X) \;\stackrel{d}{=}\; f(X^c), \qquad \text{s.t. } h(X) = 0,
$$
provided that $F^c$ is known.
Formally, $X^c \stackrel{d}{=} \{X \sim F : h(X) = 0\}$, and we are able to find the distribution of $X^c$.
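As a simple illustration (our own example, not taken from the abstract): take $d = 2$, $X_1, X_2$ independent $U(0,1)$, and the constraint $h(x) = x_1 + x_2 - 1$. Conditioning $X$ on $h(X) = 0$ gives
$$
X^c \stackrel{d}{=} \{X \sim F : X_1 + X_2 = 1\},
$$
so $F^c$ is supported on the segment $\{(t,\, 1-t) : t \in [0,1]\}$, with $X^c_1 \sim U(0,1)$ and $X^c_2 = 1 - X^c_1$; in particular $h(X^c) = 0$ holds almost surely.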

Indeed, some analytic derivations of $F^c$ can be found in [1]. For a complex function $h$, a copula-based approach is suitable for fitting a distribution to simulated data. Based on $F^c$ or the estimated distribution $\widehat{F}^c$, the multivariate conditional quantile transform (see [2,3,4]) implies a regression representation of $X^c$ (see [5,6]), which in turn implies a dependency function of $X^c$ given by ([1,7])
$$
X^c_{\sim j} = r_j\!\left(X^c_j, U\right),
$$
where $X^c_j$ is independent of $U$; $r_j : \mathbb{R}^d \to \mathbb{R}^{d-1}$ and $\left(X^c_j, X^c_{\sim j}\right) \stackrel{d}{=} \left(X^c_j, r_j(X^c_j, U)\right) \sim F^c$.
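A minimal numerical sketch of this dependency representation (the constraint, the function `f`, and all names here are our illustrative assumptions, not from the abstract): for $X_1, X_2$ independent $U(0,1)$ with $h(x) = x_1 + x_2 - 1$, the constraint determines the remaining input completely, so $r_1$ needs no extra uniform variable $U$.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x1, x2):
    # illustrative function of interest (our choice)
    return x1**2 + x2

def r1(x1c):
    # dependency function: the constraint x1 + x2 = 1 pins x2 completely,
    # so r_1 here is deterministic and the extra uniform U is not needed
    return 1.0 - x1c

x1c = rng.uniform(size=10_000)   # X_1^c keeps its U(0,1) marginal
x2c = r1(x1c)                    # X_{~1}^c = r_1(X_1^c)

# every realization of X^c satisfies the constraint h(X^c) = 0
assert np.allclose(x1c + x2c - 1.0, 0.0)

y = f(x1c, x2c)                  # samples of Y = f(X^c)
```

For a more complex $h$, one would first estimate $F^c$ (e.g. via the copula-based fit mentioned above) before building $r_j$ from the conditional quantile transform.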
Now we have all the elements in hand to provide the partial derivatives (see Theorem 1). To that end, we use
$$
\nabla_j f := \left(f_{x_j}, f_{x_{w_1}}, \ldots, f_{x_{w_{d-1}}}\right)^T; \qquad
J^c_j := \left(1, \frac{\partial r_{w_1,j}}{\partial x_j}, \ldots, \frac{\partial r_{w_{d-1},j}}{\partial x_j}\right)^T,
$$
for the gradient and the partial derivatives of each component of $r_j$ w.r.t. $x_j$, respectively.
Moreover, we write $J^c_j(\ell)$ for the $\ell$-th component of $J^c_j$ and
$$
J^c_{w_k} := \left(\frac{1}{J^c_j(k+1)}, \frac{J^c_j(2)}{J^c_j(k+1)}, \ldots, \frac{J^c_j(d)}{J^c_j(k+1)}\right)^T \quad \forall\, k \in \{1, \ldots, d-1\}.
$$
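For instance (our own two-dimensional example, not from the abstract): with $d = 2$, $h(x) = x_1 + x_2 - 1$, and hence $r_1(x_1) = 1 - x_1$, we get $\partial r_1 / \partial x_1 = -1$, so
$$
J^c_1 = (1, -1)^T, \qquad \nabla_1 f \cdot J^c_1 = f_{x_1} - f_{x_2},
$$
which is exactly the chain-rule derivative of $x_1 \mapsto f(x_1, 1 - x_1)$, i.e. the derivative of $f$ along the constraint.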
