
Uniqueness and Error estimates via Kinetic Entropy Defect Measure

Here are a few thoughts from my preparation for the Kinetic Equations exam at Universite de Savoie, France. The course was taught by Christian Bourdarias and Stephane Gerbi. I had to study an article by Benoit Perthame entitled Uniqueness and Error estimates in First Order Quasilinear Conservation Laws via the Kinetic Entropy Defect Measure.

This was a very nice article to study, since it brings together tools such as distribution theory, measure theory and regularization. It showed me the power of these tools and motivated me to learn more about them.

As its title says, the article gives a relatively new proof of the uniqueness of the solution to a scalar conservation law coupled with a family of entropy inequalities. The only proof known at the time the article was published was due to Kruzkov, and it was more intricate and difficult to understand than the one provided here. The estimates on the entropy defect measure, which will be introduced below, yield error bounds for approximate equations, and these in particular imply uniqueness at once.

Here are my detailed notes on the article. They are handwritten, but I think they are readable. Perthame-Uniqueness and Error Estimates

Consider solutions {u(t,x)} to first order quasilinear scalar conservation laws

\displaystyle (SCL) \ \ \partial_tu+\text{div}\, A(u)=0 \text{ in }\mathcal{D}'((0,\infty) \times \Bbb{R}^d)

endowed with a family of entropy inequalities

\displaystyle (EI) \ \ \partial_tS(u)+\text{div}\, \eta(u)\leq 0 \text{ in }\mathcal{D}'((0,\infty) \times \Bbb{R}^d)

for all Lipschitz continuous and convex functions {S} and

\displaystyle \eta_i(u)=\int_0^u a_i(v)S'(v)dv

\displaystyle a(\cdot)=A'(\cdot) \in L^1_{loc}(\Bbb{R};\Bbb{R}^d).

We call {u} an entropy solution if {u} satisfies {(SCL)} and {(EI)}. It can be proved directly that if {u,v} are entropy solutions then
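As a concrete example (mine, not from the article), take the Burgers flux {A(u)=u^2/2}, so {a(v)=v}, and the Kruzkov entropy {S(u)=|u-k|}. The corresponding entropy flux is

```latex
\eta(u)=\int_0^u v\,\mathrm{sgn}(v-k)\,dv
       =\mathrm{sgn}(u-k)\,\bigl(A(u)-A(k)\bigr)+C_k,
\qquad C_k=-\,\mathrm{sgn}(k)\,A(k),
```

where the constant {C_k} does not affect {\text{div}\,\eta(u)}; up to this constant we recover exactly the flux appearing in the contraction inequality below.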

\displaystyle (CP) \ \ \partial_t|u-v|+\text{div}\, [(A(u)-A(v))\,\text{sgn}\,(u-v)]\leq 0 \text{ in }\mathcal{D}'((0,\infty) \times \Bbb{R}^d).

From here uniqueness follows directly, if we consider an initial condition in {(SCL)} at {t=0} and assume that the solution {u} is {L^1} continuous at {t=0}. A proof can be found in Evans, Partial Differential Equations, for the scalar one-dimensional case, but it translates easily to the multidimensional case. The idea is to consider a suitable choice of test functions in {(CP)}, for example {\phi(t,x)=\alpha(t)\beta(x)} where {0<s<t}, {r>0} and

\displaystyle \alpha(t)=\begin{cases} 0 & [0,s] \\ \text{linear} & [s,s+\delta] \\ 1 & [s+\delta,t] \\ \text{linear} & [t,t+\delta] \\ 0 & [t+\delta,\infty) \end{cases}

where the linear parts are chosen such that {\alpha} is continuous. We choose {\beta: \Bbb{R}^d \rightarrow \Bbb{R}} such that

\displaystyle \beta(x)=\begin{cases}1 & |x| \leq r \\ 0 & |x| \geq r+1 \end{cases}

and {|\nabla \beta(x)| \leq 2} for {r<|x|<r+1}. Replacing {\phi} in {(CP)} and letting first {r \rightarrow \infty} and then {\delta \rightarrow 0} we get

\displaystyle \|u(t,\cdot)-v(t,\cdot)\|_{L^1(\Bbb{R}^d)}\leq \|u(s,\cdot)-v(s,\cdot)\|_{L^1(\Bbb{R}^d)}
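In slightly more detail (my sketch of the standard computation): testing {(CP)} with {\phi=\alpha\beta\geq 0} gives

```latex
0 \le \int_0^\infty\!\!\int_{\Bbb{R}^d}\Bigl[\,|u-v|\,\alpha'(\tau)\,\beta(x)
 + \alpha(\tau)\,(A(u)-A(v))\,\mathrm{sgn}(u-v)\cdot\nabla\beta(x)\Bigr]\,dx\,d\tau.
```

As {r\rightarrow \infty} the second term vanishes, since {\nabla\beta} is supported in the annulus {r<|x|<r+1} and the flux term is integrable there by the assumption on {|A|}; since {\alpha'=1/\delta} on {(s,s+\delta)} and {-1/\delta} on {(t,t+\delta)}, letting {\delta \rightarrow 0} leaves precisely the two {L^1} norms.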

Letting {s \rightarrow 0} and using the {L^1} continuity of {u,v} at {t=0} we obtain

\displaystyle \|u(t,\cdot)-v(t,\cdot)\|_{L^1(\Bbb{R}^d)}\leq \|u^0-v^0\|_{L^1(\Bbb{R}^d)}

where {u^0,v^0} are the initial conditions in {(SCL)}. Taking the same initial condition immediately gives uniqueness.
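The contraction property can also be observed numerically. Here is a small sketch of mine (not from the article) using a Godunov scheme for the Burgers flux {A(u)=u^2/2}; monotone conservative schemes satisfy a discrete analogue of the {L^1} contraction under a CFL condition.

```python
import numpy as np

def godunov_burgers(u, dt, dx, steps):
    """Godunov scheme for u_t + (u^2/2)_x = 0 with periodic boundary."""
    A = lambda w: 0.5 * w**2
    for _ in range(steps):
        ur = np.roll(u, -1)
        # Godunov flux for a convex flux with sonic point at 0
        F = np.maximum(A(np.maximum(u, 0.0)), A(np.minimum(ur, 0.0)))
        u = u - dt / dx * (F - np.roll(F, 1))
    return u

dx = 0.01
x = np.arange(0.0, 1.0, dx)
u0 = np.where(x < 0.5, 1.0, 0.0)      # step data (shock)
v0 = np.sin(2 * np.pi * x)            # smooth data
dt = 0.4 * dx                         # CFL: dt * max|a| / dx <= 1
uT = godunov_burgers(u0, dt, dx, 100)
vT = godunov_burgers(v0, dt, dx, 100)
err_0 = dx * np.abs(u0 - v0).sum()
err_T = dx * np.abs(uT - vT).sum()
print(f"||u-v||_1: t=0: {err_0:.4f}, later: {err_T:.4f}")
```

The discrete {L^1} distance between the two numerical solutions never increases in time, mirroring the inequality above.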

To obtain {(CP)} we introduce the following kinetic formulation, which replaces the family of entropy inequalities {(EI)} by a single equation in an extra variable {\xi}. The kinetic formulation is

\displaystyle (KF) \ \ \partial_t\chi(\xi,u)+a(\xi)\cdot \nabla_x \chi(\xi,u)=\partial_\xi m(t,x,\xi) \text{ in }\mathcal{D}'((0,\infty) \times \Bbb{R}^d\times \Bbb{R})

where the measure {m} in the RHS is nonnegative, locally bounded, and is called the entropy defect measure. Also, the function {\chi} is defined as

\displaystyle \chi(\xi,u)=\begin{cases}1 & 0\leq \xi \leq u \\ -1 & u \leq \xi \leq 0 \\ 0 & \text{otherwise}\end{cases}
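Two basic identities behind {\chi} are {\int_{\Bbb{R}} \chi(\xi,u)\,d\xi = u} and {\int_{\Bbb{R}} S'(\xi)\chi(\xi,u)\,d\xi = S(u)-S(0)} for Lipschitz {S}. A quick numerical sanity check of mine (not from the article), using {S(w)=w^2/2}:

```python
import numpy as np

def chi(xi, u):
    # kinetic function: +1 for 0 < xi < u, -1 for u < xi < 0, 0 elsewhere
    return np.where((0 < xi) & (xi < u), 1.0,
                    np.where((u < xi) & (xi < 0), -1.0, 0.0))

xi = np.linspace(-5.0, 5.0, 2_000_001)
dxi = xi[1] - xi[0]
for u in (-2.3, 0.7, 1.9):
    int_chi = chi(xi, u).sum() * dxi          # ≈ ∫ χ(ξ,u) dξ = u
    int_Schi = (xi * chi(xi, u)).sum() * dxi  # ≈ ∫ S'(ξ)χ dξ = S(u)-S(0) = u²/2
    assert abs(int_chi - u) < 1e-4
    assert abs(int_Schi - 0.5 * u * u) < 1e-4
print("kinetic identities verified")
```

The second identity, with convex {S}, is exactly what makes the passage from {(KF)} back to {(EI)} work below.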

For a proof of the fact that the kinetic formulation {(KF)} is equivalent to {(EI)} you may consult the references in the article. A formal way to see this is to notice that the distribution on the {LHS} of {(EI)} is nonpositive, and since every nonnegative distribution is a measure, the {LHS} of {(EI)} equals a nonnegative measure multiplied by a negative scalar ({-2} in our case); {(KF)} is then just the derivative with respect to {\xi} of {(EI)}, with {S} replaced by Kruzkov's entropies {S(u)=|u-\xi|-|\xi|}. Conversely, if we know {(KF)}, we multiply it by {S'(\xi)}, where {S} is {C^2(\Bbb{R})} and convex, and integrate with respect to {\xi} to obtain

\displaystyle \partial_tS(u)+\text{div}\, \eta(u)=-\int S''(\xi)m(t,x,\xi)d\xi \text{ in } \mathcal{D}'((0,\infty) \times \Bbb{R}^d)

This readily implies {(EI)} since {S'' \geq 0} ({S} is convex) and the measure {m} is nonnegative.

We can now present the first theorem in the article, which proves that {(CP)} holds and gives an estimate for the measure {m}. In the following we denote

\displaystyle |A|(u)=\int_0^u|a(\xi)|d\xi.

Theorem (2.1) If {u,v \in L^1_{loc}((0,\infty) \times \Bbb{R}^d)} are entropy solutions such that {|A|(u),|A|(v) \in L^1_{loc}((0,\infty) \times \Bbb{R}^d)} then {(CP)} holds. Moreover, for any regularizing kernel {\varphi_{\varepsilon}(t,x)} we have

\displaystyle \lim_{\varepsilon \rightarrow 0} \int m(\cdot,\cdot,\xi)\star \varphi_\varepsilon \,\delta(\xi-u(\cdot,\cdot))\star\varphi_\varepsilon \,d\xi = 0 \text{ in }\mathcal{D}'((0,\infty)\times \Bbb{R}^d).

We can give an explicit formula for {(CP)}. If we denote by {q} the entropy defect measure associated to {v} then

\displaystyle \partial_t|u-v|+\text{div}\, [(A(u)-A(v))\,\text{sgn}\,(u-v)]=

\displaystyle =-2 \lim_{\varepsilon \rightarrow 0}\int \left[m(\cdot,\cdot,\xi)\star \varphi_\varepsilon \delta(\xi-v(\cdot,\cdot))\star\varphi_\varepsilon +q(\cdot,\cdot,\xi)\star \varphi_\varepsilon \delta(\xi-u(\cdot,\cdot))\star\varphi_\varepsilon \right]d\xi

Remark that if we choose the regularizing kernel {\varphi_\varepsilon} to be positive, then {(CP)} follows immediately from the second equality, while the first limit is what makes the second equality hold. The proof of the theorem proceeds in three steps.

Proof: Step 1 We can prove that

\displaystyle \xi \mapsto m(t,x,\xi) \text{ is continuous in }\mathcal{D}'((0,\infty) \times \Bbb{R}^d)

and

\displaystyle \partial_t |u-\xi_0|+\text{div}\, \eta_{\xi_0}(u) =-2m(t,x,\xi_0) \text{ in }\mathcal{D}'((0,\infty) \times \Bbb{R}^d)

where {\eta_\xi} denotes the entropy flux corresponding to Kruzkov’s entropy {S(u)=|u-\xi|-|\xi|}.

Step 2 Consider a regularizing kernel {\varphi_\varepsilon(t,x)}. Convolving {(KF)} with {\varphi_\varepsilon(t,x)}, multiplying with {\chi(\xi,u)\star\varphi_\varepsilon(t,x)} and finally integrating with respect to {\xi} we get

\displaystyle \partial_t \int (\chi(\xi,u)\star \varphi_\varepsilon(t,x))^2d\xi+\text{div}\,\int a(\xi)(\chi(\xi,u)\star \varphi_\varepsilon(t,x))^2d\xi=

\displaystyle =-2 \int m(t,x,\xi)\star \varphi_\varepsilon(t,x)[\delta(\xi)-\delta(\xi-u(t,x))]\star \varphi_\varepsilon(t,x) d\xi

where in the RHS we have integrated by parts. Letting {\varepsilon \rightarrow 0} in the above relation we get

\displaystyle \partial_t|u| +\text{div}\,\eta_0(u)=-2m(t,x,0)+L

where {L} is the first limit in the theorem. By step 1 it easily follows that {L=0}.
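The passage to the limit in Step 2 rests on the elementary identities (standard facts about {\chi}, spelled out here for convenience)

```latex
\chi(\xi,u)^2=|\chi(\xi,u)|,\qquad
\int_{\Bbb{R}}|\chi(\xi,u)|\,d\xi=|u|,\qquad
\int_{\Bbb{R}}a(\xi)\,|\chi(\xi,u)|\,d\xi=\eta_0(u),
```

so that {\int (\chi(\xi,u)\star \varphi_\varepsilon)^2 d\xi \rightarrow |u|} and the flux term converges to {\eta_0(u)} as {\varepsilon \rightarrow 0}.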

Step 3 I will present a sketch of the proof, the details being similar to Step 2. One considers

\displaystyle \int[(KF)_u-(KF)_v]\star\varphi_\varepsilon [\chi(\xi,u)-\chi(\xi,v)]\star \varphi_\varepsilon d \xi

where {\varphi_\varepsilon\geq 0}. The conclusion follows easily by letting {\varepsilon \rightarrow 0}.

For the second result of the article, we consider the following approximate kinetic formulation

\displaystyle (AKF) \ \ \partial_t\chi(\xi,v)+a(\xi)\cdot \nabla_x \chi(\xi,v)=\partial_\xi [q(t,x,\xi)+D_x^j e(t,x,\xi)] \text{ in }\mathcal{D}'((0,\infty) \times \Bbb{R}^d\times \Bbb{R})

where {D_x^je} is the {j}-th order derivative in {x} of some error term {e}. We also consider the error norm, for {T >0},

\displaystyle |||e(...)|||_T =\|\sup_\xi |e(\cdot,\cdot,\xi)|\|_{L^1((0,T)\times \Bbb{R}^d)}

The main result of the second theorem of the article is the following error estimate.

Theorem (3.1) Under the same conditions on {u,v} as in the first theorem, and assuming moreover {u^0 \in L^1\cap BV(\Bbb{R}^d)}, {u,v \in C((0,\infty);L^1(\Bbb{R}^d))} and {|A|(u),|A|(v)\in L^\infty ((0,\infty);L^1(\Bbb{R}^d))}, we have

\displaystyle \|u(T,\cdot)-v(T,\cdot)\|_{L^1(\Bbb{R}^d)}\leq \|u^0-v^0\|_{L^1(\Bbb{R}^d)}+C\|u^0\|_{BV}^{j/(j+1)}|||e(...)|||_T^{1/(j+1)}

The proof of this theorem is again divided into several steps; the first three can be carried out like those of the first theorem (there are some complications due to the error term, of course, but the main idea is the same). In the last step, the estimates of Proposition 2.2 from the article are used to deduce a bound for the LHS of the conclusion in terms of {\varepsilon}. Optimizing in {\varepsilon} then gives the desired result.

Finally, at the end of the article an application is presented: the approximation of {(SCL)} by a diffusion (vanishing viscosity) equation

\displaystyle \partial_t v+\text{div}\, A(v)=\varepsilon \Delta v

with {L^1 \cap BV} initial data. For this approximation we have the following estimate

\displaystyle \|u(T,\cdot)-v(T,\cdot)\|_{L^1(\Bbb{R}^d)}\leq \|u^0-v^0\|_{L^1(\Bbb{R}^d)}+C\left(\varepsilon T\|u^0\|_{BV}\|v^0\|_{BV}\right)^{1/2}

as a simple application of Theorem 3.1.
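As an informal illustration (my own numerical sketch, not from the article), one can watch the {L^1} error of the viscous approximation shrink with {\varepsilon} for the Burgers flux {A(v)=v^2/2} and step initial data, whose entropy solution is a single shock of speed {1/2}:

```python
import numpy as np

def viscous_burgers_error(eps, dx=0.002, T=0.3):
    """L1 distance at time ~T between an explicit finite-difference
    solution of v_t + (v^2/2)_x = eps * v_xx (step data) and the
    exact entropy solution, a single shock of speed 1/2."""
    x = np.arange(0.0, 1.0, dx)
    v = np.where(x < 0.5, 1.0, 0.0)
    dt = min(0.25 * dx**2 / eps, 0.4 * dx)   # diffusive + advective stability
    steps = int(round(T / dt))
    for _ in range(steps):
        vx  = (np.roll(v, -1) - np.roll(v, 1)) / (2 * dx)
        vxx = (np.roll(v, -1) - 2 * v + np.roll(v, 1)) / dx**2
        v = v + dt * (eps * vxx - v * vx)
        v[0], v[-1] = 1.0, 0.0               # keep boundary data fixed
    t = steps * dt
    u_exact = np.where(x < 0.5 + 0.5 * t, 1.0, 0.0)
    return dx * np.abs(v - u_exact).sum()

e1 = viscous_burgers_error(0.01)
e2 = viscous_burgers_error(0.0025)
print(f"L1 error: eps=0.01 -> {e1:.4f}, eps=0.0025 -> {e2:.4f}")
```

For a single shock the observed error is dominated by the width of the viscous layer, so it decreases as {\varepsilon \rightarrow 0}, consistently with (in fact better than) the {\sqrt{\varepsilon T}} bound above.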

For more details, you can consult the article, and the references therein.
