New Estimates for the Jensen Gap Using s-Convexity With Applications

In this article, we use s-convex functions and Green functions to obtain bounds for the Jensen gap in both discrete and integral form. We present two numerical examples to verify the main results and to examine the tightness of the bounds. Then, as an application of the discrete result, we derive a converse of the Hölder inequality. Based on the integral result, we obtain a bound for the Hermite-Hadamard gap and present a converse of the Hölder inequality in its integral form. Also, we obtain bounds for the Csiszár and Rényi divergences as applications of the discrete result. Finally, we utilize the bound obtained for the Csiszár divergence to deduce new estimates for some other divergences in information theory.

The function Ŵ is said to be s-concave if inequality (1.1) holds in the reverse direction. Obviously, for s = 1 an s-convex function becomes a convex function, which shows that s-convexity generalizes ordinary convexity.

Lemma 1.2 ([29]). Let B be a convex subset of a real linear space S and let Ŵ : B → R be a convex function. Then the following two statements hold:
(a) Ŵ is s-convex for 0 < s ≤ 1 if Ŵ is non-negative;
(b) Ŵ is s-convex for 1 ≤ s < ∞ if Ŵ is non-positive.
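Since inequality (1.1) is not reproduced here, the following sketch assumes the standard Breckner-type definition of s-convexity, f(tx + (1−t)y) ≤ tˢ f(x) + (1−t)ˢ f(y); it numerically illustrates Lemma 1.2(a) on the non-negative convex function f(x) = x² (the helper name is ours):

```python
# Numerical illustration of Lemma 1.2(a): a non-negative convex function
# is s-convex for every 0 < s <= 1. We assume the Breckner-type inequality
#     f(t*x + (1-t)*y) <= t**s * f(x) + (1-t)**s * f(y)
# corresponds to inequality (1.1); f(x) = x**2 is the test function.

def is_s_convex_on_grid(f, s, xs, ts):
    """Check the s-convexity inequality on a finite grid of points and weights."""
    return all(
        f(t * x + (1 - t) * y) <= t**s * f(x) + (1 - t)**s * f(y) + 1e-12
        for x in xs for y in xs for t in ts
    )

f = lambda x: x * x
xs = [i * 0.5 for i in range(11)]   # sample points in [0, 5]
ts = [i * 0.1 for i in range(11)]   # weights t in [0, 1]

print(all(is_s_convex_on_grid(f, s, xs, ts) for s in (0.25, 0.5, 0.75, 1.0)))
```

Since tˢ ≥ t for t ∈ [0, 1] and s ≤ 1, the s-convex upper bound dominates the ordinary convex combination, which is why the check succeeds for every s in the list.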
The Green function G1 [30], defined on [α1, α2] × [α1, α2], and the integral identity for a function Ŵ ∈ C²[α1, α2] will be used to obtain the main results. Note that G1 is convex and continuous with respect to both variables.

This paper is organized as follows. In Section 2 we give a bound for the discrete Jensen gap for functions whose second derivative has s-convex absolute value, and we also derive a bound for the integral version of the Jensen gap. We then conduct two numerical experiments that provide evidence for the tightness of the bound in the main result. From the discrete result we deduce a converse of the Hölder inequality, and from the integral result a bound for the Hermite-Hadamard gap as well as a converse of the Hölder inequality in its corresponding integral version. Section 3 opens with bounds for the Csiszár and Rényi divergences in the discrete case; we then give estimates for the Shannon entropy, Kullback-Leibler divergence, χ² divergence, Bhattacharyya coefficient, Hellinger distance, and triangular discrimination as applications of the bound obtained for the Csiszár divergence. Conclusions are presented in the final section.
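The object being bounded throughout Section 2 is the discrete Jensen gap; a minimal sketch (our own helper name) evaluating it for ψ = exp, whose second derivative is non-negative and convex, hence s-convex for 0 < s ≤ 1 by Lemma 1.2(a):

```python
import math

# Discrete Jensen gap:  sum_i p_i * psi(x_i) - psi(sum_i p_i * x_i)
# for normalized weights p_i; the gap is non-negative for convex psi.
# psi = exp has |psi''| = exp, which is convex and non-negative, hence
# s-convex for 0 < s <= 1 -- the setting of the discrete main result.

def jensen_gap(psi, x, p):
    total = sum(p)
    p = [pi / total for pi in p]            # normalize the weights
    mean = sum(pi * xi for pi, xi in zip(p, x))
    return sum(pi * psi(xi) for pi, xi in zip(p, x)) - psi(mean)

x = [0.5, 1.0, 2.0, 3.0]
p = [1, 2, 3, 4]
gap = jensen_gap(math.exp, x, p)
print(gap >= 0)   # True, by Jensen's inequality
```

Any upper estimate of this quantity, such as the bound (2.4) of Theorem 2.1, can be tested numerically against `jensen_gap` in exactly this way.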
Remark 2.2. If we use the Green function G2, G3, or G4 instead of G1 in Theorem 2.1, where G2, G3, and G4 are given in [30], we obtain the same result (2.4).
In the following theorem, we give a bound for the Jensen gap in integral form.
In the following proposition, we provide a converse of the Hölder inequality in integral form as an application of Theorem 2.3.

Proposition 2.5. Let q2 > 1 and q1 ∈ (2, 3) be such that 1/q1 + 1/q2 = 1.

As an application of Theorem 2.3, in the following corollary we establish a bound for the Hermite-Hadamard gap.

Corollary 2.6. Let ψ ∈ C²[c1, c2] be a function such that |ψ′′| is s-convex; then
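Corollary 2.6 concerns the two gaps in the Hermite-Hadamard inequality. A quick numeric sketch (helper names are ours) evaluates the three quantities for ψ(x) = x⁴ on [0, 1], where |ψ′′(x)| = 12x² is convex and non-negative, hence s-convex for 0 < s ≤ 1:

```python
# Hermite-Hadamard inequality:
#   psi((c1+c2)/2) <= (1/(c2-c1)) * integral of psi over [c1, c2]
#                  <= (psi(c1) + psi(c2)) / 2.
# Corollary 2.6 bounds the resulting gaps; here we only evaluate the
# three quantities for psi(x) = x**4 on [0, 1].

def mean_integral(psi, c1, c2, n=100000):
    """Average value of psi on [c1, c2] via the composite midpoint rule."""
    h = (c2 - c1) / n
    return sum(psi(c1 + (k + 0.5) * h) for k in range(n)) * h / (c2 - c1)

psi = lambda x: x**4
c1, c2 = 0.0, 1.0

mid = psi((c1 + c2) / 2)            # left end of the chain: 0.0625
avg = mean_integral(psi, c1, c2)    # middle term, close to the exact value 1/5
trap = (psi(c1) + psi(c2)) / 2      # right end of the chain: 0.5

print(mid <= avg <= trap)           # True
print(avg - mid, trap - avg)        # the two Hermite-Hadamard gaps
```

The two printed differences are exactly the quantities that the bound of Corollary 2.6 estimates from above.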
Remark 3.17. Analogously, bounds for various divergences in integral form can be derived as applications of Theorem 2.3.
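All of the divergences estimated in Section 3 arise from the Csiszár framework; the sketch below (helper names are ours) computes the discrete f-divergence and recovers two of the special cases, the Kullback-Leibler and χ² divergences:

```python
import math

# Csiszar f-divergence:  D_f(p, q) = sum_i q_i * f(p_i / q_i).
# The choices f(t) = t*log(t) and f(t) = (t - 1)**2 recover the
# Kullback-Leibler and chi-squared divergences estimated in Section 3.

def csiszar_divergence(f, p, q):
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]

kl = csiszar_divergence(lambda t: t * math.log(t), p, q)
chi2 = csiszar_divergence(lambda t: (t - 1) ** 2, p, q)

# Cross-check KL against its direct formula sum_i p_i * log(p_i / q_i).
kl_direct = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
print(abs(kl - kl_direct) < 1e-12)   # True
print(kl >= 0 and chi2 >= 0)         # True: both divergences are non-negative
```

A single bound for the Csiszár divergence therefore yields bounds for every divergence obtained by a specific choice of f, which is exactly how the estimates of Section 3 are produced.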

CONCLUSION
The Jensen inequality has numerous applications in engineering, economics, computer science, information theory, and coding; it has been derived for convex and generalized convex functions. This paper presents a novel approach to bounding the Jensen gap. Some bounds are obtained for the Jensen gap via s-convex functions. Numerical experiments not only confirm the sharpness of the Jensen inequality but also provide evidence for the tightness of the bound given in (2.15) for the Jensen gap. These experiments also show that the bound in (2.15) gives very close estimates for the Jensen gap even when the functions are not convex. The bounds are used to obtain new estimates for the Hermite-Hadamard and Hölder inequalities. Furthermore, based on the main results, various divergences are estimated. These estimates for divergences can be applied to signal processing, magnetic resonance image analysis, image segmentation, pattern recognition, and other areas. The ideas in this paper can also be used with other inequalities and for some other classes of convex functions.

DATA AVAILABILITY STATEMENT
The original contributions presented in the study are included in the article/supplementary materials; further inquiries can be directed to the corresponding author/s.

AUTHOR CONTRIBUTIONS
MA proposed the main idea. MA and SK worked on the main results, while Y-MC worked on the introduction. All authors carefully checked the whole manuscript and approved it.