We consider decision-making under incomplete information about an unknown state of nature. We show that a decision problem yields a higher value of information than another, uniformly across information structures, if and only if it is obtained by adding an independent, parallel decision problem.
Linear programming is the seminal optimization problem from which today's rich and diverse landscape of optimization models and algorithms has grown. This article provides an overview of recent developments in first-order methods for solving large-scale linear programs.
This paper systematically compares two mathematical foundations for multitarget tracking: labeled random finite sets (LRFSs) and trajectory random finite sets (TRFSs).
We provide formulas for the Riemannian gradient and the Levi-Civita connection for a family of metrics on fixed-rank matrix manifolds, based on nonconstant metrics on Stiefel manifolds.
We present an example of a smooth quasi-convex function on the positive octant of $\mathbb{R}^{3}$ that cannot be obtained as the image of a smooth convex function under a smooth monotone mapping of $\mathbb{R}$.
By constructing a Lyapunov function, it is shown that the Douglas-Rachford iteration with respect to a sphere and a line in $\mathbb{R}^d$ is robustly $\mathcal{KL}$-stable. This implies a mode of convergence stronger than uniform convergence on compact sets.
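The Douglas-Rachford iteration referred to above can be sketched concretely. The snippet below is a minimal illustration assuming the standard operator $T = \tfrac{1}{2}(I + R_B R_A)$ built from reflections through the two sets; the specific instance (unit sphere at the origin, line along the first coordinate axis in $\mathbb{R}^3$) and the helper names `proj_sphere`, `proj_line`, `reflect` are illustrative choices, not taken from the paper.

```python
# Minimal sketch of the Douglas-Rachford iteration for a sphere and a line.
# Setup (unit sphere at the origin, line = first coordinate axis) is an
# assumed illustrative instance, not the paper's general configuration.
import numpy as np

def proj_sphere(x, r=1.0):
    """Nearest point on the sphere of radius r centered at the origin."""
    return r * x / np.linalg.norm(x)

def proj_line(x):
    """Nearest point on the line spanned by the first coordinate axis."""
    p = np.zeros_like(x)
    p[0] = x[0]
    return p

def reflect(proj, x):
    """Reflection through a set: R = 2P - I."""
    return 2.0 * proj(x) - x

x = np.array([0.5, 0.7, 0.3])  # arbitrary starting point in R^3
for _ in range(100):
    # Douglas-Rachford step: average the identity with composed reflections.
    x = 0.5 * (x + reflect(proj_line, reflect(proj_sphere, x)))

# The "shadow" sequence proj_sphere(x) approaches a point of the
# intersection, here (1, 0, 0).
shadow = proj_sphere(x)
```

In this convergent instance the shadow point lands on an intersection of the two sets; the paper's contribution is the stability guarantee for such iterations, which this sketch does not attempt to reproduce.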
A refinement of Zhong's variational principle [Nonlin. Anal., 29 (1997), 1421-1431] is given in the realm of almost metric structures. Applications to equilibrium points are also provided.