# Jonathan Borwein

Jonathan Borwein passed away on August 1st. He was a prolific mathematician, with 427 publications as of this writing. He was also quite broad, publishing in number theory, operations research, the calculus of variations, and many other subjects. Many people knew him for *Pi and the AGM*, the book he wrote with his brother Peter. His most cited work in MathSciNet is his paper "On projection algorithms for solving convex feasibility problems" with Heinz Bauschke (the review is reproduced below). Borwein was also known for promoting experimental mathematics, and was the founding director of the Centre for Experimental and Constructive Mathematics at Simon Fraser University. But many people knew Borwein's mathematics directly, as a mentor or as a collaborator. He had many graduate students and 163 collaborators on published papers.

MR1409591 (98f:90045)
Bauschke, Heinz H.; Borwein, Jonathan M.
On projection algorithms for solving convex feasibility problems.
SIAM Rev. 38 (1996), no. 3, 367–426.

Suppose $X$ is a Hilbert space and $C_1, \cdots ,C_N$ are closed convex subsets whose intersection $C:=\bigcap_1^NC_i$ is nonempty. The convex feasibility problem is to find a point in $C$. This general formulation includes many practical problems of interest arising in diverse areas of mathematics and physical sciences.

A frequently employed approach to solving the convex feasibility problem is algorithmic. A general algorithmic approach, essentially due to Flåm and Zowe [1990], is as follows: given the current iterate $x_n$, the next iterate $x_{n+1}$ is obtained from $$x_{n+1}:= A^{(n)}x_n:=\left(\sum_{i=1}^N\lambda_i^{(n)}[(1-\alpha_i^{(n)})I+ \alpha_i^{(n)}P_i^{(n)}]\right)x_n, \tag{*}$$ where every $P_i^{(n)}$ is the projection onto some approximating superset $C_i^{(n)}$ of $C_i$, every $\alpha_i^{(n)}$ is a relaxation parameter between $0$ and $2$, and the $\lambda_i^{(n)}$'s are nonnegative weights adding up to $1$. Briefly, $x_{n+1}$ is a weighted average of relaxed projections of $x_n$. Algorithms such as $(\ast)$ are called projection algorithms by the authors.
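To make the iteration $(\ast)$ concrete, here is a minimal numerical sketch in Python, not from the paper itself: it takes constant weights $\lambda_i$ and relaxation parameters $\alpha_i$ (so $P_i^{(n)} = P_i$ projects onto $C_i$ itself rather than an approximating superset), and runs the scheme for two sets in $\mathbb{R}^2$, a closed ball and a halfspace. All function names and the specific sets are illustrative choices, not the authors'.

```python
import numpy as np

def project_ball(x, center, radius):
    # Projection onto the closed ball {y : ||y - center|| <= radius}.
    d = x - center
    dist = np.linalg.norm(d)
    return x if dist <= radius else center + radius * d / dist

def project_halfspace(x, a, b):
    # Projection onto the halfspace {y : <a, y> <= b}.
    violation = a @ x - b
    return x if violation <= 0 else x - violation * a / (a @ a)

def step(x, projections, lam, alpha):
    # One iteration of (*): a weighted average of relaxed projections,
    # x_{n+1} = sum_i lam_i * ((1 - alpha_i) * x_n + alpha_i * P_i(x_n)).
    return sum(l * ((1 - al) * x + al * P(x))
               for P, l, al in zip(projections, lam, alpha))

# Two closed convex sets with nonempty intersection:
# the unit ball at the origin, and the halfspace {y : y_1 >= 0.5}.
Ps = [lambda x: project_ball(x, np.zeros(2), 1.0),
      lambda x: project_halfspace(x, np.array([-1.0, 0.0]), -0.5)]

x = np.array([3.0, 2.0])       # arbitrary starting point
for _ in range(100):
    x = step(x, Ps, lam=[0.5, 0.5], alpha=[1.0, 1.0])
# x now lies (approximately) in the intersection of the two sets
```

With $\alpha_i = 1$ and equal weights this reduces to the classical averaged-projections method; the Fejér monotonicity and regularity notions discussed in the review below are what govern whether, and how fast, such iterates converge.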

It was the authors’ intent to unify, improve, and survey a fairly substantial body of results that is included in this general framework. In my opinion, they have succeeded on all counts.

The key concepts used were those of “attracting mappings”, “Fejér monotone sequences”, and their new geometric concept of “regularity”. Roughly speaking, the collection of sets $\{C_1, \cdots, C_N\}$ is called “regular” if “closeness to each of the individual sets $C_i$ implies closeness to their intersection”.

The paper is divided into seven sections. The first contains the introduction, preliminaries, and notation. The second is concerned with attracting mappings and Fejér monotone sequences. In the third section, the basic properties of the algorithm $(\ast)$ and its convergence results are discussed. In Section 4, projection algorithms are studied in detail. In Section 5, the notion of a collection of closed convex sets $\{C_1, \cdots, C_N\}$, with nonempty intersection, being “regular” or “(boundedly) (linearly) regular” is introduced. It is also shown how the rate of convergence of the algorithm is related to these notions.

The applicability of the framework presented in the paper is demonstrated in Section 6 where several examples are given. Finally, in Section 7, subgradient algorithms are discussed in the context of the algorithm $(\ast)$, and several more examples and applications are given.

The exposition is well written, concludes with a useful index of terms, and contains a bibliography of 109 items.

## About Edward Dunne

I am the Executive Editor of Mathematical Reviews. Previously, I was an editor for the AMS Book Program for 17 years. Before working for the AMS, I had an academic career working at Rice University, Oxford University, and Oklahoma State University. In 1990-91, I worked for Springer-Verlag in Heidelberg. My Ph.D. is from Harvard. I received a world-class liberal arts education as an undergraduate at Santa Clara University.