Braverman's Readings in Learning Theory and Related Areas

April 28 - April 30, 2017
Northeastern University, Boston, MA
The Conference, sponsored by the Yandex School of Data Analysis, is organized to mark the 40th anniversary of the death of Emmanuel Braverman, an outstanding Russian scientist and engineer and one of the founders of learning theory. Our goal for the conference is to highlight new trends in modern learning theory and to bring up for discussion the current state and possible further development of the main research directions started by E. Braverman and his students and collaborators.
Emmanuel Braverman (1932-1977). In the early 1970s, together with a small group of collaborators, E. Braverman developed what are now important parts of Data Science, such as kernel functions, cluster analysis of multivariate data, clustering networks and similarity matrices, classifiers and methods of fitting them to data, region detection in spatial data, stratified sampling, subspace clustering, etc. His first paper on Machine Learning, published in 1962, provided a geometric perspective on Rosenblatt's Perceptron and its generalization. The very notion of "machine learning" as a research subject traces back to Braverman's books "Machine Learning for Pattern Recognition" (Moscow: Nauka, 1964, in Russian) and "Machine Learning for Classification" (Moscow: Nauka, 1971, in Russian), which he wrote for general audiences to popularize both the subject and the results obtained. An extraordinary eye for non-trivial analogies allowed him to apply similar methods to obtain interesting results in general non-linear dynamic system theory and in regression structural equation modeling.
E. Braverman was an active proponent of the emerging research area recently dubbed "Data Science". This was perhaps the most important motivation behind his work as the organizer and instructor of the corresponding course for students of Engineering Cybernetics at the Moscow Steel and Alloys Technological University. He played a leading role in organizing and co-supervising the All-Moscow research seminar "Extending Automata Capabilities" at the Institute of Control Problems of the Russian Academy of Sciences. He led quite a few application-oriented developments - in speech recognition, image analysis, sociology, history, geology, and biology, among others.
E. Braverman did not limit himself to purely academic activities. He was far from indifferent to the emotional and social needs of people around him - they frequently sought his opinion and advice on various personal or social issues. His advice was always deep and wise, and usually delivered with a touch of humor, a rather necessary device for keeping one's psychological balance in the Soviet system. This wisdom and compassion led him to develop quite an original approach to modeling the Soviet economy, in which he rightly focused on its defining feature of keeping all prices fixed.
His book "Potential Function Methods in Machine Learning Theory" (co-authored with Mark Aizerman and Lev Rozonoer; Moscow, 1970, in Russian) was the first monograph to use
the term "Machine Learning Theory". Unfortunately, it was never published in English, although the concept of the potential function was later reintroduced by Vladimir Vapnik and became quite popular under the name of "kernel function" (notably, Vladimir Vapnik, in his first writings on the subject, did make all the necessary references to the work of Braverman and his colleagues).
We present here a slightly abridged version of Chapter III.3 of the book. Technical details of some proofs are omitted and replaced by a short sketch of the main steps of the proof. An interested reader can either fill in those details or consult the original Russian edition. The chapter was translated by Benjamin Rozonoer. The translation was edited by Maxim Braverman.