Professor Vladik Kreinovich
TITLE: Dealing with Uncertainties in Computing: from Probabilistic and Interval Uncertainty to Combination of Different Approaches, with Application to Geoinformatics, Bioinformatics, and Engineering
ABSTRACT: Most data processing techniques traditionally used in scientific and engineering practice are statistical. These techniques are based on the assumption that we know the probability distributions of measurement errors and of other sources of uncertainty.
In practice, we often do not know these distributions; we only know a bound D on the measurement accuracy. Hence, after we get the measurement result X, the only information that we have about the actual (unknown) value x of the measured quantity is that x belongs to the interval [X-D, X+D]. Techniques for data processing under such interval uncertainty are called interval computations; these techniques have been developed since the 1950s.
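As a minimal illustration (not part of the talk itself), the sketch below shows how a measurement with result X and accuracy bound D yields the interval [X-D, X+D], and how such intervals propagate through basic arithmetic; all function names here are hypothetical.

```python
# Minimal sketch of interval propagation (illustrative only).
# A measurement X with accuracy bound D yields the interval [X - D, X + D];
# arithmetic on intervals must enclose every possible combination of values.

def measurement_interval(X, D):
    """Interval of possible actual values, given result X and accuracy bound D."""
    return (X - D, X + D)

def interval_add(a, b):
    """Sum of two intervals: the endpoints simply add."""
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    """Product of two intervals: min/max over all four endpoint products."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

x = measurement_interval(10.0, 0.5)   # x is in [9.5, 10.5]
y = measurement_interval(2.0, 0.1)    # y is in [1.9, 2.1]
print(interval_add(x, y))             # approximately (11.4, 12.6)
print(interval_mul(x, y))             # approximately (18.05, 22.05)
```

For monotone operations such as addition, the endpoint formulas are exact; for multiplication, checking all four endpoint products is the standard way to guarantee an enclosure when the intervals may contain negative values.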
In many practical problems, we have a combination of different types of uncertainty, where we know the probability distribution for some quantities, intervals for other quantities, and expert information for yet other quantities.
There exists a large body of theoretical research and practical applications dealing with these types of uncertainty: interval, fuzzy, and combined. However, even for the simplest basic data processing techniques, a lot of research is often still necessary to transition from probabilistic to interval and fuzzy uncertainty.
The purpose of this talk is to describe the theoretical background for interval and combined techniques, to describe the existing practical applications, and ideally, to come up with a roadmap for such techniques.
We start with the problem of chip design in computer engineering. In this problem, traditional interval methods lead to estimates with excess width. The reason for this width is that, in addition to the intervals of possible values of the inputs, we often also have partial information about the probabilities of different values within these intervals, and standard interval techniques ignore this information.
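The excess width can be made concrete with a small numerical experiment (a hypothetical sketch, not from the talk): when we sum n errors, each bounded by [-D, D], the guaranteed interval bound grows linearly in n, while knowledge that the errors are independent and uniformly distributed shows that the sum is almost always much smaller.

```python
# Sketch of excess width: summing n errors, each known only to lie in [-D, D].
import random

n, D = 100, 0.1

# Guaranteed interval result: the sum of n copies of [-D, D] is [-n*D, n*D].
interval_bound = n * D   # worst-case half-width, about 10

# Extra (partial) probabilistic knowledge: errors independent, uniform on [-D, D].
# Then the sum has standard deviation D * sqrt(n / 3), far below n * D.
random.seed(0)
samples = [sum(random.uniform(-D, D) for _ in range(n)) for _ in range(10_000)]
observed_max = max(abs(s) for s in samples)

print(interval_bound)   # what pure interval arithmetic must guarantee
print(observed_max)     # what actually occurs across 10,000 simulated runs
```

The interval answer is not wrong, it is a valid guaranteed enclosure; the point is that ignoring the available probabilistic information makes it far wider than necessary.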
It is, therefore, desirable to extend interval techniques to situations when, in addition to intervals, we also have partial probabilistic information. In the talk, we give a brief overview of these techniques, and we emphasize the following three application areas: computer engineering, bioinformatics, and geoinformatics.
SHORT BIO: Vladik Kreinovich received his MS in Mathematics and Computer Science from St. Petersburg University, Russia, in 1974, and his PhD from the Institute of Mathematics, Soviet Academy of Sciences, Novosibirsk, in 1979. From 1975 to 1980, he worked with the Soviet Academy of Sciences, in particular with the Special Astrophysical Observatory (focusing on the representation and processing of uncertainty in radioastronomy). For most of the 1980s, he worked on error estimation and intelligent information processing for the National Institute for Electrical Measuring Instruments, Russia. In 1989, he was a visiting scholar at Stanford University. Since 1990, he has worked in the Department of Computer Science at the University of Texas at El Paso. In addition, he has served as an invited professor in Paris (University of Paris VI), France; Hong Kong; St. Petersburg, Russia; and Brazil.
His main interests are the representation and processing of uncertainty, especially interval computations and intelligent control. He has published six books, sixteen edited books, and more than 1,200 papers. Vladik is a member of the editorial board of the international journal "Reliable Computing" (formerly "Interval Computations") and several other journals. In addition, he is the co-maintainer of the international Web site on interval computations.
Vladik is Vice President for Publications of the IEEE Systems, Man, and Cybernetics Society; he served as President of the North American Fuzzy Information Processing Society in 2012-14; is a foreign member of the Russian Academy of Metrological Sciences; was the recipient of the 2003 El Paso Energy Foundation Faculty Achievement Award for Research, awarded by the University of Texas at El Paso; and was a co-recipient of the 2005 Star Award from the University of Texas System.
Professor Longpre earned his Ph.D. in Computer Science at Cornell in 1986. He has served for several years as an organizer of the IEEE Conference on Computational Complexity. He has taught computer security courses for almost 20 years, is the Director of UTEP's Center for Information Assurance, and led the efforts for UTEP to become a Center of Academic Excellence in Cyber Defense Education, a National Security Agency (NSA) designation.
TITLE: A View on Dealing with Privacy and Computer Malware Concerns
ABSTRACT: Computer security and privacy concerns have become a priority for our nation. A traditional approach is to develop strong and intelligent virus and intrusion detection programs. Although this approach is important, we believe it will never really put these concerns to rest. A more promising approach is to be able to run security- and privacy-sensitive programs with guaranteed confidentiality and integrity, even in a computing environment containing malware. We will start with the difficulty of defining privacy and a review of some existing definitions. We will continue with different approaches to running programs securely in an insecure environment, including whitelisting and process isolation techniques.
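Whitelisting, one of the approaches mentioned above, can be sketched in a few lines (an illustrative toy, not a description of any actual system discussed in the talk; all names are hypothetical): only programs whose cryptographic hashes appear on an approved list are allowed to run.

```python
# Toy sketch of hash-based whitelisting (illustrative only).
import hashlib

def file_hash(data: bytes) -> str:
    """SHA-256 digest of a program's bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical allowlist of trusted program hashes.
ALLOWLIST = {file_hash(b"#!/bin/sh\necho trusted\n")}

def may_run(program_bytes: bytes) -> bool:
    """Permit execution only if the program's hash is on the allowlist."""
    return file_hash(program_bytes) in ALLOWLIST

print(may_run(b"#!/bin/sh\necho trusted\n"))   # True: hash matches the allowlist
print(may_run(b"#!/bin/sh\nrm -rf /\n"))       # False: unknown program is blocked
```

The design choice here is default-deny: unlike virus detection, which tries to recognize known-bad programs, whitelisting blocks everything that is not known-good, which is why it pairs naturally with process isolation for the remaining trusted code.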