2016 IEEE/WIC/ACM International Conference on Web Intelligence
October 13-16, 2016 in Omaha, Nebraska, USA

Keynote Speakers

Dr. Leslie Valiant (Turing Award 2010): A Computational Model and Theory of Cortex

Harvard University

Bio: Leslie Valiant was educated at King's College, Cambridge; Imperial College, London; and Warwick University, where he received his Ph.D. in computer science in 1974. He is currently T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics in the School of Engineering and Applied Sciences at Harvard University, where he has taught since 1982. Before coming to Harvard he taught at Carnegie Mellon University, Leeds University, and the University of Edinburgh. He has interests in computational neuroscience, evolution, and artificial intelligence, and is the author of two books, Circuits of the Mind and Probably Approximately Correct. He received the Nevanlinna Prize at the International Congress of Mathematicians in 1986, the Knuth Prize in 1997, the European Association for Theoretical Computer Science (EATCS) Award in 2008, and the 2010 A. M. Turing Award. He is a Fellow of the Royal Society (London) and a member of the National Academy of Sciences.

Abstract: The brain performs many kinds of computation for which it is challenging to hypothesize any mechanism that does not contradict the quantitative evidence. Over a lifetime the brain performs hundreds of thousands of individual cognitive acts, of a variety of kinds, most having some dependence on past experience and having in turn long-term effects on future behavior. It is difficult to reconcile such large-scale capabilities, even in principle, with the known resource constraints on cortex, such as low connectivity and low average synaptic strength, and with the requirement that there be explicit algorithms that realize these acts.
Here we shall describe model neural circuits and associated algorithms that respect the brain's most basic resource constraints. These circuits simultaneously support a suite of four basic model tasks that each requires some circuit modification: memory allocation, association, supervised memorization, and inductive learning of threshold functions. The capacity of these circuits is established by simulating sequences of thousands of such acts in a computer, and then testing the circuits created for the cumulative efficacy of the many past acts. Thus the earlier acts of learning need to be retained without undue interference from the more recent ones.
A basic prerequisite for this endeavor is devising an appropriate model of computation that reflects the gross quantitative parameters of cortex, including timing, and that can be used to express algorithms for these systems-level tasks in a distributed environment.
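To make the capacity experiment described in the abstract concrete, here is a minimal toy sketch in Python. It is an editorial illustration only, not the speaker's neuroidal model or algorithms: the parameters, the random-assembly allocation rule, and the recall proxy below are all invented for brevity. Items are stored as random neuron assemblies in a sparsely connected circuit, intra-assembly synapses are strengthened, and after thousands of acts the earliest allocations are re-measured to see how the accumulated later acts have affected them.

# A minimal toy sketch (an editorial illustration, not the speaker's model):
# every parameter and update rule below is an assumption chosen for brevity.
import random

N_NEURONS = 10_000      # circuit size (far smaller than real cortex)
SPARSITY = 0.02         # probability that an ordered neuron pair is connected
ASSEMBLY_SIZE = 50      # neurons recruited per stored item
N_ACTS = 2_000          # sequential memory-allocation acts to simulate

random.seed(0)
weights = {}            # sparse synapse store: (pre, post) -> strength

def connected(pre, post):
    # Fixed sparse random connectivity, decided deterministically per pair.
    return hash((pre, post)) % 10_000 < SPARSITY * 10_000

def allocate():
    # One memory-allocation act: recruit a random assembly and strengthen
    # the synapses that happen to exist between its members.
    assembly = random.sample(range(N_NEURONS), ASSEMBLY_SIZE)
    for pre in assembly:
        for post in assembly:
            if pre != post and connected(pre, post):
                weights[(pre, post)] = weights.get((pre, post), 0.0) + 1.0
    return assembly

def intra_strength(assembly):
    # Total synaptic weight inside an assembly: a crude proxy for how well
    # the stored item can still be recalled.
    return sum(weights.get((pre, post), 0.0)
               for pre in assembly for post in assembly if pre != post)

assemblies = [allocate() for _ in range(N_ACTS)]
at_allocation = ASSEMBLY_SIZE * (ASSEMBLY_SIZE - 1) * SPARSITY  # expected weight when stored
early_now = sum(intra_strength(a) for a in assemblies[:100]) / 100
print(f"expected weight at allocation ~{at_allocation:.0f}; "
      f"mean weight of the 100 earliest assemblies after {N_ACTS} acts: {early_now:.1f}")

The sketch only illustrates the methodology the abstract names: simulate a long sequence of acts, then go back and test the circuits created by the earliest ones for the cumulative effect of everything that came later. The circuits discussed in the talk must additionally support association, supervised memorization, and inductive learning of threshold functions under realistic cortical parameters.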

Dr. Butler Lampson (Turing Award 1992): Personal Control of Data

Technical Fellow at Microsoft Corporation and Adjunct Professor at MIT

Bio: Butler Lampson is a Technical Fellow at Microsoft Corporation and an Adjunct Professor at MIT. He has worked on computer architecture, local area networks, raster printers, page description languages, operating systems, remote procedure call, programming languages and their semantics, programming in the large, fault-tolerant computing, transaction processing, computer security, WYSIWYG editors, and tablet computers. He was one of the designers of the SDS 940 time-sharing system, the Alto personal distributed computing system, the Xerox 9700 laser printer, two-phase commit protocols, the Autonet LAN, the SPKI system for network security, the Microsoft Tablet PC software, the Microsoft Palladium high-assurance stack, and several programming languages. He received the ACM Software System Award in 1984 for his work on the Alto, the Turing Award in 1992, the IEEE Computer Pioneer Award in 1996, the IEEE John von Neumann Medal in 2001, and the NAE's Draper Prize in 2004.

Abstract: People around the world are concerned that more and more of their personal data is on the Internet, where it’s easy to find, copy, and link up with other data. Data about people’s presence and actions in the physical world (from cameras, microphones, and other sensors) will soon be just as important as data that is born digital. What people most often want is a sense of control over their data (even if they don’t exercise this control very often). Control means that you can tell who has your data, limit what they can do with it, and change your mind about the limits. Many people feel that this control is a fundamental human right (thinking of personal data as an extension of the self) or an essential part of their property rights to their data.
Regulators are starting to respond to these concerns. Because societies around the world have different cultural norms and governments have different priorities, there will not be a single worldwide regulatory regime. However, it does seem possible to have a single set of basic technical mechanisms that support regulation, based on the idea of requiring data holders to respect the current policy of data subjects about how their data is used.
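As a concrete illustration of the kind of mechanism the abstract alludes to, the following Python fragment is an editorial sketch with hypothetical class and method names, not a design from the talk. It models data holders that consult a data subject's current policy at every use, so a subject who changes their mind immediately changes what holders may do.

# A minimal sketch of a policy-respecting mechanism (an editorial illustration
# with hypothetical class and method names, not a design from the talk).
from dataclasses import dataclass, field

@dataclass
class Policy:
    allowed_purposes: set = field(default_factory=set)   # e.g. {"billing"}

class PolicyRegistry:
    """Authoritative store of each data subject's current policy."""
    def __init__(self):
        self._policies = {}

    def set_policy(self, subject_id, policy):
        # Subjects can change their mind at any time; later checks see the update.
        self._policies[subject_id] = policy

    def is_allowed(self, subject_id, purpose):
        policy = self._policies.get(subject_id)
        return policy is not None and purpose in policy.allowed_purposes

class DataHolder:
    """A holder that re-checks the subject's current policy on every use."""
    def __init__(self, registry):
        self.registry = registry
        self.records = {}                     # subject_id -> personal data

    def store(self, subject_id, data):
        self.records[subject_id] = data

    def use(self, subject_id, purpose):
        if not self.registry.is_allowed(subject_id, purpose):
            raise PermissionError(f"{purpose!r} not permitted by {subject_id}'s current policy")
        return self.records[subject_id]

registry = PolicyRegistry()
holder = DataHolder(registry)
registry.set_policy("alice", Policy(allowed_purposes={"billing"}))
holder.store("alice", {"email": "alice@example.com"})
holder.use("alice", "billing")                # permitted under the current policy
registry.set_policy("alice", Policy())        # the subject withdraws permission
# holder.use("alice", "billing") would now raise PermissionError

The essential design choice in the sketch is that holders never cache a consent decision: every use consults the subject's current policy, which is what gives the subject the ability to change their mind about the limits.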