Mjolnir: talks

Achieving zero-latency user interface technologies — A tale of psychometrics, electrical engineering, input processing, software design, and entrepreneurship
Daniel Wigdor (University of Toronto)

Friday, April 15th 2016 from 14:30 to 15:30 in Inria Lille - Nord Europe's amphitheater (B building)

Registration required: due to increased security measures, only people registered through this form will be able to attend the talk.


Latency is a scourge that has long plagued interactive computing. In this talk, I will describe how my team at the University of Toronto and Tactual Labs have been working to banish it once and for all: first, by studying human perception of latency and discovering that existing dogma about what is "good enough" is two orders of magnitude above what can actually be perceived; second, by developing all-new touch-sensing technology capable of sub-millisecond response times; third, by recognizing that traditional interactive computing architectures, intended to enable good software engineering, create Moore's-law-proof bottlenecks; and fourth, by re-architecting modern hardware and operating systems to overcome these bottlenecks while leaving the developer experience unchanged for application designers. From these and other innovations, we hope to finally provide truly zero-latency interaction experiences.
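The abstract's claim that pipelined architectures create Moore's-law-proof bottlenecks can be illustrated with a back-of-the-envelope calculation. The sketch below is not from the talk; the stage names and durations are assumptions chosen to be typical of a frame-locked 60 Hz touch pipeline:

```python
# Illustrative sketch (hypothetical numbers): why a staged input pipeline
# imposes a latency floor that faster processors cannot remove.

def end_to_end_latency_ms(stage_latencies_ms):
    """Worst-case input-to-display latency: each pipelined stage adds
    its full delay before the event reaches the next stage."""
    return sum(stage_latencies_ms)

# Hypothetical pipeline: sensor scan, OS input stack, application
# redraw, compositor pass, and display refresh, mostly locked to 60 Hz.
pipeline = {
    "touch_scan":   16.7,  # sensor scanned once per 60 Hz frame
    "input_stack":   5.0,  # OS event delivery and processing
    "application":  16.7,  # app redraws at the next 60 Hz tick
    "compositor":   16.7,  # composited on the following frame
    "display":      16.7,  # pixels shown at the next refresh
}

total = end_to_end_latency_ms(pipeline.values())
print(f"worst-case latency: {total:.1f} ms")  # ~71.8 ms

# A faster CPU shrinks only "input_stack"; the frame-locked stages are
# bounded by refresh rate, not processor speed -- which is why the
# bottleneck is immune to Moore's law and the fix is architectural.
```

Even with instant computation, the frame-locked stages alone leave roughly 67 ms of latency, far above the perceptual thresholds the talk describes, which is why the work targets the architecture itself rather than raw processing speed.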


Daniel Wigdor is an assistant professor of computer science and co-director of the Dynamic Graphics Project at the University of Toronto. His research is in the area of human-computer interaction, with major areas of focus in the architecture of highly performant UIs, in development methods for ubiquitous computing, and in post-WIMP interaction methods. Before joining the faculty at U of T in 2011, Daniel was a researcher at Microsoft Research, the user experience architect of the Microsoft Surface Table, and a company-wide expert in user interfaces for new technologies. Simultaneously, he served as an affiliate assistant professor in both the Department of Computer Science & Engineering and the Information School at the University of Washington. Prior to 2008, he was a fellow at the Initiative in Innovative Computing at Harvard University, and conducted research as part of the DiamondSpace project at Mitsubishi Electric Research Labs. He is co-founder of Iota Wireless, a startup dedicated to the commercialization of his research in mobile-phone gestural interaction, and of Tactual Labs, a startup dedicated to the commercialization of his research in high-performance, low-latency user input. For his research, he has been awarded an Ontario Early Researcher Award (2014) and the Alfred P. Sloan Foundation’s Research Fellowship (2015), as well as best paper awards or honorable mentions at CHI 2016, CHI 2015, CHI 2014, Graphics Interface 2013, CHI 2011, and UIST 2004. Three of his projects were selected as the People’s Choice Best Talks at CHI 2014 and CHI 2015.

Daniel is the co-author of Brave NUI World: Designing Natural User Interfaces for Touch and Gesture, the first practical book for the design of touch and gesture interfaces. He has also published dozens of other works as invited book chapters and papers in leading international publications and conferences, and is an author of over three dozen patents and pending patent applications. Daniel is sought after as an expert witness, and has testified before courts in the United Kingdom and the United States. Further information, including publications and videos demonstrating some of his research, can be found at www.dgp.toronto.edu/~dwigdor.



In accordance with articles 39 and 40 of the French law "loi Informatique et Libertés", you have the right to access and rectify data about you. In practice, upon proof of your identity, you can ask us whether or not we process data about you and, if so, which data, for what purpose, and how. French law also grants you the right to rectify, update, and delete these data. For any such request, please contact Nicolas Roussel.