Thursday, September 16, 2010
12:00 pm - 1:00 pm
PLACE: Gates 8102 - NOTE SPECIAL LOCATION
TITLE: Search by Sight: Google Goggles
Google Goggles enables Search by Sight: you can get relevant search results for whatever you're looking at simply by taking a picture with your mobile device. It's ideal for things that aren't easy to describe in words. We take a look at how Goggles operates: a collection of machine vision channels under a root that attempts to find the best few words out of the thousand that make up a single picture. We also consider how Goggles may evolve after various hardware advances. With Augmented Reality near the peak of the hype curve, we also review the conditions under which an AR presentation of information leads to a better user experience. What are the characteristics of searches triggered by stimuli in the environment, such as an interesting object I am looking at, versus searches started by an urge "within" me?
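The root-and-channels architecture the abstract mentions can be pictured as a fan-out followed by a merge: the root sends one picture to every recognition channel and keeps only the few highest-confidence labels. The sketch below is purely illustrative; the channel names, scores, and function signatures are hypothetical stand-ins, not Goggles internals.

```python
# Illustrative sketch (hypothetical names): a "root" fans an image out to
# several machine vision channels and keeps the best few results.
from typing import Callable, List, Tuple

# Each channel takes image bytes and returns (label, confidence) candidates.
Channel = Callable[[bytes], List[Tuple[str, float]]]

def ocr_channel(image: bytes) -> List[Tuple[str, float]]:
    # Stand-in for a text-recognition channel.
    return [("menu: espresso $3", 0.92)]

def landmark_channel(image: bytes) -> List[Tuple[str, float]]:
    # Stand-in for a landmark-recognition channel.
    return [("Golden Gate Bridge", 0.35)]

def logo_channel(image: bytes) -> List[Tuple[str, float]]:
    # Stand-in for a logo-recognition channel; nothing recognized here.
    return []

def root_search(image: bytes, channels: List[Channel], top_k: int = 3) -> List[str]:
    """Fan the image out to every channel, merge candidates, keep the top few."""
    candidates: List[Tuple[str, float]] = []
    for channel in channels:
        candidates.extend(channel(image))
    candidates.sort(key=lambda c: c[1], reverse=True)
    return [label for label, _ in candidates[:top_k]]

results = root_search(b"<jpeg bytes>", [ocr_channel, landmark_channel, logo_channel])
```

In this toy run, the root returns the OCR result first because it has the highest confidence; a real system would of course run the channels in parallel and use far richer scoring.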
David is currently a Staff Engineer at Google, where he leads part of the Google Goggles project. David holds a Ph.D. from Carnegie Mellon University and a B.S. from UC Berkeley, rides his bike to work, plays drums, and once opened for the rock band No Doubt!
Visitor Host: Garth Gibson
Visitor Coordinator: Angela Miller (firstname.lastname@example.org)
SDI / LCS Seminar Questions?
Karen Lindenfelser, 86716, or visit www.pdl.cmu.edu/SDI/