Microsoft’s Mundie Predicts ‘New Era of Computing’
Posted October 28, 2011 | Atlanta, GA
Before a packed auditorium in the middle of Georgia Tech’s Homecoming week, Microsoft’s Chief Research and Strategy Officer—and two-time Tech alumnus—Craig Mundie, EE 1971, MS CS 1972, laid out a technology-enhanced vision of the future. And that future, he said, is not so far away.
Craig Mundie spoke at Georgia Tech on Thursday, October 27, 2011.
“This is the beginning of an era of computing that we think will be substantially different,” said Mundie, delivering the College of Computing’s John P. Imlay Lecture in the College of Management’s LeCraw Auditorium.
Mundie said the new era could mark “the fourth paradigm” in science, preceded by the eras of theory, experimentation and modeling, the latter made possible by previous advances in computational capability. But according to Mundie, in the era of “Big Data,” everyone will have access to vast stores of information in “the cloud.” Combined with tools for visual analytics, these massive public data sets will reveal findings that would have been exceedingly hard to recognize before.
“You can take your Visa card, go online and rent computing capability larger than anything the government used to have,” said Mundie.
To demonstrate, Mundie pulled up an ordinary Microsoft Excel spreadsheet and, with a couple of mouse clicks, populated it with 30 years of precipitation data for the eastern United States using a feature that links Excel to publicly available data sets stored in the cloud. One or two clicks more, and the data turned into a bar graph on which spikes and dips were easily discernible.
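The workflow Mundie demonstrated—pull a public data set, aggregate it, chart it—can be sketched in a few lines of plain Python. The precipitation figures below are synthetic stand-ins (the talk used real cloud-hosted data), and the function names are illustrative, not part of any Excel or Microsoft API:

```python
# Illustrative sketch of the spreadsheet demo: aggregate a precipitation
# series by year and render a text bar chart. The numbers are synthetic
# stand-ins, not real weather data.
from collections import defaultdict

# (year, month, inches) tuples -- made-up example records
records = [
    (1981, 1, 3.2), (1981, 6, 5.1), (1981, 11, 2.4),
    (1982, 2, 4.0), (1982, 7, 6.3), (1982, 12, 1.9),
    (1983, 3, 2.8), (1983, 8, 7.5), (1983, 10, 3.3),
]

def yearly_totals(rows):
    """Sum precipitation per year, like a pivot on the Year column."""
    totals = defaultdict(float)
    for year, _month, inches in rows:
        totals[year] += inches
    return dict(sorted(totals.items()))

def bar_chart(totals):
    """One line per year; bar length proportional to total rainfall."""
    return "\n".join(
        f"{year}  {'#' * round(total)}  {total:.1f} in"
        for year, total in totals.items()
    )

print(bar_chart(yearly_totals(records)))
```

A spreadsheet hides these steps behind clicks, but the shape of the computation—join, group, plot—is the same one Mundie's cloud-backed Excel performed.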
“Things that are hard to find in some ways leap out at you when you have these capabilities,” Mundie said. “Machine learning is going to be a big part of this Big Data environment. It’s going to enable you to find patterns that people would have a hard time finding before.”
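A minimal example of the kind of pattern-finding Mundie described is flagging values that deviate sharply from a series mean. The z-score threshold and the rainfall numbers below are illustrative assumptions, not anything presented in the talk:

```python
# Minimal anomaly-detection sketch: flag readings whose z-score
# magnitude exceeds a threshold. Data and threshold are illustrative.
from statistics import mean, stdev

def find_spikes(values, threshold=2.0):
    """Return indices of values more than `threshold` std devs from the mean."""
    mu = mean(values)
    sigma = stdev(values)
    return [i for i, v in enumerate(values)
            if sigma > 0 and abs(v - mu) / sigma > threshold]

rainfall = [3.1, 2.9, 3.3, 3.0, 9.8, 3.2, 2.8, 3.1, 0.2, 3.0]
print(find_spikes(rainfall))  # the 9.8 reading at index 4 stands out
```

Real machine-learning systems apply far richer models, but the principle is the one Mundie pointed to: with enough data, anomalies that a person would miss "leap out" of the computation.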
Another hallmark of this new era will be computers that receive input more like people do, shifting from the traditional graphical user interface (GUI) to a natural user interface (NUI). For example, instead of keying in data, users of next-generation smartphones will be able simply to point their phone at an object such as a book and have the phone not only recognize which book it is, but also instantly offer a range of information or access to applications that can pull in even more information or services related to the book. Other innovations will let people talk to their computers or control them simply by waving their hands.
Mundie closed with a series of demonstrations of the possibilities offered by the Microsoft Kinect. Retailing at just $149, the Kinect is a “revolution” in high-quality, affordable machine vision, Mundie said, with many of the same capabilities as equipment that cost $30,000 or more just a year ago.
Immediately after Kinect’s release in November 2010, users around the world started developing hacks to make the device do much more than just play video games, and in June of this year, Microsoft supported the effort by releasing a Kinect software development kit. The company also created Avatar Kinect, which enabled 70 million Xbox users to hold virtual meetings with each other.
“I view this as the first step toward more photorealistic and [business-minded] applications of this technology,” Mundie said. “We are not very far away—maybe three years—from having telemeetings with photorealistic avatars and real-time language translation, including adjustments of facial movements to account for the translation.”
Mundie’s visit was co-sponsored by the College of Computing, the Georgia Tech Alumni Association, Peach New Media, the Georgia Tech Student Alumni Association and the Georgia Tech Office of Greek Life. Video below courtesy of Peach New Media.