I'm leading a talented group of graduate and undergraduate students who are developing a new cloud-based service, called Autolab, that lets teachers offer autograded programming assignments to students around the world.
The service is based on the notion of "autograding", that is, programs evaluating other programs. In this model, teachers select the labs for their classes from a repository of high-quality labs written by other teachers and students. An author whose lab is adopted for a class receives a small royalty payment and community recognition in the form of a public adoptions page. The autograding service provides each student with a virtual machine in the cloud that they can use to store and work on their lab assignments. Each lab is distributed with its own autograder, which students can run at any time to check their progress. The scores from each autograder invocation are streamed back to the autograding service, where they are displayed anonymously on a real-time scoreboard that is visible to everyone in the class. Each time a student hands in their work for credit, the service spins up a fresh VM and autogrades the student's work in that new VM.
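The core autograding loop described above can be sketched as a small script: run the student's program against a set of reference tests, tally the results, and emit a score record that a service could stream back to a scoreboard. The test cases, five-second timeout, and JSON score format below are illustrative assumptions for this sketch, not Autolab's actual interface:

```python
import json
import os
import subprocess
import sys
import tempfile

# Hypothetical reference tests for an "add two integers" lab:
# each entry is (stdin input, expected stdout).
TESTS = [
    ("2 3\n", "5\n"),
    ("10 -4\n", "6\n"),
]

def autograde(student_script):
    """Run the student's program against each test and return a score report."""
    passed = 0
    for stdin_data, expected in TESTS:
        try:
            result = subprocess.run(
                [sys.executable, student_script],
                input=stdin_data, capture_output=True,
                text=True, timeout=5)
            if result.stdout == expected:
                passed += 1
        except subprocess.TimeoutExpired:
            pass  # a hung submission scores zero on this test
    # The kind of record a service might stream back to the scoreboard.
    return {"tests": len(TESTS), "passed": passed,
            "score": 100.0 * passed / len(TESTS)}

if __name__ == "__main__":
    # Demo: a toy "student submission" that sums two integers from stdin.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write("a, b = map(int, input().split())\nprint(a + b)\n")
        path = f.name
    try:
        print(json.dumps(autograde(path)))
        # → {"tests": 2, "passed": 2, "score": 100.0}
    finally:
        os.unlink(path)
```

In a real deployment the untrusted submission would run inside an isolated VM rather than a bare subprocess, which is exactly why the service spins up a fresh VM per handin.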
We are actively developing and deploying a local version of the Autolab service at Carnegie Mellon. The system is used by over 1,200 CMU students each semester, in courses such as 15-110: Intro to CS (Cortina), 15-112: Intro to Programming (Kosbie), 15-122: Intro to Imperative Computing (Platzer), 18-213/15-213: Intro to Computer Systems (O'Hallaron, Ganger, Mowry, Rowe), 15-381: Artificial Intelligence (Veloso, Brunskill), 15-441: Distributed Systems (Andersen, Bryant), and 15-746/18-746: Storage Systems (Ganger, Gibson). The site serves about 15,000 page views each day. Since September 2010, we have autograded over 60,000 jobs.