This turned out to be a pretty nerdy post, but there are nuggets you might be able to use as small talk in the next mosh pit you visit. (Hey, it's worth a try.)
The World Community Grid is a network of individually-owned PCs each running a small, unobtrusive piece of software that works on a tiny part of a much larger problem. When your machine is idle, the program wakes up and thinks for a while, then squirts partial results back to Grid Headquarters, where they are combined with results from nearly 1 million other machines. If it were a supercomputer, the World Community Grid would be the third-fastest in the world. It's used for mathematical simulations around protein folding, cancer, dengue, agronomy, muscular dystrophy, and other research topics.
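The split-compute-combine pattern described above can be sketched in a few lines. This is a toy illustration, not the actual World Community Grid software; the names (`WorkUnit`, `headquarters`) and the sum-of-squares "problem" are made up for the example.

```python
from dataclasses import dataclass

@dataclass
class WorkUnit:
    """A tiny slice of a much larger problem."""
    uid: int
    start: int
    stop: int

def compute(unit: WorkUnit) -> int:
    # Stand-in for a compute-heavy kernel: sum of squares over this slice.
    # On the real grid, this is what runs while your PC is idle.
    return sum(n * n for n in range(unit.start, unit.stop))

def headquarters(total_range: int, chunk: int) -> int:
    """Split the problem into work units, farm each one out, combine the partial results."""
    units = [WorkUnit(i, lo, min(lo + chunk, total_range))
             for i, lo in enumerate(range(0, total_range, chunk))]
    partials = [compute(u) for u in units]   # each unit would run on a different idle PC
    return sum(partials)                     # partial results recombined at headquarters

# The combined answer matches computing the whole thing in one place:
assert headquarters(10_000, 256) == sum(n * n for n in range(10_000))
```

The point of the pattern is that each work unit is independent, so a million machines can chew on them in any order, at any speed, and headquarters only has to add up whatever comes back.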
SETI@home, which at 3 million machines is exactly three times as impressive as the World Community Grid, is dedicated to finding signs of extra-terrestrial life by analyzing radio signals from distant stars. It's a quixotic quest, and we can only hope for its complete failure. Here's a massive understatement: Few low-tech societies have benefited from contact with high-tech civilizations. Very much the reverse, in fact. And a starfaring civilization (if that's what we find) is about as high-tech as they come.
Back to the point with a provocative thought: Could you run your nightly batch cycle on your employees' PCs?
Maybe save a bundle on hardware? Probably not. The technique is best for compute-intensive tasks--tasks that require lots of calculation and little data. Conventional batch is usually the reverse: quick, simple calculations but lots of information to be retrieved and returned. Spray thousands of tasks like that onto your company's idle PCs and watch the network collapse under the resulting tsunami of database calls. (At least, I think that's how it'd go...)
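The ratio argument above can be put in back-of-the-envelope numbers. All figures here are illustrative assumptions, not measurements; the idea is just that a grid-friendly work unit does an enormous amount of arithmetic per byte it ships around, while a conventional batch record does almost none.

```python
def ops_per_byte(operations: float, bytes_moved: float) -> float:
    """Rough compute-to-data ratio: how much thinking per byte on the wire."""
    return operations / bytes_moved

# A protein-folding-style work unit: hours of math on a few megabytes of input.
# (Hypothetical numbers chosen only to show the shape of the ratio.)
grid_friendly = ops_per_byte(operations=1e12, bytes_moved=5e6)

# A typical batch record: a handful of arithmetic steps per row fetched and written back.
batch_style = ops_per_byte(operations=20, bytes_moved=500)

# The grid-friendly task does millions of times more work per byte shipped --
# which is why spraying batch jobs across idle PCs floods the network
# with database traffic instead of saving you hardware.
assert grid_friendly > 1_000_000 * batch_style
```

So the hardware savings evaporate: the bottleneck for conventional batch was never CPU, it was moving the data.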
So, are there any business uses of the technology? I can only think of one, and it's not a very good example. Companies running "rich internet applications" (RIAs) are exploiting users' machines to handle the human-interface aspects of a web application. When you navigate to an RIA web page, it drops a program into your browser that interacts with you (drag-and-drop, field validation, zooming/scrolling a map image, etc.). It occasionally squirts partial results/requests back to headquarters, but for the most part, it's operating autonomously--using your machine's processor and memory. So I suppose that might be sort of an example of grid computing, though purists would probably disagree.
And with that, I'm out of ideas, even bad ones. Please let me know if you happen to think something up. Thank you.