We all want smaller and faster computers. Of course, this increases the complexity of the computer designers' work. But now, computer scientists from the University of Virginia have come up with a radical new idea that may revolutionize computer design. They've developed Tortola, a virtual interface that enables hardware and software to communicate and solve problems together. This approach can be applied to improve the performance of a specific system, and it can also be used in the areas of security or power consumption. It could soon be commercialized with the help of IBM and Intel.
Kim Hazelwood, an assistant professor of computer science, received a $50,000 award from the Fund for Excellence in Science and Technology (FEST) Distinguished Young Investigator Grant program for her idea of a new layer between hardware and software. This will help her and her team develop the Tortola Project. The diagram above shows how an application would run using this middle layer between hardware and software. (Credit: The Tortola Project)
Now, let's read the explanations given by the University of Virginia.
Computer engineers have long tried to optimize computer systems by making changes within the hardware, but according to Hazelwood, targeting this single layer is limiting. She notes that traditionally, the interface between hardware and software has been fixed. "We are looking forward to what we need to do to fundamentally change this -- to engineer software that can communicate between the two layers," she says. Hazelwood is using the FEST grant to help design Tortola -- a middle layer that can translate and communicate between software and hardware, allowing for cooperative problem solving. "This middle layer would allow software to adapt to the hardware it's running on, something engineers have not been able to do in the past," she says.
And she adds that this virtual layer between software and hardware could have helped Intel over a decade ago with the math 'bug' that affected some of its processors.
Hazelwood cites a famous Intel mishap in which Pentium microprocessors were distributed before a flaw in their floating-point division unit was detected, resulting in a massive recall. A system like Tortola could prevent such expensive glitches in the future. "We could use the software to hide flaws in the hardware, which would allow designers to release products sooner because problems could be fixed later," explains Hazelwood.
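To make the idea concrete, here is a minimal sketch (in Python, purely illustrative -- none of these names come from the Tortola codebase) of how a software layer can hide a hardware flaw: the middle layer screens each operation's operands, sends known-bad cases to a correct software fallback, and lets everything else take the fast hardware path. This mirrors the pattern actually used to work around the Pentium divide bug, where compilers inserted checks and software fallbacks.

```python
def buggy_hw_divide(a, b):
    """Stand-in for a flawed hardware divider: wrong for some operands."""
    if b == 3:                       # pretend divisors of 3 hit the flaw
        return a / b + 0.001         # simulated hardware error
    return a / b

def software_divide(a, b):
    """Correct (but slower) software fallback routine."""
    return a / b

def triggers_flaw(a, b):
    """Predicate identifying operand patterns known to hit the bug."""
    return b == 3

def middle_layer_divide(a, b):
    """The 'virtual layer': screen operands, then pick hardware or fallback."""
    if triggers_flaw(a, b):
        return software_divide(a, b)   # hide the flaw in software
    return buggy_hw_divide(a, b)       # fast path on correct inputs
```

The trade-off is exactly the one Hazelwood describes: the extra check costs a little performance, but a product with a latent flaw can ship and be corrected later in software.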
And here are some details found on the Tortola project home page.
Modern computer systems designers must consider many more factors than just raw performance. Thermal output, power consumption, reliability, testing, and security are quickly becoming first-order concerns. Until recently, the vast majority of research efforts in optimizing computer systems have targeted a single logical "layer" in isolation: application code, operating systems, virtual machines, microarchitecture, or circuits.
However, we feel that we are reaching the limits of the solutions that we can provide by targeting a single layer in isolation. We also feel that there is an important class of computing challenges that is better suited to more holistic approaches. The Tortola project is exploring a symbiotic relationship between a virtual machine and the host microarchitecture to solve future challenges in the areas of power, reliability, security, and performance.
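One way to picture this symbiosis is a feedback loop: the hardware exposes a sensor reading, and the virtual layer adapts how it schedules software in response. The sketch below (again illustrative Python with invented names and thresholds, not Tortola code) shows a middle layer throttling work when a simulated chip reports thermal pressure.

```python
class HardwareModel:
    """Stand-in for the host microarchitecture, exposing a temperature sensor."""
    def __init__(self):
        self.temperature_c = 40.0

    def execute(self, work_units):
        self.temperature_c += 0.5 * work_units   # doing work generates heat

    def cool(self):
        self.temperature_c = max(40.0, self.temperature_c - 2.0)

class VirtualLayer:
    """Middle layer: monitors hardware state and adapts software behavior."""
    THERMAL_LIMIT_C = 70.0                       # invented threshold

    def __init__(self, hw):
        self.hw = hw

    def run(self, work_units):
        # If the hardware reports thermal pressure, insert an idle
        # (cooling) step before dispatching more work.
        if self.hw.temperature_c >= self.THERMAL_LIMIT_C:
            self.hw.cool()
        self.hw.execute(work_units)

hw = HardwareModel()
layer = VirtualLayer(hw)
for _ in range(100):
    layer.run(1)
```

Neither layer solves the problem alone: the hardware only reports its state, and the software only decides how to react -- which is the cooperative problem solving the project describes.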
For more information about the Tortola project, you can read this paper, "Tortola: Addressing Tomorrow's Computing Challenges through Hardware/Software Symbiosis" (PDF format, 2 pages, 78 KB). You also might want to take a look at a presentation given during the Boston Area Architecture 2006 Workshop (PDF format, 16 slides, 4 pages, 69 KB), from which the above illustration has been extracted.
Sources: Melissa Maki, University of Virginia Research News, May 31, 2007; and various websites
You'll find related stories by following the links below.