IBM researchers have developed a wave forecasting system that's so lightweight it can run simulations on a Raspberry Pi.
Normally, calculating wind, tides, and the ocean's varying depths to forecast waves might require a supercomputer, but a system developed by scientists at the University of Notre Dame and IBM Research Ireland this summer could pave the way to using cheaper equipment to achieve comparable wave-forecasting accuracy.
The forecasting system they've developed is an emulator of a conventional physics-based model: Simulating WAves Nearshore (SWAN), a wave-modeling tool developed by Delft University of Technology in the Netherlands.
The researchers used SWAN to generate training data for the deep-learning network, feeding SWAN wave conditions from the NOAA National Data Buoy Center, live ocean current readings, and wind data from IBM-owned The Weather Company.
IBM research scientist Fearghal O'Donncha says the deep-learning model they created can generate forecasts up to 12,000 percent faster than current physics-based forecasting systems.
That higher performance allows the model to create real-time forecasts of wave conditions and run simulations on hardware as small as a Raspberry Pi.
The development could allow scientists to look at a far wider set of physical conditions, geometries and timescales by changing the datasets used to train the deep-learning model, according to O'Donncha.
It may also make wave forecasting easier for marine-dependent organizations, such as shipping firms, aquaculture businesses, and naval and military teams. IBM sees accurate wave forecasting as key to tapping wave energy as a renewable resource.
O'Donncha says another advantage of IBM's approach to building this artificial intelligence (AI) is that it exploits an existing physics model's outputs to generate the training data.
As with any AI project, researchers need a lot of labeled data to train the deep-learning network. By using SWAN or other models, they've got labeled training data on tap.
"We can generate thousands of wave fields using the physics model and then assess each of them against the observed wave data collected at buoys to get a measure of the model's accuracy," O'Donncha told ZDNet.
"Each physics model output serves as an image and the corresponding accuracy measure serves as the 'label' for that image. Use of the physics model gives us the ability to generate as much labeled training data as needed."
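The labeling scheme O'Donncha describes can be sketched in a few lines: treat each simulated wave field as an image, score it against the heights measured at buoy locations, and use that score as the training label. The sketch below is a minimal illustration of that idea, not IBM's implementation; the function names, the RMSE accuracy measure, and the toy data are all assumptions.

```python
import numpy as np


def accuracy_label(model_field, buoy_obs, buoy_idx):
    """Score one simulated wave field against buoy observations.

    model_field: 2-D array of simulated significant wave heights (m).
    buoy_idx:    grid indices where buoys sit; buoy_obs: heights measured there.
    Returns the RMSE between simulated and observed heights -- the 'label'.
    """
    simulated = np.array([model_field[i, j] for i, j in buoy_idx])
    return float(np.sqrt(np.mean((simulated - buoy_obs) ** 2)))


def build_training_set(fields, buoy_obs_series, buoy_idx):
    """Pair each wave field ('image') with its accuracy score ('label')."""
    images = np.stack(fields)
    labels = np.array([accuracy_label(f, obs, buoy_idx)
                       for f, obs in zip(fields, buoy_obs_series)])
    return images, labels


# Toy example: three synthetic 4x4 wave fields and two buoys whose
# observations are offset from the simulation by a constant 0.1 m.
rng = np.random.default_rng(0)
fields = [rng.uniform(0.5, 3.0, size=(4, 4)) for _ in range(3)]
buoy_idx = [(1, 2), (3, 0)]
obs = [np.array([f[1, 2], f[3, 0]]) + 0.1 for f in fields]
X, y = build_training_set(fields, obs, buoy_idx)
print(X.shape)  # (3, 4, 4)
print(y)        # RMSE of 0.1 m for each field
```

Because the physics model can be rerun indefinitely, this loop yields as many (image, label) pairs as the deep-learning network needs.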
However, there are some limitations to the current state of IBM's 'surrogate' for SWAN. The system has only been verified to accurately forecast waves at Monterey Bay, California.
Still, the researchers demonstrated that the approach works for that location, using 12,400 different outputs SWAN generated over four years. The model also factors in seasonal changes that influence wave conditions, such as higher winds during winter.
To expand the model to different locations, the researchers would need to repeat the training using new wind, tide and ocean data. But O'Donncha argues that the effort needed to expand the deep-learning model to new areas would be similar to the work involved in tuning a physics-based model.
"The biggest effort here is the extraction of data to train the model as model training itself is relatively fast," said O'Donncha.
He notes that The Weather Company could supply data from a wide variety of locations, while IBM could build a suite of trained machine-learning models operating over a broad area such as the entire US coastline.
"These models could then be readily provided based on a set of coordinates to enable exceedingly fast forecasts for any region within this location," he added.
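The suite O'Donncha describes amounts to a lookup table keyed by coordinates: given a latitude and longitude, find the pretrained model whose region covers that point. Below is a minimal sketch of such a registry; the region names, bounding boxes, and function are all hypothetical, not part of any IBM system.

```python
# Hypothetical registry: bounding boxes of regions that have a trained
# surrogate model. Boxes are (min, max) latitude and longitude in degrees.
MODEL_REGISTRY = {
    "monterey_bay":      {"lat": (36.5, 37.0), "lon": (-122.1, -121.7)},
    "san_francisco_bay": {"lat": (37.4, 38.0), "lon": (-122.6, -122.0)},
}


def model_for(lat, lon):
    """Return the name of the trained model covering (lat, lon), or None."""
    for name, box in MODEL_REGISTRY.items():
        lat_lo, lat_hi = box["lat"]
        lon_lo, lon_hi = box["lon"]
        if lat_lo <= lat <= lat_hi and lon_lo <= lon <= lon_hi:
            return name
    return None


print(model_for(36.6, -121.9))  # inside the Monterey Bay box
```

Once the right model is selected, inference itself is cheap enough to run on hardware like a Raspberry Pi, per the speedups described above.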
It may also be difficult to forecast waves generated by extreme weather events, such as a typhoon.
"Generation of training data with the effects of a typhoon or other events is enabled by adding those effects to the forcing conditions at the boundary of the physics model," said O'Donncha.
"The challenge is obtaining the corresponding observations of waves caused by extreme events for the labeling. By definition, extreme events are rare."
Previous and related coverage
- GCHQ builds monster Raspberry Pi cloud with OctaPi formation
- IBM supercomputing to power global weather forecasting model
- Hardware spotlight: The Raspberry Pi [Tech Pro Research]