What do you do if a global pandemic means you can't stage one of the world's most famous golf tournaments? For The R&A, organisers of The Open, the answer was to use a combination of data and video to create a virtual tournament of golfing greats from the past 50 years.
While the virtual tournament was no replacement for the excitement created by a real-life Open, it did allow golf fans to enjoy something that might previously have been considered impossible: to watch the greatest golfers from the past 50 years compete against each other in a single championship.
Known as The Open for the Ages, the event took place from 16 to 19 July, when the 149th staging of The Open had been due to take place. It used archive footage to play out a data-led tournament featuring some of golf's greatest players, including Seve Ballesteros, Tiger Woods, Rory McIlroy, Jack Nicklaus and Tom Watson.
The final virtual production was a slick rendition of a real-life Open. Viewers at home got to see 'in-play' clips, leaderboards and statistics over the first three days of the tournament, with the final round broadcast live on Sunday 19 July, when the outright winner was revealed.
The staging of the tournament was all the more remarkable given the technical and time constraints faced by the people putting it together. The project was conceived during lockdown on 24 April and the team involved in the project had to work quickly to create something that would not only look good but feel right.
That's not straightforward when you've got to bring together data and video from across five decades of sporting highlights. And it's even more challenging when you're working in a sport that has historically not made the most of the information it holds, says Steve Otto, CTO at the R&A, which is golf's governing body as well as the organiser of The Open.
"Golf is a very data-rich sport but it's probably not utilising its data fully at the moment," he says. "There's a lot of it being collected, but there's a lot of isolated pockets and really what we'd like to see is more data integration and governance going forward, so it could be more exploited and perhaps used in the future alongside things like AI."
Otto and his senior colleagues at the R&A are keen to find ways to help golf create new ways to inform and entertain its fans. In many ways, The Open for the Ages is a case study in what is possible when a data-rich organisation uses the extreme circumstances of lockdown to create new experiences for its customers.
Proof of its success comes in the fact that more than a million people watched the final round broadcast live on Sunday morning UK time. The biggest audiences around the world came from the UK, the USA and Japan. Malcolm Booth, director of sales and marketing at the R&A, says the key to success was that the virtual tournament felt real.
"We knew there were some big characters of the game that we wanted to feature and we knew we wanted to have a winner," he says. "And in order to have a winner, we needed to have some rigour that supported that – the R&A just couldn't, as the governing body for the sport, on a whim decide who should win this championship."
That's where data modelling came in. While R&A specialists worked on creating a story, the organisation engaged tech firm NTT Data to help build a data model that would support it. The model drew on a wide range of sources, including historical records, video footage, weather information and a fan vote run across social media.
The final production used data from seven previous Open tournaments at St Andrews. The 1970 Open was the first championship where the video was deemed of sufficient quality to be used in the final production.
While data points from 1970 were basically limited to scorecard results, by 2015 – the last Open used in the data model – the team working on the virtual tournament was able to draw on a much richer collection of data points around shot accuracy.
The NTT Data team also used additional sources to enrich its analysis, including published information from other tournaments held across the PGA Tour. Further context came from supervised learning, with NTT Data drawing on expert opinion from across the R&A to help interpret the data and check that the insights being developed added up to an accurate representation of a real-life Open.
These historical and expert data sources were backed up by fan votes. The R&A put out a survey to The One Club, the Open's digital membership platform, as well as polls on Facebook and Instagram. The surveys – which gathered opinions on the greatest Open players without giving too much away – produced more than 10,000 responses.
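How historical results, expert judgment and fan votes might be blended into a single defensible leaderboard can be illustrated with a minimal sketch. Everything below – the source weights, the player ratings and the player selection – is an invented assumption for illustration, not the R&A's or NTT Data's actual model:

```python
# Hypothetical blend of three normalised (0-1) data sources into one rating.
# Weights and scores are invented purely to illustrate the approach.
WEIGHTS = {"historical": 0.5, "expert": 0.3, "fan_vote": 0.2}

players = {
    "Jack Nicklaus":    {"historical": 0.94, "expert": 0.95, "fan_vote": 0.90},
    "Tiger Woods":      {"historical": 0.93, "expert": 0.92, "fan_vote": 0.91},
    "Seve Ballesteros": {"historical": 0.85, "expert": 0.88, "fan_vote": 0.86},
}

def blended_rating(metrics: dict) -> float:
    """Weighted average of the per-source scores for one player."""
    return sum(WEIGHTS[source] * metrics[source] for source in WEIGHTS)

# Rank players by their blended rating, highest first.
leaderboard = sorted(players, key=lambda p: blended_rating(players[p]),
                     reverse=True)
for pos, name in enumerate(leaderboard, start=1):
    print(pos, name, round(blended_rating(players[name]), 3))
```

A weighted average is only one possible way to combine the sources; the point is that a fixed, published weighting gives the outcome the kind of rigour Booth describes, rather than a winner chosen "on a whim".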
About six weeks before delivering the final programme to broadcasters, the R&A brought its video-based story and the results from the data model together to check that the result was a genuine narrative that would stand up to analysis. The result was a closely guarded secret, with only a handful of people knowing the final leaderboard before live broadcast.
The Open for the Ages was eventually won by all-time great Jack Nicklaus, who battled back to beat Tiger Woods on the final hole. For Booth, enthusiastic feedback from fans and critics alike is proof they got the key elements right.
"Being able to bring that to life in some way – and make it feel somewhat credible – was certainly one of the yardsticks we use to measure the success of the project," he says.
Otto believes the virtual staging of The Open holds important lessons for the future of golf back on terra firma. Golf fans – like all sports fanatics – are huge consumers of data. The aim now is to apply the lessons of this virtual tournament to find new ways to keep fans engaged through data and video.
As many as 105 cameras are already used at The Open to gather data, yet NTT Data estimates that only about 5% of shots are actually used in television productions. The tech company is now working with the R&A to use AI to automatically clip and tag this huge volume of footage, so videos can be searched, found, compiled and turned into more personalised content for fans.
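Once clips carry metadata tags – whether applied by AI or by hand – making them searchable is conceptually simple. The sketch below shows one way an inverted index over tags could work; the clip IDs and tags are invented for the example and do not reflect the R&A's actual archive schema:

```python
# Illustrative tag-based clip index: map each tag to the set of clip IDs
# carrying it, then answer queries by intersecting those sets.
from collections import defaultdict

clips = [
    {"id": "1970-r4-18", "tags": ["Jack Nicklaus", "18th hole", "putt"]},
    {"id": "2000-r2-17", "tags": ["Tiger Woods", "17th hole", "bunker"]},
    {"id": "2015-r1-01", "tags": ["Rory McIlroy", "1st hole", "drive"]},
]

def build_index(clips):
    """Build an inverted index from lower-cased tag to clip IDs."""
    index = defaultdict(set)
    for clip in clips:
        for tag in clip["tags"]:
            index[tag.lower()].add(clip["id"])
    return index

def search(index, *tags):
    """Return the clip IDs matching ALL of the given tags."""
    sets = [index.get(t.lower(), set()) for t in tags]
    return set.intersection(*sets) if sets else set()

index = build_index(clips)
print(search(index, "Tiger Woods", "bunker"))  # {'2000-r2-17'}
```

In practice the hard part is the tagging itself – recognising players, holes and shot types in decades of footage – which is where the AI work NTT Data describes comes in; the search layer on top is the easy bit.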
"We have an immense amount of video-archive footage," says Otto. "Making sure that's not just films on shelves, making sure we've got the metadata in place, making sure we can index and search it easily, and make it really valuable in the future, is what's really come through to me during this project."