Google and MIT's AI can fix your phone snaps in real time

Retouching smartphone snaps after taking them could soon be a thing of the past. Google has a new image-processing algorithm that builds on a cloud-based system MIT developed in 2015. MIT's system sends a low-resolution copy of a photo to the cloud for processing and gets back a tailored 'transform recipe', which is then used to edit the high-resolution image stored on the phone.

Google used machine learning to train a neural network to do what MIT's cloud system did, and the resulting algorithm is now efficient enough to move the image processing onto the phone itself. Tested on a Pixel phone, it rendered a 12-megapixel image in 61 milliseconds. Google also said the algorithm delivers a better viewfinder with less impact on battery life.

Computational photography on phones has so far been limited by power and processing constraints, among other factors. Google and MIT have now presented their computational photography work in a joint paper. "This paper may provide us with a way to sidestep these issues," Google said.
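To make the idea concrete, below is a minimal Python sketch of the split the article describes: analyse a cheap low-resolution copy of the photo, derive a compact 'transform recipe', and apply that recipe to the full-resolution frame on the device. The recipe here is a hypothetical per-channel contrast-stretch stand-in, not the learned neural network from the Google/MIT paper; it only illustrates the structure of such a pipeline.

import numpy as np

def downsample(img: np.ndarray, factor: int = 16) -> np.ndarray:
    """Naive box downsample: average factor x factor blocks."""
    h, w, c = img.shape
    h, w = h - h % factor, w - w % factor
    img = img[:h, :w]
    return img.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

def estimate_recipe(low_res: np.ndarray) -> np.ndarray:
    """Derive a simple per-channel affine 'recipe' (gain, bias) from the
    low-res copy. Stand-in for the learned model; the expensive analysis
    only ever sees the small image."""
    recipe = np.zeros((3, 2), dtype=np.float32)  # rows: R, G, B; cols: gain, bias
    for ch in range(3):
        lo, hi = np.percentile(low_res[..., ch], [1, 99])
        gain = 1.0 / max(hi - lo, 1e-6)
        recipe[ch] = (gain, -lo * gain)
    return recipe

def apply_recipe(full_res: np.ndarray, recipe: np.ndarray) -> np.ndarray:
    """Apply the cheap recipe to the full-resolution frame on-device:
    a single per-pixel multiply-add per channel."""
    out = full_res * recipe[:, 0] + recipe[:, 1]
    return np.clip(out, 0.0, 1.0)

if __name__ == "__main__":
    # Fake 12 MP frame (4000 x 3000 RGB), values in [0, 1].
    frame = np.random.rand(3000, 4000, 3).astype(np.float32)
    recipe = estimate_recipe(downsample(frame))   # analyse a tiny copy
    enhanced = apply_recipe(frame, recipe)        # cheap full-res apply
    print(recipe)

Because the heavy analysis touches only the small copy, the full-resolution step reduces to a cheap per-pixel operation, which is the property that lets this style of pipeline run in tens of milliseconds on a phone.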
