Apple acquires AI startup that uses machine learning to make pictures crisper so you can snap the perfect selfie every time
- Bloomberg obtained documents showing Apple recently acquired an AI startup
- Spectral Edge uses machine learning to make smartphone pictures crisper
- The acquisition suggests Apple will use the technology in a future iPhone camera
Apple is working on technology for the perfect selfie.
The tech giant acquired Spectral Edge, a UK-based AI startup that uses machine learning to make smartphone pictures crisper, with more accurate colors.
The system captures and blends an infrared shot with a standard shot to enhance a photograph’s overall depth, detail and color.
The startup's process relies entirely on machine learning and can be implemented in either hardware or software to improve pictures.
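Spectral Edge has not published its algorithm, but the idea of blending an infrared capture with a standard shot can be illustrated with a toy luminance-fusion sketch. The function name, the blend weight `alpha`, and the Rec. 601 luminance weights below are illustrative assumptions, not the company's actual method:

```python
import numpy as np

def fuse_rgb_ir(rgb, ir, alpha=0.3):
    """Toy fusion: mix infrared detail into an RGB image's luminance.

    rgb: float array (H, W, 3) in [0, 1]; ir: float array (H, W) in [0, 1].
    alpha sets how much IR detail is blended in. This is a simple
    illustrative sketch, not Spectral Edge's proprietary technique.
    """
    # Approximate luminance of the visible-light shot (Rec. 601 weights).
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Blend infrared detail into the luminance channel.
    fused_luma = (1 - alpha) * luma + alpha * ir
    # Rescale all three colour channels to carry the new luminance,
    # preserving the original hue and saturation ratios.
    scale = fused_luma / np.clip(luma, 1e-6, None)
    return np.clip(rgb * scale[..., None], 0.0, 1.0)

# Example: a flat grey image picks up detail from a synthetic IR gradient.
rgb = np.full((4, 4, 3), 0.5)
ir = np.linspace(0.0, 1.0, 16).reshape(4, 4)
out = fuse_rgb_ir(rgb, ir)
```

Because the chromaticity ratios are preserved, the infrared shot sharpens detail without shifting colours, which is the broad effect the startup describes.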
The news was first revealed by Bloomberg, which obtained documents showing that Apple now controls Spectral Edge.
Speaking with TechCrunch last year, Spectral CEO Rhodri Thomas said: ‘Right now there is no real solution for white balancing across the whole image [on smartphones] — so you’ll get areas of the image with excessive blues or yellows, perhaps, because the balance is out — but our tech allows this to be solved elegantly and with great results.’
‘We also can support bokeh processing by eliminating artifacts that are common in these images.’
Developing the best smartphone camera seems to be the goal for all handset makers.
Apple just revamped the camera in its iPhone 11 Pro with a triple-lens system, and rumors have suggested that next year’s smartphone will include a 3-D camera for improved depth sensing and augmented reality.
‘Spectral Edge’s technology could contribute to the AI Apple already uses in its Camera app by continuing to improve the quality of photos in low-light environments,’ wrote Bloomberg’s Mark Gurman.
‘The startup has said its technology can be applied via software or chips. Apple’s latest devices include custom processors that assist with picture taking.’
The iPhone 11 Pro, however, is the tech giant’s best camera yet.
All three of the lenses are housed in a fairly large module on the back of the phone and can be selected in the Pro’s interface using the phone’s new camera software.
While all three camera units boast 12 megapixels, one of the main differences between the wide/ultra-wide and the telephoto is focal length.
With a 56 mm focal length, the telephoto lens is capable of capturing 40 percent more light than the company’s previous high-end model, the iPhone XS.
That extra light enables a long-awaited ‘Night Mode’, designed specifically for taking pictures in low natural or artificial light.
As for video, each of the cameras in the iPhone 11 Pro is capable of shooting in 4K resolution and is aided by Apple’s ‘cinematic video stabilization’ which helps to reduce camera-shake.
The front-facing camera is also 12 megapixels and is capable of shooting 4K video at 60 frames per second, as well as slow-motion video.
Phil Schiller, Apple’s senior vice president of Worldwide Marketing, said: ‘iPhone 11 Pro has the first triple-camera system in iPhone and is far and away the best camera we’ve ever made; it provides our customers with a great range of creative control and advanced photo and video editing features in iOS 13.’