Apple recently acquired a sensor company that may dramatically improve the cameras in future iPhone models.
The company confirmed Thursday it has acquired InVisage Technologies, a startup that builds “nanoparticle” sensors, though a spokesperson declined to “discuss our purpose or plans.”
InVisage’s flagship product, QuantumFilm, was designed to improve smartphone image and video quality in low-light conditions. It uses what it calls a “photosensitive” layer made of quantum dots, or nanoparticles, which are dispersed to form a grid that absorbs light. InVisage claims its QuantumFilm sensor can absorb the same amount of light as a traditional silicon sensor in a layer that’s 10 times thinner.
“The higher positioning of the photosensitive layer allows the QuantumFilm pixel to detect more photons, store more electrons (and therefore more photographic information), and reproduce colors more accurately—all with a thinner camera module,” the company’s site reads.
The startup released a short film, PRIX, shot entirely on a QuantumFilm sensor.
Apple currently uses back-illuminated sensors made by Sony, but it has slowly weaned itself off third-party suppliers—many of which are its rivals—and started producing its own components. In the last few years, Apple has built its own GPU and Bluetooth chips for its smartphones, laptops, and headphones. However, even its flagship iPhone X still relies on display technology from Samsung and several smaller internals made by companies like Qualcomm and Bosch.
It’s not clear how Apple plans to use the startup’s technology, but don’t be surprised if the mobile giant releases an in-house smartphone camera sensor in the next few years.
H/T Business Insider