Computational photography digitally manipulates the images a camera captures. The best smartphone cameras lean on it to produce good-looking photos, including shots that would be impossible with the optics alone. Companies like Samsung, Apple, and Google have pushed the field forward, relying on AI, machine learning, and sensor technology to improve image quality and processing.




One company that has done amazing work is OnePlus. OnePlus has shown its prowess with both hardware and software, leading in technical improvements like charging efficiency and battery life. Moreover, the OnePlus Open remains our favorite tablet-style foldable of 2024. One feature we want to highlight is the company’s approach to computational photography.



OnePlus’ approach to computational photography

The multi-million-dollar partnership is noticeable

OnePlus used to have horrible cameras. People noticed improvements when OnePlus teamed up with Hasselblad for the launch of the OnePlus 9 series. The partnership began in 2020 and rested on two key considerations. The first was Hasselblad’s history and expertise in photographic aesthetics, notably color science.



Hasselblad’s Natural Color Solution portfolio.

The second involved applying that expertise to smartphone camera technology, including sensors, lenses, and computational photography. As a result of this research, the OnePlus 9 series introduced Natural Color Calibration, a targeted calibration solution based on each sensor’s performance, the lens optics, and more.

The OnePlus 9 partnership with Hasselblad for Natural Color Calibration

Source: OnePlus
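
To give a concrete sense of what a targeted calibration like this can involve, here is a minimal, generic sketch of applying a per-sensor color-correction matrix in Python. It is illustrative only, not OnePlus’ or Hasselblad’s actual method, and the matrix values are invented for demonstration.

```python
# Generic illustration of one building block of color calibration: applying a
# per-sensor 3x3 color-correction matrix that maps raw sensor RGB toward a
# target color space. The matrix values below are made up for demonstration.
import numpy as np

# Hypothetical matrix that a calibration process might produce for one
# sensor-and-lens combination (rows sum to 1 so white stays white).
CCM = np.array([
    [ 1.60, -0.45, -0.15],
    [-0.30,  1.55, -0.25],
    [-0.05, -0.60,  1.65],
])

def apply_ccm(raw_rgb: np.ndarray, ccm: np.ndarray = CCM) -> np.ndarray:
    """raw_rgb: HxWx3 float image in [0, 1]; returns the color-corrected image."""
    corrected = raw_rgb @ ccm.T          # per-pixel matrix multiply
    return np.clip(corrected, 0.0, 1.0)
```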



OnePlus spent $150 million over three years on that partnership, demonstrating how serious it was about making its cameras on par with those of Google, Samsung, and Apple. Fast-forward four years, and the company still makes delectable phones. The OnePlus 12 and OnePlus Open have been among our favorites. The OnePlus 13 also brings major improvements to its cameras, which could be enough to make it the best at mobile photography until Samsung shows its hand.

The OnePlus 13’s camera system was co-developed with imaging brand Hasselblad and includes a triple-camera array. The sample below shows the vivid colors it produces while perfectly capturing the subject.

OnePlus 13 camera sample showing a dog mid-air catching a frisbee

Source: OnePlus

The healthy partnership between OnePlus and Hasselblad is still in effect four years later. We hope the OnePlus Open 2 uses similar tech to bring out the colors and contrast in photos.




The OnePlus 13’s macro photography could be big

Get more detailed close-ups using OnePlus’ macro close-up mode

The OnePlus 13 has great camera specs: a 50-megapixel f/1.6 main sensor, a 50-megapixel 3x telephoto lens, a 50-megapixel f/2.0 ultra-wide sensor, and a 32-megapixel front-facing camera, all Hasselblad-tuned. It’s equipped to capture great-looking photos.

Another feature built into the OnePlus 13 is a macro photography mode, called macro close-up, which captures optimized close-up shots. The feature was missing at launch in China but arrived later in an update. The idea is to retain detail while photographing a subject at 1:1 magnification. Imagine making those shots even better by calibrating them with the Hasselblad Master Imaging algorithm developed through the partnership. You can get some incredible enhancements.

A look at macro photography on the OnePlus 13, showing the macro close-up icon zoomed in

Source: Notebookcheck

Macro photography is extreme close-up photography that focuses on framing a subject using your camera’s macro lens. The macro lens delivers sharpness and optimal resolution. It isn’t a form of computational photography, but it can use computational photography to recover details you’d otherwise miss.
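
As a rough illustration of how computational sharpening can bring back apparent detail in a macro crop, here is a short Python sketch using a standard unsharp mask. It is a generic technique, not OnePlus’ macro close-up pipeline, and the file name is a placeholder.

```python
# Toy example: crop the central region of a photo (a stand-in for 1:1 macro
# framing), then sharpen it with an unsharp mask to emphasize fine detail.
# "macro_shot.jpg" is a placeholder file name, not a real asset.
from PIL import Image, ImageFilter

image = Image.open("macro_shot.jpg")

# Crop the middle half of the frame in each dimension.
w, h = image.size
crop = image.crop((w // 4, h // 4, 3 * w // 4, 3 * h // 4))

# Unsharp mask: boosts local contrast around edges to recover perceived detail.
sharpened = crop.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
sharpened.save("macro_shot_sharpened.jpg")
```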



What others have done in computational photography

Companies had to innovate to keep up

Marc Levoy led the computational photography team at Google Research to improve the Pixel camera. The result was a breakthrough when the Google Pixel and Pixel XL launched in 2016, using software smarts to carry the phones’ meager hardware. Levoy’s team worked on different projects, but the most notable was the HDR+ mode first deployed on Google’s older Nexus devices. Apple capitalized on mobile HDR (high dynamic range) photography when the iPhone 4 launched, improving its imagery in various lighting conditions. These companies were light years ahead.
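
The core idea behind HDR+-style processing is burst merging: capture several short, underexposed frames, align them, average them to suppress noise, and tone-map the result to lift the shadows. Here is a toy Python sketch of that idea, not Google’s actual pipeline, with synthetic frames standing in for a real burst.

```python
# Toy illustration of HDR+-style burst merging (not Google's pipeline):
# average several aligned, underexposed frames to cut noise, then apply a
# simple gamma tone map to brighten the shadows.
import numpy as np

def merge_burst(frames: list[np.ndarray], gamma: float = 2.2) -> np.ndarray:
    """frames: aligned exposures as float arrays in [0, 1], all the same shape."""
    stack = np.stack(frames, axis=0)
    merged = stack.mean(axis=0)                      # averaging N frames reduces noise by ~sqrt(N)
    return np.clip(merged, 0.0, 1.0) ** (1.0 / gamma)  # gamma curve lifts the shadows

# Example with synthetic noisy frames standing in for a real burst.
rng = np.random.default_rng(0)
scene = np.linspace(0.05, 0.4, 256).reshape(1, -1).repeat(64, axis=0)  # dark gradient "scene"
burst = [np.clip(scene + rng.normal(0, 0.05, scene.shape), 0, 1) for _ in range(8)]
result = merge_burst(burst)   # visibly cleaner and brighter than any single frame
```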

A night sky with stars, captured with Pixel computational photography



Samsung gave an example of super-telephoto shots recreating the face of the moon, where the company relied on computational photography in its high-end phones. Samsung couldn’t have produced this shot without computational photography techniques since it’s impossible to optically resolve details on an object hundreds of thousands of miles away. Samsung has decent cameras, but the experience can be a mixed bag, even on the powerful Samsung Galaxy S24 Ultra. Anything in motion can blur the image.


OnePlus continues to innovate

Using your handy mobile to snap good-looking photos is no longer a pipe dream, thanks to companies like OnePlus, Google, and Apple, which are always innovating. OnePlus’ partnership with Hasselblad continues to thrive, making the OnePlus 13 and OnePlus Open 2 more exciting than ever. Instead of reusing similar tech and making it slightly better (ahem, Samsung), OnePlus keeps looking for ways to make photos sharper, brighter, and more colorful.