Table of contents
1. Introduction
2. Lighting cues
   2.1. Shadows
   2.2. Ambient light
   2.3. Shading
   2.4. Specular highlights
   2.5. Reflections
3. Environmental HDR mode
4. Ambient Intensity mode
   4.1. Pixel intensity
   4.2. Color
5. Frequently Asked Questions
   5.1. What are feature points in AR?
   5.2. What are the elements that enable spatial mapping, environmental understanding, and light estimation for an AR app?
   5.3. What is scaling in AR?
   5.4. What is the difference between ARKit and ARCore?
6. Conclusion
Last Updated: Mar 27, 2024

Light Estimation in AR


Introduction 

A key part of creating realistic AR experiences is getting the lighting right. When a virtual object is missing a shadow or has a shiny material that doesn't reflect the surrounding space, users can sense that the object doesn't quite fit, even if they can't explain why. This is because humans subconsciously perceive cues about how objects are lit in their environment. The Lighting Estimation API analyzes a given image for such cues, providing detailed information about the lighting in a scene. You can then use this information when rendering virtual objects to light them under the same conditions as the scene they're placed in, keeping users grounded and engaged.

Lighting cues

The Lighting Estimation API provides detailed data that lets you mimic various lighting cues when rendering virtual objects. These cues are shadows, ambient light, shading, specular highlights, and reflections.

Shadows

Shadows are often directional and tell viewers where light sources are coming from.

Ambient light

Ambient light is the overall diffuse light that comes in from around the environment, making everything visible.

Shading

Shading is the variation in the intensity of light across a surface. For example, different parts of the same object can show different levels of shading in the same scene, depending on the angle relative to the viewer and the proximity to a light source.
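
The dependence on angle and distance can be made concrete with Lambert's cosine law, the standard model for diffuse shading. The following Python sketch is purely illustrative (the function name is ours, not part of any AR SDK):

```python
import math

def lambert_intensity(light_dir, normal, distance, light_power=1.0):
    """Diffuse shading: brightness scales with the cosine of the angle
    between the surface normal and the direction to the light
    (Lambert's cosine law), and falls off with the square of the distance."""
    # Dot product of unit vectors gives the cosine of the angle between them.
    cos_angle = max(sum(l * n for l, n in zip(light_dir, normal)), 0.0)
    return light_power * cos_angle / (distance ** 2)

# The same object shows different shading at different angles and distances.
facing = lambert_intensity((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), 1.0)
tilted = lambert_intensity((0.0, 0.0, 1.0),
                           (math.sin(math.radians(60)), 0.0,
                            math.cos(math.radians(60))), 1.0)
farther = lambert_intensity((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), 2.0)
```

A patch facing the light head-on receives full intensity, a patch tilted 60 degrees receives half, and doubling the distance quarters it.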

Specular highlights

Specular highlights are the shiny bits of surfaces that reflect a light source directly. Highlights on an object change relative to the position of a viewer in a scene.
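
The viewer-dependence of highlights is captured by reflection models such as Blinn-Phong. A minimal, illustrative Python sketch (again not part of any AR SDK):

```python
import math

def blinn_phong_specular(light_dir, view_dir, normal, shininess=32):
    """Specular highlight strength under the Blinn-Phong model: the highlight
    peaks where the surface normal aligns with the half-vector between the
    light and the viewer, so it shifts as the viewer moves."""
    # Half-vector: normalized sum of the light and view directions.
    half = [l + v for l, v in zip(light_dir, view_dir)]
    length = math.sqrt(sum(h * h for h in half))
    half = [h / length for h in half]
    n_dot_h = max(sum(n * h for n, h in zip(normal, half)), 0.0)
    # Raising to a shininess power tightens the highlight.
    return n_dot_h ** shininess

# Moving the viewer away from the mirror direction dims the highlight.
aligned = blinn_phong_specular((0.0, 0.0, 1.0), (0.0, 0.0, 1.0),
                               (0.0, 0.0, 1.0))
off_axis = blinn_phong_specular((0.0, 0.0, 1.0), (0.7071, 0.0, 0.7071),
                                (0.0, 0.0, 1.0))
```

With light, viewer, and normal all aligned the highlight is at full strength; move the viewer 45 degrees off axis and it falls off sharply.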

Reflections

Light bounces off surfaces differently depending on whether the surface has specular (highly reflective) or diffuse (not reflective) properties. For example, a metallic ball will be highly specular and reflect its environment, while another ball painted a dull matte grey will be diffuse. Most real-world objects have a combination of these properties — think of a scuffed-up bowling ball or a well-used credit card.

Reflective surfaces also pick up colours from the ambient environment. The colouring of an object can be directly affected by the colouring of its environment. For example, a white ball in a blue room will take on a bluish hue.

Environmental HDR mode

ARCore offers two lighting estimation modes, Environmental HDR and Ambient Intensity, each consisting of separate APIs that allow granular and realistic lighting estimation for directional lighting, shadows, specular highlights, and reflections.

Environmental HDR mode uses machine learning to analyze the camera images in real time and synthesize environmental lighting to support realistic rendering of virtual objects.

This lighting estimation mode provides:

  1. Main directional light. Represents the main light source. Can be used to cast shadows.
  2. Ambient spherical harmonics. Represents the remaining ambient light energy in the scene.
  3. An HDR cubemap. Can be used to render reflections in shiny metallic objects.

You can use these APIs in different combinations, but they're designed to be used together for the most realistic effect.
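
To see what "ambient spherical harmonics" means in practice: Environmental HDR mode reports nine spherical harmonic coefficients per colour channel, and a renderer evaluates them in the direction of each surface normal. Below is an illustrative single-channel evaluation using the standard order-2 real SH basis; the coefficient values in the example are made up:

```python
def sh_ambient(coeffs, normal):
    """Evaluate order-2 real spherical harmonics (9 coefficients for one
    colour channel) in the direction of a unit surface normal."""
    x, y, z = normal
    # Standard real spherical-harmonic basis constants.
    basis = [
        0.282095,                    # l=0, m=0  (constant term)
        0.488603 * y,                # l=1, m=-1
        0.488603 * z,                # l=1, m=0
        0.488603 * x,                # l=1, m=1
        1.092548 * x * y,            # l=2, m=-2
        1.092548 * y * z,            # l=2, m=-1
        0.315392 * (3 * z * z - 1),  # l=2, m=0
        1.092548 * x * z,            # l=2, m=1
        0.546274 * (x * x - y * y),  # l=2, m=2
    ]
    return sum(c * b for c, b in zip(coeffs, basis))

# With only the constant term set, ambient light is the same in every
# direction; the higher-order terms add directional variation.
up = sh_ambient([1.0] + [0.0] * 8, (0.0, 0.0, 1.0))
side = sh_ambient([1.0] + [0.0] * 8, (1.0, 0.0, 0.0))
```

Because the higher-order coefficients are zero in this example, a surface facing up and a surface facing sideways receive the same ambient light.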

Ambient Intensity mode

Ambient Intensity mode determines the average pixel intensity and the color correction scalars for a given image. It's a coarse setting designed for use cases where precise lighting is not critical, such as objects that have baked-in lighting.

Pixel intensity

Captures the average pixel intensity of the lighting in a scene. You can apply this lighting to a whole virtual object.

Color

Detects the white balance for each individual frame. You can then color correct a virtual object so that it integrates more smoothly into the overall coloring of the scene.
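
Applying both Ambient Intensity signals amounts to a few per-channel multiplications. An illustrative Python sketch (in ARCore itself these values come from the `LightEstimate` object, via `getPixelIntensity()` and `getColorCorrection()`; the helper names here are ours):

```python
def average_intensity(gray_pixels):
    """Approximate the scene's ambient intensity as the mean grayscale
    value of the camera image (values in [0, 1])."""
    return sum(gray_pixels) / len(gray_pixels)

def apply_ambient(albedo_rgb, intensity, correction_rgb=(1.0, 1.0, 1.0)):
    """Scale the object's base colour by the scene intensity, then by the
    per-channel colour-correction scalars for white balance."""
    return tuple(a * intensity * c
                 for a, c in zip(albedo_rgb, correction_rgb))

# A dim, bluish frame makes a white virtual object darker and bluer.
intensity = average_intensity([0.2, 0.4, 0.6])
lit = apply_ambient((1.0, 1.0, 1.0), intensity, (0.9, 0.95, 1.1))
```

Because the whole object is scaled uniformly, this mode suits assets with baked-in lighting, as noted above.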

Frequently Asked Questions

What are feature points in AR?

During a world-tracking AR session, ARKit builds a coarse point cloud representing its rough understanding of the 3D world around the user (see rawFeaturePoints ). Individual feature points represent parts of the camera image likely to be part of a real-world surface, but not necessarily a planar surface.

What are the elements that enable spatial mapping, environmental understanding, and light estimation for an AR app?

ARCore has three elements: motion tracking, which figures out a phone's location from its internal sensors and video, letting you pin objects in place and walk around them; environmental understanding, which uses the phone's camera to detect flat surfaces; and light estimation, which samples the scene's lighting so that virtual objects can be lit and coloured to match their surroundings.

What is scaling in AR?

Multiplying positional data by a scale factor transforms from device space into content space. Scaling the content, by contrast, transforms content space into device space.
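
The two directions described above can be sketched as a pair of toy helper functions (illustrative names, not an SDK API):

```python
def device_to_content(position, scale):
    """Multiply device-space coordinates by the scale factor to get
    content-space coordinates."""
    return tuple(p * scale for p in position)

def content_to_device(position, scale):
    """Going the other way divides by the same factor."""
    return tuple(p / scale for p in position)

# A 0.5 m step in device space is a 5 m step for content scaled 10x,
# and the round trip recovers the original position.
step = device_to_content((0.5, 0.0, 0.0), 10.0)
back = content_to_device(step, 10.0)
```

The round trip is lossless because the two transforms are exact inverses of each other.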

What is the difference between ARKit and ARCore?

Put simply, ARCore is Google's answer to Apple's ARKit. It's a development platform for creating augmented reality applications that was released in early 2018. The SDK runs on Google Play Services for AR, and you'll have to agree to its terms and conditions before you download the tool for development.

Conclusion

In a nutshell, there is a lot that can be done with light estimation in Google's ARCore library, and many settings can be tuned to get the most realistic results. If you are passionate about AR and VR, the best way to learn is to experiment with these APIs and build something from scratch.


Happy Learning!
