Introduction
In this blog, we will give a brief introduction to ARCore, but first we will look at the history of augmented reality and at how ARCore works.
ARCore is Google's platform for building augmented reality experiences. Through its APIs, ARCore enables your smartphone to sense its environment, understand the world around it, and interact with information. Some of these APIs are available on both Android and iOS to enable shared AR experiences.
Background of AR
1862
Henry Dircks and John Henry Pepper created an illusion (projection) technique, later known as Pepper's ghost, used in the theatre.
The main trick is that two rooms are built on the stage: one the audience can see into, and a hidden one known as the blue room. A large sheet of glass is placed in the main room at a 45-degree angle so that it reflects the view of the blue room toward the audience, projecting 'floating' ghostly objects into the space of the visible room.
1968
Sproull and Sutherland developed the 'Sword of Damocles', the first augmented reality head-mounted display (HMD) and tracking system.
Their system used a head-mounted display and tracked the user's head movements, so the rendered view updated as the user looked around.
1995
Bajura and Neumann demonstrated fiducial tracking, using markers placed in the scene to register virtual content against the real world.
1997
Steven Feiner, Blair MacIntyre, Tobias Hoellerer, and Anthony Webster invented the Touring Machine, a mobile 3D AR system for exploring the urban environment. It used a head-mounted display and a backpack carrying a laptop.
2014
Google announced Project Tango, a platform designed to bring spatial awareness to standalone mobile devices using dedicated sensors.
2015
Qualcomm released Vuforia, an augmented reality software development kit (SDK) for mobile devices.
2017
Google released ARCore, a platform to build Augmented Reality experiences for Android devices.
Working
Unlike Project Tango, ARCore does not require any special sensors. It relies only on the device's camera, the phone's motion sensors such as the accelerometer and gyroscope, and Google's software techniques. Together, these let ARCore see and interpret what the camera is capturing and offer an augmented reality experience based on that information.
ARCore uses SLAM (simultaneous localization and mapping) to work out precisely where the smartphone is relative to the world around it. It computes changes in position by detecting visually distinct features in the camera image, called feature points, and tracking how they move between frames. It also uses these points to detect planes, i.e. horizontal or vertical surfaces, for extra context.
The camera's pose (position and orientation) relative to the surroundings is tracked over time by pairing this visual information with inertial measurements from the device's IMU. Using this pose and context, developers can render objects over the phone's camera feed and make them appear to be part of the real world. ARCore can also infer how much light is hitting a surface and, using that context, render an object brighter or darker to match.
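The feature-point idea above can be shown with a toy sketch (this is not the ARCore API, and real SLAM estimates a full 6-DoF pose): if matched feature points all shift together between two frames, that common shift tells us how the camera moved.

```python
import numpy as np

def estimate_translation(prev_pts, curr_pts):
    """Toy motion estimate: mean displacement of matched feature points
    between two frames. Points that shift together imply camera motion."""
    return np.mean(np.asarray(curr_pts, dtype=float)
                   - np.asarray(prev_pts, dtype=float), axis=0)

# Three feature points, each shifted by (+5, -2) pixels between frames.
prev = [(100, 200), (150, 220), (300, 400)]
curr = [(105, 198), (155, 218), (305, 398)]
print(estimate_translation(prev, curr))  # mean shift: (5, -2)
```

In a real system this per-frame visual estimate would be fused with IMU readings to keep tracking stable when the image alone is ambiguous.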
Compatibility of ARCore with phones
Motion Tracking
To track the device's motion, ARCore relies on sensors that are standard in mobile phones, such as:
- Gyroscope: measures angular velocity and orientation.
- Accelerometer: measures acceleration (the rate of change of velocity over time).
- GPS (Global Positioning System): provides geolocation and time information to the GPS receiver.
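As a simple illustration of how a gyroscope contributes to motion tracking (a toy sketch, not how ARCore or Android's sensor stack actually works), angular-velocity samples can be integrated over time to estimate how far the device has rotated:

```python
import numpy as np

def integrate_gyro(angular_velocities, dt):
    """Toy orientation tracking about one axis: sum angular-velocity
    samples (rad/s), each covering a time step dt (s), to get an angle."""
    return float(np.sum(angular_velocities) * dt)

# 100 samples at 100 Hz of a constant 0.5 rad/s rotation -> 0.5 rad total.
samples = [0.5] * 100
print(integrate_gyro(samples, dt=0.01))  # 0.5
```

Real trackers combine this with the accelerometer, because pure integration accumulates drift; that is exactly why ARCore fuses inertial data with camera-based feature tracking.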
Environmental understanding
ARCore blends images captured from the real world into the AR scene, building a model of the user's environment and adding information as the phone moves around. The camera keeps collecting new details about the environment, which ARCore recognizes and stores. It then clusters feature points that appear to lie on the same horizontal, vertical, or angled surface, and makes each such surface available to your app as a plane.
Light Estimation
Light estimation is a technique for replicating the lighting conditions of the real world and applying them to three-dimensional virtual objects. ARCore currently uses an image-analysis algorithm to determine light intensity from the device's current camera image. That intensity is then applied as a global light to the 3D objects in the scene.
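A minimal sketch of this idea (an assumption-laden toy, not ARCore's implementation): take the average brightness of the camera image as a global intensity, then scale a virtual object's colour by it so the object looks dimmer in a dark room.

```python
def estimate_light_intensity(gray_pixels):
    """Toy light estimate: mean brightness of grayscale pixel values
    (0-255), normalised to the range [0, 1]."""
    return sum(gray_pixels) / (len(gray_pixels) * 255.0)

def apply_global_light(base_color, intensity):
    """Scale an RGB colour by the estimated global light intensity."""
    return tuple(round(c * intensity) for c in base_color)

dim_room = [64] * 16  # a uniformly dim camera image (brightness 64/255)
intensity = estimate_light_intensity(dim_room)
print(apply_global_light((200, 100, 50), intensity))
```

Printed here, the orange (200, 100, 50) comes out roughly a quarter as bright, matching the dim image.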
Advantages
- It knows the environment thoroughly because it maps the space around the user.
- It knows the user's relative position in the environment and tracks their movements.
- It can sense the surrounding light and apply it to AR objects, making the experience more immersive.
- Using its lighting and environmental-understanding capabilities, we can build realistic surroundings for a 3D object.
Disadvantages
- Object scanning could certainly be improved; it can sometimes take a long time, which is unfriendly to the user.
- It is available only on devices running the Android operating system.
- Scanning accuracy can be improved.
- More features, such as frame tracking or body tracking, still need to be introduced.
Overview
ARCore is an emerging technology that is bound to drive innovation and open up an entirely new array of apps, games, and consumer experiences, as well as new lines of business. It is one of the technologies with the highest growth potential over the next ten years. In time, we will enter a new era of fully virtualized, functioning models across many different areas.