Realistic 3D rendering is one of the features that distinguishes a high-quality AR-powered app from a subpar one. It takes only a blink of an eye to notice the imperfections of a virtual object and realize it looks nothing like the real thing. Creating truly realistic 3D rendering on mobile devices takes lots of computation, careful mathematical modeling, and a good deal of patience.
So how do you make AR rendering as close to real life as possible and make sure the result won't take ages to load on the user's phone?
Let’s start with the basics and figure out what AR or 3D rendering means.
3D rendering is the process of converting a computer-generated 3D model into a 2D image. The result can be either non-photorealistic or photorealistic. While the former doesn't require going into the details of a 3D model and is mostly used in the gaming industry, the latter captures all the little details and looks like a real-life photo.
Photorealistic rendering is a hot item in AR apps used by jewelry brands, as it's important to show the play of light on gemstones and metals on screen as close to real life as possible.
Making virtual objects, be it a diamond ring or a necklace with gemstones, blend into the real scene is one of the biggest challenges of augmented reality.
It’s not enough to construct a detailed geometric 3D model and get accurate surface properties for a virtual object. Perfect rendering requires accurate light estimation. Developers should take into account lighting conditions of the real space where the virtual object will be placed, as well as surrounding objects.
The trickiest thing here is that the light in any indoor environment disperses, reflects and takes on the colors of the objects around. What makes the situation even more complicated is that several sources of light can be present at the same time.
To make a virtual object look realistic, the shadows cast by light sources and surrounding objects should fall on the correct sides, and reflections should be visible if the object has a glossy texture.
This can be achieved through precise mathematical modeling and reflection and refraction calculations for all types of surfaces.
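As a tiny illustration of the kind of math involved (a sketch, not production shader code), Schlick's approximation estimates how much light a surface reflects rather than refracts at a given viewing angle. The refractive indices below are standard values for air and glass:

```python
def schlick_fresnel(cos_theta: float, n1: float = 1.0, n2: float = 1.5) -> float:
    """Schlick's approximation of Fresnel reflectance.

    cos_theta: cosine of the angle between the view ray and the surface normal.
    n1, n2:    refractive indices of the two media (air ~1.0, glass ~1.5).
    Returns the fraction of light reflected; the rest is refracted.
    """
    r0 = ((n1 - n2) / (n1 + n2)) ** 2  # reflectance at normal incidence
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

print(schlick_fresnel(1.0))  # head-on view: ≈ 0.04 for glass
print(schlick_fresnel(0.1))  # grazing angle: almost mirror-like
```

For a diamond (refractive index around 2.42), the same formula yields a much higher base reflectance, which is part of why gemstones are so demanding to render convincingly.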
Conventional rendering techniques, including ray tracing, can work but require lots of preparation and the intensive calculations needed to solve the light transport equation. Furthermore, they don't cover the entire spectrum of light estimation in the space where a virtual object is placed.
An image-based approach to rendering offers an efficient solution that allows developers to achieve close-to-real-life rendering in AR apps.
If you are ready to upgrade your AR showroom or AR-powered app with the image-based approach, it’s best to consider environment mapping.
Environment mapping is an image-based lighting technique that approximates the appearance of reflective surfaces using an image of the surrounding environment.
Environment maps can be captured and stored in several formats: sphere mapping, dual paraboloid mapping, and cube mapping. To keep things short, we'll go straight to the latter, the one best suited for AR apps.
Here is how cube mapping works: the device captures the space in six directions, forming a cube around the physical object, and detects the light sources in each image. This allows reflections to appear on the virtual object right where they should be and makes the object look more realistic overall.
How reflections from the cube map appear depends on the object's texture: glossy surfaces reflect everything like a mirror, whereas matte ones merely take on the color of the surrounding objects.
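Under the hood, sampling a cube map boils down to picking which of the six faces a direction vector points at and converting the direction to 2D texture coordinates on that face. Here is a minimal sketch; face ordering and sign conventions vary between graphics APIs, so treat this as one common convention rather than a canonical one:

```python
def cubemap_lookup(x: float, y: float, z: float):
    """Map a 3D direction to a cube-map face index and (u, v) coordinates.

    Faces are indexed 0..5 as +X, -X, +Y, -Y, +Z, -Z.
    """
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:            # X-axis dominates
        face = 0 if x > 0 else 1
        u, v, m = (-z if x > 0 else z), -y, ax
    elif ay >= az:                        # Y-axis dominates
        face = 2 if y > 0 else 3
        u, v, m = x, (z if y > 0 else -z), ay
    else:                                 # Z-axis dominates
        face = 4 if z > 0 else 5
        u, v, m = (x if z > 0 else -x), -y, az
    # Remap from [-1, 1] to [0, 1] texture space.
    return face, 0.5 * (u / m + 1.0), 0.5 * (v / m + 1.0)

print(cubemap_lookup(1.0, 0.0, 0.0))   # (0, 0.5, 0.5): center of the +X face
print(cubemap_lookup(0.0, 0.0, -1.0))  # (5, 0.5, 0.5): center of the -Z face
```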
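A toy way to picture this layering: blend the sharp, mirror-like cube-map sample toward the averaged environment color as the material gets rougher. This is only an illustrative sketch, not a physically based shading model:

```python
def lerp(a, b, t):
    """Linear interpolation between two RGB tuples."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def environment_shade(mirror_rgb, ambient_rgb, roughness):
    """roughness 0.0 = perfect mirror (sharp cube-map sample);
    roughness 1.0 = fully matte (averaged environment color)."""
    return lerp(mirror_rgb, ambient_rgb, roughness)

red_highlight = (1.0, 0.0, 0.0)   # sharp reflection sampled from the cube map
room_average = (0.5, 0.5, 0.5)    # averaged color of the whole environment
print(environment_shade(red_highlight, room_average, 0.0))  # glossy: keeps the reflection
print(environment_shade(red_highlight, room_average, 1.0))  # matte: room color only
```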
One of the benefits of this method is that it’s relatively cheap in terms of computational cost.
Besides getting environment mapping under the belt, it's also important to take into account such details of the space as ambient light, shadows, shading, specular highlights, and so on. All of these require separate mathematical calculations that the latest frameworks, like ARCore and ARKit, know how to deal with.
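As a rough illustration of one such detail, ambient light intensity can be crudely estimated as the average luminance of a camera frame. Real frameworks like ARCore use far more sophisticated HDR estimation; this is only a conceptual sketch:

```python
def ambient_intensity(pixels):
    """Crude ambient-light estimate: mean relative luminance of a frame.

    pixels: iterable of (r, g, b) tuples with components in [0, 1].
    Uses the Rec. 709 luma coefficients.
    """
    total = count = 0
    for r, g, b in pixels:
        total += 0.2126 * r + 0.7152 * g + 0.0722 * b
        count += 1
    return total / count if count else 0.0

frame = [(1.0, 1.0, 1.0), (0.0, 0.0, 0.0)]  # one white pixel, one black pixel
print(ambient_intensity(frame))  # ≈ 0.5
```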
What these platforms have yet to learn is how to process massive calculations in real time. This is another challenge of augmented reality app development that needs to be addressed to achieve realistic rendering.
The problem is that complex virtual objects with many vertices take longer to render than simple ones, which adds to the app's response time.
Let's take a diamond ring, for example, and look closer at its structure. The gemstone has many facets, and each one catches reflections depending on the position of the light source. All this makes for a complex 3D model that needs calculations for every facet and lighting scenario.
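The per-facet work is easy to see in code: for every facet and every light source, the renderer needs at least one reflection-vector calculation using the standard formula r = d - 2(d·n)n, so the cost grows with facets × lights. A minimal sketch:

```python
def reflect(d, n):
    """Reflect direction d about unit surface normal n: r = d - 2(d·n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

# A light ray hitting a horizontal facet (normal pointing up)
# bounces off symmetrically.
print(reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (1.0, 1.0, 0.0)
```

Multiply this by dozens of facets and several light sources per frame, and the real-time budget on a phone starts to look tight.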
One of the possible solutions is to simplify the 3D model, which subsequently cuts down rendering time. Unreal Engine 5 offers something of the kind, although for now mainly for the gaming industry. Its technology called Nanite promises to skip the vertices in 3D models that it finds insignificant while ensuring realistic rendering anyway. It streams and processes only the details that users are able to perceive, eliminating time-consuming calculations. So what you do is feed a detailed 3D model to the engine, which does all the magic and decides which vertices can be left out.
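To get a feel for what "skipping insignificant vertices" means, here is a toy decimation of a 2D polyline that drops a vertex when it barely deviates from the line through its neighbors. Nanite's actual view-dependent, screen-space metric is far more sophisticated; this is only a conceptual sketch:

```python
def simplify(points, tol):
    """Toy decimation: drop a vertex if it lies within `tol` of the line
    joining its last kept neighbor and its successor."""
    if len(points) < 3:
        return list(points)
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        a, b, c = kept[-1], points[i], points[i + 1]
        # Perpendicular distance from b to the line through a and c.
        area2 = abs((c[0] - a[0]) * (b[1] - a[1]) - (b[0] - a[0]) * (c[1] - a[1]))
        base = ((c[0] - a[0]) ** 2 + (c[1] - a[1]) ** 2) ** 0.5
        if base == 0 or area2 / base > tol:
            kept.append(b)  # significant detail: keep the vertex
    kept.append(points[-1])
    return kept

nearly_flat = [(0, 0), (1, 0.001), (2, 0)]
sharp_peak = [(0, 0), (1, 1), (2, 0)]
print(simplify(nearly_flat, 0.01))  # middle vertex dropped
print(simplify(sharp_peak, 0.01))   # middle vertex kept
```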
It's worth mentioning that such a solution works best for objects placed at a distance. When a close-up is needed, any little flaw becomes visible.
Realistic AR rendering is an important component of a high-quality AR showroom or app. Advances in this domain make it possible for virtual objects to look as close to real life as possible.
With the help of environment mapping, developers can break the surrounding environment down into detailed images, detect light sources, and estimate where reflections will appear.
Processing all the calculations in real time remains a challenge for the time being, but steps in this direction are already being made, giving hope that a breakthrough in AR-powered apps for e-commerce is just around the corner.