Glossary

3D Models

Digital representations of objects or characters crafted in a three-dimensional environment. These models encompass intricate details, textures, and often animation, enabling augmented reality (AR) applications to introduce lifelike and interactive virtual elements into the AR experience.

API

An API is a set of rules and protocols that allows different software applications to communicate and interact with each other. It defines the methods and data formats that developers can use to request and exchange information or perform specific tasks within a software system or between different systems.

AppID

AppID refers to the unique identifier assigned to an Android application. This identifier is formally known as the "package name" or "application ID." It is a critical component of an Android app's identity and is declared in the app's manifest file.

AR effects

Digital enhancements and interactive elements added to real-world views via AR devices. AR effects empower developers to create engaging and immersive AR experiences, enhancing user engagement and creativity. The terms "AR effects" and "filters" are used interchangeably and mean the same thing; DeepAR officially uses the term "effects."

Background blur

A visual effect applied to images or videos that makes the background appear soft and out of focus, while keeping the subject, typically in the foreground, sharp and clear.

Background replacement

Technique that utilizes neural networks and computer vision algorithms to automatically separate the subject (usually a person) from the background in images or videos. Once the subject is segmented, a new background can be seamlessly inserted, replacing the original one. This technology is commonly used in applications like virtual backgrounds in video conferencing or creating dynamic, context-aware environments in augmented reality (AR) experiences, where the user can appear in different virtual settings while maintaining a realistic and convincing visual presentation.

Beauty API

An add-on within the AR SDK that enhances a user's appearance by applying virtual makeup, skin enhancements, or filters to create a more polished and attractive look in augmented reality applications.

Blend shapes

Blend shapes, also known as shape keys or morph targets, are a technique used in computer graphics, 3D modeling, and animation to create a range of different facial expressions or deformations for characters or objects. Blend shapes work by defining a set of key shapes or positions for specific parts of a 3D model, typically associated with facial features like the eyes, mouth, and eyebrows. Each blend shape represents a particular deformation or expression, such as a smile, frown, or raised eyebrow. By combining these blend shapes in various proportions, it's possible to smoothly transition between different facial expressions. For example, to create a smile, you might blend the "neutral" face shape with a "smile" blend shape.
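The blending described above is a simple weighted sum of per-vertex deltas. Here is a minimal sketch in plain Python (illustrative vertex data and shape names, not tied to any particular 3D package):

```python
# Sketch of blend-shape (morph target) interpolation.
# Each shape is a list of (x, y, z) vertices sharing the same vertex order.
# result = neutral + sum_i( weight_i * (shape_i - neutral) )

def apply_blend_shapes(neutral, shapes, weights):
    result = []
    for i, base in enumerate(neutral):
        v = list(base)
        for name, shape in shapes.items():
            w = weights.get(name, 0.0)
            for axis in range(3):
                v[axis] += w * (shape[i][axis] - base[axis])
        result.append(tuple(v))
    return result

neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]   # two mouth-corner vertices
smile   = [(0.0, 0.5, 0.0), (1.0, 0.5, 0.0)]   # corners raised by 0.5

# A half-strength smile lifts the corners by half the full delta.
verts = apply_blend_shapes(neutral, {"smile": smile}, {"smile": 0.5})
```

A face-tracking system typically outputs dozens of such weights per frame (one per expression), and the same summation drives the animated mesh.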

Bone rigging

Bone rigging is a technique in 3D animation where interconnected bones simulate a character's or object's skeleton. It allows for realistic movement by manipulating these bones, simplifying the animation process.

Cross-Platform

Refers to software or technologies that work on different operating systems or devices without requiring major modifications, allowing for broader compatibility and efficient development.

Domain name

A domain name is a website's easy-to-remember address, such as www.example.com. The protocol (such as https) and any path are not part of the domain name. For example, in https://www.example.com/path/to/page, only www.example.com is the domain name.
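In code, extracting the domain name from a full URL can be done with a standard URL parser, for example Python's `urllib.parse`:

```python
from urllib.parse import urlparse

url = "https://www.example.com/path/to/page"
domain = urlparse(url).hostname  # drops the scheme, path, query, and port

print(domain)  # www.example.com
```

This is the value you would typically register when a service (such as an SDK license) is bound to a specific domain.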

Emotion detection

Analyzing facial expressions in images to determine a person's emotional state, such as happiness, sadness, or surprise. In AR face filters, this allows for dynamic adjustments and interactions based on the user's emotion, creating a more immersive and responsive experience.

Face tracking

Detecting and monitoring the position and movement of a face within an image, enabling AR face filters to accurately overlay and align virtual elements or effects on the user's face.

FBX

FBX is a proprietary file format developed by Autodesk, commonly used in the fields of 3D modeling, animation, and computer graphics. It serves as a versatile and widely supported format for exchanging 3D data between different software applications and platforms. FBX files can contain 3D models, animations, textures, and other assets, making them suitable for transferring complex 3D scenes and animations between various 3D modeling and animation software.

Foot tracking

Computer vision algorithms used to detect, monitor, and track the position and movement of a person's feet in real time. This feature is especially valuable for applications like virtual shoe try-ons, where users can preview how footwear looks on their feet in a virtual setting before making a purchase. For easy virtual shoe try-on creation and implementation, check out ShopAR.

Framework

Pre-designed, structured set of libraries, tools, and conventions that provides a foundation for building software applications. It offers a reusable structure and a predefined architecture that simplifies the development process by providing common functionalities, such as handling input, managing databases, and implementing specific features. Frameworks are designed to be flexible and extensible, allowing developers to build customized solutions on top of the framework's core components.

glTF

glTF is an open-standard file format designed for the efficient transmission and loading of 3D scenes and models. It is developed by the Khronos Group, an open consortium of industry-leading companies. glTF files are typically compact and optimized for fast loading and rendering, making them ideal for web-based 3D experiences and interactive applications. A glTF file uses one of two possible file extensions: .gltf (JSON/ASCII) or .glb (binary).
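Because the `.gltf` variant is plain JSON, it can be inspected with any JSON parser. A minimal sketch (the document below is a toy example, not a complete asset):

```python
import json

# A minimal glTF 2.0 document: top-level JSON with an "asset" header
# and arrays such as "meshes". The .glb variant wraps this same JSON
# in a binary container alongside buffer data.
gltf_text = """
{
  "asset": {"version": "2.0"},
  "meshes": [{"name": "Helmet"}, {"name": "Visor"}]
}
"""

doc = json.loads(gltf_text)
print(doc["asset"]["version"])             # 2.0
print([m["name"] for m in doc["meshes"]])  # ['Helmet', 'Visor']
```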

Hair segmentation

Process of identifying and isolating the pixels in an image or video that correspond to human hair, effectively distinguishing and separating the hair region from other regions, such as the face, background, or other objects.

License key

A License Key is a unique code provided by DeepAR to registered developers. This key validates and authorizes an application to use the features and capabilities of the DeepAR SDK.

LUT

LUT (Lookup Table) is a data structure or mathematical formula used in digital image and video processing to map input color values to desired output color values. LUTs are commonly employed to perform color grading, color correction, or color transformations. They can adjust brightness, contrast, saturation, and color balance by remapping the colors in an image or video based on predefined color values.
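At its simplest, a 1D LUT for 8-bit images is a 256-entry table indexed by the input value. This sketch builds an inverting LUT (a photographic "negative"); any tone curve works the same way:

```python
# A 256-entry lookup table mapping each 8-bit input value to an output value.
# This particular LUT inverts the image; contrast or color curves are built
# the same way, just with a different mapping.
lut = [255 - i for i in range(256)]

def apply_lut(pixels, lut):
    """Remap a flat list of 8-bit pixel values through the table."""
    return [lut[p] for p in pixels]

print(apply_lut([0, 128, 255], lut))  # [255, 127, 0]
```

Color-grading LUTs extend this idea to 3D tables indexed by (R, G, B) triples, but the lookup principle is identical.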

MAU

MAU (Monthly Active Users) represents the number of unique users who actively use the DeepAR platform and its features within a calendar month. In DeepAR, we use the MAU metric as a key factor in determining monthly subscription plans.

Particles

Small, individual graphical elements used to simulate complex phenomena, such as smoke, fire, rain, or explosions, by representing them as dynamic collections of many tiny elements moving and interacting according to defined behaviors.
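The core of a particle system is a per-frame update loop over many small elements. A toy sketch (field names and constants are illustrative):

```python
import random

GRAVITY = -9.8  # m/s^2, applied to vertical velocity each step

def step(particles, dt):
    """Advance every particle one time step and drop expired ones."""
    for p in particles:
        p["vy"] += GRAVITY * dt
        p["x"] += p["vx"] * dt
        p["y"] += p["vy"] * dt
        p["life"] -= dt
    return [p for p in particles if p["life"] > 0]

def spawn():
    """Emit one particle at the origin with a randomized velocity and lifetime."""
    return {"x": 0.0, "y": 0.0,
            "vx": random.uniform(-1.0, 1.0), "vy": random.uniform(2.0, 5.0),
            "life": random.uniform(0.5, 2.0)}

particles = [spawn() for _ in range(100)]
for _ in range(10):                 # simulate ten ~60 fps frames
    particles = step(particles, dt=0.016)
```

Real systems add per-particle color, size, and spawn rate, and render each particle as a small textured quad, but the update loop stays this simple.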

PBR

PBR, short for Physically Based Rendering, is a computer graphics rendering technique that aims to simulate the interaction of light with materials in a physically accurate way. In PBR, materials are described using parameters that mimic real-world properties, such as roughness, metallicity, and reflectivity. These parameters determine how a material responds to light, including how it scatters, reflects, or absorbs light rays. By using PBR, artists and developers can create 3D models and scenes that behave in a physically plausible manner, resulting in more convincing and immersive visuals.
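One small but representative PBR ingredient is the Schlick approximation to Fresnel reflectance, which models how reflectivity rises at grazing viewing angles:

```python
# Schlick's approximation: F(theta) = F0 + (1 - F0) * (1 - cos(theta))^5
# F0 is the reflectivity at normal incidence (~0.04 for common dielectrics,
# much higher for metals). cos_theta is the cosine of the view angle.

def fresnel_schlick(cos_theta, f0):
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

fresnel_schlick(1.0, 0.04)  # head-on view: just the base reflectivity, 0.04
fresnel_schlick(0.0, 0.04)  # grazing angle: reflectance rises toward 1.0
```

A full PBR shader combines this term with roughness-driven microfacet distribution and geometry terms, but each is a similarly compact formula.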

Physics

Simulation of real-world physical properties and behaviors within the augmented environment. By integrating physics algorithms, AR applications can make virtual objects behave in ways that align with our expectations from the real world.

PNG animation

PNG animation refers to a sequence of images in the PNG (Portable Network Graphics) format that are displayed in rapid succession to create the illusion of motion or animation. Each image, known as a frame, is a standalone PNG file, and when displayed sequentially at a fast rate, they produce the appearance of movement.

Post-processing

Post-processing refers to the manipulation and enhancement of an image or video after it has been rendered. This step occurs in the final stages of the graphics pipeline, typically after the 3D scene has been rendered into a 2D image or video frame.

Render textures

Dynamic textures onto which content is directly rendered. Instead of holding a static image, they capture and store the output of a rendering process. This allows for operations like off-screen rendering, post-processing effects, or storing the visual output of a camera in real-time for subsequent use in the AR environment.

Scripting

Scripting involves writing code in a scripting language to automate tasks or control the behavior of software applications. Unlike full-fledged programming, which is often geared towards creating standalone applications, scripting is commonly used for automating repetitive tasks, manipulating data, or adding functionality to existing software platforms.

Segmentation

Process of dividing a digital image into multiple segments or sets of pixels, often with the goal of simplifying the representation of an image or separating it into meaningful parts, such as object and background.

Simple pendulum physics

In computer graphics and animation, a "simple pendulum" is often used as a simplified example to illustrate principles of motion, oscillation, and physics simulation. It's a way to demonstrate how objects move back and forth under the influence of gravity and can be used to teach basic concepts of keyframing, animation curves, and dynamics in 3D modeling and animation software.
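The pendulum's motion follows the equation θ'' = −(g / L)·sin(θ), which a simulation integrates step by step. A minimal sketch using semi-implicit Euler integration (constants are illustrative):

```python
import math

g, L = 9.8, 1.0                       # gravity (m/s^2) and pendulum length (m)
theta = math.radians(30)              # initial angle from vertical
omega = 0.0                           # initial angular velocity
dt = 0.01                             # time step in seconds

for _ in range(1000):                 # simulate 10 seconds
    omega += -(g / L) * math.sin(theta) * dt  # angular acceleration
    theta += omega * dt                       # then advance the angle
```

Updating velocity before position (semi-implicit Euler) keeps the oscillation stable; the same pattern drives swinging earrings or hair strands in AR effects.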

Sound playback

Support for sound effects within AR experiences, enabling the addition of music and audio enhancements to filters and effects.

Triggers

Events or conditions that initiate specific actions or responses within a program. Triggers can be user interactions such as opening the mouth, smiling, or raising the eyebrows. When the user performs one of these actions, the AR effect detects it and triggers the corresponding reaction.
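Trigger handling is essentially an event-to-callback mapping. A minimal sketch (the event names here are illustrative, not the SDK's actual trigger identifiers):

```python
# Toy trigger dispatcher: face-tracking events mapped to effect reactions.
handlers = {}

def on(trigger, fn):
    """Register a callback to run when the named trigger fires."""
    handlers.setdefault(trigger, []).append(fn)

def fire(trigger):
    """Invoke every callback registered for this trigger."""
    for fn in handlers.get(trigger, []):
        fn()

fired = []
on("mouth_open", lambda: fired.append("show confetti"))
on("smile", lambda: fired.append("play sparkle sound"))

fire("mouth_open")   # e.g. reported by the face tracker this frame
print(fired)         # ['show confetti']
```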

Video recording

A feature in the SDK that enables users to capture and store real-time video content.

Video textures

Moving image or video clip mapped onto a surface or object in 3D modeling and animation. This technique allows users to integrate dynamic content, such as videos, directly onto 3D objects, enhancing the visual appeal and interactivity of the 3D scene.

Virtual Try-On

A feature that allows users to digitally try on various products, such as eyeglasses, sunglasses, or footwear, within an AR environment to visualize how they would appear before making a purchase decision.

Wrist tracking

Computer vision algorithms used to detect, monitor, and track the position and movement of a person's wrist in real time. This allows an AR application to place a virtual watch on the user's wrist, giving them a realistic view of how it might look and fit, much like trying it on in a physical store. For easy virtual watch try-on creation and implementation, check out ShopAR.