The XR Glossary: 150+ AR, VR & Spatial Computing Terms Defined
Every AR, VR, MR, and spatial computing term you need to know — defined clearly and concisely. From 6DoF to WebXR, the most complete XR glossary online.
Whether you are a developer, enterprise buyer, journalist, or XR newcomer, the vocabulary of augmented reality, virtual reality, and spatial computing can feel overwhelming. This glossary covers 150+ essential terms - from foundational concepts to cutting-edge techniques - so you always have a reliable reference. Bookmark it. Share it. Use it.
A
A-Frame
An open-source web framework built on top of Three.js that lets developers create WebXR scenes using HTML-like markup. A-Frame is popular for rapid prototyping and browser-based VR/AR experiences. Related: WebXR, Three.js.
AI Avatars
Digital human representations driven by artificial intelligence, capable of realistic facial expressions, lip sync, voice synthesis, and conversational interaction. AI avatars are increasingly used in XR for training simulations, virtual customer service, and social presence applications.
Air Link
A wireless PCVR streaming feature by Meta that allows Meta Quest headsets to connect to a gaming PC over a local Wi-Fi network, streaming high-quality VR content without a physical cable. Related: Virtual Desktop, Wi-Fi 6E.
All-in-One (Standalone) Headset
A self-contained XR headset with built-in compute, storage, battery, and displays - no external PC or phone required. Examples include Meta Quest 3 and PICO 4. Related: Standalone VR Headset.
Anchors
Digital reference points tied to real-world positions that allow AR/MR content to remain fixed to a specific physical location across sessions or multiple devices. Anchors are fundamental for persistent AR experiences. Related: Spatial Anchors, World Locking.
Android XR
Google's XR operating system platform, built on Android, designed for headsets and smart glasses. Announced in partnership with Samsung, Android XR powers devices such as Samsung's Project Moohan headset and integrates Gemini AI.
Anti-Aliasing
Rendering techniques that smooth jagged edges ("jaggies") on 3D geometry. In XR, common methods include MSAA (Multi-Sample Anti-Aliasing) and FXAA (Fast Approximate Anti-Aliasing). Smooth edges are especially important in VR where the display is very close to the eye.
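The idea behind these techniques can be illustrated with a toy supersampling model (real MSAA runs in GPU hardware and only multi-samples coverage, not shading; the edge test and sample counts here are invented for illustration):

```python
def edge_coverage(px: float, py: float, samples_per_axis: int = 4) -> float:
    """Fraction of subsamples inside the half-plane x < y (a diagonal edge).

    With one sample per pixel, coverage is 0 or 1, producing a jagged
    staircase. Averaging a grid of subsamples yields intermediate values
    that blend the edge smoothly - the core idea behind MSAA/SSAA.
    """
    n = samples_per_axis
    inside = 0
    for i in range(n):
        for j in range(n):
            # Subsample positions at cell centres within the pixel.
            sx = px + (i + 0.5) / n
            sy = py + (j + 0.5) / n
            if sx < sy:
                inside += 1
    return inside / (n * n)
```

A pixel straddling the edge gets a partial coverage value (e.g. 0.375) rather than a hard 0-or-1 step, which is what visually softens the "jaggies".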
AR (Augmented Reality)
A technology that overlays digital information - images, text, 3D models, animations - onto the user's view of the real physical world. AR can be delivered through smartphones, tablets, or dedicated AR glasses. The real world remains visible at all times. Related: MR, XR.
ARCore
Google's AR development platform for Android devices. ARCore enables motion tracking, environmental understanding (plane detection, depth), and light estimation, allowing developers to build AR apps for Android phones and tablets.
ARKit
Apple's AR development framework for iOS and iPadOS. ARKit provides world tracking, scene understanding, face tracking, and LiDAR-based depth sensing, enabling AR apps on iPhones and iPads as well as visionOS experiences on Apple Vision Pro.
Asynchronous Spacewarp (ASW)
A technique developed by Meta that synthesizes intermediate frames when the GPU cannot maintain the target frame rate. ASW uses motion vectors and depth data to extrapolate new frames, reducing judder and maintaining smooth perceived motion even under GPU load. Related: Asynchronous Timewarp.
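Meta's actual implementation is proprietary, but the core idea - extrapolating along per-pixel motion vectors to synthesize an in-between frame - can be sketched for a single point (the function name and half-frame default are illustrative only):

```python
def extrapolate_position(pos, prev_pos, frames_ahead=0.5):
    """Linearly extrapolate a point's screen position along its motion
    vector, as when synthesizing an intermediate frame.

    pos, prev_pos: (x, y) screen positions in the last two rendered frames.
    frames_ahead: how far past the last real frame to extrapolate
    (0.5 = half a frame, as when the app renders at half frame rate
    and every other displayed frame is synthesized).
    """
    vx = pos[0] - prev_pos[0]  # per-frame motion vector
    vy = pos[1] - prev_pos[1]
    return (pos[0] + vx * frames_ahead, pos[1] + vy * frames_ahead)
```

Because this is extrapolation rather than interpolation, fast or erratic motion can produce visible warping artifacts, which is why ASW also leans on depth data in practice.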
Asynchronous Timewarp (ATW)
A reprojection technique that adjusts the last rendered frame to account for head movement that occurred between render time and display time, reducing perceived latency and motion sickness. ATW operates at the compositor level, independently of the application. Related: Reprojection, Motion-to-Photon Latency.
B
Babylon.js
A powerful open-source 3D engine built for the web, with robust WebXR support. Babylon.js is feature-rich - physics, PBR materials, post-processing - and is often chosen for more complex WebXR applications compared to A-Frame. Related: WebXR, Three.js.
Binocular Overlap
The horizontal field of view region that both eyes can see simultaneously in a VR headset. Higher binocular overlap (closer to the human eye's ~114 degrees) increases the sense of depth and immersion. Related: Field of View, IPD.
Birdbath Optics
A compact optical design used in some AR glasses where a beamsplitter reflects light from a microdisplay toward the eye. Birdbath optics are simpler and cheaper to manufacture than waveguides but tend to have lower transparency and a narrower field of view. Related: Waveguide, Pancake Lens.
Body Tracking
The real-time capture and mapping of a user's full body movements into a virtual avatar or interaction model. Body tracking can be achieved via camera-based computer vision, wearable sensors, or combinations thereof. Related: Hand Tracking, Face Tracking.
C
Color Passthrough
A passthrough camera feed rendered in full color (as opposed to black-and-white), allowing users to see their real environment through a video feed with accurate color representation. Meta Quest 3 and Apple Vision Pro feature color passthrough. Related: Passthrough, Video See-Through.
Controller
A handheld input device used in VR to interact with virtual environments. Modern VR controllers incorporate 6DoF tracking, buttons, triggers, thumbsticks, and often haptic feedback. Some platforms (Meta Quest, Apple Vision Pro) also support controller-free hand tracking.
D
Depth Sensor
A hardware component that measures the distance from the device to surfaces in the environment. Depth sensors enable scene reconstruction, occlusion, and hand/object tracking. Common depth sensing approaches include Time-of-Flight, structured light, and stereo cameras. Related: Time-of-Flight, LiDAR.
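For the Time-of-Flight approach, the distance follows directly from the round-trip time of light, d = c·t / 2:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance implied by a Time-of-Flight measurement.

    The emitted light travels to the surface and back, so the one-way
    distance is half the round-trip time multiplied by the speed of light.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```

A surface one metre away returns the pulse after only about 6.67 nanoseconds, which is why ToF sensors need extremely precise timing electronics.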
Digital Twin
A virtual replica of a physical object, system, or environment that is synchronized with real-world data. In XR, digital twins allow engineers and operators to visualize, simulate, and interact with real-world systems - from factory floors to city infrastructure - in an immersive environment.
Display Panel
The screen technology used inside an XR headset. Common types include LCD, OLED, Micro-OLED, and MicroLED, each with different trade-offs in brightness, contrast, response time, and power consumption. Related: OLED, Micro-OLED, MicroLED.
Draw Call
A command sent from the CPU to the GPU instructing it to render a specific mesh with a specific material. Excessive draw calls are a major performance bottleneck in real-time XR rendering. Batching and instancing are common techniques to reduce draw call count. Related: Rendering, Shader.
E
Enterprise XR
The application of XR technology in professional and industrial contexts - training, remote assistance, design review, warehouse logistics, healthcare, and more. Enterprise XR often prioritizes durability, manageability, and ROI over consumer entertainment. Related: XR Training, XR for Remote Assistance.
Extended Reality (XR)
The overarching umbrella term for all immersive technologies that merge the physical and digital worlds: Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR). XR describes the full spectrum from fully real to fully virtual. Related: AR, VR, MR.
Eye Relief
The distance between the eyepiece lens and the user's eye at which the full field of view is visible. Headsets with larger eye relief accommodate eyeglass wearers more comfortably. Related: IPD, Field of View.
Eye Tracking
Technology that measures the orientation and movement of a user's eyes in real time. Eye tracking in XR enables foveated rendering (rendering at full quality only where the user looks), gaze-based interaction, and social presence features like natural eye contact in avatars. Related: Foveated Rendering, Eye-Tracked Foveated Rendering.
Eye-Tracked Foveated Rendering
An advanced form of foveated rendering where the high-resolution render region dynamically follows the user's gaze in real time, as detected by built-in eye tracking hardware. This is significantly more efficient than fixed foveated rendering. Related: Foveated Rendering, Eye Tracking.
F
Face Tracking
Real-time capture of facial expressions - brow raises, smiles, mouth movements - typically using cameras and computer vision or dedicated face sensors. Face tracking enables expressive social avatars and is used in Apple Vision Pro's Persona feature and Meta's Quest Pro. Related: Body Tracking, AI Avatars.
Field of View (FOV)
The angular extent of the observable world visible through an XR headset at any given moment, measured in degrees. Human horizontal vision spans roughly 200 degrees in total, of which about 114 degrees is binocular overlap. Consumer headsets typically offer 90-120 degrees horizontal FOV. Higher FOV increases immersion. Related: Horizontal FOV, Vertical FOV, Binocular Overlap.
Finger Tracking
A subset of hand tracking focused on detecting the individual position and bend of each finger joint. Fine-grained finger tracking enables precise pinch gestures, virtual keyboard typing, and detailed hand-object interactions. Related: Hand Tracking.
Fixed Foveated Rendering
A form of foveated rendering where the high-resolution region is fixed at the center of the display (where most users tend to look), without using eye tracking. Less adaptive than eye-tracked foveated rendering but requires no eye tracking hardware. Related: Foveated Rendering, Eye-Tracked Foveated Rendering.
Foveated Rendering
A rendering optimization technique that concentrates computational resources on the central region of the field of view (where the eye is focused) while reducing quality in the periphery. Exploits the eye's limited peripheral acuity to save significant GPU cost. Related: Fixed Foveated Rendering, Eye-Tracked Foveated Rendering.
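A toy version of the tiering logic (the ring radii and quality levels are invented for illustration; real implementations use GPU shading-rate features with vendor-specific zones):

```python
import math

def shading_quality(pixel, gaze, radii=(10.0, 25.0)):
    """Pick a shading quality tier from angular distance to the gaze point.

    pixel, gaze: (x, y) positions in degrees of visual angle.
    radii: eccentricity thresholds for the full- and mid-quality rings.
    """
    eccentricity = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if eccentricity <= radii[0]:
        return 1.0    # full resolution at the centre of gaze
    if eccentricity <= radii[1]:
        return 0.5    # half resolution in the mid-periphery
    return 0.25       # quarter resolution in the far periphery
```

Fixed foveated rendering is this same function with `gaze` pinned to the display centre; the eye-tracked variant feeds in live gaze coordinates every frame.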
Frame Rate
The number of image frames rendered and displayed per second, measured in frames per second (fps); the display's refresh rate is quoted in Hz. XR headsets typically target 72, 90, 120, or even 144 Hz, and the application must render fast enough to keep up. Higher frame rates reduce judder and motion sickness. Related: Refresh Rate, Judder, Motion-to-Photon Latency.
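Those targets translate directly into a per-frame rendering budget:

```python
def frame_budget_ms(target_hz: float) -> float:
    """Per-frame render budget in milliseconds at a given refresh rate."""
    return 1000.0 / target_hz

for hz in (72, 90, 120, 144):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

At 90 Hz the whole pipeline - simulation, culling, and rendering for both eyes - must fit in about 11.1 ms, which is why techniques like foveated rendering and reprojection matter so much in XR.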
Fresnel Lens
A compact lens design used in many VR headsets that uses concentric grooves to refract light, reducing weight and depth compared to conventional lenses. Fresnel lenses are cost-effective but can produce "god rays" (light artifacts) with bright content on dark backgrounds. Related: Pancake Lens, Waveguide.
G
Gaussian Splatting
A novel 3D scene representation technique that models scenes as millions of 3D Gaussian "splats" rather than polygons or NeRF volumes. Gaussian Splatting enables real-time rendering of photorealistic scenes captured from photos/video and is rapidly gaining traction for XR visualization. Related: NeRF, Photogrammetry, Point Cloud.
Generative AI in XR
The application of generative AI models - for images, 3D assets, audio, text, and more - within XR contexts. Generative AI enables rapid creation of virtual environments, NPC dialogue, personalized content, and dynamic world-building in real time. Related: AI Avatars, Neural Radiance Fields.
glTF / GLB
glTF (GL Transmission Format) is an open standard file format for 3D models and scenes, designed for efficient transmission and loading in real-time applications. GLB is the binary container version of glTF. glTF is widely used in WebXR, AR Quick Look, and cross-platform 3D asset pipelines. Related: USD, USDZ.
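The GLB container is simple enough to read by hand: per the glTF 2.0 spec, a 12-byte header (the magic bytes "glTF", a version, and the total length) is followed by chunks, the first of which must be the JSON scene description. A minimal round-trip sketch:

```python
import json
import struct

def make_glb(gltf: dict) -> bytes:
    """Build a minimal single-chunk GLB from a glTF JSON dict."""
    payload = json.dumps(gltf).encode("utf-8")
    payload += b" " * (-len(payload) % 4)        # pad JSON chunk to 4 bytes
    total = 12 + 8 + len(payload)                # header + chunk header + data
    header = struct.pack("<III", 0x46546C67, 2, total)   # 'glTF', version 2
    chunk = struct.pack("<II", len(payload), 0x4E4F534A) + payload  # 'JSON'
    return header + chunk

def parse_glb(data: bytes) -> dict:
    """Read the JSON chunk back out of a GLB (binary glTF) file."""
    magic, version, _length = struct.unpack_from("<III", data, 0)
    assert magic == 0x46546C67, "not a GLB file ('glTF' magic missing)"
    assert version == 2, "only glTF 2.0 containers handled here"
    chunk_len, chunk_type = struct.unpack_from("<II", data, 12)
    assert chunk_type == 0x4E4F534A, "first chunk must be JSON"
    return json.loads(data[20:20 + chunk_len])
```

A real GLB usually carries a second binary chunk with vertex and texture data after the JSON chunk; this sketch handles only the single-chunk case.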
H
Hand Tracking
The real-time detection and mapping of a user's hand and finger positions using cameras and computer vision, without requiring physical controllers. Hand tracking enables natural, controller-free interaction in XR. Meta Quest, Apple Vision Pro, and HoloLens all support hand tracking. Related: Finger Tracking, Controller.
Haptic Feedback
Physical sensations - vibrations, force, texture - delivered to the user to simulate touch in a virtual environment. In XR, haptics are most commonly delivered via vibration motors in controllers, but advanced solutions include haptic gloves and ultrasonic mid-air haptics. Related: Haptics.
Haptics
The study and use of technology that simulates the sense of touch in digital interactions. In XR, haptics enhance immersion by letting users "feel" virtual objects, surfaces, and events. Related: Haptic Feedback, Controller.
Head-Mounted Display (HMD)
The general term for any XR device worn on the head, encompassing VR headsets, AR glasses, and mixed reality headsets. HMD is the hardware category that includes everything from Meta Quest to Microsoft HoloLens to Apple Vision Pro. Related: Standalone VR Headset, AR Glasses.
Holographic Display
A display technology that creates the perception of three-dimensional images that appear to float in space, using interference patterns of light. True holographic displays are still largely research-phase; some devices (like HoloLens) use the term loosely to describe waveguide-based see-through displays. Related: Light Field Display, Waveguide.
Horizontal FOV
The field of view measured left-to-right across the width of the display. Horizontal FOV has the greatest impact on the sense of immersion and peripheral vision in VR. Related: Field of View, Vertical FOV.
I
Immersion
The degree to which a technology objectively envelops a user's senses - wider FOV, higher resolution, lower latency, and spatial audio all increase immersion. Often distinguished from "presence" which is the subjective psychological sensation of "being there." Related: Presence.
Immersive Technology
The broad category of hardware and software - VR and MR headsets, AR glasses, spatial audio, haptics - designed to envelop the user's senses and blend digital content with perception. Often used interchangeably with XR. Related: Extended Reality (XR), Immersion.
Frequently Asked Questions
What does XR stand for?
XR stands for Extended Reality — an umbrella term covering Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR). It describes any technology that extends or replaces human perception of the real world with digital content.
What is the difference between AR and VR?
AR (Augmented Reality) overlays digital content onto the real world, which you can still see. VR (Virtual Reality) replaces the real world entirely with a simulated environment. MR (Mixed Reality) blends both, allowing digital objects to interact with the physical world.
What is 6DoF in VR?
6DoF (Six Degrees of Freedom) means a device can track movement in all six spatial directions: forward/back, left/right, up/down, plus pitch, yaw, and roll rotations. This enables you to physically walk around and lean in a virtual space, creating a much more immersive experience than 3DoF.
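The difference is easy to see in data terms - a 3DoF pose is rotation only, while 6DoF adds position (the class names, field layout, and the OpenGL/WebXR-style convention of -Z as "forward" are illustrative assumptions):

```python
import math
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Rotation only: enough for seated 360-degree video."""
    pitch: float  # degrees
    yaw: float
    roll: float

@dataclass
class Pose6DoF(Pose3DoF):
    """Adds translation, so physically walking and leaning are tracked."""
    x: float = 0.0  # metres
    y: float = 0.0
    z: float = 0.0

def step_forward(p: Pose6DoF, metres: float) -> Pose6DoF:
    """Move along the current yaw heading (-Z is forward at yaw 0)."""
    yaw = math.radians(p.yaw)
    return Pose6DoF(p.pitch, p.yaw, p.roll,
                    p.x + metres * math.sin(yaw),
                    p.y,
                    p.z - metres * math.cos(yaw))
```

A 3DoF headset can only ever report the first three fields, which is why leaning sideways in one feels wrong: the world rotates with your head but never translates.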
What is foveated rendering?
Foveated rendering renders at full resolution only the area where your eye is focused, while peripheral areas are rendered at lower resolution. Eye-tracked foveated rendering uses real-time eye tracking data to follow your gaze, dramatically reducing GPU load with little or no perceptible quality loss.
What is OpenXR?
OpenXR is an open, royalty-free standard from the Khronos Group that defines a common API for XR applications and runtimes. It allows developers to write XR code once and deploy across multiple headsets and platforms without platform-specific rewrites.