Open Source XR: The Complete Directory of AR/VR Projects, Frameworks & Tools
Every major open source project in XR, AR, and VR — from WebXR and A-Frame to OpenXR and Godot. The developer's guide to building spatial computing experiences without vendor lock-in.
Why Open Source Is Reshaping XR Development
Extended reality - AR, VR, and MR - has historically been dominated by proprietary SDKs, closed hardware ecosystems, and expensive toolchains. Unity, Unreal Engine, and device-specific SDKs created a fragmented landscape where developers found themselves locked into vendor decisions and licensing changes. The rise of a robust open source XR ecosystem is changing all of that.
Today, you can build a fully immersive VR game with Godot (MIT license, zero royalties), deploy it via OpenXR to any major headset, stream it wirelessly via ALVR or WiVRn, and track hands using MediaPipe - all without paying a single licensing fee or touching proprietary code. For web-based XR, A-Frame, Babylon.js, and Three.js deliver production-grade experiences in the browser. For AR on mobile, AR.js and MindAR.js make markerless tracking accessible to every web developer.
This directory catalogs every significant open source project in the XR ecosystem - with GitHub star counts, licenses, primary languages, and descriptions - so you can build your stack with confidence. Whether you're a solo indie developer, a research lab, or an enterprise team evaluating vendor lock-in risk, this is your reference guide.
Complete Directory of Open Source XR Projects
The table below covers all major open source projects in the XR ecosystem as of 2026. GitHub star counts are approximate and reflect active community interest.
| Project | Category | Language | License | Stars (approx.) | Description |
|---|---|---|---|---|---|
| WebXR API (W3C) | Web XR | Spec | Open Standard | — | W3C standard API enabling immersive experiences in web browsers |
| A-Frame | Web XR | JavaScript | MIT | ~16k | HTML-based framework for building 3D/AR/VR on the web |
| Babylon.js | Web XR | TypeScript | Apache 2.0 | ~23k | Full-featured 3D engine with first-class WebXR support |
| Three.js | Web XR | JavaScript | MIT | ~102k | Foundational 3D library; includes WebXRManager for VR/AR |
| React Three Fiber | Web XR | TypeScript | MIT | ~27k | React renderer for Three.js; pairs with @react-three/xr |
| PlayCanvas | Web XR | JavaScript | MIT | ~9.5k | WebGL game engine with built-in WebXR module |
| Godot Engine | Game Engine | C++/GDScript | MIT | ~92k | Full game engine with native OpenXR support in 4.x |
| OpenXR SDK | Runtime/Standard | C/C++ | Apache 2.0 | ~1.5k | Khronos cross-platform XR API — the industry standard |
| LÖVR | Game Engine | Lua/C | MIT | ~2.2k | Simple VR framework for Lua; great for experimentation |
| OpenVR | Runtime | C++ | BSD-3-Clause | ~7k | Valve's VR API (legacy); superseded by OpenXR but still used |
| AR.js | AR Framework | JavaScript | MIT | ~16k | Lightweight marker-based and location AR for the web |
| MindAR.js | AR Framework | JavaScript | MIT | ~3k | Web AR with image tracking and face tracking via TensorFlow.js |
| MediaPipe | AI/ML & AR | C++/Python/JS | Apache 2.0 | ~26k | Google's ML pipeline for hand, face, pose, and object tracking |
| OpenCV | Computer Vision | C++/Python | Apache 2.0 | ~78k | Essential computer vision library used in AR pipelines worldwide |
| ORB-SLAM3 | SLAM | C++ | GPLv3 | ~9k | State-of-the-art SLAM supporting monocular, stereo, RGB-D cameras |
| OpenVINS | SLAM/VIO | C++ | GPLv3 | ~3k | Open-source visual-inertial navigation system for EKF-based SLAM |
| Kimera | SLAM | C++ | BSD-2-Clause | ~2.5k | MIT SPARK Lab SLAM with semantic mesh reconstruction |
| ElasticFusion | SLAM | C++ | Non-commercial | ~2k | Dense surfel-based real-time RGB-D reconstruction |
| OpenXR Toolkit | Tools | C++ | MIT | ~1k | In-headset overlay for OpenXR — upscaling, hand tracking, overlays |
| Monado | Runtime | C | MIT | — | First open source OpenXR runtime; Linux-native (freedesktop.org) |
| WiVRn | Wireless PCVR | C++ | GPLv3 | ~1.2k | Wireless VR streaming using Monado on Linux to Android headsets |
| ALVR | Wireless PCVR | Rust | MIT | ~5k | Air Light VR — stream SteamVR/PCVR wirelessly to Quest headsets |
| Blender | Creation Tools | C/Python | GPL-2.0 | ~12k | 3D creation suite with XR viewport for in-headset editing |
| Mozilla Hubs | Social VR | JavaScript | MPL 2.0 | ~2k | Open source browser-based social VR platform |
| Spoke (Hubs Editor) | Creation Tools | JavaScript | MPL 2.0 | ~1.5k | Scene editor for Mozilla Hubs — drag-and-drop world building |
| OpenPose | AI/ML | C++ | Non-commercial | ~30k | CMU real-time multi-person 2D/3D pose estimation |
| DepthAI (OAK) | AI/ML Hardware | Python | MIT | ~3.5k | Luxonis SDK for OAK-D spatial AI cameras with on-device ML |
1. Web-Based XR: Build Without an App Store
The browser is the most accessible XR delivery platform in existence - no app store approval, no sideloading, no SDK installation. The WebXR Device API (a W3C standard) defines how browsers expose VR and AR capabilities, and a thriving ecosystem of frameworks sits on top of it.
WebXR Device API
The WebXR Device API is the foundation of all browser-based XR. It provides access to headset pose, controllers, hand tracking, and AR hit-testing directly from JavaScript. Supported in Chrome, Edge, the Meta Quest Browser, and Wolvic (the successor to Firefox Reality), with partial support in Safari, it is maintained by the W3C Immersive Web Working Group. The spec includes modules for Depth Sensing, Lighting Estimation, Anchors, and Hand Input - making modern WebXR surprisingly capable for production AR apps.
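Entering an immersive session with the raw API takes only a few calls. A minimal sketch (browser-only - navigator.xr does not exist outside a WebXR-capable browser; the gl argument is assumed to be a WebGL context created with xrCompatible: true):

```javascript
// Feature-detect WebXR, request an immersive VR session, and start
// the XR render loop. Optional features degrade gracefully if the
// device does not support them.
async function enterVR(gl, onXRFrame) {
  if (!navigator.xr) throw new Error("WebXR not available in this browser");
  if (!(await navigator.xr.isSessionSupported("immersive-vr"))) {
    throw new Error("immersive-vr not supported on this device");
  }
  const session = await navigator.xr.requestSession("immersive-vr", {
    optionalFeatures: ["local-floor", "hand-tracking"],
  });
  // Route rendering into the headset's framebuffer.
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace("local-floor");
  // XR frames are driven by the session, not window.requestAnimationFrame.
  session.requestAnimationFrame((time, frame) => onXRFrame(time, frame, refSpace));
  return session;
}
```

Every framework below (A-Frame, Babylon.js, Three.js) ultimately wraps this same session lifecycle.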
A-Frame
Created by Mozilla and now community-maintained, A-Frame remains the most beginner-friendly path into WebXR. It uses HTML-like custom elements (such as <a-scene>, <a-box>, and <a-sky>) to describe 3D scenes declaratively, and builds on Three.js with an entity-component-system architecture for more advanced behavior.
- GitHub: github.com/aframevr/aframe - ~16k stars, MIT
- Best for: Rapid prototyping, education, simple VR/AR web demos
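A complete A-Frame scene fits in a single HTML file - no build step required. A minimal sketch (the release URL and version number are illustrative):

```html
<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- a-scene sets up the renderer, camera, and VR entry button -->
    <a-scene>
      <a-box position="-1 1.5 -3" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Opening this file on a headset browser shows an "Enter VR" button automatically - that zero-configuration path is A-Frame's core appeal.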
Babylon.js
Microsoft's Babylon.js is arguably the most feature-complete open source 3D engine for the web. With full PBR rendering, real-time shadows, Havok physics, and first-class WebXR support including VR, AR, and hand tracking, it punches well above its weight. Its WebXR Experience Helper makes adding immersive mode to any scene a single line of code. The engine ships with a full inspector, a node material editor, and excellent TypeScript support - making it attractive for enterprise development.
- GitHub: github.com/BabylonJS/Babylon.js - ~23k stars, Apache 2.0
- Best for: Production web apps, enterprise XR, game-quality rendering
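That "single line of code" claim is close to literal. A minimal sketch of the WebXR Experience Helper (browser-only; scene is assumed to be an already-constructed BABYLON.Scene):

```javascript
// Babylon's default XR experience wires up session management, an
// enter-VR button, controller input, and teleportation in one call.
async function addXRToScene(scene) {
  const xr = await scene.createDefaultXRExperienceAsync({
    // Switch to AR by requesting "immersive-ar" instead.
    uiOptions: { sessionMode: "immersive-vr" },
  });
  // xr.baseExperience exposes the session lifecycle and feature manager
  // (hand tracking, teleportation, etc.) for further configuration.
  return xr;
}
```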
Three.js & React Three Fiber
Three.js is the backbone of most web 3D - with over 100k GitHub stars it is the most widely used 3D library in JavaScript. Its WebXRManager handles device session setup, controller events, and the render loop automatically. React Three Fiber (R3F) brings Three.js into the React component model, making it the natural choice for teams already using React. The @react-three/xr package adds a clean XR controller and hand tracking API on top of R3F. Together they form one of the most productive WebXR stacks available.
- Three.js: github.com/mrdoob/three.js - ~102k stars, MIT
- React Three Fiber: github.com/pmndrs/react-three-fiber - ~27k stars, MIT
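Wiring Three.js to WebXR is a small amount of glue. A minimal sketch (browser-only; THREE and VRButton are passed in to keep the sketch dependency-free - in a real app you would import them from three and from the addons path, e.g. three/addons/webxr/VRButton.js):

```javascript
// Enable WebXR on a stock Three.js renderer and hand the render
// loop to the headset's frame timing.
function makeXRRenderer(THREE, VRButton) {
  const renderer = new THREE.WebGLRenderer({ antialias: true });
  renderer.xr.enabled = true; // let WebXRManager drive rendering
  document.body.appendChild(VRButton.createButton(renderer));

  const scene = new THREE.Scene();
  const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.1, 100);

  // Use setAnimationLoop (not requestAnimationFrame) so frames are
  // synchronized to the headset's refresh rate inside an XR session.
  renderer.setAnimationLoop(() => renderer.render(scene, camera));
  return renderer;
}
```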
2. Game Engines & Runtimes: Full-Stack XR Development
Godot Engine
Godot 4.x marked a monumental leap for open source game development, and XR is a first-class citizen. OpenXR support is built directly into the engine core - no plugin required for Quest, SteamVR, or Pico headsets. The Godot OpenXR Vendors plugin adds device-specific extensions (hand tracking, passthrough, eye tracking) for Meta, Pico, and other Khronos-conformant devices. As of March 2026, Godot ships an XR Project Setup Wizard that guides developers through configuration and produces export-ready projects in minutes. With its MIT license (zero royalties, no revenue thresholds), Godot is a genuine Unity alternative for XR.
- GitHub: github.com/godotengine/godot - ~92k stars, MIT
- Best for: Full VR/AR games, cross-platform XR, avoiding Unity/Unreal licensing
OpenXR
OpenXR is the single most important standard in XR development. Created by the Khronos Group (the same body behind OpenGL and Vulkan), it defines a portable, royalty-free API for both XR hardware and software. Before OpenXR, developers had to maintain separate codepaths for Oculus SDK, OpenVR, Windows Mixed Reality, and others. Today, Meta Quest, SteamVR, Pico, HoloLens 2, Varjo, and Magic Leap all ship OpenXR runtimes. The Khronos OpenXR SDK and conformance test suite are available on GitHub. If you're starting an XR project in 2026, OpenXR should be your runtime target.
- GitHub: github.com/KhronosGroup/OpenXR-SDK - Apache 2.0
LÖVR
LÖVR is a delightfully simple VR framework that lets you write VR experiences in Lua - the same lightweight scripting language used by Defold and, in modified form (Luau), by Roblox. If you just want to build something immersive without wrestling with a full engine, LÖVR's minimal API is refreshing. It supports OpenXR natively, works on desktop and standalone headsets, and has a friendly community. Great for research prototypes and artistic experiments.
- GitHub: github.com/bjornbytes/lovr - ~2.2k stars, MIT
3. AR Frameworks: Augmented Reality Without the SDK
AR.js
AR.js is the original open source web AR library - bringing marker-based, NFT (natural feature tracking of images), and location-based AR to any browser using WebGL and WebRTC. It requires no app install, works on iOS and Android, and integrates with both A-Frame and Three.js. While its image tracking accuracy has been surpassed by newer libraries, AR.js remains the easiest entry point for marker-based AR and city-scale location-based AR experiences.
- GitHub: github.com/AR-js-org/AR.js - ~16k stars, MIT
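A marker-based AR.js scene is just A-Frame markup plus the AR.js build. A minimal sketch - point the camera at the standard "hiro" marker to see the box (script URLs and versions are illustrative and may change):

```html
<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  </head>
  <body style="margin: 0; overflow: hidden;">
    <!-- "arjs" activates the camera feed and marker tracking -->
    <a-scene embedded arjs>
      <a-marker preset="hiro">
        <a-box position="0 0.5 0" color="#4CC3D9"></a-box>
      </a-marker>
      <a-entity camera></a-entity>
    </a-scene>
  </body>
</html>
```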
MindAR.js
MindAR.js is a more modern web AR library built on TensorFlow.js, offering robust image tracking and face tracking directly in the browser. No server-side processing is required - all ML inference runs client-side. It integrates cleanly with both Three.js and A-Frame, and its image tracking quality rivals native SDKs (think Snapchat-style lenses) for many use cases.
- GitHub: github.com/hiukim/mind-ar-js - ~3k stars, MIT
MediaPipe
Google's MediaPipe is the backbone of most open source hand and body tracking in XR. It provides production-ready ML pipelines for hand landmark detection (21 keypoints per hand), face mesh (468 landmarks), pose estimation (full body), and object detection - all running in real-time on CPU or GPU. MediaPipe runs on Android, iOS, the web (WASM), and desktop. Many open source XR projects use MediaPipe as their perception layer for hand tracking without dedicated VR gloves.
- GitHub: github.com/google/mediapipe - ~26k stars, Apache 2.0
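MediaPipe Hands reports 21 normalized landmarks per hand, where index 4 is the thumb tip and index 8 is the index-finger tip - enough to derive gestures yourself. A minimal sketch of pinch detection (the threshold value is an illustrative assumption, not a MediaPipe constant):

```javascript
// Euclidean distance between two MediaPipe landmarks ({x, y, z},
// normalized to the image frame; z may be absent in some outputs).
function distance3d(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, (a.z ?? 0) - (b.z ?? 0));
}

// A "pinch" when thumb tip (index 4) and index fingertip (index 8)
// are closer than an empirically chosen threshold.
function isPinching(landmarks, threshold = 0.05) {
  if (!landmarks || landmarks.length < 21) return false;
  return distance3d(landmarks[4], landmarks[8]) < threshold;
}
```

In a browser app you would feed this from the Hands solution's per-frame results (one 21-landmark array per detected hand), giving controller-free click input for WebXR scenes.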
4. Spatial Mapping & SLAM: Understanding the World
Simultaneous Localization and Mapping (SLAM) is the core technology behind inside-out tracking and persistent AR anchors. These open source libraries are what researchers and hardware makers use to build spatial awareness into devices.
ORB-SLAM3
ORB-SLAM3 from the University of Zaragoza is the most widely cited open source SLAM implementation. It supports monocular, stereo, RGB-D, and fisheye cameras, with optional IMU integration (visual-inertial SLAM), and includes loop-closure detection, map reuse across sessions, and multi-map support. While the GPLv3 license limits commercial use, ORB-SLAM3 is the gold standard for benchmarking and academic research.
- GitHub: github.com/UZ-SLAMLab/ORB_SLAM3 - ~9k stars, GPLv3
Kimera (MIT SPARK Lab)
Kimera is a full metric-semantic SLAM system from MIT's SPARK Lab. It combines visual-inertial odometry (Kimera-VIO), 3D mesh reconstruction, and semantic labeling into a single pipeline. If you're building AR applications that need to understand what objects are in the scene (not just their geometry), Kimera is one of the only open source systems that handles this end-to-end.
- GitHub: github.com/MIT-SPARK/Kimera-VIO - ~2.5k stars, BSD-2-Clause
5. Tools, Utilities & Infrastructure
Monado
Monado is the world's first open source OpenXR runtime, developed by Collabora and hosted on freedesktop.org. It runs on Linux and supports a wide range of hardware including Valve Index, WMR headsets, and tracked controllers via PSMove or Lighthouse. Monado is the runtime that powers WiVRn (wireless streaming) and is increasingly used in embedded and automotive XR systems. For any Linux XR development or headset manufacturer looking to implement OpenXR, Monado is the reference implementation.
- GitLab: gitlab.freedesktop.org/monado/monado - MIT
ALVR (Air Light VR)
ALVR lets you stream PC VR games wirelessly to Meta Quest headsets. Unlike the official Meta Air Link (proprietary), ALVR is fully open source, written in Rust for performance and safety, and supports SteamVR games out of the box. It uses a custom codec pipeline to minimize latency over Wi-Fi. ALVR is one of the most popular open source XR projects on GitHub and is actively maintained with frequent releases.
- GitHub: github.com/alvr-org/ALVR - ~5k stars, MIT
WiVRn
WiVRn is a newer wireless VR streaming solution built specifically for the Linux ecosystem. It integrates with Monado and streams OpenXR content to Android-based headsets (Quest, Pico) over Wi-Fi. Where ALVR focuses on SteamVR compatibility, WiVRn is architecture-first - designed for the open source OpenXR runtime stack. It is the recommended solution for Linux PC VR users who want full open source throughout.
- GitHub: github.com/WiVRn/WiVRn - ~1.2k stars, GPLv3
6. Creation & Content Tools
Blender XR Viewport
Blender - the world's most popular open source 3D creation tool - has shipped a native VR viewport (the VR Scene Inspection add-on, built on OpenXR) since version 2.83. Artists and developers can review their 3D scenes, animations, and environments directly in a VR headset without exporting. The XR viewport supports room-scale navigation, object snapping, and basic manipulation tools. It is particularly valuable for environment artists building XR content who want immediate spatial feedback.
- GitHub: github.com/blender/blender - ~12k stars, GPL-2.0
Mozilla Hubs & Spoke
Mozilla Hubs is an open source browser-based social VR platform - think VRChat but fully open and self-hostable. You join rooms with an avatar via WebXR, no app install needed. Spoke is the companion world-building editor that lets you compose 3D scenes using drag-and-drop. While Mozilla scaled back its hosted Hubs service, the codebase remains fully open source and a vibrant community maintains self-hosted deployments. Hubs is built on Three.js, A-Frame, and Phoenix (Elixir), making it a fascinating open source XR architecture study.
- GitHub: github.com/mozilla/hubs - ~2k stars, MPL 2.0
Getting Started: Which Stack for Which Use Case
Use Case: WebXR Demo or Interactive Experience
Stack: A-Frame + AR.js or MindAR.js for AR; A-Frame or Babylon.js for VR. Deploy as a static site (Vercel, Netlify, GitHub Pages). Zero backend needed. Works on Meta Quest Browser, Chrome, and iOS Safari.
Use Case: Indie VR Game
Stack: Godot 4.x + OpenXR + Godot OpenXR Vendors plugin. Export to Meta Quest, SteamVR, or Pico. Use GDScript or C#. MIT license means no royalties. Optionally stream via ALVR during development for wireless testing.
Use Case: AR Research or Prototype
Stack: ORB-SLAM3 or Kimera for SLAM + MediaPipe for hand/pose tracking + OpenCV for image processing. Connect via ROS2 for robot integration. Run on a laptop + RealSense D435i or OAK-D camera.
Use Case: Linux PC VR
Stack: Monado (OpenXR runtime) + SteamVR or native Godot + WiVRn for wireless streaming. Full open source stack from driver to headset.
Use Case: Enterprise Web Application with Spatial UI
Stack: React Three Fiber + @react-three/xr, with TypeScript throughout (Babylon.js, with its built-in GUI system, is a strong alternative engine). Works inside a standard React/Next.js app. Full WebXR feature set including hand tracking and depth sensing.
The Open Source XR Ecosystem: How It All Connects
The projects above do not exist in isolation - they form a layered ecosystem where each project builds on others. At the base, OpenXR is the universal runtime API that hardware vendors implement and software targets. Godot and LÖVR consume OpenXR at the engine level. Monado implements OpenXR for Linux. ALVR and WiVRn bridge the gap between OpenXR on PC and wireless Android headsets.
Above the runtime, game engines and web frameworks provide the developer experience. Three.js and Babylon.js talk to WebXR in browsers. React Three Fiber brings Three.js into the React paradigm. A-Frame abstracts Three.js with declarative HTML components. Each layer of abstraction trades performance for developer velocity - knowing where your use case sits on that spectrum is key.
The perception layer - what the device knows about the physical world - is powered by SLAM (ORB-SLAM3, Kimera) for spatial mapping, and by ML (MediaPipe, DepthAI) for semantic understanding of hands, faces, and objects. These systems feed into higher-level frameworks that expose anchors, hand joints, and meshing APIs through OpenXR extensions.
Finally, creation tools like Blender, Spoke, and Mozilla Hubs close the loop - enabling artists and designers to produce, preview, and deploy content without proprietary software. The result is a complete, vendor-neutral pipeline from scene creation to delivery.
The State of Open Source XR in 2026
The open source XR ecosystem in 2026 is more mature, more production-ready, and more interconnected than ever before. Godot's XR improvements, Babylon.js's enterprise adoption, ALVR's refined streaming pipeline, and the broad industry adoption of OpenXR have collectively raised the floor for what open source can deliver. Vendor lock-in is no longer an inevitability in XR development.
Whether you're building for the browser, a standalone headset, a Linux workstation, or a research robot, there is a high-quality open source path to production. The projects in this directory represent thousands of contributors and millions of developer-hours. Use them well - and consider contributing back.