Polymer: Immersive Computing Engine
Immersive media reaches far beyond games, spanning content creation, cinematic storytelling, visualization, and simulation. Unity and Unreal are flexible enough to support these experiences, but where the engines offer limited built-in facilities, many developers have codified design patterns and feature implementations as plugins and samples: projects like TurboButton, Virtual Reality Toolkit (VRTK), Unity's EditorXR, and Google's Daydream Elements.
Thanks to a wealth of open-source libraries and middleware, building a base engine layer is not as difficult as it once was. As a byproduct of being a relatively new domain, immersive computing platforms are built almost exclusively on modern hardware, reducing the need to support legacy graphics APIs and the engine baggage that comes with them. Armed with the OpenVR device abstraction (and its standardized sibling, OpenXR), I saw an open opportunity for a new project that caters to spatial computing developers.
Polymer is the start of a desktop-class immersive computing engine with built-in abstractions for AR/VR design and prototyping. The audience for Polymer is tiny and reflects my personal development toolchain: native C++ targeting desktop-class virtual reality headsets on Windows.
The project intends to supplement the tools available to experience developers, not to replace well-supported, ship-your-game engines. A core design principle of Polymer is an architecture that invites code-level experimentation and hacking in ways that might be tedious in larger codebases: directly interfacing with novel hardware, input methods, or rendering backends, for instance. Polymer embodies a "data + code" philosophy, which helpfully narrows its definition considerably: it is not a generic game development environment with scripting, an artist-friendly asset import pipeline, and shader authoring. Instead, it provides abstractions useful to AR/VR experience developers, such as locomotion/teleportation, reticles, controllers and haptics, and physics-aware interactive objects.
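To give a flavor of what a locomotion abstraction looks like at the code level, here is a minimal sketch of the arc-teleportation math such an engine might expose. This is illustrative only and assumes nothing about Polymer's actual API; the names (`float3`, `compute_teleport_arc`) are hypothetical. It integrates a ballistic arc from a controller pose until it hits a flat ground plane, which is the standard basis for a teleport reticle.

```cpp
#include <cmath>
#include <vector>

// Hypothetical minimal vector type; a real engine would use its own math library.
struct float3 { float x, y, z; };

// Sample a ballistic arc launched from `origin` along unit direction `dir`
// with initial speed `v`, under gravity `g`, stepping by `dt` seconds.
// Integration stops when the arc drops below the y = 0 ground plane (the
// final point is clamped onto it) or after `max_samples` steps. The returned
// points can drive both the rendered arc and the teleport destination.
inline std::vector<float3> compute_teleport_arc(float3 origin, float3 dir,
                                                float v, float g = 9.81f,
                                                float dt = 0.05f,
                                                int max_samples = 128)
{
    std::vector<float3> points;
    float3 p = origin;
    float3 vel { dir.x * v, dir.y * v, dir.z * v };
    for (int i = 0; i < max_samples; ++i)
    {
        points.push_back(p);
        vel.y -= g * dt;     // semi-implicit Euler: update velocity, then position
        p.x += vel.x * dt;
        p.y += vel.y * dt;
        p.z += vel.z * dt;
        if (p.y <= 0.f)
        {
            points.push_back({ p.x, 0.f, p.z }); // clamp landing point to the ground
            break;
        }
    }
    return points;
}
```

In an engine like the one described here, a function of this shape would sit behind a higher-level teleportation component that adds validity checks (navigable surfaces, obstruction) and haptic/reticle feedback on top of the raw arc.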
Polymer is an early-stage project. Prior development focused on creating a tiny core engine capable of supporting more complex ideas. With consumer 6DoF standalone devices on the horizon, Polymer's supported platforms will soon include mobile hardware. Excitement around these future devices was a motivating factor in open-sourcing Polymer in its present form. As the code evolves to include a reusable hand/controller/interaction model, now seems like an ideal moment to shift focus from core development towards building a critical mass of samples and examples, and to invite community experimentation beyond my personal prototyping.
Postscript A: Polymer consists of ideas and code from a wide variety of sources. Notable influences include Google's Lullaby, Microsoft's Mixed Reality Toolkit, NVIDIA's Falcor, Unity's EditorXR, VRTK, Morgan McGuire's G3D, Marko Pintera's bs::framework, the OpenVR Unity SDK, and Vladimír Vondruš' Magnum.
Postscript B: It was a calculated risk to share a name with Google's Polymer web component library. Ultimately, the overlap between developers interested in this project and in Google's should be tiny, and context makes the two hard to confuse.