AR/VR engineering is not software development with a headset strapped on. It is a fundamentally different discipline where your code runs inside a human perception loop — every dropped frame causes nausea, every misaligned coordinate breaks the illusion of presence, and every millisecond of latency disconnects the user from the virtual world. You are not building screens. You are building spaces. Your “UI” floats in three-dimensional space. Your “input system” tracks twenty-six hand joints at 90Hz. Your “rendering pipeline” must hit a frame budget so tight that a single unoptimized shader pass means the difference between immersion and motion sickness.
This guide evaluates every major AI coding tool through the lens of what AR/VR engineers actually build — not web apps, not mobile apps, but real-time spatial applications where quaternion math is daily work, where you juggle Unity C# and Unreal C++ across multiple target platforms, where your shader code must run on mobile GPUs with thermal throttling while maintaining stereo rendering at 90fps. We tested each tool on production XR tasks: setting up hand-tracked grab interactions, writing custom VR shaders, implementing teleportation systems, optimizing frame budgets, building WebXR experiences, and deploying across Quest, Vision Pro, and PCVR simultaneously.
If you build games but not specifically XR, see the Game Developers guide. If your work is primarily GPU shaders outside of XR, see the Graphics & GPU Programmers guide. This guide is specifically for engineers building immersive experiences — the intersection of real-time 3D, spatial computing, and human perception.
- Best free ($0): GitHub Copilot Free — decent Unity C# completions, basic shader snippets, 2,000 completions/mo covers hobbyist XR prototyping.
- Best overall ($20/mo): Cursor Pro — multi-file context handles Unreal C++ headers + implementation files together, shader preview integration, and strong cross-file awareness for XR Interaction Toolkit setups.
- Best for reasoning ($20/mo): Claude Code — strongest spatial math reasoning of any tool, best at understanding quaternion operations, coordinate system transforms, and complex interaction state machines.
- Best combo ($30/mo): Claude Code + Copilot Pro — Claude for complex spatial logic, architecture decisions, and performance optimization reasoning; Copilot for fast inline completions while writing MonoBehaviours and AActor subclasses in Unity/Unreal.
Why AR/VR Engineering Is Different
AR/VR engineering operates under constraints that break most AI coding tools’ assumptions about software development:
- Multi-engine workflows with incompatible paradigms: Unity uses C# with the MonoBehaviour lifecycle, component-based architecture, and garbage collection. Unreal uses C++ with the UObject hierarchy, UPROPERTY/UFUNCTION macros, and a reflection-driven garbage collector layered over manual C++ memory management. Custom engines use whatever their architects decided. An AR/VR engineer often works across multiple engines in the same year — sometimes the same month — and AI tools trained predominantly on web code struggle with both. The engine API surface is massive: Unity’s XR Interaction Toolkit alone has hundreds of classes, and Unreal’s motion controller component hierarchy spans dozens of headers across engine modules.
- Spatial computing is a fundamentally different paradigm: Web development operates in 2D screen space with pixel coordinates. AR/VR development operates in 3D world space with quaternions, transformation matrices, coordinate system conversions, and six degrees of freedom. You do not position a button at (x, y) — you position an interactable at (x, y, z) with a rotation expressed as a quaternion (w, x, y, z) and a scale that must account for the user’s IPD (interpupillary distance). Quaternion multiplication is non-commutative. Euler angles have gimbal lock. Transformation order matters. AI tools that suggest transform.rotation = new Vector3(90, 0, 0) instead of transform.rotation = Quaternion.Euler(90f, 0f, 0f) reveal they do not understand the domain.
- Shader programming spans multiple languages and compilation targets: Unity shaders use ShaderLab wrapping HLSL (or the Shader Graph visual system). Unreal uses HLSL through its Material Editor or custom USF files. WebXR uses GLSL through WebGL/WebGPU. Each platform has different capability tiers — Quest’s Adreno GPU does not support geometry shaders or tessellation. A shader that looks correct on a desktop 4090 will either fail to compile or run at 5fps on Quest 3. AI tools must understand not just shader syntax but platform-specific GPU constraints.
- Performance constraints are absolute, not aspirational: A web page that loads in 3 seconds instead of 1 is slow. A VR application that renders at 45fps instead of 90fps makes people physically ill. The frame budget for 90fps is 11.1 milliseconds — total, including CPU logic, physics, animation, rendering, compositing, and late-stage reprojection. On Quest, you also have thermal throttling that dynamically clocks down the GPU after sustained load. There is no “we’ll optimize later.” Performance is a correctness requirement from day one.
- Platform fragmentation is extreme: Meta Quest 2/3/Pro (Android, Snapdragon XR2), Apple Vision Pro (visionOS, M2), PCVR via SteamVR/OpenXR (Windows, any GPU), PlayStation VR2 (PS5, proprietary SDK), WebXR (any browser with WebXR support), Magic Leap 2 (Android, custom silicon), Pico 4 (Android, Snapdragon XR2). Each platform has different input capabilities, rendering pipelines, performance envelopes, and SDK requirements. Code that works on Quest will not compile for Vision Pro. Vision Pro’s eye tracking API has no equivalent on Quest. PSVR2’s haptic triggers are unique hardware.
- Physics and interaction systems are uniquely complex: Hand tracking provides 26 joint positions per hand at 30–90Hz with noisy, jittery data that must be filtered and debounced. Eye tracking provides gaze direction with saccade detection and fixation analysis. Haptic feedback requires precise timing correlated with collision events. Grab interactions need to handle physics objects, kinematic objects, and UI elements with different behaviors. Teleportation needs arc traces, valid landing area detection, and smooth transitions that do not trigger vestibular discomfort. None of these systems have equivalents in web or mobile development.
- Asset pipeline integration is tightly coupled to code: A VR application’s performance is determined as much by asset configuration as by code. Texture compression formats (ASTC for Quest, BC7 for PC), mesh LOD settings, audio spatialization parameters, animation compression, lightmap resolution — all of these must be configured correctly per platform. Code that loads a 4K uncompressed texture works fine on a 4090 and crashes on Quest. AI tools that generate code without understanding the asset pipeline generate code that works in the editor and fails on device.
- Spatial UI has no established conventions: Web has 30 years of UI conventions. Mobile has 15 years. Spatial UI has maybe 5 years of serious exploration, and the conventions are still forming. Should a settings menu float in world space or be attached to the user’s hand? Should buttons have depth or be flat planes? How far away should UI elements be to be comfortable to read without eye strain? Should UI follow gaze, follow head, or stay world-anchored? These are open design questions with no standard answers, and AI tools trained on established UI patterns provide actively misleading suggestions for spatial interfaces.
AR/VR Development Task Support Matrix
AR/VR engineers need tools that understand engine-specific XR APIs, spatial math, shader constraints, and the unique performance requirements of immersive applications. Here is how each AI tool handles the tasks that define XR development:
| XR Development Task | Copilot | Cursor | Windsurf | Claude Code | Amazon Q | Gemini CLI |
|---|---|---|---|---|---|---|
| Unity C# Development | Good | Strong | Good | Strong | Fair | Good |
| Unreal C++/Blueprint Development | Fair | Good | Fair | Strong | Weak | Fair |
| Shader & Material Programming | Fair | Strong | Fair | Good | Weak | Fair |
| Spatial Interaction & Input Systems | Fair | Good | Fair | Strong | Weak | Fair |
| Performance Optimization & Profiling | Fair | Good | Fair | Strong | Fair | Good |
| WebXR & Web-Based XR | Good | Good | Good | Good | Fair | Good |
| Cross-Platform Deployment | Fair | Good | Fair | Strong | Weak | Fair |
Reading the matrix: “Strong” means the tool reliably generates correct, production-quality code for the task with minimal editing. “Good” means it gets the structure right but needs manual correction for engine-specific details. “Fair” means it produces a starting point but frequently uses deprecated APIs or misunderstands XR-specific constraints. “Weak” means the tool’s output requires near-complete rewriting for XR use cases.
Unity C# Development for XR
Unity remains the most popular engine for AR/VR development, particularly for Quest and cross-platform targeting. XR development in Unity centers on the XR Interaction Toolkit (XRI), which provides the component-based architecture for hands, controllers, interactables, and locomotion. The codebase is heavily MonoBehaviour-driven with ScriptableObject data patterns, and the XR-specific APIs have evolved rapidly — XRI 3.0 introduced breaking changes to the interactor/interactable architecture that most AI tools have not fully absorbed into their training data.
The typical Unity XR workflow involves creating interactable objects that respond to hand or controller input, managing locomotion systems (teleportation, continuous move, snap turn), building spatial UI, and handling platform-specific features. Here is a complete hand-tracked grabbable object with haptic feedback — the kind of component every VR project needs:
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Interactables;
using UnityEngine.XR.Interaction.Toolkit.Interactors;
using UnityEngine.XR.Interaction.Toolkit.Inputs.Haptics; // HapticImpulsePlayer lives here in XRI 3.0
/// <summary>
/// A grabbable object with haptic feedback, velocity tracking for throwing,
/// and configurable grab behavior. Compatible with both hand tracking
/// and controller input via XRI 3.0.
/// </summary>
[RequireComponent(typeof(XRGrabInteractable))]
[RequireComponent(typeof(Rigidbody))]
public class HapticGrabbable : MonoBehaviour
{
[Header("Haptic Settings")]
[SerializeField, Range(0f, 1f)] private float _grabHapticIntensity = 0.4f;
[SerializeField, Range(0f, 1f)] private float _releaseHapticIntensity = 0.2f;
[SerializeField] private float _grabHapticDuration = 0.1f;
[SerializeField] private float _collisionHapticScale = 0.6f;
[Header("Throw Settings")]
[SerializeField] private float _throwVelocityScale = 1.5f;
[SerializeField] private float _throwAngularVelocityScale = 1.0f;
[SerializeField] private int _velocitySampleCount = 5;
[Header("Grab Behavior")]
[SerializeField] private bool _useGravityOnRelease = true;
[SerializeField] private float _attachEaseInDuration = 0.15f;
private XRGrabInteractable _grabInteractable;
private Rigidbody _rigidbody;
private IXRSelectInteractor _currentInteractor;
// Velocity tracking ring buffer
private Vector3[] _velocitySamples;
private Vector3[] _angularVelocitySamples;
private int _velocitySampleIndex;
private Vector3 _previousPosition;
private Quaternion _previousRotation;
private void Awake()
{
_grabInteractable = GetComponent<XRGrabInteractable>();
_rigidbody = GetComponent<Rigidbody>();
_velocitySamples = new Vector3[_velocitySampleCount];
_angularVelocitySamples = new Vector3[_velocitySampleCount];
ConfigureGrabInteractable();
}
private void OnEnable()
{
_grabInteractable.selectEntered.AddListener(OnGrabbed);
_grabInteractable.selectExited.AddListener(OnReleased);
}
private void OnDisable()
{
_grabInteractable.selectEntered.RemoveListener(OnGrabbed);
_grabInteractable.selectExited.RemoveListener(OnReleased);
}
private void ConfigureGrabInteractable()
{
_grabInteractable.throwOnDetach = false; // We handle throwing manually
_grabInteractable.attachEaseInTime = _attachEaseInDuration;
_grabInteractable.movementType = XRBaseInteractable.MovementType.VelocityTracking;
}
private void OnGrabbed(SelectEnterEventArgs args)
{
_currentInteractor = args.interactorObject;
_previousPosition = transform.position;
_previousRotation = transform.rotation;
_velocitySampleIndex = 0;
// Clear velocity history
for (int i = 0; i < _velocitySampleCount; i++)
{
_velocitySamples[i] = Vector3.zero;
_angularVelocitySamples[i] = Vector3.zero;
}
SendHapticImpulse(_grabHapticIntensity, _grabHapticDuration);
}
private void OnReleased(SelectExitEventArgs args)
{
if (_useGravityOnRelease)
_rigidbody.useGravity = true;
ApplyThrowVelocity();
SendHapticImpulse(_releaseHapticIntensity, _grabHapticDuration * 0.5f);
_currentInteractor = null;
}
private void FixedUpdate()
{
if (_currentInteractor == null) return;
// Sample velocity for accurate throwing
Vector3 currentVelocity = (transform.position - _previousPosition) / Time.fixedDeltaTime;
Quaternion deltaRotation = transform.rotation * Quaternion.Inverse(_previousRotation);
deltaRotation.ToAngleAxis(out float angle, out Vector3 axis);
if (angle > 180f) angle -= 360f;
Vector3 currentAngularVelocity = axis * (angle * Mathf.Deg2Rad / Time.fixedDeltaTime);
_velocitySamples[_velocitySampleIndex] = currentVelocity;
_angularVelocitySamples[_velocitySampleIndex] = currentAngularVelocity;
_velocitySampleIndex = (_velocitySampleIndex + 1) % _velocitySampleCount;
_previousPosition = transform.position;
_previousRotation = transform.rotation;
}
private void ApplyThrowVelocity()
{
Vector3 averageVelocity = Vector3.zero;
Vector3 averageAngularVelocity = Vector3.zero;
for (int i = 0; i < _velocitySampleCount; i++)
{
averageVelocity += _velocitySamples[i];
averageAngularVelocity += _angularVelocitySamples[i];
}
averageVelocity /= _velocitySampleCount;
averageAngularVelocity /= _velocitySampleCount;
_rigidbody.linearVelocity = averageVelocity * _throwVelocityScale;
_rigidbody.angularVelocity = averageAngularVelocity * _throwAngularVelocityScale;
}
private void SendHapticImpulse(float intensity, float duration)
{
if (_currentInteractor == null) return;
// XRI 3.0: Use HapticImpulseCommandChannel via the interactor
if (_currentInteractor is MonoBehaviour interactorBehaviour)
{
var hapticChannel = interactorBehaviour.GetComponentInParent<HapticImpulsePlayer>();
if (hapticChannel != null)
{
hapticChannel.SendHapticImpulse(intensity, duration);
}
}
}
private void OnCollisionEnter(Collision collision)
{
if (_currentInteractor == null) return;
// Scale haptic feedback by collision force
float impactForce = collision.relativeVelocity.magnitude;
float hapticIntensity = Mathf.Clamp01(impactForce * _collisionHapticScale * 0.1f);
if (hapticIntensity > 0.05f)
{
SendHapticImpulse(hapticIntensity, 0.05f);
}
}
}
What to look for in the AI output: Does the tool use XRI 3.0 namespaces (UnityEngine.XR.Interaction.Toolkit.Interactables) or the deprecated flat namespace? Does it use linearVelocity (Unity 6+) or the deprecated velocity? Does it handle the velocity ring buffer for smooth throwing, or does it naively use _rigidbody.velocity = interactorVelocity? Does it understand that HapticImpulsePlayer replaced the old SendHapticImpulse on XRBaseController?
Copilot: Generates structurally correct Unity C# with proper MonoBehaviour patterns. Knows XRGrabInteractable exists but frequently uses pre-XRI 3.0 namespaces and deprecated controller-based haptic APIs. Good for boilerplate — [SerializeField] fields, lifecycle methods, event subscription patterns — but needs manual correction for XR-specific API versions. Completions are fast and useful for filling in method bodies once the structure is set.
Cursor: Strong Unity C# support with multi-file context that helps when your interactable references ScriptableObject data, input action maps, and custom events defined in other files. Handles XRI patterns well when you include a reference file in context. The Composer feature is useful for scaffolding entire interaction systems across multiple MonoBehaviours. Occasionally mixes XRI 2.x and 3.0 patterns in the same file.
Windsurf: Generates reasonable Unity boilerplate but struggles with XRI-specific patterns. Often suggests the older XRDirectInteractor/XRRayInteractor setup patterns from XRI 1.x instead of the current input-action-driven architecture. Adequate for non-XR Unity work but needs significant manual correction for VR interaction code.
Claude Code: Strongest understanding of the XRI architecture and how interactors, interactables, and the interaction manager relate. When asked to build a grab system, it reasons about edge cases — what happens when an object is grabbed by two hands simultaneously, how to handle the transition from kinematic to physics-based movement on release, how velocity tracking must use a ring buffer to avoid single-frame spikes. The terminal-based workflow is less ideal for Unity development where you typically want IDE integration, but for complex systems design and debugging, the reasoning quality is unmatched.
Amazon Q: Generates generic C# that compiles but lacks engine-specific knowledge. Often produces MonoBehaviour code that misuses Start vs Awake ordering, does not understand serialization constraints, and has no awareness of XR Interaction Toolkit APIs. Not recommended for Unity XR work.
Gemini CLI: Reasonable Unity C# generation with awareness of common patterns. Handles basic XRI setup but struggles with the nuances of the interaction system — particularly around hand tracking versus controller input paths. Good for general Unity questions, less reliable for XR-specific architecture.
Unreal Engine C++ & Blueprint Development for VR
Unreal Engine’s VR development is built on the motion controller component system, with VR-specific classes inheriting from APawn or ACharacter. The codebase involves heavy use of UE macros (UPROPERTY, UFUNCTION, UCLASS), header/implementation file pairs, and the Gameplay Ability System for complex interaction logic. Unreal’s VR template provides a starting point, but production VR requires extensive custom C++ for performance-critical paths that Blueprints cannot handle efficiently.
Here is a VR teleportation component with arc trace visualization and haptic feedback — a fundamental locomotion system for any Unreal VR project:
// VRTeleportComponent.h
#pragma once
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Components/SplineMeshComponent.h"
#include "NavigationSystem.h"
#include "VRTeleportComponent.generated.h"
UCLASS(ClassGroup=(VR), meta=(BlueprintSpawnableComponent))
class MYPROJECT_API UVRTeleportComponent : public UActorComponent
{
GENERATED_BODY()
public:
UVRTeleportComponent();
UFUNCTION(BlueprintCallable, Category = "VR|Teleport")
void ActivateTeleport();
UFUNCTION(BlueprintCallable, Category = "VR|Teleport")
void ExecuteTeleport();
UFUNCTION(BlueprintCallable, Category = "VR|Teleport")
void DeactivateTeleport();
UFUNCTION(BlueprintPure, Category = "VR|Teleport")
bool IsValidTeleportDestination() const { return bHasValidDestination; }
protected:
virtual void TickComponent(float DeltaTime, ELevelTick TickType,
FActorComponentTickFunction* ThisTickFunction) override;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Trace")
float ArcVelocity = 900.0f;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Trace")
float ArcGravityOverride = -980.0f;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Trace")
float TraceRadius = 5.0f;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Trace")
int32 MaxSimulationSteps = 30;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Trace")
float SimulationStepSize = 0.05f;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Visuals")
UStaticMesh* ArcSegmentMesh;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Visuals")
UMaterialInterface* ValidMaterial;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Visuals")
UMaterialInterface* InvalidMaterial;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Visuals")
UStaticMesh* DestinationMarkerMesh;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Haptics")
float HapticIntensityOnValid = 0.3f;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Haptics")
float HapticIntensityOnTeleport = 0.6f;
UPROPERTY(EditDefaultsOnly, Category = "Teleport|Navigation")
FVector NavMeshQueryExtent = FVector(100.0f, 100.0f, 200.0f);
private:
void PerformArcTrace();
void UpdateArcVisualization(const TArray<FVector>& ArcPoints);
void ClearArcVisualization();
bool ProjectToNavMesh(const FVector& Point, FVector& OutProjected) const;
void PlayHapticFeedback(float Intensity, float Duration) const;
UPROPERTY()
TArray<USplineMeshComponent*> ArcMeshPool;
UPROPERTY()
UStaticMeshComponent* DestinationMarker;
FVector TeleportDestination;
bool bIsActive = false;
bool bHasValidDestination = false;
bool bWasValidLastFrame = false;
};
// VRTeleportComponent.cpp
#include "VRTeleportComponent.h"
#include "MotionControllerComponent.h"
#include "Kismet/GameplayStatics.h"
#include "GameFramework/PlayerController.h"
UVRTeleportComponent::UVRTeleportComponent()
{
PrimaryComponentTick.bCanEverTick = true;
PrimaryComponentTick.bStartWithTickEnabled = false;
}
void UVRTeleportComponent::ActivateTeleport()
{
bIsActive = true;
SetComponentTickEnabled(true);
if (!DestinationMarker)
{
DestinationMarker = NewObject<UStaticMeshComponent>(GetOwner());
DestinationMarker->SetStaticMesh(DestinationMarkerMesh);
DestinationMarker->SetCollisionEnabled(ECollisionEnabled::NoCollision);
DestinationMarker->RegisterComponent();
DestinationMarker->SetVisibility(false);
}
}
void UVRTeleportComponent::ExecuteTeleport()
{
if (!bHasValidDestination || !bIsActive) return;
PlayHapticFeedback(HapticIntensityOnTeleport, 0.2f);
// Calculate offset from camera to pawn root for accurate placement
APawn* OwnerPawn = Cast<APawn>(GetOwner());
if (!OwnerPawn) return;
APlayerController* PC = Cast<APlayerController>(OwnerPawn->GetController());
if (!PC || !PC->PlayerCameraManager) return;
FVector CameraLocation = PC->PlayerCameraManager->GetCameraLocation();
FVector PawnLocation = OwnerPawn->GetActorLocation();
// Only offset on XY plane; maintain height from nav mesh projection
FVector CameraOffset = CameraLocation - PawnLocation;
CameraOffset.Z = 0.0f;
FVector FinalDestination = TeleportDestination - CameraOffset;
FinalDestination.Z = TeleportDestination.Z;
OwnerPawn->TeleportTo(FinalDestination, OwnerPawn->GetActorRotation());
DeactivateTeleport();
}
void UVRTeleportComponent::DeactivateTeleport()
{
bIsActive = false;
bHasValidDestination = false;
bWasValidLastFrame = false;
SetComponentTickEnabled(false);
ClearArcVisualization();
if (DestinationMarker)
{
DestinationMarker->SetVisibility(false);
}
}
void UVRTeleportComponent::TickComponent(float DeltaTime, ELevelTick TickType,
FActorComponentTickFunction* ThisTickFunction)
{
Super::TickComponent(DeltaTime, TickType, ThisTickFunction);
if (!bIsActive) return;
PerformArcTrace();
}
void UVRTeleportComponent::PerformArcTrace()
{
UMotionControllerComponent* MotionController =
GetOwner()->FindComponentByClass<UMotionControllerComponent>();
if (!MotionController) return;
FVector StartPos = MotionController->GetComponentLocation();
FVector ForwardDir = MotionController->GetForwardVector();
FVector Velocity = ForwardDir * ArcVelocity;
TArray<FVector> ArcPoints;
ArcPoints.Reserve(MaxSimulationSteps + 1);
ArcPoints.Add(StartPos);
FVector CurrentPos = StartPos;
FVector CurrentVelocity = Velocity;
bool bFoundHit = false;
for (int32 Step = 0; Step < MaxSimulationSteps; ++Step)
{
FVector NextPos = CurrentPos + CurrentVelocity * SimulationStepSize;
CurrentVelocity.Z += ArcGravityOverride * SimulationStepSize;
FHitResult Hit;
FCollisionQueryParams QueryParams;
QueryParams.AddIgnoredActor(GetOwner());
if (GetWorld()->SweepSingleByChannel(
Hit, CurrentPos, NextPos, FQuat::Identity,
ECC_Visibility, FCollisionShape::MakeSphere(TraceRadius),
QueryParams))
{
ArcPoints.Add(Hit.Location);
bFoundHit = true;
// Check if hit location is on nav mesh
FVector ProjectedPoint;
bHasValidDestination = ProjectToNavMesh(Hit.Location, ProjectedPoint);
if (bHasValidDestination)
{
TeleportDestination = ProjectedPoint;
DestinationMarker->SetWorldLocation(ProjectedPoint);
DestinationMarker->SetVisibility(true);
// Haptic pulse on transition to valid
if (!bWasValidLastFrame)
{
PlayHapticFeedback(HapticIntensityOnValid, 0.1f);
}
}
else
{
DestinationMarker->SetVisibility(false);
}
bWasValidLastFrame = bHasValidDestination;
break;
}
ArcPoints.Add(NextPos);
CurrentPos = NextPos;
}
if (!bFoundHit)
{
bHasValidDestination = false;
bWasValidLastFrame = false;
DestinationMarker->SetVisibility(false);
}
UpdateArcVisualization(ArcPoints);
}
bool UVRTeleportComponent::ProjectToNavMesh(const FVector& Point,
FVector& OutProjected) const
{
UNavigationSystemV1* NavSys = FNavigationSystem::GetCurrent<UNavigationSystemV1>(GetWorld());
if (!NavSys) return false;
FNavLocation NavLocation;
bool bFound = NavSys->ProjectPointToNavigation(
Point, NavLocation, NavMeshQueryExtent);
if (bFound)
{
OutProjected = NavLocation.Location;
}
return bFound;
}
void UVRTeleportComponent::UpdateArcVisualization(const TArray<FVector>& ArcPoints)
{
if (ArcPoints.Num() < 2) return;
UMaterialInterface* CurrentMaterial = bHasValidDestination ? ValidMaterial : InvalidMaterial;
// Grow pool if needed
while (ArcMeshPool.Num() < ArcPoints.Num() - 1)
{
USplineMeshComponent* Segment = NewObject<USplineMeshComponent>(GetOwner());
Segment->SetStaticMesh(ArcSegmentMesh);
Segment->SetCollisionEnabled(ECollisionEnabled::NoCollision);
Segment->SetForwardAxis(ESplineMeshAxis::X);
Segment->RegisterComponent();
ArcMeshPool.Add(Segment);
}
for (int32 i = 0; i < ArcMeshPool.Num(); ++i)
{
if (i < ArcPoints.Num() - 1)
{
FVector StartTangent = (ArcPoints[i + 1] - ArcPoints[i]).GetSafeNormal() * 50.0f;
FVector EndTangent = StartTangent;
if (i + 2 < ArcPoints.Num())
{
EndTangent = (ArcPoints[i + 2] - ArcPoints[i]).GetSafeNormal() * 50.0f;
}
ArcMeshPool[i]->SetStartAndEnd(
ArcPoints[i], StartTangent,
ArcPoints[i + 1], EndTangent);
ArcMeshPool[i]->SetMaterial(0, CurrentMaterial);
ArcMeshPool[i]->SetVisibility(true);
}
else
{
ArcMeshPool[i]->SetVisibility(false);
}
}
}
void UVRTeleportComponent::ClearArcVisualization()
{
for (USplineMeshComponent* Segment : ArcMeshPool)
{
if (Segment)
{
Segment->SetVisibility(false);
}
}
}
void UVRTeleportComponent::PlayHapticFeedback(float Intensity, float Duration) const
{
    APawn* OwnerPawn = Cast<APawn>(GetOwner());
    if (!OwnerPawn) return;
    APlayerController* PC = Cast<APlayerController>(OwnerPawn->GetController());
    if (!PC) return;
    // PlayHapticEffect requires a UHapticFeedbackEffect asset; for a simple
    // parametric pulse, drive the haptics directly. A production version would
    // schedule a timer to zero the amplitude after Duration elapses.
    PC->SetHapticsByValue(1.0f, Intensity, EControllerHand::Right);
}
What to look for: Does the tool generate the header and source file together with matching signatures? Does it use GENERATED_BODY() correctly? Does it understand that UPROPERTY() is required for UObject pointers to prevent garbage collection? Does the arc trace account for camera-to-pawn offset when calculating the final teleport position? Does it use ProjectPointToNavigation for valid landing area detection?
Copilot: Generates syntactically correct Unreal C++ but frequently forgets UE macros on member variables, causing garbage collection issues. Handles basic AActor/UActorComponent patterns but struggles with the VR-specific class hierarchy. Often generates header-only code without the matching .cpp implementation, which is usable for small utilities but not for production VR components. Completions are faster than other tools for filling in UE boilerplate.
Cursor: The multi-file context is genuinely valuable for Unreal development where every class spans a .h and .cpp file. Cursor can hold both files in context simultaneously, keeping function signatures synchronized. Handles UPROPERTY and UFUNCTION macros well. The VR-specific API knowledge is decent but not deep — it knows MotionControllerComponent exists but may not know the correct setup for OpenXR hand tracking in UE 5.4+.
Windsurf: Produces basic Unreal C++ that compiles but misses VR-specific patterns. Often generates code using deprecated VR APIs from UE 4.x (the old MotionController plugin instead of OpenXR). The cascade editing feature is useful for propagating changes across .h/.cpp pairs but requires manual oversight for correctness.
Claude Code: Best understanding of the Unreal Engine object model and VR component architecture. When asked to build a teleportation system, it reasons about the camera offset problem (the pawn’s root is not where the player’s head is), the nav mesh projection requirement, and the need for haptic feedback on state transitions. The header/source split is handled correctly with matching UFUNCTION/UPROPERTY declarations. Weakest point is that the terminal workflow requires you to copy code back into your Unreal project manually, breaking the editor’s live compilation loop.
Amazon Q: Generates generic C++ that does not understand Unreal conventions. Missing UCLASS/UPROPERTY macros, incorrect include paths, no awareness of the Unreal build system (Build.cs module dependencies). Not usable for Unreal VR development without extensive rewriting.
Gemini CLI: Knows Unreal conventions at a surface level — generates UCLASS with GENERATED_BODY, uses UPROPERTY for exposed variables. VR-specific knowledge is limited. Often confuses Unreal’s VR template setup with custom implementation patterns, generating code that assumes the template’s exact class hierarchy rather than being composable.
Shader & Material Programming for XR
Shaders in AR/VR carry constraints that desktop rendering does not face. You are rendering the scene twice (once per eye) at 90fps, which means your fragment shader budget is roughly half what it would be for a single-view 60fps target. On Quest’s mobile GPU, you cannot use geometry shaders, tessellation, or compute-shader-based effects that desktop VR takes for granted. Every texture sample, every ALU operation, every branching instruction counts double because of stereo rendering.
Here is a Unity ShaderLab/HLSL custom shader for a hologram dissolve effect — the kind of visual effect commonly used in VR interfaces and sci-fi environments, written to be VR-performant:
Shader "Custom/VR_HologramDissolve"
{
Properties
{
_MainTex ("Main Texture", 2D) = "white" {}
_HoloColor ("Hologram Color", Color) = (0.0, 0.8, 1.0, 1.0)
_ScanlineSpeed ("Scanline Speed", Float) = 2.0
_ScanlineDensity ("Scanline Density", Float) = 80.0
_ScanlineIntensity ("Scanline Intensity", Range(0, 1)) = 0.3
_FresnelPower ("Fresnel Power", Range(0.1, 5.0)) = 2.0
_FresnelIntensity ("Fresnel Intensity", Range(0, 3)) = 1.5
_DissolveAmount ("Dissolve Amount", Range(0, 1)) = 0.0
_DissolveNoise ("Dissolve Noise", 2D) = "white" {}
_DissolveEdgeWidth ("Dissolve Edge Width", Range(0.01, 0.2)) = 0.05
_DissolveEdgeColor ("Dissolve Edge Color", Color) = (1.0, 0.3, 0.0, 1.0)
_FlickerSpeed ("Flicker Speed", Float) = 8.0
_FlickerIntensity ("Flicker Intensity", Range(0, 0.3)) = 0.1
}
SubShader
{
Tags
{
"RenderType" = "Transparent"
"Queue" = "Transparent"
"RenderPipeline" = "UniversalPipeline"
}
Blend SrcAlpha OneMinusSrcAlpha
ZWrite Off
Cull Back
Pass
{
Name "HologramDissolve"
Tags { "LightMode" = "UniversalForward" }
HLSLPROGRAM
#pragma vertex vert
#pragma fragment frag
#pragma multi_compile_instancing
#pragma multi_compile _ STEREO_INSTANCING_ON
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
struct Attributes
{
float4 positionOS : POSITION;
float3 normalOS : NORMAL;
float2 uv : TEXCOORD0;
UNITY_VERTEX_INPUT_INSTANCE_ID
};
struct Varyings
{
float4 positionCS : SV_POSITION;
float2 uv : TEXCOORD0;
float3 normalWS : TEXCOORD1;
float3 viewDirWS : TEXCOORD2;
float3 positionWS : TEXCOORD3;
UNITY_VERTEX_INPUT_INSTANCE_ID
UNITY_VERTEX_OUTPUT_STEREO
};
TEXTURE2D(_MainTex);
SAMPLER(sampler_MainTex);
TEXTURE2D(_DissolveNoise);
SAMPLER(sampler_DissolveNoise);
CBUFFER_START(UnityPerMaterial)
float4 _MainTex_ST;
float4 _DissolveNoise_ST;
half4 _HoloColor;
half4 _DissolveEdgeColor;
half _ScanlineSpeed;
half _ScanlineDensity;
half _ScanlineIntensity;
half _FresnelPower;
half _FresnelIntensity;
half _DissolveAmount;
half _DissolveEdgeWidth;
half _FlickerSpeed;
half _FlickerIntensity;
CBUFFER_END
Varyings vert(Attributes input)
{
Varyings output;
UNITY_SETUP_INSTANCE_ID(input);
UNITY_TRANSFER_INSTANCE_ID(input, output);
UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(output);
VertexPositionInputs posInputs = GetVertexPositionInputs(input.positionOS.xyz);
output.positionCS = posInputs.positionCS;
output.positionWS = posInputs.positionWS;
output.uv = TRANSFORM_TEX(input.uv, _MainTex);
output.normalWS = TransformObjectToWorldNormal(input.normalOS);
output.viewDirWS = GetWorldSpaceNormalizeViewDir(posInputs.positionWS);
return output;
}
half4 frag(Varyings input) : SV_Target
{
UNITY_SETUP_INSTANCE_ID(input);
UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
// Dissolve
float2 dissolveUV = TRANSFORM_TEX(input.uv, _DissolveNoise);
half noiseVal = SAMPLE_TEXTURE2D(_DissolveNoise, sampler_DissolveNoise, dissolveUV).r;
half dissolveThreshold = _DissolveAmount;
// Clip pixels below dissolve threshold
clip(noiseVal - dissolveThreshold);
// Dissolve edge glow
half edgeFactor = 1.0 - saturate((noiseVal - dissolveThreshold) / _DissolveEdgeWidth);
half3 edgeColor = _DissolveEdgeColor.rgb * edgeFactor * edgeFactor;
// Base texture
half4 baseTex = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv);
// Fresnel rim
half fresnel = pow(1.0 - saturate(dot(input.normalWS, input.viewDirWS)), _FresnelPower);
fresnel *= _FresnelIntensity;
// Scanlines (world-space Y for consistent look in VR)
half scanline = sin(input.positionWS.y * _ScanlineDensity + _Time.y * _ScanlineSpeed);
scanline = scanline * 0.5 + 0.5;
scanline = lerp(1.0, scanline, _ScanlineIntensity);
// Flicker
half flicker = 1.0 - _FlickerIntensity * step(0.97,
frac(sin(_Time.y * _FlickerSpeed) * 43758.5453));
// Compose
half3 holoColor = _HoloColor.rgb * baseTex.rgb;
holoColor += _HoloColor.rgb * fresnel;
holoColor *= scanline;
holoColor += edgeColor;
holoColor *= flicker;
half alpha = saturate(baseTex.a * _HoloColor.a + fresnel * 0.5) * flicker;
return half4(holoColor, alpha);
}
ENDHLSL
}
}
FallBack "Hidden/Universal Render Pipeline/FallbackError"
}
Why this matters for VR: The shader uses UNITY_SETUP_INSTANCE_ID, UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO, and UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX — these macros are essential for single-pass instanced stereo rendering, which is the default rendering mode on Quest and the most performant option for all VR platforms. A shader that omits these macros will either render only to one eye or require the more expensive multi-pass rendering path. The shader also uses half precision throughout the fragment shader, which is critical for mobile GPU performance — Adreno GPUs process half operations at twice the throughput of float.
Copilot: Generates syntactically valid HLSL but consistently misses the stereo rendering macros. Produces shaders that work in the editor’s scene view but render to only one eye in a VR build. Also tends to use float precision everywhere, halving performance on mobile GPUs. Knows basic ShaderLab property syntax and can scaffold a shader structure, but the VR-specific requirements are almost always missing.
Cursor: Strongest shader editing experience because it can hold the shader file, the C# script that drives shader properties, and the material configuration context simultaneously. Generates more complete shader code than Copilot, including proper URP includes. The stereo instancing macros are present about 60% of the time. Handles HLSL and GLSL reasonably well and can convert between them when asked, though the conversion is not always performance-optimal.
Windsurf: Produces basic shader boilerplate that compiles under the built-in render pipeline but frequently misses URP-specific includes (Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl) and generates legacy CG syntax instead of HLSL. Not aware of VR-specific shader requirements. Adequate for simple unlit shaders but not for production VR visual effects.
Claude Code: Understands the relationship between shader performance and VR frame budgets. When asked to write a VR shader, it proactively uses half precision, includes stereo instancing macros, and warns about operations that are expensive on mobile GPUs (dynamic branching, dependent texture reads, per-pixel normalize). The reasoning about why specific optimizations matter is the best of any tool. However, it cannot preview shader output, so the write-test-iterate cycle is slower than with an IDE-integrated tool.
Amazon Q: Generates shader code that looks structurally correct but uses outdated Unity shader syntax (CG instead of HLSL, built-in RP includes instead of URP). No awareness of VR rendering requirements. Not usable for VR shader work.
Gemini CLI: Produces reasonable HLSL with awareness of common shader patterns (fresnel, scanlines, dissolve). Occasionally includes stereo rendering macros but does not consistently generate VR-ready shaders. Better than Copilot for shader architecture questions, worse than Cursor or Claude Code for production VR shader code.
Spatial Interaction & Input Systems
Spatial interaction is the heart of AR/VR engineering. Hand tracking replaces mouse and keyboard. Eye tracking enables foveated rendering and gaze-based selection. Haptic feedback provides confirmation that screen-based UI handles with color changes and animations. These systems process high-frequency, noisy sensor data and must feel instantaneous — any perceptible latency between a hand gesture and the system’s response breaks the sense of presence.
Here is a pinch-to-select gesture recognizer with debouncing — the fundamental interaction primitive for hand-tracked AR/VR interfaces:
using UnityEngine;
using UnityEngine.XR.Hands;
using System;
/// <summary>
/// Recognizes pinch gestures from XR hand tracking data with
/// hysteresis-based debouncing to prevent flickering selections.
/// Uses the XR Hands package (com.unity.xr.hands) for joint data.
/// </summary>
public class PinchGestureRecognizer : MonoBehaviour
{
[Header("Detection Thresholds")]
[SerializeField, Tooltip("Distance in meters between thumb and index tips to trigger pinch")]
private float _pinchStartThreshold = 0.02f;
[SerializeField, Tooltip("Distance in meters to release pinch (must be > start threshold)")]
private float _pinchEndThreshold = 0.04f;
[SerializeField, Tooltip("Minimum pinch duration in seconds before registering as intentional")]
private float _minimumPinchDuration = 0.08f;
[SerializeField, Tooltip("Cooldown between consecutive pinch events")]
private float _pinchCooldown = 0.15f;
[Header("Filtering")]
[SerializeField, Range(0f, 1f), Tooltip("Low-pass filter strength for joint positions")]
private float _jointSmoothingFactor = 0.7f;
[SerializeField, Tooltip("Maximum joint velocity (m/s) to accept pinch (filters fast swipes)")]
private float _maxPinchVelocity = 2.0f;
[Header("Hand Selection")]
[SerializeField] private Handedness _handedness = Handedness.Right;
public event Action<PinchEventData> OnPinchStarted;
public event Action<PinchEventData> OnPinchHeld;
public event Action<PinchEventData> OnPinchEnded;
public struct PinchEventData
{
public Vector3 PinchPosition;
public float PinchStrength;
public float Duration;
public Handedness Hand;
}
private XRHandSubsystem _handSubsystem;
private Vector3 _smoothedThumbTip;
private Vector3 _smoothedIndexTip;
private Vector3 _previousThumbTip;
private bool _isPinching;
private bool _pinchConfirmed;
private float _pinchStartTime;
private float _lastPinchEndTime;
private void OnEnable()
{
TryGetHandSubsystem();
}
private void TryGetHandSubsystem()
{
var subsystems = new System.Collections.Generic.List<XRHandSubsystem>();
SubsystemManager.GetSubsystems(subsystems);
if (subsystems.Count > 0)
{
_handSubsystem = subsystems[0];
_handSubsystem.updatedHands += OnHandsUpdated;
}
}
private void OnDisable()
{
if (_handSubsystem != null)
{
_handSubsystem.updatedHands -= OnHandsUpdated;
}
}
private void OnHandsUpdated(XRHandSubsystem subsystem,
XRHandSubsystem.UpdateSuccessFlags updateFlags,
XRHandSubsystem.UpdateType updateType)
{
XRHand hand = _handedness == Handedness.Left
? subsystem.leftHand
: subsystem.rightHand;
if (!hand.isTracked) return;
XRHandJoint thumbTip = hand.GetJoint(XRHandJointID.ThumbTip);
XRHandJoint indexTip = hand.GetJoint(XRHandJointID.IndexTip);
if (!thumbTip.TryGetPose(out Pose thumbPose) ||
!indexTip.TryGetPose(out Pose indexPose))
return;
// Low-pass filter to reduce tracking jitter
_smoothedThumbTip = Vector3.Lerp(thumbPose.position, _smoothedThumbTip, _jointSmoothingFactor);
_smoothedIndexTip = Vector3.Lerp(indexPose.position, _smoothedIndexTip, _jointSmoothingFactor);
float distance = Vector3.Distance(_smoothedThumbTip, _smoothedIndexTip);
float thumbVelocity = (_smoothedThumbTip - _previousThumbTip).magnitude / Time.deltaTime;
_previousThumbTip = _smoothedThumbTip;
float pinchStrength = 1.0f - Mathf.InverseLerp(0.0f, _pinchEndThreshold, distance);
Vector3 pinchMidpoint = (_smoothedThumbTip + _smoothedIndexTip) * 0.5f;
ProcessPinchState(distance, thumbVelocity, pinchStrength, pinchMidpoint);
}
private void ProcessPinchState(float distance, float velocity,
float strength, Vector3 position)
{
float currentTime = Time.time;
if (!_isPinching)
{
// Hysteresis: use tighter threshold to START pinch
if (distance < _pinchStartThreshold
&& velocity < _maxPinchVelocity
&& (currentTime - _lastPinchEndTime) > _pinchCooldown)
{
_isPinching = true;
_pinchConfirmed = false;
_pinchStartTime = currentTime;
}
}
else
{
float duration = currentTime - _pinchStartTime;
// Confirm pinch after minimum hold duration
if (!_pinchConfirmed && duration >= _minimumPinchDuration)
{
_pinchConfirmed = true;
OnPinchStarted?.Invoke(new PinchEventData
{
PinchPosition = position,
PinchStrength = strength,
Duration = duration,
Hand = _handedness
});
}
// Continue reporting while held
if (_pinchConfirmed)
{
OnPinchHeld?.Invoke(new PinchEventData
{
PinchPosition = position,
PinchStrength = strength,
Duration = duration,
Hand = _handedness
});
}
// Hysteresis: use wider threshold to END pinch
if (distance > _pinchEndThreshold)
{
if (_pinchConfirmed)
{
OnPinchEnded?.Invoke(new PinchEventData
{
PinchPosition = position,
PinchStrength = strength,
Duration = duration,
Hand = _handedness
});
}
_isPinching = false;
_pinchConfirmed = false;
_lastPinchEndTime = currentTime;
}
}
}
}
The critical details: This implementation uses hysteresis (different thresholds for pinch start and end) to prevent flickering. It applies a low-pass filter to joint positions to reduce hand tracking jitter. It enforces a minimum hold duration to filter accidental touches. It rejects high-velocity finger movements to distinguish intentional pinches from hand swipes. It uses a cooldown period to prevent double-triggers. Every one of these details matters for a comfortable VR experience, and most AI tools miss most of them.
Copilot: Generates a basic distance check between thumb and index tip without any of the filtering, debouncing, or hysteresis logic. The resulting code would flicker constantly during real hand tracking sessions. Knows the XR Hands API exists but does not understand the data quality issues that require filtering. Useful for the initial class structure, not for the interaction logic.
Cursor: Produces a more complete gesture recognizer than Copilot, sometimes including a distance threshold with some debouncing. Rarely generates hysteresis or velocity filtering unprompted. The multi-file context helps when your gesture recognizer needs to integrate with a broader interaction system across multiple scripts. Good starting point that requires one round of detailed prompting to reach production quality.
Windsurf: Generates a naive pinch detector with a single distance threshold and no filtering. The code would be unusable with real hand tracking data. Does not understand the noise characteristics of optical hand tracking or the perceptual requirements for responsive but stable gesture detection.
Claude Code: The only tool that proactively includes hysteresis, low-pass filtering, minimum hold duration, and velocity rejection when asked to build a pinch gesture recognizer. It understands that hand tracking data is noisy and reasons about the tradeoffs between responsiveness (lower smoothing factor) and stability (higher smoothing factor). When prompted about the interaction, it explains why each filtering stage exists and how to tune the parameters for different use cases (precise selection vs. fast grabbing). This is the tool that thinks like an XR interaction engineer.
Amazon Q: Produces a distance-check function with no awareness of the XR Hands API, hand tracking data quality issues, or interaction design requirements. Not useful for spatial interaction work.
Gemini CLI: Generates a gesture recognizer with basic distance checking and sometimes a simple cooldown timer. Does not include filtering or hysteresis. Knows about hand tracking at a conceptual level but does not produce code that handles real-world tracking noise.
Performance Optimization for XR
Performance in VR is not a nice-to-have. It is a health and safety requirement. At 90fps, your entire frame — CPU simulation, GPU rendering, compositor submit — must complete in 11.1 milliseconds. Miss that budget and the runtime falls back on reprojection (Asynchronous TimeWarp and SpaceWarp on Quest, Motion Smoothing on SteamVR), which introduces visual artifacts. Miss it badly and users experience motion sickness. On Quest specifically, sustained high GPU load triggers thermal throttling, which dynamically reduces clock speeds and makes the problem worse in a feedback loop.
The frame budget breaks down roughly as follows on Quest 3: CPU game logic gets about 3–4ms, physics and animation get 1–2ms, render thread command submission gets 2–3ms, and the GPU gets the remaining 4–5ms for actual rendering. That GPU budget must cover two eye views via single-pass instanced rendering, plus any compositor layers for UI overlays.
Here is a dynamic quality scaler that monitors frame timing and adjusts rendering settings to maintain target framerate — an essential system for any VR application that ships on mobile hardware:
using UnityEngine;
using UnityEngine.Rendering.Universal;
using UnityEngine.XR;
/// <summary>
/// Dynamically adjusts rendering quality (resolution scale, LOD bias,
/// shadow distance, MSAA) based on GPU frame time to maintain target
/// VR framerate. Prevents thermal throttling on mobile XR devices.
/// </summary>
public class XRDynamicQualityScaler : MonoBehaviour
{
[Header("Frame Budget")]
[SerializeField] private float _targetFrameTimeMs = 11.1f; // 90fps
[SerializeField] private float _criticalFrameTimeMs = 13.5f; // Below reprojection threshold
[SerializeField] private float _comfortableFrameTimeMs = 9.0f; // Room to increase quality
[Header("Resolution Scaling")]
[SerializeField] private float _minRenderScale = 0.6f;
[SerializeField] private float _maxRenderScale = 1.2f;
[SerializeField] private float _renderScaleStep = 0.05f;
[Header("Quality Levels")]
[SerializeField] private float _minLodBias = 0.5f;
[SerializeField] private float _maxLodBias = 2.0f;
[SerializeField] private float _minShadowDistance = 15.0f;
[SerializeField] private float _maxShadowDistance = 50.0f;
[Header("Adaptation Speed")]
[SerializeField] private float _downgradeCooldown = 0.5f;
[SerializeField] private float _upgradeCooldown = 3.0f;
[SerializeField] private int _frameSampleCount = 15;
[SerializeField, Range(0f, 1f)] private float _frameTimeSmoothing = 0.85f;
private float _smoothedFrameTimeMs;
private float _lastDowngradeTime;
private float _lastUpgradeTime;
private float _currentRenderScale;
private int _currentQualityTier; // 0 = lowest, 3 = highest
private float[] _frameSamples;
private int _sampleIndex;
private UniversalRenderPipelineAsset _urpAsset;
private void Start()
{
_urpAsset = UniversalRenderPipeline.asset;
_currentRenderScale = XRSettings.eyeTextureResolutionScale;
_frameSamples = new float[_frameSampleCount];
_currentQualityTier = 2; // Start at medium-high
ApplyQualityTier(_currentQualityTier);
}
private void LateUpdate()
{
// Sample frame time (GPU + CPU combined)
float frameTimeMs = Time.unscaledDeltaTime * 1000.0f;
_frameSamples[_sampleIndex] = frameTimeMs;
_sampleIndex = (_sampleIndex + 1) % _frameSampleCount;
// Exponential moving average for smooth response
_smoothedFrameTimeMs = Mathf.Lerp(frameTimeMs, _smoothedFrameTimeMs, _frameTimeSmoothing);
float currentTime = Time.unscaledTime;
// Fast downgrade path: react quickly to frame drops
if (_smoothedFrameTimeMs > _criticalFrameTimeMs
&& (currentTime - _lastDowngradeTime) > _downgradeCooldown)
{
DowngradeQuality();
_lastDowngradeTime = currentTime;
}
// Slow upgrade path: cautiously increase quality when headroom exists
else if (_smoothedFrameTimeMs < _comfortableFrameTimeMs
&& (currentTime - _lastUpgradeTime) > _upgradeCooldown
&& AllRecentFramesBelowTarget())
{
UpgradeQuality();
_lastUpgradeTime = currentTime;
}
}
private bool AllRecentFramesBelowTarget()
{
for (int i = 0; i < _frameSampleCount; i++)
{
// Require a full window of valid samples: unfilled slots (still zero) fail verification
if (_frameSamples[i] <= 0.0f || _frameSamples[i] > _targetFrameTimeMs)
return false;
}
return true;
}
private void DowngradeQuality()
{
// Priority order: resolution first (biggest impact), then shadows, then LOD
// Resolution scaling has the most direct impact on GPU frame time
float newScale = _currentRenderScale - _renderScaleStep;
if (newScale >= _minRenderScale)
{
_currentRenderScale = newScale;
XRSettings.eyeTextureResolutionScale = _currentRenderScale;
return;
}
// If resolution is already at minimum, reduce quality tier
if (_currentQualityTier > 0)
{
_currentQualityTier--;
ApplyQualityTier(_currentQualityTier);
}
}
private void UpgradeQuality()
{
// Upgrade in reverse priority: quality tier first, then resolution
if (_currentQualityTier < 3)
{
_currentQualityTier++;
ApplyQualityTier(_currentQualityTier);
return;
}
float newScale = _currentRenderScale + _renderScaleStep;
if (newScale <= _maxRenderScale)
{
_currentRenderScale = newScale;
XRSettings.eyeTextureResolutionScale = _currentRenderScale;
}
}
private void ApplyQualityTier(int tier)
{
switch (tier)
{
case 0: // Survival mode
QualitySettings.lodBias = _minLodBias;
QualitySettings.shadowDistance = 0; // Disable shadows
_urpAsset.msaaSampleCount = 1; // Disable MSAA
break;
case 1: // Low
QualitySettings.lodBias = Mathf.Lerp(_minLodBias, _maxLodBias, 0.33f);
QualitySettings.shadowDistance = _minShadowDistance;
_urpAsset.msaaSampleCount = 2;
break;
case 2: // Medium
QualitySettings.lodBias = Mathf.Lerp(_minLodBias, _maxLodBias, 0.66f);
QualitySettings.shadowDistance = Mathf.Lerp(_minShadowDistance, _maxShadowDistance, 0.5f);
_urpAsset.msaaSampleCount = 4;
break;
case 3: // High
QualitySettings.lodBias = _maxLodBias;
QualitySettings.shadowDistance = _maxShadowDistance;
_urpAsset.msaaSampleCount = 4;
break;
}
}
#if UNITY_EDITOR
private void OnGUI()
{
GUILayout.BeginArea(new Rect(10, 10, 300, 120));
GUILayout.Label($"Frame Time: {_smoothedFrameTimeMs:F2} ms");
GUILayout.Label($"Render Scale: {_currentRenderScale:F2}");
GUILayout.Label($"Quality Tier: {_currentQualityTier}/3");
GUILayout.Label($"Target: {_targetFrameTimeMs:F1} ms ({1000f/_targetFrameTimeMs:F0} fps)");
GUILayout.EndArea();
}
#endif
}
Why this design matters: The scaler uses asymmetric adaptation speeds — fast downgrade (0.5s cooldown) and slow upgrade (3s cooldown with verification). This prevents oscillation where the system bounces between quality levels every few seconds, which is visually distracting. The downgrade path prioritizes resolution scaling because it has the most direct impact on GPU frame time without changing visual character. The upgrade path restores quality tiers first (shadows, LOD) before increasing resolution, because the quality features are more noticeable to the user than a slight resolution bump.
Copilot: Generates basic frame time monitoring but does not implement the asymmetric adaptation or priority-ordered quality adjustments. Often suggests changing Application.targetFrameRate, which is not how VR frame pacing works (the VR compositor controls frame timing). Knows XRSettings.eyeTextureResolutionScale exists but does not understand the broader quality scaling strategy.
Cursor: Produces a reasonable quality scaler with resolution adjustment and some frame time monitoring. Understands URP asset properties for runtime quality changes. The multi-file context helps when the scaler needs to integrate with custom render features or post-processing settings defined elsewhere. Does not proactively implement asymmetric timing or priority ordering without specific prompting.
Windsurf: Generates a simple frame counter that adjusts resolution on a fixed interval. No smoothing, no hysteresis, no quality tier system. The code would cause visible quality oscillation in any real VR application.
Claude Code: Understands VR frame budget constraints deeply. When asked to build a quality scaler, it proactively discusses the 11.1ms budget, explains why asymmetric adaptation prevents oscillation, reasons about the priority order of quality adjustments, and warns about thermal throttling on Quest. It generates the most complete initial implementation and can explain the tradeoffs of each design decision. The performance analysis capabilities — reasoning about what costs GPU time and what to cut first — are the strongest of any tool.
Amazon Q: Produces a generic frame rate monitor without VR awareness. Does not understand VR-specific timing requirements, resolution scaling, or the relationship between rendering quality and comfort. Not useful for XR performance work.
Gemini CLI: Generates a quality scaler with basic frame time monitoring and resolution adjustment. Includes some awareness of VR timing requirements. Does not implement the sophisticated adaptation logic needed for a production system but provides a reasonable starting point for further development.
WebXR Development
WebXR brings immersive experiences to the browser, removing the friction of app store distribution and native installation. The ecosystem has matured significantly: Three.js with its WebXR support, A-Frame for declarative XR scenes, and Babylon.js for full-featured 3D applications. WebXR is increasingly important for enterprise AR/VR deployments where IT policies restrict native app installation, and for experiences that need to be shared via a simple URL.
Here is a WebXR session with hand tracking using the Three.js WebXR hand input API:
import * as THREE from 'three';
import { XRControllerModelFactory } from 'three/addons/webxr/XRControllerModelFactory.js';
import { XRHandModelFactory } from 'three/addons/webxr/XRHandModelFactory.js';
class WebXRHandTrackingApp {
constructor() {
this.scene = new THREE.Scene();
this.camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.01, 100);
this.renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
this.renderer.setSize(window.innerWidth, window.innerHeight);
this.renderer.xr.enabled = true;
document.body.appendChild(this.renderer.domElement);
this.handModelFactory = new XRHandModelFactory();
this.hands = [];
this.pinchStates = [false, false]; // Track pinch per hand
this.interactables = [];
this.setupScene();
this.setupHands();
this.setupInteractables();
this.renderer.setAnimationLoop(this.render.bind(this));
}
async startSession() {
if (!navigator.xr) {
console.error('WebXR not supported');
return;
}
const supported = await navigator.xr.isSessionSupported('immersive-vr');
if (!supported) {
console.error('Immersive VR not supported');
return;
}
const session = await navigator.xr.requestSession('immersive-vr', {
requiredFeatures: ['local-floor'],
optionalFeatures: ['hand-tracking', 'layers']
});
await this.renderer.xr.setSession(session);
}
setupScene() {
// Ambient light for baseline visibility
const ambient = new THREE.AmbientLight(0x404040, 2);
this.scene.add(ambient);
// Directional light for depth cues
const directional = new THREE.DirectionalLight(0xffffff, 3);
directional.position.set(1, 3, 2);
this.scene.add(directional);
// Ground plane with grid for spatial reference
const gridHelper = new THREE.GridHelper(10, 20, 0x444444, 0x222222);
this.scene.add(gridHelper);
}
setupHands() {
for (let i = 0; i <= 1; i++) {
const hand = this.renderer.xr.getHand(i);
const handModel = this.handModelFactory.createHandModel(hand, 'mesh');
hand.add(handModel);
this.scene.add(hand);
// Add pinch indicator sphere at pinch point
const pinchIndicator = new THREE.Mesh(
new THREE.SphereGeometry(0.008, 16, 16),
new THREE.MeshStandardMaterial({
color: 0x00aaff,
emissive: 0x0044ff,
emissiveIntensity: 0.5,
transparent: true,
opacity: 0.0
})
);
hand.userData.pinchIndicator = pinchIndicator;
this.scene.add(pinchIndicator);
hand.addEventListener('pinchstart', (event) => {
this.onPinchStart(i, event);
});
hand.addEventListener('pinchend', (event) => {
this.onPinchEnd(i, event);
});
this.hands.push(hand);
}
}
setupInteractables() {
const geometries = [
new THREE.BoxGeometry(0.08, 0.08, 0.08),
new THREE.SphereGeometry(0.05, 32, 32),
new THREE.ConeGeometry(0.04, 0.1, 16),
new THREE.TorusGeometry(0.04, 0.015, 16, 32)
];
const colors = [0xff4444, 0x44ff44, 0x4444ff, 0xffaa00];
for (let i = 0; i < 4; i++) {
const mesh = new THREE.Mesh(
geometries[i],
new THREE.MeshStandardMaterial({
color: colors[i],
roughness: 0.4,
metalness: 0.3
})
);
// Position objects in an arc in front of user at comfortable reach distance
const angle = (i / 4) * Math.PI * 0.6 - Math.PI * 0.3;
mesh.position.set(
Math.sin(angle) * 0.4,
1.2 + (i % 2) * 0.1,
-0.3 + Math.cos(angle) * 0.1
);
mesh.userData.isInteractable = true;
mesh.userData.originalColor = colors[i];
mesh.userData.isGrabbed = false;
mesh.userData.grabHand = null;
mesh.userData.grabOffset = new THREE.Vector3();
this.scene.add(mesh);
this.interactables.push(mesh);
}
}
onPinchStart(handIndex) {
this.pinchStates[handIndex] = true;
const hand = this.hands[handIndex];
// Get pinch position from thumb tip and index tip joints
const indexTip = hand.joints['index-finger-tip'];
const thumbTip = hand.joints['thumb-tip'];
if (!indexTip || !thumbTip) return;
const pinchPos = new THREE.Vector3()
.addVectors(indexTip.position, thumbTip.position)
.multiplyScalar(0.5);
// Convert to world space
const worldPinchPos = pinchPos.clone();
hand.localToWorld(worldPinchPos);
// Find closest interactable within grab range
let closest = null;
let closestDist = 0.06; // 6cm grab radius
for (const obj of this.interactables) {
if (obj.userData.isGrabbed) continue;
const dist = worldPinchPos.distanceTo(obj.position);
if (dist < closestDist) {
closestDist = dist;
closest = obj;
}
}
if (closest) {
closest.userData.isGrabbed = true;
closest.userData.grabHand = handIndex;
closest.userData.grabOffset.copy(closest.position).sub(worldPinchPos);
closest.material.emissive.setHex(closest.userData.originalColor);
closest.material.emissiveIntensity = 0.4;
}
}
onPinchEnd(handIndex) {
this.pinchStates[handIndex] = false;
for (const obj of this.interactables) {
if (obj.userData.isGrabbed && obj.userData.grabHand === handIndex) {
obj.userData.isGrabbed = false;
obj.userData.grabHand = null;
obj.material.emissive.setHex(0x000000);
obj.material.emissiveIntensity = 0;
}
}
}
render(timestamp, frame) {
// Update grabbed objects to follow hand pinch point
for (const obj of this.interactables) {
if (!obj.userData.isGrabbed) continue;
const hand = this.hands[obj.userData.grabHand];
const indexTip = hand.joints['index-finger-tip'];
const thumbTip = hand.joints['thumb-tip'];
if (!indexTip || !thumbTip) continue;
const pinchPos = new THREE.Vector3()
.addVectors(indexTip.position, thumbTip.position)
.multiplyScalar(0.5);
const worldPinchPos = pinchPos.clone();
hand.localToWorld(worldPinchPos);
obj.position.copy(worldPinchPos).add(obj.userData.grabOffset);
}
// Update pinch indicators
for (let i = 0; i <= 1; i++) {
const hand = this.hands[i];
const indicator = hand.userData.pinchIndicator;
if (!indicator) continue;
const indexTip = hand.joints['index-finger-tip'];
const thumbTip = hand.joints['thumb-tip'];
if (indexTip && thumbTip) {
const mid = new THREE.Vector3()
.addVectors(indexTip.position, thumbTip.position)
.multiplyScalar(0.5);
hand.localToWorld(mid);
indicator.position.copy(mid);
const dist = indexTip.position.distanceTo(thumbTip.position);
const opacity = THREE.MathUtils.smoothstep(1.0 - dist / 0.05, 0, 1);
indicator.material.opacity = opacity;
}
}
this.renderer.render(this.scene, this.camera);
}
}
// Entry point
const app = new WebXRHandTrackingApp();
document.getElementById('enter-vr').addEventListener('click', () => {
app.startSession();
});
Copilot: Generates clean Three.js code and knows the basic WebXR session setup. Handles navigator.xr.requestSession correctly and understands the renderer.xr API. Hand tracking support is hit-or-miss — sometimes generates code using the older XRHand API patterns instead of the current Three.js hand input events. Good for scaffolding a WebXR scene, less reliable for hand tracking specifics.
Cursor: Solid WebXR generation with awareness of Three.js addons for XR controllers and hand models. The multi-file context works well for WebXR projects where you have scene setup, interaction logic, and UI components in separate modules. Handles the session configuration (required vs optional features) correctly and knows which features are widely supported.
Windsurf: Produces working WebXR boilerplate with Three.js. The JavaScript/TypeScript generation quality is higher than its C# or C++ output for engine-based XR. Handles basic session setup and controller input. Hand tracking code is less reliable.
Claude Code: Generates complete WebXR applications with good architectural decisions (separation of scene setup, input handling, and rendering). Understands the WebXR feature negotiation model (requiredFeatures vs optionalFeatures) and can reason about cross-browser compatibility issues. Hand tracking support is accurate. The main limitation is the same as with engine-based development — no visual preview, so the iteration cycle for 3D scene layout is slower.
Amazon Q: Produces basic Three.js code but has limited WebXR-specific knowledge. Session setup is usually correct but the XR-specific interaction code (controllers, hands, input sources) frequently uses incorrect or deprecated API patterns. Acceptable for basic WebXR scaffolding with manual correction.
Gemini CLI: Generates reasonable WebXR code with Three.js. Understands the session lifecycle and basic controller input. Hand tracking code is present but not always accurate. JavaScript quality is decent overall, making this a viable option for WebXR prototyping when combined with manual review of XR-specific APIs.
What AI Tools Get Wrong About AR/VR
After extensive testing across all major AI coding tools, these are the most common and most dangerous errors in AI-generated XR code:
- Suggesting deprecated VR APIs (SteamVR plugin instead of OpenXR): AI tools frequently generate code using the legacy SteamVR plugin (the Valve.VR namespace), the old Oculus Integration package, or Unity’s deprecated built-in XR support instead of the current OpenXR standard. OpenXR is the cross-platform runtime that all major headsets now support. Code written against vendor-specific SDKs is harder to maintain, locks you to a single platform, and is increasingly unsupported. The correct path is OpenXR with the XR Interaction Toolkit for Unity, and the OpenXR plugin for Unreal.
- Wrong coordinate system assumptions (Y-up vs Z-up confusion): Unity uses Y-up left-handed coordinates. Unreal uses Z-up left-handed coordinates. WebXR uses Y-up right-handed coordinates. OpenXR uses Y-up right-handed coordinates. AI tools frequently generate spatial math code that assumes the wrong coordinate system, especially when converting between formats (importing FBX models, translating between engines, interfacing with tracking APIs). A rotation that looks correct in one coordinate system produces upside-down or mirrored results in another. This is one of the most time-consuming bugs to diagnose because the code compiles and runs — it just puts objects in the wrong orientation.
- Ignoring VR performance budgets (suggesting post-processing effects that kill framerate): AI tools routinely suggest adding bloom, depth of field, screen-space ambient occlusion, screen-space reflections, and motion blur to VR scenes. Motion blur in VR is actively harmful — the VR runtime handles reprojection, and shader-based motion blur causes nausea. Depth of field is wrong because VR headsets have a fixed focal plane and the user’s eyes handle accommodation naturally. SSAO and SSR are extremely expensive for stereo rendering. Every full-screen post-processing effect costs double in VR because it runs per eye.
- Desktop-first shader code that does not work on mobile GPUs: AI tools generate shaders using features that are unavailable or prohibitively slow on Quest’s Adreno GPU: geometry shaders, tessellation, compute shaders for rendering effects, more than 4 render targets, high iteration count loops, and extensive use of `discard` (which breaks tile-based rendering optimizations). The shader compiles on PC and crashes or runs at single-digit framerates on Quest.
- Missing platform-specific compilation directives: A VR application that ships on both Quest and PCVR needs platform-specific code paths for features like passthrough, hand tracking, eye tracking, and spatial anchors. AI tools generate single-platform code without `#if UNITY_ANDROID` / `#if UNITY_STANDALONE` guards or Unreal’s platform macros. The code works on the developer’s PC and fails to compile or behaves incorrectly on the target device.
- Treating spatial UI like flat UI (suggesting Canvas-based UI instead of world-space): AI tools suggest Unity’s Screen Space Overlay canvas for VR UI, which does not work in VR at all — the canvas renders on top of the stereo output and appears at the wrong depth, causing eye strain and breaking stereoscopic rendering. VR UI must be a World Space canvas rendered at a comfortable distance (1–2 meters) with a billboard orientation or head-locked position. The interaction model is also different: no mouse pointer, no click events — you need ray interactors or poke interactors from the XR Interaction Toolkit.
- Outdated Unity XR packages (pre-XRI 3.0 patterns): XR Interaction Toolkit 3.0 (released mid-2024) restructured the interactor/interactable architecture, moved classes into sub-namespaces, renamed several key components, and changed the input binding model. AI tools trained on pre-3.0 code generate `XRDirectInteractor` patterns that do not work with the current package structure, reference classes in the wrong namespace, and use the deprecated controller-based input path instead of the new input-action-driven architecture.
- Ignoring single-pass instanced rendering requirements: Single-pass instanced stereo rendering is the default and most performant VR rendering mode. It requires shaders to include stereo instancing macros (`UNITY_VERTEX_INPUT_INSTANCE_ID`, `UNITY_VERTEX_OUTPUT_STEREO`, etc.) and custom render features to handle the instanced draw calls correctly. AI-generated shaders almost always omit these macros, resulting in shaders that render to only one eye or require falling back to the much more expensive multi-pass rendering mode. This is the single most common shader bug in AI-generated VR code.
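To make that last failure mode concrete, here is a minimal unlit shader skeleton with the stereo instancing macros in place — a sketch for Unity’s built-in pipeline based on its single-pass instanced rendering documentation (URP uses equivalently named macros); the shader name is illustrative:

```hlsl
Shader "Example/StereoInstancedUnlit"  // hypothetical name
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                UNITY_VERTEX_INPUT_INSTANCE_ID   // per-eye instance ID on the input
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                UNITY_VERTEX_OUTPUT_STEREO       // routes output to the correct eye
            };

            v2f vert(appdata v)
            {
                v2f o;
                UNITY_SETUP_INSTANCE_ID(v);
                UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
                o.pos = UnityObjectToClipPos(v.vertex);
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // Required before sampling any per-eye (screen-space) texture.
                UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
                return fixed4(1, 1, 1, 1);
            }
            ENDCG
        }
    }
}
```

Delete any one of those macros and the shader still compiles — it just renders to a single eye on device, which is why this bug survives AI generation and desktop testing alike.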
AI coding tools are trained overwhelmingly on screen-based software — web apps, mobile apps, desktop applications. Spatial computing represents a tiny fraction of their training data. The core concepts that AR/VR engineers work with daily — quaternion algebra, stereo rendering pipelines, hand tracking noise filtering, vestibular comfort, thermal throttling, and the 11ms frame budget — are underrepresented in every tool’s knowledge base. The tools are useful for code structure and boilerplate, but every line of XR-specific logic must be reviewed by someone who understands the domain. A shader that renders incorrectly in one eye is not a bug report — it is a health hazard.
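The coordinate-system pitfall in particular is worth seeing in code. A minimal engine-agnostic sketch in plain JavaScript, using the axis conventions listed above — the function names are illustrative, not any engine’s API, and asset import settings can add further axis flips on top of these:

```javascript
// Unity:  Y-up, left-handed, meters      (x = right, y = up, z = forward)
// Unreal: Z-up, left-handed, centimeters (x = forward, y = right, z = up)
// OpenXR: Y-up, right-handed, meters     (x = right, y = up, z = backward)

// Unity -> Unreal: a cyclic axis permutation (handedness preserved) plus m -> cm.
function unityToUnrealPos([x, y, z]) {
  return [z * 100, x * 100, y * 100];
}

// OpenXR -> Unity: flip Z to switch handedness...
function openxrToUnityPos([x, y, z]) {
  return [x, y, -z];
}

// ...and negate the quaternion's x and y components so rotations stay
// consistent with the mirrored Z axis.
function openxrToUnityRot([x, y, z, w]) {
  return [-x, -y, z, w];
}
```

Getting one mapping backwards is exactly the “compiles and runs but everything is mirrored” bug described above, so verify conversions against a known asset before trusting generated spatial math.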
Cost Model for AR/VR Engineers
AR/VR engineers range from solo hobbyists building Quest experiments to XR studios with multi-million dollar contracts. Here is how AI tool costs scale across the spectrum:
| Profile | Stack | AI Tool Cost / mo | Best For |
|---|---|---|---|
| Hobbyist VR developer | Copilot Free + Unity Personal + Quest Developer Hub | $0 | Weekend VR projects, game jams, learning XR development — 2,000 completions/mo covers light usage |
| Indie XR studio developer | Copilot Pro + Unity Pro | $10 | Small team shipping to Quest — unlimited completions for fast iteration on interaction code and shaders |
| Professional AR/VR engineer | Claude Code + Unity Pro or Unreal Engine | $20 | Full-time XR engineers who need deep reasoning for spatial math, performance optimization, and complex interaction systems |
| Lead XR developer / technical artist | Claude Code + Copilot Pro | $30 | Best of both worlds — Claude for architecture, perf analysis, and spatial reasoning; Copilot for fast inline completions in Unity/Unreal editors |
| XR studio team (per seat) | Cursor Business + Claude Code Team + engine licenses | $60–$99 | Studios building enterprise AR/VR (training simulations, digital twins, medical XR) needing team management, SSO, and IP protection |
The economics: A professional VR engineer at a studio earns $120K–$200K annually. A $30/mo tool subscription ($360/year) represents less than 0.3% of compensation. The ROI question is simple: does the tool save more than 15 minutes per month? For most XR engineers, Copilot alone saves that much on MonoBehaviour boilerplate in the first week. The real value of Claude Code at $20/mo is in the performance optimization and spatial math reasoning — catching a shader that would fail on Quest before you spend an hour profiling on device, or correctly reasoning about quaternion composition for a multi-joint hand tracking system. At the studio level, Cursor Business at $60–$99/seat is justified by the multi-file context alone — Unreal VR projects routinely span hundreds of .h/.cpp pairs, and holding the relevant files in context during code generation eliminates a class of type mismatch and missing include errors.
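The “quaternion composition for a multi-joint hand tracking system” mentioned above is the kind of math worth sanity-checking by hand before trusting a generated version. A minimal sketch in plain JavaScript — a standard Hamilton product with illustrative helper names, not any engine’s API:

```javascript
// Hamilton product q1 * q2: apply q2's rotation, then q1's.
// Quaternions are [x, y, z, w].
function qmul([x1, y1, z1, w1], [x2, y2, z2, w2]) {
  return [
    w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
    w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
    w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
  ];
}

// World rotation of a fingertip = wrist * knuckle * fingertip local
// rotations, composed down the joint chain. Order matters: quaternion
// multiplication does not commute.
function jointWorldRotation(localRotations) {
  return localRotations.reduce(qmul);
}
```

Two 90° rotations about the same axis compose to a 180° rotation, as expected; swap the multiplication order on rotations about different axes and the hand pose changes entirely, which is why generated hand-tracking code needs review by someone who can predict the result.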
The Bottom Line
AR/VR engineering is one of the hardest domains for AI coding tools to serve well. The training data is thin compared to web and mobile development. The APIs change rapidly as the XR ecosystem matures. The performance constraints are strict and platform-specific. The spatial computing paradigm is fundamentally different from the screen-based development that dominates AI training corpora.
The most effective setup for most professional XR engineers is Claude Code ($20/mo) for complex spatial reasoning, performance optimization, and architecture decisions, plus Copilot Pro ($10/mo) for fast inline completions during the daily grind of writing MonoBehaviours and UActorComponents = $30/mo total. If you work primarily in Unreal with large C++ codebases, consider Cursor Pro ($20/mo) instead of Copilot for its multi-file context across header/implementation pairs. If you are a hobbyist or learning XR development, Copilot Free ($0) covers the basics and lets you focus your budget on hardware (the Quest 3 is still the best development target for most VR projects).
Every tool will generate VR code that looks plausible but fails on device. The shader will compile on PC and render to one eye on Quest. The interaction system will work with controllers but break with hand tracking. The performance will be fine in the editor and miss the frame budget by 4ms on device. This is not a tool limitation that will be fixed next quarter — it is a structural consequence of spatial computing being a small fraction of the training data. Use AI tools for structure, boilerplate, and reasoning assistance. Verify everything on the target headset. The user wearing your VR application will tell you immediately — with their stomach — whether your code actually works.
Compare all tools and pricing on the CodeCosts homepage. For game development beyond XR, see the Game Developers guide. For GPU shader work outside of VR, see the Graphics & GPU Programmers guide. For cross-platform mobile deployment, see the Mobile Developers guide. For full-stack web with 3D integration, see the Full-Stack Engineers guide.
Related on CodeCosts
- AI Coding Tools for Game Developers (2026) — Unity, Unreal, Godot, shader languages, engine-specific APIs
- AI Coding Tools for Graphics & GPU Programmers (2026) — HLSL, GLSL, Vulkan, compute shaders, rendering pipelines
- AI Coding Tools for Mobile Developers (2026) — iOS, Android, cross-platform frameworks, app store deployment
- AI Coding Tools for Full-Stack Engineers (2026) — frontend, backend, databases, deployment pipelines