The Architecture of Presence: A Comprehensive Analysis of Extended Reality Evolution and the Spatial Computing Paradigm

The global landscape of human-computer interaction is currently undergoing its most significant transformation since the transition from command-line interfaces to the graphical user interface. This shift is defined by the emergence of Extended Reality (XR), a multifaceted domain that serves as the technological and conceptual framework for spatial computing. By 2026, the industry has transitioned from a period of fragmented experimentation into a sophisticated ecosystem where hardware, software, and artificial intelligence converge to redefine the boundaries between the physical and digital worlds. The following analysis explores the intricate relationships within the XR domain, the engineering breakthroughs defining the current leading edge, and the socio-economic forces shaping a future where spatial computing is as ubiquitous as the smartphone.

The Taxonomy of Immersion and the Reality-Virtuality Continuum

At the heart of the XR domain lies a spectrum of technologies often conflated but technically distinct. Extended Reality (XR) functions as an expansive umbrella term that encompasses all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables.1 To understand the relationship between the constituent items—Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR)—it is necessary to revisit the foundational theory of the Reality-Virtuality (RV) Continuum.1

The Milgram-Kishino Framework

Proposed by Paul Milgram and Fumio Kishino in 1994, the RV Continuum provides a sliding scale that maps the degree of immersion and the composition of real versus virtual elements in a display environment.1 At the far left of the scale is the Real Environment, consisting solely of physical objects experienced through natural senses.3 At the far right is the Virtual Environment, a fully computer-generated world where the user’s sensory input is entirely replaced by digital information.3

The space between these two poles is classified as Mixed Reality (MR). Within this expanse, two primary sub-categories emerge based on the "anchor" of the experience. Augmented Reality (AR) represents a condition where digital elements are overlaid onto a predominantly physical world, enhancing the user’s perception without severing their connection to reality.4 Conversely, Augmented Virtuality (AV) describes the integration of physical objects—such as a live feed of a user's own body or a real-world tool—into a predominantly virtual environment.3

| Continuum Segment | Compositional Nature | Primary Anchor | Typical Interaction |
| --- | --- | --- | --- |
| Real Environment | 100% Physical | Natural Senses | Biological/Physical |
| Augmented Reality (AR) | Predominantly Physical + Digital Overlays | Physical World | Informational/Contextual |
| Mixed Reality (MR) | Balanced Blend of Physical and Digital | Blended Context | Interactive/Occlusive |
| Augmented Virtuality (AV) | Predominantly Digital + Physical Inserts | Virtual World | Hybrid Tooling/Presence |
| Virtual Reality (VR) | 100% Digital | Synthetic World | Immersive/Agency-driven |

Source: 1

The Discontinuity of Presence

Modern research in 2026 has refined this 1994 model, suggesting that the RV Continuum may be discontinuous.7 The "perfect" virtual reality—a state where a digital environment is indistinguishable from the physical world across all human senses—remains an asymptotic goal.7 Furthermore, the industry usage of "Mixed Reality" has evolved beyond Milgram’s broad definition. In contemporary commercial parlance, MR refers specifically to advanced AR systems capable of real-time interaction between real and virtual objects, characterized by spatial mapping, depth sensing, and occlusion.1 This convergence is exemplified by modern high-end headsets that can toggle between full VR immersion and high-fidelity passthrough MR, effectively spanning the entire continuum within a single hardware platform.1

Leading Edge Technologies in the 2026 Landscape

The current era of XR is defined by a radical departure from the limitations of early-generation hardware. The focus has shifted from incremental improvements in resolution to a total reimagining of the display, optical, and processing architectures required for "nausea-free" and visually transparent immersion.

Silicon-Based Display Architectures

The most significant shift in display technology is the transition to Micro-OLED, also known as Silicon-based OLED.10 Unlike traditional AMOLED or LCD panels that utilize glass or plastic substrates, Micro-OLED is manufactured directly on single-crystal silicon wafers using mature CMOS (Complementary Metal-Oxide-Semiconductor) processes.10 This semiconductor-based approach allows for the integration of display driver circuits directly into the silicon backplane, achieving pixel sizes roughly 1/10th of those found in traditional displays.10

By 2026, the scaling of 12-inch (300mm) wafer production lines by manufacturers such as SeeYa Technology, BOE, and Samsung has significantly lowered unit costs, allowing Micro-OLED to move from experimental high-end devices like the Apple Vision Pro into more affordable consumer hardware.10 These displays offer pixel densities exceeding 3,000 to 4,000 pixels per inch (PPI), effectively eliminating the "screen door effect" and enabling text-heavy productivity workflows that were previously impractical due to eye strain.10
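As a sanity check on these density figures, PPI follows directly from a panel's pixel dimensions and diagonal size. The panel numbers below are illustrative assumptions for a near-eye Micro-OLED display, not the specifications of any particular device:

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density = diagonal resolution in pixels / diagonal size in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# Hypothetical ~1.3-inch Micro-OLED panel at 3840 x 3552 pixels.
ppi = pixels_per_inch(3840, 3552, 1.3)
print(round(ppi))  # 4024 -- within the cited 3,000-4,000 PPI range
```

For comparison, a 6.1-inch 2556×1179 smartphone screen works out to roughly 460 PPI, an order of magnitude below these near-eye panels.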

Optical Engineering: From Fresnel to Waveguides

To accommodate the high resolutions of Micro-OLED displays while reducing device bulk, manufacturers have transitioned to "pancake" optics and waveguide combiners.11 Pancake lenses use a folded-path design that bounces light between multiple lens elements, allowing for a much slimmer headset profile and edge-to-edge clarity that Fresnel lenses could never achieve.9

For lightweight AR glasses, waveguide technology remains the standard. Waveguides use internal reflection within a thin lens to guide light from a micro-projector into the user's eye.12 Innovation in 2026 is centered on Surface Relief Grating (SRG) and Volume Holographic Grating (VHG) systems that offer wider fields of view (exceeding 150 degrees in prototypes) and higher brightness efficiency, which is critical for outdoor readability.8 Furthermore, the introduction of variable-focus liquid crystal lenses allows XR devices to provide continuous refractive adjustment (from -3.0 D to +3.0 D), eliminating the need for corrective eyewear within the headset for many users.13
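The diopter figures above relate directly to focal length (power in diopters is the reciprocal of focal length in meters). A minimal sketch, assuming the cited ±3.0 D adjustment range, of how a device might clamp a user's spherical prescription to what the liquid crystal lens can correct (the function names are hypothetical):

```python
def focal_length_m(diopters: float) -> float:
    """Lens power in diopters is the reciprocal of focal length in meters."""
    if diopters == 0:
        raise ValueError("0 D corresponds to a flat, infinite-focal-length state")
    return 1.0 / diopters

def clamp_prescription(diopters: float, lens_range=(-3.0, 3.0)) -> float:
    """Clamp a spherical prescription to the variable-focus lens's supported range."""
    lo, hi = lens_range
    return max(lo, min(hi, diopters))

print(clamp_prescription(-2.25))  # -2.25 (fully correctable in-lens)
print(clamp_prescription(-4.50))  # -3.0  (beyond range; residual correction needed)
print(focal_length_m(3.0))        # ~0.333 m
```

Users whose prescriptions fall outside the supported range would still need supplemental corrective inserts, which is why the text hedges with "for many users."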

The Proliferation of On-Device AI

Artificial Intelligence has become the primary engine for spatial awareness and user interaction. Leading-edge XR chipsets, moving toward sub-2nm neural SoC (System-on-Chip) architectures, are designed to handle the massive computational load of real-time SLAM (Simultaneous Localization and Mapping), gesture recognition, and eye tracking.12

| Hardware Component | Leading Edge Technology (2026) | Primary Benefit | Key Players |
| --- | --- | --- | --- |
| Display | Micro-OLED (Silicon-based) | 3,000+ PPI, zero screen-door effect | Sony, SeeYa, Samsung |
| Optics | Waveguide & Pancake Lenses | Miniaturization, FOV expansion | Goertek, Lumus, Meta |
| Processor | 2nm Neural SoCs | Real-time AI, low power consumption | Qualcomm, Apple, MediaTek |
| Sensing | LiDAR & High-Res Passthrough | Depth-aware MR, 11ms latency | Apple, Samsung, Meta |

Source: 9

The AI Integration: Predictive Tracking and Generative Realities

In the 2026 XR domain, AI is not merely an additive feature but a foundational component that bridges the gap between hardware limitations and human perception. The integration of Generative AI (GenAI) and predictive algorithms has fundamentally changed how immersive environments are built and experienced.16

Latency Reduction through Predictive Modeling

One of the most persistent barriers to XR adoption has been "motion-to-photon" latency—the delay between a user's movement and the corresponding update on the display. In high-stakes environments like surgical training or competitive gaming, even minor latency can cause discomfort or error.2 To solve this, developers are deploying deep learning-based predictive models, such as sequence-to-sequence (Seq2Seq) neural networks, to forecast a user's future head and hand poses.17

By analyzing past trajectories and velocities, these models can anticipate a user’s position 20-50 milliseconds in advance.17 This allows the rendering engine to pre-calculate frames, effectively reducing perceived latency to near-zero levels. When combined with foveated rendering—a technique where eye-tracking data is used to render only the small portion of the screen the user is looking at in high detail—the system can achieve massive gains in processing efficiency.12
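The cited Seq2Seq networks are beyond a short sketch, but the core idea, extrapolating a tracked pose forward by the pipeline's latency budget, can be shown with a constant-velocity model. This is a deliberate simplification (production predictors are learned and higher-order); the function name is illustrative:

```python
def predict_position(history, horizon_ms):
    """Extrapolate the next head position from the last two tracked samples.

    history: list of (timestamp_ms, (x, y, z)) samples, oldest first.
    horizon_ms: how far ahead to predict, e.g. the motion-to-photon budget.
    """
    (t0, p0), (t1, p1) = history[-2], history[-1]
    dt = t1 - t0
    # Estimate velocity from the two most recent samples, then project forward.
    velocity = tuple((b - a) / dt for a, b in zip(p0, p1))
    return tuple(p + v * horizon_ms for p, v in zip(p1, velocity))

# Head moving at a constant 1 mm/ms along x; predict 30 ms ahead.
samples = [(0, (0.0, 1.6, 0.0)), (10, (10.0, 1.6, 0.0))]
print(predict_position(samples, 30))  # (40.0, 1.6, 0.0)
```

Constant-velocity extrapolation overshoots when the head decelerates, which is precisely the error that learned sequence models reduce over the 20-50 ms horizons cited above.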

Generative AI and Gaussian Splatting

Content creation has traditionally been the "bottleneck" of the XR industry due to the high cost and complexity of 3D modeling. However, the emergence of Gaussian Splatting and GenAI toolchains has democratized content creation.18 Gaussian Splatting allows for the creation of photorealistic 3D representations from a sparse set of 2D images, enabling the rapid generation of digital twins and lifelike avatars.19
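At its core, splat rendering composites depth-sorted Gaussians front-to-back with alpha blending. The toy example below evaluates that compositing rule for a single pixel using isotropic 2D Gaussians, a heavily simplified, assumed formulation; real renderers project anisotropic 3D Gaussians to screen space and rasterize in tiles:

```python
import math

def splat_pixel(pixel_xy, splats):
    """Front-to-back alpha compositing of isotropic 2D Gaussians at one pixel.

    splats: list of (center_xy, sigma, opacity, rgb) tuples, sorted near-to-far.
    Returns (color, accumulated_alpha).
    """
    color = [0.0, 0.0, 0.0]
    transmittance = 1.0  # fraction of light not yet absorbed by nearer splats
    for (cx, cy), sigma, opacity, rgb in splats:
        d2 = (pixel_xy[0] - cx) ** 2 + (pixel_xy[1] - cy) ** 2
        alpha = opacity * math.exp(-d2 / (2 * sigma ** 2))
        for i in range(3):
            color[i] += transmittance * alpha * rgb[i]
        transmittance *= 1.0 - alpha
    return tuple(color), 1.0 - transmittance

# A red splat directly in front of a blue one, both centered on the pixel:
splats = [((0.0, 0.0), 1.0, 0.8, (1.0, 0.0, 0.0)),
          ((0.0, 0.0), 1.0, 0.8, (0.0, 0.0, 1.0))]
rgb, alpha = splat_pixel((0.0, 0.0), splats)
print(rgb)  # red dominates; the rear blue splat contributes only through residual transmittance
```

Because the blend is order-dependent, the near-to-far sort is what lets splats occlude one another correctly, the same property that makes the technique suitable for lifelike avatars and digital twins.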

Frameworks like "XR Blocks" and "XARP" now allow developers and AI agents to generate interactive 3D environments through natural language prompts.14 For instance, a user can describe a "collaborative meeting room with industrial aesthetic and whiteboard functionality," and the AI can procedurally generate the scene, assets, and interaction logic in real-time.14

Sensory Immersion: Spatial Audio and Haptic Synthesis

While visual fidelity is the most prominent aspect of XR, true presence—the psychological sensation of "being there"—is heavily dependent on auditory and tactile feedback.

Acoustic Ray Tracing and 6-DoF Sound

Spatial audio in 2026 has evolved from simple directional sound to a complex simulation of physics. Acoustic Ray Tracing, now performant on mobile XR hardware like the Meta Quest and Apple Vision Pro, models how sound waves reflect, diffract, and are occluded by physical and virtual geometry.22 This technology ensures that a digital voice sounds different when it originates from behind a virtual wall compared to an open space.22

By utilizing personalized Head-Related Transfer Functions (HRTF)—often created via a quick scan of the user's ears with a smartphone camera—XR systems can replicate the unique way an individual's physiology filters sound.24 This level of precision is critical for 6-Degrees-of-Freedom (6-DoF) experiences, where the audio rendering must adapt dynamically as the user moves through a virtual environment.23
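One measurable cue that personalized HRTFs capture is the interaural time difference (ITD): sound reaches the far ear slightly later than the near ear. Woodworth's spherical-head approximation below is an assumption for illustration, not how any shipping HRTF pipeline actually works, but it shows why individual head geometry matters:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degC

def itd_seconds(azimuth_deg: float, head_radius_m: float = 0.0875) -> float:
    """Woodworth's spherical-head interaural time difference.

    azimuth_deg: source angle from straight ahead (0 = front, 90 = hard right).
    head_radius_m: defaults to an average adult head radius.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius_m / SPEED_OF_SOUND) * (math.sin(theta) + theta)

print(f"{itd_seconds(0) * 1e3:.2f} ms")   # 0.00 ms: frontal source, no delay
print(f"{itd_seconds(90) * 1e3:.2f} ms")  # ~0.66 ms: source hard to one side
```

A difference of a few hundred microseconds is enough for the auditory system to localize a source, which is why a one-size-fits-all head model audibly degrades spatial accuracy compared to a personalized scan.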

Tactile Feedback and Haptics

Haptic technology has moved beyond basic vibration to include high-fidelity tactile sensations and even thermal feedback. Advanced haptic gloves and suits, such as the "TouchDIVER Pro," use flexible materials and actuators to simulate the texture, shape, and temperature of virtual objects.12 In 2026, haptics are being integrated into diverse fields, from providing "calming feedback" in automotive seats to helping neurodiverse individuals or those with sensory impairments navigate digital information through touch.25

Market Trends: The Transition to Mainstream Spatial Computing

The XR market in 2026 is characterized by a "Great Inversion" where the primary barriers are no longer technical but organizational and social.26 Despite this, the economic momentum behind spatial computing is undeniable.

Enterprise Adoption and Industrial ROI

The enterprise sector remains the primary engine of growth, with over 72% of Fortune 500 companies implementing XR for training, collaboration, or product design as of 2025.2 The Return on Investment (ROI) has become clear: XR training programs have been shown to reduce employee training time by 43% and improve knowledge retention by 32%.2

In manufacturing, the use of Intelligent Digital Twins (IDTs) allows for a 10-15% reduction in downtime and up to 30% faster product iteration.6 Large-scale manufacturers now maintain production digital twins on platforms like NVIDIA Omniverse, allowing global teams to collaborate on a factory floor that exists simultaneously in the physical and digital realms.27

The Surge of Smart Glasses

A pivotal trend in 2026 is the rapid growth of "Smart Glasses"—lightweight, AI-enabled wearables that look like traditional eyewear.13 For the first time, these devices represent roughly half of all XR shipments worldwide, up from just 25% in 2024.29 While they may not all offer full immersive VR, their always-on connectivity and AI-driven visual assistance (e.g., real-time translation, navigation) make them a more practical daily tool for many consumers than bulky headsets.13

| Market Forecast Metric | 2025 (Base) | 2030 (Projected) | 2035 (Projected) |
| --- | --- | --- | --- |
| Global XR Market Size | $253.5 Billion | $1.22 Trillion | $2.13 Trillion |
| Spatial Computing Hardware | $130.3 Billion | $421.2 Billion | ~$1.0 Trillion |
| Digital Twin Market | $12.8 Billion | $122.2 Billion | $572.0 Billion |
| Active AR App Downloads | 1.3 Billion | 5.5 Billion+ | Ubiquitous |

Source: 30

Regional Dynamics: The Asia-Pacific Acceleration

While North America currently leads in revenue share (37-41%) due to the presence of tech giants like Meta, Apple, and Microsoft, the Asia-Pacific region is the fastest-growing market.2 China, in particular, is forecasted to grow at a CAGR of 20.4%, reaching $63.9 billion by 2030.32 This growth is supported by a robust supply chain for Micro-OLED and waveguide components and strong government backing for "Smart City" initiatives that integrate spatial data into urban planning.10

Barriers, Privacy, and the Regulatory Landscape

As XR technology becomes more pervasive, it faces intense scrutiny regarding its social impact and data security. The "visible nature" of the hardware and concerns about digital isolation remain cultural barriers, but the most significant challenges are legal and ethical.8

The Privacy of Presence: Neural and Biometric Data

XR devices collect an unprecedented volume of personal data. An average headset captures over 12 GB of spatial and behavioral data per user monthly, including eye-tracking (which can reveal health conditions or sexual preferences), facial micro-expressions, and room-scanning data.2

In 2026, the regulatory landscape is responding with a "second wave" of privacy laws. States like Oregon and California have enacted bans on the sale of precise geolocation data (data that locates an individual to within 1,750 feet) and introduced strict protections for "neural data"—information derived from brain-computer interfaces or high-resolution biometric sensors.37 Companies are now required to implement "Privacy by Design" and "Data Minimization," ensuring that sensitive sensor data is processed on-device rather than stored in the cloud whenever possible.37
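In practice, data minimization often means degrading sensor precision on-device before anything leaves the headset. The sketch below coarsens a latitude/longitude fix onto a grid wider than the regulated 1,750-foot radius; the grid math is an illustrative assumption, not legal compliance guidance:

```python
# 1,750 feet is roughly 533 m; one degree of latitude is ~111,320 m,
# so a 0.01-degree grid (~1,113 m cells) comfortably exceeds that radius.
GRID_DEG = 0.01

def coarsen_location(lat: float, lon: float, grid_deg: float = GRID_DEG):
    """Snap a precise fix to a coarse grid point before it is shared off-device."""
    def snap(value: float) -> float:
        return round(round(value / grid_deg) * grid_deg, 6)
    return snap(lat), snap(lon)

precise = (45.523064, -122.676483)
print(coarsen_location(*precise))  # (45.52, -122.68)
```

The same pattern, reduce precision at the sensor, transmit only the coarse value, generalizes to eye-tracking (gaze regions instead of gaze points) and room scans (semantic labels instead of raw meshes).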

The Organizational Gap

In the professional world, the "Great Inversion" refers to the fact that while the technology is ready, many organizations are not.26 Friction points include a lack of standardized XR frameworks, difficulties in measuring "soft skill" ROI, and political resistance from IT departments that view XR as a security risk.2 To overcome this, successful enterprises are moving toward "Problem-First Scoping," where XR is treated not as a novelty but as a targeted solution for specific operational outcomes.26

Forecast: The Reign of the Universal Spatial Interface

The trajectory of the XR domain suggests that the industry is heading toward a singular "reigning" technology: The Integrated Mixed Reality (MR) Wearable.

Which Technology Will Reign?

In the immediate future (2026-2030), Video-Passthrough MR will dominate the high-end market.9 While optical see-through (looking through clear glass) remains the "holy grail" of AR, it still faces unresolved physics challenges regarding opacity, field of view, and outdoor brightness.1 Video-passthrough—where high-resolution cameras digitize the world and re-display it to the user with 11-12ms latency—allows for total control over every pixel.9 This enables virtual objects to look truly solid and allows for seamless transitions between AR and VR modes.1
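The 11-12 ms figure roughly matches the display's refresh budget: at 90 Hz, a passthrough pipeline has about one frame interval to capture, process, and redraw the world (the pairing of that latency figure with a 90 Hz refresh is an inference here, not a claim from the cited test). A back-of-envelope check:

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Time available per frame at a given display refresh rate."""
    return 1000.0 / refresh_hz

for hz in (90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
# 90 Hz -> 11.1 ms per frame
# 120 Hz -> 8.3 ms per frame
```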

However, by 2035, the "winner" will likely be a hybrid device that uses Dynamic Opacity Waveguides. These devices will look like standard glasses but will have lenses capable of turning 100% opaque on demand, effectively merging the smart glasses and VR headset categories into a single "Universal Spatial Interface".8

What the Future Looks Like (2035 Vision)

By 2035, spatial computing will have superseded the smartphone as the primary interface for human life. This future world will be characterized by:

  1. The Ubiquitous Spatial Web: Information will no longer be stored "in" a device but will be anchored to the physical world.1 A person walking into a grocery store will see their digital shopping list floating over the relevant aisles; a mechanic looking at an engine will see a real-time diagnostic overlay provided by an Intelligent Digital Twin.6
  2. 6G and Edge-Cloud Symbiosis: All-day battery life will be achieved not through massive batteries but by offloading heavy computation to "edge nodes" via ultra-low-latency 6G networks.8 The glasses will primarily serve as a display and sensor array, while the "brain" of the experience exists in the local network cloud.
  3. The End of Distance: Remote collaboration will become indistinguishable from physical presence. High-fidelity Gaussian avatars and 6-DoF spatial audio will allow for "spatial telepresence," making the physical location of workers irrelevant for most knowledge-based industries.23
  4. Empathy-Enabled Infrastructure: AI agents in our XR systems will proactively assist us based on our biometric signals. If a system detects rising cortisol levels during a complex task, it might automatically simplify the visual interface or provide calming audio cues to optimize performance.15

Strategic Conclusions and Recommendations

The evolution of Extended Reality from a niche gaming technology into the foundation of spatial computing represents a fundamental shift in the global economy. For businesses and developers, the implications are profound.

The "Great Inversion" suggests that the most successful players in the next decade will not necessarily be those with the fastest processors, but those who can solve the organizational and privacy challenges of immersion. There is a critical need for standardized cross-platform spatial maps and robust cybersecurity frameworks that protect user biometric data without stifling innovation.27

Furthermore, the surge in smart glasses shipments indicates that "utility" is currently outpacing "immersion" in the consumer market.29 Companies should prioritize AI-driven assistance and lightweight ergonomics over raw graphical power for mainstream applications. In the enterprise sector, the focus must remain on "Problem-First" integration, using digital twins and XR simulations to drive measurable efficiency rather than pilot projects for the sake of novelty.6

The horizon of spatial computing is defined by the seamless convergence of our physical and digital lives. As we move toward 2035, the "Extended" in Extended Reality will no longer describe a separate technology, but a standard feature of human existence. The transition is inevitable; the challenge lies in ensuring this new reality is built on a foundation of security, empathy, and human-centered design.

Works cited

  1. XR Includes VR AR MR Definition - The Ultimate Guide to Extended Realities, accessed March 17, 2026, https://inairspace.com/blogs/learn-with-inair/xr-includes-vr-ar-mr-definition-the-ultimate-guide-to-extended-realities
  2. Extended Reality Market Trends | Size & CAGR of 4.63%. - Industry Research, accessed March 17, 2026, https://www.industryresearch.biz/market-reports/extended-reality-market-112847
  3. Reality–virtuality continuum - Wikipedia, accessed March 17, 2026, https://en.wikipedia.org/wiki/Reality%E2%80%93virtuality_continuum
  4. What Is the Virtuality Continuum? — updated 2026 - IxDF, accessed March 17, 2026, https://ixdf.org/literature/topics/virtuality-continuum
  5. Extended Reality: AR, VR & MR – Information Technology Services - Carleton College, accessed March 17, 2026, https://www.carleton.edu/its/services/learning/ar-vr/
  6. xr technology and artificial intelligence, accessed March 17, 2026, https://xra.org/wp-content/uploads/2025/11/XRA-AI-and-XR-Pamphlet.pdf
  7. Revisiting Milgram and Kishino's Reality-Virtuality Continuum - Frontiers, accessed March 17, 2026, https://www.frontiersin.org/journals/virtual-reality/articles/10.3389/frvir.2021.647997/full
  8. The 2025 XR Landscape: A Comprehensive Analysis of Meta, Apple ..., accessed March 17, 2026, https://medium.com/antaeus-ar/the-2025-xr-landscape-a-comprehensive-analysis-of-meta-apple-and-googles-vision-for-spatial-5ed9f1cf9c61
  9. Best Mixed Reality Headsets for Enterprise (2026) - Treeview Studio, accessed March 17, 2026, https://treeview.studio/blog/best-mixed-reality-headsets
  10. The Micro-OLED Industry: The Golden Standard for XR Display in 2026, accessed March 17, 2026, https://www.panoxdisplay.com/news/micro-oled-industry-report-2026-xr-display-trends.html
  11. Best XR Headset 2025: The Ultimate Guide to the Future of Immersion – INAIRSPACE, accessed March 17, 2026, https://inairspace.com/blogs/learn-with-inair/best-xr-headset-2025-the-ultimate-guide-to-the-future-of-immersion
  12. The Global Extended Reality (XR) Market 2026-2036: Virtual Reality ..., accessed March 17, 2026, https://www.futuremarketsinc.com/the-global-extended-reality-xr-market-2026-2036-virtual-reality-vr-augmented-reality-ar-and-mixed-reality-mr-technologies/
  13. CES 2026 XR and Smart Glass Announcements Recap - Counterpoint, accessed March 17, 2026, https://counterpointresearch.com/en/insights/ces-2026-xr-and-smart-glass-announcements-recap
  14. XR Blocks: Accelerating AI + XR innovation - Google Research, accessed March 17, 2026, https://research.google/blog/xr-blocks-accelerating-ai-xr-innovation/
  15. How AI Enables XR Experiences: Guide by Onix, accessed March 17, 2026, https://onix-systems.com/blog/ai-in-xr-experiences
  16. When Generative Artificial Intelligence meets Extended Reality: A Systematic Review, accessed March 17, 2026, https://arxiv.org/html/2511.03282v1
  17. Pose Estimation and Pose Prediction for Reducing Latency in Virtual Reality, accessed March 17, 2026, https://www.researchgate.net/publication/388683119_Pose_Estimation_and_Pose_Prediction_for_Reducing_Latency_in_Virtual_Reality
  18. 1. AR and smart glasses become more prominent 2. Democratisation of XR content creation using generative AI 3. Increasing AI-enh - SURF, accessed March 17, 2026, https://www.surf.nl/files/2025-10/sf_ttr26_en_immersive.pdf
  19. XR Technologies in 2025. Introduction | by Deep Gan Team | Medium, accessed March 17, 2026, https://deepganteam.medium.com/xr-technologies-in-2025-ee18eb737c79
  20. Papers | IEEE VR 2025, accessed March 17, 2026, http://ieeevr.org/2025/program/papers/
  21. XARP: A Human-First and AI Agent-Ready Extended Reality Toolkit - arXiv.org, accessed March 17, 2026, https://arxiv.org/html/2508.04108v3
  22. Immersive, Realistic Audio in VR: Introducing Acoustic Ray Tracing in Audio SDK | Meta Horizon OS Developers, accessed March 17, 2026, https://developers.meta.com/horizon/blog/acoustic-ray-tracing-audio-sdk-meta-quest-developer-social-presence/
  23. Heterogeneous volumetric sound rendering for XR - Ericsson, accessed March 17, 2026, https://www.ericsson.com/en/blog/2026/3/heterogeneous-sound-sources
  24. Spatial Audio: The Ultimate Guide to Immersive Sound 2025 - Havit's, accessed March 17, 2026, https://havitsmart.com/blogs/havit-audio-center/spatial-audio-the-ultimate-guide-to-immersive-sound-in-2025
  25. Touching the Future: Revealing the Magic of Haptics | InterDigital.com, accessed March 17, 2026, https://www.interdigital.com/post/revealing-the-magic-of-haptics
  26. Exploring Organizational Readiness and Ecosystem Coordination for Industrial XR - arXiv, accessed March 17, 2026, https://arxiv.org/html/2601.09045v1
  27. Spatial Computing Market Size, Share & Forecast, 2030 - Mordor Intelligence, accessed March 17, 2026, https://www.mordorintelligence.com/industry-reports/spatial-computing-market
  28. XR Hardware Coming in 2026: AR & VR Releases - N-iX MR - Solutions, accessed March 17, 2026, https://mr.n-ix.com/ar-vr-releases-in-2026-confirmed-launches-delays-and-the-bigger-picture/
  29. XR, Spatial Computing and Smart Glasses Market Statistics 2026: Headset Sales, Revenue and Active Users - Treeview Studio, accessed March 17, 2026, https://treeview.studio/blog/xr-market-statistics-headset-sales-revenue-active-users
  30. Extended Reality (XR) Market Growth Drivers & Analysis - ReAnIn, accessed March 17, 2026, https://www.reanin.com/reports/extended-reality-market
  31. Extended Reality Market Forecast to Reach USD 2127.81 Billion by 2034, Expanding at 25.5% CAGR (2026–2034) - EIN Presswire, accessed March 17, 2026, https://www.einpresswire.com/article/889905004/extended-reality-market-forecast-to-reach-usd-2127-81-billion-by-2034-expanding-at-25-5-cagr-2026-2034
  32. Spatial Computing Market Size, Share & Forecast to 2030, accessed March 17, 2026, https://www.researchandmarkets.com/report/spatial-computing
  33. Digital Twin Market Size to Hit USD 572.03 Billion by 2035 - Precedence Research, accessed March 17, 2026, https://www.precedenceresearch.com/digital-twin-market
  34. Digital Twin Market Report 2026 to 2035, Forecast - The Business Research Company, accessed March 17, 2026, https://www.thebusinessresearchcompany.com/report/digital-twin-global-market-report
  35. Spatial Computing Market Size To Reach $469.8 Billion by 2030, accessed March 17, 2026, https://www.grandviewresearch.com/press-release/global-spatial-computing-market
  36. Privacy-preserving datasets of eye-tracking samples with applications in XR - ResearchGate, accessed March 17, 2026, https://www.researchgate.net/publication/368727671_Privacy-preserving_datasets_of_eye-tracking_samples_with_applications_in_XR
  37. Data privacy laws: what to expect for 2026 - Ketch, accessed March 17, 2026, https://www.ketch.com/blog/posts/us-privacy-laws-2026
  38. Privacy Laws Ring in the New Year: State Requirements Expand Across the U.S. in 2026, accessed March 17, 2026, https://www.bakerdonelson.com/privacy-laws-ring-in-the-new-year-state-requirements-expand-across-the-us-in-2026
  39. 20 State Privacy Laws in Effect in 2026: Key Dates & Changes | MultiState, accessed March 17, 2026, https://www.multistate.us/insider/2026/2/4/all-of-the-comprehensive-privacy-laws-that-take-effect-in-2026
  40. Privacy and Data - XR Association, accessed March 17, 2026, https://xra.org/advocacy/policy-areas/privacy/
  41. Apple Vision Pro has 3.5x lower latency than Meta Quest 3, test finds - FlatpanelsHD, accessed March 17, 2026, https://www.flatpanelshd.com/news.php?subaction=showfull&id=1707982486
  42. Spatial Computing Market Forecast 2030: A $600 Billion Paradigm Shift, accessed March 17, 2026, https://inairspace.com/blogs/learn-with-inair/spatial-computing-market-forecast-2030-a-600-billion-paradigm-shift
  43. Spatial Computing Market Size to Reach USD 3,978.0 Bn by 2034 | DMR, accessed March 17, 2026, https://dimensionmarketresearch.com/report/spatial-computing-market/
  44. XR Trends 2026: The Future of AR and VR for Business, accessed March 17, 2026, https://yordstudio.com/xr-trends-2026-the-future-of-ar-and-vr-for-business/
  45. Empathic Extended Reality in the Era of Generative AI, accessed March 17, 2026, https://www.sciexplor.com/articles/ec.2025.0009
  46. INTERACT: AI-powered extended reality platform for inclusive communication with real-time sign language translation and sentiment analysis. - Open Research Europe, accessed March 17, 2026, https://open-research-europe.ec.europa.eu/articles/6-71