Apple Vision Pro Update Introduces Foveated Streaming
Apple’s visionOS 26.4 update adds Foveated Streaming, allowing Vision Pro to run immersive apps by streaming sharp content from other computers with minimal delay.
Foveated Streaming lets apps span the headset and external computers: immersive visuals come from a remote machine, while interactions stay fast because input is processed directly on the headset.
This update also adds support for demanding simulation software. Programs such as Microsoft Flight Simulator and Laminar Research's X-Plane 12 typically run on powerful PCs; streaming lets them deliver complex visuals to Vision Pro while users interact through the headset.
The update splits processing between the headset and an external computer. Unlike most streaming services, which render everything remotely, visionOS 26.4 composites local and remote content in a single scene.
Interactive parts, like cockpit controls, are handled locally on the headset with RealityKit, while detailed backgrounds are generated externally and streamed in, saving bandwidth.
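The split described above amounts to layering locally rendered content over a streamed background. A toy sketch of that composition step, in Python for illustration (on device this would be done by RealityKit and the system compositor, not application code):

```python
def compose_frame(remote_layer, local_layer):
    """Overlay locally rendered pixels onto the streamed background.

    Each layer is a dict mapping (x, y) to an RGBA tuple. Local pixels
    with nonzero alpha win over the remote background; real compositors
    blend rather than overwrite, which is simplified away here.
    """
    frame = dict(remote_layer)            # start from the streamed scene
    for pos, rgba in local_layer.items():
        if rgba[3] > 0:                   # locally rendered, visible pixel
            frame[pos] = rgba             # cockpit controls, menus, etc.
    return frame
```

The point of the split is latency: the local layer can be redrawn every frame from fresh input, while the remote layer is free to arrive a few frames behind.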
This technique, called foveated rendering, concentrates image quality where you are looking (‘foveated’ refers to the fovea, the part of the eye responsible for sharp central vision) and reduces detail elsewhere, keeping the focal region sharp while lowering data use for the rest of the frame.
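The gaze-driven detail allocation can be sketched as a weight that falls off with distance from the gaze point, used to divide a streaming bit budget across tiles of the frame. The radii, decay constant, and tile grid below are illustrative assumptions, not Apple's actual parameters:

```python
import math

def foveated_quality(gaze_x, gaze_y, px, py, inner_radius=0.1, falloff=4.0):
    """Return a 0..1 quality weight for a point based on distance from gaze.

    Coordinates are normalised to [0, 1]. Points within `inner_radius`
    of the gaze get full quality; quality decays exponentially beyond it.
    """
    d = math.hypot(px - gaze_x, py - gaze_y)
    if d <= inner_radius:
        return 1.0
    return math.exp(-falloff * (d - inner_radius))

def bit_budget(width, height, gaze, total_bits):
    """Distribute a streaming bit budget across a coarse tile grid."""
    tiles = [(x / width, y / height) for y in range(height) for x in range(width)]
    weights = [foveated_quality(gaze[0], gaze[1], tx, ty) for tx, ty in tiles]
    total = sum(weights)
    return [total_bits * w / total for w in weights]
```

Dividing, say, 10 Mbit across an 8×8 grid this way sends most of the data to the tiles under the user's gaze and only a trickle to the periphery.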
Interactive features are still handled by the headset. Moving objects, tracking your hands, and using the interface all work on the device at full speed, while visuals from other computers are streamed in at the same time. Apple says the system is built to keep response times under 20 milliseconds for important interactions, even while streaming visuals.
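One way to reason about that 20 ms target is as a budget that only the locally processed pipeline stages count against, since remote rendering and network transport run concurrently behind the streamed layer. A minimal sketch; the stage names and timings are hypothetical, not a real visionOS profiler API:

```python
LOCAL_BUDGET_MS = 20.0  # Apple's stated target for key interactions

def within_interaction_budget(stages_ms):
    """Check whether the locally processed stages of a frame fit the budget.

    `stages_ms` maps pipeline stage names to measured milliseconds.
    Only local stages count toward the interaction budget; streamed
    visuals arrive on their own schedule and do not block input.
    Returns (fits, total_local_ms).
    """
    local = ("input_sampling", "hand_tracking", "local_render", "compose")
    spent = sum(stages_ms.get(s, 0.0) for s in local)
    return spent <= LOCAL_BUDGET_MS, spent
```

Note that a slow stage like remote-frame decoding can exceed the budget without affecting the check, which mirrors the article's point: streamed visuals may lag, but interactions must not.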
Demanding apps, such as flight simulators, can display advanced scenes by streaming from external computers: large environments are rendered remotely and shown on Vision Pro, while cockpit controls are handled on the headset for a smooth interaction experience.
This model supports apps needing both intensive visuals and spatial interactivity, such as architecture, design, and engineering tools.
Software built for this streaming system must keep user interactions and visuals as separate components: local features (controls, menus) have to stay in sync with the remote visuals streamed from another computer. Achieving this may require dedicated servers and fast networks to deliver sharp images to the headset with minimal lag.
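Keeping local controls in sync with remote visuals typically means tagging streamed frames with the input state they were rendered for, so the headset can pair each arriving frame with the matching local state. A minimal timestamp-matching sketch; the class and its methods are purely illustrative, not a real visionOS API:

```python
from collections import deque

class RemoteFrameSync:
    """Pair locally tracked input states with streamed frames by timestamp.

    The headset keeps a short history of input states. When a remote frame
    arrives, tagged with the timestamp of the input it was rendered from,
    we pick the closest recorded state so cockpit controls and the
    streamed scene agree on what the user did.
    """
    def __init__(self, history=64):
        self.states = deque(maxlen=history)  # (timestamp_ms, state) pairs

    def record_input(self, ts_ms, state):
        self.states.append((ts_ms, state))

    def match_frame(self, frame_ts_ms):
        if not self.states:
            return None
        return min(self.states, key=lambda s: abs(s[0] - frame_ts_ms))
```

The bounded deque reflects the practical constraint: remote frames arrive tens of milliseconds late, so only a short window of input history is ever needed.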
With this model, Vision Pro serves as a 3D interface, allowing users to interact with remotely generated environments without relying solely on the headset’s computing power.
The headset processes interaction, while complex visuals are streamed in. visionOS 26.4 unites local input with remote rendering for immersive apps.