Porting Mobile AR Features to Apple Vision Pro
Terrys Fabrics leverages the power of visuals to elevate the customer shopping experience. The company wanted to use Apple Vision Pro to let customers virtually try out products in their own space. We ported the AR product viewer from the Terrys mobile app to the Vision Pro platform, improved interactions, refined the user experience, and optimized the app’s performance for the immersive environment.
Terrys, a leading fabric retailer, recognized the potential of Apple Vision Pro for immersive product viewing: a great way for customers to virtually fit curtains and blinds in their homes. Our task was to port Terrys’ existing augmented reality mobile app feature to Apple Vision Pro, transforming the interaction method and user experience. The project also required the development of a new user interface, adapted specifically for Vision Pro, and the optimization of the app’s backend for smooth performance in the spatial computing environment.
We had worked with Terrys Fabrics before, and the company recognized our expertise in AR/VR development for mobile phones and headsets.
Scope of Work
The project required a comprehensive approach to adapt the existing AR mobile app for Apple Vision Pro. We leveraged the expertise of our XR developer to achieve optimal performance and our XR designer to craft an exceptional user experience.
Porting AR features to Apple Vision Pro
Transitioning the existing mobile AR product viewer to the Vision Pro platform, including adapting the feature for the device’s spatial computing environment.
Interaction method update
Changing the interaction method from screen tapping on mobile devices to pinching gestures on Apple Vision Pro, enabling a more natural and intuitive user experience.
User journey redesign
Streamlining the process of selecting and viewing products. The original mobile flow was split into four steps due to screen constraints, but the Vision Pro’s flexible spatial windows allowed us to condense this into a single step when the user changes only the texture, with one additional step for switching the product type.
User interface design
Creating a new UI specifically for Apple Vision Pro that maximized usability by carefully distributing 3D content within the spatial environment.
Framework change to accommodate Apple Vision Pro
Transitioning from the Lightship framework to native ARKit, since Lightship is not compatible with Apple Vision Pro. Moving to native Apple Vision Pro development also made the graphics run smoothly.
Enhancing plane detection
Switching from plane scanning to mesh scanning enabled faster and more accurate sill recognition, even on very small or crowded surfaces.
Solution
We began by porting the existing AR feature from the Terrys mobile app, which involved significant changes to the interaction model: moving from a screen-tapping interface to the more immersive pinch gesture native to Vision Pro.
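On visionOS, looking at a 3D object and pinching is delivered to the app as a spatial tap, which maps naturally onto the mobile app’s screen-tap handler. The sketch below shows how such an interaction can be wired up in a RealityKit-based visionOS app; the view, entity, and asset names are illustrative assumptions, not taken from the actual Terrys codebase.

```swift
import SwiftUI
import RealityKit

// Minimal sketch of the tap-to-pinch interaction change, assuming a
// RealityKit-based visionOS app. "ProductViewer" and "CurtainModel"
// are placeholder names.
struct ProductViewer: View {
    var body: some View {
        RealityView { content in
            // Load a product model; the asset name is a placeholder.
            if let product = try? await Entity(named: "CurtainModel") {
                // Make the entity eligible for gaze-and-pinch input.
                product.components.set(InputTargetComponent())
                product.generateCollisionShapes(recursive: true)
                content.add(product)
            }
        }
        // A look-and-pinch on visionOS arrives as a spatial tap —
        // the analogue of the mobile screen tap.
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    handleSelection(of: value.entity)
                }
        )
    }

    private func handleSelection(of entity: Entity) {
        // Placeholder for the selection logic ported from the mobile app.
        print("Selected \(entity.name)")
    }
}
```

Because the gesture is targeted to entities rather than screen coordinates, the ported selection logic can stay largely unchanged: it receives an entity instead of a touch point.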
The next step was to rethink the user journey. The original app required users to navigate through multiple steps to narrow down their curtain or blind choice due to mobile screen limitations. With Vision Pro, we condensed these steps into a fluid experience, allowing real-time adjustments and selections, greatly improving the user’s ability to find the perfect product.
We also redesigned the user interface to take full advantage of the Vision Pro’s spatial computing capabilities. This involved careful consideration of where and how 3D content was displayed within the user’s field of view to ensure maximum usability and comfort. Additionally, we replaced the Lightship framework with native Apple Vision Pro tools to enhance performance, especially in recognizing planes through mesh scanning, which is more effective on Vision Pro than on mobile devices.
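Mesh-based surface detection on visionOS is exposed through ARKit’s SceneReconstructionProvider, which streams mesh anchors instead of flat plane anchors. The following is a hedged sketch of that approach, with error handling and anchor bookkeeping simplified; the class name and logging are illustrative.

```swift
import ARKit

// Sketch of mesh scanning on visionOS via SceneReconstructionProvider.
// Mesh anchors capture scene geometry directly, so even small or
// cluttered surfaces such as window sills appear in the mesh.
final class SillScanner {
    private let session = ARKitSession()
    private let sceneReconstruction = SceneReconstructionProvider()

    func start() async throws {
        // Requires the scene-reconstruction capability on device.
        try await session.run([sceneReconstruction])

        for await update in sceneReconstruction.anchorUpdates {
            switch update.event {
            case .added, .updated:
                // Each MeshAnchor carries a chunk of reconstructed
                // geometry that the app can test product placement against.
                let anchor = update.anchor
                print("Mesh anchor \(anchor.id): \(anchor.geometry.faces.count) faces")
            case .removed:
                print("Mesh anchor removed: \(update.anchor.id)")
            }
        }
    }
}
```

Compared with plane detection, which waits for a sufficiently large flat region, consuming the reconstruction mesh directly is what allows the quicker sill recognition described above.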
Finally, we optimized the development process itself, addressing challenges related to Unity’s inefficient code regeneration. By implementing a custom package management script, we significantly reduced build times, allowing for faster iterations.
Team Composition
To pair strong performance with visually pleasing models, we put together a team with deep skills in the design and development of spatial and immersive products for e-commerce.
Unity Developer
AR/VR Designer
3D Artist
QA Engineer