The future of media consumption in the Metaverse
Sponsored by
With lighter headsets on the horizon, the ways in which users might consume media in the future are endless! So for our latest contest, we asked ShapesXR users to come up with innovative prototypes for mixed reality applications that suggest new and exciting ways people might consume media in the metaverse.
We asked designers to consider how media consumption in a mixed reality environment would adapt to real-world spaces, integrate social interaction, and enhance the overall experience for users.
On December 16th, five finalists presented their prototypes, LIVE from ShapesXR, to a panel of industry professionals from Netflix, Meta, and Google.
Audience members tuned in from around the world via YouTube Live to watch each finalist take to the virtual stage and present their ShapesXR design for a mixed reality application proposing a new and creative way for users to consume media in the future.
Each finalist had five minutes to present their project, and our three winners were announced at the conclusion of the event.
Welcome to another ShapesXR update, with brand-new features to boost your creative powers and improve the collaborative experience with others.
We have added a new yet familiar color wheel so that you can pick ANY color you want! You can even enter a hex code or pick colors and materials from another object.
It is now super easy to get your 3D models into ShapesXR, whether they are your own or downloaded from libraries like Sketchfab or poly.pizza. From now on you will be able to import GLB and glTF files up to 20 MB directly from the shapes.app web panel.
We have made our first step in supporting multimodal interactions by adding the ability to trigger hover effects and transitions using gaze and pinch. Just select the “gaze and pinch” option from the interactions menu and set aside your controllers. Looking at an object will now trigger its hover-in and hover-out effects, and pinching with your hand will act like a “click” interaction. Use a Quest Pro with eye tracking for the best experience, or a Quest 2 by turning your head.
We have simplified the settings for image import and now you can easily set the transparency mode and opacity of all PNGs and components imported from Figma.
Toggling snapping on or off and cycling through the various modes (world, object, all) is now possible through a handy UI located on your dominant hand.
Collaboration and communication are about being on the same page, so there you have it! Everyone who toggles this mode will be synchronized on the same stage/slide of your scene. Anybody can change the stage, and the rest of the participants will magically follow. This feature is available to Studio and Enterprise clients only.
We have made a slight change to the limit of editable spaces. From now on, you can still edit up to 3 spaces, but that total now includes both spaces you own and spaces shared with you. If you are already above that limit (e.g. you can edit 3 of your own and 2 that have been shared with you), nothing will change. Going forward, when you hit the limit you can delete one of YOUR old spaces, leave one that was shared with you, or reach out to us about a Studio plan. Rest assured that if someone invites you to collaborate on their space and you have exceeded the limit, you will still be able to view and comment on that space. To learn more about viewing and commenting, please refer to the learn page.
- Bug fixes and improvements to Holonotes
- Fixed the in-VR preview menu for Figma assets
- Figma assets are now visible in the web viewer and in the Unity export
- Fixed viewpoint disappearance
- Lip Sync optimization for avatars
- Improved organization settings for Enterprise/Studio/Education licenses
- Optimized loading in the lobby
- Fixed a bug where “presentation mode” would not start
- More fixes and improvements overall
Diego is a graduate student in AR & VR development, an electrical engineer, and an avionics technician.
Diego designed an application that suggested ways in which airline passengers could consume media during flight. Mixed reality tabletop games that you could play with other passengers, real-time map overlays that one could view while looking out the window, and 360 flight cameras accessible via users' headsets were just a few of Diego's exciting ideas.
Imtissal is an immersive designer and came up with an innovative way of incorporating mixed reality into the analog experience of reading a magazine. For her design she showed how an MR application could be used to dramatically enhance the experience of reading National Geographic. Imtissal incorporated sidebar overlays, audio cues, and even an interactive 3D model of the solar system into her design.
Michael is a senior XR designer and prototyper. Michael prototyped a social media mixed reality application called Portals, which would allow users to interact with short-form media content in highly immersive ways. Not only could users watch Portals content in AR, but they could also create volumetric immersive content directly from their phones.
Andrew is a content creator, XR developer, digital strategist, and world builder at CineConcerts. Andrew's design would transport audiences into virtual concert halls, where they'd be fully immersed in live orchestral performances of scores from popular movies, all while watching the films unfold on giant virtual screens.