Insights From The Blog
What happened at Meta Connect 2024?
September 25th saw the annual and much-anticipated Meta Connect event take place, and there were some cracking announcements from the Menlo Park company. Let’s delve right in and get to the bones of what is coming up at the technology giant.
Keynote Takeaways
Mark Zuckerberg opened the conference with an action-packed overview of the latest company developments and a basic roadmap for the next twelve months. The fifty-one-minute speech covered several major updates:
Meta Quest 3S
The first thing the Meta CEO revealed during the event was the Quest 3S, which is now available to buy, priced at $299 in the US and around £289 in the UK for the 128GB version. The 256GB version currently comes in at around £90 more, which is still lower than the $499.99 price tag attached to the Quest 3 when it was released the previous year. This version will also cost a fraction of Apple’s Vision Pro, which comes in at almost $3,500. In terms of technological capabilities, the Quest 3S is exactly as powerful as the Quest 3.
Notwithstanding its lower pricing, the Quest 3S uses the same Qualcomm Snapdragon XR2 Gen 2 chip that powers the Quest 3, including its Adreno 740 GPU and 8GB of RAM. That is as far as the parity goes, though; other areas see some cuts, most notably in relation to the Quest 3S’s optics. In short, it can carry out all of the same operations and run the same applications, with the primary differences being storage capacity and display resolution. Since the Quest 3 has a pretty impressive resolution, a small reduction isn’t going to be too much of an issue.
Smart Glasses, and Orion
Next, the CEO announced possibly the most exciting development of the whole event: upgrades to the Meta Ray-Ban smart glasses and the new Orion AR glasses. Meta introduced additional capabilities for its smart glasses, which are manufactured in collaboration with Ray-Ban. These include the capacity to recall information that the glasses perceive, such as phone numbers or parking locations, as well as live translation. Zuckerberg drew attention to a limited-edition transparent design which, according to him, shows off the technology that powers the gadget.
The keynote also featured Meta’s Orion smart glasses, a prototype pair of augmented reality glasses that, according to Meta, have the potential to be “the most advanced pair of AR glasses ever made.” Mark Zuckerberg indicated that it could be some time before the business releases a consumer version; he stated that Meta will use the glasses internally for software development for the foreseeable future, and gave no definite launch date.
Llama 3.2 AI Engine
The last portion of the keynote dealt with advances in the company’s AI aspirations, and the launch of the latest updates in their native AI system.
The CEO demonstrated an AI translation capability on the Ray-Ban smart glasses, alongside other new AI features including a voice for the Meta AI assistant, which comes with a variety of celebrity options such as John Cena.
Additional new capabilities in Reels, Meta’s short-form video tool for Facebook and Instagram, included improved AI photo editing and automated language translation. To enable the new AI capabilities and let developers use its models to create their own AI tools, Meta is also releasing Llama 3.2, its newest AI model. According to Zuckerberg, Meta’s AI assistant now has about 500 million monthly active users and is expected to be the most used AI assistant in the sector by the end of 2024. Zuckerberg compared Meta’s Llama 3.2 large language models with other LLMs, saying that Meta “differentiates itself in this category by offering not only state of the art models, but unlimited access to those models for free, and integrated easily into our different products and apps.” In other words, everyone is invited to help build a better AI system.
Well, that was quite a lot to take on board, but it was only the start of the Connect event, and there was plenty more to come.
Best of the Rest
Developer Keynote
Andrew Bosworth, CTO and Head of Reality Labs at Meta, came on stage to discuss advances in Meta’s developer applications. Delivered by a host of the company’s top executives, this section focused on how Meta is going to make content development easier and more realistic. Aigerim Shorman walked the audience through the development of a generator in an industrial complex, showing the level of realism that will be attainable to developers in the very near future.
Improved Avatars
The company is busily redesigning its metaverse avatars, and you will be able to use them soon: the firm said at its Connect conference that the new look will be available from October 1st. The new avatars will be accessible on Meta Horizon OS (read our post on this here), as well as Facebook, Instagram, and Messenger. Although Meta’s avatars got off to a rough start, the business has made consistent progress, and the new developments seem to build on that. Meta’s Aigerim Shorman, a VP on the Horizon team, stated onstage that the improved avatars give users fresh ways to adjust elements such as eye size, nose shape, and body shape, helped in part by a new underlying tech stack with an improved avatar skeleton.
Mixed Reality Aspirations
Meta has reaffirmed its commitment to developing the absolute best Mixed Reality (MR) games and applications, and the company views the circumstances it is currently experiencing as an “inflexion moment” that will lay the groundwork for future developments.
Mixed reality gives developers a blank canvas, enabling them to drive innovation and inspire dynamic interactions between users, virtual content, and the environment around us. Experiences that were previously hard to build in virtual reality (VR) become achievable in MR; for example, users can incorporate virtual items that augment and adapt to the actual environment in real time and with increasing ease. Whether you want to develop applications for gaming, education, healthcare, entertainment, or any of the countless other categories, MR has unrivalled potential to boost productivity, unleash creativity, inspire collaboration, and bring people together.
Meta sees that MR can be extended into many other areas of life, including sports applications where players can practise their moves and sharpen their skills in an MR environment. The same applies to almost any area of life – from cooking to life skills. Meta continues to offer funding for developers with a quality idea that can push the boundaries of MR.
Spatial SDK
Mike Armstrong, a software engineer at Meta, introduced the Spatial Software Development Kit (SDK), which is going to speed up app development. Spatial SDK is a novel approach to the development of immersive applications for Meta Horizon OS. Through readily available APIs, the Spatial SDK gives developers the ability to combine the diverse ecosystem of Android development with the one-of-a-kind features of Meta Quest.
Because it is built on the Kotlin programming language, developers will be able to use the mobile development languages, tools, and libraries that they are already acquainted with. One option is to create whole new immersive applications; another is to take an existing mobile application and augment it by incorporating spatial aspects. You can use Spatial SDK’s rich functionality to create compelling Horizon OS experiences, such as:
- Mixed reality: Spatial SDK supports key mixed reality features such as passthrough, scene, anchors and Mixed Reality Utility Kit (MRUK); enabling developers to quickly build apps that blend the virtual and physical world.
- Realistic 3D graphics: Spatial SDK supports modern graphics pipelines including glTF, physically-based rendering (PBR), image-based lighting, skeletal animations, and rigid body physics, so that developers can create compelling 3D experiences.
- Complete scene composition: Spatial SDK supports complex compositions containing 3D assets, animations, sounds, physics and more. Build full scenes with Spatial Editor or create them at runtime using code.
- Interactive panels: Spatial SDK supports rich panels within your scene built using your preferred 2D UI framework.
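To give a flavour of what “complete scene composition” in code means, here is a minimal, self-contained Kotlin sketch of the entity-component style described above: entities carry components such as a 3D mesh, a spatial anchor, or a 2D panel, and a scene is composed from them at runtime. All of the names here (`Entity`, `Scene`, `Mesh`, `Anchor`, `Panel`) are hypothetical stand-ins for illustration only, not the real Spatial SDK API.

```kotlin
// Illustrative miniature of entity-component scene composition.
// These classes are hypothetical stand-ins, NOT the real Spatial SDK API.

// A component is any piece of data attached to an entity.
interface Component

// Example components loosely mirroring the feature list above.
data class Mesh(val assetPath: String) : Component                       // a 3D asset, e.g. glTF
data class Anchor(val x: Float, val y: Float, val z: Float) : Component  // a position in the room
data class Panel(val layout: String) : Component                         // a 2D UI surface in the scene

// An entity is just a name plus a bag of components, keyed by type.
class Entity(val name: String) {
    private val components = mutableMapOf<Class<out Component>, Component>()

    fun <T : Component> set(component: T): Entity {
        components[component.javaClass] = component
        return this // return self so components can be chained fluently
    }

    fun <T : Component> get(type: Class<T>): T? = type.cast(components[type])
}

// A scene composes entities; here it is built in code at runtime.
class Scene {
    val entities = mutableListOf<Entity>()
    fun spawn(name: String): Entity = Entity(name).also { entities.add(it) }
}

fun buildDemoScene(): Scene {
    val scene = Scene()
    // A 3D object anchored 1.5 m in front of the user.
    scene.spawn("robot")
        .set(Mesh("models/robot.gltf"))
        .set(Anchor(0f, 0f, -1.5f))
    // A 2D control panel floating beside it.
    scene.spawn("controls")
        .set(Panel("layout/controls"))
        .set(Anchor(0.5f, 1.2f, -1f))
    return scene
}

fun main() {
    val scene = buildDemoScene()
    println("Scene has ${scene.entities.size} entities")
    println(scene.entities.first().get(Mesh::class.java)?.assetPath)
}
```

The point of the pattern is that the same composition can come from the Spatial Editor or from code: either way, an app ends up with entities whose behaviour is defined entirely by the components attached to them.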
Meta hopes that by opening up the development process to anyone with a good idea, the richness of the content will increase many times over. And that benefits everyone.
We at Unity Developers are very impressed with the content of this year’s Connect, and look forward to exploring the new features. If you need help with a Meta development project, why not contact us and see how we can make it easy for you?