The recently formed Metaverse Standards Forum (MSF) has been voting on its list of priorities over the last two months, and the results are encouraging for what we at Enjin want to build.
Enjin joined the MSF some months ago, and joined the Open Metaverse Alliance for Web3 (OMA3) shortly after.
We’ll have a post on the many metaverse organizations later (we’re up to four!), but for now, suffice it to say the OMA3 began partly because the MSF didn’t seem to have web3 on the front burner. Things like items and quests that move cross-game and cross-chain didn’t seem like a priority.
For us (and for all of web3), those topics are a big deal.
However you slice it, standards will be the key to any “metaverse” that lets users move between virtual worlds and bring their progress with them. These venues need a common language to communicate with each other, which means data has to be stored in a form that can be recognized wherever the user goes.
There are two layers to this: raw data storage, and then questions of design and user experience.
The first is all about how to store key:value pairs. How does one design the JSON of web3 interoperability? If you bring a package of data from one place to another, the receiving world needs to be able to parse it.
Once that core issue is solved, the second can be tackled: smaller, design-focused conventions. In a gaming context, this means different games agreeing on how to recognize that an avatar has 50 hitpoints and 40 stamina, which armor is equipped, and which special items are in their inventory. Perhaps things like history and quest status can also be exported.
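To make that concrete, here’s a minimal sketch of the kind of key:value payload two games would need to agree on. None of this is a real standard; the schema and field names are hypothetical.

```typescript
// Hypothetical schema for a portable avatar payload.
// Field names and values are illustrative only, not a published standard.
interface PortableAvatar {
  id: string;               // globally unique reference (e.g. chain + collection + token)
  attributes: {
    hitpoints: number;      // e.g. 50
    stamina: number;        // e.g. 40
  };
  equipment: string[];      // references to equipped item definitions
  inventory: string[];      // special items carried by the avatar
  questStatus?: Record<string, "not_started" | "in_progress" | "complete">;
}

// An example payload a receiving world would need to parse.
const traveller: PortableAvatar = {
  id: "efinity:collection-123:token-456",
  attributes: { hitpoints: 50, stamina: 40 },
  equipment: ["iron_helm", "ring_of_embers"],
  inventory: ["healing_draught", "ancient_map"],
  questStatus: { dragon_hunt: "in_progress" },
};
```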
If one game allows characters to equip three magic rings and another game only allows one, how is that rectified? Do you just pick the ring in the first slot?
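One plausible convention (purely illustrative, not anything the MSF or OMA3 has decided) is for each world to clamp shared data to its own rules and stow the overflow:

```typescript
// Hypothetical reconciliation rule: if the destination game has fewer ring
// slots than the avatar has rings equipped, keep the first N rings in slot
// order and move the rest to the inventory.
function reconcileRings(
  equippedRings: string[],
  localRingSlots: number,
): { equipped: string[]; overflow: string[] } {
  return {
    equipped: equippedRings.slice(0, localRingSlots),
    overflow: equippedRings.slice(localRingSlots),
  };
}

// A game with a single ring slot keeps the first ring and stows the other two.
const { equipped, overflow } = reconcileRings(
  ["ring_of_embers", "ring_of_frost", "ring_of_storms"],
  1,
);
// equipped -> ["ring_of_embers"], overflow -> ["ring_of_frost", "ring_of_storms"]
```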
As far as the MSF goes, early communications indicated the group was more concerned with high-level data storage problems in a web2 architecture, and less concerned with smaller examples like in-game assets. NFTs were barely a blip on the radar.
But now that group members have voted, the priorities have shifted. According to a recent vote, the top three are:
It seems more MSF members care about cross-game items than we thought. Even more than privacy & security, which is essentially a no-brainer topic.
Other high-scoring topics that didn’t quite break 100 votes were a Metaverse Standards Registry, Avatars and Apparel, User Identity, and Teaching, Education, Exams, & Certifications.
The MSF can walk and chew gum (and many other things) at the same time, so several working groups have been formed to advance all of these top causes. But the fact that interoperable 3D assets rate so highly is a positive sign.
It shows that we’re not alone in our vision for the metaverse. A large number of companies building in this space are also ready to invest money and labor towards cross-game assets and stories.
But before we start thinking that the MSF's vision is actually closer to what the OMA3 wanted all along, this was made clear in the last MSF meeting:
“The scope is about moving 3D assets. The Asset Management Exploratory Group may overlap with asset format work in this group, but this group is NOT expected to be doing direct Web3 work items directly.”
Well then. Looks like web3 companies were right to create their own group. With Blackjack. And... entertainment.
It’s even more significant that MSF members want to specifically concentrate on 3D assets as opposed to just the metadata for those assets. That’s actually several layers more complicated.
As it stands, games use wildly varying art pipelines. They run from 3D modeling software like Maya, 3ds Max, or Blender, all the way to engines like Unity, Unreal, or Godot, with middleware solutions attached at every joint in the grand software skeleton.
These systems are not designed to talk to each other. Managing just one internal art pipeline is an entire role in itself. And yet this working group wants to design 3D assets in such a way that they can literally be brought from world to world.
Some game developers (invariably NFT skeptics) have already called this task impossible.
I usually consider this to be something of a straw man argument, and respond that no one is aiming to literally take a rifle from Halo and drop it into Call of Duty. Interoperability is more about data within NFTs that can be interpreted in multiple ways by multiple virtual worlds. At least, that’s the web3 way.
But these MSF members actually want to do the thing. The impossible thing.
MSF’s working group on 3D interoperability was proposed by representatives from Autodesk, Microsoft, Epic Games, Adobe, and Cesium. Other contributors include Meta, NVIDIA, the W3C, and Qualcomm.
Its goal is to build consensus and draft a charter for companies who want to continue building towards a future of shareable 3D assets.
So how do they plan to do this?
Early investigations include looking at the Universal Scene Description (USD) format as a potential standard, as well as Khronos Group’s Graphics Language Transmission Format (glTF). The former was designed for authoring and the latter was designed for delivery – both could potentially be expanded, or a combination of the two could be used.
Also being looked into is:
The working group plans to create some benchmarking assets to be made freely available, and will prototype a pipeline for USD and glTF, measuring performance and complexity. There'll be lots of opinions from different companies and no guarantee that they'll all fall in line.
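The group hasn’t published what that prototype pipeline will look like, so purely as an illustration of the kind of measurement involved, here’s a browser-side sketch that times how long a glTF asset takes to load with three.js and counts its triangles as a crude complexity metric. The asset URL is a placeholder.

```typescript
// Rough benchmark sketch (browser-side), assuming three.js is available.
// The URL and chosen metrics are placeholders, not the working group's methodology.
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

async function benchmarkGltf(url: string): Promise<void> {
  const loader = new GLTFLoader();

  const start = performance.now();
  const gltf = await loader.loadAsync(url);
  const loadMs = performance.now() - start;

  // Count triangles as a crude proxy for asset complexity.
  let triangles = 0;
  gltf.scene.traverse((object) => {
    const mesh = object as THREE.Mesh;
    if (mesh.isMesh) {
      const geometry = mesh.geometry;
      const vertices = geometry.index ? geometry.index.count : geometry.attributes.position.count;
      triangles += vertices / 3;
    }
  });

  console.log(`${url}: loaded in ${loadMs.toFixed(1)} ms, ~${Math.round(triangles)} triangles`);
}

// Placeholder benchmark asset.
benchmarkGltf("https://example.com/benchmark-assets/helmet.glb");
```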
Even after that, much is unclear.
Data storage, for one. Where do these assets live? 3D assets are too expensive to store on the blockchain. Depending on file size, it might be too clunky for users to upload their assets to every virtual world they enter; those assets then have to be downloaded by everyone in their virtual vicinity so they can see the extraordinary bling, multiplied by however many people are in the same space at once.
That means either all these worlds need to host every asset, or they’ll be pulling them from a centralized source. Some might be fine with that, but those in favor of a decentralized metaverse might have an issue.
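As a sketch of that trade-off (not something the working group has proposed), an asset reference could be a plain URL pointing at whoever hosts it, or a content-addressed ipfs:// URI resolved through whatever gateway or node the client prefers. The gateway below is an arbitrary example.

```typescript
// Illustrative only: resolve an asset reference either from a centralized
// host or from a content-addressed (ipfs://) URI via a public gateway.
function resolveAssetUrl(reference: string, gateway = "https://ipfs.io/ipfs/"): string {
  if (reference.startsWith("ipfs://")) {
    // ipfs://<cid>/<path> -> <gateway><cid>/<path>
    return gateway + reference.slice("ipfs://".length);
  }
  // Otherwise assume a conventional, centrally hosted URL.
  return reference;
}

resolveAssetUrl("ipfs://<cid>/sword.glb");            // decentralized, content-addressed
resolveAssetUrl("https://cdn.example.com/sword.glb"); // centralized host
```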
Art style is also a problem. Even slightly different aesthetic choices could make assets from one world look jarring in another. Despite Valorant and Counter-Strike being similar games, the weapons look completely different.
Another issue is perspective. Liberties are sometimes taken with 3D assets when it's assumed there'll be one camera angle. Change the camera angle, and these models and scenes start to break.
Since so much of game development is smoke and mirrors, if games want to participate in the metaverse, does that limit their stylistic choices? Do we enter an era of 3D models and scenes that have to hold up from any direction? With any lighting? With only agreed-upon animations?
Things look different when the camera is unlocked. Often, much of the level doesn't even exist.
These, and many more, are roadblocks on the way to 3D asset interchangeability.
If all that comes of this is some improved workflows for artists and animators, that’s still a net win. If a future of interchangeable 3D assets is actually created, it’s an even bigger win.
Even if everything goes right, there’s a long lead time here. Films and games take years to make. Bringing your gun skin from a new FPS into a new MMO? Best case scenario, 2025.
Thankfully, we have cross-game items working right now, and these will soon be cross-chain items as well.
MyMetaverse just released Efinity’s first playable NFTs, and they’re immediately usable in three different games: a car in GTA Online, a sword in Minecraft, and a house in Infinity Realms. One NFT, three items (and counting).
There’s a certain magic in these assets not being the exact same wherever they go, and being open to interpretation based on their data.
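To be concrete about what “open to interpretation” means, here’s a sketch of one NFT’s data being mapped to a different in-game item per world. This isn’t MyMetaverse’s actual integration code; the identifiers and names are made up.

```typescript
// Hypothetical mapping from a single NFT to per-game interpretations.
interface CrossGameNft {
  tokenId: string;
  name: string;
  tier: "common" | "rare" | "legendary";
}

type GameId = "gta_online" | "minecraft" | "infinity_realms";

function interpret(nft: CrossGameNft, game: GameId): string {
  switch (game) {
    case "gta_online":
      return `${nft.tier} car: ${nft.name}`;      // appears as a vehicle
    case "minecraft":
      return `${nft.tier} sword: ${nft.name}`;    // appears as a weapon
    case "infinity_realms":
      return `${nft.tier} house: ${nft.name}`;    // appears as property
    default:
      throw new Error(`Unsupported game: ${game}`);
  }
}

const relic: CrossGameNft = { tokenId: "efinity:42", name: "Founder's Relic", tier: "legendary" };
interpret(relic, "minecraft"); // "legendary sword: Founder's Relic"
```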
But there’s room for both metaverses in the meta…metaverse. And while we’re more focused on building the architecture for a decentralized, open, community-governed metaverse, we’re also cheering on those 3D asset boffins who are aiming for the impossible.