Meet the Metadata Guardians Trying to Make Your NFT Collection Available 100 Years from Now

Rarible
5 min read · May 17, 2021


The NFT space has made unprecedented progress since we first launched Rarible in late 2019, hitting several major milestones in the past few months alone. Users spent more than $2 billion on NFTs in Q1 2021, a 2,100% increase from Q4 2020, and that's just one of many figures showing the rapid pace at which the NFT industry has been growing lately.

However, as the space keeps expanding and people proceed to mint thousands of NFTs every day, an important question arises — how do we make sure that all that creative work remains accessible over time?

Earlier this year, as NFTs began hitting the mainstream, the industry faced a major technical problem. As highlighted by some analysts, most of the JPEGs (or other media files) that serve as the basis for their respective NFTs are not stored on the chain itself. Instead, an NFT often simply points to a URL where the corresponding media is hosted, in most cases a separate domain belonging to either the creator or the platform where the NFT was minted.

But what happens when those domains go down? With that discovery, it became apparent that some NFTs have a single point of failure, which, in turn, could significantly impact their longevity and value.

In light of that, numerous projects and organisations have begun working on a potential fix. In a recent Clubhouse session, we brought those pioneering protocols and projects — Protocol Labs (an open-source lab that is working on Filecoin, IPFS and libp2p), Async.art, Fleek and Arweave — together to have a constructive discussion about the future of NFT metadata. Here’s a concise recap of the main ideas that were discussed there.

Tagging and Identifying Data

So how do you make sure that your content will withstand the test of time and always stay online? Molly Mackinlay, Project Lead of the InterPlanetary File System (IPFS), suggests that the answer lies in the creation of a distributed network where an artist is not dependent on any central entity.

IPFS is exactly that: a protocol and peer-to-peer network that allows users to store and share data within a distributed file system. To uniquely attribute and identify each file, it employs content identifiers (CIDs). Think of a CID as a universal content fingerprint, where one specific hash can only link to one specific piece of work. Consequently, any change to an image's underlying data causes a different CID to be generated.
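To make the idea concrete, here is a minimal TypeScript sketch of content addressing. It uses a plain SHA-256 digest rather than a real multihash-encoded CID, so it only illustrates the principle: identical bytes always map to the same fingerprint, and any change produces a different one.

```typescript
import { createHash } from "crypto";

// Simplified "content fingerprint": real IPFS CIDs wrap a multihash in a
// CIDv0/CIDv1 encoding, but the core idea is the same as a plain digest.
function fingerprint(data: Buffer): string {
  return createHash("sha256").update(data).digest("hex");
}

const original = Buffer.from("my-artwork-bytes");
const tampered = Buffer.from("my-artwork-bytes!"); // one extra byte

console.log(fingerprint(original)); // always the same for the same bytes
console.log(fingerprint(tampered)); // a completely different digest
```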

This is important because the very value of NFT artwork is premised on its verifiability. Without using a unique identifier like a CID, art represented by an NFT might be susceptible to “rug pulls” or be difficult to authenticate if the particular URL goes down. With a CID, content can be universally referenced — regardless of where the underlying data is stored.
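As an illustration, contrast a token whose image sits behind a plain URL with one that references content by CID. The field names follow the common ERC-721 metadata convention, and both the domain and the CID below are placeholders, not real assets.

```typescript
// Hypothetical ERC-721-style metadata objects, for illustration only.

// Fragile: if this domain goes away, so does the artwork reference.
const urlBasedMetadata = {
  name: "Example Piece #1",
  image: "https://assets.example-marketplace.com/art/1.png",
};

// Content-addressed: the ipfs:// URI identifies the bytes themselves,
// so anyone holding a copy of the data can serve it.
const cidBasedMetadata = {
  name: "Example Piece #1",
  image: "ipfs://bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi",
};
```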

That concept is already being used by third parties. Async.art is a protocol that has cleverly integrated IPFS hashes into the NFTs uploaded to its platform. The project's founder, Conlan Rios, explained that this is done by hashing the token metadata along with all the media files, which are then pinned to IPFS. As a result, subsequent buyers can verify that the original digital artwork is still intact and accessible by recomputing the hash on the platform and comparing it to the one recorded at mint time.
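The session did not go into Async.art's exact algorithm, but the general pattern looks roughly like the sketch below: combine the metadata with every media file, derive a single fingerprint, and let later buyers recompute it from the pinned content. This is a hypothetical illustration, not Async.art's actual code.

```typescript
import { createHash } from "crypto";

// Hypothetical sketch: combine the token metadata with every media file
// and derive a single fingerprint for the work.
function workFingerprint(metadataJson: string, mediaFiles: Buffer[]): string {
  const hash = createHash("sha256");
  hash.update(metadataJson);
  for (const file of mediaFiles) {
    hash.update(file);
  }
  return hash.digest("hex");
}

// A later buyer recomputes the fingerprint from the pinned content and
// compares it with the one recorded at mint time to confirm nothing changed.
```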

Storing the Metadata in a Decentralized Way

However, as effective as these solutions might be, Fleek's CEO Harrison Hines pointed out that they still have some drawbacks, at least at the current stage. As Hines noted, NFT storage setups still have central points of failure. For instance, many IPFS nodes run on AWS (a centralized cloud computing service) and are therefore still at risk of being shut down should AWS fail or intervene.

The ideal solution, therefore, would be to find some way to facilitate trustless storage.

In light of this, Fleek has started to augment IPFS with Cloudflare, a content delivery network (CDN). Instead of all file retrieval going through IPFS alone, the CDN can also look up a hash and serve the corresponding file.
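The retrieval side of that idea can be approximated with a simple gateway-fallback pattern. The sketch below uses two public IPFS gateways (Cloudflare's and ipfs.io) purely as examples; it is not a description of Fleek's internal setup, and it assumes a global fetch (Node 18+ or a browser).

```typescript
// Sketch of CID retrieval with gateway fallback.
const GATEWAYS = [
  "https://cloudflare-ipfs.com/ipfs/",
  "https://ipfs.io/ipfs/",
];

async function fetchByCid(cid: string): Promise<ArrayBuffer> {
  for (const gateway of GATEWAYS) {
    try {
      const res = await fetch(gateway + cid);
      if (res.ok) return await res.arrayBuffer();
    } catch {
      // this gateway is unreachable, try the next one
    }
  }
  throw new Error(`Could not retrieve ${cid} from any gateway`);
}
```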

In addition to the above-mentioned solutions that help identify a path to a certain piece of work, dedicated storage protocols can also make NFTs more durable in the long term. In particular, it is important to add incentives for IPFS nodes to retain a copy of the data attached to a CID.

This is exactly what Filecoin was designed for: a distributed storage marketplace where storage providers and clients can verifiably and trustlessly negotiate storage deals for long-term data persistence. NFT.storage is a new initiative from Protocol Labs and Pinata that makes Filecoin-backed storage accessible to all NFT creators by offering free storage and redundancy for all NFTs. As the project's name implies, users can upload their NFT data for long-term storage on a decentralized storage network. NFT.storage uses Pinata and IPFS Cluster to make the file available on IPFS, while backing it up to Filecoin at the same time. The project is currently in the process of pulling in all historical NFT data from the Ethereum blockchain as well.
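For creators who want to try this, the nft.storage JavaScript client exposes a small API along the following lines. The API key, file name and descriptions are placeholders, and exact method signatures may evolve, so treat this as a sketch rather than canonical usage.

```typescript
import { NFTStorage, File } from "nft.storage";
import { readFileSync } from "fs";

// Sketch based on the nft.storage JavaScript client; placeholders throughout.
async function storeArtwork() {
  const client = new NFTStorage({ token: "YOUR_API_KEY" });

  const metadata = await client.store({
    name: "Example Piece #1",
    description: "Artwork pinned to IPFS and backed up to Filecoin",
    image: new File([readFileSync("art.png")], "art.png", { type: "image/png" }),
  });

  // An ipfs:// URI for the stored metadata, suitable for use as a tokenURI.
  console.log(metadata.url);
}
```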

Whichever storage model projects choose, Mikeal Rogers, IPLD Lead at Protocol Labs, notes that another important thing is to communicate to consumers how exactly all of their data is being stored. According to him, different methods will have an impact on the value of assets, meaning that an asset stored with a more persistent mechanism would likely be valued higher than one which uses a mechanism requiring continuous management and maintenance.

A Decentralized Future

While most experts agree that a community-driven decentralized solution is the goal, it is still unclear how it can be achieved, as a consistent source of funding remains a key concern.

One option suggested by Sam Williams, CEO of Arweave, is to create a protocol that generates interest. That interest, he argues, will help to pay for storage costs and incentivize other users to help preserve the metadata.

Another option is to directly empower communities to persist the data that matters to them through DataDAOs and StOracles (Storage Oracles), which combine Filecoin’s verifiable storage with cross-chain smart contracts. NFT.storage aims to evolve in this direction, “decentralizing itself out of existence” as it upgrades to these provably permanent solutions.

At the same time, other key components of a decentralized system such as DAOs and oracles will hopefully continue developing around the NFT space, to further contribute to the persistence of digital works online.

What’s Next?

While there’s been a lot of action, it’s important to remember that the NFT space is still at an early stage of development. As it continues to expand, the most important thing is to get more stakeholders involved in the development of the underlying infrastructure.

From joining cross-project initiatives to insisting that the platforms you use include IPFS CIDs in the metadata of their contracts, there are plenty of ways to bring the decentralized future for NFTs closer. The main point is that these efforts, as novel as they might seem at this point, will ultimately enable us to maintain and preserve the truly valuable layer of digital heritage that NFTs represent.
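As a practical starting point, collectors can check what a token actually points to. The sketch below reads an ERC-721 tokenURI with ethers.js (v5-style API) and flags whether it is content-addressed; the RPC URL, contract address and token ID are placeholders you would substitute yourself.

```typescript
import { ethers } from "ethers";

// Minimal ERC-721 fragment: tokenURI is part of the standard metadata extension.
const ERC721_ABI = ["function tokenURI(uint256 tokenId) view returns (string)"];

async function checkTokenUri(rpcUrl: string, contract: string, tokenId: number) {
  const provider = new ethers.providers.JsonRpcProvider(rpcUrl);
  const nft = new ethers.Contract(contract, ERC721_ABI, provider);

  const uri: string = await nft.tokenURI(tokenId);
  const contentAddressed = uri.startsWith("ipfs://") || uri.includes("/ipfs/");

  console.log(uri, contentAddressed ? "(content-addressed)" : "(plain URL)");
}
```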


Written by Rarible

Creator-centric #NFT marketplace. Create, sell and collect digital collectibles secured with #blockchain — rarible.com
