
6 October 2019

The state of tokenisation – and how to deal with it

Dr. Lewin Boehnke

CTO at Crypto Storage AG and Head of Research at Crypto Finance AG

About the author

This article is from our magazine Building Blocks. It was written by Dr Lewin Boehnke, our Crypto Finance Head of Research and Crypto Storage CTO.

To receive your copy of the complete Building Blocks magazine, please register at the bottom of the page.


The state of tokenisation – and how to deal with it.

The narrative around tokenisation has changed significantly over the last few years.

Bitcoin, the asset, managed to capture the value of the success of bitcoin, the product: a payment instrument and monetary system. This early investment opportunity in an asset with an uncertain risk/reward profile but high potential is similar to an investment in private equity, with the characteristic difference that it came without a share register, bylaws, a board of directors, or a general assembly. Bitcoin is a native asset.

What followed was the search for ways to capture the simplicity of that mechanism for other projects. Developers built smart contracts on the Ethereum blockchain, allowing new tokens to be launched quickly on the existing Ethereum platform without having to write a new native protocol. These tokens were often issued to the public through Initial Coin Offerings (ICOs) or, more recently, Security Token Offerings (STOs). The idea of tokenisation was born, but not without its challenges.

Two approaches prevailed in the first few years: tokenisation as a utility and tokenisation representing off-chain assets.

Tokenisation as a utility

The first approach acknowledged that this simplicity is a direct consequence of the native nature of the value-carrying token and tried to find a place for a token in the project. Golem, a peer-to-peer marketplace for renting out computational resources (in its initial phase, for rendering images), is a typical example of this approach. The economic model foresaw payment for this rent in the form of the Golem Network Token (GNT), which was pre-sold to raise funds for the development of the network. The value proposition of GNT, an ERC-20 token (see below) on the Ethereum blockchain, lies in the demand created by the usefulness of the token on the finished network. These “utility tokens” aim to follow bitcoin’s model by deriving their value from the usefulness of the product, independent of an underlying asset. In this way, they are very different from private equity, for example, where the investment would be in a company building a product, not in the value of the product itself. The initial sale of GNT raised the equivalent of 8 million USD.

Ether, the native token of the Ethereum blockchain, is a further example and falls into the utility token category. It acts as “gas” to pay for smart contract execution on Ethereum. It certainly owes part of its value to that use; however, this is analogous to the value that gold derives from its use in electronics: real, but nowhere near the current valuation.

Tokenisation representing off-chain assets

The second approach, in contrast, attempts to represent off-chain value in an on-chain token while still capturing as much of the simplicity and efficiency as possible. These “asset tokens” are what the term “tokenisation” most often refers to today. Many early projects aimed to avoid classification as a security and the accompanying regulatory challenges (mostly by virtue of the decentralised nature of the blockchain they were deployed on). The DAO, another token on the Ethereum blockchain acting like an autonomous venture capital fund, was considered one of the most decentralised projects and consequently believed to be unlike a security. However, the SEC concluded that The DAO was a security, a finding published after its spectacular demise (which led to a hard fork in Ethereum and the continuation of the unchanged chain under the name Ethereum Classic). It is now clear that the on-chain representation of off-chain value falls under securities regulation.

This moved tokenisation much closer to representing the standard features of traditional assets. If we refer again to the simple example of shares in a company, the board of directors is back in the game, as is the general assembly, and the movement of shares may be restricted by bylaws.

On-chain best practices in tokenisation

Along with the expectation that off-chain processes are represented by the tokens, this introduces a second topic: the processes and standards need to be on-chain as well. For value on-chain, an established convention is the ERC-20[1] standard: a de facto industry standard for launching a token with fungible qualities on the Ethereum blockchain. For processes on-chain, no such standard exists. There are a few repeating patterns, such as ‘mintable’ and ‘burnable’, indicating that some treasury account exists that can create new tokens and that token holders have a way of destroying their funds, respectively.
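To make these two patterns concrete, here is a minimal sketch in Python (chosen for readability; an actual token would be a smart contract written in a language such as Solidity) of a fungible-token ledger with ‘mintable’ and ‘burnable’ behaviour. The class and method names are illustrative assumptions, not part of the ERC-20 standard itself.

```python
# Minimal sketch of an ERC-20-style fungible token ledger illustrating the
# 'mintable' and 'burnable' patterns. All names are illustrative; this is
# not the ERC-20 interface itself, which is defined for Ethereum contracts.

class FungibleToken:
    def __init__(self, treasury: str):
        self.balances: dict[str, int] = {}
        self.total_supply = 0
        self.treasury = treasury  # the account allowed to create new tokens

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        # Core fungible-token behaviour: move value between accounts.
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

    def mint(self, caller: str, recipient: str, amount: int) -> None:
        # 'Mintable': only the designated treasury may create new tokens.
        if caller != self.treasury:
            raise PermissionError("only the treasury may mint")
        self.balances[recipient] = self.balances.get(recipient, 0) + amount
        self.total_supply += amount

    def burn(self, holder: str, amount: int) -> None:
        # 'Burnable': any holder may irrevocably destroy their own funds.
        if self.balances.get(holder, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[holder] -= amount
        self.total_supply -= amount
```

The point of naming such patterns is that anyone inspecting the token can see at a glance which supply-changing operations exist and who is allowed to perform them.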

These patterns help when dealing with a smart contract, especially with processes. However, true standards are generally absent, and this forms one of the major hurdles to adoption. At the same time, discourse between a tech-savvy lawyer and a competent smart contract developer would certainly be a most interesting and fruitful development in the tokenisation landscape. Smart contracts and the associated legal contracts need to work hand in hand[2]. The rules governing the shares should be clearly outlined in bylaws or relevant laws. Conversely, parties interacting with the token should easily understand any operation that is possible in the governance of the token.

Challenges with technology

Back when mostly developers or technology enthusiasts interacted with tokens and their governing contracts, sufficient know-how was taken for granted (although, again, The DAO was an example where it was unjustifiably assumed), as was the operator’s ability to execute the necessary operations and care for the security of their own private keys. As the token experience is brought into the incumbent financial system, this is less likely to be the case. A board of directors will be faced with tasks that do not lie within its broader experience. Demanding tech-savviness from this group is fanciful, especially if this is not in the nature of the company.

Reducing this challenge to user experience tweaks also does not do justice to the gravity of the task. In a world where the allocation of a token truly represents ownership in a company, instead of being just some kind of bookkeeping that can be overruled by another ground truth, not only the value itself needs to be secured, but the governing decisions do too. Besides, if such an overriding ground truth exists, there is no reason to bring a blockchain into the equation in the first place.

Adapting governance and operations

Governance decisions occur with tokens just as they do in the traditional world. Governments decide when to print more bank notes; token governance decisions include the ‘minting’ of new tokens, for instance to increase the reserve of a stablecoin. But the operational and governance pathways to executing these decisions are radically different, and this shift in processes for similar outcomes raises questions about the handling of tokens.

How do you protect the keys to achieve security, and how do you secure the process of irreversible money creation against collusion and employee fraud? It is crucial that this is done by the responsible decision makers and stakeholders themselves. This is not a back-office task, merely initiated by a signed piece of paper. It is crucial that the operator easily understands what type of interaction they are having with a governing contract, especially if that operator is not chosen for tech-savviness. It is crucial to execute decisions on devices with proper physical security, to maintain relevant operational checks and balances, and to guarantee them in technology rather than in paper trails. Standards are emerging, and industry experts are gaining trust and building strong reputations in navigating this paradigm shift.
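To illustrate what guaranteeing checks and balances in technology rather than in paper trails could look like, here is a minimal sketch in Python of a quorum gate: an irreversible operation such as a mint only executes once a threshold of independent decision makers has approved it. The 2-of-3 threshold and the signer names are illustrative assumptions; in practice this would typically be enforced with multi-signature schemes at the key level rather than in application logic.

```python
# Sketch of a quorum gate: an irreversible operation runs only after a
# threshold of independent decision makers has approved it, so no single
# employee can trigger it alone. Threshold and signers are illustrative.

class QuorumGate:
    def __init__(self, signers: set[str], threshold: int):
        self.signers = signers
        self.threshold = threshold
        self.approvals: set[str] = set()

    def approve(self, signer: str) -> None:
        if signer not in self.signers:
            raise PermissionError("unknown signer")
        self.approvals.add(signer)  # a set: each signer counts only once

    def execute(self, operation) -> None:
        if len(self.approvals) < self.threshold:
            raise PermissionError("quorum not reached")
        operation()             # the irreversible action runs only now
        self.approvals.clear()  # approvals cannot be replayed later

# Example: a mint executes only after two of three directors approve.
gate = QuorumGate({"director_a", "director_b", "director_c"}, threshold=2)
gate.approve("director_a")
gate.approve("director_b")
gate.execute(lambda: print("mint executed"))
```

The design choice worth noting is that the quorum is enforced by the system that executes the operation, not by a paper trail checked after the fact.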

In a new age of interoperability, the operability needs to be rethought and proper tools deployed.

 
