MPC: Multi-Party Computation

What is MPC (Multi-Party Computation)?

“Not your keys, not your coins” has resulted in over $100 billion lost or stolen since the early 2010s, specifically because of private key mismanagement. Clinging to this ‘golden rule’ will fail to onboard the next 1 billion into a bankless, self-empowered Web3. Secure crypto technologies like MPC are the hybrid solution for an overwhelming majority of new and current users, offering optimal tradeoffs between security, self-custody, recoverability, and interoperability.

Not Your Keys Not Your Crypto? Outdated.

The mantra of “not your keys not your crypto” is as powerful today as it was in 2017. But the result? Lost and stolen seed phrases, misplaced private keys, stress for new users, and a flight to CeFi exchanges and ‘crypto banks.’

An estimated $100 billion of Bitcoin (just Bitcoin) has been lost forever because of private key mismanagement.

As a community, crypto has been dogmatically clinging to a purported “private key gold standard,” more obsessed with the technology than with providing what people actually need. MPC is a solution that already exists, recently championed by companies like Coinbase and ZenGo.

Simple and secure MPC technology is already being used at the institutional level – companies like Fireblocks are helping custody billions of dollars of cryptoassets with MPC cryptography. It’s time average users get the same bulletproof security as the big players, and developers understand the security benefits of MPC to onboard more crypto users.

The false dichotomy: Centralized Exchanges v. Non-Custodial Wallets

For years the status quo perpetuated a dangerous misconception: that there are only two ways to store crypto. This false dichotomy is why so many potential crypto-enthusiasts haven’t gotten involved in the ecosystem.

Option 1: Exchanges

Custody cryptoassets in a centralized exchange, giving up your freedom, control, and on-chain access in return for relative security, simplicity, and comfort knowing someone else will worry about secure crypto storage.

Option 2: Self-custody with Private Keys

Use an on-chain crypto wallet with private keys, rendering assets vulnerable to scammers, hacks, lost or misplaced keys – but knowing you have ultimate control over your crypto: to store, HODL, or lose…

There’s actually a better way: A hybrid solution in the form of a type of cryptography called MPC, or multi-party computation.

What is MPC and how does it work?

MPC stands for Multi-Party Computation. This is a type of cryptographic technology.

Leveraging MPC, wallets (and institutions) can securely design an on-chain asset management system that makes recovery easier, while simultaneously increasing secure self-custody by removing the single point of failure of a private key.

At a basic level, MPC (within the cryptographic world of threshold signatures) allows two or more parties to securely input information into a system and activate (or unlock) an outcome, without any party being able to see the inputs of the others.

This makes it possible to design a crypto wallet that uses multiple parties to back up or restore a user’s funds, while keeping the funds in the user’s custody at all times.

This design offers a number of advantages:

  • Easy to recover
  • No single point of failure for phishing attacks to target
  • Entirely user controlled

Why MPC is a better user experience than a “seed phrase” wallet

This type of recovery is immediately more familiar and far less scary for the majority of people. Almost everyone who has created an account of any kind online knows how to recover their login using an email, trusted contact, cloud backup, or their biometric scan.

This is why these types of recoverability are crucial for bringing new people into crypto systems. Implementing familiar solutions for recovery will allow more people to feel comfortable using crypto.

Once in the ecosystem, some users will want different types of security, or options with a low centralization risk. Nothing prevents anyone from using multiple wallets once they have started using crypto. In fact, using more than one wallet to store cryptoassets is encouraged.

There is, however, a HUGE barrier to entry with the majority of wallets for the majority of people: Seed phrases.

Having a single phrase that can move the entire contents of an account in an instant can be scary. Some people are willing to rely fully on themselves to keep something this important safe. Most people are not.

Having a path to enter crypto for the first time, try applications, and hold assets where users DON’T have to worry about a seed phrase is CRITICAL for the next 1 billion people to join the world of #Web3.

MPC Wallets do not use seed phrases

MPC wallets like ZenGo replace the traditional private key with two independently created mathematical “secret shares.” One share is stored on your mobile device and the other on the ZenGo server.

With no single point of failure, even if something happens to one of the shares, no one can access your crypto but you.
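The “no single point of failure” property can be illustrated with a toy two-share split. This is not ZenGo’s actual construction (real MPC wallets use threshold signatures and never reconstruct the key anywhere); it is a minimal sketch showing that one share alone reveals nothing about the secret:

```python
import secrets

def split_secret(secret: bytes) -> tuple[bytes, bytes]:
    """Split a secret into two XOR shares; each share alone is uniformly random."""
    share1 = secrets.token_bytes(len(secret))
    share2 = bytes(a ^ b for a, b in zip(secret, share1))
    return share1, share2

def combine(share1: bytes, share2: bytes) -> bytes:
    """Recombine the two shares to recover the secret."""
    return bytes(a ^ b for a, b in zip(share1, share2))

key = secrets.token_bytes(32)                # stand-in for a 256-bit signing key
device_share, server_share = split_secret(key)
assert combine(device_share, server_share) == key
```

Because `share1` is fresh randomness, an attacker holding either share in isolation learns nothing about the key; only both together recover it.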

Learn more about MPC: the Threshold Signature Scheme (TSS)

To understand the type of cryptography behind MPC, it’s helpful to learn about the threshold signature scheme (TSS), a subfield of MPC.

In TSS cryptography, cryptographic operations are defined with a threshold assumption in mind: it is assumed that at least a threshold of the parties involved in the computation are acting honestly and are not simultaneously controlled by an attacker. The threshold could be two parties, or more.

MPC Cryptography is gaining adoption

While ZenGo was the first crypto wallet to support MPC for consumers, companies like Fireblocks have been managing billions of dollars of assets for some of the world’s leading crypto institutions for years. Coinbase recently announced support for an MPC-powered Dapp browser inside its custodial crypto wallet.

As MPC offers the optimal balance between on-chain self-custody, wallet security and crypto recoverability, it is only a matter of time until MPC becomes widely adopted.

FAQ: MPC Crypto Wallet

Q: How does MPC (Multi-Party Computation) work?
A:
MPC works by splitting the traditional private key into multiple pieces and distributing them across multiple places, ensuring no one person has full access to the traditional private key. The major advantage is that the key is only ever used in a distributed manner.

When a transaction signature is required, the parties involved (in ZenGo there are two: the ZenGo server and the user’s phone) each separately run a computation, and together these computations produce the signature that makes whatever you wanted to happen on the blockchain happen. The best part of this process is that no single entity can ever get access to the full private key: there is no single point of vulnerability. Even if attackers compromised one of the shares, they could not access all of the secret shares simultaneously, making your digital assets much safer than in the traditional private key architecture.

Q: Who uses MPC?
A: A number of billion-dollar institutions are using MPC technology, including Fireblocks, Coinbase, and ZenGo.

Q: Is MPC new technology?
A: MPC technology is actually decades old – initial development began in the 1980s – but applying MPC to crypto wallets is a relatively recent innovation from the last decade.

Q: Does MPC support many blockchains?
A: A major advantage of MPC, in addition to its security and recoverability benefits, is that it is chain-agnostic. Unlike multi-signature (multisig) approaches, which do not support every blockchain, MPC can be applied to many. ZenGo actively contributes to open-source MPC material on GitHub.

More info for professionals:

If you’re in the institutional digital asset space, you’ve probably heard about MPC (multi-party computation). While MPC theory has been around since the early ’80s, it first entered the digital asset space just a few years ago; since then, MPC has become one of the primary technologies wallet providers and custodians are utilizing to secure crypto assets.

But what exactly is MPC? How does it work, and what benefits does it have? We’ll walk you through everything you need to know about the technology and its role in digital asset security today.

Let’s start with an introduction to cryptography in general to get a better understanding of MPC’s origins.

A (Very) Brief Introduction to Cryptography

The field of cryptography provides its users with a method for:

  • sending messages that only the intended receiver of the message will understand
  • preventing unauthorized third parties from reading them in case of interception
  • verifying the authenticity and integrity of digital messages from a known sender

Though cryptography stretches as far back as the ancient Egyptians, one of the most famous modern examples is the Enigma machine – a device used by the Germans to send encrypted messages during WWII, which was finally cracked by the British mathematician Alan Turing.

Whereas cryptography was once primarily the concern of government and military agencies, in the internet era cryptography plays an increasingly central role in the way we all transfer information.

While the idea behind cryptography can appear simple, the field does include some extremely complex math. In essence, messages are scrambled, or “encrypted,” by a secret recipe (or algorithm) that hides the information contained within it. This way, should the encrypted message be stolen or intercepted by a malicious or non-trusted third party, they will be unable to understand, see or alter the information the message holds. Instead, the only one who can read that message correctly is the one who knows how the message was encrypted and thus holds the key to unscramble, or “decrypt,” it.

Encrypted Message: HZZO HZ VO OCZ KJNO JAADXZ

Secret Algorithm: *each encrypted letter is the letter five positions before the ‘real message’ letter, so shift each letter forward by five (wrapping around the alphabet) to decrypt*

ABCDEFGHIJKLMNOPQRSTUVWXYZ

Decrypted Message: MEET ME AT THE POST OFFICE

This ‘Caesar cipher’ utilizes very simple math to demonstrate the concept of encryption. However, it is known to be broken. To securely encrypt information, more advanced math is required.
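The Caesar shift above is simple enough to implement in a few lines. A minimal Python sketch, using the ciphertext and shift from the example:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)   # leave spaces and punctuation untouched
    return ''.join(out)

ciphertext = "HZZO HZ VO OCZ KJNO JAADXZ"
print(caesar(ciphertext, 5))   # MEET ME AT THE POST OFFICE
```

Calling `caesar` with a negative shift re-encrypts the message, which is exactly why the scheme is broken: there are only 25 possible shifts to try.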

In the world of blockchain, the “message” being transferred is a digital asset, and the “key” to that digital asset is essentially the decryption tool used to receive that digital asset.

That key itself – known as the “private key,” as access to a digital asset requires both a publicly known cryptographic key and a related private one – must be kept safe, as anyone who knows the private key can move the asset to their own wallet. This is where MPC comes in: it’s one of the most powerful tools for protecting private keys.

How does MPC (multi-party computation) work?

In a general sense, MPC enables multiple parties – each holding their own private data – to evaluate a computation without ever revealing any of the private data held by each party (or any otherwise related secret information).

The two basic properties that a multi-party computation protocol must ensure are:

  • Privacy: The private information held by the parties cannot be inferred from the execution of the protocol.
  • Accuracy: If a number of parties within the group decide to share information or deviate from the instructions during the protocol execution, the MPC will not allow them to force the honest parties to output an incorrect result or leak an honest party’s secret information.

In an MPC, a given number of participants each possess a piece of private data (d1, d2, …, dN). Together, the participants can compute the value of a public function on that private data: F(d1, d2, …, dN) while keeping their own piece of data secret.

For example, let’s imagine three people, John, Rob, and Sam, want to find out who has the highest salary without revealing to each other how much each of them makes – this is actually a classic example of MPC, known as The Millionaire’s Problem. Using only their own salaries (d1, d2, and d3), they want to find out which salary is the highest without sharing any actual numbers with each other. Mathematically, this translates to them computing:

F(d1,d2,d3) = max(d1,d2,d3)

If there were some trusted third party (e.g. a mutual friend who they knew could keep a secret), they could each tell their salary to that friend and find out which of them makes the most, AKA F(d1,d2,d3), without any of them learning the others’ private info. The goal of MPC is to design a protocol where, by exchanging messages only with each other, John, Rob, and Sam can still learn F(d1,d2,d3) without revealing who makes what and without having to rely on an external third party. They should learn no more by engaging in the MPC than they would have by interacting with their trustworthy mutual friend.
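Secure protocols for max require heavier machinery (e.g. garbled circuits), but the core pattern – computing a public function on secret inputs – can be sketched with a simpler function, the sum, using additive secret sharing. The salary figures below are made up for illustration:

```python
import random

P = 2**61 - 1   # a public prime modulus; all share arithmetic is mod P

def share(value: int, n: int) -> list[int]:
    """Split `value` into n additive shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

salaries = {"John": 62_000, "Rob": 71_000, "Sam": 58_000}
n = len(salaries)

# Each party splits their salary and sends one share to every other party.
all_shares = [share(s, n) for s in salaries.values()]

# Party i locally sums the i-th share from everyone and publishes the result.
partial_sums = [sum(row[i] for row in all_shares) % P for i in range(n)]

# The published partials reveal only the total, never the individual inputs.
total = sum(partial_sums) % P
print(total)   # 191000
```

Each individual share is uniformly random, so no party learns another’s salary; only the agreed-upon output (here, the total) becomes public.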

History and Applications of MPC

MPC’s initial development began in the ’80s – a fairly recent breakthrough within the world of cryptography.

Up until that point, the majority of cryptography had been about concealing content; this new type of computation focused instead on concealing partial information while computing with data from multiple sources.

  • 1982 – Secure two-party computation is formally introduced as a method of solving The Millionaire’s Problem
  • 1986 – Andrew Yao adapts two-party computation to any feasible computation
  • 1987 – Goldreich, Micali, and Wigderson adapt the two-party case to multi-party
  • 1990s – Study of MPC leads to breakthroughs in areas including universal composability (pioneered by Fireblocks cryptography advisor Ran Canetti) and mobile security
  • 2008 – The first large-scale, practical application of multi-party computation – demonstrated in an auction – takes place in Denmark
  • Late 2010s – MPC is first utilized by digital asset custodians and wallets for digital asset security
  • 2019 – Debut of MPC-CMP, the first 1-round, automatic key-refreshing MPC algorithm

Today, MPC is utilized for a number of practical applications, such as electronic voting, digital auctions, and privacy-centric data mining. One of the top applications for MPC is for securing digital assets – and recently, MPC has become the standard for institutions looking to secure their assets while retaining fast and easy access to them.

Why is MPC becoming the standard for digital asset security?

To utilize your digital assets, you need a public key and a private key; your ability to safely hold and transfer the asset itself is only guaranteed as long as the private key is safe. Once that key is in someone else’s hands, they can transfer the assets to their own wallet. Therefore, preventing the theft of private keys is crucial to maintaining digital asset security.

Historically, there have been a few primary options for securely storing private keys. These options tend to fall into hot, cold, or hardware-based storage.

  • Hot Storage – Private key is held online
  • Cold Storage – Private key is held offline
  • Hardware Wallet – Private key is held offline on a physical device

While these tools were at one point the only options for digital asset storage, certain operational and security inefficiencies in each have led to the rise of new solutions, such as MPC. Importantly, MPC is strong for not only digital asset storage, but digital asset transfers, as well – and as the digital asset market has developed and grown, so has the need for a security tool that enables fast transfers and advanced business strategies.

Cold Storage

One way to reduce the exposure to digital asset loss is by storing funds in cold storage.

Cold storage enables a user to sign a transaction with their private keys in an offline environment. Any transaction initiated online is temporarily transferred to an offline wallet kept on a device such as an offline computer, where it is then digitally signed before it is transmitted to the online network. Because the private key does not come into contact with a server connected online during the signing process, even if an online hacker comes across the transaction, they would not be able to access the private key used for it.

However, there are several issues with cold storage:

  • For a contemporary digital asset business that’s actually trading assets with any frequency, it is too slow to trade from, often taking 24 to 48 hours to make a transfer
  • It does not protect against deposit address spoofing or credential theft

Hardware Wallet

Another method of securely storing private keys is the hardware wallet: an external device, such as a USB stick, on which you store your private keys. Hardware wallets are resilient to malware, and if you happen to lose the wallet you’ll be able to recover the funds using a seed phrase. On the other hand, if you lose the seed phrase, there is no other way of recovering your funds.

Like cold storage solutions, hardware wallet solutions lack the speed that today’s digital asset businesses require.

Hot Wallets

Alternatively, storing funds in a hot wallet is cumbersome due to error-prone copy-pasting of addresses, ever-changing whitelists, and constant 2FA rituals.

Some hot wallets utilize multisignature, or multisig, technology to divide private keys into multiple shares. Unfortunately, multisig is not protocol-agnostic (meaning it’s not compatible with all blockchains) and lacks the operational flexibility to support growing teams.

As a result, the best solution is one that meets both operational and institutional security requirements, storing the private key safely while not hindering operational efficiency.

MPC for Private Key Security

With MPC, private keys (as well as other sensitive information, such as authentication credentials) no longer need to be stored in one single place. The risk involved with storing private keys in one single location is referred to as a “single point of compromise.” With MPC, the private key is broken up into shares, encrypted, and divided among multiple parties.

Each party independently computes over the private key share it holds to produce a signature, without revealing its share to the other parties. This means there is never a time when the full private key is assembled in one place; instead, it exists in a fully “liquid,” distributed form.

Ordinarily, when a single private key is stored in one place, a wallet’s owner would need to trust that the device or party that holds that private key is completely secure. Such a device could be an HSM or, less securely, a crypto exchange that essentially holds the customer’s private keys on their behalf.

However, these parties have proven themselves to be vulnerable. When an attacker only needs to succeed in hacking one point of compromise to steal a private key, it leaves the digital assets that key unlocks wide open to theft.

MPC does away with this problem, as the private key is now no longer held by any one party at any point in time. Instead, it is decentralized and held across multiple parties (i.e. devices), each blind to the other. Whenever the key is required, MPC is set in motion to confirm that all parties, or a predetermined number of parties out of the full set, approve of the request.
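The “predetermined number of parties out of the full set” idea is classically realized with Shamir’s secret sharing, where any t of n shares reconstruct the secret and fewer reveal nothing. A minimal sketch follows; note that production MPC systems sign with the shares rather than ever reconstructing the key, so this is illustrative only:

```python
import random

P = 2**127 - 1   # public prime field (a Mersenne prime)

def shamir_split(secret: int, t: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n points on a random degree-(t-1) polynomial f, with f(0) = secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def shamir_combine(points: list[tuple[int, int]]) -> int:
    """Recover f(0) from any t points via Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = shamir_split(123456789, t=2, n=3)   # any 2 of the 3 shares suffice
assert shamir_combine(shares[:2]) == 123456789
assert shamir_combine(shares[1:]) == 123456789
```

With t=2 and n=3, any single compromised device yields nothing, while any two devices can jointly act – the same t-of-n policy the text describes.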

With MPC technology in play, a potential hacker now has a much harder task ahead of them. To gain control over a user’s wallet, they now need to attack multiple parties across different operating platforms at different locations simultaneously.

MPC thus solves the problem of secure key storage. Because the key no longer resides in one single place, it also allows more personnel to access a wallet without the risk of any of them turning rogue and running off with the digital assets it contains.

In addition, with the private key completely secure, users can now hold their assets online and no longer need cumbersome cold-storage devices. This means that transferring digital assets is now more fluid and no compromise is required between security and operational efficiency.

Types of MPC Algorithms

Given its inherent properties, MPC is, in and of itself, a powerful tool for securing digital assets. However, not all MPC algorithms are created equal. Today, many institutions using MPC employ algorithms such as Gennaro and Goldfeder’s (MPC-GG18); while protocols like this one are still considered the industry standard by many, they do not reach the level of efficiency, security, or operational flexibility that certain newer MPC algorithms achieve.

To effectively run a profitable digital asset business in today’s ever-changing market or execute high-volume withdrawal requests for a large retail customer base, financial institutions (such as exchanges, lending providers, and banks) require instant and secure access to funds.

However, due to a complex regulatory environment, many of these institutions are forced to operate with secure but slow cold storage solutions. So, the compatibility of an algorithm with cold storage is another important factor to consider when evaluating MPC algorithms.

The Gennaro and Goldfeder MPC Algorithm

Gennaro and Goldfeder’s algorithm is currently one of the top MPC algorithms available, and many institutions that protect their private data using MPC utilize this algorithm.

However, with Gennaro and Goldfeder’s algorithm, the communication latency between the MPC parties (the devices that hold the key shares) doesn’t reach the highest level of efficiency, as it requires users to wait for transactions to undergo up to 9 signature rounds.

In addition, Gennaro and Goldfeder’s algorithm doesn’t offer any flexibility for institutions that need to use cold storage.

The Lindell et al. MPC Algorithm

Lindell et al.’s algorithm offers a slight decrease from Gennaro and Goldfeder in the number of signing rounds required, at 8. However, this still doesn’t reach the level of operational efficiency necessary for today’s markets.

Like Gennaro and Goldfeder, Lindell et al. does not offer support for cold storage.

The Doerner et al. MPC Algorithm

Doerner et al.’s MPC algorithm achieves a threshold signature in just 6 signing rounds. Yet, again, the level of efficiency that’s possible with today’s technology is still higher than this.

And like the previous two algorithms, Doerner et al. can’t provide solutions for institutions that are looking to use cold storage in tandem with MPC.

MPC-CMP: The Newest Innovation in MPC

Building off of the groundwork laid by Gennaro and Goldfeder, the Fireblocks cryptography team (in collaboration with Professor Ran Canetti, the founder of the universal composability security model) recently developed and released a new algorithm, MPC-CMP. MPC-CMP enables digital asset transactions to be signed in just 1 round, making it roughly 800% faster than the 9-round industry standard and the fastest-signing MPC algorithm available.

MPC-CMP also solves the challenges faced by businesses looking to use cold storage in tandem with MPC by allowing hot and cold key signing mechanisms – with at least one key share stored offline in an air-gapped device.

This introduces new configuration possibilities for institutions in regions with specific regulations around cold storage and strengthens the security of MPC-based wallets by adding a key refresh mechanism (minutes-long intervals). While traditional cold wallets require physical proximity and trust for certain employees to operate these wallets without making an error or acting maliciously, MPC-CMP operationalizes cold wallets – creating a solution for today’s high-paced crypto markets.

With the new algorithm, we’ve introduced a new security feature that ensures MPC key shares are automatically refreshed in minutes-long intervals. That means a malicious actor only has a few moments to steal all of the key shares before they are refreshed and the attacker has to start over – effectively adding a new layer of protection to our multi-layered security system.
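The share-refresh idea can be sketched with a 2-of-2 additive sharing: adding a random offset to one share and subtracting it from the other yields fresh shares of the same key, so shares stolen before a refresh are useless afterward. This is a simplified model of proactive secret sharing, not the MPC-CMP refresh protocol itself:

```python
import random

P = 2**61 - 1   # public prime modulus for the share arithmetic

def refresh(share_a: int, share_b: int) -> tuple[int, int]:
    """Re-randomize a 2-of-2 additive sharing without changing the secret."""
    r = random.randrange(P)
    return (share_a + r) % P, (share_b - r) % P

key = 42424242                    # stand-in for a secret signing key
a = random.randrange(P)
b = (key - a) % P                 # initial sharing: a + b = key (mod P)

a2, b2 = refresh(a, b)
assert (a2 + b2) % P == key       # the shares changed, the key did not
# An attacker who stole the old share `a` cannot combine it with the new
# share `b2`: (a + b2) mod P differs from the key by the refresh offset r.
```

Because each refresh rerandomizes both shares, an attacker must compromise all parties within a single refresh window, which is the property the interval-based refresh is designed to enforce.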

MPC is open-source and peer-reviewed. We will not be applying for patents on MPC-CMP. That means all digital asset custodians and MPC vendors can access our new protocol and use it for free. In addition, the algorithm is universally composable, guaranteeing strong security properties for any implementation out of the box. Universal composability is important to practical implementations of new cryptography: universally composable protocols remain secure even when arbitrarily composed with other protocols, guaranteeing that security is not compromised even when multiple transactions are signed concurrently in parallel.

Algorithm               Transaction Rounds   Universally Composable   Cold Storage Compatible   Peer-Reviewed   Open-Source
Gennaro and Goldfeder   9                    No                       No                        Yes             Yes
Lindell et al.          8                    No                       No                        Yes             No
Doerner et al.          6                    No                       No                        Yes             No
MPC-CMP                 1                    Yes                      Yes                       Yes             Yes

What’s next for MPC?

MPC has quickly become the standard for securing digital assets. Major financial institutions – including Celsius (the biggest US crypto lending desk) and Revolut (Europe’s largest neobank) – have announced their transition to MPC. But in 2021, MPC is only one part of the equation for digital asset security.

As we’ve seen over the years, the best defense against cybercriminals is a multilayered one that can provide redundancy in the event that one of the security controls fails. That’s why today’s institutions require a security system that layers MPC alongside numerous other software and hardware defenses to make breaking in highly expensive and nearly impossible.

At Fireblocks, our “defense-in-depth” security system fulfills these requirements, utilizing Intel SGX chip-level hardware isolation, distribution of sensitive information across multiple tier-1 cloud providers, and a highly customizable policy engine in addition to MPC. Today, we’re using MPC-CMP – the fastest and most secure MPC algorithm currently available – adding a new degree of flexibility to the equation (including the ability to sign MPC transactions from a hardware storage device).


What led to FTX collapse?

What. A. Week. The world’s third largest crypto exchange just went belly up, and the industry’s white knight got dragged down in the process. Grab an (Irish) coffee, and let’s figure out what the hell happened and what the implications are.

Sam Bankman-Fried (SBF) is the CEO of FTX, one of the world’s largest exchanges, and also the principal shareholder of Alameda Research, a separate trading firm he founded earlier. There is also an entity called FTX.US, operated as a distinct U.S. operation separate from FTX, with SBF again acting as a principal though not overseeing day-to-day operations. FTX has received funding from big names like BlackRock, SoftBank, Temasek, Tiger Global, and the Ontario Teachers’ Pension Plan. FTX was also a large investor itself through FTX Ventures, making over 60 individual investments.

FTX has an exchange token called FTT which can be used for benefits on FTX like lower fees and cheaper withdrawals. It is also considered pseudo-equity as a third of the revenue generated on the exchange is used to buyback and burn FTT, tying its price to the success of FTX.

The relationship between FTX and Alameda was always a bit unclear for outside parties, as SBF obviously played a pivotal role in the direction of the two distinct entities.

The trouble began when Coindesk reported on November 2 that a large portion of Alameda’s assets (~$6 billion of the ~$15 billion) was held in FTX’s FTT token. This degree of balance sheet concentration, the fact that the position was largely illiquid (representing ~2x the circulating supply of FTT), and the link between the two entities caused immediate market concern. SBF made the link even more dubious by promoting the token on Twitter as recently as Halloween.

On November 6, Binance CEO Changpeng “CZ” Zhao announced that his exchange would liquidate $2.1 billion worth of FTT. Alameda offered to buy it at the market price of $22. CZ publicly seemed to decline…and so did FTT. Sharply.

With the price of FTT rapidly declining, concerns arose about FTX’s liquidity and whether user funds were fully backed. About $6 billion was withdrawn in 72 hours before withdrawals were halted on the afternoon of November 8. Later that day Binance agreed, in principle, to acquire FTX and stem any liquidity concerns. However, after reviewing the FTX balance sheet, Binance walked away from the deal, citing that “the issues are beyond our control or ability to help.” They also explicitly cited “reports regarding mishandled customer funds and alleged U.S. agency investigations” as further reasons.

Just this morning (November 10) SBF released a statement blaming an internal error of tracking user margins for the liquidity crunch. He said FTX would spend the week trying to raise liquidity from investors and all funds will hopefully be returned to users. We’ll see.


Google Cloud To Run Solana Validator

After announcing that it would offer a product called the “Blockchain Node Engine,” with Ethereum as the first supported blockchain, Google Cloud revealed on Nov. 5, 2022 that it was running a Solana validator. Google said its cloud division is currently working with Solana to bring the Blockchain Node Engine to the Solana chain.

Google Is Operating a Block-Producing Solana Validator, Firm’s Cloud Division Aims to Add Solana to Blockchain Node Engine’s Supported Blockchains in 2023

Google is stepping toward other blockchains after revealing the Blockchain Node Engine at the end of October. According to a tweet from Google Cloud’s official Twitter account, the firm is already running a Solana validator. “Google Cloud is running a block-producing [Solana] validator to participate in and validate the network,” the company tweeted. The company added:

Google Cloud is working with [Solana] to bring Blockchain Node Engine to the Solana chain next year, so it will be easy for anyone to launch a dedicated Solana node in the cloud.

In addition to bringing Blockchain Node Engine support to Solana, Google Cloud will be indexing the Solana blockchain and bringing the data to BigQuery next year. The goal is to “make it easier for the Solana developer ecosystem to access historical data.” Solana’s native token solana (SOL) jumped in value on the announcement, and SOL is up 7.8% against the U.S. dollar this week.

However, not everyone was thrilled about Google’s mission to host Solana nodes in the cloud. “This … Is not a good thing. More and more centralization. This is like the opposite point of crypto,” one person criticized in the Google Cloud announcement thread on Twitter. Another individual agreed with the person’s critique and replied: “Crypto went full redacted in 2017. People now celebrate centralized and custodial services because it makes number go up.”


Avoid War and Have the Best Times

Ray Dalio explained on LinkedIn the four big threats that should create worry and the one big force that should create optimism. The four big threats are the financial/economic threat, the internal conflict threat inside the US, the external conflict threat, and the acts-of-nature threat (droughts, floods, pandemics, etc.). The one big force that provides reason for optimism is man’s capacity to adapt and invent ways of improving things.

While these threats exist, the world is in the best position in history by most measures of well-being, such as life expectancy, real incomes, and real wealth; so if we handle these big worries well, things should be better than ever. Of course, averages hide enormous differences, but the capacity to deal with these extreme differences exists, so the potential to have the best times ever exists if we can deal with the four big threats well.

To have better times than ever we need to:

  1. Get our finances in order through a mix of a) being more productive by investing in those things that make us more productive and benefit most people (such as education), and b) engineering a “beautiful deleveraging” that spreads out and reduces the real debt liabilities and assets relative to real incomes. (If you want an explanation of how to engineer a beautiful deleveraging, see his book Principles for Navigating Big Debt Crises).
  2. Develop a strong and smart political middle that represents the majority of people and can defeat the extreme populist minority so that we can work and live well together. Because the policies that the majority of people want are both most acceptable and more sensible than the policies that the extremists are fighting for, it should not be all that difficult to put together a platform that represents what the majority in the middle wants.
    How would such bipartisanship work? For elections it can occur in a variety of ways. As for governing, it can occur in a number of ways if leaders want it, such as the next president of the US choosing to have a bipartisan Cabinet of smart people and to initiate a bipartisan “Manhattan Project” type initiative to make economic reforms that would both significantly improve productivity and benefit the majority of people.
  3. Have rival countries develop agreements and protocols that would minimize the chances of military wars. This could involve having each leader delegate teams to look at the existential threats posed by other nations and negotiate paths for minimizing the risk of fighting over them. If parties could work on minimizing the existential risks of the other parties, that would go a long way to avoiding wars.
  4. As for acts of nature, Ray Dalio will defer to others more knowledgeable than he is to suggest ways to cost-effectively deal with them.
  5. As for man’s ability to adapt and invent, he thinks that is naturally happening in the greatest way ever. That’s because of the development of technologies that help people think about how to make such improvements and because of the development of venture capital markets to finance entrepreneurs with good ideas in numbers and amounts that are unprecedented.

While we might think that the odds of doing these things are improbable, they are certainly possible and could even become probable if most people demanded that their leaders and political parties move in directions like these.


Web3 Domain Alliance W3DA

The Web3 Domain Alliance is a member-led, member-driven organization dedicated to improving the technological and public policy environments for users of blockchain naming services.

W3DA Mission

This alliance promotes the development of the blockchain naming ecosystem and the functioning of blockchain domain registries with and across blockchain-based and traditional web applications.

The Web3 Domain Alliance is dedicated to the technological advancement of blockchain domain registries, as well as consumer protection by ensuring the interoperability of blockchain domain registries.

The Web3 Domain Alliance believes that blockchain-based generic web3 Top Level Domains (“TLDs”) developed and marketed by a specific organization are intellectual property, and that industry participants should respect the intellectual property rights of all blockchain naming services for the benefit of consumers as well as applications that want to support blockchain domain functionality.

W3DA Pledge

As a member of the Web3 Domain Alliance, companies pledge the following:

  • To abide by the Web3 Domain Alliance’s principles of developing unique, interoperable blockchain domain namespaces and NFT domains, and of advocating for the legal protection and market acceptance of blockchain namespaces.
  • To protect users and endeavor toward the development of interoperable NFT domain naming systems by promoting voluntary avoidance of namespace collisions with existing Web3 naming systems in the Web3 domain industry.
  • To advocate for the policy position that NFT domain registry owner-operators create trademark rights in their web3 TLDs through first commercial use with market penetration.
  • To protect our intellectual property rights, including trademark rights, in our web3 TLDs.
  • To work with fellow Web3 Domain Alliance members to promote the advancement of the Web3 Domain Alliance’s policy positions.
  • To work with fellow Web3 Domain Alliance Members to advocate for recognition of NFT domains by a broad community of stakeholders, and the public.

W3DA Partners



Decentralized Capital Market (DeCap)

The world is rapidly changing and updating, or rather, upgrading. Especially from 2020 onwards, we are witnessing developments in technology that previously could be found only in sci-fi movies. Among these technologies are drones, which today are available to teenagers as toys.

In the field of fintech (financial technology), digital currencies, or to be more precise, cryptocurrencies, are at the beginning of their maturity. From 2021 on, there has been news every week of companies from all over the world joining in. Even banks, restaurants, other financial, commercial, or service institutions, and even governments are adopting this emerging technology (blockchain).

Even in the field of financing, concepts such as crowdfunding have been introduced, and in the financial markets, automated trading robots and algorithmic trading have entered into competition with each other. There are even tournaments around the world in which the best robots and algorithms find their way to portfolio managers, investment companies, and stock exchanges. One of the oldest examples is Renaissance Technologies, whose advanced algorithm, known as the Renaissance Black Box, has helped its clients’ investment portfolios achieve the best long-term ROI compared to their competitors.

However, if we consider financing more fundamentally, we will see that for nearly a decade, companies have been raising part of the capital needed for liquidity, production, services, or development plans through IPOs on stock exchanges.

In my opinion, the era of this traditional type of financing has come to an end; in less than ten years, it will be replaced by financing through the Decentralized Capital Market (DeCap), with the help of cryptocurrency liquidity pools. Currently (May 21st, 2020), the market cap of all digital currencies is about two trillion dollars. On the other hand, the market cap of the portion of Apple shares offered on the stock market is almost the same. This means that if all cryptocurrencies were injected (staked) by their owners into a liquidity pool, they could provide only the capital required for the Apple shares offered on the New York Stock Exchange.

Considering the growing trend of the crypto market and the emergence of tools such as lending protocols and decentralized finance (DeFi), it is not a long way to the point where small and medium companies, and then large companies, finance the capital they require through DeCap.

Obviously, in the meantime, the role of stock brokers will fade gradually, because decentralized cryptocurrency brokerages such as DEXes will eventually take their place.


Hivemapper HONEY cryptocurrency

Introduction

Hivemapper is building one of the first on-chain, decentralized global maps powered completely by the people. Since launch in November 2022, the Hivemapper network has significantly grown its active contributor base and map coverage.

Over 301,000 unique kilometers have been added to the network’s decentralized map in the past 3 months. Most of the coverage is concentrated around major metropolitan areas such as Los Angeles and Seoul. Additionally, the daily average of unique mapping contributions has increased by 4.5x since January, largely due to the steadily rising number of network contributors.

 

The total unique user base has grown by 30x since November to nearly 6,300 users. This metric comprises 30% mapping contributors, 69% quality assurance (QA) contributors, and <1% data users. Mapping contributors are particularly important because they actively trace roadways with a Hivemapper dashcam. The growth of this user segment is directly correlated with the Hivemapper team’s dashcam production, which has been increasingly catching up to their ever-growing waitlist.

QA contributors verify contributions from dashcam footage. Both the mapping and QA contributors receive the platform’s native HONEY token in exchange for their contributions. The combined efforts of these contributors will ideally create a comprehensive, high-quality, and up-to-date map of the world.

Finally, data users query the network’s API for external usage. Enterprises and developers can purchase map credits — fixed at $0.02 USD — that can be used to query map data. As users consume map imagery, a corresponding amount of the HONEY token is burned. Then, an equivalent number of HONEY tokens is added back to complete Hivemapper’s net emission model.

Current Usage

Currently less than 1% of the world’s roadways have made their way onto Hivemapper’s network. As such, it’s not terribly surprising that there is very little usage of Hivemapper’s map data since launch. The network needs to expand its mapped territory before it can incentivize external application development. So how long will it take before the Hivemapper team has a usable dataset?

Let’s look at Los Angeles, one of the most active cities in terms of mapping contributors. According to the global contribution metrics over the past two months, each mapper contributes an average of 10 unique kilometers per day. The city currently boasts 120 mappers, with about 15% of the 134,000 total kilometers mapped.

Assuming no new contributors are added and that contributors prioritize mapping previously untouched areas, LA will have a functioning map in approximately 3 months. Granted, this timeline is a rough estimate that ignores several key details involved in how map footage is sourced, vetted, and approved. However, if the same logic can be applied to other active urban areas, then developers could potentially start building on Hivemapper’s network within the year.
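The back-of-the-envelope arithmetic behind that estimate can be written out directly. The inputs are the figures quoted above; the 30-day month is a simplifying assumption:

```python
# Rough timeline estimate for mapping Los Angeles, using the figures above.
total_km = 134_000            # total LA roadway kilometers
mapped_fraction = 0.15        # share already mapped
mappers = 120                 # active contributors in the city
km_per_mapper_per_day = 10    # average unique km per mapper per day

remaining_km = total_km * (1 - mapped_fraction)   # km still unmapped
daily_rate = mappers * km_per_mapper_per_day      # km mapped per day
days = remaining_km / daily_rate
print(f"{days:.0f} days, about {days / 30:.1f} months")
```

This lands at roughly 95 days, consistent with the "approximately 3 months" figure, and makes explicit which assumptions (no new contributors, no re-mapping of covered tiles) drive the result.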

Mapping Economy

Billions of people around the world depend on maps each day. Maps are used by insurance providers, real estate services, logistics companies, navigation and delivery apps, and governmental organizations, just to name a few. Maps are an essential part of the world’s technology infrastructure and represent a $300B market.1

The Problem

Today, global maps are largely controlled by a few companies because they are extremely expensive to build. This has led to numerous problems with today’s maps:
  • High costs for businesses: The cost of mapping APIs continues to increase due to the lack of competition.2
  • Uneven freshness and coverage: These maps are not as fresh as they should be due to the incredible expense required to build and maintain them. Developing markets have vastly inferior maps given that the high cost to map breaks the economic models.
  • Free use of user data: Existing maps use our private and sensitive location data to improve their own maps without compensation.
  • Prone to censorship: Big tech companies operate a wide set of businesses in many countries, and are often forced to censor or modify their maps to satisfy politicians and land owners.

The Benefits of Building a Mapping Network on the Blockchain

Blockchain and crypto incentives solve these problems by dramatically reducing the cost to map on a global scale, while rewarding contributions to the map and promoting freshness and uncensored quality.
High quality 4K dashcams are widely available for only hundreds of dollars and can be connected to software for efficient collection of 4K street-level imagery. Map QA reviewers, annotators, and annotation QA reviewers are incentivized with ownership to maintain the quality of the map. Additionally, thanks to the decreasing cost of machine vision compute cycles it is now feasible to transform imagery to valuable map data on a global scale.
Imagine a network of map contributors and map consumers intricately connected on a blockchain, participating in the exchange of valuable map data. Map contributors build and refresh the map by capturing 4K street-level imagery, carrying out quality assurance, and annotating imagery. Map consumers leverage the map via a set of APIs such as images, direction, geocoder, and more.

Hivemapper’s decentralized global map and cryptocurrency

The Hivemapper Network is a system that incentivizes map coverage, freshness, and quality with ownership. By installing a simple 4K dashcam on a car or truck, a contributor can earn a new cryptocurrency, own part of the decentralized global map, and support the world’s critical geospatial infrastructure in a cost-effective way.
Powering the Hivemapper Network is the decentralized global map on the blockchain and its cryptocurrency, HONEY. With the introduction of a decentralized mapping network and cryptocurrency, the network injects decentralization and built-in crypto incentives into an industry currently controlled by monopolies and governments that take our data for free. The result is a state-of-the-art map of our world that is constantly renewing, high quality, truly global, and collectively owned by its contributors.
Fig. 1 / The Hivemapper Network – This diagram shows the two-sided marketplace between map contributors and map consumers interacting on the Hivemapper Network with its cryptocurrency token HONEY.

System Overview

Map Tiles

The global map is divided into small hex-shaped tiles referred to as map tiles. Map tiles are the atomic unit of the map and are based on H3 cells. We intentionally use small H3 cells as the basis for our map tiles (shown in the graphic below) to make it easy to start contributing and to avoid issues with partially contributed tiles.

Fig. 2 / Map Tiles – Map tiles are the atomic unit of the Hivemapper map that contributors use to build and refresh coverage. Each map tile is hex shaped. Trillions of these hex shaped tiles cover the entire earth.
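To make the tile idea concrete, here is a toy sketch of bucketing coordinates into cell IDs. Note this is only an illustration: the real network uses Uber's H3 hexagonal cells (e.g. via the `h3` library), not the square lat/lng grid shown here, and the cell size is an arbitrary assumption:

```python
import math

def toy_tile_id(lat: float, lng: float, cell_deg: float = 0.001) -> tuple:
    """Bucket a coordinate into a square grid cell.

    Simplified stand-in for H3: H3 cells are hexagonal and hierarchical,
    while this toy grid just floors coordinates to a fixed resolution.
    """
    return (math.floor(lat / cell_deg), math.floor(lng / cell_deg))

# Two nearby points land in the same tile; a distant point does not.
a = toy_tile_id(34.05220, -118.24370)   # downtown Los Angeles
b = toy_tile_id(34.05221, -118.24371)   # a few meters away
c = toy_tile_id(37.56650, 126.97800)    # Seoul
print(a == b, a == c)
```

The same bucketing idea is what lets the network treat "coverage" as a set of discrete, countable tiles rather than raw GPS traces.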
4K Street-Level Imagery to Map Tiles

The 4K street-level imagery and related GPS metadata collected with approved dashcams and the Hivemapper app serve as the raw ingredients for the global map. This data provides coverage for map tiles as seen in Figure 3 below.

Fig. 3 / Map Tiles Along a Path – Map tiles that have been covered on the road are represented in pink, as shown by H3 cells.
Map Contributors
The collective work of map contributors on the Hivemapper Network results in a high quality global map. The chart below details the different types of map contributors and the type of work they do on the mapping network.
  • Drivers: Collect 4K street-level imagery via a supported 4K dashcam and the Hivemapper app
  • Imagery QA Reviewers: Validate the 4K street-level imagery collected by drivers
  • Annotators: Annotate the map with details such as street direction and name. Annotations can be broadly defined; for example, a customer may pay to add artwork associated with the map
  • Annotation QA Reviewers: Validate the work of annotators
  • Software Developers: Extend the protocol and product, and work on core mapping features and the Hivemapper app
The Hivemapper Dashcam
Fig. 4 / Hivemapper Dashcam – The Hivemapper Dashcam is an open source camera based on the Open Dashcam specifications that is optimized for collecting imagery from a vehicle for the purpose of mapping.
In the summer of 2022, Hivemapper will begin selling the world’s first crypto-enabled dashcam. The Hivemapper Dashcam is an open source dashcam that natively integrates with the Hivemapper Mapping Network through a seamless mobile app.
The Hivemapper Dashcam is based on the Open Dashcam specifications, which ensure:
  • Location Authentication: Multiple layers of security to ensure that the dashcam is authentically geolocating its position
  • Automatic data transfers: Automatically transfers the collected data from the dashcam to the Hivemapper Network via integration with the Hivemapper Contributor App for iPhone and Android
  • Dynamic data collection: Dynamically determines the data required for the map – ignores the rest

These capabilities make it the ideal dashcam for mapping on a decentralized network.
Hardware manufacturers can incorporate the Open Dashcam specifications in their own dashcams, and seek approval from the Hivemapper Foundation to run their compliant dashcams on the mapping network.
Honey Token

HONEY is the native cryptocurrency of the Hivemapper decentralized mapping network. The Hivemapper Network is built on top of the Solana blockchain.

Map contributors mine HONEY by contributing to the Hivemapper Network. New HONEY tokens are minted on a decreasing schedule over time. The maximum number of HONEY tokens that will ever exist is 10 billion.
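A capped, decreasing emission schedule can be sketched as follows. The initial epoch size and the halving-style decay below are illustrative assumptions, not the protocol's actual parameters; only the 10 billion cap comes from the text:

```python
MAX_SUPPLY = 10_000_000_000  # hard cap on HONEY, per the text above

def cumulative_emissions(initial_per_epoch, decay=0.5, epochs=10):
    """Cumulative supply under a geometrically decaying mint schedule."""
    minted, per_epoch, history = 0.0, initial_per_epoch, []
    for _ in range(epochs):
        mint = min(per_epoch, MAX_SUPPLY - minted)  # never exceed the cap
        minted += mint
        history.append(minted)
        per_epoch *= decay  # each epoch mints less than the last
    return history

supply = cumulative_emissions(5_000_000_000)
print(supply[0], supply[-1])  # supply grows but stays below the 10B cap
```

Whatever the real decay parameters are, the structural point is the same: per-epoch issuance shrinks while cumulative supply approaches, and never exceeds, the cap.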
Map API Services
APIs built on top of the global map enable developers to cost effectively integrate maps and geolocation services into their applications. Today, Hivemapper Inc. offers individuals and organizations the powerful Map Image API. The set of APIs that Hivemapper Inc. and other organizations can build and commercialize on top of the global map includes Driving Directions, Geo Search, Traffic, and more.
The Hivemapper Foundation ensures that any company or organization can build and commercialize API services on top of the global map.
Map Consumers
Customers who want to integrate Hivemapper’s map APIs into their applications can purchase API calls with Map Credits. Map Credits are created by burning HONEY tokens, ahead of any use of map APIs. These burned tokens then increase the number of tokens available to mint and to pay to map contributors.
The Burn and Mint Equilibrium
The Burn and Mint Equilibrium with Net Emissions model is used, such that whenever map consumers burn tokens to access the network, an equivalent number of tokens is added back into the rewards pool for map contributors as shown in Figure 5 below.
Fig. 5 / The Burn and Mint Equilibrium – When map consumers use map APIs this transaction burns tokens, and an equivalent number of tokens is added back into the pool to reward contributors.
At a high level, contributors, including drivers, annotators, and QA reviewers, build the database of fresh and accurate map data in exchange for HONEY. Consumers of the data pay into the network by purchasing and burning tokens to compensate the contributors for their work.
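The burn-and-mint flow described above fits in a few lines of code. The $0.02 map credit price comes from the text; the starting balances and the HONEY price are illustrative assumptions:

```python
CREDIT_PRICE_USD = 0.02  # fixed map credit price, per the text

class HoneyNetwork:
    """Toy model of the Burn-and-Mint Equilibrium with net emissions."""

    def __init__(self, circulating: float, rewards_pool: float):
        self.circulating = circulating    # HONEY in circulation
        self.rewards_pool = rewards_pool  # HONEY available for contributors

    def buy_credits(self, usd: float, honey_price_usd: float) -> float:
        """Burn HONEY for Map Credits; net emissions refill the pool."""
        honey_burned = usd / honey_price_usd
        self.circulating -= honey_burned   # consumers burn tokens...
        self.rewards_pool += honey_burned  # ...and an equivalent amount
                                           # is added back for contributors
        return usd / CREDIT_PRICE_USD      # credits issued to the consumer

net = HoneyNetwork(circulating=1_000_000, rewards_pool=0)
credits = net.buy_credits(usd=100, honey_price_usd=0.10)
print(credits, net.circulating, net.rewards_pool)
```

At an assumed $0.10 HONEY price, a $100 purchase burns 1,000 HONEY, issues 5,000 credits, and returns 1,000 HONEY to the contributor rewards pool, which is the "net emission" loop in Figure 5.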

Future Areas

Today, the mapping network focuses on street level maps. Next, the network will introduce map annotations enabling map editors to edit the map alongside the machine learning algorithms to add new layers of data to the map.
Yet, this is just the beginning. Global data collection for a global map, incentivized by the HONEY cryptocurrency token and protocol, is the right approach to building a global map. With this in mind, there are multiple technologies that hardware and software developers can use to add additional layers to the global map in order to meet the needs of customers:
  • Additional imagery sensors such as 360 cameras from street level
  • Support for collecting imagery from scooters and bikes to support high quality maps for alternative transportation modalities
  • Use of air quality sensors to incorporate air quality data into the map
  • Use of lower cost RGB-D, radar, and LiDAR sensors to build 3D maps and street level object mapping
  • Airborne data collection via drones to provide the high precision aerial perspective
  • Use of satellite imagery for a broad scale aerial perspective

Sharpe Ratio

What Is the Sharpe Ratio?

The Sharpe ratio compares the return of an investment with its risk. It’s a mathematical expression of the insight that excess returns over a period of time may signify more volatility and risk, rather than investing skill.1

Economist William F. Sharpe proposed the Sharpe ratio in 1966 as an outgrowth of his work on the capital asset pricing model (CAPM), calling it the reward-to-variability ratio.1 Sharpe won the Nobel Prize in economics for his work on CAPM in 1990.2

The Sharpe ratio’s numerator is the difference over time between realized, or expected, returns and a benchmark such as the risk-free rate of return or the performance of a particular investment category. Its denominator is the standard deviation of returns over the same period of time, a measure of volatility and risk.

Key Takeaways

  • The Sharpe ratio divides a portfolio’s excess returns by a measure of its volatility to assess risk-adjusted performance
  • Excess returns are those above an industry benchmark or the risk-free rate of return
  • The calculation may be based on historical returns or forecasts
  • A higher Sharpe ratio is better when comparing similar portfolios.
  • The Sharpe ratio has inherent weaknesses and may be overstated for some investment strategies.


Formula and Calculation of Sharpe Ratio

In its simplest form,

Sharpe Ratio = (Rp − Rf) / σp

where:

Rp = return of portfolio
Rf = risk-free rate
σp = standard deviation of the portfolio’s excess return

Standard deviation is derived from the variability of returns for a series of time intervals adding up to the total performance sample under consideration.

The numerator’s total return differential versus a benchmark (Rp − Rf) is calculated as the average of the return differentials in each of the incremental time periods making up the total. For example, the numerator of a 10-year Sharpe ratio might be the average of 120 monthly return differentials for a fund versus an industry benchmark.

The Sharpe ratio’s denominator in that example will be those monthly returns’ standard deviation, calculated as follows:

  1. Take each incremental period’s deviation of the return from the average return, square it, and sum the squares across all of the incremental periods.
  2. Divide the sum by the number of incremental time periods.
  3. Take a square root of the quotient.
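The three steps above amount to a population standard deviation, which Python's standard library provides directly. Here is a minimal implementation with hypothetical monthly data (the fund and risk-free series below are invented for illustration):

```python
import statistics

def sharpe_ratio(portfolio_returns, benchmark_returns):
    """Average per-period excess return divided by the population
    standard deviation of those excess returns (steps 1-3 above)."""
    excess = [p - b for p, b in zip(portfolio_returns, benchmark_returns)]
    return statistics.mean(excess) / statistics.pstdev(excess)

# Six months of hypothetical fund returns vs. a flat risk-free rate.
fund = [0.02, 0.01, -0.01, 0.03, 0.015, 0.005]
risk_free = [0.003] * 6
print(round(sharpe_ratio(fund, risk_free), 2))
```

Note `pstdev` (divide by N) matches step 2 above; a sample standard deviation (`stdev`, divide by N − 1) would give a slightly lower ratio.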

What the Sharpe Ratio Can Tell You

The Sharpe ratio is one of the most widely used methods for measuring risk-adjusted relative returns. It compares a fund’s historical or projected returns relative to an investment benchmark with the historical or expected variability of such returns.

The risk-free rate was initially used in the formula to denote an investor’s hypothetical minimal borrowing costs.1 More generally, it represents the risk premium of an investment versus a safe asset such as a Treasury bill or bond.

When benchmarked against the returns of an industry sector or investing strategy, the Sharpe ratio provides a measure of risk-adjusted performance not attributable to such affiliations.

The ratio is useful in determining to what degree excess historical returns were accompanied by excess volatility. While excess returns are measured in comparison with an investing benchmark, the standard deviation formula gauges volatility based on the variance of returns from their mean.

The ratio’s utility relies on the assumption that the historical record of relative risk-adjusted returns has at least some predictive value.1

Generally, the higher the Sharpe ratio, the more attractive the risk-adjusted return.

The Sharpe ratio can be used to evaluate a portfolio’s risk-adjusted performance. Alternatively, an investor could use a fund’s return objective to estimate its projected Sharpe ratio ex-ante.

The Sharpe ratio can help explain whether a portfolio’s excess returns are attributable to smart investment decisions or simply luck and risk.

For example, low-quality, highly speculative stocks can outperform blue chip shares for considerable periods of time, as during the Dot-Com Bubble or, more recently, the meme stocks frenzy. If a YouTuber happens to beat Warren Buffett in the market for a while as a result, the Sharpe ratio will provide a quick reality check by adjusting each manager’s performance for their portfolio’s volatility.

The greater a portfolio’s Sharpe ratio, the better its risk-adjusted performance. A negative Sharpe ratio means the risk-free or benchmark rate is greater than the portfolio’s historical or projected return, or else the portfolio’s return is expected to be negative.


Sharpe Ratio Pitfalls

The Sharpe ratio can be manipulated by portfolio managers seeking to boost their apparent risk-adjusted returns history. This can be done by lengthening the return measurement intervals, which results in a lower estimate of volatility. For example, the standard deviation (volatility) of annual returns is generally lower than that of monthly returns, which are in turn less volatile than daily returns. Financial analysts typically consider the volatility of monthly returns when using the Sharpe ratio.
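The interval-length effect is easy to demonstrate with a synthetic mean-reverting series: measured monthly, the series looks volatile, but compounded into annual returns, the volatility nearly vanishes. The alternating +2%/−2% data below are purely illustrative:

```python
import math
import statistics

# Five years of synthetic monthly returns that mean-revert each month.
monthly = [0.02, -0.02] * 30          # 60 months

# Annualize the monthly standard deviation (square-root-of-time rule).
monthly_vol_annualized = statistics.pstdev(monthly) * math.sqrt(12)

# Compound each block of 12 months into one annual return.
annual = []
for year in range(5):
    growth = 1.0
    for r in monthly[year * 12:(year + 1) * 12]:
        growth *= 1 + r
    annual.append(growth - 1)
annual_vol = statistics.pstdev(annual)

print(round(monthly_vol_annualized, 4), round(annual_vol, 4))
```

The annualized monthly volatility is about 6.9%, while the volatility of the annual returns is essentially zero, so a Sharpe ratio computed on annual intervals would look far better than one computed on monthly intervals for the identical return stream.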

Calculating the Sharpe ratio for the most favorable stretch of performance rather than an objectively chosen look-back period is another way to cherry-pick the data that will distort the risk-adjusted returns.

The Sharpe ratio also has some inherent limitations. The standard deviation calculation in the ratio’s denominator, which serves as its proxy for portfolio risk, calculates volatility based on a normal distribution and is most useful in evaluating symmetrical probability distribution curves. In contrast, financial markets subject to herding behavior can go to extremes much more often than a normal distribution would suggest is possible. As a result, the standard deviation used to calculate the Sharpe ratio may understate tail risk.3

Market returns are also subject to serial correlation. The simplest example is that returns in adjacent time intervals may be correlated because they were influenced by the same market trend. But mean reversion also depends on serial correlation, just like market momentum. The upshot is that serial correlation tends to lower measured volatility, so investment strategies dependent on serial correlation factors may exhibit misleadingly high Sharpe ratios.4

One way to visualize these criticisms is to consider the investment strategy of picking up nickels in front of a steamroller that moves slowly and predictably nearly all the time, except for the few rare occasions when it suddenly and fatally accelerates. Because such unfortunate events are extremely uncommon, those picking up nickels would, most of the time, deliver positive returns with minimal volatility, earning high Sharpe ratios as a result. And if a fund picking up the proverbial nickels in front of a steamroller got flattened on one of those extremely rare and unfortunate occasions, its long-term Sharpe might still look good: just one bad month, after all. Unfortunately, that would bring little comfort to the fund’s investors.

Sharpe Alternatives: the Sortino and the Treynor

The standard deviation in the Sharpe ratio’s formula assumes that price movements in either direction are equally risky. In fact, the risk of an abnormally low return is very different from the possibility of an abnormally high one for most investors and analysts.

A variation of the Sharpe called the Sortino ratio ignores above-average returns to focus solely on downside deviation as a better proxy for the risk of a fund or portfolio.

The standard deviation in the denominator of a Sortino ratio measures the variance of negative returns or those below a chosen benchmark relative to the average of such returns.

Another variation of the Sharpe is the Treynor ratio, which divides excess return over a risk-free rate or benchmark by the beta of a security, fund, or portfolio as a measure of its systematic risk exposure. Beta measures the degree to which the volatility of a stock or fund correlates to that of the market as a whole. The goal of the Treynor ratio is to determine whether an investor is being compensated for extra risk above that posed by the market.
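Both alternatives can be sketched in a few lines. The Sortino denominator below follows the description above (standard deviation of below-target returns around their own mean); note that other common definitions instead square deviations from the target across all periods. The target, return series, and beta are illustrative assumptions:

```python
import statistics

def sortino_ratio(returns, target=0.0):
    """Excess return over target divided by downside deviation, where
    downside deviation is the std of below-target returns around their
    own mean, per the description above. Assumes at least two
    below-target returns with nonzero spread."""
    downside = [r for r in returns if r < target]
    downside_dev = statistics.pstdev(downside)
    return (statistics.mean(returns) - target) / downside_dev

def treynor_ratio(portfolio_return, risk_free_rate, beta):
    """Excess return per unit of systematic risk (beta)."""
    return (portfolio_return - risk_free_rate) / beta

rets = [0.04, 0.02, -0.03, 0.05, -0.01, 0.03]
print(round(sortino_ratio(rets), 2),
      round(treynor_ratio(0.15, 0.03, 1.2), 2))
```

The structural difference from the Sharpe ratio is only in the denominator: downside deviation for the Sortino, beta for the Treynor.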

Example of How to Use Sharpe Ratio

The Sharpe ratio is sometimes used in assessing how adding an investment might affect the risk-adjusted returns of the portfolio.

For example, an investor is considering adding a hedge fund allocation to a portfolio that has returned 18% over the last year. The current risk-free rate is 3%, and the annualized standard deviation of the portfolio’s monthly returns was 12%, which gives it a one-year Sharpe ratio of 1.25, or (18 – 3) / 12.

The investor believes that adding the hedge fund to the portfolio will lower the expected return to 15% for the coming year, but also expects the portfolio’s volatility to drop to 8% as a result. The risk-free rate is expected to remain the same over the coming year.

Using the same formula with the estimated future numbers, the investor finds the portfolio would have a projected Sharpe ratio of 1.5, or (15% – 3%) divided by 8%.

In this case, while the hedge fund investment is expected to reduce the absolute return of the portfolio, based on its projected lower volatility it would improve the portfolio’s performance on a risk-adjusted basis. If the new investment lowered the Sharpe ratio it would be assumed to be detrimental to risk-adjusted returns, based on forecasts. This example assumes that the Sharpe ratio based on the portfolio’s historical performance can be fairly compared to that using the investor’s return and volatility assumptions.
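The worked example above reduces to one line of arithmetic per portfolio:

```python
def sharpe(portfolio_return, risk_free, volatility):
    """One-period Sharpe ratio from annualized figures."""
    return (portfolio_return - risk_free) / volatility

current  = sharpe(0.18, 0.03, 0.12)   # the existing portfolio
proposed = sharpe(0.15, 0.03, 0.08)   # with the hedge fund allocation added
print(round(current, 2), round(proposed, 2))
```

The proposed allocation wins on a risk-adjusted basis (1.5 vs. 1.25) even though its absolute return is three points lower, which is exactly the trade-off the example describes.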

What is a Good Sharpe Ratio?

Sharpe ratios above 1 are generally considered “good,” offering excess returns relative to volatility. However, investors often compare the Sharpe ratio of a portfolio or fund with those of its peers or market sector. So a portfolio with a Sharpe ratio of 1 might be found lacking if most rivals have ratios above 1.2, for example. A good Sharpe ratio in one context might be just a so-so one, or worse, in another.

How is the Sharpe Ratio Calculated?

To calculate the Sharpe ratio, investors first subtract the risk-free rate from the portfolio’s rate of return, often using U.S. Treasury bond yields as a proxy for the risk-free rate of return. Then, they divide the result by the standard deviation of the portfolio’s excess return.


Deep Understanding of the Sharpe Ratio

Since William Sharpe’s creation of the Sharpe ratio in 1966, it has been one of the most referenced risk/return measures used in finance, and much of this popularity is attributed to its simplicity.1 The ratio’s credibility was boosted further when Professor Sharpe won a Nobel Memorial Prize in Economic Sciences in 1990 for his work on the capital asset pricing model (CAPM).2

The Sharpe Ratio Defined 

Most finance people understand how to calculate the Sharpe ratio and what it represents. The ratio describes how much excess return you receive for the extra volatility you endure for holding a riskier asset.3 Remember, you need compensation for the additional risk you take for not holding a risk-free asset.

We will give you a better understanding of how this ratio works, starting with its formula:

Sharpe Ratio = (rx − rf) / StdDev(x)

where rx is the average return of investment x, rf is the risk-free rate of return, and StdDev(x) is the standard deviation of the returns of x.

Return (rx)

The measured returns can be of any frequency (e.g., daily, weekly, monthly, or annually) if they are normally distributed. Herein lies the underlying weakness of the ratio: not all asset returns are normally distributed.

Kurtosis—fatter tails and higher peaks—or skewness can be problematic for the ratio as standard deviation is not as effective when these problems exist. Sometimes, it can be dangerous to use this formula when returns are not normally distributed.

Risk-Free Rate of Return (rf)

The risk-free rate of return is used to see if you are properly compensated for the additional risk assumed with the asset. Traditionally, the risk-free rate of return is the yield on the shortest-dated government T-bill (e.g., the U.S. T-bill). While this type of security has the least volatility, some argue that the risk-free security should match the duration of the comparable investment.

For example, equities are the longest duration asset available. Should they not be compared with the longest duration risk-free asset available: government-issued inflation-protected securities (IPS)? Using a long-dated IPS would certainly result in a different value for the ratio because, in a normal interest rate environment, IPS should have a higher real return than T-bills.

For instance, the Barclays Global Aggregate 10 Year Index returned 3.3% for the period ending Sept. 30, 2017, while the S&P 500 Index returned 7.4% within the same period.4 Some would argue that investors were fairly compensated for the risk of choosing equities over bonds. The bond index’s Sharpe ratio of 1.16 versus 0.38 for the equity index would indicate equities are the riskier asset.

Standard Deviation (StdDev(x))

Now that we have calculated the excess return by subtracting the risk-free rate of return from the return of the risky asset, we need to divide it by the standard deviation of the measured risky asset. As mentioned above, the higher the number, the better the investment looks from a risk/return perspective.

How the returns are distributed is the Achilles heel of the Sharpe ratio. Bell curves do not take big moves in the market into account. As Benoit Mandelbrot and Nassim Nicholas Taleb note in “How The Finance Gurus Get Risk All Wrong,” bell curves were adopted for mathematical convenience, not realism.5

However, unless the standard deviation is very large, leverage may not affect the ratio. Both the numerator (return) and denominator (standard deviation) can double without changing the ratio. Problems appear only when the standard deviation gets too high. For example, a stock that is leveraged 10-to-1 could easily see a price drop of 10%, which would translate to a 100% drop in the original capital and an early margin call.
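The leverage point above can be illustrated numerically. In this sketch, assuming borrowing at the risk-free rate, doubling exposure doubles both the excess return and the volatility, so the Sharpe ratio is unchanged. All figures are hypothetical.

```python
# Sharpe ratio under 2x leverage, assuming the borrowed funds cost the
# risk-free rate: excess return and volatility both double, ratio is flat.
def sharpe(ret, rf, stdev):
    return (ret - rf) / stdev

rf = 0.03
base_ret, base_std = 0.10, 0.08      # hypothetical unleveraged portfolio
lev_ret = 2 * base_ret - rf          # 2x exposure minus borrowing cost
lev_std = 2 * base_std               # volatility scales with exposure

print(round(sharpe(base_ret, rf, base_std), 3))  # → 0.875
print(round(sharpe(lev_ret, rf, lev_std), 3))    # → 0.875
```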


The Sharpe Ratio and Risk

Understanding the relationship between the Sharpe ratio and risk often comes down to measuring the standard deviation, also known as the total risk. The square of standard deviation is the variance, which was widely used by Nobel Laureate Harry Markowitz, the pioneer of Modern Portfolio Theory.6

So why did Sharpe choose the standard deviation to adjust excess returns for risk, and why should we care? We know that Markowitz understood variance, a measure of statistical dispersion indicating how far the data lie from the expected value, as something undesirable to investors.7 The square root of the variance, or standard deviation, has the same unit form as the analyzed data series and is commonly used to measure risk.

The following example illustrates why investors should care about variance:

An investor has a choice of three portfolios, all with expected returns of 10% for the next 10 years. The average returns in the table below indicate the stated expectation. The returns achieved over the investment horizon are indicated by the annualized returns, which take compounding into account. As the data table illustrates, the standard deviation pulls returns away from the expected return. If there is no risk—zero standard deviation—your returns will equal your expected returns.

Expected Average Returns

Year Portfolio A Portfolio B Portfolio C
Year 1 10.00% 9.00% 2.00%
Year 2 10.00% 15.00% -2.00%
Year 3 10.00% 23.00% 18.00%
Year 4 10.00% 10.00% 12.00%
Year 5 10.00% 11.00% 15.00%
Year 6 10.00% 8.00% 2.00%
Year 7 10.00% 7.00% 7.00%
Year 8 10.00% 6.00% 21.00%
Year 9 10.00% 6.00% 8.00%
Year 10 10.00% 5.00% 17.00%
Average Returns 10.00% 10.00% 10.00%
Annualized Returns 10.00% 9.88% 9.75%
Standard Deviation 0.00% 5.44% 7.80%
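The table’s summary rows can be recomputed from the yearly returns. This sketch assumes the annualized figure is the geometric mean and the standard deviation is the sample (n−1) statistic.

```python
# Recompute the table's summary rows for portfolios B and C from their
# yearly returns (Portfolio A is trivially 10% / 10% / 0%).
import statistics

portfolio_b = [0.09, 0.15, 0.23, 0.10, 0.11, 0.08, 0.07, 0.06, 0.06, 0.05]
portfolio_c = [0.02, -0.02, 0.18, 0.12, 0.15, 0.02, 0.07, 0.21, 0.08, 0.17]

def annualized(returns):
    """Geometric mean return: compounding the series, then taking the nth root."""
    growth = 1.0
    for r in returns:
        growth *= 1 + r
    return growth ** (1 / len(returns)) - 1

for name, rets in [("B", portfolio_b), ("C", portfolio_c)]:
    print(name,
          round(statistics.mean(rets) * 100, 2),    # average return, %
          round(annualized(rets) * 100, 2),         # annualized return, %
          round(statistics.stdev(rets) * 100, 2))   # standard deviation, %
```

This reproduces the table: Portfolio B comes out at 10.00% average, 9.88% annualized, 5.44% standard deviation, and Portfolio C at 10.00%, 9.75%, and 7.80%. The higher the dispersion, the further the compounded (annualized) return falls below the 10% average.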

Using the Sharpe Ratio

The Sharpe ratio is a measure of risk-adjusted return often used to compare the performance of investment managers.

For example, Investment Manager A generates a return of 15%, and Investment Manager B generates a return of 12%. It appears that manager A is a better performer. However, if manager A took larger risks than manager B, it may be that manager B has a better risk-adjusted return.

To continue with the example, say that the risk-free rate is 5%, and manager A’s portfolio has a standard deviation of 8% while manager B’s portfolio has a standard deviation of 5%. The Sharpe ratio for manager A would be 1.25, while manager B’s ratio would be 1.4, which is better than that of manager A. Based on these calculations, manager B was able to generate a higher return on a risk-adjusted basis.
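The two-manager comparison above can be checked directly with the same inputs: a 5% risk-free rate, returns of 15% and 12%, and standard deviations of 8% and 5%.

```python
# Sharpe ratios for the two managers in the example above.
def sharpe(ret, rf, stdev):
    return (ret - rf) / stdev

rf = 0.05
manager_a = sharpe(0.15, rf, 0.08)
manager_b = sharpe(0.12, rf, 0.05)
print(round(manager_a, 2), round(manager_b, 2))  # → 1.25 1.4
```

Despite manager A’s higher raw return, manager B’s lower volatility gives the better risk-adjusted result, matching the text.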

As a rule of thumb, a ratio of 1 or better is good, 2 or better is very good, and 3 or better is excellent.

The Bottom Line

Risk and reward must be evaluated together when considering investment choices; this is the focal point presented in Modern Portfolio Theory.7 In a common definition of risk, the standard deviation or variance takes rewards away from the investor. As such, always address the risk along with the reward when choosing investments. The Sharpe ratio can help you determine the investment choice that will deliver the highest returns while considering risk.


Maximal Extractable Value – MEV

Modern markets require intermediaries (brokers or brokerages) to aggregate and execute transactions on behalf of market participants. While blockchains decentralize the security and ownership of these intermediaries, networks still require them to execute transactions. Consequently, a blockchain’s intermediaries, in the form of security providers (miners, validators, and sequencers), become first-class citizens of the network in the same way that brokers do in traditional markets.

Such a power dynamic allows a blockchain’s security providers to tax users and generate additional profits during the block production process. They do so by selectively inserting, reordering, or censoring user transaction requests. This invisible, yet ubiquitous, phenomenon is what is known as Maximal (formerly Miner) Extractable Value (MEV).

Effects of MEV

MEV creates both positive and negative externalities for blockchain networks. It creates efficient markets and incentivizes proper application functionality within distributed crypto economies. However, if left uncontrolled, MEV may jeopardize network decentralization and consensus stability. Accordingly, the crypto industry is beginning to recognize the significance of MEV both as a for-profit opportunity and as an existential threat.

MEV’s emergent and abstract nature makes it difficult to define concretely. To address its intricacies, this report will provide an in-depth look at the origins of MEV before evaluating its role in the present and future state of blockchain networks.

The good

Many DeFi projects rely on economically rational actors to ensure the usefulness and stability of their protocols. For instance, DEX arbitrage ensures that users get the best, most correct prices for their tokens, and lending protocols rely on speedy liquidations when borrowers fall below collateralization ratios to ensure lenders get paid back.

Without rational searchers seeking and fixing economic inefficiencies and taking advantage of protocols’ economic incentives, DeFi protocols and dapps in general may not be as robust as they are today.

The bad

At the application layer, some forms of MEV, like sandwich trading, result in an unequivocally worse experience for users. Users who are sandwiched face increased slippage and worse execution on their trades.

At the network layer, generalized frontrunners and the gas-price auctions they often engage in (when two or more frontrunners compete for their transaction to be included in the next block by progressively raising their own transactions’ gas price) result in network congestion and high gas prices for everyone else trying to run regular transactions.

Beyond what’s happening within blocks, MEV can have deleterious effects between blocks. If the MEV available in a block significantly exceeds the standard block reward, validators may be incentivized to reorg blocks and capture the MEV for themselves, causing blockchain re-organization and consensus instability.

This possibility of blockchain re-organization has been previously explored on the Bitcoin blockchain. As Bitcoin’s block reward halves and transaction fees make up a greater and greater portion of the block reward, situations arise where it becomes economically rational for miners to give up the next block’s reward and instead remine past blocks with higher fees. With the growth of MEV, the same sort of situation could occur in Ethereum, threatening the integrity of the blockchain.

The MEV Supply Chain and Transaction Lifecycle

The pending transaction pool on a blockchain is a dark forest ripe for profit exploitation. Luckily, there are a few tools that can be used to shed light on MEV’s mysteries. The MEV supply chain is a framework recently introduced by the research organization Flashbots. It describes the chain of actors that influence a transaction in the presence of MEV. These general classifications hold true across different blockchain networks but may serve slightly different roles depending on a network’s design choices.

Roles Within the MEV Supply Chain


The cold fall of Bitcoin miners

During the 2021 bull market, many mining companies took advantage of rising bitcoin prices by raising capital through the issuance of new equity and debt in order to grow their operations. Public mining stocks exploded upwards alongside Bitcoin. However, the broader macro environment brought the euphoria to an abrupt halt in 2022. Hashrate at an all-time high, rising energy prices, and bitcoin trading near its cycle low have put miners in an increasingly difficult position. Many public miners have resorted to selling bitcoin to continue financing their operations and to service their debts.

What’s different this cycle compared to last cycle is the increased participation of public companies in the mining industry. The Hashrate Index reported that in October 2021, public miners represented 11% of the total hashrate. One year later, public miners’ share of the total hashrate has increased to 33%. If some of these companies end up going bankrupt due to mismanagement, they will have to sell the bitcoin on their balance sheets, adding downward pressure to the trading price.
