
ETH2 – Serenity

Portfolio Capital

Editorial Team

Primer – Ethereum 2.0 refers to the network’s transition from the current energy-intensive Proof of Work mechanism to a Proof of Stake model, designed to ease bottlenecks and enhance efficiency, speed, and scalability. This transition comprises several phases, with The Merge marking the final move to a PoS blockchain, enabling Ethereum to significantly reduce its power consumption and gas fees.

The Shift

Ethereum, the second largest blockchain network in the cryptosphere, runs on a Proof of Work model. Under this model, miners use powerful computers to solve complex mathematical puzzles. The first miner to solve the puzzle appends the next block to the blockchain and claims newly minted ETH as a reward. However, the PoW mechanism consumes an ever-increasing amount of electricity owing to the constant competition between miners and the difficulty of the puzzles. Though the process is highly secure and was pioneered by Bitcoin, it expends enormous amounts of electricity and involves substantial hardware costs. According to estimates, Ethereum’s carbon footprint is equivalent to New Zealand’s, and its annualized energy consumption is comparable to Switzerland’s.

The shift to Ethereum 2.0 shall alter the entire validation structure along with the energy and hardware requirements, thereby addressing each of the abovementioned aspects. In a PoS network, transaction verification is done by validators instead of miners. Consensus on PoS networks is achieved through coordination between validators (individuals/organizations) staking or pledging ETH (32 Ether per validator). An algorithm ascertains which validator gets selected to add the next block to the network. The higher the amount and the longer the duration of the stake, the greater the probability of being selected. Once a new block is added and there are sufficient attestations from other contributors regarding its validity, the validator can claim newly minted ETH as a reward. The network’s economic incentive to abide by the rules is considerably high, as validators face slashing (part of their staked share is burned/rendered useless) if they are found validating bad transactions or if they go offline.
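The stake-weighted selection and slashing described above can be sketched in a few lines of Python. This is purely illustrative: the registry, the weighting formula, and the 50% slashing fraction are assumptions made for the example, not Ethereum’s actual mechanism (which assigns proposers pseudorandomly among fixed 32-ETH validators).

```python
import random

# Hypothetical validator registry: stake in ETH and stake age in days.
validators = {
    "alice": {"stake": 32, "age_days": 400},
    "bob":   {"stake": 64, "age_days": 100},
    "carol": {"stake": 32, "age_days": 50},
}

def selection_weight(v):
    # Weight grows with both stake size and stake duration, as described
    # above (an illustrative formula, not the real protocol's).
    return v["stake"] * (1 + v["age_days"] / 365)

def pick_proposer(registry, rng=random):
    names = list(registry)
    weights = [selection_weight(registry[n]) for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

def slash(registry, name, fraction=0.5):
    # Misbehaving validators have part of their stake burned.
    registry[name]["stake"] *= (1 - fraction)

proposer = pick_proposer(validators)
slash(validators, "bob")  # bob validated a bad block: 64 ETH becomes 32 ETH
```

Larger and older stakes simply get a higher sampling weight, which is the intuition behind "the higher the amount and the longer the duration, the greater the probability".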

The shift to PoS has several advantages: it requires only a fraction of the computing power, making the network about 99% more energy-efficient.
Per reports, the locking up of ETH required for the network to function could also have a deflationary impact on the token’s supply, provided demand remains high.

Different Phases

Ethereum, the second largest network by market capitalization, is the functional layer of the blockchain and smart contract world. The number of DeFi protocols that run on the network is testimony to its staggering growth. Although Ethereum’s growth has been phenomenal, it has adversely impacted its core architecture. The network’s congestion, low transaction speeds, and high gas costs reflect the impact. A series of updates formally referred to as Serenity is in the works to solve this problem. With Serenity, Ethereum aims to address the issue of scalability while maintaining network security and decentralization.

Listed below are the multiple phases that form part of the overall update.

Phase 0 – Beacon Chain

Phase 0 commenced with the launch of the Beacon Chain in December 2020. At the core of Ethereum 2.0 is the transition from the inefficient PoW to a more scalable and environmentally friendly PoS consensus mechanism. Phase 0, with the Beacon Chain, the PoS consensus mechanism (Casper), and validator nodes, provides the necessary framework for the transition and future upgrades pertaining to network scaling and storage. The Beacon Chain introduced native staking to Ethereum and is tasked with managing and storing the registry of the network’s validators. The Beacon Chain has no impact on the mechanics of the existing PoW chain and runs parallel to the Ethereum Mainnet (the existing PoW chain). In subsequent phases, the current PoW chain shall merge with the Beacon Chain.

Phase 1 – The Merge

The next phase of the Serenity rollout is the Merge, which shall mark the end of Proof of Work for Ethereum. Post the Merge, the PoW chain, with all of its accounts, balances, smart contracts, and blockchain state, and the Beacon Chain shall operate as one single network – the Ethereum PoS chain. Thereafter, Eth1 and Eth2 will essentially signify two distinct layers, namely the execution layer and the consensus layer.

Eth1 execution layer – handles transactions and smart contract execution.

Eth2 consensus layer – handles proof-of-stake consensus and ensures validators act in accordance with the rules.

Thus, the Merge represents a permanent switch to a PoS model wherein validators will be assigned to secure the network, and mining will be abolished as the means for block production.

Merging Ethereum Mainnet with Beacon Chain
As part of the transition to the PoS system, testnets are essential to ascertain there aren’t any bugs or vulnerabilities. A testnet is a prerequisite for a smooth and error-free transition, as it permits new tech/upgrades to be tested before they’re launched on the main blockchain network. The Merge shall be scheduled after successful testnet merges. As of today, the Merge has already been tested on Ropsten and Sepolia. The next and last test is on the Goerli testnet, after which the Ethereum network will officially migrate to a controlled and coordinated Proof-of-Stake system.

The Merge will pave the way for the implementation of shard chains. Initially, the plan was to work on sharding before the Merge but owing to the success of Layer 2 scaling solutions like Optimism and Arbitrum, the transition to PoS was prioritized. Layer 2 solutions offload the transactional burden from Ethereum and perform computations at much lower prices.

Phase 2 – Sharding

Following the Merge, the focus will shift to sharding to enable the network to eliminate data congestion, enhance data storage, reduce gas fees and provide greater support to Layer 2 solutions. Sharding entails distributing the computational and processing load of the main chain horizontally to decongest the network. Sharding is essential for Ethereum to scale and provide additional faster and cheaper layers for decentralized applications and rollups to store data.

The upgrade will feature 64 shard chains. While the initial plan was centered around adding extra functionality such as the ability to store and execute code and handle transactions, the current plan focuses on danksharding. Danksharding is rollup-centric and uses shard blobs instead of shard chains. It utilizes data availability sampling, which enables the Ethereum network to verify vast amounts of data by sampling a few pieces of it. While this approach doesn’t equip shards to handle smart contracts, it contributes to higher transactions per second when combined with rollups. With rollups, the burdensome task of processing transactions is offloaded to L2, and then highly compressed data is rolled up into batches and sent to L1 for storage, all while retaining the security of L1.
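Data availability sampling can be illustrated with a toy sketch: instead of downloading an entire blob, a light client checks a handful of random chunks. Real danksharding adds erasure coding and KZG commitments so that sampling gives strong statistical guarantees; those are omitted here, and the chunk size and sample count are arbitrary assumptions.

```python
import random

def publish_blob(data: bytes, chunk_size: int = 32):
    # Split compressed rollup data into fixed-size chunks (a "shard blob").
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def sample_availability(chunks, num_samples: int = 8, rng=random):
    # A light node requests a few random chunks instead of the full blob.
    # If every sampled chunk is served, the blob is very likely available.
    indices = [rng.randrange(len(chunks)) for _ in range(num_samples)]
    return all(chunks[i] is not None for i in indices)

blob = publish_blob(b"compressed rollup batch" * 100)
assert sample_availability(blob)
```

The point of the technique is that verification cost stays small and roughly constant even as blobs grow, which is what lets the network "verify vast amounts of data by sampling a few pieces of it".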

Thus, sharding has important implications as it reduces the hardware requirement and eases the ability to run a node so network participation can be enhanced and higher levels of decentralization can be attained. The current plan entails shard usage for aggregation and movement of data. In addition, Ethereum aims to leverage L2 solutions (rollups) to maximize efficiency and scalability. With Eth 2.0 and rollups working in tandem, the transactional capacity of Ethereum is eventually expected to reach 100,000 transactions per second.

The journey to Serenity has seen multiple changes and continues to evolve as more efficient paths emerge. Each change is carefully considered and implemented to achieve a truly decentralized programmable blockchain.

Difficulty Bomb – Ethereum’s Gray Glacier Upgrade

Network upgrades refer to changes made to an underlying protocol. On the Ethereum network, these are done through Ethereum Improvement Proposals (EIPs). The upgrades alter existing rules in favor of new ones to bring about necessary improvements in the system. Unfortunately, the decentralized nature of blockchains makes the deployment of upgrades a cumbersome task requiring approval from the entire community as well as the developers.

A significant risk associated with upgrades comes in the form of community disagreement on proposed changes. With Ethereum transitioning to a PoS network, there is a great risk of miners continuing to mine the original PoW chain resulting in a hard fork. Something similar happened in July 2016, when an attempt to erase a hacker attack on a DAO caused an Ethereum hard fork and led to the emergence of Ethereum Classic. Ethereum (as we know it today) wanted to roll back the hack. Miners who agreed with the proposal stayed on, whereas those who disagreed became part of the Ethereum Classic. A hard fork represents a code change that renders the new version backward-incompatible with earlier blocks.

The Gray Glacier Upgrade

Gray Glacier is a hard fork network upgrade that occurred at block 15,050,000 on 30 June and delayed the difficulty bomb by 700,000 blocks, translating to an approximately 100-day delay. The difficulty bomb is critical for Ethereum’s transition to PoS.
The rules laid out by the upgrade instruct node operators and miners to manually upgrade to the latest client versions. Failure to comply would prove problematic, as clients would be stuck on the incompatible pre-fork chain post upgrade.

Difficulty Bomb

The difficulty bomb is code hardwired into the protocol from the very beginning. It’s a mechanism designed to exponentially increase the mining difficulty of the PoW mining algorithm, resulting in longer block times and eventually making mining unprofitable for miners. Ethereum developers, after extensive deliberations, decided to delay the detonation further to avoid a repeat of what happened in 2016.
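The bomb’s exponential growth can be sketched as a simple function of block number. The doubling period of 100,000 blocks follows the mainnet rule, and the total delay of 11,400,000 blocks is the figure set by Gray Glacier (EIP-5133); note this sketch simplifies the full difficulty formula to the bomb term alone.

```python
def bomb_component(block_number: int, delay_blocks: int = 11_400_000) -> int:
    # The "bomb" term added to mining difficulty. It doubles every
    # 100,000 blocks, so block times grow until mining is unviable.
    # delay_blocks reflects upgrades like Gray Glacier pushing it back.
    fake_block = max(block_number - delay_blocks, 0)
    exponent = fake_block // 100_000 - 2
    return 2 ** exponent if exponent >= 0 else 0
```

Delaying the bomb simply increases `delay_blocks`, resetting the exponent to zero until the chain catches up again, which is why each delay buys roughly a fixed number of days.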

Closing Thoughts

The shift to Ethereum 2.0 is the most anticipated event in the cryptosphere and involves updates spread across multiple phases. The Merge is almost 90% complete, with the Ethereum Foundation suggesting 19 September as the provisional launch date. The latest announcement from Vitalik at the annual Ethereum Community Conference (EthCC, held in Paris) describes the Merge as the first phase of long-term development aimed at making Ethereum quantum resistant. The Merge will make Ethereum 55% complete and will be followed by the Surge (scalability improvement through sharding), the Verge (storage optimized through Verkle trees and reduction of node size), the Purge (excess historical data will be purged), and finally the Splurge (efficiency and functionality improvements).


While it might sound like the title of a “Rick and Morty” episode, the surge, verge, purge, and splurge are actually key parts of Ethereum’s scaling, cleanup, and evolution, Buterin said.


The Quest for Interoperability

Portfolio Capital

Editorial Team

Blockchain platforms are emerging at a rapid rate; however, the lack of interoperability is diminishing the network effect that could exist between these platforms.

Primer – Blockchain technology is commanding much-needed attention, and its potential can be corroborated by the rate at which new platforms are cropping up. Yet the technology, despite its ability to provide a transparent, efficient, secure, and decentralized ecosystem, has failed to spur mass adoption. This failure can be attributed to the inability of the platforms to function together. A closer look reveals these platforms aren’t inherently open and are unable to communicate properly with each other, hampering the technology’s development and preventing it from reaching its full potential. The major deterrent is the lack of interoperability.

Each platform boasts its own set of features and characteristics incorporated to fulfill a specific need or to deliver a sophisticated product/service. The adoption of varied governance rules and regulatory controls adds to the complexity. Everything from the language to transaction schemes, consensus models, and hashing algorithms differ resulting in a plethora of blockchains siloed from one another. The need of the hour is to break down the walls and move away from a siloed and fragmented landscape to one that is unified and connected i.e. interoperable.

Blockchain Interoperability

The power and value of exclusivity enable individual platforms to attract users but prevent the industry from unlocking its full potential. For a global economy that is becoming increasingly digital and data-driven, it is imperative for individual blockchains to be able to communicate with each other to ensure the large-scale acceptance of blockchain-based applications.

To oversimplify, interoperability refers to the ability of disparate ecosystems to seamlessly exchange data and transfer relevant information between each other sans any centralized entity/intermediary. This move is essential to alleviate the roadblocks that stifle potential growth.

Implementation of interoperability between blockchains serves as a collaborative tool to fulfill Web3’s promise of decentralization and accelerates its adoption by stimulating innovation. It facilitates the creation of products and services that leverage and operate on and across multiple blockchains sparking a new spirit of collaboration for the greater good.

To summarize, it fosters an environment providing the following advantages:

– Universal communication sans barriers

– Easy execution of smart contracts

– Seamless exchange and transfer of information

– Cross-industry collaboration

– Multi-token transactions and multi-token wallets

– Enhanced user experience

Fulfilling this vision requires the integration of a number of functionalities and abilities, including but not restricted to:

– Easy switching between different chains

– Assimilation and exchange of relevant information between chains

– Initiating and conducting transactions with other chains

– Linking private enterprise chains with public chains in a controlled manner

Achieving interoperability is of paramount importance as dependence on legacy architecture or business models of the Web 2 era is inefficient, costly, and risky, especially for enterprises utilizing blockchain technology with the hope of expanding and delivering smooth and efficient services to clients.

How is Interoperability Achieved?

Currently, the highly fragmented landscape necessitates mechanisms for interoperability to eradicate the obstacles to the mass acceptance of blockchain technology. At the core of interoperability is cross-chain technology. Cross-chain entails the transfer and exchange of multiple data types as well as tokenized assets between independent blockchain platforms.

There is no standard form or existing feature incorporated within blockchains that support communication between platforms; however, various tools have emerged to increase the level of interoperability.

Oracles

In the Web3 space, oracles are third-party services that provide blockchain smart contracts with access to off-chain data, thus bridging the information gap. Smart contracts aren’t equipped to access off-chain data on their own, making them dependent on oracles for trustless execution. Examples of blockchain oracle projects include Chainlink, API3, Augur, etc. While oracles exist to bring reliable and secure off-chain data to blockchain platforms, they have certain drawbacks, listed below:

– Utilization of centralized oracles goes against decentralization

– Risk of manipulation when using oracles

However, oracles fulfill an important use case in DeFi as they relay real-world data essential to trigger execution in financial smart contracts.

Side-Chains

Sidechains refer to blockchain platforms that are compatible with and run parallel to the mainchain within the same ecosystem. Sidechains tend to have their own tokens, security measures, and consensus mechanisms. These are Layer 2 solutions designed to scale Layer 1 blockchains. They enhance the overall ecosystem’s transactional efficiency while ensuring safety and security. With interoperability built in, these chains are equipped to handle and support large-scale network traffic without congesting the main chain. Examples of sidechains include projects like Cosmos, Polygon, Bitcoin RSK, etc.

Bridges

Blockchain Bridges or Cross Chain Bridges are inter-blockchain applications that permit the flow of digital assets from one platform to another. Cryptoverse is home to thousands of blockchains each designed with a specific purpose but confined to its own ecosystem. Bridges enable the exchange of assets (tokens, NFTs, and other data) between these chains that are otherwise siloed.

Bridges are either trusted (centralized) or trustless (decentralized). Their inception has been instrumental in facilitating the transfer of cryptocurrencies from one platform to another without having to change it to fiat currency. These porting tools have played a vital role in enabling investors who move between blockchains in search of higher yields to achieve the desired result efficiently without facing complexities. Several platforms have integrated bridges within their ecosystem so users do not have to leave the platform.

Let’s dive deeper into Trusted and Trustless Bridges.

Trusted Bridges – Wrapping
Wrapping enables the creation of wrapped tokens whose value is pegged to the underlying token. In the case of wrapping, the original token is put in a secure digital vault and a wrapped version of it is created for use on another blockchain. Wrapped BTC or wBTC is a perfect example of a wrapped token that can be traded on the Ethereum network. Users obtain wBTC by giving their original BTC to a protocol (custodian) that stores it and issues an equivalent amount of wBTC as an ERC-20 token. A major risk associated with using wrapped tokens is giving up control and locking your original asset with a centralized entity.

Trustless Bridges – Locking and Minting
Trustless bridges as the name suggests are decentralized and do not require users to place their trust in a centralized entity. These bridges through a two-step procedure enable interoperability between independent blockchains. The procedure involves locking (step 1) the digital asset on one chain while new tokens of an equivalent amount are minted (step 2) on the receiving blockchain.
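The two-step procedure can be sketched as a toy ledger. The class and method names are hypothetical, and a real bridge would rely on on-chain contracts, validators, and relayers rather than a single Python object; the sketch only captures the lock/mint bookkeeping and its reverse (burn/release).

```python
class LockAndMintBridge:
    """Toy sketch of the two-step bridge flow described above."""

    def __init__(self):
        self.locked = {}   # balances held on the source chain
        self.minted = {}   # wrapped supply on the destination chain

    def lock_and_mint(self, user: str, amount: int):
        # Step 1: lock the original asset on the source chain.
        self.locked[user] = self.locked.get(user, 0) + amount
        # Step 2: mint an equivalent wrapped amount on the destination chain.
        self.minted[user] = self.minted.get(user, 0) + amount

    def burn_and_release(self, user: str, amount: int):
        # Reverse path: burn wrapped tokens, release the locked originals.
        if self.minted.get(user, 0) < amount:
            raise ValueError("insufficient wrapped balance")
        self.minted[user] -= amount
        self.locked[user] -= amount

bridge = LockAndMintBridge()
bridge.lock_and_mint("alice", 5)  # 5 units locked, 5 wrapped units minted
```

The invariant worth noticing is that locked and minted balances always move together, which is what keeps the wrapped token fully backed.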

Wrapped Bitcoin is an example of a unidirectional or one-way bridge that supports the transfer of assets to the target blockchain and not the other way around meaning BTC can be sent to Ethereum but ETH cannot be sent to Bitcoin. Whereas, Wormhole is an example of a bidirectional or two-way bridge that permits the conversion of assets to and from blockchains meaning SOL can be sent to Ethereum and ETH can be sent to Solana.

Numerous benefits associated with the use of bridges include:

– Execution of dApps across multiple blockchain platforms

– Ability to conduct cheaper, faster transactions in otherwise congested networks

– Greater efficiency, innovation, and user adoption

However,

– Bridges create opportunities for exploitable bugs due to the complexity of the code.

– Censorship risk and Custodial risk in case of Centralized bridges.

It’s important to note that bridges aren’t restricted to the transfer of cryptocurrencies and include the transfer of NFTs and smart contracts as well. Some notable examples of bridges include Binance Bridge, cBridge, AnySwap, etc.

Swaps

Fostering interoperability through greater decentralization, atomic swaps facilitate the swapping of cryptocurrencies across different platforms sans intermediaries. It is a peer-to-peer (P2P) method that utilizes Hash Timelock Contracts (HTLC). In less technical terms, an HTLC ensures both parties involved conform to the protocol and verify the transaction (hashlocks) before the swap takes place. If one party fails to do so, deviates from the agreed-upon terms, or is unable to act within the stipulated timeframe (timelock), the contract is not executed and the assets are returned to their original owners.
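The hashlock/timelock logic can be sketched as follows. This is an off-chain illustration with assumed names, not real contract code: the hashlock is a SHA-256 hash of a secret preimage, and the timelock is a wall-clock deadline after which the funds revert to the original owner.

```python
import hashlib
import time

class HTLC:
    """Minimal hashed-timelock contract sketch (illustrative, not on-chain code)."""

    def __init__(self, secret_hash: bytes, timeout: float, amount: int):
        self.secret_hash = secret_hash          # hashlock: hash of a secret
        self.deadline = time.time() + timeout   # timelock
        self.amount = amount
        self.settled = False

    def claim(self, preimage: bytes) -> int:
        # Counterparty claims the funds by revealing the secret in time.
        if time.time() > self.deadline:
            raise TimeoutError("timelock expired; funds revert to owner")
        if hashlib.sha256(preimage).digest() != self.secret_hash:
            raise ValueError("wrong preimage")
        self.settled = True
        return self.amount

secret = b"shared-secret"
contract = HTLC(hashlib.sha256(secret).digest(), timeout=3600, amount=10)
claimed = contract.claim(secret)
```

In an actual swap, both chains host an HTLC locked to the same hash, so claiming on one chain reveals the preimage needed to claim on the other, which is what makes the exchange atomic.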

Major advantages associated with the use of atomic swaps include retaining control over your assets, swapping at lower rates, enhanced security as well as user experience. However, the issue of finding users with the same needs can be a taxing process and limited liquidity doesn’t make it easier. Moreover, only HTLC-compatible blockchains can be involved in swaps.  DEXs integrating and supporting atomic swaps may enable users to leverage the aforementioned benefits.

The growing need for interoperability has led to the introduction of multi-chain solutions that offer considerable functionalities and opportunities. In the following section, we shall cover a few that deserve mention.

The first one is DFINITY foundation’s Internet Computer which offers Native Integrations with Bitcoin and Ethereum.

Internet Computer

Poised to be the third great innovation in blockchain after BTC and Ethereum’s smart contracts, IC’s claim to fame is its ability to directly integrate with the Bitcoin network bypassing the use of bridges. But before we get into further details,  let’s understand what the Internet Computer is and how it achieves trustless Bitcoin Integration.

Internet Computer is an infinitely scalable general-purpose blockchain aimed at delivering a decentralized Internet by running smart contracts at web speed. It permits developers to install smart contracts and dApps directly on the blockchain. It can be described as a sovereign decentralized network that delivers web content without relying on cloud computing services (read AWS).

Hosted on node machines and operated by independent parties who’re geographically separated, the internet computer nodes run on ICP (Internet Computer Protocol). ICP is a secure cryptographic protocol that ensures the security of the smart contracts running on the blockchain. The Internet Computer is a network of sorts comprising individual subnet blockchains that run parallel to each other and are connected using Chain Key cryptography. This implies canisters (smart contracts on IC) on one subnet can seamlessly call canisters hosted on other subnets of the network. Another notable feature is the network’s decentralized permissionless governance system NNS (Network Nervous System) which runs on-chain. NNS is designed to scale the network capacity when required. It does so by spinning up new subnet blockchains.

Internet Computer unlocks immense potential for value creation by introducing direct integration with the Bitcoin network. Displacing bridges and the need for wrapping, the Internet Computer through the use of Chain Key cryptography establishes a direct connection with the BTC ledger providing a trustless foundation for DeFi projects utilizing Bitcoin. It empowers developers to create canister smart contracts equipped to communicate with Bitcoin. Essentially, the ECDSA (Elliptic Curve Digital Signature Algorithm) suite of protocols by Internet Computer enables canisters to obtain ECDSA public keys and securely sign messages under that public key utilizing Chain Key cryptography. Thus, canister smart contracts with a Bitcoin address and the ability to sign transactions effectively serve as Bitcoin wallets that can receive, hold and send BTC.

Eliminating trust assumptions by removing intermediaries and providing direct integration with the Bitcoin blockchain, Internet Computer aims to give developers access to smart contract utility on the world’s largest blockchain. However, the journey to deliver on this vision isn’t going to be easy. If Internet Computer succeeds, it will bring about a paradigm shift allowing native Bitcoin to flow into DeFi. This will translate to more adoption and pave the way for a whole range of possibilities. IC’s plan does not end with Bitcoin; it aims to bring its unique offering to Ethereum as well.

But listed below are what seem like major roadblocks on this path:

– Security review is still in the works

– Balancing out security and usability to scale will be crucial

– Competition from Cosmos and Polkadot

Bringing us to our next project, Cosmos.

Cosmos and the Inter-Blockchain Communication Protocol

Cosmos IBC (Inter-Blockchain Communication) protocol has connected over 40 chains making it the largest interoperability protocol in the industry. Built to facilitate communication between standalone distributed ledgers sans intermediaries, Cosmos is being dubbed the Internet of Blockchains.

Cosmos deliverables go beyond seamless interaction between different blockchains and include the provision of tools that simplify the process of developing interoperable blockchains. The utilization of Hubs, IBC, Tendermint Byzantine Fault Tolerance (BFT) Engine, and the Cosmos Software Development Kit (a framework for building application-specific blockchains) enables Cosmos to provide the necessary infrastructure for creating interoperable blockchains. While some interoperability solutions involve the use of smart contracts, Cosmos offers open-source tools for the development of independent blockchains referred to as zones. These zones connect to the main blockchain referred to as the Hub. Zones communicate via the hub through the use of IBC. The state of each zone is monitored and maintained by the hub and vice versa.  Zones operate autonomously and have their own validators which validate transactions. Cosmos Hub, a Proof of Stake platform, was the first blockchain on the network and is powered by ATOM coin.

Cosmos interoperability is not limited to Tendermint chains but extends to blockchains that do not have fast-finality (read Proof-of-Work) thanks to a special kind of proxy chain referred to as Peg Zone. Peg Zones can track the state of another blockchain and are compatible with IBC as they have fast-finality themselves. Peg Zones act as bridges and can communicate with other chains by linking zones and hub networks together.

Thus, Cosmos makes blockchain development easy with Tendermint BFT and the modularity of the Cosmos SDK while providing an ecosystem that permits blockchains to seamlessly communicate through IBC and Peg Zones without compromising sovereignty.

With a multi-chain future on the horizon, the timing of Cosmos Interchain accounts couldn’t be better. Developed by Interchain GmbH, Chainapsis, Informal Systems, and Confio, Interchain accounts expand IBC functionality from mere token transfers to facilitate the creation of new products and mechanisms to coordinate between networks. To put it simply, Cosmos ecosystem blockchains can access and control accounts on separate chains as well as carry out actions native to that chain. Essentially, Interchain Accounts grant access to all IBC-enabled Cosmos chains (based on either Cosmos SDK or Tendermint) from one Cosmos Hub account. The addition of this feature will enable networks to test true application interoperability as well as enhance user experience.

The Cosmos platform and ecosystem secure over $120 bn in digital assets and boasts names like Terra, Osmosis, Binance Smart Chain, Binance DEX, Evmos, and Thorchain amongst many others.

While Cosmos is touted as the Internet of Blockchains, Polkadot seeks to be Blockchains of Blockchains.

Polkadot

Polkadot is a leading multi-chain technology in the field of blockchain interoperability. Founded by Gavin Wood (one of the founders of Ethereum), Polkadot envisions itself as a next-generation blockchain protocol that unites an entire network of purpose-built blockchains, allowing them to operate seamlessly together at scale. Furthermore, it seeks to facilitate the easy development of new chains via the Substrate framework and provides bridges to networks such as Bitcoin and Ethereum.

Its ecosystem is composed of a relay chain (the main chain for the system) that connects and supports several ancillary, application-specific blockchains referred to as parachains. These parachains are full-fledged blockchains boasting different characteristics. They run parallel to the relay chain and depend on it for consensus. Parachains are similar to zones on Cosmos, with one major difference: parachains share the same set of validators, providing unified and strengthened security across the network through the relay chain, whereas zones connected to the hub do not share the hub’s security. While Polkadot and Cosmos are both multi-chain projects, they differ in focus: Polkadot, with its shared validation logic, prioritizes shared security, whereas Cosmos, with its bridge-hub model, prioritizes easy adoption and development of interoperable blockchains.

Polkadot’s newly launched cross-chain communication (XCM) format is far superior to bridging mechanisms prone to hacks. The XCM format defines a language for how message transfer between two interoperating blockchains should be performed. With XCM, Polkadot manages to deliver on its objective of being a fully interoperable multi-chain network, allowing the transfer of data and assets between parachains. Secured at the same level as the Relay Chain, XCM channels will serve as a stable and reliable inter-chain messaging channel between parachains themselves as well as smart contracts.

Other projects that deserve mention include

Polygon – Connects all Ethereum-compatible chains

Solana – Utilizes Wormhole to facilitate communication between Solana, Ethereum, Terra, and Binance Smart Chain

Future of Blockchains: Multi-Chain or Cross-Chain

Interoperability is essential for blockchain technology to truly thrive. At present, the industry is composed of siloed blockchains, each confining users to its independent network. Several solutions have emerged to create an inclusive space that fosters growth and innovation. Some solutions, like cross-chain technology, solve part of the problem, while others focus on connecting blockchains that share similar technologies. The existing solutions can be classified as cross-chain or multi-chain, with both trying to achieve the same end goal – the seamless transfer of data and other information between different blockchains.

Cross-chain technology (read bridges) has managed to alleviate the problem of interoperability; however, this technology is far from perfect. While this solution is yet to realize its full potential, one can’t negate its importance. Bridges are the most widely used tool for cross-chain transfers; however, in their current state, as Vitalik has noted, they are extremely prone to exploitation (the Wormhole bridge hack being a case in point). They form the weakest link in the chain. Another issue relates to centralization, leading people to believe that multi-chain is the future. However, while multi-chain solutions like Polkadot and Cosmos prove effective within their own ecosystems, the moment the two need to communicate, a bridge is required. It is possible that bridges may be replaced with a newer and safer version or with a different technology that solves this problem. For now, the future is not multi-chain OR cross-chain but multi-chain AND cross-chain.


Blockchain Scalability

Portfolio Capital

Editorial Team

Primer: To understand blockchain scalability, we must first understand blockchain. Put simply, a blockchain is a distributed digital database that facilitates transactions by enabling an immutable, transparent, and shared ledger. This ledger is designed to achieve decentralized transaction management. Any member node can autonomously administer and manage transactions by complying with the agreed-upon rules and without any third-party interference. Consensus on data accuracy is achieved by a network of distributed users (nodes). Upon verification, transactions are recorded permanently, delivering immutability. These transactions are recorded in blocks. When a block is filled, a new block is generated and chained onto the previous one. The blocks are inextricably linked together in chronological order, forming a chain – hence the name blockchain.
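The chaining just described can be sketched minimally: each block commits to the previous block’s hash, so altering any block breaks every later link. The helper names and the JSON encoding are assumptions for the example, not any particular chain’s format.

```python
import hashlib
import json

def make_block(transactions, prev_hash: str) -> dict:
    # Each block commits to its transactions and to the previous
    # block's hash, chaining blocks in chronological order.
    block = {"transactions": transactions, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain) -> bool:
    # Tampering with any block breaks every later link (immutability).
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != prev["hash"]:
            return False
    return True

genesis = make_block(["coinbase"], prev_hash="0" * 64)
chain = [genesis, make_block(["alice->bob:5"], genesis["hash"])]
```

This hash linkage is what makes recorded transactions effectively permanent: rewriting history requires recomputing every subsequent block.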

Broadly, scalability refers to a system’s ability to adapt to an increased workload in an efficient and swift manner. As it pertains to blockchain, scalability refers to the network’s ability to sustain an increasing number of transactions as well as nodes, and is primarily determined by throughput (transaction rate), cost and capacity, and networking. Thus, scalability is a cohesion of several parameters and metrics that we shall now delve into. It is essential to note that in blockchain terminology, “scalable” is a comparative term – a network is described as scalable relative to other chains – and must not be confused with scalability itself.

Throughput

Throughput is defined by the time required to commit a valid transaction on the blockchain and the size of the block containing the transaction. For perspective, traditional institutions like Visa, with centralized infrastructure, can process around 1,700 transactions per second (TPS), whereas Bitcoin can process only about 7. The vast difference stems from the excessive processing power and time Bitcoin expends to achieve decentralization and privacy for its users. Transactions on decentralized networks need to go through several steps comprising acceptance, mining, distribution, and eventually validation by a global network of nodes.
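The throughput gap above can be sketched with a back-of-the-envelope calculation (the block size, transaction size, and block interval below are illustrative round numbers, not protocol constants):

```python
# Rough throughput estimate for a blockchain:
# TPS = (transactions that fit in one block) / (seconds between blocks).

def estimated_tps(block_size_bytes: int, avg_tx_bytes: int, block_interval_s: int) -> float:
    txs_per_block = block_size_bytes // avg_tx_bytes
    return txs_per_block / block_interval_s

# Bitcoin-like parameters: ~1 MB blocks, ~400-byte transactions,
# one block roughly every 10 minutes.
print(round(estimated_tps(1_000_000, 400, 600), 1))  # prints 4.2

# Growing the block 8x raises TPS proportionally – the parameter
# tuning (and its costs) discussed in the following paragraphs.
print(round(estimated_tps(8_000_000, 400, 600), 1))  # prints 33.3
```

The same arithmetic shows why naive parameter changes trade directly against node storage and bandwidth: every extra transaction per second is extra data that every full node must store and relay.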

An increase in the number of transactions results in an increase in the size of the block, which brings its own set of problems. Theoretically, increasing the block size should enhance network speed and capacity; however, a larger block adds to the cost, as it increases the storage requirement and involves higher power usage. This adds to the burden of maintaining a full node, forcing out nodes that can’t afford the processing requirements. Moreover, larger blocks are more difficult to relay around the network.

Transaction delays affect scalability, and transaction fees play a significant role here. Verifying transactions on the blockchain requires users to pay a fee, which has a direct impact on confirmation time as it determines which transaction is chosen for processing. With a growing number of transactions and the need to establish trust between anonymous entities, validating transactions is time-consuming; priority is given to users willing to pay a hefty fee, leaving many transactions unprocessed and queued for long periods.

Cost and Capacity

From the genesis block to the most recent transaction, all data is stored on the blockchain. However, nodes have limited storage capacity, and if the blockchain grows rapidly, the storage requirement increases substantially, making it difficult for nodes to hold massive amounts of information. In addition, the transaction processing power of blockchains is limited, and their underlying structure impedes scalability. There is a reason these blockchains have limited capacity: caps are placed to prevent inconvenience and inefficiency in management. If blockchains were permitted to grow uncontrollably, the nodes would not be able to keep up.

Networking

Every new transaction on the blockchain needs to be broadcast to all nodes involved. As the number of nodes increases, the time required for a transaction to be propagated increases, degrading the overall performance. Moreover, anytime a new block is mined, it is revealed to all nodes. Thus, the consumption of network resources is significantly high, necessitating efficient data transmission coupled with a dedicated network bandwidth to minimize delays and improve throughput.

With characteristics and benefits such as transparency, trust, immutability, and data security, blockchain technology offers a world of opportunities extending beyond the financial sector. Its inherent security mechanism and public ledger system have found use in non-financial applications, drawing a swath of users. Despite these benefits and use cases, the unsolved problem of scalability remains a major stumbling block affecting blockchain development and widespread adoption. The network settlement times and usability of centralized entities are far superior to those of decentralized ones. With the number of users on the network increasing, high transaction fees, slow transaction speeds, and an unsatisfactory user experience prevent blockchain technology from realizing its full potential.

One would assume that changing blockchain parameters, such as increasing the block size or reducing the block generation time, would help tackle the problem of scalability. More space implies more transactions; however, the block size can’t scale infinitely, and its current ability to scale doesn’t enable it to match or compete with existing centralized systems. Moreover, these changes require costly hardware for nodes to remain in sync on the network. The root of the problem lies in the very design of the network: many of the constraints that hinder scalability are packaged with blockchain’s value propositions.

In the following section, we shall explore the various Blockchain scalability solutions, but before we do that, it is imperative to understand the term – Blockchain Trilemma.

The Blockchain Trilemma

The Blockchain Trilemma is a widely held belief popularized by Vitalik Buterin (co-founder of Ethereum). It holds that the three essential and organic properties of blockchain – decentralization, security, and scalability – cannot perfectly co-exist, implying any network can achieve only two of the three. Thus, greater scalability can be achieved, but security or decentralization will have to be compromised. Historically, blockchains have prioritized decentralization over the other two properties, which is reflected in their low transaction volumes. However, achieving greater scalability is essential for decentralized networks to compete fairly with high-performance legacy platforms.

With blockchain technology being deployed across various industries, from finance to art, it is of paramount importance to address the issue of scalability without compromising decentralization and security. The scaling solutions can be categorized as Layer 1 and Layer 2 solutions.

Layer 1 network is the foundational layer, essentially the blockchain itself, whereas a Layer 2 protocol operates on top of an underlying blockchain to improve its efficiency and scalability. Examples of Layer 1 networks include Bitcoin, Ethereum, Cosmos, Solana, ICP, and Avalanche.

Blockchain Scalability Solutions

Layer 1 (On-Chain) Solutions

Layer 1 solutions optimize performance and improve scalability by making structural and fundamental changes to the base protocol itself. Since the changes are incorporated onto the chain, these solutions are often referred to as On-Chain Solutions.
L1 solutions are designed to enhance a network’s attributes and characteristics in order to accommodate more users and data. Changes to the protocol’s rules are made to increase transaction speed and capacity. These changes involve accelerating the speed of block confirmation as well as increasing the amount of data being stored in each block, all directed at enhancing overall network throughput. Other changes that networks resort to in order to achieve scalability include sharding and consensus protocol improvements.

Sharding

Adapted from distributed databases, sharding implies partitioning the database into smaller data sets or “shards”. Deploying this technique enables the computational workload to be spread across a peer-to-peer (P2P) network, reducing processing time significantly, as each shard is responsible for processing only a portion of the data stored on the network. Sharding enables more nodes to be connected and disperses data storage across the network, enhancing overall efficiency and management. The shards are processed simultaneously in parallel instead of the network having to operate sequentially on each transaction. Thus, sharding contributes to greater efficiency (computation, communication, and storage) as well as higher network participation. Theoretically, there is no cap on the number of shards, and with the addition of shards on demand, a blockchain should be able to scale infinitely. Practically, however, this isn’t possible owing to tight coupling and other limitations such as shard takeovers or collusion between shards. The damage from one corrupt shard taking over another can be catastrophic, leading to permanent loss of the corresponding portion of data. Furthermore, the probability of corrupt information being injected and broadcast to the main network through a malicious node cannot be ignored.
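A minimal sketch of the partitioning idea follows (the hash-based assignment and the shard count are illustrative assumptions, not any specific chain’s scheme):

```python
import hashlib

NUM_SHARDS = 4  # illustrative; real networks choose this per protocol

def shard_for(account: str) -> int:
    """Deterministically map an account to one of NUM_SHARDS shards."""
    digest = hashlib.sha256(account.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

def partition(txs):
    """Group transactions by the sender's shard so shards can run in parallel."""
    shards = {i: [] for i in range(NUM_SHARDS)}
    for tx in txs:
        shards[shard_for(tx["from"])].append(tx)
    return shards

txs = [{"from": f"acct{i}", "to": f"acct{i+1}", "amount": 1} for i in range(10)]
shards = partition(txs)
# Every transaction lands in exactly one shard – nothing lost, nothing duplicated.
assert sum(len(v) for v in shards.values()) == len(txs)
```

Because the mapping is deterministic, every node agrees on which shard owns which account without any coordination; the hard part, as the next paragraph notes, is communication between shards.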

Proper implementation of inter-shard communication is essential for users and applications of one subdomain to communicate with another subdomain. Each shard appears as a separate blockchain network necessitating this requirement, and improper implementation of the same can jeopardize the security of the entire network owing to double-spending.

Consensus Protocol

The consensus protocol is an important feature of blockchains and is responsible for maintaining the security and integrity of the network. It refers to a set of rules that govern all activity on the blockchain and ensure participating nodes agree on only valid transactions. Some protocols are more efficient than others. Proof of Work (PoW), an extremely secure but resource-intensive protocol, is the consensus mechanism for Bitcoin, Litecoin, and (currently) Ethereum. PoW is time-consuming and utilizes substantial computational power to solve cryptographic puzzles, making it slow in comparison to other protocols like Proof-of-Stake (PoS). With PoS, consensus duties are distributed across the network’s validators, who are picked based on the collateral they have staked in the network. Ethereum is on track to transition to PoS with Ethereum 2.0, a move expected to drastically enhance network capacity, enabling faster transactions and lower fees. Other consensus protocols that serve as effective scalability solutions include Proof-of-Authority, Delegated Proof-of-Stake, and Byzantine Fault Tolerance.
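The stake-weighted selection described above can be sketched as follows (a toy model for illustration only; real protocols add randomness beacons, committees, and slashing logic):

```python
import random

def pick_validator(stakes: dict, rng: random.Random) -> str:
    """Pick a validator with probability proportional to its stake."""
    total = sum(stakes.values())
    point = rng.uniform(0, total)
    cumulative = 0.0
    for validator, stake in stakes.items():
        cumulative += stake
        if point <= cumulative:
            return validator
    return validator  # floating-point edge case: fall back to the last one

stakes = {"alice": 32, "bob": 64, "carol": 160}  # ETH staked (illustrative)
rng = random.Random(42)
wins = {v: 0 for v in stakes}
for _ in range(10_000):
    wins[pick_validator(stakes, rng)] += 1
# carol holds 160/256 of the total stake, so she should win
# roughly 62% of the 10,000 draws.
```

The simulation makes the economic point concrete: a larger stake means being chosen more often, which is why slashing a validator’s stake is such a strong deterrent against misbehavior.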

Layer 2 (Off-Chain) Solutions

An L2 solution is built on top of an existing blockchain with the primary purpose of improving scalability and efficiency. The most prominent examples include the Bitcoin Lightning Network and Ethereum Plasma. An L2 solution essentially offloads the transactional burden from the L1 blockchain. The creation of an auxiliary framework enables L2 to operate and process transactions independently off the chain. Thus, L2 solutions manage and handle the processing load and report back to L1 for result finalization. It is for this reason that L2 solutions are also referred to as Off-Chain solutions.

A major benefit associated with L2 frameworks is the ease of deploying the solution. These solutions are integrated as an extra layer on top of the base layer and require no structural or fundamental changes to the mainchain. As a result, L2 solutions inherit the features that make L1 secure and decentralized. A majority of the data processing is delegated to L2, reducing congestion on L1, thereby improving scalability, TPS and lowering transaction fees.

The quest for scalability has led to the emergence of the following L2 solutions.

Nested Blockchains

A nested blockchain (L2) runs atop another blockchain (L1). Put simply, L1 establishes the parameters for the network while L2 focuses on execution. On a single mainchain, multiple blockchain levels can be built, with each level operating as a separate blockchain. The mainchain has a parent-child relationship with the other blockchains: the parent chain controls overall parameters and delegates work to the child chains, which process it and return the results to the parent chain upon completion. Under this hierarchical model, child chains take instructions from the main chain, reducing the network’s load and facilitating fast transactions at lower costs. The OMG Plasma project, built atop Ethereum, is an example of a nested blockchain.

State Channels

A state channel is designed to facilitate two-way communication between a blockchain and off-chain transactional channels. Think of it as permitting a group of participants to conduct an unlimited number of transactions off-chain instead of on the blockchain. This significantly cuts down waiting time, as validation on state channels doesn’t require the immediate involvement of miners. Instead, the channel is a network-adjacent resource sealed off via a multi-signature or smart-contract mechanism pre-agreed by the participants.

Once the transaction or batch of transactions between the participants on a state channel is completed, the final “state” of the “channel”, as well as the inherent transitions, are recorded on-chain in a new block. Thus, state channels enable fast and low-cost transactions off-chain that are secured permanently on-chain.
State channel transactions are not public; only participants within a channel can view them, giving the parties considerable privacy. It is important to note that only the initial and final states of the transaction are recorded on the main chain. As soon as the participants sign a state update, it is considered final, implying state channels have instant finality. However, state channels work best for applications with a defined set of participants who can be available round the clock.
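The lifecycle described above can be sketched as follows (a hypothetical two-party channel; real channels use cryptographic signatures and dispute windows, which are stubbed out here with a plain hash):

```python
import hashlib
import json

class Channel:
    """Toy two-party state channel: many off-chain updates, one on-chain close."""

    def __init__(self, deposits):
        self.state = dict(deposits)  # opening state, recorded on-chain
        self.nonce = 0               # monotonically increasing state version

    def transfer(self, sender, receiver, amount):
        assert self.state[sender] >= amount, "insufficient channel balance"
        self.state[sender] -= amount
        self.state[receiver] += amount
        self.nonce += 1              # each signed update supersedes the last

    def close(self):
        """Return the final state plus a commitment to post on-chain."""
        payload = json.dumps({"state": self.state, "nonce": self.nonce}, sort_keys=True)
        return self.state, hashlib.sha256(payload.encode()).hexdigest()

ch = Channel({"alice": 5, "bob": 5})
for _ in range(3):
    ch.transfer("alice", "bob", 1)     # three off-chain updates, zero on-chain txs
final_state, commitment = ch.close()   # only this settles on-chain
print(final_state)  # {'alice': 2, 'bob': 8}
```

Note that the chain never sees the three intermediate transfers, only the opening deposits and the closing state, which is exactly where the speed and cost savings come from.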

Examples include Celer, Bitcoin’s Lightning, and Ethereum’s Raiden Network.

Side Chains

Designed to process a large number of transactions, a side chain is a transactional chain that connects to another blockchain via a 2-way peg (2WP). It operates independently and utilizes a distinct consensus protocol that can be optimized for speed and scalability. Utility tokens form an integral part of the data transfer mechanism between the sidechain and mainchain. In this setup, the mainchain’s principal role is to confirm transaction records, maintain overall security and handle disputes.
Sidechains differ from state channels in several ways. Firstly, sidechains maintain a public record of all transactions in the ledger. Secondly, a security breach on the sidechain has no implications on the mainchain or other sidechains, making it extremely useful for experimenting with new protocols or improvements to the mainchain.

However, the initial setup of sidechains entails a significant amount of time and effort. Sidechains do not derive security from the mainchain. Instead, they operate as independent chains that are responsible for their own security, which can be compromised if sidechain validators coordinate to act maliciously or if network power distribution is inappropriate.
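The two-way peg mechanism can be sketched as a pair of balancing operations (an illustrative toy; production pegs rely on SPV proofs or federations to verify the lock):

```python
class Peg:
    """Toy two-way peg (2WP): lock on the mainchain, mint on the sidechain."""

    def __init__(self):
        self.locked_on_main = 0
        self.minted_on_side = 0

    def peg_in(self, amount):
        self.locked_on_main += amount   # tokens frozen on the mainchain
        self.minted_on_side += amount   # equivalent tokens minted on the sidechain

    def peg_out(self, amount):
        assert self.minted_on_side >= amount, "cannot burn more than minted"
        self.minted_on_side -= amount   # burn the side-chain representation
        self.locked_on_main -= amount   # release the mainchain lock

peg = Peg()
peg.peg_in(10)
peg.peg_out(4)
# Invariant: side-chain supply is always fully backed by the mainchain lock.
assert peg.locked_on_main == peg.minted_on_side == 6
```

The invariant at the end is the whole point of a peg: side-chain tokens are claims on locked mainchain tokens, so a sidechain failure can burn its own tokens but cannot inflate the mainchain supply.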

Examples of sidechains include Liquid, Loom, and Rootstock.

Rollups

Rollups are L2 solutions that execute transactions outside L1 but post the transaction data on it, retaining L1 security. In essence, the burdensome task of processing transactions is offloaded to L2, and then highly compressed data is rolled up into batches and sent to L1 for storage. This arrangement enables Rollups to provide users with lower gas fees and higher transaction processing capacity whilst retaining L1 security.

Rollups can have different security models. The two most prominent are:
Optimistic Rollups – assume transactions are valid by default, which yields significant scalability gains. They employ a waiting period during which a transaction’s validity can be challenged; if fraud is proven, the offending transaction is re-executed and reverted, otherwise the batch is finalized on the ledger. Examples include Arbitrum and Optimism.
Zero-knowledge Rollups – run computations off-chain to generate a validity proof for the whole bundle of transactions. The validity proof is then submitted to the mainchain or base layer. Examples include zkSync and Starknet.
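The core data flow – compress many L2 transactions into one L1 posting plus a commitment – can be sketched as follows (the JSON/zlib encoding is an illustrative stand-in for real calldata formats):

```python
import hashlib
import json
import zlib

def roll_up(txs):
    """Batch many L2 transactions into one compressed blob plus a commitment."""
    raw = json.dumps(txs, separators=(",", ":")).encode()
    blob = zlib.compress(raw)                       # compressed calldata posted to L1
    commitment = hashlib.sha256(raw).hexdigest()    # batch commitment
    return blob, commitment

txs = [{"from": "a", "to": "b", "amount": i} for i in range(100)]
blob, commitment = roll_up(txs)

# One L1 posting now carries 100 L2 transactions; anyone can decompress
# the blob and recompute the commitment to check the batch was not altered.
assert hashlib.sha256(zlib.decompress(blob)).hexdigest() == commitment
```

What distinguishes the two rollup families is not this batching step but how the batch is verified: optimistic rollups accept it unless challenged, while ZK rollups attach a validity proof up front.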

Let us understand how different blockchains are using some of these solutions to achieve Scalability.

Bitcoin

Bitcoin and Ethereum are two blockchains where scalability is traded for decentralization and security. Scalability is a constant issue, and Bitcoin has been grappling with it since its inception. Bitcoin started out with low fees and predictable block times, but its success has been its nemesis. Its increased adoption was coupled with high fees and network congestion. The initial solution focused on increasing the block size. The majority of the community didn’t agree with it, leading to the creation of Bitcoin Cash, a Bitcoin fork. The block size was increased to 32 MB for Bitcoin Cash. However, an expansion in block size is a temporary solution as the problem is likely to resurface with an increase in network usage.

Bitcoin then implemented Segregated Witness (SegWit) to improve scalability by changing how transaction data is structured and stored. Essentially, SegWit altered the effective block capacity by segregating signature (witness) data from transaction data, freeing up considerable block space and thereby increasing capacity. However, the upgrade didn’t really solve the issue of scalability, so Bitcoin turned to the Lightning Network, an L2 solution. The Lightning Network uses smart contract functionality to establish a payment channel between transacting parties. It boosts speed and reduces cost substantially, as payments take place off-chain sans verification from the mainchain. However, to utilize this network, transacting parties need to fulfill certain conditions. It works well for simple payments, and constant improvements and upgrades are being made to enhance its deliverables. If successful, it could garner widespread adoption.

Ethereum, on the other hand, deploys more complex L2 solutions, so let’s look into those.

Ethereum 2.0 or Serenity

Designed to ease bottlenecks and enhance efficiency, speed, and scalability, Ethereum 2.0 refers to the network’s transition to a PoS-based system. Ethereum 2.0 shall support sharding and other scalability solutions to increase transaction throughput. The shift involves multiple steps, but the highlight is the migration to a PoS consensus mechanism, a less energy-intensive technology.

As the second-largest blockchain network in the cryptosphere, Ethereum aims to solve the congestion issue by deploying shards. Sharding will enable the network to spread its load and immediately address scalability. The Ethereum shards will be organized around the Beacon Chain (a PoS blockchain launched in December 2020), and each shard will function as a separate chain employing proof-of-stake to operate securely. The shards will coordinate through the Beacon Chain. At present, the Beacon Chain is separate from the Mainnet, but a merger is planned. As per the roadmap, Ethereum is set to launch 64 new shards to help expand network capacity, with each node on Ethereum storing one shard. An important implication of this is a reduced hardware requirement, making it easier to run a node, so that network participation can be enhanced and higher levels of decentralization attained. The current plan entails using shards for the aggregation and movement of data, but in time, like the Mainnet, shards will be able to execute code. As mentioned, sharding isn’t risk-free, and Ethereum aims to address the associated security risk by randomly assigning nodes to shards and reassigning them at random intervals.
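The random-assignment mitigation mentioned above can be sketched as follows (an illustrative shuffle, not the actual Eth2 committee-selection algorithm, which derives its randomness from RANDAO):

```python
import random

NUM_SHARDS = 64  # matches the 64 shards in the Eth2 roadmap

def assign(nodes, epoch, seed=0):
    """Shuffle nodes with epoch-specific randomness, then deal them round-robin."""
    rng = random.Random(seed * 1_000_003 + epoch)  # fresh randomness every epoch
    shuffled = nodes[:]
    rng.shuffle(shuffled)
    return {node: i % NUM_SHARDS for i, node in enumerate(shuffled)}

nodes = [f"node{i}" for i in range(256)]
epoch1 = assign(nodes, epoch=1)
epoch2 = assign(nodes, epoch=2)

# The same node usually lands on a different shard the next epoch,
# so an attacker cannot cheaply concentrate power on one shard.
moved = sum(epoch1[n] != epoch2[n] for n in nodes)
print(f"{moved}/256 nodes reassigned")
```

Because an attacker cannot predict or choose which shard its nodes will land on, corrupting any single shard requires corrupting a large fraction of the whole network.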

Furthermore, Ethereum’s developers guard against centralization and reduced security by adopting a mixed approach instead of relying solely on changes to the base layer (consensus protocol improvements plus sharding). They aim to leverage Layer 2 solutions (a combination of sidechains and rollups) to maximize efficiency and scalability. With Eth 2.0 and rollups working in tandem, the transactional capacity of Ethereum is expected to reach 100,000 transactions per second. However, a fully functional version of the Beacon Chain with the accompanying PoS consensus mechanism and shard chains is unlikely in the foreseeable future. Furthermore, even a full-scale implementation doesn’t guarantee infinite scalability.

At present, plenty of layer 2 solutions are available for Ethereum that help solve the issue of scalability to some degree. These include:
- Plasma
- Payment Channels
- Sidechains
- ZK-Rollups and Optimistic Rollups
- Layer 1 EVM-compatible blockchains that include Cosmos, Polkadot, and Avalanche

Each solution has its own tradeoffs. Polygon, formerly known as Matic, is worth mentioning. A Plasma-based aggregator, Polygon is a blockchain scalability platform native to Ethereum. Essentially, Polygon operates atop Ethereum and processes batches of transactions on its proprietary PoS blockchain, diverting traffic from the Mainnet. In addition, Polygon provides a robust framework for building dApps off-chain with enhanced security and speed. Dubbed Ethereum’s internet of blockchains, Polygon aims to create a multichain ecosystem of Ethereum-compatible blockchains whilst offering interoperability with other sidechains, sovereign blockchains, and other L2 solutions. However, Polygon faces stiff competition from blockchains such as Avalanche, Cosmos, and Solana that support bridges and permit the trade of several variants of Ethereum tokens.

Polkadot

Founded by Gavin Wood (one of the founders of Ethereum), Polkadot envisions itself as a next-generation blockchain protocol that unites an entire network of purpose-built blockchains, allowing them to operate seamlessly together at scale.

Both Polkadot and Ethereum 2.0 are sharding-based protocols. However, there are certain differences in the way the two approach sharding. Each shard on Ethereum 2.0 will have the same state transition function (STF), an interface provision for smart contract execution. This will enable contracts on a single shard to exchange asynchronous messages with other shards. The Beacon Chain will enable smart contract execution through the Ethereum WebAssembly interface (eWasm). A different variation of sharding is deployed on Polkadot. Its ecosystem is composed of a Relay Chain (the main chain) that connects and supports all shards, referred to as parachains, on the network. The network currently supports 100 parachains. Each parachain on Polkadot can expose a custom interface; it does not have to rely on a single interface like Ethereum 2.0’s eWasm. Instead, the arrangement permits each parachain to connect individually with the Relay Chain, giving developers flexibility in setting the rules for how each shard changes its state.

Polkadot has managed to establish a decent position for itself as Ethereum struggles with scalability woes. Ethereum, with more than 1 million transactions per day, has a much larger user and developer base than any of its rivals, giving it a competitive edge. However, the delay in the launch of Ethereum 2.0 has allowed various platforms, including Polkadot, to offer additional attributes such as superior interoperability, a key challenge in the world of blockchain.

It’s difficult to determine the future of Ethereum 2.0 or Polkadot especially when other interoperability solutions aim to alleviate dependence on such platforms by spreading the load of computational work across blockchains.

Solana

Solana’s architecture claims to ensure that transaction performance doesn’t deteriorate irrespective of the scale of throughput. Solana employs Proof of History (PoH) as a tool within its Proof of Stake consensus. The novel time-based PoH mechanism, with its means of synchronizing time across nodes, helps speed up the process of validating transactions and contributes to Solana’s ability to execute up to 50,000 TPS. Solana supports parallel validation and execution of transactions, but the technique that genuinely enables it to optimize transaction validation is pipelining. Pipelining essentially streamlines the transaction validation process, making Solana an ultrafast and scalable blockchain.
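The PoH idea – a verifiable clock built from sequential hashing – can be sketched as follows (a simplified illustration, not Solana’s actual implementation):

```python
import hashlib

def poh_sequence(seed: bytes, events: dict, ticks: int):
    """Run a sequential hash chain; events maps tick index -> event bytes.

    Mixing an event into the chain at tick t proves it existed no later
    than t, and the chain itself fixes the order of all events.
    """
    state = hashlib.sha256(seed).digest()
    records = []
    for tick in range(ticks):
        data = events.get(tick)
        if data is not None:
            state = hashlib.sha256(state + data).digest()  # event mixed in
        else:
            state = hashlib.sha256(state).digest()          # empty tick
        records.append((tick, state, data))
    return records

records = poh_sequence(b"genesis", {3: b"tx: alice->bob"}, ticks=6)

# Verification is just replay: anyone re-running the chain from the same
# seed must reproduce the identical sequence of states.
replay = poh_sequence(b"genesis", {3: b"tx: alice->bob"}, ticks=6)
assert replay == records
```

Because each hash depends on the previous one, the sequence cannot be computed in parallel or faked, which is what lets validators agree on the passage of time and the ordering of transactions without constant messaging.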

The network has suffered several interruptions in the recent past. Solana claims the interruptions were an outcome of resource exhaustion. Whether these outages are mere roadblocks in Solana’s journey to solve the inherent problem of scalability or a cause of significant concern remains to be seen.

Cosmos

Built to facilitate communication between standalone distributed ledgers sans intermediaries, Cosmos is dubbed the Internet of Blockchains. Cosmos identifies itself as an expansive ecosystem built on a set of modular, adaptable, and interchangeable tools. Through the utilization of Hubs, IBC (Inter-Blockchain Communication), the Tendermint Byzantine Fault Tolerance (BFT) engine, and the Cosmos Software Development Kit (a framework for building application-specific blockchains), Cosmos provides the necessary infrastructure for the simplified development of powerful and interoperable blockchains.
Cosmos provides an ecosystem that permits blockchains to seamlessly communicate through IBC and Peg Zones without compromising sovereignty. It addresses the notorious issue of scalability through horizontal and vertical scalability solutions.

Cosmos places a high value on new blockchains handling large transaction volumes efficiently and quickly. It leverages two types of scalability –

1. Vertical scalability – this encompasses techniques for scaling the blockchain itself. Tendermint BFT can achieve thousands of transactions per second by moving away from the PoW system. However, the outcome of these solutions is largely dependent on the application itself, as Tendermint simply improves on what’s available.

2. Horizontal scalability – these solutions focus on scaling out and addressing the limits of vertical scalability. Even if the consensus engine and applications are highly optimized, transactions per second cannot be enhanced beyond a certain point. Thus, the solution entails a move to multichain architectures; the whole idea is not to be limited by the throughput of a single blockchain. Cosmos aims to have multiple parallel chains with no limit on the number of chains that can be connected to the network. This theoretically makes it infinitely scalable, as more chains can be added when the existing ones reach their maximum TPS.
Other blockchains claiming infinite scalability with their novel technological solutions include Avalanche and ICP.

Avalanche

Developed by Ava Labs to address limitations of older blockchain platforms, Avalanche strives to deliver on high performance, extreme scaling capabilities, fast confirmation times, and affordability. It provides a platform for users to easily develop multi-functional blockchains and dApps. Its core value proposition is the deployment of a subnet architecture.

Subnets (short for subnetworks) are dynamic sets of validators tasked with achieving consensus on a certain set of blockchains. An innovative response to the trilemma, subnets offer a unique form of scaling and can’t exactly be categorized as L2 solutions. Subnets offer greater reliability, security, and decentralization, as well as flexibility in design and implementation. They are, in fact, similar and closely related to shards: they create new but separate and connected instances of the same platform. The critical difference is that subnets are algorithmically generated and can be created by users as and when required. Moreover, a sharded design fixes its structure into the protocol, whereas subnets are limitless, implying infinite subnets can be launched. The possibilities with subnets are vast, as they can include virtual machines and multiple blockchains and can be deployed for different use cases.
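The relationship between validators, subnets, and chains can be sketched as a simple registry (an illustrative data model only; the names and structure are hypothetical):

```python
class Network:
    """Toy model: a subnet is a dynamic set of validators over some chains."""

    def __init__(self):
        self.subnets = {}  # subnet name -> {"validators": set, "chains": set}

    def create_subnet(self, name, validators, chains):
        # Anyone can register a new subnet; there is no cap on how many exist.
        self.subnets[name] = {"validators": set(validators), "chains": set(chains)}

    def validators_for_chain(self, chain):
        """Every validator belonging to a subnet that secures this chain."""
        return set().union(*(
            s["validators"] for s in self.subnets.values() if chain in s["chains"]
        )) if self.subnets else set()

net = Network()
net.create_subnet("primary", {"v1", "v2", "v3"}, {"X", "P", "C"})
net.create_subnet("gamefi", {"v2", "v3"}, {"game-chain"})

# v2 and v3 validate both the primary network and the gamefi subnet,
# mirroring how Avalanche validators may join several subnets.
assert net.validators_for_chain("game-chain") == {"v2", "v3"}
```

The key property the sketch captures is that validator sets overlap freely: a validator can secure many subnets at once, and new subnets can be added without touching the existing ones.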

Subnets on Avalanche
The Avalanche Network comprises three interoperable chains: X-Chain, P-Chain, and C-Chain. X-Chain is concerned with sending and receiving transfers, C-Chain is the platform’s Ethereum Virtual Machine (read smart contracts), and the P-Chain is for coordination of validators and the creation and management of subnets. P-Chain and C-Chain utilize the Snowman consensus to enable high-throughput secure smart contracts, and X-Chain utilizes the DAG-optimized Avalanche consensus, a scalable protocol that enables low latency and fast finality.

Validators on Avalanche are required to secure and validate all three chains that together form the Primary Network. This requirement essentially enables easy connectivity and implementation between the subnets.

Subnets and Scaling
Subnets on Avalanche enable individual projects to build and connect to the Mainnet via individual chains. This reduces the amount of space they take up on the Mainnet. Adopting the aforementioned strategy enables Avalanche to divert traffic and avoid network congestion addressing the pressing issue of high gas fees and low transaction speed as it scales up.
Broadly speaking, subnets contribute to scaling in multiple ways that include –

1. Allowing competing ideas to co-exist for a single cryptocurrency. Avalanche states that in the case of incompatible sharding ideas, two different subnets with differing sharding schemes can be deployed. Alternatively, a single subnet with two subchains can try out a different sharding scheme on each (with validation assigned to a common validator).

2. Permitting the creation of infinite subnets for differing use cases. A new subnet can be launched when a chain hits its scaling limit. There are no restrictions on the number of chains that can be added, facilitating the creation of application-specific blockchains with their own set of validators (note – a validator can validate different subnets while also validating the default subnet). Application-specific blockchains can be highly optimized, and because they’re siloed from one another, the network can deliver at low fees. There are no disadvantages to generating limitless subnets other than possible inconvenience. Theoretically, an entire crypto network (like Bitcoin or Ethereum) could be ported to Avalanche over a subnet.
Avalanche claims anyone can launch a subnet on the network, and because the creation of subnets has no bounds, the potential to scale up will always exist.

ICP is another blockchain built for scalability. Let’s look at a network that goes beyond achieving security, decentralization, and scalability and focuses on building the future of Web 3.0 – a decentralized internet and global computing system.

Internet Computer

The Internet Computer (IC) is an infinitely scalable general-purpose blockchain that delivers a decentralized internet by running smart contracts at web speed. It permits developers to install smart contracts and dApps directly on the blockchain and can host decentralized financial platforms, enterprise systems, NFTs, web-based services, and tokenized social media platforms entirely on-chain. IC can be described as a sovereign decentralized network that delivers web content sans cloud computing services.

DFINITY (the foundation behind IC) set out to resolve the longstanding scalability issue by developing a network that claims to run at web speed with infinite scalability potential. The result was the Internet Computer, built and launched on the back of novel cryptography and computer science.

The Internet Computer runs on node machines hosted and operated by independent, geographically separated parties using a proprietary protocol, the Internet Computer Protocol (ICP). None of the nodes on IC are hosted by cloud providers, ensuring a tamper-proof and stable network. ICP is a secure cryptographic protocol that ensures the security of the smart contracts running on the blockchain. The Internet Computer comprises individual subnet blockchains that run parallel to each other and are connected using Chain Key cryptography. Subnets on IC do not employ PoS or PoW consensus mechanisms to process transactions; instead they utilize Chain Key cryptography. This technology permits canisters (smart contracts on IC) on one subnet to seamlessly call canisters hosted on other subnets; DFINITY claims these canisters are smart contracts that scale. Moreover, Chain Key technology allows IC to finalize transactions that alter the state of smart contracts in one to two seconds by splitting smart contract function execution into two types – update calls and query calls. Another notable feature is the network’s decentralized, permissionless governance system, the Network Nervous System (NNS), which runs on-chain. The NNS is designed to scale the network’s capacity when required by spinning up new subnet blockchains.

IC derives its ability to scale without limit from partitioning the network into subnet blockchains. Each subnet can independently process update calls and query calls, so the network can be scaled simply by adding more subnets, and the NNS permits the addition of an unlimited number of them. Subnets appear to be a secure and popular route to scalability, which raises the question: how does IC differ from Avalanche?

Internet Computer and Avalanche

Both platforms deploy subnets and claim unlimited scalability, but there are a few key differences. ICP claims an advantage over Avalanche owing to its ability to host websites and social networks; it is designed to deliver a decentralized world computer and is currently the only blockchain in that category. Avalanche gives stiff competition to other L1s by supporting speeds of up to 4,500 TPS, and, per its website, future upgrades could raise throughput to 20,000 TPS without sharding or L2 solutions. IC, for its part, has demonstrated the ability to process 11,500 TPS. On finality, another measure of speed and scalability, both chains claim sub-2-second finality.

IC requires nodes to be pre-approved and to complete full KYC, and they must operate from designated data centers with highly customized specifications. Avalanche, by contrast, keeps hardware requirements modest to maximize decentralization. Performance is CPU-bound, and if the network demands higher performance, specialized subnets can be launched.

Closing Thoughts
Blockchain is slowly but steadily becoming a base layer offering varied use cases across diverse sectors. For it to serve billions of users across the globe, one can only imagine the scalability solutions and infrastructure that will need to exist. Given the growing demand, there is room for all of these solutions to co-exist. The only requirement will be constant evolution, including ideas that haven't been conceived yet.

Blockchain Oracles

Portfolio Capital

Editorial Team

Primer – Blockchain technology, with its disruptive potential, is poised to reshape society. The decentralized technology provides a distributed digital database that facilitates transactions through an immutable, transparent, shared ledger. The ledger's immutability and irreversibility have helped it gain significant momentum across multiple industries as a thriving alternative to legacy systems. Moreover, its use cases extend beyond the realm of crypto, with myriad implementation opportunities for all kinds of businesses and interactions.

The accelerated adoption of tamper-resistant blockchain technology is revolutionizing healthcare, real estate, gaming, governance, supply chains, and traceability, in addition to finance and banking. Oracles set the stage for otherwise isolated blockchain networks to access off-chain information, an essential requirement for blockchain deployment in these sectors.

Let’s delve into what oracles are and why we need them.

Need for Blockchain Oracles

In simple terms, an oracle is a bridge between the blockchain and the real world. However, before we elaborate on oracles, it is vital to understand the reason behind their existence.

While it's safe to say blockchain technology is a game-changer underpinning the transition to Web 3.0, certain limitations, such as a blockchain's inability to access off-chain data, would render its real-world applications useless. That isn't the case, thanks to blockchain oracles. In the Web3 space, oracles are third-party services that give blockchains and smart contracts access to off-chain data, bridging the information gap. Examples of blockchain oracle projects include Chainlink, API3, and Augur.

Blockchains and smart contracts are self-contained closed systems that can't access data outside the network. However, many contractual agreements require off-chain data for execution, and this is where oracles come into play. In the absence of oracles, smart contracts would have minimal use, and DeFi would not be where it is today. DeFi relies heavily on oracles to deliver on its value proposition of broader access to financial applications: most of its offerings, such as derivatives and prediction markets, borrowing/lending markets, and insurance products, need oracles for smart contracts to react to real-world events. Oracles play a crucial role in enabling smart contracts to interact with streams of existing data, legacy systems, advanced computation, and more to execute predefined actions. The introduction of hybrid smart contracts, which combine on-chain code with secure off-chain data, has greatly enhanced the functionality of advanced blockchain applications (dApps) and broadened the scope of smart contract creation and execution. These contracts not only form the backbone of DeFi but offer use cases across multiple industries and sectors, such as insurance, agriculture, pharma, environmental sustainability, and supply chain management. Oracles are fundamental to almost all smart contracts whose execution is connected to real-world events.

It is important to note that blockchain oracles are not the data source themselves. Instead, they query, verify, and authenticate external data and then relay the information to the closed network.

Types of Blockchain Oracles

Oracles operate both on-chain and off-chain. On-chain, they establish a secure connection with the network's infrastructure, broadcast or extract data, and send proofs. Off-chain, they process and retrieve requests, transmit on-chain data to external or traditional systems, and perform off-chain computation to enhance scalability and interoperability. Different types of oracles exist, varying in their functions, validation processes, and data delivery mechanisms.

Inbound and Outbound Oracles
Inbound oracles deliver off-chain or real-world data for smart contract consumption (on-chain). They’re the most commonly used and supply information ranging from weather conditions to proof of payments to live price feeds. 

Outbound oracles send information from smart contracts (on-chain) to the external world (off-chain). These systems are designed to send commands to off-chain systems to execute specific actions. They’re most commonly used by decentralized banking networks and IoT systems. 
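The inbound/outbound distinction above can be summarized in a short sketch. The class and function names, the stubbed price source, and the `loan_repaid` event are all illustrative, not any real oracle's API.

```python
# Toy sketch of inbound vs outbound oracles (names are illustrative).
# The off-chain "data source" is a stubbed lambda, not a live feed.

class PriceFeedContract:
    """Stands in for an on-chain consumer of an inbound oracle."""
    def __init__(self):
        self.latest_price = None

    def on_price_update(self, price):
        self.latest_price = price

def inbound_oracle(contract, fetch_price):
    # Inbound: pull off-chain data and push it into the contract.
    contract.on_price_update(fetch_price())

def outbound_oracle(contract_event, trigger_offchain_action):
    # Outbound: relay an on-chain event out to an external system.
    if contract_event["type"] == "loan_repaid":
        trigger_offchain_action("unlock_collateral")

feed = PriceFeedContract()
inbound_oracle(feed, fetch_price=lambda: 1850.25)  # stubbed ETH/USD source
print(feed.latest_price)  # -> 1850.25

actions = []
outbound_oracle({"type": "loan_repaid"}, actions.append)
print(actions)  # -> ['unlock_collateral']
```

The direction of data flow is the only real difference: inbound oracles write into contract state, outbound oracles react to it.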

Software and Hardware Oracles
Software oracles interact with digital sources such as websites, online databases, servers, or any other data source on the web to supply smart contracts with real-time information. For example, the data delivered by these oracles can include exchange rates, price fluctuation, or real-time flight information. 

Hardware oracles transmit information from the real world to smart contracts. They’re designed to translate physical events into digital values that can be processed and read by smart contracts. The information is accrued from electronic sensors, RFID sensors, barcode scanners, thermometers, etc. Compared to software oracles, hardware oracles are harder to compromise. They are of particular importance in supply chain management. 

Compute-enabled Oracles
Compute-enabled oracles provide secure off-chain computation that would be impractical to perform on-chain due to technical or financial constraints. For example, ZK-rollups, a Layer 2 solution for Ethereum, use compute-enabled oracles as an alternative to Ethereum's expensive on-chain computation.

Cross-Chain Oracles
Cross-chain oracles enable interoperability between blockchains. These oracles can read and write information between different blockchains allowing data on one blockchain to prompt action on another blockchain. 


Oracles Use Cases

DeFi
Oracles enable algorithmic stablecoins, financial derivatives, and prediction markets to work seamlessly. All information integrated into DeFi services is obtained through external data feeds.

Supply chain
Supply chains on the blockchain need to interact with external data for tasks, including tracking shipments, monitoring quality control, and verifying customer identities. 

Digital identity
Oracles can enable smart contracts to confidentially verify and authenticate an individual’s identity (document proof, certificates, etc.). In addition, digital identity solutions can collate and update personal data and perform actions as defined in the smart contract.

Insurance
Oracles monitor data through satellite imagery, web APIs, etc., to verify the occurrence of insurable events. They trigger the settlement of insurance claims in a fully transparent and trustless manner. For example, flight insurance in the event of flight delays/cancellation, insurance on property or crops to hedge against poor weather conditions. Moreover, insurance smart contracts can make payout claims via traditional systems. 

They have impacted industries as disparate as betting, environment sustainability, education, gaming, and governance. 

The Oracle Problem

Smart contracts on the blockchain are immutable and deterministic, but they're by no means perfect. Their inability to read off-chain data or connect to the physical world necessitates reliance on oracles. Unfortunately, while oracles are the middleware that provides blockchains with a secure and robust gateway to off-chain systems, enabling verification of external events for contract execution, they introduce vulnerabilities that are difficult to skirt around.

The architecture of hybrid smart contracts is largely dependent on oracles functioning as an application programming interface (API) to the external world. Connection with the real world is paramount to unlocking their full potential: settlement of financial contracts needs real-time market information, parametric insurance needs weather data, supply chain management needs IoT sensors, and so on. A blockchain's inability to natively communicate with the real world without oracles means smart contracts could not deliver their full range of functionality to users.

The lack of built-in connectivity with the external world forms the crux of “The Blockchain Oracle Problem.”

Because blockchains are isolated networks, they can deliver security and reliability to users. But this very property also prevents them from obtaining external data, resulting in an over-reliance on oracles. While oracles are a viable means of bringing external data on-chain, they are third-party entities (affecting decentralization) whose reliability must be trusted (affecting trustlessness).

Centralized Oracles
As the name suggests, centralized oracles are controlled by a single entity that acts as the sole data provider for a smart contract. This creates a single point of failure: all contract participants must trust the entity controlling the oracle, and relying on one entity for accurate information jeopardizes the contract's security.

If the oracle goes offline, the smart contract loses access to its only source of information, producing improper outcomes. Likewise, if the oracle is corrupted by a malicious actor or supplied with incorrect data, the contract is directly impacted. An outcome based on falsified information cannot be reversed, putting users' funds at risk given the automated and immutable nature of blockchain transactions. Centralized oracles can thus exercise considerable control over smart contracts, compromising decentralization. Finally, preserving security and fairness becomes challenging in the face of vulnerabilities like downtime, bribery, malicious manipulation, regulatory pressure, incompetence, and hacks.

Decentralized Oracles 
The oracle problem encompasses the security, reliability, and trust conflict between oracles operating on non-deterministic data and the trustless execution of smart contracts. With the hope of resolving the aforementioned issues and mitigating fraud, Decentralized Oracles were created. 

Decentralized oracles avoid counterparty risk and aim for trustlessness and deterministic results. They combat the reliability issue by distributing trust among many network participants instead of relying on a single source of truth. A system that queries multiple oracles and multiple data sources has no single point of failure, enabling greater security, reliability, and end-to-end decentralization.
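A minimal sketch of why distributing trust works: aggregating several independent feeds with a median means a single corrupted source cannot move the reported value. The feed values below are made up for illustration, and real oracle networks use more sophisticated aggregation than a plain median.

```python
# Sketch: median aggregation over independent oracle feeds absorbs
# a single bad report. Feed values are illustrative.

from statistics import median

def aggregate(feeds):
    # Each element is one independent oracle node's reported price.
    return median(feeds)

honest = [1850.1, 1850.3, 1849.9, 1850.2, 1850.0]
print(aggregate(honest))  # -> 1850.1

# One node is bribed to report a wildly wrong price:
attacked = [1850.1, 1850.3, 9999.99, 1850.2, 1850.0]
print(aggregate(attacked))  # -> 1850.2 (median absorbs the outlier)
```

A mean would have been dragged far off by the 9999.99 report; the median only shifts by one honest value's worth, which is why robust aggregation is central to decentralized oracle design.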

Decentralized oracles offer seamless off-chain data verification that has proven effective for securing smart contracts. They are responsible for the creation of Hybrid Smart Contracts and have contributed significantly to the types of on-chain collaborations these contracts can support. 

Hybrid smart contracts facilitate a relatively safer, fairer, and scalable environment with connectivity to the physical world by blending blockchains’ tamper-proof and immutable properties with decentralized oracles’ secure off-chain capabilities. Furthermore, with no modifications to the existing blockchain infrastructure, hybrid smart contracts enhance and expand functionalities previously limited by code on the blockchain.

However, decentralized oracles do not entirely eliminate trust; they distribute it among many participants, which helps minimize counterparty risk. Decentralized oracles deploy several off-chain security approaches to extend the guarantees provided by smart contracts. Oracles need to fulfill specific properties, including round-the-clock availability, correctness, and accountability, and data manipulation and false inputs are penalized to mitigate reliability and trust issues. In addition, a given oracle's role is scoped to a specific application on the blockchain: its failure has no bearing on other smart contracts or on the consensus mechanism of the blockchain that secures them.

It is crucial to note that decentralized oracles are the accepted solution to the oracle problem. In reality, however, they are not sufficiently decentralized to solve every issue that plagues oracles: the mechanism is not free of problems like collusion between parties, signaling, mirroring, and bribery.

Is there an alternative approach, a built-in functionality, perhaps? Let’s delve deeper.

Alternative Approach

Can blockchains and, by extension, smart contracts pull in data from the real world instead of relying on oracles that push data into the blockchain?

Blockchain, known for carrying out highly secure and transparent transactions, has determinism built into its technology. Determinism is the principle that every event is causally determined by preceding events. In the context of blockchains, it means every node on the network produces an identical state for the same input at any given point. No node can arrive at a different result if transactions are to be validated successfully and the system's integrity upheld. Determinism is vital: if results differed, nodes would disagree, consensus would fail, and the entire system would become inconsistent and worthless. Because a blockchain supports only one version of the truth, every operation and application on it, including smart contracts, must be deterministic too.

Real-world data isn't deterministic, and blockchain's immutability limits its flexibility when processing external information. Suppose blockchains could access external information directly. Even then, there is no guarantee that every independent node would query the same source, or that the results would be identical. Such an arrangement would open the network to numerous vulnerabilities, including false-data injection and hacking, putting the blockchain's value proposition at risk.
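The consensus failure described above can be made concrete with a few lines of Python. The sketch hashes each node's view of the data: if responses embed anything volatile, such as a server timestamp, nodes hash different bytes and can never agree on one state. The field names and values are invented for illustration.

```python
# Sketch: non-deterministic inputs break consensus. If each node's
# response embeds a server timestamp, every node derives a different
# state hash and agreement is impossible.

import hashlib
import json

def state_hash(response: dict) -> str:
    # Canonical JSON (sorted keys) so only the *values* matter.
    payload = json.dumps(response, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

# Two nodes query the "same" endpoint moments apart:
node_a = {"price": 1850.2, "timestamp": 1700000001}
node_b = {"price": 1850.2, "timestamp": 1700000003}
print(state_hash(node_a) == state_hash(node_b))  # -> False: consensus fails

# Identical deterministic input gives identical state everywhere:
print(state_hash({"price": 1850.2}) == state_hash({"price": 1850.2}))  # -> True
```

This is exactly the gap oracles fill: they turn one agreed-upon value out of many possible observations before the data ever touches the chain.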

The aforementioned reasons prohibit the integration of oracles into the base layer. However, as separate networks, oracles help translate the non-deterministic external information into a language readable on-chain, minimizing all associated risks while ensuring blockchains retain their determinism. 

A blockchain network's capabilities clearly extend beyond the mere transfer and exchange of crypto. Smart contracts further expand the possibilities across a host of different domains, including but not limited to supply chain management, insurance, and digital identity. Oracles have contributed significantly to the tokenization of physical assets, but the oracle problem prevents determinism from extending to the oracle layer, hindering growth and preventing blockchains from unlocking their true potential.

But as the blockchain space advances, it witnesses constant innovation and technological developments. One such development is the Internet Computer’s proposal of an alternative approach enabling dApps to make HTTP requests directly. 

Internet Computer 

Canisters (smart contracts) on the Internet Computer (IC) will be able to communicate with external servers via HTTP requests. Integration with Web 2.0 is essential for executing and implementing hybrid smart contracts. Most blockchains rely on oracles, but with its Chromium update, IC intends to empower canisters to make HTTP requests, enabling direct, trustless integration with the Web 2.0 world.

The problem with the existing setup is that the oracle makes all the HTTP requests, weakening data integrity by introducing trust assumptions. IC hopes to simplify the process of accessing external data.

Chromium release
Every node on the IC network will make the same HTTP request via an asynchronous API provided through the management canister. Once the nodes receive a response, they sign it and gossip it to the other nodes on the network. The consensus layer then aggregates enough signatures (from at least two-thirds of the subnet's replicas) to include the response in the blockchain. The IC block is finalized and routed back to the system to conclude the computation that originated the HTTP request.
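The aggregation step above can be modeled as a simple vote count. This is a deliberately simplified sketch (the real protocol uses threshold signatures, not a tally of identical strings), and the payloads and subnet size are invented.

```python
# Toy model of the described aggregation: a response is included in a
# block only once at least 2/3 of the subnet's replicas signed the same
# payload. (Simplified: IC actually uses threshold cryptography.)

from collections import Counter

def reach_consensus(signed_responses, subnet_size):
    # signed_responses: list of (replica_id, payload) pairs
    counts = Counter(payload for _, payload in signed_responses)
    payload, supporters = counts.most_common(1)[0]
    if supporters * 3 >= subnet_size * 2:  # at least 2/3 agreement
        return payload
    return None  # no quorum: the response cannot be finalized

subnet = 13
votes = ([(i, "price=1850.2") for i in range(9)] +      # 9 honest replicas
         [(i, "price=9999") for i in range(9, 13)])     # 4 faulty replicas
print(reach_consensus(votes, subnet))  # -> 'price=1850.2' (9/13 >= 2/3)
```

With 9 of 13 replicas agreeing, the honest payload clears the two-thirds bar; had 5 or more replicas lied, no payload would reach quorum and the call would simply fail rather than return bad data.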

IC claims this approach works well when all nodes receive the same response at the same time, and that it protects against a malicious node reporting false information as long as enough nodes report the same response. Acknowledging possible inconsistency, the team at IC outlines a trickier scenario in which the responses are essentially the same but contain minor differences, such as timestamps and unique identifiers, that don't matter for the computational result yet make it impossible to reach consensus on the responses received. IC aims to address this by coding around the inconsistencies with a function that transforms the response to include only specific fields, removing values (timestamps, unique IDs, etc.) that could differ across responses and affect consensus. IC is optimistic that responses can be returned into the smart contracts' state in a deterministic fashion. While the long-term plan includes support for POST requests, the upcoming Chromium release supports GET requests only.
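The transform function described above can be sketched in a few lines. The field names (`price`, `symbol`, `timestamp`, `request_id`) are invented for illustration; the real transform is canister-defined code running on each replica.

```python
# Sketch of the response transform: keep only the fields the canister
# needs so every replica's cleaned response serializes identically.
# Field names below are illustrative.

import json

KEEP = {"price", "symbol"}  # fields that matter for the computation

def transform(raw_response: dict) -> str:
    # Drop volatile fields (timestamps, request ids, server headers)
    # that would otherwise differ from replica to replica.
    cleaned = {k: v for k, v in raw_response.items() if k in KEEP}
    return json.dumps(cleaned, sort_keys=True)

replica_a = {"symbol": "ETH", "price": 1850.2,
             "timestamp": 1700000001, "request_id": "a91f"}
replica_b = {"symbol": "ETH", "price": 1850.2,
             "timestamp": 1700000004, "request_id": "77c0"}

print(transform(replica_a) == transform(replica_b))  # -> True: consensus possible
```

After the transform, all replicas hold byte-identical payloads, so the two-thirds signature threshold can actually be met despite the raw responses differing.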

Canisters' ability to autonomously connect with the public Internet to submit and retrieve data will open the network up to enhanced use cases without third parties or additional trust assumptions. The ability to interface directly with HTTP services would be a game-changer, as no other blockchain has directly integrated oracle-like functionality. IC's subnet-based architecture, coupled with its flexible consensus implementation, is what enables this integration into its protocol stack. Whether IC succeeds in delivering this functionality, only time will tell.

Closing thoughts
Today, there is a growing need for blockchains, and by extension smart contracts, to obtain external data to support the continued expansion of blockchain projects and their new and innovative use cases. Oracles empower smart contracts to interact seamlessly with otherwise inaccessible traditional systems and are vital to unlocking a decentralized future. However, in their current state, oracles are vulnerable entities that represent one of the largest threats to DeFi, through compromised data integrity, deliberate tampering, front-running, and the like. Their use presents many challenges, but connectivity between blockchains and external data is crucial for blockchains to have a sustainable impact in practical applications. Possible alternatives include native blockchain oracles that ensure the integrity and security of the data, or direct integration as in the case of the Internet Computer, where dApps would make HTTP requests directly.

Nevertheless, despite the drawbacks and the absence of other tested alternatives, oracles remain essential for informing and enforcing decisions as we move toward a world where decentralized solutions dominate the market.