The scalability of blockchain and the demand for decentralized finance
Author: Energi
Compiled by: ChainCatcher
In the recent bull market, blockchain scalability has become a focal point. As demand for transactions on the Ethereum network has grown, transaction fees have risen with it, with a single swap on Uniswap costing more than $400 at peak times.
Limited to a maximum throughput of roughly 15 transactions per second (TPS), the Ethereum blockchain has struggled to cope with the surge in demand, forcing users to bid ever-higher fees to get their transactions processed.
This has led the crypto community to seriously discuss the need for scalability to support the ongoing growth of decentralized finance (DeFi). So, what is scalability, and is TPS really the most important part of it?
In terms of blockchain technology, scalability is a complex concept. Fundamentally, it refers to the ability of a blockchain to handle an increasing demand for transactions without encountering bottlenecks or sacrificing network stability and decentralization.
In the early days of cryptocurrency, when demand and adoption were low, scalability was not an issue. However, as interest in crypto and investment has grown, so has the demand for transactions and data storage. This has largely been driven by the explosive growth of DeFi, which requires more transactions to support its decentralized applications.
So far, the Ethereum network has carried most of DeFi. As the first blockchain to introduce smart contracts, Ethereum became the default home for DeFi. Yet Ethereum's roughly 15 TPS cannot serve even today's DeFi demand, even though DeFi still accounts for less than 1% of the global financial market. This has driven up transaction fees, pricing out the vast majority of users.
But TPS is not the only issue. As more transactions are processed on-chain, the chain data grows rapidly, and the number of full nodes running the Ethereum network has fallen; since the recent bull market peak, it has dropped by about 60%, because running a node has become increasingly difficult and costly. As this worsens, the decentralization and security of the network will weaken over time, putting the entire ecosystem at risk.
If DeFi is to meet the financial needs of billions of people, the platforms built for it will need to scale to meet enormous demands for transaction throughput and data management, not just now but for the foreseeable future. This is why scalability is so important. Without it, DeFi cannot become a global standard and displace centralized banks and lenders.
There are almost as many approaches to solving blockchain scalability issues as there are blockchains. The most popular methods have made good progress in either scaling the throughput capacity of existing blockchains or building faster new blockchains. However, they all involve some trade-offs.
Sharding is an increasingly popular method for addressing scalability issues, especially in newer blockchains like Polkadot and Cosmos. Ethereum 2.0 will also transition the Ethereum network to a sharded ledger.
The way sharding works is by dividing the network nodes into several groups, allowing each group to handle and store a separate ledger. The shards then communicate with each other to reach consensus and ensure the entire ledger is accurate.
This approach can significantly increase the network's throughput by processing transactions in parallel. However, it also has notable drawbacks: cross-shard transactions require extra communication and coordination between shards, and dividing validators among shards leaves each individual shard with less security than the network as a whole.
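To make the mechanism concrete, here is a minimal sketch of shard routing in Python: transactions are deterministically assigned to one of a handful of shards by hashing the sender's address, so each shard's validator group can process its own batch in parallel. The shard count and transaction fields are illustrative, not taken from any particular protocol.

```python
import hashlib
from collections import defaultdict

NUM_SHARDS = 4  # hypothetical shard count

def shard_for(address: str) -> int:
    """Deterministically map an account address to a shard."""
    digest = hashlib.sha256(address.encode()).digest()
    return digest[0] % NUM_SHARDS

def route_transactions(txs):
    """Group transactions by the sender's shard so each shard's
    validator group can process its batch in parallel."""
    shards = defaultdict(list)
    for tx in txs:
        shards[shard_for(tx["from"])].append(tx)
    return dict(shards)

txs = [{"from": "0xAlice", "to": "0xBob", "value": 1},
       {"from": "0xCarol", "to": "0xDave", "value": 2}]
batches = route_transactions(txs)
# every transaction lands in exactly one shard's batch
assert sum(len(b) for b in batches.values()) == len(txs)
```

The cross-shard problem follows directly from this picture: a transfer whose sender and recipient hash to different shards needs a message between two validator groups before both ledgers agree.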
Rollups are a Layer 2 solution, meaning they can be applied to existing blockchains like Ethereum to increase their throughput. Many dApps on Ethereum use some form of rollup to meet the demands of a slow network.
The functionality of rollups is to bundle transactions, execute them off-chain, and then publish the data to the Layer 1 blockchain using some proof to confirm accuracy. By doing this, they can speed up processing times while still relying on the underlying blockchain's consensus mechanism for validation and security.
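The bundle-execute-publish flow can be sketched in a few lines of Python. This toy rollup applies a batch of transfers off-chain and produces only two hash commitments, standing in for the transaction data and state root a real rollup posts to Layer 1; the function and field names are illustrative, not any production rollup's API.

```python
import hashlib
import json

def execute_off_chain(txs, state):
    """Apply a batch of simple transfers off-chain (the rollup 'execution')."""
    for tx in txs:
        state[tx["from"]] = state.get(tx["from"], 0) - tx["value"]
        state[tx["to"]] = state.get(tx["to"], 0) + tx["value"]
    return state

def commitment(obj) -> str:
    """Hash commitment published to Layer 1 in place of full execution."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def build_rollup_batch(txs, state):
    """Execute off-chain, then return the compact data that goes on-chain."""
    new_state = execute_off_chain(txs, dict(state))
    return {"tx_root": commitment(txs), "state_root": commitment(new_state)}
```

However many transfers the batch contains, what lands on Layer 1 is a fixed-size pair of commitments, which is where the throughput gain comes from; the accompanying proof (fraud proof or validity proof) is what distinguishes the two rollup families discussed next.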
There are two main types of rollups: Optimistic Rollups and Zero-Knowledge (ZK) Rollups. Both have successfully increased the throughput of existing Layer 1 blockchains, but neither has provided a long-term solution that can scale with the growth of DeFi: Optimistic Rollups impose a challenge period, typically about a week, before funds can be withdrawn to Layer 1, while ZK Rollups require computationally expensive proof generation and have historically offered limited support for general-purpose smart contracts.
Finally, there are some blockchains, like Solana, that conduct all activities on a single-layer network. This approach involves allowing extremely powerful servers to communicate to achieve very high levels of transaction processing per second.
Energi takes a similar approach, with a network of extremely powerful servers handling transactions. However, Energi does not operate solely on Layer 1; a comparable system runs in our Layer 2 (Masternodes). This allows Energi to achieve tens of thousands of transactions per second while retaining several key advantages over single-layer blockchains.
In the next section, we will delve deeper into Energi's approach.
At Energi, we have adopted a very different method to address scalability and transaction throughput. In our third-generation upgrade, we implemented a Layer 2 solution on top of an Ethereum-compatible codebase, allowing our Masternode Layer 2 network to process transactions almost instantly. Energi Masternode specifications are required to scale with network demand, ensuring near-instant confirmations even during heavy usage. In each block, the Masternodes flush their transactions to our Layer 1 network, where they are ultimately stored.
This approach significantly increases throughput compared to Ethereum, whose roughly 15-second block time makes it sensitive to block propagation delays. Energi's 1-minute block time lets us process more transactions per Layer 1 block without handing well-connected validators a propagation advantage that would erode the network's decentralization. We can also adjust block times as needed, allowing the Energi network to handle tens of thousands of transactions per second while keeping confirmations fast and the network highly decentralized.
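The relationship between block time, block capacity, and throughput is simple arithmetic, sketched below. The per-block figures are illustrative, not actual protocol parameters; the point is that a longer block time does not by itself reduce TPS as long as block capacity grows with it, and that raising capacity per second is what actually raises throughput.

```python
def layer1_tps(txs_per_block: int, block_time_s: int) -> float:
    """Layer 1 throughput is block capacity divided by block time."""
    return txs_per_block / block_time_s

# Roughly 225 transactions per ~15 s block gives Ethereum's ~15 TPS.
assert layer1_tps(225, 15) == 15.0

# A 60 s block with 4x the capacity yields the same TPS...
assert layer1_tps(900, 60) == 15.0

# ...so higher TPS comes from fitting more transactions per second,
# not from the block time alone.
assert layer1_tps(60_000, 60) == 1000.0
```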
While Energi is ready to meet the growing throughput demands of DeFi, we also understand that throughput is not the most critical barrier to pursuing scalability. Chain size is an urgent issue that must be addressed if the network is to scale smoothly in the coming years.
As demand grows and more transactions are processed on-chain, the on-chain data also increases. Ethereum's archived chain data now exceeds 8 TB, making synchronization very cumbersome and requiring its nodes to provide massive storage capacity.
While Ethereum's chain size has not yet reached a critical point, it soon will. As adoption increases, on-chain data grows rapidly. Many projects focus on providing the throughput needed to handle growing demand, but few have proposed solutions for managing the vast amounts of archived data needed to maintain complete records of on-chain transactions.
Large chain data makes it impossible for smaller nodes to run the network, pushing the network away from decentralization and making it less secure and more susceptible to attack. This is why we focus on solving chain size first. By keeping the active chain small, more nodes can run it, resulting in greater throughput and a more robust, decentralized network.
To address this issue, we are implementing a unique archiving method that will keep our active chains at a manageable size while ensuring that archived data is well-maintained and accessible. This method is called "snapshotting," which is based on the process we used to transition from Energi 2.0 to 3.0.
Each time our main chain reaches a maximum size of roughly 5-10 TB, we will take a snapshot of the chain at that point. The snapshot becomes a separate archived chain, linked to the active chain by a recorded hash of the snapshot. These archived chains will be maintained by a smaller set of validators, incentivized through staking rewards. Accessibility requirements for archive nodes will be relaxed, allowing them to use cheaper storage such as AWS Glacier.
As demand and throughput grow, the time it takes for the chain to reach these limits will shrink from years to months or even days. The active chain will be pruned into archives regularly, ensuring it remains a manageable size and preventing the network from centralizing around a few large validators with disproportionate control.
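The snapshotting lifecycle described above can be sketched as follows. This toy chain archives itself every few blocks (standing in for the ~5-10 TB threshold) and records the hash of the last archived block, so the pruned active chain stays verifiably linked to its history. The class and field names are hypothetical, not Energi's implementation.

```python
import hashlib

SNAPSHOT_LIMIT = 5  # blocks per archive; stands in for the ~5-10 TB limit

def block_hash(block: dict) -> str:
    """Toy stand-in for a real block hash."""
    return hashlib.sha256(repr(sorted(block.items())).encode()).hexdigest()

class SnapshottingChain:
    """Active chain that is cut into hash-linked archives at a size limit."""

    def __init__(self):
        self.active = []    # blocks every validator must keep online
        self.archives = []  # (anchor_hash, blocks) held by archive nodes

    def append(self, block: dict):
        self.active.append(block)
        if len(self.active) >= SNAPSHOT_LIMIT:
            # Record the hash of the snapshot's last block so the pruned
            # active chain remains verifiably linked to its archive.
            anchor = block_hash(self.active[-1])
            self.archives.append((anchor, self.active))
            self.active = []
```

After twelve blocks, for instance, two five-block archives exist and only the two newest blocks remain on the active chain, yet the full history is still reachable through the anchor hashes.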
As the DeFi industry evolves, the demand for the networks that run it is also increasing. The recent bull market has highlighted the need for greater scalability, which the Ethereum network cannot meet, driving fees to absurd levels in the process. There are many approaches to addressing scalability, but few tackle the critical issue of chain size and the accompanying risks of centralization.
Energi is taking a holistic approach to solving scalability issues. We have implemented a Layer 2 solution that allows us to process many times more transactions per second than Ethereum while retaining a high degree of decentralization that most current Layer 1 blockchains lack.
Equally important, we have designed a blueprint to address the largely overlooked issue of chain size, creating a system that lets Energi sustain very high transaction throughput while pruning the active chain into archives as often as daily. We take pride in our unique approach to archiving chain data, which will keep Energi fast and scalable well into the future.