Those who have been in the crypto sphere for more than five minutes will have undoubtedly encountered discourse on the Bitcoin scaling problem.
The limited capacity offered by the default 1MB block size on the Bitcoin network has been the subject of ongoing debate, with disagreement on the best way to scale the network to enable Bitcoin to become a fundamental component of our global financial and payment infrastructure.
The best-publicised solutions to the Bitcoin scaling problem, at least to date, have centred around layer two solutions. That is, those that sit above the blockchain and don’t transact directly on it – such as the Lightning Network and Strike.
These solutions move transactions off the blockchain – layer one – onto cheaper secondary networks, such as payment channels or sidechains, which later settle back to the first layer.
For nChain CTO and technical director of the Bitcoin SV Infrastructure Team, Steve Shadders, the pursuit of second-layer solutions to scaling not only overcomplicates matters but also moves away from the core benefits and value proposition of a blockchain. Instead, Shadders contends that Bitcoin could always have scaled at the first layer – it was just a matter of letting it.
Last month in Zurich, at the biannual CoinGeek conference, Shadders gave a live demonstration of what he called Teranode – a multi-machine Bitcoin node implementation built for scale – as it processed more than 50,000 transactions per second in real time, directly on the Bitcoin SV blockchain.
Compared with the status quo on the Bitcoin network, where the first layer handles around seven transactions per second, it is a staggering increase.
The key: approaching scaling differently. The traditional view of scaling blockchains is to scale vertically, that is, by increasing the power of each node to keep up with a larger volume of transactions. Teranode challenges this stance by instead scaling horizontally across several commodity machines.
“This is how most of the world achieves scale in traditional architectures – think of the Googles and Amazons of the world – by simply spreading the workload across multiple machines in the most efficient manner possible,” Shadders explains.
“Teranode applies this philosophy to Bitcoin and is a reimagining of how to implement the Bitcoin Protocol. Several different tasks need to be carried out by a Bitcoin node, most of which are completely independent of each other.
“That means that there’s no reason why they need to sit on a single machine as there are no cross-dependencies between these individual pieces of work.”
The Teranode implementation that Shadders showed is underpinned by the Teranode Open Framework, a technology-agnostic model and methodology based on configurable pipelines – essentially an ordered series of tasks not bound to specific machines, enabling it to scale horizontally across clusters of worker-nodes.
Teranode applies this methodology to Bitcoin, enabling the efficient allocation of tasks and resources for a node within a configurable and adaptable system. Simply put, that means that Teranode itself scales too. Higher transaction throughput is enabled by adding more worker-nodes to the clusters and allocating resources across them.
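Teranode's source code isn't shown here, but the pipeline idea Shadders describes – independent per-transaction tasks fanned out across interchangeable workers – can be sketched in a few lines of Python. Everything below (the task functions, the pool size, the transaction format) is illustrative, not actual Teranode code:

```python
# Illustrative sketch of horizontal scaling: independent tasks spread
# across a pool of workers. None of these names come from Teranode.
from concurrent.futures import ThreadPoolExecutor
from hashlib import sha256

def validate(tx: bytes) -> bool:
    """Stand-in for script/signature validation -- an independent task."""
    return len(tx) > 0

def tx_id(tx: bytes) -> str:
    """Stand-in for indexing a transaction: derive its id."""
    return sha256(tx).hexdigest()

def process_block(txs, workers=4):
    """Fan the per-transaction work out across a worker pool.

    In a Teranode-style deployment each worker would be a separate
    commodity machine, and throughput grows by adding more of them.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        valid = list(pool.map(validate, txs))
        ids = list(pool.map(tx_id, txs))
    return [i for i, ok in zip(ids, valid) if ok]

txs = [f"tx-{n}".encode() for n in range(1000)]
print(len(process_block(txs)))  # 1000
```

Because validating one transaction needs nothing from another, the map step parallelises cleanly – which is the property Shadders is pointing at when he says the tasks have no cross-dependencies.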
A decade’s work
For most, CoinGeek Zurich will have been the first time the name Teranode entered their vocabulary. But for Shadders, the live demonstration represented the product of the better part of a decade's work, building and developing an idea that originated late one night in his native Brisbane, Australia.
“Lying in bed, it must have been nearly 2am, and I was thinking about the concepts of microservices architecture and how Bitcoin dependencies are internally structured,” he says.
“Like a total nerd, I had a big whiteboard in my bedroom and when it clicked for me, I leapt out of bed and just started scrawling on it. There was a full moon and I hadn’t even had the wherewithal to turn on a light, I was just frantically drawing boxes.
“My wife woke up minutes later and asked me what the hell I was doing, before pointing out that I had no pants on and I may want to address that before continuing on with anything else.”
What Shadders now endearingly refers to as “the whiteboard incident” and the concepts which emerged from it didn’t immediately gain traction. They did, however, provide the impetus for his move to London not long after to join nChain, where he was tasked with developing the next generation of Bitcoin node software.
“I already had much of what I needed to proceed with that project,” he recalls.
“I’d previously done plenty of work on bitcoinj – a Java implementation of the Bitcoin protocol – implementing large parts of the protocol functionality, and developed mining pool software that was used by nearly half the world’s Bitcoin miners in its heyday.
“What I didn’t have was a team that was close enough in physical proximity to be able to work together effectively.”
That part of the equation was solved when Shadders arrived in London, with nChain assembling a team of programmers and engineers who shared his broad vision for Bitcoin scaling and challenged how exactly it could be put into practice.
“When we had our first group workshops at nChain in 2019, we wrote down a few guiding principles, one of which was to assume no bounds on available resources and that every system would be pushed orders of magnitude beyond what we are thinking right now,” he explains.
“That way of thinking made me realise that a few of the ideas and data structures I’d had in mind did have a scale ceiling – it was just that it was far beyond anything I thought could ever conceivably be needed.
“But as time has progressed and we have thought about new ways that Bitcoin can be used, those assumptions have crumbled and we’re now thinking about millions, even billions of transactions per second.”
This innovation comes as Bitcoin continues to endure body blows in the media over its resource consumption and environmental impact. Shadders argues that scaling Bitcoin puts these arguments to rest, as the efficiency and potential use cases of the network grow exponentially, shifting the value proposition entirely.
“The energy consumption of Bitcoin has very little relationship with the number of transactions being processed – if you are handling one million transactions per second, then you have 200,000 times better efficiency in energy cost per transaction than the current average of five transactions per second we see on BTC,” he adds.
“The per-transaction cost is what is important here. An individual transaction represents a certain amount of utility and we don’t mind using energy to create utility – no one would argue that we shouldn’t use energy to run a hospital, because it provides enormous value for the people using it and for society at large.
“Similarly, if we can define what the actual utility of Bitcoin is, then we have a metric to measure what we’re getting from the energy that we spend and can ultimately determine the value of the network. And that utility can only be realised with a Bitcoin that can scale without bounds.”
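The efficiency figure in that quote is straightforward to check. Under Shadders’ assumption that total energy use stays roughly constant regardless of throughput, energy cost per transaction falls in direct proportion to transactions per second. A trivial sketch (the throughput numbers are the ones he cites, not measurements):

```python
# Checking the energy-per-transaction arithmetic from the quote above.
BTC_TPS = 5              # average BTC throughput cited by Shadders
SCALED_TPS = 1_000_000   # hypothetical scaled throughput

# If total energy E is roughly fixed, per-transaction cost is E / tps,
# so the efficiency improvement is just the ratio of throughputs.
improvement = SCALED_TPS / BTC_TPS
print(improvement)  # 200000.0
```

That ratio is the 200,000-fold improvement in energy cost per transaction quoted above.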