Blockchain Learning, IoT, Proof of Stake

Our Response to NIST

The National Institute of Standards and Technology requested information for a congressional study regarding emerging marketplace trends in a variety of areas, including blockchain and IoT. Read our response below as we discuss applications of Decentralized Computing, how consensus works, and what’s next.

Abstract

This paper contends the following: blockchain as a ledger or immutable database alone is a limited technology with few real use cases. The true value of the technology lies in the consensus mechanisms that enable its real value proposition, namely digital value and distributed peer-to-peer trust. This paper outlines why consensus is the true innovation in the blockchain space, some methods of consensus, and applied use cases. The applied use cases discussed are digital tokens; defense applications and distributed trust; and IoT and distributed data exchange in industry, including energy. It is important that this reviewing body consider the implications of such a technology in order to create smart policy. Lastly, we encourage regulatory frameworks in the space to focus on enabling technology growth on public, decentralized networks where entrepreneurs can be free to build and innovate.

Introduction

We believe the significance of “Blockchain technology” lies in the creation of new value models, consensus mechanisms, and tokenization itself, not in the “blockchain” data structure. Blockchain technology is a broad, oft-misused term that people use to describe many things, frequently just an immutable database. An immutable database, while having limited use cases in regulatory and legal contexts, is not a revolutionary technology in and of itself. While it is important to look past the current sensational, speculative hype surrounding cryptocurrencies and their derivatives like NFTs, we must also look past enterprise efforts that treat blockchain as a limited database technology. Cutting through the confusion, we will look toward the core capabilities of the technology and where it will lead: a future of digital value exchange.

The true innovation and revolutionary technology are the consensus mechanisms of these Decentralized Computing platforms powering cryptocurrencies. Similar to the computer enabling machines to create and store data or the internet allowing machines to exchange data, Decentralized Computing enables machines to create, transact, and store value. This mechanism enables trust between independent users around the globe, who do not need to know each other or even utilize a third party to moderate their interaction. Value exchange has been democratized for the first time in human history, enabling all users to control their own credit, without reliance on a third party like a bank, government, or temple to set exchange rates and mediate value exchange.

This is a fundamental change in the way we will interact with currency and value in our lives. In much the same way that past revolutions displaced the main players in the industries they disrupted (the Industrial Revolution and the Digital/Internet Revolution), so too will Decentralized Computing continue to disrupt established banking and financial institutions, and even beyond, the centralized enterprises that currently control information flow on the internet.

We believe very clearly that the future of “blockchain,” or Decentralized Computing, will be built on open, decentralized public protocols like Bitcoin and Ethereum, for the reasons outlined in this paper. It is vitally important that innovation, use cases, and regulation in this space are designed with this in mind. In short, we believe public, open, decentralized computing networks utilizing Proof of Work, Proof of Stake, or another yet-to-be-designed mechanism will win out due to their unique incentive structures that draw in a wide range of users. Combined with their open architecture, which allows anyone to build, public networks will create a myriad of use cases even beyond those contained below. It is important that builders on these free and open systems be enabled to innovate and create in order to maximize the potential of decentralized computing technology.

Overview of Consensus

Briefly, a consensus mechanism is designed to enable networked computers to agree on (1) a set of data, (2) modifications to or computations with that data, and (3) the rules that govern that data’s storage and computation. It is consensus that allows a cryptocurrency like Bitcoin to exist, not “blockchain technology.” For example, in the Bitcoin network, there is a set of rules that governs the issuance rate and the maximum amount of Bitcoin that can ever exist (21 million). These rules are agreed to by network participants, and over time, trust is built up that Bitcoin tokens will continue to conform to the rules of the network. It is this part of the distributed consensus mechanism that allows Bitcoin to have and retain value. Consider the US dollar: there is no limit on the number of dollars that can exist; however, the strength of the US economy, military, and government creates a trusted environment where the dollar can have value, even though it is a ‘fiat’ currency. Because Bitcoin has no central government or physical assets, trust and value must be built and maintained through the consensus mechanism. To this end, Bitcoin created a unique incentive structure as part of its Proof of Work consensus (explained later) that made participants in the network want to conform to and maintain the rules so that value could grow.
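The 21 million cap is worth dwelling on: it is not a number stored anywhere, but a consequence of Bitcoin’s issuance rule, in which the block reward starts at 50 BTC and halves every 210,000 blocks. A minimal Python sketch of that rule, computed in integer satoshis as Bitcoin nodes do (illustrative only; real nodes apply this per block, not as a single sum):

```python
def total_bitcoin_supply() -> float:
    """Sum Bitcoin's issuance schedule: 50 BTC per block,
    halving every 210,000 blocks, in integer satoshis
    (1 BTC = 100,000,000 satoshis)."""
    reward = 50 * 100_000_000  # initial block reward, in satoshis
    supply = 0
    while reward > 0:
        supply += 210_000 * reward  # every block in this halving era
        reward //= 2                # the "halving"
    return supply / 100_000_000    # convert back to BTC

print(total_bitcoin_supply())  # just under 21,000,000
```

Because integer division eventually drives the reward to zero, issuance stops on its own; the famous 21 million is the ceiling this schedule approaches, not a separately enforced constant.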

This network security is also what allows a network to be open and permissionless. Anyone can build on these networks without fear that any single actor could destroy them, and this is partially what makes the technology so powerful. When the openness is removed and the network becomes a Consortium consensus model or a private one, the ability to create is limited and becomes mediated through the controlling enterprises on the network.

There are several Consensus Mechanisms in use today: Proof of Work, Proof of Stake, Consortium, and Proof of Authority. These mechanisms provide the backbone for Decentralized Computing networks and we’ll see that the open mechanisms, Proof of Work and Stake, do far more than simply act as a “blockchain ledger.”

Proof of Work

Proof of Work is the original consensus mechanism, designed by the pseudonymous Satoshi Nakamoto. It is quite simple: network participants use their computers to solve a math problem, and the first computer to solve it is rewarded in Bitcoin along with the ability to process a batch of transactions, earning those fees as well. This has been occurring nearly non-stop since the launch of Bitcoin in 2009. The mechanism is designed so that the problem’s difficulty adjusts as more computers join the network, keeping the pace at which a Bitcoin block (and reward) is mined at roughly every 10 minutes. The reward from Proof of Work makes for a powerful incentive. Participating computers, especially today as the amount of computing power on the network has grown exponentially, spend a great deal of money to purchase, maintain, and run their equipment. They must ensure the rules of the network are maintained so that they can continue to earn money from transaction fees and block rewards. Someone looking to disrupt the network would need to deploy an incredibly expensive amount of computing power to manipulate blocks or transactions, an effort not worth it for any individual or company. This methodology has proven very successful, though there are two major concerns. The first is the amount of electricity required to maintain the network. A full treatment is outside the scope of this paper; however, thus far, predictions of future energy usage have been quite incorrect, and there is no indication that Bitcoin is generating demand for new energy sources, only utilizing excess energy in specific areas. The second concern is the viability of any new protocol launching its own Proof of Work network. Since there is so much computing power involved in the cryptocurrency space today, it is easy for an attacker to disrupt and destroy smaller Proof of Work networks.
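The “math problem” above can be illustrated with a toy version of Bitcoin’s hash puzzle: search for a nonce such that the hash of the block data plus the nonce begins with a required number of zeros. This Python sketch is a simplification (real mining uses double SHA-256 over a binary block header against a numeric difficulty target, not leading hex zeros):

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Brute-force a nonce so that SHA-256(block_data + nonce)
    begins with `difficulty` leading zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # the proof that work was done
        nonce += 1

nonce = mine(b"block #1", 4)  # each extra zero multiplies the work by ~16
```

Verifying a solution takes a single hash, while finding one takes many; that asymmetry is the entire trick. Bitcoin retunes its real target every 2,016 blocks to hold the 10-minute average described above.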

Proof of Stake

Two other key limitations of Bitcoin drove innovation toward new methodologies of consensus, resulting in Proof of Stake. The first is that Bitcoin is not Turing-complete: it cannot do much other than manage the transfer and storage of bitcoins. There is nothing wrong with this per se, but innovators were interested in ways to build an “internet computer,” a decentralized computer with built-in payment rails that would enable users to create a new generation of computer programs and new types of digital assets, all natively decentralized and uncensorable (streaming platforms, social media, music, art, etc.). This, in part, required a new methodology to overcome the second limitation of Bitcoin, the speed of the network (Bitcoin blocks only come once every 10 minutes), while still maintaining the overall security of the public network. Proof of Stake emerged as the most likely solution and works as follows: instead of rewarding computers based on the computing power they contribute, Proof of Stake rewards participants for the number of network tokens they have “staked,” or deposited, on the network. Participants can choose from a variety of network computers, or Validators, to stake with, and are incentivized to choose a Validator who will properly secure and maintain the network; otherwise, their tokens lose value. This incentive structure has so far proven to be a worthy competitor to Bitcoin’s, and most new network launches (Cosmos, Solana, Polkadot, and others) have utilized a form of Proof of Stake. The Ethereum network, the original Turing-complete decentralized network, is planning to shift from Proof of Work to Proof of Stake later this year (2022).
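The selection step can be sketched as a stake-weighted lottery: the more tokens a Validator has staked (including delegations), the more often it is chosen to propose the next block. A simplified Python illustration; the validator names are hypothetical, and real protocols derive randomness from verifiable random functions or a shared beacon, not `random`:

```python
import random

def select_proposer(stakes: dict[str, int], seed: int) -> str:
    """Choose the next block proposer with probability
    proportional to each validator's staked tokens."""
    rng = random.Random(seed)  # stand-in for the protocol's randomness source
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights)[0]

stakes = {"validator-a": 900_000, "validator-b": 90_000, "validator-c": 10_000}
proposer = select_proposer(stakes, seed=42)
# Over many rounds, validator-a is chosen roughly 90% of the time.
```

This is why staked tokens substitute for burned electricity: winning blocks often requires locking up capital on the network, and a Validator who misbehaves puts that capital’s value at risk.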

There are two major concerns with Proof of Stake. The first is that, compared to the Bitcoin network, Proof of Stake is a newcomer and not as established or tested. Second, there are questions about the equitability of Proof of Stake. Since it is purely token-based, most of the token supply tends to consolidate with a few wealthy token holders on a few wealthy Validators. This potentially increases the likelihood of collusion and network manipulation. Proof of Work is seen as more equitable since any network computer could mine Bitcoin without holding any Bitcoin to begin with; however, the network has become so large that a single computer, or even a small data center of computers, is unlikely to be able to mine Bitcoin consistently, so it too has tended toward centralization among the large “miners” who can afford the expensive equipment needed.

Consortium

Consortium consensus has been favored among enterprises, like IBM and Walmart, attempting to experiment with blockchains. Consortium consensus grants privileges to known members to participate in consensus but closes the network to the public. As with our Walmart example (see the Trade section below): who determines the identity of the participants? Who decides exchange rates or data control? What happens if one of the participants is compromised? Participants in these networks must have a level of trust with each other outside of the network itself, severely limiting their use cases outside of very specific examples.

Proof of Authority

Proof of Authority, or social consensus, is an alternative consensus mechanism that requires neither the computing power of Proof of Work nor the expense of holding tokens for Proof of Stake. Proof of Authority grants members the ability to elect (or demote) other users on the network through voting. It is similar to Consortium consensus in the sense that participants must build some sort of trust with other participants to gain access, but there are controls in place to remove underperforming or malicious participants. The trust is distributed across the participants, enabling the network to be public and open. The major flaw in Proof of Authority is the general politicking associated with a decentralized group of participants attempting to make decisions about changes to the network or the admission of new members. The network Factom, one of the earliest decentralized networks to launch after Bitcoin, has faced these challenges, which have crippled the network to the extent that it is shifting to Proof of Stake later this year (2022).
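The elect-or-demote step reduces to a vote tally among sitting authorities. A hypothetical sketch; the two-thirds quorum and one-authority-one-vote rule here are illustrative assumptions, as each Proof of Authority network defines its own governance rules:

```python
def tally(votes: dict[str, bool], quorum: float = 2 / 3) -> bool:
    """Admit (or demote) a candidate if at least `quorum` of the
    current authorities vote in favor. One authority, one vote."""
    if not votes:
        return False
    return sum(votes.values()) / len(votes) >= quorum

# Three of four sitting authorities approve a new member:
approved = tally({"auth-1": True, "auth-2": True,
                  "auth-3": True, "auth-4": False})
```

The simplicity is the point, and also the weakness: every admission or demotion is a political event among the authorities, which is the politicking problem described above.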

Recommendations

1. Commission a peer-reviewed study of the energy usage and growth of the Bitcoin network. Calculations and predictions thus far have been incorrect.

2. Evaluate and enact policies that promote innovation and development utilizing open and decentralized network protocols.

Use Cases

Finance

Decentralized finance, or DeFi, is the most clear-cut use case for Decentralized Computing today. We won’t focus on it too much, as it is written about elsewhere by more qualified individuals, but suffice it to say that, as noted above, DeFi enables a democratization of finance, allowing users to engage in a new world of value creation and exchange. Needless to say, this area of disruption is rife with optimism as well as fraud, and until more specific regulation is applied in the United States, this “wild west of digital gold” will continue.

Trade

Supply chain is often the immediate answer when someone asks, “What else can blockchain do other than cryptocurrencies?” This is due to the above-noted misunderstanding that “blockchain” is just an immutable database. Often cited is Walmart utilizing blockchain to track its mango and pork supply chains (in a few specific geographic areas). While better traceability in a supply chain, helping to prevent disease outbreaks and sustainably source goods, is indeed a noble cause, all of this could have been done (and probably should already have been done for most of our food supply chain) with traditional web and digital technologies. The speed at which Walmart was able to audit its supply chain also has little to do with blockchain and much to do with the digitization of that supply chain. Walmart’s “blockchain” is just supply chain software that Walmart controls and requires its suppliers to record to. There is no real consensus mechanism, there is no real value creation or exchange on the platform, and it is a closed architecture, meaning only federated users can participate. If Walmart changed data on the platform, who would know?

We do believe that trade and supply chain constitute a powerful use case for Decentralized Computing, but not for the reasons Walmart is using it. Trade is often a complex operation, requiring stakeholders across the spectrum from governments to suppliers. Significant value is generated as goods move through a supply chain, and this value is transacted across many distributed parties and stakeholders. As supply chains become more digital, a significant need is data standardization across the supply chain. This is easy to do in a simple supply chain within the borders of one country. However, in a globalized world, goods transit through many different countries, each with its own regulations and varying levels of trust in the others. Decentralized Computing offers a unique opportunity for countries to agree on a common data standard, enable transparency across supply chains, and facilitate a more efficient flow of goods across borders without requiring trust among the members outside the network.

Goods flowing across borders encounter a wide range of regulations, paperwork, taxes, and inspections. An open protocol would enable the standardization of data across borders while enabling participating countries to retain their own individual regulations, digital paperwork, and tax revenue. This technology could be combined into a single system where data, goods, and money flow simultaneously as a product traverses its supply chain. This would greatly reduce trade barriers and overhead while providing increased transparency of the supply chain and ensuring compliance.

Internal trade is also a potential use case. In many industries, particularly healthcare, supply chains are managed by third parties who introduce unnecessary overhead and complex contract agreements between manufacturers and end users, like hospitals. This also creates a large information barrier: manufacturers are unaware of the long-term quality of their product and how often it is used, and they receive only delayed demand information, making it difficult to build a dynamic supply chain. In many cases, manufacturers are forced to purchase old data from purchasing agencies like GPOs just to better understand their own equipment. The reverse is true as well: hospitals are locked into complex contract agreements with GPOs, providing very little price transparency or resiliency. A trusted, decentralized network of manufacturers and end users interacting directly with each other would introduce additional transparency and information flow between stakeholders, creating a more resilient and cost-effective supply chain. Recent regulatory changes, including the 21st Century Cures Act Final Rule, created a data standard for health information that could be leveraged to provide even better quality assessment and demand data to manufacturers.

Defense

The US Department of Defense (DoD) is an early adopter of “blockchain” for supply chain, and unlike Walmart, it has some good reasons. First, traceability is essential for many parts and products in the DoD supply chain. Certain parts are incredibly valuable and could mean life or death, or even impact national security; as such, a heavy-handed immutable database tracking these parts can find use in such an environment. For example, the highly successful SUBSAFE program already uses an append-only paper trail for ship-safety repairs on submarines. The use of an immutable database in this instance could dramatically improve a paper-heavy and cumbersome process while still preserving ship safety. Again, these use cases are limited in nature and don’t really address a key problem in vital supply chains, namely data entry (even in our SUBSAFE example). Non-natively digital information or value, whether a collectible, a Rolex, or a seawater valve, when entered onto a blockchain, will always depend to an extent on the person entering the information. A blockchain, although immutable, can still record an error (and then preserve that error for eternity). This again highlights the fact that much of our supply chain trouble is a digitization issue, not an audit trail issue.

However, there are ways we can work to reduce the chance of error and leverage emerging digitization technology to better ensure proper data entry. In a current project with the United States Navy, we’re building a blood product tracking tool for sharing blood product information across a variety of parties such as a blood bank, a hospital, and a ship. We’ve utilized barcodes and RFID to automate data entry, partially solving this problem while also integrating another key use case, IoT. As the DoD continues to experiment with and test Decentralized Computing, we believe two more key use cases will emerge:

1. Digital Chain of Custody: As the DoD continues its digitization efforts, much of the data created will be (and already is) of the highest importance to national security, from directives, to planning, to weapons system software. Securing this data in a way that ensures it has not been tampered with is a key area of national security importance. Especially in the case of mission-critical software, which is already natively digital, Decentralized Computing can be a powerful tool to prevent malicious or even unintentional errors.

2. Decentralized and Secure Communications: Warfare is becoming increasingly informatized and cyber-based. Conflicts arising in the coming years will be battles of systems and systems confrontation. Digital information flowing from soldiers on the ground, drones in the air, weather, intelligence, and more will be fed into a fully integrated combat, communications, and sensor network from which topline decision-makers will be able to view the entire battlefield and make key decisions. Disrupting these networks will require coordinated attacks on key nodes to disrupt or destroy the system. Traditional digital and web architecture creates single points of failure that, if exploited, would cripple military systems and impact national security. Decentralized Computing could dramatically improve the robustness of these systems. A decentralized network of nodes can increase the volume of information transfer by utilizing technologies like BitTorrent, increase security by breaking information apart into smaller chunks, and create a robust data transfer network in which the destruction of one node has little impact on the other nodes or the operation of the system. A sophisticated consensus mechanism would also be able to spot potential bad actors attempting to participate in consensus, mitigating man-in-the-middle and Sybil attacks, something much harder to do in traditional cybersecurity.
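The chunking idea above can be sketched simply: split a payload into pieces, publish each piece’s hash in a manifest, and let a receiving node verify any chunk independently of the node that delivered it. A minimal Python illustration; the chunk size and manifest format are hypothetical (BitTorrent itself records piece hashes in a .torrent metainfo file):

```python
import hashlib

CHUNK_SIZE = 1024  # bytes; illustrative only

def make_manifest(payload: bytes) -> list[str]:
    """Split a payload into fixed-size chunks and hash each one,
    so chunks can travel via different nodes yet still be verified."""
    chunks = [payload[i:i + CHUNK_SIZE]
              for i in range(0, len(payload), CHUNK_SIZE)]
    return [hashlib.sha256(c).hexdigest() for c in chunks]

def verify_chunk(chunk: bytes, manifest: list[str], index: int) -> bool:
    """A receiver checks a chunk against the trusted manifest,
    regardless of which peer delivered it."""
    return hashlib.sha256(chunk).hexdigest() == manifest[index]
```

Because trust rests in the manifest rather than in any particular node, losing or compromising a single relay node neither halts the transfer nor lets tampered chunks through.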

Internet of Things (IoT)

Every day a new device becomes IoT-enabled, from your microwave to your grill. This growth will certainly continue, and these devices will gather more and more information from our everyday lives and, beyond that, from industry. As IoT becomes more ubiquitous, the control IoT devices have over our lives, and the data they generate, will be very important, valuable, and vulnerable. Significant debate will continue to occur around who owns IoT data and what it can be used for. Possibly even more important is what the device can do. If it’s just monitoring your grill temperature, you may be less concerned about privacy or data ownership. If it’s a heart monitor, a baby monitor, or a control actuator on an oil pipeline, that device’s connectivity presents a very clear vulnerability that, if exploited, or if the device itself fails, could result in serious consequences.

In a data privacy use case, with a decentralized IoT device that does not depend on a third party’s centralized server, the user can be assured that information generated by the device is owned by that person alone. Additionally, compromising a large number of decentralized IoT devices would require exploiting each device individually, something extremely difficult, if not impossible, to do, whereas in the centralized case an attacker need only exploit a single server.

Centralized IoT devices are also limited by the companies that produce them: they are interoperable only with others in their product line and dependent on the longevity of their maker to maintain functionality. What happens to a valve actuator on a pipeline when the maker of that connected device goes bankrupt or is acquired, or when the cloud server supporting that device goes down? The centralized IoT ecosystem as it exists today is highly vulnerable to the viability of the companies making the devices. That is potentially fine for a light bulb or a grill, but disastrous if a pipeline goes down or your pacemaker needs an update.

IoT devices served by a decentralized network can avoid the issues that central servers create. Even if a company ceases operation, the network itself will remain functional, allowing the device to continue operating. The data standard of the decentralized network, combined with current efforts to link data across different networks, also presents an opportunity for increased IoT device interoperability. Devices will be able to speak to each other across product lines, creating new use cases. Your grill could talk to your doorbell, letting you know friends have arrived for the barbecue. In a more serious use case, weather monitoring could be connected to electrical grids and oil pipelines, providing information directly to control systems to enable proactive protection of equipment.

MachineFi is the terminology being introduced to describe the intersection of industrial IoT and finance. The open protocol IoTeX, a leader in the space, describes MachineFi as a new paradigm fueled by Web3 that underpins the new machine economy, whereby machine resources and intelligence can be financialized to deliver value and ownership to the people, not centralized corporations. Though somewhat hyperbolic, the message is clear: IoT devices powered by Decentralized Computing have the potential not only to improve the security and longevity of the device but also to empower the users themselves.

Energy

Energy markets are increasingly distributed as ‘micropower’ sources such as wind turbines and solar panels proliferate. Battery backups to these renewable energy sources are also in high demand and are often traded not as storage but as ancillary power. The variability of renewable sources is difficult to predict, so much so that their output often trades at a negative price. Current energy trading systems were designed around the large power sources of the past and are not dynamic enough to predict or react to changes in weather or energy production. Smaller power operators have an opportunity to sell their power to the grid more directly as digital technologies now manage the flow of energy. Additionally, grids could have more flexibility to directly compensate for excess energy storage and backups, a must-have for extreme weather events and surge times, ensuring power is available when needed. As a whole, energy markets could be another prime area where Decentralized Computing could have a large impact, utilizing a combination of the use cases discussed above. Cross-border energy trade enabled by a common data standard on an open network, robust grid and weather monitoring through widespread IoT devices, and MachineFi tools to power dynamic energy trading and real-time settlement could usher in the next generation of reliable and secure clean energy production.

A thought on Uncensorability

A clear but often overlooked value proposition of Decentralized Computing is uncensorability: transactions on a public network, like Bitcoin, cannot be stopped by any third party. This is a particularly powerful empowerment technology that could enable victims of oppressive governments, or those unable to access banking, to participate in an economy otherwise closed to them. But uncensorability has its dark side: horrible things could be empowered by this technology, as child pornography or payments to terrorists could be funded and transmitted through these networks. As public, open, decentralized, and uncensorable networks grow, special attention and regulation are needed to ensure that the worst of humanity is not enabled along with the best.

Conclusion

Let’s cut through the hype: tokens like Bitcoin, and even Dogecoin and Bored Ape NFTs, have some value, but what exactly is that value? As our lives have become more digital, as our machines and data have become more digital, as software has eaten the world, we must understand that the value associated with our lives, our machines, and our data is also becoming digital. There is something here beyond mere speculation. This should be seen as a natural progression: money flowing into the digital value ecosystem in search of what’s next. The question before us is whether to allow the natural progression of value digitization by enabling this new economy to be built on free and open standards, or to attempt to heavily regulate it to prevent what is seen as excess in the world of digital tokens (like Dogecoin). A few things are clear, though: the robustness and uncensorability of the open protocols make them very difficult to shut down, and the alternatives, private and consortium networks, have been unable to produce a product that can match the potential of the free and open digital value economy.