Blockchain Learning, Proof of Stake, Uncategorized

Why We’re Building Casimir

Note: This started out as an internal document, reconfirming our priorities and values as we watched the First Contagion, the collapse of Luna, in early 2022. Today, we’re watching the Second Contagion and the collapse of one of the over-leveraged wildcat banks we allude to below. Although we didn’t predict the collapse of FTX, nor are we trying to predict what’s next, we continue to operate by our principles-first approach. That approach means working on solutions for our core principles of self-custody and peer-to-peer transactions in the Web3 space. It’s why we have never worked with or used FTX, and why our current roadmap remains unchanged. What the FTX collapse has done, though, is affirm our approach and hasten our desire to fix the UI/UX problems that create a barrier to self-custody for many users. We hope the following gives insight into why we’re taking the approach we are, and we hope you’ll join us on our journey to making Web3 better, easier, and safer for all.

It’s always a good idea to revisit your priorities and values, especially in this time of uncertainty in blockchain, cryptocurrency, Web3, and technology at large. We’ve written before about the value proposition of blockchain and why we’re building technology as close to the consensus layer as possible. Fundamentally, the consensus mechanism is what powers a new medium: the exchange of value without the need for a third party, peer-to-peer transactions. It’s clear that many of the current issues in the Web3 space are due to egregious speculative activity and attempted takeovers by centralized entities acting far away from the consensus layer.

We’ll briefly touch on some recent issues in Web3, the major reasons we think they occurred, and why we’re building Casimir to help fix this.

Bridge Attacks – In February 2022, the DeFi platform Wormhole was exploited for $325 million. Wormhole was a popular, VC-backed blockchain bridge designed to let users access tokens across chains through a single access point. More recently, the Binance Smart Chain was exploited for over $100M. While bridges are a potentially convenient answer to the mass of protocols in existence, a single smart contract or hot wallet holding $100M+ in deposited tokens is proving too attractive a target for hackers. So far in 2022, over $2B worth of tokens has been stolen from bridges!

Decentralized in Name Only – The first warning bell of the impending 2022 cryptocurrency sell-off was the collapse of Terra. There are a range of reasons why Terra collapsed, but put simply, algorithmic stablecoins backed by digital assets face fundamental challenges due to the volatility of those assets. This early breakdown from Staking Rewards names an overreliance on the yield platform Anchor, combined with significant off-chain usage on exchanges, as a driving factor in the collapse. Those externalities, controlled by central entities, effectively subverted the consensus mechanism of the project by operating off-chain, where over-leveraged risk could not be observed. Additional issues were caused by a concentration of premined tokens in the hands of Terraform Labs, who essentially controlled protocol voting and overrode the desires of some in the community to reduce risk. A more recent postmortem in June 2022 showed that the liquidity issues and subsequent depegging of the UST stablecoin were caused by Terraform Labs themselves.

The Rise and Fall of CeDeFi – Next to fall, and still unwinding, is the “Centralized Decentralized Finance” (CeDeFi) company Celsius. Companies like Celsius and BlockFi drove huge growth in Web3 by offering high-interest yields on deposited tokens. They act as banks, but they do a poor job of disclosing the risks their depositors face, nor do they follow the same regulations as traditional banks. Celsius was exposed to Terra and potentially lost $500M there alone. More recently, revelations surfaced that Celsius executives cashed out just prior to the collapse and bankruptcy filing.

Last of the “(first) contagion” was the collapse of Three Arrows Capital. Ongoing investigations are looking at whether 3AC took large margin longs on cryptocurrencies through fraudulent activity and was subsequently liquidated over the months of pullbacks. Overall, it looks pretty bad for 3AC management, and they might be going to jail.

The unifying thread of these major collapses was the concentration of digital assets, and control over them, into single points of failure. Even worse, users themselves were in the dark, with little visibility into the behind-the-scenes actions of those companies. The latest round of speculative growth in Web3 was built, in short, on unsustainable, over-leveraged, unregulated wildcat banking, totally divorced from the core ideas of a decentralized currency. This mentality has unfortunately not changed since the beginning of the year, and more liquidity crises are not out of the question.

Unfortunately, all of these problems were intentionally created (not the fallout, of course); many players in the Web3 ecosystem today are attempting to rebuild traditional SaaS and fee-extraction business models by creating layers of complexity that separate users from the core Web3 value proposition: peer-to-peer transactions.

While the 2022 drawdown in Web3 did a lot to refocus the industry on its core principles, there are still growing centralization and regulatory concerns:

Ethereum Merge – Ethereum 2.0 staking is currently heavily concentrated among major cryptocurrency exchanges and the Lido pool. So far, just two centralized staking providers, Coinbase and Lido, have produced almost 50% of Ethereum blocks post-merge. Control of cryptocurrencies by “banks” (Coinbase, Kraken, BlockFi, FTX, etc.) presents a threat to the censorship-resistant features of the Ethereum blockchain. With control of the Ethereum blockchain while operating under U.S. regulatory policies, these entities must implement any and all controls required by law. What this means is that cryptocurrencies would effectively become fiat currencies – implemented by decree from the state.

If we are to avoid this scenario, we must help create a truly decentralized ecosystem where a few centralized entities can’t control the consensus mechanism of a Web3 protocol. We need native Web3 solutions – peer-to-peer, decentralized solutions and tools that empower users, not centralized market makers. We’re building Casimir to do just that.

Decentralization – Probably the most overused and watered-down word in the space is “decentralized.” Nearly everything in blockchain/Web3 is called decentralized, whether or not it actually is. The unfortunate reality is that many blockchains are decentralized in name only. A recent study by Trail of Bits for DARPA concludes that blockchains are fairly centralized. It reports that pooled mining gives Bitcoin a Nakamoto coefficient of 4, and that Proof of Stake protocols aren’t much better. I won’t get into criticism of the overall piece by Trail of Bits, particularly its misassociation of pools with protocol control for Bitcoin, but the Nakamoto coefficient for Proof of Stake is worth analyzing. Chris Remus of Chainflow has written extensively on staking decentralization and maintains a live Nakamoto coefficient tracker that predates the Trail of Bits report. The Nakamoto coefficient is a measure of decentralization: the minimum number of entities needed to control the consensus mechanism of a protocol. The lower the number, the less decentralized the protocol. At the time of this writing, some major protocols have very low Nakamoto coefficients; of note, Polygon is at 3.

The goal of Proof of Stake protocols should be the highest Nakamoto coefficient possible, which would make the protocol very difficult to manipulate, since doing so would require simultaneously compromising hundreds of nodes. For example, Cosmos has an active set of 150 validators around the world. Compromising all of them would be likely impossible; however, the Nakamoto coefficient of Cosmos is only 7, meaning that controlling the consensus mechanism of Cosmos would only take a compromise of the top 7 Cosmos validators. A tough job to be sure, but a lot easier than compromising all 150 active validators in the Cosmos ecosystem.
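To make the definition concrete, here is a minimal sketch of how a Nakamoto coefficient can be computed from a stake distribution. It assumes a one-third-of-stake control threshold (the point at which a Tendermint-style chain can be halted); the stake amounts below are purely illustrative, not real validator data.

```python
def nakamoto_coefficient(stakes, threshold=1 / 3):
    """Minimum number of validators whose combined stake exceeds the
    control threshold. Counts greedily from the largest stakers down."""
    total = sum(stakes)
    running = 0
    for count, stake in enumerate(sorted(stakes, reverse=True), start=1):
        running += stake
        if running > total * threshold:
            return count
    return len(stakes)

# Illustrative distribution: the two largest validators together
# already exceed one third of all staked tokens.
print(nakamoto_coefficient([20, 15, 12, 10, 10, 9, 8, 7, 5, 4]))  # → 2
```

Note how an evenly spread stake raises the coefficient: with ten equal validators, four are needed to pass the same threshold.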

What this means in practice is that the allocation of staked tokens should be spread across all validators as equally as possible, not continually concentrated in a few of the already heavily staked validators.

So why are the Nakamoto coefficients so low? Let’s talk about the user experience.

The Crypto Experience

User Experience

The Web3 user experience today… sucks. You’re forced to either leave significant returns on the table and surrender control of your assets to a major exchange, or endure the inconvenience of manually staking across multiple protocols, wallets, platforms, and websites. It’s harder to know what’s going on, and it becomes easier to get scammed through faulty or malicious smart contracts.

What Web3 Looks Like Today

The easiest way to manage multiple digital tokens and assets is through centralized exchanges like Coinbase, which leave a lot to be desired. You give up custody of your tokens, and if you’re staking, you’re missing out on potential rewards that Coinbase scoops up in the form of third-party fees. If you’re more adventurous, you may use multiple wallets and multiple staking websites. You get the benefits of self-custody, but you’re forced to manage the wide range of websites and wallets needed to interact with the various protocols. It becomes confusing to manage and monitor everything, and there aren’t any good solutions today that help you compile it all in one place.

What’s more, current Web3 non-custodial products, like MetaMask, fall far short of protecting users from scams or from interacting with bad smart contracts. Because cryptocurrencies are so difficult to interact with and understand, even seasoned pros get manipulated and hacked.

How MetaMask Responds to UI Criticism

Let’s look at how this poor user experience even affects the consensus mechanisms of PoS protocols. One of the easiest ways to stake in the Cosmos ecosystem is Keplr, a mobile/web wallet that allows you to stake to any of the Tendermint-based protocols. However, users trying to stake with Keplr aren’t given much to work with.

The Staking Page for Cosmos

A new staker has no way of deciding whom to stake to. There is no easy way to determine whether a listed validator is reliable or participates in the governance of a protocol. Users have no real reason to choose a validator outside the top ten, because there are no tools to sort and research each individual validator. So people end up picking validators from the top of the list due to the appearance of quality. We can see this effect in the Nakamoto coefficient of Cosmos today, which is 7. What’s more, two of the top five validators for Cosmos are cryptocurrency exchanges, giving exchanges an outsized impact on the consensus mechanisms of Proof of Stake protocols.

So, we’re left where we started. Exchanges offer the best user experience and are gaining control over Proof of Stake protocols. Since exchanges are likely to be regulated more like banks in the future, we are looking at a future where Proof of Stake is controlled by banks. What this means is that they control consensus. They can censor accounts, users, or transactions that they don’t like or are told to by the government. That’s a fundamental threat to the idea of decentralization and Web3 as a whole – an uncensorable digital currency.

Our conclusion is that poor user experience is driving centralization and will continue to lead to major single points of failure like Celsius unless we create tools that allow users to take full advantage of the protocols they use.

How we’re building Casimir

First, we reexamined how Web3 is being built today. It’s often been stated that Web3 is “going to be just like the internet”. It’s certainly true that there may be some parallels in growth trajectory and societal impact; however, for many projects in the space today, “just like the internet” means being built on today’s internet: AWS/Google Cloud, numerous HTML websites, and centralized SaaS powerhouses. With Casimir, we want to break the paradigm of today’s Web3 and reexamine how users interact with and use blockchains, digital value, and Web3 overall.

We are getting off the Web 2.0 rails and building something new, a native Web3 experience that prioritizes decentralization, user experience, and user control. We’re building the first true Web3 portal, capable of integrating with any wallet, any blockchain, and any token, allowing users to easily navigate Web3 and interact with the protocols directly, not through a centralized exchange or a variety of unconnected websites.

How We’re Designing Casimir

Improving the User Experience through Decentralization

We’re starting bottom up. Unlike current UIs, designed with traditional Web2 architectures, we’re starting at the consensus and infrastructure layers of Web3. These layers of decentralized node infrastructure providers hold fully indexed blockchain databases, provide APIs for querying, run a network of worldwide decentralized nodes for consistent uptime, and build blocks of transactions as they are added to the blockchain. Today, most users are forced through third parties to access blockchains, which introduces extra costs for transactions and token management. By accessing these nodes directly, users are assured of uptime, uncensorable and low-cost transactions, and minimized fees otherwise taken by third-party intermediaries. Also, with the right tools, users can access on-chain analytics and other information that these nodes carry. This information can protect users by providing transparency into the entities they’re interacting with, as well as information about smart contracts and other on-chain data. Today there simply aren’t good enough tools to make on-chain information available and usable to the everyday user.
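As a small illustration of what “accessing nodes directly” looks like in practice, the sketch below builds a standard Ethereum JSON-RPC request for `eth_getBalance` and decodes the hex-encoded wei result the node would return. The helper names are our own illustrative ones, not part of any Casimir API, and the node endpoint is assumed to be whatever infrastructure you run or trust.

```python
import json

def rpc_payload(method, params, request_id=1):
    """Build a JSON-RPC 2.0 request body for an Ethereum node.
    POST this to your node's HTTP endpoint to query the chain directly."""
    return json.dumps({"jsonrpc": "2.0", "id": request_id,
                       "method": method, "params": params})

def wei_hex_to_eth(result_hex):
    """Decode the hex wei balance returned by eth_getBalance into ETH."""
    return int(result_hex, 16) / 10**18

# An eth_getBalance request for an address at the latest block:
body = rpc_payload("eth_getBalance",
                   ["0x0000000000000000000000000000000000000000", "latest"])
# A node returning "0xde0b6b3a7640000" is reporting exactly 1 ETH:
print(wei_hex_to_eth("0xde0b6b3a7640000"))  # → 1.0
```

The point isn’t that every user should write this code; it’s that the data is sitting there on the node, waiting for better tools to surface it.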

There are 3 key areas we’re focusing on as we design Casimir: Usability, Security, and Transparency.

Usability: Similar to a Mint or Personal Capital, Casimir will be a place where users can aggregate their digital currencies and assets, giving them an easy way to manage what they hold across the various protocols they use. Many Web3 users have multiple wallets and assets from a variety of protocols, so a single location to view and manage those assets is much needed, without that location becoming a single point of failure for any stakeholder. With our multi-chain approach and Multiwallet Connect, we can effectively be an interoperability solution without the bridge.

Casimir will do more than just a Mint, however. It will allow users to interact with their chosen protocols; access mints and airdrops; stake and manage their digital currencies across protocols beyond Ethereum; and use specialized tooling that helps protect them. We’ll build and continue to add features like these that help users use Web3.

Our business model isn’t built around trading or exchange fees. Unlike an exchange, we’re not front-running trades or building in hidden custodial fees. Our base product will always be free to use, and we’ll make money by offering a premium subscriber product as well as through our infrastructure services. We believe you’ll not only have a better user experience but actually save money as well.

Security: Unlike most centralized exchanges and custodians, we will never take custody of users’ wallets or tokens. This means we can leverage existing on-chain security to protect users at a much higher level. It also means we will never face liquidity problems, nor will we ever trade a user’s tokens on the backend. Although Casimir will be a single site, it won’t be a single point of failure. The code is open source, and we will never take custody of users’ digital tokens or NFTs. If Casimir goes away tomorrow, no funds will disappear, and users will still have access to all of their tokens.

Unlike traditional Web2, we’re not building around user account management, user analytics, and productizing the user. We’ll never ask for a user’s email address or build an internal profile, because doing so not only creates a security vulnerability for our users, it’s also unnecessary. Our users will always be able to log in through their wallet, which means they will always control their login credentials.

As part of our usability effort, we’re building a smart contract analyzer so users can know what an interaction with a smart contract will *actually* do, monitor the contracts they’ve granted permissions to, and revoke permissions on old contracts. Because we work at the protocol level, we can provide users with real-time information and on-chain analytics to help them make the best decisions with their digital assets.
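As one example of the kind of check such an analyzer can perform, here is a minimal sketch (our own illustrative code, not the analyzer itself) that decodes the calldata of the standard ERC-20 `approve(address,uint256)` call, flagging the unlimited approvals that approval-draining scams rely on:

```python
# keccak256("approve(address,uint256)")[:4] -- the standard ERC-20 selector
APPROVE_SELECTOR = "095ea7b3"
MAX_UINT256 = 2**256 - 1

def decode_erc20_approve(calldata_hex):
    """Decode ERC-20 approve() calldata so a user can see what signing
    the transaction would actually grant. Returns None for other calls."""
    data = calldata_hex.removeprefix("0x")
    if not data.startswith(APPROVE_SELECTOR):
        return None
    spender = "0x" + data[8:72][-40:]   # last 20 bytes of the first arg word
    amount = int(data[72:136], 16)      # second arg word: the allowance
    return {"spender": spender,
            "amount": amount,
            "unlimited": amount == MAX_UINT256}
```

A wallet UI showing “grant 0xab… an unlimited allowance of this token” instead of an opaque hex blob is exactly the kind of transparency that keeps seasoned users from getting drained.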

Transparency: As the name indicates, every action on a public blockchain is publicly accessible. Every wallet, every transaction. This transparency is unique among financial systems, where the books of banks or governments are not available to everyday users. Today, many Web3 financial providers continue to hide behind proprietary systems, and their financial solvency is visible only to a select few. What’s worse, these companies (in the US at least) often exploit regulatory loopholes to avoid the audit requirements banks face.

In a world where Bitcoin was launched in the face of a banking crisis, with a desire to bring about a new and transparent financial system to the world, the actions of many major players in the space today are in direct opposition to the values of Web3.

Casimir will help change this. We leverage our fully indexed chains to provide transparency analytics to our users. While block explorers and address balances are always available to those who know how and where to look, we’re making it easier for users to interact with and use Web3 indexed information. We’ll allow users to easily sort data, see large and identified wallets, track large transactions, and match wallets with organizations so that proof of reserves can be ensured.

We’re here to create a better Web3 user experience. For us, that means enabling users to better use the decentralized capabilities of the space, not to trade better in the crypto casino. We’ve got a long way ahead of us, both to build something better and to help users learn the importance of self-custody and what decentralization truly means.

Over the next few months we’ll present some of the specific technology developments we’re working on to help achieve our goals, including non-custodial Ethereum staking, cross-chain wallet integrations, and cross-chain single sign-on. You can follow our progress on GitHub and join us on Discord. We’re striving to create an open ecosystem that empowers the user, and we hope you’ll join us.

Blockchain Learning, IoT, Proof of Stake

Our Response to NIST

The National Institute of Standards and Technology requested information for a congressional study on emerging marketplace trends in a variety of areas, including blockchain and IoT. Read our response below as we discuss applications of Decentralized Computing, how consensus works, and what’s next.


This paper contends the following: blockchain as a ledger or immutable database alone is a limited technology with no real use cases of its own. The true value of the technology lies in the consensus mechanisms, which enable the real value proposition: digital value and distributed peer-to-peer trust. This paper outlines why consensus is the true innovation in the blockchain space, some methods of consensus, and applied use cases. The applied use cases discussed are digital tokens; defense applications and distributed trust; and IoT and distributed data exchange in industry, including energy. It is important that this reviewing body consider the implications of such a technology in order to create smart policy. Lastly, we encourage regulatory frameworks in the space to focus on enabling technology growth on public, decentralized networks where entrepreneurs can be free to build and innovate.


We believe the significance of “blockchain technology” is the creation of new value models, consensus mechanisms, and tokenization itself, not the “blockchain” data structure. Blockchain technology is a broad, oft-misused term that people use to describe many things, frequently just an immutable database. An immutable database, while having limited use cases in regulatory and legal settings, is not a revolutionary technology in and of itself. While it’s important to look past the current sensational, speculative hype surrounding cryptocurrencies and their derivatives like NFTs, we must also look past enterprise efforts that treat blockchain as a limited database technology. Cutting through the confusion, we will look toward the core capabilities of the technology and where they will lead: a future of digital value exchange.

The true innovation and revolutionary technology are the consensus mechanisms of these Decentralized Computing platforms powering cryptocurrencies. Similar to the computer enabling machines to create and store data or the internet allowing machines to exchange data, Decentralized Computing enables machines to create, transact, and store value. This mechanism enables trust between independent users around the globe, who do not need to know each other or even utilize a third party to moderate their interaction. Value exchange has been democratized for the first time in human history, enabling all users to control their own credit, without reliance on a third party like a bank, government, or temple to set exchange rates and mediate value exchange.

This is a fundamental change in the way we will interact with currency and value in our lives. In much the same way past revolutions displaced the main players in the industries they disrupted (the Industrial and the Digital/Internet revolutions), so too will Decentralized Computing continue to disrupt established banking and financial institutions, and even beyond them, the centralized enterprises who currently control information flow on the internet.

We believe very clearly that the future of “blockchain,” or Decentralized Computing, will be built on open, decentralized public protocols like Bitcoin and Ethereum, for the reasons outlined in this paper. It is vitally important that innovation, use cases, and regulation in this space are designed with this in mind. In short, we believe public, open, decentralized computing networks utilizing Proof of Work, Proof of Stake, or another yet-to-be-designed mechanism will win out due to their unique incentive structures, which draw in a wide range of users. Combined with their open architecture, which allows anyone to build, public networks will create a myriad of use cases even beyond those contained below. It is important that builders on these free and open systems be enabled to innovate and create in order to maximize the potential of decentralized computing technology.

Overview of Consensus

Briefly, a consensus mechanism is designed to enable networked computers to agree on (1) a set of data, (2) modifications to or computations with that data, and (3) the rules that govern that data’s storage and computation. It’s consensus that allows a cryptocurrency like Bitcoin to exist, not “blockchain technology.” For example, in the Bitcoin network, a set of rules governs the inflation and maximum amount of Bitcoin that can exist (21 million). These rules are agreed to by network participants, and over time, trust builds that Bitcoin will continue to conform to the rules of the network. It is this part of the distributed consensus mechanism that allows Bitcoin to have and retain value. Consider the US dollar: there is no limit on the number of dollars that can exist; however, the strength of the US economy, military, and government creates a trusted environment where the dollar can have value, even though it is a ‘fiat’ currency. Because Bitcoin doesn’t have a central government or physical assets, trust and value must be built and maintained through the consensus mechanism. To this end, Bitcoin created a unique incentive structure as part of its Proof of Work consensus (explained later) that makes participants in the network want to conform to and maintain the rules so that value can grow.
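The 21 million cap mentioned above isn’t stored anywhere as a constant; it emerges from the block subsidy schedule, which halves every 210,000 blocks. A short sketch shows the arithmetic, using integer satoshis as the actual protocol does:

```python
HALVING_INTERVAL = 210_000       # blocks between subsidy halvings
SATOSHIS_PER_BTC = 100_000_000   # Bitcoin's smallest unit

def total_btc_supply():
    """Sum every block subsidy that will ever be paid. Integer halving
    on satoshis eventually drives the subsidy to zero, capping supply."""
    subsidy = 50 * SATOSHIS_PER_BTC  # initial reward: 50 BTC per block
    total = 0
    while subsidy > 0:
        total += HALVING_INTERVAL * subsidy
        subsidy //= 2
    return total / SATOSHIS_PER_BTC

print(total_btc_supply())  # just under 21,000,000
```

The cap holds only because participants keep enforcing this rule; that enforcement, not the arithmetic, is what the consensus mechanism provides.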

This network security is also what allows an open and permissionless network. Anyone can build on these networks without fear that any single participant could destroy them, and this is partially what makes the technology so powerful. When that openness is removed and the network becomes a Consortium consensus model or a private one, the ability to create is limited and becomes mediated through the controlling enterprises on the network.

There are several Consensus Mechanisms in use today: Proof of Work, Proof of Stake, Consortium, and Proof of Authority. These mechanisms provide the backbone for Decentralized Computing networks and we’ll see that the open mechanisms, Proof of Work and Stake, do far more than simply act as a “blockchain ledger.”

Proof of Work

Proof of Work is the original consensus mechanism designed by the pseudonymous Satoshi Nakamoto. It is quite simple: network participants use their computers to solve a math problem (or algorithm); the first computer to solve the problem is rewarded in Bitcoin and earns the ability to process a number of transactions, collecting those fees as well. This has been occurring essentially non-stop since the launch of Bitcoin in 2009. The mechanism is designed so that the problem’s difficulty changes as computers are added to the network, keeping a Bitcoin block (and reward) mined every 10 minutes on average. The reward from Proof of Work makes for a powerful incentive. Participating computers, especially today as the amount of computing power on the network has grown exponentially, spend quite a bit of money to purchase, maintain, and run their equipment. They must ensure the rules of the network are maintained so that they can continue to earn money from transaction fees and block rewards. Someone looking to disrupt the network would need to deploy an incredibly expensive amount of computing power to manipulate blocks or transactions, which is not worth it for any individual or company. This methodology has proven very successful, though there are two major concerns. The first is the amount of electricity required to maintain the network. A full answer is outside the scope of this paper; however, predictions of future energy usage have thus far been quite incorrect, and there is no indication that Bitcoin is generating demand for new energy sources, only utilizing excess energy in specific areas. The second concern is the ability of any new protocol to launch its own Proof of Work network. Since there is so much computing power in the cryptocurrency space today, it is easy for an attacker to disrupt and destroy smaller Proof of Work networks.
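The “difficulty changes as computers are added” step can be sketched concretely. This is a simplified model of Bitcoin’s retargeting rule, which every 2016 blocks scales difficulty by how fast the previous interval was mined, clamped to a 4x change per adjustment:

```python
TARGET_BLOCK_TIME = 600    # seconds: one block every 10 minutes
RETARGET_INTERVAL = 2016   # Bitcoin adjusts difficulty every 2016 blocks

def adjust_difficulty(old_difficulty, actual_timespan_seconds):
    """Simplified Bitcoin-style retarget: if the last 2016 blocks came
    faster than the expected two weeks, raise difficulty proportionally
    (and vice versa), clamped to a 4x change per adjustment."""
    expected = TARGET_BLOCK_TIME * RETARGET_INTERVAL
    clamped = min(max(actual_timespan_seconds, expected / 4), expected * 4)
    return old_difficulty * expected / clamped

# Blocks mined twice as fast as intended -> difficulty doubles:
print(adjust_difficulty(1.0, (TARGET_BLOCK_TIME * RETARGET_INTERVAL) / 2))  # → 2.0
```

This feedback loop is why adding hash power doesn’t speed up issuance; it only raises the cost of attacking the network.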

Proof of Stake

There are two other key limitations of Bitcoin that drove innovation in new methodologies of consensus, resulting in Proof of Stake. The first is that Bitcoin is stateless, or non-Turing-complete. This means it can’t do much other than manage the transfer and storage of Bitcoin. There isn’t a problem with this, per se, but innovators were interested in building an “internet computer”: a decentralized computer with built-in payment rails that would enable users to create a new generation of computer programs and new types of digital assets, all natively decentralized and uncensorable (streaming platforms, social media, music, art, etc.). This, in part, required a new methodology to overcome the second limitation of Bitcoin, the speed of the network (Bitcoin blocks only come once every 10 minutes), while still maintaining the overall security of a public network. Proof of Stake emerged as the most likely solution and works as follows: instead of rewarding computers based on the computing power they are utilizing, Proof of Stake rewards participants for the number of network tokens they have “staked,” or deposited, on the network. Participants can choose from a variety of network computers, or Validators, to stake to, and they want to choose a Validator who will properly secure and maintain the network; otherwise, their tokens lose value. This incentive structure has so far proven to be a worthy competitor to Bitcoin, and most new network launches (Cosmos, Solana, Polkadot, and others) have utilized a form of Proof of Stake. The Ethereum network, the original Turing-complete decentralized network, is planning to shift from Proof of Work to Proof of Stake later this year (2022).
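The stake-weighted reward structure described above can be illustrated with a toy model. This is not any specific protocol’s exact algorithm, just the core idea that a validator’s chance of proposing the next block is proportional to the stake delegated to it:

```python
import random

def select_proposer(stakes, rng=random):
    """Choose the next block proposer with probability proportional to
    stake. `stakes` maps validator name -> staked tokens (illustrative)."""
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=1)[0]

# Over many rounds, a validator holding 60% of stake proposes ~60% of blocks.
tally = {"alice": 0, "bob": 0, "carol": 0}
rng = random.Random(42)  # seeded for reproducibility
for _ in range(10_000):
    winner = select_proposer({"alice": 60, "bob": 30, "carol": 10}, rng)
    tally[winner] += 1
print(tally)
```

The model also makes the centralization concern in the next paragraph visible: whoever accumulates stake accumulates proposal rights in direct proportion.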

There are two major concerns with Proof of Stake. The first is that, compared to the Bitcoin network, Proof of Stake is a newcomer and not as established or tested. Second, there are questions about the equitability of Proof of Stake. Since it is purely token-based, most of the token supply tends to consolidate with a few wealthy token holders on a few wealthy Validators. This potentially increases the likelihood of collusion and network manipulation. Proof of Work is seen as more equitable since any network computer can mine Bitcoin without holding any Bitcoin to begin with; however, the network has become so large that a single computer, or even a small data center of computers, is unlikely to mine Bitcoin consistently, so it too has tended toward centralization among the large “miners” who can afford the expensive equipment required.


Consortium

Consortium consensus has been favored by enterprises, like IBM and Walmart, experimenting with blockchains. Consortium consensus grants privileges to known members to participate in consensus but is closed to the public. As with our Walmart example later (see the Trade section below): who determines the identity of the participants? Who decides exchange rates or data control? What happens if one of the participants is compromised? Participants in these networks must have a level of trust with each other outside of the network itself, severely limiting their use cases outside of very specific examples.

Proof of Authority

Proof of Authority, or social consensus, is an alternative consensus mechanism that requires neither the computing power of Proof of Work nor the expense of holding tokens for Proof of Stake. Proof of Authority grants members the ability to elect (or demote) other users on the network through voting. It is similar to Consortium consensus in that participants must build some sort of trust with other participants to gain access, but there are controls in place to remove underperforming or malicious participants. The trust is distributed across the participants, enabling the network to be public and open. The major flaw in Proof of Authority is the general politicking associated with a decentralized group of participants attempting to make decisions about changes to the network or admitting new members. The network Factom, one of the earliest decentralized networks to launch after Bitcoin, has faced these challenges, crippling the network to the extent that it is shifting to Proof of Stake later this year (2022).


We recommend the following:

1. A peer-reviewed study of the energy usage and growth of the Bitcoin network be evaluated; calculations and predictions thus far have been incorrect.

2. Policies be evaluated and enacted to promote innovation and development utilizing open and decentralized network protocols.

Use Cases


DeFi

Decentralized finance, or DeFi, is the most clear-cut use case for Decentralized Computing today. We won't focus on it too much, as it is written about elsewhere by more qualified individuals, but suffice it to say that, as noted above, DeFi enables a democratization of finance, letting users engage in a new world of value creation and exchange. This area of disruption is ripe with optimism as well as fraud, and until more specific regulation is applied in the United States, this "wild west of digital gold" will continue.


Trade

Supply chain is often the immediate answer when someone asks, "What else can blockchain do other than cryptocurrencies?" This is due to the above-noted misunderstanding that "blockchain" is just an immutable database. Often cited is Walmart using blockchain to track its mango and pork supply chains (in a few specific geographic areas). Better traceability in a supply chain, to help prevent disease outbreaks and sustainably source goods, is indeed a noble cause, but all of it could have been done (and probably should already have been done for most of our food supply chain) with traditional web and digital technologies. The speed at which Walmart was able to audit its supply chain also has little to do with blockchain and everything to do with the digitization of that supply chain. Walmart's "blockchain" is just supply chain software that Walmart controls and forces its suppliers to record to. There is no real consensus mechanism, there is no real value creation or exchange on the platform, and it is a closed architecture, meaning only federated users can participate. If Walmart changed data on the platform, who would know?

We actually do believe that Trade and Supply Chain is a powerful use case for Decentralized Computing, just not for the reasons Walmart is using it. Trade is often a complex operation, requiring stakeholders across the spectrum from governments to suppliers. Significant value is generated as goods move through a supply chain, and that value is transacted across many distributed parties and stakeholders. As supply chains become more digital, data standardization across the supply chain becomes a significant need. This is easy in a simple supply chain within the borders of one country. In a globalized world, however, goods transit the globe through many different countries, each with its own regulations and varying trust in the others. Decentralized Computing offers a unique opportunity for countries to agree on a common data standard, enable transparency across supply chains, and facilitate a more efficient flow of goods across borders without requiring outside trust of the members.

Goods flowing across borders encounter a wide range of regulations, paperwork, taxes, and inspections. An open protocol would enable the standardization of data across borders while letting participating countries retain their own regulations, digital paperwork, and tax revenue. This technology could be combined into a single system where data, goods, and money flow simultaneously as a product traverses its supply chain, greatly reducing trade barriers and overhead while providing increased transparency into the supply chain and ensuring compliance.

Internal trade is also a potential use case. In many industries, particularly healthcare, supply chains are managed by third parties who introduce unnecessary overhead and complex contract agreements between manufacturers and end users, like hospitals. This also creates a large information barrier: manufacturers are unaware of the long-term quality of their product and how often it is used, and have only delayed demand information, making it difficult to build a dynamic supply chain. In many cases, manufacturers are forced to purchase old data from group purchasing organizations (GPOs) just to better understand their own equipment. The reverse is true as well: hospitals are locked into complex contract agreements with GPOs, providing very little price transparency or resiliency. A trusted, decentralized network of manufacturers and end users interacting directly with each other would improve transparency and information flow between stakeholders, creating a more resilient and cost-effective supply chain. Recent regulatory changes, including the 21st Century Cures Act Final Rule, created a data standard for health information that could be leveraged to provide even better quality assessment and demand data to manufacturers.


Defense

The US Department of Defense (DoD) is an early adopter of "blockchain" for supply chain, and unlike Walmart, it has some good reasons. First, traceability is essential for many parts and products in the DoD supply chain. Certain parts are incredibly valuable, can mean life or death, or can even impact national security, and, as such, a heavy-handed immutable database tracking those parts can find use in such an environment. For example, the highly successful SUBSAFE program already uses an amended-only paper trail for ship-safety repairs on submarines. An immutable database here could dramatically improve a paper-heavy and cumbersome process while still preserving ship safety. Again, though, these use cases are limited in nature and don't really address a key problem in vital supply chains, namely data entry (even in our SUBSAFE example). Non-natively digital information or value, whether a collectible, a Rolex, or a seawater valve, will always depend to some extent on the person entering the information onto the blockchain. Blockchain, although immutable, can still be error-prone (and then save that error for eternity). This again highlights that many of our supply chain issues are a digitization issue, not an audit trail issue.

However, there are ways to reduce the chance of error and leverage emerging digitization technology to better ensure proper data entry. In a current project with the United States Navy, we're building a blood product tracking tool for sharing blood product information across a variety of parties, such as a blood bank, a hospital, and a ship. There, we've utilized barcodes and RFID to automate data entry, partially solving this problem while integrating another key use case, IoT. As the DoD continues to experiment with and test Decentralized Computing, we believe two more key use cases will emerge:

1. Digital Chain of Custody: As the DoD continues its digitization efforts, much of the data created will be (and already is) of the highest importance to national security, from directives, to planning, to weapons system software. Securing this data in a way that ensures it has not been tampered with is a key area of national security importance. Especially for mission-critical software, which is already natively digital, Decentralized Computing can be a powerful tool to prevent malicious or even unintentional errors.

2. Decentralized and Secure Communications: Warfare is becoming increasingly informatized and cyber-based. Conflicts in the coming years will be battles of systems and systems confrontation. Digital information flowing from soldiers on the ground, drones in the air, weather, intelligence, and more will be fed into a fully integrated combat, comms, and sensor network from which topline decision-makers will be able to view the entire battlefield and make key decisions. Disrupting these networks will require coordinated attacks on key nodes to degrade or destroy the system. Traditional digital and web architecture creates single points of failure that, if exploited, could cripple military systems and impact national security. Decentralized Computing could dramatically improve the robustness of these systems. A decentralized network of nodes can not only increase the volume of information transfer by utilizing technologies like BitTorrent, but also increase security by breaking information into smaller chunks, creating a robust data transfer network where the destruction of one node has little impact on the other nodes or the operation of the system. A sophisticated consensus mechanism would also be able to detect bad actors attempting to participate in consensus, mitigating man-in-the-middle or Sybil attacks, something much harder to do in traditional cybersecurity.

Internet of Things (IoT)

Every day a new device becomes IoT enabled, from your microwave to your grill. This growth will continue, and these devices will gather more and more information from our everyday lives and beyond into industry. As IoT becomes more ubiquitous, the control IoT devices have over our lives, and the data they generate, will be very important, valuable, and vulnerable. Significant debate will continue around who owns IoT data and what it can be used for. Possibly even more important is what the device can do. If it's just monitoring your grill temperature, you may be less concerned about privacy or data ownership. If it's a heart monitor, a baby monitor, or a control actuator on an oil pipeline, that device's connectivity presents a very clear vulnerability: if the device is exploited, or simply fails, the consequences could be serious.

In a data privacy use case, with a decentralized IoT device that does not depend on a third party's centralized server, the user can be assured that the information generated by the device is owned by that user alone. Additionally, while compromising a centralized fleet of devices requires exploiting only a single server, compromising a large number of decentralized IoT devices requires exploiting each device individually, something extremely difficult, if not impossible, to do.

Centralized IoT devices are also limited by the companies that produce them, interoperable with only others in their product line, and dependent on the longevity of their maker to maintain functionality. What happens to a valve actuator on a pipeline when the maker of that connected device goes bankrupt, is acquired, or the cloud server supporting that device goes down? The IoT centralized ecosystem as it exists today is highly vulnerable to the viability of the companies making the device. Potentially fine for a light bulb or grill, but disastrous if a pipeline goes down or your pacemaker needs an update.

IoT devices being served by a decentralized network can avoid the issues that central servers create. Even in the case that a company ceases operation, the network itself will remain functional, allowing the device to continue operations. The data standard of the decentralized network, combined with current efforts today to link data across different networks also presents an opportunity for increased IoT device interoperability. Devices will be able to speak to each other across product lines, creating new use cases. Your grill could talk to your doorbell, letting you know friends have arrived for the barbecue. In a more serious use case, weather monitoring could be connected to electrical grids and oil pipelines, providing information directly to control systems to enable proactive protection of equipment.

MachineFi is the terminology being introduced to describe the intersection of Industrial IoT and finance. The open protocol IoTeX, a leader in the space, describes MachineFi as a new paradigm fueled by Web3 that underpins the new machine economy, whereby machine resources and intelligence can be financialized to deliver value and ownership to the people, not centralized corporations. Though somewhat hyperbolic, the message is clear: IoT devices powered by Decentralized Computing have the potential to impact not only the security and longevity of the device but also the empowerment of the users themselves.


Energy

Energy markets are increasingly distributed as 'micropower' sources such as wind turbines and solar panels proliferate. Battery backups for these renewable sources are also in high demand and are often traded not as storage but as ancillary power. The variability of renewable sources is difficult to predict, so much so that their output often trades at a negative price. Current energy trading was designed around the large power sources of the past and is not dynamic enough to predict or react to changes in weather or energy production. Smaller power operators have an opportunity to sell their power more directly to the grid as digital technologies now manage the flow of energy. Additionally, grids could have more flexibility to directly compensate for excess energy storage and backups, a must-have for extreme weather events and surge times, ensuring power is available when needed. As a whole, energy markets could be another prime area where Decentralized Computing has a large impact, combining the use cases discussed above: cross-border energy trade enabled by a common data standard on an open network, robust grid and weather monitoring through widespread IoT devices, and MachineFi tools powering dynamic energy trading and real-time settlement could usher in the next generation of reliable and secure clean energy production.

A thought on Uncensorability

A clear but often overlooked value proposition of Decentralized Computing is uncensorability: transactions on a public network, like Bitcoin, cannot be stopped by any third party. This is a particularly powerful empowering technology that could enable victims of oppressive governments, or those unable to access banking, to participate in an economy otherwise closed to them. But uncensorability has its dark side: horrible things could be empowered by this technology, and child pornography or payments to terrorists could be funded and transmitted through these networks. As public, open, decentralized, and uncensorable networks grow, special attention and regulation are needed to ensure that the worst of humanity is not enabled along with the best.


Let's cut through the hype. Tokens like Bitcoin, and even Dogecoin and Bored Ape NFTs, have some value, but what exactly is that value? As our lives have become more digital, as our machines and data have become more digital, as software has eaten the world, we must understand that the value associated with our lives, our machines, and our data is also becoming digital. There is something here beyond mere speculation, and it should be seen as a natural progression: money flowing into the digital value ecosystem in search of what's next. The question before us is whether to allow the natural progression of value digitization by building this new economy on free and open standards, or to attempt to heavily regulate it to prevent what is seen as excess in the world of digital tokens (like Dogecoin). A few things are clear, though: the robustness and uncensorability of the open protocols make them very difficult to shut down, and the alternatives, private and consortium networks, have been unable to produce a product that can match the potential of the free and open digital value economy.


An Inside Look at the Celo Kitchen

We're about a month into the Great Celo Stakeoff, and it is definitely the most dynamic testnet launch I've ever been a part of. I'm constantly checking vote totals and running election calculations. On Discord, groups are frequently asking for new validators to associate with and are chatting late into the night. And many of the validators who had a slow start (myself included) are already unable to meet the minimum threshold required to be elected as a validator, but we're all trying to get back in the game!

I'll be the first to admit I had only a basic understanding of how the Celo election process worked and nothing more than a basic uptime strategy coming into the testnet launch, which is probably why we were quickly out of the running. The top validator groups figured it out quickly and were able to take advantage! So, for the rest of us, here's how the Celo elections work, along with some strategies that may help you get back into the game if you're already out, or if you're thinking about running a validator on Celo in the future.

Celo uses an election algorithm called the D'Hondt Method. The Wikipedia page has a decent explanation, and I'll use it to demonstrate how the elections work for Celo. Celo validators currently have two places to stake votes: the group and/or the validator. For either to have a chance of being elected, the group and validator must each have at least 10,000 (10k) cGold staked. For each validator in a group, the group leader must lock an additional 10k in the group address (4 associated validators means 40k cGold locked).

From an election standpoint, the amount staked to a validator, as long as it's at least 10k cGold, doesn't really matter. What does matter is the total amount staked to the group and its validators. When an election occurs, Celo identifies the group with the highest total and elects a validator from that group first. It then applies the D'Hondt Method: the total of the top group is divided by two (for election calculations only), and the next highest total is found. If that first group still had the highest total, even after halving its stake, it would elect a second validator. If not, the next group with the highest total would elect one (and its effective stake would drop by half as well). This process continues until 100 validators are elected. Each time a group has another validator elected, its effective stake (for the election only) drops by an increasing factor: a group with 100k would go to 50k the first time it is elected; the second time, the original total would be divided by 3, to 33k; the third time, divided by 4, to 25k; and so on. If that's confusing, I've got an example below:

          Number of Validators   Total Votes
Group A   4                      700k
Group B   3                      600k
Group C   2                      325k
Group D   1                      200k

For our test case, we’ll start off with 4 validator groups, 6 electable validator positions, and 10 total potential validators. Group A has 4 validators, Group B has 3, Group C has 2, and Group D has 1. The total number of network votes is 1,825,000 divided among the groups as seen in the chart above. An election would go as follows:

          Pass 1   Pass 2   Pass 3   Pass 4   Pass 5   Pass 6   Validators Elected
Group A   700k     350k     350k     233k     233k     233k     3
Group B   600k     600k     300k     300k     300k     200k     2
Group C   325k     325k     325k     325k     162k     162k     1
Group D   200k     200k     200k     200k     200k     200k     0

On the first pass, Group A gets the top spot since they have the highest total. Group B wins a validator in pass 2 because Group A's effective votes drop by half. In pass 3, both A and B have been halved, but Group A's votes are still the highest, so they win a second validator and their original votes are now divided by 3, to 233k. In pass 4, it's Group C's turn to win a validator. This continues until 6 validators are elected. Some things to note: Group D does not get a validator elected, even though they are tied with Group B for the highest votes per validator at 200k. Group A actually has the second-lowest per-capita votes (average votes per validator) at 175k but still elects 3 validators.
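For the curious, the election above can be reproduced in a few lines of Python. This is a toy sketch of the D'Hondt Method as described here, not Celo's actual implementation; the group names and totals are just those from the example:

```python
# Toy sketch of the D'Hondt election described above (illustration only,
# not Celo's actual implementation).

def dhondt_election(group_votes, group_sizes, seats):
    """Elect `seats` validators. A group can win at most as many seats
    as it has validators. A group's effective votes at each pass are
    total_votes / (seats_won_so_far + 1)."""
    won = {g: 0 for g in group_votes}
    for _ in range(seats):
        # Only groups with unelected validators remain eligible.
        eligible = [g for g in group_votes if won[g] < group_sizes[g]]
        winner = max(eligible, key=lambda g: group_votes[g] / (won[g] + 1))
        won[winner] += 1
    return won

votes = {"A": 700_000, "B": 600_000, "C": 325_000, "D": 200_000}
sizes = {"A": 4, "B": 3, "C": 2, "D": 1}
print(dhondt_election(votes, sizes, seats=6))
# → {'A': 3, 'B': 2, 'C': 1, 'D': 0}
```

Running it reproduces the result from the table: Group A elects 3 validators, Group B elects 2, Group C elects 1, and Group D elects none.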

Ok so what’s the strategy here?

First, the total staked (or locked) to an individual validator doesn't really matter. A validator with only 10k locked can be elected over validators with higher totals as long as its group's total is higher. This means group leaders (the only ones who can lock gold for their group) should send rewards only to their group address and lock it there, allowing them to add additional validators and consolidate their votes.

The D'Hondt Method favors the largest groups (as demonstrated above), so groups will probably want to consolidate their totals by adding additional validators to improve their chances of being elected. What does this mean in the long term? Currently, transactions are not allowed between third parties, so it is up to the group leader alone to save and add validators. But once the network goes to mainnet, what's to stop people from creating super groups of 10 or more validators? And does it matter? We're already seeing consolidation: there were over 100 groups when the games started last month, and we're down to fewer than 50. This will probably be amplified on mainnet. As these groups consolidate, will that affect the decentralization of the mainnet? And again, will it matter? If validators are free to join groups as they please, they can obviously leave a group that is misbehaving. This is similar to Bitcoin mining in the sense that although there are a small number of mining pools, miners can move between pools as desired. The remaining question is how much power the miners actually hold. In 2017, Bitcoin miners bowed to users and SegWit2x failed; will Celo users wield the same authority? Once mainnet launches, token holders will be introduced to the mix, and we will see how they choose to allocate their capital to help ensure decentralization.

So far, so exciting! For those following along, the stakeoff will continue until Feb 19th, with Phase 3 starting Feb 5th. There is some talk of expanding the number of allowed validators beyond 100 to let those who have already fallen out of the running back in, but that remains to be seen. Additionally, the Celo Foundation is performing security audits on those validators that request one, for its seal of approval and bonus stake. You can take a look at the current state of the network here, as well as the validators participating here.