Consensus Networks is excited to announce its continued support of NuCypher through mainnet launch later this year. Consensus Networks has actively supported the network since its incentivized testnet challenge, “Come and Stake It,” launched earlier this year.
“We are big believers in NuCypher’s mission and see a major need for the work they are doing in enabling privacy-preserving applications and providing a secure compute platform,” said Consensus Networks’ COO, Connor Smith. “We will continue not only supporting the network by operating Ursula nodes on our infrastructure, but are actively building tools to help developers and organizations connect their applications to NuCypher and leverage the network to build the next generation of privacy-preserving dApps. We have had an absolute blast participating in all four phases of their incentivized testnet and cannot wait to help bring the power of the NuCypher network to the rest of the world.”
NuCypher aims to provide the cryptographic infrastructure for privacy-preserving applications by allowing users to manage secrets across dynamic environments, conditionally grant and revoke access to sensitive data for any number of recipients, and use a secure computation platform to process encrypted data while preserving the confidentiality of the inputs and outputs. It is a proxy re-encryption network designed to provide cryptographic access controls for dApps and protocols by functioning as a decentralized key management service (KMS). If you have any questions about how to integrate NuCypher with your application, or whether it is the right fit for what you and your team are building, contact our team today!
About Consensus Networks
Consensus Networks is an Indiana-based LLC that designs, builds, and manages dedicated infrastructure for blockchain technology. Founded in 2016 with the goal of providing direct, high-speed, low-latency network connections, Consensus enables clients to create, disseminate, and store information securely and efficiently. Consensus Networks uses advanced cryptographic techniques and smartly architected network designs to ensure maximum security and network uptime.
Healthcare is frequently mentioned on the shortlist of industries expected to be transformed by blockchain technology. Supporters of this assertion have a range of reasons for arguing that blockchain can improve healthcare, but most tend to revolve around improving health data security, immutable claims tracking, physician credentialing, or putting patients at the center of care by letting them take charge of their own data. However, there are many who argue that healthcare is not ready for blockchain, or that many of its theorized use cases could be better addressed with other technologies. There are good arguments on both sides and, given the complexity of healthcare and the nascency of blockchain, it can be difficult to discern which projects are ‘hype’ and which can actually drive meaningful impact.
We at Consensus Networks are bullish on the potential of blockchain in healthcare, but we also pride ourselves on taking a pragmatic view of projects to realistically assess what is feasible and what is not. This past week, we were fortunate enough to attend the inaugural Blockchain and Digital Transformation in Health in 2020 summit in Austin, TX, where our CEO Nathan Miller presented on the work we have been doing with HealthNet and on developing highly secure information architectures in a regulated environment. The conference was hosted by the Austin Blockchain Collective in conjunction with the Dell Medical School at The University of Texas at Austin. There were presentations from industry and academia alike, accompanied by open discourse about the state of blockchain in healthcare, what is actually feasible, and a path forward for the technology as healthcare starts its digital transformation. It was a great event with high-quality information and a pragmatic assessment of the state of the industry, and we’re here to share our top three takeaways with you!
1.) Blockchain EMRs Are Not Ready… At Least Not Yet in the U.S.
Throughout the short lifespan of blockchain projects in healthcare, there have been several attempts at a blockchain-based electronic medical record (EMR) that is owned by a patient and shared with providers as needed, the most popular of which is probably Medicalchain. Medical records hold a wealth of information about an individual, containing everything from a person’s medical history to their demographic, identity, and insurance information. However, to date, medical records have been largely owned and controlled by the health systems they reside within. Aside from issues of data sovereignty and controlling who has access to that information, having isolated data silos has a decidedly negative impact on patient outcomes, especially in the U.S., where competing EMR systems are incapable of communicating well with one another. Thus, if a patient goes to multiple providers with different EMR systems, the data for those visits will likely never be aggregated into a single, cohesive file and will instead remain as isolated fragments. This makes it nearly impossible for a provider to know what care a patient has previously received, and it leads to billions of dollars wasted on redundant testing and unnecessary procedures or, in the worst scenarios, to patient death from improper care.
A blockchain-based EMR would enable patients to own their own medical records, which they would likely hold in some form of mobile application. A patient could then control who has access to the record and have a guarantee that a provider is seeing the most up-to-date version, as any changes would be reflected in that copy immediately. All transactions would be immutably recorded on a blockchain, and once the visit was finished the patient could revoke the physician’s access. Conceptually, such a notion sounds appealing. However, one of the biggest takeaways from the conference was that such a future is far off in the U.S. and requires a societal shift and a fundamental rethinking of data ownership to get there.
Dr. Aman Quadri, CEO of AMSYS Blockchain and AMCHART, was one of the speakers at the event. The product he is building, AMCHART, is a blockchain-based EMR currently undergoing testing in India, and even he was skeptical of its prospects in the U.S. Dr. Quadri said AMCHART has started seeing adoption in India because people there already have a mindset of data ownership: they take responsibility for their data, so a platform like AMCHART extends their current capabilities in a beneficial way. For AMCHART to have impact in the U.S., he said, patients and health systems alike would need to change how they view and handle data before there could be a marked increase in value to patient care. American patients have been conditioned for decades to blindly trust medical providers with their data, so shifting that view will be no easy task.
2.) Use Cases Are Emerging Around Care Coordination, Identity, and Data Sharing
The projects being spearheaded by the talented Dell Medical School faculty, visiting academics, and industry representatives in attendance covered a wide range of applications in healthcare, spanning both population health and the clinical setting. While the individual problems these solutions address vary, the common thread amongst most of them was that they centered around care coordination, identity, and data sharing applications. The consensus seemed to be that blockchain could help lay the foundation for a web of trusted access to data with the patient at the center of care.
Dr. Timothy Mercer, a faculty member at Dell Medical School and practicing physician, is exploring ways blockchain could help address homelessness in Austin. His research found that one of the biggest problems for Austin’s homeless population is the lack of any legal means of proving their identity. As a result, they must repeatedly go through the process of proving who they are, which can take weeks to months and delays physicians from providing critical care; if the documents are lost or stolen, the process starts all over again. Partly as a consequence, the average age of death for the chronically homeless is 52–56, nearly 20 years below the global average. Dr. Mercer is exploring how blockchain and digital identities could ease this burden and accelerate the time to care. The homeless care ecosystem involves many different organizations, all of which must properly authenticate an individual before they can legally administer care. With a blockchain-based identity application, these caregivers could verify an individual’s identity through a web or mobile application, using digital documents linked to the patient’s digital identity, and legally provide the care he or she needs. This would place the homeless person at the center of care and alleviate the inefficiencies pervasive in the current continuum of care for this patient population.
Another interesting application highlighted at the event was Tribe Health Solutions’ use of blockchain in a medical imaging management solution designed for patients. Using blockchain technology and the InterPlanetary File System (IPFS), they created a platform where patients store medical imaging data on a distributed file system and grant providers access as needed. After care is administered, the patient can revoke access to the image and ensure that only trusted providers can view it. This solution aims to help patients overcome many of the problems associated with receiving care for skeletal or muscular injuries. With such tears or breaks, patients often seek multiple opinions and are forced either to manage a physical copy of the medical image themselves or to wait days to weeks for the file to be transferred between providers. This not only delays the time it takes to receive care and begin recovery, but in the worst cases can worsen the condition. Putting the patient in charge of the imaging data lets him or her determine who can view the image and when, ultimately reducing the time to treatment.
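To make the grant-and-revoke pattern concrete, here is a minimal, hypothetical sketch (not Tribe Health Solutions’ actual implementation) of an append-only access ledger keyed by a content hash standing in for an IPFS identifier:

```python
import hashlib
import time

class AccessLedger:
    """Toy append-only ledger of grant/revoke events for a piece of content."""
    def __init__(self):
        self.events = []  # events are only ever appended, never rewritten

    def grant(self, content_hash: str, patient: str, provider: str) -> None:
        self.events.append(("grant", content_hash, patient, provider, time.time()))

    def revoke(self, content_hash: str, patient: str, provider: str) -> None:
        self.events.append(("revoke", content_hash, patient, provider, time.time()))

    def can_view(self, content_hash: str, provider: str) -> bool:
        """A provider may view an image iff their most recent event is a grant."""
        state = False
        for action, chash, _patient, prov, _ts in self.events:
            if chash == content_hash and prov == provider:
                state = (action == "grant")
        return state

# Stand-in for an IPFS content identifier of the stored image.
image = hashlib.sha256(b"x-ray bytes").hexdigest()
ledger = AccessLedger()
ledger.grant(image, "patient-1", "dr-jones")
assert ledger.can_view(image, "dr-jones")
ledger.revoke(image, "patient-1", "dr-jones")
assert not ledger.can_view(image, "dr-jones")
```

The key property is that access decisions replay an immutable history, so a revocation cannot be silently erased.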
3.) Blockchain Projects Must Start Small in Healthcare and Involve Many Stakeholders
Lastly, perhaps the most prevailing takeaway from the conference was that blockchain projects looking to tackle problems in healthcare need to start small and involve as many stakeholders as possible from the onset. Healthcare is a highly complex industry where ‘moving fast’ and ‘breaking things’ can have significant ramifications for patients, especially considering that the industry is only now beginning its digital transformation. Dr. Anjum Khurshid of the Dell Medical School at The University of Texas at Austin, and a director of the Austin Blockchain Collective, is leading research on a patient credentialing platform called MediLinker. When pressed about extending the platform into an EMR-type technology that could exchange clinical data, Dr. Khurshid cautioned that it is more important to start small and clinically validate each stage of the product. In an industry like healthcare, which handles high-fidelity information and is typically averse to new technologies, he said it is important to demonstrate the value of the technology and make it more approachable. The problems in the current healthcare system are so vast that even simple solutions can have a massive impact, and it is imperative to validate the benefit of the technology to patients, providers, and payers alike at each step. Any truly innovative and sweeping changes blockchain brings to healthcare will require all of these parties to work together to identify applications that drive meaningful value for everyone involved. Healthcare is changing rapidly, and only by taking small, incremental steps will blockchain be able to integrate with the complex, multi-stakeholder ecosystem that is healthcare.
That’s all for this week! We at Consensus Networks are grateful to have attended this conference and excited about the work going on in Austin to advance the industry. We are continuing with the development and commercialization of our population health data sharing network, HealthNet, as well as smart tools for analyzing health data. If you are interested in learning more about HealthNet or have an idea for a new digital healthcare application you’d like to build, contact one of our experts here today!
Note: This is the final installment of a series exploring different approaches that blockchain networks have taken to achieve decentralization. Part 1 introduced decentralization and its interplay with certain aspects of crypto networks like governance, incentives, and network architecture. If you missed that article, I highly recommend going back and reading it here. The remaining articles have been examinations of decentralization on Bitcoin, Factom, Cosmos, Terra, and Polkadot/Kusama. If you missed those and would like to read them before we dive into Celo, you may do so here, here, here, here, and here, respectively.
Hey everyone, thank you for joining me again as our tour through varying approaches to decentralization comes to a conclusion. From the first crypto network, Bitcoin, to the new generation of ‘meta-networks’ like Polkadot and Cosmos, we have seen quite a few ways networks have attempted to decentralize and how that has influenced their design. We have seen how factors like application design criteria and consensus architecture (i.e., proof-of-work vs. proof-of-stake) influence the decentralization of a network’s resources and participants. Moreover, by taking a chronological approach to the networks examined throughout this series, we have seen the evolution of the crypto industry over the better part of a decade, and we will be ending with the youngest protocol yet, Celo. Aiming to overcome the price volatility and ease-of-use problems associated with many cryptocurrencies, Celo seeks to bring financial infrastructure and tools to anyone with a smartphone.
To reach this goal, Celo is taking a full-stack approach, introducing innovations at the networking and application layers with technologies like a lightweight address-based encryption scheme, an ecosystem of stable-value tokens, and novel system architectures. The project is backed by some of the most prolific crypto investment partners and Silicon Valley tech titans, including a16z, Polychain Capital, Coinbase, Reid Hoffman, and Jack Dorsey, to name a few. The protocol has been making waves over the last few months due to its incentivized testnet competition, the Great Celo Stake Off, the rigor of which was recounted two weeks ago by our CEO Nate Miller. The Stake Off is entering its third and final phase, and mainnet is slated to launch later this year, making 2020 a big year for the protocol. So without further ado, let’s dive into Celo!
So What is Celo and How Does it Work?
Celo was launched with the mission of building a financial system capable of bringing conditions of prosperity to everyone with a smartphone. To create such an environment, Celo has proposed a theory of change around three verticals: satisfying people’s basic needs like access to food and healthcare, enabling individuals’ growth potential, and increasing people’s social support for one another. All aspects of the protocol, from its architectural decisions to its development efforts and even its technical and community projects, support activities tied to enabling these three conditions. Work on the project began in the summer of 2018, when entrepreneurs turned GoDaddy executives Rene Reinsberg and Marek Olszewski raised an initial seed round of $6.5 MM from some of the Valley’s top venture capitalists. The pair had exited their prior company, Locu, to GoDaddy in 2014 for $70 MM and had since served as vice presidents in the firm’s restaurant and small business division. Armed with little more than a white paper at the time, the team got to work, and in less than a year the first public testnet was released. Celo aims to achieve its mission of being a mobile-only payments platform through two key features: mapping users’ phone numbers to the alphanumeric string (public key) needed to transact on the network, and using a network of stablecoins pegged to fiat currencies to minimize price volatility.
Celo’s lightweight identity system utilizes a variant of identity-based encryption known as address-based encryption to overcome traditional user-experience issues associated with transacting cryptocurrencies. Instead of the canonical flow of downloading a wallet, generating a public/private key pair, and providing whoever is sending you crypto with a long hexadecimal address, Celo’s address-based encryption ties a user’s phone number directly to a Celo wallet. This allows the phone number, rather than the actual Celo address, to be used when a payment is initiated, simplifying the payment process. Additionally, only a cryptographic hash of the phone number is stored on the blockchain to preserve privacy. Celo also allows a user to link multiple phone numbers to his or her wallet address to protect against losing a phone or changing numbers. On top of this infrastructure, Celo utilizes a social reputation mapping algorithm known as EigenTrust. While its technical underpinnings are fairly complex, it functions similarly to Google’s PageRank algorithm but is designed for decentralized systems. In short, the algorithm defines a given phone number’s relative reputation score based on the number of other phone numbers connected to and trusting that number, coupled with the weighted reputation of those connections.
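As an illustration of the privacy idea, here is a simplified sketch of hashing a phone number before mapping it to a wallet address. This is not Celo’s actual registration protocol (which involves verification and additional hardening); the salt, registry, and addresses below are hypothetical:

```python
import hashlib

def phone_to_identifier(phone_number: str, salt: bytes = b"") -> str:
    """Hash a phone number so only its digest, not the number, is stored on-chain."""
    normalized = phone_number.replace(" ", "").replace("-", "")
    return hashlib.sha256(salt + normalized.encode()).hexdigest()

# A toy on-chain registry mapping hashed phone numbers to wallet addresses.
registry: dict[str, str] = {}

def register(phone_number: str, wallet_address: str) -> None:
    registry[phone_to_identifier(phone_number)] = wallet_address

def resolve(phone_number: str) -> str:
    """A sender looks up the recipient's wallet knowing only their phone number."""
    return registry[phone_to_identifier(phone_number)]

register("+1 555 0100", "0xABC123")          # hypothetical number and address
assert resolve("+1-555-0100") == "0xABC123"  # formatting differences don't matter
```

The point is that the ledger never sees the raw number, yet anyone who knows it can derive the same digest and route a payment.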
Similar to Terra’s approach to creating a stablecoin network, Celo is a network of stablecoins pegged to real-world fiat currencies, and it uses a seigniorage-based approach to maintain stability. For the sake of brevity, I am going to gloss over what stablecoins and seigniorage are, as I discussed them at length in my post on Terra, and instead dive into how they work in the context of Celo. Celo is designed to support an ecosystem of pegged stable currencies alongside the native token of the protocol, Celo Gold (cGold). The first, and currently only, stablecoin on the network is the Celo Dollar (cUSD), which is pegged to the price of the U.S. dollar. cGold has a fixed supply and is held in a reserve contract, where it is used to expand or contract the supply of cUSD through seigniorage and thereby maintain a stable price. The network relies on external oracles to provide feeds of the cGold price in USD and allows users to exchange a dollar’s worth of cGold for one cUSD and vice versa. When the market price of cUSD rises above the $1 peg, arbitrageurs can profit by purchasing a dollar’s worth of cGold, exchanging it with the protocol for one cUSD, and selling that cUSD at the market price. Conversely, if the price of cUSD falls below the peg, arbitrageurs can profit by purchasing cUSD at the market price, exchanging it with the protocol for a dollar’s worth of cGold, and selling the cGold on the market. Thus, network participants are able to profit in nearly any market condition while nudging the supply back toward the peg. A more thorough examination of how Celo’s stability mechanism works may be found here.
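The two arbitrage paths described above can be sketched as follows; the function and the example prices are illustrative, not part of the protocol:

```python
def arbitrage_profit_per_unit(cusd_market_price: float) -> tuple[str, float]:
    """
    The protocol always exchanges 1 cUSD for $1 worth of cGold and vice versa,
    so the profit per cUSD is the gap between the market price and the $1 peg.
    """
    PEG = 1.00
    if cusd_market_price > PEG:
        # Buy $1 of cGold, mint 1 cUSD via the protocol, sell the cUSD at market.
        return ("expand cUSD supply", cusd_market_price - PEG)
    elif cusd_market_price < PEG:
        # Buy 1 cUSD at market, redeem it for $1 of cGold, sell the cGold.
        return ("contract cUSD supply", PEG - cusd_market_price)
    return ("no action", 0.0)

print(arbitrage_profit_per_unit(1.03))  # above peg: minting cUSD is profitable
print(arbitrage_profit_per_unit(0.97))  # below peg: redeeming cUSD is profitable
```

Notice that each profitable trade also pushes supply in the direction that restores the peg, which is the whole point of the mechanism.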
Celo also goes a step further to ensure stability by implementing a constant-product market-maker model to prevent the cGold reserve from becoming overly depleted when the cGold price supplied by the oracles does not match the market price. The mechanism dynamically adjusts the offered exchange rate in response to exchange activity and resets the constant-product market maker for trading cGold and cUSD whenever the oracle price of cGold is updated. Hence, if the oracle price is correct, the exchange rate determined by the constant-product market maker will match the market’s, and no arbitrage opportunity will exist. If the oracle price is incorrect, the rates will differ, and an arbitrage opportunity will exist until users exploit it enough to move the quoted exchange rate and erase the opportunity.
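A minimal sketch of the constant-product rule, with hypothetical reserve sizes, shows why large trades against a mispriced quote get progressively worse rates, protecting the reserve:

```python
def cpmm_quote(gold_reserve: float, cusd_reserve: float, gold_in: float) -> float:
    """
    Constant-product rule: gold_reserve * cusd_reserve = k stays fixed, so
    selling cGold into the bucket automatically moves the quoted price.
    Returns the amount of cUSD paid out for `gold_in` units of cGold.
    """
    k = gold_reserve * cusd_reserve
    new_gold = gold_reserve + gold_in
    new_cusd = k / new_gold          # reserves must still multiply to k
    return cusd_reserve - new_cusd   # cUSD released to the trader

# Small trades execute near the spot price implied by the reserves...
print(cpmm_quote(1000.0, 1000.0, 1.0))
# ...while a large trade drains far less than its "spot" value would suggest.
print(cpmm_quote(1000.0, 1000.0, 500.0))
```

With equal reserves of 1,000, selling 1 cGold yields just under 1 cUSD, but selling 500 cGold yields only about 333 cUSD: the marginal rate worsens continuously, so a bad oracle price can never empty the bucket in one shot.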
Celo’s Network Architecture
Consistent with its full-stack approach to creating a mobile-first financial system, Celo implements a novel tiered network architecture to optimize the end-user experience and maximize physical decentralization. As in other Byzantine Fault Tolerant (BFT) proof-of-stake networks we have seen so far, like Cosmos, the core network participant responsible for producing blocks and verifying transactions is the validator. Unlike proof-of-stake networks that welcome anyone willing to run a validator, Celo encourages only professional node operators to do so. For example, Celo strongly encourages running validators in a secure data center and has been auditing validators participating in the Stake Off to verify this. By maintaining a secure set of globally distributed validators, the network hopes to maximize security, performance, and stability. Celo has also attempted to safeguard against any single validator or organization gathering a disproportionate share of the network’s resources by introducing validator groups. Instead of electing individual validators to participate in consensus, the network elects validator groups, each comprised of a collection of individual validators that then compete internally to produce blocks. The actual election process and underlying mechanisms are far more involved, so if you are interested in learning more, check out this blog post from our CEO, Nate Miller, which explains the process in more detail. Validator groups have their own unique identity and a fixed size, making it difficult for a single organization to manage multiple groups and consolidate disproportionate influence, thus improving decentralization.
While the ability to run a validator is largely restricted to professional node operators, there are two other tiers of nodes that anyone can run on the Celo network: full nodes and light clients. The Celo wallet application has a light client embedded within it that is optimized for mobile devices, so anyone running the software on their phone is running a light client. The requests exchanged by these light clients (i.e., sending and receiving transactions on the network) must be processed by full nodes, which receive a transaction fee for facilitating each transaction. Operators of full nodes can set a minimum service fee for processing transactions from light clients and refuse service if the fee they would collect is insufficient. The eventual goal of the protocol is for light clients to automatically choose full nodes to peer with based on cost, latency, and reliability; however, much fundamental network infrastructure must be laid down before this is achievable. An eventual flow of what this will look like, including validators, may be viewed below.
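A toy sketch of that eventual peer-selection idea might look like the following; the scoring weights and node data are entirely hypothetical, since the protocol has not yet specified this mechanism:

```python
from dataclasses import dataclass

@dataclass
class FullNode:
    name: str
    fee: float          # minimum service fee (hypothetical units)
    latency_ms: float   # measured round-trip time to the node
    reliability: float  # historical success rate in [0, 1]

def score(node: FullNode) -> float:
    """Lower is better: cheap, fast, reliable nodes win. Weights are illustrative."""
    return node.fee * 10 + node.latency_ms * 0.01 + (1 - node.reliability) * 100

def choose_peer(nodes: list[FullNode]) -> FullNode:
    return min(nodes, key=score)

nodes = [
    FullNode("a", fee=0.01, latency_ms=40, reliability=0.999),
    FullNode("b", fee=0.001, latency_ms=300, reliability=0.95),
]
# Node "b" is cheaper, but its latency and reliability make "a" the better peer.
print(choose_peer(nodes).name)
```

Any real implementation would tune these weights dynamically and verify nodes’ claims, but the core idea is a simple cost function over observable peer metrics.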
So How Does Governance Work on Celo?
At a high level, Celo has a governance model similar to many other proof-of-stake networks, where the weight of a particular user’s vote in the governance process is proportional to the amount of cGold they have staked and the duration of their stake. Celo also supports on-chain governance to manage and upgrade all aspects of the protocol, including upgrading smart contracts, adding new currencies, and modifying the reserve’s target asset allocation. Changes are currently made through a governance-specific smart contract that acts as the overseer for modifications to other smart contracts throughout the network. The eventual goal is to transition from this structure to a Decentralized Autonomous Organization (DAO) owned and managed by cGold holders. This could function similarly to how MakerDAO operates; however, it is far too early to speculate on how a Celo DAO would actually function. For more information on what a DAO is and how they work, click here.
Any network participant is eligible to submit a proposal to the governance smart contract, so long as he or she is willing to lock a portion of cGold along with it. A proposal consists of a timestamp and the code needed to execute the operation should it be accepted by the network. Submitted proposals are added to a proposal queue for up to one week, during which cGold holders vote to advance them; every holder of locked cGold may vote for one proposal per day. The top three proposals each day advance to the approval phase, at which point the original proposers may reclaim their locked-gold commitment. Approved proposals then enter the referendum phase, where any user may vote ‘yes’, ‘no’, or ‘abstain’, with the weight of each vote tied to the voter’s locked-gold commitment. While yet to be implemented, Celo also intends to incorporate an adaptive quorum biasing component like the one we observed in Polkadot to accurately account for voter participation.
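The stake-weighted referendum tally described above can be sketched as follows; this is a simplification for illustration, and the actual contract logic includes quorum rules and more:

```python
from collections import defaultdict

def tally_referendum(votes):
    """
    votes: list of (choice, locked_cgold) pairs, choice in {"yes", "no", "abstain"}.
    Each vote is weighted by the voter's locked-gold commitment.
    """
    totals = defaultdict(float)
    for choice, weight in votes:
        totals[choice] += weight
    # Abstentions are recorded but count toward neither side.
    return totals["yes"] > totals["no"], dict(totals)

passed, totals = tally_referendum([
    ("yes", 1_000), ("no", 400), ("abstain", 5_000), ("yes", 250),
])
print(passed, totals)  # passes: 1,250 weighted 'yes' vs. 400 weighted 'no'
```

Note that a holder with large locked cGold moves the tally far more than many small holders, which is exactly the stake-proportional property described above.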
So How Decentralized is Celo?
As I mentioned earlier, Celo has yet to launch its mainnet, so this discussion will be framed through the context of what has transpired throughout its ongoing incentivized testnet, Baklava. As of this writing, 75 different validator operators are participating in the third phase of the Great Stake Off, with 152 validators in total and 88 validator groups. Moreover, Celo is debating expanding the active set of validators upon the conclusion of the Stake Off. The active set currently stands at 100 validators, and the original plan was to wind the number of validator slots down to 80 before mainnet launch. However, Celo recently announced that, given the active engagement validators have shown throughout the Stake Off, it now plans to expand the active set to 120 so long as scale testing shows this is permissible. Considering that Celo intends to allow validator nodes to be run only by professional service providers, this is a major step toward decentralizing the network and ensuring it is globally dispersed and resilient.
When examining the allocation of resources across the Celo network, there is somewhat of a disparity between the highest-ranked participants and those at the bottom. For example, the top elected validator group has nearly 1.1 MM votes, whereas the lowest elected validator group has only slightly over 200K. Additionally, the top elected group has five different participants, whereas the bottom elected group has only one, illustrating that on Celo it is the validator group, not the individual validator, that matters. The largest cGold holder within the top elected validator group has only 194K locked cGold, meaning every member of that group holds less locked cGold than the single participant in the bottom group; yet, because the group collectively is the highest voted, its participants are more likely to participate in consensus and gather rewards. Metrics on the decentralization of full nodes and light clients are not readily available, since the network is still in very early development, so it is difficult to quantify the degree of decentralization of these layers. The Celo wallet application is available for the Alfajores testnet on both the Apple App Store and the Google Play Store, with over 100 downloads on the latter, suggesting there are at least 100 light clients on the non-incentivized Alfajores testnet.
That’s all! I hope you have enjoyed this case-study approach to decentralization as much as I have. With the last phase of the Baklava incentivized testnet coming to a close within the next few weeks, mainnet launch slated for later this year, and the protocol’s recent announcement of Celo Camp to incubate and assist startups building on the platform, it is certainly an exciting time to be involved with Celo. The Great Celo Stake Off has been no walk in the park, but it has certainly stress-tested the network both technically and from an incentives standpoint. Excluding some economic barriers to entry for new validators attempting to enter the active set, it appears that Celo’s approach to decentralization has achieved its goal, at least physically. It will be interesting to see whether this continues once true economic conditions are introduced on mainnet, but I am optimistic about the future of the network. If you are interested in seeing whether Celo is the right blockchain for your application, running a Celo cluster, or staking on Celo, contact one of our LedgerOps experts here. We have been involved with the protocol throughout the entire incentivized testnet and are currently in the second-highest voted group (Chainflow-Validator-Group), so we are highly familiar with the protocol. Thanks again and take care!
By Connor R. Smith, originally published March 22, 2019
Despite blockchain having existed for over a decade now, few definitive uses have been proven outside of digital currencies. There have been many experiments applying these technologies in areas like supply chain, healthcare, real estate, and even tipping people for their tweets or for watching online ads, but there has yet to be one vertical radically transformed by them. Many feel this is because the technology is not yet mature or because of a general lack of understanding. Certainly, a lackluster user experience and insufficient education play a part, but others have started to argue that blockchain is a solution searching for a problem that may or may not even exist. It seems like new articles surface weekly about startups raising millions of dollars promising to solve some largely nebulous problem using “Blockchain + IoT, or AI, or drones, or all of the above…”. At Consensus Networks, we’re focused on finding and supporting protocols that are technically sound and address real-world use cases. One area we have been particularly excited about lately is the ability of blockchain to secure Internet of Things (IoT) data.
Fortunately, integrating blockchain technology with IoT networks provides a path forward to overcome the scalability, privacy, and security issues facing IoT today and to accelerate the adoption of both technologies. As opposed to a centralized system with a single point of failure, a distributed system of devices can communicate in a trusted, peer-to-peer manner using blockchain technology. Structured this way, the network has no single point of failure: even if a device were compromised, the remaining nodes would remain operable. Moreover, smart contracts could be integrated with the network to enable IoT devices to function securely and autonomously without the need for third-party oversight. Consequently, blockchain-enabled IoT networks could exhibit greater scalability, security, and autonomy simply by modifying their current network architecture and implementing a more decentralized approach.
However, perhaps the most important benefit blockchain provides IoT networks comes from its cryptographic security. Sharing data across a cryptographically secured network makes it far less susceptible to hackers, by helping to obfuscate where data is flowing, what is being exchanged, or what devices are transacting on the network. Whereas security in modern IoT networks was added as an afterthought, encryption and cryptographic keys are a core component of blockchain technology. Moreover, some networks are beginning to incorporate zero-knowledge proofs, which means that network security for IoT devices could be bolstered even further.
The underlying mathematics and mechanics of zero-knowledge proofs are highly complex, but they essentially allow two users to prove that a piece of information is true without revealing what the information is or how they know it to be true. In the context of IoT devices, this means that a network of IoT devices could share data in total anonymity and with complete privacy. No information regarding the transaction would be revealed other than proofs verifying that the network knows it is legitimate. Thus, the network maintains complete functionality while preserving maximum security. Regardless of whether a blockchain-enabled network of IoT devices utilizes zero-knowledge proofs, simply utilizing a shared, encrypted ledger of agreed-upon data can provide many security benefits in IoT networks.
While there have been several projects that have attempted to tackle IoT and blockchain, one that we are excited to support is IoTeX. Founded by a team of cryptography and computer science experts in 2017, IoTeX is a privacy- and security-centric blockchain protocol that aims to create a decentralized network designed specifically for IoT devices. IoTeX uses a network architecture consisting of blockchains within blockchains, where a root chain manages many different subchains. Designing the network in this manner allows IoT devices that share an environment or function to interact with increased privacy, with no risk to the root chain if a subchain is compromised.
Aside from enhanced privacy and security, this design allows for greater scalability and interoperability as subchains can transact with the root chain directly or across the root chain to other subchains. IoT devices on the IoTeX network are also able to transfer data with one another in total privacy through the incorporation of lightweight stealth addresses, constant ring signatures, and bulletproofs. IoTeX also incorporates a Randomized Delegated Proof of Stake (RDPoS) mechanism for achieving consensus that they refer to as Roll-DPoS. Using this mechanism, nodes on the IoTeX network can arrive at consensus much faster with instant finality and low compute cost, making it much more friendly to IoT devices. Moreover, the IoTeX team recently released their first hardware product that leverages their blockchain network, Ucam. Ucam is a home security camera that writes data it records directly to the IoTeX blockchain, preventing it from being accessed by device manufacturers or sold to third parties like Google or Amazon. Ucam guarantees absolute privacy and provides users with secure blockchain identities which they can use to control their data.
Thanks for reading! More articles are to come regarding use cases for IoT and blockchain and what the marriage of these two technologies might look like for Web 3.0 and Industry 4.0. Let us know what you think and find us on Twitter or Discord if there are any questions or areas you’d like us to explore! If you’re interested in finding out more about IoTeX, Ucam, or how blockchain can improve your IoT solution, feel free to contact one of our LedgerOps experts here. We have supported IoTeX for nearly a year now and have been running a delegate node on their mainnet since genesis. Needless to say, we are highly familiar with the protocol and eager to see if IoTeX or any of our other blockchain network services are a good fit for your IoT application!
Note: This is the sixth installment of a series detailing different approaches that blockchain networks have taken to decentralize their networks. Part 1 introduced the concept of decentralization and the interplay it has with certain aspects of crypto networks like governance, incentives, and network architecture. If you missed that article I highly recommend going back and reading it here. The subsequent articles have examined the decentralization of Bitcoin, Factom, Cosmos, & Terra. If you missed those and would like to go back and read them before we dive into Kusama & Polkadot, you may do so here, here, here, & here respectively.
Hey everybody, and thank you for joining me again as we continue our examination of the varying approaches to decentralization throughout the crypto ecosystem. Two weeks ago we examined Cosmos’ approach to decentralization and the unique challenges the protocol faces in its quest to build the internet of blockchains, and then spent last week studying one of the blockchains building on its platform, Terra. This week we will examine Polkadot, another protocol attempting to build a network of interconnected blockchains. Yet, it is taking a radically different approach from what we observed with Cosmos. However, as Polkadot has not yet launched its mainnet, much of the discussion will be framed through the lens of what has ensued on its ‘Canary Network’, Kusama. If you are wondering what a canary network is or why we will be examining two protocols in this article, don’t worry, I promise I will address these concerns in due time. However, to better discern how Polkadot differs from Cosmos and its approach to decentralization, it is crucial to first understand the history of the protocol and the institutions leading its development.
So What is Polkadot and Where Does Kusama Come In?
The Polkadot whitepaper was released in 2016 by Dr. Gavin Wood, co-founder of Ethereum and of blockchain development powerhouse Parity Technologies. Designed to connect all types of blockchains (private, consortium, public, & permissionless) and technologies, Polkadot aims to serve as the backbone of an ecosystem for independent blockchains to seamlessly transact between one another in a trustless manner and enable the decentralized internet at scale. Gavin and Parity Technologies are veterans of the crypto industry and have been instrumental in the evolution of blockchain technology. Many people tend to credit fellow Ethereum co-founder and figurehead of the protocol, Vitalik Buterin, as the leader of the blockchain revolution and the one who brought stateful programming to blockchains for applications outside of digital money. These assertions are well justified, seeing as he authored the Ethereum whitepaper and has led the direction of the protocol’s development since its inception. However, many of Ethereum’s core components and widely used technologies are the work of Gavin and Parity. For example, during Gavin’s time with Ethereum, he created the Solidity smart contract language that powers all of the dApps running on the network and was responsible for the network’s first functional release in 2014. Parity Technologies is also the lead developer and, until recently, maintainer of the popular Parity Ethereum client, an ultra-efficient alternative to the popular geth node run by many Ethereum developers that powers an estimated 20% of Ethereum nodes and portions of Infura, the open node cluster used by many developers that processes 13 billion requests a day.
Needless to say, Gavin & Parity have been instrumental in shaping the decentralized web and blockchain thus far. Many of the protocols that followed Ethereum have attempted to build upon or adapt its concepts in some way, borrowing from the innovations that these two produced. However, throughout all of the work Gavin & Parity performed for Ethereum, they began to notice that most approaches to blockchain networks were not practical in terms of scalability or extensibility as a result of inefficiently designed consensus architectures. Hence, Polkadot was proposed as a heterogeneous multi-chain framework that would allow many different blockchains, irrespective of consensus mechanism, to be interoperable with one another and overcome the following five shortcomings of conventional crypto networks: scalability, isolatability, developability, governance, & applicability. If you are curious as to how Polkadot views them, check out their whitepaper here. Similar to Cosmos, Polkadot’s heterogeneous multi-chain architecture is hyper-focused on addressing the scalability and isolatability problems, believing that if these two are adequately addressed the remaining three will reap tangible benefits and see improvement as well.
Shortly thereafter, Gavin, in conjunction with Robert Habermeier & Peter Czaban of the Web3 Foundation, officially founded Polkadot and commenced R&D on the ambitious effort. The Web3 Foundation is a Swiss-based organization founded with the intent of supporting and nurturing a user-friendly decentralized web where users own their own data and can exchange it without relying on centralized entities. The foundation conducts research on decentralized web technologies and supports different projects building them, Polkadot being the first. In May of 2018 the initial proof-of-concept for Polkadot was released as a testnet, and three subsequent iterations that integrated additional features were released in less than a year. Testnets provide an excellent proving ground for networks to work out any technical bugs that could occur at scale before a mainnet launch.
Starting with Cosmos’s Game of Stakes, the idea of using incentivized testnets to entice developers to truly stress test a network before mainnet launch has become largely canonical as the step preceding the launch of any proof-of-stake network. Polkadot took this a step further and released Kusama, an early, unaudited release of Polkadot, that serves as the experimental proving ground for the network. Affectionately referred to as ‘Polkadot’s Wild Cousin’, Kusama is Polkadot’s ‘canary’ network, or a highly experimental reflection of what the production version of Polkadot will be like. Kusama allows developers to test governance, staking, and more in an authentic environment with real economic conditions. Thus, developers and those validating on the network can be adequately forewarned of any potential issues that may transpire on Polkadot and correct them before a mainnet deployment. Kusama differs from a traditional testnet in that it is an entirely separate network from Polkadot, with its own token (KSM), and is run by the community. It will exist in perpetuity so long as the community supports it and is not inherently tied to Polkadot aside from inheriting its design and functionality.
So How Do These Heterogeneous Multi-Chain Networks Work?
There are three fundamental components that comprise the architecture of the Polkadot ecosystem: the relay chain, parachains, & bridges. For those of you who have been following along in this series, each of these pieces is largely analogous to the hub, zone, & pegzone concepts described in my Decentralization of Cosmos article. Parachains, or parallelizable chains, are the individual, customized blockchains built on top of Polkadot that gather and process transactions within their own network. All computations performed on a parachain are independent from the rest of the Polkadot ecosystem. Thus, parachains can implement data storage and transaction operations in a manner most befitting the problem they are trying to solve without being tethered to the technical underpinnings of another protocol, like its scripting language or virtual machine. Parachains are then connected to the relay chain, i.e., Polkadot itself, which coordinates consensus and relays transactions of any data type between all of the chains on the network. Lastly, bridge chains are a specialized permutation of a parachain that link to protocols with their own consensus, like Ethereum or Bitcoin, and communicate with them without being secured by the Polkadot relay chain. An image of how these pieces all fit together may be viewed below:
Designing the network in this manner has several key benefits, namely high security and near infinite scalability. Polkadot pools all of the security from the relay chain and the parachains building on top of it, irrespective of consensus mechanism, and then shares that aggregated security across the entire network. The relay chain provides a ground source of truth for the network by handling transactions and arriving at consensus, but any computation performed on the network can be scaled out in parallel across the appropriate parachains. Moreover, parachains can be attached to other parachains to create highly distributed networks for processing transactions. This allows the transaction volume of the network to be scaled out immensely without placing a crippling burden on the relay chain itself and allowing it to maintain the same level of security. Each parachain can maintain its own notion of validity for the transactions it processes, seamlessly disseminate that information to other parachains via the relay chain, and then the network as a whole can arrive at consensus.
However, this is only feasible with participation from the following core network stakeholders: validators, collators, and nominators. Similar to other proof-of-stake networks, validators are the node operators responsible for verifying transactions on the network and producing blocks for the Polkadot blockchain. Likewise, nominators are those who elect validators into the active set on their behalf by staking with them in exchange for a portion of their block rewards. The new player in this ecosystem is the collator, who is responsible for consolidating the transactions on the respective parachain they monitor into blocks and proposing proofs of those blocks to the validators. This eases the technical burden on validators by allowing them to only verify candidate blocks from parachains as opposed to processing and verifying thousands of parallel transactions. Hence, the relay chain can arrive at consensus in seconds as opposed to minutes and maintain the security offered by a highly decentralized network. Collators can also act as ‘fishermen’ who are rewarded for identifying parties on the network acting maliciously. An image depicting how all of these stakeholders interact across the different network layers may be viewed below:
It is important to note that I am simplifying significant portions of how Polkadot works at the technical level. The project is highly complex, with a myriad of intricate components at each layer that would take far too long to detail in a single article. For example, Polkadot uses a novel proof-of-stake finality gadget known as GRANDPA (GHOST-based Recursive ANcestor Deriving Prefix Agreement) that separates block production from block finality, allowing blocks to be finalized almost immediately. For more on GRANDPA check out this article, and if you are interested in learning more about the underlying technology of the network, check out the whitepaper here.
Governance on Polkadot
Similar to other proof-of-stake networks, the crux of Polkadot’s governance hinges on the idea of stake-weighted voting, where all proposed changes require a stake-weighted majority of DOTs (or KSM on Kusama) in order to be agreed upon. However, Polkadot also incorporates a tiered governance structure and unique voting mechanisms in an attempt to decentralize power and governing authority on the network. Anyone holding the protocol’s native currency, DOTs, has the ability to directly participate in governance on the network. They can do everything from voting on proposals brought forth by the community, to nominating validators to participate in the network, to prioritizing which referenda are voted upon, and more. Governance itself is completely dissociated from validating on the network, aside from the fact that validators can use their DOTs to vote as described above.
Polkadot also has a Council that will range in size from 6 to 24 members with prioritized voting rights. Any DOT holder is eligible to run for the Council, and members are elected by the community in the hope that they will propose referenda that are sensible and benefit the network as a whole. In addition to preferred voting rights, council members have the ability to veto incoming proposals if they believe they are harmful to the protocol. However, after a cool-down period, the proposal may be resubmitted, and if the council member who vetoed it originally is still present, he or she will be unable to do so again. To protect against council members becoming negligent in their duties or abusing their governing power, members are elected on a rolling basis, with the term of each council member being equal to the size of the council times two weeks. An illustration of this may be viewed below.
To combat the fact that total community participation on any referendum is unlikely, Polkadot implements what is known as Adaptive Quorum Biasing to change the supermajority required for a proposal to pass based on the percentage of voter turnout. Consequently, when voter turnout is low a heavy supermajority of ‘aye’ votes is required for a referendum to pass, or a heavy supermajority of ‘nay’ votes is required to reject it. Yet, as voter turnout approaches 100% the system adapts and only a simple majority either way is required, to account for the greater number of total votes. DOT holders’ votes are also weighed proportionally based on the amount of DOT they own and the amount of time they choose to lock those tokens for after the referendum has ended. For example, any DOT holder voting on a proposal must lock their DOTs for at least 4 weeks, but they can instead choose to lock them for up to 64 weeks to place a greater weight on their vote. All voting also occurs on-chain, so any approved proposals have a direct and immediate effect on how the network behaves.
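To make the turnout adaptation concrete, here is a small sketch of the ‘super-majority approve’ flavor of Adaptive Quorum Biasing as documented for Polkadot public proposals: nay votes scaled by the square root of turnout are compared against aye votes scaled by the square root of the total electorate. The vote counts below are hypothetical, and this ignores conviction (lock-time) weighting.

```python
import math

def passes(ayes, nays, electorate):
    """Super-majority-approve Adaptive Quorum Biasing check.

    Low turnout biases the threshold toward rejection; when turnout
    equals the electorate this reduces to a simple majority (nays < ayes).
    """
    turnout = ayes + nays
    return nays / math.sqrt(turnout) < ayes / math.sqrt(electorate)

# At 10% turnout, a 60/40 split in favor still fails...
print(passes(600, 400, 10_000))      # False
# ...but the same 60/40 split passes at full turnout.
print(passes(6_000, 4_000, 10_000))  # True
```

Notice that the referendum with 600 ayes fails despite a 60% majority of those voting, exactly the heavy-supermajority behavior described above.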
So How Decentralized is Polkadot?
As mentioned earlier, Polkadot has yet to launch its mainnet, so this discussion will be framed through the context of Kusama. As of the writing of this article, there are over 300 nodes supporting the Kusama network and 160 active validators distributed around the world. Moreover, there is currently a proposal up for election to increase the active set from 160 to 180 validators, with significant support from the community, suggesting that the network will become even more decentralized in the near future. The Council has 13 members with a combined backing of over 1 MM KSM from 264 voters. Of the 8.380 MM KSM issued so far, 2.641 MM, or 31.51%, is staked across the active set of validators. Similar to the other proof-of-stake networks we have observed so far, the top 10 validators control a significant portion of the staked KSM on the network, albeit far less than that of networks like Cosmos and Terra. Of the 2.641 MM KSM staked on the network, only about 18% resides with the top 10 validators by amount staked. This is all the more impressive considering that governance is completely decoupled from validating on the network: the KSM held by the top 10 validators by stake amounts to only roughly 5% of the total possible voting power on the network.
Given the scope of Polkadot to not only serve as a network of heterogeneous multi-chains, but as a platform for connecting private to public blockchains and creating a truly decentralized web, having a network that is sufficiently decentralized in all aspects (architecturally, economically, and from a governance perspective) will be hyper-critical to its success. If how decentralization has materialized on Kusama is an adequate proxy, then the future looks exceedingly bright for Polkadot. However, it is difficult to tell how this level of decentralization will carry over from Kusama to Polkadot. Kusama was launched to test out the different technical parameters of Polkadot and simulate what a live environment would be like for those validating on it. Consequently, it has been an exceedingly open community, encouraging of people to participate, which has likely led to the magnitude of decentralization observed on the network. Considering that 50% of the genesis rewards for Polkadot were already distributed via a token presale that occurred over two years ago, it is difficult to say with certainty that this level of decentralization will occur on Polkadot come mainnet launch. While many of those on Kusama have been involved with the project for a long time and intend on participating in Polkadot, the crypto world has evolved tremendously over the last two years. Therefore, there is some inherent possibility that the old money that entered two years ago has very different interests than the newcomers who have become involved since. However, the team behind the project is a force to be reckoned with in the crypto world that has worked hard to make Polkadot a reality, and the community grows more and more every day, so I’m optimistic that its launch this year will exhibit decentralization in a manner more aligned with how Kusama has evolved.
That’s all for this week, I hope you enjoyed the article! I know we unpacked quite a bit of information here, but, as I said, Polkadot is one of the most technically advanced protocols making waves right now and I really just scratched its surface. If you’re interested in learning more about Polkadot or Kusama, seeing if it’s the right fit for your application, or want to get involved in staking, feel free to reach out to us at Consensus Networks! We are actively involved in the community, have run validators for both Kusama and the current Polkadot testnet (Alexander) for some time, and are gearing up for Polkadot mainnet, so we are highly familiar with the protocol. Contact one of our LedgerOps experts here with any questions you may have about the network and we will get back to you as soon as we can. We are excited for the future of Polkadot and the impact it could have on the decentralized web, and eager to help you access the network. Thanks again for reading and tune in next week as I conclude my examination of decentralization with Celo.
We’re about a month into the Great Celo Stakeoff and it is definitely the most dynamic testnet launch I’ve ever been a part of. I’m constantly checking vote totals and running election calculations. On Discord, groups are frequently asking for new validators to associate with and are chatting late into the night. And many of the validators (myself included) who had a slow start are already unable to meet the minimum threshold required to become elected as a validator – but we’re all trying to get back in the game!
I’ll be the first to admit I had only a basic understanding of how the Celo election process worked and didn’t have anything more than a basic uptime strategy coming into the testnet launch – which is probably why we were quickly out of the running. The top validator groups figured it out quickly and were able to take advantage! So, for the rest of us, here’s how the Celo elections work and some strategies that may help you get back into the game if you’re already out or thinking about running a validator on Celo in the future.
Celo uses an election algorithm called the D’Hondt Method. The Wikipedia page has a decent explanation, and I’ll use it to demonstrate how the elections work for Celo. Celo currently has two places where stake can be directed: to a validator group and/or to an individual validator. To have a chance of being elected, both the group and the validator must have at least 10,000 (10k) cGold staked. For each validator in a group, the group leader must have an additional 10k locked in the group address (4 associated validators means 40k cGold locked).
From an election standpoint, the amount staked to a validator, as long as it’s at least 10k cGold, doesn’t really matter. What does matter is the total amount staked to the group and its validators. When an election occurs, Celo identifies the group with the highest total and elects a validator from that group first. It then starts to apply The D’Hondt Method, which means first dividing the total of the top group by two (for election calculations) then looking for the next highest total. If that first group still had the highest total, even after halving their total stake, they would elect a second validator. If not, the next group with the highest total would be elected (and their effective stake would drop by half as well). This process continues until 100 validators are elected. Each time a group has a new validator elected, their effective stake (for the election only) drops by an increasing factor. So a group with 100k would go to 50k the first time elected; the second time elected, the original total would be divided by 3 to 33k; the third time, divided by 4 to 25k and so on. If that’s confusing, I’ve got an example below:
Group   | Number of Validators | Total Votes
Group A | 4                    | 700,000
Group B | 3                    | 600,000
Group C | 2                    | 325,000
Group D | 1                    | 200,000
Total   | 10                   | 1,825,000
For our test case, we’ll start off with 4 validator groups, 6 electable validator positions, and 10 total potential validators. Group A has 4 validators, Group B has 3, Group C has 2, and Group D has 1. The total number of network votes is 1,825,000 divided among the groups as seen in the chart above. An election would go as follows:
On the first pass, Group A gets the top spot since they have the highest total. Group B wins a validator in pass 2 because Group A’s effective votes drop by half. Now for pass 3, both Group A and B have been halved, but Group A’s votes are still higher than the rest, so they win a second validator and their original votes are now divided by 3 to 233k. In pass 4, it’s Group C’s turn to win a validator. This continues until 6 validators are elected. Some things to note: Group D does not get a validator elected, even though they are tied with Group B for the highest per-capita votes (average votes per validator) at 200k. Group A actually has the second-lowest per-capita votes at 175k but still elects 3 validators.
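The passes above can be simulated in a few lines of Python. This is a simplified illustration, not Celo’s actual implementation; the group totals are the ones inferred from the per-capita figures in the example (A: 700k, B: 600k, C: 325k, D: 200k).

```python
def dhondt_elect(groups, seats):
    """groups maps name -> (total_votes, num_validators).

    Returns the number of validators each group elects under the
    D'Hondt method: on each pass, effective votes are the group's
    original total divided by (seats already won + 1).
    """
    won = {g: 0 for g in groups}
    for _ in range(seats):
        # A group can't elect more validators than it actually has.
        eligible = {g: total / (won[g] + 1)
                    for g, (total, n_vals) in groups.items()
                    if won[g] < n_vals}
        if not eligible:
            break
        winner = max(eligible, key=eligible.get)
        won[winner] += 1
    return won

groups = {"A": (700_000, 4), "B": (600_000, 3),
          "C": (325_000, 2), "D": (200_000, 1)}
print(dhondt_elect(groups, 6))  # → {'A': 3, 'B': 2, 'C': 1, 'D': 0}
```

Running it reproduces the outcome described above: Group A elects 3 validators, Group D none, despite Group D’s high per-capita votes.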
Ok so what’s the strategy here?
First, the validator total staked (or locked) doesn’t really matter. A validator with only 10k locked can be elected over validators with higher totals as long as the sum of the group total is higher. What this means is group leaders (the only ones who can lock gold for their group) should only be sending rewards to their group address and locking it. This will allow them to add additional validators and consolidate their votes.
The D’Hondt Method favors the largest groups (as demonstrated above), so groups will probably want to consolidate their totals by adding additional validators to improve their chances of being elected. What does this mean in the long term? Currently, transactions are not allowed between third parties, so it is up to the group leader alone to save and add validators. But once the network goes to mainnet, what’s to stop people from creating super groups of 10 or more validators? And does it matter? We’re already seeing consolidation: there were over 100 groups when the games started last month and we’re down to fewer than 50. This will probably be amplified on the mainnet. As these groups consolidate, will that affect the decentralization of the mainnet? And again, will it matter? If validators are free to join groups as they please, they can obviously leave a group that is misbehaving. This is similar to Bitcoin mining in the sense that although there are a small number of mining pools, miners can move between pools as desired. The remaining question to be answered, then, is how much power do the miners actually hold? In 2017, we saw the Bitcoin miners bow to users and SegWit2x fail; will Celo users wield the same authority? Once the mainnet launches, token holders will be introduced to the mix and we will see how they choose to allocate their capital to help ensure decentralization.
So far, so exciting! For those following along, the stakeoff will continue until Feb 19th, with Phase 3 starting Feb 5th. There is some talk of expanding the number of allowed validators beyond 100 to allow those who have already fallen out of the running back in – but that remains to be seen. Additionally, the Celo foundation is performing security audits on those validators that request it for their seal of approval and bonus stake. You can take a look at the current state of the network here as well as the validators participating here.
Note: This is the fifth installment of a series detailing different approaches that blockchain networks have taken to decentralize their networks. Part 1 introduced the concept of decentralization and the interplay it has with certain aspects of crypto networks like governance, incentives, and network architecture. If you missed that article I highly recommend going back and reading it here. The subsequent articles have been examinations of the decentralization of Bitcoin, Factom, & Cosmos. If you missed those and would like to go back and read them before we dive into Terra, you may do so here, here, & here respectively.
Hey everybody, and welcome back! Last week, we dove into our first proof-of-stake network, Cosmos, and analyzed how it has approached decentralization in its quest to build the ‘Internet of Blockchains’. In addition to assessing how decentralization has worked thus far on Cosmos, we also got into the nuts and bolts of how the underlying technologies supporting its ecosystem of interconnected, application specific blockchains (Tendermint BFT, ABCI, & IBC) work, and the modular network design of Cosmos with hubs and zones. This week, I will be picking up right where we left off and analyzing one of the networks building on top of Cosmos that is seeing major real world use and adoption, Terra.
Terra aims to provide a price stable cryptocurrency, built on top of the Cosmos SDK, that will function as the infrastructure layer for decentralized financial applications. The protocol utilizes an elastic monetary supply that allows for both a stable price and the censorship resistant capabilities of Bitcoin to be maintained, enabling it to be used in everyday transactions. Currently, Terra is the backend technology powering CHAI, a mobile payments application that allows users to link their bank accounts and participate in ecommerce using Terra’s currency and receiving discounts in exchange.
Terra is currently backed by South Korean internet giant Kakao and integrated with over 10 e-commerce platforms in Southeast Asia. The platform has seen rapid growth since launching last April, having just reached over 1,000,000 users last week, and continues to grow. With so many cryptocurrencies struggling to break into the commercial sector, and seemingly every new year being the year we will finally start to see adoption, this is certainly no trivial feat. So now, without further ado, let’s dive into Terra and see how this network has approached decentralization on its quest to become the largest payments platform in Asia!
So What is Terra and How Does it Work?
Before we dive into the technical underpinnings of Terra, the problem it solves, and its approach to doing so, it will help to first have some context regarding the rather unconventional background of its founders and some of the history leading up to its launch. Work on the project commenced in April of 2018, led by co-founders Daniel Shin and Do Kwon. Kwon had previously worked as a software engineer at Apple and Microsoft, in addition to being founder and CEO of a startup called Anyfi that attempted to use peer-to-peer mesh networks to create a new, decentralized internet. Shin was a successful serial entrepreneur, having built and sold multiple e-commerce companies in East Asia, and, at the time, was CEO of his most recent startup, TicketMonster, the leading e-commerce platform in Korea. Leveraging their extensive backgrounds in e-commerce and distributed systems, the pair sought to create a modern financial system built on top of a blockchain that could be used by people to make everyday payments. The two believed that the major roadblocks to adopting cryptocurrencies largely stemmed from the extreme price volatility and lack of a clear path to adoption that most networks exhibited. Thus, they designed Terra to be a price-stable, growth-driven cryptocurrency that was focused on real-world adoption from day one. Leveraging Shin’s deep connections in e-commerce, they formed a consortium of e-commerce companies known as the Terra Alliance. Within a few months of launching, 15 Asian e-commerce platforms had joined, representing a total of $25 billion in annual transaction volume and 40 million customers. This coincided with a $32 million seed round the team raised from some of the world’s largest crypto exchanges and top crypto investment firms. With a war chest of funding in place and real-world partnerships aligned, the team was off to the races as they started building the project and integrating it with e-commerce platforms.
Seeing as Bitcoin launched over a decade ago as a peer-to-peer electronic cash system, you may be wondering why a protocol like Terra was still tackling the issue of digital payments. This is largely because Bitcoin and other cryptocurrencies exhibit significant price volatility, making consumers nervous about whether their money will hold its value when they try to transact with it later. For perspective, crypto markets can fluctuate 10% or more in either direction on any given day; in late 2017, Bitcoin was trading at nearly $20,000/BTC, and less than two months later it was trading between $6,000 and $9,000 (at the time of writing this article, Bitcoin is trading at $8,415.65). Terra was far from the first or only project to realize that price volatility posed a significant barrier to crypto adoption. Attempts at creating a price-stable cryptocurrency, or stablecoin, date back as far as 2014 with BitShares, and have proliferated rapidly in the wake of the volatility exhibited in the last crypto bull market in late 2017.
Stablecoins are exactly what their name suggests: cryptocurrencies designed to be highly price-stable with respect to some reference point or asset while maintaining the three functions of money: a store of value, a unit of account, and a medium of exchange. While this sounds fairly intuitive and straightforward, the engineering behind these instruments is quite difficult, with no agreed-upon approach. Cornell University attempted to codify the different approaches networks are taking and put forth a paper classifying the different design frameworks for stablecoins, which can be found here. The finer nuances and mechanics of each approach exceed the scope of this article, but the study revealed that most stablecoins maintain their price using one of the following mechanisms: a reserve of pegged/collateralized coins or assets, a dual-coin design, or an algorithmic approach.
Maintaining a reserve of a pegged or collateralized asset allows the organization controlling the stablecoin to maintain its price by incentivizing users to expand or contract the supply until it returns to the peg. Users earn money through arbitrage, expanding the supply when the price is high and redeeming when it is low, until the opportunity disappears and the price has equilibrated. The dual-coin approach is a two-token system in which one coin is designed to absorb the volatility of the other through a process known as seigniorage: the secondary coin is auctioned in exchange for the stablecoin if it dips below the peg, and the proceeds are burned to contract the supply and stabilize the price. Conversely, if the price of the stablecoin rises above the peg, new coins are minted and distributed to holders of the secondary coin to expand the supply and level the price. Lastly, the algorithmic approach uses quantitative financial techniques to adjust the money supply as needed without any backing of pegged or collateralized assets. It behaves analogously to a traditional cryptocurrency in the sense that a user's balance and outstanding payments vary proportionately with changes in the market cap of the coin, but it provides a more stable unit of account.
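The dual-coin expand/contract cycle described above can be sketched in a few lines of Python. This is a hypothetical illustration, not any network's actual implementation; the function name, the `adjustment_rate` parameter, and the proportional adjustment rule are all illustrative assumptions.

```python
# Hypothetical sketch of one dual-coin seigniorage step. All names and
# numbers are illustrative assumptions, not a real protocol's code.

def seigniorage_step(stable_price: float, peg: float,
                     stable_supply: float, adjustment_rate: float = 0.05):
    """Return (new_supply, action) after one stabilization step.

    Below the peg: the secondary coin is auctioned for stablecoins and
    the proceeds are burned, contracting supply. Above the peg: new
    stablecoins are minted to secondary-coin holders, expanding supply.
    """
    if stable_price < peg:
        # Contract supply proportionally to the downward deviation
        burn = stable_supply * adjustment_rate * (peg - stable_price) / peg
        return stable_supply - burn, "contract"
    elif stable_price > peg:
        # Expand supply proportionally to the upward deviation
        mint = stable_supply * adjustment_rate * (stable_price - peg) / peg
        return stable_supply + mint, "expand"
    return stable_supply, "hold"

# Price dips to $0.98 against a $1.00 peg: supply contracts by 1,000 coins
supply, action = seigniorage_step(0.98, 1.00, 1_000_000)
print(action, round(supply))  # contract 999000
```

In a real network the adjustment is driven by market auctions rather than a fixed rate, but the direction of the supply change is the same.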
Terra utilizes a dual-coin approach in which the transactional currency, Terra, represents an ecosystem of cryptocurrencies pegged to real currencies like USD, EUR, KRW, and the IMF SDR, and Luna is the secondary coin that absorbs the volatility. All of the Terra sub-currencies (TerraKRW, TerraUSD, etc.) can be swapped between one another instantly at the effective exchange rate for that currency pair, allowing the network to maintain high liquidity. Since the prices of these fiat currencies are not natively known to the blockchain, a network of decentralized price oracles is used to approximate their true value. Oracles, in this context, are essentially trusted data sources that broadcast pricing data generated from currency exchanges onto the network. They vote on what they believe the true price of the fiat currencies to be, and, so long as they are within one standard deviation of the true price, are rewarded in some amount of Terra for their service. Should the price of Terra deviate from its peg, the money supply is contracted or expanded as needed using a seigniorage method similar to that described above. Hence, oracles mining Terra transactions absorb the short-term costs of contracting the supply and gain from increased mining rewards in the mid to long term.
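The oracle reward rule above — pay every oracle whose vote lands within one standard deviation of the consensus price — can be sketched as follows. Treating the vote median as the "true" price, along with the function names, is an illustrative assumption; Terra's actual oracle module computes a weighted consensus.

```python
# Illustrative sketch of the oracle voting scheme: reward any oracle whose
# submitted price is within one standard deviation of the consensus price.
# Using the median as the consensus value is an assumption for this sketch.
import statistics

def reward_oracles(votes: dict, reward: float) -> dict:
    """Return {oracle: reward} for each oracle within one standard
    deviation of the median submitted price."""
    prices = list(votes.values())
    truth = statistics.median(prices)
    sigma = statistics.stdev(prices)
    return {name: reward
            for name, price in votes.items()
            if abs(price - truth) <= sigma}

votes = {"oracle_a": 0.99, "oracle_b": 1.00, "oracle_c": 1.01, "oracle_d": 1.30}
# oracle_d's outlier vote falls outside one standard deviation and earns nothing
print(reward_oracles(votes, reward=5.0))
```

This is why honest reporting is the profitable strategy: an oracle far from the cluster of votes simply forfeits its reward.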
Luna and the Critical Role it Has in the Tokenomics & Governance of Terra
Since Terra is a proof-of-stake network, oracles must have stake in the network in order to mine Terra transactions. This is where the second token, Luna, comes in. Luna is the native currency of the protocol: it represents the mining power of the network and is what miners stake in order to be elected to produce blocks. Luna also plays a critical role in defending against Terra price fluctuations by allowing the system to make a market for Terra, agreeing to be a counterparty for anyone looking to swap Terra and Luna at the target exchange rate. In other words, if the price of TerraSDR << 1 SDR, arbitrageurs can send 1 TerraSDR to the system in exchange for 1 SDR worth of Luna, and vice versa. Thus, miners can benefit financially from risk-free arbitrage opportunities, and the network is able to maintain an equilibrium around the target exchange rate of Terra irrespective of market conditions. Luna is also minted to match offers for Terra, allowing volatility in the price of Terra to be absorbed into the Luna supply. In addition to the transaction fees validators collect from producing blocks, the network will also automatically scale seigniorage by burning Luna as demand for Terra increases. As Luna is burned, mining power becomes scarcer, and the price of Luna should theoretically increase. This scales with the transaction volume and demand on the network, allowing miners to earn predictable rewards in all economic conditions.
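The arbitrage mechanism described above can be made concrete with a short sketch. The fixed 1 TerraSDR = 1 SDR target comes from the protocol design; the function name, the round-trip framing, and the market prices used are hypothetical.

```python
# Illustrative sketch of the Terra<->Luna swap arbitrage. The system always
# honors the 1 TerraSDR = 1 SDR target; prices and names here are hypothetical.

TARGET = 1.0  # 1 TerraSDR is always redeemable for 1 SDR worth of Luna

def arbitrage_profit(terra_market_price: float, amount_sdr: float) -> float:
    """Profit (in SDR) from one round of arbitrage against the system.

    Below the peg: buy Terra cheaply on the open market, redeem it with
    the system at par for Luna. Above the peg: mint Terra at par and sell
    it at the market premium."""
    if terra_market_price < TARGET:
        terra_bought = amount_sdr / terra_market_price
        return terra_bought * TARGET - amount_sdr
    elif terra_market_price > TARGET:
        return amount_sdr * (terra_market_price - TARGET)
    return 0.0

# TerraSDR trading at 0.95 SDR: 100 SDR of arbitrage yields ~5.26 SDR
print(round(arbitrage_profit(0.95, 100), 2))  # 5.26
```

Because this profit exists whenever the market price strays from the peg, arbitrageurs push the price back toward 1 SDR on their own initiative.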
In the most recent update to the protocol (Columbus-3) on December 13, 2019, Luna gained even more utility in the Terra ecosystem: its holders can now participate in on-chain governance. Luna holders can submit proposals for parameter or monetary policy changes to the network, make general text proposals, and request funds from the community pool (a portion of the seigniorage tokens available to fund community initiatives). If a proposal receives a supermajority of supporting votes, it is ratified and the changes are made. Not only does this extend the functionality of Luna, it also opens up the base of individuals who can actively participate in the network. Before the Columbus-3 update, Luna only added value to miners on the network, but now anyone can purchase Luna and use it to participate in governance. Moreover, Luna transactions are tax-free, making it even easier for non-miners to acquire Luna and participate in the network.
Terra also has a governance body known as the Treasury that is designed to allocate resources from seigniorage to decentralized applications (dApps) being built on top of the platform. After registering as an entity on the Terra network, a dApp can make a proposal to the Treasury for funding, and Luna validators may then vote on whether to accept or reject the application based on its economic activity and use of funding. Should the application receive more than ⅓ of the total available Luna validating power, it will be accepted, and the Treasury will allow the dApp to open an account and receive funding proportional to the vote it received from Luna validators. The Treasury ultimately determines how funds are allocated to dApps, but, if the community feels a firm is not delivering results, validators can vote to blacklist its dApp. Ultimately, this tiered governance structure is designed to give Luna holders a way to direct funding toward the proposals and organizations expected to have the highest net impact on the Terra economy.
So How Decentralized is Terra?
As of this writing, there are 61 validators located around the world on the Terra Columbus-3 mainnet (we at Consensus Networks are on this list and have been active validators since Terra first launched this past April!). While only about ⅓ the size of the validator set of its parent network, Cosmos, this is still a fairly impressive degree of physical decentralization considering Terra underwent a fully decentralized launch and has concentrated on integrating with e-commerce platforms exclusively in Southeast Asia. However, the top 9 validators currently control 62.8% of the voting power on the network. So, similar to what we observed last week with Cosmos, a very small handful of network participants control the majority of economic and governance resources.
What is less clear is whether this centralization of resources has as significant consequences on a growth-focused stablecoin network like Terra. For example, Seoul National University's blockchain research group, Decipher, conducted an in-depth study on Terra which concluded that it exhibits much greater price stability than other popular stablecoins like USDC or USDT. Terra has also on-boarded 14 online e-commerce platforms and over 1,000,000 users onto its payments application, CHAI, resulting in over $130 million being processed by the network to date. It has also begun expanding outside of South Korea into areas like Mongolia and Singapore. Given that Terra's mission was to be a price-stable cryptocurrency with a clear path to market, it objectively appears that the project has been successful thus far (especially considering that the mainnet has been live for less than a year). With validators receiving rewards in two forms (transaction fees and Luna burn), Terra has created a rewards structure that is predictable for validators under all economic conditions, giving them little to gain from colluding in an attempt to undermine the network.
Yet the recent additions of on-chain governance in Columbus-3, Luna being listed on more exchanges, and Luna receiving tax-exempt status on transactions introduce new layers of complexity to the Terra ecosystem that could pose a threat to the decentralization of the network. Now, anyone can vote on proposals that affect Terra's future trajectory at both a governance and a functional level. Considering that proposals on the network require a supermajority of votes to pass, collusion between a handful of parties controlling most of the resources now poses a much greater threat. For example, if the top 9 validators were to collude and try to pass a proposal that benefited them at the expense of the network, they would only need to acquire roughly 4% more of the voting power to reach the supermajority needed to approve it and change the protocol at a functional level. Additionally, given Terra's adoption-driven growth model, there is now a whole new range of stakeholders that must be factored into the ecosystem, like e-commerce platforms and users of Terra-based applications. While it is still unclear how this will evolve over time, effectively anticipating and designing for these new dynamics is one of the team's primary focus areas moving forward, as can be seen here.
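The "roughly 4%" figure above follows directly from the numbers already cited, assuming a ⅔ supermajority threshold:

```python
# Worked version of the collusion scenario: with a 2/3 supermajority
# threshold, how much more voting power would the top 9 validators need?
supermajority = 2 / 3
top9_power = 0.628  # share of voting power cited above

shortfall = supermajority - top9_power
print(f"{shortfall:.1%}")  # 3.9%
```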
Given the major shifts in how the protocol operates and the massive influx of new stakeholders, it is far too early to speculate on how Terra's approach to decentralization will evolve. Regardless, the fact remains that Terra's adoption-driven approach to growth has made it one of the few cryptocurrencies seeing demonstrable real-world use to date. Having recently hired Uber's former Head of Strategy, Rahul Abrol, to spearhead their international growth efforts, Terra and CHAI have a very realistic chance of achieving their goal of becoming the leading payments platform in Asia in the years to come. Thank you for reading, and I hope you enjoyed the article! Tune in next week as we explore the other massive project looking to create the internet of blockchains, Polkadot, and its canary network, Kusama!
Note: This is the fourth installment of a series detailing the approaches that different blockchain networks have taken to decentralize their network. Part 1 introduced the concept of decentralization at a high level, the role it plays in crypto networks and some of the different factors inherent to a protocol that influence how decentralization will propagate at scale and influence participation. Parts 2 & 3 then provided in-depth looks at Bitcoin and the Factom Protocol in addition to their approaches to decentralization. If you missed any of these articles and would like a refresher before we dive into the decentralization of Cosmos, you may do so here, here, & here respectively.
Hello everyone and thanks for joining! Over the last few weeks, we have talked quite a bit about decentralization and the different mechanisms that blockchain networks use to influence participation and the dispersion of their network participants. We took a nice trip down memory lane to examine Bitcoin's approach over the last decade, followed by fellow veteran protocol Factom. In this week's article, we'll dive into Cosmos, the first Proof-of-Stake (PoS) protocol in our series. Though conceptualized in a white paper as early as 2016, Cosmos did not launch until March 13, 2019. The full history leading up to this event is too long to recount in this article, but I highly recommend checking it out here, as it involves some cool blockchain history. For example, one of the organizations behind the Cosmos Network, the Interchain Foundation (ICF), raised nearly $17 million in 30 minutes for the project via an ICO, which is regarded as one of the most successful fundraising events for a blockchain-related project ever. Cosmos may be slightly less than a year old, but it is certainly one of the most ambitious blockchain projects to emerge thus far. Cosmos is creating what many have termed the 'Internet of Blockchains' or 'Blockchain 3.0': an ecosystem of interconnected blockchain protocols that are interoperable and can seamlessly communicate with one another.
What is Cosmos and How Does it Work?
Beginning with Ethereum in 2014, developers started experimenting with ways to apply Bitcoin's underlying blockchain technology to applications beyond digital money. Specifically, many were interested in using blockchains for stateful applications that could handle more robust computer logic, rather than just a stateless ledger. This led to innovations like smart contracts and gave rise to the notion of decentralized applications (dApps). As developers explored new uses for blockchain technology, an explosion of new, highly specific crypto networks ensued. Consequently, people began to realize that, for the decentralized web to work, one of two things would have to happen. One school of thought was that everything would resolve to one or a handful of 'fat protocols' capable of handling all the compute needs of the different applications built on top of them, with universally agreeable tokenomics. The opposing view was that there would be a myriad of different blockchains, some application-specific and others comprehensive, that would be interoperable with one another so people could transact across them as needed.
Cosmos was born out of the latter ideology and essentially applies a microservices architecture to create a base-layer network connecting an ecosystem of application-specific blockchains. To achieve this goal, the teams working on Cosmos decouple the different components of blockchain networks (consensus, networking, & the application) and provide them piecewise, in a 'plug-n-play' fashion, to the protocols building their blockchains on top of Cosmos. Hence Cosmos, while a blockchain protocol in itself, acts as a platform on which other blockchains can be built using a suite of customizable, shared technologies, with Cosmos handling the proper management of state and transactions across the different sub-networks.
The first component that makes this possible is the Tendermint BFT consensus engine. Tendermint BFT is an open-source technology that combines the networking and consensus layers of a blockchain into a generic Byzantine fault-tolerant 'consensus engine' that can easily be plugged into any application to create a proof-of-stake blockchain. Being Byzantine fault tolerant means that fewer than ⅓ of the nodes in the network can arbitrarily fail or be overcome by a malicious actor and the network will still operate as intended. If you would like more information on Byzantine fault tolerance or the Byzantine generals problem, the original paper on the topic is available here. Proof-of-stake is an alternative consensus mechanism to proof-of-work that replaces mining with the notion of skin in the game via stake: the amount of a network's cryptocurrency a user owns and invests into the protocol. The larger a participant's stake, the more likely that participant is to be elected to produce blocks and be rewarded. This is analogous to the relationship between a Bitcoin miner's investment in computing power and his or her likelihood of mining a block, but it requires significantly fewer computational resources and allows for faster transaction times. Tendermint founder Jae Kwon was actually the first person to prove and demonstrate that Byzantine fault tolerance worked for proof-of-stake networks, so for more information, I invite you to check out the paper he released on the matter here.
Cosmos also provides blockchains building on top of it with a generalized application development framework, the Cosmos SDK, that makes building secure applications on top of Tendermint BFT via the Application Blockchain Interface (ABCI) much easier and quicker. The SDK comes with a set of pre-built modules, with built-in security and separation of concerns, that allow developers to build applications without having to code everything from scratch. Developers simply use the components they need, accelerating the time it takes to launch their blockchain.
Once developers have launched their application-specific blockchains, they can communicate and transact with other blockchains in the Cosmos ecosystem via the Inter-Blockchain Communication protocol (IBC). The chains in the Cosmos ecosystem are heterogeneous: each runs its own set of validators and its own implementation, while sharing the fast-finality consensus provided by Tendermint BFT. Validators are the node operators in proof-of-stake networks, like Cosmos, who are responsible for participating in consensus, producing blocks, and governing the network. Anyone can set up a validator and compete to be in the active set, the subset of validators selected to participate in consensus. Before launch, a proof-of-stake network determines how large it wants its active set to be, and validators fill that set based on the magnitude of their stake after launch. Cosmos's consensus mechanism then selects participants probabilistically, based on the amount of stake they have relative to the other validators, to propose blocks and be rewarded. Validators can also suggest and vote on governance proposals for the network. Users who lack the funds or stake required to be in the active set may instead stake their funds with a validator in the active set and receive a share of that validator's rewards, proportional to the amount they have staked, minus the validator's commission.
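The delegation economics at the end of that paragraph reduce to a one-line formula. This is a generic sketch of proportional reward sharing, with hypothetical numbers; actual reward distribution on Cosmos involves additional details like community-pool taxes.

```python
# Generic sketch of proportional staking rewards: a delegator's share of a
# validator's block reward, net of the validator's commission. Numbers are
# hypothetical; real Cosmos distribution includes extra deductions.

def delegator_reward(block_reward: float, commission: float,
                     delegated: float, total_stake: float) -> float:
    """Reward paid to one delegator, proportional to their delegation,
    minus the validator's commission cut."""
    gross = block_reward * delegated / total_stake
    return gross * (1 - commission)

# 10 ATOM block reward, 10% commission, delegator holds 5% of the stake
print(delegator_reward(10.0, 0.10, 5_000, 100_000))  # 0.45
```

This is also why the zero-fee validators discussed later in this series attracted so much stake: setting `commission` to zero maximizes every delegator's payout.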
Each application-specific blockchain represents a zone, which then connects to a hub (in this case Cosmos) via IBC connections. The Cosmos Hub is specifically designed to serve as the connection point for all of the applications, or zones, built within its ecosystem and carry out seamless transfers between them via the IBC, as seen below:
Cosmos has designed its ecosystem so that there can eventually be multiple hubs connected via IBC, allowing the network to continue operating at thousands of transactions per second at scale. Additionally, the team plans to incorporate Peg Zones: blockchains whose sole responsibility is to track the state of another chain. These Peg Zones will serve as bridges to non-Tendermint networks like Ethereum or Bitcoin and allow them to communicate with Cosmos via IBC. Hence, the end vision for Cosmos is an ecosystem of interconnected blockchains that looks as follows:
Governance on Cosmos
Cosmos was designed with a tiered governance structure that is open to the entire community but leaves ultimate voting responsibility in the hands of those supporting the network (i.e., the validators). A constitution states how Cosmos is governed and how proposals for network updates are to be made. Anyone holding the network's native currency (the Atom) is eligible to make a governance proposal and submit it for voting. Proposals can range in focus and magnitude, covering everything from changing how governance occurs on the network to overhauling aspects of the network's codebase that modify its functionality.
The one caveat is that at least 512 Atoms must be deposited toward the proposal, either by the member who submitted it or by the community broadly. At the end of two weeks, the proposal is dismissed if it has not received sufficient backing; otherwise it enters a two-week voting period in which the community discusses its implications and submits votes of "yes", "no", "no with veto", or "abstain". Only staked Atoms count toward governance, and the relative weight of a participant's vote is based on their stake in the network. Hence, validators with larger stakes hold more voting power. Atom holders who choose not to run a validator can stake their Atoms to any validator they feel represents their interests and inherit that validator's vote. A proposal is accepted only if over 40% of staked Atoms participate in the vote, over 50% of the vote is in favor of the proposal, and less than a third of voters elect to veto it. If successful, the proposal is integrated into the software that runs the network by the core developer team, and the validators must coordinate an update to reflect the changes.
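The tally rules above can be sketched as a single function. This is a simplified reading of the rules as stated in this article — abstain votes count toward quorum but not toward the yes/no outcome — and the exact treatment of abstain and veto in the live Cosmos Hub parameters may differ.

```python
# Simplified sketch of the Cosmos Hub tally rules described above:
# >40% quorum, >50% "yes" among decisive votes, and less than 1/3 veto.
# Treating abstain as counting toward quorum only is an assumption here.

def tally(yes: float, no: float, veto: float, abstain: float,
          total_staked: float) -> bool:
    voted = yes + no + veto + abstain
    if voted / total_staked <= 0.40:   # quorum not reached
        return False
    if veto / voted >= 1 / 3:          # vetoed
        return False
    decisive = yes + no + veto         # abstain excluded from the outcome
    return decisive > 0 and yes / decisive > 0.50

# 45% turnout, mostly "yes", little veto: the proposal passes
print(tally(yes=30, no=10, veto=2, abstain=3, total_staked=100))  # True
```

Note how the veto acts as a minority brake: even a proposal with majority support fails if a third of voters veto it.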
More on the finer details of how the governance process works can be found in the Cosmos white paper here, or in this article written by fellow Cosmos Validator Chorus One. It is also important to note that each zone built on Cosmos has its own separate constitution and governance process. Therefore, governance decisions made on the Cosmos Hub do not necessarily dictate how other zones operate. However, if any changes are made to either Tendermint BFT or the Cosmos SDK as a result of the governance update, zones will likely need to update their own networks if they are using either of these technologies.
So How Decentralized is Cosmos?
Cosmos Hub 1 (version 1 of the mainnet) launched on March 13, 2019, and was one of the first proof-of-stake networks to successfully carry out a fully decentralized launch. This is in contrast to other successful proof-of-stake network launches, like Tezos, which had a group of Foundation Validators who controlled the initial mainnet. The initial active set was capped at 100 possible validators, and 75 of these slots were filled at genesis. This meant that there were 75 different groups distributed around the world working together to coordinate the launch of the network and start actively validating transactions. However, while Cosmos may have had an impressive initial degree of decentralization based on the number of unique validators, the true magnitude of that decentralization was far more nuanced.
Within the first week after the launch of the Cosmos mainnet, the top 5 validators controlled nearly 50% of the staked Atoms on the network. Moreover, many of these participants had been significant buyers in the Cosmos coin offering. This illustrated that, while Cosmos may have been decentralized in terms of the number of validators, the actual distribution of power and resources within the network told a far different, more centralized story. For the next three months, 15 or fewer validators controlled ⅔ of the staked Atoms on the entire network at any given time (per https://twitter.com/cosmosdecentral). In October 2019, Gavin Birch, community analyst at Figment Networks, released a report detailing the negative repercussions that coupling governance with the validator role was having on Cosmos, which may be found here. In summary, the report revealed that validators were exploiting the Cosmos incentive structure to gain undue influence over network governance at the expense of other validators. Some validators were running with zero fees, meaning that 100% of the rewards they received would be distributed to those staking with them. Naturally, these zero-fee validators received a disproportionate amount of the stake as Atom holders sought to maximize their returns. This forced some validators to shut down, as they could no longer compete and earn rewards sufficient to offset the cost of their infrastructure.
Many community members grew concerned over what this meant for the future of Cosmos, as this behavior posed a major security vulnerability: if validators continued to exploit the network in this fashion, it would drive away those who could no longer afford to run a validator, decreasing the level of physical decentralization across the network and making it easier for malicious actors to overtake it. As a result, on December 11, Cosmos Hub 3 was launched based on a governance proposal that aimed to reconcile some of the alignment issues between network governance and incentives. The first major change was that the active set was expanded from 100 to 125 validators, allowing for more participation in the network. The second major change was a redefinition of the entire governance structure. Prior to the upgrade, voting was only a signaling mechanism: the result of a vote had no immediate consequences other than telling the core development team what changes needed to be made to the network based on the proposal. The core developers would then release a software update, all of the validators would coordinate when to perform the upgrade, and the new network would be restarted (a process known as a hard fork). With the release of Cosmos Hub 3, the network now features on-chain governance. This means that the result of a vote automatically applies changes to the appropriate parameters in the codebase and alters how the network behaves without requiring a hard fork.
Instituting on-chain governance drastically shifts the incentives for network participants to vote and provides a mechanism by which delegators (those staking to other validators) can participate in governance more actively. While they still have the option to inherit the vote of the validator they are staked to, delegators can also elect to override that vote with their own if they disagree with how the network should proceed. This helps galvanize delegators to play a more active role in governance, which helps balance the economic and governance incentives of the network while maintaining a relatively stable degree of physical decentralization. Additionally, there has been a proposal to fund a governance working group to work alongside protocol development and help address emerging governance issues. There were several other major updates in Cosmos Hub 3, like the introduction of a rewards pool from which network participants can vote on proposals to fund projects that enhance the ecosystem. The full proposal may be found here.
It is still too early to say whether expanding the active set of validators and introducing on-chain governance was the right answer to the centralization of power and economics on Cosmos. Figment Networks' December Cosmos update revealed that the voting power of the bottom 90% of validators on the network had increased 3% from November. However, it also found that the top 10 validators on the network controlled 46% of the voting and consensus power. The December analysis only covered the top 100 validators, to maintain consistency with November, so this month's analysis will better reflect the impact that increasing the validator set to 125 had on the decentralization of the network. And who knows: with Cosmos slated to conduct Game of Zones in the coming months to refine its inter-blockchain communication module, a slew of major updates are likely to happen this year that will fundamentally change how the network operates. Thanks for reading, and join me next week as we dissect the decentralization of one of the fastest growing networks in South Korea and a zone within the Cosmos ecosystem, Terra!
Note: This is the third installment of a multi-part series detailing the approaches that different blockchain networks are taking toward decentralization. Part 1 introduced the concept of decentralization at a high level, the role it plays in crypto networks, and some of the different factors inherent to a protocol that influence how decentralization will propagate at scale and influence participation. Part 2 then provided an in-depth look at Bitcoin and its approach to decentralization. If you missed either of these articles and would like a refresher before we dive into the decentralization of Factom, you can do so here and here respectively.
Hello again, and I hope everyone's 2020 is off to a good start! In this week's article, I will be diving into how Factom has approached decentralization. Conceptualized in 2014 and released the following year, Factom is one of the more veteran protocols still in use today. After observing the speed, cost, & bloat limitations developers experienced when building applications on top of Bitcoin, Factom was released as a developer-friendly way to secure information into the Bitcoin and Ethereum blockchains without having to transact with those networks directly. Factom is designed to help ensure data integrity and has been used to secure data to the blockchain for the likes of the Department of Energy, the Department of Homeland Security, and the Bill and Melinda Gates Foundation, to name a few. Most recently, Factom has seen use as the base-layer network for PegNet, a decentralized, CPU-minable stablecoin network that allows users to convert between a network of pegged assets (crypto and real-world) for less than one-tenth of a cent. So without further ado, let's get into the nuts and bolts of how Factom works!
Factom is essentially a collection of blockchains that immutably record any form of data in a very structured, accessible way. A user simply creates a chain for a topic and then writes data to that chain, where it is recorded as transactions in its blocks. This data is then secured to the Factom blockchain, leveraging the power of the overall network. Factom is composed of several hierarchical layers of data structures. The highest layer is the Directory Layer, which organizes the Merkle roots of the Entry Blocks; essentially, this layer is a hash generated from all of the entry blocks plus their corresponding Chain IDs. The next layer down is the Entry Block Layer, which holds reference pointers to all of the entries with a particular Chain ID that arrived within a given time window. Underneath the Entry Block Layer come the Entries themselves, which are the raw application data written to Factom. Lastly come chains, which are groupings of entries for a particular application or a topic within an application. An image of how all of these different layers interact may be viewed below. In short, application data is organized into chains, which are added to entry blocks and hashed into the Directory Layer to be secured by Bitcoin and Ethereum.
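The layering just described can be illustrated with a toy hashing sketch. Real Factom uses proper Merkle trees and richer block headers; the flat concatenate-and-hash scheme, function names, and chain IDs below are simplifying assumptions for illustration only.

```python
# Toy illustration of Factom's layered structure: entries hash into an
# entry block per chain, and entry blocks combine into a directory-layer
# hash. Real Factom uses Merkle trees; this flat hashing is a simplification.
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def entry_block_hash(entries: list) -> bytes:
    """Hash over all entries recorded for one chain in one block period."""
    return sha256(b"".join(sha256(e) for e in entries))

def directory_hash(chains: dict) -> bytes:
    """Directory-layer hash over every chain's entry block plus its chain ID."""
    parts = [chain_id.encode() + entry_block_hash(entries)
             for chain_id, entries in sorted(chains.items())]
    return sha256(b"".join(parts))

# Two hypothetical application chains, each with its own raw entries
anchor = directory_hash({
    "supply-chain-app": [b"shipment 42 departed", b"shipment 42 arrived"],
    "audit-log": [b"user alice logged in"],
})
print(anchor.hex())  # this single digest is what gets anchored externally
```

The key property is that changing any single entry in any chain changes the directory-layer hash, so one anchor per block period is enough to commit to every entry beneath it.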
Factom utilizes a two-token model: there is the token associated with the protocol (the Factoid, or FCT) and another token used to submit entries to the network, known as the entry credit. The Factoid, like other cryptocurrencies, is price-sensitive and varies with the market over time. The entry credit, by contrast, maintains a fixed price of a tenth of a cent and may only be used to submit entries to Factom. This allows developers and enterprises to interact with the Factom blockchain at a stable, predictable price while still leveraging the hash power of more price-volatile networks like Bitcoin and Ethereum. To transact on the network, developers use Factoids to purchase entry credits, which are in turn used to submit application data to the blockchain. The application then records an entry to the Factom blockchain, and Factom servers create the appropriate entry and directory blocks. Factom then secures an anchor, or hash of the directory block, to Bitcoin and Ethereum. An overview of how this process looks in practice may be viewed below.
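To see how the fixed-price entry credit insulates developers from FCT volatility, here is a small illustrative Python helper. The function name and the example price are hypothetical, not part of Factom's tooling.

```python
ENTRY_CREDIT_USD = 0.001  # entry credits hold a fixed price of a tenth of a cent

def entry_credits_for(fct_amount: float, fct_price_usd: float) -> int:
    """Entry credits obtained by converting `fct_amount` FCT at the market price."""
    return round(fct_amount * fct_price_usd / ENTRY_CREDIT_USD)

# At a hypothetical FCT price of $2.50, converting 10 FCT yields 25,000
# entry credits; the yield tracks the market price of FCT, while the cost
# of submitting an entry stays pinned at $0.001.
print(entry_credits_for(10, 2.50))  # 25000
```

Whatever FCT trades at, the entry side of the equation never moves, which is the whole point of the two-token split.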
Nodes on the Factom network are split into two classes of servers in an attempt to decouple the two roles that Bitcoin miners essentially play: recording entries in a final order and auditing the validity of entries. Hence, there are federated Factom servers and audit servers. Federated servers are responsible for accepting entries to a chain on Factom, assembling them into blocks, and then fixing the order of all of the entries across the network. Roughly every 10 minutes, a block for all entries on the network is recorded to the Factom blockchain by the federated servers, and a hash of the data is then inserted into the anchors on Bitcoin and Ethereum. Audit servers, by contrast, simply audit all entries made on the network. The verification of all entries is done client-side, enabling audit servers to perform their job in either a trusted or trustless manner depending on the client’s needs and level of trust a priori.
Managing all of the chains on the Factom network is no simple task. Since any application can have as many independent chains as needed to secure different data sources, there are thousands of chains and entries that must be validated, written to the Factom blockchain, and then propagated to Bitcoin and Ethereum. A network of independent, globally distributed Authority Nodes bears this responsibility on the Factom protocol. These Authority Servers are the set of federated and audit servers that essentially operate Factom. The federated servers in the authority set are responsible for ordering entries made to the network, while the audit servers in the set duplicate the work of the federated servers and take over in the event a federated server fails. To ensure no one party in the authority set has too much power, each server is only responsible for a small part of the system, servers double-check the work of other servers, and servers cycle responsibilities every minute or so. Only a small group of trusted, community-elected parties is permitted to run Authority Servers, so the network is able to record entries quickly using a variant of a Proof-of-Authority consensus mechanism.
Those running Authority Servers are known as Authority Node Operators (ANOs), and they are the network participants who benefit economically in the Factom ecosystem. They are also the parties responsible for governing Factom, and thus hold considerable power within the ecosystem. Unlike Bitcoin, however, where anyone can mine and increase their relative economic power by investing in more infrastructure, ANOs must be elected by the community of existing ANOs. Elections typically occur once per year, and there will only ever be 65 ANOs at most. All governance decisions made by the ANOs are done off-chain. A thorough explanation of Factom’s governance process may be found here, but it generally runs as follows: The community drafts a document for any change it is considering making to the network. A ‘Major Timed Discussion’ is then opened on Factomize, the community’s forum, and members have 8 days to voice their opinions or concerns about the proposal. At the end of the discussion, the ANOs vote on whether to implement the change. If the vote passes, the proposal is sent to the democratically elected legal committee for review and then added to the governance documents.
In exchange for running the infrastructure that supports the network and playing an active role in governance, ANOs are compensated in Factoids. They are rewarded a base of 2,246 FCT per month, reduced by their efficiency. Efficiency refers to the percentage of their FCT that ANOs forgo to the community grant pool and is related to the amount of work an ANO contributes to Factom outside of infrastructure services. A full breakdown of how ANOs are compensated, with examples, may be found here, but compensation generally takes the following form: an ANO providing only infrastructure and governance services is expected to operate at around 60% efficiency, meaning they contribute ~60% of their total possible rewards to the grant pool to fund core protocol work, marketing, or other projects that use or promote Factom. Conversely, if an ANO intends to provide these types of services to the network in addition to its other responsibilities, it can set a lower efficiency and be compensated more for its efforts.
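Using the figures above, the split works out as follows. This is a hypothetical helper for illustration; `ano_payout` is my own naming, not part of any Factom tooling.

```python
BASE_REWARD_FCT = 2246  # monthly FCT reward before efficiency is deducted

def ano_payout(efficiency_pct: float) -> tuple[float, float]:
    """Split the monthly base reward between the ANO and the community grant pool."""
    to_grant_pool = BASE_REWARD_FCT * efficiency_pct / 100
    to_operator = BASE_REWARD_FCT - to_grant_pool
    return to_operator, to_grant_pool

# An infrastructure-only ANO at 60% efficiency keeps roughly 898 FCT
# and forgoes roughly 1,348 FCT to the grant pool each month.
operator_share, grant_share = ano_payout(60)
```

Lowering the efficiency shifts FCT from the grant pool back to the operator, which is how ANOs doing extra development or marketing work are paid more.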
So How Decentralized is Factom?
There are currently 28 independent Authority Node Operators supporting the Factom protocol and participating in its governance. While primarily located in the U.S. or Europe, the firms serving as ANOs are distributed around the globe and comprise a mix of Factom-specific companies and ledger-agnostic entities. The parties acting as ANOs represent a host of different interests and skill sets, ranging from Factom Inc., the corporation that primarily maintains the core codebase, to consulting and investment firms like VBIF, and everything in between. (We at Consensus Networks are a Factom ANO as well!) From this network of ANOs, there are 5 Guides who were elected by the broader Factom community to help implement a governance framework and ensure it runs efficiently. Guides do not inherently possess any greater power or influence, but rather help ensure that ANOs are up to date on what governance matters need to be attended to and that there are democratic, efficient procedures in place for ANOs to use in doing so.
While 28 authority nodes is not a large number compared to the thousands of nodes on networks like Bitcoin and Ethereum, it has proven sufficient for the needs of a data layer protocol like Factom in terms of network uptime and speed. It is also important to note that even though only 28 nodes maintain the network, the utility of the Factom protocol is still accessible to anyone. For example, a group of ANOs provides and maintains an Open Node that any developer around the world can use to interact with the Factom protocol via an API endpoint. The most recent network update for Factom revealed that more than 150 million requests were made to this node during the one-month period from November 13 to December 13. Additionally, anyone, ANO or not, is eligible to apply for grants to fund their Factom-based projects. Seeing as ANO elections tend to occur roughly every 6-12 months, it is reasonable to expect another election round sometime during 2020. The exact number of ANOs onboarded varies from round to round, but would likely fall somewhere between four and six new operators based on the number of new ANOs brought on in each of the last two rounds.
Despite the small cohort of ANOs governing the protocol, they represent quite a diverse range of interests and hold varying opinions about what constitutes proper ANO activities and about the future direction of the protocol. If you go through nearly any of the Major Timed Discussions on Factomize, you will see healthy discourse over various proposals, with multiple viewpoints represented across the different ANOs. To better incentivize ANOs to act in the long-term interests of the network, increase their activity within the community, and prevent them from ‘free-riding’ once elected into the group of 65 ANOs, procedures for ANO promotion and demotion were recently instituted in the network’s governance documents. Hence, the community can better voice its approval or disapproval of an ANO’s performance and now has mechanisms in place to punish poor performance. If the community collectively decides that an ANO has not carried out its responsibilities effectively for two quarters, it can elect to demote that ANO. Ultimately, this adds another layer of checks and balances to the power structure of network participants.
Only time will tell if Factom’s open approach to decentralization, in which no single community member is inherently more powerful than another and all protocol decisions are made through a democratic process and voted on by all ANOs, will prove successful. With PegNet recently being listed on exchanges, and with Factom seeing use in the U.S. HOA industry by Avanta Risk Management and for expediting banking regulatory compliance in the UK by Knabu, 2020 is looking to be a bright year for this veteran protocol. I hope you have enjoyed this article and found it informative! If you have any questions or comments, please feel free to leave them below. Next week I will continue this series by diving into Cosmos, an ambitious project attempting to create the ‘Internet of Blockchains’, so stay tuned and thanks for reading!
Note: This is the second part of a multi-part series in which I will examine the approaches that different blockchain networks are taking towards decentralization. Part 1 introduced the concept of decentralization at a high level, the role it plays in crypto networks, and some of the different factors inherent to a protocol that influence how decentralization will propagate at scale and influence participation. If you missed that article or would like a refresher before we dive into decentralization of the Bitcoin Network, you can do so here.
With 2019 having just come to a close along with the decade, it seems an apt time to reflect on Bitcoin’s approach to decentralization and how it has played out. Having entered the decade as little more than a whitepaper and an open-source protocol with a handful of users and contributors, Bitcoin enters the 2020s as the best-performing asset of the 2010s. Bitcoin was the first true cryptocurrency and blockchain network, and hence is the longest-lived. There was no blueprint for Bitcoin to follow, nor a generally accepted framework for launching a blockchain protocol, back in 2009. The Bitcoin network we see today is a true first stab at a decentralized network for transacting value and serves as the longest-running experiment for us to learn from. All blockchain protocols that have come since have been influenced by Bitcoin, mimicking certain aspects of it and trying to improve upon others. Hence, Bitcoin serves as a great foundational network to begin our examination of decentralization, as it will help you better understand why the other networks we explore have made the choices they did.
An Overview of How the Bitcoin Network Works
One of the most important, and most commonly overlooked, things to understand about Bitcoin is that none of the base technologies used in its creation were inherently novel. Elliptic curve cryptography, distributed systems, and the idea of proof-of-work had all been established well before Bitcoin. Even the concept of digital cash had been tried as early as the 1990s with David Chaum’s DigiCash. What was novel about Bitcoin’s design was how it combined all of these elements to solve the double-spend problem and built an incentive structure around them that allowed digitally native value to be securely transacted across a peer-to-peer network, with agreement reached through a Proof-of-Work consensus algorithm. For those of you who might be new to crypto, I will provide a brief overview. However, I highly recommend reading the Bitcoin whitepaper if you want a more in-depth explanation.
In short, Bitcoin’s proof-of-work mechanism works as follows. Each block created on the Bitcoin blockchain has a cryptographic hash generated from its index, timestamp, block data (Bitcoin transactions), the hash of the previous block, and a value known as a nonce. The nonce is chosen so that the leading bits of the block hash are all zeros. There is no way for miners to know the correct nonce a priori, so they compete to find it by brute force. Moreover, as the compute (hashing) power on the network increases, so does the difficulty of finding a valid value. The network’s difficulty is periodically adjusted so that mining a block takes roughly 10 minutes on average; once a correct nonce is found, it is broadcast to the network, the miner who computed it is rewarded, and consensus is reached.
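The hunt for a nonce can be illustrated with a toy proof-of-work loop. This is a simplified sketch: real Bitcoin hashes an 80-byte binary header with double SHA-256 and compares against a numeric difficulty target rather than a zero-prefix check, and the transaction string here is invented.

```python
import hashlib

def mine(prev_hash: str, block_data: str, difficulty_bits: int) -> tuple[int, str]:
    """Brute-force a nonce until the block hash has the required zero prefix."""
    target_prefix = "0" * (difficulty_bits // 4)  # one hex digit per 4 bits
    nonce = 0
    while True:
        header = f"{prev_hash}{block_data}{nonce}".encode()
        block_hash = hashlib.sha256(header).hexdigest()
        if block_hash.startswith(target_prefix):
            return nonce, block_hash  # found: any peer can verify with one hash
        nonce += 1  # no shortcut exists; the search is pure trial and error

nonce, block_hash = mine("00" * 32, "alice->bob: 1 BTC", difficulty_bits=16)
# block_hash now begins with four hex zeros; raising difficulty_bits makes
# the search exponentially longer, which is how the 10-minute target is tuned.
```

Note the asymmetry the mechanism relies on: finding the nonce takes a huge number of hash attempts, but checking a claimed solution takes exactly one.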
As you can see, miners play a very critical role in the Bitcoin ecosystem. They’re the ones responsible for adding blocks to the Bitcoin blockchain and are compensated for contributing their compute power and helping secure the network. Yet, miners are just one actor within the Bitcoin protocol. Network participants also have the option to run either a full node or a light node on the network. Full nodes are responsible for hosting a copy of the entire Bitcoin ledger and verifying the authenticity of its transaction history all the way back to the genesis block. They simply maintain and distribute the most trusted version of the blockchain to other nodes on the network, thus not requiring the same computational resources as miners. Light nodes play a similar role to full nodes, but instead of keeping a full version of the ledger, they download the block headers to validate the authenticity of transactions. They are often peered to full nodes to further decentralize the network or can be used to help restore a full node if it is corrupted.
Aligning Incentives: Governance and Reward Structures
Let’s examine how Bitcoin attempted to balance the economics and power of stakeholders while decentralizing the network. Using the tiered network architecture described above, Bitcoin sought to create a bifurcation between how network participants are rewarded and how governance occurs. When Bitcoin was first released and blocks were CPU-minable, anyone could run a miner and be rewarded for contributing their compute power to secure the network while also running a full node. Given Bitcoin’s grassroots beginnings, it made sense that the small group of early adopters should be rewarded economically and have the responsibility of participating in governance. However, as the difficulty of solving blocks increased, miners started utilizing computationally superior ASICs. This made the barrier to profitable mining much steeper, as just one top-of-the-line miner cost in excess of $1,000, leaving only a small subset of network participants willing to make the necessary capital investment. This high barrier to entry persists and continues to reduce the number of individual, decentralized miners participating on the network.
Without all network participants being able to truly compete in the mining process, there is some inherent risk of a 51% attack, in which an individual or group of participants takes control of the majority of the network’s hashing (computing) power to prevent transactions and determine which blocks are added to the blockchain. While in control of the economics of the network, miners do not inherently hold any special authority, nor are they required to participate in governance. Seeing as Bitcoin is an open-source project, anyone running a node on the network can theoretically propose changes to the codebase that alter how transactions are validated, how consensus is reached, and so on. A more thorough examination of how Bitcoin governance works may be found here, but the process is generally as follows: A user conducts research to solve some problem with Bitcoin. Once they have a solution, they notify all other protocol developers, typically via a Bitcoin Improvement Proposal (BIP). After the proposal has been made, other interested protocol developers begin implementing and testing it to give a formal peer review. If the change is well received and approved, it is implemented into the node software, and then node operators, exchanges, and other community members must be convinced to update. As long as the majority of the community finds the change reasonable, the network is updated and the new rules or functionality are put into place.
So How Has Decentralization Played Out for Bitcoin?
As of this writing, there are just shy of 9,000 full nodes supporting the Bitcoin network. 25.30% of these nodes reside in the U.S., 20.76% in Germany, and the remaining 53.94% are dispersed across the rest of the world, largely in Europe and Asia, as can be seen below.
It is worth noting that some, like Bitcoin Core developer Luke Dashjr, speculate the true number of nodes on the network to be closer to 100,000. This estimate allegedly accounts for all nodes on the network, not just the nodes in “listening mode” that node-monitoring services count. Regardless, this expansive, global network of nodes is why Bitcoin is generally regarded as the most secure distributed network. If an individual wanted to undermine the network or rewrite its transaction history, he or she would need enough computing power to out-hash the rest of the network and redo its proof-of-work, which is practically infeasible with today’s hardware (though some wonder whether quantum computing could one day change that).
Many are concerned about the future of Bitcoin if mining power remains concentrated in the hands of a few mining pools. If four mining pools control most of the hashing power, don’t they essentially control the future of the network? I can’t say for certain how Bitcoin will evolve, but there is evidence from Bitcoin’s past to suggest that miners will not dictate how the network changes. In November of 2017, Bitcoin was slated to undergo a massive hard fork known as SegWit2x, designed to upgrade its block size limit from 1MB to 2MB. The true motives for the update remain a hotly debated topic. Allegedly, the intent of the upgrade was to overcome the scalability problems associated with Bitcoin and allow faster payments. However, many believed it was a move by miners and large Bitcoin operations to subvert the network and profit by collecting more fees and selling more expensive equipment. Prior to the fork, the majority of miners signaled support for the update, but there was no support from Bitcoin Core developers and little from the community broadly. The Bitcoin community instead pushed for a UASF (User Activated Soft Fork), designed to activate SegWit (Segregated Witness) without SegWit2x’s condition that the block size must increase. With such low community support, SegWit2x failed and the UASF (BIP 148) succeeded, a major victory for node operators and currency hodlers.
Only time will tell if the approach Satoshi took will prove optimal in the long run, but so far his assumptions have been well supported. The network has been self-regulating, and the incentives have been aligned well enough that it has continued to gain adoption and use for over a decade. In the next article, I will explore how another veteran protocol, Factom, has approached decentralization and draw on some of the lessons to be learned. Until then, Happy New Year!