Healthcare

How Hospital Consolidation is Impacting Medical Supply Chains

By Connor Smith

Introduction

The U.S. healthcare system has undergone unprecedented consolidation over the past decade. The largest health system, HCA Healthcare, now has 214 member hospitals, and the 25 largest systems collectively manage almost 20% of all hospitals in the U.S. A recent study by Definitive Healthcare found 294 hospital mergers and acquisitions in 2019 alone, a slight decrease from 330 the year prior. Consolidation at this scale has a massive impact on medical supply chains. However, whether the financial impact is positive or negative remains hotly debated in the healthcare community. Some argue that consolidation allows providers to leverage economies of scale and reduce costs. Yet some evidence suggests otherwise: consolidation can actually increase the cost of care by up to 40%. In this article, I briefly explore the driving forces behind hospital consolidation, how it is affecting medical supply chains, and the role technology plays in managing these increasingly complex supply chains.

A Brief History of the Driving Forces Behind Hospital Consolidation

The passing of the Affordable Care Act (ACA) in 2010 was a momentous juncture in U.S. history, and arguably the most sweeping change to the nation’s healthcare system since the creation of Medicare and Medicaid. While the goal of the plan may have been to make healthcare more affordable and accessible to U.S. citizens, the bill incentivized large-scale consolidation across the entire industry that persists to this day. Providers became incentivized to shift care delivery to outpatient settings and work toward a value-based care model. As a result, health systems began rapidly acquiring and investing in facilities like ambulatory surgery centers, clinics, and physician groups. This vertical integration caused health systems to evolve from mere collections of hospitals into fully fledged Integrated Delivery Networks (IDNs) that provide and coordinate patient care across the continuum. Health systems also began consolidating horizontally, merging with other IDNs to improve access to capital, increase market share, and improve efficiencies. Constant external pressure to reduce costs and lower the price of treatment has exacerbated this trend and kept consolidation moving at an accelerated pace, since integrated provider networks can more easily share physicians across facilities to reduce overall headcount and negotiate bulk supply discounts.

Impact of Consolidation on Medical Supply Chains

In theory, these new ‘mega-systems’ should be able to use their economies of scale to procure supplies at a lower cost and coordinate care more efficiently, thereby reducing costs. Traditionally, hospitals have relied on Group Purchasing Organizations (GPOs) to leverage the aggregated buying power of multiple facilities and negotiate lower prices for medical supplies from manufacturers, wholesalers, and other vendors. As IDNs acquire more hospitals and grow in size, they can often negotiate near-equal prices with manufacturers without these middlemen and realize greater cost savings. In practice, however, savings from such mergers are often negligible, and mergers frequently cause supply chain costs to increase.

A study of 1,200 hospitals by researchers at the Wharton School found that the average supply chain savings for a target hospital in a merger of equal-sized systems was only about $176,000. Moreover, acquirers are often left spending more on their supply chain due to complexities that arise from an acquisition, like managing physician preference items (PPIs). The researchers found that while an acquirer saved an average of 6.4% on inexpensive commodities, these savings were more than offset by a 1.1% increase in costs relating to physician preference items; because PPIs account for a far larger share of hospital supply spending than cheap commodities, even a small percentage increase on PPIs outweighs the commodity savings. As more hospitals are integrated into an IDN, it becomes increasingly difficult to standardize purchasing decisions across a growing number of physicians unless the system has a robust digital infrastructure and inventory tracking system. Unfortunately, 64% of providers lack dedicated supply chain management systems and 80% still rely on manual processes in some capacity to manage supplies. Consequently, consolidation often leads to less efficient supply chains that procure similar products from a myriad of suppliers, lack transparency as to where products are located in the system, and hold excess stock that is wasted or expires.
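
To see why a small percentage increase on PPIs can swamp larger commodity savings, here is a back-of-the-envelope Python sketch. The spend figures are hypothetical assumptions for illustration; only the 6.4% and 1.1% rates come from the study cited above.

```python
# Hypothetical annual supply spend for an acquiring hospital (assumed figures;
# PPIs typically make up a much larger share of spend than cheap commodities).
commodity_spend = 5_000_000    # inexpensive commodities
ppi_spend = 40_000_000         # physician preference items

commodity_savings = commodity_spend * 0.064  # 6.4% savings (study figure)
ppi_increase = ppi_spend * 0.011             # 1.1% cost increase (study figure)

print(f"Commodity savings: ${commodity_savings:,.0f}")  # $320,000
print(f"PPI cost increase: ${ppi_increase:,.0f}")       # $440,000
print(f"Net change:        ${commodity_savings - ppi_increase:,.0f}")  # $-120,000
```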

How Technology Can Break the Cycle

Prior to the pandemic, consolidation was already among the top trends to watch in healthcare for 2020, and experts speculate that the financial effects of COVID-19 are likely to accelerate this phenomenon in the post-pandemic era. Barring drastic regulatory or political action, hospital consolidation is therefore unlikely to decelerate anytime soon. Hence, IDNs must leverage digital technologies if they want to properly manage their increasingly complex supply chains and control costs. One of the easiest solutions to implement is a supply chain management platform. Manual systems are inefficient and error-prone, whereas digital supply chain management systems enable automation, reduce errors, and can be coupled with advanced analytics to drive further efficiencies.

Using supply chain analytics, IDNs can track supply usage to reduce sources of waste like over-purchasing physician preference items and holding products until they expire. The majority of healthcare executives and supply chain managers agree that supply chain analytics could positively impact hospital costs and even improve hospital margins by over one percent. Considering that hospitals, on average, realize net revenues in excess of $300 million, a one percent improvement in operating margins can equate to millions of dollars saved. Moreover, these systems can be further integrated with technologies like RFID to enable real-time supply tracking within an IDN and help providers load-balance supplies across the hospitals in their system based on demand, realizing even greater cost savings.
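
To make the load-balancing idea concrete, here is a minimal Python sketch. The hospitals, stock levels, and demand forecasts are invented for illustration; a production system would pull these from an inventory platform and a demand model.

```python
# Move surplus stock from overstocked hospitals to those below forecast demand,
# rather than placing new orders or letting excess inventory expire.
inventory = {"Hospital A": 1200, "Hospital B": 300, "Hospital C": 150}
forecast_demand = {"Hospital A": 500, "Hospital B": 450, "Hospital C": 400}

surplus = {h: inventory[h] - forecast_demand[h] for h in inventory}
donors = {h: s for h, s in surplus.items() if s > 0}
receivers = {h: -s for h, s in surplus.items() if s < 0}

# Greedy matching: fill the largest shortfall from the largest surplus first.
transfers = []
for recv, need in sorted(receivers.items(), key=lambda kv: -kv[1]):
    for donor in sorted(donors, key=donors.get, reverse=True):
        qty = min(need, donors[donor])
        if qty > 0:
            transfers.append((donor, recv, qty))
            donors[donor] -= qty
            need -= qty

print(transfers)  # [('Hospital A', 'Hospital C', 250), ('Hospital A', 'Hospital B', 150)]
```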

Conclusion

I hope you enjoyed this article! If you are interested in ways healthcare supply chains are evolving, check out my series on the future of medical supply chains. Part one can be found here. Additionally, if you are looking for ways your medical supply chain can be improved, feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next time!

Healthcare

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part III

By Connor Smith

Hello everybody! Welcome to the conclusion of my series on how COVID-19 is changing medical supply chains. In Parts I and II, I discussed how the pandemic is forcing the U.S. medical supply chains of the future to embrace digitization and automation, rep-less sales models for high-cost implantable medical devices, and the influence that domestic manufacturing and value-based healthcare are having on their evolution. Before I conclude this series, I’ll add the caveat that medical supply chains will evolve in a myriad of ways far beyond the six I have addressed. These are simply what I believe are the most profound changes on the horizon for these complex systems. Now, let me dive into the two final transformative forces that will reshape U.S. medical supply chains in the post-pandemic era.

5. Sophisticated Risk Analysis & Disaster Planning Software

COVID-19 is far from the first environmental disruption global supply chains have faced in recent times. There was the SARS outbreak in 2003 and the Ebola outbreak in 2014, and natural disasters like earthquakes and hurricanes occur frequently and can have devastating economic impacts. In fact, as many as 85% of supply chains experience at least one major disruption per year. The reality of the modern era is that globalized supply chain networks allow us to consume goods and services at incredibly low prices, but they are also riddled with ‘Achilles heels’ that can shake global economies if perturbed.

As I mentioned in Part II, one way U.S.-based providers and manufacturers will look to mitigate such interruptions in the future is by ensuring geographic redundancy across supplier networks and logistics partners and by leveraging domestic manufacturing. While this approach will be one core component of future U.S. medical supply chain architecture, an overall proactive, strategic approach to supply chain design will be paramount. Hence, medical supply chains of the future will be designed using sophisticated risk analysis software that leverages predictive analytics and advanced simulations to model possible disruptions like pandemics and natural disasters before committing to suppliers, and to plan accordingly.

Academics have been developing operational supply chain risk analysis models that leverage AI and advanced computational methods for well over a decade, and supply chain experts across all industries have known the benefits of proactive risk mitigation for some time. A 2017 study conducted by Deloitte found that taking a proactive approach to supply chain risk management can save firms up to 50% when managing major disruptions. As medical supply chains become increasingly transparent and digitized, investments in proactive risk assessment technologies will be among the best returns on investment that medical supply chain management teams can make. Using tools like predictive analytics, supply chain teams will be able to quantify the risk of using a particular supplier by modeling scenarios like the spread of a virus to a particular region, the loss of a key manufacturing plant to a natural disaster, and more. Some researchers have already started developing simulation-based analyses of global supply chain disruptions resulting from the spread of COVID-19, and it is reasonable to expect that using such models for supply chain planning will become the norm.
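
As a toy example of what ‘quantifying supplier risk’ can look like, the Python sketch below runs a Monte Carlo simulation over an assumed supplier portfolio. The suppliers and disruption probabilities are invented; real models would be calibrated on historical and epidemiological data.

```python
import random

# Assumed annual disruption probabilities per supplier (pandemic, disaster, etc.)
suppliers = {"overseas_plant": 0.15, "domestic_plant": 0.05, "backup_vendor": 0.10}

def simulate_year() -> bool:
    """Return True if every supplier is disrupted in the same year (a stockout)."""
    return all(random.random() < p for p in suppliers.values())

trials = 100_000
stockouts = sum(simulate_year() for _ in range(trials))
print(f"Estimated total-disruption risk: {stockouts / trials:.4%}")
# Treating disruptions as independent, expect ~0.15 * 0.05 * 0.10 = 0.075%
```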

6. Integrating Population Health Analytics 

Originally posited by David Kindig and Greg Stoddart as “the health outcome of a group of individuals, including the distribution of such outcomes within the group”, population health has become a fairly nebulous term used to describe the overall health of patient populations. ‘Pop health’ research encompasses everything from studying how socioeconomic factors like income, race, or education level influence the prevalence of certain conditions, to the prevalence of various genes in communities and how that affects disease spread, and more. Depending on whom you ask, you will likely hear a different view of what population health is and what responsibility providers and manufacturers have in improving outcomes.

Regardless of your thoughts on what population health encompasses, as healthcare continues to evolve toward a value-based model, it is inevitable that population health will play an increasingly vital role in how providers deliver care. Value-based incentive structures are designed to drive healthcare toward what the Institute for Healthcare Improvement refers to as the ‘Triple Aim’: improving the patient care experience, improving the health of populations, and reducing the per capita cost of healthcare. An overview of the Triple Aim is pictured below.

Image Adapted from Health Catalyst

Prior to the pandemic, providers were starting to integrate population health initiatives with supply chain management to combat the increasing strain of over half of the U.S. adult population having at least one chronic disease. For example, Indiana-based Eskenazi Health extended its partnership with Meals on Wheels in 2017 to deliver healthy meals to patients’ homes after hospital discharge in an effort to reduce readmission rates. A recent analysis published by Providence Health System found that COVID-19 is accelerating the transition to a distributed care model in which patients receive personalized care from their homes or local clinics instead of at a health system’s facilities. The analysis found that the use of virtual care technologies, coupled with the desire to reduce unnecessary visits because of the pandemic, is forcing medical supply chains to become more personalized.

It is reasonable to suspect that the medical supply chains of the future will become increasingly patient-centric and account for the socioeconomic, genetic, and other key factors that influence a patient population’s well-being. The Internet of Medical Things market is growing at over 20% year over year, and that growth will likely accelerate given the rapid adoption of telehealth and remote patient monitoring technologies during the pandemic. These platforms will enable providers to gather more information about their patient populations than ever before and construct truly personalized care plans. Medical supply chain teams of the future will integrate this information with predictive analytics models to not only ensure that patients receive the best possible care, when and where they need it, at the lowest cost, but also to make cost-effective population-level interventions that improve overall societal health.

In Conclusion
I hope you enjoyed the final installment of my series on the future of U.S. medical supply chains! Interested in learning more or in ways your medical supply chain could be improved? Feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era.

Press Release

Consensus Networks Joins MATTER

South Bend, IN, Sep. 22, 2020 — Consensus Networks announces its admission into MATTER, a leading healthtech incubator based in Chicago, IL, to accelerate the commercialization of its HealthNet technology and reinvent medical supply chains for the post-COVID era.

“Being accepted into MATTER has been a complete game changer in our journey bringing HealthNet to market,” said Consensus Networks’ COO, Connor Smith. “Their team and network of healthcare experts and mentors have helped us tremendously. They understand the nuances and intricacies of the health IT landscape and have helped us fine-tune our value proposition, messaging, and offering for all of the different stakeholders. MATTER has gone above and beyond to connect us with the potential end users, advisors, and partners we need to successfully commercialize HealthNet, and we are excited to be a part of this vibrant community of healthtech innovators.”

MATTER, the premier healthcare incubator and innovation hub, includes hundreds of cutting-edge startups from around the world, working together with dozens of hospitals and health systems, universities and industry-leading companies to build the future of healthcare. Together, the MATTER community is accelerating innovation, advancing care and improving lives. For more information, visit matter.health and follow @MATTERhealth.

About Consensus Networks

Consensus Networks is an Indiana-based LLC that is reinventing medical supply chains for the post-COVID era. Founded in 2016 with the goal of creating trusted data environments to derive novel insights, Consensus enables clients to integrate clinical and supply chain data to optimize inventory management and improve patient outcomes. Consensus Networks uses serverless application architecture, blockchain technology, and predictive analytics to ensure the best materials are used to treat patients at the lowest cost.

Healthcare

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part II

By Connor Smith

Hello everybody! Welcome back to my series on how COVID-19 is forcing medical supply chains to change. In Part I, I discussed how the pandemic is forcing the medical supply chains of the future to embrace digitization, automation, and rep-less sales models for high-cost implantable medical devices. If you have not checked out the first part of this series, you may do so here. Without further delay, let’s dive into two other ways medical supply chains will evolve.

3. Building Resilience through On-Shoring Production

It’s no secret that China is the world’s largest producer and distributor of medical supplies. Prior to the pandemic, China manufactured 48% of the United States’ personal protective equipment (PPE) and over 90% of certain active pharmaceutical ingredients (APIs). When factoring in other foreign suppliers, the U.S. imports nearly $30 billion worth of medical supplies every year, or roughly 30% of all its medical equipment. While that share may sound manageable, the pandemic has illustrated the profoundly negative consequences of relying entirely on foreign sources for particular items. There are over 25 drug shortages related to COVID-19 (17 of which stem from increased demand), and critical shortages of PPE persist more than six months into the pandemic.

Aside from the public health consequences posed by such disruptions, failure to make domestic supply chains more resilient poses a national security threat. Accordingly, shoring up medical supply chains is a priority for both presidential candidates in the upcoming election. President Trump recently announced an executive order aimed at increasing U.S. domestic manufacturing and onshoring supply chains for pharmaceutical and medical supplies to protect against potential shortages. Similarly, Democratic presidential nominee Joe Biden put forth a five-page plan articulating actions he will take, if elected, to rebuild domestic manufacturing capacity and ensure U.S. medical supply chains remain resilient and geographically redundant in the future.

Likewise, the commercial sector has expressed similar sentiments. A recent study conducted by McKinsey & Company found that 93% of surveyed supply chain experts listed increasing resiliency across the chain as a top priority, with an emphasis on supplier redundancy and near-shoring production. The U.S. medical supply chains of the future will emphasize localization to mitigate as many disruptions as possible. Such regionalization will also make it easier to shorten the distance between suppliers and customers. As Brad Payne of PCI Pharma Services points out, “Shortening the distance in the supply chain expedites deliveries and lessens room for complicating factors, like customs clearance”.

Forward-looking supply chain leadership on both the provider and vendor sides will be investigating ways to leverage these new logistics and distribution paradigms to reduce costs and improve the quality of their services. For example, technologies like predictive analytics can be used to improve last-mile delivery for medical supply manufacturers. Other industries have leveraged logistics data analytics in this manner to reduce fuel costs and improve the quality and performance of their deliveries. Healthcare supply chains of the future will leverage these technologies and other tools like RFID to provide more efficient deliveries and optimize procurement strategies.

4. Value-Based Procurement Strategies

Historically, providers have been paid through a fee-for-service model in which they are reimbursed based on the number of services they render or procedures they order. This strategy made sense when it was instituted as the reimbursement mechanism for Medicare and Medicaid in 1965, but it has perversely affected the way healthcare is paid for today. Physicians are incentivized to provide as many billable services as possible and take a ‘defensive’ approach to healthcare, ordering procedures and tests just to be safe. Consequently, they may order unnecessary tests and procedures without hesitation, as neither they nor the patient are financially responsible, increasing the cost of care. Over the past decade, insurers have started implementing ‘value-based’ payment models that link reimbursement to patient outcomes in an effort to reduce the overall cost of care and improve patient outcomes. Early results suggest that value-based reimbursement can reduce the cost of claims by nearly 12% and improve chronic disease management. The benefits of value-based models by stakeholder may be seen below.

Image adapted from NEJM Catalyst

Prior to the pandemic, pure fee-for-service was expected to account for less than 26% of all reimbursements in the U.S. by 2021. Given the immense financial impact of the virus, the impetus to reduce costs by transitioning to value-based care has never been greater. Hence, value-based procurement, or ensuring that the right product is delivered to the right patient at the right time and at the lowest cost, will be the norm for medical supply chains of the future. As members of HIMSS have pointed out, the future of the healthcare supply chain will emphasize connecting cost, quality, and outcomes to help realize the Triple Aim and optimize provider performance.

Similar to the adoption of rep-less sales models I described in Part I, the foundational technology underlying value-based procurement will be clinical supply chain integration. Connecting supply chain and clinical data sources enables providers to redefine how they administer care. Once the data is integrated, predictive analytics can be used to guide procurement decisions based on the cost of a material and its outcome for the patient. Some hospitals have already saved over $15 million through clinical supply chain integration. Additionally, EHR integration with supply chain data allows providers to assess clinical variance in treatment in granular detail. Such functionality will be critical as providers look to minimize the impact of the pandemic, and it will continue to play a key role in how they operate into the future.
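
As a simple illustration of what procurement guided by integrated clinical and supply chain data might look like, the Python sketch below ranks products by outcome per dollar instead of unit cost alone. Product names, costs, and scores are hypothetical; ‘outcome_score’ stands in for a risk-adjusted clinical outcome measure derived from EHR data.

```python
products = [
    {"name": "Implant A", "unit_cost": 4200, "outcome_score": 0.92},
    {"name": "Implant B", "unit_cost": 5100, "outcome_score": 0.93},
    {"name": "Implant C", "unit_cost": 3800, "outcome_score": 0.78},
]

# Rank by clinical value per dollar rather than by unit cost alone.
for p in sorted(products, key=lambda p: p["outcome_score"] / p["unit_cost"], reverse=True):
    value = p["outcome_score"] / p["unit_cost"] * 1000
    print(f'{p["name"]}: {value:.3f} outcome per $1k')
# Implant A ranks first: neither the cheapest nor the priciest option wins.
```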

In Conclusion

I hope you enjoyed the second installment of my series on the future of U.S medical supply chains! I will be back next week with my final two insights as to what the future of these systems will look like. Interested in learning more or ways your medical supply chain could be improved? Feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next week!

Uncategorized

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part I

By Connor Smith

Without question, the dedication and tireless work ethic exhibited by medical first responders throughout these trying times has been nothing short of heroic. Unfortunately, despite (and partially because of) the valiant efforts on the ground, healthcare is among the industries hit hardest financially by the pandemic. A recent report by the American Hospital Association estimates the total financial impact of COVID-19 on the U.S. healthcare system from March 2020 to June 2020 at $202.6 billion. While a significant amount of this financial strain stems from treating patients infected with the virus and forgone revenue from elective procedures, many problems brought on by the virus could have been mitigated with more efficient and resilient U.S. medical supply chains.

While the magnitude of a global pandemic is something few could have properly anticipated, COVID-19 introduced few truly new problems for U.S. medical supply chains. Instead, it exacerbated problems and inefficiencies that have plagued the industry for years and have never been properly addressed. Yes, over 70% of active pharmaceutical ingredients supplying the U.S. healthcare market are imported and became constrained as a result of the pandemic, and unprecedented global surges in demand for PPE created massive shortages of basic medical equipment for U.S. frontline responders. However, siloed data, a lack of automation and interoperability, and a reactionary approach to purchasing contributed to upwards of $25 billion in wasted supply chain spending each year long before the pandemic. In fact, analysts project that 2020 will be the first year in which supply chain costs surpass labor as the largest cost component of delivering medical care.

Some are optimistic that a COVID-19 vaccine will be ready by the end of 2020, but this is far from guaranteed, and it could be a year or more before one becomes readily available. Consequently, the U.S. healthcare system must face the stark reality that it may continue to lose upwards of $20 billion per month for the foreseeable future. Major operational improvements must be made to its supply chains to help offset these costs.

The status quo of medical supply chain management is no longer tolerable, and inefficiencies that were previously ignored must be corrected. Medical supply chains need to be reinvented over the coming months if the U.S. healthcare system is to survive the pandemic and thrive into the future. This is the first of a three-part series exploring six ways U.S. medical supply chains will change in the post-pandemic era and the technologies and external forces that will enable them. So without further ado, let’s dive into the future of U.S. medical supply chain management!

1. Automation & Digitization

While this may sound obvious, over 80% of clinicians and hospital leaders still rely on manual processes for managing inventory. Consequently, health systems struggle to know what supplies they currently own, where those supplies are located, whether they have expired, and a myriad of other problems. Unfortunately, patients pay the largest price for these manual, inefficient systems. One survey found that 40% of clinicians have postponed a patient’s treatment and 23% know of adverse events occurring to patients because of inadequate inventory. Considering that some industries are well into the ‘Supply Chain 4.0’ era, the first seismic shift for medical supply chain and inventory management will be to go entirely digital and implement technologies already used in more mature supply chains like retail.

Regulatory pressure from the FDA’s unique device identification (UDI) requirements for medical devices and the Drug Supply Chain Security Act has already begun forcing some of this change, and major compliance requirements are going into effect over the coming months. Digitizing medical supply chains will not only significantly reduce the errors and inefficiencies that arise from manual entry, but also enable the use of technologies like AI and blockchain that can elicit even greater cost savings. For example, firms in other industries have seen cost savings of 32% across an entire operation by implementing AI-based inventory management systems. Such systems can improve inventory management with features like predictive forecasting and by optimizing the surplus inventory levels that should be maintained at all times. Other heavily regulated industries, like food, are experimenting with blockchain for product traceability in supply chains and estimate they can reduce compliance costs by 30% within a few years.
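
As one small example of the predictive forecasting such systems enable, here is a classic reorder-point calculation in Python. The usage history and lead time are assumed; AI-based platforms would replace the simple average with richer demand models.

```python
import statistics

daily_usage = [42, 38, 45, 51, 40, 39, 47, 44, 36, 48]  # units/day, assumed history
lead_time_days = 5          # supplier lead time, assumed
service_z = 1.65            # z-score for roughly a 95% service level

mean_use = statistics.mean(daily_usage)
std_use = statistics.stdev(daily_usage)

# Classic reorder point: expected demand over the lead time plus safety stock.
safety_stock = service_z * std_use * lead_time_days ** 0.5
reorder_point = mean_use * lead_time_days + safety_stock
print(f"Reorder when stock falls below {reorder_point:.0f} units")  # ~233 here
```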

Certainly, much foundational work must be done before medical supply chains can integrate these more advanced solutions. Implementing RFID chips to enable real-time asset tracking, achieving basic systems interoperability through data standardization, and switching to cloud-based healthcare inventory management software are among the first steps that must be taken. However, given the lack of digital infrastructure currently in place in many medical supply chains, there is an opportunity to ‘leapfrog’ legacy supply chain technology and implement cutting-edge solutions that realize even greater cost savings immediately. Forward-looking medical supply chain management teams will implement such solutions to ensure their supply chains remain resilient against future disruptions and pandemics.

2. Rep-less Medical Device Sales Models

Manufacturers of implantable medical devices have traditionally used an in-person sales model for distribution. Sales reps form close relationships with clinicians and are oftentimes even present in the OR during surgery. While this model has been standard practice for decades, it also increases the cost of delivering care tremendously. For example, in orthopedics, the sales process is estimated to account for 35-50% of the cost of sales for implantable devices. Moreover, this sales process makes it nearly impossible for device manufacturers to track their field inventory and for providers to manage their consignment inventory, resulting in further cost increases of up to 25%.

The pandemic has not only made these inefficiencies unbearable for providers but has also hindered the ability of manufacturers to sell their products at all. Whether through self-selected or provider-mandated deferral, 50% fewer patients have been receiving care compared to normal, and elective surgeries for implantable devices like knee replacements have dropped by 99%. Moreover, nearly 50% of orthopedic sales reps have been forced to support physicians through exclusively digital means or have been unable to provide support at all.

It is reasonable to suspect that providers will continue to limit who is allowed into a hospital at least until a vaccine is readily available, if not longer, and patients can only forgo necessary care for so long. Hence, providers and manufacturers alike will be required to implement technologies that make rep-less sales models attainable. One key technological enabler of this transition will be integrated data environments combining supply chain, medical IoT, and EHR data. Integrating supply chain data with EHR data was already a top priority for many providers entering 2020. Such environments will serve as the cornerstone for other tools, like video conferencing software and payment processing tools, that can enable a rep-less sales model and save providers millions of dollars per year.

In Conclusion
I hope you enjoyed the first part of my series on the future of U.S. medical supply chains! I will be back next week with two more insights regarding what the future of these complex systems will look like. If you are interested in learning more or in ways your medical supply chain could be improved, feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next week!

Uncategorized

Top Three Takeaways From Blockchain and Digital Transformation in Health in 2020

By Connor Smith

Healthcare is frequently mentioned on the shortlist of industries expected to be transformed by blockchain technology. Supporters of this assertion have a range of reasons for arguing that blockchain can improve healthcare, but most tend to revolve around improving health data security, immutable claims tracking, physician credentialing, or putting patients at the center of care by letting them take charge of their own data. However, there are many who argue that healthcare is not ready for blockchain, or that many of its theorized use cases could be better addressed with other technologies. There are good arguments on both sides, and, considering the complexity of healthcare and the nascency of blockchain, it can be difficult to discern which projects are ‘hype’ and which can actually drive meaningful impact.

We at Consensus Networks are bullish on the potential of blockchain in healthcare, but we also pride ourselves on taking a pragmatic view of projects to realistically assess what is feasible and what is not. This past week, we were fortunate enough to attend the inaugural Blockchain and Digital Transformation in Health in 2020 summit in Austin, TX, where our CEO Nathan Miller presented on our work with HealthNet and on developing highly secure information architectures in a regulated environment. The conference was hosted by the Austin Blockchain Collective in conjunction with the Dell Medical School at the University of Texas at Austin. There were presentations from industry and academia alike, accompanied by open discourse about the state of blockchain in healthcare, what is actually feasible, and the path forward for the technology as healthcare begins its digital transformation. It was a great event with high-quality information and a pragmatic assessment of the state of the industry, and we’re here to share our top three takeaways from the event!

1.) Blockchain EMRs Are Not Ready… At Least Not Yet in the U.S.

Throughout the short lifespan of blockchain projects in healthcare, there have been several attempts at a blockchain-based electronic medical record (EMR) that is owned by a patient and shared with providers as needed, the most popular of which is probably Medicalchain. Medical records hold a wealth of information about an individual, containing everything from a person’s medical history to their demographic, identity, and insurance information. To date, however, medical records have been largely owned and controlled by the health systems in which they reside. Aside from issues of data sovereignty and controlling who has access to that information, isolated data silos have a decidedly negative impact on patient outcomes, especially in the U.S. Competing EMR systems are incapable of communicating well with one another. Thus, if a patient visits multiple providers that use different EMR systems, the data for those visits will likely never be aggregated into a single, cohesive record, instead remaining as isolated fragments. This makes it nearly impossible for a provider to know what care a patient has previously received, and it leads to billions of dollars wasted on redundant testing and unnecessary procedures, or, in the worst scenarios, to patient death from improper care.

A blockchain-based EMR would enable patients to own their own medical records, which they would likely hold in some form of mobile application. A patient could then control who has access to their record and have the guarantee that a provider is seeing the most up-to-date version, as any changes would be reflected in that copy immediately. All transactions would be immutably recorded on a blockchain, and once the visit was finished the patient could revoke the physician’s access. Conceptually, such a notion sounds appealing. However, one of the biggest takeaways from the conference was that such a future is far off in the U.S. and requires a societal shift and a fundamental rethinking of data ownership to get there.

Dr. Aman Quadri, CEO of AMSYS Blockchain and AMCHART, was one of the speakers at the event. The product he is building, AMCHART, is a blockchain-based EMR currently undergoing testing in India, and even he was skeptical of its prospects in the U.S. Dr. Quadri said that AMCHART has started seeing adoption in India because people there already have a mindset of data ownership: they take responsibility for their data, so a platform like AMCHART extends their current capabilities in a beneficial way. For AMCHART to have impact in the U.S., he said, patients and health systems alike would have to change how they view and handle data before there could be a marked increase in value to patient care. American patients have been conditioned for decades to blindly trust medical providers with their data, so shifting that view will be no easy task.

2.) Use Cases Are Emerging Around Care Coordination, Identity, and Data Sharing

The projects being spearheaded by the talented Dell Medical School faculty, visiting academics, and industry representatives in attendance covered a wide range of applications in healthcare, spanning both population health and the clinical setting. While the individual problems these solutions address vary, the common thread among most of them was that they centered on care coordination, identity, and data sharing. The consensus seemed to be that blockchain could help lay the foundation for a web of trusted access to data with the patient at the center of care.

Dr. Timothy Mercer, a faculty member at Dell Medical School and a practicing physician, is exploring ways blockchain could help address homelessness in Austin. His research found that one of the biggest problems for Austin’s homeless population is the lack of any legal means of proving their identity. As a result, they must frequently repeat the process of proving who they are, which can take weeks to months and delays physicians from providing critical care. If the documents are lost or stolen, the process must start all over again. Partly as a result, the average age of death for the chronically homeless is 52-56, nearly 20 years below average. Dr. Mercer is exploring how blockchain-based digital identities could ease this burden and accelerate time to care for homeless persons in Austin. The homeless care ecosystem involves many different organizations, all of which must properly authenticate an individual before they can legally administer care. Using a blockchain-based identity application, these caregivers could verify an individual through digital documents linked to the patient’s digital identity in a web or mobile application and legally provide the care he or she needs. This would ultimately place the homeless person at the center of care and alleviate the inefficiencies pervasive in the current continuum of care for this population.

Image Adapted From Change Healthcare

Another interesting application highlighted at the event was Tribe Health Solutions’ use of blockchain in a medical imaging management solution designed for patients. Using blockchain technology and the InterPlanetary File System (IPFS), they created a platform where patients can store medical imaging data on a distributed file system and grant providers access when needed. After care is administered, the patient can revoke access to the image and ensure that only trusted providers can view it. This solution aims to help patients overcome many of the problems associated with receiving care for skeletal or muscular injuries. With such tears or breaks, patients often seek multiple opinions and are forced either to manage a physical copy of the medical image themselves or to wait days to weeks for the file to be transferred to the next provider. This not only delays the time it takes to receive care and begin recovery but, in the worst cases, can lead to a worsening of the condition. Putting the patient in charge of the imaging data allows him or her to determine who can view the image and when, ultimately reducing the time it takes to receive treatment.
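
A minimal sketch of the grant/revoke pattern described above might look like the following Python. The data model is assumed for illustration (it is not Tribe Health Solutions’ actual implementation): the image itself lives on IPFS, while a registry maps its content hash to the providers the patient currently authorizes.

```python
# Registry mapping an image's IPFS content hash to authorized provider IDs.
access_registry: dict[str, set[str]] = {}

def grant(image_cid: str, provider_id: str) -> None:
    access_registry.setdefault(image_cid, set()).add(provider_id)

def revoke(image_cid: str, provider_id: str) -> None:
    access_registry.get(image_cid, set()).discard(provider_id)

def can_view(image_cid: str, provider_id: str) -> bool:
    return provider_id in access_registry.get(image_cid, set())

cid = "QmExampleImageHash"  # placeholder content identifier
grant(cid, "dr_first_opinion")
grant(cid, "dr_second_opinion")
revoke(cid, "dr_first_opinion")            # care finished: patient revokes access
print(can_view(cid, "dr_second_opinion"))  # True
```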

3.) Blockchain Projects Must Start Small in Healthcare and Involve Many Stakeholders

Image Adapted from AdvaMed

Lastly, perhaps the most prevailing takeaway from the conference was that blockchain projects looking to tackle problems in healthcare need to start small and involve as many stakeholders as possible from the onset. Healthcare is a highly complex industry where ‘moving fast’ and ‘breaking things’ can have significant ramifications for patients, especially considering that the industry is only now beginning its digital transformation. Dr. Anjum Khurshid of the Dell Medical School at the University of Texas at Austin and a director of the Austin Blockchain Collective is leading research on a patient credentialing platform called MediLinker. When pressed about extending the platform into an EMR-type technology that could exchange clinical data, Dr. Khurshid cautioned that it is more important to start small and clinically validate each stage of the product. In an industry like healthcare, which handles high-fidelity information and is typically averse to new technologies, he said it is important to demonstrate the value of the technology and make it more approachable. The problems in the current healthcare system are so vast that even simple solutions can have a massive impact, and it is imperative to validate the benefit of the technology to patients, providers, and payers alike at each step. Any truly innovative and sweeping changes that blockchain brings to healthcare will require all of these parties to work together and identify applications that drive meaningful value for everyone involved. Healthcare is changing rapidly, and only by taking small, incremental steps will blockchain be able to integrate with this complex, multi-stakeholder ecosystem.

That’s all for this week! We at Consensus Networks are grateful to have attended this conference and are excited about the work going on in Austin to advance the industry. We are continuing the development and commercialization of our population health data sharing network, HealthNet, as well as smart tools for analyzing health data. If you are interested in learning more about HealthNet or have an idea for a new digital healthcare application you’d like to build, contact one of our experts here today!

Blockchain Learning, Proof of Stake

Celo’s Approach to Decentralization

By Connor Smith

Note: This is the final installment of a series exploring different approaches that blockchain networks have taken to achieve decentralization. Part 1 introduced decentralization and the interplay it has with certain aspects of crypto networks like governance, incentives, and network architecture. If you missed that article, I highly recommend going back and reading it here. The remaining articles were examinations of decentralization on Bitcoin, Factom, Cosmos, Terra, and Polkadot/Kusama. If you missed those and would like to read them before we dive into Celo, you may do so here, here, here, here, and here, respectively.

Hey everyone, thank you for joining me again as our tour through varying approaches to decentralization comes to a conclusion. From the first crypto network, Bitcoin, to the new generation of ‘meta-networks’ like Polkadot and Cosmos, we have seen quite a few ways networks have attempted to decentralize and how that has influenced their design. We have seen how factors like application design criteria and consensus architecture (i.e., proof-of-work vs. proof-of-stake) influence the decentralization of a network’s resources and participants. Moreover, by taking a chronological approach to the networks examined throughout this series, we have traced the evolution of the crypto industry over the better part of a decade, and we end with the youngest protocol we have seen thus far: Celo. Aiming to overcome the price volatility and ease-of-use problems associated with many cryptocurrencies, Celo seeks to bring financial infrastructure and tools to anyone with a smartphone.

To reach this goal, Celo is taking a full-stack approach and introducing innovations at the networking and application layers with technologies like a lightweight address-based encryption scheme, an ecosystem of stable-value tokens, and novel system architectures. The project is backed by some of the most prolific crypto investment partners and Silicon Valley tech titans, including a16z, Polychain Capital, Coinbase, Reid Hoffman, and Jack Dorsey. The protocol has been making waves over the last few months due to its demanding incentivized testnet competition, known as the Great Celo Stake Off, the rigor of which was recounted two weeks ago by our CEO Nate Miller. The Stake Off is entering its third and final phase and the mainnet is slated to launch later this year, making 2020 a big year for the protocol. So without further ado, let’s dive into Celo!

So What is Celo and How Does it Work?

Celo launched with the mission of building a financial system capable of bringing conditions of prosperity to everyone with a smartphone. To create such an environment, Celo has proposed a theory of change built on three verticals: satisfying people’s basic needs, like access to food and healthcare; enabling individuals’ growth potential; and increasing people’s social support for one another. All aspects of the protocol, from its architectural decisions to its development efforts and even its technical and community projects, support activities tied to these three conditions. Work on the project began in the summer of 2018, when entrepreneurs turned GoDaddy executives Rene Reinsberg and Marek Olszewski raised their initial seed round of $6.5 MM from some of the Valley’s top venture capitalists. The pair had sold their prior company, Locu, to GoDaddy in 2014 for $70 MM and had since served as vice presidents in the restaurant and small business division of the firm. Armed with little more than a white paper at the time, the team got to work, and in less than a year the first public testnet was released. Celo aims to achieve its mission of being a mobile-first payments platform through two features: mapping users’ public phone numbers to the alphanumeric string (public key) needed to transact on the network, and using a network of fiat-pegged stablecoins backed by a reserve of cryptocurrencies to minimize price volatility.

The team believes that creating a price-stable, more user-friendly transaction experience is the only way a cryptocurrency payments solution will see success, and it has thus sought to redefine the entire blockchain networking stack to optimize for these characteristics. Hence, Celo is a mobile-first solution operating as a proof-of-stake smart contract platform based on Ethereum and composed of three components: lightweight identity for a better user experience, a stability mechanism for stable-value currencies, and systems for incentives and governance to ensure platform stability. An image of the Celo stack may be viewed below.

Celo’s lightweight identity system utilizes a variant of identity-based encryption known as address-based encryption to overcome the user experience issues traditionally associated with transacting cryptocurrencies. Instead of the canonical flow of downloading a wallet, generating a public/private key pair, and providing whoever is sending you crypto with a long hexadecimal address, Celo’s address-based encryption ties a user’s phone number directly to a Celo wallet. This allows the phone number, rather than the actual Celo address, to be used when a payment is initiated, simplifying the payment process. Additionally, only a cryptographic hash of the phone number is stored on the blockchain to preserve privacy. Celo also allows a user to link multiple phone numbers to his or her wallet address to protect against losing a phone or changing numbers. On top of this infrastructure, Celo utilizes a social reputation mapping algorithm known as EigenTrust. While the technical underpinnings of the algorithm are fairly complex, it functions similarly to Google’s PageRank algorithm but is designed for decentralized systems. In short, the algorithm defines a given phone number’s relative reputation score based on the number of phone numbers connected with and trusting that phone number, coupled with the weighted reputation of those connections.
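
To give a flavor of the idea, here is a toy power-iteration sketch of EigenTrust-style reputation in Python. The trust matrix is invented, and this is a simplification of the real algorithm (which also handles pre-trusted peers and malicious collectives), not Celo’s implementation.

```python
# trust[i][j] = how much node i trusts node j; each row is normalized to sum to 1.
trust = [
    [0.0, 0.5, 0.5],
    [1.0, 0.0, 0.0],
    [0.5, 0.5, 0.0],
]

n = len(trust)
rep = [1.0 / n] * n  # start from uniform reputation

# Power iteration: a node's reputation is the trust-weighted sum of its
# endorsers' reputations; repeat until the vector converges.
for _ in range(50):
    rep = [sum(rep[i] * trust[i][j] for i in range(n)) for j in range(n)]

print([round(r, 3) for r in rep])  # [0.444, 0.333, 0.222] for this toy matrix
```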

Similar to Terra’s approach to creating a stablecoin network, Celo is also a network of stablecoins pegged to real-world fiat currencies that uses a seigniorage-based approach to maintain stability. For the sake of brevity, I am going to gloss over what stablecoins and seigniorage are, as I discussed them at length in my post on Terra, and instead dive into how they work in the context of Celo. Celo is designed to support an ecosystem of pegged stable currencies alongside the native token of the protocol, Celo Gold (cGold). The first, and currently only, stablecoin on the network is the Celo Dollar (cUSD), which is pegged to the price of the U.S. dollar. cGold has a fixed supply and is held in a reserve contract, where it is used to expand or contract the supply of cUSD to maintain a stable price through seigniorage. The network relies on a number of external oracles to provide feeds of the cGold price in USD, and it allows users to exchange a dollar’s worth of cGold for one cUSD and vice versa. When the market price of cUSD rises above the $1 peg, arbitrageurs can profit by purchasing a dollar’s worth of cGold, exchanging it with the protocol for one cUSD, and selling that cUSD at the market price. Conversely, if the price of cUSD falls below the peg, arbitrageurs can profit by purchasing cUSD at the market price, exchanging it with the protocol for a dollar’s worth of cGold, and then selling the cGold on the market. Thus, network participants are able to profit in nearly any market condition, and their arbitrage pushes the price back toward the peg. A more thorough examination of how Celo’s stability mechanism works may be found here.
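
The arbitrage loop can be summarized in a few lines of Python. This is a stylized sketch of the mechanism described above (the function name and prices are illustrative); the protocol’s invariant is simply that it always exchanges $1.00 worth of cGold for 1 cUSD.

```python
def arbitrage_opportunity(cusd_market_price: float) -> str:
    """Describe the profitable direction given cUSD's market price in USD."""
    if cusd_market_price > 1.0:
        # Buy $1 of cGold, mint 1 cUSD via the protocol, sell cUSD at market.
        return f"Mint and sell: ~${cusd_market_price - 1.0:.2f} profit per cUSD"
    if cusd_market_price < 1.0:
        # Buy cUSD at market, redeem with the protocol for $1 worth of cGold.
        return f"Buy and redeem: ~${1.0 - cusd_market_price:.2f} profit per cUSD"
    return "At peg: no arbitrage"

print(arbitrage_opportunity(1.03))  # minting expands supply, pushing price down
print(arbitrage_opportunity(0.97))  # redeeming contracts supply, pushing price up
```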

Celo goes a step further to ensure stability, implementing a constant-product market-maker model to prevent the cGold reserve from becoming overly depleted when the cGold price supplied by the oracles does not match the market price. The mechanism dynamically adjusts the offered exchange rate in response to exchange activity and initializes a new constant-product market maker for trading cGold and cUSD whenever the oracle price of cGold is updated. Hence, if the oracle price is correct, the exchange rate determined by the constant-product market maker will match the market rate and no arbitrage opportunity will exist. However, if the oracle price is incorrect, the rates will differ, and an arbitrage opportunity will exist until users exploit it enough to adjust the quoted exchange rate and erase the opportunity.
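
For intuition, here is a minimal constant-product (x · y = k) sketch in Python. The bucket sizes are assumed, and the real mechanism resets the buckets each time the oracle reports a new cGold price; the point is that large trades move the quoted rate away from spot, which limits reserve depletion when the oracle is wrong.

```python
cgold_bucket = 10_000.0   # assumed cGold in the market maker
cusd_bucket = 20_000.0    # assumed cUSD (implies a 2.0 cUSD/cGold spot quote)
k = cgold_bucket * cusd_bucket

def sell_cgold(amount: float) -> float:
    """Sell cGold into the market maker; return the cUSD received."""
    global cgold_bucket, cusd_bucket
    cgold_bucket += amount
    new_cusd = k / cgold_bucket     # preserve the x * y = k invariant
    out = cusd_bucket - new_cusd
    cusd_bucket = new_cusd
    return out

print(f"Selling 100 cGold yields {sell_cgold(100):.2f} cUSD")
# ~198.02 cUSD, slightly below the 2.0 spot quote: bigger trades move the price more.
```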

Celo’s Network Architecture

Image adapted from celo-org medium here

Consistent with its full-stack approach to creating a mobile-first financial system, Celo implements a novel tiered network architecture to optimize the end-user experience and maximize physical decentralization. As in other Byzantine Fault Tolerant (BFT) proof-of-stake networks we have seen so far, like Cosmos, the core network participant responsible for producing blocks and verifying transactions is the validator. Unlike other proof-of-stake networks that encourage anyone willing to run a validator to do so, Celo encourages only professional node operators to run validators on the network. For example, Celo strongly encourages running validators in a secure data center and has been auditing validators participating in the Stake Off to verify that this is the case. By maintaining a secure set of globally distributed validators, the network hopes to maximize security, performance, and stability. Celo has also implemented safeguards against any single validator or organization gathering a disproportionate amount of the network’s resources by introducing validator groups. Instead of individual validators being elected to participate in consensus, validator groups comprising a collection of individual validators are elected, and their members then produce blocks on the group’s behalf. The actual election process and underlying mechanisms are far more involved, so if you are interested in learning more, check out this blog post from our CEO, Nate Miller, which explains the process in more detail. Validator groups have their own unique identity and a fixed size, making it difficult for a single organization to manage multiple groups and consolidate disproportionate influence, thus improving decentralization.

While the ability to run a validator is restricted to professional node operators, there are two other tiers of nodes that anyone can run on the Celo network: a full node and a light client. The Celo application/wallet has a light client embedded within it that is optimized for mobile devices, so anyone running the software on their phone is running a light client. The requests exchanged across these light clients (i.e., sending and receiving transactions on the network) must be processed by full nodes, which receive a transaction fee for facilitating each transaction. People running full nodes can set a minimum service fee for processing transactions from light clients and refuse to perform the service if the fee they would collect is insufficient. The eventual goal of the protocol is for light clients to automatically choose full nodes to peer with based on cost, latency, and reliability. However, much fundamental network infrastructure must be laid down before this is achievable. An eventual flow of what this will look like, including validators, may be viewed below.

Image adapted from celo-org medium here

So How Does Governance Work on Celo?

At a high level, Celo has a governance model similar to many other proof-of-stake networks, in which the weight of a particular user’s vote is proportional to the amount of cGold they have staked and the duration of their stake. Celo also supports on-chain governance to manage and upgrade all aspects of the protocol, including upgrading smart contracts, adding new currencies, and modifying the reserve’s target asset allocation. Changes are currently made through a governance-specific smart contract that acts as the overseer for modifications to other smart contracts throughout the network. The eventual goal of the protocol is to transition from this smart contract structure to a Decentralized Autonomous Organization (DAO) owned and managed by cGold holders. This could function similarly to how MakerDAO operates; however, it is far too early to speculate on how a Celo DAO would actually function. For more information on what a DAO is or how they work, click here.

Any network participant is eligible to submit a proposal to the governance smart contract, so long as he or she is willing to lock a portion of cGold along with it. A proposal consists of a timestamp and the information needed to execute the operation code should it be accepted by the network. Submitted proposals are added to a proposal queue for up to one week, during which cGold holders vote on them in hopes of advancing them to the approval phase; every holder of locked cGold may vote for one queued proposal per day. The top three proposals each day advance to the approval phase, at which point the original proposers may reclaim their locked cGold commitment. Approved proposals then enter the referendum phase, where users may vote ‘yes’, ‘no’, or ‘abstain’, with the weight of each vote tied to the voter’s locked cGold commitment. While yet to be implemented, Celo also intends to incorporate an adaptive quorum biasing component, like the one we observed in Polkadot, to accurately account for voter participation.
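
A rough sketch of the stake- and duration-weighted voting described above might look like this Python snippet. The weighting function and figures are hypothetical assumptions for illustration; Celo’s actual locked-gold vote-weight formula may differ.

```python
def vote_weight(locked_cgold: float, lock_days: int, max_lock_days: int = 365) -> float:
    """Weight grows with stake size and with commitment duration (assumed formula)."""
    duration_factor = min(lock_days, max_lock_days) / max_lock_days
    return locked_cgold * duration_factor

votes = [  # (choice, locked cGold, lock duration in days) -- example data
    ("yes", 5_000, 365),
    ("no", 12_000, 30),
    ("yes", 2_000, 180),
]

tally = {"yes": 0.0, "no": 0.0, "abstain": 0.0}
for choice, stake, days in votes:
    tally[choice] += vote_weight(stake, days)

print(tally)  # {'yes': 5986.3..., 'no': 986.3..., 'abstain': 0.0}
```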

So How Decentralized is Celo?

Image adapted from celo medium account here

As I mentioned earlier, Celo has yet to launch its mainnet, so this discussion is framed in the context of what has transpired throughout the ongoing incentivized testnet, Baklava. As of the time of writing, there are 75 different validator operators participating in the third phase of the Great Stake Off, with 152 validators in total and 88 validator groups. Moreover, Celo is debating expanding the active set of validators on the network upon the conclusion of the Stake Off. The active set is currently 100 validators, and the original plan was to wind the number of validator slots down to 80 before mainnet launch. However, Celo recently announced that it now plans to expand the active set to 120, so long as scale testing shows this is permissible given the active engagement validators have shown throughout the Stake Off. Considering that Celo intends to allow validator nodes to be run only by professional service providers, this is a major step toward decentralizing the network and ensuring a globally dispersed, resilient system.

When examining the allocation of resources across the Celo network, there is a noticeable disparity between the highest-ranked participants and those at the bottom. For example, the top elected validator group has nearly 1.1 MM votes, whereas the lowest elected validator group has only slightly over 200K. Additionally, the top elected group has 5 different participants, whereas the bottom elected group has only one. This illustrates that, on Celo, it is the validator group that matters, not the individual validator. The largest cGold holder within the top elected validator group has only 194K locked cGold, meaning that every member of that group holds less locked cGold than the sole participant in the bottom group. Yet, because the group collectively is the highest-voted group, its participants are more likely to participate in consensus and gather rewards. Metrics relating to the decentralization of full nodes and light clients on Celo are not readily available since the network is still in the very early development stages. Consequently, it is difficult to quantify the degree of decentralization of these layers of the network. The Celo Wallet application is available for the Alfajores testnet on both the Apple App Store and the Google Play Store, with over 100 downloads for the latter. This suggests that there are at least 100+ light clients on the non-incentivized testnet, Alfajores.

That’s all! I hope you have enjoyed this case study approach to decentralization as much as I have. With the last phase of the Baklava incentivized testnet coming to a close within the next few weeks, mainnet launch slated for later this year, and the protocol’s recent announcement of Celo Camp to incubate and assist startups building on the platform, it is certainly an exciting time to be involved with Celo. The Great Celo Stake Off has been no walk in the park, but it has certainly stress-tested the network both technically and from an incentives standpoint. Excluding some economic barriers to entry for new validators attempting to enter the active set, it appears that Celo’s approach to decentralization has achieved its goal, at least physically. It will be interesting to see if this continues once true economic conditions are introduced on mainnet, but I am optimistic about the future of the network. If you are interested in seeing if Celo is the right blockchain for your application, running a Celo cluster, or learning how staking on Celo works, contact one of our LedgerOps experts here. We have been involved with the protocol throughout the entire incentivized testnet and are currently in the second-highest voted group (Chainflow-Validator-Group), so we are highly familiar with the protocol. Thanks again and take care!

Blockchain Learning, IoT, Proof of Stake

IoT: The Problem Blockchain has Been Looking For?

By Connor R. Smith, originally published March 22, 2019

Despite blockchain having existed for over a decade now, few definitive uses have been proven outside of digital currencies. There have been many experiments to apply these technologies in areas like supply chain, healthcare, real estate, and even tipping people for their tweets or for watching online ads, but there has yet to be one vertical that has been radically transformed by them. Many feel that this is because the technology is not yet mature enough, or because of a general lack of understanding. Certainly, a lackluster user experience and insufficient education play a part, but others have started to argue that blockchain is a solution searching for a problem that may or may not even exist. It seems like new articles surface weekly about startups raising millions of dollars promising to solve some largely nebulous problem using “Blockchain + IoT, or AI, or Drones, or all of the above…”. At Consensus Networks, we’re focused on finding and supporting protocols that are technically sound and addressing real-world use cases. One area we have been particularly excited about lately is the ability of blockchain to secure internet of things (IoT) data.

In 2018, there were over 17 Billion connected devices around the world, 7 Billion of which were IoT enabled. These numbers are projected to double or triple over the next 6 years. IoT devices communicate with one another by gathering and exchanging data through sensors embedded in the device, enabling greater automation and efficiency. Devices that are seemingly unrelated can communicate with one another, driving the convergence of many verticals ranging from smart homes, cities, and cars to medical and industrial IoT. For example, IoT-enabled machines in a manufacturing plant could communicate information regarding system health and other mechanical data via a centralized platform. Plant operators could then take corrective action before a malfunction occurs, easily conduct more preventative and informed maintenance, and more accurately predict production rates. In fact, studies have found that over 90% of tech, media, & telecommunications executives feel that IoT is critical to nearly all of their business units and will drive the greatest business transformation over the next 3 years. 

Now you’re probably thinking, “Okay, so if IoT can fix all of the world’s problems, why do we need to add blockchain too?”. IoT may be a powerful emerging force, but it has some critical flaws. While IoT devices are great for communicating streams of data and supporting real-time device monitoring, they often have extremely poor endpoint security. For example, in 2017 the FDA had to recall over 500,000 internet-enabled pacemakers after finding vulnerabilities that allowed hackers to gain control of the device. Beyond healthcare, IoT data privacy and security issues are an even greater concern when considering the connected future of autonomous vehicles, homes, and smart cities. Another shortcoming of current IoT networks lies in their scalability. Conventional IoT network architectures are centralized, with the network of devices sending data into the cloud, where it is processed and sent back to the devices. Considering the deluge of IoT devices projected to enter the market, scaling this infrastructure will be highly difficult and will expose vulnerabilities that hackers can use to compromise the network and access your data.

Fortunately, integrating blockchain technology with IoT networks provides a path forward to overcome the scalability, privacy, and security issues facing IoT today and accelerate the adoption of both technologies. As opposed to having a centralized system with a single point of failure, a distributed system of devices could communicate in a trusted, peer-to-peer manner using blockchain technology. Structuring the network in this manner means that it would have no single point of failure, so even if a device were compromised the remaining nodes would remain operable. Moreover, smart contracts could be integrated with the network to enable IoT devices to function securely and autonomously without the need for third-party oversight. Consequently, blockchain-enabled IoT networks could exhibit greater scalability, security, and autonomy simply by modifying their current network architecture and implementing a more decentralized approach.

However, perhaps the most important benefit blockchain provides IoT networks comes from its cryptographic security. Sharing data across a cryptographically secured network makes it far less susceptible to hackers by helping to obfuscate where data is flowing, what is being exchanged, and which devices are transacting on the network. Whereas security in modern IoT networks was added as an afterthought, encryption and cryptographic keys are a core component of blockchain technology. Moreover, some networks are beginning to incorporate zero-knowledge proofs, which means that network security for IoT devices could be bolstered even further.

Image adapted from Lukas Shor here

The underlying mathematics and mechanics of zero-knowledge proofs are highly complex, but they essentially allow one party to prove to another that a piece of information is true without revealing what the information is or how they know it to be true. In the context of IoT, this means that a network of IoT devices could share data in total anonymity and with complete privacy. No information regarding the transaction would be revealed other than proofs verifying that the network knows it is legitimate. Thus, the network maintains complete functionality while preserving maximum security. Regardless of whether a blockchain-enabled network of IoT devices utilizes zero-knowledge proofs or not, simply utilizing a shared, encrypted ledger of agreed-upon data can provide many security benefits in IoT networks.
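To make the idea less abstract, here is a toy non-interactive Schnorr proof of knowledge in Python: the prover demonstrates knowledge of a secret exponent without ever revealing it. This is a generic textbook construction with deliberately tiny, insecure demo parameters, not the proof system any particular IoT network actually uses.

```python
import hashlib
import secrets

# Toy Schnorr proof of knowledge (made non-interactive via Fiat-Shamir).
# Insecure demo parameters: p = 2q + 1 with q prime; g generates the
# order-q subgroup. Real systems use large, vetted groups or curves.
q, p, g = 11, 23, 4

def prove(x: int) -> tuple[int, int, int]:
    """Prover knows secret x with public key y = g^x mod p."""
    y = pow(g, x, p)
    k = secrets.randbelow(q)                              # random nonce
    r = pow(g, k, p)                                      # commitment
    c = int(hashlib.sha256(f"{r}:{y}".encode()).hexdigest(), 16) % q
    s = (k + c * x) % q                                   # response; leaks nothing about x
    return y, r, s

def verify(y: int, r: int, s: int) -> bool:
    c = int(hashlib.sha256(f"{r}:{y}".encode()).hexdigest(), 16) % q
    return pow(g, s, p) == (r * pow(y, c, p)) % p         # g^s == r * y^c mod p

y, r, s = prove(x=7)
print(verify(y, r, s))  # True: the verifier learns that x is known, never x itself
```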


While there have been several projects that have attempted to tackle IoT and blockchain, one that we are excited to support is IoTeX. Founded by a team of cryptography and computer science experts in 2017, IoTeX is a privacy- and security-centric blockchain protocol that aims to create a decentralized network designed specifically for IoT devices. IoTeX uses a network architecture consisting of blockchains within blockchains, where a root chain manages many different subchains. Designing the network in this manner allows IoT devices that share an environment or function to transact with increased privacy, and poses no risk to the root chain if a subchain is compromised.

Aside from enhanced privacy and security, this design allows for greater scalability and interoperability as subchains can transact with the root chain directly or across the root chain to other subchains. IoT devices on the IoTeX network are also able to transfer data with one another in total privacy through the incorporation of lightweight stealth addresses, constant ring signatures, and bulletproofs. IoTeX also incorporates a Randomized Delegated Proof of Stake (RDPoS) mechanism for achieving consensus that they refer to as Roll-DPoS. Using this mechanism, nodes on the IoTeX network can arrive at consensus much faster with instant finality and low compute cost, making it much more friendly to IoT devices. Moreover, the IoTeX team recently released their first hardware product that leverages their blockchain network, Ucam. Ucam is a home security camera that writes data it records directly to the IoTeX blockchain, preventing it from being accessed by device manufacturers or sold to third parties like Google or Amazon. Ucam guarantees absolute privacy and provides users with secure blockchain identities which they can use to control their data.

Image adapted from VentureBeat here
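As a rough illustration of the randomized delegated proof-of-stake idea described above, the sketch below ranks delegate candidates by votes and then randomly samples a smaller active committee each epoch. The pool size, committee size, seeding, and names are illustrative guesses on my part, not IoTeX's actual parameters or implementation.

```python
import random

# Illustrative Roll-DPoS-style committee selection: rank candidates by
# delegated votes, keep a fixed-size pool, then randomly sample the
# active consensus committee for each epoch. All numbers are arbitrary.
candidates = {f"delegate-{i}": random.randint(1, 100_000) for i in range(36)}

def select_committee(epoch: int, pool_size: int = 24, committee_size: int = 11) -> list[str]:
    pool = sorted(candidates, key=candidates.get, reverse=True)[:pool_size]
    rng = random.Random(epoch)  # stand-in for an unbiased on-chain random beacon
    return rng.sample(pool, committee_size)

print(select_committee(epoch=42))  # the delegates producing blocks this epoch
```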

Thanks for reading! More articles are to come regarding use cases for IoT and blockchain and what the marriage of these two technologies might look like for Web 3.0 and Industry 4.0. Let us know what you think and find us on Twitter or Discord if there are any questions or areas you’d like us to explore! If you’re interested in finding out more about IoTeX, Ucam, or how blockchain can improve your IoT solution, feel free to contact one of our LedgerOps experts here. We have supported IoTeX for nearly a year now and have been running a delegate node on their mainnet since genesis. Needless to say, we are highly familiar with the protocol and eager to see if IoTeX or any of our other blockchain network services are a good fit for your IoT application!

Blockchain Learning, Proof of Stake

Kusama & Polkadot’s Approach to Decentralization

By Connor Smith

Note: This is the sixth installment of a series detailing different approaches that blockchain networks have taken to decentralize their network. Part 1 introduced the concept of decentralization and the interplay it has with certain aspects of crypto networks like governance, incentives, and network architecture. If you missed that article I highly recommend going back and reading it here. The subsequent articles have examined the decentralization of Bitcoin, Factom, Cosmos, & Terra. If you missed those and would like to go back and read them before we dive into Kusama & Polkadot, you may do so, here, here, here, & here respectively.

Hey everybody, and thank you for joining me again as we continue our examination of the varying approaches to decentralization throughout the crypto ecosystem. Two weeks ago we examined Cosmos’ approach to decentralization and the unique challenges the protocol faces in its quest to build the internet of blockchains, and then spent last week studying one of the blockchains building on its platform, Terra. This week we will examine Polkadot, another protocol attempting to build a network of interconnected blockchains. Yet, it is taking a radically different approach from what we observed with Cosmos. However, as Polkadot has not yet launched its mainnet, much of the discussion will be framed through the lens of what has ensued on its ‘Canary Network’, Kusama. If you are wondering what a canary network is or why we will be examining two protocols in this article, don’t worry, I promise I will address these concerns in due time. However, to better discern how Polkadot differs from Cosmos and its approach to decentralization, it is crucial to first understand the history of the protocol and the institutions leading its development.

So What is Polkadot and Where Does Kusama Come In?

The Polkadot whitepaper was released in 2016 by Dr. Gavin Wood, co-founder of Ethereum and founder of blockchain development powerhouse Parity Technologies. Designed to connect all types of blockchains (private, consortium, public, & permissionless) and technologies, Polkadot aims to serve as the backbone of an ecosystem for independent blockchains to seamlessly transact between one another in a trustless manner and enable the decentralized internet at scale. Gavin and Parity Technologies are veterans of the crypto industry and have been instrumental in the evolution of blockchain technology. Many people tend to regard fellow Ethereum co-founder and figurehead of the protocol, Vitalik Buterin, as the leader of the blockchain revolution and the one who brought stateful programming to blockchains for applications outside of digital money. These assertions are well justified, seeing as he authored the Ethereum whitepaper and has led the direction of the protocol’s development since its inception. However, many of Ethereum’s core components and widely used technologies are the work of Gavin and Parity. For example, during Gavin’s time with Ethereum, he created the Solidity smart contract language that powers all of the dApps running on the network and was responsible for the network’s first functional release in 2014. Parity Technologies is also the lead developer and, until recently, maintainer of the popular Parity Ethereum client, an ultra-efficient alternative to the popular geth node run by many Ethereum developers that powers an estimated 20% of Ethereum nodes and portions of Infura, the open node cluster used by many developers that processes 13 billion requests a day.

Needless to say, Gavin & Parity have been instrumental in shaping the decentralized web and blockchain thus far. Many of the protocols that followed Ethereum have attempted to build upon or adapt its concepts in some way, borrowing from the innovations that these two produced. However, throughout all of the work Gavin & Parity performed for Ethereum, they began to notice that most approaches to blockchain networks were not practical in terms of scalability or extensibility, as a result of inefficiently designed consensus architectures. Hence, Polkadot was proposed as a heterogeneous multi-chain framework that would allow many different blockchains, irrespective of consensus mechanism, to be interoperable with one another and overcome the following five shortcomings of conventional crypto networks: scalability, isolatability, developability, governance, & applicability. If you are curious as to how Polkadot defines each of these, check out the whitepaper here. Similar to Cosmos, Polkadot’s heterogeneous multi-chain architecture is hyper-focused on addressing the scalability and isolatability problems, believing that if these two are adequately addressed the remaining ones will reap tangible benefits and see improvement as well.

Shortly thereafter, Gavin, in conjunction with Robert Habermeier & Peter Czaban of the Web3 Foundation, officially founded Polkadot and commenced R&D on the ambitious effort. The Web3 Foundation is a Swiss-based organization founded with the intent of supporting and nurturing a user-friendly decentralized web where users own their own data and can exchange it without relying on centralized entities. The foundation conducts research on decentralized web technologies and supports different projects building them, Polkadot being the first. In May of 2018 the initial proof-of-concept for Polkadot was released as a testnet, and three subsequent iterations that integrated additional features were released in less than a year. Testnets provide an excellent proving ground for networks to work out any technical bugs that could occur at scale before a mainnet launch.

Starting with Cosmos’s Game of Stakes, the idea of using incentivized testnets to entice developers to truly stress test a network before mainnet launch has become largely canonical as the step preceding the launch of any proof-of-stake network. Polkadot took this a step further and released Kusama, an early, unaudited release of Polkadot that serves as the experimental proving ground for the network. Affectionately referred to as ‘Polkadot’s Wild Cousin’, Kusama is Polkadot’s ‘canary’ network, or a highly experimental reflection of what the production version of Polkadot will be like. Kusama allows developers to test governance, staking, and more in an authentic environment with real economic conditions. Thus, developers and those validating on the network can be adequately forewarned of any potential issues that may transpire on Polkadot and correct them before a mainnet deployment. Kusama differs from a traditional testnet in that it is an entirely separate network from Polkadot, with its own token (KSM), and is run by the community. It will exist in perpetuity so long as the community supports it and is not inherently tied to Polkadot aside from inheriting its design and functionality.

So How Do These Heterogeneous Multi-Chain Networks Work?

There are three fundamental components that comprise the architecture of the Polkadot ecosystem: the relay chain, parachains, & bridges. For those of you who have been following along in this series, each of these pieces is largely analogous to the hub, zone, & pegzone concepts described in my Decentralization of Cosmos article. Parachains, or parallelizable chains, are the individual, customized blockchains built on top of Polkadot that gather and process transactions within their own network. All computations performed on a parachain are independent from the rest of the Polkadot ecosystem. Thus, parachains can implement data storage and transaction operations in a manner most befitting the problem they are trying to solve, without being tethered to the technical underpinnings of another protocol like its scripting language or virtual machine. Parachains are then connected to the relay chain, i.e., Polkadot, which coordinates consensus and relays transactions of any data type between all of the chains on the network. Lastly, bridge chains are a specialized permutation of a parachain that link to protocols with their own consensus, like Ethereum or Bitcoin, and communicate with them without being secured by the Polkadot relay chain. An image of how these pieces all fit together may be viewed below:

Image adapted from https://polkadot.network/technology. Pink denotes the relay chain, orange the parachains, and blue a bridge chain

Designing the network in this manner has several key benefits, namely high security and near-infinite scalability. Polkadot pools all of the security from the relay chain and the parachains building on top of it, irrespective of consensus mechanism, and then shares that aggregated security across the entire network. The relay chain provides a ground source of truth for the network by handling transactions and arriving at consensus, but any computation performed on the network can be scaled out in parallel across the appropriate parachains. Moreover, parachains can be attached to other parachains to create highly distributed networks for processing transactions. This allows the transaction volume of the network to be scaled out immensely without placing a crippling burden on the relay chain itself, while allowing it to maintain the same level of security. Each parachain can maintain its own notion of validity for the transactions it processes and seamlessly disseminate that information to other parachains via the relay chain, and the network as a whole can then arrive at consensus.

However, this is only feasible with participation from the following core network stakeholders: validators, collators, and nominators. Similar to other proof-of-stake networks, validators are the node operators responsible for verifying transactions on the network and producing blocks for the Polkadot blockchain. Likewise, nominators are those who elect validators into the active set on their behalf by staking with them in exchange for a portion of their block rewards. The new player in this ecosystem is the collator, who is responsible for consolidating the transactions on the respective parachain they monitor into blocks and proposing proofs of those blocks to the validators. This eases the technical burden on validators by allowing them to verify only candidate blocks from parachains, as opposed to processing and verifying thousands of parallel transactions. Hence, the relay chain can arrive at consensus in seconds as opposed to minutes and maintain the security offered by a highly decentralized network. Collators can also act as ‘fishermen’ who are rewarded for identifying parties on the network acting maliciously. An image depicting how all of these stakeholders interact across the different network layers may be viewed below:
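A heavily simplified sketch of that division of labor, under my own assumptions about the data involved: a collator bundles its parachain's transactions into a candidate and hands validators a compact commitment to check, so validators never re-process the raw transaction stream. Real Polkadot validity checking is far more sophisticated than the bare hashes used here.

```python
import hashlib
from dataclasses import dataclass

def h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

@dataclass
class Candidate:
    parachain_id: int
    block_hash: str  # hash of the full parachain block the collator built
    proof: str       # stand-in for the validity proof a collator attaches

def collate(parachain_id: int, txs: list[bytes]) -> Candidate:
    """Collator: gather parachain transactions into a block and produce
    a compact commitment for validators to check."""
    block = b"".join(txs)
    return Candidate(parachain_id, h(block), proof=h(b"proof:" + block))

def validate(c: Candidate, txs: list[bytes]) -> bool:
    """Validator: re-derive the commitments instead of re-executing everything."""
    block = b"".join(txs)
    return c.block_hash == h(block) and c.proof == h(b"proof:" + block)

txs = [b"alice->bob:5", b"carol->dan:2"]
candidate = collate(parachain_id=7, txs=txs)
print(validate(candidate, txs))  # True
```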

It is important to note that I am simplifying significant portions of how Polkadot works at the technical level. The project is highly complex, with a myriad of intricate components at each layer that would take far too long to detail in a single article. For example, Polkadot uses a novel proof-of-stake consensus design known as GRANDPA (GHOST-based Recursive ANcestor Deriving Prefix Agreement) that separates block production from block finality, allowing blocks to be finalized almost immediately. For more on GRANDPA check out this article, and if you are interested in learning more about the underlying technology of the network, check out the whitepaper here.

Governance on Polkadot

Similar to other proof-of-stake networks, Polkadot’s governance hinges on the idea of stake-weighted voting, where all proposed changes require a stake-weighted majority of DOTs (or KSM on Kusama) in order to be agreed upon. However, Polkadot also incorporates a tiered governance structure and unique voting mechanisms in an attempt to decentralize power and governing authority on the network. Anyone holding the protocol’s native currency, DOTs, has the ability to directly participate in governance on the network. They can do everything from vote on proposals brought forth by the community, to nominate validators to participate in the network, to prioritize which referenda are voted upon, and more. Governance itself is completely dissociated from validating on the network, aside from the fact that validators can use their DOTs to vote as described above.

Polkadot also has a Council that will range in size from 6 to 24 members and has prioritized voting rights. Any DOT holder is eligible to run for the Council, and members are elected by the community in hopes that they will propose referenda that are sensible and benefit the network as a whole. In addition to preferred voting rights, council members have the ability to veto incoming proposals if they believe they are harmful to the protocol. However, after a cool-down period, the proposal may be resubmitted and, if the council member who vetoed it originally is still present, he or she will be unable to do so again. To protect against council members becoming negligent in their duties or abusing their governing power, members are elected on a rolling basis, with the term of each council member being equal to the size of the council times two weeks. An illustration of this may be viewed below.

To account for the fact that full community participation in voting on any referendum is unlikely, Polkadot implements what is known as Adaptive Quorum Biasing to change the supermajority required for a proposal to pass based on the percentage of voter turnout. Consequently, when voter turnout is low, a heavy supermajority of ‘aye’ votes is required for a referendum to pass, or a heavy supermajority of ‘nay’ votes is required to reject it. Yet, as voter turnout approaches 100%, the system adapts and only a simple majority either way is required, to account for the greater number of total votes. DOT holders’ votes are also weighted proportionally based on the amount of DOT they own and the amount of time they choose to lock those tokens for after the referendum has ended. For example, any DOT holder voting on a proposal must lock their DOTs for at least 4 weeks, but they can instead choose to lock them for up to 64 weeks to place a greater weight on their vote. All voting also occurs on-chain, so any approved proposals have a direct and immediate effect on how the network behaves.
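A small sketch of both mechanisms, with the caveat that the exact curves are assumptions on my part: the turnout-bias rule below follows the square-root form Polkadot's documentation describes for 'positive turnout bias', and the linear lock-duration multiplier is purely illustrative.

```python
from math import sqrt

def passes_positive_turnout_bias(approve: float, against: float, electorate: float) -> bool:
    """At low turnout a heavy 'aye' supermajority is required; as turnout
    approaches the full electorate, the bar relaxes toward a simple majority."""
    turnout = approve + against
    return against / sqrt(turnout) < approve / sqrt(electorate)

def vote_weight(dots: float, lock_weeks: int) -> float:
    """Illustrative conviction weighting: a 4-week lock counts at face value,
    with longer locks scaling the vote up (the multiplier here is a guess)."""
    return dots * (lock_weeks / 4)

# The same 60/40 split passes at full turnout but fails at 10% turnout:
print(passes_positive_turnout_bias(60, 40, electorate=100))  # True
print(passes_positive_turnout_bias(6, 4, electorate=100))    # False
print(vote_weight(100, lock_weeks=64))                       # 1600.0
```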

So How Decentralized is Polkadot?

As mentioned earlier, Polkadot has yet to launch its mainnet, so this discussion will be framed through the context of Kusama. As of the writing of this article, there are over 300 nodes supporting the Kusama network and 160 active validators distributed around the world. Moreover, there is currently a proposal up for election to increase the active set from 160 to 180 validators, with significant support from the community, suggesting that the network will become even more decentralized in the near future. The Council has 13 members with a combined backing of over 1 MM KSM from 264 voters. Of the 8.380 MM KSM issued so far, 2.641 MM, or 31.51%, is staked across the active set of validators. Similar to the other proof-of-stake networks we have observed so far, the top 10 validators control a significant portion of the staked KSM on the network, albeit far less than that of networks like Cosmos and Terra. Of the 2.641 MM KSM staked on the network, only about 18% of it resides within the top 10 validators by amount staked. This is all the more impressive when considering that governance is completely decoupled from validating on the network. The KSM held by the top 10 validators by stake amounts to only roughly 5% of the total possible voting power on the network.

Given the scope of Polkadot to not only serve as a network of heterogeneous multi-chains, but also as a platform for connecting private and public blockchains and creating a truly decentralized web, having a sufficiently decentralized network in all aspects (architecturally, economically, and from a governance perspective) will be hyper-critical to its success. If how decentralization on Kusama has materialized is an adequate proxy, then the future looks exceedingly bright for Polkadot. However, it is difficult to tell how this level of decentralization will carry over from Kusama to Polkadot. Kusama was launched to test out the different technical parameters of Polkadot and simulate what a live environment would be like for those validating on Kusama. Consequently, it has been an exceedingly open community that encourages people to participate, which has likely led to the magnitude of decentralization observed on the network. Considering that 50% of the genesis allocation of Polkadot was already distributed via a token presale that occurred over two years ago, it is difficult to say with certainty that this level of decentralization will occur on Polkadot come mainnet launch. While many of those on Kusama have been involved with the project for a long time and intend on participating in Polkadot, the crypto world has evolved tremendously over the last two years. Therefore, there is some inherent possibility that the old money that entered two years ago has very different interests than the newcomers who have become involved since. However, the team behind the project is a force to be reckoned with in the crypto world that has worked hard to make Polkadot a reality, and the community grows more and more every day, so I’m optimistic that its launch this year will exhibit decentralization in a manner more aligned with how Kusama has evolved.

That’s all for this week, I hope you enjoyed the article! I know we unpacked quite a bit of information here, but, as I said, Polkadot is one of the most technically advanced protocols making waves right now and I really just scratched its surface. If you’re interested in learning more about Polkadot or Kusama, seeing if it’s the right fit for your application, or want to get involved in staking, feel free to reach out to us at Consensus Networks! We are actively involved in the community, have run validators for both Kusama and the current Polkadot testnet (Alexander) for some time, and are gearing up for Polkadot mainnet, so we are highly familiar with the protocol. Contact one of our LedgerOps experts here with any questions you may have about the network and we will get back to you as soon as we can. We are excited for the future of Polkadot and the impact it could have on the decentralized web, and eager to help you access the network. Thanks again for reading and tune in next week as I conclude my examination of decentralization with Celo.

Blockchain Learning, Proof of Stake

Terra’s Approach to Decentralization

By Connor Smith

Note: This is the fifth installment of a series detailing different approaches that blockchain networks have taken to decentralize their network. Part 1 introduced the concept of decentralization and the interplay it has with certain aspects of crypto networks like governance, incentives, and network architecture. If you missed that article I highly recommend going back and reading it here. The subsequent articles have examined the decentralization of Bitcoin, Factom, & Cosmos. If you missed those and would like to go back and read them before we dive into Terra, you may do so, here, here, & here respectively.

Hey everybody, and welcome back! Last week, we dove into our first proof-of-stake network, Cosmos, and analyzed how it has approached decentralization in its quest to build the ‘Internet of Blockchains’. In addition to assessing how decentralization has worked thus far on Cosmos, we also got into the nuts and bolts of how the underlying technologies supporting its ecosystem of interconnected, application-specific blockchains (Tendermint BFT, ABCI, & IBC) work, and the modular network design of Cosmos with hubs and zones. This week, I will be picking up right where we left off and analyzing one of the networks building on top of Cosmos that is seeing major real-world use and adoption, Terra.

Terra aims to provide a price-stable cryptocurrency, built on top of the Cosmos SDK, that will function as the infrastructure layer for decentralized financial applications. The protocol utilizes an elastic monetary supply that allows both a stable price and the censorship-resistant capabilities of Bitcoin to be maintained, enabling it to be used in everyday transactions. Currently, Terra is the backend technology powering CHAI, a mobile payments application that allows users to link their bank accounts and participate in e-commerce using Terra’s currency, receiving discounts in exchange.

Terra is currently backed by South Korean internet giant Kakao and integrated with over 10 e-commerce platforms in Southeast Asia. The platform has seen rapid growth since launching last April, having just reached over 1,000,000 users last week, and continues to grow. With so many cryptocurrencies struggling to break into the commercial sector, and seemingly every new year being the year we will finally start to see adoption, this is certainly no trivial feat. So now, without further ado, let’s dive into Terra and see how this network has approached decentralization on its quest to become the largest payments platform in Asia!

So What is Terra and How Does it Work?

Before we dive into the technical underpinnings of Terra, the problem it solves, and its approach to doing so, it will help to first have some context regarding the rather unconventional background of its founders and some of the history leading up to its launch. Work on the project commenced in April of 2018, led by co-founders Daniel Shin and Do Kwon. Kwon had previously worked as a software engineer at Apple and Microsoft, in addition to being founder and CEO of a startup called Anyfi that used peer-to-peer mesh networks to try to create a new, decentralized internet. Shin was a successful serial entrepreneur, having built and sold multiple e-commerce companies in East Asia, and, at the time, was CEO of his most recent startup, TicketMonster, the leading e-commerce platform in Korea. Leveraging their extensive backgrounds in e-commerce and distributed systems, the pair sought to create a modern financial system, built on top of a blockchain, that could be used by people to make everyday payments. The two believed that the major roadblocks to adopting cryptocurrencies largely stemmed from the extreme price volatility and lack of a clear path to adoption that most networks exhibited. Thus, they designed Terra to be a price-stable, growth-driven cryptocurrency that was focused on real-world adoption from day one. Leveraging Shin’s deep connections in e-commerce, they formed a consortium of e-commerce companies known as the Terra Alliance. Within a few months of launching, 15 Asian e-commerce platforms had joined, representing a total of $25 Billion in annual transaction volume and 40 million customers. This coincided with a $32 million seed round the team raised from some of the world’s largest crypto exchanges and top crypto investment firms. With a war chest of funding in place and real-world partnerships aligned, the team was off to the races as they started building the project and integrating it with e-commerce platforms.

Seeing as Bitcoin launched over a decade ago as a peer-to-peer electronic cash system, you may be wondering why a protocol like Terra was still tackling the issue of digital payments. This is largely because Bitcoin and other cryptocurrencies exhibit significant price volatility, making consumers nervous about whether the currency will maintain its value when they try to transact with it later. For perspective, crypto markets can fluctuate 10% or more in either direction on any given day, and in late 2017 Bitcoin was trading at nearly $20,000/BTC and less than two months later was trading between $6,000 and $9,000 (at the time of writing this article Bitcoin is trading at $8,415.65). Terra was far from being the first or only project to realize that price volatility posed a significant barrier to crypto adoption. Attempts at creating a price-stable cryptocurrency, or stablecoin, date back as far as 2014 with BitShares, and have proliferated at a momentous rate in the wake of the volatility exhibited in the last crypto bull market in late 2017.

Stablecoins are exactly what their name suggests: cryptocurrencies designed to be highly price-stable with respect to some reference point or asset while maintaining the three classic functions of money: a store of value, a unit of account, and a medium of exchange. While this sounds fairly intuitive and straightforward, the engineering behind these instruments is quite difficult, with no agreed-upon approach. Cornell University attempted to codify the different approaches some networks are taking and put forth a paper classifying the different design frameworks for stablecoins, which can be found here. The finer nuances and mechanics of each approach exceed the scope of this article, but the study revealed that most stablecoins maintain their price using one of the following mechanisms: a reserve of pegged or collateralized assets, a dual-coin design, or an algorithmic approach.

Maintaining a reserve of a pegged or collateralized asset allows the organization controlling the stablecoin to hold its price by incentivizing users to expand or contract the supply until it returns to its pegged price. Users are able to earn money through arbitrage by expanding the supply when the price is high and redeeming when it is low, until the opportunity disappears and the price has equilibrated. The dual-coin approach is where a network implements a two-token system in which one coin is designed to absorb the volatility of the first through a process known as seigniorage: the secondary coin is auctioned in exchange for the stablecoin if it dips below the peg, and the proceeds are burned to contract the supply and stabilize the price. Conversely, if the price of the stablecoin is above that of the peg, new coins are minted to those holding the secondary coin to expand the supply and level the price. Lastly, the algorithmic approach uses complex algorithms and quantitative financial techniques to adjust the currency price as needed without any backing of pegged or collateralized assets. Hence, it behaves analogously to a traditional cryptocurrency in the sense that a user’s balance and outstanding payments vary proportionately with changes in the market cap of the coin, but it provides a more stable unit of account.

Terra utilizes a dual-coin approach in which the transactional currency, Terra, represents an ecosystem of cryptocurrencies pegged to real currencies like USD, EUR, KRW, and the IMF SDR, and Luna is the secondary coin that absorbs the volatility. All of the Terra sub-currencies (TerraKRW, TerraUSD, etc.) can be swapped between one another instantly at the effective exchange rate for that currency pair, allowing the network to maintain high liquidity. Since the prices of these fiat currencies are not natively known to the blockchain, a network of decentralized price oracles is used to approximate the true value of the exchange. Oracles, in this context, are essentially trusted data sources that broadcast pricing data generated from currency exchanges onto the network. They vote on what they believe the true price of the fiat currencies to be and, so long as they are within one standard deviation of the true price, are rewarded in some amount of Terra for their service. Should the price of Terra deviate from its peg, the money supply is contracted or expanded as needed using a seigniorage method similar to that described above. Hence, oracles mining Terra transactions absorb the short-term costs of contracting the supply and gain from increased mining rewards in the mid to long term.
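Here is a minimal sketch of the oracle rule just described, assuming votes are combined with a plain mean and standard deviation; the live protocol weighs votes by stake, so treat this purely as an illustration.

```python
from statistics import mean, stdev

def reward_oracles(votes: dict[str, float], reward: float) -> dict[str, float]:
    """Toy version of the oracle rule described above: votes within one
    standard deviation of the consensus price split the reward.
    (Terra's actual rule is stake-weighted; this is a simplification.)"""
    consensus = mean(votes.values())
    sigma = stdev(votes.values())
    winners = [o for o, v in votes.items() if abs(v - consensus) <= sigma]
    return {o: reward / len(winners) for o in winners}

votes = {"oracle-a": 1.001, "oracle-b": 0.998, "oracle-c": 1.002, "oracle-d": 1.250}
print(reward_oracles(votes, reward=100.0))  # oracle-d's outlier vote earns nothing
```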

Luna and the Critical Role it Has in the Tokenomics & Governance of Terra

However, since Terra is a proof-of-stake network, oracles must have stake in the network in order to be able to mine Terra transactions. This is where the second token, Luna, comes in. Luna is the native currency of the protocol that represents the mining power of the network, and is what miners stake in order to be elected to produce blocks. Luna also plays a critical role in defending against Terra price fluctuations by allowing the system to make a market for Terra, agreeing to be the counterparty for anyone looking to swap Terra and Luna at the target exchange rate. In other words, if the price of TerraSDR << 1 SDR, arbitrageurs can send 1 TerraSDR to the system for 1 SDR’s worth of Luna, and vice versa. Thus, miners can benefit financially from risk-free arbitrage opportunities and the network is able to maintain an equilibrium around the target exchange rate of Terra irrespective of market conditions. Luna is also minted to match offers for Terra, allowing any volatility in the price of Terra to be absorbed from Terra into the Luna supply. In addition to the transaction fees validators collect from producing blocks, the network will also automatically scale seigniorage by burning Luna as demand for Terra increases. As Luna is burned, mining power becomes scarcer and the price of Luna should theoretically increase. This scales with the transaction volume and demand on the network, allowing miners to earn predictable rewards in all economic conditions.
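The mint-and-burn mechanics lend themselves to a short sketch. The one below is a toy model under my own simplifications (global supplies and a single exogenous Luna price); it is meant only to show how redeeming Terra for a peg's worth of Luna contracts the Terra supply, and how the reverse expands it.

```python
# Toy model of the dual-coin peg mechanic described above: the system
# always trades 1 TerraSDR for 1 SDR's worth of Luna, so arbitrageurs
# contract or expand the Terra supply until the market price returns
# to the peg. All prices and supplies are illustrative.
terra_supply = 1_000_000.0
luna_supply = 500_000.0

def burn_terra_for_luna(amount_terra: float, luna_price_sdr: float) -> float:
    """Terra below peg: redeem each TerraSDR for 1 SDR of newly minted Luna."""
    global terra_supply, luna_supply
    terra_supply -= amount_terra            # contract the Terra supply
    minted = amount_terra / luna_price_sdr  # 1 SDR worth of Luna per Terra
    luna_supply += minted                   # Luna absorbs the volatility
    return minted

def burn_luna_for_terra(amount_luna: float, luna_price_sdr: float) -> float:
    """Terra above peg: burn Luna to mint 1 TerraSDR per SDR of Luna."""
    global terra_supply, luna_supply
    luna_supply -= amount_luna
    minted = amount_luna * luna_price_sdr
    terra_supply += minted                  # expand the Terra supply
    return minted

# TerraSDR trading at 0.98 SDR: buy cheap Terra, redeem it for 1 SDR of
# Luna, pocketing ~2% until the contraction pushes the price back to peg.
print(burn_terra_for_luna(10_000, luna_price_sdr=2.0), terra_supply)
```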

In the most recent update to the protocol (Columbus-3) on December 13, 2019, Luna gained even more utility in the Terra ecosystem by allowing its holders to participate in on-chain governance. Luna holders can now submit proposals for parameter or monetary policy changes to the network, as well as make general text proposals and request funds from the community pool (a portion of the seigniorage tokens available to fund community initiatives). If a proposal receives a supermajority of supporting votes, the proposal will be ratified and the changes made. Not only does this extend the functionality of Luna, it also opens up the base of individuals who can actively participate in the network. Before the Columbus-3 update, Luna only added value to miners on the network, but now anyone can purchase Luna and use it to participate in governance. Moreover, Luna transactions are tax-free, so it is even easier for non-miners to acquire Luna and participate on the network.

Terra also has a governmental body known as the Treasury that is designed to allocate resources from seigniorage to decentralized applications (dApps) being built on top of the platform. After registering as an entity on the Terra network, a dApp can make a proposal to the Treasury for funding, and Luna validators may then vote on whether to accept or reject the application based on its economic activity and use of funding. Should the application receive more than ⅓ of the total available Luna validating power, it will be accepted and the Treasury will allow the dApp to open an account and receive funding based on the proportional vote it received from Luna validators. The Treasury ultimately determines how funds are allocated to dApps, but, if the community feels a firm is not delivering results, validators can vote to blacklist the dApp. Ultimately, this tiered governance structure is designed to provide Luna holders with a way to determine which proposals and organizations receive funding, based on the highest net impact they will have on the Terra economy.
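The acceptance rule reduces to a couple of lines of arithmetic. Here is a sketch under the assumptions above (a one-third threshold and funding proportional to vote share); the function and its parameters are mine, not Terra's.

```python
def treasury_decision(votes_for: float, total_luna_power: float, pool: float) -> float:
    """Sketch of the Treasury rule described above: a dApp is funded only
    if it wins more than one third of total Luna validating power, and
    its grant scales with the share of the vote it received."""
    share = votes_for / total_luna_power
    if share <= 1 / 3:
        return 0.0          # application rejected
    return pool * share     # funding proportional to validator support

print(treasury_decision(votes_for=450_000, total_luna_power=1_000_000, pool=50_000.0))
# 22500.0 -- 45% support clears the one-third bar and earns 45% of the pool
```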

So How Decentralized is Terra?

As of the writing of this article, there are 61 validators located around the world on the Terra Columbus-3 mainnet (we at Consensus Networks are on this list and have been active validators since Terra first launched this past April!). While only about ⅓ the number of validators on its parent network, Cosmos, this is still a fairly impressive degree of physical decentralization considering that Terra underwent a fully decentralized launch and has been concentrating on integrating with e-commerce platforms exclusively in Southeast Asia. However, the top 9 validators currently control 62.8% of the voting power on the network. So, similar to what we observed last week with Cosmos, a very small handful of network participants control the majority of economic and governance resources.

However, it is less clear whether this centralization of resources has consequences as significant on a growth-focused stablecoin network like Terra. For example, Seoul National University’s blockchain research group, Decipher, conducted an in-depth study on Terra that concluded it exhibits much greater price stability than other popular stablecoins like USDC or USDT. Terra has also on-boarded 14 online e-commerce platforms and over 1,000,000 users onto its payments application, CHAI, resulting in over $130 million being processed by the network to date. They have also begun expanding outside of South Korea into areas like Mongolia and Singapore. Given that Terra’s mission was to be a price-stable cryptocurrency with a clear path to market, it objectively appears that they have been successful in their goal thus far (especially when considering that the mainnet has been live for less than a year). With validators receiving rewards in two forms (transaction fees and Luna burn), Terra has created a rewards structure that is predictable for validators under all economic conditions, giving them little to gain from colluding in an attempt to undermine the network.

Yet, the recent additions of on-chain governance in Columbus-3, Luna being listed on more exchanges, and Luna receiving tax-exempt status on transactions introduce new layers of complexity to the Terra ecosystem that could pose a threat to the decentralization of the network. Now, anyone can vote on proposals that affect Terra’s future trajectory at both a governance and a functional level. When considering that proposals on the network require a supermajority of votes to pass, the risk of collusion between a handful of parties controlling most of the resources now poses a much greater threat. For example, if the top 9 validators were to collude and try to pass a proposal that benefited them at the expense of the network, they would only need to acquire roughly 4% more of the voting power to reach the supermajority needed to approve it and change the protocol at a functional level. Additionally, given Terra’s adoption-driven growth model, there is now a whole new range of stakeholders that must be factored into the ecosystem, like e-commerce platforms and users of Terra-based applications. While it is still unclear how this will evolve over time, effectively anticipating and designing for these new dynamics is one of the primary focus areas of the team moving forward, as can be seen here

Given the major shifts in how the protocol operates and the massive influx of new stakeholders, it is far too early to speculate on how Terra’s approach to decentralization will evolve into the future. Regardless, the fact remains that Terra’s adoption-driven approach to growth has made it one of the few cryptocurrencies seeing demonstrable real-world use to date. Having recently hired Uber’s former Head of Strategy, Rahul Abrol, to spearhead their international growth efforts, Terra and CHAI have a very realistic chance of achieving their goal of becoming the leading payments platform in Asia in the years to come. Thank you for reading, and I hope you enjoyed the article! Tune in next week as we explore the other massive project looking to create the internet of blockchains, Polkadot, and its canary network, Kusama.