Press Release

We’re Hiring!

We’ve got some exciting announcements coming in the next few months, but in the meantime we’re hiring for several key roles as we scale our flagship product, HealthNet. If you’re interested in joining a growing startup here in the Midwest, you can read the job descriptions and apply here, or send your resume to careers@consensusnetworks.com!

Our key open roles include a data scientist, EHR integration engineer, and lead product manager.

Blockchain Learning, IoT, Proof of Stake

Our Response to NIST

The National Institute of Standards and Technology requested information for a congressional study on emerging marketplace trends in a variety of areas, including blockchain and IoT. Read our response below as we discuss applications of Decentralized Computing, how consensus works, and what’s next.

Abstract

This paper contends the following: blockchain as a ledger or immutable database alone is a limited technology with few real use cases. The true value of the technology lies in the consensus mechanisms that enable its real value proposition, namely digital value and distributed peer-to-peer trust. This paper outlines why consensus is the true innovation in the blockchain space, surveys several methods of consensus, and discusses applied use cases: digital tokens; defense applications and distributed trust; and IoT and distributed data exchange in industry, including energy. It is important that this reviewing body consider the implications of such a technology in order to create smart policy. Lastly, we encourage regulatory frameworks in the space to focus on enabling technology growth on public, decentralized networks where entrepreneurs are free to build and innovate.

Introduction

We believe the significance of “blockchain technology” lies in the creation of new value models, consensus mechanisms, and tokenization itself, not in the blockchain data structure. “Blockchain” is a broad, oft-misused term applied to many things, frequently just an immutable database. An immutable database, while having limited use cases in regulatory and legal contexts, is not a revolutionary technology in and of itself. While it’s important to see past the current sensational, speculative hype surrounding cryptocurrencies and their derivatives like NFTs, we must also look past enterprise efforts that treat blockchain as a limited database technology. Cutting through the confusion, we will look toward the core capabilities of the technology and how they lead to a future of digital value exchange.

The true innovation, and the revolutionary technology, is the set of consensus mechanisms powering the Decentralized Computing platforms behind cryptocurrencies. Just as the computer enabled machines to create and store data, and the internet allowed machines to exchange data, Decentralized Computing enables machines to create, transact, and store value. Consensus enables trust between independent users around the globe who do not need to know each other or rely on a third party to moderate their interactions. Value exchange has been democratized for the first time in human history, enabling all users to control their own credit without reliance on a third party like a bank, government, or temple to set exchange rates and mediate value exchange.

This is a fundamental change in the way we will interact with currency and value in our lives. In much the same way that past revolutions (the Industrial Revolution, the Digital/Internet Revolution) displaced the main players in the industries they disrupted, so too will Decentralized Computing continue to disrupt established banking and financial institutions, and beyond them the centralized enterprises that currently control information flow on the internet.

We firmly believe that the future of “blockchain,” or Decentralized Computing, will be built on open, decentralized public protocols like Bitcoin and Ethereum, for the reasons outlined in this paper. It is vitally important that innovation, use cases, and regulation in this space be designed with this in mind. In short, we believe public, open, decentralized computing networks utilizing Proof of Work, Proof of Stake, or a mechanism yet to be designed will win out due to their unique incentive structures, which draw in a wide range of users. Combined with an open architecture that allows anyone to build, public networks will create a myriad of use cases even beyond those discussed below. It is important that builders on these free and open systems be enabled to innovate and create in order to maximize the potential of decentralized computing technology.

Overview of Consensus

Briefly, a consensus mechanism is designed to enable networked computers to agree on (1) a set of data, (2) modifications to or computations with that data, and (3) the rules that govern that data’s storage and computation. It is consensus that allows a cryptocurrency like Bitcoin to exist, not “blockchain technology.” For example, the Bitcoin network has a set of rules governing the inflation rate and maximum supply of Bitcoin (21 million). These rules are agreed to by network participants, and over time trust builds that Bitcoin tokens will continue to conform to the rules of the network. It is this part of the distributed consensus mechanism that allows Bitcoin to have and retain value. Consider the US dollar: there is no limit on the number of dollars that can exist, yet the strength of the US economy, military, and government creates a trusted environment in which the dollar can hold value even though it is a ‘fiat’ currency. Because Bitcoin has no central government or physical assets, trust and value must be built and maintained through the consensus mechanism. To this end, Bitcoin created a unique incentive structure as part of its Proof of Work consensus (explained below) that makes participants in the network want to conform to and maintain its rules so that value can grow.
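The 21 million cap is a good example of a rule enforced by consensus rather than decree: it is not a constant stored anywhere, but the arithmetic consequence of a block reward that halves every 210,000 blocks. A minimal sketch (the interval and reward figures are from the public Bitcoin protocol; the loop itself is illustrative, not consensus code):

```python
# Bitcoin's supply cap emerges from its halving schedule: the block reward
# starts at 50 BTC and halves every 210,000 blocks, with amounts tracked
# as integer satoshis (1 BTC = 100,000,000 satoshis).
HALVING_INTERVAL = 210_000
SATOSHIS_PER_BTC = 100_000_000

reward = 50 * SATOSHIS_PER_BTC  # initial block reward, in satoshis
total = 0
while reward > 0:
    total += HALVING_INTERVAL * reward
    reward //= 2  # integer halving, so the reward eventually reaches zero

print(total / SATOSHIS_PER_BTC)  # just under 21 million BTC
```

Because rewards are integers, the series terminates, and the sum lands at 20,999,999.9769 BTC; no participant can mint more without breaking the rules the rest of the network enforces.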

This network security is also what allows an open and permissionless network. Anyone can build on these networks without fear that any one actor could destroy them, and this is partly what makes the technology so powerful. When that openness is removed and the network becomes a consortium or private model, the ability to create is limited and mediated by the enterprises controlling the network.

There are several consensus mechanisms in use today: Proof of Work, Proof of Stake, Consortium, and Proof of Authority. These mechanisms provide the backbone for Decentralized Computing networks, and we’ll see that the open mechanisms, Proof of Work and Proof of Stake, do far more than simply act as a “blockchain ledger.”

Proof of Work

Proof of Work is the original consensus mechanism, designed by the pseudonymous Satoshi Nakamoto. It is quite simple: network participants use their computers to solve a math problem, and the first computer to solve it is rewarded in Bitcoin along with the right to process a batch of transactions, earning those fees as well. This has been occurring essentially non-stop since the launch of Bitcoin in 2009. The mechanism is designed so that the problem’s difficulty adjusts as more computers join the network, keeping the interval between mined blocks (and rewards) at roughly 10 minutes. The reward from Proof of Work makes for a powerful incentive. Participating computers, especially today as the network’s computing power has grown exponentially, spend a great deal of money to purchase, maintain, and run their equipment. They must ensure the rules of the network are upheld so that they can continue to earn transaction fees and block rewards. Someone looking to disrupt the network would need to deploy an enormously expensive amount of computing power to manipulate blocks or transactions, which is not worthwhile for any individual or company. This methodology has proven very successful, though there are two major concerns. The first is the amount of electricity required to maintain the network. A full treatment is outside the scope of this paper; however, predictions of future energy usage have thus far been quite incorrect, and there is no indication that Bitcoin is generating demand for new energy sources rather than utilizing excess energy in specific areas. The second concern is the difficulty any new protocol faces in launching its own Proof of Work network. With so much computing power already concentrated in the cryptocurrency space, it is easy for an attacker to disrupt or destroy smaller Proof of Work networks.
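The “math problem” is a hash puzzle: find a nonce that makes the block’s hash fall below a network-set target. A toy sketch of the idea (the real protocol hashes an 80-byte block header against a 256-bit target; the leading-zero check and the sample data here are our simplifications):

```python
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Find a nonce whose double-SHA256 digest of (data + nonce) starts
    with `difficulty` zero hex digits, a stand-in for Bitcoin's
    below-target check. Raising `difficulty` by one multiplies the
    expected work by 16, which is how the network throttles block times."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(block_data + nonce.to_bytes(8, "little")).digest()
        ).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine(b"toy block header", difficulty=4)  # ~65,536 hashes on average
```

Finding the nonce is expensive; checking it takes one hash. That asymmetry is what lets every node verify the winner instantly while making the winning itself costly.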

Proof of Stake

Two other key limitations of Bitcoin drove innovation in new methodologies of consensus and resulted in Proof of Stake. The first is that Bitcoin is stateless, or non-Turing-complete: it can’t do much besides manage the transfer and storage of bitcoins. There isn’t a problem with this per se, but innovators were interested in building an “internet computer,” a decentralized computer with built-in payment rails that would let users create a new generation of programs and new types of digital assets, all natively decentralized and uncensorable (streaming platforms, social media, music, art, etc.). This required, in part, a new methodology to overcome the second limitation of Bitcoin, the speed of the network (Bitcoin blocks arrive only once every 10 minutes), while still maintaining the overall security of a public network. Proof of Stake emerged as the most likely solution and works as follows: instead of rewarding computers based on the computing power they contribute, Proof of Stake rewards participants in proportion to the number of network tokens they have “staked,” or deposited, on the network. Participants can choose from a variety of network computers, or Validators, to stake with, and are incentivized to choose a Validator who will properly secure and maintain the network; otherwise their tokens lose their value. This incentive structure has so far proven a worthy competitor to Bitcoin’s, and most new network launches (Cosmos, Solana, Polkadot, and others) have utilized a form of Proof of Stake. The Ethereum network, the original Turing-complete decentralized network, plans to shift from Proof of Work to Proof of Stake later this year (2022).
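At its core, the Proof of Stake lottery replaces hash power with token weight: proposers are drawn with probability proportional to stake. A minimal sketch with a hypothetical three-validator ledger (names and balances are invented for illustration; real protocols add slashing, randomness beacons, and committee rules on top of this):

```python
import random

# Hypothetical stake ledger: validator -> tokens staked (own + delegated).
stakes = {"validator_a": 600_000, "validator_b": 300_000, "validator_c": 100_000}

def select_proposer(stakes: dict, rng=random) -> str:
    """Draw the next block proposer with probability proportional to stake,
    the stake-weighted analogue of Proof of Work's hash-power lottery."""
    validators = list(stakes)
    return rng.choices(validators, weights=[stakes[v] for v in validators])[0]

# Over many rounds, proposal frequency tracks stake share (60/30/10 here),
# so rewards accrue in proportion to the tokens a validator puts at risk.
counts = {v: 0 for v in stakes}
for _ in range(10_000):
    counts[select_proposer(stakes)] += 1
```

The incentive follows directly: a validator’s expected reward scales with its stake, and misbehavior devalues (or, with slashing, destroys) exactly the asset that bought its influence.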

There are two major concerns with Proof of Stake. The first is that, compared to the Bitcoin network, Proof of Stake is a newcomer and not as established or battle-tested. Second, there are questions about the equitability of Proof of Stake. Since it is purely token-based, most of the token supply tends to consolidate with a few wealthy token holders on a few large Validators, potentially increasing the likelihood of collusion and network manipulation. Proof of Work is seen as more equitable since any networked computer can mine Bitcoin without holding any to begin with; however, the network has become so large that a single computer, or even a small data center, is unlikely to mine Bitcoin consistently, so it too has tended toward centralization among the large “miners” able to afford the expensive equipment required.

Consortium

Consortium consensus has been favored among enterprises, like IBM and Walmart, experimenting with blockchains. It grants privileges to known members to participate in consensus but is closed to the public. As with our Walmart example later (see the Trade section below): who determines the identity of the participants? Who decides exchange rates or data control? What happens if one of the participants is compromised? Participants in these networks must have a level of trust with each other outside the network itself, severely limiting their use cases beyond a few specific examples.

Proof of Authority

Proof of Authority, or social consensus, is an alternative consensus mechanism that requires neither the computing power of Proof of Work nor the expense of holding tokens for Proof of Stake. Proof of Authority grants members the ability to elect (or demote) other users on the network through voting. It is similar to Consortium consensus in that participants must build some trust with other participants to gain access, but there are controls in place to remove underperforming or malicious participants. Trust is distributed across the participants, allowing the network to be public and open. The major flaw in Proof of Authority is the politicking that accompanies a decentralized group of participants making decisions about network changes or new members. Factom, one of the earliest decentralized networks to launch after Bitcoin, has faced these challenges, which have crippled the network to the extent that it is shifting to Proof of Stake later this year (2022).

Recommendations

1. Commission a peer-reviewed study of the energy usage and growth of the Bitcoin network; calculations and predictions thus far have been incorrect.

2. Evaluate and enact policies that promote innovation and development on open and decentralized network protocols.

Use Cases

Finance

Decentralized finance, or DeFi, is the most clear-cut use case for Decentralized Computing today. We won’t dwell on it, as it has been written about elsewhere by more qualified individuals, but suffice it to say that, as noted above, DeFi enables a democratization of finance, allowing users to engage in a new world of value creation and exchange. Needless to say, this area of disruption is rife with optimism as well as fraud, and until more specific regulation is applied in the United States, this “wild west of digital gold” will continue.

Trade

Supply chain is often the immediate answer when someone asks, “What else can blockchain do other than cryptocurrencies?” This stems from the misunderstanding, noted above, that “blockchain” is just an immutable database. Often cited is Walmart using blockchain to track its mango and pork supply chains (in a few specific geographic regions). Better traceability in a supply chain, to help prevent disease outbreaks and sustainably source goods, is indeed a noble cause, but all of it could have been achieved (and probably should already have been, for most of our food supply chain) with traditional web and digital technologies. The speed at which Walmart was able to audit its supply chain also has little to do with blockchain and everything to do with the digitization of that supply chain. Walmart’s “blockchain” is just supply chain software that Walmart controls and forces its suppliers to record to. There is no real consensus mechanism, there is no real value creation or exchange on the platform, and it is a closed architecture, meaning only federated users can participate. If Walmart changed data on the platform, who would know?

We do believe that trade and supply chain constitute a powerful use case for Decentralized Computing, just not for the reasons Walmart is pursuing it. Trade is a complex operation, involving stakeholders across the spectrum from governments to suppliers. Significant value is generated as goods move through a supply chain, and that value is transacted across many distributed parties and stakeholders. As supply chains become more digital, a significant need is data standardization across the supply chain. This is easy to achieve in a simple supply chain within the borders of one country. In a globalized world, however, goods transit the globe through many countries, each with its own regulations and varying degrees of trust in the others. Decentralized Computing offers a unique opportunity for countries to agree on a common data standard, enable transparency across supply chains, and facilitate a more efficient flow of goods across borders without requiring trust between members outside the network.

Goods flowing across borders encounter a wide range of regulations, paperwork, taxes, and inspections. An open protocol would enable the standardization of data across borders while letting participating countries retain their own regulations, digital paperwork, and tax revenue. This could be combined into a single system in which data, goods, and money flow simultaneously as a product traverses its supply chain, greatly reducing trade barriers and overhead while increasing supply chain transparency and ensuring compliance.

Internal trade is also a potential use case. In many industries, particularly healthcare, supply chains are managed by third parties who introduce unnecessary overhead and complex contract agreements between manufacturers and end users, like hospitals. This also creates a large information barrier: manufacturers are unaware of the long-term quality of their products and how often they are used, and receive only delayed demand information, making it difficult to build a dynamic supply chain. In many cases, manufacturers are forced to purchase stale data from purchasing agencies like GPOs just to better understand their own equipment. The reverse is true as well: hospitals are locked into complex contract agreements with GPOs that provide very little price transparency or resiliency. A trusted, decentralized network of manufacturers and end users interacting directly with each other would improve transparency and information flow between stakeholders, creating a more resilient and cost-effective supply chain. Recent regulatory changes, including the 21st Century Cures Act Final Rule, created a data standard for health information that could be leveraged to provide even better quality assessment and demand data to manufacturers.

Defense

The US Department of Defense (DoD) is an early adopter of “blockchain” for supply chain, and unlike Walmart it has some good reasons. First, traceability is essential for many parts and products in the DoD supply chain. Certain parts are incredibly valuable and can mean life or death, or even affect national security; as such, even a heavy-handed immutable database tracking these parts can find use in this environment. For example, the highly successful SUBSAFE program already uses an amend-only paper trail for ship-safety repairs on submarines. An immutable database here could dramatically improve a paper-heavy and cumbersome process while still preserving ship safety. Again, though, these use cases are limited in nature and don’t address a key problem in vital supply chains: data entry (even in our SUBSAFE example). Non-natively digital information or value, whether a collectible, a Rolex, or a seawater valve, will always depend to some extent on the person entering it onto a blockchain. Blockchain, although immutable, is still error-prone (and then preserves that error for eternity). This again highlights that much of our supply chain trouble is a digitization issue, not an audit-trail issue.

However, there are ways to reduce the chance of error and leverage emerging digitization technology to better ensure proper data entry. In a current project with the United States Navy, we’re building a blood product tracking tool that shares blood product information across parties such as a blood bank, a hospital, and a ship. There, we’ve utilized barcodes and RFID to automate data entry, partially solving this problem while integrating another key use case, IoT. As the DoD continues to experiment with and test Decentralized Computing, we believe two more key use cases will emerge:

1. Digital Chain of Custody: As the DoD continues its digitization efforts, much of the data created will be (and already is) of the highest importance to national security, from directives to planning to weapons system software. Securing this data so that it cannot be tampered with undetected is a key national security concern. Especially for mission-critical software, which is already natively digital, Decentralized Computing can be a powerful tool to prevent malicious, or even unintentional, errors.

2. Decentralized and Secure Communications: Warfare is becoming increasingly informationized and cyber-based. Conflicts in the coming years will be battles of systems and systems confrontation. Digital information flowing from soldiers on the ground, drones in the air, weather, intelligence, and more will feed a fully integrated combat, communications, and sensor network from which topline decision-makers can view the entire battlefield and make key decisions. Disrupting these networks will require coordinated attacks on key nodes. Traditional digital and web architectures create single points of failure that, if exploited, would cripple military systems and endanger national security. Decentralized Computing could dramatically improve the robustness of these systems. A decentralized network of nodes can increase the volume of information transfer by utilizing technologies like BitTorrent, increase security by breaking information into smaller chunks, and create a robust data transfer network in which one node’s destruction has little impact on the other nodes or the operation of the system. A sophisticated consensus mechanism can also spot bad actors attempting to participate in consensus, preventing man-in-the-middle or Sybil attacks, something much harder to do with traditional cybersecurity.
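The chunk-and-distribute idea above can be sketched concretely: split a payload into pieces and key each piece by its hash, so every receiver can verify each piece independently and the loss or compromise of one node costs only the chunks it held. A toy content-addressing sketch (the chunk size and payload are invented for illustration; real systems like BitTorrent also add piece manifests and redundancy):

```python
import hashlib

def chunk_and_address(payload: bytes, chunk_size: int = 4) -> dict:
    """Split a payload into fixed-size chunks keyed by SHA-256 digest,
    BitTorrent-style: each piece is independently verifiable, so a
    tampered or corrupted chunk no longer matches its key."""
    chunks = [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]
    return {hashlib.sha256(c).hexdigest(): c for c in chunks}

table = chunk_and_address(b"sensor reading: 42.7C at 0300Z")
# Each receiving node re-hashes the pieces it holds to detect tampering.
assert all(hashlib.sha256(c).hexdigest() == h for h, c in table.items())
```

Because pieces are fetched by hash rather than by server address, any surviving node holding a piece can serve it, which is the property that removes the single point of failure.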

Internet of Things (IoT)

Every day a new device becomes IoT-enabled, from your microwave to your grill. This growth will continue, and these devices will gather more and more information from our everyday lives and, beyond that, from industry. As IoT becomes more ubiquitous, the control these devices have over our lives, and the data they generate, will be important, valuable, and vulnerable. Significant debate will continue over who owns IoT data and what it can be used for. Possibly even more important is what the device can do. If it’s just monitoring your grill temperature, you may be less concerned about privacy or data ownership. If it’s a heart monitor, a baby monitor, or a control actuator on an oil pipeline, that device’s connectivity presents a very clear vulnerability: if it is exploited, or the device itself fails, the consequences could be serious.

In a data privacy use case, a decentralized IoT device that does not depend on a third party’s centralized server assures the user that the information it generates is owned by that user alone. Additionally, compromising a large number of decentralized IoT devices would require exploiting each device individually, something extremely difficult if not impossible, whereas in the centralized case an attacker need only exploit a single server.

Centralized IoT devices are also limited by the companies that produce them: interoperable only with others in their product line and dependent on the longevity of their maker for continued functionality. What happens to a valve actuator on a pipeline when the maker of that connected device goes bankrupt or is acquired, or the cloud server supporting the device goes down? The centralized IoT ecosystem as it exists today is highly vulnerable to the viability of the companies making the devices. That is potentially fine for a light bulb or a grill, but disastrous if a pipeline goes down or your pacemaker needs an update.

IoT devices served by a decentralized network can avoid the issues central servers create. Even if a company ceases operation, the network itself remains functional, allowing the device to continue operating. The decentralized network’s data standard, combined with current efforts to link data across different networks, also presents an opportunity for increased IoT device interoperability. Devices will be able to speak to each other across product lines, creating new use cases. Your grill could talk to your doorbell, letting you know friends have arrived for the barbecue. In a more serious use case, weather monitoring could be connected to electrical grids and oil pipelines, feeding information directly to control systems to enable proactive protection of equipment.

MachineFi is the term being introduced to describe the intersection of industrial IoT and finance. The open protocol IoTeX, a leader in the space, describes MachineFi as a new paradigm, fueled by Web3, that underpins the new machine economy, whereby machine resources and intelligence can be financialized to deliver value and ownership to the people rather than to centralized corporations. Though somewhat hyperbolic, the message is clear: IoT devices powered by Decentralized Computing have the potential to affect not only the security and longevity of the device but the empowerment of the users themselves.

Energy

Energy markets are increasingly distributed as ‘micropower’ sources such as wind turbines and solar panels proliferate. Battery backups for these renewable sources are also in high demand and are often traded not as storage but as ancillary power. The variability of renewable sources is difficult to predict, so much so that their output often trades at a negative price. Current energy trading was designed around the large power sources of the past and is not dynamic enough to predict or react to changes in weather or energy production. Smaller power operators have an opportunity to sell their power to the grid more directly as digital technologies take over managing the flow of energy. Grids could also gain the flexibility to directly compensate excess energy storage and backups, a must-have for extreme weather events and surge periods, ensuring power is available when needed. As a whole, energy markets could be another prime area where Decentralized Computing has a large impact, combining the use cases discussed above: cross-border energy trade enabled by a common data standard on an open network, robust grid and weather monitoring through widespread IoT devices, and MachineFi tools powering dynamic energy trading and real-time settlement could usher in the next generation of reliable, secure, clean energy production.

A thought on Uncensorability

A clear but often overlooked value proposition of Decentralized Computing is uncensorability: transactions on a public network like Bitcoin cannot be stopped by any third party. This is a particularly powerful empowerment technology that could enable victims of oppressive governments, or those unable to access banking, to participate in an economy otherwise closed to them. But uncensorability has its dark side: horrible things can be empowered by this technology, from child pornography to payments to terrorists, all funded and transmitted through these networks. As public, open, decentralized, uncensorable networks grow, special attention and regulation are needed to ensure that the worst of humanity is not enabled along with the best.

Conclusion

Let’s cut through the hype. Tokens like Bitcoin, and even Dogecoin and Bored Ape NFTs, have some value; what exactly is that value? As our lives, our machines, and our data have become more digital, and as software has eaten the world, we must understand that the value associated with our lives, our machines, and our data is also becoming digital. There is something here beyond mere speculation. This should be seen as a natural progression: money flowing into the digital value ecosystem in search of what’s next. The question before us is whether to allow the natural progression of value digitization by enabling this new economy to be built on free and open standards, or to attempt to heavily regulate it to prevent what is seen as excess in the world of digital tokens (like Dogecoin). A few things are clear, though: the robustness and uncensorability of the open protocols make them very difficult to shut down, and the alternatives, private and consortium networks, have been unable to produce a product that can match the potential of the free and open digital value economy.

Healthcare

How Hospital Consolidation is Impacting Medical Supply Chains

By Connor Smith

Introduction

The U.S. healthcare system has undergone unprecedented consolidation over the past decade. The largest health system (HCA Healthcare) now has 214 member hospitals, and the 25 largest systems collectively manage almost 20% of all hospitals in the U.S. A recent study by Definitive Healthcare found that in 2019 alone there were 294 hospital mergers and acquisitions, a slight decrease from 330 the year prior. It goes without saying that such high levels of consolidation have a massive impact on medical supply chains. Whether these changes produce a positive or negative financial impact, however, remains hotly debated in the healthcare community. Some argue that consolidation allows providers to leverage economies of scale and reduce costs. Yet some evidence suggests this may not be the case, and that consolidation can actually increase the cost of care by up to 40%. In this article, I briefly explore the driving forces behind hospital consolidation, how it is affecting medical supply chains, and the role technology plays in managing these increasingly complex supply chains.

A Brief History of the Driving Forces Behind Hospital Consolidation

The passing of the Affordable Care Act (ACA) in 2010 was a momentous juncture in U.S. history, and arguably the most sweeping change to the nation’s healthcare system since the institution of Medicare and Medicaid. While the goal of the plan may have been to make healthcare more affordable and accessible to U.S. citizens, the bill incentivized large-scale consolidation across the entire industry that persists to this day. Providers became incentivized to shift care delivery to outpatient settings and work toward a value-based care model. As a result, health systems began rapidly acquiring and investing in facilities like ambulatory surgery centers, clinics, and physician groups. This vertical integration caused health systems to evolve from mere collections of hospitals into fully fledged Integrated Delivery Networks (IDNs) that provide and coordinate patient care across the continuum. Similarly, health systems began consolidating horizontally, merging with other IDNs to improve access to capital, increase market share, and improve efficiencies. Constant external pressure to reduce costs and lower the price of treatment has exacerbated this trend and kept consolidation moving at an accelerated pace, since integrated provider networks can more easily share physicians across facilities to reduce overall headcount and negotiate bulk supply discounts.

Impact of Consolidation on Medical Supply Chains

Theoretically, these new ‘mega-systems’ should be able to use economies of scale to procure supplies at lower cost and coordinate care more efficiently, thereby reducing costs. Traditionally, hospitals have relied on Group Purchasing Organizations (GPOs) to leverage the aggregated buying power of multiple facilities and negotiate lower prices for medical supplies from manufacturers, wholesalers, and other vendors. As IDNs acquire more hospitals and grow, they can often negotiate nearly equal prices with manufacturers without these middlemen and realize greater cost savings. In practice, however, savings from such mergers are often negligible, and they frequently cause supply chain costs to increase.

A study of 1,200 hospitals by researchers at the Wharton School found that the average supply chain savings for a target hospital in a merger of equal-sized systems was only about $176,000. Moreover, acquirers are often left spending more on their supply chains due to complexities that arise from an acquisition, like managing physician preference items. The researchers found that while an acquirer saved an average of 6.4% on inexpensive commodities, these savings were more than offset by a 1.1% increase in costs related to physician preference items. As more hospitals are integrated into an IDN, it becomes increasingly difficult to standardize purchasing decisions across a growing number of physicians unless the system has a robust digital infrastructure and inventory tracking system. Unfortunately, 64% of providers lack dedicated supply chain management systems, and 80% still rely on manual processes in some capacity to manage supplies. Consequently, consolidation often leads to less efficient supply chains that procure similar products from a myriad of suppliers, lack transparency about where products are located in the system, and carry excess stock that is wasted or expires.
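To see how a 1.1% increase can wipe out a 6.4% saving, note that physician preference items typically account for a far larger share of hospital supply spend than commodities. A quick back-of-the-envelope check (the dollar figures below are invented for illustration; only the percentages come from the study cited above):

```python
# Hypothetical annual supply spend for an acquired hospital. Physician
# preference items (PPI) usually dominate commodity spend by a wide margin.
commodity_spend = 20_000_000
ppi_spend = 140_000_000

commodity_savings = 0.064 * commodity_spend  # 6.4% saved on commodities
ppi_cost_increase = 0.011 * ppi_spend        # 1.1% added on PPI

net_change = commodity_savings - ppi_cost_increase
print(round(net_change))  # -260000: the small PPI increase outweighs the savings
```

Under this spend mix, the merger leaves the hospital roughly a quarter of a million dollars worse off per year, which is consistent with the study’s finding that commodity savings are more than offset.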

How Technology Can Break the Cycle

Prior to the pandemic, consolidation was already among the top trends to watch in healthcare for 2020, and experts speculate that the financial effects of COVID-19 are likely to accelerate this phenomenon in the post-pandemic era. Barring drastic regulatory or political action, hospital consolidation is unlikely to decelerate anytime soon. IDNs must therefore leverage digital technologies if they want to properly manage their increasingly complex supply chains and control costs. One of the easiest solutions to implement is a supply chain management platform. Manual systems are inefficient and error prone, whereas digital supply chain management systems enable automation, reduce error, and can be coupled with advanced analytics to drive further efficiencies.

Using supply chain analytics, IDNs can track supply usage to reduce sources of waste like purchasing physician preference items and holding expired products. The majority of healthcare executives and supply chain managers agree that supply chain analytics could positively impact hospital costs and even improve hospital margins by over one percent. Considering that hospitals, on average, realize net revenues in excess of $300 million, a one percent improvement in operating margins can equate to millions of dollars saved. Moreover, these systems can be further integrated with technologies like RFID to enable real-time medical supply tracking within an IDN and help providers load balance supplies across the hospitals in their system based on demand, realizing even greater cost savings.
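To make the load-balancing idea concrete, here is a minimal, illustrative sketch (assumed numbers and a hypothetical `rebalance` function, not HealthNet's actual logic) of redistributing one supply item across an IDN's hospitals based on days of supply on hand:

```python
# Illustrative sketch: propose transfers so every hospital in an IDN
# approaches the same target days-of-supply for one item.

def rebalance(inventory, daily_demand, target_days=14):
    """inventory: units on hand per hospital; daily_demand: forecast units/day.
    Returns a list of (from_hospital, to_hospital, units) transfers."""
    surplus = {h: inventory[h] - target_days * daily_demand[h] for h in inventory}
    donors = sorted((h for h in surplus if surplus[h] > 0), key=lambda h: -surplus[h])
    takers = sorted((h for h in surplus if surplus[h] < 0), key=lambda h: surplus[h])
    transfers = []
    for taker in takers:
        need = -surplus[taker]
        for donor in donors:
            if need <= 0:
                break
            give = min(surplus[donor], need)   # ship only what the donor can spare
            if give > 0:
                transfers.append((donor, taker, give))
                surplus[donor] -= give
                need -= give
    return transfers

inv = {"A": 4000, "B": 500, "C": 1500}
demand = {"A": 100, "B": 100, "C": 100}        # units/day at each hospital
print(rebalance(inv, demand))                  # A holds 40 days, B only 5 → shift A's surplus
```

With a 14-day target, hospital A's 2,600-unit surplus covers B's 900-unit shortfall, so the sketch proposes a single transfer of 900 units from A to B.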

Conclusion

I hope you enjoyed this article! If you are interested in ways healthcare supply chains are evolving, check out my series on the future of medical supply chains. Part one can be found here. Additionally, if you are looking for ways your medical supply chain can be improved, feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next time!

Healthcare

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part III

By Connor Smith

Hello everybody! Welcome to the conclusion of my series on how COVID-19 is changing medical supply chains. In Parts I and II, I talked about how the pandemic is forcing U.S. medical supply chains to embrace digitization and automation and rep-less sales models for high-cost implantable medical devices, and about the influence that domestic manufacturing and value-based healthcare are having on their evolution. Before I conclude this series, I'll add the caveat that medical supply chains will evolve in a myriad of ways far beyond the six I address here; these are simply what I believe are the most profound changes on the horizon for these complex systems. Now, let me dive into the two final transformative forces that will reshape U.S. medical supply chains in the post-pandemic era.

5. Sophisticated Risk Analysis & Disaster Planning Software

COVID-19 is far from the first environmental disruption global supply chains have faced in recent times: there was the SARS outbreak in 2003, the Ebola outbreak in 2014, and natural disasters like earthquakes and hurricanes that occur frequently and can have devastating economic impacts. In fact, as many as 85% of supply chains experience at least one major disruption per year. The reality of the modern era is that globalized supply chain networks allow us to consume goods and services at incredibly low prices, but they are also riddled with ‘Achilles heels’ that can shake global economies if perturbed.

As I mentioned in Part II, one way U.S.-based providers and manufacturers will look to mitigate such interruptions in the future is by ensuring geographic redundancy across supplier networks and logistics partners and by leveraging domestic manufacturing. While this approach will be one core component of future U.S. medical supply chain architecture, an overall proactive, strategic approach to supply chain design will be paramount. Hence, medical supply chains of the future will be designed using sophisticated risk analysis software that leverages predictive analytics and advanced simulations to model possible disruptions, like pandemics and natural disasters, before committing to suppliers, so that teams can plan accordingly.

Academics have been developing operational supply chain risk analysis models that leverage AI and advanced computational methods for well over a decade, and supply chain experts across industries have long known the benefits of proactive risk mitigation. A 2017 study by Deloitte found that taking a proactive approach to supply chain risk management can save firms up to 50% when managing major disruptions. As medical supply chains become increasingly transparent and digitized, investment in proactive risk assessment technologies will be one of the best returns that medical supply chain management teams can realize. Using tools like predictive analytics, these teams will be able to quantify the risk of using a particular supplier by modeling scenarios like the spread of a virus to a particular region or the loss of a key manufacturing plant to a natural disaster. Some researchers have already started developing simulation-based analyses of global supply chain disruptions resulting from the spread of COVID-19, and it is reasonable to expect that using such models for supply chain planning will become the norm.
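As a toy example of what scenario modeling can look like, the sketch below (assumed probabilities and recovery times, not a validated model) uses a simple Monte Carlo simulation to compare the expected annual supply downtime of a single offshore supplier against a dual-sourced arrangement with a faster-recovering domestic backup:

```python
# Hedged sketch of scenario-based supplier risk scoring (illustrative only):
# estimate expected days of lost supply per year via Monte Carlo.

import random

def expected_disruption_days(p_event_per_year, recovery_days, trials=100_000, seed=0):
    """Expected annual downtime given a yearly disruption probability and a
    (min, max) range for recovery time in days."""
    rng = random.Random(seed)
    lo, hi = recovery_days
    total = 0.0
    for _ in range(trials):
        if rng.random() < p_event_per_year:      # did a disruption occur this year?
            total += rng.uniform(lo, hi)         # how long did recovery take?
    return total / trials

# Compare a single offshore supplier to a dual-sourced arrangement where a
# domestic backup halves the effective recovery time (assumed numbers).
offshore = expected_disruption_days(0.30, (30, 90))
dual     = expected_disruption_days(0.30, (15, 45))
print(f"offshore-only: {offshore:.1f} days/yr, dual-sourced: {dual:.1f} days/yr")
```

Under these assumed inputs, dual-sourcing roughly halves expected downtime; a real planning tool would layer in cost, regional event correlations, and many more scenario types.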

6. Integrating Population Health Analytics 

Originally posited by David Kindig and Greg Stoddart as “the health outcome of a group of individuals, including the distribution of such outcomes within the group”, population health has become a fairly nebulous term used to describe the overall health of patient populations. ‘Pop health’ research encompasses everything from how socioeconomic factors like income, race, or education influence the prevalence of certain conditions to how the distribution of various genes in communities affects disease spread. Depending on whom you ask, you will likely get a different view of what population health is and what responsibility providers and manufacturers have in improving outcomes.

Regardless of your thoughts on what population health encompasses, as healthcare continues to evolve towards a value-based model it is inevitable that population health will play an increasingly vital role in how providers deliver care. Value-based incentive structures are designed to drive healthcare towards what the Institute for Healthcare Improvement refers to as the ‘Triple Aim’: improving the patient care experience, improving the health of populations, and reducing the per capita cost of healthcare. An overview of the Triple Aim is pictured below.

Image Adapted from Health Catalyst

Prior to the pandemic, providers were starting to integrate population health initiatives with supply chain management to combat the increasing strain of over half of the U.S. adult population having at least one chronic disease. For example, Indiana-based Eskenazi Health extended its partnership with Meals on Wheels in 2017 to deliver healthy meals to the homes of patients discharged from the hospital in an effort to reduce readmission rates. A recent analysis published by Providence Health System found that COVID-19 is accelerating this transition to a distributed care model in which patients receive personalized care from their homes or local clinics instead of at a health system. They found that the use of virtual care technologies, coupled with the desire to reduce unnecessary visits because of the pandemic, is forcing medical supply chains to become more personalized.

It is reasonable to suspect that the medical supply chains of the future will become increasingly patient-centric and account for the socioeconomic, genetic, and other key factors that influence a patient population’s well-being. The Internet of Medical Things market is growing at over 20% year over year, and that growth will likely accelerate with the rapid, pandemic-driven adoption of telehealth and remote patient monitoring technologies. These platforms will enable providers to gather more information about their patient population than ever before and construct truly personalized care plans. Medical supply chain teams of the future will integrate this information with predictive analytics models not only to ensure that patients receive the best possible care, when and where they need it, at the lowest cost, but also to make cost-effective population-level interventions that improve overall societal health.

In Conclusion

I hope you enjoyed the final installment of my series on the future of U.S. medical supply chains! Interested in learning more or ways your medical supply chain could be improved? Feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era.

Press Release

Consensus Networks Joins MATTER

South Bend, IN, Sep. 22, 2020 — Consensus Networks announces its admission into MATTER, a leading healthtech incubator based in Chicago, IL, to accelerate the commercialization of its HealthNet technology and reinvent medical supply chains for the post-COVID era.

“Being accepted into MATTER has been a complete game changer in our journey bringing HealthNet to market,” said Consensus Networks’ COO, Connor Smith. “Their team and network of healthcare experts and mentors have helped us tremendously. They understand the nuances and intricacies of the Health IT landscape and have really helped us fine tune our value proposition, messaging, and offering to all of the different stakeholders. MATTER has gone above and beyond to connect us with the potential end users, advisors, and partners we need to successfully commercialize HealthNet, and we are excited to be a part of this vibrant community of healthtech innovators.”

MATTER, the premier healthcare incubator and innovation hub, includes hundreds of cutting-edge startups from around the world, working together with dozens of hospitals and health systems, universities and industry-leading companies to build the future of healthcare. Together, the MATTER community is accelerating innovation, advancing care and improving lives. For more information, visit matter.health and follow @MATTERhealth.

About Consensus Networks.

Consensus Networks is an Indiana-based LLC that is reinventing medical supply chains for the post-COVID era. Founded in 2016 with the goal of creating trusted data environments to derive novel insights, Consensus enables clients to integrate clinical and supply chain data to optimize inventory management and improve patient outcomes. Consensus Networks uses serverless application architecture, blockchain technology, and predictive analytics to ensure the best materials are used to treat patients at the lowest cost.

Healthcare

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part II

By Connor Smith

Hello everybody! Welcome back to my series on how COVID-19 is forcing medical supply chains to change. In Part I, I talked about how the pandemic is forcing the medical supply chains of the future to embrace digitization and automation and rep-less sales models for high-cost implantable medical devices. If you have not checked out the first part of this series, you may do so here. Without further delay, let’s dive into two other ways medical supply chains will evolve.

3. Building Resilience through On-Shoring Production

It’s no secret that China is the world’s largest producer and distributor of medical supplies. Prior to the pandemic, China manufactured 48% of the United States’ personal protective equipment (PPE) and over 90% of certain active pharmaceutical ingredients (APIs). When factoring in other foreign suppliers, the U.S. imports nearly $30 billion worth of medical supplies every year, or roughly 30% of all its medical equipment. While this may seem like a fairly modest share, the pandemic has illustrated the profoundly negative consequences of relying entirely on foreign sources for particular items: over six months into the pandemic, there are more than 25 drug shortages related to COVID-19 (17 of which stem from increased demand) and critical shortages of PPE.

Aside from the public health consequences posed by such disruptions, failure to make domestic supply chains more resilient poses a national security threat. Accordingly, shoring up medical supply chains is a priority for both presidential candidates in the upcoming election. President Trump recently announced an executive order aimed at increasing U.S. domestic manufacturing and onshoring supply chains for pharmaceutical and medical supplies to protect against potential shortages. Similarly, Democratic presidential nominee Joe Biden put forth a five-page plan articulating actions he will take, if elected, to rebuild domestic manufacturing capacity and ensure U.S. medical supply chains remain resilient and geographically redundant in the future.

Likewise, the commercial sector has expressed similar sentiments. A recent study by McKinsey & Company found that 93% of surveyed supply chain experts listed increasing resiliency across the chain as a top priority, with an emphasis on supplier redundancy and near-shoring production. The U.S. medical supply chains of the future will emphasize localization to mitigate as many disruptions as possible. Such regionalization will also shorten the distance between suppliers and customers. As Brad Payne of PCI Pharma Services points out, “Shortening the distance in the supply chain expedites deliveries and lessens room for complicating factors, like customs clearance”.

Forward-looking supply chain leadership on both the provider and vendor sides will be investigating ways to leverage these new logistics and distribution paradigms to cut costs and improve the quality of their services. For example, predictive analytics can be used to improve last-mile delivery for medical supply manufacturers. Other industries have leveraged logistics data analytics in this manner to reduce fuel costs and improve the quality and performance of their deliveries. Healthcare supply chains of the future will leverage these technologies and other tools like RFID to provide more efficient deliveries and optimize procurement strategies.

4. Value-Based Procurement Strategies

Historically, providers have been paid through a fee-for-service model in which they are reimbursed based on the number of services they render or procedures they order. This strategy made sense when it was instituted as the reimbursement mechanism for Medicare and Medicaid in 1965, but it has perversely affected the way healthcare is paid for today. Physicians are incentivized to provide as many billable services as possible and to take a ‘defensive’ approach to healthcare, ordering procedures and tests just to be safe. Consequently, they may order unnecessary tests and procedures without hesitation, as neither they nor the patient are financially responsible, increasing the cost of care. Over the past decade, insurers have started implementing ‘value-based’ payment models that link reimbursement to patient outcomes in an effort to reduce the overall cost of care and improve outcomes. Early results suggest that value-based reimbursement can reduce the cost of claims by nearly 12% and improve chronic disease management. The benefits of value-based models by stakeholder may be seen below.

Image adapted from NEJM Catalyst

Prior to the pandemic, pure fee-for-service was expected to account for less than 26% of all reimbursements in the U.S. by 2021. Given the immense financial impact of the virus, the impetus to reduce costs by transitioning to value-based care has never been greater. Hence, value-based procurement, or ensuring that the right product is delivered to the right patient at the right time at the lowest cost, will be the norm in the medical supply chains of the future. As members of HIMSS have pointed out, the future of the healthcare supply chain will emphasize connecting cost, quality, and outcomes to help realize the Triple Aim and optimize provider performance.

Similar to the adoption of the rep-less sales models I described in Part I, the foundational technology underlying value-based procurement will be clinical supply chain integration. Connecting supply chain and clinical data sources enables providers to redefine how they administer care: once the data are integrated, predictive analytics can guide procurement decisions based on both the cost of a material and its outcome for the patient. Some hospitals have already saved over $15 million through clinical supply chain integration. Additionally, EHR integration with supply chain data allows providers to assess clinical variance in treatment in granular detail. Such functionality will be critical as providers look to minimize the impact of the pandemic, and it will continue to play a key role in how they operate into the future.
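As a simple illustration of outcome-weighted procurement (the device names and figures below are made up), integrated clinical and supply data let a purchasing team rank comparable devices by expected total cost of care rather than by unit price alone:

```python
# Illustrative sketch (assumed data, not a clinical model): rank comparable
# devices by unit price plus the outcome-weighted cost of complications.

def expected_total_cost(unit_price, complication_rate, complication_cost):
    """Unit price plus the probability-weighted cost of downstream complications."""
    return unit_price + complication_rate * complication_cost

devices = {
    "Implant X": {"price": 4500, "complication_rate": 0.06},
    "Implant Y": {"price": 5200, "complication_rate": 0.02},
}
COMPLICATION_COST = 40_000  # assumed cost of a revision/readmission

ranked = sorted(
    devices,
    key=lambda d: expected_total_cost(
        devices[d]["price"], devices[d]["complication_rate"], COMPLICATION_COST
    ),
)
# X: 4500 + 0.06*40000 = 6900; Y: 5200 + 0.02*40000 = 6000
print(ranked[0])  # the pricier implant wins once outcomes are priced in
```

The point of the sketch is the inversion: on unit price alone Implant X wins, but once EHR-derived complication rates are attached, Implant Y delivers the lower expected total cost of care.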

In Conclusion

I hope you enjoyed the second installment of my series on the future of U.S medical supply chains! I will be back next week with my final two insights as to what the future of these systems will look like. Interested in learning more or ways your medical supply chain could be improved? Feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next week!

Uncategorized

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part I

By Connor Smith

Without question, the dedication and tireless work ethic exhibited by medical first responders throughout these trying times has been nothing short of heroic. Unfortunately, despite (and partially because of) the valiant efforts on the ground, healthcare is among the industries hit hardest financially by the pandemic. A recent report by the American Hospital Association estimates the total financial impact of COVID-19 on the U.S. healthcare system from March 2020 to June 2020 at $202.6 billion. While much of this financial strain stems from treating patients infected with the virus and from forgone revenue from elective procedures, many problems brought on by the virus could have been mitigated with more efficient and resilient U.S. medical supply chains.

While the magnitude of a global pandemic is something few could have properly anticipated, COVID-19 introduced few truly new problems for U.S. medical supply chains. Instead, it exacerbated problems and inefficiencies that have plagued the industry for years and have never been properly addressed. Yes, over 70% of the active pharmaceutical ingredients supplying the U.S. healthcare market are imported and were constrained as a result of the pandemic, and unprecedented global surges in demand for PPE created massive shortages of basic medical equipment for U.S. frontline responders. However, siloed data, lack of automation and interoperability, and a reactionary approach to purchasing contributed to upwards of $25 billion in wasted supply chain spending each year long before the pandemic. In fact, analysts project that 2020 will be the first year in which supply chain costs surpass labor as providers’ largest cost component of delivering medical care.

Some are optimistic that a COVID-19 vaccine will be ready by the end of 2020, but this is far from guaranteed, and it could be a year or more before one becomes readily available. Consequently, the U.S. healthcare system faces a stark reality: it may continue to lose upwards of $20 billion per month for the foreseeable future. Major operational improvements must be made to its supply chains to help offset these costs.

The status quo of medical supply chain management is no longer tolerable, and inefficiencies that were previously ignored must be corrected. Medical supply chains need to be reinvented over the coming months if the U.S. healthcare system is to survive the pandemic and thrive into the future. This is the first of a three-part series in which I will explore six ways U.S. medical supply chains will change in the post-pandemic era and the technologies and external forces that will enable them. So without further ado, let’s dive into the future of U.S. medical supply chain management!

1. Automation & Digitization

While this may sound obvious, over 80% of clinicians and hospital leaders still rely on manual processes for managing inventory. Consequently, health systems struggle to know what supplies they currently own, where those supplies are located, whether they have expired, and a myriad of other problems. Unfortunately, patients pay the largest price for these manual, inefficient systems: one survey found that 40% of clinicians have postponed a patient’s treatment, and 23% know of adverse events occurring to patients, because of inadequate inventory. Considering that some industries are well into the ‘Supply Chain 4.0’ era, the first seismic shift in medical supply chain and inventory management will be to go entirely digital and implement technologies already used in more mature supply chains like retail.

Regulatory pressure from the FDA’s unique device identifier (UDI) requirements for medical devices and the Drug Supply Chain Security Act has already begun forcing some of this change, and major compliance requirements go into effect over the coming months. Digitizing medical supply chains will not only significantly reduce errors and inefficiencies arising from manual entry, but also enable the use of technologies like AI and blockchain that can yield even greater cost savings. For example, firms in other industries have seen cost savings of 32% across an entire operation by implementing AI-based inventory management systems. Such systems can improve inventory management with features like predictive forecasting and optimization of the surplus inventory levels that should be maintained at all times. Other heavily regulated industries, like food, are experimenting with blockchain for product traceability in supply chains and estimate they can reduce compliance costs by 30% within a few years.
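As a small example of the kind of forecasting a digital inventory system automates, the sketch below computes a textbook reorder point with safety stock (a standard formula applied to assumed demand figures, not any particular vendor’s method):

```python
# Hedged sketch: classic reorder-point calculation with safety stock,
# one of the basic forecasting features a digital inventory system automates.

import math

def reorder_point(daily_demand_mean, daily_demand_std, lead_time_days, z=1.65):
    """Expected demand over the supplier lead time plus safety stock.
    z=1.65 targets roughly a 95% service level."""
    safety_stock = z * daily_demand_std * math.sqrt(lead_time_days)
    return daily_demand_mean * lead_time_days + safety_stock

# e.g. N95 masks: mean 200/day, std 50/day, 9-day supplier lead time
rop = reorder_point(200, 50, 9)
print(round(rop))  # ≈ 2048 units: reorder when on-hand stock falls to this level
```

A manual process typically replaces this calculation with a clinician’s gut feel; automating it per item, per facility is where much of the claimed savings from digitization comes from.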

Certainly, much foundational work must be done before medical supply chains can integrate these more advanced solutions. Implementing RFID chips to enable real-time asset tracking, achieving basic systems interoperability through data standardization, and switching to cloud-based healthcare inventory management software are among the first steps that must be taken. However, given the lack of digital infrastructure currently in place in many medical supply chains, there is an opportunity to ‘leapfrog’ legacy supply chain technology and implement cutting-edge solutions that realize even greater cost savings immediately. Forward-looking medical supply chain management teams will implement such solutions to ensure their supply chains remain resilient against future disruptions and pandemics.

2. Rep-less Medical Device Sales Models

Manufacturers of implantable medical devices have traditionally used an in-person sales model for distribution. Sales reps form close relationships with clinicians and are oftentimes even present in the OR during surgery. While this model has been standard practice for decades, it also increases the cost of delivering care tremendously. In orthopedics, for example, it’s estimated that the sales process for implantable devices accounts for 35-50% of the cost of sales. Moreover, this sales process makes it nearly impossible for device manufacturers to track their field inventory and for providers to manage their consignment inventory, resulting in further cost increases of up to 25%.

The pandemic has not only made these inefficiencies unbearable for providers, but has also hindered the ability of manufacturers to sell their products at all. Whether through self-selected or provider-mandated deferral, 50% fewer patients have been receiving care compared to normal, and elective surgeries for implantable devices like knee replacements have dropped by 99%. Moreover, nearly 50% of orthopedic sales reps have been forced to support physicians exclusively through digital means or have been unable to provide support at all.

It is reasonable to suspect that providers will continue to limit who is allowed into a hospital at least until a vaccine is readily available, if not longer, and patients can only forgo necessary care for so long. Hence, providers and manufacturers alike will be required to implement technologies that make rep-less sales models attainable. One key technological enabler of this transition will be integrated environments of supply chain, medical IoT, and EHR data. Integrating supply chain data with EHR data had already been a top priority for many providers entering 2020. Such environments will serve as the cornerstone for other tools, like video conferencing software and payment processing, that can enable a rep-less sales model and save providers millions of dollars per year.

In Conclusion

I hope you enjoyed the first part of my series on the future of U.S. medical supply chains! I will be back next week with two more insights regarding what the future of these complex systems will look like. If you are interested in learning more or ways your medical supply chain could be improved, feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next week!

Uncategorized

Consensus Networks Will Support NuCypher

Consensus Networks is excited to announce its continued support of NuCypher through mainnet launch later this year. Consensus Networks has actively supported the network since the beginning of its incentivized testnet challenge, “Come and Stake It”, which launched earlier this year.

“We are big believers in NuCypher’s mission and see a major need for the work they are doing in enabling privacy-preserving applications and providing a secure compute platform,” said Consensus Networks’ COO, Connor Smith. “We will not only continue supporting the network by operating Ursula nodes on our infrastructure, but are actively building tools to help developers and organizations connect their applications to NuCypher and leverage the network to build the next generation of privacy-preserving dApps. We have had an absolute blast participating in all four phases of their incentivized testnet and cannot wait to help bring the power of the NuCypher network to the rest of the world.”

NuCypher aims to provide the cryptographic infrastructure for privacy-preserving applications by allowing users to manage secrets across dynamic environments, conditionally grant and revoke access to sensitive data for any number of recipients, and process encrypted data on a secure computation platform while preserving the confidentiality of the inputs and outputs. It is a proxy re-encryption network designed to provide cryptographic access controls for dApps and protocols by functioning as a decentralized key management service (KMS). If you have any questions about how to integrate NuCypher with your application, or whether it is the right fit for what you and your team are building, contact our team today!

About Consensus Networks.

Consensus Networks is an Indiana-based LLC that designs, builds, and manages dedicated infrastructure for blockchain technology. Founded in 2016 with the goal of providing direct, high-speed, low-latency network connections, Consensus enables clients to create, disseminate, and store information securely and efficiently. Consensus Networks uses advanced cryptographic techniques and smartly architected network designs to ensure maximum security and network uptime.

Uncategorized

Top Three Takeaways From Blockchain and Digital Transformation in Health in 2020

By Connor Smith

Healthcare is frequently mentioned on the shortlist of industries expected to be transformed by blockchain technology. Supporters have a range of reasons for arguing that blockchain can improve healthcare, most of which revolve around improving health data security, immutable claims tracking, physician credentialing, or putting patients at the center of care by letting them take charge of their own data. However, many argue that healthcare is not ready for blockchain or that many of its theorized use cases could be better addressed with other technologies. There are good arguments on both sides, and, given the complexity of healthcare and the nascency of blockchain, it can be difficult to discern which projects are ‘hype’ and which can actually drive meaningful impact.

We at Consensus Networks are bullish on the potential of blockchain in healthcare, but we also pride ourselves on taking a pragmatic view of projects to realistically assess what is feasible and what is not. This past week, we were fortunate enough to attend the inaugural Blockchain and Digital Transformation in Health in 2020 summit in Austin, TX, where our CEO, Nathan Miller, presented the work we have been doing with HealthNet and on developing highly secure information architectures in a regulatory environment. The conference was hosted by the Austin Blockchain Collective in conjunction with the Dell Medical School at the University of Texas at Austin. There were presentations from industry and academia alike, accompanied by open discourse about the state of blockchain in healthcare, what is actually feasible, and the path forward for the technology as healthcare begins its digital transformation. It was a great event with high-quality information and a pragmatic assessment of the state of the industry, and we’re here to share our top three takeaways from the event!

1.) Blockchain EMRs Are Not Ready… At Least Not Yet in the U.S.

Throughout the short lifespan of blockchain projects in healthcare, there have been several attempts at a blockchain-based electronic medical record (EMR) that is owned by the patient and shared with providers as needed, the most popular of which is probably Medicalchain. Medical records hold a wealth of information about an individual, containing everything from medical history to demographic, identity, and insurance information. To date, however, medical records have been largely owned and controlled by the health systems they reside within. Aside from issues of data sovereignty and controlling who has access to that information, isolated data silos have a decidedly negative impact on patient outcomes, especially in the U.S. Competing EMR systems are incapable of communicating well with one another, so if a patient visits multiple providers with different EMR systems, the data from those visits will likely never be aggregated into a single, cohesive file, remaining instead as isolated fragments. This makes it nearly impossible for a provider to know what care a patient has previously received, and it leads to billions of dollars wasted on redundant testing and unnecessary procedures, or, in the worst scenarios, to patient death from improper care.

A blockchain-based EMR would let patients own their own medical records, likely held in some form of mobile application. A patient could then control who has access to their record, with the guarantee that a provider sees the most up-to-date version, since any changes would be reflected in that copy immediately. All transactions would be immutably recorded on a blockchain, and once the visit was finished the patient could revoke the physician’s access. Conceptually, such a notion sounds appealing. However, one of the biggest takeaways from the conference was that such a future is far off in the U.S. and requires a societal shift and a fundamental rethinking of data ownership to get there.
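To make the mechanics concrete, here is a toy sketch (purely illustrative; it omits the consensus, encryption, and key management a real blockchain EMR would need) of a patient-owned record whose entries are hash-chained for tamper evidence and whose access list the patient controls:

```python
# Toy patient-owned record: hash-chained entries plus a patient-controlled
# access list. Illustrative only — not a real blockchain EMR.

import hashlib
import json

class PatientRecord:
    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.blocks = []          # each block: {"entry", "prev", "hash"}
        self.authorized = set()   # provider IDs the patient has granted access

    @staticmethod
    def _digest(entry, prev):
        # Canonical serialization + previous hash makes each block depend
        # on the whole history before it.
        payload = json.dumps(entry, sort_keys=True) + prev
        return hashlib.sha256(payload.encode()).hexdigest()

    def grant(self, provider):
        self.authorized.add(provider)

    def revoke(self, provider):
        self.authorized.discard(provider)

    def append(self, provider, note):
        if provider not in self.authorized:
            raise PermissionError(f"{provider} has no access")
        prev = self.blocks[-1]["hash"] if self.blocks else "genesis"
        entry = {"provider": provider, "note": note}
        self.blocks.append({"entry": entry, "prev": prev,
                            "hash": self._digest(entry, prev)})

    def verify(self):
        """Recompute every hash; any edit to past entries breaks the chain."""
        prev = "genesis"
        for b in self.blocks:
            if b["prev"] != prev or b["hash"] != self._digest(b["entry"], prev):
                return False
            prev = b["hash"]
        return True

rec = PatientRecord("patient-123")
rec.grant("dr_lee")
rec.append("dr_lee", "Annual physical: normal")
rec.revoke("dr_lee")          # visit over — access withdrawn
print(rec.verify())           # True: chain intact
```

Even this toy shows the two properties the conference speakers kept returning to: the patient, not the health system, holds the access list, and tampering with any past entry is detectable because every later hash changes.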

Dr. Aman Quadri, CEO of AMSYS Blockchain and AMCHART, was one of the speakers in attendance at the event. The product he is building, AMCHART, is a blockchain-based EMR that is currently undergoing testing in India, and even he was skeptical of its prospects in the U.S. Dr. Quadri said that the reason they have started seeing AMCHART adoption in India is that people there already have a mindset of data ownership. They take responsibility for their data, so a platform like AMCHART extends their current capabilities in a way that is beneficial. Dr. Quadri said that for AMCHART to have impact in the U.S., patients and health systems alike would need to change how they view and approach handling data before there could be a marked increase in value to patient care. He said that American patients have been conditioned for decades to blindly trust their data with medical providers, so shifting that view will be no easy task.

2.) Use Cases Are Emerging Around Care Coordination, Identity, and Data Sharing

The projects being spearheaded by the talented Dell Medical School faculty, visiting academics, and industry representatives in attendance covered a wide range of applications in healthcare, spanning both population health and the clinical setting. While the individual problems these solutions address vary, the common thread amongst most of them was that they centered around care coordination, identity, and data sharing applications. The consensus seemed to be that blockchain could help lay the foundation for a web of trusted access to data with the patient at the center of care.

Dr. Timothy Mercer, a faculty member at Dell Medical School and practicing physician, is exploring ways in which blockchain could be applied to help address homelessness in Austin. His research found that one of the biggest problems for the homeless population in Austin is the lack of any legal form of identification. As a result, they frequently must go through the process of proving who they are, which can take weeks to months to complete and delays physicians from providing critical care. If the documents are lost or stolen, the process must start all over again. Partly as a consequence, the average age of death of the chronically homeless is 52-56, nearly 20 years less than the global average. Dr. Mercer is exploring ways blockchain and digital identities could be used to ease this burden and accelerate the time to care for homeless persons in Austin. The homeless care ecosystem involves many different organizations, all of which must properly authenticate an individual before they can legally administer care. Using a blockchain-based identity application, these caregivers could verify an individual’s identity through a web or mobile application, using digital documents linked to the patient’s digital identity, and legally provide the care he or she needs. This would ultimately place the homeless person at the center of care and alleviate the inefficiencies pervasive in the current continuum of care for this patient population.

Image Adapted From Change Healthcare

Another interesting application highlighted at the event was Tribe Health Solutions’ use of blockchain in a medical imaging management solution designed for patients. Using blockchain technology and the InterPlanetary File System (IPFS), they created a platform where patients can store medical imaging data on a distributed file system and grant providers access when needed. After care is administered, the patient can revoke access to the image and ensure that only trusted providers can access it. This solution aims to help patients overcome many of the problems associated with receiving care for skeletal or muscular injuries. With such tears or breaks, patients often seek out multiple opinions and are forced to either manage a physical copy of the medical image themselves or wait days to weeks for the file to be transferred to the provider. This not only delays the time it takes for these patients to receive the care they need and start the recovery process, but in the worst-case scenarios can lead to a worsening of the condition. Putting the patient in charge of the imaging data allows them to determine who can view the image and when – ultimately reducing the time it takes to receive treatment.
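The access pattern described above – content-addressed storage plus patient-controlled grant and revoke – can be sketched in a few lines. This is a conceptual illustration only, not Tribe Health Solutions’ actual implementation; all names are invented, and a real system would also encrypt the image and record grant/revoke events on a blockchain.

```python
import hashlib

# Illustrative in-memory stand-ins for a content-addressed store (as on
# IPFS, where the hash of the data is its identifier) and an access list.
store = {}          # content hash -> image bytes
access_lists = {}   # content hash -> set of authorized provider IDs

def add_image(image_bytes):
    """Store an image; its content hash serves as its identifier (CID)."""
    cid = hashlib.sha256(image_bytes).hexdigest()
    store[cid] = image_bytes
    access_lists[cid] = set()
    return cid

def grant(cid, provider_id):
    access_lists[cid].add(provider_id)

def revoke(cid, provider_id):
    access_lists[cid].discard(provider_id)

def fetch(cid, provider_id):
    """Only providers the patient has authorized may retrieve the image."""
    if provider_id not in access_lists.get(cid, set()):
        raise PermissionError("provider not authorized for this image")
    return store[cid]

cid = add_image(b"...MRI scan bytes...")
grant(cid, "orthopedist_2")      # second-opinion provider gets access
assert fetch(cid, "orthopedist_2") == b"...MRI scan bytes..."
revoke(cid, "orthopedist_2")     # access ends once care is administered
```

The patient-facing application would wrap these grant/revoke calls, so the patient, not the health system, holds the switch.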

3.) Blockchain Projects Must Start Small in Healthcare and Involve Many Stakeholders

Image Adapted from AdvaMed

Lastly, perhaps the most prevailing takeaway from the conference was that blockchain projects looking to tackle problems in healthcare need to start small and involve as many stakeholders as possible from the outset. Healthcare is a highly complex industry where ‘moving fast’ and ‘breaking things’ can have significant ramifications for patients, especially considering that the industry is only now beginning its digital transformation. Dr. Anjum Khurshid of the Dell Medical School at the University of Texas at Austin, and a director of the Austin Blockchain Collective, is leading research on a patient credentialing platform called MediLinker. When pressed about the ability to extend the platform into an EMR-type technology that could exchange clinical data, Dr. Khurshid cautioned that it is more important to start small and clinically validate each stage of the product. In an industry like healthcare, which handles high-fidelity information and is typically averse to new technologies, Dr. Khurshid said it’s important to demonstrate the value of the technology and make it more approachable. He said that the problems in the current healthcare system are so vast that even simple solutions can have a massive impact, and that it is imperative to validate the benefit of the technology to patients, providers, and payers alike at each step. Any truly innovative and sweeping changes that are to take place in healthcare through blockchain will require all of these parties to work together and identify applications that can drive meaningful value for everyone involved. Healthcare is changing rapidly, and only by taking small, incremental steps will blockchain be able to integrate with the complex, multi-stakeholder ecosystem that is healthcare.

That’s all for this week! We at Consensus Networks are grateful to have been able to attend this conference and are excited about the work going on in Austin to advance the industry forward. We are continuing forward with the development and commercialization of our population health data sharing network, HealthNet, as well as smart tools for analyzing health data. If you are interested in learning more about HealthNet or have an idea for a new digital healthcare application you’d like to build, contact one of our experts here today!

Blockchain Learning, Proof of Stake

Celo’s Approach to Decentralization

By Connor Smith

Note: This is the final installment of a series exploring different approaches that blockchain networks have taken to achieve decentralization. Part 1 introduced decentralization and the inter-play it has with certain aspects of crypto networks like governance, incentives, and network architecture. If you missed that article I highly recommend going back and reading it here. The remaining articles have been examinations of the decentralization on Bitcoin, Factom, Cosmos, Terra, and Polkadot/Kusama. If you missed those and would like to go back and read them before we dive into Celo, you may do so, here, here, here, here, & here respectively.

Hey everyone, thank you for joining me again as our tour through varying approaches to decentralization comes to a conclusion. From the first crypto network, Bitcoin, to the new generation of ‘meta-networks’ like Polkadot and Cosmos, we have seen quite a few different ways networks have attempted to decentralize and how that has influenced their design. We have seen how factors like application design criteria and consensus architecture (i.e. proof-of-work vs. proof-of-stake) influence the decentralization of a network’s resources and participants. Moreover, by taking a chronological approach to the networks examined throughout this series, we have seen the evolution of the crypto industry over the better part of a decade, and we will be ending with the youngest protocol we have seen thus far, Celo. Aiming to overcome the price-volatility and ease-of-use problems associated with many cryptocurrencies, Celo seeks to bring financial infrastructure and tools to anyone with a smartphone.

To reach this goal, Celo is taking a full-stack approach and introducing innovations at the networking and application layers with technologies like a lightweight address-based encryption scheme, an ecosystem of stable-value tokens, and novel system architectures. The project is backed by some of the most prolific crypto investment partners and Silicon Valley tech titans like a16z, Polychain Capital, Coinbase, Reid Hoffman, and Jack Dorsey to name a few. The protocol has been making waves over the last few months due to its rigorous incentivized testnet competition, known as the Great Celo Stake Off, the rigor of which was recounted two weeks ago by our CEO Nate Miller. The Stake Off is entering into its third and final phase and the mainnet is slated to launch later this year, making 2020 a big year for the protocol. So without further ado, let’s dive into Celo!

So What is Celo and How Does it Work?

Celo was launched with the mission of building a financial system capable of bringing conditions of prosperity to everyone with a smartphone. To create such an environment, Celo has proposed a theory of change built on the following three verticals: satisfying people’s basic needs, like access to food and healthcare; enabling an individual’s growth potential; and increasing people’s social support for one another. All aspects of the protocol, from its architectural decisions to its development efforts, and even all technical and community projects, support activities tied to enabling these three conditions and ensuring such a system is created. Work on the project began in the summer of 2018, when entrepreneurs turned GoDaddy executives Rene Reinsberg and Marek Olszewski raised their initial seed round of $6.5 MM from some of the Valley’s top venture capitalists. The pair had sold their prior company, Locu, to GoDaddy in 2014 for $70 MM and had since been serving as vice presidents in the restaurant and small business division of the firm. Armed with little more than a white paper at the time, the team got to work, and in less than a year the first public testnet was released. Celo aims to achieve its mission of bringing conditions of prosperity to everyone through a mobile-only payments platform with the following two features: mapping users’ public phone numbers to the alphanumeric string (public key) needed to transact on the network, and using a network of stablecoins pegged to a basket of different cryptocurrencies to minimize price volatility.

The team believes that creating a price-stable, more user-friendly transaction experience is the only way that a cryptocurrency payments solution will see success, and thus has sought to redefine the entire blockchain networking stack to optimize for these characteristics. Hence, Celo is a mobile-first solution operating as a proof-of-stake smart contract platform based on Ethereum and composed of the following three components: lightweight identity for a better user experience, stability mechanism for stable-value currencies, and systems for incentives and governance to ensure platform stability. An image of the Celo stack may be viewed below.

Celo’s lightweight identity system utilizes a variant of identity-based encryption known as address-based encryption to overcome the user-experience issues traditionally associated with transacting cryptocurrencies. Instead of the canonical flow of downloading a wallet, generating a public/private key pair, and providing whoever is sending you crypto with a long hexadecimal address, Celo’s address-based encryption ties a user’s phone number directly to a Celo wallet. This allows the phone number, rather than the actual Celo address, to be used when a payment is initiated, simplifying the payment process. Additionally, only a cryptographic hash of the phone number is stored on the blockchain to preserve privacy. Celo also allows a user to link multiple phone numbers to his or her wallet address to protect against losing a phone or changing numbers. On top of this infrastructure, Celo utilizes a social reputation mapping algorithm known as EigenTrust. While the technical underpinnings of the algorithm are fairly complex, it functions similarly to Google’s PageRank algorithm but is designed for decentralized systems. In short, the algorithm defines a given phone number’s relative reputation score based on the number of phone numbers that are connected with and trust that phone number, coupled with the weighted reputation of those connections.
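The core mapping can be illustrated with a minimal sketch: only a hash of the phone number is stored, and a payment resolves that hash back to a wallet address. This is a toy under stated assumptions, not Celo’s actual registry contract (which, among other things, verifies ownership of the phone number and salts the hash); every name and address here is invented.

```python
import hashlib

# Hypothetical on-chain registry: hash of phone number -> wallet address.
registry = {}

def phone_hash(phone_number):
    """Hex digest standing in for the on-chain identifier; the raw number
    itself never appears on the blockchain."""
    return hashlib.sha256(phone_number.encode("utf-8")).hexdigest()

def register(phone_number, wallet_address):
    """Link a phone number to a wallet; multiple numbers may map to the
    same wallet, which is how Celo handles lost phones or changed numbers."""
    registry[phone_hash(phone_number)] = wallet_address

def resolve(phone_number):
    """At payment time, the sender supplies a phone number, not an address."""
    return registry.get(phone_hash(phone_number))

register("+15745550123", "0xA1B2C3")          # illustrative address
register("+15745550199", "0xA1B2C3")          # second number, same wallet
assert resolve("+15745550123") == "0xA1B2C3"
assert resolve("+15745550000") is None        # unregistered number
```

The practical effect is that sending crypto looks like sending a text message: pick a contact, and the network does the address lookup.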

Similar to Terra’s approach to creating a stablecoin network, Celo is also a network of stablecoins pegged to real-world fiat currencies and uses a seigniorage-based approach to maintain stability. For the sake of brevity, I am going to gloss over what stablecoins and seigniorage are, as I discussed them at length in my post on Terra, and instead dive into how they work in the context of Celo. Celo is designed to support an ecosystem of pegged stable currencies alongside the native token of the protocol, Celo Gold (cGold). The first, and currently only, stablecoin on the network is the Celo Dollar (cUSD), which is pegged to the price of the U.S. dollar. cGold has a fixed supply and is held in a reserve contract, where it is used to expand or contract the supply of cUSD to maintain a stable price through seigniorage. The network relies on a number of external oracles to provide feeds of the cGold price in USD and then allows users to exchange a dollar’s worth of cGold for cUSD and vice versa. When the market price of cUSD rises above the $1 peg, arbitrageurs may profit by exchanging a dollar’s worth of cGold with the protocol for one cUSD and selling that cUSD at the market price. Conversely, if the price of cUSD falls below the peg, arbitrageurs can profit by purchasing a cUSD at market price, exchanging it with the protocol for a dollar’s worth of cGold, and then selling the cGold on the market. Thus, network participants are able to profit in nearly any market condition. A more thorough examination of how Celo’s stability mechanism works may be found here.
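The two arbitrage directions can be captured in a small sketch. This is illustrative only (fees, slippage, and cGold price movement during the trade are ignored); the point is that the protocol’s fixed one-cUSD-for-a-dollar-of-cGold swap makes any deviation from the peg a profit opportunity.

```python
def arbitrage_profit_per_cusd(cusd_market_price):
    """Profit in USD per cUSD arbitraged back toward the $1 peg.

    The protocol always swaps one cUSD for exactly one dollar's worth of
    cGold (and vice versa), so the gap to $1 is the arbitrageur's margin.
    """
    if cusd_market_price > 1.0:
        # Above peg: swap $1 of cGold with the protocol for 1 cUSD,
        # then sell that cUSD at the higher market price.
        return cusd_market_price - 1.0
    # Below peg: buy 1 cUSD at the cheaper market price, then redeem it
    # with the protocol for a full dollar's worth of cGold.
    return 1.0 - cusd_market_price

# cUSD trading at $1.05: mint-and-sell nets 5 cents per cUSD.
assert round(arbitrage_profit_per_cusd(1.05), 9) == 0.05
# cUSD trading at $0.97: buy-and-redeem nets 3 cents per cUSD.
assert round(arbitrage_profit_per_cusd(0.97), 9) == 0.03
```

Either trade pushes the market price of cUSD back toward $1, which is what keeps the peg self-correcting.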

Celo also goes a step further to ensure stability, implementing a constant-product market-maker model to protect the cGold reserve from becoming overly depleted when the price of cGold supplied by the oracles does not match the market price. The mechanism dynamically adjusts the offered exchange rate in response to exchange activity and initializes a new constant-product market maker to trade cGold and cUSD whenever the oracle price of cGold is updated. Hence, if the oracle price is correct, the exchange rate determined by the constant-product market maker will be equivalent to that of the market and no arbitrage opportunity will exist. However, if the oracle price data is incorrect, the rates will differ and an arbitrage opportunity will exist until users exploit it enough to dynamically adjust the quoted exchange rate and erase the opportunity.
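A minimal constant-product (x·y = k) sketch shows why small trades execute near the seeded oracle price while large trades move the quoted rate, limiting how much of the reserve a mispriced oracle feed can drain. This is a generic illustration of the mechanism, not Celo’s actual implementation; reserve sizes and names are invented.

```python
class ConstantProductMaker:
    """Toy x * y = k market maker. In Celo's design, the reserves are
    re-seeded at the oracle price each time the oracle updates."""

    def __init__(self, cgold_reserve, cusd_reserve):
        self.cgold = cgold_reserve
        self.cusd = cusd_reserve
        self.k = cgold_reserve * cusd_reserve  # invariant product

    def quote_cusd_for_cgold(self, cgold_in):
        """cUSD received for selling cgold_in; the rate worsens as the
        trade size grows, so depleting the reserve gets ever costlier."""
        new_cgold = self.cgold + cgold_in
        return self.cusd - self.k / new_cgold

    def sell_cgold(self, cgold_in):
        cusd_out = self.quote_cusd_for_cgold(cgold_in)
        self.cgold += cgold_in
        self.cusd -= cusd_out
        return cusd_out

# Suppose the oracle reports $2 per cGold; seed reserves at that ratio.
amm = ConstantProductMaker(cgold_reserve=1_000.0, cusd_reserve=2_000.0)
small_trade = amm.quote_cusd_for_cgold(1.0)       # executes near $2/cGold
large_trade = amm.quote_cusd_for_cgold(500.0)     # far below $2 per cGold
assert 1.99 < small_trade < 2.0
assert large_trade / 500.0 < 1.5
```

If the oracle price is right, quotes match the open market and nobody trades against the maker; if it is wrong, arbitrageurs trade until the quoted rate converges, at bounded cost to the reserve.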

Celo’s Network Architecture

Image adapted from celo-org medium here

Consistent with its full-stack approach to creating a mobile-first financial system, Celo implements a novel tiered network architecture to optimize the end-user experience and maximize physical decentralization. Similar to other Byzantine Fault Tolerant (BFT) proof-of-stake networks we have seen so far, like Cosmos, the core network participant responsible for producing blocks and verifying transactions is the validator. Unlike other proof-of-stake networks that encourage anyone who is willing to do so to run a validator, Celo encourages only professional node operators to run validators on the network. For example, Celo strongly encourages running validators in a secure data center and has been auditing validators participating in the Stake Off to verify that this is the case. By maintaining a secure set of globally distributed validators, the network hopes to maximize security, performance, and stability. Celo has also attempted to implement safeguards against any single validator or organization gathering a disproportionate amount of the network’s resources by introducing validator groups. Instead of electing individual validators to participate in consensus, validator groups composed of a collection of individual validators are elected, and their member validators then compete internally to produce blocks. The actual election process and underlying mechanisms are far more involved and complex, so if you are interested in learning more, as I said earlier, check out this blog post from our CEO, Nate Miller, which explains the process in more detail. Validator groups have their own unique identity and a fixed size to make it difficult for a single organization to manage multiple groups and consolidate disproportionate influence, thus improving decentralization.

While the ability to run a validator is fairly restricted to professional node operators, there are two other tiers of nodes that anyone can run on the Celo network: a full node and a light client. The Celo application/wallet has a light client embedded within it that is optimized for mobile devices, so anyone running the software on their phone is running a light client. The requests exchanged across these light clients (i.e., sending and receiving transactions on the network) must be processed by full nodes, which receive a transaction fee for facilitating each transaction. People running full nodes can set a minimum service fee for processing transactions from light clients and refuse to perform the service if the fee they would collect is insufficient. The eventual goal of the protocol is to have these components operate such that light clients automatically choose full nodes to peer with based on cost, latency, and reliability. However, much fundamental network infrastructure must be laid down before this is achievable. An eventual flow of what this will look like, including validators, may be viewed below.

Image adapted from celo-org medium here
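That eventual fee-based peer selection could look something like the sketch below: a light client filters out full nodes whose fee exceeds what it will pay, then ranks the rest on fee, latency, and reliability. The scoring weights here are invented for illustration; Celo has not specified this algorithm.

```python
def choose_full_node(full_nodes, max_fee):
    """Pick the full node with the best fee/latency/reliability trade-off.

    Hypothetical scoring for a Celo light client; all weights are
    assumptions, not part of the protocol.
    """
    candidates = [n for n in full_nodes if n["fee"] <= max_fee]
    if not candidates:
        return None  # no full node will serve this client at its max fee

    def score(node):
        # Lower fee and latency are better; higher reliability is better.
        return node["fee"] + 0.01 * node["latency_ms"] - node["reliability"]

    return min(candidates, key=score)

nodes = [
    {"name": "a", "fee": 0.02, "latency_ms": 40, "reliability": 0.99},
    {"name": "b", "fee": 0.01, "latency_ms": 900, "reliability": 0.90},
    {"name": "c", "fee": 0.05, "latency_ms": 20, "reliability": 0.99},
]
best = choose_full_node(nodes, max_fee=0.03)  # "c" is filtered out on fee
```

The same structure also captures the full node’s side of the bargain: a node that sets its minimum fee above every client’s budget simply never gets selected.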

So How Does Governance Work on Celo?

At a high level, Celo has a governance model similar to many other proof-of-stake networks, where the weight of a particular user’s vote in the governance process is proportional to the amount of cGold they have staked and the duration of their stake. Similarly, Celo supports on-chain governance to manage and upgrade all aspects of the protocol, including upgrading smart contracts, adding new currencies, and modifying the reserve target asset allocation. Changes are currently made through a governance-specific smart contract that acts as the overseer for making modifications to other smart contracts throughout the network. The eventual goal of the protocol is to transition from this smart contract structure for on-chain governance to a Decentralized Autonomous Organization, or DAO, owned and managed by cGold holders. This could function similarly to how MakerDAO operates; however, it is far too early to speculate on how a Celo DAO would actually function. For more information on what a DAO is or how DAOs work, click here.

Any network participant is eligible to submit a proposal to the governance smart contract, so long as he or she is willing to lock a portion of his or her cGold along with it. A proposal consists of a timestamp and the information needed to execute the operation code should it be accepted by the network. Submitted proposals are added to a proposal queue for up to one week, where they are voted upon by cGold holders in hopes of advancing to the approval phase. Every cGold holder with a locked cGold account may vote for one proposal per day. The top three proposals each day advance to the approval phase, at which point the original proposers may reclaim their locked gold commitment. Approved proposals then enter the referendum phase, where they are voted upon by users. Any user may vote ‘yes’, ‘no’, or ‘abstain’, with the weight of their vote tied to their locked gold commitment. While yet to be implemented, Celo also intends to incorporate an adaptive quorum biasing component, like we observed in Polkadot, to accurately account for voter participation.
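The stake-times-duration voting intuition above can be sketched as follows. The exact weight formula is an assumption invented for illustration; only the shape (more locked cGold and a longer lock mean a heavier vote) reflects the description above.

```python
def vote_weight(locked_cgold, lock_duration_weeks, max_weeks=52):
    """Hypothetical weight: stake scaled by lock duration, capped at a year.
    Celo's real formula is not reproduced here."""
    return locked_cgold * min(lock_duration_weeks, max_weeks) / max_weeks

def tally(votes):
    """votes: list of (choice, locked_cgold, lock_duration_weeks) tuples."""
    totals = {"yes": 0.0, "no": 0.0, "abstain": 0.0}
    for choice, cgold, weeks in votes:
        totals[choice] += vote_weight(cgold, weeks)
    return totals

result = tally([
    ("yes",     1000, 52),  # full-year lock: weight 1000.0
    ("no",      3000, 13),  # quarter-year lock: weight 750.0
    ("abstain",  500, 26),  # half-year lock: weight 250.0
])
# A smaller holder with a longer lock outweighs a larger short-term one.
assert result["yes"] > result["no"]
```

Under this kind of scheme, long-term commitment, not just raw holdings, determines influence over the protocol.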

So How Decentralized is Celo?

Image adapted from celo medium account here

As I mentioned earlier, Celo has yet to launch its mainnet, so this discussion will be framed through the context of what has transpired throughout its ongoing incentivized testnet, Baklava. As of this writing, there are 75 different validator operators participating in the third phase of the Great Stake Off, but 152 validators in total and 88 validator groups. Moreover, Celo is debating expanding the active set of validators on the network upon the conclusion of the Stake Off. The active set currently stands at 100 validators, and the original plan was to wind down the number of possible validator slots to 80 before mainnet launch. However, Celo recently announced that it now plans to expand the active set to 120, so long as scale testing shows this to be permissible, given the active engagement that validators have shown throughout the Stake Off. Considering that Celo intends to allow only professional service providers to run validator nodes, this is a major step in decentralizing the network and ensuring a globally dispersed, resilient system.

When examining the allocation of resources across the Celo network, there is somewhat of a disparity between the highest-ranked participants and those at the bottom. For example, the top elected validator group has nearly 1.1 MM votes, whereas the lowest elected validator group has only slightly over 200K. Additionally, the top elected group has 5 different participants, whereas the bottom elected group has only one. This illustrates that on Celo it is the validator group, not the individual validator, that matters. The largest cGold holder within the top elected validator group has only 194K locked cGold, meaning that every member of that group holds less locked cGold than the one participant in the bottom group. Yet the group collectively is the highest-voted group, so its participants are more likely to participate in consensus and gather rewards. Metrics relating to the decentralization of full nodes and light clients on Celo are not readily available, since the network is still in the very early development stages. Consequently, it is difficult to quantify the degree of decentralization of these layers of the network. The Celo Wallet application is available for the Alfajores testnet on both the Apple App Store and the Google Play Store, with over 100 downloads for the latter, suggesting that there are at least 100+ light clients on the non-incentivized testnet Alfajores.

That’s all! I hope you have enjoyed this case-study approach to decentralization as much as I have. With the last phase of the Baklava incentivized testnet coming to a close within the next few weeks, mainnet launch slated for later this year, and the protocol’s recent announcement of Celo Camp to incubate and assist startups building on the platform, it is certainly an exciting time to be involved with Celo. The Great Celo Stake Off has been no walk in the park, but it has certainly stress-tested the network both technically and from an incentives standpoint. Excluding some economic barriers to entry for new validators attempting to enter the active set, it appears that Celo’s approach to decentralization has achieved its goal, at least physically. It will be interesting to see if this continues once true economic conditions are introduced on mainnet, but I am optimistic about the future of the network. If you are interested in seeing whether Celo is the right blockchain for your application, running a Celo cluster, or staking on Celo, contact one of our LedgerOps experts here. We have been involved with the protocol throughout the entire incentivized testnet and are currently in the second-highest voted group (Chainflow-Validator-Group), so we are highly familiar with the protocol. Thanks again and take care!