
Why We’re Building Casimir

Note: This started out as an internal document, reconfirming our priorities and values as we watched the First Contagion, the collapse of Luna, in early 2022. Today, we’re watching the Second Contagion and the collapse of one of the over-leveraged wildcat banks we allude to below. Although we didn’t predict the collapse of FTX, nor are we trying to predict what’s next, we continue to operate by our principles-first approach. That means working on solutions for our core principles of self-custody and peer-to-peer transactions in the Web3 space. It’s why we have never worked with or used FTX, and why our current roadmap remains unchanged. What the FTX collapse has done, though, is affirm our approach and hasten our desire to fix the UI/UX problems that create a barrier to self-custody for many users. We hope the following gives insight into why we’re taking the approach we are, and we hope you’ll join us on our journey to making Web3 better, easier, and safer for all.

It’s always a good idea to revisit your priorities and values, especially in this time of uncertainty in blockchain, cryptocurrency, Web3, and technology at large. We’ve written before about the value proposition of blockchain and why we’re building technology as close to the consensus layer as possible. Fundamentally, the consensus mechanism powers a new medium: the exchange of value without the need for a third party, i.e., peer-to-peer transactions. It’s clear that many of the current issues in the Web3 space are due to egregious speculative activity and attempted takeovers by centralized entities acting far away from the consensus layer.

We’ll briefly touch on some recent issues in Web3, the major reasons we think they occurred, and why we’re building Casimir to help fix this.

Bridge Attacks – In February 2022, the DeFi platform Wormhole was exploited for $325 million. Wormhole was a popular VC-backed blockchain bridge designed to let users access tokens across chains through a single access point. More recently, the Binance Smart Chain was exploited for over $100M. While bridges are a potentially convenient answer to the mass of protocols in existence, a single smart contract or hot wallet holding $100M+ in deposited tokens is proving too attractive a target for hackers. So far in 2022, over $2B worth of tokens has been stolen from bridges!

Decentralized in Name Only – The first warning bell of the impending 2022 cryptocurrency sell-off was the collapse of Terra. There is a range of reasons why Terra collapsed, but put simply, algorithmic stablecoins backed by digital assets face fundamental challenges due to the volatility of those assets. An early breakdown from Staking Rewards names the combination of an overreliance on the yield platform Anchor and significant off-chain usage on exchanges as a driving factor in the collapse. Those externalities, controlled by central entities, effectively subverted the consensus mechanism of the project by operating off-chain, where over-leveraged risk could not be observed. Additional issues were caused by a concentration of premined tokens in the hands of Terraform Labs, which essentially controlled protocol voting and overrode the desires of some in the community to reduce risk. A more recent postmortem in June 2022 showed that the liquidity issues and subsequent depegging of the UST stablecoin were caused by Terraform Labs itself.

The Rise and Fall of CeDeFi – Next to fall, and still unwinding, is the “Centralized Decentralized Finance” (CeDeFi) company Celsius. Companies like Celsius and BlockFi have driven huge growth in Web3 by offering high-interest yields on deposited tokens. They act as banks, but do a poor job of indicating the risk their depositors face, nor do they follow the same regulations as traditional banks. Celsius was exposed to Terra and potentially lost $500M there alone. More recent are revelations that Celsius executives cashed out just prior to the collapse and bankruptcy filing.

Last of the “(first) contagion” was the collapse of Three Arrows Capital. Ongoing investigations are looking at whether 3AC took large margin longs on cryptocurrencies through fraudulent activity and was subsequently liquidated over the past month of pullbacks. Overall, it sounds pretty bad for 3AC management, and they might be going to jail.

The unifying thread of these major collapses was the concentration of digital assets, and control over them, into single points of failure. Even worse, users themselves were in the dark, with little visibility into the behind-the-scenes actions of those companies. The latest round of speculative growth in Web3 was built on, in short, unsustainable, over-leveraged, unregulated wildcat banking, totally divorced from the core ideas of a decentralized currency. This mentality has unfortunately not changed since the beginning of the year, and more liquidity crises are not out of the question.

Unfortunately, all of these problems were intentionally created (not the fallout, of course): many players in the Web3 ecosystem today are attempting to rebuild traditional SaaS and fee-extraction business models by creating layers of complexity that separate users from the core Web3 value proposition of peer-to-peer transactions.

While the 2022 drawdown in Web3 did a lot to refocus the industry on its core principles, there are still growing centralization and regulatory concerns:

Ethereum Merge – Ethereum 2.0 staking is currently heavily concentrated among major cryptocurrency exchanges and the Lido pool. So far, just two centralized staking providers, Coinbase and Lido, have proposed almost 50% of Ethereum blocks post-merge. Control of cryptocurrencies by “banks” (Coinbase, Kraken, BlockFi, FTX, etc.) threatens the censorship-resistant features of the Ethereum blockchain. If such entities control the Ethereum blockchain while operating under U.S. regulatory policies, they must implement any and all controls required by law. In effect, cryptocurrencies would become fiat currencies, implemented by decree from the state.

If we are to avoid this scenario, we must help create a truly decentralized ecosystem where a few centralized entities can’t control the consensus mechanism of a Web3 protocol. We need native Web3 solutions: peer-to-peer, decentralized solutions and tools that empower users, not centralized market makers. We’re building Casimir to do just that.

Decentralization – Probably the most overused and watered-down word in the space is “decentralized.” Nearly everything in blockchain/Web3 is called decentralized, whether or not it actually is. The unfortunate reality is that many blockchains are decentralized in name only. A recent study by Trail of Bits for DARPA concludes that blockchains are fairly centralized. It reports that pooled mining gives Bitcoin a Nakamoto Coefficient of 4, and that Proof of Stake protocols aren’t much better. I won’t get into criticism of the overall piece by Trail of Bits, particularly its conflation of mining pools with protocol control for Bitcoin, but the Nakamoto Coefficient for Proof of Stake is worth analyzing. Chris Remus of Chainflow has written extensively on staking decentralization and maintains a live Nakamoto Coefficient tracker that predates the Trail of Bits report. The Nakamoto Coefficient is a measure of decentralization: by definition, the minimum number of entities needed to control the consensus mechanism of a protocol. The lower the number, the less decentralized the protocol. At the time of this writing, some major protocols have very low Nakamoto Coefficients; of note, Polygon is at 3.

The goal of Proof of Stake protocols should be the highest Nakamoto Coefficient possible, which would make the protocol very difficult to manipulate since doing so would require the simultaneous compromise of hundreds of nodes. For example, Cosmos has an active set of 150 validators around the world. Compromising all of them would likely be impossible; however, the Nakamoto Coefficient of Cosmos is only 7, meaning that controlling the consensus mechanism of Cosmos would only take a compromise of the top 7 validators. A tough job, to be sure, but a lot easier than compromising all 150 active validators in the Cosmos ecosystem.
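The Nakamoto Coefficient can be computed directly from a stake distribution. The sketch below uses the one-third threshold relevant to Tendermint-style BFT chains like Cosmos (other analyses use different thresholds, e.g. a majority of hash rate for Proof of Work); the stake amounts are invented for illustration.

```python
def nakamoto_coefficient(stakes):
    """Smallest number of validators whose combined stake exceeds one
    third of the total -- enough to halt a BFT-style chain's consensus."""
    total = sum(stakes)
    running = 0
    for count, stake in enumerate(sorted(stakes, reverse=True), start=1):
        running += stake
        if running > total / 3:
            return count
    return len(stakes)

# A toy distribution: a few heavy validators and a long tail of small ones.
concentrated = [300, 250, 200, 100, 50, 40, 30, 20, 10]
print(nakamoto_coefficient(concentrated))   # a low coefficient

# The same total stake spread evenly needs more validators to compromise.
even = [100] * 9
print(nakamoto_coefficient(even))           # a higher coefficient
```

Note how the evenly spread distribution yields a higher coefficient than the concentrated one with the same total stake, which is exactly the argument for spreading delegations beyond the top of the list.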

What this means in practice is that the allocation of staked tokens should be spread across all validators as equally as possible, not continually concentrated in a few of the already heavily staked validators.

So why are Nakamoto Coefficients so low? Let’s talk about the user experience.

The Crypto Experience

User experience

The Web3 user experience today… sucks. You’re forced to either leave significant returns on the table and surrender control of your assets to a major exchange, or endure the inconvenience of manually staking across multiple protocols, wallets, platforms, and websites. It’s harder to know what’s going on, and it becomes easier to get scammed through faulty or malicious smart contracts.

What Web3 Looks Like Today

The easiest way to manage multiple digital tokens and assets is through centralized exchanges like Coinbase, which leave a lot to be desired. You give up custody of your tokens, and if you’re staking, you’re missing out on potential rewards that Coinbase scoops up in the form of third-party fees. If you’re more adventurous, you may use multiple wallets and multiple staking websites. You get the benefits of self-custody but are forced to manage the wide range of websites and wallets needed to interact with the various protocols. It becomes confusing to manage and monitor everything, and there aren’t any good solutions today that help you consolidate it all.

What’s more, current Web3 non-custodial products, like MetaMask, fall far short of protecting users from scams or from interacting with bad smart contracts. Because cryptocurrencies are so difficult to interact with and understand, even seasoned pros get manipulated and hacked.

How MetaMask Responds to UI Criticism

Let’s look at how this poor user experience affects even the consensus mechanisms of PoS protocols. One of the easiest ways to stake in the Cosmos ecosystem is Keplr, a mobile/web wallet that lets you stake to any of the Tendermint-based protocols. However, users trying to stake with Keplr aren’t given much to work with.

The Staking Page for Cosmos

A new staker has no way of deciding whom to stake with. There is no easy way to determine whether a listed validator is reliable or participates in the governance of a protocol. Users have no real reason to choose a validator outside of the top ten, because there are no tools to sort and research individual validators. So people end up picking validators from the top of the list due to the appearance of quality. We can see this effect in the Nakamoto Coefficient of Cosmos today, which is 7. What’s more, two of the top five validators for Cosmos are cryptocurrency exchanges. Cryptocurrency exchanges thus have an outsized impact on the consensus mechanisms of Proof of Stake protocols.

So, we’re left where we started. Exchanges offer the best user experience and are gaining control over Proof of Stake protocols. Since exchanges are likely to be regulated more like banks in the future, we are looking at a future where Proof of Stake is controlled by banks. What this means is that they control consensus. They can censor accounts, users, or transactions that they don’t like or are told to by the government. That’s a fundamental threat to the idea of decentralization and Web3 as a whole – an uncensorable digital currency.

Our conclusion is that poor user experience is driving centralization and will continue to lead to major single points of failure like Celsius unless we create tools that allow users to take full advantage of the protocols they use.

How we’re building Casimir

First, we reexamined how Web3 is being built today. It’s often been stated that Web3 is “going to be just like the internet.” It’s certainly true that there may be some parallels in growth trajectory and societal impact; however, for many projects in the space today, “just like the internet” means being built on today’s internet: AWS/Google Cloud, numerous HTML websites, and centralized SaaS powerhouses. With Casimir, we want to break the paradigm of today’s Web3 and reexamine how users interact with and use blockchains, digital value, and Web3 overall.

We are getting off the Web 2.0 rails and building something new, a native Web3 experience that prioritizes decentralization, user experience, and user control. We’re building the first true Web3 portal, capable of integrating with any wallet, any blockchain, and any token, allowing users to easily navigate Web3 and interact with the protocols directly, not through a centralized exchange or a variety of unconnected websites.

How We’re Designing Casimir

Improving the User Experience through Decentralization

We’re starting bottom up. Unlike current UIs, designed with traditional Web2 architectures, we’re starting at the Consensus and Infrastructure layers of Web3. These layers of decentralized node infrastructure providers hold fully indexed blockchain databases, provide APIs for querying, a network of worldwide decentralized nodes for consistent uptime, and build blocks of transactions as they are added to the blockchain. Today, most users are forced through third parties to access blockchains, which introduces extra costs for transactions and token management. By accessing these nodes directly, users are assured of uptime, uncensorable and low cost transactions, and minimized fees taken by the normal third party intermediaries. Also, with the right tools, users can access on-chain analytics and other information that these nodes carry. This information can protect users by providing transparency to the entities they’re interacting with as well as information about smart contracts and other on-chain data. Today there simply aren’t good enough tools to make on-chain information available and usable to the everyday user.

There are 3 key areas we’re focusing on as we design Casimir: Usability, Security, and Transparency.

Usability: Similar to Mint or Personal Capital, Casimir will be a place where users can aggregate their digital currencies and assets, making it easy to manage what they have across the various protocols they use. Many Web3 users hold multiple wallets and assets from a variety of protocols, so a single location to view and manage those assets is much needed, so long as it doesn’t become a single point of failure for any stakeholder. With our multi-chain approach and Multiwallet Connect, we can effectively be an interoperability solution without the bridge.

Casimir will do more than a Mint, however: it will let users interact with their chosen protocols, access mints and air-drops, stake and manage their digital currencies across protocols beyond Ethereum, and use specialized tooling that helps protect them. We’ll build and continue to add features like these that help users use Web3.

Our business model isn’t built around trading or exchange fees. Unlike an exchange, we’re not front-running trades or building in hidden custodial fees. Our base product will always be free to use; we’ll make money by offering a premium subscriber product as well as through our infrastructure services. We believe you’ll not only have a better user experience but will actually save money as well.

Security: Unlike most centralized exchanges and custodians, we will never take custody of users’ wallets or tokens. This means we can leverage existing on-chain security to protect users at a much higher level. It also means we will never face liquidity problems or trade users’ tokens on the backend. Although Casimir will be a single site, it won’t be a single point of failure. The code is open source, and we will never take custody of users’ digital tokens or NFTs. If Casimir goes away tomorrow, no funds will disappear, and users will still have access to all of their tokens.

Unlike traditional Web2 companies, we’re not building around user account management, user analytics, and productizing the user. We’ll never ask for a user’s email address and build an internal profile: not only does this create a security vulnerability for our users, it’s also unnecessary. Our users will always be able to log in through their wallet, which means they will always control their login credentials.
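Wallet login generally works as a challenge-response flow: the server issues a one-time nonce, and the wallet returns a signature over it, so no password or email ever changes hands. The sketch below illustrates the flow; real wallets sign with an ECDSA private key, but since Python’s standard library has no elliptic-curve signing, HMAC over a shared secret stands in for the signature, and all names here are hypothetical.

```python
import hashlib
import hmac
import secrets

# Stand-in for a wallet keypair: real wallets sign with an ECDSA private
# key, while this sketch uses an HMAC secret to play the same role.
wallet_secret = secrets.token_bytes(32)

def issue_challenge() -> str:
    """Server side: a one-time nonce the wallet must sign."""
    return secrets.token_hex(16)

def wallet_sign(nonce: str, secret: bytes) -> str:
    """Wallet side: 'sign' the nonce (HMAC standing in for ECDSA)."""
    return hmac.new(secret, nonce.encode(), hashlib.sha256).hexdigest()

def verify_login(nonce: str, signature: str, secret: bytes) -> bool:
    """Server side: accept the login only if the signature checks out."""
    expected = hmac.new(secret, nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

nonce = issue_challenge()
sig = wallet_sign(nonce, wallet_secret)
print(verify_login(nonce, sig, wallet_secret))            # True
print(verify_login(nonce, sig, secrets.token_bytes(32)))  # False: wrong key
```

Because the nonce is single-use, a captured signature can’t be replayed, and because verification only needs the public side of the key in the real ECDSA setting, the server never holds anything worth stealing.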

As part of our usability effort, we’re building a smart contract analyzer that shows users what an interaction with a smart contract will *actually* do, lets them monitor the smart contracts they’ve granted permissions to, and lets them revoke permissions on old contracts. Because we work at the protocol level, we can provide users with real-time information and on-chain analytics to help them make the best decisions with their digital assets.

Transparency: As the name indicates, every action on a public blockchain is publicly accessible. Every wallet, every transaction. This transparency is unique among financial systems, where the books of banks or governments are not available to everyday users. Today, many Web3 financial providers continue to hide behind proprietary systems, and their financial solvency is visible only to a select few. What’s worse, these companies (in the US at least) often exploit regulatory loopholes to avoid the audit requirements banks face.

In a world where Bitcoin was launched in the face of a banking crisis, with a desire to bring about a new and transparent financial system to the world, the actions of many major players in the space today are in direct opposition to the values of Web3.

Casimir will help change this. We leverage our fully indexed chains to provide transparency analytics to our users. While block explorers and address balances are always available to those who know how and where to look, we’re making it easier for users to interact with and use indexed Web3 information. We’ll let users easily sort data, see large and identified wallets, track large transactions, and match wallets with organizations so that proof of reserves can be verified.

We’re here to create a better Web3 user experience. For us, that means enabling users to better use the decentralized capabilities of the space, not to trade better in the crypto casino. We have a long road ahead, both to build something better and to help users learn the importance of self-custody and what decentralization truly means.

Over the next few months we’ll present some of the specific technology developments we’re working on to help achieve our goals, including non-custodial Ethereum staking, cross-chain wallet integrations, and cross-chain single sign-on. You can follow our progress on GitHub and join us on Discord. We’re striving to create an open ecosystem that empowers the user, and we hope you’ll join us.


Our Response to NIST

The National Institute of Standards and Technology requested information for a congressional study regarding emerging marketplace trends in a variety of areas, including blockchain and IoT. Read our response below as we discuss applications of Decentralized Computing, how consensus works, and what’s next.

Abstract

This paper contends the following: blockchain as a ledger or immutable database alone is a limited technology with few real use cases. The true value of the technology lies in the consensus mechanisms that enable its real value proposition: digital value and distributed peer-to-peer trust. This paper outlines why consensus is the true innovation in the blockchain space, some methods of consensus, and applied use cases. The applied use cases discussed are digital tokens; defense applications and distributed trust; and IoT and distributed data exchange in industry, including energy. It is important that this reviewing body consider the implications of such a technology in order to create smart policy. Lastly, we encourage regulatory frameworks in the space to focus on enabling technology growth on public, decentralized networks where entrepreneurs are free to build and innovate.

Introduction

We believe the significance of “blockchain technology” is the creation of new value models, consensus mechanisms, and tokenization itself, not the “blockchain” data structure. Blockchain technology is a broad, oft-misused term applied to many things, frequently just an immutable database. An immutable database, while having limited use cases in regulatory and legal contexts, is not a revolutionary technology in and of itself. We must distinguish the technology’s core capabilities both from the sensational, speculative hype surrounding cryptocurrencies and their derivatives like NFTs, and from enterprise efforts that treat blockchain as a limited database technology. Cutting through the confusion, we will look toward the core capabilities of the technology and where they will lead: a future of digital value exchange.

The true innovation and revolutionary technology is the consensus mechanism of the Decentralized Computing platforms powering cryptocurrencies. Just as the computer enabled machines to create and store data, and the internet allowed machines to exchange data, Decentralized Computing enables machines to create, transact, and store value. This mechanism enables trust between independent users around the globe who do not need to know each other or rely on a third party to moderate their interaction. Value exchange has been democratized for the first time in human history, enabling all users to control their own credit, without reliance on a third party like a bank, government, or temple to set exchange rates and mediate value exchange.

This is a fundamental change in the way we will interact with currency and value in our lives. In much the same way past revolutions displaced the main players in the industries they disrupted (the Industrial and Digital/Internet revolutions), so too will Decentralized Computing continue to disrupt established banking and financial institutions, and beyond them, the centralized enterprises that currently control information flow on the internet.

We believe very clearly that the future of “blockchain,” or Decentralized Computing, will be built on open, decentralized public protocols like Bitcoin and Ethereum, for the reasons outlined in this paper. It is vitally important that innovation, use cases, and regulation in this space are designed with this in mind. In short, we believe public, open, decentralized computing networks utilizing Proof of Work, Proof of Stake, or another yet-to-be-designed mechanism will win out due to their unique incentive structures, which draw in a wide range of users. Combined with an open architecture that allows anyone to build, public networks will create a myriad of use cases even beyond those discussed below. It is important that builders on these free and open systems be enabled to innovate and create in order to maximize the potential of Decentralized Computing technology.

Overview of Consensus

Briefly, a consensus mechanism is designed to enable networked computers to agree on (1) a set of data, (2) modifications to or computations with that data, and (3) the rules that govern that data’s storage and computation. It is consensus that allows a cryptocurrency like Bitcoin to exist, not “blockchain technology.” For example, in the Bitcoin network, a set of rules governs the inflation and maximum supply of Bitcoin (21 million). These rules are agreed to by network participants, and over time, trust builds that Bitcoin tokens will continue to conform to the rules of the network. It is this part of the distributed consensus mechanism that allows Bitcoin to have and retain value. Consider the US dollar: there is no limit on the number of dollars that can exist; however, the strength of the US economy, military, and government creates a trusted environment where the dollar can have value, even though it is a ‘fiat’ currency. Because Bitcoin has no central government or physical assets, trust and value must be built and maintained through the consensus mechanism. To this end, Bitcoin created a unique incentive structure as part of its Proof of Work consensus (explained later) that makes participants in the network want to conform to and maintain the rules so that value can grow.
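The 21 million cap is a good example of such a rule: it is not stored anywhere as a constant, but follows from a rule every participant enforces, a 50 BTC block reward that halves every 210,000 blocks. A few lines of Python recover the cap from that rule:

```python
def total_bitcoin_supply() -> float:
    """Sum block rewards across all halvings: 50 BTC per block, halving
    every 210,000 blocks, computed in integer satoshis (1 BTC = 1e8 sat)
    because the protocol truncates sub-satoshi amounts."""
    reward = 50 * 100_000_000  # initial block reward in satoshis
    total = 0
    while reward > 0:
        total += 210_000 * reward  # all blocks in this halving era
        reward //= 2               # integer division, as the protocol does
    return total / 100_000_000     # back to BTC

print(total_bitcoin_supply())  # just under 21 million
```

The result lands slightly below 21,000,000 because the integer truncation at each halving shaves off fractions of a satoshi; the cap is an emergent property of the rule, enforced by every node that validates blocks.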

This network security is also what allows an open and permissionless network. Anyone can build on these networks without fear that any single builder could destroy them, and this is partially what makes the technology so powerful. When that openness is removed and the network becomes a Consortium or private model, the ability to create is limited and becomes mediated through the enterprises controlling the network.

There are several Consensus Mechanisms in use today: Proof of Work, Proof of Stake, Consortium, and Proof of Authority. These mechanisms provide the backbone for Decentralized Computing networks and we’ll see that the open mechanisms, Proof of Work and Stake, do far more than simply act as a “blockchain ledger.”

Proof of Work

Proof of Work is the original consensus mechanism designed by the pseudonymous Satoshi Nakamoto. It is quite simple: network participants use their computers to solve a math problem, and the first computer to solve it is rewarded in Bitcoin along with the ability to process a batch of transactions, earning those fees as well. This has been occurring almost non-stop since the launch of Bitcoin in 2009. The mechanism adjusts the problem’s difficulty as computers join the network, so that a Bitcoin block (and reward) is mined roughly every 10 minutes. The reward from Proof of Work makes for a powerful incentive. Participating computers, especially today as the network’s computing power has grown exponentially, spend quite a bit of money to purchase, maintain, and run their equipment. They must ensure the rules of the network are maintained so they can continue to earn money from transaction fees and block rewards. Someone looking to disrupt the network would need to deploy an incredibly expensive amount of computing power to manipulate blocks or transactions, which is not worth it for any individual or company. This methodology has proven very successful, though there are two major concerns. The first is the amount of electricity required to maintain the network; a full answer is outside the scope of this paper, but thus far, predictions of future energy usage have been quite incorrect, and there is no indication that Bitcoin is generating demand for new energy sources, only utilizing excess energy in specific areas. The second concern is the ability of new protocols to launch their own Proof of Work networks. Since there is so much computing power in the cryptocurrency space today, it is easy for an attacker to disrupt and destroy smaller Proof of Work networks.
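A miniature version of the Proof of Work puzzle fits in a few lines of Python: search for a nonce that makes the block’s SHA-256 hash start with a required number of zeros. Bitcoin’s real difficulty target and block header format differ, but the principle, and the way adding one more required zero multiplies the expected work, is the same.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Find a nonce such that SHA-256(block_data + nonce) starts with
    `difficulty` hex zeros -- a toy version of Bitcoin's puzzle, where
    each extra zero multiplies the expected search time by 16."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

# Illustrative block contents; difficulty 4 takes ~65,000 hashes on average.
nonce = mine("block 1: alice pays bob 1 BTC", 4)
print(nonce)
```

Finding the nonce is expensive, but anyone can verify it with a single hash, which is the asymmetry that lets the whole network cheaply check a miner’s claimed work.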

Proof of Stake

Two other key limitations of Bitcoin drove innovation in new consensus methodologies and resulted in Proof of Stake. The first is that Bitcoin is stateless, or non-Turing-complete: it can’t do much other than manage the transfer and storage of Bitcoin. There isn’t a problem with this, per se, but innovators were interested in building an “internet computer,” a decentralized computer with built-in payment rails that would enable users to create a new generation of computer programs and new types of digital assets, all natively decentralized and uncensorable (streaming platforms, social media, music, art, etc.). This required, in part, a new methodology to overcome the second limitation of Bitcoin, the speed of the network (Bitcoin blocks only come once every 10 minutes), while still maintaining the overall security of the public network. Proof of Stake emerged as the most likely solution and works as follows: instead of rewarding computers based on the computing power they contribute, Proof of Stake rewards participants for the number of network tokens they have “staked,” or deposited, on the network. Participants can choose from a variety of network computers, or Validators, to stake to, and want to choose a Validator who will properly secure and maintain the network; otherwise, their tokens lose value. This incentive structure has so far proven a worthy competitor to Bitcoin, and most new network launches (Cosmos, Solana, Polkadot, and others) have utilized a form of Proof of Stake. The Ethereum network, the original Turing-complete decentralized network, is planning to shift from Proof of Work to Proof of Stake later this year (2022).
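The stake-weighted selection at the heart of Proof of Stake can be sketched in a few lines: each Validator’s chance of proposing the next block is proportional to the tokens staked with it. The validator names and stake amounts below are invented for illustration, and real protocols use more elaborate (often deterministic) selection rules.

```python
import random
from collections import Counter

# Hypothetical stake distribution: token amounts delegated per validator.
validators = {"val-a": 500, "val-b": 300, "val-c": 150, "val-d": 50}

def pick_proposer(stakes: dict, rng: random.Random) -> str:
    """Choose the next block proposer with probability proportional to
    stake -- the core selection rule many PoS designs approximate."""
    names = list(stakes)
    return rng.choices(names, weights=[stakes[n] for n in names], k=1)[0]

rng = random.Random(42)  # seeded so the simulation is reproducible
tally = Counter(pick_proposer(validators, rng) for _ in range(10_000))
print(tally.most_common())  # proposal counts track the stake ratios
```

Over many rounds, proposal counts track the 500:300:150:50 stake ratios, which is also why concentrated stake translates directly into concentrated control of block production.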

There are two major concerns with Proof of Stake. The first is that, compared to the Bitcoin network, Proof of Stake is a newcomer and not as established or tested. Second, there are questions about the equitability of Proof of Stake. Since it is purely token-based, most of the token supply tends to consolidate with a few wealthy token holders on a few wealthy Validators. This potentially increases the likelihood of collusion and network manipulation. Proof of Work is seen as more equitable since any network computer can mine Bitcoin without holding any Bitcoin to begin with; however, the network has become so large that a single computer, or even a small data center of computers, is unlikely to mine Bitcoin consistently, so it too has tended toward centralization among the large “miners” who can afford the expensive equipment needed.

Consortium

Consortium consensus has been favored among enterprises, like IBM and Walmart, experimenting with blockchains. It grants privileges to known members to participate in consensus but is closed to the public. As with our Walmart example (see the Trade section below): who determines the identity of the participants? Who decides exchange rates or data control? What happens if one of the participants is compromised? Participants in these networks must have a level of trust in each other outside of the network itself, severely limiting their use cases to very specific examples.

Proof of Authority

Proof of Authority, or social consensus, is an alternative consensus mechanism that requires neither the computing power of Proof of Work nor the expense of holding tokens for Proof of Stake. Proof of Authority grants members the ability to elect (or demote) other users on the network through voting. It is similar to Consortium consensus in that participants must build trust with other participants to gain access, but there are controls in place to remove underperforming or malicious participants. The trust is distributed across the participants, enabling the network to be public and open. The major flaw in Proof of Authority is the general politicking associated with a decentralized group of participants attempting to make decisions about changes to the network or admitting new members. Factom, one of the earliest decentralized networks to launch after Bitcoin, has faced these challenges, which have crippled the network to the extent that it is shifting to Proof of Stake later this year (2022).

Recommendations

1. Commission a peer-reviewed study of the energy usage and growth of the Bitcoin network. Calculations and predictions thus far have been incorrect.

2. Evaluate and enact policies that promote innovation and development utilizing open and decentralized network protocols.

Use Cases

Finance

Decentralized finance, or DeFi, is the most clear-cut use case for Decentralized Computing today. We won’t focus on this too much, as it is written about elsewhere by more qualified individuals, but suffice it to say that, as noted above, DeFi enables a democratization of finance, letting users engage in a new world of value creation and exchange. Needless to say, this area of disruption is rife with optimism as well as fraud, and until more specific regulation is applied in the United States, this “wild west of digital gold” will continue.

Trade

Supply chain is often the immediate answer when someone asks, “What else can blockchain do other than cryptocurrencies?” This is due to the above-noted misunderstanding that “blockchain” is just an immutable database. Often cited is Walmart utilizing blockchain to track its mango and pork supply chains (in a few specific geographic areas). While better traceability in a supply chain, to help prevent disease outbreaks and sustainably source goods, is indeed a noble cause, all of this could have been done (and probably should already have been done for most of our food supply chain) with traditional web and digital technologies. The speed at which Walmart can audit its supply chain also has little to do with blockchain and everything to do with the digitization of that supply chain. Walmart’s “blockchain” is just supply chain software that Walmart controls and forces its suppliers to record data to. There is no real consensus mechanism, there is no real value creation or exchange on the platform, and it is a closed architecture, meaning only federated users can participate. If Walmart changed data on the platform, who would know?

We actually do believe that trade and supply chain are a powerful use case for Decentralized Computing, but not for the reasons Walmart is using it. Trade is often a complex operation, requiring stakeholders across the spectrum, from governments to suppliers. Significant value is generated as goods move through a supply chain, and this value is transacted across many distributed parties and stakeholders. As supply chains become more digital, a significant need is data standardization across the supply chain. This is easy to do in a simple supply chain within the borders of one country. However, in a globalized world, goods transit the globe through many different countries, each with its own regulations and varying trust in the others. Decentralized Computing offers a unique opportunity for countries to agree on a common data standard, enable transparency across supply chains, and facilitate a more efficient flow of goods across borders without requiring outside trust of the members.

Goods flowing across borders encounter a wide range of regulations, paperwork, taxes, and inspections. An open protocol would enable the standardization of data across borders while letting participating countries retain their own individual regulations, digital paperwork, and tax revenue. This technology could be combined into a single system where data, goods, and money flow simultaneously as a product traverses its supply chain. This would greatly reduce trade barriers and overhead while providing increased transparency of the supply chain and ensuring compliance.

Internal trade is also a potential use case. In many industries, particularly healthcare, supply chains are managed by third parties who introduce unnecessary overhead and complex contract agreements between manufacturers and end users, like hospitals. This also creates a large information barrier: manufacturers are unaware of the long-term quality of their products and how often they are used, and have only delayed demand information, making it difficult to build a dynamic supply chain. In many cases, manufacturers are forced to purchase old data from purchasing agencies like GPOs just to better understand their own equipment. The reverse is true as well: hospitals are locked into complex contract agreements with GPOs that provide very little price transparency or resiliency. A trusted, decentralized network of manufacturers and end users interacting directly with each other would introduce additional transparency and information flow between stakeholders, creating a more resilient and cost-effective supply chain. Recent regulatory changes, including the 21st Century Cures Act Final Rule, created a data standard for health information that could be leveraged to provide even better quality assessment and demand data to manufacturers.

Defense

The US Department of Defense (DoD) is an early adopter of “blockchain” for supply chain, and, unlike Walmart, it has some good reasons. First, traceability is essential for many parts and products in the DoD supply chain. Certain parts are incredibly valuable and could mean life or death, or even impact national security; as such, a heavy-handed immutable database tracking these parts can find use in that environment. For example, the highly successful SUBSAFE program already uses an amended-only paper trail for ship-safety repairs on submarines. An immutable database in this instance could dramatically improve a paper-heavy and cumbersome process while still preserving ship safety. Again, these use cases are limited in nature and don’t really address a key problem in vital supply chains, namely data entry (even in our SUBSAFE example). Non-natively digital information or value, whether a collectible, a Rolex, or a seawater valve, will always depend to an extent on the person entering the information onto the blockchain. Blockchain, although immutable, can still be error-prone (and then preserve that error for eternity). This again highlights the fact that many of our supply chain issues are a digitization issue, not an audit trail issue.

However, there are ways to reduce the chance of error and leverage emerging digitization technology to better ensure proper data entry. In a current project with the United States Navy, we’re building a blood product tracking tool for sharing blood product information across parties such as a blood bank, a hospital, and a ship. We’ve utilized barcodes and RFID to automate data entry, partially solving this problem while integrating another key use case: IoT. As the DoD continues to experiment with and test Decentralized Computing, we believe two more key use cases will emerge:

1. Digital Chain of Custody: As the DoD continues its digitization efforts, much of the data created will be (and already is) of the highest importance to national security, from directives, to planning, to weapons system software. Securing this data in a way that ensures it has not been tampered with is a key national security concern. Especially in the case of mission-critical software, which is already natively digital, Decentralized Computing can be a powerful tool to prevent malicious or even unintentional errors.

2. Decentralized and Secure Communications: Warfare is becoming increasingly informatized and cyber-based. Conflicts in the coming years will be battles of systems and systems confrontation. Digital information flowing from soldiers on the ground, drones in the air, weather, intelligence, and more will be fed into a fully integrated combat, communications, and sensor network from which topline decision-makers will be able to view the entire battlefield and make key decisions. Disrupting these networks will require coordinated attacks on key nodes to disrupt or destroy the system. Traditional digital and web architecture creates single points of failure that, if exploited, would cripple military systems and impact national security. Decentralized Computing could dramatically improve the robustness of these systems. A decentralized network of nodes can not only increase the volume of information transfer but also increase security: technologies like BitTorrent break information into smaller chunks and create a robust data transfer network in which the destruction of one node has little impact on the other nodes or the operation of the system. A sophisticated consensus mechanism can also detect potential bad actors attempting to participate in consensus, mitigating man-in-the-middle and Sybil attacks, something much harder to do with traditional cybersecurity.
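The chunk-and-verify idea behind BitTorrent-style transfer can be sketched in a few lines. This is a simplified illustration, not any particular protocol: data is split into pieces, each piece gets a content hash, and a receiver can verify any piece obtained from any untrusted node against that manifest (the sample payload is made up):

```python
import hashlib

CHUNK_SIZE = 8  # tiny for illustration; real systems use kilobyte-to-megabyte pieces

def split_with_manifest(data: bytes, size: int = CHUNK_SIZE):
    """Split data into pieces and record each piece's SHA-256 digest,
    so a receiver can verify pieces fetched from any (untrusted) node."""
    pieces = [data[i:i + size] for i in range(0, len(data), size)]
    manifest = [hashlib.sha256(p).hexdigest() for p in pieces]
    return pieces, manifest

def verify_piece(piece: bytes, expected: str) -> bool:
    # Integrity check that does not require trusting the sender.
    return hashlib.sha256(piece).hexdigest() == expected

pieces, manifest = split_with_manifest(b"sensor report: grid sector 7 nominal")
assert all(verify_piece(p, h) for p, h in zip(pieces, manifest))
assert not verify_piece(b"tampered data", manifest[0])
```

Because every piece is independently verifiable, the loss of any single node carrying some pieces does not compromise, or corrupt, what the rest of the network can serve.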

Internet of Things (IoT)

Every day a new device becomes IoT-enabled, from your microwave to your grill. This growth will continue, and these devices will gather more and more information from our everyday lives and across industry. As IoT becomes more ubiquitous, the control these devices have over our lives, and the data they generate, will be very important, valuable, and vulnerable. Significant debate will continue over who owns IoT data and what it can be used for. Possibly even more important is what the device can do. If it’s just monitoring your grill temperature, you may be less concerned about privacy or data ownership. If it’s a heart monitor, a baby monitor, or a control actuator on an oil pipeline, that device’s connectivity presents a very clear vulnerability: if the device is exploited, or simply fails, the consequences could be serious.

In a data privacy use case, a decentralized IoT device does not depend on a third party’s centralized server, so the user can be assured that the information generated by the device is owned by that user alone. Additionally, while compromising a large number of centralized IoT devices only requires exploiting a single server, compromising decentralized devices would require exploiting each device individually, something extremely difficult, if not impossible, to do.

Centralized IoT devices are also limited by the companies that produce them, interoperable with only others in their product line, and dependent on the longevity of their maker to maintain functionality. What happens to a valve actuator on a pipeline when the maker of that connected device goes bankrupt, is acquired, or the cloud server supporting that device goes down? The IoT centralized ecosystem as it exists today is highly vulnerable to the viability of the companies making the device. Potentially fine for a light bulb or grill, but disastrous if a pipeline goes down or your pacemaker needs an update.

IoT devices served by a decentralized network can avoid the issues that central servers create. Even if a company ceases operation, the network itself remains functional, allowing the device to continue operating. The data standard of the decentralized network, combined with current efforts to link data across different networks, also presents an opportunity for increased IoT device interoperability. Devices will be able to speak to each other across product lines, creating new use cases. Your grill could talk to your doorbell, letting you know friends have arrived for the barbecue. In a more serious use case, weather monitoring could be connected to electrical grids and oil pipelines, providing information directly to control systems to enable proactive protection of equipment.

MachineFi is the term being introduced to describe the intersection of industrial IoT and finance. The open protocol IoTeX, a leader in the space, describes MachineFi as a new paradigm fueled by Web3 that underpins the new machine economy, whereby machine resources and intelligence can be financialized to deliver value and ownership to the people, not centralized corporations. Though somewhat hyperbolic, the message is clear: IoT devices powered by Decentralized Computing have the potential to impact not only the security and longevity of the device but also the empowerment of the users themselves.

Energy

Energy markets are increasingly distributed as ‘micropower’ sources such as wind turbines and solar panels proliferate. Battery backups to these renewable energy sources are also in high demand and are often traded not as storage but as ancillary power. The variability of renewable sources is difficult to predict, so much so that their power often trades at a negative price. Current energy trading was designed around the large power sources of the past and is not dynamic enough to predict or react to changes in weather or energy production. Smaller power operators have an opportunity to sell their power more directly to the grid as digital technologies now manage the flow of energy. Additionally, grids could have more flexibility to directly compensate for excess energy storage and backups, a must-have for extreme weather events and surge times, ensuring power is available when needed. As a whole, energy markets could be another prime area where Decentralized Computing could have a large impact, combining the use cases discussed above. Cross-border energy trade enabled by a common data standard on an open network, robust grid and weather monitoring through widespread IoT devices, and MachineFi tools to power dynamic energy trading and real-time settlement could usher in the next generation of reliable and secure clean energy production.

A thought on Uncensorability

A clear but often overlooked value proposition of Decentralized Computing is uncensorability: transactions on a public network, like Bitcoin, cannot be stopped by any third party. This is a particularly powerful empowerment technology that could enable victims of oppressive governments, or those unable to access banking, to participate in an economy otherwise closed to them. But uncensorability has its dark side. Horrible things could be empowered by this technology: child pornography or payments to terrorists could be funded and transmitted through these networks. As public, open, decentralized, and uncensorable networks grow, special attention and regulation are needed to ensure that the worst of humanity is not enabled along with the best.

Conclusion

Let’s cut through the hype. If tokens like Bitcoin, and even Dogecoin and Bored Ape NFTs, have some value, what exactly is that value? As our lives have become more digital, as our machines and data have become more digital, as software has eaten the world, we must understand that the value associated with our lives, our machines, and our data is also becoming digital. There is something here beyond mere speculation. This should be seen as a natural progression: money flowing into the digital value ecosystem in search of what’s next. The question before us is whether to allow the natural progression of value digitization by enabling this new economy to be built on free and open standards, or to attempt to heavily regulate it to prevent what is seen as excess in the world of digital tokens (like Dogecoin). A few things are clear, though: the robustness and uncensorability of the open protocols make them very difficult to shut down, and the alternatives, private and consortium networks, have been unable to produce a product that can match the potential of the free and open digital value economy.

Press Release

We’re Hiring!

We’ve got some exciting announcements coming in the next few months, but in the meantime we’re hiring for several key roles as we scale our flagship product, HealthNet. If you’re interested in joining a growing startup here in the Midwest, you can read the job descriptions and apply here or send your resume to careers@consensusnetworks.com!

Our key open roles include a data scientist, EHR integration engineer, and lead product manager.

Healthcare

How Hospital Consolidation is Impacting Medical Supply Chains

By Connor Smith

Introduction

The U.S. healthcare system has undergone unprecedented levels of consolidation over the past decade. The largest health system (HCA Healthcare) now has 214 member hospitals, and the 25 largest systems collectively manage almost 20% of all hospitals in the U.S. A recent study by Definitive Healthcare found that, in 2019 alone, there were 294 hospital mergers and acquisitions, a slight decrease from 330 the year prior. Such high levels of consolidation have a massive impact on medical supply chains. However, whether these changes produce a positive or negative financial impact remains a hotly debated topic in the healthcare community. Some argue that consolidation allows providers to leverage economies of scale and reduce costs. Yet some evidence suggests this may not be the case and that consolidation can actually increase the cost of care by up to 40%. In this article, I aim to briefly explore the driving forces behind hospital consolidation, how it is affecting medical supply chains, and the role technology plays in managing these increasingly complex supply chains.

A Brief History of the Driving Forces Behind Hospital Consolidation

The passing of the Affordable Care Act (ACA) in 2010 was a momentous juncture in U.S. history, and arguably the most sweeping change to the nation’s healthcare system since the institution of Medicare & Medicaid. While the goal of the plan may have been to make healthcare more affordable and accessible to U.S. citizens, the bill incentivized large-scale consolidation across the entire industry that persists to this day. Providers became incentivized to shift care delivery to an outpatient setting and work toward a value-based care model. As a result, health systems began rapidly acquiring and investing in facilities like ambulatory surgery centers, clinics, and physician groups. This vertical integration caused health systems to evolve from mere collections of hospitals into fully fledged Integrated Delivery Networks (IDNs) that provide and coordinate patient care across the continuum. Similarly, health systems began consolidating horizontally by merging with other IDNs to improve access to capital, increase market share, and improve efficiencies. Constant external pressure to reduce costs and lower the price of treatment has exacerbated this trend and caused consolidation to persist at an accelerated pace, since integrated provider networks can more easily share physicians across facilities to reduce overall headcount and negotiate bulk supply discounts.

Impact of Consolidation on Medical Supply Chains

Theoretically, these new ‘mega-systems’ should be able to use their economies of scale to procure supplies at a lower cost and coordinate care more efficiently. Traditionally, hospitals have relied on Group Purchasing Organizations (GPOs) to leverage the aggregated buying power of multiple facilities and negotiate lower prices for medical supplies from manufacturers, wholesalers, and other vendors. As IDNs acquire more hospitals and grow in size, they can often negotiate near-equal prices with manufacturers without these middlemen and realize greater cost savings. In practice, however, savings from such mergers are often negligible, and mergers oftentimes actually cause supply chain costs to increase.

A study of 1,200 hospitals by researchers at the Wharton School found that the average supply chain savings for a target hospital in a merger of equal-sized systems was only about $176,000. Moreover, acquirers are often left spending more on the supply chain due to complexities that arise from an acquisition, like managing physician preference items. The researchers found that, while an acquirer saved an average of 6.4% on inexpensive commodities in the supply chain, these savings were more than offset by a 1.1% increase in costs relating to physician preference items. As more and more hospitals are integrated into an IDN, it becomes increasingly difficult to standardize purchasing decisions across a growing number of physicians unless the system has a robust digital infrastructure and inventory tracking system. Unfortunately, 64% of providers lack dedicated supply chain management systems and 80% still rely on manual processes in some capacity to manage supplies. Consequently, consolidation oftentimes leads to less efficient supply chains that procure similar products from a myriad of suppliers, lack transparency as to where products are located in a system, and carry excess supply stocks that are often wasted or expire.

How Technology Can Break the Cycle

Prior to the pandemic, consolidation was already among the top trends to watch in healthcare for 2020, and experts speculate that the financial effects of COVID-19 are likely to accelerate this phenomenon in the post-pandemic era. Pending any drastic regulatory or political actions, it is therefore unlikely that hospital consolidation will decelerate anytime in the near future. Hence, IDNs must leverage digital technologies if they want to properly manage their increasingly complex supply chains and control costs. One of the easiest solutions they can implement is a supply chain management platform. Manual systems are inefficient and error-prone, whereas digital supply chain management systems enable automation, reduce error, and can be coupled with advanced analytics to drive further efficiencies.

Using supply chain analytics, IDNs can track usage of supplies to help reduce sources of waste, like purchasing physician preference items and maintaining expired products. The majority of healthcare executives and supply chain managers agree that supply chain analytics could positively impact hospital costs and even improve hospital margins by over one percent. Considering that hospitals, on average, realize net revenues in excess of $300 million, a one percent improvement in operating margins can equate to millions of dollars saved. Moreover, these systems can be further integrated with technologies like RFID to enable real-time medical supply tracking within an IDN and help providers load-balance supplies across the hospitals in their system based on demand, realizing even greater cost savings.
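The back-of-the-envelope arithmetic behind that claim is simple enough to spell out (the revenue figure is the average cited above, not any specific hospital’s):

```python
# At roughly $300M in net revenue, a one-percentage-point
# operating-margin improvement frees up about $3M per year.
net_revenue = 300_000_000
margin_improvement = 0.01  # one percentage point
annual_savings = net_revenue * margin_improvement
print(f"${annual_savings:,.0f}")  # $3,000,000
```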

Conclusion

I hope you enjoyed this article! If you are interested in ways healthcare supply chains are evolving, check out my series on the future of medical supply chains. Part one can be found here. Additionally, if you are looking for ways your medical supply chain can be improved, feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next time!

Healthcare

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part III

By Connor Smith

Hello everybody! Welcome to the conclusion of my series on how COVID-19 is changing medical supply chains. In Parts I & II, I talked about how the pandemic is forcing the U.S. medical supply chains of the future to embrace digitization and automation, rep-less sales models for high-cost implantable medical devices, and the influence that domestic manufacturing and value-based healthcare are having on their evolution. Before I conclude this series, I’ll add the caveat that medical supply chains will evolve in a myriad of ways far beyond just the six I addressed. These are simply what I believe are the most profound changes on the horizon for these complex systems. Now, let me dive into the two final transformative forces that will reshape U.S. medical supply chains in the post-pandemic era.

5. Sophisticated Risk Analysis & Disaster Planning Software

COVID-19 is far from the first environmental disruption global supply chains have faced in recent times. There was the SARS outbreak in 2003, the Ebola virus in 2014, and natural disasters like earthquakes and hurricanes that occur frequently and can have devastating economic impacts. In fact, as many as 85% of supply chains experience at least one major disruption per year. The reality of the modern era is that globalized supply chain networks allow us to consume goods and services at incredibly low prices, but they are also riddled with ‘Achilles heels’ that can shake global economies if perturbed.

As I mentioned in Part II, one way U.S.-based providers and manufacturers will look to mitigate such interruptions in the future is by ensuring geographic redundancy across supplier networks and logistics partners and leveraging domestic manufacturing. While this approach will be one core component of U.S. medical supply chain architecture in the future, an overall proactive, strategic approach to supply chain design will be paramount. Hence, medical supply chains of the future will be designed using sophisticated risk analysis software that leverages predictive analytics and advanced simulations to model possible disruptions, like pandemics and natural disasters, before committing to suppliers, and to plan accordingly.

Academics have been developing operational supply chain risk analysis models that leverage AI and advanced computational methods for well over a decade, and supply chain experts across all industries have known the benefits of proactive supply chain risk mitigation for some time. A 2017 study conducted by Deloitte found that taking a proactive approach to supply chain risk management can save firms up to 50% when managing major disruptions. As medical supply chains become increasingly transparent and digitized, investment in proactive risk assessment technologies will be among the best returns on investment that medical supply chain management teams can make. Using tools like predictive analytics, supply chain management teams will be able to quantify the risk of using a particular supplier by modeling scenarios like the spread of a virus to a particular region, the loss of a key manufacturing plant to a natural disaster, and more. Some researchers have already started developing simulation-based analyses of global supply chain disruptions resulting from the spread of COVID-19, and it is reasonable to expect that using such models for supply chain planning will become the norm.
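The core of such a risk model can be sketched as a tiny Monte Carlo simulation. All of the probabilities and dollar figures below are made up for illustration; a real model would estimate them from historical disruption data and far richer scenarios:

```python
import random

# Toy Monte Carlo comparison of two hypothetical suppliers: a cheap but
# fragile source versus a dearer, more resilient one.
def expected_annual_loss(p_disruption: float, loss_if_disrupted: float,
                         trials: int = 100_000, rng=random) -> float:
    """Estimate expected annual disruption loss by simulation."""
    hits = sum(1 for _ in range(trials) if rng.random() < p_disruption)
    return hits / trials * loss_if_disrupted

random.seed(42)  # reproducible illustration
fragile = expected_annual_loss(p_disruption=0.30, loss_if_disrupted=5_000_000)
resilient = expected_annual_loss(p_disruption=0.05, loss_if_disrupted=5_000_000)

# The fragile supplier's expected loss (~$1.5M/yr) can dwarf a modest
# unit-cost premium for the resilient one (~$0.25M/yr expected loss).
```

Comparing expected losses this way is what lets a planning team justify a more expensive but more resilient supplier before a disruption hits, rather than after.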

6. Integrating Population Health Analytics 

Originally posited by David Kindig and Greg Stoddart as “the health outcome of a group of individuals, including the distribution of such outcomes within the group,” population health has become a fairly nebulous term used to describe the overall health of patient populations. ‘Pop health’ research encompasses everything from studying how socioeconomic factors like income level, race, or education influence the prevalence of certain conditions, to the prevalence of various genes in communities and how that affects disease spread, and more. Depending on who you talk to, they likely have a different view of what population health is and of the responsibility that providers and manufacturers have in improving outcomes.

Regardless of your thoughts on what population health encompasses, as healthcare continues to evolve toward a value-based model, it is inevitable that population health will play an increasingly vital role in how providers deliver care. Value-based incentive structures are designed to drive healthcare toward what the Institute for Healthcare Improvement refers to as the ‘Triple Aim’: improving the patient care experience, improving the health of populations, and reducing the per capita cost of healthcare. An overview of the Triple Aim is pictured below.

Image Adapted from Health Catalyst

Prior to the pandemic, providers were starting to integrate population health initiatives with supply chain management to combat the increasing strain of over half of the U.S. adult population having at least one chronic disease. For example, Indiana-based Eskenazi Health extended its partnership with Meals on Wheels in 2017 to deliver healthy meals to discharged patients at home in an effort to reduce readmission rates. A recent analysis published by Providence Health System found that COVID-19 is accelerating the transition to a distributed care model in which patients receive personalized care from their homes or local clinics instead of at a health system. They found that the use of virtual care technologies, coupled with the desire to reduce unnecessary visits because of the pandemic, is forcing medical supply chains to become more personalized.

It is reasonable to suspect that the medical supply chains of the future will become increasingly patient-centric and account for the socioeconomic, genetic, and other key factors that influence a patient population’s well-being. The Internet of Medical Things market is growing at over 20% year over year and will likely grow even faster due to the rapid adoption of telehealth and remote patient monitoring technologies during the pandemic. These platforms will enable providers to gather more information about their patient populations than ever before and construct truly personalized care plans. Medical supply chain teams of the future will integrate this information with predictive analytics models to not only ensure that patients receive the best possible care, when and where they need it, at the lowest cost, but also make cost-effective population-level interventions that improve overall societal health.

In Conclusion

I hope you enjoyed the final installment of my series on the future of U.S. medical supply chains! Interested in learning more or in ways your medical supply chain could be improved? Feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era.

Press Release

Consensus Networks Joins MATTER

South Bend, IN, Sep. 22, 2020 — Consensus Networks announces its admission into MATTER, a leading healthtech incubator based in Chicago, IL, to accelerate the commercialization of its HealthNet technology and reinvent medical supply chains for the post-COVID era.

“Being accepted into MATTER has been a complete game changer in our journey bringing HealthNet to market,” said Consensus Networks’ COO, Connor Smith. “Their team and network of healthcare experts and mentors have helped us tremendously. They understand the nuances and intricacies of the Health IT landscape and have really helped us fine-tune our value proposition, messaging, and offering to all of the different stakeholders. MATTER has gone above and beyond to connect us with the potential end users, advisors, and partners we need to successfully commercialize HealthNet, and we are excited to be a part of this vibrant community of healthtech innovators.”

MATTER, the premier healthcare incubator and innovation hub, includes hundreds of cutting-edge startups from around the world, working together with dozens of hospitals and health systems, universities and industry-leading companies to build the future of healthcare. Together, the MATTER community is accelerating innovation, advancing care and improving lives. For more information, visit matter.health and follow @MATTERhealth.

About Consensus Networks

Consensus Networks is an Indiana-based LLC that is reinventing medical supply chains for the post-COVID era. Founded in 2016 with the goal of creating trusted data environments to derive novel insights, Consensus enables clients to integrate clinical and supply chain data to optimize inventory management and improve patient outcomes. Consensus Networks uses serverless application architecture, blockchain technology, and predictive analytics to ensure the best materials are used to treat patients at the lowest cost.

Healthcare

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part II

By Connor Smith

Hello everybody! Welcome back to my series on how COVID-19 is forcing medical supply chains to change. In Part I, I talked about how the pandemic is forcing the medical supply chains of the future to embrace digitization and automation and rep-less sales models for high-cost implantable medical devices. If you have not checked out the first part of this series, you may do so here. Without further delay, let’s dive into two other ways medical supply chains will evolve.

3. Building Resilience through On-Shoring Production

It’s no secret that China is the world’s largest producer and distributor of medical supplies. Prior to the pandemic, China manufactured 48% of the United States’ personal protective equipment (PPE) and over 90% of certain active pharmaceutical ingredients (APIs). When factoring in other foreign suppliers, the U.S imports nearly $30 Billion worth of medical supplies every year, or roughly 30% of all its medical equipment. While this may seem like a fairly trivial portion of U.S total medical supplies, the pandemic has illustrated the profoundly negative consequences of relying entirely on foreign sources for particular items. There are over 25 drug shortages related to COVID-19 (17 of which are from increased demand) and critical shortages of PPE over six months into the pandemic.

Aside from the public health consequences posed by such disruptions, failure to make domestic supply chains more resilient poses a national security threat. Accordingly, shoring up medical supply chains is a priority for both presidential candidates in the upcoming election. President Trump recently announced an executive order aimed at increasing U.S domestic manufacturing and onshoring supply chains for pharmaceutical and medical supplies to protect against potential shortages. Similarly, Democratic Presidential Nominee Joe Biden put forth a 5-page plan articulating actions he will take, if elected, to rebuild domestic manufacturing capacity and ensure U.S medical supply chains remain resilient and geographically redundant in the future.

Likewise, the commercial sector has expressed similar sentiments. A recent study conducted by McKinsey & Company found that 93% of surveyed supply chain experts listed increasing resiliency across the chain as a top priority, with an emphasis on supplier redundancy and near-shoring production. The U.S medical supply chains of the future will emphasize localization to mitigate as many disruptions as possible. Such regionalization will also make it easier to shorten the distance between suppliers and customers. As pointed out by Brad Payne of PCI Pharma Services, “Shortening the distance in the supply chain expedites deliveries and lessens room for complicating factors, like customs clearance”.

Forward looking supply chain leadership on both the provider and vendor sides will be investigating ways to leverage these new logistics and distribution paradigms to cut costs and improve the quality of their services. For example, technologies like predictive analytics can be used to improve last-mile delivery for medical supplies manufacturers. Other industries have leveraged logistical data analytics in this manner to reduce fuel costs and improve the quality and performance of their deliveries. Healthcare supply chains of the future will leverage these technologies and other tools like RFID to provide more efficient deliveries and optimize procurement strategies.

4. Use Value-Based Procurement Strategies

Historically, providers received payment through a fee-for-service model in which they are reimbursed based on the number of services they render or procedures they order. This strategy made sense when it was instituted as the reimbursement mechanism for Medicare & Medicaid in 1965, but it has perversely affected the way healthcare is paid for today. Physicians are incentivized to provide as many billable services as possible and take a ‘defensive’ approach to healthcare, ordering procedures and tests just to be safe. Consequently, they may order unnecessary tests and procedures without hesitation, as neither they nor the patient are financially responsible, increasing the cost of care. Over the past decade, insurers have started implementing ‘value-based’ payment models that link reimbursement to patient outcomes in an effort to reduce the overall cost of care and improve patient outcomes. Early results suggest that value-based reimbursement can reduce the cost of claims by nearly 12% and improve chronic disease management. The benefits of value-based models by stakeholder may be seen below.

Image adapted from NEJM Catalyst

Prior to the pandemic, pure fee-for-service was expected to account for less than 26% of all reimbursements in the U.S by 2021. Given the immense financial impact of the virus, the impetus to reduce costs by transitioning to value-based care has never been greater. Hence, value-based procurement, or ensuring that the right product is delivered to the right patient at the right time and at the lowest cost, will be the norm of medical supply chains of the future. As members of HIMSS have pointed out, the future of the healthcare supply chain will emphasize connecting cost, quality, and outcomes to help realize the Triple Aim and optimize provider performance.

Similar to the adoption of rep-less sales models I described in Part I, the foundational technology underlying value-based procurement will be clinical supply chain integration. Connecting supply chain and clinical data sources enables providers to redefine how they administer care. Once integrated, predictive analytics can be used to guide procurement decisions based on the cost of the material and its outcome for the patient. Some hospitals have already saved over $15 Million from clinical supply chain integration. Additionally, EHR integration with supply chain data allows providers to assess clinical variance in treatment with granular detail. Such functionality will be critical as providers look to minimize the impact felt by the pandemic, and continue playing a key role in how they operate into the future.
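At its core, a value-based procurement decision ranks candidate supplies by the outcome they deliver per dollar spent, rather than by unit price alone. The sketch below is a minimal, hypothetical illustration of that ranking; the product names, costs, and outcome scores are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Implant:
    name: str
    unit_cost: float      # purchase cost per unit (USD)
    outcome_score: float  # observed outcome measure, e.g. 0-1 success rate

def rank_by_value(items):
    """Rank supplies by outcome delivered per dollar spent (higher is better)."""
    return sorted(items, key=lambda i: i.outcome_score / i.unit_cost, reverse=True)

# Hypothetical catalog of competing knee implant systems
catalog = [
    Implant("Knee System A", unit_cost=4800, outcome_score=0.92),
    Implant("Knee System B", unit_cost=3500, outcome_score=0.88),
    Implant("Knee System C", unit_cost=5200, outcome_score=0.93),
]

for item in rank_by_value(catalog):
    print(f"{item.name}: {item.outcome_score / item.unit_cost * 1000:.2f} outcome per $1k")
```

A real system would draw the outcome scores from integrated clinical and supply chain data, adjust for patient risk, and weigh far more factors than this two-variable ratio.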

In Conclusion

I hope you enjoyed the second installment of my series on the future of U.S medical supply chains! I will be back next week with my final two insights as to what the future of these systems will look like. Interested in learning more or ways your medical supply chain could be improved? Feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next week!

Uncategorized

COVID-19 is Forcing Medical Supply Chains to Change… And that’s a Good Thing: Part I

By Connor Smith

Without question, the dedication and tireless work ethic exhibited by medical first responders has been nothing short of heroic throughout these trying times. Unfortunately, despite (and partially due to) the valiant efforts on the ground, the healthcare sector is among the industries hit hardest financially by the pandemic. A recent report by the American Hospital Association estimates the total financial impact of COVID-19 on the U.S healthcare system from March 2020-June 2020 to be $202.6 Billion. While a significant amount of this financial strain stems from treating patients infected with the virus and forgone revenue from elective procedures, many problems brought on by the virus could have been mitigated with more efficient and resilient U.S medical supply chains.

While the magnitude of a global pandemic is something few could have properly anticipated, COVID-19 introduced few truly new problems for U.S medical supply chains. Instead, it exacerbated problems and inefficiencies that have plagued the industry for years and have never been properly addressed. Yes, over 70% of the active pharmaceutical ingredients that supply the U.S healthcare market are imported and were constrained as a result of the pandemic, and unprecedented global surges in demand for PPE created massive shortages of basic medical equipment for U.S frontline responders. However, siloed data, lack of automation & interoperability, and a reactionary approach to purchasing contributed to upwards of $25 Billion in wasted supply chain spending each year long before the pandemic. In fact, analysts project that 2020 will be the first year in which supply chain costs surpass those of labor as the largest cost component of delivering medical care by providers.

Some are optimistic that a COVID-19 vaccine will be ready by the end of 2020, but this is far from guaranteed and it could be a year or more before one becomes readily available. Consequently, the U.S healthcare system must face a stark reality that it may continue to lose upwards of $20 Billion per month for the foreseeable future. Major operational improvements must be made to its supply chains in order to help offset these costs.

The status quo for medical supply chain management is no longer tolerable and inefficiencies that were previously ignored must be corrected. Medical supply chains need to be reinvented over the coming months if the U.S healthcare system is to survive the pandemic and thrive into the future. This is the first of a three part series in which I will explore 6 different ways that U.S medical supply chains will look in the post-pandemic era and the technologies and external forces that will enable them. So without further ado, let’s dive into the future of U.S medical supply chain management!

1. Automation & Digitization

While this may sound obvious, over 80% of clinicians and hospital leaders still rely on manual processes for managing inventory. Consequently, health systems struggle to know what supplies they currently own, where they are located, if they have expired, and a myriad of other problems. Unfortunately, patients pay the largest price for these manual, inefficient systems. One survey found that 40% of clinicians postponed a patient’s treatment and 23% know of adverse events occurring to patients because of inadequate inventory. Considering that some industries are well into the ‘Supply Chain 4.0’ era, the first seismic shift for medical supply chain & inventory management will be to go entirely digital and implement technologies already used in more mature supply chains like retail.

Regulatory pressures from the FDA’s medical device unique device identifier (UDI) requirements and the Drug Supply Chain Security Act have already begun forcing some of this change, and major compliance requirements are going into effect over the coming months. Digitizing medical supply chains will not only significantly reduce errors and inefficiencies arising from human error and manual entry, but also enable the use of technologies like AI and blockchain that can yield even greater cost savings. For example, firms in other industries have seen cost savings of 32% across an entire operation by implementing AI-based inventory management systems. Such implementations can improve inventory management by enabling features like predictive demand forecasting and optimizing the surplus inventory levels that should be maintained at all times. Other heavily regulated industries, like food, are experimenting with blockchain for product traceability applications in supply chains and estimate they can reduce costs of compliance by 30% within a few years.
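To make the inventory-optimization idea concrete, here is a minimal sketch of one of the simplest quantitative tools involved: a classic reorder-point calculation with safety stock. The usage figures and lead time are invented for illustration; production systems layer far richer demand forecasting on top of this:

```python
import statistics
from math import sqrt

def reorder_point(daily_demand, lead_time_days, z=1.65):
    """Reorder point = expected demand over the supplier lead time, plus a
    z-scaled safety stock (z = 1.65 corresponds to roughly a 95% service
    level under a normal-demand assumption)."""
    mean = statistics.mean(daily_demand)
    stdev = statistics.stdev(daily_demand)
    safety_stock = z * stdev * sqrt(lead_time_days)
    return round(mean * lead_time_days + safety_stock)

# 14 days of hypothetical N95 mask usage counts at one facility
usage = [120, 135, 110, 160, 140, 125, 150, 130, 145, 155, 115, 138, 142, 128]

# With a 7-day supplier lead time, reorder once stock falls to this level
print(reorder_point(usage, lead_time_days=7))
```

Even a formula this simple requires the digitized, real-time usage data described above; a manual, clipboard-driven count cannot feed it.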

Certainly, there is much foundational work that must be done before medical supply chains can integrate these more advanced solutions. Implementing RFID chips to enable real-time asset tracking, achieving basic systems interoperability through data standardization, and switching to cloud-based healthcare inventory management software are among the baby steps that must first be taken. However, given the lack of digital infrastructure currently in place in many medical supply chains, there is an opportunity to ‘leapfrog’ legacy supply chain technology and implement cutting edge solutions to realize even greater cost savings immediately. Forward looking medical supply chain management teams will be looking to implement such solutions to ensure their supply chains remain resilient and future proof to future disruptions or pandemics.

2. Rep-less Medical Device Sales Models

Manufacturers of implantable medical devices have traditionally used an in-person sales model for distribution. These sales reps form close relationships with clinicians and are oftentimes even present in the OR during the surgery. While this model has been the standard practice for decades, it also increases the cost of delivering care tremendously. For example, in orthopedics, it’s estimated that the sales process for implantable devices accounts for 35-50% of the cost of sales. Moreover, this sales process makes it nearly impossible for device manufacturers to track their field inventory and providers to manage their consignment inventory, resulting in further cost increases of up to 25%.

The pandemic has not only made these inefficiencies no longer bearable for providers, but it has hindered the ability of manufacturers to even sell their products. Whether through self-selected or provider mandated deferral, there have been 50% fewer patients receiving care compared to normal, and elective surgeries for implantable devices like knee replacements have dropped by 99%. Moreover, nearly 50% of orthopedic sales reps have been forced to use exclusively digital means to support physicians or have been unable to provide support at all. 

It is reasonable to suspect that providers will continue to limit who is allowed into a hospital at least until a vaccine is readily available, if not longer, and patients can only forgo necessary care for so long. Hence, providers and manufacturers alike will be required to implement technologies that make rep-less sales models attainable. One key technological enabler of this transition will be integrated data environments of supply chain, medical IoT, and EHR data. Integrating supply chain data with EHR data had already been a top priority for many providers entering 2020. Such environments will serve as the cornerstone for other tools like video conferencing software and payment processing tools that can enable a rep-less sales model and save providers millions of dollars per year.

In Conclusion

I hope you enjoyed the first part of my series on the future of U.S medical supply chains! I will be back next week with two more insights regarding what the future of these complex systems will look like. If you are interested in learning more or ways your medical supply chain could be improved, feel free to contact us at Consensus Networks, where our HealthNet technology is being used to reinvent medical supply chains for the post-pandemic era. Until next week!

Uncategorized

Consensus Networks Will Support NuCypher

Consensus Networks is excited to announce its continued support of NuCypher through mainnet launch later this year. Consensus Networks has actively supported the network since the beginning of “Come and Stake It”, its incentivized testnet challenge, which launched earlier this year.

“We are big believers in NuCypher’s mission and see a major need for the work they are doing in enabling privacy-preserving applications and providing a secure compute platform,” said Consensus Networks’ COO, Connor Smith. “We will not only continue supporting the network by operating Ursula nodes on our infrastructure, but are also actively building out tools to help developers and organizations connect their applications to NuCypher and leverage the network to build the next generation of privacy-preserving dApps. We have had an absolute blast participating in all four phases of their incentivized testnet and cannot wait to help bring the power of the NuCypher network to the rest of the world.”

NuCypher aims to provide the cryptographic infrastructure for privacy-preserving applications by allowing users to manage secrets across dynamic environments, conditionally grant and revoke access to sensitive data for any number of recipients, and use a secure computation platform to process encrypted data while preserving the confidentiality of the inputs and outputs. It is a proxy re-encryption network designed to provide cryptographic access controls for dApps and protocols by functioning as a decentralized key management service (KMS). If you have any questions about how to integrate NuCypher with your application, or whether it is the right fit for what you and your team are building, contact our team today!

About Consensus Networks.

Consensus Networks is an Indiana based LLC that designs, builds, and manages dedicated infrastructure for blockchain technology. Founded in 2016 with the goal of providing direct, high-speed, low-latency network connections, Consensus enables clients to create, disseminate, and store information securely and efficiently. Consensus Networks uses advanced cryptographic techniques and smartly architected network designs to ensure maximum security and network uptime. 

Uncategorized

Top Three Takeaways From Blockchain and Digital Transformation in Health in 2020

By Connor Smith

Healthcare is frequently mentioned on the shortlist of industries expected to be transformed by blockchain technology. Supporters of this assertion have a range of reasons for arguing that blockchain can improve healthcare, but most tend to revolve around improving health data security, immutable claims tracking, physician credentialing, or putting patients at the center of care by letting them take charge of their own data. However, there are many who argue that healthcare is not ready for blockchain, or that many of its theorized use cases could be better addressed using other technologies. There are good arguments to be made on both sides and, when considering the complexity of healthcare and the nascency of blockchain, it can be difficult to discern which projects are ‘hype’ and which can actually drive meaningful impact.

We at Consensus Networks are bullish on the potential of blockchain in healthcare, but we also pride ourselves on taking a pragmatic view of projects to realistically assess what is feasible and what is not. This past week, we were fortunate enough to attend the inaugural Blockchain and Digital Transformation in Health in 2020 summit in Austin, TX, where our CEO Nathan Miller presented on the work we have been doing with HealthNet and on developing highly secure information architectures in a regulatory environment. The conference was hosted by the Austin Blockchain Collective in conjunction with the Dell Medical School at the University of Texas at Austin. There were presentations from industry and academia alike, accompanied by an open discourse about the state of blockchain in healthcare, what is actually feasible, and identifying a path forward for the technology as healthcare starts its digital transformation. It was a great event with high-quality information and a pragmatic assessment of the state of the industry, and we’re here to share our top three takeaways from the event!

1.) Blockchain EMRs Are Not Ready… At Least Not Yet in the U.S

Throughout the short lifespan of blockchain projects in healthcare, there have been several attempts at a blockchain-based electronic medical record (EMR) that is owned by a patient and shared with providers as needed, the most popular of which is probably Medicalchain. Medical records hold a wealth of information about an individual, containing everything from a person’s medical history to their demographic, identity, and insurance information. However, to date, medical records have been largely owned and controlled by the health systems they reside within. Aside from issues of data sovereignty and controlling who has access to that information, having isolated data silos has a decidedly negative impact on patient outcomes, especially in the U.S. Competing EMR systems are incapable of communicating well with one another. Thus, if a patient goes to multiple providers that all have different EMR systems, the data for those visits will likely never be aggregated into a single, cohesive file and will instead remain as isolated fragments. This makes it nearly impossible for a provider to know what care has been administered to a patient previously and leads to billions of dollars being wasted in redundant testing, unnecessary procedures, or, in the worst scenarios, patient death from improper care.

A blockchain-based EMR would enable patients to own their own medical records, which they would likely hold in some form of mobile application. A patient could then control who has access to their record and be assured that a provider is seeing the most up-to-date version, as any changes would be reflected in that copy immediately. All transactions would be immutably recorded on a blockchain, and once the visit was finished the patient could revoke the physician’s access. Conceptually, such a notion sounds appealing. However, one of the biggest takeaways from the conference was that such a future is far off in the U.S and requires a societal shift and fundamental rethinking of data ownership to get there.
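The “immutably recorded” property can be illustrated with a toy hash-chained access log, a drastically simplified stand-in for a real blockchain (no signatures, consensus, or distribution here): each entry’s hash covers the previous entry, so any later tampering with the history is detectable.

```python
import hashlib
import json

def log_access(chain, actor, action):
    """Append an access event whose hash covers the previous entry's hash,
    so altering any earlier entry breaks every hash that follows it."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {"actor": actor, "action": action, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)
    return chain

def verify(chain):
    """Recompute every hash; any edit to a past entry makes this fail."""
    prev = "0" * 64
    for e in chain:
        body = {k: e[k] for k in ("actor", "action", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

chain = []
log_access(chain, "dr_jones", "grant")
log_access(chain, "dr_jones", "read")
log_access(chain, "patient", "revoke")
print(verify(chain))           # True: the history is intact
chain[1]["action"] = "write"   # tamper with a past event...
print(verify(chain))           # False: the chain no longer verifies
```

A real blockchain adds digital signatures (so only the patient can authorize a grant or revoke) and replication across many nodes (so no single party can rewrite the log), but the tamper-evidence mechanism is the same chaining of hashes.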

Dr. Aman Quadri, CEO of AMSYS Blockchain and AMCHART, was one of the speakers in attendance at the event. The product he is building, AMCHART, is a blockchain-based EMR that is currently undergoing testing in India, and even he was skeptical of its prospects in the U.S. Dr. Quadri said that the reason they have started seeing AMCHART adoption in India is that people there already have a mindset of data ownership. They take responsibility for their data, so a platform like AMCHART extends their current capabilities in a way that is beneficial. Dr. Quadri said that for AMCHART to have impact in the U.S, patients and health systems alike would have to change how they view and approach handling data before there could be a marked increase in value to patient care. He said that American patients have been conditioned for decades to blindly trust their data with medical providers, so shifting that view will be no easy task.

2.) Use Cases Are Emerging Around Care Coordination, Identity, and Data Sharing

The projects being spearheaded by the talented Dell Medical School faculty, visiting academics, and industry representatives in attendance covered a wide range of applications in healthcare, spanning both population health and the clinical setting. While the individual problems these solutions address vary, the common thread amongst most of them was that they centered around care coordination, identity, and data sharing applications. The consensus seemed to be that blockchain could help lay the foundation for a web of trusted access to data with the patient at the center of care.

Dr. Timothy Mercer, a faculty member at Dell Medical School and practicing physician, is exploring ways in which blockchain could be applied to help address homelessness in Austin. His research found that one of the biggest problems for the homeless population in Austin is the lack of any legal form of identification. As a result, homeless individuals must repeatedly go through the process of proving who they are, which can take weeks to months to complete and delays physicians from providing critical care. If the documents are lost or stolen, the process must start all over again. Partly as a consequence, the average age of death for the chronically homeless is 52-56, nearly 20 years less than the global average. Dr. Mercer is exploring ways blockchain and digital identities could be used to ease this burden and accelerate the time to care. The homeless care ecosystem involves many different organizations, all of which must properly authenticate an individual before they can legally administer care. With a blockchain-based identity application, each of these caregivers could verify the individual’s identity through digital documents linked to the patient’s digital identity via a web or mobile application and legally provide the care he or she needs. This would place the homeless person at the center of care and alleviate the inefficiencies pervasive in the current continuum of care for this patient population.

Image Adapted From Change Healthcare

Another interesting application that was highlighted at the event was Tribe Health Solutions’ use of blockchain in a medical imaging management solution designed for patients. Through the use of blockchain technology and the InterPlanetary File System (IPFS), they created a platform where patients can store medical imaging data on a distributed file system and then grant necessary providers access when needed. After care is administered, the patient can revoke access to the image and ensure that only trusted providers can access it. This solution aims to help patients overcome many of the problems associated with receiving care for skeletal or muscular injuries. For injuries such as tears or breaks, patients oftentimes seek out multiple opinions and are forced to either manage a physical copy of the medical image themselves or wait days to weeks for the file to be transferred to the provider. This not only delays the time it takes for these patients to receive the care they need and start the recovery process, but in the worst-case scenarios can lead to a worsening of the condition. Putting the patient in charge of the imaging data allows them to determine who can view the image and when – ultimately reducing the time it takes to receive treatment.
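The two ingredients in such a platform can be sketched in a few lines: content addressing (files are retrieved by the hash of their bytes, as in IPFS) plus a per-image access list for grant and revoke. This toy in-memory store is purely illustrative; on a public network like IPFS, real revocation additionally requires encryption and key management, since anyone who holds the content hash can fetch the bytes.

```python
import hashlib

class ImageStore:
    """Toy content-addressed store: images are keyed by the SHA-256 of
    their bytes, and a per-image access set controls who may read them."""
    def __init__(self):
        self.blocks = {}  # content hash -> image bytes
        self.access = {}  # content hash -> set of authorized parties

    def add(self, data: bytes, owner: str) -> str:
        # The content hash doubles as the identifier, like an IPFS CID
        cid = hashlib.sha256(data).hexdigest()
        self.blocks[cid] = data
        self.access[cid] = {owner}
        return cid

    def grant(self, cid: str, provider: str):
        self.access[cid].add(provider)

    def revoke(self, cid: str, provider: str):
        self.access[cid].discard(provider)

    def get(self, cid: str, provider: str) -> bytes:
        if provider not in self.access[cid]:
            raise PermissionError(f"{provider} is not authorized for {cid[:8]}")
        return self.blocks[cid]

store = ImageStore()
cid = store.add(b"<MRI scan bytes>", owner="patient")
store.grant(cid, "orthopedist")                      # share for a second opinion
assert store.get(cid, "orthopedist") == b"<MRI scan bytes>"
store.revoke(cid, "orthopedist")                     # consultation complete
```

Content addressing also gives integrity for free: if the bytes a provider receives do not hash to the requested identifier, the image has been altered in transit.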

3.) Blockchain Projects Must Start Small in Healthcare and Involve Many Stakeholders

Image Adapted from AdvaMed

Lastly, perhaps the most prevailing takeaway from the conference was that blockchain projects looking to tackle problems in healthcare need to start small and involve as many stakeholders as possible from the onset. Healthcare is a highly complex industry where ‘moving fast’ and ‘breaking things’ can have significant ramifications for patients, especially when considering that the industry is only now beginning its digital transformation. Dr. Anjum Khurshid of the Dell Medical School at the University of Texas at Austin and a director of the Austin Blockchain Collective is leading research on a patient credentialing platform called MediLinker. When pressed about the ability to extend the platform into an EMR-type technology that could exchange clinical data, Dr. Khurshid cautioned that it’s more important to start small and clinically validate each stage of the product. In an industry like healthcare, which handles high-fidelity information and is typically averse to new technologies, Dr. Khurshid said it’s important to demonstrate the value of the technology and make it more approachable. He said that the problems in the current healthcare system are so vast that even simple solutions can have a massive impact, and that it is imperative to validate the benefit of the technology to patients, providers, and payers alike at each step. Any truly innovative and sweeping changes that are to take place in healthcare from blockchain will require all of these parties to work together and identify applications that can drive meaningful value for everyone involved. Healthcare is changing rapidly, and only by taking small, incremental steps will blockchain be able to integrate with the complex, multi-stakeholder ecosystem that is healthcare.

That’s all for this week! We at Consensus Networks are grateful to have been able to attend this conference and are excited about the work going on in Austin to move the industry forward. We are continuing forward with the development and commercialization of our population health data sharing network, HealthNet, as well as smart tools for analyzing health data. If you are interested in learning more about HealthNet or have an idea for a new digital healthcare application you’d like to build, contact one of our experts here today!