Markkula Center for Applied Ethics

Why Private Blockchains Are Interesting



A response to Bruce Schneier

Ahmed Amer

Ahmed Amer is an associate professor of computer engineering at Santa Clara University and an Emerging Issues Fellow at the Markkula Center for Applied Ethics. Views are his own.

In a recent article titled “There’s No Good Reason to Trust Blockchain Technology,” noted security expert Bruce Schneier writes,

Trust is essential to society. As a species, humans are wired to trust one another. Society can’t function without trust, and the fact that we mostly don’t even think about it is a measure of how well trust works. …

What blockchain does is shift some of the trust in people and institutions to trust in technology. You need to trust the cryptography, the protocols, the software, the computers and the network. And you need to trust them absolutely, because they’re often single points of failure.

He argues that “when you analyze both blockchain and trust, you quickly realize that there is much more hype than value. Blockchain solutions are often much worse than what they replace.”

I confess to sometimes getting defensive in the face of criticisms of blockchain technology, but that's because they often come in the form of inaccurate statements about the underlying technology, or of technically illiterate arguments that conflate the basic technology's potential usefulness with the viability of specific cryptocurrencies. In this case, however, I largely agree with Bruce Schneier, because he makes no such claims. Rather, he's saying that any blockchain solution is only as secure and trustworthy as the most vulnerable of the individual pieces that go into realizing it.

If I wanted to exaggerate my one point of disagreement with this article, I would argue that his point is an uninteresting observation, one long known to be true in the security world. For example, the strongest encryption algorithm is pointless if it is implemented within a system that mishandles the unencrypted data, and we've long known this. Of course, to say something like that would be unfair. Just because a problem is well known does not mean it has been solved, let alone that it should be ignored whenever we run into it.

Or, to put it another way: I see no fundamental criticism of bicycle helmets, or of the idea that a helmet is good to wear when riding a bicycle, in someone (especially someone who really understands helmets) pointing out that it's unhelpful to wear a bicycle helmet on one's foot.

Where I disagree with Schneier is on subjective statements like "[p]rivate blockchains are completely uninteresting," and I am concerned with how the definition of a private blockchain can be misunderstood. He's correct to say that distributed consensus among an authorized list of users is a well-studied problem, but if distributed consensus algorithms were "uninteresting," then the data storage and systems software research efforts that tackle data storage and consistency at scale would not be needed. Efforts by Amazon, Facebook, and Google to solve such problems at scale would not be necessary, and would not have raised any interest among systems software researchers; but they are, and they do.
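To give a flavor of what "consensus among an authorized list of users" means, here is a deliberately toy sketch (my own illustration, not any real protocol): a value is committed only when a strict majority of a fixed, known membership votes for it. The interesting research problems, the ones systems like Paxos, Raft, and ZooKeeper's atomic broadcast actually solve, arise when leaders fail and messages are lost or reordered; none of that is modeled here.

```python
# Toy illustration of majority-quorum agreement among a fixed,
# authorized membership. Names and structure are illustrative only.
from collections import Counter

MEMBERS = ["node-a", "node-b", "node-c", "node-d", "node-e"]  # known, authorized list


def commit_decision(votes):
    """Return the committed value if a strict majority of MEMBERS agree, else None.

    `votes` maps a node name to the value that node voted for; votes from
    nodes outside the membership list are ignored.
    """
    quorum = len(MEMBERS) // 2 + 1
    tally = Counter(value for node, value in votes.items() if node in MEMBERS)
    for value, count in tally.most_common(1):
        if count >= quorum:
            return value
    return None


# Three of five members agree, so the value commits:
print(commit_decision({"node-a": "tx1", "node-b": "tx1",
                       "node-c": "tx1", "node-d": "tx2"}))  # tx1
# No majority, so nothing commits:
print(commit_decision({"node-a": "tx1", "node-b": "tx2"}))  # None
```

Even this trivial version shows why the membership list matters: the quorum size is defined relative to the authorized set, which is exactly the setting a "private" blockchain operates in.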

What's uninteresting from a security expert's view is not necessarily uninteresting from a distributed/operating systems perspective. In the interest of full disclosure, I should explain that I consider some of the creators of such systems, some truly wonderful storage systems researchers, to be dear friends and respected colleagues, so I am perhaps being overly sensitive to a perceived slight against operating and storage systems research (and that's not just because of my modest role in organizing the latest iterations of the longest-running storage systems conference, which we will host at Santa Clara University in May).

In addition, when Schneier says that a "private" blockchain is uninteresting, he can be read as implying that private means a limited number of users. But "private" just means limited to identifiable, authorized users, and since there is no reason a "private" blockchain could not have a user list encompassing billions of people (if not the entire planet), any such assumption about "private" is misleading. Perhaps I'm being a little pedantic; I just think it's important to clarify that "private" doesn't mean "limited to a manageable subset of possible users," and that "authenticated users" can easily mean "everyone."
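The distinction can be made concrete with a minimal sketch (again my own toy example, not a real API): all that makes the chain below "private" is that appends are identity-checked against an authorized set, and nothing bounds the size of that set.

```python
# Toy "private" append-only hash chain: the only private-ness is an
# identity check on append. The authorized set could hold billions of
# identities; class and method names here are illustrative assumptions.
import hashlib


class PrivateChain:
    def __init__(self, authorized):
        self.authorized = set(authorized)  # can be arbitrarily large
        self.blocks = [("genesis", "0" * 64)]  # (data, hash) pairs

    def append(self, user, data):
        if user not in self.authorized:
            raise PermissionError(f"{user} is not an authorized participant")
        prev_hash = self.blocks[-1][1]
        digest = hashlib.sha256((prev_hash + data).encode()).hexdigest()
        self.blocks.append((data, digest))


# A million authorized identities is still a "private" chain:
chain = PrivateChain(authorized={f"user-{i}" for i in range(1_000_000)})
chain.append("user-42", "hello")  # any authorized identity may write
print(len(chain.blocks))          # 2
```

Swap the million-entry set for one enumerating every person on the planet and the chain is, by the definition at issue, still "private."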

Importantly, the point on which I cannot agree with Schneier strongly enough is his statement that "any evaluation of the security of the system has to take the whole socio-technical system into account." I'd go further and argue that this is true of all systems we build, computer-based or not; but then again, the point does seem to be dangerously overlooked in the security domain, and Bruce Schneier has long been a great voice calling out that sort of nonsense. So I agree with the caution and prudence he advises; I simply disagree with the inaccurate conclusions a casual reader might draw from the headline and the article's tone.

[A footnote: aside from the complexity of achieving distributed consistency, which made projects like ZooKeeper so broadly useful and projects like Google's Spanner so interesting to systems researchers, it is worth noting that the keynote at last year's USENIX ATC was a fascinating talk examining blockchain through a classical systems lens, and vice versa.]

Photo by Ladislau Girona, cropped, used under a Creative Commons license.

Feb 15, 2019
