The rise and rebirth of P2P – and why it matters

Lurking deep inside the crypto froth, hidden between NFTs for mammoth tusks, virtual mega-yachts in the metaverse, and Bitcoin, there is a technology trend so transformative it’s going to usher in the next wave of Decacorn startups – and almost no one is talking about it.

There’s no shortage of buzzword bingo in the crypto space, from proof-of-whatever to ERC-20 elliptical-Ed25519-thingamajig, but there’s one little acronym gluing things together that you need to pay attention to. Dust off your Doc Martens and your candy-colored Mac, because we’re about to kick it 1990s old school with a little thing called P2P.

If you’re old enough to remember those glorious days of music piracy, day trading, and khakis, you probably thought of Napster the moment I said P2P. Kids, Napster was an app your mom and I used to trade pirated music files with the entire planet. Get off my lawn.

Shawn Fanning invented Napster when he was a college student. Way back then, one had to purchase a thing known as an album, which did not conveniently play on demand from your handheld all-in-wonder smartphone. Stay with me.

If you wanted to use Steve Jobs’ new iPod, you needed to buy an album and rip those phat tracks to MP3. College students, then as now, didn’t exactly have a lot of cash to torch on albums, and Fanning took note that his friends were struggling to, uh, “share” MP3s with each other using web and file servers.

Fanning’s idea was a system to stitch together everyone’s personal computers and network connections into one, big, virtual directory for sharing music. The magic of Napster was its peer-to-peer (P2P) protocol, which allowed each independent computer to coordinate with millions of others around the world to serve a common purpose. The community was the cloud.
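
For the curious, the Napster model boils down to one central index plus direct peer-to-peer transfers. Below is a toy sketch in TypeScript – the class, peer, and file names are made up for illustration, and it glosses over Napster’s actual wire protocol – showing the essential shape: peers announce what they have, searches return peer addresses, and the bytes themselves flow directly between peers.

// Toy sketch of the Napster model: one central index, many peers.
type PeerAddress = string; // e.g. "203.0.113.7:6699"

class CentralIndex {
  // song title -> peers claiming to host it
  private index = new Map<string, Set<PeerAddress>>();

  // A peer announces the files it is willing to share.
  announce(peer: PeerAddress, files: string[]): void {
    for (const file of files) {
      if (!this.index.has(file)) this.index.set(file, new Set());
      this.index.get(file)!.add(peer);
    }
  }

  // A search returns peers to download from directly; the index
  // itself never touches the file bytes.
  search(file: string): PeerAddress[] {
    return [...(this.index.get(file) ?? [])];
  }
}

const directory = new CentralIndex();
directory.announce("203.0.113.7:6699", ["freebird.mp3"]);
console.log(directory.search("freebird.mp3")); // -> ["203.0.113.7:6699"]

That central index was Napster’s greatest strength and its fatal flaw: it made search instant, and it handed the lawyers a single, very suable point of failure. Later designs set out to decentralise the directory itself.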

Then came an explosion of P2P file-sharing apps and protocols, including Kazaa, Gnutella, eMule, LimeWire, BearShare, Freenet, BitTorrent, and many more. This led to serious academic research into P2P network architectures, culminating in protocols like Pastry and Chord, each of which leveraged distributed hash tables (DHTs).
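
If you haven’t met a DHT before, the core trick is simple: hash node IDs and keys onto the same circular ID space, and make each key the responsibility of the first node clockwise from it (its “successor”). Here’s a minimal TypeScript sketch of just that placement rule, assuming Node.js for the hash function; real Chord adds finger tables so lookups take O(log n) hops instead of scanning every node, and Pastry routes by matching ID prefixes.

import { createHash } from "node:crypto";

// Map a string onto a position on a 32-bit ring.
const RING_SIZE = 1n << 32n;

function ringHash(value: string): bigint {
  const digest = createHash("sha1").update(value).digest("hex");
  return BigInt("0x" + digest) % RING_SIZE;
}

// A key lives on the first node whose ring position is >= the key's
// position, wrapping around to the lowest node if necessary.
function successor(key: string, nodes: string[]): string {
  const keyPos = ringHash(key);
  const sorted = nodes
    .map((node) => ({ node, pos: ringHash(node) }))
    .sort((a, b) => (a.pos < b.pos ? -1 : 1));
  return (sorted.find((n) => n.pos >= keyPos) ?? sorted[0]).node;
}

const peers = ["peer-a.example", "peer-b.example", "peer-c.example"];
console.log(successor("freebird.mp3", peers)); // -> whichever peer owns that slice of the ring

The payoff is that any peer can work out who is responsible for a key without asking a central index, and when nodes join or leave, only a small slice of keys has to move.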

After a few years and a lot of lawsuits aimed at file sharers, academic interest and P2P development began to wane, right up until Bitcoin and distributed ledger technology exploded onto the scene. That brings us to the present moment, in which P2P networking and DHTs are a critical part of modern decentralised protocols like Bitcoin, Ethereum, IPFS, and many others.

The Internet itself was designed from the very beginning to be decentralised. The fact that most of our data and computing ended up being centralised on Big Tech’s cloud platforms is something of an evolutionary fluke. To understand why, it is helpful to have some historical context.

We can think of the evolution of the Internet as being divided into four distinct phases. The First Internet (the “Al Gore Internet”) was about connecting computers together over a global network for the first time. The Second Internet was about getting people and businesses online and conducting commerce digitally for the first time. The Third Internet, the one we’re still in now, was all about mobile computing.

The scale challenges created by the sudden demand for smartphones and their apps created intense economic pressure to centralise. Housing millions of similar servers in gigantic data centres was and is more economically efficient, and technically “good enough” for the mobile apps of the day.

Things are about to change in what we call the Fourth Internet, which is about connecting machines to other machines. Imagine a future where human beings are surrounded by millions of connected sensors, autonomous robots, smart vehicles, and other devices which work together to seamlessly improve our quality of life. How will these different devices share and process data, and how will developers write apps that run everywhere?

Building apps for the Fourth Internet presents unique challenges, driven by the growth of data and the latency between edge devices and the cloud. Apps are moving away from abstract uses like gaming, web browsing, and social media, and into the real world, where they will drive our cars, operate heavy machinery, augment our senses, and make decisions in real time.

In other words, speed matters in the Fourth Internet. Connecting things, sharing data, making decisions – all of it has to happen dynamically, across a mix of devices and networks, in real time, everywhere. It’s here that P2P will be truly transformative.

For engineers and developers, building distributed systems is extremely complex, and P2P systems are among the toughest of all. Emerging startup Protocol Labs is advancing the state of the art with open source projects including libp2p and IPFS, making it easier for developers to build P2P apps. If you’ve heard of libp2p outside of developer circles, it’s probably because Polkadot, Ethereum 2.0, and Substrate all use libp2p to build their own P2P protocols.
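
To give a sense of what “easier” looks like, here’s a bare-bones js-libp2p node in TypeScript. Hedge alert: the js-libp2p API has shifted between releases (some option names, such as connectionEncryption, have since been renamed), so treat this as a sketch of the library’s shape – one transport, one connection encrypter, one stream muxer – rather than a copy-paste recipe for whichever version you happen to install.

import { createLibp2p } from "libp2p";
import { tcp } from "@libp2p/tcp";
import { noise } from "@chainsafe/libp2p-noise";
import { mplex } from "@libp2p/mplex";

async function main() {
  // Assemble a node from pluggable parts: a TCP transport, Noise
  // encryption, and the mplex stream multiplexer.
  const node = await createLibp2p({
    addresses: { listen: ["/ip4/0.0.0.0/tcp/0"] }, // any free port
    transports: [tcp()],
    connectionEncryption: [noise()],
    streamMuxers: [mplex()],
  });

  await node.start();
  console.log("peer id:", node.peerId.toString());
  node.getMultiaddrs().forEach((addr) => console.log("listening on", addr.toString()));
  await node.stop();
}

main().catch(console.error);

Swap the transport for WebSockets and much the same node can run in a browser – that modularity is the whole point.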

I know, you thought the best years of P2P were behind us. Better get out your checkered flannel and sock puppet, because we’re going to NFT that stuff, apparently. Don’t just take my word for it, though: take a quick virtual trip to the USPTO patent search or Google Scholar, and you’ll see what I mean.

Written by James Thomason, CTO and co-founder of EDJX
