The Rise of the Intermediaries

Intermediaries play an essential role in shaping the user experience on the Internet, acting as middlemen that facilitate communication between users and the vast network of online resources. These intermediaries, which encompass a range of tools and services such as Content Delivery Networks (CDNs), malware detectors, and privacy-preserving tools, work together to enhance the speed, security, and privacy of users as they navigate the Internet.

In the context of Internet infrastructure, intermediaries collectively contribute to a more resilient and user-friendly online environment. Users may not be explicitly aware of the workings of these intermediaries, but they directly benefit from their presence through improved speed, enhanced security, and increased privacy. As the Internet continues to evolve, the role of intermediaries will likely become even more critical in addressing emerging challenges and meeting the evolving needs of users.

Three Key Intermediaries

One of the key intermediaries is the Content Delivery Network (CDN). CDNs are distributed networks of servers strategically located across the globe. When a user requests a web page or any online content, the CDN helps deliver it more efficiently by caching and storing copies of the content on servers close to the user's geographical location. This reduces latency and accelerates the loading time of websites, creating a smoother and faster browsing experience. Users may not directly interact with CDNs, but they benefit from their presence by enjoying speedier access to online content.
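
To make the caching idea concrete, here is a minimal Python sketch of an edge cache: the first request for a page travels back to the origin, and later requests from the same region are answered locally. The region names, edge nodes, and content are all hypothetical, and real CDNs select edges with DNS and anycast routing rather than a lookup table.

```python
# Toy sketch of the CDN idea: serve content from the edge node nearest
# to the user, caching a copy there on the first request.

ORIGIN = {"/index.html": "<html>hello</html>"}  # the authoritative copy

edges = {
    "eu-west": {},   # per-edge cache: path -> content
    "us-east": {},
    "ap-south": {},
}

# Hypothetical mapping of user regions to their nearest edge node.
NEAREST_EDGE = {"Ireland": "eu-west", "Virginia": "us-east", "Mumbai": "ap-south"}

def fetch(user_region: str, path: str) -> str:
    edge = NEAREST_EDGE[user_region]
    cache = edges[edge]
    if path not in cache:                # cache miss: go back to the origin once
        cache[path] = ORIGIN[path]
    return cache[path]                   # later requests are served locally

print(fetch("Ireland", "/index.html"))   # fetched from origin, cached at eu-west
print(fetch("Ireland", "/index.html"))   # served straight from the eu-west cache
```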

Malware detectors represent another category of intermediaries that operate in the Internet infrastructure. These tools identify and mitigate malicious software or code that could harm users' devices or compromise their data. Malware detectors often work in real-time, scanning websites, downloads, and email attachments to detect and block potential threats. From the user's perspective, these intermediaries operate silently in the background, providing a layer of security that helps ensure a safer online experience.
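
As a rough illustration of the simplest detection technique, signature matching, the sketch below hashes incoming content and compares it against a blocklist of known-bad digests. The "malware" and its hash are invented for the example; real detectors combine signatures with heuristics and behavioral analysis.

```python
# A minimal sketch of signature-based scanning: hash the content and
# compare the digest against a blocklist of known-bad hashes.
import hashlib

KNOWN_BAD_SHA256 = {
    # Hypothetical digest of a known malicious payload.
    hashlib.sha256(b"EVIL_PAYLOAD").hexdigest(),
}

def scan(attachment: bytes) -> bool:
    """Return True if the content matches a known-bad signature."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_BAD_SHA256

print(scan(b"EVIL_PAYLOAD"))   # True  -> block the download
print(scan(b"holiday.jpg"))    # False -> allow it through
```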

Privacy-preserving tools, including virtual private networks (VPNs) and anonymizing proxies, are intermediaries that focus on safeguarding user privacy. These tools route Internet traffic through secure channels, encrypting data and masking the user's IP address. By doing so, privacy-preserving intermediaries help users maintain anonymity, protect sensitive information, and evade potential surveillance. Users who employ VPNs, for example, can access the Internet with an added layer of security, especially when using public Wi-Fi networks.
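
The sketch below models that tunnel in a few lines of Python: the request leaves the client encrypted, and the destination only ever sees the VPN server's address. The XOR cipher and the addresses are stand-ins invented for the example; a real VPN would use a tunnel protocol such as WireGuard or IPsec.

```python
# Toy model of a VPN tunnel: the local network sees only ciphertext, and
# the destination sees the VPN server's IP rather than the client's.
import secrets

KEY = secrets.token_bytes(32)  # shared between client and VPN server

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher for illustration only; never use in practice.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def client_send(request: bytes) -> bytes:
    return xor_cipher(request, KEY)          # encrypted before leaving the device

def vpn_forward(ciphertext: bytes) -> tuple[str, bytes]:
    request = xor_cipher(ciphertext, KEY)    # decrypted inside the tunnel
    return ("203.0.113.7", request)          # destination sees the VPN's address

source_ip, plaintext = vpn_forward(client_send(b"GET /private HTTP/1.1"))
print(source_ip, plaintext)   # the user's own IP address never appears
```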

Intermediaries and Standards at the IETF

This last collection of intermediaries is interesting from a standards perspective. For many years, the IETF has been a strong advocate for both security and privacy. In fact, every new RFC that the IETF publishes must have a "Security Considerations" section, which addresses the relationship between the published standard and any security issues that are expected to arise as that standard is implemented. The "Security Considerations" section does not always include "Privacy Considerations," but the IETF and the IAB have been working extensively on privacy-preserving protocols.

Positioning from the IAB

For starters, the IAB has drafted an introduction to privacy-preserving approaches using intermediaries. The draft, currently in the RFC Editor's queue and being prepared for publication as an RFC, explains the historical approach to proxies and then examines four different protocols that use intermediaries as a privacy tool. Most importantly, the draft also examines the limitations of using intermediaries.

Privacy Pass

Have you ever gone to a website and been asked to identify which of nine pictures has a bicycle in it? Such a tool aims to separate human users of a website from robots. It would be better if there were another, less frustrating way to do this. One of the examples in the IAB paper is a protocol called Privacy Pass. In a nutshell, Privacy Pass is designed to let a user prove they are a legitimate, human visitor using cryptography rather than puzzles. In addition, Privacy Pass allows users to request resources from a server without divulging private information.

In practice, Privacy Pass is a complicated protocol with working parts called Origins, Attesters, and Issuers. Imagine you went to a summer beer-tasting festival. You bought your tickets online, and when you arrived at the festival park, you got a wristband for the day and two tickets for beer samples. The wristband proves you paid for that day's admission, and the tickets can't be traced back to you. Still, the festival knows you've paid, and the beer tent knows to pour you a sample without ever learning who you are.

Privacy Pass is like that but for the Internet. It connects content users with content providers but separates the two so that the user is assured of some privacy during the transaction. The IETF is about to publish the protocol and the architecture as an RFC (the documents are currently in the RFC Editor's queue), and Cloudflare has produced browser extensions for Firefox and Chrome to take advantage of websites that support the protocol. In addition, Apple's Private Access Tokens are an implementation of Privacy Pass.
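
The toy Python sketch below traces that flow through the three roles. It shows only who talks to whom: real Privacy Pass uses blinded cryptography (oblivious PRFs or blind RSA signatures) precisely so the Issuer cannot link a token it issued to the moment it is spent, a property the plain HMAC below does not have.

```python
# Highly simplified Privacy Pass message flow: Attester vouches for the
# client, Issuer mints a one-time token, Origin redeems it. The HMAC here
# is a placeholder for the blinded cryptography the real protocol uses.
import hashlib
import hmac
import secrets

ISSUER_KEY = secrets.token_bytes(32)   # held by the Issuer, shared with Origins

def attester_check(client_signals: str) -> bool:
    # The Attester vouches that the client is, say, a real device or human.
    return client_signals == "looks-human"

def issuer_issue(attested: bool):
    # The festival wristband: a one-time token, not tied to the client's name.
    if not attested:
        return None
    nonce = secrets.token_bytes(16)
    tag = hmac.new(ISSUER_KEY, nonce, hashlib.sha256).digest()
    return nonce, tag

def origin_redeem(token) -> bool:
    # The beer tent: accepts the ticket without learning who bought it.
    nonce, tag = token
    expected = hmac.new(ISSUER_KEY, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

token = issuer_issue(attester_check("looks-human"))
print(origin_redeem(token))   # True: access granted, no CAPTCHA, no identity
```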

OHTTP

Another protocol mentioned in the IAB paper is Oblivious HTTP, or OHTTP for short. HTTP has always had the concept of a relay built in, but OHTTP takes things a step further. Web servers have long had the ability to collect information from the browsers that access them, and by correlating visits across a range of websites, a server can capture an amazing amount of data about a user, their interests, and their behavior.

OHTTP gives a user a way to make requests of a server through a proxy. The server gets a minimal amount of information about the client, and never enough to correlate information across a collection of visits. OHTTP is primarily a transactional protocol: it isn't intended as a general-purpose protocol for viewing web content. However, in cases such as DNS lookups or single-use queries (checking whether a digital certificate has been revoked, for example), OHTTP makes sense. A recent IETF presentation included an overview of the technology and some important use cases.
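
Here is a toy sketch of that split, assuming invented addresses and a placeholder cipher: the relay learns who is asking but not what, and the gateway learns what is asked but not by whom. Real OHTTP encapsulates the request to the gateway using HPKE public-key encryption rather than the XOR stand-in below.

```python
# Toy sketch of the Oblivious HTTP split between a relay and a gateway.
import secrets

GATEWAY_KEY = secrets.token_bytes(32)   # in real OHTTP, a public HPKE key

def seal(data: bytes, key: bytes) -> bytes:
    # Toy cipher standing in for HPKE encapsulation; illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def client(request: bytes) -> tuple[str, bytes]:
    # The relay sees the client's IP, but the payload is opaque to it.
    return ("192.0.2.10", seal(request, GATEWAY_KEY))

def relay(client_ip: str, capsule: bytes) -> bytes:
    # Forwards the capsule, stripping the client's address in the process.
    return capsule

def gateway(capsule: bytes) -> bytes:
    request = seal(capsule, GATEWAY_KEY)   # decrypts; has no idea who sent it
    return b"response to " + request

client_ip, capsule = client(b"GET /dns-query?name=example.com")
print(gateway(relay(client_ip, capsule)))
```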

Other Works in Progress

While OHTTP is not great for the general consumption of media, others have turned their attention to the problem of making the transport of information and media more private. The most important of these efforts is called MASQUE (Multiplexed Application Substrate over QUIC Encryption). Like OHTTP, MASQUE is built on top of HTTP. However, it differs from OHTTP in that it is intended to support multiple simultaneous streams inside a single HTTP connection.
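
The sketch below shows just the multiplexing idea in miniature: frames from independent streams are tagged with a stream ID, interleaved onto one "wire," and demultiplexed at the far end. Real MASQUE carries proxied UDP and IP flows over QUIC streams inside HTTP/3; everything here is simplified for illustration.

```python
# Toy illustration of multiplexing: many streams share one connection,
# with each frame tagged by a stream ID so the far end can reassemble.
from collections import defaultdict

def multiplex(streams: dict) -> list:
    # Interleave frames from all streams onto a single ordered wire.
    wire = []
    queues = {sid: list(chunks) for sid, chunks in streams.items()}
    while any(queues.values()):
        for sid, q in queues.items():
            if q:
                wire.append((sid, q.pop(0)))
    return wire

def demultiplex(wire: list) -> dict:
    out = defaultdict(bytes)
    for sid, frame in wire:          # the stream ID routes each frame home
        out[sid] += frame
    return dict(out)

wire = multiplex({1: [b"DNS ", b"query"], 2: [b"video ", b"chunk"]})
print(demultiplex(wire))   # {1: b'DNS query', 2: b'video chunk'}
```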

The work on MASQUE has already led to two very useful documents: Proxying IP in HTTP (RFC 9484) and Proxying UDP in HTTP (RFC 9298). Standardization for MASQUE is still in its early stages, but a mailing list supports the working group and is an excellent place to learn more.

A different working group is addressing the problem of preserving privacy while measuring Internet traffic. The Distributed Aggregation Protocol (DAP) is built to preserve privacy while measuring sensitive data. DAP uses an intermediary to aggregate data without disclosing private information to those making the measurements.
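
A minimal sketch of the underlying trick, additive secret sharing: each client splits its measurement into shares that individually look random, so no single aggregator ever sees a real value, yet the shares still sum to the true total. Real DAP layers verifiable computation (VDAFs) on top of this idea; the values and modulus below are arbitrary.

```python
# Additive secret sharing: each aggregator receives one share per client,
# sees only noise, and the combined sums reveal nothing but the total.
import secrets

MODULUS = 2**61 - 1   # arithmetic is done modulo a large prime

def share(measurement: int):
    r = secrets.randbelow(MODULUS)
    return r, (measurement - r) % MODULUS    # two shares, each looks random

# Three hypothetical clients report sensitive values without revealing them.
reports = [share(v) for v in (5, 11, 2)]

agg_a = sum(s[0] for s in reports) % MODULUS   # aggregator A's view: noise
agg_b = sum(s[1] for s in reports) % MODULUS   # aggregator B's view: noise

print((agg_a + agg_b) % MODULUS)   # 18: only the aggregate sum is revealed
```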

Why Does It Matter?

The Internet began with an architectural adage called the End-to-End principle: intelligence was supposed to reside at the edges of the network, while the middle simply transported bits without any extra services. That principle is history. The rise of the intermediaries marks an interesting evolution in the kinds of protocols the IETF designs. In the past, the IETF largely concentrated on protocols between two endpoints; now, protocols are being designed with a deliberate man-in-the-middle.

But if protocols begin to bake in dependencies on intermediaries, then protocol design is enforcing network and economic centralization. We will address that concern in a future post.
