Law isn’t code. The vote is just the beginning.
In the data portability world, foundational laws like the EU’s General Data Protection Regulation and California’s privacy laws provide users with the right to request their data from businesses, and the Digital Markets Act in Europe imposes additional portability duties on designated gatekeepers. But tech policy doesn’t stop being relevant when laws are adopted; on the contrary, that’s when the real work begins.
Lawrence Lessig was one of the writers I read extensively in grad school in the early 2000s. Among many other contributions, Lessig is famous for the phrase “Code is Law,” characterizing the far-reaching practical effects that software and hardware design can have on regulating behavior. Coined in 1999, this evocative phrase marked the growth in scholarly (and, eventually, public) understanding that digital technologies aren’t just powering business tools, games, and other things ancillary to everyday life, but are rather interwoven with our economy and society in deep and fundamental ways.
For many years, I spoke to every new class of hires at Mozilla about tech policy. One of the talking points I used in all of those presentations (later incorporated into my “Tech Policy 101” talks given to fellows at the Aspen Tech Policy Hub, Foundation for American Innovation, and other venues) was an addendum to Lessig’s famous quote: “Code is law, but law is law, too.” Technology design shapes how we are able to use it in our lives, and laws in turn shape how technology is designed and deployed.
As we work at DTI on cutting-edge tech policy related to data portability, I’m starting to think this cognitive frame deserves a new dimension: “But law is not code.” Many of today’s technology laws are adopted without perfect clarity as to how, exactly, they should be implemented. In addition, law is not self-executing – it’s a very human product, and turning the words of law into constraints on action involves subsequent human processes.
I don’t mean anything normative when I say “Law is not code,” by the way. In 2023, we’re mostly past the collective honeymoon period in which we held tech companies blameless for their products and designs; code isn’t inherently superior or objective.
Of course, unclear language in a law can create significant costs and risks that may have been avoided with greater clarity up front. But not taking action – not passing a law – is itself an action, with consequences. So while there are certainly instances where the cost of uncertainty in a law exceeds the cost of inaction, I’m very much not of the view that we should only pass laws when all uncertainty can be removed.
The modern policy reality has shifted.
Policy advocacy has historically centered on dramatic campaigns leading up to a highly visible vote, a discrete “go/no-go” moment that determines whether a bill becomes a law or whether a rule is approved. We build our governance processes around this center. Before the vote, we focus on transparency: what is in the pending law and why, and who lobbied for or against the measure. But after the vote, too often, all of the energy just … dissipates. The “interesting” part is over, the “policy” part is over, now it’s just boring enforcement work.
Or so the story was told. But because law is not code, and because lawmakers racing to keep pace with technology development are passing laws with more and more uncertainty as to their intended implementation, the real work in tech policy now happens after the vote. That’s when we all have to figure out what, exactly, to do.
The EU’s Digital Markets Act is refreshingly honest about this modern governance dynamic. Its Article 6 adopts “Obligations for gatekeepers susceptible of being further specified” – one of which, paragraph 9, is the DMA’s data portability language:
9. The gatekeeper shall provide end users and third parties authorised by an end user, at their request and free of charge, with effective portability of data provided by the end user or generated through the activity of the end user in the context of the use of the relevant core platform service, including by providing, free of charge, tools to facilitate the effective exercise of such data portability, and including by the provision of continuous and real-time access to such data.
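The DMA doesn’t spell out what “continuous and real-time access” should look like in practice – that’s exactly the kind of post-vote implementation question this piece is about. As a purely illustrative sketch (nothing below is specified by the regulation; the record fields, the cursor scheme, and every name are my own assumptions), one could model it as an incremental, cursor-based feed of a user’s activity that an authorized third party reads on the user’s behalf:

```python
# Purely illustrative sketch: one way "continuous and real-time" portability
# could be modeled: an incremental, cursor-based activity feed that an
# authorized third party reads on a user's behalf. Nothing here is specified
# by the DMA; record fields, the cursor scheme, and names are assumptions.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Iterator


@dataclass
class ActivityRecord:
    record_id: int      # monotonically increasing, so it doubles as a cursor
    user_id: str
    kind: str           # e.g. "post", "like", "follow"
    payload: dict
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# Hypothetical in-memory stand-in for the gatekeeper's export store.
EXPORT_STORE: list[ActivityRecord] = [
    ActivityRecord(1, "user-42", "post", {"text": "hello"}),
    ActivityRecord(2, "user-42", "like", {"target": "post:99"}),
]


def continuous_export(user_id: str, after_cursor: int = 0) -> Iterator[ActivityRecord]:
    """Yield the user's records created after `after_cursor`.

    A real deployment would authenticate the third party (per the user
    authorization the DMA text requires), stream over HTTP, and push new
    records as they occur rather than waiting for the next poll.
    """
    for record in EXPORT_STORE:
        if record.user_id == user_id and record.record_id > after_cursor:
            yield record


# A third party resuming from cursor 1 receives only the newer record.
for rec in continuous_export("user-42", after_cursor=1):
    print(rec.record_id, rec.kind, rec.payload)
```

Even a toy model like this surfaces the open implementation questions: how the third party’s authorization is verified, how far back the feed reaches, and whether new records are pushed in real time or polled for.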
We don’t really know what kinds of future innovations and competitive dynamics the DMA will unlock. (Though I’m certainly not alone in saying that the DMA and DSA will have GDPR-like echoes around the world.) Nor do we know the policy questions that will come up along the way.
What happens if data portability really takes off?
Here’s a thought experiment. What happens if the scale of third-party data portability, on behalf of users, expands significantly – including the still-undefined “continuous and real-time” portability required under the Digital Markets Act? As I wrote in my last newsletter, it’s possible to imagine a world where the scale of third-party access to user data becomes quite significant.
User access to their data must be provided at no cost to the user; that principle is enshrined in both the GDPR and the DMA. But if a third party is effectively aggregating many users’ worth of data on their behalf, is there a threshold at which the host social platform is in practice operating more as a foundation for downstream tools than as a direct user experience?
Digital platforms create compelling downstream business opportunities for third-party services, and their scale expands those opportunities particularly for small businesses, as the 2016 Digital Economy Board of Advisors report emphasized. If at-scale personal data portability expands that horizon, requiring access to be provided for free in perpetuity would create some challenging incentives, which could be better aligned if the resulting revenue could be shared with the platform.
Perhaps this all feels pretty hypothetical in 2023. But it’s exactly the future I proposed Twitter embrace in a piece I wrote with Richard Reisman last year, “The Future of Twitter is Open, or Bust.” In it, we proposed that rather than locking down the network to monetize it, Twitter should encourage third parties to layer their own client experiences, including content moderation, over the platform, and find a pathway to benefit from others’ innovations and success downstream. The value of a social network is incredible, and it can be hard to monetize that value effectively with a single client interface, or even with a single approach to content moderation.
How a platform could charge for access is an open question.
For much of the history of telecommunications services, regulators have set the rates that wholesale network operators charge retail providers for access. The idea that such rate regulation could creep into the internet sector in any way has been the subject of intense debate for years.
This hypothetical market – for paid third-party personal data transfers – is currently nonexistent, and the baseline is zero-cost user access. Presumably, without any baseline for evaluating commercial viability, rate regulation would be unwise, if not outright impossible. But what is the law likely to say? The recently finalized Data Act provides some clues.
Article 9 of the Data Act, titled “Compensation for making data available,” establishes rules to govern payments between businesses for (presumably non-personal) data transfers. At its core, such charges must be “nondiscriminatory and reasonable.” In many contexts, this standard is abbreviated “RAND” – short for Reasonable And Non-Discriminatory. Sometimes, an alternative formulation is used: FRAND, adding “Fair” into the mix.
Reasonable and nondiscriminatory is a compelling principle. But Article 9 also says “The Commission shall adopt guidelines on the calculation of reasonable compensation.” So it’s unclear at this point whether the Commission will use RAND principally as a remedy to evaluate disagreements on pricing, or as a lever to try to influence rates before negotiation; and the market repercussions of such interpretations will be significant.
Because law is not code, the words adopted by legislatures and regulators tell only the beginning of the story, including for future market-defining questions like how downstream innovations could emerge from digital platforms in a sustainable way. The work of policy advocates is critical throughout the journey, and can’t stop after Chapter 1.
DTI In the News
Why the Internet Needs a Data Portability Ecosystem (by Delara Derakhshani and Zander Arnao)