Keeping Thumbs Off the Scale: Nondiscrimination on Digital Platforms
August 28, 2019
In a recent podcast episode, economist Luigi Zingales related a joke familiar in tech circles: “Why did it take police years to find the body?” “It was buried on page two of Google’s search results!” The witticism rings all too true to internet users. We know that our likelihood of selecting a digital pathway is based on where it appears in our feeds. Our trust in these rankings gives the companies that control them a great deal of power in online economies. This is why experts are increasingly convinced that we need a new cop on the beat, to make sure the gatekeepers who generate those feeds aren’t putting their thumbs on the scale to stifle competition.
The method that first made Google Search famous was its algorithm, which relied on users to determine a site’s search ranking by linking to it and clicking on it (and thus attesting to its value). This method solved the age-old problem of search: how to make sure the most relevant and valuable sites come first. Digital platforms, be they search engines like Google or marketplaces like Amazon and the Apple app store, rely on similar algorithms, which have since conditioned us to trust the top search results by virtue of the wisdom of crowds. But this logic assumes that the algorithms doing traffic control only discriminate based on the user’s preferences. And in recent years, reports have emerged that some of the large platforms may nudge their algorithms to favor their own results over those of competitors.
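The link-counting idea can be made concrete with a toy sketch. This is not Google's actual algorithm (real search weighs a great many signals), and the domain names below are invented; it only illustrates the core insight that a page's score derives from the scores of the pages linking to it, so that links act as votes of confidence.

```python
# Illustrative sketch of link-based ranking in the spirit of PageRank.
# A page's score is built up from the scores of the pages that link to
# it; the damping factor models a user occasionally jumping elsewhere.

def page_rank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform prior
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue  # dangling page: passes no link value onward
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share  # each inbound link passes value
        rank = new_rank
    return rank

# Hypothetical three-page web (names are invented for illustration).
web = {
    "search.example":     ["maps-rival.example"],
    "maps-rival.example": ["search.example"],
    "blog.example":       ["maps-rival.example", "search.example"],
}
ranks = page_rank(web)
# The two well-linked pages outscore the blog that nothing links to.
print(sorted(ranks, key=ranks.get, reverse=True))
```

Notice that nothing in this scoring cares who owns a page; the worry the article raises is precisely that a platform could quietly add a term that does.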
Consider the implications for competition online. If the top results on Amazon’s Marketplace become reserved for AmazonBasics, start-up retailers may find themselves cut out of a huge portion of e-commerce. The same could be said for apps in Apple’s app store. Google may not sell its own products, but if Search prioritizes Google services over rivals (for instance, prioritizing Google Maps over Yelp or YouTube over other video-streaming sites), it can keep surfers on-site and hoard critical ad revenue. Such was the subject of the EU’s $2.7 billion fine levied on Google in 2017.
This is known as commercial or vertical discrimination, and it’s not a new phenomenon; cable operators and railroads have been accused of similar abuses in the past. However, Harold Feld of Public Knowledge argues that digital markets are uniquely susceptible to anticompetitive discrimination. Digital platforms benefit from powerful network effects and vast repositories of data that can enable them to dominate their particular lane (search, e-commerce, apps, etc.). And dominant platforms can be gatekeepers for potential rivals. Observers increasingly believe that one of the only ways to compete with these platforms is to rise to the top in a narrow set of products or services that fills a particular demand, sometimes called a “vertical.” Competitors can then expand into other “adjacent” sectors. Think about how Amazon started as a bookseller before adding other products, and Google as merely a search engine before adding Maps, Gmail, etc. However, new competitors to these giants may be forced to rely on them to get off the ground in any vertical at all. For example, a news app or navigation app may rely on the Google Play Store or the Apple App Store to be found. Similarly, a video streaming site likely relies on Google Search to be noticed. These platforms likely have the ability and the incentive to prevent a new competitor to one of their verticals from getting a fair shake.
Common Carriage: Not a Perfect Fit
In the past, the answer to problems of platform discrimination has often been a policy called common carriage. By declaring something like a railroad or a phone company a common carrier, the state requires them to provide the same level of service to all traffic. This is why we expect phone companies to treat all phone numbers equally when connecting our calls.
But common carriage falters when it comes to ranking and preferencing online. The whole point of digital platforms is to advantage the options that their algorithms think are most useful to users. Requiring all websites, products, or videos to be treated the same defeats the purpose of the service, and breaks the streamlined user experience that platforms have developed. However, there are two other possible solutions being discussed: requiring structural separation and/or imposing vertical nondiscrimination rules.
Structural separations could be created between the platform and the subsidiary that competes on the platform. This could mean separate boards, accounting streams, and a prohibition on coordination between the two companies. Antitrust attorney Lina Khan and presidential candidate Elizabeth Warren even suggest that complete separation might be necessary to address the problem online. This would mean a digital platform is completely barred from owning companies that compete in that marketplace.
Even with full structural separation, platforms could leverage favoritism toward completely unaffiliated sellers. For example, Amazon offers sellers its shipping services, “Fulfillment By Amazon,” in return for a fee. Investigative journalists and consultants have found that sellers who pay for FBA are more likely to be chosen as the default seller on an Amazon product page. Sellers also claim that Amazon leverages its platform control to discourage them from offering lower prices elsewhere or selling to other outlets (allegedly by offering better page placement, or by working to curtail counterfeiters only if sellers work exclusively with Amazon). Amazon denies any favoritism and claims that third-party sellers regularly outperform its own products.
With or without structural separation, we could protect competition and build trust right away with the creation of vertical nondiscrimination rules.
When we hear “nondiscrimination,” we often think of equality for all people of all demographics, or the First Amendment’s protections for speech and religion. However, nondiscrimination in this context refers to vertical nondiscrimination: policies designed to ensure a fair and competitive economic playing field. Platforms may have an incentive to discriminate against companies that rely on the platform (in other words, that have a “vertical” relationship to it) but that are, or may in the future become, competitors to the platform. In a vertical nondiscrimination regime, the algorithm could prioritize based on any neutral criteria (say, whichever diapers are selling best, or whichever news is getting the most clicks). Platforms could also remove content that violates community guidelines and policies. Debates about content moderation and how platforms handle hate speech online are important, but a separate conversation. What companies could not do, however, is put their thumbs on the scale to downgrade or upgrade certain results for economic advantage. The report of the United Kingdom’s Digital Competition Expert Panel (commonly known as the Furman Report) lays out a sample “code of competitive conduct” to “ensure that business users are:
- provided with access to designated platforms on a fair, consistent and transparent basis
- provided with prominence, rankings, and reviews on designated platforms on a fair, consistent, and transparent basis
- not unfairly restricted from, or penalized for, utilizing alternative platforms or routes to market.”
These nondiscrimination rules are straightforward to describe but may require new experts to effectively enforce them. Anticompetitive discrimination can be extremely difficult to detect in something as inscrutable as algorithmic search. It is hard to determine exactly why certain results are ranked above others without examining the code or testing the algorithm. However, allowing competitors to test the algorithm for evidence of discrimination could allow them to reverse-engineer the secret sauce that makes the platform so effective, and allowing companies that compete on the platform to test the algorithm could show them how to “game the system.” Feld presents several proposals that could help to solve this problem and enforce vertical nondiscrimination while maintaining the value and standards of each platform.
First, regulators should develop and run a form of “black-box” testing that can check for discriminatory behavior in an algorithm without exposing it to repeated probing designed to reverse-engineer it. The second important piece would be a speedy, transparent complaint process. Investigations by the Federal Communications Commission have been known to conclude after the statute of limitations has run or the victimized company has gone bankrupt. The Federal Trade Commission, as a law enforcement agency, provides complaining parties little insight into the status of investigations. As the University of Chicago’s Stigler Center Report on competition and digital platforms points out, clear, speedy resolution is especially key in the digital economy “[d]ue to the fast pace of change in these industries, the short amount of time it takes to destabilize or eliminate an entrant, the substantial discrepancy in bargaining power between digital bottlenecks and their business customers.” Finally, Feld calls for a private right of action so that injured parties could also sue in civil court. These solutions may not catch every attempt to discriminate, but together they could prevent the worst anticompetitive practices.
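One way a regulator's black-box test might work can be sketched in a few lines. This is a hypothetical illustration, not a description of any proposal's actual mechanics: the auditor never inspects the algorithm's code, only its outputs, comparing where platform-owned listings land versus third-party listings across many queries. All names and data below are invented.

```python
# Hypothetical black-box audit sketch: measure whether platform-owned
# listings systematically sit above third-party listings, using only
# the platform's visible output. Names and data are invented.

import statistics

def rank_gap(results):
    """results: list of (position, is_platform_owned) pairs from one
    query. Returns mean position of owned listings minus mean position
    of third-party listings. Position 1 is the top slot, so a negative
    gap means owned listings sit higher on the page."""
    owned = [pos for pos, own in results if own]
    third = [pos for pos, own in results if not own]
    if not owned or not third:
        return None
    return statistics.mean(owned) - statistics.mean(third)

def audit(query_runner, queries):
    """Average the rank gap across many queries. A persistently
    negative value is a red flag worth formal investigation, reached
    without ever exposing the algorithm's internals to probing."""
    gaps = [g for q in queries
            if (g := rank_gap(query_runner(q))) is not None]
    return statistics.mean(gaps) if gaps else None

# Toy "platform" whose top slot always goes to its own listing.
def toy_platform(query):
    return [(1, True), (2, False), (3, False), (4, True), (5, False)]

gap = audit(toy_platform, ["diapers", "batteries", "chargers"])
print(gap)  # negative: owned listings average higher placement
```

A real audit would of course need far more care, for instance controlling for listing quality and price so that a gap cannot be explained by neutral criteria, but the output-only structure is the point: it limits how much of the "secret sauce" the test reveals.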
A New Cop on the Beat
Readers might be realizing by now that the fix for vertical discrimination is not simple. Designing nondiscrimination rules to prevent anticompetitive channel placement on cable networks was hard enough. Anticompetitive discrimination by digital platforms could prove an even tougher nut to crack. But that doesn’t mean it is not worthwhile to preserve the integrity of online markets.
So how do we get there? Existing agencies have done important work in this area, but the best way to set up workable vertical nondiscrimination rules for digital platforms is to give that authority to an agency focused specifically on digital platforms. That way it can develop expertise and carefully craft an effective rule. While the FCC played the key role in preventing anticompetitive discrimination in the cable industry and breaking up the national telephone monopoly, there is no agency that is tasked with promoting competition in digital platforms. The FTC enforces existing antitrust laws, but cannot make new pro-competition rules, and is responsible for many industries across the economy. The FCC, in turn, only has authority over the communications networks (telephone, cable, broadband, broadcast, and satellite) that can deliver digital information, rather than the digital ecosystem itself.
In some cases, vertical discrimination may violate our existing antitrust laws. Europe already levied a large fine on Google for discrimination in search, though those competition laws are different from U.S. antitrust law in some important ways. In the U.S., the FTC’s powers to penalize monopolistic behavior under Section 2 of the Sherman Act could theoretically be brought to bear, although a similar investigation in 2013 was closed without bringing a complaint related to discrimination. This may indicate that Google was not discriminating against competitors or potential competitors in a way that violates U.S. antitrust law. The Department of Justice’s current review of the tech sector may look at whether or not discrimination by any of the platforms is taking place.
The House of Representatives Antitrust Subcommittee has begun a broad investigation into digital platforms and competition. This Congressional investigation is incredibly important to ensure that the public sees all of the facts and to build a record for platform regulation. Analyzing digital markets and crafting any remedies that might be needed is highly complex. It goes beyond any single platform’s dominance and covers a myriad of services that reach users in crosscutting ways. We need expert analysis of digital platforms and their practices. This is why it is time to create a sector-specific regulatory agency for the digital economy.
In the past, when an important economic sector demanded complex policy solutions, a new agency like the FCC or the Federal Aviation Administration was formed to protect the public interest. A digital regulator staffed by lawyers, computer scientists, economists, and other experts would be well-suited to analyzing the sector and making rules to stop any anticompetitive commercial discrimination. It could run the black-box testing system Feld prescribes, and ensure timely, transparent, and effective responses to complaints. Without expert regulators keeping an eye on the digital economy, questions about the neutrality of algorithmic arbiters may continue to swirl.