Rather than continue waiting for Congress to act, the executive branch is right to pursue measures targeting some of the lowest-hanging national security risks posed by data brokerage.
On Feb. 28, President Biden issued an executive order (EO) on “Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern.” It is driven by concerns that foreign countries are continually trying to access “bulk data” about Americans, which they could combine with other datasets, analyze with artificial intelligence, or otherwise use to “engage in espionage, influence, kinetic, or cyber operations or to identify other potential strategic advantages over the United States.” The order particularly calls attention to sensitive personal data linked to government employees, the military, and other populations where the data “can be used to reveal insights about those populations and locations and threaten national security.”
Though the order has many elements, touching on everything from federally funded research programs to submarine cable security reviews, its overwhelming focus is on bulk transfers of Americans’ data through data brokerage. “Companies are collecting more of Americans’ data than ever before, and it is often legally sold and resold through data brokers,” the White House press release noted. “Commercial data brokers and other companies can sell this data to countries of concern, or entities controlled by those countries, and it can land in the hands of foreign intelligence services, militaries, or companies controlled by foreign governments.” In this vein, the EO directs the Department of Justice, working with the Department of Homeland Security and other agencies, to issue regulations to restrict certain sales of U.S. persons’ data to foreign entities where there is a scoped national security risk.
Although transactions involving Americans’ sensitive personal data carry national security risks, major points of contention surround the order. However, one of the order’s biggest improvements to the data brokerage landscape may be compelling some companies to better assess and monitor what data they are selling, to whom, for what purpose, and with what technical privacy or security measures in place. Given the recency of the executive order and the length of the Justice Department’s Advance Notice of Proposed Rulemaking, a comprehensive discussion of all the proposed regulations and their implications would be an impossible task for one article. This piece instead attempts to draw out some of the biggest discussion points and highlight how the EO’s signing is driving forward some action on the sale of Americans’ data and the attendant risks to U.S. national security.
The Executive Order and Legal Authority
The EO has a few central provisions related to data brokerage. It directs the Attorney General, in coordination with the Secretary of Homeland Security and in consultation with relevant agencies, to issue regulations identifying specific classes of data transactions that would pose unacceptable risks to U.S. national security and foreign policy (and should be prohibited). It also instructs the Attorney General, again coordinating with the Secretary of Homeland Security and consulting with relevant agencies, to issue regulations for types of data transactions that will need to meet security requirements to mitigate risk (and can then proceed). Those security requirements are to be established by the Director of the Cybersecurity and Infrastructure Security Agency, or CISA. For their part, the Secretaries of Defense, Health and Human Services, and Veterans Affairs, along with the head of the National Science Foundation, must additionally consider issuing regulations or guidance to limit the covered transfer of Americans’ bulk sensitive personal data through federally funded research. The order also looks backward in time, directing the Attorney General, Secretary of Homeland Security, and Director of National Intelligence to recommend actions to detect, assess, and mitigate national security risks posed by past bulk transfers of U.S. persons’ data to countries of concern, among other elements.
Biden issued this executive order under the authorities of the International Emergency Economic Powers Act (IEEPA), the same authority former President Trump cited when he issued an executive order in August 2020 attempting to ban TikTok from the United States. Trump’s executive order was subsequently blocked in court because of the “Berman amendments,” IEEPA’s 50 U.S.C. § 1702(b)(3) provision. This provision says the president cannot use IEEPA to prohibit transactions related to “any information or informational materials,” irrespective of the “format or medium of transmission.” Because TikTok is a platform that hosts and disseminates information, courts found that Trump’s executive order likely exceeded IEEPA’s authority. By contrast, the Biden administration’s executive order focuses on commercial transactions, and executive branch officials maintain it does not violate IEEPA because bulk data transfers are not expressive. The regulations will also explicitly exclude expressive information under IEEPA, such as videos, artworks, or publications, to make this distinction very clear.
Defining Data Brokerage and the Advance Notice of Proposed Rulemaking
Definitions are critical in this effort, because regulations on bulk data transfers could focus narrowly on the brokerage of certain types of data or easily extend well beyond the intended scope to impact wider cross-border data flows. Of the many definitions discussed in the Justice Department’s Advance Notice of Proposed Rulemaking (ANPRM), which solicits public comments on key regulatory questions, “data brokerage” is one of the most significant for how these regulations will practically impact the collection and sale of data on U.S. individuals. The ANPRM defines “data brokerage” as:
the sale of, licensing of access to, or similar commercial transactions involving the transfer of data from any person (the provider) to any other person (the recipient), where the recipient did not collect or process the data directly from the individuals linked or linkable to the collected or processed data. [emphasis in original]
This definition is much more comprehensive and effective than many used in current data broker laws or regulatory approaches because it focuses on the activity of selling data, rather than yielding endless debate about whether a company meets the criteria to be a data broker entity. By way of example, data broker registry laws in California, Vermont, Texas, and Oregon all have roughly the same definition of a data broker: entities selling personal data about consumers with whom they have no direct business relationship. In other words, if a company sells data about its own customers or users, it is not considered a data broker under those registry laws (among other exceptions). Such an approach omits from coverage the many first-party companies and organizations—such as apps, websites, telehealth platforms, retailers, and nonprofits that directly interact with consumers—which can legally sell consumers’ data, including to third-party data brokers.
The Federal Trade Commission (FTC)’s landmark 2014 report on data brokers diverges slightly from the state registry laws by dropping the third-party limitation, but it likewise focuses on the question of data brokers as entities. For instance, the report defines a company as a data broker only if this data activity is its “primary business.” This leaves out the many companies for which selling data earns secondary revenue, such as the family safety app Life360, which previously sold its users’ geolocation data on the side. And it similarly leads to debates about whether a specific company fits the exact criteria of a “data broker” based on its data sales, rather than focusing on the data sales themselves. Hence, the ANPRM’s focus on data brokerage, the action of selling or licensing data, avoids this trap and encompasses both first parties and third parties selling Americans’ sensitive data. This covers national security risks such as a mobile app collecting geolocation data directly from its users and then selling the data to foreign entities of concern.
Building the regulations around data brokerage as an activity, rather than around the technical ways in which data flows in and out of the U.S., also makes clear that the order focuses on certain types of commercial transactions rather than on cross-border data flows per se.
For now, the Justice Department’s ANPRM considers defining “countries of concern”—the places to which covered data transactions will be restricted—as China (including Hong Kong and Macau), Russia, Iran, North Korea, Cuba, and Venezuela. This is the same list of countries identified by the Commerce Department as “foreign adversaries” per the 2019 executive order on “Securing the Information and Communications Technology and Services Supply Chain.” The proposed Justice Department regulations will also further scope “sensitive personal data” to cover:
- “Specifically listed categories and combinations of covered personal identifiers (not all personally identifiable information),” including Social Security numbers, driver’s license numbers, mobile advertising IDs (MAIDs), and MAC addresses;
- Precise geolocation data;
- Biometric identifiers;
- Human genomic data;
- Personal health data; and
- Personal financial data.
It will carve out public records data, such as court records and other government records; personal communications; and “expressive information” under IEEPA (as mentioned above). The public records exemption is standard in privacy laws and regulations about personal data, including every single state consumer privacy law. (In a privacy context, this is highly problematic when it comes to stalking and gendered violence, but the EO is focused on national security, and such reforms to data brokerage and public records would need to come from Congress anyway.)
The EO and the Justice Department’s ANPRM have several additional carve-outs: for transactions “ordinarily incident to and part of financial services, payment processing, and regulatory compliance”; for multinational companies internally using data for ancillary business operations; for U.S. federally funded health and research activities and transactions; and for transactions required or authorized by international agreements or federal law. Based on how they are written, some of these exceptions are meant as clarifications; multinational companies internally transferring their own employee data would not normally be considered “data brokerage” anyway. But these exceptions also make sense for at least three reasons.
First, the administration should be careful about limiting any data sharing that may be needed for important health research. Leaving the funding agencies in charge of those risk decisions, as the order does, rather than the Justice Department means that decisions balancing research and health necessities against potentially unacceptable national security risks will lie with agencies better grounded in area expertise and context. This should help minimize the risk of unintended regulatory consequences for health research and advancements. Second, because the order rests on a national security authority rather than a consumer privacy authority, the focus should remain on areas where bulk data transfers and transactions pose considerable risks to national security relative to necessary societal or economic functions (such as payment processing). And third, the Justice Department does not have unlimited time and resources. Keeping the focus on covered transactions of sensitive data about Americans, such as government employees and clearance-holding contractors, is a better way for the Justice Department to reduce national security risks than casting too broad a net around federally funded research and other areas.
The Implications and Major Points of Contention
The national security risks arising from the highly unregulated U.S. data brokerage ecosystem are well established. Some critics of the EO contend that it will not stop all national security-damaging data sales to China and Russia, that the real “solution” lies with Congressional action (not an executive order), and that it hurts the U.S.’s reputation with other open-internet democracies given recent walk-backs of cross-border data flow commitments. But imagining a perfect enforcement regime, through an EO no less, is a false expectation: the point is to take some steps to mitigate national security risks, not to pretend to eliminate them entirely. If the EO can tackle some of the lowest-hanging fruit among the many ways adversaries can obtain sensitive U.S. persons’ data—through data brokerage—it will have improved on the status quo. It does not rule out the possibility of and need for Congressional action, either. And while critical discussions of the U.S. Trade Representative’s actions on data flows are important (and the decisions concerning), they should not negate the executive branch’s attempt to mitigate some of the national security risks posed by the collection, packaging, and sale of Americans’ sensitive data.
As asserted by the White House and demonstrated by a recent study I led on the sale of sensitive data about active-duty U.S. military personnel, foreign adversaries could easily and inexpensively acquire information about government personnel, security-cleared contractors, and military servicemembers for profiling, blackmail, and more. For instance, in the Duke University study, we purchased health, financial, and other data (including data on religious practices and children) about active-duty military servicemembers from U.S. data brokers, for as little as $0.12 per servicemember, with little to no vetting. We even purchased these and other data types from U.S. data brokers through a .asia domain and geofenced the datasets to places like Fort Liberty (formerly Fort Bragg) and Quantico, Virginia.
Foreign governments could spend hundreds or thousands of dollars—pocket change to an intelligence agency in China or Russia—to purchase data they may not be able to acquire through hacking a government database. Foreign governments could also spend far more to acquire real-time geolocation data pulled from phones and motor vehicles—or genomic datasets from direct-to-consumer genetic testing companies. Data brokerage is one of many vectors available for foreign governments and malicious actors to acquire data on Americans.
Risk is about possibility and consequence, and purchasing this data is certainly possible. The data brokerage ecosystem in the U.S. involves thousands of companies and continues to expand in areas like geolocation data. Recent consumer privacy-focused actions by the Federal Trade Commission (FTC) alone have demonstrated that there are telehealth companies, pregnancy and fertility apps, third-party location data firms, and even anti-virus companies that can legally gather, package, and sell Americans’ data to, or share it with, hosts of unknown third parties. Consumer Financial Protection Bureau Director Rohit Chopra likewise remarked, after the executive order was announced, that “corporate data brokers are assembling and selling extremely sensitive data on all of us, including U.S. military personnel, to foreign purchasers.” If academic researchers could purchase the above military data, it would not be difficult for a foreign intelligence organization—with all the freedom in the world to lie and deceive—to acquire even more data.
The consequences for national security could be quite serious. If a foreign actor acquired geolocation data on U.S. military personnel, it could determine where individuals are going off-base, such as to engage in illicit activity, quietly seek mental healthcare, or even have an affair. As indicated when the exercise app Strava revealed users’ geolocations, it is also possible to use geolocation data to identify the locations of secret military and government facilities. The same goes for financial information: perhaps a U.S. government employee with a security clearance has not reported personal debts or a gambling addiction to the government, but that information is readily available in commercial datasets. Foreign governments could use these insights to run intelligence operations. This view has been echoed within the U.S. intelligence community: a January 2022 advisory group report for the Office of the Director of National Intelligence, declassified in June 2023, stated that commercially available information “raises counter-intelligence risks” that should be considered.
Despite these national security risks, there has already been a fair amount of debate about the order’s effectiveness. For instance, some journalists and other commentators have asked how this effort compares to export controls on physical goods, pointing out that restrictions on semiconductor technology sales to Russia have not prevented Russia from getting its hands on chips. In this case, the analogous question is how a single executive order is expected to prevent the Chinese or Russian governments from tapping into the U.S. data brokerage ecosystem, including through front companies and proxy organizations, to acquire sensitive data on U.S. persons. But this sets a false bar. To be sure, policy analysts, civil society groups, and those in the national security community should scrutinize any attempt to impose export controls on technologies or data. It is also important to question underlying assumptions about what can or cannot be controlled in the digital arena, especially when it comes to data. The U.S. government should also be as transparent as possible about its regulatory development and processes, taking care to scope what cannot be disclosed or shared for business confidentiality or national security reasons. This need for transparency includes providing the public with information about both enforcement activity levels and regulatory effectiveness.
At the same time, export controls on semiconductor technology are not going to completely prevent a sophisticated foreign intelligence service like Russia’s from acquiring U.S. semiconductor technology—much like restrictions on certain sales of Americans’ data are not going to completely prevent all foreign actors from ever obtaining the desired information. The U.S. data brokerage ecosystem is too large; foreign governments interested in U.S. personal data are too persistent; and the real “solution” to the privacy, personal safety, and national security risks posed by data brokerage generally will be comprehensive federal and state legislation. Judging the recent EO by the metric of prevention is unrealistic; it misses that the point is to mitigate risk, not to imagine the U.S. government can eliminate it. If the EO leads many data brokers to implement or improve customer vetting programs to monitor what data they are selling, to whom, for what purpose, and with what technical privacy or security measures in place, that would already improve on today’s reality.
This leads to another counterargument: that Congress needs to respond to the risks of data brokerage, rather than a presidential administration via executive order. Congressional legislation is absolutely the most effective way to address the risks and harms of data brokerage. The data brokerage ecosystem is so large—worth billions of dollars a year in the U.S. alone and encompassing thousands of companies—that comprehensive federal laws and regulations are best positioned to limit the collection and sale of people’s information. State laws (such as the California Delete Act) are driving forward some action in the meantime. Congressional action is also best-positioned to address the entire range of risks and harms at play; because the EO is about national security, not privacy, it is Congress that can and should push forward meaningful reforms to how companies sell data and contribute to stalking and gendered violence, predatory location-based targeting, the outing of closeted LGBTQ+ people, surveillance of protesters, invasions of kids’ privacy, and other activities and impacts well outside IEEPA’s remit. (And it should also implement mitigations for national security risks.)
Yet, none of this means the executive branch should ignore the national security risks in the meantime. In the face of Congressional inaction (despite important work from some Democrats and Republicans), the executive branch should use its current authorities to design impactful, transparent, and targeted means of mitigating the national security risks from data sales. The administration has also been quite clear in its public messaging that the EO does not change its support for a bipartisan, comprehensive federal privacy law to protect people’s data.
Lastly, criticisms have been aired about how the executive order sits against the U.S. Trade Representative’s recent position change on cross-border data flows and potentially bigger shifts over the last several years in how the U.S. approaches regulating data flows. This is a far more complicated topic, and many of the criticisms raised are important discussion points. For example, the Office of the U.S. Trade Representative in October 2023 ended its longstanding support at the World Trade Organization for objectives such as protecting cross-border data flows and resisting data localization and source code inspection requirements in foreign countries. As written in a recent civil society letter to the State Department, the Commerce Department, and USTR—with signatories ranging from the ACLU to the Information Technology and Innovation Foundation—the USTR’s move “may be read to signal an abandonment” of certain principles around internet openness and data regulation. This action could also further undermine the cybersecurity of data, devices, and networks where foreign governments leverage source code inspection requirements and data localization laws to find vulnerabilities in technology products and access data on their own residents as well as non-residents.
But this concerning USTR shift and the complexity of those policy problems exist alongside U.S. companies’ long-running resistance to mandatory privacy or cybersecurity requirements—and Congress’s continual failure to pass bipartisan, and even industry-supported, legislation to raise the floor of best practices on data privacy and security. There are plenty of reasons to explore what the executive branch can and should do in the interim, such as pushing for at least some improved protections around data brokerage and national security.
The Coming Months
As comments roll in concerning the Justice Department’s Advance Notice of Proposed Rulemaking, it will be interesting to track industry, civil society, and even individual comments on the proposed regulations. In particular, the Justice Department has important questions to consider about how to draw thresholds on what constitutes “bulk” data transfer; what kinds of due diligence companies selling covered data should implement to reasonably safeguard against harmful data sales; and how it should consider different combinations of identifiers that can be used to pinpoint specific individuals. It should also continue to weigh how certain types of data are defined—and the various pros and cons of approaching regulations and regulatory enforcement based on data type, versus data harm (or risk), and so on. Biden’s order starts to approach this problem from a data type standpoint, but the regulations and their enforcement could still make additional determinations about risk assessment factors.
The fact remains that the best response to the harms and risks of data brokerage, including the national security risks to the country, will come from Congressional legislation. Companies gathering and selling data on Americans is a broad privacy issue—as well as an issue of stalking and gendered violence, law enforcement surveillance, and more—that cannot and should not be fully addressed through a national security lens. An executive order is also limited to existing legal authorities; unlike Congress, the executive branch cannot create new authorities to keep up with technological evolutions, market changes, and current privacy contexts.
Nonetheless, the sale of data on U.S. military personnel, the sale of U.S. persons’ genetic data, and many other data brokerage activities present ongoing risks to national security. Rather than continue waiting for Congress to act, the executive branch is right to pursue measures targeting some of the lowest-hanging national security risks posed by data brokerage. If the EO and the regulations compel many data brokers to introduce more compliance programs and customer vetting, with clear national security risks in mind, that alone will improve considerably on the current state of the market.
– Justin Sherman is a contributing editor at Lawfare. He is also the founder and CEO of Global Cyber Strategies, a Washington, DC-based research and advisory firm.