Privacy Regulation Update: The Doors Are Locking on User Data
Issue 62: June 30, 2022
We have frequently written about how the digital privacy and data environment outside of the US is being transformed by international government efforts to enact policies and regulations that protect citizens and their privacy. The EU is leading the way on these initiatives, and we caution all organizations to be forewarned: the changes already affect global operations and will ultimately impact their US operations as well.
The US has not yet put into place individual online privacy protections. In the absence of a federal policy or framework, individual states are expected to enact protections for their citizens. But here’s the complication: digital extends beyond borders, national and international. Organizations face significant management and compliance challenges anywhere they provide information, services, or products to a geographically diverse set of stakeholders.
Data has become a fundamental pillar of our digital economy. It is a powerful tool, both in practice and in aspiration, for reaching the right customers, creating awareness, and gaining conversions via advertising. Data also refines how organizations assess and understand engagement, measure sales, and develop relevant marketing and product efforts.
The challenge arises when an organization’s digital presence consists mainly of passive channels awaiting traffic amid the exponentially growing number of other passive channels across the expanse of the web. App stores are disintermediating organizations and making it harder and harder for them to get in front of potential stakeholders. Organizations therefore depend on direct or indirect data collection: leveraging the right data is what makes it possible to drive potential stakeholders to an organization’s passive islands.
The data environment in the US may finally be changing given recent actions in Congress. Before we dive into that topic and surface its impact on organizations, let’s review some of the recent activities in the European Union.
EU’s Digital Services and Digital Markets Acts
The EU continues its global leadership in the privacy protections arena and has approved its Digital Services Act and Digital Markets Act. It has also gained agreement from major social platform companies on a code of conduct that puts controls and actions in place when misinformation is posted on their platforms. Companies are now able to remove posts deemed misinformation (false or misleading information as defined by the EU) and to suspend or remove accounts and content considered “illegal” under the definitions set forth by the EU. Citizens will also be provided with guidance and tools to identify information deemed false or misleading. The code of conduct further provides a framework for platform companies to stop serving ads on misinformation, preventing those who spread it from making money via advertising.
The two Acts are an evolution of the EU’s protections that began with the General Data Protection Regulation (GDPR). At the time, the growth of false and misleading information led companies to take their own initiatives to manage what was posted on their platforms. That approach was complicated by the lack of overarching policies or regulations to ensure the rights to freedom of speech and expression were not compromised, and further complicated by questions of how, and by whom, content would be determined to be false or misleading.
When policies were put in place in the internet’s infancy, they sought to protect companies from any liability for content published on their platforms. The prevailing viewpoint was that the companies simply served as platforms with related infrastructure. The desire among companies to create a facilitated business-friendly environment for the technology sector failed to recognize the unintended consequences that would result.
Data Collection and Management
The Digital Services and Digital Markets Acts will come into effect as early as 2023, so let’s put them in perspective with a review of the EU’s accomplishments in establishing privacy protections for individual citizens and a regulatory environment to guide organizations.
- All organizations must obtain a consent choice (accept or decline) from users who are citizens of the EU before the user is permitted to interact with the digital representation of the organization.
- Consent preferences must be recorded and applied before any data can be collected from a user’s interaction with an organization via the web or apps.
- Currently, in the US states with privacy protections in place, a user only needs to be presented with an option to consent or decline and does not need to provide either before using an organization’s site or app.
- Users in the EU can consent or refuse specific categories of cookies, choose how they want to receive communications from an organization, and consent or decline types of retargeted advertising.
- Joint controls have also been put in place for how data, collected with consent, can be leveraged, managed, and shared with other organizations.
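The consent requirements above boil down to a simple gating rule: record the user’s choice first, and collect nothing beyond strictly functional data until a choice exists. The sketch below is illustrative only; the category names, the `ConsentRecord` structure, and the `may_collect` helper are our own assumptions, not part of any regulation or standard consent-management API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional, Set

@dataclass
class ConsentRecord:
    """A user's recorded choices, captured before any data collection."""
    user_id: str
    granted: Set[str] = field(default_factory=set)  # categories the user accepted
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def may_collect(record: Optional[ConsentRecord], category: str) -> bool:
    """Only strictly functional data may be collected without a recorded choice."""
    if category == "functional":
        return True
    return record is not None and category in record.granted

# With no recorded choice, everything beyond functional data is off-limits.
assert may_collect(None, "functional") is True
assert may_collect(None, "analytics") is False

# Once a choice is recorded, only the accepted categories open up.
record = ConsentRecord("u-123", granted={"analytics"})
assert may_collect(record, "analytics") is True
assert may_collect(record, "retargeting") is False
```

Note the asymmetry this creates in practice: the default state (no record) is the most restrictive one, which is exactly what separates the EU model from the US state laws described above.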
These policies do indeed protect individuals’ privacy and give them control of their information and data. Operating in a digital economy powered by technology is deemed an advantage; however, competitive decisions typically benefit the organization and overlook the consequences to individuals.
It is a delicate tightrope act to do what is right for an individual by giving them a choice, in balance with an organization’s aspiration to become data-driven. For example, web and app-based user data (what we often refer to as “behavioral data”) include inference-based data such as physical location, frequency, device, immersion, and engagement. This user data can help an organization infer who users are, what interests they have, what other services, products, or information they may want or need, and what advertising they should see and are likely to interact with. The overall value of behavioral data is significantly compromised when users can decline to have organizations use their personal information. As a side note, we will discuss and detail inference-based data in a future newsletter.
EU-based organizations are already seeing an average 50% decline in collected data as large sectors of their audiences decline consent to collect data and track their behaviors. With visibility into only 50% of the audience, data-based knowledge and insights can amount to rolling the dice: a 50/50 chance of being right or wrong, of getting or not getting the expected results. Organizations are therefore returning to the past, when the ability to collect data was limited, when it was the luck of the draw to drive the right traffic to a site or app, and when advertising had no way to know who users were or how to get them to interact with a product or service. A solution to mitigate this return to the past is detailed in our First-Party Data Playbook.
The American Data Privacy and Protection Act
Let’s turn our attention back to the current important, impactful activity in the US. In June 2022, a House of Representatives subcommittee sitting under the Energy and Commerce Committee developed legislation to enact a federal individual privacy framework and policy. The bill, which has gained traction as the American Data Privacy and Protection Act (ADPPA), moved out of the subcommittee this week with bipartisan approval and support. The Act now moves to its parent committee for review and a vote.
Many predict that the Act, in current or modified form, is likely to pass the House and be considered in the Senate. A cautionary note, however: many prior efforts over the past 20 years have died on the vine because Congress could not reconcile establishing protections for individuals with being responsive to businesses concerned about new limitations, limitations that echo what EU-based organizations are now experiencing.
This time, the public is weighing in more vocally in response to recent issues in our society including false and misleading information (otherwise known as “fake news”). The public outcry is creating pressure for action, and the Act may actually come to fruition.
A Rock and a Hard Place
Many organizations, including technology companies, are supporting the House effort. They fear that if the current environment continues, they will have to manage and maintain compliance with 50 different state privacy laws. The states that have established state-based privacy policies and regulations include California, Colorado, Nevada, and Virginia. The state frameworks are similar but have distinct differences in how consent is managed, which types of data require consent, which organizations are subject to the laws, and what those organizations are required to do.
US states have not yet addressed in any substantial way how to manage false or misleading information on available platforms, and several, including Texas, are attempting to prevent platform organizations from suspending or limiting accounts that post alternative facts and information. Although the Texas effort is controversial, it rests on a constitutional argument: an individual’s right to freedom of speech and expression of opinion transcends a platform organization’s ability to determine what is or isn’t compromising that freedom. Each individual’s perception, as we have often written, is their reality. Of course, as the EU has recognized, some intentionally publish misinformation for financial gain, and something must be done in that regard.
The ADPPA does not address false or misleading information but there are other efforts across Congress being considered in response to how organizations manage what is posted on their platforms.
What ADPPA Includes
John McKinnon wrote in the Wall Street Journal that “broadly the legislation puts new limits on how businesses, especially large technology companies (Google, Meta, Twitter, TikTok, etc.) can collect and use consumers’ data. Companies would be limited to collecting, processing, and transferring only the data that is ‘reasonably’ necessary to provide their services.
“Individuals would have the right to access, correct, delete, and export data covered by the legislation. And for some sensitive categories of data, like geolocation or biometric data, the legislation would prohibit the transfer or restrict it to very limited circumstances.”
Similar to the EU’s Digital Services Act, users in the US would have the ability to control and/or opt-out of targeted advertising and the transfer of any of their data to third parties.
Other major items of the legislation include:
- Protect individuals from discrimination and harm when algorithms use data to make decisions on what to present, demonstrate, or offer. Remember that algorithms are created by humans and are fed, often unintentionally, by our biases. The information an algorithm uses can reflect the limits of our own knowledge, or simply the limits of the data it was provided.
- Child protections would be expanded to include those under 17. COPPA and similar laws in place focus on those 13 and under. Companies would not be permitted to target advertising to those under 17 and any data collected from individuals under 17 would be considered sensitive and subject to more stringent restrictions.
- The Federal Trade Commission as well as state attorneys general would be responsible for enforcement.
- Consumers would be provided the ability to pursue lawsuits against organizations that have violated their privacy or who have failed to take requested actions on an individual’s data.
- And finally, the legislation creates requirements for companies to specify where and when data is shared with countries outside of the US.
ADPPA In Focus
Let’s break down a few important points that create a substantial impact on most organizations:
- Companies would be limited to collecting, processing, and transferring only the data that is “reasonably” necessary to provide their services.
The definition of reasonable often equates to data that is collected under “functional” cookies. Functional cookies represent the data necessary to deliver a webpage, web service, or the like. That data may include web browser ID, browser location, device type (mobile or web browser), and any data necessary to allow the user to move through the website (carrying data from page to page as an example). The “functional” cookie does not include any personal or otherwise identifying information. Other data sought by an organization would require the user’s consent to collect that data. If consent is not given, additional knowledge of who that user is or what that user wants remains unknown.
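One way to picture the functional vs. consent-required distinction described above is as a filter applied to each data payload before anything is stored. The field names and category sets below are purely illustrative assumptions on our part, not a legal taxonomy or any platform’s actual schema.

```python
from typing import Dict

# Assumed examples of data a "functional" cookie might carry (per the
# description above: browser ID, device type, page-to-page session state).
FUNCTIONAL_KEYS = {"browser_id", "locale", "device_type", "session_token"}

# Assumed examples of identifying data that would require explicit consent.
CONSENT_REQUIRED_KEYS = {"email", "precise_location", "ad_id", "purchase_history"}

def filter_payload(payload: Dict[str, str], has_consent: bool) -> Dict[str, str]:
    """Keep functional fields always; keep identifying fields only with consent."""
    kept = {}
    for key, value in payload.items():
        if key in FUNCTIONAL_KEYS:
            kept[key] = value
        elif key in CONSENT_REQUIRED_KEYS and has_consent:
            kept[key] = value
        # Anything else (unknown or unconsented) is simply discarded.
    return kept

payload = {"browser_id": "abc", "device_type": "mobile", "ad_id": "xyz"}

# Without consent, the identifying field never reaches storage.
assert filter_payload(payload, has_consent=False) == {
    "browser_id": "abc",
    "device_type": "mobile",
}

# With consent, the full payload survives.
assert filter_payload(payload, has_consent=True) == payload
```

The point of the sketch is the asymmetry: a declined consent doesn’t degrade the data gracefully, it removes exactly the identifying fields that make targeting possible, which is why the knowledge gap described above is binary rather than gradual.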
Many organizations leverage advertising via social platforms (Facebook, TikTok, Twitter, and LinkedIn), as well as Google AdWords, to generate traffic to their websites and acquire new customers (stakeholders). Since the social and advertising companies would be limited in what data they can collect, manage, share, or process, the audience targeting these platforms offer organizations will significantly decrease in value. Consider the earlier example of a 50% decline in identified users in the EU: US-based organizations will soon face similar dice-throwing decision-making. The further impact falls on the platform companies’ current business models, which derive revenue from the collection, use, and dissemination of data to keep the platforms freely available to users.
- Ability to control and/or opt-out of targeted advertising and the transfer of any of their data to third parties.
Data’s foundational value as currently collected by any organization will be significantly limited in the US, as it already is in the EU and other countries. The ability of users to manage, control, and opt out of targeted advertising further compromises an organization’s ability to reach and acquire new customers. A return to the past will repeat the same set of challenges and issues that existed before today’s data-rich wild west. How an organization actively reaches prospective customers or creates awareness may become as ambiguous as broadcast media advertising or spray-and-pray marketing campaigns.
- And finally, child protections would be expanded to include those under 17. Companies would not be permitted to target advertising to those under 17, and any data collected from individuals under 17 would be considered sensitive and subject to more stringent restrictions.
Everyone would concur that it is important to protect our country’s youth; it would be challenging to come up with an argument that counters that intent and necessity. The Children’s Online Privacy Protection Act (COPPA) became law in the US in 1998. The Act required organizations to seek age verification and prohibited certain types of content from being presented to those under the age of 13, with a focus on commercial sites and transactions. It has its critics and has been challenged in the courts but has ultimately remained in place. The ADPPA expands the age to 17, adding a new requirement for organizations while COPPA remains in place; it also adds further protections on data and prohibits targeted advertising to minors. In 1998, the types and amount of data were not yet well understood, and advertising (retargeting, herding, profiling, etc.) was in its infancy compared to today’s sophistication. COPPA itself has been ripe for updating and expansion for some time.
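The two age thresholds at issue (13 under COPPA, 17 under the proposed ADPPA, as described in this article) can be summarized as a small rules table. This is a sketch of the thresholds as this newsletter describes them, not legal guidance; the function name and flags are hypothetical conveniences of our own.

```python
from typing import Dict

def handling_rules(age: int) -> Dict[str, bool]:
    """Illustrative handling flags for a user of a given age.

    Thresholds follow the summary above: under 17, data is treated as
    sensitive and targeted advertising is off the table; under 13, the
    existing COPPA regime also applies.
    """
    return {
        "targeted_ads_allowed": age >= 17,
        "data_is_sensitive": age < 17,
        "coppa_applies": age < 13,
    }

# A 12-year-old falls under both regimes.
assert handling_rules(12) == {
    "targeted_ads_allowed": False,
    "data_is_sensitive": True,
    "coppa_applies": True,
}

# A 16-year-old is newly covered by the ADPPA's expanded protections.
assert handling_rules(16)["targeted_ads_allowed"] is False
assert handling_rules(16)["coppa_applies"] is False

# At 17 and above, neither set of minor protections applies.
assert handling_rules(18)["targeted_ads_allowed"] is True
```

The practical difficulty, of course, is the input: none of this works unless an organization can determine age in the first place, which is exactly the identification burden discussed below.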
When we consider the amount of advertising that spans the sites and properties everyone uses, we recognize that those under 17 represent a significant portion of the overall online audience. The identification of those under 17, the elimination of retargeting, and the sensitive-data restrictions are therefore going to further reduce the amount of data an organization has available to manage and leverage. Those under 17 are often not the intended customer targets of some organizations, but they are absorbed as a percentage of an organization’s overall web traffic. Organizations seeking to engage those under 17 will surely find themselves in a different environment with new challenges, largely in the form of a blind spot they didn’t anticipate having to deal with. For organizations not targeting those under 17, the correction to overall traffic may reset the audience size and the results an organization can expect.
The Time Is Now
It clearly is time for the US to enact privacy protections for its citizens. The technology industry and the internet are no longer in their infancy. They dominate our economy and the day-to-day lives of most citizens. It is time to educate individuals, require them to critically assess what they are giving away for what they are gaining, and ensure companies are being socially responsible and protecting our youth in how they leverage and use the public’s data.
Organizations have to pause and critically assess their strategies, current tactics, and plans. They need to prepare adaptations of their Mesosystem in the context of the larger Macro and Microenvironments. Over the next two weeks, we will share suggestions for summer reading from our 62 issues of this newsletter that focus on data, the new construct of being data-driven, and measuring what matters. Each is a weighty topic on its own, but each represents subject matter that comes together to inform organizational strategies. And mastering the new world order of a data-driven economy is critical.
Summer is a time for relaxation, but also a time to consider, think, and plan. As fall arrives, organizations may find that the Act has passed and become law. Even if it hasn’t, for any organization engaged with customers outside of the US, the EU’s Digital Services Act and Digital Markets Act will come into force sooner than you think.
Get “The Truth about Transformation”
The 2040 construct for change and transformation. What’s the biggest reason organizations fail? They don’t honor, respect, and acknowledge the human factor. We have compiled a playbook for organizations of all sizes to consider all the elements that comprise change, and we have included some provocative case studies that illustrate how transformation can quickly derail.