CCPA won’t be enough to fix tech’s data entitlement problem


Fredrick Lee, Contributor
When the California Consumer Privacy Act (CCPA) took effect on January 1, 2020, many companies were still scrambling to comply with the data privacy regulation, with compliance costs estimated at $55 billion. But even checking all of the compliance boxes isn't enough to safeguard consumer data. The past few years of rampant breaches and data misuse have shown how quickly personal details can fall into the wrong hands. They've also shown how often simple user error, enabled by poor data practices, leads to big consequences.
The way to solve this issue isn’t solely through legislation — it’s companies taking a hard look at their behavior and processes. Laws like CCPA and GDPR help set the groundwork for change, but they don’t address the broader issue: businesses feel entitled to people’s data even when it’s not part of their core product offering and have encoded that entitlement into their processes.
Legislated and top-down calls for accountability won’t fix the problem on their own. To protect consumers, companies need to architect internal systems around data custodianship rather than data ownership. Doing so will establish processes that not only hit compliance benchmarks but make responsible data handling the default action.
Privacy compliance over true procedural change is a cop-out
The prevailing philosophy in Silicon Valley is one of data ownership, and it shapes how consumers' personal information is used. The consequences have been widely reported, from the revelations surrounding Cambridge Analytica to Uber's 57-million-user data breach. Tech companies are losing the trust of customers, partners and governments around the world. In fact, Americans' perception of tech companies has steadily dropped since 2015. More must be done to win it back.
Companies that rely on regulations like CCPA and GDPR to guide their data policies essentially ask someone else to draw the line for them, so they can come as close to it as possible — which leads to a “check-the-box” approach to compliance rather than a core philosophy that prioritizes the privacy expectations of their customers. If tech and security leaders build data policies with privacy in mind, we won’t have to spend valuable resources meeting government regulations.
How to take the entitlement out of data handling
Responsible, secure data handling is achievable for every company. The most important step is for businesses to go beyond the bare minimum when reevaluating their data access processes. What’s been most helpful for the companies I’ve worked with is organizing these practices around a simple idea: You can’t lose what you don’t have.
In practice, this idea is known as the Principle of Least Privilege (POLP), whereby companies give employees only the data access they need to do their jobs effectively. Here's an example that applies to most customer-facing businesses out there: Say I'm a customer service rep and a person calls me about a problem with their account. If I operate according to the Principle of Least Privilege, the following data access rules would apply:
- I would only have access to that specific customer’s account information;
- I would only have access to the specific part of their account where the problem is happening;
- I would only have access until the problem is solved.
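The three rules above can be sketched as a single access check. This is a minimal illustration, not a real system: `AccessRequest`, `is_access_allowed` and the `open_cases` map are hypothetical names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    rep_id: str       # the support rep asking for access
    customer_id: str  # whose account they want to see
    section: str      # which part of the account (e.g. "billing")

# Open support cases: (rep_id, customer_id) -> the account sections the
# case actually concerns. Closing a case removes its entry, which revokes
# access automatically (rule 3: access lasts only until the problem is solved).
open_cases = {
    ("rep-7", "cust-42"): {"billing"},
}

def is_access_allowed(req: AccessRequest) -> bool:
    """Apply all three least-privilege rules: only this customer (rule 1),
    only the affected section (rule 2), only while the case is open (rule 3)."""
    sections = open_cases.get((req.rep_id, req.customer_id))
    return sections is not None and req.section in sections
```

So the rep handling the billing case can read that customer's billing data, but a request for another section, another customer, or after the case closes is denied by default.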
Sounds intuitive, right? Yet, many companies — particularly those operating without the Principle of Least Privilege in place — discovered through the GDPR and CCPA compliance process that their data access controls did not work this way. This is how major breaches happen. An employee downloads an entire database — much more data than they need to perform a specific task — their laptop is compromised, and suddenly hackers can access the entire database.
POLP works because it introduces a bit of friction into the data-request process. The goal here is to make the right decision easy and the wrong decision harder, so everyone is intentional about their data use. How a company achieves this will differ based on their business model and growth stage. One option is to have only a single database with an added layer of infrastructure that grants data access through POLP rules.
Alternatively, companies can work these rules into their CRM software. In the example I mentioned, the system would grant data access to a rep only when it recognizes a corresponding customer support case. If an employee tries to access data that is not directly tied to a customer problem, they would encounter an additional login step like two-factor authentication.
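The CRM gating described above can be sketched as follows. Again a hypothetical sketch: `request_data_access` and the open-case set stand in for real CRM and two-factor integrations.

```python
def request_data_access(rep_id: str, customer_id: str,
                        open_cases: set, second_factor_ok: bool) -> str:
    """Grant access automatically only when a support case ties this rep
    to this customer; otherwise demand a second authentication step."""
    if (rep_id, customer_id) in open_cases:
        return "granted"            # easy path: access tied to an open case
    if second_factor_ok:
        return "granted-with-2fa"   # harder path: extra, auditable friction
    return "2fa-required"           # default: no silent access without a case
```

The design point is the asymmetry: the legitimate path is frictionless, while out-of-scope requests cost an extra step and leave a trace.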
There’s no one-size-fits-all approach; rather, data access should operate on a spectrum. For one business, it may mean limiting data access to a single business account and the related set of customer information. At another company, an engineer may need access to multiple customers’ information to fix a product issue. When this happens, the data access should be both time-bound and highly visible, so that other employees can see how the data is used. There may also be times when an employee needs to access data in the aggregate to do their job — for example, to run a report. In this case, the data should always be anonymized.
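The broader-access cases above (time-bound, highly visible grants, and anonymized aggregates) can be sketched the same way. All names here are illustrative assumptions, not a real library.

```python
audit_log = []  # every grant is recorded so other employees can see data use

def grant_temporary_access(engineer: str, customer_ids: list,
                           ttl_seconds: int, now: float) -> dict:
    """Time-bound, highly visible access to multiple customers' data."""
    grant = {"engineer": engineer,
             "customers": list(customer_ids),
             "expires_at": now + ttl_seconds}
    audit_log.append(grant)  # visibility: the grant itself is logged
    return grant

def is_grant_valid(grant: dict, now: float) -> bool:
    """Access expires on its own; nobody has to remember to revoke it."""
    return now < grant["expires_at"]

def anonymize(records: list) -> list:
    """Strip direct identifiers before data is used in aggregate reports."""
    return [{k: v for k, v in r.items() if k not in ("name", "email")}
            for r in records]
```

Expiry and logging make broad access self-limiting and visible, and aggregate work never needs the identifiers in the first place.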
Protecting consumer data is a moral obligation, not just a legal one
The power of privacy-focused data processes and a system like the Principle of Least Privilege is that, by design, they guide employees to use data with the customer’s best interest in mind. The Golden Rule should apply: We each must treat consumer data in the way we’d want our own data used. With the right functional procedures in place, infrastructure can make responsible data access intuitive.
No company is entitled to data; they are entrusted with it. Consumers must be aware of how their data is treated and hold companies accountable. Regulations like CCPA make this easier, but businesses must uphold their end of the bargain.
Trust, not data, is the most valuable currency for businesses today. But current data practices do nothing to earn that trust, and we can't count on regulation alone to change that. Only practices built with privacy and transparency in mind can bring back customer trust and keep personal data protected.
Fredrick "Flee" Lee is chief information security officer at Gusto, the people platform for 100,000 small businesses nationwide. He previously led security at Square after holding senior security roles at Bank of America, Twilio and NetSuite.