We have a serious problem with the misuse of geolocation in apps. A few months back when I was reviewing the California Privacy Rights Act (CPRA) — now on the ballot in California as Proposition 24 — I saw that the CPRA included geolocation as part of the definition of “sensitive personal information” that consumers would be able to limit the use of. At the time I thought it was an important feature, but as time has gone on with the United States seeing mass rallies and protests, the need for Covid contact tracing, and new reports on companies misusing geolocation data, it has gone from important to “holy crap” to me. A recent article in the Wall Street Journal brought this home for me.
WSJ Article Really Cements It for Me
As Byron Tau reported in the Wall Street Journal on August 7th, in an article entitled “U.S. Government Contractor Embedded Software in Apps to Track Phones”, the fact that someone associated with the US Government was doing this was a wee bit shocking, but the subtitle was what grabbed me: “Anomaly Six has ties to military, intelligence agencies and draws location data from more than 500 apps with hundreds of millions of users.” That’s a lot of apps and a lot of people. Ironically, we have a situation right now where the US Government is worrying about TikTok and what the Chinese government might be doing with the data TikTok gathers, while here in the US we have a government contractor injecting itself into 500 apps with hundreds of millions of users. Yikes.
Key highlights of the article for me included:
· “A small U.S. company with ties to the U.S. defense and intelligence communities has embedded its software in numerous mobile apps, allowing it to track the movements of hundreds of millions of mobile phones world-wide”
· “Anomaly Six LLC … said in marketing material it is able to draw location data from more than 500 mobile applications, in part through its own software development kit, or SDK, that is embedded directly in some of the apps.”
· “Consumers have no way to know whether SDKs are embedded in apps; most privacy policies don’t disclose that information.”
The result is that hundreds of millions of users are unknowingly having their geolocation data harvested and sold. So, you may say, “What’s the big deal?” Well, as the WSJ article notes:
“In the data drawn from apps, each cellphone is typically represented by an alphanumeric identifier that isn’t linked to the name of the cellphone’s owner. But the movement patterns of a phone over time can allow analysts to deduce its ownership—for example, where the phone is located during the evenings and overnight is likely where the phone-owner lives.”
So, the point is that the user is no longer anonymous once you analyze their geolocation for a while. If the location data shows a device returning to a certain location every evening, analysts can deduce that the data is associated with a certain person. And from there you can track whether that person went to a Black Lives Matter rally, a church or synagogue, a cancer treatment center, etc. And companies can buy that data and could easily deny you loans or housing or …
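The WSJ’s “deduce ownership from overnight locations” point is easy to mechanize. Here is a minimal, purely illustrative sketch (the data and the function name are hypothetical, and real analysts use far more sophisticated clustering): it takes timestamped pings for one “anonymous” device ID and returns the grid cell where most overnight pings land, which is very likely the owner’s home.

```python
from collections import Counter
from datetime import datetime

def infer_home_cell(pings, cell=0.01):
    """Return the lat/lon grid cell holding the most overnight
    (10pm-6am) pings; cell=0.01 degrees is roughly a 1 km square."""
    night = []
    for ts, lat, lon in pings:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:
            night.append((round(lat / cell) * cell,
                          round(lon / cell) * cell))
    if not night:
        return None
    # The most common overnight cell is the best guess at "home".
    return Counter(night).most_common(1)[0][0]

# Hypothetical pings for one "anonymous" alphanumeric device ID.
pings = [
    ("2020-08-01T23:10:00", 34.052, -118.243),  # overnight
    ("2020-08-02T02:30:00", 34.052, -118.244),  # overnight, same block
    ("2020-08-02T12:00:00", 34.101, -118.326),  # midday, somewhere else
    ("2020-08-02T23:45:00", 34.053, -118.243),  # overnight again
]
home = infer_home_cell(pings)  # the (34.05, -118.24) cell
```

Even this toy heuristic shows why an “anonymous” identifier stops being anonymous once enough location history accumulates.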
No Clue and No Regulations
The WSJ article quotes a member of a trade group that represents “advertising and marketing companies who deal in location data” who said “app-makers should be more transparent with consumers about how the data may be used once it is collected.” Ya think? He added “I think the average consumer doesn’t have a clue.” Lovely.
So, not surprisingly, the WSJ article notes that “there is little regulation in the U.S. about the buying and selling of location data,” which led an industry analyst to call this situation “the Wild West.”
The WSJ does note that “Anomaly Six said it would support regulation to require more disclosure by apps of how data is collected and used.” That’s nice of them to think of us.
But unfortunately, no regulations really exist.
And that may be one of the reasons that, even if we got a decent contact tracing regime in place for Covid, people may decide to opt out, fearing businesses or governments will track and trace them for other purposes.
Privacy Proposals Look to Address this Issue
In my review of various federal privacy law proposals (from Senators Wicker, Cantwell, Moran and Gillibrand), and the GDPR, it turns out Wicker/Cantwell/Moran/GDPR offer the concept of “sensitive personal information”, which does include geolocation, with various limitations on its use. But as I covered in those blogs, there has not been a significant privacy bill coming out of DC since HIPAA in 1996, i.e., before the creation and widespread use of modern cell phones and apps with geolocation capabilities.
Of course, none of those proposals are enacted US laws, and the most robust state law, the California Consumer Privacy Act (CCPA), does not have the concept of sensitive personal information (of which geolocation would be part) or the ability to limit its use.
So yes, there are really no regulations out there limiting the use or selling of geolocation data.
There is a Big Opportunity to Get Some Regulation Here
Who knows if a federal privacy law can emerge; the process has clearly been bogged down for years, and the Senate will probably be close to 50-50 even in a Blue/Democratic wave. And privacy may take a back seat at the federal level to solving Covid-related issues, the economic downturn, etc.
But the voters in the largest state in the USA do have one way to limit the use of geolocation (e.g. you can tell Uber to get your geolocation when using the Uber app to let the driver know where to pick you up, but not sell or use your location for other purposes — and the same with health apps that track your location when you are jogging, etc.). This issue is in fact on the ballot this fall in California with Prop 24 and the California Privacy Rights Act. [I am, by the way, a Yes on Prop 24 after much investigation and research.]
Specifically, CPRA adds the concept of “sensitive personal information” (SPI) and adds geolocation to the definition of SPI. It also defines the concept of “precise geolocation”:
“Precise geolocation” means any data that is derived from a device and that is used or intended to be used to locate a consumer within a geographic area that is equal to or less than the area of a circle with a radius of one thousand, eight hundred and fifty (1,850) feet, except as prescribed by regulations.
As the folks at Californians for Consumer Privacy (CCP) say about this definition:
“1,850 feet may not sound like a lot, but this works out to a circle of about 250 acres. So this new right means you can stop a business knowing exactly where you are—it can know you are in LA or San Diego, but it can’t know whether you’re in the gym, the hospital or a bar. This allows for basic advertising—you’d see an ad for an LA car dealer, and not one in Louisiana—but businesses wouldn’t be able to track how many times you’d eaten at McDonalds or left work early.”
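A quick sanity check of the CCP’s arithmetic: a circle with a 1,850-foot radius covers about 247 acres (there are 43,560 square feet in an acre), consistent with their “about 250 acres” figure.

```python
import math

RADIUS_FT = 1_850        # CPRA "precise geolocation" radius
SQFT_PER_ACRE = 43_560   # definition of an acre in square feet

# Area of the circle, converted from square feet to acres.
area_acres = math.pi * RADIUS_FT ** 2 / SQFT_PER_ACRE
print(round(area_acres))  # -> 247, i.e. "about 250 acres"
```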
The CPRA will also require a “Limit the Use of My Sensitive Personal Information” link on the bottom of a business’ home page (or in the Settings in the mobile app), which will really truly let a consumer say “yes I am going to let you use my location to have a driver pick me up at the right location, but not for anything else.” With the CPRA, businesses must also adhere to “purpose limitation”, i.e. they can only use the data for the purposes they clearly tell you when they collect it.
Which would mean, per the CCP, “no more tracking consumers in rehab, a cancer clinic, at the gym (for how long) at a fast food restaurant (how often), sleeping in a separate part of the house from their partner (how recently), etc., all with the intention of monetizing that most intimate data that makes up people’s lives.“
To be honest, this “feature” did not strike me as super important back in, say, February, compared to some others (such as the dedicated privacy enforcement agency that the CPRA calls for). But as the country has experienced Covid and the need for contact tracing, as well as the protests over George Floyd, etc. — wow, it has become a significant protection that we really need.
Between this and the CPRA’s added protection of kids’ online privacy (as Covid has forced more technology use by our children), to me the CPRA is even more significant legislation than I initially realized. These two items alone are big wins, enough by themselves for voters to vote Yes on Prop 24 (though I also list out 10 other major features of it). It turns out that Prop 24 may be the most consequential state ballot measure we have here in California this year.
And as I have discussed before, California is often the bellwether for other consumer protections (e.g. auto-emissions), so passage of CPRA could be the impetus needed to pass federal legislation and/or other laws in other States to regulate the misuse of geolocation.