Today’s 4th Annual Privacy Law Forum: Silicon Valley

Today’s 4th Annual Privacy Law Forum: Silicon Valley had a great set of speakers, and even managed to have some representatives from the opposing view on the panel.

For instance, Tyler Newby, who in a former life executed searches on e-mails, computers, etc. as part of criminal prosecutions, called us all out a bit by noting that nearly all of today’s crimes rely on the ability to collect evidence from electronic devices. Thus the battle for power between users & law enforcement isn’t a “nerd” issue anymore (my interpretation) – it’s an issue of values (his words, which I agree with). He proffered that the issue really comes down to whether it’s more important to protect people’s privacy, or to investigate crime.

Well. Fair point, I suppose.

At times I forget that pretty much any crime one can commit these days would leave a digital footprint. It seems like just yesterday, it was 2005 & only the cool kids had their whole lives on electronic devices. But that could also be due to my sample bias – I still know people who have little to no digital footprint, and even some who only recently got “smart” phones.

I am of course hugely comforted by the Riley v. California ruling that our cell phones can’t be searched without a warrant. But I also want to believe that the court system is sufficiently neutral that, once a judge has granted a warrant, law enforcement has proven it should be able to search a suspect’s phone. Whether it’s the obvious kidnapping scenario, or a case where evidence of millions of dollars of embezzlement sits on a computer, once law enforcement has obtained a warrant, shouldn’t it be able to execute that warrant? If Apple has its way, Apple itself can’t decrypt devices, so no matter what the court says, law enforcement would not be able to collect any evidence stored on an Apple device.
Now, don’t get me started on the horrors of how electronic evidence is collected these days. It’s terrible. No one has any idea (as far as I’ve read) what they’re doing (although perhaps Mr. Newby did), so the digital forensic folks essentially come in and either take everything, or they search through everything and then take what they need. Either way, they see everything.

Arguably, the same thing happens in Real Space. If a policeman shows up at your house with a warrant, looking for your grandmother’s watch, (I’m no expert in Criminal Procedure, but) he gets to search the house for your grandmother’s watch. And he might see your [insert thing you really don’t want him to see] while he’s at it. #Awkward.
On the other side, however, most people don’t keep all that much correspondence lying around the house. Computers, by contrast, potentially hold a decade or more of a suspect’s correspondence. I’m just saying – comparing searches in Cyber Space to searches in Real Space is comparing Apples and Oranges.

Danielle Citron was one of my favorites. She, of course, was the main reason I was interested in attending the conference. Her keynote speech was on Revenge Porn and other Hate Crimes in Cyberspace, and she finally gave me some ammo against the pervasive “well, there’s no conduct, so it’s just speech” Free Speech argument. I mean, I get that it’s ridiculously hard to apply the “Fighting Words” doctrine to Online Harassment. But that doesn’t mean there isn’t perfectly good other precedent for stopping it. I had the pleasure of chatting with Danielle for a little bit – she’s extremely passionate about her work, and she of course had all the rebuttals ready for any view that free speech protects online harassment. It was impressive. She suggested using precedent against:

  • Solicitation for violence
  • Breach of confidence
  • Intentional Infliction of Emotional Distress
  • And/or potentially a very narrow carve out for section 230 (47 U.S. Code § 230 essentially gives immunity to Internet Service Providers who publish information submitted by others)

My favorite solution that she proposed was using code similar to that used by law enforcement in child pornography cases. Apparently, part of the process in a child pornography case can be to write code that will look for images of the child on the Internet and remove them. It doesn’t seem like rocket science that, at the very least, once a victim has established that s/he is a victim of revenge porn, law enforcement could simply use the same code and automate deletion of images of the victim posted without his or her consent.
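The core of that idea is just fingerprint-and-match: fingerprint the images the victim has identified, then flag any copy found elsewhere. Here’s a minimal sketch of that concept; the function and variable names are my own illustration, not anything from the talk, and a real deployment (e.g. Microsoft’s PhotoDNA for child-exploitation imagery) uses perceptual hashes that survive resizing and re-encoding, not the exact-match hash used here.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Exact-match fingerprint. Deployed systems use perceptual hashing
    # instead, so that a resized or re-compressed copy still matches.
    return hashlib.sha256(data).hexdigest()

def build_known_set(known_images: list[bytes]) -> set[str]:
    # Fingerprints of the images the victim has identified.
    return {fingerprint(img) for img in known_images}

def find_matches(candidates: dict[str, bytes], known: set[str]) -> list[str]:
    # Names of candidate files whose content matches a known image;
    # these are the copies a takedown process would target.
    return [name for name, data in candidates.items()
            if fingerprint(data) in known]
```

The legal step (establishing that the images qualify) stays human; only the tedious part – re-finding the same image wherever it gets reposted – is automated.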

Finally, an overall theme was that the notions of “privacy” and “personally identifiable information” are no longer standard terms, even within the legal privacy world. Privacy has sub-categories, with attorneys specializing in areas like HIPAA (the Health Insurance Portability and Accountability Act), and while some users are perfectly happy to post their real-time jogging information on Twitter, others don’t want their jogging information tracked at all. To keep users happy, companies therefore need not only to provide options for their users, but also to honor those commitments. And part of that solution, in the modern day and age, means adding one more drill to the list: now companies don’t just stop, drop, and roll, or run and hide under a desk or doorway – they also have to do data security breach drills. I bet those look just about as exciting as the hacking that caused the breach itself (hacking is one of the least exciting things to watch – I promise).
