So Much For Working In Your Pajamas
Online Test-Takers Feel Anti-Cheating Software’s Uneasy Glare, NYTimes, April 6, 2015

“Once her exam started, Ms. Chao said, a red warning band appeared on the computer screen indicating that Proctortrack was monitoring her computer and recording video of her. To constantly remind her that she was being watched, the program also showed a live image of her in miniature on her screen.”
See So Much For Working In Your Pajamas (Cont.) further down in this piece for my full article summary and analysis.
Litigation Updates
The Hulu VPPA Decision and What We Can Learn from It, IAPP, the Daily Dashboard, April 3, 2015
This critical Video Privacy Protection Act (VPPA) litigation finally ended after a four-year battle. Dominique Shelton, CIPP/US, writes in this Privacy Tracker post that “Magistrate Judge Laurel Beeler dismissed the plaintiffs’ claims with prejudice.”
Ultimately, “[t]he court made its decision based on the technology.”
Note: The judge appears to have had her hands tied by the VPPA’s requirement that Hulu have “knowingly disclose[d]” consumers’ personal information in connection with their video-viewing information. Shelton entices readers to use their IAPP login to read more of her article by writing that the “litigation has elucidated certain guiding posts that companies can consider in practices going forward . . . offering in dicta guidance and lessons that can be learned from the ruling.”
Via @DailyDashboard
Note: If you have an IAPP login, I bet this article is worth looking at.
Jay Edelson, the Class-Action Lawyer Who May Be Tech’s Least Friended Man, NYTimes, April 4, 2015
Conor Dougherty writes: “When technology executives imagine the boogeyman, they see a baby-faced guy in wire-rim glasses. His name is Jay Edelson. . . . His firm, Edelson PC, specializes in suing technology companies, claiming privacy violations. He has gone after pretty much every tech company you have heard of — Amazon, Apple, Google — as well as many that you have not. His cases read like a time capsule of the last decade, charting how computers have been steadfastly logging data about our searches, our friends, our bodies.”
Note: Mr. Edelson is “full of self-deprecating comments about how he is ‘not technologically savvy at all’” and about how his move into privacy law was “a total accident.” Mr. Edelson, all you have to do is ask and I’m happy to bring you up to speed. My classes have been talking about mythical class-action privacy lawyers for a while now – it would be a pleasure to meet a real, live one.
Stop, Drop, and … Wait, What Does Our Policy Say to Do Next??
Video: In Depth – Sotto Details Who, What, Why of Today’s Cyber Threat Landscape, Hunton Privacy Blog, April 6, 2015
Today, Sotto says, cybersecurity is a legal issue, a risk issue and a governance issue, and one that matters to shareholders, boards of directors and regulators. View the video segment.
Via @Hunton_Privacy @LisaSotto
Note: As we discussed at the 4th Annual Privacy Law Forum: Silicon Valley, data breaches are something companies today must plan for and train for, just like fire and earthquake drills. “[I]t is no longer a question of if but rather a matter of when your company will be hit.” Also note: I have not watched this video, but the source is a reliable one, and the summary suggests they hit some important topics.
Your Daily Dose of “That’s Sort of Cute”
Facebook Scrapbook lets children inherit a digital identity, Slate, April 6, 2015

“To take advantage of this feature, users must explicitly identify their relationships on their profile. Parents can create scrapbooks only for children they have listed in the ‘Family and Relationships’ section of their profile. If the relationship is added through the Scrapbook tutorial, it is only visible to the user by default. Parents can only co-own a scrapbook with the partner they have identified in their relationship status.”
Via @Slate @FutureTenseNow @DearPriya
Note: I’m surprised it took Facebook this long. I mean, seriously, my friends and relatives post pictures of their kids daily, and the photographic proof that exists of my childhood can fit into a closet (I used to think it was just a few scrapbooks, but my mother assured me that there were more photos that just never got organized). Lucky for me, I got my first digital camera in 2003 or so, and then I started chronicling everything. Now I’m a digital hoarder – but who isn’t?!
So Much For Working In Your Pajamas (Cont.) & A Medical Data Piece Much Like My Blog Piece
(Both Food For Thought)
Online Test-Takers Feel Anti-Cheating Software’s Uneasy Glare, NYTimes, April 6, 2015
“Once her exam started, Ms. Chao said, a red warning band appeared on the computer screen indicating that Proctortrack was monitoring her computer and recording video of her. To constantly remind her that she was being watched, the program also showed a live image of her in miniature on her screen.” Mr. Gamino, a Rutgers sophomore, believes that “teachers would quit outright if they had to grade papers in the privacy of their own homes and be monitored and be forced to pay for it out of their own pocket.” But Ms. Chao would have to pay out of pocket to take her exam either way, and surrendering her privacy saves her $120.
In Verificient’s defense, in an effort to “alleviat[e] students’ concerns, @Verificient recently posted a pledge on its blog saying that Proctortrack did not share students’ data with third parties; that it typically deleted students’ data after 30 to 60 days; and that students could remove the software from their computers once they had uploaded their test data.”
Via @NYTimes @Natashanyt @Verificient
Note: First off, the entire reason anyone does work from home is to say in their P.J.s (ok and if they have small children… or a long commute… or if it’s cost-prohibitive to attend a proctored exam; but that last one is new information for me). Second, Verificient’s privacy policy seemed solid, until NYTimes pointed out: “the company has not changed its privacy policy — which states that it may unilaterally amend its policies at any time and that it may disclose users’ personal information to third-party service providers or in the event of a company merger, sale or bankruptcy.” Which, effectively means “your information is private so long as it isn’t extraordinarily valuable to us.” Because what happens in the case of merger, sale, or bankruptcy? (pretty sure there should be a comma between sale and bankruptcy) Your personal data, especially that super-awesome video of you forgetting to turn your webcam off during your exam, is wayyyyyy more valuable than any other product that Verificient has.
Seriously, setting aside any bad behavior on Verificient’s side, merely your name, address, and other seemingly innocuous personal information will likely be worth more to a purchasing company than anything else Verificient has to offer, in the event Verificient files for bankruptcy.
Using Patient Data to Democratize Medical Discovery, NYTimes, April 2, 2015
“One technical tool that could push things along is Apple’s ResearchKit, which was introduced last month. The open-source software in ResearchKit would let medical researchers write applications to recruit subjects and collect their data from their iPhones or iPhone-linked fitness monitors, like the FitBit or Apple Watch. . . In its risks and benefits page, the project lists the benefits as contributing to a cause that advances science and satisfying personal curiosity. On the risk side, it states that in principle someone with scientific skills could use a person’s genome to infer paternity, generate statistical evidence that might hurt a person’s chances of getting a job, insurance or loans, and make synthetic DNA to plant at a crime scene.”
Via @NYTimes @SteveLohr
Note: This article touches on essentially the same topics as my blog piece, Even HIPAA Can’t Protect You: Personalized Prohibitions Based On Your Medical Data Could Be Coming. I am literally the biggest proponent you will find of using Big Data for diagnostic and curative purposes, but even I am willing to put those dreams on hold until we can ensure that 1) the data is first depersonalized, and/or 2) there can be no negative ramifications from a breach. (Ben Heywood, co-founder and president of PatientsLikeMe, touched on an issue I had also raised: that, historically, disclosure of health information was potentially a problem because it “might be more difficult to obtain health insurance,” but that “the Affordable Care Act bans discrimination by insurers based on so-called pre-existing conditions.” Thus, “[t]he biggest deterrent has been removed by law.” This implies that there is a path toward a safe course of action for sharing this medical data: proper regulations.)