For Realz
Measuring Social ‘Trust’ to Make Loans, NYTimes, April 7, 2015

“Alternative consumer lenders tend to fit into one of two camps: peer-to-peer marketplaces like Lending Club and Prosper, and start-ups using data science to parse credit risk, like Affirm, Earnest and Zest Finance. . . . Vouch Financial, which is emerging from its pilot testing phase this week, has an unusual spin on the data science approach. . . . Vouch wants you to construct a social network of people who trust you financially — people who will, yes, vouch for you . . . and typically to commit some amount of money” if you do not repay your loan.
Via @NYTimes @SteveLohr @Vouch
Note: This is mind-boggling. If you do this, promise me you will be super careful. As a “voucher”, you “sign an electronic agreement to pay, if necessary, that is legally enforceable [and] agree to have [your] own credit history looked up.” Vouch is not messing around, and this is not Monopoly money.
FTC to review complaint that YouTube Kids over-advertises to children, The Washington Post, April 7, 2015
Consumer groups filed a complaint Tuesday with the Federal Trade Commission, alleging “YouTube’s free app that launched in February contains too many ads that young children can’t distinguish from entertainment. On television, federal rules keep advertising to a minimum on children’s programs but on the app and others like it, the groups say those rules are disregarded.” YouTube responded in a statement essentially contending that ads are necessary to support free entertainment, and that “great content shouldn’t be reserved for only those families who can afford it”.
Via @TheWashingtonPost @TheSwitch @CeciliaKang
Note: Kids certainly have a harder time distinguishing ads from shows, but I’m not sure how you can get legal, free TV without ads. I’ll leave this one to the experts.
Healthy High-Tech
A Deep Dive Into the Privacy and Security Risks for Health, Wellness and Medical Apps, Privacy Association, April 6, 2015
On Feb. 9, the Food and Drug Administration (FDA) issued “Mobile Medical Applications: Guidance for Food and Drug Administration Staff” (available here), in which it declared that it would not “actively regulate health and wellness apps—as well as a host of medical-related apps.” App developers began believing that “no regulator [wa]s watching” – turns out, the FTC was watching closely, and those developers are now coming under fire for their inexperience in complying with “online privacy and security”.
One example of how an app developer could violate FTC protocol without even intending to: “an app that has no third-party network traffic associated with it, but that nevertheless transmits in plaintext over HTTP that ‘John Smith just reminded himself to take his hepatitis medication.’”
Via @DailyDashboard
Note: “plaintext” = data without any encryption. As I explained in my blog piece Consider A Book – Oh and Casper Doesn’t Run Ghost Networks, and as the author explains in this article, sending a message without any encryption “makes it possible for any person within range of the end-user’s WiFi signal to sniff the data.”
Another vocabulary phrase for the day: “sniff the data” = “capture the data”. So in this case, a message sent in plain English, not even written in Pig Latin, could easily be intercepted by someone watching all of the messages being sent and received on a particular network (see the sketch below). And as the article notes, on an “employer or school connection”, IT can routinely see all of the messages being sent and received.
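To make the plaintext-vs.-encrypted point concrete, here is a minimal Python sketch. The URL and payload are hypothetical, not from any real app, and this is an illustration rather than any vendor’s actual code:

```python
# Minimal sketch (hypothetical URL and payload) of plaintext HTTP
# versus encrypted HTTPS transmission of a health reminder.
import requests

payload = {"reminder": "John Smith just reminded himself to take his hepatitis medication"}

# Over plain HTTP, this payload crosses the network as readable text:
# anyone "sniffing" the WiFi with a packet-capture tool can read it.
requests.post("http://example.com/api/reminders", data=payload)

# Over HTTPS, TLS encrypts the request in transit; a sniffer sees the
# destination host and scrambled bytes, not the reminder text itself.
requests.post("https://example.com/api/reminders", data=payload)
```

With the HTTP version, a free packet-capture tool like Wireshark running on the same network shows the reminder verbatim; with the HTTPS version, it shows only encrypted bytes.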
My advice? If you’re using any health/wellness/medical apps and inputting any remotely sensitive information: 1) turn off lock-screen notifications (so that no one picking up your phone will see your reminder to [insert thing]); 2) use only health/wellness/medical apps that let you password-protect access to the app itself, and actually use that password-protection feature; and 3) if the app transmits data, don’t permit the app to run in the “background”, and turn off WiFi when the app is running.
Recruiting people for genetics studies on Facebook, Slate, April 8, 2015
“Genetic researchers at the University of Michigan recently launched the Genes for Good program, an innovative study that’s using Facebook and an appeal to the common good to encourage citizens to donate their genetic samples. This initiative follows on President Obama’s State of the Union address launching the Precision Medicine Initiative, which seeks to collect genetic and other data from more than 1 million volunteers to power the transition to precision medicine. . . . The most promising quid pro quo to encourage genetic donations is therefore information and self-discovery. At this time, Genes for Good offers participants information about their ancestry and limited health information. . . . We must provide an appealing path for citizens who wish to do so to become full participants in the genomic revolution. After all, if we are going to put ourselves out there, let’s find out what we’re made of—genes and all.”
Via @Slate @FutureTenseNow
Note: This is for sure a time to read the Terms of Service 😉
Righting A Wrong
Hackers breaking into baby cams are actually trying to help, Fusion, April 7, 2015
“[T]he chief operating officer for Foscam’s U.S. distribution arm, Chase Rhymes,” says that Foscam wanted “to give . . . customers the freedom to keep it easy and not have to make their own password.” Fortunately, “with new products, Foscam does force customers to put a customized password on the devices,” but Rhymes explained that “there [i]s no way for the company to communicate to some of the people who had bought the security-defective cameras the company had made in the past.”
Via @Fusion @TheRealFuture @KashHill @Foscam
Note: Consider this your PSA. Put a password on your babycam. (For the curious, a sketch of what “forcing” a customized password looks like follows below.)
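Here is a minimal Python sketch of the fix Rhymes describes: refusing to complete setup until the owner replaces the factory default. This is a hypothetical illustration, not Foscam’s actual firmware code, and the default credential shown is made up:

```python
# Hypothetical sketch of "forcing a customized password" at setup time.
# Not Foscam's actual code; the default credential below is made up.
FACTORY_DEFAULT = "admin"

def complete_setup(new_password: str) -> bool:
    """Finish device setup only if the owner picked a real password."""
    if not new_password or new_password == FACTORY_DEFAULT:
        # Default-credential scans succeed precisely because devices
        # ship usable with a well-known password; block that state.
        print("Setup blocked: choose a password that is not the factory default.")
        return False
    print("Password accepted; camera no longer answers to the default login.")
    return True

# Example: the first call is rejected, the second completes setup.
complete_setup("admin")
complete_setup("a-long-unique-passphrase")
```

The design point is that the old cameras were usable straight out of the box with no password at all, which is exactly the state a check like this refuses to allow.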
Privacy and Depression Do Not Crash Planes, Privacy Association, April 7, 2015
“Immediately after initial news of the [Germanwings Flight 9525] crash . . . [c]ritics and journalists lashed out against privacy laws”. They uncovered evidence of a “major depressive episode he’d experienced while still in flight school”, alleged that this condition led the pilot to “commit suicide” by crashing the plane, and claimed that the airline covered up the condition by citing privacy law.
How did they uncover this evidence? First, they read an e-mail he sent to his employer; second, they searched his “click history”, which revealed that, in the days leading up to the accident, he “searched for means of suicide and locking the cockpit door.” But the e-mail disclosed an episode that occurred back in 2009, and that search could have been made “out of safety concerns”.
Further, blaming privacy law fails on two additional counts:
1) It unnecessarily “stigmatize[s those] in positions of great responsibility who suffer from anxiety or mild depression and are treated by antidepressants without posing any risk to themselves or others . . . typically fully functional members of their families, workplaces and society. And if medical confidentiality were compromised, not only the depressed but also those suffering from diabetes, hypertension or high cholesterol would be forced to hide their symptoms.”
2) “if privacy were breached, patients would not get any better but rather take their symptoms underground, go untreated or self-medicate.”
“To be sure, even the most stringent confidentiality laws have exceptions. . . . A surgeon with epilepsy or a diabetic power grid worker must take special precautions under existing laws and regulations.” But “[i]ndividuals must have a zone of privacy to vent fears and weaknesses, admit their wrongdoings and seek help. The alternative, a world where individuals do not have such private places, is very scary, indeed.”
Via @DailyDashboard
Note: I tried to summarize this article as much as possible, but if these issues interest you, I highly recommend reading it in its entirety. The author makes some excellent points, working to reduce stigma in an age where it is way too easy to search someone’s “click history” and find one reason or another to assign blame. What could we find on your computer tomorrow to prove that you were at fault for an unintentional accident?