Category Archives: Personal Data Management

2017 Conference dates chosen!

Image: 4T Data Literacy conference logo.

We’re excited to announce that the dates for the 2017 4T Data Literacy Virtual Conference have been set! We’ll meet virtually on July 20-21, 2017. This year, we’re focusing our presentations on three (and a half) themes:

  1. Big Data, including citizen science
  2. Ethical data use
  3. Personal data management

Registration and more details are coming soon. If you registered last year, you’re already on our list, and we’ll let you know when it’s time to sign up!

Reading Recommendation: Predictive Analytics

When used to make predictions, data can be quite powerful! A common example is the story of the retailer Target predicting a customer’s pregnancy. When the company sent coupons for baby products to a teen, her father complained. However, it turned out that she was indeed pregnant. Such stories can be both impressive and concerning. Beyond revealing trends and patterns, data can lead to new information. In the case of Target and the teen, the store did not just know what the teen bought; those purchases suggested more information: her pregnancy. As Eric Siegel writes:

[t]his isn’t a case of mishandling, leaking, or stealing data. Rather, it is the generation of new data, the indirect discovery of unvolunteered truths about people. Organizations predict these powerful insights from existing innocuous data, as if creating them out of thin air.

To explain how predictive analytics works, Siegel provides a wealth of examples and in-depth explanations in Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die. Understanding how organizations glean information from data, and how they use that information, helps us make sense of marketing and decision-making today. It also helps us manage our personal data.
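If you’re curious what “creating new data out of thin air” can look like in practice, here is a minimal, purely hypothetical sketch in Python. It is not Target’s method or anything from Siegel’s book; the products, the numbers, and the crude per-item scoring are all invented just to show how ordinary purchase records can be turned into a prediction about something a shopper never disclosed.

    from collections import defaultdict

    # Invented historical data: (shopping basket, known outcome) pairs.
    history = [
        ({"unscented lotion", "prenatal vitamins", "cotton balls"}, True),
        ({"unscented lotion", "large tote bag"}, True),
        ({"scented candles", "cotton balls"}, False),
        ({"soda", "chips"}, False),
    ]

    # For each item, count how often it appears in baskets tied to the outcome.
    item_counts = defaultdict(lambda: [0, 0])  # item -> [baskets with outcome, total baskets]
    for basket, outcome in history:
        for item in basket:
            item_counts[item][1] += 1
            if outcome:
                item_counts[item][0] += 1

    def score(basket):
        """Average the per-item rates: a crude stand-in for a trained model."""
        rates = [item_counts[i][0] / item_counts[i][1] for i in basket if i in item_counts]
        return sum(rates) / len(rates) if rates else 0.0

    # A new, "innocuous" basket yields a prediction the shopper never volunteered.
    print(score({"unscented lotion", "cotton balls"}))  # 0.75

Real systems use far richer features and properly trained models, but the underlying move is the same: combine innocuous signals until they imply something personal.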


Source: Siegel, Eric. Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die. Hoboken, New Jersey: John Wiley & Sons, 2013.

Image: “Women Grocery Shopping.jpg” by Bill Branson (Photographer), on Wikimedia Commons. Public Domain. 

Location data points can identify individuals

We are looking forward to embarking on the second year of our project in the fall! During Year Two, we’ll focus on a second set of themes. One of those areas is personal data management. Here’s a sneak peek at what that theme will cover.

Our actions, from using a cell phone to paying with a credit card, generate data. That data goes into the hands of companies and organizations, and often we don’t know, or have any control over, what they do with it. Such uses can raise privacy issues. One common example is the Netflix contest for improving its movie recommendations, which went awry when researchers were able to re-identify Netflix customers despite the anonymization of the released data.

A recent study from the Columbia University Data Science Institute and Google showed that individuals can be re-identified by matching location data from two accounts. As a preview of the issues we will start exploring this fall, check out that study.
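To make the intuition concrete, here is a tiny, hypothetical sketch in Python. It does not reproduce the study’s method or data; the accounts and the handful of coarse (day, place) points are invented simply to show how counting overlaps between two “anonymized” datasets can be enough to link an anonymous trace to a named account.

    # Invented location traces held by two different services.
    records_app_a = {
        "anon_user_17": {("Mon", "coffee shop"), ("Tue", "gym"), ("Wed", "library")},
        "anon_user_42": {("Mon", "coffee shop"), ("Tue", "office"), ("Wed", "airport")},
    }
    records_app_b = {
        "j.smith@example.com": {("Mon", "coffee shop"), ("Tue", "gym"), ("Wed", "library")},
        "pat@example.com": {("Mon", "park"), ("Tue", "office"), ("Wed", "airport")},
    }

    def best_match(anon_points, identified_records):
        """Return the identified account whose location points overlap the most."""
        return max(identified_records, key=lambda who: len(anon_points & identified_records[who]))

    for anon_id, points in records_app_a.items():
        print(anon_id, "->", best_match(points, records_app_b))
    # anon_user_17 -> j.smith@example.com
    # anon_user_42 -> pat@example.com

With finer-grained timestamps and coordinates, real traces become even more distinctive, which is why a few location points can act like a fingerprint.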

Image: “Compass Navigation Map Direction Journey Travel,” by PDPics, on Pixabay. CC0 Public Domain.

Reading Recommendation: What Stays in Vegas

One industry that uses personal data from customers is gaming. Through loyalty programs, casinos can glean information about people to customize advertising and services. Adam Tanner describes this practice in What Stays in Vegas: The World of Personal Data–Lifeblood of Big Business–and the End of Privacy as We Know It:

Boosted by vast banks of computers, Caesars today know the names of the vast majority of their clients, exactly how much they spend, where they like to spend it, how often they come, and many other characteristics. They even know exactly where many of their customers are at a given moment–whether they are sitting at a specific Wheel of Fortune slot machine or playing blackjack in the wee hours of the morning. They gather all these details with the consent of those who choose to participate in their loyalty program.

Loyalty programs supply your personal data to the companies that run them. This book made me think twice about signing up for and using loyalty programs, despite their benefits, because they require giving up so much information about my habits. I had no idea!

In What Stays in Vegas, Tanner also raises ethical issues, such as the justifications commercial companies offer for tracking people, and he questions where the line between creepy and useful lies. Tanner proposes that consumers should be able to see what data private companies hold and that privacy policies should be presented consistently and recognizably. Check out his appendix for actionable ways to control your personal data, such as using an email address that does not identify you by name for communications from commercial companies and signing up for the Do Not Call Registry.

What are ways that you limit your personal data sharing? Do you participate in loyalty programs?


Source: Tanner, Adam. What Stays in Vegas: The World of Personal Data–Lifeblood of Big Business–and the End of Privacy as We Know It. New York: PublicAffairs, a Member of the Perseus Books Group, 2014.

Image: “A view of the card tables inside the casino” by Kym Koch Thompson, on Wikipedia. CC BY 2.0. 

Reading Recommendation: Data and Goliath

Where are your data stored, and who has control of your data?

The answers to these questions are not always straightforward. We don’t always know whose eyes are on our data. For example, cell phone data reside on the servers of private companies, and a lot of information can be gleaned from those data, from your location to your relationships.

Bruce Schneier writes about surveillance via data in Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. For anyone curious about what data companies and the government keep and monitor, it is a fascinating read.

One of Schneier’s central points concerns security and privacy as they relate to data: access to data, like cell phone logs, can reduce privacy while supporting security. He writes:

[o]ften the debate is characterized as “security versus privacy.” This simplistic view requires us to make some kind of fundamental trade-off between the two: in order to become secure, we must sacrifice our privacy and subject ourselves to surveillance. And if we want some level of privacy, we must recognize that we must sacrifice some security in order to get it.

However, this contrast between security and privacy might not be necessary. Schneier goes on to point out that:

[i]t’s a false trade-off. First, some security measures require people to give up privacy, but others don’t impinge on privacy at all: door locks, tall fences, guards, reinforced cockpit doors on airplanes. When we have no privacy, we feel exposed and vulnerable; we feel less secure. Similarly, if our personal spaces and records are not secure, we have less privacy. The Fourth Amendment of the US Constitution talks about ‘the right of the people to be secure in the persons, houses, papers, and effects’… . Its authors recognized that privacy is fundamental to the security of the individual.

More generally, our goal shouldn’t be to find an acceptable trade-off between security and privacy, because we can and should maintain both together.

Schneier’s book is illuminating for considering personal data management (one of the themes for the upcoming second year of our project in 2016-2017!) in light of data use by commercial companies and the government. Schneier takes a philosophical approach to discussing data, security, and privacy, and he concludes with useful tips for protecting your data. Read Data and Goliath for some great food for thought!


Source: Schneier, Bruce. Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. New York: W.W. Norton & Company, 2015.

Image: “People Lens White Eye Large” by skitterphoto.com, on Pexels. CC0 Public Domain. 

Internet research skills on mobile technology

How do you teach good online research skills to students who use mobile technology?

Librarians are observing that students approach research differently on mobile devices. Infinite scrolling makes re-finding sources difficult, and the abundance of information has led to differing ideas about which sources are credible. Our team member Wendy Stephens wrote about these issues for School Library Journal, and her piece includes insights from our team member Tasha Bergson-Michelson.

Wendy writes:

Evaluating information is necessarily a more time intensive and complicated process than retrieving information in a networked environment, but teens have demonstrated shifting notions about what makes a source valuable. Pickard, Shenton, and Johnson (2014) found that the young people they surveyed at an English secondary school, when presented with a list of particular evaluative criteria for online research, were not interested in traditional authority of information. Those students instead prioritized currency and topicality, lack of mechanical errors, and verifiability. The last item in particular suggests that young people find recurring information, shared in a variety of places, to be a hallmark of authenticity at odds with earlier notions of authorial attributions.

“Search is a garbage in, garbage out process,” says Tasha Bergson-Michelson, instructional and programming librarian at Castilleja School in Palo Alto, CA. “Choosing search terms is hard. If you have the right words, you can find the data.”

Transferring research standards to current technology is necessary, as Wendy concludes:

The topics may differ and the sources might look different, but online research still points to many of the hallmarks of an established process. Contextualizing the acquisition of search skills, as Martin suggests, and refining search terms as Bergson-Michelson advocates, reiterate principles of bibliographic instruction grounded in print research. But the necessary authenticity of the research task will remain integral, and this is where librarians are key in championing and supporting inquiry projects of students’ own devising, helping young people connect to a range of resources to inform their particular passions.

These points connect to data literacy because knowing how search works is part of responsible digital citizenship and, relatedly, personal data management. Thanks, Wendy and Tasha!

Image: “Apple Iphone Smartphone Technology Mobile Phone,” by Pexels on Pixabay. CC0 Public Domain.

Team member Connie Williams on privacy and teens

This post is Part 1 in a two-part series highlighting our team members’ work with Choose Privacy Week. This initiative of the American Library Association puts a spotlight on issues of privacy in today’s digital world, such as tracking in online searches. Knowing how your data are used is a component of data literacy, and we are excited to feature our team members’ blog posts on these topics.

Connie wrote about the traces that online actions leave and how they affect teens on the Choose Privacy Week blog. Here is an excerpt from her piece:

…there are universal norms that our students must know about their online presence: what you post can describe you, once a post leaves your device it is no longer in your control, and there is indeed, a digital footprint that gets left behind.

What this means for children and teens is that their online lives can follow them through their offline lives. If they post provocative things or mean things or negative things, they will be perceived by their online friends as those things; even if they are none of those things in their offline lives. One of the hardest ideas for teens to grasp sometimes is the idea that they are often creating a ‘body of work’ that can define them to others.

Online activity can certainly have broad implications, and being active online while managing your privacy is not always easy. Connie suggests establishing norms:

…it is important that we begin thinking about how we will allow our growing children online access while still keeping them protected. While online security is not a typical survival necessity, it is one that can impact our children. As adults, the information we share about our children with our own friends and families is the first step to modeling positive online behavior. Setting up norms that children learn to follow and understand – ‘hand holding’ –  will allow parents and educators to loosen that grip, enabling them to expand their access as they grow and demonstrate their abilities to participate positively.

Instruction on best practices for students can take a variety of forms, and Connie goes on to provide examples. Thanks, Connie!

Image: “Choose Privacy Week 2013,” American Library Association, on Choose Privacy Week.

In the News: Internet privacy

In what ways do you limit your data sharing? Do you join or avoid loyalty rewards programs that track your habits? Do you block or regularly clear cookies in your browser? Those steps are some areas where you have control over your information. Yet sharing of your data with third parties is sometimes out of your hands or buried in the fine print of the services you use.

This past week, the Federal Communications Commission proposed new rules that would give you the choice to opt out of your Internet service provider sharing your data with third parties. While the rules would not apply to sharing by websites, as critics point out, they do take a step toward consumer control of data sharing in the United States. It will be interesting to see what comes of this proposal!

Image: “Binary Map Internet Technology World Digital” by Pete Linforth on Pixabay. CC0 Public Domain. https://pixabay.com/en/binary-map-internet-technology-1012756/


Should your school track your fitness?

Students at Oral Roberts University have been asked to submit their exercise records since the school was founded. Recently, the school announced that students will be required to purchase and use a FitBit personal fitness and health tracker, with the data synced to university record-keepers.

Now, before I go any further, I want to say that I have a FitBit and really like the kinds of data I get. I opt to keep my data to myself (I don’t like competing with other people about my exercise – I find that a lot of social media’s motivational push actually backfires on me). In my case, the benefits to my health, when weighed against the fact that my heart rate is floating around in the cloud somewhere, win out.

But this Oral Roberts decision raises an interesting question related to next year’s themes of ethical data use and personal data management. What happens when others (not just anonymous FitBit employees, but the people who teach us, feed us on campus, house us in dorms, and, in the case of a religious university, guide our spiritual development) also have access to our data?

And what kinds of data do they get? Exercise minutes, as in the past, to be sure. But today’s fitness trackers don’t just count steps like an old-school pedometer or a distance-measuring wheel used in track and field. They are what Zuboff would call “informating” devices: there is so much more data that can be captured. For example, your FitBit data can reveal the time of day you exercised (violating curfew, anybody? Now you’re busted) and even your heart rate (not a good health insurance risk!). If a user logs food, that gets tracked, too (Twinkie? Gotcha. At the same time, if it looks like you are anorexic, we can intervene).
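Here is a tiny, hypothetical sketch in Python of what that could look like. The records, the export format, and the 11 p.m. curfew are all invented (this is not FitBit’s actual data format or API); the point is only that once minute-by-minute data sit with a third party, a few lines of code can turn “fitness data” into conclusions about behavior.

    from datetime import datetime

    # Invented minute-level records: (timestamp, steps in that minute, heart rate).
    synced_data = [
        (datetime(2016, 4, 12, 18, 30), 110, 128),
        (datetime(2016, 4, 12, 23, 40), 95, 122),  # brisk walking at 11:40 p.m.
        (datetime(2016, 4, 13, 7, 15), 0, 58),
    ]

    CURFEW_HOUR = 23  # assumed 11 p.m. curfew, purely for illustration

    for when, steps, heart_rate in synced_data:
        if steps > 60 and (when.hour >= CURFEW_HOUR or when.hour < 5):
            print(f"Active after curfew: {when:%Y-%m-%d %H:%M} ({steps} steps, HR {heart_rate})")
    # Active after curfew: 2016-04-12 23:40 (95 steps, HR 122)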

What are the ethical responsibilities of those who hold our data, whether it’s my steps or my photo collection? What are the responsibilities of those who have access to our data? Most importantly, what are the unintended consequences, and how do we teach ourselves to anticipate them?

These are the kinds of questions that make me excited for year two of our grant. Oh, and according to my FitBit, I’ll be hitting the gym tonight.

Kristin Fontichiaro


Image: “Traqueur d’activité Fitbit Charge HR au poignet” by Wuefab on Wikipedia. CC-BY-SA-4.0.