Who’s Watching the Children? Everyone.

By Joelle Renstrom

Facebook recently rolled out Scrapbook, which allows parents to create a photo album specific to their children.

(Image: Scrapbook screenshot. Source: http://newsroom.fb.com)

Posting photos is not a new concept, but tagging photos of one's own children, which Scrapbook now makes possible, is.

Facebook has become a forum for parents to show off their adorable newborns and toddlers. It makes sense: these parents have undergone a major life change, and they're generally too busy and overwhelmed to share the news in person. Posting photos and updates on Facebook is the most efficient way to reach large numbers of friends and family and to stay connected with the outside world.

One has to ask, though, whether using social media to stay in touch is a good idea, specifically because of the "deep learning" algorithms employed by Facebook (and Google, and Microsoft).

Facebook uses artificial intelligence to analyze users’ posts and photos. When kids are mentioned and tagged, they unwittingly become part of a system that will mine information about them for the rest of their lives. Kids will have virtual identities before they have actual ones. Scrapbook may make that even easier.


Maybe some people accept the loss of privacy as a fair price for the conveniences gained. But inherent in the idea of privacy is each person's right to decide how much to push back, if at all, against this paradigm. That is precisely the problem with parents posting on Facebook: they cede their kids' right to privacy before the kids ever have the chance to think about it.

How bad is Facebook, really? Most people understand that despite ever-changing privacy settings, information and photos on Facebook aren't guaranteed to remain private. We've all seen the ads on Facebook that just happen to offer something we've recently searched for or posted about. That's thanks to tracking cookies, which log our activity across the web; every page with an embedded "like" button reports our visit back to Facebook's servers. Coupled with third-party data selling and digital fingerprinting (which essentially makes each computer uniquely identifiable, so activity can be traced back to it), data-mining is ongoing, especially since some cookies never expire.
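To make the mechanism concrete, here is a toy sketch in Python of how a company that serves an embedded widget could stitch a person's browsing history together from its own request logs. The log entries, cookie IDs, and sites below are invented for illustration; this is not Facebook's actual pipeline, just the general shape of cross-site tracking.

```python
# Toy sketch: reconstructing browsing profiles from widget-request logs.
# Every page that embeds a third-party "like" button causes the browser to
# request that button from the third party's servers, sending along the
# third party's cookie and the address of the page being read.
from collections import defaultdict

# Hypothetical (cookie_id, referring_page) pairs, as a widget server might log them.
widget_requests = [
    ("cookie-abc123", "https://example-baby-store.com/car-seats"),
    ("cookie-abc123", "https://example-parenting-forum.com/colic-advice"),
    ("cookie-abc123", "https://example-news-site.com/daycare-costs"),
    ("cookie-xyz789", "https://example-sports-site.com/nba-scores"),
]

def build_profiles(requests):
    """Group pages visited by the same cookie into one browsing profile."""
    profiles = defaultdict(list)
    for cookie_id, page in requests:
        profiles[cookie_id].append(page)
    return dict(profiles)

# The same cookie ID seen across unrelated sites ties those visits together.
for cookie_id, pages in build_profiles(widget_requests).items():
    print(cookie_id, "visited:", pages)
```

Once a cookie ID is linked to a logged-in account, that browsing history stops being anonymous.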


Think about what happens, then, when someone posts news about being pregnant or posts an ultrasound picture. Thanks to account information and push notifications, Facebook knows roughly when and where the baby will be born. Facial recognition software will allow the site to identify the baby in photos with other people and in different environments. Depending on additional account information, updates, and web searches, Facebook likely also knows the sex of the baby, the parents' socioeconomic class, and whether a parent is staying home to care for the child or whether the child is in daycare. Further posts and pictures will likely reveal what kind of baby gear the parents have or need, as most either search for or ask Facebook friends for recommendations about car seats, slings, and co-sleepers.


Facebook will know the baby's name, birth date, and gender, and as the parents continue to post updates and search online, Facebook will know that the baby is sitting up or walking, that it wears Gymboree onesies, that it has colic or is teething. But thanks to deep learning, Facebook can do more than access search information, updates, and photos, and this is the part people might not be aware of, and thus where the even bigger problem lies.

Deep learning algorithms loosely emulate the way the human brain processes data, using networks of simulated brain cells. This goes beyond identifying likes and keywords; it extends to identifying the objects in photos, not just the faces. Google's image recognition capabilities are particularly potent, but Facebook isn't far behind.

New software processes data in a way that allows it to identify objects as well as interpret scenes, so it can put those objects in context: "Two pizzas sitting on top of a stove" or "a group of young people playing a game of Frisbee." This particular software has two neural networks, kind of like the right and left sides of a brain. The first "sees" an image and describes it mathematically, while the second translates that data into a sentence. What this means is that based on one adorable photo, an image recognition and translation program could understand "a dad feeds his baby mashed carrots" or "a mom plays Peekaboo with a baby." That might not seem like dangerous information by itself, but what parent posts only a single picture on Facebook? And what parent pays close attention to the objects in the photo besides the baby? Most parents post entire albums, from which image recognition programs can glean frightening amounts of data about the kid, the parents, their home, and their activities.
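For readers curious what that two-network setup looks like, here is a minimal sketch in Python (using the PyTorch library): one small network condenses an image into a list of numbers, and a second turns those numbers into words. Everything here, from the layer sizes to the five-word vocabulary, is invented for illustration; it shows the general encoder-decoder idea, not the actual software described above.

```python
# Minimal sketch of the two-network ("encoder-decoder") image-captioning idea.
# Illustrative only; model sizes and the vocabulary are made up.
import torch
import torch.nn as nn

class CaptionEncoder(nn.Module):
    """Network #1: 'sees' the image and describes it as a vector of numbers."""
    def __init__(self, feature_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, feature_dim)

    def forward(self, images):                # images: (batch, 3, height, width)
        x = self.conv(images).flatten(1)      # (batch, 32)
        return self.fc(x)                     # (batch, feature_dim)

class CaptionDecoder(nn.Module):
    """Network #2: turns that vector into a sentence, one word at a time."""
    def __init__(self, vocab_size, feature_dim=256, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, feature_dim)
        self.rnn = nn.GRU(feature_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, features, captions):    # captions: (batch, seq_len) word ids
        h0 = features.unsqueeze(0)            # image features seed the sentence
        output, _ = self.rnn(self.embed(captions), h0)
        return self.out(output)               # a score for every vocabulary word, per step

# Toy usage: one fake 64x64 image and a hypothetical five-word caption.
vocab = ["<start>", "a", "dad", "feeds", "baby"]
encoder, decoder = CaptionEncoder(), CaptionDecoder(vocab_size=len(vocab))
image = torch.randn(1, 3, 64, 64)
caption_ids = torch.tensor([[0, 1, 2, 3, 4]])
word_scores = decoder(encoder(image), caption_ids)   # shape: (1, 5, len(vocab))
```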


Deep learning programs can also identify users’ emotions (which Facebook has been known to successfully manipulate). Image recognition programs can discern if someone is smiling or laughing, and thus conclude that the person is happy.

The context of those photos offers clues about the source of that emotion. If a person is smiling while holding a basketball, a deep learning program could deduce that the person enjoys basketball and may be in the market for basketball-related goods, tickets to a local game, a LeBron James jersey, or a cable package featuring NBA games. Thus, Facebook can figure out not just what the parents like and don't like, but what their kids like and don't like, which makes those kids advertising targets. And just as Facebook tailored users' news feeds to try to induce happiness or sadness, Facebook (or Google, or any company with deep learning algorithms) could attempt to manipulate the emotions of children, which could lead to headaches beyond data-mining.
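The leap from a caption to an ad is short. The toy Python sketch below shows the kind of rule a targeting system could apply to a generated caption and a detected expression; the keyword lists and product categories are invented, and no real advertising system is this simple.

```python
# Toy illustration (not any company's real system) of turning a machine-generated
# photo caption plus a detected facial expression into an advertising inference.
AD_CATEGORIES = {
    "basketball": ["sporting goods", "game tickets", "team jerseys", "NBA cable package"],
    "baby": ["diapers", "car seats", "strollers"],
    "pizza": ["food delivery", "kitchen appliances"],
}

def infer_ad_targets(caption, expression):
    """Match caption words to ad categories; rank them higher if the subject looks happy."""
    words = caption.lower().split()
    targets = []
    for keyword, products in AD_CATEGORIES.items():
        if keyword in words:
            note = "high interest" if expression == "smiling" else "possible interest"
            targets.extend(f"{product} ({note})" for product in products)
    return targets

print(infer_ad_targets("a man smiling while holding a basketball", "smiling"))
```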

Deep learning enables Facebook to be predictive, which is particularly insidious when it comes to babies. Before a baby even has self-awareness, Facebook can already know what makes it happy and what makes it scream, and can thus predict what the baby will like and want when it gets old enough to start asking its parents for things. Even something as innocent as watching Sesame Street or being in the room while the parents watch the Super Bowl can lead to assumptions about preferences and buying habits later in life. Those assumptions will trigger ads, which could create a self-fulfilling cycle by successfully inducing a desire for certain goods or activities. The potential for manipulation is frightening; more frightening still is how much companies such as Facebook (and, of course, the NSA) will know about these children.

Maybe it's just delaying the inevitable, but we need to take steps (like resisting Scrapbook and resisting the urge to share pictures of our families on the internet) to protect our children from acquiring data-based identities, and the complications that come with them.

———————–

Joelle teaches writing and research with a focus on science and technology at Boston University. She maintains an award-winning blog, Could This Happen?, about the relationship between science and science fiction. Her work has appeared in Slate, Guernica, Briarpatch, and others. Her collection of essays, Closing the Book: Travels in Life, Loss, and Literature, will be published in August 2015.