Blog 18: Current impact and the future of self-tracking

Hear, hear, the last official blog entry for the course! We are wrapping up the last two chapters of Neff & Nafus’s Self-Tracking this week and will move quickly into the end of the semester. It is hard to believe how much we have covered in less than 16 weeks, plus the discussions we had in class and online (these blogs); I just wish there were more time for us to engage with emerging content as part of our journey in studying rhetoric, technology, and the internet.

In closing their book, Neff and Nafus make an explicit turn toward the medical industry, discussing ways in which wearable technology and self-tracking affect the nature of work and practice in medicine. This is expected: medicine is among the industries most immediately affected by wearable devices, given a history with technology that traces back to implantables and biotechnology that did not necessarily have the ability to compute. With the popularization of self-tracking devices for monitoring health and wellness, the whole rhetoric of health and medicine is brought to a new level of user-controlled, user-generated meaning of well-being––one beyond what practitioners (doctors and care providers) might imagine.

There are, of course, other industries that are directly influenced by wearable trackers, such as business, education, communications, and entertainment. As with the effects brought about by the invention of the internet, consumer wearable technology seems to set out to impact almost every part of our daily lives. Neff and Nafus certainly don’t have the space to discuss all of these in their pocket-sized reader. This also means that more needs to be said and published.

In their last chapter, Neff and Nafus, like Greengard, discuss the foreseeable future of self-tracking. They envision the need to amplify what we know and mean by “datafication”––the fight over the meanings of privacy, ownership, and data use. In fact, these fights are already underway. To improve our technologies, we simply need to keep pressing on. What I see as more important across the board of innovation and technological change is the cultivation of digital literacy. It may sound pedantic, but we need to educate users on not just the what, where, and when of data collection and use, but also the how and why. To empower users is to give them the knowledge (to be literate) to maneuver technology for their own purposes.

My research interests continue to revolve around the development of such emerging literacies, and I am glad that we have been able to touch on a few of the topics I consider important this semester.

To close, here are my final discussion questions:

  • Who should drive the future of technological change/innovation? Why?
  • How is technological change a social process?

Neff, G. & Nafus, D. (2016). Self-Tracking. Cambridge, MA: MIT Press. [Chapters 5 & 6]

Blog 17: A life tracked and measured

This is the second-to-last official blog entry for this class! I am excited to close the course with concluding thoughts on the internet, technology, and their rhetorical impact on various aspects of our society. That conversation will take place in our final class session on May 4.

In Chapters 3 and 4, Neff and Nafus provide us with plenty of strategies for handling wearable technologies such as trackers and artificial intelligence/task managers. What I’ve found most interesting between these pages is the set of common purposes of tracking the authors have identified. I dedicate this space to reviewing them.

1. Monitoring and evaluating. I can see how the most immediate use of self-tracking devices is keeping count of our activities and events. The evaluative part of this purpose is one that most older technologies (like pen-and-paper logs) couldn’t automate. Using wearable computers to monitor and evaluate the data collected helps minimize the effort needed to perform both of those tasks.

2. Eliciting sensations. This one I found intriguing but not surprising. The affective/emotional side of wearable computing has been a growing focus of the next phase of computing (as seen on the Hype Cycle kept by UMN IT). Integrating feelings into computing seems the logical next step for human-centered technological innovation. Still, I am curious how this changes our relationships with the machines we use and interact with constantly.

3. Aesthetic curiosity. This is a romanticized idea, I believe: the notion that raw information is beautiful (cue A Beautiful Mind) and can be “painted” with big data. I recall several desktop apps created to track user activity and produce a visualization of that tracking. One example is IOGraphica (check it out, it’s free).

4. Debugging a problem. Perhaps the most practical use of constant tracking, besides monitoring and evaluating activity, is identifying the sources of a problem and ways to resolve it. While it doesn’t always guarantee a solution, self-tracking tends to lead users to insights about themselves that they don’t usually reach through self-report (because of bias).

5. Cultivating a habit. The last purpose of self-tracking, according to Neff and Nafus, is to create a habit of monitoring. This rests on an underlying assumption that monitoring is good and should be encouraged. I can see why scientists’ and academics’ lightbulbs light up here, but I would be cautious about the implications of such an assumption. Donna Haraway’s “Cyborg Manifesto” offers a useful entry into this discussion, and I’d recommend reading it for inspiration.

Questions for discussion:

  • What do we expect our devices to do given our goals (above) and worries about surveillance and privacy?
  • How might culture be factored into the purposes above?

Neff, G. & Nafus, D. (2016). Self-Tracking. Cambridge, MA: MIT Press. [Chapters 3 & 4]

Blog 16: Watching and watched

The narratives on the pervasiveness of wearable technology often lead to discussions of surveillance. More plainly, we are concerned with the always-on cameras and recorders that can capture our likeness and actions with or without our knowledge. Undoubtedly, it’s easy for users to surreptitiously take photographs or record video or audio using a smartwatch or smart glasses. Covert capture of images and videos of sensitive areas, as well as of confidential information, is a very real concern. I recall personal anecdotes from members of my research group whose friends and family members would not allow them to wear Google Glass in the company of others unless it was communicated and agreed upon beforehand. Given the speed with which wearable computers are being adopted across various aspects of our lives, we may soon reach a day when we can no longer ask another user to remove their wearables simply because we don’t feel comfortable being a subject of a potential recording. It would be akin to asking a random smartphone user on the street not to use their phone in the open because you’re uncomfortable with the fact that it’s capable of recording you in the background.

In today’s open society, everyday citizens assume the freedom to express themselves in public domains but also want to be left alone in moments when they are not fond of being monitored. In this vein, critics argue that we live in a time when we are constantly watched, one way or another, and that there’s no escaping this reality. On open streets and public squares, we are under the surveillance of traffic and city-owned cameras. In banks, airports, retailers, and businesses, we are again under the lens of closed-circuit recorders. Each time we use a credit or loyalty card to make a purchase, our activities are documented. Even in the comfort of our own homes and personal workspaces, we are monitored by computers and phones. The websites we browse, the channels we watch, the emails we send, the keystrokes we enter… even the conversations we have in the presence of these devices can be recorded for tracking and data mining purposes.

The rise of wearables gives us the ability to watch and watch back through sousveillance, an inversion of surveillance. Sousveillance denotes bringing monitoring down from high-up architectures––metaphorically and literally––to the human level. In other words, everyday citizens can now become mobile surveillance apparatuses themselves with the help of wearable devices such as Snapchat’s new Spectacles and lifelogging cameras like the Sony Xperia Eye.

These wearables use sensors to detect faces, smiles, and moments of interest. Common narratives for their use revolve around freeing wearers from holding or staring at a screen so they can regain the experience of staying connected with the real world. While the seamless interfaces of these wearables highlight the benefit of not needing to interrupt a moment with glaring hardware, this framing downplays the fact that such devices are now omnipresent and can be used to monitor or spy on others. Steve Mann and his colleagues (2003) argue that this “inverse surveillance” makes for a new system of observation in which “individuals now can invert an organization’s gaze and watch the watchers by collecting data on them” (p. 336). They elaborate:

Wearable computing devices afford possibilities for mobile individuals to take their own sousveillance with them. Given this frequent sociophysical mobility, it makes sense to invent forms of wearable computing to situate research devices on the bodies of the surveilled (customer, taxicab passenger, citizen, etc.). The act of holding a mirror up to society, or the social environment, allows for a transformation of surveillance techniques into sousveillance techniques in order to watch the watchers. (Mann, Nolan, & Wellman, 2003, p. 337)

This transformation, while fearsome to many, creates new sociotechnical dimensions in composition and communication, wherein our bodies become subjects not only of scrutiny and quantification but also of use in monitoring others. Under these circumstances, we use our bodies not just as representations of our messages and meanings, but as agents of vigilance for ourselves and others.

For our class discussion:

  • Is the grand narrative around surveillance mythified by the cyborgian notion of a society? Or is it true to some extent?
  • How do/might everyday citizens “watch back” at their Big Brother?

Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and using wearable computing devices for data collection in surveillance environments. Surveillance & Society, 1(3), 331-355.

Neff, G. & Nafus, D. (2016). Self-Tracking. Cambridge, MA: MIT Press.

Blog 15: Closing thoughts on the IoT

Greengard brings it full circle when he provides a vision of a “day in the life” of a fictional character in the year 2025. I have to admit that I buy most of his imagined technologies in the years to come. I can see myself being “Mary Smith”: waking up to smart pajamas, ordering food from my personal device (smartphone or who-knows-what-else), and making meals based on suggestions from my computer––which has been monitoring my health and dietary behaviors.

It is worth noting that the future of connected technologies and the Internet of Things should not be painted as a purely positive image. Greengard’s last line in Chapter 7 reads, “Only time will eventually reveal these answers and let us know if a connected world really equals to a better world” (p. 189). There are many, many factors to weigh before we accept the IoT as an inevitable future.

One such factor is the meaning we assign to objects and “things” in the Internet of Things. One could consider this from the perspective of object-oriented ontology/philosophy/rhetoric (OOO, OOP, OOR). To pick the one I am more familiar with, object-oriented ontology (ontology meaning the study of being) is the exploration of “the reality, agency, and ‘private lives’ of nonhuman (and nonliving) entities (things, objects)” [read more here].

So, this is akin to asking what your smartphone or smartwatch wants from you. What kind of agency (power, authority, and motivation) does the object assume?

When we rely on these “smart” objects and devices to help us perform our daily tasks, what kind of power are we assigning to them? I have personally experienced feeling *powerless* when my smartphone was lost and I *had* to live my life without smart assistance. I felt like a headless chicken, running around trying to get things done with no clue whether I had completed anything at all.

This begs the question: Who/what is in control? With the degree of power/authority we assign to the devices and things in the IoT, we risk losing some of the control on our end. Or do we?

[Object oriented ontology is] a brand of materialism that goes hand in hand with what you might call posthumanist egalitarianism, or panpsychism: none of the things you can name can be thought of as intrinsically less real, vital, or important than any other—an ecological viewpoint of existence that rejects any idea of human specialness as simple arrogance. (Kerr, 2016)

What’s posthuman about the IoT? Are we taking the human out of the picture because, as Greengard says, “smart systems, dumb people”? Or are humans assuming a different level of humanness now that the IoT can take on our regular daily tasks?

If you haven’t noticed, I have more questions than answers. In closing this book, I would like us to consider human factors (not just ergonomics) and values, and how they (should) manifest in our design of IoT systems and infrastructures. Further, I ask:

  • What kind of relationship should we have with IoT?
  • Could things/objects assume power, identity, and control? In what ways?

Greengard, S. (2015). The Internet of Things. Cambridge, MA: MIT Press. [Introduction, Chapter 7]

Blog 14: Putting IoT and data in context

I’ve enjoyed Greengard’s Chapters 5 and 6 because he highlights the importance of context in technology design and use. For the Internet of Things, Greengard proclaims that many considerations must be made beyond just technical and practical matters. While it is important to pay attention to the features and functionality of the IoT, we also need to consider the human factors (in a communicative sense, not ergonomics).

When considering what data to collect and use, we need to think about the meaning(s) embedded in the data as well as our (or the consumer’s) expectations for the data.

Let’s re-examine the Oral Roberts University Fitbit integration program, in which students are asked to wear a Fitbit tracker while attending classes so the university can monitor their physical movements, travels, and other location-based activity. The premise for this integration might be as simple as reducing the human labor of entering activity data manually (as reported in much of the press coverage of the issue).

However, what is missing from these conversations is the meaning behind monitoring student activity. That is, what does it mean to track student movement––for administrators, faculty members, staff, parents, students, and the public? One might say we are reading too much into the integration program, but it is not without reason that we need to shine a light on the gaps this program has left for its various constituents.

Context matters. The time, place, and other environmental, cultural, historical, and social aspects of data collection should be addressed in conjunction with the people involved in a given IoT or data analytics instance. In the ORU case, the fact that the university prioritizes a Christian worldview and enforces an honor code that deems certain behaviors acceptable or unacceptable needs to be brought into the discussion when designing a data collection program such as the Fitbit integration.

What happens when students are “caught” doing something that’s deemed against the university’s honor code––thanks to the activity data from their Fitbit? When and where are students “off the clock”? What counts as surveillance and what counts as mere data aggregation?

No technology is neutral. When we design a technology or system (including the IoT), we build our image and biases into it. A few days ago, ProPublica published a very interesting article on this––the image of the designers in their own design––that I think is timely for our discussion this week.

Read: When the designer shows up in the design

In the ORU case, again, the developers of the Fitbit integration program, knowingly or not, build their own intentions and motivations for collecting and using student activity data into the program. Whether they are biased or objective, these intentions need to be spelled out, and students should have the right to know them before participating in the program.

For this week, I pose these discussion questions:

  • What is the common narrative around the Internet of Things? What do people think when we talk about IoT? How are these narratives reflective of our public assumptions/bias/motivations for IoT?
  • What steps might we take to ensure transparency and authenticity in IoT infrastructure?

Greengard, S. (2015). The Internet of Things. Cambridge, MA: MIT Press. [Introduction, Chapter 5 & Chapter 6]

Blog 13: Big data… whose data?

In Chapters 3 and 4, Greengard discusses how the Internet of Things is about tapping into big data by giving value to the information collected. He identifies a few components that are crucial to information collection:

  • Sensors
  • GPS and real-time location systems
  • RFID tags

Greengard also talks about the ways big data analytics are going to “revolutionize” different sectors, including the military, medicine, business, and the home.

As I reflect upon the popular narratives about the Internet of Things and how it might affect our lives, I can’t help but play devil’s advocate and ask questions about data ownership and security. We have talked a little in class about the Oral Roberts University (ORU) case, in which incoming freshmen were required to wear a Fitbit to record their physical activity and movement for the purposes of the university’s health and wellness program.

This example allows us to question the ownership of data. From the students’ perspective, we may ask: who owns the activity data generated by the student population at large? The individual students? The faculty members (I sure hope not)? The administrators (some argue so)? Or everyone?

The fact is that current privacy laws in educational settings are not compatible with the technological practices in those settings. FERPA and HIPAA are the two main acts protecting student information from being shared with third parties––including parents, if the student is over 18 and does not give the university consent to share their records. Yet, in the ORU case, the activity data are aggregated; they are not specific to individual students, and thus FERPA and HIPAA would not apply.

That being said, students still have a right to their own records and to knowing how those records are being used by the university administration. If not treated with care and ethics, these (big) data could be used for profit or surveillance. For instance, the administration could use findings from these analytics to determine when students tend to visit the library or the cafeteria, or the amount of activity at various times of day. This information could inform how different university locations adjust their operations to better serve students, or to advertise products and services to students based on their locations.

While it might seem bizarre at first, these are very practical ideas that could be easily implemented as long as the university administration has control of student information. Checks and balances would need to be put in place to ensure the proper and ethical use of student data. Otherwise, we run into problems of misuse, as in the following news story on the ORU Fitbit program––regarding administrators’ possible misuse of the device to spy on students.

The topic of privacy and surveillance still needs a lot of attention. We are on the edge of stepping into a new version of the university, where the business of education is no longer just about knowledge making and dissemination, but also about profiting from the process using connected technologies. Part of the goal of this course is for us to see this oncoming phenomenon and prepare ourselves to respond when necessary.

Questions for discussion:

  • Besides tracking student activity, what else could the Internet of Things infrastructure offer higher education?
  • How might the Internet of Things help improve our interpersonal communication?

Greengard, S. (2015). The Internet of Things. Cambridge, MA: MIT Press. [Introduction, Chapter 3 & Chapter 4]

Blog 12: On the Internet of Things

We wrapped up Richard Toye’s book last week and enter a new text this week: The Internet of Things by Samuel Greengard. So far in this course, we have covered the history and development of the internet, including its materiality, and the foundational principles of rhetoric, which we use to analyze the design and uses of various digital technologies.

In this new unit, we are yet again diving into something we are not very familiar with. The Internet of Things, or IoT for short, has been a buzzword for the past few years, but each time I discuss it with my students and friends, they seem to think it means searching for things on the internet. That’s far from what the IoT is. The IoT is not really a thing at all. It is a concept whereby everyday objects and digital technologies connect and communicate with each other to create an ecosystem. This ecosystem is supposed to minimize human input and interference in data collection and analysis, as well as to automate technological responses based on the data mined.

[Image: Jacada Internet of Things diagram]

In other words, the IoT aims to provide a seamless interaction experience between humans and technologies, such that the labor of using technology is removed and replaced with automation and machine learning. The point, however, is not to take humans entirely out of the picture; it is to create an ecology where humans are served by technologies, rather than the other way around.

To achieve that kind of ecology, we now have smart AI (artificial intelligence) agents or assistants that help mediate between us and the technologies we use. Examples include the Amazon Echo (Alexa) and Google Home. These AI agents serve as coordinators: like personal assistants, they let human users speak to them directly without needing to tend to the individual technologies at home or in the workplace.

Among the questions of interest to rhetoricians regarding these AI intermediaries and the IoT ecology is agency in nonhuman and inanimate objects. For decades, rhetoricians have studied and theorized human agency (control, motivation, authority, authorship, etc.); the shift to considering nonhuman agency means understanding how machines “learn” to make decisions and develop a sense of power and control. How do Alexa and Google Home “decide” to turn certain devices on or off in particular circumstances? How would they handle an ethical dilemma similar to the Trolley Problem (see video below)?

We are at the brink of a new technological society––one in which we must include our machines in the processes of education and civic development. What I am thinking about here goes beyond a utopian or dystopian narrative, or technological determinism. I am concerned about the ways we interact with machines and treat them as counterparts in our lives. There’s much to talk about in this unit over the next few weeks.

For discussions:

  • What are the key concerns for technical communicators and writers when it comes to the internet of things?
  • What kind of literacies do we need to develop given the growth of machine agency and automation?

Greengard, S. (2015). The Internet of Things. Cambridge, MA: MIT Press. [Introduction, Chapter 1 & Chapter 2]