I’ve enjoyed Greengard’s Chapters 5 and 6 because he highlights the importance of context in the design and use of technology. For the Internet of Things, Greengard argues that there are many considerations beyond technical and practical matters. While it is important to pay attention to the features and functionality of IoT, we also need to consider the human factors (in a communicative sense, not ergonomics).
When considering what data to collect and use, we need to think about the meaning(s) embedded in the data as well as our (or the consumer’s) expectations for the data.
Let’s re-examine the Oral Roberts University Fitbit integration program, in which students are asked to wear a Fitbit tracker while attending classes so the university can monitor their physical activity, movements, and location. The premise for this integration might be as simple as reducing the human labor of entering these activity data manually (as many press reports on the issue suggest).
However, what is missing from these conversations is the meaning behind monitoring student activity. That is, what does it mean to track student movement: for administrators, for faculty members, for staff, for parents, for students, and for the public? One might say we are reading too much into the integration program, but there is good reason to shine a light on the questions the program has left unanswered for its various constituents.
Context matters. The time, place, and other environmental, cultural, historical, and social aspects of data collection should be addressed in conjunction with the people involved in a given IoT or data analytics instance. In the ORU case, the fact that the university prioritizes a Christian worldview and enforces an honor code that deems certain behaviors acceptable or unacceptable needs to be brought into the discussion when designing a data collection program such as the Fitbit integration.
What happens when students are “caught” doing something that’s deemed against the university’s honor code––thanks to the activity data from their Fitbit? When and where are students “off the clock”? What counts as surveillance and what counts as mere data aggregation?
No technology is neutral. When we design a technology or system (including IoT), we build our own image and biases into it. A few days ago, ProPublica published a very interesting article on this very point, the image of designers reflected in their own designs, that I think is timely for our discussion this week.
In the ORU case, again, the developers of the Fitbit integration program, knowingly or not, build into the program their own intentions and motivations for the collection and use of student activity data. Whether those intentions are biased or objective, they need to be spelled out, and students should have the right to know them before participating in the program.
For this week, I pose these discussion questions:
- What is the common narrative around the Internet of Things? What do people think of when we talk about IoT? How are these narratives reflective of our public assumptions, biases, and motivations for IoT?
- What steps might we take to ensure transparency and authenticity in IoT infrastructure?
Greengard, Samuel. *The Internet of Things*. Cambridge, MA: MIT Press. [Introduction, Chapters 5 & 6]