The Future of Human Computer Interaction

From January to March 2020, I was contracted by RLab to produce a report on The Future of Human Computer Interaction, which "focused on trends affecting hardware and software interfaces, to spread awareness of RLab's expertise in the areas of XR and spatial computing, as well as further our connections within New York City's ecosystem."


INTRODUCTION

Each of my conversations with the interviewees had a beautiful undercurrent of innovation, and I've enjoyed thinking about the future of human-computer interaction. I've organized my initial thoughts around the project's stated focus:

trends affecting hardware and software interfaces, to spread awareness of RLab’s expertise in the areas of XR and spatial computing, as well as further our connections within New York City’s ecosystem

since I want to make sure I’m calling attention to things I learned that could help RLab support education, entrepreneurship and innovation around XR/Spatial Computing technologies.

INNOVATION

As RLab continues to support XR/Spatial Computing innovation in New York City, what are some themes that we might want to tell others to pay attention to?

Truth and disinformation are main concerns for those practicing data analytics and storytelling. John Peters asks, "How do you actually pay for news because everybody expects something online to be free? I think it's our civic duty to be taxed. Just the way our taxes pay for schools, our taxes should pay for the news or at least public information." Media companies experimenting with XR, such as USA Today, must continue to ask themselves whether their business goals erode journalistic integrity. How can a newsroom create useful, impactful, personalized and fast news that makes someone love its brand, while still maintaining truth and journalistic integrity? I fear that pursuing utility, impact, personalization and speed all at once would lead to a deeply unethical journalistic practice. I believe the USA Today team's approach is better suited to a long-form documentary or gamelike experience than to news.

Trust continues to be a theme when speaking about our interaction with systems, whether we're observing the phenomenon of trust between the user and the technology system, or between the user and the business or government institution that created it. Several interviewees expressed a strong distrust of systems, drawing on the long-standing tradition of cybernetic theory. Others spoke about trust from a research perspective. Jeffrey Heer's research into human agency, trust and certainty provokes questions about how over-trusting systems leads humans to become less introspective and to think less critically.

Technologies create imbalances of power. As John Peters notes, "most technologies have just ended up differentially empowering different classes over others." Unsurprisingly, when interviewees spoke about inclusion, it seemed to inspire them to think more optimistically about the future. Our interviewees want a future powered by technologies that include people of all races, genders, socioeconomic backgrounds, and physical abilities and disabilities. Luke DuBois is most interested in using VR for therapy: creating experiences that sensorily transport people with various abilities and disabilities so they feel centered, relaxed, balanced and supported, even in high-stress situations.

Data Governance and Data Rights were themes we heard many interviewees speak about. In cyber physical space, who will have access to which information about which people? How can we delineate boundaries of data archives that create meaningful engagement in local, state, national and global communities? What are our data rights and how can they be protected? Steve Feiner spoke about data rights in public space; he’s concerned about the possibility that tiny seemingly insignificant pieces of data (that have time/location metadata) from multiple devices could get pieced together to spy on someone by corporations or governments. As Ken Perlin put it, “What are the rights and privacy issues around technology when I could know everything about you?”

The pursuit of Cyber Physical Space (a term we heard from interviewees in the Asia-Pacific region for what was previously known as the AR Cloud) was mentioned by many interviewees. Books, TV shows and films have helped create this shared image of the future in cyber physical space, and interviewees have different perspectives on how best to move toward it. A very interesting point we heard from the Hakuhodo team is how Japanese religious beliefs shape their relationship with machines. They imagine many gods in the world around them, and they view XR as a way to inhabit that imaginary world. Their company's mission to create cyber physical space is therefore one they pursue without reserve; they believe other cultures have hang-ups that slow innovation toward this goal. Genevieve Bell, also located in APAC, is instead busy teaching students to ask ethical questions that lead to a more equitable future in cyber physical space.

The environmental impacts of innovation are cause for concern. Mark Parsons thinks efficiency, carbon neutrality, less waste, lower building costs and quicker returns on investment are the big changes we're going to see in the next 10 years. Amy LaMeyer is most excited about the ways XR technologies could help us create less waste, and about how collaboration and telephony tools could lessen carbon emissions from international travel.

EDUCATION

When we talked to folks about how best to approach teaching XR/Spatial Computing, many said the main challenge is mediating, supporting and even productively creating the tension that inevitably arises from bringing together a diverse group of students. Negotiating that tension is critical to supporting inclusive design environments that lead to inclusively designed systems. I believe we heard the most nuanced, actionable directives from Genevieve Bell, Jeffrey Heer and Stephanie Dinkins.

Another consideration for education is how lessons translate into jobs and the job market. Genevieve Bell mentions that graduates of 3Ai have gone on to work for governments and companies. One can only hope that those organizations' structures, and the graduates' roles within them, support asking critical ethics questions and making real change. Stephanie Dinkins creates compelling, provocative work that invites people to ask valuable questions about power, culture and inclusion, yet there is no established pathway for other artists to do the same. She is currently focused on helping create that pathway.

I'll not name interviewees here, but we heard a few professors speak about their distrust of companies and the government. I am sympathetic to many of the opinions they expressed. However, I'm left wondering, "Is it responsible to teach students to distrust the work they'll inevitably end up doing for companies and the government, without giving them a way to make positive change within those institutions or others?" In my ideal world, RLab would connect these people with each other to facilitate conversation around regulation and policy. It would be interesting and important to speak with people currently working in AI ethics and data regulation. RLab could also support artists in creating speculative and critical work, like Stephanie Dinkins's, to continue engaging the wider public in these conversations.

CONCLUSION

There is a need to balance the speed of innovation with asking thoughtful questions about how our inventions might impact individuals, communities and the earth. Just because we can invent something does not mean that we should. How can we educate companies and governments on asking these questions, and on when to discontinue innovation projects that would create more harm than good? How can RLab continue to make money even if we recommend that an innovation project be discontinued? Can innovators create business models that don't rely on shipping products fast? We need regulations to protect our democracy, our individual rights, and our country.
