Affiliation: Monash University, Australia
Abstract
In this article, we develop a framework for understanding robotic autonomy as contingent. We do so through an account of a series of online research workshops that asked people to design and test robot behaviours for a public space scenario of their choice, as a means of surfacing and discussing their understandings of robots. We show how, as people manipulated robots in a simulator, they came to understand the capacities and limits of robots in distinctive ways. Thinking with these virtual encounters with robots, we argue that robotic contingency can be understood as dependent on spatial context; as unfolding in encounters between people, robots, and other things, creatures, and substances; and as subject to forms of accountability and responsibility that are ongoingly made in an uncertain world. Our analysis reinforces work on automated infrastructures that understands them as made sense of, and as operating, relationally. This matters because such infrastructures are part of how people understand robotic technologies in an uncertain and processual world, and they shape people's expectations and imaginations of automated technologies, such as robots, into the future. We conclude by speculating on the implications for an open-ended robotics design that works with contingency rather than seeking to control it, and by asking how a more expansive understanding of accountability might be assembled as part of this more emergent approach.
Funder
Monash Data Futures Institute