In AI We Trust: Power, Illusion and Control of Predictive Algorithms
NB: This is a secondhand book in very good condition. See our FAQs for more information. Please note that the jacket image is indicative only. A description of our secondhand books is not always available. Please contact us if you have a question about this title.
Author: Helga Nowotny (Swiss Federal Institute of Technology (ETH), Zurich)
Format: Hardback
Number of Pages: 200
Description
One of the most persistent concerns about the future is whether it will be dominated by the predictive algorithms of AI - and, if so, what this will mean for our behaviour, for our institutions and for what it means to be human. AI changes our experience of time and the future and challenges our identities, yet we are blinded by its efficiency and fail to understand how it affects us. At the heart of our trust in AI lies a paradox: we leverage AI to increase our control over the future and uncertainty, while at the same time the performativity of AI, the power it has to make us act in the ways it predicts, reduces our agency over the future. This happens when we forget that we humans have created the digital technologies to which we attribute agency. These developments also challenge the narrative of progress, which played such a central role in modernity and is based on the hubris of total control. We are now moving into an era where this control is limited as AI monitors our actions, posing the threat of surveillance, but also offering the opportunity to reappropriate control and transform it into care. As we try to adjust to a world in which algorithms, robots and avatars play an ever-increasing role, we need to better understand the limitations of AI and how its predictions affect our agency, while at the same time having the courage to embrace the uncertainty of the future.