There is something almost ritualistic about contemporary ethical discourse: a certain class of intellectuals periodically gathers to declare new sins. Yesterday it was plastic straws, today it is AI, tomorrow it will be something else—always something visible, nameable, and, crucially, avoidable without fundamentally disturbing their own material life. Meanwhile, the far more entrenched structures of extraction—petrol economies, infrastructural inequality, global supply chains—remain not exactly invisible, but curiously unmoralized. This is not an accident. It is sociology.
At first glance, the comparison between AI usage and driving petrol vehicles appears to be a technical question—carbon emissions, energy consumption, lifecycle costs. But as Pierre Bourdieu would suggest, the real issue lies not in what is objectively harmful but in what is made into a moral spectacle. Ethical consumption is not merely about reducing harm; it is about producing oneself as a certain kind of subject—aware, refined, responsible. The ethical consumer does not simply act; they perform. These performances, however, are not universally accessible. They are bounded by class position. Rejecting AI is feasible if one’s labour is insulated from precarity, automation, or productivity demands; rejecting petrol vehicles is far more difficult where infrastructure is uneven and mobility itself is stratified. Ethics, in this sense, reveals its hidden structure: it condemns most loudly what is easiest for its practitioners to renounce.
A more cynical reading, following Slavoj Žižek, would locate this not in ignorance but in what he calls fetishistic disavowal—the paradoxical stance of knowing and yet acting as if one does not know. Individuals remain fully aware that their lifestyles are embedded in systems of harm, yet they continue to participate in them while compensating through strong moral positions on selective, symbolic issues. One drives a petrol car, uses air conditioning, consumes globally extracted commodities, yet draws a sharp ethical line around AI, fast fashion, or some newly visible domain. The point here is not consistency but psychic management: ethics becomes a mechanism for distributing guilt into manageable, publicly performable acts of refusal, rather than a pathway toward structural transformation.
This selective moralization is further illuminated by Zygmunt Bauman’s concept of adiaphorization—the process by which certain actions are removed from moral evaluation altogether. Driving a petrol vehicle, in much of the world, is not framed as a moral failure but as necessity, infrastructure, the baseline condition of everyday life. AI, by contrast, is new, discursively unstable, and therefore open to ethical scrutiny. It can be debated, condemned, or defended without destabilizing the routines that underpin one’s existence. Thus petrol consumption becomes adiaphoric—beyond moral judgment—while AI becomes intensely ethicalized. This asymmetry is not grounded in objective harm but in the politics of visibility: morality, in practice, follows what can be problematized without inconvenience, not what is most causally significant.
Finally, as Naomi Klein has argued, this entire framework reflects a broader neoliberal displacement, wherein systemic crises are reframed as matters of individual responsibility. Instead of interrogating why energy systems remain dependent on fossil capital or why sustainable infrastructures are unevenly distributed, the question is refracted into personalized dilemmas: should you use AI, should you drive less, should you consume differently? The scale of the crisis is thus miniaturized into lifestyle choices. In this light, Andreas Malm’s intervention becomes crucial: climate catastrophe is not the aggregate result of flawed individual decisions but the outcome of sustained, systemic investment in fossil capital. Ethical consumption, then, functions less as a solution and more as an ideology—it offers the comfort of moral action while leaving the underlying structures of harm fundamentally intact.
