Radical simply means ‘grasping things at the root’

All for ourselves, and nothing for other people, seems, in every age of the world, to have been the vile maxim of the masters of mankind.

Computer science brings enormous harms and risks that mostly get ignored from within: the attention economy, mass surveillance, unaccountable AI, facial recognition, killer robots, and imperiled democracy, to list a few of the many problems in this field. The dominant narrative of techno-optimism says that modern technology is not the problem but the solution. This is a deeply harmful culture, one that is not only common but celebrated in the contemporary tech industry.

Kurt said ‘I’ll tell you, Hans. There is something that’s troubling me—and troubling me deeply. … It’s this,’ Kurt said. ‘I can’t shake this crazy feeling that there is some small thing that we’re being lied to about.’

The “Standard Technological Narrative” (STN)

  1. Technology is a tool. It is apolitical and ethically neutral.
  2. Due to technology, things are great and getting better.
  3. Better technology will fix what inferior technology broke.
  4. We will overcome the climate/environmental challenges.
  5. Tech is driven by brilliant individuals, advanced by the marketplace.
  6. We have risen far above animals and are creating a technological utopia.

This is a fantasy, and fundamentally a religious point of view, that attempts to paint the technologist as a savior or hero. In reality, the technologist is usually serving corporations and the elite of society. This narrative de-politicizes and de-moralizes our current crises. Its most extreme form is the TESCREAL bundle of beliefs. As a thought experiment, I invite you to reject the STN and notice the profound impact on your views (What work is worthwhile? What should be taught in the classroom? etc.).

It would be in our best interest to realize that technology, more broadly, embeds values and is never neutral. It rearranges power, and it has tended to disproportionately empower big corporations, tech workers, and the elite. In doing so, it creates significant peril for people and the planet. We must confront this by reinventing computer science to empower ordinary people, to disempower the already powerful, and to reverse the environmental, social, and political peril we have helped create. We need to stop creating new risks.
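To make the claim that code embeds values concrete, here is a minimal, entirely hypothetical scoring routine. The features, weights, and cutoff below are invented for illustration, not taken from any real system, but every one of them is a value judgment about who gets served and who gets screened out.

```python
# A hypothetical applicant-screening "algorithm" (illustrative sketch only).
# Nothing here is neutral: the chosen features, their weights, and the
# cutoff all encode someone's opinion about who deserves approval.

from dataclasses import dataclass


@dataclass
class Applicant:
    income: float          # favors the already well-off
    years_at_address: int  # penalizes renters and the recently displaced
    has_degree: bool       # proxies for class and access to education


def score(a: Applicant) -> float:
    # The weights are design decisions, not facts about the world.
    return (
        0.5 * min(a.income / 100_000, 1.0)
        + 0.3 * min(a.years_at_address / 10, 1.0)
        + 0.2 * (1.0 if a.has_degree else 0.0)
    )


def approve(a: Applicant, cutoff: float = 0.6) -> bool:
    # The cutoff decides how much risk the institution accepts, and from whom.
    return score(a) >= cutoff


if __name__ == "__main__":
    print(approve(Applicant(income=30_000, years_at_address=1, has_degree=False)))   # False
    print(approve(Applicant(income=120_000, years_at_address=12, has_degree=True)))  # True
```

Change any weight and different people clear the bar; the code computes a function, but the function is an opinion.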

Perhaps dismantling such a system, rather than tweaking it, should become more accepted, as should recognizing that some projects ought not to be pursued at all, at least not now.

Radical Suggestions for Technologists

  1. Stop pretending that things are not seriously messed up. It’s disempowering and dishonest.
  2. See the STN for what it is.
  3. Identify the embedded values. They’re often explicit or easily coaxed out.
  4. Stop pretending that CS holds answers it does not.
  5. Don’t try to instill improved characteristics into rotten enterprises.
  6. The first question to ask: should you build the thing at all?
  7. Attend to the primary reason for the thing; follow the money.
  8. Move slow and fix things.
  9. Foreground your employer’s social impact.
  10. Stop the Orwellian doublespeak.

Orwellian Computing & Weaponized Doublespeak: Computer Science Definitions

Could we invent more deceptive language were this the explicit goal?

  • Algorithm: (a) A program to compute some function. (b) An opinion rendered in code.
  • Cloud computing: Putting your data on somebody else’s servers so that it can be stored in an unknown jurisdiction and mined by unknown parties for unknown ends.
  • Crypto: Used to mean cryptography - the art and science of secure communication. Now it refers to a massive Ponzi scheme wrapped in technobabble.
  • Deep learning: Learning devoid of depth, owing to an absence of foundations, domain expertise, and sociopolitical thinking.
  • Smartphone: A phone that is not smart and that pushes its users to be just as stupid; it often barely functions as an actual phone.
  • Social media: Systems designed to sunder social interactions.
  • Artificial Intelligence: An echo chamber used by corporations to disclaim responsibility.
  • Personalization: Behavioral steering that takes your preferences and sells them back to you.
  • Content Moderation: Industrial-scale censorship that’s outsourced to traumatized gig workers and blamed on “the Algorithm.”
  • Engagement: A metric for addiction that doesn’t measure value, only how long you’ve spent trapped in their system (see the sketch after this list).
  • Terms of Service: Unread, unnegotiated, and unenforced except when it benefits the party with the lawyers.
  • User Experience: Design that maintains the illusion that the user is in control while in reality optimizing for business goals.
  • Smart Home: A surveillance grid with nicer lampshades.
  • Biometrics: Passwords you can’t change and governments won’t forget.
  • Machine Ethics: PR departments LARPing as moral philosophers.
  • Big Data: A landfill of uncurated telemetry that executives swear contains “insights.”
  • Smart City: An urban panopticon disguised as infrastructure modernization.
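To ground the “Engagement” entry above, here is a minimal sketch, with invented field names and weights, of the kind of feed-ranking loop such systems are commonly described as running: the objective is predicted attention captured, and nothing in it asks whether that attention was well spent.

```python
# Hypothetical engagement-maximizing feed ranker (illustrative sketch only).
# "Value to the user" appears nowhere in the objective; only predicted
# minutes of attention captured, nudged toward content that provokes.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    predicted_watch_minutes: float  # the model's guess at attention captured
    outrage_score: float            # provocation tends to correlate with time-on-site


def engagement(post: Post) -> float:
    # The metric the business optimizes (weights invented for illustration).
    return post.predicted_watch_minutes * (1.0 + 0.8 * post.outrage_score)


def rank_feed(posts: list[Post]) -> list[Post]:
    # The "personalized" feed is whatever keeps you there longest.
    return sorted(posts, key=engagement, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("calm-explainer", predicted_watch_minutes=3.0, outrage_score=0.1),
        Post("rage-bait", predicted_watch_minutes=2.5, outrage_score=0.9),
    ])
    print([p.post_id for p in feed])  # ['rage-bait', 'calm-explainer']
```

Swapping the objective for anything user-centered would change what ends up at the top of the feed; the choice of metric is the design.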

It’s not better to be optimistic, as that undermines social progress. It obviates

  • the need for broad thinking
  • the recognition of emergency
  • the basis for social-change movements

Whether it’s “better” isn’t the point; there is that annoying matter of honesty. Existential threats motivate by giving predictions of doom the importance that prophecies of bliss lack, even if one is skeptical of the former.

All attempts to adapt our ethical code to our situation in the technological age have failed.

The development of full artificial intelligence could spell the end of the human race.

I do not believe that a moral philosophy can ever be founded on a scientific basis. … The valuation of life and all its nobler expressions can only come out of the soul’s yearning toward its own destiny. Every attempt to reduce ethics to scientific formulas must fail. Of that I am perfectly convinced.

I would teach the world that science is the best way to understand the world, and that for any set of observations, there is only one correct explanation. Also, science is value-free, as it explains the world as it is. Ethical issues arise only when science is applied to technology – from medicine to industry.

  • Radical CS: the source and inspiration of the content on this page
  • Computational Ethics: advances are enabling roles for machines that present novel ethical challenges
  • Algorithmic Bias: systematic and repeatable harmful tendency in a computerized sociotechnical system to create “unfair” outcomes
  • TESCREAL: a bundle of ideologies whose proponents use the threat of human extinction to justify expensive or detrimental projects; critics consider it pervasive in social and academic circles in Silicon Valley centered on artificial intelligence.
  • Radical AI Network: Radical AI exposes how AI rearranges power and dreams up and builds human/AI systems that put power in the hands of the people