Software is the staging ground for the future, affording us the time and space to get our ethics right, before the stakes are raised.
This semester, I took a required class called Leadership & Corporate Accountability. At its core it was a course in ethics. I knew this going in, and felt skeptical about the subject’s usefulness; wouldn’t the answer in every case be obvious? Do the most ethical thing possible.
As it turned out, my skepticism sat askew from reality. When I heard "ethics," I understood it as rules, and rules are things you can follow. Not every rule is right, but rules at least offer a semblance of clarity; because of their visibility, they can be accepted or railed against or worried over.
But, as I learned, ethics and rules aren’t at all synonymous. Ethics provide a framework for making impossible decisions: for talking about consequences, for hacking at the brambles in the foggy thicket where you’ll find yourself. They’re about what you do when the rules don’t exist yet, or when you realize the rules are leading to dreadful, unintended outcomes.
If you try to do anything new, or accept any amount of responsibility for any number of people, you will encounter impossible choices. I believe everyone deserves practice in making these choices; that struggling with impossibility before you encounter it is the best way to prepare yourself for the discomfort you’ll need to one day push through.
Elsewhere in Harris’s essay, he writes that “We could ask our educational institutions to add an ethics curriculum to every engineering program.” Many will read that and groan, but I read that and feel hope. The study of applied ethics is gripping because the answers aren’t obvious. Struggling toward them together before the stakes are raised lets us acclimate to discomfort and uncertainty, and build friendships that will steady us in the future as we navigate through the fog.