As entrepreneurs, engineers and designers, we launch new ideas into the market all the time. Almost every one involves creating new technology. We build these great things, and all these things need data — data that belongs to people. With that great power, goes the saying, comes great responsibility.
Let’s talk about conscience
If you’re a small and growing company, or even a big one, you may be thinking, “We’re good people! It’s in our DNA! Plus, who has time to consider the finer points of ethics when we have to grow, grow, grow?” At the same time, we read the ever-larger headlines about security breaches, more stories of identity theft, data loss, cover-ups, and tarnished trust. Then add in the effortless weaponization of social engineering and overseas influence. We read about household names, week after week, year after year: Facebook, Yahoo, Target, Equifax, eBay, Uber…all hapless victims? Unlikely. The perception is undeniable:
As a technology community, we are clearly failing.
As technology’s leverage increases, so do the stakes. What if users decide they can’t trust a company that holds their data? Loss of trust is the #1 existential threat in almost any business. Some of these incidents happened at firms that hired whole generations of the best, brightest, most accomplished talent around. Telling people “don’t be evil” isn’t enough; that only covers things they do. Too much breaks because of things they don’t do. The hits keep coming, with bigger names each month…more giants of technology sending people out front to issue the next mea culpa. And that’s the common thread: people.
People are usually the problem…
It’s easy to get caught up in the bits and bytes of each breakdown and miss the fact that people were truly to blame. Very few of the most notorious incidents were true hardcore black-hat hacking. Most often, they were exploits of ordinary, glaring errors, foreseeable lapses, or nonexistent basic controls. Worse, many were covered up, sometimes for years after the fact. These are not the acts of innocent bystanders.
So, take a wider view: people failed in most of these cases, and they failed way back when they designed and built the product. There was often no malice, but at the same time, there was also no conscience. The lesson we’re (re)learning today is: technology has real consequences.
…and people are always the solution.
Passion is the best weapon here. I also think back to the very best examples of conscience I’ve seen from my colleagues: people who refused to do something half-baked, people who staked their jobs on a moral stand, and a couple who lost their jobs but not their principles. So many people do the right thing, but if the lapses continue, users will paint everyone with the same brush.
So what to do? The first step is to walk out of the weeds and really think about it. The best way to do that is to make a list of thought-provoking stuff. I call this list the Developer’s Credo, although it’s meant for everyone who works with technology.
A Fearless Inventory
Things like the Agile Manifesto tell us how to manage software development; Material Design tells us what it should look like; and Design Thinking aims at how to execute on ideas. There are plenty of ‘standards’ to look to when building new businesses and new tools.
But the Developer’s Credo is trying to define terms of engagement between humans and technology, expressed in a punch-list for the people who dream, design and build those products. Other “codes” exist that obliquely address some of the immediate issues, or have standard ‘contract ethics’ boilerplate attached. This list is a bit more ‘on the nose.’
It applies to anybody creating any kind of product, whether at a start-up or an established company: apps, infrastructure, workspace design, intelligent systems, telematics, embedded, IoT, crypto, just to name a few. It also applies very early on: these are conversations you need to have in order to preserve trust later.
So, for your consideration:
Developer’s Credo, Version 0.9
I understand and affirm that:
- systems I design or create will affect the life of a human, in ways both large and small
- regardless of its size, my system is still only one link in a chain, and lax effort on my part will break the entire chain
- by mere virtue of its existence, my system will be a target for hacking, intrusion, and misuse, both today and for many years to come
- I have a personal, ethical responsibility for the performance, safety and security of systems and data that I manage
I resolve to:
- understand every requirement, and understand each comes with risks I must manage
- design my systems to fail gracefully and fail safe in every foreseeable case
- consider privacy a default requirement and have a complete plan to secure data, encryption keys and credentials (a small sketch of this follows the list)
- understand every group that uses my product, strictly manage permissions, and audit access by these groups
- never design systems whose purpose is to harm any person or group
- never design systems that distort, misrepresent or exfiltrate data without clear disclaimer
- update my systems and frameworks regularly, and understand the implications of those changes
- always escalate defects, exploits, and breaches when I find them, with a plan to fix or mitigate them
- document my work to the point that it can be carried on immediately when I am absent
- always act in good faith, to a standard equal to or greater than my contractual and legal obligations, and escalate if they are in conflict
- strive to stay aware of best practice, innovate upon it, give credit where credit is due, and share back to the community whenever possible
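To make a couple of those resolutions concrete, here is a minimal Python sketch of what “privacy as a default” and “fail safe” can look like in practice: credentials live in the environment rather than in the source tree, and a missing credential stops the system instead of letting it limp along on a baked-in default. The variable name DATABASE_URL and the helper get_database_url are assumptions for illustration, not a prescription.

```python
import os


def get_database_url() -> str:
    """Return the database connection string from the environment.

    DATABASE_URL is a hypothetical variable name used only for illustration;
    the point is that secrets are configuration, not source code.
    """
    url = os.environ.get("DATABASE_URL")
    if not url:
        # Fail safe: refuse to start when a credential is missing, rather than
        # silently falling back to a default or a hard-coded secret.
        raise RuntimeError("DATABASE_URL is not set; refusing to start")
    return url


if __name__ == "__main__":
    # Demo value only; in real deployments the environment supplies this.
    os.environ.setdefault("DATABASE_URL", "postgres://example.invalid/demo")
    print("Would connect using:", get_database_url())
```

The specific mechanism matters less than the habit it illustrates: a system that cannot find its secrets should stop loudly and early, not improvise.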
Comments welcome. I would like as much input as possible in making this concise and focused; many of you out there have more insight than I do.
There are likely a very small number of edge cases where it doesn’t work. But in the end, the Credo is intended to get people thinking about the ever-widening effects of design and technology on humans and, specifically, on their attitudes to the New.
One last note: I don’t want to hear ‘that’s not my job.’ It’s literally everyone’s job now. It goes beyond the people writing the code to the folks designing and marketing the product, the UI and UX designers, senior management, interns: everybody who has a say in how the business works or makes a decision during the day.
What do you think?
Frank Sikernitsky
An engineer at heart, Frank Sikernitsky has been a startup Founder, a startup Janitor, a CEO, a CIO, an International Bank Officer, an Author, a Cartoonist, an Editor-In-Chief, and, for a little while, a Morning Drive-Time radio personality.