Last year, when Apple launched a credit card, it also made its debut in the technology-bias domain. A wealthy tech entrepreneur (one of Apple's own), co-founder Steve Wozniak, was given a credit limit ten times higher than his wife's on the new Apple Card, even though the couple hold their assets in common. When they complained, the answer was, "It is just the algorithm," Wozniak said. Neither Apple nor Goldman Sachs, the bank backing the card, could explain why.

In layperson's terms, it was the familiar "computer says so" refrain that any company will sing at you after declining your loan application, or something similar. That is THE problem.

Computerized bias is not new, but when it occurs, it is nearly impossible to hold anyone accountable. Last month, MIT Technology Review's senior editor, Antonio Regalado, published a piece listing several technology failures of 2019; Apple's, described above, was one of them.

What's going on here?

This is not just about the Apple Card, or Uber, or the Boeing plane crashes, for that matter. These issues are universal. Whether it is Westpac's recent tech mess or the millions of dollars wasted by one of the Australian telcos (a case study in my book), there is a pattern.

Looking at that pattern leads to only one conclusion: the problems lie not wholly with the technology, but with the people in charge of it.

We often overemphasize technology and its importance in business and life. But tech is not everything; it also takes an appropriate combination of people, process, and timing to achieve the best results.


Unfortunately, several tech giants keep pushing user companies into a danger zone where those companies don't know what they don't know. This push is dangerous, not only from a risk-concentration perspective but also for your own business's sake.

Why is it important?

Technology is becoming more dominant and sophisticated by the day, and understanding it and treating it with care is imperative. The "there is an app for that" mentality has seeped into emerging technologies, causing a significant influx of junk tech into the market and making it ever riskier to choose and use the right tech.

Not having the right technology can severely impact your business's competitiveness and hurt the bottom line, and it also poses security and reputational risks for an organization. On top of that, the general tendency is to blame the tech whenever something goes wrong (see Apple's case above), so the people who design, deploy, and use these technologies drift away from responsibility.

Tech-solutionism has also led to a reckless push to use technology for anything and everything. This thrust drives many companies to take a ready-fire-aim approach, which is not only detrimental to their growth but also dangerous for customers on many levels.

A better approach is to consider suitability and applicability, apply some phronesis (practical wisdom), and do only what is necessary. However, fear of missing out is leading to missteps, creating a substantial intellectual debt that we may never be able to repay.


Why is it necessary now?

In my view, problems should be dealt with at their nascent stage, before they grow beyond our ability to handle them. If we allow ethics and responsibility issues to be baked into the tech fabric, it will be extremely costly and challenging to stay in control going forward.

If we miss the timing, we lose control; it is as simple as that. Unfortunately, we will not know something is off until significant problems occur, and by then it will be too late.

Fortunately, the tech industry, along with several other sectors, is acknowledging the importance of the topic. Everyone now knows why ethical and responsible tech is necessary, but how to make it happen is still a grey area.

How can we address it?

To begin with: by being pragmatic and staying in full control, by applying sanity, and by not obsessing over technology as a means but focusing on the end from the customer's perspective.

 
Effective problem solving

Looking at the issues above from a holistic perspective is critical to tech adoption. It means acknowledging that good problem-solving needs more than just the right solution. To solve problems effectively, you must cover three things: doing the right thing, doing it right, and responsibly addressing potential risks.

Much like the legs of a three-legged stool, all three aspects are equally important. Missing any one of them can produce a solution that is bulky, risky, or outright useless.
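To make the idea concrete, here is a minimal sketch in Python of how the three aspects could be turned into a simple, objective checklist. The aspect names, questions, and scoring are entirely hypothetical illustrations of the principle, not the framework from the book.

# A minimal sketch of a three-aspect assessment checklist.
# The aspects and questions below are illustrative assumptions,
# not a definitive implementation of any published framework.

from dataclasses import dataclass

@dataclass
class Aspect:
    name: str
    questions: list  # yes/no checks for this aspect

    def score(self, answers: dict) -> float:
        """Fraction of this aspect's questions answered 'yes'."""
        passed = sum(1 for q in self.questions if answers.get(q, False))
        return passed / len(self.questions)

# Hypothetical checks for each leg of the "three-legged stool".
ASPECTS = [
    Aspect("Doing the right thing", [
        "Does the solution address a real customer problem?",
        "Is the technology suitable and applicable to the problem?",
    ]),
    Aspect("Doing it right", [
        "Do we have the right people, process, and timing in place?",
        "Can we explain how the system reaches its decisions?",
    ]),
    Aspect("Addressing potential risks", [
        "Have we assessed bias, security, and reputational risks?",
        "Is someone clearly accountable if the system misbehaves?",
    ]),
]

def assess(answers: dict) -> None:
    """Print a per-aspect score; one weak leg flags the whole stool."""
    for aspect in ASPECTS:
        print(f"{aspect.name}: {aspect.score(answers):.0%}")

if __name__ == "__main__":
    # Example: only the first aspect is fully covered.
    assess({
        "Does the solution address a real customer problem?": True,
        "Is the technology suitable and applicable to the problem?": True,
        "Do we have the right people, process, and timing in place?": True,
    })

In this sketch, a low score on any single aspect is a warning sign, regardless of how well the other two score, which mirrors the stool analogy above.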

The point is

A prescriptive framework that covers all three critical problem-solving aspects can bring a sense of discipline to your technology endeavors. It will also enable you to benchmark and assess technology risks objectively.

Taking responsibility for our actions in conceiving, designing, executing, and owning powerful technologies, including AI, is not only ethical but also responsible behavior. Let's embrace it in the right spirit.

Finally, I would like to invite you to read my latest book, Keeping Your AI Under Control, where I cover this topic in greater detail. I would greatly appreciate your feedback.


I also encourage you to try applying the techniques and methods explained in the book at your company and to educate others about them. Remember...

Technology works best for you only if you stay in charge of it!

We certainly need a collective effort here. Let's make technology responsible again!