Been thinking about the curse of knowledge* and how I’m particularly susceptible to it. Case in point: yesterday I was in a roundtable and made some assertions about AI, arguing that the problem isn’t the technology itself but trusting a nascent technology that isn’t fit for purpose or robust enough to rely on. It’s humans using it inappropriately, and we’ve been doing that with media ever since the first person painted on a cave wall. It’s not new to lie, it’s not new to lie in print, and it’s not new that all of us get fooled at least some of the time.
If you couple misuse of technology with the race to be first and not miss out, driven by financial incentives and a lack of care exacerbated by short-term thinking (think “by then, I’ll have cashed out and moved on”), any harms will occur long before anybody is equipped to deal with them. Why be proactive if you’re not even planning to be around in six months?
My audience did not track what I meant. Yes, on a long enough timeline, we eventually figure out what not to do, how to restrain bad actors, etc. My point was that those measures are ineffective in a gold-rush mentality in the short term, and you have to be able to detect bad actors in order to deter them. People under pressure do not stop to think about whether they should, and it’s worse when “there’s no law against” whatever it is they’re doing. Financial incentive overrides what people see as common sense, and you can’t count on the practices of a mature organization at the outset.
Or I’m wrong, but I doubt it.
*“a cognitive bias that occurs when an individual, who is communicating with other individuals, assumes that other individuals have similar background and depth of knowledge to understand”