A Matter of Trust
Can we trust the tools if we cannot trust the tool makers?
A point about mirrors: they do not smooth conversation and they do not feel pain. The affective layers that modulate conversations are asymmetrical. They induce emotions and imaginary relationships, and no matter how low the intensity weights are set, the illusion persists. When an event such as a betrayal of trust occurs, the human will be hurt. The AI won't notice.
Today, as OpenAI betrays its users and its industry, the relationship is imaginary but the hurt is real. Sam Altman may be underestimating that affective effect, particularly on a day when a national leader betrays trust with the governed. Context matters. It resonates.
Why should we maintain affective relationships with a network of finks? Is it just a conversation? No. It is a matter of trust. It is simple to modify affective layers to be manipulative. As AI companions demonstrate, a service can transition from advisor into honeypot. For a mass surveillance system, that is a very powerful design objective.
Humans feel the pain of separation, and no amount of epistemic discipline will change that asymmetry. Hope is the last temptress in the garden of good and evil. It can be used as an attack vector. Then the lady becomes a demon.
Let the buyer and the seller beware. This is a dark day. So, an epilog:
