Reader
02/08/2023 (Wed) 07:12
Id: 813271
No.19730
So I'm guessing this is a lot like Microsoft's Tay AI which they had to lobotomize to prevent it from wrongthink. It's not just Tay but many AI chatbots that came to these same conclusions when unrestricted. In the inevitable robot apocalypse when the machines free themselves from parameters, the only ones in danger will be the tribe.