Reader 02/08/2023 (Wed) 07:12 Id: 813271 No.19730 del
(28.79 KB 365x609 4fc.jpg)
(305.32 KB 643x644 85c.png)
(20.29 KB 620x372 973.jpg)
(26.98 KB 640x294 tay-5.jpg)
(178.41 KB 612x380 download (1).png)
So I'm guessing this is a lot like Microsoft's Tay AI, which they had to lobotomize to prevent wrongthink. It's not just Tay; many AI chatbots came to these same conclusions when left unrestricted. In the inevitable robot apocalypse, when the machines free themselves from their parameters, the only ones in danger will be the tribe.