Another day, another preprint shocked that it’s trivial to make a chatbot spew out undesirable and horrible content. [arXiv] How do you break LLM security with “prompt injection”?…
No, it’s when all the global data centers are built on the right ley lines so that AI Jesus is summoned to earth on the day the planets next align in 2040.
We would have had it this year but those fucks in Texas wouldn’t stop mining crypto.
It’s when you have to get the AI slotted in just right in the printer, otherwise it jams and you have to disassemble the whole thing.