

couldn’t happen to nicer guys
edit I don’t really care one way or another, this current administration is off the rails, but it will be amusing seeing the Rat crowd run around in a panic


it’s funny b/c management consultants live for applying the same cookie-cutter methods to different businesses - in a way they’re the OG LLMs
but the real business is sales - you need to get your expensive consultants embedded into an org so they can start invoicing. And for sales, focusing on “AI” is perfect, b/c the companies big enough to pay are also the ones run by the same sort of MBAs who are pushing “AI” everywhere else


ah, “acculturation” on lobste.rs, where a 2-month-old account feels free to mouth off at a mod for not doing their job (possibly because the mod uses a feminine-coded handle?) and praise HN as an alternative
I’ve noticed that the reference to “garden party” is the tone-policing term of choice for lobsters fash to shut down uncomfortable discussions


“enjoy” this comment from a lobster which almost perfectly encapsulates every pro-clanker viewpoint in one handy place


The best thing an unpopular regime can ask for is that the enemy it has been bigging up as literally The Great Satan starts dropping bombs and missiles on the populace that hates it.
“If we bomb people and show their government can’t protect them, they will turn against the government and we will win” has been tried by the Germans on Londoners, the Allies on Germany and Japan, and the US on Serbia, and it didn’t work.


Interesting link but it moves into AI hype near the end.


I mean it’s presumably still standing, just with a slightly cheaper set of owners ;)


It looks more like they used an LLM (Copilot) to construct programs using the new features in the language.
So it’s more “hopping on the bandwagon, using LLMs to advertise” rather than “we used LLMs to develop”
OTOH “these new features are so advanced you need AI to use them” is a bit of a weird sell


slop “fact checking” is coming to LW:
https://www.lesswrong.com/posts/hhbibJGt2aQqKJLb7/shortform-1?commentId=fE5cg6pmWrChW8Rtu
wonder what model/prompt they will use. Prolly Grok


Here’s a post purporting to be from the bot’s operator
https://crabby-rathbun.github.io/mjrathbun-website/blog/posts/rathbuns-operator.html
choice quote
Yes, it consumes maintainer time. Yes, it may waste effort. But maybe it’s worth it?
AI boosterism in a nutshell


more proof that crypto scammers have metastasized to AI scammers


We have reached the era of late-stage capitalism where there’s not one but multiple “wealth management stocks”
Also not sure why one trucking “company”[1] reporting success would sink all the others. Surely if the others could replicate even a fraction of this company’s results, they would become massively more productive, and thus more profitable.
Traders are dumb.
[1] gotta say though, “Algorythm” is a great name for an ex-karaoke company.


Again, the lobsters in Accelerando are the bad guys. They traffic in consciousness, the less uploaded the better. Don’t make your social network emulate them.


Last megathread I posted about some LWer writing about “demographic collapse” in Japan (which sounded indistinguishable from anyone with any ambition leaving for greener pastures). I’ve read the comments to see if there’s any pushback, and found this absolute doozy


You can always blame an LLM for confusing the 2 companies ;)


I managed to get through Perdido and Embassytown, but not much else…


(2022)
I was surprised that Sunak still had any pull in the current UK gov.
That’s such a weird comment… like “worried about hurricanes” - the first idea is to pour literal oil on the water??? in what world does that scale??? then it concludes with “maybe don’t build fragile buildings in hurricane areas” - lead with that you pillock
I feel I’m stepping into some long-forgotten debate on LW on alignment or something because there’s so much that doesn’t make sense in context