A massive IT systems outage caused by issues with CrowdStrike software affects banks, airports, supermarkets and media companies across Australia and around the world.
It seems to be CrowdStrike's own agent crashing after the new content update.
We got ours back up via the very manual process of (rough script sketch after the steps):
1 Boot into safe mode.
2 Navigate to C:\windows\system32\drivers\crowdstrike
3 Delete C-00000291*.sys
4 Reboot normally
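For anyone stuck doing this on a pile of machines, here's a rough sketch of step 3 as a script. It assumes you've already booted into safe mode, have admin rights, and have a Python interpreter handy (which plenty of locked-down Windows images won't); the path and file pattern are the ones from the steps above, everything else is illustrative:

```python
# Rough sketch: delete the bad CrowdStrike channel file(s) after booting
# into safe mode. Run elevated. Assumes Python is available on the box.
from pathlib import Path

DRIVER_DIR = Path(r"C:\Windows\System32\drivers\CrowdStrike")

def remove_bad_channel_files() -> int:
    removed = 0
    # Match the offending channel file(s) named in the workaround.
    for f in DRIVER_DIR.glob("C-00000291*.sys"):
        print(f"deleting {f}")
        f.unlink()
        removed += 1
    return removed

if __name__ == "__main__":
    count = remove_bad_channel_files()
    print(f"removed {count} file(s); reboot normally now")
```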
Yeah, CS posted this in a support article. Gonna be fun watching their share price on the Nasdaq overnight.
What’s their ticker? I looked up BSOD but that’s not it…
lol - it should be after this. CRWD…
You looked up Blue Screen of Death’s stock price‽
I mean that’s a fair assumption of what their ticker might’ve been
Maybe a stupid question but why would not reaching an online service (?) blue screen your computer?
It’s the other way around. All those PCs are bluescreening at boot, which prevents fixing them remotely or at scale. Now poor IT guys have to fix every single one by hand.
There's also the case of data going missing from the boot sequence: if it's fetched at boot (think cloud-init) or a key is needed for auth during boot, and the machine can't reach the service that provides it, it can't finish booting. So if you're running thin clients that rely on something like Ansible for provisioning, and the provisioning service is down, that's a critical error.
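A minimal sketch of that failure mode, assuming a hypothetical boot-time step that fetches key material from a provisioning service over HTTP (the URL, timeout, and exit behaviour are all made up for illustration):

```python
# Sketch of a boot step that hard-depends on a remote service.
# The URL and behaviour are hypothetical.
import sys
import urllib.error
import urllib.request

PROVISIONING_URL = "https://provisioning.example.internal/boot-key"

def fetch_boot_key() -> bytes:
    # If the other end is unreachable (say, bluescreened itself),
    # this raises and the caller treats it as fatal.
    with urllib.request.urlopen(PROVISIONING_URL, timeout=10) as resp:
        return resp.read()

def main() -> None:
    try:
        key = fetch_boot_key()
    except (urllib.error.URLError, TimeoutError) as exc:
        # No key, no boot: the thin client stops here with a critical error.
        print(f"CRITICAL: cannot reach provisioning service: {exc}", file=sys.stderr)
        sys.exit(1)
    print(f"got {len(key)} bytes of key material, continuing boot")

if __name__ == "__main__":
    main()
```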
It has a privileged kernel driver running locally - csagent.sys - that was crashing, causing the BSOD.
I guess if the code acted as if it got valid data without checking it, it could get into a very weird state. Or it just fails hard.
At the driver level it’s very easy to kill things.
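A toy sketch of that "trust the data without checking it" failure, in user-space Python rather than kernel C, where a parser assumes a well-formed channel file and blows up on a truncated one (the file format and field layout are invented for illustration):

```python
# Toy illustration of trusting an update/channel file without validating it.
# The "format" here is invented: 4-byte record count, then 8-byte records.
import struct

def parse_channel_file(data: bytes) -> list[tuple[int, int]]:
    # Assumes the header is present and honest - no length checks at all.
    (count,) = struct.unpack_from("<I", data, 0)
    records = []
    for i in range(count):
        # If the file is truncated or the count lies, this raises struct.error.
        # In a kernel driver the equivalent is reading past the buffer,
        # which takes the whole machine down instead of one process.
        records.append(struct.unpack_from("<II", data, 4 + 8 * i))
    return records

if __name__ == "__main__":
    good = struct.pack("<I", 1) + struct.pack("<II", 42, 7)
    print(parse_channel_file(good))          # works: [(42, 7)]
    bad = struct.pack("<I", 1000) + b"\x00"  # header claims far more data
    print(parse_channel_file(bad))           # fails hard: struct.error
```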