If you think it’s bad now, it’s gonna get worse in the coming years.
Reading and analysing what's been happening around us for the past 10-20 years, it's clear that we're in a Nazi Germany situation. And when fascism does arrive, we'll ask "how did that happen?"
The future looks bleak.
Makes me wonder, why the hell is this even happening? I thought people were aware of why such rights were established in the first place. Or do they not teach that anymore / have they stopped teaching that in Western academia?
In my experience, I know all too many people who think the civil rights era, gay marriage, and trans care were all established legally by just voooooooooooting hard. In a way the school system sets people up to not understand their rights, so those rights can be taken away more easily.
capitalist “education”