Woke culture is evil


Why are we seeing woke culture on the rise? Woke culture has not done anything for the growth and development of the world; instead, it has played a divisive role. It has made people weaker, more resentful and more nihilistic.
I have said it many times: in order to have freedom of speech in the world, it is crucial to have free speech in the West. The radical leftists are not getting it. For them, their feelings are far more important than facing the truth. In the short term you can deceive people, but down the road we will face unprecedented consequences of putting feelings first.

Woke culture has only made people confused. How? When school-going kids talk less about building and creating value in the present for a better future, and when our kids are left confused about their genders, something is fundamentally wrong with the educational system. When school kids are taught to hate their own parents and their country, it is time to throw this woke culture in the garbage. We don’t want our kids to play with their gender identity. It’s not right. It will never be right.

Woke culture has divided people on the basis of skin colour. Believe it or not, the most oppressed class in the West is white. Radical left-wing politicians play the race card because they have nothing else to talk about. They still want you to believe that black people are still suffering. We saw what happened during the radical Black Lives Matter (BLM) campaign: endless violence, endless protests, and tens of billions of dollars lost in property damage. Why? Because our politicians and government officials have gone into woke mode. They shut their eyes when a black man commits murder or violence. Justice must serve everyone equally to preserve, protect and defend truth and democracy.

Woke culture has given rise to communism. If you think that China and Russia are the only communist countries, good luck with that. Everyone witnessed what the West did during the COVID situation. Our politicians forced people to get vaccinated. They forced small businesses to close down. They made COVID vaccine passports mandatory. They fired people who didn’t want to get vaccinated. This is called total dictatorship.

Woke culture has made the lives of ordinary people more expensive. Yes, when our government, woke companies and celebrities promote everything on the basis of climate change, something is not right. How will you protect the climate when you make people poor? How? Survival is far more important than promoting woke climate policies. We want people to survive and thrive, and when people increase their income they automatically start thinking about saving the planet. By the way, the average temperature of the Earth has risen by about 1 degree Celsius over the past 100 years. If we are making the lives of ordinary citizens unnecessarily terrible for the sake of climate change, then I believe we are doing something fundamentally wrong.

In the end, we need to stop this nonsense. We need to stop this stupidity. We want our leaders to talk about competence, building and creating better things instead of bringing racism and woke culture into the world.

Woke culture has made people confused, turned Western leaders into communists, and made life far more expensive for ordinary citizens.
The West needs to wake up!