Just wanted to get the sangat's opinion on a subject that's been brewing for a while now but has only hit the mainstream following the recent political and social upheavals in the West.
Are whites who claim they are being oppressed, marginalised, and rendered powerless in their own countries exaggerating, or is there something valid in those claims?
All replies are welcome, but some maturity is preferable. Avoid half-cocked personal prejudices.
My opinion is in the spoiler tag below if anyone is interested. It's a very brief summary that I may expand on later, time permitting.