Why do you think white people don't really care about their culture anymore? Do you think it's their own fault?
I think in a large part, it has been in response to "white guilt". Whites are quite an empathetic people, and we tend believe our success means we should care for the world's problems, it's like we have a bit of a "Messiah Complex".
And because we have been in the "majority" for so long, it's actually very engrained in our consciousness, we just cannot wrap our heads around what it would be like to be a minority, and watch our culture disappear. We cannot fathom it.