I recently read something that gave me pause: the claim that all rights in this country have only ever been granted because white men decided to grant them; that women receive rights only when white men decide to give them; that minorities receive rights only when white men decide to give them; that no one else has the power to grant rights to anyone, at least at the national or state level. And yet, the argument goes, whites are not privileged?
But it was Congress and the state legislatures, acting through constitutional amendments, that extended the vote to Black Americans and to women. See the difference? Today there are more women and people of color in Congress than ever before. It is a logical fallacy to say that because I'm white, I have the power to bestow voting rights on anyone. Clearly, I don't.
Now, that said, many aspects of our society (language, science, math, philosophy, economic theory, law, civics and government, business practices, religion, arts and cultural traditions, and so on) were inherited from the cultures of the Europeans who colonized this continent. That inheritance can naturally create a bias in favor of people raised in those traditions, if only because of their familiarity with them. But calling this "white privilege" is so vague and sweeping that it accomplishes nothing beyond stirring up broad racial tensions.
If you want a meaningful discussion, start by talking about "European cultural heritage and its biases," and then debate those biases one by one.
Yes, this country and its system of government were founded predominantly by men of European descent, but they created a system that is meant to accommodate and represent all of the people it serves. If the system is working properly, those people have a voice in it.
And if, by virtue of my being a white male, you still associate me with the people who founded this Republic and finally gave all people the right to vote, then I guess all I can say is, "You're welcome. Sorry it took so long."