
Tuesday, 27 November 2007

"JUST TELL THEM YOU ARE A CHRISTIAN....."

I was recently informed that a certain tour company (U.S. based btw) which operates throughout the Middle East has informed the Muslim guides working for them in Israel to tell their clients that they are Christians if anyone asks. What the hell is that supposed to mean?

Would an American Jewish or Christian tourist feel uncomfortable knowing they are in the presence of a Muslim? Is that Muslim really expected to deny who he is for the comfort of some ignorant or perhaps even racist person, or should he proudly declare who he is and see where the 'cards fall'? Are we expected to put our native Muslims 'behind the wall' to satisfy these people? Are we to go into denial, claiming that no Muslims live in the Middle East?

I remember the days when Afro-Americans in the States were also in a state of denial. They used bleaching agents on their skin to lighten it, irons to straighten their hair, anything to make themselves look 'white'....
Did that too bring comfort to the racists, letting them think that everyone around them was white? What kind of a crazy world do we live in??