03/10/2014 01:30 pm ET Updated May 10, 2014

# Racism in 140 Characters

They called him 'paki.' My friend did not let his anger show. He kept walking. He is pretty sure they meant it in a derogatory way.

On Twitter, 'paki' is the second most used racial slur. 'White boy' is the first and 'whitey' is the third, according to a recent study from UK think tank Demos.

The report found that "10,000 tweets containing a racial slur are posted on Twitter every day," and that 70 percent of them are used in a non-derogatory fashion.

Author of the report, Jamie Bartlett, said: "Context is king and it's more or less lost on Twitter."

He is right. Context is important. Rappers use the N-word all the time and few are offended. In fact, within a close circle of friends, nobody usually minds the odd racial slur once in a while. If a friend jokingly comments that you are doing the white boy overbite, you either laugh or -- if you are overly sensitive -- stop dancing.

But Twitter is not the same as chatting on the dance floor, or in the cafe or any other social hangout for that matter. When you tweet there is no eye contact, facial expression or body language to read. There are 140 characters.

So how do you really know if a tweet is subliminally racist? The platform provides unending distance between the tweeter and the reader, making it easy to weave racism, sexism and a whole raft of other 'isms' into the Twittersphere without any consequences.

Racism, however, is not a black-and-white issue -- pun intended. There are varying degrees, and as Bartlett said, "context is king."

In an ordinary social setting, communal etiquette usually acts as a strong enough deterrent against underhanded racism, but that is not the case on Twitter. Social media seldom carries the same rules as offline socializing.

And sometimes, that is exactly why people take to Twitter. The freedom of an unlimited stock of hashtags and handles allows people to be just that little bit more passive-aggressive and say things they would never normally say to someone's face.

I'm not, of course, talking about explicit Twacism -- Twitter racism -- which the courts have dealt with in the past.

Logan Smith started the account @YesYoureRacist, which proves that racism still runs deep in society. He told Britain's national newspaper The Independent: "It's that the people I retweet -- the vast majority of which appear to be teenagers -- genuinely don't understand whether they're being racist."

He's right. Most people don't know when they are being racist. I have stopped counting the number of times someone has begun a conversation with me saying, "I'm not being racist but..." Sometimes what follows is ignorance and other times it is old-fashioned racism.

So should we curb the use of racial slurs on Twitter in an attempt to stem the flow of casual racism?

I am not sure it is a yes or no answer. But I do know that I would find it offensive if someone tweeted at me using the word 'paki.'

The level of offence would depend on what the tweet was actually about. But for some issues, no number of characters will justify the use of a racial slur.