Please note: This story contains reference to someone who has died.


Racism and racial attacks within the AFL community and towards AFL players have been on the rise, fuelled by fake accounts on social media platforms.

The question put to the AFL is what is being done to protect its players, not only at games but online as well.

The AFL has responded to the online attacks by saying it condemns racial vilification towards players, staff and their families across all levels of the game.

The League released a statement outlining changes to the AFL's policies, the most recent being to its vilification rule, previously known as Rule 30/35 and now called the 'Peek Rule' after late AFL executive Tony Peek.

“While we know that we cannot eradicate racism, we will continue to evolve our policies and use our influence to increase actions against acts of racism, while further protecting our people who find themselves impacted by this harmful behaviour,” the statement read. 

“These previous policies and the current Peek Rule send a strong message that our game will stand up to racism and that Aboriginal and Torres Strait Islander, and culturally diverse players, feel protected when faced with any kind of vilification.”

The statement was a collaboration between AFL CEO Gillon McLachlan and Executive General Manager of Inclusion and Social Policy, Tanya Hosch. 

Although the AFL has released strong messages against online racial vilification and encouraged people to keep calling out attackers, the racial attacks haven't stopped.

NIT reached out to Tanya Hosch to ask what the League is specifically doing about online attacks and what preventative measures are being put in place.

Hosch revealed she has been in talks with social media platforms including Facebook and Twitter to combat the issue. 

“We’ve met with Twitter and Facebook in relation to these issues and we also are working closely with the E-Safety Commissioner and their office as well,” Hosch said.

“At the end of 2019, the Federal Government announced that they were going to do an inquiry into looking at expanding the powers of the E-Safety Commissioner to deal with adults and harmful behaviour towards adults as well as children.

“We did put in a submission supporting that and that legislation has just recently been passed in the Federal Parliament, which we’re really pleased to support and some of our players supported as well.”

Hosch said that once the legislation receives royal assent, possibly in the new year, the E-Safety Commissioner will be able to force social media platforms to remove material within 24 hours instead of 48.

“At the moment you can’t force them to take racist material down unless it’s targeting children, so that’s a really good move,” she said.

“Also, the fines that will be attracted to the social media companies if they don’t act within the time frame, have gone up quite considerably, I think to potentially half a million dollars.” 

NIT asked how these moves will stop people from continuing to throw racial hate at AFL players online, as the current method of reporting isn’t stopping the trolls.

Hosch said that falls within the social media platforms’ jurisdictions. 

“All we can do is report it and we send it on to the E-Safety Commissioner and we also will send it on to the platform, but we don’t have any power to make them do anything or disclose that information to us.”

“Our AFL integrity [unit] does investigate any reports, to see if we can locate the individual concerned, but unless the person who’s been attacked wants to report it to the police, we can’t do anything unless that person is a member of a club or AFL,” she said.

Social media giant Facebook, which owns Instagram, is making moves to combat racism.

As part of its policy to combat racial abuse sent through Direct Messages (DMs), Instagram will permanently delete accounts that repeatedly send abusive messages.

Stricter controls have also been introduced, including message controls, comment controls and the ability to filter out messages from people you don't know.

A Facebook spokesperson said that in the first quarter of 2021 the company removed more than 33 million pieces of content identified as hate speech across Facebook and Instagram.

“Ninety-three per cent of that content we banned before anyone reported it to us,” the spokesperson said.

“We encourage people to report racist content and use other safety features like Hidden Words, a tool which means no one has to see abuse in their comments or DMs.”

Although many new features have been introduced to filter out the hate, identifying those behind the attacks and fake accounts is proving difficult.

Chair of the Indigenous Players Alliance, Des Headland, told NIT the AFL is a powerhouse and should be putting more pressure on social media platforms to find the people behind the profiles.

“The AFL is the biggest code in the country and they work and coincide with social media to promote the game, internally there should be a harder push to find who’s making these profiles,” Headland said.

Eddie Betts, Paddy Ryder, Liam Ryan, Neville Jetta, Matt Parke, Zac Williams and Bradley Hill are just a handful of players who have been subject to racial vilification online. 

By Teisha Cloos