A University of Wollongong team is developing software to detect brawls and suspicious behaviour around train stations.
The team, from the SMART Infrastructure Facility and led by research fellow Dr Johan Barthelemy, will develop artificial intelligence (AI) that monitors CCTV footage in real time and picks out incidents such as fights, aggressive behaviour or someone being followed.
The project is one of four winners in the Transport for NSW Safety After Dark Innovation Challenge, which aims to develop ways to make women feel safer on public transport.
"We want all our customers to feel safe on the network and it is not good enough that nine out of 10 Australian women experience harassment on the street and modify their behaviour in response," Transport Minister Andrew Constance said.
Dr Barthelemy said the AI would be able to identify an unsafe environment, such as an area with poor lighting.
"What we are going to do is talk with Transport for NSW and Sydney Trains to understand which types of incidents are the most important for us to pick up," Dr Barthelemy said.
"Is it people fighting or is it people following someone else, or is it people behaving strangely on a platform? It could be any of those behaviours. Once we have decided what to detect, we are going to train our software to detect that kind of behaviour."
That training could take a number of forms, including members of the research team staging pretend brawls in the lab.
"We will need to show it some examples of what we want it to look for," he said.
"We may need to make some fake fighting in the lab, or we may need to collect some video or look at historical video footage where an incident happened.
"Then we are going to tell the AI, 'look, there is an incident in that video. That's what you need to look for'."
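The training step Dr Barthelemy describes is standard supervised learning: show the model labelled examples of the behaviour it should flag. A toy sketch of that idea, using hypothetical two-number "motion features" and a nearest-centroid rule standing in for a real deep video model:

```python
# Toy sketch of training a detector on labelled clips. The feature vectors,
# labels and classifier are illustrative assumptions, not the project's code.

def centroid(vectors):
    """Average a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labelled_clips):
    """labelled_clips: list of (feature_vector, label) pairs."""
    by_label = {}
    for features, label in labelled_clips:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(vs) for label, vs in by_label.items()}

def classify(model, features):
    """Return the label whose centroid is nearest to the clip's features."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], features))

# Staged "fights" yield high-motion features; normal footage, low-motion ones.
examples = [([0.9, 0.8], "fight"), ([0.1, 0.2], "normal"), ([0.85, 0.9], "fight")]
model = train(examples)
```

A new clip is then assigned whichever label its features sit closest to; a production system would extract those features from raw video rather than receive them ready-made.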
Once it detects a potential problem, the AI then alerts a human operator who can determine if it is an issue or not - a response the technology will use to improve itself.
"The alert might be 'I found something strange at camera number five, please have a look' and the operator will go and check that video," Dr Barthelemy said.
"If there's something wrong, that operator can react quickly. If there was nothing, if it was a false alarm then the operator can tell the AI it was nothing.
"The AI can keep learning from that feedback."
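The loop described above (alert, human check, feedback) is a form of human-in-the-loop learning. A minimal sketch, with hypothetical names and a toy threshold-adjustment rule standing in for real model updates:

```python
# Hedged sketch of the alert-and-feedback loop. The class name, threshold
# mechanics and alert message are illustrative assumptions only.

class AnomalyAlerter:
    """Flags suspicious camera activity and adapts to operator feedback."""

    def __init__(self, threshold=0.5, step=0.05):
        self.threshold = threshold  # anomaly score needed to raise an alert
        self.step = step            # how far one piece of feedback moves it

    def alert(self, camera_id, score):
        """Return a message for the operator, or None if below threshold."""
        if score >= self.threshold:
            return f"Found something strange at camera {camera_id}, please have a look"
        return None

    def feedback(self, was_real_incident):
        """Operator verdict on the last alert: a confirmed incident makes the
        detector more sensitive, a false alarm makes it less sensitive."""
        if was_real_incident:
            self.threshold = max(0.0, self.threshold - self.step)
        else:
            self.threshold = min(1.0, self.threshold + self.step)

alerter = AnomalyAlerter()
print(alerter.alert(5, 0.7))               # above threshold: operator is paged
alerter.feedback(was_real_incident=False)  # operator reports a false alarm
print(alerter.alert(5, 0.52))              # similar score now stays silent
```

Each false alarm nudges the threshold up, so the system gradually stops paging operators about the kind of footage they keep dismissing.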
Dr Barthelemy said the project would enter a six-month testing phase and, if it worked well, would be rolled out across the public transport network.