When John McCarthy coined the term Artificial Intelligence (AI) in 1955, few could have conceived the impact it might have on life in the 21st Century — and on our privacy.
Yet the frontiers of how technology watches us now shift daily.
We have Alexa and Siri listening out for commands (and other conversation) at home, Facebook automatically face-tagging us in photos, and self-driving cars navigating our roads.
Cameras scan our public spaces, our data accumulates on the servers of strangers, and casual retinal scans are no longer merely the stuff of big-budget movies.
But the tension lies between the intrusive nature of some advances and the health, security and convenience benefits offered by machines that watch our every move.
So, what are the secrets your body is sharing about you?
Eye-tracking is not a new technology — the first eye-tracking device was developed in 1908 — but it is about to explode. US tech giants Apple, Google and Facebook have all bought eye-tracking start-ups. Swedish eye-tracking firm Tobii is integrating the technology into modern life, from reinventing computer game experiences in virtual reality to helping market researchers and advertisers shape the placement of online ads and the design of real-world shops.
But the technology has value beyond games and retail and offers new horizons to those living with physical disabilities. Last year Tobii collaborated with Microsoft to adapt eye-tracking technologies to personal computing, and users can now control devices with eye movements alone. The potential for people living with severely limited mobility is nothing short of revolutionary.
Dr Shane Rogers of ECU's School of Arts and Humanities is using Tobii devices to understand human social behaviour by studying the patterns in our personal interactions.
The findings could be used to help detect autism, Parkinson’s, Alzheimer’s and schizophrenia, and potentially ADHD and dyslexia.
“The findings could help in a diagnostic setting, understanding the starting points of normal behaviour and using it to determine disorders or diseases that have symptoms affecting eye patterns,” he says.
Beyond health research, Dr Rogers is also looking at how eye movements might assist in the social and working interactions between humans and robots.
The eyes are not just windows to the soul, in this case, but also a potentially confusing interface for a robot trying to anticipate what the person wants.
With robots now being used in carer-support functions, improving responsiveness is going to be important, particularly for vulnerable patients unable to communicate verbally.
“Humans tend to do a little dance where we look away from a person’s face, then look back, in order to help signal turn-taking to help conversation flow,” Rogers says.
“Understanding behaviours like these will help us develop more intuitive, more productive, robots.”
Humans intuitively understand our dancing eyes, and have a similarly advanced ability to read and recognise each other’s faces.
But the competition from Artificial Intelligence is growing.
At one extreme, Google recently released an app that can compare any face to a catalogue of famous paintings, matching a selfie to its art doppelganger.
Disney has new algorithms which not only recognise expressions on the faces of test audiences but predict how those expressions will change through the film.
Facial recognition is also being used by business to enhance performance, productivity and profit.
The newly-public Amazon Go app uses in-store cameras and sensors to offer shopping with fewer lines, cashiers or hassles, while a Paris business school plans to use facial recognition to determine when online students are paying attention and set quizzes accordingly.
And while Facebook’s use of automatic tagging has raised many privacy concerns, the company that provided Facebook with photo-tagging power is now developing Face2Gene, an app combining facial recognition AI with genomic data to improve the diagnosis of 7,000 rare diseases.
ECU researcher and lecturer Dr Syed Shamsul Islam is using similar technology to develop diagnostic tools for the most remote and disadvantaged of communities.
Islam and his colleagues first combined ECU’s in-house AI facilities with 3D imaging and machine learning to research ways to better diagnose sleep apnoea.
Now the team is working with Princess Margaret Children’s Hospital to develop new screening tools, especially for children in remote communities, to make a diagnosis online via video or photographs.
“The incidence of rare diseases in Aboriginal children is increasing and when they’re located in areas where access to specialists and experts is limited, some kids can go undiagnosed and not receive appropriate treatment,” he says.
While these forms of AI don’t sound like robot tyrants bent on controlling the human race, the pace of advances inevitably raises questions of privacy.
What are the risks of being seen and known by machines? Who gets to decide how your biometric data is used?
Islam, whose background includes using biometrics for security, warns that it is surprisingly easy to fake details about facial features or even fingerprints, making the world of biometric information fraud an untapped opportunity for organised crime and terrorism.
He believes that more unexpected forms of AI identification such as analysing the shape of ears or other parts of the body might be needed to create complex and safer biometric profiles.
But he says the responsibilities and risks of AI need to be taken seriously.
“Artificial Intelligence in many aspects can help people, however, there are challenges and we need to mitigate those as much as possible,” he says.
“New laws and policies will help.”
ECU's Dr Rogers agrees, believing this is an exciting and positive stage in our history.
“In the past AI was developed really rigidly, but thanks to many advances in the last decade, it has become much more flexible and that’s why we’re all getting really excited,” he says.
“It’s a massive step forward to be able to develop Artificial Intelligence in ways that are more like people — AI that is flexible in its thinking. This is something we should all be excited about.”
Working with the Western Australian Academy of Performing Arts' elite ballerinas, Dr Luke Hopper is using advanced motion analysis technology to figure out exactly how tiny movements help ballet dancers keep their balance, even while turning spectacularly on one leg.
A lecturer and director of WAAPA's Dance Research Group, Hopper uses ECU's unique motion capture laboratory at the Mount Lawley campus to collaborate with ECU dance academics, and draw on his own biomechanical expertise to help understand how the best dancers achieve such feats.
His research aims to strengthen the quality of dancers in Australia and gain career-saving insights into the causes of dancing injuries and how they can be prevented.
“Ballet dancers often have to turn three, four or even five times in one pirouette while balancing over one foot,” Hopper says.
“When a ballet dancer does a pirouette it often looks like they’re a spinning top over their leg but in fact they have to be continually adjusting their posture to maintain balance through the turn.”
The motion capture laboratory's world-class analysis can give dancers fresh insight into how they’re moving, which helps improve technique and prevent injury, and helps ballerinas better understand the mechanics of their bodies.
Cassie Tattersall, a student and ballet dancer at WAAPA, credits Hopper and the motion capture laboratory with helping her understand her own body and improve her dancing.
“Being able to perform a movement and then see it through the motion lab screen is so helpful in visualising how I am able to correct myself,” Tattersall says.
“The technology shows how my body is stacking and where my weight is sitting, and from there I’m able to see what physically needs to change to improve the movement.”