Review: Invisibility, MOD museum, Adelaide
Disinformation, algorithms, big data, care work, climate change, cultural knowledge: they can all be invisible.
In her New York Times bestseller, Weapons of Math Destruction (2016), subtitled “how big data increases inequality and threatens democracy”, mathematician and data scientist Cathy O’Neil unpacks the elusive algorithms of our everyday lives and how accurate, fair or biased they might be.
Algorithms hide behind the assumed objectivity of maths, but they very much contain the biases, subjective decisions and cultural frameworks of those who design them. With scant detail on how these algorithms are created, O’Neil describes them as “inscrutable black boxes”.
This opacity is intentional.
In one of the upstairs galleries at the spacious MOD, we are greeted in large text as we enter: “what do algorithms think about you?”
Can an algorithm think? we ask. And, if so, what informs the decisions it makes about us?
Biometric Mirror was created by science fiction artist Lucy McRae and computer scientists Niels Wouters and Frank Vetere. They created an algorithm to judge our personalities by asking 33,000 people to look at photographs of faces and come up with possible personality traits.
Can an algorithm tell you who you really are? Topbunk
We don’t see who the photos are of or who is doing the evaluating – and therefore we don’t know what biases might be reproduced.
You are invited to gaze into a mirror which scans your face. From this scan, the algorithm creates a profile of your age, gender and personality, which appears as statistical data overlaid on your reflection.
When I look into the mirror, I am told I am neither trustworthy nor emotionally stable. The algorithm underneath misjudges my age by a few years, and I score highly for intelligence and uncertainty – an unhelpful combination.
Despite my doubts about the algorithm, I notice myself focusing on the more favourable data.
In this context, the data is benign. But facial recognition technology has been used to surveil and monitor activists, and has been responsible for thousands of inaccurate identifications by police in the UK.
Using data to illuminate cultural knowledge
In one of the more impressive works in the exhibition, contemporary data visualisation is used to illustrate Aboriginal forms of knowing and the intrinsic relationship between spatial awareness, Country and kinship.
Ngapulara Ngarngarnyi Wirra (Our Family Tree) is a collaboration between Angie Abdilla from design agency Old Ways, New, Adnyamathanha and Narungga footballer Adam Goodes and contemporary artist Baden Pailthorpe.
In every AFL game Goodes played, his on-field movements were recorded via satellites, which connected with a tracking device in the back of his jumper. Twenty million data points were then fused with data scans of a River Red Gum, or Wirra, to form an impressive data visualisation projected onto two large screens in a darkened gallery.
In Ngapulara Ngarngarnyi Wirra (Our Family Tree), data from Adam Goodes’ football games is returned to Country. Topbunk
Here, Goodes’ data is returned to Country to form part of the roots of the tree as well as the swirling North and South Winds of his ancestors. The data is also translated into sound and amplified, inviting us to listen to what would otherwise be inaudible.
In a small room between the screens – or within the tree – drone footage of the Adnyamathanha Country (Flinders Ranges) plays against the retelling of the creation story in Adnyamathanha language.
What results is the synthesis of traditional Aboriginal knowledge with cutting edge technology, revealing different ways of sensing space and time.
The power of the invisible
While it’s easy to focus on how technology is used and exposed in the works in Invisibility, down the corridors and hanging from the ceiling in MOD are a few other exhibits that flesh out the concept of invisibility.
Women’s Work recognises the leadership of Indigenous women. Topbunk
Women’s Work celebrates the leadership of South Australian Aboriginal women with striking black and white photographs. Tucked away down the hall on the second level is Fostering Ties, a series of images drawing attention to children in foster care.
This exhibition foregrounds invisibility as a way to contend with our own blind spots, knowledge systems, biases and cultural frameworks.
What is invisible to us may not be to those from demographic, cultural or language groups that differ from ours.
Drawing attention to the invisible encourages us to shift our perspective. If we don’t have the answer to solve a problem, maybe another cultural perspective – or life form – does.








