Review: Invisibility, MOD museum, Adelaide
Disinformation, algorithms, big data, care work, climate change, cultural knowledge: they can all be invisible.
In her New York Times bestseller, Weapons of Math Destruction (2016), subtitled “how big data increases inequality and threatens democracy”, mathematician and data scientist Cathy O’Neil unpacks the elusive algorithms of our everyday lives and how accurate, fair or biased they might be.
Algorithms hide behind the assumed objectivity of maths, but they very much contain the biases, subjective decisions and cultural frameworks of those who design them. Because so little detail is shared about how these algorithms are built, O’Neil describes them as “inscrutable black boxes”.
That opacity is intentional.
In one of the upstairs galleries at the spacious MOD, we are greeted in large text as we enter: “what do algorithms think about you?”
Can an algorithm think? we ask. And if so, what informs the decisions it makes about us?
Biometric Mirror is the work of science fiction artist Lucy McRae and computer scientists Niels Wouters and Frank Vetere, who created an algorithm to judge our personalities by asking 33,000 people to look at photographs of faces and come up with possible personality traits.
Can an algorithm tell you who you really are? Topbunk
We don’t see who the photos are of or who is doing the evaluating – and therefore we don’t know what biases might be reproduced.
You are invited to gaze into a mirror which scans your face. From this scan, the algorithm creates a profile of your age, gender and personality, which appears as statistical data overlaid on your reflection.
When I look into the mirror, I am told I am neither trustworthy nor emotionally stable. The algorithm underneath gets my age wrong by a few years, and I score highly for intelligence and uncertainty – an unhelpful combination.
Despite my doubts about the algorithm, I notice myself focusing on the more favourable data.
In this context, the data is benign. But facial recognition technology has been used to surveil and monitor activists, and has been responsible for thousands of inaccurate identifications by police in the UK.
Using data to illuminate cultural knowledge
In one of the more impressive works in the exhibition, contemporary data visualisation is used to illustrate Aboriginal forms of knowing and the intrinsic relationship between spatial awareness, Country and kinship.
Ngapulara Ngarngarnyi Wirra (Our Family Tree) is a collaboration between Angie Abdilla from design agency Old Ways, New, Adnyamathanha and Narungga footballer Adam Goodes and contemporary artist Baden Pailthorpe.
In every AFL game Goodes played, his on-field movements were recorded via satellites connected to a tracking device in the back of his jumper. Twenty million data points were then fused with data scans of a Red River Gum, or Wirra, to form an impressive data visualisation projected onto two large screens in a darkened gallery.
In Ngapulara Ngarngarnyi Wirra (Our Family Tree), data from Adam Goodes’ football games is returned to Country. Topbunk
Here, Goodes’s data is returned to Country to form part of the roots of the tree as well as the swirling North and South Winds of his ancestors. The data is also translated into sound and amplified, inviting us to listen to what would otherwise be inaudible.
In a small room between the screens – or within the tree – drone footage of Adnyamathanha Country (the Flinders Ranges) plays against a retelling of the creation story in the Adnyamathanha language.
What results is a synthesis of traditional Aboriginal knowledge with cutting-edge technology, revealing different ways of sensing space and time.
The power of the invisible
While it’s easy to focus on how technology is used and exposed in the works in Invisibility, down the corridors and hanging from the ceiling in MOD are a few other exhibits that flesh out the concept of invisibility.
Women’s Work recognises the leadership of Indigenous women. Topbunk
Women’s Work celebrates the leadership of South Australian Aboriginal women with striking black and white photographs. Tucked away down the hall on the second level is Fostering Ties, a series of images drawing attention to children in foster care.
This exhibition foregrounds invisibility as a way to contend with our own blind spots, knowledge systems, biases and cultural frameworks.
What is invisible to us may not be invisible to those from demographic, cultural or language groups that differ from ours.
Drawing attention to the invisible encourages us to shift our perspective. If we don’t have the answer to solve a problem, maybe another cultural perspective – or life form – does.

