Recently, Channel 10's 'The Project' aired a segment on inTruth Technologies, the company I founded in 2021 to tackle one of the most important challenges of our world today: self-awareness and mental health.
inTruth is the first of its kind: a technology that can track emotion with medical-grade accuracy through consumer-grade wearables.
We build software that restructures the data and translates the emotion. Our tech can integrate with any hardware that has a PPG sensor (most consumer wearables). The Project took a fear-based approach, presenting our work on emotions and AI as potentially invasive.
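The segment did not describe how software of this kind actually works, and inTruth's pipeline is not public, but as a generic, hypothetical illustration: a PPG sensor yields the timing of heartbeats, from which software can derive heart-rate-variability (HRV) features commonly used as physiological inputs to emotion models. A minimal sketch (the function and the interval values are illustrative, not inTruth's code):

```python
import math

def rmssd(ibis_ms):
    """RMSSD: root mean square of successive differences between
    inter-beat intervals, a standard heart-rate-variability metric
    often computed from PPG-derived beat timings."""
    diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical inter-beat intervals (milliseconds) extracted
# from peaks in a wearable's PPG signal.
ibis = [812, 798, 830, 805, 790, 820]
print(round(rmssd(ibis), 1))  # prints 24.4
```

Any wearable exposing raw PPG or beat timestamps could feed a feature like this, which is why the software layer, rather than bespoke hardware, is where the differentiation lies.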
While this angle makes for a dramatic narrative, it misses a crucial point: inTruth was founded to offer solutions to the very real mental health crisis we are experiencing, not to add to it.
Right now, we face unprecedented rates of mental health challenges, including high suicide rates and pervasive feelings of isolation. We urgently need scalable, preventative tools, and emotional insight is key to making meaningful progress on these fronts. inTruth is a frontier in its field.
At inTruth, our mission is to empower people to understand and manage their emotional health.
Our technology is designed to place data ownership firmly in the hands of users, not corporations, fostering a culture where emotional insight is as natural and empowering as breathing.
We are far from a company that will sit, Mr Burns style, behind our dashboard and relish employers surveilling their employees.
Our vision is one of empowerment and freedom, in a world where many currently feel polarised and trapped. This isn't about surveillance or control: it's about creating transparency, fostering self-mastery, and giving people the tools to proactively manage their well-being.
Unfortunately, the segment didn't include the detailed points I made around decentralisation and data sovereignty, core principles that define inTruth's approach. Instead, it featured opinions from "experts" who seemed out of touch with the real potential of this technology and the lengths we go to in protecting user autonomy.
Misrepresentation like this can fuel public fear, which ultimately risks pushing Australia's top talent overseas to environments that are more open to innovation. This "brain drain" is a significant risk we cannot afford, and as an Aussie, I want to see us thrive.
It's also worth challenging the misperception, raised in the segment, that only large institutions can effectively protect data. In reality, it's nimble, purpose-driven startups like ours that are leading the way in decentralisation and ethical data management.
Larger institutions often struggle to implement these principles with agility, while startups are pioneering solutions that prioritise user control and robust privacy safeguards.
With the rapid acceleration of AI, it's clear this technology is here to stay. The question, then, is which companies do we want to support as consumers? Organisations committed to purpose and decentralisation, like inTruth, are the ones building a future worthy of trust.

The inTruth app
Our technology has unparalleled potential to transform lives by providing nuanced insight into emotions, which are often triggered unconsciously every 200 milliseconds and deeply impact our decisions and mental health. Without addressing these patterns, we cannot hope to tackle the broader challenges we face as a society. Emotion drives 80% of all the decisions we make, and it remains largely unconscious to us.
This awareness can heal the considerable divide we see today in global conversations.
So yes, scrutiny is welcome, and I face it daily as a founder at the forefront of this work. I handle objections every single day from media, funds and potential partners, just as all world-changing founders and companies have.
Uber, Spotify and Tesla all found themselves in this very position at first. It's something that must be embraced, not backed down from.
I return to this question: what better alternative do we have to solve this crisis?
Without a path toward emotional maturity and self-regulation, one that up-levels our capacity to handle unprecedented levels of power and intelligence responsibly and mindfully, the AI revolution could lead to a far more dystopian future than a world where emotional insight is understood, normalised and respected.
At inTruth, we are here to meet this need, step by step, and we are optimistic about the future we are building.
And to those who doubt startups' ability to safeguard data simply because the giants have struggled: just watch. In the coming years, purpose-driven innovators will set a new standard in data protection and user trust, one that institutions will struggle to keep up with.
- Nicole Gibson is a multi-award-winning social entrepreneur and the founder and CEO of inTruth Technologies.