These are the headlines, reports, and stories that took me on this journey. I followed my curiosity from one discovery to the next, unpacking what I found along the way.
For international readers: ABC in this book refers to Australia's national public broadcaster (like the BBC), not the US commercial network. It operates under federal legislation requiring accuracy and editorial independence.
In a world where so much online content is scraped into AI systems without consent, I've focused on material that is meant to be publicly read and shared: news stories, public reports, books and articles.
Where useful, I've included research and reports. Where possible, I've included sources showing how individual-level changes scale into community harm, reflecting my community development lens. The following sources document the journey. All were publicly available at the time of writing, though AI moves fast.
On Claude's "blackmail" test:
On understanding the basics and how LLMs actually work
On AI hallucinations
On the Gemini self-criticism incident:
On Reddit AI persuasion experiments:
On Agentic AI and Community Agency
On breastmilk still being the healthiest way to feed your baby
On microplastics in breast milk:
On the scale of plastic chemicals:
Note on plastic chemical testing data: The claim that most plastic chemicals lack safety data comes from multiple sources. For industrial chemicals broadly, research on the EU REACH system found that around 80% of registered chemicals had not been assessed for safety after more than 10 years (Persson et al., 2022).
For plastics specifically, the 2024 PlastChem Report identified 16,325 chemicals used in plastic production, with more than 10,000 (approximately 66%) lacking sufficient hazard information to determine potential risks (Monclús et al., published in Nature, 2024).
On "microwave-safe" labels:
On domestic violence
On compostable cups:
ABC News (2024, Nov 29) In a sewerage system flush with wet wipes, fatbergs are costing Victorian ratepayers millions of dollars a year
On child labour in cobalt mines:
On why companies can't see through the whole supply chain
On “Don’t just Boycott”
On electric vehicles still being the better choice for environmental impact
On the "Buy Now!" documentary:
On facial recognition in Australian retail:
On Amazon "Just Walk Out" stores and surveillance retail:
On Surveillance Capitalism
In the audiobook I describe Shoshana Zuboff as an “MIT researcher”. This was a mistake. Zuboff is Professor Emerita at Harvard Business School and a Faculty Associate at Harvard Law School’s Berkman Klein Center. She coined the term “surveillance capitalism” and develops it in The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019, Dec).
On vehicle data collection:
On systems getting better at nudging people deeper into debt
On Flushable Wet Wipes
NOTE: In 2022 Australia and New Zealand brought in a voluntary flushable products standard (AS/NZS 5328:2022). It was put together with the water industry so that “flushable” finally meant something testable – the product has to break up quickly in water, move through pipes without causing blockages, and it isn’t allowed to contain plastic if it’s going to call itself flushable. That’s a real improvement on the old “if it fits, it flushes” labelling. The catch is that the standard is voluntary, so not every brand has to follow it, which is why water utilities still tell people to stick to the 3 Ps (pee, poo, paper).
On where biosolids actually go
On Australian AI research (Moreton Bay microplastics, enzyme bacteria):
On Data centres
Note on the “Brisbane on a peak summer day” comparison
The analogy that Meta’s new AI facilities could use “about as much electricity as a city the size of Brisbane on a peak summer day” is based on order-of-magnitude figures, not a formal forecast.
Meta CEO Mark Zuckerberg has described the planned Hyperion AI data centre as providing around 5 GW of power for AI workloads. Powerlink Queensland reports record summer transmission demand for the whole state at about 11,000 MW, with Brisbane as the largest load centre. Five gigawatts is 5,000 MW, roughly half of that statewide record, which places a single AI campus in the same ballpark as the electricity needed to keep a major city like Brisbane running on an extreme-heat day. The comparison in the book is intended to illustrate scale, not to claim an exact one-to-one match.
On dishwashers
At the time of writing (mid-2025) I wasn’t able to find any brand-by-brand comparison of dishwasher microplastic release, or any Australian product labelling/rating system for “microplastic-reducing” dishwashers. The available research and reporting (UQ, ABC) focus on the problem at a system level rather than naming specific brands.
On decision fatigue
On individualisation of responsibility
On local recycling
On externalities
On Plastics Industry quote
PBS Frontline (2020, March 31) Plastics Industry Insiders Reveal the Truth About Recycling
On Symbols
On UBI
On the East India Company:
On Elon Musk and Grok:
On What happens when inconvenient stories get edited out
On Facebook news shutdown in Australia:
On X (Twitter) and the eSafety Commissioner case:
On Starlink
On government contracts and data partnerships:
On No legal privilege in chats
On AI market dynamics and network effects:
On foundation model costs and key players:
On NVIDIA and AI chips:
On AI talent and researcher movement:
On Australia's AI infrastructure:
On text bubbles
On Fast Food and Tobacco
On the Juukan Gorge destruction
On Risk Appetite
On Sam Altman quote – Correction Note: In the original text, I presented what I believed was a direct quote from Sam Altman about AI becoming "extensions of ourselves, seamlessly integrated into our thinking, conversations and daily workflows as personal collaborators rather than mere software." While the concept accurately reflects Altman's stated vision (particularly from his TED2025 talk), the specific wording was a synthesis rather than a direct quote. Sam Altman has described a future where AI becomes deeply integrated into our lives: in his TED2025 conversation, he discussed how models like ChatGPT could soon become extensions of ourselves. TED (2025, Apr) OpenAI's Sam Altman talks ChatGPT, AI agents and superintelligence — live at TED2025. This vision suggests AI won't be a distant tool, but rather something seamlessly integrated into our thinking, conversations, and daily workflows: personal collaborators rather than mere software.
On Codex getting internet access
On Risk with internet access
On Scapegoating
On Microsoft Bing
On AI took your job
On ‘Don’t blame AI, Blame Capitalism’
On Supermarkets and Banks
On the warnings about the need for public input into AI
On Professor Toby Walsh
Audiobooks need to flow naturally when spoken aloud, and sometimes that means losing a bit of nuance. The audiobook phrase "before it's too late" appears without enough context about what Walsh means by "too late."
What's accurate: Professor Walsh isn't saying it's too late to stop AI - he's saying we need public input now, before it's too late to shape how AI develops. He's one of Australia's leading AI researchers, a proponent of AI's potential who also advocates for making sure it benefits everyone. The eBook has been updated to better reflect his position.
On Helen Toner and Sam Altman
The audiobook phrase "over safety concerns" oversimplifies why the board removed Altman.
What's accurate: Helen Toner explained in a May 2024 interview that the decision was about trust and governance, the board couldn't effectively oversee the company because Altman repeatedly wasn't candid with them about major decisions, conflicts of interest, and company processes. While safety was among their concerns, the core issue was governance and Altman's communications with the board.
The eBook now describes Toner as "one of the directors who tried to fire Sam Altman after concerns about how the company was being run and how fast it was pushing ahead."
On warning about AI
On how AI could exacerbate inequality
On AGI and Superintelligence
On United Airlines Video
On Shein and toxic chemicals
On WhatsApp privacy policy
On Bob Geldof, Life Aid and Live Aid
On Ryan’s Rule
On Fixed It
On 23andMe
In 2018, GlaxoSmithKline invested US$300 million in 23andMe as part of a four-year collaboration giving GSK access to de-identified genetic and health data from customers who had opted into research.
On Merging human minds with machines
On ChatGPT-5 Release
On Lateral Violence
On Gaming
On Aspirational Labour
On CapCut
On AI Companions
On AI and mental strength
On decoding AI
On Synthetic data
On amplifying knowledge with AI
On Woolworths Workers
On Duolingo and Audible
On AI Washing
On Services Australia
On AI Job Washing
On double dipping
On Australian labour movement
On dating apps and AI-generated profiles:
On Microsoft's "40 jobs at risk" report:
On Blockchain
On Aboriginal songlines and knowledge systems:
On Windsurf acquisition:
People keep asking me what community development actually means.
Here's the simple version: it's based on the idea that communities already know what they need. My job is to help make that knowledge visible so people can act on it together.
It's showing single mums they can redesign parenting programs so they actually help. It's young people telling councils what they really need. It's communities and developers actually listening to each other and finding solutions that work for everyone.
The core principles? Things like genuine participation - working WITH people, not doing things TO them. Building on strengths rather than focusing on deficits. Creating transparency so everyone can see how decisions get made. And always, always keeping power-sharing at the centre - because real change happens when people have agency over their own lives.
Throughout this book, I've used this same approach - trying to make visible what's been hidden about AI and the systems shaping our world. Just like I'd do in any community project: exposing who holds power, who makes decisions, who carries the burden. Because once we see how personal struggles connect to bigger patterns, everything shifts. We stop carrying burdens that were never ours. Start asking different questions. Start finding each other.
That collaborative discovery? That's community development. Whether it's starting a football club or understanding AI, the process is the same - bring people together, share real information, and watch what emerges when everyone has a seat at the table.
This book is social commentary based on publicly available information and my own experience.
Throughout this book, I’ve talked about real platforms, technologies, and people - including ChatGPT, Claude, TikTok, Replika, CapCut, Elon Musk, Grok, and others. I’ve shared what I’ve seen, what I’ve read, and how I’ve experienced these shifts - as a parent, a community worker, and someone trying to make sense of it all.
These mentions aren’t endorsements or attacks. They’re part of the world we’re living in. Everything I’ve said is based on public information, personal experience, or publicly available statements. This project is completely independent. I wasn’t paid by, sponsored by, or affiliated with any of the companies or people mentioned. And that independence matters to me.
Copyright © 2025 Felicity Hill - All Rights Reserved.