💸 Support This Newsletter
If you enjoy Momentum Monday and want to help keep it ad-free and human-first, clicking on a featured ad helps fund the work. Think of it as tossing a coin in the creator jar. 🪙🛰️🌀
🌞 Morning Piece of Mind
Good morning. When tech starts scanning our resumes before anyone scans our humanity, it's time to pause and ask: Are we optimizing our systems while erasing ourselves?
🎁 Members-Only Treasure Chest
Exclusive Download:
📥 Bias-Check Hiring Audit – A 10-minute worksheet to help HR leaders assess and document risk exposure from automated tools—before a lawsuit forces your hand.
🔗 [Unlock Your Exclusive Access] (premium subscribers only)
📌 Quick To-Dos (Fast Wins for the Week Ahead)
✅ Review how AI is used in your hiring funnel.
✅ Talk with your DEIB (Diversity, Equity, Inclusion, and Belonging) partners about possible bias hiding in plain sight.
✅ Book time with legal. Don't wait for the fine print to find you first.
Get Matched With the Best HRIS/ATS Software, for Free!
Researching HR systems shouldn’t feel like a second job.
The old way meant hours of demos, irrelevant product suggestions, and a barrage of cold emails and sales calls.
But there’s a better way.
With SelectSoftware Reviews, spend 15 minutes with an HR software expert and get 2–3 vendor recommendations tailored to your unique needs—no sales pitches, no demos.
SSR’s free HR software matching service helps you cut through the noise and focus only on solutions that truly fit your team’s needs. No guesswork. No fluff. Just insights from real HR experts.
Why HR teams trust SSR:
✅ 100% free service with no sales pressure
✅ 2–3 tailored recommendations from 1,000+ vetted options
✅ Rated 4.9/5 by HR teams and trusted by 15,000+ companies
Skip the old way—find your right HRIS/ATS in a new way, for free!
📊 This Week in Shambles (Work, Tech & Culture)
LinkedIn cuts 281 jobs in California.
Microsoft lays off 6,000 globally, shifting toward AI-heavy roles.
AI-generated resumes are getting flagged (and rejected) by hiring managers.
CEO pay jumps 10% (median $16.3M) while worker pay remains stagnant.
Workday faces collective-action lawsuit for alleged AI bias against applicants over 40—raising real questions about accountability in HR tech.
🖥️ Ctrl+Z: A Quick Fix for Chaos
If your team relies heavily on AI for hiring, try this reset: Review the last 10 rejections manually. Were they fair? Transparent? Human? Don’t let algorithms hide what matters.
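If your ATS can export application data to CSV, this reset doesn't have to be manual record-pulling. Here's a minimal sketch that grabs a random sample of rejections for human review (the file path and column names like `status` and `candidate_id` are hypothetical—swap in whatever your export actually uses):

```python
import csv
import random

def sample_rejections(path, n=10, seed=None):
    """Pull a random sample of rejected applications from an ATS CSV export.

    Assumes a hypothetical export with a 'status' column; adjust the
    column names to match your system's actual schema.
    """
    with open(path, newline="") as f:
        rows = [r for r in csv.DictReader(f) if r.get("status") == "rejected"]
    random.seed(seed)
    return random.sample(rows, min(n, len(rows)))

# Example usage (hypothetical file and columns):
# for row in sample_rejections("ats_export.csv", n=10):
#     print(row["candidate_id"], row.get("rejected_at"), row.get("reason"))
```

Random sampling matters here: cherry-picking recent rejections you remember will bias the review itself.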
🌀 Chaos Theory: Big Idea of the Week
🧠 Proof of Person: Navigating Identity in the Age of AI
The Workday lawsuit didn’t just drop headlines—it dropped a gauntlet. At the core of it is one simple truth: we must design systems that see people, not proxies.
AI can optimize. It can scale. But it can also encode and automate bias if we’re not careful. In this moment, “human-first” can’t just be a philosophy—it has to be a product feature.
📖 Word to Your Motherboard
Word: Humanware
Definition: The empathy, intuition, and lived experience that no machine can replicate—but every great system must make room for.
🛠️ The 1% Hackcident
Boost your "humanware": Block off 15 minutes today for something non-automated. Write. Call. Listen. Human.
🧩 The Missing Piece: Quick Hits & Fun Stuff
🎧 Podcast Rec: The Human Edge – Real convos on what separates us from the bots.
🗳️ Poll: What freaks you out more—an AI writing your review or a synthetic teammate in your next meeting?
🔑 Join The Reset Room: Need a judgment-free zone to talk about work, tech, and change? We’re here.
🔗 [Claim Your Seat]
🧿 Support This Newsletter If you see an ad that interests you, clicking on it helps fund this newsletter and keeps Momentum Monday accessible and independent. Every click counts—thank you for being part of this bold, bias-busting community. 🧬🛸🌐
📣 Share With Someone Who…
...still believes that your humanity is your superpower, not a liability.
🔗 [Forward This Gem]
🎤 Mic Drop: Collective-Action Lawsuit Against Workday: Breaking Down the AI Bias Case That’s Rocking HR Tech
A federal judge just allowed a lawsuit against Workday to move forward as a collective action, and it might change everything about how we build and buy hiring tech.
This isn’t just about one platform. It’s about power, liability, and what happens when the promise of “AI-driven hiring” meets the reality of discrimination law.
Lemme break it down:
🧾 The Lawsuit
In Mobley v. Workday, the plaintiffs argue that Workday’s AI hiring tools disproportionately reject applicants over 40, creating a disparate impact under the Age Discrimination in Employment Act (ADEA).
Judge Rita Lin ruled in May 2025 that the lawsuit can proceed as a nationwide collective action. That means every candidate age 40+ who applied through Workday’s AI-powered systems may now join the case.
Workday said it doesn’t make hiring decisions. The court didn’t buy it.
Workday’s own materials show its software grades, recommends, and in many cases, rejects candidates using algorithmic screening. That’s not a passive tool. That’s gatekeeping.
The judge ruled that even if individual companies used the tools differently, the AI system itself is a common policy—and that’s enough to certify the class.
This is the first time in U.S. history an algorithmic hiring bias case based on age has reached this point.
⚙️ What’s Actually at Stake?
The plaintiffs don’t have to prove that Workday meant to discriminate. That’s not how disparate impact law works.
They just have to prove:
1. The AI disproportionately harms older applicants.
2. That harm stems from specific practices or features in the software.
3. There’s a causal link between the two.
Examples? Rejections issued minutes after a 1:50 AM application. Résumés flagged as low-match even from fully qualified candidates. And algorithmic recommendations that “learn” from past hiring decisions, potentially reinforcing biased patterns.
The claim is that Workday’s tools, intended to increase efficiency, actually automated ageism at scale.
⚖️ Why This Is a Big Deal (Legally)
This case introduces a game-changing theory: that HR tech vendors can be sued as agents of the employer.
That means if your tool screens candidates automatically, and you’ve been delegated hiring tasks like shortlisting or rejecting, you might be liable.
This breaks the usual model, where the vendor sells the tool, the employer uses it, and no one takes accountability if something goes wrong.
If the courts uphold this, it’ll close a massive loophole and force the industry to rethink what it means to “own” algorithmic decisions.
The U.S. Equal Employment Opportunity Commission (EEOC) has already backed the plaintiffs’ position. That alone should make the industry pause.
🧠 Strategy Takeaways for HR, TA, and Product Leaders
This case isn’t just about Workday. It’s about any AI hiring tool used without auditing, transparency, or human oversight.
Here’s what this means for the rest of us:
If you’re in TA: Know how your tools work. Ask your vendors to prove their algorithms don’t disproportionately impact older, disabled, or racially marginalized candidates. Don’t wait for legal to tell you to fix it! This is your credibility moment.
If you’re building HR tech: Document your design decisions. Run real bias audits. Make sure you know where your models might fail and fix it before you’re asked to in court.
If you’re a founder or exec: Assume that “we didn’t know” is no longer a defense. The court made it clear: delegating to AI doesn’t absolve you of responsibility. It multiplies it.
If you’re in DEIB (Diversity, Equity, Inclusion, and Belonging): Use this case as a rallying point. Age isn’t just a footnote in DEIB. It’s a frontline. A system that screens out entire age groups is not inclusive. Period.
🔍 The Bigger Picture
This lawsuit is part of a much larger shift.
Regulators are paying attention. The EEOC, the U.S. Department of Justice (DOJ), and lawmakers are pushing new policies around algorithmic accountability. New York City already requires bias audits for AI in hiring. Illinois has rules on video assessments. And more states are lining up behind transparency requirements.
You can feel the tension: HR tech is racing toward automation, while the legal system is scrambling to catch up. This case is where they collide.
And it’s just the beginning.
💡 Final Thought
AI bias isn’t just a tech problem. It’s a leadership problem.
If your systems make decisions about people, you need to understand—and own—how those decisions are made.
Because here’s the truth:
You don’t just adopt AI. You adopt its impact.
Now’s the time to take a hard look at your hiring stack, your vendors, your data practices, and your values.
If you’re serious about building systems that are smart, fair, and human-first—this case is your wake-up call.
👉 Paid Members: Download the Bias-Check Hiring Audit worksheet now to start protecting your team and your brand.
🔍 Bonus Resource: Introducing AI PoP (Proof of Person)
The First AI Evaluation Tool Built for HR, by HR
🚀 I'm currently building AI PoP (Proof of Person) and will be launching a beta testing program very soon. If you're an HR leader or tech evaluator who wants early access to the system, I’d love to hear from you. Reply to this newsletter or email me if you want to help shape the tool before it goes live.
AI PoP uses the "Hotdog/Not Hotdog™" method to evaluate hiring tools for bias, equity, and transparency—no data science degree required.
✅ 10-minute evaluations, not weeks of consulting
✅ Built for real HR use cases: hiring, promotions, performance reviews
✅ Clear, actionable reports for legal, exec, and product conversations
Why AI PoP is Different:
Human-first by design
Proactive bias detection before the lawsuit hits
Designed with your real-world HR stack in mind
Be one of the first to shape the most human-centered AI assessment framework in HR.
Stop reacting. Start designing.
🪐 One Last Thing
Want to help keep newsletters like this alive? Click on an ad that catches your eye—it fuels this work, one micro-moment at a time. 🌌🔭🧿
Let's build with clarity and purpose—together.
Remember: In a world full of algorithms, your humanity is your highest credential.
Let’s build accordingly. 💜
Your Friend,
Jackye