Inspired by Lamia Youseff: What My CS Freshman Actually Needs in the AI Era

My friend Lamia Youseff wrote a brilliant piece on education in the AI era. As an AI engineer with 25+ years in tech—and a parent with a CS freshman and a midshipman at the Merchant Marine Academy—I took her framework and pressure-tested it against what my sons are actually getting.

My friend Lamia Youseff recently published a piece on LinkedIn that I haven’t been able to stop thinking about: “Dear Parents and Educators: The World Your Child Is Being Prepared for No Longer Exists.”

If you haven’t read it, go read it now. Seriously. I’ll wait.

Lamia brings a rare combination of perspectives to this conversation—a former executive who led multi-billion-dollar AI product engineering at Apple, a founding member of Google Cloud, a Stanford GSB alum, and someone who’s lived across four distinct educational models from Cairo to Cambridge to Palo Alto. When she speaks about what education needs to become, she’s drawing from a depth of experience that most commentators on this topic simply don’t have.

Her core thesis resonates deeply with me: the skills that will matter most in the AI era—critical thinking, emotional intelligence, sound judgment, adaptability, and AI fluency—are not what most schools are optimizing for. She’s not saying technical education doesn’t matter. She’s saying it’s necessary but insufficient, and that liberal arts, case-based learning, experiential education, and even intentional boredom are the missing ingredients.

I’ve been in tech for over 25 years. I’ve watched entire categories of engineering work appear, mature, and get automated away—sometimes within a single decade. I’ve learned the hard way that the skills that kept me relevant weren’t the ones I thought were most important early in my career. The ability to learn quickly, to communicate across disciplines, to exercise judgment when the playbook doesn’t exist—those are the things that compound. And now, as an AI engineer who works with these tools every day—and as a parent with one son studying Computer Science at a UC campus and another at the United States Merchant Marine Academy—I feel an urgent responsibility to make sure I’m giving them the right guidance, not the guidance that would have been right ten years ago.

I wanted to take Lamia’s framework and do something concrete with it: audit what my CS freshman is actually getting, identify the gaps, and build a practical playbook for what he should be doing right now.

Why Lamia’s Framework Hits Home

Lamia makes several points that I want to amplify, because the data backs them up even more strongly than her post had room to explore.

The “10x Engineer” Is Dead—And That’s Okay

In her companion piece on the future of work, Lamia asks a devastating question: “What happens when Claude Code can write 100x the code?” The era of the singular genius programmer who outproduces everyone through sheer coding ability is over. The 2026 CS Playbook from DEV Community confirms this shift: “The graduates who will thrive are not those who memorize syntax or chase hot frameworks. They are the ones who understand systems deeply, use AI as a multiplier rather than a crutch, and build projects that demonstrate thinking instead of mimicry.”

This isn’t cause for panic—it’s cause for reorientation. The value moves up the stack, from writing code to directing and evaluating code. That’s exactly the shift Lamia is describing.

Liberal Arts for STEM Is Not a Detour—It’s an Accelerant

When Lamia describes studying Nietzsche and Descartes alongside her CS coursework at AUC, she’s describing something the research increasingly validates. A February 2026 book review in CBE—Life Sciences Education highlights how liberal arts education develops “critical thinking, emotional intelligence, cultural competence, teamwork, ethical reasoning, and coachability”—what the author calls life skills. Georgia Tech’s Ivan Allen College of Liberal Arts reports that their liberal arts graduates out-earn peers, attributing it to the combination of humanities skills with technological education.

Georgetown’s CEW published The Major Payoff in October 2025, confirming that while STEM median earnings lead at $98,000, the graduates who combine technical and humanistic skills are the ones positioned for leadership roles where the highest earnings actually live.

Lamia nailed it: “When an engineer has read Machiavelli, they think differently about power, systems and what might go wrong.” The data says she’s right.

Experiential Learning Isn’t Optional—It’s Transformational

This is where Lamia’s point about case-based and experiential learning hits closest to home for me, because I’m watching it play out in real time with my two sons.

My older son is a midshipman at the United States Merchant Marine Academy at Kings Point. If you’re not familiar with it, Kings Point is one of the five federal service academies—and by many measures, the most demanding. Here’s why: USMMA’s signature feature is Sea Year, a full year where cadets ship out on commercial vessels and military sealift ships, standing watches, maintaining engineering plants, and navigating real oceans with real cargo and real consequences.

But Sea Year doesn’t extend the program—it compresses it. While other service academies spread their academics across four years, Kings Point cadets must complete the same rigorous academic curriculum in just three years of classroom instruction to make room for their year at sea. That means heavier course loads per semester, fewer elective choices, and an academic pace that’s relentless. Add to that the military regimentation, physical fitness requirements, and the reality that they’re living aboard working ships in some of the most demanding maritime environments on Earth—and you have what the Princeton Review consistently identifies as one of the toughest schools in the country.

The Academy’s motto is “Acta Non Verba”—Deeds, Not Words. Over 80% of the Navy’s Strategic Sealift Officer force are USMMA graduates, and that’s not because of what they learned in a lecture hall. It’s because of what they learned standing watch at 3 AM in the South China Sea.

I’ve watched Sea Year transform my son in ways no classroom could. The judgment, the composure under pressure, the ability to make decisions when the stakes are real and the information is incomplete—these are exactly the skills Lamia is calling for. The National Survey of Student Engagement consistently finds that students in experiential learning programs report significantly higher gains in practical competencies and critical thinking compared to traditional lecture-based peers. A 2024 study in the Journal of Experiential Education found that structured work-integrated learning increases self-efficacy by up to 34% and strengthens students’ ability to transfer knowledge to novel situations.

When Lamia advocates for case-based and experiential learning “at every level” and “starting much earlier,” she’s describing what the service academies have understood for decades. The question is why so few CS programs have caught on.

Pressure-Testing a Real CS Curriculum

So how does my CS freshman’s actual curriculum stack up against Lamia’s framework? I pulled the current curriculum charts and course catalog from his university’s engineering school and ran it through five pillars drawn from her framework.

A caveat before I share the results: university websites and published course catalogs are often playing catch-up with reality. Departments add new courses, revise prerequisites, and launch initiatives at a pace that the registrar’s office can’t always reflect in real time. So what I found represents the published curriculum—the actual on-the-ground experience may be better than what the catalog suggests. Still, if a parent or student is trying to evaluate a program from the outside, the published curriculum is what they have to work with, and the gaps I found are common across CS programs nationwide.

Liberal arts integration: B-. The degree requires general education breadth across humanities and social sciences—good. But there’s no intentional bridge between those courses and the CS curriculum. A student can satisfy their humanities requirements without ever grappling with the ethics of the systems they’re learning to build. Georgia Tech’s deliberate pairing of liberal arts with engineering—which their own research credits for producing graduates who out-earn peers—shows what a more integrated model can look like.

Case-based and experiential learning: C+. Capstone and senior design projects exist, and they’re team-based. But case-based learning—the kind Lamia describes from Stanford GSB, where you step into someone else’s shoes and make decisions under pressure—is essentially absent. The experiential components are concentrated at the end of the degree, when they should arguably start in year one.

Emotional intelligence and personal growth: D. This isn’t a knock on my son’s school specifically—virtually no CS program in the country has figured this out yet. Scott Galloway at NYU Stern has been vocal about this gap: “We’re producing technically skilled graduates who can’t navigate a difficult conversation with a coworker.” Nothing in the degree requirements intentionally develops self-awareness, resilience, or the identity shifts Lamia correctly identifies as essential for the AI era.

AI fluency as basic literacy: B. This is where the program has genuine strength—AI, machine learning, NLP, and deep learning courses are all available, with some even offered at the lower-division level. There’s active AI research and growing industry partnerships. But here’s the gap: AI fluency isn’t required. A CS student can graduate without taking a single AI course. For context, Carnegie Mellon now offers an Agentic AI executive program covering agent architectures and RAG patterns. UC Berkeley is funding faculty to integrate AI tools directly into undergraduate engineering courses. Stanford’s 2025 AI Index Report found that while 87% of CS departments offer ML courses, fewer than 15% require them for graduation. The ingredients are there; they just haven’t been assembled into a requirement.

Intentional boredom: N/A. Lamia’s most unconventional point—that schools should preserve space for unstructured time and slow thinking—isn’t something a curriculum can mandate. That one’s on us as parents.

The overall picture? The building blocks are on the board at most strong CS programs. What’s missing is the connective tissue—a coherent strategy that makes every CS graduate AI-fluent, ethically grounded, and experienced in the kind of messy human judgment that defines great engineering.

The Practical Playbook: What I’m Telling My Son

Since the curriculum won’t do all of this automatically, here’s what I’m actually telling my CS freshman. After 25 years in this industry, I’ve seen enough technology cycles to know what endures and what evaporates. This playbook is informed by Lamia’s framework, validated by the research, and grounded in what I’ve learned the hard way.

1. Treat Your Humanities GEs as Core Training

Don’t just “get them out of the way.” Seek out philosophy, ethics, economics, and writing courses. These are where you build the judgment and reasoning muscles that will differentiate you from an AI that already writes better boilerplate code than most junior engineers.

Lamia said it perfectly: “Liberal arts for STEM majors does not dilute technical training—it widens the aperture.” A February 2026 review in Washington Monthly argues the same: the humanities matter more now, in the AI era, than they ever have. Steve Jobs credited his calligraphy course at Reed College with shaping Apple’s approach to design—a reminder that the most career-defining courses are often the ones that seem least “practical” at the time.

2. Build AI Fluency Now—Don’t Wait for Upper-Division Electives

If your school offers a lower-division ML or AI course, take it early. But also don’t wait for formal coursework to start developing AI fluency:

  • Use GitHub Copilot or Cursor in your assignments—not as a shortcut, but as a learning partner. Watch what the AI generates, critique it, understand why it made certain choices. The core skill Lamia describes—“guiding intelligence, not just creating it”—starts here.
  • Learn prompt engineering through practice. Build a small project with an LLM API. This is 2026’s equivalent of learning to Google effectively in 2005.
  • Understand the agent paradigm. The industry has moved from “AI as autocomplete” to “AI as autonomous agent.” Carnegie Mellon’s new Agentic AI program covers agent architectures and multi-agent systems. You don’t need to enroll there—but reading about how AI coding agents work (Claude Code, Copilot agent mode) and learning to decompose tasks, write clear instructions, and verify autonomous output is a skill that transfers to every engineering role.
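
The habits in that list—decomposing a task, writing clear instructions, verifying autonomous output—can be practiced without any particular tool. Here is a minimal sketch of that loop in Python; the helpers `build_prompt` and `looks_like_json_list` are hypothetical names I'm using for illustration, not part of any vendor's API:

```python
# Illustrative pattern for deliberate prompt engineering: treat the prompt
# as a structured artifact you can review and reuse, and mechanically verify
# the model's output instead of trusting it blindly.
# (build_prompt and looks_like_json_list are hypothetical practice helpers.)

import json

def build_prompt(role: str, task: str, constraints: list[str],
                 examples: list[tuple[str, str]]) -> str:
    """Compose a prompt from explicit parts instead of one ad-hoc string."""
    lines = [f"You are {role}.", f"Task: {task}", "Constraints:"]
    lines += [f"- {c}" for c in constraints]
    for inp, out in examples:
        lines += [f"Example input: {inp}", f"Example output: {out}"]
    return "\n".join(lines)

def looks_like_json_list(model_output: str) -> bool:
    """Check that an LLM's response is the JSON array we asked for."""
    try:
        return isinstance(json.loads(model_output), list)
    except json.JSONDecodeError:
        return False

prompt = build_prompt(
    role="a careful code reviewer",
    task="List the bugs in the snippet as a JSON array of strings.",
    constraints=["Output JSON only", "No prose outside the array"],
    examples=[("x = 1/0", '["division by zero"]')],
)
```

The point isn't the helper itself; it's the habit. Stating constraints explicitly and checking output mechanically is the same loop you run when directing an AI coding agent, just at a smaller scale.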

3. Go Deep on Fundamentals, Not Broad on Frameworks

Frameworks churn. What endures:

  • Data structures and algorithms—not because you’ll hand-write a red-black tree, but because understanding computational complexity is how you evaluate whether an AI-generated solution is actually good.
  • Systems design—how distributed systems work, how databases store and retrieve data, how networks route traffic. AI can generate code for any of these; understanding them is what lets you architect and debug.
  • Computer architecture and operating systems—the lower you go in the stack, the harder it is for AI to replace your judgment.
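
To make the complexity point concrete: both functions below are correct, and an AI assistant could plausibly generate either one. Only the fundamentals tell you that the first degrades quadratically. (This example is mine, not from any cited curriculum.)

```python
# Two correct ways to detect duplicates in a list. An AI assistant could
# generate either; knowing data structures tells you which one scales.

def has_duplicates_quadratic(items: list) -> bool:
    # O(n^2): compares every pair. Fine for 100 items, painful for a million.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items: list) -> bool:
    # O(n) expected time: set membership checks are O(1) on average.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Evaluating an AI-generated patch means asking exactly this question: correct, yes, but at what asymptotic cost?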

4. Build Projects That Demonstrate Thinking, Not Just Output

A GitHub profile full of tutorial follow-alongs tells an employer nothing in 2026. What matters:

  • Build something that solves a real problem. Choosing the right problem is itself a demonstration of judgment.
  • Document your decisions, not just your code. Write READMEs that explain why you chose a particular architecture and what trade-offs you considered.
  • Contribute to open source. The skill isn’t the code—it’s learning to work within existing systems, read other people’s code, and communicate asynchronously.

5. Seek Out the Experiential Learning That the Curriculum Doesn’t Provide

This is the biggest gap. The curriculum won’t hand you case-based learning or emotional intelligence development. You have to go find it:

  • Join team-based projects and clubs. Hackathons, robotics, open-source teams—anything where you coordinate with humans under time pressure. Navigating disagreements, aligning on priorities, delivering together—AI can’t teach you this.
  • Practice explaining technical concepts to non-technical people. If you can explain to your parent why a database migration is risky, you can explain it to a VP. This is one of the highest-leverage skills in engineering.
  • Take on ambiguity. Volunteer for projects where the requirements aren’t clear. This is where judgment grows.

I wish every CS program had something analogous to Sea Year—a period where students work on real systems with real consequences, where the theory meets the ocean (literally or figuratively). And I wish they had to do it under the kind of time pressure Kings Point imposes, where you can’t coast because there simply isn’t room. The closest thing in CS education is an industry co-op or a serious open-source contribution. Seek those out aggressively.

6. Protect Your Boredom

Lamia’s point deserves more than a hand-wave. Creativity and genuine curiosity don’t emerge from constant stimulation:

  • Schedule unstructured time without screens. Walk. Let your brain wander. The ideas that differentiate great engineers from good ones tend to arrive during these gaps.
  • Read books, not threads. Long-form reading builds sustained attention—itself a competitive advantage in an age of infinite distraction.

The Sea Year Lesson

I keep coming back to the contrast between my two sons’ educational experiences. At Kings Point, the Merchant Marine Academy doesn’t just talk about experiential learning—it is experiential learning. Cadets compress a full four-year academic load into three years of classroom time, then spend a full year at sea, standing real watches, maintaining real engines, navigating real weather. They come back fundamentally changed—more composed, more decisive, more capable of functioning when the plan falls apart.

And here’s what makes that compression so revealing: it forces the Academy to be ruthlessly intentional about what it teaches. When you only have three years of classroom time, every course has to earn its place. There’s no room for padding. Every credit hour must deliver real capability. That constraint produces graduates who are—by necessity—both deeply competent and broadly capable.

Compare that to a typical four-year CS program, where there’s enough slack in the schedule for students to drift through without ever being forced to integrate their knowledge under pressure. The time is there, but the intentionality often isn’t.

That transformation is exactly what Lamia is describing when she talks about emotional intelligence, case-based learning, and personal growth. The service academies have known this for a long time: you can’t lecture someone into good judgment. You have to put them in situations where judgment is required and let them develop it through practice.

CS education hasn’t figured this out yet. The capstone project is a start, but it’s a semester at the end of a four-year degree. What if experiential, team-based, ambiguity-rich projects started in the freshman year? What if the equivalent of Sea Year were an industry rotation where students worked on production systems with real users? An arXiv preprint from March 2026 on “Preparing Students for AI-Driven Agile Development” makes exactly this case—proposing project-based AI engineering curricula that integrate agile practices throughout the program, not just at the end.

The Bigger Picture

Lamia closes her post with this: “We’re not preparing students to out-compute machines. We’re preparing them to out-think, out-feel, and out-judge them.”

That’s exactly right. And from where I sit—watching one son learn to direct AI agents in a VS Code terminal and another learn to direct a 600-foot container ship in the Pacific—the through-line is the same. The future belongs to people who can exercise judgment, lead under uncertainty, and bring the deeply human skills that no model can replicate.

As a father who’s spent 25 years in this industry, I’ve watched the “essential” skills list rewrite itself every few years. What never changes is the premium on adaptability, clear thinking, and the ability to work with people. Those are the skills that kept me employed through the dot-com bust, the mobile revolution, the cloud migration wave, and now the AI transformation. I’d bet everything that they’ll be what keeps my sons employed too.

The curriculum at most strong CS programs isn’t there yet—but the building blocks are, and a student who’s intentional about how they use their four years can assemble exactly the education Lamia is describing.

Lamia, thank you for writing what so many parents needed to hear. This post is my attempt to take your framework—which I think is genuinely brilliant—and translate it into a Monday morning plan for one CS freshman navigating the most consequential shift our industry has ever seen.

For any parent reading this with the same question Lamia surfaced—“How am I supposed to prepare my kids for a future I can’t even imagine?”—the answer isn’t to panic. It’s to be intentional. The skills your student needs are buildable right now, this quarter, with the courses and opportunities already available to them.

Start there.