Well, why do I need to be ChatGPT's expert? Someone's got to check its work. It's great at doing research. It's great at being that assistant and accomplishing tasks that can be time-consuming, but it's not infallible. We can't trust it to that level yet. Maybe someday, but we still need to be the experts. We can't offload that to the machines.
This is Right About Now with Ryan Alford, a Radcast Network production. We are the number one business show on the planet with over 1 million downloads a month. Taking the BS out of business for over 6 years and over 400 episodes. You ready to start snapping necks and cashing checks? Well, it starts right about now.
Right about now. Hello and welcome to Right About Now. We're always talking about what the world has going on right now in the world of business, marketing, personal development. We cover the gamut ultimately, but we're here to help you get ahead. Entrepreneurs, executives, everyone out there listening, we know you have choices. Thank you for being here. New data shows that about 70% of US workers feel unprepared for today's workforce. That raises a bigger question: is the problem education, employers, or how we think about skills altogether? Today I'm joined by Ryan Lufkin, the VP of Global Strategy at Instructure. He has spent years helping companies and institutions rethink how people actually become job ready, and we're getting into what's broken and what needs to change. Ryan, welcome to Right About Now.
Awesome, thanks, Ryan.
Hey, Ryan and Ryan, you got a good start already, man. We're like brothers from another— talk to me, man. Instructure, tell me what that word means.
We're the makers of Canvas, Canvas learning management system. At this point, about half of all the college students in North America and about a third of all K-12 students in North America use Canvas on a daily basis in learning. We're all about instruction. Instructure is the name of the company. It's funny, you wear a Canvas shirt to Home Depot and people are like, oh, you give me flashbacks to college. You wear an Instructure shirt, not as many people know. I always like to explain that connection.
I did recognize that in our notes and that name. I was twitching a little bit. I'm not sure.
Well, it's funny, you can tell how good their educators were at designing courses in Canvas if they were like, oh, I love Canvas, or, oh, I hate Canvas. You're like, yeah, the teacher wasn't using it right.
Yeah. Yes. That's your point of view, I guess. It's probably how literate they were with technology.
And a lot of times it's how much support they had and what a good online course or a good hybrid course looks like. I think too often, especially at the beginning of COVID, when our solution grew rapidly because people had to move online over the course of about 2 days, you could tell the educators that were given a lot of support and a lot of resources and really trained on what good looked like, and then the ones that were kind of thrown into it without that support. It really has improved over the last 5 years, that level of online learning and course design.
There's some meta talk there, Ryan. The meta of the fact that the teachers weren't prepared to use the software appropriately, and we're talking about how workers in general feel unprepared.
You talk about this pace of change that started 6 years ago, March of 2020, and that pace of change hasn't slowed. Just when everybody thought, oh, COVID's over, we can take a breath, November 30th, 2022, OpenAI launched ChatGPT, and everything got turned on its head again. The 3 years since have been just an incredibly fast-paced evolution of how do we use this technology? I have a chart, it's almost the 7 stages of grief going through all of that. We're really in the acceptance phase with AI now: how are we applying it appropriately?
70% of workers feel unprepared. What does that really mean?
Well, you've got this kind of schism between education, which in many ways is still really stuck on the academic integrity idea, this idea that AI is just for cheating, versus businesses that are trying to figure out how to optimize their jobs with these tools and in many cases don't feel like universities are preparing graduates for that. It's even worse in K-12. When you look at K-12, there's a kind of anti-technology-in-education movement going on that ignores some of the more glaring benefits of using technology to reach people who have accessibility challenges, rural students, and frankly students who are just missing school, things like that. We've got to get everybody on the same page. Google announced a really great program trying to provide free AI education to 6 million educators across the United States. We are getting ready to release some courses around AI literacy. And even the detractors, those educators that are scared of AI or have doubled down and said we're not using AI: honestly, take an AI literacy course. At least understand how these tools work, then you understand what they're capable of, and then you can actually make a more informed decision about how deeply you want to use these tools. It puts you in a better position to help your students understand how to use them ethically, how to use them effectively, things like that.
If we're going to learn something for the sake of learning and for teaching ourselves how to problem solve, are these the problems that we should be solving? Do we need to learn calculus? Do we need to learn things that AI can do and will foreseeably be around to do, unless we all go back to the old days because power goes away or something, or Wi-Fi goes away? Are we really teaching what we should be teaching? And why do we still need to learn the things we learned 30 years ago?
I will point to what was called the strawberry conundrum. With an earlier version of ChatGPT, if you asked it how many Rs were in the word strawberry, it would tell you 2. And you'd say, go back and maybe look at that again and tell me how many are there. And it would say, there are 2 Rs in strawberry. Of course there are 3 Rs in strawberry. And they could not figure out why, because essentially large language models are a black box; you don't know what's going on inside there. They could not figure out why it hung up on that. And it wasn't until the next model came out that they fixed it. That's the reason that we all need to be experts. ChatGPT has a high propensity for what we would call hallucination. More often it's confidently incorrect. What it's trying to do is give you what you've asked it for. In some cases it makes things up. It'll make up links, sources, it'll make up whole sets of information because it's just trying to give you what you want. And if it doesn't find it or doesn't spend the time to find it, it just kind of makes it up. We don't know exactly why, but when we are the experts, we can double-check that work and say, oh, you know what, that's not right.
Let's go back and fix it. Let's not perpetuate that strawberry conundrum. What's really scary is that next generation that takes that approach that you're talking about and says, well, why do I need to be ChatGPT's expert? Someone's got to check its work. It's great at doing research. It's great at being that assistant and accomplishing tasks that can be time-consuming, but it's not infallible. We can't trust it to that level yet. Maybe someday, but we still need to be the experts. We can't offload that to the machines.
So there's one thing I've learned running multiple businesses: it's how fast things fall apart when communication gets messy. Missed calls, scattered text threads, team members not seeing the same conversation: that stuff quietly costs you time and revenue. That's why today's episode is brought to you by Quo, spelled Q-U-O, the smarter way to run your business communications. We realized that at my new card shop, Collector Station. Between customers calling about card inventory, grading submissions, trade offers, and even trade nights, things were moving fast, and when communication wasn't centralized, things got messy quick. I also like that it works wherever I am. Phone, laptop, doesn't matter. I kept my existing number, added teammates in just minutes, and everything lives in one clean view. And their AI automatically logs calls and pulls out next steps. Make this the season where no opportunity and no customer slips away. Try Quo for free, plus get 20% off your first 6 months when you go to quo.com/ryan. That's quo.com/ryan. That's Q-U-O dot com slash Ryan. Quo: no missed calls, no missed opportunities. Even things as simple as math are pretty black and white. When I think of what you just described, and I agree with it, I tell people there still will be jobs because of what you said, the discernment and humanity that have to be overlaid on top of these things.
There's not only just that, along with certain factual applications in the real world. But when doing 10 times 10, or a² plus b² equals c², even at advanced levels, that's fairly black and white. Are you saying that math is done wrong by ChatGPT?
I think it's the reasoning behind it. One of the things that's really important about math isn't just getting the answer, it's actually having the logic and reasoning behind how we get from here to there. There's a whole Bloom's Taxonomy. If you're in education, you understand what Bloom's Taxonomy is. It's this skill tree. There's a new Bloom's Taxonomy that actually says, okay, these are the skills that are going to be easily replaced by AI, and these are the skills that aren't. These are the human skills, and they're problem solving, creative thinking, communication, consensus building, things like that. And that's why, to your point, we will always have jobs, we will always have the ability, because we need to be able to connect with other humans while we let AI do the more mundane tasks. And in some cases, yeah, it might get math wrong; it might get very convoluted. The idea of an accountant giving up control of their books, or not understanding the formulas and letting them run without being able to double-check that, is scary. And not every job needs advanced math. If we're in marketing, I don't use advanced math on a daily basis, but there are occasions when it's important to have that reasoning, that logic.
If math is still fairly critical learning and important to keep in the curriculum, what are the things that should be changing then? I don't want to pretend that what we were doing 30 years ago— but I do hear some of the subjects and some of the stuff and I'm like, that's the same thing. Someone with a higher pay grade, smarter than me, knows if that's what they need to be learning. But I'm like, that is not really applicable. When does K-12 become more preparatory for the real world versus the standard topics that we've always taught because it was important 30, 40, 50 years ago that we did that?
That's where the tension, especially in K-12, comes from right now. There's this kind of undercurrent of anti-technology going on, but then our STEM scores are going in the toilet. How do we address STEM, science, technology, engineering, math? How do we do that if we're not going to actually leverage technology in the teaching of those things? And meanwhile, you've got a group that's pushing that we should be teaching cursive again, asking why we aren't teaching cursive in elementary school. But no one writes long-form letters anymore, like during the Civil War when they would pen those poetic letters. We don't do that anymore. There's been a lot of news articles on that lately. Who's pushing that and why? How does that benefit us? There's a really great story. I was at a California system meeting and we had a panel of educators. One was a professor of product design, and he said, okay, we used to spend the first 60% of a course designing a product, doing specs, doing market analysis, doing all this stuff, and the last 40% kind of analyzing the work, presenting to each other, things like that. Now with AI, we can do all of that within the first 10% of the course.
Come up with an idea, have AI help us do a product design, have it do the research around that for us. Now he makes his students actually reach out and contact experts in the market, set up a meeting, and present to them this idea and get their feedback. And I always say, if I call my daughter, she won't answer the phone, she'll text me right back. She's a junior in college, she does not answer the phone, she doesn't pick up the phone. He's pushing those students to lean back into those more human skills. Reaching out, making a connection, setting up a presentation, presenting your thing, collecting feedback— those things that AI can't easily do for us. That's really the focus. It's kind of that evolution in teaching. We're not necessarily teaching entirely new things, but we're doing it in more innovative ways that lean into those human skills and not just assigning 20-page papers that kids are going to use AI to write.
Are schools teaching the wrong things or just teaching them the wrong way?
It's the wrong way, really. It comes down to assessment of mastery. How do we actually know whether or not somebody has learned that skill? How do we make sure it sticks in their brain? It isn't just, I memorized these data points, I took the quiz, I dumped those memory points out of my brain, and then I move on to the next thing. How do we really teach skills? That's the biggest challenge right now: educators who are already underpaid and underappreciated and have classes that are too big are being asked to evolve how they're teaching, fundamentally redesign their courses, and then measure those outcomes in truly significant ways that can be reported on to the Department of Labor and things like that, just like that evolution of teaching online and with technology during COVID. We need to address this the same way. We need to provide educators the resources they need to evolve how they're teaching with AI, with preparing those students, because it's not going away. And those educators that think, I can opt out (we've had students say, I want to opt out), you're not going to be able to opt out of AI.
There's a Waymo that just drove past people's houses. They don't get to say, you don't get to drive past here. Nope, that automated car is driving down the street. You're going to be involved with AI at some level, and understanding it, embracing it, understanding the ethical aspects of it is really important.
Talking with Ryan Lufkin, the VP of Global Strategy at Instructure. Ryan, you kind of teed this up. You're teeing all my notes up perfectly, by the way. I really like this. You're an excellent guest. Skills versus degrees. It seems like we're in a skills-based economy versus a degree-based economy, and I don't know if degrees hold the same value they used to. You can prove me wrong. What skills actually matter most right now?
We've talked about those human skills and how important they are. I will say I'm still a big believer that the degree, the associate degree, the bachelor's degree, the master's degree, is still the preferred currency in hiring. And there's a reason for that, because they're not just about the hard skills generally, the math. They include the soft skills: working with groups, being empathetic, building consensus, communication, all those things that are incredibly important, those human skills that we're talking about. They embody all of those. A year or two ago, you saw a lot of businesses saying they were taking out a degree as a requirement for hiring. And what you saw is a lot of them slowly reintroduced that, because it really is the best barometer of whether or not a person is prepared to enter the modern workforce. That said, because of this pace of change, we live in a world where we all need to be lifelong learners, every single one of us. It used to be that we would go to school for 12 years or 16 years or 20 years, then we would go to a job, often the same job, for 30 years, then we would retire with our pension.
I mean, that was our parents' generation. That doesn't exist anymore. The average tenure at a job is about 3 years. People are saying, well, we're going to work until we die. There are no pensions anymore. It's incumbent on us to prepare ourselves to be upwardly mobile within the job market by constantly training new skills. And in most cases, employers aren't owning that. The college and university system still owns that upskilling and reskilling of adult learners. And they really have done an amazing job, post-COVID especially, of rolling out new credential programs, because you can actually roll out a credential program, a certificate program, without all of the regulatory requirements around accreditation that a standard college program needs. In many cases, it's the same or similar content repackaged in more bite-sized chunks. So you can go and take a 3 or 4 course certificate, and then that certificate I can post to LinkedIn, I can show an employer: look, if you look at my LinkedIn, I've got a certificate in data-driven marketing from eCornell. I did an AI regulation and compliance course from Oxford's Saïd Business School. I'm doing my master's right now with Arizona State University, because I enjoy learning.
It's one of those things, but it's also the way that you qualify yourself for being upwardly mobile in a job. That's one of the things that we've shifted towards, and it's not necessarily breaking it down to the component skills. We haven't all agreed on a skills taxonomy that makes sense to everyone, and a lot of the old skills taxonomies are really outdated. They don't capture a lot of the more modern skills that we need. We're not quite there yet, but getting those certificates helps demonstrate for employers that I'm a learner and I want to evolve. We're getting there pretty quickly.
Hey guys, if you've ever built a website before, you know how quickly it can turn into a time suck. Recently I've been playing around with Wix's new hybrid editor called Wix Harmony. You basically start by telling it what you're trying to build. You prompt it to generate a professional-grade site just like you want it. And here's the part I like: you can easily go back and forth between AI and hands-on editing whenever you want. The AI agent Aria is an expert in website design and business. She can answer questions or perform direct actions throughout the process, which has been huge for me when I'm trying to perfect the look of my website. They've also got built-in tools for selling, bookings, and marketing. Pretty much all the stuff you actually need once the site's live. If you're building anything right now, a side project, brand, business, whatever, Wix Harmony honestly makes it easier to get out of your own way and start making stuff happen. Go to wix.com/harmony. That's wix.com/harmony. Start your website today. I wish that we had gotten there when I was in college. Definitely learning soft skills. I would dare say, make this statement, Ryan: I have made more money per GPA than anyone that I went to school with.
I had the lowest— I graduated with a 2.01. I was 1 point away from not graduating on a calculus test, mainly because I never went to class. I started with the highest grade I could get being a 90, because they deducted 10 points, so I had to essentially make A's on everything just to pass the class, because I was bored. I'm an entrepreneur, I'm partly ADD, I know I'm wired different, but it taught me absolutely nothing. Now here's what I learned. You nailed it. The soft skills, people coming to consensus, absolutely. It was probably worth the money just for that. However, I might could have learned that and gotten further towards where I wanted to go doing something different, if different had existed.
What's cool though, yeah, to your point, is there are more educational opportunities, more paths to find your way into a well-paying job, than there ever have been in the history of the world. And the biggest challenge is finding out what you want to do, and then finding, through all the noise, the best path to get there. And it's not always a 2 or 4-year degree. But I was the same way, I'll be honest. I was diagnosed ADHD, missed class a lot, was on academic probation at one point. What you learn is, all of a sudden it clicks, and you're like, oh, if I go to class, if the teacher knows my name, if I do the basic studying, I will graduate. And those are the things. It's those social skills. College is almost as much about learning what you need to do to be successful. There's a lot of noise around employees not being prepared for the job. They don't want to come to the office, they don't want to show up on time, they don't want to spend the time there. What college does is train you that you need to be here from this time to this time, you need to make connections with the people that are your bosses, your educators, and your peers, and those are what make you successful.
That model applies in the workplace in ways that we're really just starting to appreciate, as we debate remote work versus in-person work and how we judge that. One of the challenges I faced was I had a job working for an ad agency when I was a senior in college, and I was like, well, why do I need to finish? I've already accomplished my goal, which is getting hired. And that was at odds for me. But my first job was working for Coke's ad agency in the western United States, writing 20-second customized radio bumpers for Leadville, Colorado, and Boise, Idaho, and St. George, Utah, things like that. There were 4 interns, and that's what we did. All of that could be done now by one person with AI. That entry-level job that really got my foot in the door, got me hired by the agency full-time, and was the start of my career is now gone. What are schools doing to provide more experiential learning to better prepare students to make that leap? And what are businesses doing to maintain those entry-level positions?
That's one of the biggest challenges right now that people need to be talking more about.
Yeah, great point. I do want to talk about Canvas a little bit. Workers are falling behind, whether because of tech or because of how we train them, and it seems like Canvas is helping close that gap. The goal really is to provide this framework for students to understand how they navigate learning.
We've added some AI features, things like translation and discussion summaries that save educators time and help keep students on track. But the bigger aspect is we provide this open architecture. We work with Google and Gemini, Microsoft and Copilot, Anthropic and Claude, the big players. Schools want that choice: what do we plug in, what do we make available, based off of the contracts we have and the organization that we want? Our goal really has been to be the framework that supports all that innovation and make sure the tools meet the unique requirements of education: accessibility, privacy, security. Those are non-negotiable for education. We are governed by very strict laws around student data privacy and accessibility, so how do we make sure that these tools support that? We support K-12, higher ed, and increasingly the government and corporate space as well. People graduate and say, I really like using Canvas, why don't I have something like that here in my job? We've moved to support that as well. That lifelong learning approach is incredibly important, and we want to make sure we're there to support institutions as they deliver that.
I try to be reflective outside of my own curiosity and skill set or whatever I'm interested in too. And then sometimes I step back, call it empathy, call it just understanding other sides of it, but it's a lot to digest. It is intimidating. I've got Claude and ChatGPT going back and forth for the best answers. Oh, I kind of like that answer better than this one, especially when prognosticating, at my age, how long I'll live. I do think it is overwhelming, because I'm very good with change and adapting, but I go, damn, this is a lot to take on and understand how to apply.
There's this constant stream of new innovations and new things coming out. Moore's Law is that acceleration of technology innovation, and Moore's Law is out the window with AI. It's so rapid, it's insane. There are announcements like CLAW, an open-source large language model. There was a person who said, okay, go out and make me a reservation for dinner. And it went out and tried to access the website. The website wasn't working, so it gave itself a voice and called the restaurant and made the reservation. And that freaked everyone out, because it accomplished the task it was given, and it did so by going outside the bounds of what the person asking thought was possible. We all want Jeeves. We all want the virtual assistant that we've been promised forever, where you click a button and you can say, oh, make me a reservation, go do that, book me a flight to Cancun, find a good hotel and restaurant, set all that up, and then let's go. We all want that level of ease, but with that comes risk. And that's where we're in this kind of phase right now of how much we should trust these tools to be autonomous and how much we really need to continue the oversight.
But it's a lot. It can feel very, very overwhelming. And you almost have to shrink your bubble into the world that you control and have influence over, and kind of ignore a lot of the other noise.
How do you think employers should think differently about training and onboarding in this day and age?
I have a really good friend, Troy, who was in a car accident and suffered a traumatic brain injury and was out of work for about a year. And when he was ready to go back, he would do these interviews and they'd say, well, do you know AI? And he would say, what do you want me to know about AI? And they'd say, well, but do you know AI? And what became clear very quickly was that they didn't actually know what they wanted. They wanted someone with skills in using AI and adapting these tools to come in and help them deliver that kind of innovation. He reached out to me, we had a conversation, I sent him some courses that he could take and he upskilled quickly and then got a job pretty rapidly from that. But we're in this position where a lot of colleges and universities aren't necessarily taking ownership of teaching AI literacy. And employers aren't necessarily taking ownership. They want these graduates to come out with these skills to help them. We need both sides to take ownership of training students on AI literacy and how to use these tools effectively and ethically across the board.
That other aspect is there's this old adage that I've heard from a lot of employers that are like, well, what, what if we train our employees and they leave? The counterpoint is, what if you don't train them and they stay? We need employers to really own the upskilling and reskilling of their employees to make their workforce better and not hide behind the fear that those skilled employees will then take that training and go somewhere else.
Yeah, that's very flawed. We've seen both sides of it. We've come up in this innovation realm, but we're also part of the analog world, the way a lot of old employers and people thought, very scarcity-driven.
That scarcity mentality I think is really important to acknowledge. And I think what you're seeing is, as we cycle through the older leadership, we're starting to get more innovative leadership in a lot of industries that previously were very focused on coming back into the office, on not using technology. We're being forced to really adapt to the modern world. If you want to keep a good employee, you're going to have flexible hours. You're going to have good technology support. There are these new requirements from employees that are just very different than when we came out of college or entered the workforce.
These tools have an empowerment component, in how much they unlock. I came up in the ad agency business too, Ryan. I worked 20 years plus, 15 working for other agencies. There's no better siloed world than the ad agency business. I imagine it's gotten better, but I'm 10 years removed from it. Ryan, you're going to get this. Much like I was a fish out of water in college, I was a creative strategic account guy at an ad agency. I was a good writer. I think in headlines. I'm a big idea guy, but I'm actually pretty organized, a pretty good account guy, and I understood strategy. It served me well because I was in a very entrepreneurial agency that actually took advantage of my skills. In most agencies, I'd be a fish out of water. I bring this up through my own context because, Ryan, you got that. It used to be, okay, how can we get the most out of this person at the cheapest cost possible? Give them 7 hats. And it was just financially driven. But now with AI and its abilities, people can sit in a lot of different boxes, because AI unlocks a lot of abilities and skills, agentic or otherwise, that allow them to do those things.
That is an incredible example. I'm going to steal that for later because I love that example of the siloed organization. I remember having an idea, because I was on the account management side, a creative idea. I'm like, what if we do this? And I was told, stop, shut up, not your department, go talk to the customer. I had been a commercial designer. I had the right to be creative. And it was so funny that that got shut down. Even the stages within the creative field: somebody would come up with a headline, the writers came up with headlines, the designers came up with maybe storyboards to pitch the idea. Now you can have the idea, you can actually work with AI to do the writing and improve the writing, you can actually have AI create the storyboards for you. To your point, exactly. Now imagine educators, educators who are not necessarily really trained on how to be an educator. They're really subject matter experts in their field. Like, I'm a history professor. I know history like the back of my hand. But suddenly I have the ability to use these tools to create short videos, to create imagery, to make my writing less dry, things like that.
If we lean into that, if we support that, holy cow, the potential is just unlocked for employees to be so much more productive. But we've got to break down that frame, that box, those boxes that we put people in. Love that analogy. That was a good one.
That's a big opportunity. Ryan, as we close out here, I got a little rapid fire for you if you don't mind. 3 things. Most overrated skill?
Coding, unfortunately.
Most underrated skill?
Good, clear communication. It's incredible.
The one mindset shift people need?
I really like what you just called out around the boxing of organizations and this idea that everybody needs to stay in their box and not try to cross over. Because think how many great ideas have never found the light of day because somebody was sitting in the wrong box. It really has kind of got me thinking now. I think that's just a great point.
Yeah. If we can get people thinking, the leaders going, how do we use AI to enable our people to step outside of their boxes, because they can, what do we unlock?
It's good leaders being willing to be more flexible, being willing to really call out those employees that are using these tools the right way and are using them effectively. Again, modeling what good looks like, show people what's possible, and then encourage everybody else to kind of follow that freeing of their minds.
AI creates these laboratories of opportunity, and if you enable that and you empower that, it's kind of like in science and technology: how many more tests can they do to improve the model that could change the world, or at least change the organization?
We're just at the point where we're building the trust and we're starting to use these tools in really creative ways, but they're brand new and we don't even know exactly what they're all capable of. If we suspend our fear a little bit and if we lean into using them and learning them, I think there's a lot of opportunity that they uncover.
Ryan, where can everybody keep up with what you're up to? Instructure, Canvas, et cetera.
I'm on LinkedIn. instructure.com is our website. We blog frequently, and my boss Melissa Loble, our Chief Academic Officer, she and I do a podcast called EduCast 3000. You've got a huge number more episodes; I think we have about 41 episodes, you have hundreds. We talk to some of the smartest people in education and really pick their brains to make us smarter.
It's great. Really appreciate you, Ryan, for coming on.
Great conversation. I really enjoyed it.
Hey guys, you want to find us? Ryanisright.com. You'll find show notes from today, highlight clips, the full-length episodes. Go check us out on YouTube. Hit that subscribe button. We appreciate it. You got to see this Army green I'm rocking. Ryan's rocking it too. We appreciate you. We're lucky that you're here. We'll see you next time on Right About Now.
This has been Right About Now with Ryan Alford, a Radcast Network production. Visit ryanisright.com for full audio and video versions of the show or to inquire about sponsorship opportunities. Thanks for listening.
New data shows that nearly 70% of workers feel unprepared for today’s workforce, raising bigger questions about how we define job readiness. In this episode, Ryan Alford sits down with Ryan Lufkin of Instructure to unpack what’s actually broken in education and how AI is accelerating the gap between learning and real-world skills.
They explore why AI isn’t replacing expertise but instead demands stronger critical thinking, communication, and human judgment. The conversation also challenges whether schools are teaching the wrong things—or simply teaching them the wrong way.
From the rise of lifelong learning to the debate between skills and degrees, this episode highlights what both employers and educators need to rethink to prepare the next generation.
What We Covered
70% of workers feel unprepared – What’s driving the growing skills gap in today’s workforce
AI in education and work – Why AI requires more expertise, not less
Skills vs degrees – Are traditional degrees still the best signal for employers?
The problem with modern education – Teaching the wrong things vs teaching the wrong way
Lifelong learning – Why continuous upskilling is now required for career growth
Breaking workplace “boxes” – How AI is empowering employees to operate across roles
Connect with the Guest
Ryan Lufkin
VP of Global Strategy — Instructure (Canvas)
Website: https://www.instructure.com
LinkedIn: https://www.linkedin.com/in/ryanlufkin
Podcast: EduCast3000
Connect with the Host
Ryan Alford
Host — Right About Now
Website: RyanIsRight.com
Instagram: https://instagram.com/ryanalford
LinkedIn: https://linkedin.com/in/ryanalford