
Dr. Jake Trippel on Why Your Technical Debt Is Compounding


AI is only as powerful as the data architecture behind it.

In episode 52 of The AI Forecast, "Why LLMs Aren't Enough and How AI Fabrics Will Change Everything," host Paul Muller sits down with Dr. Jake Trippel, Dean of the College of Business and Technology at Concordia University, St. Paul, and Co-Founder & CTO of Codename 37, to unpack what's holding enterprises back from scaling AI:

  • Siloed data architecture

  • Misunderstanding the power of machine learning, deep learning, and neural networks

  • Compounding technical debt

Their conversation ranges from cloud versus on-prem economics to the coming shift from SaaS applications to bot-based experiences. Below are key moments from their discussion.

Why AI Architectures Are Hitting Their Limits

Paul: Tell us about what we’ve seen in the past with AI and data architectures, and why we need to rethink them now.

Jake: We went through the digital transformation era, and that was the challenge with data. We stayed in data silos because that's how our platforms were architected and how our data was organized. Then we tried to do a bunch of integrations. We tried all these app integration engines. We tried to find nifty ways to do it, but what happened was we created a spaghetti mess of ETL and ELT pipelines, system to system.

Now fast forward to today. The challenge is that these organizations are incentivized to keep us in silos. Now comes AI, and the data is still in silos. That's where the power of cloud comes in, and that's why we're proud to be a Cloudera partner.

Imagine the same problem, except amplified. I’ve got AI agents up the kazoo — awesome — but they’re only working inside their own data silo.

People are going to want more. They’re going to want agents that can work together, talk together, and reason together. But how do you do that if your data is still stuck in silos? To get to this data mesh state is going to require a transformational change, and that's why Cloudera is a cool solution that can help folks do that.

Why Large Language Models Aren’t Enough

Paul: What are some of the hacks, best practices, tips or tricks that you use to help you get the most out of what you do with data?

Jake: The biggest thing is understanding that large language models are not the answer for everything. AI is a big world.

Large language models are awesome for some things, but they’re really bad for others. People have to understand the power of machine learning, deep learning, and neural networks — which are really the guts of the other two.

The skillset of our time right now is being able to develop or use the right models for the right use cases, and to rapidly get through data. That’s where people need to focus.

The Compounding Effect of Technical Debt

Paul: How do organizations, in your opinion and experience, pragmatically start to move from where they’ve been to where they’re going? How do they clean their data up? Is there a mechanism by which they can do it without breaking?

Jake: That's a big, loaded question, so I'll try to pull it apart a little bit. Organizations are three decades into these systems for a reason. We still see AS/400s out there, and they work. You've got to give IBM credit.

The challenge these organizations face, though, is how much capital they're expending, because of the compounding effect of this technical debt. You can kick the can down the road year after year, decade after decade, but the cost is only going to grow.
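The compounding Jake describes works like compound interest: every year a fix is deferred, the eventual cost grows by some factor. A minimal sketch (all dollar amounts and the growth rate are hypothetical, purely for illustration):

```python
def deferred_cost(initial_cost: float, growth_rate: float, years: int) -> float:
    """Cost of addressing technical debt after `years` of kicking the can,
    modeled as compound growth: initial_cost * (1 + growth_rate) ** years."""
    return initial_cost * (1 + growth_rate) ** years

# A hypothetical $1M modernization deferred 10 years at 15% annual
# compounding ends up costing roughly 4x the original estimate.
print(round(deferred_cost(1_000_000, 0.15, 10)))  # 4045558
```

The exact rate is unknowable in practice; the point is the shape of the curve, which is exponential, not linear.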

But now at least you have options. We can take the data out and do a lot more with it than we ever could before. Instead of a rip-off-the-Band-Aid approach, as long as we have continued access to the data, we can create any type of experience we want in parallel.

Why Some AI Workloads Are Moving Back On-Prem

Paul: What are you seeing with your existing clients today as they’re looking to deploy new workloads?

Jake: We are seeing a massive migration back to on-prem. Couldn’t believe it. Never would have predicted that.

As these organizations do more model development, training, and so on, the cloud cost model is just too expensive. I have not met a CFO who's excited about spending that much every month training these models.

So, they’re making the investment. They’re going back to data centers. They’re depreciating it over the next five years. We’re seeing this in medical devices, financial services, aviation — it’s typically hybrid, but for particular workloads, especially training and development, it’s way more cost effective.

AI as an Amplifier for Learning — Good and Bad

Paul: What are you seeing in terms of the academic world and how we prepare the workforce of the future?

Jake: AI is an amplifier. It’s going to amplify the good — and it’s going to amplify the bad.

On the good side, people will learn 10, 20 times faster than they ever have before. I’ve built models that can read books in three seconds flat. I can now immerse myself in the data and create any type of learning experience I want adapted to my learning style.

The bad side is students deciding, "I don't have to do anything. I can let AI do all my work," and then they're not going to learn anything. That's the part that scares me.

The skillset of our time is this: I hope you like learning, because you're going to be doing it every single day for the rest of your career.

Listen to the full conversation with Dr. Jake Trippel on The AI Forecast on Spotify, Apple Podcasts, and YouTube.
