ScaleUp:AI

The importance of flexibility in AI systems

Insight Partners | April 17, 2025 | 2 min. read

Artificial intelligence is rapidly evolving, and flexibility has emerged as a defining trait of successful AI adoption. Chris Bennett, Dell’s CTO of AI and Data Solutions, and Joseph Balsamo, SVP of Product Development at Iternal Technologies, outlined how businesses of all sizes can use flexibility to harness AI effectively, from initial experimentation to full-scale enterprise deployment. Iternal Technologies provides an AI platform that turns unstructured data into reusable content blocks to streamline communication across enterprise teams.

Key takeaways

  • Flexibility in AI solutions — spanning infrastructure, tooling, deployment, and scalability — is crucial for businesses looking to leverage AI effectively.
  • Dell emphasizes an ecosystem approach, partnering with providers of all sizes to deliver flexible, end-to-end AI services.
  • Businesses benefit by starting with targeted AI solutions, validating results, and then scaling confidently with infrastructure partners.
  • Misconceptions around generative AI, such as ease of deployment and guaranteed accuracy, underscore the need for robust data curation and realistic expectations.

These insights came from our ScaleUp:AI event in November 2024, an industry-leading global conference that features topics across technologies and industries. Watch the full session below:

Dell’s ecosystem-driven approach

Dell, historically known for hardware, now plays a central role in AI infrastructure. Bennett emphasized Dell’s transformation over the last 40 years: “We’ve been built from a college dorm room all the way up to the data center and consumer services provider that we are today.”

He explained that Dell has intentionally cultivated an ecosystem of partners: “We have an entire ecosystem of partnerships with the smallest of the small ISVs all the way up to the largest companies you can imagine.” This lets Dell combine hardware, software, and services into end-to-end solutions, delivered either directly or through collaborators like Iternal.

“We’re all about flexibility”

Flexibility in AI infrastructure goes beyond choosing between cloud and on-premises deployments. Bennett explained that Dell aims to support portability across environments: “If I want to pick it up from my hyperscaler cloud platform and move it to a co-location provider or go the other way, I’ve got to be able to do that.”

To avoid vendor lock-in and manage long-term costs, Dell advocates for infrastructure that supports open-source tools, diverse silicon platforms, and modular software stacks. As Bennett put it: “We’re all about flexibility. We obviously have a strong bias towards on-prem, but we interoperate where our customers live.”

Iternal’s perspective on predictable innovation

From the startup perspective, Balsamo highlighted how Iternal values flexibility to maintain performance and predictability as AI applications scale. “We like working with Dell because even though they’re a large company, they can get small real fast if we need them to,” he said.

Balsamo added that predictability in AI performance is just as important as speed: “If your results vary too much, the data might not be useful to you.” He also emphasized the importance of data confidence, even with small datasets: “We’re just there to help you with the confidence…which is really what matters.”

He pointed out that cloud is often a great place to start, especially for automating proposals or small-scale tasks, but as businesses scale, they “start to see a curve where your performance is going to start to drop and your predictability is going to start to drop. And that’s where Dell comes in.”

Avoiding common AI misconceptions

Both speakers cautioned against misconceptions about AI, particularly generative AI. Balsamo remarked: “Prompt engineering is easy, you’re just going to plug in the system and everything’s going to work and you’re going to eliminate 10,000 jobs…your data is in great shape and you shouldn’t have to worry about it — because all your responses are going to be good. And what’s really scary is the responses are going to be good, they just might not be true.”

Bennett added that many equate AI only with chatbots: “It’s not all a chatbot. It really isn’t.” He encouraged businesses to start small and practical: “Think of things like the 800-page HR manual… the ROI is almost instantaneous to build a digital assistant for HR policy.”
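
To make that “start small” example concrete, here is a minimal sketch of what an HR-policy assistant could look like. It assumes paragraph-based chunking, naive keyword-overlap retrieval standing in for embeddings, and a stub generate() function standing in for whichever LLM the platform provides; none of these names reflect Dell’s or Iternal’s actual implementation.

```python
# Minimal sketch of an HR-policy digital assistant built on retrieval over a manual.
# Illustrative only: paragraph chunking, keyword-overlap retrieval, and a stub
# generate() standing in for whatever LLM the platform exposes.

from collections import Counter


def chunk_manual(text: str, max_words: int = 200) -> list[str]:
    """Split the manual into roughly fixed-size chunks on paragraph boundaries."""
    chunks, current, count = [], [], 0
    for paragraph in text.split("\n\n"):
        current.append(paragraph)
        count += len(paragraph.split())
        if count >= max_words:
            chunks.append("\n\n".join(current))
            current, count = [], 0
    if current:
        chunks.append("\n\n".join(current))
    return chunks


def retrieve(question: str, chunks: list[str], top_k: int = 3) -> list[str]:
    """Rank chunks by word overlap with the question (a stand-in for embedding search)."""
    q_words = Counter(question.lower().split())
    return sorted(
        chunks,
        key=lambda c: sum(q_words[w] for w in set(c.lower().split())),
        reverse=True,
    )[:top_k]


def generate(prompt: str) -> str:
    """Stub for the LLM call; a real deployment would route this to a model."""
    return f"[LLM response to prompt of {len(prompt)} characters]"


def ask_hr_assistant(question: str, manual_text: str) -> str:
    """Retrieve the most relevant policy text and answer strictly from it."""
    context = "\n---\n".join(retrieve(question, chunk_manual(manual_text)))
    prompt = f"Answer strictly from this HR policy text:\n{context}\n\nQuestion: {question}"
    return generate(prompt)
```

The appeal of this pattern for a first project is that the scope is narrow, the source document already exists, and the answer quality is easy for HR staff to spot-check.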

“You need to understand the consumer of your application”

Bennett shared a success story that evolved through flexible thinking. Dell’s internal pilot of a retrieval-augmented generation (RAG) platform surfaced early challenges with access control and data availability. The initial version of the system was designed for users with enterprise-level access, but when it was rolled out to a broader internal audience, the model sometimes fabricated answers when it couldn’t reach the right data sources.

“The model would respond in the best way it knew how… it didn’t really understand how to say ‘I don’t know the answer,’” Bennett explained. “You need to understand the consumer of your application.”

By adapting the system design and tightening the alignment between user access and data access, Dell improved the tool’s reliability.
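
One way to read that lesson in code is the hypothetical sketch below, which is not Dell’s system: retrieval is filtered by the caller’s permissions, and the assistant declines to answer rather than improvising when no permitted source matches the question. The roles, class names, and stub generate() are all illustrative.

```python
# Hypothetical sketch of access-aware retrieval with an explicit "I don't know" path.
# Illustrates the lesson above, not Dell's actual RAG platform.

from dataclasses import dataclass


@dataclass
class Chunk:
    text: str
    required_role: str  # e.g. "employee", "manager", "hr"


def retrieve_for_user(question: str, chunks: list[Chunk], user_roles: set[str]) -> list[Chunk]:
    """Only consider sources the caller is actually allowed to see."""
    allowed = [c for c in chunks if c.required_role in user_roles]
    q_words = set(question.lower().split())
    scored = sorted(
        allowed,
        key=lambda c: len(q_words & set(c.text.lower().split())),
        reverse=True,
    )
    return [c for c in scored if q_words & set(c.text.lower().split())][:3]


def generate(prompt: str) -> str:
    """Stub standing in for the underlying LLM call."""
    return f"[grounded answer based on {prompt.count('---') + 1} source(s)]"


def answer(question: str, chunks: list[Chunk], user_roles: set[str]) -> str:
    sources = retrieve_for_user(question, chunks, user_roles)
    if not sources:
        # Refuse instead of letting the model improvise without grounding.
        return "I don't have access to a source that answers that."
    context = "\n---\n".join(c.text for c in sources)
    return generate(f"Answer only from these sources:\n{context}\n\nQ: {question}")


chunks = [
    Chunk("Travel reimbursements are filed in the expense portal.", "employee"),
    Chunk("Executive compensation bands are reviewed annually.", "hr"),
]
print(answer("How do I file travel expenses?", chunks, {"employee"}))
print(answer("Which compensation bands apply to executives?", chunks, {"employee"}))  # refuses
```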

Future-proofing your AI strategy

Looking ahead, Bennett emphasized that having flexibility at every layer — from hardware to software platforms — is critical to future-proofing AI systems: “Having a platform that supports you moving your LLM or changing your LLM or having multiple LLMs is really, really important.”

Balsamo agreed: “Choosing a platform that gives you the flexibility to swap out components as either capabilities mature…or maybe your needs change — that, to me, is 100% paramount.”
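
A minimal sketch of what that swap-friendly design can look like in practice, assuming a hypothetical config-driven factory and placeholder model classes rather than any specific product: application code depends only on a small interface, so changing the LLM, or running several, becomes a configuration change instead of a rewrite.

```python
# Illustrative sketch of keeping the model swappable behind a small interface.
# The provider classes and config keys here are hypothetical.

from typing import Protocol


class TextModel(Protocol):
    def complete(self, prompt: str) -> str: ...


class HostedModel:
    """Placeholder for a model reached over a hosted API."""

    def __init__(self, name: str):
        self.name = name

    def complete(self, prompt: str) -> str:
        return f"[{self.name} completion]"


class OnPremModel:
    """Placeholder for a model served on local infrastructure."""

    def __init__(self, endpoint: str):
        self.endpoint = endpoint

    def complete(self, prompt: str) -> str:
        return f"[local completion via {self.endpoint}]"


def build_model(config: dict) -> TextModel:
    """Choose the backend from configuration, so swapping models is a config change."""
    if config["backend"] == "hosted":
        return HostedModel(config["model_name"])
    return OnPremModel(config["endpoint"])


# Application code depends only on the TextModel interface:
model = build_model({"backend": "onprem", "endpoint": "http://llm.internal:8080"})
print(model.complete("Summarize our travel policy."))
```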

Watch more sessions from ScaleUp:AI, and see scaleup.events for updates on ScaleUp:AI 2025.