
When it comes to new AI analytics services from AWS, CIOs can expect more of the same, said David Linthicum, independent consultant and retired chief cloud strategy officer at Deloitte Consulting. “Realistically, they can expect AWS to keep integrating its existing services; the key test will be whether this shows up as less complexity and faster time-to-insight, not just new service names.”
Lack of cohesion in AI platform strategy
That complexity isn’t confined to analytics alone. The same lack of cohesion is now spilling over into AWS’s AI platform strategy, where the cloud giant risks ceding mindshare despite its compute advantage.
“SageMaker is still respected, but it no longer dominates the AI platform conversation. Open-source frameworks like Ray, MLflow, and KubeRay are rapidly capturing developer mindshare because they offer flexibility and avoid lock-in,” Fersht said.
This fragmentation is exactly what partners want AWS to fix by offering clearer, more opinionated MLOps paths, deeper integration between Bedrock and SageMaker, and ready-to-use patterns that help enterprises progress from building models to deploying real agents at scale.
More plug-and-play, less build-it-yourself
AWS’s tooling shortcomings don’t end there, said Fersht. The hyperscaler’s focus on providing the building blocks for agentic AI, while leaving others to assemble them, makes it harder for business users to consume its services.
“AWS is giving strong primitives, but competitors are shipping business-ready agents that sit closer to workflows and outcomes. Enterprises want both power and simplicity,” Fersht said.