By Joe Sasson
Publication Date: 2026-02-21 15:00:00
AI is evolving rapidly, and software developers no longer need to memorize syntax. What is becoming increasingly valuable instead is thinking like an architect and understanding the technology that lets systems operate safely at scale.
This post is also a reflection on my first year as an AI Solutions Engineer at Cisco. Every day I work with clients across a range of industries – healthcare, financial services, manufacturing, law firms – and they are all trying to answer largely the same questions:
- What is our AI strategy?
- Which use cases actually fit our data?
- Cloud vs. on-premise vs. hybrid?
- How much will it cost – not just today, but at scale?
- How do we secure it?
These are the real, practical constraints that surface immediately when you try to operationalize AI beyond a proof of concept (POC).
We recently added a Cisco UCS C845A to one of our labs. It features 2x NVIDIA RTX PRO 6000 Blackwell GPUs, 3.1 TB of NVMe storage, ~127 assignable CPU cores, and 754 GB of RAM. I have decided…

