AI researchers are calling the next class of models large action models (LAMs). For The New Stack, I explored what LAMs are, what examples are emerging in the market, and what experts think.
Nordic APIs, the API-specific blog I edit, was recently ranked the top API blog online by FeedSpot. After ten years managing this presence, I reflect on the journey so far.
AI coding is the easy part. Now it's time to focus on DevOps to get it into production. In a recent interview for LeadDev's DirectorPlus, Honeycomb's CTO, Charity Majors, shares expert tips on how to accomplish this.
Semantic caching works like traditional caching, but it matches on meaning rather than exact keys: semantically similar prompts hit the same cache entry. This could eliminate many redundant API calls to LLMs, cutting costs and improving response times.
Some internal projects are too good to stay hidden. For DirectorPlus, Spotify's Pia Nilsson shares how to identify and externalize internal platforms — like they did with Backstage.
New research shows LLMs outperform humans in software requirements engineering, reducing weeks of work to seconds while improving completeness by 10.2%.
Containers are designed to be isolated—that's great for security but tricky for networking. My feature with InfoWorld explores how eBPF is evolving container networking.