The Go programming language turned 15 years old on November 10, and its proponents are now planning to adapt the language to large multicore systems, the latest vector and matrix hardware instructions, and the needs of AI workloads.
In a blog post published on November 11, Austin Clements of the Go team said that, looking ahead, Go will be evolved to better leverage the capabilities of current and future hardware. “In order to ensure Go continues to support high-performance, large-scale production workloads for the next 15 years, we need to adapt to large multicores, advanced instruction sets, and the growing importance of locality in increasingly non-uniform memory hierarchies,” Clements said. The Go 1.24 release will have a new built-in map implementation that is more efficient on modern CPUs, and the Go team is prototyping garbage collection algorithms designed for modern hardware. Other improvements will come in the form of APIs and tools that let Go developers make better use of that hardware.
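The notable point for existing codebases is that the map rework lands in the runtime rather than the language: the built-in map API is unchanged, so ordinary map code is expected to benefit automatically once a program is rebuilt with Go 1.24. A minimal sketch of such code is below; nothing in it references the new implementation.

```go
package main

import "fmt"

// Ordinary use of Go's built-in map type. The Go 1.24 map rework changes
// the runtime's internal hash table, not the language, so code like this
// should get faster on modern CPUs without any source changes.
func main() {
	counts := make(map[string]int)

	for _, word := range []string{"go", "gopher", "go", "cloud", "go"} {
		counts[word]++ // insert or update; hashing happens inside the runtime
	}

	for word, n := range counts {
		fmt.Printf("%s appeared %d times\n", word, n)
	}
}
```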
For AI, efforts are under way to make Go and AI better for each other by enhancing Go's capabilities in AI infrastructure, applications, and developer assistance. The goal is to make Go a “great” language for building production AI systems. Go's dependability as a language for cloud infrastructure has made it a choice for LLM (large language model) infrastructure, and Go developers already view the language as a good fit for running AI workloads, Clements said. “For AI applications, we will continue building out first-class support for Go in popular AI SDKs, including LangChainGo and Genkit,” he said.
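As one illustration of what that SDK support looks like in practice, below is a minimal sketch of calling an OpenAI-hosted model from Go through LangChainGo. It assumes the github.com/tmc/langchaingo module is in the project's go.mod, an OPENAI_API_KEY environment variable is set, and the llms.GenerateFromSinglePrompt helper is available in the module version in use; the exact API may differ between releases.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	ctx := context.Background()

	// Assumes OPENAI_API_KEY is set; the client reads it from the environment by default.
	llm, err := openai.New()
	if err != nil {
		log.Fatal(err)
	}

	// Single-prompt completion via LangChainGo's model-agnostic llms package.
	resp, err := llms.GenerateFromSinglePrompt(ctx, llm,
		"In one sentence, why is Go well suited to LLM serving infrastructure?")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp)
}
```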