With the introduction of its new lineup of AI chips, Meta joins other tech giants in diversifying the accelerators they use for specific AI workloads, arguing that mainstream GPUs built for large-scale pre-training are less cost-effective when applied to inference.