Tag: LLMs

MWC 2024: More than AI and APIs

STL Partners’ Research team present their observations from, and analysis of, the biggest mobile industry event of the year. There was a lot of buzz around AI and APIs, but behind the tech jargon we saw evidence that our industry continues to morph to become more open and customer-focused.

Edge AI – How AI is sparking the adoption of edge computing

AI applications will require low-latency, local compute for rapid inferencing and for large-scale data collection, triage, and engineering. Edge computing will therefore play a key role in delivering AI applications. However, it’s not just about infrastructure: commercial scale for edge AI will depend on effective ecosystem collaboration models.