Would you say the growth speed of these LLMs has been within expectations or have they exceeded what experts believed would happen?
That’s a great question! I’d say the rate of improvement of large language models (LLMs) has so far met, and perhaps even exceeded, expert expectations. Training LLMs is expensive and time-consuming and requires enormous amounts of data, all of which slows the pace of innovation. Even so, the advancements we’ve seen, especially in the efficiency and capabilities of LLMs, have come at a remarkable rate. It will be interesting to see whether this pace slows as the supply of available training data becomes more limited. If anyone else has thoughts on this, please feel free to share!
Great job on these weekly updates!
Thank you, appreciate the support!