Ask HN: Anyone tracking the degradation of LLMs against new domains?

3 points by czhu12 9 hours ago

I'm starting to notice that for newer projects I've been playing around with (most recently React Router v7), LLMs are worse due to the lack of training data, compared to something like Rails, which was around long before LLMs.
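A concrete example of what I mean: the v7 framework-mode route config looks roughly like this (file names here are just illustrative, not from a real project), but models often hand me older v6-style createBrowserRouter / <Routes> code instead:

    // app/routes.ts -- React Router v7 "framework mode" route config
    // (route paths and module file names are illustrative only)
    import { type RouteConfig, index, route } from "@react-router/dev/routes";

    export default [
      index("routes/home.tsx"),                  // renders at "/"
      route("posts/:postId", "routes/post.tsx"), // dynamic segment
    ] satisfies RouteConfig;

Pre-v7 answers instead wire this up with createBrowserRouter or nested <Route> elements, which is exactly the stale-training-data problem I'm describing.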

I would imagine this dependency on LLM coding makes people less willing to adopt new technologies, which in turn means training data for those technologies never becomes as plentiful. Doesn't this eventually lead to a long-term degradation of LLMs?

Anyone else noticing this?

matt_s 15 minutes ago

In general, LLMs are going to lag behind what humans do in every domain. They need data to be absorbed and processed, and that data comes from humans. In this way, they work the same way search engines did in the 2000s: something new takes time to move up in the results, especially if it starts with a low rank and the site isn't yet trusted.

I don't think of this as degradation, more that they will always lag behind the human content. Degradation would be more along the lines of less and less training data being generated, or the LLM learning something wrong, leading to a decline in usefulness.