• 0 Posts
  • 57 Comments
Joined 3 years ago
Cake day: July 3rd, 2023

  • Lmfao and computers are just for nerds

    Edit: OpenAI, Anthropic, etc. can all die, but LLMs are not going anywhere. You can run a local model.

    Now, I completely agree that the hype train is completely out of control and it's a financial bubble, but the tool itself is not going away.

    Edit2: I think the dotcom bubble is a good analogy. The underlying idea of the internet, with all it could do, online ordering and such, was solid; there was just an insane amount of hype on top that simply couldn't be realized at the time. But now, the biggest companies ever are mainly internet/tech companies.


  • Ever heard of skills? You can essentially “teach” it new things that are not directly available in its model. Right now it's still pretty early, but to me it feels like quite a leap compared to model-only usage.

    It's by no means perfect, but I do not think we're even close to scratching the surface of what can be done with the tech.

    I would bet people back at the advent of computers would have scoffed at many of the things computers can do now as fantasy.

    Edit: Right now, context size is a limiting factor, but you can do things like assign sub-agents to specific tasks/skills and have the overall agent call the sub-agent to complete the task, reducing the context the original agent needs to carry for that skill; the main agent sort of acts as a mediator. Of course, you still need to document what does and doesn't work and keep that available for future tasks in the same vein, so it doesn't repeat mistakes.
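    The mediator idea above can be sketched in a few lines. This is just an illustration, not any particular framework's API: the "sub-agents" are plain functions standing in for real LLM calls, and all the names (`SUB_AGENTS`, `mediator`, the skill functions) are invented for the example. The point is that each sub-agent keeps its own bulky skill instructions private, and only a short result flows back to the top-level agent's context.

    ```python
    # Hypothetical sketch of the sub-agent/mediator pattern described above.
    # The "agents" are plain functions standing in for real LLM calls.

    def summarize_skill(task: str) -> str:
        # A sub-agent holds its own (potentially huge) skill instructions
        # privately; only the short result returns to the caller.
        skill_instructions = "...imagine pages of summarization guidance here..."
        return f"summary of: {task}"

    def translate_skill(task: str) -> str:
        skill_instructions = "...imagine pages of translation guidance here..."
        return f"translation of: {task}"

    SUB_AGENTS = {
        "summarize": summarize_skill,
        "translate": translate_skill,
    }

    def mediator(skill: str, task: str) -> str:
        # The top-level agent only carries the skill name and a short task
        # description in its context, never the sub-agent's full instructions.
        if skill not in SUB_AGENTS:
            raise ValueError(f"unknown skill: {skill}")
        return SUB_AGENTS[skill](task)

    print(mediator("summarize", "quarterly report"))
    ```

    In a real system each function body would be its own model call with its own context window; the structure is what matters here.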

    On your point about the underlying model used to train it: I imagine at some point there will be a breakthrough where it becomes more dynamic, and I think skills are kind of a stepping stone to that. Maybe instead of models being gigantic, data is broken down into individual skills that are called to inform specific actions, and those skills can already be dynamic.
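    That "skills as dynamic modules" idea can be sketched as a simple registry. Again, this is a made-up illustration, not a real library: skills are registered and updated at runtime with no retraining, and only the one relevant skill's text gets pulled into a prompt.

    ```python
    # Hypothetical sketch of skills as dynamically registered modules,
    # as opposed to baking everything into one gigantic static model.

    SKILL_REGISTRY: dict = {}

    def register_skill(name: str, instructions: str) -> None:
        # Skills can be added or replaced at any time; no retraining needed.
        SKILL_REGISTRY[name] = instructions

    def build_prompt(skill: str, task: str) -> str:
        # Only the single relevant skill's text enters the prompt,
        # keeping context small no matter how many skills exist.
        instructions = SKILL_REGISTRY.get(skill, "")
        return f"{instructions}\nTask: {task}"

    register_skill("csv-cleanup", "Normalize headers; drop empty rows.")
    print(build_prompt("csv-cleanup", "clean sales.csv"))
    ```

    Swapping a skill is just another `register_skill` call, which is the "dynamic" part the comment is gesturing at.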