• Emmie@lemm.ee
    3 months ago

    AI needs human content, and a lot of it. Someone calculated that to get really good, it would need an extreme amount of data that is impossible to gather even now, hence all the hallucinations and the effort to optimize and get by on scraps of semi-forged data. Semi-forged, artificial data isn’t anywhere close to the random gibberish of garbage AI output.

    • merari42@lemmy.world
      3 months ago

      Depends on what you do with it. Synthetic data seems to be really powerful if it’s human-controlled and well built. Stuff like TinyStories (simple LLM-generated stories restricted to a 3-year-old’s vocabulary) can be used to make tiny language models produce sensible English output. My favourite newer example is the base data for AlphaProof (LLM-generated translations of proofs from math papers into the proof-validation system LEAN), used to teach an LLM the basic structure of mathematical proofs. Validation in LEAN itself can then be used to keep only high-quality (i.e. correct) proofs. Since AlphaProof is basically a reinforcement learning routine that uses an LLM to generate good candidate proof steps and shrink the search space, running it yields new correct proofs that can be fed back to further improve its training data.
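
      The generate-then-validate loop described above can be sketched in a few lines. This is a toy illustration, not AlphaProof’s actual pipeline: a random number generator stands in for the LLM proposing candidates, and a trivial arithmetic check stands in for LEAN verification, but the filtering principle (only validated outputs enter the training set) is the same.

      ```python
      import random

      def generate_candidate(rng):
          """Stand-in for an LLM proposing a claim: asserts a + b = c."""
          a, b = rng.randint(0, 9), rng.randint(0, 9)
          # The generator is noisy: half the time the claimed sum is a guess.
          c = a + b if rng.random() < 0.5 else rng.randint(0, 18)
          return (a, b, c)

      def validate(candidate):
          """Stand-in for the LEAN checker: accepts only correct claims."""
          a, b, c = candidate
          return a + b == c

      def build_training_set(n_candidates, seed=0):
          """Generate many candidates, keep only the validated ones."""
          rng = random.Random(seed)
          candidates = [generate_candidate(rng) for _ in range(n_candidates)]
          return [c for c in candidates if validate(c)]

      data = build_training_set(1000)
      print(f"{len(data)} validated examples out of 1000 candidates")
      ```

      Because the validator is a hard filter, every example in `data` is correct by construction, which is what makes this kind of synthetic data safe to train on, unlike unfiltered model output.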