is the 4k context length of llama2 for real?
  • You are supposed to manually set scale to 1.0 and base to 10000 when using llama 2 with 4096 context. The automatic scaling assumes the model was trained for 2048. Though as I say in the OP, that still doesn't work, at least with this particular fine tune.
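    For example, with kobold.cpp that would be something like the invocation below (a minimal sketch: I'm assuming koboldcpp's `--ropeconfig` and `--contextsize` flags, with `--ropeconfig` taking the frequency scale and base in that order; the model filename is just a placeholder):

    ```
    # pin RoPE to scale 1.0 / base 10000 instead of letting it auto-scale
    # as if the model had been trained for 2048 context
    python koboldcpp.py --model airoboros-l2-70b.ggmlv3.q4_K_M.bin \
        --contextsize 4096 --ropeconfig 1.0 10000
    ```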

  • is the 4k context length of llama2 for real?

    I've been using airoboros-l2-70b for writing fiction, and while overall I'd describe the results as excellent, better than any llama1 model I've used, it doesn't seem to be living up to the promise of a 4k-token context length.

    Around 2500 tokens, output quality degrades rapidly: the model either starts repeating previous text verbatim or becomes incoherent (grammar, punctuation, and capitalization disappear, and the output turns into a salad of vaguely related words).

    Any other experiences with llama2 and long contexts? Does the base model work better? Are other fine tunes behaving similarly? I'll try them myself eventually, but the 70b models are chunky downloads, and experimentation takes a while at 1 t/s.

    (I'm using GGML Q4_K_M on kobold.cpp, with RoPE scaling off, as you're supposed to do with llama2.)
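    If anyone wants to pin down the degradation point without eyeballing transcripts, here's a rough probe sketch against koboldcpp's Kobold-compatible HTTP API (the endpoint, default port, and response shape are assumptions based on the KoboldAI generate API, and the repetition check is deliberately crude):

    ```
    import requests

    # koboldcpp's Kobold-compatible endpoint (assumption: default port 5001)
    API = "http://localhost:5001/api/v1/generate"

    prompt = "Once upon a time, "  # seed for a long fiction continuation
    generated = ""

    # extend the story in 256-token chunks, watching for verbatim repetition
    for step in range(16):
        resp = requests.post(API, json={
            "prompt": prompt + generated,
            "max_length": 256,
            "temperature": 0.7,
        })
        chunk = resp.json()["results"][0]["text"]
        # crude check: does the chunk's opening reappear verbatim earlier?
        if chunk[:64] and chunk[:64] in generated:
            print(f"verbatim repetition after ~{len((prompt + generated).split())} words")
            break
        generated += chunk
    else:
        print("no verbatim repetition detected")
    ```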

  • Why is the front page suddenly so stale?

    The All feed in both Hot and Active modes is exactly the same as it was most of a day ago, all the same posts in the same order, except they're all 12h+ old now. At first I thought federation might not be working due to overload, but on New, posts are coming in all the time, both local and remote.

    What's up?
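    (For what it's worth, my understanding is that Lemmy sorts Hot by a decaying rank that's recomputed periodically in the database, so if that recalculation stalls, the feed freezes even though new posts keep federating. A sketch of the hot-rank formula as I've seen it in the Lemmy source; treat the exact constants as assumptions:)

    ```
    import math

    def hot_rank(score: int, age_hours: float) -> float:
        # Lemmy-style hot rank: higher score helps, but age decays it hard
        return 10000 * math.log10(max(1, score + 3)) / (age_hours + 2) ** 1.8

    # the same post ranks far lower at 12h than at 1h, so the front page
    # only stays fresh if ranks are actually being recomputed
    print(hot_rank(score=25, age_hours=1))   # ~2000
    print(hot_rank(score=25, age_hours=12))  # ~125
    ```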

    How are we going to pay for all this?
  • Reddit has over 2,000 employees, most of whom are doing bullshit nobody using the site actually needs or wants; it's possible to run a lot leaner than that. Like Reddit itself used to, before they started burning hundreds of millions trying to compete with every other social media site at once instead of just being Reddit.

    actually-a-cat @sh.itjust.works (Posts: 2, Comments: 2)