12 comments
  • Wake up baby, it's time to run the latest Meta-Llama3.3-80B-Parameters-16bit-QUANT-pretrained+instruct-trained with full 16bit context quant on your 4xNVIDIA-RTX4090-24GB-VRAM GPU cluster <3

    C'mon, I wanna see you coming up with a new prompt format! ~ ~ ~

    I genuinely care about your interests <3
