LongQLoRA: Efficiently Extending LLaMA2-13B Context Length
This article introduces our work on efficiently extending the context length of large language models with low resources: LongQLoRA. It involves concepts from Position Interpolation and QLoRA, and we recommend reading our previous articles, such as "Illustration of RoPE Rotational …", to help understand this work.