The RAG vs Long-Context Debate: No Need to Fight

Introduction: Hello everyone, I am Liu Cong from NLP. As the context length supported by large models keeps growing, a debate has sprung up online about RAG versus Long-Context (many groups are discussing the topic, so I would like to share my thoughts), and it is really unnecessary… The main point is that the two are … Read more

How Kimi-k1.5 is Developed?

Yesterday everyone was busy shouting New Year greetings! Kimi has, unusually, released a technical report, so let's take a look at the technical details. The report is here: https://github.com/MoonshotAI/Kimi-k1.5. First, as always, Kimi emphasizes long context. So the question arises: if we give the model a longer ‘thinking space’, can it naturally learn to … Read more