Introducing ∞-former: Infinite Long-Term Memory for Any Length Context
Reported by the Machine Heart Editorial Team. Can a model hold context of arbitrary length? A new model called ∞-former aims to do just that. Over the past few years, the Transformer has dominated the entire NLP field and has also expanded into other areas such as computer vision. However, it has its weaknesses, such as not being …