Anchor-Free Tracker Based on Space-Time Memory Network

Guang HAN, Chen CAO, Jixin LIU, Sam KWONG

Research output: Journal Publications › Journal Article (refereed) › peer-review

1 Citation (Scopus)


In visual object tracking, existing trackers struggle to handle appearance deformation, occlusion, and interference from similar objects. To address these problems, this article proposes a new Anchor-free Tracker based on a Space-time Memory Network (ATSMN). In this work, we introduce a space-time memory network, a memory feature fusion network, and a transformer feature cross-fusion network. Through the synergy of these components, the tracker can make full use of the temporal context in memory frames related to the object and better adapt to changes in the object's appearance, yielding accurate classification and regression results. Extensive experimental results on challenging benchmarks show that ATSMN achieves state-of-the-art tracking performance compared with other advanced trackers.
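The core idea of a space-time memory read, as used in trackers of this kind, is key-value attention from the current query frame over features stored from past (memory) frames. The sketch below is a minimal, generic illustration of that mechanism in NumPy; the function and variable names are illustrative assumptions, not the authors' actual implementation or network layout.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def memory_read(query_key, memory_keys, memory_values):
    """Generic attention-style read over T stored memory frames.

    query_key:     (HW, C)   key features of the current (query) frame
    memory_keys:   (T*HW, C) keys of the memory frames, flattened over time
    memory_values: (T*HW, C) values of the memory frames
    Returns (HW, C): memory features aggregated for each query location,
    which a tracker could then fuse with the query-frame features before
    its classification and regression heads.
    """
    scores = query_key @ memory_keys.T / np.sqrt(query_key.shape[-1])
    weights = softmax(scores, axis=-1)   # (HW, T*HW) attention over memory
    return weights @ memory_values       # (HW, C) read-out

# Toy example: 2 memory frames, 4 spatial locations, 8 channels.
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
mk = rng.standard_normal((2 * 4, 8))
mv = rng.standard_normal((2 * 4, 8))
out = memory_read(q, mk, mv)
print(out.shape)  # (4, 8)
```

Storing several past frames in the memory bank is what lets such a tracker adapt to appearance change: each query location aggregates evidence from every stored frame, weighted by feature similarity.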

Original language: English
Pages (from-to): 73-83
Number of pages: 11
Journal: IEEE MultiMedia
Issue number: 1
Early online date: 23 Sept 2022
Publication status: Published - Mar 2023
Externally published: Yes

Bibliographical note

Funding Information:
This work was supported in part by the Natural Science Foundation of China (NSFC) under Grants 61871445 and 61302156, and in part by the Key R&D Foundation Project of Jiangsu Province under Grant BE2016001-4.

Publisher Copyright:
© 1994-2012 IEEE.


Keywords:
  • Anchor-free
  • Feature cross fusion
  • Object tracking
  • Space-time memory network


