Machine Heart Column
Table of Contents
1. What is the Hawkeye Library
2. Models and Methods Supported by Hawkeye
3. Installing Hawkeye
4. Training Models with Hawkeye
1. What is the Hawkeye Library

Hawkeye is a PyTorch-based deep learning library for fine-grained image recognition, designed for researchers and engineers in the field. It currently covers the representative paradigms of fine-grained recognition: methods based on deep filters, on attention mechanisms, on high-order feature interactions, on special loss functions, on web data, and others.
The project features a clean coding style, a clear and readable structure, and good extensibility. For newcomers to fine-grained image recognition, Hawkeye is easy to get started with and helps them understand the main pipeline and the representative methods of the field, while more experienced users can quickly implement their own algorithms on top of it. We also provide training example code for each model in the library, and following those examples, self-developed methods can be quickly adapted and added to Hawkeye.
Hawkeye open-source library link: https://github.com/Hawkeye-FineGrained/Hawkeye
2. Models and Methods Supported by Hawkeye
Hawkeye currently supports a total of 16 models and methods from the main learning paradigms in fine-grained image recognition, as detailed below:
- Based on Deep Filters
  - S3N (ICCV 2019)
  - Interp-Parts (CVPR 2020)
  - ProtoTree (CVPR 2021)
- Based on Attention Mechanisms
  - OSME+MAMC (ECCV 2018)
  - MGE-CNN (ICCV 2019)
  - APCNN (IEEE TIP 2021)
- Based on High-Order Feature Interactions
  - BCNN (ICCV 2015)
  - CBCNN (CVPR 2016)
  - Fast MPN-COV (CVPR 2018)
- Based on Special Loss Functions
  - Pairwise Confusion (ECCV 2018)
  - API-Net (AAAI 2020)
  - CIN (AAAI 2020)
- Based on Web Data
  - Peer-Learning (ICCV 2021)
- Other Methods
  - NTS-Net (ECCV 2018)
  - CrossX (ICCV 2019)
  - DCL (CVPR 2019)
3. Installing Hawkeye
Installing Dependencies
Use conda or pip to install the required dependencies (an example setup follows the list):
- Python 3.8
- PyTorch 1.11.0 or higher
- torchvision 0.12.0 or higher
- numpy
- yacs
- tqdm
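For example, a fresh environment could be set up like this (a sketch; the environment name is arbitrary, and any versions satisfying the list above will do):

    conda create -n hawkeye python=3.8
    conda activate hawkeye
    pip install "torch>=1.11.0" "torchvision>=0.12.0" numpy yacs tqdm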
Clone the repository:
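    git clone https://github.com/Hawkeye-FineGrained/Hawkeye.git
    cd Hawkeye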
Preparing the Dataset
We provide 8 commonly used fine-grained recognition datasets along with the latest download links:
- CUB200: https://data.caltech.edu/records/65de6-vp158/files/CUB_200_2011.tgz
- Stanford Dog: http://vision.stanford.edu/aditya86/ImageNetDogs/images.tar
- Stanford Car: http://ai.stanford.edu/~jkrause/car196/car_ims.tgz
- FGVC Aircraft: https://www.robots.ox.ac.uk/~vgg/data/fgvc-aircraft/archives/fgvc-aircraft-2013b.tar.gz
- iNat2018: https://ml-inat-competition-datasets.s3.amazonaws.com/2018/train_val2018.tar.gz
- WebFG-bird: https://web-fgvc-496-5089-sh.oss-cn-shanghai.aliyuncs.com/web-bird.tar.gz
- WebFG-car: https://web-fgvc-496-5089-sh.oss-cn-shanghai.aliyuncs.com/web-car.tar.gz
- WebFG-aircraft: https://web-fgvc-496-5089-sh.oss-cn-shanghai.aliyuncs.com/web-aircraft.tar.gz
First, download a dataset (using CUB200 as an example):
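For instance (the data/ and bird/ directory names below are illustrative; extract the archive wherever your config will point):

    mkdir -p data && cd data
    wget https://data.caltech.edu/records/65de6-vp158/files/CUB_200_2011.tgz
    mkdir bird && tar -xvf CUB_200_2011.tgz -C bird/
    cd ..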
We provide metadata files for the above 8 datasets; they plug into the library's FGDataset to conveniently load the training and test sets, using the official splits provided by each dataset. Switching datasets only requires modifying the dataset configuration in the experiment's config file, as shown below:
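The sketch below shows what such a dataset block might look like (key names follow a typical yacs-style config and are illustrative; see the repository's config files for the exact schema):

    dataset:
      name: cub                                # which metadata set to load
      root_dir: data/bird/CUB_200_2011/images  # where the images were extracted
      meta_dir: metadata/cub                   # metadata with the official splits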
4. Training Models with Hawkeye
For each method supported by Hawkeye, we provide separate training templates and configuration files. For example, training APINet requires just one command:
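Assuming the repository layout (the main program Examples/APINet.py is described below), the command would look something like this; the --config flag and the config file path are illustrative:

    python Examples/APINet.py --config configs/APINet.yaml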
All experiment parameters live in the corresponding yaml files, which are readable and easy to modify, for example:
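The following is a hypothetical sketch of such a config; the exact fields are defined by the repository's yaml files:

    experiment:
      name: APINet_cub          # experiment name
      log_dir: results/APINet   # output directory for logs, weights, configs
    dataset:
      name: cub
      root_dir: data/bird/CUB_200_2011/images
      meta_dir: metadata/cub
      batch_size: 24
      num_workers: 4
    train:
      epoch: 100
      optimizer:
        name: SGD
        lr: 0.001
        momentum: 0.9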
The experiment's main program, Examples/APINet.py, uses a trainer APINetTrainer that inherits from the library's Trainer, so there is no need to rewrite the training loop, logger, model saving, configuration loading, and similar boilerplate; only the method-specific modules have to be implemented. We also provide multiple hooks into the training phase to accommodate the particular needs of some methods.
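To make that design concrete, here is a self-contained toy sketch of the same subclass-and-override pattern in plain PyTorch. It illustrates the idea only and is not Hawkeye's actual Trainer API; all class and method names here are hypothetical:

    # Toy illustration of the trainer pattern: the base class owns the
    # generic loop, a method-specific subclass overrides only its hooks.
    import torch
    import torch.nn as nn

    class BaseTrainer:
        """Owns the generic training loop; subclasses override the hooks."""
        def __init__(self, model, loader, epochs=1, lr=1e-3):
            self.model = model
            self.loader = loader
            self.epochs = epochs
            self.optimizer = torch.optim.SGD(model.parameters(), lr=lr)

        def batch_step(self, images, labels):
            # Hook: method-specific forward pass and loss.
            logits = self.model(images)
            return nn.functional.cross_entropy(logits, labels)

        def train(self):
            for epoch in range(self.epochs):
                for images, labels in self.loader:
                    loss = self.batch_step(images, labels)
                    self.optimizer.zero_grad()
                    loss.backward()
                    self.optimizer.step()
                print(f"epoch {epoch}: last batch loss {loss.item():.4f}")

    class MyMethodTrainer(BaseTrainer):
        # A "new method" overrides only the hook it changes; the loop
        # and optimization stay in the base class.
        def batch_step(self, images, labels):
            logits = self.model(images)
            ce = nn.functional.cross_entropy(logits, labels)
            reg = logits.pow(2).mean() * 1e-4  # toy method-specific term
            return ce + reg

    # Usage with dummy data standing in for a fine-grained dataset:
    loader = [(torch.randn(8, 16), torch.randint(0, 4, (8,))) for _ in range(5)]
    MyMethodTrainer(nn.Linear(16, 4), loader, epochs=2).train()

In Hawkeye, the base Trainer additionally handles configuration loading, logging, and checkpointing, so a new method's trainer stays similarly small.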
Log files, model weights, the training code used, and the configuration files in effect at the time are all saved to the experiment's output directory log_dir, making it easy to back up configurations and training code and to compare experiments later.
For more detailed examples, please refer to the specific information in the project link: https://github.com/Hawkeye-FineGrained/Hawkeye