facexlib plus is an enhanced version of FaceXLib that provides more functions while maintaining compatibility with the top-level API.
facexlib aims to provide ready-to-use face-related functions based on current SOTA open-source methods.
Only PyTorch reference code is provided. For training or fine-tuning, please refer to the original repositories listed below.
Note that we only provide a collection of these algorithms; you need to refer to their original licenses for your intended use.
If facexlib is helpful in your projects, please help to ⭐ this repo. Thanks😊
Function | Sources | Original LICENSE |
---|---|---|
Detection | Retinaface / YOLO | MIT / AGPL 3.0 |
Alignment | AdaptiveWingLoss | Apache 2.0 |
Recognition | InsightFace / FaceNet | MIT / MIT |
Parsing | face-parsing.PyTorch | MIT |
Matting | MODNet | CC 4.0 |
Headpose | deep-head-pose | Apache 2.0 |
Tracking | SORT | GPL 3.0 |
Super Resolution | SwinIR / DRCT | Apache 2.0 / MIT |
Anti-Spoofing | Silent-Face-Anti-Spoofing | Apache 2.0 |
Expression | EmotiEffLib | Apache 2.0 |
Recognition (age-invariant) | MTLFace | - |
Gender & Age | MiVOLO | - |
Assessment | hyperIQA | - |
Utils | Face Restoration Helper | - |
Migrate from InsightFace: we support the InsightFace detection and recognition model packs `antelopev2` and `buffalo_l` (identical models, with a few minor differences), without the need to install any ONNX runtime. For users who cannot install the ONNX runtime due to issues with glib, Python, or CUDA versions, or who need to calculate losses on the results of a generative model, we suggest using our repository. See the migration tutorial for migrating from InsightFace to facexlib. 😍
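As a rough sketch of what the migration can look like (the model names, `build_model`, and `auto_download` are taken from this README, but the import path and the `get` method below are assumptions; the migration tutorial is the authoritative reference):

```python
import cv2

from facexlib import build_model  # import path is an assumption

# Before (InsightFace, requires onnxruntime):
#   from insightface.app import FaceAnalysis
#   app = FaceAnalysis(name='buffalo_l')
#   app.prepare(ctx_id=0)
#   faces = app.get(cv2.imread('face.jpg'))

# After (facexlib, pure PyTorch): load the equivalent model pack and run it.
model = build_model('buffalo_l', auto_download=True)
faces = model.get(cv2.imread('face.jpg'))  # hypothetical method name
```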
- Python >= 3.7 (we recommend Anaconda or Miniforge (mamba))
- PyTorch >= 1.10 (we recommend NOT using torch 1.12, which causes abnormal performance)
- Optional: NVIDIA GPU + CUDA
pip install git+https://github.com/bhcao/facexlib.git
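After installation, a quick sanity check with plain PyTorch calls (nothing facexlib-specific) can confirm that the environment matches the requirements above:

```python
import torch

# PyTorch should be >= 1.10 and not 1.12.x (see the requirement note above).
print(torch.__version__)

# True if the optional NVIDIA GPU + CUDA setup is working.
print(torch.cuda.is_available())
```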
If your network is not stable, you can download the pre-trained models in advance (possibly with other download tools) and put them in the folder `PACKAGE_ROOT_PATH/facexlib/weights`. `PACKAGE_ROOT_PATH` defaults to the installation path of facexlib; you can also change it by passing the argument `model_rootpath` during the initialization of each model.

If your network is stable, you can set the argument `auto_download` of the `build_model` function to `True` to enable automatic downloading of the pre-trained models at the first inference.
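For illustration, the two strategies might look like the following; `build_model`, `auto_download`, and `model_rootpath` are named above, while the import path and the model name are placeholders:

```python
from facexlib import build_model  # import path is an assumption

# Stable network: download the pre-trained weights automatically at the first inference.
model = build_model('retinaface', auto_download=True)  # 'retinaface' is a placeholder name

# Unstable network: weights were downloaded manually into a custom folder, so point
# model_rootpath at it instead of the default PACKAGE_ROOT_PATH/facexlib/weights.
model = build_model('retinaface', model_rootpath='/path/to/weights')
```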
This project is released under the MIT license.
If you have any questions, please open an issue or email xintao.wang@outlook.com.