TigerBx is a deep learning toolkit for brain extraction and tissue segmentation. It provides:
- Pretrained models for structural brain segmentation.
- A stand-alone application for Windows, macOS, and Linux.
- Python APIs for advanced users and scripting.
- Designed strictly for research use; not intended for clinical or commercial purposes.
TigerBx is licensed under the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license. Commercial use is not permitted. See LICENSE for details.
```
# CPU runtime
pip install --no-cache-dir "tigerbx[cpu] @ https://github.com/htylab/tigerbx/archive/release.zip"

# GPU runtime (CUDA 12)
pip install --no-cache-dir "tigerbx[cu12] @ https://github.com/htylab/tigerbx/archive/release.zip"
```

To install a specific archived release, pin its version tag:

```
pip install --no-cache-dir "tigerbx[cpu] @ https://github.com/htylab/tigerbx/archive/refs/tags/v0.2.3.tar.gz"
pip install --no-cache-dir "tigerbx[cu12] @ https://github.com/htylab/tigerbx/archive/refs/tags/v0.2.3.tar.gz"
```

Note: To install an archived version in the 0.1.x series, use the simpler URL form (no extras required):

```
pip install https://github.com/htylab/tigerbx/archive/refs/tags/v0.1.20.tar.gz
```
```python
import tigerbx

# Brain mask + brain image + ASEG + deep gray matter (recommended)
tigerbx.run('bmad', 'T1w.nii.gz', 'output_dir')

# Full pipeline — all output types
tigerbx.run('bmacdCSWq', 'T1w.nii.gz', 'output_dir')

# Process a directory; outputs saved next to each input file
tigerbx.run('bm', '/data/T1w_dir')

# GPU inference
tigerbx.run('bmadg', '/data/T1w_dir', '/data/output')
```

Command line:

```
tiger bx T1w.nii.gz -bmad -o output_dir
tiger bx T1w.nii.gz -bmacdCSWq -o output_dir
tiger bx /data/T1w_dir -bmadg -o /data/output
```

See bx usage for a complete flag reference and output file naming.
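The first argument to `tigerbx.run` is a string of single-letter flags that compose. A tiny helper can make scripted calls self-documenting; the letter meanings below are inferred from the example comments above and cover only a subset (see the bx usage docs for the authoritative table):

```python
# Flag letters of tigerbx.run, inferred from the examples above.
# Incomplete and illustrative -- consult the bx usage docs for the full set.
KNOWN_FLAGS = {
    'b': 'brain mask',
    'm': 'masked brain image',
    'a': 'ASEG segmentation',
    'd': 'deep gray matter',
    'g': 'GPU inference',
}

def describe_plan(flags: str) -> list[str]:
    """Expand a flag string like 'bmad' into human-readable output names."""
    unknown = [f for f in flags if f not in KNOWN_FLAGS]
    if unknown:
        raise ValueError(f"unrecognized flag(s): {unknown}")
    return [KNOWN_FLAGS[f] for f in flags]

print(describe_plan('bmad'))
# → ['brain mask', 'masked brain image', 'ASEG segmentation', 'deep gray matter']
```

A helper like this is also a convenient place to validate a plan string before launching a long batch job.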
Maps FreeSurfer-style labels to 56 hierarchical regions. Also produces cortical thickness and CSF/GM/WM probability maps.
The HLC module was developed by Pin-Chuan Chen.
```python
import tigerbx

# Default: HLC parcellation only
tigerbx.hlc('T1w.nii.gz', 'output_dir')

# All outputs (brain mask, bet, HLC, cortical thickness, CSF/GM/WM)
tigerbx.hlc('T1w.nii.gz', 'output_dir', save='all')

# Cortical thickness + tissue probability maps with GPU
tigerbx.hlc('T1w.nii.gz', 'output_dir', save='tcgw', GPU=True)
```

Command line:

```
tiger hlc T1w.nii.gz -o output_dir
tiger hlc T1w.nii.gz --save all -o output_dir
tiger hlc T1w.nii.gz --save tcgw -g -o output_dir
```

See HLC usage for a complete description.
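Conceptually, hierarchical parcellation of the kind HLC produces is a label-remapping step: each fine-grained label ID is collapsed into a coarser region ID. A minimal numpy sketch of that remapping (the label IDs and groupings here are made up for illustration, not TigerBx's actual tables):

```python
import numpy as np

# Hypothetical fine-to-coarse lookup: FreeSurfer-style IDs -> region IDs.
# These numbers are illustrative only, not TigerBx's real label tables.
fine_to_region = {0: 0, 17: 1, 53: 1, 18: 2, 54: 2}

# A toy 2x3 "segmentation" of fine labels.
seg = np.array([[0, 17, 53],
                [18, 54, 0]])

# Build a dense lookup table, then remap in one vectorized indexing pass.
lut = np.zeros(max(fine_to_region) + 1, dtype=seg.dtype)
for fine, region in fine_to_region.items():
    lut[fine] = region
region_map = lut[seg]

print(region_map)
# → [[0 1 1]
#    [2 2 0]]
```

The dense-LUT-plus-fancy-indexing pattern scales to full 3-D volumes without a Python loop over voxels.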
Supports affine (C2FViT / ANTs), VMnet, FuseMorph, SyN, and SyNCC registration. High-level workflows such as VBM are documented separately under pipeline usage.
The registration pipeline was developed by Pei-Mao Sun.
```python
import tigerbx

# Affine registration (C2FViT)
tigerbx.reg('A', r'C:\T1w_dir', r'C:\output_dir')

# Affine + VMnet nonlinear registration
tigerbx.reg('AV', r'C:\T1w_dir', r'C:\output_dir')

# Affine + FuseMorph with ANTs affine
tigerbx.reg('AF', r'C:\T1w_dir', r'C:\output_dir', affine_type='ANTs')

# Apply a saved warp field to a label map
tigerbx.transform(r'C:\moving.nii.gz', r'C:\warp.npz', r'C:\output_dir',
                  interpolation='nearest')
```

Command line:

```
tiger reg A T1w.nii.gz -o output_dir
tiger reg AV T1w.nii.gz -o output_dir
tiger reg AF T1w.nii.gz -o output_dir --affine_type ANTs
```

See registration usage for a complete description.
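The `interpolation='nearest'` argument to `tigerbx.transform` matters for label maps: linear interpolation blends integer label IDs into meaningless in-between values, while nearest-neighbor sampling keeps them intact. A small scipy sketch of the difference (generic resampling, not TigerBx's resampler):

```python
import numpy as np
from scipy.ndimage import map_coordinates

# A 1-D "label map": background (0) next to a structure labeled 10.
labels = np.array([0.0, 0.0, 10.0, 10.0])

# Sample three quarters of the way between voxels 1 and 2.
coords = [np.array([1.75])]

linear = map_coordinates(labels, coords, order=1)   # blends 0 and 10
nearest = map_coordinates(labels, coords, order=0)  # snaps to a real label

print(linear, nearest)
# → [7.5] [10.]
```

The blended value 7.5 is not a valid label, which is why warping segmentations always uses nearest-neighbor interpolation.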
Use tigerbx.pipeline(name, ...) for opinionated multi-stage workflows.
Currently, TigerBx exposes the vbm pipeline through the dispatcher, a Python
alias, and a dedicated CLI subcommand.
```python
import tigerbx

# Recommended dispatcher entry point
tigerbx.pipeline('vbm', r'C:\T1w_dir', r'C:\output_dir')

# Convenience alias
tigerbx.vbm(r'C:\T1w_dir', r'C:\output_dir')

# Customize the registration stage used inside the pipeline
tigerbx.pipeline('vbm', r'C:\T1w_dir', r'C:\output_dir',
                 reg_plan='AF', affine_type='ANTs')
```

Command line:

```
tiger vbm /data/T1w_dir -o /data/output
tiger vbm /data/T1w_dir -o /data/output --reg-plan AF --affine_type ANTs
```

See pipeline usage for dispatcher and VBM details.
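For orientation, classic VBM compares spatially normalized, smoothed gray-matter maps voxel by voxel across groups. A toy sketch of that statistical core on random data (standard VBM methodology in miniature, not TigerBx's pipeline code; the array sizes and sigma are arbitrary):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)

# Toy "GM probability maps": 8 subjects per group, 16x16x16 voxels,
# already in a common space (which is what the registration stage provides).
group_a = rng.random((8, 16, 16, 16))
group_b = rng.random((8, 16, 16, 16))
group_b[:, 8:, :, :] += 0.3  # simulate a group difference in one region

def smooth(group, sigma=2):
    """Smooth each subject's map, as VBM does before statistics."""
    return np.stack([gaussian_filter(s, sigma=sigma) for s in group])

group_a, group_b = smooth(group_a), smooth(group_b)

# Voxelwise two-sample t-test across subjects.
t, p = ttest_ind(group_a, group_b, axis=0)
print(t.shape)  # → (16, 16, 16)
```

Real VBM adds modulation, covariates, and multiple-comparison correction on top of this voxelwise test.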
Corrects geometric distortions in EPI scans using a GAN-based displacement field predictor, without requiring field maps or reversed-phase-encode acquisitions [Kuo et al., 2025].
```python
import tigerbx

# Correct a single DTI file
tigerbx.gdm('dti.nii.gz', 'output_dir')

# Specify b0 index or .bval file
tigerbx.gdm('dti.nii.gz', 'output_dir', b0_index=1)
tigerbx.gdm('dti.nii.gz', 'output_dir', b0_index='dti.bval')

# Save displacement map with GPU
tigerbx.gdm('dti.nii.gz', 'output_dir', dmap=True, GPU=True)
```

Command line:

```
tiger gdm dti.nii.gz -o output_dir
tiger gdm dti.nii.gz -b0 1 -o output_dir
tiger gdm dti.nii.gz -b0 dti.bval -o output_dir
tiger gdm dti.nii.gz -m -g -o output_dir
```

See GDM usage for a complete description.
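Once a displacement map is predicted, unwarping amounts to resampling the distorted image at positions shifted by the field along the phase-encode axis. A toy 1-D sketch of that resampling step (the profile and displacement values are made up; this is the generic operation, not TigerBx's implementation):

```python
import numpy as np
from scipy.ndimage import map_coordinates

# A distorted 1-D intensity profile along the phase-encode axis.
distorted = np.array([0.0, 1.0, 4.0, 9.0, 16.0, 25.0])

# A made-up displacement field, in voxels: shift every position by 0.5.
displacement = np.full(6, 0.5)

# Sample the distorted profile at x + displacement(x).
coords = np.arange(6) + displacement
corrected = map_coordinates(distorted, [coords], order=1, mode='nearest')
print(corrected)
```

In the real method the displacement field varies per voxel and is predicted by the GAN from the b0 image alone.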
Extracts hippocampus and amygdala patches and encodes them into latent vectors using a variational autoencoder. Embeddings can be used for downstream tasks such as Alzheimer's disease detection.
The NERVE module was developed by Pei-Shin Chen.
```python
import tigerbx

# Encode to latent vectors
tigerbx.nerve('e', 'T1w.nii.gz', 'output_dir')

# Encode and save ROI patch images
tigerbx.nerve('ep', 'T1w.nii.gz', 'output_dir')

# Evaluate reconstruction quality
tigerbx.nerve('v', 'T1w.nii.gz', 'output_dir')

# Decode previously saved .npz files
tigerbx.nerve('d', '/data/nerve_out', '/data/recon_out')
```

Command line:

```
tiger nerve T1w.nii.gz -e -o output_dir
tiger nerve T1w.nii.gz -e -p -o output_dir
tiger nerve T1w.nii.gz -v -o output_dir
tiger nerve /data/nerve_out -d -o /data/recon_out
```

See NERVE usage for a complete description.
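The encoded latent vectors land in `.npz` files, which downstream code can collect into a feature matrix for tasks like disease classification. A hedged sketch of that collection step (the array key `'latent'` and the file naming are assumptions, not NERVE's documented format; inspect your own outputs with `np.load(...).files`):

```python
import tempfile
from pathlib import Path
import numpy as np

rng = np.random.default_rng(0)

# Simulate a NERVE-style output folder: one .npz per subject holding a
# latent vector. The key 'latent' is a stand-in for the real key name.
out = Path(tempfile.mkdtemp())
for i in range(3):
    np.savez(out / f'sub{i}_nerve.npz', latent=rng.random(64))

# Stack all subjects' vectors into an (n_subjects, n_features) matrix,
# ready for a downstream classifier.
files = sorted(out.glob('*_nerve.npz'))
X = np.stack([np.load(f)['latent'] for f in files])
print(X.shape)  # → (3, 64)
```

Sorting the file list keeps the row order reproducible so rows can be matched to subject labels.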
Computes quantitative metrics between a ground-truth and a predicted image. Accepts NIfTI file paths, nibabel images, or numpy arrays.
```python
import tigerbx

# Segmentation — evaluate ASEG prediction against ground truth
result = tigerbx.run('a', 'T1w.nii.gz', 'output/')
scores = tigerbx.eval('gt_aseg.nii.gz', result['aseg'], 'dice',
                      labels=[10, 11, 17, 18])
# → {'dice': {'10': 0.91, '11': 0.89, '17': 0.94, '18': 0.92, 'mean': 0.915}}

# Multiple metrics at once
scores = tigerbx.eval('gt.nii.gz', 'pred.nii.gz', ['dice', 'hd95'],
                      labels=[1, 2, 3])

# Reconstruction quality (e.g. after GDM)
scores = tigerbx.eval('ref.nii.gz', 'pred_gdm.nii.gz', ['psnr', 'ssim', 'ncc'])
```

Supported metrics: `dice`, `iou`, `hd95`, `asd`, `mae`, `mse`, `psnr`, `ssim`, `ncc`, `mi`, `ksg_mi`, `accuracy`, `precision`, `recall`, `f1`.
See eval usage for the full API reference and examples.
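For reference, the per-label Dice score that `'dice'` reports reduces to a short numpy expression. A from-scratch sketch of the standard formula (not TigerBx's implementation):

```python
import numpy as np

def dice_per_label(gt, pred, labels):
    """Dice = 2|A∩B| / (|A| + |B|) for each integer label."""
    scores = {}
    for lab in labels:
        a, b = (gt == lab), (pred == lab)
        denom = a.sum() + b.sum()
        scores[str(lab)] = 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
    scores['mean'] = float(np.mean([scores[str(lab)] for lab in labels]))
    return scores

gt   = np.array([0, 1, 1, 2, 2, 2])
pred = np.array([0, 1, 2, 2, 2, 2])
print(dice_per_label(gt, pred, [1, 2]))  # label 1 → 2/3, label 2 → 6/7
```

The same structure (per-label scores plus a `'mean'` entry) mirrors the result dictionary shown in the example above.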
TigerBx ships with a ready-to-use skill pack for AI coding assistants that support the skills standard, including Claude Code and Codex CLI.
Once installed, your assistant will automatically know when and how to call tigerbx.run, tigerbx.hlc, tigerbx.reg, tigerbx.gdm, tigerbx.nerve, and tigerbx.eval — without you having to explain the API.
```
# Project-level (this project only)
cp -r skills/tigerbx .claude/skills/

# User-level (all your projects)
cp -r skills/tigerbx ~/.claude/skills/
```

Reload Claude Code. The /tigerbx skill becomes available, and Claude will proactively use it for any brain MRI analysis task.

In Codex CLI interactive mode, run:

```
$skill-installer https://github.com/htylab/tigerbx/tree/main/skills/tigerbx
```
| Skill file | Covers |
|---|---|
| `SKILL.md` | Environment check, module dispatch table, conventions |
| `bx.md` | `run()` flag reference, output naming |
| `hlc.md` | `hlc()` save options, tissue maps |
| `reg.md` | `reg()` / `transform()` registration modes |
| `gdm.md` | `gdm()` EPI distortion correction |
| `nerve.md` | `nerve()` hippocampus/amygdala VAE embedding |
| `eval.md` | `eval()` metrics, use cases, kwargs |
| `labels.md` | ASEG, DeepGM, HLC, SynthSeg label tables |
Download the latest stand-alone release (no Python required): https://github.com/htylab/tigerbx/releases
After installation, all subcommands are available via tiger:
```
tiger bx --help
tiger hlc --help
tiger reg --help
tiger gdm --help
tiger nerve --help
```

Supported platforms:

- Windows and macOS
- Ubuntu 20.04 or newer
If you use TigerBx in your research, please cite the following:
- Weng JS, et al. (2022) Deriving a robust deep-learning model for subcortical brain segmentation by using a large-scale database: Preprocessing, reproducibility, and accuracy of volume estimation. NMR Biomed. 2022; e4880. https://doi.org/10.1002/nbm.4880
- Wang HC, et al. (2024) Comparative assessment of established and deep learning segmentation methods for hippocampal volume estimation in brain MRI analysis. NMR Biomed. 2024; e5169. https://doi.org/10.1002/nbm.5169
- Kuo CC, et al. (2025) Referenceless reduction of spin-echo echo-planar imaging distortion with generative displacement mapping. Magn Reson Med. 2025; 1–16. https://doi.org/10.1002/mrm.30577
- Sun PM, et al. (2026) DeepVBM: A fully automatic and efficient voxel-based morphometry via deep learning-based segmentation and registration methods. Magn Reson Imaging. 2026; 128: 110637. https://doi.org/10.1016/j.mri.2026.110637
See Label definitions for a full list of anatomical regions used in segmentation outputs.
See Validation for accuracy, reproducibility, and comparison against other tools.
Contributions are welcome! See the Developer Guide for instructions on setting up a local environment, branch and commit conventions, running tests, and submitting pull requests.
This software is intended solely for research use and has not been reviewed or approved by the FDA or any regulatory body. It must not be used for diagnostic, treatment, or other clinical purposes.
The software is provided "as is", without warranty of any kind. The developers assume no responsibility for any consequences arising from the use of this software.

