From 3cb3916b11b3e2d0afe65efde362d0a8db8d5877 Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Fri, 8 May 2026 14:08:48 +0200 Subject: [PATCH 01/10] Remove 'napari-DLC' prefix from headings Shorten top-level headings in the napari documentation by removing the redundant "napari-DLC -" prefix for consistency and brevity. Updated docs/gui/napari/basic_usage.md and docs/gui/napari/advanced_usage.md to use simpler headings. --- docs/gui/napari/advanced_usage.md | 2 +- docs/gui/napari/basic_usage.md | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/gui/napari/advanced_usage.md b/docs/gui/napari/advanced_usage.md index 20aaafd25..6e0af3a36 100644 --- a/docs/gui/napari/advanced_usage.md +++ b/docs/gui/napari/advanced_usage.md @@ -9,7 +9,7 @@ deeplabcut: (file:napari-dlc-advanced-features)= -# napari-DLC - Advanced features +# Advanced features napari-DLC provides several additional features to enhance the annotation experience. diff --git a/docs/gui/napari/basic_usage.md b/docs/gui/napari/basic_usage.md index b10880e4a..913d231d3 100644 --- a/docs/gui/napari/basic_usage.md +++ b/docs/gui/napari/basic_usage.md @@ -9,7 +9,7 @@ deeplabcut: (file:napari-dlc-basic-usage)= -# napari-DLC - Basic usage +# Basic usage `napari-deeplabcut` is a napari plugin for keypoint annotation and label refinement. It can be used either as part of the DeepLabCut GUI or as a standalone annotation tool. From 98a9fbd136e6ad409f22e19c9e1af943dfff4a6a Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Fri, 8 May 2026 14:09:22 +0200 Subject: [PATCH 02/10] Add tracking controls documentation and image Add a new user guide page for the Tracking Controls widget (docs/gui/napari/tracking/basic_usage.md) describing UI, workflow, requirements, tracking actions, refinement/merge tools, model attribution (CoTracker3), troubleshooting, and limitations. Also add the accompanying controls image (docs/images/napari/tracking/controls.png). 
These additions document how to run point tracking, refine results, and merge tracked points back into DeepLabCut projects. --- docs/gui/napari/tracking/basic_usage.md | 291 +++++++++++++++++++++++ docs/images/napari/tracking/controls.png | Bin 0 -> 18444 bytes 2 files changed, 291 insertions(+) create mode 100644 docs/gui/napari/tracking/basic_usage.md create mode 100644 docs/images/napari/tracking/controls.png diff --git a/docs/gui/napari/tracking/basic_usage.md b/docs/gui/napari/tracking/basic_usage.md new file mode 100644 index 000000000..a3717bf88 --- /dev/null +++ b/docs/gui/napari/tracking/basic_usage.md @@ -0,0 +1,291 @@ +# Automated annotation with point tracking + +```{seealso} +For basic usage of the annotation plugin, see {ref}`file:napari-dlc-basic-usage` for the recommended workflow. +``` + +> We use third-party, open-source models for point tracking, and we thank the original authors and developers. +> Please see {ref}`sec:napari-tracking-models-attribution` at the end of this page for information about the tracking models used in the plugin and their citation information. + +## Overview + +The **Tracking Controls** widget is designed to help automate DeepLabCut annotation workflows: + +1. Manually annotate a small set of keypoints on a *reference frame*. +1. Use a point tracking model to propagate those keypoints forward and/or backward in time. +1. Inspect, refine, delete, and merge tracked results before exporting them back to DeepLabCut. + +## Requirements + +```{important} +Before using tracking, you must: + +- Load a **video** as an `Image` layer with time as the first dimension. + - The easiest is to drag-and-drop one of the `labeled-data` folders from your DLC project +- Ensure you have a **Points** layer containing DeepLabCut-style keypoints. + - If annotating from scratch, drag-and-drop the `config.yaml` file from your DLC project to create a new Points layer with the correct metadata. 
+ - If loading an folder which already contains a `CollectedData_*.h5` file, the plugin will automatically create a Points layer with the existing annotations. +- Annotate at least one frame with valid keypoints. +``` + +If you do not have PyTorch installed or if you are using the plugin without the DeepLabCut package installed, install with: + +```bash +pip install napari-deeplabcut[tracking] +``` + +## User interface + +```{figure} ../../../images/napari/tracking/controls.png +--- +name: tracking-controls +alt: Tracking Controls widget with annotated keypoints and tracking results. +--- +Tracking Controls widget with annotated keypoints and tracking results. +``` + +### Showing the widget + +Use: + +> Plugins -> napari-deeplabcut -> Tracking controls + +### 1. Model selection + +- **Tracker**: Selects the tracking backend from `AVAILABLE_TRACKERS`. +- **Info button**: Hover to see tracker-specific details. + +### 2. Layer selection + +| Control | Description | +| ------------- | -------------------------------------------------------- | +| **Keypoints** | Points layer containing manually annotated DLC keypoints | +| **Video** | Image layer containing the video to track | + +The widget automatically updates when layers are added, removed, or reordered. + +### 3. Reference frame selection + +- The **Current** spinbox always reflects the viewer's current time index. +- This frame is used as the **query frame** for tracking. + - This means that the model will generate tracking predictions based on the keypoints present on this frame, and use them as seeds to track forward and/or backward in time. + +```{note} +Only keypoints *visible on the reference frame* are used to initialize tracking. +``` + +### 4. Frame Range Controls + +The tracking range can be specified **relative** to the reference frame or with **absolute** frame indices. 
#### Backward (left) + +- Slider: relative negative offset +- `<< Abs`: absolute frame index +- `<< Rel`: relative frame offset + +#### Forward (right) + +- Slider: relative positive offset +- `Abs >>`: absolute frame index +- `Rel >>`: relative frame offset + +### 5. Tracking Actions + +| Button | Action | +| ------ | ----------------------------- | +| ◀ | Track backward | +| ◀◀ | Track backward to first frame | +| ▶ | Track forward | +| ▶▶ | Track forward to last frame | +| ⟳ | Track both directions | +| ■ | Stop tracking | + +```{note} +Tracking runs in a background worker thread. You may edit layers while it is running, and results will appear as a new layer once tracking is complete. +``` + +## Keyboard shortcuts + +Most tracking functions have keyboard shortcuts for easier usage. + +```{tip} +You can see shortcuts and their status using: +> Help -> Show napari-dlc shortcuts +This is only available if the Keypoint Controls widget has been opened. +``` + +## Tracking results + +```{tip} +**Hiding layers, and being able to distinguish which results originate from which layer, is essential to using the plugin effectively.** +Layers can be toggled (visible/invisible) with `V` by default or by clicking the eye icon next to the layer name in the layer list. +Grid mode (toggled with `Ctrl+G` by default) can also help visually separate different layers and their results. +``` + +Each tracking run creates a **new Points layer**: + +- Named automatically (`[Tracking vXX] Ref. layer name - tT - Tracker name`) + - `XX` refers to the iteration number (if multiple tracking runs are performed from the same reference frame) + - `T` refers to the reference frame index used to generate the tracking result +- Visually different from manual annotations: + - Cross symbol + - Slight transparency + - Green border + +```{note} +The original annotation layer is never modified by tracking. +This must be done manually by merging; see below. 
+``` + +```{important} +If you run into accessibility issues with the default visualization style, please [open an issue](https://github.com/DeepLabCut/napari-deeplabcut/issues), we would be happy to expand settings and provide more customization options if requested. +``` + +## Refinement & saving tools + +### Deleting tracked Points in future frames + +**Oftentimes, tracking results will be satisfactory for a certain number of frames, then start to drift or produce errors.** +This is inherent to the tools, and as such we provide a simple way to delete incorrect tracking results in future frames while preserving the original annotations on the reference frame. + +1. Select a tracking-result Points layer. + +- This is always disabled for the original annotation layer. + +2. Select one or more points on the **current frame**. +1. Click **Delete selected points in future frames**. + +Only *exact identity matches* in future frames are removed. + +```{important} +Points on the current frame are preserved so you can correct them and re-run tracking. +``` + +### Merge (save) tracked Points + +The **Merge tracked points** workflow allows you to: + +- Combine multiple tracking passes +- Resolve overlaps or conflicts +- Produce a clean final annotation layer + +This is especially useful when tracking was run from multiple reference frames. +There are several merge options available to help you achieve the desired result: + +- **Fill missing only**: Existing keypoints are always preserved. Missing keypoints in frames are filled with tracked results. + - Intended for merging final tracking results into the original annotation layer. +- **Overwrite existing target points**: Tracked keypoints always overwrite existing ones, regardless of presence. + - Intended for replacing poor tracking results with a new, updated tracking pass. + +```{danger} +There is **currently no undo option**. 
Any **deletion or merging action you perform is irreversible**, so we recommend keeping track of your layers and using visibility toggles to compare before/after merge results. +``` + +## Workflow example + +### Loading & annotating from scratch + +1. Create a DeepLabCut project and add the videos to label. +1. Extract frames from the videos. + +- Currently implemented trackers prefer continuous video frames. We recommend avoiding large gaps in frame indices, which can make tracking more difficult. + +3. Go to the `labeled-data` folder, then drag-and-drop a folder with extracted frames into napari. + +- This will create an Image layer with the frames. + +4. Drag-and-drop the `config.yaml` file from your DLC project into napari. + +- This will create an empty Points layer with the correct DLC metadata, ready for annotation. + +5. Annotate keypoints on a reference frame. + +See {ref}`sec:tracking-workflow-guides`. + +### Loading and annotating from existing DLC annotations + +1. Go to the `labeled-data` folder, then drag-and-drop a folder with extracted frames into napari. + +- This will create an Image layer with the frames. +- The existing annotations from the `CollectedData_*.h5` file will be loaded as a Points layer. + +2. Inspect existing annotations, select a reference frame, and refine keypoints if needed. + +See {ref}`sec:tracking-workflow-guides`. + +(sec:tracking-workflow-guides)= + +### Tracking + +1. Open the Tracking Controls widget (`Plugins -> napari-deeplabcut -> Tracking controls`). +1. Go to the desired reference frame (with annotated keypoints visible). +1. Select the forward/backward tracking range using the sliders, OR track to beginning/end of the video using the fast-forward buttons. +1. Inspect the tracking results + - You can use "Show trajectories" in the Keypoint Controls layer to visualize the trajectories of tracked points across frames, which can help identify where tracking starts to drift. 
- The plot is filtered by selected keypoints, so you can select a subset of points to inspect their trajectories more closely. +1. On the frame where tracking starts to drift: +1. Select the problematic point(s) and click "Delete selected points in future frames" to remove incorrect tracking results while preserving the tracked point(s) on the current frame. +1. Refine the keypoint(s) on the current frame to correct their position. +1. Re-run tracking from that frame to propagate the correction forward/backward in time. +1. Merge the new tracking result back into the previous tracking layer (e.g. using "Overwrite existing target points"). +1. Repeat until satisfied with the tracking result, then merge into the original annotation layer using "Fill missing only" to preserve your original annotations and only add tracked keypoints in frames where you don't have manual annotations. +1. **Remember to save the original Points annotation layer.** This is the only step that writes back to the DLC project folder directly and integrates with the `h5` file. + +```{note} +The "Show trails" feature is currently not implemented for tracking result layers; please [open an issue](https://github.com/DeepLabCut/napari-deeplabcut/issues) if this is something you would like to see in the future! +``` + +## Troubleshooting + +### No keypoints found on reference frame + +Ensure: + +- You are on the intended frame +- The correct Points layer is selected +- Points exist exactly on that frame index + +### Tracking buttons do nothing + +Check that: + +- A video layer is selected +- A keypoint layer is selected +- Tracking is not already running + +(sec:napari-tracking-models-attribution)= + +## Model information and citations + +### CoTracker3 + +CoTracker is a fast transformer-based model that can track any point in a video. It brings to tracking some of the benefits of Optical Flow. 
+ +[Link to GitHub repository](https://github.com/facebookresearch/co-tracker) +[Citation information](https://github.com/facebookresearch/co-tracker#citing-cotracker) + +```{admonition} Emprical observations +--- +class: tip +--- +This information is based on our own testing and experience with the model. +Please share any feedback or insights you have with us! + +- **Strengths:** fast on GPU, can output 10-100 frames of satisfactory tracking results, depending on difficulty +- **Limitations:** strong preference for continuous video frames, struggles with large gaps in frame indices (e.g. automated DLC frame extraction via clustering, or uniform extraction with large step size) +``` + +## Limitations and future directions + +- We currently only provide CoTracker3 as a model. It is however easy to add new models to the plugin via the registry; feel free to ask if you would like to contribute a model or see a specific model added! +- Saving tracking layers as CSV is supported, but they will not be loaded correctly as tracking results in the plugin. We currently recommend using the "Merge tracked points" workflow to save results back into the original annotation layer, which is then saved to the DLC project folder and can be loaded in future sessions. +- If there is demand, we may add support for saving/loading tracking layers as separate files in the DLC project folder. +- Manual curation is still essential for good tracking results, and the tracking models do not fully replace the need for manual annotation. +- Ideally mixing manual annotations from challenging/distincts frames with tracking results from easier frames would yield the best results. +- Be mindful of training set imbalance: if you flood your training set with easy frames that are well tracked, and only have few hand-picked frames with rare or difficult poses, your model may not learn to generalize well to those challenging poses. 
+ +## Getting help and providing feedback + +- [GitHub issues](https://github.com/DeepLabCut/napari-deeplabcut/issues): for bug reports, feature requests, or general questions. We welcome your feedback and contributions! +- [Discussion forum](https://forum.image.sc/tag/deeplabcut): for general discussion, questions, and sharing your work with the community. We also provide troubleshooting help and guidance here, but may open an issue for actual bugs or feature requests and request more information there. diff --git a/docs/images/napari/tracking/controls.png b/docs/images/napari/tracking/controls.png new file mode 100644 index 0000000000000000000000000000000000000000..682f8e59e619c77cd8cd37b2b61f23c57c1d05fd GIT binary patch literal 18444 zcmc$_WmsInwk_H~a3{E1fZ*;LGy#GHcXxM}puq|5?jGEOySsaEcfX75ea_uSzH{Gw z@5lSmAF#T5bx~{99Al0->WA!CaYQ&=I1mVg_*p_k9s~jl0ba2%P{0+MY{N(32bjIQ z_$N^LDE>b1$2(&oX(14(DiZ!l7ZUgz)<#0r9t0u?dV7II?wlKfK;l`SMT8Vww2oK2 zHNF|AGn^sImS!r#n@IYAY9UP0EXCpT<^>(=4kMZ#ekSZ`!A-6nhJEdsP%C3D%bRW5 zGbt&bFJn2eEE=Rcm@!vgvE?cG6f0DCQ$S~i0fS-zS)d3;Lv@qbbW;d00?eYoc9Bdqw<$wtV6?I{Yh62TvOj}z!4DS~H?REE{#*bN^|K62U z08{~u>@6!rc|on;#~+Bn(C%N*OS!QWiZwBA(5A zQql{4BnU`UR8(#}w@qwXR8-)TA|fK|%6LAt+S*co>@d*1YBt@!-XRF)&O7Zg*=d~?h8ErQWKJq^v!8!@=JB8!pgnK7n&>#@7BW&p64i89)i-=%H zU>#1ZuF_-HqY+>PEIFSLdmlD)s;ULXFeawmj?m3l zM8zFZbx;KdvT>X9OWrMkK8Zma-U|b_bT-&j+}zyb<2E>mhzXiVXrQxej<~b-KBD$} z?Xj(WbOI!T7Z!fnMKa848g6R1$Vfc+=YjUkoxD({KhYC+ml*=R&THP#cMt1>x60kp zrTzh?XTKrH&h`s3Ql|31-ebJ2X-!>C_Nqw@wahlc4m77A50acPv>?mFF80Qf|Do_J zQe-$Gth7I7vQSyPm;fW1KH)1@4XT>SCy%rGAxmshXto&hvp#==k3B6fIA5>!dvgVCt zk2v2U;s(x7cLcRs}i)bj)d= zmUuY=wc!ixdur}2!h_##yu(j>&5e&Nz>bf7#z=^ITj6m^!hg54Lo`7jhm8GT-`$JQ zIl@ik--Fbt2rV>z(BP?NE7Ocm1x0dHMTvq}#x@sPbP~5K>NfHl_>EngMH#T4{=WW?XM{LwB?#vKjD5`_vUaOHK#;wqW; zLK@R$L;B*8AzdXE@~p zGYyFOIN^7yecf6Tn_7!}$F_N7%*KxB__m0+KkHoQ-{m}MCmO3Q4>z~+R^eyPUv z=fnA4;{6PzOoF(uj_0w&2|ToC@LTO?bZwkuX+&)sGJY;+NTDhjp(*M-XfeIv9AC!4 zrJK^2f<57oH}HXxJvEjHHs{s2HBe%DQX=qe+{R5G;pOA~_OT%j 
zzOFBcZLsa!PGrv~Y@Q95-kjd?(WuUu(?ZRb%mwUtBn@tTcWA+m2-5ep4tR#ByDliT zA3CIoY=_j1G@W=z+To3WWnC}nMR^Mdj-5G$X|9?alI}b!n;cy$Zrees(99V?dXnZT;(^=@nT{)a8XFr_DMs6au!7cjDEp z%JUyJjTV>Bv}R;fdtOgg8{o^0o_N~c&xjr#o+#MZ!LKi;`h_K@EgUF>`5iw$w0HS{ z-EP*gO3eU!QW+Od5dtzAmr1tYJ0%e}M|ZNCh(7gqedXNxTNX5BVg%i#TtHZ3P@yA6 zHp)Q|DnVfe??{2yY%!w6#rJ~AlNmbV$6IDuJjI4bNN1wpekOrQyo4x&&@w5(*Ldd} z?ZTOhx`vy4+=68&Y*nCzM;Bhae+>=9D(Cu>>2$qB1&uWLN@)VFO36h8M1b^j65f(v z$rcDyGrM`@sQopV$KC z8`Cvl(5X+GS&%5JPj9^Wo~L59b=O`WVcc(@1<<`3@VxJb?L*DCOFR|nm5;(f1Rbh)w zbu#;am{RHeD*k7e$JsM@<@vcm`uL!{Q*9-hdip9TSaQ$#{re7&W1kvg!3g40eYl{xBs>&UK%#P< z>Wyx(cX$JMaULWG1iE7uI=UV|GME}&pr$2yAS?t^r6M;=ZF0Q^#-To%mye znuo#nRb4iZZam|QJ|%Q?4Q^A+au$3^mfg3G$1dyVRpWQ;8;^BaljLV01XmlX#gUk4 zKNNIu4a*0xxr%#17PaJl{KW{oW5E*6q?H@8@{~x?bWod~Cw@m+!_`220}?T@2*j)T zg~2@t0Tx!od@9n!+C0d+8@pR9g~C!-v3%q~g%IC|Cd1qJbV(;^I8bkrtpOvjT5N-J zX@iJaA=%&YD!qcUa83vY?mWob2A?oQsV#$D!@9mjCS#Vulyp&B#gyW7?W=*gY4=E$ zfVf4N>Gy?SkwzNr%ONB=eTkH3ejX`j&yfP_2a2Ma{V|i4U5t7@0#EvzXR`j7WKRrR zN2+9taV6sc956An!$@_TSFcNZbR_loo~iAhNn z-^gBWh%azifk_qxHJTapa^<<}>S)7Ex~-BInmNd|^MQL)g^BF{>m z(s79f0eo;561G)L^Y!&8r&cTey}cB}kH@D`%o{{Z8>f_8*NU&-eIZVJPLQ3~BTub- z#okj}eHDQ_l8frQ4>I=p!DjtuQ{ZKb@ugCi%xd-NGc7o1+~iXg^)1HJkulkMv*^%S zPlEOjQGC6~@4TOl`QZMYg&!pJMtC9|&X)K7@%Ds1eOk&Gf1`d(&0fz?FCN?a@1UVK z>MwUUU8)G@$~E7IF0dKQKkK2%wL3od5p|Q(JUj21+7S*qkBO%ye$_G`&lFN=vw(fs z5>@e__+0f}%RF_Raqhi6*zT)XEZ`=9`qA5n!SRyBw+Fg?{hj-D%$uKH(m zRAQKa=_hO(v=TJwu~Tjdlm#ZoL4nQgAMz>0U?Am+7sMjd{lwSKQ;YBKKpx**k=WRv zo}L*fgH<&P)cv1DFMXb9hZE`e;_;$mt#(g}1f~MAtQwzQkie(31&v}=f|vIfYt8Qt zN-yjCjoZr|u*p*=!hJsluv*mEc)|`2IqB+_#wsb1^5wFs9(f!&#zKP<{p$z6$hLX? 
zyyAs|cOKf2JexHzY}8aqv~(+LU0{mWlx^p@$O^LgGd0#@Pz~9h7PLQ=A2I;)$?$3l z7~_8hXJSI`>gy99KKV|*Z4jilX18(D>V|wV?7Z~3|0DHpulp+2fmr;wz%tRxC?DC} z@|dZZsBblxN&BId?S&xqsap1>lu26o_{D|#PYw@iM+|D_R@2gBmO}-!biZe>Sf|V_ zrTnC|zqI#F9v1NjFy}>ZpI*$T;xZv445hNGL`9s%>j3o@E#msEEF74$pydMR3-c}{`f z7V#{Zxj}@O<$}|lov?pxjVOUZ8#88vH5r;o@6Lv4q4d`UuUHti@9*D&Ohb-MF67uU zTmD_TWOU3Nqr!K-aer2UeagaiGvi*VbHg5+1NInitlJwXZ<->TYevne?-XxpB93Lf zUk2lR@lg`K5_>7vv!a{7>U@^2b7=~%(zK4XCQseJeoIFGhYKy*hy&smW8em%?wQ;Y zsMqh%vToW2(nhr#f`NR%v~cuA|2@4%135)==`DCb7K%5wg{zy1DiYFXZ)LC2AbiTL z^a(H?W!+F-3y(X^p*2vtYCAy^n>lT`*O3Qr1<#M$VWD-HhyyYdHu=?TOrE+FaJcB7 z?XP@#lPg6P&dY@dM#LD^uZW@d1<1pMWD z^Qq_}o*#%1!Z?dxnoG4TEuhOpv-K97JG z3xrb()wKmXgk)&LuVTvD+S;w*0_|++D)g4JRsv)GM{rBah=ic#Z0mO+^Y`=WC_ip< zvnZ#w>jS#PR%L?~d20JH5(aSLn8a7HD92mH40l|4lM=`CxB`Xzpf(1&DHV1(HL@!N z8;$VOO)iDiqWp}pcZR4OsCPv(UkqWh-HEZmo(UVHPOO<3&c>=9yi z)uoNJhrKaN%qGu$@o7g1M42YfDJY9uN4<&oA}BMYL=1AD#frjExDm=j>!dEG&sj7vu94 zX%v_9gfr}|Qn5&nnk8k?m(VmY&R&H5xIcLO^K5KNSWfv+ zqk-)9{^bi0$1VQwYg(Ncajv1|dw9N^HW5Wt34tN2VXP`A#Ynrv8;B6h@VQyJbELMP zRsMLHENC1Fa`RISb)l#B#jBdj##Cumg-|1ZE9D?TcG%q3mQ!^_g7($vjQp0G}WAKJ8@Itk>)O3KQFZ%(y3 zx4oaBTOLP^M9j##Nqw#u_y}?M-~yUgyPj{6yEYTzc?igZTM*sRx1$3Q@j}>2?9UxS zkQ5bX?k>HbA))+C?{=amZ0{bmzY5w&FxI_3KF+_Kp7J~)S4RNMCOx;B5E25?wWg90 zG5>W6o%eb=1_qvGH`{_j}~49Ri)xYzB{??{{8U2uL{&olIhxN!84C{C{$& zgYQeuR&R{vC(9f&FaYame12~lI;@bP*>jiO%cSPH{)u!{?v01t{Acp@FHFsL=Di#i z0s>5TWfaQY{lgDck>|h`-W#KmhxT8(k>3m9vB~@218?Z6NCYQZ0JHgfo!#C#?{DnR-5*eh zH*v{!>uj88s$iRc{xP%~aKU_7Luk(l4f}(OlRSS3HK7%L=aBNpfCxQ6BjmttQiwvM zNHO=F02EXU9j0Pxg?9)=KNWRrKn!!=IcK*V@MPB03_(BEE`Bv?r-mfV)dfD#LzSTJ&#ZH3f*y|5e3@m&%MtvZs z(?9x^HqNlC$rjY@9PVhCC-L>~-6>XGb`dU^nFD`_Ku`*4Wy%!54k#-}wola(`a$UW57bj%r2 zU9;J)5`t9nb04*Z?-xIjRV?@QvN~YS7Q-p)41f_k=C=UDI2PL~V)Gzf+})^91Z2*W zZ-%(Z5xWfOYrS4ScG|PlHXZDSB{1X^i?zk$H!FOD*xEVmHb}UV%Eaa(a))=I8vTWY zX1mZF70MSn@y7CJa!?=DA8caD^V@IhH zkMX=cS!I+)`|_GGgVIi#KOh}*UOP&Bj<^x(cg%KGjF_A?v^e%ov)~WFtvxW+N0V+s zIdbO=B7?beLdCzIvwl?w#bN$;8o+ZYow(brA&6!-8!o(Hw;#bzq(Y1QERv^NqPS|B 
zsf&y`yc%f7s-ir;_8qp6%7R+@*R6aMh69D!p*0E zP!kIaw3v}{R`!j~07<=HLvF;fBE*-q4m!KkS+;5Z+QuNBI-=EwQlpi&`+`GOX967OCCE zhLeXGJRy3*@3O^03_^fSL5EJQ|&E5zdxDW*mQsgd;j8lAz=Px)ZF1+o^SL(lZ$2s^&4WE63hS5D8)$$`l5cElW^t9zZUXv~Au z_He``ARy3-fBWaB^~7g!rwtO_;AcsrRCRDD9*D`n7yC^dzM2{KB z*Zp9Lo;l;6U0!#Y1@qfM`K8X4s3f`=R@BA+4|Vun?pQK0_=< zgZ!)^(!pR%%|97NS?kKr+0_UZptn)SF1fg}=PL*X5Fxmn`s!|eLskZXFr-xU8bE)88pl%-cHHPd-2MekXF_!&Rpe+?uH1jp z4fEJ;K~mPb2yBcl#rz=FFbHDa>?uZlQYneCM9U<}36svU5Hq$kWWK!NR5*pyPjS;{ zW6acn(+X~|<*~o6qi(acU zz?xwrWUxh4`AQ?x2c346k+}NzbTwy<1-+9yl%L%8tVN> zmm*YY@(1XjWG){kScgBHUfw4+CKe@D+b9{nK#!*=^Cc;14m=L`B-0OKc=pG%Y`$QDHTQ7Am2BV_iJxr}e*BG7sb6ikcJ7m%Agg$XIyh32K0UO` zcHT;LI4G=HT-=do7RPi&f3VbU!vVnC`-qgF9AY}nW3w%pZF=44;%a1(=P#b&Y65iy z?&$PCIy!f2LKNjhwh8=Cl_VlZ6Og_+@5k_kG#qK(0Xriwt&R`+hN_SOkW$`6f)IWj zs^AB1D7XvqG{_Ks-YcMY<_Y$1n#unO?=8?{ENz%m0P&?t{J&LP{sHA_7pI)cQ3 z0sg-1WsyNhVytjk-;Iip;qKIVxJtjz(ghcNbBGf;xYz)^9|Mxo1-@6tk zK=hEeC@OugrwHBu32^picmB<8hirZZST0e>{)r#^BFHC-ru>TvtBh^C6h-nPOokU;c&a!J#j*T6OU7;?#A1(_K_CmzRv9(HIG> zl3x<)6x%4|A|(OELd`(zx~X7{NOQXO1?-vsss5?96lPZFM4)_{QZnNirKLVdE#Gu z9f?T}G_xdT6m)uzXovd+{QQr=kGwzKFHgFNQ<}=w@eX~}dV|aAZA&O4Y2?}#e=F40 z2!Zeq)zbGVj52VfoleypIrG0TuJv~rG(`0EV-hW#@~VIht$`M!IXmJD&RNrn!iX%; zv!1jKvlmH#8k*#edz!S%GO9l`_^JP#ncf}A;Q_gFCpFgQE%8W%i6CI|a}Yjo*wJ?E zE;qZ#2oSy}kpB7px&i8FN!{$>o3^iA%5K)USW>zYhE;EbnJJ4W+&gb7BSzR)Gm z$4HtgUP*Ozo>7m74~vUSbf(A0oR?uafH2yGDQ)Qfv51Uw=EV8wvTiBRgwGu2d?L zzS@H08YQX7)}WwCMzL0F_1sAiU2s67-5{x|=9BwP)#7fy{OSOx?F#YsF?DU9aAKJy zWI6QTsv5UEaa2GvJIw#2|JCGo+YSxwkl~M}!oO9S9M;trKxw8(gM>rkL;V*Wdm0i@ zs4-)LA3hkD8@{E+yR52-B4lhiZKuakOQ8OuqF7otV1LuN-^B6%2P*i=3tvdbEn)x# zDOv{!86%?(F=*ZBP@0v^3t8V1(Msc0ZrMM0**`R8$aop=Ja>`UQomIz5-|!||6X= zyPxJ$2*@z^ZNJSrpV|VWS2~V`Wg_|0gEzL^tp1hh%eH&KgK`l)P)?7EGyHal_--+>K1#88jY0 z_U)XTlaaDZeiP-;Tp8P&8W`cs1M+7{ z{`GJ`E%Zg~lO8kLD#&gJ1P)qs;AODPIP3kqr}a<8WI!bk8Uav1Nvho4*nzdDNqt6W z;*&+WZc|gxMxWBKMLUHR77k21O~SirIN}=L7x*1-FWmQZcwh#E-;R z<3go0N`BzIOh|QN^g~l!y*W{Yu=LT3gdSY(k5Y5Wo|c?cY&&>bmyT?JnP1|p=iNP- 
zGUp?XgL`Jf;`II40oSC*Rx34SDz#|Id1X>Pdl-xmg9>2IS=Y=KH!= zq+T!ms^uO~D{Tj+XP&~*h3;xQOPLW4>lF|BCT*jkaVN6?#NWU<0417=QBnzGgUZ1Mx5XT2ST_(1V3%J0Ek_*`7(+A_B!#Koc8 z%o#h^yg`8oc=?lvCxV*NY;6xutlTHds4PDU;epk1K8z;}m|TAhB&Gcnni&OM`Z97p z(|anbZNA>lMR6bo+>!@aO=8mQmyu(jcP6r}`d_+P|01aTqxIz#*(nME0nS8A-n~!s zMBK;y_)%=OmTgvf?4Y!*6&7HJo6r-U|LH13KdQUae4hk%CS{WWZ>qvuyu?eDD%OnP=1bL|kKKof$Frb9iZ z2Y~_etwjJi|0WLOO`*{Jxs8Gr3;P|s@$AtfK~y9eIT3VF-CQvSaMP6=`})=_ZRn7X z_ET~>0XF#N9Zt5k6!BAT%;7-oU2ic> z2wUm7e~^_@ZHI?fboT_ZKy$STFYJ3qN?V5o+dW5_B|08v_k{hR(;@$p^y&9ZJB~Np z&%yMBuoy(T;X%I3rA zgJbJgSL-Inn{s$RYuwP+BiUY?WJb~GZdUyjbj+Cz-#RjOzHxwjWJLX$atj4uT3)>A zSTv$35xZ4y+gd_G!!J4PcoJm>i0n*ZTcuyIa|4H*y2E5SXF~;!#gXD-MPg$0!4WV4 z6Dr1^3(LjR#Qf45tnq*)_u?X1csgs}C0yoN*D9%*$XRyYaH?M|m2}>$HJyx^+kGRV z0#5JV2I+Pn8L?8yk+}Sm%qZ zs1>5_rdEyo`v1m-LE<(FM82`crFiyTql)o2$7k;Ix>13J^s~P+!~MTWPXY5JrznB) z8+$c^*w{WK~KX?&DU9~0(JJq zPoalH-nwBmh!&%R)rArr<+SH~w7w9TLjJZQhqG0_?uCl64VClHmpMQqpnlOxHP7># ztI%>-6HbM6OnMVPAe#UT*@`gljfy@SU<>Bs+{ml%5sQBWRIsvLa<79)jN*xolPk@k>va6TTI|1a}VK=Ne zJse6$@64zHFtT$_3ho1_fw&qpmnkhluv`qt-$y2z*F=;$@aINz#uZv>)zTrC1Ex9XVEb z(*v^l`Zi^o7a6Cwc_p__brSjM-LVFhAu1@iVMyr@ykCtxfVOf*Ne6z)9=2aVb3Iv} z0lkry8{y8_jE=Z8H?w2D#xvf=Dmw9KZ7eV@n>%cZoX#s8n^iV`ko9cZdO(*364kExQ4M9)WmUak z&}ONzt*I4kdc{W(X|bkhsI+eCMv@F5&CT^n;?3xZ$3GK~<2`6+?iqT%bj>4kJrMsY zKi2oWrWL&~Q+H_=VIMq~=+?<4RM@EjN@?uA7$3DBG|X)GMaN zYeo9%(n?lpZeCiEKBMaC{=VFe6mT&d{gurB!=3)!y0BSHiG}ztTKP7q>;1;aEGAQE zlh0hf`}sqS4v_bsh?^oaDu{pzJ9uDPw1s^obglsMpSgXu;IWc^CDPHX^FNo<7sZ1A z(75T^H6>S!hskVR3~E}5#3&CuJr5tkIyzTAxS0P+Ty?%0`o}=S4+3u7qrvS0}edx9DGc(Z3)!o$q zwsHy~pwKohvV84d`TS|L$(9OGnsKVDD~r6S;{loelSu1t_3{z5qyC?ZdEEx3#h?39 z0C36jDo_URiM>v{2I&C`H(`G8KmmI}nD1BbWra81*9KMklk_1KjevjekLwSn$goa^jM*Z7LYzEYoE1`TQgku> zv=3Q;c8H7Uw%KM0tmNH}zrkhKYr4D5sgHZvq*$bU`~1MOi)HfF=rxXCz*u(L(+V$2*@Csic z){AtF*bI*uH{!Kc{5VJ~uN8$H`e4F{{?9ofEfD~DRKOzeijph-Bs>?>Z z6DB+~D;NnA$F*=d|LYeDRG4@$n zbTYEG6>Aq>6+9$v-B+s!LdavB~5ObO9A z1ahOJzV-mEmN#0+7Y?v-iNtGq>K^8c=S|3q_qF70^4HB 
zkjZIdvHo$(KuR=z!F)RY1Zrc*Xj+)rwx+Z70GvLQ^q9<}%PidI_}c zo1GGffLdenj;CQi?N{5oH@R<7#$9Ne{#qlGk>`5?cZwh9Exi$UAtlacar?0l~ z-~If$X-O|oIy!`0UL(~RxkuH#uB6h`DXaOOgT9IN^{ihjYxf>nzkq4GcOXv=;Xm7i zQl_6SuQ7;*(0`VyD^bx7QGER+By|I`>M%-hotiuuE!Dg#xzV-zs@*8h7F~wB_^x_y7A2+2P|v5|ww^_a+fg)zMcxlh z7ra{Dm&h*9GYPG3*de~z?-9^MR>M2K`+makdxz=^{L+aE1xqS_@kgg)zyjLobU7l5 zp5E8eqA5fb$G&IVYiL3wfilfnB4%NRt(MBk?u0}%!N0H{=Bv@|x2G_5PVO~kC^+c( zoJt!9RM*LSD2MfLoWv0q-?1tReZ=Zerce0ShdbL(wgh?4kE#9ItaRKRD9t#as3- zPxd^9{<-oD464Bic6Jlf4vND!5)t&I)|_jrgG*zsHZ~}$gX#eSEv-FUe(w}U>%)A% zbMNetZrM`{ zL{@8!;lS@b1}y2;V_o(qn%q2V%=KT5-lInE+5$lo9L$USZ)*2{v=cTu&P@IG5k!E8 z_06crNECgJYH>JulNlh-Jlx6C?N28c?&BM4#XI2ptN*=M-FtcL>N46YX}qicXI2n+ z>$32_P7c^T7ra3A=JXEg9kKV88vgT`r&8ooxH&2|3P_?|^=vfX5e5kfjf3)n3K;es z2)LdO&*nG|ISX4+KX* zFoUfrq>Jlh?%-y=eBk7SX?C{i>v(N`pR=Vmyg#TbZtN|vGQ}j@Kzw^E86D4!IQDFm zHFIicY&hI|SkV@Sg~`yzp>_7yly2gnfsma;8r^>E9ja<%*F`#}T#35>+E&CH`8Mcn zFt9cLT;=DNxSVjn1NRE=3PV62jm8T$6Q=v6t>OD8UI`>9n7Kbn_65%u26IjCO z_T0R$T~$BdD{+9bfzKw!G0mBc>M8YV%vctlnp?1@%~4styWl7n^_iBmurd|B38j&> zEWCb*28Rmyn!EQ?_N>pl=WR(-($g)#GA-b2akYGyhOfa=NnoED;b=yDQx*JQk?m%Qiq9 z$eyU~=+hs?o$=B=+ICh08WxWa2QLYiLS-*{?N3JV)2wkIARPxMM7uAK8-YC?3gq^! zw^LS**O5jhnQkyR3=xR{@oe9jQ_2jB~D$LZ-qq%bb<`m;p2XaQ5_sy7}=rAd1Ha~^Q9n9VRiXTlvxL!O)JrW;c!>tzZUuJu49>krd z;er9v1{~94UH`cV_e@Fj$?#dedwQW7*0YtJ^eyUkFrAm&!zou2x^YaZf!o;45`UYnFhA{Q~jSLk^Ix zu2j?d65st^G9cD<+#3J5-q6cnRO^j^q7;6 z$JBYVC@ol?7`?vr)Y}UWmz)#Y+%z1Vv8miNfId(za?E`dW|EvBbw(8^I8$OkKdIGF#N(A)QPY;X(kvj#^ z4dt2YCm{pEIN^-Jrk+CDeDqB;R?pfpe-COcNC0*!-Y>@1VpYss)QIYHrj<*n>HevJ zp9lncv(RA7%N-kVJMFPMa-`nP<0qhRpz+Q-c_xO`Xws6E+J;0@cB!eWA=QzUHhi_m)J*+=T=0+ z^UK^@+Ub!k|_z5cCdUG3Yq5q!Nd?p{qxL$feZ`?+%HRmodVNvb2`? 
zG{0+c@AqfDwtFPnKh<$dwoZdO0hx9+G0>zRv4#1Rb$tDx3hr?YVK|wP8sATzO8%QK zqWr1lw`YYGy2uQ1pr&^!+;_Cp>WjE@mEbm{Zr%n(HBli>P>HS{#myVDvPH0HE&CM% z5jGN-H)ldAF`tvfk2p_DVBDiZTgNe4u@;6igxqPu8~mY~0MhM;-(lvr^R3*i`7k}> zv1Y%mNvXNaX$|Pl5}oWt<;r?Qfb;YhmZg{Gx78(41{FSz1dIdc&pn*`soYIY`&LY{ zS2$4E2YCXHEKwnTcd}ORie<8fvEj<*eYn(-;?K8M{xqM+$13>)jae#60&y3AMub1x z=Ku5}#UmHCf$!iV)_}5OdcmMq<+hKjUWJsnlv19RPdfr35JSR65$Ed{CS|bz7n8mR zuRJJsIMvLPm#QZS+NYZwTG$6;X&6xlFsWeY-5TZq)qQ1I$|yEQpFGeLM!__KtK^eb zZU36?EFK1ZL*piz*Uth?1MyV>W_RSNNk08dBL^0Gcz5L2TJhu*V_*Qu;zX^abGUEz z*!U;BS)V4g{URgkRdyMt3bEo|t(=ojT%g}7v`SLh3=#FOS=6hQXolw{RqYaq9JV*L zo4X$=4t7a|%vfuaWv6FCx`ZP*`IUfPG3`;yfg4plW@JI}>?Cm!+Kql0y6a@iw*)Ne zXP#E1?9qXpKq`X09l|rr(bCU-Dk?#~0qbn~6&aO4ttK z&DdIB%xmrxoWR<{)b(~MGz=Ee_Pw}sP1D{UjrU_UAdY}_{xT8gd{JBcMFwI@yV4y? z^Q{*@$g7#WVrSN;_SZ{i8h@NU$tanQUrU2t#IjT6MaPUK4N3hqYjE|P81|KyfQqIT zi?_Kjv`e;}5gRY+%>RinCMqXUq0Z1Ji+`HA1?W-anK|D8Ptn+fZIHOM=i51TMM?JE zG!D6eGo}X@!p>J9t#!Jo78HHT;+4;h(B2V_sO^hd1L6Z2Vj%EE~_rIWyT5<*8{S}!G z;+nE*=8#=(@C9%#WB|AVz0&0Ro{qyRc;F4v02jLr(4F;z1n7YC;U8%Y%ck#t?Gb?h zu^I62@Ic{oVFZU>fqn&hIXkyC=LP5s{oY=KU!A(8iD_FzJRV0X9NJ#_8&OI3tAF^v zIN|j_fqTdEVUH{J$lt#Q>Hq%itA`07E$Rg_*2@1cfZY%K_f*YG zHaKul2Cx+$F30BCX@S`}%(@ba*Y{s(B~bj}qW^j5fGXcj%Mmp;@CN6i+du(tCr5=A z6B{mr_Hz?M=J7j_l^rGZPxt^YvYU8j0X@*g+r`1td6)Ua8+^hp8R&P-=DBJ6L*X)w z453^1zctlVWKB@2k0#AN0rY~U{-vUzQo{aHi0}j8p#qNfuuxTChx*?^9jUj0^At-9 zSXyqbN7PjPy9fTQN$b7D7V_)LN?_-F?fnt$?d)EdSJd?vdVu$)&C){ zV3aB_BUiT|1TGeCr&|mGM)ZLD@ciy<+BAMO*eY7_-hV&7S824~Y#NN*U}-eLfr4q~ z*V-P;2E7^!hP7muVqKqgkX7__RC)&mj}SPBbxJuU&>q;ZindS5``^+IegqV>Wtw#N z=~WG$o90)X73>SPs0Su%cor?`IH|^~Y=;Vz0!UW9Uq{B^I+lrpTo>=0)k1uXBaERG*T68Jn$^>ME z$yZB=Fg@)DxLEi`P*jxVVulOHaVk`7oLcWuTuJQ>JRevL3l&CS-SB2#vfH~}0P?AF zlG+w)WU`>&?YNf%&Co>CAh&xc?CoCla)xlIh)48`N2cGh41ZKhsGlSj*A!Y-!b_?P zy@8VpSU5J>*s`RSMWAOt)0qeNd8u2$4~PUs|h13loq7lnA+h9uAp^V3~`^hO{gU zuJ)-6%M;9!%Jd2f(U(y3SrR5QDy1^Zyr{)y*MKYWg}gcT+L=Z=DueMoX!)QHn{9dS zN0)bc$EqJ&kp*po=+~IkTeW~QIXaJSvy$S6Rr=MYoDzhG4E!GuU;Ft&vcWG8m$ny< 
zJLZ3V)RyzryFCARpiBt0WSkUs8scv;<%-Kdm$0I3khE7`;wS@?{SxH+U4s%?09Kgm zOAj9H*6Jt_6j@YjKA!iJi7gI68j1~{k`rvcd$l`YmG+_`lxoRrP9G|sc0k$twZeRS z7(sHQ09~b&9pvzOr>tbLMJsXvk2l(la`xI#&l)4KI^?!^ED0OA^ys3k=ymHt Date: Fri, 8 May 2026 14:09:27 +0200 Subject: [PATCH 03/10] Update _toc.yml --- _toc.yml | 1 + 1 file changed, 1 insertion(+) diff --git a/_toc.yml b/_toc.yml index 495a9110c..adb3d3a27 100644 --- a/_toc.yml +++ b/_toc.yml @@ -33,6 +33,7 @@ parts: sections: - file: docs/gui/napari/basic_usage - file: docs/gui/napari/advanced_usage + - file: docs/gui/napari/tracking/basic_usage - file: docs/dlc-live/dlc-live-gui/index sections: - file: docs/dlc-live/dlc-live-gui/quickstart/install From f666cf24df0efbe0e13bf7a40850fc2c35dd5f98 Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Fri, 8 May 2026 14:10:05 +0200 Subject: [PATCH 04/10] chore(metadata): update docs/notebooks metadata --- docs/gui/napari/tracking/basic_usage.md | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/docs/gui/napari/tracking/basic_usage.md b/docs/gui/napari/tracking/basic_usage.md index a3717bf88..668546c75 100644 --- a/docs/gui/napari/tracking/basic_usage.md +++ b/docs/gui/napari/tracking/basic_usage.md @@ -1,3 +1,11 @@ +--- +deeplabcut: + last_metadata_updated: '2026-05-08' + last_verified: '2026-05-08' + verified_for: 3.0.0rc14 + ignore: false +--- + # Automated annotation with point tracking ```{seealso} From 7c1c7683c27f653d9e5d42e42b9a05123ca97b07 Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Fri, 8 May 2026 14:40:00 +0200 Subject: [PATCH 05/10] chore(metadata): update docs/notebooks metadata --- docs/gui/napari/tracking/basic_usage.md | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/gui/napari/tracking/basic_usage.md b/docs/gui/napari/tracking/basic_usage.md index 668546c75..fbc45fb8e 100644 --- a/docs/gui/napari/tracking/basic_usage.md +++ b/docs/gui/napari/tracking/basic_usage.md @@ -4,6 +4,7 @@ deeplabcut: last_verified: '2026-05-08' verified_for: 3.0.0rc14 ignore: false + 
last_content_updated: '2026-05-08' --- # Automated annotation with point tracking From 3e1bcb2dadbaba8202deae1d95a7db1d1fbb6c48 Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Fri, 8 May 2026 14:43:04 +0200 Subject: [PATCH 06/10] docs: fix tracking docs typos and restructure sections Update docs/gui/napari/tracking/basic_usage.md to improve clarity and fix typos: rename "Keypoint Controls layer" to "Keypoint Controls dock widget", correct "Emprical" to "Empirical", and small whitespace/wording tweaks. Restructure the Limitations/Future directions area by adding "Important considerations" and "Future features" headings, move and reword the CoTracker3 note into Future features, consolidate and bullet manual curation and training-set imbalance guidance, and remove duplicate lines. These changes improve readability and provide clearer guidance for users. --- docs/gui/napari/tracking/basic_usage.md | 22 ++++++++++++++-------- 1 file changed, 14 insertions(+), 8 deletions(-) diff --git a/docs/gui/napari/tracking/basic_usage.md b/docs/gui/napari/tracking/basic_usage.md index fbc45fb8e..94434d88c 100644 --- a/docs/gui/napari/tracking/basic_usage.md +++ b/docs/gui/napari/tracking/basic_usage.md @@ -33,7 +33,7 @@ Before using tracking, you must: - The easiest is to drag-and-drop one of the `labeled-data` folders from your DLC project - Ensure you have a **Points** layer containing DeepLabCut-style keypoints. - If annotating from scratch, drag-and-drop the `config.yaml` file from your DLC project to create a new Points layer with the correct metadata. - - If loading an folder which already contains a `CollectedData_*.h5` file, the plugin will automatically create a Points layer with the existing annotations. + - If loading a folder which already contains a `CollectedData_*.h5` file, the plugin will automatically create a Points layer with the existing annotations. - Annotate at least one frame with valid keypoints. 
``` @@ -231,7 +231,7 @@ See {ref}`sec:tracking-workflow-guides`. 1. Go to the desired reference frame (with annotated keypoints visible). 1. Select the forward/backward tracking range using the sliders, OR track to beginning/end of the video using the fast-forward buttons. 1. Inspect the tracking results - - You can use "Show trajectories" in the Keypoint Controls layer to visualize the trajectories of tracked points across frames, which can help identify where tracking starts to drift. + - You can use "Show trajectories" in the Keypoint Controls dock widget to visualize the trajectories of tracked points across frames, which can help identify where tracking starts to drift. - The plot is filtered by selected keypoints, so you can select a subset of points to inspect their trajectories more closely. 1. On the frame where tracking starts to drift: 1. Select the problematic point(s) and click "Delete selected points in future frames" to remove incorrect tracking results while preserving the tracked point(s) on the current frame. @@ -269,12 +269,12 @@ Check that: ### CoTracker3 -CoTracker is a fast transformer-based model that can track any point in a video. It brings to tracking some of the benefits of Optical Flow. +> CoTracker is a fast transformer-based model that can track any point in a video. It brings to tracking some of the benefits of Optical Flow. [Link to GitHub repository](https://github.com/facebookresearch/co-tracker) [Citation information](https://github.com/facebookresearch/co-tracker#citing-cotracker) -```{admonition} Emprical observations +```{admonition} Empirical observations --- class: tip --- @@ -287,12 +287,18 @@ Please share any feedback or insights you have with us! ## Limitations and future directions -- We currently only provide CoTracker3 as a model. It is however easy to add new models to the plugin via the registry; feel free to ask if you would like to contribute a model or see a specific model added! 
+### Important considerations + +- Manual curation is still essential for good tracking results, and the tracking models do not fully replace the need for manual annotation. +- Ideally mixing manual annotations from challenging/distinct frames with tracking results from easier frames would yield the best results. +- Be mindful of training set imbalance: if you flood your training set with easy frames that are well tracked, and only have a few hand-picked frames with rare or difficult poses, your model may not learn to generalize well to those challenging poses. + +#### Future features + +- We currently only provide CoTracker3 as a model. It is however relatively easy to add new models to the plugin via the registry; feel free to ask if you would like to contribute a model or see a specific model added! - Saving tracking layers as CSV is supported, but they will not be loaded correctly as tracking results in the plugin. We currently recommend using the "Merge tracked points" workflow to save results back into the original annotation layer, which is then saved to the DLC project folder and can be loaded in future sessions. - If there is demand, we may add support for saving/loading tracking layers as separate files in the DLC project folder. -- Manual curation is still essential for good tracking results, and the tracking models do not fully replace the need for manual annotation. -- Ideally mixing manual annotations from challenging/distincts frames with tracking results from easier frames would yield the best results. -- Be mindful of training set imbalance: if you flood your training set with easy frames that are well tracked, and only have few hand-picked frames with rare or difficult poses, your model may not learn to generalize well to those challenging poses. +- If you have ideas for specific refinement tools, shortcuts or other features that would be useful to add to the plugin, please share them with us! 
## Getting help and providing feedback From da58038e0cfdd183c061079a421d6817709b08ea Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Fri, 8 May 2026 14:57:58 +0200 Subject: [PATCH 07/10] Fix formatting and list numbering in tracking docs Clean up docs/gui/napari/tracking/basic_usage.md: add missing blank line in a tip, normalize bullet indentation and spacing, fix inconsistent numbered lists (renumber steps and convert some items to sub-bullets), and improve clarity in workflow instructions. Also add a brief consideration that manual annotation can sometimes be faster than heavy tracking corrections and note preference for continuous frames. These are purely documentation/formatting edits to improve readability and usability. --- docs/gui/napari/tracking/basic_usage.md | 40 ++++++++++++++----------- 1 file changed, 23 insertions(+), 17 deletions(-) diff --git a/docs/gui/napari/tracking/basic_usage.md b/docs/gui/napari/tracking/basic_usage.md index 94434d88c..1df690f5b 100644 --- a/docs/gui/napari/tracking/basic_usage.md +++ b/docs/gui/napari/tracking/basic_usage.md @@ -121,6 +121,7 @@ Most tracking functions have keyboard shortcuts for easier usage. ```{tip} You can see shortcuts and their status using: > Help -> Show napari-dlc shortcuts + This is only available if the Keypoint controls widget has been opened. ``` @@ -128,8 +129,8 @@ This is only available if the Keypoint controls widget has been opened. ```{tip} **Hiding layers, and being able to distinguish which results originate from which layer, is a very important notion for effectively using the plugin.** -Layers can be toggled (visible/invisible) with `V` by default or by clicking the eye icon next to the layer name in the layer list. -Grid mode (toggled with `Ctrl+G` by default) can also help visually separate different layers and their results. +- Layers can be toggled (visible/invisible) with `V` by default or by clicking the eye icon next to the layer name in the layer list. 
+- Grid mode (toggled with `Ctrl+G` by default) can also help visually separate different layers and their results. ``` Each tracking run creates a **new Points layer**: @@ -160,9 +161,10 @@ This is inherent to the tools, and as such we provide a simple way to delete inc 1. Select a tracking-result Points layer. -- This is always disabled for the original annotation layer. + - This is always disabled for the original annotation layer. + +1. Select one or more points on the **current frame**. -2. Select one or more points on the **current frame**. 1. Click **Delete selected points in future frames**. Only *exact identity matches* in future frames are removed. @@ -196,19 +198,20 @@ There is **currently no undo option**. Any **deletion or merging action you perf ### Loading & annotating from scratch 1. Create a DeepLabCut project and add the videos to label. + 1. Extract frames in the videos. -- Currently implemented trackers prefer continuous video frames. We recommend avoiding large gaps in frame indices, which can make tracking more difficult. + - Currently implemented trackers prefer continuous video frames. We recommend avoiding large gaps in frame indices, which can make tracking more difficult. -3. Go to the `labeled-data` folder, drag-and-drop a folder with extracted frames into napari. +1. Go to the `labeled-data` folder, drag-and-drop a folder with extracted frames into napari. -- This will create an Image layer with the frames + - This will create an Image layer with the frames -4. Drag-and-drop the `config.yaml` file from your DLC project into napari. +1. Drag-and-drop the `config.yaml` file from your DLC project into napari. -- This will create an empty Points layer with the correct DLC metadata, ready for annotation. + - This will create an empty Points layer with the correct DLC metadata, ready for annotation. -5. Annotate keypoints on a reference frame. +1. Annotate keypoints on a reference frame. See {ref}`sec:tracking-workflow-guides`. 
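The "Delete selected points in future frames" action documented above (only exact identity matches on frames strictly after the current one are removed) can be sketched as a plain filter over DLC-style annotations. This is an illustrative sketch that assumes a simple `(frame, keypoint, y, x)` tuple layout; the function and variable names are invented for illustration, not the plugin's actual implementation.

```python
# Hypothetical sketch of "Delete selected points in future frames":
# keep a point unless it is an exact identity match for a selected
# keypoint AND it sits on a frame strictly after the current frame.

def delete_in_future_frames(points, selected, current_frame):
    """points: list of (frame, keypoint, y, x) tuples.
    selected: set of keypoint names chosen on the current frame."""
    return [
        (frame, kp, y, x)
        for (frame, kp, y, x) in points
        if not (kp in selected and frame > current_frame)
    ]

# "snout" starts to drift after frame 4: drop its tracked positions on
# later frames while keeping frame 4 itself, so it can be corrected
# and tracking re-run from there.
points = [
    (3, "snout", 10.0, 12.0),
    (4, "snout", 11.0, 13.0),
    (5, "snout", 40.0, 2.0),     # drifted
    (6, "snout", 42.0, 1.0),     # drifted
    (5, "tailbase", 30.0, 8.0),  # different keypoint: untouched
]
kept = delete_in_future_frames(points, {"snout"}, current_frame=4)
```

Run against this example data, only the drifted "snout" positions on frames 5 and 6 are dropped; the position on frame 4 survives so it can be refined and tracking re-run.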
@@ -216,10 +219,10 @@ See {ref}`sec:tracking-workflow-guides`. 1. Go to the `labeled-data` folder, drag-and-drop a folder with extracted frames into napari. -- This will create an Image layer with the frames. -- The existing annotation from the `CollectedData_*.h5` file will be loaded as a Points layer. + - This will create an Image layer with the frames. + - The existing annotation from the `CollectedData_*.h5` file will be loaded as a Points layer. -2. Inspect existing annotations, select a reference frame, and refine keypoints if needed. +1. Inspect existing annotations, select a reference frame, and refine keypoints if needed. See {ref}`sec:tracking-workflow-guides`. @@ -233,10 +236,10 @@ See {ref}`sec:tracking-workflow-guides`. 1. Inspect the tracking results - You can use "Show trajectories" in the Keypoint Controls dock widget to visualize the trajectories of tracked points across frames, which can help identify where tracking starts to drift. - The plot is filtered by selected keypoints, so you can select a subset of points to inspect their trajectories more closely. -1. On the frame where tracking starts to drift: -1. Select the problematic point(s) and click "Delete selected points in future frames" to remove incorrect tracking results while preserving the tracked point(s) on the current frame. -1. Refine the keypoint(s) on the current frame to correct their position. -1. Re-run tracking from that frame to propagate the correction forward/backward in time. +1. If there are problematic points: + 1. On the frame where tracking starts to drift, select the problematic point(s) and click "Delete selected points in future frames" to remove incorrect tracking results while preserving the tracked point(s) on the current frame. + 1. Refine the keypoint(s) on the current frame to correct their position. + 1. Re-run tracking from that frame to propagate the correction forward/backward in time. 1. 
Merge the new tracking result back into the previous tracking layer (e.g. using "Overwrite existing target points") 1. Repeat until satisfied with the tracking result, then merge into the original annotation layer using "Fill missing only" to preserve your original annotations and only add tracked keypoints in frames where you don't have manual annotations. 1. **Remember to save the original Points annotation layer.** This is the only step that writes back to the DLC project folder directly and integrates with the `h5` file. @@ -289,6 +292,9 @@ Please share any feedback or insights you have with us! ### Important considerations +- As correcting labels can be time-consuming, annotating by hand may sometimes be faster than running tracking and heavily correcting its results. + - The benefits are mostly for long, continuous videos with many frames to annotate, where tracking can save time by propagating annotations across many frames at once. + - In very high variability/very challenging videos, annotating by hand may still be more efficient than running tracking and correcting its results, especially if you only have a few frames to annotate. - Manual curation is still essential for good tracking results, and the tracking models do not fully replace the need for manual annotation. - Ideally mixing manual annotations from challenging/distinct frames with tracking results from easier frames would yield the best results. - Be mindful of training set imbalance: if you flood your training set with easy frames that are well tracked, and only have a few hand-picked frames with rare or difficult poses, your model may not learn to generalize well to those challenging poses. 
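The two merge modes used in the workflow above, "Fill missing only" and "Overwrite existing target points", can be sketched as dictionary merges keyed by `(frame, keypoint)`. The data layout and function names below are assumptions made for illustration, not the plugin's code.

```python
# Hypothetical sketch of the two merge modes, with annotations stored
# as {(frame, keypoint): (y, x)} dictionaries.

def merge_fill_missing_only(target, tracked):
    """'Fill missing only': existing target annotations always win;
    tracked points are added only where the slot is empty."""
    merged = dict(target)
    for key, coords in tracked.items():
        merged.setdefault(key, coords)
    return merged

def merge_overwrite(target, tracked):
    """'Overwrite existing target points': tracked points replace
    any matching target annotations."""
    merged = dict(target)
    merged.update(tracked)
    return merged

manual = {(0, "snout"): (10.0, 12.0), (1, "snout"): (11.0, 13.0)}
tracked = {(1, "snout"): (99.0, 99.0), (2, "snout"): (12.0, 14.0)}

fill = merge_fill_missing_only(manual, tracked)
# frame 1 keeps the manual label; frame 2 gains the tracked one
overwrite = merge_overwrite(manual, tracked)
# frame 1 is replaced by the tracked label
```

In both modes, keypoints that exist only in the tracked layer are added; the modes differ only in which side wins when a `(frame, keypoint)` slot exists in both layers.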
From 8cac078c979c0f5cb35916c0232abb3309959f2d Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Fri, 8 May 2026 15:19:38 +0200 Subject: [PATCH 08/10] Final consistency pass MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Revise and clarify the Tracking Controls documentation: wrap third-party model attribution in a note, add a brief reminder that tracking accelerates but does not replace manual review, and split Requirements into “In napari” and “In your Python environment”. Improve UI documentation with a control/description table, a figure placeholder, and a note about available models. Reword and tighten many instructions (frame/reference wording, tracking range, actions, keyboard shortcuts), clarify that tracking runs in a background worker, and explain that tracking-result layers are intermediate and must be merged/saved to update DLC project files. Add a prominent warning that there is currently no undo for deletion/merge operations. Minor formatting and grammar fixes throughout (CoTracker attribution links, troubleshooting bullets, phrasing) to improve readability and consistency. --- docs/gui/napari/tracking/basic_usage.md | 170 +++++++++++++----------- 1 file changed, 95 insertions(+), 75 deletions(-) diff --git a/docs/gui/napari/tracking/basic_usage.md b/docs/gui/napari/tracking/basic_usage.md index 1df690f5b..43f61b9e3 100644 --- a/docs/gui/napari/tracking/basic_usage.md +++ b/docs/gui/napari/tracking/basic_usage.md @@ -13,8 +13,10 @@ deeplabcut: For basic usage of the annotation plugin, see {ref}`file:napari-dlc-basic-usage` for the recommended workflow. ``` -> We use third-party, open-source models for point tracking, and we thank the original authors and developers. -> Please see {ref}`sec:napari-tracking-models-attribution` at the end of this page for information about the tracking models used in the plugin and their citation information. 
+```{note} +We use third-party, open-source models for point tracking, and we thank the original authors and developers. +Please see {ref}`sec:napari-tracking-models-attribution` at the end of this page for information about the tracking models used in the plugin and their citation information. +``` ## Overview @@ -24,20 +26,31 @@ The **Tracking Controls** widget is designed to help automate DeepLabCut annotat 1. Use a point tracking model to propagate those keypoints forward and/or backward in time. 1. Inspect, refine, delete, and merge tracked results before exporting them back to DeepLabCut. +> **Tracking is intended to accelerate annotation, and cannot replace manual review.** + ## Requirements +### In napari + ```{important} Before using tracking, you must: - Load a **video** as an `Image` layer with time as the first dimension. - - The easiest is to drag-and-drop one of the `labeled-data` folders from your DLC project + - For DLC-integrated workflows, the easiest starting point is often to drag-and-drop one of the `labeled-data` folders from your DLC project. + - Tracking is most useful on temporally continuous image sequences or videos. - Ensure you have a **Points** layer containing DeepLabCut-style keypoints. - - If annotating from scratch, drag-and-drop the `config.yaml` file from your DLC project to create a new Points layer with the correct metadata. - - If loading a folder which already contains a `CollectedData_*.h5` file, the plugin will automatically create a Points layer with the existing annotations. + - If annotating from scratch, drag-and-drop the `config.yaml` file from your DLC project to create a new Points layer with the correct metadata. + - If loading a folder which already contains a `CollectedData_*.h5` file, the plugin will automatically create a Points layer with the existing annotations. - Annotate at least one frame with valid keypoints. 
+ +See the workflow guides below for more details on how to prepare your data and annotations for tracking. ``` -If you do not have PyTorch installed or if you are using the plugin without the DeepLabCut package installed, install with: +### In your Python environment + +**Skip this if you have already installed PyTorch or DeepLabCut** + +If you do not have PyTorch installed, or if you are using the plugin without the DeepLabCut package installed, install with: ```bash pip install napari-deeplabcut[tracking] @@ -45,6 +58,8 @@ pip install napari-deeplabcut[tracking] ## User interface + + ```{figure} ../../../images/napari/tracking/controls.png --- name: tracking-controls @@ -61,8 +76,16 @@ Use: ### 1. Model selection -- **Tracker**: Selects the tracking backend from `AVAILABLE_TRACKERS`. -- **Info button**: Hover to see tracker-specific details. +| Control | Description | +| --------------- | ------------------------------------------------------- | +| **Tracker** | Selects the tracking backend from `AVAILABLE_TRACKERS`. | +| **Info button** | Hover to see tracker-specific details. | + + + +```{note} +Available models may depend on your installation and optional dependencies. +``` ### 2. Layer selection @@ -77,13 +100,13 @@ The widget automatically updates when layers are added, removed, or reordered. - The **Current** spinbox always reflects the viewer's current time index. - This frame is used as the **query frame** for tracking. - - This means that the model will generate tracking predictions based on the keypoints present on this frame, and use them as seeds to track forward and/or backward in time. + - The model generates tracking predictions from the keypoints present on this frame and uses them as seeds to track forward and/or backward in time. ```{note} -Only keypoints *visible on the reference frame* are used to initialize tracking. +Only keypoints present on the selected reference frame are used to initialize tracking. ``` -### 4. 
Frame Range Controls +### 4. Frame range controls Tracking range can be specified **relative** or **absolute** to the reference frame. @@ -99,7 +122,9 @@ Tracking range can be specified **relative** or **absolute** to the reference fr - `Abs >>`: absolute frame index - `Rel >>`: relative frame offset -### 5. Tracking Actions +Changing the current frame updates the valid forward/backward range automatically. + +### 5. Tracking actions | Button | Action | | ------ | ----------------------------- | @@ -111,7 +136,7 @@ Tracking range can be specified **relative** or **absolute** to the reference fr | ■ | Stop tracking | ```{note} -Tracking runs in a background worker thread. You may edit layers while it is running, and results will appear as a new layer once tracking is complete. +Tracking runs in a background worker thread. You can continue navigating the viewer and editing layers while it runs, and results will appear as a new layer once tracking is complete. ``` ## Keyboard shortcuts @@ -138,33 +163,34 @@ Each tracking run creates a **new Points layer**: - Named automatically (`[Tracking vXX] Ref. layer name - tT - Tracker name`) - `XX` refers to the iteration number (if multiple tracking runs are performed from the same reference frame) - `T` refers to the reference frame index used to generate the tracking result -- Visually different from manual annotations: +- Visually distinct from manual annotations: - Cross symbol - Slight transparency - Green border ```{note} The original annotation layer is never modified by tracking. -This has to be done manually by merging, see below. +To incorporate tracking results into your annotation data, use the merge workflow described below. ``` ```{important} -If you run into accessibility issues with the default visualization style, please [open an issue](https://github.com/DeepLabCut/napari-deeplabcut/issues), we would be happy to expand settings and provide more customization options if requested. 
+If you run into accessibility issues with the default visualization style, please [open an issue](https://github.com/DeepLabCut/napari-deeplabcut/issues). We would be happy to expand settings and provide more customization options if requested. ``` -## Refinement & saving tools - -### Deleting tracked Points in future frames +## Refinement and saving tools -**Oftentimes, tracking results will be satisfactory for a certain number of frames, then start to drift or produce errors.** -This is inherent to the tools, and as such we provide a simple way to delete incorrect tracking results in future frames while preserving the original annotations on the reference frame. +```{danger} +There is **currently no undo option**. Any **deletion or merging action you perform is irreversible**, so we recommend keeping track of your layers and using visibility toggles to compare before and after merge results. +``` -1. Select a tracking-result Points layer. +### Deleting tracked points in future frames - - This is always disabled for the original annotation layer. +**Tracking results are often satisfactory for a certain number of frames, then start to drift or produce errors.** +This is inherent to the tools, so the plugin provides a simple way to delete incorrect tracking results in future frames while preserving the original annotations on the reference frame. +1. Select a tracking-result Points layer. + - This action is always disabled for the original annotation layer. 1. Select one or more points on the **current frame**. - 1. Click **Delete selected points in future frames**. Only *exact identity matches* in future frames are removed. @@ -173,7 +199,7 @@ Only *exact identity matches* in future frames are removed. Points on the current frame are preserved so you can correct them and re-run tracking. 
``` -### Merge (save) tracked Points +### Merging tracked points The **Merge tracked points** workflow allows you to: @@ -186,66 +212,60 @@ There are several merge options available to help you achieve the desired result - **Fill missing only**: Existing keypoints are always preserved. Missing keypoints in frames are filled with tracked results. - Intended for merging final tracking results into the original annotation layer. -- **Overwrite existing target points**: Tracked keypoints always overwrite existing ones, regardless of presence. +- **Overwrite existing target points**: Tracked keypoints overwrite existing ones in the target layer. - Intended for replacing poor tracking results with a new, updated tracking pass. -```{danger} -There is **currently no undo option**. Any **deletion or merging action you perform is irreversible**, so we recommend keeping track of your layers and using visibility toggles to compare before/after merge results. +```{important} +Tracking-result layers are intermediate working layers. +To save results back into the DeepLabCut project, first merge tracked points into a standard DLC annotation layer, then save that final annotation layer. ``` ## Workflow example -### Loading & annotating from scratch +### Loading and annotating from scratch 1. Create a DeepLabCut project and add the videos to label. - -1. Extract frames in the videos. - +1. Extract frames from the videos. - Currently implemented trackers prefer continuous video frames. We recommend avoiding large gaps in frame indices, which can make tracking more difficult. - -1. Go to the `labeled-data` folder, drag-and-drop a folder with extracted frames into napari. - - - This will create an Image layer with the frames - +1. Go to the `labeled-data` folder, then drag-and-drop a folder with extracted frames into napari. + - This creates an Image layer with the frames. 1. Drag-and-drop the `config.yaml` file from your DLC project into napari. 
- - - This will create an empty Points layer with the correct DLC metadata, ready for annotation. - + - This creates an empty Points layer with the correct DLC metadata, ready for annotation. 1. Annotate keypoints on a reference frame. -See {ref}`sec:tracking-workflow-guides`. +> See {ref}`sec:tracking-workflow-guides`. ### Loading and annotating from existing DLC annotations -1. Go to the `labeled-data` folder, drag-and-drop a folder with extracted frames into napari. - - - This will create an Image layer with the frames. - - The existing annotation from the `CollectedData_*.h5` file will be loaded as a Points layer. - +1. Go to the `labeled-data` folder, then drag-and-drop a folder with extracted frames into napari. + - This creates an Image layer with the frames. + - Existing annotations from the `CollectedData_*.h5` file are loaded as a Points layer. 1. Inspect existing annotations, select a reference frame, and refine keypoints if needed. -See {ref}`sec:tracking-workflow-guides`. +> See {ref}`sec:tracking-workflow-guides`. (sec:tracking-workflow-guides)= ### Tracking 1. Open the Tracking Controls widget (`Plugins -> napari-deeplabcut -> Tracking controls`). -1. Go to the desired reference frame (with annotated keypoints visible). -1. Select the forward/backward tracking range using the sliders, OR track to beginning/end of the video using the fast-forward buttons. -1. Inspect the tracking results - - You can use "Show trajectories" in the Keypoint Controls dock widget to visualize the trajectories of tracked points across frames, which can help identify where tracking starts to drift. +1. Go to the desired reference frame, with annotated keypoints visible. +1. Select the forward/backward tracking range using the sliders, or track to the beginning/end of the video using the seek buttons. +1. Inspect the tracking results. 
+ - You can use **Show trajectories** in the Keypoint Controls dock widget to visualize the trajectories of tracked points across frames, which can help identify where tracking starts to drift. - The plot is filtered by selected keypoints, so you can select a subset of points to inspect their trajectories more closely. 1. If there are problematic points: - 1. On the frame where tracking starts to drift, select the problematic point(s) and click "Delete selected points in future frames" to remove incorrect tracking results while preserving the tracked point(s) on the current frame. + 1. On the frame where tracking starts to drift, select the problematic point(s) and click **Delete selected points in future frames** to remove incorrect tracking results while preserving the tracked point(s) on the current frame. 1. Refine the keypoint(s) on the current frame to correct their position. - 1. Re-run tracking from that frame to propagate the correction forward/backward in time. -1. Merge the new tracking result back into the previous tracking layer (e.g. using "Overwrite existing target points") -1. Repeat until satisfied with the tracking result, then merge into the original annotation layer using "Fill missing only" to preserve your original annotations and only add tracked keypoints in frames where you don't have manual annotations. -1. **Remember to save the original Points annotation layer.** This is the only step that writes back to the DLC project folder directly and integrates with the `h5` file. + 1. Re-run tracking from that frame to propagate the correction forward or backward in time. +1. Merge the new tracking result back into the previous tracking layer when appropriate (for example, using **Overwrite existing target points**). +1. 
Repeat until satisfied with the tracking result, then merge into the original annotation layer using **Fill missing only** to preserve your original annotations and only add tracked keypoints in frames where you do not yet have manual annotations. +1. **Save the final DLC annotation layer** (usually the original annotation layer after merging). + - Tracking-result layers are intermediate working layers and are not written back directly as DLC project annotations. + - Saving the final merged annotation layer is the step that writes back to the DLC project folder and updates the `CollectedData_*.h5` workflow. ```{note} -The "Show trails" feature is currently not implemented for tracking result layers, please [open an issue](https://github.com/DeepLabCut/napari-deeplabcut/issues) if this is something you would like to see in the future! +The **Show trails** feature is currently not available for tracking-result layers. Please [open an issue](https://github.com/DeepLabCut/napari-deeplabcut/issues) if this is something you would like to see in the future. ``` ## Troubleshooting @@ -254,17 +274,17 @@ The "Show trails" feature is currently not implemented for tracking result layer Ensure: -- You are on the intended frame -- The correct Points layer is selected -- Points exist exactly on that frame index +- You are on the intended frame. +- The correct Points layer is selected. +- Points exist exactly on that frame index. ### Tracking buttons do nothing Check that: -- A video layer is selected -- A keypoint layer is selected -- Tracking is not already running +- A video layer is selected. +- A keypoint layer is selected. +- Tracking is not already running. (sec:napari-tracking-models-attribution)= @@ -272,10 +292,10 @@ Check that: ### CoTracker3 -> CoTracker is a fast transformer-based model that can track any point in a video. It brings to tracking some of the benefits of Optical Flow. +> CoTracker is a fast transformer-based model that can track any point in a video. 
It brings to tracking some of the benefits of Optical Flow.

-[Link to GitHub repository](https://github.com/facebookresearch/co-tracker)
-[Citation information](https://github.com/facebookresearch/co-tracker#citing-cotracker)
+- [Link to GitHub repository](https://github.com/facebookresearch/co-tracker)
+- [Citation information](https://github.com/facebookresearch/co-tracker#citing-cotracker)
 
 ```{admonition} Empirical observations
 ---
 class: tip
 ---
 This information is based on our own testing and experience with the model.
 Please share any feedback or insights you have with us!
 
-- **Strengths:** fast on GPU, can output 10-100 frames of satisfactory tracking results, depending on difficulty
-- **Limitations:** strong preference for continuous video frames, struggles with large gaps in frame indices (e.g. automated DLC frame extraction via clustering, or uniform extraction with large step size)
+- **Strengths:** fast on GPU, can output 10-100 frames of satisfactory tracking results, depending on difficulty.
+- **Limitations:** strong preference for continuous video frames; struggles with large gaps in frame indices (for example, automated DLC frame extraction via clustering, or uniform extraction with a large step size).
 ```
 
 ## Limitations and future directions
 
 ### Important considerations
 
 - As correcting labels can be time-consuming, annotating by hand may sometimes be faster than running tracking and heavily correcting its results.
   - The benefits are mostly for long, continuous videos with many frames to annotate, where tracking can save time by propagating annotations across many frames at once.
-  - In very high variability/very challenging videos, annotating by hand may still be more efficient than running tracking and correcting its results, especially if you only have a few frames to annotate.
+ - In very high-variability or very challenging videos, annotating by hand may still be more efficient than running tracking and correcting its results, especially if you only have a few frames to annotate. - Manual curation is still essential for good tracking results, and the tracking models do not fully replace the need for manual annotation. -- Ideally mixing manual annotations from challenging/distinct frames with tracking results from easier frames would yield the best results. +- Ideally, mixing manual annotations from challenging or distinct frames with tracking results from easier frames yields the best results. - Be mindful of training set imbalance: if you flood your training set with easy frames that are well tracked, and only have a few hand-picked frames with rare or difficult poses, your model may not learn to generalize well to those challenging poses. #### Future features -- We currently only provide CoTracker3 as a model. It is however relatively easy to add new models to the plugin via the registry; feel free to ask if you would like to contribute a model or see a specific model added! -- Saving tracking layers as CSV is supported, but they will not be loaded correctly as tracking results in the plugin. We currently recommend using the "Merge tracked points" workflow to save results back into the original annotation layer, which is then saved to the DLC project folder and can be loaded in future sessions. -- If there is demand, we may add support for saving/loading tracking layers as separate files in the DLC project folder. -- If you have ideas for specific refinement tools, shortcuts or other features that would be useful to add to the plugin, please share them with us! +- We currently only provide CoTracker3 as a model. It is, however, relatively easy to add new models to the plugin via the registry; feel free to ask if you would like to contribute a model or see a specific model added. 
+- Generic napari saves or exports of tracking-result layers are not part of the recommended DeepLabCut workflow. Tracking-result layers are intermediate working layers; to preserve results in a DLC project-compatible way, merge them into a standard annotation layer and save that layer. +- If there is demand, we may add support for saving and loading tracking layers as separate files in the DLC project folder. +- If you have ideas for specific refinement tools, shortcuts, or other features that would be useful to add to the plugin, please share them with us. ## Getting help and providing feedback -- [GitHub issues](https://github.com/DeepLabCut/napari-deeplabcut/issues): for bug reports, feature requests, or general questions. We welcome your feedback and contributions! +- [GitHub issues](https://github.com/DeepLabCut/napari-deeplabcut/issues): for bug reports, feature requests, or general questions. We welcome your feedback and contributions. - [Discussion forum](https://forum.image.sc/tag/deeplabcut): for general discussion, questions, and sharing your work with the community. We also provide troubleshooting help and guidance here, but may open an issue for actual bugs or feature requests and request more information there. From ea2db0ef5b5452a9d5858f4df537b6df3de05b8e Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Fri, 8 May 2026 15:48:32 +0200 Subject: [PATCH 09/10] Wording edits --- docs/gui/napari/tracking/basic_usage.md | 14 +++++++------- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/docs/gui/napari/tracking/basic_usage.md b/docs/gui/napari/tracking/basic_usage.md index 43f61b9e3..0c346c6c6 100644 --- a/docs/gui/napari/tracking/basic_usage.md +++ b/docs/gui/napari/tracking/basic_usage.md @@ -14,7 +14,7 @@ For basic usage of the annotation plugin, see {ref}`file:napari-dlc-basic-usage` ``` ```{note} -We use third-party, open-source models for point tracking, and we thank the original authors and developers. 
+The plugin relies on third-party open-source tracking models. Please see {ref}`sec:napari-tracking-models-attribution` at the end of this page for information about the tracking models used in the plugin and their citation information. ``` @@ -94,7 +94,7 @@ Available models may depend on your installation and optional dependencies. | **Keypoints** | Points layer containing manually annotated DLC keypoints | | **Video** | Image layer containing the video to track | -The widget automatically updates when layers are added, removed, or reordered. +The widget automatically updates based on layer changes. ### 3. Reference frame selection @@ -136,7 +136,7 @@ Changing the current frame updates the valid forward/backward range automaticall | ■ | Stop tracking | ```{note} -Tracking runs in a background worker thread. You can continue navigating the viewer and editing layers while it runs, and results will appear as a new layer once tracking is complete. +Tracking runs in a background worker thread. You can continue navigating the viewer and editing layers while it runs; results will appear as a new layer once tracking is complete. ``` ## Keyboard shortcuts @@ -153,7 +153,7 @@ This is only available if the Keypoint controls widget has been opened. ## Tracking results ```{tip} -**Hiding layers, and being able to distinguish which results originate from which layer, is a very important notion for effectively using the plugin.** +**Being able to tell which results originate from which layer is very important for effectively using the plugin.** - Layers can be toggled (visible/invisible) with `V` by default or by clicking the eye icon next to the layer name in the layer list. - Grid mode (toggled with `Ctrl+G` by default) can also help visually separate different layers and their results. ``` @@ -186,7 +186,7 @@ There is **currently no undo option**. 
Any **deletion or merging action you perf ### Deleting tracked points in future frames **Tracking results are often satisfactory for a certain number of frames, then start to drift or produce errors.** -This is inherent to the tools, so the plugin provides a simple way to delete incorrect tracking results in future frames while preserving the original annotations on the reference frame. +Because of this sometimes unavoidable drift, we provide a way to delete future tracked points while keeping the current frame intact. 1. Select a tracking-result Points layer. - This action is always disabled for the original annotation layer. @@ -204,7 +204,7 @@ Points on the current frame are preserved so you can correct them and re-run tra The **Merge tracked points** workflow allows you to: - Combine multiple tracking passes -- Resolve overlaps or conflicts +- Decide how to handle overlaps or conflicts - Produce a clean final annotation layer This is especially useful when tracking was run from multiple reference frames. @@ -316,7 +316,7 @@ Please share any feedback or insights you have with us! - The benefits are mostly for long, continuous videos with many frames to annotate, where tracking can save time by propagating annotations across many frames at once. - In very high-variability or very challenging videos, annotating by hand may still be more efficient than running tracking and correcting its results, especially if you only have a few frames to annotate. - Manual curation is still essential for good tracking results, and the tracking models do not fully replace the need for manual annotation. -- Ideally, mixing manual annotations from challenging or distinct frames with tracking results from easier frames yields the best results. +- In practice, a mix of hand-labeled hard frames and tracked easy frames often works best. 
- Be mindful of training set imbalance: if you flood your training set with easy frames that are well tracked, and only have a few hand-picked frames with rare or difficult poses, your model may not learn to generalize well to those challenging poses. #### Future features From 30494e84191d47b706783e304453806a35ef0b97 Mon Sep 17 00:00:00 2001 From: Cyril Achard Date: Mon, 11 May 2026 12:58:24 +0200 Subject: [PATCH 10/10] Add GPU info, fix layer name versioning --- docs/gui/napari/tracking/basic_usage.md | 9 +++++++-- 1 file changed, 7 insertions(+), 2 deletions(-) diff --git a/docs/gui/napari/tracking/basic_usage.md b/docs/gui/napari/tracking/basic_usage.md index 0c346c6c6..aefd43ea9 100644 --- a/docs/gui/napari/tracking/basic_usage.md +++ b/docs/gui/napari/tracking/basic_usage.md @@ -30,6 +30,11 @@ The **Tracking Controls** widget is designed to help automate DeepLabCut annotat ## Requirements +```{tip} +We recommend **having a GPU available for tracking**, as it can be computationally intensive and slow on CPU. +Expect longer processing times on CPU, especially for longer videos or larger tracking ranges. +``` + ### In napari ```{important} @@ -160,8 +165,8 @@ This is only available if the Keypoint controls widget has been opened. Each tracking run creates a **new Points layer**: -- Named automatically (`[Tracking vXX] Ref. layer name - tT - Tracker name`) - - `XX` refers to the iteration number (if multiple tracking runs are performed from the same reference frame) +- Named automatically (`[Tracking vXX] Ref. layer name - tT - Tracker name`) + - `XX` refers to the iteration number (if multiple tracking runs are performed from the same reference layer and model) + - `T` refers to the reference frame index used to generate the tracking result - Visually distinct from manual annotations: - Cross symbol
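
The layer-name versioning fixed in the last patch can be sketched as a small helper. This is a hypothetical illustration only — `make_tracking_layer_name`, its parameters, the zero-padded two-digit iteration, and the example layer name are all assumptions, not the plugin's actual API:

```python
def make_tracking_layer_name(ref_layer: str, ref_frame: int,
                             tracker: str, iteration: int) -> str:
    """Compose a tracking-result layer name following the documented pattern
    '[Tracking vXX] Ref. layer name - tT - Tracker name', where XX is the
    iteration number (assumed zero-padded here) and T the reference frame index."""
    return f"[Tracking v{iteration:02d}] {ref_layer} - t{ref_frame} - {tracker}"

# Hypothetical usage: second tracking pass from frame 12 of a DLC annotation layer.
print(make_tracking_layer_name("CollectedData_Cyril", 12, "CoTracker3", 2))
# → [Tracking v02] CollectedData_Cyril - t12 - CoTracker3
```

Encoding the reference layer, reference frame, and tracker into the name is what lets users tell apart results from multiple tracking passes before merging, as the "Tracking results" section recommends.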