recipes/MOABB/README.md
For training the model using the `leave_one_subject_out` training approach, just use the `--train_mode leave-one-subject-out` flag.
### Hyperparameter Tuning
Efficient hyperparameter tuning is paramount when introducing novel models or experimenting with diverse datasets. Our benchmark establishes a standardized protocol for hyperparameter tuning, utilizing [Orion](https://orion.readthedocs.io/en/stable/) to ensure fair model comparisons.
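As a sketch of how an Orion-driven search is typically launched from the command line (the experiment name, script, and search space below are illustrative, not this benchmark's actual configuration):

```shell
# Illustrative only: 'orion hunt' repeatedly runs the given command,
# sampling --lr from the declared search space (a log-uniform prior here).
orion hunt -n example_exp python train.py --lr~'loguniform(1e-5, 1e-1)'
```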
#### **Overview**
As evident from the example, you need to configure the hyperparameter file, specifying the hyperparameters to tune and their search spaces.
To train the model with the leave_one_subject_out approach, simply use the `--train_mode leave-one-subject-out` flag.
By default, training is performed on GPU. If no GPU is available on your machine, you can train models on CPU by specifying the `--device cpu` flag.
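Combining these flags, a CPU-only leave-one-subject-out run would look something like the following (the training script and hparams file names are hypothetical placeholders, not the recipe's actual paths):

```shell
# Hypothetical invocation combining the flags described above;
# replace the script and YAML file with the recipe's actual ones.
python train.py hparams/EEGNet.yaml --train_mode leave-one-subject-out --device cpu
```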
**Note:**
- To monitor the status of the hyperparameter optimization, run `orion status --all`. Make sure the environment variables required by Orion are set in your bash environment; you can do this by executing the following code in your terminal. Note that the value of the `ORION_DB_ADDRESS` variable varies from experiment to experiment; adjust it accordingly.
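  A minimal sketch of these exports (the database path below is illustrative and experiment-specific; `ORION_DB_ADDRESS` and `ORION_DB_TYPE` are the standard environment variables Orion reads):

  ```shell
  # Point Orion at the experiment's database file (example path; adjust it per experiment).
  export ORION_DB_ADDRESS=/path/to/output_folder/exp_name.pkl
  # Use Orion's file-based PickledDB backend.
  export ORION_DB_TYPE=pickleddb
  ```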
- If needed, you can interrupt the code at any point, and it will resume from the last successfully completed experiment.
#### **Output Structure**
Results are organized within the specified output folder (`--output_folder`):
For further details on arguments and customization options, consult `./run_hpara`…
- If you intend to perform multiple repetitions of the same hyperparameter optimization, you must change the `--exp_name` value.
- This script is designed for a Linux-based system. We provide a bash script rather than a Python script here because of its natural ability to orchestrate diverse training loops across various subjects and sessions.