Commit 067d94e ("Fixed conflicts")
Merge of 2 parents: 327a3f5 + 75f4c66

5 files changed: 20 additions & 8 deletions

recipes/MOABB/README.md

Lines changed: 13 additions & 3 deletions
@@ -130,7 +130,6 @@ For training the model using the `leave_one_subject_out` training approach, just
 
 ### Hyperparameter Tuning
 
-
 Efficient hyperparameter tuning is paramount when introducing novel models or experimenting with diverse datasets. Our benchmark establishes a standardized protocol for hyperparameter tuning, utilizing [Orion](https://orion.readthedocs.io/en/stable/) to ensure fair model comparisons.
 
 #### **Overview**

@@ -192,6 +191,19 @@ As evident from the example, you need to configure the hyperparameter file, spec
 When it comes to training the model utilizing the leave_one_subject_out approach, simply employ the `--train_mode leave-one-subject-out` flag.
 
 By default trainings are performed on gpu. However, in case you do not have any gpu available on your machine, you can train models on cpu by specifying the `--device cpu` flag.
+
+**Note:**
+- To monitor the status of the hyperparameter optimization, simply enter the following command: `orion status --all`. Ensure that you have added the necessary variables required by orion to your bash environment. You can achieve this by executing the following code within your terminal:
+
+```bash
+export ORION_DB_ADDRESS=results/MotorImagery/BNCI2014001/EEGNet/hopt/EEGNet_BNCI2014001_hopt.pkl
+export ORION_DB_TYPE=pickleddb
+```
+
+Please note that the value of the `ORION_DB_ADDRESS` variable will vary depending on the experiment. Adjust it accordingly.
+
+- If needed, you can interrupt the code at any point, and it will resume from the last successfully completed experiment.
+
 #### **Output Structure**
 
 Results are organized within the specified output folder (`--output_folder`):

@@ -213,8 +225,6 @@ For further details on arguments and customization options, consult `./run_hpara
 
 - If you intend to perform multiple repetitions of the same hparam optimization, it is necessary to modify the `--exp_name`.
 
-- Feel free to interrupt and resume the hyperparameter optimization script at any point without encountering any complications. Orion is resumable and will restart from the last experiment completed.
-
 - This script is designed for a Linux-based system. In this context, we provide a bash script instead of a Python script due to its natural ability of orchestrating diverse training loops across various subjects and sessions.
 

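To make the README note added above concrete: a minimal sketch of pointing Orion at the recipe's pickled database and querying it. The path is the one used in the diff and will differ per experiment; the `orion status --all` call is only attempted if the CLI is actually installed.

```shell
# Point Orion at the experiment's pickled database (path taken from the diff;
# adjust it for your own experiment)
export ORION_DB_ADDRESS=results/MotorImagery/BNCI2014001/EEGNet/hopt/EEGNet_BNCI2014001_hopt.pkl
export ORION_DB_TYPE=pickleddb
echo "Orion DB ($ORION_DB_TYPE): $ORION_DB_ADDRESS"

# With these variables set, list the status of all experiments in the database
# (skipped gracefully if the orion CLI is not on PATH)
command -v orion >/dev/null && orion status --all || true
```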
Lines changed: 1 addition & 1 deletion

@@ -1,5 +1,5 @@
 mne
 moabb
 orion[profet]
-sklearn
+scikit-learn
 torchinfo
recipes/MOABB/hparams/orion/hparams_tpe.yaml

Lines changed: 1 addition & 1 deletion

@@ -1,5 +1,5 @@
 experiment:
-    algorithms:
+    algorithm:
         tpe:
             seed: 1986
             n_initial_points: 20
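For reference, the singular `algorithm` key matches Orion's configuration schema, where the key selects and parameterizes the optimizer. A sketch of how the corrected fragment would read in full (the nesting shown is an assumption, since the diff view flattens indentation):

```yaml
experiment:
    algorithm:
        tpe:
            seed: 1986
            n_initial_points: 20
```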

recipes/MOABB/run_experiments.sh

Lines changed: 3 additions & 0 deletions

@@ -175,6 +175,9 @@ if [ "$rnd_dir" = True ]; then
     output_folder="$output_folder/$rnd_dirname"
 fi
 
+# Make sure the output_folder is created
+mkdir -p $output_folder
+
 # Print command line arguments and save to file
 {
     echo "hparams: $hparams"
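The added `mkdir -p` guards against a missing output directory: it creates any intermediate directories and succeeds silently when the path already exists, so repeated runs are safe. A minimal illustration using a throwaway path (not one from the recipe):

```shell
# Throwaway path for illustration only
out="$(mktemp -d)/MOABB/results"

mkdir -p "$out"   # creates all intermediate directories
mkdir -p "$out"   # second call is a no-op; no error if the path exists

[ -d "$out" ] && echo "output folder ready"
```

Note that the committed line uses an unquoted `$output_folder`; quoting it, as above, additionally protects paths containing spaces.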

recipes/MOABB/run_hparam_optimization.sh

Lines changed: 2 additions & 3 deletions

@@ -46,7 +46,6 @@
 # - Davide Borra (2023)
 ###########################################################
 
-
 # Initialize variables
 exp_name=""
 output_folder=""

@@ -65,8 +64,8 @@ mne_dir=""
 orion_db_address=""
 orion_db_type="PickledDB"
 exp_max_trials=50
-store_all=False
-compress_exp=False
+store_all=True
+compress_exp=True
 
 # Function to print argument descriptions and exit
 print_argument_descriptions() {
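The flipped defaults mean experiment artifacts are stored and compressed unless the caller opts out. As a sketch, the default-then-override pattern such bash scripts typically use looks like this (the `--store_all`/`--compress_exp` flag names here are illustrative assumptions, not taken from the script):

```shell
#!/bin/bash
# Defaults, matching the values set in the diff
store_all=True
compress_exp=True

# Illustrative flag parsing; the flag names are assumptions
while [ $# -gt 0 ]; do
  case "$1" in
    --store_all) store_all="$2"; shift 2 ;;
    --compress_exp) compress_exp="$2"; shift 2 ;;
    *) shift ;;
  esac
done

echo "store_all=$store_all compress_exp=$compress_exp"
```

Invoked with no arguments, this prints `store_all=True compress_exp=True`.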
