Does Hydra's BasicSweeper support nested sweeps?

Posted by Parlu10 (Guest)
I'm using Hydra to manage my model's parameter configuration, and I want to do hyperparameter tuning. I'm using two different optimizers, Adam and SGD, which take different parameters, so I wanted to use a nested sweep to set them; but I'm not sure whether Hydra's BasicSweeper supports nested sweeps and, if so, how to set one up. The documentation doesn't really say anything about it, but maybe I'm missing something.

My main config:

Code:
defaults:
  - dataset: file_cpet
  - optimizer: ???
  - model: tenet_lstm
  - _self_

working_dir: ${hydra:runtime.cwd}
data_dir: ${working_dir}/data/
batch_size: 10
epochs: 150

hydra:
  sweeper:
    params:
      optimizer: sgd, adam
      optimizer.lr: 1e-2, 1.5e-2
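
As I understand it, the basic sweeper just takes the Cartesian product of the params above, so this block should be equivalent to the following multirun call expanding into four jobs (the script path is the one from my traceback):

Code:
python src/scripts/train_lstm.py --multirun optimizer=sgd,adam optimizer.lr=1e-2,1.5e-2

# which expands, job by job, into:
#   optimizer=sgd  optimizer.lr=0.01
#   optimizer=sgd  optimizer.lr=0.015
#   optimizer=adam optimizer.lr=0.01
#   optimizer=adam optimizer.lr=0.015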

sgd.yaml

Code:
defaults:
  - sgd-tuning: momentum-tuning
  - _self_

name: sgd
lr: 1.0e-2
nesterov: False
weight_decay: 0

hydra:
  sweeper:
    params:
      sgd-tuning.momentum: 0.85, 0.9

momentum-tuning.yaml

Code:
momentum: 0.85

I tried setting it up like this, but it doesn't work; or rather, the sweep runs fine on Adam's turn, but fails on SGD's turn with this error:

Code:
Error executing job with overrides: ['optimizer=sgd', 'optimizer.lr=0.01']
Traceback (most recent call last):
  File "c:\Users\Parlu\Desktop\respirazione\src\scripts\train_lstm.py", line 141, in main
    m = cfg.optimizer.momentum
omegaconf.errors.ConfigAttributeError: Key 'momentum' is not in struct
    full_key: optimizer.momentum
    object_type=dict
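
For context, this is roughly how my training script reads the config; a stripped-down sketch (the config_path/config_name values and the name check are placeholders, not my exact code):

Code:
import hydra
from omegaconf import DictConfig


@hydra.main(config_path="conf", config_name="config", version_base=None)
def main(cfg: DictConfig) -> None:
    if cfg.optimizer.name == "sgd":
        # This is the access that fails:
        # ConfigAttributeError: Key 'momentum' is not in struct
        m = cfg.optimizer.momentum
        print(m)


if __name__ == "__main__":
    main()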