deep learning - Lightning Tensorboard Hparams tab not showing custom metric - Stack Overflow


I'm training a neural network built with PyTorch Lightning, and I'm trying to get the HParams tab working in TensorBoard.

Following the official (now outdated) guide, I initialize the logger with default_hp_metric=False:

logger = TensorBoardLogger("lightning_logs", name=modelName, default_hp_metric=False)

I log the hyperparameters once, defining a custom metric:

logger.log_hyperparams(settings, metrics = {'val_maxroa': 0})

and I log val_maxroa at the end of each validation step in the LightningModule:

    def validation_step(self, batch, batch_idx):
        inputs, target = batch
        # ...things...
        self.log("val_maxroa", maxRoa)

On TensorBoard I can see the metric plotted in the "Time series" and "Scalars" tabs:

But it is not visible in the HParams tab:

I also checked the official documentation, but it is not very detailed on this point.

Do you have any hint?
