Hi, I'm trying to train a multi-output NN, and I need to change the weight of each loss component depending on the epoch number. In some previous versions of Keras I implemented this mechanism by defining each entry of the loss_weights parameter as a K.variable() and changing its value with K.set_value() in the on_epoch_begin() method of a custom callback. In Keras 3.4 this is no longer allowed, since loss_weights must be a list/dict of floats, so within the callback I can't change the values in place with K.set_value(). Is there something I can do to overcome this issue? Thanks

Comment From: mehtamansi29

Hi @LucaSCostanzo -

Can you share some sample code showing where you are facing the error?

Comment From: LucaSCostanzo

Hi @mehtamansi29, thanks for the reply. Here is an example of what I meant. This is the custom callback I used to update the weights after each epoch:

import tensorflow.keras.backend as K
from tensorflow.keras import callbacks

class WeightAdjuster(callbacks.Callback):
    def __init__(self, weights: dict):
        self.w1 = weights['loss1']
        self.w2 = weights['loss2']

    def on_epoch_end(self, epoch, logs=None):
        # update the K.variable values in place
        K.set_value(self.w1, self.w1 + 1)
        K.set_value(self.w2, self.w2 - 1)

and the training pipeline I used in a previous version of Keras:

loss = {'loss1': BinaryCrossentropy(),  # or BinaryFocalCrossentropy()
        'loss2': BinaryCrossentropy()}
loss_weights = {'loss1': K.variable(1.),
                'loss2': K.variable(0.)}
metrics = {'loss1': 'accuracy',
           'loss2': 'accuracy'}
nn.compile(loss=loss, loss_weights=loss_weights, optimizer=optimizer, metrics=metrics)
nn.fit(training_generator, epochs=epochs, validation_data=validation_generator, steps_per_epoch=None)

but now it is not allowed, as I receive this message:

ValueError: For a model with multiple outputs, when providing the `loss_weights` argument as a dict, each dict entry should be a Python float (the weighting coefficient corresponding to the loss for that output). At key 'loss1', received invalid type:
<tf.Variable 'Variable:0' shape=() dtype=float32>

In Keras 3 I'm therefore forced to define loss_weights and the callback as:

loss_weights = {'loss1': 1., 'loss2': 0.}

class WeightAdjuster(callbacks.Callback):
    def __init__(self, weights: dict):
        self.w1 = weights['loss1']
        self.w2 = weights['loss2']

    def on_epoch_end(self, epoch, logs=None):
        self.w1 = self.w1 + 1
        self.w2 = self.w2 - 1

but now changing w1 and w2 has no effect on the loss_weights dict: floats are immutable, so the callback only rebinds its own attributes.
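Here is a minimal pure-Python example of why this happens (no Keras involved, names just mirror the snippet above):

```python
# Floats are immutable: storing them on the callback only copies the
# values, and rebinding the attribute never touches the original dict.
loss_weights = {'loss1': 1.0, 'loss2': 0.0}

w1 = loss_weights['loss1']  # copies the float value out of the dict
w1 = w1 + 1                 # rebinds the local name only

print(loss_weights['loss1'])  # still 1.0 -- the dict is unchanged
print(w1)                     # 2.0
```

With K.variable() both names pointed at the same mutable object, which is why the old in-place K.set_value() approach worked.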

I hope the issue is clearer now. Thanks

Comment From: dhantule

Hi @LucaSCostanzo, for this issue please refer to this comment, which suggests updating the weights in Python (as plain floats) and re-compiling the model whenever you want the new weight values to be taken into account. Thanks!
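A rough sketch of that pattern (the schedule function and callback here are illustrative, not a built-in Keras API): keep the weights as plain floats, and re-compile in on_epoch_begin with a freshly computed loss_weights dict. Passing the same optimizer instance should preserve its state across re-compiles.

```python
import keras

def linear_schedule(epoch):
    """Illustrative schedule: shift weight from loss2 to loss1 each epoch."""
    return {'loss1': 1.0 + epoch, 'loss2': max(0.0, 1.0 - epoch)}

class LossWeightScheduler(keras.callbacks.Callback):
    """Re-compiles the model with new float loss_weights each epoch."""

    def __init__(self, loss, optimizer, metrics, schedule):
        super().__init__()
        self.loss = loss
        self.optimizer = optimizer
        self.metrics = metrics
        self.schedule = schedule

    def on_epoch_begin(self, epoch, logs=None):
        # loss_weights must be plain floats in Keras 3, so build a fresh
        # dict and re-compile; reusing the same optimizer instance keeps
        # its state so training continues where it left off.
        self.model.compile(
            loss=self.loss,
            loss_weights=self.schedule(epoch),
            optimizer=self.optimizer,
            metrics=self.metrics,
        )
```

You would then pass the same loss/optimizer/metrics objects you gave to the original compile() call into the callback, and add the callback to fit(..., callbacks=[...]).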

Comment From: github-actions[bot]

This issue is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you.