Rename the argument of dense encoder from layers to fc_layers #3930

Open · wants to merge 1 commit into master
4 changes: 2 additions & 2 deletions ludwig/encoders/generic_encoders.py
@@ -65,7 +65,7 @@ class DenseEncoder(Encoder):
     def __init__(
         self,
         input_size,
-        layers=None,
+        fc_layers=None,
Collaborator

@metamath1 Thank you for proposing this change. Do you think it would potentially cause backward compatibility issues if other users already passed layers to the constructor? Thank you. /cc @arnavgarg1

Contributor

@alexsherstinsky Yes, that is right.

We have a file for adding backward compatibility support here: https://github.com/ludwig-ai/ludwig/blob/ea890d99e72fbc32fad8aae18070da9969ddb422/ludwig/utils/backward_compatibility.py

And an associated test file, https://github.com/ludwig-ai/ludwig/blob/ea890d99e72fbc32fad8aae18070da9969ddb422/tests/ludwig/utils/test_backward_compatibility.py, where you can add a test to make sure it works.

@metamath1 Are you able to add backward compatibility support and an associated test for it?
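For reference, an upgrade of this kind usually reduces to a small key rename plus a test. A minimal sketch, assuming a standalone helper (the function and test names here are hypothetical; backward_compatibility.py has its own mechanism for registering upgrade steps):

def _rename_dense_encoder_layers(config: dict) -> dict:
    # Hypothetical upgrade step: move the legacy "layers" key of a dense
    # encoder to "fc_layers" so that pre-rename configs keep working.
    for feature in config.get("input_features", []):
        encoder = feature.get("encoder", {})
        if encoder.get("type") == "dense" and "layers" in encoder:
            encoder.setdefault("fc_layers", encoder.pop("layers"))
    return config

def test_rename_dense_encoder_layers():
    old_config = {
        "input_features": [
            {
                "name": "x",
                "type": "vector",
                "encoder": {"type": "dense", "layers": [{"output_size": 64}]},
            }
        ]
    }
    encoder = _rename_dense_encoder_layers(old_config)["input_features"][0]["encoder"]
    assert encoder["fc_layers"] == [{"output_size": 64}]
    assert "layers" not in encoder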

Collaborator

🙇

Contributor

@metamath1 Actually, I am a bit confused about this PR - what's the intended scope and reason for the change?

I misunderstood the change - this may not require any backward compatibility changes, since you're modifying the argument in the encoder class as opposed to the schema directly.

Author (@metamath1) Feb 13, 2024


Here's the setup for creating multiple Linear layers in a dense encoder:

config = {
    'preprocessing': {
        'split': {
            'type': 'fixed',
            'column': 'split'
        }
    },
    'input_features': [
        {
            'name': 'image',
            'type': 'vector',
            'encoder': {
                'type': 'dense',
                # This setting does not work: it is ignored, and a single
                # layer with an output size of 256 is always created.
                'fc_layers': [
                    {'output_size': 64},
                    {'output_size': 32}
                ]
            }
        }
    ],
    'output_features': [
        {'name': 'label', 'type': 'category'}
    ],
    'trainer': {'epochs': 1, 'batch_size': 64, 'eval_batch_size': 128}
}

However, the fc_layers setting does not work. The argument of DenseEncoder is named layers, but the config dictionary stores the setting under fc_layers, so the user's fc_layers settings are never passed to the DenseEncoder.

Looking at the documentation, it seems that starting from version 0.7, layers was renamed to fc_layers. Since the layers key is no longer created in the config dictionary, specifying layers in the config as in older versions, instead of fc_layers, does not work either.
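To make the mechanism concrete, here is a toy reproduction of the mismatch (this is not Ludwig's actual call path, just an illustration of how an unknown keyword silently falls back to the default when extra keys are swallowed by **kwargs):

def build_dense_encoder(layers=None, output_size=256, **ignored_kwargs):
    # Toy stand-in for DenseEncoder.__init__: "fc_layers" is not a declared
    # parameter, so it lands in ignored_kwargs and the 256 default wins.
    return layers if layers is not None else [{"output_size": output_size}]

user_encoder_config = {"fc_layers": [{"output_size": 64}, {"output_size": 32}]}
print(build_dense_encoder(**user_encoder_config))
# -> [{'output_size': 256}]: the user's fc_layers were silently dropped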

Collaborator

This really needs to be fixed.

For backward compatibility we can have a couple of lines of code that internally rename layers to fc_layers and warn the user that the old name is deprecated.

Also, if it was not working anyway, there's not a lot of damage in backward incompatibility.

Finally, I'm surprised that the wrong keyword was not caught by either the config validator or the dereferencing of the kwargs; there should have been a check somewhere.
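Such a shim could be just a few lines. A sketch, assuming layers is kept as a deprecated alias of fc_layers (the helper name and warning text are assumptions, not existing Ludwig code):

import warnings

def _resolve_fc_layers(fc_layers=None, layers=None):
    # Accept the deprecated "layers" keyword, map it onto "fc_layers",
    # and warn the user about the rename.
    if layers is not None:
        warnings.warn(
            "'layers' was renamed to 'fc_layers' for the dense encoder; "
            "'layers' is deprecated and will be removed in a future release.",
            DeprecationWarning,
        )
        if fc_layers is None:
            fc_layers = layers
    return fc_layers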

         num_layers=1,
         output_size=256,
         use_bias=True,
@@ -87,7 +87,7 @@ def __init__(
         logger.debug(" FCStack")
         self.fc_stack = FCStack(
             first_layer_input_size=input_size,
-            layers=layers,
+            layers=fc_layers,
             num_layers=num_layers,
             default_output_size=output_size,
             default_use_bias=use_bias,