Currently a ForceBalance calculation reads from the input file a set of general options for the calculation and one set of target options for each target. Options that are not set in the input file are set to their default values as declared in the parser. This is done by first copying the default general options / target options and then inserting user-provided values. Simple, right?
Wrong; it turns out we often want to define default values at the target level. That is to say, different targets might want to have different default values for a given option. The default value for XYZ coordinates is an example.
I have tried (unsuccessfully) to implement this. The `set_option` method defined for all ForceBalance classes allows you to enter a default value. This value is used if the options dictionaries (passed into the target initialization) do not contain the key in question. However, since the options dictionaries always contain the default values from `parser.py`, the target-level default values are never used, so we don't have the fine-grained control over target options that we desire.
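For concreteness, here is a minimal sketch of that flow as I understand it (the dictionary and function names below are stand-ins for illustration, not the actual code in `parser.py`):

```python
# Hypothetical sketch of the current behavior; the real defaults live in
# parser.py and the real set_option is a method on ForceBalance classes.
gen_opts_defaults = {'continue': False, 'penalty_type': 'L2'}

def read_input(user_opts):
    # Start from a copy of the global defaults, then overlay user values.
    options = dict(gen_opts_defaults)
    options.update(user_opts)
    return options

def set_option(options, key, default=None):
    # The target-level default only fires if the key is absent -- but since
    # read_input copied every global default in, the key is always present,
    # so this fallback is dead code in practice.
    return options.get(key, default)

options = read_input({'penalty_type': 'L1'})
print(set_option(options, 'continue', default=True))  # prints False, not True
```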
I think there ought to be a hierarchy of ways to set target options:

1. User-provided input (in the input file) is always used.
2. Defaults provided at the target level are next highest in priority.
3. Global defaults provided in `parser.py` are lowest priority.
One way to implement this would be to simply not copy the global defaults when initially creating the global option / target option dictionaries, so they are empty except for the user-provided values when passed into the class initialization. This could break a few existing things, but thank goodness for unit tests.
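Under that scheme, the lookup would fall through the three levels in order; a rough sketch (all names hypothetical):

```python
# Hypothetical sketch of the three-level lookup; parser_defaults stands in
# for the global defaults in parser.py, which would no longer be copied
# into the options dictionaries up front.
parser_defaults = {'continue': False, 'coords': None}

def resolve_option(key, user_opts, target_defaults):
    if key in user_opts:            # 1. user input always wins
        return user_opts[key]
    if key in target_defaults:      # 2. then target-level defaults
        return target_defaults[key]
    return parser_defaults[key]     # 3. global defaults are the fallback

# A target that prefers its own coordinate file by default:
print(resolve_option('coords', {}, {'coords': 'all.gro'}))                    # 'all.gro'
print(resolve_option('coords', {'coords': 'conf.pdb'}, {'coords': 'all.gro'}))  # 'conf.pdb'
```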
Also: certain True/False options have previously defaulted to False, but the proposed change would make it so that the option doesn't exist at all. This means that in several places in the code I have to write something clumsy like this:
```diff
- if not options['continue']:
+ if ('continue' not in options) or (not options['continue']):
```
It's not too clumsy, but somehow it seems wrong.
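One way to soften it (just a sketch, not tested against the rest of the code) is `dict.get`, which treats a missing key and an explicit False the same way:

```python
# Equivalent to the two-condition check above: a missing 'continue' key
# and an explicit False both take this branch.
if not options.get('continue', False):
    ...
```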
Let me know what you think.
You mention the 'hierarchy' of different ways that a given option can be set, with direct user input at the highest level and different 'levels' of defaults having lower priority below that. What about directly translating that idea into code?
Each time you register a value for an option, you'd just mark what priority it has and then let the library give you the highest priority value each time you query it.
Thinking out loud here...
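Something along these lines, as a rough sketch (the `OptionStore` class and `PRIORITY_*` names are invented for illustration, not existing ForceBalance API):

```python
# Rough sketch of priority-tagged option storage. Each registration carries
# a priority; queries always see the highest-priority value registered so far.
PRIORITY_GLOBAL_DEFAULT = 0
PRIORITY_TARGET_DEFAULT = 1
PRIORITY_USER_INPUT = 2

class OptionStore:
    def __init__(self):
        self._values = {}  # key -> (priority, value)

    def register(self, key, value, priority):
        # Keep the value only if it outranks (or ties) what we already have;
        # on a tie, the later registration wins.
        if key not in self._values or priority >= self._values[key][0]:
            self._values[key] = (priority, value)

    def __getitem__(self, key):
        return self._values[key][1]

opts = OptionStore()
opts.register('continue', False, PRIORITY_GLOBAL_DEFAULT)
opts.register('continue', True, PRIORITY_TARGET_DEFAULT)
print(opts['continue'])  # True: the target-level default outranks the global one
```

The nice part is that lookups never need to know where a value came from, and a new level could be slipped into the hierarchy without touching any of the querying code.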