Introduction
Welcome to the Pywr book! Pywr is an open-source network resource allocation model. This book is a collection of documentation and tutorials for using Pywr. It should be read alongside the Pywr API documentation.
What is Pywr?
Pywr is a Rust crate and Python package for building and running water resource models. It allows users to construct models of water systems as a network of nodes and links, together with associated data. The model can then be used to simulate the operation of a water system and to evaluate its performance under different scenarios.
This version is a major update to the original Pywr model, which was written in Python and Cython. The new version is written in Rust, and uses Python bindings to expose the functionality to Python.
Installation
Pywr is both a Rust library and a Python package.
Rust
TBC
Python
Pywr requires Python 3.9 or later. It is currently not available on PyPI, but wheels are available from the GitHub Actions page. Navigate to the latest successful build, download the archive, and extract the wheel for your platform.
pip install pywr-2.0.0b0-cp312-none-win_amd64.whl
Note: The current Pywr v2.x releases are pre-releases and may not be suitable for production use. If you require Pywr v1.x, install it with:
pip install "pywr<2"
(The quotes prevent the shell from treating < as a redirect.)
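After installing the wheel, a quick import check confirms the package is available. This is a minimal sketch; it only relies on the Schema class used by the examples later in this book.

from pywr import Schema

# If this import succeeds, the v2 wheel is installed correctly.
print(Schema)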
Running a model
Pywr is a modelling system for simulating water resource systems. Models are defined using a JSON schema and can be run using the pywr command line tool. Below is an example of a simple model definition, simple1.json:
{
    "metadata": {
        "title": "Simple 1",
        "description": "A very simple example.",
        "minimum_version": "0.1"
    },
    "timestepper": {
        "start": "2015-01-01",
        "end": "2015-12-31",
        "timestep": 1
    },
    "network": {
        "nodes": [
            {
                "meta": {
                    "name": "supply1"
                },
                "type": "Input",
                "max_flow": {
                    "type": "Constant",
                    "value": 15.0
                }
            },
            {
                "meta": {
                    "name": "link1"
                },
                "type": "Link"
            },
            {
                "meta": {
                    "name": "demand1"
                },
                "type": "Output",
                "max_flow": {
                    "type": "Parameter",
                    "name": "demand"
                },
                "cost": {
                    "type": "Constant",
                    "value": -10
                }
            }
        ],
        "edges": [
            {
                "from_node": "supply1",
                "to_node": "link1"
            },
            {
                "from_node": "link1",
                "to_node": "demand1"
            }
        ],
        "parameters": [
            {
                "meta": {
                    "name": "demand"
                },
                "type": "Constant",
                "value": 10.0
            }
        ]
    }
}
To run the model, use the pywr command line tool:
python -m pywr run simple1.json
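The model can also be loaded and run from Python. The sketch below assumes the same Schema API used in the migration example later in this book (Schema.from_json_string, build and run); the solver name "clp" and the build arguments mirror that example.

from pathlib import Path
from pywr import Schema

# Read the JSON definition and build the model schema.
with open("simple1.json") as fh:
    schema = Schema.from_json_string(fh.read())

# Build the model relative to the current directory. The second argument
# (an optional output path) is left as None, as in the migration example.
model = schema.build(Path("."), None)

# Run the model with the CLP solver.
model.run("clp")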
Related projects
Core concepts
The network
Parameters
Penalty costs
Reservoirs
Abstraction licences
Scenarios
Input data
Getting started
Migrating from Pywr v1.x
This guide is intended to help users of Pywr v1.x migrate to Pywr v2.x. Pywr v2.x is a complete rewrite of Pywr with a new API and new features. This guide will help you update your models to this new version.
Overview of the process
Pywr v2.x includes a stricter schema for defining models. This schema, along with the pywr-v1-schema crate, provides a way to convert models from v1.x to v2.x. However, this process is not perfect and will more than likely require manual intervention to complete the migration. Migrating larger and/or more complex models will require an iterative process of conversion and testing.
The overall process will follow these steps:
- Convert the JSON from v1.x to v2.x using the provided conversion tool.
- Handle any errors or warnings from the conversion tool.
- Apply any other manual changes to the converted JSON.
- (Optional) Save the converted JSON as a new file.
- Load and run the new JSON file in Pywr v2.x.
- Compare model outputs to ensure the model behaves as expected. If necessary, make further changes to the above process and repeat.
Converting a model
The example below is a basic script that demonstrates how to convert a v1.x model to v2.x. This process converts the model at runtime, and does not replace the existing v1.x model with a v2.x definition.
Note: This example is meant to be a starting point for users to build their own conversion process; it is not a complete generic solution.
The function in the listing below is an example of the overall conversion process. The function takes a path to a JSON file containing a v1 Pywr model, and then converts it to v2.x.
- The function reads the JSON and applies the conversion function (convert_model_from_v1_json_string). This conversion function takes a JSON string and returns a tuple of the converted schema and a list of errors.
- The function then handles these errors using the handle_conversion_error function.
- After the errors are handled, other arbitrary changes are applied using the patch_model function.
- Finally, the converted JSON can be saved to a new file and run using Pywr v2.x.
from pywr import (
    convert_model_from_v1_json_string,
    ComponentConversionError,
    ConversionError,
    Schema,
)

def convert(v1_path: Path):
    with open(v1_path) as fh:
        v1_model_str = fh.read()

    # 1. Convert the v1 model to a v2 schema
    schema, errors = convert_model_from_v1_json_string(v1_model_str)
    schema_data = json.loads(schema.to_json_string())

    # 2. Handle any conversion errors
    for error in errors:
        handle_conversion_error(error, schema_data)

    # 3. Apply any other manual changes to the converted JSON.
    patch_model(schema_data)

    schema_data_str = json.dumps(schema_data, indent=4)

    # 4. Save the converted JSON as a new file (uncomment to save)
    # with open(v1_path.parent / "v2-model.json", "w") as fh:
    #     fh.write(schema_data_str)

    print("Conversion complete; running model...")

    # 5. Load and run the new JSON file in Pywr v2.x.
    schema = Schema.from_json_string(schema_data_str)
    model = schema.build(Path(__file__).parent, None)
    model.run("clp")

    print("Model run complete 🎉")
Handling conversion errors
The convert_model_from_v1_json_string function returns a list of errors that occurred during the conversion process. These errors can be handled in a variety of ways, such as modifying the model definition, raising exceptions, or ignoring them. It is suggested to implement a function that can handle these errors in a way that is appropriate for your use case. Begin by matching a few types of errors and then expand the matching as needed. By raising exceptions for unhandled errors, you can ensure that all errors are eventually accounted for, and that new errors are not missed.
The example handles ComponentConversionError by matching on the error subclass (either Parameter() or Node()), and then handling each case separately. These two classes contain the name of the component and, optionally, the attribute that caused the error. In addition, these types contain an inner error (ConversionError) that can be used to provide more detailed information. In the example, the UnrecognisedType() class is handled for Parameter() errors by applying the handle_custom_parameters function. This second function adds a Pywr v2.x compatible custom parameter to the model definition using the same name and type (class name) as the original parameter.
def handle_conversion_error(error: ComponentConversionError, schema_data):
    """Handle a schema conversion error.

    Raises a `RuntimeError` if an unhandled error case is found.
    """
    match error:
        case ComponentConversionError.Parameter():
            match error.error:
                case ConversionError.UnrecognisedType() as e:
                    print(
                        f"Patching custom parameter of type {e.ty} with name {error.name}"
                    )
                    handle_custom_parameters(schema_data, error.name, e.ty)
                case _:
                    raise RuntimeError(f"Other parameter conversion error: {error}")
        case ComponentConversionError.Node():
            raise RuntimeError(f"Failed to convert node `{error.name}`: {error.error}")
        case _:
            raise RuntimeError(f"Unexpected conversion error: {error}")

def handle_custom_parameters(schema_data, name: str, p_type: str):
    """Patch the v2 schema to add the custom parameter with `name` and `p_type`."""
    # Ensure the network parameters is a list
    if schema_data["network"]["parameters"] is None:
        schema_data["network"]["parameters"] = []

    schema_data["network"]["parameters"].append(
        {
            "meta": {"name": name},
            "type": "Python",
            "source": {"path": "v2_custom_parameter.py"},
            "object": p_type,  # Use the same class name in v1 & v2
            "args": [],
            "kwargs": {},
        }
    )
Other changes
The upgrade to v2.x may require other changes to the model. For example, the conversion process does not currently handle recorders and other model outputs. These will need to be manually added to the model definition. Such manual changes can be applied using, for example, a patch_model function. This function makes arbitrary changes to the model definition. The example below updates the model metadata to modify the description.
def patch_model(schema_data):
    """Patch the v2 schema to add any additional changes."""
    # Add any additional patches here
    schema_data["metadata"]["description"] = "Converted from v1 model"
Full example
The complete example below demonstrates the conversion process for a v1.x model to v2.x:
import json
from pathlib import Path

# ANCHOR: convert
from pywr import (
    convert_model_from_v1_json_string,
    ComponentConversionError,
    ConversionError,
    Schema,
)

def convert(v1_path: Path):
    with open(v1_path) as fh:
        v1_model_str = fh.read()

    # 1. Convert the v1 model to a v2 schema
    schema, errors = convert_model_from_v1_json_string(v1_model_str)
    schema_data = json.loads(schema.to_json_string())

    # 2. Handle any conversion errors
    for error in errors:
        handle_conversion_error(error, schema_data)

    # 3. Apply any other manual changes to the converted JSON.
    patch_model(schema_data)

    schema_data_str = json.dumps(schema_data, indent=4)

    # 4. Save the converted JSON as a new file (uncomment to save)
    # with open(v1_path.parent / "v2-model.json", "w") as fh:
    #     fh.write(schema_data_str)

    print("Conversion complete; running model...")

    # 5. Load and run the new JSON file in Pywr v2.x.
    schema = Schema.from_json_string(schema_data_str)
    model = schema.build(Path(__file__).parent, None)
    model.run("clp")

    print("Model run complete 🎉")
# ANCHOR_END: convert

# ANCHOR: handle_conversion_error
def handle_conversion_error(error: ComponentConversionError, schema_data):
    """Handle a schema conversion error.

    Raises a `RuntimeError` if an unhandled error case is found.
    """
    match error:
        case ComponentConversionError.Parameter():
            match error.error:
                case ConversionError.UnrecognisedType() as e:
                    print(
                        f"Patching custom parameter of type {e.ty} with name {error.name}"
                    )
                    handle_custom_parameters(schema_data, error.name, e.ty)
                case _:
                    raise RuntimeError(f"Other parameter conversion error: {error}")
        case ComponentConversionError.Node():
            raise RuntimeError(f"Failed to convert node `{error.name}`: {error.error}")
        case _:
            raise RuntimeError(f"Unexpected conversion error: {error}")

def handle_custom_parameters(schema_data, name: str, p_type: str):
    """Patch the v2 schema to add the custom parameter with `name` and `p_type`."""
    # Ensure the network parameters is a list
    if schema_data["network"]["parameters"] is None:
        schema_data["network"]["parameters"] = []

    schema_data["network"]["parameters"].append(
        {
            "meta": {"name": name},
            "type": "Python",
            "source": {"path": "v2_custom_parameter.py"},
            "object": p_type,  # Use the same class name in v1 & v2
            "args": [],
            "kwargs": {},
        }
    )
# ANCHOR_END: handle_conversion_error

# ANCHOR: patch_model
def patch_model(schema_data):
    """Patch the v2 schema to add any additional changes."""
    # Add any additional patches here
    schema_data["metadata"]["description"] = "Converted from v1 model"
# ANCHOR_END: patch_model

if __name__ == "__main__":
    pth = Path(__file__).parent / "v1-model.json"
    convert(pth)
Converting custom parameters
The main changes to custom parameters in Pywr v2.x are as follows:
- Custom parameters are no longer required to be a subclass of Parameter. They can instead be simple Python classes that implement a calc method.
- Users are no longer required to handle scenarios within custom parameters. Instead, an instance of the custom parameter is created for each scenario in the simulation. This simplifies writing parameters and removes the risk of accidentally contaminating state between scenarios.
- Custom parameters are now added to the model using the "Python" parameter type, i.e. the "type" field in the parameter definition should be set to "Python" (not the class name of the custom parameter). This parameter type requires that the user explicitly define which metrics the custom parameter requires; see the sketch after this list.
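To illustrate the last point, the listing below sketches what a converted custom parameter definition might look like, expressed as a Python dictionary. The field names follow the handle_custom_parameters function from the migration example; the commented-out metrics entry is a placeholder, because the exact field for declaring required metrics is not shown in this guide.

# Illustrative custom parameter entry for the v2 schema, as a Python dict.
custom_parameter = {
    "meta": {"name": "my_parameter"},
    "type": "Python",                       # always "Python" for custom parameters
    "source": {"path": "my_parameter.py"},  # file containing the custom class
    "object": "MyParameter",                # class name, unchanged from v1
    "args": [],
    "kwargs": {},
    # Placeholder: the required metrics must be declared explicitly; check the
    # pywr-schema documentation for the exact field name and structure.
    # "metrics": {...},
}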
Simple example
v1.x custom parameter:
from pywr.parameters import ConstantParameter

class MyParameter(ConstantParameter):
    def value(self, *args, **kwargs):
        return 42

MyParameter.register()
v2.x custom parameter:
class MyParameter:
    def calc(self, *args, **kwargs):
        return 42
Developers Guide
This section is intended for developers who want to contribute to Pywr. It covers the following topics:
- Parameter types and traits
- Adding a new parameter
Parameter traits and return types
The pywr-core crate defines a number of traits that are used to implement parameters. These traits are used to define the behaviour of the parameter and how it interacts with the model. Each parameter must implement the Parameter trait and one of the three compute traits: GeneralParameter<T>, SimpleParameter<T>, or ConstParameter<T>.
The Parameter trait
The Parameter trait is the base trait for all parameters in Pywr. It defines the basic behaviour of the parameter and how it interacts with the model. The minimum implementation requires returning the metadata for the parameter. Additional methods can be implemented to provide additional functionality. Please refer to the documentation for the Parameter trait for more information.
The GeneralParameter<T> trait
The GeneralParameter<T> trait is used for parameters that depend on MetricF64 values from the model. Because MetricF64 values can refer to other parameters, general model state or other information, implementing this trait provides the most flexibility for a parameter. The compute method is used to calculate the value of the parameter at a given timestep and scenario. This method is resolved in order with other model components such as nodes.
The SimpleParameter<T> trait
The SimpleParameter<T> trait is used for parameters that depend on SimpleMetricF64 or ConstantMetricF64 values only, or no other values at all. The compute method is used to calculate the value of the parameter at a given timestep and scenario, and therefore SimpleParameter<T> can vary with time. This method is resolved in order with other SimpleParameter<T> before GeneralParameter<T> and other model components such as nodes.
The ConstParameter<T> trait
The ConstParameter<T> trait is used for parameters that depend on ConstantMetricF64 values only and do not vary with time. The compute method is used to calculate the value of the parameter at the start of the simulation and is not resolved at each timestep. This method is resolved in order with other ConstParameter<T>.
Implementing multiple traits
A parameter should implement the "lowest" trait in the hierarchy. For example, if a parameter depends on a SimpleParameter<T> and a ConstParameter<T> value, it should implement the SimpleParameter<T> trait. If a parameter depends on a GeneralParameter<T> and a ConstParameter<T> value, it should implement the GeneralParameter<T> trait.
For some parameters it can be beneficial to implement multiple traits. For example, a parameter could be generic over the metric type (e.g. MetricF64, SimpleMetricF64, or ConstantMetricF64) and implement each of the three compute traits. This would allow the parameter to be used in the most efficient way possible depending on the model configuration.
Return types
While the compute traits are generic over the type T, the return type of the compute method is restricted: Pywr currently only supports f64, usize and MultiValue types. The MultiValue type is used to return multiple values from the compute method. This is useful for parameters that return multiple values at a given timestep and scenario. See the documentation for the MultiValue type for more information. Implementations of the compute traits are usually for one of these concrete types.
Adding a new parameter to Pywr
This guide explains how to add a new parameter to Pywr.
When to add a new parameter?
New parameters can be added to complement the existing parameters in Pywr. These parameters should be generic and reusable across a wide range of models. By adding them to Pywr itself, other users are able to use them in their models without having to implement them themselves. They are also typically implemented in Rust, which means they are fast and efficient.
If the parameter is specific to a particular model or data set, it is better to implement it in the model itself using a custom parameter. Custom parameters can be added using, for example, the PythonParameter.
Adding a new parameter
To add a new parameter to Pywr you need to do two things:
- Add the implementation to the pywr-core crate, and
- Add the schema definition to the pywr-schema crate.
Adding the implementation to pywr-core
The implementation of the parameter should be added to the pywr-core crate. This is typically done by adding a new module to the parameters module in the src directory. It is a good idea to follow the existing structure of the parameters module by making a new module for the new parameter. Developers can follow the existing parameters as examples.
In this example, we will add a new parameter called MaxParameter that calculates the maximum value of a metric. Parameters can depend on other parameters or values from the model via the MetricF64 type. In this case the metric field stores a MetricF64 that will be compared with the threshold field to calculate the maximum value. The threshold is a constant value that is set when the parameter is created. Finally, the meta field stores the metadata for the parameter. The ParameterMeta struct is used to store the metadata for all parameters and can be reused.
use pywr_core::metric::MetricF64;
use pywr_core::parameters::ParameterMeta;

pub struct MaxParameter {
    meta: ParameterMeta,
    metric: MetricF64,
    threshold: f64,
}
To allow the parameter to be used in the model it is helpful to add a new function that creates a new instance of the parameter. This will be used by the schema to create the parameter when it is loaded from a model file.
impl MaxParameter {
    pub fn new(name: ParameterName, metric: MetricF64, threshold: f64) -> Self {
        Self {
            meta: ParameterMeta::new(name),
            metric,
            threshold,
        }
    }
}
Finally, the minimum implementation of the Parameter trait and one of the three parameter compute traits should be added for MaxParameter. These traits require the meta function to return the metadata for the parameter, and the compute function to calculate the value of the parameter at a given timestep and scenario. In this case the compute function calculates the maximum of the metric value and the threshold. The value of the metric is obtained from the model using the get_value function. See the documentation about parameter traits and return types for more information.
impl Parameter for MaxParameter {
    fn meta(&self) -> &ParameterMeta {
        &self.meta
    }
}

impl GeneralParameter<f64> for MaxParameter {
    fn compute(
        &self,
        _timestep: &Timestep,
        _scenario_index: &ScenarioIndex,
        model: &Network,
        state: &State,
        _internal_state: &mut Option<Box<dyn ParameterState>>,
    ) -> Result<f64, PywrError> {
        // Current value
        let x = self.metric.get_value(model, state)?;
        Ok(x.max(self.threshold))
    }

    fn as_parameter(&self) -> &dyn Parameter
    where
        Self: Sized,
    {
        self
    }
}
Adding the schema definition to pywr-schema
The schema definition for the new parameter should be added to the pywr-schema crate. Again, it is a good idea to follow the existing structure of the schema by making a new module for the new parameter. Developers can also follow the existing parameters as examples.
As with the pywr-core implementation, the meta field is used to store the metadata for the parameter and can use the ParameterMeta struct (NB this is the one from the pywr-schema crate). The rest of the struct looks very similar to the pywr-core implementation, but uses pywr-schema types for the fields. The struct should also derive serde::Deserialize, serde::Serialize, Debug, Clone, JsonSchema, and PywrVisitAll to be compatible with the rest of Pywr.
Note: The PywrVisitAll derive is not shown in the listing as it cannot currently be used outside the pywr-schema crate.
use pywr_schema::metric::Metric;
use pywr_schema::parameters::ParameterMeta;
use schemars::JsonSchema;

#[derive(serde::Deserialize, serde::Serialize, Debug, Clone, JsonSchema)]
pub struct MaxParameter {
    #[serde(flatten)]
    pub meta: ParameterMeta,
    pub parameter: Metric,
    pub threshold: Option<f64>,
}
Next, the parameter needs a method to add itself to a network. This is typically done by implementing an add_to_model method for the parameter. This method should be feature-gated with the core feature to ensure it is only available when the core feature is enabled. The method should take a mutable reference to the network and a reference to the LoadArgs struct. The method should load the metric from the model using the load method, and then create a new MaxParameter using the new method implemented above. Finally, the method should add the parameter to the network using the add_parameter method.
#[cfg(feature = "core")]
use pywr_core::parameters::ParameterIndex;
#[cfg(feature = "core")]
use pywr_schema::{model::LoadArgs, SchemaError};

#[cfg(feature = "core")]
impl MaxParameter {
    pub fn add_to_model(
        &self,
        network: &mut pywr_core::network::Network,
        args: &LoadArgs,
    ) -> Result<ParameterIndex<f64>, SchemaError> {
        let idx = self.parameter.load(network, args, Some(&self.meta.name))?;
        let threshold = self.threshold.unwrap_or(0.0);
        let p = pywr_core::parameters::MaxParameter::new(self.meta.name.as_str().into(), idx, threshold);
        Ok(network.add_parameter(Box::new(p))?)
    }
}
Finally, the schema definition should be added to the Parameter enum in the parameters module. This will require ensuring the new variant is added to all places where that enum is used. The compiler's exhaustiveness checks can be helpful in ensuring all places are updated.
Contributing to Documentation
The documentation for Pywr v2 is located in the pywr-next repository, in the pywr-book subfolder. The documentation is written using Markdown, a format which enables easy formatting for the web. This website can help you get started: www.markdownguide.org
To contribute documentation for Pywr V2, we recommend following the steps below to ensure we can review and integrate any changes as easily as possible.
Steps to create documentation
- Fork the pywr-next repository
- Clone the fork:
git clone https://github.com/MYUSER/pywr-next
- Create a branch:
git checkout -b my-awesome-docs
- Open the book documentation in your favourite editor:
vi pywr-next/pywr-book/introduction.md
- Having modified the documentation, add and commit the changes using the commit message format shown below:
git add introduction.md
git commit -m "docs: Add an example documentation"
- Create a pull request from your branch:
  - In your fork, click on the 'Pull Requests' tab
  - Click on 'New Pull Request'
  - Choose your branch from the drop-down on the right-hand-side
  - Click 'Create Pull Request' when the button appears
  - Add a note if you want, and click 'Create Pull Request'