API Reference
Network
Sub-Network
Descriptors
Descriptors for component attributes.
- class pypsa.descriptors.Dict
Dict is a subclass of dict, which allows you to get AND SET items in the dict using the attribute syntax!
Stripped down from addict https://github.com/mewwts/addict/ .
- class pypsa.descriptors.OrderedGraph(incoming_graph_data=None, multigraph_input=None, **attr)
- adjlist_dict_factory
alias of OrderedDict
- node_dict_factory
alias of OrderedDict
- pypsa.descriptors.allocate_series_dataframes(network, series)
Populate time-varying outputs with default values.
- Parameters
network (pypsa.Network) –
series (dict) – Dictionary of components and their attributes to populate (see example)
- Return type
None
Examples
>>> allocate_series_dataframes(network, {'Generator': ['p'], 'Load': ['p']})
- pypsa.descriptors.expand_series(ser, columns)
Helper function to quickly expand a series to a dataframe with the given column axis, where every column equals the given series.
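A minimal illustrative sketch (the series values and column labels below are made up):
>>> import pandas as pd
>>> from pypsa.descriptors import expand_series
>>> ser = pd.Series([1.0, 2.0], index=['gen1', 'gen2'])
>>> expand_series(ser, ['a', 'b', 'c'])  # DataFrame with columns a, b, c, each equal to ser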
- pypsa.descriptors.get_active_assets(n, c, investment_period)
Getter function.
Get True values for elements of component c which are active at a given investment period. These are calculated from lifetime and the build year.
- pypsa.descriptors.get_activity_mask(n, c, sns=None, index=None)
Getter function.
Get a boolean array with True values for elements of component c which are active at a specific snapshot. If the network is in multi_investment_period mode (given by n._multi_invest), these are calculated from lifetime and the build year. Otherwise all values are set to True.
- pypsa.descriptors.get_bounds_pu(n, c, sns, index=None, attr=None)
Getter function to retrieve the per unit bounds of a given component for given snapshots and a possible subset of elements (e.g. non-extendables). Depending on the attr you can further specify the bounds of the variable you are looking at, e.g. p_store for storage units.
- Parameters
n (pypsa.Network) –
c (string) – Component name, e.g. “Generator”, “Line”.
sns (pandas.Index/pandas.DateTimeIndex) – set of snapshots for the bounds
index (pd.Index, default None) – Subset of the component elements. If None (default) bounds of all elements are returned.
attr (string, default None) – attribute name for the bounds, e.g. “p”, “s”, “p_store”
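A sketch of a typical call, assuming the function returns the lower and upper per-unit bound frames as a tuple (not stated in this docstring):
>>> from pypsa.descriptors import get_bounds_pu
>>> min_pu, max_pu = get_bounds_pu(n, 'StorageUnit', n.snapshots, attr='p_store')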
- pypsa.descriptors.get_committable_i(n, c)
Getter function.
Get the index of committable elements of a given component.
- pypsa.descriptors.get_extendable_i(n, c)
Getter function.
Get the index of extendable elements of a given component.
- pypsa.descriptors.get_non_extendable_i(n, c)
Getter function.
Get the index of non-extendable elements of a given component.
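These three index getters are commonly combined to split a component into its extendable, fixed and committable subsets, e.g. (sketch):
>>> from pypsa.descriptors import get_extendable_i, get_non_extendable_i, get_committable_i
>>> ext_i = get_extendable_i(n, 'Generator')      # p_nom_extendable generators
>>> fix_i = get_non_extendable_i(n, 'Generator')  # fixed-capacity generators
>>> com_i = get_committable_i(n, 'Generator')     # committable generators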
- pypsa.descriptors.get_switchable_as_dense(network, component, attr, snapshots=None, inds=None)
Return a Dataframe for a time-varying component attribute with values for all non-time-varying components filled in with the default values for the attribute.
- Parameters
network (pypsa.Network) –
component (string) – Component object name, e.g. ‘Generator’ or ‘Link’
attr (string) – Attribute name
snapshots (pandas.Index) – Restrict to these snapshots rather than network.snapshots.
inds (pandas.Index) – Restrict to these components rather than network.components.index
- Return type
pandas.DataFrame
Examples
>>> get_switchable_as_dense(network, 'Generator', 'p_max_pu')
- pypsa.descriptors.get_switchable_as_iter(network, component, attr, snapshots, inds=None)
Return an iterator over snapshots for a time-varying component attribute with values for all non-time-varying components filled in with the default values for the attribute.
- Parameters
network (pypsa.Network) –
component (string) – Component object name, e.g. ‘Generator’ or ‘Link’
attr (string) – Attribute name
snapshots (pandas.Index) – Restrict to these snapshots rather than network.snapshots.
inds (pandas.Index) – Restrict to these items rather than all of network.{generators, ..}.index
- Return type
pandas.DataFrame
Examples
>>> get_switchable_as_iter(network, 'Generator', 'p_max_pu', snapshots)
- pypsa.descriptors.zsum(s, *args, **kwargs)
pandas 0.21.0 changes sum() behavior so that the result of applying sum over an empty DataFrame is NaN.
Meant to be set as pd.Series.zsum = zsum.
Input and Output
Functions for importing and exporting data.
- pypsa.io.export_to_csv_folder(network, csv_folder_name, encoding=None, export_standard_types=False)
Export network and components to a folder of CSVs.
Both static and series attributes of all components are exported, but only if they have non-default values.
If csv_folder_name does not already exist, it is created.
Static attributes are exported in one CSV file per component, e.g. generators.csv.
Series attributes are exported in one CSV file per component per attribute, e.g. generators-p_set.csv.
- Parameters
csv_folder_name (string) – Name of folder to which to export.
encoding (str, default None) – Encoding to use for UTF when reading (ex. ‘utf-8’). List of Python standard encodings
export_standard_types (boolean, default False) – If True, then standard types are exported too (upon reimporting you should then set “ignore_standard_types” when initialising the network).
Examples
>>> network.export_to_csv_folder(csv_folder_name)
- pypsa.io.export_to_hdf5(network, path, export_standard_types=False, **kwargs)
Export network and components to an HDF store.
Both static and series attributes of components are exported, but only if they have non-default values.
If path does not already exist, it is created.
- Parameters
path (string) – Name of hdf5 file to which to export (if it exists, it is overwritten)
export_standard_types (boolean, default False) – If True, then standard types are exported too (upon reimporting you should then set “ignore_standard_types” when initialising the network).
**kwargs – Extra arguments for pd.HDFStore to specify f.i. compression (default: complevel=4)
Examples
>>> network.export_to_hdf5(filename)
- pypsa.io.export_to_netcdf(network, path=None, export_standard_types=False, least_significant_digit=None)
Export network and components to a netCDF file.
Both static and series attributes of components are exported, but only if they have non-default values.
If path does not already exist, it is created.
If no path is passed, no file is exported, but the xarray.Dataset is still returned.
Be aware that this cannot export boolean attributes on the Network class, e.g. network.my_bool = False is not supported by netCDF.
- Parameters
path (string|None) – Name of netCDF file to which to export (if it exists, it is overwritten); if None is passed, no file is exported.
export_standard_types (boolean, default False) – If True, then standard types are exported too (upon reimporting you should then set “ignore_standard_types” when initialising the network).
least_significant_digit – This is passed to the netCDF exporter, but currently makes no difference to file size or float accuracy. We’re working on improving this…
- Returns
ds
- Return type
xarray.Dataset
Examples
>>> network.export_to_netcdf("my_file.nc")
- pypsa.io.import_components_from_dataframe(network, dataframe, cls_name)
Import components from a pandas DataFrame.
If columns are missing then defaults are used.
If extra columns are added, these are left in the resulting component dataframe.
- Parameters
dataframe (pandas.DataFrame) – A DataFrame whose index is the names of the components and whose columns are the non-default attributes.
cls_name (string) – Name of class of component, e.g. "Line", "Bus", "Generator", "StorageUnit"
Examples
>>> import pandas as pd
>>> buses = ['Berlin', 'Frankfurt', 'Munich', 'Hamburg']
>>> network.import_components_from_dataframe(
...     pd.DataFrame({"v_nom": 380, "control": 'PV'}, index=buses),
...     "Bus")
>>> network.import_components_from_dataframe(
...     pd.DataFrame({"carrier": "solar", "bus": buses, "p_nom_extendable": True},
...                  index=[b + " PV" for b in buses]),
...     "Generator")
See also
pypsa.Network.madd
- pypsa.io.import_from_csv_folder(network, csv_folder_name, encoding=None, skip_time=False)
Import network data from CSVs in a folder.
The CSVs must follow the standard form, see pypsa/examples.
- Parameters
csv_folder_name (string) – Name of folder
encoding (str, default None) –
Encoding to use for UTF when reading (ex. ‘utf-8’). List of Python standard encodings
skip_time (bool, default False) – Skip reading in time dependent attributes
Examples
>>> network.import_from_csv_folder(csv_folder_name)
- pypsa.io.import_from_hdf5(network, path, skip_time=False)
Import network data from HDF5 store at path.
- Parameters
path (string, Path) – Name of HDF5 store
skip_time (bool, default False) – Skip reading in time dependent attributes
- pypsa.io.import_from_netcdf(network, path, skip_time=False)
Import network data from netCDF file or xarray Dataset at path.
- Parameters
path (string|xr.Dataset) – Path to netCDF dataset or instance of xarray Dataset
skip_time (bool, default False) – Skip reading in time dependent attributes
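As with the other importers, a typical call mirrors the corresponding export (the file name is illustrative):
>>> import pypsa
>>> network = pypsa.Network()
>>> network.import_from_netcdf('my_file.nc')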
- pypsa.io.import_from_pandapower_net(network, net, extra_line_data=False, use_pandapower_index=False)
Import PyPSA network from pandapower net.
Importing from pandapower is still in beta; not all pandapower components are supported.
Unsupported features include:
- three-winding transformers
- switches
- in_service status
- tap positions of transformers
- Parameters
net (pandapower network) –
extra_line_data (boolean, default: False) – if True, the line data for all parameters is imported instead of only the type
use_pandapower_index (boolean, default: False) – if True, use the integer numbers of the pandapower index standard; if False, use net.name as index (e.g. 'Bus 1' (str) or 1 (int))
Examples
>>> network.import_from_pandapower_net(net)
OR
>>> import pypsa
>>> import pandapower as pp
>>> import pandapower.networks as pn
>>> net = pn.create_cigre_network_mv(with_der='all')
>>> network = pypsa.Network()
>>> network.import_from_pandapower_net(net, extra_line_data=True)
- pypsa.io.import_from_pypower_ppc(network, ppc, overwrite_zero_s_nom=None)
Import network from PYPOWER PPC dictionary format version 2.
Converts all baseMVA to base power of 1 MVA.
For the meaning of the pypower indices, see also pypower/idx_*.
- Parameters
ppc (PYPOWER PPC dict) –
overwrite_zero_s_nom (Float or None, default None) –
Examples
>>> from pypower.api import case30
>>> ppc = case30()
>>> network.import_from_pypower_ppc(ppc)
- pypsa.io.import_series_from_dataframe(network, dataframe, cls_name, attr)
Import time series from a pandas DataFrame.
- Parameters
dataframe (pandas.DataFrame) – A DataFrame whose index is network.snapshots and whose columns are a subset of the relevant components.
cls_name (string) – Name of class of component
attr (string) – Name of time-varying series attribute
Examples
>>> import numpy as np
>>> import pandas as pd
>>> network.set_snapshots(range(10))
>>> network.import_series_from_dataframe(
...     pd.DataFrame(np.random.rand(10, 4),
...                  columns=network.generators.index,
...                  index=range(10)),
...     "Generator",
...     "p_max_pu")
See also
pypsa.Network.madd
Network Graph
Graph helper functions, which are attached to network and sub_network.
- pypsa.graph.adjacency_matrix(network, branch_components=None, investment_period=None, busorder=None, weights=None)
Construct a sparse adjacency matrix (directed)
- Parameters
branch_components (iterable sublist of branch_components) – Buses connected by any of the selected branches are adjacent (default: branch_components (network) or passive_branch_components (sub_network))
busorder (pd.Index subset of network.buses.index) – Basis to use for the matrix representation of the adjacency matrix (default: buses.index (network) or buses_i() (sub_network))
weights (pd.Series or None (default)) – If given must provide a weight for each branch, multi-indexed on branch_component name and branch name.
- Returns
adjacency_matrix – Directed adjacency matrix
- Return type
sp.sparse.coo_matrix
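The graph helpers are attached to the network object, so a sketch of a typical call is (keyword choices are illustrative):
>>> A = network.adjacency_matrix(branch_components=['Line', 'Link'],
...                              busorder=network.buses.index)
>>> A.shape  # (number of buses, number of buses)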
- pypsa.graph.graph(network, branch_components=None, weight=None, inf_weight=False)
Build NetworkX graph.
- Parameters
network (Network|SubNetwork) –
branch_components ([str]) – Components to use as branches. The default are passive_branch_components in the case of a SubNetwork and branch_components in the case of a Network.
weight (str) – Branch attribute to use as weight
inf_weight (bool|float) – How to treat infinite weights (default: False). True keeps the infinite weight. False skips edges with infinite weight. If a float is given it is used instead.
- Returns
graph – NetworkX graph
- Return type
- pypsa.graph.incidence_matrix(network, branch_components=None, busorder=None)
Construct a sparse incidence matrix (directed)
- Parameters
branch_components (iterable sublist of branch_components) – Buses connected by any of the selected branches are adjacent (default: branch_components (network) or passive_branch_components (sub_network))
busorder (pd.Index subset of network.buses.index) – Basis to use for the matrix representation of the adjacency matrix (default: buses.index (network) or buses_i() (sub_network))
- Returns
incidence_matrix – Directed incidence matrix
- Return type
sp.sparse.csr_matrix
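Analogously, a sketch of obtaining the bus-by-branch incidence matrix from the network object:
>>> K = network.incidence_matrix(busorder=network.buses.index)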
Power Flow
Power flow functionality.
- pypsa.pf.aggregate_multi_graph(sub_network)
Aggregate branches between same buses and replace with a single branch with aggregated properties (e.g. s_nom is summed, length is averaged).
- pypsa.pf.apply_line_types(network)
Calculate line electrical parameters x, r, b, g from standard types.
- pypsa.pf.apply_transformer_t_model(network)
Convert given T-model parameters to PI-model parameters using wye-delta transformation.
- pypsa.pf.apply_transformer_types(network)
Calculate transformer electrical parameters x, r, b, g from standard types.
- pypsa.pf.calculate_B_H(sub_network, skip_pre=False)
Calculate B and H matrices for AC or DC sub-networks.
- pypsa.pf.calculate_PTDF(sub_network, skip_pre=False)
Calculate the Power Transfer Distribution Factor (PTDF) for sub_network.
Sets sub_network.PTDF as a (dense) numpy array.
- Parameters
sub_network (pypsa.SubNetwork) –
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values, finding bus controls and computing B and H.
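A hypothetical usage sketch, assuming the sub-networks have been determined beforehand:
>>> import pypsa
>>> network.determine_network_topology()
>>> sn = network.sub_networks.obj.iloc[0]
>>> pypsa.pf.calculate_PTDF(sn)
>>> sn.PTDF  # dense branches x buses numpy array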
- pypsa.pf.calculate_Y(sub_network, skip_pre=False)
Calculate bus admittance matrices for AC sub-networks.
- pypsa.pf.calculate_dependent_values(network)
Calculate per unit impedances and append voltages to lines and shunt impedances.
- pypsa.pf.find_bus_controls(sub_network)
Find slack and all PV and PQ buses for a sub_network.
This function also fixes sub_network.buses_o, a DataFrame ordered by control type.
- pypsa.pf.find_cycles(sub_network, weight='x_pu')
Find all cycles in the sub_network and record them in sub_network.C.
networkx collects the cycles with more than 2 edges; then the 2-edge cycles from the MultiGraph must be collected separately (for cases where there are multiple lines between the same pairs of buses).
Cycles with infinite impedance are skipped.
- pypsa.pf.find_slack_bus(sub_network)
Find the slack bus in a connected sub-network.
- pypsa.pf.find_tree(sub_network, weight='x_pu')
Get the spanning tree of the graph, choose the node with the highest degree as a central “tree slack” and then see for each branch which paths from the slack to each node go through the branch.
- pypsa.pf.network_batch_lpf(network, snapshots=None)
Batched linear power flow with numpy.dot for several snapshots.
- pypsa.pf.network_lpf(network, snapshots=None, skip_pre=False)
Linear power flow for generic network.
- Parameters
snapshots (list-like|single snapshot) – A subset or an element of network.snapshots on which to run the power flow, defaults to network.snapshots
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values and finding bus controls.
- Return type
None
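The function is also exposed as a method on the network, so a typical call is simply (sketch):
>>> network.lpf(network.snapshots)
>>> network.lines_t.p0  # resulting active power flows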
- pypsa.pf.network_pf(network, snapshots=None, skip_pre=False, x_tol=1e-06, use_seed=False, distribute_slack=False, slack_weights='p_set')
Full non-linear power flow for generic network.
- Parameters
snapshots (list-like|single snapshot) – A subset or an element of network.snapshots on which to run the power flow, defaults to network.snapshots
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values and finding bus controls.
x_tol (float) – Tolerance for Newton-Raphson power flow.
use_seed (bool, default False) – Use a seed for the initial guess for the Newton-Raphson algorithm.
distribute_slack (bool, default False) – If True, distribute the slack power across generators proportional to generator dispatch by default or according to the distribution scheme provided in slack_weights. If False, only the slack generator takes up the slack.
slack_weights (dict|str, default 'p_set') – Distribution scheme describing how to determine the fraction of the total slack power (of each sub-network individually) a bus of the sub-network takes up. Default is to distribute proportional to generator dispatch ('p_set'). Another option is to distribute proportional to (optimised) nominal capacity ('p_nom' or 'p_nom_opt'). Custom weights can be specified via a dictionary that has a key for each sub-network index (network.sub_networks.index) and a pandas.Series/dict with buses or generators of the corresponding sub-network as index/keys. When specifying custom weights with buses as index/keys, the slack power of a bus is distributed among its generators in proportion to their nominal capacity (p_nom) if given, otherwise evenly.
- Returns
Dictionary with keys ‘n_iter’, ‘converged’, ‘error’ and dataframe values indicating number of iterations, convergence status, and iteration error for each snapshot (rows) and sub_network (columns)
- Return type
dict
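As a sketch, a full power flow with distributed slack could be invoked as follows (keyword choices are illustrative):
>>> info = network.pf(distribute_slack=True, slack_weights='p_nom')
>>> info['converged']  # convergence status per snapshot and sub-network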
- pypsa.pf.newton_raphson_sparse(f, guess, dfdx, x_tol=1e-10, lim_iter=100, distribute_slack=False, slack_weights=None)
Solve f(x) = 0 with initial guess for x and dfdx(x). dfdx(x) should return a sparse Jacobian. Terminate if error on norm of f(x) is < x_tol or there were more than lim_iter iterations.
- pypsa.pf.sub_network_lpf(sub_network, snapshots=None, skip_pre=False)
Linear power flow for connected sub-network.
- Parameters
snapshots (list-like|single snapshot) – A subset or an element of network.snapshots on which to run the power flow, defaults to network.snapshots
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values and finding bus controls.
- Return type
None
- pypsa.pf.sub_network_pf(sub_network, snapshots=None, skip_pre=False, x_tol=1e-06, use_seed=False, distribute_slack=False, slack_weights='p_set')
Non-linear power flow for connected sub-network.
- Parameters
snapshots (list-like|single snapshot) – A subset or an element of network.snapshots on which to run the power flow, defaults to network.snapshots
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values and finding bus controls.
x_tol (float) – Tolerance for Newton-Raphson power flow.
use_seed (bool, default False) – Use a seed for the initial guess for the Newton-Raphson algorithm.
distribute_slack (bool, default False) – If True, distribute the slack power across generators proportional to generator dispatch by default or according to the distribution scheme provided in slack_weights. If False, only the slack generator takes up the slack.
slack_weights (pandas.Series|str, default 'p_set') – Distribution scheme describing how to determine the fraction of the total slack power a bus of the sub-network takes up. Default is to distribute proportional to generator dispatch ('p_set'). Another option is to distribute proportional to (optimised) nominal capacity ('p_nom' or 'p_nom_opt'). Custom weights can be provided via a pandas.Series/dict that has the buses or the generators of the sub-network as index/keys. When using custom weights with buses as index/keys, the slack power of a bus is distributed among its generators in proportion to their nominal capacity (p_nom) if given, otherwise evenly.
- Returns
Tuple of three pandas.Series indicating number of iterations, remaining error, and convergence status for each snapshot
- pypsa.pf.sub_network_pf_singlebus(sub_network, snapshots=None, skip_pre=False, distribute_slack=False, slack_weights='p_set', linear=False)
Non-linear power flow for a sub-network consisting of a single bus.
- Parameters
snapshots (list-like|single snapshot) – A subset or an element of network.snapshots on which to run the power flow, defaults to network.snapshots
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values and finding bus controls.
distribute_slack (bool, default False) – If True, distribute the slack power across generators proportional to generator dispatch by default or according to the distribution scheme provided in slack_weights. If False, only the slack generator takes up the slack.
slack_weights (pandas.Series|str, default 'p_set') – Distribution scheme describing how to determine the fraction of the total slack power a bus of the sub-network takes up. Default is to distribute proportional to generator dispatch ('p_set'). Another option is to distribute proportional to (optimised) nominal capacity ('p_nom' or 'p_nom_opt'). Custom weights can be provided via a pandas.Series/dict that has the generators of the single bus as index/keys.
- pypsa.pf.wye_to_delta(z1, z2, z3)
Linopy Optimisation Module
abstract.py
Build abstracted, extended optimisation problems from PyPSA networks with Linopy.
- pypsa.optimization.abstract.optimize_security_constrained(n, snapshots=None, branch_outages=None, multi_investment_periods=False, model_kwargs={}, **kwargs)
Computes Security-Constrained Linear Optimal Power Flow (SCLOPF).
This ensures that no branch is overloaded even given the branch outages.
- Parameters
n (pypsa.Network) –
snapshots (list-like, optional) – Set of snapshots to consider in the optimization. The default is None.
branch_outages (list-like/pandas.Index/pandas.MultiIndex, optional) – Subset of passive branches to consider as possible outages. If a list or a pandas.Index is passed, it is assumed to identify lines. If a multiindex is passed, its first level has to contain the component names, the second the assets. The default None results in all passive branches to be considered.
multi_investment_periods (bool, default False) – Whether to optimise as a single investment period or to optimise in multiple investment periods. Then, snapshots should be a pd.MultiIndex.
model_kwargs (dict) – Keyword arguments used by linopy.Model, such as solver_dir or chunk.
**kwargs – Keyword argument used by linopy.Model.solve, such as solver_name, problem_fn or solver options directly passed to the solver.
- Return type
None
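A hypothetical sketch calling the function directly on a network n (the choice of outages and solver keyword is illustrative):
>>> from pypsa.optimization.abstract import optimize_security_constrained
>>> outages = n.lines.index[:2]
>>> optimize_security_constrained(n, branch_outages=outages, solver_name='glpk')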
- pypsa.optimization.abstract.optimize_transmission_expansion_iteratively(n, snapshots=None, msq_threshold=0.05, min_iterations=1, max_iterations=100, track_iterations=False, **kwargs)
Iterative linear optimization updating the line parameters for passive AC and DC lines. This is helpful when line expansion is enabled. After each successful solve, line impedances and line resistances are recalculated based on the optimization result. If warmstart is possible, the result from the previous iteration is used to speed up the optimization.
- Parameters
snapshots (list or index slice) – A list of snapshots to optimise, must be a subset of network.snapshots, defaults to network.snapshots
msq_threshold (float, default 0.05) – Maximal mean square difference between the optimized line capacities of the current and the previous iteration. As soon as this threshold is undercut and the number of iterations exceeds 'min_iterations', the iterative optimization stops.
min_iterations (integer, default 1) – Minimal number of iterations to run regardless of whether the msq_threshold is already undercut
max_iterations (integer, default 100) – Maximal number of iterations to run regardless of whether the msq_threshold is already undercut
track_iterations (bool, default False) – If True, the intermediate branch capacities and values of the objective function are recorded for each iteration. The values of iteration 0 represent the initial state.
**kwargs – Keyword arguments of the n.optimize function which runs at each iteration
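A hypothetical sketch, assuming extendable lines on a network n:
>>> from pypsa.optimization.abstract import optimize_transmission_expansion_iteratively
>>> n.lines['s_nom_extendable'] = True
>>> optimize_transmission_expansion_iteratively(n, max_iterations=4, track_iterations=True)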
common.py
compat.py
constraints.py
global_constraints.py
optimize.py
variables.py
Optimisation Module
Tools for fast Linear Problem file writing. This module contains:
- io functions for writing out variables, constraints and objective into an lp file
- functions to create lp-format-based linear expressions
- solver functions which read the lp file, run the problem and return the solution
This module supports the linear optimal power flow calculation without using pyomo (see module linopt.py)
- pypsa.linopt.align_with_static_component(n, c, attr)
Alignment of time-dependent variables with static components.
If c is a pypsa.component name, it will sort the columns of the variable according to the static component.
- pypsa.linopt.broadcasted_axes(*dfs)
Helper function which, from a collection of arrays, series, frames and other values, retrieves the axes of series and frames which result from broadcasting operations.
It checks whether index and columns of given series and frames, respectively, are aligned. Using this function allows one to subsequently use pure numpy operations and keep the axes in the background.
- pypsa.linopt.define_binaries(n, axes, name, attr='', spec='', mask=None)
Defines binary variable(s) for pypsa-network. The variables are stored in the network object under n.vars with key of the variable name. For each entry of the pd.Series or pd.DataFrame spanned by the axes argument the function defines a binary.
- Parameters
n (pypsa.Network) –
axes (pd.Index or tuple of pd.Index objects) – Specifies the axes and therefore the shape of the variables.
name (str) –
general name of the variable (or component which the variable is referring to). The variable will then be stored under:
n.vars[name].pnl if the variable is two-dimensional
n.vars[name].df if the variable is one-dimensional
attr (str default '') – Specifying name of the variable, defines under which name the variable(s) are stored in n.vars[name].pnl if two-dimensional or in n.vars[name].df if one-dimensional
mask (pd.DataFrame/np.array) – Boolean mask with False values for variables which are skipped. The shape of the mask has to match the shape given by axes.
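Like define_variables and define_constraints below, this is meant to be called while the linear problem is being written, e.g. inside an extra_functionality callback of n.lopf(pyomo=False); a hypothetical sketch (the attribute name 'status' is made up):
>>> from pypsa.linopt import define_binaries, get_var
>>> def extra_functionality(n, snapshots):
...     axes = (snapshots, n.generators.index)
...     define_binaries(n, axes, 'Generator', 'status')
...     status = get_var(n, 'Generator', 'status')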
- pypsa.linopt.define_constraints(n, lhs, sense, rhs, name, attr='', axes=None, spec='', mask=None)
Defines constraint(s) for pypsa-network with given left hand side (lhs), sense and right hand side (rhs). The constraints are stored in the network object under n.cons with key of the constraint name. If multiple constraints are defined at once, only using np.arrays, then the axes argument can be used for defining the axes for the constraints (this is especially recommended for time-dependent constraints). If one of lhs, sense and rhs is a pd.Series/pd.DataFrame, the axes argument is not necessary.
- Parameters
n (pypsa.Network) –
lhs (pd.Series/pd.DataFrame/np.array/str/float) – left hand side of the constraint(s), created with pypsa.linopt.linexpr()
sense (pd.Series/pd.DataFrame/np.array/str/float) – sense(s) of the constraint(s)
rhs (pd.Series/pd.DataFrame/np.array/str/float) – right hand side of the constraint(s), must only contain pure constants, no variables
name (str) –
general name of the constraint (or component which the constraint is referring to). The constraint will then be stored under:
n.cons[name].pnl if the constraint is two-dimensional
n.cons[name].df if the constraint is one-dimensional
attr (str default '') – Specifying name of the constraint, defines under which name the constraint(s) are stored in n.cons[name].pnl if two-dimensional or in n.cons[name].df if one-dimensional
axes (pd.Index or tuple of pd.Index objects, default None) – Specifies the axes if all of lhs, sense and rhs are np.arrays or single strings or floats.
mask (pd.DataFrame/np.array) – Boolean mask with False values for constraints which are skipped. The shape of the mask has to match the shape of the array that comes out when combining lhs, sense and rhs.
Example
Let's say we want to constrain all gas generators to a maximum of 100 MWh during the first 10 snapshots. We first get all operational variables for this subset and constrain their sum to be less than or equal to 100.
>>> from pypsa.linopt import get_var, linexpr, define_constraints
>>> gas_i = n.generators.query('carrier == "Natural Gas"').index
>>> gas_vars = get_var(n, 'Generator', 'p').loc[n.snapshots[:10], gas_i]
>>> lhs = linexpr((1, gas_vars)).sum().sum()
>>> define_constraints(n, lhs, '<=', 100, 'Generator', 'gas_power_limit')
Now the constraint references can be accessed by pypsa.linopt.get_con() using
>>> cons = get_con(n, 'Generator', 'gas_power_limit')
Under the hood they are stored in n.cons.Generator.pnl.gas_power_limit. For retrieving their shadow prices add the general name of the constraint to the keep_shadowprices argument.
Note that this is useful for the extra_functionality argument.
- pypsa.linopt.define_variables(n, lower, upper, name, attr='', axes=None, spec='', mask=None)
Defines variable(s) for pypsa-network with given lower bound(s) and upper bound(s). The variables are stored in the network object under n.vars with key of the variable name. If multiple variables are defined at once, at least one of lower and upper has to be an array (including pandas) of shape > (1,) or axes have to define the dimensions of the variables.
- Parameters
n (pypsa.Network) –
lower (pd.Series/pd.DataFrame/np.array/str/float) – lower bound(s) for the variable(s)
upper (pd.Series/pd.DataFrame/np.array/str/float) – upper bound(s) for the variable(s)
name (str) –
general name of the variable (or component which the variable is referring to). The variable will then be stored under:
n.vars[name].pnl if the variable is two-dimensional
n.vars[name].df if the variable is one-dimensional
but can easily be accessed with get_var(n, name, attr)
attr (str default '') – Specifying name of the variable, defines under which name the variable(s) are stored in n.vars[name].pnl if two-dimensional or in n.vars[name].df if one-dimensional
axes (pd.Index or tuple of pd.Index objects, default None) – Specifies the axes and therefore the shape of the variables if bounds are single strings or floats. This is helpful when multiple variables have the same upper and lower bound.
mask (pd.DataFrame/np.array) – Boolean mask with False values for variables which are skipped. The shape of the mask has to match the shape of the added variables.
Example
Let's say we want to define a demand-side-managed load at each bus of network n, which has a minimum of 0 and a maximum of 10. We then define the lower bound (lb) and upper bound (ub) and pass them to define_variables:
>>> import pandas as pd
>>> from pypsa.linopt import define_variables, get_var
>>> lb = pd.DataFrame(0, index=n.snapshots, columns=n.buses.index)
>>> ub = pd.DataFrame(10, index=n.snapshots, columns=n.buses.index)
>>> define_variables(n, lb, ub, 'DSM', 'variableload')
Now the variables can be accessed by pypsa.linopt.get_var() using
>>> variables = get_var(n, 'DSM', 'variableload')
Note that this is useful for the extra_functionality argument.
- pypsa.linopt.get_con(n, c, attr, pop=False)
Retrieves constraint references for a given static or time-dependent attribute of a given component.
- Parameters
Example
get_con(n, 'Generator', 'mu_upper')
- pypsa.linopt.get_dual(n, name, attr='')
Retrieves shadow price for a given constraint. Note that for retrieving shadow prices of a custom constraint, its name has to be passed to keep_references in the lopf, or keep_references has to be set to True. Note that a lookup of all stored shadow prices is given in n.dualvalues.
- Parameters
Example
get_dual(n, 'Generator', 'mu_upper')
- pypsa.linopt.get_sol(n, name, attr='')
Retrieves solution for a given variable. Note that a lookup of all stored solutions is given in n.solutions.
- Parameters
Example
get_sol(n, 'Generator', 'p')
- pypsa.linopt.get_var(n, c, attr, pop=False)
Retrieves variable references for a given static or time-dependent attribute of a given component. The function looks into n.variables to detect whether the variable is time-dependent or static.
- Parameters
Example
>>> get_var(n, 'Generator', 'p')
- pypsa.linopt.join_exprs(df)
Helper function to join arrays, series or frames of strings together.
- pypsa.linopt.linexpr(*tuples, as_pandas=True, return_axes=False)
Elementwise concatenation of tuples in the form (coefficient, variables). Coefficient and variables can be arrays, series or frames. By default returns a pandas.Series or pandas.DataFrame of strings. If return_axes is set to True, the return value is split into values and axes, where values are the numpy.array and axes a tuple containing index and columns if present.
- Parameters
tuples (tuple of tuples) –
Each tuple must be of the form (coeff, var), where
coeff is a numerical value, or a numerical array, series or frame
var is a str or an array, series or frame of variable strings
as_pandas (bool, default True) – Whether to return the resulting array as a series, if 1-dimensional, or a frame, if 2-dimensional. Supersedes the return_axes argument.
return_axes (Boolean, default False) – Whether to return index and columns (if existent)
Example
Initialize coefficients and variables
>>> import pandas as pd
>>> coeff1 = 1
>>> var1 = pd.Series(['a1', 'a2', 'a3'])
>>> coeff2 = pd.Series([-0.5, -0.3, -1])
>>> var2 = pd.Series(['b1', 'b2', 'b3'])
Create the linear expression strings
>>> linexpr((coeff1, var1), (coeff2, var2))
0    +1.0 a1 -0.5 b1
1    +1.0 a2 -0.3 b2
2    +1.0 a3 -1.0 b3
dtype: object
For a further step the resulting frame can be used as the lhs of pypsa.linopt.define_constraints()
For retrieving only the values:
>>> linexpr((coeff1, var1), (coeff2, var2), as_pandas=False)
array(['+1.0 a1 -0.5 b1', '+1.0 a2 -0.3 b2', '+1.0 a3 -1.0 b3'], dtype=object)
- pypsa.linopt.run_and_read_cbc(n, problem_fn, solution_fn, solver_logfile, solver_options, warmstart=None, store_basis=True)
Solving function. Reads the linear problem file and passes it to the cbc solver. If the solution is successful it returns variable solutions and constraint dual values.
For more information on the solver options, run ‘cbc’ in your shell
- pypsa.linopt.run_and_read_cplex(n, problem_fn, solution_fn, solver_logfile, solver_options, warmstart=None, store_basis=True)
Solving function.
Reads the linear problem file and passes it to the cplex solver. If the solution is successful it returns variable solutions and constraint dual values. Cplex must be installed for using this function
- pypsa.linopt.run_and_read_glpk(n, problem_fn, solution_fn, solver_logfile, solver_options, warmstart=None, store_basis=True)
Solving function. Reads the linear problem file and passes it to the glpk solver. If the solution is successful it returns variable solutions and constraint dual values.
For more information on the glpk solver options: https://kam.mff.cuni.cz/~elias/glpk.pdf
- pypsa.linopt.run_and_read_gurobi(n, problem_fn, solution_fn, solver_logfile, solver_options, warmstart=None, store_basis=True)
Solving function. Reads the linear problem file and passes it to the gurobi solver. If the solution is successful it returns variable solutions and constraint dual values. Gurobipy must be installed for using this function.
For more information on solver options: https://www.gurobi.com/documentation/{gurobi_verion}/refman/parameter_descriptions.html
- pypsa.linopt.run_and_read_highs(n, problem_fn, solution_fn, solver_logfile, solver_options={}, warmstart=None, store_basis=True)
Highs solver function. Reads a linear problem file and passes it to the highs solver. If the solution is feasible the function returns the objective, solution and dual constraint variables. Highs must be installed for usage. Documentation: https://www.maths.ed.ac.uk/hall/HiGHS/
Notes
- The script might only work for version HiGHS 1.1.1. Installation steps::
sudo apt-get install cmake  # if not installed
git clone git@github.com:ERGO-Code/HiGHS.git
cd HiGHS
git checkout 95342daa73543cc21e5b27db3e0fbf7330007541  # moves to HiGHS 1.1.1
mkdir build
cd build
cmake ..
make
ctest
- Then in .bashrc add paths of executables and library ::
export PATH="${PATH}:/foo/HiGHS/build/bin"
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH}:/foo/HiGHS/build/lib"
source .bashrc
- Now when typing highs in the terminal you should see something like::
Running HiGHS 1.1.1 [date: 2021-11-14, git hash: 95342daa]
The function reads and executes (via subprocess.Popen, ...) terminal commands of the solver, so the same commands can also be executed in your command window/terminal if HiGHS is installed. Executing the commands on your local terminal helps to identify the raw outputs that are useful for developing the interface further.
Everything below the "process = ..." line only reads and saves the outputs generated by the HiGHS solver. These parts are solver specific and depend on the solver output.
Solver options are read from 1) the command line and 2) the options_file.txt.
1) An example list of solver options accepted on the command line:
--model_file arg      File of model to solve.
--presolve arg        Presolve: "choose" by default - "on"/"off" are alternatives.
--solver arg          Solver: "choose" by default - "simplex"/"ipm" are alternatives.
--parallel arg        Parallel solve: "choose" by default - "on"/"off" are alternatives.
--time_limit arg      Run time limit (double).
--options_file arg    File containing HiGHS options.
-h, --help            Print help.
2) The options_file.txt gives some more options, see a full list here: https://www.maths.ed.ac.uk/hall/HiGHS/HighsOptions.set By default, we insert a couple of options for the ipm solver. The dictionary can be overwritten by simply giving the new values. For instance, you could write a dictionary replacing some of the default values or adding new options:
solver_options = {
    name: highs,
    method: ipm,
    parallel: "on",
    <option_name>: <value>,
}
Note that <option_name> and <value> must follow the naming convention of HiGHS. Some options exist that are not documented; check their GitHub file: https://github.com/ERGO-Code/HiGHS/blob/master/src/lp_data/HighsOptions.h
- Returns
status (string) – "ok" or "warning"
termination_condition (string) – Contains "optimal", "infeasible",
variables_sol (series)
constraints_dual (series)
objective (float)
- pypsa.linopt.run_and_read_xpress(n, problem_fn, solution_fn, solver_logfile, solver_options, keep_files, warmstart=None, store_basis=True)
Solving function. Reads the linear problem file and passes it to the Xpress solver. If the solution is successful it returns variable solutions and constraint dual values. The xpress module must be installed for using this function.
For more information on solver options: https://www.fico.com/fico-xpress-optimization/docs/latest/solver/GUID-ACD7E60C-7852-36B7-A78A-CED0EA291CDD.html
- pypsa.linopt.set_conref(n, constraints, c, attr, spec='')
Sets constraint references to the network. One-dimensional constraint references will be collected at n.cons[c].df, two-dimensional in n.cons[c].pnl For example:
constraints for nominal capacity variables for generators are stored in n.cons.Generator.df.mu_upper
operational capacity limits for generators are stored in n.cons.Generator.pnl.mu_upper
- pypsa.linopt.set_varref(n, variables, c, attr, spec='')
Sets variable references to the network. One-dimensional variable references will be collected at n.vars[c].df, two-dimensional variables in n.vars[c].pnl. For example:
nominal capacity variables for generators are stored in n.vars.Generator.df.p_nom
operational variables for generators are stored in n.vars.Generator.pnl.p
- pypsa.linopt.to_pandas(array, *axes)
Convert a numpy array to pandas.Series if 1-dimensional or to a pandas.DataFrame if 2-dimensional.
Provide index and columns if needed
- pypsa.linopt.write_binary(n, axes, mask=None)
Writer function for writing out multiple binary-variables at a time.
According to the axes it writes out binaries for each entry of the pd.Series or pd.DataFrame spanned by axes. Returns a series or frame with variable references.
- pypsa.linopt.write_bound(n, lower, upper, axes=None, mask=None)
Writer function for writing out multiple variables at a time.
If lower and upper are floats, axes must be passed as a tuple of (index, columns) or (index) for creating the variables with the same upper and lower bounds. Return a series or frame with variable references.
- pypsa.linopt.write_constraint(n, lhs, sense, rhs, axes=None, mask=None)
Writer function for writing out multiple constraints to the corresponding constraints file.
If lhs, sense and rhs are numpy.ndarrays, axes must not be None but a tuple of (index, columns) or (index). Return a series or frame with constraint references.
Power System Optimisation
Pyomo Optimisation Module
Tools for fast Pyomo linear problem building.
Essentially this library replaces Pyomo expressions with more strict objects with a pre-defined affine structure.
- class pypsa.opt.LConstraint(lhs=None, sense='==', rhs=None)
Constraint of optimisation variables.
Linear constraint of the form:
lhs sense rhs
- Parameters
lhs (LExpression) –
sense (string) –
rhs (LExpression) –
- class pypsa.opt.LExpression(variables=None, constant=0.0)
Affine expression of optimisation variables.
Affine expression of the form:
constant + coeff1*var1 + coeff2*var2 + ….
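A small sketch of how these two classes combine; the pyomo variable model.generator_p and its index are placeholders:
>>> from pypsa.opt import LExpression, LConstraint
>>> lhs = LExpression([(1.0, model.generator_p['gas', snapshot])])
>>> rhs = LExpression(constant=100.0)
>>> con = LConstraint(lhs, '<=', rhs)  # represents lhs <= rhs without building a full pyomo expression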
- pypsa.opt.l_constraint(model, name, constraints, *args)
A replacement for pyomo’s Constraint that quickly builds linear constraints.
Instead of
model.name = Constraint(index1, index2, …, rule=f)
call instead
l_constraint(model, name, constraints, index1, index2, …)
where constraints is a dictionary of constraints of the form:
constraints[i] = LConstraint object
OR using the soon-to-be-deprecated list format:
constraints[i] = [[(coeff1, var1), (coeff2, var2), …], sense, constant_term]
i.e. the first argument is a list of tuples with the variables and their coefficients, the second argument is the sense string (must be one of “==”, “<=”, “>=”, “><”) and the third argument is the constant term (a float). The sense “><” allows lower and upper bounds and requires constant_term to be a 2-tuple.
Variables may be repeated with different coefficients, which pyomo will sum up.
- Parameters
model (pyomo.environ.ConcreteModel) –
name (string) – Name of constraints to be constructed
constraints (dict) – A dictionary of constraints (see format above)
*args – Indices of the constraints
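Putting it together, a hypothetical dictionary of LConstraint objects indexed by generator and snapshot could be passed like this (gens, snapshots and model.generator_p are placeholders):
>>> constraints = {}
>>> for gen in gens:
...     for sn in snapshots:
...         lhs = LExpression([(1.0, model.generator_p[gen, sn])])
...         constraints[(gen, sn)] = LConstraint(lhs, '<=', LExpression(constant=100.0))
>>> l_constraint(model, 'generator_p_limit', constraints, list(gens), list(snapshots))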
- pypsa.opt.l_objective(model, objective=None, sense=1)
A replacement for pyomo’s Objective that quickly builds linear objectives.
Instead of
model.objective = Objective(expr=sum(vars[i]*coeffs[i] for i in index)+constant)
call instead
l_objective(model, objective, sense)
where objective is an LExpression.
Variables may be repeated with different coefficients, which pyomo will sum up.
- Parameters
model (pyomo.environ.ConcreteModel) –
objective (LExpression) –
sense (minimize / maximize) –
Pyomo Power System Optimisation
Optimal Power Flow functions.
- pypsa.opf.define_nodal_balances(network, snapshots)
Construct the nodal balance for all elements except the passive branches.
Store the nodal balance expression in network._p_balance.
- pypsa.opf.define_passive_branch_flows_with_kirchhoff(network, snapshots, skip_vars=False)
Define passive branch flows with the Kirchhoff method.
- pypsa.opf.define_sub_network_cycle_constraints(subnetwork, snapshots, passive_branch_p, attribute)
Constructs cycle_constraints for a particular subnetwork.
- pypsa.opf.network_lopf(network, snapshots=None, solver_name='glpk', solver_io=None, skip_pre=False, extra_functionality=None, multi_investment_periods=False, solver_logfile=None, solver_options={}, keep_files=False, formulation='angles', ptdf_tolerance=0.0, free_memory={}, extra_postprocessing=None)
Linear optimal power flow for a group of snapshots.
- Parameters
snapshots (list or index slice) – A list of snapshots to optimise, must be a subset of network.snapshots, defaults to network.snapshots
solver_name (string) – Must be a solver name that pyomo recognises and that is installed, e.g. “glpk”, “gurobi”
solver_io (string, default None) – Solver Input-Output option, e.g. “python” to use “gurobipy” for solver_name=”gurobi”
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values and finding bus controls.
extra_functionality (callable function) – This function must take two arguments extra_functionality(network, snapshots) and is called after the model building is complete, but before it is sent to the solver. It allows the user to add/change constraints and add/change the objective function.
solver_logfile (None|string) – If not None, sets the logfile option of the solver.
solver_options (dictionary) – A dictionary with additional options that get passed to the solver. (e.g. {‘threads’:2} tells gurobi to use only 2 cpus)
keep_files (bool, default False) – Keep the files that pyomo constructs from OPF problem construction, e.g. .lp file - useful for debugging
formulation (string) – Formulation of the linear power flow equations to use; must be one of [“angles”, “cycles”, “kirchhoff”, “ptdf”]
ptdf_tolerance (float) – Value below which PTDF entries are ignored
free_memory (set, default {'pyomo'}) – Any subset of {‘pypsa’, ‘pyomo’}. Allows to stash pypsa time-series data away while the solver runs (as a pickle to disk) and/or free pyomo data after the solution has been extracted.
extra_postprocessing (callable function) – This function must take three arguments extra_postprocessing(network, snapshots, duals) and is called after the model has solved and the results are extracted. It allows the user to extract further information about the solution, such as additional shadow prices.
- Return type
None
- pypsa.opf.network_lopf_build_model(network, snapshots=None, skip_pre=False, formulation='angles', ptdf_tolerance=0.0)
Build pyomo model for linear optimal power flow for a group of snapshots.
- Parameters
snapshots (list or index slice) – A list of snapshots to optimise, must be a subset of network.snapshots, defaults to network.snapshots
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values and finding bus controls.
formulation (string) – Formulation of the linear power flow equations to use; must be one of [“angles”, “cycles”, “kirchhoff”, “ptdf”]
ptdf_tolerance (float) – Value below which PTDF entries are ignored
- Return type
network.model
- pypsa.opf.network_lopf_prepare_solver(network, solver_name='glpk', solver_io=None)
Prepare solver for linear optimal power flow.
- Parameters
solver_name (string) – Must be a solver name that pyomo recognises and that is installed, e.g. “glpk”, “gurobi”
solver_io (string, default None) – Solver Input-Output option, e.g. “python” to use “gurobipy” for solver_name=”gurobi”
- Return type
None
- pypsa.opf.network_lopf_solve(network, snapshots=None, formulation='angles', solver_options={}, solver_logfile=None, keep_files=False, free_memory={'pyomo'}, extra_postprocessing=None)
Solve linear optimal power flow for a group of snapshots and extract results.
- Parameters
snapshots (list or index slice) – A list of snapshots to optimise, must be a subset of network.snapshots, defaults to network.snapshots
formulation (string) – Formulation of the linear power flow equations to use; must be one of [“angles”, “cycles”, “kirchhoff”, “ptdf”]; must match formulation used for building the model.
solver_options (dictionary) – A dictionary with additional options that get passed to the solver. (e.g. {‘threads’:2} tells gurobi to use only 2 cpus)
solver_logfile (None|string) – If not None, sets the logfile option of the solver.
keep_files (bool, default False) – Keep the files that pyomo constructs from OPF problem construction, e.g. .lp file - useful for debugging
free_memory (set, default {'pyomo'}) – Any subset of {‘pypsa’, ‘pyomo’}. Allows to stash pypsa time-series data away while the solver runs (as a pickle to disk) and/or free pyomo data after the solution has been extracted.
extra_postprocessing (callable function) – This function must take three arguments extra_postprocessing(network, snapshots, duals) and is called after the model has solved and the results are extracted. It allows the user to extract further information about the solution, such as additional shadow prices.
- Return type
None
- pypsa.opf.network_opf(network, snapshots=None)
Optimal power flow for snapshots.
Contingency Analysis
Functionality for contingency analysis, such as branch outages.
- pypsa.contingency.calculate_BODF(sub_network, skip_pre=False)
Calculate the Branch Outage Distribution Factor (BODF) for sub_network.
Sets sub_network.BODF as a (dense) numpy array.
The BODF is a num_branch x num_branch 2d array.
For the outage of branch l, the new flow on branch k is given in terms of the flow before the outage
f_k^after = f_k^before + BODF_{kl} f_l^before
Note that BODF_{ll} = -1.
- Parameters
sub_network (pypsa.SubNetwork) –
skip_pre (bool, default False) – Skip the preliminary step of computing the PTDF.
Examples
>>> sub_network.calculate_BODF()
- pypsa.contingency.network_lpf_contingency(network, snapshots=None, branch_outages=None)
Computes linear power flow for a selection of branch outages.
- Parameters
snapshots (list-like|single snapshot) – A subset or an element of network.snapshots on which to run the power flow, defaults to network.snapshots. NB: currently this only works for a single snapshot
branch_outages (list-like) – A list of passive branches which are to be tested for outages. If None, it's taken as all network.passive_branches_i()
- Returns
p0 – num_passive_branch x num_branch_outages DataFrame of new power flows
- Return type
pandas.DataFrame
Examples
>>> network.lpf_contingency(snapshot, branch_outages)
- pypsa.contingency.network_sclopf(network, snapshots=None, branch_outages=None, solver_name='glpk', pyomo=True, skip_pre=False, extra_functionality=None, solver_options={}, keep_files=False, formulation='kirchhoff', ptdf_tolerance=0.0)
Computes Security-Constrained Linear Optimal Power Flow (SCLOPF).
This ensures that no branch is overloaded even given the branch outages.
- Parameters
snapshots (list or index slice) – A list of snapshots to optimise, must be a subset of network.snapshots, defaults to network.snapshots
branch_outages (list-like) – A list of passive branches which are to be tested for outages. If None, it's taken as all network.passive_branches_i()
solver_name (string) – Must be a solver name that pyomo recognises and that is installed, e.g. “glpk”, “gurobi”
pyomo (bool, default True) – Whether to use pyomo for building and solving the model, setting this to False saves a lot of memory and time.
skip_pre (bool, default False) – Skip the preliminary steps of computing topology, calculating dependent values and finding bus controls.
extra_functionality (callable function) – This function must take two arguments extra_functionality(network, snapshots) and is called after the model building is complete, but before it is sent to the solver. It allows the user to add/change constraints and add/change the objective function.
solver_options (dictionary) – A dictionary with additional options that get passed to the solver. (e.g. {‘threads’:2} tells gurobi to use only 2 cpus)
keep_files (bool, default False) – Keep the files that pyomo constructs from OPF problem construction, e.g. .lp file - useful for debugging
formulation (string, default "kirchhoff") – Formulation of the linear power flow equations to use; must be one of ["angles", "cycles", "kirchhoff", "ptdf"]
ptdf_tolerance (float) –
- Return type
None
Examples
>>> network.sclopf(branch_outages=branch_outages)