secure_learning.models.secure_logistic module
Implementation of a logistic regression model.
- class secure_learning.models.secure_logistic.ClassWeightsTypes(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]
Bases: Enum
Class to store whether class weights are equal or balanced.
- BALANCED = 2
- EQUAL = 1
- class secure_learning.models.secure_logistic.ExponentiationTypes(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]
Bases: Enum
Class to store whether exponentiations are approximated or calculated exactly.
- APPROX = 2
- EXACT = 3
- NONE = 1
- class secure_learning.models.secure_logistic.Logistic(solver_type=SolverTypes.GD, exponentiation=ExponentiationTypes.EXACT, penalty=PenaltyTypes.NONE, class_weights_type=ClassWeightsTypes.EQUAL, **penalty_args)[source]
Bases: Model
Solver for logistic regression. Optimizes a model with objective function
\[\frac{1}{2 n_{\textrm{samples}}} \sum_{i=1}^{n_{\textrm{samples}}} \left( -(1+y_i) \log(h_w(x_i)) - (1-y_i) \log(1-h_w(x_i)) \right)\]Here,
\[h_w(x) = \frac{1}{1 + e^{-w^T x}}\]Labels \(y_i\) are assumed to have value \(-1\) or \(1\).
The gradient is given by:
\[g(X, y, w) = \frac{1}{2 n_{\textrm{samples}}} \sum_{i=1}^{n_{\textrm{samples}}} x_i^T \left( (2 h_w(x_i) - 1) - y_i \right)\]See secure_model.py docstrings for more information on solver types and penalties.
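As a plaintext illustration only (written from the objective formula above, with labels \(y_i \in \{-1, +1\}\); this sketch is not part of this library and runs outside any MPC context), the objective can be evaluated as:

```python
import math

def sigmoid(z):
    # h_w(x) = 1 / (1 + e^{-w^T x})
    return 1.0 / (1.0 + math.exp(-z))

def logistic_objective(X, y, w):
    # (1 / (2 n)) * sum_i [ -(1 + y_i) log(h) - (1 - y_i) log(1 - h) ]
    # with h = h_w(x_i) and labels y_i in {-1, +1}.
    n = len(X)
    total = 0.0
    for xi, yi in zip(X, y):
        h = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
        total += -(1 + yi) * math.log(h) - (1 - yi) * math.log(1 - h)
    return total / (2 * n)
```

For example, with a zero coefficient vector every prediction is \(h_w(x) = 0.5\), so a single positively labelled sample yields an objective value of \(\log 2\).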
- __init__(solver_type=SolverTypes.GD, exponentiation=ExponentiationTypes.EXACT, penalty=PenaltyTypes.NONE, class_weights_type=ClassWeightsTypes.EQUAL, **penalty_args)[source]
Constructor method.
- Parameters:
  - solver_type (SolverTypes) – Solver type to use (e.g. gradient descent, aka GD)
  - exponentiation (ExponentiationTypes) – Choose whether exponentiations are approximated or calculated exactly
  - penalty (PenaltyTypes) – Choose whether to use an L1, L2 or no penalty
  - class_weights_type (ClassWeightsTypes) – Class weights type, either balanced or equal
  - penalty_args (float) – Necessary arguments for the chosen penalty
- Raises:
  ValueError – Raised when exponentiation is of the wrong type.
- class_weights = None
- class_weights_type = 1
- gradient_function(X, y, coef_, grad_per_sample)[source]
Evaluate the gradient from the given parameters.
- Parameters:
  - X (List[List[SecureFixedPoint]]) – Independent variables
  - y (List[SecureFixedPoint]) – Dependent variables
  - coef_ – Current coefficient vector
  - grad_per_sample (bool) – Return a list with the gradient per sample instead of the aggregated (summed) gradient
- Return type:
  Union[List[List[SecureFixedPoint]], List[SecureFixedPoint]]
- Returns:
  Gradient of the objective function as specified in the class docstring, evaluated at the provided parameters
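A plaintext analogue of this computation, written directly from the gradient formula in the class docstring, might look as follows. This is a sketch, not the library's secure implementation: the \(1/(2 n_{\textrm{samples}})\) scaling of the per-sample terms is an assumption, and real inputs would be SecureFixedPoint values rather than floats.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gradient_function(X, y, coef_, grad_per_sample=False):
    # Per-sample term: x_i * ((2 h_w(x_i) - 1) - y_i), scaled by 1 / (2 n).
    n = len(X)
    per_sample = []
    for xi, yi in zip(X, y):
        h = sigmoid(sum(c * xj for c, xj in zip(coef_, xi)))
        r = (2.0 * h - 1.0) - yi
        per_sample.append([xj * r / (2.0 * n) for xj in xi])
    if grad_per_sample:
        return per_sample
    # Aggregate: element-wise sum over samples
    return [sum(col) for col in zip(*per_sample)]
```

With a zero coefficient vector, \(h_w(x_i) = 0.5\) for every sample, so each per-sample term reduces to \(-x_i y_i / (2n)\).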
- name = 'Logistic regression'
- static predict(X, coef_, prob=0.5, **_kwargs)[source]
Predicts labels for input data to the classification model. Label \(-1\) is assigned if the predicted probability is less than prob; otherwise label \(+1\) is assigned.
- Parameters:
  - X (List[List[SecureFixedPoint]]) – Input data with all features
  - coef_ – Coefficient vector of the classification model
  - prob (float) – Threshold for labelling. Defaults to \(0.5\).
  - _kwargs (None) – Not used
- Return type:
  List[SecureFixedPoint]
- Returns:
  Target labels of the classification model
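The thresholding rule above can be sketched in plaintext as follows (an illustration only, assuming float inputs instead of the library's SecureFixedPoint values):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(X, coef_, prob=0.5):
    # Assign label -1 when the predicted probability h_w(x) is below
    # the threshold `prob`, and +1 otherwise.
    labels = []
    for xi in X:
        h = sigmoid(sum(c * xj for c, xj in zip(coef_, xi)))
        labels.append(-1 if h < prob else 1)
    return labels
```

For instance, with a single coefficient \(w = 1\), a sample at \(x = 2\) has \(h_w(x) \approx 0.88\) and receives label \(+1\), while \(x = -2\) has \(h_w(x) \approx 0.12\) and receives \(-1\).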