tf.contrib.constrained_optimization.AdditiveSwapRegretOptimizer
A ConstrainedOptimizer based on swap-regret minimization.
tf.contrib.constrained_optimization.AdditiveSwapRegretOptimizer(optimizer, constraint_optimizer=None)
This ConstrainedOptimizer uses the given tf.compat.v1.train.Optimizers to jointly minimize over the model parameters, and to maximize over the constraint/objective weight matrix (the analogue of Lagrange multipliers), with the latter maximization using additive updates and an algorithm that minimizes swap regret.
For more specifics, please refer to:
Cotter, Jiang and Sridharan. "Two-Player Games for Efficient Non-Convex Constrained Optimization". https://arxiv.org/abs/1804.06500
The formulation used by this optimizer can be found in Definition 2, and is discussed in Section 4. It is most similar to Algorithm 2 in Section 4, the differences being that it uses tf.compat.v1.train.Optimizers, instead of SGD, for the "inner" updates, and that it performs additive (instead of multiplicative) updates of the stochastic matrix.
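As a rough illustration of what an additive update on a stochastic matrix looks like, the NumPy sketch below takes one gradient-ascent-style step on a left-stochastic state matrix and then projects each column back onto the probability simplex. This is only a schematic of the idea from the paper, not the library's implementation; the step size eta, the matrix size, and the placeholder gradient are illustrative assumptions.

```python
import numpy as np

def project_columns_onto_simplex(matrix):
    # Euclidean projection of each column onto the probability simplex,
    # so the matrix stays left-stochastic (columns sum to one).
    projected = np.empty_like(matrix)
    for j in range(matrix.shape[1]):
        v = matrix[:, j]
        u = np.sort(v)[::-1]
        css = np.cumsum(u)
        rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
        theta = (css[rho] - 1.0) / (rho + 1.0)
        projected[:, j] = np.maximum(v - theta, 0.0)
    return projected

# One schematic "outer" step on a 3x3 state matrix (illustrative size):
# an additive gradient step, then projection back onto stochastic matrices.
eta = 0.1                                  # illustrative step size
state = np.full((3, 3), 1.0 / 3.0)         # initial left-stochastic matrix
grad = np.random.randn(3, 3)               # placeholder stochastic gradient
state = project_columns_onto_simplex(state + eta * grad)
```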
Args | |
---|---|
optimizer | tf.compat.v1.train.Optimizer, used to optimize the objective and proxy_constraints portion of ConstrainedMinimizationProblem. If constraint_optimizer is not provided, this will also be used to optimize the Lagrange multiplier analogues. |
constraint_optimizer | optional tf.compat.v1.train.Optimizer, used to optimize the Lagrange multiplier analogues. |
Attributes | |
---|---|
constraint_optimizer | Returns the tf.compat.v1.train.Optimizer used for the constraint/objective weight matrix. |
optimizer | Returns the tf.compat.v1.train.Optimizer used for optimization. |
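A minimal construction sketch, assuming TensorFlow 1.x with tf.contrib available; the optimizer choices and learning rates are illustrative only, not recommendations:

```python
import tensorflow as tf

tfco = tf.contrib.constrained_optimization

# The model parameters are updated with Adam; a separate (optional)
# plain-SGD optimizer handles the constraint/objective weight matrix.
swap_regret_optimizer = tfco.AdditiveSwapRegretOptimizer(
    optimizer=tf.compat.v1.train.AdamOptimizer(learning_rate=0.01),
    constraint_optimizer=tf.compat.v1.train.GradientDescentOptimizer(
        learning_rate=0.1))
```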
Methods
minimize
minimize(minimization_problem, unconstrained_steps=None, global_step=None, var_list=None, gate_gradients=train_optimizer.Optimizer.GATE_OP, aggregation_method=None, colocate_gradients_with_ops=False, name=None, grad_loss=None)
Returns an Operation for minimizing the constrained problem.
This method combines the functionality of minimize_unconstrained and minimize_constrained. If global_step < unconstrained_steps, it will perform an unconstrained update, and if global_step >= unconstrained_steps, it will perform a constrained update.
The reason for this functionality is that it may be best to initialize the constrained optimizer with an approximate optimum of the unconstrained problem.
Args | |
---|---|
minimization_problem | ConstrainedMinimizationProblem, the problem to optimize. |
unconstrained_steps | int, number of steps for which we should perform unconstrained updates, before transitioning to constrained updates. |
global_step | as in tf.compat.v1.train.Optimizer 's minimize method. |
var_list | as in tf.compat.v1.train.Optimizer 's minimize method. |
gate_gradients | as in tf.compat.v1.train.Optimizer 's minimize method. |
aggregation_method | as in tf.compat.v1.train.Optimizer 's minimize method. |
colocate_gradients_with_ops | as in tf.compat.v1.train.Optimizer 's minimize method. |
name | as in tf.compat.v1.train.Optimizer 's minimize method. |
grad_loss | as in tf.compat.v1.train.Optimizer 's minimize method. |
Returns | |
---|---|
Operation, the train_op. |
Raises | |
---|---|
ValueError | If unconstrained_steps is provided, but global_step is not. |
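For example, the sketch below warm-starts with unconstrained updates before switching to constrained ones. It assumes an existing ConstrainedMinimizationProblem instance named problem (built elsewhere, not shown); the step counts and learning rate are illustrative:

```python
import tensorflow as tf

tfco = tf.contrib.constrained_optimization

# `problem` is assumed to be a user-defined
# tfco.ConstrainedMinimizationProblem built elsewhere.
global_step = tf.compat.v1.train.get_or_create_global_step()
optimizer = tfco.AdditiveSwapRegretOptimizer(
    optimizer=tf.compat.v1.train.AdamOptimizer(learning_rate=0.01))

# The first 1000 steps ignore the constraints; afterwards the swap-regret
# updates on the constraint/objective weights kick in.
train_op = optimizer.minimize(
    problem, unconstrained_steps=1000, global_step=global_step)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    for _ in range(10000):
        sess.run(train_op)
```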
minimize_constrained
minimize_constrained(minimization_problem, global_step=None, var_list=None, gate_gradients=train_optimizer.Optimizer.GATE_OP, aggregation_method=None, colocate_gradients_with_ops=False, name=None, grad_loss=None)
Returns an Operation for minimizing the constrained problem.
Unlike minimize_unconstrained, this function attempts to find a solution that minimizes the objective portion of the minimization problem while satisfying the constraints portion.
Args | |
---|---|
minimization_problem | ConstrainedMinimizationProblem, the problem to optimize. |
global_step | as in tf.compat.v1.train.Optimizer 's minimize method. |
var_list | as in tf.compat.v1.train.Optimizer 's minimize method. |
gate_gradients | as in tf.compat.v1.train.Optimizer 's minimize method. |
aggregation_method | as in tf.compat.v1.train.Optimizer 's minimize method. |
colocate_gradients_with_ops | as in tf.compat.v1.train.Optimizer 's minimize method. |
name | as in tf.compat.v1.train.Optimizer 's minimize method. |
grad_loss | as in tf.compat.v1.train.Optimizer 's minimize method. |
Returns | |
---|---|
Operation, the train_op. |
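A brief hedged sketch of direct constrained training, again assuming a pre-built ConstrainedMinimizationProblem named problem and an illustrative learning rate:

```python
import tensorflow as tf

tfco = tf.contrib.constrained_optimization

# `problem` is assumed to be a user-defined
# tfco.ConstrainedMinimizationProblem built elsewhere.
optimizer = tfco.AdditiveSwapRegretOptimizer(
    optimizer=tf.compat.v1.train.GradientDescentOptimizer(learning_rate=0.1))

# Train op that minimizes the objective subject to the constraints,
# with the swap-regret state weighting objective vs. constraints.
constrained_train_op = optimizer.minimize_constrained(problem)
```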
minimize_unconstrained
minimize_unconstrained(minimization_problem, global_step=None, var_list=None, gate_gradients=train_optimizer.Optimizer.GATE_OP, aggregation_method=None, colocate_gradients_with_ops=False, name=None, grad_loss=None)
Returns an Operation for minimizing the unconstrained problem.
Unlike minimize_constrained, this function ignores the constraints (and proxy_constraints) portion of the minimization problem entirely, and only minimizes the objective.
Args | |
---|---|
minimization_problem | ConstrainedMinimizationProblem, the problem to optimize. |
global_step | as in tf.compat.v1.train.Optimizer 's minimize method. |
var_list | as in tf.compat.v1.train.Optimizer 's minimize method. |
gate_gradients | as in tf.compat.v1.train.Optimizer 's minimize method. |
aggregation_method | as in tf.compat.v1.train.Optimizer 's minimize method. |
colocate_gradients_with_ops | as in tf.compat.v1.train.Optimizer 's minimize method. |
name | as in tf.compat.v1.train.Optimizer 's minimize method. |
grad_loss | as in tf.compat.v1.train.Optimizer 's minimize method. |
Returns | |
---|---|
Operation, the train_op. |
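And a corresponding sketch for objective-only pre-training, under the same assumption that problem is a pre-built ConstrainedMinimizationProblem and with an illustrative learning rate:

```python
import tensorflow as tf

tfco = tf.contrib.constrained_optimization

# `problem` is assumed to be a user-defined
# tfco.ConstrainedMinimizationProblem built elsewhere.
optimizer = tfco.AdditiveSwapRegretOptimizer(
    optimizer=tf.compat.v1.train.AdamOptimizer(learning_rate=0.01))

# Ignores constraints and proxy_constraints entirely; useful for finding
# an approximate unconstrained optimum before constrained training.
pretrain_op = optimizer.minimize_unconstrained(problem)
```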
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/contrib/constrained_optimization/AdditiveSwapRegretOptimizer