optimize: Change initialization, remove Needser, and update Problem function calls (#779)

* optimize: Change initialization, remove Needser, and update Problem function calls

We need a better way to express the Hessian function call so that sparse Hessians can be provided. This change updates the Problem function definitions to accept an arbitrary Symmetric matrix. That in turn requires changing how Location is used so that we do not allocate a SymDense. With Location changed, Needser is no longer needed to allocate the appropriate memory; that allocation can shift to initialization, further simplifying the interfaces.
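
For illustration only, here is a minimal sketch of what a Hessian callback that writes into an arbitrary symmetric destination could look like. The field names and exact signatures below are assumptions for this sketch, not necessarily the definitions adopted by this commit; only mat.MutableSymmetric and mat.SymDense are real gonum/mat types.

package optimize

import "gonum.org/v1/gonum/mat"

// Problem (sketch): the Hessian writes into any mutable symmetric matrix,
// so callers may supply sparse or otherwise specialized implementations
// instead of a concrete *mat.SymDense.
type Problem struct {
	// Func evaluates the objective at x.
	Func func(x []float64) float64

	// Grad stores the gradient of the objective at x into grad.
	Grad func(grad, x []float64)

	// Hess stores the Hessian of the objective at x into dst. Accepting
	// the interface rather than *mat.SymDense is what allows sparse Hessians.
	Hess func(dst mat.MutableSymmetric, x []float64)
}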

A 'fake' Problem is passed to Method so that it remains impossible for the Method to call the functions directly.

Fixes #727, #593.
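
As a hedged illustration of that idea (the type and method names here are hypothetical, not the ones introduced by the commit): the coordinating loop hands the Method only a stripped-down view of the problem, so the Method can size its workspace at initialization but holds no function values it could evaluate directly.

package main

import (
	"errors"
	"fmt"
)

// problemInfo is a stand-in for the 'fake' Problem: it reports the dimension
// and which derivatives are available, but carries no callable functions.
type problemInfo struct {
	dim     int
	hasGrad bool
}

// descent allocates its workspace during initialization, the role
// previously filled by the Needser interface.
type descent struct {
	dir []float64
}

func (d *descent) init(info problemInfo) error {
	if !info.hasGrad {
		return errors.New("descent: gradient required")
	}
	d.dir = make([]float64, info.dim) // allocate once, at init time
	return nil
}

func main() {
	var d descent
	if err := d.init(problemInfo{dim: 3, hasGrad: true}); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println("workspace size:", len(d.dir))
}
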
Author: Brendan Tracey
Date: 2019-02-01 15:26:26 +00:00
Committed by: GitHub
Parent: 199b7405a3
Commit: c07f678f3f

20 changed files with 427 additions and 192 deletions


@@ -6,6 +6,11 @@ package optimize
 
 import "gonum.org/v1/gonum/floats"
 
+var (
+	_ Method      = (*GradientDescent)(nil)
+	_ localMethod = (*GradientDescent)(nil)
+)
+
 // GradientDescent implements the steepest descent optimization method that
 // performs successive steps along the direction of the negative gradient.
 type GradientDescent struct {
@@ -30,6 +35,10 @@ func (g *GradientDescent) Status() (Status, error) {
 	return g.status, g.err
 }
 
+func (*GradientDescent) Uses(has Available) (uses Available, err error) {
+	return has.gradient()
+}
+
 func (g *GradientDescent) Init(dim, tasks int) int {
 	g.status = NotTerminated
 	g.err = nil
@@ -75,7 +84,7 @@ func (g *GradientDescent) NextDirection(loc *Location, dir []float64) (stepSize
 	return g.StepSizer.StepSize(loc, dir)
 }
 
-func (*GradientDescent) Needs() struct {
+func (*GradientDescent) needs() struct {
 	Gradient bool
 	Hessian bool
 } {
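
The Uses method shown above replaces the exported Needs query: the caller reports which derivatives the Problem actually provides, and the method answers with what it will use, or an error when a requirement is missing. A rough, hypothetical sketch of that handshake follows; the lower-case available type and its gradient method are stand-ins assumed for illustration, not the commit's actual Available definition.

package main

import (
	"errors"
	"fmt"
)

// available is a hypothetical stand-in for a capability set: boolean flags
// describing which derivatives the problem can supply.
type available struct {
	grad, hess bool
}

// gradient reports the capabilities a gradient-based method would use,
// or an error when the problem cannot supply a gradient.
func (a available) gradient() (available, error) {
	if !a.grad {
		return available{}, errors.New("gradient required but not provided")
	}
	return available{grad: true}, nil
}

func main() {
	has := available{grad: true, hess: true}
	uses, err := has.gradient()
	if err != nil {
		fmt.Println(err)
		return
	}
	// A gradient-only method like GradientDescent uses the gradient but
	// ignores the Hessian even when one is available.
	fmt.Printf("uses gradient: %v, uses Hessian: %v\n", uses.grad, uses.hess)
}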