Commit Graph

77 Commits

Author SHA1 Message Date
Dan Kortschak
da72779e7a floats/scalar: new package containing non-vector functions from floats 2020-08-07 07:59:02 +09:30
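For orientation, a minimal sketch of the new package in use; this assumes EqualWithinAbs is among the moved non-vector functions:

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/floats/scalar"
)

func main() {
	// EqualWithinAbs is one of the single-value helpers now housed in
	// floats/scalar rather than floats.
	fmt.Println(scalar.EqualWithinAbs(1.0, 1.0+1e-10, 1e-8)) // true
}
```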
Egon Elbre
52173e64f5 optimize: make tests parallel 2020-03-16 16:10:59 +02:00
Vladimir Chalupecky
5f268d9394 mat: rename CloneVec to CloneFromVec 2020-03-08 11:00:58 +01:00
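A minimal sketch of the renamed method; the matrix analogue CloneFrom was renamed earlier (see the 2019-06-08 commit below):

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/mat"
)

func main() {
	src := mat.NewVecDense(3, []float64{1, 2, 3})

	// CloneVec is now CloneFromVec, matching the Cloner->ClonerFrom
	// rename and making the copy direction explicit: dst clones from src.
	var dst mat.VecDense
	dst.CloneFromVec(src)
	fmt.Println(mat.Formatted(&dst))
}
```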
Dan Kortschak
939a2b38a3 optimize/functions: fix ExtendedRosenbrock for BFGS test
With the fused operation, grad diverges, so the location fails to
progress.
2020-02-22 11:38:52 +10:30
Dan Kortschak
be8b0445de optimize/functions: fix BrownBadlyScaled for BFGS and LBFGS tests
With the fused operation, f3 is calculated as -9e-17 rather than zero,
allowing another iteration, which fails to progress due to underflow.
2020-02-22 09:15:27 +10:30
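Both fixes above stem from fused multiply-add changing rounding behaviour. As an illustration (not the commit's code), Go's math.FMA shows how a single-rounding a*b+c keeps low-order product bits that separate rounding discards, turning an exact zero into a tiny negative residual of the same order as the -9e-17 above:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	a, b := 1.0/3.0, 3.0

	// Unfused: a*b is rounded to exactly 1.0 before the subtraction,
	// so the difference is exactly zero.
	fmt.Println(a*b - 1.0) // 0

	// Fused: math.FMA computes a*b - 1 with a single rounding, keeping
	// the low-order bits of the product, so a tiny negative residual
	// survives -- the same effect that broke the tests above.
	fmt.Println(math.FMA(a, b, -1.0)) // ≈ -5.55e-17
}
```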
Dan Kortschak
815f35ac4b optimize: explicitly state interface satisfaction of types 2020-02-22 06:55:55 +10:30
Dan Kortschak
39972c90c7 optimize: relax gradient tolerance for BFGS test 2020-02-21 20:24:15 +10:30
Dan Kortschak
89be31a4e6 optimize: fix doc comment typo 2020-02-21 19:37:42 +10:30
Dan Kortschak
c3867503e7 optimize: relax gradient tolerance for Newton test 2020-02-21 01:24:01 +10:30
Vladimir Chalupecky
43ba13d1a9 optimize: relax gradient tolerance in two tests 2020-02-20 09:33:46 +01:00
Dan Kortschak
ca302525a3 optimize: add doc comments for Location fields 2020-02-18 21:09:08 +10:30
Dan Kortschak
8ad895b51b optimize: avoid unnecessary allocations for Gradient and Hessian 2020-02-17 09:22:36 +10:30
Dan Kortschak
7d9e94571f optimize: fix typo 2020-02-16 10:24:52 +10:30
Dan Kortschak
889a9573ff all: remove redundant bad prefix and make panic strings const 2019-12-02 19:52:03 +10:30
Dan Kortschak
56079b412a optimize: clean up lint 2019-11-07 08:53:13 +10:30
Dan Kortschak
9fd3ad46a7 optimize: remove redundant returns 2019-10-12 20:15:11 +10:30
Dan Kortschak
c9689bae36 optimize: remove redundant return 2019-09-23 19:37:22 +09:30
Dan Kortschak
1d8f8b2ee4 all: address issues identified by golangci-lint 2019-09-09 07:38:44 +09:30
Dan Kortschak
17ea55aedb blas,lapack: clean up docs and comments
Apply (with manual curation after the fact):
* s/\^T/ᵀ/g   (ᵀ is U+1D40)
* s/\^H/ᴴ/g   (ᴴ is U+1D34)
* s/, {2,3}(if )/ $1/g

Some additional manual editing of odd formatting.
2019-09-06 20:02:29 +09:30
Dan Kortschak
c5f01565d8 mat: rename Cloner=>ClonerFrom and Clone=>CloneFrom 2019-06-08 21:20:22 +09:30
Dan Kortschak
9827ae2933 optimize: fix doc comments for Grad and Hess fields of Problem 2019-05-01 06:30:20 +09:30
Dan Kortschak
9a5a03f262 optimize: unify Gradient and Hessian API behaviour 2019-04-23 20:19:19 +09:30
Dan Kortschak
ae324d6d48 optimize: make Problem.Hess take a *mat.SymDense 2019-04-23 20:19:19 +09:30
Brendan Tracey
4b1617dbb0 mat: Add methods for Cholesky to implement Matrix and Symmetric (#928)
* mat: Add methods for Cholesky to implement Matrix and Symmetric

Fixes #919.
2019-03-29 23:07:04 +00:00
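A hedged sketch of what #928 enables: a factorized Cholesky can now be passed directly to functions that accept mat.Matrix or mat.Symmetric:

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/mat"
)

func main() {
	a := mat.NewSymDense(2, []float64{
		4, 1,
		1, 3,
	})

	var chol mat.Cholesky
	if ok := chol.Factorize(a); !ok {
		panic("matrix not positive definite")
	}

	// Cholesky now satisfies mat.Matrix, so it can be used directly
	// with generic matrix functions; Trace(A) here is 4 + 3 = 7.
	fmt.Println(mat.Trace(&chol))
}
```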
Vladimir Chalupecky
c38fb5f9ef all: fix "the the" typo in comments 2019-03-28 14:24:03 +01:00
Brendan Tracey
a65628b4b5 mat: Rename Solve(Vec) to Solve(Vec)To (#922)
* mat: Rename Solve(Vec) to Solve(Vec)To

Fix #830.
2019-03-28 01:01:36 +00:00
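A hedged sketch of the renamed API, assuming the post-rename names on the decomposition types (for example LU.SolveTo, which writes the solution into a destination):

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/mat"
)

func main() {
	a := mat.NewDense(2, 2, []float64{2, 1, 1, 3})
	b := mat.NewDense(2, 1, []float64{3, 4})

	var lu mat.LU
	lu.Factorize(a)

	// Solve was renamed SolveTo (and SolveVec to SolveVecTo) to make
	// clear that the solution is written into the dst argument.
	var x mat.Dense
	if err := lu.SolveTo(&x, false, b); err != nil {
		panic(err)
	}
	fmt.Println(mat.Formatted(&x)) // x = [1 1]ᵀ solves A x = b
}
```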
Brendan Tracey
c07f678f3f optimize: Change initialization, remove Needser, and update Problem f… (#779)
* optimize: Change initialization, remove Needser, and update Problem function calls

We need a better way to express the Hessian function call so that sparse Hessians can be provided. This change updates the Problem function definitions to allow an arbitrary Symmetric matrix. With this change, how Location is used must also change so that we do not allocate a SymDense unnecessarily. Once Location is changed, Needser is no longer needed to allocate the appropriate memory, and that work can shift to initialization, further simplifying the interfaces.

A 'fake' Problem is passed to Method so that it remains impossible for the Method to call the functions directly.

Fixes #727, #593.
2019-02-01 15:26:26 +00:00
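A hedged sketch of the resulting Problem shape, also reflecting the later "make Problem.Hess take a *mat.SymDense" commit above: Grad and Hess write into caller-supplied destinations, so no Needser-style allocation hook is needed:

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/mat"
	"gonum.org/v1/gonum/optimize"
)

func main() {
	// A fully specified Problem for f(x, y) = x² + y².
	p := optimize.Problem{
		Func: func(x []float64) float64 {
			return x[0]*x[0] + x[1]*x[1]
		},
		Grad: func(grad, x []float64) {
			grad[0] = 2 * x[0]
			grad[1] = 2 * x[1]
		},
		Hess: func(hess *mat.SymDense, x []float64) {
			hess.SetSym(0, 0, 2)
			hess.SetSym(0, 1, 0)
			hess.SetSym(1, 1, 2)
		},
	}

	// The caller (normally the optimize package itself) supplies the
	// destinations that Grad and Hess fill in.
	x := []float64{3, -4}
	grad := make([]float64, 2)
	hess := mat.NewSymDense(2, nil)
	p.Grad(grad, x)
	p.Hess(hess, x)
	fmt.Println(p.Func(x), grad) // 25 [6 -8]
}
```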
Brendan Tracey
b545e3e77e optimize: Refactor gradient convergence and remove DefaultSettings (#772)
* optimize: Refactor gradient convergence and remove DefaultSettings

The current API design makes it easy to mistakenly not use the DefaultSettings. This change makes the zero value of Settings do the 'right thing'.

The remaining role of DefaultSettings was to change the behavior of the GradientTolerance. This was necessary because gradient-based Local methods (BFGS, LBFGS, CG, etc.) typically _define_ convergence by the value of the gradient, while Global methods (CMAES, GuessAndCheck) are defined by _not_ converging when the gradient is small. The problem is that this requires two completely different default behaviors depending on the Method. The solution is to treat a very small value of the gradient as a method-based convergence, in the same way that a small spread of data is a convergence of CMAES. Thus, from the perspective of Settings, the default behavior is never to converge based on the gradient, but all of the Local methods will converge when a value close to the minimum is found. The default threshold is set to a very small value, such that users should not want a smaller one. A user can thus still set a (more reasonable) convergence value through Settings.

Fixes #677.
2018-12-23 08:17:27 -05:00
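A hedged sketch of the effect: a nil (or zero-value) Settings now does the right thing, while a user who wants an explicit gradient-based stop can still set one:

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/optimize"
)

func main() {
	p := optimize.Problem{
		Func: func(x []float64) float64 { return x[0]*x[0] + x[1]*x[1] },
		Grad: func(grad, x []float64) {
			grad[0], grad[1] = 2*x[0], 2*x[1]
		},
	}

	// nil Settings now does the right thing: the Local method applies
	// its own very small gradient threshold. A looser explicit stop
	// could be requested with
	//   &optimize.Settings{GradientThreshold: 1e-6}
	result, err := optimize.Minimize(p, []float64{1, 1}, nil, &optimize.BFGS{})
	if err != nil {
		panic(err)
	}
	fmt.Println(result.X, result.Status)
}
```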
Brendan Tracey
44a6721e0d optimize: make function converger an interface (#728)
* optimize: make function converger an interface

Fixes #488.
Updates #677.
2018-12-16 11:37:46 +01:00
Dan Kortschak
9d66d7e8f5 mat: rename NewDiagonal to NewDiagDense 2018-12-14 22:45:14 +10:30
Brendan Tracey
c395f0688f optimize: move global code to minimize (#724)
Fixes #482.
2018-12-05 18:18:39 +00:00
Brendan Tracey
7f00e25224 mat: Add Diagonal interface and DiagDense type (#594)
mat: Add Diagonal interface and DiagDense type

Fixes #592.
2018-10-04 21:05:13 +01:00
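A hedged sketch of the new type, using the NewDiagDense name from the later rename above:

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/mat"
)

func main() {
	// A 3×3 diagonal matrix storing only its diagonal.
	d := mat.NewDiagDense(3, []float64{1, 2, 3})

	// DiagDense satisfies mat.Matrix as well as the new Diagonal
	// interface, so it composes with the rest of the package.
	var dense mat.Dense
	dense.Mul(d, d)
	fmt.Println(mat.Formatted(&dense)) // diag(1, 4, 9)
}
```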
Dan Kortschak
70492dcef1 all: quieten vet for unkeyed composite literals in test code 2018-09-02 07:59:12 +09:30
Brendan Tracey
cebdade430 Remove InitMean from CmaEsChol and use the value passed to Minimize instead 2018-07-26 06:45:43 -06:00
Brendan Tracey
fd90faf24c Change Minimize to take in an initial X location 2018-07-26 06:45:43 -06:00
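A hedged sketch of the reshaped call: the initial location is an argument to Minimize rather than a field of Settings or of the Method:

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/optimize"
)

func main() {
	p := optimize.Problem{
		Func: func(x []float64) float64 {
			// Rosenbrock function, with minimum at (1, 1).
			return (1-x[0])*(1-x[0]) + 100*(x[1]-x[0]*x[0])*(x[1]-x[0]*x[0])
		},
	}

	// The initial X location is now passed directly to Minimize.
	initX := []float64{-1.2, 1}
	result, err := optimize.Minimize(p, initX, nil, &optimize.NelderMead{})
	if err != nil {
		panic(err)
	}
	fmt.Printf("f(%v) = %v\n", result.X, result.F)
}
```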
Brendan Tracey
6f6398b9ea Change Global to Minimize 2018-07-26 06:45:43 -06:00
Brendan Tracey
9c5a3cae0e Rename GlobalMethod to Method and GlobalTask to Task 2018-07-26 06:45:43 -06:00
Brendan Tracey
88ef6dbe25 Change DefaultSettings to DefaultSettingsLocal 2018-07-26 06:45:43 -06:00
Brendan Tracey
3dd933ac47 Change Method to localMethod.
Now that Local is gone, unexport the type.
2018-07-26 06:45:43 -06:00
Brendan Tracey
286f685bdb Change call to Local to call to Global (#553) 2018-07-18 17:04:18 -06:00
Brendan Tracey
c06c645ce2 Improve parameters when dimension of NelderMead is one. In particular… (#543)
* Improve parameters when the dimension of NelderMead is one. In particular, the shrink parameter was previously set to 0, which caused the algorithm to fail to converge.

Fixes #542.

* Add test for 1-D NelderMead
2018-07-18 12:18:43 -06:00
Brendan Tracey
2704973b50 optimize: Remove Local function (#538)
* optimize: Remove Local function

This change removes the Local function. In order to do so, it changes the previous LocalGlobal wrapper to LocalController, allowing Local methods to be used as a Global optimizer. It adds methods to all of the Local methods to implement GlobalMethod, and changes the tests accordingly. The next commit will fix all of the names.
2018-07-18 12:18:18 -06:00
Vladimir Chalupecky
e9e56344e3 all: fix capitalization of Gonum in copyright headers 2018-06-22 17:32:53 +02:00
Vladimir Chalupecky
b96df58db9 all: add missing copyright headers 2018-06-22 17:32:53 +02:00
Brendan Tracey
d56ca496c1 optimize: Change Settings to allow InitialLocation (#497)
* optimize: Change Settings to allow InitialLocation

This modifies Settings to allow specifying an initial location and properties of the function (value, gradient, etc.). This allows working with local optimizers that are seeded with initial values. Two fields must be specified: InitX and InitValues. Ideally this would be a single field, but the difficulty is that the default value of the function is 0. We must either require the user to signal that it is set (in this case, that InitValues is non-zero), or require the user to move the default value away when it is not set. The former seems much safer.
2018-06-06 09:11:04 -06:00
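A hedged sketch of seeding known function values; InitValues survives in the current API, while InitX was later superseded by the explicit initial-X argument to Minimize (see the 2018-07-26 commits above):

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/optimize"
)

func main() {
	p := optimize.Problem{
		Func: func(x []float64) float64 { return x[0]*x[0] + x[1]*x[1] },
	}

	initX := []float64{1, 1}
	// InitValues supplies function information already known at the
	// initial location, saving a redundant evaluation; a non-nil value
	// signals that F is genuinely set rather than a default zero.
	settings := &optimize.Settings{
		InitValues: &optimize.Location{F: 2}, // f(1, 1) = 2
	}
	result, err := optimize.Minimize(p, initX, settings, &optimize.NelderMead{})
	if err != nil {
		panic(err)
	}
	fmt.Println(result.F)
}
```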
Brendan Tracey
3f7b30d06c Remove init location from FunctionConverge (#489) 2018-05-15 09:14:31 -06:00
Brendan Tracey
f402b0ae71 optimize: remove Local implementation and replace with a call to Global (#485)
* optimize: remove Local implementation and replace with a call to Global

This PR starts the process described in #482. It removes the existing Local implementation, replacing it with a function that wraps Method to act as a GlobalMethod. This PR also adds a hack to fix an inconsistency with FunctionConverge between Global and Local (and a TODO to make it not a hack in the future).
2018-05-09 11:02:19 -06:00
Dan Kortschak
4eed5b6553 all: replace publicly facing *rand.Rand with rand.Source 2018-05-03 07:40:18 +09:30
Brendan Tracey
698d55ff6e optimize/lp: make simplex panic when there are more equality constrai… (#451)
* optimize/lp: make simplex panic when there are more equality constraints than variables
2018-04-05 13:30:52 -06:00
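For context, a hedged sketch of a well-posed call to the simplex solver, assuming the optimize/convex/lp import path and the Simplex signature with a default tolerance when tol is 0:

```go
package main

import (
	"fmt"

	"gonum.org/v1/gonum/mat"
	"gonum.org/v1/gonum/optimize/convex/lp"
)

func main() {
	// Minimize -x₁ - 2x₂ subject to x₁ + s₁ = 4, x₂ + s₂ = 9, x ≥ 0.
	// Two equality constraints and four variables, so Simplex accepts
	// it; the commit above makes the over-constrained case panic.
	c := []float64{-1, -2, 0, 0}
	a := mat.NewDense(2, 4, []float64{
		1, 0, 1, 0,
		0, 1, 0, 1,
	})
	b := []float64{4, 9}

	opt, x, err := lp.Simplex(c, a, b, 0, nil)
	if err != nil {
		panic(err)
	}
	fmt.Println(opt, x) // -22 [4 9 0 0]
}
```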
Brendan Tracey
f0b07f8621 optimize: add ListSearch for finding the optimum over a specific list… (#438)
* optimize: add ListSearch for finding the optimum over a specific list of values
2018-03-22 14:00:44 -06:00