Separable Natural Evolution Strategy (SNES) algorithm in JavaScript.

Mona Lisa in 200 rectangles; see the demo.
```js
import SNES from "snes";

// creates an optimizer with settings
const optimizer = SNES({
  solutionLength, // number of parameters
  populationCount, // number of candidate solutions
});

// an array to store the fitness of each candidate in the population
const fitnesses = new Float32Array(populationCount);

// for each epoch...
const epochs = 100;
for (let i = 0; i < epochs; i++) {
  // ask for a set of solutions (flat array)
  const solutions = optimizer.ask();

  // compute the fitness for each
  for (let j = 0; j < populationCount; j++) {
    // note this returns a subarray (i.e. no copy)
    const params = optimizer.getSolutionAt(solutions, j);
    // compute the fitness, which is often -err
    fitnesses[j] = fitness(params);
  }

  // update the optimizer with all fitnesses
  optimizer.tell(fitnesses);
}

// The optimized 'mean' solution
console.log(optimizer.center);
```
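The `fitness` callback above is problem-specific and is not part of the library; higher values are better. As a hypothetical example, a fitness that drives all parameters toward zero (the negated "sphere" function) could look like this:

```javascript
// Hypothetical fitness for a toy problem: prefer parameters near zero.
// SNES maximizes fitness, so we return the negated sum of squared parameters.
function fitness(params) {
  let err = 0;
  for (let i = 0; i < params.length; i++) {
    err += params[i] * params[i];
  }
  return -err;
}
```

A solution of all zeros gets the best possible fitness (0); anything else scores negative.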
See the examples directory for more.

Use npm to install:

```sh
npm install snes --save
```
With options:

- `solutionLength`: number of parameters to optimize
- `populationCount`: number of candidate solutions to use
- `alpha`: (default = 0.05) learning rate
- `state`: a 4-element `Uint32Array` random seed state
- `random`: (default = `Math.random`) randomizer for computing an initial `opts.state` if not specified; ignored if `state` is given
- `sigma`: the initial standard deviation, defaults to `new Float32Array(solutionLength).fill(1)`
- `center`: the initial mean, defaults to `new Float32Array(solutionLength).fill(0)`
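As a sketch of the options above, a fully spelled-out configuration might look like the following; the specific seed words and sizes here are arbitrary example values, not defaults:

```javascript
// Hypothetical, fully specified options object for SNES(opts).
// The four 32-bit words in `state` seed the internal PRNG; since `state`
// is given, the `random` option would be ignored.
const solutionLength = 8;
const opts = {
  solutionLength,
  populationCount: 16,
  alpha: 0.05, // the documented default learning rate
  state: new Uint32Array([1, 2, 3, 4]), // arbitrary example seed
  sigma: new Float32Array(solutionLength).fill(1),
  center: new Float32Array(solutionLength).fill(0),
};
```

Since `state` seeds the randomizer, reusing the same values should make runs reproducible.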
`optimizer.ask()` returns a flat array of solutions, strided by `solutionLength`. The total size of this array will be `solutionLength * populationCount`.
`optimizer.getSolutionAt(solutions, index)` gets a subarray at slot `index` from a flat array of `solutions`. Since this is a view of the flat array, it is not a copy, and so any changes to the flat array will also be present in this view. You should use `subarray.slice()` if you want a copy.
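The view/copy distinction can be demonstrated with a plain `Float32Array` and the standard `subarray`/`slice` methods (no snes import needed); `getSolutionAt` behaves like the `subarray` call below:

```javascript
// A flat buffer holding 2 solutions of length 3, like ask() would return.
const solutionLength = 3;
const flat = new Float32Array([0, 1, 2, 10, 11, 12]);

// A view of slot 1: shares storage with `flat`, no copy is made.
const view = flat.subarray(1 * solutionLength, 2 * solutionLength);

// An independent copy, detached from the flat array.
const copy = view.slice();

// Mutating the flat array shows up in the view but not in the copy.
flat[3] = 99;
```

After the mutation, `view[0]` reads 99 while `copy[0]` still reads 10.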
`optimizer.tell(fitnesses)` updates the parameters based on the list of fitnesses, which is expected to be parallel to the `solutions` array given by the `ask()` function. The size of this array should be `populationCount`.
`optimizer.center` is the current mean of the optimizer, i.e. the optimization result. This may not always be the best performing candidate, for example compared to candidates from prior epochs.
Other members:

- `optimizer.sigma` (length `solutionLength`)
- `optimizer.gaussian` (length `populationCount * solutionLength`)
- `optimizer.prng` (you can use `prng.next()` and `prng.nextGaussian()` for random values)
Clone the repo, `cd` into it, `npm install` dependencies, then you can run:

```sh
npm run rects
```

And open the localhost URL in your browser to see Mona Lisa painted in 200 rectangles. You can also test with `node examples/ascii.js` for a simple textual learning example.
- Exponential Natural Evolution Strategies (2010) - T. Glasmachers
- Benchmarking Separable Natural Evolution Strategies on the Noiseless and Noisy Black-box Optimization Testbeds (2012) - T. Schaul
- pints
MIT, see LICENSE.md for details.