Algorithm::LBFGS
NAME

Algorithm::LBFGS - Raku bindings for libLBFGS

SYNOPSIS

use Algorithm::LBFGS;
use Algorithm::LBFGS::Parameter;

my Algorithm::LBFGS $lbfgs .= new;
my &evaluate = sub ($instance, $x, $g, $n, $step --> Num) {
   my Num $fx = ($x[0] - 2.0) ** 2 + ($x[1] - 5.0) ** 2;
   $g[0] = 2.0 * $x[0] - 4.0;
   $g[1] = 2.0 * $x[1] - 10.0;
   return $fx;
};
my Algorithm::LBFGS::Parameter $parameter .= new;
my Num @x0 = [0e0, 0e0];
my @x = $lbfgs.minimize(:@x0, :&evaluate, :$parameter);
@x.say; # [2e0, 5e0]

DESCRIPTION

Algorithm::LBFGS provides Raku bindings for libLBFGS. libLBFGS is a C port of the implementation of the Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method written by Jorge Nocedal.

The L-BFGS method solves the unconstrained minimization problem

minimize F(x), x = (x1, x2, ..., xN),

provided that the objective function F(x) and its gradient G(x) are computable.

CONSTRUCTOR

my $lbfgs = Algorithm::LBFGS.new;
my Algorithm::LBFGS $lbfgs .= new; # with type restrictions

METHODS

minimize(:@x0!, :&evaluate!, :&progress, Algorithm::LBFGS::Parameter :$parameter!) returns Array

my @x = $lbfgs.minimize(:@x0, :&evaluate, :&progress, :$parameter); # with the &progress callback
my @x = $lbfgs.minimize(:@x0, :&evaluate, :$parameter);

Runs the optimization and returns the resulting variables.

:@x0 is the initial value of the variables.

:&evaluate is the callback function that computes the objective function F(x) and its gradient G(x).

:&progress is an optional callback function. It is called on every iteration and can report the internal state of the current iteration.

:$parameter is an instance of the Algorithm::LBFGS::Parameter class.
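Putting these pieces together, the SYNOPSIS example can be extended with a &progress callback. This is a sketch; the text printed by the callback is our own choice:

    use Algorithm::LBFGS;
    use Algorithm::LBFGS::Parameter;

    my Algorithm::LBFGS $lbfgs .= new;

    # F(x) = (x0 - 2.0)^2 + (x1 - 5.0)^2
    my &evaluate = sub ($instance, $x, $g, $n, $step --> Num) {
        $g[0] = 2.0 * $x[0] - 4.0;
        $g[1] = 2.0 * $x[1] - 10.0;
        return ($x[0] - 2.0) ** 2 + ($x[1] - 5.0) ** 2;
    };

    # Report the objective value on every iteration; return 0 to continue.
    my &progress = sub ($instance, $x, $g, $fx, $xnorm, $gnorm, $step, $n, $k, $ls --> Int) {
        "iteration $k: fx = $fx".say;
        return 0;
    };

    my Algorithm::LBFGS::Parameter $parameter .= new;
    my Num @x0 = [0e0, 0e0];
    my @x = $lbfgs.minimize(:@x0, :&evaluate, :&progress, :$parameter);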

:&evaluate

One of the simplest &evaluate callback functions looks like this:

my &evaluate = sub ($instance, $x, $g, $n, $step --> Num) {
   my Num $fx = ($x[0] - 2.0) ** 2 + ($x[1] - 5.0) ** 2; # F(x) = (x0 - 2.0)^2 + (x1 - 5.0)^2

   # G(x) = [∂F(x)/∂x0, ∂F(x)/∂x1]
   $g[0] = 2.0 * $x[0] - 4.0; # ∂F(x)/∂x0 = 2.0 * x0 - 4.0
   $g[1] = 2.0 * $x[1] - 10.0; # ∂F(x)/∂x1 = 2.0 * x1 - 10.0
   return $fx;
};

&evaluate requires all five arguments, in this order: $instance (user data passed through from the optimizer), $x (the current values of the variables), $g (the array in which to store the gradient G(x)), $n (the number of variables), and $step (the current step of the line search).

After computing the objective function F(x) and storing its gradient G(x) in $g, the callback must return the value of F(x).
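As a second illustration, here is a sketch of an &evaluate callback for the two-dimensional Rosenbrock function, a standard test problem for this kind of optimizer (the function choice is ours, not part of this module):

    # F(x) = (1 - x0)^2 + 100 * (x1 - x0^2)^2, with its minimum at (1, 1)
    my &evaluate = sub ($instance, $x, $g, $n, $step --> Num) {
        my Num $t1 = 1.0 - $x[0];
        my Num $t2 = $x[1] - $x[0] ** 2;
        $g[0] = -2.0 * $t1 - 400.0 * $x[0] * $t2; # ∂F(x)/∂x0
        $g[1] = 200.0 * $t2;                      # ∂F(x)/∂x1
        return $t1 ** 2 + 100.0 * $t2 ** 2;
    };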

:&progress

One of the simplest &progress callback functions looks like this:

my &progress = sub ($instance, $x, $g, $fx, $xnorm, $gnorm, $step, $n, $k, $ls --> Int) {
    "Iteration $k".say;
    "fx = $fx, x[0] = $x[0], x[1] = $x[1]".say;
    return 0;
};

&progress requires all ten arguments, in this order: $instance (user data), $x (the current values of the variables), $g (the current gradient), $fx (the current value of the objective function), $xnorm (the Euclidean norm of $x), $gnorm (the Euclidean norm of $g), $step (the line-search step used in this iteration), $n (the number of variables), $k (the iteration count), and $ls (the number of evaluations called in this iteration). Returning 0 continues the optimization; returning a non-zero value cancels it.

Algorithm::LBFGS::Parameter :$parameter

Below are examples of creating an Algorithm::LBFGS::Parameter instance:

my Algorithm::LBFGS::Parameter $parameter .= new; # sets default parameter
my Algorithm::LBFGS::Parameter $parameter .= new(max_iterations => 100); # sets max_iterations => 100
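The underlying libLBFGS library exposes further tunable fields on its lbfgs_parameter_t struct (e.g. m, epsilon, max_iterations); assuming this binding passes those names through unchanged, a sketch:

    use Algorithm::LBFGS::Parameter;

    # Field names follow libLBFGS's lbfgs_parameter_t; whether each one is
    # exposed by this binding is an assumption here.
    my Algorithm::LBFGS::Parameter $parameter .= new(
        m              => 7,    # number of corrections kept (history size)
        epsilon        => 1e-6, # convergence tolerance on the gradient norm
        max_iterations => 100,  # 0 means iterate until convergence
    );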

OPTIONS

STATUS CODES

TBD

AUTHOR

titsuki titsuki@cpan.org

COPYRIGHT AND LICENSE

Copyright 2016 titsuki

Copyright 1990 Jorge Nocedal

Copyright 2007-2010 Naoki Okazaki

libLBFGS by Naoki Okazaki is licensed under the MIT License.

This library is free software; you can redistribute it and/or modify it under the terms of the MIT License.

SEE ALSO