LogP6

NAME

LogP6 is a fully customizable and fast logging library inspired by the idea of separating logging and its configuration. You can use it not only in applications but also in your libraries.

SYNOPSIS

A logging system has to be as transparent as possible. At the same time, it has to be fully customizable. It has to make it possible to change the logging logic without changing a single line of code. Ideally, you can use a logging system while developing a library without its users feeling any discomfort from it. The LogP6 logging library is all about that.

DESCRIPTION

Features

  1. Possibility to change the logger configuration and its behavior without touching the code;
  2. Configuration from code and/or configuration files;
  3. Possibility to use any IO::Handle, even for asynchronous or database work;
  4. Possibility to use multiple loggers in one application;
  5. Possibility to use multiple IO::Handles in one logger;
  6. Flexible filter system;
  7. Stack and Map entities associated with each thread, with the possibility to use their values in the logger;
  8. Pretty fast operation. Pre-calculation is used as much as possible - the logger layout pattern is parsed only once, current DateTime objects are reused, and so on;
  9. Possibility to use the logger during development (logger settings can be changed dynamically at runtime) and in production (maximum speed, without any locks, except possibly in IO::Handle implementations).

Concepts

Example

Using logger:

use LogP6;                     # use the library in general mode

my \log = get-logger('audit'); # create or get a logger with the 'audit' trait
log.info('property ', 'foo', ' set to ', 5);      # log a string built by concatenation
log.infof('property %s set to %d', 'foo', 5);     # log in sprintf-like style

Configure the logger in code:

use LogP6 :configure;   # use library in configure mode
cliche(                 # create a cliche
  :name<cl>,            # obligatory unique cliche name
  :matcher<audit>,      # obligatory matcher
  grooves => (                                          # optional list of writer-filter pairs (or their names)
    writer(:pattern('%level| %msg'), :handle($*ERR)),   # create anonymous (w/o name) writer configuration
    filter(:level($debug))));                           # create anonymous (w/o name) filter configuration

Configure in the configuration file style (same as above):

{
  "writers": [{
    "type": "std",
    "name": "w",
    "pattern": "%level | %msg",
    "handle": { "type": "std", "path": "err" }
  }],
  "filters": [{
    "type": "std",
    "name": "f",
    "level": "debug"
  }],
  "cliches": [{
    "name": "cl",
    "matcher": "audit",
    "grooves": [ "w", "f" ] 
  }]
}

Context

The LogP6 library adds an object associated with each Thread - the Logger Context, or just context. You can work with the context directly in the filter subsystem or in custom writer implementations. The logger also has methods for working with the NDC and MDC (see Logger below). For more information, please look at the method declarators in LogP6::Context.

Writer

A writer is responsible for writing all corresponding data to the corresponding output in some format. It has only one method:

Filter

A filter is responsible for deciding whether to allow the corresponding writer to write a log entry or not. It has three methods:

Nested Diagnostic Context (NDC) and Mapped Diagnostic Context (MDC)

There are cases when we want to trace some information through a group of log messages, not just in one message - for example, a user id, an HTTP session number, or so. In such cases, we have to store the information somewhere, pass it through logic subs and methods, and pass it to log methods over and over again. Since the log system has to be separated from the main program logic, we need a special place to store that information. That place is the Nested Diagnostic Context (NDC) - a stack structure - and the Mapped Diagnostic Context (MDC) - a map structure. You can push/pop values in the NDC and put/remove values in the MDC. The standard writer has special placeholders in the message pattern (see Pattern below) to put all values from the NDC, or the value associated with some key in the MDC, into the final log message string.
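
For example, a minimal MDC sketch (the 'mdc-demo' trait and cliche names are made up here; the full server example in the EXAMPLES section shows the same technique):

use LogP6 :configure;

cliche(:name<mdc-demo>, :matcher<mdc-demo>, grooves => (
  writer(:pattern('[%mdc{user-id}] %msg'), :handle($*OUT)),
  filter(:level($info))));

my $log = get-logger('mdc-demo');
$log.mdc-put('user-id', 42);   # store a value in the current thread's MDC
$log.info('hello');            # writes: [42] hello
$log.mdc-remove('user-id');    # remove the value when it is no longer relevant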

Logger

A logger is an immutable object containing zero or more pairs of writer and filter (grooves). Each time you log some message (with or without arguments), the logger compiles the message and arguments into one message string, updates the context, and goes through its grooves - it calls the filter's methods and, if they pass, asks the writer to write the message. The writer takes all necessary information, such as the message, log level, NDC/MDC values, current date-time, and so on, from the context.

Logger has the following methods:
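
As a usage sketch, restricted to calls that appear in the examples later in this README (the 'app' trait is made up):

use LogP6;

my $log = get-logger('app');
$log.info('started');                        # one method per level: trace, debug, info, warn, error
$log.infof('listening on port %d', 8080);    # the *f variants take a sprintf-style format
try die 'oops';
$log.error('caught a failure', :x($!));      # pass an exception with the :x argument
.log('verbose details') with $log.debug-on;  # the *-on variants support conditional logging
$log.mdc-put('user-id', 7);                  # the MDC/NDC helpers also live on the logger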

Logger Wrapper

It is a system for wrapping (or decorating) a logger object in another one to add extra logic. You can describe a logger wrapper factory, which will wrap any created logger.

Synchronisation of configuration and Logger instance

An example of logger wrapper usage is synchronization of a logger configuration and a logger instance. It may be useful during development or a debug session to change the logger configuration dynamically.

Since a logger object is immutable and cannot know about changes to the configuration it was produced from, we need logic that checks whether the user updated the corresponding configuration and updates the logger instance accordingly.

You can specify any wrapper for logger synchronization. There is a helper class LogP6::Wrapper::SyncAbstract for creating your own synchronization wrapper.

For now, there are only two synchronization wrappers:

CONFIGURATION

To work with the LogP6 library, you need the use LogP6; statement. Without any tags, it provides only the get-logger($trait) sub for retrieving a logger. The :configure tag provides factory subroutines for configuring loggers from code. Another option for configuring loggers is a configuration file.

Logger retrieve

To retrieve a logger, you need to use the LogP6 module and call the get-logger($trait) sub with the logger trait you need. Example:

use LogP6;

my $log = get-logger('main');
# using $log ...

If you did not configure a Cliche for the specified logger trait ('main' in the example), the default logger is returned (see Default logger). Otherwise, the logger created by the cliche whose matcher the trait satisfies is returned.

Factory subroutines

LogP6 provides subroutines for configuring loggers from code dynamically. To get access to them, you need to use LogP6 with the :configure tag. There are subroutines for configuring filters, writers, cliches, and default values like the writer pattern, the logger wrapper, and so on. The concrete subroutines are described in the corresponding sections below. There is a get-logger-pure($trait) sub for retrieving a pure logger without any wrappers. Also, five variables for the five LogP6::Level enum values are exported as $trace, $debug, $info, $warn and $error. Example:

use LogP6 :configure;

set-default-wrapper(LogP6::Wrapper::SyncTime::Wrapper.new(:60seconds)); # set default wrapper
set-default-level($debug);    # set default logger level as debug
my $log = get-logger('main'); # get wrapped logger
$log.debug('msg');
my $pure-log = get-logger-pure('main'); # this logger will not synchronize its configuration

Configuration file

A better alternative (especially for production use) to configuration by factory subroutines is a configuration file. You can specify a path to it through the LOG_P6_JSON system environment variable. If the variable is empty, the standard path ./log-p6.json is used (if the file exists). Alternatively, you can initialize the LogP6 library with the init-from-file($config-path) factory subroutine.
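
For example, a minimal sketch of explicit initialization from a file (the path here is made up):

use LogP6 :configure;

init-from-file('conf/log-p6.json');  # read the json configuration from the given path
my $log = get-logger('main');        # loggers now reflect the cliches and defaults from the file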

The configuration file is a json formatted file. Example:

{
  "default-pattern": "%msg",
  "default-level":  "trace",
  "default-handle": { "type": "std", "path": "err" },
  "writers": [{ "type": "std", "name": "w" }],
  "filters": [{ "type": "std", "name": "f" }],
  "cliches": [{ "name": "c2", "matcher": "main" }]
}

The concrete format for concrete objects will be described in the corresponding sections below.

Some objects, like writers, wrappers, and so on, have a type field. Each object has its own list of available types. There is one type that can be used in any object - custom. It is used to describe a factory method or class which will be used to produce the object. It requires additional fields:

For example, creating an IO::Handle by the create-handle subroutine in MyModule with arguments :path<out.txt>, :trait<rw>:

{
  "default-handle": {
    "type": "custom",
    "require": "MyModule",
    "fqn-method": "MyModule::EXPORT::DEFAULT::&create-handle",
    "args": { "path": "out.txt", "trait": "rw" }
  }
}
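
The factory subroutine referenced above might look like the following sketch (MyModule and create-handle are the hypothetical names from the example, and the handling of :trait is illustrative only):

unit module MyModule;

# referenced by "fqn-method": "MyModule::EXPORT::DEFAULT::&create-handle";
# the "args" object is passed as named arguments
sub create-handle(Str :$path!, Str :$trait = 'rw' --> IO::Handle:D) is export {
  $path.IO.open(:rw);
}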

Writer configuration

WriterConf

WriterConf is a configuration object which contains all the necessary information and the algorithm for creating a concrete writer instance. For more information, please look at the method declarators in LogP6::WriterConf.

Standard WriterConf

Standard WriterConf (LogP6::WriterConf::Std) makes a writer that writes log messages to an abstract IO::Handle. It has a pattern - a string with special placeholders for values like the NDC, the current Thread name, the log message, and so on. The writer puts all necessary values into the pattern and writes the result to the handle. Also, the standard WriterConf has a boolean auto-exceptions property - if it is True, then the placeholder for the exception is concatenated to the pattern automatically. The form of the exception placeholder can be configured separately (see Defaults and Cliche).

Pattern

Pattern placeholders start with the % symbol followed by the placeholder name. If a placeholder has arguments, they can be passed in curly brackets after the placeholder name.

The pattern can have the following placeholders:

Note that using %framefile, %frameline or %framename in the pattern will slow down your logging, because it requires several callframe() calls on each log call that actually produces output.
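
For illustration, here is a writer configuration whose pattern combines placeholders used in the examples in this README (%date with a format argument, %level, %mdc with a key, %msg, and %x for exceptions); the writer name is made up:

use LogP6 :configure;

writer(
  :name<patterned>,
  :pattern('%date{$hh:$mm:$ss:$mss} %level [%mdc{user-id}] %msg%x{ - $name $msg $trace}'),
  :handle($*ERR));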

Async writing

LogP6 provides writer and handle implementations for asynchronous writing.

You can use LogP6::Handle::Async.new(IO::Handle :$delegate!, Scheduler :$scheduler = $*SCHEDULER) as a handle which will schedule the WRITE method calls of the delegate handle.

If wrapping a handle is not enough, you can wrap the whole writer. Use LogP6::WriterConf::Async.new(LogP6::WriterConf :$delegate!, Scheduler :$scheduler = $*SCHEDULER, :$name, Bool :$need-callframe) as a writer configuration wrapping another configuration. The final writer will schedule the write method calls of the writer created by the delegate, with a copy of the current logger context. If you omit the :name parameter, the delegate's name is used. Pass the boolean need-callframe parameter if you plan to use callframe information in the wrapped writer. Note that using callframe will slow down your logging, because it requires several callframe() calls on each log call that actually produces output.
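
A minimal sketch of the handle variant, using the constructor signature quoted above (the file name and writer name are made up, and it is assumed the class can be loaded from a module of the same name):

use LogP6 :configure;
use LogP6::Handle::Async;

writer(
  :name<async>,
  :pattern('%level | %msg'),
  handle => LogP6::Handle::Async.new(delegate => 'async.log'.IO.open(:a)));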

Writer factory subroutines

The LogP6 module has the following subs for managing writer configurations:

Writer configuration file

In the configuration file, writer configurations have to be listed in the writers array. Only the std (for standard configuration) and custom types are supported.

In the case of the standard configuration, all fields are optional except name. The handle can be:

In the case of the custom writer type, the resulting writer configuration has to return a non-empty name.

Example:

{
  "writers": [
    {"type": "std", "name": "w1", "pattern": "%msg", "handle": {"type": "std", "path": "out"}},
    {"type": "std", "name": "w2", "handle": {"type": "file", "path": "log.txt", "append": false}},
    {"type": "custom", "require": "Module", "fqn-method": "Module::EXPORT::DEFAULT::&writer", "args": { "name": "w3" }}
  ]
}

Filter configuration

FilterConf

A filter is created by a FilterConf - a configuration object which contains all the necessary information and the algorithm for creating a concrete filter instance. For more information, please look at the method declarators in LogP6::FilterConf.

Standard FilterConf

Standard FilterConf (LogP6::FilterConf::Std) has an array of do-before subs and an array of do-after subs. A filter made by the standard FilterConf calls each do-before sub and stops at the first returned False value. If all do-before subs return True, then the filter's do-before method returns True. The do-after subs work in the same way. Also, there is a first-level-check property. If it is set to True, then the sub for checking the log level is automatically added as the first element of the do-before array; if the property is set to False, then the sub is added as the last element of the do-before array.
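
A short sketch combining these options (the filter name and the do-before sub are made up; it is assumed the filter factory sub accepts the same first-level-check option as the configuration file shown below, and before-check is used as in the EXAMPLES section):

use LogP6 :configure;

sub not-too-long($context) { $context.msg.chars < 1000 }  # illustrative do-before check

filter(
  :name<short-only>,
  :level($info),
  :first-level-check,               # put the level check before the custom checks
  before-check => (&not-too-long));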

Filter factory subroutines

The LogP6 module has the following subs for managing filter configurations:

Filter configuration file

In the configuration file, filter configurations have to be listed in the filters array. Only the std (for standard configuration) and custom types are supported.

In the case of the standard configuration, all fields are optional except name. before-check and after-check are arrays of custom typed elements.

In the case of the custom filter type, the resulting filter configuration has to return a non-empty name.

Example:

{
  "filters": [
    {"type": "std", "name": "f1", "level": "error", "first-level-check": false},
    {"type": "std", "name": "f2", "level": "info", "before-check": [{ "require": "MyModule", "fqn-method": "MyModule::EXPORT::DEFAULT::&before-check" }]},
    {"type": "custom", "require": "MyModule", "fqn-class": "MyModule::MyFilter", "args": { "name": "f3" }}
  ]
}

Defaults

Standard filters and writers have fields and options which affect their work. Some of them you can specify in factory subroutines or configuration file fields. If such arguments are omitted, their default values are used. Other fields and options cannot be set this way - for example, the pattern for an exception that is concatenated to the main pattern in the standard writer when auto-exceptions is set to True (see Standard WriterConf). Such properties have default values too. All the defaults can be set through factory subroutines or fields in the configuration file.

Configuring default values is useful when you want to avoid a lot of boilerplate configuration.
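
For example, the defaults used elsewhere in this README can be set from code like this:

use LogP6 :configure;

set-default-pattern('%level | %msg');  # pattern for writers that do not specify one
set-default-level($info);              # level for filters that do not specify one
set-default-wrapper(LogP6::Wrapper::SyncTime::Wrapper.new(:60seconds));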

Defaults factory subroutines

There are the following factory subs for setting default values:

Defaults configuration file

You can configure default values in the configuration file through the following json fields of a root object:

Wrapper can be:

Cliche

A cliche is a template for creating a Logger. Each cliche has a matcher - a literal or regex field. When you want to get a logger for some logger trait, the logger system tries to find a cliche with a matcher the trait satisfies (by smartmatch). If there is more than one such cliche, the most recently created one is picked. The picked cliche's content is used to make the new logger.

A cliche contains pairs of writer and filter configurations called grooves, and its own default values, which override the global default values (see Defaults). You can use the same writer and/or filter in several grooves. If the grooves list is empty or missing, the created logger will drop all logs you pass to it.

Cliche factory subroutines

The LogP6 module has the following subs for managing cliche configurations:

Cliche configuration file

In the configuration file, cliches have to be listed in the cliches array. Each cliche has the following fields:

Example:

{
  "cliches": [{
    "name": "c1", "matcher": "/bo .+ om/", "grooves": [ "w1", "f1", "w2", "f1" ],
    "wrapper": { "type": "transparent" }, "default-pattern": "%level %msg"
  }]
}

Default logger

Whether you configure your cliches by the factory routines, by the configuration file, or not at all, a default cliche is always present in the logger library. The default cliche corresponds to the following configuration: cliche(:name(''), :matcher(/.*/), grooves => (writer(:name('')), filter(:name('')))). In other words, the default cliche has an empty string name, matches any trait, and has only one groove with an empty (all defaults) writer with an empty string name and an empty (all defaults) filter with an empty string name. This means that, by default, you do not need to configure anything at all. But you can change the default cliche or the default writer and filter by factory subroutines or in the configuration file. Note that if the LogP6 library does not find a cliche whose matcher the logger trait satisfies, an exception will be thrown.
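
For instance, the default filter (the one with the empty name) can be updated in place, as the EXAMPLES section also shows:

use LogP6 :configure;

filter(:name(''), :level($debug), :update);  # change the level of the default groove's filter
my $log = get-logger('anything');            # any trait is matched by the default cliche
$log.debug('now visible');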

Change configuration

Sometimes you may need to change the logger configuration at runtime. It can be done simply with factory subroutines. After calling any factory subroutine, all loggers for already used logger traits are recreated, and you can get them with the get-logger($trait) sub. If you have already obtained a logger, use a synchronization wrapper: the wrapper will sync the logger itself according to its algorithm.

Another way to change the configuration is to modify the configuration file. Changes in the configuration file are detected only if you are already using any of the synchronization wrappers (in the defaults or in one of the cliches). After any change is detected, all the already configured state is dropped and recreated from the file.

EXAMPLES

Let's explore a few general use cases:

Use an external library that uses LogP6

LogP6 can be used during library development, and a user of the library may want to turn off all logs from the library entirely. Let's imagine that all of the library loggers' traits start with the LIBNAME prefix. In this case, we can create a cliche with a corresponding matcher and empty grooves - all library logs will be dropped.

In Raku:

use LogP6 :configure;

cliche(:name('turn off LIBNAME'), :matcher(/^LIBNAME .*/), :wrapper(LogP6::Wrapper::Transparent::Wrapper.new));

Or in the configuration file:

{ "cliches": [{"name": "turn off LIBNAME", "matcher": "/^LIBNAME .*/", "wrapper": {"type": "transparent"}}] }

We use a wrapper without synchronization (transparent) because we do not plan to change the library loggers' configuration.

Change console application verbosity level

Let's imagine we are writing a console application, and we want to add a --verbose flag for more detailed output. Let's use a dedicated logger for the application's console output instead of a simple say, and change the filter level according to the user's choice:

In Raku:

use LogP6 :configure;

cliche(:name<output>, :matcher<say>, grooves => (
  writer(:pattern('%msg'), :handle($*OUT)),
  filter(:name<verbosity>, :level($info))
));

sub MAIN(Bool :$verbose) {
  filter(:name<verbosity>, :level($debug), :update) if $verbose;
  my $say = get-logger('say');
  $say.info('Greetings');
  $say.debugf('You set verbose flag to %s value', $verbose);
}

In that case, we do not need to use the configuration file. But if you want to, you can remove the line with the cliche creation and add the following configuration file:

{
  "writers": [{ "type": "std", "name": "say", "pattern": "%msg", "handle": { "type": "std", "path": "out" }}],
  "filters": [{ "type": "std", "name": "verbosity", "level": "info"}],
  "cliches": [{ "name": "output", "matcher": "say", "grooves": [ "say", "verbosity" ]}]
}

Conditional log calls

Sometimes you may need to log information that requires additional calculation. It is useful to know whether the log will be written before doing the calculation. The logger's *-on methods were created especially for that. They return a particular object (or Any) with log and logf methods you can use to log with the corresponding log level. Please look at the example below:

use LogP6 :configure;
use JSON::Fast; # provides the to-json/from-json calls used below

# set logger allowed level as INFO
filter(:name(''), :level($info), :update);
my $log = get-logger('condition');
my %map;
my $str;
# ...

# to-json will not be called here, because .debug-on returned Any for now
.log(to-json(%map, :pretty, :sorted-keys)) with $log.debug-on;

# from-json will be called here, because .warn-on returned a little logger
# log will be with WARN level
.log(from-json($str)<key>) with $log.warn-on;

with $log.trace-on {
  # this block will not be executed for now
  my $user-id = retrieve-the-first-user-id-from-db();
  # use logf method to use sprintf-style logs
  .logf('the first user id in the database is %d', $user-id);
}

# Be careful with the '.?' operator. Since its argument is evaluated before the call,
# to-json will be called in any case, but the log will not be written for now.
$log.debug-on.?log(to-json(%map, :pretty, :sorted-keys));

Associate logs with a concrete user

Let's imagine we write a server application. Many users can connect to the server simultaneously and perform actions, which produce log messages in a log file. If some exception is caught and logged, we want to reconstruct the user's execution flow to understand what went wrong. But the relevant records in the log file will be interleaved with logs from other users' actions. In such cases, we need to associate each log entry with some user id. Then we can grep the log file for the user id. Use the MDC for that.

In Raku:

use LogP6 :configure;

cliche(:name<logfile>, :matcher<server-log>, grooves => (
  writer(
    :pattern('[%date{$hh:$mm:$ss:$mss}][user:%mdc{user-id}]: %msg'),
    :handle('logfile.log'.IO.open(:a))),
  level($info)
));

my $server-log = get-logger('server-log');

sub database-read() { # note we do not pass $user in the sub
  $server-log.info('read from database'); # [23:35:43:1295][user:717]: read from database
  # read
  CATCH { default {
    $server-log.error('database fail', :x($_)); # [23:35:44:5432][user:717]: database fail Exception X::AdHoc "database not found" <trace> 
  }}
}

sub enter(User $user) {
  $server-log.mdc-put('user-id', $user.id);
  $server-log.info('connected');     # [23:35:43:1245][user:717]: connected

  database-read();

  $server-log.info('disconnected');  # [23:35:44:9850][user:717]: disconnected
  $server-log.mdc-remove('user-id'); # it is not necessary to remove 'user-id' value from MDC 
}

The same configuration can be written in the configuration file:

{
  "writers": [{ "type": "std", "name": "logfile", "pattern": "[%date{$hh:$mm:$ss:$mss}][user:%mdc{user-id}]: %msg",
    "handle": { "type": "file", "path": "logfile.log" }}],
  "filters": [{ "type": "std", "name": "logfile", "level": "info"}],
  "cliches": [{ "name": "logfile", "matcher": "server-log", "grooves": [ "logfile", "logfile" ]}]
}

Filter log by its content

Imagine we have an application that may write sensitive content, for example user passwords, to log files. We want to drop such sensitive logs. We can use a dedicated sub in the do-before action of the log's filter.

In Raku:

unit module Main;
use LogP6 :configure;

sub drop-passwords($context) is export {...}

cliche(:name<sensible>, :matcher<log>, grooves => (
  writer(:pattern('%msg'), :handle('logfile.log'.IO.open(:a))),
  filter(:level($info), before-check => (&drop-passwords))
));

sub drop-passwords($context) {
  return False if $context.msg ~~ / password /;
  # If you want to remove a password from the log entry instead of dropping it,
  # you can remove the password from the message and store it in the context like:
  #
  # $context.msg-set(remove-password($context.msg));
  True;
}

sub connect(User $user) {
  get-logger('log').infof('user with name %s and password %s connected', $user.id, $user.passwd);
}

The same configuration can be written in the configuration file:

{
  "writers": [{ "type": "std", "name": "writer", "pattern": "%msg",
    "handle": { "type": "file", "path": "logfile.log" }}],
  "filters": [{ "type": "std", "name": "pass-filter", "level": "info",
    "before-check": [{ "require": "Main", "fqn-method": "Main::EXPORT::DEFAULT::&drop-passwords" }]}],
  "cliches": [{ "name": "logfile", "matcher": "server-log", "grooves": [ "writer", "pass-filter" ]}]
}

Write one log in several outputs

Let's imagine we have an application that works with several types of databases - for example, Oracle and SQLite. We want to log the work with the databases. But we want to store Oracle-related logs in both the oracle.log and database.log files, and SQLite-related records only in database.log. In this case, we need a simple logger for the SQLite-related logs and another one (with two grooves) for the Oracle-related logs.

In Raku:

use LogP6 :configure;

set-default-pattern('%msg');
writer(:name<database>, :handle('database.log'.IO.open(:a)));
writer(:name<oracle>,   :handle(  'oracle.log'.IO.open(:a)));
filter(:name<filter>, :level($info));
cliche(:name<oracle>, :matcher<oracle>, grooves => ('database', 'filter', 'oracle', 'filter'));
cliche(:name<sqlite>, :matcher<sqlite>, grooves => ('database', 'filter'));

sub oracle-db-fetch() {
  get-logger('oracle').info('fetch data');
  # fetch
}

sub sqlite-db-fetch() {
  get-logger('sqlite').info('fetch data');
  # fetch
}

The same configuration can be written in the configuration file:

{
  "default-pattern": "%msg",
  "writers": [
    { "name": "database", "type": "std", "handle": { "type": "file", "path": "database.log"}},
    { "name": "oracle",   "type": "std", "handle": { "type": "file", "path": "oracle.log"  }}
  ],
  "filters": [{ "name": "filter", "type": "std", "level": "info" }],
  "cliches": [
    { "name": "oracle", "matcher": "oracle", "grooves": [ "database", "filter", "oracle", "filter" ]},
    { "name": "sqlite", "matcher": "sqlite", "grooves": [ "database", "filter" ]}
  ]
}

Write logs in journald

Let's imagine you want to store logs in the journald service. You can use the LogP6::Writer::Journald module for that. For more information, please look at that writer module's README. Example of configuration:

In Raku:

use LogP6 :configure;

writer(LogP6::WriterConf::Journald.new(
  # name, pattern and auto-exceptions as in the standard writer
  :name<to-journald>, :pattern('%msg'), :auto-exceptions,
  # which additional information must be written
  :use-priority, # write the 'PRIORITY=' field to journald automatically
  :use-mdc       # write all MDC content as fields to journald in 'key=value' format
));

The same configuration can be written in the configuration file:

{"writers": [{
  "type": "custom",
  "require": "LogP6::WriterConf::Journald",
  "fqn-class": "LogP6::WriterConf::Journald",
  "args": {
    "name": "to-journald",
    "pattern": "%msg",
    "auto-exceptions": true,
    "use-priority": true,
    "use-mdc": true
  }
}]}

Write a custom writer handle for your needs

Sometimes you may need to write logs to some exotic place. In this case, you will need to implement your own Writer and its WriterConf. In simple cases, it is enough to implement your own IO::Handle for the standard writer. For example, there is no de-facto must-use library for working with databases yet. That is why there is no particular writer for it in LogP6. But let's try to write one now:

Imagine we decided to use an SQLite database and the DB::SQLite library. We can ask the standard writer to prepare the SQL insert expression for us. Therefore, we only have to write a custom IO::Handle. Fortunately, it is easy in Raku 6.d:

unit module MyDBModule;
use DB;
use DB::SQLite;

class DBHandle is IO::Handle {
  has Str $.filename is required;
  has DB $!db;

  submethod TWEAK() {
    self.encoding: 'utf8';
    # open database file and create table for logging
    $!db = DB::SQLite.new(:$!filename);
    $!db.execute('create table if not exists logs (date text, level text, log text)');
  }

  method WRITE(IO::Handle:D: Blob:D \data --> Bool:D) {
    # decode Blob data and execute. we expect the valid sql dml expression in data.
    $!db.execute(data.decode());
    True;
  }

  method close() { $!db.finish } # close database

  method READ(|) { #`[do nothing] }

  method EOF { #`[do nothing] }
}

That is all we need. Now we can write the LogP6 configuration. In Raku:

use MyDBModule;
use LogP6 :configure;

writer(
  :name<db>,
  # pattern provides us a valid sql dml expression
  :pattern('insert into logs (date, level, log) values (\'%date\', \'%level\', \'%msg%x{ - $name $msg $trace}\')'),
  # handle with corresponding database filename
  handle => MyDBModule::DBHandle.new(:filename<database-log.sqlite>),
  # turn off auto-exceptions because it will corrupt our sql dml expression
  :!auto-exceptions
);
cliche(:name<db>, :matcher<db>, grooves => ('db', level($trace)));

my $log-db = get-logger('db');

$log-db.info('database logging works well');

The same configuration can be written in the configuration file:

{
  "writers": [{
    "type": "std",
    "name": "db",
    "pattern": "insert into logs (date, level, log) values ('%date', '%level', '%msg%x{ - $name $msg $trace}'",
    "handle": {
      "type": "custom",
      "require": "MyDBModule",
      "fqn-class": "MyDBModule::DBHandle",
      "args": {
        "filename": "database-log.sqlite"
      }
    },
    "auto-exceptions": false
  }],
  "filters": [{ "name": "filter", "type": "std", "level": "trace" }],
  "cliches": [{
    "name": "db",
    "matcher": "db",
    "grooves": [ "db", "filter" ]
  }]
}

Rollover log files

Log files tend to grow in size. There is the IO::Handle::Rollover module to prevent such uncontrolled growth. For example, suppose you decide to store only 30MB of logs, split into three files for convenience. For that, you only need to create a custom handle with the module's open routine, like this:

In Raku:

use IO::Handle::Rollover;
my $handle = open("log.txt", :w, :rollover, :file-size<10M>, :3history-size);

The same initialization can be written in the configuration file:

{
...
  "handle": {
    "type": "custom",
    "require": "IO::Handle::Rollover",
    "fqn-method": "IO::Handle::Rollover::EXPORT::DEFAULT::&open",
    "positional": [ "log.txt" ],
    "args": {
      "w": true,
      "rollover": true,
      "file-size": "10M",
      "history-size": 3
    }
  }
...
}

You can use the handle like any other output handle, for example, in LogP6 writers. For more information, see the documentation of the IO::Handle::Rollover module.

BEST PRACTICE

Try to use good traits for your loggers. If you use loggers in your library, then using a common prefix in all your traits is probably the best option. It allows users of your library to manage your loggers easily.

Try to choose a logger trait according to the logger's semantics or location. For example, you can use $?CLASS.^name as a logger trait in any of your classes, or traits like database, user management, and so on.

If you use a logger within a class, make the logger a class attribute, like has $!log = get-logger($?CLASS.^name);. If you use a logger within subroutine logic, make a dedicated sub for retrieving the logger, like sub log() { state $log = get-logger('trait'); }, and then use it like log.info('msg');. This prevents any side effects caused by precompilation.
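
A sketch of both patterns (the class, trait, and sub names are made up):

use LogP6;

class UserService {
  has $!log = get-logger($?CLASS.^name);  # the trait is the class name, 'UserService'

  method create($name) {
    $!log.infof('creating user %s', $name);
  }
}

# dedicated sub: the logger is fetched lazily at runtime, not at precompilation time
sub log() { state $log = get-logger('app'); }

sub do-work() {
  log.info('working');
}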

SEE ALSO

ROADMAP

AUTHOR

Mikhail Khorkov atroxaper@cpan.org

Source can be located at: github. Comments and Pull Requests are welcome.

COPYRIGHT AND LICENSE

Copyright 2020 Mikhail Khorkov

This library is free software; you can redistribute it and/or modify it under the Artistic License 2.0.