pins command
The pins command is a multi-language interpreter used for dealing with the C/C++ code generated by the XOptima module in Maple©.
It has three main functionalities:
1. compile the generated code into a dynamic library and a static library, to be used by custom code written by the user;
2. provide a powerful way of describing problem data as Ruby scripts that are loaded by the library generated in the previous step;
3. allow the user to quickly write a script in the flexible and powerful Ruby language that dynamically links at runtime with the problem dynamic library generated in step 1, controls the execution of the solver, manages the output, and even plans and runs parametric investigations.
Note that the main language provided by pins is Ruby, although the interpreter itself is mruby, an embedded, simplified, and faster version of it. The syntax is compatible with Ruby 1.9, although the interpreter has some notable differences, including the lack of support for Ruby gems (libraries).
The powerful mechanism described in step 3 is enabled by the fact that the dynamic library generated by XOptima also includes C entry points that allow the mruby interpreter to dynamically load the compiled library at runtime (via dlopen). This works regardless of the platform.
Note, though, that while the Ruby scripts are portable amongst different platforms, the compiled dynamic libraries are not, and you will need to recompile the source code for each platform. The code generated by Maple© and XOptima, however, can be moved to a different platform and recompiled without editing.
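Concretely, at runtime a pins script just hands the path of the compiled library to the solver constructor, as also done in the full example script later in this section (the constant Mechatronix::DYLIB_EXT resolves the platform-specific library extension; the path below is illustrative):
require 'mechatronix'
# Load the problem dynamic library at runtime
ocp = Mechatronix::OCPSolver.new "lib/libTrain.#{Mechatronix::DYLIB_EXT}"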
The pins command also embeds a Lua interpreter. This allows the user to also write data files in Lua, although that interpreter currently does not offer the same functionalities as the Ruby one.
How to load data files in PINS
This introduction explains how to properly load data files into a pins script, either with require "data/problem_data.rb" or with a YAML file.
Data in Ruby scripts
This is the default method for loading problem data: whenever you create a new problem in Maple© and generate the files, a template pins script and another Ruby script inside the data folder are created. When executed, the pins script loads the data script with the instruction
require "data/problem_data.rb"
Then, a new instance of the solver is created:
ocp = Mechatronix::OCPSolver.new
To better understand how it works, you must remember that the Mechatronix::OCPSolver.new method has an optional argument, which must be a Hash containing the problem data, with the appropriate keys. If this argument is missing, the initialization searches for available data in the globally visible Mechatronix.content variable.
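A minimal sketch of the two call forms (the data hash and its keys below are purely illustrative):
# Passing the problem data explicitly as a Hash:
my_data = { :InfoLevel => 4, :Solver => { :max_iter => 300 } }
ocp = Mechatronix::OCPSolver.new my_data
# Without an argument, the solver looks up the data previously stored
# in the globally visible Mechatronix.content container:
ocp = Mechatronix::OCPSolver.new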
Whenever you are using the syntax
mechatronix do |data|
data.FirstPar = 10
data.SecondPar = 20
# ...
end
you are actually using some convenience methods for writing keys into the globally visible Mechatronix.content (an instance of Mechatronix::Container, a child class of Hash), so that, following the previous example, the key Mechatronix.content.FirstPar has a value of 10 anywhere in your scripts.
This apparently cumbersome procedure allows you to do something that would otherwise be impossible: sharing values between different files. In fact, variables defined at the top level have a file-local scope: if you define any variable in file B required by file A, that variable is only visible in B and not in A.
So, thanks to this approach, values written into Mechatronix.content in file B are perfectly visible from within file A (an executable script) that requires file B (a data file). The mechatronix do |d| ... end construct (note the lowercase mechatronix) is thus a convenience method that ensures proper access to the keys of Mechatronix.content (which, by default, must start with an uppercase character).
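A minimal sketch of this sharing mechanism (the file names and the key FirstPar are purely illustrative):
# --- file_b.rb (a data file) ---
scale = 2                      # file-local: invisible outside this file
mechatronix do |data|
  data.FirstPar = 10 * scale   # stored in the global Mechatronix.content
end
# --- file_a.rb (the executable script) ---
require "./file_b"
puts Mechatronix.content.FirstPar   # prints 20
# puts scale                        # NameError: top-level locals are not shared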
Complex case: multiple data files
The mechanism illustrated above also allows you to split the problem data structure over multiple files, giving you fine-grained control. The main executable script can thus require a first, top-level data script like the following:
include Mechatronix
# Auxiliary values: these values have a file-local scope,
# i.e. they are only visible here!
epsi_max = 0.01
tol_max = 0.01
ubMax = 2
uaMax = 10
# Setting data into Mechatronix.content
mechatronix do |data|
# setup solver
data.Solver = {
:max_iter => 300,
:tolerance => 1e-9,
}
# Boundary Conditions
data.BoundaryConditions = {
:initial_x => SET,
:initial_v => SET,
:final_x => SET,
:final_v => SET,
}
data.Parameters = {
# Model Parameters
:alpha => 0.3,
:beta => 0.14,
:gm => 0.16,
:uaMax => 10,
:ubMax => 2,
# Guess Parameters
# Boundary Conditions
:v_f => 0,
:v_i => 0,
:x_f => 6,
:x_i => 0,
# Post Processing Parameters
# User Function Parameters
# Continuation Parameters
:tol_max => 0.01,
:tol_min => 0.001,
:epsi_max => 0.01,
:epsi_min => 0.0001,
# Constraints Parameters
}
# Controls
data.Controls = {}
data.Controls[:uaControl] = {
:type => U_COS_LOGARITHMIC,
:epsilon => epsi_max,
:tolerance => tol_max
}
data.Controls[:ubControl] = {
:type => U_COS_LOGARITHMIC,
:epsilon => epsi_max,
:tolerance => tol_max
}
end
require_relative("./mesh.rb",__FILE__)
In turn, on its very last line, this top-level data script requires another data script, mesh.rb, which describes the mesh and is located in the same folder as the calling data file:
mechatronix do |data|
# User defined classes initialization
data.Mesh =
{
:s0 => 0,
:segments => [
{
:n => 25,
:length => 0.25,
},
{
:n => 3000,
:length => 0.75,
},
{
:n => 100,
:length => 3.8,
},
],
}
end
Note that the require command in pins (as well as in Ruby) needs a path relative to the main executable script. For this reason, if you want to require one or more other scripts from the main data file, you should use the require_relative pins command, which starts searching for the file to be loaded from the path of the current script, rather than from that of the main executable.
Data in YAML files
Sometimes, you might prefer to serialize the data structure in a neutral format such as YAML. To do so, you can serialize
Mechatronix.content
and write the result to a file quite easily:
File.open("data.yaml", 'w') do |file|
YAML.dump(Mechatronix.content.to_hash, file)
end
This creates a YAML-formatted file that you can edit or pass to other scripts and languages (there are YAML interfaces for C/C++, MATLAB, Python, etc.).
To load the YAML data file back into the Mechatronix::OCPSolver, you then need to do:
my_data = Mechatronix::Container.new
my_data.content = File.open("data.yaml") {|file| YAML.load(file)}
ocp = Mechatronix::OCPSolver.new my_data
Main functionalities
Loading Ruby scripts
pins scripts can load other Ruby files containing class definitions or providing problem data via the Mechatronix.content mechanism.
Loading a file is performed with the require command:
require "./lib/another_script" # .rb extension is optional!
where the argument to require must be either an absolute path or a path relative to the main executable script. So if script A requires script B, which in turn needs to require script C, the path to C in B must be relative to A.
pins comes with a utility function that always loads a script at a path relative to the current script, rather than to the main executable script. This function is require_relative("./another_script",__FILE__).
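As a hedged illustration, assume the following hypothetical layout, with a main script that requires a data script, which in turn requires a mesh script:
# Hypothetical layout:
#   main.rb         (the executable pins script)
#   data/data.rb    (required by main.rb with: require "./data/data")
#   data/mesh.rb    (required by data/data.rb)
# Inside data/data.rb, a plain require would still need a path
# relative to main.rb:
#   require "./data/mesh"
# whereas require_relative resolves it from data/data.rb itself:
require_relative("./mesh", __FILE__)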
PINS standard libraries
A number of utility libraries can be loaded, or required. These libraries are:
require "build": supports the automatic build of C++ files
require "colorize": prints coloured strings on the terminal
require "parametric_analysis": automates parametric analyses
Note that requiring these libraries does not need a full path, just their name. The user can customize some of these libraries and add more of their own to be loaded by name only. To do that, custom Ruby files can be placed in the special system folder, whose path is system-dependent and can be obtained by issuing pins -i and looking for the "pins library path:" string.
That library folder also contains a file named autoload.rb. That file is automatically loaded upon launching pins, so the user can customize it by adding require commands for each library that ought to be automatically loaded.
For example, if you want to have the colorize functionality always available, just add require "colorize" at the end of the autoload.rb file.
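A customized autoload.rb could then end like this (a sketch; the file shipped with the install may already contain other require lines):
# autoload.rb -- loaded automatically every time pins is launched
# ... lines shipped with the install ...
require "colorize"   # user addition: coloured output always available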
Note that the autoload.rb script will be reset to its default after each update of the Mechatronix install package, so remember to back it up before updates!
Example script
The following example corresponds to a simplified version of the basic template generated by XOptima.
#!/usr/bin/env pins
require 'mechatronix'
# Set the model name
problem_name = "Train"
# Compile the scripts, unless the library already exists
# the command line flag '-f' forces recompilation
if ! File.exist?("lib/lib#{problem_name}.dylib") || ARGV.include?('-f') then
require "build"
compiler = MXBuilder.new(problem_name)
compiler.build
end
# load the data file
require './data/Train_Data.rb'
# Link the library
ocp = Mechatronix::OCPSolver.new "lib/lib#{problem_name}.#{Mechatronix::DYLIB_EXT}"
# Setup the solver
ocp.setup
# Calculate the OCP solution. Result is into ocp.ocp_solution
ocp.solve
# Write results
unless ocp.ocp_solution[:Error] then
if ocp.ocp_solution[:converged] then
ocp.write_ocp_solution("data/#{problem_name}_OCP_result.txt")
else
ocp.write_ocp_solution("data/#{problem_name}_OCP_result_NOT_CONVERGED.txt")
end
else
ocp.write_ocp_solution("data/#{problem_name}_OCP_result_NOT_CONVERGED.txt")
puts ocp.ocp_solution[:Error]
end
figlet "All Done Folks!"
Example data file
The following is a typical data file generated by the XOptima package in Maple©. It allows a flexible and maintainable definition of model parameters and solver configuration. You can freely intermix code and data, define intermediate variables and set model parameters to computed values.
include Mechatronix
# Auxiliary values
epsi_max = 0.01
tol_max = 0.01
ubMax = 2
uaMax = 10
mechatronix do |data|
# LU factorization method
data.LU_method = LU_automatic
# Enable doctor
data.Doctor = false
# Level of message
data.InfoLevel = 4
# setup solver
data.Solver = {
:max_iter => 300,
:tolerance => 1e-9,
}
# Boundary Conditions
data.BoundaryConditions = {
:initial_x => SET,
:initial_v => SET,
:final_x => SET,
:final_v => SET,
}
data.Parameters = {
# Model Parameters
:alpha => 0.3,
:beta => 0.14,
:gm => 0.16,
:uaMax => 10,
:ubMax => 2,
# Guess Parameters
# Boundary Conditions
:v_f => 0,
:v_i => 0,
:x_f => 6,
:x_i => 0,
# Post Processing Parameters
# User Function Parameters
# Continuation Parameters
:tol_max => 0.01,
:tol_min => 0.001,
:epsi_max => 0.01,
:epsi_min => 0.0001,
# Constraints Parameters
}
# functions mapped on objects
data.MappedObjects = {}
# Controls
# Penalty type controls: U_QUADRATIC, U_QUADRATIC2, U_PARABOLA, U_CUBIC
# Barrier type controls: U_LOGARITHMIC, U_COS_LOGARITHMIC, TAN2, U_HYPERBOLIC
data.Controls = {}
data.Controls[:uaControl] = {
:type => U_COS_LOGARITHMIC,
:epsilon => epsi_max,
:tolerance => tol_max
}
data.Controls[:ubControl] = {
:type => U_COS_LOGARITHMIC,
:epsilon => epsi_max,
:tolerance => tol_max
}
data.Constraints = {}
# Constraint1D: none defined
# Constraint2D: none defined
# User defined classes initialization
data.Mesh =
{
:s0 => 0,
:segments => [
{
:n => 25,
:length => 0.25,
},
{
:n => 3000,
:length => 0.75,
},
{
:n => 100,
:length => 3.8,
},
],
} ;
end
# user function classes initializations
# EOF