more graphs for evo1d

This commit is contained in:
Nicole Dresselhaus 2017-10-01 20:06:41 +02:00
parent c0399a9499
commit 3ce0a99591
Signed by: Drezil
GPG Key ID: 057D94F356F41E25
25 changed files with 918 additions and 90 deletions

View File

@ -50,3 +50,28 @@
publisher = {Springer-Verlag},
address = {Berlin, Heidelberg},
}
@book{golub2012matrix,
  title = {Matrix computations},
  author = {Golub, Gene H and Van Loan, Charles F},
  volume = {3},
  year = {2012},
  publisher = {JHU Press},
}
@article{weise2012evolutionary,
  title = {Evolutionary Optimization: Pitfalls and Booby Traps},
  author = {Weise, Thomas and Chiong, Raymond and Tang, Ke},
  journal = {J. Comput. Sci. \& Technol.},
  volume = {27},
  number = {5},
  year = {2012},
  url = {http://jcst.ict.ac.cn:8080/jcst/EN/article/downloadArticleFile.do?attachType=PDF\&id=9543}
}
@inproceedings{thorhauer2014locality,
  title = {On the locality of standard search operators in grammatical evolution},
  author = {Thorhauer, Ann and Rothlauf, Franz},
  booktitle = {International Conference on Parallel Problem Solving from Nature},
  pages = {465--475},
  year = {2014},
  organization = {Springer},
  url = {https://www.lri.fr/~hansen/proceedings/2014/PPSN/papers/8672/86720465.pdf}
}

View File

@ -22,7 +22,7 @@
\setcounter{ContinuedFloat}{0}
\setcounter{float@type}{16}
\setcounter{lstnumber}{1}
\setcounter{NAT@ctr}{5}
\setcounter{NAT@ctr}{8}
\setcounter{AM@survey}{0}
\setcounter{r@tfl@t}{0}
\setcounter{subfigure}{0}

View File

@ -4,14 +4,16 @@ fontsize: 11pt
\chapter*{How to read this Thesis}
To guide the reader through the nomenclature used in the formulas, we prepend this chapter.
Unless otherwise noted, the following holds:
- lowercase letters $x,y,z$
refer to real variables and represent a point in 3D-Space.
- lowercase letters $u,v,w$
refer to real variables between $0$ and $1$ used as coefficients in a 3D B-Spline grid.
- other lowercase letters
refer to other scalar (real) variables.
- lowercase **bold** letters (e.g. $\vec{x},\vec{y}$)
@ -21,12 +23,13 @@ Unless otherwise noted the following holds:
# Introduction
In this Master's Thesis we try to extend a previously proposed concept of
predicting the evolvability of \acf{FFD} given a
Deformation-Matrix\cite{anrichterEvol}. In the original publication the author
used randomly sampled points weighted with \acf{RBF} to deform the mesh and
defined three different criteria that can be calculated prior to using an
evolutionary optimization algorithm to assess the quality and potential of such
an optimization.
We will replicate the same setup on the same meshes but use \acf{FFD} instead of
\acf{RBF} to create a deformation and evaluate if the evolution-criteria still
@ -35,12 +38,14 @@ work as a predictor given the different deformation scheme.
## What is \acf{FFD}?
First of all we have to establish how an \ac{FFD} works and why it is a good
tool for deforming meshes in the first place. For simplicity we only summarize
the 1D case from \cite{spitzmuller1996bezier} here and go into the extension to
the 3D case in chapter \ref{3dffd}.
Given an arbitrary number of points $p_i$ along a line, we map a scalar
value $\tau_i \in [0,1[$ to each point with $\tau_i < \tau_{i+1} \forall i$.
Given a degree $d$ of the target polynomial we define the basis functions
$N_{i,d,\tau_i}(u)$ as follows:
\begin{equation} \label{eqn:ffd1d1}
N_{i,0,\tau}(u) = \begin{cases} 1, & u \in [\tau_i, \tau_{i+1}[ \\ 0, & \mbox{otherwise} \end{cases}
@ -52,12 +57,17 @@ and
N_{i,d,\tau}(u) = \frac{u-\tau_i}{\tau_{i+d}-\tau_i} N_{i,d-1,\tau}(u) + \frac{\tau_{i+d+1} - u}{\tau_{i+d+1}-\tau_{i+1}} N_{i+1,d-1,\tau}(u)
\end{equation}
If we now multiply every $p_i$ with the corresponding $N_{i,d,\tau_i}(u)$ we get
the contribution of each point $p_i$ to the final curve-point parameterized only
by $u \in [0,1[$. As can be seen from \eqref{eqn:ffd1d2} we only access points
$[i..i+d]$ for any given $i$^[one more for each recursive step.], which gives
us, in combination with choosing $p_i$ and $\tau_i$ in order, only a local
influence of $d+1$ points.
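To make the recursion tangible, the following minimal Python sketch evaluates the basis functions and one curve point directly from \eqref{eqn:ffd1d1} and \eqref{eqn:ffd1d2}. It is purely illustrative and not the implementation used in this thesis; the names `basis` and `curve_point` are placeholders.

```python
def basis(i, d, tau, u):
    # Cox-de Boor recursion for N_{i,d,tau}(u); the guards catch repeated knots.
    if d == 0:
        return 1.0 if tau[i] <= u < tau[i + 1] else 0.0
    left = right = 0.0
    if tau[i + d] != tau[i]:
        left = (u - tau[i]) / (tau[i + d] - tau[i]) * basis(i, d - 1, tau, u)
    if tau[i + d + 1] != tau[i + 1]:
        right = ((tau[i + d + 1] - u) / (tau[i + d + 1] - tau[i + 1])
                 * basis(i + 1, d - 1, tau, u))
    return left + right

def curve_point(p, tau, d, u):
    # s(u) = sum_i N_{i,d,tau}(u) * p_i; only d+1 of the p_i contribute,
    # which is exactly the local influence described above.
    # Requires len(tau) >= len(p) + d + 1.
    return sum(basis(i, d, tau, u) * p[i] for i in range(len(p)))
```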
We can even differentiate this equation straightforwardly for an arbitrary
$N$^[*Warning:* in the case of $d=1$ the recursion formula yields a $0$
denominator, but $N$ is also $0$. The right solution for this case is a
derivative of $0$.]:
$$\frac{\partial}{\partial u} N_{i,d,\tau}(u) = \frac{d}{\tau_{i+d} - \tau_i} N_{i,d-1,\tau}(u) - \frac{d}{\tau_{i+d+1} - \tau_{i+1}} N_{i+1,d-1,\tau}(u)$$
@ -65,30 +75,42 @@ For a B-Spline
$$s(u) = \sum_{i} N_{i,d,\tau_i}(u) p_i$$
these derivations yield $\frac{\partial^d}{\partial u^d} s(u) = 0$.
Another interesting property of these recursive polynomials is that they are
continuous (given $d \ge 1$), as every $p_i$ gets blended in linearly between
$\tau_i$ and $\tau_{i+d}$ and out linearly between $\tau_{i+1}$ and
$\tau_{i+d+1}$, as can be seen from the two coefficients in every step of the
recursion.
### Why is \ac{FFD} a good deformation function?
The usage of \ac{FFD} as a tool for manipulating meshes follows directly from
the properties of the polynomials and the correspondence to the control points.
Having only a few control points gives the user a nicer high-level interface, as
she only needs to move these points and the model follows in an intuitive
manner. The deformation is smooth as the underlying polynomial is smooth as well
and affects as many vertices of the model as needed. Moreover the changes are
always local, so there is no risk of a change that the user cannot immediately
see.
But there are also disadvantages of this approach. The user loses the ability
to directly influence vertices, and even seemingly simple tasks such as creating
a plateau can be difficult to
achieve\cite[chapter~3.2]{hsu1991dmffd}\cite{hsu1992direct}.
These disadvantages led to the formulation of
\acf{DM-FFD}\cite[chapter~3.3]{hsu1991dmffd} in which the user directly
interacts with the surface-mesh. All interactions will be applied
proportionally to the control-points that make up the parametrization of the
interaction-point itself, yielding a smooth deformation of the surface *at* the
surface without seemingly arbitrary scattered control-points. Moreover this
increases the efficiency of an evolutionary optimization\cite{Menzel2006}, which
we will use later on.
But this approach also has downsides as can be seen in
\cite[figure~7]{hsu1991dmffd}\todo{insert figure here?}, as the tessellation of
the invisible grid has a major impact on the deformation itself.
All in all \ac{FFD} and \ac{DM-FFD} are still good ways to deform a
high-polygon mesh despite the downsides.
## What is evolutionary optimization?
@ -96,26 +118,83 @@ All in all \ac{FFD} and \ac{DM-FFD} are still good ways to deform a high-polygon
The main advantage of evolutionary algorithms is the ability to find optima of
general functions just with the help of a given error-function (or
fitness-function in this domain). This avoids the general pitfalls of
gradient-based procedures, which often target the same error-function as an
evolutionary algorithm, but can get stuck in local optima.
This is mostly due to the fact that a gradient-based procedure has only one
point of observation from which it evaluates the next steps, whereas an
evolutionary strategy starts with a population of guessed solutions. Because an
evolutionary strategy modifies the solutions randomly, keeps the best solutions
and purges the worst, it can also pursue multiple different hypotheses at the
same time, where the local optima die out in the face of other, better
candidates.
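The following Python sketch illustrates this population-based idea in its simplest form; it is not the algorithm or parameter set used in this thesis, and `fitness`, `mu`, `lam` and `sigma` are placeholder names.

```python
import random

def evolution_strategy(fitness, dim, mu=10, lam=40, sigma=0.1, generations=100):
    # start with a population of randomly guessed solutions
    population = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        # create offspring by randomly mutating randomly chosen parents
        offspring = [[x + random.gauss(0, sigma) for x in random.choice(population)]
                     for _ in range(lam)]
        # keep the best solutions, purge the worst
        population = sorted(population + offspring, key=fitness)[:mu]
    return population[0]

# usage: minimize a simple quadratic error-function in 5 dimensions
best = evolution_strategy(lambda v: sum(x * x for x in v), dim=5)
```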
If an analytic best solution exists (e.g. because the error-function is convex)
an evolutionary algorithm is not the right choice. Although both converge to the
same solution, the analytic one is usually faster. But in reality many problems
have no analytic solution, because the problem is not convex. Here evolutionary
optimization has one more advantage, as you get bad solutions fast, which refine
over time.
## Criteria for the evolvability of linear deformations
### Variability
In \cite{anrichterEvol} variability is defined as
$$V(\vec{U}) := \frac{\textrm{rank}(\vec{U})}{n},$$
whereby $\vec{U}$ is the $m \times n$ deformation-matrix used to map the $m$
control points onto the $n$ vertices.
Given $n = m$, an identical number of control-points and vertices, this
quotient will be $=1$ if all control points are independent of each other and
the solution is to trivially move every control-point onto a target-point.
In practice the value of $V(\vec{U})$ is typically $\ll 1$, because there are
only few control-points for many vertices, so $m \ll n$.
Additionally in our setup we connect neighbouring control-points in a grid, so
each control point is not independent, but typically depends on $4^d$
control-points for a $d$-dimensional control mesh.
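As a minimal sketch (not the code used in this thesis), the criterion can be computed directly from the rank of the deformation-matrix; the orientation of $\vec{U}$ and the name `variability` simply follow the definition above.

```python
import numpy as np

def variability(U: np.ndarray, n: int) -> float:
    # V(U) = rank(U) / n, with n the number of mesh vertices
    return np.linalg.matrix_rank(U) / n
```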
### Regularity
Regularity is defined\cite{anrichterEvol} as
$$R(\vec{U}) := \frac{1}{\kappa(\vec{U})} = \frac{\sigma_{min}}{\sigma_{max}}$$
where $\sigma_{min}$ and $\sigma_{max}$ are the smallest and largest singular
values of the deformation-matrix $\vec{U}$.
As we deform the given object only based on the parameters as $\vec{p} \mapsto
f(\vec{x} + \vec{U}\vec{p})$, this makes sure that $\|\vec{U}\vec{p}\| \propto
\|\vec{p}\|$ when $\kappa(\vec{U}) \approx 1$. The inversion of $\kappa(\vec{U})$
is only performed to map the criterion-range to $[0..1]$, where $1$ is the
optimal value and $0$ is the worst value.
This criterion should be characteristic for numeric stability on the one
hand\cite[chapter 2.7]{golub2012matrix} and for the convergence speed of
evolutionary algorithms on the other hand\cite{anrichterEvol}, as it is tied to
the notion of locality\cite{weise2012evolutionary,thorhauer2014locality}.
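A corresponding minimal sketch (again not the thesis implementation) obtains $\sigma_{min}$ and $\sigma_{max}$ from the singular value decomposition of $\vec{U}$:

```python
import numpy as np

def regularity(U: np.ndarray) -> float:
    # R(U) = 1/kappa(U) = sigma_min / sigma_max
    sigma = np.linalg.svd(U, compute_uv=False)  # singular values, descending
    return sigma[-1] / sigma[0]
```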
### Improvement Potential
In contrast to the general nature of variability and regularity, which are
agnostic of the fitness-function at hand, the third criterion should reflect a
notion of potential.
As during optimization some kind of gradient $g$ is available to suggest a
direction worth pursuing, we use this to guess how much change can be achieved
in the given direction.
The definition for an improvement potential $P$ is\cite{anrichterEvol}:
$$
P(\vec{U}) := 1 - \|(\vec{1} - \vec{U}\vec{U}^+)\vec{G}\|^2_F
$$
given some approximate $n \times d$ fitness-gradient $\vec{G}$, normalized to
$\|\vec{G}\|_F = 1$, whereby $\|\cdot\|_F$ denotes the Frobenius-Norm.
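A minimal sketch of this criterion (not the thesis code) uses the Moore-Penrose pseudo-inverse; it assumes $\vec{G}$ has as many rows as $\vec{U}$, so that the product is defined.

```python
import numpy as np

def improvement_potential(U: np.ndarray, G: np.ndarray) -> float:
    # P(U) = 1 - ||(1 - U U^+) G||_F^2, with G normalized to ||G||_F = 1
    G = G / np.linalg.norm(G)
    residual = (np.eye(U.shape[0]) - U @ np.linalg.pinv(U)) @ G
    return 1.0 - np.linalg.norm(residual) ** 2
```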
# Implementation of \acf{FFD}
\label{3dffd}
@ -124,41 +203,43 @@ the problem is not convex. Here evolutional optimization has one more advantage
- Why Newton optimization?
- What follows from this?
## Test Scenario: 1D Function Approximation
### Optimization Scenario
- plane -> template fit
### Matching in 1D
- Trivial
### Particularities of the Evaluation
- the analytic solution is the single best one
- is the result also constant under noise?
- add a normalized one-vector to the gradient
- a cone emerges
## Test Scenario: 3D Function Approximation
### Optimization Scenario
- ball to Mario
### Matching in 3D
- alternating optimization
### Particularities of the Optimization
- the analytic solution is only valid up to the optimization of the first points
- the criteria are still good
# Evaluation of Scenarios
## Results of 1D Function Approximation
## Spearman/Pearson Metrics
@ -166,7 +247,7 @@ the problem is not convex. Here evolutional optimization has one more advantage
- Why should this interest us?
- Why is monotonicity sufficient?
- Did we show this?
- Statistics, figures, blah!
# Conclusion

Binary file not shown.

View File

@ -151,8 +151,8 @@ predicting the evolvability of \acf{FFD} given a
Deformation-Matrix\cite{anrichterEvol}. In the original publication the
author used randomly sampled points weighted with \acf{RBF} to deform the
mesh and defined three different criteria that can be calculated prior
to using an evolutionary optimization algorithm to assess the quality and
potential of such an optimization.
We will replicate the same setup on the same meshes but use \acf{FFD}
instead of \acf{RBF} to create a deformation and evaluate if the
@ -265,13 +265,64 @@ in reality many problems have no analytic solution, because the problem
is not convex. Here evolutionary optimization has one more advantage as
you get bad solutions fast, which refine over time.
\section{Criteria for the evolvability of linear
deformations}\label{criteria-for-the-evolvability-of-linear-deformations}
\subsection{Variability}\label{variability}
In \cite{anrichterEvol} variability is defined as
\[V(\vec{U}) := \frac{\textrm{rank}(\vec{U})}{n},\] whereby \(\vec{U}\)
is the \(m \times n\) deformation-matrix used to map the \(m\) control
points onto the \(n\) vertices.
Given \(n = m\), an identical number of control-points and vertices,
this quotient will be \(=1\) if all control points are independent of
each other and the solution is to trivially move every control-point
onto a target-point.
In practice the value of \(V(\vec{U})\) is typically \(\ll 1\), because
there are only few control-points for many vertices, so \(m \ll n\).
Additionally in our setup we connect neighbouring control-points in a
grid, so each control point is not independent, but typically depends on
\(4^d\) control-points for a \(d\)-dimensional control mesh.
\subsection{Regularity}\label{regularity}
Regularity is defined\cite{anrichterEvol} as
\[R(\vec{U}) := \frac{1}{\kappa(\vec{U})} = \frac{\sigma_{min}}{\sigma_{max}}\]
where \(\sigma_{min}\) and \(\sigma_{max}\) are the smallest and
largest singular values of the deformation-matrix \(\vec{U}\).
As we deform the given object only based on the parameters as
\(\vec{p} \mapsto f(\vec{x} + \vec{U}\vec{p})\), this makes sure that
\(\|\vec{U}\vec{p}\| \propto \|\vec{p}\|\) when \(\kappa(\vec{U}) \approx 1\).
The inversion of \(\kappa(\vec{U})\) is only performed to map the
criterion-range to \([0..1]\), where \(1\) is the optimal value and
\(0\) is the worst value.
This criterion should be characteristic for numeric stability on the one
hand\cite[chapter 2.7]{golub2012matrix} and for the convergence speed of
evolutionary algorithms on the other hand\cite{anrichterEvol}, as it is
tied to the notion of
locality\cite{weise2012evolutionary,thorhauer2014locality}.
\subsection{Improvement Potential}\label{improvement-potential}
In contrast to the general nature of variability and regularity, which
are agnostic of the fitness-function at hand, the third criterion should
reflect a notion of potential.
As during optimization some kind of gradient \(g\) is available to
suggest a direction worth pursuing, we use this to guess how much change
can be achieved in the given direction.
The definition for an improvement potential \(P\)
is\cite{anrichterEvol}: \[
P(\vec{U}) := 1 - \|(\vec{1} - \vec{U}\vec{U}^+)\vec{G}\|^2_F
\] given some approximate \(n \times d\) fitness-gradient \(\vec{G}\),
normalized to \(\|\vec{G}\|_F = 1\), whereby \(\|\cdot\|_F\) denotes the
Frobenius-Norm.
\chapter{Main Part}\label{hauptteil}
@ -372,7 +423,7 @@ Optimierung}\label{besonderheiten-der-optimierung}
\item
Did we show this?
\item
Statistics, figures, blah!
\end{itemize}
\chapter{Conclusion}\label{schluss}

View File

@ -87,13 +87,13 @@
\renewcommand\lq{\text{\textgravedbl}\!\!}\renewcommand\rq{\!\!\text{\textacutedbl}}
% ##### tables #####
\newcommand\rl[1]{\multicolumn{1}{r|}{#1}} % item|
\renewcommand\ll[1]{\multicolumn{1}{|r}{#1}} % |item
% \newcommand\cc[1]{\multicolumn{1}{|c|}{#1}} % |item|
\newcommand\lc[2]{\multicolumn{#1}{|c|}{#2}} %
\newcommand\cc[1]{\multicolumn{1}{c}{#1}}
% \renewcommand{\arraystretch}{1.2} % make table rows a bit taller
\newcommand\m[2]{\multirow{#1}{*}{$#2$}}
% \newcommand\rl[1]{\multicolumn{1}{r|}{#1}} % item|
% \renewcommand\ll[1]{\multicolumn{1}{|r}{#1}} % |item
% % \newcommand\cc[1]{\multicolumn{1}{|c|}{#1}} % |item|
% \newcommand\lc[2]{\multicolumn{#1}{|c|}{#2}} %
% \newcommand\cc[1]{\multicolumn{1}{c}{#1}}
% % \renewcommand{\arraystretch}{1.2} % make table rows a bit taller
% \newcommand\m[2]{\multirow{#1}{*}{$#2$}}
% ##### text symbols #####
% \newcommand\subdot[1]{\lower0.5em\hbox{$\stackrel{\displaystyle #1}{.}$}}

View File

@ -1,7 +1,7 @@
*******************************************************************************
Wed Aug 30 22:31:49 2017
Sun Oct 1 19:59:33 2017
FIT: data read from "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 2:5
@ -47,7 +47,7 @@ b -0.996 1.000
*******************************************************************************
Wed Aug 30 22:31:49 2017
Sun Oct 1 19:59:33 2017
FIT: data read from "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:5
@ -93,7 +93,7 @@ bb -1.000 1.000
*******************************************************************************
Wed Aug 30 22:31:49 2017
Sun Oct 1 19:59:33 2017
FIT: data read from "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:6

View File

@ -2,13 +2,19 @@ set datafile separator ","
f(x)=a*x+b
fit f(x) "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 2:5 via a,b
set terminal png
set xlabel 'regularity'
set ylabel 'steps'
set output "20170830-evolution1D_5x5_100Times-added_one_regularity-vs-steps.png"
plot "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 2:5 title "regularity vs. steps", f(x) lc rgb "black"
plot "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 2:5 title "data", f(x) title "lin. fit" lc rgb "black"
g(x)=aa*x+bb
fit g(x) "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:5 via aa,bb
set xlabel 'improvement potential'
set ylabel 'steps'
set output "20170830-evolution1D_5x5_100Times-added_one_improvement-vs-steps.png"
plot "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:5 title "improvement potential vs. steps", g(x) lc rgb "black"
plot "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:5 title "data", g(x) title "lin. fit" lc rgb "black"
h(x)=aaa*x+bbb
fit h(x) "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:6 via aaa,bbb
set xlabel 'improvement potential'
set ylabel 'evolution error'
set output "20170830-evolution1D_5x5_100Times-added_one_improvement-vs-evo-error.png"
plot "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:6 title "improvement potential vs. evolution error", h(x) lc rgb "black"
plot "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:6 title "data", h(x) title "lin. fit" lc rgb "black"

Binary file not shown.


Binary file not shown.


Binary file not shown.


View File

@ -0,0 +1,201 @@
"Least squares",regularity,variability,improvement,steps,"Evolution error",sigma
183.763,0.0179571,0.00111111,0.938632,228,192.44,0.032644
229.099,0.0189281,0.00111111,0.923492,96,240.171,0.0562716
238.479,0.0215758,0.00111111,0.920359,195,249.883,0.0272032
188.152,0.0144312,0.00111111,0.937166,256,197.529,0.0244736
191.586,0.0207835,0.00111111,0.936019,156,201.143,0.0235659
202.916,0.0168021,0.00111111,0.932236,220,212.978,0.0318143
178.439,0.0180162,0.00111111,0.94041,259,187.236,0.0269287
229.734,0.0238245,0.00111111,0.92328,203,241.13,0.0281426
211.994,0.0192363,0.00111111,0.929204,211,222.511,0.0152364
245.717,0.0185166,0.00111111,0.917942,154,256.592,0.0379309
225.543,0.0204032,0.00111111,0.92468,160,236.693,0.0243299
211.533,0.0207268,0.00111111,0.929358,135,221.694,0.0377055
213.806,0.022426,0.00111111,0.928599,188,224.469,0.0237864
223.483,0.0172601,0.00111111,0.925368,203,234.382,0.0196412
231.924,0.0177059,0.00111111,0.922549,209,243.433,0.0259235
184.824,0.0162623,0.00111111,0.938278,242,194.032,0.0222468
223.766,0.0179083,0.00111111,0.925273,207,234.567,0.0301655
203.27,0.0161445,0.00111111,0.932118,220,213.401,0.0273719
193.158,0.0166659,0.00111111,0.935494,221,201.428,0.0414548
197.851,0.0185201,0.00111111,0.933927,195,207.224,0.0362339
214.041,0.0178594,0.00111111,0.928521,139,224.585,0.0340444
211.533,0.018782,0.00111111,0.929358,132,221.895,0.0369246
185.356,0.0195083,0.00111111,0.9381,222,194.499,0.0184425
198.81,0.0201802,0.00111111,0.933607,242,208.589,0.0259606
205.265,0.0202697,0.00111111,0.931451,184,215.456,0.0190532
222.952,0.019699,0.00111111,0.925545,178,234.028,0.0261726
205.276,0.0193633,0.00111111,0.931448,191,214.716,0.02818
230.047,0.0214453,0.00111111,0.923175,216,241.431,0.0234518
196.924,0.0190251,0.00111111,0.934237,183,206.599,0.0212354
195.184,0.0181113,0.00111111,0.934818,178,204.769,0.0299481
194.787,0.018323,0.00111111,0.934951,212,204.23,0.0294953
212.533,0.0182128,0.00111111,0.929024,231,222.938,0.0412254
176.623,0.0158336,0.00111111,0.941017,209,185.355,0.0266792
209.136,0.0204612,0.00111111,0.930159,91,218.429,0.0579325
209.848,0.0201986,0.00111111,0.929921,163,219.886,0.0336554
209.515,0.0191636,0.00111111,0.930032,185,219.696,0.0286176
193.943,0.0178079,0.00111111,0.935233,199,203.581,0.0232988
216.239,0.0212003,0.00111111,0.927787,151,226.759,0.0189421
211.37,0.0162232,0.00111111,0.929413,173,221.819,0.0171682
224.988,0.0196037,0.00111111,0.924865,112,236.226,0.0602406
207.221,0.0185908,0.00111111,0.930798,157,217.553,0.0314001
203.513,0.0172686,0.00111111,0.932037,188,213.564,0.0328506
186.642,0.0195431,0.00111111,0.937671,178,195.759,0.0419734
236.265,0.0204047,0.00111111,0.921099,106,247.931,0.0409088
222.669,0.0231925,0.00111111,0.925639,162,233.713,0.0215862
223.48,0.0208053,0.00111111,0.925369,190,234.013,0.0261078
213.098,0.0182925,0.00111111,0.928835,227,223.628,0.0354102
185.847,0.0198307,0.00111111,0.937936,217,194.983,0.0198853
215.818,0.0211463,0.00111111,0.927927,212,226.437,0.0327274
203.914,0.0228368,0.00111111,0.931903,219,214.086,0.0166239
177.639,0.0183023,0.00111111,0.940677,243,186.419,0.0388416
187.065,0.0182927,0.00111111,0.937529,217,196.416,0.0185577
224.069,0.0206082,0.00111111,0.925172,240,235.058,0.0214527
233.059,0.020766,0.00111111,0.922169,249,244.587,0.0304497
243.252,0.0197131,0.00111111,0.918766,218,255.376,0.0210078
216.096,0.018823,0.00111111,0.927834,240,226.808,0.0218857
229.899,0.0235834,0.00111111,0.923225,185,241.372,0.0226563
214.545,0.02158,0.00111111,0.928352,180,225.08,0.0281248
201.315,0.0215738,0.00111111,0.932771,122,210.821,0.0306565
196.866,0.0199231,0.00111111,0.934256,254,206.672,0.0171958
191.811,0.0187791,0.00111111,0.935944,264,201.399,0.0284519
234.589,0.0153778,0.00111111,0.921659,173,246.066,0.0288251
241.822,0.0210075,0.00111111,0.919243,147,253.875,0.0448453
247.389,0.020118,0.00111111,0.917384,180,259.741,0.0237541
197.862,0.0191194,0.00111111,0.933924,258,207.655,0.0309044
227.298,0.0193875,0.00111111,0.924094,189,238.654,0.0244366
203.016,0.0194783,0.00111111,0.932203,345,213.147,0.0293344
200.344,0.0181137,0.00111111,0.933095,211,210.34,0.0279667
260.653,0.0211892,0.00111111,0.912955,159,273.684,0.0173598
190.801,0.018724,0.00111111,0.936282,145,200.321,0.0244174
219.31,0.0152034,0.00111111,0.926761,298,230.127,0.0304617
201.013,0.0189705,0.00111111,0.932871,154,210.898,0.0368793
214.751,0.0212631,0.00111111,0.928283,189,224.914,0.0281325
199.042,0.0200792,0.00111111,0.93353,228,208.711,0.0294291
222.258,0.0190311,0.00111111,0.925777,220,233.241,0.0218053
193.965,0.0185556,0.00111111,0.935225,281,203.658,0.0236132
216.305,0.0220209,0.00111111,0.927765,187,227.058,0.0289074
209.518,0.0164836,0.00111111,0.930031,193,219.89,0.0252204
202.805,0.0195855,0.00111111,0.932273,235,212.877,0.0518397
205.584,0.0203958,0.00111111,0.931345,203,215.439,0.0362367
181.975,0.0182167,0.00111111,0.939229,173,191.017,0.0257238
161.995,0.0204489,0.00111111,0.945902,260,170.069,0.0166518
194.688,0.0204944,0.00111111,0.934984,184,204.348,0.0314503
185.811,0.0200401,0.00111111,0.937948,332,195.049,0.0270099
197.624,0.020823,0.00111111,0.934003,281,207.186,0.0391989
214.766,0.0218128,0.00111111,0.928278,157,225.229,0.0353567
219.632,0.0173274,0.00111111,0.926653,160,230.466,0.036049
202.581,0.0162571,0.00111111,0.932348,237,212.578,0.0297746
181.621,0.0177214,0.00111111,0.939347,316,190.496,0.0259462
250.268,0.0185787,0.00111111,0.916423,135,262.382,0.0245029
205.717,0.0178907,0.00111111,0.931301,255,215.988,0.0274579
197.391,0.0189724,0.00111111,0.934081,215,206.934,0.0376919
238.981,0.0223217,0.00111111,0.920192,100,250.737,0.0435432
196.091,0.01434,0.00111111,0.934515,190,205.827,0.0556249
202.756,0.0184759,0.00111111,0.932289,271,212.891,0.0270203
191.765,0.0184657,0.00111111,0.93596,193,201.034,0.0201355
202.525,0.0186188,0.00111111,0.932366,239,212.53,0.0201397
198.694,0.0176067,0.00111111,0.933646,278,208.545,0.0231376
196.578,0.0184124,0.00111111,0.934353,226,206.327,0.0391784
190.046,0.0198379,0.00111111,0.936534,133,199.413,0.0284771
198.091,0.0185605,0.00111111,0.933847,245,207.886,0.0152523
234.524,0.0192422,0.00111111,0.92168,203,245.873,0.0333896
199.665,0.0169589,0.00111111,0.933322,236,209.253,0.0256795
225.432,0.019794,0.00111111,0.924717,164,236.693,0.0303567
226.331,0.0198218,0.00111111,0.924416,133,237.512,0.0326327
207.081,0.0183173,0.00111111,0.930845,293,217.347,0.0173303
197.215,0.0195282,0.00111111,0.93414,144,206.725,0.0274424
207.985,0.018551,0.00111111,0.930543,167,218.216,0.0264904
199.179,0.0192478,0.00111111,0.933484,230,208.831,0.0280096
190.622,0.0185748,0.00111111,0.936342,146,199.805,0.040456
186.169,0.0193022,0.00111111,0.937829,196,195.163,0.0339487
184.072,0.0170487,0.00111111,0.938529,224,193.258,0.0325346
173.247,0.0203667,0.00111111,0.942144,191,181.901,0.0186411
174.505,0.0185505,0.00111111,0.941724,232,183.091,0.0391035
204.996,0.019809,0.00111111,0.931541,126,215.233,0.0317816
201.159,0.0177293,0.00111111,0.932823,231,210.992,0.033055
215.57,0.0184879,0.00111111,0.92801,157,226.197,0.0248099
220.4,0.0220813,0.00111111,0.926397,120,230.55,0.0266999
192.669,0.0189865,0.00111111,0.935658,197,201.997,0.0273896
217.249,0.0209109,0.00111111,0.927449,196,227.649,0.0314102
183.442,0.0177993,0.00111111,0.938739,276,192.577,0.0290511
211.058,0.0191396,0.00111111,0.929517,262,221.454,0.0258037
225.405,0.0142923,0.00111111,0.924726,232,236.59,0.0554429
214.129,0.0153926,0.00111111,0.928491,258,224.637,0.025707
191.681,0.0159947,0.00111111,0.935988,221,201.263,0.0168213
208.302,0.0168977,0.00111111,0.930437,267,218.685,0.0303307
244.265,0.0186614,0.00111111,0.918427,240,256.401,0.014142
217.424,0.0189707,0.00111111,0.927391,194,228.137,0.0234014
193.974,0.0187277,0.00111111,0.935222,200,203.421,0.0313489
217.843,0.0191009,0.00111111,0.927251,244,228.677,0.0186412
228.126,0.0195362,0.00111111,0.923817,171,239.173,0.0237806
194.236,0.0205534,0.00111111,0.935135,206,203.783,0.0222971
231.826,0.017672,0.00111111,0.922582,126,243.217,0.0354461
194.684,0.0180998,0.00111111,0.934985,149,204.188,0.0265668
201.513,0.0176892,0.00111111,0.932705,232,211.535,0.0277219
219.557,0.016394,0.00111111,0.926679,145,229.573,0.0466214
215.208,0.0203045,0.00111111,0.928131,217,225.773,0.0198841
224.784,0.0187965,0.00111111,0.924933,207,235.748,0.0249149
198.752,0.0192186,0.00111111,0.933626,214,208.659,0.0198508
210.374,0.0169978,0.00111111,0.929745,201,220.83,0.029392
182.397,0.0156577,0.00111111,0.939088,271,191.357,0.0356415
214.532,0.017907,0.00111111,0.928357,187,224.938,0.0315807
206.254,0.0198955,0.00111111,0.931121,144,216.195,0.0261497
208.845,0.019427,0.00111111,0.930256,302,218.868,0.0293849
219.847,0.018273,0.00111111,0.926582,205,230.63,0.0222702
178.072,0.0161684,0.00111111,0.940532,271,186.89,0.0224425
208.094,0.0174596,0.00111111,0.930507,188,218.199,0.026439
206.759,0.017792,0.00111111,0.930952,249,217.047,0.0196121
213.588,0.0179354,0.00111111,0.928672,154,223.644,0.0261114
203.691,0.0186109,0.00111111,0.931977,215,213.801,0.029634
196.02,0.0164293,0.00111111,0.934539,286,205.631,0.0248201
200.893,0.0206686,0.00111111,0.932911,218,210.824,0.0165101
219.308,0.018868,0.00111111,0.926762,202,230.178,0.0193291
246.019,0.0189353,0.00111111,0.917842,155,257.369,0.0271402
231.847,0.0197966,0.00111111,0.922574,168,243.262,0.0289334
218.91,0.0209849,0.00111111,0.926895,197,229.047,0.0299942
211.126,0.0167501,0.00111111,0.929494,196,221.493,0.0286991
169.535,0.0151348,0.00111111,0.943383,175,177.905,0.0337364
230.113,0.0201869,0.00111111,0.923153,167,241.468,0.0344348
231.924,0.0200713,0.00111111,0.922549,149,243.443,0.0257716
223.016,0.0200786,0.00111111,0.925524,178,233.782,0.0402102
195.674,0.014894,0.00111111,0.934654,268,205.347,0.0551882
255.95,0.0195439,0.00111111,0.914525,148,268.384,0.0447607
220.055,0.0198234,0.00111111,0.926512,236,230.853,0.0367555
216.155,0.0189146,0.00111111,0.927815,129,226.312,0.0430726
199.784,0.0187532,0.00111111,0.933282,199,209.55,0.0353292
222.549,0.0203982,0.00111111,0.925679,178,233.426,0.0233115
200.95,0.0183086,0.00111111,0.932893,256,210.991,0.016803
209.249,0.016696,0.00111111,0.930121,269,219.415,0.02081
249.03,0.0223912,0.00111111,0.916836,177,260.926,0.0294009
219.09,0.019419,0.00111111,0.926835,218,229.786,0.0289335
185.792,0.0169305,0.00111111,0.937954,291,194.888,0.0258978
184.383,0.0208635,0.00111111,0.938425,256,193.57,0.0185135
201.82,0.0183427,0.00111111,0.932602,236,211.086,0.0474114
226.705,0.0196988,0.00111111,0.924292,255,237.989,0.0227452
206.776,0.019382,0.00111111,0.930947,187,217.102,0.0280757
185.74,0.0141851,0.00111111,0.937972,166,194.775,0.0321003
232.792,0.0240912,0.00111111,0.922259,191,244.384,0.0176279
201.838,0.0188448,0.00111111,0.932596,181,211.814,0.0297108
202.159,0.0179769,0.00111111,0.932489,175,212.073,0.0239756
178.922,0.018115,0.00111111,0.940249,259,187.619,0.0215103
196.096,0.0172259,0.00111111,0.934514,220,205.625,0.030462
200.913,0.0195059,0.00111111,0.932905,162,210.781,0.0383751
182.439,0.020238,0.00111111,0.939074,172,191.55,0.0264845
169.79,0.018196,0.00111111,0.943298,226,178.258,0.0148133
185.191,0.0167884,0.00111111,0.938155,194,194.329,0.0261194
202.268,0.0211018,0.00111111,0.932452,200,212.217,0.0301675
191.444,0.0189881,0.00111111,0.936067,211,200.944,0.0234532
216.792,0.0196005,0.00111111,0.927602,200,227.453,0.0212078
252.534,0.0172473,0.00111111,0.915666,225,264.972,0.0178314
193.043,0.0156078,0.00111111,0.935533,226,202.656,0.0411515
192.167,0.0174455,0.00111111,0.935826,195,201.39,0.0254769
225.725,0.0178959,0.00111111,0.924619,240,236.882,0.018903
204.599,0.0213119,0.00111111,0.931674,171,214.712,0.0263085
185.635,0.0163761,0.00111111,0.938007,190,194.569,0.0284307
186.264,0.016572,0.00111111,0.937797,232,195.513,0.0210752
249.948,0.0194051,0.00111111,0.91653,246,262.158,0.0215199
240.864,0.0182591,0.00111111,0.919563,188,251.577,0.0306745
184.618,0.0171253,0.00111111,0.938346,222,193.849,0.0300316
213.473,0.0190352,0.00111111,0.92871,275,224.14,0.0207037

View File

@ -0,0 +1,138 @@
*******************************************************************************
Sun Oct 1 20:05:21 2017
FIT: data read from "20170830-evolution1D_5x5_100Times-all.csv" every ::1 using 2:5
format = x:z
#datapoints = 200
residuals are weighted equally (unit weight)
function used for fitting: f(x)
fitted parameters initialized with current variable values
Iteration 0
WSSR : 8.65799e+06 delta(WSSR)/WSSR : 0
delta(WSSR) : 0 limit for stopping : 1e-05
lambda : 0.707234
initial set of free parameter values
a = 1
b = 1
After 5 iterations the fit converged.
final sum of squares of residuals : 387750
rel. change during last iteration : -1.80465e-10
degrees of freedom (FIT_NDF) : 198
rms of residuals (FIT_STDFIT) = sqrt(WSSR/ndf) : 44.2531
variance of residuals (reduced chisquare) = WSSR/ndf : 1958.33
Final set of parameters Asymptotic Standard Error
======================= ==========================
a = -7207.93 +/- 1699 (23.57%)
b = 339.971 +/- 32.22 (9.477%)
correlation matrix of the fit parameters:
a b
a 1.000
b -0.995 1.000
*******************************************************************************
Sun Oct 1 20:05:21 2017
FIT: data read from "20170830-evolution1D_5x5_100Times-all.csv" every ::1 using 4:5
format = x:z
#datapoints = 200
residuals are weighted equally (unit weight)
function used for fitting: g(x)
fitted parameters initialized with current variable values
Iteration 0
WSSR : 8.58412e+06 delta(WSSR)/WSSR : 0
delta(WSSR) : 0 limit for stopping : 1e-05
lambda : 0.965888
initial set of free parameter values
aa = 1
bb = 1
After 5 iterations the fit converged.
final sum of squares of residuals : 374307
rel. change during last iteration : -1.28123e-12
degrees of freedom (FIT_NDF) : 198
rms of residuals (FIT_STDFIT) = sqrt(WSSR/ndf) : 43.4792
variance of residuals (reduced chisquare) = WSSR/ndf : 1890.44
Final set of parameters Asymptotic Standard Error
======================= ==========================
aa = 2485.68 +/- 489.8 (19.71%)
bb = -2109 +/- 455.8 (21.61%)
correlation matrix of the fit parameters:
aa bb
aa 1.000
bb -1.000 1.000
*******************************************************************************
Sun Oct 1 20:05:21 2017
FIT: data read from "20170830-evolution1D_5x5_100Times-all.csv" every ::1 using 4:6
format = x:z
#datapoints = 200
residuals are weighted equally (unit weight)
function used for fitting: h(x)
fitted parameters initialized with current variable values
Iteration 0
WSSR : 9.43971e+06 delta(WSSR)/WSSR : 0
delta(WSSR) : 0 limit for stopping : 1e-05
lambda : 0.965888
initial set of free parameter values
aaa = 1
bbb = 1
After 5 iterations the fit converged.
final sum of squares of residuals : 11.3578
rel. change during last iteration : -7.81502e-08
degrees of freedom (FIT_NDF) : 198
rms of residuals (FIT_STDFIT) = sqrt(WSSR/ndf) : 0.239505
variance of residuals (reduced chisquare) = WSSR/ndf : 0.0573625
Final set of parameters Asymptotic Standard Error
======================= ==========================
aaa = -3135.44 +/- 2.698 (0.08605%)
bbb = 3135.83 +/- 2.511 (0.08006%)
correlation matrix of the fit parameters:
aaa bbb
aaa 1.000
bbb -1.000 1.000

View File

@ -0,0 +1,261 @@
Iteration 0
WSSR : 8.65799e+06 delta(WSSR)/WSSR : 0
delta(WSSR) : 0 limit for stopping : 1e-05
lambda : 0.707234
initial set of free parameter values
a = 1
b = 1
/
Iteration 1
WSSR : 422995 delta(WSSR)/WSSR : -19.4683
delta(WSSR) : -8.235e+06 limit for stopping : 1e-05
lambda : 0.0707234
resultant parameter values
a = -4.94625
b = 203.522
/
Iteration 2
WSSR : 415042 delta(WSSR)/WSSR : -0.0191613
delta(WSSR) : -7952.76 limit for stopping : 1e-05
lambda : 0.00707234
resultant parameter values
a = -864.859
b = 220.257
/
Iteration 3
WSSR : 387879 delta(WSSR)/WSSR : -0.0700308
delta(WSSR) : -27163.5 limit for stopping : 1e-05
lambda : 0.000707234
resultant parameter values
a = -6772.19
b = 331.747
/
Iteration 4
WSSR : 387750 delta(WSSR)/WSSR : -0.000332164
delta(WSSR) : -128.797 limit for stopping : 1e-05
lambda : 7.07234e-05
resultant parameter values
a = -7207.61
b = 339.965
/
Iteration 5
WSSR : 387750 delta(WSSR)/WSSR : -1.80465e-10
delta(WSSR) : -6.99755e-05 limit for stopping : 1e-05
lambda : 7.07234e-06
resultant parameter values
a = -7207.93
b = 339.971
After 5 iterations the fit converged.
final sum of squares of residuals : 387750
rel. change during last iteration : -1.80465e-10
degrees of freedom (FIT_NDF) : 198
rms of residuals (FIT_STDFIT) = sqrt(WSSR/ndf) : 44.2531
variance of residuals (reduced chisquare) = WSSR/ndf : 1958.33
Final set of parameters Asymptotic Standard Error
======================= ==========================
a = -7207.93 +/- 1699 (23.57%)
b = 339.971 +/- 32.22 (9.477%)
correlation matrix of the fit parameters:
a b
a 1.000
b -0.995 1.000
Iteration 0
WSSR : 8.58412e+06 delta(WSSR)/WSSR : 0
delta(WSSR) : 0 limit for stopping : 1e-05
lambda : 0.965888
initial set of free parameter values
aa = 1
bb = 1
/
Iteration 1
WSSR : 418736 delta(WSSR)/WSSR : -19.5001
delta(WSSR) : -8.16538e+06 limit for stopping : 1e-05
lambda : 0.0965888
resultant parameter values
aa = 112.257
bb = 99.0225
/
Iteration 2
WSSR : 395337 delta(WSSR)/WSSR : -0.0591881
delta(WSSR) : -23399.2 limit for stopping : 1e-05
lambda : 0.00965888
resultant parameter values
aa = 852.014
bb = -588.836
/
Iteration 3
WSSR : 374316 delta(WSSR)/WSSR : -0.0561565
delta(WSSR) : -21020.3 limit for stopping : 1e-05
lambda : 0.000965888
resultant parameter values
aa = 2450.37
bb = -2076.15
/
Iteration 4
WSSR : 374307 delta(WSSR)/WSSR : -2.6247e-05
delta(WSSR) : -9.82442 limit for stopping : 1e-05
lambda : 9.65888e-05
resultant parameter values
aa = 2485.68
bb = -2109
/
Iteration 5
WSSR : 374307 delta(WSSR)/WSSR : -1.28123e-12
delta(WSSR) : -4.79573e-07 limit for stopping : 1e-05
lambda : 9.65888e-06
resultant parameter values
aa = 2485.68
bb = -2109
After 5 iterations the fit converged.
final sum of squares of residuals : 374307
rel. change during last iteration : -1.28123e-12
degrees of freedom (FIT_NDF) : 198
rms of residuals (FIT_STDFIT) = sqrt(WSSR/ndf) : 43.4792
variance of residuals (reduced chisquare) = WSSR/ndf : 1890.44
Final set of parameters Asymptotic Standard Error
======================= ==========================
aa = 2485.68 +/- 489.8 (19.71%)
bb = -2109 +/- 455.8 (21.61%)
correlation matrix of the fit parameters:
aa bb
aa 1.000
bb -1.000 1.000
Iteration 0
WSSR : 9.43971e+06 delta(WSSR)/WSSR : 0
delta(WSSR) : 0 limit for stopping : 1e-05
lambda : 0.965888
initial set of free parameter values
aaa = 1
bbb = 1
/
Iteration 1
WSSR : 82262.8 delta(WSSR)/WSSR : -113.751
delta(WSSR) : -9.35745e+06 limit for stopping : 1e-05
lambda : 0.0965888
resultant parameter values
aaa = 93.9799
bbb = 130.237
/
Iteration 2
WSSR : 38961.1 delta(WSSR)/WSSR : -1.11141
delta(WSSR) : -43301.7 limit for stopping : 1e-05
lambda : 0.00965888
resultant parameter values
aaa = -912.157
bbb = 1067.01
/
Iteration 3
WSSR : 29.5535 delta(WSSR)/WSSR : -1317.32
delta(WSSR) : -38931.6 limit for stopping : 1e-05
lambda : 0.000965888
resultant parameter values
aaa = -3087.39
bbb = 3091.12
/
Iteration 4
WSSR : 11.3578 delta(WSSR)/WSSR : -1.60205
delta(WSSR) : -18.1958 limit for stopping : 1e-05
lambda : 9.65888e-05
resultant parameter values
aaa = -3135.43
bbb = 3135.82
/
Iteration 5
WSSR : 11.3578 delta(WSSR)/WSSR : -7.81502e-08
delta(WSSR) : -8.87612e-07 limit for stopping : 1e-05
lambda : 9.65888e-06
resultant parameter values
aaa = -3135.44
bbb = 3135.83
After 5 iterations the fit converged.
final sum of squares of residuals : 11.3578
rel. change during last iteration : -7.81502e-08
degrees of freedom (FIT_NDF) : 198
rms of residuals (FIT_STDFIT) = sqrt(WSSR/ndf) : 0.239505
variance of residuals (reduced chisquare) = WSSR/ndf : 0.0573625
Final set of parameters Asymptotic Standard Error
======================= ==========================
aaa = -3135.44 +/- 2.698 (0.08605%)
bbb = 3135.83 +/- 2.511 (0.08006%)
correlation matrix of the fit parameters:
aaa bbb
aaa 1.000
bbb -1.000 1.000

View File

@ -0,0 +1,20 @@
set datafile separator ","
f(x)=a*x+b
fit f(x) "20170830-evolution1D_5x5_100Times-all.csv" every ::1 using 2:5 via a,b
set terminal png
set xlabel 'regularity'
set ylabel 'steps'
set output "20170830-evolution1D_5x5_100Times-all_regularity-vs-steps.png"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 2:5 title "20170830-evolution1D_5x5_100Times.csv", "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 2:5 title "20170830-evolution1D_5x5_100Times-added_one.csv", f(x) title "lin. fit" lc rgb "black"
g(x)=aa*x+bb
fit g(x) "20170830-evolution1D_5x5_100Times-all.csv" every ::1 using 4:5 via aa,bb
set xlabel 'improvement potential'
set ylabel 'steps'
set output "20170830-evolution1D_5x5_100Times-all_improvement-vs-steps.png"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:5 title "20170830-evolution1D_5x5_100Times.csv", "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:5 title "20170830-evolution1D_5x5_100Times-added_one.csv", g(x) title "lin. fit" lc rgb "black"
h(x)=aaa*x+bbb
fit h(x) "20170830-evolution1D_5x5_100Times-all.csv" every ::1 using 4:6 via aaa,bbb
set xlabel 'improvement potential'
set ylabel 'evolution error'
set output "20170830-evolution1D_5x5_100Times-all_improvement-vs-evo-error.png"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:6 title "20170830-evolution1D_5x5_100Times.csv", "20170830-evolution1D_5x5_100Times-added_one.csv" every ::1 using 4:6 title "20170830-evolution1D_5x5_100Times-added_one.csv", h(x) title "lin. fit" lc rgb "black"

Binary file not shown.


Binary file not shown.


Binary file not shown.


View File

@ -1,7 +1,7 @@
*******************************************************************************
Wed Aug 30 21:11:12 2017
Sun Oct 1 19:58:40 2017
FIT: data read from "20170830-evolution1D_5x5_100Times.csv" every ::1 using 2:5
@ -47,7 +47,7 @@ b -0.995 1.000
*******************************************************************************
Wed Aug 30 21:11:12 2017
Sun Oct 1 19:58:40 2017
FIT: data read from "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:5
@ -93,7 +93,7 @@ bb -1.000 1.000
*******************************************************************************
Wed Aug 30 21:11:12 2017
Sun Oct 1 19:58:40 2017
FIT: data read from "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:6

View File

@ -2,13 +2,19 @@ set datafile separator ","
f(x)=a*x+b
fit f(x) "20170830-evolution1D_5x5_100Times.csv" every ::1 using 2:5 via a,b
set terminal png
set xlabel 'regularity'
set ylabel 'steps'
set output "20170830-evolution1D_5x5_100Times_regularity-vs-steps.png"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 2:5 title "regularity vs. steps", f(x) lc rgb "black"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 2:5 title "data", f(x) title "lin. fit" lc rgb "black"
g(x)=aa*x+bb
fit g(x) "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:5 via aa,bb
set xlabel 'improvement potential'
set ylabel 'steps'
set output "20170830-evolution1D_5x5_100Times_improvement-vs-steps.png"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:5 title "improvement potential vs. steps", g(x) lc rgb "black"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:5 title "data", g(x) title "lin. fit" lc rgb "black"
h(x)=aaa*x+bbb
fit h(x) "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:6 via aaa,bbb
set xlabel 'improvement potential'
set ylabel 'evolution error'
set output "20170830-evolution1D_5x5_100Times_improvement-vs-evo-error.png"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:6 title "improvement potential vs. evolution error", h(x) lc rgb "black"
plot "20170830-evolution1D_5x5_100Times.csv" every ::1 using 4:6 title "data", h(x) title "lin. fit" lc rgb "black"

Binary file not shown.


Binary file not shown.


Binary file not shown.


View File

@ -0,0 +1,33 @@
#!/bin/bash
# Fit linear trends on the combined CSV ($1) and plot the two source CSVs
# ($2, $3) together with the fitted lines.
if [[ $# -lt 3 ]]; then
echo "usage: $0 <ALL-DATA.csv> <DATA1.csv> <DATA2.csv>"
else
data="$1";
png="`echo $1 | sed -s "s/\.csv$//"`" # strip ending
(cat <<EOD
set datafile separator ","
f(x)=a*x+b
fit f(x) "$data" every ::1 using 2:5 via a,b
set terminal png
set xlabel 'regularity'
set ylabel 'steps'
set output "${png}_regularity-vs-steps.png"
plot "$2" every ::1 using 2:5 title "$2", "$3" every ::1 using 2:5 title "$3", f(x) title "lin. fit" lc rgb "black"
g(x)=aa*x+bb
fit g(x) "$data" every ::1 using 4:5 via aa,bb
set xlabel 'improvement potential'
set ylabel 'steps'
set output "${png}_improvement-vs-steps.png"
plot "$2" every ::1 using 4:5 title "$2", "$3" every ::1 using 4:5 title "$3", g(x) title "lin. fit" lc rgb "black"
h(x)=aaa*x+bbb
fit h(x) "$data" every ::1 using 4:6 via aaa,bbb
set xlabel 'improvement potential'
set ylabel 'evolution error'
set output "${png}_improvement-vs-evo-error.png"
plot "$2" every ::1 using 4:6 title "$2", "$3" every ::1 using 4:6 title "$3", h(x) title "lin. fit" lc rgb "black"
EOD
) > "${png}.gnuplot.script"
gnuplot "${png}.gnuplot.script" 2> "${png}.gnuplot.log"
mv fit.log "${png}.gnuplot.fit.log"
fi

View File

@ -10,16 +10,22 @@ set datafile separator ","
f(x)=a*x+b
fit f(x) "$data" every ::1 using 2:5 via a,b
set terminal png
set xlabel 'regularity'
set ylabel 'steps'
set output "${png}_regularity-vs-steps.png"
plot "$data" every ::1 using 2:5 title "regularity vs. steps", f(x) lc rgb "black"
plot "$data" every ::1 using 2:5 title "data", f(x) title "lin. fit" lc rgb "black"
g(x)=aa*x+bb
fit g(x) "$data" every ::1 using 4:5 via aa,bb
set xlabel 'improvement potential'
set ylabel 'steps'
set output "${png}_improvement-vs-steps.png"
plot "$data" every ::1 using 4:5 title "improvement potential vs. steps", g(x) lc rgb "black"
plot "$data" every ::1 using 4:5 title "data", g(x) title "lin. fit" lc rgb "black"
h(x)=aaa*x+bbb
fit h(x) "$data" every ::1 using 4:6 via aaa,bbb
set xlabel 'improvement potential'
set ylabel 'evolution error'
set output "${png}_improvement-vs-evo-error.png"
plot "$data" every ::1 using 4:6 title "improvement potential vs. evolution error", h(x) lc rgb "black"
plot "$data" every ::1 using 4:6 title "data", h(x) title "lin. fit" lc rgb "black"
EOD
) > "${png}.gnuplot.script"
gnuplot "${png}.gnuplot.script" 2> "${png}.gnuplot.log"