Another technique to regularise the positive (respectively negative) part function
is to approximate it with the error function, erf for short, which is defined as
the rescaled integral of a Gaussian,
\[\erf(x)\DEF\frac{2}{\sqrt{\pi}}\int_0^x e^{-t^2}\,\mathrm{d}t.\]
It is possible to build from the erf function an approximation of the function \(x^+\).
This is done by imposing that the regularised function, call it \(x^+_{\erf}\) for convenience,
takes a small value \(h\) at \(x=0\), that is \(x^+_{\erf}(0)=h\).
It is convenient to introduce an auxiliary parameter \(\kappa\DEF\frac{1}{2h\sqrt{\pi}}\), hence the function
\[x^+_{\erf} \DEF \frac{x}{2}\bigl(1+\erf(\kappa x)\bigr)+\frac{e^{-\kappa^2x^2}}{2\kappa\sqrt{\pi}},\]
whose derivative is the smoothed Heaviside \(\frac{1}{2}\bigl(1+\erf(\kappa x)\bigr)\) and whose value at \(x=0\) is \(\frac{1}{2\kappa\sqrt{\pi}}=h\), as required.
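As a minimal numerical sketch of this erf-based regularisation (the function name is illustrative, and it assumes the formula above; it uses Python's standard `math.erf`):

```python
import math

def pos_part_erf(x: float, h: float) -> float:
    """Erf-based regularisation of the positive part x^+.

    h is the value of the approximation at x = 0; kappa = 1/(2*h*sqrt(pi)).
    """
    kappa = 1.0 / (2.0 * h * math.sqrt(math.pi))
    # Antiderivative of the smoothed Heaviside (1 + erf(kappa*x)) / 2.
    return 0.5 * x * (1.0 + math.erf(kappa * x)) \
        + math.exp(-(kappa * x) ** 2) / (2.0 * kappa * math.sqrt(math.pi))

# Quick check: the value at x = 0 equals h, and far from 0 it matches x^+.
print(pos_part_erf(0.0, 0.01))   # ~0.01
print(pos_part_erf(5.0, 0.01))   # ~5.0
print(pos_part_erf(-5.0, 0.01))  # ~0.0
```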
A typical ingredient for this kind of approximation is the inverse tangent function; here it is composed with the sine function, \(\sin\bigl(\arctan(x/\kappa)\bigr)=\frac{x}{\sqrt{x^2+\kappa^2}}\), and the resulting smoothed step is integrated to match the positive part function.
The control parameter is \(h\) (with \(\kappa=2h\), so that \(x^+_{\mathrm{sa}}(0)=\kappa/2=h\)) and the approximation is:
\[x^+_{\mathrm{sa}} = \frac{1}{2}
\begin{cases}
x+\sqrt{x^2+\kappa^2} & x > 0 \\[1em]
\dfrac{\kappa^2}{\sqrt{x^2+\kappa^2}-x} & x \leq 0.
\end{cases}\]
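The two branches are algebraically the same function, since \(\frac{\kappa^2}{\sqrt{x^2+\kappa^2}-x}=\sqrt{x^2+\kappa^2}+x\); the split presumably avoids the cancellation between \(x\) and \(\sqrt{x^2+\kappa^2}\) for large negative \(x\). A minimal sketch in Python (the function name is illustrative, not from any particular library):

```python
import math

def pos_part_sa(x: float, h: float) -> float:
    """Sine-arctangent ("sa") regularisation of the positive part x^+, with kappa = 2*h."""
    kappa = 2.0 * h
    r = math.sqrt(x * x + kappa * kappa)
    if x > 0.0:
        return 0.5 * (x + r)
    # Rationalised form: identical in exact arithmetic, but avoids
    # catastrophic cancellation between x and r when x is very negative.
    return 0.5 * kappa * kappa / (r - x)

print(pos_part_sa(0.0, 0.01))    # = h = 0.01
print(pos_part_sa(1e8, 0.01))    # ~1e8
print(pos_part_sa(-1e8, 0.01))   # tiny positive value, no cancellation
```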
Regularised positive/negative part with polynomials
One of the most intuitive ways to round off the corner at \(x=0\) is to smoothly join the line \(y(x)=0\) for negative \(x\) to the line \(y(x)=x\) for positive \(x\) by means of a polynomial blend.
The minimal polynomial degree necessary for the joined curve to have continuous derivatives up to second order is three.
If the parameter \(h\) is chosen such that \(x^+_{\mathrm{poly}}(-h)=0\)
and \(x^+_{\mathrm{poly}}(h)=h\), then the approximating function is (see the figure of the regularised positive part with the polynomial regularisation, where the analytic positive part is drawn in red):
\[x^+_{\mathrm{poly}}=
\begin{cases}
0 & x \leq -h \\[1em]
\dfrac{h}{6}\left( 1+{\dfrac {x}{h}} \right) ^{3} & x\leq 0 \\[1em]
x+\dfrac{h}{6}\left( 1-{\dfrac {x}{h}} \right) ^{3} & x\leq h \\[1em]
x & \textrm{otherwise.}
\end{cases}\]
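A minimal sketch of the piecewise-cubic blend in Python (the function name is illustrative), with a quick numerical check of the joining values:

```python
def pos_part_poly(x: float, h: float) -> float:
    """Piecewise-cubic (C^2) regularisation of the positive part x^+ blended on [-h, h]."""
    if x <= -h:
        return 0.0
    if x <= 0.0:
        return h / 6.0 * (1.0 + x / h) ** 3
    if x <= h:
        return x + h / 6.0 * (1.0 - x / h) ** 3
    return x

h = 0.5
print(pos_part_poly(-h, h), pos_part_poly(0.0, h), pos_part_poly(h, h))
# 0.0, h/6, h : the blend joins y = 0 and y = x, taking the value h/6 at x = 0.
```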