Document Type : Research Paper
Authors
1 Department of Mathematics, College of Sciences, AL-Mustansiriyah University, Baghdad, Iraq
2 Department of Mathematics, College of Education for Pure Sciences, University of Babylon, Iraq
Abstract
Several authors have studied weighted approximation by neural network operators. In this paper, we study the weighted Ditzian-Totik modulus of smoothness for the fractional derivatives of functions in $L_p$ spaces, together with the approximation of real-valued functions on a compact interval by hyperbolic tangent sigmoid and quasi-interpolation neural network operators. These approximations involve the left and right Caputo fractional derivatives of the function under consideration, and they are expressed with respect to feed-forward neural networks with a single hidden layer. Our higher-order fractional approximation yields better rates of convergence than the ordinary approximation, and some applications are given. All results are proved in the $L_p[X]$ spaces, where $0{<}p{<}1$.
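The operators described above can be illustrated numerically. The sketch below is a minimal, assumed form of a quasi-interpolation neural network operator built from the hyperbolic tangent sigmoid: the kernel $\Phi(x)=\tfrac{1}{2}(\tanh(x+1)-\tanh(x-1))$ and the normalized operator $F_n(f,x)=\sum_k f(k/n)\,\Phi(nx-k)\,/\,\sum_k \Phi(nx-k)$, with $k$ ranging over $\lceil na\rceil,\dots,\lfloor nb\rfloor$. The names `phi` and `quasi_interp` and the specific kernel are illustrative assumptions, not the paper's exact construction.

```python
import math

def phi(x):
    # Bell-shaped density kernel built from the hyperbolic tangent sigmoid
    # (an assumed standard choice, not necessarily the paper's exact kernel).
    return 0.5 * (math.tanh(x + 1.0) - math.tanh(x - 1.0))

def quasi_interp(f, x, n, a=0.0, b=1.0):
    # Quasi-interpolation neural network operator on the compact interval [a, b]:
    # a weighted average of samples f(k/n), with weights phi(n*x - k).
    ks = range(math.ceil(n * a), math.floor(n * b) + 1)
    num = sum(f(k / n) * phi(n * x - k) for k in ks)
    den = sum(phi(n * x - k) for k in ks)
    return num / den
```

For smooth $f$ the error at interior points shrinks as $n$ grows, which is the ordinary (non-fractional) convergence the abstract compares against.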
Keywords