TY - JOUR
ID - 6083
TI - Ditzian-Totik modulus of smoothness for the fractional derivative of functions in $L_p$ space of the partial neural network
JO - International Journal of Nonlinear Analysis and Applications
JA - IJNAA
LA - en
SN -
AU - Ibrahim, Amenah Hassan
AU - Bhaya, Eman Samir
AU - Hessen, Eman Ali
AD - Department of Mathematics, College of Sciences, AL-Mustansiriyah University, Baghdad, Iraq
AD - Department of Mathematics, College of Education for Pure Sciences, University of Babylon, Iraq
Y1 - 2022
PY - 2022
VL - 13
IS - 1
SP - 3305
EP - 3317
KW - Approximation
KW - Ditzian-Totik modulus
KW - higher-order fractional approximation
KW - partial Caputo models
KW - partial neural network
KW - Sobolev space
DO - 10.22075/ijnaa.2022.6083
N2 - Some researchers have studied the weighted approximation by partial neural networks; in this paper, we study the weighted Ditzian-Totik modulus of smoothness for the fractional derivative of functions in $L_p$ for the partial neural network, and the approximation of real-valued functions over a compact interval by tangent sigmoid quasi-interpolation operators. These approximations involve the measurable left and right partial Caputo derivatives of the function under consideration, and are realized, with respect to the standard basis, by feed-forward neural networks with a single hidden layer. Our higher-order fractional approximation results give better convergence than ordinary approximation, with some applications. All results are proved in $L_p[X]$ spaces, where $0<p<1$.
UR - https://ijnaa.semnan.ac.ir/article_6083.html
L1 - https://ijnaa.semnan.ac.ir/article_6083_1d98171dceb1462d76cb7ec50214c228.pdf
ER -