International Journal of Nonlinear Analysis and Applications (Semnan University)
ISSN: 2008-6822
Volume 13, Issue 1, 2022-03-01, pp. 3305-3317
Article ID: 6083
DOI: 10.22075/ijnaa.2022.6083
Language: EN
Type: Journal Article
Received: 2021-06-11

Title: Ditzian-Totik modulus of smoothness for the fractional derivative of functions in $L_p$ space of the partial neural network

Authors:
- Amenah Hassan Ibrahim, Department of Mathematics, College of Sciences, AL-Mustansiriyah University, Baghdad, Iraq
- Eman Samir Bhaya, Department of Mathematics, College of Education for Pure Sciences, University of Babylon, Iraq
- Eman Ali Hessen, Department of Mathematics, College of Sciences, AL-Mustansiriyah University, Baghdad, Iraq

Abstract: Earlier work has treated weighted approximation by neural networks; in this paper we study the weighted Ditzian-Totik modulus of smoothness for the fractional derivative of functions in $L_p$ for the partial neural network, and the approximation of real-valued functions over a compact interval by hyperbolic tangent sigmoid quasi-interpolation operators. These approximations involve the left and right Caputo fractional derivatives of the function in question. Estimates are given with respect to the standard norm, for feed-forward neural networks with a single hidden layer. Our higher-order fractional approximation yields better convergence than the ordinary approximation, and we give some applications. All results are proved in $L_p[X]$ spaces, where $0<p<1$.

PDF: https://ijnaa.semnan.ac.ir/article_6083_1d98171dceb1462d76cb7ec50214c228.pdf
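For background, the two central tools named in the abstract have standard definitions; the following is a sketch of the textbook forms (the paper's own weighted variants may differ in details such as the step-weight and the interval). On $[0,1]$ with step-weight $\varphi(x)=\sqrt{x(1-x)}$, the Ditzian-Totik modulus of smoothness of order $r$ is

$$
\omega^r_\varphi(f,t)_p \;=\; \sup_{0<h\le t}\;\bigl\|\Delta^r_{h\varphi(\cdot)}f\bigr\|_{L_p[0,1]},
\qquad
\Delta^r_{h\varphi(x)}f(x) \;=\; \sum_{k=0}^{r}(-1)^{k}\binom{r}{k}\,
f\!\left(x+\Bigl(\tfrac{r}{2}-k\Bigr)h\varphi(x)\right),
$$

where the difference is taken to be $0$ whenever any argument leaves $[0,1]$. The left Caputo fractional derivative of order $\alpha>0$, $\alpha\notin\mathbb{N}$, with $n=\lceil\alpha\rceil$, is

$$
\bigl(D^{\alpha}_{*a}f\bigr)(x) \;=\; \frac{1}{\Gamma(n-\alpha)}
\int_{a}^{x}(x-s)^{\,n-\alpha-1}f^{(n)}(s)\,ds,
$$

with the right-sided derivative defined analogously by integrating from $x$ to the right endpoint.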