Here, we propose a practical method for solving nonsmooth convex optimization problems using conjugate gradient type methods. We present a modified Hestenes-Stiefel (HS) conjugate gradient method, one of the most effective methods for smooth, large-scale optimization problems. For a nonsmooth convex problem, we first apply the Moreau-Yosida regularization to convert the nonsmooth objective function into a smooth one, and then solve the regularized problem with our method combined with a nonmonotone line search. We prove that the algorithm converges to an optimal solution under standard conditions. Our algorithm inherits the efficiency of the HS conjugate gradient method.
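For context, the classical ingredients the approach builds on are standard; the specific modification of the HS parameter proposed in the paper is not reproduced here. The Moreau-Yosida regularization of a nonsmooth convex function $f$ and the classical HS direction update are:

```latex
% Moreau-Yosida regularization of a convex function f (lambda > 0):
% F_lambda is convex and continuously differentiable even when f is not.
F_{\lambda}(x) = \min_{y \in \mathbb{R}^n} \Bigl\{ f(y) + \frac{1}{2\lambda}\,\|y - x\|^2 \Bigr\},
\qquad
\nabla F_{\lambda}(x) = \frac{x - p_{\lambda}(x)}{\lambda},

% where p_lambda(x) is the proximal point, the unique minimizer above.

% Classical Hestenes--Stiefel conjugate gradient update
% (g_k denotes the gradient of the smoothed objective at x_k):
d_{k+1} = -g_{k+1} + \beta_k^{HS} d_k,
\qquad
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
\qquad
y_k = g_{k+1} - g_k.
```

Minimizing $F_{\lambda}$ is equivalent to minimizing $f$ (they share the same minimizers), which is why the smooth conjugate gradient machinery can be applied after regularization.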