In this thesis, we propose a new algorithm for solving convex optimization problems in Banach spaces. The algorithm is based on a greedy strategy and can be viewed as a nonlinear conjugate gradient type method. We prove its convergence rates under suitable assumptions on the modulus of uniform smoothness of the objective function. We apply the proposed algorithm to several problems, such as approximation in Hilbert spaces and the solution of linear systems. We also perform numerical tests in the case when the objective function is the negative log-likelihood of the logistic regression model. Our numerical results confirm the fast convergence rate of the proposed algorithm and its potential for solving real-life problems.
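To make the logistic regression setting concrete, the sketch below minimizes the negative log-likelihood with plain gradient descent on synthetic data. This is only an illustration of the objective function being minimized, not the greedy algorithm proposed in the thesis; the data, step size, and iteration count are assumptions chosen for the demonstration.

```python
import numpy as np

def neg_log_likelihood(w, X, y):
    # Negative log-likelihood of the logistic model p = sigmoid(X @ w),
    # computed stably as sum(log(1 + exp(z)) - y * z) with z = X @ w.
    z = X @ w
    return np.sum(np.logaddexp(0.0, z) - y * z)

def grad(w, X, y):
    # Gradient of the negative log-likelihood, averaged over the sample.
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y)

def gradient_descent(X, y, steps=500, lr=0.5):
    # Fixed-step gradient descent as a baseline solver (illustrative only).
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= lr * grad(w, X, y)
    return w

# Synthetic data (assumed for illustration): labels drawn from a logistic model.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(X @ w_true)))).astype(float)

w_hat = gradient_descent(X, y)
```

Any first-order method applied to this objective, including the greedy scheme studied in the thesis, should drive `neg_log_likelihood(w_hat, X, y)` below its value at the zero initializer.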