In this paper, we consider the convergence properties of a new class of three-term conjugate gradient methods with the generalized Armijo step size rule for minimizing a continuously differentiable function f on R^n, without assuming that the sequence {x_k} of iterates is bounded. We prove that the limit infimum of ‖∇f(x_k)‖ is zero. Moreover, we prove that, when f is a pseudo-convex (quasi-convex) function, the new method has strong convergence results: either x_k → x* and x* is a minimizer (stationary point), or ‖x_k‖ → ∞, arg min{f(x) : x ∈ R^n} = ∅, and f(x_k) ↓ inf{f(x) : x ∈ R^n}. By combining the FR, PR, and HS methods with the new method, the FR, PR, and HS methods are modified to have the global convergence property. Numerical results show that the new algorithms are efficient in comparison with the FR, PR, and HS conjugate gradient methods with the Armijo step size rule.
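To make the algorithm class concrete, the sketch below shows a generic three-term conjugate gradient iteration with a backtracking Armijo line search. Since the abstract does not give the paper's exact update formulas, the PR-type choice of beta and the third-term coefficient theta used here are illustrative assumptions (a well-known three-term PR variant that enforces a sufficient-descent direction), and the function names armijo_step and three_term_cg are hypothetical; this is a minimal sketch, not the paper's method.

```python
import numpy as np

def armijo_step(f, x, d, g, s=1.0, rho=0.5, sigma=1e-4, max_back=50):
    """Armijo backtracking: largest alpha = s * rho**m with
    f(x + alpha*d) <= f(x) + sigma * alpha * g.T @ d."""
    fx, gd = f(x), g @ d
    alpha = s
    for _ in range(max_back):
        if f(x + alpha * d) <= fx + sigma * alpha * gd:
            break
        alpha *= rho
    return alpha

def three_term_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Illustrative three-term CG with a PR-type beta (assumed form)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = armijo_step(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                     # gradient difference
        gg = g @ g
        beta = (g_new @ y) / gg           # Polak-Ribiere coefficient
        theta = (g_new @ d) / gg          # third-term coefficient
        # three-term direction; this choice of theta gives
        # g_new @ d = -||g_new||^2 (sufficient descent) by construction
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Example usage on a smooth test function (Rosenbrock):
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(three_term_cg(f, grad, np.array([-1.2, 1.0])))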