Yes, a college degree still matters, but it is no longer as decisive as it once was. While most employers now weigh skills and experience more heavily than credentials, a degree can still open the door to better opportunities, higher income, and greater credibility, particularly in industries where formal education is required.