Healthcare continues to provide long-term career stability and remains a viable choice for many students. For years, though, traditional gender roles steered women almost exclusively into the nursing field, while a young man who didn't want to attend medical school was largely shut out of careers in healthcare. That trend is changing as more men enter the profession.
In 2015, men comprised 13 percent of the nursing workforce. That's a drastic change from 1960, when only 2.2 percent of nurses were male. And since male nurses accounted for just 11.3 percent of the profession in 2010, the most recent figures show that the share of men entering nursing is still on the rise.
What's changed? For one thing, demand for nurses is expanding, and nursing is now seen as a more lucrative career. The Baby Boomer generation has boosted the growth of the healthcare industry overall, nurses are needed in virtually every part of the country, and competition for qualified nurses means that experienced nurses can command higher wages than in decades past.
Plus, more young men are graduating from high school overall, meaning more of them are moving into college-degree programs rather than trade schools.
Gender roles are shifting as well, with women moving into traditionally “male” careers and vice versa. As attitudes toward men’s and women’s responsibilities in the home shift, those perspectives carry over into the workforce; caregiving and nurturing are no longer viewed as exclusively “feminine” traits.
As you continue to explore career possibilities, remember to consider all of your options. There is no reason for outdated gender roles to dictate a choice that will affect the rest of your life. Nursing remains a highly respected career that offers plenty of opportunity for advancement, rising wages, and job flexibility. With the healthcare industry continuing its rapid expansion, job security for nursing graduates is expected to remain steady in the coming years.