By Harvard Business Review
Financial institutions have employed women for decades, historically as tellers, secretaries, and junior administrative staff. In the 1980s, however, pioneering women began moving into management roles and into frontline business areas such as investment banking. Today, women hold 47% of management and professional roles at American financial firms, according to the U.S. Bureau of Labor Statistics.