09/03/2013 12:05 pm ET Updated Nov 03, 2013

The Gender Gap Is Only Getting Worse Among Physicians


While the gender wage gap appears to be narrowing in the general workforce, the opposite is happening in health care: new research in the Journal of the American Medical Association suggests that the pay gap between male and female doctors, dentists and other health care workers has widened over the past decade.

Read more on Washington Post