Friday, November 19, 2010

Why do employers give their employees health and dental insurance?

Is this because the law requires them to do so, or is it more of a competitive factor, in that other jobs of the same type/caliber offer it as a "perk," so candidates pick one job offer over another?
--------------------
It began in World War II, when wage controls stopped companies from raising salaries to keep the best people. Instead, they offered other paid benefits, like health care.
Source
