More Than Half of U.S. Employees Say Their Employer Provides No Wellness Benefits
The majority of working Americans say their employers do not offer support, assistance, or benefits designed to help them improve their physical health or wellness. Although many employers acknowledge that healthy workers are more productive and help lower healthcare costs, a new poll shows that most working Americans still lack employer-provided wellness benefits.