Health insurance companies drill deeper into dental benefits

Dental insurance has long been sold separately from health insurance, even though studies show that oral health is strongly linked to a patient's overall health. An employer or individual buying both medical and dental coverage often ends up purchasing packages from two separate companies.

But that may be changing. A new survey shows that health insurance companies are increasingly interested in selling dental insurance to their health plan customers, both as a way to increase profits and in line with the movement toward value-based, holistic care. Although the shift may threaten the viability of independent dental insurance companies, it could also provide a more convenient experience for patients.

About 80% of health insurance companies currently offer dental insurance, up from 68% in 2018, according to a survey of 106 insurance directors by Chicago-based consulting firm West Monroe Partners. But insurers don't always bundle dental benefits with health coverage, instead focusing on the much more profitable health side, said Will Hinde, one of the report's authors and a managing director at West Monroe.

The revenue an insurer earns from a dental policy is only about 3% to 4% of what it earns from a health insurance policy, he said. Still, dental insurance has proven profitable, and the healthcare industry has become more focused on providing integrated care to reduce spending. New technology is also helping insurers easily analyze claims for both health and …
