Nine out of ten dentists see dental insurance companies as enemies rather than allies, according to one survey.
In the eyes of the average dentist, insurance companies exist solely to turn a profit and have little real interest in helping patients or doctors. Yet insurance does bring patients in, and few dental practices can realistically avoid dealing with insurance companies.
Read more: Dentists’ Hands Tied by Dental Insurance