Does anyone believe insurance companies exist to make medical services more available at a lower price, or do they primarily exist to make a profit for their stockholders? I have nothing against any business being profitable; in fact, a business won't survive if it isn't profitable. But in the business of providing medical care, the answer is to eliminate the cost of the middleman, in this case the insurance companies. The cost of medical care would drop enormously!