Health insurance in the United States
In the United States, health insurance is any program that helps pay for medical expenses, whether through privately purchased insurance, social insurance, or a government-funded social welfare program. Synonyms for this usage include "health coverage," "health care coverage," and "health benefits."