healthcare
[ helth-kair ]
noun
- the field concerned with the maintenance or restoration of the health of the body or mind.
- any of the procedures or methods employed in this field.
adjective
- of, relating to, or involved in healthcare:
healthcare workers; a healthcare center.
Example Sentences
But if you ask leftists what Cuba is famous for they will usually say something altogether different: healthcare and education.
From The Daily Beast
Here are four reasons to check out the healthcare marketplaces this year.
From The Daily Beast
Increased access to reproductive healthcare has resulted in better maternal and infant health outcomes.
From The Daily Beast
For the first time her children reliably received healthcare and consistently went to school.
From The Daily Beast
He tried to pass healthcare reform but Newt Gingrich and the GOP-controlled House and Senate shot it down.
From The Daily Beast
We will assure quality, affordable healthcare for all Americans.
From Project Gutenberg