tropical medicine

noun
the branch of medicine dealing with the study and treatment of diseases occurring in the tropics.
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.
American Heritage Medical Dictionary

tropical medicine n.
The branch of medicine that deals with diseases occurring in tropical countries.

The American Heritage® Stedman's Medical Dictionary
Copyright © 2002, 2001, 1995 by Houghton Mifflin Company. Published by Houghton Mifflin Company.
Copyright © 2014 Dictionary.com, LLC. All rights reserved.