Meaning of tropical medicine

1. the branch of medicine that deals with the diagnosis and treatment of diseases found most often in tropical regions



