Meaning of dermatology

1. The science which treats of the skin, its structure, functions, and diseases.
2. The branch of medicine dealing with the skin and its diseases.



