The word dermatology comes from the Greek word for skin. The skin is the largest organ in the body and acts as a barrier against trauma and infection.
A dermatologist is a doctor who specializes in treating disorders and diseases of the skin, hair, and nails. A dermatologist helps you maintain healthy skin, and also assesses and manages cosmetic concerns such as scars and acne.
Healthy skin is a reliable indicator of the body's overall health.