
What colleges in Texas have dermatology as a major?

Dermatology is not offered as an undergraduate major at any college or university in Texas, or in any other state in the United States. Dermatology is a medical specialty: it requires completing a medical degree first, followed by additional specialized residency training.
