ELI5: Explain Like I'm 5

Veterinary medicine in the United States

Veterinary medicine in the United States is the practice of taking care of animals. This includes diagnosing and treating illnesses and injuries, as well as preventive care. Veterinarians, the doctors who care for animals, go to college and then to veterinary school to learn how to do this. They help animals stay healthy and make them better when they are sick or hurt. Some also perform surgery on animals.