A doctor is someone who knows a lot about our bodies and how to help us when we're sick or hurt. Doctors go to school for many years to learn about all the different parts of our bodies and how to make them better. When we go to see a doctor, they ask us questions and check us over to figure out what is making us feel unwell. They may give us medicine, or tell us to rest or eat certain foods, to help us feel better. Some doctors specialize in different things, like helping babies or fixing broken bones, but they all want to help us feel better and stay healthy!