Nursing is a job focused on helping people stay healthy and feel better when they are sick. Nurses work with doctors to care for people who are injured or ill, which includes giving them medicine, treating their injuries, and helping them with equipment they need, such as a wheelchair. Nurses also talk with patients and their families and help them make decisions about their health.