ELI5: Explain Like I'm 5

History of nursing in the United Kingdom

Nursing has been around for a long time in the United Kingdom. In the past, nurses were usually women who helped patients with medical problems: caring for wounds, giving medicines, and doing other medical tasks. Since then, nursing has become a profession, which means nurses must now be properly trained and qualified before they can work in the UK. Nurses care not only for the sick and injured but also for the healthy, for example by giving health advice and helping people stay well. They also help people before and after surgery, and they care for the elderly and disabled. Today, nurses play an important role in helping people stay healthy and live happy lives.