ELI5: Explain Like I'm 5

History of nursing in the United States

Nursing is a job where women and men take care of sick people. The history of nursing in the United States is very interesting. Long ago, there were no trained nurses in the United States, but there were many sick people who needed help. During the Civil War, a woman named Clara Barton cared for wounded soldiers on the battlefield. After the war, she started a group called the American Red Cross to help people during wars and disasters, and her work inspired many other women to become nurses.

Even before that, a woman named Florence Nightingale became famous for making hospitals cleaner and safer for patients in England. Her ideas spread to the United States, where the first nursing schools opened in the 1870s, and nursing became a profession that people could go to school to learn how to do. Nurses were called on to help many different people, in hospitals, during disasters, and even in people's homes.

Some famous nurses in history include Mary Eliza Mahoney, the first African American woman to graduate from a nursing school and work as a professional nurse, and Lillian Wald, who started the Visiting Nurse Service in New York City to help poor people who were too sick to leave their homes.

As medicine became more advanced, nurses began to learn how to use machines and technology to help people. They also started to specialize in different areas of medicine, like pediatrics (taking care of kids) or oncology (taking care of people with cancer).

Today, nursing is a very important job in the United States. Nurses help keep people healthy and take care of them when they are sick. They work in hospitals, clinics, schools, and even in people's homes, and they are very important members of the healthcare team that helps everyone stay healthy and happy.