How Nature Can Make You Healthier

Published: Feb. 9, 2018, 9 a.m.

The sound of waves on a rocky beach. The smell of soil after the rain. The warmth of the sun on your skin. Nature just feels good. But a growing body of research suggests that it might be good for you, too. Florence Williams, author of The Nature Fix: Why Nature Makes Us Happier, Healthier, and More Creative, explains why going outside can make you feel better.