Is the raw food diet just another Hollywood trend?
It could be considered a trend, because celebrity interest brings mainstream attention. In reality, though, the raw food diet has been around for thousands of years, with the modern Western movement beginning in the early 1900s. I believe that here and now in 2010, America is at a crossroads: all-time-high obesity and chronic disease, met by an emergence of dietary awareness. We're realizing that there's a lot wrong with our big picture. We're tired of being sick; we want to feel better, to be healthier, to know where our food is coming from. There's more interest in real, natural, organic, and whole foods than ever before (just look at the growth of Whole Foods Market). So I don't think the raw food diet is going anywhere soon, even when Hollywood moves along to the next "trend."