Health Basics: The TRUE History of Cancer in the USA

(Natural News) You don’t have to go back all that far, actually, just over a century, to find out when cancer was “bred” in the United States as a form of income for insidious tyrants of the medical industry. Forget about conspiracy theories and let’s talk facts; in fact, a whole barrage of…
