RE: What if taking medication, prescription pills, supplements, and drugs makes health worse?


in happierpeople •  7 years ago 

Everything has its own place in life. Medicines are important when our natural systems do not work as they should. Often the need for them can be avoided by eating a balanced diet, but when the body's natural systems fail, medicines are crucial.
