The United States Hasn't Been A Republic For Years, Nothing New.

in law •  8 years ago 

So I ran across this on Facebook a couple of days ago. It's not too often I find a video, especially from Anon, that is quick, concise, and on point, but this one is pretty darn good. It goes briefly into the history of the United States and how it was turned from the republic into a democratic corporation.

Everything is reversed in this democratic world. In the republic, the people are sovereign and grant rights to the municipality and on up the chain; in the democracy, those rights have been granted away, and now the fed is king, with rights running downhill and ending in the very lowly rights of the citizen (instead of the people, the 14th Amendment ens legis was created after the Civil War to let everyone flow neatly into a different jurisdiction).

https://mainerepublicemailalert.com/2016/11/20/united-states-republic-or-corporation/


The Civil War was fought to take away our liberty and transfer power from the individual to a centralized locus of power... not slavery, as history tells us.

Exactly. Couldn't agree more. That, as well as the South wanting nothing to do with the debt owed to the international creditors who were trying to collect from the North.

Not to mention that the South repaid all of the debts accrued by the nation during the Revolution.