The United States is rooted in a brand of liberalism that arose from the Enlightenment of the 17th and 18th centuries. John Locke's beliefs in natural rights (life, liberty, and property), limited government, and the consent of the governed were melded together by the Founding…