Sunday, January 6, 2008

Varying Viewpoints: What Were the Consequences of the Civil War?

No, I don't agree with the historians who believe the importance of the Civil War has been exaggerated.
The Civil War was a very important turning point in our history after those rocky years leading up to the war. After the war, the issue of slavery was finally settled: slavery was abolished and forcibly removed from the states where it had previously been allowed. As part of that same change, African-Americans were granted rights that made them actual citizens of the United States. Also after the war, the Union and Confederacy were "forced" to reconcile with each other...which ultimately led to a UNITED United States.
After the war, there was also an expansion of federal power. The federal government protected rights, enforced laws, encouraged the development of the National Banking System, and supported the expansion of industry, and new legal and government institutions were formed. The Thirteenth, Fourteenth, and Fifteenth Amendments expanded federal power, whereas the earlier amendments had limited the government's power, and that expansion turned out to be a good thing.
The importance of the Civil War has not been exaggerated.