Well, the American Civil War, actually. In England, the general view is that it was a good thing the Union won: slavery ended and the Union was preserved. But it can't be quite that simple in the States, can it? I mean, those years must be viewed differently from a Southern perspective than from a Northern standpoint. And weren't states like Delaware, Maryland, and Kentucky Union states that had slavery, or have I got that wrong? I'd be interested to hear the colonial views.