World War I is, I think, an often neglected subject because it is eclipsed by WWII. Yet what impresses me most about WWI is that afterward, things were never the same again. One example is Britain's social order. The country had a rigid class structure up until the Great War, and then men of every class fought side by side. Upon returning home, many upper-class gentlemen realized how little difference there was between themselves and people from the lower classes; they found they could no longer maintain a superior attitude toward men with whom they had fought. This has always fascinated me: a war we so often forget changed so much in the world.