When I was doing history we learned about the build-up to WW2, the establishment of the German Empire, and British politics in the 19th century.
In fact, many things that are taught in the Americas barely get a footnote here. For example the War of 1812, where the yanks got big-headed and tried to invade Canada and were beaten by colonial militias and the British Army. And while we did win that conflict, there was also something that overshadowed it for us.
Oh yeah, Him.
However, I think the American Civil War gets a mention in history classes, since there was a whole controversy over the Union stopping British ships or something to detain Confederate officers.
We also learn a bit about the Empire, and a bit about Italian unification. (Seriously, Garibaldi started off his conquest well mainly due to the British presence in the Med, since the Sicilian forces saw red coats and immediately thought they were British Marines, and this was the time when Britain was the world power.)
It also depends on where you go to school, though, since there are many different courses in the SQA history section, one of which is the American West, as well as the more local history of the Jacobite Rebellion of 1745. And of course we can't forget about WW1, from trench warfare right up to German piracy.