For practical purposes, UTC and GMT mean precisely the same thing. Historically, UTC was introduced as a more exact standard than GMT. GMT used to be the mean solar time at the Royal Observatory in Greenwich, while UTC is based on International Atomic Time "with leap seconds added at irregular intervals to synchronize with the Earth's rotation" (to quote Wikipedia). In other words, GMT was astronomically determined, while UTC isn't -- even though astronomical observations are used to adjust it.
Today GMT is often used as a synonym for UTC, and even when it isn't, the difference is small enough to be unimportant in daily life.
UTC doesn't actually stand for anything. English speakers wanted the acronym "CUT" for "Coordinated Universal Time" and French speakers wanted "TUC" for "temps universel coordonné", so UTC was chosen as a compromise signifying nothing, apart from the importance of international coordination.
In Linux, as in Unix, the hardware clock has traditionally been set to UTC/GMT, and the system converts the hardware time to local time when needed. In Windows, however, the hardware clock is normally set to local time. The result? Total confusion for dual booters.
Fortunately you can make Linux understand when not to convert the time -- if you just tell it that the hardware clock is set to local time.
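On a systemd-based distribution this is a one-line change. A minimal sketch, assuming systemd's `timedatectl` is available and that the system records the setting in `/etc/adjtime` (older systems would use `hwclock` directly instead):

```shell
# Check how Linux currently interprets the hardware clock (RTC).
# "RTC in local TZ: no" means the RTC is treated as UTC.
timedatectl | grep "RTC in local TZ"

# Tell Linux the hardware clock keeps local time, as Windows expects.
# --adjust-system-clock re-reads the RTC under the new interpretation.
sudo timedatectl set-local-rtc 1 --adjust-system-clock

# The setting is recorded in /etc/adjtime; its third line reads
# either "UTC" or "LOCAL".
cat /etc/adjtime
```

The opposite fix is also possible: Windows can be told to keep the hardware clock in UTC via the `RealTimeIsUniversal` registry value, so pick whichever side of the dual boot you'd rather reconfigure.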
Now, if your hardware clock is set to your local time and if Linux is aware of this, it usually doesn't matter whether you've specified your time zone as Paris, Washington or Tokyo. Throughout most of the year the time will not have to be converted. The hardware time remains the same as the system time. Until --
Until your standard local time is replaced by local summer time (a.k.a. "daylight saving time"). As the switch doesn't happen on the same date all over the world (if at all), your time zone setting remains important even if you've set your hardware clock to "local" time.
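You can see why the zone name matters by rendering the same instant in a DST-observing zone at two times of year. A small sketch using GNU `date` and the tz database name `Europe/Paris` (chosen here just as an example of a zone with a summer-time rule):

```shell
# The same UTC instant, displayed in one zone, in winter and in summer.
# The zone's UTC offset changes when its DST rule kicks in.
TZ=Europe/Paris date -d '2024-01-15 12:00 UTC'   # CET,  UTC+1
TZ=Europe/Paris date -d '2024-07-15 12:00 UTC'   # CEST, UTC+2
```

If the hardware clock holds local time, Linux needs exactly these per-zone DST rules to know when that local time jumps, which is why "Paris" versus "Tokyo" stops being interchangeable at the switchover.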