The Wire and The Sopranos probably take it, although I haven't actually seen a lot from that list which renders my input all but useless.
At the risk of sounding slightly pretentious, I think the truly great dramas have to offer some kind of social commentary, or insight into a certain subculture. There needs to be more than just a story, like an overarching collection of themes or ideas. They need to say something.
I'm probably too illiterate to explain it better, but having just finished all the Breaking Bad episodes available on people's recommendations, I feel my opinion is vindicated further; there's no substance there, no ability to transcend the very limited microcosm the characters exist in.
Of course, 50-minute escapism isn't the worst thing in the world, but when I invest a lot of time trawling through a series I foolishly expect to be more than just entertained. :uhoh2:
/gay.