So far I've enjoyed stuff like The Boys, The Thick Of It, What We Do In The Shadows, but I'm open to try anything really. Except The Office.
Newer stuff only? 'Cause if you haven't seen The Wire, I think that would instantly shoot to the top of my recommendations.
Nah, don't mind the age
Without spoilers: does the series get better or worse after season one?
Better, IMO. It peaks at Season 4, but Season 5 is still good TV. Some people don't like Season 2 as much as the other seasons, but I've always liked it and the way it looks at the white working class and the role of unions in modern cities.
But they're all similar in feel to Season 1. To me, the show is very novelistic, and I think it took until around episode 10 for Season 1, and the show as a whole, to completely suck me in.