Up until very recently there was just something about Westerns that I rejected intellectually. As a genre, Westerns seem so base. Cowboys and Indians? At best it’s a throwaway Sleigh Bells lyric, and at worst it’s violent, racist, and embodies America’s deep-seated inability to ever just get along.
I realize the hypocrisy here, coming from a (relatively) grown man who reads comic books like it’s his job. But, for whatever reason, Westerns always seemed a rung below acceptable.