A very clichéd portrayal of the Deep South, obviously written by people who've never actually lived there. I've lived in NC, LA, MS, MO, and CA, so yes, I know what I'm talking about. Another reviewer put it well: "Hollywood Politically correct in nauseating and bigoted way. Worth watching to see and understand how depraved the film industry have become." I echo that sentiment.