With all the new shows based around nurses that came out this television season, I wondered about your take on them now that a lot of the hoopla has died down. I actually haven't seen them myself, but I have read a lot of negative opinions about them. From what I have seen and read, they just seem to be playing off existing stereotypes and creating false expectations for young nurses or those looking into the career.
But what about you? Do you feel they are good or bad for the profession? Why? Is one better or worse than the others?