I understand that we are very often seen portraying women as victims, as the underdog, or as people who suffer, but with such a fine group of successful women in front of us, do you think the media also needs to work on the perception of women in these so-called non-traditional professions? There are so many benefits to presenting a positive image that encourages women into these professions. Otherwise, if you always talk about harassment and the lack of this or that, it discourages women.
April 14th, 2010.