Women's fiction

Women's fiction is an umbrella term for women-centered books that focus on women's life experiences and are marketed to female readers; it includes many mainstream novels as well as women's rights books. It is distinct from women's writing, which refers to literature written by (rather than marketed to) women. There is no comparable label in English for works of fiction marketed to men. The Romance Writers of America organization defines women's fiction as "a commercial novel about a woman on the brink of life change and personal growth. Her journey details emotional reflection and action that transforms her and her relationships with others, and includes a hopeful/upbeat ending with regard to her romantic relationship."[1] The Women's Fiction Writers Association gives a broader, more inclusive definition in which romance elements are not mandatory.
Criticism of the term

While the women's fiction label is embraced by some authors, others have argued that it is applied too broadly to works by women that would otherwise be considered literary fiction, and that it therefore marginalizes women's writing. Critics also point to the lack of an equivalent term for works by men, and note that books by men are rarely, if ever, considered women's fiction even when they fall within the parameters of the genre.[3] Author Jennifer Weiner has been a vocal critic of the term, which she believes leads to books written by women receiving less publicity, fewer reviews, and lower esteem than those written by men.[4] Women's fiction has been compared to chick lit, a term that has since fallen out of favor.[5]