Women’s fiction
Women’s fiction is a genre of books that focuses on the lives and experiences of women. These books are marketed mainly to female readers and include many popular novels, as well as books about women’s rights. It is different from women’s writing, which means books written by women, not necessarily marketed to women.