Question
Updated on 1 Nov 2018

  • French (France)
  • English (US)
  • Spanish (Spain)
  • English (UK)
Question about English (US)

Western fiction is a famous genre of American literature. Even though Western stories take place in the eighteenth or nineteenth century, particularly during the conquest of the American West, the genre still forms part of American culture, whether in books or in movies.

Does this sound natural?

