
Portal:Western films


The American Film Institute defines Western films as those "set in the American West that [embody] the spirit, the struggle and the demise of the new frontier."

Film


Western drama


Television series
