John Wayne: An American Icon

John Wayne helped shape American values and remains a prominent figure in Hollywood. This documentary follows the rise of his career, portraying him as a true American.