The Marines
Directed by John Grant
For longer than the United States has been an independent nation, there has been a Marine Corps. Marines consider themselves the very best America has to offer. With fierce patriotism, extraordinary courage, and innovative weaponry, they are a formidable force. This documentary focuses on their training and examines what it means to be a Marine.