The cinema of the United States, often metonymously referred to as Hollywood, has had a profound effect on cinema across the world since the early 20th century. The dominant style of American cinema is classical Hollywood cinema, which developed from 1917 to 1960 and still characterizes most films made today. While Auguste and Louis Lumière are generally credited with the birth of modern cinema,[7] it was American cinema that soon became the dominant force in the emerging industry. Since the 1920s, the American film industry has grossed more money every year than that of any other country.