In 1861, Brown & Sharpe created the groundbreaking Universal Milling Machine, which could mill complex part geometries with movement in three axes. From there, production took off as WWI approached. Milling technology developed rapidly over the following decades, and the introduction of high-accuracy machines like the Jig Bore set the standard for milling accuracy. Machinists could now locate holes quickly and with great precision, making mills commonplace for prototyping and producing wartime equipment.
During the 1950s, NC (Numerical Control) finally moved from the laboratory into the machine shop, with machinists using punched tape to direct the milling machine’s movements. Initially, NC machining was used only in aerospace applications, where recreating complex airfoil and wing profiles proved difficult to do reliably. It caught on slowly elsewhere, but accelerated into full CNC (Computer Numerical Control) in the 60s and 70s when data storage and input methods improved.
When precision products were made by hand, the production process consumed considerable time and resources, and product quality could not be guaranteed. Shortly after the Second World War, the industry changed dramatically. CNC machines and modern precision equipment were introduced, reducing the number of people needed for physical work. Rather than relying on manual production methods, the machines required computer programming, and demand for skilled engineers rose accordingly.
These innovations ushered in a new era of machining, making high-accuracy mass production and fine attention to detail possible while significantly reducing production costs. Today, CNC machines create everything from small, simple components to giant, complex parts in a wide range of materials.