01 - Automation in Manufacturing
Politecnico di Torino
Department of Management and Production Engineering (DIGEP)
abdollah.saboori@polito.it
What is manufacturing?
The word manufacture is derived from two Latin words, manus (hand) and factus (made); in combination they mean
"made by hand".
Made-by-hand accurately described the manual methods used when the English word manufacture was first
coined around 1567 A.D.
Manufacturing is the application of physical and chemical processes to alter the geometry, properties, and/or
appearance of a given starting material to make parts or products.
Economic perspective
Manufacturing is a means by which a nation creates material wealth
Historical perspective
Historically, the importance of manufacturing in the development of civilization is usually
underestimated
Computer systems
Human resources are required either full-time or periodically to keep the system running.
Automation and material handling technologies, as well as human workers, are combined to create
manufacturing systems.
The manufacturing system is where the value-added work is accomplished on the parts and products.
Production machines
Fixed routing: work units always flow through the same sequence of workstations.
Variable routing: work units are moved through a variety of different station sequences.
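As a purely illustrative aid (not from the slides), the minimal Python sketch below represents the two routing policies as simple data structures; the station names, part types, and routes are hypothetical.

# Minimal sketch (illustrative only): fixed vs. variable routing in a manufacturing system.
# Station names, part types, and routes are hypothetical.
FIXED_ROUTE = ["load", "mill", "drill", "inspect", "unload"]  # every work unit follows this sequence

VARIABLE_ROUTES = {  # each part type may visit a different sequence of stations
    "part_A": ["load", "mill", "inspect", "unload"],
    "part_B": ["load", "drill", "mill", "inspect", "unload"],
    "part_C": ["load", "inspect", "unload"],
}

def route_for(part_type, fixed):
    """Return the station sequence a work unit of this part type will follow."""
    return FIXED_ROUTE if fixed else VARIABLE_ROUTES[part_type]

print(route_for("part_B", fixed=True))   # ['load', 'mill', 'drill', 'inspect', 'unload']
print(route_for("part_B", fixed=False))  # ['load', 'drill', 'mill', 'inspect', 'unload']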
Automation is the conversion of a work process, a procedure, or a piece of equipment to automatic rather than
human operation or control, with little to no human assistance.
Automation does not simply transfer human functions to machines, but involves a deep reorganization of
the work process, during which both the human and the machine functions are redefined.
During the last 40 years, however, the computer gradually became the leading vehicle of automation. Now,
automation includes electronic and computer controls.
Automation is an evolutionary rather than a revolutionary concept, and it has been implemented
successfully in the following basic areas of activity:
Manufacturing processes and operations
Material handling
Inspection
Assembly
Packaging
The internal workings of a water-clock. From ‘The Book of Archimedes on the Construction of Water-Clocks’. Or. 14270, f. 16v
Venetian Arsenal, 1724 engraving by Joan Blaeu
The mechanization of machine tools for production began during the Industrial Revolution at the end of the 18th
century with the introduction of the Watt steam engine. Mechanization replaced human or animal power with
machine power; those mechanisms, however, were not automatic but controlled by factory workers.
In the late 19th century Frederick W. Taylor rationalized the factory system by introducing the principles of scientific
management. Scientific management strictly separated mental work from manual labour: workers were not to think
but to follow detailed instructions prepared for them by managers.
In 1913 the Ford Motor Company introduced a moving assembly line, drastically cutting assembly time. The
assembly line imposed a strict order on production by forcing workers to keep pace with the motion of the conveyor
belt. The Ford assembly line became a symbol of efficiency of American manufacturing; for workers and social
critics, however, it epitomized the monotony and relentless pressure of mechanized work.
In 1947 the Ford Company brought the term automation into wide circulation by establishing the first Automation
Department, charged with designing electromechanical, hydraulic, and pneumatic parts handling, work-feeding, and
work-removing mechanisms to connect standalone machines and increase the rate of production. In 1950 Ford put into
operation the first automated engine plant.
To meet US Air Force demands for a high-performance fighter aircraft whose complex structural members could not
be manufactured by traditional machining methods, Numerical Control (NC) technology for machine tools was
developed in the early 1950s.
Designed to military specifications, early NC equipment proved too complex and therefore unreliable, as well as
prohibitively expensive, and was applied mostly in the state-subsidized aircraft industry.
NC technology allowed engineers and managers to exercise greater control over the production process.
The first industrial applications of digital computers occurred in the electrical power, dairy, chemical, and petroleum
refinery industries for automatic process control. In 1959, TRW installed the first digital computer designed
specifically for plant process control at Texaco's Port Arthur refinery.
In the late 1960s, with the development of time sharing on large mainframe computers, standalone NC machines were
brought under the Direct Numerical Control (DNC) of a central computer.
With the introduction of microprocessors in the 1970s, centralized DNC systems in manufacturing were largely
replaced by Computer Numerical Control (CNC) systems with distributed control, in which each NC machine was
controlled by its own microcomputer.
Robotics combined the techniques of NC and remote control to replace human workers with numerically controlled
mechanical manipulators. The first commercial robots appeared in the early 1960s.
Flexible Manufacturing Systems (FMS) combined DNC equipment with machines for automated loading, unloading,
and transfer of workpieces. These systems permitted varying process routes and sequences of operations, allowing
different products to be machined automatically in small batches within the same system.
In the 1960s large aerospace manufacturers, such as McDonnell-Douglas and Boeing, developed proprietary
computer-aided design (CAD) systems, which provided computer graphics tools for drafting, analysing, and
modifying aircraft designs.
In 1970 Computervision Corporation introduced the first complete turnkey commercial CAD system for industrial
designers, which provided all the necessary hardware and software in one package.
In the 1970s, combined CAD/CAM systems emerged which used the parameters of a geometrical model created
with the help of CAD to generate programs for CNC machine tools and develop manufacturing plans and
schedules. While CAD systems are often standardized, CAM (Computer-Aided Manufacturing) applications tend
to be industry-specific and proprietary.
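To make the CAD-to-CNC step concrete, the following is a minimal, hypothetical Python sketch in which the parameters of a simple geometric model (a rectangular contour) are turned into a short G-code program of the kind a CAM post-processor might emit; the dimensions, depth, and feed rate are invented, and real CAM systems are far more elaborate.

# Illustrative sketch only: generate a toy G-code program for a rectangular contour
# from geometric parameters, in the spirit of a CAD/CAM-to-CNC workflow.
def rectangle_contour_gcode(width, height, depth, feed):
    corners = [(0, 0), (width, 0), (width, height), (0, height), (0, 0)]
    lines = [
        "G21 (units: mm)",
        "G90 (absolute coordinates)",
        "G0 Z5.000 (rapid to safe height)",
        "G0 X%.3f Y%.3f" % corners[0],
        "G1 Z%.3f F%.0f (plunge to cutting depth)" % (-depth, feed),
    ]
    for x, y in corners[1:]:
        lines.append("G1 X%.3f Y%.3f F%.0f" % (x, y, feed))  # linear cutting moves around the contour
    lines.append("G0 Z5.000 (retract)")
    lines.append("M2 (program end)")
    return "\n".join(lines)

print(rectangle_contour_gcode(width=80.0, height=40.0, depth=2.0, feed=300))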
With the introduction of Computer-Aided Engineering (CAE) systems for standard techniques of engineering
analysis, the whole range of engineering tasks – from conceptual design to analysis to detailed design to drafting
and documentation to manufacturing design – became automated.
Among the earliest applications of information technology was the automation of information-processing tasks.
The first stored-program digital computer purchased by a nongovernment customer was a UNIVAC I, installed by GE
in 1954 to automate basic transaction processing: payroll, inventory control and material scheduling, billing and
order service, and general cost accounting.
In the mid-1960s the first management-information systems (MIS) appeared, providing management with data,
models of analysis, and algorithms for decision-making; eventually they became a standard tool for operation
control, management control, and strategic planning.
In the late 1980s an integration of the automated factory and the electronic office began. Computer-Integrated
Manufacturing (CIM) combines flexible automation (robots, numerically controlled machines, and flexible
manufacturing systems), CAD/CAM systems, and management-information systems to build integrated production
systems that cover the complete operations of a manufacturing firm, including purchasing, logistics, maintenance,
engineering, and business operations.
In short, everything in and around a manufacturing operation (suppliers, the plant, distributors, even the
product itself) is digitally connected, providing a highly integrated value chain.
The term Industry 4.0 originated in Germany, but the concept largely overlaps developments that, in other
European countries, may variously be labelled: Smart factories, the Industrial Internet of Things, Smart
industry, or Advanced manufacturing.
Industry 4.0 is the digital transformation of manufacturing. It leverages third-platform technologies
(social, mobile, cloud, and analytics, possibly together with IoT) and innovation accelerators in the convergence of IT
(Information Technology) and OT (Operational Technology) to realize connected factories and industry,
smart decentralized and self-optimizing systems, and the digital supply chain in the information-driven,
cyber-physical environment of the fourth industrial revolution.
1. Additive Manufacturing
2. Augmented Reality
3. Autonomous Robots
4. Big Data and Analytics
5. The Cloud
6. Cybersecurity
7. Horizontal and Vertical System Integration
8. Internet of Things (IoT)
9. Simulation
Additive Manufacturing refers to 3D printing technology that creates objects, and potentially organic parts,
by depositing successive layers of material. This technology is well suited to prototype work, small custom
batches, and lightweight, locally produced parts.
Augmented Reality offers benefits for training, troubleshooting, and repairs during service calls.
Autonomous Robots. Autonomous capabilities increase the utility of robots, allowing them to adjust their
actions based on a particular activity or on the product’s level of completeness. In addition to collaborating
safely with humans, robots can also work together with one another. In manufacturing, for example, one
application is the portable assembly line: using autonomous vehicles, you can ferry work in progress between
stations rather than employ a fixed conveyor-belt assembly line, and portable workstations provide the
convertibility that accelerates turnaround in manufacturing. Increased cognitive automation can also aid robots
in decision-making; moreover, sophisticated cognitive automation can improve manufacturing’s clerical
activities. Armed with lines and machines that connect to your company’s ERP, you can produce based on
orders rather than projections, reducing costly lead time.
The Cloud. Interconnectedness in manufacturing requires collaboration and contact beyond facility and
company boundaries. Fast cloud computing permits data collection, analysis, storage, and even
monitoring.
Cybersecurity. The move away from closed systems and toward interconnectedness demands higher
levels of user-access security and cybersecurity for networks that relay precision data and control
machines.
Horizontal and Vertical System Integration. System integration signifies the complete coordination of
all departments and entities along the supply chain, beginning with machine-to-machine (M2M)
communication on the factory floor. For example, producers receive information from their supply chain
and sales organizations, and engineering departments maintain a connection to production. Cloud
computing enables many of these capabilities.
Internet of Things (IoT). When they all contain IoT sensors, devices along the production line, in the
field, and in control centres can interact with one another to provide granular data and faster responses.
With IoT technology, these devices can also include the wired and wireless capability to communicate with
the cloud and to offer predictive maintenance.
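As a purely illustrative sketch of the predictive-maintenance idea (the values, thresholds, and the rule itself are assumptions, not part of the slides), the short Python example below simulates a vibration sensor whose readings drift upward as a component wears and flags the machine for maintenance when a moving average exceeds a limit.

# Illustrative sketch: a simulated IoT vibration sensor feeding a naive
# predictive-maintenance rule. All values and thresholds are hypothetical.
import random
from collections import deque

def sensor_stream(n_readings):
    """Simulate vibration readings (mm/s) that slowly drift upward as a bearing wears."""
    level = 2.0
    for t in range(n_readings):
        level += 0.02                        # gradual degradation trend
        yield t, level + random.gauss(0, 0.3)

def monitor(readings, window=20, alert_level=4.5):
    """Request maintenance when the moving average of recent readings exceeds a limit."""
    recent = deque(maxlen=window)
    for t, value in readings:
        recent.append(value)
        if len(recent) == window and sum(recent) / window > alert_level:
            return t                         # time step at which maintenance is requested
    return None

t = monitor(sensor_stream(200))
if t is not None:
    print("Maintenance requested at reading", t)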
Simulation. 3D simulations of products, materials, and production processes can leverage real-time data to
present virtual models of entire production systems. With enhanced simulation, you can test and optimize
tool settings before lines change over.
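A compact, hypothetical Python sketch of that idea follows: a two-station serial line is simulated under a current and a proposed tool setting, and the estimated throughput of each is compared before the physical line is changed over. The cycle times, variability, and settings are invented for illustration.

# Illustrative sketch: compare two hypothetical tool settings on a simulated
# two-station serial line and estimate throughput before touching the real line.
import random

def simulate_line(cycle_a, cycle_b, n_parts=500):
    """Return estimated parts per hour for a two-station serial line (times in minutes)."""
    free_a = free_b = 0.0                     # times at which each station next becomes free
    for _ in range(n_parts):
        free_a += random.uniform(0.9, 1.1) * cycle_a        # station A, raw parts always available
        start_b = max(free_a, free_b)                       # B needs the part from A and a free machine
        free_b = start_b + random.uniform(0.9, 1.1) * cycle_b
    return n_parts / free_b * 60.0

random.seed(0)
print("current setting:  %.1f parts/h" % simulate_line(cycle_a=1.5, cycle_b=1.2))
print("proposed setting: %.1f parts/h" % simulate_line(cycle_a=1.2, cycle_b=1.2))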
The nine pillars of technological advancement that comprise the foundation of Industry 4.0 are already
used in manufacturing.
With the Industry 4.0 approach, the nine pillars of technological advancement will transform production.
Isolated, optimized cells will come together as a fully integrated, automated, and optimized production
flow, leading to greater efficiencies and changing traditional production relationships among suppliers,
producers, and customers as well as between human and machine.
https://www.bcg.com/it-it/publications/2015/engineered_products_project_business_industry_4_future_productivity_growth_manufacturing_industries
Questions?
abdollah.saboori@polito.it
Integrated Manufacturing Systems | Industrial automation and Industry 4.0