Week 1 Intro

CENG 105 is an introductory course on programming in C, focusing on programming constructs, algorithmic problem-solving, and basic data structures. The course includes individual assignments, a midterm, and a final exam, with a textbook titled 'C: How to Program' by Paul and Harvey Deitel. Key topics covered include computer hardware and software basics, data hierarchy, and the role of programming languages, particularly C, in modern computing.


CENG 105

Computer Programming I
Week 1

Dr. Mesut ÜNLÜ


Instructor

Dr. Mesut ÜNLÜ (mesut.unlu@ostimteknik.edu.tr)

Office: 8th Floor, Room 814


Course Information - Computer Programming I

Objective
This course gives a brief introduction to programming-language
constructs, algorithmic problem solving, and basic data structures in
C. It is designed as a first course in programming and is supported by
laboratory sessions.

Textbook
"C: How to Program", Paul Deitel, Harvey Deitel, Ninth Edition,
2023
Course Information - Computer Programming I

Grading (Tentative)
Homeworks: 10%
Midterm Exam: 30%
Final Exam: 55%
Attendance: 5% (average grade)

Notes
Each work is an individual assignment; working in groups is not allowed.
You must follow the general code of ethics for universities.
Follow the Distance Education Center for updates about the course.
C How to Program
Ninth Edition

Chapter 1

Introduction to Computers and C

Copyright © 2022 Pearson Education, Inc. All Rights Reserved


Objectives (1 of 2)

▪ Learn about exciting recent developments in computing.


▪ Learn computer hardware, software and Internet basics.
▪ Understand the data hierarchy from bits to databases.
▪ Understand the different types of programming languages.
▪ Understand the strengths of C and other leading programming languages.
▪ Be introduced to the C standard library of reusable functions that help you
avoid “reinventing the wheel.”
Objectives (2 of 2)

▪ Test-drive a C program that you compile with one or more of the popular C
compilers we used to develop the book’s hundreds of C code examples,
exercises and projects.
▪ Be introduced to big data and data science.
▪ Be introduced to artificial intelligence—a key intersection of computer science
and data science.
Introduction (1 of 5)

▪ You’re probably familiar with many of the powerful tasks computers perform—in this textbook,
you’ll get intensive, hands-on experience writing C instructions that command computers to
perform those and other tasks
▪ Software (that is, the C instructions you write, which are also called code) controls hardware
(that is, computers and related devices)
▪ C is widely used in industry for a wide range of tasks
▪ Today’s popular desktop operating systems—Windows, macOS and Linux—are partially written
in C
▪ Many popular applications are partially written in C, including popular web browsers (e.g.,
Google Chrome and Mozilla Firefox), database management systems (e.g., Microsoft SQL
Server, Oracle and MySQL) and more
Introduction (2 of 5)

▪ Introduce terminology and concepts that lay the groundwork for the C programming you’ll learn,
beginning in Chapter 2
▪ Introduce hardware and software concepts
▪ Overview the data hierarchy

From individual bits (ones and zeros) to databases, which store the massive amounts of data that
organizations need to implement contemporary applications such as Google Search, Netflix, Twitter,
Waze, Uber, Airbnb and a myriad of others
Introduction (3 of 5)

▪ We’ll discuss the types of programming languages


▪ We’ll introduce the C standard library and various C-based “open-source” libraries that help you
avoid “reinventing the wheel”
▪ We’ll introduce additional software technologies that you’re likely to use as you develop
software in your career
Introduction (4 of 5)

▪ Many development environments are available in which you can compile, build and run C
applications
▪ You’ll work through one or more of the four test-drives showing how to compile and execute
C code using:

– Microsoft Visual Studio 2022 Community edition on Windows.
– Clang in Xcode on macOS.
– GNU gcc in a shell on Linux.
– GNU gcc in a shell running inside the GNU Compiler Collection (GCC) Docker container.
Introduction (5 of 5)

▪ In the past, most computer applications ran on “standalone” computers (that is, not networked
together)
▪ Today’s applications can communicate among the world’s computers via the Internet
▪ We’ll introduce the
– Internet,
– the World Wide Web,
– the Cloud
– the Internet of Things (IoT),
▪ each of which could play a significant part in the applications you’ll build in the 2020s (and
probably long afterward)
Hardware and Software

▪ Computers can perform calculations and make logical decisions phenomenally faster than
human beings can
▪ Today’s personal computers and smartphones can perform billions of calculations in one
second—more than a human can perform in a lifetime
Hardware and Software

Supercomputers already perform thousands of trillions (quadrillions) of instructions per second!


As of December 2020, Fujitsu’s Fugaku is the world’s fastest supercomputer—it can perform 442 quadrillion
calculations per second (442 petaflops)!
To put that in perspective, this supercomputer can perform in one second almost 58 million calculations for
every person on the planet! And supercomputing upper limits are growing quickly.
Hardware and Software

▪ Computers process data under the control of


sequences of instructions called computer
programs (or simply programs)

▪ Specified by people called computer programmers


Hardware and Software

A computer consists of various physical devices referred to as hardware: keyboard, screen, mouse,
solid-state disks, hard disks, memory, DVD drives and processing units
Hardware and Software

▪ Computing costs are dropping dramatically due to rapid developments in hardware and
software technologies.
▪ Computers that might have filled large rooms and cost millions of dollars decades ago are now
inscribed on silicon computer chips smaller than a fingernail, costing perhaps a few dollars
each
▪ Silicon-chip technology has made computing so economical that computers and computerized
devices have become commodities.
Computer Organization

▪ Regardless of physical differences, computers can be envisioned as divided into various logical
units or sections
▪ Input Unit—This “receiving” section obtains information (data and computer programs) from
input devices and places it at the other units’ disposal for processing
▪ Examples:
▪ Keyboards, mouse devices, microphones,
▪ scanners, barcode readers, USB flash drives, etc.
Computer Organization

Output Unit—This “shipping” section takes information the computer has processed and places it on
various output devices to make it available outside the computer
Examples:
Screens, printers, headphones, etc.
Computer Organization

Memory Unit—This rapid-access, relatively low-capacity “warehouse” section retains information entered
through the input unit, making it immediately available for processing when needed.
The memory unit also retains processed information until it can be placed on output devices by the output
unit
Information in the memory unit is volatile—it’s typically lost when the computer’s power is turned off
The memory unit is often called either memory, primary memory or RAM (Random Access Memory). Main
memories on desktop and notebook computers contain as much as 128 GB of RAM, though 8 to 16 GB is
most common. GB stands for gigabytes; a gigabyte is approximately one billion bytes. A byte is eight bits. A
bit (short for “binary digit”) is either a 0 or a 1.
Computer Organization

Arithmetic and Logic Unit (ALU): This “manufacturing” section performs


calculations (e.g., addition, subtraction, multiplication and division) and makes
decisions (e.g., comparing two items from the memory unit to determine
whether they’re equal)
Part of the CPU
Central Processing Unit (CPU): This “administrative” section coordinates and
supervises the operation of the other sections
Tells the input unit when to read information into the memory unit
Tells the ALU when to use information from the memory unit in calculations
Tells the output unit when to send information from the memory unit to specific
output devices
Computer Organization

Most computers today have multicore processors that economically implement multiple processors
on a single integrated circuit chip.
Such processors can perform many operations simultaneously.
A dual-core processor has two CPUs, a quad-core processor has four and an octa-core processor
has eight.
Intel has some processors with up to 72 cores.
Computer Organization

Secondary Storage Unit: This is the long-term, high-capacity


“warehousing” section
Programs and data not actively being used by the other units are placed
on secondary storage devices until they’re again needed, possibly hours,
days, months or even years later.
Information on secondary storage devices is persistent.
Secondary storage information takes much longer to access than
information in primary memory, but its cost per byte is much less
Examples of secondary storage devices include solid-state drives (SSDs),
USB flash drives, hard drives and read/write Blu-ray drives
Many current drives hold terabytes (TB) of data
A terabyte is approximately one trillion bytes
Data Hierarchy

Data items processed by computers form a data


hierarchy that becomes larger and more complex
in structure as we progress from the simplest data
items (called “bits”) to richer ones, such as
characters and fields
Data Hierarchy

Bits
A bit is short for “binary digit”—a digit that can assume 0 or 1

Bits form the basis of the binary number system (Check “Number Systems” appendix)
Data Hierarchy

Characters
Working with data in the low-level form of bits is tedious.
Instead, people prefer to work with decimal digits (0–9), letters (A–Z and a–z) and special
symbols such as

$ @ % & * ( ) - + " : ; , ? /

Digits, letters and special symbols are known as characters


Data Hierarchy

C uses the ASCII (American Standard Code for Information Interchange) character set by default
Data Hierarchy

C also supports Unicode® characters composed of one, two, three or four bytes (8, 16, 24 or 32
bits, respectively).
Data Hierarchy

Fields
Just as characters are composed of bits, fields are composed
of characters or bytes.
A field is a group of characters or bytes that conveys
meaning.
A field consisting of uppercase and lowercase letters could
represent a person’s name.
A field consisting of decimal digits could represent a person’s
age in years.
Data Hierarchy

Records
Several related fields can be used to compose a record
An employee record might consist of
• Employee identification number (a whole number).
• Name (a group of characters).
• Address (a group of characters).
• Hourly pay rate (a number with a decimal point).
• Year-to-date earnings (a number with a decimal point).
• Amount of taxes withheld (a number with a decimal point).
A record is a group of related fields.
Data Hierarchy

Files
A file is a group of related records.
A file contains arbitrary data in arbitrary formats.
Some operating systems view a file simply as a sequence
of bytes—any organization of the bytes in a file, such as
organizing the data into records, is a view created by the
application programmer.
It’s not unusual for an organization to have many files,
some containing billions, or even trillions, of characters of
information.
Data Hierarchy

A database is a collection of data organized for easy access


and manipulation.
Data Hierarchy

The most popular model is the relational database, in which data


is stored in simple tables.
A table includes records and fields.
A table of students might include: first name, last name, major,
year, student ID number and grade-point average.
The data for each student is a record, and the individual pieces
of information in each record are the fields.
Data Hierarchy

Big Data
The amount of data being produced worldwide is enormous, and its growth is accelerating
Big data applications deal with massive amounts of data.
This field is growing quickly, creating lots of opportunities for software developers.
Millions of information technology (IT) jobs globally already support big-data applications.
Data Hierarchy

Unit              Bytes            Which is approximately
1 kilobyte (KB)   1024 bytes       10^3 bytes (1024 bytes exactly)
1 megabyte (MB)   1024 kilobytes   10^6 (1,000,000) bytes
1 gigabyte (GB)   1024 megabytes   10^9 (1,000,000,000) bytes
1 terabyte (TB)   1024 gigabytes   10^12 (1,000,000,000,000) bytes
1 petabyte (PB)   1024 terabytes   10^15 (1,000,000,000,000,000) bytes
1 exabyte (EB)    1024 petabytes   10^18 (1,000,000,000,000,000,000) bytes
1 zettabyte (ZB)  1024 exabytes    10^21 (1,000,000,000,000,000,000,000) bytes
Data Hierarchy

Data Hierarchy—Twitter®: A Favorite Big-Data Source

One big-data source favored by developers is Twitter.


There are approximately 800,000,000 tweets per day.
Though tweets appear to be limited to 280 characters, Twitter actually provides almost 10,000
bytes of data per tweet to programmers who want to analyze tweets.
So 800,000,000 times 10,000 is about 8,000,000,000,000 bytes, or 8 terabytes (TB) of data per
day. That’s big data.
Data mining is the process of searching through extensive collections of data, often big data,
to find insights that can be valuable to individuals and organizations.
Machine Languages, Assembly Languages
& High-Level Languages

Programmers write instructions in various programming languages, some directly understandable by


computers and others requiring intermediate translation steps.
Hundreds of such languages are in use today.
These may be divided into three general types:
• Machine languages.
• Assembly languages.
• High-level languages.
Machine Languages, Assembly Languages
& High-Level Languages

Machine Languages
Any computer can directly understand only its own machine
language, defined by its hardware design.
Machine languages generally consist of strings of numbers
(ultimately reduced to 1s and 0s) that instruct computers to
perform their most elementary operations one at a time.
Machine languages are machine-dependent—a particular
machine language can be used on only one type of computer
Such languages are cumbersome for humans.
Machine Languages, Assembly Languages
& High-Level Languages

Assembly Languages and Assemblers


Programming in machine language was simply too slow and tedious for
most programmers.
Instead of using the strings of numbers that computers could directly
understand, programmers began using English-like abbreviations to
represent elementary operations.
These abbreviations formed the basis of assembly languages.
Translator programs called assemblers were developed to convert
assembly-language programs to machine language at computer speeds.
Machine Languages, Assembly Languages
& High-Level Languages

Compilers
To speed the programming process, high-level languages were developed
in which single statements could accomplish substantial tasks.
A typical high-level-language program contains many statements, known
as the program’s source code.
Translator programs called compilers convert high-level-language source
code into machine language.
High-level languages allow you to write instructions that look almost like
everyday English and contain common mathematical notations.
C is among the world’s most widely used high-level programming
languages.
Machine Languages, Assembly Languages
& High-Level Languages

Interpreters
Compiling a large high-level language program into machine language can take considerable
computer time.
Interpreters execute high-level language programs directly.
Interpreters avoid compilation delays, but your code runs slower than compiled programs.
Some programming languages, such as Java and Python, use a clever mixture of compilation and
interpretation to run programs.
Operating Systems

Operating systems are software that make using


computers more convenient for users, software
developers and system administrators.
Provide services that allow applications to execute
safely, efficiently and concurrently with one
another.
The software that contains the core operating-
system components is called the kernel.
Linux, Windows and macOS are popular desktop computer operating systems.
Each is partially written in C.
Operating Systems

The most popular mobile operating systems used in smartphones and tablets are Google’s
Android and Apple’s iOS.
Operating Systems

Windows
In the mid-1980s, Microsoft developed the Windows operating
system, consisting of a graphical user interface built on top of DOS
(Disk Operating System)—an enormously popular personal-computer
operating system that users interacted with by typing commands
Windows 11 is Microsoft’s latest operating system.
Windows is a proprietary operating system—it’s controlled by
Microsoft exclusively.
It is by far the world’s most widely used desktop operating system.
Operating Systems

Linux
The Linux operating system is among the greatest successes of the open-source
movement.
With open source, individuals and companies contribute to developing,
maintaining and evolving the software.
Anyone can then use that software for their own purposes—normally at no charge,
but subject to a variety of (typically generous) licensing requirements.
Open-source code is often scrutinized by a much larger audience than proprietary
software, so errors can get removed faster, making the software more robust.
Open source increases productivity and has contributed to an explosion of
innovation.
Operating Systems

Open-Source Organizations
GitHub (provides tools for managing open-source projects—it has millions of them under
development).
The Apache Software Foundation (originally the creators of the Apache web server) now oversees
350+ open-source projects, including several big-data infrastructure technologies.
The Eclipse Foundation (the Eclipse Integrated Development Environment helps programmers
conveniently develop software).
The Mozilla Foundation (creators of the Firefox web browser).
OpenML (which focuses on open-source tools and data for machine learning).
OpenAI (which does research on artificial intelligence and publishes open-source tools used in AI
reinforcement-learning research).
OpenCV (which focuses on open-source computer-vision tools that can be used across various
operating systems and programming languages).
Python Software Foundation (responsible for the Python programming language).
Operating Systems

The Linux kernel is the core of the most popular open-source, freely
distributed, full-featured operating system
It’s developed by a loosely organized team of volunteers and is popular in
servers, personal computers and embedded systems (such as the
computer systems at the heart of smartphones, smart TVs and
automobile systems)
Unlike Microsoft’s Windows and Apple’s macOS source code, the Linux
source code is available to the public for examination and modification and
is free to download and install
As a result, Linux users benefit from a huge community of developers
actively debugging and improving the kernel, and from the ability to
customize the operating system to meet specific needs
Operating Systems

Apple’s macOS and Apple’s iOS for iPhone® and iPad® Devices
Apple, founded in 1976 by Steve Jobs and Steve Wozniak, quickly became a leader in personal computing

In 1979, Jobs and several Apple employees visited Xerox PARC (Palo Alto Research Center) to learn about
Xerox’s desktop computer that featured a graphical user interface (GUI)

The Objective-C programming language, created by Stepstone in the early 1980s, added object-oriented
programming (OOP) capabilities to the C programming language

Steve Jobs left Apple in 1985 and founded NeXT Inc.

In 1988, NeXT licensed Objective-C from Stepstone

NeXT developed an Objective-C compiler and libraries, which were used as the platform for the NeXTSTEP
operating system’s user interface and Interface Builder (for constructing graphical user interfaces).

Apple’s macOS operating system is a descendant of NeXTSTEP


Operating Systems

Apple’s macOS and Apple’s iOS for iPhone® and iPad® Devices
Apple has several other proprietary operating systems derived from macOS:
▪ iOS is used in iPhones.
▪ iPadOS is used in iPads.
▪ watchOS is used in Apple Watches.
▪ tvOS is used in Apple TV devices.
Operating Systems

Apple’s macOS and Apple’s iOS for iPhone® and iPad® Devices
In 2014, Apple introduced its Swift programming language, which it open-sourced in 2015.
The Apple app-development community has largely shifted from Objective-C to Swift
Operating Systems

Google’s Android
Android—the most widely used mobile and smartphone operating
system—is based on the Linux kernel, the Java programming language and,
now, the open-source Kotlin programming language.
Android is open source and free.
According to idc.com, 84.8% of smartphones shipped in 2020 use Android,
compared to 15.2% for Apple.
The Android operating system is used in numerous smartphones, e-reader
devices, tablets, TVs, in-store touch-screen kiosks, cars, robots,
multimedia players and more.
Operating Systems

Billions of personal computers and an even larger number of mobile devices are now in use.
The explosive growth of smartphones, tablets and other devices creates significant opportunities
for mobile-app developers.
The C Programming Language

The C Programming Language


C evolved from two earlier languages, BCPL and B.
BCPL was developed in 1967 by Martin Richards as a language for writing
operating systems and compilers.
Ken Thompson modeled many features in his B language after their
counterparts in BCPL, and in 1970 he used B to create early versions of the
UNIX operating system at Bell Laboratories.
The C language evolved from B by Dennis Ritchie at Bell Laboratories and
was originally implemented in 1972.
C initially became widely known as the development language of UNIX
Many of today’s leading operating systems are written in C and/or C++.
C is mostly hardware-independent—with careful design, it’s possible to write C
programs that are portable to most computers.
The C Programming Language

C is widely used to develop systems that demand performance.


operating systems
embedded systems
real-time systems
communications systems:
By the late 1970s, C had evolved into what’s now referred to as “traditional C.”
The C Programming Language

C’s rapid expansion to various hardware platforms (that is, types of


computer hardware) led to many similar but often incompatible C
versions.
This was a serious problem for programmers who needed to develop
code for several platforms.
In 1983, the American National Standards Committee on Computers
and Information Processing (X3) created the X3J11 technical
committee to “provide an unambiguous and machine-independent
definition of the language.”
In 1989, the standard was approved in the United States through the
American National Standards Institute (ANSI), then worldwide
through the International Organization for Standardization (ISO).
The C Programming Language

We discuss the latest C standard (referred to as C11), which was approved in 2011 and
updated with bug fixes in 2018 (referred to as C18).
C11 refined and expanded C’s capabilities.
According to the C standard committee, the next C standard is likely to be released in 2022.
Because C is a hardware-independent, widely available language, C applications often can run
with little or no modification on a wide range of computer systems.
The C Programming Language

Open-Source Libraries
C programs consist of pieces called functions.
You can program all the functions you need to form a C program.
However, most C programmers take advantage of the rich
collection of existing functions in the C standard library.
Two parts to learning C programming:
learning the C language itself, and learning how to use the
functions in the C standard library.
Throughout this lesson, we discuss many of these functions.
The C Programming Language

Open-Source Libraries
When programming in C, you’ll typically use the following building blocks:
▪ C standard library functions,
▪ open-source C library functions,
▪ functions you create yourself, and
▪ functions other people have created and made available to you.
Throughout the book, we focus on using the existing C standard library to leverage your
program-development efforts and avoid “reinventing the wheel.”
This is called software reuse.
The C Programming Language

There are enormous numbers of third-party and open-source C libraries that can help
you perform significant tasks with modest amounts of code. GitHub lists over 32,000
repositories in their C category:

https://github.com/topics/c

In addition, pages such as Awesome C provide curated lists of popular C libraries for a wide
range of application areas.

https://github.com/kozross/awesome-c
Other Popular Programming Languages

BASIC was developed in the 1960s at Dartmouth College to familiarize novices with programming
techniques.
Many of its latest versions are object-oriented.
C++, which is based on C, was developed by Bjarne Stroustrup in the early 1980s at Bell Laboratories
C++ provides features that enhance the C language and adds object-oriented programming capabilities
Python is an object-oriented language that was released publicly in 1991.
Developed by Guido van Rossum of the National Research Institute for Mathematics and Computer
Science in Amsterdam.
Python has rapidly become one of the world’s most popular programming languages, especially for
educational and scientific computing, and in 2017 it surpassed the programming language R as the
most popular data-science programming language.
Other Popular Programming Languages

Java—Sun Microsystems in 1991 funded an internal corporate research project led by James
Gosling, which resulted in the C++-based object-oriented programming language called Java
“write once, run anywhere”.
used in enterprise applications, web servers, applications for consumer devices, …
C# (based on C++ and Java) is one of Microsoft’s three primary object-oriented programming
languages.
▪ developed to integrate the web into computer applications
▪ now widely used to develop many kinds of applications
As part of Microsoft’s many open-source initiatives, they now offer open-source versions of C#
and Visual Basic
Other Popular Programming Languages

JavaScript is a widely used scripting language primarily used to add programmability to web
pages.
All major web browsers support it.
Many Python visualization libraries output JavaScript for use in web pages.
Tools such as NodeJS also enable JavaScript to run outside of web browsers.
Swift is Apple’s programming language for iOS/macOS apps.
Swift includes features from Objective-C, Java, C#, Ruby, Python and more. It is open
source, so it can be used on non-Apple platforms as well.
R is a popular open-source programming language for statistical applications and visualization.
