CC111_ComputerProgramming1_PRELIM_DAY 1 to 2
NOTE
Do not write anything on this module. Answer sheets are provided; use a separate sheet of
paper if necessary.
Programming Concepts
Designing a Solution
When the problem is defined properly, we can start designing the solution to the
problem. In designing the solution, we must break down the problem into several
steps so that it is easier to solve in smaller pieces. This method is called the
divide-and-conquer principle. The resulting sequence of steps for solving the problem is
called an algorithm.
Algorithm
It refers to well-defined procedures or instructions to solve a problem.
Flowcharting
One way a programmer can illustrate the sequence of steps in an algorithm is
with a flowchart. A flowchart is a graphical representation of the sequence of
operations that a computer is to perform. Flowcharting uses easily recognizable
symbols to represent the type of processing performed in a program.
Pseudocode
Pseudocode is a version of the instructions describing each step that the
computer must follow. It is written in an abbreviated form of spoken language and
lies somewhere between commands written in ordinary English and those in a
computer language.
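For example, the steps for computing the average of two numbers (an illustrative
algorithm, not one taken from this module) might be written in pseudocode as:

READ first number
READ second number
COMPUTE average = (first number + second number) / 2
PRINT average

Notice that the pseudocode uses English-like commands (READ, COMPUTE, PRINT) without
following the exact grammar of any particular programming language.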
Program
1. A program is a list of instructions written in a programming language that a
computer can execute so that the machine acts in a predetermined way.
2. Program is synonymous with software. A program is also a sequence of instructions
that can be executed by a computer. The term may also refer to the original source
code or to an executable (machine language) version. The word program also implies
a degree of completeness, that is, a source code program compressing all statements
and files necessary for complete interpretation and compilation, and an executable
program that can be loaded into a given environment and executed independently
from other programs.
Although programming languages differ in the commands they use,
high-level programming languages have certain types of programming
statements in common. These are comments, declarations, input/output statements,
computations, transfers of control, and comparisons.
Comments are statements that have no effect on the program. They are
simply used to make the program easier to understand. They are inserted at key
points in the program and serve as internal documentation of the program.
The programmer uses declarations to define items used in the program. Examples
include definitions of files, records, initial values, reusable functions and the like.
Input/output statements transfer data to and from the primary storage for use
by the program, as well as to and from other I/O devices like the monitor and the
keyboard. Commands such as READ and PRINT are examples of these types of
statements.
Computational instructions perform arithmetic operations such as addition,
subtraction, multiplication, division and exponentiation. Different programming
languages vary in the way they invoke the computer's arithmetic capabilities.
Another type of instruction allows the sequence of execution to be altered by
transferring control. A conditional transfer of control alters the sequence only when a
certain condition is met. An unconditional transfer of control always changes the
sequence of execution.
Comparisons allow two items to be compared. Based on the result of the
comparison, input/output, computation, or transfer of control could occur.
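The statement types described above can all appear together in one small program. The
following is a minimal sketch in Java; the variable names and the overtime-pay scenario
are illustrative examples, not taken from this module:

```java
public class StatementTypes {
    // Computation and comparison isolated in a reusable method (a declaration).
    static double computePay(int hours, double rate) {
        double pay = hours * rate;                 // computation (multiplication)
        if (hours > 40) {                          // comparison and conditional transfer of control
            pay = pay + (hours - 40) * rate * 0.5; // computation: overtime at half the rate again
        }
        return pay;
    }

    public static void main(String[] args) {
        // Comment: lines like this one have no effect on the program.
        int hours = 45;      // declaration with an initial value
        double rate = 10.0;  // declaration with an initial value
        System.out.println("Pay: " + computePay(hours, rate)); // output statement
    }
}
```

Running the program prints Pay: 475.0; with 40 hours or fewer, the conditional transfer
of control skips the overtime computation entirely.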
As the program is being coded, the programmer should be aware that although
generating the correct output is the primary goal of the program, it is not the only
requirement of a good program. The programmer should try to incorporate the
following qualities into any program:
1. Programs should be easy to read and understand. Data names should be
descriptive. Statements should be placed in a format that is easy to read and follow.
Placing enough comments can help in making the program easier to understand.
2. Programs should be efficient. Programs should execute in as little time as
possible.
3. Programs should be reliable. Programs should consistently produce the correct
output. All formulas and computations, as well as all logic tests and transfers of
control, must be accurate.
4. Programs must be robust. Programs should work under all conditions. Reliability
alone is no guarantee of a successful program. The internal logic may be correct, but an
incorrect data item could still produce incorrect output. For example, how would a
program react if a person's age is entered as 16, 29, or -21? Or if, instead of
numerical values such as 1, 10, or 45, the person enters the letter A?
5. Programs should be maintainable. They should be easy to update and modify.
Programs should be written in independent modules so that a change in one module
does not require changes to other modules.
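The robustness quality in particular can be made concrete with code. The following is a
minimal sketch of robust input handling for the age example above; the method name and
the accepted range (0 to 120) are the author's illustrative assumptions, not rules from
this module:

```java
public class AgeCheck {
    // Returns true only when the text is a whole number in a plausible age range.
    static boolean isValidAge(String input) {
        try {
            int age = Integer.parseInt(input.trim()); // throws if not numeric
            return age >= 0 && age <= 120;            // reject -21, 200, etc.
        } catch (NumberFormatException e) {
            return false; // e.g. the user typed the letter A instead of a number
        }
    }

    public static void main(String[] args) {
        System.out.println(isValidAge("29"));  // a plausible age: accepted
        System.out.println(isValidAge("-21")); // negative: rejected
        System.out.println(isValidAge("A"));   // not a number: rejected, no crash
    }
}
```

A robust program checks its input like this before computing with it, instead of letting
a bad value crash the program or silently produce wrong output.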
Compiler
1. A computer program (or set of programs) that transforms source code written in a
programming language (the source language) into another computer language (the
target language, often having a binary form known as object code).
2. Refers to any program that transforms one set of symbols into another by following
a set of syntactic and semantic rules. In the most common sense, a compiler is a
program that translates all the source code of a program written in a high-level
language into object code prior to the execution of the program.
Debugging
1. In computers, debugging is the process of locating and fixing or bypassing bugs
(errors) in computer program code or the engineering of a hardware device. To debug
a program or to fix the hardware is to start with a problem, isolate the source of the
problem, and then fix it.
2. It is the process of correcting programming errors.
Testing
1. A method of assessing the functionality of a software program.
2. The process of checking if a program actually performs its functions as planned.
Maintenance
During the implementation of the system, some changes to the program will occur
depending on the nature of the system being developed. Maintenance is one of the most
important aspects of developing a computer program, in the sense that if errors or bugs
are spotted by the user, they should be fixed as soon as possible in order to avoid
bigger problems later in the use of the system. Say, for example, the user has
discovered that every time a new record is added to the database, the record is not
saved correctly; a problem like this is very critical in any business organization. It
should be corrected right away by the programmer so that it does not hamper the
day-to-day operation of the company.
Programming Language
1. A programming language is an artificial language designed to communicate
instructions to a machine, particularly a computer.
2. Programming languages can be used to create programs that control the behavior
of a machine and/or to express algorithms precisely.
3. Software used to create other software.
1. Machine Language
It is the only language that the computer understands. It consists only of the binary
digits 0 and 1. The use of binary numbers as the basis of machine language was proposed
by Dr. John von Neumann. Each different type of CPU (Central Processing Unit) has its
own unique machine language. Programs written in machine language are very fast in terms
of program execution and use only a minimal amount of computer resources such as memory,
hard drive and CPU, but it is not easy for the programmer to write, debug and test
programs in machine language. Machine language was widely used from the 1940s up to the
early 1950s, primarily because of the small storage capacity and low CPU speed of
computers during that time.
Machine language (hexadecimal) version of a DOS "Hello, World!" program:
ba 0c 01
b4 09
cd 21
b8 00 4c cd 21
48 65 6c 6c 6f 2c
20 57 6f 72 6c 64
21 0d 0a 24
The same program in assembly language:
.model small
.stack 100h
.data
message db 13,10, "Hello World!$"
.code
main proc near
lea dx, message
mov ah, 09h
int 21h
mov ax, 4c00h
int 21h
main endp
end main
The same program in a high-level language (Java):
/* hello.java */
/* Author: Mr. Jake R. Pomperada, MAED-IT*/
/* Date: August 8, 2015*/
public class HelloWorld {
public static void main(String[] args) {
System.out.println("Hello, World");
}
}
Compiler
A compiler is a program that translates a program written in a high-level language
(source code) into machine language (object code). The compiler derives its name from
the way it works: it analyzes the entire piece of source code, reorganizes the
instructions, and then translates them into machine code. The first practical compiler
was written by Grace Murray Hopper of the United States Navy in 1952; she named it the
A-0 compiler.
Every high-level programming language (except strictly interpretive languages) comes
with a compiler. In effect, the compiler is the language, because it defines which
instructions are acceptable and which are not.
Interpreter
The most common way to translate a high-level language to machine language is to
compile the program; the other method is to pass the program through an interpreter.
An interpreter translates and executes the program line by line, often translating each
line into an intermediate form, which it then executes. In contrast, a compiler
translates high-level instructions or commands directly into machine language. The
first interpreter was written by Steve Russell on an IBM 704 computer in 1958. The
first high-level language that used an interpreter to translate its code into machine
code was LISP, or List Processing language, designed and developed by John McCarthy
in 1958.
The advantage of an interpreter, however, is that it does not need to go through the
compilation stage during which machine instructions are generated. This compilation process
can be time consuming if the program is long. The interpreter, on the other hand, can
immediately execute high-level programs.
Programming languages that use interpreters include LISP, BASIC, Java, Pascal, Ruby
and Python.