Tuesday, September 29, 2009
Questions for Week 4
Computer and John von Neumann
John von Neumann
John von Neumann (Hungarian: margittai Neumann János Lajos) (December 28, 1903 – February 8, 1957) was a Hungarian American[1] mathematician who made major contributions to a vast range of fields,[2] including set theory, functional analysis, quantum mechanics, ergodic theory, continuous geometry, economics and game theory, computer science, numerical analysis, hydrodynamics (of explosions), and statistics, as well as many other mathematical fields. He is generally regarded as one of the foremost mathematicians of the 20th century...[4] Most notably, von Neumann was a pioneer of the application of operator theory to quantum mechanics, a principal member of the Manhattan Project and the Institute for Advanced Study in Princeton (as one of the few originally appointed), and a key figure in the development of game theory[2][5] and the concepts of cellular automata[2] and the universal constructor. Along with Edward Teller and Stanislaw Ulam, von Neumann worked out key steps in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb.
Von Neumann's hydrogen bomb work also played out in the realm of computing, where he and Stanislaw Ulam developed simulations on von Neumann's digital computers for the hydrodynamic computations. During this time he contributed to the development of the Monte Carlo method, which allowed complicated problems to be approximated using random numbers. Because using lists of "truly" random numbers was extremely slow on the ENIAC, von Neumann devised a way of generating pseudorandom numbers, using the middle-square method. Though this method has been criticized as crude, von Neumann was aware of this: he justified it as being faster than any other method at his disposal, and also noted that when it went awry it did so obviously, unlike methods which could be subtly incorrect.
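To make this concrete, here is a minimal C++ sketch (an illustration, not von Neumann's actual code) of the middle-square method, used to drive a tiny Monte Carlo estimate of pi. The four-digit width, the seed, and the trial count are arbitrary choices for demonstration.

// middle_square.cpp - the middle-square method driving a tiny
// Monte Carlo estimate of pi. Illustrative sketch only.
#include <iostream>

// One middle-square step for four-digit seeds: square the seed and
// keep the middle four digits of the (up to) eight-digit result.
unsigned int middleSquare(unsigned int seed)
{
    unsigned long square = (unsigned long) seed * seed;
    return (square / 100) % 10000;
}

int main()
{
    unsigned int seed = 5731;          // arbitrary starting seed
    const int TRIALS = 100000;
    int inside = 0;

    for (int i = 0; i < TRIALS; ++i)
    {
        seed = middleSquare(seed);
        double x = seed / 9999.0;      // pseudorandom value in [0, 1]
        seed = middleSquare(seed);
        double y = seed / 9999.0;
        if (x * x + y * y <= 1.0)      // point lands inside the quarter circle
            ++inside;
    }

    // The quarter circle has area pi/4, so pi is roughly 4 * inside / TRIALS.
    // Because this crude generator soon falls into a short cycle, the estimate
    // goes visibly wrong -- exactly the failure mode von Neumann acknowledged.
    std::cout << "Estimate of pi: " << 4.0 * inside / TRIALS << std::endl;
    return 0;
}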
While consulting for the Moore School of Electrical Engineering on the EDVAC project, von Neumann wrote an incomplete set of notes titled the First Draft of a Report on the EDVAC. The paper, which was widely distributed, described a computer architecture in which data and program memory are mapped into the same address space. This architecture became the de facto standard and can be contrasted with the so-called Harvard architecture, which has separate program and data memories, each on its own bus. Although the single-memory architecture became commonly known as the von Neumann architecture as a result of von Neumann's paper, its description was based on the work of J. Presper Eckert and John William Mauchly, inventors of the ENIAC at the University of Pennsylvania.[15] With very few exceptions, all present-day home computers, microcomputers, minicomputers and mainframe computers use this single-memory computer architecture.
http://en.wikipedia.org/wiki/John_von_Neumann
The IC and computers
Alan Kay
Computing
Computers (personal and mainframe)
Computing is, put quite simply, the process of using a computer to process data or perform calculations. However, when you think about it, the idea of "computing" can be viewed in many ways. Would you say that simply using a computer counts as computing? Or does it apply only to numerical computations? Is processing data limited to calculations alone? I myself view computing as the ways in which we use computers to our technological advantage. For example, let's say you had to do a simple addition problem that required adding many numbers. Instead of plugging all those numbers into a handheld calculator, you could enter them into a calculator program on your computer, or even use Google.com. If you type in your mathematical equation, Google.com will compute it for you! Our computers are always processing the data that we put into them; they are constantly computing new information from it.
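As a trivial illustration of "using a computer to perform calculations," here is a short C++ sketch that adds up a list of numbers, the same job you might hand to a pocket calculator or to Google; the values are made up for the example.

// sum_numbers.cpp - adding many numbers, expressed as a program.
// The values are arbitrary examples.
#include <iostream>

int main()
{
    double numbers[] = { 12.5, 7.0, 19.25, 3.75, 42.0 };
    double total = 0.0;

    // Accumulate each value into the running total.
    for (int i = 0; i < 5; ++i)
        total += numbers[i];

    std::cout << "The sum is " << total << std::endl;  // prints 84.5
    return 0;
}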
calculators
A calculator is an instrument used to perform mathematical calculations. The functions calculators perform can range from the very simple, such as addition and subtraction, to incredibly complex equations that would take a human days to complete. They can also be specific to certain areas of math, e.g., scientific, graphing, trigonometric, and statistical calculators.
The abacus is the earliest calculator, consisting of a wooden frame, wires, and beads that could be moved back and forth to express simple calculations. Calculators became electronic, and later solar powered, in the 20th century, and modern pocket-sized calculators became popular during the 1970s.
The difference between calculators and computers is that computers can be programmed to perform different functions and operations, whereas calculators are predesigned for specific numerical computing. Computers can operate on numbers but can also manipulate words, images, and sounds, whereas calculators are mainly limited to numbers and symbols. Computers are also much more complex devices, so the speed at which they run fluctuates depending on the hardware and on how much computing the user asks the computer to do. Conversely, calculator speed is constrained by how fast the user presses the buttons.
Genius! Alan Turing
Douglas Engelbart & the computer mouse
Monday, September 28, 2009
Algorithms and Computers
An algorithm is a finite, step-by-step procedure for performing a computation or solving a problem. In other words, it is a pattern used in order to solve a problem.
One great example of an algorithm is solving the Rubik's Cube. There are three different, popular algorithms: Thistlethwaite's Algorithm, Kociemba's Algorithm, and Korf's Algorithm (Optimal Solutions for Rubik's Cube). These algorithms all involve a series of spins in order to assemble the cube correctly.
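For a code-level example, here is a minimal sketch of one of the oldest algorithms on record, Euclid's method for finding the greatest common divisor. It follows the same fixed pattern of steps every time until the problem is solved, which is exactly what makes it an algorithm.

// gcd.cpp - Euclid's algorithm: a fixed, repeatable pattern of steps
// that always terminates with the greatest common divisor.
#include <iostream>

int gcd(int a, int b)
{
    // Repeatedly replace the pair (a, b) with (b, a mod b)
    // until the remainder is zero; the answer is then a.
    while (b != 0)
    {
        int remainder = a % b;
        a = b;
        b = remainder;
    }
    return a;
}

int main()
{
    std::cout << "gcd(48, 36) = " << gcd(48, 36) << std::endl;  // prints 12
    return 0;
}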
Princeton defines a computer as: "a machine for performing calculations automatically." For me today, a computer is something that helps me to perform almost all of the work in my life. It enables me to write, communicate, listen to music, edit pictures, etc.
The advanced calculations that my machine does automatically are what run my life day to day. Computer speeds today are measured in FLOPS (FLoating point Operations Per Second). A simple calculator executes only a few FLOPS, while a supercomputer might reach about 900 GFLOPS (gigaflops).
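As a rough illustration of what a FLOPS figure means, here is a C++ sketch that times a loop of floating-point multiplies and adds; the iteration count is arbitrary, and a real benchmark (such as LINPACK) is far more careful about what it counts.

// flops_sketch.cpp - a very rough estimate of floating-point operations
// per second. Real benchmarks are far more careful than this.
#include <iostream>
#include <ctime>

int main()
{
    const long N = 100000000;            // arbitrary iteration count
    double x = 1.0;

    std::clock_t start = std::clock();
    for (long i = 0; i < N; ++i)
        x = x * 1.0000001 + 0.0000001;   // one multiply + one add
    std::clock_t stop = std::clock();

    double seconds = double(stop - start) / CLOCKS_PER_SEC;
    // Two floating-point operations per iteration; printing x keeps the
    // compiler from optimizing the loop away entirely.
    std::cout << "Approx. " << (2.0 * N / seconds) / 1e9
              << " GFLOPS (x = " << x << ")" << std::endl;
    return 0;
}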
The Personal Computer
In the late 1970s and early 1980s, computers began to appear in individual households. They offered personal productivity, programming, and games, and they also helped offices and small businesses.
Personal computers have made a large impact on today's society, giving individuals more privacy and portability. Over the past decade, PCs have changed drastically. I remember believing I was the coolest kid on the block because I owned a clear-and-teal, egg-shaped Mac computer in 2000, yet today the screens are flat and laptops are a quarter of the size they used to be.
Microprocessors
Microprocessors have evolved to handle a multitude of computing tasks and have truly revolutionized the way electronics and computers operate. For example, cybernetics would never have evolved into what it is today without the advances that began with the first microprocessor released by Intel, the 4004. Can you think of any other inventions or innovations today that would have been impossible without the invention of the microprocessor?
Architecture
A digital computer typically consists of a control unit, an arithmetic-logic unit, a memory unit, and input/output units. The arithmetic-logic unit (ALU) performs simple addition, subtraction, multiplication, division, and logic operations—such as OR and AND.
The main computer memory, usually high-speed random-access memory (RAM), stores instructions and data. The control unit fetches data and instructions from memory and effects the operations of the ALU.
The control unit and ALU usually are referred to as a processor, or central processing unit (CPU). The operational speed of the CPU primarily determines the speed of the computer as a whole. The basic operation of the CPU is analogous to a computation carried out by a person using an arithmetic calculator.
The control unit corresponds to the human brain and the memory to a notebook that stores the program, initial data, and intermediate and final computational results. In the case of an electronic computer, the CPU and fast memories are realized with transistor circuits.
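To tie these pieces together, here is a hypothetical, heavily simplified C++ sketch of such a machine: one array serves as the single memory holding both instructions and data, the loop plays the role of the control unit, and the arithmetic it performs stands in for the ALU. The two-cell instruction format and the opcodes are invented purely for illustration.

// toy_machine.cpp - a deliberately tiny stored-program machine.
// Instructions and data share one memory; a control loop fetches,
// decodes, and executes. The instruction set is invented.
#include <iostream>

enum Opcode { LOAD = 0, ADD = 1, STORE = 2, HALT = 3 };

int main()
{
    // Single memory: cells 0-7 hold the program, cells 8+ hold data.
    // Each instruction is two cells: an opcode, then a memory address.
    int memory[16] = {
        LOAD,  8,     // acc = memory[8]
        ADD,   9,     // acc = acc + memory[9]
        STORE, 10,    // memory[10] = acc
        HALT,  0,
        20, 22, 0     // data: operands in cells 8 and 9, result in cell 10
    };

    int pc = 0;       // program counter (control unit state)
    int acc = 0;      // accumulator (the register the ALU works on)

    while (true)
    {
        int opcode  = memory[pc];        // fetch
        int address = memory[pc + 1];
        pc += 2;

        if (opcode == LOAD)        acc = memory[address];   // decode and
        else if (opcode == ADD)    acc += memory[address];  // execute
        else if (opcode == STORE)  memory[address] = acc;   // (ALU work)
        else break;                                         // HALT
    }

    std::cout << "memory[10] = " << memory[10] << std::endl;  // prints 42
    return 0;
}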
I/O units, or devices, are commonly referred to as computer peripherals and consist of input units (such as keyboards and optical scanners) for feeding instructions and data into the computer and output units (such as printers and monitors) for displaying results.
In addition to RAM, a computer usually contains some slower, but larger and permanent, secondary memory storage. Almost all computers contain a magnetic storage device known as a hard disk, as well as a disk drive to read from or write to removable magnetic media known as floppy disks.
Various optical and magnetic-optical hybrid removable storage media are also quite common, such as CD-ROMs (compact disc, read-only memory) and DVD-ROMs (digital video [or versatile] disc read-only memory).
Computers also often contain a cache—a small, extremely fast (compared to RAM) memory unit that can be used to store information that will be urgently or frequently needed. Current research includes cache design and algorithms that can predict what data is likely to be needed next and preload it into the cache for improved performance.
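As a hedged illustration of why the cache matters, this sketch sums the same matrix twice: row by row, which touches memory in the order it is stored and keeps the cache helping, and column by column, which keeps jumping and missing the cache. On most machines the first loop runs noticeably faster; the matrix size is arbitrary.

// cache_order.cpp - summing one matrix two ways to show cache effects.
// Row-major traversal is sequential and cache-friendly; column-major
// traversal strides through memory and causes cache misses.
#include <iostream>
#include <vector>
#include <ctime>

int main()
{
    const int N = 2000;
    std::vector<double> matrix(N * N, 1.0);  // stored row by row

    std::clock_t t0 = std::clock();
    double rowSum = 0.0;
    for (int i = 0; i < N; ++i)              // row by row: sequential access
        for (int j = 0; j < N; ++j)
            rowSum += matrix[i * N + j];
    std::clock_t t1 = std::clock();

    double colSum = 0.0;
    for (int j = 0; j < N; ++j)              // column by column: strided access
        for (int i = 0; i < N; ++i)
            colSum += matrix[i * N + j];
    std::clock_t t2 = std::clock();

    std::cout << "row-major:    " << double(t1 - t0) / CLOCKS_PER_SEC << " s\n"
              << "column-major: " << double(t2 - t1) / CLOCKS_PER_SEC << " s\n"
              << "(both sums: " << rowSum << " and " << colSum << ")" << std::endl;
    return 0;
}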
Sunday, September 27, 2009
Operating Systems
What is an operating system? A simple definition is "software designed to control the hardware of a specific data-processing system in order to allow users and application programs to make use of it" (The American Heritage Dictionary of the English Language). In other words, the operating system is the core software that manages a computer's resources and lets users and applications make use of the hardware. The OS manages communication, input/output, device storage, and program operation. Examples of operating systems include Leopard and Snow Leopard for the Mac and Windows Vista for PCs. There are operating systems for PDAs as well: for example, many PDAs run Windows Mobile, and BlackBerry has its own operating system. An operating system allows the user to operate and access the computer's different functions.
Hardware vs. Software
I found a great website (www.diffen.com/difference/Special:AshExtension?diffenVal1=Hardware&diffenVal2=Software) that thoroughly describes the differences between hardware and software. A computer can be broken down into two components: hardware and software. Basically, the website discusses each in terms of definition, interdependence, examples, types, and functions.
Software is a general term used to describe a collection of computer programs, procedures, and documentation that perform some task on a computer system. It is usually written in high-level programming languages. Some examples include Internet Explorer, Adobe Acrobat, Microsoft Office, etc.
Hardware is a physical device, something that you’re able to touch and see. The computer monitor, mouse, keyboard, CD-ROM and printer are all examples of computer hardware. Without any hardware the computer would not exist and software would have nothing to run on.
Hardware and software are interdependent: hardware cannot do useful work until software is loaded, and software must be installed on hardware in order to run. The hardware of a computer is changed infrequently compared with software and data, which are frequently created, modified, or erased.
Computer and Computer Language
Computer language and programming language are often used interchangeably, so I am going to assume that the premise of the question is to define programming language. According to Wikipedia, a computer language "is an artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to create programs that control the behavior of a machine, to express algorithms precisely, or as a mode of human communication." Each language has its own syntax and is structured differently, but many programming languages share the same concepts, such as control structures, loops, subroutines, arrays, hashes, regular expressions, classes, and much more. There are many languages, such as C, C++, Java, Perl, HTML, Visual Basic, etc.
To give a visual sense of what a program looks like, I've included this simple C++ program that I made for one of my C.S. classes. This program will calculate total home cost based on the square footage of a house, optional features, discount, and tax.
// File name: lab2fall08.cpp
// Author: Gabriel Acosta
// Date: September 21, 2008
// Course: CS1600
#include <iostream>
#include <iomanip>
using std::fixed;        // ensures that the decimal point is displayed
using std::setprecision; // sets numeric output precision

int main()
{
    // variable declarations
    double houseSquareFootage;
    int optionalFeatures;
    int discount;
    const double STATE_TAX_RATE = .075;
    const double PERCENT_PROFIT = .25;
    const double COST_PER_SQUARE_FT = 66.67;

    // Ask user to input the house square footage
    std::cout << "Input the House Square Footage: ";
    std::cin >> houseSquareFootage;
    double basicHomeCost = houseSquareFootage * COST_PER_SQUARE_FT; // calculates Basic Home Cost

    // Ask user to input the optional features cost
    std::cout << "Input Optional Features Cost: ";
    std::cin >> optionalFeatures;
    double profit = (basicHomeCost + optionalFeatures) * PERCENT_PROFIT; // calculates Profit

    // Ask user to input any discount
    std::cout << "Input any discount: ";
    std::cin >> discount;
    double netHomeCost = basicHomeCost + optionalFeatures + profit - discount; // calculates Net Home Cost

    // Give user the total home cost
    double taxes = netHomeCost * STATE_TAX_RATE; // calculates taxes
    double totalHomeCost = netHomeCost + taxes;  // calculates Total Home Cost
    std::cout << "The total cost is $"
              << setprecision(2) << fixed << totalHomeCost << std::endl;
    return 0;
} // end of the program
In addition, you can go to this website and actually use another program that I created using Perl: http://storm.cis.fordham.edu/~acosta/cgi-bin/cgitable.pl
This program is a simple version of how businesses conduct e-commerce. It collects your information and, if you enter your correct email address, sends a confirmation to that address with all the information you included. It acts much like the receipt you would get if you bought a product online.
Programming languages are evolving and becoming increasingly sophisticated; no wonder IT jobs are in high demand.
Thursday, September 24, 2009
Computer and Herman Hollerith
Questions for Week 3
Other key terms to define:
computing
computer program
algorithm
computer language
software and hardware
integrated circuit
microprocessor
mainframe computer
minicomputer
microcomputer
personal computer
calculator
GUI
operating system
architecture
and/or identify the following individuals:
Information Theory and Communication
Tuesday, September 22, 2009
Frankenstein's Foreshadowing
significance of the electric light bulb
Alternating vs. Direct Currents
[Diagram of various types of current]
While Direct Current is typically found in low-voltage applications, Alternating Current serves higher-voltage applications.
Direct Current is used in batteries, solar power, and most automotive applications.
Alternating Current is what power plants produce, and it is used in our homes to power devices such as televisions, lights, and computers.
Monday, September 21, 2009
The Difference Between Electric and Electronic
The significance of the invention of the telephone
Transistors and Vacuum Tubes
What is the difference between electric and electronic?
Timeline of Electricity and Telecommunications
1600- William Gilbert, a British scientist and physician to Queen Elizabeth I, coined the term electricity. He was the first person to describe the earth's magnetic field and to investigate the relationship between magnetism and electricity.
1752- Benjamin Franklin flew a kite with a metal tip into a thunderstorm to prove that lightning is a form of electricity.
1700s- Machines for producing static electricity easily and reliably were developed; the best-known design, the Wimshurst machine (perfected in the 1880s), functions by rotating two parallel plates in opposite directions, producing a charge around the edges of the plates.
1827- Georg Ohm, a German professor, published his complete mathematical theory of electricity. The ohm later became the name of the unit of electrical resistance.
1831- Michael Faraday demonstrated electromagnetic induction by passing a magnet through a coil of wire. A few years later, in 1837, Charles Wheatstone and William Fothergill Cooke patented the first practical telegraph machine.
1838- Samuel Morse invented Morse Code, a system of dots and dashes to communicate, which later became standard throughout the world.
1870s- Thomas Edison built a direct current electric generator in America. He later supplied electricity to parts of New York City.
1876- Alexander Graham Bell, inventor of the telephone, used electricity to transmit speech for the first time.
1878- Joseph Swan, a British scientist, demonstrated the first electric light with a carbon filament lamp. A few months later, Thomas Edison made a similar demonstration in America.
1880s- Nikola Tesla developed an alternating current motor and a system of AC power generation. Edison saw Tesla's system as a threat to his DC supply and spread stories that it was not safe. It later gained acceptance.
1881- The first public electricity supply was generated in Godalming, Surrey using a waterwheel at a nearby mill.
1886- Heinrich Hertz produced and detected electromagnetic waves in the atmosphere.
1897- Guglielmo Marconi sent a radio message 20 miles. He later sent a message across the Atlantic.
1906 - Lee de Forest invented the Audion, the first triode vacuum tube.
1910 - The first commercial radios were sold.
1918 - Edwin Armstrong developed the superheterodyne receiving circuit.
1919 - The Radio Corporation of America (RCA) was formed.
1920 - Westinghouse radio station KDKA was established.
1929- Commercial ship-to-shore telephone service opened.
1933 - Edwin Armstrong demonstrated frequency modulation (FM).
1934 - Congress passed the Communications Act of 1934, and the Federal Communications Commission was founded.
1935 - The first telephone call around the world was made.
1947- Opening of commercial telephone service for passengers on certain trains running between New York and Washington, D.C.
1954 - The Regency TR-1, the first transistor radio, went on sale for $49.95; Sony introduced its first transistor radio the following year.