Tuesday, September 29, 2009

Questions for Week 4

Everyone discuss the term digital, what it means, the different types of technologies, forms of communication, etc., that it is associated with, how it relates to information, electricity, computing, etc., how it contrasts with the term analog, and anything else you think relevant or otherwise worth contributing.

Mac vs. PC

 I figured we should include this on our blog.



Computer and John Von Neumann

According to Techterms.com, a computer is a programmable machine. This means it can execute a programmed list of instructions and respond to new instructions that it is given. Today, however, the term is most often used to refer to the desktop and laptop computers that most people use. When referring to a desktop model, the term "computer" technically only refers to the computer itself -- not the monitor, keyboard, and mouse. Still, it is acceptable to refer to everything together as the computer. If you want to be really technical, the box that holds the computer is called the "system unit."


John Von Neumann


John von Neumann (Hungarian: margittai Neumann János Lajos) (December 28, 1903 – February 8, 1957) was a Hungarian American[1] mathematician who made major contributions to a vast range of fields,[2] including set theory, functional analysis, quantum mechanics, ergodic theory, continuous geometry, economics and game theory, computer science, numerical analysis, hydrodynamics (of explosions), and statistics, as well as many other mathematical fields. He is generally regarded as one of the foremost mathematicians of the 20th century...[4] Most notably, von Neumann was a pioneer of the application of operator theory to quantum mechanics, a principal member of the Manhattan Project and the Institute for Advanced Study in Princeton (as one of the few originally appointed), and a key figure in the development of game theory[2][5] and the concepts of cellular automata[2] and the universal constructor. Along with Edward Teller and Stanislaw Ulam, von Neumann worked out key steps in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb.

Von Neumann's hydrogen bomb work was also played out in the realm of computing, where he and Stanislaw Ulam developed simulations on von Neumann's digital computers for the hydrodynamic computations. During this time he contributed to the development of the Monte Carlo method, which allowed complicated problems to be approximated using random numbers. Because using lists of "truly" random numbers was extremely slow for the ENIAC, von Neumann developed a form of making pseudorandom numbers, using the middle-square method. Though this method has been criticized as crude, von Neumann was aware of this: he justified it as being faster than any other method at his disposal, and also noted that when it went awry it did so obviously, unlike methods which could be subtly incorrect.
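To get a feel for how the middle-square method works, here is a minimal sketch in C++ (my own illustration, not von Neumann's original ENIAC routine): square a four-digit seed and keep the middle four digits of the (up to) eight-digit result as the next pseudorandom number.

// The middle-square method: the next "random" value is the middle four
// digits of the square of the current four-digit value.
#include <iostream>

int middleSquare(int seed)
{
    long long squared = static_cast<long long>(seed) * seed;   // at most 8 digits
    return static_cast<int>((squared / 100) % 10000);          // keep the middle 4 digits
}

int main()
{
    int value = 1234;   // arbitrary starting seed
    for (int i = 0; i < 5; ++i)
    {
        value = middleSquare(value);
        std::cout << value << std::endl;
    }
    return 0;
}

Run long enough, the sequence eventually collapses to zero or falls into a short cycle, which is the kind of obvious failure von Neumann had in mind when he said the method went awry visibly rather than subtly.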


While consulting for the Moore School of Electrical Engineering on the EDVAC project, von Neumann wrote an incomplete set of notes titled the First Draft of a Report on the EDVAC. The paper, which was widely distributed, described a computer architecture in which data and program memory are mapped into the same address space. This architecture became the de facto standard and can be contrasted with a so-called Harvard architecture, which has separate program and data memories on separate buses. Although the single-memory architecture became commonly known by the name von Neumann architecture as a result of von Neumann's paper, the architecture's description was based on the work of J. Presper Eckert and John William Mauchly, inventors of the ENIAC at the University of Pennsylvania.[15] With very few exceptions, all present-day home computers, microcomputers, minicomputers and mainframe computers use this single-memory computer architecture.

http://en.wikipedia.org/wiki/John_von_Neumann



The IC and computers

According to BusinessDictionary.com, the integrated circuit is a miniature (typically 5 millimeters square, 1 millimeter thick) electronic device containing a few to millions of solid-state components etched or printed on a silicon or other semiconductor chip (called a silicon chip or microchip). A four megabit memory chip, for example, contains four million two-paired transistors (called 'sandwiches') plus components for control circuitry. Depending on the number of components, ICs are classified (in ascending order) as 'small scale integration' (SSI), 'medium scale integration' (MSI), 'large scale integration' (LSI), 'very large scale integration' (VLSI), and 'ultra large scale integration' (ULSI). ICs consume very little current, generate comparatively little heat, and are far more shock-proof and reliable than the older discrete-component circuits. They are encased in ceramic or plastic material, and are connected to other ICs via external pins that fit into a socket of a circuit board. The IC was invented in 1958 independently by Jack Kilby (1923-2005) and Robert Noyce (1927-1990).

These are important for computers because they maximize the potential of a computer's capabilities. By adding circuits to enhance processing power, memory, or other vital aspects of modern computers, one can get the most out of the computer without changing much electronically. In fact, as stated above, since integrated circuits consume very little current, they will not draw much energy away from the machine or make it any less efficient from an electricity standpoint.

Alan Kay

Alan Curtis Kay (born May 17, 1940) is an American computer scientist, known for his early pioneering work on object-oriented programming and windowing graphical user interface design.

Before today, I knew absolutely nothing about Alan Kay. However, immediately after reading about him I knew why he was included on the list of key terms and people. Alan Kay worked on object-oriented programming and windowing graphical user interface designs, both of which people use every day. When people log on to their computers, unless they are using MS-DOS, they are working in a graphical user interface, e.g. Windows. Object-oriented programming is popular as well: Visual Basic .NET and C# are both used on Microsoft's .NET platform, and Java is another widely used object-oriented language.
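As a rough illustration of what object-oriented programming means, here is a minimal C++ sketch (C++ rather than Smalltalk, the object-oriented language Kay helped create; the Window class here is purely hypothetical): data and the behavior that acts on it are bundled together into an object, and the program works by asking objects to do things.

// A minimal, hypothetical object-oriented example (illustration only).
#include <iostream>
#include <string>

class Window
{
public:
    Window(const std::string& title) : title_(title) {}   // each object carries its own data
    void draw() const { std::cout << "Drawing window: " << title_ << std::endl; }
private:
    std::string title_;
};

int main()
{
    Window browser("My Browser");
    browser.draw();   // we ask the object to draw itself rather than manipulating its data directly
    return 0;
}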

Computing

According to Wikipedia, computing is usually defined as the activity of using and developing computer technology, computer hardware, or software. In other words, it is the computer-specific part of information technology.

In relation to our New Media course, I think computing directly relates to the development of media. Over time, media has evolved dramatically. For example, HD picture quality gives an audience a level of detail that was previously unavailable. Also, devices such as the telephone and computer have developed into the multi-tasking devices that we use today.

Computers (personal and mainframe)

My comprehension of how a computer is defined is as follows: A "personal" computer is a machine used for a variety of tasks, from simple calculations to complicated graphics design and film editing. For me personally, a computer is firstly a word processor and secondly a media center. That is to say, I use my computer for music, movies, pictures and TV. A computer is also an easy access point for the internet, where the possibilities for media and information intake are limitless. Beyond all this, a computer allows for connectivity among peers. This blog, for instance, would not even exist without computers. Computers connect us to each other and the rest of the world.

Different from our personal computers, mainframe computers are traditionally used by large organizations for bulk data processing. For instance, tasks like census tabulation and financial transaction processing require a computer that can make an enormous number of calculations in a short amount of time.

computers computing

The basic way for me to define my computer is my life. I honestly would not be able to function without my computer. I use my computer in just about all aspects of my life, whether it's to check my email, type up class notes, or connect with friends. That being said, I think that a major part of the definition of computers is the internet. Although I could still use my computer for word processing, without the internet it would be practically useless for me.

Computing is, put quite simply, the process of using a computer to process data or perform calculations. However, when you think about it, the idea of "computing" can be viewed in many ways. Would you think that simply using a computer counts as computing? Or would it only apply to numerical computations? Would the idea of processing data be limited to calculations alone? I myself view computing as the ways in which we use computers to our technological advantage. For example, let's say you had to do a simple addition problem, but it required the addition of many numbers. Instead of plugging all those numbers into a handheld calculator, you could either enter them into a calculator program on your computer or even use Google.com. If you type in your mathematical equation, Google.com will compute it for you! Our computers are always processing the data that we put into them; they are constantly computing new information and data.
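As a trivial, made-up example of this kind of everyday computing, the little C++ program below adds up a list of numbers, the same chore you might otherwise hand to a pocket calculator or to Google.

// Adding up a short list of numbers (the values are made up for illustration).
#include <iostream>

int main()
{
    double numbers[] = { 12.5, 7.25, 100.0, 3.75, 42.0 };
    double sum = 0.0;
    for (int i = 0; i < 5; ++i)   // visit each number once and accumulate the total
        sum += numbers[i];
    std::cout << "Sum: " << sum << std::endl;   // prints 165.5
    return 0;
}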

calculators

A calculator is an instrument used to perform mathematical calculations. The functions they perform can range from very simple, such as addition and subtraction, to incredibly complex equations that would take a human days to complete. They can also be very specific to certain areas of math; i.e., scientific, graphing, trigonometric, statistical calculators, etc.


The abacus is the earliest calculator, consisting of a wooden frame, wires, and beads that could be moved back and forth to carry out simple calculations. Calculators became electronic and solar powered in the 20th century. Modern pocket-sized calculators became popular during the 1970s.


The difference between calculators and computers is that computers can be programmed to perform different functions and operations, whereas calculators are predesigned for specific computing of numbers. Computers can operate with numbers but also manipulate words, images and sounds, whereas calculators are mainly limited to numbers and symbols. Computers are also much more complex devices, which allows the speed at which they run to fluctuate depending on the hardware and the amount of computing the user asks the computer to do. Conversely, calculator speed is constrained by how fast the user presses the buttons.

Genius! Alan Turing

Alan Turing (1912-1954) was an English mathematician, logician, cryptanalyst, and computer scientist. He was influential in the development of computer science and provided an influential formalisation of the concept of the algorithm and computation with the Turing machine. In 1999, Time Magazine named Turing as one of the 100 Most Important People of the 20th Century for his role in the creation of the modern computer, and stated: "The fact remains that everyone who taps at a keyboard, opening a spreadsheet or a word-processing program, is working on an incarnation of a Turing machine."

During the Second World War, Turing worked for the Government Code and Cypher School at Bletchley Park, Britain's codebreaking centre. For a time he was head of Hut 8, the section responsible for German naval cryptanalysis. He devised a number of techniques for breaking German ciphers, including the method of the bombe, an electromechanical machine that could find settings for the Enigma machine. After the war he worked at the National Physical Laboratory, where he created one of the first designs for a stored-program computer, the ACE.

His late life was full of controversy; it is sad and interesting to see how different and repressive life was back then.

Douglas Engelbart & the computer mouse

Douglas Engelbart was born in 1925, and is still alive today. He was an American inventor and is known as an early computer engineer. He is most well-known for the creation of the computer mouse, hypertext ("text, displayed on a computer, with references to other text that the reader can immediately access" Wikipedia) , early GUIs, and the beginnings of interactive computing.
Working at SRI (Stanford Research Institute), he developed many patents. With his team at SRI, he developed computer-interface elements such as bit-mapped screens, the mouse, hypertext, collaborative tools, and precursors to the graphical user interface. Between 1967 and 1970 he applied for and received the patent for what he called the computer "mouse," named for the cord ("tail") trailing from its wooden shell, which housed two metal wheels. His team also named the on-screen cursor the "bug," but that term never caught on.
Unfortunately, "he never received any royalties for his mouse invention. During an interview, he says 'SRIpatented the mouse, but they really had no idea of its value. Some years later it was learned that they had licensed it to Apple for something like $40,000'" (Wikipedia).
Without Engelbart's achievements in computer science, we would not have what we now consider a simple and normal part of our computing systems, the mouse. Even the changes over the years between the PC and the Mac show an ever-changing shape, use, and placement of the mouse, both as a separate physical device and as part of the keyboard.


Monday, September 28, 2009

Algorithms and Computers

According to the National Institute of Standards and Technology, an algorithm is "a computable set of steps to achieve a desired result." The Persian author Abu Ja'far Mohammed ibn Mûsâ al-Khowârizmî wrote a book in 825 AD that first discussed these types of rules.
In other words, it is a pattern used in order to solve a problem.

One great example of an algorithm is the Rubik's Cube. There are three different popular algorithms: Thistlethwaite's Algorithm, Kociemba's Algorithm, and Korf's Algorithm (Optimal Solutions for Rubik's Cube). These algorithms all involve a series of turns in order to solve the cube correctly.
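For a smaller, classic example of "a computable set of steps" (my own choice, not taken from the NIST definition or the Rubik's Cube solvers above), here is Euclid's algorithm for finding the greatest common divisor of two numbers, written out in C++:

// Euclid's algorithm: repeatedly replace the pair (a, b) with (b, a mod b)
// until the remainder is zero; the last nonzero value is the GCD.
#include <iostream>

int gcd(int a, int b)
{
    while (b != 0)
    {
        int remainder = a % b;
        a = b;
        b = remainder;
    }
    return a;
}

int main()
{
    std::cout << gcd(48, 36) << std::endl;   // prints 12
    return 0;
}

No matter which two positive integers you start with, the same fixed steps produce the answer, which is exactly what makes it an algorithm.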



Princeton defines a computer as "a machine for performing calculations automatically." For me today, a computer is something that helps me to perform almost all of the work in my life. It enables me to write, communicate, listen to music, edit pictures, etc.

The advanced calculations that my machine does automatically are what run my life day to day. Computer speeds today are measured in FLOPS (FLoating point Operations Per Second). A simple calculator executes only a few FLOPS, while a supercomputer can achieve hundreds of gigaflops (GFLOPS) or more.

The Personal Computer

The personal computer, otherwise known as a PC, refers to any general-purpose computer that is practical and useful for individuals. PCs are operated directly by an end user, with no intervening computer operator. Desktop computers, laptops, tablet computers and palmtops are all considered personal computers. With early PCs, owners had to write or type in their own programs to do anything useful, but today's PCs run commercial software which is provided in ready-to-run form.

In the late 1970s and early 1980s, computers began to be used in individual households. They offered personal productivity, programming and games. They also helped with office work and small businesses.

Personal computers have made such a large impact on today's society, giving individuals more privacy and portability. Over the past decade, the design of PCs has changed drastically. I remember believing I was the coolest kid on the block because I owned a clear and teal, egg-shaped Mac computer in 2000, yet today the screens are flat and the laptops are a quarter of the size they used to be.

Microprocessors

Referred to as the "heart" of most computers, the microprocessor is "a complete computation engine that is fabricated on a single chip (Microprocessor)." The first microprocessor is identified as the Intel 4004. "The 4004 is the first complete CPU on one chip, the first commercially available microprocessor, a feat made possible by the use of the new silicon gate technology allowing the integration of a higher number of transistors and a faster speed than was possible before (Intel)." In the November, 1971 issue of Electronic News an ad was placed by Intel (pictured below) highlighting a "new era of electronics," referring to the microprocessor as a "building block" that engineers could purchase and then customize with software to perform different functions in a wide variety of electronic devices.

Microprocessors have evolved today into being able to do a multitude of computing and have truly revolutionized the way electronics and computers operate. For example, cybernetics would never have evolved into what it is today without the advancement of the first microprocessor released by Intel. Can you think of any other inventions or innovations today that would have been impossible without the invention of the microprocessor?

Architecture

A broad understanding of architecture is the designing or design of any kind of system or structure with a strong concern for aesthetics. In computer science, architecture refers to all levels of hardware design, as well as the integration of the hardware and software components which form computer systems.

A digital computer typically consists of a control unit, an arithmetic-logic unit, a memory unit, and input/output units. The arithmetic-logic unit (ALU) performs simple addition, subtraction, multiplication, division, and logic operations—such as OR and AND.

The main computer memory, usually high-speed random-access memory (RAM), stores instructions and data. The control unit fetches data and instructions from memory and effects the operations of the ALU.

The control unit and ALU usually are referred to as a processor, or central processing unit (CPU). The operational speed of the CPU primarily determines the speed of the computer as a whole. The basic operation of the CPU is analogous to a computation carried out by a person using an arithmetic calculator.

The control unit corresponds to the human brain and the memory to a notebook that stores the program, initial data, and intermediate and final computational results. In the case of an electronic computer, the CPU and fast memories are realized with transistor circuits.
I/O units, or devices, are commonly referred to as computer peripherals and consist of input units (such as keyboards and optical scanners) for feeding instructions and data into the computer and output units (such as printers and monitors) for displaying results.
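To make the fetch-and-execute cycle described above a bit more concrete, here is a toy sketch in C++ (the four-instruction machine is invented purely for illustration): a single memory array holds both the program and the data, a program counter stands in for the control unit, and an accumulator stands in for the ALU.

// A toy fetch-decode-execute loop for an invented four-instruction machine.
#include <iostream>

enum Opcode { LOAD = 0, ADD = 1, STORE = 2, HALT = 3 };

int main()
{
    // One memory holds both the program (opcode/operand pairs) and the data,
    // in the spirit of the single-memory von Neumann design.
    int memory[] = { LOAD, 9, ADD, 10, STORE, 11, HALT, 0, 0,   // program
                     2, 3, 0 };                                  // data at addresses 9, 10, 11
    int accumulator = 0;   // one-register stand-in for the ALU
    int pc = 0;            // program counter (the control unit's bookmark)

    while (true)
    {
        int opcode  = memory[pc];       // fetch the instruction
        int operand = memory[pc + 1];
        pc += 2;

        if (opcode == LOAD)       accumulator = memory[operand];   // decode and execute
        else if (opcode == ADD)   accumulator += memory[operand];
        else if (opcode == STORE) memory[operand] = accumulator;
        else                      break;                           // HALT
    }

    std::cout << "Result stored in memory[11]: " << memory[11] << std::endl;   // prints 5
    return 0;
}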

In addition to RAM, a computer usually contains some slower, but larger and permanent, secondary memory storage. Almost all computers contain a magnetic storage device known as a hard disk, as well as a disk drive to read from or write to removable magnetic media known as floppy disks.

Various optical and magnetic-optical hybrid removable storage media are also quite common, such as CD-ROMs (compact disc, read-only memory) and DVD-ROMs (digital video [or versatile] disc read-only memory).

Computers also often contain a cache—a small, extremely fast (compared to RAM) memory unit that can be used to store information that will be urgently or frequently needed. Current research includes cache design and algorithms that can predict what data is likely to be needed next and preload it into the cache for improved performance.

Sunday, September 27, 2009

Operating Systems

According to The American Heritage New Dictionary of Cultural Literacy (via dictionary.com), a computer is "an electronic device that stores and manipulates information. Unlike a calculator, it is able to store a program and retrieve information from its memory. Most computers today are digital, which means they perform operations with quantities represented electronically as digits."

What is an operating system? A simple definition is "software designed to control the hardware of a specific data-processing system in order to allow users and application programs to make use of it" (The American Heritage Dictionary of the English Language). In other words, the operating system is what we refer to when we are talking about the core collection of a computer's system software. The OS manages communication, input/output, device storage, and the running of programs. Examples of operating systems include Leopard or Snow Leopard for the Mac and Windows Vista for PCs. There are operating systems for PDAs as well; for example, many PDAs run Windows Mobile, and BlackBerry has its own operating system. An operating system allows the user to operate and access the computer's different functions.

Hardware vs. Software

According to Whatis.com, a computer is a device that accepts information in the form of digital data and manipulates it for some result based on a program or sequence of instructions on how the data is to be processed. Complex computers also include the means for storing data for some necessary duration.

I found a great website (www.diffen.com/difference/Special:AshExtension?diffenVal1=Hardware&diffenVal2=Software) that thoroughly describes the differences between hardware and software. A computer can be broken down into two components: hardware and software. Basically, the website discusses each in terms of definition, interdependence, examples, types, and functions.

Software is a general term used to describe a collection of computer programs, procedures, and documentation that perform some task on a computer system. It is usually written in high-level programming languages. Some examples include Internet Explorer, Adobe Acrobat, Microsoft Office, etc.


Hardware is a physical device, something that you’re able to touch and see. The computer monitor, mouse, keyboard, CD-ROM and printer are all examples of computer hardware. Without any hardware the computer would not exist and software would have nothing to run on.

Hardware and software are interdependent. In other words, hardware cannot function until software is loaded and software is installed in hardware to allow the programs to run. The hardware of a computer is changed infrequently compared to software and data, which are frequently created, modified, or erased.

Computer and Computer Language

Simply put, computers are electronic devices that take in input, calculate and process that input, and then output information. However, what makes that different from a calculator? This limited definition can also be applied to a calculator, because calculators also take input, calculate, and give output. What makes a computer what it is may be the hardware (CPU, memory, hard drives) and the software (system software, Microsoft Office) that it contains.

Computer language and programming language are often used interchangeably, so I am going to assume that the premise of the question is to define programming language. According to Wikipedia, a computer language "is an artificial language designed to express computations that can be performed by a machine, particularly a computer. Programming languages can be used to create programs that control the behavior of a machine, to express algorithms precisely, or as a mode of human communication." Each language has its own syntax and is structured differently. A lot of programming languages share some of the same concepts, such as control structures, loops, subroutines, arrays, hashes, regular expressions, classes and much more. There are many languages, such as C, C++, Java, Perl, HTML, Visual Basic, etc.

In order to get a visual idea of what a program looks like, I've included this simple C++ program that I made for one of my C.S. classes. This program will calculate the total home cost based on the square footage of a house, optional features, a discount, and tax.

// File name: lab2fall08.cpp
// Author: Gabriel Acosta
// Date: September 21, 2008
// Course: CS1600
#include <iostream>
using std::fixed; //ensures that decimal point is displayed
#include <iomanip>
using std::setprecision; //sets numeric output precision
int main()
{
//variable declarations
double houseSquareFootage;
int optionalFeatures;
int discount;
const double STATE_TAX_RATE = .075;
const double PERCENT_PROFIT = .25;
const double COST_PER_SQUARE_FT = 66.67;

//Ask user to input the house square footage
std::cout<< "Input the House Square Footage:" < std::cin >> houseSquareFootage;
double basicHomeCost = houseSquareFootage * COST_PER_SQUARE_FT; //Calculates Basic Home Cost

//Ask user to input the Optional Features Cost
std::cout<< "Input Optional Features Cost:" < std::cin >> optionalFeatures;
double profit = (basicHomeCost + optionalFeatures) * PERCENT_PROFIT; //Calculates Profit

//Ask user to input any discount
std::cout<< "Input any discount:" < std::cin >> discount;
double netHomeCost = basicHomeCost + optionalFeatures + profit - discount; //Calculates Net Home Cost

//Give user Total Home cost
double taxes = netHomeCost * STATE_TAX_RATE; //calculates taxes
double totalHomeCost = netHomeCost + taxes; //calculates Total Home Cost
std::cout << "The total cost is $"
<< setprecision(2) << fixed << totalHomeCost << std::endl;
return 0;

}//end of the program

In addition you can go to this website and actually use another program that I created using PERL: http://storm.cis.fordham.edu/~acosta/cgi-bin/cgitable.pl
This program is a simple version of how businesses conduct e-commerce. This program will collect your information and if you write your correct email address, it will then send you a confirmation page to that email address with all the information you included. It pretty much acts as if you were getting a receipt if you bought a product online.

Programming languages are evolving and getting increasingly sophisticated; no wonder IT jobs are in high demand.

Thursday, September 24, 2009

Computer and Herman Hollerith

Since we are all assigned the term computer, instead of simply looking it up in a dictionary or looking for it on Google, I'm simply going to define it in my own terms. To me, the word computer first brings to mind the device I'm doing this very assignment on. It includes a keyboard, screen, and mouse, and has numerous capabilities. It allows its user to type documents, search for information via an internet connection, listen to music, and many more things.
Computers haven't always been as advanced as the very ones we use today. They started off as huge boxes used mainly for word processing. I'm sure some scholars would link the first computer to some obscure device in ancient times, like the abacus. Even in our lifetimes we have seen computers come a long way, from taking typing class in middle school on very old Macintosh computers to today's paper-thin laptops.
Computers are an important part of our daily lives, not only as college students, but as individuals in a modern society. To be without one of your own isn't the end of the world, but things certainly become more difficult.

As a second part of the assignment I researched Herman Hollerith. He is regarded as the "father of modern automatic computation" and founded the company that eventually became IBM. He created a system to encode data on cards, which were used for the US Census in 1890. Using a series of punched holes, data were recorded and then read by passing the cards over electrical contacts. This system allowed the information to be processed more quickly and accurately using his new technology.
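As a very loose sketch (my own simplification, not Hollerith's actual card format), the C++ snippet below treats each card as a row of true/false "holes" and counts how many cards are punched in a given column, much as Hollerith's tabulator advanced a counter each time an electrical contact closed through a hole.

// A toy punched-card tabulation: count the cards punched in one column.
#include <iostream>

int main()
{
    // Three "cards" with four hole positions each (made-up data).
    bool cards[3][4] = {
        { true,  false, true,  false },
        { true,  true,  false, false },
        { false, false, true,  true  }
    };

    int column = 0;   // which hole position to tabulate
    int count = 0;
    for (int i = 0; i < 3; ++i)
        if (cards[i][column])   // a punched hole lets the "contact" close
            ++count;

    std::cout << "Cards punched in column " << column << ": " << count << std::endl;   // prints 2
    return 0;
}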

Questions for Week 3

I would like everyone to post a definition of the term computer this week, and please do not post one that has already been posted.

Other key terms to define:


computing
computer program
algorithm
computer language
software and hardware
integrated circuit
microprocessor
mainframe computer
minicomputer
microcomputer
personal computer
calculator
GUI
operating system
architecture

and/or identify the following individuals:

Joseph Marie Jacquard
Herman Hollerith
Alan Turing
John von Neumann
Douglas Engelbart
Alan Kay

 



Information Theory and Communication

Take a look at this post that I put up over on my blog this past February, and especially the film:






Tuesday, September 22, 2009

Frankenstein's Foreshadowing

Frankenstein is a revolutionary novel written by Mary Shelley, a radical, intellectual member of the English elite, and it has been a part of our culture ever since. Published in 1818, it seemed to be a warning to man against technological expansion during the Industrial Revolution. In the book the monster, created by Dr. Frankenstein, is in search of revenge for his unfulfilled life and lack of "soul." This illustrates the horror of technology taking control over mankind... having one angry synthetic being running around the world trying to kill his creator is more than enough to worry about, let alone imagining that all technology could think for itself, form a coalition, and systematically take over humanity and the world.

Lewis Mumford expressed this concern during the mid-20th century, saying that the automobile was equivalent to Frankenstein's monster. He believed that the only piece of technology more destructive at the time was the hydrogen bomb, yet automobiles were even more dangerous because the general public had allowed themselves to indulge in, even embrace, such technology.

Similarly, Harold Innis was equally concerned with the enormous advances in technology during the 20th century. He correctly worried that, just as Frankenstein's monster turned on and overpowered his creator, technology was going to do the same to humans.

With this in mind we can ask ourselves... has technology taken over at least some aspects of our lives? For example, many people who live outside the city would not be able to get around without their cars... most could never imagine walking from point A to point B. Also, so many of us are addicted to our iPhones and Blackberries, constantly texting, facebooking, and/or tweeting. We have become used to the instant gratification of finding out information, talking to people, and doing almost everything in a matter of seconds on our phones and computers... it makes you wonder how we ever survived using dial-up... and you don't even want to think about how miserable life must have been before cell phones and the internet existed! Even simply doing homework for this class requires a computer, an internet connection, and a blog update. So maybe technology is slowly taking over our lives? Can we stop it? If we could... would we even want to?

significance of the electric light bulb

Invented in 1879 by Thomas Alva Edison, the electric light bulb was a huge advancement in technology. The most important thing that the light bulb does is allow people to be productive at night. This of course is a huge advantage for businesses, and mankind in general. Simply put, the light bulb allows for more work to be accomplished. Electric light itself was first made in 1800 by Humphry Davy. This starting point allowed Edison and others to perfect the science, eventually leading to the light bulb. Our electrified world had to start somewhere.

Alternating vs. Direct Currents

In alternating current (AC), the direction of the flow of electricity periodically changes. This differs from direct current (DC), in which the flow of electricity moves steadily in one direction. Direct current comes primarily from energy sources such as batteries, while alternating current is what comes into our homes and businesses. The usual waveform of an alternating current power circuit is a sine wave. The back-and-forth motion occurs 50 or 60 times per second, depending on the electrical system of the country.
[Diagram of various types of current]

Direct current is typically found in low-voltage applications, while alternating current is used for higher-voltage applications.

Direct Current is used in batteries, solar power, and most automotive applications.

Alternating Current is what is produced by power plants, and is used in our homes to power devices such as television, lights and computers.
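Since the sine waveform is the defining feature of AC, here is a small C++ sketch (the 170-volt peak is an assumed value, roughly the peak of a 120 V household supply) that computes the instantaneous voltage v(t) = Vpeak x sin(2 x pi x f x t) for a 60 Hz supply across one cycle.

// Sample the instantaneous voltage of a 60 Hz AC sine wave over one cycle.
#include <cmath>
#include <iostream>

int main()
{
    const double PI = 3.14159265358979;
    const double frequency = 60.0;     // cycles per second (Hz)
    const double peakVoltage = 170.0;  // assumed: roughly the peak of a 120 V RMS supply

    for (int i = 0; i <= 8; ++i)       // eight samples across one full cycle
    {
        double t = i / (8.0 * frequency);
        double v = peakVoltage * std::sin(2.0 * PI * frequency * t);
        std::cout << "t = " << t << " s, v = " << v << " V" << std::endl;
    }
    return 0;
}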








Monday, September 21, 2009

The Difference Between Electric and Electronic

The word "electric" is used for components that will function as a way for the flow of electricity to pass through. The use of electric identifies a source of power that serves to create an effect when conducted through a device. 

"Electronic" is a term that describes devices that are powered by electricity. An electronic device is often constructed by the use of electric components that make it possible for the flow of electricity to pass through. Televisions are examples of an "electronic" device because it is constructed with electric components that allow electricity to flow through. 
Light bulbs on the other hand, can be both electric and electronic because it can both receive the flow of electricity and is also the origin of the completed function of the electronic aspect of the device. 

In association with our New Media class, we sit in front of Mac computers during class. They are considered electronic because they operate by controlling the flow of electricity. Most of the devices used in media today are considered electronic devices, such as cell phones, laptops, and iPods, because they operate with the use of an electric component, a battery.

The significance of the invention of the telephone

According to Wikipedia, the modern telephone is the culmination of work done by many individuals, all worthy of recognition for their contributions to the field. Alexander Graham Bell was the first to patent the telephone, an "apparatus for transmitting vocal or other sounds telegraphically", after experimenting with many primitive sound transmitters and receivers. However, the history of the invention of the telephone is a confusing collection of claims and counterclaims, made no less confusing by the many lawsuits which attempted to resolve the patent claims of several individuals.

Additional inventions such as the call bell, central telephone exchange, common battery, ring tone, amplification, trunk lines, wireless phones, etc. were made by various engineers who made the telephone the useful and widespread apparatus it is now.

Specific to our New Media course, I think the significance of the invention of the telephone is very important. The telephone not only changed how we communicated with each other, it also dramatically affected the speed at which we communicated. Consequently, people no longer had to wait long periods of time to decode messages sent via telegraph or Morse code, methods previously used. As a result of the invention of the telephone, information could spread in significantly less time.

Transistors and Vacuum Tubes

[Images: a transistor and a vacuum tube]

Vacuum tubes and transistors were two hugely important inventions in modern electronics. Without these two inventions, radios, telephones, televisions, computers, and other electronic devices would not exist.

According to Wikipedia, a vacuum tube is "a device used to amplify, switch, otherwise modify, or create an electrical signal by controlling the movement of electrons in a low-pressure space." While vacuum tubes have largely been replaced by newer technology, they were vital to the development of modern electronics.

A transistor is a "semiconductor device commonly used to amplify or switch electronic signals." It can switch the flow of electricity and can also amplify a signal, which is hugely important in modern technology.

What is the difference between electric and electronic?

Electric: having to do with the flow of electricity itself. "Electric" describes the overall workings of how instruments run rather than what they are considered, and it is the word used when talking about conducting power through a device. Example: if a fire starts from wires, it is an electric/electrical fire. Most importantly, anything electric normally runs through wires.

Electronic: describing the action of a device running on electricity. An example of this is a television, a device that runs on electricity but is considered electronic. A cell phone or iPod is also electronic, but operates on a different electric component. Most importantly, electronic products can run wirelessly.

Electric refers to the source of power, and electronic refers to the action of that device running on electricity. Though this is the case, we don't tend to think about either word when using them, as there is sometimes a grey area between the two words and their meanings/definitions. Think about a light bulb... a grey area appears here, as a light bulb is electric because it runs on electricity, but it is not electronic.

Timeline of Electricity and Telecommunications

600 BC- Thales discovered static electricity when feathers and other light objects were attracted to amber that had been rubbed with silk. The Greek word for amber is elektron.
1600- William Gilbert, a British scientist and physician to Queen Elizabeth I, coined the term electricity. He was the first person to describe the earth's magnetic field and to realize the relationship between magnetism and electricity.
1752- Benjamin Franklin flew a kite with a metal tip into a thunderstorm to prove that lightning is a form of electricity.
1700s- Friction machines for producing static electricity easily and reliably were developed. The later Wimshurst machine (1880s) works by rotating two parallel plates in opposite directions, producing a charge around the edges of the plates.
1827- Georg Ohm, a German professor, published his complete mathematical theory of electricity. The word ohm later became the name of the unit of electrical resistance.
1831- Michael Faraday demonstrated electromagnetic induction by passing a magnet through a coil of wire. A few years later, in 1837, Charles Wheatstone and William Fothergill Cooke patented the first commercial electric telegraph.
1838- Samuel Morse invented Morse Code, a system of dots and dashes to communicate, which later became standard throughout the world.
1870s- Thomas Edison built a direct current electric generator in America. He later provided electricity to parts of New York City.
1876- Alexander Graham Bell, inventor of the telephone, used electricity to transmit speech for the first time.
1878- Joseph Swan, a British scientist, demonstrated the first electric light with a carbon filament lamp. A few months later, Thomas Edison made the same discovery in America.
1880s- Nikola Tesla developed an alternating current motor and a system of AC power generation. Edison saw Tesla's system as a threat to his DC supply and spread stories that it was not safe. It later gained acceptance.
1881- The first public electricity supply was generated in Godalming, Surrey using a waterwheel at a nearby mill.
1886- Heinrich Hertz produced and detected electric waves in the atmosphere.
1897- Guglielmo Marconi sends a radio message 20 miles away. Later he sends a message across the Atlantic.
1906 - Lee de Forest invents the triode vacuum tube (the Audion).
1910 - The first commercial radios are sold
1918 - Edwin Armstrong develops the superheterodyne receiving circuit.
1919 - Radio Corporation of America (RCA) is formed.
1920 - Westinghouse Radio Station KDKA is established.
1929- Commercial ship-to-shore telephone service opened.
1933 - Edwin Armstrong demonstrates frequency modulation (FM).
1934 - Congress passes Communications Act of 1934, Federal Communications Commission founded.
1935 - First telephone call around the world.
1947- Opening of commercial telephone service for passengers on certain trains running between New York and Washington, D.C.
1954 - The Regency TR-1, the first commercial transistor radio, goes on sale for $49.95.