What is the C programming language? When and why was it invented? What does the technology do? What impact does it have on our society?
History of the C programming language and its influence on our society
University of Adelaide
April 10, 2014
COMP SCI 1101
Donghyeon Yoon
The rapid evolution of technology has brought great convenience to people around the world, and students today are routinely exposed to computing. As modern technologies such as computers, smartphones, and tablets become part of everyday life, writing computer programs to solve problems is becoming a basic skill for all students. A foundation of much of this computing technology is the C programming language. This paper will explore the history of the C programming language and analyse its influence on modern technology.
C is a popular and widely used language for creating computer programs. Cory (2012) defines C as ‘a widely used programming language around the world by programmers, which provides maximum control and efficiency and is ideal for developing firmware or portable applications’.
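As a minimal illustration of what a C program looks like, the classic ‘hello, world’ example is sketched below (my own illustration, not taken from the cited sources):

    #include <stdio.h>

    /* The traditional first C program: print a greeting and exit. */
    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }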
The history of C is not long compared with its tremendous influence on modern technology. Any discussion of the C programming language must mention Dennis M. Ritchie. Dennis MacAlistair Ritchie was an American computer scientist who helped shape the digital era. He created the C programming language and, with his colleague Ken Thompson, the Unix operating system. (Geoff, 2011)
In the late 1960s, after earning a degree in physics and applied mathematics from Harvard, Ritchie started his career at Bell Labs (AT&T), which was one of the centers of technology development...
... middle of paper ...
...r a large computer. And with its long-term maintenance, even today C plays a crucial part in the IT industry. (Noll and Landon, 1996) C created an exceptionally diverse and broad ecosystem of technology. GPS devices, security systems, satellites, traffic lights, Internet routers, digital cameras, microcontrollers, televisions, computers, smartphones: literally every technology we are using at the moment is based on the C programming language. It also led computer scientists and IT professionals to the in-depth study of science, technology, engineering and mathematics. (Gustavo, 2009) "C is not a 'very high level' language, nor a 'big' one, and is not specialized to any particular area of application. But its absence of restrictions and its generality make it more convenient and effective for many tasks than supposedly more powerful languages." (Kernighan and Ritchie, 1988)
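To make the quoted point concrete, the short sketch below (my own illustration, not drawn from the essay's sources) shows the kind of low-level control that makes C suitable for firmware: setting and testing a single bit in a status word, much as a device driver manipulates a hardware register. The flag name and bit position are hypothetical.

    #include <stdint.h>
    #include <stdio.h>

    #define READY_FLAG (1u << 3)   /* hypothetical bit mask for a device 'ready' flag */

    int main(void)
    {
        uint8_t status = 0;

        status |= READY_FLAG;   /* set the flag, as firmware would in a device register */
        printf("ready: %d\n", (status & READY_FLAG) != 0);   /* test the flag: prints 1 */
        return 0;
    }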
Grace Murray Hopper, born December 9, 1906, was a mathematics professor who enlisted in the United States Navy at the start of World War II. During her service, Hopper developed several new programming languages, including COBOL, which is still one of the most widely used programming languages today. Hopper was also one of the first to popularize the term “computer bug”. Over the course of her life, Grace Hopper influenced many people through her military service and led a movement in modern electronics through her work.
After graduating from MIT, he went straight to work at Bell Laboratories. He did most of his research in solid-state physics and vacuum tubes. His theoretical advances helped the company achieve its goal of using electronic switches for telephone exchanges instead of the mechanical switches they were using at the time. His other research covered energy bands in solids, order and disorder in alloys, self-diffusion of copper, experiments on photoelectrons in silver chloride, experiment and theory on ferromagnetic domains, and various topics in transistor physics. He also did operations research on individual productivity and the statistics of salaries in research laboratories.
Dennis Ritchie was born on September 9, 1941, in Bronxville, N.Y., to Jean and Alistair Ritchie. His father, Alistair, was an engineer at Bell Labs, while his mother was a homemaker. Dennis and his family moved to Summit, N.J., when he was a child, and he attended high school there. He continued his schooling at Harvard University, where he studied physics as an undergraduate and applied mathematics as a graduate student, earning his bachelor’s degree and going on to doctoral work. Since Ritchie liked procedural languages better than functional ones, Dennis decided hi...
The subject of this term paper is computers in the 1950s. The topics that will be covered are: the types of computers there were, the memory capacity of computers, the programming languages of that time, and the uses of computers at that time. Information will be gathered from the Internet, books, magazines, and encyclopedias.
He applied to the doctoral program in physics at MIT and was accepted. In the summer of 1953, Bell Labs and then IBM offered him research positions. He turned down both offers to build transistors for Philco, a manufacturer of radios and
The programming language C++ can be used in many ways. It has exploded into the gaming community, giving PC game programmers access to a stable yet powerful programming language that uses as little code as possible. It has also been used in other commercial software, such as word processors, audio players, screen savers, and other desktop tools.
If the nineteenth century was the era of the Industrial Revolution in Europe, I would say that computers and information technology have dominated since the twentieth century. The world today would be a void without computers; be it healthcare, commerce, or any other field, industry cannot thrive without information technology and computer science. This ever-growing field of technology has interested me since my childhood. After my twelfth grade, the ardor I held for computer science motivated me to pursue a bachelor’s degree in information technology. Programming and mathematics, paragons of logic and reasoning, have always been my favorite subjects since childhood.
Many different types of programming languages are used to write programs for computers. These languages are often called "code". Some of them include C++, Visual Basic, Java, XML, Perl, HTML, and COBOL. Each language differs from the others, and each is used for specific programming jobs. HTML and Java are used to build web pages for the Internet. Perl and XML can be used to produce code that blocks students from reaching inappropriate web pages on their school server. One of the most prominent programming languages of the day would have to be C++.
The field of computer science is based primarily on computer programming. Programming is the writing of computer programs using letters and numbers to make "code". The average computer programmer will write at least a million lines of code in his or her lifetime. But even more important than writing code, a good programmer must be able to solve problems and think logically.
We have the microprocessor to thank for all of our consumer electronic devices, because without it, our devices would be much larger. Microprocessors are the feat of generations of research and development. The microprocessor was introduced in 1971 by Intel Corporation and has made it possible for computers to shrink to the sizes we know today. Before that, a computer filled a room because its transistors or vacuum tubes were individual components. The microprocessor unified the technology on one chip while reducing costs. Microprocessor technology has been the most important revolution in the computer industry in the past forty years, as microprocessors have allowed our consumer electronics to exist.
Computer science is the study of computing theory, experimentation, design, and engineering. Studying computer science has always been my dream. From the first time I sat in front of a computer, I have wanted to know how computers work and how software makes hardware do what it does. I did research and discovered a lot about computer science. With what I found about computers and the opportunities the field holds, I decided to study it and hopefully make a career or a business out of it.
In 500 B.C. the abacus was first used by the Babylonians as an aid to simple arithmetic. In 1623 Wilhelm Schickard (1592 - 1635) invented a "Calculating Clock". This mechanical machine could add and subtract numbers of up to six digits, and warned of an overflow by ringing a bell. J. H. Mueller came up with the idea of the "difference engine" in 1786; this calculator could tabulate the values of a polynomial. Mueller's attempt to raise funds failed, and the project was forgotten. Scheutz and his son Edvard produced a third-order difference engine with a printer in 1843, and their government agreed to fund their next project.
Computer programming can also be defined as the process that leads from the original formulation of a computing problem to an executable program. Computer programming is often referred to simply as programming. It encompasses activities such as understanding and analysing the problem, which results in an algorithm; verifying that the algorithm meets the requirements; and coding the algorithm in a target programming language. The process also involves implementing the build system and managing derived artifacts such as the machine code of computer programs. Most often, the algorithm is represented in a human-readable language such as Java, Python, or Smalltalk, among others.
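As a small illustration of that final step, coding an algorithm in a target programming language (here C, the language discussed in the main essay; the example is my own sketch, not drawn from any cited source), the following program implements Euclid's algorithm for the greatest common divisor:

    #include <stdio.h>

    /* Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
       until the remainder is zero; the last non-zero value is the GCD. */
    static unsigned gcd(unsigned a, unsigned b)
    {
        while (b != 0) {
            unsigned r = a % b;
            a = b;
            b = r;
        }
        return a;
    }

    int main(void)
    {
        printf("gcd(48, 36) = %u\n", gcd(48, 36));   /* prints gcd(48, 36) = 12 */
        return 0;
    }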
The history of the computer dates back to ancient times. The first step towards the development of the computer, the abacus, was developed in Babylonia in 500 B.C. and functioned as a simple counting tool. It was not until thousands of years later that the first calculator was produced. In 1623, the first mechanical calculator was invented by Wilhelm Schickard; the “Calculating Clock,” as it was often called, “performed its operations by wheels, which worked similar to a car’s odometer” (Evolution, 1). Still, nothing had yet been invented that could even be characterized as a computer. Finally, in 1625 the slide rule was created, becoming “the first analog computer of the modern ages” (Evolution, 1). One of the biggest breakthroughs came from Blaise Pascal in 1642, who invented a mechanical calculator whose main function was adding and subtracting numbers. Years later, Gottfried Leibniz improved Pascal’s model by allowing it to also perform operations such as multiplying, dividing, and taking square roots.
Software engineering was proposed as a term at a NATO conference in 1968 convened to discuss the software crisis. “Software crisis” was the name given to the problems encountered in the development of large and complex systems. In the early 1970s, notions of structured programming started to emerge. In the late 1970s, early