While countless individuals have impacted the field of computing, few have had the revolutionary vision of Grace Murray Hopper. Beyond her brilliant technical mind, it was her understanding of business and marketing that set her apart. Hopper realized that the potential of computers went far beyond what anyone at the time imagined. She believed that computers could be tremendously useful to a much broader audience if only they were simpler to use and understand. Such forward-thinking ideas led Hopper to improve upon binary code, create the first compiler, and in doing so change the future of the computer (Borg).
Hopper's early work gave her the experience necessary not only to identify the computer's limitations at the time but also to
From Harvard, Hopper joined the Eckert-Mauchly Computer Corporation, where she worked on UNIVAC I, the first commercially successful electronic computer. It was here at Eckert-Mauchly, which would later be sold to Remington Rand, that Grace Hopper developed the first compiler, the A-0, or Arithmetic Language version 0 (Strawn). To make the computer more accessible, Hopper had the idea that computers should be controllable in a language people could learn to write and understand. To bring such an outlandish idea to fruition, Hopper designed the compiler. In general, compilers translate mathematical or English-like code that humans can learn and understand into numerical code that the computer can process (Laduke). The A-0 series of compilers was able to translate mathematical code into machine code; it allowed the user to request information from a certain stored location on the computer and to tell the computer what to do with the retrieved information (Borg). While the A-0 series allowed some people to control a computer in a manner much simpler than before, it still required an expert to operate. The mathematical code was still very complex, and extensive training in mathematics or computers would have been essential for
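To make the idea of a compiler concrete, here is a minimal, hypothetical sketch in C++ of the kind of translation described above: human-readable statements are mapped to purely numeric instructions. The statement syntax, opcode numbers, and storage addresses are invented for illustration; this is a toy, not Hopper's A-0 system.

```cpp
// Toy illustration of what a compiler does: translate human-readable
// statements into numeric codes a machine could execute.
// The verbs, opcode numbers, and addresses below are invented for this sketch.
#include <iostream>
#include <map>
#include <sstream>
#include <string>
#include <vector>

int main() {
    // A tiny "program" written in an English-like style.
    std::vector<std::string> source = {
        "LOAD price",
        "ADD tax",
        "STORE total"
    };

    // The "compiler" tables: verbs become opcodes, names become storage addresses.
    std::map<std::string, int> opcodes   = {{"LOAD", 1}, {"ADD", 2}, {"STORE", 3}};
    std::map<std::string, int> addresses = {{"price", 100}, {"tax", 101}, {"total", 102}};

    // Translation pass: each readable line becomes a pair of numbers.
    for (const std::string& line : source) {
        std::istringstream words(line);
        std::string verb, name;
        words >> verb >> name;
        std::cout << line << "  ->  " << opcodes[verb] << " " << addresses[name] << "\n";
    }
    return 0;
}
```

Even this tiny sketch shows the central point of the paragraph: the person writes words, and the translator produces the numbers the machine actually runs.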
Languages for scientific applications (Fortran), computer science (Algol), artificial intelligence (Lisp), and business (FLOW-MATIC and COMTRAN) were created (Strawn). However, this growth was unregulated, and while the surge in new programming languages was an exciting advancement, the development of numerous languages for the same basic use meant that reprogramming could be as costly as the initial programming and that applications had to be entirely rewritten whenever a new computer was purchased. Such complications indicated the need for standardized languages. Despite the varied field of programming, Hopper's main focus remained on the use of computers for businesses. Therefore, in 1959 Hopper became a technical consultant on a committee of industry and government personnel whose purpose was to develop a common business-oriented language for computer programming, later to be known as COBOL. Using FLOW-MATIC as its basic foundation, the first COBOL standard was issued by the American National Standards Institute, and it became widely adopted. As of 1997, approximately 200 billion lines of COBOL code existed and ran 80 percent of all business programs. Its impressive success can be attributed to its high degree of standardization (Strawn). For this to occur, Hopper had to convince businesses to adopt this one
Mathematician Katherine Coleman Johnson was born on August 26, 1918 in White Sulphur Springs, West Virginia to Joylette and Joshua Coleman. Her father was a lumberman, farmer, and handyman who also worked at the Greenbrier Hotel. Her mother was a former teacher. Ms. Johnson's nickname was "the human computer." At a very early age, Ms. Johnson showed a talent for math, and she was eager to go to school. Her great interest was counting. She loved to count; it did not matter what it was. She counted the steps to get to church, and she counted the dishes and silverware she washed. Anything that could be counted, she counted. Ms. Katherine was known as the girl who loved to count. Her hobbies were reading books about math, numbers, and NASA. If it had something
Admiral Grace Murray Hopper is known as one of the first female computer scientists and the mother of COBOL programming. Hopper was born on December 9, 1906 in New York City and was the oldest of three children. Even as a child she loved playing with gadgets, disassembling items such as alarm clocks to determine how they worked (Norman). Hopper's parents and siblings had a huge impact on her life. Her father, a successful insurance broker, inspired Hopper to pursue higher education and not limit herself to the typical feminine roles of that time (Norman). Hopper excelled in school, graduating from Vassar College in 1928 with a BA in mathematics and physics (Rajaraman 2). She later went on to receive her MA in mathematics from Yale University in 1930 and her PhD in 1943 (Rajaraman 2).
Her accomplishments were so significant that she earned a new title, one that motivates many individuals to pursue their aspirations. Her influence on present-day technology and society was enormous (Capstone, n.d.), to the extent that a computer programming language, Ada, was named after her (Encyclopaedia Britannica, n.d.). She remains a symbol of the past, as without her foundational algorithm for Babbage's machine, our capacity to program the cutting-edge technology we currently enjoy
The Ada language is the result of the most extensive and most expensive language design effort ever undertaken. The United States Department of Defense (DoD) was concerned in the 1970s by the number of different programming languages being used for its projects, some of which were proprietary and/or obsolete. Up until 1974, half of the applications at the DoD were embedded systems. An embedded system is one where the computer hardware is embedded in the device it controls. More than 450 programming languages were used to implement different DoD projects, and none of them were standardized. As a result, software was rarely reused. For these reasons, the Army, Navy, and Air Force proposed to develop a high-level language for embedded systems (The Ada Programming Language). In 1975 the Higher Order Language Working Group (HOLWG) was formed with the intent of reducing this number by finding or creating a programming language generally suitable for the department's requirements.
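As a rough illustration of what "hardware embedded in the device it controls" means in practice, here is a minimal, hypothetical sketch in C++ (kept in the same language as the other examples here, rather than Ada) of a dedicated control loop, which is the typical shape of an embedded program. The thermostat scenario, function names, and thresholds are invented for this example and stand in for real hardware I/O.

```cpp
// Sketch of an embedded-style control loop: the computer does one job,
// forever, for the device it lives inside (here, an imaginary thermostat).
// readTemperature() and setHeater() are simulated stand-ins for hardware access.
#include <chrono>
#include <iostream>
#include <thread>

double readTemperature() {
    // A real embedded program would read a hardware sensor here;
    // this sketch just returns a slowly rising value.
    static double temperature = 17.0;
    temperature += 0.5;
    return temperature;
}

void setHeater(bool on) {
    // A real system would toggle an output pin; we just report the decision.
    std::cout << "heater " << (on ? "on" : "off") << "\n";
}

int main() {
    const double target = 20.0;  // desired temperature, degrees Celsius
    for (int cycle = 0; cycle < 10; ++cycle) {
        double current = readTemperature();
        setHeater(current < target);  // simple on/off control decision
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
    }
    return 0;
}
```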
Computers are a magnificent feat of technology. They have grown from simple calculators to machines with many functions and abilities. Computers have become so common that almost every home has at least one, and schools find them a good source of information and education for their students (Hafner, Katie, unknown). Computers have created new careers, eliminated others, and left a huge impact on our society. The invention of the computer has greatly affected the arts, the business world, society, and history in many different areas, but to understand how great these changes are, it is necessary to take a look at the origins of the computer.
For years, C++ (C Plus Plus) has dominated the business marketplace for many different companies and has allowed many computer programmers to gain vast amounts of knowledge and experience. Its precursor, C, was first developed in 1972 by Dennis Ritchie of AT&T Bell Laboratories (Lambert / Nance Page 16). C++ has been in use for decades and has made a great impact on the development of software for businesses across the world. It has become a second-nature programming language to those who use it and who, in many cases, have had little choice but to stay with C++.
If the nineteenth century was the era of the Industrial Revolution in Europe, I would say that computers and Information Technology have dominated from the twentieth century onward. The world today would be a void without computers; be it healthcare, commerce, or any other field, no industry can thrive without Information Technology and Computer Science. This ever-growing field of technology has interested me since my childhood. After my twelfth grade, the inherent ardor I held for Computer Science motivated me to pursue a bachelor's degree in Information Technology. Programming and Math, paragons of logic and reasoning, have always been my favorite subjects.
...ere are gears used to select which numbers you want. Though Charles Babbage will always be credited with making the first "true" computer, and Bill Gates with popularizing it, Blaise Pascal will always have a place among the first true innovators of the computer. There is even an early programming language, Pascal (and later Object Pascal), named in his honor.
Computer engineering started about 5,000 years ago in China with the invention of the abacus. The abacus is a manual calculator in which you move beads back and forth on rods to add or subtract. Other inventors of simple computers include Blaise Pascal, who came up with the arithmetic machine for his father's work. Charles Babbage also produced the Analytical Engine, which combined math calculations from one problem and applied them to solve other complex problems. The Analytical Engine is similar to today's computers.
Many different types of programming languages are used to write programs for computers. The languages are commonly referred to as "code." Some of the languages include C++, Visual Basic, Java, XML, Perl, HTML, and COBOL. Each language differs from the others, and each is used for specific programming jobs. HTML and Java are languages used to build web pages for the Internet. Perl and XML can be used to produce code that blocks students from reaching certain inappropriate web pages on their school server. One of the most prominent programming languages of the day would have to be C++.
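For readers who have never seen source code, here is a complete, minimal C++ program, included purely as an illustration of what a program written in one of these languages looks like; the printed message is arbitrary and chosen for this example.

```cpp
// A complete, minimal C++ program: the human-readable statements below
// are what a compiler turns into a runnable program.
#include <iostream>

int main() {
    std::cout << "Programs are written in languages such as C++." << std::endl;
    return 0;
}
```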
On May 28, 1959, the Conference on Data Systems Languages (CODASYL) met for the first time with the idea of developing a universal language for building business applications. That language was COBOL. By 1960, COBOL was commercially ready, and for the next 20 years, more programs were written in COBOL than in any other language. Influenced by FORTRAN, a programming language for the scientific community, and by FLOW-MATIC, the group recognized the growing needs of the business community. They thought that if the scientific programmers were going to get a single language, they could do the same for business. In April 1959, at an informal meeting at the University of Pennsylvania in Philadelphia, a small group of computer manufacturers, large users, and academics asked the Department of Defense (DOD) to head the effort (The Creation of COBOL, Brandel). The next month, the DOD called the first meeting of CODASYL, which consisted of eight computer manufacturers and a few large users. The DOD broke CODASYL into several committees, and by June, the nine-member "short-range committee" was asked to undertake a six-month investigation into developing the language. The DOD made COBOL mandatory for all suppliers of computing hardware and software who were bidding on defense procurements (Encyclopedia of Computer Science, page 350). This pressure persuaded other suppliers to adopt COBOL as well, and thus the programming language took off.
The Pascal programming language was designed in 1968 and published in 1970. It is a small and efficient language intended to encourage good programming practices through structured programming and data structuring. Pascal was developed by Niklaus Wirth and named in honor of the French mathematician and philosopher Blaise Pascal. In 1641, Pascal created the first arithmetical machine; some say it was the first computer, and Pascal improved the instrument eight years later. In 1650, Pascal left geometry and physics and turned his focus toward religious studies. A generation of students used Pascal as an introductory language in undergraduate courses, and variants of Pascal have also frequently been used for everything from research projects to PC games. Niklaus Wirth reports that a first attempt to implement the language in Fortran in 1969 was unsuccessful because of Fortran's lack of complex data structures. The second attempt was written in the Pascal language itself and was operational by mid-1970. Pascal, in its original form, is a procedural language and includes the traditional ALGOL-like control structures with reserved words such as IF, THEN, ELSE, WHILE, FOR, and so on. However, Pascal also has many data structuring facilities and other ideas which were not included in the original ALGOL, like type definitions, records, pointers, enumerations, and sets. The earliest computers were programmed in machine code. This type of programming is time consuming and error prone, as well as very difficult to change and understand, so more advanced languages were developed to resolve these problems. High-level languages include a set of instruction...
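To illustrate the structured-programming ideas described above, here is a short sketch written in C++ rather than Pascal (to keep the examples in this collection in one language): if/else, while, and for control structures together with enumeration- and record-style data types. The Day enumeration, Meeting record, and sample schedule are invented for this example; they correspond loosely to Pascal's enumerations and records.

```cpp
// Structured control flow plus record- and enumeration-style data types,
// the kinds of features Pascal helped popularize, sketched here in C++.
#include <iostream>
#include <string>
#include <vector>

enum class Day { Monday, Tuesday, Wednesday, Thursday, Friday };  // like a Pascal enumeration

struct Meeting {          // like a Pascal record
    std::string topic;
    Day day;
    int hour;             // 24-hour clock
};

int main() {
    std::vector<Meeting> schedule = {
        {"Compiler design", Day::Monday, 9},
        {"Data structures", Day::Wednesday, 14},
    };

    // Traditional structured control flow: a for loop with an if/else inside it.
    for (const Meeting& m : schedule) {
        if (m.hour < 12) {
            std::cout << m.topic << " meets in the morning\n";
        } else {
            std::cout << m.topic << " meets in the afternoon\n";
        }
    }

    // A while loop counting down, in place of unstructured jumps.
    int countdown = 3;
    while (countdown > 0) {
        std::cout << "countdown: " << countdown << "\n";
        --countdown;
    }
    return 0;
}
```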
Ada Lovelace was the daughter of the famous poet Lord George Gordon Byron and of Anne Isabella Milbanke, a mathematician known as "the princess of parallelograms." A few weeks after Ada Lovelace was born, her parents split; her father left England and never returned. Women received an education inferior to that of men, but Milbanke was more than able to give her daughter a superior education, one focused on mathematics and science (Bellis). When Ada was 17, she was introduced to Mary Somerville, a Scottish astronomer and mathematician, at whose party she heard of Charles Babbage's idea for the Analytical Engine, a new calculating engine (Toole). Charles Babbage, known as the father of the computer, invented several calculating machines. Babbage became a mentor to Ada and helped her study advanced math, along with Augustus de Morgan, who was a professor at the University of London (Ada Lovelace Biography Mathematician, Computer Programmer (1815–1852)). Charles Babbage presented his developments on a new engine at a seminar in Turin, and in 1842 Menabrea, an Italian, wrote a summary article of Babbage's developments and published the article i...
The history of the computer dates back all the way to ancient times. The first step towards the development of the computer, the abacus, was developed in Babylonia in 500 B.C. and functioned as a simple counting tool. It was not until thousands of years later that the first calculator was produced. In 1623, the first mechanical calculator was invented by Wilhelm Schickard; the "Calculating Clock," as it was often called, "performed its operations by wheels, which worked similar to a car's odometer" (Evolution, 1). Still, there had not yet been anything invented that could even be characterized as a computer. Finally, in 1625 the slide rule was created, becoming "the first analog computer of the modern ages" (Evolution, 1). One of the biggest breakthroughs came from Blaise Pascal in 1642, who invented a mechanical calculator whose main function was adding and subtracting numbers. Years later, Gottfried Leibniz improved Pascal's model by allowing it to also perform such operations as multiplying, dividing, and taking square roots.
The first computer, known as the abacus, was made of wood and parallel wires on which beads were strung. Arithmetic operations were performed by moving the beads along the wires according to "programming" rules that had to be memorized by the user (Soma, 14). The second earliest computer, invented by Blaise Pascal in 1642, was a "digital calculating machine." Pascal designed this first known digital computer to help his father, who was a tax collector. Pascal's computer could only add numbers, and they had to be entered by turning dials (Soma, 32). It required a manual process like its ancestor, the abacus. Automation was introduced in the early 1800s by a mathematics professor named Charles Babbage. He created an automatic calculation machine that was steam powered and stored up to 1,000 50-digit numbers. Unlike its two earliest ancestors, Babbage's invention was able to perform various operations. It relied on cards with holes punched in them, called "punch cards," which carried out the programming and storing operations for the machine. Unfortunately, Babbage's creation failed due to a lack of mechanical precision and a lack of demand for the product (Soma, 46); the technology of the day was simply not adequate to make the machine operate efficiently. Computer interest dwindled for many years, and it wasn't until the mid-1800s that people became interested in computers once again.