History of the Computer
The first devices that resemble modern computers date to the mid-20th century (around 1940 - 1945), although the computer concept and various machines similar to computers existed earlier. Early electronic computers were the size of a large room, consuming as much power as several hundred modern personal computers.[1] Modern computers are based on tiny integrated circuits and are millions to billions of times more capable while occupying a fraction of the space.[2] Today, simple computers may be made small enough to fit into a wristwatch and be powered from a watch battery. Personal computers in various forms are icons of the Information Age and are what most people think of as "a computer"; however, the most common form of computer in use today is the embedded computer. Embedded computers are small, simple devices that are used to control other devices — for example, they may be found in machines ranging from fighter aircraft to industrial robots, digital cameras, and children's toys.
The ability to store and execute lists of instructions called programs makes computers extremely versatile and distinguishes them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, computers with capability and complexity ranging from that of a personal digital assistant to a supercomputer are all able to perform the same computational tasks given enough time and storage capacity.
Computers have been used to coordinate information in multiple locations since the 1950s. The U.S. military's SAGE system was the first large-scale exa...
...ings, and wherever possible should be designed to "fail secure" rather than "fail insecure" (see fail safe for the equivalent in safety engineering). Ideally, a secure system should require a deliberate, conscious, knowledgeable and free decision on the part of legitimate authorities in order to make it insecure.
In addition, security should not be an all-or-nothing proposition. The designers and operators of systems should assume that security breaches are inevitable. Full audit trails of system activity should be kept, so that when a security breach occurs, the mechanism and extent of the breach can be determined. Storing audit trails remotely, where they can only be appended to, keeps intruders from covering their tracks. Finally, full disclosure helps ensure that when bugs are found, the "window of vulnerability" is kept as short as possible.
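The append-only audit trail described above can be sketched in a few lines. This is a minimal illustration, not a production design: the function names, file path, and record fields are assumptions, and a real deployment would ship records to a remote, append-only store rather than a local file.

```python
import json
import os
import time

def append_audit_record(log_path, actor, action, detail):
    """Append one audit record as a JSON line.

    Opening the file in mode 'a' means existing records are never
    rewritten; on a remote append-only store, an intruder on the
    compromised host could not truncate the trail to hide tracks.
    """
    record = {
        "ts": time.time(),   # when the event occurred
        "actor": actor,      # who performed the action
        "action": action,    # what was done
        "detail": detail,    # free-form context
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def read_audit_trail(log_path):
    """Return every record, oldest first, for post-breach analysis."""
    if not os.path.exists(log_path):
        return []
    with open(log_path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

if __name__ == "__main__":
    path = "audit.log"  # hypothetical local path for the sketch
    append_audit_record(path, "alice", "login", "console")
    append_audit_record(path, "alice", "read", "payroll.db")
    for rec in read_audit_trail(path):
        print(rec["actor"], rec["action"])
```

Because every write is an append of a complete line, the trail preserves the order of events, which is exactly what an investigator needs to reconstruct the mechanism and extent of a breach.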
Security helps an organization meet its business objectives or mission by protecting its physical and financial resources, reputation, legal position, employees, and other tangible and intangible assets through the selection and application of appropriate safeguards. Businesses should establish the roles and responsibilities of all personnel and staff members, and a Chief Information Officer should be appointed to direct the organization's day-to-day management of information assets. Supporting roles are performed by service providers, including systems operations personnel who design and operate the computer systems. Each team member must be held accountable for following all rules and policies, and for understanding their own roles, responsibilities, and functions. Organizations' information processing systems are vulnerable to many threats that can inflict various types of damage and result in significant losses (Harris, 2014). Losses can come from trusted employees who defraud the system, from outside hackers, or from careless data entry. The major threat to information protection is the errors and omissions made by data entry personnel, users, system operators, and programmers. To better protect business information resources, organizations should conduct a risk analysis to see what
Practitioners are encouraged to provide full disclosure of all of a system's limitations and problems (ACM). In our case, we can see that Diane advised the company of all the options available for building a good, secure system. She was also honest in that she told the company about the insufficiency of the proposed security system and did not simply follow its requests in order to win the contract for herself. We can conclude that Diane has followed this principle, which shows that she has good professional ethics with respect to this
Technological advances continue to arrive at an ever-increasing rate. Despite these advances, the use of theoretical frameworks in risk management and information security may be deficient because the underlying theories are inadequately substantiated. Academic research to corroborate existing theories relevant to risk management and information security is underway, but current research may not support those theories. According to Chuy et al. (2010), the roles of theories may not be fully understood, and are arguably underused, in the research process. In this article, several theories regarding information security and risk management will be discussed. The selected theories will then be compared with respect to their implied use in information security and risk management. In addition, each theory will be briefly analyzed to determine whether sufficient research exists on it for use by the academic community and others. Finally, the challenges that arise for any theory lacking sufficient supporting research will be discussed.
After looking into each of the seven layers of the OSI model, it is apparent that there are many ways to exploit a security flaw within a system. A good security analyst has to look at the overall picture to keep the entire system secure, not just one or two layers. Information technology security measures are not a one-time fix; security is a continuous process that must keep pace with ever-changing protocols, applications, and the ingenuity of attackers.
The evolution and understanding of the importance of information security and risk management originate from awareness of the potential of IT in business functions and as a business enabler. This was followed by the realization that the risks introduced by this boundless facilitator must be properly understood and addressed. The essence of information security and risk management is to distinguish low-risk from high-risk systems and processes, and then to address those risks appropriately.
Potential risks and security breaches have been on the rise with a growing number of skillful hackers, resulting in an increase in external threats to personnel and businesses. However, when complex security measures and the appropriate level of controls are employed, the potential risk and loss from failure or breach are reduced. Such practice therefore enhances system reliability.
... should invest considerably in efficient security and surveillance systems. They should ensure that the safety of the firm is well implemented and that all the necessary support teams are well informed and equipped to avert any eventuality. Preventing information leaks and sabotage should also be a priority; although this is a difficult objective, it can be achieved with proper systems and adequate resources.
Second is integrity, which is one of the main areas of accountability in information security. Integrity is
"Software leakage points include all vulnerabilities directly related to the software in the computer system. Of special concern is the operating system and the supplementary programs that support the operating system because they contain the software safeguards. Weaknesses can result from improper design or from failure to check adequately for combinations of circumstances that can lead to unpredictable consequences."7
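The quoted passage's "failure to check adequately for combinations of circumstances" can be illustrated with a classic software leakage point: a path check that rejects one dangerous input but misses another combination. This is a hypothetical sketch (the base directory and function names are invented for illustration), contrasting a naive check with one that validates the fully resolved path.

```python
import os.path

BASE_DIR = "/srv/app/data"  # hypothetical document root for the sketch

def is_safe_naive(user_path):
    # Flawed safeguard: rejects ".." but fails to consider other
    # combinations of circumstances, such as an absolute path.
    return ".." not in user_path

def is_safe_checked(user_path):
    # Resolve the combined path first, then verify it still lies
    # under BASE_DIR, whatever the input looked like.
    full = os.path.normpath(os.path.join(BASE_DIR, user_path))
    return full == BASE_DIR or full.startswith(BASE_DIR + os.sep)

if __name__ == "__main__":
    # An absolute path slips past the naive check entirely:
    print(is_safe_naive("/etc/passwd"))       # True  (vulnerable)
    print(is_safe_checked("/etc/passwd"))     # False (rejected)
    print(is_safe_checked("../secret"))       # False (rejected)
    print(is_safe_checked("reports/q1.txt"))  # True  (legitimate)
```

The naive version fails not because its logic is wrong for the case it anticipated, but because it never anticipated the combination of an absolute path with the join operation, which is precisely the kind of unpredictable consequence the passage warns about.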
In 500 B.C. the abacus was first used by the Babylonians as an aid to simple arithmetic. In 1623 Wilhelm Schickard (1592-1635) invented a "Calculating Clock," a mechanical machine that could add and subtract numbers of up to six digits and warned of an overflow by ringing a bell. In 1786 J. H. Mueller conceived the idea of the "difference engine," a calculator that could tabulate the values of a polynomial; Mueller's attempt to raise funds failed and the project was forgotten. In 1843 Georg Scheutz and his son Edvard produced a third-order difference engine with a printer, and their government agreed to fund their next project.
"Risk management is the part of the analysis phase that identifies vulnerabilities in an organization's information system and takes carefully reasoned steps to assure the confidentiality, integrity, and availability of all components in the organization's information system" (Whitman & Mattord, Management of Information Security, 2nd ed.).
The history of the computer dates back to prehistoric times. The first step toward the development of the computer, the abacus, was developed in Babylonia in 500 B.C. and functioned as a simple counting tool. It was not until thousands of years later that the first calculator was produced. In 1623, the first mechanical calculator was invented by Wilhelm Schickard; the "Calculating Clock," as it was often called, "performed it's operations by wheels, which worked similar to a car's odometer" (Evolution, 1). Still, nothing had yet been invented that could even be characterized as a computer. Finally, in 1625 the slide rule was created, becoming "the first analog computer of the modern ages" (Evolution, 1). One of the biggest breakthroughs came from Blaise Pascal in 1642, who invented a mechanical calculator whose main function was adding and subtracting numbers. Years later, Gottfried Leibniz improved on Pascal's design by allowing it to also perform operations such as multiplying, dividing, and taking square roots.
very old. It is about 2,000 years old.1 The first computer was the abacus. This