Advantages And Disadvantages Of Digital Compression And Digital Image Processing


Chapter 1
INTRODUCTION

An image, as perceived in the real world, can be considered a function of two real variables, for example a(x, y), where a gives the brightness of the image at the real coordinates (x, y). Further, an image may be considered to contain sub-images, referred to as regions of interest (ROIs), or simply regions. This idea reflects the fact that images frequently contain collections of objects, each of which can serve as the basis for a region.
Digital image processing is the use of computer algorithms to perform transformations on digital images. As a subfield of digital signal processing, digital image processing has many advantages over analog image processing.
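
As a minimal sketch of this view of an image, the short Python example below represents a digital image as a 2-D array of brightness values a(x, y) sampled at discrete coordinates, with a region of interest taken as a sub-array. The grid size and the radial brightness function are assumptions chosen purely for illustration.

```python
import numpy as np

# Sample a continuous brightness function a(x, y) on a discrete grid
# to obtain a digital image: a 2-D array of brightness values.
height, width = 64, 64
y, x = np.mgrid[0:height, 0:width]

# Hypothetical brightness function: a smooth radial gradient,
# brightest at the centre of the image.
distance = np.hypot(x - width / 2, y - height / 2)
image = (255 * (1 - distance / distance.max())).astype(np.uint8)

# A region of interest (ROI) is simply a sub-image, i.e. a sub-array.
roi = image[16:48, 16:48]
print(image.shape, roi.shape, image.dtype)  # (64, 64) (32, 32) uint8
```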

Compression means reducing the size of data in order to save storage space or transmission time over a network. For data transmission, compression can be applied to the data content alone or to the entire transmission unit, depending on various factors. Text compression can be as simple as removing all extra space characters, inserting a single repeat character to indicate a run of repeated characters, or substituting shorter bit strings for frequently occurring characters. This kind of compression can reduce a text file to about half of its original size. Compression is performed by a program that uses a formula or algorithm to determine how to compress or decompress the data.
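
The second of these ideas, replacing a run of repeated characters with a single character and a count, is run-length encoding. The following is a minimal Python sketch of it; the function names and the character-plus-count output format are illustrative assumptions, not part of any particular standard.

```python
import re

def rle_encode(text: str) -> str:
    """Replace each run of a repeated character with the character
    followed by its count, e.g. 'AAAABB' -> 'A4B2'.
    (This sketch assumes the input itself contains no digits.)"""
    if not text:
        return ""
    pieces = []
    run_char, run_len = text[0], 1
    for ch in text[1:]:
        if ch == run_char:
            run_len += 1
        else:
            pieces.append(f"{run_char}{run_len}")
            run_char, run_len = ch, 1
    pieces.append(f"{run_char}{run_len}")
    return "".join(pieces)

def rle_decode(encoded: str) -> str:
    """Reverse rle_encode: expand each character/count pair."""
    return "".join(ch * int(count)
                   for ch, count in re.findall(r"(\D)(\d+)", encoded))

# Long runs compress well; text without runs may even grow larger.
print(rle_encode("AAAAAABBBCCCCCC"))   # A6B3C6
print(rle_decode("A6B3C6"))            # AAAAAABBBCCCCCC
```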

It used shorter codes for the letters of the English alphabet. The development of information theory in the 1940s paved the way for rapid advances in data compression techniques. In 1949, Claude Shannon and Robert Fano devised a method of compression that allocates code words based on the probabilities of blocks in the data to be compressed (Shannon–Fano coding). These early schemes were largely limited to hardware implementations. Huffman coding, introduced by David Huffman in 1952, assigns the shortest code words to the most frequent symbols; in the mid-1970s this idea was extended to dynamic (adaptive) variants that update the code words based on the actual data. In the late 1970s, software began to use Huffman encoding. In 1977, Abraham Lempel and Jacob Ziv introduced the idea of pointer-based encoding (LZ77). In the mid-1980s, pioneering work by Terry Welch led to the Lempel–Ziv–Welch (LZW) algorithm, which later became the most widespread algorithm in universal compression systems; it has been used in programs such as PKZIP and in hardware devices such as modems. During the late 1980s, digital images became so popular that standards for image compression began to develop. In the 1990s, lossless compression techniques continued to evolve. The major advances in data compression achieved by pioneers such as Huffman, Lempel, Ziv, and many other scientists have brought the field to its present level. The enormous progress of web technology
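
To make the idea of probability-based code words concrete, here is a minimal Huffman-coding sketch in Python. The function name, the tie-breaking scheme, and the printed output are assumptions for illustration only; a real compressor would also store the code table or tree alongside the encoded bits.

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    """Build a Huffman code for `text`: the more frequent a character,
    the shorter its bit string. Returns {character: code word}."""
    freqs = Counter(text)
    # Heap entries: (frequency, tie_breaker, {char: partial_code}).
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Merge the two least frequent subtrees: prefix '0' on the left
        # branch and '1' on the right, as when walking down a code tree.
        merged = {ch: "0" + code for ch, code in left.items()}
        merged.update({ch: "1" + code for ch, code in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2] if heap else {}

codes = huffman_codes("abracadabra")
encoded = "".join(codes[ch] for ch in "abracadabra")
print(codes)          # e.g. {'a': '0', 'b': '110', 'r': '111', ...}
print(len(encoded))   # noticeably fewer bits than 8 bits per character
```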
