The Concept of Entropy in Computer Science
We can apply the concept of entropy to data compression, to find the most efficient way to compress a piece of data. Data compression works by eliminating or minimising redundancy in a file, making your files smaller without losing any information. Every character on your computer, every letter, number and punctuation mark, is actually made up of several bits of computer code. A simple illustration of compression: if you have a string of characters "AAAADDDDDDD" in a document, one type of compression software can rewrite this as "4A7D", saving seven characters and making that line 64% smaller.
Compression software uses algorithms to do this.
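As a minimal sketch of this run-length idea (the function name and output format are my own illustration; real compression software is far more sophisticated):

    # Run-length encoding: replace a run of repeated characters with
    # its length followed by the character, e.g. "AAAADDDDDDD" -> "4A7D".
    def rle_encode(text):
        if not text:
            return ""
        out = []
        run_char, run_len = text[0], 1
        for ch in text[1:]:
            if ch == run_char:
                run_len += 1
            else:
                out.append(f"{run_len}{run_char}")
                run_char, run_len = ch, 1
        out.append(f"{run_len}{run_char}")
        return "".join(out)

    print(rle_encode("AAAADDDDDDD"))  # prints 4A7D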
Compression makes files smaller, so they take up less storage space and can be transferred faster from machine to machine. Combined with archiving, it becomes a useful way to organise files and projects. Data compression also removes some of the redundancy in an uncompressed text file, which actually contributes to data security.
Data compression also has its disadvantages. One of these is that reliability is reduced, because compression removes redundancy, and redundancy is useful for error detection.
Data compression is most useful for large archive files and for files that need to be transferred over long distances, for example over the internet, as compression can offer dramatic savings in storage requirements.
Data compression can be split into two categories: lossless (entropy coding) and lossy (source coding).
Lossy and Lossless
Some compression techniques (usually for image and multimedia files such as JPEG and MP3) lose information when the data is compressed, reducing the quality of the file.
Compressing these types of files is irreversible, so the decompressed file cannot match the original. The greater the reduction in size, the greater the quality degradation. This type of compression is therefore not used with files that need to be restored to their original form. This is lossy data compression.
When the information in the file needs to be conserved, lossless data compression is used. This ensures that no information is lost during the compression process, so the data is the same before and after compression. Repeated compression of a lossless file does not mean a continual reduction in size, as there is a lower bound on how far the file can be compressed: its entropy.
Lossless Data Compression (Entropy Coding)
Lossless data compression uses a compression algorithm that allows the original data to be reconstructed fully from the compressed data. It is used when the original data must be fully recoverable: once decompressed, the data should be identical to the data before compression.
As previously discussed, Shannon defined the entropy of a set of discrete events with probabilities p1, ..., pn as:
H = −Σ pi log pi
and the entropy of a continuous distribution with density function p(x) as:
H = −∫ p(x) log p(x) dx
Using the entropy of a set of symbols and their probabilities, we can deduce the best compression ratio we can achieve. For example, the English language has an entropy of about 1.3 bits per character, so at the optimal level we could use 1.3 bits per character.
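The discrete definition translates directly into code. A small sketch (the helper name and the example distribution are my own):

    import math

    # Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes.
    def entropy(probabilities):
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))  # a fair coin: exactly 1 bit per toss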
The Shannon-Fano and Huffman compression methods are based on statistics obtained from the data. These statistics take into account the probability of each symbol, that is, how often it will appear, and with this information we assign a binary string to each symbol. The aim is to assign the most frequently occurring symbol the shortest binary code and the least frequent the longest. This allows the coded data to be smaller in size than the original data.
Compression of Data Using Shannon-Fano Coding
As we have already discussed the Shannon-Fano code in the previous chapter, we can skip the method for computing the code strings and turn to how the Shannon-Fano code can be effective in compression.
So consider a set of symbols with their corresponding probabilities of occurrence:
Set of symbols (x) and probability of occurrence P(x):
a: P(a) = 0.20
b: P(b) = 0.18
c: P(c) = 0.13
d: P(d) = 0.10
e: P(e) = 0.09
f: P(f) = 0.08
g: P(g) = 0.08
h: P(h) = 0.07
i: P(i) = 0.04
j: P(j) = 0.03
Applying the Shannon-Fano compression method, we can use this order of decreasing probabilities to help compress the data. First, we divide the list into two groups such that each group holds approximately half of the total probability. Then we continue this division process within each group until there is only one symbol in each group.
We now demonstrate the first division:
Group one: a, b, c (total probability 0.51)
Group two: d, e, f, g, h, i, j (total probability 0.49)
A tree diagram can represent the further splitting of these groups; the same procedure can also be expressed in code, as sketched below.
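A minimal sketch of the recursive splitting (my own illustrative implementation; Shannon-Fano splits are not unique, so the exact codes and average length depend on how near-ties are broken):

    # Shannon-Fano: recursively split the (sorted) symbol list into two
    # groups of roughly equal probability, appending 0 to the codes of the
    # first group and 1 to the second.
    def shannon_fano(symbols):
        codes = {s: "" for s, _ in symbols}

        def split(group):
            if len(group) <= 1:
                return
            total = sum(p for _, p in group)
            best_cut, best_diff, running = 1, float("inf"), 0.0
            for k in range(1, len(group)):
                running += group[k - 1][1]
                diff = abs(running - (total - running))
                if diff < best_diff:
                    best_cut, best_diff = k, diff
            for s, _ in group[:best_cut]:
                codes[s] += "0"
            for s, _ in group[best_cut:]:
                codes[s] += "1"
            split(group[:best_cut])
            split(group[best_cut:])

        split(symbols)
        return codes

    probs = [("a", 0.20), ("b", 0.18), ("c", 0.13), ("d", 0.10), ("e", 0.09),
             ("f", 0.08), ("g", 0.08), ("h", 0.07), ("i", 0.04), ("j", 0.03)]
    codes = shannon_fano(probs)
    print(sum(len(codes[s]) * p for s, p in probs))
    # This splitting rule gives about 3.18 bits; the hand split worked
    # through below breaks one near-tie differently and gives 3.19 bits.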
The entropy of this string of data is:
H = −(0.2 log 0.2 + 0.18 log 0.18 + 0.13 log 0.13 + 0.1 log 0.1 + 0.09 log 0.09 + 0.08 log 0.08 + 0.08 log 0.08 + 0.07 log 0.07 + 0.04 log 0.04 + 0.03 log 0.03) = 3.1262772 bits (logarithms to base 2)
Now each symbol has a binary string attached to it. With these binary strings and the probabilities of occurrence, we can calculate the average length of the code and see how close it is to the entropy of the data. The average length is the sum, over all symbols, of the length of each symbol's binary string multiplied by its probability of occurrence.
So the average length = (2 × 0.2) + (3 × (0.18 + 0.13 + 0.10)) + (4 × (0.09 + 0.08 + 0.08 + 0.07 + 0.04 + 0.03)) = 3.19 bits
We can see how close the average length of the code is to the entropy of the data. Using this, we can compare coding methods and judge which is the most effective by seeing which brings the average length closest to the entropy.
Compression of Data Using Huffman Coding
Again, as we have already discussed the Huffman code in the previous chapter, we can skip the method for computing the code strings and turn to how the Huffman code can be effective in compression.
So we consider the same set of symbols used above, with the same probabilities of occurrence.
As we can see, the Huffman compression method, like the Shannon-Fano method, uses the ordering of symbols by decreasing probability to help compress the data. However, that is where the similarities end. In the Huffman method, we build a binary tree from the bottom up: we repeatedly merge the two least probable nodes, starting with the two least likely symbols, into a new node whose probability is the sum of the two. Once a single node remains, we label the two branches at each node with the binary digits 0 and 1, and the bit sequence read from the root down to a symbol's leaf is that symbol's Huffman code.
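A minimal Huffman construction can be sketched with Python's standard heapq module (my own illustrative code, not the original derivation; because of ties there are several equally good trees, but all give the same average length):

    import heapq
    import itertools

    def huffman(symbols):
        # Each heap entry is (probability, tie-breaker, {symbol: code so far}).
        counter = itertools.count()
        heap = [(p, next(counter), {s: ""}) for s, p in symbols]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, codes1 = heapq.heappop(heap)  # two least probable nodes
            p2, _, codes2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in codes1.items()}
            merged.update({s: "1" + c for s, c in codes2.items()})
            heapq.heappush(heap, (p1 + p2, next(counter), merged))
        return heap[0][2]

    probs = [("a", 0.20), ("b", 0.18), ("c", 0.13), ("d", 0.10), ("e", 0.09),
             ("f", 0.08), ("g", 0.08), ("h", 0.07), ("i", 0.04), ("j", 0.03)]
    codes = huffman(probs)
    print(sum(len(codes[s]) * p for s, p in probs))  # 3.17 bits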
As computed earlier, the entropy of this string of data is H = 3.1262772 bits.
As we now have the binary string attached to each symbol derived from the Huffman code, we can calculate the average length and see how close it is to the entropy of the data. Again, the average length of the code is the sum, over all symbols, of the length of each symbol's binary string multiplied by its probability of occurrence.
So the average length = (2 × 0.20) + (3 × (0.18 + 0.13 + 0.10 + 0.09)) + (4 × (0.08 + 0.08 + 0.07)) + (5 × (0.04 + 0.03)) = 3.17 bits
We can see how close the average length of the code is to its entropy. From the two compression methods, we can deduce that the Huffman method is more effective, as its average code length (3.17 bits) is closer to the entropy than the Shannon-Fano code's (3.19 bits). This is why, today, Huffman coding is used more than the Shannon-Fano method to compress data.
Other Data Compression Algorithms
Compression Using the Lempel-Ziv Method
The Lempel-Ziv method was explained in the previous chapter, so we can summarise its use as a compression method. Lempel-Ziv compression is used mainly to compress text files. The input sequence is parsed into non-overlapping blocks of different lengths, while a dictionary of the blocks we have already seen is built up along the way.
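A minimal sketch of this dictionary-building parse, in the style of LZ78 (my own illustration; practical Lempel-Ziv variants such as LZW differ in detail):

    # Parse text into (dictionary index, next character) pairs: each new
    # block is the longest block already in the dictionary plus one character.
    def lz78_parse(text):
        dictionary = {"": 0}      # block -> index; 0 is the empty block
        output, phrase = [], ""
        for ch in text:
            if phrase + ch in dictionary:
                phrase += ch      # keep extending a known block
            else:
                output.append((dictionary[phrase], ch))
                dictionary[phrase + ch] = len(dictionary)
                phrase = ""
        if phrase:                # input ended mid-block
            output.append((dictionary[phrase[:-1]], phrase[-1]))
        return output

    print(lz78_parse("abababab"))
    # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (0, 'b')]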
Lossy Data Compression
Lossy compression is a data compression method that discards some of the data in order to achieve its goal of compression. When decompressed, the content is different from the original, though similar enough to be useful in some way. Lossy compression is most commonly used to compress multimedia data, especially in applications such as streaming media and internet telephony.
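As a toy illustration of how discarding detail buys compression (my own example, not any real codec), quantising samples to a coarse grid leaves fewer distinct values to store, and the reconstruction only approximates the original:

    # Quantise samples to multiples of `step`; store the small integers.
    def quantize(samples, step):
        return [round(s / step) for s in samples]

    def dequantize(levels, step):
        return [lvl * step for lvl in levels]   # approximate reconstruction

    original = [0.12, 0.47, 0.51, 0.89]
    levels = quantize(original, 0.25)
    print(dequantize(levels, 0.25))  # [0.0, 0.5, 0.5, 1.0] close but not identical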
Compressing and Archiving
Compression is typically applied to a single file, and compressed formats contain only a single item.
Archiving allows files and folders to be grouped together for compression. Archive formats can contain a single item or many items, and they preserve the hierarchy of nested folders. Most archive formats also include compression as part of the archiving process.
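Python's standard zipfile module, for example, combines the two: it archives many items and compresses each one (the file names here are hypothetical):

    import zipfile

    # Create an archive holding several items, compressing each with DEFLATE.
    with zipfile.ZipFile("project.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.write("report.txt")        # hypothetical files to archive
        zf.write("data/results.csv")  # nested folder paths are preserved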
Expansion and Extraction
When an archive is created, it can be accessed in two different modes. One is expansion, where the entire contents of the archive are expanded at once. The second is browse mode, where the archive is accessed like a folder: the hierarchical structure can be navigated and individual files can be extracted without having to expand the entire contents of the archive. In some cases, archive content can also be manipulated: items can be renamed, moved, deleted, added, and so on.
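Continuing the zipfile sketch above (file names again hypothetical), both modes are available:

    import zipfile

    with zipfile.ZipFile("project.zip") as zf:
        zf.extractall("expanded/")                  # expansion: unpack everything
        print(zf.namelist())                        # browse: list items in place
        zf.extract("data/results.csv", "partial/")  # extract a single item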
Re-compression
Re-compression involves making files smaller by disassembling the structure of the data and then compressing the file more efficiently. When the file is later expanded, the data structure of the file is reassembled.
Re-compression usually results in an output that is 100% identical to the original, although in some cases it may not be. In those cases the content and information are never lost, but the encoding may be slightly different.