How is a computer able to differentiate between so many characters and symbols?



Have you ever wondered how a computer is able to differentiate between so many characters and symbols? As we all know, electronic machines only understand the language of 0's and 1's, or high and low voltages. So the first question is: how can such a machine even tell the difference between the symbol 'A' and the symbol 'B', let alone all the other characters you come in contact with daily? This is where the concepts of ASCII and Unicode come in. The solution is very simple: allocate every symbol a number, and convert that number into binary so the machine can perform tasks on it, such as displaying, sorting, and so on.
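For a quick illustration (a small Python sketch, not part of any standard), here is that idea in action: each symbol is allocated a number, and the number is what the machine actually stores in 0's and 1's.

```python
# Every symbol is allocated a number; the machine stores that number in binary.
for ch in ["A", "B"]:
    code = ord(ch)               # the number allocated to the symbol
    bits = format(code, "08b")   # the same number written in 0's and 1's
    print(ch, code, bits)

# Output:
# A 65 01000001
# B 66 01000010
```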

Here comes the entry of the ASCII code, whose development began on 6 October 1960 (probably the same date on which you are reading this post) under a committee of the American Standards Association, the body that later became ANSI (the American National Standards Institute).

Brief about ASCII code


ASCII encodes 128 specified characters into seven-bit integers. Ninety-five of the encoded characters are printable: these include the digits 0 to 9, lowercase letters a to z, uppercase letters A to Z, and punctuation symbols. In addition, the original ASCII specification included 33 non-printing control codes which originated with Teletype machines; most of these are now obsolete, although a few are still commonly used, such as the carriage return, line feed, and tab codes.
For example, lowercase i would be represented in the ASCII encoding by binary 1101001 = hexadecimal 69 (i is the ninth letter) = decimal 105. 
 {source}
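That example can be checked in a couple of lines of Python (a quick sketch of the same arithmetic):

```python
ch = "i"
print(ord(ch))                 # 105     (decimal)
print(hex(ord(ch)))            # 0x69    (hexadecimal)
print(format(ord(ch), "07b"))  # 1101001 (the seven-bit ASCII value)
```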
The problem with ASCII is that it has only 128 code points (7 bits), which is nowhere near enough for all the symbols and characters being added day by day, such as emoji and the rupee symbol.
This is solved by Unicode, which provides a far larger code space of more than a million possible code points, as the small sketch below illustrates.
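The sketch (Python again) compares the code points of an ordinary ASCII letter, the rupee sign, and an emoji; only the first one fits in 7 bits.

```python
for ch in ["A", "₹", "😀"]:
    code = ord(ch)   # the Unicode code point of the character
    print(ch, code, hex(code),
          "fits in 7-bit ASCII" if code < 128 else "needs Unicode")

# A 65 0x41 fits in 7-bit ASCII
# ₹ 8377 0x20b9 needs Unicode
# 😀 128512 0x1f600 needs Unicode
```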

Brief about Unicode


The Unicode Standard provides a unique number for every character, no matter what platform, device, application or language. It has been adopted by all modern software providers and now allows data to be transported through many different platforms, devices and applications without corruption. Support of Unicode forms the foundation for the representation of languages and symbols in all major operating systems, search engines, browsers, laptops, and smartphones—plus the Internet and World Wide Web (URLs, HTML, XML, CSS, JSON, etc.). Supporting Unicode is the best way to implement ISO/IEC 10646.
The emergence of the Unicode Standard and the availability of tools supporting it are among the most significant recent global software technology trends.
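Because every character has one agreed-upon number, text can be encoded into bytes (most commonly as UTF-8), sent across any platform, and decoded back without corruption. A minimal Python sketch of that round trip:

```python
text = "price: ₹100 😀"              # contains characters far outside ASCII
data = text.encode("utf-8")          # code points turned into bytes for transport
print(data)                          # raw bytes, e.g. b'price: \xe2\x82\xb9100 \xf0\x9f\x98\x80'
print(data.decode("utf-8") == text)  # True: the text survives the round trip unchanged
```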




  • Related blogs:

    cryptography

    p2p network

    data scientist search engine

  • Follow us on Instagram: theabhi_eye
