This article introduces bits, bytes, and the related types in C#.
1. Bit (bit)
A bit is a single binary digit and is the smallest unit of binary information.
A bit is abbreviated with a lowercase b.
2. Byte (byte)
A byte is 8 bits.
The number of distinct values one byte can represent is given by:
2^8 = 256
But the maximum value is not 256. A byte with all 8 bits set is the maximum:
11111111
This value is
2^7 + 2^6 + 2^5 + 2^4 + 2^3 + 2^2 + 2^1 + 2^0 = 255
This also illustrates the following:
byte.MaxValue == 255;
It is also important to note that byte is unsigned, so its range is 0 to 255.
A byte is abbreviated with an uppercase B.
This byte is the same as the byte type in C#.
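A minimal C# sketch confirming these limits (class and variable names are my own for illustration):

```csharp
using System;

class ByteDemo
{
    static void Main()
    {
        // A byte is 8 bits and unsigned, so it ranges from 0 to 255.
        Console.WriteLine(byte.MinValue);                 // 0
        Console.WriteLine(byte.MaxValue);                 // 255

        // 2^8 = 256 distinct values, but the largest value is 2^8 - 1 = 255.
        Console.WriteLine((1 << 8) - 1);                  // 255

        // The binary string 11111111 parsed in base 2 is also 255.
        Console.WriteLine(Convert.ToByte("11111111", 2)); // 255
    }
}
```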
3. Kb and KB
We have covered b and B; now for the difference between Kb and KB.
Modems and other devices that access the Internet are typically rated in "Kbps" (kilobits per second), while other data rates, such as the transfer rate of an IDE or SCSI bus, are typically given in "KBps" (kilobytes per second) or "MBps" (megabytes per second).
These two are not the same.
Because 8 bits make one byte, we can write:
1B = 8b
When we sign up for "10M" broadband, that 10M refers to 10Mb (megabits), so the actual bandwidth in bytes is:
10Mb / 8 = 1.25MB
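The conversion above can be sketched in a few lines of C# (names are my own for illustration):

```csharp
using System;

class BandwidthDemo
{
    static void Main()
    {
        // "10M" broadband is 10 megabits per second (Mb/s);
        // dividing by 8 gives megabytes per second (MB/s).
        double megabits = 10.0;
        double megabytes = megabits / 8.0;
        Console.WriteLine(megabytes); // 1.25
    }
}
```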
4. sbyte
sbyte differs from byte in that sbyte is signed, that is, sbyte can represent negative numbers. The highest bit is the sign bit, so only 7 bits of an sbyte represent the magnitude.
The minimum value is:
−128
The maximum value is:
2^6 + 2^5 + 2^4 + 2^3 + 2^2 + 2^1 + 2^0 = 127
As for why the minimum is −128 rather than −127: in two's complement the bit pattern 10000000 has no positive counterpart and is defined as −128, giving the type one more negative value than positive. The details are the usual two's-complement and one's-complement material, which is explained thoroughly elsewhere.
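These limits can be checked directly, including the two's-complement bit patterns (a sketch with illustrative names; binary literals require C# 7.0+):

```csharp
using System;

class SByteDemo
{
    static void Main()
    {
        Console.WriteLine(sbyte.MinValue); // -128
        Console.WriteLine(sbyte.MaxValue); // 127

        // In two's complement, the bit pattern 10000000 is -128.
        // unchecked is needed because the constant 128 overflows sbyte.
        sbyte min = unchecked((sbyte)0b1000_0000);
        Console.WriteLine(min);            // -128

        // 01111111 is 2^6 + 2^5 + ... + 2^0 = 127.
        sbyte max = 0b0111_1111;
        Console.WriteLine(max);            // 127
    }
}
```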
5. short, ushort, int, uint, long, ulong
short is a 16-bit signed integer, that is, a 2-byte integer.
Because short has a sign bit, only 15 bits represent the magnitude. Its range is:
−32768 to 32767
The other types follow the same logic as short and are not explained separately.
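A short sketch that prints the size and range of each of these types (illustrative names; sizeof on primitive types works without an unsafe context):

```csharp
using System;

class IntegerRangesDemo
{
    static void Main()
    {
        // 16-bit: short (signed) and ushort (unsigned)
        Console.WriteLine($"short:  {sizeof(short)} bytes, {short.MinValue} to {short.MaxValue}");
        Console.WriteLine($"ushort: {sizeof(ushort)} bytes, {ushort.MinValue} to {ushort.MaxValue}");

        // 32-bit: int and uint
        Console.WriteLine($"int:    {sizeof(int)} bytes, {int.MinValue} to {int.MaxValue}");
        Console.WriteLine($"uint:   {sizeof(uint)} bytes, {uint.MinValue} to {uint.MaxValue}");

        // 64-bit: long and ulong
        Console.WriteLine($"long:   {sizeof(long)} bytes, {long.MinValue} to {long.MaxValue}");
        Console.WriteLine($"ulong:  {sizeof(ulong)} bytes, {ulong.MinValue} to {ulong.MaxValue}");
    }
}
```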
6. Char
How many bytes does char occupy?
In C#, a character is represented in Unicode (UTF-16) and occupies 2 bytes (16 bits).