While debugging a program today I ran into something strange: when -1 is compared with string.length(), -1 always comes out as the larger one. Look at the following code:
#include <iostream>
#include <string>
using namespace std;

int main() {
    string str;
    str = "123";
    int num = -1;
    //int len = str.length();
    if (num < str.length()) {
        cout << "-1<str.length()";
    } else {
        cout << "-1>=str.length()";
    }
    return 0;
}
The output is: -1>=str.length(), which seems quite bizarre. Change the code to the following and see:
#include <iostream>
#include <string>
using namespace std;

int main() {
    string str;
    str = "123";
    int num = -1;
    int len = str.length();
    if (num < len) {
        cout << "-1<str.length()";
    } else {
        cout << "-1>=str.length()";
    }
    return 0;
}
This time the output is: -1<str.length().
The two programs look like they should print the same result, but they don't. That reminded me of an earlier post I wrote about implicit type conversion in C++: http://www.cnblogs.com/bewolf/p/4358006.html
Sure enough, the return type of str.length() is an unsigned integer type (std::string::size_type, usually size_t). If you compare it directly with -1, the int is implicitly converted to the unsigned type, so -1 becomes a huge number, and of course "that huge number is bigger than 3". If you first assign str.length() to an int variable, the value is converted to the type being assigned to, so the length becomes an int, and comparing -1 with an int then gives the normal result we expect.
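Here is a minimal sketch that makes the conversion visible. The exact huge value printed depends on the width of size_t on your platform (4294967295 for a 32-bit size_t, 18446744073709551615 for 64-bit); the two "safe" comparisons at the end are just illustrative alternatives, not code from the original post.

#include <iostream>
#include <string>
using namespace std;

int main() {
    string str = "123";
    int num = -1;

    // -1 converted to the unsigned type returned by length() wraps around
    // to the maximum value of that type.
    cout << static_cast<string::size_type>(num) << endl;

    // Mixed signed/unsigned comparison: num is converted to unsigned first,
    // so this prints 0 (false). Most compilers warn here, e.g. g++/clang
    // with -Wsign-compare.
    cout << (num < str.length()) << endl;

    // One way around it: convert the length to a signed int before comparing.
    cout << (num < static_cast<int>(str.length())) << endl;   // prints 1

    // Another way: handle negative values separately so no mixed
    // signed/unsigned comparison ever happens.
    cout << (num < 0 || static_cast<string::size_type>(num) < str.length()) << endl;   // prints 1

    return 0;
}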
So that is why comparing string.length() with -1 produces such an incredible-looking result: it is simply the signed-to-unsigned implicit conversion at work.