c++ - Making a string of 0's and 1's into an int, then into a character from that ASCII value


Here's the procedure:

static void undo_bitstring(std::string& str) {
    for (unsigned i = 0; i < str.length(); i += charbits) {
        int ascii_val = 0;
        for (unsigned j = 0; j < charbits; j++) {
            if (str[i+j] == '1') ascii_val += (int)exp2(charbits-j-1);
        }
        str[i/charbits] = (char)ascii_val;
    }
    str.erase(str.begin()+str.size()/charbits, str.end());
}

where, as you know,

charbits  

was defined as

static const size_t charbits = 8 * sizeof(char); 

What is supposed to happen is that, for example,

std::string str = "01010111"; undo_bitstring(str); 

should change str to

"w" 

since

0×2^7 + 1×2^6 + 0×2^5 + 1×2^4 + 0×2^3 + 1×2^2 + 1×2^1 + 1×2^0 = 64 + 16 + 4 + 2 + 1 = 87 

and

(int)'w' = 87 

And of course the procedure is supposed to work for any string of 0's and 1's whose length is a multiple of charbits. For instance,

std::string str = "010101110101011101010111"; undo_bitstring(str); 

should change str to

"www" 

On the tests I've run, the output looks like a bunch of boxes with question marks inside them, indicating some sort of error.

any ideas?

Am I going about this totally wrong in the first place? This is part of an encryption/decryption algorithm I'm trying to make.

I do know about the >> and << operators, which, when applied to an int, shift its bits one position right or left. Would these be more reliable than using (int)exp2(charbits-j-1)?

