|Real Software Forums
|8bit Character interpretation.
|Page 1 of 1|
|Author:||boborama [ Tue Aug 28, 2012 10:32 pm ]|
|Post subject:||8bit Character interpretation.|
I have a web-based project that makes a pseudo web-service call and expects/receives a stream of binary data in response. My data is delimited by 0xFD - Chr(253). I was able to create a string and assign Chr(253) to it.
I could then take the binary stream and separate it into an array using the Split command. This was working fine under 2011 R3 (and prior) -- but fails under 2012 R1. I found I needed to switch to using MemoryBlocks. That's all well and fine, and it makes sense for binary data, but it seems odd that it worked just fine under the previous version and not anymore. I didn't see that documented as a change anywhere.
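A minimal sketch of the approach described above, in REALbasic/Xojo. The variable names (responseData, fields) are illustrative, not from the original post:

```xojo
// Hypothetical sketch: split a raw response on the 0xFD delimiter.
// responseData is assumed to hold the binary stream as a String.
Dim delimiter As String = Chr(253)      // 0xFD as a one-character string
Dim fields() As String = Split(responseData, delimiter)

// Under an assumed text encoding, 0xFD is not a valid byte sequence
// on its own (e.g. in UTF-8), which is presumably where this breaks.
```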
|Author:||ktekinay [ Wed Aug 29, 2012 9:12 am ]|
|Post subject:||Re: 8bit Character interpretation.|
For binary data, the encoding of your string should be set to Nil. Perhaps it was set to UTF-8 (or something else), where Chr(253) on its own would make a malformed string?
You didn't mention it, but with binary data you should be using the "B" versions of the string functions anyway, so perhaps that would fix it.
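A sketch of the suggestion above, combining a Nil encoding with the byte-wise "B" functions. Again, responseData is a hypothetical variable standing in for the raw stream:

```xojo
// Strip any text encoding so the bytes are treated as raw data,
// then split byte-wise with SplitB and a ChrB delimiter.
Dim raw As String = DefineEncoding(responseData, Nil)
Dim fields() As String = SplitB(raw, ChrB(253))
```

ChrB and SplitB operate on bytes rather than characters, so they sidestep any encoding interpretation of 0xFD.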