01-07-2011 10:09 AM
01-07-2011 10:16 AM
Something like this?
01-07-2011 08:29 PM - edited 01-07-2011 08:31 PM
Jeff,
It looks like your code swaps nibbles within a byte. I'm not sure there is such a thing as nibble endianness.
I think edf is looking not at endianness but at reversing the order of the bits: least significant bit first vs. most significant bit first.
Try converting the U8 to boolean array, reverse the 1-D boolean array, then convert back to a U8.
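Since LabVIEW is graphical, here is a rough Python sketch of that same idea, just as an illustration: unpack each U8 into its eight bits, reverse the bit order, and repack (the function name is made up for the example).

```python
def reverse_bits_u8(byte: int) -> int:
    """Reverse the bit order of a single unsigned 8-bit value."""
    bits = [(byte >> i) & 1 for i in range(8)]    # LSB-first bit list
    bits.reverse()                                # flip the order
    return sum(bit << i for i, bit in enumerate(bits))

# Example: 0b0000_0001 (LSB set) becomes 0b1000_0000 (MSB set).
assert reverse_bits_u8(0x01) == 0x80
assert reverse_bits_u8(0xA5) == 0xA5  # palindromic bit pattern is unchanged
```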
01-07-2011 10:12 PM
I think Ravens Fan is right about the bit reversal. Here are two ways I do it:
Of course, by lame I mean simple and fast.
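The snippets above are LabVIEW diagrams and don't reproduce in text, so here is a hedged Python sketch of the lookup-table idea (the first of the two, per altenbach's comment below): precompute the reversed value of all 256 possible bytes once, then reversing a whole packet is just one table index per byte. The names are mine, not from the original diagram.

```python
# Build the 256-entry reversal table once, at "compile time".
REVERSED = [int(f"{b:08b}"[::-1], 2) for b in range(256)]

def reverse_packet(data: bytes) -> bytes:
    """Bit-reverse every byte in a packet using the precomputed table."""
    return bytes(REVERSED[b] for b in data)

print(reverse_packet(b"\x01\x80\xa5").hex())  # -> "8001a5"
```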
01-08-2011 03:23 PM - edited 01-08-2011 03:26 PM
I've never heard of a format with reversed bits. Are you sure that's what it actually is?
Can you attach a VI containing a typical 90-byte raw data set and the desired result?
I would go with the lookup table, as in Darin's top example. If you disable debugging, it will get folded into a constant at compile time and is thus extremely efficient at runtime. (Or just make it into a constant manually, as Darin did.)
(Darin, that second algorithm is weird. These guys have way too much time on their hands: inflate each byte 8x to a U64 just to do some obscure binary operations, then shrink it back later... 🐵)
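That "inflate to U64" diagram isn't reproducible here, but a well-known text-code analogue of the same flavor (from the Bit Twiddling Hacks collection) does exactly what's described: widen the byte, spread its bits out with a 64-bit multiply, select them with a mask, and collapse the result back down. Whether this is precisely what Darin wired up is an assumption.

```python
def reverse_bits_u8_mul(byte: int) -> int:
    """Reverse a byte's bits via the 64-bit multiply / mask / modulus trick."""
    return ((byte * 0x0202020202) & 0x010884422010) % 1023

# Sanity check against a straightforward string-based reversal.
assert all(
    reverse_bits_u8_mul(b) == int(f"{b:08b}"[::-1], 2) for b in range(256)
)
```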
01-09-2011 04:46 AM
Darin, altenbach, thank you!
We're pretty sure the data is bit-reversed. The FPGA device outputting the data is custom, so it could easily be outputting reversed bits if that's what the guy who built it felt like doing. There are a number of fixed marker bytes in the packet (as well as start and stop bytes), and the only way these are what we expect them to be is by reversing the bits. Then the whole packet looks much more sensible. Weird, I know, and most confusing; it took us a week to figure out why it wasn't working the way we expected!