07-03-2012 12:15 PM
Hi! I have a problem with my data. I get it as a large 1x1000 array, but it is repeated measurements, each taking roughly 500 data points. I want to break this array up so that the string of data starts a new column every 500 data points. I'm not sure how to do this - please help!
Solved! Go to Solution.
07-03-2012 12:36 PM
Post what you have so far.
07-03-2012 02:00 PM - edited 07-03-2012 02:00 PM
Hey, thank you for replying, and first off I have to apologize for the state of the VI I'm attaching. I included only the part of the VI I'm working on (my whole team has access to it, so I didn't want to post all of it here), and I also attached the data file (written straight to a spreadsheet file, not through the VI attached). I want to transpose the long row of data and then start a new column every 50, 100, or 5 (user-defined) points.
Thanks!
Note: I had to save the spreadsheet file as a .txt file to upload it, but it is a spreadsheet file being produced by the VI I'm working with.
07-03-2012 03:10 PM
A quick VI.
07-03-2012 04:11 PM - edited 07-03-2012 04:32 PM
@datacompiler100 wrote:
Hey, thank you for replying, and first off I have to apologize for the state of the VI I'm attaching. I included only the part of the VI I'm working on (my whole team has access to it, so I didn't want to post all of it here), and I also attached the data file (written straight to a spreadsheet file, not through the VI attached). I want to transpose the long row of data and then start a new column every 50, 100, or 5 (user-defined) points.
Starting with the data from the file, you can simply reshape (as you already do!), followed by transpose (since you want columns instead of rows). 2D arrays always need to be rectangular, so the last column will be padded with zeroes if needed. Is that what you want?
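To make the reshape-then-transpose idea concrete outside of LabVIEW, here is a minimal Python sketch of the same logic (the function name and the zero-padding value are assumptions for illustration; in LabVIEW this corresponds to Reshape Array followed by Transpose 2D Array):

```python
def reshape_to_columns(data, points_per_column):
    """Split a flat list of readings into columns of the given length,
    zero-padding the final column so the result stays rectangular."""
    n_cols = -(-len(data) // points_per_column)  # ceiling division
    padded = data + [0] * (n_cols * points_per_column - len(data))
    # Build rows of length points_per_column, then transpose so each
    # measurement run becomes a column.
    rows = [padded[i * points_per_column:(i + 1) * points_per_column]
            for i in range(n_cols)]
    return [list(col) for col in zip(*rows)]

# Example: 7 points, 3 per column -> last column padded with zeroes
print(reshape_to_columns([1, 2, 3, 4, 5, 6, 7], 3))
# -> [[1, 4, 7], [2, 5, 0], [3, 6, 0]]
```

The padding step mirrors what LabVIEW does implicitly: a 2D array must be rectangular, so the final short column is filled out with zeroes.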
Of course, if you try to add a new column to a file, that won't work. You can only append rows to an existing file because of the way the data is arranged on disk. To append columns, the entire file needs to be read, the new data interlaced, and everything re-written to the file.
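The read-interlace-rewrite cycle described above can be sketched in Python like this (the helper name and tab delimiter are assumptions for illustration, not part of the original VI):

```python
import csv
import io

def append_column(table_text, new_column, delimiter="\t"):
    """Append one column to delimited text by reading every row,
    attaching one new value per row, and re-emitting the whole table.
    Files are stored row by row, so a column can't be appended in place."""
    rows = list(csv.reader(io.StringIO(table_text), delimiter=delimiter))
    out = io.StringIO()
    writer = csv.writer(out, delimiter=delimiter, lineterminator="\n")
    for row, value in zip(rows, new_column):
        writer.writerow(row + [str(value)])
    return out.getvalue()

print(append_column("1\t2\n3\t4\n", [9, 8]))
# -> "1\t2\t9\n3\t4\t8\n"
```

The key point is that every existing row has to pass through memory; appending a row only touches the end of the file, but appending a column touches every line.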
07-03-2012 04:20 PM
@ErnieH wrote:
A quick VI.
It is considered bad form to use "Delete From Array" inside an inner loop, because you are constantly allocating new memory. For large arrays it can be orders of magnitude slower.
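A hypothetical Python analogue of the two approaches (the function names are invented for illustration): repeatedly deleting from the front of an array copies the shrinking remainder on every step, while plain index arithmetic makes a single pass.

```python
def split_by_deletion(data, n):
    """Analogue of Delete From Array in a loop: each iteration shifts
    the remaining elements, so the whole run is roughly O(len(data)**2 / n)."""
    chunks = []
    work = list(data)
    while work:
        chunks.append(work[:n])
        del work[:n]  # reallocates/shifts everything still in the list
    return chunks

def split_by_indexing(data, n):
    """Analogue of Reshape Array: one pass with index arithmetic,
    O(len(data)), no repeated reallocation."""
    return [data[i:i + n] for i in range(0, len(data), n)]

# Both produce the same chunks; only the cost profile differs.
print(split_by_indexing([1, 2, 3, 4, 5], 2))
# -> [[1, 2], [3, 4], [5]]
```

For a 1x1000 array either version finishes instantly, which is why the difference goes unnoticed on small inputs; the quadratic cost only shows up as the data grows.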
07-03-2012 07:43 PM
It is considered bad form to critique someone's VI unless they ask you, which I did not. It was stated as a quick VI to help him get started, and for the size of the arrays presented, the difference is not noticeable by the end user. Thank you for your concern.
07-03-2012 08:03 PM
@ErnieH wrote:
It is considered bad form to critique someone's VI unless they ask you, which I did not. It was stated as a quick VI to help him get started, and for the size of the arrays presented, the difference is not noticeable by the end user. Thank you for your concern.
It is considered bad form for someone to allow what might be a bad piece of code to go unchallenged.
Consider it a lesson not only for the original poster but also for yourself.
07-03-2012 10:02 PM
@ErnieH wrote:
It is considered bad form to critique someone VI's unless they ask you, which I did not.
My comment was addressed to the original poster to help him make an informed decision.
@ErnieH wrote:
the difference is not noticeable by the end user.
The much more complicated diagram, which is also difficult to follow and debug, is highly noticeable to the end user. It gives significantly more places for bugs to hide. 😄
The solution should not depend on the size of the inputs. It should be sufficient to teach/learn one efficient algorithm that works well for all sizes of a given problem, especially if it is also simpler and easier to implement. Starting with a poor, overly complicated algorithm that works for smallish problems is highly inefficient. If it later needs to be scaled up, everything would need to be designed from scratch again. Bad coding habits are hard to break.
Inefficient algorithms are sometimes OK if they result in significantly simpler code and speed does not matter. This wasn't the case here.
07-03-2012 10:03 PM
Everything should be challenged, including unnecessarily rude and condescending remarks? I read a lot but don't post often. This is an example of why that is the case. My mistake was posting to begin with.