LabVIEW


When collecting data, how can I start a new column in my array every 100 data points?

Solved!

Hi! I have a problem with my data: it arrives as one long 1x1000 array, but it actually contains repeated measurements, each taking roughly 500 data points. I want to break this array up so that a new column starts every 500 data points. I'm not sure how to do this - please help!

Message 1 of 21

Post what you have so far.

Message 2 of 21

Hey, thank you for replying, and first off I have to apologize for the state of the VI I'm attaching. I included only the part of the VI that I'm working on (my whole team has access to the full VI, so I didn't want to post it all here), and also attached the data file (written straight to a spreadsheet file, not through the attached VI). I want to transpose the long row of data and then start a new column every 50, 100, or 5 (user-defined) points.

 

Thanks!

 

Note: I had to save the spreadsheet file as a .txt file to upload it, but it is a spreadsheet file being produced by whichever VI I'm working with.

Message 3 of 21
Message 4 of 21
Solution
Accepted by topic author datacompiler100

@datacompiler100 wrote:

Hey, thank you for replying, and first off I have to apologize for the state of the VI I'm attaching. I included only the part of the VI that I'm working on (my whole team has access to the full VI, so I didn't want to post it all here), and also attached the data file (written straight to a spreadsheet file, not through the attached VI). I want to transpose the long row of data and then start a new column every 50, 100, or 5 (user-defined) points.

 


Starting with the data from the file, you can simply reshape (as you already do!), followed by transpose (since you want columns instead of rows). 2D arrays always need to be rectangular, so the last column will be padded with zeroes if needed. Is that what you want?
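For reference, the same Reshape Array + Transpose 2D Array idea expressed in text form (a rough Python sketch; the function name and the zero-padding choice are just for illustration, not part of the attached VI):

```python
def columnize(data, points_per_column):
    """Split a flat sequence into columns of the given length,
    zero-padding the last column so the result stays rectangular
    (mirroring LabVIEW's Reshape Array followed by Transpose 2D Array)."""
    n_cols = -(-len(data) // points_per_column)  # ceiling division
    padded = list(data) + [0] * (n_cols * points_per_column - len(data))
    # reshape: one row per measurement chunk
    rows = [padded[i * points_per_column:(i + 1) * points_per_column]
            for i in range(n_cols)]
    # transpose so each measurement chunk becomes a column
    return [list(col) for col in zip(*rows)]

# 10 data points, user-defined column length of 4:
print(columnize(range(1, 11), 4))
# [[1, 5, 9], [2, 6, 10], [3, 7, 0], [4, 8, 0]]
```

The `points_per_column` input plays the role of the user-defined control (50, 100, 5, ...) mentioned above.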


Of course, if you try to add a new column to a file, that won't work. You can only append rows to an existing file, because of the way the data is arranged on disk. To append columns, the entire file needs to be read, the new data interlaced, and everything re-written to the file.

 

Message 5 of 21

@ErnieH wrote:

A quick VI.



It is considered bad form to use "Delete From Array" inside an inner loop: you are constantly allocating new memory. For large arrays this can be orders of magnitude slower.
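To illustrate the point outside of LabVIEW, here is a rough Python analogue (the function names are made up for this sketch): the first version mimics deleting from the front of the array on every loop iteration, the second indexes into the original array in a single pass.

```python
def chunk_by_deleting(data, size):
    """Repeatedly split off the first `size` elements.
    Each `del` copies/shifts everything that remains, so the total
    work grows quadratically with the array length -- the same
    problem as Delete From Array inside an inner loop."""
    data = list(data)
    chunks = []
    while data:
        chunks.append(data[:size])
        del data[:size]  # reallocates/shifts the whole remainder
    return chunks

def chunk_by_slicing(data, size):
    """Index directly into the original array: one pass, linear time,
    no repeated reallocation."""
    return [data[i:i + size] for i in range(0, len(data), size)]

# Both produce the same chunks; only the memory behavior differs:
print(chunk_by_slicing(list(range(10)), 4))
# [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

For a 1x1000 array the difference is invisible; for millions of points the deleting version slows down dramatically while the slicing version stays linear.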

Message 6 of 21

It is considered bad form to critique someone's VIs unless they ask you, which I did not. It was stated to be a quick VI to help him get started, and for the size of arrays presented the difference is not noticeable to the end user. Thank you for your concern.

Message 7 of 21

@ErnieH wrote:

It is considered bad form to critique someone's VIs unless they ask you, which I did not. It was stated to be a quick VI to help him get started, and for the size of arrays presented the difference is not noticeable to the end user. Thank you for your concern.


It is considered bad form for someone to allow what might be a bad piece of code to go unchallenged.

 

Consider it a lesson not only for the original poster but also for yourself.

Message 8 of 21

@ErnieH wrote:

It is considered bad form to critique someone VI's unless they ask you, which I did not.


My comment was addressed to the original poster to help him make an informed decision.


@ErnieH wrote:

 the difference is not noticeable by the end user.


The much more complicated diagram that is also difficult to follow and debug is highly noticeable to the end user. It gives significantly more places for bugs to hide. 😄

 

The solution should not depend on the size of the inputs. It should be sufficient to teach/learn one efficient algorithm that works well for all sizes of a given problem, especially if it is also simpler and easier to implement. Starting with a poor, overly complicated algorithm that works for smallish problems is highly inefficient. If it later needs to be scaled up, everything would need to be designed from scratch again. Bad coding habits are hard to break.

 

Inefficient algorithms are sometimes OK if they result in significantly simpler code and speed does not matter. This wasn't the case here.

Message 9 of 21

Everything should be challenged, including unnecessarily rude and condescending remarks? I read a lot but don't post often. This is an example of why that is the case. My mistake was posting to begin with.

Message 10 of 21