LabVIEW


Data offset in the plot from an XY graph

Hi,

 

I've got a weird situation where two arrays of data feed a cluster and are dumped into an XY graph for a simple one-plot graph. All of the data at the input of the graph is correct, matching the values I generated mathematically, yet when it is plotted the X-axis is offset by approximately 100 MHz from the frequency I'm supplying, and the Y-axis data appears to be scaled by a multiplier of ~100. The X-axis values are close, but the Y-axis values look like gibberish, even though the correct shape of the plot is visible in the resulting graph.

 

So basically what I'm asking is: is there a property of the XY graph that I'm not aware of that could cause this sort of data corruption? Like I said, the data up to the graph input is correct, so the error must lie within the graph itself.

 

Unfortunately I can't paste in a code snippet, since the VI relies on the hardware in our lab to generate the data, but hopefully someone can understand what I'm describing.

 

Thank you for any help here.

Message 1 of 3

Does the offset error still occur when you read in a test data set, such as two arrays of random constant values?

 

Would you be able to take a screenshot of the code that processes the data after it is generated and sends it to the graph?

 


Regards,

Anjelica W.
National Instruments
Product Marketing Manager
FlexLogger and TestStand
Message 2 of 3

Check the Offset and Multiplier settings for your axes. For an unscaled display they should be 0 and 1, respectively.
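To illustrate why those settings would produce exactly the symptoms described above: each axis displays `offset + multiplier * value`, so a stray X offset of 100 MHz and a Y multiplier of ~100 shift and rescale the plot while preserving its shape. LabVIEW block diagrams can't be shown as text, so here is a minimal Python sketch of that scaling relationship; the function name `apply_axis_scale` is made up for illustration and is not a LabVIEW API.

```python
def apply_axis_scale(values, offset=0.0, multiplier=1.0):
    """Mimic the per-axis Offset/Multiplier scaling a graph applies
    to plotted data: displayed = offset + multiplier * value.
    With the defaults (offset=0, multiplier=1) the raw data is shown."""
    return [offset + multiplier * v for v in values]

# Data wired into the graph (hypothetical example values)
freqs_hz = [1.0e9, 1.1e9, 1.2e9]   # X array: frequencies
amplitudes = [0.5, 0.7, 0.6]       # Y array: measured values

# A non-default X offset of 100 MHz and a Y multiplier of 100
# reproduce the reported behavior: shape intact, values wrong.
x_displayed = apply_axis_scale(freqs_hz, offset=100e6)
y_displayed = apply_axis_scale(amplitudes, multiplier=100.0)

print(x_displayed)  # each X value shifted up by 100 MHz
print(y_displayed)  # each Y value scaled by a factor of 100
```

Setting the offsets back to 0 and the multipliers back to 1 (on the Scale properties of both axes) makes the displayed values match the wired data again.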

 

 

Message 3 of 3