07-18-2012 02:07 AM
Hi,
I've run into a strange situation where two arrays of data are bundled into a cluster and wired into an XY graph to produce a simple single-plot graph. The data is correct, matching the values I generate mathematically, right up to the graph's input, yet when it is plotted the X axis is offset by roughly 100 MHz from the frequencies I'm supplying, and the Y-axis data appears to be scaled by a factor of roughly 100. The X-axis values are at least close, but the Y-axis values look like gibberish, even though the correct shape of the plot is visible in the resulting graph.
So basically what I'm asking is: is there a property of the XY graph that I'm not aware of that could cause this sort of data corruption? As I said, the data is correct up to the graph input, so the error must lie within the graph itself.
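For what it's worth, the behavior described sounds less like corrupted data and more like a fixed offset and gain being applied at display time. Here is a minimal sketch of that arithmetic (Python/NumPy purely for illustration; the array names and the 100 MHz / x100 figures are assumptions taken from the description above, not from the actual VI):

```python
import numpy as np

# Data as computed upstream of the graph (hypothetical placeholder values).
freq_hz = np.linspace(1.0e9, 2.0e9, 5)          # X: frequency sweep
power   = np.array([1.0, 2.0, 3.0, 2.0, 1.0])   # Y: computed quantity

# If the graph's X scale carries a fixed offset and its Y scale a
# multiplier, the displayed values shift but the plot shape is unchanged.
x_offset = 100.0e6   # ~100 MHz offset seen on the X axis
y_gain   = 100.0     # ~x100 factor seen on the Y axis

x_displayed = freq_hz + x_offset
y_displayed = power * y_gain

print(x_displayed)   # same spacing as freq_hz, shifted by 100 MHz
print(y_displayed)   # same shape as power, scaled by 100
```

If the data reaching the graph terminal really is correct, something along these lines (a scale offset/multiplier setting or a stray property node acting on the graph's scales, if such settings are in play here) would reproduce exactly this symptom: correct shape, shifted and scaled axis values.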
Unfortunately I can't paste a code snippet, because the code relies on the hardware in our lab to generate the data, but hopefully someone can follow what I'm describing.
Thank you for any help here.
07-19-2012 05:43 PM
Does the offset error still occur when you feed the graph a test data set, such as two arrays of arbitrary constant values?
Could you post a screenshot of the code that processes the data between where it is generated and where it is sent to the graph?
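Not LabVIEW, of course, but as a sketch of the kind of sanity check being suggested (Python, with arbitrary placeholder values standing in for the test arrays):

```python
import numpy as np

# Two small arrays of known values, standing in for the generated data
# (placeholders, not taken from the original code).
x_test = np.array([10.0, 20.0, 30.0, 40.0])
y_test = np.array([1.0,  2.0,  3.0,  4.0])

# Log exactly what is handed to the graph so it can be compared point by
# point with what the graph actually plots.
for x, y in zip(x_test, y_test):
    print(f"point fed to graph: x={x}, y={y}")

# If these printed values are correct but the plotted axes still show an
# offset or a x100 factor, the problem is in the graph's display settings
# rather than in the data path feeding it.
```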
Regards,
Anjelica W.
07-19-2012 06:26 PM - edited 07-19-2012 06:26 PM