
Error 2 when reading the code


@cstorey wrote:

I've never done truly high-speed measurements with a Keithley 2000, but the theory is always the same: use the instrument to store as many readings as possible, taken as fast as possible, in its internal memory. Then, once the measurement is complete, transmit the readings to the PC.

 

A quick glance at the manual - https://www.slac.stanford.edu/grp/cd/soft/epics/site/gpib/2002_900_01D.pdf - shows two good examples of how to do this. It also mentions that the scanner cards only have two high-speed channels, 5 and 10. Are you using the scanner and those channels?

 

The examples on pages G-12 and G-14 show the use of internal memory and how to read back the memory, or the statistics on it, after the measurement.

 

Post your snippet when complete. (Snippet How To)

 

Craig


Close enough to get a kudos.


"Should be" isn't "Is" -Jay
Message 11 of 16

For speed, first of all, move the file read/write outside the loop.

(With only one command in between, it may not have much effect.)

 

Another thing: it is better to make the "bytes to read" input a variable.

If you set it to read 1000 bytes, LabVIEW will wait until it has received 1000 bytes or a timeout occurs, so the timeout can cause the long run time.

 

Try my VI and see if it helps.

Please mark this post as the answer or give it a kudo if it helps you.


Message 12 of 16

Hi everyone!

 

First of all, I would like to thank you all for your valuable advice and possible solutions.

For the last two days I have been trying to implement them to see if they help.

Here is a summary of what I have done and the issues I have encountered.

 

LVNinja advised:

smallboy31mailru_0-1686910887864.png

I incorporated this; however, the FOR loop was just as slow as the WHILE loop.

 

Craig (cstorey) advised:

In essence, to do buffered measurements with channels 5 and 10; he also pointed me to examples G-12 and G-14 from the 2002 manual.

 

I tried G-12, only to find that I am missing the hardware (the scanner card) needed to replicate that example exactly. So I omitted the lines that specify channels and mimicked the rest of the code.

G-12 SNIPPED ERROR 1073807339 DURING VISA READ.png

As a result, I was receiving error 1073807339 (0xBFFF0015, a VISA timeout) during the VISA Read. I tried adding commands such as *OPC? and *WAI and altering the byte storage, but I was unsuccessful. I also tried changing TRIG:COUN INF to small finite values (such as 5 or 10), but that didn't work either. Am I missing a command to transmit the data, or am I not giving the instrument enough time to record the values? Either way, I have no idea how to incorporate the fix into my VI.
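That error code is the VISA timeout, and it fits both guesses at once: with TRIG:COUN INF the buffer never finishes filling, so TRAC:DATA? has nothing complete to return and the read times out. One hedged way around guessing the timing is to poll the instrument's status model until it reports the buffer full, sketched below in PyVISA (the address is a placeholder, and the buffer-full bit position is an assumption borrowed from similar Keithley meters; check the status-model chapter of the 2002 manual):

```python
import time
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::16::INSTR")  # placeholder address

# Same buffer setup as the earlier sketch: finite count, buffer armed once.
dmm.write("*RST")
dmm.write(":SENS:FUNC 'VOLT:DC'")
dmm.write(":TRIG:COUN 100")
dmm.write(":TRAC:POIN 100")
dmm.write(":TRAC:FEED SENS1")
dmm.write(":TRAC:FEED:CONT NEXT")
dmm.write(":INIT")

# Poll the measurement event register until the buffer-full bit is set.
# NOTE: bit 9 is an assumption borrowed from similar Keithley meters;
# check the status-model chapter of the 2002 manual for the exact bit.
while not int(dmm.query(":STAT:MEAS:EVEN?")) & (1 << 9):
    time.sleep(0.05)  # poll gently instead of hammering the bus

print(dmm.query(":TRAC:DATA?"))  # buffer is complete; transfer is safe
```

The same wait logic can be reproduced in LabVIEW with a small While loop around a VISA Write/Read pair before the final read.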

 

Jumpod advised:

In essence, to read/write the file outside of the loop and to replace my 1000-byte constant on the VISA Read with a "bytes to read" variable. He also very kindly attached a VI where I believe everything he suggested was incorporated.


Unfortunately, I was not able to open your VI, since my LabVIEW version is 13 (old). I could not find the "bytes to read" variable myself, but I managed to reduce the input to 10 bytes and everything still worked (sadly, it didn't improve the acquisition time). I also tried moving the file read/write outside of the loop, but this, as expected, recorded only a single data point instead of the many I am trying to collect continuously. Maybe I didn't understand you correctly and all the answers are in the VI I failed to open; forgive me if so.

 

JÞB advised:

What I was originally planning to use: another mode for acquiring signals with the 2002 multimeter.

 

I was quite successful with Burst mode: my code appeared to work, and very fast! However, this method has a drawback: it cannot collect timestamps along with the corresponding values. I tried to force a command to collect them, but it returned the same number (9.91E+37, the standard SCPI/IEEE-488.2 value for "not a number") to the spreadsheet for all VDC values. I am attaching the architecture below in the hope that someone has an idea of how to "trick the system".

BURST MODE WORKS.png

 

According to the manual, SSTR mode is as fast as Burst and allows timestamps to be collected!

I must admit that this method is hard to work with because, by default, it locks the front display of the 2002 and makes errors hard to see (I had to issue a DCL afterwards to return it to its original state).

I managed to clear the errors one by one, which resulted in the architecture below.

SSTR MODE ERROR 1073807339 DURING VISA READ.png

 

As with the G-12 example, I was receiving error 1073807339 (the VISA timeout) during the VISA Read. And as before, the questions are: am I missing a command to transmit the data, or am I not giving the instrument enough time to record the values? Either way, I have no idea how to incorporate the fix into my VI.

 

I am confident it is one of these two, because my Burst VI also gets this error if I, for instance, increase the point count from 10 to 1000. I think the 2002 must finish post-processing the buffered readings before I ask it to return the data, which makes perfect sense to me.

 

Please let me know if you know how to solve this issue.

 

Once again, thank you for your suggestions, LVNinja, Craig (cstorey), Jumpod and JÞB.

Your help is highly appreciated. 

 

Best regards,

Sergejs

 

P.S.

Sorry for such a long post.

Message 13 of 16

On a side note, none of your sequence structures do anything useful, because execution order is already fully determined by dataflow alone. They just clutter the diagram and prevent certain compiler optimizations. I would suggest removing them.

Message 14 of 16

Another thing: it is better to make the "bytes to read" input a variable.

If you set it to read 1000 bytes, LabVIEW will wait until it has received 1000 bytes or a timeout occurs, so the timeout can cause the long run time.

Try my VI and see if it helps.


This information is incorrect. You might be confusing the serial "Bytes at Port" property with the VISA Read "bytes to read" input. Please see the VISA Read documentation - https://www.ni.com/docs/en-US/bundle/labview/page/lvinstio/visa_read.html

 

"Bytes to read" is just a memory allocation for the maximum amount of data that can be read.  The value specified should be larger than the data to be read, otherwise the reading will terminate before all data has been received.  It does not affect the speed of the reading, and LabVIEW does not wait until it receives that exact number of byte to continue!  It will return fewer bytes that specified if the function reaches the end of the buffer, or there's a termination character, or if a timeout occurs    If your reading on the insrtument is 24bytes and you specify "Bytes to read" as 5000, it reads 24bytes and returns because it hits the end of the reading buffer. 

 

OP - What channel are you using?  What is the speed of that channel?  What is the speed of the measurement that gets stored into memory?  

 

I don't think your BURST mode allows you to record the channel or timestamp. If that info is important, then you should consider using a timed trigger to take measurements at exact intervals, as fast as allowed. I believe there are examples for that in the manual as well. I'll see if I have a free K2000 somewhere and play with some code on Monday.
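In case it's useful, the timed-trigger idea might look like this as a SCPI sequence (again a PyVISA sketch; the interval, counts, and exact trigger-layer command forms are placeholders to verify against the 2002 manual):

```python
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::16::INSTR")  # placeholder address

dmm.write("*RST")
dmm.write(":SENS:FUNC 'VOLT:DC'")
dmm.write(":TRIG:SOUR TIM")        # pace measurements from the internal timer
dmm.write(":TRIG:TIM 0.01")        # 10 ms between readings (placeholder)
dmm.write(":TRIG:COUN 100")        # number of timed readings (placeholder)
dmm.write(":TRAC:POIN 100")        # store them in the buffer as before
dmm.write(":TRAC:FEED SENS1")
dmm.write(":TRAC:FEED:CONT NEXT")
dmm.write(":INIT")
dmm.query("*OPC?")                 # wait for the timed sweep to finish
print(dmm.query(":TRAC:DATA?"))
```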


Craig

Message 15 of 16

Hi Craig,

 

Thank you for getting back to me.

 

I believe I am missing the hardware in the form of the scanner card, which appears to have channels 1-10. Instead, I am using the rear-panel inputs with banana plugs (I could also use the front inputs, but someone told me the rear ones are better).

 

Nevertheless, the acquisition speed of the rear-panel inputs in Burst mode is decent enough for me: I can collect my readings and see a 0.01 V change between them at 5 Hz. Unfortunately, it is impossible to also track the TST (timestamp) in Burst mode, which I mentioned in my previous message and which you have just confirmed - unless there is a way to still force-write the TST in Burst mode. (My Burst VI works, but the COUNT, i.e. the number of points, needs to stay low, since the instrument must finish post-processing the buffered readings before I ask it to write them to file.) I am just not good enough (terribly bad, in fact) with LV to write a better architecture.

 

I very much appreciate your input. Please have a look at my SSTR mode (:AMEThod SSTReam) - it is what I would like to incorporate (according to the manual it can do TST in Full mode), but I am surely missing something in my architecture/code to make it work. I attached it as a snip to my previous message. I can send you the VI file if that would help.

smallboy31mailru_0-1687168480475.png

Page 3-140 in the user manual for the 2002 model (attached).
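While you experiment with SSTR, one hedged alternative: stay in the ordinary buffered mode and ask the 2002 to store timestamps next to the readings, which the FORMat/TRACe subsystems appear to support. The element names and command forms below are assumptions taken from the manual's command summary; verify them against section 3.

```python
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::16::INSTR")   # placeholder address

dmm.write("*RST")
dmm.write(":SENS:FUNC 'VOLT:DC'")
dmm.write(":FORM:ELEM READ,TST")   # return reading + timestamp (assumed form)
dmm.write(":TRAC:TST:FORM ABS")    # absolute buffer timestamps (assumed form)
dmm.write(":TRIG:COUN 100")
dmm.write(":TRAC:POIN 100")
dmm.write(":TRAC:FEED SENS1")
dmm.write(":TRAC:FEED:CONT NEXT")
dmm.write(":INIT")
dmm.query("*OPC?")                 # wait for the acquisition to finish
print(dmm.query(":TRAC:DATA?"))    # value,timestamp pairs (assumed layout)
```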

 

Thanks again for helping me out! 

 

Best regards,

 

Sergejs

Message 16 of 16