LabVIEW


Implementing CANopen functionality in LabVIEW using only NI-XNET, without NI-Industrial Communications for CANopen

Currently, I am exchanging data (frames) between an accelerometer that communicates over the CANopen protocol and a Windows 11 PC through an NI-9862 module. Sending and receiving itself works, but a problem has come up.
When I receive and process the measurement data (PDO: Process Data Object) sent from the accelerometer in LabVIEW, the number of samples received clearly differs from the number the sensor transmitted. Specifically, regardless of the sampling frequency I configure, the values appear to be automatically adjusted to a constant 1000 sps (samples per second).
This automatic adjustment is not intentional. Is there a setting in LabVIEW or the NI-XNET driver that controls this behavior?
The same phenomenon (automatic adjustment or inflation of the sample count) occurs whether I use Signal Input Waveform, Frame Input, or other session settings.
The wait time is set to 100 ns; making it any shorter causes an overflow.
I wondered whether it might be a synchronization problem, but I am not sure.
I would appreciate any information at all, such as possible causes or reports of similar symptoms.

Message 1 of 3

@mkkkkk wrote:

Currently, I am using an accelerometer that transmits and receives data using the CANopen protocol to send and receive data (frames) on a Windows 11 computer through an NI9862 module. Although I am able to send and receive data, I am having problems.
When I try to receive and process the measurement data (PDO: Process Data Object) sent from the accelerometer in LabVIEW, the number of data received clearly changes from the number of data sent from the sensor. Specifically, regardless of the sampling frequency I set, the value seems to be automatically adjusted to always be 1000 sps (samples per second).
This automatic adjustment is not intentional. Are there any settings in LabVIEW or the NI-XNET driver that can control this automatic adjustment?
This phenomenon seems to be the same regardless of the settings of Signal Input Waveform, Frame Input, etc. (automatic adjustment or increase of values).
The wait time is also set to 100ns, but if it is set any faster than that, an overflow occurs.
I thought it might be a problem with synchronization, but I'm not sure.
I would appreciate any information you can give me, such as the cause or similar symptoms.


The NI-9862 does not support CANopen natively, so I believe you implemented CANopen yourself on top of raw CAN frames.

I will limit my comments to NI-XNET. The XNET Read VI returns data differently for each session mode.

Assuming that the signal has a transmit time of 0.001s in the database:

Depending on what you are trying to do with the data, I can recommend which session suits your application the best.
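One detail worth checking: Signal Input Waveform mode does not return one sample per received frame. It resamples each signal at the session's resample rate (which defaults to 1000 Hz, matching the 1000 sps symptom you describe), holding the last received value between frames. The sketch below is a plain-Python simulation of that sample-and-hold behavior, not XNET API code; the function and variable names are illustrative only:

```python
def resample_hold(frames, resample_rate, duration):
    """Sample-and-hold resampling: at each output tick, emit the value of
    the most recent frame received at or before that tick.
    `frames` is a list of (timestamp_s, value) pairs, sorted by time."""
    out = []
    i = -1  # index of the most recent frame already consumed
    n_ticks = int(duration * resample_rate)
    for k in range(n_ticks):
        t = k / resample_rate
        # advance to the newest frame that arrived at or before tick t
        while i + 1 < len(frames) and frames[i + 1][0] <= t:
            i += 1
        out.append(frames[i][1] if i >= 0 else None)
    return out

# Suppose the sensor transmits PDOs at 500 Hz for 1 s -> 500 real samples.
sensor_rate = 500
frames = [(k / sensor_rate, k) for k in range(sensor_rate)]

# A waveform session resampling at 1000 Hz still yields 1000 values/s,
# with each sensor value duplicated (held for two 1 ms ticks):
wave = resample_hold(frames, resample_rate=1000, duration=1.0)
print(len(frames), len(wave))  # 500 1000
print(wave[:6])                # [0, 0, 1, 1, 2, 2]
```

If you see repeated values in your waveform data, that is a strong hint this resampling is what inflates the sample count; Frame Input Queued and Signal Input XY avoid it because they return one record per received frame.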

-------------------------------------------------------
Control Lead | Intelline Inc
Message 2 of 3

Thank you for your response. To clarify my situation and needs:

  1. My goal is to receive the data transmitted from the sensor, save it to a CSV file, and display a graph on the front panel.
  2. Currently, we have a one-to-one relationship between the sensor and the computer, using asynchronous communication in CANopen.
  3. I've tried using Signal Input Waveform mode to create a graph, but I'm still experiencing the same symptoms (automatic adjustment to 1000 sps). I've also attempted other modes like Frame Input Queued mode and Signal Input XY mode, but the results remain unchanged.
  4. Which session mode do you think would be most suitable for resolving this issue? Do you have any advice on settings or methods to accurately receive and process the data?
  5. If there's any additional information that might be helpful in solving this problem, such as details about our CANopen implementation or database file settings, please let me know.

I appreciate your guidance on how to overcome this automatic adjustment issue and accurately capture the sensor data at the intended sampling rate.
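For the goal in item 1 (one CSV row per PDO actually received), the logging step itself is straightforward once a mode such as Frame Input Queued delivers timestamped frames without resampling. A minimal Python sketch of that step only, with made-up frame data standing in for the XNET read call:

```python
import csv
import io

def frames_to_csv(frames, dest):
    """Write one CSV row per received frame: (timestamp_s, value).
    No resampling -- the row count equals the number of frames read."""
    writer = csv.writer(dest)
    writer.writerow(["timestamp_s", "value"])
    for t, v in frames:
        writer.writerow([f"{t:.6f}", v])

# Stand-in for frames read from a Frame Input Queued session at 500 Hz:
frames = [(k / 500, 0.01 * k) for k in range(500)]
buf = io.StringIO()  # in a real program, open a file instead
frames_to_csv(frames, buf)
rows = buf.getvalue().splitlines()
print(len(rows))  # 501: header + one row per frame
```

If the CSV still shows more rows per second than the sensor's configured rate, the extra samples are being introduced before this step, in the session itself.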

Message 3 of 3