03-26-2019 01:52 PM
Hi Folks,
Is anyone using neural network, machine learning or deep learning libraries on their Linux RT cRIO? If so, what libraries are you using? How well is it working? Is it giving you problems?
Thanks!
04-09-2019 03:30 PM
Hey cgibson,
Anecdotally, from my role, I have not heard of an NI Linux RT cRIO being used to run neural networks. This is most definitely not all-encompassing, but I wanted to at least give you my perspective.
Thanks,
Andy
Product Support Engineer
04-15-2019 08:06 AM
Thanks Andy!
12-10-2019 10:53 AM - last edited on 05-09-2024 04:17 PM by Content Cleaner
Hi cgibson,
I just found your post.
DeepLTK (Deep Learning Toolkit for LabVIEW) supports deployment of pretrained neural networks for inference on NI's RT targets.
@NITN - https://www.ni.com/en-us/shop/product/deep-learning-toolkit-for-labview.html
@ngene - https://www.ngene.co/deep-learning-toolkit-for-labview
You can train a model on a desktop PC (and use GPUs to accelerate the process) and deploy the model for inference on the RT targets.
The toolkit comes with an MNIST classification example designed for this purpose, which can be found at:
C:\Program Files (x86)\National Instruments\LabVIEW 2016\examples\Ngene\Deep Learning Toolkit\MNIST(RT_Deployment)
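Since DeepLTK's API is a set of LabVIEW VIs (graphical code), it can't be shown as text here, so the snippet below is not the toolkit's actual interface. It is just a minimal NumPy sketch of the same workflow: train a small network on the desktop PC, save the weights, then load them in a separate step and run only the forward pass, which is essentially what deploying for inference on the RT side amounts to. All names, shapes, and the synthetic data are made up for illustration.

# Conceptual sketch only -- not the DeepLTK API, which is LabVIEW-based.
# Illustrates: train on a desktop PC, export weights, then run inference-only
# from the saved weights (the role of the RT target).
import numpy as np

# ---------- "Desktop PC" side: train a tiny 2-layer MLP on synthetic data ----------
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16)).astype(np.float32)        # stand-in for real training data
y = (X[:, 0] + X[:, 1] > 0).astype(np.float32)[:, None]   # toy binary label

W1 = rng.normal(scale=0.1, size=(16, 32)).astype(np.float32); b1 = np.zeros((32,), np.float32)
W2 = rng.normal(scale=0.1, size=(32, 1)).astype(np.float32);  b2 = np.zeros((1,), np.float32)

def forward(x, W1, b1, W2, b2):
    h = np.maximum(x @ W1 + b1, 0.0)                 # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2))), h   # sigmoid output, plus hidden activations

lr = 0.1
for _ in range(200):                                  # plain gradient-descent training loop
    p, h = forward(X, W1, b1, W2, b2)
    grad_out = (p - y) / len(X)                       # dLoss/dlogit for sigmoid + BCE
    gW2 = h.T @ grad_out;  gb2 = grad_out.sum(0)
    grad_h = (grad_out @ W2.T) * (h > 0)
    gW1 = X.T @ grad_h;    gb1 = grad_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

np.savez("model_weights.npz", W1=W1, b1=b1, W2=W2, b2=b2)  # "export" the trained model

# ---------- "RT target" side: load the saved weights and run inference only ----------
w = np.load("model_weights.npz")
probs, _ = forward(X[:5], w["W1"], w["b1"], w["W2"], w["b2"])
print("predictions:", (probs > 0.5).astype(int).ravel())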
Also, we recently released an FPGA add-on to the toolkit, which allows you to use the FPGA for inference acceleration.
@NITN - https://www.ni.com/en-us/shop/product/fpga-add-on-for-deep-learning-toolkit-for-labview.html
@ngene - https://www.ngene.co/dnn-accelerator-for-fpga
It can achieve up to a 10x speedup compared to the CPU of the same target.
Currently it supports the NI sbRIO-9607 and IC-3173.
Hope this is helpful.
Please let me know if you have any questions.