NI VeriStand Add-Ons Discussions


Engine Simulation Custom Device Feedback

The Engine Simulation Custom Device only reads the name of the *.lvbitx file from the associated *.fpgaconfig file when the device is first added to the system definition. This is performed by the "Custom Device, Engine Simulation Initialization VI". If you later change the bitfile name in the *.fpgaconfig file, the new *.lvbitx file name is not propagated to the System Definition file. There are two workarounds:

Option #1: Delete the Engine Simulation Custom Device from the System Definition, add it back, and then redo any System Mapping configuration.

Option #2: Edit the System Definition file (*.nivssdf) manually to update the "Dependent File Type" and "RTDestination" paths so they point to the updated *.lvbitx file names.
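For what it's worth, Option #2 can be scripted. Below is a rough Python sketch (the function and file names are my own, not part of the custom device): since a *.nivssdf file is XML text and the exact element layout varies, it just does a whole-file substitution of the old bitfile name and keeps a backup. Inspect your own system definition first before trusting this.

```python
from pathlib import Path

def update_bitfile_refs(nivssdf, old_bitfile, new_bitfile):
    """Replace every reference to old_bitfile with new_bitfile in a
    VeriStand system definition (.nivssdf files are XML text)."""
    path = Path(nivssdf)
    text = path.read_text(encoding="utf-8")
    if old_bitfile not in text:
        raise ValueError(f"{old_bitfile!r} not found in {nivssdf}")
    # Keep a backup copy of the original file before rewriting it
    Path(str(path) + ".bak").write_text(text, encoding="utf-8")
    path.write_text(text.replace(old_bitfile, new_bitfile), encoding="utf-8")
```

This updates every occurrence at once, which covers both the "Dependent File Type" and "RTDestination" entries without having to locate each tag by hand.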

Todd - Digalog Systems Inc.

0 Kudos
Message 71 of 247
(1,688 Views)

Yes.

Any update on debugging the degrees per tick control?

Stephen B
Message 72 of 247

Hi,

I am new to the VeriStand/FPGA setup. I want to run crank/cam generation on my FPGA. I removed the fuel injection loop from my VI and compiled it after assigning it to the I/O on my FPGA. Do I need to change the XML file to run this in VeriStand?

I am using the default settings as specified for crank/cam.

Thanks

Message 73 of 247

isabella2,

Have you seen the getting started documentation for the NI VeriStand Add-on: Engine Simulation Custom Device?

Stephen B
Message 74 of 247

Hi Stephen,

I followed the instructions in the document to change the XML file, but I am not able to generate a crank signal. I have the same question as ToddK:

How does the Engine Simulation custom device pass the Engine Speed value into the FPGA code?

How do I define Engine Speed so VeriStand can pass that data to the FPGA?

Thank you

Message 75 of 247

The custom device writes the RPM to the FPGA through a U64 register named "APU.Degrees per Tick". The RPM value is provided to the custom device as an input channel under the APU item in the navigation tree; make sure you write a value to this channel.

To debug an issue with your FPGA code, I suggest the same steps I gave Toddk:

  1. Run the NIVS project as is
  2. Open up your source LabVIEW project (the one that you made the FPGA bitfile with)
  3. Verify the IP address is correct on the PXI RT target and the RIO Resource name is filled in on the FPGA device (RIO0, RIO1, etc., in the properties of this item)
  4. Then just open up your FPGA VI and click run. It will connect to the running FPGA and allow you to see its front panel.

Then you can look at the front panel and see if it is getting the expected values. For example, try changing the RPM and see if "APU.Degrees per Tick" changes.
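In case it helps to sanity-check the register, here's a rough Python sketch of the arithmetic. Note the assumptions are mine, not from the custom device: I'm assuming a 40 MHz FPGA timebase and that the U64 encodes degrees per tick as a fixed-point number with an assumed number of fractional bits — check your FPGA VI for the actual clock rate and scaling.

```python
FPGA_CLOCK_HZ = 40_000_000   # assumed FPGA timebase; check your FPGA VI
FRACTIONAL_BITS = 32         # assumed fixed-point scaling of the U64 register

def degrees_per_tick(rpm, clock_hz=FPGA_CLOCK_HZ):
    """Crank degrees the engine advances per FPGA clock tick."""
    degrees_per_second = rpm * 360.0 / 60.0   # RPM -> crank deg/s
    return degrees_per_second / clock_hz

def as_u64(deg_per_tick, frac_bits=FRACTIONAL_BITS):
    """Encode as an unsigned fixed-point integer (assumed format)."""
    return int(round(deg_per_tick * (1 << frac_bits)))
```

For example, 3000 RPM is 18000 deg/s, i.e. 0.00045 degrees per 25 ns tick, so the register value should move visibly as you turn the engine speed knob.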

Stephen B
Message 76 of 247

Stephen,

The hardware I am using is a 7851R FPGA card with PXI data acquisition cards. I don't have a real-time controller card, so I won't be able to use the debug method you suggested.

Is there an example FPGA config file for engine simulation available, with all the controls/scale/offset etc.?

I have been using this document http://zone.ni.com/reference/en-XX/help/372846C-01/veristandmerge/creating_custom_fpga_configuration... for FPGA config.

Thank you

Message 77 of 247

That's OK, you don't need a real-time controller. Just follow the steps as listed, skipping the real-time part. This is a much better debugging method than using an FPGA configuration file to loop the values back into NI VeriStand.

To debug an issue with your FPGA code, I suggest the same steps I gave Toddk:

  1. Run the NIVS project as is
  2. Open up your source LabVIEW project (the one that you made the FPGA bitfile with)
  3. Verify the RIO Resource name is filled in on the FPGA device (RIO0, RIO1, etc., in the properties of this item) and that the device is under My Computer
  4. Then just open up your FPGA VI and click run. It will connect to the running FPGA and allow you to see its front panel.
Stephen B
Message 78 of 247

Hi Stephen,

I am not able to run the Engine Simulation model in simulation-only mode. The steps I followed are:

  1. Compile the model (without the DMA packet loop) to generate the bitfile.
  2. In the VeriStand project, add the Engine Simulation custom device and select the bitfile.
  3. On the VeriStand screen, connect Engine Speed from the APU to an Engine Speed control dial.

Are there any steps that I am missing?

H/W: 7851R FPGA card with PXI data acquisition cards; S/W: VeriStand 2011.

Thank you

Message 79 of 247

Isabella2,

OK, so you created a bitfile that has just the engine simulation inside it, and you added the custom device and selected this bitfile. FYI, you didn't have to do this (you can just debug your original bitfile and setup), but we can work from here. Please follow these steps as posted above.

To debug an issue with your FPGA code

  1. Run the NIVS project as is
  2. Open up your source LabVIEW project (the one that you made the FPGA bitfile with)
  3. Verify the RIO Resource name is filled in on the FPGA device (RIO0, RIO1, etc., in the properties of this item) and that the device is under My Computer
  4. Then just open up your FPGA VI and click run. It will connect to the running FPGA and allow you to see its front panel.

Verify that the values on the front panel are what you expect. Try turning the engine speed knob on the Workspace and see if the FPGA front panel updates.

Stephen B
Message 80 of 247