12-14-2017 11:05 PM
I have a PCIe-1473R-LX110 FPGA board. I've been using the "10-Tap 8-bit" example to acquire images from my Camera Link camera successfully. Now I would like to do some additional processing on my board. Unfortunately, it looks like most of the NI Vision FPGA functionality is restricted to 1x8-bit pixel bus inputs (at least for the Virtex-5 chip).
How can I convert or "split out" the 8x8-bit pixel bus so I end up with a 1x8-bit pixel bus for processing? I can't just put my camera in 1-tap mode; that would not give me sufficient data rate for my application.
12-15-2017 01:08 AM
I work a lot with the same card.
Serializing the pixel bus is easy, but you might run into data-rate issues.
During acquisition, you write the data into a FIFO.
Then you read the FIFO in a loop that runs at 10 times your camera's pixel clock.
You read 10 pixels from the FIFO once every 10 clocks.
Then you put the pixels on the bus serially, one pixel per clock.
One thing to check in your software is the LX110 temperature.
It might get hot at this data rate.
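Since LabVIEW FPGA is graphical, here is a behavioral sketch in Python of the FIFO serialization scheme described above (the function name and structure are illustrative, not part of any NI API). Each acquisition-loop iteration pushes one tap-wide pixel group into the FIFO; the faster processing loop pops a group once every `taps` ticks and shifts the pixels out one per tick, producing a 1-pixel-wide stream.

```python
from collections import deque

def serialize_pixel_bus(groups, taps=10):
    """Behavioral model of the FIFO-based tap serializer.

    `groups` is a sequence of tap-wide pixel groups, one per camera
    pixel clock. The processing side (which on the FPGA would run at
    >= taps x the camera pixel clock) drains one pixel per fast tick.
    """
    # Acquisition side: one full tap-group enters the FIFO per camera clock.
    fifo = deque(list(g) for g in groups)

    out = []
    # Processing side: pop a group every `taps` ticks, emit one pixel per tick.
    while fifo:
        group = fifo.popleft()
        assert len(group) == taps  # each FIFO entry is one full tap-group
        out.extend(group)
    return out

# Two camera clocks of a 10-tap bus become 20 serial pixels, in order.
stream = serialize_pixel_bus([range(10), range(10, 20)])
assert stream == list(range(20))
```

On the real hardware the two sides are independent loops in different clock domains, with the FIFO providing the crossing; this model only captures the ordering of the data.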
Amit
08-10-2018 05:52 PM
I realize this thread is 9 months old, but I thought I would give it a shot. I have a U8x8 pixel bus on a PCIe-1473R and I would like to use some of the vision processing functions (like 2D convolution), but these can only be run on a U8x1 bus on this FPGA. I don't see how to set up a clock that is 10x the pixel clock (which is 100 MHz); from what I can tell this is not possible. It already has 40 MHz and 100 MHz clocks and won't let me configure a new one. Any ideas?
08-16-2018 10:56 AM - edited 08-16-2018 10:57 AM
There are a couple of things here. You can derive a clock from either the 40 MHz or 100 MHz clock by right-clicking on one of those clocks and choosing "New FPGA Derived Clock". When choosing a frequency for this new clock, you would need to select a value that is 10x (assuming you are using a 10-tap camera) the _camera's_ pixel clock value. This is usually configurable and the Camera Link specification limits the maximum clock speed to 85 MHz.
However, that would put you at 850 MHz, which is probably too fast for the Vision Development Module FPGA IP to successfully meet timing. In order to lower the required clock frequency for your processing loop, you would need to reduce the number of taps coming from the camera and/or lower the camera's pixel clock frequency. This would obviously limit the maximum frame rate of the camera, but I think that is the best way to get things working with the 1473R.
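The clock arithmetic above can be sketched as follows. The 200 MHz figure used below is a hypothetical achievable processing-loop rate on this FPGA, purely for illustration; the actual limit depends on what the Vision IP can close timing at.

```python
# Camera Link spec caps the camera pixel clock at 85 MHz.
CAMERA_LINK_MAX_PIXEL_CLOCK_MHZ = 85.0

def required_loop_clock_mhz(pixel_clock_mhz, taps):
    """Processing-loop clock needed to serialize a taps-wide pixel bus."""
    return pixel_clock_mhz * taps

def max_pixel_clock_mhz(loop_clock_mhz, taps):
    """Highest camera pixel clock a given processing-loop clock can keep up with."""
    return loop_clock_mhz / taps

# A 10-tap camera at the spec maximum would need an 850 MHz loop -- not realistic.
assert required_loop_clock_mhz(CAMERA_LINK_MAX_PIXEL_CLOCK_MHZ, 10) == 850.0

# With a (hypothetical) 200 MHz processing loop, a 10-tap camera's
# pixel clock must be lowered to 20 MHz or less...
assert max_pixel_clock_mhz(200.0, 10) == 20.0
# ...or the tap count reduced: 4 taps would allow up to 50 MHz.
assert max_pixel_clock_mhz(200.0, 4) == 50.0
```

Either knob (fewer taps or a slower pixel clock) trades maximum frame rate for a derivable processing-clock frequency, which is the tradeoff described above.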
--Jared