DIAdem Idea Exchange


When converting large TTL data log files (approximately 25 GB or more) in DIAdem with the CAN bus converter, users have reported that the application freezes during the conversion process.

Furthermore, the Task Manager may show DIAdem as not responding, which normally suggests that the software has crashed. This lack of feedback makes it hard to tell whether DIAdem is still working correctly.

However, despite the apparent freeze, the conversion process eventually completes successfully after approximately 30 minutes.

 

To improve the user experience, it would be beneficial for DIAdem to provide clearer feedback during large file conversions, such as a progress indicator or status updates. This would help users understand that the software is indeed working and prevent unnecessary concern about crashes.

 

Thanks in advance!

EuroNCAP has introduced a new injury criterion for the THOR dummy, which would be very handy to have as a native function in DIAdem through the Crash Analysis Toolkit.

 

https://cdn.euroncap.com/media/67886/tb-035-brain-injury-calculation-v10.pdf

Lately there has been a need by many individuals to shift data for one reason or another. Perhaps the data was collected without a trigger to sync everything, or simply with an unavoidable delay.

 

Would R&D look into a function that could mark two points in a customer's data and then align/shift the data so that one data set can be compared against another?
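A two-point alignment like this is essentially a sample shift. As a rough illustration of the idea only (DIAdem scripts are VBScript; this is a Python sketch with a hypothetical function name, not a DIAdem API):

```python
import numpy as np

def shift_to_align(y, src_idx, dst_idx):
    """Shift channel y so the sample at src_idx lands at dst_idx.

    Vacated samples are padded with NaN (DIAdem's NOVALUE equivalent).
    """
    y = np.asarray(y, float)
    out = np.full(len(y), np.nan)
    d = dst_idx - src_idx
    if d >= 0:
        out[d:] = y[:len(y) - d]   # shift right
    else:
        out[:d] = y[-d:]           # shift left
    return out
```

With a marked point in each data set, the shift is just the index difference between the two marks; the shifted channel can then be overlaid on the reference for comparison.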

 

Thanks!

Please include an 'Integrate' (or 'Integral') feature for a specific definition range (x-axis) in Analysis / Statistics / Descriptive statistics, as the Integrate feature in Analysis / Basic Mathematics only supports integrating whole channels.
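For reference, the requested range-limited integral amounts to a trapezoidal sum over the samples that fall inside the x-range. A minimal Python sketch of the idea (hypothetical function name, not a DIAdem function):

```python
import numpy as np

def integrate_range(x, y, x0, x1):
    """Trapezoidal integral of y over the x-range [x0, x1] (sample-aligned)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    m = (x >= x0) & (x <= x1)          # keep only samples inside the range
    xs, ys = x[m], y[m]
    return np.sum(0.5 * (ys[1:] + ys[:-1]) * np.diff(xs))
```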

Hi,

 

I noticed channels can be dragged and dropped everywhere but in the calculator. The channel list is quite OK when you have a few channels in the Data Portal, but it quickly becomes a waste of time on bigger projects. Allowing channels to be dragged into the calculator would make it faster to use and also a bit more user-friendly.

storethere.jpg

Many DIAdem ANALYSIS functions offer to store the result in the original channel. If this option is not chosen, the result goes into the default group.

My suggestion is to add another checkbox to choose storing the result in the group of origin.

Then, if I run an analysis over 25 groups (a typical number in R&R studies), each group will contain the calculation results of its own source data.

OK, I can call set default group. But still, I'd like to be able to use that feature 🙂

 

Michael

 

 

I am sure this is far from a 'new idea'. I have been using DIAdem 11.1 and have not looked at the new features of 2010 yet. The new idea: use LabVIEW code to make scripts. I know LabVIEW very well; that is how I make the TDMS files. But the idea of data analysis or automation comes to a standstill when you tell me I need to learn VB to manipulate or work with the data I just gathered, beyond the canned toolsets already in DIAdem. DIAdem needs to be like TestStand in this regard: you can use different code types with TestStand, and it works. And this needs to work in all the different categories of scripting or automation (NAVIGATOR, VIEW, ANALYSIS, etc.). Any software package NI creates should have LabVIEW as the center point and always provide seamless integration of LabVIEW code. I think that comes down to plain-jane good marketing strategy.

The idea is to provide a "Channel concatenation" button in the default Analysis panel > Channel Functions.

After clicking this button, a dialog box should open, where the various Channels of different Channel Groups can be selected for the concatenation process. In the same dialog box, some options for the channel concatenation could be selected as well.

At the moment, some example scripts for concatenating N channels in M groups can be found on the discussion board. However, these scripts are quite cumbersome, not easy to customize, and provide many optional features aimed at addressing shortcomings in the time or data channels to be concatenated.
It is good practice, however, to first correct the issues in your time and data channels that prevent them from being concatenated properly.

The channel concatenation function would be a very useful feature, especially for engineers working with the large datasets that are increasingly recorded by contemporary data acquisition systems.
A typical example is vehicle CAN network data recorded over an extended period of time (weeks, months).
Usually, these data are divided into multiple files, which then need to be concatenated to cover a selected recording period prior to further analysis.
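The core of such a concatenation (append the data, continue the time base across file boundaries) can be sketched as follows. This is illustrative Python with assumed inputs (each segment's time channel starting at zero), not DIAdem's scripting API:

```python
import numpy as np

def concat_channels(segments):
    """Concatenate (time, data) segments into one continuous channel pair.

    Each segment's time channel is assumed to start at zero; the sample
    step of each segment is estimated from its own time channel.
    """
    t_all, y_all = [], []
    offset = 0.0
    for t, y in segments:
        t = np.asarray(t, float)
        t_all.append(t - t[0] + offset)          # continue the time base
        y_all.append(np.asarray(y, float))
        dt = np.median(np.diff(t)) if len(t) > 1 else 0.0
        offset = t_all[-1][-1] + dt              # next segment starts one step later
    return np.concatenate(t_all), np.concatenate(y_all)
```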

 

I was using an “assignment” channel as the “ChnEventTrueChn” for “ChnEventCreateFilteredTrueChn” and was really hoping the result channel would also be an assignment channel, but no, it’s not.  

 

My original assignment channel is matched up nicely with some text values, so it would be really helpful to have those values “carry” if I execute a function or calculation on the numeric channel.

 

FYI, Brad Turpin provided a workaround until this is implemented that works just fine for me:

Clone your assignment channel and use the clone for the ResultChannel channel in the “ChnEventCreateFilteredTrueChn”  function.

 

' Clone the assignment channel so the result channel keeps the text assignments.
' ChnEventList and ChnEventFalseValue are assumed to be defined earlier in the script.
Set OldGroup = Data.Root.ChannelGroups(1)
Set NewGroup = Data.Root.ChannelGroups(2)
Set OldAssignChannel = OldGroup.Channels(1)
Set NewAssignChannel = NewGroup.Channels.AddChannel(OldAssignChannel)
Call ChnEventCreateFilteredTrueChn(NewAssignChannel, ChnEventList, OldAssignChannel, ChnEventFalseValue)

 

 

It would be fantastic if you would develop your own statistics package, or put a wrapper around an existing public-domain one, and greatly expand the statistics currently offered in the product. Evaluating the statistical significance of data seems a natural and powerful feature to include as part of data analysis.

Hi all,

sometimes it is an advantage for data manipulation to have access to an index channel (a channel simply containing the values 1 to n) generated from a channel of length n.

Such a channel is helpful when an event search has to be performed on a certain number of values or on a certain block of data in a channel.

This function could be integrated in the current function "Generate Numeric Channel".
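The requested channel is just the sequence 1..n alongside the data channel. A Python sketch of the idea and of how it would restrict processing to a block of values (names are illustrative, not DIAdem functions):

```python
import numpy as np

def index_channel(n):
    """Index channel: values 1..n, parallel to a data channel of length n."""
    return np.arange(1, n + 1)

def block_of(data, idx, first, last):
    """Select the samples whose index lies in [first, last]."""
    data = np.asarray(data, float)
    return data[(idx >= first) & (idx <= last)]
```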

Best regards

 

 

 

Add a function to set up live measurements of curve data between cursors. As the cursors move, the function uses the new curve data between the cursors to immediately update the measurement field(s). This function comes standard on most advanced digital oscilloscopes, which usually have a measurement screen where parameters are entered for the specific measurement desired. My immediate desire was just to measure RMS between vertical cursors, but by no means should this function be limited to one measurement type.
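The underlying measurement itself is simple; what is being requested is the live re-evaluation as the cursors move. A Python sketch of the RMS-between-cursors case (illustrative only, not a DIAdem API):

```python
import numpy as np

def rms_between(y, i0, i1):
    """RMS of channel y between two cursor indices (inclusive)."""
    seg = np.asarray(y, float)[i0:i1 + 1]
    return np.sqrt(np.mean(seg ** 2))
```

In a live implementation this would simply be re-run on every cursor-move event, and other measurement types (peak-to-peak, mean, frequency, ...) would follow the same pattern.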

I'm new to DIAdem, so this may be an option that I've not discovered yet...

 

I have very large data files which take a couple of seconds to display on a chart. Annoyingly, if I want to add an extra plot to an existing graph, DIAdem redraws all the graphs, not just the new data. I have my own LabVIEW code which only plots new data while leaving the old plots on the graph, and it really speeds up the process - can you do something similar in DIAdem?

 

 

Whatever method is used in "Calculate Circle Approximation" does not do well unless there are points around an arc close to 360°. I suspect the Kasa method, or something similar, is used (link to Matlab code: http://blogs.mathworks.com/pick/2008/03/14/fitting-a-circle-easily/). This method works well if there are points all around the 360° of a circle, but not so well if the data only covers an arc. If you go to Chernov's page (http://www.math.uab.edu/~chernov/cl/) you can find a link to Matlab code for the Pratt method (http://www.math.uab.edu/~chernov/cl/MATLABcircle.html), which works great for arcs. Although I have not tested it, I suspect the Pratt method works well for data distributed across 360° as well. If that is the case, I would recommend that you either add an "arc fitting" function using the Pratt method or replace the method used in "Calculate Circle Approximation" with it.

 

Attached is an image of an example of latitude and longitude data.  Note that, because this is lat & lon data, the circles look like ovals, but they really are circles.  I was trying to fit a circle to the red curve between the two vertical black cursor lines.  The green circle was obtained by using the "Calculate Circle Approximation" function while the blue circle was calculated via the Pratt Method.
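For illustration, here is a minimal Python sketch of the Pratt algebraic circle fit described above, in its eigenvector formulation. The function name and details are mine; Chernov's MATLAB code linked above should be treated as the reference implementation:

```python
import numpy as np

def pratt_circle_fit(x, y):
    """Algebraic (Pratt) circle fit; behaves well on short arcs.

    Fits A*(x^2+y^2) + B*x + C*y + D = 0 subject to the Pratt
    constraint B^2 + C^2 - 4*A*D = 1. Returns (xc, yc, r).
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xm, ym = x.mean(), y.mean()            # center data for conditioning
    u, v = x - xm, y - ym
    z = u * u + v * v
    M = np.column_stack([z, u, v, np.ones_like(u)])
    # Pratt constraint matrix for B^2 + C^2 - 4*A*D
    Bc = np.array([[0.0, 0.0, 0.0, -2.0],
                   [0.0, 1.0, 0.0,  0.0],
                   [0.0, 0.0, 1.0,  0.0],
                   [-2.0, 0.0, 0.0, 0.0]])
    S = M.T @ M
    evals, evecs = np.linalg.eig(np.linalg.inv(Bc) @ S)
    evals = evals.real
    # the Pratt solution is the eigenvector of the smallest non-negative eigenvalue
    tol = -1e-9 * np.abs(evals).max()
    idx = np.where(evals >= tol)[0]
    k = idx[np.argmin(evals[idx])]
    a, b, c, d = evecs[:, k].real
    xc = -b / (2 * a) + xm
    yc = -c / (2 * a) + ym
    r = np.sqrt(b * b + c * c - 4 * a * d) / (2 * abs(a))
    return xc, yc, r
```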

Analyze the timing and transitions of single or multiple pulses, like the LabVIEW Express VI:

 

hongcc1_1-1733149290329.png

Or a quick analysis to calculate the rise and fall times from 10% to 90% of the peak and vice versa:

hongcc1_2-1733150037868.png

Or a step response analysis like MATLAB's stepinfo, to get more information about the rising step such as settling time, peak time, overshoot, undershoot, etc.

 

This would be very handy for quickly understanding a signal whose waveform was acquired from other LabVIEW+ software such as FlexLogger and InstrumentStudio.
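As an illustration of the 10%-90% rise-time case, here is a minimal Python sketch. It assumes a step that rises monotonically between its minimum and maximum; the function name and details are mine, not from any of the tools mentioned above:

```python
import numpy as np

def rise_time_10_90(t, y):
    """Time for a rising signal to go from 10% to 90% of its span."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    lo, hi = y.min(), y.max()

    def crossing(level):
        i = np.argmax(y >= level)            # first sample at/above level
        if i == 0:
            return t[0]
        # linear interpolation between the bracketing samples
        f = (level - y[i - 1]) / (y[i] - y[i - 1])
        return t[i - 1] + f * (t[i] - t[i - 1])

    y10 = lo + 0.1 * (hi - lo)
    y90 = lo + 0.9 * (hi - lo)
    return crossing(y90) - crossing(y10)
```

Fall time is the same computation on the time-reversed signal, and overshoot/settling-time metrics would build on the same level-crossing helper.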



I'm a student in my last year of studies and I'm not very good at working with scripts.

I work with DIAdem to analyze my real-time data, and when I import the Excel files (one year of data gathered in separate Excel files) into DIAdem, my columns automatically change to text format. So I guess the solution is to write a script to convert my channels to numeric format?

  Capture.PNG

Many functions in DIAdem seem to be biased towards waveform channels, ignoring the classical ways of working with standalone numeric channels.

 

The ChnResampleFreqBased function returns waveforms when I think it should have the option to return standard numeric channels. My workaround is not to use it at all and to use the channel generation and linear mapping functions instead.

 

This obviates the need to reformat channels (wherein you also get a new X channel that you might not have wanted) and is less code. But why have to use two functions to get the same output that could have come from the one resampling function were it to have the option to provide numeric channels?
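The two-function workaround described above (generate a uniform X channel, then linearly map the data onto it) can be sketched in Python as follows. This is illustrative only, not the DIAdem functions themselves:

```python
import numpy as np

def resample_numeric(x, y, dx):
    """Generate a uniform X channel with step dx and linearly map y onto it.

    Returns plain numeric channels (x_new, y_new), not a waveform.
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    x_new = np.arange(x[0], x[-1] + 0.5 * dx, dx)   # generated X channel
    y_new = np.interp(x_new, x, y)                  # linear mapping
    return x_new, y_new
```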

 

Regards.

Today I was trying to find the SCRIPT command for "Converting Numeric Channels into XY-Channels" and could not easily find it or record a script to discover it. So two requests: (1) when recording a script, output the script, or show a message that says 'sorry!'; and (2) update the help with the correct script functions, similar to the other channel conversion functions.

 

I tried to record a script, executing the ANALYSIS function "Channel Functions -> Channels <-> XY Channels", but nothing showed up in the recorded script.  

 

Looking up the help reference for "Channels <-> XY-Channels" only shows how to display the dialog, Call SUDDlgShow("Main", ResourceDrv & "AnaChnToXYChn")


All of the other "channel conversion" help documents show how to do this in SCRIPT; for example, "Numeric Channels <-> Complex Channels" shows "ChnConvertNumericToComplex", and similarly for Numeric to Time, Waveform, etc. My proposal is below:

 

Script Call (today, dialog only):

Call SUDDlgShow("Main", ResourceDrv & "AnaChnToXYChn")

Proposed script equivalent:

Set Group = Data.Root.ChannelGroups(1)
Set XChannel = Group.Channels(1)
Set YChannel = Group.Channels(2)
Set YChannel.XRelation = XChannel


 

 

Screenshot_6.png

 

The idea is to have a description of the channels being used when doing this type of analysis, and of the order in which the values are processed, mainly because in this type of function the order of the channels makes a difference in the coefficients.

Screenshot_7.png

 

I would like a tool that would help identify noise spikes in my data, so that they may be set to NO VALUE or carefully interpolated.  In the attached example, I have two torque spikes in my data that are significantly higher than the mean.  If I were only looking at one file, I would simply "Set Flags" in VIEW, then "Flags: Remove Data Points". However, I am reviewing dozens of files with several channels of interest.  

 

Peak data points are the most damaging to the components under test, so it is important that we keep the real events and reject the noise.  

 

Concerns with other peak / spike removal options:

  • Manual Set Flags -> Flags:  Remove Data Points; time required
  • ANALYSIS -> Event Search -> Event = Window -> Upper Limit:  In other data files, I have real events with values higher than the noise in this data file.  
  • ANALYSIS -> Digital Filters or Smoothing:  This will change all of the data - it will likely narrow the peak-to-peak of my other data and interpolate my noise spikes, adding damage that is not actually real.  I only want to remove a few data points in the channel data. 

nCode GlyphWorks has a nice spike detection tool that offers 6 algorithms for detecting spikes. Once it identifies them, it lets the user review them before performing any action - this is important, because it allows the user to decide whether the spikes are real events or noise before taking action.
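One of the simpler algorithms such a tool could offer is a Hampel-style filter: flag a point when it deviates from the rolling median by more than k scaled MADs. A Python sketch (illustrative names; not GlyphWorks' or DIAdem's implementation), flagging spikes for review rather than removing them outright:

```python
import numpy as np

def hampel_flags(y, half_window=2, k=3.0):
    """Flag samples more than k scaled-MADs away from the rolling median."""
    y = np.asarray(y, float)
    n = len(y)
    flags = np.zeros(n, dtype=bool)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        win = y[lo:hi]
        med = np.median(win)
        mad = 1.4826 * np.median(np.abs(win - med))  # robust spread estimate
        if np.abs(y[i] - med) > k * mad:
            flags[i] = True
    return flags
```

After inspection, the flagged points could then be set to NO VALUE or carefully interpolated, which matches the review-before-action workflow described above.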
