LabVIEW


How granular do you get with your functional requirements?

Sorry if this is a lame question. I've googled it a lot, but I don't know if I'm taking the examples I've seen too literally, and I'm a bit stunned by how long this doc is going to be for a pretty standard program.

 

Which would you go with:

 

  1. The system shall continuously acquire data at 1kHz from 5 pressure sensors (PT 112-116)

 

Or

 

  1. The system shall continuously acquire data at 1kHz from pressure sensor PT 112
  2. The system shall continuously acquire data at 1kHz from pressure sensor PT 113
  3. The system shall continuously acquire data at 1kHz from pressure sensor PT 114
  4. The system shall continuously acquire data at 1kHz from pressure sensor PT 115
  5. The system shall continuously acquire data at 1kHz from pressure sensor PT 116

 

 

The second makes it easier to pass/fail individual requirements. But there are a bunch more things that all of the lines should do, e.g. all lines get logged to file, all lines get displayed, all lines have an alarm level, and soon I'm going to have 20 requirements instead of 5 for this tiny part, and probably 400 requirements rather than 100 for the whole program.
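Just to show the scale I mean, here's a throwaway sketch (Python, only because it's easy to paste into a post; the sensor names and behaviours are just the ones from my example above) of how 5 sensors and 4 behaviours already expand into 20 separate shall statements:

# Purely illustrative: expand a channel list into per-sensor shall statements.
# The sensor names and behaviours are just the ones from my example above.
sensors = ["PT 112", "PT 113", "PT 114", "PT 115", "PT 116"]
behaviours = [
    "continuously acquire data at 1kHz from pressure sensor {s}",
    "log data from pressure sensor {s} to file",
    "display data from pressure sensor {s}",
    "apply an alarm level to pressure sensor {s}",
]

req_id = 1
for template in behaviours:
    for s in sensors:
        print(f"REQ-{req_id:03d}: The system shall {template.format(s=s)}.")
        req_id += 1  # 4 behaviours x 5 sensors = 20 requirements already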

 

I'm trying to make my requirements docs more professional, but I don't work in a software-oriented company, so I'd probably have to defend coming out with something so extremely granular... which I could stand by if I could stop doubting which one I 'should' do.

 

I don't use TestStand; maybe if I did, I'd be able to see which form is most appropriate to plug into it.

 

-------------------
CLD
Message 1 of 21

Apologies that this isn't a question about the code itself, so maybe it belongs on another forum; this just feels like the home forum.

-------------------
CLD
Message 2 of 21

I'm working with a Systems Engineer for the first time in my life.  My team has never done requirements beyond "The system shall perform the acceptance test per document XYZ."  Now we have a very comprehensive set of requirements in Jama.  I hate Jama.

Jim
You're entirely bonkers. But I'll tell you a secret. All the best people are. ~ Alice
For he does not know what will happen; So who can tell him when it will occur? Eccl. 8:7

Message 3 of 21

Version 1 is hard to misinterpret and a lot less to write, so I'd go with that.

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 4 of 21

Aha, I'm glad you mentioned Jama. I came across something suggesting Jama today and was wondering if I should consider using it. Now I see perhaps not. What's the pain with it? Is it one of those systems that adds more work than it saves?

-------------------
CLD
Message 5 of 21

This topic is fascinating to me, and it's something I now realize I haven't had to do in a long time.  At a previous job I would be the one helping others with requirements writing, maybe because I was a stickler, or because I could interpret things more literally than others.  I would often run through test procedures and fail things as they were written, when others would pass them based on their interpretation.

 

As for your direct question, I see no problem with either way of writing it.  But the choice will affect site or factory acceptance testing.

 

If you have a set of requirements where you include all sensors in one line, then to validate that one requirement you need evidence from all 5 sensors.  If any one of them fails, you fail that requirement.  The set of steps in your acceptance test will also need to instruct the user to do the steps for all 5 before validating the requirement.

 

I've also seen it written like this as an example:

 

1. The test software shall measure the output from the pressure input PT 112.

2. If the previously measured input frequency is not 500Hz +/- 10Hz, the tester shall fail the DUT.

3. If the previously measured input duty cycle is not 50% +/- 5%, the tester shall fail the DUT.

4. The test software shall repeat steps 1 through 3 for PT 113, PT 114, PT 115, and PT 116, in place of PT 112.

 

Writing it this way is more convenient when there are many steps and the same steps just need to be repeated for other inputs.  But from a testing perspective it can be more work to validate.  Flattening these steps out over all pressure inputs is easier to test, since each requirement is a single thing to check.  Requirement 4 here is actually a bunch of things, and may take a long time to validate, providing failing inputs for the frequency and duty cycle for so many sensors.  Then again, you can maybe convince the customer that looking at the report and seeing the limits used is enough to validate lines 2 and 3, and that all this really needs to prove is that it is capable of measuring individual readings.  In that case the requirement could be checked in a variety of other ways, like measuring the output with a scope and then seeing that the tester reports the same value.
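To make the flattened version concrete, here is a rough sketch of how it maps to one pass/fail record per sensor.  It's Python/pytest rather than LabVIEW or TestStand, purely to show the shape in text, and the read_frequency_hz and read_duty_cycle_pct hooks are hypothetical stand-ins for whatever the tester actually measures:

# Sketch only: each (sensor, check) pair becomes its own pass/fail record,
# which matches flattening requirement 4 out into per-sensor requirements.
import pytest

SENSORS = ["PT 112", "PT 113", "PT 114", "PT 115", "PT 116"]

def read_frequency_hz(sensor):
    # Hypothetical measurement hook; fixed placeholder so the sketch runs.
    return 500.0

def read_duty_cycle_pct(sensor):
    # Hypothetical measurement hook; fixed placeholder so the sketch runs.
    return 50.0

@pytest.mark.parametrize("sensor", SENSORS)
def test_frequency_within_tolerance(sensor):
    # One record per sensor: 500Hz +/- 10Hz, the limits from the example above.
    assert abs(read_frequency_hz(sensor) - 500.0) <= 10.0

@pytest.mark.parametrize("sensor", SENSORS)
def test_duty_cycle_within_tolerance(sensor):
    # One record per sensor: 50% +/- 5%, likewise from the example above.
    assert abs(read_duty_cycle_pct(sensor) - 50.0) <= 5.0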

 

I have also seen a weird situation where Requirement 4 actually said "The test software should repeat...".  A shall statement needs to be done, and proven to be done through verification.  A should statement, I believe, means the software needs to do it, but there is no requirement to verify that it does.  That would mean we need to prove it worked for PT 112, but for all the others we just trust that the tester did what it should have.

 

Basically every shall statement means lots of extra work.  You need to write the test procedure, and then every time the test procedure is run the operator has lots of extra work to document it.  Then this is repeated for the factory acceptance test (FAT) and the site acceptance test (SAT).  So you should limit your shall statements to things you really do need to prove the system is doing.  In this example there might be another requirement that the pressure sensor should output a 5V signal.  That is something we can put in there but that doesn't need to be verified.  Then again, maybe the customer doesn't care what the sensor is at all, just that the system can measure pressure within some tolerance.  If that is the case, maybe we don't care about the signal type at all and can plug in a dummy pressure sensor, see that the value goes bad when we expect, and see that a golden DUT passes as we expect.

 

There's so much room for interpretation in the intent, and in how to prove a thing does what it needs to.  You can spend a ton of extra time proving the simplest thing, or go in circles forever because of a badly written requirement.  My main advice is to think about how you will test each shall when you write it.  One common requirement I remember struggling with was:

 

The test software shall be written in LabVIEW 2020.

 

Seems fine on the surface, but when I have a built application, how can I prove that?  I can look at which run-time engine Windows loads when I run it, but applications can run in different run-time engines.  I can open up the EXE in a hex editor and try to find the version there, but that can be unreliable.  In the end this is more of a design requirement, and I'd suggest making it a should statement.
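If you do go the hex editor route, something like this is about as far as I would trust it.  It is only a sketch that scans a built EXE for readable strings mentioning LabVIEW, and as noted above it is unreliable because the built application may not embed the string you are hoping for; the file path is just a placeholder:

# Heuristic only: dump printable runs in a built EXE that mention LabVIEW.
import re
import sys

def find_labview_strings(exe_path):
    with open(exe_path, "rb") as f:
        data = f.read()
    # Printable ASCII runs containing "LabVIEW", e.g. possible version strings.
    hits = re.finditer(rb"[ -~]{0,40}LabVIEW[ -~]{0,40}", data)
    return sorted({m.group().decode("ascii", "replace") for m in hits})

if __name__ == "__main__":
    # Usage: python find_labview_strings.py MyBuiltApp.exe (path is a placeholder)
    for s in find_labview_strings(sys.argv[1]):
        print(s)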

 

EDIT: Oh, and NI still has their Requirements Gateway software.  You have it scan your documents for requirements, then it can scan the LabVIEW or TestStand source for where those requirements are fulfilled, and then scan the SAT and FAT documents to see where they are covered.  It is a bit clunky at times, but it is great for checking test coverage.

Message 6 of 21

@Shiv0921 wrote:

Aha, I'm glad you mentioned Jama. I came across something suggesting Jama today and was wondering if I should consider using it. Now I see perhaps not. What's the pain with it? Is it one of those systems that adds more work than it saves?


Jama is probably great.  I'm just not accustomed to working with formal requirements.

Jim
You're entirely bonkers. But I'll tell you a secret. All the best people are. ~ Alice
For he does not know what will happen; So who can tell him when it will occur? Eccl. 8:7

Message 7 of 21

@Hooovahh wrote:

If you have a set of requirements where you include all sensors in one line, then to validate that one requirement you need evidence from all 5 sensors.  If any one of them fails, you fail that requirement.  The set of steps in your acceptance test will also need to instruct the user to do the steps for all 5 before validating the requirement.

 

I've also seen it written like this as an example:

 

1. The test software shall measure the output from the pressure input PT 112.

2. If the previously measured input frequency is not 500Hz +/- 10Hz, the tester shall fail the DUT.

3. If the previously measured input duty cycle is not 50% +/- 5%, the tester shall fail the DUT.

4. The test software shall repeat steps 1 through 3 for PT 113, PT 114, PT 115, and PT 116, in place of PT 112.


Good point on the single line; you can add that the measurements should be tested individually, so basically a 1a-1f.

I prefer requirements written in terms of what the system should do, not what it shouldn't do. Also, you're already (pseudo)programming; let the programmer decide in which order (or in parallel) to test it.

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 8 of 21

@Yamaeda wrote:


Also, you're already (pseudo)programming; let the programmer decide in which order (or in parallel) to test it.


It is a good point that the requirements shouldn't dictate how it is done.  It certainly looks like code, doing a set of steps and then repeating them FOR a set number of inputs.  But I'd still argue that this doesn't define the programming used: whether it is done in a different order, in parallel, or even at different times in the application isn't set by this requirement.  It only states that the software needs to do these things, and that a test will be written to verify that they are done.  How the software does it, and how it is tested, isn't specified here.  But in our case the programmers were often the ones writing the requirements, so maybe I just saw this done because they were already thinking in loops.  I bet they mostly did it to save time and space in the requirements document rather than flattening it out more.

 

EDIT: You know what, maybe I didn't use the right language.  I said:

 

4. The test software shall repeat steps 1 through 3 for PT 113, PT 114, PT 115, and PT 116, in place of PT 112.

 

Maybe what I should have said was:

 

4. The test software shall perform the same steps defined in steps 1 through 3 for each of the remaining sensors: PT 113, PT 114, PT 115, and PT 116.

 

This leaves what the test software does open to interpretation, and hopefully makes it clearer that we are just taking a shortcut in the document.

Message 9 of 21

You could still "write it out" without repetition:

  1. The system shall continuously acquire data at 1kHz from pressure sensor
    1. PT 112
    2. PT 113
    3. PT 114
    4. PT 115
    5. PT 116

At least you'd have traceable requirements (the numbering could be better in this example).
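As a toy illustration of why the traceability matters (the requirement IDs and test IDs below are invented), once each sensor has its own numbered requirement you can keep a simple coverage map and immediately see what has no covering test:

# Invented IDs, just to show the idea of a requirement-to-test coverage check.
requirements = {
    "1.1": "Acquire at 1kHz from PT 112",
    "1.2": "Acquire at 1kHz from PT 113",
    "1.3": "Acquire at 1kHz from PT 114",
    "1.4": "Acquire at 1kHz from PT 115",
    "1.5": "Acquire at 1kHz from PT 116",
}
coverage = {
    "1.1": ["FAT-07", "SAT-03"],
    "1.2": ["FAT-07"],
    # 1.3 .. 1.5 have no covering test yet
}

for req_id, text in requirements.items():
    tests = coverage.get(req_id, [])
    status = ", ".join(tests) if tests else "NOT COVERED"
    print(f"{req_id}  {text}: {status}")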

 

I'd mostly consider how it's going to be tested.

 

I've seen requirements like this:

1. The system shall continuously acquire data at 1kHz from sensors (PT 112-116).

... 

100. The acquired data from pressure sensors (PT 112-116) has a maximum of 10 bar.

... 

200. The acquired data from pressure sensors (PT 112-116) has a minimum of 0 bar.

... 

300. Test acquired data from pressure sensors (PT 112-116) at 20%.

... 

400. Test acquired data from pressure sensors (PT 112-116) at 40%.

... 

500. Test acquired data from pressure sensors (PT 112-116) at 60%.

..

600. Test acquired data from pressure sensors (PT 112-116) at 80%.

..

etc

 

Now at testing time, it's often required (by an often over-anxious commissioning officer) to perform the tests in order. This can waste days, because you have to test PT 112..PT 116 for the frequency, then go back to PT 112..PT 116 to test the maximum, then yet again go over each pressure sensor to test the minimum. It would make a lot more sense to fully test PT 112, then move on to PT 113.
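Here's a toy sketch of that ordering point (the check names are taken loosely from the examples above, nothing more): running every requirement across all sensors before moving to the next requirement means far more switch-overs between sensors than fully testing one sensor at a time:

# Toy comparison of the two test orderings; counts how often you have to
# switch from one sensor to another.
from itertools import product

sensors = ["PT 112", "PT 113", "PT 114", "PT 115", "PT 116"]
checks = ["frequency", "maximum", "minimum", "20%", "40%", "60%", "80%"]

def switchovers(order):
    # Count consecutive steps that move to a different sensor.
    return sum(1 for a, b in zip(order, order[1:]) if a[0] != b[0])

by_requirement = [(s, c) for c, s in product(checks, sensors)]  # all sensors per requirement
by_sensor = [(s, c) for s, c in product(sensors, checks)]       # all checks per sensor

print("switch-overs, requirement-by-requirement:", switchovers(by_requirement))  # 34
print("switch-overs, sensor-by-sensor:", switchovers(by_sensor))                 # 4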

 

Also, it makes more sense to test 100% after 80%, saving more time.

 

I've wasted days performing the first test ("does anything come in?") and later the nth test ("does a 20% signal come in?"). Obviously test 1 is covered by test 2, but you'll never get the time back.

Message 10 of 21