Validation of MISRA?
#13
Hi, MM!

> acm Wrote: I read somewhere (it may have been in Steve McConnell's book about coding) that bug frequency / 1000 lines of code is pretty much independent of the source language. It therefore seems plausible that code written in "MISRA-C" will contain more bugs than the same stuff written in full C. This is the sort of issue that the putative studies should address.


> Bugs per line of code is a good measure. However you have to measure like for like. Some projects have a much lower B/KLOC than others. One would expect MISRA-C compliant code to have a low B/KLOC

Well, would one, now? That's begging the whole point of this thread. :-) Where is the evidence that the (moderate) code bloat mandated by MISRA is innocuous? In the example I gave, it was quite marked, and in my judgement harmful.
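To make the "code bloat" point concrete: here's a toy illustration of my own (the function names and the specific rules shown are my invention, not the example I gave earlier in the thread). MISRA's rules on implicit conversions and essential types tend to turn a one-line expression into something considerably wordier:

```c
#include <stdint.h>

/* Plain C: leans on the usual arithmetic conversions. */
static uint16_t scale_plain(uint8_t x)
{
    return x * 2 + 1;  /* implicit promotion to int, implicit narrowing back */
}

/* MISRA-flavoured C: every conversion spelled out, intermediate
 * result named, at the cost of obvious verbosity. */
static uint16_t scale_misra(uint8_t x)
{
    uint16_t result;
    result = (uint16_t)(((uint16_t)x * (uint16_t)2U) + (uint16_t)1U);
    return result;
}
```

Both compute the same value; the question this thread is asking is whether the second form actually buys fewer bugs per KLOC, or just more KLOC.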

> Now this I must take issue with [time acm spent learning MISRA tool was not well spent]. A good static analyser will, after you have set it up and spent time being trained on it, catch a hell of a lot of bugs. The point is it is far more cost effective to find bugs with a static analyser just after they have been written than working out what just happened in a test/debug session.

I agree, if we're talking about a GOOD static analyser. This needs a good basis to build on. Is MISRA such a basis? The tool I had to learn had a very low "signal/noise ratio" indeed, so much so that the time I've spent using it hasn't been fruitful. I've been using it on old code, though, not new code.
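For the benefit of lurkers, the kind of defect where I'd grant the analyser earns its keep looks something like this (a made-up example of mine, not output from any particular tool):

```c
#include <stddef.h>

/* Zero out len bytes of buf.  The tempting-but-wrong loop bound here
 * is "i <= len", a classic off-by-one that writes one byte past the
 * end of the buffer.  A decent static analyser flags that the moment
 * it's written; found at run time it surfaces only as intermittent
 * corruption in some later test/debug session. */
void zero_buffer(char *buf, size_t len)
{
    for (size_t i = 0; i < len; i++) {
        buf[i] = 0;
    }
}
```

Catching that at write time is cheap; catching it after it has scribbled over an adjacent variable is not. The open question remains whether a MISRA-based checker's warnings are dominated by finds like this or by noise.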

>There are several reports on the telecoms industry that show that good use of static analysis will reduce the bug hunting phase deastically ....

What's deism got to do with it? :-)

Cheers!

--
Alan Mackenzie (Nuremberg, Germany)