Software Complexity Analysis

Status
Not open for further replies.

Noway2 (Electrical) · Apr 15, 2005 · 789 · US
Greetings All,

I am looking for suggestions for software analysis tools to help evaluate and analyze source code, predominantly written in C.

I have been tasked with managing a product development project where the majority of the development will be done by an offshore team. From looking over some of their previous software work and observing the amount of ongoing maintenance required on existing products, I believe a quality improvement is required. One of the challenges I currently face is how to actually specify and quantify the 'quality' of the software and, hopefully, prevent "clever" programming before it occurs.

I have found some free cyclomatic complexity tools, such as CCCC, but I am looking for something a little more professional, and as long as the price isn't exorbitant, I believe I can justify the cost.


I am planning on insisting on a proper code review, a testing process, etc., but one of the primary reasons I am looking for a tool is that it will help make the process more objective. For example, if the tool scores a routine as high risk due to the possible execution paths, it becomes harder for the coder to say "it looks fine to me" or "I don't understand what you mean by it's too complex."

My question to the group is: do any of you have experience using some of these tools, such as Lint or McCabe's, that you would be willing to recommend?

 

How does it score with state machines? They are easy to code, and for those who understand state machines, easy to understand. Those who do not understand state machines find the code incomprehensible.

There are all sorts of coding styles. I like using macros and include files to reduce the amount of coding and the number of errors, but many tools are totally incapable of handling this use of macros and includes and simply fall over.

It might be better to build a cross-section test suite to evaluate the tools with: have a human evaluate the code, then see what the tool thinks. You'd be surprised how often complex techniques with lots of disjointed indirect procedure calls pass as OK while simple techniques like state machines are classified as complex.
 
Lint will tell you if your code contains dangerous constructs, such as pointers that may not be initialized before use, etc., but it is not a tool meant to check for complete path testing (though it will tell you, to the best of its ability, if a path cannot be reached).

Other than that, I probably can't be of much more help.

Dan - Owner
 
XWB, how does what score with state machines? Are you referring to cyclomatic complexity checkers?

From what I understand of how CC checkers work, it would depend on how the state machine was implemented. For example, switch/case structures score high on the complexity scale because of the number of possible paths through the high-level source code, yet their actual complexity at the object-code level is implementation specific.

Macgyvers, thank you for the suggestion. I have been looking at a version of Lint called Splint, which used to be called LCLint. In the few hours I have spent with it, I ran it against one of the source modules that has been troublesome. Not surprisingly, it flagged several potentially suspicious constructs.

Some of what it flags may be considered acceptable, such as ... IF (!DUMMY_VARIABLE) { do something } ... where DUMMY_VARIABLE is not of type BOOL but of INT. Naturally, this is a quick shortcut that just about everyone is guilty of using, but it could be spelled out explicitly, which might be safer, especially if DUMMY_VARIABLE has external scope.

Another thing I noticed it flagging has to do with comparisons of floating-point variables. It doesn't like, for example, comparing Float_var1 > Float_var2 due to the imprecision of floating-point representation. Instead it suggests an epsilon comparison, presumably where you subtract the values and compare the difference against some small tolerance? I am aware of avoiding equality comparisons between float values, but I thought using an inequality was considered safe.

 
Did your company specify unit testing, FQT, etc.?

How did your company handle design reviews of their documentation, ala SDP, SDD, SRS, etc.?

How did your company write their procurement specification? Were the requirements solid at contract award? How much did the requirements evolve during the design phase?

While there may be blame to be laid at the feet of the developers, I generally find that writing a rock-solid and thorough procurement specification is quite non-trivial, and that often you need to start looking there.

Using tools on the product code might be interesting, but it's usually too late to do anything useful by that time.

TTFN

FAQ731-376
 
IRStuff,

You raise a very good point. Quite frankly, I don't have answers to these questions, but from what I have seen I would hazard a guess that if they did handle design reviews and specs, it was minimal at best. I have only been at this company for about four weeks, so I have little knowledge of how things WERE done. I think at least part of the problem is, or was, a lack of experienced resources, which is a large part of why I was brought on board.

In this instance, I am not trying to lay blame at anyone's feet for existing designs. What I am trying to do is exert as much control as I can over the portions of the process I do have influence over, to make things as right as possible in the time I have to devote to the effort. I have been told to "manage" the design and specify how it will be done, but then to let the offshore team perform the work, 'criticize' their efforts, and make them redo it until I am satisfied with the result. I want to avoid that circular rework process and instead focus on getting it right the first time. To that end I plan to lean on specifications as much as possible (thankfully, the specs are one aspect over which I do have dominion), though, as you pointed out, GOOD specs are difficult to write.

From reviewing the existing work I see a lot of 'questionable' practices and overly complex routines coded into existing software that I would like to avoid in the new design. This has resulted in 'fragile' code that has needed a very large number of field updates in a short period of time. I am certain that much of this resulted from improper specs and too little review. I KNOW I am going to be pressured to reuse as much existing software as feasible. What I am after is a tool that will help me quantify what's wrong with those portions of the software AND provide a valid metric for the development team to use as a standard. Even better would be if I could state, up front, in the design specification what will be expected and acceptable.

 
Ideally, if you specify enough stuff on the front end, they'll be less likely to code complex stuff that they can't test.

On the other hand, complexity is hard to control out, since it's an individualistic thing.

If you have documentation on the changes made since delivery, that'll tell you something about the degree of reliability and complexity of any given chunk of code, since the least reliable and most complex chunks probably have the highest number of changes.

There do appear to be a number of tools available that purport to check code. As far as I know, no one at work has been talking about using any of them. However, just because the code has no logical bugs doesn't mean it's correct, which gets back to the specification.

That's usually the hardest part: making sure that the specification matches intent. And right now there's no good solution other than a lot of peer review.

TTFN

FAQ731-376
 
Noway2 said:
Some of what it flags may be considered acceptable, such as ... IF (!DUMMY_VARIABLE) { do something } ... where DUMMY_VARIABLE is not of type BOOL but of INT. Naturally, this is a quick shortcut that just about everyone is guilty of using, but it could be spelled out explicitly, which might be safer, especially if DUMMY_VARIABLE has external scope.
Thankfully, I learned early on to be one of the few who doesn't try that kind of nonsense. You're throwing away one of the compiler's most important tools, and that's type checking. If I have a for loop that runs from zero to some value, that looping variable should be an unsigned int... most would make it an int simply because it takes longer to type "unsigned" in front of it. If the value the for loop runs to were a define that was later accidentally (or purposely) changed to a negative value, the compiler will flag it.

Most recently, I came across the exact issue you mentioned. The variable was declared as an int but was being used as a bool in several spots. The bool portions required a true/false value, but elsewhere the program was incrementing the variable to something other than zero or 1. The prior programmer was lucky: the program worked, but any change in the logic would most likely have made it fail. It could easily have failed if another compiler was used that did not hold the same values for true/false as the current one.

The same programmer tried this little gem of a statement:
if ( (Value = foo()) == ERROR_FLAG)
The function foo() will return ERROR_FLAG if there is any problem in processing the data. Seems innocent enough, right? Look closer. In C, the parenthesized assignment yields the value just stored in Value, so this particular form does work, but an assignment buried inside a condition is a classic trap: it is easy to misread, and it is one slip of the parentheses away from the genuine bug
if (Value = foo() == ERROR_FLAG)
where precedence makes Value receive the result of the comparison (0 or 1) instead of foo()'s return value. Lint will help catch that kind of crap.


Dan - Owner
 