Assembly to Ada Conversion Metrics 

With Mr. Sims permission, I am posting my (stream of consciousness :-)
reply to his query regarding Assembly to Ada Conversion Metrics in the
hope that it might be of use to others.  Other readers of this group
must have had experience along these lines; those of you who have worked
in a similar trench probably have limitations in the amount of detail you
can share, as do I.  However, any such information may help people pursuing
other programs to learn from past trials and tribulations.


-- Bud Hammons


Subject: Re: Assembly to Ada Conversion Metrics

Dear Mr. Sims:

|> I'm looking for information regarding the conversion of assembly
|> language, particularly PDP11 Macro to Ada.  Does anyone have any metrics
|> on what is an appropriate ratio of assembly lines of code to Ada lines of
|> code?  How about hours of effort for a function point in assembly vs
|> Ada?  
|> Any and all information is tremendously appreciated.
|> Regards,
|> John Sims
|> --

|> "Life is much too serious a matter to be taken too seriously"
|>                                             -- Goethe (I think)
|> ============================================================================

Given what you've written, I'm not sure exactly what you are trying to estimate,
so forgive me for rambling a bit.

If it is your objective to estimate the relative difference in labor, I can't
give you much help -- there are many subjective factors, most notably the
skill level of the person(s) re-engineering the code.  Also, are you working
from an original design specification or are you planning on a simple
transliteration of what you currently have in hand?  I claim that it is a
much harder job to get the transliteration AND have something you would care
to maintain over the long haul.  You are also at risk of simply reproducing
prior design/implementation flaws.  On the other hand, if you are trying to get
a handle on performance and memory, there are means of getting there, but.....

Based on my experience in this area I recommend that you not believe any fixed
ratio you receive from ANY source.  The 'conversion factor' you request is
dependent upon numerous factors, including, but not limited to:

    * the 'quality' of the original assembly code, which includes stylistic
        and other subjective issues.
    * how much of Ada you support/use in the target application -- kernels
        can get quite compact if you can eliminate support for package Calendar,
        slice assignments, dynamic memory allocation, tasking, etc.  Depends
        on your real needs and what the customer will accept...don't EVER
        surprise your Government customer in this area, you will regret it.
    * the differences in the target processor architecture in case the
        real target is different from your initial assembly language base
    * the sophistication and services of the run-time support of the
        compiler/kernel relative to what you had in your baseline.
    * the quality of the code generation, at the TIME IN THE FUTURE WHEN
        THE MEASURE IS IMPORTANT -- code quality is a moving target, as one
        expects a given compiler vendor to get better over time with a given
        target architecture...this is where you may have to stick your neck
        out and _estimate_ what will be true 1-5 years down the road...I've
        done this _once_ and managed to get close, mainly because I talked
        directly with the developers and got them to talk candidly about their
        expectations and approaches for the architecture in question.  Obviously,
        the amount of homework to be done is non-trivial :-)
    * the time/space tradeoffs between the procedure call linkage used by
        the compiler v. the means used in the assembly language model.
    * any hardware support you gain/lose with the new approach -- with some
        of the nice things that can be done with gate arrays or ASIC's these
        days, you can delegate some of your more time critical functions to
        hardware and consign the assembly code formerly responsible for those
        functions to the 'ancient archives'.
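To make the run-time subsetting point concrete: in Ada 95 terms this is done
with configuration pragmas.  The sketch below uses restriction identifiers
from the standard; which ones a given compiler actually honors, and how much
kernel space each one buys you, is strictly vendor-specific, so treat this as
illustration only.

```ada
--  Sketch only: configuration pragmas asking the compiler/run-time to
--  drop support the application does not need.  Check your vendor's
--  documentation for the identifiers it actually supports.
pragma Restrictions (Max_Tasks => 0);             --  no tasking at all
pragma Restrictions (No_Allocators);              --  no dynamic allocation
pragma Restrictions (No_Unchecked_Deallocation);
pragma Restrictions (No_Exceptions);              --  only if the customer agrees!
```

Every restriction you impose here is exactly the sort of thing to clear with
the customer up front, per the warning above.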

If getting a handle on performance and memory is your goal, I recommend you
create or have someone else create
some representative benchmarks of the components of your application which are
critical in either time, space, or are expected to confound the compiler
(e.g., heavy use of low-level bit manipulations).  Look at the generated
assembly code, talk to the (prospective?) compiler vendors -- if you can get
these benchmarks up front before purchase, you have some potential leverage
to get their support/attention (better to find out before purchase whether
they are going to be helpful when things get tough :-).
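A minimal sketch of the kind of bit-manipulation benchmark meant above -- a
packed status word with a forced layout, the sort of thing the old assembly
code did with shifts and masks.  All type and field names here are invented:

```ada
--  Sketch only: a packed status word that tends to confound code
--  generators.  Compare the emitted assembly against your hand-written
--  PDP-11 macro equivalent.
with Ada.Text_IO;
procedure Bit_Bench is
   type Channel_Number is range 0 .. 15;
   type Status_Word is record
      Ready   : Boolean;
      Channel : Channel_Number;
      Error   : Boolean;
   end record;
   --  Force the layout the old assembly code assumed:
   for Status_Word use record
      Ready   at 0 range 0 .. 0;
      Channel at 0 range 1 .. 4;
      Error   at 0 range 5 .. 5;
   end record;
   for Status_Word'Size use 16;

   S : Status_Word := (Ready => True, Channel => 7, Error => False);
begin
   --  Read the assembly the compiler emits for this extract/insert;
   --  a good code generator reduces it to a shift and a mask.
   S.Channel := S.Channel + 1;
   Ada.Text_IO.Put_Line (Channel_Number'Image (S.Channel));
end Bit_Bench;
```

Feeding exactly this sort of unit to each candidate compiler, before purchase,
gives you something objective to discuss with the vendor.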

Enough bandwidth.....good luck in your endeavor.


-- Bud Hammons

* NEC America Inc.       CompuServe: 71160,3220                            *
* 1525 Walnut Hill Lane  voice:      (214)518-3488     CCSL S/W Dev.Lab.   *
* Irving, Texas 75038    FAX:        (214)518-3552     S/W Eng. Group      *

Sat, 17 Sep 1994 00:13:31 GMT  
