Object-Oriented Metrics: People and Publications

Robin Whitty, South Bank University, London SE1 0AA, UK

______________________________________________________________________________________
This annotated list is meant to offer expanding coverage of what is
happening where in research into measurement of object-oriented design and
programming. Future updates of this list will be posted on the ami electronic
discussion group (join by sending a single line: subscribe firstname lastname
e-mail to

This list is copyright of South Bank University. It may be copied and
distributed freely provided this copyright is acknowledged. In return I
acknowledge responsibility for all sins of commission and omission.

Please send me additions or corrections. Remember, if you have read a paper
then any comments or annotations you have will be especially useful. The
following are acknowledged in this respect: Jim Bieman, Kevin Bray, Lionel
Briand, Fernando Brito e Abreu, Rachel Harrison, Mark Leszczynski, Joern
Muenzel.

This work was carried out for the PROMISE project, supported by the EPSRC as
part of the EPSRC/DTI programme in Safety-critical Systems.
______________________________________________________________________________________
Update: 11th November 1994.
*** Don't forget to register for Object-Oriented Information Systems '94. Held
this year at South Bank University, London, 19-21 December. Do your Christmas

______________________________________________________________________________________

1. M.G. Barnes and B.R. Swim, "Inheriting software metrics", Journal of
Object-Oriented Programming, November-December 1993, 27-34.

2. J.M. Bieman, "Deriving measures of software reuse in object oriented
systems", in Formal Aspects of Measurement (T. Denvir, R. Herman and R. Whitty,
eds.), Springer-Verlag, London 1992, 63-83.

Quote:
>>Tackles the problem of measuring reuse in OO systems which support
'leveraged' reuse through inheritance. Measurements are described for 3
perspectives: the server, the client and the system.
Jim Bieman is at Colorado State University, University Services Centre, Fort
Collins, Colorado 80523,  USA
Main tel: +303 491 7096
Main fax: +303 491 6639

3. J. M. Bieman and S. Karunanithi, "Measurement of language supported reuse in
object oriented and object based software" (to appear), J. Systems and
Software.

4. R.V. Binder, "Design for testability in object-oriented systems", Commun.
ACM, 37 (9), 1994, 87-101.

Quote:
>>Provides tables of metrics for testability and encapsulation, inheritance,
polymorphism and complexity (compare with McCabe et al 1994). There is a very
detailed discussion of the testing process in which these metrics may be
deployed, addressing such issues as standards, tools and design for
testability.

5. S.C. Bilow, "Borrowing from McCabe: what object-oriented methodologists can
learn from cyclomatic complexity", OOPSLA '92 Workshop 'Metrics for
Object-Oriented Software Development', 05.10.-10.10.92; 4 pages;

6. G. Booch and M. Vilot, "Simplifying the Booch components", C++ Report, 5
(5), 1993, 41-52, ISSN: 1040-6042.

Quote:
>>Abstract: What factors contributed to making the C++ Booch Components into
'the incredible shrinking library?' This article examines how the concepts of
OOD and the features of C++ helped to organize and simplify the library. First,
it provides a brief overview of the library showing its contents and
organization. Then, it explores how inheritance and parameterization helped to
streamline and simplify the library. The article concludes with a summary of
how much each design change contributed to the overall size reduction. (14
Refs.)

7. G. Booch, "Qualitaetsmasse", OBJEKTspektrum, Nr.4, September-October 1994,
53-56.

Quote:
>>Quality measures paper. OBJEKTspektrum is a German journal.

8. L. Briand, S. Morasca and V. Basili, "Assessing software maintainability at
the end of high-level design", IEEE Conference on Software Maintenance,
Montreal, Quebec, Canada, 1993.
Quote:
>>Lionel Briand writes: "We did a substantial amount of work on the subject of
metrics for abstract data types. Moreover, we validated them experimentally,
which is not the case for most (if not all) of the metrics in the literature.
Only part of this work is published, since several papers are still under
review. The second reference is more complete since it was completed recently.
The experimental results are only provided in the second reference."
He also has these papers on the following ftp site: ftp.cs.umd.edu in
pub/sel/papers. However, the PostScript files are not perfect because they have
been generated from MS Word files on a Mac. Hardcopies can be sent. Lionel can
be contacted at:
L C Briand, Software Engineering Laboratory, University of Maryland, College
Park, Maryland 20742, USA

9. L. Briand, S. Morasca, V. Basili, "Defining and validating high-level design
metrics", Submitted for journal publication, technical report of the University
of Maryland, CS-TR 3301, UMIACS-TR-94-75.

10. F. Brito e Abreu, "Candidate metrics for object-oriented software within a
taxonomy framework", J. Systems and Software 23 (1), July 1994, 87-96.

11. F. Brito e Abreu, "Object-oriented software engineering: measuring and
controlling the development process", Proceedings of the 4th International
Conference on Software Quality, McLean, Virginia, October 3-5, 1994.

Quote:
>>Fernando is currently a lecturer in Computer Science at the Lisbon Technical
University and a researcher at INESC, an R&D company, also in Lisbon. He is
working on a PhD thesis on quantitative methods for the OO paradigm (metrics
and resource estimation models). He can be contacted at:
Eng Fernando M Brito e Abreu
INESC
Rua Alves Redol 9, Apartado 13069, 1000 Lisboa, Portugal
Direct tel: +351 1 3100226
Main tel: +351 1 3100000
Main fax: +351 1 525843
E-mail: fba.luanda.inesc.pt

12. I. Brooks, "Object-oriented metrics collection and evaluation with a
software process", OOPSLA `93 Workshop on Processes and Metrics for Object
Oriented Software Development, Washington DC, 26 September, 1993.

Quote:
>>Not necessarily available as a paper. However, see the OOPSLA `93 Workshop
Report: Steven C. Bilow, Tektronix, Inc. and Doug Lea, SUNY Oswego & N.Y. CASE
Center. The following is taken from that report: Irene Brooks presented a paper
based on practical experience at Texas Instruments. She explained how TI has
found measures of size, defect density, and defect intensity to be very useful
in schedule management and defect control. She commented that some traditional
metrics like the McCabe complexity measure do not appear to be useful for O-O
development. She also noted that it seems necessary to find adequate measures
for polymorphism, inheritance, and the cohesion between class attributes. Her
primary goals are repeatability of process, quality measurement, project
management and configuration control techniques.

13. S.N. Cant, B. Henderson-Sellers, and D.R. Jeffery, "Application of
cognitive complexity metrics to object-oriented programs", Journal of
Object-Oriented Programming, July-August 1994, 52-63.

14. D. de Champeaux, A. Anderson,  D. Lerman, M.D. Gasperina, E. Feldhousen, M.
Glei, F. Fulton, C. Groh, D. Houston, C. Monroe, R. Raj and  D. Shultheis, Case
study of object-oriented software development, Report no. HPL-91-170, Oct.
1991,
Hewlett-Packard Lab., Palo Alto, CA, USA.

Quote:
>>Try contacting Barbara Zimmer, Hewlett Packard Labs, 1801 Page Mill Road,
Bldg 18D, Palo Alto, California 94304, USA, tel: +1 (415) 857-4894. She is a
member of HP's Software Quality and Productivity Analysis group.
Abstract: These are the highlights of a successfully completed application of
object-oriented software development for a new product. The project was of
medium size, the duration was less than 24 months (from the end of the
requirements specification to product shipment), and the average team size was
8-10 software engineers. The authors discuss how the team dealt with major new
aspects: a different paradigm, a different programming language, a different
user interface environment, and a different development environment. In spite
of all these novelties and in spite of the fact that the code size had been
underestimated by about 75%, the project schedule slipped only by 20%. The
authors touch upon all phases of the development life cycle: requirements
capture, OO analysis, OO design, OO implementation and the verification phase.
Some management perspectives are addressed as well. (2 Refs.)

15. J.-Y. Chen and J.-F. Lu, "A new metric for object-oriented design",
Information and Software Technology, 35 (4) 1993, 232-240.

Quote:
>>Abstract: The paper presents a new metric for object-oriented design. The
metric measures the complexity of a class in an object-oriented design. The
metrics include operation complexity, operation argument complexity, attribute
complexity, operation coupling, class coupling, cohesion, class hierarchy, and
reuse. An experiment is conducted to build the metric system. The approach is
to derive a regression model of the metrics based on experimental data.
Moreover, subjective judgements by experts are incorporated in the regression
model. This ensures that the metric system is pragmatic and flexible for the
software industry. (26 Refs.)

16. S. Chidamber and C. Kemerer, "Towards a metrics suite for object oriented
design", in Proc. Conference on Object-Oriented Programming: Systems, Languages
and Applications (OOPSLA'91), October 1991.  SIGPLAN Notices, 26 (11) 1991,
197-211.

17. S.R. Chidamber and C.F. Kemerer, "A metrics suite for object oriented
design", Center of Information Systems Research (MIT), WP No. 249, July 1993 ;
35 pages.

Quote:
>>Also published in IEEE Transactions on Software Engineering, 20 (6), June
1994, 476-493.

18. S.R. Chidamber and C.F. Kemerer, "MOOSE: Metrics for Object Oriented
Software Engineering", OOPSLA `93 Workshop on Processes and Metrics for Object
Oriented Software Development, Washington DC, 26 September, 1993.

Quote:
>>Not necessarily available as a paper. However, see the OOPSLA `93 Workshop
Report: Steven C. Bilow, Tektronix, Inc. and Doug Lea, SUNY Oswego & N.Y. CASE
Center. The following is taken from that report: The Chidamber and Kemerer
metrics have generated a significant amount of interest and are currently the
best-known suite of measurements for O-O software. In Shyam's quest to
validate his metrics he has spent 3 months interviewing software designers and
several months collecting empirical data from both C++ and Smalltalk projects.
His principal points are that metrics must be theoretically rigorous and
practically relevant. Toward those goals, the MOOSE metrics are beginning to
show strong empirical validity.
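To give a flavour of what two of the simpler Chidamber-Kemerer measures count, here is a minimal sketch in Python of depth of inheritance tree (DIT) and number of children (NOC). The class hierarchy is a made-up example for illustration only, not one taken from their studies.

```python
# Sketch of two Chidamber-Kemerer metrics: DIT (depth of inheritance
# tree) and NOC (number of children). The hierarchy below is a
# hypothetical example, not data from any cited paper.

hierarchy = {            # class name -> its direct parent (None = root)
    "Object": None,
    "Shape": "Object",
    "Circle": "Shape",
    "Ellipse": "Circle",
    "Square": "Shape",
}

def dit(cls, parents):
    """Depth of a class in the inheritance tree (root has depth 0)."""
    depth = 0
    while parents[cls] is not None:
        cls = parents[cls]
        depth += 1
    return depth

def noc(cls, parents):
    """Number of immediate subclasses of a class."""
    return sum(1 for p in parents.values() if p == cls)

print(dit("Ellipse", hierarchy))   # -> 3  (Ellipse -> Circle -> Shape -> Object)
print(noc("Shape", hierarchy))     # -> 2  (Circle and Square)
```

The full suite also covers WMC, CBO, RFC and LCOM, which need method-level information rather than just the class graph.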

19. C.-M. Chung and M.-C. Lee, "Inheritance-based metric for complexity
analysis in object-oriented design", Journal of Information Science and
Engineering, ISSN: 1016-2364, 8 (3) 1992, 431-47.

Quote:
>>Abstract: Object-oriented software development, including object-oriented
analysis (OOA), object-oriented design (OOD) and object-oriented programming
(OOP), is a promising new approach for developing software systems to reduce
software costs and to increase software reusability, flexibility, and
extensibility. Software metrics are an important technique used to measure
software complexity to improve software quality and enhance software
correctness. For the past decade, most software metrics have been developed on
procedure-oriented languages and widely applied to software complexity
evaluation. Recently, much research on software metrics has taken the
object-oriented approach to measuring the complexity of OO software. However,
little research has led to insight into the relationships between
object-oriented design complexity and inheritance. This paper describes a
graph-theoretical metric for measuring the complexity of a class hierarchy.
This metric shows that inheritance has a close relation with object-oriented
software complexity and reveals that overuse of repeated (multiple)
inheritance will increase software complexity and be prone to implicit software
errors. An algorithm to support this software metric is presented. Its time
complexity is O(n^3). (12 Refs.)
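The "repeated (multiple) inheritance" the abstract warns about is the diamond case, where a class reaches the same ancestor along more than one path. A hedged sketch of how one might detect it by counting upward paths in the class graph (the graph here is a hypothetical example; Chung and Lee's actual metric and algorithm are not reproduced):

```python
# Detect repeated (diamond) inheritance by counting the distinct
# upward paths from a class to each ancestor. The class DAG below is
# a hypothetical illustration, not the metric defined in the paper.

dag = {                       # class name -> list of direct parents
    "A": [],
    "B": ["A"],
    "C": ["A"],
    "D": ["B", "C"],          # diamond: D reaches A via B and via C
}

def paths_to(cls, ancestor, parents):
    """Number of distinct upward paths from cls to ancestor."""
    if cls == ancestor:
        return 1
    return sum(paths_to(p, ancestor, parents) for p in parents[cls])

print(paths_to("D", "A", dag))   # -> 2: A is inherited repeatedly
```

A path count greater than 1 for any (class, ancestor) pair flags the repeated inheritance the paper associates with higher complexity.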

20. J. Hamilton, "Metrics: where things went wrong", OOPSLA `93 Workshop on
Processes and Metrics for Object Oriented Software Development, Washington DC,
26 September, 1993.

Quote:
>>Not necessarily available as a paper. However, see the OOPSLA `93 Workshop
Report: Steven C. Bilow, Tektronix, Inc. and Doug Lea, SUNY Oswego & N.Y. CASE
Center. The following is taken from that report: Perhaps the most deeply
revealing story of the day came from Jerry Hamilton, who summed up the misuse
of metrics through a practical example. Jerry's is the story of "last minute
metrics" and the effect that unsubstantiated metrics can have on a project.

21. R. Harrison, L.G. Samaraweera, M.R. Dobie and P.H. Lewis, "Comparing
programming paradigms: an evaluation of functional and object-oriented
programs", Internal Report, Dept. of Electronics and Computer Science,
University of Southampton SO17 1BJ, UK, 1994.

Quote:
>>Results of the EFOOL project comparing external and internal measures for
functional and object-oriented programs.

22. R. Harrison, L.G. Samaraweera, M.R. Dobie and P.H. Lewis, "An evaluation of
code metrics for object-oriented programs", Internal Report, Dept. of
Electronics and Computer Science, University of Southampton SO17 1BJ, UK, 1994.

23. B. Henderson-Sellers and D. Tegarden, "Clarification concerning
modularization and McCabe's cyclomatic complexity", CACM 37 (4), 1994, 92-94.

24. S. Henry and M. Lettanzi, "Measurement of software maintenance and
reliability in the object oriented paradigm", OOPSLA `93 Workshop on Processes
and Metrics for Object Oriented Software Development, Washington DC, 26
September, 1993.

Quote:
>>Not necessarily available as a paper. However, see the OOPSLA `93 Workshop
Report: Steven C. Bilow, Tektronix, Inc. and Doug Lea, SUNY Oswego & N.Y. CASE
Center. The following is taken from that report: The paper describes the
results of three studies. One, a study of maintenance difficulty in procedural
versus O-O software. Two, a relationship between reuse, productivity, and O-O
techniques. Three, a modification and application of the MOOSE (C&K) metrics
for the purpose of predicting maintainability.

25. T.P. Hopkins, "Complexity metrics for quality assessment of object-oriented
design" in Software Quality Management II, vol. 2: Building Quality into
Software (M. Ross, C.A. Brebbia, G. Staples and J. Stapleton, eds.),
Computational Mechanics Press, 1994, 467-481.

Quote:
>>Proposes some simple syntactic measures of interface complexity with the aim
of addressing robustness (ease of change) and reusability. Trevor Hopkins is
at Manchester University, Oxford Rd, Manchester, M13 9PL, UK, Main tel: (0161)
273-7121

26. IEE Colloquium on 'Object-oriented development' (Digest no. 007), London
January 1993.

Quote:
>>Contains a paper on numbers from two OO developments. I don't know who the
authors are. Obtainable from the Institution of Electrical Engineers, Savoy
Place, London, WC2R 0BL, UK, tel: +44 (0) 171 240 1871, fax: +44 (0) 171 497
3633.

27. S. Karunanithi and J.M. Bieman, "Candidate reuse metrics for object
oriented and Ada software", in IEEE-CS Int. Symp. Software Metrics, 1993.

Quote:
>>Contact via Bieman (see under B).

28. R. Kolewe, "Metrics in object-oriented design and programming", Software
Development, October 1993, 53-62.

29. J.A. Lewis, S.A. Henry and D.G. Kafura, "An empirical study of the
object-oriented paradigm and software reuse", in Proc. OOPSLA '91 Conference on
Object-Oriented Programming Systems, Languages and Applications, Phoenix,
Arizona, 1991. SIGPLAN Notices, 26 (11) 1991, 184-196.

30. W. Li and S. Henry, "Maintenance metrics for the object-oriented paradigm",
in "Proceedings of the first International Software Metrics Symposium,
Baltimore, Maryland", May 1993, 52-60.

31. W. Li and S. Henry, "Object-oriented metrics that predict maintainability",
J. Systems and Software 23 (2), 1994, 111-122.

32. J. Liddiard, "Achieving testability when using Ada packaging and data
hiding methods", Ada User, 14 (1), 1993, 27-32.

33. M. Lorenz and J. Kidd, "O-o metrics position paper", OOPSLA `93 Workshop on
Processes and Metrics for Object Oriented Software Development, Washington DC,
26 September, 1993.

Quote:
>>Not necessarily available as a paper. However, see the OOPSLA `93 Workshop
Report: Steven C. Bilow, Tektronix, Inc. and Doug Lea, SUNY Oswego & N.Y. CASE
Center. The following is taken from that report: Mark Lorenz and Jeff Kidd came
to the workshop with metrics taken from their upcoming book on the subject.
They made two very important points. First, they make a large distinction
between what they term "Project Metrics" and "Design Metrics". Project metrics
include such items as schedule, staffing estimates, and nearness to completion.
Design metrics, on the other hand, include such items as method "size", class
size, use of inheritance, cohesion, etc. More important than the presentation
of Mark's metrics was his statement that metrics should not "drive design" but,
rather, should be used to pinpoint anomalies. Some may disagree with this
premise, but it does remind us of how easy it is to misuse measures.

34. M. Lorenz and J. Kidd, Object-Oriented Software Metrics, Prentice Hall
Object-Oriented Series, 1994.

35. R. Martin, "OO design quality metrics - an analysis of dependencies",
Technical Report, 14.09.1994; 7 pages.

Quote:
>>Robert Martin can be contacted at Object Mentor Assoc., Green Oaks, IL 60048,
USA
Tel.: +708 918 - 1004
Fax: +708 918 - 1023

36. T.J. McCabe, L.A. Dreyer, A.J. Dunn and A.H. Watson, "Testing an
object-oriented application", Journal of the Quality Assurance Institute,
October 1994, 21-27.

Quote:
>>Claims that traditional control flow analysis still has a role in o-o testing
and complexity measurement. Also lists some new metrics for o-o systems:
inheritance metrics, encapsulation metrics, polymorphism metrics and quality
metrics.
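The traditional measure referred to here is easy to state: for a control-flow graph with E edges, N nodes and P connected components, McCabe's cyclomatic complexity is V(G) = E - N + 2P. A minimal sketch (the example graph is hypothetical, not taken from the paper):

```python
# McCabe's cyclomatic complexity V(G) = E - N + 2P, where E is the
# number of edges, N the number of nodes and P the number of connected
# components of the control-flow graph.

def cyclomatic(edges, nodes, components=1):
    """Cyclomatic complexity of a control-flow graph."""
    return edges - nodes + 2 * components

# Hypothetical example: a single method containing one if/else. Its
# graph has 4 nodes (decision, then-branch, else-branch, exit) and
# 4 edges, giving V(G) = 2, i.e. two independent paths to test.
print(cyclomatic(edges=4, nodes=4))   # -> 2
```

V(G) gives a lower bound on the number of test cases needed for branch coverage of one method, which is why the paper can still apply it per-method inside an o-o system.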

37. C.L. Ong and W.T. Tsai, "Class and object extraction from imperative code",
Journal of Object-Oriented Programming, 6 (1) 1993, 58-60 and 62-68.

Quote:
>>Abstract: The article studies class and object extraction from existing
systems for translation to a class-based or object-oriented language. The
effects of functional decomposition and object-oriented design on the
resulting code are examined. Heuristics and algorithms to extract instance
variables of objects from imperative code are presented. Data flow analysis is
used to analyze variable usage in a program; the results are used to extract
methods. The languages Fortran 77 and C++ are widely used in the
implementation stage of the imperative and the object-oriented paradigms. A
prototype extractor was developed that automatically extracts classes and
objects from code written in Fortran 77 and translates them into class-based
code in C++. In an experiment with an 18 K-line Fortran application, the
prototype identified 6 K lines of code as potential components of classes.
Further analysis showed that much of the extracted code was repeated, and if
the software is reengineered in an object-oriented version much redundancy can
be eliminated. The prototype is also useful for the task of program
understanding. (46 Refs.)

38. K. Reinold, "Processes and metrics for object-oriented software
development", OOPSLA `93 Workshop on Processes and Metrics for Object Oriented
Software Development, Washington DC, 26 September, 1993.

Quote:
>>Not necessarily available as a paper. However, see the OOPSLA `93 Workshop
Report: Steven C. Bilow, Tektronix, Inc. and Doug Lea, SUNY Oswego & N.Y. CASE
Center. The following is taken from that report: Kathy Reinold brought to us
the perspective of a software manager tasked with moving from a procedural
world to an object-oriented one. She presented charts illustrating her
contention that Function Point Analysis and "object counting" have the ability
to produce reliable KLOC estimates, even in O-O projects.

39. L. Rising, "An information hiding metric", OOPSLA `93 Workshop on Processes
and Metrics for Object Oriented Software Development, Washington DC, 26
September, 1993.

Quote:
>>Not necessarily available as a paper. However, see the OOPSLA `93 Workshop
Report: Steven C. Bilow, Tektronix, Inc. and Doug Lea, SUNY Oswego & N.Y. CASE
Center. The following is taken from that report: Another quite interesting bit
of work came from Linda Rising at Honeywell. She has been working on measures
of information hiding in object-based languages and brought the Ada perspective
to the group.

40. T. Roberts, "Metrics for object-oriented software development", OOPSLA '92,
Addendum to the Conference Proceedings, Workshop, 97-100.

41. D. Rocacher, "Metrics definitions for Smalltalk", ESPRIT Project 1257,
MUSE, Workpackage WP9A, 1988.

Quote:
>>Official contact for MUSE: Brameur Ltd, Clark House, Kings Road, Fleet, HANTS
GU13 9AD, UK. Main tel: 01252 812 252, Main fax: 01252 815 702

42. D. Rocacher, "Smalltalk Measure Analysis Manual", ESPRIT Project 1257,
MUSE, Workpackage WP9A, 1989.

43. X. Song, "Engineering o-o design methods into repeatable design processes",
OOPSLA `93 Workshop on Processes and Metrics for Object Oriented Software
Development, Washington DC, 26 September, 1993.

Quote:
>>Not necessarily available as a paper. However, see the OOPSLA `93 Workshop
Report: Steven C. Bilow, Tektronix, Inc. and Doug Lea, SUNY Oswego & N.Y. CASE
Center. The following is taken from that report: Song's work is devoted to
process "repeatability". His goal is to quantify existing design methodologies
and generalize them into a set of processes, each of which is specific to a
certain class of project, and which may be used consistently and repeatably
across these projects. The work is rooted in what Leon Osterweil calls "Process
Programming". In other words, a software process should be designed, "coded",
and executed in a manner quite similar to software itself.

44. D.A. Taylor, "Software metrics for object technology", Object Magazine,
March-April 1993, 22-25.

45. D.P. Tegarden, "Object-oriented system complexity: an integrated model of
structure and perceptions", Presented at OOPSLA '92 Workshop 'Metrics for
Object-Oriented Software Development'.

46. D. Ungar, Position paper, OOPSLA '93 workshop on object-oriented testing,
September 1993.

Quote:
>>Not necessarily available as a published paper. I understand he examines
error rates in C++ and Self code: findings suggest that OO code is as
error-prone as other types. (Ungar developed the Self compiler.)

47. J.D. Williams, "Metrics for object-oriented projects", Conference
Proceedings OOP '94 and C++ World, 31 Jan-4 Feb. 1994, SIGS Publications, ISBN
1-884842-00-3, 253-258.



Wed, 30 Apr 1997 06:02:27 GMT  
 