Abstract: Industry recognizes very small enterprises for their contribution of valuable products and services. As software quality increasingly becomes a subject of concern, and as process approaches are maturing and earning the confidence of companies, the use of ISO/IEC JTC1/SC7 international standards is spreading in organizations of all sizes. These standards, however, were not written for very small development organizations, that is, those with one to 25 employees, and are consequently difficult to apply in such settings. A new ISO/IEC JTC1/SC7 working group has been established to address these difficulties by developing profiles and providing guidance for compliance with ISO software engineering standards. A survey was conducted among very small enterprises on their use of standards, as well as to collect data to identify problems and potential solutions to help these enterprises apply them. More than 400 responses were received from 31 countries.
Abstract: This paper examines how the industrial applicability of both ISO/IEC 9126:2001 and MITRE Corporation's Software Quality Assessment Exercise (SQAE) can be bolstered by migrating SQAE's quality model to ISO/IEC 9126:2001. The migration of the quality model is accomplished through the definition of an abstraction layer. The consolidated quality model is examined and further improvements to enrich the assessment of quality are enumerated.
Abstract: This article examines the different ways in which the quality model behind MITRE Corporation's Software Quality Assessment Exercise (SQAE) can be migrated to ISO/IEC 9126:2001. The reasons why such a migration is desirable are detailed through a comparison of the quality models supporting ISO/IEC 9126 and SQAE. The problem is solved through the definition and usage of an abstraction layer between the two quality models. The resulting model is examined and further improvements are suggested to ensure compliance with ISO/IEC 9126.
Abstract: ISO/IEC 90003:2004, Software engineering - Guidelines for the application of ISO 9001:2000 to computer software, is a new ISO/IEC standard that has a huge worldwide potential due to the penetration of just about every business sector, as well as many aspects of social life, by information technology.
Abstract: In this article, a systems engineering process is briefly described, followed by a discussion of the application of risk management practices to the reengineering of operator console stations of a missile weapon system. Lastly, 12 lessons learned are presented.
Abstract: Function point analysis is the most widely used method for sizing software development projects. Its counting guidelines, however, are difficult to apply to new methods such as object-oriented or component-based development. COSMIC full function points were developed for these methods.
Abstract: A time-bound project is constrained by hard deadlines, in which the delivery's timing is as important as the delivery itself. Because most time-bound projects start with more requirements than developers can handle within the imposed time constraints, requirements often must be slashed halfway through the project, resulting in missed deadlines, customer frustration, and wasted effort. A better approach defines requirement priorities before the project's start. But failing to prioritize requirements is not the only reason that projects miss deadlines. Traditional planning methods' inability to deal with uncertain estimates and their failure to recognize that development work does not progress linearly are also to blame. Statistically Planned Incremental Deliveries offer an approach that addresses these problems by combining ideas from critical chain planning, incremental development, and rate monitoring into a practical method for planning and executing time-bound projects.
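The planning idea this abstract alludes to (committing only the scope that fits a hard deadline once estimate uncertainty is accounted for) can be illustrated with a small Monte Carlo simulation. This is an invented toy, not the SPID method itself; the hour budget, the requirement list and the triangular estimates are all assumptions:

```python
import random

random.seed(7)
budget_h = 450.0                      # hours available before the hard deadline
# (low, likely, high) effort estimates per requirement, highest priority first
reqs = [(20, 40, 90), (30, 50, 100), (10, 25, 60), (40, 70, 150),
        (15, 30, 70), (25, 45, 95), (20, 35, 80), (30, 60, 120)]

def committable(trials=10_000, confidence=0.9):
    """Largest prefix of the priority list that fits the budget in >= `confidence` of trials."""
    for n in range(len(reqs), 0, -1):
        hits = sum(
            sum(random.triangular(lo, hi, mode) for lo, mode, hi in reqs[:n]) <= budget_h
            for _ in range(trials)
        )
        if hits / trials >= confidence:
            return n
    return 0

n_commit = committable()
print("requirements safely committable before the deadline:", n_commit)
```

Requirements beyond the committable prefix stay on the list as optional scope rather than promises, which is what makes the deadline robust to estimation error.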
Abstract: Calculating software size is an essential part of sizing a software project and, more generally, of assessing the quality of a software unit's operations. Size is a normalizing factor in many quality metrics. Many size-calculation methods are in use; partitioning the software into pieces to be estimated, using a work breakdown structure, pure expert judgement, estimating lines of code, and function point counting are some examples. A method applicable to as many kinds of software as possible has not really been found, or at least such methods have not been widely adopted. Now a new candidate for calculating the functional size of software has emerged: COSMIC.
Abstract: Most practitioners and project managers still produce estimates based on ad hoc or so-called "expert" approaches, even though several structured methods for software sizing and effort estimation are well known. The paired-comparisons method offers a more accurate and precise alternative to "guesstimating."
Abstract: This paper describes the approach used by a defense contractor to address the people issues raised when developing and implementing engineering processes. First, a brief description of the context is presented, then organizational mechanisms to better manage changes are described, and finally 16 lessons learned are presented.
Abstract: In order to reduce cycle time, increase customer satisfaction and lower costs, Oerlikon Aerospace initiated a series of projects to define and implement engineering and management processes. The first initiative, in 1992, defined a software engineering process. A second initiative was started in 1995 with the objective of defining and implementing a systems engineering process and integrating it with the software engineering process already in use. We present a brief description of the context, then describe the systems engineering process. Organizational mechanisms to better manage changes are also described. Finally, lessons learned are presented.
Abstract: This paper describes the approach used by a defense contractor, since 1992, to address the issues raised when defining and implementing engineering and management processes. First, the steps taken to define the software engineering, systems engineering and project management processes are described. Then, issues raised during the integration of these processes are described. Finally, the steps taken to manage change are discussed and lessons learned are presented.
Abstract: This paper describes the approach used by Oerlikon Aerospace since 1993 to define and implement software and systems engineering processes. First, the steps taken to assess and define a software process are described using the Software Engineering Institute's Capability Maturity Model (SEI CMM) framework. The steps to develop the systems engineering process using the SEI CMM framework and two processes from the Software Productivity Consortium are then described, and process integration is discussed. Since the human dimension of the implementation of new technologies is critical to the success of our effort, a few human issues are also discussed. Finally, lessons learned and the next steps are described.
Abstract: This paper describes the architecture of a voice-interactive evaluation system designed around the STD bus. The evaluation system includes a microcomputer module, a speech synthesizer module, a speech recognition module, and an application or interface module. The voice-interactive system is used, under the control of a host computer, in a laboratory experiment to simulate the frequency selection of VHF and UHF radios in a military aircraft. The results show that tracking a target while performing radio-frequency selection using a voice-interactive system is more precise than the manual frequency-selection method.
Abstract: This paper describes the architecture of a distributed computer system designed for the development of voice interactive applications. The system is composed of two main elements: a microcomputer and a front-end processor. The front-end processor is a speech transaction processor used as a man-machine interface. It is designed to receive verbal commands and deliver aural information and feedback to the operator of the system.
Abstract: The software industry recognizes the value of very small enterprises in contributing valuable products and services to the economy. As the quality of software increasingly becomes a subject of concern and process approaches are maturing and gaining the confidence of companies, the use of ISO/IEC JTC 1/SC7 standards is spreading in organizations of all sizes. However, these standards were not written for development organizations with fewer than 25 employees and are consequently difficult to apply in such small settings. A new ISO/IEC JTC 1/SC7 Working Group, WG24, has been established to address some of these difficulties by developing profiles and providing guidance for compliance with ISO software engineering standards. A survey was conducted to question these very small organizations about their utilization of ISO/IEC JTC 1/SC7 standards and to collect data to identify problems and potential solutions to help them apply these standards. Over 400 responses were received from 32 countries. Results from the survey are discussed.
Abstract: One of the strengths contributing to the diffusion and adoption of Maturity Models (MMs) such as CMMI and ISO 15504 (aka SPICE) in recent years is the evolutionary path towards continuous improvement they provide, evolving Crosby's initial idea. Unlike Performance Management models (PMMs) such as the Balanced Scorecard (BSC), Malcolm Baldrige, the EFQM Excellence Model or the JUSE Deming Prize, MMs do not seem to stress in their appraisal criteria the way resources are renewed, redistributing the obtained 'results' towards the 'enablers'. Looking at this question from an 'ecological' viewpoint, in which the current environmental situation urgently calls for adopting renewable resources while taking a holistic view of the state of the planet, this paper translates the issue to organizational management, proposing possible improvements to the generic structure of a Maturity Model's process assessment model (PAM), with the objective of providing a more confident, not overrated, picture of an organization from an appraisal.
Abstract: Effort estimation is a significant practical problem in software engineering, and various cost drivers, including software size, which might have an impact on it have been explored. In many of these studies, total software size (measured in either lines of code or functional size units) is the primary input. However, the relationship between effort and the components of functional size has not yet been fully analyzed. This study explores whether effort estimation models based on the functional size components, that is, Base Functional Component types, rather than those based on a single total value, would improve estimation models. For this empirical study, the project data in the International Software Benchmarking Standards Group (ISBSG) Release 10 dataset, which were sized by the COSMIC FFP method, are used.
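The modeling question in the abstract above (does splitting total functional size into Base Functional Component types improve effort models?) can be sketched with ordinary least squares. The project data below are invented, not drawn from the ISBSG dataset, and the comparison is on training residuals only:

```python
import numpy as np

# Hypothetical projects: COSMIC BFC counts [Entry, Exit, Read, Write]
# and effort in person-hours (made-up numbers for illustration).
bfc = np.array([
    [30, 25, 40, 15],
    [10, 12, 18,  5],
    [55, 48, 70, 30],
    [22, 20, 33, 12],
    [40, 35, 52, 25],
    [15, 14, 22,  8],
    [60, 50, 75, 35],
    [28, 24, 38, 14],
], dtype=float)
effort = np.array([2600., 900., 5200., 1900., 3600., 1200., 6100., 2300.])

def rms_error(X, y):
    """Fit y ~ intercept + X by least squares; return RMS training residual."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sqrt(np.mean((y - A @ coef) ** 2)))

rms_total = rms_error(bfc.sum(axis=1, keepdims=True), effort)  # single total size
rms_bfc = rms_error(bfc, effort)                               # per-component sizes
print(f"total-size model RMS: {rms_total:.1f}, per-BFC model RMS: {rms_bfc:.1f}")
```

Since the total-size model is a constrained special case of the per-component model (all four coefficients forced equal), its training residual can never be smaller; a real study would compare predictive accuracy on held-out projects instead.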
Abstract: In the context of cross-platform Web applications, Pattern-Oriented Design (POD) proposes that developers use proven solutions emerging from best design practices in order to solve common design problems. In addition, it requires composing patterns to create a platform-independent design and then mapping these pattern-oriented designs to specific platforms. This prevents the designer from reinventing the wheel and can have positive implications on system performance, scalability and usability. In this paper, we introduce different types of Web design patterns, as well as different composition and mapping rules to design a multi-platform Web application. We discuss why patterns are a suitable means for creating and mapping a Web design to new platforms while maintaining usability.
Abstract: In recent years, the authors have implemented measurement programs in several organizations of different sizes. Two of them were small software companies of approximately 12 employees each, most of them developers. Although these two organizations were similar in size and technology, the differences in the issues they were facing led to completely different approaches for their measurement programs. This paper describes the steps taken to implement these measurement programs, both including functional size measurement with COSMIC, effort, schedule, and defects. It also describes what was done to ensure the success of each program and, most importantly, the challenges that were faced during their implementation and maintenance, as well as some of the solutions proposed to answer these challenges.
Abstract: Software quality requirements engineering is still an immature discipline, and the absence of quality requirements results in dissatisfied users and costly applications. The identification and specification of software quality requirements from system and user requirements is becoming a prominent task in software engineering. The lack of these requirements, or their inappropriate identification, may compromise business processes and may negatively impact the results of any development project. This paper discusses three quality engineering approaches which address quality requirements. The main objective of this research study is to define a methodology for building an ISO/IEC standards-based approach to the identification of quality requirements.
Abstract: Quality of service (QoS) is difficult to achieve in modern voice over IP (VoIP) systems because software and hardware have a symbiotic relationship. The purpose of this paper is to present an analysis of softswitch quality that identifies where higher quality requirements should be enforced when designing or evaluating VoIP solutions. The work is based on the international standard ISO/IEC 9126 and the International Packet Communications Consortium (IPCC) reference architecture. Therefore, it applies to most vendors and architectures regardless of the underlying technology. Since the softswitch is fairly complex and involves many software and hardware modules, quality attributes have been analyzed through their functional behavior. This paper provides softswitch vendors and buyers with results that will help them make better resource allocation decisions and therefore reduce both their capital and operational expenses.
Abstract: To define a trajectory in computerized environments, it is necessary to study aspects such as the shape of objects, the representation of form, and measurements of obstacle shapes. Because of sampling, we do not observe the true contour but an approximate version that depends on the type of connectivity used on the sampling grid, which leads to different contours being defined for the same object. We analyze two categories of form descriptors: linear descriptors, which describe the form step by step along its contour; and surface descriptors, which operate on elements of surface.
Abstract: Looking at the distribution of the costs of IT, the largest part of the budget is allocated to maintenance and enhancement projects. New development comprises between 30 and 50% of IT costs. Functional size measurement methods are mostly used for new development only. With some extensions to common size measurement methods like Function Point Analysis [1] and COSMIC Full Function Point [2], one can tackle almost all IT activities. Furthermore, the same productivity rates (performance) can also be used in enhancement projects. Over the last 10 years we have used the extended measurement method based on Function Point Analysis very successfully in a great number of projects. The same concept is also applicable when using COSMIC Full Function Points.
Abstract: The application of software functional size metrics, such as IFPUG Function Points and COSMIC Full Function Points, frequently reveals serious difficulties arising from the lack of a detailed and complete description of the functional user requirements of the systems being measured. These difficulties, combined with the obvious need to provide an estimation of the measures in a reduced and early time frame, compared to the standard measurement duration and time, led to the definition of the Early and Quick (Function Point) technique. This work describes the generalized definition of the technique and how its structure and concepts are specialized for the above measurement methods, providing all the required information for applying it in practice. Moreover, we provide the goodness evaluation method and results of the technique with respect to the estimated and actual measures, through a given set of numerical indicators.
Abstract: The COSMIC-FFP functional sizing method can be applied to several software domains (such as business applications and 'real-time' software). The Measurement Manual offers a theoretical skeleton, together with examples from those domains. For each specific software domain a Guideline will be developed; a Guideline aims to describe detailed rules and to provide extensive examples for the sizing of software from that specific domain. The first Guideline to be published gives a characterisation of the business applications domain. To apply the COSMIC-FFP method to business applications software, the measurer requires a good understanding of data analysis. The Guideline gives a short explanation of data analysis and its relation to the COSMIC-FFP method. Finally, the application of the COSMIC-FFP method in the domain is explained for the End-user Measurement Viewpoint and the Developer Measurement Viewpoint, as defined in the COSMIC-FFP Measurement Manual. The Guideline discusses the concepts of the COSMIC-FFP method (boundary, layer, object of interest, identification of data movements, etc.) and their manifestation in the business applications domain, together with many examples. Measurement of maintenance, the extended definition of object of interest and its implications for sizing are treated. In addition, several issues such as the sizing of authorisation, help, logging, menus and layouts are treated.
Abstract: When implementing FPA, COSMIC Full Function Points or another measurement program, everyone is looking for best practices. Although the way a measurement program is initiated has changed, the items relevant to an implementation have not really changed. In the early days, IT (the supplier) usually initiated the measurement program; nowadays business management (the principal) shows more interest in having a measurement program in place. But it has to be controllable and transparent: business is not looking for a crystal ball. The measurement program should therefore be pragmatic and simple and give quick wins. Because implementations are part of the business of Sogeti Nederland B.V., we developed, based on over 10 years of experience, a model that addresses strategic, tactical and operational issues. MOUSE gives a helping hand to both experienced and less experienced professionals for a successful implementation.
Abstract: This article shows the first results of the adoption of COSMIC Full Function Points as a sizing method replacing function point analysis. The main arguments why COSMIC FFP was chosen are explained, and the transformation plan is shown together with the first results of the use of COSMIC FFP. Besides the management requirement that the new functional sizing method had to be a standard, a number of practical requirements were essential before the transformation could start: finding a correlation between COSMIC functional sizing units and function points so that the existing figures for size and product delivery rate could be reused, (early) estimation possibilities, and the use of COSMIC-FFP for sizing maintenance projects.
Abstract: Many small software organizations have recognized the need to improve their software product. Evaluating the software product alone seems insufficient, since it is known that its quality is largely dependent on the process used to create it. Thus, small organizations are asking for evaluation of both their software processes and products. The ISO/IEC 14598-5 standard is already used as a methodology basis for evaluating software products. This article explores how it can be combined with the CMMI to produce a methodology that can be tailored for process evaluation, in order to help small organizations improve their software processes.
Abstract: A large number of software projects are enhancement projects of existing software. For estimating new projects, acceptance of COSMIC Full Function Points [ ] is growing rapidly because it has already proven to be a good alternative to Function Point Analysis. Estimating enhancements using classic Function Point Analysis [ ] has always been somewhat controversial, but we believe that COSMIC can be a very good alternative in the very near future.
Abstract: COSMIC-FFP is a rigorous measurement method that makes it possible to measure the functional size of software, based on identifiable functional user requirements allocated onto different layers corresponding to different levels of abstraction. The key concepts of COSMIC-FFP are software layers, functional processes and four types of data movement (sub-processes). A precise COSMIC-FFP measure can be obtained only after the functional specification phase, while for forecasting purposes the Early & Quick COSMIC-FFP technique has subsequently been provided for use just after the feasibility study phase. This paper shows how the Analytic Hierarchy Process (AHP), a quantification technique for subjective judgements, can be applied to this estimation technique in order to significantly improve its self-consistency and robustness. The AHP technique, based on pair-wise comparisons of all (or some of) the items of the functional hierarchical structure of the software provided by E&Q COSMIC-FFP, determines a ratio scale of relative values between the items through a mathematical normalization. Consequently, it is not necessary either to evaluate the numerical value of each item or to use statistical calibration values, since the true values of only one or a few components are propagated in the ratio scale of relative values, providing consistent values for the rest of the hierarchy. This merging of E&Q COSMIC-FFP with AHP results in a more precise estimation method which is robust to errors in the pair-wise comparisons, and self-consistent because of the redundancy and the normalization process of the comparisons.
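A minimal sketch of the AHP step described above, assuming the common geometric-mean approximation of the principal eigenvector; the item names, the judgement matrix and the single known size are invented:

```python
import math

items = ["login", "search", "report", "export"]
# pairwise[i][j] ~ how many times larger item i is judged to be than item j
pairwise = [
    [1.0,  0.5,  0.25,  2.0],
    [2.0,  1.0,  0.5,   4.0],
    [4.0,  2.0,  1.0,   8.0],
    [0.5,  0.25, 0.125, 1.0],
]

n = len(items)
# Geometric mean of each row approximates the matrix's principal eigenvector
gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
weights = [g / sum(gm) for g in gm]        # normalized ratio scale

# Propagate one known value: suppose only "login" was measured, at 4 CFP;
# the rest of the hierarchy follows from the ratio scale.
known_size = 4.0
scale = known_size / weights[0]
sizes = {item: round(w * scale, 1) for item, w in zip(items, weights)}
print(sizes)
```

Because only one item needs an absolute measurement, the ratio scale propagates that single value to the rest of the hierarchy, which is the property the abstract highlights.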
Abstract: Looking at the distribution of the costs of IT, the largest part of the budget is allocated to maintenance and enhancement projects. New development comprises between 30 and 50% of IT costs. Functional size measurement methods are mostly used for new development only. With some extensions to common size measurement methods like Function Point Analysis and COSMIC Full Function Point, one can tackle almost all IT activities. Furthermore, the same productivity rates (performance) can also be used in enhancement projects. Over the last 10 years we have used the extended measurement method based on Function Point Analysis very successfully in a great number of projects.
Abstract: Assessing and controlling software quality is still an immature discipline. One of the reasons for this is that many of the concepts and terms used in discussing and describing quality are overloaded with a history from manufacturing quality. We argue in this paper that a quite distinct approach is needed for software quality control as compared with manufacturing quality control. In particular, the emphasis in software quality control is on design to fulfil business needs, rather than replication to agreed standards. We describe how quality goals can be derived from business needs. Following that, we introduce an approach to quality control that uses rich causal models, which can take into account human as well as technological influences. A significant concern in developing such models is the limited sample sizes available for eliciting their parameters. In the final section of the paper we show how expert judgement can be reliably used to elicit parameters in the absence of statistical data. In total this provides a framework for quality control in software engineering that is freed from the shackles of an inappropriate legacy.
Abstract: Looking at the distribution of IT costs, one finds that the largest part of the budget is allocated to maintenance and enhancement projects. New development comprises 30-50% of IT costs. Functional size measurement methods are mostly applied to new development only. With small extensions to well-known measurement methods such as Function Point Analysis or COSMIC Full Function Points, almost any IT activity can be analyzed. Over the past 10 years, Sogeti has applied the extended measurement method based on Function Point Analysis very successfully in numerous projects.
Abstract: Rabobank is reshaping its systems portfolio from dedicated product systems to a network of generic services with a shared data source. In this environment Function Point Analysis no longer fits the sizing needs. An alternative was found in the COSMIC Full Function Points method. Because of the absence of benchmark data, a conversion formula was derived from projects that were measurable in both COSMIC Full Function Points (End User Viewpoint) and Function Point Analysis. This conversion formula now reads as: Y (cfsu) = -87 + 1.2 X (fp). The correlation coefficient for this conversion formula is 0.99 and the standard deviation of the difference in the Y-value is 59. To support the estimating process in early stages of systems development, a locally calibrated approximate version of COSMIC Full Function Points was derived from the first set of measurements. Our version shows very good resemblance to the version presented in the Measurement Manual. Because these figures were derived in a very different environment, this might be an indication that they have a more general applicability.
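The quoted conversion formula can be written directly as code. The function name is ours, and the formula was calibrated only on the Rabobank data set, so it should not be assumed to hold in other environments or outside the measured size range:

```python
def fp_to_cfsu(fp: float) -> float:
    """Rabobank's locally derived conversion: Y (cfsu) = -87 + 1.2 * X (fp)."""
    return -87 + 1.2 * fp

# Example: a system counted at 500 function points
print(round(fp_to_cfsu(500), 1))
```

Note the negative intercept: for very small counts the formula yields negative cfsu values, another sign that it is an empirical fit valid only within the range of the original measurements.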
Abstract: The Expertise Centre Metrics of the Dutch Rabobank experienced problems in using Function Point Analysis for estimating development effort, especially on contemporary platforms. As a result, confidence in metrics decreased and management asked the Expertise Centre to look for a measurement method that could solve the problems. After a useful experiment with the Functional Size Reference Model, COSMIC Full Function Points was chosen as a method alongside Function Point Analysis. This paper shows the first results of the adoption of COSMIC Full Function Points as a sizing method next to (or replacing) Function Point Analysis.
Abstract: The article presents an overview of the subject of Software Quality Engineering (SQE) education. Four different perspectives are taken into account: why SQE should be taught, how the subject is being taught today, what support teachers have for teaching SQE, and how the software quality engineer could be educated. The latest trends in methods and tools pertinent to the domain are also presented.
Abstract: SWEBOK describes what knowledge a software engineer who has a Bachelor's degree and four years of experience should have. SEEK describes the knowledge to be taught in an undergraduate program in software engineering. Although different in scope and purpose, there are many similarities between the two, and after all, even experienced developers need an education, don't they? A full-day workshop on the alignment between SWEBOK and SEEK, held at STEP 2002, revealed a number of issues that received either a scant or a scattered treatment in either or both documents. These issues include: software architecture, software measurement, and software quality. In addition, topics of debate were whether or not user interface design should be considered part of software design, or rather deserves its own, separate treatment; and whether maintenance/evolution merits a separate discussion, or should rather be seen as the default mode of operation in software development. This paper elaborates the discussions of this workshop.
Abstract: The current methods of effort estimation frequently take only indirect account of the tasks of software performance engineering (SPE), and provide widely differing conclusions. In order to create transparency and acceptance for this task, which has been growing in importance for years in the context of the life cycle of IT systems, an approach (PRM, Performance Risk Model) is adopted that derives resource requirements from a corresponding risk analysis and, conversely, looks at the business system to be supported, the software development and the operational environment. After a short introduction to the current situation and a look at the PRM model itself, this paper describes first experiences from the use of the PRM model within 6 industrial projects.
Abstract: Research has been initiated to develop a cost model for an outsourcing company which is oriented toward the development and maintenance of management information systems for a large telecommunications company. A measurement process has been implemented to collect post-project data and to develop a cost model for estimation purposes. As an initial step, these data were analyzed using the SLIM metrics tool and the COSMIC FFP method in parallel for the measurement of size. In this paper, the cost models derived from the two methods are presented, compared and discussed for their validity. Results of a short-cut estimation method based on the COSMIC FFP measures are also included. Finally, research avenues emerging from this study are identified.
Abstract: In the area of software measurement, a diversity of methods exists to evaluate and measure characteristics of software products, processes and resources. In recent years, different points metrics, e.g. Function Points, Feature Points, Object Points and Full Function Points, were developed and introduced. With the help of these metrics, functional size measurement is possible, as well as early cost and effort estimation and metrics-based, process-conducting management activities. Because of the increasing importance of functional size measurement and the variety of points metrics, this paper gives an overview of existing approaches and discussions, and deals with opportunities and problems in this area. The study introduces selected functional size metrics and evaluates them with respect to their suitability for certain functional domains and their maturity. Furthermore, general problems of functional size measurement and the associated methods are discussed, and alternative approaches are presented.
Abstract: Function Point Analysis was invented by Allan Albrecht of IBM as a means of sizing business application software independently of the technology used for its development. Albrecht's method was heavily promoted in its early years and has become the most widely used such method. However, the underlying model which Albrecht used for his sizing method, valid in the mid 70's when it was first conceived, is increasingly difficult to apply to modern software development. This and other factors have led to a decline in the method's use. In this paper, we examine the reasons for the decline and the main advances in thinking on the more general topic of Functional Size Measurement (FSM) made in the last 15 years. Specifically, the COSMIC FFP method is seen as a significant step forward in being the first method designed by an international group of software metrics experts to work for both business application and real-time software. Furthermore, it has been realised that a reliable FSM method would be a very great asset with many uses, such as helping to improve requirements specification, estimating, controlling project 'scope creep', supplier performance measurement and contract control. The experience of the new methods and the realisation of their potential value indicate that a return to popularity of (modernised) Function Point Analysis, in the guise of more general FSM methods such as COSMIC FFP, is highly likely.
Abstract: This paper presents an application of software performance engineering to the area of agent-based systems. In the first part, we describe the general aspects and contents of multi-agent system (MAS) architectures. Then, a short presentation of the main software performance principles motivates the measurement approaches for MAS. Software agents can be intelligent as well as flexible. This affects the applicability of traditional methods of software performance evaluation and control based on classical approaches such as "a posteriori" or "fix it later" technologies. Our paper presents new approaches to performance measurement of agent-based systems, relating to software aglets and MAS based on the new MALINA concept.
Abstract: User/contractor negotiations when constructing a software application are not easy. Many factors need to be considered (cost, product delivery, technical issues, competition, risk, etc.). In this article, our focus is on the delivery of the product through the different phases of a project, using functional measurement. To follow a product through different phases, a measure that can cope with preliminary requirements must be used; functional measurement is therefore preferable to lines of code (1). But lines of code are not totally eliminated: some have used FPA (Function Point Analysis) to predict lines of code (1,2) or to predict FPA from lines of code (backfiring) (3). However, there has been very little study, to our knowledge, that analyses the performance of FPA as a predictor from the user requirements through to the implementation of the software (2). The following article relates our experiences measuring, at different project phases (early, intermediary and final), two software development projects using different methodologies for counting and validating function points. We found that it was necessary to consider different types of users when counting software, not only the end user: what one user could see as technical could be seen as a functional requirement by another. This challenges the common view about what a functional measure is measuring. In both projects, a report generator was seen as the solution for the many reports identified in the requirements. This resulted in a lower number of function points. Other, less obvious infrastructure processes were constructed to improve the quality of the product. We were not able to count those processes based on the actual FPA rules; they are considered technical. The final result for one project showed no more than a 6% difference in the number of function points between the early, intermediary and final counts.
What was surprising for the supplier was the fact that the number of function points went down. There were many disagreements between the client and the supplier about the extension of the requirements caused by the construction of infrastructure processes not considered by FPA. The intermediary result of the second project showed a 16% shortfall between the requirements count and the second count when considering only the number of function points. For this particular project, it was decided to look more closely at the infrastructure processes, considering different types of users. We used a measurement methodology that takes into account the different types of users (COSMIC-FFP). As a result, at the intermediary stage, the difference between the requirements count and the second count was 2%. Our conclusion is that, in order to control the cost of a software development project, the measurement methodology must consider the different types of users.
Abstract: This paper examines current approaches to usability metrics and proposes a new approach for quantifying software quality in use, based on modeling the dynamic relationships of the attributes that affect software usability. The Quality in Use Integrated Map (QUIM) is proposed for specifying and identifying quality-in-use components, bringing together different factors, criteria, metrics and data defined in different Human-Computer Interface and Software Engineering models. The Graphical Dynamic Quality Assessment (GDQA) model is used to analyse the interaction of these components within a systematic structure. The paper first introduces a new classification scheme in a graphical, logic-based framework using QUIM components (factors, criteria, metrics and data) to assess the quality in use of interactive systems. Then, we illustrate how QUIM and GDQA may be used to assess software usability using subjective measures of the quality characteristics defined in ISO/IEC 9126.
Abstract: One of a software development manager's important tasks is to decide how to modify the development process in the interests of greater efficiency. For example, some have suggested that the benefits of Fagan-style code inspection far outweigh its cost; a cautious manager will wish to verify this before authorizing a trial. This paper shows what to measure about the current development process and how to use these measurements to make a rational decision.
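The kind of rational decision the abstract describes can be illustrated with a back-of-the-envelope calculation; the function and the figures below are a hypothetical sketch of ours, not data from the paper:

```python
def inspection_net_saving(defects, inspection_yield, rework_cost_late,
                          rework_cost_early, inspection_effort):
    """Net person-hours saved by inspecting (negative means a net loss).

    defects: defects injected in the unit of work being considered.
    inspection_yield: fraction of those defects the inspection catches.
    rework costs: hours to fix a defect late (e.g. in test) vs. at inspection.
    inspection_effort: person-hours spent on the inspection itself.
    """
    defects_found = defects * inspection_yield
    return defects_found * (rework_cost_late - rework_cost_early) - inspection_effort

# Hypothetical figures a manager might collect before authorizing a trial:
# 25 defects per KLOC, 60% caught by inspection, 9 h to fix late vs. 1 h early,
# 20 person-hours of inspection effort per KLOC.
print(inspection_net_saving(25, 0.6, 9, 1, 20))  # 100.0 hours saved per KLOC
```

The same function shows when inspection does not pay: if late rework were only marginally costlier than early rework, the result turns negative and the cautious manager would decline the trial.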
Abstract: Following a CMM appraisal which identified the Requirements Management Process (RMP) as a target for improvement, CGI published a new requirements management policy. An action research study was then undertaken to find the root causes of some requirements-related problems and to design improvements to the process. Although development projects had been managed according to an internal development methodology for many years, the research concluded that the initial requirements for projects were too often incomplete, not validated by end-users, and not readily intelligible to them. The Requirements Engineering Process, and particularly the elicitation process, was at the source of the problems. To improve the elicitation process, requirements checklists and a user scenario template were designed and tested in a pilot project. These were then included in the formal methodology. With the weak parts of the methodology resolved, the real challenge of enforcing the requirements management policy now becomes the responsibility of project managers, with the support of high-level management for its institutionalisation.
Abstract: This paper evaluates the accuracy, precision and robustness of the paired comparisons method for software sizing and concludes that the results produced by it are superior to the so-called "expert" approaches.
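As a rough sketch of how a paired-comparisons sizing exercise works (our illustration, not the paper's exact procedure): judges estimate size ratios between pairs of items, the geometric mean of each row of the ratio matrix yields a relative size, and one item of known size anchors the scale.

```python
import math

def paired_comparison_sizes(ratio_matrix, reference_index, reference_size):
    """Derive absolute sizes from a matrix of judged size ratios.

    ratio_matrix[i][j] is the judged ratio size_i / size_j.
    One item of known size (the reference) anchors the relative scale.
    """
    n = len(ratio_matrix)
    # Geometric mean of each row gives a relative size for each item.
    relative = [math.prod(row) ** (1.0 / n) for row in ratio_matrix]
    scale = reference_size / relative[reference_index]
    return [r * scale for r in relative]

# Three modules; judges say module 0 is ~2x module 1 and ~4x module 2.
ratios = [
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
]
# Module 1 is a finished component known to measure 100 size units.
print(paired_comparison_sizes(ratios, reference_index=1, reference_size=100.0))
```

Because only ratios are judged, the technique sidesteps the anchoring problems of direct "expert" estimates, which is the comparison the paper evaluates.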
Abstract: There are already a number of studies and "success stories" about practical applications related to software reuse. For the most part, however, the actual benefits of reuse, particularly for concrete technologies, are difficult to verify. The SW-WiVe project, performed by Deutsche Telekom in collaboration with the Otto-von-Guericke University, provides a detailed analysis and offers strategies for software reuse within industrial software development that can be subjected to critical evaluation. Traditional evaluation approaches, such as reuse metrics, were critically studied, and the processes necessary for continuous reuse were developed for this purpose. In a further step, currently available, valid reuse metrics for the software development process were classified, and missing metrics-based evaluation approaches were identified. This paper focuses on a description of the project's metrics-oriented terms of reference.
Abstract: This paper presents an application of full function point analysis to the estimation of the size of real-time control software. The full function point counting technique is briefly described. Its usage is illustrated on a part of the Westinghouse Reactor Protection Control System and the results analyzed. We further describe a technique for the graphical representation of requirements that helps in the full function point assessment. Specifically, the technique is used for identifying the groups of data and processes as well as the application boundaries.
Abstract: Since its first publication by Albrecht, Function Point Analysis has been revised and modified several times. Today, a number of variants are in use, which differ in their respective views on functional size. Function Point Analysis relies implicitly on a model of software. We propose the Function Point Structure as a formalization of the software model of Function Point Analysis. The Function Point count is then defined as a function on the Function Point Structure. Function Point variants differ in their abstract models of software as well as in their measure functions; therefore, different formalizations of the Function Point Structure are required for each variant. We present here a generalized Function Point Structure for several data-oriented variants of Function Point Analysis. With this generalized Function Point Structure, we can analyze the empirical assumptions made by the FPA variants and their implications for the prediction of other variables. We can also study the differences between the views and assumptions of the variants.
Abstract: This paper describes the approach used by a defense contractor to address the people issues raised when developing and implementing engineering processes. First, a brief description of the context is presented; then, organizational mechanisms to better manage changes are described; finally, sixteen lessons learned are presented.
Abstract: There exist many methodology hints, software inspection and review handbooks, and metrics tools to support the application of software measurement throughout the software lifecycle. There is still, however, a widespread lack of confidence in the interpretation of metrics and their values, a concentration on the use of only a few metrics, and difficulty in processing large sets of measured values. The analysis of the different evaluation, counting and measurement results should be supported by a database technique to keep control of the software development process. This paper describes the different sources of metrics data and their effective handling. It concludes with a discussion of experiences with a prototype of a metrics database in an industrial setting.
Abstract: We define an intermediate representation of a program P as a data flow graph DF(P) and show that this representation allows us to express the program as a quadruple Q={F,T,r,w} useful in function point analysis. We show that we can derive DF(P) by program slicing, a form of static code analysis. Starting with one given output file, A say, we derive the smallest program A(P) that mimics P in its writing to A. When we repeat this process for all output files (A, B, C and D, say), we obtain a set of programs A(P), B(P), C(P) and D(P) in which any two are disjoint or identical. The number of unique such programs is the number of transactions, and straightforward analysis yields DF(P). We show that we can also derive DF(P) by program tracing, a form of dynamic code analysis. In this case, we can modify the program P being studied into a second program P' such that P' has the same behavior as P and generates a trace showing what it is doing. From the trace, we can automatically derive the intermediate representation DF(P). The modification of P to P' can be carried out automatically by methods of static code analysis now under development.
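A toy illustration of the slicing construction (our sketch, under a deliberately simplified program model: each statement is reduced to the variables it reads and writes, plus the output file, if any, it writes to):

```python
def backward_slice(statements, output_file):
    """Labels of the smallest sub-program that mimics the writes to output_file.

    statements: list of (label, vars_read, vars_written, output_file_or_None),
    given in execution order.
    """
    needed = set()   # variables whose definitions are still required
    sliced = []
    for label, reads, writes, out in reversed(statements):
        if out == output_file or (writes & needed):
            sliced.append(label)
            needed = (needed - writes) | reads
    return set(sliced)

def count_transactions(statements):
    """Number of distinct output slices, i.e. transactions in the sense above."""
    outputs = {out for *_, out in statements if out is not None}
    return len({frozenset(backward_slice(statements, f)) for f in outputs})

# x = input(); y = f(x); write y to A; write y to B
program = [
    ("s1", set(),  {"x"}, None),
    ("s2", {"x"},  {"y"}, None),
    ("s3", {"y"},  set(), "A"),
    ("s4", {"y"},  set(), "B"),
]
print(sorted(backward_slice(program, "A")))  # ['s1', 's2', 's3']
print(count_transactions(program))           # 2
```

Real slicing works on control and data dependences in actual code, not on hand-labelled read/write sets, but the fixed-point idea of walking backwards from the output statements is the same.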
Abstract: The purpose of this paper is to discuss a structured software development process based on a graphical method which relates functional requirements and software specifications to the detailed software design and implementation. The paper presents a systematic, logic-based method. The approach may be used to model the functional specifications of digital instrumentation and control systems used for safety purposes. The graphical representation of functional requirements is depicted through a multilevel hierarchical decomposition technique which allows one to (i) show functional interrelations between system and software, (ii) follow functional information from the system level to the software implementation (requirements, design, code), and (iii) address incomplete, inconsistent, and ambiguous requirements.
Abstract: During the last five years, Oerlikon Aerospace developed and implemented engineering processes. In this paper, we discuss the application of risk management to the re-engineering of operator console stations of a missile weapon system. We briefly describe the systems engineering process. Finally, twelve lessons learned are discussed.
Abstract: This paper describes the steps taken by our organization to develop, implement and integrate software engineering, systems engineering, supporting processes and project management process over a period of six years.
Abstract: Aware of the growing pace of adoption of new technologies of information and communication (NTIC) in many fields of activity, our mid-term research interest concerns the attention that educators should give to them, and particularly the type of return that could be expected and achieved in terms of student learning at the university level. This paper presents the progressive approach we followed to adopt a groupware system with the intent of supporting collaborative group working of students in a typical "project course", where group working is an essential condition of success. The first results indicate that interactions with professors have increased and that the context of use must really support interactivity to be effective.
Abstract: Information system architecture has been considered for some years as an essential part of the success of business computing systems. Formerly based on the modelling of data and functional decomposition, architecture should now evolve toward object-oriented techniques, where data and functions are encapsulated in the same class as an object. In this paper, we present an original and particularly effective approach to its construction: · the representation of a generic model of business processes based on existing processes; · the design of the domain object models by distinguishing actors, roles and objects involved in those processes; · implementation using different technological approaches. Such a construct is well suited to support corporate strategy and offers many benefits: · system consistency and reuse of components; · integration of processes and the systems that support them; · rapid development of new components when applying these models to other processes.
Abstract: During the last five years, our organization has developed and implemented software and systems engineering processes. In this paper, we briefly describe the systems engineering process development, its application to the re-engineering of two major components of an air defense missile system, and some lessons learned from the two projects.
Abstract: Enterprise-wide information system architecture has been considered for a few years to influence the success of the MIS function. Meanwhile, the emerging object-oriented (OO) techniques offer methods that allow organizations to: 1) develop faster and 2) distribute systems across internet and intranet networks. Today, component-based development (CBD), a natural evolution of OO development, appears to be the best development approach, mainly because of its capacity for reusability and, therefore, its potential for saving time and effort. The market for CBD tools and frameworks is rising at 80% per year. Initially based on data and function models, the architecture has to evolve towards OO techniques, thus expanding its role to serve as a foundation for component-based development. In this paper, we present an original and particularly effective method to build such an enterprise-wide information system architecture. The method encompasses: · adaptation of a generic process model, based on Porter's value chain, to the enterprise; · design of enterprise-component models by identifying actors, roles and objects involved in processes; · creation of a component library from these models. We describe how to build the enterprise-component models, starting from a business process model and representing it in a generic pattern; this pattern is then used at a higher level to identify and represent enterprise components. The method is illustrated step by step to obtain the high-level model, and an example of a typical enterprise component is given.
Such a method offers many advantages: · system consistency and reusability of components; · integration of processes and the systems that support them; · rapid development of new components when applying these models to other processes. A component library, the most important result of the method, can then be expanded in several ways, as more and more components become available from software or application vendors. The component models presented in this article result from work being done by Éric Lefebvre and Peter Coad in order to write a book entitled "Enterprise-Component Models", to be published by Prentice-Hall in spring 1999.
Abstract: A Year 2000 software factory has been established in Canada to re-engineer major systems of the Canadian Government, mainly real-time embedded systems. This paper is divided into three sections. In the first section, the elements of a software factory and its implementation in the Factory are described. In section two, the Year 2000 conversion process is described. Finally, lessons learned are presented.
Abstract: Function Point Analysis (FPA) is used by organisations worldwide as one of the measures used to establish the baseline size of their software assets in outsourcing contracts. This paper introduces new techniques, which enable all the functionality delivered and worked on by the supplier to be included in the productivity performance monitoring of these contracts. Typically only the business applications layer can be measured using FPA. The infrastructure software e.g. Utilities, device drivers and gateway applications, are usually overlooked because FPA is not designed to, nor easily adapted to, measuring internal layers of functions not delivered to the business user. This new Full Function Point Technique, developed by the University of Quebec in Montreal and SELAM, is a refinement of the FPA technique. It is no longer limited to only measuring MIS type applications but was specifically designed to meet the needs of organisations that build and support infrastructure applications, real-time and embedded software.
Abstract: In this paper, the Goal Tree Success Tree and Master Logic Diagram (GTST-MLD) is proposed to model the software development life cycle so as to ensure software quality based on meeting the criteria for high-integrity safety systems. The GTST-MLD-based software development life cycle framework allows one to (1) show how a local change affects other phases of development; (2) represent the software development life cycle hierarchically so as to identify missing and incomplete requirements; and (3) automate, expand and update the model easily on computers.
Abstract: In order to reduce cycle time, increase customer satisfaction and lower costs, Oerlikon Aerospace initiated, in 1992, a project to define and implement software and systems engineering processes. The initiative started with a formal assessment of current software engineering practices. An action plan was developed, and multi-functional working groups were tasked to define and facilitate the implementation of software processes. A second initiative was started, in 1995, with the objective of defining and implementing a systems engineering process and integrating into it the software engineering process already in use.
Abstract: This paper is divided into three parts. The first part presents the Applied Software Engineering Centre: its history, its mission, and the services offered. The second part presents a brief profile of organisations that have undertaken to improve software processes, utilising mainly the Capability Maturity Model developed by the Carnegie Mellon University Software Engineering Institute. The third part presents lessons learned in process improvement. This paper is an update of a presentation given on the occasion of a workshop held at GMD, a German software research centre (Laporte 1993, 1995, 1996a).
Abstract: In order to reduce cycle time, increase customer satisfaction and lower costs, Oerlikon Aerospace initiated, three years ago, a project to define and implement software and systems engineering processes. The initiative started with a formal assessment of current software engineering practices. An action plan was developed, and multi-functional working groups were tasked to define and facilitate the implementation of software processes. A second initiative was started a year ago with the objective of defining and implementing a systems engineering process and integrating into it the software engineering process already in use.
Abstract: The management of change is a key element of a successful process improvement program. Based on the experience described in this position paper, process improvement activities, i.e. on-site assessment, action plan elaboration and action plan implementation, can be facilitated by carefully managing the human issues of a major change program.
Abstract: In the software engineering literature, there are numerous maturity models for assessing and evaluating a set of software processes. By contrast, there is no corresponding maturity model for assessing the quality of a software product. The design of such a model therefore represents a new research challenge in software engineering. Our main goal is to make available to industry (and consumers) a maturity model for assessing and improving the quality of the software product. This Software Product Quality Maturity Model (SPQMM) consists of three quality maturity submodels (viewpoints) that can be used not only once the software product has been delivered, but also throughout the life cycle: the Software Product Internal Quality Maturity Model (SPIQMM), the Software Product External Quality Maturity Model (SPEQMM), and the Software Product Quality-in-Use Maturity Model (SPQiUMM). In this thesis, we introduce the SPQMM, which can be used from three different viewpoints: the software product's internal quality, its external quality, and its quality in use. This quality maturity model is a quantitative model based on ISO 9126 (software product quality measures), ISO 15026 (software integrity levels), IEEE Std. 1012 (software verification and validation) and six-sigma concepts. To build such a quality maturity model, we combined the set of quality measures into a single number for each quality characteristic by assuming that all the measures for a single quality characteristic have an equal weight in the computation (they all make an equal contribution), yielding a quality level for that characteristic. The resulting quality level is then transformed, based on the software integrity level, into a sigma value positioned within a quality maturity level.
Notes: Type of Work: Ph. D. Thesis, Research Notes: 363
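The equal-weight combination and the sigma transformation described in the abstract can be sketched as follows; the inverse-normal mapping with the conventional 1.5-sigma shift is our assumption about the six-sigma step, not necessarily the thesis's exact formula:

```python
from statistics import NormalDist

def quality_level(measures):
    """Equal-weight combination of a characteristic's quality measures (each in [0, 1])."""
    return sum(measures) / len(measures)

def sigma_value(level, shift=1.5):
    """Map a quality level, read as a yield, to a six-sigma value.

    Uses the standard normal quantile plus the conventional 1.5-sigma shift
    (an assumption for illustration, not the thesis's published mapping).
    """
    return NormalDist().inv_cdf(level) + shift

# Hypothetical ISO 9126 measures for one characteristic, e.g. reliability:
measures = [0.97, 0.99, 0.95]
level = quality_level(measures)
print(round(level, 2), round(sigma_value(level), 2))
```

The sigma value would then be positioned within a maturity level, with the required threshold raised or lowered according to the software integrity level.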
Abstract: In spite of the fact that neural networks have been applied successfully in static pattern recognition, their applicability to the processing or recognition of dynamic, non-stationary patterns (where time is an essential component) is still a great challenge and remains unsolved. Recent work in neurophysiology indicates that complex behaviors such as chaos, grouped synchronization, and spiking synchronization of certain cerebral zones could intervene in the processes of memorization and perception. Moreover, mathematical tools related to the study of non-linear dynamics allow us to widen our comprehension of certain neuronal mechanisms. These recent concepts open a new way in the development of neural networks. Rather than solving a given problem or improving a known technique, the fundamental idea of this thesis is to develop and test a neural network architecture based on new principles inspired by chaotic dynamics and by observations in neurophysiology. The goal of the research presented here is thus to propose and study an innovative neural network architecture that allows the treatment of temporal information in non-stationary spatio-temporal processes. This work is multidisciplinary in character and uses the following available tools: simulators of artificial neural networks and simulators of the auditory peripheral system. The contributions of this thesis cover two aspects: - on the aspect of innovation, we propose a new neural network architecture inspired by layer IV of the cortex. This architecture has characteristics that are appropriate to the processing and recognition of non-stationary dynamic processes.
- on the aspect of applications, we have experimentally shown that the proposed network is able to process and recognize non-stationary spatio-temporal processes through tasks such as recognizing noisy digits, processing temporal sequences, detecting movement in sequences of images, processing envelopes obtained from a bank of cochlear filters, and realizing a prototype speaker recognition system.
Notes: Type of Work: Ph.D., 19990429, Research Notes: 604
Abstract: We propose a formalization of the COSMIC Full Function Point (COSMIC-FFP) measure for the Real-time Object Oriented Modeling (ROOM) language. COSMIC-FFP is a measure of functional size. It was proposed by the COSMIC group as an adaptation of the function point measure for real-time systems. The COSMIC-FFP definition is general and can be applied to any specification language; we propose a formalization of this definition for the ROOM language, which is now widely used for constructing real-time systems. The benefits of our formalization are twofold. First, it eliminates measurement variance: the informal COSMIC-FFP definition is subject to interpretation by raters, which may lead to different measurements for the same specification, depending on the interpretation made by each rater. Second, it allows the automation of COSMIC-FFP measurement for ROOM specifications, which reduces measurement costs. Finally, the formal definition of COSMIC-FFP can provide a clear and unambiguous characterization of COSMIC-FFP concepts, which is helpful for applying COSMIC-FFP measurement to other object-oriented notations such as UML.
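The counting rule being formalized is simple at its core: one COSMIC functional size unit per data movement (Entry, Exit, Read, Write) in each functional process. A minimal sketch of that rule, with a hypothetical alarm-handling example (the process and movement names are ours, not from the paper):

```python
# COSMIC-FFP assigns one size unit per data movement of these four types.
MOVEMENT_TYPES = {"Entry", "Exit", "Read", "Write"}

def cosmic_ffp_size(functional_processes):
    """Sum the data movements of every functional process, one unit each.

    functional_processes maps a process name to a list of
    (movement_type, data_group) pairs.
    """
    total = 0
    for process, movements in functional_processes.items():
        unknown = [m for m, _ in movements if m not in MOVEMENT_TYPES]
        if unknown:
            raise ValueError(f"unknown movement types in {process}: {unknown}")
        total += len(movements)
    return total

# Hypothetical ROOM-style actor handling a temperature alarm.
processes = {
    "monitor_temperature": [
        ("Entry", "sensor reading"),
        ("Read", "threshold configuration"),
        ("Exit", "alarm message"),
        ("Write", "event log"),
    ],
    "update_threshold": [
        ("Entry", "new threshold"),
        ("Write", "threshold configuration"),
    ],
}
print(cosmic_ffp_size(processes))  # 6
```

What the paper's formalization adds is the hard part this sketch assumes away: deciding, from a ROOM specification and without rater judgment, what the functional processes and data movements actually are.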
Abstract: Statistically Planned Incremental Deliveries (SPID) is a practical approach to planning and executing projects that need to meet hard deadlines. SPID's objective is to guarantee that at least a minimum functionality is delivered by a required date.
Abstract: The management of software cost, development effort and project planning are key aspects of software development. Software size is a critical element in these measurement requirements. Various approaches to the measurement of software size have been formulated, among others the number of lines of source code. Functional size measurement methods have been proposed to overcome some of the deficiencies of approaches based on source code; their goal is to measure the functionality of the software, independently of its implementation. We present here a case study that demonstrates the application of five current functional size measurement methods. The case in this study is formed by the functional user requirements for a portfolio of related applications for the management of warehouses. For each of the five methods, we present an evaluation of these functional user requirements. The selected functional size measurement methods are IFPUG Function Point Analysis releases 4.0 and 4.1, Mark II Function Point Analysis, and the Full Function Points approach, versions 1.0 and 2.0. The study illustrates the core concepts common to these methods and the differences in their detailed measurement processes.
Notes: Type of Work: Technical, 20000704, Research Notes: 771
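As an illustration of the core concept the compared methods share, here is an unadjusted IFPUG-style count. The weight table reflects the standard IFPUG complexity weights; the warehouse example is our own hypothetical slice, not the case study's data:

```python
# Unadjusted function point weights per IFPUG Function Point Analysis.
WEIGHTS = {
    "EI":  {"low": 3, "average": 4,  "high": 6},   # external inputs
    "EO":  {"low": 4, "average": 5,  "high": 7},   # external outputs
    "EQ":  {"low": 3, "average": 4,  "high": 6},   # external inquiries
    "ILF": {"low": 7, "average": 10, "high": 15},  # internal logical files
    "EIF": {"low": 5, "average": 7,  "high": 10},  # external interface files
}

def unadjusted_fp(counts):
    """Sum the weights of identified functions; counts is a list of
    (function_type, complexity) pairs."""
    return sum(WEIGHTS[ftype][complexity] for ftype, complexity in counts)

# Hypothetical warehouse-management slice: two inputs, one report,
# one inquiry, one internal file, one referenced external file.
inventory = [
    ("EI", "average"), ("EI", "low"),
    ("EO", "high"),
    ("EQ", "low"),
    ("ILF", "average"),
    ("EIF", "low"),
]
print(unadjusted_fp(inventory))  # 32
```

The five methods in the case study diverge precisely in what they count and weigh: Mark II replaces the five function types with transaction inputs/processing/outputs, and Full Function Points adds sub-process types for real-time software, which is why the same requirements yield different sizes.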
Abstract: This paper presents basic methods for analyzing the properties of object-oriented software metrics. The metrics are characterized by several concatenation operations at different levels of abstraction. Metrics can thereby be interpreted above the ordinal scale level. The result of this investigation is that a large set of object-oriented metrics have properties that are completely different from those of metrics for procedural languages. This set of metrics follows the Dempster-Shafer measure of belief.