The Future of Software Engineering
October 2010

Publisher: Springer-Verlag, Berlin, Heidelberg
ISBN: 978-3-642-15186-6
Published: 21 October 2010
Pages: 186
Abstract

This book focuses on defining the achievements of software engineering in the past decades and showcasing visions for the future. It features a collection of articles by some of the most prominent researchers and technologists who have shaped the field: Barry Boehm, Manfred Broy, Patrick Cousot, Erich Gamma, Yuri Gurevich, Tony Hoare, Michael A. Jackson, Rustan Leino, David L. Parnas, Dieter Rombach, Joseph Sifakis, Niklaus Wirth, Pamela Zave, and Andreas Zeller. The contributed articles reflect the authors' individual views on what constitutes the most important issues facing software development. Both research- and technology-oriented contributions are included. The book also provides a record of a symposium held at ETH Zurich on the occasion of Bertrand Meyer's 60th birthday.

Contributors
  • Swiss Federal Institute of Technology, Zurich


    Reviews

    Haim I. Kilov

Speaking of the first NATO Software Engineering Conference in 1968, Brian Randell says that "it was fully accepted that the term software engineering expressed a need rather than a reality. ... Many people started to use the term to describe their work, to my mind often with very little justification" [1]. Now, when discussing the future of software engineering as in this book (a collection of papers and abstracts from a symposium held on the occasion of the 60th birthday of Bertrand Meyer, a great software engineer), we may consider the current state of the art. There are (relative) optimists (some contributors to this book); there are pessimists (none in this book, but see [2]); and there are realists such as Michael Jackson, who stresses in his outstanding paper that "the phrase 'software engineering' [after 40 years of currency] still denotes no more than a vague and largely unfulfilled aspiration."

The contributions are ordered alphabetically by the last names of the contributors (with the exception of the afterword by Tony Hoare). The most interesting ones are noted here.

Boehm's survey paper opens the book and is a very good place to start. Boehm describes eight unsurprising trends and two "wild-card" trends in the future of software engineering. He is quite emphatic about dependability not being the top priority for software producers, and predicts that between now and 2025 "a major software-induced systems catastrophe similar in impact to world consciousness to the 9/11 World Trade Center catastrophe ... is highly likely." Boehm and some other contributors make a very important observation about the dangers of reductionism, that is, considering software independently of the business systems for which it ought to be used. Boehm is properly skeptical about the increasing need for businesses to accommodate commercial off-the-shelf (COTS) software (often "opaque and hard to debug") and software services. He notes that in COTS or service-based solutions, "the COTS or service capabilities largely determine the requirements"; however, he is not explicit about the consequences for businesses using information technology (IT) systems that are not what the businesses want them to be and that, as a result, often impose on a business unnatural, robotlike, and sometimes unclear rules and processes. Such systems are described in the context of "professionalization" in [3]: "Encoding ... beliefs [about the best ways for people in organizations to do things] in software often imbues them with the force of law. ... Business rules are increasingly captured in black-box analytic engines that few, if any, people can understand or override. As one CIO recently asserted: 'There are no more business processes any more; the system is the business process.' Whereas formerly the systems that codified management ideas were 'home-grown' and unique to the cultures and practices of individual organizations, they are increasingly 'off-the-shelf' packages or services that themselves have the aura of 'best practice' standards." Later, Boehm refers to relatively unpredictable change, but mostly in the context of very large systems "with deep supplier hierarchies (often 6 to 12 levels)," in which drastic changes may destabilize the development process. However, changes are not necessarily incremental, and Boehm notes serious problems in using the purchasing agent metaphor in large system acquisition because the business world is open rather than closed (the term "open systems" is, regrettably, not used).

Broy properly observes that many development principles and rules of best practice have been around for decades, often since the 1968 NATO conference, but have not been consistently applied. For example, the importance and role of (business) domain modeling was noted by Broy only in passing, as a research topic and in his reference to "Gedankenmodell." In addition, while the need for understandability was mentioned in several papers, the dangers of the illusion of understandability when using box-and-line diagrams with unspecified semantics were not. Broy emphasizes the need for precise definitions of concepts and terms in engineering, and notes such counterexamples as "function," "feature," and "service," which are frequently used without a proper definition. It is doubtful, however, that "starting from use cases," as Broy recommends, is the best way to describe "interactions and dependencies between the different subfunctionalities": experience suggests that this approach often leads to problems, as a result of which "a lot of time is wasted in confusing discussions." Starting from business domain models was recommended by Bjørner [4] (not mentioned by any of the contributors) for a good reason!

In their very well-written and interesting paper, Blass and others discuss evidential authorization in a distributed environment (without intermediaries or a central engine), using the clinical trial environment as an example. Their paper is an excellent example of providing the precise definitions of concepts and terms advocated by Broy. At the same time, the authors acknowledge that their language has important semantic problems, such as the existence of homonyms in communications, and that it can lead to misunderstood messages (this is certainly due to the absence of a common business domain model). Of course, the problem of syntactic versus semantic interoperability is rather well known.

In his outstanding paper, Jackson emphasizes that the purpose of a computer-based system is firmly located in the problem world and that, as a result, "software engineers must be intimately concerned with the problem world," which is "not reducible to physics or mathematics" (compare [5]). He warns about the need for an "intuitive grasp of how [a human] purpose can be achieved by the designed contrivance" (for example, a software system). He further notes that "the problem domains effectively act as shared variables, introducing additional interaction paths between software components," leading us to the theory of complex phenomena [5], also not mentioned in the book. Furthermore, following Vincenti, Jackson clearly distinguishes between normal and radical design, and observes that there are almost no published specific artifacts serving as examples of normal design of software systems. He stresses the need for an intelligible structure of a program, but notes "that the emphasis on human understanding has gradually faded" and that, while "formal techniques are essential, they must be deployed within humanly intelligible structures."

Leino (and Wirth in his abstract) notes the inadequacy (especially the low abstraction level) of popular programming languages and the need for behavioral abstraction in a language, of which Eiffel's contracts are obviously a well-known example. Parnas, in his excellent and strongly written paper on precise documentation, stresses the need for software documentation, "as in other fields of engineering," to be a forethought rather than an afterthought. He clearly differentiates between software development and programming (as Gerald Weinberg does in [6] between professional and amateur programming; this distinction is not mentioned in the book) and states that "using mathematics in documentation is the only way to produce good documents," noting that tabular (modular) representations have been and may be used to "make these mathematical expressions easier to read, review, and write." He further states that "no user-visible decisions should be left to the programmers because programmers [...] often do not have expertise in an application or knowledge of the needs of users"; regrettably, quite a few software packages, the use of which demands that businesses behave as the system requires, are instantiations of this pattern.

In her paper on the evolution of the Internet, Zave stresses that some principles well understood by software engineers "appear to be unfamiliar to the networking community and/or at least to large segments of it." She implicitly supports Dijkstra's viewpoint that programmers are best qualified to take up abstraction challenges, and Bjørner's viewpoint that software engineers are best qualified to take up business (domain) modeling. She emphasizes that complexity matters, that concerns should be separated, that structures should be simple and general, and that there is a need to think compositionally ("there is no such thing as a unique element"). She proposes a pattern for network architecture based on Day's concept of an overlay as a type of architectural unit that has a clear structure with appropriately defined abstraction levels. Zave advocates a top-down rather than a bottom-up (resource-centric) viewpoint, and shows an example of serious problems with networking software: the application protocol SIP, the dominant protocol for Internet protocol (IP)-based multimedia applications, is "defined by 142 standards documents totaling many thousands of pages."

Zeller's paper on specification mining describes first steps and (semi-)automatic approaches to extracting specifications from existing software systems for their use and reuse.

The book does not have an index, although it certainly would be interesting and instructive to compare the different authors' approaches (even the implicit ones) to, for example, reductionism.

