Informatica: A Journal of Computing and Informatics
The Slovene Society INFORMATIKA, Ljubljana

Subscription Information. Informatica (YU ISSN 0350-5596) is published four times a year, in Winter, Spring, Summer and Autumn (4 issues). The subscription price for 1990 (Volume 14) is US$ 30 for companies and US$ 15 for individuals. Claims for missing issues will be honoured free of charge within six months after the publication date of the issue. Printed by Tiskarna Kresija.

Information for subscribers (translated from Slovene). Informatica (YU ISSN 0350-5596) appears four times a year, at the beginning of January, April, July and October. The annual subscription for 1990 (Volume 14) is set with reference to the exchange rate of the domestic currency and amounts to approximately DEM 16 for companies, DEM 8 for individuals, DEM 4 for students, and DEM 5 per single issue. Giro account number: 50101-678-51841. Claims for a missing issue are honoured free of charge within six months of publication. Printed by Tiskarna Kresija, Ljubljana. On the basis of opinion No. 23-85 of the Republic Committee for Information, dated 29 January 1986, the journal Informatica is exempt from the basic turnover tax on products. The publication of Informatica is supported by the Research Community of Slovenia.

EDITOR-IN-CHIEF: Anton P. Železnikar, Iskra Delta Computers, Ljubljana
ASSOCIATE EDITOR: Rudolf Murn, Jožef Stefan Institute, Ljubljana

Volume 14, Number 1, January 1990, YU ISSN 0350-5596

CONTENTS (VSEBINA)

- Giving Priority to Informational Technology? (p. 1)
- Zen and the Art of Modular Engineering (p. 2)
- An Introduction to Informational Algebra (p. 7)
- Modula-2 and Software Engineering (p. 29)
- Suitability of 'Case' Methods and Tools for Computer Control Systems (p. 38)
- Trends of Computer Progress (p. 45)
- Detection of the Intersection of Two Simple Polyhedra (p. 49)
- Konveksni optimizacijski problemi z linearnimi omejitvami (Convex Optimization Problems with Linear Constraints) (p. 55)
- Dijagram toka podataka za proces projektiranja i uvodjenja informacijskih sistema (A Data-Flow Diagram for the Process of Designing and Introducing Information Systems) (p. 61)
- Potpune definicije sintakse naredbi Pascal jezika i poziva standardnih procedura Read i Write (Complete Definitions of the Syntax of Pascal Statements and of Calls to the Standard Procedures Read and Write) (p. 70)
- Some Experiences in Teaching the Programming Practicum for Undergraduate and Graduate Students (p. 74)
- Možnost implementacije zanesljive baze podatkov v MS-DOS okolju (The Possibility of Implementing a Reliable Database in an MS-DOS Environment) (p. 77)
- Nekaj praktičnih vidikov uporabe Case orodij (Some Practical Aspects of Using CASE Tools) (p. 83)
- Novice in zanimivosti (News and Items of Interest) (p. 90)

Contributors to this issue: A. P. Železnikar, B. R. Kirk, G. Pomberger, J. Čemetič, M. Gams, Karmen Žitnik, D. Surla, Z. Budimac, Tjaša Meško, E. Drandič, Mirjana Ivanovič, D. Bonačič, I. Pepelnjak, J. Virant, N. Zimic, M. Rihar.

GIVING PRIORITY TO INFORMATIONAL TECHNOLOGY?

Anton P. Železnikar, Iskra Delta

Keywords: computer technology, informational technology, informational language

Summary. This article discusses the necessity and reasonableness of a possible informational technology as opposed to traditional computational technology. This orientation also calls for a new, more flexible, i.e. informational, language.

In his preface article, published in New Generation Computing 6, 358-360, Toshio Yokoi, from the Japan Electronic Dictionary Research Institute (the group from ICOT), puts the question whether to give priority to information-oriented technology over computer-oriented technology. He argues that priorities in information-processing technology are changing rapidly from computer to information orientation.
Although he still 'thinks' predominantly in terms of the traditional (e.g., Shannonian, metric) encapsulation of the term information, he cannot, in fact, avoid (or conceal) an awareness of the difference existing between real (living, autopoietic, anthropological) information (as it comes into existence, for instance, in the use of a dictionary) and the so-called computer-produced or data-structured 'information'. This awareness evidently mirrors the difference he observes between the two technologies, where the informational reminds us of mind (or information processing), and the computational of brain (or machine, substance), as structured and organized concepts. The increasing interest in artificial intelligence (and not only in knowledge-information processing) and in the man-machine interface points in a direction which can be understood as informational. The consequence of this awareness is that the emphasis is already shifting from system (machine) software to application (practical information) software, where users are beginning to play their own productive role in creating their computer systems. Computing is becoming communicational, and modern communication has to become more and more computational. 'Communication and information systems' is becoming a common and indivisible term. By virtue of such development, computing with communication is becoming informational, not only in the sense of the traditional (hard, mathematical) sciences, but more and more in the sense of what living beings understand (or think of) as information, as the informationally arising phenomenology of information. Yokoi argues that the creation of information-oriented technology implies the arising of information-oriented theory and basic technology. And he says that there have been various discussions on what this common base should be.
It should be a language which encompasses both natural language and artificial language, where the latter includes logical formulas, algebraic formulas, programming languages, graphic forms, etc. In his articles concerning informational logic, algebra, and discourse, the author has shown one possibility of how the way to the so-called informational philosophy and theory could be traced, and how this philosophy and theory could impact the fields of natural-language theory, artificial-language theory, the language theory integrating the two, and language-processing technology (as stressed by Yokoi). How could informational logic and informational algebra be used by an Electronic Dictionary Project?

References

Yokoi, T., Giving Priority to "Information-Oriented Technology" over "Computer-Oriented Technology", New Generation Computing 6 (1989) 358-360.

Železnikar, A. P., Informational Logic I, II, III, IV, Informatica 12 (1988) 3, 26-38; 4, 3-20; Informatica 13 (1989) 1, 25-42; 2, 6-23.

Železnikar, A. P., An Informational Theory of Discourse I, Informatica 13 (1989) 4, 16-37.

Železnikar, A. P., An Introduction to Informational Algebra, Informatica 14 (1990) 1, 7-28.

ZEN AND THE ART OF MODULAR ENGINEERING

Brian R. Kirk MSc MBCS, Robinson Associates, United Kingdom

Keywords: modular engineering, modularization, software

Presented at the 1st Int'l Modula-2 Conference, October 12-13, 1989, Bled, Yugoslavia

INTRODUCTION

As time passes, computing components become an integral part of more and larger systems. The diagram shows our dilemma: Scope, Comprehension, Extension. The scope of what is requested seems to be limited only by what can be imagined. Somehow, as designers, we must gain comprehension of all the implications of the whole system in all its states. And, most taxing of all, we must provide accurate solutions that support extensions to match the requirements as they evolve.
This paper offers an approach to coping with the dilemma; the title encapsulates the concepts:

- Zen: looking inside in a search for understanding
- Art: a fine skill
- Modular: separate parts designed to be cohesive
- Engineering: designing and building practical machines

A real product is used as an example of how modules, and machines made from modules, can provide reuse and extension of existing software, even when there are difficult constraints on the implementation. The objective of the paper is to pass on the experience learned whilst engineering a large real-time software system; in particular, the approaches used to divide and conquer the complexity and inherent concurrency may be of interest to implementors of high-integrity systems. In all cases the pragmatic approach to finding practical solutions is described. The language Modula-2 has been used as the programming notation. It is now hard to conceive or believe that such a large system could have been created so effectively with any other available language, particularly in a form that can be understood and extended with ease.

THE PRODUCT

Often it is necessary to update an existing product and give it a new image. In our case the requirement was to take a paper-tape based multi-axis machine tool and to match it to the current marketplace. The extensions included CAD, graphics, a file system, a printer, and remote-controlled operation, with all interaction with the user in any of English, French, German, Italian or Russian; see Figure 1. The form of any design is a product of its designers' interpretation of its requirements and constraints. In this case the constraints were formidable:

1. the need to support all existing functionality
2. the impossibility of all but minor modification to existing software (some sources were lost!)
3. the need for real-time response on the display, CNC, remote link and language translation
4. the need to interact in ad-hoc ways with 3 existing computers
5. only having a RAM memory of one third the size of the whole program
6. the need to make all the new software resilient to power failure, for continuous operation
7. the Client's prior choice of DOS and GEM for filing, graphics and multi-tasking

The completed software is large; it contains:

- 3 programs with 15 overlays
- 150 modules
- 2000 messages, each in 5 languages
- 2 Mbytes of executable code
- 30 Mbytes of source code

It was developed by a team of 6 people over a period of 2 years. Had we realised initially the full scope of the requirements and the implications of the constraints, we might never have started. Only the rigorous use of modular engineering concepts, and carefully coordinated implementation in Modula-2 by a team of professional software engineers, made the whole project feasible.

MODULAR ENGINEERING

Engineers analyse problems using concepts and then synthesize their solution by organising some physical form, in this case the software part of the system. The diagram shows the main criteria: Abstraction, Mechanism, Quality. The abstractions we use to analyse and model the problem have evolved over the past 40 years of computing; they include:

- Names: for instructions, data and locations
- Macros: to encapsulate and reuse the text of sequences of instructions or data
- Procedures: to encapsulate and reuse sequences of instructions at runtime
- Control Structures: to encapsulate the flow of control
- Classes: to encapsulate evolutionary definitions in a reusable and extensible way
- Modules: to encapsulate whole components, hiding information and/or ownership
- Extensible Modules: to encapsulate objects which have statically related definitions
- Delegating Objects: to encapsulate objects which are dynamically related and extensible. An object which cannot provide a requested method delegates it to another object which can.
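As a minimal sketch of the 'Modules' abstraction above, in the paper's own notation, Modula-2 (the module name, type and operations here are illustrative, not taken from the product described): a definition module publishes only the operations, while the representation stays hidden in the implementation module.

```modula-2
DEFINITION MODULE EventQueue;
  (* Illustrative only: encapsulates a queue of events.        *)
  (* Clients see these declarations; the queue representation  *)
  (* lives privately in the implementation module.             *)

  TYPE Event = RECORD
    code : CARDINAL;   (* what happened *)
    data : INTEGER;    (* associated value *)
  END;

  PROCEDURE Put (e : Event);          (* append an event *)
  PROCEDURE Get (VAR e : Event);      (* remove the oldest event *)
  PROCEDURE Pending () : BOOLEAN;     (* TRUE if any event is queued *)

END EventQueue.
```

Because clients import only these names, the implementation (array, linked list, ring buffer) can be replaced without touching any client source, which is precisely the information hiding the list above attributes to modules.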
Languages provide a means to express solutions to problems in terms of these abstractions, for example Assembler, Algol, Simula, Modula-2, Oberon and Delegate. The trend in abstraction is towards an object-oriented approach, because this minimises the distance between the problem and its programmed solution: "the solution is a simulation of the problem". In practice we have found Modula-2 an adequate language for expressing both modules and delegating objects, which are message-driven tasks consisting of modules. The mechanisms are simply ways of achieving something. For example, in Figure 1, modules M5 and M6 provide an interface between various tasks in the two processors. In our first implementation M5 replaced the old graphics card driver and sent equivalent messages to M6. This made it possible to reuse the vast majority of the original software with minimal changes. Of course M6 completely hid the protocol, and a rather nasty dual-port RAM interface, from all the new software.

FIGURE 1: New product with graphics and CAD (showing the local user, remote user, printer, and the original product with "buttons and lights").

This technique was much too slow in practice and was later replaced by a set of records and update flags at agreed fixed positions in the shared memory. By using modules on each side to encapsulate the mechanisms, it became possible to change the mechanism separately from the rest of the system. We found that a good test for the quality of a module's interface was to consider how much it would need to change if the mechanism it encapsulated, but not necessarily the functionality, had to change. The quality of the implementation is the third main factor. Engineers differ from computer scientists in that they are faced with many practical constraints and exceptions, yet their solution must be effective in actual use. For example, the machine tool can cut diamonds, and diamonds are valuable.
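A hedged sketch of how such a mechanism swap can stay behind one interface, again in Modula-2; the names and record fields are invented for illustration and are not the actual M5/M6 interfaces. The point is that the definition module is unchanged whether the implementation sends messages over the old protocol or writes records and update flags at fixed positions in the dual-port RAM.

```modula-2
DEFINITION MODULE SharedLink;
  (* Illustrative interface between the two processors.  The    *)
  (* implementation behind it may exchange messages (the first, *)
  (* slow mechanism) or poll records and update flags at agreed *)
  (* fixed addresses in shared memory (the faster mechanism);   *)
  (* clients cannot tell the difference.                        *)

  TYPE Status = RECORD
    axis    : ARRAY [0 .. 3] OF INTEGER;  (* axis positions  *)
    running : BOOLEAN;                    (* machine active? *)
  END;

  PROCEDURE Publish (VAR s : Status);  (* make status visible to the peer *)
  PROCEDURE Fetch   (VAR s : Status);  (* copy the peer's latest status   *)

END SharedLink.
```

Swapping the mechanism then means rewriting only the implementation module on each side; the interface-quality test described in the text is passed when a definition module like this one needs no edits at all.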
The clients are not impressed by large diamonds that unfortunately have the wrong shape due to software errors. About 15% of the modules we wrote were test-harness modules, which either exercised the modules under test or acted as dummy modules for uncompleted parts of the system. Sometimes we wrote modules to provide rough prototypes of parts of the system that were poorly specified or particularly difficult to achieve. By isolating these areas, adequate solutions were found quickly and the risk to the whole system was minimised. Sometimes the structure or quality of existing software was too risky to incorporate into the product. In these cases we 'reverse engineered' the software. This involved analysing the code to discover what the intended requirements were; we then made the requirements self-consistent. The software was then redesigned in line with the system model, mechanisms and modules. This approach provides clean, maintainable software rather than horribly bodged, incongruous coding; it also takes less effort. The system was constructed as a 'pile of machines' implemented with programs, processes and modules, always striving to verify that the partially complete system had 100% correct functionality within itself. This policy of stepwise construction of the system provided visibility of progress, a practical means to assess quality, and the confidence of our Clients.

CRITERIA FOR MODULARISATION

The main reason that we partition systems into subsystems and modules is to encapsulate our comprehension and thus extend our capabilities. This is achieved by using abstraction to separate out distinct parts of the problem. These abstractions are then implemented by building logical machines on top of physical ones, to mechanise the abstraction in a form, and at a cost, which is appropriate to the user.
In the past the criteria for modularisation were influenced by the 'everything is a hierarchy' view of programming; latterly the use of 'information hiding' as a criterion has been much more useful. What is really needed is a set of criteria that maximise the separation of:

- representation of objects
- relationships, both static and dynamic
- mechanisms

At the same time we wish to optimise:

- ease of comprehension
- ease of development by teams
- flexibility for extension possibilities

Perhaps the fundamental criterion is that each separate part, be it active object or component module, should be testable. If it is not certain that something can be tested before it is built, then there is little point in building it, because there is no possibility for quality assessment or control. Looking back on our projects we can now see the actual criteria that have been most effective; they include encapsulation of reuse, adaption, concurrency, consistency and mechanisms.

REUSE

It is a fact of life that reuse of what already exists is often essential. Usually the reason is short-term economic optimisation (this is rarely justified in practice!), but sometimes it is just not possible to reimplement old parts of a new system because there is not enough time or the knowledge is no longer available. In any case, if a product is still 'alive' it will certainly need to be extended to match its behaviour and performance to the evolving needs of its users. This needs to be done with minimal modification of existing parts, but the aim is to inherit the functionality and system model from the existing system. Unfortunately the mistakes and constraints are also inherited; these are the main disadvantages of standardization. There are some classic examples of reuse in Figure 1. The whole of the old machine software is reused except for two modules that provide a new interface to the software extensions, e.g. module M5. More typical ones are M1 and M2, which provide 'cleaned up' interfaces to DOS and GEM.
Indeed GEM provides both graphics and multiprogramming scheduling machines built on top of DOS. The GEM constraints of supporting only 4 programs (not tasks) and of round-robin scheduling were inherited by the system and distorted its form, reducing productivity.

ADAPTION

When creating large systems it always pays to make the software part as portable as possible. Conventionally this is done by providing 'device driver' modules which abstract away particular physical characteristics at the lowest level and offer a clean logical software interface instead. The client modules then use the clean interface so that