40 results for "Legacy object oriented code"
Abstract:
The demands of developing modern, highly dynamic applications have led to an increasing interest in dynamic programming languages and mechanisms. Not only must applications evolve over time, but the object models themselves may need to be adapted to the requirements of different run-time contexts. Class-based models and prototype-based models, for example, may need to co-exist to meet the demands of dynamically evolving applications. Multi-dimensional dispatch, fine-grained and dynamic software composition, and run-time evolution of behaviour are further examples of diverse mechanisms which may need to co-exist in a dynamically evolving run-time environment. How can we model the semantics of these highly dynamic features, yet still offer some reasonable safety guarantees? To this end we present an original calculus in which objects can adapt their behaviour at run-time to changing contexts. Both objects and environments are represented by first-class mappings between variables and values. Message sends are dynamically resolved to method calls. Variables may be dynamically bound, making it possible to model a variety of dynamic mechanisms within the same calculus. Despite the highly dynamic nature of the calculus, safety properties are assured by a type assignment system.
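The following sketch, in Java rather than the paper's calculus notation, only illustrates the core idea described above: an object is a first-class mapping from names to values, message sends are resolved dynamically against that mapping, and bindings can be replaced at run time. The names DynamicObject, define, and send are illustrative assumptions, not part of the calculus.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.Function;

    // Illustrative sketch: an object is a mapping from names to values; some
    // values are methods. A message send looks the slot up at run time, so
    // behaviour can be rebound while the program is running.
    final class DynamicObject {
        private final Map<String, Object> slots = new HashMap<>();

        void define(String name, Object value) {        // bind or rebind a slot
            slots.put(name, value);
        }

        @SuppressWarnings("unchecked")
        Object send(String message, Object argument) {  // dynamic resolution
            Object slot = slots.get(message);
            if (slot instanceof Function) {
                return ((Function<Object, Object>) slot).apply(argument);
            }
            throw new IllegalStateException("message not understood: " + message);
        }

        public static void main(String[] args) {
            DynamicObject point = new DynamicObject();
            point.define("x", 3);
            point.define("moveBy", (Function<Object, Object>) dx ->
                    (Integer) point.slots.get("x") + (Integer) dx);
            System.out.println(point.send("moveBy", 4));  // 7
            // behaviour adapts at run time to a new context:
            point.define("moveBy", (Function<Object, Object>) dx -> 0);
            System.out.println(point.send("moveBy", 4));  // 0
        }
    }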
Abstract:
The delineation of shifting cultivation landscapes using remote sensing in mountainous regions is challenging. On the one hand, it is difficult to distinguish between the forest and fallow forest classes that occur in a shifting cultivation landscape in mountainous regions. On the other hand, the dynamic nature of the shifting cultivation system poses problems for the delineation of landscapes where shifting cultivation occurs. We present a two-step approach based on an object-oriented classification of Advanced Land Observing Satellite, Advanced Visible and Near-Infrared Spectrometer (ALOS AVNIR) and Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) data and landscape metrics. When texture measures were included in the object-oriented classification, the accuracy of the forest and fallow forest classes increased substantially. Based on such a classification, landscape metrics in the form of land cover class ratios enabled the identification of crop-fallow rotation characteristics of the shifting cultivation land use practice. By classifying and combining these landscape metrics, shifting cultivation landscapes could be delineated using a single land cover dataset.
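As a hedged illustration of the second step, the sketch below computes a land cover class ratio from per-class pixel counts of one landscape unit and flags a possible crop-fallow rotation. The class names and the 0.25 threshold are invented for illustration and are not values from the study.

    import java.util.Map;

    // Illustrative sketch: a landscape metric as a simple land cover class
    // ratio over the per-class pixel counts of one classified unit.
    final class ShiftingCultivationMetric {
        static double classRatio(Map<String, Long> pixelCounts, String classA, String classB) {
            long a = pixelCounts.getOrDefault(classA, 0L);
            long b = pixelCounts.getOrDefault(classB, 0L);
            return b == 0 ? Double.POSITIVE_INFINITY : (double) a / b;
        }

        static boolean suggestsCropFallowRotation(Map<String, Long> pixelCounts) {
            // A high share of fallow forest relative to closed forest hints at
            // rotation rather than permanent agriculture; threshold is assumed.
            return classRatio(pixelCounts, "fallowForest", "forest") > 0.25;
        }

        public static void main(String[] args) {
            Map<String, Long> unit = Map.of("forest", 4000L, "fallowForest", 1800L, "crop", 600L);
            System.out.println(suggestsCropFallowRotation(unit));  // true
        }
    }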
Abstract:
The future Internet architecture aims to reformulate the way content and services are requested in order to make them location-independent. Information-Centric Networking is a new network paradigm which tries to achieve this goal by having content objects identified and requested by name instead of by address. In this paper, we extend the Information-Centric Networking architecture so that services can also be requested and invoked by name. We present NextServe, a service framework with a human-readable, self-explanatory naming scheme. NextServe is inspired by the object-oriented programming paradigm and is applicable to real-world scenarios.
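The abstract does not give the concrete NextServe syntax, so the following Java fragment only illustrates the general idea of a human-readable, object-oriented-style service name; the path format and the example service are assumptions for illustration.

    // Purely illustrative: a service request expressed as a readable name
    // (object, method, arguments) rather than a host address. The concrete
    // NextServe naming syntax is not specified here.
    final class ServiceName {
        static String name(String object, String method, String... arguments) {
            return "/" + object + "/" + method + "/" + String.join("/", arguments);
        }

        public static void main(String[] args) {
            System.out.println(name("weather", "forecast", "Bern", "3days"));
            // -> /weather/forecast/Bern/3days
        }
    }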
Abstract:
Debuggers are crucial tools for developing object-oriented software systems as they give developers direct access to the running systems. Nevertheless, traditional debuggers rely on generic mechanisms to explore and exhibit the execution stack and system state, while developers reason about and formulate domain-specific questions using concepts and abstractions from their application domains. This creates an abstraction gap between the debugging needs and the debugging support, leading to an inefficient and error-prone debugging effort. To reduce this gap, we propose a framework for developing domain-specific debuggers, called the Moldable Debugger. The Moldable Debugger is adapted to a domain by creating and combining domain-specific debugging operations with domain-specific debugging views, and adapts itself to a domain by selecting, at run time, appropriate debugging operations and views. We motivate the need for domain-specific debugging, identify a set of key requirements, and show how our approach improves debugging by adapting the debugger to several domains.
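A minimal sketch, in Java rather than the authors' implementation, of the adaptation idea described above: each extension bundles domain-specific operations and views and declares when it applies, and the debugger selects the matching extensions for the current session at run time. All names are illustrative assumptions.

    import java.util.List;
    import java.util.stream.Collectors;

    // Illustrative sketch: a debugger extension with an activation predicate
    // plus domain-specific operations and views.
    interface DebuggerExtension {
        boolean appliesTo(Object session);   // does this domain apply right now?
        List<String> operations();           // e.g. "step to next domain event"
        List<String> views();                // e.g. "show domain-level state"
    }

    final class MoldableDebuggerSketch {
        private final List<DebuggerExtension> registered;

        MoldableDebuggerSketch(List<DebuggerExtension> registered) {
            this.registered = registered;
        }

        // At run time, keep only the extensions whose predicate matches the
        // current debugging session.
        List<DebuggerExtension> extensionsFor(Object session) {
            return registered.stream()
                    .filter(extension -> extension.appliesTo(session))
                    .collect(Collectors.toList());
        }
    }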
Abstract:
Femoroacetabular impingement (FAI) is a dynamic conflict of the hip defined by a pathological, early abutment of the proximal femur onto the acetabulum or pelvis. In the past two decades, FAI has received increasing focus in both research and clinical practice as a cause of hip pain and prearthrotic deformity. Anatomical abnormalities such as an aspherical femoral head (cam-type FAI), a focal or general overgrowth of the acetabulum (pincer-type FAI), a high-riding greater or lesser trochanter (extra-articular FAI), or abnormal torsion of the femur have been identified as underlying pathomorphologies. Open and arthroscopic treatment options are available to correct the deformity and to allow impingement-free range of motion. In routine practice, diagnosis and treatment planning of FAI are based on clinical examination and conventional imaging modalities such as standard radiography, magnetic resonance arthrography (MRA), and computed tomography (CT). Modern software tools allow three-dimensional analysis of the hip joint by extracting pelvic landmarks from two-dimensional antero-posterior pelvic radiographs. An object-oriented cross-platform program (Hip2Norm) has been developed and validated to standardize pelvic rotation and tilt on conventional AP pelvis radiographs. It has been shown that Hip2Norm is an accurate, consistent, reliable and reproducible tool for the correction of selected hip parameters on conventional radiographs. In contrast to conventional imaging modalities, which provide only static visualization, novel computer-assisted tools have been developed to allow the dynamic analysis of FAI pathomechanics. In this context, a validated, CT-based software package (HipMotion) has been introduced. HipMotion is based on polygonal three-dimensional models of the patient’s pelvis and femur. The software includes simulation methods for range of motion, collision detection and accurate mapping of impingement areas. A preoperative treatment plan can be created by performing a virtual resection of any mapped impingement zones on both the femoral head-neck junction and the acetabular rim using the same three-dimensional models. The following book chapter provides a summarized description of current computer-assisted tools for the diagnosis and treatment planning of FAI, highlighting the possibilities for both static and dynamic evaluation, their reliability and reproducibility, and their applicability to routine clinical use.
Abstract:
Object inspectors are an essential category of tools that allow developers to comprehend the run-time behavior of object-oriented systems. Traditional object inspectors favor a generic view that focuses on the low-level details of the state of single objects. Based on 16 interviews with software developers and a follow-up survey with 62 respondents, we identified a need for object inspectors that support different high-level ways to visualize and explore objects, depending on both the object and the current developer need. We propose the Moldable Inspector, a novel inspector model that enables developers to adapt the inspection workflow to suit their immediate needs by making the inspection context explicit, providing multiple interchangeable domain-specific views for each object, and supporting a workflow that groups together multiple levels of connected objects. We show that the Moldable Inspector can address multiple kinds of development needs involving a wide range of objects.
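A minimal Java sketch of the idea of interchangeable, domain-specific views: each view declares which objects it applies to, and the inspector offers every matching view for the inspected object. The View record and its fields are illustrative assumptions, not the tool's actual API.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Function;
    import java.util.function.Predicate;

    // Illustrative sketch: several presentations per object, selected by the
    // object being inspected.
    final class MoldableInspectorSketch {
        record View(String title, Predicate<Object> appliesTo, Function<Object, String> render) {}

        private final List<View> views = new ArrayList<>();

        void register(View view) { views.add(view); }

        List<String> inspect(Object target) {
            List<String> rendered = new ArrayList<>();
            for (View view : views) {
                if (view.appliesTo().test(target)) {
                    rendered.add(view.title() + ": " + view.render().apply(target));
                }
            }
            return rendered;
        }

        public static void main(String[] args) {
            MoldableInspectorSketch inspector = new MoldableInspectorSketch();
            inspector.register(new View("Raw state", o -> true, Object::toString));
            inspector.register(new View("Length", o -> o instanceof String,
                    o -> Integer.toString(((String) o).length())));
            System.out.println(inspector.inspect("hello"));  // two views for a String
        }
    }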
Abstract:
Subtype polymorphism is a cornerstone of object-oriented programming. By hiding variability in behavior behind a uniform interface, polymorphism decouples clients from providers and thus enables genericity, modularity and extensibility. At the same time, however, it scatters the implementation of the behavior over multiple classes, thus potentially hampering program comprehension. The extent to which polymorphism is used in real programs and the impact of polymorphism on program comprehension are not very well understood. We report on a preliminary study of the prevalence of polymorphism in several hundred open source software systems written in Smalltalk, one of the oldest object-oriented programming languages, and in Java, one of the most widespread ones. Although a large portion of the call sites in these systems are polymorphic, a majority of them have only a small number of potential candidates. Smalltalk uses polymorphism to a much greater extent than Java. We discuss how these findings can be used as input for more detailed studies in program comprehension and for better developer support in the IDE.
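A small Java example of the kind of site such a study counts: a single polymorphic call site whose potential candidates are the area() implementations of the receiver's possible classes. The shapes themselves are invented for illustration.

    import java.util.List;

    // Illustrative sketch: one call site, several candidate methods. Which
    // area() runs is decided at run time by the receiver's class.
    interface Shape { double area(); }
    record Circle(double r) implements Shape { public double area() { return Math.PI * r * r; } }
    record Square(double s) implements Shape { public double area() { return s * s; } }

    final class PolymorphicCallSite {
        public static void main(String[] args) {
            List<Shape> shapes = List.of(new Circle(1.0), new Square(2.0));
            for (Shape shape : shapes) {
                System.out.println(shape.area());  // one call site, two candidates
            }
        }
    }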
Abstract:
Conventional debugging tools present developers with means to explore the run-time context in which an error has occurred. In many cases this is enough to help the developer discover the faulty source code and correct it. However, rather often errors occur due to code that has executed in the past, leaving certain objects in an inconsistent state. The actual run-time error only occurs when these inconsistent objects are used later in the program. So-called back-in-time debuggers help developers step back through earlier states of the program and explore execution contexts not available to conventional debuggers. Nevertheless, even back-in-time debuggers do not help answer the question, "Where did this object come from?" The Object-Flow Virtual Machine, which we have proposed in previous work, tracks the flow of objects to answer precisely such questions, but this VM does not provide dedicated debugging support to explore faulty programs. In this paper we present a novel debugger, called Compass, to navigate between conventional run-time stack-oriented control flow views and object flows. Compass enables a developer to effectively navigate from an object contributing to an error back-in-time through all the code that has touched the object. We present the design and implementation of Compass, and we demonstrate how flow-centric, back-in-time debugging can be used to effectively locate the source of hard-to-find bugs.
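The sketch below illustrates only the flow-centric idea in plain Java: record, per object identity, every location that touched the object, so that its history can later be walked backwards. The real Object-Flow Virtual Machine captures this at the VM level; the recorder API here is an assumption for illustration.

    import java.util.ArrayList;
    import java.util.IdentityHashMap;
    import java.util.List;
    import java.util.Map;

    // Illustrative sketch: a user-level stand-in for VM-level object-flow
    // tracking. Each call to touched() appends a location to the object's history.
    final class ObjectFlowRecorder {
        private final Map<Object, List<String>> flows = new IdentityHashMap<>();

        <T> T touched(T object, String location) {      // called wherever objects move
            flows.computeIfAbsent(object, o -> new ArrayList<>()).add(location);
            return object;
        }

        List<String> originOf(Object object) {          // "Where did this come from?"
            return flows.getOrDefault(object, List.of());
        }

        public static void main(String[] args) {
            ObjectFlowRecorder recorder = new ObjectFlowRecorder();
            StringBuilder order = recorder.touched(new StringBuilder("order"), "Parser.parse");
            recorder.touched(order, "Validator.check");
            recorder.touched(order, "Cart.add");
            System.out.println(recorder.originOf(order));
            // -> [Parser.parse, Validator.check, Cart.add]
        }
    }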
Abstract:
Context-dependent behavior is becoming increasingly important for a wide range of application domains, from pervasive computing to common business applications. Unfortunately, mainstream programming languages do not provide mechanisms that enable software entities to adapt their behavior dynamically to the current execution context. This leads developers to adopt convoluted designs to achieve the necessary run-time flexibility. We propose a new programming technique called Context-oriented Programming (COP) which addresses this problem. COP treats context explicitly, and provides mechanisms to dynamically adapt behavior in reaction to changes in context, even at run time after the system has been deployed. In this paper we lay the foundations of COP, show how dynamic layer activation enables multi-dimensional dispatch, illustrate the application of COP by examples in several language extensions, and demonstrate that COP is largely independent of other commitments to programming style.
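A minimal Java sketch of dynamically scoped layer activation, the mechanism referred to above: behavior is grouped into layers, a call is answered by the topmost active layer, and activation is confined to a dynamic extent. The layer and method names are illustrative; actual COP extensions integrate this at the language level rather than in library code.

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.function.Supplier;

    // Illustrative sketch: layer-aware dispatch with scoped, dynamic activation.
    final class CopSketch {
        interface GreetingLayer { String greet(String who); }

        private static final Deque<GreetingLayer> ACTIVE = new ArrayDeque<>();
        static { ACTIVE.push(who -> "Hello, " + who); }           // base behavior

        static String greet(String who) {                         // layer-aware send
            return ACTIVE.peek().greet(who);
        }

        static <T> T withLayer(GreetingLayer layer, Supplier<T> body) {
            ACTIVE.push(layer);                                   // scoped activation
            try { return body.get(); } finally { ACTIVE.pop(); }
        }

        public static void main(String[] args) {
            System.out.println(greet("world"));                   // Hello, world
            String formal = withLayer(who -> "Good day, " + who,  // context changes
                    () -> greet("world"));
            System.out.println(formal);                           // Good day, world
            System.out.println(greet("world"));                   // back to base behavior
        }
    }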
Abstract:
This paper presents a case study of analyzing a legacy PL/1 ecosystem that has grown for 40 years to support the business needs of a large banking company. In order to support the stakeholders in analyzing it, we developed St1-PL/1, a tool that parses the code for association data and computes structural metrics, which it then visualizes using top-down interactive exploration. Before building the tool, and after demonstrating it to stakeholders, we conducted several interviews to learn about the requirements of legacy ecosystem analysis. We briefly introduce the tool and then present the results of analyzing the case study. We show that although the vision for the future is an ecosystem architecture in which systems are as decoupled as possible, the current state of the ecosystem is still far removed from this. We also present some of the lessons learned during our discussions with stakeholders, which include their interest in automatically assessing the quality of the legacy code.
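The abstract does not list the structural metrics St1-PL/1 computes, so the following Java fragment is only an assumed example of the kind of metric derivable from parsed association data: fan-in per system as a coarse indicator of how far the ecosystem is from the desired decoupling.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Illustrative only: from caller -> callee associations between systems,
    // count how many systems depend on each callee (fan-in).
    final class EcosystemMetrics {
        static Map<String, Integer> fanIn(List<String[]> associations) {
            Map<String, Integer> fanIn = new HashMap<>();
            for (String[] edge : associations) {         // edge = {caller, callee}
                fanIn.merge(edge[1], 1, Integer::sum);
            }
            return fanIn;
        }

        public static void main(String[] args) {
            List<String[]> edges = List.of(
                    new String[]{"Payments", "CustomerMaster"},
                    new String[]{"Loans", "CustomerMaster"},
                    new String[]{"Loans", "Reporting"});
            System.out.println(fanIn(edges));  // {CustomerMaster=2, Reporting=1}
        }
    }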