944 results for "problems with object-oriented paradigm"
Abstract:
An optimized structure for an educational program consisting of a set of interconnected educational objects is obtained by solving the problem of optimal partitioning of an acyclic weighted graph. A condition for preserving acyclicity in the subgraphs is formulated, and a quantitative assessment of the candidate solutions is carried out. An original algorithm for finding a quasi-optimal partition is proposed, based on a genetic algorithm scheme in which chromosomes are encoded as permutations. An object-oriented implementation of the algorithm in C++ is described and the results of numerical experiments are presented.
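As a rough illustration of the permutation-based chromosome encoding described above, the following C++ sketch represents a chromosome as a permutation of educational-object indices and applies a validity-preserving swap mutation; the fitness function, selection and crossover operators of the actual algorithm are not reproduced, and the scoring shown here is a placeholder.

```cpp
// Hypothetical sketch: permutation-encoded chromosome for graph partitioning;
// this is not the authors' implementation.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>

struct Chromosome {
    std::vector<int> order;   // permutation of educational-object indices
    double fitness = 0.0;
};

// Placeholder fitness: in the paper this would score the partition of the
// acyclic weighted graph induced by the permutation.
double evaluate(const Chromosome& c) {
    double s = 0.0;
    for (std::size_t i = 0; i < c.order.size(); ++i) s += (i + 1) * c.order[i];
    return s;
}

Chromosome randomChromosome(int n, std::mt19937& rng) {
    Chromosome c;
    c.order.resize(n);
    std::iota(c.order.begin(), c.order.end(), 0);
    std::shuffle(c.order.begin(), c.order.end(), rng);
    c.fitness = evaluate(c);
    return c;
}

// Swap mutation keeps the chromosome a valid permutation.
void mutate(Chromosome& c, std::mt19937& rng) {
    std::uniform_int_distribution<std::size_t> pick(0, c.order.size() - 1);
    std::swap(c.order[pick(rng)], c.order[pick(rng)]);
    c.fitness = evaluate(c);
}

int main() {
    std::mt19937 rng(42);
    Chromosome best = randomChromosome(8, rng);
    for (int gen = 0; gen < 100; ++gen) {
        Chromosome candidate = best;
        mutate(candidate, rng);
        // Keep the better permutation; a full GA would maintain a population
        // with selection and crossover in addition to mutation.
        if (candidate.fitness > best.fitness) best = candidate;
    }
    std::cout << "best fitness: " << best.fitness << "\n";
}
```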
Abstract:
Projected air and ground temperatures are expected to be higher in Arctic and sub-Arctic latitudes, and with temperatures already close to the limit at which permafrost can exist, resistance against degradation is low. As permafrost thaws, the landscape is modified with depressions in which thermokarst lakes emerge. Permafrost soils store a considerable amount of soil organic carbon, with the potential to alter climate even further if existing thermokarst lakes expand or new ones form, as decay releases greenhouse gases (CO2 and CH4) to the atmosphere. Analysing the spatial distribution and morphometry of thermokarst lakes and other water bodies over time is important for accurately predicting carbon budgets and feedback mechanisms, as well as for assessing future landscape layout and the interaction of these features. Different types of high-spatial-resolution aerial and satellite imagery from 1963, 1975, 2003, 2010 and 2015 were used in both pre- and post-classification change detection analyses. Object-oriented segmentation in eCognition, combined with manual adjustments, resulted in digitized water bodies >28 m2 from which direction of change and morphometric values were extracted. The number of thermokarst lakes and other water bodies was n=92 in 1963 and, as a trend, decreased in succeeding years until 2010-2015, when eleven water bodies were added in 2015 (n=74 to n=85). In 1963-2003 the area of these water bodies decreased by 50,651 m2 (189,446 to 138,795 m2) and continued to decrease in 2003-2015, ending at 129,337 m2. Limnicity decreased from 19.9% in 1963 to 14.6% in 2003 (-5.3%), and was 13.7-13.6% in 2010 and 2015. The late increase in water bodies differs from an earlier hypothesis that sporadic permafrost regions experience decreases in both area and number of thermokarst lakes and water bodies. During 1963-2015, land gain dominated the ratio between the two competing processes of expansion and drainage: 55/45% in 1963-1975, followed by 90/10% in 1975-2003. After major drainage events, land loss increased to 62/38% in 2010-2015. Drainage and infilling rates, calculated for 15 shorelines, varied across both the landscape and parts of shorelines, averaging 0.17/0.15/0.14 m/yr, except for 1963-1975, when the average rate of change was in the opposite direction (-0.09 m/yr), likely due to the evident expansion of a large thermokarst lake. Using a square grid, the distribution of water bodies was determined, with an indistinct cluster located in the NE and central parts, especially for water bodies <250 m2, which is the dominant area class throughout 1963-2015, ranging from n=39-51. With a heterogeneous composition of both small and large thermokarst lakes, and with both expansion and drainage altering the landscape in Tavvavuoma, both positive and negative climate feedback mechanisms are in play - given that sporadic permafrost still exists.
Abstract:
We have designed and tested an Internet-based video-phone suitable for use in the homes of families in need of paediatric palliative care services. The equipment uses an ordinary telephone line and includes a PC, Web camera and modem housed in a custom-made box. In initial field testing, six clinical consultations were conducted in a one-month trial of the video-phone with a family in receipt of palliative care services who were living in the outer suburbs of Brisbane. Problems with variability in call quality (namely audio and video freezing, and audio break-up) prompted further laboratory testing. We completed a programme of over 250 test calls. Fixing the modem connection parameters to use the V.34 modulation protocol at a set bandwidth of 24 kbit/s improved connection stability and the reliability of the video-phone. In subsequent field testing, 47 of 50 calls (94%) connected without problems. The freezes that did occur were brief (with greatly reduced packet loss) and had little effect on the ability to communicate, unlike the problems arising in the home testing. The low-bandwidth Internet-based video-phone we have developed provides a feasible means of delivering telemedicine in the home.
Abstract:
This paper highlights the importance of design expertise, including subjective judgment and professional experience, for the design of liquid retaining structures. The design of liquid retaining structures has special features that distinguish it from that of other structures: being more vulnerable to corrosion problems, they have stringent requirements against the serviceability limit state of cracking. The premise of the study is to transfer expert knowledge into a computerized blackboard system. Hybrid knowledge representation schemes, including production rules, object-oriented programming and procedural methods, are employed to express engineering heuristics and standard design knowledge during the development of the knowledge-based system (KBS) for the design of liquid retaining structures. This approach makes it possible to take advantage of the characteristics of each method. The system can provide the user with advice on preliminary design, loading specification, optimized configuration selection and detailed design analysis of liquid retaining structures. It would be beneficial to the field of retaining structure design by focusing on the acquisition and organization of expert knowledge through the development of recent artificial intelligence technology. (C) 2003 Elsevier Ltd. All rights reserved.
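To make the hybrid representation concrete, the hypothetical C++ sketch below pairs an object-oriented description of a design with simple production rules expressed as condition/action pairs; the rule content, thresholds and class names are illustrative assumptions, not the cited system's knowledge base.

```cpp
// Hypothetical sketch of a hybrid rule/object representation; the actual KBS
// for liquid retaining structures is not reproduced here.
#include <functional>
#include <iostream>
#include <string>
#include <vector>

struct TankDesign {                 // object-oriented part: the design being reasoned about
    double wallThicknessMm = 200.0; // assumed values for illustration
    double crackWidthMm = 0.25;
    std::vector<std::string> advice;
};

struct Rule {                       // production-rule part: IF condition THEN action
    std::string name;
    std::function<bool(const TankDesign&)> condition;
    std::function<void(TankDesign&)> action;
};

int main() {
    std::vector<Rule> rules = {
        {"crack-width-limit",
         [](const TankDesign& d) { return d.crackWidthMm > 0.2; },
         [](TankDesign& d) { d.advice.push_back("Reduce crack width: tighten reinforcement spacing."); }},
        {"minimum-wall-thickness",
         [](const TankDesign& d) { return d.wallThicknessMm < 250.0; },
         [](TankDesign& d) { d.advice.push_back("Increase wall thickness for serviceability."); }},
    };

    TankDesign design;
    for (const auto& r : rules)     // simple forward-chaining pass over the rule base
        if (r.condition(design)) r.action(design);

    for (const auto& a : design.advice) std::cout << a << "\n";
}
```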
Abstract:
Current physiologically based pharmacokinetic (PBPK) models are inductive. We present an additional, different approach that is based on the synthetic rather than the inductive approach to modeling and simulation. It relies on object-oriented programming: a model of the referent system in its experimental context is synthesized by assembling objects that represent components such as molecules, cells, aspects of tissue architecture, catheters, etc. The single-pass perfused rat liver has been well described in evaluating hepatic drug pharmacokinetics (PK) and is the system on which we focus. In silico experiments begin with administration of objects representing actual compounds. Data are collected in a manner analogous to that in the referent PK experiments. The synthetic modeling method allows for recognition and representation of discrete-event and discrete-time processes, as well as heterogeneity in organization, function, and spatial effects. An application is developed for sucrose and antipyrine, administered separately and together. PBPK modeling has made extensive progress in characterizing abstracted PK properties, but this has also been its limitation. Now, other important questions and possible extensions emerge: how are these PK properties and the observed behaviors generated? The inherent heuristic limitations of traditional models have hindered getting meaningful, detailed answers to such questions. Synthetic models of the type described here are specifically intended to help answer such questions. Analogous to wet-lab experimental models, they retain their applicability even when broken apart into sub-components. Having and applying this new class of models along with traditional PK modeling methods is expected to increase the productivity of pharmaceutical research at all levels that make use of modeling and simulation.
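A minimal sketch of the synthetic, object-assembly style of modelling is given below in C++; the component granularity, names and extraction fractions are assumptions for illustration and do not reproduce the referent perfused-liver model.

```cpp
// Hypothetical sketch of the synthetic (object-assembly) modelling style;
// component names and rate constants are illustrative only.
#include <iostream>
#include <string>
#include <vector>

struct Compound {                  // an object representing an administered compound
    std::string name;
    double amount;
};

// A crude stand-in for one structural component of the perfused liver.
struct SinusoidSegment {
    double extractionFraction;     // assumed fraction removed per discrete time step
    void step(Compound& c) { c.amount *= (1.0 - extractionFraction); }
};

int main() {
    Compound antipyrine{"antipyrine", 100.0};
    // Assemble the model from component objects, analogous to building the
    // experimental system rather than fitting an abstract equation.
    std::vector<SinusoidSegment> lobule(10, SinusoidSegment{0.05});

    for (int t = 0; t < 20; ++t)   // discrete-time "perfusion" of the assembled model
        for (auto& seg : lobule) seg.step(antipyrine);

    std::cout << antipyrine.name << " remaining: " << antipyrine.amount << "\n";
}
```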
Abstract:
This paper presents a formal framework for modelling and analysing mobile systems. The framework comprises a collection of models of the dominant design paradigms which are readily extended to incorporate details of particular technologies, i.e., programming languages and their run-time support, and applications. The modelling language is Object-Z, an extension of the well-known Z specification language with explicit support for object-oriented concepts. Its support for object orientation makes Object-Z particularly suited to our task. The system structuring techniques offered by object-orientation are well suited to modelling mobile systems. In addition, inheritance and polymorphism allow us to exploit commonalities in mobile systems by defining more complex models in terms of simpler ones.
Abstract:
Object-Z offers an object-oriented means for structuring formal specifications. We investigate the application of refactoring rules to add and remove structure from such specifications to forge object-oriented designs. This allows us to tractably move from an abstract functional description of a system toward a lower-level design suitable for implementation on an object-oriented platform.
Abstract:
An object-oriented finite-difference time-domain (FDTD) simulator has been developed for electromagnetic study and design applications in magnetic resonance imaging (MRI). It aims to be a complete FDTD model of an MRI system, including all high- and low-frequency field-generating units and electrical models of the patient. The design method is described and MRI-based numerical examples are presented to illustrate the function of the numerical solver; particular emphasis is placed on high-field studies.
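For orientation, a minimal one-dimensional Yee update loop is sketched below in C++; the actual MRI simulator is three-dimensional and object-oriented, so this only shows the kind of kernel such a solver wraps, with an assumed Courant number and an assumed Gaussian source.

```cpp
// Minimal 1-D FDTD (Yee) sketch in normalized units; the MRI simulator described
// above is far more elaborate (3-D grids, material models, coil sources).
#include <cmath>
#include <iostream>
#include <vector>

int main() {
    const int N = 200, steps = 400, source = N / 2;
    const double courant = 0.5;                 // assumed Courant number for stability
    std::vector<double> ez(N, 0.0), hy(N, 0.0);

    for (int t = 0; t < steps; ++t) {
        for (int i = 0; i < N - 1; ++i) hy[i] += courant * (ez[i + 1] - ez[i]);  // H-field update
        for (int i = 1; i < N; ++i)     ez[i] += courant * (hy[i] - hy[i - 1]);  // E-field update
        ez[source] += std::exp(-(t - 30.0) * (t - 30.0) / 100.0);                // soft Gaussian source
    }
    std::cout << "ez at centre after " << steps << " steps: " << ez[source] << "\n";
}
```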
Abstract:
We discuss a methodology for animating the Object-Z specification language using a Z animation environment. Central to the process is the introduction of a framework to handle dynamic instantiation of objects and management of object references. Particular focus is placed upon building the animation environment through pre-existing tools, and a case study is presented that implements the proposed framework using a shallow encoding in the Possum Z animator. The animation of Object-Z using Z is both automated and made transparent to the user through the use of a software tool named O-zone.
Abstract:
PURPOSE: To determine the objective measures of visual function that are most relevant to subjective quality of vision and perceived reading ability in patients with acquired macular disease. METHODS: Twenty-eight patients with macular disease underwent a comprehensive assessment of visual function. The patients also completed a vision-related quality-of-life questionnaire that included a section of general questions about perceived visual performance and a section with specific questions on reading. RESULTS: Results of all tests of vision correlated highly with reported vision-related quality-of-life impairment. Low-contrast tests explained most of the variance in self-reported problems with reading. Text-reading speed correlated highly with overall concern about vision. CONCLUSIONS: Reading performance is strongly associated with vision-related quality of life. High-contrast distance acuity is not the only relevant measure of visual function in relation to the perceived visual performance of a patient with macular disease. The results suggest the importance of print contrast, even over print size, in reading performance in patients with acquired macular disease.
Abstract:
In this position paper we present the developing Fluid framework, which we believe offers considerable advantages in maintaining software stability in dynamic or evolving application settings. The Fluid framework facilitates the development of component software via the selection, composition and configuration of components. Fluid's composition language incorporates a high-level type system supporting object-oriented principles such as type description, type inheritance, and type instantiation. Object-oriented relationships are represented via the dynamic composition of component instances. This representation allows the software structure, as specified by type and instance descriptions, to change dynamically at runtime as existing types are modified and new types and instances are introduced. We therefore move from static software structure descriptions to more dynamic representations, while maintaining the expressiveness of object-oriented semantics. We show how the Fluid framework relates to existing, largely component based, software frameworks and conclude with suggestions for future enhancements. © 2007 IEEE.
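A hypothetical C++ sketch of runtime type registration and instantiation is given below to illustrate the general idea of composing component instances from type descriptions at runtime; it is not the Fluid framework's composition language or API, and the component names are assumptions.

```cpp
// Hypothetical sketch of dynamic type description and instantiation, loosely in the
// spirit of a composition language; this is not the Fluid framework's actual API.
#include <functional>
#include <iostream>
#include <map>
#include <memory>
#include <string>

struct Component {                       // base type for all composed components
    virtual ~Component() = default;
    virtual void describe() const = 0;
};

struct Renderer : Component {
    void describe() const override { std::cout << "Renderer component\n"; }
};

struct DataSource : Component {
    void describe() const override { std::cout << "DataSource component\n"; }
};

// A registry maps type names (as a composition script might reference them)
// to factories, so new instances can be created as types are introduced at runtime.
class TypeRegistry {
    std::map<std::string, std::function<std::unique_ptr<Component>()>> factories_;
public:
    void registerType(const std::string& name,
                      std::function<std::unique_ptr<Component>()> factory) {
        factories_[name] = std::move(factory);
    }
    std::unique_ptr<Component> instantiate(const std::string& name) const {
        return factories_.at(name)();
    }
};

int main() {
    TypeRegistry registry;
    registry.registerType("Renderer",   [] { return std::make_unique<Renderer>(); });
    registry.registerType("DataSource", [] { return std::make_unique<DataSource>(); });

    // "Composition" step: instantiate components by name, as a script could request.
    registry.instantiate("DataSource")->describe();
    registry.instantiate("Renderer")->describe();
}
```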
Abstract:
The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next-generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software through the selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component-based software that is flexible, extensible, and expressive. We introduce a data-driven, object-oriented programming methodology for component-based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated in order to develop a visualization application for geospatial data. In particular, we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross-platform applicability. © The Eurographics Association 2007.
Abstract:
Almost a decade has passed since the objectives and benefits of autonomic computing were stated, yet even the latest system designs and deployments exhibit only limited and isolated elements of autonomic functionality. In previous work, we identified several of the key challenges behind this delay in the adoption of autonomic solutions, and proposed a generic framework for the development of autonomic computing systems that overcomes these challenges. In this article, we describe how existing technologies and standards can be used to realise our autonomic computing framework, and present its implementation as a service-oriented architecture. We show how this implementation employs a combination of automated code generation, model-based and object-oriented development techniques to ensure that the framework can be used to add autonomic capabilities to systems whose characteristics are unknown until runtime. We then use our framework to develop two autonomic solutions for the allocation of server capacity to services of different priorities and variable workloads, thus illustrating its application in the context of a typical data-centre resource management problem.
Using interior point algorithms for the solution of linear programs with special structural features
Abstract:
Linear Programming (LP) is a powerful decision-making tool extensively used in various economic and engineering activities. In the early stages, the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration, and suggestions have been included to allow improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented taking sparsity into account. The performance of these algorithms on randomly generated separable and non-separable problems is also reported.
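As a pointer to the numerical core mentioned above, the following C++ sketch shows a Jacobi-preconditioned conjugate gradient iteration on a small dense symmetric positive definite system; production interior point codes work with large sparse matrices and much stronger preconditioners, so this is illustrative only.

```cpp
// Jacobi-preconditioned conjugate gradient sketch for a small dense SPD system.
#include <cmath>
#include <iostream>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

Vec matvec(const Mat& A, const Vec& x) {
    Vec y(x.size(), 0.0);
    for (std::size_t i = 0; i < A.size(); ++i)
        for (std::size_t j = 0; j < x.size(); ++j) y[i] += A[i][j] * x[j];
    return y;
}

double dot(const Vec& a, const Vec& b) {
    double s = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
    return s;
}

int main() {
    // Small symmetric positive definite example: A x = b.
    Mat A = {{4, 1, 0}, {1, 3, 1}, {0, 1, 2}};
    Vec b = {1, 2, 3}, x(3, 0.0);

    Vec r = b;                                           // r = b - A*x with x = 0
    Vec z(3), p(3);
    for (int i = 0; i < 3; ++i) z[i] = r[i] / A[i][i];   // Jacobi (diagonal) preconditioner
    p = z;
    double rz = dot(r, z);

    for (int it = 0; it < 100 && std::sqrt(dot(r, r)) > 1e-10; ++it) {
        Vec Ap = matvec(A, p);
        double alpha = rz / dot(p, Ap);
        for (int i = 0; i < 3; ++i) { x[i] += alpha * p[i]; r[i] -= alpha * Ap[i]; }
        for (int i = 0; i < 3; ++i) z[i] = r[i] / A[i][i];
        double rzNew = dot(r, z);
        for (int i = 0; i < 3; ++i) p[i] = z[i] + (rzNew / rz) * p[i];
        rz = rzNew;
    }
    std::cout << "x = " << x[0] << ", " << x[1] << ", " << x[2] << "\n";
}
```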
Abstract:
The development of increasingly powerful computers, which has enabled the use of windowing software, has also opened the way for the computer study, via simulation, of very complex physical systems. In this study, the main issues related to the implementation of interactive simulations of complex systems are identified and discussed. Most existing simulators are closed in the sense that there is no access to the source code and, even if it were available, adaptation to interaction with other systems would require extensive code re-writing. This work aims to increase the flexibility of such software by developing a set of object-oriented simulation classes, which can be extended, by subclassing, at any level, i.e., at the problem domain, presentation or interaction levels. A strategy, which involves the use of an object-oriented framework, concurrent execution of several simulation modules, use of a networked windowing system and the re-use of existing software written in procedural languages, is proposed. A prototype tool which combines these techniques has been implemented and is presented. It allows the on-line definition of the configuration of the physical system and generates the appropriate graphical user interface. Simulation routines have been developed for the chemical recovery cycle of a paper pulp mill. The application, by creation of new classes, of the prototype to the interactive simulation of this physical system is described. Besides providing visual feedback, the resulting graphical user interface greatly simplifies the interaction with this set of simulation modules. This study shows that considerable benefits can be obtained by application of computer science concepts to the engineering domain, by helping domain experts to tailor interactive tools to suit their needs.
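A minimal C++ sketch of the subclassing idea is given below; the base-class interface and the evaporator example are assumptions for illustration and are not the prototype's actual class hierarchy or chemical-recovery routines.

```cpp
// Hypothetical sketch of extending a simulation framework by subclassing;
// class names and rates are illustrative, not those of the prototype tool.
#include <iostream>
#include <memory>
#include <vector>

class SimulationModule {                       // framework base class
public:
    virtual ~SimulationModule() = default;
    virtual void step(double dt) = 0;          // advance the module by one time step
    virtual double output() const = 0;
};

// A problem-domain subclass, e.g. one unit of a chemical recovery cycle.
class EvaporatorModule : public SimulationModule {
    double liquorConcentration_ = 0.15;        // assumed initial dry-solids fraction
public:
    void step(double dt) override { liquorConcentration_ += 0.01 * dt; }
    double output() const override { return liquorConcentration_; }
};

int main() {
    std::vector<std::unique_ptr<SimulationModule>> modules;
    modules.push_back(std::make_unique<EvaporatorModule>());

    for (int t = 0; t < 10; ++t)               // the framework drives all modules uniformly
        for (auto& m : modules) m->step(1.0);

    std::cout << "evaporator output: " << modules[0]->output() << "\n";
}
```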