864 results for Component-based systems
Abstract:
The literature contains a number of reports of early work involving telemedicine and chronic disease; however, there are comparatively few studies in asthma. Most of the telemedicine studies in asthma have investigated the use of remote monitoring of patients in the home, e.g. transmitting spirometry data via a telephone modem to a central server. The primary objective of these studies was to improve management. A secondary benefit was that patient adherence to prescribed treatment was also likely to improve. Early results are encouraging; home monitoring in a randomized controlled trial in Japan significantly reduced the number of emergency room visits by patients with poorly controlled asthma. Other studies have described the cost-benefits of a specialist asthma nurse who can manage patients by telephone contact, as well as deliver asthma education. Many web-based systems are available for the general public or healthcare professionals to improve education in asthma, although their quality is highly variable. The work on telemedicine in asthma clearly shows that the technique holds promise in a number of areas. Unfortunately - as in telemedicine generally - most of the literature in patients with asthma refers to pilot trials and feasibility studies with short-term outcomes. Large-scale, formal research trials are required to establish the cost-effectiveness of telemedicine in asthma.
Abstract:
Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns with transitional gradients from one vegetation community to another. Arbitrary, though often unrealistic, sharp boundaries can be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of Northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including sampling sites, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of the discrete vegetation community models to generate a composite pre-clearing vegetation map, independent data set model validation and assessment of the scale of the model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r² = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for conservation and forestry planning of poorly-studied areas. (c) 2006 Elsevier B.V. All rights reserved.
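As a rough illustration of the final integration step described above, the following sketch combines a stack of per-community probability surfaces into a single composite map by assigning each grid cell to the community with the highest predicted probability. The grid size, the use of random numbers as stand-in predictions and the argmax combination rule are illustrative assumptions only, not the authors' implementation.

```python
import numpy as np

# Illustrative assumptions only: grid size, number of community models and
# random numbers standing in for each model's predicted probability surface.
n_models, n_rows, n_cols = 28, 100, 100
rng = np.random.default_rng(0)
probability_stack = rng.random((n_models, n_rows, n_cols))

# Composite pre-clearing map: each cell is assigned to the vegetation
# community whose model gives it the highest predicted probability.
composite_map = np.argmax(probability_stack, axis=0)

# Keep the winning probability as a per-cell confidence layer, which could
# be exported alongside the composite map for assessment in a GIS.
confidence = np.take_along_axis(
    probability_stack, composite_map[np.newaxis, ...], axis=0
)[0]

print(composite_map.shape, confidence.shape)  # (100, 100) (100, 100)
```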
Abstract:
In component-based software engineering, programs are constructed from pre-defined software library modules. However, if the library's subroutines do not exactly match the programmer's requirements, the subroutines' code must be adapted accordingly. For this process to be acceptable in safety-critical or mission-critical applications, where all code must be proven correct, it must be possible to verify the correctness of the adaptations themselves. In this paper we show how refinement theory can be used to model typical adaptation steps and to define the conditions that must be proven to verify that a library subroutine has been adapted correctly.
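As a loose illustration of the kind of adaptation step being verified, the sketch below wraps a hypothetical library routine so that it meets a slightly different requirement; the routine, its specification comments and the stated proof obligation are illustrative assumptions and are not taken from the paper, which works formally in refinement theory rather than in executable code.

```python
# Hypothetical library routine: an integer square root that is only
# specified for non-negative inputs (precondition: x >= 0).
def library_isqrt(x: int) -> int:
    assert x >= 0, "library precondition violated: x >= 0"
    r = int(x ** 0.5)
    # Correct any floating-point error so the postcondition
    # r * r <= x < (r + 1) * (r + 1) holds.
    while (r + 1) * (r + 1) <= x:
        r += 1
    while r * r > x:
        r -= 1
    return r

# Adapted component: the caller needs an integer square root of the
# absolute value of x, for any integer x. Verifying the adaptation amounts
# to proving that, assuming the library routine meets its own specification,
# the wrapper establishes the caller's postcondition
# result * result <= abs(x) < (result + 1) * (result + 1).
def adapted_isqrt(x: int) -> int:
    return library_isqrt(abs(x))

if __name__ == "__main__":
    assert adapted_isqrt(-10) == 3
    assert adapted_isqrt(16) == 4
```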
Abstract:
This paper describes a formal component language, used to support automated component-based program development. The components, referred to as templates, are machine processable, meaning that appropriate tool support, such as retrieval support, can be developed. The templates are highly adaptable, meaning that they can be applied to a wide range of problems. Some of the main features of the language are described, including: higher-order parameters; state variable declarations; specification statements and conditionals; applicability conditions and theories; meta-level placeholders; and abstract data structures.
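Since the abstract does not give the template language's concrete syntax, the sketch below merely suggests, in ordinary Python, how a machine-processable template with higher-order parameters and an applicability condition might be represented and instantiated; all names and the example template are hypothetical.

```python
import functools
from dataclasses import dataclass
from typing import Any, Callable, Dict, List

# Hypothetical representation of an adaptable template: the field names,
# the applicability check and the example below are illustrative only.
@dataclass
class Template:
    name: str
    parameters: List[str]                           # higher-order parameters
    applicable: Callable[[Dict[str, Any]], bool]    # applicability condition
    body: Callable[[Dict[str, Any]], Callable]      # builds a program fragment

    def instantiate(self, args: Dict[str, Any]) -> Callable:
        missing = [p for p in self.parameters if p not in args]
        if missing or not self.applicable(args):
            raise ValueError(f"cannot instantiate template {self.name!r}")
        return self.body(args)

# Example: a generic fold-over-a-list template instantiated as a summation.
fold = Template(
    name="fold",
    parameters=["combine", "unit"],
    applicable=lambda a: callable(a["combine"]),
    body=lambda a: lambda xs: functools.reduce(a["combine"], xs, a["unit"]),
)

summation = fold.instantiate({"combine": lambda x, y: x + y, "unit": 0})
print(summation([1, 2, 3, 4]))  # 10
```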
Abstract:
A plethora of techniques is available for imaging liposomes and other bilayer vesicles. However, sample preparation and the technique chosen should be carefully considered in conjunction with the information required. For example, larger vesicles such as multilamellar and giant unilamellar vesicles can be viewed using light microscopy and, whilst vesicle confirmation and sizing can be undertaken prior to additional physical characterisation or more detailed microscopy, the technique is limited in terms of resolution. To consider the options available for visualising liposome-based systems, a wide range of microscopy techniques are described and discussed here: these include light, fluorescence and confocal microscopy and various electron microscopy techniques such as transmission, cryo, freeze-fracture and environmental scanning electron microscopy. Their application, advantages and disadvantages are reviewed with regard to their use in the analysis of lipid vesicles.
Abstract:
This second issue of Knowledge Management Research & Practice (KMRP) continues the international nature of the first issue, with papers from authors based on four different continents. There are five regular papers, plus the first of what is intended to be an occasional series of 'position papers' from respected figures in the knowledge management field, who have specific issues they wish to raise from a personal standpoint. The first two regular papers are both based on case studies. The first is 'Aggressively pursuing knowledge management over two years: a case study at a US government organization' by Jay Liebowitz. Liebowitz is well known to both academics and practitioners as an author on knowledge management and knowledge-based systems. Government departments in many Western countries must soon face up to the problems that will occur as the 'baby boomer' generation reaches retirement age over the next decade. This paper describes how one particular US government organization has attempted to address this situation (and others) through the introduction of a knowledge management initiative. The second case study paper is 'Knowledge creation through the synthesizing capability of networked strategic communities: case study on new product development in Japan' by Mitsuru Kodama. This paper looks at the importance of strategic communities - communities that have strategic relevance and support - in knowledge management. Here, the case study organization is Nippon Telegraph and Telephone Corporation (NTT), a Japanese telecommunications firm. The third paper is 'Knowledge management and intellectual capital: an empirical examination of current practice in Australia' by Albert Zhou and Dieter Fink. This paper reports the results of a survey carried out in 2001, exploring the practices relating to knowledge management and intellectual capital in Australia and the relationship between them. The remaining two regular papers are conceptual in nature. The fourth is 'The enterprise knowledge dictionary' by Stuart Galup, Ronald Dattero and Richard Hicks. Galup, Dattero and Hicks propose the concept of an enterprise knowledge dictionary and its associated knowledge management system architecture as offering the appropriate form of information technology to support various different types of knowledge sources, while behaving as a single source from the user's viewpoint. The fifth and final regular paper is 'Community of practice and metacapabilities' by Geri Furlong and Leslie Johnson. This paper looks at the role of communities of practice in learning in organizations. Its emphasis is on metacapabilities - the properties required to learn, develop and apply skills. This discussion takes work on learning and core competences to a higher level. Finally, this issue includes a position paper, 'Innovation as an objective of knowledge management. Part I: the landscape of management' by Dave Snowden. Snowden has been highly visible in the knowledge management community thanks to his role as the Director of IBM Global Services' Canolfan Cynefin Centre. He has helped many government and private sector organizations to consider their knowledge management problems and strategies. This paper, the first of two parts, is inspired by the notion of complexity. In it, Snowden calls for what he sees as a 20th century emphasis on designed systems for knowledge management to be consigned to history, and replaced by a 21st century emphasis on emergence.
Letters to the editor on this, or any other topic related to knowledge management research and practice, are welcome. We trust that you will find the contributions stimulating, and again invite you to contribute your own paper(s) to future issues of KMRP.
Abstract:
Knowledge elicitation is a well-known bottleneck in the production of knowledge-based systems (KBS). Past research has shown that visual interactive simulation (VIS) could effectively be used to elicit episodic knowledge that is appropriate for machine learning purposes, with a view to building a KBS. Nonetheless, the VIS-based elicitation process still has much room for improvement. Based in the Ford Dagenham Engine Assembly Plant, a research project is being undertaken to investigate the individual/joint effects of visual display level and mode of problem case generation on the elicitation process. This paper looks at the methodology employed and some issues that have been encountered to date. Copyright © 2007 Inderscience Enterprises Ltd.
Abstract:
In this position paper we present the developing Fluid framework, which we believe offers considerable advantages in maintaining software stability in dynamic or evolving application settings. The Fluid framework facilitates the development of component software via the selection, composition and configuration of components. Fluid's composition language incorporates a high-level type system supporting object-oriented principles such as type description, type inheritance, and type instantiation. Object-oriented relationships are represented via the dynamic composition of component instances. This representation allows the software structure, as specified by type and instance descriptions, to change dynamically at runtime as existing types are modified and new types and instances are introduced. We therefore move from static software structure descriptions to more dynamic representations, while maintaining the expressiveness of object-oriented semantics. We show how the Fluid framework relates to existing, largely component-based, software frameworks and conclude with suggestions for future enhancements. © 2007 IEEE.
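The sketch below suggests, under purely illustrative assumptions, the general flavour of dynamic type and instance composition the abstract describes: component types that inherit from one another, instances that refer to their types, and runtime modifications to a type that are immediately visible to existing instances. It is not Fluid's actual composition language.

```python
# Illustrative sketch only: a tiny model of component "types" whose
# instances see changes to their types at runtime. The class names and
# API are assumptions, not Fluid's composition language.
class ComponentType:
    def __init__(self, name, parent=None, attributes=None):
        self.name = name
        self.parent = parent                      # type inheritance
        self.attributes = dict(attributes or {})

    def lookup(self, key):
        # Attribute lookup walks the inheritance chain.
        if key in self.attributes:
            return self.attributes[key]
        if self.parent is not None:
            return self.parent.lookup(key)
        raise KeyError(key)

class ComponentInstance:
    def __init__(self, ctype):
        self.ctype = ctype                        # instances refer to their type

    def get(self, key):
        return self.ctype.lookup(key)

renderer = ComponentType("Renderer", attributes={"backend": "software"})
gl_renderer = ComponentType("GLRenderer", parent=renderer)
view = ComponentInstance(gl_renderer)

print(view.get("backend"))        # "software", inherited from Renderer

# Evolving the type at runtime is immediately visible to existing instances.
renderer.attributes["backend"] = "opengl"
print(view.get("backend"))        # "opengl"
```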
Abstract:
The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software using selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component-based software that is flexible, extensible, and expressive. We introduce a data-driven, object-oriented programming methodology to component-based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated, in order to develop a visualization application for geospatial data. In particular, we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross platform applicability. © The Eurographics Association 2007.
Abstract:
Very large spatially-referenced datasets, for example those derived from satellite-based sensors which sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over small time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful in less time-critical applications, for example when interacting directly with the data for exploratory analysis, for the algorithms to be responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly where maximum likelihood methods are used. Although the storage requirements only scale linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively. Most modern commodity hardware has at least two processor cores, if not more, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly widely available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics. By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms, we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic data sets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake data set.
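A minimal sketch of the block-independent flavour of likelihood approximation (in the spirit of Tresp [2000]) is given below: the observations are split into blocks, each block's Gaussian log-likelihood is evaluated on a separate worker, and the contributions are summed. The exponential covariance model, the blocking by x-coordinate and the thread-based parallelism are illustrative assumptions, not the implementation described in the paper.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def exp_cov(coords, sill=1.0, length_scale=0.3, nugget=1e-6):
    # Isotropic exponential covariance (an illustrative assumption).
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return sill * np.exp(-d / length_scale) + nugget * np.eye(len(coords))

def block_loglik(block):
    # Exact Gaussian log-likelihood of one block of observations.
    coords, z = block
    c = exp_cov(coords)
    _, logdet = np.linalg.slogdet(c)
    alpha = np.linalg.solve(c, z)
    return -0.5 * (logdet + z @ alpha + len(z) * np.log(2.0 * np.pi))

def approx_loglik(coords, z, n_blocks=8, n_workers=4):
    # Block-independent approximation: treating blocks as independent turns
    # one O(n^3) solve into several much smaller ones that can run in
    # parallel (numpy's linear algebra releases the GIL, so threads help).
    order = np.argsort(coords[:, 0])              # crude spatial blocking by x
    blocks = [(coords[i], z[i]) for i in np.array_split(order, n_blocks)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(block_loglik, blocks))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    coords = rng.random((2000, 2))     # synthetic sample locations
    z = rng.standard_normal(2000)      # synthetic (zero-mean) observations
    print(approx_loglik(coords, z))
```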
Abstract:
This paper identifies the important limiting processes in transmission capacity for amplified soliton systems. Some novel control techniques are described for optimizing this capacity. In particular, dispersion compensation and phase conjugation are identified as offering good control of jitter without the need for many new components in the system. An advanced average soliton model is described and demonstrated to permit large amplifier spacing. The potential for solitons in high-dispersion land-based systems is discussed and results are presented showing 10 Gbit/s transmission over 1000 km with significant amplifier spacing.
Abstract:
In practical terms, any result obtained using an ordered weighted averaging (OWA) operator depends heavily upon the method used to determine the weighting vector. Several approaches for obtaining the associated weights have been suggested in the literature, none of which takes into account the preferences of the alternatives. This paper presents a method for determining the OWA weights when the preferences of the alternatives across all the criteria are considered. An example is given to illustrate this method, and an application to an internet search engine shows the use of this new OWA operator.
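As a reminder of the mechanics that any weight-determination method feeds into, the sketch below applies a given OWA weighting vector to one alternative's criterion scores; the weights shown are arbitrary placeholders rather than those produced by the preference-based method proposed in the paper.

```python
import numpy as np

def owa(values, weights):
    # Ordered weighted averaging: the weights are applied to the values
    # after sorting them in descending order, not to particular criteria.
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    assert values.shape == weights.shape and np.isclose(weights.sum(), 1.0)
    return float(np.sort(values)[::-1] @ weights)

# Placeholder weights only; the paper derives its weights from the
# preferences of the alternatives, which is not reproduced here.
scores = [0.7, 0.4, 0.9, 0.6]    # one alternative's scores on four criteria
weights = [0.4, 0.3, 0.2, 0.1]   # OWA weighting vector (sums to 1)
print(owa(scores, weights))      # 0.9*0.4 + 0.7*0.3 + 0.6*0.2 + 0.4*0.1 = 0.73
```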
Abstract:
Liposomes have been imaged using a plethora of techniques. However, few of these methods offer the ability to study these systems in their natural hydrated state, without the requirement of drying, staining and fixation of the vesicles. Yet the ability to image a liposome in its hydrated state is the ideal scenario for visualization of these dynamic lipid structures, and environmental scanning electron microscopy (ESEM), with its ability to image wet systems without prior sample preparation, offers potential advantages over the above methods. In our studies, we have used ESEM not only to investigate the morphology of liposomes and niosomes but also to dynamically follow the changes in structure of lipid films and liposome suspensions as water condenses onto or evaporates from the sample. In particular, changes in liposome morphology were studied using ESEM in real time to investigate the resistance of liposomes to coalescence during dehydration, thereby providing an alternative assay of liposome formulation and stability. Based on this protocol, we have also studied niosome-based systems and cationic liposome/DNA complexes. Copyright © Informa Healthcare.
Abstract:
This book challenges the accepted notion that the transition from the command economy to market-based systems is complete across the post-Soviet space. While it is noted that different political economies have developed in such states, such as Russia's 'managed democracy', events such as Ukraine gaining 'market economy status' from the European Union and acceding to the World Trade Organisation in 2008 are taken as evidence that the reform period is over. Such thinking is based on numerous assumptions: specifically, that economic transition has defined start and end points, that the formal economy now has primacy over other forms of economic practice, and that national economic growth leads to the 'trickle down' of wealth to those marginalised by the transition process. Based on extensive ethnographic and quantitative research conducted in Ukraine and Russia between 2004 and 2007, this book questions these assumptions by arguing that the economies that operate across post-Soviet spaces are far from the textbook idea of a market economy. Through this, the whole notion of 'transition' is problematised and the importance of informal economies to everyday life is demonstrated. Using case studies of various sectors, such as entrepreneurial behaviour and the higher education system, it is also shown how corruption has invaded almost all sectors of post-Soviet everyday life.