883 results for Agent-based systems
Abstract:
We show how hydrogenation of graphene nanoribbons at small concentrations can open avenues toward carbon-based spintronics applications regardless of any specific edge termination or passivation of the nanoribbons. Density-functional theory calculations show that an adsorbed H atom induces a spin density on the surrounding π orbitals whose symmetry and degree of localization depend on the distance to the edges of the nanoribbon. As expected for graphene-based systems, these induced magnetic moments interact ferromagnetically or antiferromagnetically depending on the relative graphene sublattice of adsorption, but the magnitudes of the interactions are found to vary strongly with the position of the H atoms relative to the edges. We also calculate, with the help of the Hubbard model, the transport properties of hydrogenated armchair semiconducting graphene nanoribbons in the diluted regime and show how the exchange coupling between H atoms can be exploited in the design of novel magnetoresistive devices.
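For context, the single-band Hubbard Hamiltonian commonly used for the π electrons of graphene in such transport studies can be written as below; the notation (hopping t, on-site repulsion U) is standard, and the specific parameter values used in the paper are not reproduced here.

```latex
H = -t \sum_{\langle i,j \rangle, \sigma} \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right) + U \sum_{i} n_{i\uparrow}\, n_{i\downarrow}
```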
Abstract:
Corneal and anterior segment imaging techniques have become a crucial tool in the clinical practice of ophthalmology, with a great variety of applications, such as corneal curvature and pachymetric analysis, detection of ectatic corneal conditions, anatomical study of the anterior segment prior to phakic intraocular lens implantation, or densitometric analysis of the crystalline lens. From the Placido-based systems that allow only a characterization of the geometry of the anterior corneal surface to the Scheimpflug photography-based systems that provide a characterization of the cornea, anterior chamber, and crystalline lens, there is a great variety of devices capable of analyzing different anatomical parameters with very high precision. To date, Scheimpflug photography-based systems are the devices providing the most complete analysis of the anterior segment in a non-invasive way. Further developments in anterior segment imaging technologies are required to improve the non-invasive analysis of the crystalline lens structure, as well as of the ocular structures behind the iris when the pupil is not dilated.
Abstract:
Feature selection is an important and active issue in clustering and classification problems. Choosing an adequate feature subset reduces the dataset dimensionality, which decreases the computational complexity of classification and improves classifier performance by avoiding redundant or irrelevant features. Although feature selection can be formally defined as an optimisation problem with only one objective, that is, the classification accuracy obtained by using the selected feature subset, in recent years some multi-objective approaches to this problem have been proposed. These either select features that improve not only the classification accuracy but also the generalisation capability in the case of supervised classifiers, or counterbalance the bias toward lower or higher numbers of features exhibited by some of the methods used to validate the clustering/classification in the case of unsupervised classifiers. The main contribution of this paper is a multi-objective approach for feature selection and its application to an unsupervised clustering procedure based on Growing Hierarchical Self-Organising Maps (GHSOMs) that includes a new method for unit labelling and efficient determination of the winning unit. In the network anomaly detection problem considered here, this multi-objective approach makes it possible not only to differentiate between normal and anomalous traffic but also among different anomalies. The efficiency of our proposals has been evaluated using the well-known DARPA/NSL-KDD datasets, which contain extracted features and labelled attacks from around 2 million connections. The selected feature sets computed in our experiments provide detection rates of up to 99.8% for normal traffic and up to 99.6% for anomalous traffic, as well as accuracy values of up to 99.12%.
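To make the two-objective trade-off concrete, here is a minimal sketch (not the authors' GHSOM-based procedure) that scores randomly sampled feature subsets on an accuracy estimate versus subset size and keeps the Pareto-optimal candidates; the evaluator passed in is a placeholder for a real classifier or clustering validity measure.

```python
import random

def pareto_front(scored):
    """Keep candidates not dominated in (maximise accuracy, minimise size)."""
    front = []
    for subset, (acc, size) in scored:
        dominated = any(
            a2 >= acc and n2 <= size and (a2 > acc or n2 < size)
            for _, (a2, n2) in scored
        )
        if not dominated:
            front.append((subset, (acc, size)))
    return front

def random_search(all_features, evaluate, n_trials=200, seed=0):
    """Sample random feature subsets and return the Pareto-optimal ones.

    `evaluate` maps a subset to a detection-accuracy estimate, e.g. from a
    classifier or a clustering validity index computed on those features.
    """
    rng = random.Random(seed)
    scored = []
    for _ in range(n_trials):
        k = rng.randint(1, len(all_features))
        subset = tuple(sorted(rng.sample(all_features, k)))
        scored.append((subset, (evaluate(subset), len(subset))))
    return pareto_front(scored)

# Toy usage: a synthetic evaluator stands in for the real anomaly detector.
if __name__ == "__main__":
    features = [f"f{i}" for i in range(10)]
    informative = {"f1", "f3", "f4"}
    toy_eval = lambda s: len(informative & set(s)) / len(informative) - 0.01 * len(s)
    for subset, objectives in random_search(features, toy_eval):
        print(subset, objectives)
```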
Abstract:
The use of microprocessor-based systems is gaining importance in application domains where safety is a must. For this reason, there is a growing concern about the mitigation of single event upset (SEU) and single event transient (SET) effects. This paper presents a new hybrid technique aimed at protecting both the data and the control flow of embedded applications running on microprocessors. On the one hand, the approach is based on software redundancy techniques for correcting errors produced in the data. On the other hand, control-flow errors can be detected by reusing the on-chip debug interface existing in most modern microprocessors. Experimental results show an increase in system reliability of more than two orders of magnitude in terms of mitigation of both SEUs and SETs. Furthermore, the overheads incurred by our technique are perfectly acceptable for low-cost systems.
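As a rough illustration of what software data redundancy means in practice (a conceptual sketch, not the paper's specific transformation rules), the snippet below triplicates a variable and repairs a corrupted copy by majority voting on each read:

```python
class TripleRedundant:
    """Conceptual sketch of variable-level triplication with majority voting.

    This illustrates software data redundancy in general; real implementations
    apply such transformations at compile time to registers and memory variables.
    """

    def __init__(self, value):
        self.copies = [value, value, value]

    def read(self):
        a, b, c = self.copies
        # Majority vote: a single corrupted copy is out-voted and repaired.
        voted = a if a == b or a == c else b
        self.copies = [voted, voted, voted]   # scrub the faulty copy
        return voted

    def write(self, value):
        self.copies = [value, value, value]

# Usage: flip a bit in one copy to emulate an SEU; the vote corrects it.
x = TripleRedundant(42)
x.copies[1] ^= 0x04        # injected bit-flip in one replica
assert x.read() == 42
```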
Abstract:
Software-based techniques offer several advantages for increasing the reliability of processor-based systems at very low cost, but they cause performance degradation and an increase in code size. To meet constraints on performance and memory, we propose SETA, a new control-flow software-only technique that uses assertions to detect errors affecting the program flow. SETA is an independent technique, but it was conceived to work together with previously proposed data-flow techniques that aim at reducing performance and memory overheads. Thus, SETA is combined with such data-flow techniques and submitted to a fault injection campaign. Simulation and neutron-induced SEE tests show high fault coverage with performance and memory overheads lower than the state of the art.
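A conceptual Python sketch of signature-based control-flow checking in the spirit of such assertion techniques (the signature values and update rule are illustrative, not SETA's actual scheme): each basic block gets a compile-time signature, a run-time signature is updated at block entry, and a mismatch flags an illegal control-flow transfer.

```python
# Conceptual sketch of signature-based control-flow checking; the signature
# assignments and XOR update rule are illustrative, not SETA's actual scheme.

BLOCK_SIGNATURES = {"B0": 0x1, "B1": 0x2, "B2": 0x4}   # assigned "at compile time"
runtime_sig = 0x1                                       # starts in entry block B0

def enter_block(block_id, expected_diff):
    """Update the run-time signature on block entry and assert its value."""
    global runtime_sig
    runtime_sig ^= expected_diff                  # precomputed difference for the legal edge
    if runtime_sig != BLOCK_SIGNATURES[block_id]: # assertion inserted in every block
        raise RuntimeError(f"control-flow error detected entering {block_id}")

# Legal path B0 -> B1: the XOR difference 0x1 ^ 0x2 restores the expected signature.
enter_block("B1", 0x1 ^ 0x2)
# An illegal jump straight to B2 with the wrong difference would be caught:
# enter_block("B2", 0x1 ^ 0x2)   # raises RuntimeError
```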
Abstract:
Integrity assurance of configuration data has a significant impact on the reliability of microcontroller-based systems. This is especially true for event-driven applications whose behaviour is tightly coupled to this kind of data. This work proposes a new hybrid technique that combines hardware and software resources for detecting and recovering from soft errors in system configuration data. Our approach is based on the use of a common built-in microcontroller resource (a timer) that works jointly with a software-based technique responsible for periodically refreshing the configuration data. The experiments demonstrate that non-destructive single event effects can be effectively mitigated with reduced overheads. Results show an important increase in fault coverage for SEUs and SETs, of about one order of magnitude.
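The core idea, periodically rewriting the configuration from a protected golden copy under timer control, can be sketched roughly as follows; the register names, refresh period, and use of a software timer are illustrative assumptions rather than the paper's implementation:

```python
import threading

# Illustrative golden copy of configuration values; in the paper's setting these
# would be peripheral configuration registers of a microcontroller, and the
# refresh would run inside a built-in timer's interrupt service routine.
GOLDEN_CONFIG = {"UART_BAUD": 0x0681, "GPIO_DIR": 0x00FF, "ADC_CTRL": 0x0013}
live_config = dict(GOLDEN_CONFIG)        # stands in for the hardware registers

def refresh_config(rearm=False, period_s=0.1):
    """Scrub the live configuration back to the golden copy.

    When `rearm` is True, the function re-schedules itself, mimicking a
    periodic timer interrupt (the period here is an arbitrary placeholder).
    """
    for reg, value in GOLDEN_CONFIG.items():
        if live_config[reg] != value:    # an SEU/SET corrupted this entry
            live_config[reg] = value     # recover by rewriting it
    if rearm:
        threading.Timer(period_s, refresh_config, kwargs={"rearm": True}).start()

# Emulate a soft error and show that one scrubbing pass repairs it.
live_config["GPIO_DIR"] ^= 0x0010
refresh_config()
assert live_config == GOLDEN_CONFIG
```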
Abstract:
This paper investigates the impacts of high interest rates for borrowed capital and of credit restrictions on the structural development of four European regions. The method used is AgriPoliS, a spatial-dynamic agent-based model. It provides aggregated results at the regional level as well as results for individual farms, since farms are modelled as independent entities. Farms can choose between different investment options during the simulation. Several scenarios with different interest rates for borrowed capital on the one hand, and different levels of credit restriction on the other, are tested and compared. Results show that higher interest rates have less impact on declining production branches than on expanding ones. If they have the possibility, farms invest in the most profitable production branch, whose relative profitability may have changed under high interest rates. Credit restrictions lead farms to choose smaller, cheaper investments rather than large, expensive ones. Results also show that, in both cases, income losses due to under-investment compared to the reference situation are partially compensated by lower rental prices. The impacts on structural change also differ depending on the region and its initial situation. In summary, credit subsidies or imperfections in credit markets may have indirect impacts on the type of dominant investment and therefore on the whole regional agricultural sector as well.
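A heavily simplified sketch of the kind of agent decision involved (not AgriPoliS itself): each farm agent picks the affordable investment with the highest expected net present value under the prevailing interest rate, so a higher rate or a tighter credit limit can change, or suppress, the chosen investment. All numbers and the NPV rule are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Investment:
    name: str
    cost: float            # up-front outlay financed by borrowed capital
    annual_return: float   # expected yearly net return
    lifetime: int          # years

def net_present_value(inv: Investment, interest_rate: float) -> float:
    """Discounted returns minus cost; a stand-in for AgriPoliS's richer accounting."""
    discounted = sum(inv.annual_return / (1 + interest_rate) ** t
                     for t in range(1, inv.lifetime + 1))
    return discounted - inv.cost

def choose_investment(options, interest_rate, credit_limit):
    """Pick the affordable option with the highest positive NPV, if any."""
    affordable = [o for o in options if o.cost <= credit_limit]
    best = max(affordable, key=lambda o: net_present_value(o, interest_rate), default=None)
    if best and net_present_value(best, interest_rate) > 0:
        return best
    return None

# Illustrative options: a higher interest rate or a tighter credit limit pushes the
# agent from the large stable toward the smaller machinery investment, or to none.
options = [Investment("large stable", 500_000, 55_000, 20),
           Investment("machinery", 120_000, 16_000, 12)]
print(choose_investment(options, interest_rate=0.04, credit_limit=600_000))
print(choose_investment(options, interest_rate=0.10, credit_limit=150_000))
```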
Abstract:
This contribution focuses on analyzing the quality of democracy of the United States (U.S.) and of Austria by using a comparative approach. Even though comparisons are not the only possible or legitimate method of research, this analysis is based on the opinion that comparisons provide crucial analytical perspectives and learning opportunities. The proposition put forward directly is the following: national political systems are comprehensively understood only by using an international comparative approach. International comparisons (of country-based systems) are common (see the status of comparative politics, for example in Sodaro, 2004). Comparisons do not have to be based necessarily on national systems alone, but can also be carried out using “within”-comparisons inside (or beyond) sub-units or regional sub-national systems, for instance the individual provinces in the case of Austria (Campbell, 2007, p. 382).
Abstract:
"College of Engineering, UILU-ENG-89-1757."
Abstract:
The literature contains a number of reports of early work involving telemedicine and chronic disease; however, there are comparatively few studies in asthma. Most of the telemedicine studies in asthma have investigated the use of remote monitoring of patients in the home, e.g. transmitting spirometry data via a telephone modem to a central server. The primary objective of these studies was to improve management. A secondary benefit is that patient adherence to prescribed treatment is also likely to be improved. Early results are encouraging; home monitoring in a randomized controlled trial in Japan significantly reduced the number of emergency room visits by patients with poorly controlled asthma. Other studies have described the cost-benefits of a specialist asthma nurse who can manage patients by telephone contact, as well as deliver asthma education. Many web-based systems are available for the general public or healthcare professionals to improve education in asthma, although their quality is highly variable. The work on telemedicine in asthma clearly shows that the technique holds promise in a number of areas. Unfortunately - as in telemedicine generally - most of the literature in patients with asthma refers to pilot trials and feasibility studies with short-term outcomes. Large-scale, formal research trials are required to establish the cost-effectiveness of telemedicine in asthma.
Abstract:
Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns, with transitional gradients from one vegetation community to another. Arbitrary, and often unrealistic, sharp boundaries can be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including sampling sites, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, model validation with an independent data set, and scale assessment of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r² = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for conservation and forestry planning of poorly studied areas. (c) 2006 Elsevier B.V. All rights reserved.
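One way to picture the final GIS integration step (a simplification, not the authors' exact procedure): each community model produces a probability surface over the study area, and the composite pre-clearing map assigns each grid cell the community whose model scores highest there. The rasters below are random placeholders.

```python
import numpy as np

# Illustrative probability surfaces, one per vegetation community model; the
# study integrated 28 such models over the Innisfail Lowlands grid.
rng = np.random.default_rng(0)
communities = ["rainforest", "sclerophyll", "swamp"]
prob_surfaces = {c: rng.random((4, 5)) for c in communities}   # placeholder rasters

# Stack the surfaces and take the per-cell argmax to build the composite map.
stack = np.stack([prob_surfaces[c] for c in communities])       # shape (n_models, rows, cols)
composite_idx = stack.argmax(axis=0)                            # winning model per cell
composite_map = np.array(communities)[composite_idx]

print(composite_map)
```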
Abstract:
Knowledge maintenance is a major challenge for both knowledge management and the Semantic Web. Operating over the Semantic Web, there will be a network of collaborating agents, each with their own ontologies or knowledge bases. Change in the knowledge state of one agent may need to be propagated across a number of agents and their associated ontologies. The challenge is to decide how to propagate a change of knowledge state. The effects of a change in knowledge state cannot be known in advance, and so an agent cannot know who should be informed unless it adopts a simple ‘tell everyone – everything’ strategy. This situation is highly reminiscent of the classic Frame Problem in AI. We argue that for agent-based technologies to succeed, far greater attention must be given to creating an appropriate model for knowledge update. In a closed system, simple strategies are possible (e.g. ‘sleeping dog’ or ‘cheap test’ or even complete checking). However, in an open system where cause and effect are unpredictable, a coherent cost-benefit based model of agent interaction is essential. Otherwise, the effectiveness of every act of knowledge update/maintenance is brought into question.
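A toy sketch of the cost-benefit idea (an assumption-laden illustration, not a model proposed in the paper): an agent propagates an update to a neighbour only when the estimated benefit of keeping that neighbour consistent outweighs the estimated cost of notifying and re-checking.

```python
from dataclasses import dataclass

@dataclass
class Neighbour:
    name: str
    relevance: float             # estimated probability the update affects this agent
    value_if_consistent: float   # benefit of that agent staying up to date
    notify_cost: float           # cost of sending, parsing and re-checking the update

def propagation_targets(neighbours, threshold=0.0):
    """Propagate only where expected benefit exceeds cost (a toy decision rule)."""
    targets = []
    for n in neighbours:
        expected_benefit = n.relevance * n.value_if_consistent
        if expected_benefit - n.notify_cost > threshold:
            targets.append(n.name)
    return targets

# Neither 'tell everyone everything' nor 'tell no one': only worthwhile updates go out.
neighbours = [Neighbour("catalogue-agent", 0.8, 10.0, 2.0),
              Neighbour("archive-agent", 0.1, 5.0, 2.0)]
print(propagation_targets(neighbours))   # -> ['catalogue-agent']
```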
Abstract:
A plethora of techniques for the imaging of liposomes and other bilayer vesicles are available. However, sample preparation and the technique chosen should be carefully considered in conjunction with the information required. For example, larger vesicles such as multilamellar and giant unilamellar vesicles can be viewed using light microscopy, and whilst vesicle confirmation and sizing can be undertaken prior to additional physical characterisation or more detailed microscopy, the technique is limited in terms of resolution. To consider the options available for visualising liposome-based systems, a wide range of microscopy techniques are described and discussed here: these include light, fluorescence and confocal microscopy, and various electron microscopy techniques such as transmission, cryo, freeze-fracture and environmental scanning electron microscopy. Their applications, advantages and disadvantages are reviewed with regard to their use in the analysis of lipid vesicles.
Abstract:
This second issue of Knowledge Management Research & Practice (KMRP) continues the international nature of the first issue, with papers from authors based on four different continents. There are five regular papers, plus the first of what is intended to be an occasional series of 'position papers' from respected figures in the knowledge management field, who have specific issues they wish to raise from a personal standpoint. The first two regular papers are both based on case studies. The first is 'Aggressively pursuing knowledge management over two years: a case study at a US government organization' by Jay Liebowitz. Liebowitz is well known to both academics and practitioners as an author on knowledge management and knowledge-based systems. Government departments in many Western countries must soon face up to the problems that will occur as the 'baby boomer' generation reaches retirement age over the next decade. This paper describes how one particular US government organization has attempted to address this situation (and others) through the introduction of a knowledge management initiative. The second case study paper is 'Knowledge creation through the synthesizing capability of networked strategic communities: case study on new product development in Japan' by Mitsuru Kodama. This paper looks at the importance of strategic communities - communities that have strategic relevance and support - in knowledge management. Here, the case study organization is Nippon Telegraph and Telephone Corporation (NTT), a Japanese telecommunication firm. The third paper is 'Knowledge management and intellectual capital: an empirical examination of current practice in Australia' by Albert Zhou and Dieter Fink. This paper reports the results of a survey carried out in 2001, exploring the practices relating to knowledge management and intellectual capital in Australia and the relationship between them. The remaining two regular papers are conceptual in nature. The fourth is 'The enterprise knowledge dictionary' by Stuart Galup, Ronald Dattero and Richard Hicks. Galup, Dattero and Hicks propose the concept of an enterprise knowledge dictionary and its associated knowledge management system architecture as offering the appropriate form of information technology to support various different types of knowledge sources, while behaving as a single source from the user's viewpoint. The fifth and final regular paper is 'Community of practice and metacapabilities' by Geri Furlong and Leslie Johnson. This paper looks at the role of communities of practice in learning in organizations. Its emphasis is on metacapabilities - the properties required to learn, develop and apply skills. This discussion takes work on learning and core competences to a higher level. Finally, this issue includes a position paper 'Innovation as an objective of knowledge management. Part I: the landscape of management' by Dave Snowden. Snowden has been highly visible in the knowledge management community thanks to his role as the Director of IBM Global Services' Canolfan Cynefin Centre. He has helped many government and private sector organizations to consider their knowledge management problems and strategies. This, the first part of a two-part paper, is inspired by the notion of complexity. In it, Snowden calls for what he sees as a 20th century emphasis on designed systems for knowledge management to be consigned to history, and replaced by a 21st century emphasis on emergence.
Letters to the editor on this, or any other topic related to knowledge management research and practice, are welcome. We trust that you will find the contributions stimulating, and again invite you to contribute your own paper(s) to future issues of KMRP.