22 results for Modeling information
in CentAUR: Central Archive University of Reading - UK
Modeling of atmospheric effects on InSAR measurements by incorporating terrain elevation information
Abstract:
We propose an elevation-dependent calibration method to correct for the water vapour-induced delays over Mt. Etna that affect interferometric synthetic aperture radar (InSAR) results. Water vapour delay fields are modelled from individual zenith delay estimates on a network of continuous GPS receivers. These are interpolated using simple kriging with varying local means over two domains, above and below 2 km in altitude. Test results with data from a meteorological station and 14 continuous GPS stations over Mt. Etna show that a reduction of the mean phase delay field of about 27% is achieved after the model is applied to a 35-day interferogram.
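A minimal sketch of the interpolation step described above: zenith wet delays from GPS stations are reduced by an elevation-dependent local mean, fitted separately above and below 2 km, and the residuals are interpolated by simple kriging with varying local means. The station data, covariance model and all parameter values are illustrative assumptions, not values from the paper.

```python
# Sketch: simple kriging with varying local means (SKlm) for GPS zenith wet delays,
# with the local mean fitted separately above and below 2 km elevation.
import numpy as np

def exp_cov(h, sill=1.0, rang=10.0):
    """Exponential covariance model (sill and range are assumed values)."""
    return sill * np.exp(-h / rang)

def sklm(stations_xy, delays, elevations, targets_xy, target_elev):
    # Local mean: elevation-dependent linear trend per domain (above/below 2 km).
    means = np.full_like(delays, delays.mean(), dtype=float)
    t_means = np.full(len(targets_xy), delays.mean())
    for mask_s, mask_t in [(elevations < 2.0, target_elev < 2.0),
                           (elevations >= 2.0, target_elev >= 2.0)]:
        if mask_s.sum() >= 2:
            a, b = np.polyfit(elevations[mask_s], delays[mask_s], 1)
            means[mask_s] = a * elevations[mask_s] + b
            t_means[mask_t] = a * target_elev[mask_t] + b
    resid = delays - means

    # Simple kriging of the residuals.
    d_ss = np.linalg.norm(stations_xy[:, None] - stations_xy[None], axis=2)
    C = exp_cov(d_ss) + 1e-6 * np.eye(len(delays))     # small nugget for stability
    d_ts = np.linalg.norm(targets_xy[:, None] - stations_xy[None], axis=2)
    weights = np.linalg.solve(C, exp_cov(d_ts).T)      # one weight column per target
    return t_means + weights.T @ resid

# Illustrative data: 14 stations with made-up coordinates, elevations and delays.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 30, size=(14, 2))                  # km
elev = rng.uniform(0.1, 3.3, size=14)                  # km
zwd = 0.20 - 0.05 * elev + 0.005 * rng.normal(size=14) # zenith wet delay, m
grid = np.array([[10.0, 10.0], [20.0, 5.0]])
print(sklm(xy, zwd, elev, grid, np.array([1.0, 2.5])))
```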
Abstract:
Smooth flow of production in construction is hampered by the disparity between individual trade teams' goals and the goals of stable production flow for the project as a whole. This is exacerbated by the difficulty of visualizing the flow of work in a construction project. Building information modeling (BIM) addresses some of these issues: it provides a powerful platform for visualizing work flow in control systems that also enable pull flow and deeper collaboration between teams on and off site. The requirements for implementation of a BIM-enabled pull flow construction management software system based on the Last Planner System™, called ‘KanBIM’, have been specified, and a set of functional mock-ups of the proposed system has been implemented and evaluated in a series of three focus group workshops. The requirements cover the areas of maintenance of work flow stability, enabling negotiation and commitment between teams, lean production planning with sophisticated pull flow control, and effective communication and visualization of flow. The evaluation results show that the system holds the potential to improve work flow and reduce waste by providing both process and product visualization at the work face.
Abstract:
This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component is aimed at representing knowledge of concepts in the domain, so that their explanations can be adapted to user needs. The first part of the paper describes two studies aimed at analysing user requirements. The first one is a questionnaire study which examines the respondents' familiarity with concepts. The second one is an analysis of concept descriptions in textbooks and from expert epidemiologists, which examines how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies have been used to design the user modeling component of EPIAIM. This module operates in two steps. In the first step, a few trigger questions allow the activation of a stereotype that includes a "body" and an "inference component". The body is the representation of the body of knowledge that a class of users is expected to know, along with the probability that the knowledge is known. In the inference component, the learning process of concepts is represented as a belief network. Hence, in the second step the belief network is used to refine the initial default information in the stereotype's body. This is done by asking a few questions on those concepts where it is uncertain whether or not they are known to the user, and propagating this new evidence to revise the whole situation. The system has been implemented on a workstation under UNIX. An example of the system in operation is presented, and advantages and limitations of the approach are discussed.
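A minimal sketch of the two-step user model described above: a stereotype supplies prior probabilities that each concept is known, and the answer to a question about an uncertain concept is propagated to related concepts by Bayesian updating. The concepts, priors and conditional probabilities are invented for illustration; EPIAIM's actual belief network is richer than this pairwise scheme.

```python
# Step 1: stereotype body -- prior P(concept is known) for an assumed user class.
stereotype = {"odds_ratio": 0.5, "confounding": 0.6, "relative_risk": 0.8}

# Pairwise links: P(B known | A known), P(B known | A unknown). Illustrative values.
links = {
    ("relative_risk", "odds_ratio"): (0.7, 0.2),
    ("odds_ratio", "confounding"): (0.8, 0.4),
}

def propagate(beliefs, concept, known):
    """Step 2: absorb a yes/no answer about `concept` and revise its neighbours."""
    beliefs = dict(beliefs)
    beliefs[concept] = 1.0 if known else 0.0
    for (a, b), (p_b_given_a, p_b_given_not_a) in links.items():
        if a == concept:                 # update the child by total probability
            beliefs[b] = p_b_given_a if known else p_b_given_not_a
        elif b == concept:               # update the parent by Bayes' rule
            prior_a = beliefs[a]
            like = p_b_given_a if known else 1 - p_b_given_a
            like_not = p_b_given_not_a if known else 1 - p_b_given_not_a
            evidence = like * prior_a + like_not * (1 - prior_a)
            beliefs[a] = like * prior_a / evidence if evidence else prior_a
    return beliefs

# Ask about the most uncertain concept (closest to 0.5) and revise the rest.
target = min(stereotype, key=lambda c: abs(stereotype[c] - 0.5))
revised = propagate(stereotype, target, known=True)
print(target, revised)
```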
Abstract:
It is known that germin, which is a marker of the onset of growth in germinating wheat, is an oxalate oxidase, and also that germins possess sequence similarity with legumin and vicilin seed storage proteins. These two pieces of information have been combined in order to generate a 3D model of germin based on the structure of vicilin and to examine the model with regard to a potential oxalate oxidase active site. A cluster of three histidine residues has been located within the conserved beta-barrel structure. While there is a relatively low level of overall sequence similarity between the model and the vicilin structures, the conservation of amino acids important in maintaining the scaffold of the beta-barrel lends confidence to the juxtaposition of the histidine residues. The cluster is similar structurally to those found in copper amine oxidase and other proteins, leading to the suggestion that it defines a metal-binding location within the oxalate oxidase active site. It is also proposed that the structural elements involved in intermolecular interactions in vicilins may play a role in oligomer formation in germin/oxalate oxidase.
Abstract:
Semiotics is the study of signs. Application of semiotics in information systems design is based on the notion that information systems are organizations within which agents deploy signs in the form of actions according to a set of norms. An analysis of the relationships among the agents, their actions and the norms would give a better specification of the system. Distributed multimedia systems (DMMS) can be viewed as systems consisting of many dynamic, self-controlled normative agents engaging in complex interaction and processing of multimedia information. This paper reports the work of applying the semiotic approach to the design and modeling of DMMS, with emphasis on using semantic analysis under the semiotic framework. A semantic model of DMMS describing various components and their ontological dependencies is presented, which then serves as a design model and is implemented in a semantic database. Benefits of using the semantic database are discussed with reference to various design scenarios.
Abstract:
This paper describes the design and implementation of an agent based network for the support of collaborative switching tasks within the control room environment of the National Grid Company plc. This work includes aspects from several research disciplines, including operational analysis, human computer interaction, finite state modelling techniques, intelligent agents and computer supported co-operative work. Aspects of these procedures have been used in the analysis of collaborative tasks to produce distributed local models for all involved users. These models have been used as the basis for the production of local finite state automata. These automata have then been embedded within an agent network together with behavioural information extracted from the task and user analysis phase. The resulting support system is capable of task and communication management within the transmission despatch environment.
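A minimal sketch of the kind of local finite state automaton the agents might embed, assuming illustrative states and events rather than the National Grid Company's actual switching procedures.

```python
# One agent's local automaton for a collaborative switching task: each agent
# holds an FSM for its user's part of the task and reacts to task/communication
# events. States and events below are illustrative only.
from dataclasses import dataclass, field

@dataclass
class SwitchingAgent:
    state: str = "idle"
    # (current state, event) -> next state
    transitions: dict = field(default_factory=lambda: {
        ("idle", "request_received"): "planning",
        ("planning", "plan_agreed"): "awaiting_authorisation",
        ("awaiting_authorisation", "authorised"): "executing",
        ("executing", "switching_complete"): "confirming",
        ("confirming", "confirmation_sent"): "idle",
    })

    def handle(self, event: str) -> str:
        """Advance the automaton; unknown events leave the state unchanged."""
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

agent = SwitchingAgent()
for event in ["request_received", "plan_agreed", "authorised",
              "switching_complete", "confirmation_sent"]:
    print(event, "->", agent.handle(event))
```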
Abstract:
Monitoring Earth's terrestrial water conditions is critically important to many hydrological applications such as global food production; assessing water resources sustainability; and flood, drought, and climate change prediction. These needs have motivated the development of pilot monitoring and prediction systems for terrestrial hydrologic and vegetative states, but to date only at the rather coarse spatial resolutions (∼10–100 km) over continental to global domains. Adequately addressing critical water cycle science questions and applications requires systems that are implemented globally at much higher resolutions, on the order of 1 km, resolutions referred to as hyperresolution in the context of global land surface models. This opinion paper sets forth the needs and benefits for a system that would monitor and predict the Earth's terrestrial water, energy, and biogeochemical cycles. We discuss six major challenges in developing a system: improved representation of surface‐subsurface interactions due to fine‐scale topography and vegetation; improved representation of land‐atmospheric interactions and resulting spatial information on soil moisture and evapotranspiration; inclusion of water quality as part of the biogeochemical cycle; representation of human impacts from water management; utilizing massively parallel computer systems and recent computational advances in solving hyperresolution models that will have up to 10⁹ unknowns; and developing the required in situ and remote sensing global data sets. We deem the development of a global hyperresolution model for monitoring the terrestrial water, energy, and biogeochemical cycles a “grand challenge” to the community, and we call upon the international hydrologic community and the hydrological science support infrastructure to endorse the effort.
Abstract:
Even minor changes in user activity can bring about significant energy savings within built space. Many building performance assessment methods have been developed; however, these often disregard the impact of user behavior (i.e. the social, cultural and organizational aspects of the building). Building users currently have limited means of determining how sustainable they are, in the context of the specific building structure and/or when compared to other users performing similar activities; it is therefore easy for users to dismiss their energy use. To support sustainability, buildings must be able to monitor energy use, identify areas of potential change in the context of user activity and provide contextually relevant information to facilitate persuasion management. If the building is able to provide users with detailed information about which specific user activities are wasteful, this should provide considerable motivation to implement positive change. This paper proposes using a dynamic and temporal semantic model to populate information within a model of persuasion, in order to manage user change. By semantically mapping a building, and linking this to persuasion management, we suggest that: i) building energy use can be monitored and analyzed over time; ii) persuasive management can be facilitated to move user activity towards sustainability.
Abstract:
This paper proposes and demonstrates an approach, Skilloscopy, to the assessment of decision makers. In an increasingly sophisticated, connected and information-rich world, decision making is becoming both more important and more difficult. At the same time, modelling decision-making on computers is becoming more feasible and of interest, partly because the information-input to those decisions is increasingly on record. The aims of Skilloscopy are to rate and rank decision makers in a domain relative to each other: the aims do not include an analysis of why a decision is wrong or suboptimal, nor the modelling of the underlying cognitive process of making the decisions. In the proposed method a decision-maker is characterised by a probability distribution of their competence in choosing among quantifiable alternatives. This probability distribution is derived by classic Bayesian inference from a combination of prior belief and the evidence of the decisions. Thus, decision-makers’ skills may be better compared, rated and ranked. The proposed method is applied and evaluated in the game domain of Chess. A large set of games by players across a broad range of the World Chess Federation (FIDE) Elo ratings has been used to infer the distribution of players’ ratings directly from the moves they play rather than from game outcomes. Demonstration applications address questions frequently asked by the Chess community regarding the stability of the Elo rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The method of Skilloscopy may be applied in any decision domain where the value of the decision-options can be quantified.
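A minimal sketch of the Bayesian rating idea: a decision-maker is characterised by a probability distribution over discrete competence levels, updated from the value of each option they choose. The softmax choice model, with competence acting as an inverse temperature over option values, is an assumption made for illustration and is not necessarily Skilloscopy's exact likelihood.

```python
import numpy as np

competence_levels = np.linspace(0.5, 5.0, 10)   # assumed grid of skill levels
posterior = np.full(len(competence_levels), 1.0 / len(competence_levels))  # uniform prior

def choice_likelihood(option_values, chosen, c):
    """P(chosen option | competence c): softmax over engine-style option values."""
    z = np.exp(c * (option_values - option_values.max()))
    return z[chosen] / z.sum()

def update(posterior, option_values, chosen):
    """Classic Bayesian update: posterior ∝ prior × likelihood of the decision."""
    like = np.array([choice_likelihood(option_values, chosen, c)
                     for c in competence_levels])
    posterior = posterior * like
    return posterior / posterior.sum()

# Two illustrative decisions: option values (e.g. move evaluations) and the index chosen.
decisions = [(np.array([0.3, 0.1, -0.2]), 0),   # picked the best option
             (np.array([0.5, 0.4, 0.0]), 1)]    # picked the second-best
for values, chosen in decisions:
    posterior = update(posterior, values, chosen)

print("expected competence:", np.dot(competence_levels, posterior))
```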
Abstract:
Purpose: Increasing costs of health care, fuelled by demand for high-quality, cost-effective healthcare, have driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adaptation of Clinical Pathways (CP) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and as a result can influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics by capturing knowledge from the syntactic, semantic and pragmatic levels up to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables the generation of a semantically rich representation of CP knowledge. Semantic Analysis Method (SAM) is applied to explicitly represent the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and rules that govern the actions identified on the ontology chart. Information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes can be best tackled by analyzing stakeholders, which we treat as social agents, their goals and their patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur. To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CP generated from SAM together with norms will enrich the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, the ultimate power of decision making in exceptional circumstances.
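A minimal sketch of how a behavioural norm identified by NAM might be formally specified, using the organizational-semiotics pattern "whenever <context> if <condition> then <agent> is <obliged/permitted/prohibited> to <action>". The clinical content is invented for illustration and is not taken from any actual pathway.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Norm:
    context: str
    condition: Callable[[dict], bool]
    agent: str
    deontic: str          # "obliged", "permitted" or "prohibited"
    action: str

# One illustrative norm; a real pathway would carry many, derived from norm analysis.
norms = [
    Norm(context="admission to chest-pain pathway",
         condition=lambda p: p.get("troponin_elevated", False),
         agent="cardiology registrar",
         deontic="obliged",
         action="order a 12-lead ECG within 10 minutes"),
]

def applicable_actions(patient: dict):
    """Return the actions triggered for this patient state, one per firing norm."""
    return [f"{n.agent} is {n.deontic} to {n.action}"
            for n in norms if n.condition(patient)]

print(applicable_actions({"troponin_elevated": True}))
```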
Abstract:
In order to improve the quality of healthcare services, an integrated, large-scale medical information system is needed to adapt to the changing medical environment. In this paper, we propose a requirement-driven, hierarchical architecture for healthcare information systems. The system operates through a mapping mechanism between the layers and can thus organize functions dynamically, adapting to users' requirements. Furthermore, we introduce organizational semiotics methods to capture and analyze users' requirements through ontology charts and norms. Based on these results, the structure of the user requirement pattern (URP) is established as the driving factor of the system. Our research contributes to the design of healthcare system architectures that can adapt to the changing medical environment.
Abstract:
In this paper a modified algorithm is suggested for developing polynomial neural network (PNN) models. Optimal partial description (PD) modeling is introduced at each layer of the PNN expansion, a task accomplished using the orthogonal least squares (OLS) method. Based on the initial PD models determined by the polynomial order and the number of PD inputs, OLS selects the most significant regressor terms, reducing the output error variance. The method produces PNN models exhibiting a high level of accuracy and superior generalization capabilities. Additionally, parsimonious models are obtained comprising a considerably smaller number of parameters compared to the ones generated by means of the conventional PNN algorithm. Three benchmark examples are elaborated, including modeling of the gas furnace process as well as the iris and wine classification problems. Extensive simulation results and comparisons with other methods in the literature demonstrate the effectiveness of the suggested modeling approach.
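A minimal sketch of the OLS term-selection step: from a set of candidate polynomial regressors forming a partial description, forward orthogonal least squares keeps the terms with the largest error-reduction ratios. The stopping rule, tolerances and toy data are assumptions for illustration.

```python
import numpy as np

def ols_select(P, y, max_terms=5, tol=1e-6):
    """Forward OLS selection: return indices of the most significant columns of P."""
    selected, basis = [], []
    sst = float(y @ y)
    for _ in range(max_terms):
        best_err, best_i, best_w = 0.0, None, None
        for i in range(P.shape[1]):
            if i in selected:
                continue
            w = P[:, i].astype(float)
            for q in basis:                       # orthogonalize against chosen terms
                w -= (q @ w) / (q @ q) * q
            if w @ w < tol:
                continue
            err = (w @ y) ** 2 / ((w @ w) * sst)  # error-reduction ratio of this term
            if err > best_err:
                best_err, best_i, best_w = err, i, w
        if best_i is None:
            break
        selected.append(best_i)
        basis.append(best_w)
    return selected

# Toy partial description: quadratic terms of two inputs x1, x2.
rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=200), rng.normal(size=200)
P = np.column_stack([np.ones(200), x1, x2, x1 * x2, x1**2, x2**2])
y = 1.5 * x1 - 0.8 * x1 * x2 + 0.1 * rng.normal(size=200)
print(ols_select(P, y, max_terms=3))   # expect the x1 and x1*x2 columns among the picks
```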
Abstract:
The growing energy consumption in the residential sector represents about 30% of global demand. This calls for Demand Side Management solutions propelling change in the behaviors of end consumers, with the aim of reducing overall consumption as well as shifting it to periods in which demand, and hence the cost of generating energy, is lower. Demand Side Management solutions require detailed knowledge about the patterns of energy consumption. The profile of electricity demand in the residential sector is highly correlated with the times of active occupancy of the dwellings; therefore, in this study the occupancy patterns in Spanish properties were determined using the 2009–2010 Time Use Survey (TUS), conducted by the National Statistical Institute of Spain. The survey identifies three peaks in active occupancy, which coincide with morning, noon and evening. This information has been used as input to a stochastic model which generates active occupancy profiles of dwellings, with the aim of simulating domestic electricity consumption. TUS data were also used to identify which appliance-related activities could be considered for Demand Side Management solutions during the three peaks of occupancy.
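A minimal sketch of a stochastic active-occupancy generator of the kind described above: a first-order Markov chain over active/inactive occupancy, stepped at 10-minute resolution with time-of-day dependent transition probabilities. In the study these probabilities come from the 2009–2010 TUS data; the values below are invented purely to illustrate the mechanism.

```python
import numpy as np

STEPS_PER_DAY = 144                      # 10-minute steps

def p_become_active(step):
    """Assumed probability of switching to active occupancy, peaking morning/noon/evening."""
    hour = step / 6.0
    peaks = [(8, 1.5), (14, 1.5), (21, 2.0)]     # (centre hour, width) -- illustrative
    return min(0.9, sum(0.5 * np.exp(-((hour - c) / w) ** 2) for c, w in peaks))

def simulate_day(rng, p_stay_active=0.9):
    """Generate one dwelling's daily active-occupancy profile (0 = inactive, 1 = active)."""
    occupancy = np.zeros(STEPS_PER_DAY, dtype=int)
    state = 0                                    # start the day not actively occupied
    for t in range(STEPS_PER_DAY):
        if state == 1:
            state = 1 if rng.random() < p_stay_active else 0
        else:
            state = 1 if rng.random() < p_become_active(t) else 0
        occupancy[t] = state
    return occupancy

rng = np.random.default_rng(1)
profiles = np.array([simulate_day(rng) for _ in range(100)])
print("mean active occupancy by hour:",
      profiles.mean(axis=0).reshape(24, 6).mean(axis=1).round(2))
```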