47 results for Uncertainty in generation


Relevance: 90.00%

Abstract:

Authors from Burrough (1992) to Heuvelink et al. (2007) have highlighted the importance of GIS frameworks that can handle incomplete knowledge in data inputs, in decision rules, and in the geometries and attributes modelled. It is particularly important for this uncertainty to be characterised and quantified when GI data are used for spatial decision making. Despite a substantial and valuable literature on means of representing and encoding uncertainty and its propagation in GI (e.g., Hunter and Goodchild 1993; Duckham et al. 2001; Couclelis 2003), no framework yet exists to describe and communicate uncertainty in an interoperable way. This limits the usability of the ever-increasing Internet resources of geospatial data built on specifications that provide frameworks for the 'GeoWeb' (Botts and Robin 2007; Cox 2006). In this paper we present UncertML, an XML schema that provides a framework for describing uncertainty as it propagates through many applications, including online risk management chains. This uncertainty description ranges from simple summary statistics (e.g., mean and variance) to complex representations such as parametric, multivariate distributions at each point of a regular grid. The philosophy adopted in UncertML is that all data values are inherently uncertain, i.e. they are random variables rather than values with defined quality metadata.
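To make the idea concrete, the sketch below serialises a single uncertain value as a Gaussian distribution in the spirit of UncertML; the element names are illustrative assumptions for this example, not the published schema.

```python
# Minimal sketch of an UncertML-style encoding of one uncertain value.
# Element names are illustrative assumptions, not the published schema.
import xml.etree.ElementTree as ET

def gaussian_element(mean: float, variance: float) -> ET.Element:
    """Encode a value as a random variable with a Gaussian distribution."""
    dist = ET.Element("GaussianDistribution")
    ET.SubElement(dist, "mean").text = str(mean)
    ET.SubElement(dist, "variance").text = str(variance)
    return dist

# An uncertain sensor reading, carried as a distribution rather than a
# point value with separate quality metadata:
print(ET.tostring(gaussian_element(21.4, 0.09), encoding="unicode"))
```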

Relevance: 90.00%

Abstract:

The chromium chalcogenide spinels, MCr2X4 (M = Zn, Cd, Hg; X = O, S, Se), have been the subject of considerable interest in recent years. In each case the crystal structure is that of a normal spinel with the chromium ions exclusively occupying the octahedral (B) sites, so that when diamagnetic ions are located at the tetrahedral (A) sites the only magnetic interactions present are those between B-site ions. Despite such apparently simple circumstances, a rich variety of magnetic behaviour is exhibited. For the oxides the ground-state spin configurations are antiferromagnetic, whilst for the selenides ferromagnetic interactions dominate; several authors have drawn attention to the fact that the nature of the dominant interaction is a function of the nearest-neighbour chromium-chromium separation. However, at least two of the compounds exhibit spiral structures, and it has proved difficult to account for the various spin configurations within a unified theory of the magnetic interactions involved. More recently, the possibility of formulating a simplified interpretation of the magnetic interactions has been provided by the discovery that the crystal structure of spinels does not always conform to the centrosymmetric symmetry Fd3m that has conventionally been assumed. The deviation from this symmetry is associated with small <111> displacements of the octahedrally coordinated metal ions, and the structures so obtained are more correctly referred to the non-centrosymmetric space group F43m. In the present study, therefore, extensive X-ray diffraction data have been collected from four chromium chalcogenide specimens and used to refine the corresponding structural parameters, both assuming F43m symmetry and assuming the conventional symmetry. The diffracted intensities from three of the compounds cannot be satisfactorily accounted for on the basis of conventional symmetry, and new locations have been found for the chromium ions in these cases. It is shown, however, that these displacements in the chromium positions only partially resolve the difficulties in interpreting the magnetic behaviour. A re-examination of the magnetic data from different authors indicates much greater uncertainty in their measurements than they had claimed. Taking this into consideration, it is shown that a unified theory of magnetic behaviour for the chromium chalcogenide spinels is a real possibility.

Relevance: 90.00%

Abstract:

This is an exploratory study in a field that was previously virtually unexplored. The aim is to identify, for the benefit of innovators, the influence of industrial design on the commercial success of new science-based products used for professional and industrial purposes. The study is a contribution to the theory of success and failure in industrial innovation. The study begins by defining the terminology. To place the investigation in context, there is then a review of past attempts by official policy-making bodies to improve the competitiveness of British manufactured products through good design. To elucidate the meaning of good design, attempts to establish a coherent philosophy of style in British manufactured products during the same period are also reviewed. Following these reviews, empirical evidence is presented to identify what actually takes place in successful firms when industrial design is allocated a role in the process of technological innovation. The evidence comprises seven case studies of new science-based products, used for professional or industrial purposes, which have received Design Council Awards. To facilitate an objective appraisal, evidence was obtained through separate semi-structured interviews, the detail of which is described, with senior personnel in innovating firms, with industrial design consultants, and with professional users. The study suggests that the likelihood of commercial success in technological innovation is greater when the form, configuration, and overall appearance of a new product, together with the detail which delineates them, are consciously and expertly controlled. Moreover, uncertainty in innovation is likely to be reduced if the appearance of a new product is consciously designed to facilitate recognition and comprehension. Industrial design is an especially significant factor when a firm innovates against a background of international competition and comparable levels of technological competence in rival firms. The likelihood of success in innovation is enhanced if design is allocated a role closely identified with the total needs of the user and discrete from the engineering function in company organisation. Recent government measures, initiated since this study began, corroborate the findings.

Relevance: 90.00%

Abstract:

Background: Current guidelines recommend oral anticoagulation therapy for patients with atrial fibrillation who are at moderate-to-high risk of stroke; however, anticoagulation control (time in therapeutic range, TTR) depends on many factors. Educational and behavioural interventions may affect patients' ability to maintain their International Normalised Ratio (INR) control.

Objectives: To evaluate the effects on TTR of educational and behavioural interventions for oral anticoagulation therapy (OAT) in patients with atrial fibrillation (AF).

Search methods: We searched the Cochrane Central Register of Controlled Trials (CENTRAL) and the Database of Abstracts of Reviews of Effects (DARE) in The Cochrane Library (2012, Issue 7 of 12), MEDLINE Ovid (1950 to week 4 July 2012), EMBASE Classic + EMBASE Ovid (1947 to week 31 2012) and PsycINFO Ovid (1806 to week 5 July 2012) on 8 August 2012, and CINAHL Plus with Full Text EBSCO (to August 2012) on 9 August 2012. We applied no language restrictions.

Selection criteria: The primary outcome analysed was TTR. Secondary outcomes included decision conflict (patients' uncertainty in making health-related decisions), percentage of INRs in the therapeutic range, major bleeding, stroke and thromboembolic events, patient knowledge, patient satisfaction, quality of life (QoL), and anxiety.

Data collection and analysis: The two review authors independently extracted data. Where the data were insufficient for meta-analysis, effect sizes and confidence intervals (CIs) of the included studies were reported. Data were pooled for two outcomes: TTR and decision conflict.

Main results: Eight trials with a total of 1215 AF patients (14 to 434 AF participants per trial) were included in the review. Studies covered education, decision aids, and self-monitoring plus education. For the primary outcome of TTR, pooled data for the AF participants in two self-monitoring plus education trials showed no clear advantage for either self-monitoring plus education or usual care, with a mean difference of 6.31 (95% CI -5.63 to 18.25). For the secondary outcome of decision conflict, data from two decision aid trials favoured usual care over the decision aid in reducing decision conflict, with a mean difference of -0.1 (95% CI -0.2 to -0.02).

Authors' conclusions: There is insufficient evidence to draw definitive conclusions regarding the impact of educational or behavioural interventions on TTR in AF patients receiving OAT. More trials are needed to examine the impact of interventions on anticoagulation control in AF patients and the mechanisms by which they succeed. It is also important to explore the psychological implications for patients living with this long-term chronic condition.
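For readers unfamiliar with the pooling step, a fixed-effect inverse-variance combination of two mean differences can be sketched as follows; the numbers are invented for illustration and are not the trial data from this review.

```python
# Fixed-effect inverse-variance pooling of mean differences from two
# trials. The estimates and standard errors are invented illustrative
# numbers, not the data from this review.
def pool_fixed_effect(estimates, std_errors):
    weights = [1.0 / se**2 for se in std_errors]                 # 1 / variance
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)  # 95% CI
    return pooled, ci

md, ci = pool_fixed_effect([4.0, 9.0], [6.0, 8.0])  # two hypothetical trials
print(f"pooled MD = {md:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
# pooled MD = 5.80, 95% CI = (-3.61, 15.21)
```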

Relevance: 90.00%

Abstract:

Classification is the most basic method for organizing resources in the physical space, cyber space, socio space and mental space. Creating a unified model that can effectively manage resources across these different spaces is a challenge. The Resource Space Model (RSM) manages versatile resources with a multi-dimensional classification space, and supports generalization and specialization on multi-dimensional classifications. This paper introduces the basic concepts of RSM and proposes the Probabilistic Resource Space Model, P-RSM, to deal with uncertainty in managing various resources across the different spaces of the cyber-physical society. P-RSM's normal forms, operations and integrity constraints are developed to support effective management of the resource space. Characteristics of P-RSM are analyzed through experiments. The model also enables various services to be described, discovered and composed across multiple dimensions and abstraction levels with normal-form and integrity guarantees. Some extensions and applications of P-RSM are introduced.
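As a rough sketch of the core idea (not the paper's formal definition), a resource space can be viewed as a set of named classification dimensions, with each resource indexed by a point and, in the probabilistic variant, a membership probability attached to that placement; all names below are invented for illustration.

```python
# Illustrative sketch of a probabilistic resource space: each dimension
# is a named set of coordinates (classes), and a resource is indexed at
# a point with a membership probability. Names are invented; this is
# not the paper's formalism.
class ProbabilisticResourceSpace:
    def __init__(self, dimensions: dict[str, set[str]]):
        self.dimensions = dimensions   # e.g. {"topic": {...}, "space": {...}}
        self.index = {}                # point -> {resource: probability}

    def insert(self, resource: str, point: dict[str, str], prob: float) -> None:
        for dim, coord in point.items():
            assert coord in self.dimensions[dim], f"unknown coordinate {coord!r}"
        key = tuple(sorted(point.items()))
        self.index.setdefault(key, {})[resource] = prob

    def query(self, point: dict[str, str]) -> dict[str, float]:
        return self.index.get(tuple(sorted(point.items())), {})

rsm = ProbabilisticResourceSpace({"topic": {"uncertainty", "semantics"},
                                  "space": {"cyber", "physical", "socio", "mental"}})
rsm.insert("paper-42", {"topic": "uncertainty", "space": "cyber"}, prob=0.8)
print(rsm.query({"topic": "uncertainty", "space": "cyber"}))  # {'paper-42': 0.8}
```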

Relevance: 90.00%

Abstract:

Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty. © 2009 Springer Berlin Heidelberg.
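To give a flavour of the approach, the sketch below pairs a rigid invariant with a RELAX-ed alternative that tolerates an identified source of environmental uncertainty; the requirement wording, goal name and dict structure are invented for this example and are not the paper's notation.

```python
# Invented illustration: a goal-model entry pairing a rigid requirement
# with a RELAX-ed version that tolerates environmental uncertainty.
# Wording and structure are examples, not the paper's notation.
goal = {
    "goal": "Keep environmental readings fresh",
    "invariant": "The system SHALL sample every sensor every 5 seconds.",
    "relaxed": ("The system SHALL sample every sensor "
                "AS CLOSE AS POSSIBLE TO every 5 seconds."),
    # Source of uncertainty identified via threat-modelling-style analysis:
    "uncertainty": "network congestion may delay or drop sensor readings",
}
print(goal["relaxed"])
```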

Relevance: 90.00%

Abstract:

Requirements awareness should help optimize requirements satisfaction when factors that were uncertain at design time are resolved at runtime. We use the notion of claims to model assumptions that cannot be verified with confidence at design time. By monitoring claims at runtime, their veracity can be tested. If a claim is falsified, the effect of its negation can be propagated to the system's goal model and an alternative means of goal realization selected automatically, allowing the dynamic adaptation of the system to the prevailing environmental context. © 2011 IEEE.
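A minimal sketch of the monitoring loop this implies follows; the claim, goal, strategies and threshold are all invented for illustration.

```python
# Minimal sketch of runtime claim monitoring: if a design-time claim is
# falsified by observation, fall back to an alternative realisation of
# the goal. Claims, goals and thresholds are invented for illustration.
claims = {
    "gps_is_accurate": lambda env: env["gps_error_m"] < 10.0,
}

realisations = {
    "localise": [
        ("use_gps", ["gps_is_accurate"]),    # preferred, relies on the claim
        ("use_dead_reckoning", []),          # fallback, claim-free
    ],
}

def select(goal: str, env: dict) -> str:
    for strategy, required in realisations[goal]:
        if all(claims[c](env) for c in required):   # test claim veracity
            return strategy
    raise RuntimeError("no viable realisation")

print(select("localise", {"gps_error_m": 3.0}))   # use_gps
print(select("localise", {"gps_error_m": 40.0}))  # claim falsified -> use_dead_reckoning
```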

Relevance: 90.00%

Abstract:

The complexity of environments faced by dynamically adaptive systems (DAS) means that the RE process will often be iterative, with analysts revisiting the system specifications in the light of new environmental understanding gained from experimental deployments, or even after final deployment. The ability to trace backwards to an identified environmental assumption, and to trace forwards to find the areas of a DAS's specification that are affected by changes in environmental understanding, supports this necessarily iterative RE process. This paper demonstrates how claims can be used as markers for areas of uncertainty in a DAS specification. The paper demonstrates backward tracing using claims to identify faulty environmental understanding, and forward tracing to allow generation of new behaviour in the form of policy adaptations and models for transitioning the running system. © 2011 ACM.

Relevance: 90.00%

Abstract:

The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate with and understand one another. In many domains (e.g. geospatial) the data being described contain some uncertainty, often due to incomplete knowledge; meaningful processing of this data requires these uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for the interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular we adopt a Bayesian perspective and focus on the case where the inputs and outputs are naturally treated as random variables.

This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. as realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution and can easily be extended. Uniform Resource Identifiers (URIs) introduce semantics to the soft-typed elements by linking to these dictionary definitions.

The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
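The "set of realisations from a Monte Carlo treatment" mentioned above can be pictured with a short sketch; the observation values and the downstream processing function are invented for illustration.

```python
# Propagating a Gaussian observation error through a processing step by
# carrying a set of Monte Carlo realisations instead of a point value.
# The observation and the processing function are invented.
import random
import statistics

random.seed(0)
mean_obs, sd_obs = 21.4, 0.3        # an uncertain observation

def process(t: float) -> float:
    """A downstream processing step (invented for this example)."""
    return 0.5 * t + 2.0

realisations = [process(random.gauss(mean_obs, sd_obs)) for _ in range(10_000)]
print(statistics.mean(realisations))    # ~12.7: propagated mean
print(statistics.stdev(realisations))   # ~0.15: propagated uncertainty
```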


Relevance: 90.00%

Abstract:

The creation of new ventures is a process characterized by the need to decide and take action in the face of uncertainty, and this is particularly so in the case of technology-based ventures. Effectuation theory (Sarasvathy, 2001) has advanced two possible approaches for making decisions while facing uncertainty in the entrepreneurial process. Causation logic is based on prediction and aims at lowering uncertainty, whereas effectuation logic is based on non-predictive action and aims at working with uncertainty. This study aims to generate more fine-grained insight into the dynamics of effectuation and causation over time. We address the following questions: (1) What patterns can be found in the effectual and causal behaviour of technology-based new ventures over time? (2) How can patterns in the dynamics of effectuation and causation be explained?

Relevance: 90.00%

Abstract:

This investigation aimed to pinpoint the elements of motor timing control that are responsible for the increased variability commonly found in children with developmental dyslexia on paced or unpaced motor timing tasks (Chapter 3). Such temporal processing abilities are thought to be important for developing the appropriate phonological representations required for the development of literacy skills. Similar temporal processing difficulties arise in other developmental disorders such as Attention Deficit Hyperactivity Disorder (ADHD). Motor timing behaviour in developmental populations was examined in the context of models of typical human timing behaviour, in particular the Wing-Kristofferson model, which allows estimation of the contribution of different timing control systems, namely the timekeeper and implementation systems (Chapter 2 and Methods Chapters 4 and 5). Research examining timing in populations with dyslexia and ADHD has been inconsistent in the application of stimulus parameters, and so the first investigation compared motor timing behaviour across different stimulus conditions (Chapter 6). The results question the suitability of visual timing tasks, which produced greater performance variability than auditory or bimodal tasks. Following an examination of the validity of the Wing-Kristofferson model (Chapter 7), the model was applied to time series data from an auditory timing task completed by children with reading difficulties and matched control groups (Chapter 8). Expected group differences in timing performance were not found; however, associations between performance and measures of literacy and attention were present. Results also indicated that measures of attention and literacy dissociated in their relationships with components of timing: literacy ability correlated with timekeeper variance, and attentional control with implementation variance. It is proposed that the timing deficits associated with reading difficulties are attributable to central timekeeping processes, and so the contribution of error correction to timing performance was also investigated (Chapter 9). Children with lower scores on measures of literacy and attention were found to have a slower or failed correction response to phase errors in timing behaviour. Results from this series of studies suggest that the motor timing difficulty in poor-reading children may stem from failures in the judgement of synchrony due to greater tolerance of uncertainty in the temporal processing system.
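For reference, the Wing-Kristofferson model referred to above, in its standard formulation, treats each inter-response interval as a central timekeeper interval plus the difference of two peripheral motor delays, which is what allows the two variance sources to be estimated separately from an interval series:

```latex
I_n = C_n + M_{n+1} - M_n, \qquad
\operatorname{Var}(I_n) = \sigma_C^2 + 2\sigma_M^2, \qquad
\operatorname{Cov}(I_n, I_{n+1}) = -\sigma_M^2
```

The negative lag-one autocovariance is the signature of the motor (implementation) component, so the implementation variance can be read off the autocovariance and the timekeeper variance from the remainder.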

Relevance: 90.00%

Abstract:

In the specific area of software engineering (SE) for self-adaptive systems (SASs) there is growing research awareness of the synergy between SE and artificial intelligence (AI). However, only a few significant results have been published so far. In this paper, we propose a novel, formal Bayesian definition of surprise as the basis for quantitative analysis to measure degrees of uncertainty and deviations of self-adaptive systems from normal behavior. Surprise measures how observed data affect the models or assumptions of the world during runtime. The key idea is that a "surprising" event can be defined as one that causes a large divergence between the belief distributions prior to and posterior to the event occurring. In such a case the system may decide either to adapt accordingly or to flag that an abnormal situation is happening. In this paper, we discuss possible applications of the Bayesian theory of surprise for the case of self-adaptive systems using Bayesian dynamic decision networks. Copyright © 2014 ACM.
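A minimal sketch of this definition for a Bernoulli sensor model with a Beta belief follows, using the KL divergence between posterior and prior as the surprise score; the prior parameters and observations are invented for illustration.

```python
# Bayesian surprise as KL(posterior || prior) for a Beta belief over a
# Bernoulli success rate, computed by numeric integration on a grid.
# Prior parameters and observations are invented for illustration.
from math import lgamma, exp, log

def surprise(a: float, b: float, obs: int, n: int = 20000) -> float:
    """KL divergence of the posterior from the Beta(a, b) prior after one
    Bernoulli observation obs in {0, 1}."""
    a1, b1 = a + obs, b + (1 - obs)
    ln_b0 = lgamma(a) + lgamma(b) - lgamma(a + b)        # log Beta(a, b)
    ln_b1 = lgamma(a1) + lgamma(b1) - lgamma(a1 + b1)    # log Beta(a1, b1)
    total = 0.0
    for i in range(1, n):
        x = i / n
        ln_p = (a1 - 1) * log(x) + (b1 - 1) * log(1 - x) - ln_b1  # posterior
        ln_q = (a - 1) * log(x) + (b - 1) * log(1 - x) - ln_b0    # prior
        total += exp(ln_p) * (ln_p - ln_q) / n            # Riemann sum of KL
    return total

# After nine successes in a row, another success is unsurprising, but a
# failure yields a large surprise that could trigger adaptation:
print(surprise(9, 1, obs=1))   # small (roughly 0.005)
print(surprise(9, 1, obs=0))   # large (roughly 0.37)
```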

Relevance: 90.00%

Abstract:

Thermal effects in uncontrolled factory environments are often the largest source of uncertainty in large volume dimensional metrology. As the standard temperature for metrology of 20°C cannot be achieved practically or economically in many manufacturing facilities, the characterisation and modelling of temperature offers a solution for improving the uncertainty of dimensional measurement and quantifying thermal variability in large assemblies. Existing technologies for temperature measurement in the range 0-50°C are presented, alongside a discussion of their usefulness for monitoring temperatures in a manufacturing context. Particular aspects of production where the technology could play a role are highlighted, as well as practical considerations for deployment. Contact sensors such as platinum resistance thermometers come closest to the desired accuracy, calculated to be ∼0.02°C under the most challenging measurement conditions. Non-contact solutions would be most practical in the light controlled factory (LCF), and semi-invasive sensors appear least useful, but all of the technologies can play some role during the initial development of thermal variability models.
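The scale of the problem follows from the linear expansion relation ΔL = αLΔT; the part length, material and temperature values in the sketch below are invented for illustration.

```python
# Thermal expansion error when measuring away from the 20 °C reference
# temperature; part size, material and temperatures are invented.
alpha_steel = 11.5e-6   # 1/°C, typical linear expansion coefficient of steel
length_m = 5.0          # a large-volume dimension, in metres
dT = 3.0                # factory at 23 °C instead of the 20 °C standard

print(f"{alpha_steel * length_m * dT * 1e6:.1f} um apparent size change")
# -> 172.5 um, far larger than typical large-volume measurement budgets

# Compensating with a temperature reading good to ±0.02 °C leaves only:
print(f"±{alpha_steel * length_m * 0.02 * 1e6:.2f} um residual uncertainty")
# -> ±1.15 um, which is why ~0.02 °C is the desired sensor accuracy
```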

Relevance: 90.00%

Abstract:

It has never been easy for manufacturing companies to understand, with confidence, how accurately and how flexibly parts can be made. This brings uncertainty in finding the most suitable manufacturing method as well as in controlling product and process verification systems. The aim of this research is to develop a system for capturing the company's knowledge and expertise and reflecting it in an MRP (Manufacturing Resource Planning) system. A key activity here is measuring manufacturing and machining capabilities to a reasonable confidence level. For this purpose an in-line control measurement system is introduced to the company. Using SPC (Statistical Process Control) not only helps to predict trends in the manufacturing of parts but also minimises human error in measurement. A Gauge R&R (Repeatability and Reproducibility) study identifies problems in measurement systems. Measurement is like any other process in terms of variability, and reducing this variation via an automated machine probing system helps to avoid defects in future products.

Developments in the aerospace, nuclear, and oil and gas industries demand materials with high performance and high temperature resistance under corrosive and oxidising environments. Superalloys were developed in the latter half of the 20th century as high-strength materials for such purposes. For the same characteristics, superalloys are considered difficult-to-cut alloys when it comes to forming and machining. Furthermore, owing to the sensitivity of superalloy applications, in many cases they must be manufactured to tight tolerances. In addition, superalloys, specifically nickel-based ones, have unique features such as low thermal conductivity due to the high amount of nickel in their composition. This causes a high surface temperature on the workpiece at the machining stage, which leads to deformation in the final product.

As in every process, material variations have a significant impact on machining quality. The main causes of variation are chemical composition and mechanical hardness, and the non-uniform distribution of metal elements is a major source of variation in metallurgical structures. Different heat treatment standards are designed for processing the material to the desired hardness levels based on application. In order to take corrective actions, a study of the material aspects of superalloys has been conducted, in which samples from different batches of material were analysed. This involved preparing material for microscopy analysis and examining the effect of chemical composition on hardness (before and after heat treatment). Some of the results are discussed and presented in this paper.
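As a minimal illustration of the SPC element described above (the measurement values are invented for this example), an individuals control chart checks in-line probe readings against limits established from an in-control reference run.

```python
# Minimal individuals control chart: flag probe readings outside the
# mean ± 3*sigma limits of an in-control reference run. All values are
# invented for illustration.
import statistics

reference = [10.02, 10.01, 9.99, 10.00, 10.03, 9.98, 10.01, 10.02]  # mm
mean = statistics.mean(reference)
sigma = statistics.stdev(reference)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

for x in [10.00, 10.02, 10.09]:           # new in-line probe measurements
    status = "OK" if lcl <= x <= ucl else "OUT OF CONTROL"
    print(f"{x:.2f} mm -> {status}")       # 10.09 mm exceeds the upper limit
```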