899 results for scientific uncertainty


Relevance: 20.00%

Publisher:

Abstract:

The uncontrolled disposal of solid wastes poses an immediate threat to public health and a long-term threat to the environmental well-being of future generations. Solid waste is waste resulting from human activities that is solid and unwanted (Peavy et al., 1985). If unmanaged, dumped solid wastes generate liquid and gaseous emissions that are detrimental to the environment. One serious consequence is metal contamination, which poses a risk to human health and ecosystems. For example, some heavy metals (cadmium, chromium compounds, and nickel tetracarbonyl) are known to be highly toxic and aggressive at elevated concentrations. Iron, copper, and manganese can cause staining, and aluminium causes deposition and discoloration. In addition, calcium and magnesium cause hardness in water, leading to scale deposition and scum formation. Arsenic, a metalloid rather than a metal, is poisonous at relatively high concentrations and can cause skin cancer even at low concentrations. Normally, metal contaminants are found in dissolved form in the liquid percolating through landfills. Because average metal concentrations from full-scale landfills, test cells, and laboratory studies have generally been low, metal contamination originating from landfills is not usually considered a major concern (Kjeldsen et al., 2002; Christensen et al., 1999). However, several factors make it necessary to take a closer look at metal contaminants from landfills. The first is variability: landfill leachate quality varies with weather and operating conditions, so metal contaminant concentrations may be quite low at one moment and quite high at another; these conditions also govern the amount of leachate generated. The second is biodiversity: it cannot be assumed that a particular metal contaminant is harmless to flora and fauna (including microorganisms) just because it is harmless to human health, and this has significant implications for ecosystems and the environment. Finally, there is the moral factor: because uncertainty surrounds the potential effects of metal contamination, it is appropriate to take precautions to prevent it from taking place. Consequently, good, empirically supported scientific knowledge is needed to understand the extent of the problem and to improve the way waste is disposed of.

Relevance: 20.00%

Publisher:

Abstract:

This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving safety at at-grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to “stated preference” methods in travel survey research, the methodology applies random selection and the law of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (the data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain ‘best’ estimates of AMFs (AMF posterior credible intervals). The countermeasures are then compared and recommended on the basis of the largest safety returns with minimum risk (uncertainty). To the author's knowledge, the complete methodology is new and has not previously been applied or reported in the literature. The results demonstrate that the methodology can discern anticipated safety benefit differences across candidate countermeasures. For the 18 countermeasures considered in this analysis, the top three for reducing crashes were found to be in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
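The central Bayesian step, combining an AMF prior with an expert-opinion likelihood to obtain a posterior credible interval, can be sketched with a conjugate normal-normal update. The numbers below are illustrative assumptions, not values from the paper:

```python
# Illustrative (hypothetical) values for one countermeasure; an AMF below 1
# means the countermeasure is expected to reduce crashes.
prior_mean, prior_sd = 0.90, 0.20    # prior knowledge of the AMF
expert_mean, expert_sd = 0.70, 0.10  # AMF density derived from expert opinions

# Conjugate normal-normal update: precisions (inverse variances) add, and the
# posterior mean is the precision-weighted average of prior and likelihood.
prior_prec = 1.0 / prior_sd ** 2
expert_prec = 1.0 / expert_sd ** 2
post_prec = prior_prec + expert_prec
post_mean = (prior_prec * prior_mean + expert_prec * expert_mean) / post_prec
post_sd = post_prec ** -0.5

# 95% posterior credible interval for the AMF
lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
print(f"posterior AMF: {post_mean:.3f} (95% CrI {lo:.3f} to {hi:.3f})")
```

Countermeasures would then be ranked by posterior mean and interval width, favouring large expected crash reductions with little remaining uncertainty.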

Relevance: 20.00%

Publisher:

Abstract:

To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. 
We anticipate that our study will serve as a starting point for animal experiments testing navigation in perceptually ambiguous environments.
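RatSLAM itself is not reproduced here; a generic particle filter in a one-dimensional corridor with two identical doorways (all values hypothetical) loosely illustrates the same principle, that maintaining multiple pose hypotheses lets an ambiguous observation be resolved once later evidence arrives:

```python
import math
import random

random.seed(0)

# A 1-D corridor with two visually identical doorways: seeing "a door" is
# an ambiguous observation, consistent with two different poses.
DOORS = [2.0, 7.0]

def door_likelihood(x, sigma=0.5):
    return sum(math.exp(-((x - d) / sigma) ** 2) for d in DOORS)

# Particles are the multiple pose hypotheses maintained in parallel.
particles = [random.uniform(0.0, 10.0) for _ in range(2000)]

def resample(particles, weights):
    total = sum(weights)
    return random.choices(particles, [w / total for w in weights],
                          k=len(particles))

# Observation 1: "I see a door". Both doorways remain plausible, so the
# particle cloud becomes bimodal: two location hypotheses coexist.
particles = resample(particles, [door_likelihood(x) for x in particles])

# Odometry: move 5 m to the right (with motion noise), then see a door again.
particles = [x + 5.0 + random.gauss(0.0, 0.2) for x in particles]
particles = resample(particles, [door_likelihood(x) for x in particles])

# Only the hypothesis that started at the first doorway survives (2 + 5 = 7);
# the other would now be at 12, past both doorways.
mean = sum(particles) / len(particles)
print(f"resolved position ≈ {mean:.2f} m")  # close to 7.0
```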

Relevance: 20.00%

Publisher:

Abstract:

Computational Fluid Dynamics (CFD) has become an important tool in optimisation and has been applied successfully in many real-world applications. Most important among these is the optimisation of aerodynamic surfaces, which has become Multi-Objective (MO) and Multidisciplinary (MDO) in nature. Most such optimisations have been carried out for a given set of input parameters, such as free-stream Mach number and angle of attack. One cannot ignore the fact that in aerospace engineering one frequently deals with situations where the design input parameters and flight/flow conditions carry some amount of uncertainty. When the optimisation is carried out for fixed values of the design variables and parameters, however, one arrives at an optimised solution that performs well at the design condition but exhibits poor drag or lift-to-drag ratio at slightly off-design conditions. The challenge remains to develop a robust design that accounts for uncertainty in aerospace applications. In this paper this issue is taken up, and an attempt is made to limit the fluctuation of objective performance by using robust design techniques that account for uncertainty.
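The idea can be sketched with a toy, entirely hypothetical drag model standing in for a CFD evaluation: a deterministic optimisation fits the design point exactly, while a robust optimisation penalises the mean and spread of the objective over sampled flow conditions:

```python
import math
import random

random.seed(1)

# Toy drag model (hypothetical; stands in for a CFD evaluation): x is a
# design variable, m is the uncertain free-stream Mach number. The cubic
# term mimics drag divergence when the Mach number exceeds the value the
# shape was designed for.
def drag(x, m):
    return 0.02 + (x - m) ** 2 + 30.0 * max(0.0, m - x) ** 3

M_DESIGN = 0.80
M_SAMPLES = [random.gauss(M_DESIGN, 0.05) for _ in range(500)]  # uncertain Mach

def robust_cost(x, k=1.0):
    # Robust design: penalise both the mean drag and its spread (std dev)
    # over the sampled flow conditions, not just the design-point value.
    vals = [drag(x, m) for m in M_SAMPLES]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean + k * math.sqrt(var)

grid = [0.70 + 0.001 * i for i in range(201)]        # candidate designs
x_det = min(grid, key=lambda x: drag(x, M_DESIGN))   # optimised at fixed Mach
x_rob = min(grid, key=robust_cost)                   # optimised under uncertainty

print(f"deterministic optimum x = {x_det:.3f}, robust optimum x = {x_rob:.3f}")
```

The robust optimum sits above the deterministic one: it gives up a little design-point performance to avoid the steep off-design penalty, which is exactly the fluctuation in objective performance the paper seeks to suppress.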

Relevance: 20.00%

Publisher:

Abstract:

From a ‘cultural science’ perspective, this paper traces one aspect of a more general shift, from the realist representational regime of modernity to the productive DIY systems of the internet era. It argues that collecting and archiving are transformed by this change. Modern museums – and also broadcast television – were based on determinist or ‘essence’ theory, while internet archives like YouTube (and the internet as an archive) are based on ‘probability’ theory. The paper works through the differences between modernist ‘essence’ and postmodern ‘probability’, starting from the obvious difference that in a museum each object is selected by experts for its intrinsic properties, while on the internet you don’t know what you will find. The status of individual objects is uncertain, although the productivity of the overall archive is unlimited. The paper links these differences with changes in contemporary culture – from a Newtonian to a quantum universe, progress to risk, institutional structure to evolutionary change, objectivity to uncertainty, identity to performance. Borrowing some of its methodology from science fiction, the paper uses examples from museums and online archives, ranging from the oldest stone tool in the world to the latest tribute vid on the net.

Relevance: 20.00%

Publisher:

Abstract:

In recent years several scientific Workflow Management Systems (WfMSs) have been developed with the aim of automating large-scale scientific experiments. Many offerings now exist, but none has been promoted as an accepted standard. In this paper we propose a pattern-based evaluation of three of the most widely used scientific WfMSs: Kepler, Taverna and Triana. The aim is to compare them with traditional business WfMSs, emphasizing the strengths and deficiencies of both kinds of system. Moreover, a set of new patterns is defined from the analysis of the three systems considered.

Relevance: 20.00%

Publisher:

Abstract:

The 27-item Intolerance of Uncertainty Scale (IUS) has become one of the most frequently used measures of intolerance of uncertainty. More recently, an abridged, 12-item version of the IUS has been developed. The current research used clinical (n = 50) and non-clinical (n = 56) samples to examine and compare the psychometric properties of both versions of the IUS. The two scales showed good internal consistency at both the total and subscale level and had satisfactory test-retest reliability. Both versions were correlated with worry and trait anxiety and had satisfactory concurrent validity. Significant differences between the scores of the clinical and non-clinical samples supported discriminant validity. Predictive validity was also supported for the two scales: total scores, in the case of the clinical sample, and a subscale, in the case of the non-clinical sample, significantly predicted pathological worry and trait anxiety. Overall, clinicians and researchers can use either version of the IUS with confidence, given their sound psychometric properties.
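Internal consistency of the kind reported here is conventionally measured with Cronbach's alpha; a minimal computation on made-up item scores (not the IUS data) looks like this:

```python
# Cronbach's alpha on made-up Likert item scores (not the IUS data):
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
scores = [  # rows = respondents, columns = items
    [3, 4, 3, 5],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

k = len(scores[0])
item_vars = [variance([row[i] for row in scores]) for i in range(k)]
total_var = variance([sum(row) for row in scores])
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Values around 0.7 or above are usually read as acceptable internal consistency, with higher thresholds for clinical use.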

Relevance: 20.00%

Publisher:

Abstract:

The paper provides an assessment of the performance of commercial Real Time Kinematic (RTK) systems over longer than recommended inter-station distances. The experiments were set up to test and analyse solutions from the i-MAX, MAX and VRS systems operated over three triangle-shaped network cells with average inter-station distances of 69 km, 118 km and 166 km respectively. The performance characteristics appraised included initialisation success rate, initialisation time, RTK position accuracy and availability, ambiguity resolution risk and RTK integrity risk, in order to provide a wider perspective on the performance of the tested systems.

The results showed that the performance of all the network RTK solutions assessed was affected to a similar degree by the increase in inter-station distance. The MAX solution achieved the highest initialisation success rate, 96.6% on average, albeit with a longer initialisation time. The two VRS approaches achieved a lower initialisation success rate of 80% over the large triangle. In terms of RTK positioning accuracy after successful initialisation, the results indicated good agreement between the actual error growth, in both the horizontal and vertical components, and the accuracy specified by the manufacturers as RMS and parts-per-million (ppm) values.

Additionally, the VRS approaches performed better than MAX and i-MAX when tested on the standard triangle network with a mean inter-station distance of 69 km. However, as the inter-station distance increases, the network RTK software may fail to generate the VRS correction and instead fall back to the nearest single-base RTK (or RAW) mode. The position uncertainty occasionally exceeded 2 metres, showing that the RTK rover software was using an incorrectly fixed ambiguity solution to estimate the rover position rather than automatically dropping back to an ambiguity-float solution.

Results identified that the risk of incorrectly resolving ambiguities reached 18%, 20%, 13% and 25% for i-MAX, MAX, Leica VRS and Trimble VRS respectively when operating over the large triangle network. Additionally, the Coordinate Quality indicator values given by the Leica GX1230 GG rover receiver tended to be over-optimistic and did not function well in identifying incorrectly fixed integer ambiguity solutions. In summary, this independent assessment has identified problems and failures that can occur in all of the systems tested, especially when they are pushed beyond the recommended limits. While such failures are expected, they offer useful insights into where users should be wary and how manufacturers might improve their products. The results also demonstrate that integrity monitoring of RTK solutions is necessary for precision applications and deserves serious attention from researchers and system providers.
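The manufacturer accuracy specifications mentioned above combine a fixed RMS term with a distance-dependent ppm term. With typical catalogue-style values, assumed here for illustration rather than taken from the paper, the expected error growth over the three tested baselines can be computed directly:

```python
# Manufacturer-style RTK horizontal accuracy: a fixed RMS term plus a
# distance-dependent parts-per-million (ppm) term. The figures below are
# typical catalogue-style values assumed for illustration, not the
# specifications of the systems assessed in the paper.
RMS_MM = 8.0  # fixed horizontal error, millimetres
PPM = 1.0     # distance-dependent error, millimetres per kilometre of baseline

def expected_error_mm(baseline_km):
    return RMS_MM + PPM * baseline_km

for d in (69, 118, 166):  # the three mean inter-station distances tested
    print(f"{d:>3} km baseline -> expected horizontal error = "
          f"{expected_error_mm(d):.0f} mm")
```

Against a linear specification like this, a position error of over 2 metres is orders of magnitude beyond expectation, which is why it signals an incorrectly fixed ambiguity rather than normal error growth.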

Relevance: 20.00%

Publisher:

Abstract:

Modern statistical models and computational methods can now incorporate uncertainty in the parameters used in Quantitative Microbial Risk Assessments (QMRA). Many QMRAs use Monte Carlo methods but work from fixed estimates of means, variances and other parameters. We illustrate the ease of estimating all parameters contemporaneously with the risk assessment, incorporating all the parameter uncertainty arising from the experiments from which these parameters are estimated. A Bayesian approach is adopted, using Markov chain Monte Carlo (MCMC) Gibbs sampling via the freely available software WinBUGS. The method and its ease of implementation are illustrated by a case study that incorporates three disparate datasets into an MCMC framework. The probabilities of infection obtained when the uncertainty associated with parameter estimation is incorporated into a QMRA are shown to be considerably more variable over various dose ranges than the analogous probabilities obtained when constants from the literature are simply ‘plugged’ in, as is done in most QMRAs. Neglecting these sources of uncertainty may lead to erroneous decisions in public health and risk management.
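The contrast between a plug-in QMRA and one that propagates parameter uncertainty can be sketched with an exponential dose-response model; the parameter values and the lognormal spread standing in for a posterior are illustrative assumptions, not the paper's case study:

```python
import math
import random

random.seed(42)

# Exponential dose-response model: P(infection) = 1 - exp(-r * dose).
# R_PLUGIN plays the role of a fixed literature value; the lognormal spread
# around it stands in for a posterior distribution of r. All numbers are
# illustrative.
R_PLUGIN = 0.005
DOSE = 100.0

def p_infection(r, dose):
    return 1.0 - math.exp(-r * dose)

# Plug-in QMRA: a single point estimate, with no parameter uncertainty.
p_fixed = p_infection(R_PLUGIN, DOSE)

# QMRA with parameter uncertainty: sample r, obtaining a risk distribution.
r_samples = [R_PLUGIN * math.exp(random.gauss(0.0, 0.5)) for _ in range(10000)]
p_samples = sorted(p_infection(r, DOSE) for r in r_samples)
lo, hi = p_samples[250], p_samples[9750]  # central 95% of the risk

print(f"plug-in risk: {p_fixed:.3f}")
print(f"risk acknowledging parameter uncertainty: {lo:.3f} to {hi:.3f}")
```

The single plug-in number hides how wide the risk interval becomes once the uncertainty in the dose-response parameter is carried through, which is the paper's central point.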

Relevance: 20.00%

Publisher:

Abstract:

Scientists need to transfer semantically similar queries across multiple heterogeneous linked datasets. These queries may require data from different locations and the results are not simple to combine due to differences between datasets. A query model was developed to make it simple to distribute queries across different datasets using RDF as the result format. The query model, based on the concept of publicly recognised namespaces for parts of each scientific dataset, was implemented with a configuration that includes a large number of current biological and chemical datasets. The configuration is flexible, providing the ability to transparently use both private and public datasets in any query. A prototype implementation of the model was used to resolve queries for the Bio2RDF website, including both Bio2RDF datasets and other datasets that do not follow the Bio2RDF URI conventions.
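A minimal sketch of the namespace idea, with hypothetical endpoint URLs and prefixes rather than the actual Bio2RDF configuration, maps each dataset namespace to the endpoints (public or private) that can resolve identifiers in it:

```python
# Hypothetical configuration: these endpoint URLs and the choice of prefixes
# are illustrative, not the actual Bio2RDF setup. Each dataset namespace maps
# to the endpoints (public or private) able to resolve identifiers in it.
ENDPOINTS = {
    "geneid": ["http://public.example.org/sparql"],
    "chebi":  ["http://public.example.org/sparql",
               "http://private.example.org/sparql"],  # a private mirror
}

def route(namespace, identifier):
    """Build one RDF CONSTRUCT query per endpoint that knows the namespace."""
    uri = f"http://bio2rdf.org/{namespace}:{identifier}"
    query = f"CONSTRUCT {{ <{uri}> ?p ?o }} WHERE {{ <{uri}> ?p ?o }}"
    return [(endpoint, query) for endpoint in ENDPOINTS.get(namespace, [])]

for endpoint, query in route("chebi", "15377"):
    print(endpoint, "<-", query)
```

Because every endpoint returns RDF, the partial results can simply be merged into one graph, which is what makes the distributed queries easy to combine.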

Relevance: 20.00%

Publisher:

Abstract:

Generally speaking, psychologists have suggested three traditional views of how people cope with uncertainty: the certainty maximiser, the intuitive statistician-economist and the knowledge seeker (Smithson, 2008). In times of uncertainty, such as the recent global financial crisis, these coping methods often result in innovation in industry. Richards (2003) identifies innovation as different from creativity in that innovation aims to transform and implement rather than simply explore and invent. An examination of the work of iconic fashion designers, through case study and situational analysis, reveals that coping with uncertainty manifests itself in ways that have resulted in innovations in design, marketing methods, production and consumption. In relation to contemporary fashion, where many garments look the same in style, colour, cut and fit (Finn, 2008), the concept of innovation is an important one. This paper explores the role of uncertainty as a driver of innovation in fashion design. A key aspect of seeking knowledge, as a mechanism for coping with this uncertainty, is a return to basics. This is a problem for contemporary fashion designers, who are no longer necessarily makers and therefore do not engage with the basic materials and methods of garment construction. In many cases design in fashion has become digital, communicated to an unseen, unknown production team via scanned image and specification alone. The disconnection between the design and the making of garments, a result of decades of off-shore manufacturing, has limited the opportunity for this return to basics. The authors argue that the role of the fashion designer has come to centre on the final product, and as a result there is a lack of innovation in the process of making: in the form, fit and function of fashion garments. They propose that ‘knowledge seeking’ in response to uncertainty in the fashion industry, in particular through re-examination of the methods of making, could hold the key to a new era of innovation in fashion design.

Relevance: 20.00%

Publisher:

Abstract:

Starting from a local problem with finding an archival clip on YouTube, this paper expands to consider the nature of archives in general. It considers the technological, communicative and philosophical characteristics of archives over three historical periods: 1) Modern ‘essence archives’ – museums and galleries organised around the concepts of objectivity and realism; 2) Postmodern mediation archives – broadcast TV systems, which I argue were also ‘essence archives,’ albeit a transitional form; and 3) Network or ‘probability archives’ – YouTube and the internet, which are organised around the concept of probability. The paper goes on to argue the case for introducing quantum uncertainty and other aspects of probability theory into the humanities, in order to understand the way knowledge is collected, conserved, curated and communicated in the era of the internet. It is illustrated throughout by reference to the original technological ‘affordance’ – the Olduvai stone chopping tool.

Relevance: 20.00%

Publisher:

Abstract:

In the context of learning paradigms of identification in the limit, we address the question: why is uncertainty sometimes desirable? We use mind change bounds on the output hypotheses as a measure of uncertainty, and interpret ‘desirable’ as a reduction in data memorization, also defined in terms of mind change bounds. The resulting model is closely related to iterative learning with bounded mind change complexity, but the dual use of mind change bounds — for hypotheses and for data — is a key distinctive feature of our approach. We show that situations exist in which the more mind changes the learner is willing to accept, the less data it needs to remember in order to converge to the correct hypothesis. We also investigate relationships between our model and learning from good examples, set-driven, monotonic and strong-monotonic learners, and class-comprising versus class-preserving learnability.
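A toy example, far simpler than the paper's formal model, hints at the relationship between mind changes and memorization: a learner identifying concepts of the form {0, ..., n} from positive examples need only memorise the largest example seen, and its mind changes are bounded by the number of times that maximum grows:

```python
# Toy illustration (not the paper's formal model): learning the class
# { {0, ..., n} : n >= 0 } from a stream of positive examples. The learner
# memorises a single number, the largest example seen so far, rather than
# the whole stream, and changes its mind only when that maximum grows.
def learn(stream):
    best = None          # the only datum kept in memory
    mind_changes = 0
    for x in stream:
        if best is None or x > best:
            if best is not None:
                mind_changes += 1   # hypothesis revised: a mind change
            best = x                # new hypothesis: the concept {0, ..., best}
    return best, mind_changes

hypothesis, changes = learn([3, 1, 4, 1, 5, 9, 2, 6])
print(f"hypothesis: {{0, ..., {hypothesis}}}, mind changes: {changes}")
```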