Abstract:
Purpose This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented. Methods A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position had on OPFs, and setting the acceptable uncertainty on the OPF at 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into additional factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom and source occlusion. The dominant effect was established and formed the basis of a theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for various square field sizes of side length from 4 mm to 100 mm, using a nominal photon energy of 6 MV. Results According to the practical definition established in this project, field sizes < 15 mm were considered very small for 6 MV beams for maximal field size uncertainties of 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or if field size uncertainties were reduced to 0.5 mm, field sizes < 12 mm were considered very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes. Thus the theoretical definition of very small field size coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effect. This was found to occur at field sizes < 12 mm. Source occlusion also caused a large change in OPF for field sizes < 8 mm. Based on the results of this study, field sizes < 12 mm were considered theoretically very small for 6 MV beams. Conclusions Extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as the output factor measurement for each field size setting and very precise detector alignment, is required at field sizes at least < 12 mm and, more conservatively, < 15 mm for 6 MV beams. These recommendations should be applied in addition to all the usual considerations for small field dosimetry, including careful detector selection.
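One illustrative way to write such a factor separation, using assumed symbols rather than the study's own notation, is

$\mathrm{OPF}(s) \approx F_{\mathrm{LED}}(s)\, F_{\mathrm{pscat}}(s)\, F_{\mathrm{occ}}(s)$,

where $s$ is the side length of the square field, $F_{\mathrm{LED}}$ captures lateral electronic disequilibrium, $F_{\mathrm{pscat}}$ photon scatter in the phantom and $F_{\mathrm{occ}}$ source occlusion; the field size below which $F_{\mathrm{LED}}$ changes faster than the other factors then marks the theoretical threshold of a very small field.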
Abstract:
The candidate gene for haemochromatosis (HH), HLA-H, recently identified by Feder et al., generated considerable scientific interest coupled with a degree of uncertainty about the likely involvement of this gene in this common iron metabolism disorder. Feder et al. found a single point mutation resulting in an amino acid substitution (C282Y) that was homozygous in 148 (83%) of their patients, heterozygous in 9 patients (5%), but completely absent in 21 patients (12%). They proposed that the lack of a causative mutation in HLA-H in 12% of their patients was because these cases were not linked to chromosome 6p. A significant weakness in this argument is that all familial studies of the disorder so far have concluded that HH is due to a single major HLA-linked gene [5-7]. The ultimate test for a candidate gene is the clear segregation of a mutation with the disorder in all patients. Thus, some of the uncertainty surrounding the role of HLA-H in HH may be resolved by the identification of complete concordance of the C282Y mutation (or some other mutation) in HLA-H with disease status in HH families. One potential problem in the design of such an experimental analysis is that a number of studies have shown the presence of a predominant ancestral haplotype in all HH populations examined: Australian, French, Italian, UK and US. Thus, in the analysis of a putative causative mutation, it is important to include families with...
Abstract:
This paper presents a new algorithm based on a Modified Particle Swarm Optimization (MPSO) to estimate the harmonic state variables in distribution networks. The proposed algorithm estimates both the amplitude and the phase of each injected harmonic current by minimizing the error between the values measured by Phasor Measurement Units (PMUs) and the values computed from the estimated parameters during the estimation process. The proposed algorithm can take into account the uncertainty of the harmonic pseudo-measurements and the tolerance in the line impedances of the network, as well as the uncertainty of Distributed Generators (DGs) such as Wind Turbines (WTs). The main features of the proposed MPSO algorithm are the use of primary and secondary PSO loops and the application of a mutation function. Simulation results on the IEEE 34-bus radial and a realistic 70-bus radial test network are presented. The results demonstrate that the proposed Distribution Harmonic State Estimation (DHSE) algorithm is markedly faster and more accurate than algorithms such as Weighted Least Squares (WLS), the Genetic Algorithm (GA), original PSO, and Honey Bees Mating Optimization (HBMO).
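As a minimal sketch of the estimation idea in Python, assuming an illustrative linear measurement model (a stand-in for the network's harmonic model) and a basic PSO loop with a mutation step; the actual MPSO structure with its primary and secondary loops, PMU placement and network data is not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative measurement model: PMU readings z = H @ x_true + noise,
# where x stacks the (real, imaginary) parts of the injected harmonic currents.
n_state, n_meas = 6, 10
H = rng.normal(size=(n_meas, n_state))           # stand-in for the network's harmonic model
x_true = rng.normal(size=n_state)
z = H @ x_true + 0.01 * rng.normal(size=n_meas)  # PMU measurements with small noise

def cost(x):
    """Squared error between measured and computed values."""
    return np.sum((z - H @ x) ** 2)

# Basic PSO with a simple mutation step (a stand-in for the MPSO mutation function).
n_particles, n_iters = 30, 200
pos = rng.normal(size=(n_particles, n_state))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

w, c1, c2, mutation_rate = 0.7, 1.5, 1.5, 0.1
for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    # Mutation: randomly perturb a fraction of particles to keep diversity.
    mutate = rng.random(n_particles) < mutation_rate
    pos[mutate] += rng.normal(scale=0.5, size=(mutate.sum(), n_state))
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("estimation error:", np.linalg.norm(gbest - x_true))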
Abstract:
This paper presents a new algorithm based on a hybrid of Particle Swarm Optimization (PSO) and Simulated Annealing (SA), called PSO-SA, to estimate harmonic state variables in distribution networks. The proposed algorithm estimates both the amplitude and the phase of each harmonic current injection by minimizing the error between the values measured by Phasor Measurement Units (PMUs) and the values computed from the estimated parameters during the estimation process. The proposed algorithm can take into account the uncertainty of the harmonic pseudo-measurements and the tolerance in the line impedances of the network, as well as the uncertainty of Distributed Generators (DGs) such as Wind Turbines (WTs). The main feature of the proposed PSO-SA algorithm is that PSO, with a mutation function enabled, quickly reaches the neighbourhood of the global optimum, which the SA search then locates precisely. Simulation results on the IEEE 34-bus radial and a realistic 70-bus radial test network demonstrate that the proposed Distribution Harmonic State Estimation (DHSE) algorithm is considerably faster and more accurate than conventional algorithms such as Weighted Least Squares (WLS), the Genetic Algorithm (GA), original PSO and the Honey Bees Mating Optimization (HBMO) algorithm.
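To illustrate the hand-off from PSO to SA described above, a minimal Python sketch of a simulated annealing refinement around a PSO-supplied estimate (reusing the illustrative cost function and swarm-best result gbest from the previous sketch); the cooling schedule and step sizes are arbitrary placeholders, not the paper's settings.

import numpy as np

def sa_refine(cost, x0, rng, T0=1.0, cooling=0.95, n_iters=500, step=0.1):
    """Simulated annealing refinement of a coarse PSO solution x0."""
    x, fx = x0.copy(), cost(x0)
    best, fbest = x.copy(), fx
    T = T0
    for _ in range(n_iters):
        cand = x + rng.normal(scale=step, size=x.shape)   # random neighbour
        fc = cost(cand)
        # Accept improvements always; accept uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < np.exp(-(fc - fx) / max(T, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x.copy(), fx
        T *= cooling                                      # geometric cooling schedule
    return best, fbest

# Usage (with the swarm's best estimate from the PSO sketch above):
# x_refined, f_refined = sa_refine(cost, gbest, np.random.default_rng(1))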
Abstract:
This paper presents the Mossman Mill District Practices Framework. It was developed in the Wet Tropics region within the Great Barrier Reef in north-eastern Australia to describe the environmental benefits of agricultural management practices for the sugar cane industry. The framework translates complex, unclear and overlapping environmental plans, policy and legal arrangements into a simple framework of management practices that landholders can use to improve their management actions. Practices range from those that are old or outdated through to aspirational practices that have the potential to achieve desired resource condition targets. The framework has been applied by stakeholders at multiple scales to better coordinate and integrate a range of policy arrangements to improve natural resource management. It has been used to structure monitoring and evaluation in order to underpin a more adaptive approach to planning at mill district and property scale. Potentially, the framework and approach can be applied across fields of planning where adaptive management is needed. It has the potential to overcome many of the criticisms of property-scale and regional Natural Resource Management.
Abstract:
In the six decades since the discovery of the double helix structure of DNA by Watson and Crick in 1953, developments in genetic science have transformed our understanding of human health and disease. These developments, along with those in other areas such as computer science, biotechnology, and nanotechnology, have opened exciting new possibilities for the future. In addition, the increasing trend for technologies to converge and build upon each other potentially increases the pace of change, constantly expanding the boundaries of the scientific frontier. At the same time, however, scientific advances are often accompanied by public unease over the potential for unforeseen, negative outcomes. For governments, these issues present significant challenges for effective regulation. This Article analyzes the challenges associated with crafting laws for rapidly changing science and technology. It considers whether we need to regulate, how best to regulate for converging technologies, and how best to ensure the continued relevance of laws in the face of change.
Abstract:
The practice of travel journalism is still largely neglected as a field of inquiry for communication and journalism scholars, despite the fact that news media are increasingly focussing on softer news. Lifestyle sections of newspapers, for example, have been growing in size over the past few decades, and given corresponding cutbacks in international news reporting, travel journalism in particular is now playing a growing role in the representation of ‘the Other’. While this need for research into the field has been identified before, very little actual investigation of travel journalism has been forthcoming. This paper assesses the current state of research by reviewing the studies that have been conducted into the production, content and reception of travel journalism. It argues that while a small number of studies now exists, these have often been conducted in isolation and with significant limitations, and much remains to be done to sufficiently explore this sub-field of journalism. By analysing what we do know about travel journalism, the paper suggests a number of possibilities in each area for advancing this knowledge. Above all, it contends that dated prejudices against the field have to be set aside, and the practice of travel journalism needs to be taken seriously in order to do its growing importance justice.
Abstract:
Since 1995 the eruption of the andesitic Soufrière Hills Volcano (SHV), Montserrat, has been studied in substantial detail. As an important contribution to this effort, the Seismic Experiment with Airgun-source–Caribbean Andesitic Lava Island Precision Seismo-geodetic Observatory (SEA-CALIPSO) experiment was devised to image the arc crust underlying Montserrat and, if possible, the magma system at SHV using tomography and reflection seismology. Field operations were carried out in October–December 2007, with 238 seismometers deployed on land supplementing seven volcano observatory stations, and an array of 10 ocean-bottom seismometers deployed offshore. The RRS James Cook on NERC cruise JC19 towed a tuned airgun array plus a digital 48-channel streamer on encircling and radial tracks for 77 h around Montserrat during December 2007, firing 4414 airgun shots and yielding about 47 Gb of data. The main objectives of the experiment were achieved. Preliminary analyses of these data, published in 2010, generated images of heterogeneous high-velocity bodies representing the cores of volcanoes and subjacent intrusions, and shallow areas of low velocity on the flanks of the island that reflect volcaniclastic deposits and hydrothermal alteration. The resolution of this preliminary work did not extend beyond 5 km depth. An improved three-dimensional (3D) seismic velocity model was then obtained by inversion of 181 665 first-arrival travel times from a more complete sampling of the dataset, yielding clear images to 7.5 km depth of a low-velocity volume that was interpreted as the magma chamber which feeds the current eruption, with an estimated volume of 13 km³. Coupled thermal and seismic modelling revealed properties of the partly crystallized magma. Seismic reflection analyses aimed at imaging structures under southern Montserrat had limited success, but suggest subhorizontal layering, interpreted as sills, at depths of between 6 and 19 km. Seismic reflection profiles collected offshore reveal deep fans of volcaniclastic debris and fault offsets, leading to new tectonic interpretations. This chapter presents the project goals and planning concepts, describes in detail the campaigns at sea and on land, summarizes the major results, and identifies the key lessons learned.
Abstract:
Stormwater pollution is linked to stream ecosystem degradation. Various types of modelling techniques are adopted to predict stormwater pollution. The accuracy of the predictions provided by these models depends on data quality, appropriate estimation of model parameters, and the validation undertaken. Unlike water quantity data, the water quality datasets available for urban areas span only relatively short time scales, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross-validation (MCCV) procedures in a Monte Carlo framework for validation and estimation of the uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that MCCV is likely to yield a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size, which otherwise hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
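A minimal Python sketch of the two validation schemes, using a stand-in linear wash-off model and scikit-learn's LeaveOneOut and ShuffleSplit (the latter implements Monte Carlo cross-validation); the synthetic data, model form and split settings are placeholders rather than those used in the paper.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, ShuffleSplit

rng = np.random.default_rng(0)

# Small synthetic dataset standing in for a limited water quality record:
# predictors (e.g. rainfall intensity, duration) and observed pollutant wash-off.
X = rng.uniform(size=(12, 2))
y = 1.5 * X[:, 0] + 0.5 * X[:, 1] + 0.05 * rng.normal(size=12)

def cv_coefficients(splitter):
    """Fit the model on each training split and collect its coefficients."""
    coefs = []
    for train_idx, _ in splitter.split(X):
        model = LinearRegression().fit(X[train_idx], y[train_idx])
        coefs.append(model.coef_)
    return np.array(coefs)

loo_coefs = cv_coefficients(LeaveOneOut())                    # n splits, each leaving one sample out
mccv_coefs = cv_coefficients(ShuffleSplit(n_splits=200,       # Monte Carlo CV: repeated random splits
                                          test_size=0.3,
                                          random_state=0))

# The spread of coefficients across splits gives an uncertainty measure for the fitted model.
print("LOO  coefficient spread:", loo_coefs.std(axis=0))
print("MCCV coefficient spread:", mccv_coefs.std(axis=0))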
Abstract:
Despite rising levels of safe-sex knowledge in Australia, sexually transmitted infection notifications continue to increase. A culture-centred approach suggests that, in attempting to reach a target population, it is useful first to understand their perspective on the issues. Twenty focus groups were conducted with 89 young people between the ages of 14 and 16 years. Key findings suggest that scientific information does not articulate closely with everyday practice, that young people get the message that sex is bad and that they should not be preparing for it, and that it is not appropriate to talk about sex. Understanding how young people think about these issues is particularly important because the focus groups also found that young people disengage from sources of information that do not match their own experiences.
Learned stochastic mobility prediction for planning with control uncertainty on unstructured terrain
Abstract:
Motion planning for planetary rovers must consider control uncertainty in order to maintain the safety of the platform during navigation. Modelling such control uncertainty is difficult due to the complex interaction between the platform and its environment. In this paper, we propose a motion planning approach whereby the outcome of control actions is learned from experience and represented statistically using a Gaussian process regression model. This mobility prediction model is trained using sample executions of motion primitives on representative terrain, and predicts the future outcome of control actions on similar terrain. Using Gaussian process regression allows us to exploit its inherent measure of prediction uncertainty in planning. We integrate mobility prediction into a Markov decision process framework and use dynamic programming to construct a control policy for navigation to a goal region in a terrain map built using an on-board depth sensor. We consider both rigid terrain, consisting of uneven ground, small rocks, and non-traversable rocks, and also deformable terrain. We introduce two methods for training the mobility prediction model from either proprioceptive or exteroceptive observations, and report results from nearly 300 experimental trials using a planetary rover platform in a Mars-analogue environment. Our results validate the approach and demonstrate the value of planning under uncertainty for safe and reliable navigation.
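As a minimal illustration of the prediction step in Python, using scikit-learn Gaussian process regression on hypothetical (terrain feature, control action) inputs and observed execution outcomes; the features, kernel and the way the predictive uncertainty would feed the MDP costs are assumptions for illustration, not the authors' implementation.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical training data: each row is [terrain roughness, slope, commanded heading change]
# and the target is the heading error actually achieved when executing the motion primitive.
X_train = rng.uniform(size=(80, 3))
y_train = 0.3 * X_train[:, 0] + 0.2 * X_train[:, 1] * X_train[:, 2] + 0.05 * rng.normal(size=80)

# Gaussian process mobility prediction model; the kernel choice is illustrative.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5) + WhiteKernel(noise_level=1e-3),
                              normalize_y=True)
gp.fit(X_train, y_train)

# Predict the outcome of a candidate control action on similar terrain.
x_query = np.array([[0.4, 0.2, 0.6]])
mean, std = gp.predict(x_query, return_std=True)

# The predictive standard deviation is the planner's measure of control uncertainty:
# e.g. penalize uncertain actions when building the MDP transition costs.
risk_weight = 2.0                      # arbitrary trade-off parameter
action_cost = float(mean[0]) + risk_weight * float(std[0])
print("predicted outcome:", float(mean[0]), "uncertainty:", float(std[0]), "cost:", action_cost)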
Abstract:
While existing multi-biometric Dempster-Shafer theory fusion approaches have demonstrated promising performance, they do not model the uncertainty appropriately, suggesting that further improvement can be achieved. This research seeks to develop a unified framework for multimodal biometric fusion that takes advantage of the uncertainty concept of Dempster-Shafer theory, improving the performance of multi-biometric authentication systems. Modeling uncertainty as a function of the uncertainty factors affecting the recognition performance of the biometric systems helps to address the uncertainty of the data and the confidence of the fusion outcome. A weighted combination of quality measures and classifier performance (Equal Error Rate) is proposed to encode the uncertainty concept and improve the fusion. We also found that quality measures contribute unequally to recognition performance; thus, selecting only the significant factors and fusing them with a Dempster-Shafer approach to generate an overall quality score plays an important role in the success of uncertainty modeling. The proposed approach achieved competitive performance (approximately 1% EER) in comparison with other Dempster-Shafer based approaches and other conventional fusion approaches.
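A minimal Python sketch of the Dempster-Shafer combination underlying such a fusion, over a simple two-hypothesis frame {genuine, impostor}; the mapping from matcher scores and quality measures to mass assignments is an illustrative placeholder, not the proposed weighting scheme.

from itertools import product

# Frame of discernment for a verification decision; frozensets are the focal elements.
GENUINE, IMPOSTOR = frozenset({"genuine"}), frozenset({"impostor"})
THETA = GENUINE | IMPOSTOR  # full ignorance (uncertainty)

def dempster_combine(m1, m2):
    """Combine two basic belief assignments with Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                                        # mass on contradictory hypotheses
    return {k: v / (1.0 - conflict) for k, v in combined.items()}      # normalize out the conflict

def mass_from_score(score, quality):
    """Illustrative mapping: a matcher score tempered by a [0, 1] quality measure.
    Low quality shifts mass towards total ignorance (THETA)."""
    return {GENUINE: score * quality,
            IMPOSTOR: (1.0 - score) * quality,
            THETA: 1.0 - quality}

# Example: fuse a face matcher (good sample quality) with a fingerprint matcher (poor quality).
m_face = mass_from_score(score=0.85, quality=0.9)
m_finger = mass_from_score(score=0.40, quality=0.3)
fused = dempster_combine(m_face, m_finger)
print({tuple(sorted(k)): round(v, 3) for k, v in fused.items()})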
Abstract:
This paper examines discourses of male prostitution through an analysis of scientific texts. A contrast is drawn between nineteenth-century understandings of male prostitution and twentieth-century accounts of male prostitution. In contrast to female prostitution, male prostitution was not regarded as a significant social problem throughout the nineteenth century, despite its close association with gender deviation and social disorder. Changing conceptions of sexuality, linked with the emergence of the ‘adolescent’, drew scientific attention to male prostitution during the 1940s and 1950s. Research suggested that male prostitution was a problem associated with the development of sexual identity. Through the application of scientific techniques, which tagged and differentiated male prostitute populations, a language developed about male prostitution that allowed for normative assessments and judgements to be made concerning particular classes of male prostitute. The paper highlights how a broad distinction emerged between public prostitutes, regarded as heterosexual/masculine, and private prostitutes, regarded as homosexual/effeminate. This distinction altered the way in which male prostitution was understood and governed, allowing for male prostitution to be constituted as a public health concern.