994 results for scientific uncertainty


Relevance: 20.00%

Publisher:

Abstract:

Some of Queensland's regions are experiencing rapid changes related to the recent and growing capacity to more effectively exploit significant energy sources. These changes have triggered land-use conflicts between the mining sector and other economic sectors, mainly agriculture. These conflicts fuel existing uncertainty surrounding the current and future economic, social and environmental impacts of extractive industries. This paper explores the concept of uncertainty as it applies to planning for resource-based regions through a scoping analysis of regional stakeholders' perceptions of land-use uncertainty. It then investigates solutions to alleviate this uncertainty.

Relevance: 20.00%

Publisher:

Abstract:

Objective: To evaluate the burden of malignant neoplasms in Shandong Province in order to provide scientific evidence for policy-making. Methods: The main data for this study came from the third Shandong cause-of-death sampling survey in 2006 and the 2007 Shandong cancer prevalence survey. YLLs, YLDs, DALYs and disability weights for each type of cancer were calculated according to the Global Burden of Disease (GBD) methodology, with YLDs estimated by the direct method and the uncertainty analysis conducted following the GBD approach. Results: The total cancer burden in the Shandong population was 1,383 thousand DALYs. Lung cancer, liver cancer, stomach cancer and esophageal cancer were the top four cancers with the highest health burden, together accounting for 71.45% of the total burden of all cancers. Premature death caused roughly 95% of the total burden of malignant tumors, with only 5.26% of the total cancer burden due to disability. The uncertainty of the total burden estimate was around ±11%, and the uncertainty of YLDs was greater than that of YLLs. Conclusion: The health burden due to cancers in the Shandong population is heavier than the national average. Liver cancer, lung cancer and stomach cancer should be the priority cancers for disease control and prevention in Shandong.
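
The YLL/YLD/DALY accounting described in the Methods reduces to the GBD identity DALY = YLL + YLD. A minimal sketch, with purely illustrative numbers rather than the Shandong data:

```python
# Sketch of the GBD burden-of-disease identity DALY = YLL + YLD.
# All numbers below are hypothetical, not the Shandong survey data.

def yll(deaths, life_expectancy_at_death):
    # Years of Life Lost: deaths x standard remaining life expectancy
    return deaths * life_expectancy_at_death

def yld(cases, disability_weight, duration):
    # Years Lived with Disability (direct method):
    # cases x disability weight x mean duration of the condition
    return cases * disability_weight * duration

def daly(yll_value, yld_value):
    # Disability-Adjusted Life Years combine both components
    return yll_value + yld_value

# Illustrative cohort: 1000 deaths losing 15 years each, plus
# 2000 prevalent cases with disability weight 0.3 lasting 2 years
total = daly(yll(1000, 15), yld(2000, 0.3, 2))
print(total)  # 16200.0
```

In this made-up example premature death contributes about 93% of the burden (15,000 of 16,200 DALYs), mirroring the dominance of YLLs reported in the abstract.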

Relevance: 20.00%

Publisher:

Abstract:

We examine the role of politico-economic influences on macroeconomic performance within the framework of an endogenous growth model with costly technology adoption and uncertainty. The model is aimed at understanding the diversity in growth and inequality experiences across countries. Agents adopt either of two risky technologies, one of which is only available through financial intermediaries, who are able to alleviate some of this risk. The entry cost of financial intermediation depends on the proportion of government revenue that is allocated towards cost-reducing financial development expenditure, and agents vote on this proportion. The results show that agents at the top and bottom ends of the distribution prefer alternative means of re-distribution, thereby effectively blocking the allocation of resources towards cost-reducing financial development expenditure. Thus political factors have a role in delaying financial and capital deepening and economic development. Furthermore, the model provides a political-economy perspective on the Kuznets curve; uncertainty interacts with the political economy mechanism to produce transitional inequality patterns that, depending on initial conditions, can unearth the Kuznets-curve experience. Finally, the political outcomes are inefficient relative to policies aimed at maximizing the collective welfare of agents in the economy.

Relevance: 20.00%

Publisher:

Abstract:

This study considered the problem of predicting survival based on three alternative models: a single Weibull, a mixture of Weibulls, and a cure model. Instead of the common procedure of choosing a single "best" model, where "best" is defined in terms of goodness of fit to the data, a Bayesian model averaging (BMA) approach was adopted to account for model uncertainty. This was illustrated using a case study in which the aim was the description of lymphoma cancer survival with covariates given by phenotypes and gene expression. The results of this study indicate that if the sample size is sufficiently large, one of the three models emerges as having the highest probability given the data, as indicated by the goodness-of-fit measure, the Bayesian information criterion (BIC). However, when the sample size was reduced, no single model was revealed as "best", suggesting that a BMA approach would be appropriate. Although a BMA approach can compromise goodness of fit to the data (when compared with the true model), it can provide robust predictions and facilitate more detailed investigation of the relationships between gene expression and patient survival.

Keywords: Bayesian modelling; Bayesian model averaging; cure model; Markov chain Monte Carlo; mixture model; survival analysis; Weibull distribution
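
As a concrete illustration of the averaging step, BIC-based BMA weights each candidate model by exp(-BIC/2), normalised across models (assuming equal prior model probabilities). The sketch below uses hypothetical BIC values, not those from the lymphoma case study:

```python
import math

# BIC-based Bayesian model averaging weights, assuming equal prior model
# probabilities: w_k is proportional to exp(-BIC_k / 2).
# The BIC values below are hypothetical.

def bma_weights(bics):
    best = min(bics)
    # subtract the minimum BIC before exponentiating, for numerical stability
    raw = [math.exp(-(b - best) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical BICs: single Weibull, Weibull mixture, cure model
weights = bma_weights([1012.4, 1008.9, 1010.1])
# A BMA survival prediction would then be sum(w_k * S_k(t)) over the models
```

With a large BIC gap one weight approaches 1, recovering a single "best" model; with BICs close together, as under a reduced sample size, the weights spread out and the averaging genuinely matters.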

Relevance: 20.00%

Publisher:

Abstract:

Porn studies researchers in the humanities have tended to use different research methods from those in the social sciences, and there has been surprisingly little conversation between the groups about methodology. This article presents a basic introduction to textual analysis and statistical analysis, aiming to give all porn studies researchers a familiarity with these two quite distinct traditions of data analysis. Comparing the two approaches, the article suggests that social science approaches are often strongly reliable but can sacrifice validity to this end, whereas textual analysis is much less reliable but has the capacity to be strongly valid. Statistical methods tend to produce a picture of human beings as groups, in terms of what they have in common, whereas humanities approaches often seek out uniqueness. Social science approaches have asked a more limited range of questions than have the humanities. The article ends with a call to mix up the kinds of research methods that are applied to various objects of study.

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Diabetes in South Asia represents a distinct disease entity in terms of its onset, progression, and complications. In the present study, we systematically analyzed the medical research output on diabetes in South Asia. METHODS: The online SciVerse Scopus database was searched using the terms "diabetes" and "diabetes mellitus" in the article Title, Abstract or Keywords fields, in conjunction with the names of each regional country in the Author Affiliation field. RESULTS: In total, 8478 research articles were identified. Most were from India (85.1%) and Pakistan (9.6%), and the region contributed 2.1% of the global diabetes research output. Publications from South Asia increased markedly after 2007, with 58.7% of the papers published between 2000 and 2010 appearing after 2007. Most papers were Research Articles (75.9%) and Reviews (12.9%), with only 90 (1.1%) clinical trials. Publications predominantly appeared in local national journals. Indian authors and institutions had the largest number of articles and the highest h-index. There were 136 (1.6%) intraregional collaborative studies. Only 39 articles (0.46%) had >100 citations. CONCLUSIONS: The regional research output on diabetes mellitus is unsatisfactory, with only a minimal contribution to global diabetes research. Publications are not highly cited, and only a few randomized controlled trials have been performed. In the coming decades, scientists in the region must collaborate and focus on practical and culturally acceptable interventional studies of diabetes mellitus.

Relevance: 20.00%

Publisher:

Abstract:

In this response to Tom G. K. Bryce and Stephen P. Day's (Cult Stud Sci Educ, doi:10.1007/s11422-013-9500-0, 2013) original article, I share their interest in the teaching of climate change in school science, but I widen it to include other contemporary complex socio-scientific issues that also need to be discussed. I use an alternative view of the relationship between science, technology and society, supported by evidence from both science and society, to suggest science-informed citizens as a more realistic outcome image for school science than the authors' image of mini-scientists. The intellectual independence that Bryce and Day assume, and intend for school science, is countered with an active intellectual dependence. It is only in relation to emerging and uncertain scientific contexts that students should be taught about scepticism, but they also need to learn when, and why, to trust science as an antidote to expressions of doubt about it. Some suggestions are made for pedagogies that could lead to these new learnings. The very recent fifth report of the IPCC answers many of their concerns about climate change.

Relevance: 20.00%

Publisher:

Abstract:

Scientific visualisations such as computer-based animations and simulations are increasingly a feature of high school science instruction. Visualisations are adopted enthusiastically by teachers and embraced by students, and there is good evidence that they are popular and well received. There is limited evidence, however, of how effective they are in enabling students to learn key scientific concepts. This paper reports the results of a quantitative study conducted in Australian chemistry classrooms. The visualisations chosen were from free online sources, intended to model the ways in which classroom teachers use visualisations, but were found to have serious flaws for conceptual learning. There were also challenges in the degree of interactivity available to students using the visualisations. Within these limitations, no significant difference was found for teaching with and without these visualisations. Further study using better designed visualisations and with explicit attention to the pedagogy surrounding the visualisations will be required to gather high quality evidence of the effectiveness of visualisations for conceptual development.

Relevance: 20.00%

Publisher:

Abstract:

Due to knowledge gaps in urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies of epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, primarily influenced by surface characteristics and rainfall intensity.
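
The OLSR/WLSR contrast at the heart of the framework can be sketched for a simple wash-off relation y ≈ k·x. The data and weights below are illustrative assumptions, not the study's; the point is that WLS down-weights unreliable observations that OLS treats equally:

```python
# Illustrative contrast between OLS and WLS estimates of a wash-off
# coefficient k in a linear model y ~ k * x (through the origin).
# Data and weights are assumptions for illustration, not study values.

def ols_slope(x, y):
    # ordinary least squares: k = sum(x*y) / sum(x^2), every point equal
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

def wls_slope(x, y, w):
    # weighted least squares: k = sum(w*x*y) / sum(w*x^2)
    num = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    den = sum(wi * xi * xi for wi, xi in zip(w, x))
    return num / den

x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 12.0]   # last observation looks unreliable
w = [1.0, 1.0, 1.0, 0.1]    # so WLS assigns it a low weight

k_ols = ols_slope(x, y)       # pulled upward by the suspect point
k_wls = wls_slope(x, y, w)    # closer to the trend of the reliable points
```

In the full framework the weights themselves come from an assumed error distribution and are sampled within the Bayesian/Gibbs scheme, which is what yields the more realistic uncertainty bounds the abstract reports.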

Relevance: 20.00%

Publisher:

Abstract:

Purpose: This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented. Methods: A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position had on OPFs, and setting the acceptable uncertainty in OPF at 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into additional factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom and source occlusion. The dominant effect was established and formed the basis of a theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for square field sizes of side length 4 mm to 100 mm, at a nominal photon energy of 6 MV. Results: According to the practical definition established in this project, field sizes < 15 mm were considered very small for 6 MV beams, for maximal field size uncertainties of 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or field size uncertainties were 0.5 mm, field sizes < 12 mm were considered very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes; thus the theoretical definition of very small field size coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effect. This was found to occur at field sizes < 12 mm. Source occlusion also caused a large change in OPF for field sizes < 8 mm. Based on the results of this study, field sizes < 12 mm were considered theoretically very small for 6 MV beams. Conclusions: Extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as the output factor measurement for each field size setting, together with very precise detector alignment, is required at field sizes at least < 12 mm, and more conservatively < 15 mm, for 6 MV beams. These recommendations apply in addition to all the usual considerations for small field dosimetry, including careful detector selection.

Relevance: 20.00%

Publisher:

Abstract:

The recently identified candidate gene for haemochromatosis (HH), HLA-H, reported by Feder et al., generated considerable scientific interest, coupled with a degree of uncertainty about the likely involvement of this gene in this common iron metabolism disorder. Feder et al. found a single point mutation resulting in an amino acid substitution (C282Y) that was homozygous in 148 (83%) of their patients, heterozygous in 9 patients (5%), but completely absent in 21 patients (12%). They proposed that the lack of a causative mutation in HLA-H in 12% of their patients was because these cases were not linked to chromosome 6p. A significant weakness in this argument is that all familial studies of the disorder so far have concluded that HH is due to a single major HLA-linked gene [5-7]. The ultimate test for a candidate gene is the clear segregation of a mutation with the disorder in all patients. Thus, some of the uncertainty surrounding the role of HLA-H in HH may be resolved by the identification of complete concordance of the C282Y mutation (or some other mutation) in HLA-H with disease status in HH families. One potential problem in the design of such an experimental analysis is that a number of studies have shown the presence of a predominant ancestral haplotype in all HH populations examined: Australian, French, Italian, UK and US. Thus, in the analysis of a putative causative mutation, it is important to include families with...

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a new algorithm based on a Modified Particle Swarm Optimization (MPSO) to estimate the harmonic state variables in a distribution network. The proposed algorithm estimates both the amplitude and the phase of each injected harmonic current by minimizing the error between the values measured by Phasor Measurement Units (PMUs) and the values computed from the estimated parameters during the estimation process. The algorithm can take into account the uncertainty of the harmonic pseudo-measurements and the tolerance in the line impedances of the network, as well as the uncertainty of Distributed Generators (DGs) such as Wind Turbines (WTs). The main features of the proposed MPSO algorithm are the use of primary and secondary PSO loops and the application of a mutation function. Simulation results on the IEEE 34-bus radial and a realistic 70-bus radial test network are presented. The results demonstrate that the proposed Distribution Harmonic State Estimation (DHSE) algorithm is faster and more accurate than algorithms such as Weighted Least Squares (WLS), Genetic Algorithm (GA), original PSO, and Honey Bees Mating Optimization (HBMO).
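
The estimation loop the abstract describes, minimizing the error between PMU measurements and the values implied by a candidate harmonic state, can be sketched with a generic PSO. This is not the paper's MPSO (no secondary loop or mutation function), and the measurement vector and parameters are hypothetical:

```python
import random

# Generic particle swarm sketch of the estimation idea: particles encode
# candidate harmonic state values, and fitness is the squared error between
# "measured" values and those implied by the candidate. Hypothetical data.

def pso_minimize(fitness, dim, n_particles=20, iters=100, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                # personal best positions
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                       # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Hypothetical "measurements" of two harmonic amplitudes
measured = [1.2, 0.4]

def fitness(state):
    return sum((m - s) ** 2 for m, s in zip(measured, state))

best, err = pso_minimize(fitness, dim=2)
```

In a real DHSE the fitness would map a candidate state through the network model (including pseudo-measurement and line-impedance uncertainty) before comparing against the PMU readings; the swarm mechanics stay the same.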

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a new algorithm based on a hybrid of Particle Swarm Optimization (PSO) and Simulated Annealing (SA), called PSO-SA, to estimate harmonic state variables in distribution networks. The proposed algorithm estimates both the amplitude and the phase of each injected harmonic current by minimizing the error between the values measured by Phasor Measurement Units (PMUs) and the values computed from the estimated parameters during the estimation process. The algorithm can take into account the uncertainty of the harmonic pseudo-measurements and the tolerance in the line impedances of the network, as well as the uncertainty of Distributed Generators (DGs) such as Wind Turbines (WTs). The main feature of the proposed PSO-SA algorithm is to move quickly towards the global optimum using PSO with a mutation function enabled, and then to locate that optimum precisely with the SA search. Simulation results on the IEEE 34-bus radial and a realistic 70-bus radial test network demonstrate that the proposed Distribution Harmonic State Estimation (DHSE) algorithm is markedly faster and more accurate than conventional algorithms such as Weighted Least Squares (WLS), Genetic Algorithm (GA), original PSO, and the Honey Bees Mating Optimization (HBMO) algorithm.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents the Mossman Mill District Practices Framework. It was developed in the Wet Tropics region within the Great Barrier Reef in north-eastern Australia to describe the environmental benefits of agricultural management practices for the sugar cane industry. The framework translates complex, unclear and overlapping environmental plans, policy and legal arrangements into a simple framework of management practices that landholders can use to improve their management actions. Practices range from those that are old or outdated through to aspirational practices that have the potential to achieve desired resource condition targets. The framework has been applied by stakeholders at multiple scales to better coordinate and integrate a range of policy arrangements to improve natural resource management. It has been used to structure monitoring and evaluation in order to underpin a more adaptive approach to planning at mill district and property scale. Potentially, the framework and approach can be applied across fields of planning where adaptive management is needed. It has the potential to overcome many of the criticisms of property-scale and regional Natural Resource Management.

Relevance: 20.00%

Publisher:

Abstract:

In the six decades since the discovery of the double helix structure of DNA by Watson and Crick in 1953, developments in genetic science have transformed our understanding of human health and disease. These developments, along with those in other areas such as computer science, biotechnology, and nanotechnology, have opened exciting new possibilities for the future. In addition, the increasing trend for technologies to converge and build upon each other potentially increases the pace of change, constantly expanding the boundaries of the scientific frontier. At the same time, however, scientific advances are often accompanied by public unease over the potential for unforeseen, negative outcomes. For governments, these issues present significant challenges for effective regulation. This Article analyzes the challenges associated with crafting laws for rapidly changing science and technology. It considers whether we need to regulate, how best to regulate for converging technologies, and how best to ensure the continued relevance of laws in the face of change.