253 results for Averaging operators

at Queensland University of Technology - ePrints Archive


Relevance: 20.00%

Publisher:

Abstract:

Harmful Algal Blooms (HABs) are a worldwide problem and have been increasing in frequency and extent over the past several decades. HABs severely damage aquatic ecosystems by destroying benthic habitat, reducing invertebrate and fish populations, and affecting larger species, such as dugong, that rely on seagrasses for food. Few statistical models for predicting HAB occurrences have been developed, and, in common with most predictive models in ecology, those that have been developed do not fully account for uncertainties in parameters and model structure. This makes management decisions based on these predictions riskier than might be supposed. We used a probit time series model and Bayesian Model Averaging (BMA) to predict occurrences of blooms of Lyngbya majuscula, a toxic cyanophyte, in Deception Bay, Queensland, Australia. We found a suite of useful predictors for HAB occurrence: temperature figured prominently in the models carrying the majority of posterior support, and a model consisting of the single covariate average monthly minimum temperature showed by far the greatest posterior support. We compared two model averaging strategies: one using the full posterior distribution over models, and a simpler approach that used only the models carrying the majority of the posterior mass, and hence vastly fewer models, for prediction. Both BMA approaches showed excellent predictive performance, with little difference in their predictive capacity. Applications of BMA are still rare in ecology, particularly in management settings. This study demonstrates the power of BMA as a management tool capable of high predictive performance while fully accounting for both parameter and model uncertainty.
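
The following is a minimal, illustrative sketch of BIC-approximated Bayesian model averaging over probit models in Python. The covariate names (min_temp, rainfall), the synthetic data and the use of exp(-BIC/2) as an approximate posterior model weight are assumptions for illustration, not the exact procedure used in the study.

import itertools
import numpy as np
from scipy.stats import norm
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 150
min_temp = rng.normal(22.0, 3.0, n)      # hypothetical average monthly minimum temperature
rainfall = rng.normal(80.0, 25.0, n)     # hypothetical monthly rainfall
bloom = (rng.uniform(size=n) < norm.cdf(-9.0 + 0.45 * min_temp)).astype(int)  # synthetic bloom indicator

candidates = {"min_temp": min_temp, "rainfall": rainfall}
fits = []
# Enumerate every non-empty subset of covariates as a candidate probit model.
for k in range(1, len(candidates) + 1):
    for subset in itertools.combinations(candidates, k):
        X = sm.add_constant(np.column_stack([candidates[c] for c in subset]))
        res = sm.Probit(bloom, X).fit(disp=0)
        bic = len(res.params) * np.log(n) - 2.0 * res.llf   # BIC from the log-likelihood
        fits.append((subset, res, bic))

# exp(-BIC/2) approximates each model's marginal likelihood up to a constant,
# so normalising gives approximate posterior model probabilities.
bics = np.array([bic for _, _, bic in fits])
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()

# BMA prediction: average the fitted bloom probabilities, weighted by model support.
bma_prob = sum(w * res.predict() for w, (_, res, _) in zip(weights, fits))
for (subset, _, _), w in zip(fits, weights):
    print(subset, round(float(w), 3))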

Relevance: 20.00%

Publisher:

Abstract:

Collaborative user-led content creation by online communities, or produsage (Bruns 2008), has generated a variety of useful and important resources and other valuable outcomes, from open source software through Wikipedia to a variety of smaller-scale, specialist projects. These projects are often seen as standing in inherent opposition to commercial interests, and attempts to develop collaborations between community content creators and commercial partners have had mixed success to date. However, such tension between community and commerce is not inevitable, and there is substantial potential for more fruitful exchange and collaboration. This article contributes to the development of this understanding by outlining the key underlying principles of such participatory community processes, exploring the potential tensions that could arise between these communities and their potential external partners, and sketching out potential approaches to resolving them.

Relevance: 20.00%

Publisher:

Abstract:

Background: Cancer can be a distressing experience for patients and carers, impacting on psychological, social, physical and spiritual functioning. However, health professionals often fail to detect distress in their patients because of time constraints and a lack of experience, and, with the focus on the patient, carer needs are often overlooked. This study investigated the acceptability of brief distress screening with the Distress Thermometer (DT) and Problem List (PL) to operators of a community-based telephone helpline, as well as to cancer patients and carers calling the service. Methods: Operators (n = 18) monitored usage of the DT and PL with callers (cancer patients/carers, >18 years, English-speaking) from September to December 2006 (n = 666). The DT is a single-item, 11-point scale for rating level of distress; the associated PL identifies the cause of distress. Results: The DT and PL were used with 90% of eligible callers, most of whom provided valid responses. Benefits included having an objective, structured and consistent means of distress screening and triage to supportive care services. Reported challenges included apparent inappropriateness of the tools due to the nature of the call or the level of caller distress, the DT numeric scale, and the level of operator training. Conclusions: We observed positive outcomes from using the DT and PL, although operators reported some challenges. Overcoming these challenges may improve distress screening, particularly by less experienced clinicians, and further development of the PL items and DT scale may assist with administration. The DT and PL allow clinicians to direct and prioritise interventions or referrals, although ongoing training and support is critical in distress screening.

Relevance: 20.00%

Publisher:

Abstract:

Countless studies have stressed the importance of social identity, particularly its role in various organizational outcomes, yet questions remain as to how identities initially develop, shift and change based on the configuration of multiple, pluralistic relationships grounded in an organizational setting. The recently proposed interactive model of social identity formation explains the internalization of shared norms and values, which is critical in identity formation, but it has not received empirical examination. We analyzed multiple sources of data from nine nuclear professionals over three years to understand the construction of social identity in new entrants to an organization. Informed by our data analyses, we found support for the interactive model, and found that age and level of experience influenced whether new entrants took an inductive or a deductive route to internalizing group norms and values. This study represents an important contribution to the study of social identity and the process by which identities are formed, particularly under conditions of duress or significant organizational disruption.

Relevance: 20.00%

Publisher:

Abstract:

This study considered the problem of predicting survival based on three alternative models: a single Weibull, a mixture of Weibulls and a cure model. Instead of the common procedure of choosing a single “best” model, where “best” is defined in terms of goodness of fit to the data, a Bayesian model averaging (BMA) approach was adopted to account for model uncertainty. This was illustrated using a case study in which the aim was the description of lymphoma cancer survival with covariates given by phenotypes and gene expression. The results of this study indicate that if the sample size is sufficiently large, one of the three models emerges as having the highest probability given the data, as indicated by the goodness-of-fit measure, the Bayesian information criterion (BIC). However, when the sample size was reduced, no single model was revealed as “best”, suggesting that a BMA approach would be appropriate. Although a BMA approach can compromise goodness of fit to the data (when compared to the true model), it can provide robust predictions and facilitate more detailed investigation of the relationships between gene expression and patient survival.

Keywords: Bayesian modelling; Bayesian model averaging; cure model; Markov chain Monte Carlo; mixture model; survival analysis; Weibull distribution
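
For reference, the three candidate survival functions have standard forms, shown here under a common parameterization that the abstract does not spell out, and the BMA predictive survival curve weights each candidate by its approximate posterior model probability:

\[
S_{\mathrm{Weibull}}(t) = \exp\!\left\{-(t/\lambda)^{k}\right\}, \qquad
S_{\mathrm{mixture}}(t) = \sum_{j} \pi_j \exp\!\left\{-(t/\lambda_j)^{k_j}\right\}, \qquad
S_{\mathrm{cure}}(t) = p + (1-p)\, S_{\mathrm{Weibull}}(t),
\]
\[
S_{\mathrm{BMA}}(t \mid \text{data}) = \sum_{m} S_m(t)\, \Pr(m \mid \text{data}),
\qquad
\Pr(m \mid \text{data}) \approx \frac{\exp(-\mathrm{BIC}_m/2)}{\sum_{m'} \exp(-\mathrm{BIC}_{m'}/2)}.
\]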

Relevance: 20.00%

Publisher:

Abstract:

Nitrous oxide (N2O) is one of the greenhouse gases that contribute to global warming. Spatial variability of N2O can lead to large uncertainties in prediction, yet previous studies have often ignored spatial dependency when quantifying relationships between N2O and environmental factors. Few studies have examined the impact of different spatial correlation structures (e.g. independence, distance-based and neighbourhood-based) on spatial prediction of N2O emissions. This study aimed to assess the impact of three spatial correlation structures on spatial predictions and to calibrate the spatial prediction using Bayesian model averaging (BMA) based on replicated, irregular point-referenced data. The data were measured in 17 chambers randomly placed across a 271 m² field between October 2007 and September 2008 in the southeast of Australia. We used a Bayesian geostatistical model and a Bayesian spatial conditional autoregressive (CAR) model to investigate and accommodate spatial dependency and to estimate the effects of environmental variables on N2O emissions across the study site, and compared these with a Bayesian regression model with independent errors. The three approaches produced different maps of predicted N2O emissions. We found that incorporating spatial dependency in the model not only substantially improved predictions of N2O emission from soil, but also better quantified uncertainties of soil parameters in the study. The hybrid model structure obtained by BMA improved the accuracy of spatial prediction of N2O emissions across this study region.
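
As an illustrative sketch only, the three error structures compared above can be written down in a few lines of numpy. The chamber coordinates, sill, decay parameter, neighbourhood radius and CAR parameters below are assumptions, not values from the study.

import numpy as np

rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 16.5, size=(17, 2))    # 17 chambers scattered over a ~271 m^2 field
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # pairwise distances (m)

sigma2, phi = 1.0, 0.2                            # hypothetical sill and spatial decay rate
cov_independent = sigma2 * np.eye(len(coords))    # independent errors (plain Bayesian regression)
cov_distance = sigma2 * np.exp(-phi * d)          # distance-based exponential covariance (geostatistical)

# Neighbourhood-based structure: chambers within 5 m are neighbours, and the CAR
# model works with a precision (inverse covariance) matrix Q = tau * (D - rho * W).
W = ((d > 0.0) & (d < 5.0)).astype(float)         # adjacency matrix
D = np.diag(W.sum(axis=1))                        # diagonal matrix of neighbour counts
tau, rho = 1.0, 0.9
Q_car = tau * (D - rho * W)

print(cov_distance[:3, :3].round(2))
print(Q_car[:3, :3].round(2))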

Relevance: 20.00%

Publisher:

Abstract:

Cryo-electron tomography, together with averaging of sub-tomograms containing identical particles, can reveal the structure of proteins or protein complexes in their native environment. The resolution of this technique is limited by the contrast transfer function (CTF) of the microscope. The CTF is not routinely corrected in cryo-electron tomography because of difficulties with CTF detection, due to the low signal-to-noise ratio, and with CTF correction, since images are characterised by a spatially variant CTF. Here we simulate the effects of the CTF on the resolution of the final reconstruction, before and after CTF correction, and consider the effect of errors and approximations in defocus determination. We show that errors in defocus determination are well tolerated when correcting a series of tomograms collected at a range of defocus values. We apply methods for determining the CTF parameters in low signal-to-noise images of tilted specimens, for monitoring defocus changes using observed magnification changes, and for correcting the CTF prior to reconstruction. Using bacteriophage PRD1 as a test sample, we demonstrate that this approach improves the structure obtained by sub-tomogram averaging from cryo-electron tomograms.
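
A minimal numpy sketch of a standard CTF model and phase-flip correction follows. The acceleration voltage, spherical aberration, defocus, amplitude contrast and pixel size are illustrative assumptions, and the paper's actual pipeline (tilted specimens with a spatially variant CTF) is considerably more involved.

import numpy as np

def ctf_1d(k, defocus_um=4.0, kv=300.0, cs_mm=2.0, amp_contrast=0.07):
    """Standard CTF as a function of spatial frequency k (1/Angstrom)."""
    volts = kv * 1e3
    wavelength = 12.2639 / np.sqrt(volts + 0.97845e-6 * volts**2)   # electron wavelength (Angstrom)
    dz = defocus_um * 1e4                                           # defocus in Angstrom (underfocus positive)
    cs = cs_mm * 1e7                                                # spherical aberration in Angstrom
    gamma = np.pi * wavelength * dz * k**2 - 0.5 * np.pi * cs * wavelength**3 * k**4
    return -(np.sqrt(1.0 - amp_contrast**2) * np.sin(gamma) + amp_contrast * np.cos(gamma))

def phase_flip(image, pixel_size=5.0, **ctf_kwargs):
    """Correct CTF phase reversals by flipping the sign of affected frequency bands."""
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny, d=pixel_size)
    fx = np.fft.fftfreq(nx, d=pixel_size)
    k = np.sqrt(fy[:, None]**2 + fx[None, :]**2)
    ctf = ctf_1d(k, **ctf_kwargs)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.sign(ctf)))

corrected = phase_flip(np.random.default_rng(0).normal(size=(256, 256)))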

Relevance: 20.00%

Publisher:

Abstract:

The provision of effective training for supervisors and operators is essential if sugar factories are to operate profitably and in an environmentally sustainable and safe manner. The benefits of having supervisor and operator staff with a high level of operational skill are reduced stoppages, increased recovery, improved sugar quality, reduced damage to equipment, and reduced OH&S and environmental impacts. Training of new operators and supervisors in factories has traditionally relied on on-the-job training of new or inexperienced staff by experienced supervisors and operators, supplemented by courses conducted by contractors such as Sugar Research Institute (SRI). However, there is clearly a need for staff to be able to undertake training at any time, drawing on the content of online courses as required. An improved methodology for the training of factory supervisors and operators has been developed by QUT on behalf of a syndicate of mills. The new methodology provides ‘at factory’ learning via self-paced modules. Importantly, the training resources for each module are designed to support the training programs within sugar factories, thereby establishing a benchmark for training across the sugar industry. The modules include notes, training guides and session plans, guidelines for walkthrough tours of the stations, learning activities, resources such as videos, animations and job aids, and competency assessments. The materials are available on the web for registered users in Australian mills, and many activities are best undertaken online. Apart from a few interactive online resources, the materials for each module can also be downloaded. The acronym SOTrain (Supervisor and Operator Training) has been applied to the new training program.

Relevance: 10.00%

Publisher:

Abstract:

We generalize the classical notion of Vapnik–Chervonenkis (VC) dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set W of structures over some vocabulary, together with a set D of first-order formulas that represent data. The sets of models of ϕ in W, where ϕ varies over D, generate a natural topology on W. We show that if D is closed under Boolean operators, then the notion of ordinal VC-dimension offers a perfect characterization of the problem of predicting the truth of the members of D in a member of W, with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity) and mind change complexity. The assumptions that D is closed under Boolean operators and that W is compact often play a crucial role in establishing connections between these concepts. We then consider a computable setting with effective versions of the complexity measures and show that the equivalence between ordinal VC-dimension and predictive complexity fails. More precisely, we prove that the effective ordinal VC-dimension of a paradigm can be defined when all other effective notions of complexity are undefined. On a better note, when W is compact, all effective notions of complexity are defined, though they are not related as in the noncomputable version of the framework.
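
For reference, the classical notion being generalized is the standard finite VC dimension of a class of sets; the ordinal variant introduced in the paper is not reproduced here:

\[
\mathcal{C}\ \text{shatters}\ S \iff \{\, C \cap S : C \in \mathcal{C} \,\} = 2^{S},
\qquad
\mathrm{VCdim}(\mathcal{C}) = \sup\{\, |S| : S\ \text{finite},\ \mathcal{C}\ \text{shatters}\ S \,\}.
\]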