117 results for Quasi-Likelihood
Abstract:
The Fabens method is commonly used to estimate growth parameters k and L-infinity in the von Bertalanffy model from tag-recapture data. However, the Fabens method of estimation has an inherent bias when individual growth is variable. This paper presents an asymptotically unbiased method using a maximum likelihood approach that takes account of individual variability in both maximum length and age-at-tagging. It is assumed that each individual's growth follows a von Bertalanffy curve with its own maximum length and age-at-tagging. The parameter k is assumed to be a constant to ensure that the mean growth follows a von Bertalanffy curve and to avoid overparameterization. Our method also makes more efficient use of the measurements at tagging and recapture and includes diagnostic techniques for checking distributional assumptions. The method is reasonably robust and performs better than the Fabens method when individual growth differs from the von Bertalanffy relationship. When measurement error is negligible, the estimation involves maximizing the profile likelihood of one parameter only. The method is applied to tag-recapture data for the grooved tiger prawn (Penaeus semisulcatus) from the Gulf of Carpentaria, Australia.
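As a point of reference, here is a minimal sketch of the classical Fabens growth-increment fit that the paper improves upon. The tag-recapture records below are hypothetical, and the nonlinear least-squares fit stands in for, rather than reproduces, the paper's maximum likelihood method.

```python
# Hedged sketch: fitting the Fabens growth-increment form
#   dL = (Linf - L1) * (1 - exp(-k * dt))
# to hypothetical tag-recapture data by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

L1 = np.array([20.0, 25.0, 30.0, 28.0, 22.0])   # length at tagging (hypothetical units)
dt = np.array([0.3, 0.5, 0.2, 0.6, 0.4])        # time at liberty (years)
dL = np.array([8.0, 9.0, 3.0, 8.5, 9.0])        # observed growth increments

def fabens_increment(X, Linf, k):
    L1, dt = X
    return (Linf - L1) * (1.0 - np.exp(-k * dt))

(Linf_hat, k_hat), _ = curve_fit(fabens_increment, (L1, dt), dL, p0=(40.0, 1.0))
print(f"Fabens estimates: Linf = {Linf_hat:.1f}, k = {k_hat:.2f}")
```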
Abstract:
We consider estimation of mortality rates and growth parameters from length-frequency data of a fish stock and derive the underlying length distribution of the population and the catch when there is individual variability in the von Bertalanffy growth parameter L-infinity. The model is flexible enough to accommodate 1) any recruitment pattern as a function of both time and length, 2) length-specific selectivity, and 3) varying fishing effort over time. The maximum likelihood method gives consistent estimates, provided the underlying distribution for individual variation in growth is correctly specified. Simulation results indicate that our method is reasonably robust to violations in the assumptions. The method is applied to tiger prawn data (Penaeus semisulcatus) to obtain estimates of natural and fishing mortality.
Abstract:
The method of generalized estimating equations (GEEs) has been criticized recently for a failure to protect against misspecification of working correlation models, which in some cases leads to loss of efficiency or infeasibility of solutions. However, the feasibility and efficiency of GEE methods can be enhanced considerably by using flexible families of working correlation models. We propose two ways of constructing unbiased estimating equations from general correlation models for irregularly timed repeated measures to supplement and enhance GEE. The supplementary estimating equations are obtained by differentiation of the Cholesky decomposition of the working correlation, or as score equations for decoupled Gaussian pseudolikelihood. The estimating equations are solved with computational effort equivalent to that required for a first-order GEE. Full details and analytic expressions are developed for a generalized Markovian model that was evaluated through simulation. Large-sample "sandwich" standard errors for working correlation parameter estimates are derived and shown to have good performance. The proposed estimating functions are further illustrated in an analysis of repeated measures of pulmonary function in children.
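For orientation, a hedged sketch of an ordinary first-order GEE fit of the kind the paper builds upon, using statsmodels with a simple working correlation. The children's lung-function data here are simulated stand-ins, and the supplementary estimating equations described in the abstract are not implemented.

```python
# Minimal first-order GEE illustration (not the paper's supplementary equations).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_children, n_visits = 50, 4
df = pd.DataFrame({
    "child": np.repeat(np.arange(n_children), n_visits),
    "age": np.tile(np.arange(n_visits), n_children) + 6,
})
# Simulated pulmonary-function outcome (hypothetical)
df["fev1"] = 1.0 + 0.2 * df["age"] + rng.normal(0, 0.3, len(df))

model = sm.GEE.from_formula(
    "fev1 ~ age", groups="child", data=df,
    family=sm.families.Gaussian(),
    cov_struct=sm.cov_struct.Exchangeable(),  # a simple working correlation
)
print(model.fit().summary())
```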
Abstract:
A simple stochastic model of a fish population subject to natural and fishing mortalities is described. The fishing effort is assumed to vary over different periods but to be constant within each period. A maximum-likelihood approach is developed for estimating natural mortality (M) and the catchability coefficient (q) simultaneously from catch-and-effort data. If there is not enough contrast in the data to provide reliable estimates of both M and q, as is often the case in practice, the method can be used to obtain the best possible values of q for a range of possible values of M. These techniques are illustrated with tiger prawn (Penaeus semisulcatus) data from the Northern Prawn Fishery of Australia.
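A rough sketch of the profiling idea described above: fix natural mortality M at a series of plausible values and, for each, find the catchability q that best explains the catch-and-effort series. The simplified dynamics (a Baranov catch equation with a lognormal-style fit criterion) and all numbers are illustrative assumptions, not the paper's exact stochastic model.

```python
# Hedged sketch: profile the catchability q over a grid of assumed M values.
import numpy as np
from scipy.optimize import minimize_scalar

effort = np.array([120., 150., 90., 110., 100.])   # fishing effort per period (hypothetical)
catch = np.array([800., 700., 350., 320., 240.])   # observed catch per period (hypothetical)
N0 = 10000.0                                        # assumed initial abundance

def neg_log_lik(q, M):
    N, nll = N0, 0.0
    for E, C_obs in zip(effort, catch):
        F = q * E                                   # fishing mortality
        Z = M + F                                   # total mortality
        C_pred = (F / Z) * N * (1.0 - np.exp(-Z))   # Baranov catch equation
        nll += (np.log(C_obs) - np.log(C_pred)) ** 2  # lognormal-error proxy
        N *= np.exp(-Z)                             # survivors to next period
    return nll

for M in (0.5, 1.0, 1.5):
    res = minimize_scalar(neg_log_lik, bounds=(1e-5, 0.05), args=(M,), method="bounded")
    print(f"M = {M:.1f}: best q = {res.x:.5f}")
```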
Abstract:
Background: There has been considerable publicity regarding population ageing and hospital emergency department (ED) overcrowding. Our study aims to investigate the impact of one intervention piloted in Queensland, Australia, the Hospital in the Nursing Home (HiNH) program, on reducing ED and hospital attendances from residential aged care facilities (RACFs).
Methods: A quasi-experimental study was conducted at an intervention hospital undertaking the program and a control hospital with normal practice. Routine Queensland health information system data were extracted for analysis.
Results: Significant reductions in the number of ED presentations per 1000 RACF beds (rate ratio (95% CI): 0.78 (0.67–0.92); p = 0.002), the number of hospital admissions per 1000 RACF beds (0.62 (0.50–0.76); p < 0.0001), and the number of hospital admissions per 100 ED presentations (0.61 (0.43–0.85); p = 0.004) were observed in the intervention hospital after the intervention, while there were no significant differences between the intervention and control hospitals before the intervention. Pre-test and post-test comparison in the intervention hospital also showed significant decreases in the ED presentation rate (0.75 (0.65–0.86); p < 0.0001) and the hospital admission rate per RACF bed (0.66 (0.54–0.79); p < 0.0001), and a non-significant reduction in the hospital admission rate per ED presentation (0.82 (0.61–1.11); p = 0.196).
Conclusions: The Hospital in the Nursing Home program could be effective in reducing ED presentations and hospital admissions from RACF residents. Implementation of the program across a variety of settings is recommended to fully assess the ongoing benefits for patients and any possible cost savings.
Abstract:
Having the ability to work with complex models can be highly beneficial, but the computational cost of doing so is often large. Complex models often have intractable likelihoods, so methods that directly use the likelihood function are infeasible. In these situations, the benefits of working with likelihood-free methods become apparent. Likelihood-free methods, such as parametric Bayesian indirect likelihood that uses the likelihood of an alternative parametric auxiliary model, have been explored throughout the literature as a good alternative when the model of interest is complex. One of these methods is called the synthetic likelihood (SL), which assumes a multivariate normal approximation to the likelihood of a summary statistic of interest. This paper explores the accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach in comparison with a competitor known as approximate Bayesian computation (ABC), as well as its sensitivity to tuning parameters and assumptions. We relate BSL to pseudo-marginal methods and propose to use an alternative SL that uses an unbiased estimator of the exact working normal likelihood when the summary statistic has a multivariate normal distribution. Several applications of varying complexity are considered to illustrate the findings of this paper.
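A minimal sketch of the standard synthetic likelihood idea that BSL builds on: simulate summary statistics under a candidate parameter, fit a multivariate normal to them, and evaluate the observed summary under that fitted normal. The toy simulator and summaries below are hypothetical and are not those used in the paper.

```python
# Hedged sketch of the synthetic likelihood (SL) for a toy simulator.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)

def simulate_summaries(theta, n_sims=200, n_obs=100):
    # Toy simulator: data ~ Normal(theta, 1); summaries = (sample mean, sample variance)
    data = rng.normal(theta, 1.0, size=(n_sims, n_obs))
    return np.column_stack([data.mean(axis=1), data.var(axis=1)])

def synthetic_loglik(theta, s_obs):
    s_sim = simulate_summaries(theta)
    mu = s_sim.mean(axis=0)              # estimated mean of the summaries
    Sigma = np.cov(s_sim, rowvar=False)  # estimated covariance of the summaries
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=Sigma)

s_obs = np.array([0.1, 1.05])            # observed summary statistic (hypothetical)
for theta in (-0.5, 0.0, 0.1, 0.5):
    print(theta, synthetic_loglik(theta, s_obs))
```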
Abstract:
Background: The estimated likelihood of lower limb amputation is 10 to 30 times higher amongst people with diabetes compared to those without diabetes. Of all non-traumatic amputations in people with diabetes, 85% are preceded by a foot ulcer. Foot ulceration associated with diabetes (diabetic foot ulcers) is caused by the interplay of several factors, most notably diabetic peripheral neuropathy (DPN), peripheral arterial disease (PAD) and changes in foot structure. These factors have been linked to chronic hyperglycaemia (high levels of glucose in the blood) and the altered metabolic state of diabetes. Control of hyperglycaemia may be important in the healing of ulcers.
Objectives: To assess the effects of intensive glycaemic control compared to conventional control on the outcome of foot ulcers in people with type 1 and type 2 diabetes.
Search methods: In December 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE; EBSCO CINAHL; Elsevier SCOPUS; ISI Web of Knowledge Web of Science; BioMed Central and LILACS. We also searched clinical trial databases, pharmaceutical trial databases and current international and national clinical guidelines on diabetes foot management for relevant published, non-published, ongoing and terminated clinical trials. There were no restrictions based on language, date of publication or study setting.
Selection criteria: Published, unpublished and ongoing randomised controlled trials (RCTs) were considered for inclusion where they investigated the effects of intensive glycaemic control on the outcome of active foot ulcers in people with diabetes. Non-randomised and quasi-randomised trials were excluded. In order to be included the trial had to have: 1) attempted to maintain or control blood glucose levels and measured changes in markers of glycaemic control (HbA1c or fasting, random, mean, home capillary or urine glucose), and 2) documented the effect of these interventions on active foot ulcer outcomes. Glycaemic interventions included subcutaneous insulin administration, continuous insulin infusion, oral anti-diabetes agents, lifestyle interventions or a combination of these interventions. The definition of the interventional (intensive) group was that it should have a lower glycaemic target than the comparison (conventional) group.
Data collection and analysis: All review authors independently evaluated the papers identified by the search strategy against the inclusion criteria. Two review authors then independently reviewed all potential full-text articles and trials registry results for inclusion.
Main results: We identified only one trial that met the inclusion criteria, but this trial did not have any results, so we could not perform the planned subgroup and sensitivity analyses in the absence of data. Two ongoing trials were identified which may provide data for analyses in a later version of this review. The completion date of these trials is currently unknown.
Authors' conclusions: The current review failed to find any completed randomised clinical trials with results. Therefore we are unable to conclude whether intensive glycaemic control, when compared to conventional glycaemic control, has a positive or detrimental effect on the treatment of foot ulcers in people with diabetes.
Previous evidence has however highlighted a reduction in risk of limb amputation (from various causes) in people with type 2 diabetes with intensive glycaemic control. Whether this applies to people with foot ulcers in particular is unknown. The exact role that intensive glycaemic control has in treating foot ulcers in multidisciplinary care (alongside other interventions targeted at treating foot ulcers) requires further investigation.
Abstract:
Hedonic property price analysis tells us that property prices can be affected by natural hazards such as floods. This paper examines the impact of flood-related variables (among other factors) on property values, and compares the effect of releasing flood risk map information on property values with the effect of an actual flood event. The temporal variation of flood impacts on property values is also examined. The study is the first of its kind in which the impact of releasing flood risk map information to the public is compared with that of an actual flood event. We adopt a spatial quasi-experimental analysis using the release of flood risk maps by Brisbane City Council in Queensland, Australia, in 2009 and the actual floods of 2011. The results suggest that property buyers are more responsive to the actual incidence of floods than to the public disclosure of information on flood risk.
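As an illustration of the general hedonic, quasi-experimental specification described above (not the authors' exact model), here is a sketch of a log-price regression with a flood-zone indicator interacted with post-map-release and post-flood periods; all variable names and data are hypothetical.

```python
# Hedged sketch of a hedonic difference-in-differences style specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "log_price": rng.normal(13.0, 0.3, n),       # log sale price (simulated)
    "bedrooms": rng.integers(2, 6, n),
    "land_area": rng.normal(600, 150, n),
    "flood_zone": rng.integers(0, 2, n),          # 1 = in mapped flood-risk area
    "post_map": rng.integers(0, 2, n),            # 1 = sold after 2009 map release
    "post_flood": rng.integers(0, 2, n),          # 1 = sold after 2011 flood
})

model = smf.ols(
    "log_price ~ bedrooms + land_area + flood_zone"
    " + flood_zone:post_map + flood_zone:post_flood",
    data=df,
).fit()
print(model.summary())
```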
Abstract:
The electrochemical characteristics of a series of heteroleptic tris(phthalocyaninato) complexes with identical rare earths or mixed rare earths (Pc)M(OOPc)M(OOPc) [M = Eu...Lu, Y; H2Pc = unsubstituted phthalocyanine, H2(OOPc) = 3,4,12,13,21,22,30,31-octakis(octyloxy)phthalocyanine] and (Pc)Eu(OOPc)Er(OOPc) have been recorded and studied comparatively by cyclic voltammetry (CV) and differential pulse voltammetry (DPV) in CH2Cl2 containing 0.1 M tetrabutylammonium perchlorate (TBAP). Up to five quasi-reversible one-electron oxidations and four one-electron reductions have been revealed. The half-wave potentials of the first, second and fifth oxidations depend on the size of the metal center, but the fifth changes in the opposite direction to that of the first two. Moreover, the difference in redox potentials of the first oxidation and first reduction for (Pc)M(OOPc)M(OOPc), 0.85−0.98 V, also decreases linearly along with decreasing rare earth ion radius, clearly showing the rare earth ion size effect and indicating enhanced π−π interactions in the triple-deckers connected by smaller lanthanides. This order follows the red-shift seen in the lowest energy band of triple-decker compounds. The electronic differences between the lanthanides and yttrium are more apparent for triple-decker sandwich complexes than for the analogous double-deckers. By comparing triple-decker, double-decker and mononuclear [ZnII] complexes containing the OOPc ligand, the HOMO−LUMO gap has been shown to contract approximately linearly with the number of stacked phthalocyanine ligands.
Abstract:
The electrochemistry of homoleptic substituted phthalocyaninato rare earth double-decker complexes M(TBPc)2 and M(OOPc)2 [M = Y, La...Lu except Pm; H2TBPc = 3(4),12(13),21(22),30(31)-tetra-tert-butylphthalocyanine, H2OOPc = 3,4,12,13,21,22,30,31-octakis(octyloxy)phthalocyanine] has been comparatively studied by cyclic voltammetry (CV) and differential pulse voltammetry (DPV) in CH2Cl2 containing 0.1 M tetra-n-butylammonium perchlorate (TBAP). Two quasi-reversible one-electron oxidations and three or four quasi-reversible one-electron reductions have been revealed for the neutral double-deckers of the two series of substituted complexes, respectively. For comparison, unsubstituted bis(phthalocyaninato) rare earth analogues M(Pc)2 (M = Y, La...Lu except Pm; H2Pc = phthalocyanine) have also been electrochemically investigated. Two quasi-reversible one-electron oxidations and up to five quasi-reversible one-electron reductions have been revealed for these neutral double-decker compounds. The three bis(phthalocyaninato)cerium compounds display one cerium-centered redox wave between the first ligand-based oxidation and reduction. The half-wave potentials of the first and second oxidations and first reduction for double-deckers of the tervalent rare earths depend on the size of the metal center. The difference between the redox potentials of the second and third reductions for MIII(Pc)2, which represents the potential difference between the first oxidation and first reduction of [MIII(Pc)2]−, lies in the range 1.08−1.37 V and also gradually diminishes along with the lanthanide contraction, indicating enhanced π−π interactions in the double-deckers connected by the smaller lanthanides. This corresponds well with the red-shift of the lowest energy band observed in the electronic absorption spectra of the reduced double-deckers [MIII(Pc′)2]− (Pc′ = Pc, TBPc, OOPc).
Abstract:
Despite the best intentions of service providers and organisations, service delivery is rarely error-free. While numerous studies have investigated specific cognitive, emotional or behavioural responses to service failure and recovery, these studies do not fully capture the complexity of the service encounter. Consequently, this research develops a more holistic understanding of how specific service recovery strategies affect the responses of customers by combining two existing models, Smith & Bolton's (2002) model of emotional responses to service performance and Fullerton and Punj's (1993) structural model of aberrant consumer behaviour, into a conceptual framework. Specific service recovery strategies are proposed to influence consumer cognition, emotion and behaviour. This research was conducted using a 2x2 between-subjects quasi-experimental design that was administered via written survey. The experimental design manipulated two levels of two specific service recovery strategies: compensation and apology. The effects of the four recovery strategies were investigated by collecting data from 18-25 year olds, and the data were analysed using multivariate analysis of covariance and multiple regression analysis. The results suggest that different service recovery strategies are associated with varying scores of satisfaction, perceived distributive justice, positive emotions, negative emotions and negative functional behaviour, but not dysfunctional behaviour. These findings have significant implications for the theory and practice of managing service recovery.
Abstract:
Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is a smooth progression or evolution of ideas that seem self-evident and inevitable after the event. But the next step is anything but obvious for the artist/creator/inventor/designer stuck at that point just before the creative leap. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity - it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based problem-solving techniques. By the time the problem has been defined, it has been solved. Indeed the solution is often the very definition of the problem. Design must be creative, or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question must arise: can we find some way of searching the space ahead? Of course there are serious problems: knowing what we are looking for, and the vastness of the search space. It may be better to discard altogether the term "searching" in the context of the design process: conceptual analogies such as search, search spaces and fitness landscapes aim to elucidate the design process. However, the vastness of the multidimensional spaces involved makes these analogies misguided, and they thereby further confound the issue. The term search becomes a misnomer since it has connotations that imply that it is possible to find what you are looking for. In such vast spaces the term search must be discarded. Thus, any attempt at searching for the highest peak in the fitness landscape as an optimal solution is also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious. Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless, we still have the tantalizing possibility that, if a creative idea seems inevitable after the event, then somehow the process might be reversed. This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal-directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process, without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again would be infinitesimally small (Gould 1989; Dennett 1995). But nevertheless Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as the application of evolutionary algorithms, which are usually thought of as search algorithms.
It is necessary to abandon such connections with searching and to see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. The process of natural selection can generate a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments. Most importantly, nature has all the time in the world. As designers we cannot afford such profligate prototyping and ruthless experimentation, nor can we operate on the time scale of the natural design process. Instead we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).
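A minimal sketch of the kind of evolutionary loop alluded to here: generate a population of design alternatives, evaluate them, eliminate the less successful, and vary the survivors. The bit-string "design" and its toy scoring function are placeholders for a real design representation, not part of the work described above.

```python
# Hedged sketch of a simple generational evolutionary loop.
import random

def evaluate(design):
    return sum(design)                    # toy fitness: count of 1-bits

def mutate(design, rate=0.05):
    return [1 - g if random.random() < rate else g for g in design]

# Initial population of random bit-string "designs"
population = [[random.randint(0, 1) for _ in range(32)] for _ in range(20)]

for generation in range(50):
    population.sort(key=evaluate, reverse=True)
    survivors = population[:10]                        # elimination of weaker alternatives
    offspring = [mutate(random.choice(survivors)) for _ in range(10)]
    population = survivors + offspring                 # continuous experiment

print("best design score:", evaluate(max(population, key=evaluate)))
```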
Abstract:
Participatory evaluation and participatory action research (PAR) are increasingly used in community-based programs and initiatives and there is a growing acknowledgement of their value. These methodologies focus more on knowledge generated and constructed through lived experience than through social science (Vanderplaat 1995). The scientific ideal of objectivity is usually rejected in favour of a holistic approach that acknowledges and takes into account the diverse perspectives, values and interpretations of participants and evaluation professionals. However, evaluation rigour need not be lost in this approach. Increasing the rigour and trustworthiness of participatory evaluations and PAR increases the likelihood that results are seen as credible and are used to continually improve programs and policies.
Drawing on learnings and critical reflections about the use of feminist and participatory forms of evaluation and PAR over a 10-year period, significant sources of rigour identified include:
• participation and communication methods that develop relations of mutual trust and open communication
• using multiple theories and methodologies, multiple sources of data, and multiple methods of data collection
• ongoing meta-evaluation and critical reflection
• critically assessing the intended and unintended impacts of evaluations, using relevant theoretical models
• using rigorous data analysis and reporting processes
• participant reviews of evaluation case studies, impact assessments and reports.