838 results for: Accelerated failure time model. Correlated data. Imputation. Residuals analysis
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
This paper presents a metafrontier production function model for firms in different groups having different technologies. The metafrontier model enables the calculation of comparable technical efficiencies for firms operating under different technologies. The model also enables the technology gaps to be estimated for firms under different technologies relative to the potential technology available to the industry as a whole. The metafrontier model is applied in the analysis of panel data on garment firms in five different regions of Indonesia, assuming that the regional stochastic frontier production function models have technical inefficiency effects with the time-varying structure proposed by Battese and Coelli (1992).
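For readers unfamiliar with the pieces involved, a sketch of the standard formulation (not reproduced from the paper): the Battese and Coelli (1992) specification models firm output with a time-varying inefficiency term,
\[ Y_{it} = f(x_{it};\beta)\,\exp(V_{it} - U_{it}), \qquad U_{it} = \exp[-\eta\,(t - T)]\;U_i, \]
where \(V_{it}\) is random noise and \(U_i \ge 0\) is firm-specific inefficiency. The metafrontier then decomposes efficiency relative to the industry-wide frontier as
\[ TE^{*}_{it} = TE^{k}_{it} \times TGR^{k}_{it}, \]
where \(TGR^{k}_{it} \le 1\) is the technology gap ratio of group \(k\): the quantity the abstract refers to as the estimable technology gap.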
Abstract:
Background Burns and scalds are a significant cause of morbidity and mortality in children. Successful counter-measures to prevent burn and scald-related injury have been identified. However, evidence indicating the successful roll-out of these counter-measures into the wider community is lacking. Community-based interventions in the form of multi-strategy, multi-focused programmes are hypothesised to result in a reduction in population-wide injury rates. This review tests this hypothesis with regard to burn and scald injury in children. Objectives To assess the effects of community-based interventions, defined as coordinated, multi-strategy initiatives, for reducing burns and scalds in children aged 14 years and under. Search strategy We searched the Cochrane Injuries Group's specialised register, CENTRAL, MEDLINE, EMBASE, CINAHL, PsycINFO, the National Research Register and the Web of Knowledge. We also handsearched selected journals and checked the reference lists of selected publications. The searches were last updated in May 2007. Selection criteria Included studies were those that reported changes in medically attended burn and scald-related injury rates in a paediatric population (aged 14 years and under), following the implementation of a controlled community-based intervention. Data collection and analysis Two authors independently assessed studies for eligibility and extracted data. Due to heterogeneity between the included studies, a pooled analysis was not appropriate. Main results Of 39 identified studies, four met the criteria for inclusion. Two of the included studies reported a significant decrease in paediatric burn and scald injury in the intervention communities compared with the control communities. The failure of the other two studies to show a positive result may have been due to the limited time frame of the intervention and/or failure to implement the counter-measures adequately in the communities. Authors' conclusions Very few research studies allow conclusions to be drawn about the effectiveness of community-based injury prevention programmes in preventing burns and scalds in children. There is a pressing need to evaluate high-quality community-based intervention programmes based on efficacious counter-measures to reduce burns and scalds in children. It is important that a framework for considering the problem of burns and scalds in children from a prevention perspective be articulated, and that an evidence-based suite of interventions be combined to create programme guidelines suitable for implementation in communities throughout the world.
Abstract:
Background: One of the major immediate and long-term health issues in modern society is the problem of overweight and obesity. This paper examines the role of the workplace in the problem by studying the association between occupational sitting time and overweight and obesity (body mass index [BMI] >= 25) in a sample of adult Australians in full-time employment. Methods: Data on age, gender, occupation, physical activity, occupational sitting time, and BMI were collected in September 2003 from a sample of 1579 adult men and women in full-time employment at the time of the survey. Logistic regression was used to examine the association between occupational sitting time and overweight and obesity. Results: Mean occupational sitting time was > 3 hours/day, and significantly higher in men (209 minutes) than in women (189 minutes, p = 0.026). Univariate analyses showed significant associations between occupational sitting time and BMI >= 25 in men but not in women. After adjusting for age, occupation, and physical activity, the odds ratio for BMI >= 25 was 1.92 (confidence interval: 1.17-3.17) in men who reported sitting for > 6 hours/day, compared with those who sat for < 45 minutes/day. Conclusions: Occupational sitting time was independently associated with overweight and obesity in men in full-time paid work. These results suggest that the workplace may play an important role in the growing problem of overweight and obesity. Further research is needed to clearly understand the association between sitting time at work and overweight and obesity in women.
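As a minimal sketch of the kind of adjusted analysis described (not the authors' code; the variable names, categories, and synthetic data are assumptions), a logistic regression with statsmodels might look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey data (hypothetical variable names)
rng = np.random.default_rng(0)
n = 1579
df = pd.DataFrame({
    "overweight": rng.integers(0, 2, n),      # 1 if BMI >= 25
    "sitting_min": rng.integers(0, 480, n),   # occupational sitting, min/day
    "age": rng.integers(18, 65, n),
    "occupation": rng.choice(["blue", "white"], n),
    "activity": rng.normal(0, 1, n),          # physical activity score
})
df["sitting_cat"] = pd.cut(df["sitting_min"], [0, 45, 360, 480],
                           labels=["<45", "45-360", ">360"],
                           include_lowest=True)

# Adjusted model: odds of BMI >= 25 by sitting-time category
fit = smf.logit("overweight ~ C(sitting_cat) + age + C(occupation) + activity",
                data=df).fit(disp=0)
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```

Exponentiating the fitted coefficients turns log-odds into the odds ratios reported in the abstract.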
Abstract:
We compared growth rates of the lemon shark, Negaprion brevirostris, from Bimini, Bahamas and the Marquesas Keys (MK), Florida using data obtained in a multi-year annual census. We marked new neonate and juvenile sharks with unique electronic identity tags in Bimini, and tagged neonate and juvenile sharks in the MK. Sharks were tagged with tiny, subcutaneous transponders, a type of tagging thought to cause little, if any, disruption to normal growth patterns when compared to conventional external tagging. Within the first 2 years of this project, no age data were recorded for sharks caught for the first time in Bimini. Therefore, we applied and tested two methods of age analysis: (1) a modified 'minimum convex polygon' method and (2) a new age-assigning method, the 'cut-off technique'. The cut-off technique proved to be the more suitable one, enabling us to identify the age of 134 of the 642 sharks of previously unknown age. This maximised the usable growth data included in our analysis. Annual absolute growth rates of juvenile, nursery-bound lemon sharks were almost constant for the two Bimini nurseries and can best be described by a simple linear model (growth data were only available for age-0 sharks in the MK). Annual absolute growth for age-0 sharks was much greater in the MK than in either the North Sound (NS) or Shark Land (SL) at Bimini. Growth of SL sharks was significantly faster during the first 2 years of life than that of sharks in the NS population. However, in the MK, only growth in the first year was considered to be reliably estimated, due to low recapture rates. Analyses indicated no significant differences in growth rates between males and females for any area.
Abstract:
We demonstrate a portable process for developing a triple bottom line model to measure the knowledge production performance of individual research centres. For the first time, this study also empirically illustrates how a fully units-invariant model of Data Envelopment Analysis (DEA) can be used to measure the relative efficiency of research centres by capturing the interaction amongst a common set of multiple inputs and outputs. This study is particularly timely given the increasing transparency required by governments and industries that fund research activities. The process highlights the links between organisational objectives, desired outcomes and outputs while the emerging performance model represents an executive managerial view. This study brings consistency to current measures that often rely on ratios and univariate analyses that are not otherwise conducive to relative performance analysis.
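The abstract does not specify which DEA variant was used; as one common units-invariant formulation, here is a minimal sketch of the input-oriented CCR envelopment linear program with scipy, on invented inputs and outputs:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: columns are research centres (DMUs)
X = np.array([[5.0, 8.0, 6.0],     # input 1 (e.g., funding)
              [3.0, 2.0, 4.0]])    # input 2 (e.g., staff)
Y = np.array([[10.0, 9.0, 12.0]])  # output (e.g., publications)
m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(o):
    # Variables z = [theta, lambda_1..lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    # Inputs: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun  # efficiency score in (0, 1]

print([round(ccr_efficiency(o), 3) for o in range(n)])
```

A score of 1 marks a centre on the efficient frontier; lower scores quantify how far each centre sits inside it given the peer set.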
Abstract:
The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Quickly obtaining the appropriate data increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries.
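The three Halstead metrics named here have standard definitions in terms of operator and operand counts; a minimal sketch (the counts below are invented, for illustration only):

```python
from math import log2

def halstead(n1, n2, N1, N2):
    """n1/n2: distinct operators/operands; N1/N2: total occurrences."""
    length = N1 + N2                   # program length N
    vocab = n1 + n2                    # vocabulary n
    volume = length * log2(vocab)      # V = N * log2(n)
    difficulty = (n1 / 2) * (N2 / n2)  # D = (n1/2) * (N2/n2)
    effort = difficulty * volume       # E = D * V
    return length, difficulty, effort

# Invented counts for one query, for illustration only
print(halstead(n1=9, n2=12, N1=20, N2=25))
```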
Abstract:
Electricity market price forecasting is a challenging yet very important task for electricity market managers and participants. Due to the complexity and uncertainties in the power grid, electricity prices are highly volatile and normally carry spikes, which may be tens or even hundreds of times higher than the normal price. Such electricity price spikes are very difficult to predict. So far, most research on electricity price forecasting has been based on normal-range electricity prices. This paper proposes a data mining based electricity price forecast framework, which can predict the normal price as well as price spikes. The normal price can be predicted by a previously proposed wavelet and neural network based forecast model, while the spikes are forecast using a data mining approach. This paper focuses on spike prediction and explores the reasons for price spikes based on the measurement of a proposed composite supply-demand balance index (SDI) and a relative demand index (RDI). These indices are able to reflect the relationship among electricity demand, electricity supply and electricity reserve capacity. The proposed model is based on a mining database including market clearing price, trading hour, electricity demand, electricity supply and reserve. Bayesian classification and similarity searching techniques are used to mine the database to find the internal relationships between electricity price spikes and these proposed indices. The mining results are used to form the price spike forecast model. This proposed model is able to generate the forecasted price spike, the level of the spike and an associated forecast confidence level. The model is tested with Queensland electricity market data, with promising results.
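The paper's exact mining procedure is not reproduced here, but the Bayesian-classification step it describes can be sketched generically: classify each trading hour as spike or non-spike from the SDI and RDI features (index names from the abstract; the data and the choice of Gaussian naive Bayes are assumptions):

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

# Synthetic stand-in: SDI/RDI per trading hour, spike label (1 = price spike)
rng = np.random.default_rng(1)
n = 2000
sdi = rng.normal(0.0, 1.0, n)  # composite supply-demand balance index
rdi = rng.normal(0.0, 1.0, n)  # relative demand index
spike = ((sdi + rdi + rng.normal(0, 0.5, n)) > 2.0).astype(int)

X = np.column_stack([sdi, rdi])
X_tr, X_te, y_tr, y_te = train_test_split(X, spike, random_state=0)
clf = GaussianNB().fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]  # per-hour spike probability
print("accuracy:", clf.score(X_te, y_te))
```

The predicted class probability plays the role of the forecast confidence level the abstract mentions.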
Abstract:
This paper presents a finite-difference time-domain (FDTD) simulator for electromagnetic analysis and design applications in MRI. It is intended to be a complete FDTD model of an MRI system, including all RF and low-frequency field generating units and electrical models of the patient. The program has been constructed in an object-oriented framework. The design procedure is detailed, and the numerical solver has been verified against analytical solutions for simple cases and also applied to various field calculation problems. In particular, the simulator is demonstrated for inverse RF coil design, optimized source profile generation, and parallel imaging in high-frequency situations. The examples show new developments enabled by the simulator and demonstrate that the proposed FDTD framework can be used to analyze large-scale computational electromagnetic problems in modern MRI engineering.
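A full MRI-scale solver is far beyond a snippet, but the core Yee leapfrog update that any FDTD code iterates can be sketched in 1D (normalized units; the grid size, Courant number, and source are assumptions):

```python
import numpy as np

nx, nt, courant = 200, 400, 0.5
ez = np.zeros(nx)  # electric field
hy = np.zeros(nx)  # magnetic field
for t in range(nt):
    # Update H from the spatial derivative (curl) of E
    hy[:-1] += courant * (ez[1:] - ez[:-1])
    # Update E from the spatial derivative (curl) of H
    ez[1:] += courant * (hy[1:] - hy[:-1])
    # Soft Gaussian source injected mid-grid
    ez[nx // 2] += np.exp(-((t - 30) / 10.0) ** 2)
print("peak |Ez| after propagation:", np.abs(ez).max())
```

The staggered E/H updates are what make the scheme explicit and second-order accurate; the Courant number of 0.5 keeps it stable.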
Abstract:
Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. The lack of a standardized measurement framework to permit comparisons across diseases and injuries, as well as risk factors, and the failure to systematically evaluate data quality have impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence, the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) which simultaneously accounted for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease/injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and region, of over 100 diseases and injuries. National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
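For reference, the common metric mentioned, the Disability-Adjusted Life Year, combines fatal and non-fatal burden; in its simplest undiscounted, unweighted form:
\[ \mathrm{DALY} = \mathrm{YLL} + \mathrm{YLD}, \qquad \mathrm{YLL} = N \times L, \qquad \mathrm{YLD} = I \times DW \times L_d, \]
where \(N\) is the number of deaths, \(L\) the standard life expectancy at the age of death, \(I\) the number of incident cases, \(DW\) a disability weight between 0 and 1, and \(L_d\) the average duration of disability. The original Study additionally applied age weighting and a 3% time discount rate.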
Abstract:
Objective Comparisons of the changing patterns of inequalities in occupational mortality provide one way to monitor the achievement of equity goals. However, previous comparisons have not corrected for numerator/denominator bias, which is a consequence of the different ways in which occupational details are recorded on death certificates and on census forms. The objective of this study was to measure the impact of this bias on mortality rates and ratios over time. Methods Using data provided by the Australian Bureau of Statistics, we examined the evidence for bias over the period 1981-2002, and used imputation methods to adjust for this bias. We compared unadjusted with imputed rates of mortality for manual/non-manual workers. Findings Unadjusted data indicate increasing inequality in the age-adjusted rates of mortality for manual/non-manual workers during 1981-2002. Imputed data suggest that there have been modest fluctuations in the ratios of mortality for manual/non-manual workers during this time, but with evidence that inequalities have increased only in recent years and are now at historic highs. Conclusion We found that imputation for missing data leads to changes in estimates of inequalities related to social class in mortality for some years but not for others. Occupational class comparisons should be imputed or otherwise adjusted for missing data on census forms or death certificates.
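The study's actual imputation model is not shown here; as one simple stand-in, a hot-deck-style sketch that fills missing occupational class by sampling the observed distribution before computing rate ratios (all data invented):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 10_000
df = pd.DataFrame({
    "died": rng.integers(0, 2, n),
    "occ_class": rng.choice(["manual", "non-manual", None], n,
                            p=[0.45, 0.45, 0.10]),  # None = missing on record
})

# Hot-deck imputation: fill missing class by sampling the observed distribution
observed = df["occ_class"].dropna().to_numpy()
missing = df["occ_class"].isna()
df.loc[missing, "occ_class"] = rng.choice(observed, missing.sum())

# Mortality rate ratio, manual vs non-manual, after imputation
rates = df.groupby("occ_class")["died"].mean()
print("rate ratio:", rates["manual"] / rates["non-manual"])
```

Comparing this ratio with the one computed after simply dropping incomplete records is the kind of unadjusted-versus-imputed contrast the abstract reports.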
Abstract:
We construct the Drinfeld twists (or factorizing F-matrices) of the supersymmetric model associated with the quantum superalgebra U_q(gl(m|n)), and obtain the completely symmetric representations of the creation operators of the model in the F-basis provided by the F-matrix. As an application of our general results, we present the explicit expressions of the Bethe vectors in the F-basis for the U_q(gl(2|1)) model (the quantum t-J model).
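For context, and not taken from the paper itself: a factorizing F-matrix in the sense of Maillet and Sanchez de Santos is an invertible twist \(F\) for which the R-matrix factorizes as
\[ R_{12}(u_1, u_2) = F_{21}^{-1}(u_2, u_1)\, F_{12}(u_1, u_2), \]
and working in the basis generated by \(F\) (the F-basis) is what renders the creation operators completely symmetric, as described above.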
Abstract:
Transcriptional regulatory networks govern cell differentiation and the cellular response to external stimuli. However, mammalian model systems have not yet been accessible for network analysis. Here, we present a genome-wide network analysis of the transcriptional regulation underlying the mouse macrophage response to bacterial lipopolysaccharide (LPS). Key to uncovering the network structure is our combination of time-series cap analysis of gene expression with in silico prediction of transcription factor binding sites. By integrating microarray and qPCR time-series expression data with a promoter analysis, we find dynamic subnetworks that describe how signaling pathways change dynamically during the progress of the macrophage LPS response, thus defining regulatory modules characteristic of the inflammatory response. In particular, our integrative analysis enabled us to suggest novel roles for the transcription factors ATF-3 and NRF-2 during the inflammatory response. We believe that the systems approach presented here is applicable to understanding cellular differentiation in higher eukaryotes.
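In silico prediction of transcription factor binding sites typically means scanning promoter sequences with a position weight matrix; a minimal log-odds scan, with an invented 4-position motif (not a motif from the paper):

```python
import numpy as np

# Invented 4-position motif; rows = positions, cols = A, C, G, T
counts = np.array([[8, 1, 1, 1],
                   [1, 1, 8, 1],
                   [1, 8, 1, 1],
                   [8, 1, 1, 1]], dtype=float)
# Log-odds versus a uniform background of 0.25 per base
pwm = np.log2((counts / counts.sum(axis=1, keepdims=True)) / 0.25)
IDX = {"A": 0, "C": 1, "G": 2, "T": 3}

def scan(seq, pwm, threshold=3.0):
    """Return (position, score) for windows scoring above threshold."""
    w = pwm.shape[0]
    hits = []
    for i in range(len(seq) - w + 1):
        score = sum(pwm[j, IDX[seq[i + j]]] for j in range(w))
        if score >= threshold:
            hits.append((i, round(score, 2)))
    return hits

print(scan("TTAGCAGGCATT", pwm))
```

Windows scoring above the threshold are the predicted binding sites that get intersected with the expression time series.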
Abstract:
A set of techniques referred to as circular statistics has been developed for the analysis of directional and orientational data. The unit of measure for such data is angular (usually in either degrees or radians), and the statistical distributions underlying the techniques are characterised by their cyclic nature: for example, angles of 359.9 degrees are considered close to angles of 0 degrees. In this paper, we assert that such approaches can be easily adapted to analyse time-of-day and time-of-week data, and in particular daily cycles in the numbers of incidents reported to the police. We begin the paper by describing circular statistics. We then discuss how these may be modified, and demonstrate the approach with some examples for reported incidents in the Cardiff area of Wales.
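The adaptation described amounts to mapping clock time onto the circle and using angular summaries; a minimal sketch (the incident times are invented):

```python
import numpy as np

# Invented incident times of day, in minutes after midnight
minutes = np.array([23*60 + 50, 10, 60 + 5, 22*60 + 30, 23*60 + 15])
theta = 2 * np.pi * minutes / 1440.0  # map 24 h onto [0, 2*pi)

# Circular mean direction and resultant length (concentration measure)
S, C = np.sin(theta).mean(), np.cos(theta).mean()
mean_angle = np.arctan2(S, C) % (2 * np.pi)
R = np.hypot(S, C)  # near 1 = tightly clustered, near 0 = dispersed

mean_minutes = mean_angle / (2 * np.pi) * 1440
print(f"circular mean = {int(mean_minutes // 60):02d}:"
      f"{int(mean_minutes % 60):02d}, R = {R:.2f}")
```

For the clustered-around-midnight times above, the circular mean lands near 00:00, whereas a naive arithmetic mean of the raw minutes would misleadingly land in mid-afternoon.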
Abstract:
Intraventricular dyssynchrony has prognostic implications in patients who have severe functional limitation and decreased ejection fraction. Patients with less advanced cardiac disease often exhibit intraventricular dyssynchrony, but there is little available information about its prognostic relevance in such patients. We investigated the prognostic effect of intraventricular dyssynchrony on outcome in 318 patients with known or suspected coronary artery disease who were classified according to the presence or absence of left ventricular dysfunction and heart failure symptoms. Mortality was considered the primary end point over a median follow-up of 56 months, and a Cox proportional hazards model was used for survival analysis. Despite a low prevalence (8%) of left bundle branch block, there was a high prevalence of intraventricular dyssynchrony, even in patients without symptomatic heart failure. The magnitude of intraventricular dyssynchrony correlated poorly with QRS duration (r = 0.25), end-systolic volume index (r = 0.27), and number of scar segments (r = 0.25). There were 58 deaths during follow-up. Ventricular volume, ischemic burden, and magnitude of intraventricular dyssynchrony predicted outcome, but the magnitude of intraventricular dyssynchrony was an independent predictor of survival only in patients with asymptomatic left ventricular dysfunction. In conclusion, patients with known or suspected coronary artery disease have a high prevalence of intraventricular dyssynchrony. Although ventricular volume, ischemic burden, and intraventricular dyssynchrony are potentially important prognostic markers, the relative importance of intraventricular dyssynchrony changes with the clinical setting and may be greatest in patients with preclinical disease.
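The survival analysis described maps onto a standard Cox proportional hazards fit; a sketch with the lifelines package on invented data (the variable names and distributions are assumptions, not the study's dataset):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 318
df = pd.DataFrame({
    "months": rng.exponential(56, n),        # follow-up time
    "death": rng.integers(0, 2, n),          # 1 = died during follow-up
    "esv_index": rng.normal(40, 12, n),      # end-systolic volume index
    "ischemic_burden": rng.normal(10, 5, n),
    "dyssynchrony": rng.normal(100, 30, n),  # magnitude, ms
})

# Fit hazard of death as a function of the three candidate predictors
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death")
cph.print_summary()  # hazard ratios with confidence intervals
```

A covariate whose hazard ratio's confidence interval excludes 1 after adjustment is an independent predictor in the sense used by the abstract.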