31 results for Data-driven analysis


Relevance: 80.00%

Abstract:

We present a novel nonparametric density estimator and a new data-driven bandwidth selection method with excellent properties. The approach is inspired by the principles of the generalized cross entropy method. The proposed density estimation procedure has numerous advantages over the traditional kernel density estimator methods. Firstly, for the first time in the nonparametric literature, the proposed estimator allows for a genuine incorporation of prior information in the density estimation procedure. Secondly, the approach provides the first data-driven bandwidth selection method that is guaranteed to provide a unique bandwidth for any data. Lastly, simulation examples suggest the proposed approach outperforms the current state of the art in nonparametric density estimation in terms of accuracy and reliability.
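
As a rough illustration of what a data-driven bandwidth selector does (this is a generic cross-validation approach, not the generalized cross entropy method proposed in the paper), a minimal scikit-learn sketch follows; the sample data and bandwidth grid are arbitrary assumptions.

```python
# Illustrative only: cross-validated bandwidth selection for a Gaussian KDE.
# This is a generic data-driven selector, not the paper's method; the
# bandwidth grid and toy sample are arbitrary assumptions.
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=1.0, size=(500, 1))  # toy sample

grid = GridSearchCV(
    KernelDensity(kernel="gaussian"),
    {"bandwidth": np.linspace(0.05, 1.0, 20)},
    cv=5,  # bandwidth chosen by 5-fold cross-validated log-likelihood
)
grid.fit(x)
best_h = grid.best_params_["bandwidth"]

kde = KernelDensity(kernel="gaussian", bandwidth=best_h).fit(x)
density_at_zero = np.exp(kde.score_samples([[0.0]]))[0]
print(f"selected bandwidth: {best_h:.3f}, density at 0: {density_at_zero:.3f}")
```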

Relevance: 80.00%

Abstract:

In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated data approach. As such they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.
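
A minimal sketch of the replicated-data style of parallelism described above, using mpi4py: every rank holds a full copy of the coordinates, computes a cyclic share of a pairwise sum, and the partial results are combined with an allreduce. The pair term and data are placeholders, not code from any of the packages discussed.

```python
# Sketch of replicated-data parallelism: every process holds all the data,
# work is split by index, and partial results are summed across processes.
# The pair energy and coordinates are placeholders.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

coords = None
if rank == 0:
    rng = np.random.default_rng(0)
    coords = rng.random((200, 3))            # toy coordinates (placeholder data)
coords = comm.bcast(coords, root=0)          # replicate the full data set on every rank
n = len(coords)

partial = 0.0
for i in range(rank, n, size):               # cyclic distribution of the outer loop
    for j in range(i + 1, n):
        r = np.linalg.norm(coords[i] - coords[j])
        partial += 1.0 / r                   # placeholder pair term

total = comm.allreduce(partial, op=MPI.SUM)  # combine the partial sums
if rank == 0:
    print("total pairwise energy:", total)
```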

Relevance: 80.00%

Abstract:

Background: The importance of serum triglyceride levels as a risk factor for cardiovascular diseases is uncertain. Methods and Results: We performed an individual participant data meta-analysis of prospective studies conducted in the Asia-Pacific region. Cox models were applied to the combined data from 26 studies to estimate the overall and region-, sex-, and age-specific hazard ratios for major cardiovascular diseases by fifths of triglyceride values. During 796 671 person-years of follow-up among 96 224 individuals, 670 and 667 deaths as a result of coronary heart disease (CHD) and stroke, respectively, were recorded. After adjustment for major cardiovascular risk factors, participants grouped in the highest fifth of triglyceride levels had a 70% (95% CI, 47% to 96%) greater risk of CHD death, an 80% (95% CI, 49% to 119%) higher risk of fatal or nonfatal CHD, and a 50% (95% CI, 29% to 76%) increased risk of fatal or nonfatal stroke compared with those belonging to the lowest fifth. The association between triglycerides and CHD death was similar across subgroups defined by ethnicity, age, and sex. Conclusions: Serum triglycerides are an important and independent predictor of CHD and stroke risk in the Asia-Pacific region. These results may have clinical implications for cardiovascular risk prediction and the use of lipid-lowering therapy.
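
A hedged sketch of the kind of model described (Cox regression of CHD death on fifths of triglyceride values), using the lifelines package on synthetic stand-in data; the column names and adjustment covariates are illustrative assumptions, not the study's variables.

```python
# Illustrative Cox regression by fifths of triglycerides (lifelines).
# Data, column names and covariates are synthetic stand-ins.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "tg": rng.lognormal(0.3, 0.5, n),      # triglycerides (arbitrary units)
    "age": rng.normal(55, 10, n),
    "sex": rng.integers(0, 2, n),
    "time": rng.exponential(10, n),        # follow-up time
    "chd_death": rng.integers(0, 2, n),    # event indicator
})

# Group participants into fifths of triglyceride values; lowest fifth is the reference.
df["tg_fifth"] = pd.qcut(df["tg"], 5, labels=False)
dummies = pd.get_dummies(df["tg_fifth"], prefix="tg_q", drop_first=True).astype(float)
model_df = pd.concat([df[["time", "chd_death", "age", "sex"]], dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time", event_col="chd_death")
print(np.exp(cph.params_))   # hazard ratios for fifths 2-5 versus the lowest fifth
```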

Relevance: 80.00%

Abstract:

PURPOSE: Many guidelines advocate measurement of total or low density lipoprotein cholesterol (LDL), high density lipoprotein cholesterol (HDL), and triglycerides (TG) to determine treatment recommendations for preventing coronary heart disease (CHD) and cardiovascular disease (CVD). This analysis is a comparison of lipid variables as predictors of cardiovascular disease. METHODS: Hazard ratios for coronary and cardiovascular deaths by fourths of total cholesterol (TC), LDL, HDL, TG, non-HDL, TC/HDL, and TG/HDL values, and for a one standard deviation change in these variables, were derived in an individual participant data meta-analysis of 32 cohort studies conducted in the Asia-Pacific region. The predictive value of each lipid variable was assessed using the likelihood ratio statistic. RESULTS: Adjusting for confounders and regression dilution, each lipid variable had a positive (negative for HDL) log-linear association with fatal CHD and CVD. Individuals in the highest fourth of each lipid variable had approximately twice the risk of CHD compared with those with lowest levels. TG and HDL were each better predictors of CHD and CVD risk compared with TC alone, with test statistics similar to TC/HDL and TG/HDL ratios. Calculated LDL was a relatively poor predictor. CONCLUSIONS: While LDL reduction remains the main target of intervention for lipid-lowering, these data support the potential use of TG or lipid ratios for CHD risk prediction. (c) 2005 Elsevier Inc. All rights reserved.
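
The likelihood ratio comparison described above can be sketched as a test between nested Cox models, one with baseline covariates only and one adding a lipid variable; the code below uses lifelines and scipy on synthetic stand-in data, with all column names assumed.

```python
# Sketch: assessing the added predictive value of a lipid variable with a
# likelihood ratio statistic between nested Cox models. Synthetic data only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from scipy.stats import chi2

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(55, 10, n),
    "sex": rng.integers(0, 2, n),
    "tg": rng.lognormal(0.3, 0.5, n),
    "time": rng.exponential(10, n),
    "chd_death": rng.integers(0, 2, n),
})

base = ["time", "chd_death", "age", "sex"]
m0 = CoxPHFitter().fit(df[base], duration_col="time", event_col="chd_death")
m1 = CoxPHFitter().fit(df[base + ["tg"]], duration_col="time", event_col="chd_death")

lr_stat = 2 * (m1.log_likelihood_ - m0.log_likelihood_)  # likelihood ratio statistic
p_value = chi2.sf(lr_stat, df=1)                          # one extra parameter
print(f"LR statistic = {lr_stat:.2f}, p = {p_value:.3g}")
```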

Relevance: 80.00%

Abstract:

In this study we use region-level panel data on rice production in Vietnam to investigate total factor productivity (TFP) growth in the period since reunification in 1975. Two significant reforms were introduced during this period, one in 1981 allowing farmers to keep part of their produce, and another in 1987 providing improved land tenure. We measure TFP growth using two modified forms of the standard Malmquist data envelopment analysis (DEA) method, which we have named the Three-year-window (TYW) and the Full Cumulative (FC) methods. We have developed these methods to deal with degrees-of-freedom limitations. Our empirical results indicate strong average TFP growth of between 3.3 and 3.5 per cent per annum, with the fastest growth observed in the period following the first reform. Our results support the assertion that incentive-related issues have played a large role in the decline and subsequent resurgence of Vietnamese agriculture.
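
For readers less familiar with the Malmquist DEA approach used here, the standard index between periods t and t+1 is the geometric mean of two distance-function ratios; a minimal statement of this textbook formula follows (the TYW and FC modifications change which observations form the reference technology, not the index itself).

```latex
% Output-oriented Malmquist productivity index between periods t and t+1,
% where D^{s}(x,y) is the output distance function relative to the period-s technology.
M_{t,t+1}
  = \left[
      \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
      \cdot
      \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
    \right]^{1/2}
% M_{t,t+1} > 1 indicates TFP growth between the two periods.
```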

Relevance: 80.00%

Abstract:

There is a pressing need to address productivity analysis in the hospitality industry if hotels are to exist as sustainable business entities in rapidly maturing markets. Unfortunately, the productivity ratios commonly used by managers are narrowly defined. This study illustrates a data envelopment analysis of cross-sectional data that benchmarks hotels against observed best performance. Data envelopment analysis enables management to integrate multiple unlike inputs and outputs and to make simultaneous comparisons. Findings from the cross-sectional data suggest that some of the hotels have the potential to reduce the number of beds and part-time staff while increasing revenue.
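
A minimal sketch of the input-oriented, constant-returns DEA model that this kind of benchmarking rests on, solved as a linear program with scipy; the input and output matrices are placeholders, not the hotel data analysed in the study.

```python
# Input-oriented CRS (CCR) DEA efficiency for one unit, as a linear program:
#   minimise theta  s.t.  Y'lam >= y_k,  X'lam <= theta * x_k,  lam >= 0.
# The data matrices are placeholders, not the study's hotel data.
import numpy as np
from scipy.optimize import linprog

# rows = units; columns = inputs (e.g. beds, part-time staff) and outputs (e.g. revenue)
X = np.array([[100.0, 20.0], [120.0, 30.0], [80.0, 25.0], [150.0, 40.0]])
Y = np.array([[50.0], [55.0], [48.0], [60.0]])

def dea_efficiency(k):
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                       # variables: [theta, lambda_1..lambda_n]
    A_out = np.c_[np.zeros((s, 1)), -Y.T]             # -Y'lam <= -y_k  (outputs at least y_k)
    b_out = -Y[k]
    A_in = np.c_[-X[k][:, None], X.T]                 # X'lam - theta*x_k <= 0
    b_in = np.zeros(m)
    res = linprog(c, A_ub=np.vstack([A_out, A_in]), b_ub=np.r_[b_out, b_in],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun                                    # efficiency score in (0, 1]

for k in range(len(X)):
    print(f"unit {k}: efficiency = {dea_efficiency(k):.3f}")
```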

Relevance: 80.00%

Abstract:

The reasons for the spectacular collapse of so many centrally-planned economies are a source of ongoing debate. In this paper, we use detailed farm-level data to measure total factor productivity (TFP) changes in Mongolian grain and potato farming during the 14-year period immediately preceding the 1990 economic reforms. We measure TFP growth using stochastic frontier analysis (SFA) and data envelopment analysis (DEA) methods. Our results indicate quite poor overall performance, with an average annual TFP change of -1.7% in grain and 0.8% in potatoes over the 14-year period. However, the pattern of TFP growth changed substantially during this period, with TFP growth exceeding 7% per year in the latter half of the period. This suggests that the new policies of improved education, greater management autonomy, and improved incentives, which were introduced in the final two planning periods in the 1980s, were beginning to have a significant influence upon the performance of Mongolian crop farming. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
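
The stochastic frontier analysis (SFA) referred to above is conventionally specified as a log-linear production function with a composed error term; a minimal statement of the standard cross-sectional model follows (the half-normal inefficiency distribution is one common choice, and the panel versions used for TFP measurement extend this basic structure).

```latex
% Standard stochastic production frontier (Aigner-Lovell-Schmidt form):
% symmetric noise v_i and one-sided inefficiency u_i enter the error jointly.
\ln y_i = \beta_0 + \sum_{k} \beta_k \ln x_{ki} + v_i - u_i,
\qquad v_i \sim N\!\left(0, \sigma_v^2\right),
\qquad u_i \sim N^{+}\!\left(0, \sigma_u^2\right)
% Technical efficiency of farm i: TE_i = \exp(-u_i) \in (0, 1].
```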

Relevance: 80.00%

Abstract:

Whereas terrestrial animal populations might show genetic connectivity within a continent, marine species, such as hermatypic corals, may have connectivity stretching to all corners of the planet. We quantified the genetic variability within and among populations of the widespread scleractinian coral Plesiastrea versipora along the eastern Australian seaboard (4145 km) and the Ryukyu Archipelago (Japan, 681 km) using sequences of internal transcribed spacers (ITS1-2) from ribosomal DNA. Geographic patterns in genetic variability were deduced from a nested clade analysis (NCA) performed on a haplotype parsimony network. This analysis allowed the establishment of geographical associations in the distribution of haplotypes within the network cladogram, thereby allowing us to deduce phylogeographical patterns under models of restricted gene flow, fragmentation and range expansion. No significant structure was found among Ryukyu Archipelago populations. The lack of an association between the positions of haplotypes in the cladogram and the geographical location of these populations may be accounted for by a high level of gene flow of P. versipora within this region, probably due to the strong Kuroshio Current. In contrast, strong geographical associations were apparent among populations of P. versipora along the south-east coast of Australia. This pattern of restricted genetic connectivity among populations of P. versipora on the eastern seaboard of Australia seems to be associated with the present surface ocean current (the East Australian Current) on this side of the south-western Pacific Ocean.

Relevance: 80.00%

Abstract:

Background: The OARSI Standing Committee for Clinical Trials Response Criteria Initiative had developed two sets of responder criteria to present the results of changes after treatment in three symptomatic domains (pain, function, and patient's global assessment) as a single variable for clinical trials (1). For each domain, a response was defined by both a relative and an absolute change, with different cut-offs with regard to the drug, the route of administration and the OA localization. Objective: To propose a simplified set of responder criteria with a similar cut-off, whatever the drug, the route or the OA localization. Methods: Data-driven approach: (1) Two databases were considered: the 'elaboration' database, with which the formal OARSI sets of responder criteria were elaborated, and the 'revisit' database. (2) Six different scenarios were evaluated: the two formal OARSI sets of criteria and four proposed scenarios of simplified sets of criteria. Data from clinical randomized blinded placebo-controlled trials were used to evaluate the performances of the two formal scenarios with two different databases ('elaboration' versus 'revisit') and those of the four proposed simplified scenarios within the 'revisit' database. The placebo effect, active effect, treatment effect, and the required sample arm size to obtain the observed placebo and active treatment effects were the performances evaluated for each of the six scenarios. Experts' opinion approach: Results were discussed among the participants of the OMERACT VI meeting, who voted to select the definitive OMERACT-OARSI set of criteria (one of the six evaluated scenarios). Results: Data-driven approach: Fourteen trials totaling 1886 OA patients and fifteen studies involving 8164 OA patients were evaluated in the 'elaboration' and the 'revisit' databases, respectively. The variability of the performances observed in the 'revisit' database when using the different simplified scenarios was similar to that observed between the two databases ('elaboration' versus 'revisit') when using the formal scenarios. The treatment effect and the required sample arm size were similar for each set of criteria. Experts' opinion approach: According to the experts, these two performances were the most important for an optimal set of responder criteria. They chose the set of criteria considering both pain and function as evaluation domains and requiring an absolute change and a relative change from baseline to define a response, with similar cut-offs whatever the drug, the route of administration or the OA localization. Conclusion: This data-driven and experts' opinion approach is the basis for proposing an optimal simplified set of responder criteria for OA clinical trials. Other studies, using other sets of OA patients, are required in order to further validate this proposed OMERACT-OARSI set of criteria. (C) 2004 OsteoArthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
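
To make the structure of such absolute-plus-relative responder criteria concrete, the sketch below classifies a patient as a responder when pain or function improves by at least a relative and an absolute cut-off; the thresholds, scales, and function names are illustrative placeholders, not the cut-offs chosen by the OMERACT-OARSI experts.

```python
# Illustrative responder classification on pain and function only.
# REL_CUT and ABS_CUT are placeholder thresholds on assumed 0-100 scales,
# not the cut-offs adopted in the OMERACT-OARSI criteria.
REL_CUT = 0.50   # hypothetical relative improvement cut-off (50%)
ABS_CUT = 20.0   # hypothetical absolute improvement cut-off (scale points)

def improved(baseline: float, follow_up: float) -> bool:
    """Improvement must exceed both the relative and the absolute cut-off."""
    change = baseline - follow_up               # lower scores = better
    return baseline > 0 and change >= ABS_CUT and change / baseline >= REL_CUT

def responder(pain0, pain1, func0, func1) -> bool:
    """Single response variable: sufficient improvement in pain or in function."""
    return improved(pain0, pain1) or improved(func0, func1)

print(responder(pain0=60, pain1=25, func0=55, func1=50))  # True: pain improved enough
```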

Relevance: 80.00%

Abstract:

Genetic diversity and population structure were investigated across the core range of Tasmanian devils (Sarcophilus laniarius; Dasyuridae), a wide-ranging marsupial carnivore restricted to the island of Tasmania. Heterozygosity (0.386-0.467) and allelic diversity (2.7-3.3) were low in all subpopulations and allelic size ranges were small and almost continuous, consistent with a founder effect. Island effects and repeated periods of low population density may also have contributed to the low variation. Within continuous habitat, gene flow appears extensive up to 50 km (high assignment rates to source or close neighbour populations; nonsignificant values of pairwise F-ST), in agreement with movement data. At larger scales (150-250 km), gene flow is reduced (significant pairwise F-ST) but there is no evidence for isolation by distance. The most substantial genetic structuring was observed for comparisons spanning unsuitable habitat, implying limited dispersal of devils between the well-connected, eastern populations and a smaller northwestern population. The genetic distinctiveness of the northwestern population was reflected in all analyses: unique alleles; multivariate analyses of gene frequency (multidimensional scaling, minimum spanning tree, nearest neighbour); high self-assignment (95%); two distinct populations for Tasmania were detected in isolation by distance and in Bayesian model-based clustering analyses. Marsupial carnivores appear to have stronger population subdivisions than their placental counterparts.

Relevance: 80.00%

Abstract:

We demonstrate a portable process for developing a triple bottom line model to measure the knowledge production performance of individual research centres. For the first time, this study also empirically illustrates how a fully units-invariant model of Data Envelopment Analysis (DEA) can be used to measure the relative efficiency of research centres by capturing the interaction amongst a common set of multiple inputs and outputs. This study is particularly timely given the increasing transparency required by governments and industries that fund research activities. The process highlights the links between organisational objectives, desired outcomes and outputs while the emerging performance model represents an executive managerial view. This study brings consistency to current measures that often rely on ratios and univariate analyses that are not otherwise conducive to relative performance analysis.

Relevance: 80.00%

Abstract:

We develop foreign bank technical, cost and profit efficiency models for particular application with data envelopment analysis (DEA). Key motivations for the paper are (a) the often-observed practice of choosing inputs and outputs where the selection process is poorly explained and linkages to theory are unclear, and (b) foreign bank productivity analysis, which has been neglected in DEA banking literature. The main aim is to demonstrate a process grounded in finance and banking theories for developing bank efficiency models, which can bring comparability and direction to empirical productivity studies. We expect this paper to foster empirical bank productivity studies.

Relevance: 80.00%

Abstract:

In this paper we propose a range of dynamic data envelopment analysis (DEA) models which allow information on costs of adjustment to be incorporated into the DEA framework. We first specify a basic dynamic DEA model predicated on a number of simplifying assumptions. We then outline a number of extensions to this model to accommodate asymmetric adjustment costs, non-static output quantities, non-static input prices, non-static costs of adjustment, technological change, quasi-fixed inputs and investment budget constraints. The new dynamic DEA models provide valuable extra information relative to the standard static DEA models: they identify an optimal path of adjustment for the input quantities, and provide a measure of the potential cost savings that result from recognising the costs of adjusting input quantities towards the optimal point. The new models are illustrated using data relating to a chain of 35 retail department stores in Chile. The empirical results illustrate the wealth of information that can be derived from these models, and clearly show that static models overstate potential cost savings when adjustment costs are non-zero.

Relevance: 80.00%

Abstract:

Objective: The aim of this paper is to examine some of the factors that facilitate and hinder interagency collaboration between child protection services and mental health services in cases where there is a parent with a mental illness and there are protection concerns for the child(ren). The paper reports on agency practices, worker attitudes and experiences, and barriers to effective collaboration. Method: A self-administered, cross-sectional survey was developed and distributed via direct mail or via line supervisors to workers in statutory child protection services, adult mental health services, child and youth mental health services, and Suspected Child Abuse and Neglect (SCAN) Teams. There were 232 completed questionnaires returned, with an overall response rate of 21%. Thirty-eight percent of respondents were statutory child protection workers, 39% were adult mental health workers, 16% were child and youth mental health workers, and 4% were SCAN Team medical officers (with 3% missing data). Results: Analysis revealed that workers were engaging in a moderate amount of interagency contact, but that they were unhappy with the support provided by their agency. Principal components analysis and multivariate analysis of variance (MANOVA) on items assessing attitudes toward other workers identified four factors, which differed in rates of endorsement: inadequate training, positive regard for child protection workers, positive regard for mental health workers, and mutual mistrust (from highest to lowest level of endorsement). The same procedure identified the relative endorsement of five factors extracted from items about potential barriers: inadequate resources, confidentiality, gaps in interagency processes, unrealistic expectations, and professional knowledge domains and boundaries. Conclusions: Mental health and child protection professionals believe that collaborative practice is necessary; however, their efforts are hindered by a lack of supportive structures and practices at the organizational level. (c) 2005 Published by Elsevier Ltd.

Relevance: 80.00%

Abstract:

Risk assessment systems for introduced species are being developed and applied globally, but methods for rigorously evaluating them are still in their infancy. We explore classification and regression tree models as an alternative to the current Australian Weed Risk Assessment system, and demonstrate how the performance of screening tests for unwanted alien species may be quantitatively compared using receiver operating characteristic (ROC) curve analysis. The optimal classification tree model for predicting weediness included just four out of a possible 44 attributes of introduced plants examined, namely: (i) intentional human dispersal of propagules; (ii) evidence of naturalization beyond the native range; (iii) evidence of being a weed elsewhere; and (iv) a high level of domestication. Intentional human dispersal of propagules in combination with evidence of naturalization beyond a plant's native range led to the strongest prediction of weediness. A high level of domestication in combination with no evidence of naturalization mitigated the likelihood of an introduced plant becoming a weed as a result of intentional human dispersal of propagules. Unlikely intentional human dispersal of propagules combined with no evidence of being a weed elsewhere led to the lowest predicted probability of weediness. The failure to include intrinsic plant attributes in the model suggests either that these attributes are not useful general predictors of weediness, or that the data and analysis were inadequate to elucidate the underlying relationship(s). This concurs with the historical pessimism about whether we will ever be able to accurately predict invasive plants. Given the apparent importance of propagule pressure (the number of individuals of a species released), future attempts at evaluating screening model performance for identifying unwanted plants need to account for propagule pressure when collating and/or analysing datasets. The classification tree had a cross-validated sensitivity of 93.6% and specificity of 36.7%. Based on the area under the ROC curve, the performance of the classification tree in correctly classifying plants as weeds or non-weeds was slightly inferior (area under ROC curve = 0.83 +/- 0.021 (+/- SE)) to that of the current risk assessment system in use (area under ROC curve = 0.89 +/- 0.018 (+/- SE)), although it requires many fewer questions to be answered.
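
A compact sketch of the evaluation approach described above (a classification tree scored by cross-validated sensitivity, specificity and area under the ROC curve), using scikit-learn on synthetic data; the attributes and labels are stand-ins for the real screening dataset.

```python
# Sketch: classification tree screening evaluated with ROC analysis.
# Attributes and weed/non-weed labels are synthetic stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(3)
n = 300
X = rng.integers(0, 2, size=(n, 4))   # human_dispersal, naturalized, weed_elsewhere, domesticated
logit = 1.5 * X[:, 0] + 1.0 * X[:, 1] + 1.0 * X[:, 2] - 1.0 * X[:, 3] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # toy weed labels

tree = DecisionTreeClassifier(max_depth=3, random_state=0)

# Cross-validated predicted probabilities drive the ROC analysis.
proba = cross_val_predict(tree, X, y, cv=10, method="predict_proba")[:, 1]
print("cross-validated AUC:", round(roc_auc_score(y, proba), 3))

# Sensitivity and specificity at the default 0.5 probability threshold.
pred = (proba >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print("sensitivity:", round(tp / (tp + fn), 3), "specificity:", round(tn / (tn + fp), 3))
```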