1000 results for 280599 Data Format not elsewhere classified


Relevance:

100.00%

Publisher:

Abstract:

Background: Hospital performance reports based on administrative data should distinguish differences in quality of care between hospitals from case-mix related variation and random error effects. A study was undertaken to determine which of 12 diagnosis-outcome indicators measured across all hospitals in one state had significant risk adjusted systematic (or special cause) variation (SV) suggesting differences in quality of care. For those that did, we determined whether SV persists within hospital peer groups, whether indicator results correlate at the individual hospital level, and how many adverse outcomes would be avoided if all hospitals achieved indicator values equal to the best performing 20% of hospitals. Methods: All patients admitted during a 12 month period to 180 acute care hospitals in Queensland, Australia with heart failure (n = 5745), acute myocardial infarction (AMI) (n = 3427), or stroke (n = 2955) were entered into the study. Outcomes comprised in-hospital deaths, long hospital stays, and 30 day readmissions. Regression models produced standardised, risk adjusted diagnosis specific outcome event ratios for each hospital. Systematic and random variation in ratio distributions for each indicator were then apportioned using hierarchical statistical models. Results: Only five of 12 (42%) diagnosis-outcome indicators showed significant SV across all hospitals (long stays and same diagnosis readmissions for heart failure; in-hospital deaths and same diagnosis readmissions for AMI; and in-hospital deaths for stroke). Significant SV was only seen for two indicators within hospital peer groups (same diagnosis readmissions for heart failure in tertiary hospitals and in-hospital mortality for AMI in community hospitals). Only two pairs of indicators showed significant correlation. If all hospitals emulated the best performers, at least 20% of AMI and stroke deaths, heart failure long stays, and heart failure and AMI readmissions could be avoided. Conclusions: Diagnosis-outcome indicators based on administrative data require validation as markers of significant risk adjusted SV. Validated indicators allow quantification of realisable outcome benefits if all hospitals achieved best performer levels. The overall level of quality of care within single institutions cannot be inferred from the results of one or a few indicators.
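
As a rough illustration of the standardised outcome event ratios described above, the sketch below fits a case-mix model and compares observed with expected events per hospital. The data, covariates, and use of plain logistic regression are hypothetical assumptions; the study's hierarchical partitioning of systematic and random variation is not reproduced.

```python
# Illustrative sketch: risk-adjusted standardised outcome event ratios.
# All data and covariates below are synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 3))                  # hypothetical case-mix covariates
hospital = rng.integers(0, 20, size=n)       # hospital identifier
p_true = 1 / (1 + np.exp(-(X @ [0.5, -0.3, 0.2] - 2)))
y = rng.binomial(1, p_true)                  # outcome event (e.g. in-hospital death)

model = LogisticRegression().fit(X, y)       # case-mix (risk) model
expected = model.predict_proba(X)[:, 1]      # per-patient expected risk

# Standardised ratio per hospital: observed events / expected events.
for h in range(20):
    mask = hospital == h
    print(h, round(y[mask].sum() / expected[mask].sum(), 2))
```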

Relevance:

100.00%

Publisher:

Abstract:

This paper investigates how social security interacts with growth and growth determinants (savings, human capital investment, and fertility). Our empirical investigation finds that the estimated coefficient on social security is significantly negative in the fertility equation, insignificant in the saving equation, and significantly positive in the growth and education equations. By contrast, the estimated coefficient on growth is insignificant in the social security equation. The results suggest that social security may indeed be conducive to growth by tipping the trade-off between the number and quality of children toward the latter.
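
A minimal sketch of the kind of cross-country regression behind one of the equations described above (fertility on social security plus controls). The variables, data, and use of plain OLS are illustrative assumptions, not the paper's actual estimator or dataset.

```python
# Hypothetical sketch of one equation from the system described above:
# fertility regressed on social security and controls.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 100                                      # hypothetical country-level sample
social_security = rng.uniform(0, 10, n)      # e.g. benefits as a share of GDP
controls = rng.normal(size=(n, 2))           # hypothetical control variables
fertility = (3.0 - 0.1 * social_security
             + controls @ [0.2, -0.1] + rng.normal(0, 0.3, n))

X = sm.add_constant(np.column_stack([social_security, controls]))
print(sm.OLS(fertility, X).fit().summary())  # sign of the first slope is the focus
```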

Relevance:

100.00%

Publisher:

Abstract:

The aim of this report is to describe the use of WinBUGS for two datasets that arise from typical population pharmacokinetic studies. The first dataset relates to gentamicin concentration-time data that arose as part of routine clinical care of 55 neonates. The second dataset incorporated data from 96 patients receiving enoxaparin. Both datasets were originally analyzed using NONMEM. In the first instance, although NONMEM provided reasonable estimates of the fixed-effects parameters, it was unable to provide satisfactory estimates of the between-subject variance. In the second instance, the use of NONMEM resulted in the development of a successful model, albeit with limited available information on the between-subject variability of the pharmacokinetic parameters. WinBUGS was used to develop a model for both of these datasets. Model comparison for the enoxaparin dataset was performed using the posterior distribution of the log-likelihood and a posterior predictive check. The use of WinBUGS supported the same structural models tried in NONMEM. For the gentamicin dataset a one-compartment model with intravenous infusion was developed, and the population parameters including the full between-subject variance-covariance matrix were available. Analysis of the enoxaparin dataset supported a two-compartment model as superior to the one-compartment model, based on the posterior predictive check. Again, the full between-subject variance-covariance matrix parameters were available. Fully Bayesian approaches using MCMC methods, via WinBUGS, can offer added value for analysis of population pharmacokinetic data.
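
For readers unfamiliar with the structural model reported for the gentamicin data, the sketch below evaluates the concentration-time curve of a one-compartment model with constant-rate intravenous infusion. Parameter values are hypothetical illustrations, not estimates from either dataset.

```python
# Concentration-time curve for a one-compartment model with constant-rate
# IV infusion. Parameter values are hypothetical illustrations only.
import numpy as np

def conc(t, CL, V, rate, t_inf):
    """Concentration at time t for clearance CL, volume V, infusion rate,
    and infusion duration t_inf (consistent units assumed)."""
    k = CL / V                                       # elimination rate constant
    t = np.asarray(t, dtype=float)
    # Rise during infusion, capped at its end-of-infusion value...
    during = (rate / CL) * (1 - np.exp(-k * np.minimum(t, t_inf)))
    # ...then mono-exponential decay after the infusion stops.
    after = np.exp(-k * np.clip(t - t_inf, 0, None))
    return during * after

t = np.linspace(0, 24, 7)
print(conc(t, CL=0.05, V=0.5, rate=10.0, t_inf=0.5))
```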

Relevance:

100.00%

Publisher:

Abstract:

The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining the appropriate data quickly increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries.
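
The three Halstead metrics named above have standard definitions based on operator and operand counts; the sketch below computes them. How the study tokenised queries into operators and operands is not shown here, and the example counts are illustrative.

```python
# Standard Halstead metrics from operator/operand counts.
import math

def halstead(n1, n2, N1, N2):
    """n1/n2: distinct operators/operands; N1/N2: total occurrences."""
    length = N1 + N2                       # program length N
    vocabulary = n1 + n2                   # vocabulary n
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    effort = difficulty * volume
    return length, difficulty, effort

# Illustrative counts for a small query:
print(halstead(n1=5, n2=8, N1=12, N2=15))
```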

Relevance:

100.00%

Publisher:

Abstract:

The mechanical behavior of the vertebrate skull is often modeled using free-body analysis of simple geometric structures and, more recently, finite-element (FE) analysis. In this study, we compare experimentally collected in vivo bone strain orientations and magnitudes from the cranium of the American alligator with those extrapolated from a beam model and extracted from an FE model. The strain magnitudes predicted from beam and FE skull models bear little similarity to relative and absolute strain magnitudes recorded during in vivo biting experiments. However, quantitative differences between principal strain orientations extracted from the FE skull model and recorded during the in vivo experiments were smaller, and both generally matched expectations from the beam model. The differences in strain magnitude between the data sets may be attributable to the level of resolution of the models, the material properties used in the FE model, and the loading conditions (i.e., external forces and constraints). This study indicates that FE models and modeling of skulls as simple engineering structures may give a preliminary idea of how these structures are loaded, but whenever possible, modeling results should be verified with either in vitro or preferably in vivo testing, especially if precise knowledge of strain magnitudes is desired.
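
The beam model referred to above treats the skull as a simple bending member; under that idealisation, surface strain follows the textbook relation epsilon = Mc/(EI). The sketch below evaluates it with hypothetical geometry and load values, which real skulls of course violate; that mismatch is part of the paper's point.

```python
# Surface bending strain of an idealised beam (simple beam theory).
# All geometry and load values below are hypothetical.
def bending_strain(M, c, E, I):
    """Strain from bending moment M (N*m), distance to neutral axis c (m),
    Young's modulus E (Pa), and second moment of area I (m^4)."""
    return M * c / (E * I)

# e.g. a 10 N bite force acting 0.1 m from the section of interest
print(bending_strain(M=10 * 0.1, c=0.01, E=10e9, I=1e-8))  # -> 1e-4 strain
```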

Relevance:

100.00%

Publisher:

Abstract:

Background: Methodological challenges such as recruitment problems and participant burden make clinical trials in palliative care difficult. In 2001-2004, two community-based randomized controlled trials (RCTs) of case conferences in palliative care settings were independently conducted in Australia: the Queensland Case Conferences trial (QCC) and the Palliative Care Trial (PCT). Design: A structured comparative study of the QCC and PCT was conducted, organized by known practical and organizational barriers to clinical trials in palliative care. Results: Differences in funding dictated study designs and recruitment success; PCT had 6 times the budget of QCC. Sample size attainment: Only PCT achieved the sample size goal. QCC focused on reducing attrition through gatekeeping, while PCT maximized participation through detailed recruitment strategies and planned for significant attrition. Testing sustainable interventions: QCC achieved a higher percentage of planned case conferences; the QCC strategy required minimal extra work for clinicians, while PCT superimposed conferences on normal work schedules. Minimizing participant burden: Differing strategies of data collection were implemented to reduce participant burden. QCC had short survey instruments; PCT incorporated all data collection into normal clinical nursing encounters. Other: Both studies had acceptable withdrawal rates. Intention-to-treat analyses are planned. Both studies included substudies to validate new outcome measures. Conclusions: Health service interventions in palliative care can be studied using RCTs. Detailed comparative information on strategies, successes, and challenges can inform the design of future trials. Key lessons include adequate funding, a recruitment focus, sustainable interventions, and mechanisms to minimize participant burden.

Relevance:

100.00%

Publisher:

Abstract:

There is ongoing debate whether the efficiency of local cognitive processes leads to global cognitive ability or whether global ability feeds the efficiency of basic processes. A prominent example is the well-replicated association between inspection time (IT), a measure of perceptual discrimination speed, and intelligence (IQ), where it is not known whether increased speed is a cause or a consequence of high IQ. We investigated the direction of causation between IT and IQ in 2,012 genetically related subjects from Australia and the Netherlands. Models in which the reliable variance of each observed variable was specified as a latent trait showed IT correlations of -0.44 and -0.33 with Performance and Verbal IQ, respectively; heritabilities were 57% (IT), 83% (PIQ), and 77% (VIQ). Directional causation models provided poor fits to the data, with covariation best explained by pleiotropic genes (influencing variation in both IT and IQ). This finding of a common genetic factor provides a better target for identifying genes involved in cognition than genes that are unique to specific traits.
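
As a simplified illustration of how heritability can be gauged from genetically related subjects, the sketch below applies Falconer's formula to twin correlations. This is a deliberate simplification of the latent-trait structural models the study actually fitted, and the correlation values are hypothetical.

```python
# First-pass heritability estimate from twin correlations (Falconer's
# formula): twice the difference between MZ and DZ twin correlations.
def falconer_h2(r_mz, r_dz):
    """Heritability approximated as 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

print(falconer_h2(r_mz=0.75, r_dz=0.45))   # -> 0.6 (hypothetical values)
```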

Relevance:

100.00%

Publisher:

Abstract:

Networked information and communication technologies are rapidly advancing the capacities of governments to target and separately manage specific sub-populations, groups, and individuals. Targeting uses data profiling to calculate the differential probabilities of outcomes associated with various personal characteristics. This knowledge is used to classify and sort people for differentiated levels of treatment. Targeting is often used to direct government resources efficiently and effectively to the most disadvantaged. Although it has many benefits, targeting raises several policy and ethical issues. This paper discusses these issues and the policy responses governments may take to maximise the benefits of targeting while ameliorating its negative aspects.
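
A minimal sketch of the profiling step described above: a fitted model assigns each person an outcome probability, which is then used to sort people into service tiers. All data, features, and thresholds are hypothetical.

```python
# Hypothetical profiling sketch: score individuals, then sort into tiers.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))               # hypothetical personal characteristics
y = rng.binomial(1, 0.3, size=500)          # hypothetical outcome labels

scores = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
tier = np.digitize(scores, [0.2, 0.5])      # 0 = low, 1 = medium, 2 = high priority
print(np.bincount(tier))                    # how many people land in each tier
```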

Relevance:

100.00%

Publisher:

Abstract:

This paper provides information on the experimental set-up, data collection methods, and results to date for the project 'Large scale modelling of coarse grained beaches', undertaken at the Large Wave Channel (GWK) of FZK in Hannover by an international group of researchers in Spring 2002. The main objective of the experiments was to provide full scale measurements of cross-shore processes on gravel and mixed beaches for the verification and further development of cross-shore numerical models of gravel and mixed sediment beaches. Identical random and regular wave tests were undertaken for a gravel beach and a mixed sand/gravel beach set up in the flume. Measurements included profile development, water surface elevation along the flume, internal pressures in the swash zone, piezometric head levels within the beach, run-up, flow velocities in the surf zone, and sediment size distributions. The purpose of the paper is to present to the scientific community the experimental procedure, a summary of the data collected, and some initial results, as well as a brief outline of the on-going research being carried out with the data by different research groups. The experimental data are available to the scientific community following submission of a statement of objectives, specification of data requirements, and an agreement to abide by the GWK and EU protocols.

Relevance:

100.00%

Publisher:

Abstract:

We model nongraphitized carbon black surfaces and investigate adsorption of argon on these surfaces using grand canonical Monte Carlo (GCMC) simulation. In this model, the nongraphitized surface is modeled as a stack of graphene layers with some carbon atoms of the top graphene layer randomly removed. The percentage of surface carbon atoms removed and the effective size of the defect (created by the removal) are the key parameters characterizing the nongraphitized surface. The patterns of the adsorption isotherm and isosteric heat are studied as a function of these surface parameters as well as pressure and temperature. It is shown that the adsorption isotherm exhibits step-like behavior on a perfect graphite surface and becomes smoother on nongraphitized surfaces. Regarding the isosteric heat versus loading, for graphitized thermal carbon black we observe an increase in heat over the submonolayer coverage and then a sharp decline when the second layer starts to form, beyond which the heat increases slightly. On the other hand, the isosteric heat versus loading for a highly nongraphitized surface shows a general decline with loading, which is due to the energetic heterogeneity of the surface. Only when the fluid-fluid interaction is greater than the surface energetic factor do we see a minimum and maximum in the isosteric heat versus loading. These simulation results for the isosteric heat agree well with experimental results on the graphitization of Spheron 6 (Polley, M. H.; Schaeffer, W. D.; Smith, W. R. J. Phys. Chem. 1953, 57, 469; Beebe, R. A.; Young, D. M. J. Phys. Chem. 1954, 58, 93). Adsorption isotherms and isosteric heats in pores whose walls have defects are also studied by simulation, and the patterns of the isotherm and isosteric heat could be used to identify the fingerprint of the surface.
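
For reference, the sketch below implements the textbook grand canonical Monte Carlo acceptance rules for particle insertion and deletion, the core moves of the simulation method named above. This is a generic sketch in terms of the activity z = exp(beta*mu)/Lambda^3, not the authors' surface model or argon-carbon potential.

```python
# Textbook GCMC acceptance rules for insertion/deletion of one particle,
# written in terms of the activity z = exp(beta*mu) / Lambda^3.
import math, random

def accept_insertion(z, V, N, beta, dU):
    """dU: potential energy change of adding one particle to N particles
    in volume V at inverse temperature beta."""
    return random.random() < min(1.0, z * V / (N + 1) * math.exp(-beta * dU))

def accept_deletion(z, V, N, beta, dU):
    """dU: potential energy change (U_new - U_old) of removing one of the
    N particles currently in volume V."""
    return random.random() < min(1.0, N / (z * V) * math.exp(-beta * dU))
```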