48 results for heterogeneous data sources
Abstract:
Self-report surveys are a common method of collecting data on protective equipment use in sport. The aim of this study was to assess the validity of self-reported use of appropriate protective eyewear by squash players. Surveys of squash players' appropriate protective eyewear behaviours were conducted over two consecutive years (2002 and 2003) at randomly selected squash venues in Melbourne, Australia. Over the two years, 1219 adult players were surveyed (response rate of 92%). Trained observers also recorded the actual on-court appropriate protective eyewear behaviours of all players during the survey sessions. Eyewear use rates calculated from both data sources were compared. The self-reported appropriate protective eyewear use rate (9.4%; 95% CI 7.8, 11.0) was significantly higher (1.6 times) than the observed rate (5.9%; 95% CI 4.6, 7.2). This suggests that players may over-report their use of appropriate protective equipment, though some may have incorrectly classified their eyewear as being appropriate or suitably protective. Studies that rely only on self-report data on protective equipment use need to take into account that this could lead to biased estimates.
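The confidence intervals quoted in this abstract can be reproduced with a standard Wald interval for a proportion, assuming both rates were computed on the full sample of 1219 players (a sketch of the standard formula, not necessarily the authors' exact method):

```python
import math

def wald_ci(p, n, z=1.96):
    """95% Wald confidence interval for a proportion p observed in n subjects."""
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# self-reported rate of 9.4% among 1219 players
lo, hi = wald_ci(0.094, 1219)   # reproduces roughly the published 95% CI of 7.8 to 11.0
```

The same call with the observed rate of 5.9% gives roughly the published 4.6 to 7.2 interval.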
Abstract:
Objectives To find how early experience in clinical and community settings (early experience) affects medical education, and identify strengths and limitations of the available evidence. Design A systematic review rating, by consensus, the strength and importance of outcomes reported in the decade 1992-2001. Data sources Bibliographical databases and journals were searched for publications on the topic, reviewed under the auspices of the recently formed Best Evidence Medical Education (BEME) collaboration. Selection of studies All empirical studies (verifiable, observational data) were included, whatever their design, method, or language of publication. Results Early experience was most commonly provided in community settings, aiming to recruit primary care practitioners for underserved populations. It increased the popularity of primary care residencies, albeit among self-selected students. It fostered self awareness and empathic attitudes towards ill people, boosted students' confidence, motivated them, gave them satisfaction, and helped them develop a professional identity. By helping develop interpersonal skills, it made entering clerkships a less stressful experience. Early experience helped students learn about professional roles and responsibilities, healthcare systems, and health needs of a population. It made biomedical, behavioural, and social sciences more relevant and easier to learn. It motivated and rewarded teachers and patients and enriched curriculums. In some countries, junior students provided preventive health care directly to underserved populations. Conclusion Early experience helps medical students learn, helps them develop appropriate attitudes towards their studies and future practice, and orientates medical curriculums towards society's needs. Experimental evidence of its benefit is unlikely to be forthcoming and yet more medical schools are likely to provide it.
Effort could usefully be concentrated on evaluating the methods and outcomes of early experience provided within non-experimental research designs, and using that evaluation to improve the quality of curriculums.
Abstract:
Objective: To evaluate the efficacy of Lactobacillus rhamnosus GG in the prevention of antibiotic-associated diarrhoea. Data Sources: A computer-based search of MEDLINE, CINAHL, AMED, the Cochrane Controlled Trials Register and the Cochrane Database of Systematic Reviews was conducted. A hand-search of the bibliographies of relevant papers and previous meta-analyses was undertaken. Review Methods: Trials were included in the review if they compared the effects of L. rhamnosus GG and placebo and listed diarrhoea as a primary end-point. Studies were excluded if they were not placebo-controlled or utilised other probiotic strains. Results: Six trials were found that met all eligibility requirements. Significant statistical heterogeneity of the trials precluded meta-analysis. Four of the six trials found a significant reduction in the risk of antibiotic-associated diarrhoea with co-administration of Lactobacillus GG. One of the trials found a reduced number of days with antibiotic-induced diarrhoea with Lactobacillus GG administration, whilst the final trial found no benefit of Lactobacillus GG supplementation. Conclusion: Additional research is needed to further clarify the effectiveness of Lactobacillus GG in the prevention of antibiotic-associated diarrhoea. Copyright (c) 2005 S. Karger AG, Basel.
Abstract:
An approach and strategy for automatic detection of buildings from aerial images using combined image analysis and interpretation techniques are described in this paper. It is undertaken in several steps. A dense DSM is obtained by stereo image matching and then the results of multi-band classification, the DSM, and the Normalized Difference Vegetation Index (NDVI) are used to reveal preliminary building interest areas. From these areas, a shape modeling algorithm has been used to precisely delineate their boundaries. The Dempster-Shafer data fusion technique is then applied to detect buildings from the combination of three data sources by a statistically-based classification. A number of test areas, which include buildings of different sizes, shapes, and roof colors, have been investigated. The tests are encouraging and demonstrate that all processes in this system are important for effective building detection.
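Dempster's rule of combination, the core of the Dempster-Shafer fusion step mentioned above, can be sketched in a few lines. The two evidence sources and their mass values below are purely illustrative, not taken from the paper:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions keyed by frozenset hypotheses."""
    raw = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + x * y
        else:
            conflict += x * y          # mass assigned to contradictory pairs
    k = 1.0 - conflict                 # renormalisation constant
    return {h: v / k for h, v in raw.items()}

B = frozenset({"building"})
N = frozenset({"non-building"})
Theta = B | N                          # total ignorance

# hypothetical masses from a DSM height cue and an NDVI cue
m_dsm = {B: 0.6, N: 0.1, Theta: 0.3}
m_ndvi = {B: 0.5, N: 0.2, Theta: 0.3}
fused = combine(m_dsm, m_ndvi)
```

Combining the two cues concentrates mass on the "building" hypothesis while discounting the conflicting evidence.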
Abstract:
A multiagent diagnostic system implemented in a Protege-JADE-JESS environment interfaced with a dynamic simulator and database services is described in this paper. The proposed system architecture enables the use of a combination of diagnostic methods from heterogeneous knowledge sources. The process ontology and the process agents are designed based on the structure of the process system, while the diagnostic agents implement the applied diagnostic methods. A specific completeness coordinator agent is implemented to coordinate the diagnostic agents based on different methods. The system is demonstrated on a case study for diagnosis of faults in a granulation process based on HAZOP and FMEA analysis.
Risk of serious NSAID-related gastrointestinal events during long-term exposure: a systematic review
Abstract:
Objective: Exposure to non-steroidal anti-inflammatory drugs (NSAIDs) is associated with increased risk of serious gastrointestinal (GI) events compared with non-exposure. We investigated whether that risk is sustained over time. Data sources: Cochrane Controlled Trials Register (to 2002); MEDLINE, EMBASE, Derwent Drug File and Current Contents (1999-2002); manual searching of reviews (1999-2002). Study selection: From 479 search results reviewed and 221 articles retrieved, seven studies of patients exposed to prescription non-selective NSAIDs for more than 6 months and reporting time-dependent serious GI event rates were selected for quantitative data synthesis. These were stratified into two groups by study design. Data extraction: Incidence of GI events and number of patients at specific time points were extracted. Data synthesis: Meta-regression analyses were performed. Change in risk was evaluated by testing whether the slope of the regression line declined over time. Four randomised controlled trials (RCTs) provided evaluable data from five NSAID arms (aspirin, naproxen, two ibuprofen arms, and diclofenac). When the RCT data were combined, a small significant decline in annualised risk was seen: -0.005% (95% CI, -0.008% to -0.001%) per month. Sensitivity analyses were conducted because there was disparity within the RCT data. The pooled estimate from three cohort studies showed no significant decline in annualised risk over periods up to 2 years: -0.003% (95% CI, -0.008% to 0.003%) per month. Conclusions: Small decreases in risk over time were observed; these were of negligible clinical importance. For patients who need long-term (> 6 months) treatment, precautionary measures should be considered to reduce the net probability of serious GI events over the anticipated treatment duration. The effect of intermittent versus regular daily therapy on long-term risk needs further investigation.
Abstract:
Objectives: To systematically review radiofrequency ablation (RFA) for treating liver tumors. Data Sources: Databases were searched in July 2003. Study Selection: Studies comparing RFA with other therapies for hepatocellular carcinoma (HCC) and colorectal liver metastases (CLM) plus selected case series for CLM. Data Extraction: One researcher used standardized data extraction tables developed before the study, and these were checked by a second researcher. Data Synthesis: For HCC, 13 comparative studies were included, 4 of which were randomized, controlled trials. For CLM, 13 studies were included, 2 of which were nonrandomized comparative studies and 11 that were case series. There did not seem to be any distinct differences in the complication rates between RFA and any of the other procedures for treatment of HCC. The local recurrence rate at 2 years showed a statistically significant benefit for RFA over percutaneous ethanol injection for treatment of HCC (6% vs 26%, 1 randomized, controlled trial). Local recurrence was reported to be more common after RFA than after laser-induced thermotherapy, and a higher recurrence rate and a shorter time to recurrence were associated with RFA compared with surgical resection (1 nonrandomized study each). For CLM, the postoperative complication rate ranged from 0% to 33% (3 case series). Survival after diagnosis was shorter in the CLM group treated with RFA than in the surgical resection group (1 nonrandomized study). The CLM local recurrence rate after RFA ranged from 4% to 55% (6 case series). Conclusions: Radiofrequency ablation may be more effective than other treatments in terms of less recurrence of HCC and may be as safe, although the evidence is scant. There was not enough evidence to determine the safety or efficacy of RFA for treatment of CLM.
Abstract:
An emerging issue in the field of astronomy is the integration, management and utilization of databases from around the world to facilitate scientific discovery. In this paper, we investigate application of the machine learning techniques of support vector machines and neural networks to the problem of amalgamating catalogues of galaxies as objects from two disparate data sources: radio and optical. Formulating this as a classification problem presents several challenges, including dealing with a highly unbalanced data set. Unlike the conventional approach to the problem (which is based on a likelihood ratio), machine learning does not require density estimation and is shown here to provide a significant improvement in performance. We also report some experiments that explore the importance of the radio and optical data features for the matching problem.
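The abstract does not say how the class imbalance was handled; one standard remedy is to weight the rare (true-match) class more heavily in the training loss. A minimal sketch using a weighted logistic regression on synthetic one-dimensional data, with all names and numbers illustrative rather than taken from the paper:

```python
import math
import random

def train_weighted_logreg(data, pos_weight, epochs=200, lr=0.05):
    """data: list of (x, y) with y in {0, 1}; rare positives get weight pos_weight."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            weight = pos_weight if y == 1 else 1.0
            grad = weight * (p - y)        # gradient of the weighted log-loss
            w -= lr * grad * x
            b -= lr * grad
    return w, b

random.seed(0)
# 95 non-matches around x = -1, only 5 matches around x = +1 (heavy imbalance)
data = [(random.gauss(-1.0, 0.3), 0) for _ in range(95)] + \
       [(random.gauss(+1.0, 0.3), 1) for _ in range(5)]
w, b = train_weighted_logreg(data, pos_weight=95 / 5)
```

Weighting by the inverse class frequency keeps the rare positives from being swamped by the majority class during training.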
Abstract:
In 2001/02 five case study communities in both metropolitan and regional urban locations in Australia were chosen as test sites to develop measures of community strength on four domains: natural capital; produced economic capital; human capital; and social and institutional capital. Secondary data sources were used to develop measures on the first three domains. For the fourth domain, social and institutional capital, primary data collection was undertaken through sample surveys of households. A structured approach was devised. This involved developing a survey instrument using scaled items relating to four elements: formal norms; informal norms; formal structures; and informal structures, which embrace the concepts of trust, reciprocity, bonds, bridges, links and networks in the interaction of individuals with their community inherent in the notion of social capital. Exploratory principal components analysis was used to identify factors that measure those aspects of social and institutional capital, with confirmatory analysis conducted using Cronbach's Alpha. This enabled the construction of four primary scales and 15 sub-scales as a tool for measuring social and institutional capital. Further analyses reveal that two measures, anomie and perceived quality of life and wellbeing, relate to certain primary scales of social capital.
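Cronbach's alpha, used above for the confirmatory scale analysis, is straightforward to compute from item scores: it compares the sum of the item variances with the variance of the total score. A sketch with made-up data (the item scores below are illustrative, not the study's):

```python
def cronbach_alpha(items):
    """items: one list of scores per item, aligned across the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# three hypothetical 5-point scale items answered by five respondents
alpha = cronbach_alpha([[1, 2, 3, 4, 5], [2, 2, 3, 4, 4], [1, 3, 3, 4, 5]])
```

Values near 1 indicate that the items covary strongly and plausibly measure a single construct.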
Abstract:
The development of models in Earth Sciences, e.g. for earthquake prediction and for the simulation of mantle convection, is far from being finalized. Therefore there is a need for a modelling environment that allows scientists to implement and test new models in an easy but flexible way. After being verified, the models should be easy to apply within their scope, typically by setting input parameters through a GUI or web services. It should be possible to link certain parameters to external data sources, such as databases and other simulation codes. Moreover, as typically large-scale meshes have to be used to achieve appropriate resolutions, the computational efficiency of the underlying numerical methods is important. Conceptually, this leads to a software system with three major layers: the application layer, the mathematical layer, and the numerical algorithm layer. The latter is implemented as a C/C++ library to solve a basic, computationally intensive linear problem, such as a linear partial differential equation. The mathematical layer allows the model developer to define his model and to implement high-level solution algorithms (e.g. Newton-Raphson scheme, Crank-Nicolson scheme) or choose these algorithms from an algorithm library. The kernels of the model are generic, typically linear, solvers provided through the numerical algorithm layer. Finally, to provide an easy-to-use application environment, a web interface is (semi-automatically) built to edit the XML input file for the modelling code. In the talk, we will discuss the advantages and disadvantages of this concept in more detail. We will also present the modelling environment escript, which is a prototype implementation of such a software system in Python (see www.python.org). Key components of escript are the Data class and the PDE class.
Objects of the Data class allow generating, holding, accessing, and manipulating data in such a way that the representation best suited to the particular context is transparent to the user. They are also the key to establishing connections with external data sources. PDE class objects describe (linear) partial differential equations to be solved by a numerical library. The current implementation of escript has been linked to the finite element code Finley to solve general linear partial differential equations. We will give a few simple examples which illustrate the usage of escript. Moreover, we show the usage of escript together with Finley for the modelling of interacting fault systems and for the simulation of mantle convection.
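The escript/Finley API is not reproduced here, but the kind of high-level solution algorithm the mathematical layer hosts, for instance a Crank-Nicolson time stepper for the 1-D heat equation, can be sketched in plain Python. This is an illustration of the scheme itself, under my own discretisation choices, not escript code:

```python
def crank_nicolson_heat(u, alpha, dx, dt, steps):
    """Crank-Nicolson for u_t = alpha * u_xx with fixed (Dirichlet) boundary values."""
    r = alpha * dt / (2.0 * dx * dx)
    n = len(u)
    for _ in range(steps):
        # explicit half of the scheme goes to the right-hand side
        rhs = [u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1]) if 0 < i < n - 1 else u[i]
               for i in range(n)]
        # implicit half: tridiagonal system, solved with the Thomas algorithm
        a = [-r] * n          # sub-diagonal (a[0] unused)
        b = [1 + 2 * r] * n   # diagonal
        c = [-r] * n          # super-diagonal
        b[0] = b[-1] = 1.0    # boundary rows: keep u fixed
        c[0] = 0.0
        a[-1] = 0.0
        for i in range(1, n):                 # forward elimination
            w = a[i] / b[i - 1]
            b[i] -= w * c[i - 1]
            rhs[i] -= w * rhs[i - 1]
        u = [0.0] * n                         # back substitution
        u[-1] = rhs[-1] / b[-1]
        for i in range(n - 2, -1, -1):
            u[i] = (rhs[i] - c[i] * u[i + 1]) / b[i]
    return u

# a unit heat spike diffusing between cold boundaries
u = crank_nicolson_heat([0.0, 0.0, 1.0, 0.0, 0.0], alpha=1.0, dx=1.0, dt=0.5, steps=5)
```

In the layered design described above, a stepper like this would live in the mathematical layer, while the tridiagonal solve would be delegated to the C/C++ numerical algorithm layer.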
Abstract:
We investigate the X-ray properties of the Parkes sample of flat-spectrum radio sources using data from the ROSAT All-Sky Survey and archival pointed PSPC observations. In total, 163 of the 323 sources are detected. For the remaining 160 sources, 2 sigma upper limits to the X-ray flux are derived. We present power-law photon indices in the 0.1-2.4 keV energy band for 115 sources, which were determined either with a hardness ratio technique or from direct fits to pointed PSPC data if a sufficient number of photons were available. The average photon index is ⟨Γ⟩ = 1.95 (−0.12, +0.13) for flat-spectrum radio-loud quasars, ⟨Γ⟩ = 1.70 (−0.24, +0.23) for galaxies, and ⟨Γ⟩ = 2.40 (−0.31, +0.12) for BL Lac objects. The soft X-ray photon index is correlated with redshift and with radio spectral index in the sense that sources at high redshift and/or with flat (or inverted) radio spectra have flatter X-ray spectra on average. The results are in accord with orientation-dependent unification schemes for radio-loud active galactic nuclei. Webster et al. discovered many sources with unusually red optical continua among the quasars of this sample, and interpreted this result in terms of extinction by dust. Although the X-ray spectra in general do not show excess absorption, we find that low-redshift optically red quasars have significantly lower soft X-ray luminosities on average than objects with blue optical continua. The difference disappears for higher redshifts, as is expected for intrinsic absorption by cold gas associated with the dust. In addition, the scatter in log(f_x/f_o) is consistent with the observed optical extinction, contrary to previous claims based on optically or X-ray selected samples.
Although alternative explanations for the red optical continua cannot be excluded with the present X-ray data, we note that the observed X-ray properties are consistent with the idea that dust plays an important role in some of the radio-loud quasars with red optical continua.
Abstract:
The assumption in analytical solutions for flow from surface and buried point sources of an average water content, θ̄, behind the wetting front is examined. Some recent work has shown that this assumption fitted some field data well. Here we calculated θ̄ using a steady state solution based on the work by Raats [1971] and an exponential dependence of the diffusivity upon the water content. This is compared with a constant value of θ̄ calculated from an assumption of a hydraulic conductivity at the wetting front of 1 mm day(-1) and the water content at saturation. This comparison was made for a wide range of soils. The constant θ̄ generally underestimated θ̄ at small wetted radii and overestimated θ̄ at large radii. The crossover point between under- and overestimation changed with both soil properties and flow rate. The largest variance occurred for coarser-textured soils at low flow rates. At high flow rates in finer-textured soils the use of a constant θ̄ results in underestimation of the time for the wetting front to reach a particular radius. The value of θ̄ is related to the time at which the wetting front reaches a given radius. In coarse-textured soils the use of a constant value of θ̄ can result in an error in the time at which the wetting front reaches a particular radius of up to 80% at low flow rates and large radii.
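Under the constant-θ̄ assumption, the link between θ̄ and the arrival time of the wetting front follows from a simple mass balance over the wetted volume: for a buried point source wetting a sphere of radius R at steady flow rate Q, t = (4/3)πR³θ̄/Q. A sketch with illustrative numbers (this is the generic mass-balance relation, not the paper's Raats-based solution):

```python
import math

def wetting_time(radius_m, theta_bar, q_m3_per_day):
    """Days for the front to reach radius_m, assuming constant average water content."""
    wetted_volume = (4.0 / 3.0) * math.pi * radius_m ** 3
    return wetted_volume * theta_bar / q_m3_per_day

t_const = wetting_time(0.2, 0.20, 0.002)   # constant-theta-bar estimate
t_wet = wetting_time(0.2, 0.30, 0.002)     # a wetter front scales the time proportionally
```

Because t is proportional to θ̄, any bias in the assumed θ̄ translates directly into the same relative error in the predicted arrival time, which is the error the abstract reports reaching 80%.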
Abstract:
OBJECTIVE: To establish body mass index (BMI) norms for standard figural stimuli using a large Caucasian population-based sample. In addition, we sought to determine the effectiveness of the figural stimuli to identify individuals as obese or thin. DESIGN: All Caucasian twins born in Virginia between 1915 and 1971 were identified by public birth record. In addition, 3347 individual twins responded to a letter published in the newsletter of the American Association of Retired Persons (AARP). All adult twins (aged 18 and over) from both of these sources and their family members were mailed a 16 page 'Health and Lifestyle' questionnaire. SUBJECTS: BMI and silhouette data were available on 16 728 females and 11 366 males ranging in age from 18 to 100. MEASUREMENTS: Self-report information on height-weight, current body size, desired body size and a discrepancy score using standard figural stimuli. RESULTS: Gender- and age-specific norms are presented linking BMI to each of the figural stimuli. Additional norms for desired body size and discrepancy scores are also presented. Receiver operating characteristic (ROC) curves indicate that the figural stimuli are effective in classifying individuals as obese or thin. CONCLUSIONS: With the establishment of these norms, the silhouettes used in standard body image assessment can now be linked to BMI. Differences were observed between women and men in terms of desired body size and discrepancy scores, with women preferring smaller sizes. The figural stimuli are a robust technique for classifying individuals as obese or thin.
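For a single rating scale like the figural stimuli, the area under the ROC curve reduces to the probability that a randomly chosen obese respondent picks a larger silhouette than a randomly chosen non-obese one (the Mann-Whitney form). A minimal sketch with invented silhouette choices, not the study's data:

```python
def auc(pos_scores, neg_scores):
    """Mann-Whitney form of the area under the ROC curve; ties count half."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# hypothetical silhouette choices (scale 1-9) for obese vs non-obese respondents
a = auc([7, 8, 9, 6], [3, 4, 6, 2])
```

An AUC near 1 corresponds to the "effective classification" the abstract reports; 0.5 would mean the silhouettes carry no information about obesity status.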
Abstract:
The vacancy solution theory of adsorption is re-formulated here through the mass-action law, and placed in a convenient framework permitting the development of thermodynamically consistent isotherms. It is shown that both the multisite Langmuir model and the classical vacancy solution theory expression are special cases of the more general approach when the Flory-Huggins activity coefficient model is used, with the former being the thermodynamically consistent result. The improved vacancy solution theory approach is further extended here to heterogeneous adsorbents by considering the pore-width dependent potential along with a pore size distribution. However, application of the model to numerous hydrocarbons as well as other adsorptives on microporous activated carbons shows that the multisite model has difficulty in the presence of a pore size distribution, because pores of different sizes can have different numbers of adsorbed layers and therefore different site occupancies. On the other hand, use of the classical vacancy solution theory expression for the local isotherm leads to good simultaneous fit of the data, while yielding a site diameter of about 0.257 nm, consistent with that expected for the potential well in aromatic rings on carbon pore surfaces. It is argued that the classical approach is successful because the Flory-Huggins term effectively represents adsorbate interactions in disguise. When used together with the ideal adsorbed solution theory the heterogeneous vacancy solution theory successfully predicts binary adsorption equilibria, and is found to perform better than the multisite Langmuir as well as the heterogeneous Langmuir model. (C) 2001 Elsevier Science Ltd. All rights reserved.
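The multisite Langmuir model discussed above is commonly written, in Nitta's form, as bP = θ/(1−θ)^a, where θ is the fractional coverage and a the number of sites occupied per molecule; for a = 1 it reduces to the classical Langmuir isotherm. A sketch that inverts this relation numerically (parameter values illustrative, and the form assumed here is the common Nitta expression, not necessarily the paper's exact formulation):

```python
def multisite_langmuir_theta(bP, a, tol=1e-10):
    """Solve bP = theta / (1 - theta)**a for theta in (0, 1) by bisection.

    The left-hand side is monotonically increasing in theta, so bisection is safe.
    """
    lo, hi = 0.0, 1.0 - 1e-12
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid / (1.0 - mid) ** a < bP:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta = multisite_langmuir_theta(1.0, 1)   # a = 1: classical Langmuir gives theta = 0.5
```

Increasing a at fixed bP lowers the coverage, which is why, as the abstract notes, pores admitting different site occupancies behave differently under this model.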
Abstract:
This study presents an investigation of the communicative behaviors and strategies employed in the stimulation and management of productive and destructive conflict in culturally heterogeneous workgroups. Using communication accommodation theory (CAT), we argue that the type and course of conflict in culturally heterogeneous workgroups is impacted by the communicative behaviors and strategies employed by group members during interactions. Analysis of data from participant observations, non-participant observations, semi-structured interviews, and self-report questionnaires supports CAT-based predictions and provides fresh insights into the triggers and management strategies associated with conflict in culturally heterogeneous workgroups. In particular, results indicated that the more groups used discourse management strategies, the more they experienced productive conflict. In addition, the use of explanation and checking of own and others' understanding was a major feature of productive conflict, while speech interruptions emerged as a strategy leading to potential destructive conflict. Groups where leaders emerged and assisted in reversing communication breakdowns were better able to manage their discourse, and achieved consensus on task processes. Contributions to the understanding of the triggers and the management of productive conflict in culturally heterogeneous workgroups are discussed.