66 results for automated thematic analysis of textual data


Relevance: 100.00%

Abstract:

PURPOSE: Understanding the learning styles of individuals may assist in tailoring an educational program to optimize learning. General surgery faculty and residents have previously been characterized as tending toward particular learning styles. We seek to better understand the learning styles of general surgery residents and the differences that may exist within this population. METHODS: The Kolb Learning Style Inventory was administered yearly to general surgery residents at the University of Cincinnati from 1994 to 2006. This tool classifies learning styles into 4 groups: converging, accommodating, assimilating, and diverging. Convergers learn by actively solving problems; accommodators rely on emotion and interpersonal relationships; assimilators learn through abstract logic; and divergers learn best by observation. Chi-square analysis and analysis of variance were performed to determine significance. RESULTS: Surveys from 1994 to 2006 (91 residents, 325 responses) were analyzed. The most prevalent learning style was converging (185, 57%), followed by assimilating (58, 18%), accommodating (44, 14%), and diverging (38, 12%). At the PGY 1 and 2 levels, male and female residents differed in learning style, with the accommodating learning style relatively more frequent in women and the assimilating learning style more frequent in men (Table 1, p ≤ 0.001, chi-square test). Interestingly, learning style did not seem to change with advancing PGY level within the program, which suggests that individual learning styles may remain constant throughout residency training. When a resident's learning style did change, it tended to shift toward converging. In addition, no relation was found between learning style and participation in dedicated basic science training or performance on the ABSITE/SBSE. CONCLUSIONS: Our data suggest that learning style differs between male and female general surgery residents but does not vary with PGY level or ABSITE/SBSE performance. A greater understanding of individual learning styles may allow further refinement and tailoring of surgical programs.
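
To make the reported gender comparison concrete, here is a minimal sketch of a chi-square test of independence between gender and learning style; the contingency counts below are invented for illustration and are not the study's Table 1.

```python
# Minimal sketch of a chi-square test of independence between gender and
# learning style. Counts are invented for illustration, NOT the study's Table 1.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: male, female; columns: converging, assimilating, accommodating, diverging
table = np.array([[120, 45, 15, 20],
                  [65, 13, 29, 18]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
```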

Relevance: 100.00%

Abstract:

The objectives of this study were to describe the spatio-temporal pattern of a highly pathogenic avian influenza (HPAI) epidemic in Vietnam and to identify potential risk factors for the introduction and maintenance of infection within the poultry population. The results indicate that between 2004 and early 2006 a sequence of three epidemic waves occurred in Vietnam as distinct spatial and temporal clusters. The risk of outbreak occurrence increased with a greater percentage of rice paddy fields and with increasing domestic water bird and chicken densities. It also increased with decreasing distance to high-population-density areas and, in the third epidemic wave, with an increasing percentage of aquaculture. The findings indicate that agri-livestock farming systems involving domestic water birds and rice production in river delta areas are important for the maintenance and spread of infection. While the government's control measures appear to have been effective in the southern and central parts of Vietnam, it is likely that in the north of Vietnam the vaccination campaign led to transmission of infection, which was subsequently brought under control.
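
As a rough illustration of the kind of risk-factor analysis described above, the following sketch fits a logistic outbreak model on simulated data; all column names (rice_paddy_pct, waterbird_density, chicken_density, dist_to_town) and coefficients are hypothetical and do not reproduce the study's actual model specification.

```python
# Hypothetical sketch of an outbreak risk-factor model (logistic regression),
# loosely following the covariates named in the abstract. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "rice_paddy_pct": rng.uniform(0, 100, n),   # % rice paddy per commune
    "waterbird_density": rng.gamma(2, 50, n),   # birds per km^2
    "chicken_density": rng.gamma(2, 200, n),
    "dist_to_town": rng.uniform(0, 50, n),      # km to nearest town
})
# Simulate outbreaks so the example runs end to end
logit = (0.02 * df.rice_paddy_pct + 0.004 * df.waterbird_density
         + 0.001 * df.chicken_density - 0.05 * df.dist_to_town - 3)
df["outbreak"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("outbreak ~ rice_paddy_pct + waterbird_density"
                  " + chicken_density + dist_to_town", data=df).fit()
print(model.summary())
print(np.exp(model.params))  # coefficients expressed as odds ratios
```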

Relevance: 100.00%

Abstract:

BACKGROUND: In clinical practice, a diagnosis is based on a combination of clinical history, physical examination, and additional diagnostic tests. At present, diagnostic research often reports the accuracy of tests without taking into account the information already available from history and examination. Because of this missing information, together with variations in the design and quality of studies, conventional meta-analyses based on these studies will not reflect the accuracy of the tests in real practice. Using individual patient data (IPD) to perform meta-analyses allows the accuracy of tests to be assessed in relation to other patient characteristics, and allows diagnostic algorithms to be developed or evaluated for individual patients. In this study we examine these potential benefits for four clinical diagnostic problems in the field of gynaecology, obstetrics, and reproductive medicine. METHODS/DESIGN: Studies are considered for inclusion on the basis of earlier systematic reviews for each of the four clinical problems. The first authors of the included studies will be invited to participate and share their original data. After assessment of validity and completeness, the acquired datasets will be merged. Based on these data, a series of analyses will be performed, including a systematic comparison of the results of the IPD meta-analysis with those of a conventional meta-analysis, development of multivariable models for clinical history alone and for the combination of history, physical examination, and relevant diagnostic tests, and development of clinical prediction rules for individual patients. These will be made accessible to clinicians. DISCUSSION: IPD meta-analysis allows the accuracy of diagnostic tests to be evaluated in relation to other relevant information. Ultimately, this could increase the efficiency of the diagnostic work-up, e.g. by reducing the need for invasive tests and/or improving the accuracy of the diagnostic work-up. This study will assess whether these benefits of IPD meta-analysis over conventional meta-analysis can be realized, and will provide a framework for future IPD meta-analyses in diagnostic and prognostic research.
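
To illustrate the contrast the protocol draws between conventional and IPD meta-analysis, here is a hedged sketch on simulated data: a two-stage pooling of study-level log odds ratios versus a one-stage patient-level model that can additionally adjust for a covariate. Variable names and data are invented for the example.

```python
# Sketch contrasting a conventional (two-stage) meta-analysis with a
# one-stage IPD analysis. All data and variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
frames = []
for study in range(5):
    n = 200
    age = rng.normal(35, 8, n)                   # patient-level covariate
    test = rng.binomial(1, 0.4, n)               # index test result
    logit = -2 + 1.2 * test + 0.03 * (age - 35) + rng.normal(0, 0.2)
    disease = rng.binomial(1, 1 / (1 + np.exp(-logit)))
    frames.append(pd.DataFrame({"study": study, "age": age,
                                "test": test, "disease": disease}))
ipd = pd.concat(frames, ignore_index=True)

# Two-stage: per-study log odds ratios pooled by inverse variance
los, ws = [], []
for _, g in ipd.groupby("study"):
    fit = smf.logit("disease ~ test", data=g).fit(disp=0)
    los.append(fit.params["test"])
    ws.append(1 / fit.bse["test"] ** 2)
pooled = np.average(los, weights=ws)

# One-stage IPD: study fixed effects plus patient-level adjustment
one_stage = smf.logit("disease ~ test + age + C(study)", data=ipd).fit(disp=0)
print(f"two-stage pooled log-OR:   {pooled:.2f}")
print(f"one-stage adjusted log-OR: {one_stage.params['test']:.2f}")
```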

Relevance: 100.00%

Abstract:

High-density spatial and temporal sampling of EEG data enhances the quality of results of electrophysiological experiments. Because EEG sources typically produce widespread electric fields (see Chapter 3) and operate at frequencies well below the sampling rate, increasing the number of electrodes and time samples will not necessarily increase the number of observed processes, but will mainly increase the accuracy of the representation of these processes. This is notably the case when inverse solutions are computed. As a consequence, increasing the sampling in space and time increases the redundancy of the data (in space, because electrodes are correlated due to volume conduction, and in time, because neighboring time points are correlated), while the degrees of freedom of the data change only little. This has to be taken into account when statistical inferences are to be made from the data. However, many ERP studies disregard the intrinsic correlation structure of the data. Often, some electrodes or groups of electrodes are selected a priori as the analysis entity and treated as repeated (within-subject) measures that are analyzed using standard univariate statistics. The increased spatial resolution obtained with more electrodes is thus poorly represented in the resulting statistics. In addition, the assumptions made (e.g., about what constitutes a repeated measure) are not supported by what we know about the properties of EEG data. From the point of view of physics (see Chapter 3), the natural “atomic” analysis entity of EEG and ERP data is the scalp electric field.
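
The redundancy argument can be illustrated numerically: simulate a few underlying sources projected to many electrodes (volume conduction) and estimate the effective dimensionality of the channel correlation matrix. This is a toy sketch, not an analysis from the chapter.

```python
# Toy illustration: more electrodes do not mean more independent processes.
# A few sources are mixed into many channels; the effective dimensionality
# of the spatial correlation matrix stays close to the number of sources.
import numpy as np

rng = np.random.default_rng(2)
n_sources, n_channels, n_times = 5, 64, 1000

mixing = rng.normal(size=(n_channels, n_sources))   # volume conduction
sources = rng.normal(size=(n_sources, n_times))
eeg = mixing @ sources + 0.1 * rng.normal(size=(n_channels, n_times))

corr = np.corrcoef(eeg)                  # channel-by-channel correlation
eigvals = np.linalg.eigvalsh(corr)
# Participation-ratio style estimate of effective dimensionality
eff_dim = eigvals.sum() ** 2 / (eigvals ** 2).sum()
print(f"{n_channels} electrodes, ~{eff_dim:.1f} effective dimensions")
```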

Relevance: 100.00%

Abstract:

The physics program of the NA61/SHINE (SPS Heavy Ion and Neutrino Experiment) experiment at the CERN SPS consists of three subjects. In the first stage of data taking (2007-2009), measurements of hadron production in hadron-nucleus interactions needed for neutrino (T2K) and cosmic-ray (Pierre Auger and KASCADE) experiments will be performed. In the second stage (2009-2010), hadron production in proton-proton and proton-nucleus interactions, needed as reference data for a better understanding of nucleus-nucleus reactions, will be studied. In the third stage (2009-2013), the energy dependence of hadron production properties will be measured in p+p and p+Pb interactions and in nucleus-nucleus collisions, with the aim of identifying the properties of the onset of deconfinement and finding evidence for the critical point of strongly interacting matter. The NA61 experiment was approved at CERN in June 2007. The first pilot run was performed in October 2007. Calibrations of all detector components have been performed successfully, and preliminary uncorrected spectra have been obtained. Track reconstruction and particle identification of a quality similar to NA49 have been achieved. The data and new detailed simulations confirm that the NA61 detector acceptance and particle identification capabilities cover the phase space required by the T2K experiment. This document reports on the progress made in the calibration and analysis of the 2007 data.

Relevance: 100.00%

Abstract:

The Twentieth Century Reanalysis (20CR) is an atmospheric dataset consisting of 56 ensemble members, which covers the entire globe and reaches back to 1871. To assess the suitability of this dataset for studying past extremes, we analysed a prominent extreme event, the Galveston Hurricane, which made landfall in Texas, USA, in September 1900. The ensemble mean of 20CR shows a track of the pressure minimum with a small standard deviation among the 56 ensemble members in the area of the Gulf of Mexico. However, there are systematic differences between the assimilated “Best Track” from the International Best Track Archive for Climate Stewardship (IBTrACS) and the ensemble-mean track in 20CR. East of the Strait of Florida, the tracks derived from 20CR lie systematically northeast of the assimilated track, while in the Gulf of Mexico the 20CR tracks are shifted systematically to the southwest of the IBTrACS position. The hurricane can also be observed in the wind field, which shows cyclonic rotation and a relatively calm zone at the centre of the hurricane. The 20CR data reproduce the pressure gradient and the cyclonic wind field. Regarding the amplitude of the wind speeds, however, the ensemble-mean values from 20CR are significantly lower than the wind speeds known from measurements.
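
As a sketch of how such a track comparison might be computed, the following toy example extracts the pressure-minimum track per ensemble member from synthetic sea-level pressure fields and reports the ensemble mean and spread; real 20CR fields would of course be read from the dataset itself.

```python
# Toy sketch: derive a storm track and its ensemble spread from gridded
# pressure fields. Fields here are synthetic, not actual 20CR data.
import numpy as np

rng = np.random.default_rng(3)
n_members, n_times, nlat, nlon = 56, 10, 40, 60
lats = np.linspace(15, 35, nlat)
lons = np.linspace(-100, -70, nlon)

# Synthetic fields with a westward-moving low plus member-to-member noise
pres = np.full((n_members, n_times, nlat, nlon), 1015.0)
for t in range(n_times):
    ci, cj = 20, 55 - 4 * t                  # low-pressure centre indices
    pres[:, t, ci, cj] = 960 + rng.normal(0, 2, n_members)

# Track = location of the pressure minimum per member and time step
flat = pres.reshape(n_members, n_times, -1).argmin(axis=2)
ti, tj = np.unravel_index(flat, (nlat, nlon))
track_lat, track_lon = lats[ti], lons[tj]    # shape: (members, times)

print("ensemble-mean track lon:", track_lon.mean(axis=0))
print("member spread (std, deg):", track_lon.std(axis=0))
```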

Relevance: 100.00%

Abstract:

BACKGROUND There is ongoing debate on the optimal drug-eluting stent (DES) for diabetic patients with coronary artery disease. Biodegradable polymer drug-eluting stents (BP-DES) may potentially improve clinical outcomes in these high-risk patients. We sought to compare long-term outcomes in patients with diabetes treated with BP-DES versus durable polymer sirolimus-eluting stents (SES). METHODS We pooled individual patient-level data from 3 randomized clinical trials (ISAR-TEST 3, ISAR-TEST 4, and LEADERS) comparing BP-DES with durable polymer SES. Clinical outcomes out to 4 years were assessed. The primary end point was the composite of cardiac death, myocardial infarction, and target-lesion revascularization. Secondary end points were target-lesion revascularization and definite or probable stent thrombosis. RESULTS Of the 1094 patients with diabetes included in the present analysis, 657 received BP-DES and 437 durable polymer SES. At 4 years, the incidence of the primary end point was similar with BP-DES versus SES (hazard ratio 0.95, 95% CI 0.74-1.21, P=0.67). Target-lesion revascularization was also comparable between the groups (hazard ratio 0.89, 95% CI 0.65-1.22, P=0.47). Definite or probable stent thrombosis was significantly reduced among patients treated with BP-DES (hazard ratio 0.52, 95% CI 0.28-0.96, P=0.04), a difference driven by significantly lower stent thrombosis rates with BP-DES between 1 and 4 years (hazard ratio 0.15, 95% CI 0.03-0.70, P=0.02). CONCLUSIONS In patients with diabetes, BP-DES were associated with overall clinical outcomes comparable to durable polymer SES during follow-up to 4 years. Rates of stent thrombosis were significantly lower with BP-DES.
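
A minimal sketch of the kind of survival analysis behind the quoted hazard ratios, assuming the lifelines library and simulated patient-level data; the variable names and effect sizes are illustrative, not the pooled trial data.

```python
# Hypothetical sketch of a Cox proportional-hazards comparison of two stent
# groups on simulated data. Effect sizes are invented for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 1094
bp_des = rng.binomial(1, 657 / 1094, n)          # 1 = BP-DES, 0 = SES
# Simulated event times; BP-DES given a slightly lower hazard here
time = rng.exponential(scale=np.where(bp_des == 1, 60, 55), size=n)
event = (time < 48).astype(int)                  # event observed within 4 years
df = pd.DataFrame({"bp_des": bp_des,
                   "time": np.minimum(time, 48), # censor at 48 months
                   "event": event})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # exp(coef) for bp_des is the estimated hazard ratio
```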

Relevance: 100.00%

Abstract:

We present a program (Ragu; Randomization Graphical User interface) for statistical analyses of multichannel event-related EEG and MEG experiments. Based on measures of scalp field differences that include all sensors, and using powerful, assumption-free randomization statistics, the program yields robust, physiologically meaningful conclusions based on the entire, untransformed, and unbiased set of measurements. Ragu accommodates up to two within-subject factors and one between-subject factor, each with multiple levels. Significance is computed as a function of time and can be controlled for type II errors with overall analyses. Results are displayed in an intuitive visual interface that allows further exploration of the findings. A sample analysis of an ERP experiment illustrates the different possibilities offered by Ragu. The aim of Ragu is to maximize statistical power while minimizing the need for a priori choices of models and parameters (such as inverse models or sensors of interest) that interact with and bias statistics.
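
The core idea of such randomization statistics can be sketched in a few lines: permute condition labels within subjects and compare an observed scalp-field effect against its permutation distribution. This toy example uses the global field power of the mean difference map as the effect measure and is far simpler than Ragu itself.

```python
# Minimal randomization test on scalp field differences (toy version).
# Data are simulated: two conditions, 20 subjects, 64 channels.
import numpy as np

rng = np.random.default_rng(5)
n_subj, n_chan = 20, 64
cond_a = rng.normal(0.0, 1.0, (n_subj, n_chan))
cond_b = rng.normal(0.2, 1.0, (n_subj, n_chan))  # small built-in effect

def effect(a, b):
    diff = (a - b).mean(axis=0)                  # mean difference map
    return np.sqrt((diff ** 2).mean())           # global field power

observed = effect(cond_a, cond_b)
count, n_perm = 0, 5000
for _ in range(n_perm):
    # Randomly swap condition labels within each subject
    swap = rng.binomial(1, 0.5, (n_subj, 1)).astype(bool)
    pa = np.where(swap, cond_b, cond_a)
    pb = np.where(swap, cond_a, cond_b)
    if effect(pa, pb) >= observed:
        count += 1
p_value = (count + 1) / (n_perm + 1)
print(f"observed effect {observed:.3f}, p = {p_value:.4f}")
```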

Relevance: 100.00%

Abstract:

Identifying and comparing different steady states is an important task in clinical decision making. Data from disparate sources, comprising diverse patient status information, have to be interpreted, and an expressive representation is key to comparing results. In this contribution we suggest a criterion for calculating a context-sensitive value based on analysis of variance, and discuss its advantages and limitations with reference to a clinical data example obtained during anesthesia. Different drug plasma target levels of the anesthetic propofol were preset in order to reach and maintain clinically desirable steady-state conditions with target-controlled infusion (TCI). At the same time, systolic blood pressure was monitored, depth of anesthesia was recorded using the bispectral index (BIS), and propofol plasma concentrations were determined in venous blood samples. The presented analysis of variance (ANOVA) is used to quantify how accurately steady states can be monitored and compared using the three methods of measurement.
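
A minimal sketch of the kind of one-way ANOVA comparison described above, on simulated measurements at three hypothetical steady-state levels; the numbers are invented and carry no clinical meaning.

```python
# Toy one-way ANOVA comparing measurements across three steady-state phases,
# e.g. BIS readings at three hypothetical propofol target levels.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
level_1 = rng.normal(60, 5, 30)
level_2 = rng.normal(45, 5, 30)
level_3 = rng.normal(35, 5, 30)

f_stat, p_value = stats.f_oneway(level_1, level_2, level_3)
print(f"F = {f_stat:.1f}, p = {p_value:.2g}")

# Small within-phase variance relative to between-phase differences
# indicates well-separated, stably maintained steady states.
print("within-phase SDs:", [round(float(x.std(ddof=1)), 2)
                            for x in (level_1, level_2, level_3)])
```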

Relevance: 100.00%

Abstract:

Ancient Lake Ohrid is a steep-sided, oligotrophic karst lake that most likely formed tectonically during the Pliocene and is often referred to as a hotspot of endemic biodiversity. This study aims to trace significant lake-level fluctuations at Lake Ohrid using high-resolution acoustic data in combination with lithological, geochemical, and chronological information from two sediment cores recovered from sub-aquatic terrace levels at ca. 32 and 60 m water depth. According to our data, significant lake-level fluctuations with prominent lowstands of ca. 60 and 35 m below the present water level occurred during Marine Isotope Stage (MIS) 6 and MIS 5, respectively. The effect of these lowstands on biodiversity in most coastal parts of the lake was negligible, due to only small changes in lake surface area, coastline, and habitat. In contrast, biodiversity in shallower areas was more severely affected by the disconnection of today's sublacustrine springs from the main water body. Multichannel seismic data from deeper parts of the lake clearly image several clinoform structures stacked on top of each other. These stacked clinoforms indicate significantly lower lake levels prior to MIS 6 and a stepwise rise of the water level, with intermittent stillstands, since the lake's existence as a water-filled body, which might have enhanced the expansion of endemic species within Lake Ohrid.

Relevance: 100.00%

Abstract:

The WOCAT network has collected, documented, and assessed more than 350 case studies on promising and good practices of sustainable land management (SLM). Information on the on- and off-site benefits of different SLM types, as well as on investment and maintenance costs, is available, sometimes in quantitative and often in qualitative form. The objective of the present paper is to analyse what kind of economic benefits accrue to local stakeholders, and to better understand how these benefits compare to investment and maintenance costs. The large majority of the technologies contained in the database are perceived by land users as having positive benefits that outweigh costs in the long term. About three quarters of them also have positive or at least neutral benefits in the short term. The analysis shows that many SLM measures exist which can generate important benefits for land users, but also for other stakeholders. However, methodological issues need to be tackled, and further quantitative and qualitative data are needed to better understand and support the adoption of SLM measures.

Keywords: Sustainable Land Management, Costs, Benefits, Technologies