940 results for Methods: Data Analysis
Abstract:
The use of synthetic combinatorial peptide libraries in positional scanning format (PS-SCL) has emerged recently as an alternative approach for the identification of peptides recognized by T lymphocytes. The choice of both the PS-SCL used for screening experiments and the method used for data analysis is crucial for implementing this approach. With this aim, we tested the recognition of different PS-SCL by a tyrosinase 368-376-specific CTL clone and analyzed the data with a recently developed biometric data analysis based on a model of independent and additive contributions of individual amino acids to peptide antigen recognition. Mixtures defined with amino acids present at the corresponding positions in the native sequence were among the most active for all of the libraries. Somewhat surprisingly, a higher number of native amino acids were identifiable using amidated COOH-terminal rather than free COOH-terminal PS-SCL. Also, our data clearly indicate that when using PS-SCL longer than optimal, frame shifts occur frequently and should be taken into account. Biometric analysis of the data obtained with the amidated COOH-terminal nonapeptide library allowed the identification of the native ligand as the sequence with the highest score in a public human protein database. However, the adequacy of the PS-SCL data for the identification of the peptide ligand varied depending on the PS-SCL used. Altogether, these results provide insight into the potential of PS-SCL for the identification of CTL-defined tumor-derived antigenic sequences and may significantly improve our ability to interpret the results of these analyses.
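A minimal numpy sketch of the additive scoring idea described above: each peptide position contributes independently, so a candidate 9-mer's score is the sum of the per-position activities measured for the corresponding amino acid mixtures. All names and values here are hypothetical placeholders, not the study's data.

```python
import numpy as np

# Hypothetical PS-SCL screening results: rows = 9 peptide positions,
# columns = 20 amino acids (one defined mixture per cell).
rng = np.random.default_rng(0)
AA = "ACDEFGHIKLMNPQRSTVWY"
activity = rng.random((9, len(AA)))  # placeholder for measured activities

def score_peptide(peptide: str) -> float:
    """Additive model: score = sum of per-position mixture activities."""
    return sum(activity[i, AA.index(aa)] for i, aa in enumerate(peptide))

# Rank database 9-mers by score; the top-scoring sequence is the
# candidate native ligand.
database = ["YMDGTMSQV", "LLDGTATLR", "KTWGQYWQV"]  # toy entries
print(max(database, key=score_peptide))
```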
Abstract:
Switzerland, the country with the highest health expenditure per capita, lacks data on trauma care and system planning. Recently, 12 trauma centres were designated, to be reassessed through a future national trauma registry by 2015. Lausanne University Hospital launched the first Swiss trauma registry in 2008, which contains the largest database on trauma activity nationwide. METHODS: Prospective analysis of data from consecutively admitted shock room patients from 1 January 2008 to 31 December 2012. Shock room admission is based on physiology and mechanism of injury, assessed by prehospital physicians. Management follows a surgeon-led multidisciplinary approach. Injuries are coded by Association for the Advancement of Automotive Medicine (AAAM) certified coders. RESULTS: Over the 5 years, 1,599 trauma patients were admitted, predominantly males, with a median age of 41.4 years and a median injury severity score (ISS) of 13. The rate of ISS >15 was 42%. The principal mechanisms of injury were road traffic (40.4%) and falls (34.4%), with 91.5% blunt trauma. The principal injury patterns were brain (64.4%), chest (59.8%) and extremity/pelvic girdle (52.9%) injuries. Severe (abbreviated injury scale [AIS] score ≥3) orthopaedic injuries, defined as extremity and spine injuries together, accounted for 67.1%. Overall, 29.1% of patients underwent immediate intervention, mainly by orthopaedic surgeons (27.3%), neurosurgeons (26.3%) and visceral surgeons (13.9%); 43.8% underwent a surgical intervention within the first 24 hours and 59.1% during their hospitalisation. In-hospital mortality for patients with ISS >15 was 26.2%. CONCLUSION: This is the first 5-year report on trauma in Switzerland. Trauma workload was similar to that of other European countries. Despite high levels of healthcare, mortality exceeds published rates by >50%. Notwithstanding the importance of a multidisciplinary approach, trauma remains a surgical disease and needs dedicated surgical resources.
Abstract:
The primary objective of this research was to demonstrate the benefits of NDT technologies for effectively detecting and characterizing deterioration in bridge decks. In particular, the objectives were to demonstrate the capabilities of ground-penetrating radar (GPR) and impact echo (IE), and to evaluate and describe the condition of nine bridge decks proposed by the Iowa DOT. The first part of the report provides a detailed review of the most important deterioration processes in concrete decks, followed by a discussion of the five NDT technologies utilized in this project. In addition to the GPR and IE methods, three other technologies were utilized, namely half-cell (HC) potential, electrical resistivity (ER), and the ultrasonic surface waves (USW) method. The review includes a description of the principles of operation, field implementation, data analysis, and interpretation; information regarding the advantages and limitations of each technology in bridge deck evaluation and condition monitoring is also provided. The second part of the report provides descriptions and evaluation results for the nine bridge decks. The results of the NDT surveys are described in terms of condition assessment maps and are compared with the observations obtained from recovered cores or from subsequent bridge deck rehabilitation. Results from this study confirm that the technologies used can provide detailed and accurate information about a certain type of deterioration, electrochemical environment, or defect. However, they also show that, at this stage, a comprehensive condition assessment of bridge decks can be achieved only through the complementary use of multiple technologies. Recommendations are provided for the optimum implementation of NDT technologies for the condition assessment and monitoring of bridge decks.
Abstract:
The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is general agreement on the key issues related to risk assessment of ENMs, which encompass the key parameters for characterising ENMs, appropriate methods of analysis, and the best approach to expressing the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn. Owing to the high batch variability in the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for the selection of ENMs, such as relevance for mechanistic (scientific) studies or risk assessment-based studies, widespread availability (and thus high expected volumes of use), or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focuses on the validity of OECD tests; therefore, source material will be first in scope for testing. For risk assessment, however, it is much more relevant to have toxicity data from the material as present in the products/matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally, these methods can determine only a single characteristic, and some of them can be rather expensive. Practically, it is currently not feasible to fully characterise ENMs. Many of the techniques available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs). It was recommended that at least two complementary techniques be employed to determine a metric of ENMs. The first great challenge is to prioritise the metrics that are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach/protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonisation be initiated and that exchange of protocols take place. The precise methods used to disperse ENMs should be specifically, yet succinctly, described within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with the current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
Amplified Fragment Length Polymorphism (AFLP) is a cheap and efficient protocol for generating large sets of genetic markers. This technique has become increasingly used during the last decade in various fields of biology, including population genomics, phylogeography, and genome mapping. Here, we present RawGeno, an R library dedicated to the automated scoring of AFLPs (i.e., the coding of electropherogram signals into ready-to-use datasets). Our program includes a complete suite of tools for binning, editing, visualizing, and exporting results obtained from AFLP experiments. RawGeno can be used either from the command line, within analysis routines, or through a user-friendly graphical user interface. We describe the whole RawGeno pipeline along with recommendations for (a) setting up the analysis of electropherograms in combination with PeakScanner, a program freely distributed by Applied Biosystems; (b) performing quality checks; (c) defining bins and proceeding to scoring; (d) filtering nonoptimal bins; and (e) exporting results in different formats.
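RawGeno itself is an R library; purely as an illustration of the binning-and-scoring step it automates, here is a hedged Python sketch that groups fragment sizes across samples into bins and codes peak presence/absence. The gap rule and bin width are hypothetical simplifications, not RawGeno's actual algorithm.

```python
import numpy as np

def bin_and_score(samples, bin_width=1.0):
    """Toy AFLP scoring: group fragment sizes (bp) into bins, then code
    each sample 1/0 for the presence of a peak in each bin."""
    all_sizes = np.sort(np.concatenate(samples))
    reps = [all_sizes[0]]                 # one representative size per bin
    for s in all_sizes[1:]:
        if s - reps[-1] > bin_width:      # gap rule: start a new bin
            reps.append(s)
    reps = np.array(reps)
    score = np.zeros((len(samples), len(reps)), dtype=int)
    for i, sizes in enumerate(samples):
        for s in sizes:
            j = np.argmin(np.abs(reps - s))
            if abs(reps[j] - s) <= bin_width:
                score[i, j] = 1
    return reps, score

samples = [np.array([100.1, 150.3, 200.2]), np.array([100.4, 200.0])]
bins, presence = bin_and_score(samples)
print(presence)   # rows = samples, columns = bins (1 = peak present)
```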
Abstract:
BACKGROUND: Changes in antihypertensive drug treatment are paramount in the adequate management of patients with hypertension; still, there is little information regarding such changes in Switzerland. Our aim was to assess those changes and associated factors in a population-based, prospective study. METHODS: Data were drawn from the population-based CoLaus study, conducted among subjects initially aged 35-75 years and living in Lausanne, Switzerland. 772 hypertensive subjects (371 women) were followed for a median of 5.4 years. Subjects were defined as continuers (no change), switchers (one antihypertensive class replaced by another), combiners (one antihypertensive class added) and discontinuers (stopped treatment). The distribution of, and the factors associated with, changes in antihypertensive drug treatment were assessed. RESULTS: During the study period, the prescription of diuretics decreased and that of angiotensin receptor blockers (ARBs) increased: at baseline, diuretics were taken by 46.9% of patients, ARBs by 44.7%, angiotensin converting enzyme inhibitors (ACEIs) by 28.8%, beta-blockers (BBs) by 28.0%, calcium channel blockers (CCBs) by 18.9% and other antihypertensive drugs by 0.3%. At follow-up (approximately 5 years later), the corresponding percentages were 42.8%, 51.7%, 25.5%, 33.0%, 20.7% and 1.0%. Among all participants, 54.4% (95% confidence interval: 50.8-58.0) were continuers, 26.9% (23.8-30.2) combiners, 12.7% (10.4-15.3) switchers and 6.0% (4.4-7.9) discontinuers. Combiners had higher systolic blood pressure values at baseline than the other groups (p < 0.05). Almost one third (30.6%) of switchers and 29.3% of combiners improved their blood pressure status at follow-up, versus 18.8% of continuers and 8.7% of discontinuers (p < 0.001). Conversely, almost one third (28.3%) of discontinuers became hypertensive (systolic ≥140 mm Hg or diastolic ≥90 mm Hg), vs. 22.1% of continuers, 16.3% of switchers and 11.5% of combiners (p < 0.001). Multivariate analysis showed baseline uncontrolled hypertension, ARBs, drug regimen (monotherapy/polytherapy) and overweight/obesity to be associated with changes in antihypertensive therapy. CONCLUSION: In Switzerland, ARBs have replaced diuretics as the most commonly prescribed antihypertensive drugs. Uncontrolled hypertension, ARBs, drug regimen (monotherapy or polytherapy) and overweight/obesity are associated with changes in antihypertensive treatment.
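A small sketch (hypothetical helper, not the study's analysis code) of the continuer/switcher/combiner/discontinuer rule, comparing the sets of antihypertensive drug classes recorded at baseline and follow-up; intermediate cases (e.g. dropping one class of several) are simplified here.

```python
def classify_change(baseline: set[str], followup: set[str]) -> str:
    """Classify a treated subject by the change in antihypertensive
    drug classes between baseline and follow-up visits."""
    if not followup:
        return "discontinuer"   # stopped all treatment
    if followup == baseline:
        return "continuer"      # no change
    if baseline < followup:
        return "combiner"       # class(es) added to existing treatment
    return "switcher"           # class replaced by another

print(classify_change({"diuretic"}, {"ARB"}))             # switcher
print(classify_change({"diuretic"}, {"diuretic", "BB"}))  # combiner
```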
Abstract:
This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationship between random variables is nonlinear or when few data are available, the HSIC criterion outperforms other standard methods, such as the linear correlation or mutual information.
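A minimal numpy sketch of the empirical HSIC estimator described above, HSIC = trace(KHLH)/(n-1)² with centering matrix H = I - 11ᵀ/n and Gaussian kernel Gram matrices K and L; the bandwidths and toy data are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def rbf_gram(X, sigma):
    """Gaussian RBF Gram matrix: K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Empirical HSIC: trace(K H L H) / (n - 1)^2, with H = I - 11^T/n."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    K = rbf_gram(X, sigma_x)
    L = rbf_gram(Y, sigma_y)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Quadratic dependence is detected by HSIC even though the linear
# correlation between x and y is near zero.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, (200, 1))
y = x**2 + 0.05 * rng.normal(size=(200, 1))
print(hsic(x, y), np.corrcoef(x.ravel(), y.ravel())[0, 1])
```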
Abstract:
Introduction: ICM+ software encapsulates our 20 years' experience in brain monitoring. It collects data from a variety of bedside monitors and produces time trends of parameters defined using configurable mathematical formulae. To date it is being used in nearly 40 clinical research centres worldwide. We present its application for continuous monitoring of cerebral autoregulation using near-infrared spectroscopy (NIRS). Methods: Data from multiple bedside monitors are processed by ICM+ in real time using a large selection of signal processing methods. These include various time and frequency domain analysis functions as well as fully customisable digital filters. The final results are displayed in a variety of ways, including simple time trends as well as time-window-based histograms, cross histograms, correlations, and so forth. All this allows complex information from bedside monitors to be summarised in a concise fashion and presented to medical and nursing staff in a simple way that alerts them to the development of various pathological processes. Results: One hundred and fifty patients monitored continuously with NIRS, arterial blood pressure (ABP) and, where available, intracranial pressure (ICP) were included in this study. There were 40 severely head-injured adult patients and 27 SAH patients (NCCU, Cambridge); 60 patients undergoing cardiopulmonary bypass (Johns Hopkins Hospital, Baltimore); and 23 patients with sepsis (University Hospital, Basel). In addition, MCA flow velocity (FV) was monitored intermittently using transcranial Doppler. ICP-derived and FV-derived pressure reactivity indices (PRx, Mx), as well as NIRS-derived reactivity indices (Cox, Tox, Thx), were calculated and showed significant correlations with each other in all cohorts. Error-bar charts showing the reactivity index PRx versus CPP (optimal CPP chart), as well as similar curves for the NIRS indices versus CPP and ABP, were also demonstrated. Conclusions: ICM+ software is proving to be a very useful tool for enhancing the battery of available means for monitoring cerebral vasoreactivity and potentially facilitating autoregulation-guided therapy. The complexity of the data analysis is hidden inside loadable profiles, allowing investigators to take full advantage of validated protocols including advanced processing formulas.
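Reactivity indices such as PRx and Mx are, at heart, moving correlation coefficients between slow waves of two signals. A hedged sketch of that computation (window length, sampling and signal model are illustrative assumptions, not ICM+'s actual configuration):

```python
import numpy as np

def moving_correlation(a, b, window=30):
    """Moving Pearson correlation between two equally sampled signals,
    e.g. slow-wave averages of ABP and ICP (a PRx-like index)."""
    out = np.full(len(a), np.nan)
    for i in range(window, len(a) + 1):
        out[i - 1] = np.corrcoef(a[i - window:i], b[i - window:i])[0, 1]
    return out

rng = np.random.default_rng(2)
abp = rng.normal(80, 5, 300)
icp = 0.4 * abp + rng.normal(0, 3, 300)   # pressure-passive (impaired) case
prx = moving_correlation(abp, icp)
print(np.nanmean(prx))   # persistently positive values suggest impaired autoregulation
```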
Abstract:
To study the stress-induced effects caused by wounding from a new perspective, a metabolomic strategy based on HPLC-MS was devised for the model plant Arabidopsis thaliana. To detect induced metabolites and precisely localise these compounds among the numerous constitutive metabolites, HPLC-MS analyses were performed in a two-step strategy. In the first step, rapid direct TOF-MS measurements of the crude leaf extract were performed with a ballistic gradient on a short LC column. The HPLC-MS data were investigated by multivariate analysis as total mass spectra (TMS). Principal component analysis (PCA) and hierarchical cluster analysis (HCA) on principal coordinates were combined for data treatment. PCA and HCA demonstrated a clear clustering of plant specimens and selected the most discriminating ions given by the complete data analysis, leading to the specific detection of discrete induced ions (m/z values). Furthermore, pools of plants with homogeneous behaviour were constituted for confirmatory analysis. In this second step, long high-resolution LC profilings on a UPLC-TOF-MS system were used on the pooled samples. This made it possible to precisely localise the putative biological markers induced by wounding, by specifically extracting the accurate m/z values detected with the TMS spectra in the screening procedure.
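A short sketch of the chemometric step described above: PCA on total mass spectra followed by hierarchical clustering on the principal coordinates, with the loadings pointing back to the induced m/z values. Toy data and scikit-learn/scipy calls, not the authors' software.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Toy "total mass spectra": rows = leaf extracts, columns = m/z bins.
rng = np.random.default_rng(3)
control = rng.normal(0, 1, (10, 500))
wounded = rng.normal(0, 1, (10, 500))
wounded[:, 42] += 5.0                 # a wound-induced ion in one m/z bin
X = np.vstack([control, wounded])

pca = PCA(n_components=3)
scores = pca.fit_transform(X)         # principal coordinates

# HCA on the principal coordinates: specimens should split into 2 groups.
clusters = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print(clusters)

# Loadings of the discriminating axis point back to the induced m/z bin.
print(np.argmax(np.abs(pca.components_[0])))   # expected: 42
```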
Abstract:
Background: This study analyzed prognostic factors and treatment outcomes of primary thyroid lymphoma. Patients and Methods: Data were retrospectively collected for 87 patients (53 stage I and 34 stage II) with a median age of 65 years. Fifty-two patients were treated with a single modality (31 with chemotherapy alone and 21 with radiotherapy alone) and 35 with combined modality treatment. Median follow-up was 51 months. Results: Sixty patients had aggressive lymphoma and 27 had indolent lymphoma. The 5- and 10-year overall survival (OS) rates were 74% and 71%, respectively, and the disease-free survival (DFS) rates were 68% and 64%. Univariate analysis revealed that age, tumor size, stage, lymph node involvement, B symptoms, and treatment modality were prognostic factors for OS, DFS, and local control (LC). Patients with thyroiditis had significantly better LC rates. In multivariate analysis, OS was influenced by age, B symptoms, lymph node involvement, and tumor size, whereas DFS and LC were influenced by B symptoms and tumor size. Compared with single modality treatment, patients treated with combined modality had better 5-year OS, DFS, and LC. Conclusions: Combined modality leads to an excellent prognosis for patients with aggressive lymphoma but does not improve OS and LC in patients with indolent lymphoma.
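Time-to-event endpoints of this kind (OS, DFS) are typically estimated with Kaplan-Meier curves and compared between treatment groups; a generic sketch with the lifelines package, on synthetic toy data rather than the study's cohort:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)
# Toy follow-up times (months) and event indicators (1 = event observed).
t_single = rng.exponential(60, 52)
e_single = rng.integers(0, 2, 52)
t_combined = rng.exponential(90, 35)
e_combined = rng.integers(0, 2, 35)

kmf = KaplanMeierFitter()
kmf.fit(t_combined, e_combined, label="combined modality")
print(kmf.predict(60))          # estimated survival at 5 years (60 months)

res = logrank_test(t_single, t_combined, e_single, e_combined)
print(res.p_value)              # between-group comparison
```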
Abstract:
PURPOSE: To ascertain the prevalence of piercing among a nationally representative sample of adolescents; to assess whether having a piercing is a marker for risk behaviors; and to determine whether having more than one piercing is a cumulative marker for risk behaviors. METHODS: Data were drawn from a cross-sectional survey of a nationally representative sample of adolescents aged 16 to 20 years (N=7548). Controlling for background variables, pierced and non-pierced youth were compared on risk behaviors related to drug use, sexual behavior, and suicide. In a second step, adolescents having one piercing were compared with those having more than one. In both cases, statistically significant variables in the bivariate analysis were included in a logistic regression. Analyses were conducted separately by gender. RESULTS: Overall, 20.2% of our sample had a piercing (excluding earlobes), and it was significantly more prevalent among females than among males (33.8% vs. 7.4%; P<.001). In the bivariate analysis, all risk behaviors were significantly associated with having a piercing, and most of them remained significant in the multivariate analysis. One third of pierced subjects had more than one piercing, with no gender difference in prevalence. In the multivariate analysis, females with more than one piercing were more likely to have had multiple partners and to use cannabis, while no differences were noted for males. CONCLUSIONS: Body piercing is becoming popular among Swiss adolescents, especially females. Having a body piercing seems to be a risk marker for risk behaviors. Moreover, having multiple piercings is a cumulative risk marker for females.
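The two-step analysis described (bivariate screen, then logistic regression of the significant variables) can be illustrated with statsmodels on toy data; all variable names and effect sizes here are hypothetical:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1000
pierced = rng.integers(0, 2, n)            # exposure of interest
age = rng.integers(16, 21, n)              # background variable
# Toy outcome: a risk behavior made more likely by piercing status.
p = 1 / (1 + np.exp(-(-2.0 + 0.8 * pierced)))
behavior = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([pierced, age]).astype(float))
fit = sm.Logit(behavior, X).fit(disp=0)
print(np.exp(fit.params[1]))   # adjusted odds ratio for piercing
```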
Abstract:
OBJECTIVE: To assess how intrahepatic fat and insulin resistance relate to daily fructose and energy intake during short-term overfeeding in healthy subjects. DESIGN AND METHODS: We analyzed data collected in several studies in which fasting hepatic glucose production (HGP), the hepatic insulin sensitivity index (HISI), and intrahepatocellular lipids (IHCL) had been measured after both 6-7 days on a weight-maintenance diet (control, C; n = 55) and 6-7 days of overfeeding with 1.5 (F1.5, n = 7), 3 (F3, n = 17), or 4 g fructose/kg/day (F4, n = 10), with 3 g glucose/kg/day (G3, n = 11), or with 30% excess energy as saturated fat (fat30%, n = 10). RESULTS: F3, F4, G3, and fat30% all significantly increased IHCL, by 113 ± 86, 102 ± 115, 59 ± 92, and 90 ± 74%, respectively, as compared to C (all P < 0.05). F4 and G3 increased HGP by 16 ± 10 and 8 ± 11% (both P < 0.05), and F3 and F4 significantly decreased HISI by 20 ± 22 and 19 ± 14% (both P < 0.01). In contrast, there was no significant effect of fat30% on HGP or HISI. CONCLUSIONS: Short-term overfeeding with fructose or glucose decreases hepatic insulin sensitivity and increases hepatic fat content. This indicates short-term regulation of hepatic glucose metabolism by simple carbohydrates.
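Figures such as "113 ± 86% (P < 0.05)" arise from within-subject percent changes with a paired test; a toy scipy sketch of that computation (synthetic values, not the study data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Toy IHCL measurements for the same 17 subjects on the control and
# high-fructose diets (arbitrary units).
ihcl_control = rng.lognormal(0.0, 0.3, 17)
ihcl_fructose = ihcl_control * rng.lognormal(0.7, 0.4, 17)

pct_change = 100 * (ihcl_fructose - ihcl_control) / ihcl_control
t, p = stats.ttest_rel(ihcl_fructose, ihcl_control)
print(f"{pct_change.mean():.0f} ± {pct_change.std(ddof=1):.0f}% (P = {p:.4f})")
```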
Abstract:
After a rockfall event, a usual post-event survey includes qualitative volume estimation, trajectory mapping and determination of departure zones. However, quantitative measurements are not usually made. Additional quantitative information could be useful in determining the spatial occurrence of rockfall events and could help in quantifying their size. Seismic measurements could be suitable for detection purposes, since they are non-invasive and relatively inexpensive. Moreover, seismic techniques could provide important information on rockfall size and the location of impacts. On 14 February 2007, the Avalanche Group of the University of Barcelona recorded the seismic data generated by an artificially triggered rockfall at the Montserrat massif (near Barcelona, Spain), carried out in order to purge a slope. Two 3-component seismic stations were deployed in the area, about 200 m from the explosion point that triggered the rockfall. Seismic signals and video images were obtained simultaneously. The initial volume of the rockfall was estimated at 75 m³ by laser scanner data analysis. After the explosion, dozens of boulders ranging from 10⁻⁴ to 5 m³ in volume impacted the ground at different locations. The blocks fell onto a terrace 120 m below the release zone. The impact generated a small continuous mass movement, composed of a mixture of rocks, sand and dust, that ran down the slope and impacted the road 60 m below. Time, time-frequency evolution and particle motion analyses of the seismic records, together with seismic energy estimation, were performed. The results are as follows: 1) a rockfall event generates seismic signals with specific characteristics in the time domain; 2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; 3) particle motion plot analysis shows that locating the rock impact using two stations is feasible; 4) the feasibility and validity of seismic methods for the detection of rockfall events, their localization and their size determination are confirmed.
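The time-frequency evolution and energy of such records can be examined with standard signal-processing tools; a hedged scipy sketch on a synthetic trace (sampling rate, impact model and energy proxy are illustrative assumptions):

```python
import numpy as np
from scipy import signal

fs = 100.0                             # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(7)

# Synthetic record: background noise plus a few decaying "impact" transients.
trace = 0.1 * rng.normal(size=t.size)
for t0 in (10.0, 12.0, 15.0, 30.0):
    dt = (t - t0).clip(min=0)
    trace += np.exp(-5 * dt) * np.sin(2 * np.pi * 20 * dt) * (t >= t0)

# Time-frequency evolution (spectrogram) used to distinguish source types.
f, tt, Sxx = signal.spectrogram(trace, fs=fs, nperseg=256)
print(Sxx.shape)

# Crude radiated-energy proxy: time integral of the squared trace.
print(np.sum(trace**2) / fs)
```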
Abstract:
The correlation between the species composition of pasture communities and soil properties in the Plana de Vic has been studied using two multivariate methods: Correspondence Analysis (CA) for the vegetation data and Principal Component Analysis (PCA) for the soil data. To analyse the pastures, we took 144 vegetation relevés (comprising 201 species) that had been classified into 10 phytocoenological communities elsewhere. Most of these communities are almost entirely built up of perennials, ranging from xerophilous, clearly Mediterranean ones to mesophilous ones related to medium-European pastures, but a few occurring on shallow soils are dominated by therophytes. As for the soil properties, we analysed texture, pH, depth, bulk density, organic matter, C/N ratio and carbonate content of 25 samples, corresponding to representative relevés of the communities studied.
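Correspondence Analysis of a species-by-relevé table reduces to an SVD of the standardized residuals of the relative-frequency matrix; a compact numpy sketch on a toy abundance table (not the Plana de Vic data):

```python
import numpy as np

def correspondence_analysis(N):
    """CA of a contingency/abundance table via SVD of the standardized
    residuals (chi-square metric); returns principal coordinates."""
    P = N / N.sum()                        # correspondence matrix
    r = P.sum(axis=1)                      # row masses (relevés)
    c = P.sum(axis=0)                      # column masses (species)
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, d, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * d) / np.sqrt(r)[:, None]
    col_coords = (Vt.T * d) / np.sqrt(c)[:, None]
    return row_coords, col_coords, d**2    # d^2 = inertia per axis

# Toy abundance table: rows = relevés, columns = species.
N = np.array([[8, 1, 0], [6, 2, 1], [0, 3, 9], [1, 2, 7.0]])
rows, cols, inertia = correspondence_analysis(N)
print(rows[:, 0])   # first CA axis orders plots along the main gradient
```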
Abstract:
The present study proposes a modification of one of the most frequently applied effect size procedures in single-case data analysis: the percent of nonoverlapping data. In contrast to other techniques, the calculation and interpretation of this procedure are straightforward, and it can easily be complemented by visual inspection of the graphed data. Although the percent of nonoverlapping data has been found to perform reasonably well in N = 1 data, the magnitude of the effect estimates it yields can be distorted by trend and autocorrelation. Therefore, the data correction procedure focuses on removing the baseline trend from the data prior to estimating the change in behavior produced by the intervention. A simulation study was carried out to compare the original and modified procedures under several experimental conditions. The results suggest that the new proposal is unaffected by trend and autocorrelation and can be used in cases of unstable baselines and sequentially related measurements.
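A hedged numpy sketch of the percent of nonoverlapping data (PND) and of a baseline-detrending correction of the kind described above (the authors' exact correction procedure may differ):

```python
import numpy as np

def pnd(baseline, treatment):
    """Percent of treatment points exceeding the baseline maximum
    (assuming an increase in behavior indicates improvement)."""
    return 100 * np.mean(treatment > baseline.max())

def pnd_detrended(baseline, treatment):
    """Modified PND: fit a linear trend to the baseline phase, remove it
    from the whole series, then compute PND on the residuals."""
    nA = len(baseline)
    slope, intercept = np.polyfit(np.arange(nA), baseline, 1)
    t_all = np.arange(nA + len(treatment))
    resid = np.concatenate([baseline, treatment]) - (slope * t_all + intercept)
    return pnd(resid[:nA], resid[nA:])

base = np.array([2, 3, 4, 5, 6.0])     # rising baseline trend
treat = np.array([7, 8, 9, 10, 11.0])  # continuation of the same trend
print(pnd(base, treat))                # 100%: inflated by trend alone
print(pnd_detrended(base, treat))      # 0%: no effect once trend is removed
```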