Abstract:
OBJECTIVE: To evaluate a comprehensive MRI protocol that screens for cancer, vascular disease, and degenerative/inflammatory disease from the head to the pelvis in less than 40 minutes on a new-generation 48-channel 3T system. MATERIALS AND METHODS: All MR studies were performed on a 48-channel 3T MR scanner. A 20-channel head/neck coil, two 18-channel body arrays, and a 32-channel spine array were employed. A total of 4 healthy individuals were studied. The protocol combined single-shot T2-weighted sequences with pre- and post-gadolinium T1-weighted 3D gradient-echo sequences. All images were retrospectively and independently evaluated by two radiologists for overall image quality. RESULTS: The image quality for cancer was rated as excellent in the liver, pancreas, kidneys, lungs, pelvic organs, and brain, and as fair in the colon and breast. For vascular disease, ratings were excellent in the aorta, major branch vessel origins, inferior vena cava, and portal and hepatic veins, good in the pulmonary arteries, and poor in the coronary arteries. For degenerative/inflammatory disease, ratings were excellent in the brain, liver, and pancreas. The inter-observer agreement was excellent. CONCLUSION: Comprehensive and time-efficient screening for important categories of disease processes can be achieved with high-quality imaging on a new-generation 48-channel 3T system.
Abstract:
One of the possible courses of cancer treatment is teletherapy, and one of its most important adverse effects is the skin reaction commonly called radiodermatitis. The main purpose of this study is to analyze the evidence on topical products used to prevent radiodermatitis, in order to support care delivery to women with breast cancer during teletherapy. The research method is the comprehensive literature review. Four databases were used to select the bibliography, and the sample consists of 15 articles. The data show that, among the topical products analyzed, calendula, corticosteroids, and Xclair have shown significant protective effects. The lack of articles published in Brazil highlights the need for further research in this area, seeking better care quality through the use of products with scientifically proven efficacy.
Abstract:
OBJECTIVE: To review the psychometric properties of the Beck Depression Inventory-II (BDI-II) as a self-report measure of depression in a variety of settings and populations. METHODS: Relevant studies of the BDI-II were retrieved through a search of electronic databases, a hand search, and contact with authors. Retained studies (k = 118) were allocated into three groups: non-clinical, psychiatric/institutionalized, and medical samples. RESULTS: The internal consistency was described as around 0.9, and the retest reliability ranged from 0.73 to 0.96. The correlation between the BDI-II and the Beck Depression Inventory (BDI-I) was high, and substantial overlap with measures of depression and anxiety was reported. The criterion-based validity showed good sensitivity and specificity for detecting depression in comparison to the adopted gold standard. However, the cutoff score to screen for depression varied according to the type of sample. Factor analysis showed a robust dimension of general depression composed of two constructs: cognitive-affective and somatic-vegetative. CONCLUSIONS: The BDI-II is a relevant psychometric instrument, showing high reliability, capacity to discriminate between depressed and non-depressed subjects, and improved concurrent, content, and structural validity. Based on the available psychometric evidence, the BDI-II can be viewed as a cost-effective questionnaire for measuring the severity of depression, with broad applicability for research and clinical practice worldwide.
Abstract:
In this work, a multidisciplinary study of the December 26th, 2004 Sumatra earthquake has been carried out. We have investigated both the effect of the earthquake on the Earth's rotation and the stress field variations associated with the seismic event. In the first part of the work, we quantified the effects of the water mass redistribution associated with the propagation of a tsunami wave on the Earth's pole path and on the length of day (LOD), and applied our modeling results to the tsunami following the 2004 giant Sumatra earthquake. We compared our simulated variations of the instantaneous rotation axis with preliminary (and still unconfirmed) instrumental evidence of a pole-path perturbation registered just after the occurrence of the earthquake, which showed a step-like discontinuity that cannot be attributed to the effect of a seismic dislocation. Our results show that the perturbation induced by the tsunami on the instantaneous rotational pole is characterized by a step-like discontinuity compatible with the observations, but its magnitude turns out to be almost one hundred times smaller than the detected one. The LOD variation induced by the water mass redistribution turns out not to be significant, because the total effect is smaller than current measurement uncertainties. In the second part of this thesis, we modeled the coseismic and postseismic stress evolution following the Sumatra earthquake. By means of a semi-analytical, viscoelastic, spherical model of global postseismic deformation and a numerical finite-element approach, we performed an analysis of the stress diffusion following the earthquake in the near and far field of the mainshock source. We evaluated the stress changes due to the Sumatra earthquake by projecting the Coulomb stress onto the sequence of aftershocks taken from various catalogues in a time window spanning about two years, and finally analyzed the spatio-temporal pattern.
The analysis performed with the semi-analytical and the finite-element modeling gives a complex picture of the stress diffusion in the study area after the Sumatra earthquake. We believe that the results obtained with the analytical method suffer heavily from the restrictions imposed on the hypocentral depths of the aftershocks in order to obtain convergence of the harmonic series of the stress components. By contrast, we imposed no such constraints in the numerical method, so we expect its results to give a more realistic description of the stress variation pattern.
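The Coulomb stress projection mentioned above can be sketched numerically. The sketch below uses the standard form dCFS = d_tau + mu_eff * d_sigma_n; the effective friction coefficient and the stress values are illustrative assumptions, not quantities from the thesis, and the hard step in practice (resolving the full stress tensor change onto each aftershock's fault plane) is assumed already done.

```python
# Sketch of the Coulomb failure stress change used to test whether a
# mainshock promoted its aftershocks: dCFS = d_tau + mu_eff * d_sigma_n
# (normal stress taken tension positive). mu_eff = 0.4 and the sample
# stress changes below are illustrative assumptions only.

def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """Coulomb failure stress change (MPa).

    d_tau     : shear stress change resolved in the slip direction (MPa)
    d_sigma_n : normal stress change on the receiver fault, tension positive (MPa)
    """
    return d_tau + mu_eff * d_sigma_n

# Classify hypothetical aftershock sites: positive dCFS brings the
# receiver fault closer to failure, negative dCFS moves it away.
sites = [(0.12, -0.05), (-0.03, 0.01), (0.02, 0.08)]  # (d_tau, d_sigma_n), MPa
promoted = [coulomb_stress_change(dt, dn) > 0 for dt, dn in sites]
print(promoted)  # [True, False, True]
```

Projecting dCFS onto a catalogue of aftershocks then amounts to evaluating this quantity at each aftershock location and time, and checking how many fall in positively stressed regions.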
Abstract:
Since the first underground nuclear explosion, carried out in 1958, the analysis of seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise definition of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. If a specific event is deemed suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is expected to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test: high-quality seismological systems are thought to be capable of detecting and locating the very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method have high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both techniques, we used cross-correlation of digital waveforms in order to minimize the errors linked with incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary, simulation-based tests of the reliability of our location techniques, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using data recorded by the IMS. Initially, the algorithm was applied to the differences among the original arrival times of the P phases, without cross-correlation. The considerable geometrical spreading noticeable in the standard locations (namely those produced by the analysts of the International Data Center (IDC) of the CTBT Organization, taken as our reference) was substantially reduced by our technique. This is what we expected, since the methodology was applied to a sequence of events whose hypocenters can be supposed to be genuinely close, belonging to the same seismic structure. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times are removed, or at least reduced.
The introduction of the cross-correlation did not bring evident improvements to our results: the two sets of locations (with and without cross-correlation) are very similar to each other. This suggests that the cross-correlation did not substantially improve the precision of the manual pickings; the pickings reported by the IDC are probably good enough that the random picking error matters less than the systematic error on travel times. As a further explanation for the poor contribution of the cross-correlation, it should be noted that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller), and the signals are strongly attenuated by the large distance between the stations and the hypocentral area. At the local scale, in addition to the cross-correlation, we performed a signal interpolation in order to improve the time resolution. The resulting algorithm was applied to data collected during an experiment carried out in Israel between 1998 and 1999. The results pointed out the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly under bad SNR conditions). Another remarkable feature of our procedure is that it does not require much time to process the data, so the user can immediately check the results. During a field survey, this makes a quasi-real-time check possible, allowing immediate optimization of the array geometry if the early results suggest it.
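The two ingredients used at the local scale, waveform cross-correlation and a refinement of the peak below the sample interval, can be sketched as follows. The synthetic pulse is an illustrative stand-in for a real seismogram, and the parabolic refinement shown is one common sub-sample technique, not necessarily the exact interpolation used in the thesis.

```python
# Sketch of relative arrival-time estimation by cross-correlation: slide one
# trace against the other, keep the integer lag with the highest normalized
# correlation, then refine below one sample with a parabolic fit through the
# correlation peak and its two neighbours.

def best_lag(x, y, max_lag):
    """Integer lag (in samples) of y relative to x maximizing the correlation."""
    def corr(lag):
        pairs = [(x[i], y[i + lag]) for i in range(len(x))
                 if 0 <= i + lag < len(y)]
        if not pairs:
            return 0.0
        sx = sum(a * a for a, _ in pairs) ** 0.5
        sy = sum(b * b for _, b in pairs) ** 0.5
        if sx == 0 or sy == 0:
            return 0.0
        return sum(a * b for a, b in pairs) / (sx * sy)
    return max(range(-max_lag, max_lag + 1), key=corr)

def parabolic_refine(c_m1, c_0, c_p1):
    """Sub-sample offset of the peak, from three correlation samples."""
    denom = c_m1 - 2 * c_0 + c_p1
    return 0.0 if denom == 0 else 0.5 * (c_m1 - c_p1) / denom

# A pulse and a copy delayed by 3 samples: the estimated lag recovers the delay.
pulse = [0, 0, 1, 4, 2, 1, 0, 0, 0, 0]
delayed = [0, 0, 0, 0, 0, 1, 4, 2, 1, 0]
print(best_lag(pulse, delayed, 5))  # 3
```

The same machinery applies both to a pair of events at one station (global scale) and to one event at two array sensors (local scale); only the inputs change.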
Abstract:
This thesis describes modelling tools and methods suited for complex systems (systems that are typically represented by a plurality of models). The basic idea is that all models representing the system should be linked by well-defined model operations in order to build a structured repository of information, a hierarchy of models. The port-Hamiltonian framework is a good candidate for this kind of problem, as it natively supports the most important model operations. The thesis in particular addresses the problem of integrating distributed-parameter systems into a model hierarchy, and shows two possible mechanisms for doing so: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.
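As a minimal illustration of what port-Hamiltonian form means, a mass-spring-damper can be written as x_dot = (J - R) grad H(x), with J skew-symmetric (lossless energy exchange) and R positive semi-definite (dissipation), so the Hamiltonian can only decrease along trajectories. The sketch below uses illustrative parameter values and a simple explicit Euler integrator, not the structure-preserving discretizations developed in the thesis.

```python
# Mass-spring-damper in port-Hamiltonian form, x = (q, p),
# H(q, p) = p^2 / (2 m) + k q^2 / 2, with
# J = [[0, 1], [-1, 0]] (skew-symmetric) and R = [[0, 0], [0, c]] (dissipative).
# All numerical values are illustrative.

def grad_H(q, p, m=1.0, k=1.0):
    return (k * q, p / m)   # (dH/dq, dH/dp)

def step(q, p, dt=0.01, c=0.5):
    """One explicit-Euler step of x_dot = (J - R) grad H(x)."""
    dHq, dHp = grad_H(q, p)
    return q + dt * dHp, p + dt * (-dHq - c * dHp)

def energy(q, p, m=1.0, k=1.0):
    return p * p / (2 * m) + k * q * q / 2

q, p = 1.0, 0.0
E0 = energy(q, p)
for _ in range(2000):
    q, p = step(q, p)
print(energy(q, p) < E0)  # True: dissipation removes energy
```

The point of the framework is that interconnecting such blocks through their ports again yields a system of the same form, which is what makes the model operations in the hierarchy well defined.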
Abstract:
Development of a continuous, multidimensional high-performance liquid chromatography system for the separation of proteins and peptides, with integrated size-selective sample fractionation. A multidimensional HPLC separation method was developed for proteins and peptides with a molecular weight of <15 kDa. In the first step, the target analytes are separated from higher-molecular-weight and non-ionic components using restricted access materials (RAM) with ion-exchange functionality. The proteins are then separated on an analytical ion-exchange column and on reversed-phase (RP) columns. To avoid sample losses, a continuously operating, fully automated system was built, based on different separation speeds and four parallel RP columns. Two RP columns are eluted simultaneously but with staggered start times, so that shallow gradients yield sufficient separation performance. While the third column is being regenerated, the fourth column is loaded by enriching the proteins and peptides at the column head. During the total analysis time of 96 minutes, fractions from the first dimension are transferred to the RP columns at 4-minute intervals and separated within 8 minutes, resulting in 24 RP chromatograms. Test substances included standard proteins as well as proteins and peptides from human hemofiltrate and from lung fibroblast cell culture supernatants. Fractions were also collected and analyzed by MALDI-TOF mass spectrometry. In a single injection, more than 1000 peaks were resolved in the 24 RP chromatograms. The theoretical peak capacity is approximately 3000.
Abstract:
The focus of this thesis was the in-situ application of the new analytical technique GCxGC in both the marine and continental boundary layer, as well as in the free troposphere. Biogenic and anthropogenic VOCs were analysed and used to characterise local chemistry at the individual measurement sites. The first part of the thesis work was the characterisation of a new set of columns that was later used in the field. To simplify identification, a time-of-flight mass spectrometer (TOF-MS) detector was coupled to the GCxGC. In the field, the TOF-MS was replaced by a more robust and easily handled flame ionisation detector (FID), which is better suited to quantitative measurements. During this process, a variety of volatile organic compounds could be assigned to different environmental sources, e.g. plankton, eucalyptus forest, or urban centres. In-situ measurements of biogenic and anthropogenic VOCs were conducted at the Meteorological Observatory Hohenpeissenberg (MOHP), Germany, using a thermodesorption-GCxGC-FID system. The measured VOCs were compared to GC-MS measurements routinely conducted at the MOHP as well as to PTR-MS measurements. Furthermore, a compressed ambient air standard was measured on three different gas chromatographic instruments and the results were compared. With few exceptions, the in-situ as well as the standard measurements revealed good agreement between the individual instruments. Diurnal cycles were observed, with differing patterns for the biogenic and the anthropogenic compounds. The variability-lifetime relationship of compounds with atmospheric lifetimes from a few hours to a few days in the presence of O3 and OH was examined. It revealed a weak but significant influence of chemistry on these short-lived VOCs at the site.
The relationship was also used to estimate the average OH radical concentration during the campaign, which was compared, for the first time, to in-situ OH measurements (1.7 x 10^6 molecules/cm^3, 0.071 ppt). The OH concentration obtained with this method, ranging from 3.5 to 6.5 x 10^5 molecules/cm^3 (0.015 to 0.027 ppt), represents an approximation of the average OH concentration influencing the discussed VOCs from emission to measurement. Based on these findings, the average concentration of nighttime NO3 radicals was estimated with the same approach and found to range from 2.2 to 5.0 x 10^8 molecules/cm^3 (9.2 to 21.0 ppt). During the MINATROC field campaign, in-situ ambient air measurements with the GCxGC-FID were conducted at Tenerife, Spain. Although the station lies mainly in the free troposphere, local influences of anthropogenic and biogenic VOCs were observed. A strong dust event originating from Western Africa made it possible to compare mixing ratios during normal and elevated dust loading in the atmosphere. The mixing ratios during the dust event were found to be lower. However, this could not be attributed to heterogeneous reactions, as the wind direction changed from northwesterly to southeasterly during the dust event.
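The variability-lifetime relationship referred to above has the power-law form sigma_lnX = A * tau^(-b), so fitting it reduces to a line fit in log-log space. The sketch below uses synthetic lifetimes and variabilities generated from assumed values A = 1.0, b = 0.45; the actual inversion for OH in the thesis additionally uses the compounds' OH rate constants to tie tau to the radical concentration.

```python
# Fit sigma = A * tau**(-b) by ordinary least squares on
# log(sigma) = log(A) - b * log(tau). Data below are synthetic.
from math import log, exp

def fit_power_law(taus, sigmas):
    """Least-squares fit of sigma = A * tau**(-b); returns (A, b)."""
    xs = [log(t) for t in taus]
    ys = [log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return exp(my - slope * mx), -slope   # (A, b)

# Synthetic compounds with lifetimes from hours to days (in hours),
# generated exactly from A = 1.0, b = 0.45:
taus = [3.0, 8.0, 24.0, 72.0, 160.0]
sigmas = [1.0 * t ** -0.45 for t in taus]
A, b = fit_power_law(taus, sigmas)
print(round(A, 2), round(b, 2))  # 1.0 0.45
```

With noisy field data the scatter around the fitted line is substantial, which is why the estimated OH is quoted above only as a range.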
Abstract:
QUESTIONS UNDER STUDY / PRINCIPLES: Interest groups advocate centre-specific outcome data as a useful tool for patients choosing a hospital for their treatment and for decision-making by politicians and the insurance industry. Haematopoietic stem cell transplantation (HSCT) requires significant infrastructure and is a cost-intensive procedure, so it qualifies as a prime target for such a policy. METHODS: We used the comprehensive database of the Swiss Blood Stem Cells Transplant Group (SBST) to evaluate the potential use of mortality rates. Nine institutions reported a total of 4717 HSCTs - 1427 allogeneic (30.3%) and 3290 autologous (69.7%) - in 3808 patients between 1997 and 2008. Data were analysed for survival and transplantation-related mortality (TRM) at day 100 and at 5 years. RESULTS: The data showed marked and significant differences between centres in unadjusted analyses. These differences were absent or marginal when the results were adjusted for disease, year of transplant, and the EBMT risk score (a score incorporating patient age, disease stage, time interval between diagnosis and transplantation, and, for allogeneic transplants, donor type and donor-recipient gender combination) in a multivariable analysis. CONCLUSIONS: These data indicate comparable quality among centres in Switzerland. They show that comparison of crude centre-specific outcome data without adjustment for the patient mix may be misleading. Mandatory data collection and systematic review of all cases within a comprehensive quality management system might, in contrast, serve as a model for ascertaining the quality of other cost-intensive therapies in Switzerland.
Abstract:
Although it is well established that stromal intercellular adhesion molecule-1 (ICAM-1), ICAM-2, and vascular cell adhesion molecule-1 (VCAM-1) mediate lymphocyte recruitment into peripheral lymph nodes (PLNs), their precise contributions to the individual steps of the lymphocyte homing cascade are not known. Here, we provide in vivo evidence for a selective function for ICAM-1 > ICAM-2 > VCAM-1 in lymphocyte arrest within noninflamed PLN microvessels. Blocking all 3 CAMs completely inhibited lymphocyte adhesion within PLN high endothelial venules (HEVs). Post-arrest extravasation of T cells was a 3-step process, with optional ICAM-1-dependent intraluminal crawling followed by rapid ICAM-1- or ICAM-2-independent diapedesis and perivascular trapping. Parenchymal motility of lymphocytes was modestly reduced in the absence of ICAM-1, while ICAM-2 and alpha4-integrin ligands were not required for B-cell motility within follicles. Our findings highlight nonredundant functions for stromal Ig family CAMs in shear-resistant lymphocyte adhesion in steady-state HEVs, a unique role for ICAM-1 in intraluminal lymphocyte crawling but redundant roles for ICAM-1 and ICAM-2 in lymphocyte diapedesis and interstitial motility.
Abstract:
BACKGROUND: Only a few standardized apraxia scales are available, and they do not cover all domains and semantic features of gesture production. The objective of the present study was therefore to evaluate the reliability and validity of a newly developed test of upper limb apraxia (TULIA) that is comprehensive yet quick to administer. METHODS: The TULIA consists of 48 items covering the imitation and pantomime domains of non-symbolic (meaningless), intransitive (communicative), and transitive (tool-related) gestures, corresponding to 6 subtests. A 6-point scoring method (0-5) was used (score range 0-240). Performance was assessed from videos by blinded raters in 133 stroke patients, 84 with left hemisphere damage (LHD) and 49 with right hemisphere damage (RHD), as well as 50 healthy subjects (HS). RESULTS: The clinimetric findings demonstrated mostly good to excellent internal consistency and inter- and intra-rater (test-retest) reliability, both at the level of the six subtests and at the individual item level. Criterion validity was evaluated by confirming hypotheses based on the literature. Construct validity was demonstrated by a high correlation (r = 0.82) with the De Renzi test. CONCLUSION: These results show that the TULIA is both a reliable and a valid test for the systematic assessment of gesture production. The test is easy to apply and is therefore useful for both research purposes and clinical practice.
Abstract:
This paper describes informatics for cross-sample analysis with comprehensive two-dimensional gas chromatography (GCxGC) and high-resolution mass spectrometry (HRMS). GCxGC-HRMS analysis produces large data sets that are rich with information but highly complex. The size of the data and the volume of information require automated processing for comprehensive cross-sample analysis, but the complexity poses a challenge for developing robust methods. The approach developed here analyzes GCxGC-HRMS data from multiple samples to extract a feature template that comprehensively captures the pattern of peaks detected in the retention-times plane. Then, for each sample chromatogram, the template is geometrically transformed to align with the detected peak pattern and to generate a set of feature measurements for cross-sample analyses such as sample classification and biomarker discovery. The approach avoids the intractable problem of comprehensive peak matching by using a few reliable peaks for alignment and peak-based retention-plane windows to define comprehensive features that can be reliably matched for cross-sample analysis. The informatics are demonstrated with a set of 18 samples from breast-cancer tumors, each from a different individual, six each for Grades 1-3. The features allow classification that matches grading by a cancer pathologist with 78% success in leave-one-out cross-validation experiments. The HRMS signatures of the features of interest can be examined to determine elemental compositions and identify compounds.
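The template idea can be sketched as follows: a few reliable peaks estimate the displacement between template and sample, and that displacement is applied to each feature's retention-plane window before matching peaks inside it. Real chromatogram alignment uses richer two-dimensional transforms than the global shift shown here, and the peak coordinates are invented for illustration.

```python
# Sketch of template-based feature matching in the retention plane:
# 1) estimate a global (t1, t2) shift from a few reliable matched peaks,
# 2) shift each feature window by that amount,
# 3) collect the sample peaks falling inside the shifted window.

def estimate_shift(template_peaks, sample_peaks):
    """Average (dt1, dt2) displacement between matched reliable peaks."""
    n = len(template_peaks)
    d1 = sum(s[0] - t[0] for t, s in zip(template_peaks, sample_peaks)) / n
    d2 = sum(s[1] - t[1] for t, s in zip(template_peaks, sample_peaks)) / n
    return d1, d2

def match_feature(window_center, half_width, shift, sample_peaks):
    """Return the sample peaks inside the shifted retention-plane window."""
    cx, cy = window_center[0] + shift[0], window_center[1] + shift[1]
    return [p for p in sample_peaks
            if abs(p[0] - cx) <= half_width[0] and abs(p[1] - cy) <= half_width[1]]

# Reliable alignment peaks (template vs. the same peaks in a sample run),
# given as (t1, t2) retention times:
template = [(10.0, 1.2), (25.0, 2.0), (40.0, 0.8)]
sample = [(10.5, 1.3), (25.4, 2.1), (40.6, 0.9)]
shift = estimate_shift(template, sample)           # roughly (+0.5, +0.1)
hits = match_feature((30.0, 1.5), (0.6, 0.15), shift, [(30.4, 1.62), (33.0, 1.0)])
print(hits)  # [(30.4, 1.62)]
```

Because only the few reliable peaks must be matched exactly, the windows tolerate the residual misalignment of everything else, which is what makes the features comparable across samples.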