899 results for time-related underemployment


Relevance:

30.00%

Publisher:

Abstract:

1) Background: The most common method to evaluate clarithromycin resistance is the E-test, but it is time-consuming. Resistance of Helicobacter pylori (Hp) to clarithromycin is due to point mutations in the 23S rRNA. Eight different point mutations have been related to clarithromycin resistance, but the large majority of clarithromycin resistance depends on three point mutations (A2142C, A2142G and A2143G). Novel PCR-based clarithromycin resistance assays, applicable even to paraffin-embedded biopsy specimens, have been proposed. Aims: to assess clarithromycin resistance by detecting these point mutations (with the E-test as the reference method) and, secondly, to investigate the relation with MIC values. Methods: Paraffin-embedded biopsies of Hp-positive patients were retrieved. The Hp status had been evaluated at endoscopy by rapid urease test (RUT), histology and Hp culture. The A2142C, A2142G and A2143G point mutations were detected by molecular analysis after DNA extraction, using a TaqMan real-time PCR. Results: The study enrolled 86 patients: 46 resistant and 40 susceptible to clarithromycin by E-test. According to real-time PCR, 37 specimens were susceptible to clarithromycin (wild-type DNA), whilst the remaining 49 specimens (57%) were resistant. A2143G was the most frequent mutation. A2142C always expresses a resistant phenotype, whereas A2142G leads to a resistant phenotype only if homozygous.

2) Background: The colonoscopy workload of endoscopy services is increasing due to colorectal cancer (CRC) prevention. We tested a combination of faecal tests to improve accuracy and prioritize access to colonoscopy. Methods: We tested a combination of faecal tests (FOBT, M2-PK and calprotectin) in a group of 280 patients requiring colonoscopy. Results: 47 patients had CRC and 85 had advanced adenoma(s) at colonoscopy/histology. Among single tests for CRC detection, FOBT had the highest specificity and PPV, while M2-PK had the highest sensitivity and NPV. Combinations were most interesting in terms of PPV, and the best combination of tests was i-FOBT + M2-PK.
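The single-test comparison above uses the standard 2×2 diagnostic metrics. As a minimal sketch, they can be computed from the confusion-matrix counts; the TP/FP split below is hypothetical, only the totals (47 CRC, 233 non-CRC of 280 patients) come from the abstract:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical split of the 47 CRC cases and 233 non-CRC patients:
m = diagnostic_metrics(tp=40, fp=30, fn=7, tn=203)
print({k: round(v, 2) for k, v in m.items()})
```

A test trading specificity for sensitivity (or vice versa) shifts PPV and NPV in opposite directions, which is why a combination of tests can improve PPV.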

Relevance:

30.00%

Publisher:

Abstract:

Discotic hexa-peri-hexabenzocoronenes (HBCs), as well-defined molecular graphitic substructures, have long been the subject of studies on the delocalization of π-electrons. In this work, platinum complexes were additionally introduced into the peripheral substitution pattern of HBC. This led to improved emission from the excited triplet state to the singlet ground state, together with a prolonged lifetime of the excited state. In addition, this configuration enabled fast intersystem crossing via enhanced spin-orbit coupling, which led exclusively to phosphorescence (T1→S0) both at low temperatures and at room temperature. Understanding such processes is also essential for the development of improved optoelectronic devices. The creation of precisely defined molecular structures, tailored for specific interactions, required the incorporation of hydrophobic-hydrophilic, hydrogen-bonded or electrostatically functionalized units in order to control the supramolecular assembly. HBC derivatives functionalized with imidazolium salts were prepared for this purpose. An interesting property of these molecules is their amphiphilicity. This allowed the investigation of their behavior in a polar solvent as well as of their processability and fiber formation on silicon oxide substrates. Depending on the solvent and the chosen conditions, highly crystalline fibers could be obtained. By substituting the HBCs with long, sterically demanding side chains, homeotropic alignment on substrates could be achieved through suitable processing, which makes this material interesting for photovoltaic applications.
Novel polyphenylene-metal complexes with discotic, linear and dendritic geometries were prepared via a simple reaction between Co2(CO)8 and ethynyl functionalities in dichloromethane. Pyrolysis of these complexes yielded various carbon nanoparticles, including nanotubes, graphitic nanorods and carbon/metal hybrid complexes, which were examined by electron microscopy. The resulting structures depended on the composition and structure of the starting materials. These results open up various possibilities for better understanding the mechanism that leads to the formation of graphitic nanoparticles.

Relevance:

30.00%

Publisher:

Abstract:

Numerous studies show that temporal intervals are represented through a spatial code extending from left to right, with short intervals represented to the left of long ones. Moreover, this spatial arrangement of time can be influenced by the manipulation of spatial attention. This thesis contributes to the current debate on the relationship between the spatial representation of time and spatial attention through the use of a technique that modulates spatial attention, namely prismatic adaptation (PA). The first part is devoted to the mechanisms underlying this relationship. We showed that shifting spatial attention with PA toward one side of space produces a distortion of the representation of temporal intervals consistent with the side of the attentional shift. This occurs with both visual and auditory stimuli, even though the auditory modality is not directly involved in the visuomotor PA procedure. This result suggests that the spatial code used to represent time is a central mechanism, influenced at high levels of spatial cognition. The thesis continues with an investigation of the cortical areas mediating the space-time interaction, using neuropsychological, neurophysiological and neuroimaging methods. In particular, we found that areas in the right hemisphere are crucial for time processing, whereas areas in the left hemisphere are crucial for the PA procedure and for PA to affect temporal intervals. Finally, the thesis addresses disorders of the spatial representation of time. The results indicate that a spatial-attention deficit after right-hemisphere damage causes a deficit in the spatial representation of time, which negatively affects patients' everyday lives.
Particularly interesting are the results obtained with PA. A PA treatment that is effective in reducing the spatial-attention deficit also reduces the deficit in the spatial representation of time, improving patients' quality of life.

Relevance:

30.00%

Publisher:

Abstract:

Proxy data are essential for the investigation of climate variability on time scales longer than the historical meteorological observation period. The potential value of a proxy depends on our ability to understand and quantify the physical processes that relate the corresponding climate parameter and the signal in the proxy archive. These processes can be explored under present-day conditions. In this thesis, both statistical and physical models are applied for their analysis, focusing on two specific types of proxies: lake sediment data and stable water isotopes.

In the first part of this work, the basis is established for statistically calibrating new proxies from lake sediments in western Germany. A comprehensive meteorological and hydrological data set is compiled and statistically analyzed. In this way, meteorological time series are identified that can be applied for the calibration of various climate proxies. A particular focus is laid on the investigation of extreme weather events, which have rarely been the objective of paleoclimate reconstructions so far. Subsequently, a concrete example of a proxy calibration is presented. Maxima in the quartz grain concentration from a lake sediment core are compared to recent windstorms. The latter are identified from the meteorological data with the help of a newly developed windstorm index, combining local measurements and reanalysis data. The statistical significance of the correlation between extreme windstorms and signals in the sediment is verified with the help of a Monte Carlo method. This correlation is fundamental for employing lake sediment data as a new proxy to reconstruct windstorm records of the geological past.

The second part of this thesis deals with the analysis and simulation of stable water isotopes in atmospheric vapor on daily time scales.
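The Monte Carlo significance test mentioned above can be sketched as a permutation test: signal years are repeatedly placed at random in the observation window, and the chance overlap with the storm years is compared to the observed overlap. All years and counts below are made up for illustration; this is the generic idea, not the thesis's exact procedure.

```python
import random

def permutation_pvalue(storm_years, signal_years, all_years, n_iter=10_000, seed=0):
    """Monte Carlo test: is the number of matches between storm years and
    sediment-signal years larger than expected under random placement?"""
    rng = random.Random(seed)
    observed = len(set(storm_years) & set(signal_years))
    k = len(signal_years)
    hits = 0
    for _ in range(n_iter):
        fake = rng.sample(all_years, k)                 # random signal placement
        if len(set(storm_years) & set(fake)) >= observed:
            hits += 1
    return hits / n_iter                                # fraction at least as extreme

# Hypothetical storms and quartz-concentration maxima in a 100-year window:
years = list(range(1900, 2000))
p = permutation_pvalue([1905, 1921, 1949, 1967, 1990],
                       [1905, 1921, 1949, 1967, 1972], years)
```

Four of five signal years coinciding with storms is far more overlap than random placement produces, so the estimated p-value is very small.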
In this way, a better understanding of the physical processes determining these isotope ratios can be obtained, which is an important prerequisite for the interpretation of isotope data from ice cores and the reconstruction of past temperature. In particular, the focus here is on the deuterium excess and its relation to the environmental conditions during evaporation of water from the ocean. As a basis for the diagnostic analysis and for evaluating the simulations, isotope measurements from Rehovot (Israel) are used, provided by the Weizmann Institute of Science. First, a Lagrangian moisture source diagnostic is employed in order to establish quantitative linkages between the measurements and the evaporation conditions of the vapor (and thus to calibrate the isotope signal). A strong negative correlation between relative humidity in the source regions and measured deuterium excess is found. On the contrary, sea surface temperature in the evaporation regions does not correlate well with deuterium excess. Although requiring confirmation by isotope data from different regions and longer time scales, this weak correlation might be of major importance for the reconstruction of moisture source temperatures from ice core data. Second, the Lagrangian source diagnostic is combined with a Craig-Gordon fractionation parameterization for the identified evaporation events in order to simulate the isotope ratios at Rehovot. In this way, the Craig-Gordon model can be directly evaluated with atmospheric isotope data, and better constraints for uncertain model parameters can be obtained. A comparison of the simulated deuterium excess with the measurements reveals that a much better agreement can be achieved using a wind speed independent formulation of the non-equilibrium fractionation factor instead of the classical parameterization introduced by Merlivat and Jouzel, which is widely applied in isotope GCMs. 
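The deuterium excess analyzed here follows the standard definition d = δD − 8·δ¹⁸O, i.e., the deviation from the global meteoric water line; a minimal computation:

```python
def deuterium_excess(delta_D, delta_18O):
    """d-excess (Dansgaard): deviation from the global meteoric water line
    delta_D = 8 * delta_18O + 10 (all values in permil)."""
    return delta_D - 8.0 * delta_18O

# A sample lying exactly on the global meteoric water line has d = 10:
d = deuterium_excess(delta_D=-70.0, delta_18O=-10.0)   # -70 - 8*(-10) = 10
```

Because equilibrium fractionation affects δD and δ¹⁸O in a ratio close to 8, d is largely insensitive to condensation temperature, which is what makes it a tracer of the non-equilibrium evaporation conditions discussed above.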
Finally, the first steps of the implementation of water isotope physics in the limited-area COSMO model are described, and an approach is outlined that allows simulated isotope ratios to be compared to measurements in an event-based manner by using a water tagging technique. The good agreement between model results from several case studies and measurements at Rehovot demonstrates the applicability of the approach. Because the model can be run with high, potentially cloud-resolving spatial resolution, and because it contains sophisticated parameterizations of many atmospheric processes, a complete implementation of isotope physics will allow detailed, process-oriented studies of the complex variability of stable isotopes in atmospheric waters in future research.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is a collection of works focused on the topic of Earthquake Early Warning, with special attention to large-magnitude events. The topic is addressed from different points of view, and the structure of the thesis reflects the variety of the aspects which have been analyzed. The first part is dedicated to the giant 2011 Tohoku-Oki earthquake. The main features of the rupture process are first discussed. The earthquake is then used as a case study to test the feasibility of Early Warning methodologies for very large events. Limitations of the standard approaches for large events arise in this chapter. The difficulties are related to the real-time magnitude estimate from the first few seconds of recorded signal. An evolutionary strategy for the real-time magnitude estimate is proposed and applied to the single Tohoku-Oki earthquake. In the second part of the thesis a larger number of earthquakes is analyzed, including small, moderate and large events. Starting from the measurement of two Early Warning parameters, the behavior of small and large earthquakes in the initial portion of recorded signals is investigated. The aim is to understand whether small and large earthquakes can be distinguished from the initial stage of their rupture process. A physical model and a plausible interpretation to justify the observations are proposed. The third part of the thesis is focused on practical, real-time approaches for the rapid identification of the potentially damaged zone during a seismic event. Two different approaches for the rapid prediction of the damage area are proposed and tested. The first one is a threshold-based method which uses traditional seismic data. Then an innovative approach using continuous GPS data is explored. Both strategies improve the prediction of large-scale effects of strong earthquakes.
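A threshold-based approach of the kind described in the third part can be illustrated with a generic sketch: stations whose real-time ground-motion reading exceeds a damage threshold are flagged as lying in the potentially damaged zone. Station names, readings, and the threshold below are hypothetical, not values from the thesis.

```python
def damaged_zone(readings, threshold):
    """Flag stations whose real-time peak ground-motion reading reaches the
    damage threshold; the flagged set outlines the potentially damaged zone."""
    return sorted(name for name, peak in readings.items() if peak >= threshold)

# Hypothetical real-time readings (station -> peak value, arbitrary units):
readings = {"STA1": 0.42, "STA2": 0.08, "STA3": 0.31, "STA4": 0.02}
zone = damaged_zone(readings, threshold=0.30)
print(zone)  # stations at or above the threshold
```

In practice the flagged stations would be combined with their coordinates to draw the damage area in real time, updating as new peak readings arrive.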

Relevance:

30.00%

Publisher:

Abstract:

The study of supermassive black hole (SMBH) accretion during their phase of activity (when they become active galactic nuclei, AGN), and its relation to host-galaxy growth, requires large datasets of AGN, including a significant fraction of obscured sources. X-ray data are strategic in AGN selection, because at X-ray energies the contamination from non-active galaxies is far less significant than in optical/infrared surveys, and the selection of obscured AGN, including a fraction of heavily obscured AGN, is much more effective. In this thesis, I present the results of the Chandra COSMOS Legacy survey, a 4.6 Ms X-ray survey covering the equatorial COSMOS area. The COSMOS Legacy depth (flux limit f = 2x10^(-16) erg s^(-1) cm^(-2) in the 0.5-2 keV band) is significantly better than that of other X-ray surveys over similar areas, and paves the way for surveys with future facilities, like Athena and X-ray Surveyor. The final Chandra COSMOS Legacy catalog contains 4016 point-like sources, 97% of which have a redshift. 65% of the sources are optically obscured and potentially caught in the phase of main BH growth. We used the sample of 174 Chandra COSMOS Legacy sources at z>3 to place constraints on the BH formation scenario. We found a significant disagreement between our space density and the predictions of a physical model of AGN activation through major mergers. This suggests that in our luminosity range BH triggering through secular accretion is likely preferred to a major-merger triggering scenario. Thanks to its large statistics, the Chandra COSMOS Legacy dataset, combined with the other multiwavelength COSMOS catalogs, will be used to answer questions related to a large number of astrophysical topics, with particular focus on SMBH accretion in different luminosity and redshift regimes.

Relevance:

30.00%

Publisher:

Abstract:

In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One application is the verification of safety clearances between individual components, the so-called clearance analysis. Engineers determine for specific components whether, in their resting position as well as during a motion, they maintain a prescribed safety distance to the surrounding components. If components fall below the safety distance, their shape or position must be changed. To this end, it is important to know exactly which regions of the components violate the safety distance.

In this work, we present a solution for the real-time computation of all regions between two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g., triangles). For every point in time at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety distance and call it the set of all tolerance-violating primitives. We present a holistic solution that can be divided into the following three major topics.

In the first part of this work, we investigate algorithms that check, for two triangles, whether they are tolerance-violating. We present various approaches for triangle-triangle tolerance tests and show that specialized tolerance tests are considerably faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach always proves to be the fastest.

The second part of this work deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure composed of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to account for the required safety distance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. Beyond that, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part, we present a novel, memory-optimized data structure for managing the cell contents of the previously used uniform grids, which we call Shrubs. Previous approaches to the memory optimization of uniform grids mainly rely on hashing methods, but these do not reduce the memory consumption of the cell contents. In our application, neighboring cells often have similar contents. Based on these redundant cell contents, our approach is able to losslessly compress the memory footprint of the cell contents of a uniform grid to one fifth of its former size and to decompress it at runtime.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we show applications to various path-planning problems.
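The idea of recognizing tolerance-violating primitives without an expensive primitive-primitive test can be illustrated with a conservative bounding-sphere pre-check (a generic sketch, not the dual-space test developed in this work): if even the bounding spheres are farther apart than the tolerance, the exact triangle-triangle distance must be as well, so the pair can be skipped.

```python
import math

def sphere(tri):
    """Crude bounding sphere of a triangle: centroid plus max vertex distance."""
    cx = sum(v[0] for v in tri) / 3.0
    cy = sum(v[1] for v in tri) / 3.0
    cz = sum(v[2] for v in tri) / 3.0
    r = max(math.dist((cx, cy, cz), v) for v in tri)
    return (cx, cy, cz), r

def may_violate(tri_a, tri_b, tolerance):
    """Conservative test: True means the pair still needs an exact tolerance
    test; False means the pair provably keeps the safety distance."""
    (ca, ra), (cb, rb) = sphere(tri_a), sphere(tri_b)
    return math.dist(ca, cb) - ra - rb < tolerance

a = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
b = [(0, 0, 5), (1, 0, 5), (0, 1, 5)]
print(may_violate(a, b, tolerance=1.0))   # spheres ~5 apart: cannot violate
```

The pre-check only rules pairs out; pairs that survive it still require an exact triangle-triangle tolerance test, which is exactly the filtering role the grid and hierarchy play above.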

Relevance:

30.00%

Publisher:

Abstract:

In this study we provide baseline data on semidemersal fish assemblages and their biology in a heterogeneous and still little-studied portion of the shelf of Antalya Gulf. The distribution of fish abundance in three transects subjected to different fisheries regulations (fishery vs non-fishery areas), and covering depths of 10, 25, 75, 125 and 200 m, was studied between May 2014 and February 2015 in months representative of the winter, spring, summer and autumn seasons. A total of 76 fish species belonging to 40 families were collected, and the distribution of semidemersal species was analyzed in comparison with the whole community. The spatial distribution of fish was driven mainly by depth, and two main assemblages were observed: shallow waters (10-25; 75 m) and deep waters (125-200 m). Significant differences among transects were found for the whole community but not for the semidemersal species. Analysis showed that this was due to a strong relation of these species with local environmental characteristics rather than to a different fishing pressure across transects. All species were distributed first according to the bathymetric gradient and second according to the bottom-type structure. Semidemersal species were furthermore found to be related to zooplankton and suspended-matter availability. The main morphological characteristics, sex and size distribution of the target semidemersal species Spicara smaris (Linnaeus, 1758), Saurida undosquamis (Richardson, 1848) and Pagellus acarne (Risso, 1827) were also investigated.

Relevance:

30.00%

Publisher:

Abstract:

In the present study, we have tried to expand our knowledge of the endocrine mechanisms that regulate feeding and growth in cultured fish, which could be relevant for the improvement of fish farming conditions and feeding strategies. To reach this goal, we investigated the orexigenic hormones Neuropeptide Y (NPY) and the paralogues of Agouti-related protein (AgRP1, AgRP2) in Solea senegalensis, an important species for Mediterranean aquaculture. We focused on the synchronization of these hormones to different feeding regimes (diurnal vs nocturnal and random feeding) and photoperiods (light-dark cycle vs constant darkness); the results could therefore also be relevant from a chronobiological perspective. Solea senegalensis specimens were reared under two different photoperiods, i.e., LD (light-dark) and DD (constant darkness) conditions, along with different feeding regimes (fed at ML, Med and RND times), so as to determine whether the mRNA expression of the orexigenic hormones (NPY, AgRP1 and AgRP2) is entrained by feeding time and/or photoperiod. Our results show an independence of npy mRNA expression from feeding time and suggest an endogenous control of npy expression in the telencephalon of sole, while in the optic tectum npy expression could be entrained by the light-dark cycle. Our results on Senegalese sole AgRP1 and AgRP2 showed the same pattern of expression, indicating that the expression of AgRPs in the optic tectum is related to photoperiod rather than to feeding time. However, the involvement of AgRP1 and AgRP2 in feeding behaviour in sole cannot be ruled out, and further research will be carried out with specimens maintained under different fasting conditions.
Overall, our results reinforce the role of the telencephalon as the main neural area involved in the neuroendocrine control of food intake in fish, where endogenous NPY rhythms have been found, while no statistically significant variations were observed in the diencephalon, suggesting that this brain area could be less involved in the neuroendocrine control of food intake in fish than previously thought.

Relevance:

30.00%

Publisher:

Abstract:

Conventional time-domain optical coherence tomography (OCT) has become an important tool for following dry or exudative age-related macular degeneration (AMD). Fourier-domain three-dimensional (3D) OCT was recently introduced. This study tested the reproducibility of 3D-OCT retinal thickness measurements in patients with dry and exudative AMD.

Relevance:

30.00%

Publisher:

Abstract:

The paralysis-by-analysis phenomenon, i.e., that attending to the execution of one's movement impairs performance, has gathered a lot of attention over recent years (see Wulf, 2007, for a review). Explanations of this phenomenon, e.g., the hypotheses of constrained action (Wulf et al., 2001) or of step-by-step execution (Masters, 1992; Beilock et al., 2002), however, do not address the underlying mechanisms at the level of sensorimotor control. For this purpose, a “nodal-point hypothesis” is presented here with the core assumptions that skilled motor behavior is internally based on sensorimotor chains of nodal points, that attending to intermediate nodal points leads to a muscular re-freezing of the motor system at exactly and exclusively these points in time, and that this re-freezing is accompanied by the disruption of compensatory processes, resulting in an overall decrease of motor performance. Two experiments, on lever sequencing and basketball free throws, respectively, are reported that successfully tested these time-referenced predictions, showing that muscular activity is selectively increased and compensatory variability selectively decreased at movement-related nodal points if these points are in the focus of attention.

Relevance:

30.00%

Publisher:

Abstract:

Infants with chronic lung disease (CLD) have a capacity to maintain functional lung volume despite alterations to their lung mechanics. We hypothesize that they achieve this by altering breathing patterns and dynamically elevating lung volume, leading to differences in the relationship between respiratory muscle activity, flow and lung volume. Lung function and transcutaneous electromyography of the respiratory muscles (rEMG) were measured in 20 infants with CLD and in 39 healthy age-matched controls during quiet sleep. We compared the coefficients of variation (CVs) of rEMG and the temporal relationship of rEMG variables to flow and lung volume [functional residual capacity (FRC)] between these groups. The time between the start of inspiratory muscle activity and the resulting flow (tria), in relation to respiratory cycle time, was significantly longer in infants with CLD. Although FRC had similar associations with tria and post-inspiratory activity (corrected for respiratory cycle time), the CV of the diaphragmatic rEMG was lower in CLD infants (22.6 versus 31.0%, p = 0.030). The temporal relationship of rEMG to flow and FRC and the loss of adaptive variability provide additional information on coping mechanisms in infants with CLD. This technique could be used for noninvasive bedside monitoring of CLD.
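The coefficient of variation used to compare rEMG variability between the groups is simply the standard deviation expressed as a percentage of the mean; a minimal sketch with made-up amplitude values:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical breath-by-breath diaphragmatic rEMG amplitudes (arbitrary units):
amplitudes = [4.0, 5.0, 6.0, 5.0]
print(round(cv_percent(amplitudes), 1))
```

Because the CV is dimensionless, it allows variability to be compared between infants whose absolute rEMG amplitudes differ.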

Relevance:

30.00%

Publisher:

Abstract:

This study aims to assess the impact of continued ranibizumab treatment for neovascular age-related macular degeneration on patients from the MARINA and ANCHOR randomised clinical studies who lost ≥ 3 lines of best-corrected visual acuity (BCVA) at any time during the first year of treatment.

Relevance:

30.00%

Publisher:

Abstract:

Background During acute coronary syndromes patients perceive intense distress. We hypothesized that retrospective ratings of patients' MI-related fear of dying, helplessness, or pain, all assessed within the first year post-MI, are associated with poor cardiovascular outcome. Methods We studied 304 patients (61 ± 11 years, 85% men) who, a median of 52 days (range 12-365 days) after the index MI, retrospectively rated the level of distress in the form of fear of dying, helplessness, or pain they had perceived at the time of MI on a numeric scale ranging from 0 ("no distress") to 10 ("extreme distress"). Non-fatal hospital readmissions due to cardiovascular disease (CVD)-related events (i.e., recurrent MI, elective and non-elective stent implantation, bypass surgery, pacemaker implantation, cerebrovascular incidents) were assessed at follow-up. The relative CVD event risk was computed for a (clinically meaningful) 2-point increase of distress using Cox proportional hazards models. Results During a median follow-up of 32 months (range 16-45), 45 patients (14.8%) experienced a CVD-related event requiring hospital readmission. Greater fear of dying (HR 1.21, 95% CI 1.03-1.43), helplessness (HR 1.22, 95% CI 1.04-1.44), or pain (HR 1.27, 95% CI 1.02-1.58) was significantly associated with an increased CVD risk without adjustment for covariates. A similarly increased relative risk emerged in patients with an unscheduled CVD-related hospital readmission, i.e., when excluding patients with elective stenting (fear of dying: HR 1.26, 95% CI 1.05-1.51; helplessness: HR 1.26, 95% CI 1.05-1.52; pain: HR 1.30, 95% CI 1.01-1.66). In the fully adjusted models controlling for age, the number of diseased coronary vessels, hypertension, and smoking, HRs were 1.24 (95% CI 1.04-1.46) for fear of dying, 1.26 (95% CI 1.06-1.50) for helplessness, and 1.26 (95% CI 1.01-1.57) for pain.
Conclusions Retrospectively perceived MI-related distress in the form of fear of dying, helplessness, or pain was associated with non-fatal cardiovascular outcome independent of other important prognostic factors.
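In a Cox proportional hazards model, the hazard ratio for a k-point increase of a predictor is the per-point hazard ratio raised to the k-th power, i.e., exp(k·β); a small sketch with an illustrative coefficient (not a value from the study):

```python
import math

def hazard_ratio(beta, k=1.0):
    """HR for a k-unit increase of a covariate with Cox log-hazard coefficient beta."""
    return math.exp(k * beta)

# Illustrative per-point log-hazard coefficient; the 2-point HR is its square:
beta = 0.095
hr2 = hazard_ratio(beta, k=2.0)          # equals hazard_ratio(beta) ** 2
```

This multiplicative scaling is why the choice of a 2-point increment matters when reading the HRs reported above: the same model yields a smaller HR per single point.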

Relevance:

30.00%

Publisher:

Abstract:

The risk of Hodgkin lymphoma (HL) is increased in patients infected with HIV-1. We studied the incidence and outcomes of HL, and compared CD4⁺ T-cell trajectories in HL patients and controls matched for duration of combination antiretroviral therapy (cART). A total of 40,168 adult HIV-1-infected patients (median age, 36 years; 70% male; median CD4 cell count, 234 cells/μL) from 16 European cohorts were observed during 159,133 person-years; 78 patients developed HL. The incidence was 49.0 (95% confidence interval [CI], 39.3-61.2) per 100,000 person-years, and was similar on and off cART (P = .96). The risk of HL declined as the most recent (time-updated) CD4 count increased: the adjusted hazard ratio comparing more than 350 with less than 50 cells/μL was 0.27 (95% CI, 0.08-0.86). Sixty-one HL cases diagnosed on cART were matched to 1652 controls: during the year before diagnosis, cases lost 98 CD4 cells (95% CI, -159 to -36 cells), whereas controls gained 35 cells (95% CI, 24-46 cells; P < .0001). The incidence of HL is not reduced by cART, and patients whose CD4 cell counts decline despite suppression of HIV-1 replication on cART may harbor HL.
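The reported incidence and its confidence interval follow from the counts given in the abstract (78 cases over 159 133 person-years) with a standard log-normal approximation, where the standard error of the log rate is 1/sqrt(cases):

```python
import math

def incidence_rate(cases, person_years, per=100_000, z=1.96):
    """Incidence per `per` person-years with an approximate log-normal 95% CI."""
    rate = cases / person_years * per
    half = z / math.sqrt(cases)              # SE of log(rate) is 1/sqrt(cases)
    return rate, rate * math.exp(-half), rate * math.exp(half)

rate, lo, hi = incidence_rate(78, 159_133)
print(round(rate, 1), round(lo, 1), round(hi, 1))
```

Rounded to one decimal this reproduces the abstract's 49.0 (39.3-61.2) per 100,000 person-years.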