941 results for Differences-in-Differences method


Relevance: 90.00%

Abstract:

OBJECTIVE: Scarce data are available on the occurrence of symptomatic intracranial hemorrhage related to intravenous thrombolysis for acute stroke in South America. We aimed to address the frequency and clinical predictors of symptomatic intracranial hemorrhage after stroke thrombolysis at our tertiary emergency unit in Brazil. METHOD: We reviewed the clinical and radiological data of 117 consecutive acute ischemic stroke patients treated with intravenous thrombolysis in our hospital between May 2001 and April 2010. We compared our results with those of the Safe Implementation of Thrombolysis in Stroke registry. Univariate and multiple regression analyses were performed to identify factors associated with symptomatic intracranial hemorrhage. RESULTS: In total, 113 cases from the initial sample were analyzed. The median National Institutes of Health Stroke Scale score was 16 (interquartile range: 10-20). The median onset-to-treatment time was 188 minutes (interquartile range: 155-227). There were seven symptomatic intracranial hemorrhages (6.2%; Safe Implementation of Thrombolysis in Stroke registry: 4.9%; p = 0.505). In the univariate analysis, current statin treatment and elevated National Institutes of Health Stroke Scale scores were related to symptomatic intracranial hemorrhage. In the multivariate analysis, current statin treatment was the only factor independently associated with symptomatic intracranial hemorrhage. CONCLUSIONS: In this series of Brazilian patients with severe strokes treated with intravenous thrombolysis in a public university hospital at a late treatment window, we found no increase in the rate of symptomatic intracranial hemorrhage. Additional studies are necessary to clarify the possible association between statins and the risk of symptomatic intracranial hemorrhage after stroke thrombolysis.
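The univariate screen described in this abstract amounts to comparing the odds of hemorrhage between exposed and unexposed groups. A minimal sketch of a 2x2 odds-ratio calculation with a Wald 95% confidence interval; the counts below are invented for illustration and are not the paper's data:

```python
import math

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table: a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without; Wald 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 4/20 statin users vs. 3/93 non-users with sICH
or_, ci = odds_ratio(4, 16, 3, 90)
```

A multivariable analysis would instead fit a logistic regression, but the single-factor odds ratio above is the building block of the univariate step.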

Relevance: 90.00%

Abstract:

Background Chronic exposure to musical auditory stimulation has been reported to improve cardiac autonomic regulation. However, it is not clear whether music acutely influences it in response to autonomic tests. We evaluated the acute effects of music on heart rate variability (HRV) responses to the postural change maneuver (PCM) in women. Method We evaluated 12 healthy women between 18 and 28 years old; HRV was analyzed in the time (SDNN, RMSSD, NN50 and pNN50) and frequency (LF, HF and LF/HF ratio) domains. In the control protocol, the women remained at seated rest for 10 minutes, quickly stood up within three seconds, and remained standing still for 15 minutes. In the music protocol, the women remained at seated rest for 10 minutes, were exposed to music for 10 minutes, quickly stood up within three seconds, and remained standing still for 15 minutes. HRV was recorded at the following times: at rest, during music exposure (music protocol only), and at 0-5, 5-10 and 10-15 min of standing. Results In the control protocol, the SDNN, RMSSD and pNN50 indexes were reduced at 10-15 minutes after the volunteers stood up, while the LF (nu) index was increased at the same moment compared with seated rest. In the music protocol, these indexes did not differ from the control protocol, but RMSSD, pNN50 and LF (nu) differed from the music period. Conclusion Musical auditory stimulation attenuates the cardiac autonomic responses to the PCM.
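The time-domain HRV indexes named above (SDNN, RMSSD, NN50, pNN50) have simple definitions over a series of RR intervals. A small sketch of how they are typically computed; the RR series here is a toy example, not study data:

```python
import math

def hrv_time_domain(rr_ms):
    """Time-domain HRV indexes from a list of RR intervals in milliseconds:
    SDNN  - standard deviation of all intervals,
    RMSSD - root mean square of successive differences,
    NN50  - count of successive differences larger than 50 ms,
    pNN50 - NN50 as a percentage of all successive differences."""
    n = len(rr_ms)
    mean = sum(rr_ms) / n
    sdnn = math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (n - 1))
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    nn50 = sum(1 for d in diffs if abs(d) > 50)
    pnn50 = 100.0 * nn50 / len(diffs)
    return {"SDNN": sdnn, "RMSSD": rmssd, "NN50": nn50, "pNN50": pnn50}

# Toy RR series (ms); a real recording would span minutes of beats
indices = hrv_time_domain([800, 810, 790, 850, 795, 805, 860, 798])
```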

Relevance: 90.00%

Abstract:

The use of geoid models to estimate the Mean Dynamic Topography (MDT) was stimulated by the launch of the GRACE satellite system, since its models present unprecedented precision and space-time resolution. In the present study, besides the DNSC08 mean sea level model, the following geoid models were used to compute MDTs: EGM96, EIGEN-5C and EGM2008. In the method adopted, geostrophic currents for the South Atlantic were computed from the MDTs. We found that the degree and order of the geoid models directly affect the determination of the MDT and the currents. The presence of noise in the MDT requires efficient filtering techniques, such as a filter based on Singular Spectrum Analysis, which offers significant advantages over conventional filters. Geostrophic currents derived from the geoid models were compared with the HYCOM hydrodynamic numerical model. In conclusion, the results show that MDTs and the respective geostrophic currents calculated with the EIGEN-5C and EGM2008 models are similar to the results of the numerical model, especially regarding the main large-scale features such as boundary currents and the retroflection at the Brazil-Malvinas Confluence.
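Geostrophic currents follow from the MDT through the geostrophic balance, u = -(g/f) de/dy and v = (g/f) de/dx, where e is the dynamic topography and f the Coriolis parameter. A minimal sketch with an illustrative gradient value, not a value from this study:

```python
import math

G = 9.81           # gravitational acceleration (m/s^2)
OMEGA = 7.292e-5   # Earth's rotation rate (rad/s)

def geostrophic_velocity(deta_dx, deta_dy, lat_deg):
    """Surface geostrophic velocity (u, v) in m/s from the zonal (x) and
    meridional (y) gradients of the MDT, given in metres per metre."""
    f = 2 * OMEGA * math.sin(math.radians(lat_deg))  # Coriolis parameter
    u = -(G / f) * deta_dy
    v = (G / f) * deta_dx
    return u, v

# Toy meridional MDT slope at 35 S (Brazil-Malvinas Confluence band)
u, v = geostrophic_velocity(deta_dx=0.0, deta_dy=1e-6, lat_deg=-35.0)
```

In practice the gradients would come from finite differences on a filtered MDT grid; the balance itself is the same.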

Relevance: 90.00%

Abstract:

This paper describes a logic-based formalism for qualitative spatial reasoning with cast shadows (Perceptual Qualitative Relations on Shadows, or PQRS) and presents the results of a mobile robot qualitative self-localisation experiment using this formalism. Shadow detection was accomplished by mapping the images from the robot's monocular colour camera into the HSV colour space and then thresholding on the V dimension. We present results of self-localisation using two methods for obtaining the threshold automatically: in one method the images are segmented according to their grey-scale histograms; in the other, the threshold is set according to a prediction about the robot's location, based upon a qualitative spatial reasoning theory about shadows. This theory-driven threshold search and the qualitative self-localisation procedure are the main contributions of the present research. To the best of our knowledge, this is the first work that uses qualitative spatial representations both to perform robot self-localisation and to calibrate a robot's interpretation of its perceptual input.
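One common way to pick a threshold automatically from a grey-scale histogram, as in the first method above, is Otsu's criterion; the abstract does not name the specific segmentation algorithm used, so this choice is an assumption. A compact sketch:

```python
def otsu_threshold(hist):
    """Otsu's method on a 256-bin intensity histogram: choose the
    threshold that maximises the between-class variance."""
    total = sum(hist)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0      # pixel count below (and at) the candidate threshold
    sum0 = 0.0  # intensity mass below the candidate threshold
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Bimodal toy V-channel histogram: shadow pixels near 40, lit near 200
hist = [0] * 256
for v in (38, 40, 42):
    hist[v] = 100
for v in (198, 200, 202):
    hist[v] = 100
t = otsu_threshold(hist)
```

Pixels with V below the returned threshold would then be labelled as shadow candidates.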

Relevance: 90.00%

Abstract:

Radio relics are diffuse synchrotron sources generally located in the peripheries of galaxy clusters in a merging state. According to the current leading scenario, relics trace gigantic cosmological shock waves that cross the intra-cluster medium, where particle acceleration occurs. The relic/shock connection is supported by several observational facts, including the spatial coincidence between relics and shocks found in the X-rays. Under the assumption that particles are accelerated at the shock front and are subsequently deposited and then age downstream of the shock, Markevitch et al. (2005) proposed a method to constrain the magnetic field strength in radio relics. Measuring the thickness of radio relics at different frequencies makes it possible to derive combined constraints on the velocity of the downstream flow and on the magnetic field, which in turn determines particle aging. We elaborate on this idea to infer first constraints on magnetic fields in cluster outskirts. We consider three models of particle aging and develop a geometric model to take into account the contribution to the relic transverse size from the projection of the shock surface onto the plane of the sky. We selected three well-studied radio relics in the clusters A 521, CIZA J2242.8+5301 and 1RXS J0603.3+4214. These relics were chosen primarily because they are seen almost edge-on and because the Mach number of the shock associated with each relic is measured by X-ray observations, which allows us to break the degeneracy between magnetic field and downstream velocity in the method. For the first two clusters, our method is consistent with a pure radiative aging model, allowing us to derive constraints on the relics' magnetic field strength.
In the case of 1RXS J0603.3+4214 we find that particle lifetimes are consistent with a pure radiative aging model under some conditions; however, we also collect evidence for downstream particle re-acceleration in the relic's W-region and for a magnetic field decaying downstream in its E-region. Our estimates of the magnetic field strength in the relics in A 521 and CIZA J2242.8+5301 provide unique information on the field properties in cluster outskirts. The constraints derived for these relics, together with the lower limits on the magnetic field that we derived from the lack of inverse Compton X-ray emission from the sources, have been combined with the constraints from Faraday rotation studies of the Coma cluster. Overall, the results suggest that the spatial profile of the magnetic field energy density is broader than that of the thermal gas, implying that the ε_th/ε_B ratio decreases with cluster radius. Alternatively, radio relics could trace dynamically active regions where the magnetic field strength is biased high with respect to the average value in the cluster volume.
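The aging argument behind the thickness method can be sketched with a standard Miley-type synchrotron age formula; coefficient conventions vary between aging models, and the numbers below are illustrative, not the thesis's fits:

```python
import math

def synchrotron_age_myr(b_uG, nu_ghz, z):
    """Radiative age (Myr) of electrons observed at nu_ghz (GHz) in a field
    b_uG (microgauss), including inverse-Compton losses on the CMB.
    Miley-type parametrisation; the 1590 coefficient is convention-dependent."""
    b_cmb = 3.25 * (1 + z) ** 2  # CMB-equivalent field, microgauss
    return 1590.0 * math.sqrt(b_uG) / (b_uG ** 2 + b_cmb ** 2) \
        / math.sqrt((1 + z) * nu_ghz)

def relic_width_kpc(b_uG, nu_ghz, z, v_down_km_s):
    """Expected downstream width: aging time times downstream advection
    speed (1 km/s is roughly 1.02 kpc per Gyr)."""
    t_gyr = synchrotron_age_myr(b_uG, nu_ghz, z) / 1000.0
    return v_down_km_s * 1.02 * t_gyr

# Illustrative values only: 5 microgauss, 1.4 GHz, z = 0.19
w = relic_width_kpc(5.0, 1.4, 0.19, v_down_km_s=1000.0)
```

Because the width shrinks with observing frequency, measuring it at several frequencies (with the shock Mach number fixing the downstream speed) constrains the magnetic field.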

Relevance: 90.00%

Abstract:

Objective High rates of suicide have been described in HIV-infected patients, but it is unclear to what extent the introduction of highly active antiretroviral therapy (HAART) has affected suicide rates. The authors examined time trends and predictors of suicide in the pre-HAART (1988-1995) and HAART (1996-2008) eras in HIV-infected patients and the general population in Switzerland. Method The authors analyzed data from the Swiss HIV Cohort Study and the Swiss National Cohort, a longitudinal study of mortality in the Swiss general population. The authors calculated standardized mortality ratios comparing HIV-infected patients with the general population and used Poisson regression to identify risk factors for suicide. Results From 1988 to 2008, 15,275 patients were followed in the Swiss HIV Cohort Study for a median duration of 4.7 years. Of these, 150 died by suicide (rate 158.4 per 100,000 person-years). In men, standardized mortality ratios declined from 13.7 (95% CI=11.0-17.0) in the pre-HAART era to 3.5 (95% CI=2.5-4.8) in the late HAART era. In women, ratios declined from 11.6 (95% CI=6.4-20.9) to 5.7 (95% CI=3.2-10.3). In both periods, suicide rates tended to be higher in older patients, in men, in injection drug users, and in patients with an advanced clinical stage of HIV illness. An increase in CD4 cell counts was associated with a reduced risk of suicide. Conclusions Suicide rates decreased significantly with the introduction of HAART, but they remain above the rate observed in the general population, and risk factors for suicide remain similar. HIV-infected patients remain an important target group for suicide prevention.
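The standardized mortality ratio used above is observed deaths divided by the number expected from population rates. A minimal sketch using Byar's approximation for the Poisson 95% CI; the expected count here is hypothetical, not taken from the study:

```python
import math

def smr(observed, expected, z=1.96):
    """Standardized mortality ratio with an approximate 95% CI
    (Byar's approximation to the exact Poisson interval)."""
    ratio = observed / expected
    lo = observed * (1 - 1 / (9 * observed)
                     - z / (3 * math.sqrt(observed))) ** 3 / expected
    o1 = observed + 1
    hi = o1 * (1 - 1 / (9 * o1) + z / (3 * math.sqrt(o1))) ** 3 / expected
    return ratio, (lo, hi)

# Hypothetical: 150 suicides observed vs. 20 expected from population rates
ratio, (lo, hi) = smr(150, 20.0)
```

Risk-factor analysis, as in the abstract, would then model the observed counts with Poisson regression using the expected counts as an offset.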

Relevance: 90.00%

Abstract:

BACKGROUND: Cytotoxic cells are involved in most forms of drug-induced skin diseases. Until now, no in vitro test has addressed this aspect of drug-allergic responses. Our report evaluates whether drug-induced cytotoxic cells can be detected in peripheral blood of nonacute patients with different forms of drug hypersensitivity, and also whether in vitro detection of these cells could be helpful in drug-allergy diagnosis. METHODS: GranzymeB enzyme-linked immunosorbent spot-forming assay (ELISPOT) and cell surface expression of the degranulation marker CD107a were evaluated on peripheral blood mononuclear cells from 12 drug-allergic patients in the remission state and 16 drug-exposed healthy controls. RESULTS: In 10/12 allergic patients, the culprit drug, but not irrelevant drugs, elicited granzymeB release after 48-72 h of stimulation. The release was clearly positive in patients with a high proliferative response to the drug, measured in lymphocyte transformation tests. In patients who showed moderate or low proliferation and a low drug response in the granzymeB ELISPOT, overnight preincubation with interleukin (IL)-7/IL-15 enhanced drug-specific granzymeB release and made it possible to clearly identify the offending agent. CD107a staining was positive on CD4+/CD3+ and CD8+/CD3+ T cells as well as CD56+/CD3- natural killer cells. None of the drug-exposed healthy donors reacted to the tested drugs, and allergic patients reacted only to the offending, not to tolerated, drugs. CONCLUSION: GranzymeB ELISPOT is a highly specific in vitro method to detect drug-reacting cytotoxic cells in peripheral blood of drug-allergic patients even several years after disease manifestation. Together with IL-7/IL-15 preincubation, it may be helpful in identifying the offending drug even in some patients with a weak proliferative drug response.

Relevance: 90.00%

Abstract:

High overexpression of somatostatin receptors in neuroendocrine tumors allows imaging and radiotherapy with radiolabeled somatostatin analogues. To ascertain whether a tumor is suitable for in vivo somatostatin receptor targeting, its somatostatin receptor expression has to be determined. There are specific indications for the use of immunohistochemistry for the somatostatin receptor subtype 2A, but this has until now been limited by the lack of an adequate, reliable antibody. The aim of this study was to correlate immunohistochemistry using the new monoclonal anti-somatostatin receptor subtype 2A antibody UMB-1 with the gold-standard in vitro method quantifying somatostatin receptor levels in tumor tissues. A UMB-1 immunohistochemistry protocol was developed, and tumoral UMB-1 staining levels were compared with somatostatin receptor binding site levels quantified with in vitro I-[Tyr]-octreotide autoradiography in 89 tumors. This allowed us to define an immunohistochemical staining threshold that distinguishes tumors with somatostatin receptor levels high enough for clinical applications from those with low receptor expression. The presence of >10% positive tumor cells correctly predicted high receptor levels in 95% of cases. In contrast, absence of UMB-1 staining truly reflected low or undetectable somatostatin receptor expression in 96% of tumors. If 1% to 10% of tumor cells were stained, a weak staining intensity was suggestive of low somatostatin receptor levels. This study allows for the first time a reliable recommendation on the eligibility of an individual patient for in vivo somatostatin receptor targeting based on somatostatin receptor immunohistochemistry. Under optimal methodological conditions, UMB-1 immunohistochemistry may be equivalent to in vitro receptor autoradiography.
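The 95% and 96% figures above are, in effect, the positive and negative predictive values of the IHC cutoff against the autoradiography gold standard. A minimal sketch with hypothetical counts chosen only to reproduce those percentages, not the study's actual 2x2 table:

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive value of a diagnostic cutoff:
    tp/fp = test-positive with/without the condition,
    tn/fn = test-negative without/with the condition."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical counts consistent with ~95% PPV and ~96% NPV
ppv, npv = predictive_values(tp=38, fp=2, tn=24, fn=1)
```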

Relevance: 90.00%

Abstract:

BACKGROUND: We hypothesized that certain patient characteristics have different effects on the risk of early stem loosening in total hip arthroplasty (THA). We therefore conducted a case-control study using register-database records with the aim of identifying patient-specific risk factors associated with radiographic signs of aseptic loosening of the femoral component in THA. METHOD: Data were derived from a multinational European registry and were collected over a period of 25 years. 725 cases with radiographic signs of stem loosening were identified and matched to 4,310 controls without any signs of loosening. Matching criteria were type of implant, size of head, date of operation, center of primary intervention, and follow-up time. The risk factors analyzed were age at operation, sex, diagnosis, previous ipsilateral operations, height, weight, body mass index, and mobility based on the Charnley classification. RESULTS: Women showed a significantly lower risk of radiographic loosening than men (odds ratio (OR) 0.64). Age was also a strong factor: the risk decreased by 1.8% for each additional year of age at the time of surgery. Height and weight were not associated with risk of loosening. A higher body mass index, however, significantly increased the risk of stem loosening (OR 1.03 per additional unit of BMI). Charnley class B, indicating restricted mobility, was associated with a lower risk of loosening (OR 0.78). INTERPRETATION: An increased activity level, as seen in younger patients and those with unrestricted mobility, is an important factor in the etiology of stem loosening. If combined with a high BMI, the risk of stem loosening within 10 years is even higher. A younger person should not be denied the benefits of a total hip arthroplasty but must accept that the risk of future failure is increased.
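Effects like "1.8% lower risk per year of age" and "OR 1.03 per BMI unit" come from a logistic model, where a per-unit odds ratio scales multiplicatively over several units. A small sketch of that arithmetic:

```python
import math

def or_over(or_per_unit, units):
    """Scale a per-unit odds ratio from a logistic model over several
    units: OR(k units) = exp(k * log(OR per unit))."""
    return math.exp(units * math.log(or_per_unit))

# 1.8% lower odds per year of age corresponds to a per-year OR of ~0.982
or_10y = or_over(0.982, 10)   # effect of a decade of age
or_bmi5 = or_over(1.03, 5)    # effect of five additional BMI units
```

So ten extra years of age at surgery roughly multiply the odds of loosening by 0.98^10 (about 0.83), while five extra BMI units multiply them by about 1.16.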

Relevance: 90.00%

Abstract:

BACKGROUND: The most prevalent drug hypersensitivity reactions are T-cell mediated. The only established in vitro test for detecting T-cell sensitization to drugs is the lymphocyte transformation test, which is of limited practicability. To find an alternative in vitro method to detect drug-sensitized T cells, we screened the in vitro secretion of 17 cytokines/chemokines by peripheral blood mononuclear cells (PBMC) of patients with well-documented drug allergies, in order to identify the most promising cytokines/chemokines for detection of T-cell sensitization to drugs. METHODS: Peripheral blood mononuclear cells of 10 patients, five allergic to beta-lactams and five to sulfonamides, and of five healthy controls were incubated for 3 days with the drug antigen. Cytokine concentrations were measured in the supernatants using commercially available 17-plex bead-based immunoassay kits. RESULTS: Among the 17 cytokines/chemokines analysed, interleukin-2 (IL-2), IL-5, IL-13 and interferon-gamma (IFN-gamma) secretion in response to the drugs was significantly increased in patients when compared with healthy controls. No difference in cytokine secretion patterns between sulfonamide- and beta-lactam-reactive PBMC could be observed. The secretion of other cytokines/chemokines showed high variability among patients. CONCLUSION: The measurement of IL-2, IL-5, IL-13 or IFN-gamma, or a combination thereof, might be a useful in vitro tool for detection of T-cell sensitization to drugs. Secretion of these cytokines seems independent of the type of drug antigen and the phenotype of the drug reaction. A study including a higher number of patients and controls will be needed to determine the exact sensitivity and specificity of this test.
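Comparing cytokine concentrations between small patient and control groups is typically done with a rank-based test. A minimal sketch of the Mann-Whitney U statistic; the concentrations are toy values, and the abstract does not state which test was used, so the choice of test here is an assumption:

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U for group_a vs. group_b: the number of
    (a, b) pairs with a > b, counting ties as one half."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

# Toy IL-5 concentrations (pg/ml) after drug stimulation
patients = [120, 85, 240, 60, 150]
controls = [10, 25, 5, 30, 15]
u = mann_whitney_u(patients, controls)  # maximum possible here is 5 * 5 = 25
```

A U at (or near) the maximum, as in this toy case, indicates complete separation of the two groups; the p-value would then be read from the exact U distribution for the two sample sizes.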

Relevance: 90.00%

Abstract:

This article introduces the emic–etic debate in the scientific study of religion\s and provides a frame for the special issue's six articles on the topic. Departing from the broader debate's early history in the 1960s, this article contextualizes the emic–etic debate and locates its point of entry into the scientific study of religion\s in the 1980s. This article argues that in the course of the debate the insider–outsider and emic–etic complexes have become entangled. In order to facilitate an understanding of the debate, this article maintains that the emic–etic debate in the scientific study of religion\s touches upon three central dimensions (existential–political, methodological, and epistemological). In order to move toward a clearer methodological and epistemological framework, this article furthermore proposes an iterative model that locates insider–outsider at the level of observers and emic–etic at the level of categories.

Relevance: 90.00%

Abstract:

Background: The efficacy of cognitive behavioral therapy (CBT) for the treatment of depressive disorders has been demonstrated in many randomized controlled trials (RCTs). This study investigated whether similar effects can be expected for CBT under routine care conditions when the patients are comparable to those examined in RCTs. Method: N=574 CBT patients from an outpatient clinic were stepwise matched to the patients undergoing CBT in the National Institute of Mental Health Treatment of Depression Collaborative Research Program (TDCRP). First, the exclusion criteria of the RCT were applied to the naturalistic sample of the outpatient clinic. Second, propensity score matching (PSM) was used to adjust the remaining naturalistic sample on the basis of baseline covariate distributions. The matched samples were then compared regarding treatment effects using effect sizes, the average treatment effect on the treated (ATT), and recovery rates. Results: CBT in the adjusted naturalistic subsample was as effective as in the RCT. However, treatments lasted significantly longer under routine care conditions. Limitations: The samples included only a limited number of common predictor variables and stemmed from different countries. There might be additional covariates that could further improve the matching between the samples. Conclusions: CBT for depression in clinical practice may be as effective as manual-based treatments in RCTs when applied to comparable patients. The fact that similar effects were reached under routine conditions with more sessions, however, points to the potential to optimize treatments in clinical practice with respect to their efficiency.
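Propensity score matching, as used in the second step above, typically pairs each treated unit with the nearest control on the estimated propensity score. A minimal greedy 1:1 sketch; the scores and caliper are illustrative, and the paper's exact matching algorithm is not specified here:

```python
def nearest_neighbor_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score,
    without replacement, restricted to matches within a caliper.
    Returns (treated_index, control_index) pairs."""
    available = dict(enumerate(controls))
    pairs = []
    # Process treated units in score order for determinism
    for ti, ps in sorted(enumerate(treated), key=lambda p: p[1]):
        if not available:
            break
        ci, cps = min(available.items(), key=lambda kv: abs(kv[1] - ps))
        if abs(cps - ps) <= caliper:
            pairs.append((ti, ci))
            del available[ci]  # matching without replacement
    return pairs

# Toy propensity scores: RCT-comparable patients vs. naturalistic sample
treated = [0.30, 0.55, 0.80]
controls = [0.28, 0.33, 0.52, 0.90]
pairs = nearest_neighbor_match(treated, controls)
```

Treated units with no control inside the caliper (here the one at 0.80) remain unmatched, which is the usual price of enforcing covariate balance.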

Relevance: 90.00%

Abstract:

Objectives. The purpose of this paper is to conduct a literature review of research relating to foodborne illness, food inspection policy, and restaurants in the United States. Aim 1: To convey the public health importance of studying restaurant food inspection policies and suggest that more research is needed in this field. Aim 2: To conduct a systematic review of recent literature pertaining to this subject such that future researchers can understand: (1) public perception and expectations of restaurant food inspection policies; (2) arguments in favor of a grade card policy; and, conversely, (3) reasons why inspection policies may not work. Data/methods. This paper uses a systematic review format to review articles relating to food inspections and restaurants in the U.S. Eight articles were reviewed. Results. The resulting data from the literature provide no conclusive answer as to how, when, and in what manner inspection policies should be carried out. The authors do, however, put forward varying solutions to the problem of foodborne illness outbreaks in restaurants. These solutions include the implementation of grade cards in restaurants and, conversely, a complete overhaul of the inspection policy system. Discussion. The literature on foodborne disease, food inspection policy, and restaurants in the U.S. is limited and varied. From the research that is available, we can see that two schools of thought exist. The first calls for the implementation of a grade card system, while the second proposes a reassessment and possible overhaul of the food inspection policy system. It is still unclear which of these methods would best slow the increase in foodborne disease transmission in the U.S. Conclusion. In order to arrive at solutions to the problem of foodborne disease transmission as it relates to restaurants in this country, we may need to look at literature from other countries and, subsequently, begin incremental changes in the way inspection policies are developed and enforced.

Relevance: 90.00%

Abstract:

Abstract Web 2.0 applications enabled users to classify information resources using their own vocabularies. The bottom-up nature of these user-generated classification systems has turned them into interesting knowledge sources, since they provide a rich terminology generated by potentially large user communities. Previous research has shown that it is possible to elicit some emergent semantics from the aggregation of individual classifications in these systems. However, the generation of ontologies from them is still an open research problem. In this thesis we address the problem of how to tap into user-generated classification systems for building domain ontologies. Our objective is to design a method to develop domain ontologies from user-generated classification systems. To do so, we rely on ontologies in the Web of Data to formalize the semantics of the knowledge collected from the classification system. Current ontology development methodologies have recognized the importance of reusing knowledge from existing resources. Thus, our work is framed within the NeOn methodology scenario for building ontologies by reusing and reengineering non-ontological resources. The main contributions of this work are: an integrated method to develop ontologies from user-generated classification systems, with which we extract a domain terminology from the classification system and then formalize the semantics of this terminology by reusing ontologies in the Web of Data; the identification and adaptation of existing techniques for implementing the activities in the method, so that they can fulfill the requirements of each activity; and a novel study of emergent semantics in user-generated lists.
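One elementary signal that emergent semantics can be elicited from user-generated classifications is tag co-occurrence: tags that frequently appear together on the same resources hint at semantic relations worth formalising against ontologies in the Web of Data. A toy sketch of that aggregation step; this is an illustration, not the thesis's actual extraction technique:

```python
from collections import Counter
from itertools import combinations

def cooccurrence(taggings, min_count=2):
    """Pairwise tag co-occurrence counts over a set of tagged resources.
    Pairs occurring at least min_count times are kept as candidate
    semantic relations for later formalisation."""
    pairs = Counter()
    for tags in taggings:
        # sorted/set: canonical pair order, no duplicate tags per resource
        for a, b in combinations(sorted(set(tags)), 2):
            pairs[(a, b)] += 1
    return {p: c for p, c in pairs.items() if c >= min_count}

# Toy taggings of three resources by different users
taggings = [
    ["python", "programming", "tutorial"],
    ["python", "programming", "web"],
    ["cooking", "recipe"],
]
strong = cooccurrence(taggings)
```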

Relevance: 90.00%

Abstract:

How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition, the reader will already know that the answer is "with difficulty" or "not at all". In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a list of desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.
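A "reproducibility map" of the kind described above can be thought of as a table of workflow steps against the minimum expertise each step demands. A toy sketch; the step names and expertise levels are invented for illustration and are not taken from the paper:

```python
def reproducible_steps(step_requirements, user_level):
    """Workflow steps a given user class could reproduce, where each step
    carries a minimum expertise level (0 = novice ... 2 = domain expert)."""
    return [name for name, level in step_requirements if level <= user_level]

# Hypothetical workflow with per-step expertise requirements
steps = [("fetch data", 0), ("run pipeline", 1), ("tune parameters", 2)]
novice = reproducible_steps(steps, 0)
expert = reproducible_steps(steps, 2)
```

Attaching an estimated time to each step then yields the cost-versus-benefit picture the paper discusses: how much effort each user class must spend, and where they get stuck.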