Abstract:
BACKGROUND AND STUDY AIMS: The current gold standard in Barrett's esophagus monitoring consists of four-quadrant biopsies every 1-2 cm in accordance with the Seattle protocol. Adding brush cytology processed by digital image cytometry (DICM) may further increase the detection of patients with Barrett's esophagus who are at risk of neoplasia. The aim of the present study was to assess the additional diagnostic value and accuracy of DICM when added to the standard histological analysis in a cross-sectional multicenter study of patients with Barrett's esophagus in Switzerland. METHODS: One hundred sixty-four patients with Barrett's esophagus underwent 239 endoscopies with biopsy and brush cytology. DICM was carried out on 239 cytology specimens. Measures of the test accuracy of DICM (relative risk, sensitivity, specificity, likelihood ratios) were obtained by dichotomizing the histopathology results (high-grade dysplasia or adenocarcinoma vs. all others) and DICM results (aneuploidy/intermediate pattern vs. diploidy). RESULTS: DICM revealed diploidy in 83% of 239 endoscopies, an intermediate pattern in 8.8%, and aneuploidy in 8.4%. An intermediate DICM result carried a relative risk (RR) of 12, and aneuploidy an RR of 27, for high-grade dysplasia/adenocarcinoma. When DICM was added to the standard biopsy protocol, a pathological cytometry result (aneuploid or intermediate) was found in 25 of 239 endoscopies (11%; 18 patients) with low-risk histology (no high-grade dysplasia or adenocarcinoma). During follow-up of 14 of these 18 patients, histological deterioration was seen in 3 (21%). CONCLUSION: DICM from brush cytology may add important information to a standard biopsy protocol by identifying a subgroup of Barrett's esophagus patients with high-risk cellular abnormalities.
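The accuracy measures named above (sensitivity, specificity, likelihood ratios, relative risk) all follow from the dichotomized 2x2 table described in the methods. A minimal sketch of that arithmetic, using purely illustrative counts rather than the study data, might look like this:

```python
# Hypothetical 2x2 table: DICM result (aneuploid/intermediate vs. diploid)
# against dichotomized histopathology (HGD/adenocarcinoma vs. all others).
# The counts below are illustrative only, not the study data.
tp, fp = 10, 31    # pathological DICM result, with / without HGD or adenocarcinoma
fn, tn = 4, 194    # diploid DICM result, with / without HGD or adenocarcinoma

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
lr_positive = sensitivity / (1 - specificity)
lr_negative = (1 - sensitivity) / specificity

# Relative risk of HGD/adenocarcinoma given a pathological vs. a diploid DICM result.
risk_pathological = tp / (tp + fp)
risk_diploid = fn / (fn + tn)
relative_risk = risk_pathological / risk_diploid

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
print(f"LR+={lr_positive:.1f} LR-={lr_negative:.2f} RR={relative_risk:.1f}")
```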
Abstract:
The involvement of μ-opioid receptors in different behavioral responses elicited by nicotine was explored by using μ-opioid receptor knock-out mice. The acute antinociceptive responses induced by nicotine in the tail-immersion and hot-plate tests were reduced in the mutant mice, whereas no difference between genotypes was observed in the locomotor responses. The rewarding effects induced by nicotine were then investigated using the conditioning place-preference paradigm. Nicotine produced rewarding responses in wild-type mice but failed to produce place preference in knock-out mice, indicating the inability of this drug to induce rewarding effects in the absence of μ-opioid receptors. Finally, the somatic expression of the nicotine withdrawal syndrome, precipitated in dependent mice by the injection of mecamylamine, was evaluated. Nicotine withdrawal was significantly attenuated in knock-out mutants when compared with wild-type mice. In summary, the present results show that μ-opioid receptors are involved in the rewarding responses induced by nicotine and participate in its antinociceptive responses and the expression of nicotine physical dependence.
Abstract:
The functional interactions between the endogenous cannabinoid and opioid systems were evaluated in pre-proenkephalin-deficient mice. Antinociception induced in the tail-immersion test by acute Δ9-tetrahydrocannabinol was reduced in mutant mice, whereas no difference between genotypes was observed in the effects induced on body temperature, locomotion, or ring catalepsy. During a chronic treatment with Δ9-tetrahydrocannabinol, the development of tolerance to the analgesic responses induced by this compound was slower in mice lacking enkephalin. In addition, cannabinoid withdrawal syndrome, precipitated in Δ9-tetrahydrocannabinol-dependent mice by the injection of SR141716A, was significantly attenuated in mutant mice. These results indicate that the endogenous enkephalinergic system is involved in the antinociceptive responses of Δ9-tetrahydrocannabinol and participates in the expression of cannabinoid abstinence.
Abstract:
Mesoamerica, defined as the broad linguistic and cultural area from middle southern Mexico to Costa Rica, might have played a pivotal role during the colonization of the American continent. It has been suggested that the Mesoamerican isthmus could have played an important role in severely restricting prehistoric gene flow between North and South America. Although the Native American component has already been described in admixed Mexican populations, few studies have been carried out in native Mexican populations. In this study we present mitochondrial DNA (mtDNA) sequence data for the first hypervariable region (HVR-I) in 477 unrelated individuals belonging to eleven different native populations from Mexico. Almost all the Native Mexican mtDNAs could be classified into the four pan-Amerindian haplogroups (A2, B2, C1 and D1); only three of them could be allocated to the rare Native American lineage D4h3. Their haplogroup phylogenies are clearly star-like, as expected from relatively young populations that have experienced diverse episodes of genetic drift (e.g. extensive isolation, genetic drift and founder effects) and subsequent population expansions. In agreement with this observation is the fact that Native Mexican populations show a high degree of heterogeneity in their patterns of haplogroup frequencies. Haplogroup X2a was absent from our samples, supporting previous observations that this clade has only been detected in the northernmost areas of the Americas. The search for identical sequences in the American continent shows that, although Native Mexican populations seem to show a closer relationship to North American populations, they cannot be related to a single geographical region within the continent. Finally, we did not find significant population structure in the maternal lineages when considering the four main and distinct linguistic groups represented in our Mexican samples (Oto-Manguean, Uto-Aztecan, Tarascan, and Mayan), suggesting that genetic divergence predates linguistic diversification in Mexico.
Abstract:
Our objective was to describe the interventions aimed at preventing a recurrent hip fracture, and other injurious falls, that were provided during hospitalization for a first hip fracture and during the two following years. A secondary objective was to study some potential determinants of these preventive interventions. The design of the study was an observational, two-year follow-up of patients hospitalized for a first hip fracture at the University Hospital of Lausanne, Switzerland. The participants were 163 patients (median age 82 years, 83% women) hospitalized in 1991 for a first hip fracture, among 263 consecutively admitted patients (84 did not meet inclusion criteria, e.g., age > 50, no cancer, no high-energy trauma, and 16 refused to participate). Preventive interventions included: medical investigations performed during the first hospitalization and aimed at revealing modifiable pathologies that raise the risk of injurious falls; use of medications acting on the risk of falls and fractures; preventive recommendations given by medical staff; suppression of environmental hazards; and use of home assistance services. The information was obtained from a baseline questionnaire, the medical record completed during the index hospitalization, and an interview conducted 2 years after the fracture. Potential predictors of the use of preventive interventions were: age; gender; destination after discharge from hospital; comorbidity; cognitive functioning; and activities of daily living. Bivariate and multivariate associations between the preventive interventions and the potential predictors were measured. In-hospital investigations to rule out medical pathologies raising the risk of fracture were performed in only 20 patients (12%). Drugs raising the risk of falls were reduced in only 17 patients (16%). Preventive procedures not requiring active collaboration by the patient (e.g., modifications of the environment) were applied in 68 patients (42%), and home assistance was provided to 67 patients (85% of the patients living at home). Bivariate analyses indicated that prevention was less often provided to patients in poor general condition, but this association was not confirmed in multivariate analyses. In conclusion, this study indicates that, in the study setting, measures aimed at preventing recurrent falls and injuries were rarely provided to patients hospitalized for a first hip fracture at the time of the study. Tertiary prevention could be improved if a comprehensive geriatric assessment were systematically provided to elderly patients hospitalized for a first hip fracture, and passive preventive measures implemented.
Abstract:
OBJECTIVE: To study delayed failure after subthalamic nucleus (STN) deep brain stimulation in Parkinson's disease (PD) patients. METHODS: Out of 56 consecutive bilaterally STN-implanted PD patients, we selected subjects who, after initial clinical improvement (1 month after surgery), lost benefit (delayed failure, DF). RESULTS: Five patients subacutely developed severe gait disorders (DF). In 4/5 DF patients, a micro-lesion effect, defined as improvement without stimulation, was observed; immediate post-operative MRI demonstrated electrodes located above or behind the STN. CONCLUSIONS: Patients presenting a micro-lesion effect should be carefully monitored, as this phenomenon can mask electrode misplacement and evolution to DF.
Abstract:
The objective of this study was to evaluate the efficiency and the effects of changes in parameters of chronic amygdala-hippocampal deep brain stimulation (AH-DBS) in mesial temporal lobe epilepsy (TLE). Eight pharmacoresistant patients, not candidates for ablative surgery, received chronic AH-DBS (130 Hz, follow-up 12-24 months): two patients with hippocampal sclerosis (HS) and six patients with non-lesional mesial TLE (NLES). The effects of stepwise increases in intensity (0 [off] to 2 V) and of stimulation configuration (quadripolar and bipolar) on seizure frequency and neuropsychological performance were studied. The two HS patients obtained a significant decrease (65-75%) in seizure frequency with high-voltage bipolar DBS (≥1 V) or with quadripolar stimulation. Two of the six NLES patients became seizure-free, one of them without stimulation, suggesting a microlesional effect. Two NLES patients experienced reductions in seizure frequency (65-70%), whereas the remaining two showed no significant seizure reduction. Neuropsychological evaluations showed reversible memory impairments in two patients under strong stimulation only. AH-DBS showed long-term efficiency in most of the TLE patients. It is a valuable treatment option for patients who suffer from drug-resistant epilepsy and who are not candidates for resective surgery. The effects of changes in the stimulation parameters suggest that a large zone of stimulation would be required in HS patients, whereas a limited zone of stimulation or even a microlesional effect could be sufficient in NLES patients, for whom the importance of the proximity of the electrode to the epileptogenic zone remains to be studied. Further studies are required to ascertain these latter observations.
Abstract:
BACKGROUND: The clinical profile and outcome of nosocomial and non-nosocomial health care-associated native valve endocarditis are not well defined. OBJECTIVE: To compare the characteristics and outcomes of community-associated and nosocomial and non-nosocomial health care-associated native valve endocarditis. DESIGN: Prospective cohort study. SETTING: 61 hospitals in 28 countries. PATIENTS: Patients with definite native valve endocarditis and no history of injection drug use who were enrolled in the ICE-PCS (International Collaboration on Endocarditis Prospective Cohort Study) from June 2000 to August 2005. MEASUREMENTS: Clinical and echocardiographic findings, microbiology, complications, and mortality. RESULTS: Health care-associated native valve endocarditis was present in 557 (34%) of 1622 patients (303 with nosocomial infection [54%] and 254 with non-nosocomial infection [46%]). Staphylococcus aureus was the most common cause of health care-associated infection (nosocomial, 47%; non-nosocomial, 42%; P = 0.30); a high proportion of patients had methicillin-resistant S. aureus (nosocomial, 57%; non-nosocomial, 41%; P = 0.014). Fewer patients with health care-associated native valve endocarditis had cardiac surgery (41% vs. 51% of community-associated cases; P < 0.001), but more of the former patients died (25% vs. 13%; P < 0.001). Multivariable analysis confirmed greater mortality associated with health care-associated native valve endocarditis (incidence risk ratio, 1.28 [95% CI, 1.02 to 1.59]). LIMITATIONS: Patients were treated at hospitals with cardiac surgery programs. The results may not be generalizable to patients receiving care in other types of facilities or to those with prosthetic valves or past injection drug use. CONCLUSION: More than one third of cases of native valve endocarditis in non-injection drug users involve contact with health care, and non-nosocomial infection is common, especially in the United States. Clinicians should recognize that outpatients with extensive out-of-hospital health care contacts who develop endocarditis have clinical characteristics and outcomes similar to those of patients with nosocomial infection. PRIMARY FUNDING SOURCE: None.
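The unadjusted mortality contrast reported above (25% vs. 13%) is itself a risk ratio; the 1.28 figure is the adjusted multivariable estimate. A minimal sketch of how a crude risk ratio and its 95% CI are obtained from counts (approximate counts back-calculated from the percentages above, for illustration only, not the ICE-PCS model) is:

```python
import math

# Approximate, illustrative counts: deaths / total in each exposure group.
deaths_hca, n_hca = 139, 557     # health care-associated native valve endocarditis
deaths_ca, n_ca = 138, 1065      # community-associated native valve endocarditis

risk_hca = deaths_hca / n_hca
risk_ca = deaths_ca / n_ca
rr = risk_hca / risk_ca

# Large-sample 95% CI on the log scale (Katz method).
se_log_rr = math.sqrt(1/deaths_hca - 1/n_hca + 1/deaths_ca - 1/n_ca)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"crude RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The crude ratio from such counts is larger than the reported 1.28 precisely because the published estimate is adjusted for other covariates in the multivariable model.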
Abstract:
INTRODUCTION: Lumbar spinal stenosis (LSS) treatment is based primarily on clinical criteria, provided that imaging confirms radiological stenosis. The radiological measurement most commonly used is the dural sac cross-sectional area (DSCA). It has recently been shown that grading stenosis based on the morphology of the dural sac, as seen on axial T2 MRI images, reflects the severity of stenosis better than DSCA and is of prognostic value. This prospective radiological study investigates the variability of surface measurements and morphological grading of stenosis for varying degrees of angulation of the T2 axial images relative to the disc space, as observed in clinical practice. MATERIALS AND METHODS: Lumbar spine TSE T2 three-dimensional (3D) MRI sequences were obtained from 32 consecutive patients presenting with either suspected spinal stenosis or low back pain. Axial reconstructions using the OsiriX software at 0°, 10°, 20° and 30° relative to the disc space orientation were obtained for a total of 97 levels. For each level, DSCA was digitally measured and stenosis was graded according to the 4-point (A-D) morphological grading by two observers. RESULTS: Good interobserver agreement was found in the grading of stenosis (k = 0.71). DSCA varied significantly as the slice orientation increased from 0° to +10°, +20° and +30° at each level examined (P < 0.0001) (-15 to +32% at 10°, -24 to +143% at 20° and -29 to +231% at 30° of slice orientation). Stenosis classification based on the surface measurements changed in 39 of the 97 levels studied, whereas the morphology grade was modified in only two levels (P < 0.01). DISCUSSION: The need to obtain contiguous slices using the classical 2D MRI acquisition technique often entails at least a 10° slice inclination relative to one of the studied discs. Even at this low angulation, we found a statistically significant difference between changes in surface measurements and changes in morphological grading. In clinical practice, given the above findings, it might therefore not be necessary to align the axial cuts to each individual disc level, which could be more time-consuming than obtaining a single series of axial cuts perpendicular to the middle of the lumbar spine or to the most stenotic level. In conclusion, morphological grading seems to offer an alternative means of assessing the severity of spinal stenosis that is little affected by the image acquisition technique.
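The two quantities driving the results above are the relative change in DSCA with slice angulation and the interobserver kappa for the A-D grade. A minimal sketch of both calculations, with hypothetical measurements and hypothetical helper names (not the study data or software), might be:

```python
# Illustrative sketch of the two analyses described above:
# (1) percent change in DSCA between the 0° and an angulated reconstruction,
# (2) unweighted Cohen's kappa for interobserver agreement on the A-D grade.
# All values are hypothetical, not the study measurements.
from collections import Counter

def percent_change(dsca_0deg_mm2: float, dsca_angled_mm2: float) -> float:
    """Relative change (%) of the dural sac cross-sectional area vs. the 0° slice."""
    return 100.0 * (dsca_angled_mm2 - dsca_0deg_mm2) / dsca_0deg_mm2

def cohens_kappa(rater1: list[str], rater2: list[str]) -> float:
    """Unweighted Cohen's kappa for two raters grading the same levels A-D."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    freq1, freq2 = Counter(rater1), Counter(rater2)
    expected = sum(freq1[g] * freq2[g] for g in set(freq1) | set(freq2)) / n**2
    return (observed - expected) / (1 - expected)

print(percent_change(120.0, 95.0))                        # e.g. about -21% at 20°
print(cohens_kappa(list("AABCCDBB"), list("AABCDDBB")))   # e.g. about 0.83
```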
Abstract:
A detailed carbon-isotope stratigraphic study of the uppermost Pliensbachian to lowermost Aalenian interval in the Median Subbetic palaeogeographic domain (External Zones of the Betic Cordillera, southern Spain) has been carried out. During the Early Jurassic, the Median Subbetic, which represents a typical basin of the Hispanic Corridor connecting the Tethys and the Eastern Pacific, was located in the westernmost Tethys. The analyzed sections encompass the entire Toarcian stage as represented on the southern Iberian palaeomargin. The rocks are mainly rhythmic sequences of grey marls and marly limestones containing a rich ammonite fauna, nannofossils, and benthic foraminifers, all of which provide accurate biostratigraphic control. The lower and upper Toarcian boundaries are well represented in some of these sections, which therefore constitute optimal sites to link the carbon-isotope curves to ammonite zones and to nannofossil events. δ13C values of bulk carbonates from the different localities of the Subbetic basin show similar variations from the uppermost Pliensbachian to the lowermost Aalenian, suggesting changes in the original carbon-isotope composition of dissolved inorganic carbon (DIC) along the Hispanic Corridor. The Pliensbachian-Toarcian transition is marked by δ13C values increasing from ~1.2 to 2.0‰, interrupted in the Serpentinum Zone by a negative shift concomitant with the Toarcian oceanic anoxic event (T-OAE), with the major ammonite extinction event of the Toarcian, and with an important turnover of calcareous nannoplankton. The negative shift observed in the Serpentinum Zone confirms the global perturbation of the carbon cycle documented along the Tethys and the palaeo-Pacific in organic material and in marine carbonates. However, the amplitude of the negative excursion (~ -1.5‰) is not compatible with isotopically homogeneous seawater DIC and/or atmospheric CO2 reservoirs. From the middle to the top of the Toarcian, δ13C shows relatively constant values and minor ammonite turnovers, and is associated with increasing diversity of calcareous nannoplankton. (c) 2012 Elsevier B.V. All rights reserved.
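For reference, the δ13C notation used throughout this abstract is the conventional per-mil deviation of the sample's 13C/12C ratio from the V-PDB standard:

$$
\delta^{13}\mathrm{C} = \left( \frac{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\text{sample}}}{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\text{V-PDB}}} - 1 \right) \times 1000\ \text{‰}
$$

A negative excursion therefore records a relative enrichment in 12C of the carbonate (or of the DIC from which it precipitated), as expected during the T-OAE.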
Abstract:
BACKGROUND: The epidemiology of chest pain differs strongly between outpatient and emergency settings. In general practice, the most frequent cause is chest wall pain. However, there is a lack of information about the characteristics of this syndrome. The aim of the study was to describe the clinical aspects of chest wall syndrome (CWS). METHODS: Prospective, observational, cohort study of patients with undifferentiated chest pain attending 58 private practices over a five-week period from March to May 2001. During a one-year follow-up, questionnaires including a detailed history and physical examination were filled out at the initial consultation and at 3 and 12 months. The outcomes were the clinical characteristics associated with the CWS diagnosis and the clinical evolution of the syndrome. RESULTS: Among 24,620 consultations, we observed 672 cases of chest pain, and 300 patients (44.6%) had a diagnosis of chest wall syndrome. It affected all ages, with a sex ratio of 1:1. History and tenderness on palpation were the keys to diagnosis. Pain was generally moderate, well localised, continuous or intermittent over a number of hours to days or weeks, and amplified by position or movement; it could, however, be acute. Eighty-eight patients were affected at several painful sites and 210 patients at a single site, most frequently in the midline or at a left-sided site. Pain was a cause of anxiety and cardiac concern, especially when acute. CWS coexisted with coronary disease in 19 patients and with neoplasm in 6. Outcome at one year was favourable, even though CWS recurred in half of the patients. CONCLUSION: CWS is common and benign, but it causes anxiety and recurs frequently. Because the majority of chest wall pain is left-sided, the possibility of coexistence with coronary disease needs careful consideration.
Abstract:
Anorectal malformations (ARMs) are a complex group of congenital anomalies involving the distal anus and rectum, as well as the urinary and genital tracts in a significant number of cases. Most ARMs result from abnormal development of the urorectal septum in early fetal life. In most cases, the anus is not perforated and the distal enteric component ends blindly (atresia) or as a fistula into the urinary tract, genital tract, or perineum. ARMs are also present in a great number of syndromes and associations of congenital anomalies. The classification of ARMs is mainly based on the position of the rectal pouch relative to the puborectal sling, the presence or absence of fistulas, and the types and locations of the fistulas. All of this information is crucial in determining the most appropriate surgical approach for each case. Imaging studies play a key role in evaluation and classification of ARMs. In neonates, clinical and radiologic examinations in the first 3 days of life help determine the type of ARM and the need for early colostomy. In older children, preoperative pelvic magnetic resonance imaging is the most efficient diagnostic method for evaluating the size, morphology, and grade of development of the sphincteric musculature.
Global mass wasting during the Middle Ordovician: Meteoritic trigger or plate-tectonic environment?
Abstract:
Mass wasting at continental margins on a global scale during the Middle Ordovician has recently been related to a high meteorite influx. Although a high meteorite influx during the Ordovician should not be neglected, we challenge the idea that mass wasting was mainly produced by meteorite impacts over a period of almost 10 Ma. Because there are strong arguments against the impact-related hypothesis, we propose an alternative explanation based on a re-evaluation of the mass wasting sites, considering their plate-tectonic distribution and the global sea-level curve. A striking and important feature is the distribution of most of the mass wasting sites along continental margins characterised by periods of magmatism, terrane accretion, and continental or back-arc rifting related to subduction of oceanic lithosphere. Such processes are commonly accompanied by seismic activity, and the resulting earthquakes can cause downslope movement of sediment and rock. Considering all of this, it seems more likely that most of this mass wasting was triggered by earthquakes related to plate-tectonic processes, which destabilised continental margins and produced megabreccias and debris flows. Moreover, the period of mass wasting coincides with sea-level drops during a global sea-level lowstand. In some cases, sea-level drops can induce pore-water overpressure, reducing sediment strength and hence promoting instability of sediment at continental margins. The reduced hydrostatic pressure can also destabilise gas hydrate-bearing sediment, causing slope failure and thus resulting in submarine mass wasting. Overall, global mass wasting during the Middle Ordovician does not require a meteoritic trigger. (C) 2010 International Association for Gondwana Research. Published by Elsevier B.V. All rights reserved.
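The link between pore-water overpressure and reduced sediment strength invoked above follows from the standard Terzaghi effective-stress and Mohr-Coulomb relations (a textbook formulation, not taken from the abstract itself): excess pore pressure u lowers the effective normal stress on a potential slip plane and hence the shear strength resisting downslope failure.

$$
\sigma' = \sigma - u, \qquad \tau_f = c' + \sigma' \tan\varphi' = c' + (\sigma - u)\tan\varphi'
$$

Here σ is the total normal stress, σ' the effective normal stress, c' the effective cohesion, and φ' the effective friction angle; any process that raises u (rapid loading, earthquake shaking, or a sea-level drop that leaves pore pressure out of equilibrium) moves τ_f closer to the driving shear stress on a continental slope.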