Abstract:
The surgical approach to ACTH-secreting pituitary adenomas is the treatment of choice in humans. Transsphenoidal hypophysectomy, by contrast, remains an uncommon technique in veterinary medicine. The most widespread therapy for dogs with pituitary-dependent hypercortisolism (PDH) is medical, based on the administration of drugs that inhibit cortisol synthesis. Pituitary adenomas can increase in volume and cause consequent neurological signs; in these cases the only therapeutic options are surgical removal of the neoplasm and radiotherapy. This thesis describes 8 transsphenoidal hypophysectomies performed on 7 dogs with pituitary macroadenoma at the Department of Veterinary Medical Sciences of the University of Bologna. The greatest difficulty for the surgeon is locating the pituitary fossa relative to the landmarks visible on computed tomography or magnetic resonance imaging, in addition to problems of bleeding during removal of the neoplasm. In the post-operative period, the major complications occur in subjects with larger pituitary adenomas. Conversely, with smaller adenomas, post-operative recovery is faster and the success rate higher. In order to perform targeted excision of only the pituitary neoplasm in the dog, as is done in humans, a computed tomography (CT) study was conducted in 86 dogs with PDH. The CT protocol, however, did not allow the position of the neoplasm to be identified precisely enough to guide the surgeon in its removal. In two cases reported in this work, the pituitary neoplasm recurred. In one subject reoperation was chosen, while in the other radiotherapy was used. Both options ensured a good quality of life for more than a year after the therapeutic intervention. These clinical cases show that reoperation and radiotherapy can be considered valid options in case of recurrence.
Abstract:
Background: Medication-related problems are common in the growing population of older adults, and inappropriate prescribing is a preventable risk factor. Explicit criteria such as the Beers criteria provide a valid instrument for describing the rate of inappropriate medication (IM) prescriptions among older adults. Objective: To reduce IM prescriptions based on explicit Beers criteria using a nurse-led intervention in a nursing-home (NH) setting. Study Design: The pre/post design included IM assessment at study start (pre-intervention), a 4-month intervention period, IM assessment after the intervention period (post-intervention), and a further IM assessment at 1-year follow-up. Setting: 204-bed inpatient NH in Bern, Switzerland. Participants: NH residents aged ≥60 years. Intervention: The intervention included four key elements: (i) adaptation of the Beers criteria to the Swiss setting; (ii) IM identification; (iii) IM discontinuation; and (iv) staff training. Main Outcome Measure: IM prescription at study start, after the 4-month intervention period, and at 1-year follow-up. Results: The mean±SD resident age was 80.3±8.8 years. Residents were prescribed a mean±SD of 7.8±4.0 medications. The prescription rate of IMs decreased from 14.5% pre-intervention to 2.8% post-intervention (relative risk [RR] = 0.2; 95% CI 0.06, 0.5). The risk of IM prescription increased in the 1-year follow-up period compared with post-intervention, but not statistically significantly (RR = 1.6; 95% CI 0.5, 6.1). Conclusions: This intervention to reduce IM prescriptions based on explicit Beers criteria was feasible, easy to implement in an NH setting, and resulted in a substantial decrease in IMs. These results underscore the importance of involving nursing staff in the medication prescription process in a long-term care setting.
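The reported relative risk for the main result can be checked directly from the two prescription rates; a minimal arithmetic sketch (the confidence intervals cannot be recomputed here, because the abstract does not report the underlying counts):

```python
# Point estimate of the relative risk (RR) from the reported IM prescription rates.
# The 95% CI is not reproducible from the abstract alone, since the underlying
# numbers of prescriptions are not given.
pre_rate = 14.5    # % of prescriptions that were IMs pre-intervention
post_rate = 2.8    # % of prescriptions that were IMs post-intervention

rr = post_rate / pre_rate
print(f"RR = {rr:.2f}")  # -> RR = 0.19, consistent with the reported RR of 0.2
```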
Abstract:
We describe the multidisciplinary findings in a pre-Columbian mummy head from southern Peru (Cahuachi, Nazca civilisation, radiocarbon dated to between 120 and 750 AD) of a mature male individual (40-60 years) with the first two vertebrae attached in a pathological position. Accordingly, the atlanto-axial transition (C1/C2) was significantly rotated and dislocated at a 38° angle, associated with a bulging brownish mass that reduced the spinal canal by circa 60%. Using surface microscopy, endoscopy, high-resolution multi-slice computed tomography, paleohistology and immunohistochemistry, we identified an extensive epidural hematoma of the upper cervical spinal canal, extending into the skull cavity, evidently due to a rupture of the left vertebral artery at its transition between the atlas and the skull base. There were no signs of fractures of the skull or vertebrae. Histological and immunohistochemical examinations clearly identified dura, brain residues and densely packed corpuscular elements that proved to represent a fresh epidural hematoma. Subsequent biochemical analysis provided no evidence of pre-mortem cocaine consumption. Stable isotope analysis, however, revealed significant and repeated changes in nutrition during his last 9 months, suggesting high mobility. Finally, the significant narrowing caused by the rotational atlanto-axial dislocation and the epidural hematoma probably compressed the spinal cord and the medulla oblongata, with subsequent respiratory arrest. In conclusion, we suggest that the man died within a short period of time (probably a few minutes) in an upright position, with the head rotated rapidly to the right side. In the paleopathologic literature, trauma to the upper cervical spine has as yet only very rarely been described, and dislocation of the vertebral bodies has not been presented.
Abstract:
Boxers are predisposed to subaortic stenosis (SAS) and pulmonic stenosis (PS). To decrease the prevalence of these defects, pre-breeding cardiologic examinations have been performed in recent years. In our study, the results of 309 pre-breeding examinations of Boxers presented between 1999 and 2008 were evaluated retrospectively. The overall prevalence of heart murmurs was 26.5%. SAS was diagnosed in 25 (8.1%) and PS in 10 (3.3%) dogs. A combination of both defects was found in 7 (2.3%) Boxers. Animals with a heart murmur of at least grade 3/6 had a significantly higher peak aortic flow velocity (VmaxAo) than animals with no or only soft heart murmurs. Over the study period, both the frequency of heart murmurs and the frequency of SAS and PS diagnoses decreased.
Abstract:
It has long been known that, prior to the deposition of Jurassic sediments, Montana was subjected to an intensive period of erosion. Mention of this may be found in numerous reports, especially those dealing with western Montana.
Abstract:
QUESTIONS UNDER STUDY: To compare the incidence of pre-pregnancy overweight and obesity and the difference in weight gain during pregnancy between the years 1986 and 2004 in women delivered at the maternity unit of our hospital. METHODS: Retrospective study. Maternity records of patients delivered in 1986 and 2004 were compared. Data extraction included booking weight, height, weight gain and birth weight, as well as information on mode of delivery and gestational age at delivery. RESULTS: For the years 1986 and 2004, a total of 690 and 668 patients respectively were included in the analysis. The proportion of women with a pre-pregnancy BMI ≥25 doubled over the 18-year period (from 15.9% to 30.1%). In 1986 only 2.6% of all pregnant women gained more than 20 kg, while in 2004 14.2% did so (p <0.0001). The caesarean section rate was significantly higher in 2004 than 18 years earlier (28.3% vs. 9.3%, p <0.0001). CONCLUSIONS: We found a significant increase in all parameters between these two groups. Pregnant women today are heavier at the booking visit, are more often overweight, and gain more weight during pregnancy. A similar trend is seen in the newborn babies, who have a higher birth weight than those born 18 years ago.
Abstract:
BACKGROUND: The aim of this study was to develop an experimental model that allows the potential role of the preexisting graft microvasculature in the vascularization and mineralization of osteochondral grafts to be elucidated. ANIMALS AND METHODS: For this purpose, the II-IV metatarsals of fetal DDY mice, known to be nonvascularized at day 16 of gestation (M16) but vascularized at day 18 (M18), were freshly transplanted into dorsal skinfold chambers of adult DDY mice. Using intravital microscopy, angiogenesis, leukocyte-endothelium interaction and mineralization were assessed for 12 days. RESULTS: Angiogenesis occurred at 32 hours in M18 but not before 57 hours in M16 (p = 0.002), with perfusion of these vessels at 42 hours (p = 0.005) and 65 hours (p = 0.1), respectively. In M18, vessels reached a density three times as high as that of the recipient site at day 6 and remained constant until day 12, whereas in M16 vascular density increased from day 6 and reached that of M18 at day 12 (p = 0.04). Leukocyte-endothelium interaction showed sticker counts elevated by a factor of 4-5 in M18 compared with M16. Mineralization of the osteochondral grafts did not differ between M16 and M18 and increased significantly in both groups throughout the observation period. INTERPRETATION: We propose that the faster kinetics of the angiogenic response in M18 and the elevated counts of sticking leukocytes rest on the potential for establishing end-to-end anastomoses (inosculation) between the vascularized graft and recipient vessels.
Abstract:
Detection of antibodies against Bovine viral diarrhea virus (BVDV) in serum and milk by enzyme-linked immunosorbent assay (ELISA) is a crucial part of all ongoing national schemes to eradicate this important cattle pathogen. Serum and milk are regarded as equally suited for antibody measurement. However, when a seropositive cow was retested 1 day after calving, the serum was negative in 6 out of 9 different ELISAs. To further investigate this diagnostic gap around parturition, pre- and postcalving serum and milk samples of 5 cows were analyzed by BVDV antibody ELISA and serum neutralization test (SNT). By ELISA, 3 of the 5 animals showed a diagnostic gap in serum for up to 12 days around calving, but all animals remained positive by SNT. In milk, the ELISA was strongly positive after birth, but antibody levels decreased considerably within the next few days. Because of the immunoglobulin G subclass 1 (IgG1)-specific transport of serum antibodies into the mammary gland for colostrum production, the IgG subclass specificity of the total and the BVDV-specific antibodies was determined. Although all 5 animals showed a clear decrease in total and BVDV-specific IgG1 antibody levels at parturition, the precalving IgG1-to-IgG2 ratios of the BVDV-specific antibodies were considerably lower in the animals that showed the diagnostic gap. The results showed that BVDV-seropositive cows may become "false" negative in several ELISAs in the periparturient period and suggest that the occurrence of this diagnostic gap is influenced by the BVDV-specific IgG subclass response of the individual animal.
Abstract:
Objective: There is evidence that children suffer ongoing post-concussive symptoms (PCS) after mild traumatic brain injury (mTBI). However, results concerning neuropsychological outcome after mTBI are controversial. Our aim was therefore to examine group differences regarding neuropsychological outcome and PCS. Additionally, we explored the influence of current and pre-injury everyday attention problems on neuropsychological outcome in children after mTBI. Method: In a prospective short-term longitudinal study, 40 children (aged 6-16 years) after mTBI and 38 children after orthopedic injury (OI) underwent neuropsychological, socio-behavioral and PCS assessments in the acute stage and at 1 week, 4 weeks, and 4 months after the injury. Results: Parents of children after mTBI observed significantly more PCS than parents of children after OI, especially in the acute stage. Our results revealed no neuropsychological or socio-behavioral differences between the groups over time. However, in children after mTBI, elevated levels of everyday attention problems correlated negatively with neuropsychological performance. Furthermore, pre-injury everyday attention problems had a negative influence on neuropsychological performance in children after mTBI. Conclusion: In accordance with earlier studies, parents of children after mTBI initially observed significantly more PCS than parents of children after OI. There were no neuropsychological or socio-behavioral group differences between children after mTBI and OI in the post-acute period. However, our exploratory findings indicate that current and pre-injury everyday attention problems were negatively associated with neuropsychological performance in children after mTBI.
Abstract:
Recently, multiple studies have shown that spatial and temporal features of the task-negative default mode network (DMN) (Greicius et al., 2003) are important markers for psychiatric diseases (Balsters et al., 2013). Another prominent indicator of cognitive functioning, yielding information about the mental condition in health and disease, is working memory (WM) processing. In EEG and MEG studies, frontal-midline theta power has been shown to increase with load during WM retention in healthy subjects (Brookes et al., 2011). Negative correlations between DMN activity and theta amplitude have been found during resting state (Jann et al., 2010) as well as during WM (Michels et al., 2010). Likewise, WM training resulted in higher resting-state theta power as well as increased small-worldness of the resting brain (Langer et al., 2013). Further, increased fMRI connectivity between nodes of the DMN correlated with better WM performance (Hampson et al., 2006). Hence, the brain's default state might influence its functioning during a task. We therefore hypothesized correlations between pre-stimulus DMN activity and EEG theta power during WM maintenance, depending on WM load. Seventeen healthy subjects performed a Sternberg WM task while being measured simultaneously with EEG and fMRI. Data were recorded within a multicenter study: 12 subjects were measured in Zurich with a 64-channel MR-compatible system (Brain Products) in a 3T Philips scanner, and 5 subjects with a 96-channel MR-compatible system (Brain Products) in a 3T Siemens scanner in Bern. The DMN component was obtained by a group BOLD-ICA approach over the full task duration (figure 1). The subject-wise dynamics were obtained by back-reconstruction onto each subject's fMRI data and normalized to percent-signal-change values. The single-trial pre-stimulus DMN activation was then temporally correlated with the single-trial EEG theta (3-8 Hz) spectral power during retention intervals. This so-called covariance mapping (Jann et al., 2010) yielded the spatial distribution of the theta EEG fluctuations during retention associated with the dynamics of the pre-stimulus DMN. In line with previous findings, theta power was increased at frontal-midline electrodes in high- versus low-load conditions during early WM retention (figure 2). However, correlating DMN activity with theta power yielded primarily positive correlations in low-load conditions, while during high-load conditions negative correlations between DMN activity and theta power were observed at frontal-midline electrodes. This DMN-dependent load effect reached significance in the middle of the retention period (TANOVA, p<0.05) (figure 3). Our results show a complex and load-dependent interaction of pre-stimulus DMN activity and theta power during retention, varying over time. While at a more global, load-independent view pre-stimulus DMN activity correlated positively with theta power during retention, the correlation was inverted during certain time windows in high-load trials, meaning that in trials with enhanced pre-stimulus DMN activity, theta power decreased during retention. Since both WM performance and DMN activity are markers of mental health, our results could be important for further investigations of psychiatric populations.
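The covariance-mapping step described above can be illustrated with a short sketch. This is only a schematic of the single-trial correlation idea from Jann et al. (2010), not the authors' actual pipeline; the array names, shapes and the plain Pearson correlation are illustrative assumptions:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical single-trial data:
# dmn_prestim: pre-stimulus DMN activation per trial (percent signal change)
# theta_power: theta-band (3-8 Hz) spectral power during retention, per trial and electrode
rng = np.random.default_rng(0)
n_trials, n_channels = 120, 64
dmn_prestim = rng.standard_normal(n_trials)
theta_power = rng.standard_normal((n_trials, n_channels))

# Covariance map: correlate pre-stimulus DMN activity with retention theta power
# separately at each electrode, yielding a scalp topography of correlation values.
cov_map = np.array([pearsonr(dmn_prestim, theta_power[:, ch])[0]
                    for ch in range(n_channels)])
print(cov_map.shape)  # (64,) -> one r value per electrode
```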
Abstract:
This study compares gridded European seasonal series of surface air temperature (SAT) and precipitation (PRE) reconstructions with a regional climate simulation over the period 1500–1990. The area is analysed separately for nine subareas that represent the majority of the climate diversity in the European sector. In their spatial structure, an overall good agreement is found between the reconstructed and simulated climate features across Europe, supporting consistency in both products. Systematic biases between the two data sets can be explained by a priori known deficiencies in the simulation. Simulations and reconstructions, however, largely differ in the temporal evolution of past climate for European subregions. In particular, the simulated anomalies during the Maunder and Dalton minima show a stronger response to changes in the external forcings than recorded in the reconstructions. Although this disagreement is to some extent expected given the prominent role of internal variability in the evolution of regional temperature and precipitation, a certain degree of agreement is a priori expected in variables directly affected by external forcings. In this sense, the inability of the model to reproduce a warm period similar to that recorded in the reconstructions for the winters of the first decades of the 18th century is indicative of fundamental limitations in the simulation that preclude reproducing exceptionally anomalous conditions. Despite these limitations, the simulated climate is a physically consistent data set, which can be used as a benchmark to analyse the consistency and limitations of gridded reconstructions of different variables. A comparison of the leading modes of SAT and PRE variability indicates that the reconstructions are too simplistic, especially for precipitation, a deficiency associated with the linear statistical techniques used to generate them. The analysis of the co-variability between sea level pressure (SLP) and SAT and PRE in the simulation yields a result that resembles the canonical co-variability recorded in the observations for the 20th century. However, the same analysis for the reconstructions exhibits anomalously low correlations, which points towards a lack of dynamical consistency between the independent reconstructions.
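Comparisons of leading modes of variability such as the one described above are commonly based on empirical orthogonal functions (EOFs). As an illustration only (the field name, shape and SVD-based approach are assumptions, not taken from the paper), the leading mode of a gridded anomaly field can be extracted as follows:

```python
import numpy as np

# Hypothetical gridded seasonal anomaly field: (n_years, n_gridpoints),
# e.g. winter SAT anomalies over Europe for 1500-1990 (491 seasons).
rng = np.random.default_rng(1)
field = rng.standard_normal((491, 200))
field -= field.mean(axis=0)           # anomalies: remove the time mean per grid point

# SVD of the anomaly matrix: rows of vt are the EOF spatial patterns,
# u[:, k] * s[k] is the principal-component time series of mode k.
u, s, vt = np.linalg.svd(field, full_matrices=False)
eof1 = vt[0]                          # leading spatial mode
pc1 = u[:, 0] * s[0]                  # its time evolution
explained = s[0] ** 2 / np.sum(s ** 2)
print(f"leading mode explains {explained:.1%} of the variance")
```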
Abstract:
The floods that occurred on the Aare and Rhine rivers in May 2015, and the mostly successful handling of this event in terms of flood protection measures, are a good reminder of how important it is to understand the causes and processes involved in such natural hazards. While the necessary data series of gauge measurements and peak-discharge calculations reach back to the 19th century, historical records dating further back in time can provide additional, useful information to help understand extreme flood events and to evaluate prevention measures such as river dams and corrections undertaken prior to instrumental measurements. In my PhD project I will use a wide range of historical sources to assess and quantify past extreme flood events. It is part of the SNF-funded project "Reconstruction of the Genesis, Process and Impact of Major Pre-instrumental Flood Events of Major Swiss Rivers Including a Peak Discharge Quantification" and will cover the research locations Fribourg (Saane R.), Burgdorf (Emme R.), Thun, Bern (both Aare R.), and Lake Constance at the locations Lindau, Constance and Rorschach. My main goals are to provide a long time series of quantitative data on extreme flood events, to discuss the changes occurring in these data, and to evaluate the impact of the aforementioned human influences on the drainage system. Extracting information given in account books from the towns of Basel and Solothurn may also enable me to assess the frequency and seasonality of less severe river floods. Finally, historical information will be used to remodel the historical hydrological regime, homogenizing the historical data series to modern-day conditions and thus making it comparable to the data provided by instrumental measurements. The method I will apply for processing all information provided by historical sources such as chronicles, newspapers and institutional records, as well as flood marks, paintings and archaeological evidence, was developed and successfully applied to the site of Basel by Wetter et al. (2011). They have also shown that data homogenization is possible by reconstructing previous stream-flow conditions using historical river profiles and by carefully observing and reconstructing human changes to the river bed and its surroundings. Taking all this information into account, peak discharges for past extreme flood events will be calculated with a one-dimensional hydrological model.
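Peak-discharge quantification from flood marks and reconstructed river profiles is typically based on a hydraulic relation; the sketch below uses the Manning-Strickler formula as an illustrative assumption (not necessarily the specific model of Wetter et al. (2011)), and all channel parameters are hypothetical:

```python
import math

def manning_peak_discharge(area_m2: float, wetted_perimeter_m: float,
                           slope: float, n_manning: float) -> float:
    """Peak discharge Q (m^3/s) from the Manning formula
    Q = (1/n) * A * R^(2/3) * sqrt(S), with hydraulic radius R = A / P."""
    hydraulic_radius = area_m2 / wetted_perimeter_m
    return (1.0 / n_manning) * area_m2 * hydraulic_radius ** (2 / 3) * math.sqrt(slope)

# Hypothetical cross-section reconstructed from a flood mark and a historical
# river profile: 450 m^2 flow area, 95 m wetted perimeter, 0.1% water-surface
# slope, roughness coefficient n = 0.035 (natural channel).
q_peak = manning_peak_discharge(450.0, 95.0, 0.001, 0.035)
print(f"estimated peak discharge: {q_peak:.0f} m^3/s")  # ~1150 m^3/s
```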
Abstract:
This paper aims to further our understanding of pre-Columbian agricultural systems in the Llanos de Moxos, Bolivia. Three different types of raised fields co-existing at the same site near the community of Exaltación, in north-western Beni, were studied. The morphology, texture and geochemistry of the soils of these fields and the surrounding area were analysed. Differences in field design have often been associated with the diversity of cultural practices. Our results suggest that in the study area differences in field shape, height and layout are primarily the result of adaptation to the local edaphology. By using raised-field technology, pre-Columbian people were able to drain and cultivate soils with very different characteristics, making the land suitable for agriculture and possibly for different crops. This study also shows that some fields in the Llanos de Moxos were built to prolong the presence of water, allowing an additional cultivation period in the dry season and/or in times of drought. Nevertheless, the nature of the highly weathered soils suggests that raised fields were not able to support large populations, and their management required long fallow periods.
Abstract:
This study is a retrospective longitudinal study at Texas Children's Hospital (TCH), a 350-bed tertiary-level pediatric teaching hospital in Houston, Texas, for the period 1990 to 2006. It measured the incidence and trends of positive pre-employment drug tests among new job applicants at TCH. Over the study period, 16,219 job applicants underwent pre-employment drug screening at TCH. Of these, 330 applicants (2%) tested positive on both the EMIT and GC/MS. After review by the medical review officer, the number of true drug-test-positive applicants decreased to 126 (0.78%). According to the overall annual positive drug test incidence rates, the highest overall incidence was in 2002 (14.71 per 1000 tests) and the lowest in 2004 (3.17 per 1000 tests). Despite a marked increase in 2002, the overall incidence tended to decrease over the study period. Incidence rates and trends for other illegal drugs are discussed further in the study; in general, these incidence rates also declined over the study period. In addition, positive drug tests were more common in females than in males (55.5% versus 44.4%).
Abstract:
The objectives of this study were to determine the nature of the relationship between the severity of iron deficiency anemia, response to iron treatment, respiratory and gastrointestinal illness, and weight change. Seventy-five pre-school children from rural Guatemala received daily oral iron therapy for an eleven-week period and were classified into one of three groups having different degrees of iron deficiency anemia. Anthropometric and biochemical data were collected before and after iron treatment; morbidity data were collected throughout the period of treatment. The outcome variables were percentage weight change, percentage of total days ill with any type of symptom, percentage of total days ill with gastrointestinal symptoms, percentage of total days ill with respiratory symptoms, and percentage of total days ill with combination syndrome symptoms. Age, sex and socio-economic status were independent of all of the independent and outcome variables used. On the other hand, hemoglobin level covaried with the height of the children: the smallest children were the most severely anemic. The relationships between hemoglobin levels and weight change, frequency of morbidity (gastrointestinal, respiratory and combination syndrome) and total number of days ill with any symptomatology were investigated. No statistical significance was found in these analyses, except when contrasting children with normal hemoglobin levels with iron-deficient children, where the findings indicated that the normal children experienced more gastrointestinal morbidity. The same relationships were analyzed again with delta hemoglobin included as a covariate; this covariate was significant at the 7% level when the percentage of days ill from gastrointestinal morbidity was tested against the hemoglobin groups. The relationship found indicates that, all other covariates accounted for, the percentage of days ill from gastrointestinal morbidity decreases by approximately 1% for each 1% increase in delta hemoglobin.