14 results for assessment during practicum

in Helda - Digital Repository of University of Helsinki


Relevance: 30.00%

Abstract:

Approximately one-third of stroke patients experience depression. Stroke also has a profound effect on the lives of caregivers of stroke survivors. However, depression in this latter population has received little attention. In this study the objectives were to determine which factors are associated with and can be used to predict depression at different points in time after stroke; to compare different depression assessment methods among stroke patients; and to determine the prevalence, course and associated factors of depression among the caregivers of stroke patients. A total of 100 consecutive hospital-admitted patients no older than 70 years of age were followed for 18 months after having their first ischaemic stroke. Depression was assessed according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-III-R), Beck Depression Inventory (BDI), Hamilton Rating Scale for Depression (HRSD), Visual Analogue Mood Scale (VAMS), Clinical Global Impression (CGI) and caregiver ratings. Neurological assessments and a comprehensive neuropsychological test battery were performed. Depression in caregivers was assessed by BDI. Depressive symptoms had early onsets in most cases. Mild depressive symptoms were often persistent with little change during the 18-month follow-up, although there was an increase in major depression over the same time interval. Stroke severity was associated with depression especially from 6 to 12 months post-stroke. At the acute phase, older patients were at higher risk of depression, and a higher proportion of men were depressed at 18 months post-stroke. Of the various depression assessment methods, none stood clearly apart from the others. The feasibility of each did not differ greatly, but prevalence rates differed widely according to the different criteria. When compared against DSM-III-R criteria, sensitivity and specificity were acceptable for the CGI, BDI, and HRSD. The CGI and BDI had better sensitivity than the more specific HRSD.
The VAMS seemed not to be a reliable method for assessing depression among stroke patients. The caregivers often rated patients' depression as more severe than did the patients themselves. Moreover, their ratings seemed to be influenced by their own depression. Of the caregivers, 30-33% were depressed. At the acute phase, caregiver depression was associated with the severity of the stroke and the older age of the patient. The best predictor of caregiver depression at later follow-up was caregiver depression at the acute phase. The results suggest that depression should be assessed during the early post-stroke period and that the follow-up of those at risk of poor emotional outcome should be extended beyond the first year post-stroke. Further, the assessment of the well-being of the caregivers of stroke patients should be included as a part of a rehabilitation plan for stroke patients.
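The scale-versus-criteria comparison above reduces to a 2x2 table per instrument. As a hedged illustration (the patient data below are invented, not the study's), sensitivity and specificity of a screening scale against the DSM-III-R diagnosis as the reference standard can be computed like this:

```python
# Hedged sketch: sensitivity and specificity of a screening scale
# (e.g. BDI at some cut-off) against the DSM-III-R diagnosis as the
# reference standard. The data below are invented for illustration.

def sensitivity_specificity(scale_positive, dsm_positive):
    """Parallel lists of booleans, one entry per patient."""
    pairs = list(zip(scale_positive, dsm_positive))
    tp = sum(1 for s, d in pairs if s and d)          # true positives
    fn = sum(1 for s, d in pairs if not s and d)      # false negatives
    tn = sum(1 for s, d in pairs if not s and not d)  # true negatives
    fp = sum(1 for s, d in pairs if s and not d)      # false positives
    return tp / (tp + fn), tn / (tn + fp)

# Ten illustrative patients: scale result vs. DSM-III-R diagnosis
scale = [True, True, True, False, True, False, False, False, True, False]
dsm   = [True, True, False, True, True, False, False, False, False, False]
sens, spec = sensitivity_specificity(scale, dsm)  # 0.75 and ~0.67 here
```

The trade-off described in the abstract (CGI and BDI more sensitive, HRSD more specific) corresponds to how these two ratios shift as the positivity cut-off of a scale moves.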

Relevance: 30.00%

Abstract:

Old trees growing in urban environments are often felled due to symptoms of mechanical defects that could be hazardous to people and property. The decisions concerning these removals are justified by risk assessments carried out by tree care professionals. The major motivation for this study was to determine the most common profiles of potential hazard characteristics for the three most common urban tree genera in Helsinki City: Tilia, Betula and Acer, and in this way improve management practices and protection of old amenity trees. For this research, material from approximately 250 urban trees was collected in cooperation with the City of Helsinki Public Works Department during 2001-2004. Of the total number of trees sampled, approximately 70% were defined as hazardous. The tree species had characteristic features as potential hazard profiles. For Tilia trees, hollowed heartwood with low fungal activity and advanced decay caused by Ganoderma lipsiense were the two most common profiles. In Betula spp., the primary reason for tree removal was usually lowered amenity value in terms of decline of the crown. Internal cracks, most often due to weak fork formation, were common causes of potential failure in Acer spp. Decay caused by Rigidoporus populinus often increased the risk of stem breakage in these Acer trees. Of the decay fungi observed, G. lipsiense was most often the reason for the increased risk of stem collapse. Other fungi that also caused extensive decay were R. populinus, Inonotus obliquus, Kretzschmaria deusta and Phellinus igniarius. The most common decay fungi in terms of incidence were Pholiota spp., but decay caused by these species did not have a high potential for causing stem breakage, because it rarely extended to the cambium. The various evaluations used in the study revealed inconsistencies in felling decisions for trees displaying different stages of decay.
For protection of old urban trees, it is crucial to develop monitoring methods so that tree care professionals can better analyse the rate of decay progression towards the sapwood and separate those trees with decreasing amounts of sound wood from those with decay that is restricted to the heartwood area.

Relevance: 30.00%

Abstract:

Despite improving levels of hygiene, the incidence of registered foodborne disease has remained at the same level for many years: in Finland there were 40 to 90 epidemics in which 1000-9000 persons contracted food poisoning through food or drinking water. Until 2004 salmonella and campylobacter were the most common bacterial causes of foodborne disease, but in 2005-2006 Bacillus cereus was the most common. A similar development was reported, e.g. in Germany, already in the 1990s. One reason for this can be Bacillus cereus and its emetic toxin, cereulide. Bacillus cereus is a common environmental bacterium that contaminates raw materials of food. Unlike salmonella and campylobacter, Bacillus cereus is a heat-resistant bacterium, capable of surviving most cooking procedures due to the production of highly thermoresistant spores. The food involved has usually been heat treated, and surviving spores are the source of the food poisoning. The heat treatment induces germination of the spores, and the vegetative cells then produce toxins. This doctoral thesis research focuses on developing methods for assessing and eliminating risks to food safety posed by cereulide-producing Bacillus cereus. The biochemistry and physiology of cereulide production were investigated, and the results were targeted to offer tools for minimizing toxin risk in food during production. I developed methods for the extraction and quantitative analysis of cereulide directly from food. A prerequisite for this is knowledge of the chemical and physical properties of the toxin. Because cereulide is practically insoluble in water, I used organic solvents (methanol, ethanol and pentane) for the extraction. For extraction of bakery products I used high temperature (100 °C) and pressure (103.4 bar). An alternative for effective extraction is to flood the plain food with ethanol, followed by stationary equilibration at room temperature.
I used this protocol for extracting cereulide from potato puree and penne. Using this extraction method it is also possible to extract cereulide from liquid food, such as milk. These extraction methods are important improvements for the study of Bacillus cereus emetic food poisonings. Prior to my work, cereulide extraction was done using water; as a result, the yield was poor and variable. To investigate suspected food poisonings, it is important to show the actual toxicity of the incriminated food. Many toxins, but not cereulide, are inactivated during food processing such as heating. The next step is to identify the toxin by chemical methods. With my colleague Maria Andesson I developed a rapid assay for the detection of cereulide toxicity, within 5 to 15 minutes. By applying this test it is possible to rapidly detect which food caused the food poisoning. The chemical identification of cereulide was achieved using mass spectrometry. I used cereulide-specific molecular ions, m/z (±0.3) 1153.8 (M+H+), 1171.0 (M+NH4+), 1176.0 (M+Na+) and 1191.7 (M+K+), for reliable identification. I investigated foods to find out their amenability to accumulate cereulide. Cereulide was formed in high amounts (0.3 to 5.5 microg/g wet wt) when cereulide-producing B. cereus strains were present in beans, rice, rice pastry and meat pastry stored at non-refrigerated temperatures (21-23 °C). Rice and meat pastries are frequently consumed under conditions where no cooled storage is available, e.g. picnics and outdoor events. Bacillus cereus is a ubiquitous spore former and is therefore difficult to eliminate from foods. It is therefore important to know which conditions affect the formation of cereulide in foods. My research showed that the cereulide content was strongly affected (10- to 1000-fold differences in toxin content) by the growth environment of the bacterium. Storage of foods under a nitrogen atmosphere (> 99.5%) prevented the production of cereulide.
However, when carbon dioxide was also present, minimizing the oxygen content (< 1%) did not protect the food from formation of cereulide in preliminary experiments. Food supplements also affected cereulide production, at least in the laboratory. Adding the free amino acids leucine and valine stimulated cereulide production 10- to 20-fold. In peptide-bonded form these amino acids are natural constituents of all proteins. Interestingly, adding peptide-bonded leucine and valine had no significant effect on cereulide production. The free amino acids leucine and valine are approved food supplements and widely used as flavour modifiers in food technology. My research showed that these food supplements may increase food poisoning risk even though they are not toxic themselves.
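The identification step described above amounts to matching observed peaks against the four reference ions within the stated ±0.3 m/z window. A minimal sketch (the function name and example peaks are illustrative; the reference values are the ones listed in the text):

```python
# Hedged sketch of the mass-spectrometric identification step: a
# spectrum is taken to support cereulide when observed peaks match the
# reference molecular ions within the stated m/z tolerance of +/-0.3.
CEREULIDE_IONS = {
    "M+H": 1153.8,
    "M+NH4": 1171.0,
    "M+Na": 1176.0,
    "M+K": 1191.7,
}

def matching_adducts(observed_mz, tolerance=0.3):
    """Names of reference ions matched by any observed peak."""
    return sorted(
        name for name, ref in CEREULIDE_IONS.items()
        if any(abs(mz - ref) <= tolerance for mz in observed_mz)
    )

# A spectrum with peaks near the protonated and sodiated ions:
hits = matching_adducts([1153.9, 1176.2, 1300.0])  # ["M+H", "M+Na"]
```

Requiring agreement on several adducts, rather than a single peak, is what makes the identification reliable.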

Relevance: 30.00%

Abstract:

Olkiluoto Island is situated in the northern Baltic Sea, near the southwestern coast of Finland, and is the proposed location of a spent nuclear fuel repository. This study examined Holocene palaeoseismicity in the Olkiluoto area and in the surrounding sea areas by computer simulations together with acoustic-seismic, sedimentological and dating methods. The most abundant rock type on the island is migmatitic mica gneiss, intruded by tonalites, granodiorites and granites. The surrounding Baltic Sea seabed consists of Palaeoproterozoic crystalline bedrock, which is to a great extent covered by younger Mesoproterozoic sedimentary rocks. The area contains several ancient deep-seated fracture zones that divide it into bedrock blocks. The response of bedrock at the Olkiluoto site was modelled considering four future ice-age scenarios. Each scenario produced shear displacements of fractures with different times of occurrence and varying recovery rates. Generally, the larger the maximum ice load, the larger were the permanent shear displacements. For a basic case, the maximum shear displacements were a few centimetres at the proposed nuclear waste repository level, at approximately 500 m b.s.l. High-resolution, low-frequency echo-sounding was used to examine the Holocene submarine sedimentary structures and possible direct and indirect indicators of palaeoseismic activity in the northern Baltic Sea. Echo-sounding profiles of Holocene submarine sediments revealed slides and slumps, normal faults, debris flows and turbidite-type structures. The profiles also showed pockmarks and other structures related to gas or groundwater seepages, which might be related to fracture zone activation. Evidence of postglacial reactivation in the study area was derived from the spatial occurrence of some of the structures, especially the faults and the seepages, in the vicinity of some old bedrock fracture zones.
Palaeoseismic event(s) (a single or several events) in the Olkiluoto area were dated and the palaeoenvironment was characterized using palaeomagnetic, biostratigraphical and lithostratigraphical methods, enhancing the reliability of the chronology. Combined lithostratigraphy, biostratigraphy and palaeomagnetic stratigraphy yielded an age estimate of 10 650 to 10 200 cal. years BP for the palaeoseismic event(s). All Holocene sediment faults in the northern Baltic Sea occur at the same stratigraphical level, the age of which is estimated at 10 700 cal. years BP (9500 radiocarbon years BP). Their movement is suggested to have been triggered by palaeoseismic event(s) when the Late Weichselian ice sheet was retreating from the site and bedrock stresses were released along the bedrock fracture zones. The absence of younger or repeated traces of seismic events corroborates the suggestion that the major seismic activity occurred within a short time during and after the last deglaciation. The origin of the gas/groundwater seepages remains unclear. Their reflections in the echo-sounding profiles imply that part of the gas is derived from the organic-bearing Litorina and modern gyttja clays. However, at least some of the gas is derived from the bedrock. Additional information could be gained by pore water analysis from the pockmarks. Information on postglacial fault activation and possible gas and/or fluid discharges under high hydraulic heads is relevant to the safety assessment of a planned spent nuclear fuel repository in the region.

Relevance: 30.00%

Abstract:

For optimal treatment planning, a thorough assessment of the metastatic status of mucosal squamous cell carcinoma of the head and neck (HNSCC) is required. Current imaging methods do not allow the recognition of all patients with metastatic disease. Therefore, elective treatment of the cervical lymph nodes is usually given to patients in whom the risk of subclinical metastasis is estimated to exceed 15-20%. The objective of this study was to improve the pre-treatment evaluation of patients diagnosed with HNSCC. Particularly, we aimed at improving the identification of patients who will benefit from elective neck treatment. Computed tomography (CT) of the chest and abdomen was performed prospectively for 100 patients diagnosed with HNSCC. The findings were analysed to clarify the indications for this examination in this patient group. CT of the chest influenced the treatment approach in 3% of patients, while CT of the abdomen did not reveal any significant findings. Our results suggest that CT of the chest and abdomen is not indicated routinely for patients with newly diagnosed HNSCC but can be considered in selected cases. A retrospective analysis of 80 patients treated for early-stage squamous cell carcinoma of the oral tongue was performed to investigate the potential benefits of elective neck treatment and to examine whether histopathological features of the primary tumour could be used in the prediction of occult metastases, local recurrence, and/or poor survival. Patients who had received elective neck treatment had significantly fewer cervical recurrences during the follow-up when compared to those who only had close observation of the cervical lymph nodes. Elective neck treatment did not result in a survival benefit, however. Of the histopathological parameters examined, depth of infiltration and pT-category (representing tumour diameter) predicted occult cervical metastasis, but only the pT-category predicted local recurrence.
Depth of infiltration can be used in the identification of at-risk patients, but no clear cut-off value separating high-risk and low-risk patients was found. None of the histopathological parameters examined predicted survival. Sentinel lymph node (SLN) biopsy was studied as a means of diagnosing patients with subclinical cervical metastases. SLN biopsy was applied to 46 patients who underwent elective neck dissection for oral squamous cell carcinoma. In addition, SLN biopsy was applied to 13 patients with small oral cavity tumours who were not intended to undergo elective neck dissection because of a low risk of occult metastasis. The sensitivity of SLN biopsy for finding subclinical cervical metastases was found to be 67% when SLN status was compared to the metastatic status of the rest of the neck dissection specimen. Of the patients not planned to have elective neck dissection, SLN biopsy revealed cervical metastasis in 15% of the patients. Our results suggest that SLN biopsy cannot yet entirely replace elective neck dissection in the treatment of oral cancer, but it seems beneficial for patients with a low risk of metastasis who are not intended for elective neck treatment according to current treatment protocols.

Relevance: 30.00%

Abstract:

Exposure to water-damaged buildings and the associated health problems have evoked concern and created confusion during the past 20 years. Individuals exposed to moisture problem buildings report adverse health effects such as non-specific respiratory symptoms. Microbes, especially fungi, growing on the damp material have been considered as potential sources of the health problems encountered in these buildings. Fungi and their airborne fungal spores contain allergens and secondary metabolites which may trigger allergic as well as inflammatory types of responses in the eyes and airways. Although epidemiological studies have revealed an association between damp buildings and health problems, no direct cause-and-effect relationship has been established. Further knowledge is needed about the epidemiology and the mechanisms leading to the symptoms associated with exposure to fungi. Two different approaches have been used in this thesis in order to investigate the diverse health effects associated with exposure to moulds. In the first part, sensitization to moulds was evaluated and potential cross-reactivity studied in patients attending a hospital for suspected allergy. In the second part, one typical mould known to be found in water-damaged buildings and to produce toxic secondary metabolites was used to study the airway responses in an experimental model. Exposure studies were performed on both naive and allergen-sensitized mice. The first part of the study showed that mould allergy is rare and highly dependent on the atopic status of the examined individual. The prevalence of sensitization was 2.7% to Cladosporium herbarum and 2.8% to Alternaria alternata in patients, the majority of whom were atopic subjects. Some of the patients sensitized to mould suffered from atopic eczema. Frequently the patients were observed to possess specific serum IgE antibodies to a yeast present in the normal skin flora, Pityrosporum ovale.
In some of these patients, the IgE binding was partly found to be due to binding to shared glycoproteins in the mould and yeast allergen extracts. The second part of the study revealed that exposure to Stachybotrys chartarum spores induced an airway inflammation in the lungs of mice. The inflammation was characterized by an influx of inflammatory cells, mainly neutrophils and lymphocytes, into the lungs, but with almost no differences in airway responses seen between the satratoxin-producing and non-satratoxin-producing strains. On the other hand, when mice were exposed to S. chartarum and sensitized/challenged with ovalbumin, the extent of the inflammation was markedly enhanced. A synergistic increase in the numbers of inflammatory cells was seen in bronchoalveolar lavage (BAL) fluid, and severe inflammation was observed in the histological lung sections. In conclusion, the results in this thesis imply that exposure to moulds in water-damaged buildings may trigger health effects in susceptible individuals. The symptoms can rarely be explained by IgE-mediated allergy to moulds; other non-allergic mechanisms seem to be involved. Stachybotrys chartarum is one of the moulds potentially responsible for health problems. In this thesis, new reaction models for the airway inflammation induced by S. chartarum have been found using experimental approaches. The immunological status played an important role in the airway inflammation, enhancing the effects of mould exposure. The results imply that sensitized individuals may be more susceptible to exposure to moulds than non-sensitized individuals.

Relevance: 30.00%

Abstract:

Several hypnosis monitoring systems based on the processed electroencephalogram (EEG) have been developed for use during general anesthesia. The assessment of the analgesic component (antinociception) of general anesthesia is an emerging field of research. This study investigated the interaction of hypnosis and antinociception, the association of several physiological variables with the degree of intraoperative nociception, and aspects of EEG Bispectral Index Scale (BIS) monitoring during general anesthesia. In addition, EEG features and heart rate (HR) responses during desflurane and sevoflurane anesthesia were compared. A propofol bolus of 0.7 mg/kg was more effective than an alfentanil bolus of 0.5 mg in preventing the recurrence of movement responses during uterine dilatation and curettage (D&C) after a propofol-alfentanil induction, combined with nitrous oxide (N2O). HR and several HR variability-, frontal electromyography (fEMG)-, pulse plethysmography (PPG)-, and EEG-derived variables were associated with surgery-induced movement responses. Movers were discriminated from non-movers mostly by the post-stimulus values per se or normalized with respect to the pre-stimulus values. In logistic regression analysis, the best classification performance was achieved with the combination of normalized fEMG power and HR during D&C (overall accuracy 81%, sensitivity 53%, specificity 95%), and with the combination of normalized fEMG-related response entropy, electrocardiography (ECG) R-to-R interval (RRI), and PPG dicrotic notch amplitude during sevoflurane anesthesia (overall accuracy 96%, sensitivity 90%, specificity 100%). ECG electrode impedances after alcohol swab skin pretreatment alone were higher than impedances of designated EEG electrodes. The BIS values registered with ECG electrodes were higher than those registered simultaneously with EEG electrodes.
No significant difference in the time to home-readiness after isoflurane-N2O or sevoflurane-N2O anesthesia was found, when the administration of the volatile agent was guided by BIS monitoring. All other early and intermediate recovery parameters were also similar. Transient epileptiform EEG activity was detected in eight of 15 sevoflurane patients during a rapid increase in the inspired volatile concentration, and in none of the 16 desflurane patients. The observed transient EEG changes did not adversely affect the recovery of the patients. Following the rapid increase in the inhaled desflurane concentration, HR increased transiently, reaching its maximum in two minutes. In the sevoflurane group, the increase was slower and more subtle. In conclusion, desflurane may be a safer volatile agent than sevoflurane in patients with a lowered seizure threshold. The tachycardia induced by a rapid increase in the inspired desflurane concentration may present a risk for patients with heart disease. Designated EEG electrodes may be superior to ECG electrodes in EEG BIS monitoring. When the administration of isoflurane or sevoflurane is adjusted to maintain BIS values at 50-60 in healthy ambulatory surgery patients, the speed and quality of recovery are similar after both isoflurane-N2O and sevoflurane-N2O anesthesia. When anesthesia is maintained by the inhalation of N2O and bolus doses of propofol and alfentanil in healthy unparalyzed patients, movement responses may be best avoided by ensuring a relatively deep hypnotic level with propofol. HR/RRI, fEMG, and PPG dicrotic notch amplitude are potential indicators of nociception during anesthesia, but their performance needs to be validated in future studies. Combining information from different sources may improve the discrimination of the level of nociception.

Relevance: 30.00%

Abstract:

The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia from the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain, and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of bispectral index (BIS) and Entropy to the signal artifacts was compared.
The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of Surgical Stress Index (SSI) to differentiate patients with two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50. Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and treated with alfentanil. Photoplethysmographic waveforms were collected from the contralateral arm to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). 
Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation to the ulnar region of the hand increased SSI significantly only among patients with a brachial plexus block not covering the site of stimulation. The prognostic value of EEG-derived indices was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE) and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatile index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia.
Outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function, but were in these subjects mostly due to artifacts after clinical brain death diagnosis. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may be an aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
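Of the EEG-derived indices above, the burst suppression ratio has the simplest conventional definition: the percentage of an epoch during which the EEG is suppressed (near-isoelectric). A hedged sketch, assuming a fixed amplitude threshold (real monitors use more elaborate, proprietary suppression detection):

```python
# Hedged sketch of the burst suppression ratio (BSR): the percentage of
# an epoch during which the EEG is suppressed. The fixed amplitude
# threshold is an illustrative assumption, not a monitor's algorithm.

def burst_suppression_ratio(samples_uV, threshold_uV=5.0):
    """Percent of samples whose absolute amplitude is below threshold."""
    suppressed = sum(1 for s in samples_uV if abs(s) < threshold_uV)
    return 100.0 * suppressed / len(samples_uV)

# Illustrative epoch: 6 of 8 samples fall below the threshold
epoch = [1.0, -2.0, 0.5, 40.0, -35.0, 3.0, 0.0, -1.5]
bsr = burst_suppression_ratio(epoch)  # 75.0
```

A high BSR early after arrest thus summarizes how much of the recording was isoelectric, which is why it separated the outcome groups in the first 24 h.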

Relevance: 30.00%

Abstract:

Background and aims. Type 1 diabetes (T1D), an autoimmune disease in which the insulin-producing beta cells are gradually destroyed, is preceded by a prodromal phase characterized by the appearance of diabetes-associated autoantibodies in the circulation. Both the timing of the appearance of autoantibodies and their quality have been used in the prediction of T1D among first-degree relatives of diabetic patients (FDRs). So far, no general strategies for identifying individuals at increased disease risk in the general population have been established, although the majority of new cases originate in this population. The current work aimed at assessing the predictive role of diabetes-associated immunologic and metabolic risk factors in the general population, and comparing these factors with data obtained from studies on FDRs. Subjects and methods. Study subjects in the current work were subcohorts of participants of the Childhood Diabetes in Finland Study (DiMe; n=755), the Cardiovascular Risk in Young Finns Study (LASERI; n=3475), and the Finnish Type 1 Diabetes Prediction and Prevention Study (DIPP; n=7410). These children were observed for signs of beta-cell autoimmunity and progression to T1D, and the results obtained were compared between the FDRs and the general population cohorts. --- Results and conclusions. By combining HLA and autoantibody screening, T1D risks similar to those reported for autoantibody-positive FDRs are observed in the pediatric general population. The progression rate to T1D is high in genetically susceptible children with persistent multipositivity. Measurement of IAA affinity failed to stratify the risk assessment in young IAA-positive children with HLA-conferred disease susceptibility, among whom the affinity of IAA did not increase during the prediabetic period.
Young age at seroconversion, increased weight-for-height, decreased early insulin response, and increased IAA and IA-2A levels predict T1D in young children with genetic disease susceptibility and signs of advanced beta-cell autoimmunity. Since the incidence of T1D continues to increase, efforts aimed at preventing the disease are important, and reliable prediction is needed both for intervention trials and for effective and safe preventive therapies in the future. Our observations confirmed that combined HLA-based screening and regular autoantibody measurements reveal disease risks in the pediatric general population similar to those seen in prediabetic FDRs, and that risk assessment can be stratified further by studying the glucose metabolism of prediabetic subjects. As these screening efforts are feasible in practice, the knowledge obtained can be exploited in designing intervention trials aimed at secondary prevention of T1D.
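The logic of combined HLA and autoantibody screening can be illustrated with a small Bayesian update: each positive screening step raises the prior disease risk. All numbers below are hypothetical placeholders for illustration, not figures from the study.

```python
def posterior_risk(prior, sensitivity, specificity):
    """Update disease risk after a positive test result (Bayes' rule)."""
    false_pos = 1.0 - specificity
    return (sensitivity * prior) / (sensitivity * prior + false_pos * (1.0 - prior))

# Hypothetical illustrative values, not results from the study.
background_risk = 0.004  # assumed lifetime T1D risk in the general population
risk_after_hla = posterior_risk(background_risk, 0.90, 0.80)   # after HLA screen
risk_after_abs = posterior_risk(risk_after_hla, 0.85, 0.98)    # after autoantibody test

print(f"after HLA screen:        {risk_after_hla:.3f}")
print(f"after autoantibody test: {risk_after_abs:.3f}")
```

Even with modest per-step test accuracy, sequential positivity concentrates risk sharply, which is why multipositive children in the general population can reach risk levels comparable to autoantibody-positive FDRs.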


The aim of this study was to estimate the development of fertility in North-Central Namibia, former Ovamboland, from 1960 to 2001. Special attention was given to the onset of fertility decline and to the impact of the HIV epidemic on fertility. An additional aim was to introduce parish registers as a source of data for fertility research in Africa. The data consisted of parish registers from Evangelical Lutheran congregations, the 1991 and 2001 Population and Housing Censuses, the 1992 and 2000 Namibia Demographic and Health Surveys, and the HIV sentinel surveillances of 1992-2004. Both period and cohort fertility were analysed, and the P/F ratio method was used in analysing census data. The impact of HIV infection on fertility was estimated indirectly by comparing the fertility histories of women who died before the age of 50 with the fertility of other women; the impact of the epidemic was assessed both among infected women and in the general population. Fertility in the study population began to decline in 1980. The decline was rapid during the 1980s, levelled off in the early 1990s at the end of the war of independence, and then continued until the end of the study period. According to parish registers, total fertility was 6.4 in the 1960s and 6.5 in the 1970s, and declined to 5.1 in the 1980s and 4.2 in the 1990s. Adjusting these total fertility rates to the levels of fertility based on data from the 1991 and 2001 censuses resulted in total fertility declining from 7.6 in 1960-79 to 6.0 in 1980-89, and to 4.9 in 1990-99. The decline was associated with increased age at first marriage, declining marital fertility and increasing premarital fertility. Fertility among adolescents increased, whereas the fertility of women in all other age groups declined. During the 1980s, the war of independence contributed to declining fertility through spousal separation and delayed marriages.
Contraception has been employed in the study region since the 1980s, but in the early 1990s, use of contraceptives was still so limited that fertility was higher in North-Central Namibia than in other regions of the country. In the 1990s, fertility decline was largely a result of the increased prevalence of contraception. HIV prevalence among pregnant women increased from 4% in 1992 to 25% in 2001. In 2001, total fertility among HIV-infected women (3.7) was lower than that among other women (4.8), resulting in total fertility of 4.4 among the general population in 2001. The HIV epidemic explained more than a quarter of the decline in total fertility at population level during most of the 1990s. The HIV epidemic also reduced the number of children born by reducing the number of potential mothers. In the future, HIV will have an extensive influence on both the size and age structure of the Namibian population. Although HIV influences demographic development through both fertility and mortality, the effect through changes in fertility will be smaller than the effect through mortality. In the study region, as in some other regions of southern Africa, a new type of demographic transition is under way, one in which population growth stagnates or even reverses because of the combined effects of declining fertility and increasing mortality, both of which are consequences of the HIV pandemic.
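The total fertility rates discussed above rest on a simple identity: total fertility is the sum of age-specific fertility rates (ASFRs) scaled by the width of the age groups. A minimal sketch with hypothetical ASFRs, not figures from the study:

```python
def total_fertility_rate(asfr, width=5):
    """Total fertility: sum of age-specific fertility rates (births per
    woman-year) across age groups, scaled by the width of each group."""
    return width * sum(asfr)

# Hypothetical ASFRs for the 5-year groups 15-19 ... 45-49 (births per woman-year).
asfr = [0.120, 0.260, 0.250, 0.200, 0.140, 0.070, 0.020]
print(f"TFR = {total_fertility_rate(asfr):.2f} children per woman")
```

The P/F ratio adjustment mentioned above then compares cohort parity (P) with cumulated period fertility (F) of this kind to correct period rates for reporting errors.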


The resources of health systems are limited, and decision-making requires information on how the system performs. This study concerns the utilization of administrative registers for health system performance evaluation. To address this issue, a multidisciplinary methodological framework for register-based data analysis is defined. Because the fixed structure of register-based data indirectly constrains the theoretical constructs, it is essential to elaborate the whole analytic process with respect to the data. The fundamental methodological concepts and theories are synthesized into a data-sensitive approach that helps to understand and overcome the problems likely to be encountered during register-based data analysis. Pragmatically useful health system performance monitoring should produce valid information about the volume of health problems, the use of services, and the effectiveness of the services provided. A conceptual model for hip fracture performance assessment is constructed, and the validity of Finnish registers as a data source for performance assessment of hip fracture treatment is confirmed. Solutions to several pragmatic problems in developing a register-based hip fracture incidence surveillance system are proposed. The monitoring of treatment effectiveness is shown to be possible in terms of care episodes. Finally, an example is given of justifying a more detailed performance indicator for use in profiling providers. In conclusion, it is possible to produce useful and valid information on health system performance from Finnish register-based data; however, this is far more complicated than typically assumed. The perspectives given in this study provide a necessary basis for further work and help in the routine implementation of a hip fracture monitoring system in Finland.
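A recurring task behind episode-based effectiveness monitoring is linking consecutive hospital discharge records of the same patient into care episodes. The study does not publish its episode definition, so the 30-day readmission rule below is a hypothetical placeholder; records are assumed to be (patient_id, admission_day, discharge_day) tuples.

```python
from collections import defaultdict

GAP_DAYS = 30  # hypothetical rule: readmission within 30 days continues the episode

def build_episodes(records):
    """Group (patient_id, admit_day, discharge_day) records into care episodes."""
    by_patient = defaultdict(list)
    for pid, admit, discharge in records:
        by_patient[pid].append((admit, discharge))
    episodes = []
    for pid, stays in by_patient.items():
        stays.sort()                            # chronological order per patient
        start, end = stays[0]
        for admit, discharge in stays[1:]:
            if admit - end <= GAP_DAYS:         # short gap: extend current episode
                end = max(end, discharge)
            else:                               # long gap: close episode, start new one
                episodes.append((pid, start, end))
                start, end = admit, discharge
        episodes.append((pid, start, end))
    return episodes

records = [("A", 0, 10), ("A", 20, 25), ("A", 100, 110), ("B", 5, 7)]
print(build_episodes(records))
```

With these toy records, patient A's first two stays merge into one episode (days 0-25) while the stay starting on day 100 opens a new one, illustrating how the gap threshold determines what counts as a single course of treatment.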


The indigenous cloud forests in the Taita Hills have suffered substantial degradation for several centuries owing to agricultural expansion; currently, only 1% of the original forested area in the region remains. Furthermore, climate change poses an imminent threat to the local economy and environmental sustainability. In such circumstances, developing tools that reconcile socioeconomic growth with the conservation of natural resources is an enormous challenge. This dissertation tackles essential aspects of understanding the ongoing agricultural activities in the Taita Hills and their potential environmental consequences. Initially, alternative methods were designed to improve our understanding of the ongoing agricultural activities: methods for agricultural survey planning and for estimating evapotranspiration were evaluated, taking into account limitations in the availability of data and resources. Next, the dissertation evaluates how upcoming agricultural expansion, together with climate change, will affect the natural resources of the Taita Hills up to the year 2030. The driving forces of agricultural expansion in the region were identified in order to delineate future landscape scenarios and to evaluate potential impacts from the soil and water conservation point of view. To investigate these issues and answer the research questions, the dissertation combined state-of-the-art modelling tools with well-established statistical methods. The results indicate that, if current trends persist, agricultural areas will occupy roughly 60% of the study area by 2030. Although the simulated land use changes will increase soil erosion, new croplands are likely to emerge predominantly in the lowlands, which have lower soil erosion potential. By 2030, rainfall erosivity is likely to increase during April and November because of climate change.
Finally, the thesis addressed the potential impacts of agricultural expansion and climate change on irrigation water requirements (IWR), another major issue in the relationship between land use and climate. Although the simulations indicate that climate change will likely increase annual rainfall volumes during the following decades, IWR will continue to grow because of agricultural expansion. By 2030, new cropland areas may increase the annual volume of water needed for irrigation by approximately 40%.
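Irrigation water requirement is conventionally estimated as the crop water demand not met by rainfall, in the FAO-56 style: IWR = Kc · ET0 − effective rainfall, floored at zero. The study does not state which formulation it used, so this is an illustrative sketch with hypothetical monthly values:

```python
def irrigation_requirement(kc, et0_mm, eff_rain_mm):
    """Monthly irrigation water requirement (mm): crop evapotranspiration
    (Kc * reference ET0) minus effective rainfall, never negative."""
    return [max(0.0, kc * et0 - rain) for et0, rain in zip(et0_mm, eff_rain_mm)]

# Hypothetical monthly values for one crop season (mm), not data from the study.
et0  = [150, 140, 130, 120]   # reference evapotranspiration per month
rain = [ 90, 160,  40,  10]   # effective rainfall per month
iwr = irrigation_requirement(0.9, et0, rain)
print(iwr, "season total:", sum(iwr))
```

The flooring matters: a wet month (here the second) contributes zero rather than a negative requirement, which is why increased annual rainfall need not offset the extra demand from expanding cropland.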


The first-line medication for mild to moderate Alzheimer's disease (AD) is based on cholinesterase inhibitors, which prolong the effect of the neurotransmitter acetylcholine in cholinergic synapses and thereby relieve the symptoms of the disease. Evidence implicating cholinesterases in disease-modifying processes has increased interest in this research area. Drug discovery and development is a long and expensive process that takes 13.5 years on average and costs approximately 0.9 billion US dollars. Attrition in the clinical phases is common for several reasons, e.g., poor bioavailability leading to low efficacy, or toxic effects. Thus, improvements in the early drug discovery process are needed to create highly potent, non-toxic compounds with predicted drug-like properties. Nature has been a good source of new medicines, accounting for around half of the new drugs approved for the market during the last three decades; these compounds are direct isolates from nature, their synthetic derivatives, or natural mimics. Synthetic chemistry is an alternative way to produce compounds for drug discovery, and both sources have pros and cons. The screening of new bioactive compounds in vitro is based on assaying compound libraries against targets. The assay set-up has to be adapted and validated for each screen to produce high-quality data. Depending on the size of the library, miniaturization and automation are often required to reduce solvent and compound consumption and to speed up the process. In this contribution, natural extract, natural pure compound, and synthetic compound libraries were assessed as sources of new bioactive compounds. The libraries were screened primarily for acetylcholinesterase inhibitory effects and secondarily for butyrylcholinesterase inhibitory effects.
To screen the libraries, two assays were evaluated as screening tools, adapted to the special features of each library, and validated to produce high-quality data. Cholinesterase inhibitors of various potencies and selectivities were found in both the natural product and the synthetic compound libraries, which indicates that the two sources complement each other. Natural compounds are known to differ structurally from the compounds in synthetic libraries, which further supports this complementarity, especially if high structural diversity is the criterion for selecting compounds for a library.
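Assay validation for screening is commonly summarized by a quality statistic such as the Z'-factor, Z' = 1 − 3(σ⁺ + σ⁻)/|μ⁺ − μ⁻|, computed from positive and negative control wells, with values above roughly 0.5 taken to indicate an assay fit for screening. The abstract does not name its validation metric, so this is an illustrative sketch with hypothetical control signals:

```python
from statistics import mean, stdev

def z_prime(pos_controls, neg_controls):
    """Z'-factor: separation between positive and negative control
    distributions relative to the dynamic range of the assay."""
    mu_p, mu_n = mean(pos_controls), mean(neg_controls)
    sd_p, sd_n = stdev(pos_controls), stdev(neg_controls)
    return 1.0 - 3.0 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Hypothetical raw signals from control wells of a cholinesterase assay plate.
pos = [95, 98, 102, 101, 99, 105]   # uninhibited enzyme (full signal)
neg = [10, 12, 9, 11, 10, 8]        # fully inhibited / background
print(f"Z' = {z_prime(pos, neg):.2f}")
```

A high Z' means the control distributions are well separated relative to their spread, so single-point percent-inhibition readings from library wells can be trusted; miniaturization typically tightens or loosens Z' and is therefore re-checked after each format change.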


Human parvovirus B19 (B19V) is known to cause anemia, hydrops fetalis, and fetal death, especially during the first half of pregnancy. Women in occupational contact with young children are at increased risk of B19V infection. The role of the recently discovered human bocavirus (HBoV) in reproduction is unknown. The aim of this research project was to establish a scientific basis for assessing the occupational safety of pregnant women and for issuing special maternity leave regulations during B19V epidemics in Finland. The impact of HBoV infection on the pregnant woman and her fetus was also examined. B19V DNA was found in 0.8% of the miscarriages and in 2.4% of the intrauterine fetal deaths (IUFD; fetal death after 22 completed gestational weeks), whereas all control fetuses (from induced abortions) were B19V-DNA negative. The findings on hydropic, B19V DNA-positive IUFDs with evidence of acute or recent maternal B19V infection are in line with those of previous Swedish studies. However, the high prevalence of B19V-related nonhydropic IUFDs noted in the Swedish studies was mostly without evidence of maternal B19V infection, and no such excess was found during the third trimester. HBoV was not associated with miscarriages or IUFDs: almost all of the studied pregnant women were HBoV-IgG positive, and thus most probably immune to HBoV. All preterm births, perinatal deaths, cases of smallness for gestational age (SGA), and congenital anomalies among the infants of child-care employees were recorded in a nationwide register-based cohort study spanning 14 years. Little or no difference was found between the infants of the child-care employees and those of the comparison group. The annual B19V seroconversion rate was over two-fold among the child-care employees compared with the women in the comparison group, and the seropositivity of the child-care employees increased with age and with years since qualification/joining the trade union.
In general, child-care employees are not at increased risk of adverse pregnancy outcomes. However, at the population level, the risk of rare events, such as adverse pregnancy outcomes attributable to infections, could not be determined. According to previous studies, seronegative women have a 5–10% excess risk of losing the fetus during the first half of pregnancy, but thereafter the risk is very low. Therefore, the over two-fold increased risk of B19V infection among child-care employees is considerable, and should be taken into account in assessing the occupational safety of pregnant women, especially during the first half of pregnancy.