5 results for platelet function tests
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
In the electrical industry, 50 Hz electric and magnetic fields are often higher than in the average working environment. The electric and magnetic fields can be studied by measuring or by calculating the fields in the environment. For example, the electric field under a 400 kV power line is 1 to 10 kV/m, and the magnetic flux density is 1 to 15 µT. The electric and magnetic fields of a power line induce a weak electric field and electric currents in the exposed body. The average current density in a human being standing under a 400 kV line is 1 to 2 mA/m². The aim of this study is to find out the possible effects of short-term exposure to the electric and magnetic fields of electricity power transmission on workers' health, in particular the cardiovascular effects. The study consists of two parts; Experiment I: influence on extrasystoles, and Experiment II: influence on heart rate. In Experiment I two groups, 26 voluntary men (Group 1) and 27 transmission-line workers (Group 2), were measured. Their electrocardiogram (ECG) was recorded with an ambulatory recorder both in and outside the field. In Group 1 the fields were 1.7 to 4.9 kV/m and 1.1 to 7.1 µT; in Group 2 they were 0.1 to 10.2 kV/m and 1.0 to 15.4 µT. In the ECG analysis the only significant observation was a decrease in heart rate after field exposure (Group 1). The drop could not be explained with the first measuring method, and therefore Experiment II was carried out. In Experiment II two groups were used; Group 1 (26 male volunteers) was measured in real field exposure, Group 2 (15 male volunteers) in "sham" fields. The subjects of Group 1 spent 1 h outside the field, then 1 h in the field under a 400 kV transmission line, and then again 1 h outside the field. Under the 400 kV line the field strength varied from 3.5 to 4.3 kV/m, and the flux density from 1.4 to 6.6 µT. Group 2 spent the entire test period (3 h) at a 33 kV outdoor testing station in a "sham" field.
ECG, blood pressure, and electroencephalogram (EEG) were measured by ambulatory methods. Before and after the field exposure, the subjects performed some cardiovascular autonomic function tests. The analysis of the results (Experiments I and II) showed that extrasystoles and arrhythmias were as frequent in the field (below 4 kV/m and 4 µT) as outside it. In Experiment II no decrease in heart rate was detected, and the systolic and diastolic blood pressure stayed nearly the same. No health effects were found in this study.
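The induced current density quoted above (1 to 2 mA/m² under a 400 kV line) can be sanity-checked with a back-of-the-envelope model. The short-circuit-current coefficient and the effective trunk cross-section used below are illustrative assumptions for a grounded adult in a vertical 50 Hz field, not values taken from the thesis:

```python
# Rough order-of-magnitude check of the induced current density figure.
# K_SC (~16 uA of total body current per kV/m of unperturbed field) and
# the effective trunk cross-section are assumed values for illustration.
K_SC = 16e-6          # A per (kV/m), assumed short-circuit current coefficient
TRUNK_AREA = 0.1      # m^2, assumed effective conducting cross-section

def induced_current_density(e_kv_per_m: float) -> float:
    """Average current density (mA/m^2) for a grounded person standing
    in a vertical 50 Hz field of e_kv_per_m kV/m."""
    i_total = K_SC * e_kv_per_m          # total body current, A
    return i_total / TRUNK_AREA * 1e3    # convert A/m^2 to mA/m^2

# A field of 10 kV/m (upper end of the range under a 400 kV line)
# gives a value consistent with the 1-2 mA/m^2 stated in the abstract.
print(round(induced_current_density(10.0), 2))  # → 1.6
```

Under these assumptions, fields of roughly 6 to 12 kV/m reproduce the 1 to 2 mA/m² range given in the abstract.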
Abstract:
Aims: This study was carried out to investigate the role of common liver function tests, and the degree of common bile duct dilatation, in the differential diagnosis of extrahepatic cholestasis, as well as the occurrence, diagnosis and treatment of iatrogenic bile duct injuries. In bile duct injuries, special attention was paid to gender and severity distribution and to long-term results. Patients and methods: All consecutive patients with common bile duct stones or malignant strictures diagnosed in ERCP between August 2000 and November 2003 were included. Common liver function tests were measured on the morning before ERCP in all of these 212 patients, and their common bile duct diameter was measured from the ERCP films. Between January 1995 and April 2002, 3,736 laparoscopic cholecystectomies were performed and a total of 32 bile duct injuries were diagnosed. All pre-, per-, and postoperative data were collected retrospectively, and the patients were also interviewed by phone. Results: Plasma bilirubin proved to be the best discriminator between CBD stones and malignant strictures (p≤0.001 compared with the other liver function tests and the degree of common bile duct dilatation). The same effect was seen in Receiver Operating Characteristic curves (AUC 0.867). With a plasma bilirubin cut-off value of 145 μmol/l, four out of five patients could be classified correctly. The degree of common bile duct dilatation proved to be worthless in differential diagnostics. After laparoscopic cholecystectomy the total risk of bile duct injury was 0.86%, including cystic duct leaks. Of the severe injuries, 86% were diagnosed in females, as were 88% of the injuries requiring operative treatment. All the cystic duct leakages and 87% of the strictures were treated endoscopically. Good long-term results were seen in 84% of the whole study population. Conclusions: Plasma bilirubin is the most effective liver function test in the differential diagnosis between CBD stones and malignant strictures.
The only value of common bile duct dilatation is its ability to verify the presence of extrahepatic cholestasis. Female gender was associated with a higher number of iatrogenic bile duct injuries; in particular, most of the major complications occurred in females. Most cystic duct leaks and common bile duct strictures can be treated endoscopically. The long-term results in our institution are at an internationally acceptable level.
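The two statistics at the heart of this abstract, accuracy at a bilirubin cut-off of 145 μmol/l and the ROC area under the curve, can be illustrated with a minimal sketch. The bilirubin values below are invented for illustration and are not patient data from the study:

```python
# Illustrative sketch (synthetic data) of cut-off classification and
# ROC AUC, the two measures reported in the abstract.
CUTOFF = 145.0  # umol/l, the cut-off value reported in the abstract

def classify(bilirubin: float) -> str:
    # Above the cut-off -> malignant stricture, at or below -> CBD stone.
    return "stricture" if bilirubin > CUTOFF else "stone"

def auc(stones, strictures):
    """Mann-Whitney estimate of the ROC AUC: the probability that a
    randomly chosen stricture patient has a higher bilirubin value than
    a randomly chosen stone patient (ties count as half)."""
    pairs = [(s, m) for s in stones for m in strictures]
    wins = sum(1.0 if m > s else 0.5 if m == s else 0.0 for s, m in pairs)
    return wins / len(pairs)

stones = [20, 35, 60, 90, 140, 180]         # synthetic umol/l values
strictures = [110, 150, 200, 260, 320, 400]  # synthetic umol/l values

correct = sum(classify(b) == "stone" for b in stones) \
        + sum(classify(b) == "stricture" for b in strictures)
# 10 of the 12 synthetic patients fall on the correct side of the cut-off,
# i.e. roughly the "four out of five" accuracy reported in the abstract.
print(correct, auc(stones, strictures))
```

The AUC of the synthetic sample above is not meant to reproduce the study's 0.867; it only shows how a single threshold (accuracy) and all thresholds at once (AUC) summarise the same discriminator.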
Abstract:
Background: Dietary supplements are widely used among elite athletes, but the prevalence of dietary supplement use among Finnish elite athletes is largely unknown. The use of asthma medication is common among athletes. In 2009, the World Anti-Doping Agency (WADA) and the International Olympic Committee (IOC) removed the requirement to document asthma by lung function tests before the use of inhaled β2-agonists. Data about medication use by Paralympic athletes (PA) are limited to a study conducted at the Athens Paralympics. Aims: To investigate the prevalence of self-reported dietary supplement use, the use of physician-prescribed medication, and the prevalence of physician-diagnosed asthma and allergies among Finnish Olympic athletes (OA). In addition, the differences in self-reported physician-prescribed medication use were compared between the Finnish Olympic and Paralympic athletes. Subjects and methods: Two cross-sectional studies were conducted in Finnish Olympic athletes receiving financial support from the Finnish Olympic Committee in 2002 (n=446) and in 2009 (n=372), and in Finnish top-level Paralympic athletes (n=92) receiving financial support from the Finnish Paralympic Committee in 2006. The results of the Paralympic study were compared with the results of the Olympic study conducted in 2009. Both the Olympic and the Paralympic athletes filled in similar semi-structured questionnaires. Results: Dietary supplements were used by 81% of the athletes in 2002 and by 73% of the athletes in 2009. After adjusting for age, sex, and type of sport, the odds ratio (OR) for the use of any dietary supplement was significantly lower in 2009 than in 2002 (OR 0.62; 95% confidence interval, CI, 0.43-0.90). Vitamin D was used by 0.7% of the athletes in 2002 but by 2% in 2009 (ns, p = 0.07). The use of asthma medication increased from 10.4% in 2002 to 13.7% in 2009 (adjusted OR 1.71; 95% CI 1.08-2.69).
For example, fixed combinations of inhaled long-acting β2-agonists (LABA) and inhaled corticosteroids (ICS) were used three times more commonly in 2009 than in 2002 (OR 3.38; 95% CI 1.26-9.12). The use of any physician-prescribed medicine (48.9% vs. 33.3%, adjusted OR 1.99; 95% CI 1.13-3.51), painkillers (adjusted OR 2.61; 95% CI 1.18-5.78), oral antibiotics (adjusted OR 4.10; 95% CI 1.30-12.87) and anti-epileptic medicines (adjusted OR 37.09; 95% CI 5.92-232.31) during the previous seven days was more common among the PA than among the OA. Conclusions: The use of dietary supplements is on the decline among Finnish Olympic athletes. The intake of some essential micronutrients, such as vitamin D, is surprisingly low, and this may even cause harm in these well-trained athletes. The use of asthma medication, especially fixed combinations of LABA and ICS, is clearly increasing among Finnish Olympic athletes. The use of any physician-prescribed medicine, especially those used to treat chronic diseases, seems to be more common among the Paralympians than among the Olympic athletes.
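The odds ratios reported above can be illustrated with a crude (unadjusted) calculation from the percentages and group sizes given in the abstract: 81% of 446 athletes used supplements in 2002 versus 73% of 372 in 2009. The adjusted OR reported in the study (0.62; 0.43-0.90) additionally controls for age, sex and type of sport, which this sketch cannot reproduce:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for group 1 (a users, b non-users) vs group 2
    (c users, d non-users), with a Wald 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log odds ratio
    lo, hi = (math.exp(math.log(or_) + s * se) for s in (-z, z))
    return or_, lo, hi

# Cell counts reconstructed from the abstract's percentages and n's.
users_2009, non_2009 = round(0.73 * 372), 372 - round(0.73 * 372)  # 272, 100
users_2002, non_2002 = round(0.81 * 446), 446 - round(0.81 * 446)  # 361, 85

or_, lo, hi = odds_ratio_ci(users_2009, non_2009, users_2002, non_2002)
print(f"crude OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# → crude OR 0.64 (95% CI 0.46-0.89)
```

The crude estimate (0.64; 0.46-0.89) lands close to the adjusted value in the abstract, which is what one would expect when the covariates are only weakly associated with supplement use.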
Abstract:
Dreaming is a pure form of phenomenality, created by the brain untouched by external stimulation or behavioral activity, yet including a full range of phenomenal contents. Thus, it has been suggested that the dreaming brain could be used as a model system in a biological research program on consciousness (Revonsuo, 2006). In the present thesis, the philosophical view of biological realism is accepted, and thus dreaming is considered a natural biological phenomenon, explainable in naturalistic terms. The major theoretical contribution of the present thesis is that it explores dreaming from a multidisciplinary perspective, integrating information from various fields of science, such as dream research, consciousness research, evolutionary psychology, and cognitive neuroscience. Further, it places dreaming into a multilevel framework and investigates the constitutive, etiological, and contextual explanations for dreaming. Currently, the only theory offering a full multilevel explanation for dreaming, that is, a theory including constitutive, etiological, and contextual level explanations, is the Threat Simulation Theory (TST) (Revonsuo, 2000a; 2000b). The empirical significance of the present thesis lies in the tests conducted on this specific theory, put forth to explain the form, content, and biological function of dreaming. The first step in the empirical testing of the TST was to define exact criteria for what counts as a ‘threatening event’ in dreams, and then to develop a detailed and reliable content analysis scale with which it is possible to empirically explore and quantify threatening events in dreams. The second step was to seek answers to the following questions derived from the TST: How frequent are threatening events in dreams? What kinds of qualities do these events have? How do threatening events in dreams relate to the most recently encoded or the most salient memory traces of threatening events experienced in waking life?
What are the effects of exposure to severe waking-life threat on dreams? The results reveal that threatening events are relatively frequent in dreams and that the simulated threats are realistic. The most common threats involve aggression, are targeted mainly against the dream self, and include simulations of relevant and appropriate defensive actions. Further, real threat experiences activate the threat simulation system in a unique manner, and dream content is modulated by the activation of long-term episodic memory traces with the highest negative saliency. To sum up, most of the predictions of the TST tested in this thesis received considerable support. The TST presents a strong argument that explains the specific design of dreams as threat simulations. The TST also offers a plausible explanation for why dreaming would have been selected for: because dreaming interacted with the environment in such a way that it enhanced the fitness of ancestral humans. By referring to a single threat simulation mechanism, it furthermore manages to explain a wide variety of dream content data that already exists in the literature, and to predict the overall statistical patterns of threat content in different samples of dreams. The TST and the empirical tests conducted to test the theory are a prime example of what a multidisciplinary approach to mental phenomena can accomplish. Thus far, dreaming seems always to have resided in the periphery of science, never regarded as worth studying by the mainstream. Nevertheless, when brought into the spotlight, the study of dreaming can greatly benefit from ideas in diverse branches of science. Vice versa, knowledge gained from the study of dreaming can be applied in various disciplines. The main contribution of the present thesis lies in putting dreaming back where it belongs, that is, into the spotlight at the crossroads of various disciplines.
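A "reliable content analysis scale" of the kind described above is typically validated by having two judges code the same dream reports independently and summarising their agreement with a chance-corrected statistic such as Cohen's kappa. The sketch below is a hypothetical illustration of that reliability check; the codings are invented and the thesis's actual scale and reliability figures are not reproduced here:

```python
# Hypothetical sketch of an inter-rater reliability check for a dream
# content analysis scale: two judges code each report for the presence
# (1) or absence (0) of a threatening event; agreement beyond chance
# is summarised with Cohen's kappa.
def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    categories = set(coder_a) | set(coder_b)
    # Observed agreement: fraction of reports coded identically.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement under independent coding with the same marginals.
    p_expected = sum(
        (coder_a.count(c) / n) * (coder_b.count(c) / n) for c in categories
    )
    return (p_observed - p_expected) / (1 - p_expected)

judge1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # invented codings, 10 reports
judge2 = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(round(cohens_kappa(judge1, judge2), 2))  # → 0.52
```

Kappa of 1.0 means perfect agreement and 0 means agreement no better than chance, so the statistic directly operationalises the "reliable" in a reliable coding scale.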