78 results for Measurement instruments

in Helda - Digital Repository of the University of Helsinki


Relevance: 20.00%

Publisher:

Abstract:

The aim of this dissertation was to adapt a questionnaire for assessing students’ approaches to learning and their experiences of the teaching-learning environment. A further aim was to explore the validity of the modified Experiences of Teaching and Learning Questionnaire (ETLQ) by examining how the instrument measures the underlying dimensions of student experiences and their learning. The focus was on the relation between students’ experiences of their teaching-learning environment and their approaches to learning. Moreover, the relation between students’ experiences and students’ and teachers’ conceptions of good teaching was examined. In Study I the focus was on the use of the ETLQ in two different contexts: Finnish and British. The study aimed to explore the similarities and differences between the factor structures that emerged from the two data sets. The results showed that the factor structures concerning students’ experiences of their teaching-learning environment and their approaches to learning were highly similar in the two contexts. Study I also examined how students’ experiences of the teaching-learning environment are related to their approaches to learning in the two contexts. The results showed that students’ positive experiences of their teaching-learning environment were positively related to their deep approach to learning and negatively related to the surface approach to learning in both the Finnish and British data sets. This result was replicated in Study II, which examined the relation between approaches to learning and experiences of the teaching-learning environment at the group level. Furthermore, Study II aimed to explore students’ approaches to learning and their experiences of the teaching-learning environment in different disciplines. The results showed that the deep approach to learning was more common in the soft sciences than in the hard sciences.
In Study III, students’ conceptions of good teaching were explored using qualitative methods, more precisely, open-ended questions. The aim was to examine students’ conceptions, disciplinary differences and their relation to students’ approaches to learning. The focus was on three disciplines, which differed in terms of students’ experiences of their teaching-learning environment. The results showed that students’ conceptions of good teaching were in line with the theory of good teaching, and that there were disciplinary differences in their conceptions. Study IV examined university teachers’ conceptions of good teaching, which corresponded to the learning-focused approach to teaching. A further aim of this doctoral dissertation was to compare students’ and teachers’ conceptions of good teaching; these conceptions appeared to be similar. The four studies indicated that the ETLQ is a sufficiently robust measurement instrument in different contexts. Moreover, its strength is that it can serve simultaneously as a valid research instrument and as a practical tool for enhancing the quality of students’ learning. In addition, the four studies emphasise that in order to enhance teaching and learning in higher education, various perspectives have to be taken into account. This study sheds light on the interaction between students’ approaches to learning, their conceptions of good teaching, their experiences of the teaching-learning environment, and finally, the disciplinary culture.
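The reported positive link between experiences of the teaching-learning environment and the deep approach (and the negative link with the surface approach) is, at its core, a correlation between scale scores. A minimal sketch, using entirely hypothetical Likert-scale means rather than the study's data:

```python
import statistics

def pearson(x, y):
    """Pearson correlation between two equally long lists of scale scores."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical scale means (1-5 Likert scale) for five students
experiences = [3.2, 4.1, 2.5, 4.4, 3.8]  # experiences of the environment
deep        = [3.0, 4.2, 2.8, 4.5, 3.6]  # deep approach to learning
surface     = [3.5, 2.4, 4.0, 2.1, 2.9]  # surface approach to learning
print(pearson(experiences, deep) > 0)     # True: positive relation
print(pearson(experiences, surface) < 0)  # True: negative relation
```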


Aims: The aims of this study were 1) to identify and describe health economic studies that have used quality-adjusted life years (QALYs) based on actual measurements of patients' health-related quality of life (HRQoL); 2) to test the feasibility of routine collection of HRQoL data as an indicator of effectiveness of secondary health care; and 3) to establish and compare the cost-utility of three large-volume surgical procedures in a real-world setting in the Helsinki University Central Hospital, a large referral hospital providing secondary and tertiary health-care services for a population of approximately 1.4 million. Patients and methods: To identify studies that have used QALYs as an outcome measure, a systematic search of the literature was performed using the Medline, Embase, CINAHL, SCI and Cochrane Library electronic databases. Initial screening of the identified articles involved two reviewers independently reading the abstracts; the full-text articles were also evaluated independently by two reviewers, with a third reviewer used in cases where the two reviewers could not reach a consensus on which articles should be included. The feasibility of routinely evaluating the cost-effectiveness of secondary health care was tested by setting up a system for collecting data on approximately 4,900 patients' HRQoL before and after operative treatments performed in the hospital. The HRQoL data used as an indicator of treatment effectiveness were combined with diagnostic and financial indicators routinely collected in the hospital. To compare the cost-effectiveness of three surgical interventions, 712 patients admitted for routine operative treatment completed the 15D HRQoL questionnaire before and also 3-12 months after the operation. QALYs were calculated using the obtained utility data and the expected remaining life years of the patients.
Direct hospital costs were obtained from the clinical patient administration database of the hospital, and a cost-utility analysis was performed from the perspective of the provider of secondary health care services. Main results: The systematic review (Study I) showed that although QALYs gained are considered an important measure of the effectiveness of health care, the number of studies in which QALYs are based on actual measurements of patients' HRQoL is still fairly limited. Of the reviewed full-text articles, only 70 reported QALYs based on actual before-after measurements using a valid HRQoL instrument. Collection of simple cost-effectiveness data in secondary health care is feasible and could easily be expanded and performed on a routine basis (Study II). It allows meaningful comparisons between various treatments and provides a means for allocating limited health care resources. The cost per QALY gained was €2,770 for cervical operations and €1,740 for lumbar operations. In cases where surgery was delayed, the cost per QALY was doubled (Study III). The cost per QALY varied between subgroups in cataract surgery (Study IV). The cost per QALY gained was €5,130 for patients having both eyes operated on and €8,210 for patients with only one eye operated on during the 6-month follow-up. In patients whose first eye had been operated on prior to the study period, the mean HRQoL deteriorated after surgery, thus precluding the establishment of the cost per QALY. In arthroplasty patients (Study V) the mean cost per QALY gained in a one-year period was €6,710 for primary hip replacement, €52,270 for revision hip replacement, and €14,000 for primary knee replacement. Conclusions: Although the importance of cost-utility analyses has been stressed during recent years, there are only a limited number of studies in which the evaluation is based on patients' own assessment of the treatment effectiveness.
Most of the cost-effectiveness and cost-utility analyses are based on modeling that employs expert opinion regarding the outcome of treatment, not on patient-derived assessments. Routine collection of effectiveness information from patients entering treatment in secondary health care turned out to be easy enough and did not, for instance, require additional personnel on the wards in which the study was executed. The mean patient response rate was more than 70%, suggesting that patients were happy to participate and appreciated the fact that the hospital showed an interest in their well-being even after the actual treatment episode had ended. Spinal surgery leads to a statistically significant and clinically important improvement in HRQoL. The cost per QALY gained was reasonable, at less than half of that observed, for instance, for hip replacement surgery. However, prolonged waiting for an operation approximately doubled the cost per QALY gained from the surgical intervention. The mean utility gain following routine cataract surgery in a real-world setting was relatively small and confined mostly to patients who had had both eyes operated on. The cost of cataract surgery per QALY gained was higher than previously reported and was associated with a considerable degree of uncertainty. Hip and knee replacement both improve HRQoL. The cost per QALY gained from knee replacement is two-fold compared to hip replacement. Cost-utility results from the three studied specialties showed that there is great variation in the cost-utility of surgical interventions performed in a real-world setting, even when only common, widely accepted interventions are considered. However, the cost per QALY of all the studied interventions, except for revision hip arthroplasty, was well below €50,000, a figure sometimes cited in the literature as a threshold level for the cost-effectiveness of an intervention.
Based on the present study it may be concluded that routine evaluation of the cost-utility of secondary health care is feasible and produces information essential for a rational and balanced allocation of scarce health care resources.
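The cost-utility arithmetic described above can be sketched as follows; the utility-change-times-life-years model and all figures are illustrative simplifications, not the study's exact calculation:

```python
def qaly_gain(utility_before, utility_after, remaining_life_years):
    """QALYs gained: the change in HRQoL utility (0-1 scale, e.g. from the
    15D instrument) multiplied by the expected remaining life years."""
    return (utility_after - utility_before) * remaining_life_years

def cost_per_qaly(direct_cost, qalys_gained):
    """Cost-utility ratio from the service provider's perspective."""
    if qalys_gained <= 0:
        return None  # undefined when HRQoL does not improve
    return direct_cost / qalys_gained

# Hypothetical patient: utility rises from 0.85 to 0.90 with 20 expected
# remaining life years, at a direct hospital cost of 2770 euros
gain = qaly_gain(0.85, 0.90, 20)    # about 1.0 QALY
print(cost_per_qaly(2770.0, gain))  # about 2770 euros per QALY
```

Note that when mean utility deteriorates after treatment, as in the first-eye cataract subgroup, the ratio is undefined, which the sketch signals by returning `None`.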


The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in the year 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement with the objective of fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper uses for monitoring animals often decreases. This has created a need for systems for automatically monitoring the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour-intensive as an on-farm method, and the results are subjective. A four-balance system for measuring the leg load distribution of dairy cows during milking in order to detect lameness was developed and set up at the University of Helsinki research farm Suitia.
The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed. A probabilistic neural network (PNN) classifier model was chosen for the task. The data was divided into two parts, and 5,074 measurements from 37 cows were used to train the model. The operation of the model was evaluated for its ability to detect lameness in the validation dataset, which had 4,868 measurements from 36 cows. The model was able to classify 96% of the measurements correctly as sound or lame cows, and 100% of the lameness cases in the validation data were identified. The number of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and can be used in a real-time lameness monitoring system.
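A probabilistic neural network of the kind chosen for the lameness classifier is essentially a Parzen-window kernel classifier. The following sketch assumes Gaussian kernels and uses hypothetical two-dimensional features (a hind-leg load ratio and a kick count); the actual model used the measured leg loads and kicks described above:

```python
import math

def pnn_classify(x, train, sigma=1.0):
    """Minimal probabilistic neural network (Parzen-window) classifier.
    `train` maps a class label to a list of training feature vectors;
    returns the label whose summed Gaussian-kernel activation is highest."""
    def activation(samples):
        total = 0.0
        for w in samples:
            d2 = sum((a - b) ** 2 for a, b in zip(x, w))
            total += math.exp(-d2 / (2 * sigma ** 2))
        return total / len(samples)
    return max(train, key=lambda label: activation(train[label]))

# Toy 2-D features: [hind-leg load ratio, kicks per milking]
train = {
    "sound": [[1.00, 0.0], [0.97, 1.0], [1.03, 0.0]],
    "lame":  [[0.70, 3.0], [0.65, 4.0], [0.75, 5.0]],
}
print(pnn_classify([0.72, 4.0], train))  # lame
print(pnn_classify([1.00, 0.0], train))  # sound
```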


This study evaluates how the advection of precipitation, or wind drift, between the radar volume and the ground affects radar measurements of precipitation. Normally precipitation is assumed to fall vertically to the ground from the contributing volume, and thus the radar measurement represents the geographical location immediately below. In this study radar measurements are corrected using hydrometeor trajectories calculated from measured and forecasted winds, and the effect of trajectory correction on the radar measurements is evaluated. Wind drift statistics for Finland are compiled using sounding data from two weather stations spanning two years. For each sounding, the hydrometeor phase at ground level is estimated and the drift distance calculated using different originating level heights. This way the drift statistics are constructed as a function of range from the radar and elevation angle. On average, wind drift of 1 km was exceeded at approximately 60 km distance, while drift of 10 km was exceeded at 100 km distance. Trajectories were calculated using model winds in order to produce a trajectory-corrected ground field from radar PPI images. It was found that on the upwind side of the radar the effective measuring area was reduced, as some trajectories exited the radar volume scan. On the downwind side, areas near the edge of the radar measuring area experienced improved precipitation detection. The effect of trajectory correction is most prominent in instant measurements and diminishes when accumulating over longer time periods. Furthermore, measurements of intensive and small-scale precipitation patterns benefit most from wind drift correction. The contribution of wind drift to the uncertainty of the estimated Ze(S) relationship was studied by simulating the effect of different error sources on the uncertainty in the relationship coefficients a and b.
The overall uncertainty was assumed to consist of systematic errors of both the radar and the gauge, as well as errors caused by turbulence at the gauge orifice and by wind drift of precipitation. The focus of the analysis is the error associated with wind drift, which was determined by describing the spatial structure of the reflectivity field using spatial autocovariance (or a variogram). This spatial structure was then used with the calculated drift distances to estimate the variance in the radar measurement produced by precipitation drift, relative to the other error sources. It was found that the error caused by wind drift was of a similar magnitude to the error caused by turbulence at the gauge orifice at all ranges from the radar, with systematic errors of the instruments being a minor issue. The correction method presented in the study could be used in radar nowcasting products to improve the estimation of visibility and local precipitation intensities. The method, however, only considers pure snow, and for operational purposes some improvements are desirable, such as melting layer detection, VPR correction and taking the solid hydrometeor type into account, which would improve the estimation of the vertical velocities of the hydrometeors.
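The core drift calculation, horizontal advection of a falling hydrometeor through stacked wind layers, can be sketched as follows; the constant terminal fall speed and the layer values are illustrative assumptions, not the study's sounding data:

```python
def drift_distance(layers, fall_speed=1.0):
    """Horizontal drift of a hydrometeor falling through stacked wind
    layers. Each layer is (thickness_m, wind_speed_ms). Assumes a constant
    terminal fall speed (about 1 m/s is typical for snow). Returns metres:
    in each layer the particle spends thickness/fall_speed seconds being
    advected at the layer's wind speed."""
    return sum(thickness / fall_speed * wind for thickness, wind in layers)

# Hypothetical sounding: three 1000 m layers with increasing wind speed
layers = [(1000.0, 5.0), (1000.0, 10.0), (1000.0, 15.0)]
print(drift_distance(layers))  # 30000.0 m, i.e. 30 km of drift
```

The same calculation per sounding level is what lets the drift statistics be tabulated as a function of originating height, and hence of range and elevation angle.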


This thesis studies empirically whether measurement errors in aggregate production statistics affect sentiment and future output. Initial announcements of aggregate production are subject to measurement error, because many of the data required to compile the statistics are produced with a lag. This measurement error can be gauged as the difference between the latest revised statistic and its initial announcement. Assuming aggregate production statistics help forecast future aggregate production, these measurement errors are expected to affect macroeconomic forecasts. Assuming agents’ macroeconomic forecasts affect their production choices, these measurement errors should affect future output through sentiment. This thesis is primarily empirical, so the theoretical basis, strategic complementarity, is discussed quite briefly. In short, it is a model in which higher aggregate production increases each agent’s incentive to produce. In this circumstance a statistical announcement which suggests that aggregate production is high would increase each agent’s incentive to produce, thus resulting in higher aggregate production. In this way the existence of strategic complementarity provides the theoretical basis for output fluctuations caused by measurement mistakes in aggregate production statistics. Previous empirical studies suggest that measurement errors in gross national product affect future aggregate production in the United States. Additionally, it has been demonstrated that measurement errors in the Index of Leading Indicators affect forecasts by professional economists as well as future industrial production in the United States. This thesis aims to verify the applicability of these findings to other countries, and to study the link between measurement errors in gross domestic product (GDP), sentiment, and future output.
Professional forecasts and consumer sentiment in the United States and Finland, as well as producer sentiment in Finland, are used as the measures of sentiment. Using statistical techniques, it is found that measurement errors in gross domestic product affect forecasts and producer sentiment. The effect on consumer sentiment is ambiguous. The relationship between measurement errors and future output is explored using data from Finland, the United States, the United Kingdom, New Zealand and Sweden. It is found that measurement errors have affected aggregate production or investment in Finland, the United States, the United Kingdom and Sweden. Specifically, it was found that overly optimistic statistics announcements are associated with higher output and vice versa.
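The basic quantities, announcement optimism gauged against later revisions and its relation to subsequent output, can be sketched with a simple OLS regression; all the figures below are hypothetical and the sign of the toy slope is chosen to mirror the stated finding:

```python
def optimism(initial, revised):
    """How much each initial announcement overstated output: the initial
    announcement minus the latest revised figure (i.e. the measurement
    error with its sign flipped)."""
    return [i - r for i, r in zip(initial, revised)]

def ols_slope(x, y):
    """Slope of a simple OLS regression of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

# Hypothetical initial and revised GDP growth announcements (%), and the
# output growth observed afterwards
initial = [2.3, 0.9, 2.9, 1.4]
revised = [2.0, 1.0, 2.5, 1.4]
future  = [2.1, 0.8, 2.6, 1.2]
print(ols_slope(optimism(initial, revised), future) > 0)  # True here:
# optimistic announcements go together with higher subsequent output
```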


Phytoplankton ecology and productivity constitute one of the main branches of contemporary oceanographic research, and research groups in this branch have increasingly started to utilise bio-optical applications. My main research objective was to critically investigate the advantages and deficiencies of fast repetition rate (FRR) fluorometry for studies of the productivity of phytoplankton and the responses of phytoplankton to varying environmental stress. Second, I aimed to clarify the applicability of the FRR system to the optical environment of the Baltic Sea. The FRR system offers a highly dynamic tool for studies of phytoplankton photophysiology and productivity, both in the field and in a controlled environment. The FRR method provides high-frequency in situ determinations of the light-acclimative and photosynthetic parameters of intact phytoplankton communities. The measurement protocol is relatively easy to use and involves no phases requiring analytical determinations. The most notable application of the FRR system lies in its potential for making primary productivity (PP) estimations. However, the realisation of this scheme is not straightforward. FRR-based PP estimates, derived from the photosynthetic electron flow (PEF) rate, are linearly related to photosynthetic gas exchange (14C fixation) PP only in environments where photosynthesis is light-limited. If light limitation is not present, as is usually the case in the near-surface layers of the water column, the two PP approaches will deviate. The prompt response of the PEF rate to short-term variability in the natural light field makes field comparisons between PEF-PP and 14C-PP difficult to interpret, because this variability is averaged out in the 14C incubations. Furthermore, the FRR-based PP models are tuned to closely follow the vertical pattern of the underwater irradiance.
Due to the photoacclimational plasticity of phytoplankton, this easily leads to overestimates of water column PP if precautionary measures are not taken. Natural phytoplankton is subject to broad-waveband light. Active non-spectral bio-optical instruments, like the FRR fluorometer, emit light in a relatively narrow waveband, which by its nature does not represent the in situ light field. Thus, the spectrally dependent parameters provided by the FRR system need to be spectrally scaled to the natural light field of the Baltic Sea. In general, the requirement of spectral scaling in water bodies under terrestrial impact concerns all light-adaptive parameters provided by any active non-spectral bio-optical technique. The FRR system can be adapted to studies of all phytoplankton that possess efficient light harvesting in the waveband matching the bluish FRR excitation. Although these taxa cover the bulk of all phytoplankton taxa, one exception with a pronounced ecological significance is found in the Baltic Sea. The FRR system cannot be used to monitor the photophysiology of cyanobacterial taxa harvesting light in the yellow-red waveband. These include the ecologically significant bloom-forming cyanobacterial taxa of the Baltic Sea.
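The stated divergence between PEF-based and 14C-based PP outside light-limited conditions can be illustrated with a standard photosynthesis-irradiance model (the Jassby-Platt hyperbolic tangent form); the parameters are hypothetical and this is not the thesis's own PP model:

```python
import math

def p_vs_i(irradiance, p_max=10.0, alpha=0.1):
    """Jassby-Platt photosynthesis-irradiance curve with hypothetical
    parameters: production saturates at p_max under high light."""
    return p_max * math.tanh(alpha * irradiance / p_max)

def linear_estimate(irradiance, alpha=0.1):
    """An electron-flow-style estimate that stays proportional to light."""
    return alpha * irradiance

# The two agree under light limitation but diverge towards saturation
for light in (10, 50, 200):
    print(light, round(p_vs_i(light), 2), round(linear_estimate(light), 2))
```

At low irradiance the two values are nearly identical; near the surface, where light saturates photosynthesis, the linear (electron-flow-like) estimate keeps growing while the saturating curve does not, which is the overestimation risk the abstract describes.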


The prevalence and assessment of neuroleptic-induced movement disorders (NIMDs) were studied in a naturalistic schizophrenia population using conventional neuroleptics. We recruited 99 chronic, institutionalized adult schizophrenia patients from a state nursing home in central Estonia. The total prevalence of NIMDs according to the diagnostic criteria of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) was 61.6%, and 22.2% of the patients had more than one NIMD. We explored the reliability and validity of different instruments for measuring these disorders. First, we compared DSM-IV with the established observer rating scales: the Barnes Akathisia Rating Scale (BARS), the Simpson-Angus Scale (SAS) (for neuroleptic-induced parkinsonism, NIP) and the Abnormal Involuntary Movement Scale (AIMS) (for tardive dyskinesia), all three of which have been used for diagnosing NIMDs. We found a good overlap of cases for neuroleptic-induced akathisia (NIA) and tardive dyskinesia (TD) but somewhat poorer overlap for NIP, for which we suggest raising the commonly used threshold value of 0.3 to 0.65. Second, we compared the established observer rating scales with an objective motor measurement, namely controlled rest lower limb activity measured by actometry. Actometry supported the validity of BARS and SAS, but it could not be used alone in this naturalistic population with several co-existing NIMDs, as it could not differentiate the disorders from each other. Quantitative actometry may be useful in measuring changes in NIA and NIP severity in situations where the diagnosis has been made using another method. Third, after the relative failure of quantitative actometry to show diagnostic power in a naturalistic population, we explored descriptive ways of analysing actometric data, and demonstrated diagnostic power for pooled NIA and pseudoakathisia (PsA) in our population.
A subjective question concerning movement problems was able to discriminate NIA patients from all other subjects. Answers to this question were not selective for the other NIMDs. Chronic schizophrenia populations are common worldwide, and NIMDs affected two-thirds of our study population. Prevention, diagnosis and treatment of NIMDs warrant more attention, especially in countries where typical antipsychotics are frequently used. Our study supported the validity and reliability of the DSM-IV diagnostic criteria for NIMDs in comparison with established rating scales and actometry. SAS can be used with minor modifications for screening purposes. Controlled rest lower limb actometry was not diagnostically specific in our naturalistic population with several co-morbid NIMDs, but it may be sensitive in measuring changes in NIMDs.


Background: The incidence of all forms of congenital heart defects is 0.75%. For patients with congenital heart defects, life expectancy has improved with new treatment modalities. Structural heart defects may require surgical or catheter treatment, which may be corrective or palliative. Even those with corrective therapy need regular follow-up due to residual lesions, late sequelae, and possible complications after interventions. Aims: The aim of this thesis was to evaluate cardiac function before and after treatment for volume overload of the right ventricle (RV) caused by atrial septal defect (ASD), volume overload of the left ventricle (LV) caused by patent ductus arteriosus (PDA), and pressure overload of the LV caused by coarctation of the aorta (CoA), and to evaluate cardiac function in patients with Mulibrey nanism. Methods: In Study I, of the 24 children with ASD, 7 underwent surgical correction and 17 percutaneous occlusion of ASD. Study II had 33 patients with PDA undergoing percutaneous occlusion. In Study III, 28 patients with CoA underwent either surgical correction or percutaneous balloon dilatation of CoA. Study IV comprised 26 children with Mulibrey nanism. A total of 76 healthy volunteer children were examined as a control group. In each study, controls were matched to patients. All patients and controls underwent clinical cardiovascular examinations, two-dimensional (2D) and three-dimensional (3D) echocardiographic examinations, and blood sampling for measurement of natriuretic peptides prior to the intervention and two or three times thereafter. Control children were examined once by 2D and 3D echocardiography. M-mode echocardiography was performed from the parasternal long-axis view directed by 2D echocardiography. The left atrium-to-aorta (LA/Ao) ratio was calculated as an index of LA size.
The end-diastolic and end-systolic dimensions of the LV as well as the end-diastolic thicknesses of the interventricular septum and LV posterior wall were measured. LV volumes, as well as the fractional shortening (FS) and ejection fraction (EF) as indices of contractility, were then calculated, and the z scores of LV dimensions determined. Diastolic function of the LV was estimated from the mitral inflow signal obtained by Doppler echocardiography. In three-dimensional echocardiography, time-volume curves were used to determine end-diastolic and end-systolic volumes, stroke volume, and EF. Diastolic and systolic function of the LV was estimated from the calculated first derivatives of these curves. Results: (I): In all children with ASD, during the one-year follow-up, the z score of the RV end-diastolic diameter decreased and that of the LV increased. However, dilatation of the RV did not resolve entirely during the follow-up in either treatment group. In addition, the size of the LV increased more slowly in the surgical subgroup but reached control levels in both groups. Concentrations of natriuretic peptides in patients treated percutaneously increased during the first month after ASD closure and normalized thereafter, but in patients treated surgically, they remained higher than in controls. (II): In the PDA group, at baseline, the end-diastolic diameter of the LV exceeded 2 SD in 5 of 33 patients. The median N-terminal pro-brain natriuretic peptide (proBNP) concentration before closure measured 72 ng/l in the control group and 141 ng/l in the PDA group (P = 0.001), and 6 months after closure it measured 78.5 ng/l (P = NS). Patients differed from control subjects in indices of LV diastolic and systolic function at baseline, but by the end of follow-up, all these differences had disappeared. Even in the subgroup of patients with a normal-sized LV at baseline, the LV end-diastolic volume decreased significantly during follow-up.
(III): Before repair, the size and wall thickness of the LV were greater in patients with CoA than in controls. Systolic blood pressure measured a median of 123 mm Hg in patients before repair (P < 0.001) and 103 mm Hg one year thereafter, compared with 101 mm Hg in controls. The diameter of the coarctation segment measured a median of 3.0 mm at baseline and 7.9 mm at the 12-month follow-up (P = 0.006). Thicknesses of the interventricular septum and posterior wall of the LV decreased after repair but increased to the initial level one year thereafter. The velocity time integrals of mitral inflow increased, but no changes were evident in LV dimensions or contractility. During follow-up, serum levels of natriuretic peptides decreased, correlating with diastolic and systolic indices of LV function in 2D and 3D echocardiography. (IV): In 2D echocardiography, the interventricular septum and LV posterior wall were thicker, and velocity time integrals of mitral inflow shorter, in patients with Mulibrey nanism than in controls. In 3D echocardiography, LV end-diastolic volume measured a median of 51.9 (range 33.3 to 73.4) ml/m² in patients and 59.7 (range 37.6 to 87.6) ml/m² in controls (P = 0.040), and serum levels of ANPN and proBNP a median of 0.54 (range 0.04 to 4.7) nmol/l and 289 (range 18 to 9170) ng/l in patients and 0.28 (range 0.09 to 0.72) nmol/l (P < 0.001) and 54 (range 26 to 139) ng/l (P < 0.001) in controls. They correlated with several indices of diastolic LV function. Conclusions: (I): During the one-year follow-up after ASD closure, RV size decreased but did not normalize in all patients. The size of the LV normalized after ASD closure, but the increase in LV size was slower in patients treated surgically than in those treated with the percutaneous technique. Serum levels of ANPN and proBNP were elevated prior to ASD closure but decreased thereafter to control levels in patients treated with the percutaneous technique, but not in those treated surgically.
(II): Changes in LV volume and function caused by PDA disappeared by 6 months after percutaneous closure. Even the children with normal-sized LV benefited from the procedure. (III): After repair of CoA, the RV size and the velocity time integrals of mitral inflow increased, and serum levels of natriuretic peptides decreased. Patients need close follow-up, despite cessation of LV pressure overload, since LV hypertrophy persisted even in normotensive patients with normal growth of the coarctation segment. (IV): In children with Mulibrey nanism, the LV wall was hypertrophied, with myocardial restriction and impairment of LV function. Significant correlations appeared between indices of LV function, size of the left atrium, and levels of natriuretic peptides, indicating that measurement of serum levels of natriuretic peptides can be used in the clinical follow-up of this patient group despite its dependence on loading conditions.
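The contractility and size indices named in the methods follow standard echocardiographic definitions; a minimal sketch with hypothetical measurements, not values from the study:

```python
def ejection_fraction(edv, esv):
    """Ejection fraction (%) from LV end-diastolic and end-systolic volumes."""
    return 100.0 * (edv - esv) / edv

def fractional_shortening(edd, esd):
    """Fractional shortening (%) from M-mode LV end-diastolic and
    end-systolic dimensions."""
    return 100.0 * (edd - esd) / edd

def la_ao_ratio(la_diameter, ao_diameter):
    """Left atrium-to-aorta (LA/Ao) ratio as an index of LA size."""
    return la_diameter / ao_diameter

# Hypothetical child: EDV 60 ml, ESV 24 ml; LVEDD 40 mm, LVESD 26 mm
print(round(ejection_fraction(60.0, 24.0), 1))      # 60.0 (%)
print(round(fractional_shortening(40.0, 26.0), 1))  # 35.0 (%)
```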


Gastric motility disorders, including delayed gastric emptying (gastroparesis), impaired postprandial fundic relaxation, and gastric myoelectrical disorders, can occur in type 1 diabetes, chronic renal failure, and functional dyspepsia (FD). Symptoms like upper abdominal pain, early satiation, bloating, nausea and vomiting may be related to gastroparesis. Diabetic gastroparesis is related to autonomic neuropathy. Scintigraphy is the gold standard in measuring gastric emptying, but it is expensive, requires specific equipment, and exposes patients to radiation. It does, however, also give information about the intragastric distribution of the test meal. The 13C-octanoic acid breath test (OBT) is an alternative, indirect method of measuring gastric emptying with a stable isotope. Electrogastrography (EGG) registers the slow wave originating in the pacemaker area of the stomach and regulating the peristaltic contractions of the antrum. This study compares these three methods of measuring gastric motility in patients with type 1 diabetes, functional dyspepsia, and chronic renal failure. Currently no effective drugs for treating gastric motility disorders are available. We studied the effect of nizatidine on gastric emptying, because in preliminary studies this drug had proven to have a prokinetic effect due to its cholinergic properties. Of the type 1 diabetes patients, 26% had delayed gastric emptying of solids as measured by scintigraphy. Abnormal intragastric distribution of the test meal occurred in 37% of the patients, indicating impaired fundic relaxation. The autonomic neuropathy score correlated positively with the gastric emptying rate of solids (P = 0.006), but HbA1c, plasma glucose levels, and abdominal symptoms were unrelated to gastric emptying or intragastric distribution of the test meal. Gastric emptying of both solids and liquids was normal in all FD patients, but abnormal intragastric distribution occurred in 38% of the patients.
Nizatidine improved symptom scores and quality of life in FD patients, but not significantly. Instead of enhancing gastric emptying, nizatidine slowed it in FD patients (P < 0.05). No significant difference appeared between patients and controls in the frequency of the gastric slow waves measured by EGG. The correlation between gastric half-emptying times of solids measured by scintigraphy and by OBT was poor in both type 1 diabetes and FD patients. According to this study, dynamic dual-tracer scintigraphy is more accurate than OBT or EGG in measuring gastric emptying of solids. Additionally, it provides information about gastric emptying of liquids and the intragastric distribution of the ingested test meal.
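The OBT infers gastric emptying indirectly from the appearance of 13CO2 in breath. A common analysis, which is a standard convention in the breath-test literature and not necessarily the exact procedure used in this dissertation, fits the cumulative 13C recovery to y(t) = m(1 - e^(-kt))^beta and then solves y(t1/2) = m/2 in closed form. A minimal sketch, with hypothetical fitted parameters:

```python
import math

def obt_half_emptying_time(k: float, beta: float) -> float:
    """Half-excretion time t1/2 (in the time unit of 1/k) for the
    cumulative 13C-recovery model y(t) = m * (1 - exp(-k*t))**beta.

    Setting y(t1/2) = m/2 gives (1 - exp(-k*t1/2))**beta = 1/2,
    hence t1/2 = -ln(1 - 2**(-1/beta)) / k.
    """
    return -math.log(1.0 - 2.0 ** (-1.0 / beta)) / k

# Hypothetical fitted values: k = 0.01 per minute, beta = 2
t_half = obt_half_emptying_time(k=0.01, beta=2.0)  # about 123 min
```

In practice m, k, and beta would come from a nonlinear least-squares fit to the measured breath samples; the closed-form step above only converts the fitted parameters into a half-excretion time.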

Resumo:

It is widely accepted that the global climate is warming due to human activities such as the burning of fossil fuels. We therefore find ourselves forced to decide what measures, if any, need to be taken to decrease our warming effect on the planet before irrevocable damage occurs. Research is being conducted in a variety of fields to better understand all the relevant processes governing Earth's climate and to assess the relative roles of anthropogenic and biogenic emissions into the atmosphere. One of the least well quantified problems is the impact of small aerosol particles (of both anthropogenic and biogenic origin) on climate, through reflecting solar radiation and through their ability to act as condensation nuclei for cloud droplets.

In this thesis, the compounds driving the biogenic formation of new particles in the atmosphere were examined through detailed measurements. As directly measuring the composition of these newly formed particles is extremely difficult, the approach was to study their characteristics indirectly by measuring the hygroscopicity (water uptake) and volatility (evaporation) of particles between 10 and 50 nm. To study the first step of the formation process in the sub-3 nm range, the nucleation of gaseous precursors into small clusters, the chemical composition of ambient naturally charged ions was measured. The ion measurements were performed with a newly developed mass spectrometer, which was first characterized in the laboratory before being deployed at a boreal forest measurement site; it also compared well with similar, lower-resolution instruments. The ambient measurements showed that sulfuric acid clusters dominate the negative ion spectrum during new-particle formation events, and sulfuric acid/ammonia clusters were detected in ambient air for the first time in this work.
Even though sulfuric acid is believed to be the most important gas-phase precursor driving the initial cluster formation, measurements of the hygroscopicity and volatility of growing 10-50 nm particles in Hyytiälä showed an increasing role of organic vapors with a variety of oxidation levels. This work has provided additional insight into the compounds participating both in the initial formation and in the subsequent growth of new atmospheric aerosol particles. It will hopefully prove an important step toward understanding atmospheric gas-to-particle conversion, which, by influencing cloud properties, can have important climate impacts. All available knowledge needs to be constantly updated, summarized, and brought to the attention of our decision-makers: only by increasing our understanding of all the relevant processes can we build reliable models to predict the long-term effects of decisions made today.
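Hygroscopicity measurements of this kind report a diameter growth factor GF = D_wet / D_dry at a fixed relative humidity, and that growth factor is often condensed into a single hygroscopicity parameter kappa. The sketch below uses the simplified single-parameter kappa-Koehler relation, a standard parameterization in aerosol science rather than the thesis's own method, and neglects the Kelvin (curvature) effect, so for the 10-50 nm sizes discussed it is only a rough guide:

```python
def kappa_from_growth_factor(gf: float, rh: float) -> float:
    """Hygroscopicity parameter kappa from a diameter growth factor
    gf = D_wet / D_dry measured at relative humidity rh (0-1).

    Uses the simplified kappa-Koehler relation
        gf**3 = 1 + kappa * aw / (1 - aw),
    with water activity aw approximated by rh (Kelvin effect neglected).
    """
    aw = rh
    return (gf ** 3 - 1.0) * (1.0 - aw) / aw

# Illustrative values: gf = 1.4 at 90% RH gives kappa near 0.19,
# between typical oxidized organics (~0.1) and sulfate (~0.6).
kappa = kappa_from_growth_factor(gf=1.4, rh=0.90)
```

A lower kappa for growing particles than for pure sulfate is the kind of signature that points to organic vapors contributing to the growth.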

Resumo:

Solar flares were first observed with the naked eye in white light by Richard Carrington in England in 1859. Since then these eruptions in the solar corona have intrigued scientists. It is known that flares influence the space weather experienced by the planets in a multitude of ways, for example by causing aurora borealis. Understanding flares is central to human survival in space, as astronauts cannot survive high doses of the highly energetic particles associated with large flares without contracting serious radiation disease symptoms, unless they shield themselves effectively during space missions. Flares may have been central to survival in the past as well: it has been suggested that giant flares might have played a role in exterminating many of the large species on Earth, including the dinosaurs. Having said that, prebiotic synthesis studies have shown lightning to be a decisive requirement for amino acid synthesis on the primordial Earth, and increased lightning activity could be attributed to space weather and flares.

This thesis studies flares in two ways: in the spectral and the spatial domain. We extracted spectra of the same flares using three different instruments, namely GOES (Geostationary Operational Environmental Satellite), RHESSI (Reuven Ramaty High Energy Solar Spectroscopic Imager), and XSM (X-ray Solar Monitor). The GOES spectra are low-resolution, obtained with a gas proportional counter; the RHESSI spectra are higher-resolution, obtained with germanium detectors; and the XSM spectra are very high-resolution, observed with a silicon detector. It turns out that detector technology and response substantially influence the spectra we see, and understanding them is important for deciding what conclusions to draw from the data. With imaging data there was no such luxury of choice: we used RHESSI imaging data to observe the spatial size of solar flares. In the present work the focus was primarily on current solar flares.
However, we did make use of our improved understanding of solar flares to observe young suns in NGC 2547. The same techniques used with solar monitors were applied with XMM-Newton, a stellar X-ray monitor, and, coupled with ground-based Hα observations, these techniques yielded estimates of flare parameters in young suns. The material in this thesis is therefore structured from technology to application, covering the full processing path from raw data and detector responses to concrete physical parameter results, such as the first measurement of the length of plasma flare loops in young suns.
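For readers less familiar with GOES data: the peak soft X-ray flux in the 1-8 Ångström band is conventionally mapped to the familiar A/B/C/M/X flare classes. The sketch below implements that standard NOAA classification convention, not a method from the thesis:

```python
def goes_class(flux_wm2: float) -> str:
    """GOES flare class from peak 1-8 Angstrom soft X-ray flux in W/m^2.

    Standard decade thresholds: A < 1e-7, B < 1e-6, C < 1e-5,
    M < 1e-4, X >= 1e-4; the subclass is the flux divided by the
    lower bound of the decade (so 5.2e-5 W/m^2 is an M5.2 flare).
    """
    for letter, base in (("X", 1e-4), ("M", 1e-5), ("C", 1e-6), ("B", 1e-7)):
        if flux_wm2 >= base:
            return f"{letter}{flux_wm2 / base:.1f}"
    return f"A{flux_wm2 / 1e-8:.1f}"

label = goes_class(5.2e-5)  # "M5.2"
```

Note that X-class flares have no upper decade, so subclasses above X9.9 (e.g. X17) are possible.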