13 results for criterion variables

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

Scots pine (Pinus sylvestris L.) and Norway spruce (Picea abies (L.) Karst.) forests dominate in Finnish Lapland. The need to study the effect of both soil factors and site preparation on the performance of planted Scots pine has increased due to the problems encountered in reforestation, especially on mesic and moist, formerly spruce-dominated sites. The present thesis examines soil hydrological properties and conditions, and the effect of site preparation on them, on 10 pine- and 10 spruce-dominated upland forest sites. Finally, the effects of the site preparation and reforestation methods, and of soil hydrology, on the long-term performance of planted Scots pine are summarized. The results showed that pine and spruce sites differ significantly in their soil physical properties. Under field-capacity or wetter soil moisture conditions, planted pines presumably suffer from excessive soil water and poor soil aeration on most of the originally spruce sites, but not on the pine sites. The results also suggested that site preparation affects the soil-water regime, and thus the prerequisites for forest growth, for over two decades after site preparation. High variation in the survival and mean height of planted pine was found. The study suggested that on spruce sites, pine survival is lowest on sites that dry out slowly after rainfall events, and that height growth is fastest on soils that reach aeration conditions favourable for root growth soon after saturation, and/or where the average air-filled porosity near field capacity is large enough for good root growth. Survival, but not mean height, can be enhanced by employing intensive site preparation methods on spruce sites. On coarser-textured pine sites, site preparation methods do not affect survival, but methods affecting soil fertility, such as prescribed burning and ploughing, seem to enhance the height growth of planted Scots pines over several decades.
The use of in situ soil water content as the sole criterion for identifying sites suitable for pine reforestation was tested and found to be relatively unreliable. The thesis identified new potentially useful soil variables, which should be tested against other data in the future.

Relevance:

20.00%

Publisher:

Abstract:

Background: Patients may need massive volume-replacement therapy after cardiac surgery because of large perioperative fluid transfer and the use of cardiopulmonary bypass. Hemodynamic stability is better maintained with colloids than with crystalloids, but colloids have more adverse effects than crystalloids, such as coagulation disturbances and impairment of renal function. The present study examined the effects of modern hydroxyethyl starch (HES) and gelatin solutions on blood coagulation and hemodynamics. The mechanism by which colloids disturb blood coagulation was investigated by thromboelastometry (TEM) after cardiac surgery and in vitro by experimental hemodilution. Materials and methods: Ninety patients scheduled for elective primary cardiac surgery (Studies I, II, IV, V) and twelve healthy volunteers (Study III) were included in this study. After admission to the cardiac surgical intensive care unit (ICU), patients were randomized to receive different doses of HES 130/0.4, HES 200/0.5, or 4% albumin solutions. Ringer's acetate or albumin solutions served as controls. Coagulation was assessed by TEM, and hemodynamic measurements were based on cardiac index (CI) measured by thermodilution. Results: HES and gelatin solutions impaired whole-blood coagulation similarly, as measured by TEM, even at a small dose of 7 mL/kg. These solutions reduced clot strength and prolonged clot formation time. These effects became more pronounced with increasing doses of colloids. Neither albumin nor Ringer's acetate solution disturbed blood coagulation significantly. Coagulation disturbances after infusion of HES or gelatin solutions were clinically slight, and postoperative blood loss was comparable with that after Ringer's acetate or albumin solutions. Both single and multiple doses of all the colloids increased CI postoperatively, and this effect was dose-dependent. Ringer's acetate had no effect on CI.
At a small dose (7 mL/kg), the effect of gelatin on CI was comparable with that of Ringer's acetate and significantly less than that of HES 130/0.4 (Study V). However, when the dose was increased to 14 and 21 mL/kg, the hemodynamic effect of gelatin rose and became comparable with that of HES 130/0.4. Conclusions: After cardiac surgery, HES and gelatin solutions impaired clot strength in a dose-dependent manner. The potential mechanisms were interaction with fibrinogen and fibrin formation, resulting in decreased clot strength, and hemodilution. Although the use of HES and gelatin inhibited coagulation, postoperative bleeding on the first postoperative morning was similar in all the study groups. A single dose of HES solutions improved CI postoperatively more than did gelatin, albumin, or Ringer's acetate. However, when administered repeatedly (cumulative dose of 14 mL/kg or more), no differences were evident between HES 130/0.4 and gelatin.

Relevance:

20.00%

Publisher:

Abstract:

The aim of the studies was to improve the diagnostic capability of electrocardiography (ECG) in detecting myocardial ischemic injury, with the future goal of an automatic screening and monitoring method for ischemic heart disease. The method of choice was body surface potential mapping (BSPM), containing numerous leads, with the intention of finding the optimal recording sites and optimal ECG variables for ischemia and myocardial infarction (MI) diagnostics. The studies included 144 patients with prior MI, 79 patients with evolving ischemia, 42 patients with left ventricular hypertrophy (LVH), and 84 healthy controls. Study I examined the depolarization wave in prior MI with respect to MI location. Studies II-V examined the depolarization and repolarization waves in prior MI detection with respect to the Minnesota code, Q-wave status, and, in Study V, also with respect to MI location. In Study VI the depolarization and repolarization variables were examined in 79 patients in the face of evolving myocardial ischemia and ischemic injury. When analyzed from a single lead at any recording site, the results revealed the superiority of the repolarization variables over the depolarization variables and over the conventional 12-lead ECG methods, both in the detection of prior MI and of evolving ischemic injury. The QT integral, covering both depolarization and repolarization, appeared indifferent to Q-wave status, the time elapsed since MI, and the MI or ischemia location. In the face of evolving ischemic injury, the performance of the QT integral was not hampered even by underlying LVH. The examined depolarization and repolarization variables were effective when recorded at a single site, in contrast to the conventional 12-lead ECG criteria. The inverse spatial correlation of the depolarization and repolarization waves in myocardial ischemia and injury could be reduced to the QT integral variable recorded at a single site on the left flank.
In conclusion, the QT integral variable, detectable in a single lead with an optimal recording site on the left flank, was able to detect prior MI and evolving ischemic injury more effectively than the conventional ECG markers. The QT integral, from a single lead or a small number of leads, offers potential for automated screening for ischemic heart disease, acute ischemia monitoring, therapeutic decision-guiding, and risk stratification.

Relevance:

20.00%

Publisher:

Abstract:

The question at issue in this dissertation is the epistemic role played by ecological generalizations and models. I investigate and analyze such properties of generalizations as lawlikeness, invariance, and stability, and I ask which of these properties are relevant in the context of scientific explanations. I claim that there are generalizable and reliable causal explanations in ecology, grounded in generalizations that are invariant and stable. An invariant generalization continues to hold or be valid under a special kind of change, called an intervention, that changes the value of its variables. Whether a generalization remains invariant under such interventions is the criterion that determines whether it is explanatory. A generalization can be invariant and explanatory regardless of its lawlike status. Stability concerns a kind of generality that has to do with whether a generalization holds across possible background conditions: the more stable a generalization, the less its truth depends on background conditions. Although it is invariance rather than stability of generalizations that furnishes us with explanatory generalizations, stability has an important function in this context of explanations: it furnishes scientific explanations with extrapolability and reliability. I also discuss non-empirical investigations of models that I call robustness and sensitivity analyses. I call sensitivity analyses those investigations in which a single model is studied with regard to its stability conditions by making changes and variations to the values of the model's parameters. As a general definition of robustness analyses I propose investigations of variations in the modeling assumptions of different models of the same phenomenon, in which the focus is on whether they produce similar or convergent results.
Robustness and sensitivity analyses are powerful tools for studying the conditions and assumptions under which models break down, and they are especially powerful in pointing out the reasons why they do so: they show which conditions or assumptions the results of models depend on. Key words: ecology, generalizations, invariance, lawlikeness, philosophy of science, robustness, explanation, models, stability
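The two kinds of analysis defined above can be illustrated with a minimal numerical sketch. The logistic growth model, the alternative capped-exponential model, and all parameter values below are illustrative assumptions, not drawn from the dissertation:

```python
import numpy as np

def logistic_growth(n0, r, k, steps=100):
    """Discrete logistic growth, a stock ecological model (illustrative)."""
    n = n0
    for _ in range(steps):
        n = n + r * n * (1 - n / k)
    return n

def capped_exponential(n0, r, k, steps=100):
    """A different model of the same phenomenon: exponential growth with a ceiling."""
    n = n0
    for _ in range(steps):
        n = min(n * (1 + r), k)
    return n

# Sensitivity analysis: one model, varying the value of a parameter (r)
# and inspecting how stable the outcome is across the variation.
results = [logistic_growth(10.0, r, k=1000.0) for r in np.linspace(0.1, 1.0, 10)]
spread = max(results) - min(results)   # small spread = outcome insensitive to r here

# Robustness analysis: different models of the same phenomenon, asking
# whether they produce similar or convergent results.
convergent = abs(logistic_growth(10.0, 0.5, 1000.0)
                 - capped_exponential(10.0, 0.5, 1000.0)) < 1e-6
```

In this toy case, varying the growth rate only changes how fast the population approaches the carrying capacity, so the long-run outcome is insensitive to that parameter, while the two structurally different models converge on the same equilibrium.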

Relevance:

20.00%

Publisher:

Abstract:

In recent years, thanks to developments in information technology, large-dimensional datasets have become increasingly available. Researchers now have access to thousands of economic series, and the information contained in them can be used to create accurate forecasts and to test economic theories. To exploit this large amount of information, researchers and policymakers need an appropriate econometric model. Usual time series models, vector autoregressions for example, cannot incorporate more than a few variables. There are two ways to solve this problem: use variable selection procedures, or gather the information contained in the series into an index model. This thesis focuses on one of the most widespread index models, the dynamic factor model (the theory behind this model, based on previous literature, is the core of the first part of this study), and its use in forecasting Finnish macroeconomic indicators (the focus of the second part of the thesis). In particular, I forecast economic activity indicators (e.g. GDP) and price indicators (e.g. the consumer price index) from three large Finnish datasets. The first dataset contains a large set of aggregated series obtained from the Statistics Finland database. The second dataset is composed of economic indicators from the Bank of Finland. The last dataset is formed by disaggregated data from Statistics Finland, which I call the micro dataset. The forecasts are computed following a two-step procedure: in the first step I estimate a set of common factors from the original dataset; the second step consists of formulating forecasting equations that include the factors extracted previously. The predictions are evaluated using the relative mean squared forecast error, where the benchmark model is a univariate autoregressive model. The results are dataset-dependent.
The forecasts based on factor models are very accurate for the first dataset (the Statistics Finland one), while they are considerably worse for the Bank of Finland dataset. The forecasts derived from the micro dataset are still good, but less accurate than those obtained in the first case. This work opens up several research directions. The results obtained here can be replicated for longer datasets. The non-aggregated data can be represented in an even more disaggregated form (firm level). Finally, the use of the micro data, one of the major contributions of this thesis, can be useful in the imputation of missing values and the creation of flash estimates of macroeconomic indicators (nowcasting).
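The two-step procedure described above (factor extraction by principal components, a forecasting equation including the estimated factors, and evaluation by relative mean squared forecast error against a univariate AR benchmark) can be sketched as follows. This is an in-sample illustration on synthetic data, not the thesis's actual out-of-sample design; the panel, the target series, and all parameter values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, r = 200, 50, 2            # periods, series in the panel, number of factors

# Synthetic panel with persistent latent factors (an illustrative stand-in
# for a large macroeconomic dataset).
F = np.zeros((T, r))
shocks = rng.standard_normal((T, r))
for t in range(1, T):
    F[t] = 0.8 * F[t - 1] + shocks[t]
loadings = rng.standard_normal((N, r))
X = F @ loadings.T + rng.standard_normal((T, N))   # observed series
y = F[:, 0] + 0.5 * rng.standard_normal(T)         # target (e.g. GDP growth)

# Step 1: estimate common factors by principal components of the
# standardized panel.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
F_hat = Z @ Vt[:r].T

# Step 2: forecasting equation y_{t+h} = a + b'F_t + c*y_t, fit by OLS
# (h = 1; in-sample fit shown for brevity).
h = 1
W = np.column_stack([np.ones(T - h), F_hat[:-h], y[:-h]])
beta, *_ = np.linalg.lstsq(W, y[h:], rcond=None)
factor_mse = np.mean((y[h:] - W @ beta) ** 2)

# Benchmark: a univariate AR(1) model.
A = np.column_stack([np.ones(T - h), y[:-h]])
alpha, *_ = np.linalg.lstsq(A, y[h:], rcond=None)
ar_mse = np.mean((y[h:] - A @ alpha) ** 2)

# Relative mean squared forecast error: values below 1 favour the factor model.
rel_msfe = factor_mse / ar_mse
```

A faithful replication would re-estimate the factors and both equations recursively and compare genuine out-of-sample forecasts, but the two steps and the evaluation criterion are the same.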

Relevance:

20.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. A significant factor was also that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, published in a memoir in 1774 that is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, still prevailing today, was that the sample would be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics.
In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw samples repeatedly from the same population and to assume that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling, which gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design of the CPS. An important criterion was to have a method whose data collection costs were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.

Relevance:

20.00%

Publisher:

Abstract:

Hypertension is one of the major risk factors for cardiovascular morbidity. The advantages of antihypertensive therapy have been clearly demonstrated, but only about 30% of hypertensive patients have their blood pressure (BP) controlled by such treatment. One reason for this poor BP control may lie in the difficulty of predicting the BP response to antihypertensive treatment. The average BP reduction achieved is similar for each drug in the main classes of antihypertensive agents, but there is marked individual variation in BP responses to any given drug. The purpose of the present study was to examine the BP response to four different antihypertensive monotherapies with regard to demographic characteristics, laboratory test results and common genetic polymorphisms. The subjects of the present study are participants in the pharmacogenetic GENRES Study. A total of 208 subjects completed the whole study protocol, including four drug treatment periods of four weeks each, separated by four-week placebo periods. The study drugs were amlodipine, bisoprolol, hydrochlorothiazide and losartan. Both office (OBP) and 24-hour ambulatory blood pressure (ABP) measurements were carried out. BP responses to the study drugs were related to basic clinical characteristics, pretreatment laboratory test results and common polymorphisms in genes coding for components of the renin-angiotensin system, alpha-adducin (ADD1), the beta1-adrenergic receptor (ADRB1) and the beta2-adrenergic receptor (ADRB2). Age was positively correlated with BP responses to amlodipine and with OBP and systolic ABP responses to hydrochlorothiazide, while body mass index was negatively correlated with ABP responses to amlodipine. Of the laboratory test results, plasma renin activity (PRA) correlated positively with BP responses to losartan and with ABP responses to bisoprolol, and negatively with ABP responses to hydrochlorothiazide.
Uniquely to this study, it was found that serum total calcium level was negatively correlated with BP responses to amlodipine, whilst serum total cholesterol level was negatively correlated with ABP responses to amlodipine. There were no significant associations of angiotensin II type I receptor 1166A/C, angiotensin converting enzyme I/D, angiotensinogen Met235Thr, ADD1 Gly460Trp, ADRB1 Ser49Gly and Gly389Arg and ADRB2 Arg16Gly and Gln27Glu polymorphisms with BP responses to the study drugs. In conclusion, this study confirmed the relationship between pretreatment PRA levels and response to three classes of antihypertensive drugs. This study is the first to note a significant inverse relation between serum calcium level and responsiveness to a calcium channel blocker. However, this study could not replicate the observations that common polymorphisms in angiotensin II type I receptor, angiotensin converting enzyme, angiotensinogen, ADD1, ADRB1, or ADRB2 genes can predict BP response to antihypertensive drugs.

Relevance:

20.00%

Publisher:

Abstract:

This dissertation empirically explored interest as a motivational force in university studies, including the role it currently plays and possible ways of enhancing this role as a student motivator. The general research questions were as follows: 1) What role does interest play in university studies? 2) What explains academic success if studying is not based on interest? 3) How do different learning environments support or impede interest-based studying? Four empirical studies addressed these questions. Study 1 (n=536) compared first-year students' explanations of their disciplinary choices in three fields: veterinary medicine, humanities and law. Study 2 (n=28) focused on the role of individual interest in the humanities and veterinary medicine, fields which differ greatly in their nature of studying. Study 3 (n=52) explored veterinary students' motivation and study practices in relation to their study success. Study 4 (n=16) explored veterinary students' interest experience in individual lectures on a daily basis. By comparing different fields and focusing on one study field in more detail, it was possible to obtain a many-sided picture of the role of interest in different learning environments. Questionnaires and quantitative methods have often been used to measure interest in academic learning; the present work is based mostly on qualitative data, and qualitative methods were applied to add to the previous research. Study 1 explored students' open-ended answers, and these provided a basis for the interviews in Study 2. Study 3 explored veterinary students' portfolios in a longitudinal setting. For Study 4, a diary including both qualitative and quantitative measures was designed to capture veterinary students' interest experience. Qualitative content analysis was applied in all four studies, but quantitative analyses were also added. The thesis showed that university students often explain their disciplinary choices in terms of interest.
Because interest is related to high-quality learning, the students seemed to have a good foundation for successful studies. However, the learning environments did not always support interest-based studying: time-management and coping skills were found to be more important than interest in terms of study success. The results also indicated that interest is not the only motivational variable behind university studies; for example, future goals are needed in order to complete a degree. Even so, the results clearly indicated that it would be worth supporting interest-based studying in both professionally and generally oriented study fields. This support is important not only for promoting high-quality learning but also for meaningful studying, student well-being, and life-long learning.

Relevance:

20.00%

Publisher:

Abstract:

Background: Malaria was prevalent in Finland in the 18th century. It declined slowly without deliberate counter-measures, and the last indigenous case was reported in 1954. In the present analysis of indigenous malaria in Finland, an effort was made to construct a data set of annual malaria cases of maximum temporal length, in order to evaluate the significance of different factors assumed to affect malaria trends. Methods: To analyse the long-term trend, malaria statistics were collected for 1750–2008. During that time, malaria frequency decreased from about 20,000–50,000 per 1,000,000 people to less than 1 per 1,000,000 people. To assess the cause of the decline, a correlation analysis was performed between malaria frequency per million people and temperature data, animal husbandry, consolidation of land by redistribution, and household size. Results: Anopheles messeae and Anopheles beklemishevi exist only as larvae in June and most of July. The females seek an overwintering place in August; those that overwinter together with humans may act as vectors, and they have to stay in their overwintering place from September to May because of the cold climate. The temperatures in June and July determine the number of malaria cases during the following transmission season. This did not, however, have an impact on the long-term trend of malaria. The change in animal husbandry and the reclamation of wetlands can also be excluded as possible causes of the decline of malaria. Long-term social changes, such as land consolidation and decreasing household size, showed a strong correlation with the decline of Plasmodium. Conclusion: The indigenous malaria in Finland faded out evenly across the whole country over 200 years, with limited or no counter-measures or medication. It appears that malaria in Finland was basically a social disease and that malaria trends were strongly linked to changes in human behaviour.
Decreasing household size meant fewer interactions between families and accordingly fewer recolonization opportunities for Plasmodium. The permanent drop in household size was the precondition for the permanent eradication of malaria.
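The correlation analysis described in the Methods section can be sketched as follows. The series below are synthetic stand-ins chosen only to mimic the qualitative pattern the abstract reports (malaria and household size declining together, temperature trendless), not the actual 1750–2008 Finnish statistics:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1750, 2009)
t = years - 1750

# Synthetic stand-ins for the historical annual series (illustrative
# assumptions, not the thesis's data).
decline = np.exp(-0.03 * t)
malaria_per_million = 30000.0 * decline * rng.lognormal(0.0, 0.3, years.size)
household_size = 3.0 + 5.0 * decline**0.3 + rng.normal(0.0, 0.2, years.size)
summer_temp = 15.0 + rng.normal(0.0, 1.5, years.size)

def corr(a, b):
    """Pearson correlation coefficient between two annual series."""
    return float(np.corrcoef(a, b)[0, 1])

# The thesis's conclusion corresponds to a pattern like this: a strong
# positive correlation with household size (a social variable) and a weak
# correlation with summer temperature (a climatic variable).
r_household = corr(malaria_per_million, household_size)
r_temp = corr(malaria_per_million, summer_temp)
```

With series constructed this way, the social variable tracks the decline while the climatic one does not, which is the shape of evidence the correlation analysis was designed to detect.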