24 results for Difference-in-Differences estimation (DID)

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 100.00%

Abstract:

Summary: Heritabilities and repeatabilities of harness racing performance measures based on race-specific results

Relevance: 100.00%

Abstract:

One of the central tasks in the statistical analysis of mathematical models is the estimation of the models' unknown parameters. This Master's thesis is concerned with the distributions of the unknown parameters and with numerical methods suitable for constructing them, especially in cases where the model is nonlinear with respect to the parameters. Among the various numerical methods, the main emphasis is on Markov chain Monte Carlo (MCMC) methods. These computationally intensive methods have recently grown in popularity, mainly because of the increase in available computing power. The theory of both Markov chains and Monte Carlo simulation is presented to the extent needed to justify why the methods work. Of the recently developed methods, adaptive MCMC methods in particular are examined. The approach of the thesis is practical, and various issues related to the implementation of MCMC methods are emphasized. In the empirical part of the thesis, the distributions of the unknown parameters of five example models are examined using the methods presented in the theoretical part. The models describe chemical reactions and are expressed as systems of ordinary differential equations. The models were collected from chemists at Lappeenranta University of Technology and Åbo Akademi University in Turku.
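To make the MCMC idea in the abstract above concrete, here is a minimal random-walk Metropolis sampler for a single parameter of a nonlinear model. This is an illustrative sketch only, not code from the thesis; the exponential-decay model, noise level, and proposal step size are invented for the example.

```python
import numpy as np

def log_posterior(theta, t, y, sigma=0.1):
    # Nonlinear example model y = exp(-theta * t); flat prior on theta > 0.
    if theta <= 0:
        return -np.inf
    residuals = y - np.exp(-theta * t)
    return -0.5 * np.sum(residuals**2) / sigma**2

def metropolis(t, y, theta0=1.0, n_samples=5000, step=0.05, seed=0):
    """Random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, posterior ratio)."""
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    theta, logp = theta0, log_posterior(theta0, t, y)
    for i in range(n_samples):
        prop = theta + step * rng.standard_normal()
        logp_prop = log_posterior(prop, t, y)
        if np.log(rng.random()) < logp_prop - logp:  # accept/reject
            theta, logp = prop, logp_prop
        samples[i] = theta
    return samples

# Synthetic data generated with theta = 0.5
t = np.linspace(0, 5, 50)
rng = np.random.default_rng(1)
y = np.exp(-0.5 * t) + 0.1 * rng.standard_normal(t.size)
samples = metropolis(t, y)
print(samples[1000:].mean())  # posterior mean, close to the true 0.5
```

Discarding the first samples as burn-in, as done above, is the usual practice; adaptive MCMC methods of the kind the thesis examines instead tune the proposal step automatically during the run.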

Relevance: 100.00%

Abstract:

This study examines how firms interpret new, potentially disruptive technologies in their own strategic context. The work presents a cross-case analysis of four potentially disruptive technologies or technical operating models: Bluetooth, WLAN, Grid computing and the Mobile Peer-to-Peer paradigm. The technologies were investigated from the perspective of three mobile operators, a device manufacturer and a software company in the ICT industry. The theoretical background for the study consists of the resource-based view of the firm with a dynamic perspective, theories on the nature of technology and innovations, and the concept of the business model. The literature review builds up a propositional framework for estimating the amount of radical change in the companies' business model with two intermediate variables: the disruptiveness potential of a new technology, and the strategic importance of a new technology to a firm. The data was gathered in group discussion sessions in each company. The results of each case analysis were brought together to evaluate how firms interpret the potential disruptiveness in terms of changes in product characteristics and added value, technology and market uncertainty, changes in product-market positions, possible competence disruption and changes in value network positions. The results indicate that the perceived disruptiveness in terms of product characteristics does not necessarily translate into strategic importance. In addition, firms did not see the new technologies as a threat in terms of potential competence disruption.

Relevance: 100.00%

Abstract:

Construction of multiple sequence alignments is a fundamental task in bioinformatics. Multiple sequence alignments are used as a prerequisite in many bioinformatics methods, and consequently the quality of such methods can be critically dependent on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always provide biologically relevant alignments. Therefore, there is a need for an objective approach for evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In the profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position. This can increase the dimension of the parameter space and cause an overfitting problem. These two research problems are both related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues in an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the correct alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. The comparison of the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model was dramatically decreased, while the same level of accuracy was maintained.
To conclude, we have shown that conservation can be successfully used in the statistical model for alignment quality assessment and in the estimation of emission probabilities in the profile hidden Markov models.
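The positional conservation idea discussed above can be illustrated with a simple entropy-based score. This is a generic sketch, not the thesis's actual measure: conservation is taken as one minus the column's normalized Shannon entropy, and gap handling is deliberately naive.

```python
import math
from collections import Counter

def column_conservation(column, alphabet_size=20):
    """Positional conservation as 1 - normalized Shannon entropy.

    Returns 1.0 for a fully conserved column and values near 0.0 for a
    column whose residues are close to uniformly distributed over the
    amino acid alphabet. Gaps ('-') are simply ignored here.
    """
    residues = [r for r in column if r != '-']
    if not residues:
        return 0.0
    counts = Counter(residues)
    n = len(residues)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 1.0 - entropy / math.log2(alphabet_size)

# Columns of a toy alignment (sequences as rows)
alignment = ["ACDEF",
             "ACDKF",
             "ACNEF"]
columns = list(zip(*alignment))
scores = [column_conservation(col) for col in columns]
print([round(s, 2) for s in scores])  # fully conserved columns score 1.0
```

A per-column score like this can then feed a quality-prediction model over the whole alignment, which is the role the positional conservation score plays in the thesis.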

Relevance: 100.00%

Abstract:

Statistical analyses of measurements that can be described by statistical models are of the essence in astronomy and in scientific inquiry in general. The sensitivity of such analyses, of the modelling approaches, and of the consequent predictions is sometimes highly dependent on the exact techniques applied, and improvements therein can result in a significantly better understanding of the observed system of interest. In particular, optimising the sensitivity of statistical techniques in detecting the faint signatures of low-mass planets orbiting nearby stars is, together with improvements in instrumentation, essential in estimating the properties of the population of such planets, and in the race to detect Earth analogs, i.e. planets that could support liquid water and, perhaps, life on their surfaces. We review the developments in Bayesian statistical techniques applicable to the detection of planets orbiting nearby stars and to astronomical data analysis problems in general. We also discuss these techniques and demonstrate their usefulness by using various examples and detailed descriptions of the respective mathematics involved. We demonstrate the practical aspects of Bayesian statistical techniques by describing several algorithms and numerical techniques, as well as theoretical constructions, in the estimation of model parameters and in hypothesis testing. We also apply these algorithms to Doppler measurements of nearby stars to show how they can be used in practice to obtain as much information from the noisy data as possible. Bayesian statistical techniques are powerful tools in analysing and interpreting noisy data and should be preferred in practice whenever computational limitations are not too restrictive.
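As one hedged illustration of the Bayesian parameter estimation discussed above (not the thesis's own algorithms), the posterior of a Doppler semi-amplitude K can be computed on a grid for a toy sinusoidal radial-velocity signal with known period and phase, Gaussian noise, and a flat prior. All names and numbers here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0, 100, 40))       # observation times [days]
period, sigma, K_true = 12.0, 2.0, 5.0     # period, noise [m/s], amplitude [m/s]
v = K_true * np.sin(2 * np.pi * t / period) + sigma * rng.standard_normal(t.size)

K_grid = np.linspace(0, 10, 1001)          # flat prior on 0 <= K <= 10
model = np.sin(2 * np.pi * t / period)     # unit-amplitude model prediction
# Gaussian log-likelihood for each K on the grid
loglike = np.array([-0.5 * np.sum((v - K * model) ** 2) / sigma**2
                    for K in K_grid])
post = np.exp(loglike - loglike.max())     # unnormalized posterior density
post /= post.sum()                         # normalize over the grid
K_mean = (K_grid * post).sum()             # posterior mean
print(f"posterior mean K = {K_mean:.2f} m/s")  # lies near the true 5 m/s
```

For one parameter a grid like this is exact enough; the MCMC-style samplers the thesis describes become necessary once the model has many correlated parameters (period, phase, eccentricity, trends).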

Relevance: 100.00%

Abstract:

Benzodiazepines (BZD) and benzodiazepine-related drugs (RD) are the most commonly used psychotropics among the aged. The use of other psychotropics concomitantly with BZD/RD, and their combined cognitive effects, have not been studied frequently. The aim of this academic thesis was to describe and analyse the relationships between the health of the aged and the use of BZD/RD alone or concomitantly with antipsychotics, antidepressants, opioids, antiepileptics and anticholinergics. In particular, the relationships between long-term use of BZD/RD and cognitive decline were studied. Additionally, the effect of melatonin on BZD/RD withdrawal and the cognitive effects of BZD/RD withdrawal were studied. This study used multiple data sets: the first study (I) was based on clinical data on aged patients (≥65 years; N=164) admitted to Pori City Hospital due to acute disease. The second data set (Studies II and III) was based on population-based data from the Lieto Study, a clinico-epidemiological longitudinal study carried out among the aged (≥65 years) in the municipality of Lieto. Follow-up data was formed by combining the cohort data collected in 1990-1991 (N=1283) and in 1998-1999 (N=1596) from those who participated in both cohorts (N=617). The third data set (Studies IV and V) was based on the Satauni Study's data. This study was performed in the City of Pori in 2009-2010. In the RCT part of the Satauni Study, ninety-two long-term users of BZD/RD were withdrawn from their drugs using melatonin against placebo. The change in their cognitive abilities was measured during and after BZD/RD withdrawal. BZD/RD use was related to worse cognitive and functional abilities, and their use may predict worse cognitive outcomes compared with BZD/RD non-users. Hypnotic use of BZD/RD could be withdrawn with psychosocial support in motivated participants, but melatonin did not improve the withdrawal results compared with placebo.
Cognitive abilities in psychomotor tests showed no, or only modest, improvements for up to six months after BZD/RD withdrawal. This suggests that the cognitive effects of BZD/RD may be long-lasting or permanent.

Relevance: 100.00%

Abstract:

The melanocortin system is an important regulator of feeding, energy metabolism, and cardiovascular function, and it consists of the pro-opiomelanocortin (POMC) derived melanocyte-stimulating hormones (α-, β- and γ-MSH) and their endogenous melanocortin receptors, MC1R to MC5R. In the hypothalamus, α-MSH reduces food intake, and increases energy expenditure and sympathetic tone by binding to MC4R. Mutations affecting the MC4R gene lead to obesity in mammals. On the other hand, the metabolic effects of MC3R stimulation using agonists such as the endogenously expressed γ-MSH have been less extensively explored. The main objective of this study was to investigate the long-term effects of increased melanocortin tone in key areas of metabolic regulation in the central nervous system (CNS) in order to investigate the site-specific roles of both α-MSH and γ-MSH. The aim was to stereotaxically induce local overexpression of single melanocortin peptides using lentiviral vectors expressing α-MSH (LVi-α-MSH-EGFP) and γ-MSH (LVi-γ-MSH-EGFP). The lentiviral vectors were shown to produce long-term overexpression and biologically active peptides in cell-based assays. The LVi-α-MSH-EGFP was targeted to the arcuate nucleus in the hypothalamus of diet-induced obese mice, where it reduced weight gain and adiposity independently of food intake. When the nucleus tractus solitarius in the brainstem was targeted, the LVi-α-MSH-EGFP treatment was shown to cause a small decrease in adiposity, which did not impact weight development. However, the α-MSH treatment increased heart rate, which was attenuated by adrenergic receptor blockade, indicative of increased sympathetic activity. The LVi-γ-MSH-EGFP was targeted to the hypothalamus, where it decreased fat mass in mice eating the standard diet, but the effect was abated if animals consumed a high-fat Western-type diet.
When the diet induced obese mice were subjected again to the standard diet, the LVi-γ-MSH-EGFP treated animals displayed increased weight loss and reduced adiposity. These results indicate that the long-term central anti-obesity effects of α-MSH are independent of food intake. In addition, overexpression of α-MSH in the brain stem efficiently blocked the development of adiposity, but increased sympathetic tone. The evidence presented in this thesis also indicates that selective MC3R agonists such as γ-MSH could be potential therapeutics in combination with low fat diets.

Relevance: 100.00%

Abstract:

The purpose of this thesis was to investigate the environmental permits of landfills with respect to the appropriateness of risk assessments focusing on contaminant migration, structures capable of protecting the environment, waste and leachate management, and the existing environmental impacts of landfills. According to the requirements, a risk assessment is always required to demonstrate compliance with environmental protection requirements if the environmental permit decision deviates from the set requirements. However, there is reason to doubt that all relevant risk factors are identified in current risk assessment practices in order to protect people and the environment. In this dissertation, risk factors were recognized in 12 randomly selected landfills. Based on this analysis, a structural risk assessment method was created. The method was verified with two case examples. Several development needs were found in the risk assessments of the environmental permit decisions. The risk analysis equations used in the decisions did not adequately take into account all the determining factors, such as waste prospects, total risk quantification, or human-related factors. Instead of focusing on crucial factors, the environmental protection capability of a landfill is simply expressed via technical factors such as hydraulic conductivity. In this thesis, it could be shown that, using adequate risk assessment approaches, the most essential environmental impacts can be taken into account by consideration of contaminant transport mechanisms, leachate effects, and artificial landfill structures. The developed structural risk analysis (SRA) method shows that landfill structures could be designed in a more cost-efficient way by taking advantage of recycled materials or by-products.
Additionally, the research results demonstrate that the environmental protection requirements of landfills should be updated to correspond to the capability to protect the environment instead of the current simplified requirements related to advective transport only.

Relevance: 100.00%

Abstract:

In much of the previous research in the field of interactive storytelling, the focus has been on the creation of complete systems, which are then evaluated based on user experience. Less focus has been placed on finding general solutions to problems that manifest in many different types of interactive storytelling systems. The goal of this thesis was to identify potential candidates for metrics that a system could use to predict player behavior or how players experience the story they are presented with, and to put these metrics to an empirical test. The three metrics that were used were morality, relationships and conflict. The game used for user testing of the metrics, Regicide, is an interactive storytelling experience that was created in conjunction with Eero Itkonen. Data collected through user testing, in the form of internal system data and survey answers, was used to evaluate hypotheses for each metric. Of the three chosen metrics, morality performed the best in this study. Though further research and refinement may be required, the results were promising and point to the conclusion that user responses to questions of morality are a strong predictor of their choices in similar situations later on in the course of an interactive story. A similar examination of user relationships with other characters in the story did not produce promising results, but several problems were recognized in terms of methodology, and further research with a better-optimized system may yield different results. On the subject of conflict, several aspects, proposed by Ware et al. (2012), were evaluated separately. Results were inconclusive, with the aspect of directness showing the most promise.

Relevance: 100.00%

Abstract:

Most applications of airborne laser scanner data to forestry require that the point cloud be normalized, i.e., each point represents height from the ground instead of elevation. To normalize the point cloud, a digital terrain model (DTM), which is derived from the ground returns in the point cloud, is employed. Unfortunately, extracting accurate DTMs from airborne laser scanner data is a challenging task, especially in tropical forests where the canopy is normally very thick (partially closed), leading to a situation in which only a limited number of laser pulses reach the ground. Therefore, robust algorithms for extracting accurate DTMs in low-ground-point-density situations are needed in order to realize the full potential of airborne laser scanner data in forestry. The objective of this thesis is to develop algorithms for processing airborne laser scanner data in order to: (1) extract DTMs in demanding forest conditions (complex terrain and a low number of ground points) for applications in forestry; (2) estimate canopy base height (CBH) for forest fire behavior modeling; and (3) assess the robustness of LiDAR-based high-resolution biomass estimation models against different field plot designs. Here, the aim is to find out whether field plot data gathered by professional foresters can be combined with field plot data gathered by professionally trained community foresters and used in LiDAR-based high-resolution biomass estimation modeling without affecting prediction performance. The question of interest in this case is whether or not the local forest communities can achieve the level of technical proficiency required for accurate forest monitoring.
The algorithms for extracting DTMs from LiDAR point clouds presented in this thesis address the challenges of extracting DTMs in low-ground-point situations and in complex terrain, while the algorithm for CBH estimation addresses the challenge of variations in the distribution of points in the LiDAR point cloud caused by factors such as variations in tree species and the season of data acquisition. These algorithms are adaptive (with respect to point cloud characteristics) and exhibit a high degree of tolerance to variations in the density and distribution of points in the LiDAR point cloud. Results of a comparison with existing DTM extraction algorithms showed that the DTM extraction algorithms proposed in this thesis performed better with respect to the accuracy of estimating tree heights from airborne laser scanner data. On the other hand, the proposed DTM extraction algorithms, being mostly based on trend surface interpolation, cannot retain small artifacts in the terrain (e.g., bumps, small hills and depressions). Therefore, the DTMs generated by these algorithms are only suitable for forestry applications where the primary objective is to estimate tree heights from normalized airborne laser scanner data. The algorithm for estimating CBH proposed in this thesis, in turn, is based on the idea of a moving voxel, in which gaps (openings in the canopy) that act as fuel breaks are located and their height is estimated. Test results showed a slight improvement in CBH estimation accuracy over existing CBH estimation methods, which are based on height percentiles in the airborne laser scanner data. However, being based on the idea of a moving voxel, this algorithm has one main advantage over existing CBH estimation methods in the context of forest fire modeling: it has great potential for providing information about vertical fuel continuity.
This information can be used to create vertical fuel continuity maps which can provide more realistic information on the risk of crown fires compared to CBH.
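The point-cloud normalization step that the abstract above opens with can be sketched as follows. This is a generic illustration, not the thesis's DTM algorithms: it assumes the ground returns have already been classified, and interpolates a DTM with SciPy's `griddata`; the hard problem the thesis addresses is producing that ground classification robustly under thick canopy.

```python
import numpy as np
from scipy.interpolate import griddata

def normalize_point_cloud(points, ground_mask):
    """Convert elevations to heights above ground.

    points: (N, 3) array of x, y, z (elevation).
    ground_mask: boolean array marking the classified ground returns.
    Returns a copy with z replaced by height above the interpolated DTM.
    """
    ground = points[ground_mask]
    # Interpolate the ground elevation at every point's x, y location
    dtm_z = griddata(ground[:, :2], ground[:, 2], points[:, :2],
                     method="linear")
    # Fall back to nearest-neighbour outside the convex hull of ground points
    nan = np.isnan(dtm_z)
    if nan.any():
        dtm_z[nan] = griddata(ground[:, :2], ground[:, 2],
                              points[nan, :2], method="nearest")
    normalized = points.copy()
    normalized[:, 2] -= dtm_z
    return normalized

# Toy example: sloped ground plane plus one canopy point 10 m above it
xs, ys = np.meshgrid(np.arange(5.0), np.arange(5.0))
terrain = np.column_stack([xs.ravel(), ys.ravel(), 0.5 * xs.ravel()])
cloud = np.vstack([terrain, [[2.0, 2.0, 11.0]]])
is_ground = np.arange(len(cloud)) < len(terrain)
heights = normalize_point_cloud(cloud, is_ground)
print(round(heights[-1, 2], 2))  # canopy height above ground: 10.0
```

After this step, tree heights can be read directly from the normalized z values, which is exactly why the accuracy of the underlying DTM dominates the accuracy of height estimation.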

Relevance: 100.00%

Abstract:

The direct torque control (DTC) has become an accepted vector control method beside the current vector control. The DTC was first applied to asynchronous machines, and has later been applied also to synchronous machines. This thesis analyses the application of the DTC to permanent magnet synchronous machines (PMSM). In order to take full advantage of the DTC, the PMSM has to be properly dimensioned. Therefore, the effect of the motor parameters is analysed taking the control principle into account. Based on the analysis, a parameter selection procedure is presented. The analysis and the selection procedure utilize nonlinear optimization methods. The key element of a direct torque controlled drive is the estimation of the stator flux linkage. Different estimation methods - a combination of current and voltage models and improved integration methods - are analysed. The effect of an incorrectly measured rotor angle in the current model is analysed, and an error detection and compensation method is presented. The dynamic performance of an earlier presented sensorless flux estimation method is improved by enhancing the dynamic performance of the low-pass filter used and by adapting the correction of the flux linkage to torque changes. A method for the estimation of the initial angle of the rotor is presented. The method is based on measuring the inductance of the machine in several directions and fitting the measurements to a model. The model is nonlinear with respect to the rotor angle and therefore a nonlinear least squares optimization method is needed in the procedure. A commonly used current vector control scheme is the minimum current control. In the DTC, the stator flux linkage reference is usually kept constant. Achieving the minimum current requires the control of the reference. An on-line method to perform the minimization of the current by controlling the stator flux linkage reference is presented.
Also, the control of the reference above the base speed is considered. A new flux linkage estimation method is introduced for the estimation of the parameters of the machine model. In order to utilize the flux linkage estimates in off-line parameter estimation, the integration methods are improved. An adaptive correction is used in the same way as in the estimation of the controller's stator flux linkage. The presented parameter estimation methods are then used in a self-commissioning scheme. The proposed methods are tested with a laboratory drive, which consists of commercial inverter hardware with modified software and several prototype PMSMs.
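The low-pass-filtered voltage-model integrator mentioned above can be sketched in a few lines. This is a textbook-style illustration, not the thesis's implementation: the pure integral of the back-EMF, ψ = ∫(u − Ri) dt, drifts with any measurement offset, so the integrator pole is moved from zero to a small cut-off frequency `omega_c` (a value invented here for the example).

```python
import numpy as np

def estimate_stator_flux(u, i, R, dt, omega_c=5.0):
    """Voltage-model stator flux estimate with a low-pass filter
    replacing the pure integrator.

    u, i: complex-valued arrays of stator voltage and current space
    vectors; R: stator resistance; omega_c: filter cut-off [rad/s].
    A pure integrator (omega_c = 0) would accumulate offset drift.
    """
    psi = np.zeros(len(u), dtype=complex)
    for k in range(1, len(u)):
        emf = u[k] - R * i[k]                 # back-EMF u - R*i
        # Forward-Euler step of d(psi)/dt = emf - omega_c * psi
        psi[k] = psi[k - 1] + dt * (emf - omega_c * psi[k - 1])
    return psi

# Toy check: with a sinusoidal back-EMF well above the cut-off frequency,
# the flux magnitude settles near |emf| / omega.
dt = 1e-4
t = np.arange(0.0, 1.0, dt)
omega = 2 * np.pi * 50.0
u = 100.0 * np.exp(1j * omega * t)            # rotating voltage vector
i = np.zeros_like(u)                          # no current for simplicity
psi = estimate_stator_flux(u, i, R=0.0, dt=dt)
print(round(np.abs(psi[-1000:]).mean(), 3))   # close to 100/omega ~ 0.318
```

The filter slightly attenuates and phase-shifts the true flux, which is why a practical DTC drive pairs it with an adaptive correction of the flux linkage, as the abstract describes.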

Relevance: 100.00%

Abstract:

The aim of the study was to build a model for the case company for estimating short-term profitability. The research method is constructive, and the model was developed with the assistance of the accounting staff. In the theoretical part, profitability, budgeting, and forecasting itself were reviewed through a literature survey. The theoretical part sought methods that could be used in estimating short-term profitability. Based on the requirements set for the model, a judgmental method was chosen. According to the study, profitability is affected by the sales price and volume, production, raw material prices, and the change in inventory. The constructed model works reasonably well in the target company, and it is notable that the differences between mills and between machines can be fairly large. These differences are mainly due to the size of the mill and the dissimilarity of the models. The practical usefulness of the model, however, becomes best apparent when it is in use by the accounting staff. Forecasting always involves its own problems, and even new methods do not necessarily eliminate them.

Relevance: 100.00%

Abstract:

In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal product operational efficiency, and reduced life expectancy of the product. This thesis considers the reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact method to obtain surface topography is to apply photometric stereo in the estimation of surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension is evaluated in the estimation of surface roughness from gray-scale images and topographies. The results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, the skewness and variance of the brightness distribution, fractal dimension, and distance transforms exhibited strong linear correlations with standard industry roughness measures. One of the key strengths of the photometric stereo method is the acquisition of higher-frequency variation of surfaces. In this thesis, the reconstruction of planar high-frequency varying surfaces is studied in the presence of imaging noise and blur.
Two Wiener-filter-based methods are proposed, of which one is optimal in the sense of surface power spectral density given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
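Two of the simple statistical measures mentioned above, the variance and skewness of the gray-level distribution, are easy to state concretely. The snippet below is only an illustration of those generic statistics, not the thesis's evaluation pipeline; the "smooth" and "rough" toy images are synthetic noise fields invented for the example.

```python
import numpy as np

def brightness_roughness_stats(image):
    """Variance and skewness of the gray-level distribution, two of the
    simple statistical measures that can correlate with standard industry
    roughness values (illustrative computation only).
    """
    x = np.asarray(image, dtype=float).ravel()
    mean, var, std = x.mean(), x.var(), x.std()
    skew = np.mean((x - mean) ** 3) / std**3 if std > 0 else 0.0
    return var, skew

# A smooth (low-contrast) vs rough (high-contrast noise) toy image
rng = np.random.default_rng(0)
smooth = 128 + 2 * rng.standard_normal((64, 64))
rough = 128 + 30 * rng.standard_normal((64, 64))
print(brightness_roughness_stats(smooth)[0]
      < brightness_roughness_stats(rough)[0])  # True: rougher -> higher variance
```

The intuition matches the thesis's premise: a rougher surface produces stronger shading variation in the gray-scale image, which shows up directly in the brightness statistics.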

Relevance: 100.00%

Abstract:

Various strength properties of paper are measured to tell how well it resists breaks in a paper machine or in printing presses. The most often measured properties are dry tensile strength and dry tear strength. However, in many situations where paper breaks, it is not dry. For example, in web breaks after wet pressing the dry matter content can be around 45%. Thus, wet-web strength is often a more critical paper property than dry strength. Both wet and dry strength properties of the samples were measured with an L&W tensile tester. Originally this device was not designed for the measurement of wet-web tensile strength, so a new procedure to handle the wet samples was developed. The method was tested with never-dried pine kraft pulp. The effect of different strength additives on the wet-web and dry paper tensile strength was studied. The polymers used in this experiment were an aqueous solution of a cationic polyamidoamine-epichlorohydrin resin (PAE), a cationic hydrophilised polyisocyanate, and a cationic polyvinylamine (PVAm). Of the three chemicals used, only the cationic PAE considerably increased the wet-web strength. However, it was noticed that at constant solids content all the chemicals decreased the wet-web tensile strength. Thus, since all the chemicals increased the solids content, it can be concluded that they work as drainage aids, not as wet-web strength additives. Of the chemicals, only PVAm increased the dry strength; the other two even decreased it. As the chemicals were used in strongly diluted form and were injected into the pulp slurry, not onto the surface of the paper sheets, no changes in sample densities occurred. It should also be noted that all these chemicals are mainly used to improve wet strength after the drying of the web.

Relevance: 100.00%

Abstract:

Drug-drug interactions (DDIs) are an important cause of adverse drug reactions leading to excess hospitalizations. About 75% of drug metabolism is catalyzed by cytochrome P450 (CYP) enzymes, and thus these enzymes are often involved in pharmacokinetic DDIs. In general, DDIs are studied in randomized controlled clinical trials in selected study populations. The overall aim of the present studies was to perform observational pharmacoepidemiological surveys of CYP-mediated DDIs in diseases important at the population level. The prevalence of co-administration of four prodrugs (losartan, codeine, tramadol, and clopidogrel), three sulphonylureas (glibenclamide, glimepiride, and glipizide), or two statins (lovastatin and simvastatin) with well-established agents altering CYP activity, as well as of statins with fibrates, was studied in Finland utilizing data from a university hospital medication database (inpatients) and the National Prescription Register of the Social Insurance Institution of Finland, Kela (outpatients). The clinical consequences of potential DDIs were estimated by reviewing laboratory data and information from hospital care and cause-of-death registers. Concomitant use of the study substrates with interacting medication was detected in up to one fifth of patients in both hospital and community settings. Potential CYP3A4 interactions in statin users did not manifest in clearly adverse laboratory values, but pharmacodynamic DDIs between statins and fibrates predisposed patients to muscular toxicity. Sulphonylurea DDIs with CYP2C9 inhibitors increased the risk of hypoglycaemia. CYP3A4 inhibitor use with clopidogrel was not associated with significant changes in mortality, but non-fatal thrombosis and haemorrhage complications were seen less often in this group. Concomitant administration of atorvastatin with clopidogrel moderately attenuated the antithrombotic effect of clopidogrel. Overall mortality was increased in concomitant users of CYP3A4 inducers and clopidogrel.
Atorvastatin used concomitantly with prodrug clopidogrel seems to be beneficial in terms of total and LDL cholesterol concentrations, and overall mortality compared with clopidogrel use without interacting medication. In conclusion, CYP-mediated DDIs are a common and often unrecognized consequence of irrational drug prescribing.