35 results for the least squares distance method
Abstract:
The cement industry is significantly associated with high greenhouse gas (GHG) emissions. Given their environmental impact, particularly global warming potential, it is important to reduce these emissions to air. The aim of this study is to investigate the potential for mitigating GHG emissions in the Ethiopian cement industry. A life cycle assessment (LCA) method was used to identify and quantify the GHG emissions arising from the production of one ton of ordinary portland cement (OPC). Three mitigation scenarios (alternative fuel use, clinker substitution and improved thermal energy efficiency) were applied to a representative gate-to-gate flow model developed with GaBi 6 software. The results indicate that clinker substitution and alternative fuel use play a major role in mitigating GHG emissions at affordable cost. Applying the most energy-efficient kiln technology, which reduces the amount of thermal energy used, has the lowest GHG emissions reduction intensity and a high implementation cost compared with the other scenarios. It was found that the cumulative GHG emissions mitigation potential across the selected mitigation scenarios can be at least 48.9% per ton of cement produced.
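As a rough illustration of how per-scenario reductions could be combined into a single cumulative figure such as the 48.9% reported above, the sketch below assumes the scenarios act multiplicatively on the remaining emissions and uses placeholder reduction fractions; neither the combination rule nor the numbers are taken from the study.

```python
# Illustrative sketch only: the abstract does not give per-scenario figures,
# so the reduction fractions below are hypothetical placeholders. The sketch
# assumes the scenarios act multiplicatively on the remaining emissions.

baseline_ghg = 1.0  # t CO2-eq per ton of OPC (placeholder baseline)

# Hypothetical per-scenario reduction fractions (NOT the study's values)
scenario_reductions = {
    "alternative fuel use": 0.15,
    "clinker substitution": 0.30,
    "thermal energy efficiency": 0.10,
}

remaining = baseline_ghg
for name, r in scenario_reductions.items():
    remaining *= (1.0 - r)

cumulative_mitigation = 1.0 - remaining / baseline_ghg
print(f"Cumulative mitigation potential: {cumulative_mitigation:.1%}")
```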
Abstract:
The aim of this study was to contribute to knowledge-based theory by addressing a research gap concerning the empirical determination of the simultaneous but distinguishable effects of intellectual capital (IC) assets and knowledge management (KM) practices on organisational performance (OP). The analysis built on past research and theorised interactions between latent constructs: IC and KM were specified using survey items measured from a sample of Finnish companies, while the dependent OP construct was determined from information available in financial databases. Two measures that are widely used and commonly recommended in the management science literature, the return on total assets (ROA) and the return on equity (ROE), were calculated for OP. The relationships through which IC and KM affect OP could thus be investigated, in relation to the stated hypotheses, using objectively derived performance indicators. Using financial OP measures also strengthened the dynamic features of the data needed to analyse simultaneous and causal dependences between the constructs specified in structural path models. The parameters of the structural path models were estimated using a partial least squares (PLS) based regression estimator. Results showed that the path dependencies between IC and OP, and between KM and OP, were always insignificant when analysed separately from other interactions or indirect effects arising from simultaneous modelling, regardless of whether ROA or ROE was used as the OP measure. The dependency between the constructs for KM and IC appeared to be very strong and was always significant when modelled simultaneously with other possible interactions between the constructs, using either ROA or ROE to define OP. This study, however, did not find statistically unambiguous evidence for the hypothesised causal mediation effects, for instance that the effects of KM practices on OP are mediated by IC assets. Because some indication of fluctuating causal effects was observed, it was concluded that further studies are needed to verify the underlying and likely hidden causal effects between the constructs of interest. It was therefore also recommended that complementary modelling and data processing be conducted to elucidate whether mediation effects occur between IC, KM and OP; verifying this requires further investigation of the measured items and can build on the findings of this study.
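As a rough sketch of the estimation step described above, the snippet below uses scikit-learn's PLSRegression as a simplified stand-in for the full PLS structural path-model estimator; the survey items, financial figures and the two-component setting are hypothetical placeholders, not the study's data or model.

```python
# Minimal sketch of a PLS-based estimation step. Survey items for IC and KM
# are stacked as predictors and ROA is used as the OP indicator; scikit-learn's
# PLSRegression stands in for a full PLS structural path model. All data below
# is synthetic and purely illustrative.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_firms = 100

# Hypothetical Likert-scale survey items (5 items each for IC and KM)
ic_items = rng.integers(1, 6, size=(n_firms, 5)).astype(float)
km_items = rng.integers(1, 6, size=(n_firms, 5)).astype(float)

# Hypothetical financial-statement figures for the OP indicators
net_income = rng.normal(5.0, 2.0, n_firms)
total_assets = rng.normal(100.0, 20.0, n_firms)
equity = rng.normal(40.0, 10.0, n_firms)
roa = net_income / total_assets   # return on total assets
roe = net_income / equity         # return on equity

X = np.hstack([ic_items, km_items])   # measured items for the latent constructs
pls = PLSRegression(n_components=2)
pls.fit(X, roa)                       # swap in roe to use the other OP measure

print("R^2 for ROA:", pls.score(X, roa))
```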
Abstract:
This thesis concerns the analysis of epidemic models. We adopt the Bayesian paradigm and develop suitable Markov chain Monte Carlo (MCMC) algorithms. This is done by considering the 1995 Ebola outbreak in the Democratic Republic of Congo (formerly Zaïre) as a case study for SEIR epidemic models. We model the Ebola epidemic deterministically using ODEs and stochastically through SDEs to take into account a possible bias in each compartment. Since the model has unknown parameters, we use different methods to estimate them, such as least squares, maximum likelihood and MCMC. The motivation for choosing MCMC over other existing methods in this thesis is its ability to tackle complicated nonlinear problems with a large number of parameters. First, in a deterministic Ebola model, we compute the likelihood function using the sum-of-squared-residuals method and estimate parameters using the least squares (LSQ) and MCMC methods. We sample parameters and then use them to calculate the basic reproduction number and to study the disease-free equilibrium. From the sampled posterior chain, we run convergence diagnostics and confirm the viability of the model. The results show that the Ebola model fits the observed onset data with high precision, and all the unknown model parameters are well identified. Second, we convert the ODE model into an SDE Ebola model. We compute the likelihood function using an extended Kalman filter (EKF) and estimate the parameters again. The motivation for using the SDE formulation here is to consider the impact of modelling errors. Moreover, the EKF approach allows us to formulate a filtered likelihood for the parameters of such a stochastic model. We use the MCMC procedure to obtain the posterior distributions of the drift and diffusion parameters of the SDE Ebola model. In this thesis, we analyse two cases: (1) the model error covariance matrix of the dynamic noise is close to zero, i.e. only a small amount of stochasticity is added to the model; the results are then similar to those obtained from the deterministic Ebola model, even though the methods of computing the likelihood function differ; (2) the model error covariance matrix is clearly different from zero, i.e. considerable stochasticity is introduced into the Ebola model. This accounts for the situation where we know that the model is not exact. As a result, we obtain parameter posteriors with larger variances. Consequently, the model predictions then show larger uncertainties, in accordance with the assumption of an incomplete model.
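The deterministic part of this workflow can be sketched as follows: an SEIR ODE system, a sum-of-squared-residuals cost against observed onsets, and a random-walk Metropolis sampler whose chain is used to estimate the basic reproduction number R0 = beta/gamma. The parameter values, error model and synthetic data below are illustrative assumptions, not the thesis's actual model specification or the 1995 DRC data.

```python
# Minimal sketch of the deterministic workflow: SEIR ODEs, a sum-of-squared-
# residuals cost, and a random-walk Metropolis sampler. Data and parameters
# are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

def seir_rhs(t, y, beta, sigma, gamma):
    S, E, I, R = y
    N = S + E + I + R
    return [-beta * S * I / N,
            beta * S * I / N - sigma * E,
            sigma * E - gamma * I,
            gamma * I]

def simulate_incidence(params, t_obs, y0):
    beta, sigma, gamma = params
    sol = solve_ivp(seir_rhs, (t_obs[0], t_obs[-1]), y0,
                    args=(beta, sigma, gamma), t_eval=t_obs)
    # use the infectious compartment as a crude proxy for observed onsets
    return sol.y[2]

def ssr(params, t_obs, data, y0):
    return np.sum((simulate_incidence(params, t_obs, y0) - data) ** 2)

# Synthetic onset data and initial state (not the 1995 DRC data)
t_obs = np.arange(0, 60, 1.0)
y0 = [5e6 - 3, 2, 1, 0]
true = (0.35, 1 / 9.4, 1 / 7.0)
data = simulate_incidence(true, t_obs, y0) \
       + np.random.default_rng(1).normal(0, 5, t_obs.size)

def metropolis(n_iter, start, step, s2=25.0):
    """Random-walk Metropolis on (beta, sigma, gamma) with a Gaussian error model."""
    chain = [np.array(start)]
    cur_cost = ssr(start, t_obs, data, y0)
    rng = np.random.default_rng(2)
    for _ in range(n_iter):
        prop = chain[-1] + rng.normal(0, step, 3)
        if np.all(prop > 0):
            cost = ssr(prop, t_obs, data, y0)
            if np.log(rng.random()) < -(cost - cur_cost) / (2 * s2):
                chain.append(prop)
                cur_cost = cost
                continue
        chain.append(chain[-1])
    return np.array(chain)

chain = metropolis(2000, start=(0.3, 0.1, 0.15), step=(0.01, 0.005, 0.005))
beta, sigma, gamma = chain[-500:].mean(axis=0)
print("posterior mean R0 estimate:", beta / gamma)   # basic reproduction number
```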
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods, which are reliable and based on scientific proof, in terms of the following criteria?
o Accuracy, i.e. the probability of detecting deception successfully
o Ease of Use, i.e. how easy it is to apply the method correctly
o Time Required to apply the method reliably
o No Need for Special Equipment
o Unobtrusiveness of the method
In order to answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof? What kinds of uncertainty and other limitations are involved in these methods? Two major databases, Google Scholar and Science Direct, were used to search for and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilising the Analytic Hierarchy Process (AHP) was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain first-hand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that the most applicable methods are not entirely trouble-free either. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies are concentrated around a scenario where roughly half of the assessed people are totally truthful and the other half are liars who present a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources. This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones that are still under development.
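The Multi-Criteria Analysis mentioned above relies on the Analytic Hierarchy Process; a minimal sketch of the AHP weighting step is given below, deriving criterion weights from a pairwise comparison matrix via its principal eigenvector. The pairwise judgments are placeholders, not the comparisons elicited in the study.

```python
# Minimal AHP sketch: derive criterion weights from a pairwise comparison
# matrix via its principal eigenvector and check the consistency ratio.
# The judgments below are placeholders, not the study's comparisons.
import numpy as np

criteria = ["Accuracy", "Ease of Use", "Time Required",
            "No Special Equipment", "Unobtrusiveness"]

# A[i, j] = how much more important criterion i is than criterion j (Saaty scale)
A = np.array([
    [1,   3,   5,   5,   7],
    [1/3, 1,   3,   3,   5],
    [1/5, 1/3, 1,   1,   3],
    [1/5, 1/3, 1,   1,   3],
    [1/7, 1/5, 1/3, 1/3, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index 1.12 for a 5x5 matrix, from Saaty's table)
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
cr = ci / 1.12

for name, w in zip(criteria, weights):
    print(f"{name:>22}: {w:.3f}")
print(f"consistency ratio: {cr:.3f}")
```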
Abstract:
Pairs trading is an algorithmic trading strategy that is based on the historical co-movement of two separate assets, with trades executed on the basis of the degree of relative mispricing. The purpose of this study is to explore a new, alternative copula-based method for pairs trading. The objective is to find out whether the copula method generates more trading opportunities and higher profits than the more traditional distance and cointegration methods applied extensively in previous empirical studies. The methods are compared by selecting the top five pairs from the stocks of large and medium-sized companies in the Finnish stock market. The research period covers the years 2006-2015. All the methods prove to be profitable, and the Finnish stock market is found to be suitable for pairs trading. However, the copula method does not generate more trading opportunities or higher profits than the other methods. It seems that the limitations of the more traditional methods are not too restrictive for this particular sample data.
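For context on the baseline the copula approach is compared against, the sketch below illustrates the classical distance method for pair selection (sum of squared deviations between normalised price paths) together with a common two-standard-deviation opening rule; the price data is synthetic and the rule is a textbook convention, not the exact implementation used in the study.

```python
# Minimal sketch of the classical distance method for pair selection:
# rank pairs by the sum of squared deviations between normalised price paths,
# then open trades when the spread diverges by more than two standard
# deviations. The price data is synthetic, not the Finnish sample.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
tickers = ["A", "B", "C", "D"]
prices = {t: 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 250))) for t in tickers}

def normalise(p):
    return p / p[0]          # cumulative return index starting at 1

def distance(p1, p2):
    return np.sum((normalise(p1) - normalise(p2)) ** 2)

# Rank all pairs by the formation-period distance and keep the closest one
ranked = sorted(combinations(tickers, 2),
                key=lambda pair: distance(prices[pair[0]], prices[pair[1]]))
best = ranked[0]

# Common trading rule: open a position when the spread of normalised prices
# diverges by more than two formation-period standard deviations.
spread = normalise(prices[best[0]]) - normalise(prices[best[1]])
threshold = 2 * spread.std()
open_signal = np.abs(spread) > threshold
print(f"closest pair: {best}, days with an open signal: {open_signal.sum()}")
```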