1000 results for Arnoldi Methods
Abstract:
The strongest wish of customers concerning chemical pulp features is consistent, uniform quality. Variation can be controlled and reduced by using statistical methods. However, studies addressing the application and benefits of statistical methods in the forest product sector are scarce. Thus, this customer wish is the root cause of the motivation behind this dissertation. The research problem addressed by this dissertation is that companies in the chemical forest product sector require new knowledge to improve their utilization of statistical methods. To gain this new knowledge, the research problem is studied from five complementary viewpoints: challenges and success factors, organizational learning, problem solving, economic benefit, and statistical methods as management tools. The five research questions generated on the basis of these viewpoints are answered in four research papers, which are case studies based on empirical data collection. This research as a whole complements the literature dealing with the use of statistical methods in the forest products industry. Practical examples of the application of statistical process control, case-based reasoning, the cross-industry standard process for data mining, and performance measurement methods in the context of chemical forest product manufacturing are made publicly available to the scientific community. The benefit of applying these methods is estimated or demonstrated. The purpose of this dissertation is to find pragmatic ideas that help companies in the chemical forest product sector improve their utilization of statistical methods. The main practical implications of this doctoral dissertation can be summarized in four points:
1. It is beneficial to reduce variation in chemical forest product manufacturing processes.
2. Statistical tools can be used to reduce this variation.
3. Problem solving in chemical forest product manufacturing processes can be intensified through the use of statistical methods.
4. There are certain success factors and challenges that need to be addressed when implementing statistical methods.
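As an illustration of how statistical tools detect excess variation, a minimal statistical process control sketch follows. The kappa-number data and three-sigma Shewhart limits are hypothetical illustrations, not taken from the dissertation.

```python
# Illustrative SPC sketch: three-sigma Shewhart control limits for a
# pulp-quality variable. Data values are hypothetical, not from the study.
from statistics import mean, stdev

def control_limits(samples):
    """Return (lower, center, upper) three-sigma control limits."""
    center = mean(samples)
    sigma = stdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples, new_value):
    """Flag a new observation that falls outside the control limits."""
    lower, _, upper = control_limits(samples)
    return new_value < lower or new_value > upper

# Hypothetical kappa-number measurements from a stable production period:
baseline = [24.8, 25.1, 25.0, 24.9, 25.2, 25.0, 24.7, 25.1]
print(out_of_control(baseline, 25.0))   # within limits
print(out_of_control(baseline, 27.5))   # signals special-cause variation
```

A point outside the limits marks special-cause variation worth a problem-solving effort; points inside reflect common-cause variation handled by process improvement.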
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of the gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods, which are reliable and based on scientific proof, in terms of the following criteria?
o Accuracy, i.e. probability of detecting deception successfully
o Ease of Use, i.e. easiness to apply the method correctly
o Time Required to apply the method reliably
o No Need for Special Equipment
o Unobtrusiveness of the method
In order to answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof, and what kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis.
A multi-criteria analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to get firsthand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that the most applicable methods are not entirely trouble-free either. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired. Since most current lie detection studies concentrate on a scenario where roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that future studies test lie detection and veracity assessment methods against partially truthful human sources.
This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones that are still under development.
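The Analytic Hierarchy Process step described above can be sketched as follows. The pairwise judgment matrix below is hypothetical (the study's actual judgments are not reported in the abstract); the criteria names come from the abstract's own list.

```python
# AHP sketch: derive priority weights for the five assessment criteria
# from a reciprocal pairwise-comparison matrix on Saaty's 1-9 scale.
# The judgment values are hypothetical illustrations.
def ahp_weights(matrix):
    """Approximate the principal eigenvector by normalizing each column
    and averaging across rows (the standard AHP shortcut)."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    return [sum(row) / n for row in normalized]

criteria = ["accuracy", "ease of use", "time required",
            "no special equipment", "unobtrusiveness"]
# Hypothetical reciprocal judgment matrix (accuracy judged most important):
m = [
    [1,     3,     5,     7,     5],
    [1/3,   1,     3,     5,     3],
    [1/5,   1/3,   1,     3,     1],
    [1/7,   1/5,   1/3,   1,     1/3],
    [1/5,   1/3,   1,     3,     1],
]
for name, w in zip(criteria, ahp_weights(m)):
    print(f"{name}: {w:.3f}")
```

The weights sum to one and feed the subsequent ranking of methods; a consistency-ratio check would normally accompany this step.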
Abstract:
The purpose of this work was to describe and compare sourcing practices and challenges in different geographies, to discuss possible options for advancing the sustainability of global sourcing, and to provide examples that answer why sourcing driven by sustainability principles is so challenging to implement. The focus was on a comparison between Europe, Asia and South America from the perspective of sustainability adoption. By analyzing the sourcing practices of the case company, it was possible to describe the main differences and challenges of each continent, the available sourcing options, supplier relationships, and ways to foster positive change. In this qualitative case study, the gathered theoretical material was compared to the extensive sourcing practices of the case company in a vast supplier network. Sourcing specialists were interviewed, and the information provided by them was analyzed in order to see how different research results and theories reflect reality and to find answers to the proposed research questions.
Abstract:
Software performance is a holistic matter affected by every phase of the software life cycle. Performance problems often lead to project delays, cost overruns and, in some cases, complete project failure. Software performance engineering (SPE) is a software-oriented approach that offers techniques for developing software with good performance. This Master's thesis examines these techniques and selects from among them those suited to solving performance problems in the development of two IT device management products. The end result of the work is an updated version of the current product development process that takes application performance challenges into account in the different phases of the products' life cycle.
Abstract:
The aim of this Master's thesis is to find a method for classifying spare part criticality in the case company. Several approaches exist for the criticality classification of spare parts. The practical problem in this thesis is the lack of a generic analysis method for classifying spare parts of the case company's proprietary equipment. In order to find a classification method, a literature review of various analysis methods is required. The requirements of the case company also have to be recognized; this is achieved by consulting professionals in the company. The literature review shows that the analytic hierarchy process (AHP) combined with decision tree models is a common method for classifying spare parts in academic literature. Most of the literature discusses spare part criticality from a stock-holding perspective. This perspective is also relevant for a customer-oriented original equipment manufacturer (OEM) such as the case company. A decision tree model is developed for classifying spare parts. The decision tree classifies spare parts into five criticality classes according to five criteria: safety risk, availability risk, functional criticality, predictability of failure, and probability of failure. The criticality classes describe the level of criticality from non-critical to highly critical. The method is verified by classifying the spare parts of a full deposit stripping machine. The classification can be utilized as a generic model for recognizing critical spare parts of other similar equipment, from which spare part recommendations can be created. The purchase price of an item and equipment criticality were found to have no effect on spare part criticality in this context. The decision tree is recognized as the most suitable method for classifying spare part criticality in the company.
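A decision tree of the kind described can be sketched as follows. The branching order and class assignments are hypothetical illustrations of the five-criteria, five-class idea; the thesis's actual tree is not reproduced in the abstract.

```python
# Hypothetical sketch of a spare-part criticality decision tree:
# five boolean criteria in, one of five criticality classes out
# (1 = non-critical ... 5 = highly critical).
def classify_spare_part(safety_risk, availability_risk,
                        functional_criticality,
                        failure_predictable, failure_probable):
    """Return a criticality class from 1 (non-critical) to 5 (highly critical)."""
    if safety_risk:
        return 5                      # safety risk dominates everything else
    if availability_risk and functional_criticality:
        return 4                      # hard to source AND stops the function
    if functional_criticality:
        return 3 if failure_probable else 2
    if failure_probable and not failure_predictable:
        return 2                      # likely, unannounced, but non-essential
    return 1                          # non-critical

print(classify_spare_part(True, False, False, True, False))   # -> 5
print(classify_spare_part(False, False, False, True, False))  # -> 1
```

Classifying every part of a machine with such a function yields the kind of generic criticality listing from which spare part recommendations can be derived.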
Abstract:
Consumer neuroscience (neuromarketing) is an emerging field of marketing research that uses brain imaging techniques to study the neural conditions and processes underlying consumption. The purpose of this study was to map this fairly new and growing field in Finland by studying the opinions of both Finnish consumers and marketing professionals towards it, comparing those opinions to the current consumer neuroscience literature, and on that basis evaluating the usability of brain imaging techniques as a marketing research method. A mixed methods research design was chosen for this study. Quantitative data was collected from 232 consumers and 28 marketing professionals by means of online surveys. Both respondent groups had either neutral opinions or lacked knowledge about the four themes chosen for this study: benefits, limitations and challenges, ethical issues, and future prospects of consumer neuroscience. Qualitative interview data was collected from two individuals from Finnish neuromarketing companies to deepen the insights gained from the quantitative research. The four interview themes were the same as in the surveys, and the interviewees' answers were mostly in line with the current literature, although more optimistic about the future of the field. The interviews also exposed a gap between academic consumer neuroscience research and practical-level applications. The results of this study suggest that there are still many unresolved challenges and that the relevant populations either have neutral opinions or lack information about consumer neuroscience. The practical-level applications are, however, already being used successfully, and this new field of marketing research is growing both globally and in Finland.
Abstract:
Pairs trading is an algorithmic trading strategy based on the historical co-movement of two separate assets; trades are executed on the basis of the degree of relative mispricing. The purpose of this study is to explore a new, alternative copula-based method for pairs trading. The objective is to find out whether the copula method generates more trading opportunities and higher profits than the more traditional distance and cointegration methods applied extensively in previous empirical studies. The methods are compared by selecting the top five pairs from stocks of large and medium-sized companies in the Finnish stock market. The research period covers the years 2006-2015. All the methods are shown to be profitable and the Finnish stock market suitable for pairs trading. However, the copula method does not generate more trading opportunities or higher profits than the other methods. It seems that the limitations of the more traditional methods are not too restrictive for this particular sample data.
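The traditional distance method mentioned above can be sketched as follows: normalize both price series, track their spread, and signal a trade when the spread diverges beyond a threshold (two standard deviations is a common convention). The price series here are synthetic; the study's actual pair selection and trading rules are not reproduced.

```python
# Distance-method sketch for pairs trading with synthetic prices.
from statistics import mean, stdev

def zscore_spread(prices_a, prices_b):
    """Z-score of the latest spread between two series normalized to start at 1."""
    norm_a = [p / prices_a[0] for p in prices_a]
    norm_b = [p / prices_b[0] for p in prices_b]
    spread = [a - b for a, b in zip(norm_a, norm_b)]
    return (spread[-1] - mean(spread)) / stdev(spread)

def trade_signal(prices_a, prices_b, threshold=2.0):
    """'short_a_long_b' when A looks rich relative to B, and vice versa."""
    z = zscore_spread(prices_a, prices_b)
    if z > threshold:
        return "short_a_long_b"
    if z < -threshold:
        return "long_a_short_b"
    return "no_trade"

# Synthetic example: the pair tracks closely, then A runs ahead of B.
series_a = [100.0, 101.0, 100.0, 102.0, 101.0, 100.0, 115.0]
series_b = [50.0, 50.5, 50.0, 51.0, 50.5, 50.0, 50.0]
print(trade_signal(series_a, series_b))
```

The copula method generalizes this by modeling the joint distribution of the two return series instead of a simple normalized distance, which is what the study evaluates against this baseline.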
Abstract:
Fluid handling systems account for a significant share of the global consumption of electrical energy. They also suffer from problems that reduce their energy efficiency and increase life-cycle costs. Detecting or predicting these problems in time can make fluid handling systems more environmentally and economically sustainable to operate. In this Master's thesis, significant problems in fluid systems were studied, and possibilities for developing variable-speed-drive-based detection methods for them were discussed. A literature review was conducted to find significant problems occurring in fluid handling systems containing pumps, fans and compressors. To find case examples for evaluating the feasibility of variable-speed-drive-based methods, queries were sent to industrial companies. As a result, the possibility of detecting heat exchanger fouling with a variable speed drive was analysed with data from three industrial cases. It was found that a mass flow rate estimate, which can be generated with a variable speed drive, can be used together with temperature measurements to monitor a heat exchanger's thermal performance. Secondly, it was found that the fouling-related increase in the pressure drop of a heat exchanger can be monitored with a variable speed drive. Lastly, for systems where the flow device is speed controlled by a pressure measurement, it was concluded that increasing rotational speed can be interpreted as progressing fouling in the heat exchanger.
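The first finding, monitoring thermal performance from a drive-based mass flow estimate plus temperature measurements, can be sketched with the basic heat balance Q = m·cp·ΔT. The operating-point values below are illustrative, not data from the three industrial cases.

```python
# Sketch of heat exchanger monitoring: the variable speed drive supplies
# a mass flow estimate, temperature sensors supply inlet/outlet values,
# and Q = m_dot * cp * dT gives thermal power. A falling Q at unchanged
# flow and inlet temperature suggests progressing fouling.
# All numbers are hypothetical illustrations.
def thermal_power_kw(mass_flow_kg_s, t_in_c, t_out_c, cp_kj_per_kg_k=4.19):
    """Thermal power released by a water-like fluid cooling from t_in to t_out."""
    return mass_flow_kg_s * cp_kj_per_kg_k * (t_in_c - t_out_c)

# Same flow and inlet temperature; a higher outlet temperature on the
# hot side means less heat is being transferred:
clean = thermal_power_kw(12.0, 80.0, 60.0)
fouled = thermal_power_kw(12.0, 80.0, 66.0)
print(f"relative drop in transferred power: {(clean - fouled) / clean:.1%}")
```

Trending this power estimate over time, with the operating point held comparable, is what turns the drive's flow estimate into a fouling indicator without extra instrumentation.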
Abstract:
Tannins, typically segregated into two major groups, the hydrolyzable tannins (HTs) and the proanthocyanidins (PAs), are plant polyphenolic secondary metabolites found throughout the plant kingdom. On one hand, tannins may have harmful nutritional effects on herbivores, for example insects, and hence they serve as plants' defense against plant-eating animals. On the other hand, they may positively affect some herbivores, such as mammals, for example through their antioxidant, antimicrobial, anti-inflammatory or anticarcinogenic activities. This thesis focuses on understanding the bioactivity of plant tannins, their anthelmintic properties, and the tools used for the qualitative and quantitative analysis of this endless source of structural diversity. The first part of the experimental work focused on the development of ultra-high performance liquid chromatography−tandem mass spectrometry (UHPLC-MS/MS) based methods for the rapid fingerprint analysis of bioactive polyphenols, especially tannins. In the second part of the experimental work, the in vitro activity of isolated and purified HTs and their hydrolysis product, gallic acid, was tested against egg hatching and larval motility of two larval developmental stages, L1 and L2, of a common ruminant gastrointestinal parasite, Haemonchus contortus. The results indicated clear relationships between HT structure and anthelmintic activity. The activity of the studied compounds depended on many structural features, including size, functional groups present in the structure, and structural rigidity. To further understand tannin bioactivity at the molecular level, the interaction between bovine serum albumin (BSA) and seven HTs plus epigallocatechin gallate was examined. The objective was to define the effect of pH on the formation of tannin–protein complexes and to evaluate the stability of the formed complexes by gel electrophoresis and MALDI-TOF-MS.
The results indicated that more basic pH values had a stabilizing effect on the tannin–protein complexes and that the tannin oxidative activity was directly linked with their tendency to form covalently stabilized complexes with BSA at increased pH.
Abstract:
The future of paying in the age of digitalization is a topic that includes varied visions. This master's thesis explores images of the future of paying in the Single Euro Payments Area (SEPA) up to 2020 and 2025 through the views of experts specialized in paying. The study was commissioned by a credit management company in order to obtain more detailed information about the future of paying. Specifically, this thesis investigates which payment methods could be the most used in the future, which items could work as a medium of exchange in 2020, and how they will evolve towards the year 2025. Changing consumer behavior, trends connected to payment methods, and the security and privacy issues of new cashless payment methods were also part of this study. In the empirical part of the study, the experts' ideas about probable and preferable future images of paying were investigated through a two-round disaggregative Delphi method. The questionnaire included numeric statements and open questions. Three alternative future images were created with the help of cluster analysis: "Unsurprising Future", "Technology-Driven Future" and "The Age of the Customer". The plausible images had similarities and differences, which were reflected against the previous studies in the literature review. The study's findings were formed on the basis of the future images' similarities and the answers to the open questions received from the questionnaire. The main conclusion of the study was that the development of technology will both unify and diversify SEPA; the trend in 2020 seems to be towards more cashless payment methods, but their usage depends on countries' financial possibilities and customer preferences. Mobile payments, cards and cash will be the main payment methods, but banks will face competitors from outside the financial sector. Wearable payment methods and NFC technology are seen as widely growing trends, but subcutaneous payment devices will likely keep their niche position until 2025.
In the meantime, security and privacy issues are expected to increase because of identity thefts and various frauds. Simultaneously, privacy will lose its meaning to younger consumers, who are used to sharing their transaction and personal data with third parties in order to get access to attractive services. Easier access to consumers' transaction data will probably open the door for hackers and cause new risks in payment processes. There are many roads to the future, and this study was not an attempt to give any complete answers about it, even if some plausible assumptions about the future's course were provided.
Abstract:
[Regimen sanitatis Salernitanum (latin). 1553]
Abstract:
Interpretation has been used in many tourism sectors as a technique for building harmony between resources and human needs. The objectives of this study are to identify the types of interpretive methods used in marine parks and to evaluate their effectiveness. This study reviews the design principles of effective interpretation for marine wildlife tourism and adopts Orams' five design principles (1997) into a conceptual framework. Enjoyment increase, knowledge gain, attitude and intention change, and behaviour modification were used as key indicators in the assessment of the interpretive effectiveness of the Vancouver Aquarium (VA) and Marineland Canada (MC). Since on-site research was unavailable, a virtual tour was created to represent the interpretive experiences at the two study sites. Self-administered questionnaires were used to measure responses. By comparing responses to the questionnaires (pre-virtual tour, post-virtual tour and follow-up), this study found that interpretation increased enjoyment and added to respondents' knowledge. Although the changes in attitudes and intentions were not significant, the findings indicate that attitude and intention changes did occur as a result of interpretation, but only to a limited extent. The overall results suggest that more techniques should be added to enhance the effectiveness of interpretation in marine parks and self-guiding tours, and that, with careful design, virtual tours are an innovative interpretation technique for marine parks and informal educational facilities.