1000 results for PEDAGOGIC METHODS
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are several problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?
- Accuracy, i.e. the probability of detecting deception successfully
- Ease of Use, i.e. how easy it is to apply the method correctly
- Time Required to apply the method reliably
- No Need for Special Equipment
- Unobtrusiveness of the method
In order to answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof? What kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process (AHP) was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain first-hand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information, Content, Discourse and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies concentrate on a scenario where roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources. This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones that are still under development.
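To make the comparison step concrete, the sketch below shows how an Analytic Hierarchy Process pass over the five assessment criteria could look. The pairwise comparison values are hypothetical placeholders, not the thesis's actual judgments; the weighting and consistency check follow Saaty's standard eigenvector procedure.

```python
import numpy as np

# Minimal AHP sketch: derive priority weights for the five assessment
# criteria from a pairwise comparison matrix. The comparison values
# below are hypothetical; the thesis does not publish its matrix.
criteria = ["Accuracy", "Ease of Use", "Time Required",
            "No Special Equipment", "Unobtrusiveness"]

# A[i, j] = how much more important criterion i is than j (Saaty's 1-9 scale).
A = np.array([
    [1,   3,   5,   7,   5],
    [1/3, 1,   3,   5,   3],
    [1/5, 1/3, 1,   3,   1],
    [1/7, 1/5, 1/3, 1,   1/3],
    [1/5, 1/3, 1,   3,   1],
])

# Priority weights: principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CR < 0.1 is conventionally acceptable.
n = len(A)
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 1.12  # 1.12 is Saaty's random index for n = 5
for name, weight in zip(criteria, w):
    print(f"{name}: {weight:.3f}")
print(f"consistency ratio: {cr:.3f}")
```

If the consistency ratio stays below 0.1, the judgments are conventionally considered coherent enough for the resulting weights to be used in ranking the methods.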
Abstract:
The purpose of this work was to describe and compare sourcing practices and challenges in different geographies, to discuss possible options for advancing the sustainability of global sourcing, and to provide examples answering why sourcing driven by sustainability principles is so challenging to implement. The focus was on a comparison between Europe, Asia and South America from the perspective of sustainability adoption. By analyzing the sourcing practices of the case company, it was possible to describe the main differences and challenges of each continent, the available sourcing options, supplier relationships and ways to foster positive change. In this qualitative case study, the gathered theoretical material was compared to the extensive sourcing practices of the case company in a vast supplier network. Sourcing specialists were interviewed and the information they provided was analyzed in order to see how different research results and theories reflect reality, and to find answers to the proposed research questions.
Abstract:
Software performance is a holistic matter that is affected by every phase of the software life cycle. Performance problems often lead to project delays and cost overruns, and in some cases to the complete failure of the project. Software performance engineering (SPE) is a software-oriented approach that offers techniques for developing software with good performance. This Master's thesis examines these techniques and selects from among them those that are suitable for solving performance problems in the development of two IT device management products. The end result of the work is an updated version of the current product development process that takes application performance challenges into account at the different stages of the products' life cycle.
Abstract:
The aim of this Master's thesis is to find a method for classifying spare part criticality in the case company. Several approaches exist for the criticality classification of spare parts. The practical problem in this thesis is the lack of a generic analysis method for classifying spare parts of the case company's proprietary equipment. In order to find a classification method, a literature review of various analysis methods is required. The requirements of the case company also have to be recognized, which is achieved by consulting professionals in the company. The literature review shows that the analytic hierarchy process (AHP) combined with decision tree models is a common method for classifying spare parts in the academic literature. Most of the literature discusses spare part criticality from a stock-holding perspective. This perspective is also relevant for a customer-oriented original equipment manufacturer (OEM) such as the case company. A decision tree model is developed for classifying spare parts. The decision tree classifies spare parts into five criticality classes according to five criteria: safety risk, availability risk, functional criticality, predictability of failure and probability of failure. The criticality classes describe the level of criticality from non-critical to highly critical. The method is verified by classifying the spare parts of a full deposit stripping machine. The classification can be utilized as a generic model for recognizing critical spare parts of other similar equipment, on the basis of which spare part recommendations can be created. The purchase price of an item and equipment criticality were found to have no effect on spare part criticality in this context. The decision tree is recognized as the most suitable method for classifying spare part criticality in the company.
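As an illustration of the decision-tree idea, the sketch below routes a part through the five named criteria into one of five criticality classes. The branch ordering and class assignments are assumptions for demonstration; the thesis's actual tree is not reproduced in the abstract.

```python
# Illustrative sketch of a criticality decision tree over the five
# criteria from the abstract. Thresholds and branch order are
# hypothetical, not the thesis's model.
def classify_spare_part(safety_risk: bool,
                        availability_risk: bool,
                        functionally_critical: bool,
                        failure_predictable: bool,
                        failure_probable: bool) -> int:
    """Return a criticality class from 1 (non-critical) to 5 (highly critical)."""
    if safety_risk:
        return 5                      # safety risk always dominates
    if availability_risk:
        return 4 if failure_probable else 3
    if functionally_critical:
        return 2 if failure_predictable else 3
    return 2 if failure_probable else 1

# A part with an availability risk and a probable failure mode:
print(classify_spare_part(False, True, True, False, True))  # -> 4
```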
Abstract:
Consumer neuroscience (neuromarketing) is an emerging field of marketing research which uses brain imaging techniques to study the neural conditions and processes that underlie consumption. The purpose of this study was to map this fairly new and growing field in Finland by studying the opinions of both Finnish consumers and marketing professionals towards it, comparing those opinions to the current consumer neuroscience literature, and on that basis evaluating the usability of brain imaging techniques as a marketing research method. A mixed methods research design was chosen for this study. Quantitative data were collected from 232 consumers and 28 marketing professionals by means of online surveys. Both respondent groups had either neutral opinions or lacked knowledge about the four themes chosen for this study: benefits, limitations and challenges, ethical issues, and future prospects of consumer neuroscience. Qualitative interview data were collected from two individuals from Finnish neuromarketing companies to deepen the insights gained from the quantitative research. The four interview themes were the same as in the surveys, and the interviewees' answers were mostly in line with the current literature, although more optimistic about the future of the field. The interviews also exposed a gap between academic consumer neuroscience research and practical-level applications. The results of this study suggest that there are still many unresolved challenges and that the relevant populations either have neutral opinions or lack information about consumer neuroscience. The practical-level applications are, however, already being used successfully, and this new field of marketing research is growing both globally and in Finland.
Abstract:
Pairs trading is an algorithmic trading strategy that is based on the historical co-movement of two separate assets, with trades executed on the basis of the degree of relative mispricing. The purpose of this study is to explore a new and alternative copula-based method for pairs trading. The objective is to find out whether the copula method generates more trading opportunities and higher profits than the more traditional distance and cointegration methods applied extensively in previous empirical studies. The methods are compared by selecting the top five pairs from the stocks of large and medium-sized companies in the Finnish stock market. The research period covers the years 2006-2015. All the methods prove to be profitable and the Finnish stock market suitable for pairs trading. However, the copula method does not generate more trading opportunities or higher profits than the other methods. It seems that the limitations of the more traditional methods are not too restrictive for this particular sample data.
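For context, the sketch below shows the pair-selection step of the traditional distance method that the copula approach is benchmarked against: price series are rebased to a common start, and pairs are ranked by the sum of squared deviations over a formation window. The data and column names are synthetic placeholders, not the study's sample.

```python
import numpy as np
import pandas as pd
from itertools import combinations

# Distance-method pair selection: rank pairs by the sum of squared
# deviations (SSD) between normalized price paths.
def top_pairs(prices: pd.DataFrame, n_pairs: int = 5):
    norm = prices / prices.iloc[0]          # rebase each series to 1.0
    ssd = {}
    for a, b in combinations(norm.columns, 2):
        ssd[(a, b)] = float(((norm[a] - norm[b]) ** 2).sum())
    return sorted(ssd, key=ssd.get)[:n_pairs]

# Synthetic demo data: four random-walk price series.
rng = np.random.default_rng(0)
walk = rng.normal(0, 0.01, size=(250, 4)).cumsum(axis=0)
prices = pd.DataFrame(100 * np.exp(walk), columns=list("ABCD"))
print(top_pairs(prices, n_pairs=2))
```

In the trading phase, a position is typically opened when a selected pair's spread exceeds two formation-period standard deviations and closed on reversion; the copula and cointegration variants replace the SSD ranking and trigger rule with their own dependence measures.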
Abstract:
Fluid handling systems account for a significant share of the global consumption of electrical energy. They also suffer from problems that reduce their energy efficiency and increase life-cycle costs. Detecting or predicting these problems in time can make fluid handling systems more environmentally and economically sustainable to operate. In this Master's thesis, significant problems in fluid systems were studied and possibilities for developing variable-speed-drive-based detection methods for them were discussed. A literature review was conducted to find significant problems occurring in fluid handling systems containing pumps, fans and compressors. To find case examples for evaluating the feasibility of variable-speed-drive-based methods, queries were sent to industrial companies. As a result, the possibility of detecting heat exchanger fouling with a variable-speed drive was analysed with data from three industrial cases. It was found that a mass flow rate estimate, which can be generated with a variable-speed drive, can be used together with temperature measurements to monitor a heat exchanger's thermal performance. Secondly, it was found that the fouling-related increase in the pressure drop of a heat exchanger can be monitored with a variable-speed drive. Lastly, for systems where the flow device is speed-controlled based on a pressure measurement, it was concluded that an increasing rotational speed can be interpreted as progressing fouling in the heat exchanger.
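The thermal-performance monitoring idea rests on two textbook relations: the transferred heat duty Q = m_dot * cp * dT, computable from the drive's mass flow estimate and the temperature measurements, and the overall conductance UA = Q / LMTD, whose downward drift indicates fouling. The sketch below shows both; all numeric values are illustrative only.

```python
from math import log

# Sketch: combine the drive's mass-flow estimate with inlet/outlet
# temperatures to track a heat exchanger's thermal performance.
CP_WATER = 4186.0  # J/(kg K), specific heat of water

def heat_duty(m_dot: float, t_in: float, t_out: float) -> float:
    """Transferred heat power Q = m_dot * cp * dT, in watts."""
    return m_dot * CP_WATER * abs(t_out - t_in)

def ua_from_lmtd(q: float, dt_hot_end: float, dt_cold_end: float) -> float:
    """Overall conductance UA = Q / LMTD; a falling UA suggests fouling."""
    lmtd = (dt_hot_end - dt_cold_end) / log(dt_hot_end / dt_cold_end)
    return q / lmtd

# Hypothetical operating point: 12 kg/s heated from 40 to 55 degrees C.
q = heat_duty(m_dot=12.0, t_in=40.0, t_out=55.0)   # ~753 kW
print(q, ua_from_lmtd(q, dt_hot_end=30.0, dt_cold_end=12.0))
```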
Abstract:
Tannins, typically divided into two major groups, the hydrolyzable tannins (HTs) and the proanthocyanidins (PAs), are plant polyphenolic secondary metabolites found throughout the plant kingdom. On one hand, tannins may cause harmful nutritional effects on herbivores, for example insects, and hence they serve as plants' defense against plant-eating animals. On the other hand, they may affect some herbivores, such as mammals, positively, for example through their antioxidant, antimicrobial, anti-inflammatory or anticarcinogenic activities. This thesis focuses on understanding the bioactivity of plant tannins, their anthelmintic properties and the tools used for the qualitative and quantitative analysis of this endless source of structural diversity. The first part of the experimental work focused on the development of ultra-high performance liquid chromatography−tandem mass spectrometry (UHPLC-MS/MS) based methods for the rapid fingerprint analysis of bioactive polyphenols, especially tannins. In the second part of the experimental work, the in vitro activity of isolated and purified HTs and their hydrolysis product, gallic acid, was tested against egg hatching and larval motility of two larval developmental stages, L1 and L2, of a common ruminant gastrointestinal parasite, Haemonchus contortus. The results indicated clear relationships between HT structure and anthelmintic activity. The activity of the studied compounds depended on many structural features, including size, the functional groups present in the structure, and structural rigidity. To further understand tannin bioactivity at the molecular level, the interaction of bovine serum albumin (BSA) with seven HTs and epigallocatechin gallate was examined. The objective was to define the effect of pH on the formation of tannin–protein complexes and to evaluate the stability of the formed complexes by gel electrophoresis and MALDI-TOF-MS. The results indicated that more basic pH values had a stabilizing effect on the tannin–protein complexes and that the tannins' oxidative activity was directly linked with their tendency to form covalently stabilized complexes with BSA at increased pH.
Abstract:
The future of paying in the age of digitalization is a topic that includes varied visions. This Master's thesis explores images of the future of paying in the Single Euro Payments Area (SEPA) up to 2020 and 2025 through the views of experts specialized in payments. The study was commissioned by a credit management company in order to obtain more detailed information about the future of paying. Specifically, the thesis investigates what could be the most used payment methods in the future, what items could work as a medium of exchange in 2020, and how they will evolve towards the year 2025. Changing consumer behavior, trends connected to payment methods, and the security and privacy issues of new cashless payment methods were also part of the study. In the empirical part of the study, the experts' ideas about probable and preferable future images of paying were investigated through a two-round Disaggregative Delphi method. The questionnaire included numeric statements and open questions. Three alternative future images were created with the help of cluster analysis: "Unsurprising Future", "Technology-Driven Future" and "The Age of the Customer". The plausible images had similarities and differences, which were reflected against previous studies in the literature review. The study's findings were formed on the basis of the future images' similarities and the answers to the open questions received through the questionnaire. The main conclusion of the study was that the development of technology will both unify and diversify SEPA; the trend in 2020 seems to be towards more cashless payment methods, but their usage depends on countries' financial possibilities and customer preferences. Mobile payments, cards and cash will be the main payment methods, but the banks will face competitors from outside the financial sector. Wearable payment methods and NFC technology are seen as widely growing trends, but subcutaneous payment devices will likely keep their niche position until 2025. In the meantime, security and privacy issues are expected to increase because of identity thefts and various frauds. Simultaneously, privacy will lose its meaning to younger consumers, who are used to sharing their transaction and personal data with third parties in order to get access to attractive services. Easier access to consumers' transaction data will probably open the door for hackers and cause new risks in paying processes. There exist many roads to the future, and this study was not an attempt to give complete answers about it, even if some plausible assumptions about the future's course were provided.
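The abstract names cluster analysis but not the algorithm. The sketch below shows one plausible reading in which each expert's numeric ratings of the Delphi statements form a feature vector and k-means with k = 3 yields three response profiles, analogous to the three future images. All data are random placeholders, not the study's responses.

```python
import numpy as np
from sklearn.cluster import KMeans

# Each row is one expert's ratings of the Delphi statements; k-means
# groups the experts into k = 3 response profiles ("future images").
rng = np.random.default_rng(42)
ratings = rng.integers(1, 8, size=(30, 12))  # 30 experts x 12 statements, 1-7 scale

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(ratings)
for i in range(3):
    profile = ratings[km.labels_ == i].mean(axis=0)
    print(f"future image {i + 1}: mean ratings {np.round(profile, 1)}")
```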
Abstract:
Interpretation has been used in many tourism sectors as a technique for achieving harmony between resources and human needs. The objectives of this study are to identify the types of interpretive methods used in marine parks and to evaluate their effectiveness. The study reviews the design principles of effective interpretation for marine wildlife tourism, and adopts Orams' five design principles (1997) into a conceptual framework. Enjoyment increase, knowledge gain, attitude and intention change, and behaviour modification were used as key indicators in the assessment of the interpretive effectiveness of the Vancouver Aquarium (VA) and Marineland Canada (MC). Since on-site research was unavailable, a virtual tour was created to represent the interpretive experiences at the two study sites. Self-administered questionnaires were used to measure responses. By comparing responses to the questionnaires (pre- and post-virtual tour, and follow-up), the study found that interpretation increased enjoyment and added to respondents' knowledge. Although the changes in attitudes and intentions were not significant, the findings indicate that attitude and intention changes did occur as a result of interpretation, but only to a limited extent. The overall results suggest that more techniques should be added to enhance the effectiveness of interpretation in marine parks and self-guiding tours, and that, with careful design, virtual tours are an innovative interpretation technique for marine parks and other informal educational facilities.
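The pre/post comparison logic can be illustrated with a paired test on the same respondents' scores. The abstract does not state which statistic was used, so the paired t-test and all scores below are assumptions for demonstration only.

```python
import numpy as np
from scipy import stats

# Paired comparison sketch: knowledge scores from the same respondents
# before and after the virtual tour. The scores are fabricated placeholders.
pre  = np.array([4, 5, 3, 6, 4, 5, 2, 4, 5, 3], dtype=float)
post = np.array([6, 6, 5, 7, 5, 6, 4, 5, 7, 4], dtype=float)

t, p = stats.ttest_rel(post, pre)
print(f"mean gain {np.mean(post - pre):.1f}, t = {t:.2f}, p = {p:.4f}")
```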
Abstract:
Euclidean distance matrix analysis (EDMA) methods are used to distinguish whether or not a significant difference exists between conformational samples of antibody complementarity determining region (CDR) loops: the isolated L1 loop and L1 in a three-loop assembly (L1, L3 and H3), obtained from Monte Carlo simulation. After a significant difference is detected, the specific inter-Cα distance which contributes to the difference is identified using EDMA. The estimated and improved mean forms of the conformational samples of the isolated L1 loop and the L1 loop in the three-loop assembly, CDR loops of the antibody binding site, are described using EDMA and distance geometry (DGEOM). To the best of our knowledge, this is the first time EDMA methods have been used to analyze conformational samples of molecules obtained from Monte Carlo simulations. Therefore, validations of the EDMA methods using both positive control and negative control tests for the conformational samples of the isolated L1 loop and L1 in the three-loop assembly must be done. The EDMA-I bootstrap null hypothesis tests showed false positive results for the comparison of six samples of the isolated L1 loop, and true positive results for the comparison of conformational samples of the isolated L1 loop and L1 in the three-loop assembly. The bootstrap confidence interval tests revealed true negative results for comparisons of the six samples of the isolated L1 loop, and false negative results for the conformational comparisons between the isolated L1 loop and L1 in the three-loop assembly. Different conformational sample sizes were further explored, either by combining the samples of the isolated L1 loop to increase the sample size, or by clustering the samples using a self-organizing map (SOM) to narrow the conformational distribution of the samples being compared. However, no improvement was obtained for either the bootstrap null hypothesis tests or the confidence interval tests. These results show that more work is required before EDMA methods can be used reliably for the comparison of samples obtained by Monte Carlo simulations.
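A minimal sketch of the underlying machinery, assuming random stand-in coordinates: each conformation is reduced to its inter-Cα distance matrix, samples are averaged into mean form matrices, and an EDMA-I-style ratio of mean forms flags the residue pair that differs most between samples.

```python
import numpy as np

def distance_matrix(coords: np.ndarray) -> np.ndarray:
    """All pairwise Euclidean distances between C-alpha positions."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def mean_form(sample: np.ndarray) -> np.ndarray:
    """Average distance matrix over a conformational sample."""
    return np.mean([distance_matrix(c) for c in sample], axis=0)

# Random stand-ins: 100 conformations of an 11-residue loop each.
rng = np.random.default_rng(1)
sample_a = rng.normal(size=(100, 11, 3))
sample_b = rng.normal(size=(100, 11, 3))

ma, mb = mean_form(sample_a), mean_form(sample_b)
np.fill_diagonal(ma, 1.0)   # avoid 0/0 on the diagonal
np.fill_diagonal(mb, 1.0)
ratio = ma / mb             # EDMA-I-style form ratio; 1.0 means no difference

i, j = np.unravel_index(np.argmax(np.abs(ratio - 1.0)), ratio.shape)
print(f"largest relative difference at residue pair ({i}, {j}): {ratio[i, j]:.3f}")
```

The actual EDMA-I test wraps this ratio statistic in a bootstrap over resampled conformations; the sketch shows only the form-matrix core.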
Abstract:
Several automated reversed-phase HPLC methods have been developed to determine trace concentrations of carbamate pesticides (which are of concern in Ontario environmental samples) in water, utilizing two solid sorbent extraction techniques. One of the methods is known as 'on-line pre-concentration'. This technique involves passing 100 milliliters of sample water through a 3 cm pre-column, packed with 5 micron ODS sorbent, at flow rates varying from 5-10 mL/min. By the use of a valve apparatus, the HPLC system is then switched to a gradient mobile phase program consisting of acetonitrile and water. The analytes, Propoxur, Carbofuran, Carbaryl, Propham, Captan, Chlorpropham, Barban, and Butylate, which are pre-concentrated on the pre-column, are eluted and separated on a 25 cm C-8 analytical column and determined by UV absorption at 220 nm. The total analytical time is 60 minutes, and the pre-column can be used repeatedly for the analysis of as many as thirty samples. The method is highly sensitive, as 100 percent of the analytes present in the sample can be injected into the HPLC. No breakthrough of any of the analytes was observed, and the minimum detectable concentrations range from 10 to 480 ng/L. The developed method is totally automated for the analysis of one sample. When the above mobile phase is modified with a buffer solution, Aminocarb, Benomyl, and its degradation product, MBC, can also be detected along with the above pesticides, with baseline resolution for all of the analytes. The method can also be easily modified to determine Benomyl and MBC both as solute and as particulate matter. By using a commercially available solid phase extraction cartridge, in lieu of a pre-column, for the extraction and concentration of analytes, a completely automated method has been developed with the aid of the Waters Millilab Workstation. Sample water is loaded at 10 mL/min through a cartridge and the concentrated analytes are eluted from the sorbent with acetonitrile. The resulting eluate is blown down under nitrogen, made up to volume with water, and injected into the HPLC. The total analytical time is 90 minutes. Fifty percent of the analytes present in the sample can be injected into the HPLC, and recoveries for the above eight pesticides ranged from 84 to 93 percent. The minimum detectable concentrations range from 20 to 960 ng/L. The developed method is totally automated for the analysis of up to thirty consecutive samples. The method has proven applicable both to purer water samples and to untreated lake water samples.
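The sensitivity gap between the two workflows follows from simple arithmetic on the analyte mass reaching the column, which is consistent with the roughly doubled detection limits reported above (10-480 versus 20-960 ng/L). A back-of-envelope sketch with a hypothetical concentration:

```python
# Why on-line pre-concentration is more sensitive: it transfers the
# whole 100 mL sample's analytes to the column, while the cartridge
# workflow injects only half of them.
def effective_mass_injected(conc_ng_per_l: float, sample_ml: float,
                            fraction_injected: float) -> float:
    """Analyte mass (ng) reaching the analytical column."""
    return conc_ng_per_l * (sample_ml / 1000.0) * fraction_injected

c = 100.0  # ng/L, hypothetical analyte concentration
print(effective_mass_injected(c, 100.0, 1.0))  # on-line method: 10 ng
print(effective_mass_injected(c, 100.0, 0.5))  # cartridge method: 5 ng
```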
Abstract:
New density functionals representing the exchange and correlation energies (per electron) are employed, based on the electron gas model, to calculate interaction potentials of noble gas systems X2 and XY, where X (and Y) are He, Ne, Ar and Kr, and of hydrogen atom-rare gas systems H-X. The exchange energy density functional is that recommended by Handler, and the correlation energy density functional is a rational function involving two parameters which were optimized to reproduce the correlation energy of the He atom. Application of the two-parameter function to other rare gas atoms shows that it is "universal", i.e., accurate for the systems considered. The potentials obtained in this work compare well with recent experimental results and are a significant improvement over those from competing statistical models.
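The abstract gives the correlation functional's general character (a two-parameter rational function of the density, fitted to He) but not its expression. As a schematic only, a Wigner-like rational form with placeholder parameters a and b illustrates the kind of functional meant:

```latex
% Schematic form only: the exact expression is not given in the abstract,
% so a and b below are placeholders for the two fitted parameters.
\begin{align}
  E_{xc}[\rho] &= \int \rho(\mathbf{r})\,
      \bigl[\varepsilon_x(\rho) + \varepsilon_c(\rho)\bigr]\, d\mathbf{r},\\
  \varepsilon_c(\rho) &= -\frac{a\,\rho^{1/3}}{1 + b\,\rho^{1/3}}.
\end{align}
```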
Abstract:
The purpose of this study was to determine novice teachers' perceptions of the extent to which the Brock University teacher education program focused on strategies for promoting responsibility in students. Individual interviews were conducted with ten randomly selected teachers who had graduated from this teacher education program between the years 1989 and 1992, and a follow-up group discussion activity was also held with the same teachers. Findings revealed that the topic of personal responsibility was discussed within various components of the program, including counselling group sessions, but that these discussions were often brief, indirect and inconsistent. Some of the strategies which the teachers used in their own classrooms to promote responsibility in students were ones which they had acquired from those counselling group sessions or from associate teachers. The various strategies included: setting clear expectations of students with positive and negative consequences for behaviour (e.g., material rewards and detentions, respectively), communicating with other teachers and parents, and suspending students from school. A teacher's choice of any particular strategy seemed to be affected by his or her personality, teaching subject and region of employment, as well as by certain aspects of the teacher education program. It was concluded that many of the teachers appeared to be controlling rude and violent behaviour, as opposed to promoting responsible behaviour. Recommendations were made for the pre-service program, as well as for induction and in-service programs, to increase teacher preparedness for promoting responsible student behaviour. One of these recommendations addressed the need to help teachers learn how to communicate effectively with their students.
Abstract:
Methods of measuring the specific heats of small samples were studied. Three automated methods were explored, two of which have shown promising results. The adiabatic continuous heating method has provided smooth, well-behaved data, but further work is presently underway to improve on the results obtained so far. The decay method has been successfully implemented, demonstrating reasonable agreement with accepted data for a copper test sample.
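The decay method's working principle is standard: after a heat pulse, the sample temperature relaxes exponentially through its thermal link to the bath, T(t) = T_bath + dT * exp(-t / tau), and the heat capacity follows as C = K * tau, where K is the link's thermal conductance. A short sketch with synthetic data, assuming a known K:

```python
import numpy as np

K = 2.0e-4            # W/K, assumed thermal link conductance
t = np.linspace(0.0, 50.0, 200)
tau_true = 8.0        # s, decay time constant used to build the trace
temp = 4.2 + 0.1 * np.exp(-t / tau_true)   # synthetic decay trace, bath at 4.2 K

# Estimate tau from the slope of log(T - T_bath) versus t,
# then recover the heat capacity from C = K * tau.
slope = np.polyfit(t, np.log(temp - 4.2), 1)[0]
tau_fit = -1.0 / slope
print(f"tau = {tau_fit:.2f} s, C = {K * tau_fit:.2e} J/K")
```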