987 results for theoretical methods
Abstract:
Mixed methods research involves the combined use of quantitative and qualitative methods in the same research study, and it is becoming increasingly important in several scientific areas. The aim of this paper is to review and compare, through a mixed methods multiple-case study, the application of this methodology in three reputable behavioural science journals: the Journal of Organizational Behavior, Addictive Behaviors and Psicothema. A quantitative analysis was carried out to review all the papers published in these journals during the period 2003-2008 and classify them into two blocks: theoretical and empirical, with the latter being further subdivided into three subtypes (quantitative, qualitative and mixed). A qualitative analysis determined the main characteristics of the mixed methods studies identified, in order to describe in more detail the ways in which the two methods are combined based on their purpose, priority, implementation and research design. From the journals selected, a total of 1,958 articles were analysed, the majority of which corresponded to empirical studies, with only a small number referring to research that used mixed methods. Nonetheless, mixed methods research appears in all the behavioural science journals studied within the period selected, showing a range of designs, among which the sequential equal-weight mixed methods design stands out.
Abstract:
Metaheuristic methods have become increasingly popular approaches for solving global optimization problems. From a practical viewpoint, it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization, and the topic has received increasing interest especially in the evolutionary computation community. Several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima places additional demands on the design of algorithms so that they are effective in both respects in the context of multimodal optimization. In this thesis, several different multimodal optimization algorithms are studied with regard to how their implementation of the global and local search phases affects their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method. As the theoretical background behind the operation of metaheuristics is generally not thoroughly understood, the research relies heavily on experimental studies to establish the properties of different approaches. To obtain reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and adequately varying problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists.
As a part of this thesis, such a framework for generating tunable test functions for evaluating different multimodal optimization methods experimentally is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase has the potential to boost performance significantly, but a weak local phase may negate the advantages gained from the global phase.
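As a minimal sketch of the population-based metaheuristics discussed above, the following implements the textbook DE/rand/1/bin variant of Differential Evolution on a simple bimodal function. All parameter values are illustrative defaults, and this is a plain single-solution DE, not one of the niching or hybrid algorithms proposed in the thesis.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.5, CR=0.9, generations=100):
    """Minimise f over box constraints with the classic DE/rand/1/bin scheme."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Global phase: donor vector built from three distinct other members
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            donor = [a[d] + F * (b[d] - c[d]) for d in range(dim)]
            # Binomial crossover mixes donor and target coordinates
            jrand = random.randrange(dim)
            trial = [donor[d] if (random.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(x, lo), hi) for x, (lo, hi) in zip(trial, bounds)]
            # Greedy one-to-one selection provides the local refinement pressure
            if f(trial) <= f(pop[i]):
                pop[i] = trial
    return min(pop, key=f)

random.seed(42)
# Bimodal test function with two global minima, at x = -1 and x = +1; plain DE
# converges to only one of them, which is exactly why niching methods are needed
best = differential_evolution(lambda v: (v[0] ** 2 - 1.0) ** 2, [(-2.0, 2.0)])
```

The greedy selection step is what the thesis calls the local pressure of the algorithm; a niching variant would additionally restrict competition to nearby individuals so that both minima survive.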
Abstract:
The B3LYP/6-31G(d) density functional theory (DFT) method was used to study the molecular geometry, electronic structure, infrared (IR) spectrum and thermodynamic properties. The heat of formation (HOF) and calculated density were estimated to evaluate detonation properties using the Kamlet-Jacobs equations. The thermal stability of 3,6,7,8-tetranitro-3,6,7,8-tetraaza-tricyclo[3.1.1.1(2,4)]octane (TTTO) was investigated by calculating the bond dissociation energy (BDE) at the unrestricted B3LYP/6-31G(d) level. The results showed that the N-NO2 bond is the trigger bond in the thermolysis initiation process. The crystal structure obtained by molecular mechanics (MM) methods belongs to the P2(1)/c space group, with cell parameters a = 8.239 Å, b = 8.079 Å, c = 16.860 Å, Z = 4 and ρ = 1.922 g cm⁻³. The calculated detonation velocity of 9.79 km s⁻¹ and detonation pressure of 44.22 GPa are comparable to those of CL-20. According to the quantitative standards of energetics and stability, TTTO essentially qualifies as a high energy density compound (HEDC).
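The Kamlet-Jacobs equations referred to above estimate detonation velocity and pressure from the loading density and the predicted detonation products. A minimal sketch using the commonly quoted coefficients follows; the input values in the example are illustrative placeholders for a dense CHNO explosive, not the TTTO quantities computed in the study.

```python
def kamlet_jacobs(rho, N, M, Q):
    """Kamlet-Jacobs estimates of detonation performance.

    rho : loading density, g/cm^3
    N   : moles of gaseous detonation products per gram of explosive
    M   : mean molar mass of those gaseous products, g/mol
    Q   : heat of detonation, cal/g

    Returns (D, P): detonation velocity in km/s and pressure in GPa.
    """
    phi = N * M ** 0.5 * Q ** 0.5
    D = 1.01 * phi ** 0.5 * (1.0 + 1.30 * rho)  # detonation velocity, km/s
    P = 1.558 * rho ** 2 * phi                  # detonation pressure, GPa
    return D, P

# Hypothetical inputs, not the TTTO values from the study
D, P = kamlet_jacobs(rho=1.90, N=0.034, M=27.0, Q=1500.0)
```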
Abstract:
Credit risk assessment is an integral part of banking. Credit risk is the risk that the return will not materialise if the customer fails to fulfil its obligations. Thus a key component of banking is setting acceptance criteria for granting loans. The theoretical part of the study focuses on the key components of banks' credit assessment methods described in the literature for extending credit to large corporations. The main component is the Basel II Accord, which sets regulatory requirements for banks' credit risk assessment methods. The empirical part comprises, as its primary source, an analysis of major Nordic banks' annual reports and risk management reports. As a secondary source, complementary interviews were carried out with senior credit risk assessment personnel. The findings indicate that all major Nordic banks use a combination of quantitative and qualitative information in their credit risk assessment models when extending credit to large corporations. The relative input of qualitative information depends on the selected approach to the credit rating, i.e. point-in-time or through-the-cycle.
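On the quantitative side, Basel-style models decompose expected credit loss into probability of default (PD), loss given default (LGD) and exposure at default (EAD). The following is a minimal sketch of that decomposition, with purely illustrative figures not drawn from any bank's reports.

```python
def expected_loss(pd_, lgd, ead):
    """Expected loss = PD * LGD * EAD.

    pd_ : probability of default over the horizon (0..1)
    lgd : loss given default, as a fraction of exposure (0..1)
    ead : exposure at default, in currency units
    """
    return pd_ * lgd * ead

# Illustrative: 2% PD, 45% LGD, exposure of 1,000,000
loss = expected_loss(pd_=0.02, lgd=0.45, ead=1_000_000)
```

Qualitative information of the kind the banks report typically enters by adjusting the internal rating, and hence the PD, rather than this formula itself.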
Abstract:
There are vast changes in the work environment, and the traditional rules and management methods might no longer be suitable for today's employees. The meaning of work is also changing as younger and more highly educated generations enter the labour market. Old customs need to be re-validated and new approaches should be taken into use. This paper strongly emphasizes the importance of happiness research and happiness at work. Values concerning the meaning of work are changing; people demand happiness and quality in all aspects of their lives. The aim of this study is to define happiness, especially at work, and to explain how it can be measured and what kinds of results can be achieved. I also want to find out how the contents of work and the working environment might enhance happiness. The correlation between education and happiness is discussed and examined. I am aware that the findings and theories concentrate mainly on Western countries and highlight the values and work environments of those societies. The main aim of the empirical study is to find out whether there are connections between happiness and work in data collected by the World Values Survey in 2005, and whether profession has an effect on happiness. Other factors such as the correlation of age, sex, education and income are examined too. I also want to find out what kinds of values people hold towards work and how these affect happiness levels. The focus is on two nations: Finland (N=1014) and Italy (N=1012). I have also included a global comparison covering all 54 countries (N=66,566) in the 5th wave (2005-2008) of the World Values Survey. The results suggest that people are generally happy around the world, with happiness decreasing with age, the educated being happier than the uneducated, and the employed happier than the unemployed. People working in neat "white collar" jobs are more likely to be happy than those working in factories or outdoors.
Money makes us happier, until a certain level is reached. Work is important to people, and the perceived importance of work adds to happiness. Work is also highly appreciated, yet there are more happy people among those who do not appreciate work that highly. Safety matters most when looking for a job, and there are more happy people among those who rank the importance of work first when looking for a job than among those for whom income is the most important aspect. People are more likely to be happy when the quality of work is high, that is, when their job consists of creative and cognitive tasks and when they have a feeling of independence.
Abstract:
Energy efficiency is one of the major objectives to be achieved in order to use the world's limited energy resources in a sustainable way. Since radiative heat transfer is the dominant heat transfer mechanism in most fossil fuel combustion systems, more accurate insight and models can improve the energy efficiency of newly designed combustion systems. The radiative properties of combustion gases are highly wavelength dependent, so better models for calculating them are needed in the modeling of large-scale industrial combustion systems. With detailed knowledge of the spectral radiative properties of gases, the modeling of combustion processes in different applications can be more accurate. In order to propose a new method for effective non-gray modeling of radiative heat transfer in combustion systems, different models for the spectral properties of gases, including the SNBM, EWBM and WSGGM, have been studied in this research. Using this detailed analysis of different approaches, the thesis presents new methods for gray and non-gray radiative heat transfer modeling in homogeneous and inhomogeneous H2O–CO2 mixtures at atmospheric pressure. The proposed method is able to support the modeling of a wide range of combustion systems, including the oxy-fired combustion scenario. The new methods are based on implementing pre-obtained correlations for the total emissivity and band absorption coefficient of H2O–CO2 mixtures at different temperatures, gas compositions and optical path lengths. They can easily be used within any commercial CFD software for radiative heat transfer modeling, resulting in more accurate, simple and fast calculations. The new methods were successfully used in CFD modeling by applying them to an industrial-scale backpass channel under oxy-fired conditions.
The developed approaches are more accurate than other methods; moreover, they can provide a complete explanation and detailed analysis of radiative heat transfer in different systems under different combustion conditions. The methods were verified by applying them to benchmark cases, and they showed a good level of accuracy and computational speed compared to other methods. Furthermore, the implementation of the suggested banded approach in CFD software is straightforward.
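The weighted-sum-of-gray-gases model (WSGGM) mentioned above expresses the total emissivity of a gas layer as eps = Σ a_i (1 − e^(−κ_i pL)). A minimal sketch follows; the coefficients are made up rather than fitted H2O–CO2 values, and the temperature dependence of the weights is omitted for brevity.

```python
import math

def wsgg_emissivity(pL, kappas, weights):
    """Total emissivity under a weighted-sum-of-gray-gases model.

    pL      : pressure path length, atm*m
    kappas  : gray-gas absorption coefficients, 1/(atm*m)
    weights : weights a_i of the gray gases; the remainder 1 - sum(weights)
              is the clear gas carrying the transparent spectral windows
    """
    return sum(a * (1.0 - math.exp(-k * pL)) for k, a in zip(kappas, weights))

# Hypothetical three-gray-gas fit; real coefficients depend on temperature
# and on the H2O/CO2 ratio of the mixture
eps = wsgg_emissivity(pL=1.0, kappas=[0.4, 6.0, 120.0], weights=[0.3, 0.2, 0.1])
```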
Abstract:
Statistical analyses of measurements that can be described by statistical models are of the essence in astronomy and in scientific inquiry in general. The sensitivity of such analyses, the modelling approaches, and the consequent predictions is sometimes highly dependent on the exact techniques applied, and improvements therein can result in a significantly better understanding of the observed system of interest. In particular, optimising the sensitivity of statistical techniques in detecting the faint signatures of low-mass planets orbiting nearby stars is, together with improvements in instrumentation, essential in estimating the properties of the population of such planets, and in the race to detect Earth analogues, i.e. planets that could support liquid water and, perhaps, life on their surfaces. We review the developments in Bayesian statistical techniques applicable to the detection of planets orbiting nearby stars and to astronomical data analysis problems in general. We also discuss these techniques and demonstrate their usefulness by using various examples and detailed descriptions of the respective mathematics involved. We demonstrate the practical aspects of Bayesian statistical techniques by describing several algorithms and numerical techniques, as well as theoretical constructions, for the estimation of model parameters and for hypothesis testing. We also apply these algorithms to Doppler measurements of nearby stars to show how they can be used in practice to obtain as much information from the noisy data as possible. Bayesian statistical techniques are powerful tools for analysing and interpreting noisy data and should be preferred in practice whenever computational limitations are not too restrictive.
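As a toy illustration of the Bayesian approach to Doppler data, the sketch below evaluates the posterior of a circular-orbit (sinusoidal) radial-velocity model over a grid of periods, assuming a flat prior and a known noise level. The data are synthetic, and the amplitude and phase are held fixed purely for brevity; a real analysis would sample all parameters, e.g. with MCMC.

```python
import math
import random

def log_likelihood(t, rv, sigma, period, amp, phase):
    """Gaussian log-likelihood of a circular-orbit radial-velocity model."""
    ll = 0.0
    for ti, vi in zip(t, rv):
        model = amp * math.sin(2.0 * math.pi * ti / period + phase)
        ll -= 0.5 * math.log(2.0 * math.pi * sigma ** 2) \
              + (vi - model) ** 2 / (2.0 * sigma ** 2)
    return ll

# Synthetic Doppler series: 10 m/s signal, 3-day period, 2 m/s noise
random.seed(1)
t = [0.3 * i for i in range(60)]
rv = [10.0 * math.sin(2.0 * math.pi * ti / 3.0) + random.gauss(0.0, 2.0) for ti in t]

# With a flat prior, the posterior over period is proportional to the likelihood
periods = [2.0 + 0.01 * k for k in range(201)]
logpost = [log_likelihood(t, rv, 2.0, p, 10.0, 0.0) for p in periods]
best_period = periods[max(range(len(periods)), key=lambda i: logpost[i])]
```

The posterior peaks sharply at the injected 3-day period; hypothesis testing between "no planet" and "one planet" would compare the marginal likelihoods of the two models rather than just the peak.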
Abstract:
In today's logistics environment, there is a tremendous need for accurate cost information and cost allocation. Companies searching for a proper solution often come across activity-based costing (ABC) or one of its variations, which utilizes cost drivers to allocate the costs of activities to cost objects. To allocate costs accurately and reliably, the selection of appropriate cost drivers is essential for realising the benefits of the costing system. The purpose of this study is to validate the transportation cost drivers of a Finnish wholesaler company and ultimately select the best possible driver alternatives for the company. The use of cost driver combinations as an alternative is also studied. The study is conducted as a part of the case company's applied ABC project, using statistical research as the main research method, supported by a theoretical, literature-based method. The main research tools featured in the study are simple and multiple regression analyses, which, together with a practicality analysis based on the literature and observations, form the basis for the advanced methods. The results suggest that the most appropriate cost driver alternatives are delivery drops and internal delivery weight. The use of cost driver combinations is not recommended, as they do not provide substantially better results while increasing measurement costs, complexity and the load of use at the same time. The use of internal freight cost drivers is also questionable, as the results indicate a weakening trend in their cost allocation capabilities towards the end of the period. Therefore, more research on internal freight cost drivers should be conducted before taking them into use.
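The driver-validation step described above amounts to regressing cost on each candidate driver and comparing the quality of the fits. A minimal simple-regression sketch follows, with invented monthly figures rather than the case company's data.

```python
def fit_cost_driver(driver, cost):
    """Ordinary least squares of cost on one candidate cost driver.

    Returns (intercept, slope, r_squared) so that candidate drivers
    can be compared by their explanatory power.
    """
    n = len(driver)
    mx, my = sum(driver) / n, sum(cost) / n
    sxx = sum((x - mx) ** 2 for x in driver)
    sxy = sum((x - mx) * (y - my) for x, y in zip(driver, cost))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - intercept - slope * x) ** 2 for x, y in zip(driver, cost))
    ss_tot = sum((y - my) ** 2 for y in cost)
    return intercept, slope, 1.0 - ss_res / ss_tot

# Invented monthly observations: delivery drops vs transport cost
drops = [120, 150, 90, 200, 170, 130]
cost = [2400, 2900, 1900, 3800, 3300, 2600]
intercept, slope, r2 = fit_cost_driver(drops, cost)
```

A high r_squared alone does not settle the choice; the study's practicality analysis (measurement cost, complexity, load of use) is the second filter.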
Abstract:
Today the lean philosophy has gained considerable popularity and interest in many industries. This customer-oriented philosophy helps in understanding the customer's value creation, which can be used to improve efficiency. This research provides a comprehensive study of lean and lean methods in the service industry. In the theoretical part, the lean philosophy is studied at different levels, which helps to understand its diversity. To support lean, this research also presents the basic concepts of process management. Lastly, the theoretical part presents a development model to support process development in a systematic way. The empirical part of the study was carried out by taking experimental measurements of the service center's product return process and analyzing the data. The measurements were used to map out factors that have a negative influence on the process flow. Several development propositions were discussed to remove these factors. Problems mainly occur due to challenges in controlling customers and due to the lack of responsibility and continuous improvement at the operational level. The development propositions concern such factors as changes in the service center's physical environment, standardization of work tasks, and training. These factors will remove waste in the product return process and support the idea of continuous improvement.
Abstract:
The main purpose of the present doctoral thesis is to investigate subjective experiences and cognitive processes in four different types of altered states of consciousness: naturally occurring dreaming, cognitively induced hypnosis, pharmacologically induced sedation, and pathological psychosis. Both empirical and theoretical research is carried out, resulting in four empirical and four theoretical studies. The thesis begins with a review of the main concepts used in consciousness research, the most influential philosophical and neurobiological theories of subjective experience, the classification of altered states of consciousness, and the main empirical methods used to study consciousness alterations. Next, findings of the original studies are discussed, as follows. Phenomenal consciousness is found to be dissociable from responsiveness, as subjective experiences do occur in unresponsive states, including anaesthetic-induced sedation and natural sleep, as demonstrated by post-awakening subjective reports. Two new tools for the content analysis of subjective experiences and dreams are presented, focusing on the diversity, complexity and dynamics of phenomenal consciousness. In addition, a new experimental paradigm of serial awakenings from non-rapid eye movement sleep is introduced, which enables more rapid sampling of dream reports than has been available in previous studies. It is also suggested that lucid dreaming can be studied using transcranial brain stimulation techniques and systematic analysis of pre-lucid dreaming. For blind judges, dreams of psychotic patients appear to be indistinguishable from waking mentation reports collected from the same patients, which indicates a close resemblance of these states of mind. However, despite phenomenological similarities, dreaming should not be treated as a uniform research model of psychotic or intact consciousness. 
On the contrary, there seem to be multiple routes by which different states of consciousness can be associated. For instance, seemingly identical time perception distortions in different alterations of consciousness may have diverse underlying causes. It is also shown that altered states do not necessarily exhibit impaired cognitive processing compared to a baseline waking state of consciousness: a case study of time perception in a hypnotic virtuoso indicates more consistent perceptual timing under hypnosis than in the waking state. The thesis ends with a brief discussion of the most promising new perspectives for the study of alterations of consciousness.
Abstract:
The purpose of this work was to describe and compare sourcing practices and challenges in different geographies, to discuss possible options for advancing the sustainability of global sourcing, and to provide examples of why sourcing driven by sustainability principles is so challenging to implement. The focus was on a comparison between Europe, Asia and South America from the perspective of sustainability adoption. By analyzing the sourcing practices of the case company it was possible to describe the main differences and challenges of each continent, the available sourcing options, supplier relationships, and ways to foster positive change. In this qualitative case study, the theoretical material gathered was compared with the case company's extensive sourcing practices in a vast supplier network. Sourcing specialists were interviewed and the information they provided was analyzed in order to see how different research results and theories reflect reality and to find answers to the proposed research questions.
Abstract:
The natural abundance of N-heterocycle-containing compounds has pushed the synthetic community toward the invention of new synthetic methods that deliver structural diversity of N-heterocycles. Among these is the efficient and highly selective diamine-mediated asymmetric lithiation process. Amongst the diamine chiral ligands, (-)-sparteine, a naturally occurring alkaloid, has proved to be an efficient one. Many successful, good-yielding and highly selective lithiation reactions have been accomplished with mediation by this chiral diamine base. Although there are some examples of experimental and theoretical mechanistic studies in the literature, there is a lack of detailed understanding of how exactly it induces chirality. This thesis describes a systematic investigation of how (-)-sparteine influences the stereoselectivity in the course of the asymmetric lithiation reaction. This led to establishing the role of the A-ring β-CH2 effect and the D-ring effect, thereby unravelling the importance of the A-ring and D-ring portions of (-)-sparteine in the stereoselectivity. Another part of this thesis deals with the asymmetric lithiation of BF3-activated N,N-dimethylaminoferrocene in the presence of (1R,2R)-N1,N2-bis(3,3-dimethylbutyl)-N1,N2-dimethylcyclohexane-1,2-diamine (a (R,R)-TMCDA surrogate) with i-PrLi. Computational findings were in full accord with the experimental observations. Subsequently, the theoretically provided insights into the mechanism of the reaction were exploited in the computational design of a new ligand. Unfortunately, the outcome of this design was not experimentally robust, and an updated approach towards a successful design is outlined.
Abstract:
In this paper, we provide both qualitative and quantitative measures of the cost of measuring the integrated volatility by the realized volatility when the frequency of observation is fixed. We start by characterizing, for a general diffusion, the difference between the realized and the integrated volatilities for a given frequency of observations. Then, we compute the mean and variance of this noise and the correlation between the noise and the integrated volatility in the Eigenfunction Stochastic Volatility model of Meddahi (2001a). This model has, as special cases, the log-normal, affine, and GARCH diffusion models. Using previous empirical work, we show that the standard deviation of the noise is not negligible with respect to the mean and the standard deviation of the integrated volatility, even when one considers five-minute returns. We also propose a simple approach to capture the information about the integrated volatility contained in the returns through the leverage effect.
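The object of study above can be illustrated numerically: for a driftless diffusion dX = σ dW, the integrated variance over a day is ∫σ² dt, while the realized variance is the sum of squared intraday returns. The sketch below, with arbitrary parameter values, shows the sampling noise that separates the two at a fixed, five-minute-like frequency; it is a constant-volatility toy, not the Eigenfunction Stochastic Volatility model of the paper.

```python
import math
import random

def realized_vs_integrated(n_steps=78, sigma=0.2, T=1.0 / 252.0, seed=7):
    """Simulate one day of dX = sigma dW sampled at n_steps points
    (78 ~ five-minute returns over a 6.5-hour session) and return
    (realized variance, integrated variance).

    With constant sigma the integrated variance is exactly sigma^2 * T;
    the realized variance deviates from it by a sampling noise whose
    size shrinks only as the observation frequency grows."""
    random.seed(seed)
    dt = T / n_steps
    returns = [sigma * math.sqrt(dt) * random.gauss(0.0, 1.0) for _ in range(n_steps)]
    rv = sum(r * r for r in returns)
    iv = sigma ** 2 * T
    return rv, iv

rv, iv = realized_vs_integrated()
```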
Abstract:
This thesis contributes to a general theory of project design. Situated within a demand shaped by the challenges of sustainable development, the main objective of this research is to contribute a theoretical model of design that makes it possible to better situate the use of tools and standards for assessing the sustainability of a project. The fundamental principles of these normative instruments are analysed along four dimensions: ontological, methodological, epistemological and teleological. Indicators of certain counter-productive effects related, in particular, to the application of these standards confirm the need for a theory of qualitative judgement. Our main hypothesis builds on the conceptual framework offered by the notion of the "precautionary principle", whose first formulations date back to the early 1970s and which aimed precisely to remedy the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design theory (design thinking), it focuses on the evolution of the ways in which sustainability has been taken into account. From this perspective, we observe that the theories of "green design" dating from the early 1960s, as well as the theories of "ecological design" of the 1970s and 1980s, ultimately converged with the more recent theories of "sustainable design" from the early 1990s onward. The different approaches to the "precautionary principle" are then examined from the standpoint of project sustainability. Standard risk assessment methods are compared with approaches based on the precautionary principle, revealing certain limits in the design of a project.
A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched. This model offers a global vision for judging a project that integrates sustainable development principles, and presents itself as an alternative to traditional risk assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of "prudence" as it was historically used to guide architectural judgement. What, then, of the challenges posed by the judgement of architectural projects amid the rise of standardised assessment methods (e.g. Leadership in Energy and Environmental Design, LEED)? The thesis proposes a reinterpretation of design theory as formulated by Donald A. Schön as a way of taking into account assessment tools such as LEED. This exercise, however, reveals an epistemological obstacle that must be addressed in a reformulation of the model. In line with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architecture competitions that adopted the LEED standardised sustainability assessment method. A preliminary series of "tensions" is identified in the process of designing and judging the projects. These tensions are then categorised into their conceptual counterparts, constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualisation, analogical/logical; (2) uncertainty, epistemological/methodological; (3) comparability, interpretive/analytical; and (4) proposition, universality/contextual relevance.
These conceptual tensions are regarded as vectors that correlate with the theoretical model and help to enrich it, without constituting validations in the positivist sense of the term. These confrontations with reality allow the previously identified epistemological obstacle to be defined more precisely. This thesis thus highlights the generally underestimated impacts of environmental standardisation on the process of designing and judging projects. It draws, in a non-restrictive way, on the examination of Canadian architecture competitions for public buildings. The conclusion underlines the need for a new form of "reflexive prudence" and for a more critical use of current sustainability assessment tools. It calls for an instrumentation founded on global integration rather than on the opposition of environmental approaches.
Abstract:
The study of stability problems is relevant to the study of the structure of a physical system. It is particularly important when it is not possible to probe into its interior and obtain information on its structure by a direct method. The thesis deals with stability theory, which has become of dominant importance in the study of dynamical systems and has many applications in basic fields such as meteorology, oceanography, astrophysics and geophysics, to mention a few. The definition of stability was found useful in many situations, but inadequate in many others, so that a host of other important concepts have been introduced over the years, which are more or less related to the first definition and to the common-sense meaning of stability. In recent years the theoretical developments in the study of instabilities and turbulence have been as profound as the developments in experimental methods. The study here points to a new direction for stability studies, based on the Lagrangian formulation instead of the Hamiltonian formulation used by other authors.