921 results for Heckman selection model


Relevance:

30.00%

Publisher:

Abstract:

In general, models of ecological systems can be broadly categorized as 'top-down' or 'bottom-up' models, based on the hierarchical level at which the model processes are formulated. The structure of a top-down, also known as phenomenological, population model can be interpreted in terms of population-level characteristics, but it typically lacks an interpretation on a more basic level. In contrast, bottom-up, also known as mechanistic, population models are derived from assumptions and processes on a more basic level, which allows the model parameters to be interpreted in terms of individual behaviour. Both approaches, phenomenological and mechanistic modelling, have advantages and disadvantages in different situations. However, mechanistically derived models may be better at capturing the properties of the system at hand, and thus give more accurate predictions. In particular, when models are used for evolutionary studies, mechanistic models are more appropriate, since natural selection acts on the individual level, and in mechanistic models the direct connection between model parameters and individual properties has already been established. The purpose of this thesis is twofold. Firstly, a systematic way to derive mechanistic discrete-time population models is presented. The derivation combines explicitly modelled, continuous processes on the individual level within a reproductive period with a discrete-time maturation process between reproductive periods. Secondly, as an example of how evolutionary studies can be carried out in mechanistic models, the evolution of the timing of reproduction is investigated. These two lines of research, the derivation of mechanistic population models and evolutionary studies, are thus complementary to each other.
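As a toy illustration of the modelling approach this abstract describes, the sketch below builds a discrete-time population model mechanistically: a continuous, density-dependent mortality process is integrated over the reproductive period, and a discrete reproduction step maps one generation to the next. The parameter values are hypothetical and are not taken from the thesis.

```python
# Minimal sketch of a mechanistic discrete-time population model:
# continuous within-season dynamics plus a discrete between-season map.
# mu (death rate), c (competition) and b (fecundity) are made-up values.
import numpy as np
from scipy.integrate import solve_ivp

mu, c, b = 0.5, 0.05, 10.0

def within_season(t, y):
    # individual-level process: density-dependent per-capita mortality
    n = y[0]
    return [-(mu + c * n) * n]

def one_generation(n0, season_length=1.0):
    # integrate the continuous process over one reproductive period ...
    sol = solve_ivp(within_season, (0.0, season_length), [n0])
    survivors = sol.y[0, -1]
    # ... then apply the discrete maturation/reproduction step
    return b * survivors

n = 1.0
for _ in range(20):
    n = one_generation(n)
print(f"population after 20 generations: {n:.2f}")
```

Because the map is built from the individual-level process, the parameters mu, c and b retain an individual-level interpretation, which is the property the abstract argues makes mechanistic models suitable for evolutionary studies.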

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis was to develop a model for monitoring competitors' financial performance for management reporting. The research consisted of the selection of the comparison group and the performance meters, as well as the actual creation of the model. A brief analysis of the current situation was also made. The aim was to improve the quality of financial reporting in the case organization by adding observation of the external business environment to the management reports. The comparison group for the case company was selected to include five companies that are all involved in power equipment engineering and project-type business. The most limiting factor in the comparison group selection was the availability of quarterly financial reporting. The most suitable performance meters were determined to be the development of revenue, order backlog and EBITDA. These meters should be monitored systematically on a quarterly basis and reported to company management in a brief and informative way. The monitoring model was built as a spreadsheet, with usability, flexibility and simplicity as its key characteristics. The model acts as a central repository for financial competitor information as well as a reporting tool. The current market situation is strongly affected by the economic boom of recent years, and future challenges can be clearly seen in declining order backlogs. The case company has performed well relative to its comparison group during the observation period, since its business volume and profitability have developed most favourably.
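For illustration, a spreadsheet-style monitoring model of the kind described here can be mimicked in a few lines: each meter is tracked quarterly and indexed to a base quarter so that companies of different sizes stay comparable. The figures below are invented, not the case company's data.

```python
# Hypothetical quarterly meter data for one competitor, indexed to Q1 = 100.
import pandas as pd

data = pd.DataFrame(
    {"revenue": [120, 130, 125, 140],
     "order_backlog": [300, 310, 280, 260],
     "ebitda": [15, 17, 14, 18]},
    index=pd.period_range("2009Q1", periods=4, freq="Q"))

indexed = 100 * data / data.iloc[0]   # development relative to the base quarter
print(indexed.round(1))
```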

Relevance:

30.00%

Publisher:

Abstract:

This study estimates the repeatability coefficients of two production traits in two native populations of Brazil nut trees. It determines the number of evaluation years required for an efficient selection process, estimates the permanent phenotypic correlation between the production traits, and selects promising trees in these populations. The populations, located in the Itã region (ITA) and the Cujubim region (CUJ), both belong to the municipality of Caracaraí, state of Roraima, Brazil, and consist of 85 and 51 adult trees, respectively. Each tree was evaluated for the number of fruits per plant (NFP) and fresh seed weight per plant (SWP), for eight (ITA) and five (CUJ) consecutive years. Statistical analyses were performed according to the mixed model methodology, using the software Selegen-REML/BLUP (RESENDE, 2007). The repeatability coefficients were low for NFP (0.3145 and 0.3269 for ITA and CUJ, respectively) and also for SWP (0.2957 and 0.3436 for ITA and CUJ, respectively). On average, nine evaluation years are needed to reach coefficients of determination higher than 80%. Permanent phenotypic correlation values higher than 0.95 were obtained between NFP and SWP in both populations. Although trees with a high number of fruits and seed weight were identified, more evaluation years are needed to perform the selection process more efficiently.
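The nine-year figure can be reproduced with the standard relationship between the repeatability coefficient r and the coefficient of determination of the mean of m measurements, R^2 = m*r / (1 + (m - 1)*r). The sketch below applies it to the four estimates quoted in the abstract.

```python
# Coefficient of determination for the mean of m yearly measurements,
# given repeatability r: R2 = m*r / (1 + (m - 1)*r).
def determination(r, m):
    return m * r / (1.0 + (m - 1) * r)

for r in (0.2957, 0.3145, 0.3269, 0.3436):   # NFP/SWP estimates from the abstract
    m = next(m for m in range(1, 50) if determination(r, m) >= 0.80)
    print(f"r = {r:.4f}: {m} years needed for R^2 >= 0.80")
```

The required numbers of years (8 to 10 across the four estimates) average out to roughly nine, matching the conclusion above.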

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a one-dimensional, semi-empirical dynamic model for the simulation and analysis of a calcium looping process for post-combustion CO2 capture. Reducing greenhouse gas emissions from fossil fuel power production requires rapid action, including the development of efficient carbon capture and sequestration technologies. The development of new carbon capture technologies can be expedited by using modelling tools: techno-economic evaluation of new capture processes can be done quickly and cost-effectively with computational models before building expensive pilot plants. Post-combustion calcium looping is a developing carbon capture process which utilizes fluidized bed technology with lime as a sorbent. The main objective of this work was to analyse the technological feasibility of the calcium looping process at different scales with a computational model. A one-dimensional dynamic model was applied to the calcium looping process, simulating the behaviour of the interconnected circulating fluidized bed reactors. The model couples fundamental mass and energy balance solvers with semi-empirical models describing solid behaviour in a circulating fluidized bed and the chemical reactions occurring in the calcium loop. In addition, fluidized bed combustion, heat transfer and core-wall layer effects were modelled. The calcium looping model framework was successfully applied to a 30 kWth laboratory scale unit and a 1.7 MWth pilot scale unit, and used to design a conceptual 250 MWth industrial scale unit. Valuable information was gathered on the behaviour of the small scale laboratory device. In addition, the interconnected behaviour of the pilot plant reactors and the effect of solid fluidization on the thermal and carbon dioxide balances of the system were analysed. The scale-up study provided practical information on the thermal design of an industrial sized unit, the selection of particle size, and operability in different load scenarios.
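As a much-simplified illustration of one building block of such a model, the sketch below estimates carbonator CO2 capture from a zero-dimensional sorbent balance: capture is limited either by the circulating sorbent's uptake capacity or by the equilibrium limit. The function, its inputs and the numbers are illustrative assumptions, not the thesis model.

```python
def carbonator_capture(F_Ca, X_ave, F_CO2, E_eq=0.95):
    """CO2 capture efficiency bounded by sorbent uptake and equilibrium.

    F_Ca  : CaO circulation rate to the carbonator, mol/s (assumed input)
    X_ave : average carbonation conversion of the sorbent (assumed input)
    F_CO2 : CO2 flow entering the carbonator, mol/s (assumed input)
    E_eq  : equilibrium-limited maximum capture efficiency (assumed)
    """
    sorbent_limit = F_Ca * X_ave / F_CO2   # CO2 the circulating CaO can absorb
    return min(sorbent_limit, E_eq)

# Example: 50 mol/s CaO circulation, 20 % average conversion, 8 mol/s CO2 in.
print(f"capture efficiency: {carbonator_capture(50.0, 0.2, 8.0):.2f}")
```

A full 1-D model of the kind described above stacks balances like this per control volume and couples them with hydrodynamic and heat transfer submodels.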

Relevance:

30.00%

Publisher:

Abstract:

Service provider selection has been identified as a critical factor in the formation of supply chains. Through successful selection, companies can attain competitive advantage, cost savings and more flexible operations. Service provider management is the next crucial step in the outsourcing process after the selection has been made. Without proper management, companies cannot be sure about the level of service they have bought, and they may suffer from the service provider's opportunistic behavior. In the worst case, the buyer company may end up in a locked-in situation in which it is totally dependent on the service provider. This thesis studies how the case company conducts its carrier selection process and the criteria related to it. A model for the final selection is also provided. In addition, the case company's carrier management procedures are reflected against recommendations from previous research. The research was conducted as a qualitative case study of the principal company, Neste Oil Retail. A literature review was made on outsourcing, service provider selection and service provider management. On the basis of the literature review, this thesis recommends the analytic hierarchy process (AHP) as the preferred model for carrier selection. Furthermore, agency theory was found to be a functional framework for carrier management in this study. The empirical part of this thesis was conducted in the case company by interviewing the key persons in the selection process, making observations and going through documentation related to the subject. According to the results, both the carrier selection process and carrier management were closely in line with the suggestions from the literature review. The AHP results revealed that the case company considers service quality the most important criterion, with financial situation and price of service following behind with nearly identical weights. Equipment and personnel was seen as the least important selection criterion. Regarding carrier management, the study concluded that the company should consider engaging more in carrier development and working towards beneficial and effective relationships. Otherwise, no major changes were recommended to the case company's processes.
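To illustrate the selection model, the sketch below computes AHP priority weights from a pairwise comparison matrix via the principal eigenvector and checks consistency. The comparison values are hypothetical; they merely reproduce the ordering reported above (service quality first, equipment and personnel last).

```python
import numpy as np

# criteria: service quality, financial situation, price, equipment & personnel
A = np.array([[1,   3,   3,   5],
              [1/3, 1,   1,   3],
              [1/3, 1,   1,   3],
              [1/5, 1/3, 1/3, 1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # normalized priority weights

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)  # consistency index
CR = CI / 0.90                        # Saaty's random index RI = 0.90 for n = 4
print("weights:", np.round(w, 3), "| consistency ratio:", round(CR, 3))
```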

Relevance:

30.00%

Publisher:

Abstract:

The open innovation paradigm states that the boundaries of the firm have become permeable, allowing knowledge to flow inwards and outwards to accelerate internal innovations and take unused knowledge to the external environment, respectively. The successful implementation of open innovation practices in firms like Procter & Gamble, IBM, and Xerox, among others, suggests that it is a sustainable trend which could provide the basis for achieving competitive advantage. However, implementing open innovation can be a complex process which involves several domains of management, and whose terminology, classification, and practices have not been fully agreed upon. Thus, with many possible ways to address open innovation, the following research question was formulated: How could Ericsson LMF assess which open innovation mode to select depending on the attributes of the project at hand? The research followed the constructive research approach, which has the following steps: find a practically relevant problem, obtain a general understanding of the topic, innovate the solution, demonstrate that the solution works, show the theoretical contributions, and examine the scope of applicability of the solution. The research involved three phases of data collection and analysis: an extensive literature review of open innovation, strategy, business models, innovation, and knowledge management; direct observation of the environment of the case company through participative observation; and semi-structured interviews based on six cases involving multiple and heterogeneous open innovation initiatives. Results from the cases suggest that the selection of modes depends on multiple reasons, with a stronger influence of factors related to strategy, business models, and resource gaps. Based on these and other factors found in the literature review and observations, it was possible to construct a model that supports approaching open innovation. The model integrates perspectives from multiple domains of the literature review, observations inside the case company, and factors from the six open innovation cases. It provides steps, guidelines, and tools to approach open innovation and assess the selection of modes. Measuring the impact of open innovation could take years; thus, implementing and testing the model in its entirety was not possible due to time limitations. Nevertheless, it was possible to validate the core elements of the model with empirical data gathered from the cases. In addition to constructing the model, this research contributed to the literature by increasing the understanding of open innovation, providing suggestions to the case company, and proposing future steps.

Relevance:

30.00%

Publisher:

Abstract:

Appropriate supplier selection and its profound effects on the competitive advantage of companies have been widely discussed in the supply chain management (SCM) literature. As environmental awareness rises, companies and industries attach more importance to sustainable and green activities in the selection of raw material providers. The current thesis uses the data envelopment analysis (DEA) technique to evaluate the relative efficiency of suppliers in the presence of carbon dioxide (CO2) emissions for green supplier selection. We incorporate the pollution of suppliers as an undesirable output into DEA. However, in doing so, two problems of conventional DEA models arise: the lack of discrimination power among decision making units (DMUs) and the flexibility of the input and output weights. To overcome these limitations, we use multiple criteria DEA (MCDEA) as one alternative. By applying MCDEA, the number of suppliers identified as efficient is decreased, which leads to a better ranking and selection of the suppliers. Besides, in order to compare the performance of the suppliers with an ideal supplier, a "virtual" best practice supplier is introduced. The presence of the ideal virtual supplier also increases the discrimination power of the model for a better ranking of the suppliers. Therefore, a new MCDEA model is proposed to simultaneously handle undesirable outputs and the virtual DMU. The developed model is applied to the green supplier selection problem, and a numerical example illustrates its applicability.
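For a flavor of the underlying DEA machinery, the sketch below solves a plain input-oriented CCR model in multiplier form, with each supplier's CO2 emission treated as an input so that lower pollution improves efficiency; the MCDEA extension and the virtual DMU of the paper are not reproduced here. The supplier data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# inputs per supplier: [cost, CO2 emission]; outputs: [quality score]
X = np.array([[100, 30], [120, 20], [90, 40], [110, 25]], dtype=float)
Y = np.array([[80], [85], [70], [90]], dtype=float)
n, m = X.shape          # suppliers, inputs
s = Y.shape[1]          # outputs

for o in range(n):
    c = np.concatenate([-Y[o], np.zeros(m)])        # maximize u . y_o
    A_eq = [np.concatenate([np.zeros(s), X[o]])]    # v . x_o = 1
    A_ub = np.hstack([Y, -X])                       # u . y_j - v . x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(1e-6, None)] * (s + m))
    print(f"supplier {o + 1}: efficiency = {-res.fun:.3f}")
```

In a conventional model like this, several DMUs typically come out with efficiency 1.000; MCDEA adds further criteria precisely to break such ties.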

Relevance:

30.00%

Publisher:

Abstract:

Coronary artery disease (CAD) is a leading cause of death worldwide. The standard method for evaluating critical partial occlusions is coronary arteriography, a catheterization technique which is invasive, time consuming, and costly. There are noninvasive approaches for the early detection of CAD. The basis for the noninvasive diagnosis of CAD has been laid in a sequential analysis of the risk factors and the results of the treadmill test and myocardial perfusion scintigraphy (MPS). Many investigators have demonstrated that the diagnostic applications of MPS are appropriate for patients who have an intermediate likelihood of disease. Although this information is useful, it is only partially utilized in clinical practice due to the difficulty of properly classifying the patients. Since the seminal work of Lotfi Zadeh, fuzzy logic has been applied in numerous areas. In the present study, we proposed and tested a model to select patients for MPS based on fuzzy set theory. A group of 1053 patients was used to develop the model and another group of 1045 patients was used to test it. Receiver operating characteristic curves were used to compare the performance of the fuzzy model against expert physician opinions, and showed that the performance of the fuzzy model was equal or superior to that of the physicians. Therefore, we conclude that the fuzzy model could be a useful tool to assist the general practitioner in the selection of patients for MPS.
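A very reduced sketch of the idea is given below: triangular membership functions fuzzify the pretest likelihood of CAD, and a small rule base weights the intermediate-likelihood class most heavily, since that is where MPS is most informative. The membership breakpoints and rule weights are invented, not the published model.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mps_referral_score(likelihood):
    low = tri(likelihood, -0.01, 0.0, 0.35)
    mid = tri(likelihood, 0.15, 0.5, 0.85)
    high = tri(likelihood, 0.65, 1.0, 1.01)
    # rule base: intermediate pretest likelihood benefits most from MPS
    return (0.1 * low + 0.9 * mid + 0.3 * high) / (low + mid + high)

for p in (0.1, 0.5, 0.9):
    print(f"pretest likelihood {p:.1f} -> MPS referral score {mps_referral_score(p):.2f}")
```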

Relevance:

30.00%

Publisher:

Abstract:

Fluid handling systems such as pump and fan systems have significant potential for energy efficiency improvements. To realize this potential, easily implementable methods are needed to monitor the system output, because this information is needed both to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable speed drives have proven able to give information on the system output without additional metering; however, the current model-based methods may not be usable or sufficiently accurate across the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve the accuracy of the monitoring, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
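One well-known estimation method of the kind such a combination builds on is the QP-curve method; a minimal sketch of it is shown below with a hypothetical pump curve. The drive's shaft power and rotational speed estimates are scaled to the nominal speed with the affinity laws, and the flow is read off the inverted power curve.

```python
import numpy as np

# hypothetical nominal curve at n0 = 1450 rpm: flow (m^3/h) vs shaft power (kW)
Q_n = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
P_n = np.array([2.0, 3.1, 4.0, 4.8, 5.5, 6.0])

def estimate_flow(P_est, n_est, n0=1450.0):
    P_at_n0 = P_est * (n0 / n_est) ** 3     # affinity law: P ~ n^3
    Q_at_n0 = np.interp(P_at_n0, P_n, Q_n)  # invert the monotonic QP curve
    return Q_at_n0 * (n_est / n0)           # affinity law: Q ~ n

print(f"estimated flow: {estimate_flow(P_est=4.2, n_est=1300.0):.1f} m^3/h")
```

A single method like this degrades where the power curve flattens, which is consistent with the abstract's point that individual model-based methods may not cover the whole operating range.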

Relevance:

30.00%

Publisher:

Abstract:

Personalized medicine will revolutionize our capabilities to combat disease. Working toward this goal, a fundamental task is the deciphering of genetic variants that are predictive of complex diseases. Modern studies, in the form of genome-wide association studies (GWAS), have afforded researchers the opportunity to reveal new genotype-phenotype relationships through the extensive scanning of genetic variants. These studies typically contain over half a million genetic features for thousands of individuals. Examining such data with methods other than univariate statistics is a challenging task requiring advanced algorithms that are scalable to the genome-wide level. In the future, next-generation sequencing (NGS) studies will contain an even larger number of common and rare variants. Machine learning-based feature selection algorithms have been shown to be able to effectively create predictive models for various genotype-phenotype relationships. This work explores the problem of selecting the genetic variant subsets that are most predictive of complex disease phenotypes through various feature selection methodologies, including filter, wrapper and embedded algorithms. The examined machine learning algorithms were demonstrated not only to be effective at predicting the disease phenotypes, but also to do so efficiently through the use of computational shortcuts. While much of the work could be run on high-end desktops, some of it was extended to parallel computers to help ensure that the methods will also scale to NGS data sets. Further, these studies analyzed the relationships between the various feature selection methods and demonstrated the need for careful testing when selecting an algorithm. It was shown that there is no universally optimal algorithm for variant selection in GWAS; rather, methodologies need to be selected based on the desired outcome, such as the number of features to be included in the prediction model. It was also demonstrated that without proper model validation, for example using nested cross-validation, the models can yield overly optimistic prediction accuracies and decreased generalization ability. It is through the implementation and application of machine learning methods that one can extract predictive genotype-phenotype relationships and biological insights from genetic data sets.
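The nested cross-validation point generalizes: feature selection must be refitted inside every training fold, otherwise information leaks from the test folds into the selector. A compact sketch with synthetic data is below; the pipeline and parameter grid are illustrative choices, not the thesis setup.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import Pipeline

X, y = make_classification(n_samples=200, n_features=500, n_informative=10,
                           random_state=0)

# the selector sits inside the pipeline, so every inner fold reselects
# features from its own training data only
pipe = Pipeline([("select", SelectKBest(f_classif)),
                 ("clf", LogisticRegression(max_iter=1000))])
inner = GridSearchCV(pipe, {"select__k": [10, 50, 100]}, cv=3)   # inner loop
scores = cross_val_score(inner, X, y, cv=5)                      # outer loop
print(f"nested CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```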

Relevance:

30.00%

Publisher:

Abstract:

Bearing performance significantly affects the dynamic behavior and estimated working life of a rotating system. A common bearing type is the ball bearing, which has been investigated in numerous published studies. The complexity of the ball bearing models described in the literature varies, and model complexity is naturally related to computational burden. In particular, the inclusion of centrifugal forces and gyroscopic moments significantly increases the system degrees of freedom and lengthens solution time. On the other hand, for low or moderate rotating speeds, these effects can be neglected without significant loss of accuracy. The objective of this paper is to present guidelines for the selection of a suitable bearing model, based on three case studies. To this end, two ball bearing models were implemented: one considers high-speed forces, and the other neglects them. Both models were used to study three structures, and the simulation results were compared.
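A back-of-the-envelope check of when the high-speed terms matter: the centrifugal force on one ball is roughly F_c = 0.5 * m * d_m * omega_c^2, with m the ball mass, d_m the pitch diameter and omega_c the cage speed. The values below are hypothetical and only illustrate how steeply the force grows with speed.

```python
import math

m_ball = 0.028   # ball mass, kg (assumed)
d_m = 0.065      # bearing pitch diameter, m (assumed)

for rpm in (1_500, 15_000):
    omega_i = 2 * math.pi * rpm / 60       # inner-ring speed, rad/s
    omega_c = 0.4 * omega_i                # rough cage/inner-ring speed ratio
    F_c = 0.5 * m_ball * d_m * omega_c**2  # centrifugal force per ball, N
    print(f"{rpm:>6} rpm: centrifugal force per ball ~ {F_c:7.1f} N")
```

Because the force scales with the square of the speed, it is negligible at the lower speed and load-altering at the higher one, which is the intuition behind such model selection guidelines.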

Relevance:

30.00%

Publisher:

Abstract:

To study emerging diseases, I employed a model pathogen-host system involving infections of insect larvae with the opportunistic fungus Aspergillus flavus, providing insight into three mechanisms of pathogen evolution, namely de novo mutation, genome decay, and virulence factor acquisition. In Chapter 2, as a foundational experiment, A. flavus was serially propagated through insects to study the evolution of an opportunistic pathogen during repeated exposure to a single host. While A. flavus displayed de novo phenotypic alterations, namely decreased saprobic capacity, the analysis of genotypic variation in Chapter 3 indicated a host-imposed bottleneck on the pathogen population, emphasizing the host's role in shaping pathogen population structure. As described in Chapter 4, the serial passage scheme enabled the isolation of an A. flavus cysteine/methionine auxotroph with characteristics reminiscent of an obligate insect pathogen, suggesting that lost biosynthetic capacity may restrict host range based on nutrient availability and provide selection pressure for further evolution. As outlined in Chapter 6, cysteine/methionine auxotrophy had the pleiotropic effect of increasing virulence factor production, affording the slow-growing auxotroph a modified pathogenic strategy such that virulence was not reduced. Moreover, as shown in Chapter 7, transformation with a virulence factor from a facultative insect pathogen failed to increase virulence, demonstrating the necessity of an appropriate genetic background for virulence factor acquisition to instigate pathogen evolution.

Relevance:

30.00%

Publisher:

Abstract:

Although insurers face adverse selection and moral hazard when they set insurance contracts, these two types of asymmetrical information have so far been given separate treatments in the economic literature. This paper is a first attempt to integrate both problems into a single model. We show how it is possible to use time in order to achieve a first-best allocation of risks when both problems are present simultaneously.

Relevance:

30.00%

Publisher:

Abstract:

We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first one uses a standard Wald-type statistic. The second approach assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets are derived by a projection technique. The latter has two advantages: first, confidence set validity is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases of transfers from Moroccan expatriates.
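A stylized version of the projection approach is sketched below: given a 95% confidence ellipsoid for the free parameters, the confidence interval for an endogenous variable g(theta) is obtained by minimizing and maximizing g over the ellipsoid. The toy model g and the parameter estimates are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

theta_hat = np.array([1.0, 0.5])               # hypothetical point estimates
V = np.array([[0.04, 0.01], [0.01, 0.02]])     # hypothetical covariance matrix
V_inv = np.linalg.inv(V)
crit = chi2.ppf(0.95, df=2)                    # 95% ellipsoid radius

def g(theta):
    # toy "endogenous variable": any nonlinear function of the parameters
    return theta[0] ** 2 + np.exp(theta[1])

ellipsoid = {"type": "ineq",
             "fun": lambda t: crit - (t - theta_hat) @ V_inv @ (t - theta_hat)}
lo = minimize(g, theta_hat, constraints=[ellipsoid])
hi = minimize(lambda t: -g(t), theta_hat, constraints=[ellipsoid])
print(f"projection 95% CI for g(theta): [{lo.fun:.3f}, {-hi.fun:.3f}]")
```

Note that nothing in the procedure depends on g being linear, which is the first advantage claimed above; repeating it for additional variables from the same parameter confidence set gives simultaneous intervals, the second advantage.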

Relevance:

30.00%

Publisher:

Abstract:

Statistical tests in vector autoregressive (VAR) models are typically based on large-sample approximations, involving the use of asymptotic distributions or bootstrap techniques. After documenting that such methods can be very misleading even with fairly large samples, especially when the number of lags or the number of equations is not small, we propose a general simulation-based technique that allows one to control completely the level of tests in parametric VAR models. In particular, we show that maximized Monte Carlo tests [Dufour (2002)] can provide provably exact tests for such models, whether they are stationary or integrated. Applications to order selection and causality testing are considered as special cases. The technique developed is applied to quarterly and monthly VAR models of the U.S. economy, comprising income, money, interest rates and prices, over the period 1965-1996.
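The core Monte Carlo test idea can be stated in a few lines: for a statistic whose null distribution can be simulated, the p-value p = (1 + #{S_i >= S_0}) / (N + 1) yields an exact test at conventional levels. A maximized MC test additionally maximizes this p-value over the nuisance parameters; only the simple pivotal case is sketched below, with a toy statistic.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_pvalue(stat_obs, simulate_stat, N=99):
    # exact Monte Carlo p-value for a pivotal statistic
    sims = np.array([simulate_stat(rng) for _ in range(N)])
    return (1 + np.sum(sims >= stat_obs)) / (N + 1)

def sim_stat(rng, n=50):
    # toy statistic: squared mean of n standard normal draws
    return rng.standard_normal(n).mean() ** 2

S0 = sim_stat(rng)   # stand-in for the statistic computed from real data
print(f"Monte Carlo p-value: {mc_pvalue(S0, sim_stat):.3f}")
```

With N = 99 and a 5% level, the test rejects when at most four simulated statistics reach the observed one; the maximized variant replaces this single p-value by its supremum over the nuisance-parameter space, which is what makes the VAR tests provably exact.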