82 results for Probabilistic choice models


Relevance: 20.00%

Abstract:

We examine the drivers behind the establishment mode choice of German multinational enterprises (MNEs) in the Automotive, Chemicals and Mechanical Engineering sectors in Brazil for the years 1993-2013, using a novel sample of primary data obtained directly from German MNEs. Based on prevalent theories in the literature, we test the most common hypotheses on our sample. Firms with high R&D activity and firms with prior market knowledge in Brazil, in the form of previous sales offices, are more likely to enter Brazil via a Greenfield investment. We also show that it is the specific private ownership of the German so-called hidden champions that drives those SMEs to enter Brazil via Greenfield, a suspicion that has been voiced before. Finally, we show that the establishment mode choice between Brazil and the USA deviates only to a small extent, with German MNEs preferring to enter Brazil via Greenfield and the USA via M&A. Thereby, we provide valuable insights for future research in this field.
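Establishment mode choice of this kind is typically modeled as a binary (e.g. logit) choice over firm characteristics. A minimal sketch, with illustrative coefficients that are assumptions for exposition rather than estimates from this study:

```python
import math

def greenfield_probability(rd_intensity, prior_presence,
                           beta_rd=1.2, beta_prior=0.8, intercept=-1.0):
    """Binary logit: P(Greenfield) given firm characteristics.
    Coefficient values are illustrative assumptions, not estimates."""
    z = intercept + beta_rd * rd_intensity + beta_prior * prior_presence
    return 1.0 / (1.0 + math.exp(-z))

# A firm with high R&D and a prior sales office in Brazil is more likely
# to choose Greenfield than one without either.
p_high = greenfield_probability(rd_intensity=1.0, prior_presence=1)
p_low = greenfield_probability(rd_intensity=0.0, prior_presence=0)
```

With positive coefficients on both drivers, the model reproduces the qualitative pattern the abstract reports.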

Relevance: 20.00%

Abstract:

This work project (WP) is a study of a clustering strategy for Sport Zone. The general objective of a cluster study is to create groups such that individuals within each group are similar to each other but different across groups. Cluster creation is a mix of common sense, trial and error, and supporting statistical techniques. Our particular objective is to help category managers better define the product types to be displayed on store shelves by building store clusters. This research was carried out for Sport Zone and comprises an objective definition, a literature review, the clustering activity itself, some factor analysis, and a discriminant analysis to better frame our work. Alongside this quantitative part, a survey addressed to category managers was carried out to better understand their key drivers for choosing each store's product types. Based on a non-random sample of 65 stores with data referring to 2013, the final result was the choice of 6 store clusters (Figure 1), which were individually characterized as the main outcome of this work. All of our selected variables proved important for distinguishing between clusters, which confirms the adequacy of their choice. The interpretation of the results gives category managers a tool to understand which products best fit the clustered stores. Furthermore, as a side finding enabled by the clustering, an STP (Segmentation, Targeting and Positioning) analysis was initiated, this WP constituting the first step of a continuous process.
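The store-clustering step can be illustrated with a plain k-means routine. The feature vectors, data and deterministic seeding below are invented for illustration; the WP's actual variables and statistical techniques are not reproduced here:

```python
def kmeans(points, k, iters=20):
    """Plain k-means: assign each store (feature vector) to its nearest
    centroid, recompute centroids, and repeat."""
    # Deterministic seeding: evenly spaced points as initial centroids.
    step = max(1, len(points) // k)
    centroids = [points[i * step] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl
                     else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Six "stores" described by two scaled features, forming two obvious groups.
stores = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),
          (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]
centroids, clusters = kmeans(stores, k=2)
```

In practice the number of clusters (6 in the WP) is itself a judgment call, supported by the factor and discriminant analyses the abstract mentions.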

Relevance: 20.00%

Abstract:

An infinite-horizon, discrete-time model with multiple size-class structures using a transition matrix is built to assess optimal harvesting schedules for Non-Industrial Private Forest (NIPF) owners. Three model specifications, accounting for forest income, financial return on an asset, and amenity valuations, are considered. Numerical simulations suggest uneven-aged forest management in which a rational forest owner adapts his or her forest policy by influencing the regeneration of trees or adjusting consumption dynamics, depending on subjective time preference and the dynamics of the market return rate on the financial asset. Moreover, he or she does not place significant value on the non-market benefits captured by amenity valuations relative to forest income.
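The size-class dynamics can be sketched as one projection of a stand vector through a transition matrix, with the harvest removed first. The matrix entries, class structure and harvest levels are illustrative assumptions, not the thesis's calibration:

```python
def project_stand(stand, transition, harvest):
    """One period of size-class dynamics: remove the harvest from each
    class, then move the remaining trees between classes via the
    transition matrix (column j says where class-j trees go)."""
    remaining = [x - h for x, h in zip(stand, harvest)]
    n = len(remaining)
    return [sum(transition[i][j] * remaining[j] for j in range(n))
            for i in range(n)]

# Three size classes (small, medium, large).
T = [[0.80, 0.00, 0.30],   # stay small; large trees contribute regeneration
     [0.15, 0.85, 0.00],   # growth into medium / stay medium
     [0.00, 0.10, 0.90]]   # growth into large / stay large
stand0 = [100.0, 50.0, 20.0]
stand1 = project_stand(stand0, T, harvest=[0.0, 0.0, 5.0])
```

Iterating this map under a harvesting policy, and discounting the resulting income stream, is the skeleton of the optimal-schedule problem the abstract describes.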

Relevance: 20.00%

Abstract:

The development of human cell models that recapitulate hepatic functionality allows the study of metabolic pathways involved in toxicity and disease. The increased biological relevance, cost-effectiveness and high throughput of cell models can contribute to increasing the efficiency of drug development in the pharmaceutical industry. Recapitulating liver functionality in vitro requires advanced culture strategies to mimic in vivo complexity, such as 3D culture, co-cultures or biomaterials. However, complex 3D models are typically associated with poor robustness and limited scalability and compatibility with screening methods. In this work, several strategies were used to develop highly functional and reproducible spheroid-based in vitro models of human hepatocytes and HepaRG cells using stirred culture systems. In chapter 2, the isolation of human hepatocytes from resected liver tissue was implemented and a liver tissue perfusion method was optimized to improve hepatocyte isolation and aggregation efficiency, resulting in an isolation protocol compatible with 3D culture. In chapter 3, human hepatocytes were co-cultivated with mesenchymal stem cells (MSC) and the phenotype of both cell types was characterized, showing that MSC acquire a supportive stromal function and that hepatocytes retain differentiated hepatic functions, stability of drug metabolism enzymes and higher viability in co-cultures. In chapter 4, a 3D alginate microencapsulation strategy for the differentiation of HepaRG cells was evaluated and compared with the standard 2D DMSO-dependent differentiation, yielding higher differentiation efficiency, comparable levels of drug metabolism activity and significantly improved biosynthetic activity. The work developed in this thesis provides novel strategies for the 3D culture of human hepatic cell models which are reproducible, scalable and compatible with screening platforms.
The phenotypic and functional characterization of the in vitro systems performed contributes to the state of the art of human hepatic cell models and can be applied to improve the efficiency of pre-clinical drug development, to model disease and, ultimately, to develop cell-based therapeutic strategies for liver failure.

Relevance: 20.00%

Abstract:

This paper develops the model of Bicego, Grosso, and Otranto (2008) and applies Hidden Markov Models to predict market direction. The paper draws an analogy between financial markets and speech recognition, seeking inspiration from the latter to solve common issues in quantitative investing. Whereas previous works focus mostly on very complex modifications of the original hidden Markov model algorithm, the current paper provides an innovative methodology by drawing inspiration from thoroughly tested, yet simple, speech recognition methodologies. By grouping returns into sequences, Hidden Markov Models can predict market direction the same way they are used to identify phonemes in speech recognition. The model proves highly successful in identifying market direction but fails to consistently identify whether a trend is in place. All in all, the current paper seeks to bridge the gap between speech recognition and quantitative finance and, even though the model is not fully successful, several refinements are suggested and the room for improvement is significant.
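The sequence-scoring step can be sketched with the standard HMM forward algorithm, the same machinery used to score phoneme models in speech recognition. The two hidden states ("up", "down") and all probabilities below are toy values, not the paper's estimates:

```python
def forward(obs, start, trans, emit):
    """Forward algorithm: likelihood of an observation sequence under an
    HMM, summing over all hidden-state paths."""
    states = list(start)
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit[s][o] * sum(alpha[r] * trans[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

start = {"up": 0.5, "down": 0.5}
trans = {"up": {"up": 0.7, "down": 0.3},
         "down": {"up": 0.4, "down": 0.6}}
emit = {"up": {"+": 0.8, "-": 0.2},
        "down": {"+": 0.3, "-": 0.7}}

# Score two return-sign sequences: the model assigns higher likelihood to
# the one consistent with a persistent upward regime.
p_trend = forward(["+", "+", "+"], start, trans, emit)
p_mixed = forward(["+", "-", "+"], start, trans, emit)
```

Classifying a new sequence by whichever regime's model scores it higher mirrors how competing phoneme models are compared in speech recognition.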

Relevance: 20.00%

Abstract:

The lives of humans and most living beings depend on sensation and perception for the best assessment of the surrounding world. Sensory organs acquire a variety of stimuli that are interpreted and integrated in our brain, for immediate use or stored in memory for later recall. Among other reasoning tasks, a person has to decide what to do with available information. Emotions act as classifiers of collected information, assigning a personal meaning to objects, events and individuals, and form part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new ways of acquiring and integrating information. But before data can be assessed for usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our body, using cloud-based services and Internet connections. The limitations of systems in handling sensory data, compared with our own sensory capabilities, constitute one identified problem. Another is the lack of interoperability between humans and devices, as devices do not properly understand humans' emotional states and needs. Addressing these problems motivates the present research work. The mission hereby assumed is to incorporate sensory and physiological data into a Framework that manages collected data in a way that mirrors human cognitive functions, supported by a new data model.
By learning from selected human functional and behavioural models and reasoning over collected data, the Framework aims to evaluate a person's emotional state, empowering human-centric applications, along with the capability of storing episodic information on a person's life, with physiological indicators of emotional states, to be used by new-generation applications.
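A data model of the kind described, pairing raw sensor samples with an appraised emotional state, might be sketched as follows. All field names and the toy appraisal rule are assumptions for illustration, not the Framework's actual design:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SensorSample:
    """One reading from an environmental or physiological sensor."""
    source: str       # e.g. "heart_rate" or "ambient_light" (hypothetical names)
    timestamp: float
    value: float

@dataclass
class EpisodicRecord:
    """An episode of a person's life: raw samples plus an appraised
    emotional state, stored for later recall by applications."""
    samples: List[SensorSample] = field(default_factory=list)
    emotional_state: Dict[str, float] = field(default_factory=dict)

    def appraise(self) -> None:
        # Toy rule: map mean heart rate onto an arousal score in [0, 1].
        hr = [s.value for s in self.samples if s.source == "heart_rate"]
        if hr:
            mean = sum(hr) / len(hr)
            self.emotional_state["arousal"] = min(1.0, max(0.0, (mean - 60.0) / 60.0))

rec = EpisodicRecord()
rec.samples.append(SensorSample("heart_rate", 0.0, 96.0))
rec.appraise()
```

Real appraisal would of course reason over many physiological indicators and learned behavioural models, as the abstract indicates; the point here is only the shape of the episodic record.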

Relevance: 20.00%

Abstract:

In the increasingly competitive market for higher education introduced by the Bologna Declaration, understanding the decision-making of master in management students is at the center of institutional management and marketing efforts in the mission to attract prospective students in a less costly, more efficient manner. The means-end chain approach, applied to the choice of a Portuguese institution in which to pursue a master in management, points to the position in rankings and to the non-specificity of the program as the most important attributes. Additionally, results show that students with distinct demographic, household, or background characteristics choose in significantly different manners.
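The means-end chain approach builds hierarchical value maps from attribute → consequence → value ladders elicited from respondents. A minimal sketch of tallying such links into an implication matrix, with invented ladder content rather than data from this study:

```python
from collections import Counter

def implication_counts(ladders):
    """Tally direct links in attribute -> consequence -> value ladders,
    the raw material of a hierarchical value map."""
    links = Counter()
    for ladder in ladders:
        for a, b in zip(ladder, ladder[1:]):
            links[(a, b)] += 1
    return links

# Hypothetical ladders; attribute and value labels are illustrative only.
ladders = [
    ("ranking position", "perceived quality", "career success"),
    ("ranking position", "perceived quality", "self-esteem"),
    ("non-specific program", "flexibility", "career success"),
]
links = implication_counts(ladders)
```

Links whose counts exceed a cut-off level are then drawn as the hierarchical value map, which is how attribute importance (e.g. ranking position) emerges from the raw ladders.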

Relevance: 20.00%

Abstract:

Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations to regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide shelter and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed, owing to the urgency of sheltering affected populations. However, by neglecting exposure risks in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It therefore becomes fundamental to include resilience criteria in housing, which in turn will allow new houses to better withstand the passage of time and natural disasters, in the safest way possible. This master's thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damages that flood events can cause to housing, a methodology for flood-resilient housing models was proposed, identifying key criteria that housing should meet.
The methodology is based on the US Federal Emergency Management Agency requirements and recommendations for specific flood zones. Finally, a case study in the Maldives, one of the countries most vulnerable to sea level rise resulting from climate change, was analyzed in light of housing recovery in a post-disaster scenario. This analysis was carried out using the proposed methodology, with the intent of assessing the resilience to floods of the housing newly built in the aftermath of the 2004 Indian Ocean Tsunami.

Relevance: 20.00%

Abstract:

Researchers have recently shown that more choice is not always better. Choosing from large assortments can be overwhelming, raising expectations and decreasing the overall level of consumer satisfaction. The author contributes to existing overchoice studies by using the real assortment of online stores to examine the influence of assortment size on customer satisfaction. 90 students participated in the main experiment, in which they chose a smartphone case for a friend. Results of the study show that a large assortment leads to higher expectations, higher choice difficulty and a higher level of satisfaction. This research does not demonstrate the presence of overchoice, and the author suggests that future studies could focus more on assortment variety and on more personal characteristics of consumers, such as preference uncertainty.
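Comparing mean satisfaction between a small-assortment and a large-assortment group is a standard two-sample problem. A sketch using Welch's t statistic, on invented ratings rather than the experiment's data:

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with possibly
    unequal variances: (mean_b - mean_a) / sqrt(var_a/n_a + var_b/n_b)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (mb - ma) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical satisfaction ratings on a 7-point scale.
small = [4, 5, 5, 6, 4, 5]   # small assortment condition
large = [6, 7, 6, 7, 5, 7]   # large assortment condition
t = welch_t(small, large)
```

A large positive t here would be consistent with the study's finding that the large assortment produced higher satisfaction (the opposite of the overchoice prediction).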

Relevance: 20.00%

Abstract:

Simulated moving bed (SMB) chromatography is attracting more and more attention, since it is a powerful technique for complex separation tasks. Nowadays, more than 60% of preparative SMB units are installed in the pharmaceutical and food industries [SDI, Preparative and Process Liquid Chromatography: The Future of Process Separations, International Strategic Directions, Los Angeles, USA, 2002. http://www.strategicdirections.com]. Chromatography is the method of choice in these fields, because pharmaceuticals and fine chemicals often have physico-chemical properties which differ little from those of the by-products, and they may be thermally unstable. In these cases, standard separation techniques such as distillation and extraction are not applicable. The importance of preparative chromatography, particularly the SMB process, as a separation and purification process in the above-mentioned industries has been increasing, due to its flexibility, energy efficiency and higher product purity performance. Consequently, a new SMB paradigm is called for by the large number of potential small-scale applications of SMB technology, one which exploits the flexibility and versatility of the technology. In this new SMB paradigm, a number of possibilities for improving SMB performance through variation of parameters during a switching interval are pushing the trend toward units with a smaller number of columns, because less stationary phase is used and the setup is more economical. This is especially important for the pharmaceutical industry, where SMBs are seen as multipurpose units that can be applied to different separations in all stages of the drug-development cycle. In order to reduce the experimental effort, and accordingly the cost associated with the development of separation processes, simulation models are used intensively.
One important aspect in this context is the determination of the adsorption isotherms in SMB chromatography, where separations are usually carried out under strongly nonlinear conditions in order to achieve higher productivities. The accurate determination of the competitive adsorption equilibrium of the enantiomeric species is thus of fundamental importance to allow computer-assisted optimization or process scale-up. Two major SMB operating problems are apparent at production scale: the assessment of product quality and the maintenance of long-term stable and controlled operation. Constraints regarding product purity, dictated by pharmaceutical and food regulatory organizations, have drastically increased the demand for product quality control. The strict regulations imposed are increasing the need to develop optically pure drugs. (...)
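Competitive adsorption equilibria of enantiomer pairs are commonly described by a competitive Langmuir isotherm, q_i = Q·b_i·c_i / (1 + Σ_j b_j·c_j). A minimal sketch with illustrative parameters (not fitted to any real system), showing how the presence of one species depresses the loading of the other:

```python
def competitive_langmuir(c, Q, b):
    """Competitive Langmuir isotherm: solid-phase loading q_i of each
    species given all liquid-phase concentrations c (shared saturation
    capacity Q, per-species affinities b)."""
    denom = 1.0 + sum(bi * ci for bi, ci in zip(b, c))
    return [Q * bi * ci / denom for bi, ci in zip(b, c)]

Q = 10.0            # saturation capacity (illustrative), g/L
b = [0.30, 0.15]    # affinity of each enantiomer (illustrative), L/g
q_mix = competitive_langmuir([1.0, 1.0], Q, b)      # both species present
q_single = competitive_langmuir([1.0, 0.0], Q, b)   # first species alone
```

The nonlinearity of this denominator is exactly why isotherm parameters must be measured accurately under the strongly nonlinear conditions the abstract mentions: loadings are not proportional to concentrations, and each species' loading depends on its competitor.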

Relevance: 20.00%

Abstract:

This research is titled “The Future of Airline Business Models: Which Will Win?” and is part of the requirements for the award of a Master's in Management from Nova SBE and another from LUISS Guido Carli University. The purpose is to elaborate a complete market analysis of the European Air Transportation Industry in order to predict which airlines, strategies and business models may be successful in the coming years. First, an extensive literature review of the business model concept was conducted. Then, a detailed overview of the main European airlines and the strategies they have implemented so far was developed. Finally, the research is illustrated with three case studies.

Relevance: 20.00%

Abstract:

Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts economics needed to explain could be understood in light of models in which individual agents act as if they were able to make ideal decisions. In the last decades, however, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity of accounting for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function).
If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, that cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout.
Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decide alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players he believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory for the bigger party. The smaller one may also win, provided its relative size is not too small; more self-delusional voters in the minority party decrease this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
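The asymmetric outcome described for the Cournot duopoly in chapter 2 can be checked numerically for one parameterization. The demand intercept, unit cost, default quantity and thinking cost below are invented values, not the thesis's calibration: for an intermediate thinking cost, one firm gains by paying the cost and best responding, while the other prefers to keep the default quantity.

```python
def profit(q_own, q_other, a=10.0, c=1.0):
    """Profit in a linear Cournot duopoly: inverse demand P = a - Q, unit cost c."""
    price = max(0.0, a - q_own - q_other)
    return (price - c) * q_own

def best_response(q_other, a=10.0, c=1.0):
    """Standard Cournot best response to the rival's quantity."""
    return max(0.0, (a - c - q_other) / 2.0)

d = 4.0   # default quantity (illustrative assumption)
k = 1.0   # thinking cost (illustrative assumption)

# Firm 1 pays the thinking cost and best responds to the default quantity;
# its net gain over just producing the default is positive.
q1 = best_response(d)
gain_firm1 = profit(q1, d) - k - profit(d, d)
# Firm 2, facing q1, would gain less from best responding than the cost k,
# so it sticks with the default: asymmetric choices in a symmetric model.
gain_firm2 = profit(best_response(q1), q1) - k - profit(d, q1)
```

With these numbers one firm's thinking incentive is positive and the other's negative, mirroring the intermediate-cost equilibrium the abstract describes.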

Relevance: 20.00%

Abstract:

Modern fully integrated transceiver architectures require circuits with low area, low cost, low power, and high efficiency. A key block in modern transceivers is the power amplifier, which is studied in depth in this thesis. First, we study the implementation of a classical Class-A amplifier, describing the basic operation of an RF power amplifier and analysing the influence of the real models of the reactive components on its operation. Secondly, the Class-E amplifier is studied in depth. The different types of implementations are reviewed and theoretical equations are derived and compared with simulations. Four modes of operation for the Class-E amplifier were selected in order to implement the output stage and subsequently compare the results. This led to the selection of the mode with the best trade-off between efficiency and harmonic distortion, lower power consumption and higher output power. The optimal choice was a parallel circuit containing an inductor with a finite value. To complete the implementation of the PA in switching mode, a driver was implemented. The final block (output stage together with the driver) achieved 20% total efficiency (PAE), delivering 8 dBm of output power to a 50 Ω load with a total harmonic distortion (THD) of 3% and a total consumption of 28 mW. All implementations are designed in standard 130 nm CMOS technology. The operating frequency is 2.4 GHz and a 1.2 V DC power supply was considered. The proposed circuit is intended to be used in a Bluetooth transmitter; however, it has a wider range of applications.
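As a quick sanity check of the reported figures: converting 8 dBm to milliwatts and dividing by the 28 mW total consumption gives a drain efficiency just above the quoted 20% PAE (PAE also subtracts the input drive power, so it sits slightly below drain efficiency):

```python
def dbm_to_mw(p_dbm):
    """Convert power in dBm to milliwatts: P[mW] = 10**(P[dBm]/10)."""
    return 10 ** (p_dbm / 10.0)

p_out_mw = dbm_to_mw(8.0)       # 8 dBm into the load is about 6.3 mW
p_dc_mw = 28.0                  # reported total DC consumption
drain_eff = p_out_mw / p_dc_mw  # about 22.5%; PAE = (Pout - Pin)/Pdc < this
```

The ~2-point gap between the 22.5% drain efficiency and the 20% PAE is consistent with the power consumed driving the switching transistor.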

Relevance: 20.00%

Abstract:

Composite materials have a complex behavior, which is difficult to predict under different types of loads. In the course of this dissertation, a methodology was developed to predict failure and damage propagation in composite material specimens. This methodology uses finite element numerical models created with the Ansys and Matlab software packages. The methodology performs an incremental-iterative analysis, which gradually increases the load applied to the specimen. Several structural failure phenomena are considered, such as fiber and/or matrix failure, delamination and shear plasticity. Failure criteria based on element stresses were implemented, and a procedure to reduce the stiffness of the failed elements was prepared. The material used in this dissertation consists of a spread-tow carbon fabric with a 0°/90° arrangement, and the main numerical model analyzed is a 26-ply specimen under compression loads. Numerical results were compared with the results of specimens tested experimentally, whose mechanical properties were unknown; only the geometry of the specimen was known. The material properties of the numerical model were adjusted in the course of this dissertation in order to minimize the difference between the numerical and experimental results, achieving an error lower than 5% (numerical model identification was performed based on the experimental results).
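The incremental-iterative idea, raising the load stepwise and degrading the stiffness of elements that exceed a stress-based failure criterion, can be sketched in one dimension. The parallel-spring model, stiffnesses and strengths below are invented stand-ins for the actual Ansys/Matlab finite element procedure:

```python
def incremental_failure(stiffnesses, strengths, load_step=1.0, max_steps=100):
    """Incremental-iterative sketch for parallel 1D elements sharing a load:
    at each load level, iterate until no element's stress exceeds its
    strength; failed elements have their stiffness knocked down (here:
    zeroed), redistributing stress to the survivors."""
    k = list(stiffnesses)
    history = []  # (load, number of surviving elements) per converged step
    for step in range(1, max_steps + 1):
        load = step * load_step
        while True:
            total_k = sum(k)
            if total_k == 0.0:
                return history, load  # total collapse at this load
            strain = load / total_k
            # Stress-based failure criterion: stress k_i * strain > strength_i.
            failed = [i for i, ki in enumerate(k)
                      if ki > 0 and ki * strain > strengths[i]]
            if not failed:
                break  # this load level converged
            for i in failed:
                k[i] = 0.0  # stiffness reduction of failed elements
        history.append((load, sum(1 for ki in k if ki > 0)))
    return history, None

history, collapse_load = incremental_failure(
    [10.0, 10.0, 5.0], [6.0, 6.0, 2.0], load_step=2.0)
```

Progressive failure appears as the surviving-element count dropping with load, until redistribution can no longer find an equilibrium and the model collapses.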