Abstract:
The harmful dinoflagellate Prorocentrum minimum has different effects upon various species of grazing bivalves, and these effects also vary with life-history stage. Possible effects of this dinoflagellate upon mussels have not been reported; therefore, experiments exposing adult blue mussels, Mytilus edulis, to P. minimum were conducted. Mussels were exposed to cultures of toxic P. minimum or benign Rhodomonas sp. in glass aquaria. After a short period of acclimation, samples were collected on day 0 (before the exposure) and after 3, 6, and 9 days of continuous exposure. Hemolymph was extracted for flow-cytometric analyses of hemocyte immune-response functions, and soft tissues were excised for histopathology. Mussels responded to P. minimum exposure with diapedesis of hemocytes into the intestine, presumably to isolate P. minimum cells within the gut, thereby minimizing damage to other tissues. This immune response appeared to be sustained throughout the 9-day exposure period, as circulating hemocytes retained their hematological and functional properties. Bacteria proliferated in the intestines of the P. minimum-exposed mussels. Hemocytes within the intestine appeared to be either overwhelmed by the large number of bacteria or fully occupied by the encapsulation response to P. minimum cells; when hemocytes reached the intestinal lumina, they underwent apoptosis and bacterial degradation. This experiment demonstrated that M. edulis is affected by ingestion of toxic P. minimum; however, the specific responses observed in the blue mussel differed from those reported for other bivalve species. This finding highlights the need to study the effects of HABs on different bivalve species, rather than inferring that results from one species reflect the exposure responses of all bivalves.
Abstract:
The remarkable increases in trade flows and in migratory flows of highly educated people are two important features of the globalization of recent decades. This paper extends a two-country model of inter- and intra-industry trade to a rich environment featuring technological differences, skill differences and the possibility of international labor mobility. The model is used to explain the patterns of trade and migration as countries remove barriers to trade and to labor mobility. We parameterize the model to match the features of the Western and Eastern European members of the EU and analyze first the effects of the trade liberalization which occurred between 1989 and 2004, and then the gains and losses from the migration which is expected to occur if legal barriers to labor mobility are substantially reduced. Lower barriers to migration would result in significant migration of skilled workers from Eastern European countries. Interestingly, this would not only benefit the migrants and most Western European workers but, via trade, it would also benefit the workers remaining in Eastern Europe. Key Words: Skilled Migration, Gains from Variety, Real Wages, Eastern-Western Europe. JEL Codes: F12, F22, J61.
Abstract:
We present a real data set of claim amounts where costs related to damage are recorded separately from those related to medical expenses. Only claims with positive costs are considered here. Two approaches to density estimation are presented: a classical parametric method and a semi-parametric method based on transformation kernel density estimation. We explore the data set with standard univariate methods. We also propose ways to select the bandwidth and transformation parameters in the univariate case based on Bayesian methods. We indicate how to compare the results of alternative methods, both by looking at the shape of the estimated density over its whole domain and by exploring the density estimates in the right tail.
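The transformation kernel idea can be illustrated with a minimal sketch: map positive, right-skewed claim amounts to the log scale, apply a Gaussian kernel density estimate there, and transform the density back with the Jacobian of the transformation. The simulated lognormal "claims", the choice of the log transform and the fixed bandwidth are illustrative assumptions, not the paper's data or its Bayesian bandwidth-selection method.

```python
import numpy as np

def transformed_kde(x, grid, bandwidth):
    """KDE for positive, right-skewed data: estimate on the log scale,
    then map back with the Jacobian 1/x of the log transform."""
    z, zg = np.log(x), np.log(grid)
    diffs = (zg[:, None] - z[None, :]) / bandwidth
    # Gaussian kernel density estimate evaluated on the transformed grid
    dens_z = np.exp(-0.5 * diffs ** 2).sum(axis=1) / (len(z) * bandwidth * np.sqrt(2 * np.pi))
    return dens_z / grid  # f_X(x) = f_Z(log x) / x

rng = np.random.default_rng(0)
claims = rng.lognormal(mean=7.0, sigma=1.2, size=2000)  # simulated claim costs
grid = np.geomspace(claims.min(), claims.max(), 400)    # log-spaced grid suits the heavy tail
dens = transformed_kde(claims, grid, bandwidth=0.25)
```

Estimating on the transformed scale avoids the boundary bias and tail undersmoothing that a plain Gaussian KDE suffers on heavy-tailed positive data.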
Abstract:
Analysis of gas emissions by the input-output subsystem approach provides detailed insight into pollution generation in an economy. Structural decomposition analysis, on the other hand, identifies the factors behind the changes in key variables over time. Extending the input-output subsystem model to account for the changes in these variables reveals the channels by which environmental burdens are caused and transmitted throughout the production system. In this paper we propose a decomposition of the changes in the components of CO2 emissions captured by an input-output subsystems representation. The empirical application is for the Spanish service sector, and the economic and environmental data are for the years 1990 and 2000. Our results show that services increased their CO2 emissions mainly because of a rise in emissions generated by non-services to cover the final demand for services. In all service activities, the decomposed effects show an increase in CO2 emissions: a decrease due to lower emission coefficients (i.e., emissions per unit of output) is compensated by an increase caused both by changes in the input-output coefficients and by the rise in demand for services. Finally, large asymmetries exist not only in the quantitative changes in the CO2 emissions of the various services but also in the decomposed effects of these changes. Keywords: structural decomposition analysis, input-output subsystems, CO2 emissions, service sector.
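The subsystem accounting underlying such a decomposition can be sketched as follows: with direct emission coefficients e, technical coefficients A and final demand y, total emissions are e'(I - A)^-1 y, and zeroing out the final demand of every sector except services isolates the services subsystem, including the emissions that non-service sectors generate on its behalf. All numbers below are hypothetical; they are not the Spanish data used in the paper.

```python
import numpy as np

# Hypothetical 3-sector economy (primary, industry, services); not the paper's data.
A = np.array([[0.10, 0.05, 0.02],   # technical (input-output) coefficients
              [0.20, 0.30, 0.10],
              [0.10, 0.15, 0.20]])
e = np.array([0.8, 1.5, 0.3])       # direct CO2 emissions per unit of output
y = np.array([50.0, 120.0, 200.0])  # final demand by sector

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse, so gross output is x = L @ y
x = L @ y

# Subsystem view: output and emissions triggered by the final demand for services only
y_srv = np.array([0.0, 0.0, y[2]])
x_srv = L @ y_srv
E_total = float(e @ x)
E_srv = float(e @ x_srv)                         # all emissions embodied in services demand
E_srv_by_nonservices = float(e[:2] @ x_srv[:2])  # the share generated by non-service sectors
```

The last quantity is the channel highlighted in the abstract: emissions generated by non-services to cover the final demand for services.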
Abstract:
In a recent paper, Bermúdez [2009] used bivariate Poisson regression models for ratemaking in car insurance, including zero-inflated models to account for the excess of zeros and the overdispersion in the data set. In the present paper, we revisit this model in order to consider alternatives. We propose a two-component finite mixture of bivariate Poisson regression models to demonstrate that the overdispersion in the data requires more structure if it is to be taken into account, and that a simple zero-inflated bivariate Poisson model does not suffice. At the same time, we show that a finite mixture of bivariate Poisson regression models embraces zero-inflated bivariate Poisson regression models as a special case. Additionally, we describe a model in which the mixing proportions depend on covariates, modelling the way in which each individual belongs to a separate cluster. Finally, an EM algorithm is provided to make the models straightforward to fit. These models are applied to the same automobile insurance claims data set as used in Bermúdez [2009], and it is shown that the modelling of the data set can be improved considerably.
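The flavour of the EM fit can be shown with a stripped-down sketch. To keep it short, each mixture component below has independent Poisson margins and no covariates, whereas the paper's components are full bivariate Poisson distributions (with a covariance term) and carry regression structure; the simulated claim-count pairs are likewise invented for illustration.

```python
import numpy as np
from scipy.stats import poisson

def em_mixture(n1, n2, n_iter=200, seed=0):
    """EM for a 2-component mixture in which each component has independent
    Poisson margins (a simplification of the full bivariate Poisson)."""
    rng = np.random.default_rng(seed)
    pi = 0.5
    lam = rng.uniform(0.5, 2.0, size=(2, 2))  # lam[k] = (rate1, rate2) of component k
    for _ in range(n_iter):
        # E-step: posterior probability that each observation comes from component 0
        p0 = pi * poisson.pmf(n1, lam[0, 0]) * poisson.pmf(n2, lam[0, 1])
        p1 = (1 - pi) * poisson.pmf(n1, lam[1, 0]) * poisson.pmf(n2, lam[1, 1])
        w = p0 / (p0 + p1)
        # M-step: weighted means update the rates; the mean weight updates pi
        pi = w.mean()
        lam[0] = [np.average(n1, weights=w), np.average(n2, weights=w)]
        lam[1] = [np.average(n1, weights=1 - w), np.average(n2, weights=1 - w)]
    return pi, lam

# Simulated claim-count pairs drawn from two latent clusters
rng = np.random.default_rng(1)
z = rng.random(3000) < 0.4
n1 = np.where(z, rng.poisson(0.3, 3000), rng.poisson(3.0, 3000))
n2 = np.where(z, rng.poisson(0.5, 3000), rng.poisson(2.0, 3000))
pi_hat, lam_hat = em_mixture(n1, n2)
```

Because each observation's component membership is latent, EM alternates between soft assignments (E-step) and closed-form weighted-average updates (M-step), which is what makes the mixture easy to fit.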
Abstract:
This paper presents an automatic vision-based system for UUV station keeping. The vehicle is equipped with a down-looking camera, which provides images of the sea floor. The station-keeping system is based on a feature-based motion detection algorithm, which exploits standard correlation and explicit textural analysis to solve the correspondence problem. A visual map of the area surveyed by the vehicle is constructed to increase the flexibility of the system, allowing the vehicle to position itself when it has lost the reference image. The testing platform is the URIS underwater vehicle. Experimental results demonstrating the behavior of the system in a real environment are presented.
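The correlation-based matching at the core of such a system can be sketched as a brute-force normalized cross-correlation search for a reference patch between two frames. The synthetic "sea-floor" texture and the exhaustive search window below are illustrative assumptions, not the actual algorithm running on URIS.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.sqrt((a ** 2).sum() * (b ** 2).sum()) + 1e-12))

def match_patch(frame, patch, search, y0, x0):
    """Locate `patch` in `frame` near (y0, x0) by exhaustive NCC search."""
    h, w = patch.shape
    best, best_pos = -2.0, (y0, x0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if 0 <= y and 0 <= x and y + h <= frame.shape[0] and x + w <= frame.shape[1]:
                s = ncc(frame[y:y + h, x:x + w], patch)
                if s > best:
                    best, best_pos = s, (y, x)
    return best_pos, best

# Synthetic textured sea floor, shifted by (3, 5) pixels between frames
rng = np.random.default_rng(3)
frame0 = rng.random((80, 80))
frame1 = np.roll(frame0, shift=(3, 5), axis=(0, 1))
patch = frame0[30:46, 30:46]
pos, score = match_patch(frame1, patch, search=10, y0=30, x0=30)  # expect (33, 35)
```

The recovered displacement of each tracked patch is exactly the motion estimate the station-keeping controller needs; textural analysis would additionally reject patches too uniform to match reliably.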
Abstract:
The statistical analysis of literary style is the part of stylometry that compares measurable characteristics in a text that are rarely controlled by the author with those in other texts. When the goal is to settle authorship questions, these characteristics should relate to the author's style and not to the genre, epoch or editor, and they should be such that their variation between authors is larger than the variation within comparable texts from the same author. For an overview of the literature on stylometry and some of the techniques involved, see for example Mosteller and Wallace (1964, 82), Herdan (1964), Morton (1978), Holmes (1985), Oakes (1998) or Lebart, Salem and Berry (1998).

Tirant lo Blanc, a chivalry book, is the main work in Catalan literature, and it was hailed as "the best book of its kind in the world" by Cervantes in Don Quixote. Considered by writers like Vargas Llosa or Dámaso Alonso to be the first modern novel in Europe, it has been translated several times into Spanish, Italian and French, with modern English translations by Rosenthal (1996) and La Fontaine (1993). The main body of this book was written between 1460 and 1465, but it was not printed until 1490. There is an intense and long-lasting debate about its authorship, sprouting from its first edition, where the introduction states that the whole book is the work of Martorell (1413?-1468), while at the end it is stated that the last quarter of the book is by Galba (?-1490), written after the death of Martorell. Some of the authors who support the theory of single authorship are Riquer (1990), Chiner (1993) and Badia (1993), while some of those supporting double authorship are Riquer (1947), Coromines (1956) and Ferrando (1995). For an overview of this debate, see Riquer (1990). Neither of the two candidate authors left any text comparable to the one under study, and therefore discriminant analysis cannot be used to help classify chapters by author.

By using sample texts encompassing about ten percent of the book, and looking at word length and at the use of 44 conjunctions, prepositions and articles, Ginebra and Cabos (1998) detect heterogeneities that might indicate the existence of two authors. By analyzing the diversity of the vocabulary, Riba and Ginebra (2000) estimate that stylistic boundary to be near chapter 383. Following the lead of the extensive literature, this paper looks into word length, the use of the most frequent words, and the use of vowels in each chapter of the book. Given that the features selected are categorical, this leads to three contingency tables of ordered rows and therefore to three sequences of multinomial observations. Section 2 explores these sequences graphically, observing a clear shift in their distribution. Section 3 describes the problem of estimating a sudden change-point in those sequences. In the following sections we propose various ways to estimate change-points in multinomial sequences: the method in Section 4 involves fitting models for polytomous data; the one in Section 5 fits gamma models onto the sequence of chi-square distances between each row profile and the average profile; the one in Section 6 fits models onto the sequence of values taken by the first component of the correspondence analysis, as well as onto sequences of other summary measures like the average word length. In Section 7 we fit models onto the marginal binomial sequences to identify the features that distinguish the chapters before and after that boundary. Most methods rely heavily on the use of generalized linear models.
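The chi-square-distance route can be sketched numerically: compute each chapter's row profile, its chi-square distance to the average profile, and then locate a single shift in the mean of that distance sequence. Here a plain least-squares split stands in for the gamma models the paper fits, and the multinomial "chapter" counts are simulated rather than taken from Tirant lo Blanc.

```python
import numpy as np

def chi2_distances(counts):
    """Chi-square distance of each row profile to the average profile,
    as used in correspondence analysis."""
    P = counts / counts.sum()
    r = P.sum(axis=1)             # row masses
    c = P.sum(axis=0)             # column masses = average profile
    profiles = P / r[:, None]     # row profiles
    return np.sqrt((((profiles - c) ** 2) / c).sum(axis=1))

def change_point(d):
    """Least-squares estimate of a single shift in the mean of sequence d."""
    n = len(d)
    sse = [((d[:k] - d[:k].mean()) ** 2).sum() + ((d[k:] - d[k:].mean()) ** 2).sum()
           for k in range(1, n)]
    return 1 + int(np.argmin(sse))  # first index of the second segment

# Simulated word-category counts for 60 "chapters" with a style shift after chapter 40
rng = np.random.default_rng(2)
counts = np.vstack([rng.multinomial(500, [0.5, 0.3, 0.2], size=40),
                    rng.multinomial(500, [0.3, 0.3, 0.4], size=20)])
d = chi2_distances(counts)
k_hat = change_point(d)
```

Chapters drawn from the minority style sit farther from the average profile, so the distance sequence jumps at the boundary and the split estimator recovers it.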
Abstract:
Positioning a robot with respect to objects by using data provided by a camera is a well-known technique called visual servoing. In order to perform a task, the object must exhibit visual features which can be extracted from different points of view. Visual servoing is thus object-dependent, as it relies on the object's appearance. Therefore, performing the positioning task is not possible in the presence of non-textured objects or objects for which extracting visual features is too complex or too costly. This paper proposes a solution to tackle this limitation inherent to current visual servoing techniques. Our proposal is based on the coded structured light approach as a reliable and fast way to solve the correspondence problem. In this case, a coded light pattern is projected, providing robust visual features independently of the object's appearance.
Abstract:
The principal focus of the PhD thesis lies in the Social Software area and the appropriation of technology in "non-Western" societies, taking the example of Bulgaria. The term "non-Western" is used to refer to places considered technologically underdeveloped. The aims have been: to capture how Bulgarian users creatively interpret and appropriate the Internet, identifying the sociocultural, political and subjective conditions in which that appropriation occurs; to identify emerging practices based on the interpretation and use of the Internet and the impact they have had on society; and to determine what conditions could influence the technological interpretation and the meaning these practices have had both for users and for the social configuration of the Internet as a medium in Bulgaria. An ethnographic approach has been used simultaneously in different online and offline contexts. On the one hand, this study is based on an exploration of the Bulgarian Internet space through online participant observation in forums and website reviews; on the other hand, it draws on semi-structured interviews, conducted both face to face and online, with different types of users of the virtual platforms found, and finally on online participant observation on those same platforms. It builds on contributions from the ethnographic work of Christine Hine in virtual environments and on Barbara Czarniawska's notions of time and space, contextualized in the modern form of organization that occurs in a network of multiple and fragmented contexts across many movements.
Abstract:
We consider stock market contagion as a significant increase in cross-market linkages after a shock to one country or group of countries. Under this definition, we study whether contagion occurred from the U.S. financial crisis to the rest of the major stock markets in the world by using the adjusted (unconditional) correlation coefficient approach (Forbes and Rigobon, 2002), which consists of testing whether average cross-market correlations increase significantly during the relevant period of turmoil. We would not reject the null hypothesis of interdependence in favour of contagion if the increase in correlation only reflects a continuation of high linkages in all states of the world. Moreover, if contagion occurs, this would justify the intervention of the IMF and the sudden portfolio restructuring during the period under study.
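The adjustment at the heart of this test can be sketched with simulated returns: when the source market's volatility rises but the transmission loading stays fixed, the raw crisis-period correlation is mechanically inflated, and the Forbes-Rigobon correction rho* = rho / sqrt(1 + delta * (1 - rho^2)), with delta the relative increase in the source market's return variance, pulls it back toward the tranquil-period level. The return-generating process below is an assumption for illustration, not actual market data.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate(n, sigma_src):
    """Source-market returns plus a second market with a fixed loading of 0.4."""
    src = sigma_src * rng.standard_normal(n)
    oth = 0.4 * src + rng.standard_normal(n)  # same loading in both regimes
    return src, oth

src_t, oth_t = simulate(4000, 1.0)  # tranquil period
src_c, oth_c = simulate(4000, 2.0)  # turmoil: source-market volatility doubles

rho_t = np.corrcoef(src_t, oth_t)[0, 1]
rho_c = np.corrcoef(src_c, oth_c)[0, 1]    # inflated purely by higher volatility

delta = np.var(src_c) / np.var(src_t) - 1  # relative increase in source variance
rho_adj = rho_c / np.sqrt(1 + delta * (1 - rho_c ** 2))
```

Since the adjusted correlation falls back near the tranquil-period value, this simulated "crisis" would be classified as interdependence rather than contagion.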
Abstract:
We study the earnings structure and the equilibrium assignment of workers when workers exert intra-firm spillovers on each other. We allow for arbitrary spillovers provided output depends on some aggregate index of workers' skill. Despite the possibility of increasing returns to skills, an equilibrium typically exists. We show that the equilibrium will typically be segregated: the skill space can be partitioned into a set of segments, and any firm hires from only one segment. Next, we apply the model to analyze the effect of information technology on segmentation and the distribution of income. There are two types of human capital: productivity and creativity, i.e. the ability to produce ideas that may be duplicated over a network. Under plausible assumptions, inequality rises and then falls when network size increases, and the poorest workers cannot lose. We also analyze the impact of an improvement in worker quality and of an increased international mobility of ideas.
Abstract:
A new algorithm, called the parameterized expectations approach (PEA), for solving dynamic stochastic models under rational expectations is developed, and its advantages and disadvantages are discussed. This algorithm can, in principle, approximate the true equilibrium arbitrarily well. Also, this algorithm works from the Euler equations, so that the equilibrium does not have to be cast in the form of a planner's problem. Monte Carlo integration and the absence of grids on the state variables cause the computation costs not to go up exponentially when the number of state variables or exogenous shocks in the economy increases. As an application, we analyze an asset pricing model with endogenous production. We analyze its implications for the time dependence of the volatility of stock returns and for the term structure of interest rates. We argue that this model can generate hump-shaped term structures.