47 results for improving profitability
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Study based on a research stay at the Xerox Research Centre Europe in Grenoble, France, between June and December 2006. The project translates English technical terms into Norwegian. It is asymmetric because linguistic resources are available only for English, not for Norwegian. Methods that check contiguity ("local reordering" and selective permutation) were developed and implemented to improve the performance of an earlier tool. Contiguity means that when a word is translated into multiple words, those words must be adjacent in the sentence. In addition, a table of search operations for the technical terms was built and integrated into a demonstration program.
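As a minimal sketch of the contiguity constraint described above (illustrative only, not the authors' implementation), a one-to-many translation is acceptable only if the target tokens it aligns to form an unbroken run in the sentence:

```python
# Illustrative sketch (not the authors' implementation): checking the
# contiguity constraint for a one-to-many translation. A source word that
# aligns to several target tokens satisfies the constraint only if those
# target positions form an unbroken run in the sentence.

def is_contiguous(target_positions):
    """Return True if the aligned target token positions are adjacent."""
    positions = sorted(target_positions)
    return all(b - a == 1 for a, b in zip(positions, positions[1:]))

# Example: a term aligned to target positions 3 and 4 is contiguous,
# while positions 3 and 6 would violate the constraint.
print(is_contiguous([3, 4]))   # True
print(is_contiguous([3, 6]))   # False
```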
Abstract:
In the literature on risk, one generally assumes that uncertainty is uniformly distributed over the entire working horizon, when the absolute risk-aversion index is negative and constant. From this perspective, the risk is totally exogenous, and thus independent of endogenous risks. The classic procedure is "myopic" with regard to potential changes in the future behavior of the agent due to inherent random fluctuations of the system. The agent's attitude to risk is rigid. Although often criticized, the most widely used hypothesis for the analysis of economic behavior is risk-neutrality. This borderline case must be treated with caution in a dynamic stochastic context. The traditional measures of risk-aversion are generally too weak for making comparisons between risky situations, given the dynamic complexity of the environment. This can be highlighted in concrete problems in finance and insurance, contexts for which the Arrow-Pratt measures (in the small) give ambiguous results.
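For reference, the Arrow-Pratt measures mentioned above are the standard textbook definitions (not specific to this paper):

```latex
% Arrow-Pratt measures of absolute and relative risk aversion for a
% utility function u(w):
\[
  A(w) = -\frac{u''(w)}{u'(w)}, \qquad R(w) = -\,w\,\frac{u''(w)}{u'(w)} .
\]
% A constant absolute risk-aversion index corresponds to CARA utility,
% u(w) = -e^{-a w}/a, for which A(w) = a at every wealth level w.
```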
Abstract:
We present a standard model in which the optimal tax reform is to cut labor taxes and leave capital taxes very high in the short and medium run; only in the very long run would capital taxes be zero. Our model is a version of Chamley's, with heterogeneous agents, without lump-sum transfers, with an upper bound on capital taxes, and with a focus on Pareto-improving plans. For our calibration, labor taxes should be low for the first ten to twenty years, while capital taxes should be at their maximum. This policy ensures that all agents benefit from the tax reform and that capital grows quickly after the reform begins. Therefore, the long-run optimal tax mix is the opposite of the short- and medium-run tax mix. The initial labor tax cut is financed by deficits that lead to a positive long-run level of government debt, reversing the standard prediction that the government accumulates savings in models with optimal capital taxes. If labor supply is somewhat elastic, the benefits from tax reform are high and can be shifted entirely to capitalists or to workers by varying the length of the transition. With inelastic labor supply there is an increasing part of the equilibrium frontier, which means that the scope for benefiting the workers is limited and the total benefits from reforming taxes are much lower.
Abstract:
The objective of this paper is to re-evaluate the attitude to effort of a risk-averse decision-maker in an evolving environment. In the classic analysis, the space of efforts is generally discretized. More realistically, this new approach employs a continuum of effort levels. The presence of multiple possible effort and performance levels provides a better basis for explaining real economic phenomena. The traditional approach (see Laffont & Tirole, 1993; Salanié, 1997; Laffont & Martimort, 2002, among others) does not take into account the potential effect of the system dynamics on the agent's attitude to effort over time. In the context of a Principal-agent relationship, not only the Principal's incentives but also the evolution of the dynamic system can lead the private agent to allocate a high effort. The incentives can be ineffective when the environment does not encourage the agent to invest a high effort. This explains why some effici…
Abstract:
PADICAT is the web archive created in 2005 in Catalonia (Spain) by the Library of Catalonia (BC), the national library of Catalonia, with the aim of collecting, processing and providing permanent access to the digital heritage of Catalonia. Its harvesting strategy is based on a hybrid model (massive harvesting of the .cat top-level domain; selective compilation of the web output of Catalan organizations; focused harvesting of public events). The system provides open access to the whole collection on the Internet. We consider it necessary to complement the current search and visualization software with a new open source tool, CAT (Curator Archiving Tool), composed of three modules aimed at effectively managing the processes of human cataloguing, publishing directories of the digital resources and special collections, and offering statistical information of added value to end users. Within the framework of the International Internet Preservation Consortium meeting (Vienna, 2010), the progress in the development of this new tool, and the philosophy that has motivated its design, are presented to the international community.
Abstract:
In this paper the two main drawbacks of heat balance integral methods are examined. Firstly, we investigate the choice of approximating function. For a standard polynomial form it is shown that combining the Heat Balance and Refined Integral methods to determine the power of the highest order term will either lead to the same or, more often, greatly improved accuracy compared with standard methods. Secondly, we examine thermal problems with a time-dependent boundary condition. In doing so we develop a logarithmic approximating function. This new function allows us to model moving peaks in the temperature profile, a feature that previous heat balance methods cannot capture. If the boundary temperature varies so that at some time t > 0 it equals the far-field temperature, then standard methods predict that the temperature is everywhere at this constant value; the new method predicts the correct behaviour. It is also shown that this function provides even more accurate results, when coupled with the new CIM, than the polynomial profile. The analysis primarily focuses on a specified constant boundary temperature and is then extended to constant-flux, Newton cooling and time-dependent boundary conditions.
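For orientation, a minimal statement of the heat balance integral idea with the standard polynomial profile follows; the notation is generic and not necessarily the authors':

```latex
% Heat balance integral method (HBIM) for the one-dimensional heat equation
% u_t = \alpha u_{xx}, with u measured relative to the far-field temperature
% so that u(\delta,t) = 0. A polynomial profile is assumed over a finite
% penetration depth \delta(t):
\[
  u(x,t) \approx u_s \left( 1 - \frac{x}{\delta(t)} \right)^{n},
  \qquad 0 \le x \le \delta(t),
\]
% and \delta(t) is determined from the integrated (heat balance) form of the
% heat equation,
\[
  \frac{d}{dt} \int_{0}^{\delta(t)} u \, dx
  = \alpha \Big[ u_x \Big]_{x=0}^{x=\delta(t)} .
\]
% The Refined Integral Method integrates the heat equation twice rather than
% once; combining the two integral conditions fixes the exponent n instead of
% prescribing it, which is the choice examined in the paper.
```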
Abstract:
This article studies how product introduction decisions relate to profitability and uncertainty in the context of multi-product firms and product differentiation. These two features, common to many modern industries, have not received much attention in the literature compared with the classical problem of firm entry, even though the determinants of firm and product entry are quite different. The theoretical predictions about the sign of the impact of uncertainty on product entry are not conclusive. Therefore, an econometric model relating firms' product introduction decisions to profitability and profit uncertainty is proposed. Firms' estimated profits are obtained from a structural model of product demand and supply, and uncertainty is proxied by the variance of profits. The empirical analysis is carried out using data on the Spanish car industry for the period 1990-2000. The results show a positive relationship between product introduction and profitability, and a negative one with respect to profit variability. Interestingly, the degree of uncertainty appears to be a stronger driving force of entry than profitability, suggesting that the product proliferation process in the Spanish car market may have been mainly a consequence of lower uncertainty rather than the result of a more profitable market.
Keywords: Product introduction, entry, uncertainty, multiproduct firms, automobile
JEL codes: L11, L13
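A hypothetical sketch of the kind of reduced-form relationship described above is given below: a probit of a firm's product-introduction decision on its estimated profit level and on the variance of profits (the uncertainty proxy). The variable names and the simulated data are illustrative, not the paper's.

```python
# Illustrative probit of product introduction on profit and profit variance.
# Data and names are simulated, not taken from the paper.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "profit": rng.normal(1.0, 0.3, n),       # estimated profit (structural model)
    "profit_var": rng.gamma(2.0, 0.1, n),    # proxy for profit uncertainty
})
# Simulated decision rule: higher profit raises entry, higher uncertainty lowers it.
latent = 0.8 * df["profit"] - 1.5 * df["profit_var"] + rng.normal(0, 1, n)
df["introduce"] = (latent > 0).astype(int)

X = sm.add_constant(df[["profit", "profit_var"]])
model = sm.Probit(df["introduce"], X).fit(disp=False)
print(model.summary())
```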
Abstract:
In image segmentation, clustering algorithms are very popular because they are intuitive and some of them are easy to implement. For instance, the k-means is one of the most used in the literature, and many authors successfully compare their new proposals with the results achieved by the k-means. However, it is well known that clustering-based image segmentation has several problems. For instance, the number of regions of the image has to be known a priori, and different initial seed placements (initial clusters) can produce different segmentation results. Most of these algorithms can be slightly improved by considering the coordinates of the image as features in the clustering process (to take spatial region information into account). In this paper we propose a significant improvement of clustering algorithms for image segmentation. The method is qualitatively and quantitatively evaluated over a set of synthetic and real images, and compared with classical clustering approaches. Results demonstrate the validity of this new approach.
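The baseline mentioned above, where pixel coordinates are appended to the intensity as extra features so that spatial proximity influences the clusters, can be sketched as follows. This is the generic variant with illustrative parameters, not the paper's proposed improvement; it assumes NumPy and scikit-learn are available.

```python
# k-means segmentation with (row, col) coordinates added as features.
import numpy as np
from sklearn.cluster import KMeans

# Synthetic grayscale image: a bright square on a dark background plus noise.
h, w = 64, 64
image = np.zeros((h, w))
image[16:48, 16:48] = 1.0
image += np.random.default_rng(0).normal(0, 0.05, (h, w))

rows, cols = np.mgrid[0:h, 0:w]
spatial_weight = 0.5  # how strongly coordinates count relative to intensity
features = np.column_stack([
    image.ravel(),
    spatial_weight * rows.ravel() / h,
    spatial_weight * cols.ravel() / w,
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
segmentation = labels.reshape(h, w)
print(segmentation[0, 0], segmentation[32, 32])  # background vs. square labels
```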
Abstract:
Coffee and cocoa represent the main sources of income for small farmers in the Northern Amazon Region of Ecuador. The provinces of Orellana and Sucumbios, as border areas, have benefited from investments made by many public and private institutions. Many of the projects carried out in the area have been aimed at energising the production of coffee and cocoa, strengthening the producers' associations and providing commercialisation infrastructure. Improving the quality of life of this population, threatened by poverty and high migration flows mainly from Colombia, is a significant challenge. This paper presents research highlighting the importance of associative commercialisation in raising income from coffee and cocoa. The research draws on primary information obtained during field work, and on official information from the Ministry of Agriculture. The study presents an overview of current organisational structures, initiatives of associative commercialisation, stockpiling infrastructure and ownership regimes, as well as estimates of both 'robusta' coffee and national cocoa production and income. The analysis of the main constraints presents different alternatives for the implementation of public land policies. These policies are aimed at mitigating the problems associated with the organisational structure of the producers, with processes of commercialisation and with environmental aspects, among others.
Abstract:
Annotation of protein-coding genes is a key goal of genome sequencing projects. In spite of tremendous recent advances in computational gene finding, comprehensive annotation remains a challenge. Peptide mass spectrometry is a powerful tool for researching the dynamic proteome and suggests an attractive approach to discover and validate protein-coding genes. We present algorithms to construct and efficiently search spectra against a genomic database, with no prior knowledge of encoded proteins. By searching a corpus of 18.5 million tandem mass spectra (MS/MS) from human proteomic samples, we validate 39,000 exons and 11,000 introns at the level of translation. We present translation-level evidence for novel or extended exons in 16 genes, confirm translation of 224 hypothetical proteins, and discover or confirm over 40 alternative splicing events. Polymorphisms are efficiently encoded in our database, allowing us to observe variant alleles for 308 coding SNPs. Finally, we demonstrate the use of mass spectrometry to improve automated gene prediction, adding 800 correct exons to our predictions using a simple rescoring strategy. Our results demonstrate that proteomic profiling should play a role in any genome sequencing project.
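A minimal sketch of the first step in searching spectra against a genome with no prior protein annotation is given below: enumerating candidate peptides from all six reading frames of a genomic sequence, which can then be indexed and matched against MS/MS spectra. This is not the authors' algorithm; the crude K/R cleavage rule and the example sequence are illustrative, and the sketch assumes Biopython is installed.

```python
# Enumerate candidate peptides from all six reading frames of a DNA sequence.
from Bio.Seq import Seq

def six_frame_peptides(dna, min_len=7):
    """Yield tryptic-like peptide strings from all six reading frames."""
    seq = Seq(dna)
    for strand in (seq, seq.reverse_complement()):
        for frame in range(3):
            # Truncate to a multiple of three so translate() gets whole codons.
            sub = strand[frame:]
            sub = sub[: len(sub) - len(sub) % 3]
            protein = str(sub.translate())
            # Split on stop codons, then after K/R as a crude trypsin rule.
            for orf in protein.split("*"):
                for pep in orf.replace("K", "K|").replace("R", "R|").split("|"):
                    if len(pep) >= min_len:
                        yield pep

dna = "ATGGCTAAGCGTGGATTCCTGAAAGTTCCGGATTAA"  # toy sequence
print(list(six_frame_peptides(dna, min_len=4)))
```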
Abstract:
With the two aims of monitoring social change and improving social measurement, the European Social Survey is now closing its third round. This paper shows how the accumulated experience of the first two rounds has been used to validate the questionnaire, better adapt the sampling design to the country characteristics and conduct fieldwork in Spain efficiently. For example, the dynamic character of the population nowadays makes it necessary to estimate design effects at each round from the data of the previous round. The paper also shows how, starting with a response rate of 52% in the first round, a 66% response rate was achieved in the third round thanks to an extensive quality control conducted by the polling agency and the ESS national team, based on a detailed analysis of the non-response cases and of the incidents reported by the interviewees in the contact forms.
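As background for the design-effect estimation mentioned above, the usual clustering formula is the following (a standard survey-sampling result, not specific to this paper):

```latex
% Design effect of a clustered sample with average cluster size \bar{b} and
% intra-cluster correlation \rho, and the resulting effective sample size:
\[
  \mathrm{deff} = 1 + (\bar{b} - 1)\,\rho ,
  \qquad
  n_{\mathrm{eff}} = \frac{n}{\mathrm{deff}} .
\]
% Estimating \rho and \bar{b} from the previous round's data gives the deff
% used to set the gross sample size for the next round.
```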
Abstract:
Estimates for the U.S. suggest that, at least in some sectors, productivity-enhancing reallocation is the dominant factor in accounting for productivity growth. An open question, particularly relevant for developing countries, is whether reallocation is always productivity enhancing. It may be that imperfect competition or other barriers to competitive environments imply that the reallocation process is not fully efficient in these countries. Using a unique plant-level longitudinal dataset for Colombia for the period 1982-1998, we explore these issues by examining the interaction between market allocation, productivity and profitability. Moreover, given the important trade, labor and financial market reforms in Colombia during the early 1990s, we explore whether and how the contribution of reallocation changed over the period of study. Our data permit measurement of plant-level quantities and prices. Taking advantage of the rich structure of our price data, we propose a sequential methodology to estimate productivity and demand shocks at the plant level. First, we estimate total factor productivity (TFP) with plant-level physical output data, where we use downstream demand to instrument inputs. We then turn to estimating demand shocks and mark-ups with plant-level price data, using TFP to instrument for output in the inverse demand equation. We examine the evolution of the distributions of TFP and demand shocks in response to the market reforms of the 1990s. We find that market reforms are associated with rising overall productivity that is largely driven by reallocation away from low- and towards high-productivity businesses. In addition, we find that the allocation of activity across businesses is less driven by demand factors after the reforms. We find that the increase in aggregate productivity post-reform is entirely accounted for by the improved allocation of activity.
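A generic two-stage least squares sketch illustrates the kind of instrumenting step described above (for example, using downstream demand as an instrument for input use). The data, variable names and single-regressor setup are simulated and purely illustrative; this is not the paper's estimator or dataset.

```python
# Generic 2SLS with one endogenous regressor, one instrument and a constant.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
z = rng.normal(size=n)                        # instrument (e.g. downstream demand)
u = rng.normal(size=n)                        # unobserved shock
x = 0.9 * z + 0.5 * u + rng.normal(size=n)    # endogenous regressor (e.g. input use)
y = 1.0 + 2.0 * x + u + rng.normal(size=n)    # outcome (e.g. log output)

def two_sls(y, x, z):
    """Two-stage least squares for one endogenous regressor and one instrument."""
    Z = np.column_stack([np.ones_like(z), z])
    # First stage: project the endogenous regressor on the instrument.
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    Xh = np.column_stack([np.ones_like(x), x_hat])
    # Second stage: regress the outcome on the fitted values.
    return np.linalg.lstsq(Xh, y, rcond=None)[0]

const, beta = two_sls(y, x, z)
print(f"2SLS estimate of the slope: {beta:.2f} (true value 2.0)")
```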
Abstract:
A national survey designed for estimating a specific population quantity is sometimes also used to estimate this quantity for a small area, such as a province. Budget constraints do not allow a greater sample size for the small area, and so other means of improving estimation have to be devised. We investigate such methods and assess them in a Monte Carlo study. We explore how a complementary survey can be exploited in small area estimation. We use the context of the Spanish Labour Force Survey (EPA) and the Barometer in Spain for our study.
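As a generic illustration of how a complementary survey can be combined with the direct small-area estimate, a textbook composite (inverse-variance) estimator is sketched below. This is not necessarily the method studied in the paper, it assumes the two estimates are independent and unbiased, and the numbers are made up.

```python
# Composite small-area estimator: variance-based weighting of a direct
# estimate from the main survey and an estimate borrowing strength from a
# complementary survey. Purely illustrative.

def composite_estimate(direct, var_direct, indirect, var_indirect):
    """Combine two independent unbiased estimates with inverse-variance weights."""
    w = var_indirect / (var_direct + var_indirect)
    estimate = w * direct + (1 - w) * indirect
    variance = (var_direct * var_indirect) / (var_direct + var_indirect)
    return estimate, variance

# Hypothetical province-level rates (in %):
direct, var_direct = 12.4, 1.9        # main survey (small provincial sample)
indirect, var_indirect = 11.1, 0.8    # complementary survey
est, var = composite_estimate(direct, var_direct, indirect, var_indirect)
print(f"composite estimate: {est:.2f}%  (variance {var:.2f})")
```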