Abstract:
Thermal systems exchanging heat and mass by conduction, convection, and radiation (solar and thermal) occur in many engineering applications, such as energy storage by solar collectors, window glazing in buildings, cooling of plastic moulds, and air handling units. These thermal systems are often composed of various elements, for example a building with walls, windows, rooms, etc. It would be of particular interest to have a modular thermal system formed by connecting different modules for the elements, with the flexibility to use and change models for individual elements and to add or remove elements without changing the entire code. A numerical approach to the heat transfer and fluid flow in such systems saves the time and cost of full-scale experiments and also aids optimisation of the system parameters. The subsequent sections present a short summary of the work done so far on the orientation of the thesis in the field of numerical methods for heat transfer and fluid flow, the work in progress, and the future work.
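The modular idea described above can be illustrated with a toy lumped-capacitance network (all names and values are illustrative, not the thesis code): modules are nodes, connections are conductances, and elements can be added or removed without touching the solver.

```python
# Minimal sketch of a modular thermal network: each module is a lumped node
# with a heat capacity; links carry conductive heat flow between nodes.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    C: float        # heat capacity [J/K]
    T: float        # temperature [K]

@dataclass
class Network:
    nodes: dict = field(default_factory=dict)
    links: list = field(default_factory=list)   # (node_a, node_b, UA [W/K])

    def add(self, node):
        self.nodes[node.name] = node

    def connect(self, a, b, UA):
        self.links.append((a, b, UA))

    def step(self, dt):
        """Advance all node temperatures one explicit-Euler step."""
        q = {n: 0.0 for n in self.nodes}
        for a, b, UA in self.links:
            flow = UA * (self.nodes[a].T - self.nodes[b].T)  # W, from a to b
            q[a] -= flow
            q[b] += flow
        for n, node in self.nodes.items():
            node.T += dt * q[n] / node.C

net = Network()
net.add(Node("wall", C=1e5, T=280.0))
net.add(Node("room", C=5e4, T=295.0))
net.connect("wall", "room", UA=50.0)
for _ in range(1000):
    net.step(10.0)   # modules can be added or removed without touching step()
print(round(net.nodes["room"].T, 2))   # → 285.0, the common equilibrium
```

After many time constants both capacities settle at the capacity-weighted mean temperature, (1e5·280 + 5e4·295)/1.5e5 = 285 K.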
Why Catalonia will see its energy metabolism increase in the near future: an application of MuSIASEM
Abstract:
This paper applies the so-called Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) to the economy of the Spanish region of Catalonia. By applying Georgescu-Roegen's fund-flow model, it arrives at the conclusion that within a context of the end of cheap oil, the current development model based on the growth of low-productivity sectors such as services and construction must change. The change is needed not only because of the increasing scarcity of affordable energy carriers, or because of the increasing environmental impact that the present development represents, but also because of an ageing population that demands labour productivity gains. This will mean that industry requires more energy consumption per worker in order to increase its productivity and, therefore, its competitiveness. Thus, we conclude that the energy intensity and exosomatic energy metabolism of Catalonia will increase dramatically in the near future unless major conservation efforts are implemented in both the household and transport sectors.
Abstract:
This article provides a fresh methodological and empirical approach for assessing price level convergence and its relation to purchasing power parity (PPP) using annual price data for seventeen US cities. We suggest a new procedure that can handle a wide range of PPP concepts in the presence of multiple structural breaks using all possible pairs of real exchange rates. To deal with cross-sectional dependence, we use both cross-sectionally demeaned data and a parametric bootstrap approach. In general, we find more evidence for stationarity when the parity restriction is not imposed, while imposing the parity restriction leads toward rejection of panel stationarity. Our results can be embedded in the view of the Balassa-Samuelson approach, but with the slope of the time trend allowed to change in the long run. The median half-life point estimates are found to be lower than the consensus view regardless of the parity restriction.
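The half-life quoted above is conventionally derived from the estimated persistence of the real exchange rate; a minimal sketch, with an illustrative AR(1) coefficient rather than the paper's estimates:

```python
import math

def half_life(rho):
    """Periods for a shock to decay by half under an AR(1) with coefficient rho."""
    return math.log(0.5) / math.log(rho)

print(round(half_life(0.85), 2))  # → 4.27 with this illustrative persistence
```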
Abstract:
Report for the scientific sojourn at Linköping University from April to July 2007. Monitoring of the air intake system of an automotive engine is important to meet emission-related legislative diagnosis requirements. During the research, the problem of fault detection in the air intake system was stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem was solved using interval-based consistency techniques, which are shown to be particularly efficient for checking the consistency of the Analytical Redundancy Relations (ARRs), dealing with uncertain measurements and parameters, and using experimental data. All experiments were performed on a four-cylinder turbo-charged spark-ignited SAAB engine located in the research laboratory of the Vehicular Systems group at Linköping University.
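Interval-based consistency checking of an ARR can be illustrated with a toy redundancy relation (the model and numbers below are hypothetical, not the SAAB engine equations): the residual of the relation must contain zero when measurements and parameters are mutually consistent within their intervals.

```python
# Toy interval-consistency check: an analytical redundancy relation
# r = p_meas - c * m_meas**2 should contain 0 when the pressure and mass-flow
# measurements and the uncertain parameter c are fault-free.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def contains(self, x):
        return self.lo <= x <= self.hi

def consistent(p_meas, m_meas, c):
    """Fault-free iff 0 lies in the interval of the residual p - c*m^2."""
    residual = p_meas - c * (m_meas * m_meas)
    return residual.contains(0.0)

# Measurements with uncertainty, parameter known only up to an interval:
print(consistent(Interval(9.0, 11.0), Interval(1.9, 2.1), Interval(2.3, 2.6)))
```

With the faulty measurement `Interval(20.0, 21.0)` in place of the pressure, the residual interval no longer contains zero and the check fails, flagging a fault.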
Abstract:
The following document shows, in a clear and understandable way, through the creation of a simple application, the mechanism for building a J2EE application based on the Jakarta Struts development framework. It starts from scratch, beginning with requirements gathering, moving through the analysis and design stage, and ending with the subsequent implementation.
Abstract:
The paper proposes and applies statistical tests for poverty dominance that check whether poverty comparisons can be made robustly over ranges of poverty lines and classes of poverty indices. This helps provide both normative and statistical confidence in establishing poverty rankings across distributions. The tests, which can take into account the complex sampling procedures that are typically used by statistical agencies to generate household-level surveys, are implemented using the Canadian Survey of Labour and Income Dynamics (SLID) for 1996, 1999 and 2002. Although the yearly cumulative distribution functions cross at the lower tails of the distributions, the more recent years tend to dominate earlier years for a relatively wide range of poverty lines. Failing to take into account SLID's sampling variability (as is sometimes done) can significantly inflate one's confidence in ranking poverty. Taking into account SLID's complex sampling design (as has not been done before) can also substantially decrease the range of poverty lines over which a poverty ranking can be inferred.
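A first-order poverty dominance comparison of the kind described above can be sketched as follows (the income vectors are illustrative, not SLID data): distribution B dominates A when its headcount ratio is weakly lower at every candidate poverty line.

```python
# Toy first-order poverty dominance check: B dominates A if F_B(z) <= F_A(z)
# at every poverty line z in the range of interest.

def cdf(sample, z):
    return sum(x <= z for x in sample) / len(sample)

def dominates(sample_a, sample_b, z_grid):
    """True if B has (weakly) less poverty than A at every candidate line z."""
    return all(cdf(sample_b, z) <= cdf(sample_a, z) for z in z_grid)

incomes_1996 = [4, 6, 7, 9, 12, 15, 20, 25]
incomes_2002 = [5, 7, 8, 10, 13, 16, 21, 26]   # everyone shifted up by 1
z_grid = [z / 2 for z in range(4, 41)]          # poverty lines 2.0 .. 20.0
print(dominates(incomes_1996, incomes_2002, z_grid))  # → True
```

The statistical tests in the paper additionally attach standard errors to these CDF differences under the survey's sampling design, which this sketch omits.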
Abstract:
This paper develops a methodology to estimate entire population distributions from bin-aggregated sample data. We do this through the estimation of the parameters of mixtures of distributions that allow for maximal parametric flexibility. The statistical approach we develop enables comparisons of the full distributions of height data from potential army conscripts across France's 88 departments for most of the nineteenth century. These comparisons are made by testing for differences-of-means stochastic dominance. Corrections for possible measurement errors are also devised by taking advantage of the richness of the data sets. Our methodology is of interest to researchers working on historical as well as contemporary bin-aggregated or histogram-type data, something that is still widely needed since much of the information that is publicly available comes in that form, often because of restrictions arising from political sensitivity and/or confidentiality concerns.
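The bin-aggregated estimation idea can be sketched by maximising a multinomial likelihood over bin probabilities; the sketch below fits a single normal by crude grid search (the paper uses flexible mixtures and a proper optimiser; the bin edges and counts are illustrative):

```python
# Fit a parametric distribution to bin-aggregated counts by maximising the
# multinomial likelihood of the observed bin frequencies.
import math

edges = [150, 160, 165, 170, 175, 190]          # height bin edges, cm
counts = [8, 25, 40, 20, 7]                      # conscripts per bin

def normal_cdf(x, mu, sigma):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def log_lik(mu, sigma):
    ll = 0.0
    for (lo, hi), n in zip(zip(edges, edges[1:]), counts):
        p = normal_cdf(hi, mu, sigma) - normal_cdf(lo, mu, sigma)
        ll += n * math.log(max(p, 1e-300))       # guard against log(0)
    return ll

# Crude grid search standing in for a proper optimiser:
best = max(((mu / 10, s / 10)
            for mu in range(1600, 1720) for s in range(30, 120)),
           key=lambda t: log_lik(*t))
print(best)   # (mu, sigma) recovered from the binned counts alone
```

Replacing the single normal with a mixture of normals, as the paper does, only changes `normal_cdf` to a weighted sum of component CDFs; the likelihood machinery is unchanged.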
Abstract:
The remarkable increase in trade flows and in migratory flows of highly educated people are two important features of globalization of the last decades. This paper extends a two-country model of inter- and intraindustry trade to a rich environment featuring technological differences, skill differences and the possibility of international labor mobility. The model is used to explain the patterns of trade and migration as countries remove barriers to trade and to labor mobility. We parameterize the model to match the features of the Western and Eastern European members of the EU and analyze first the effects of the trade liberalization which occurred between 1989 and 2004, and then the gains and losses from migration which are expected to occur if legal barriers to labor mobility are substantially reduced. The lower barriers to migration would result in significant migration of skilled workers from Eastern European countries. Interestingly, this would not only benefit the migrants and most Western European workers but, via trade, it would also benefit the workers remaining in Eastern Europe. Key Words: Skilled Migration, Gains from Variety, Real Wages, Eastern-Western Europe. JEL Codes: F12, F22, J61.
Application of standard and refined heat balance integral methods to one-dimensional Stefan problems
Abstract:
The work in this paper concerns the study of conventional and refined heat balance integral methods for a number of phase change problems. These include standard test problems, with both one and two phase changes, which have exact solutions that enable us to test the accuracy of the approximate solutions. We also consider situations where no analytical solution is available and compare these to numerical solutions. It is popular to use a quadratic profile as an approximation of the temperature, but we show that a cubic profile, seldom considered in the literature, is far more accurate in most circumstances. In addition, the refined integral method can give still greater improvement, and we develop a variation on this method which turns out to be optimal in some cases. We assess which integral method is better for various problems, showing that the answer depends largely on the specified boundary conditions.
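For reference, the standard one-phase formulation of the heat balance integral method reads as follows (textbook notation for the generic method; this is not taken verbatim from the paper's refined variants):

```latex
% One-phase Stefan problem, nondimensional: T_t = T_{xx} on 0 < x < s(t),
% with T(0,t) = 1, T(s,t) = 0 and Stefan condition \beta \dot{s} = -T_x(s,t).
% Integrating the heat equation over the liquid region gives the
% heat balance integral:
\frac{d}{dt}\int_0^{s(t)} T\,dx
  \;=\; -\beta\,\frac{ds}{dt}
  \;-\; \left.\frac{\partial T}{\partial x}\right|_{x=0}.
% A quadratic profile with T(0,t) = 1 enforced (so a + b = 1):
T \;\approx\; a\left(1-\frac{x}{s}\right) + b\left(1-\frac{x}{s}\right)^{2},
\qquad
\dot{s} = \frac{a}{\beta s} \quad \text{(from the Stefan condition)}.
% Substituting the profile into the integral determines a and b:
a\left(\frac{a}{2}+\frac{b}{3}\right) = 2b\beta,
\qquad
s(t) = \sqrt{\frac{2at}{\beta}}.
% The cubic profile considered in the paper adds a (1 - x/s)^3 term with
% one further matching condition, which is the source of its extra accuracy.
```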
Abstract:
This paper investigates vulnerability to poverty in Haiti. Research on vulnerability in developing countries has been scarce due to the high data requirements of vulnerability studies (e.g. panel data or long series of cross-sections). The methodology adopted here allows the assessment of vulnerability to poverty by exploiting the short panel structure of nested data at different levels. The decomposition method reveals that vulnerability in Haiti is largely a rural phenomenon and that schooling correlates negatively with vulnerability. Most importantly, among the different shocks affecting households' income, it is found that meso-level shocks are in general far more important than covariate shocks. This finding points to interesting implications for decentralizing policies aimed at alleviating vulnerability to poverty.
Abstract:
This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being increasingly used by scholars and practitioners. This article aims to provide those wishing to engage with securitization with an alternative application of this theory, one which is sensitive to and self-reflective of the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here to be the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes which have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
Analysis of gas emissions by the input-output subsystem approach provides detailed insight into pollution generation in an economy. Structural decomposition analysis, on the other hand, identifies the factors behind the changes in key variables over time. Extending the input-output subsystem model to account for the changes in these variables reveals the channels by which environmental burdens are caused and transmitted throughout the production system. In this paper we propose a decomposition of the changes in the components of CO2 emissions captured by an input-output subsystems representation. The empirical application is for the Spanish service sector, and the economic and environmental data are for the years 1990 and 2000. Our results show that services increased their CO2 emissions mainly because of a rise in emissions generated by non-services to cover the final demand for services. In all service activities, the decomposed effects show a decrease in CO2 emissions due to lower emission coefficients (i.e., emissions per unit of output), compensated by an increase in emissions caused both by the input-output coefficients and by the rise in demand for services. Finally, large asymmetries exist not only in the quantitative changes in the CO2 emissions of the various services but also in the decomposed effects of these changes. Keywords: structural decomposition analysis, input-output subsystems, CO2 emissions, service sector.
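The input-output backbone of such a calculation can be sketched with two hypothetical sectors (the actual study uses the full Spanish tables): emissions driven by a final demand vector y are e·(I − A)⁻¹y, so demand for low-emission services still triggers emissions in the high-emission sector that supplies it.

```python
# Toy input-output emissions calculation: A is the technical-coefficient
# matrix, e the direct emission coefficients, y the final demand vector.

def solve2(M, b):
    """Solve a 2x2 linear system M x = b by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - M[0][1] * b[1]) / det,
            (M[0][0] * b[1] - b[0] * M[1][0]) / det]

A = [[0.2, 0.3],    # inputs of sector i per unit output of sector j
     [0.1, 0.4]]
e = [0.9, 0.2]      # CO2 per unit output ("industry" high, "services" low)
y = [50.0, 100.0]   # final demand for each sector

# Gross output x solves (I - A) x = y:
IminusA = [[1 - A[0][0], -A[0][1]], [-A[1][0], 1 - A[1][1]]]
x = solve2(IminusA, y)
total_co2 = e[0] * x[0] + e[1] * x[1]
print([round(v, 1) for v in x], round(total_co2, 1))  # → [133.3, 188.9] 157.8
```

Most of the 157.8 units of CO2 here come from the high-coefficient sector even though the larger final demand is for the low-coefficient one, which is the transmission channel the subsystem decomposition isolates.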
Abstract:
In recent years, traditional inequality measures have been used quite extensively to examine the international distribution of environmental indicators. One of their main characteristics is that each measure assigns different weights to the changes that occur in different sections of the variable's distribution and, consequently, the results they yield can potentially be very different. Hence, we suggest the appropriateness of using a range of well-recommended measures to achieve more robust results. We also provide an empirical test of the comparative behaviour of several suitable inequality measures and environmental indicators. Our findings support the hypothesis that in some cases there are differences among measures in both the sign of the evolution and its size. JEL codes: D39; Q43; Q56. Keywords: international environment factor distribution; Kaya factors; inequality measurement.
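Two of the standard measures alluded to above can be computed as follows (the per-capita emissions vector is illustrative); the Gini weights changes around the middle of the distribution most heavily, while the Theil index is more sensitive to the upper tail, which is precisely why their verdicts can diverge:

```python
# Gini coefficient and Theil index on an illustrative emissions vector.
import math

def gini(xs):
    xs = sorted(xs)
    n = len(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * sum(xs)) - (n + 1) / n

def theil(xs):
    mean = sum(xs) / len(xs)
    return sum((x / mean) * math.log(x / mean) for x in xs) / len(xs)

emissions = [1.2, 2.5, 4.0, 6.3, 15.0]   # tonnes CO2 per capita, illustrative
print(round(gini(emissions), 3), round(theil(emissions), 3))
```

Doubling only the largest observation moves the two indices by different proportions, so rankings of countries or years can flip depending on the measure chosen.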
Abstract:
In a recent paper, Bermúdez [2009] used bivariate Poisson regression models for ratemaking in car insurance, including zero-inflated models to account for the excess of zeros and the overdispersion in the data set. In the present paper, we revisit this model in order to consider alternatives. We propose a 2-finite mixture of bivariate Poisson regression models to demonstrate that the overdispersion in the data requires more structure if it is to be taken into account, and that a simple zero-inflated bivariate Poisson model does not suffice. At the same time, we show that a finite mixture of bivariate Poisson regression models embraces zero-inflated bivariate Poisson regression models as a special case. Additionally, we describe a model in which the mixing proportions depend on covariates, modelling the way in which each individual belongs to a separate cluster. Finally, an EM algorithm is provided to make the models easy to fit. These models are applied to the same automobile insurance claims data set as in Bermúdez [2009], and it is shown that the modelling of the data set can be improved considerably.
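The EM mechanics behind a finite Poisson mixture can be illustrated in a deliberately simplified univariate form (the paper's model is bivariate with covariate-dependent mixing proportions; everything below is a toy):

```python
# EM for a 2-component mixture of univariate Poissons on simulated counts.
import math, random

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def poisson_sample(lam):
    # Knuth's algorithm, adequate for small rates
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def em_poisson_mixture(data, iters=200):
    pi, lam1, lam2 = 0.5, 1.0, 5.0          # crude starting values
    for _ in range(iters):
        # E-step: responsibility of component 1 for each count
        r = [pi * poisson_pmf(k, lam1) /
             (pi * poisson_pmf(k, lam1) + (1 - pi) * poisson_pmf(k, lam2))
             for k in data]
        # M-step: update mixing proportion and component means
        w = sum(r)
        pi = w / len(data)
        lam1 = sum(ri * k for ri, k in zip(r, data)) / w
        lam2 = sum((1 - ri) * k for ri, k in zip(r, data)) / (len(data) - w)
    return pi, lam1, lam2

random.seed(1)
# 70% low-claim drivers (rate 1), 30% high-claim drivers (rate 8):
data = [poisson_sample(1.0 if random.random() < 0.7 else 8.0)
        for _ in range(2000)]
pi, lam1, lam2 = em_poisson_mixture(data)
print(round(pi, 2), round(lam1, 2), round(lam2, 2))
```

The recovered mixture is overdispersed relative to any single Poisson, which is the feature the paper argues a plain zero-inflated model cannot capture on its own.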