16 results for Input-output analysis (IOA)
in Scielo Saúde Pública - SP
Abstract:
This paper examines the post-war industrialization process in the Brazilian state of Minas Gerais, focusing on one of its desirable outcomes, namely the capacity to generate growth through the impact of strong input-output linkages. This process is placed in historical perspective, considering the ideas that permeated the economic development debate throughout the period of analysis. Changes in the regional economic structure are assessed through the use of three input-output tables, for the years 1953, 1980 and 1995. Adopting the fields-of-influence methodology as the analytical core, it is shown that the efforts towards the creation of a more integrated regional economy have generated stronger influence for the targeted sectors (metal products, transportation equipment, chemicals, and services). However, structural changes have also strengthened leakages in the system originating in traditional economic activities.
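The fields-of-influence idea above can be sketched numerically: the first-order change in the Leontief inverse caused by a perturbation of one technical coefficient is an outer product of a column and a row of that inverse. The 3-sector coefficient matrix below is hypothetical, invented purely for illustration.

```python
import numpy as np

# Hypothetical 3-sector technical-coefficients matrix (illustrative values only).
A = np.array([[0.2, 0.3, 0.1],
              [0.1, 0.1, 0.3],
              [0.2, 0.1, 0.2]])

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse (I - A)^-1

def field_of_influence(L, i, j):
    """First-order field of influence of coefficient a_ij: the outer product
    of column i and row j of the Leontief inverse (dL/da_ij = L E_ij L)."""
    return np.outer(L[:, i], L[j, :])

# Summed influence of each coefficient, usable to rank key cells and sectors.
influence = np.array([[field_of_influence(L, i, j).sum() for j in range(3)]
                      for i in range(3)])
```

Ranking the entries of `influence` is one simple way to identify the coefficients whose changes propagate most strongly through the system.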
Abstract:
On average, wages in Brazil are taxed at 42.5% of their gross value, adding together the portion withheld from the worker's salary and the portion levied on companies' payrolls. This makes the country one of the economies that most heavily tax wage income in the world. The largest burden on wages falls on companies, encouraging practices such as hiring employees without a formal labor contract and outsourcing, making informality one of the determining factors of the growing INSS deficits. Payroll is taxed at 35% on average, with the social security contribution being the heaviest tax. After diagnosing the problem, this text discusses aspects related to social security regimes and the tax bases appropriate to each of them. It also shows that the general social security regime in Brazil has taken on the character of a public policy of complementary income. Accordingly, it proposes replacing the employer INSS contribution, a narrow base, with a 0.61% contribution on movements in bank checking accounts, a universal base, and compares the effects on the economy of a cumulative tax with those produced by a value-added tax. Using the Leontief input-output model as the analytical tool, the study reveals that a contribution on bank transactions implies a lower tax burden on sectoral prices and less allocative distortion than the 20% levied on companies' payrolls for the INSS. Finally, the text seeks to demystify the criticism surrounding tax cumulativity.
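The cumulative-tax versus VAT comparison can be illustrated with the Leontief price model: a turnover-style tax enters every transaction and cascades through input costs, while a VAT falls only on value added. The 3-sector coefficients and value-added shares below are hypothetical, not the paper's data; only the 0.61% rate comes from the abstract.

```python
import numpy as np

# Illustrative 3-sector data (hypothetical; columns of A plus v sum to 1).
A = np.array([[0.20, 0.25, 0.10],
              [0.15, 0.10, 0.30],
              [0.10, 0.20, 0.15]])
v = np.array([0.55, 0.45, 0.45])   # value added per unit of output

I = np.eye(3)
p0 = np.linalg.solve(I - A.T, v)   # pre-tax Leontief prices (unit prices here)

t = 0.0061                         # 0.61% transactions-style rate, as in the proposal
# Cumulative tax: p_j = sum_i a_ij p_i + v_j + t p_j, so taxed input prices cascade.
p_cum = np.linalg.solve((1 - t) * I - A.T, v)
# VAT: input tax is credited, so production prices are unchanged and the
# final price is a uniform mark-up with no cascading.
p_vat = (1 + t) * p0
```

Even at the same nominal rate, the cascading version raises sectoral prices by more than the VAT, and unevenly across sectors, which is the distortion the abstract's comparison is about.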
Abstract:
Non-linear functional representation of the aerodynamic response provides a convenient mathematical model for motion-induced unsteady transonic aerodynamic loads response that accounts for both complex non-linearities and time-history effects. A recent development, based on functional approximation theory, has established a novel functional form; namely, the multi-layer functional. For a large class of non-linear dynamic systems, such multi-layer functional representations can be realised via finite impulse response (FIR) neural networks. Identification of an appropriate FIR neural network model is facilitated by means of a supervised training process in which a limited sample of system input-output data sets is presented to the temporal neural network. The present work describes a procedure for the systematic identification of parameterised neural network models of motion-induced unsteady transonic aerodynamic loads response. The training process is based on a conventional genetic algorithm to optimise the network architecture, combined with a simplified random search algorithm to update weight and bias values. Application of the scheme to representative transonic aerodynamic loads response data for a two-dimensional airfoil executing finite-amplitude motion in transonic flow is used to demonstrate the feasibility of the approach. The approach is shown to furnish a satisfactory generalisation property to different motion histories over a range of Mach numbers in the transonic regime.
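The defining feature of an FIR network is that each input is a tapped delay line, so the output at any instant depends only on a finite window of past samples. A minimal single-output sketch (random, untrained weights; sizes chosen arbitrarily, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
n_taps, n_hidden = 5, 8                     # memory depth and hidden units (arbitrary)
W1 = rng.normal(0.0, 0.5, (n_hidden, n_taps))
b1 = rng.normal(0.0, 0.1, n_hidden)
W2 = rng.normal(0.0, 0.5, n_hidden)

def fir_net(u):
    """Single-output FIR network: a tapped delay line of the last n_taps
    input samples feeds one tanh hidden layer (finite impulse response)."""
    y = np.zeros(len(u))
    for k in range(len(u)):
        taps = np.array([u[k - d] if k >= d else 0.0 for d in range(n_taps)])
        y[k] = W2 @ np.tanh(W1 @ taps + b1)
    return y

u = np.sin(np.linspace(0.0, 4.0 * np.pi, 100))   # a sample motion history
y = fir_net(u)
```

In the paper, the weights and the architecture itself would be found by the genetic-algorithm/random-search training described above rather than drawn at random.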
Abstract:
The formal calibration procedure of a phase fraction meter is based on registering the outputs resulting from imposed phase fractions at known flow regimes. This can be done straightforwardly under laboratory conditions, but rarely under industrial conditions, particularly for on-site applications. Thus, there is a clear need for calibration methods that are less restrictive with regard to prior knowledge of the complete set of inlet conditions. A new procedure is proposed in this work for the on-site construction of the calibration curve from the total mass of the homogeneous dispersed phase that has flowed through the meter. The solution is obtained by minimizing a convenient error functional, assembled with data from redundant tests to handle the intrinsically ill-conditioned nature of the problem. Numerical simulations performed for increasing error levels demonstrate that acceptable calibration curves can be reconstructed, even from total mass values measured with errors of up to 2%. Consequently, the method can readily be applied, especially in on-site calibration problems in which classical procedures fail due to the impossibility of strictly controlling all the input/output parameters.
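The inverse-problem structure can be sketched as follows: parameterize the calibration curve, write each redundant test's total-mass measurement as a linear functional of the curve, and solve the resulting (possibly ill-conditioned) system with regularized least squares. Everything below is hypothetical (a polynomial curve and invented test data), not the paper's functional or data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "true" calibration curve: meter reading s in [0,1] -> phase fraction.
true_curve = lambda s: 0.1 + 0.8 * s**2

n_tests, n_coef = 40, 4
# Each redundant test records a series of meter readings; the measured datum is
# the total (time-integrated) mass, proportional here to sum of alpha(s_k).
tests = [rng.uniform(0.0, 1.0, 50) for _ in range(n_tests)]
M = np.array([[np.sum(s**p) for p in range(n_coef)] for s in tests])
m = np.array([np.sum(true_curve(s)) for s in tests])
m *= 1.0 + rng.normal(0.0, 0.02, n_tests)      # ~2% error on total mass, as in the tests

# Tikhonov-regularized normal equations to tame ill-conditioning.
lam = 1e-3
c = np.linalg.solve(M.T @ M + lam * np.eye(n_coef), M.T @ m)

s_grid = np.linspace(0.0, 1.0, 101)
alpha_rec = np.polynomial.polynomial.polyval(s_grid, c)  # reconstructed curve
```

The redundancy (40 tests for 4 coefficients) is what lets the least-squares fit average out the measurement errors.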
Abstract:
ABSTRACT The traditional net present value (NPV) method for analyzing the economic profitability of an investment (based on a deterministic approach) does not adequately represent the implicit risk associated with different but correlated input variables. Using a stochastic simulation approach to evaluate the profitability of blueberry (Vaccinium corymbosum L.) production in Chile, the objective of this study is to illustrate the complexity of including risk in economic feasibility analysis when the project is subject to several correlated risks. The results of the simulation analysis suggest that not including the intratemporal correlation between input variables underestimates the risk associated with investment decisions. The methodological contribution of this study illustrates the complexity of the interrelationships between uncertain variables and their impact on the viability of carrying out this type of business in Chile. The steps for the economic viability analysis were as follows. First, fitted probability distributions for the stochastic input variables (SIV) were simulated and validated. Second, the random values of the SIV were used to calculate random values of variables such as production, revenues, costs, depreciation, taxes and net cash flows. Third, the complete stochastic model was simulated with 10,000 iterations using random values for the SIV. This provided the information to estimate the probability distributions of the stochastic output variables (SOV), such as net present value, internal rate of return, value at risk, average cost of production, contribution margin and return on capital. Fourth, the results of the complete stochastic model simulation were used to analyze alternative scenarios and were provided to decision makers in the form of probabilities, probability distributions and probabilistic forecasts for the SOV. The main conclusion is that this project is a profitable alternative investment in fruit trees in Chile.
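The simulation steps above can be sketched with two correlated stochastic inputs drawn from a joint distribution, propagated through a cash-flow model into an NPV distribution. All numbers below (means, correlation, costs, investment) are invented for illustration; only the 10,000 iterations echo the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                   # iterations, as in the study

# Two hypothetical stochastic input variables: yield (t/ha) and price ($/kg),
# positively correlated within a season (intratemporal correlation).
mu = np.array([10.0, 4.0])
sd = np.array([2.0, 0.8])
rho = 0.6
cov = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1]**2]])
draws = rng.multivariate_normal(mu, cov, size=n)
yield_t, price = draws[:, 0], draws[:, 1]

cost, rate, years = 25.0, 0.08, 10           # fixed cost ($k/ha), discount rate, horizon
cash = yield_t * price - cost                # annual net cash flow per draw ($k/ha)
annuity = (1 - (1 + rate) ** -years) / rate  # present value of a constant annual flow
npv = -100.0 + cash * annuity                # hypothetical initial investment of $100k/ha

prob_loss = np.mean(npv < 0)                 # risk read directly off the distribution
```

Setting `rho = 0` in the covariance and re-running shows the point of the abstract: ignoring the correlation narrows the NPV distribution and understates the probability of loss.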
Abstract:
For accurate use of pesticide leaching models it is necessary to assess the sensitivity of the input parameters. The aim of this work was to carry out a sensitivity analysis of the pesticide leaching model PEARL for contrasting soil types of the Dourados river watershed in the state of Mato Grosso do Sul, Brazil. The sensitivity analysis was done by carrying out many simulations with different input parameters and calculating their influence on the output values. The approach used was one-at-a-time sensitivity analysis, which consists of independently varying input parameters one at a time while keeping all others constant at the standard scenario. The sensitivity analysis was automated using the SESAN tool linked to the PEARL model. Results showed that only soil characteristics influenced the simulated water flux, with no variation of this variable across scenarios with different pesticides and the same soil. All input parameters that showed the greatest sensitivity with regard to leached pesticide are related to soil and pesticide properties. The sensitivity of all input parameters was scenario dependent, confirming the need to use more than one standard scenario in sensitivity analyses of pesticide leaching models.
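The one-at-a-time scheme is simple to state in code: perturb each parameter around the standard scenario while holding the others fixed, and record the relative output change. The toy response function below is a stand-in for PEARL (hypothetical parameters and functional form).

```python
# A minimal one-at-a-time (OAT) sensitivity sketch around a standard scenario,
# with an invented stand-in model in place of PEARL.
def model(params):
    # toy leaching response: grows with rainfall and dose, shrinks with sorption
    return params["rain"] * 0.01 / (1.0 + params["koc"]) * params["dose"]

standard = {"rain": 1200.0, "koc": 50.0, "dose": 2.0}

def oat_sensitivity(model, standard, delta=0.10):
    """Vary each parameter by +/-delta (all others fixed at the standard
    scenario) and return the normalized central-difference sensitivity."""
    base = model(standard)
    sens = {}
    for name in standard:
        up = dict(standard); up[name] = standard[name] * (1 + delta)
        dn = dict(standard); dn[name] = standard[name] * (1 - delta)
        sens[name] = (model(up) - model(dn)) / (2 * delta * base)
    return sens

sens = oat_sensitivity(model, standard)
```

A sensitivity near 1 means the output responds proportionally to that parameter; a negative value means the output falls as the parameter rises, as with the sorption coefficient here.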
Abstract:
A system is said to be "instantaneous" when for a given constant input an equilibrium output is obtained after a while. In the meantime, the output is changing from its initial value towards the equilibrium one. This is the transient period of the system and transients are important features of open-respirometry systems. During transients, one cannot compute the input amplitude directly from the output. The existing models (e.g., first or second order dynamics) cannot account for many of the features observed in real open-respirometry systems, such as time lag. Also, these models do not explain what should be expected when a system is speeded up or slowed down. The purpose of the present study was to develop a mechanistic approach to the dynamics of open-respirometry systems, employing basic thermodynamic concepts. It is demonstrated that all the main relevant features of the output dynamics are due to and can be adequately explained by a distribution of apparent velocities within the set of molecules travelling along the system. The importance of the rate at which the molecules leave the sensor is explored for the first time. The study approaches the difference in calibrating a system with a continuous input and with a "unit impulse": the former truly reveals the dynamics of the system while the latter represents the first derivative (in time) of the former and, thus, cannot adequately be employed in the apparent time-constant determination. Also, we demonstrate why the apparent order of the output changes with volume or flow.
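The central idea, that the output dynamics follow from a distribution of transit times of the molecules travelling through the system, can be sketched by convolving a step input with a transit-time density. The lognormal choice and its parameters below are hypothetical, purely to show how such a distribution produces the time lag that a single first-order model lacks.

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 10.0, dt)

# Hypothetical lognormal transit-time distribution (illustrative parameters).
mu, sigma = 0.5, 0.3
pdf = (1.0 / (t[1:] * sigma * np.sqrt(2.0 * np.pi))) * \
      np.exp(-(np.log(t[1:]) - mu) ** 2 / (2.0 * sigma ** 2))
pdf = np.concatenate(([0.0], pdf)) * dt      # discretized density (avoids log(0))

step = np.ones_like(t)                       # constant input switched on at t = 0
y = np.convolve(step, pdf)[: len(t)]         # output: fraction of molecules arrived

# A single-time-constant model for comparison: rises immediately, no lag.
first_order = 1.0 - np.exp(-t / np.exp(mu))
```

The convolved output stays near zero until the fastest molecules arrive and then rises sigmoidally, whereas the first-order response rises from the first instant, which is exactly the mismatch the abstract describes.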
Abstract:
ABSTRACT OBJECTIVE To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS Data envelopment analysis, a linear programming technique, creates a best practice frontier by comparing observed production given the amount of resources used. The model is output-oriented and considers variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model, medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model, financing budget) to analyze frontier shift in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS The mean scores for health care, teaching and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best performance year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits; 34.0% in admissions; 12.0% in undergraduate students; 13.0% in multi-professional residents; 48.0% in graduate students; 7.0% in research projects; besides a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care and no variation in research. CONCLUSIONS The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best practice frontier.
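The building block of the model above, an output-oriented, variable-returns-to-scale (BCC) data envelopment analysis score, can be written as a small linear program. The four-hospital, one-input, one-output data set below is invented for illustration; the network and dynamic extensions in the paper add link and carry-over variables on top of this core.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 4 units, 1 input (budget) and 1 output (admissions).
X = np.array([[4.0], [6.0], [8.0], [6.0]])    # inputs, one row per unit
Y = np.array([[2.0], [5.0], [6.0], [3.0]])    # outputs

def bcc_output_phi(k):
    """Output-oriented BCC score for unit k: maximize phi subject to
    Y'lam >= phi * y_k, X'lam <= x_k, sum(lam) = 1, lam >= 0."""
    n = len(X)
    c = np.concatenate(([-1.0], np.zeros(n)))             # linprog minimizes, so -phi
    A_ub = np.vstack([
        np.hstack([Y[k].reshape(-1, 1), -Y.T]),           # phi*y_k - Y'lam <= 0
        np.hstack([np.zeros((X.shape[1], 1)), X.T]),      # X'lam <= x_k
    ])
    b_ub = np.concatenate([np.zeros(Y.shape[1]), X[k]])
    A_eq = np.hstack([[0.0], np.ones(n)]).reshape(1, -1)  # convexity (VRS)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

phis = [bcc_output_phi(k) for k in range(4)]
```

A unit with `phi == 1` is on the best-practice frontier; `phi > 1` is the factor by which its outputs would have to expand to reach it, which is what the projections quoted in the results express in percentage terms.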
Abstract:
Records with the search string biogeograph* were collected from the Science Citation Index (SCI). A total of 3456 records were downloaded for the 1945-2006 period from titles of articles and reviews, and 10,543 records were downloaded for 1991-2006, also taking abstracts and keywords into consideration. Temporal trends of publications, geographical and institutional distribution of the research output, authorship, and core journals were evaluated. There were as many as 122 countries carrying out biogeographic research; in the most recent period, the USA is the top producing country, followed by the United Kingdom, Australia, France, Germany, Spain, and Canada. There were 17,493 authors contributing to the field. During 1991-2006 there were 4098 organizations with authors involved in biogeographic research; the institutions with the highest numbers of papers are the Natural History Museum (United Kingdom), the University of California, Berkeley (USA), the Museum National d'Histoire Naturelle (France), the Universidad Nacional Autónoma de México (Mexico), the American Museum of Natural History (USA) and the Russian Academy of Sciences (Russia). Research articles are spread over a variety of journals, with the Journal of Biogeography, Molecular Phylogenetics and Evolution, Molecular Ecology, and Biological Journal of the Linnean Society being the core journals. Of the 28,759 keywords retrieved, those with the highest frequency were evolution, phylogeny, diversity, mitochondrial DNA, pattern(s), systematics, and population(s). We conclude that publications on biogeography have increased substantially in recent years, especially since 1998. The preferred journal for biogeographic papers is the Journal of Biogeography. The most frequent keywords seem to indicate that biogeography fits well within both evolutionary biology and ecology, with molecular biology and phylogenetics being important factors that drive its current development.
Abstract:
A new version of the normal coordinate analysis package NCT is presented. The upgrade was mainly devised to enable the NCT package to easily manipulate the Hessian matrix evaluated by quantum chemical calculations. The program codes were almost wholly rewritten to be more efficient with GNU Fortran77, or g77, and compiled under FreeBSD and MS-DOS with the DJGPP implementation. Three typical usages of the program package are presented by giving the related input and output files. The functionality of the programs was carefully and satisfactorily checked against some sample calculations.
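The core step such a package automates, turning a quantum-chemical Cartesian Hessian into harmonic frequencies by mass weighting and diagonalization, can be shown on a toy one-dimensional diatomic. The numbers below are arbitrary units for illustration; this is not NCT's input format or code.

```python
import numpy as np

k = 0.5                          # force constant (arbitrary units)
masses = np.array([1.0, 2.0])    # atomic masses

# Cartesian Hessian d2V/dx_i dx_j for two atoms on a line joined by one spring.
H = np.array([[ k, -k],
              [-k,  k]])

inv_sqrt_m = 1.0 / np.sqrt(masses)
Hw = H * np.outer(inv_sqrt_m, inv_sqrt_m)     # mass-weighted Hessian

eigvals = np.linalg.eigvalsh(Hw)              # ascending eigenvalues
freqs = np.sqrt(np.clip(eigvals, 0.0, None))  # omega = sqrt(lambda)
```

The near-zero eigenvalue is the rigid translation; the nonzero one reproduces the textbook result omega² = k/μ with μ the reduced mass.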
Abstract:
The objective of this study was to verify the potential of SNAP III (Scheduling and Network Analysis Program) as a support tool for harvesting and wood transport planning in Brazil; harvesting subsystem definition and the establishment of a compatible route were assessed. Initially, machine operational and production costs were determined for seven subsystems in the study area, and quality indexes and construction and maintenance costs of forest roads were obtained and used as SNAP III input data. The results showed that three categories of forest road occur in the study area: main, secondary and tertiary, which, based on the quality index, allowed mean vehicle speeds of about 41, 30 and 24 km/h and construction costs of about US$ 5,084.30, US$ 2,275.28 and US$ 1,650.00/km, respectively. SNAP III, used as a support tool for planning, was found to have high potential for harvesting and wood transport planning. The program was capable of efficiently defining the harvesting subsystem on a technical and economic basis, the best wood transport route, and the forest road to be used in each period of the planning horizon.
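The routing step of such a tool reduces to a cheapest-path search on a road network whose edge costs depend on road class. The toy network, speeds, and hourly cost below are hypothetical (the speeds loosely echo the three road classes above); this is a generic Dijkstra sketch, not SNAP III's algorithm.

```python
import heapq

speed = {"main": 41.0, "secondary": 30.0, "tertiary": 24.0}   # km/h, per road class
cost_per_hour = 35.0                                          # hypothetical truck cost, US$/h

def edge_cost(km, road_class):
    return km / speed[road_class] * cost_per_hour             # US$ for the segment

# Invented network: a stand connected to two candidate mills.
graph = {
    "stand": [("jct", edge_cost(10, "tertiary"))],
    "jct":   [("mill_a", edge_cost(30, "main")),
              ("mill_b", edge_cost(12, "secondary"))],
    "mill_a": [], "mill_b": [],
}

def dijkstra(graph, src):
    """Cheapest cost from src to every reachable node."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

dist = dijkstra(graph, "stand")
```

Comparing `dist["mill_a"]` and `dist["mill_b"]` picks the destination with the cheaper transport route, the same kind of comparison the program makes across subsystems and periods.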
Abstract:
Techniques for evaluating the risks arising from the uncertainties inherent to agricultural activity should accompany planning studies. Risk analysis should be carried out by risk simulation using techniques such as the Monte Carlo method. This study was carried out to develop a computer program, called P-RISCO, for running risk simulations on linear programming models, to apply it to a case study, and to test the results against the @RISK program. In the risk analysis, it was observed that the mean of the output variable total net present value, U, was considerably lower than the maximum U value obtained from the linear programming model. It was also verified that the enterprise will face a significant risk of water shortage in April, which does not happen for the cropping pattern obtained by minimizing the irrigation requirement in April of each of the four years. The scenario analysis indicated that the sale price of the passion fruit crop exerts a strong influence on the financial performance of the enterprise. The comparative analysis verified the equivalence of the P-RISCO and @RISK programs in running the risk simulation for the considered scenario.
Abstract:
Fuzzy logic admits infinitely many intermediate logical values between false and true. Based on this principle, this study developed a system of fuzzy rules that indicates the body mass index of ruminant animals in order to determine the best time for slaughter. The controller developed has as inputs the variables weight and height, and as output a new body mass index, called the Fuzzy Body Mass Index (Fuzzy BMI), which may serve as a detection system for the time of livestock slaughtering, comparing animals through the linguistic variables "Very Low", "Low", "Average", "High" and "Very High". To demonstrate the application of this fuzzy system, an analysis was made of 147 Nellore beeves to determine Fuzzy BMI values for each animal and indicate the distribution of body mass in the herd. The performance validation of the system was based on a statistical analysis using the Pearson correlation coefficient, which reached 0.923, a high positive correlation, indicating that the proposed method is appropriate. Thus, this method allows evaluation of the herd by comparing each animal within the group, providing a quantitative aid to the farmer's decision. It is concluded that this study established a computational method based on fuzzy logic that mimics part of human reasoning and interprets the body mass index of any bovine species and in any region of the country.
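A Mamdani-style controller of this kind can be sketched with triangular membership functions on weight and height and a handful of rules. The universes, membership bounds, rules, and output levels below are all invented for illustration; they are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_bmi(weight_kg, height_m):
    # Hypothetical membership functions for the two input variables.
    w_low  = tri(weight_kg, 250, 350, 450)
    w_high = tri(weight_kg, 400, 500, 600)
    h_low  = tri(height_m, 1.1, 1.3, 1.5)
    h_high = tri(height_m, 1.4, 1.6, 1.8)
    # Rules (min for AND), each firing toward a crisp output level on a
    # 1..5 scale standing in for "Very Low" .. "Very High".
    rules = [
        (min(w_low,  h_high), 1.0),   # light and tall  -> "Very Low"
        (min(w_low,  h_low),  2.5),   # light and short -> near "Average"
        (min(w_high, h_high), 3.5),   # heavy and tall  -> near "High"
        (min(w_high, h_low),  5.0),   # heavy and short -> "Very High"
    ]
    # Weighted-average defuzzification of the fired rules.
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 2.5  # default to the middle class if nothing fires

bmi_light_tall = fuzzy_bmi(350, 1.6)
bmi_heavy_short = fuzzy_bmi(500, 1.3)
```

Applying `fuzzy_bmi` to every animal in a herd and comparing the resulting indices is the kind of within-group ranking the abstract describes.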
Abstract:
The present study aimed at evaluating the use of an Artificial Neural Network to correlate the values resulting from chemical analyses of coffee samples with the values of their sensory analyses. The coffee samples used were from Coffea arabica L., cultivars Acaiá do Cerrado, Topázio, Acaiá 474-19 and Bourbon, collected in the southern region of the state of Minas Gerais. The chemical analyses were carried out for reducing and non-reducing sugars. The quality of the beverage was evaluated by sensory analysis. The Artificial Neural Network method used the values from the chemical analyses as input variables and the values from the sensory analysis as output values. The multiple linear regression of sensory analysis values on the values from the chemical analyses presented a determination coefficient of 0.3106, while the Artificial Neural Network achieved an 80.00% success rate in classifying the values from the sensory analysis.
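The chemical-inputs-to-sensory-output mapping can be sketched as a small feed-forward network trained by gradient descent. The synthetic data (two standardized sugar measurements mapped to one sensory score) and the network size are hypothetical; the paper does not specify its architecture here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented standardized data: 2 chemical inputs -> 1 sensory score.
X = rng.normal(0.0, 1.0, (30, 2))
true_w = np.array([0.8, -0.5])
ysens = np.tanh(X @ true_w) + rng.normal(0.0, 0.05, 30)

# One hidden layer of 4 tanh units, linear output, batch gradient descent on MSE.
W1 = rng.normal(0.0, 0.3, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.3, 4);      b2 = 0.0

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

losses, lr = [], 0.05
for _ in range(500):
    H, yhat = forward(X)
    err = yhat - ysens
    losses.append(np.mean(err ** 2))
    # Backpropagation (gradients of the mean squared error, up to a constant).
    gW2 = H.T @ err / len(X);  gb2 = err.mean()
    gH = np.outer(err, W2) * (1 - H ** 2)
    gW1 = X.T @ gH / len(X);   gb1 = gH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```

The falling training loss is the analogue of the network outperforming the linear regression reported in the abstract: the hidden tanh layer can capture the nonlinearity a straight line cannot.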
Abstract:
This work describes the methodology, basic procedures and instrumentation employed by the Solar Energy Laboratory at Universidade Federal do Rio Grande do Sul for the determination of current-voltage characteristic curves of photovoltaic modules. Following this methodology, I-V characteristic curves were acquired for several modules under diverse conditions. The main electrical parameters were determined, and the influence of temperature and irradiance on photovoltaic module performance was quantified. It was observed that most of the tested modules presented output power values considerably lower than those specified by the manufacturers. The described hardware allows the testing of modules with open-circuit voltage up to 50 V and short-circuit current up to 8 A.
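The electrical parameters extracted from such curves can be illustrated with the standard single-diode idealization of a module (series and shunt resistances neglected for brevity). The parameter values below are invented and this is not the laboratory's procedure, only the textbook model behind it.

```python
import numpy as np

# Single-diode model: I(V) = Iph - I0 * (exp(V / (n * Ns * Vt)) - 1)
Iph, I0 = 7.5, 1e-8            # photo-generated and saturation currents (A), illustrative
n, Ns, Vt = 1.3, 36, 0.02585   # ideality factor, cells in series, thermal voltage (25 C)

V = np.linspace(0.0, 25.0, 2000)
I = Iph - I0 * (np.exp(V / (n * Ns * Vt)) - 1.0)
I = np.clip(I, 0.0, None)      # the module cannot sink current in this sketch

Isc = I[0]                     # short-circuit current, I at V = 0
Voc = V[np.argmax(I <= 0)]     # open-circuit voltage, first zero of the curve
P = V * I
Pmp = P.max()                  # maximum power point
```

Comparing the measured `Pmp` against the manufacturer's datasheet value is precisely the check that revealed the shortfall reported in the abstract.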