Abstract:
This paper discusses distribution and the historical phases of capitalism. It assumes that technical progress and growth are taking place and, given that, asks how income is functionally distributed between labor and capital, taking as reference the classical theory of distribution and Marx's falling tendency of the rate of profit. Based on historical experience, it first inverts the model, making the rate of profit the variable that is constant in the long run and the wage rate the residuum; second, it distinguishes three types of technical progress (capital-saving, neutral and capital-using) and applies them to the history of capitalism, with the UK and France as reference. Given these three types of technical progress, it distinguishes four phases of capitalist growth, of which only the second is consistent with Marx's prediction. The last phase, after World War II, should in principle be capital-saving, consistent with growth of wages above productivity. Instead, since the 1970s wages have been kept stagnant in rich countries because of, first, the fact that the Information and Communication Technology Revolution proved to be highly capital-using, opening room for a new wave of substitution of capital for labor; second, the new competition coming from developing countries; third, the emergence of the technobureaucratic or professional class; and, fourth, the new power of the neoliberal class coalition associating rentier capitalists and financiers.
Abstract:
We study an intertemporal asset pricing model in which a representative consumer maximizes expected utility derived from both the ratio of his consumption to some reference level and from this level itself. If the reference consumption level is assumed to be determined by past consumption levels, the model generalizes the usual habit formation specifications. When the reference level growth rate is made dependent on the market portfolio return and on past consumption growth, the model mixes a consumption CAPM with habit formation together with the CAPM. It therefore provides, in an expected utility framework, a generalization of the recursive non-expected utility model of Epstein and Zin (1989). When we estimate this specification with aggregate per capita consumption, we obtain economically plausible values of the preference parameters, in contrast with the habit formation or the Epstein-Zin cases taken separately. All tests performed with various preference specifications confirm that the reference level enters significantly in the pricing kernel.
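As a purely illustrative sketch (the abstract does not give the functional form; the curvature parameters γ and δ below are hypothetical), a period utility that depends both on the consumption-to-reference ratio and on the reference level itself could take the form

```latex
u(C_t, X_t) = \frac{\left(C_t / X_t\right)^{1-\gamma}}{1-\gamma}\, X_t^{1-\delta}
```

With δ = γ this collapses to standard CRRA utility over C_t alone, while other values of δ let the reference level X_t enter the pricing kernel in its own right, as the estimates described above suggest it does.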
Abstract:
Classical Monte Carlo simulations were carried out in the NPT ensemble at 25 °C and 1 atm, aiming to investigate the ability of the TIP4P water model [Jorgensen, Chandrasekhar, Madura, Impey and Klein, J. Chem. Phys., 79 (1983) 926] to reproduce the newest structural picture of liquid water. The results were compared with recent neutron diffraction data [Soper, Bruni and Ricci, J. Chem. Phys., 106 (1997) 247]. The influence of the computational conditions on the thermodynamic and structural results obtained with this model was also analyzed. The findings were compared with the original ones from Jorgensen et al. [above-cited reference plus Mol. Phys., 56 (1985) 1381]. It is noticed that the thermodynamic results depend on the boundary conditions used, whereas the usual radial distribution functions gOO(r) and gOH(r) do not.
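The abstract does not reproduce the simulation code; the Metropolis acceptance rule for the isothermal-isobaric (NPT) ensemble it refers to can be sketched as follows (function and variable names are ours, not the authors'):

```python
import math
import random


def npt_accept(delta_u, p, v_old, v_new, n, kT):
    """Metropolis acceptance probability for a trial move in the NPT ensemble:
    min(1, exp(-(dU + P*dV)/kT + N*ln(V_new/V_old))).

    delta_u : potential-energy change of the trial move (same units as kT)
    p       : imposed external pressure
    n       : number of molecules in the box
    """
    arg = -(delta_u + p * (v_new - v_old)) / kT + n * math.log(v_new / v_old)
    return math.exp(arg) if arg < 0.0 else 1.0


def accept_move(delta_u, p, v_old, v_new, n, kT, rng=random.random):
    """Draw a uniform random number and decide whether to accept the move."""
    return rng() < npt_accept(delta_u, p, v_old, v_new, n, kT)
```

For a pure particle displacement the volume is unchanged and the rule reduces to the familiar min(1, exp(-ΔU/kT)).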
Abstract:
The objective of this paper is to present a methodology, called the equivalent impedance test, to analyze transmission line models used in electromagnetic transient simulators. First, the definition of the equivalent impedance reference test is presented. Then the methodology is applied to a transmission line model, the Quasi-Modes model. The studies were carried out on a hypothetical non-transposed three-phase 440 kV transmission line. The line is 500 km long and was modeled as a cascade of pi circuits (50 pi circuits, each 10 km long).
Abstract:
In this paper a comparative analysis of the environmental impact caused by the use of natural gas and diesel in combined-cycle thermoelectric power plants is performed. The objective is to apply a thermoeconomic analysis in order to compare the two proposed fuels. The analysis employs a new methodology that incorporates the engineering economics concept into the ecological efficiency of Cardu and Baica [1, 2], which evaluates, in general terms, the environmental impacts caused by CO2, SO2, NOx and Particulate Matter (PM), adopting as reference the air quality standards in force. The thermoeconomic model herein proposed utilizes functional diagrams that allow the minimization of the Exergetic Manufacturing Cost (EMC), which represents the cost of producing electricity incorporating the environmental impact effects, to study the performance of the thermoelectric power plant [3, 4]. It follows that it is possible to determine the environmental impact caused by thermoelectric power plants and that, from the ecological standpoint, natural gas is the better fuel option compared to diesel, with ecological efficiency values of 0.944 and 0.914, respectively. From the exergoeconomic point of view, it was found that the EMC is better when natural gas is used as fuel than when diesel is used. Copyright © 2006 by ASME.
Abstract:
The purpose of this study is to describe the development of a model to predict the digestible lysine requirements of broilers using the factorial approach, and to evaluate the model using as reference the model presented in the Brazilian Tables for Poultry and Swine. The model partitions the requirement into maintenance and growth for feather-free body protein and feather protein, in which the inputs are the body and feather protein weights and the daily rates of protein deposition in the feather-free body and feathers. The parameters that express the lysine requirement for maintenance were obtained in metabolism trials with roosters, and those for the efficiency of lysine utilization in experiments with broilers from 1 to 42 d. Based on these results, the proposed model was: Lys (mg/d) = [(151 × BPm^-0.27 × BPt) + (0.01 × FPt × 18)] + [(75 × BPD/0.77) + (18 × FPD/0.77)], where Lys = digestible lysine requirement (mg/d), BPm = body protein weight at maturity (kg), BPt = body protein weight at time t (kg), FPt = feather protein weight at time t (kg), BPD = body protein deposition (g/d), FPD = feather protein deposition (g/d). The model yields sensible predictions of the digestible lysine requirements of broilers of different strains and ages growing at their potential, and suggests a lower lysine requirement after 27 d than does the Brazilian model. The proposed model is the first step in the development of a simulation model that predicts the food intake of a broiler and hence the dietary amino acid content that would optimise performance.
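The model equation can be transcribed directly as a function; this is a minimal sketch based only on the equation and variable definitions given in the abstract, not the authors' implementation:

```python
def lysine_requirement(bp_m, bp_t, fp_t, bpd, fpd):
    """Digestible lysine requirement (mg/d) from the factorial model.

    bp_m : body protein weight at maturity (kg)
    bp_t : body protein weight at time t (kg)
    fp_t : feather protein weight at time t (kg)
    bpd  : body protein deposition (g/d)
    fpd  : feather protein deposition (g/d)
    """
    # Maintenance: feather-free body term plus feather term
    maintenance = 151 * bp_m ** -0.27 * bp_t + 0.01 * fp_t * 18
    # Growth: protein depositions divided by the 0.77 efficiency of lysine use
    growth = 75 * bpd / 0.77 + 18 * fpd / 0.77
    return maintenance + growth
```

Requirements are dominated by the growth term for a bird depositing protein at its potential, which is consistent with the maintenance/growth partition described above.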
Abstract:
One approach to verifying the adequacy of methods for estimating reference evapotranspiration (ET0) is comparison with the Penman-Monteith method, recommended by the Food and Agriculture Organization of the United Nations (FAO) as the standard method for estimating ET0. This study aimed to compare the Makkink (MK), Hargreaves (HG) and Solar Radiation (RS) methods for estimating ET0 with Penman-Monteith (PM). For this purpose, we used daily data of global solar radiation, air temperature, relative humidity and wind speed for the year 2010, obtained from the automatic meteorological station of the National Institute of Meteorology (latitude 18.9166° S, longitude 48.2505° W, altitude 869 m), situated on the Campus of the Federal University of Uberlandia - MG, Brazil. Analyses of the results for the period were carried out on a daily basis, using regression analysis and considering the linear model y = ax, where the dependent variable was the Penman-Monteith method and the independent variable was the ET0 estimate of each evaluated method. A methodology was also used to check the influence of the standard deviation of daily ET0 on the comparison of methods. The evaluation indicated that the Solar Radiation method cannot be compared with Penman-Monteith, while the Hargreaves method provided the most efficient adjustment for estimating ET0.
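The no-intercept regression y = ax used in the comparison has the closed-form least-squares solution a = Σxᵢyᵢ / Σxᵢ²; a minimal sketch (the helper name is ours), with x the evaluated method's daily ET0 estimates and y the Penman-Monteith values:

```python
def slope_through_origin(x, y):
    """Least-squares slope of the no-intercept linear model y = a*x."""
    sxy = sum(xi * yi for xi, yi in zip(x, y))  # sum of cross products
    sxx = sum(xi * xi for xi in x)              # sum of squares of x
    return sxy / sxx
```

A slope near 1 then indicates close agreement with the Penman-Monteith standard.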
Abstract:
The aim of this paper is to compare 18 reference evapotranspiration models to the standard Penman-Monteith model in the Jaboticabal, Sao Paulo, region for the following time scales: daily, 5-day, 15-day and seasonal. A total of 5 years of daily meteorological data was used for the following analyses: accuracy (mean absolute percentage error, MAPE), precision (R²) and tendency (bias, measured as the systematic error, SE). The results were also compared at the 95% probability level with Tukey's test. The Priestley-Taylor (1972) method was the most accurate across all time scales, the Tanner-Pelton (1960) method was the most accurate in the winter, and the Thornthwaite (1948) method was the most accurate of the methods that use only temperature data in their equations.
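The accuracy and tendency statistics named above have standard definitions; a sketch (helper names are ours), with obs the Penman-Monteith values and est a candidate model's estimates:

```python
def mape(obs, est):
    """Mean absolute percentage error of est against the standard obs."""
    return 100.0 * sum(abs(o - e) / o for o, e in zip(obs, est)) / len(obs)


def systematic_error(obs, est):
    """Mean signed difference (bias): positive means est overestimates obs."""
    return sum(e - o for o, e in zip(obs, est)) / len(obs)
```

MAPE measures overall accuracy regardless of direction, while the systematic error separates a consistent over- or under-estimation tendency from random scatter.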
Abstract:
High throughput sequencing (HTS) provides new research opportunities for work on non-model organisms, such as differential expression studies between populations exposed to different environmental conditions. However, such transcriptomic studies first require the production of a reference assembly, and the choice of sampling procedure, sequencing strategy and assembly workflow is crucial. To develop a reliable reference transcriptome for Triatoma brasiliensis, the major Chagas disease vector in Northeastern Brazil, different de novo assembly protocols were generated using various datasets and software. Both 454 and Illumina sequencing technologies were applied to RNA extracted from antennae and mouthparts of single or pooled individuals. The 454 library yielded 278 Mb. Fifteen Illumina libraries were constructed and yielded nearly 360 million RNA-seq single reads and 46 million RNA-seq paired-end reads, for nearly 45 Gb. For the 454 reads, we used three assemblers, Newbler, CAP3 and/or MIRA, and for the Illumina reads, the Trinity assembler. Ten assembly workflows were compared using these programs separately or in combination. To compare the assemblies obtained, quantitative and qualitative criteria were used, including contig length, N50, contig number and the percentage of chimeric contigs. Completeness of the assemblies was estimated using the CEGMA pipeline. The best assembly (57,657 contigs, 80% completeness, < 1% chimeric contigs) was a hybrid assembly, leading us to recommend (1) using a single individual with a large representation of biological tissues, (2) merging both long 454 reads and short paired-end Illumina reads, and (3) using several assemblers, in order to combine the specific advantages of each.
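Of the comparison criteria listed, N50 is the one most commonly recomputed; a standard implementation of its definition (not the authors' code) is:

```python
def n50(lengths):
    """N50 of an assembly: the contig length at which contigs of that
    length or longer contain at least half the total assembled bases."""
    total = sum(lengths)
    acc = 0
    for length in sorted(lengths, reverse=True):  # walk contigs longest-first
        acc += length
        if acc * 2 >= total:
            return length
    return 0  # empty assembly
```

Half the total assembled length lies in contigs at least N50 bases long, so a higher N50 indicates a less fragmented assembly.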
Abstract:
High throughput sequencing capabilities have made the process of assembling a transcriptome easier, whether or not a reference genome is available. But the quality of a transcriptome assembly must be good enough to capture the most comprehensive catalog of transcripts and their variations, and to support further transcriptomic experiments. There is currently no consensus on which of the many sequencing technologies and assembly tools are the most effective. Many non-model organisms lack a reference genome to guide transcriptome assembly. One question, therefore, is whether a reference-based assembly gives better results than a de novo assembly. The blood-sucking insect Rhodnius prolixus, a vector of Chagas disease, has a reference genome; it is therefore a good model on which to compare reference-based and de novo transcriptome assemblies. In this study, we compared de novo and reference-based assembly strategies using three datasets (454, Illumina, and 454 combined with Illumina) and various assembly software. We developed criteria to compare the resulting assemblies: the size distribution and number of transcripts, the proportion of potentially chimeric transcripts, and how complete each assembly was (completeness evaluated both with the CEGMA software and through the fraction of the R. prolixus proteome retrieved). Moreover, we looked for the presence of two chemosensory gene families (Odorant-Binding Proteins and Chemosensory Proteins) to validate assembly quality. The reference-based assemblies obtained after genome annotation were clearly better than those generated using de novo strategies alone. Reference-based strategies revealed new transcripts, including new isoforms unpredicted by automatic genome annotation. However, a combination of both de novo and reference-based strategies gave the best result and allowed us to assemble fragmented transcripts.
Abstract:
We assessed the efficacy, in terms of phytosociological and edaphic responses, of three different forest intervention techniques implemented in 2007. On a farm where trees are planted and managed for cellulose production as well as set aside for environmental conservation, four stands were analysed: three of them were considered degraded and were managed using different intervention techniques (transposition, perches, and abandonment), and a fourth stand comprising pristine vegetation was considered a control (reference). Floristic and phytosociological data were collected in three 10 × 10 m plots established in each stand. In addition, a total of 48 soil samples were collected to analyse physical and chemical attributes of the topsoil in the different stands. In terms of biodiversity, all the treatments showed significantly lower values when compared to the reference area. However, the soils of all the treatment and reference stands are similar in terms of physical and chemical attributes. Taking into account the specificities of each restoration technique, we verified that the integrated use of a set of management practices, consisting of (1) abandonment of the area followed by (2) selective killing of the eucalyptus, is the most suitable and promising model for providing fast and effective restoration in terms of environmental indicators.