81 results for "Equação de projeto" (design equation)


Relevance: 20.00%

Publisher:

Abstract:

The present work first establishes the fundamental thermodynamic relationships that govern equilibrium between phases, and the models used to describe the non-ideal behavior of the liquid and vapor phases at low pressures. It then addresses the determination of vapor-liquid equilibrium (VLE) data for a series of multicomponent mixtures of saturated aliphatic hydrocarbons, prepared synthetically from analytical-grade substances, and the development of a new dynamic cell with circulation of the vapor phase. The apparatus and experimental procedures developed are described and applied to the determination of VLE data. Isobaric VLE data were obtained with a Fischer ebulliometer with circulation of both phases for the systems pentane + dodecane, heptane + dodecane and decane + dodecane. Using the two new specially designed dynamic cells, of easy operation and low cost, with circulation of the vapor phase, data were measured for the systems heptane + decane + dodecane, acetone + water, Tween 20 + dodecane and phenol + water, as well as distillation curves of an additive-free gasoline. The compositions of the equilibrium phases were determined by densimetry, chromatography and total organic carbon analysis. Calibration curves of density versus composition were prepared from synthetic mixtures and the behavior of the excess volumes was evaluated. The VLE data obtained experimentally for the hydrocarbon and aqueous systems were submitted to thermodynamic consistency tests, as were data obtained from the literature for other binary systems, mainly from the Dortmund Data Bank (DDB); the tests rely on the Gibbs-Duhem equation, and a satisfactory data base was obtained. The results of the thermodynamic consistency tests for the binary and ternary systems were evaluated in terms of deviations, for applications such as model development. These approved data sets were then used in the KijPoly program to determine the binary kij parameters of the Peng-Robinson cubic equation of state, both in its original form and with the expanded alpha function. The parameters obtained can be applied, through simulators, to the simulation of petroleum reservoir conditions and of the several distillation processes found in the petrochemical industry. The two dynamic cells designed and built with domestically produced equipment for the determination of VLE data proved successful, demonstrating efficiency and low cost. Multicomponent systems, mixtures of components of different molecular weights and dilute solutions can be studied in these VLE cells.
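For reference, the relations alluded to above can be summarized in their usual textbook form (the exact expressions adopted in the thesis and in the KijPoly program are assumed, not quoted). The thermodynamic consistency tests rest on the isothermal-isobaric Gibbs-Duhem relation,

\[ \sum_i x_i \, d\ln\gamma_i = 0 \qquad (\text{constant } T, P), \]

and the binary kij parameters enter through the classical one-fluid mixing rule of the Peng-Robinson equation of state,

\[ P = \frac{RT}{v - b_m} - \frac{a_m(T)}{v(v + b_m) + b_m(v - b_m)}, \qquad a_m(T) = \sum_i \sum_j x_i x_j \sqrt{a_i(T)\,a_j(T)}\,\bigl(1 - k_{ij}\bigr), \qquad b_m = \sum_i x_i b_i, \]

where each pure-component \(a_i(T)\) carries the alpha function (original or expanded) referred to in the text.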

Relevance: 20.00%

Publisher:

Abstract:

The study that resulted in this dissertation was developed at OU RNCE PETROBRAS, in Natal, which implemented a project of rational use and reuse of water, including the use of wastewater from an existing Sewage Treatment Plant (STP), diluted with water from its own wells, to irrigate the green area of the corporate building complex. The objective of this research was to establish a methodology that can serve as a guideline for future controlled water-reuse projects of this kind. Three sanitary and environmental control instruments were proposed, implemented and evaluated: 1) adaptation of the sewage treatment plant and quality control of the treated effluent; 2) analysis of the soil-nutrient interaction in the irrigated area; 3) knowledge of the local hydrogeology, especially regarding the direction of aquifer flow and the location of the supply wells of the Companhia de Águas e Esgotos do Rio Grande do Norte (CAERN) situated in the surroundings. These instruments proved sufficient and appropriate to ensure the levels of sanitary and environmental control proposed and studied, namely: a) control of the water quality at the STP outlet and at the outlet of the irrigation reservoir; b) control of the quality of sub-surface water in the soil and assessment of changes in soil composition; c) assessment of water quality in the aquifer. To accomplish this, it was necessary to: 1) establish the monitoring plan for the STP and its effluent quality, defining sampling points and analysis parameters, improve its operation by identifying flow adequacy and screening as the main operational control factors, and increase the efficiency of the station at relatively low cost by using additional filters; 2) propose, implement and adapt simple collectors to assess the quality of the water percolating through the soil of the irrigated area; 3) determine the direction of groundwater flow in the study area and select the wells for monitoring the aquifer.

Relevance: 20.00%

Publisher:

Abstract:

The management of water resources at the river basin level, as defined by Law No. 9,433/97, requires effective knowledge of the hydrological processes of the basin, obtained from studies based on consistent series of hydrological data that reflect the basin's characteristics. In this context, the objective of this work was to model the catchment of the Jundiaí river (RN) and to study flood attenuation by the Tabatinga dam, by means of a project for monitoring hydrological and climatological data in the basin, with a view to promoting research activities that apply unified and appropriate methodologies for hydrological studies in the transition region between the semiarid zone and the forest zone on the coast of Rio Grande do Norte. To study the hydrological characteristics of the basin, the Jundiaí river basin was delineated automatically with the aid of geoprocessing programs, and a daily hydrological model, the NRCS model, which is deterministic and lumped, was adopted. Using this model required determining some of its parameters, such as the Curve Number (CN). Since this is the first study conducted in the basin with this model, a sensitivity analysis of its results was carried out by adopting different CN values within a range appropriate to the land use, occupation and soil characteristics of the basin. As the study also aimed to develop a simulation model of the operation of the Tabatinga dam, and thereby of the control of the floods that affect the city of Macaíba, a mathematical water-balance model was developed for use in Microsoft Excel. The simulation was conducted in two stages: in the first, a daily water balance was performed, which allowed a sensitivity analysis of the model with respect to the flood-control (waiting) volume, as well as the determination of the period with the greatest average daily discharges. From this point, the second stage consisted of determining the routed outflow hydrograph, obtained by means of an hourly water balance based on outflows generated by a mathematical equation whose parameters were adjusted according to the daily hydrograph. The analyses showed that the Tabatinga dam can only attenuate floods by setting aside a flood-control (waiting) volume, which implies the loss of approximately 56.5% of its storage capacity, because to produce the attenuation effect the reservoir level has to remain more than 5 m below the sill level, corresponding to at least 50,582,927 m³. The results obtained with the modeling represent a first step towards improving the level of hydrological information on the behavior of semiarid basins. In order to monitor the Jundiaí river basin quantitatively, it will be necessary to install a recording rain gauge next to the Tabatinga dam and a pressure transducer for regular measurements at the dam reservoir. Climatological data will be collected by the complete automatic weather station installed at the Jundiaí Agricultural School.
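As a rough illustration of the kind of calculation involved, a minimal Python sketch of the NRCS (SCS) curve-number runoff estimate and of a simplified daily reservoir balance follows. The CN value, basin area, inflow series and outflow are hypothetical, and the thesis implemented its water balance in Microsoft Excel, so this is an analogy of the method, not the author's spreadsheet.

    # Minimal sketch: NRCS (SCS) curve-number runoff and a simplified daily
    # reservoir balance. All numbers below are hypothetical.

    def scs_runoff_mm(rain_mm: float, cn: float) -> float:
        """Direct runoff depth (mm) from daily rainfall by the SCS-CN method."""
        s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
        ia = 0.2 * s                  # initial abstraction (mm)
        if rain_mm <= ia:
            return 0.0
        return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

    def simulate_reservoir(daily_rain_mm, cn, basin_area_km2,
                           capacity_m3, waiting_volume_m3, outflow_m3_day):
        """Daily balance keeping a flood-control (waiting) volume empty."""
        max_storage = capacity_m3 - waiting_volume_m3
        storage, spills = 0.0, []
        for rain in daily_rain_mm:
            inflow = scs_runoff_mm(rain, cn) / 1000.0 * basin_area_km2 * 1e6  # m3
            storage = max(0.0, storage + inflow - outflow_m3_day)
            spill = max(0.0, storage - max_storage)   # routed downstream
            storage -= spill
            spills.append(spill)
        return spills

    # Hypothetical example: 5 days of rain over a 400 km2 basin, CN = 75,
    # waiting volume taken from the figure quoted in the abstract.
    print(simulate_reservoir([0, 30, 80, 10, 0], cn=75, basin_area_km2=400,
                             capacity_m3=2 * 50_582_927,
                             waiting_volume_m3=50_582_927,
                             outflow_m3_day=500_000))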

Relevance: 20.00%

Publisher:

Abstract:

The present study aims to examine whether activities mediated by the History of Mathematics can contribute to improving the understanding of how to solve 2nd-degree (quadratic) equations among teachers and undergraduates who reproduce solution methods for such equations uncritically, without mastering the justifications for their steps. To this end, we adapted a didactic sequence whose activities lead to a rediscovery of the quadratic formula through the method known as "cut and paste". Finally, we present the activity module containing the didactic sequence used during the study, as a suggestion for classroom use by mathematics teachers.
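In algebraic terms, the rediscovery that the "cut and paste" activities lead to corresponds to completing the square (this reading of the method is an interpretation, not a quotation from the study). Starting from

\[ ax^{2} + bx + c = 0, \qquad a \neq 0, \]

dividing by \(a\) and isolating the constant term gives

\[ x^{2} + \frac{b}{a}x = -\frac{c}{a}; \]

"pasting" the missing square piece \(\left(\tfrac{b}{2a}\right)^{2}\) onto both sides yields

\[ \left(x + \frac{b}{2a}\right)^{2} = \frac{b^{2} - 4ac}{4a^{2}}, \]

and taking square roots recovers the resolutive (quadratic) formula,

\[ x = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a}. \]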

Relevance: 20.00%

Publisher:

Abstract:

In Einstein's theory of General Relativity, the field equations relate the geometry of space-time to the content of matter and energy, the sources of the gravitational field. This content is described by a second-order tensor, known as the energy-momentum tensor. On the other hand, the energy-momentum tensors that have physical meaning are not specified by the theory. In the 1970s, Hawking and Ellis established a set of conditions, considered reasonable from a physical point of view, in order to limit the arbitrariness of these tensors. These conditions, which became known as the Hawking-Ellis energy conditions, play important roles in gravitation. They are widely used as powerful analysis tools, from the demonstration of important theorems concerning the behavior of gravitational fields and their associated geometries and the quantum behavior of gravity, to the analysis of cosmological models. In this dissertation we present a rigorous deduction of the several energy conditions currently in vogue in the scientific literature, namely the Null Energy Condition (NEC), the Weak Energy Condition (WEC), the Strong Energy Condition (SEC), the Dominant Energy Condition (DEC) and the Null Dominant Energy Condition (NDEC). Bearing in mind the most common applications in Cosmology and Gravitation, the deductions were initially made for the energy-momentum tensor of a generalized perfect fluid and then extended to scalar fields with minimal and non-minimal coupling to the gravitational field. We also present a study of the possible violations of some of these energy conditions. Aiming at the study of the singular nature of some exact solutions of Einstein's General Relativity, in 1955 the Indian physicist Raychaudhuri derived an equation that is today considered fundamental to the study of the gravitational attraction of matter, which became known as the Raychaudhuri equation. This famous equation is fundamental to the understanding of gravitational attraction in Astrophysics and Cosmology and to the comprehension of the singularity theorems, such as the Hawking and Penrose theorem on the singularity of gravitational collapse. In this dissertation we derive the Raychaudhuri equation, the Frobenius theorem and the Focusing theorem for time-like and null congruences of a pseudo-Riemannian manifold. We discuss the geometric and physical meaning of this equation, its connections with the energy conditions, and some of its several applications.
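For orientation, the conditions and the equation deduced in the dissertation are usually stated as follows (standard textbook forms, which the text is assumed to follow). For any timelike vector \(u^{\mu}\) and null vector \(k^{\mu}\):

\[ \text{NEC: } T_{\mu\nu}k^{\mu}k^{\nu} \ge 0; \quad \text{WEC: } T_{\mu\nu}u^{\mu}u^{\nu} \ge 0; \quad \text{SEC: } \Bigl(T_{\mu\nu} - \tfrac{1}{2}T\,g_{\mu\nu}\Bigr)u^{\mu}u^{\nu} \ge 0; \quad \text{DEC: WEC and } -T^{\mu}{}_{\nu}u^{\nu} \text{ causal.} \]

For a perfect fluid with energy density \(\rho\) and pressure \(p\) these reduce to \(\rho + p \ge 0\) (NEC); \(\rho \ge 0\) and \(\rho + p \ge 0\) (WEC); \(\rho + 3p \ge 0\) and \(\rho + p \ge 0\) (SEC); \(\rho \ge |p|\) (DEC). The Raychaudhuri equation for a timelike geodesic congruence with expansion \(\theta\), shear \(\sigma_{\mu\nu}\) and vorticity \(\omega_{\mu\nu}\) reads

\[ \frac{d\theta}{d\tau} = -\frac{\theta^{2}}{3} - \sigma_{\mu\nu}\sigma^{\mu\nu} + \omega_{\mu\nu}\omega^{\mu\nu} - R_{\mu\nu}u^{\mu}u^{\nu}, \]

from which, for vorticity-free congruences and with the SEC imposed through the Einstein equations, the Focusing theorem follows.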

Relevance: 20.00%

Publisher:

Abstract:

Recent astronomical observations indicate that the universe has null spatial curvature, is accelerating, and has a matter-energy content composed of roughly 30% matter (baryons + dark matter) and 70% dark energy, a relativistic component with negative pressure. However, in order to build more realistic models it is necessary to consider the evolution of small density perturbations, so as to explain the richness of structures observed on the scales of galaxies and galaxy clusters. The structure formation process was first described by Press and Schechter (PS) in 1974 by means of the galaxy cluster mass function. The PS formalism assumes a Gaussian distribution for the primordial density perturbation field. Besides a serious normalization problem, such an approach does not explain the recent cluster X-ray data and disagrees with the most up-to-date computational simulations. In this thesis, we discuss several applications of the nonextensive (non-Gaussian) q-statistics proposed in 1988 by C. Tsallis, with special emphasis on the cosmological process of large-scale structure formation. Initially, we investigate the statistics of the primordial fluctuation field of the density contrast, since the most recent data from the Wilkinson Microwave Anisotropy Probe (WMAP) indicate a deviation from Gaussianity. We assume that such deviations may be described by the nonextensive statistics, because it reduces to the Gaussian distribution in the limit q = 1 of its free parameter, thereby allowing a direct comparison with the standard theory. We study its application to a galaxy cluster catalog based on the ROSAT All-Sky Survey (hereafter HIFLUGCS). We conclude that the standard Gaussian model applied to HIFLUGCS does not agree with the most recent data independently obtained by WMAP; using the nonextensive statistics, we obtain values much better aligned with the WMAP results. We also show that the Burr distribution corrects the normalization problem. The cluster mass function formalism was also investigated in the presence of dark energy, in which case constraints on several cosmological parameters were obtained as well. The nonextensive statistics was further applied to two distinct problems: (i) the plasma probe and (ii) the description of Bremsstrahlung radiation (the primary radiation from X-ray clusters), a problem of considerable interest in astrophysics. In another line of development, using supernova data and the gas mass fraction of galaxy clusters, we discuss a redshift variation of the equation-of-state parameter, considering two distinct expansions. An interesting aspect of this work is that the results do not require a prior on the mass parameter, as usually occurs in analyses involving only supernova data. Finally, we obtain a new estimate of the Hubble parameter through a joint analysis involving the Sunyaev-Zeldovich effect (SZE), X-ray data from galaxy clusters and the baryon acoustic oscillations. We show that the degeneracy of the observational data with respect to the mass parameter is broken when the signature of the baryon acoustic oscillations, as given by the Sloan Digital Sky Survey (SDSS) catalog, is considered. Our analysis, based on SZE/X-ray data for a sample of 25 galaxy clusters with triaxial morphology, yields a Hubble parameter in good agreement with independent studies from the Hubble Space Telescope project and the recent WMAP estimates.
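A sketch of the nonextensive ingredient in its usual form (the precise normalization and variables adopted in the thesis are assumptions here): the Tsallis q-Gaussian for the density-contrast fluctuations,

\[ p_{q}(\delta) \propto \left[ 1 - (1-q)\,\frac{\delta^{2}}{2\sigma^{2}} \right]^{\frac{1}{1-q}} \;\longrightarrow\; \exp\!\left(-\frac{\delta^{2}}{2\sigma^{2}}\right) \quad \text{as } q \to 1, \]

which in that limit recovers the Gaussian underlying the original Press-Schechter mass function,

\[ n(M)\,dM = \sqrt{\frac{2}{\pi}}\;\frac{\bar{\rho}}{M^{2}}\;\frac{\delta_{c}}{\sigma(M)} \left| \frac{d\ln\sigma}{d\ln M} \right| \exp\!\left(-\frac{\delta_{c}^{2}}{2\sigma^{2}(M)}\right) dM. \]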

Relevance: 20.00%

Publisher:

Abstract:

Information technology (IT) has, over the years, gained prominence as a strategic element and competitive advantage in organizations, whether public or private. In the judiciary, with the implementation of actions related to the Judiciário Eletrônico, IT definitively earns its status as a strategic element and significantly raises the level of dependence of the institutions on its services and products. Increasingly, the quality of the services provided by IT has a direct impact on the quality of the services provided by the institution as a whole. At the Ministério Público do Estado do Rio Grande do Norte (MPRN), the deployment of Electronic Government actions, together with an administrative reform, caused a large increase in the institutional demand for the products and services provided by the Diretoria de Tecnologia da Informação (DTI), the sector responsible for providing IT services. Taking as a starting point the strategic goal set by the MPRN of reaching an 85% level of user satisfaction within four years, we propose a method that assists in meeting this goal while respecting the capacity constraints of the IT sector. To achieve the proposed objective, the work was conducted in two distinct and complementary stages. In the first stage we carried out a case study at the MPRN in which, through an internal and external diagnosis of the DTI, accomplished by means of an internal consulting action and a user satisfaction survey, we sought to identify opportunities for change aimed at raising the perceived quality of the services provided by the DTI from the viewpoint of its customers. The situational report drawn from the data collected fostered changes in the DTI, which were then evaluated with the managers. In the second stage, based on the results obtained in the initial process, on empirical observation, on the evaluation of parallel quality-improvement projects in the sector, and on the validation of the initial model with the managers, we developed an improved process that encompasses, beyond the identification of service gaps, a strategy for selecting best management practices and deploying them in an incremental and adaptive way, allowing the process to be applied in institutions with few staff allocated to the provision of IT services.

Relevance: 20.00%

Publisher:

Abstract:

This study deals with cognitive competences and abilities that are relevant to selection and education in Information Technology (IT). These competences relate to problem solving, decision making, and the practical intelligence involved in mobilizing scholastic and extracurricular knowledge. The research aimed to contribute to the improvement of a selection instrument consisting of five skill matrices (dealing with objectives and prospection), as well as to the understanding of the development of the skills involved in IT education. This is done by means of an analysis of the selection instrument used in the first selection process of the Instituto Metrópole Digital, an institute of the Federal University of Rio Grande do Norte, Brazil, evaluated with respect to its IT training course (basic training with emphases on Web programming and electronics). A quantitative methodology was used, involving performance scores from the training course. A descriptive analysis and an analysis of variance (ANOVA) of socioeconomic data were carried out; no meaningful relation was observed between parental education and student performance in the course, and these analyses pointed out the importance of and need for vacancy-reservation policies for public school students. A Spearman correlation analysis was performed between performance on the selection instrument and performance in the training course; the instrument proved to be a significant, moderate predictor of performance in the course as a whole. Cluster and regression analyses were also carried out: the former identified performance groups (clusters) ranging from medium to low, while the regression analysis pointed out associations between the criterion variables (average performance in the basic and advanced modules) and the explanatory variables (the five matrices), indicating matrices 1 and 3 as the strongest predictors. In all these analyses the correlation between the instrument and the course was moderate, which may be related to aspects of the course itself, such as the emphasis of its evaluations on technical content and practical (educational) skills, relative to the competences and skills assessed in selection. It is known that the mediation of technological artifacts in a cultural context can foster the development of skills and abilities relevant to IT training. This study provides input for reflecting on the adoption of the selection instrument and on IT training at the Institute, and thus offers means for an interdisciplinary discussion that enriches areas such as Psychology and Information Technology with respect to the competences and skills relevant to IT training.
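As an illustration of one of the analyses described, a minimal Python sketch of the Spearman correlation between selection-instrument scores and course performance follows; the data are invented, and the actual dataset, variables and software used in the study are not reproduced here.

    # Hypothetical illustration of the Spearman correlation analysis; the
    # scores below are invented, not data from the study.
    from scipy.stats import spearmanr

    selection_score = [62, 75, 80, 55, 90, 70, 66, 85]           # instrument
    course_average  = [6.0, 7.2, 7.8, 5.5, 8.9, 6.8, 7.0, 8.1]   # course

    rho, p_value = spearmanr(selection_score, course_average)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
    # A moderate, significant rho is what the abstract reports for the
    # instrument as a predictor of overall course performance.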

Relevance: 20.00%

Publisher:

Abstract:

Proposals aimed at redirecting the current health care models in Brazil are indispensable, given the health needs and challenges imposed by society. These issues reach Higher Education Institutions, which attempt to devise ways to face such demands. This research analyzes the Pedagogical Project (PP) of the Dentistry undergraduate course at the Federal University of Rio Grande do Norte (UFRN), Brazil, in light of the Brazilian National Curricular Guidelines, considering the main competences established in the PP. The research was approved by the Ethics Committee at UFRN under document number 285/201. The work is descriptive in nature and was carried out with 30 students of the Dentistry undergraduate course. Interviews were conducted using a problem-situation approach. The research was also supported by documentary study of the syllabi of the disciplines taught at UFRN. Data were processed with the ALCESTE 4.9 software. Some conservative conceptions still arise, even though active and innovative methodologies are used to promote the reflection and articulation needed for the development of the general competences proposed in the Pedagogical Project of the Dentistry course at UFRN; these conceptions are mainly present in teaching-learning processes in which students do not participate fully. It is therefore possible to conclude that, although there have been advances, seen in the inclusion of multidisciplinary clinical work and optional courses in the curriculum and in the occasional use of active teaching methodologies in Dentistry at UFRN, there is still a need for a didactic and methodological redesign, so that competences and abilities are developed progressively during the formative process, as established in the Brazilian National Curricular Guidelines.

Relevance: 20.00%

Publisher:

Abstract:

In Brazilian health policy, the decentralization of SUS management responsibilities to the three spheres of government has driven the creation and regulation of health-service audits within the National Audit Office. This is a trend of the neoliberal policies imposed on peripheral countries by international bodies such as the World Bank and the IMF, characterized by productive restructuring and state reform, and it unfolds in the presence of two competing projects in the health area: the Health Sector Reform Project, which is based on the democratic rule of law, assumes health as a social right and a duty of the State, and defends the extension of acquired rights and the democratization of access to health care, guaranteed through public financing strategies and the effective decentralization of decisions permeated by social control; and the Privatized Health Project, which is based on the minimal State, with the reduction of social spending, partnerships and privatization and a stronger nonprofit sector subject to capitalist interests, made effective through strategies of targeting health policy and the re-philanthropization of actions. In this context, the present study analyzes the work of social workers in the public health (SUS) audits in Rio Grande do Norte (RN), from a qualitative and quantitative approach grounded in the critical-dialectical method of Marxist social theory, which enabled us to unveil the characteristics, demands and challenges of this work and to outline the profile of Social Work within the SUS audit teams in RN, while also providing evidence of the prospects and possibilities of this field of activity for social workers. It was also found that, through audit work, the State fulfills its bureaucratic and regulatory role over health services with efficiency, effectiveness and economy. Yet, paradoxically, SUS audits may also provide a vehicle for enforcing rights and ensuring the fundamental principles contained in the health reform project, because they can be configured as a space of political struggle and represent a new field of knowledge production that needs to be appropriated by a critical theory capable of redirecting social interests in favor of the user. From this perspective, it is concluded that the work of social workers in SUS audits, despite its social relevance, as an activity of studying reality and proposing its transformation, requires a transformative political action guided by the Marxist theoretical discussion that underpins the ethical-political professional project of Social Work.

Relevance: 20.00%

Publisher:

Abstract:

The next generation of computers is expected to rely on architectures with multiple processors and/or multicore processors. In this context there are challenges related to interconnection, operating frequency, on-chip area, power dissipation, performance and programmability. Networks-on-chip are considered the ideal interconnection and communication mechanism for this type of architecture, due to their scalability, reusability and intrinsic parallelism. Communication in networks-on-chip is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packets are transmitted in a pipelined fashion between the routers of the network, from the source to the destination of the communication, allowing simultaneous communications between different source-destination pairs. Based on this, it is proposed to turn the entire communication infrastructure of the network-on-chip, with its routing, arbitration and storage mechanisms, into a high-performance parallel processing system. In this proposal, packets are formed by the instructions and data that represent the applications, and these instructions are executed by the routers as the packets are transmitted, exploiting the pipelining and the parallel communication transmissions. Traditional processors are not used; only simple cores that control access to memory. An implementation of this idea is called IPNoSys (Integrated Processing NoC System), which has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets while preventing deadlock, livelock and starvation. The architecture provides mechanisms for input and output, interrupts and operating system support. As a proof of concept, a programming environment and a simulator for this architecture were developed in SystemC, allowing various parameters to be configured and several results to be obtained for its evaluation.
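To make the idea of executing instructions while a packet travels more concrete, here is a deliberately simplified, purely conceptual Python toy model; the packet format, instruction set and routing algorithm of the real IPNoSys architecture are not represented, only the general notion that each router consumes the next instruction of a packet as it forwards it.

    # Conceptual toy model (not the real IPNoSys packet format): a packet
    # carries a list of instructions plus an accumulator; each router along
    # the path executes the next pending instruction before forwarding.

    OPS = {"ADD": lambda a, b: a + b,
           "MUL": lambda a, b: a * b}

    def router_step(packet):
        """Execute one instruction of the packet at the current router."""
        if packet["program"]:
            op, operand = packet["program"].pop(0)
            packet["acc"] = OPS[op](packet["acc"], operand)
        return packet

    def route(packet, path):
        """Forward the packet hop by hop, executing as it goes."""
        for router_id in path:
            packet = router_step(packet)
            print(f"router {router_id}: acc = {packet['acc']}")
        return packet["acc"]

    # (2 + 3) * 4 computed along a 3-hop path, one instruction per hop.
    pkt = {"acc": 2, "program": [("ADD", 3), ("MUL", 4)]}
    route(pkt, path=["R0", "R1", "R2"])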

Relevance: 20.00%

Publisher:

Abstract:

Aspect-oriented approaches associated with the different activities of the software development process are, in general, independent, and their models and artifacts are not aligned within a coherent process. In model-driven development, the various models and the correspondences between them are rigorously specified. By integrating aspect-oriented software development (AOSD) and model-driven development (MDD), it is possible to automatically propagate models from one activity to another, avoiding the loss of information and of important decisions established in each activity. This work presents MARISA-MDD, a model-based strategy that integrates aspect-oriented requirements, architecture and detailed design, using the AOV-graph, AspectualACME and aSideML languages, respectively. MARISA-MDD defines, for each activity, representative models (and corresponding metamodels) and a number of transformations between the models of each language. These transformations have been specified and implemented in ATL (ATLAS Transformation Language), in the Eclipse environment. MARISA-MDD allows the automatic propagation between AOV-graph, AspectualACME and aSideML models. To validate the proposed approach, two case studies, Health Watcher and Mobile Media, were used in the MARISA-MDD environment for the automatic generation of AspectualACME and aSideML models from the AOV-graph model.

Relevance: 20.00%

Publisher:

Abstract:

This work presents the concept, design and implementation of an MP-SoC platform named STORM (MP-SoC DirecTory-Based PlatfORM). Currently the platform is composed of the following modules: a SPARC V8 processor, a GPOP processor, a Cache module, a Memory module, a Directory module and two different models of Network-on-Chip, NoCX4 and Obese Tree. All modules were implemented in SystemC, simulated and validated, individually and in groups, and their descriptions are presented in detail. To program the platform in C, a SPARC assembler was implemented, fully compatible with the assembly code generated by gcc. For parallel programming, a mutex-management library was implemented, with the corresponding assembler support. A total of 10 simulations of increasing complexity are presented to validate the proposed concepts, including real parallel applications such as matrix multiplication, Mergesort, KMP, Motion Estimation and 2D DCT.

Relevance: 20.00%

Publisher:

Abstract:

Reconfigurable computing is an intermediate solution for the resolution of complex problems, making it possible to combine the speed of hardware with the flexibility of software. A reconfigurable architecture has several goals, among them increased performance. The use of reconfigurable architectures to increase system performance is a well-known technique, especially because of the possibility of implementing directly in hardware certain algorithms that are slow on current processors. Among the various segments that use reconfigurable architectures, reconfigurable processors deserve special mention. These processors combine the functions of a microprocessor with reconfigurable logic and can be adapted after the development process. Reconfigurable Instruction Set Processors (RISP) are a subgroup of reconfigurable processors whose goal is the reconfiguration of the processor's instruction set, involving issues such as instruction formats, operands and operations. The main objective of this work is the development of a RISP processor, combining the technique of configuring the processor's set of executed instructions at design time with that of reconfiguring it at execution time. The design and VHDL implementation of this RISP processor aim to prove the applicability and efficiency of two concepts: using more than one fixed instruction set, with only one set active at a given time; and the possibility of creating and combining new instructions, so that the processor comes to recognize and use them in real time as if they belonged to the fixed instruction set. Instructions are created and combined through a reconfiguration unit incorporated into the processor, which allows the user to send custom instructions to the processor and later use them as if they were fixed instructions. This work also presents simulations of applications involving fixed and custom instructions, and results comparing these applications in terms of power consumption and execution time, which confirm that the goals for which the processor was developed were attained.
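As a software analogy of the two concepts being proved (several fixed instruction sets with only one active at a time, plus custom instructions registered at run time through a reconfiguration unit), a small Python sketch follows; it is only a conceptual illustration, not the VHDL design described in the abstract.

    # Conceptual analogy of a RISP: fixed instruction sets, one active at a
    # time, plus custom instructions registered at run time. Not the VHDL
    # processor itself; all names are illustrative.

    FIXED_SETS = {
        "setA": {"ADD": lambda a, b: a + b, "SUB": lambda a, b: a - b},
        "setB": {"AND": lambda a, b: a & b, "OR":  lambda a, b: a | b},
    }

    class ToyRISP:
        def __init__(self, active_set="setA"):
            self.active = dict(FIXED_SETS[active_set])   # active fixed set
            self.custom = {}                             # run-time instructions

        def reconfigure(self, name, func):
            """Reconfiguration unit: register a custom instruction at run time."""
            self.custom[name] = func

        def execute(self, opcode, a, b):
            table = {**self.active, **self.custom}       # custom extends fixed set
            return table[opcode](a, b)

    cpu = ToyRISP("setA")
    cpu.reconfigure("MAC_STEP", lambda acc, x: acc + x * x)  # hypothetical custom op
    print(cpu.execute("ADD", 2, 3))       # fixed instruction -> 5
    print(cpu.execute("MAC_STEP", 5, 4))  # custom instruction -> 21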

Relevance: 20.00%

Publisher:

Abstract:

New programming language paradigms have commonly been tested and eventually incorporated into hardware description languages. Recently, aspect-oriented programming (AOP) has proven successful in improving the modularity of object-oriented and structured languages such as Java, C++ and C. One can therefore expect that, by using AOP, the understanding of the hardware systems under design can be improved, and their components made more reusable and easier to maintain. We apply AOP to applications developed using the SystemC library. Several examples are presented illustrating how to combine AOP and SystemC, and during their presentation the benefits of this new approach are also discussed.
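Since the SystemC/C++ examples themselves are not reproduced in the abstract, the flavor of the approach can be suggested with a language-neutral analogy in Python: a "logging aspect" woven around the methods of a hardware-model class without touching its code. In the actual work the weaving is done over SystemC code by an AOP tool, so this sketch is only an analogy.

    # Analogy only: an 'aspect' that weaves logging advice around the methods
    # of a model class, keeping the crosscutting concern out of the model code.
    import functools

    def logging_aspect(cls):
        """Weave before/after logging advice into every public method."""
        for name, method in list(vars(cls).items()):
            if callable(method) and not name.startswith("_"):
                @functools.wraps(method)
                def advised(self, *args, _method=method, _name=name, **kw):
                    print(f"[aspect] before {_name}{args}")
                    result = _method(self, *args, **kw)
                    print(f"[aspect] after  {_name} -> {result}")
                    return result
                setattr(cls, name, advised)
        return cls

    @logging_aspect
    class Adder:                      # stands in for a simple hardware model
        def compute(self, a, b):
            return a + b

    Adder().compute(2, 3)             # advice runs around the original method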