89 results for Modelo de Processamento de Informação Humano (MPHI)
Abstract:
The evolution of semiconductor technologies has led to devices with ever higher processing capability, and these components are now widely used across many fields. Industrial environments such as oil, mining, automotive, and hospital facilities frequently rely on such devices in their processes. Because the activities of these industries bear directly on environmental and occupational safety, it is quite important that these systems include additional safety features that yield greater reliability, safety, and availability. The eOSI reference model presented in this work is intended to allow systems to be developed from a new perspective, one that can simplify and improve the choice of fault-tolerance strategies. As a way to validate the model, an FPGA-based architecture was developed.
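The abstract does not detail which fault-tolerance strategies the eOSI model covers, so the following is only an illustrative sketch of one classic strategy such a reference model would help a designer choose among: triple modular redundancy (TMR) with a majority voter, written here in Python purely for illustration rather than in an FPGA hardware-description language.

```python
# Illustrative sketch only: the eOSI model's internals are not given in
# the abstract. TMR runs three redundant copies of a module and votes
# bitwise on their outputs, masking a fault in any single copy.

def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote over three redundant module outputs."""
    return (a & b) | (a & c) | (b & c)

# Example: module b suffers a single-bit upset; the voter masks it.
assert tmr_vote(0b1010, 0b1110, 0b1010) == 0b1010
```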
Abstract:
Since equipment maintenance is the major cost factor in industrial plants, the development of fault-prediction techniques is very important. Three-phase induction motors are key electrical equipment in industrial applications, mainly because of their low cost and great robustness; they are nevertheless not protected from fault types such as winding short circuits and broken rotor bars. Several acquisition, processing, and signal-analysis approaches are applied to improve their diagnosis, and the most efficient techniques use current sensors and analyze the current signature. In this dissertation, starting from these sensors, signal analysis is performed through Park's vector, which provides good visualization capability. Since acquiring fault data is an arduous task, a methodology for building a fault database is developed. Park's transform in the stationary reference frame is applied to model the machine by solving its differential equations. Fault detection requires a detailed analysis of the variables and their influences, which makes diagnosis more complex. Pattern-recognition techniques allow classification systems to be generated automatically from patterns and concepts in the data, in most cases undetectable by specialists, thereby supporting decision tasks. Classification algorithms with diverse learning paradigms (k-Nearest Neighbors, Neural Networks, Decision Trees, and Naïve Bayes) are used for pattern recognition of machine faults. Multi-classifier systems are used to reduce classification error: the homogeneous algorithms Bagging and Boosting and the heterogeneous ones Vote, Stacking, and StackingC are investigated. The results show the effectiveness of the constructed model for fault modeling, as well as the feasibility of using multi-classifier algorithms for fault classification.
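As a rough illustration of the Park's vector technique named above, here is a minimal sketch in Python; the thesis works from measured currents, whereas the signals below are synthetic. For a healthy, balanced motor the d-q locus is a circle, and winding or rotor-bar faults distort it into an ellipse, which is the visual signature used for diagnosis.

```python
# Minimal sketch of the Park's (Concordia) vector projection, assuming
# balanced three-phase stator currents ia, ib, ic (synthetic data).
import numpy as np

def parks_vector(ia, ib, ic):
    """Project three-phase currents onto the stationary d-q plane."""
    i_d = np.sqrt(2/3) * ia - ib / np.sqrt(6) - ic / np.sqrt(6)
    i_q = (ib - ic) / np.sqrt(2)
    return i_d, i_q

t = np.linspace(0, 0.1, 1000)
w = 2 * np.pi * 60
ia = np.cos(w * t)
ib = np.cos(w * t - 2 * np.pi / 3)
ic = np.cos(w * t + 2 * np.pi / 3)
i_d, i_q = parks_vector(ia, ib, ic)

# Healthy balanced currents trace a circle of radius sqrt(3/2).
assert np.allclose(i_d**2 + i_q**2, 1.5, atol=1e-9)
```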
Abstract:
This work aims at the implementation and adaptation of a computational model for studying the Fischer-Tropsch reaction in a slurry-bed reactor fed with synthesis gas (CO + H2) for the selective production of hydrocarbons (CnHm), with emphasis on evaluating the influence of operating conditions on the distribution of products formed during the reaction. The model accounts for rigorous phase-equilibrium effects in a reactive flash drum, a detailed kinetic model able to predict the formation of each chemical species in the reaction system, and control loops for the process variables pressure and slurry-phase level. The resulting system of differential-algebraic equations was solved using the computational code DASSL (Petzold, 1982). Consistent initialization of the problem was based on the phase equilibrium formed by the components present in the reactor, and the index of the system was reduced to 1 by introducing control laws that govern the outflow of reactor products. The results were compared qualitatively with experimental data collected at the Fischer-Tropsch synthesis plant installed at the Laboratório de Processamento de Gás - CTGÁS-ER, Natal/RN.
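A toy sketch of the semi-explicit index-1 DAE structure described above may help. DASSL integrates the fully implicit system F(t, x, x') = 0 with BDF methods; since SciPy has no DASSL-like solver, the sketch instead solves the algebraic "phase equilibrium" relation pointwise inside the ODE right-hand side, which is a valid (if less efficient) treatment for index-1 problems. All equations and constants here are invented placeholders, not the thesis model.

```python
# Toy semi-explicit index-1 DAE:  y' = f(y, z),  0 = g(y, z).
# The algebraic variable z (a stand-in "equilibrium" composition) is
# recovered by a root solve at every evaluation of the ODE RHS.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

K = 2.0   # hypothetical equilibrium constant
k = 0.5   # hypothetical rate constant

def algebraic(y):
    # 0 = g(y, z)  with  g(y, z) = z*(1 + y) - K*y,  i.e. z = K*y/(1+y)
    return brentq(lambda z: z * (1 + y) - K * y, 0.0, 10.0)

def rhs(t, state):
    y = state[0]
    z = algebraic(y)                  # enforce the algebraic constraint
    return [-k * y + 0.1 * (1 - z)]   # differential part y' = f(y, z)

sol = solve_ivp(rhs, (0.0, 20.0), [1.0], rtol=1e-8)
print(sol.y[0, -1])   # approaches the steady state of the coupled system
```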
Abstract:
During salt production, the first salt crystals formed are disposed of as industrial waste. This waste consists basically of gypsum, calcium sulfate dihydrate (CaSO4·2H2O), known locally as carago cru or malacacheta. After being submitted to calcination to produce hemihydrate (CaSO4·0.5H2O), it can be made suitable for application in the cement industry. This work aims to optimize the time and temperature of calcination of this gypsum (carago) to obtain a beta plaster that meets the specifications of civil-construction standards. The experiments involved the chemical and mineralogical characterization of the gypsum (carago) from the crystallizers, and of the plaster produced by the salt industry located in Mossoró, using the following techniques: X-ray diffraction (XRD), X-ray fluorescence (XRF), thermogravimetric analysis (TG/DTG), and scanning electron microscopy (SEM) with EDS. A three-level factorial design with response surfaces was used to optimize calcination time and temperature, taking compressive-strength tests and setting time as responses, following standard NBR 13207 (Plasters for civil construction), together with X-ray diffraction of the beta plasters obtained. The STATISTICA 7.0 software was used to fit the experimental data to a statistical model. Calcination was studied over the temperature range 120 °C to 160 °C and times of 90 to 210 minutes in an oven at atmospheric pressure. It was found that at 160 °C and 210 minutes the compressive-strength tests gave values above 10 MPa, conforming to the required standard (> 8.40 MPa), and the X-ray diffractograms showed the predominance of the beta-hemihydrate phase. The result is a good-quality beta plaster that complies with the standards in force, giving this by-product of the salt industry employability in civil construction.
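To make the factorial/response-surface step concrete, here is a minimal sketch of fitting the kind of quadratic model the abstract builds in STATISTICA: compressive strength as a function of calcination temperature T (°C) and time t (min). The 3x3 data below are invented placeholders, not the thesis measurements.

```python
# Quadratic response-surface fit over a hypothetical 3x3 factorial design.
import numpy as np

T = np.array([120, 120, 120, 140, 140, 140, 160, 160, 160], float)
t = np.array([ 90, 150, 210,  90, 150, 210,  90, 150, 210], float)
strength = np.array([6.1, 6.9, 7.4, 7.8, 8.6, 9.2, 8.9, 9.8, 10.4])  # MPa

# Design matrix for the full quadratic model with interaction term.
X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
coef, *_ = np.linalg.lstsq(X, strength, rcond=None)

def predict(T_, t_):
    return coef @ np.array([1.0, T_, t_, T_**2, t_**2, T_ * t_])

print(predict(160, 210))  # predicted strength at the harshest condition
```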
Abstract:
Environmental sustainability has become one of the topics of greatest interest in industry, mainly because of effluent generation. Phenols are found in the effluents of many industries, including refineries, coal processing, pharmaceutical, plastics, paints, and pulp-and-paper plants. Because phenolic compounds are toxic to humans and aquatic organisms, Federal Resolution CONAMA No. 430 of 13/05/2011 limits the maximum content of phenols to 0.5 mg·L-1 for release into freshwater bodies. In effluent treatment, liquid-liquid extraction is the most economical process for phenol recovery because it consumes little energy; in most cases, however, it employs an organic solvent whose high toxicity can itself cause environmental problems. Hence the need for new methodologies that replace these solvents with biodegradable ones. Literature studies demonstrate the feasibility of removing phenolic compounds from aqueous effluents with biodegradable solvents. In this kind of extraction, called cloud point extraction, a nonionic surfactant is used as the extracting agent for the phenolic compounds. In order to optimize the phenol extraction process, this work studies the mathematical modeling and optimization of the extraction parameters and investigates the effect of the independent variables on the process. A 3² full factorial design was carried out, with operating temperature and surfactant concentration as independent variables and the extraction parameters (volumetric fraction of the coacervate phase, residual concentrations of surfactant and phenol in the dilute phase after phase separation, and phenol extraction efficiency) as dependent variables. To achieve these objectives, the work was carried out in five steps: (i) selection of literature data; (ii) use of the Box-Behnken model to obtain mathematical models describing the phenol extraction process; (iii) data analysis with STATISTICA 7.0, using analysis of variance to assess model significance and predictive power; (iv) model optimization using the response-surface method; and (v) validation of the mathematical models using additional measurements from samples different from those used to build the models. The results showed that the mathematical models found can calculate the effect of surfactant concentration and operating temperature on each extraction parameter studied, within the boundaries used. Model optimization allowed consistent, applicable results to be obtained in a simple and quick way, leading to high efficiency in process operation.
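Step (iv) can be illustrated with a short sketch: once a response surface has been fitted, the optimum inside the experimental boundaries can be located numerically. The coefficients below are invented placeholders for an efficiency model E(T, C) in temperature T (°C) and surfactant concentration C (% w/w); the thesis fits its own coefficients from the 3² design.

```python
# Constrained optimization of a hypothetical fitted response surface.
import numpy as np
from scipy.optimize import minimize

b = np.array([-120.0, 4.2, 18.0, -0.03, -1.5, 0.05])  # hypothetical fit

def efficiency(x):
    T, C = x
    return b @ np.array([1.0, T, C, T**2, C**2, T * C])

# Maximize efficiency = minimize its negative, within the design bounds.
res = minimize(lambda x: -efficiency(x), x0=[60.0, 5.0],
               bounds=[(50.0, 80.0), (2.0, 10.0)])
print(res.x, efficiency(res.x))  # optimal (T, C) and predicted efficiency
```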
Abstract:
This study deals with the production and distribution of drinking water with quality and safety, in order to meet human needs. It points out the limitations of the water-quality assessment methodology in use today and discusses the recommendations of the World Health Organization (WHO) that the companies responsible for producing and distributing water adopt risk assessment and risk management methodologies (HACCP) to ensure the quality and safety of drinking water. It then suggests strategies for implementing a water safety plan, using as a case study the water production process composed of the Maxaranguape river basin, the water treatment plant, and the distribution system, which is part of the Plan for Expansion of the Supply System of Natal. From the results it was possible to devise strategies for implementing the Water Safety Plan (WSP), comprising the following steps: a) a preliminary stage; b) system assessment; c) process monitoring; d) management plan; and e) validation and verification of the WSP. Actions for implementation are included at each stage. The implementation of the WSP points to a new way of producing water, in which the source as a whole (watershed and abstraction point), the water treatment plant (WTP), and the distribution system together compose the production process along which the quality and safety of the final product (drinking water) are built.
Abstract:
This thesis reflects upon the question of how philosophy thinks its particular today, which is not only a legitimate philosophical task but a determinative characteristic of philosophy in general. The thought of the today follows two paths: first, a hermeneutical-phenomenological analysis of Martin Heidegger's thought with regard to his own contemporaneity; and second, an analysis of the contemporary phenomenon of Information Technology, which the present work takes to be a privileged sign of our times and distinctive of the mindfulness of philosophy. The starting point is therefore an investigation of Heidegger's thinking on his own era, for whom facticity is a way of accessing the fundamental question of philosophy. The thesis is led by three guiding words that hold a perspective of unity across Heidegger's lifetime of work (1. technicity, 2. history, 3. language) in order to develop a characterization of human existence as 1. technopolitical, 2. technoscientific, and 3. technological. Finally, in keeping with this triangular characterization of the human, a philosophical comprehension of our times is established and drawn from Information Technology, illustrating three of its factual signs, understood here as the 'Remains of Being' today: the emptying of speech (language), the emptying of science (history), and the emptying of the object (technicity). Through these present-day phenomena it remains possible to hold onto the fundamental question precisely when the task of philosophy seems to have peremptorily lost its meaning and come to its logical end, and to show that philosophizing in the information era is as possible as it is necessary.
Abstract:
This thesis is interventionist in nature and aims to create an alternative way of controlling and evaluating the implementation of public policies developed at the Institute for Technical Assistance and Rural Extension of Rio Grande do Norte State. The setting is a public institution belonging to the Rio Grande do Norte state government, and the study adopts the design science research methodology, generating a set of artifacts that guide the development of a computerized information system. To ground the decisions, the literature was reviewed to highlight the concepts used as the basis for the intervention. The use of Iconix, an effective systems-analysis methodology, provides a software development process within a short time frame. As a result of the many artifacts created by the methodology, a software system was produced that runs in the Internet environment with G2C (government-to-citizen) behavior; it is suggested as a management tool for monitoring the artifacts generated by the various methods. Moreover, the work reveals barriers faced in the public-company environment, such as lack of infrastructure, resistance from the workforce, and the behavior of executives.
Abstract:
Information technology (IT) has, over the years, gained prominence as a strategic element and a competitive edge in organizations, public or private. In the judiciary, with the implementation of actions related to the Judiciário Eletrônico, IT has definitively earned its status as a strategic element and significantly raised the dependence of the agencies on its services and products. Increasingly, the quality of the services provided by IT directly affects the quality of the services provided by the agency as a whole. At the Ministério Público do Estado do Rio Grande do Norte (MPRN), the deployment of electronic-government initiatives, together with an administrative reform, caused a large increase in institutional demand for the products and services provided by the Diretoria de Tecnologia da Informação (DTI), the sector responsible for IT services. Taking as its starting point the strategic goal set by the MPRN of reaching an 85% level of user satisfaction within four years, this work proposes a method to help meet that goal while respecting the capacity constraints of the IT sector. The work was conducted in two distinct and complementary stages. In the first, a case study was carried out at the MPRN in which, through an internal and external diagnosis of the DTI, accomplished by an internal consulting action and a user-satisfaction survey, we sought to identify opportunities for change that would raise the perceived quality of the services provided by the DTI from the viewpoint of its customers. The situational report drawn from the collected data fostered changes in the DTI, which were then evaluated with the managers. In the second stage, building on the results of the initial process, empirical observation, the evaluation of side projects for quality improvement in the sector, and validation of the initial model with the managers, we developed an improved process that goes beyond the identification of service gaps to a strategy for selecting best management practices and deploying them incrementally and adaptively, allowing the process to be applied in agencies with few staff allocated to the provision of information technology services.
Abstract:
The Family Health Program (FHP) was created in the 1990s with the objective of changing the health care model through a restructuring of primary care. Oral health was officially incorporated into the FHP largely through the efforts of dental professionals, and was seen as a way to break with oral health care models that were curative, technical-biological, and inequitable. Despite the fast expansion of FHP oral health teams, it is essential to ask whether changes are really occurring in the oral health model of the municipalities. The purpose of this study is therefore to evaluate the incorporation of oral health teams into the Family Health Program, analyzing the factors that may interfere positively or negatively with the implementation of this strategy and, consequently, with the process of changing oral health care models in the National Health System in the state of Rio Grande do Norte, Brazil. The evaluation involves three dimensions: access, work organization, and planning strategies. For this purpose, 19 municipalities, geographically distributed according to Regional Public Health Units (RPHU), were randomly selected. The data-collection instruments were structured interviews with supervisors and dentists, structured observation, documentary research, and data from national health databases. It was possible to identify critical points that may be impeding the implementation of oral health in the FHP, such as low incomes, lack of legal employment contracts, and difficulty in referring patients for high-complexity procedures, in developing intersectoral actions, and in carrying out program strategies such as epidemiological diagnosis and evaluation of the new actions. Most municipalities showed little or no improvement in oral health care after incorporating the new model into the FHP, and all of them had failures in most of the aspects mentioned above. Furthermore, these municipalities are similar in other respects, such as low educational levels in children from 7 to 14 years of age, high child mortality rates, and wide social inequalities. On the other hand, the five municipalities whose oral health improved, according to the categories analyzed, offered better living conditions to the population, with higher life expectancy, low infant mortality rates, per capita income among the highest in the state, and high mean Human Development Index (HDI) values. It is therefore possible to conclude that public policies reaching beyond the health sector are decisive for a real change in health care models.
Abstract:
Hebb postulated that memory could be stored through the synchronous activity of many neurons forming a neural assembly. Given the importance of the hippocampus for the formation of new explicit memories, we used electrophysiological recordings of multiple neurons to assess the relevance of rate coding (neural firing rates) compared with the temporal coding of neural-assembly activity in the consolidation of an aversive memory in rats. Animals were trained on a discriminative avoidance task in a modified elevated plus-maze, and slow-wave sleep (SWS) periods were recorded during the experimental sessions. Our results show an increase in the activity of the identified neural assemblies during post-training SWS, but no corresponding change in neural firing rates. In summary, we demonstrate that for this particular task the information relevant to proper memory consolidation lies in the temporal patterns of synchronized neural activity, not in the firing rate.
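The abstract does not name its assembly-identification method, so the following is a minimal sketch of one standard approach from the literature (PCA on z-scored binned spike counts, keeping components whose eigenvalues exceed the Marchenko-Pastur bound), applied to synthetic data with one embedded assembly.

```python
# Detect co-activating neuron groups ("assemblies") in binned spike counts.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_bins = 20, 5000
spikes = rng.poisson(1.0, (n_neurons, n_bins)).astype(float)

# Embed a synthetic assembly: neurons 0-4 co-activate in ~5% of bins.
coactive = rng.random(n_bins) < 0.05
spikes[:5, coactive] += rng.poisson(3.0, (5, coactive.sum()))

z = (spikes - spikes.mean(axis=1, keepdims=True)) \
    / spikes.std(axis=1, keepdims=True)
corr = (z @ z.T) / n_bins                 # neuron-by-neuron correlation
eigvals, eigvecs = np.linalg.eigh(corr)   # ascending eigenvalues

# Components above the Marchenko-Pastur bound signal real correlations.
lambda_max = (1 + np.sqrt(n_neurons / n_bins)) ** 2
n_assemblies = int((eigvals > lambda_max).sum())
print("significant assemblies:", n_assemblies)        # expect >= 1
print("top pattern weights:", np.round(eigvecs[:, -1], 2))
```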
Abstract:
We have recently verified that the monoamine-depleting drug reserpine, at doses that do not modify motor function, impairs memory in a rodent model of aversive discrimination. In this study, the effects of reserpine (0.1-0.5 mg/kg) on the performance of rats in object recognition, spatial working memory (spontaneous alternation), and emotional memory (contextual fear conditioning) tasks were investigated. While object recognition and spontaneous alternation were not affected by reserpine treatment, contextual fear conditioning was impaired. Together with previous studies, these results suggest that mild monoamine depletion preferentially induces deficits in tasks involving emotional contexts. Possible relationships with the cognitive and emotional processing deficits of Parkinson's disease are discussed.
Abstract:
Nowadays, many electronic devices support digital video: cellphones, digital cameras, camcorders, and digital televisions, among others. Raw video, however, represents a huge amount of data (millions of bits) when kept as captured; storing it in this primary form would require enormous disk space, and transmitting it would require enormous bandwidth. Video compression therefore becomes essential for storing and transmitting this information. Motion estimation is a technique used in the video coder that exploits the temporal redundancy present in video sequences to reduce the amount of data needed to represent the information. This work presents a hardware architecture of a motion-estimation module for high-resolution video according to the H.264/AVC standard, the most advanced video coding standard, whose many new features allow it to achieve high compression rates. The architecture presented here was designed to provide high data reuse, and the adopted data-reuse scheme reduces the memory bandwidth required to execute motion estimation. Since motion estimation is responsible for the largest share of the gains obtained with H.264/AVC, this module is essential for the final coder's performance. This work is part of the Rede H.264 project, which aims to develop Brazilian technology for the Brazilian Digital Television System.
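For readers unfamiliar with motion estimation, here is a minimal software sketch of the underlying block-matching algorithm, full search minimizing the sum of absolute differences (SAD). The thesis describes a hardware architecture; this Python version only illustrates the computation whose reference-window pixels such architectures reuse across overlapping candidate blocks to cut bandwidth.

```python
# Full-search block matching with SAD, on synthetic frames.
import numpy as np

def full_search(ref, cur, by, bx, block=16, search=8):
    """Best motion vector for the block at (by, bx) of `cur` in `ref`."""
    target = cur[by:by + block, bx:bx + block].astype(int)
    best, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] \
                    or x + block > ref.shape[1]:
                continue  # candidate window falls outside the frame
            cand = ref[y:y + block, x:x + block].astype(int)
            sad = np.abs(target - cand).sum()
            if best is None or sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv, best

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, (2, -3), axis=(0, 1))  # simulate global motion
print(full_search(ref, cur, 16, 16))      # expect mv (-2, 3) with SAD 0
```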
Abstract:
Nowadays, the way information assets are handled is the main factor not only for success but for the very survival of companies in the globalized world. The number of information security incidents has grown in recent years, and establishing information security policies that keep the security requirements of assets at the desired levels is a top priority for companies. This dissertation proposes a unified process for the elaboration, maintenance, and development of information security policies, the Processo Unificado para Políticas de Segurança da Informação (PUPSI). The elaboration of this proposal began with the construction of a body of knowledge based on official documents and standards about security policies and information security published over the last two decades. Based on the examined documents, the model defines the security policies an organization needs to establish, their workflow, and the hierarchical sequence among them; a model of the entities participating in the process is also provided. Because the problem treated by the model, which encompasses all the security policies a company must have, is highly complex, PUPSI takes an iterative and incremental approach. This approach was obtained by instantiating RUP (Rational Unified Process), Rational Software's (IBM group) platform for object-oriented software development, which embodies best practices recognized by the market. From RUP, PUPSI inherits a process structure that offers functionality, ease of dissemination and comprehension, and performance and agility in process adjustment, along with the capacity to adapt to technological and structural changes in the market and in the company.
Abstract:
Multimedia systems must incorporate middleware concepts in order to abstract away hardware and operating-system issues. Applications in these systems may execute on different kinds of platforms, and their components need to communicate with each other. In this context, specific communication mechanisms must be defined for the transmission of information flows. This work presents an interconnection-component model for distributed multimedia environments, together with its implementation details. The model offers specific communication mechanisms for transmitting information flows between software components, taking into account the requirements of the Cosmos framework in order to support dynamic component reconfiguration.
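The Cosmos framework's actual API is not given in the abstract, so the following is only an illustrative sketch of what an interconnection component can look like: a small connector that buffers an information flow between a producer and a consumer component and allows the consumer to be rebound at run time, one simple flavor of dynamic reconfiguration. All class and method names here are hypothetical.

```python
# Hypothetical interconnection component decoupling two components.
import queue
import threading

class Connector:
    """Buffers an information flow between two software components."""
    def __init__(self):
        self._q = queue.Queue(maxsize=64)
        self._consumer = None
        self._lock = threading.Lock()

    def bind_consumer(self, callback):
        # Dynamic reconfiguration: rebind the receiving component.
        with self._lock:
            self._consumer = callback

    def push(self, frame):
        # Producer side: enqueue one unit of the information flow.
        self._q.put(frame)

    def pump(self):
        # Deliver buffered frames to whichever consumer is currently bound.
        while not self._q.empty():
            frame = self._q.get()
            with self._lock:
                if self._consumer:
                    self._consumer(frame)

conn = Connector()
conn.bind_consumer(lambda f: print("render", f))
conn.push("frame-1")
conn.pump()
```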