Abstract:
This work discusses environmental management on the basis of the ISO 14001 standard and the learning organization concept. The study is an exploratory survey carried out in a fuel transport company located in Natal/RN, Brazil. Its objective was to investigate the environmental management practices carried out within the company's implemented ISO 14001 environmental management system, from the perspective of the learning organization. The methodology is quantitative, combining exploratory and descriptive approaches, and uses questionnaires applied to the company's managers, coordinators, supervisors, and employees, both in-house and contracted. The data were analyzed with Excel and Statistica 6.0 software, in two parts: descriptive analysis and cluster analysis. Based on the theory studied and on the survey results, the ISO 14001 system implemented in the organization presents elements that promote the learning organization. The results show that the company uses external information when making decisions about environmental problems; that employees are encouraged to generate ideas and collect environmental information; and that the company has established partnerships with other companies for activities in the environmental area. All of these items can contribute to the generation of organizational knowledge. It can also be concluded that the company evaluates environmental errors that occurred in the past and carries out environmental benchmarking, practices that are good ways for the company to acquire knowledge.
The results also show that employees have no difficulty accomplishing their tasks when their sector manager is absent, which suggests that the company has a good diffusion of knowledge
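As a purely illustrative sketch of the cluster analysis the abstract mentions (the thesis itself used Statistica 6.0, not code; the data below are hypothetical questionnaire scores), a minimal k-means grouping could look like this:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal k-means: assign each point to its nearest centroid,
    then recompute centroids, until assignments stabilize."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # distance of every point to every centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# hypothetical questionnaire scores (rows: respondents, cols: two Likert items)
X = np.array([[1.0, 1.2], [0.9, 1.1], [1.1, 0.8],
              [4.8, 5.0], [5.1, 4.9], [4.9, 5.2]])
labels, centroids = kmeans(X, k=2)
```

With two well-separated respondent groups, the algorithm recovers the grouping regardless of which points seed the centroids.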
Abstract:
This work investigates the importance of eco-materials applied in civil construction and the real estate market's need for knowledge about them, showing the importance of applying recycled products within the broader scope of sustainable development and environmental management. The theoretical framework addresses recycled and ecological products, demonstrating the applicability of this type of product in the civil construction sector, along with its economic and social aspects. The main purpose regarding the real estate sector is to assess awareness in the negotiation of properties built with these products: such properties are already traded in the civil construction market, often commercialized by real estate agencies and brokers who lack deeper knowledge of these materials. This evidence was gathered through a questionnaire and analyzed statistically. We present the statistical results of 142 questionnaires applied in a universe of 145 real estate agencies in Natal/RN. From this, we may say that there is today a very strong concern with environmental legislation and with the environmental impact generated by civil construction, and that the real estate sector feels the need to take part in this process. The real estate market in our state is expanding and sensitive to the need for change, since Natal/RN lies on international tourism routes that demand professionals with a globalized knowledge of the property business. Understanding environmental legislation and the application of eco-materials in construction will therefore provide a better quality of life while protecting nature
Abstract:
This research aimed to identify how the quality of the services provided by Casa de Apoio à Criança com Câncer Durval Paiva is perceived by its users, giving the institution an opportunity to improve its performance in social service provision. By having the failures experienced pointed out, the institution gains the user as an important partner in identifying shortcomings, providing a basis for corrective and improvement actions. The implementation of this work yields contributions to both theory and practice, enabling progress and enrichment on the subject. The theoretical contribution lies in advancing the models developed for the third sector. The work also raises awareness of the full potential of the social economy with regard to the quality of services provided by its organizations, allowing a better definition of development priorities. The study addressed three issues: identifying the people who receive support from Casa Durval Paiva; identifying the level of satisfaction of the families served; and evaluating which services provided by Casa Durval Paiva demand improvement in the perception of the families assisted. It was found that the institution has a multidisciplinary team with a high level of professionalism, supervised students from several educational institutions, and many volunteers who complement the actions of the individual professionals. A high level of user satisfaction with the services provided by Casa Durval Paiva was measured
Abstract:
Portfolio theory is a field of study devoted to investigating resource allocation decisions by investors. Its purpose is to reduce risk through diversification while guaranteeing a return. Nevertheless, the classical Mean-Variance (MV) model has been criticized with regard to its parameters: the use of variance and covariance makes it sensitive to the market and to parameter estimation errors. To reduce estimation errors, Bayesian models offer more flexibility in modeling, being able to incorporate quantitative and qualitative information about market behavior. Observing this, the present study formulates a new matrix model that uses Bayesian inference to replace the covariance matrix in the MV model, called MCB (Bayesian Covariance Model). To evaluate the model, hypotheses were analyzed using an ex post facto method and sensitivity analysis. The benchmarks used as reference were: (1) the classical Mean-Variance model, (2) the Bovespa market index, and (3) 94 investment funds. The returns earned from May 2002 to December 2009 demonstrated the superiority of the MCB over the classical MV model and the Bovespa index, although it takes slightly more diversifiable risk than the MV. A robustness analysis over the time horizon found returns close to the Bovespa index while taking less risk than the market. Finally, with respect to the Mao index, the model showed satisfactory return and risk, especially at longer maturities. Final considerations are made, along with suggestions for further work
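To make the classical MV benchmark concrete, here is a minimal sketch of the global minimum-variance portfolio, the textbook closed form w = Σ⁻¹1 / (1ᵀΣ⁻¹1); the covariance matrix below is hypothetical, and this is not the thesis's MCB model:

```python
import numpy as np

# hypothetical annualized covariance matrix for three assets
cov = np.array([[0.10, 0.02, 0.04],
                [0.02, 0.08, 0.01],
                [0.04, 0.01, 0.09]])

# global minimum-variance portfolio: w = inv(cov) @ 1, normalized to sum to 1
ones = np.ones(len(cov))
w = np.linalg.solve(cov, ones)
w /= w.sum()

port_var = w @ cov @ w  # resulting portfolio variance
```

Because any single-asset portfolio is feasible, the resulting variance is never worse than the least risky individual asset; a Bayesian treatment such as the MCB replaces `cov` with a posterior estimate to dampen estimation error.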
Abstract:
In the current conjuncture, the environmental factor has been changing the posture of companies, which are now practicing or at least minimally adopting environmental management. This tool has been used by companies to face the problems caused by solid waste, in particular green coconut waste, which is constantly among the materials discarded by society (companies/consumers). The green coconut is a typical tropical fruit whose fresh water is very beneficial to human health, and its popularization has caused a progressive increase in consumption. Along these lines, the present work carries out an analysis of strengths, weaknesses, opportunities, and threats (SWOT analysis) of green coconut solid waste management at two agribusiness companies in the state of Rio Grande do Norte (RN), Brazil, aiming to identify the challenges and the potential of this kind of waste. Regarding the approach to the problem, this work is a descriptive, exploratory, and qualitative study. Data were collected through a questionnaire and a structured interview, in order to evaluate the strategic posture of the agribusiness companies through SWOT analysis, an English acronym for Strengths, Weaknesses, Opportunities, and Threats. The SWOT analysis is an effective tool for analyzing the internal and external environment of an organization: it helps to position the company in its environment and, when well applied, enables the detection of mistakes, the strengthening of correct procedures, the avoidance of threats, and the pursuit of opportunities. The agribusinesses studied have very similar profiles, such as a long business life span and a strategy of extending the useful life of the fruit by using its waste to manufacture new by-products. In both, the daily quantity of waste resulting from this process reaches approximately 20 thousand units of fruit in high season, making a focus on the use and/or treatment of this waste necessary.
The SWOT analysis further showed that agribusiness company A follows a defensive marketing strategy and acts vulnerably, in other words, it is unable to act in this market segment, having decided to stop using the waste due to a lack of equipment and technology. Agribusiness company B, on the other hand, has adopted an offensive marketing strategy: even without adequate equipment, technology, and internal installations, it still insists on the use and benefits of green coconut waste in its agribusiness. It is thus considered that green coconut waste management has the potential to yield several by-products, reducing the impacts of inappropriate disposal and generating profits in the short, medium, and long term. These profits are both tangible and intangible, since interest in sustainability actions is not only a matter of return on capital but an important condition for staying in business: having quality products and processes is no longer enough, and socio-environmental practices must be established, since the company's image plays a prevailing role in consumers' buying decisions
Abstract:
Simple linear regression equations were obtained to estimate the body chemical composition of Santa Gertrudis cattle from the chemical and physical composition of the 9-10-11th rib cut. Fifteen young bulls, between nine and 15 months of age and from 220 to 505 kg of body weight, were kept in confinement. The animals were slaughtered after a complete 18-hour fast, six of them slaughtered after the adaptation period. The chemical composition in water, protein, ether extract, and minerals was determined in the rib cut and in samples obtained after complete grinding and homogenization of all body tissues, divided into blood, hide, head + feet, viscera, and carcass. The physical composition of the rib cut was obtained by manual separation of muscle, fat, and bone. Empty body weight was highly correlated with hot carcass weight (r² = 0.99). The percentages of water and ether extract in the 9-10-11th rib cut were highly correlated with the chemical composition of the empty body, which did not occur for the percentages of protein and minerals; these contents were calculated from the composition of the fat-free empty body. The physical composition of the rib cut was efficient for estimating the percentages of water, ether extract, and minerals in the empty body, using the percentage of separable fat in the ribs, but not for estimating protein content. The physical composition of the rib cut proved to be an efficient technique, but the chemical composition showed higher coefficients of determination and smaller estimation errors. Since the percentage of water in the empty body and in the rib cut (r² = 0.95), and the percentages of water and ether extract in the empty body (r² = 0.94), were highly correlated, the percentage of water in the 9-10-11th rib cut could be used as the single variable for estimating body chemical composition.
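The estimation technique above is ordinary simple linear regression; as a minimal sketch with entirely hypothetical numbers (not the thesis data), predicting empty-body water percentage from rib-cut water percentage could look like this:

```python
import numpy as np

# hypothetical pairs: water % in the 9-10-11th rib cut vs. water % in the empty body
rib_water  = np.array([55.0, 57.5, 60.0, 62.5, 65.0, 67.5])
body_water = np.array([54.2, 56.9, 59.1, 61.8, 64.0, 66.7])

# ordinary least squares fit: body_water ≈ a + b * rib_water
b, a = np.polyfit(rib_water, body_water, 1)  # polyfit returns [slope, intercept]

pred = a + b * rib_water
ss_res = ((body_water - pred) ** 2).sum()
ss_tot = ((body_water - body_water.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot  # coefficient of determination
```

A high r², as reported in the abstract for the water variables, is what justifies using a single rib-cut measurement as the predictor.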
Abstract:
The use of maps obtained from orbital remote sensing images submitted to digital processing has become fundamental to optimize conservation and monitoring actions for coral reefs. However, the accuracy achieved in mapping submerged areas is limited by variation in the water column, which degrades the signal received by the orbital sensor and introduces errors into the final classification result. The limited capacity of traditional methods based on conventional statistical techniques to solve problems of inter-class confusion motivated the search for alternative strategies in the area of Computational Intelligence. In this work, an ensemble of classifiers was built from the combination of Support Vector Machines and a Minimum Distance Classifier, with the objective of classifying remotely sensed images of a coral reef ecosystem. The system is composed of three stages, through which the classification process is progressively refined: patterns that receive an ambiguous classification at one stage are re-evaluated at the subsequent stage, and an unambiguous prediction for all data is reached by reducing or eliminating false positives. The images were classified into five bottom types: deep water, underwater corals, inter-tidal corals, algal bottom, and sandy bottom. The highest overall accuracy (89%) was obtained with an SVM using a polynomial kernel. The accuracy of the classified image was compared, using an error matrix, to the results obtained by classification methods based on a single classifier (a neural network and the k-means algorithm). The comparison of results demonstrated the potential of ensemble classifiers as a tool for classifying images of submerged areas subject to noise caused by atmospheric effects and the water column
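A minimal sketch of the staged idea described above, under simplifying assumptions (a linear decision score stands in for the SVM stage, the data and the ambiguity threshold are hypothetical, and only two classes and two stages are shown instead of five classes and three stages):

```python
import numpy as np

def min_distance_classify(x, centroids):
    """Minimum Distance Classifier: label of the nearest class centroid."""
    d = np.linalg.norm(centroids - x, axis=1)
    return int(d.argmin())

def two_stage_classify(X, w, b, centroids, margin=0.5):
    """Stage 1: linear decision score (SVM-like). Patterns whose score falls
    inside the ambiguity band |score| < margin are re-evaluated in stage 2
    by the Minimum Distance Classifier."""
    labels = []
    for x in X:
        score = x @ w + b
        if abs(score) >= margin:
            labels.append(0 if score < 0 else 1)
        else:  # ambiguous: defer to the next stage
            labels.append(min_distance_classify(x, centroids))
    return labels

# hypothetical 2-D features: class 0 around (0,0), class 1 around (3,3)
centroids = np.array([[0.0, 0.0], [3.0, 3.0]])
w, b = np.array([1.0, 1.0]), -3.0       # stage-1 hyperplane x + y = 3
X = np.array([[0.2, 0.1],   # clearly class 0
              [3.1, 2.9],   # clearly class 1
              [1.4, 1.7]])  # near the boundary -> re-evaluated in stage 2
labels = two_stage_classify(X, w, b, centroids)
```

The design choice is the same as in the ensemble: a confident first-stage decision is final, and only ambiguous patterns pay the cost of re-evaluation.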
Abstract:
In order to estimate genetic parameters in cattle, records were used of weights standardized at 120, 210, 365, 450, and 550 days of age (P120, P210, P365, P450, and P550), hip height measured near yearling age (ALT), and scrotal circumferences (CE) standardized at 365, 450, and 550 days of age (CE365, CE450, and CE550). The data came from male and female animals born between 1998 and 2003 on ten farms in six Brazilian states. (Co)variance components were estimated by REML in single-, two-, and three-trait analyses using animal models. The direct heritability estimates, with their respective standard errors, were: ALT 0.63 (0.09), P120 0.25 (0.03), P210 0.34 (0.03), P365 0.45 (0.04), P450 0.48 (0.04), P550 0.49 (0.04), CE365 0.48 (0.04), CE450 0.53 (0.04), and CE550 0.42 (0.09). The genetic correlations between ALT and P120, P210, P365, P450, and P550 were 0.68, 0.64, 0.53, 0.58, and 0.59, respectively. The genetic associations of P120 with CE adjusted for weight and age were close to zero; however, these correlations were positive and moderate when CE was adjusted only for age. The genetic correlations of ALT with CE, when adjusted for weight and age, were -0.19 (CE365), -0.24 (CE450), and 0.00 (CE550). Using a model that did not include the animal's weight as a covariate, the genetic correlations of CE with ALT were 0.21 (CE365), 0.12 (CE450), and 0.39 (CE550). These estimates indicate that growth traits and CE show genetic variability in the Nelore breed and can be included in breeding programs, and that selection for weight at any age should increase the animals' stature. Therefore, to obtain animals with size and weight suited to the production system, a selection index combining these traits is necessary.
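The heritabilities quoted above are ratios of variance components; as a one-line reminder of the definition (the variance components below are hypothetical, not the thesis's REML estimates):

```python
def heritability(var_additive, var_residual):
    """Narrow-sense heritability: the share of phenotypic variance
    explained by additive genetic effects."""
    var_phenotypic = var_additive + var_residual
    return var_additive / var_phenotypic

# hypothetical variance components for a weight trait
h2 = heritability(var_additive=45.0, var_residual=55.0)
```

A value such as 0.45 for P365 means that almost half of the observed variation in that weight is attributable to additive genetic effects, which is why these traits respond well to selection.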
Abstract:
This work analyzes the behavior of the gas flow of plunger lift wells producing to well-testing separators on offshore production platforms, aiming at a technical procedure to estimate the gas flow rate during the slug production period. The motivation for this work arose from wells equipped with the plunger lift method by PETROBRAS in the Ubarana sea field, located off the coast of Rio Grande do Norte State, where the produced fluids are measured in well-testing separators on the platform. Plunger lift is an artificial oil lift method used when the available reservoir energy is not high enough to overcome all the load losses necessary to lift the oil from the bottom of the well to the surface continuously. It consists, basically, of a free piston acting as a mechanical interface between the formation gas and the produced liquids, greatly increasing the well's lifting efficiency. A pneumatic control valve mounted on the flow line controls the cycles: when this valve opens, the plunger moves from the bottom of the well to the surface, lifting all the oil and gas above it until it reaches the well-test separator, where the fluids are measured. The well-test separator measures all the volumes produced by the well during a certain period of time, called a production test. In most cases, separators are designed to measure stabilized flow, in other words, reasonably constant flow, through level and pressure electronic controllers (PLC) and the assumption of a steady pressure inside the separator. With plunger lift wells, however, the liquid and gas flows at the surface are cyclical and unstable, causing slugs inside the separator, mainly in the gas phase, which introduce significant errors into the measurement system (e.g., overrange errors).
The gas flow analysis proposed in this work is based on two mathematical models used together: (i) a plunger lift well model proposed by Baruzzi [1], with later modifications by Bolonhini [2] to build a plunger lift simulator; and (ii) a two-phase (gas + liquid) separator model derived from the three-phase (gas + oil + water) separator model proposed by Nunes [3]. Based on these models, and with field data collected from the well-test separator of the PUB-02 platform (Ubarana sea field), it was possible to demonstrate that the output gas flow of the separator can be estimated, with reasonable precision, from the control signal of the Pressure Control Valve (PCV). Several models from the MATLAB® System Identification Toolbox were analyzed to evaluate which one best fit the data collected in the field. For model validation, the AIC criterion was used, as well as a variant of the cross-validation criterion. The ARX model showed the best fit to the data, so a recursive algorithm (RARX) was also evaluated with real-time data. The results were quite promising, indicating the viability of estimating the output gas flow rate of a plunger lift well producing to a well-test separator from the built-in information of the control signal to the PCV
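The ARX structure mentioned above regresses the current output on past outputs and inputs and is fitted by least squares. A minimal sketch (the thesis used the MATLAB® System Identification Toolbox; here the "PCV control signal" and the system parameters are simulated, noise-free, purely for illustration):

```python
import numpy as np

# simulate a first-order system y(t) = 0.8*y(t-1) + 0.5*u(t-1)
rng = np.random.default_rng(1)
u = rng.standard_normal(200)          # input: hypothetical PCV control signal
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1]

# ARX(1,1) fit: regress y(t) on [y(t-1), u(t-1)] by least squares
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a1, b1 = theta
```

With noise-free data the true coefficients are recovered exactly; with field data, criteria such as AIC and cross-validation decide the model order, and a recursive variant (RARX) updates `theta` sample by sample for real-time use.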
Abstract:
This work presents a set of intelligent algorithms designed to correct calibration errors in sensors and to reduce the required frequency of their calibrations. The algorithms were designed using Artificial Neural Networks, due to their great capacity for learning, adaptation, and function approximation. Two approaches are shown. The first uses Multilayer Perceptron networks to approximate the various shapes of the calibration curve of a sensor that drifts out of calibration at different points in time. This approach requires knowledge of the sensor's operating time, but this information is not always available. To overcome this need, a second approach using Recurrent Neural Networks was proposed. Recurrent Neural Networks have a great capacity for learning the dynamics of the system on which they were trained, so they can learn the dynamics of a sensor's decalibration. Knowing either the sensor's operating time or its decalibration dynamics, it is possible to determine how far out of calibration a sensor is and to correct its measured value, thus providing a more exact measurement. The algorithms proposed in this work can be implemented in a Foundation Fieldbus industrial network environment, whose function blocks offer good device programmability, making it possible to apply them in the measurement process
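To illustrate the first approach in miniature, the sketch below trains a tiny one-hidden-layer MLP to map a drifted sensor reading back to the true value; the drift model (reading = 1.1·true + 0.2), the network size, and the training settings are all hypothetical, and this is not the thesis's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical drifted sensor: reads 1.1*true + 0.2; the network learns the inverse
true = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
read = 1.1 * true + 0.2

# one-hidden-layer MLP (tanh), trained by plain gradient descent
W1 = rng.standard_normal((1, 8)) * 0.5; b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)) * 0.5; b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, out0 = forward(read)
mse0 = ((out0 - true) ** 2).mean()   # error before training

lr = 0.05
for _ in range(3000):
    h, out = forward(read)
    err = out - true                        # gradient of the squared error
    gW2 = h.T @ err / len(read); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)        # backprop through tanh
    gW1 = read.T @ dh / len(read); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, out1 = forward(read)
mse1 = ((out1 - true) ** 2).mean()   # error after training
```

Once trained, `forward(reading)` returns a corrected estimate of the true quantity; the recurrent variant in the second approach would instead learn how this correction evolves over time.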
Abstract:
This work presents a cooperative navigation system for a humanoid robot and a wheeled robot using visual information, aiming to navigate the non-instrumented humanoid robot using information obtained from the instrumented wheeled robot. Although the humanoid has no sensors for its own navigation, it can be remotely controlled by infra-red signals. Thus, the wheeled robot can control the humanoid by positioning itself behind it and, through visual information, finding and navigating it. The location of the wheeled robot is obtained by merging information from odometry and landmark detection using the Extended Kalman Filter. The landmarks are visually detected and their features are extracted by image processing; the parameters obtained are used directly in the Extended Kalman Filter. Thus, while the wheeled robot locates and navigates the humanoid, it also simultaneously computes its own location and maps the environment (SLAM). Navigation is done through heuristic algorithms based on the errors between the actual and desired pose of each robot. The main contribution of this work is the implementation of a cooperative navigation system for two robots based on visual information, which can be extended to other robotic applications, such as controlling robots without interfering with their hardware or attaching communication devices
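As a minimal sketch of the Extended Kalman Filter idea used above, reduced to a 2-D position state with an odometry prediction and a range measurement to one known landmark (all noise values and poses below are hypothetical, and the full SLAM problem also estimates landmark positions):

```python
import numpy as np

# EKF sketch: state = robot position [x, y]; motion = odometry displacement;
# measurement = range to a known landmark (nonlinear, so h is linearized)
x = np.array([0.0, 0.0])          # state estimate
P = np.eye(2) * 1.0               # state covariance
Q = np.eye(2) * 0.01              # odometry noise
R = np.array([[0.05]])            # range-sensor noise
landmark = np.array([4.0, 3.0])

def predict(x, P, u):
    """Odometry step: move by displacement u, inflate uncertainty."""
    return x + u, P + Q

def update(x, P, z):
    """Fuse a range measurement to the known landmark."""
    diff = x - landmark
    r = np.linalg.norm(diff)
    H = (diff / r).reshape(1, 2)          # Jacobian of h(x) = ||x - landmark||
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + (K @ np.array([z - r])).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = predict(x, P, u=np.array([1.0, 1.0]))
trace_before = np.trace(P)
x, P = update(x, P, z=np.linalg.norm(np.array([1.0, 1.0]) - landmark))
trace_after = np.trace(P)
```

The measurement step always shrinks the uncertainty along the observed direction, which is what lets the wheeled robot keep a usable estimate of its own pose while it steers the humanoid.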
Abstract:
The development of wireless sensor networks for control and monitoring functions has created a vibrant investigation scenario, covering aspects ranging from communication to energy efficiency. When source sensors are endowed with cameras for visual monitoring, a new scope of challenges arises, as transmission and monitoring requirements change considerably. In particular, visual sensors collect data following a directional sensing model, altering the meaning of concepts such as vicinity and redundancy, but allowing source nodes to be differentiated by their sensing relevance to the application. In this context, we propose the combined use of two differentiation strategies as a novel QoS parameter, exploring the sensing relevance of source nodes and DWT image coding. This innovative approach supports a new scope of optimizations to improve the performance of visual sensor networks at the cost of a small reduction in the overall monitoring quality of the application. Besides defining a new concept of relevance and proposing mechanisms to support its practical exploitation, we propose five different optimizations in the way images are transmitted in wireless visual sensor networks, aiming at energy saving, low-delay transmission, and error recovery. Taken together, the proposed differentiation strategies and the related optimizations open a relevant research trend in which the application's monitoring requirements guide a more efficient operation of sensor networks
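The DWT coding mentioned above is what makes subband-level differentiation possible: most of an image's energy concentrates in the low-frequency (LL) subband, which can therefore be sent with higher priority. A minimal one-level Haar DWT sketch (the 4×4 "image" is hypothetical, and the thesis's actual transform and priority scheme may differ):

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar DWT: returns the LL, LH, HL, HH subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # vertical average
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # vertical detail
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

# hypothetical 4x4 image block from a visual sensor
img = np.array([[10.0, 10, 20, 20],
                [10, 10, 20, 20],
                [30, 30, 40, 40],
                [30, 30, 40, 40]])
LL, LH, HL, HH = haar2d(img)

# energy per subband: LL dominates, so it gets the highest transmission priority
energies = [float((s ** 2).sum()) for s in (LL, LH, HL, HH)]
```

Dropping or delaying the detail subbands of low-relevance sources degrades image quality only mildly while saving energy, which is the trade-off the abstract describes.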
Abstract:
Support Vector Machines (SVM) have attracted increasing attention in the machine learning area, particularly for classification and pattern recognition. However, in some cases it is not easy to determine accurately the class to which a given pattern belongs. This thesis involves the construction of an interval pattern classifier using SVM in association with interval theory, in order to model the separation of a pattern set into distinct classes with precision, aiming at an optimized separation capable of treating the imprecision contained in the initial data and generated during computational processing. The SVM is a linear machine; to allow it to solve real-world (usually nonlinear) problems, the pattern set, known as the input set, must be transformed so that the nonlinear problem becomes linear, a mapping for which the kernel machines are responsible. To create the interval extension of SVM, for both linear and nonlinear problems, it was necessary to define an interval kernel and to extend Mercer's theorem (which characterizes a kernel function) to interval functions
Abstract:
We propose a multi-resolution, coarse-to-fine approach to stereo matching, in which the initial matching happens at a different depth for each pixel. The proposed technique has the potential to attenuate several problems faced by constant-depth algorithms, making it possible to reduce the number of errors or the number of comparisons needed to obtain equivalent results. Several experiments were performed to demonstrate the method's efficiency, including a comparison with the traditional plain correlation technique, in which the multi-resolution matching with variable depth proposed here generated better results in less processing time
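The coarse-to-fine principle can be sketched in one dimension: a wide but cheap search at half resolution yields a rough disparity, which a narrow full-resolution search then refines. Everything below (signals, shift, search radii) is a hypothetical illustration, not the thesis's algorithm:

```python
import numpy as np

def best_shift(a, b, center, radius):
    """Shift of b that best matches a (minimum SSD) within [center-radius, center+radius]."""
    best, best_err = center, np.inf
    for s in range(center - radius, center + radius + 1):
        if s < 0 or s + len(a) > len(b):
            continue
        err = ((a - b[s:s + len(a)]) ** 2).sum()
        if err < best_err:
            best, best_err = s, err
    return best

rng = np.random.default_rng(2)
right = rng.standard_normal(256)           # hypothetical right scanline
true_shift = 14
left = right[true_shift:true_shift + 128]  # left patch appears shifted in right

# coarse level: half resolution, wide but cheap search
coarse = best_shift(left[::2], right[::2], center=8, radius=8)
# fine level: full resolution, narrow search around the coarse estimate
fine = best_shift(left, right, center=2 * coarse, radius=2)
```

The fine search examines only a handful of candidates instead of the full disparity range, which is where the reduction in comparisons comes from.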
Abstract:
This work presents the performance analysis of traffic retransmission algorithms proposed for the HCCA medium access mechanism of the IEEE 802.11e standard applied to industrial environments. The nature of this kind of environment, subject to electromagnetic interference, combined with the wireless medium of the IEEE 802.11 standard, which is susceptible to such interference, and the lack of retransmission mechanisms, makes it impracticable to guarantee quality of service for the real-time traffic that the IEEE 802.11e standard targets and that this environment requires. To solve this problem, this work proposes a new approach involving the creation and evaluation of retransmission algorithms, in order to ensure a level of robustness, reliability, and quality of service for wireless communication in such environments. According to this approach, if a transmission error occurs, the traffic scheduler is able to manage retransmissions to recover the lost data. The proposed approach is evaluated through simulations in which the retransmission algorithms are applied to different scenarios, which are abstractions of an industrial environment; the results, obtained with a custom-developed network simulator, are compared with each other to assess which of the algorithms performs better in a pre-defined application