40 results for Statistics - Data Processing

at the Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Publisher:

Abstract:

The increasing use of high-resolution shallow seismic methods for the investigation of geological, environmental, and industrial problems has driven the development of processing techniques, flows, and computational algorithms. Until recently, such data were not processed at all: they were interpreted as acquired. In order to facilitate and improve the practices adopted, a free, open-source graphical application called OpenSeismic was developed, based on the free software Seismic Un*x, which is widely used in the processing of conventional seismic data for hydrocarbon reservoir exploration. The data used to validate the initiative were high-resolution marine seismic data acquired by the Laboratory of Geology, Marine Geophysics and Environmental Monitoring (GGEMMA) of the Federal University of Rio Grande do Norte (UFRN) for the SISPLAT Project, in the region of the Rio Açu paleo-valley. These data were submitted to the processing flow developed by Gomes (2009) using the free software developed in this work, OpenSeismic, as well as other software: the free Seismic Un*x and the commercial ProMAX. Despite their peculiarities, all produced similar results
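Seismic Un*x processing flows are conventionally expressed as Unix pipelines of SU programs, which a graphical front end like OpenSeismic can assemble for the user. A minimal sketch of that idea follows; `sugain` and `sufilter` are real SU programs, but the parameter values and the two-step flow are illustrative assumptions, not the flow of Gomes (2009):

```python
def compose_su_pipeline(steps, infile, outfile):
    """Chain Seismic Un*x programs into a single shell pipeline string."""
    first, *rest = steps
    cmd = f"{first} < {infile}"          # first program reads the input file
    for step in rest:
        cmd += f" | {step}"              # each later program reads the previous one
    return cmd + f" > {outfile}"         # last program writes the output file

# Hypothetical two-step flow: automatic gain control, then a band-pass filter.
flow = compose_su_pipeline(
    ["sugain agc=1 wagc=0.05",
     "sufilter f=10,20,100,120 amps=0,1,1,0"],
    infile="line01.su", outfile="line01_proc.su")
print(flow)
```

A front end only needs to collect the step list from the interface and hand the resulting string to a shell.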

Relevance:

100.00%

Publisher:

Abstract:

In the Fazenda Belém oil field (Potiguar Basin, Ceará State, Brazil), sinkholes and sudden terrain collapses occur frequently, associated with an unconsolidated sedimentary cap covering the Jandaíra karst. This research was carried out in order to understand the mechanisms that generate these collapses. The main tool used was Ground Penetrating Radar (GPR). The work developed along two lines: one concerns methodological improvements in GPR data processing, while the other concerns the geological study of the Jandaíra karst. The second line was strongly supported both by the analysis of outcropping karst structures (in other regions of the Potiguar Basin) and by the interpretation of radargrams of the subsurface karst at Fazenda Belém. An adequate flux for processing GPR data was designed and tested, adapted from a usual flux for processing seismic data. The changes were introduced to take into account important differences between GPR and reflection seismics, in particular: poor coupling between source and ground, the mixed phase of the wavelet, low signal-to-noise ratio, monochannel acquisition, and the strong influence of wave propagation effects, notably dispersion. High-frequency components of the GPR pulse suffer more pronounced attenuation than low-frequency components, resulting in resolution losses in the radargrams. At Fazenda Belém, a suitable flux for processing GPR data is all the more necessary because of the very high level of aerial events and the complexity of the imaged subsurface karst structures. The key point of the processing flux was an improved correction of the attenuation effects on the GPR pulse, based on their influence on the amplitude and phase spectra of the GPR signals.
In dielectric media with low and moderate losses, the propagated signal suffers significant changes only in its amplitude spectrum; that is, the phase spectrum of the propagated signal remains practically unaltered over the usual travel-time ranges. Based on this fact, it is shown using real data that the judicious application of the well-known tools of time gain and spectral balancing can efficiently correct the attenuation effects. The proposed approach can be applied in heterogeneous media and does not require precise knowledge of the attenuation parameters of the media. As an additional benefit, the judicious application of spectral balancing promotes a partial deconvolution of the data without changing its phase; in other words, spectral balancing acts in a way similar to a zero-phase deconvolution. In GPR data, the resolution increase obtained with spectral balancing is greater than that obtained with spike and predictive deconvolution. The evolution of the Jandaíra karst in the Potiguar Basin is associated with at least three events of subaerial exposure of the carbonate platform, during the Turonian, Santonian, and Campanian. In the Fazenda Belém region, during the mid-Miocene, the Jandaíra karst was covered by continental siliciclastic sediments. These sediments partially filled the void space associated with the dissolution structures and fractures. Therefore, the development of the karst in this region was attenuated in comparison with other places in the Potiguar Basin where the karst is exposed.
At Fazenda Belém, the generation of sinkholes and terrain collapses is controlled mainly by: (i) the presence of an unconsolidated sedimentary cap thick enough to cover the karst completely, but whose sediment volume is lower than the available space associated with the dissolution structures in the karst; (ii) the existence of important SW-NE and NW-SE structural alignments, which promote a localized increase in hydraulic connectivity, channeling the groundwater and thus facilitating carbonate dissolution; and (iii) the existence of a hydraulic barrier to the groundwater flow, associated with the Açu-4 Unit. The terrain collapse mechanisms at Fazenda Belém follow this temporal evolution: meteoric water infiltrates through the unconsolidated sedimentary cap and remobilizes it into the void space associated with the dissolution structures in the Jandaíra Formation. This remobilization starts at the base of the sedimentary cap, where the abrasiveness of the flow increases due to a change from laminar to turbulent regime as the groundwater reaches the open karst structures. The remobilized sediments progressively fill the karst void space from bottom to top, so the void space continuously migrates upwards, ultimately reaching the surface and causing the sudden terrain collapses observed. This phenomenon is particularly active during the rainy season, when the water table, normally located in the karst, may temporarily rise into the unconsolidated sedimentary cap
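The two attenuation corrections named in this abstract, time gain and zero-phase spectral balancing, can be sketched numerically: multiply the trace by a growing gain function, and flatten the amplitude spectrum while leaving the phase spectrum untouched. The plain DFT, the exponential gain model, and the whitening target below are illustrative assumptions, not the thesis's actual flux:

```python
import cmath
import math

def dft(x):
    """Plain O(n^2) discrete Fourier transform (illustrative, small traces only)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def time_gain(trace, dt, alpha):
    """Compensate an assumed exp(-alpha*t) amplitude decay by exp(+alpha*t)."""
    return [s * math.exp(alpha * i * dt) for i, s in enumerate(trace)]

def spectral_balance(trace, eps=1e-9):
    """Whiten the amplitude spectrum, keep the phase spectrum unchanged
    (hence acting like a zero-phase deconvolution)."""
    spec = dft(trace)
    return idft([c / (abs(c) + eps) for c in spec])

# An impulse already has a flat spectrum, so balancing leaves it essentially intact.
print(spectral_balance([1.0, 0.0, 0.0, 0.0]))
```

Because only the amplitudes are rescaled, the phase spectrum, which the abstract notes is practically unaltered by propagation, is preserved exactly.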

Relevance:

90.00%

Publisher:

Abstract:

Oil spills in marine environments cause immediate environmental impacts of large magnitude. For that reason, Environmental Sensitivity to Oil maps are a major instrument for planning containment and cleanup actions. To fulfil that role, Environmental Sensitivity maps must always be kept up to date, have an appropriate scale, and represent coastal areas accurately. In this context, this thesis presents a methodology for collecting and processing remote sensing data with the purpose of updating the territorial basis of thematic Environmental Sensitivity to Oil maps. To ensure greater applicability of the methodology, sensors with complementary characteristics, whose data are available at low financial cost, were selected and tested. To test the methodology, an area located on the northern coast of the Northeast of Brazil was chosen. The results showed that ASTER data products and hybrid-sensor images (PALSAR + CCD and HRC + CCD) have great potential to be used as a source of cartographic information in projects that seek to update Environmental Sensitivity to Oil maps

Relevance:

90.00%

Publisher:

Abstract:

Since the last century, the Six Sigma strategy has been the focus of study for many scientists; among its findings is the importance of data processing for error-free product manufacturing. Accordingly, this work focuses on the importance of data quality in an enterprise. To that end, a descriptive-exploratory study of seventeen compounding pharmacies in Rio Grande do Norte was undertaken, with the objective of building a base structure model to classify enterprises according to their databases. Statistical methods such as cluster and discriminant analyses were applied to a questionnaire built for this specific study. Data collection identified four groups, showing the strong and weak characteristics of each group and how they are differentiated from each other
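The clustering step named above can be illustrated with a minimal k-means implementation. The toy two-variable data, the deterministic seeding, and k = 2 are assumptions for illustration; the study's actual questionnaire variables and its four-group result are not reproduced here:

```python
def kmeans(points, k, iters=20):
    """Plain k-means: assign each point to its nearest center, then move
    each center to the mean of its group; repeat."""
    centers = [points[i] for i in range(k)]  # simple deterministic seeding
    groups = []
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            groups[nearest].append(p)
        centers = [tuple(sum(axis) / len(g) for axis in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Toy 2-variable "database quality" scores for six enterprises, ordered so the
# first two rows seed the two clusters.
data = [(0.1, 0.2), (9.8, 10.1), (0.0, 0.1), (10.2, 9.9), (0.2, 0.0), (10.0, 10.0)]
centers, groups = kmeans(data, k=2)
print(groups)
```

A discriminant analysis would then test how well the group labels found here can be predicted from the original variables.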

Relevance:

90.00%

Publisher:

Abstract:

In February 2011, the National Agency of Petroleum, Natural Gas and Biofuels (ANP) published new Technical Rules for Handling Onshore Pipelines for Petroleum, Natural Gas and Derivatives (RTDT). Among other things, the RTDT made monitoring and leak detection systems compulsory in all onshore pipelines in the country. This document presents a study of the transient pressure detection method. The study was conducted on an industrial pipeline 16 inches in diameter and 9.8 km long. The pipeline is fully pressurized and carries a multiphase mixture of crude oil, water, and natural gas. For the study, an infrastructure for data acquisition and validation of detection algorithms was built. The system was designed with a SCADA architecture. Piezoresistive sensors were installed at the ends of the pipeline, and Digital Signal Processors (DSPs) were used for sampling, storage, and processing of the data. The study was based on simulations of leaks through valves and on the search for patterns that characterize the occurrence of such phenomena
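Transient-based leak detection essentially watches the pressure signal for an abrupt negative step relative to its recent behaviour. A minimal sketch follows; the window size, the threshold, and the synthetic record are illustrative assumptions, not the parameters or data of the 16-inch pipeline study:

```python
def detect_transient(pressures, window=5, drop_threshold=2.0):
    """Flag the first sample where pressure falls below the recent moving
    average by more than drop_threshold (a sudden negative transient)."""
    for i in range(window, len(pressures)):
        avg = sum(pressures[i - window:i]) / window
        if avg - pressures[i] > drop_threshold:
            return i  # index of the suspected leak onset
    return None  # no transient found

# Synthetic record: steady around 100 units, then a sudden drop at sample 8.
signal = [100.0, 100.2, 99.9, 100.1, 100.0, 99.8, 100.1, 100.0, 95.0, 94.5]
print(detect_transient(signal))  # → 8
```

On the real system, one such detector per piezoresistive sensor, plus the arrival-time difference between the two pipeline ends, is what allows locating the leak as well as flagging it.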

Relevance:

90.00%

Publisher:

Abstract:

This work discusses the application of ensemble techniques to the development of multimodal recognition systems with revocable biometrics. Biometric systems are the identification and user access control techniques of the future, and the constant spread of such systems in today's society is proof of that. However, much progress is still needed, mainly with regard to the accuracy, security, and processing time of such systems. In the search for more efficient techniques, multimodal systems and the use of revocable biometrics are promising, and can address many of the problems involved in traditional biometric recognition. A multimodal system is characterized by combining different biometric security techniques, overcoming many limitations, such as failures in the extraction or processing of the dataset. Among the various possibilities for developing a multimodal system, the use of ensembles is quite promising, motivated by the performance and flexibility they have demonstrated over the years in their many applications. With regard to security, one of the biggest problems is that biometric traits are permanently bound to the user and cannot be changed if compromised. This problem has been addressed by techniques known as revocable biometrics, which apply a transformation to the biometric data in order to protect the unique characteristics, making their cancellation and replacement possible. In order to contribute to this important subject, this work compares the performance of individual classifiers, as well as ensembles of classifiers, both on the original data and in biometric spaces transformed by different functions. Another highlight is the use of Genetic Algorithms (GA) in different parts of the systems, seeking to further maximize their efficiency.
One of the motivations of this development is to evaluate the gain that ensembles optimized by different GAs can bring to the data in the transformed space. Another relevant point is to build even more efficient revocable systems by combining two or more transformation functions, demonstrating that it is possible to extract information of a similar standard by applying different transformation functions. All of this makes clear the importance of revocable biometrics, ensembles, and GAs in the development of more efficient biometric systems, something increasingly important in the present day
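Two of the ingredients above can be sketched compactly: a revocable (cancelable) transformation, illustrated here as a keyed random projection where revoking a compromised template just means issuing a new key, and an ensemble combining classifier decisions by majority vote. Both the projection and the tiny vote are illustrative assumptions, not the thesis's actual transformation functions or classifier set:

```python
import random
from collections import Counter

def revocable_transform(features, key, out_dim=4):
    """Keyed random projection: changing `key` cancels the old template and
    issues a new, unlinkable one from the same biometric features."""
    rnd = random.Random(key)  # key seeds the projection deterministically
    matrix = [[rnd.uniform(-1, 1) for _ in features] for _ in range(out_dim)]
    return [sum(w * f for w, f in zip(row, features)) for row in matrix]

def majority_vote(predictions):
    """Combine the identity labels emitted by an ensemble of classifiers."""
    return Counter(predictions).most_common(1)[0][0]

template = revocable_transform([0.3, 0.7, 0.1, 0.9], key="enrollment-key-1")
print(majority_vote(["user_a", "user_b", "user_a"]))
```

Matching is then performed in the transformed space, so the raw biometric features never need to be stored.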

Relevance:

80.00%

Publisher:

Abstract:

Hypertensive syndromes in pregnancy (HSP) are among the major complications of the pregnancy and postpartum period and can lead to premature birth and subsequent hospitalization of the newborn in the Neonatal Intensive Care Unit (NICU). This study aimed to analyze the perceptions, meanings, and feelings of mothers regarding hypertensive syndromes in pregnancy and premature obstetric labor. The research was qualitative and took as its theoretical-methodological framework the Social Representations Theory (SRT), in the Central Nucleus Theory approach. The study included 70 women, mean age 29 years, predominantly with high school education, most of them married or in consensual union, primiparous, with a prevalence of cesarean delivery occurring between 32 and 37 weeks of pregnancy. The data were collected from May to December 2008 at the Januário Cicco Maternity School in Natal, through the following instruments: a questionnaire including questions on socio-demographic status; the Free Word Association Test (FWAT) and verbalized mental image construction, using three stimuli (pregnancy with high blood pressure, preterm birth, and NICU); and an interview with the following guiding question: what did it mean for you to have a pregnancy with high blood pressure and, as a consequence, the birth of a premature baby? Data analysis was performed using a multi-method approach, processing the data with EVOC (Ensemble de Programmes Permettant l'Analyse des Évocations) and ALCESTE (Analyse Lexicale par Contexte d'un Ensemble de Segments de Texte) and thematic analysis in categories. The results are presented in four thematic units, under the following representative universes: HSP; prematurity as a result of HSP; the NICU; and the mothers' social representations of the hypertensive disorder of pregnancy followed by premature birth and hospitalization of the child in the NICU.
The results obtained by the multi-method analyses showed similar constructions and point to death as the central nucleus, with negative aspects, coping strategies, need of care, knowledge about the disease, fragility, and meanings of the NICU as peripheral elements. It is considered that the perceptions, meanings, and feelings of puerperal women in relation to HSP and premature delivery constitute a negative social representation, with representational elements that may have influenced the adverse effects of the disease and its consequences. We suggest acting on the peripheral elements of this representation, with adequate orientation, early diagnosis, effective conduct, a receptive attitude on the part of the team, health promotion measures, and effective public policies, in order to improve the care provided to puerperal women, making them feel welcomed and minimizing their suffering

Relevance:

80.00%

Publisher:

Abstract:

This is a cross-sectional, multidisciplinary study, which counted on a statistician who contributed to the study design, performing the sample size calculation and contributing effectively to the data analysis, and on psychology and pediatrics students who contributed to data collection. The literature indicates that inadequate communication of the diagnosis of Down Syndrome can harm the mother-infant bond and the child's subsequent development. This study therefore aimed to analyze maternal feelings in the face of this diagnosis, examining the different ways it is communicated and possible facilitators of acceptance of the syndrome. The sample consisted of 20 mothers of children with Down Syndrome, aged 0 to 3 years, receiving outpatient care at a university pediatric hospital. Data were collected with a questionnaire, after signature of the informed consent form. The data were analyzed with two data processing software packages, SPSS and ALCESTE (Lexical Analysis by Context of a Set of Text Segments). The data indicated that 90% of the mothers received the diagnosis of Down Syndrome after delivery; 75% of the diagnoses were communicated by the pediatrician and 15% by nurses. The mothers reported that the diagnosis was late, inadequate, and insufficiently informative. The interviewees experienced the same feelings reported in the literature: shock, denial, sadness and anger, adaptation, and reorganization. These results allow us to conclude that the diagnosis of Down Syndrome for the mothers investigated was mostly late, a common reality in Brazil, especially among lower economic classes. The mothers perceive the diagnosis as late, inadequate, and insufficiently informative, and it generates feelings the literature already describes as common in the face of this kind of diagnosis.
Therefore, we observe that the way the news is given can hinder or facilitate the establishment of the mother-infant bond, affecting the search for resources for the child's development

Relevance:

80.00%

Publisher:

Abstract:

The progress of the Internet and telecommunications has been changing the concepts of Information Technology (IT), especially with regard to outsourced services, through which organizations seek cost-cutting and a better focus on their core business. Along with the development of such outsourcing, a new model named Cloud Computing (CC) evolved, which proposes migrating both data processing and information storage to the Internet. Key points of Cloud Computing include cost reduction, benefits, risks, and changes in IT paradigms. Nonetheless, the adoption of this model creates difficulties for decision-making by IT managers, mainly with regard to which solutions may go to the cloud and which service providers are more appropriate to the organization's reality. The overall aim of this research is to apply the AHP (Analytic Hierarchy Process) method to decision-making in Cloud Computing. The methodology used was exploratory, with a case study applied to a nationwide organization (the Federation of Industries of RN). Data collection was performed through two structured questionnaires answered electronically by IT technicians and by the company's Board of Directors. The analysis of the data was carried out in a qualitative and comparative way, using Web-HIPRE, a software implementation of the AHP method. The results confirmed the value of applying the AHP method to decision-making on the adoption of Cloud Computing, mainly because, at the time the research was carried out, the studied company already showed interest in and need for adopting CC, given the internal problems with infrastructure and availability of information that the company faces. The organization sought to adopt CC but had doubts regarding the cloud model and which service provider would better meet its real necessities.
The application of the AHP thus worked as a guiding tool for choosing the best alternative, which points to the Hybrid Cloud as the ideal choice to start off in Cloud Computing, considering the following aspects: the Infrastructure as a Service (IaaS) layer (processing and storage) should stay partly in the Public Cloud and partly in the Private Cloud; the Platform as a Service (PaaS) layer (software development and testing) had preference for the Private Cloud; and the Software as a Service (SaaS) layer was divided, with e-mail going to the Public Cloud and applications to the Private Cloud. The research also identified the important factors in hiring a Cloud Computing provider
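The core computation in AHP is deriving priority weights from a reciprocal pairwise comparison matrix. A minimal sketch using the geometric-mean approximation of the priority vector follows; the two-criterion matrix is a hypothetical example, not the FIERN questionnaire data:

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority vector: geometric mean of each row of the
    pairwise comparison matrix, normalized to sum to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# "Criterion 1 is 3 times as important as criterion 2" (reciprocal matrix).
weights = ahp_weights([[1.0, 3.0],
                       [1.0 / 3.0, 1.0]])
print(weights)  # → [0.75, 0.25] (up to floating point)
```

Tools like Web-HIPRE repeat this computation at every level of the hierarchy and then aggregate, which is what ranks alternatives such as Public, Private, and Hybrid Cloud.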

Relevance:

80.00%

Publisher:

Abstract:

Due to the current need of industry to integrate plant-floor production data originating from several sources and to transform them into useful information for decision-making, there is an ever greater demand for information visualization systems that support that functionality. On the other hand, given the high competitiveness of the market, a common practice nowadays is the development of industrial systems with characteristics of modularity, distribution, flexibility, scalability, adaptability, interoperability, reusability, and web access. Those characteristics provide extra agility and greater ease in adapting to the frequent changes in market demand. Based on the arguments exposed above, this work specifies a component-based architecture, with the corresponding development of a system based on that architecture, for the visualization of industrial data. The system was conceived to be capable of supplying on-line information and, optionally, historical information on variables originating from the plant floor. This work shows that the component-based architecture developed possesses the requirements necessary to obtain a robust, reliable, and easily maintained system, thus meeting industrial needs. The architecture also allows components to be added, removed, or updated at run time, through a web-based component manager, further streamlining the process of adapting and updating the system
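The run-time add/remove/update behaviour described above can be sketched as a tiny component registry. The names `register`, `unregister`, `render_all`, and the dict-based store are illustrative assumptions, not the architecture's real API:

```python
class ComponentRegistry:
    """Holds visualization components that can be added, removed, or
    replaced while the system keeps running (no restart required)."""
    def __init__(self):
        self._components = {}

    def register(self, name, component):
        self._components[name] = component   # add, or hot-update in place

    def unregister(self, name):
        self._components.pop(name, None)     # remove at run time

    def render_all(self, value):
        """Ask every registered component to render the current value."""
        return {name: comp(value) for name, comp in self._components.items()}

registry = ComponentRegistry()
registry.register("gauge", lambda v: f"gauge:{v}")
registry.register("trend", lambda v: f"trend:{v}")
registry.register("gauge", lambda v: f"gauge-v2:{v}")  # updated at run time
print(registry.render_all(42))
```

A web-based component manager would simply call `register`/`unregister` in response to user actions, which is what makes the system adaptable without downtime.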

Relevance:

80.00%

Publisher:

Abstract:

The use of Geographic Information Systems (GIS) has become very important in fields that require detailed and precise study of earth surface features. Environmental protection applications are one such example, requiring GIS tools for analysis and decision-making by managers and the communities involved in protected areas. In this specific field, a remaining challenge is to build a GIS that can be fed with data dynamically, allowing researchers and other agents to retrieve current, up-to-date information. In some cases, data are acquired in several ways and come from different sources. To solve this problem, tools were implemented that include a model for spatial data treatment on the Web. The research issues involved begin with the feeding and processing of environmental monitoring data collected in loco, such as biotic and geological variables, and end with the presentation of all the information on the Web. For this dynamic processing, tools were developed that make MapServer more flexible and dynamic, allowing data uploading by the appropriate users. Furthermore, a module that uses interpolation for spatial data analysis was also developed. A complex application that validated this research feeds the system with data from coral reef regions located in the northeast of Brazil. The system was implemented using the interactivity concepts provided by the AJAX model and resulted in a substantial contribution to efficient information access, an essential mechanism for tracking events in environmental monitoring
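The interpolation module mentioned above can be illustrated with inverse-distance weighting, a common choice for scattered environmental samples. IDW itself is an assumption here, since the abstract does not name the method used:

```python
def idw(samples, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from (xi, yi, value) samples."""
    num = den = 0.0
    for xi, yi, value in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return value                     # exactly on a sample point
        w = 1.0 / d2 ** (power / 2.0)        # closer samples weigh more
        num += w * value
        den += w
    return num / den

# Two hypothetical reef monitoring stations; the midpoint gets equal weights.
stations = [(0.0, 0.0, 10.0), (2.0, 0.0, 30.0)]
print(idw(stations, 1.0, 0.0))  # → 20.0
```

Evaluating such a function over a grid produces the continuous surfaces (e.g. of a biotic variable) that a MapServer layer can then display.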

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to evaluate the potential of near-infrared reflectance spectroscopy (NIRS) as a rapid and non-destructive method to determine the soluble solids content (SSC), pH, and titratable acidity of intact plums. Samples of plum with a total solids content ranging from 5.7 to 15%, pH from 2.72 to 3.84, and titratable acidity from 0.88 to 3.6% were collected from supermarkets in Natal, Brazil, and NIR spectra were acquired in the 714-2500 nm range. Several multivariate calibration techniques were compared with respect to data pre-processing and variable selection algorithms, such as interval Partial Least Squares (iPLS), genetic algorithm (GA), successive projections algorithm (SPA), and ordered predictors selection (OPS). Validation models for SSC, pH, and titratable acidity had coefficients of correlation (R) of 0.95, 0.90, and 0.80, and root mean square errors of prediction (RMSEP) of 0.45 °Brix, 0.07, and 0.40%, respectively. From these results, it can be concluded that NIR spectroscopy can be used as a non-destructive alternative for measuring the SSC, pH, and titratable acidity of plums
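The prediction-error figure reported above, RMSEP, is simply the root mean square of the residuals between reference and predicted values on the validation set. A small sketch with made-up numbers (hypothetical SSC values, not the plum data):

```python
import math

def rmsep(reference, predicted):
    """Root mean square error of prediction over a validation set."""
    residuals = [(r - p) ** 2 for r, p in zip(reference, predicted)]
    return math.sqrt(sum(residuals) / len(residuals))

# Hypothetical SSC values (°Brix): laboratory reference vs. NIR prediction.
print(rmsep([10.0, 12.0, 8.0], [10.3, 11.7, 8.3]))  # → 0.3 (up to floating point)
```

An RMSEP of 0.45 °Brix, as reported, thus means the NIR predictions deviate from the laboratory reference by about half a degree Brix on average.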

Relevance:

80.00%

Publisher:

Abstract:

The object of this study is the construction of metaphor and metonymy in comics. This work falls within the field of Embodied Cognitive Linguistics, specifically the Neural Theory of Language (FELDMAN, 2006); consistent with this theoretical and methodological framework, the notions of categorization (LAKOFF & JOHNSON, 1999), embodiment (GIBBS, 2005), figurativity (GIBBS, 1994; BERGEN, 2005), and mental simulation (BARSALOU, 1999; FELDMAN, 2006) are also used. The hypothesis defended is that the construction of figurativity in texts consisting of verbal and nonverbal mechanisms is linked to the activation of neural structures related to our actions and perceptions. Thus, language is considered a cognitive faculty connected to the brain apparatus and to bodily experiences, in such a way that it provides samples of the continuous process of meaning (re)construction performed by the reader, who (re)defines his or her views about the world as certain neural networks are (or stop being) activated during linguistic processing. The data obtained during the analysis show that, as regards comics, reading the graphic and verbal languages together seems to play an important role in the construction of figurativity, including cases of metaphors that are metonymically motivated. These preliminary conclusions were drawn from the analysis of data taken from V de Vingança (MOORE; LLOYD, 2006). The corpus study was guided by the methodology of introspection, i.e., the individual analysis of linguistic aspects as manifested in one's own cognition (TALMY, 2005).

Relevance:

80.00%

Publisher:

Abstract:

Strategic IT Alignment is considered the first step in the IT Governance process of any institution. Starting from the recognition that corporate governance has an overall view of organizations, IT Governance is the subset responsible for implementing the organization's strategies with regard to providing the tools necessary to achieve the goals set in the Institutional Development Plan. To do so, COBIT specifies that such governance be built on the following principles: Strategic Alignment, Value Delivery, Risk Management, and Performance Measurement. This paper focuses on Strategic Alignment, considered by the authors the foundation for the development of the entire IT Governance core. By deepening the technical knowledge of its management system development, UFRN has taken a decisive step towards the technical capability needed for Value Delivery; yet, examination of the processes primarily set for Strategic Alignment revealed gaps that limited the IT strategic view in the implementation of the organizational goals. In this qualitative study, which used documentary research with content analysis and interviews with strategic and tactical managers, the view of the role of SINFO (Superintendência de Informática) was mapped. The documentary research covered public documents available on the institutional site and documents of the TCU (Tribunal de Contas da União) that map IT Governance profiles across the federal public service as a whole. To equalize the documentary research results, questionnaires/interviews and iGovTI indexes, quantitative tools for standardizing the results, were used, always keeping the same scale elements present in the TCU analysis.
Accordingly, similarly to what the TCU study provides through the iGovTI index, this paper proposes a specific index for the study area, SA (Strategic Alignment), calculated from variables representative of the COBIT 4.1 domains and having the variables representative of the Strategic Alignment primary process as components. As a result, an index intermediate between the values of two adjacent TCU surveys, from 2010 and 2012, was found, which reflects the attitude and view of managers towards IT governance: still tied to Data Processing, in which one department performs its tasks according to the demand of the various departments or sectors, although there is a commission that discusses issues related to infrastructure acquisition and systems development. With an operational rather than strategic/managerial view, and low adoption of the tools consecrated by the market, several processes are not contemplated in the set defined by the COBIT framework; this is mainly due to the inexistence of a formal strategic plan for IT, and hence the only partial congruence between the organization's goals and the IT goals.
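An index of this kind can be sketched as a normalized weighted aggregation of the component variables' scores. The equal weights, the 0-1 scale, and the four toy scores below are illustrative assumptions; the paper derives its actual components from the COBIT 4.1 Strategic Alignment process variables and the TCU scale:

```python
def composite_index(scores, weights):
    """Weighted aggregation of variable scores into a single 0-1 index."""
    total_weight = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total_weight

# Hypothetical scores for four Strategic Alignment variables (0 = absent, 1 = full).
sa_index = composite_index([0.2, 0.5, 0.4, 0.3], [1.0, 1.0, 1.0, 1.0])
print(round(sa_index, 2))  # → 0.35
```

Computing the same aggregation on the 2010 and 2012 TCU variable sets is what allows placing the organization's SA value between the two surveys, as the study does.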