66 results for "Data processing"
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
In oil prospecting, seismic data are usually irregularly and sparsely sampled along the spatial coordinates because of obstacles to geophone placement. Fourier methods provide a way to regularize seismic data and are efficient when the input data are sampled on a regular grid. However, when these methods are applied to irregularly sampled data, the orthogonality among the Fourier components is broken and the energy of one Fourier component may "leak" into others, a phenomenon called "spectral leakage". The objective of this research is to study methods for the spectral representation of irregularly sampled data. In particular, we present the basic structure of the NDFT (nonuniform discrete Fourier transform), study its properties, and demonstrate its potential in seismic signal processing. To this end we study the FFT (fast Fourier transform) and the NFFT (nonuniform fast Fourier transform), which rapidly compute the DFT (discrete Fourier transform) and the NDFT, respectively, and we compare signal recovery using the FFT, DFT, and NFFT. We approach the interpolation of seismic traces using the ALFT (antileakage Fourier transform) to overcome the spectral leakage caused by uneven sampling. Applications to synthetic and real data showed that the ALFT method works well on seismic data from complex geology, suffers little from irregular spatial sampling and edge effects, and is robust and stable with noisy data. However, it is not as efficient as the FFT, and its reconstruction is not as good when the irregular coverage contains large holes in the acquisition.
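The leakage mechanism described above can be illustrated with a direct evaluation of the NDFT. This is a minimal numpy sketch, not the thesis's implementation; the sampling positions and the single cosine mode are invented for illustration:

```python
import numpy as np

def ndft(x, f, m):
    """Direct nonuniform DFT of samples f at positions x in [0, 1):
    F_k = sum_n f_n * exp(-2*pi*i*k*x_n).  Costs O(N*M); the NFFT
    approximates the same sums in roughly O(M log M)."""
    k = np.arange(-m // 2, m // 2)
    return np.array([np.sum(f * np.exp(-2j * np.pi * kk * x)) for kk in k])

n = 64
# Regular grid: a pure cosine concentrates its energy in two spectral bins
x_reg = np.arange(n) / n
spec_reg = np.abs(ndft(x_reg, np.cos(2 * np.pi * 5 * x_reg), n))

# Irregular sampling of the same cosine: energy leaks into the other bins
rng = np.random.default_rng(0)
x_irr = np.sort(rng.uniform(0.0, 1.0, n))
spec_irr = np.abs(ndft(x_irr, np.cos(2 * np.pi * 5 * x_irr), n))

# Energy outside the two dominant peaks quantifies the leakage
leak_reg = spec_reg.sum() - 2 * spec_reg.max()
leak_irr = spec_irr.sum() - 2 * spec_irr.max()
print(leak_irr > leak_reg)  # True: irregular sampling breaks orthogonality
```

On the regular grid the off-peak energy is numerically zero, which is exactly the orthogonality the abstract says is lost under irregular sampling.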
Abstract:
In this work, the quantitative analysis of glucose, triglycerides, and cholesterol (total and HDL) in both rat and human blood plasma was performed without any pretreatment of the samples, using near-infrared (NIR) spectroscopy combined with multivariate methods. For this purpose, different techniques and algorithms for data preprocessing, variable selection, and multivariate regression modeling were compared, such as partial least squares regression (PLS), nonlinear regression by artificial neural networks (ANN), interval partial least squares regression (iPLS), the genetic algorithm (GA), and the successive projections algorithm (SPA), among others. For the determinations in rat blood plasma, the variable selection algorithms showed satisfactory results both for the correlation coefficients (R²) and for the root mean square errors of prediction (RMSEP) for the three analytes, especially for triglycerides and HDL cholesterol. The RMSEP values for glucose, triglycerides, and HDL cholesterol obtained with the best PLS model were 6.08, 16.07, and 2.03 mg dL⁻¹, respectively. For the determinations in human blood plasma, in contrast, the predictions obtained by the PLS models were unsatisfactory, with a nonlinear tendency and the presence of bias. ANN regression was therefore applied as an alternative to PLS, given its ability to model data from nonlinear systems. The root mean square errors of monitoring (RMSEM) for glucose, triglycerides, and total cholesterol with the best ANN models were 13.20, 10.31, and 12.35 mg dL⁻¹, respectively. Statistical tests (F and t) suggest that NIR spectroscopy combined with multivariate regression methods (PLS and ANN) is capable of quantifying these analytes (glucose, triglycerides, and cholesterol) even in highly complex biological fluids such as blood plasma.
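As a small illustration of the figure of merit quoted above, a minimal sketch of the RMSEP computation; the glucose values below are hypothetical, not data from the thesis:

```python
import numpy as np

def rmsep(y_ref, y_pred):
    """Root mean square error of prediction between reference values and
    model predictions, as used to compare the calibration models."""
    y_ref = np.asarray(y_ref, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_ref - y_pred) ** 2)))

# Hypothetical glucose reference values vs. NIR-model predictions (mg/dL)
ref = [90.0, 105.0, 120.0, 98.0]
pred = [93.0, 102.0, 118.0, 101.0]
print(round(rmsep(ref, pred), 2))  # → 2.78
```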
Abstract:
Self-organizing maps (SOM) are artificial neural networks widely used in data mining, mainly because the fixed grid of neurons associated with the network makes them a dimensionality reduction technique. In order to properly partition and visualize the SOM network, the various methods available in the literature must be applied in a post-processing stage, which consists of inferring, through the neurons, relevant characteristics of the data set. In general, applying such processing to the network neurons, instead of to the entire database, reduces computational cost thanks to vector quantization. This work proposes a post-processing of the SOM neurons in the input and output spaces, combining visualization techniques with algorithms based on gravitational forces and on the search for the shortest path with the greatest reward. These methods take into account the connection strength between neighbouring neurons and characteristics of pattern density and inter-neuron distance, both associated with the positions the neurons occupy in the data space after training the network. The goal is thus to define more clearly the arrangement of the clusters present in the data. Experiments were carried out to evaluate the proposed methods on various artificially generated data sets as well as real-world data sets, and the results were compared with those of several well-known methods from the literature.
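A standard way to post-process SOM neurons in the input space, in the spirit of the neighbour-distance methods mentioned above though not the thesis's own algorithm, is the U-matrix: the mean distance from each prototype to its grid neighbours. A minimal sketch with an invented toy map:

```python
import numpy as np

def u_matrix(weights):
    """Mean input-space distance from each SOM neuron to its 4-connected
    grid neighbours; large values mark boundaries between clusters.
    weights: (rows, cols, dim) array of trained prototype vectors."""
    rows, cols, _ = weights.shape
    u = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            dists = [np.linalg.norm(weights[i, j] - weights[i + di, j + dj])
                     for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                     if 0 <= i + di < rows and 0 <= j + dj < cols]
            u[i, j] = np.mean(dists)
    return u

# Toy 2x2 trained map: top row and bottom row encode two distant clusters
w = np.array([[[0.0, 0.0], [0.1, 0.0]],
              [[5.0, 5.0], [5.1, 5.0]]])
print(u_matrix(w).shape)  # → (2, 2)
```

In this toy map every neuron has one close within-cluster neighbour and one distant across-cluster neighbour, so all U-values are dominated by the inter-cluster boundary.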
Abstract:
This work focuses on the geomorphological characterization and spatial data modeling of the shallow continental shelf within the limits of the Touros sheet (SB-25-CV-II), based on the analysis of bathymetric data and the interpretation of remote sensing products. The state of Rio Grande do Norte is located in northeastern Brazil, and the study area lies at the transition between the eastern and northern portions of its coast. The bathymetric surveys were conducted between March and May 2009, using a vessel 10 m long with a 0.70 m draught, equipped with a global positioning system and an echo sounder (dual beam, 200 kHz, 14°). The fieldwork produced 44 bathymetric profiles, spaced 1.5 km apart and 30 km long on average, totalling 111,200 bathymetric points along 1,395.7 km of navigation within an area of about 1,850 km². The bathymetric data were corrected for tide level and vessel draught and were subsequently entered into a geographic information system for further processing. The analysis of remote sensing products used band 1 of a Landsat 7/ETM+ scene from November 1999; the image was used for visualizing and mapping submerged features. From the analysis of the local bathymetry and the satellite image, seven types of geomorphological features were observed in the study area: channels, comprising two longitudinal channels (the San Roque and Cioba channels) and others perpendicular to the coast (Touros, Pititinga, and Barretas); coastal reef formations (Maracajaú, Rio do Fogo, and Cioba); longitudinal waves, described in the literature as longitudinal dunes; a transverse dune field; oceanic reefs, a rock alignment parallel to the coast; four "riscas", from north to south the Risca do Liso, Gameleira, Zumbi, and Pititinga (the latter described here for the first time); and, finally, an oceanic terrace observed in the deepest part of the study area.
Image interpretation corroborated the in situ results, enabling the visualization and description of all features in the region. Analysing the results in an integrated manner, combining the different methodologies applied in this work, was essential for describing all the features in the area and made it possible to evaluate which methods produced better descriptions of particular features. These results demonstrate the existence of submerged features on the eastern shallow continental shelf of Rio Grande do Norte. The conclusions are that (1) this study contributed new information about the area in question, particularly regarding the in situ depth data, (2) the method of data collection and interpretation proved effective, since it allowed the features present in the study area to be visualized and interpreted, and (3) interpreting and discussing the results in an integrated manner, using different methodologies, can provide better results.
Abstract:
When a company decides to invest in a project, it must obtain the resources needed to make the investment. The alternatives are to use the firm's internal resources or to obtain external resources through debt contracts and the issuance of shares. Decisions involving the mix of internal resources, debt, and shares in the total resources used to finance a company's activities concern the choice of its capital structure. Although there are finance studies on the determinants of firms' debt, the issue of capital structure remains controversial. This work sought to identify the predominant factors that determine the capital structure of Brazilian publicly traded, non-financial firms. A quantitative approach was used, applying the statistical technique of multiple linear regression to panel data; the estimates were made by ordinary least squares with a fixed-effects model. About 116 companies were selected for the research, covering the period from 2003 to 2007. The variables and hypotheses tested in this study were built on capital structure theories and empirical research. The results indicate that variables such as risk, size, asset composition, and firm growth influence indebtedness, while the profitability variable was not relevant to the debt composition of the companies analysed. Considering only long-term debt, however, the relevant variables are firm size and, especially, asset composition (tangibility): the smaller the firm, or the greater the share of fixed assets in total assets, the greater its propensity for long-term debt. Furthermore, this research could not identify a predominant theory to explain the capital structure of Brazilian firms.
Abstract:
This work analyses information technology (IT) risks in data migration procedures, considering the migration from ALEPH, an Integrated Library System (ILS), to the Library Module of the Sistema Integrado de Gestão de Atividades Acadêmicas (SIGAA) at the Zila Mamede Central Library of the Federal University of Rio Grande do Norte (UFRN), in Natal, Brazil. The methodological procedure was a qualitative, exploratory study, with a case study at that library in order to better understand the phenomenon. Data were collected through semi-structured interviews with 11 subjects working at the library and at the IT Superintendence of UFRN, and were examined through content analysis with a thematic review process: after the data migration, the interview results were linked to the analysis units and their system registers by category correspondence. The main risks detected were data destruction, data loss, database communication failure, user response delay, data inconsistency, and duplicity. These problems affect external and internal system users and lead to stress, duplicated work, and other inconveniences. Some risk management measures were adopted, such as adequate planning, support from central management, and pilot test simulations; their advantages include reduced risk, fewer problems and unforeseen costs, and better achievement of organizational objectives, among others. It can therefore be inferred that risks in database conversion in libraries do exist and that some are predictable; however, librarians often do not know or simply ignore them and show little concern for identifying risks in database conversion, even though acknowledging those risks would minimize or even eliminate them.
Another important point is that little empirical research deals specifically with this subject, which highlights the need for new approaches to promote a better understanding of the matter in the corporate environment of information units.
Abstract:
For centuries, Asian peoples have used seaweed as an important food source and remain its greatest consumers worldwide. The migration of these peoples to other countries has increased the demand for seaweed, fuelling an industry with annual revenues of around US$ 6 billion. The algal biomass used by this industry is either collected from natural beds or cultivated. The market demand for seaweed-based products promotes unsustainable exploitation of the natural beds, compromising their associated biological balance. In this context, seaweed farming appears as a viable alternative to prevent the depletion of these natural stocks. Geographic Information Systems (GIS) integrate spatial data and produce information that can support the evaluation of physical and socio-economic characteristics important for planning seaweed farming. The objective of this study is to identify potential coastal areas for seaweed farming in the state of Rio Grande do Norte through the integration of socio-environmental data in a GIS. To this end, a georeferenced database composed of geographical maps, nautical charts, and orbital digital images was assembled, together with a bank of attributes including physical and oceanographic variables (winds, currents, bathymetry, operational distance of the farm) and social and environmental factors (main income, experience with seaweed harvesting, demographic density, proximity of sheltered coast, and distance to the beds). In the data modeling stage, the spatial database was integrated with the attribute bank to obtain a map of seaweed farming potential. Of a total of 2,011 ha analysed in the GIS, around 34% (682 ha) were classified as high potential, 55% (1,101 ha) as medium potential, and 11% (228 ha) as low potential.
The good potential indices obtained in the localities studied demonstrate that there are adequate conditions for establishing seaweed farming in the state of Rio Grande do Norte.
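The integration of criterion layers into a potentiality map can be sketched as a weighted overlay. The layers, weights, and class thresholds below are illustrative assumptions, not the values used in the study:

```python
import numpy as np

def suitability(layers, weights, thresholds=(0.4, 0.7)):
    """Weighted overlay of criterion rasters, each already scaled to [0, 1].
    Returns the continuous score and classes 0=low, 1=medium, 2=high."""
    layers = np.asarray(layers, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # Normalize weights and combine the stack of rasters cell by cell
    score = np.tensordot(weights / weights.sum(), layers, axes=1)
    return score, np.digitize(score, thresholds)

# Illustrative 2x2 rasters: bathymetry, distance to coast, wind suitability
bathy = np.array([[0.9, 0.8], [0.3, 0.2]])
coast = np.array([[0.8, 0.6], [0.4, 0.3]])
wind  = np.array([[0.7, 0.7], [0.5, 0.2]])
score, classes = suitability([bathy, coast, wind], weights=[0.4, 0.3, 0.3])
print(classes)  # top-row cells come out as high potential, bottom row as low
```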
Abstract:
The main objective of this study is to apply recently developed statistical physics methods to time series analysis, particularly to electrical induction logs from oil wells, in order to study the petrophysical similarity of those wells in a spatial distribution. We used the detrended fluctuation analysis (DFA) method to determine whether this technique can be used to characterize the fields spatially. After obtaining the DFA values for all wells, we applied clustering analysis using the non-hierarchical K-means method. Usually based on the Euclidean distance, K-means divides the N elements of a data matrix into k groups so that the similarities among elements belonging to different groups are as small as possible. In order to test whether a data set produced by the K-means method, or a randomly generated one, forms spatial patterns, we created the parameter Ω (index of neighborhood): high values of Ω indicate aggregated data, while low values indicate scattered data or data without spatial correlation. We conclude that the DFA values of the 54 wells are spatially grouped and can be used to characterize the fields. Applying a contour-level technique confirmed the results obtained by K-means, showing that DFA is effective for this kind of spatial analysis.
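The clustering step can be sketched as follows. Since the abstract does not give the formula for Ω, a simple nearest-neighbour label-agreement score stands in for it here; the well positions are synthetic, and this is not the thesis's implementation:

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means with Euclidean distance and an empty-cluster guard."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(points[:, None] - centers[None], axis=2), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = points[labels == c].mean(axis=0)
    return labels, centers

# Synthetic well coordinates forming two separated spatial groups
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(3.0, 0.1, (20, 2))])
labels, _ = kmeans(pts, k=2)

# Stand-in for the Omega neighbourhood index (formula not given in the
# abstract): fraction of each point's 3 nearest neighbours sharing its label
d = np.linalg.norm(pts[:, None] - pts[None], axis=2)
nn = np.argsort(d, axis=1)[:, 1:4]  # skip column 0, the point itself
omega = np.mean(labels[nn] == labels[:, None])
print(round(omega, 2))
```

For spatially aggregated clusters this score approaches 1, matching the abstract's reading of high Ω as aggregation.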
Abstract:
Oil spills in marine environments cause immediate environmental impacts of great magnitude. For that reason, environmental sensitivity maps for oil spills are a major instrument for planning containment and cleanup actions. To serve this purpose, the sensitivity maps must always be kept up to date, be at an appropriate scale, and represent the coastal areas accurately. In this context, this thesis presents a methodology for collecting and processing remote sensing data in order to update the territorial base of thematic maps of environmental sensitivity to oil. To ensure wider applicability of the methodology, sensors with complementary characteristics, whose data are available at low financial cost, were selected and tested in an area on the northern coast of northeastern Brazil. The results showed that ASTER data and hybrid image products from the PALSAR + CCD and HRC + CCD sensors have great potential as sources of cartographic information for projects that seek to update environmental sensitivity maps for oil spills.
Abstract:
Produced water is one of the most common wastes generated during oil exploration and production. This work aims to develop methodologies based on comparative statistical analysis of the hydrogeochemistry of production zones, in order to minimize high-cost interventions such as fluid identification tests (TIF). For the study, 27 samples were collected from five different production zones, and a total of 50 chemical species were measured. After the chemical analysis, statistical analysis was applied to the data using the R statistical software, version 2.11.1, in three steps. In the first step, the objective was to investigate the behavior of the chemical species in each production zone through descriptive graphical analysis. The second step was to build a function that classifies each sample into a production zone, using discriminant analysis; in the training stage, the correct classification rate of the discriminant function was 85.19%. The next step applied principal component analysis, which reduces the number of variables by taking linear combinations of the chemical species, in an attempt to improve the discriminant function obtained in the second step and increase the discriminating power of the data, but the result was not satisfactory. Finally, profile analysis produced curves for each production zone based on the characteristics of the chemical species present in each one. With this study it was possible to develop a method, combining hydrochemistry and statistical analysis, that can distinguish the waters produced in mature oil fields, making it possible to identify which production zone is contributing to the excessive increase in water volume.
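The discriminant-analysis step can be sketched as a linear discriminant classifier built from class means and a pooled covariance matrix. The ion concentrations below are invented, and this is a generic sketch rather than the exact R procedure used in the study:

```python
import numpy as np

def lda_train(X, y):
    """Linear discriminant analysis: class means plus pooled covariance."""
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    scatter = sum(np.cov(X[y == c].T) * (np.sum(y == c) - 1) for c in classes)
    pooled = scatter / (len(y) - len(classes))
    return classes, means, np.linalg.inv(pooled)

def lda_predict(X, classes, means, inv_cov):
    # Assign each sample to the class whose mean is nearest in
    # Mahalanobis distance (equal priors assumed)
    d = np.array([[(x - m) @ inv_cov @ (x - m) for m in means] for x in X])
    return classes[np.argmin(d, axis=1)]

# Invented ion concentrations for samples from two production zones
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([10.0, 5.0], 1.0, (15, 2)),
               rng.normal([14.0, 8.0], 1.0, (15, 2))])
y = np.array([0] * 15 + [1] * 15)
classes, means, inv_cov = lda_train(X, y)
rate = np.mean(lda_predict(X, classes, means, inv_cov) == y)
print(rate >= 0.9)  # training classification rate, analogous to the 85.19%
```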
Abstract:
This work demonstrates the importance of geographic information system (GIS) and spatial data analysis (SDA) tools for the study of infectious diseases. Analysis methods were used to describe the spatial distribution of a disease more fully by incorporating the geographical element into the analysis. In Chapter 1, we recount the historical evolution of these techniques in the field of human health, using Hansen's disease (leprosy) in Rio Grande do Norte as an example. In Chapter 2, we introduce basic theoretical concepts of the methodology and classify the types of spatial data commonly treated. Chapters 3 and 4 define and demonstrate the two most important techniques for the analysis of health data, namely the analysis of point pattern data and of area data, modeling the distribution of Hansen's disease cases in the city of Mossoró, RN. In the analysis, we used R scripts and made available the routines and analytical procedures developed by the author, an approach that can easily be reused by researchers in several areas. As practical results, the major leprosy risk areas in Mossoró were detected, and their association with the socioeconomic profile of the population at risk was established. Moreover, the work shows that this approach could be of great help when used continuously in health data analysis and processing, allowing the development of new strategies that might increase the use of such techniques in health care.
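For the area-data analysis mentioned for Chapters 3 and 4, a standard statistic, though not necessarily the one used in the thesis, is Moran's I for spatial autocorrelation. A minimal sketch with a hypothetical chain of four areas:

```python
import numpy as np

def morans_i(values, w):
    """Moran's I spatial autocorrelation for area data.
    values: one attribute value per area; w: (n, n) spatial weights matrix."""
    z = values - values.mean()
    n = len(values)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Four areas in a chain (contiguity weights), with clustered high/low rates
rates = np.array([10.0, 9.0, 2.0, 1.0])
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
print(round(morans_i(rates, w), 3))  # → 0.395, positive: neighbours are alike
```

A positive value indicates that neighbouring areas have similar rates, which is the kind of spatial aggregation a risk-area analysis looks for.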
Abstract:
Introduction: Chagas disease is a serious public health problem, with 5 million infected individuals in Brazil. Of these, approximately 30% develop chronic Chagas cardiomyopathy (CCC), whose main symptoms are fatigue and dyspnea. Objective: To correlate maximal exercise capacity with pulmonary function, inspiratory muscle strength, and quality of life in patients with CCC. Methodology: Twelve individuals with CCC were evaluated (7 men), with a mean age of 54.91 ± 8.60 years and the following inclusion criteria: functional class II or III according to the New York Heart Association (NYHA); left ventricular ejection fraction below 45%; clinical stability (> 3 months); symptom duration > 1 year; body mass index (BMI) < 35 kg/m²; and non-smokers or ex-smokers with a smoking history < 10 pack-years. All subjects underwent spirometry, respiratory muscle strength testing, maximal cardiopulmonary exercise testing (CPX), and a quality of life questionnaire (Minnesota, MLHFQ). Results: A negative correlation was observed between VO2max and MLHFQ scores (r = -0.626; p = 0.03) and a positive association with MIP (r = 0.713; p = 0.009). Positive correlations were also recorded between MIP and spirometric variables [FEV1 (r = 0.825; p = 0.001), FVC (r = 0.66; p = 0.01), and FEF25-75% (r = 0.639; p = 0.02)]. Conclusion: The present study demonstrated that, in patients with CCC, VO2max is directly related to inspiratory muscle strength and quality of life, while deteriorating lung function is directly associated with respiratory muscle weakness.
Abstract:
This work aims to understand the meanings teachers at a municipal public school in the city of Natal (RN) give to the relationship between planning and the process of continuing education. The guiding conjecture is the perception that teachers do not seem to conceive of school planning as a space for collective and continuing education. This conjecture is cause for reflection insofar as planning can be seen as a process charged with repressed structural tensions and conflicts within the school. Our theoretical-methodological principle is a multi-referential interpretation that draws on concepts from different models of analysis to understand a reality in which distinct dimensions are intertwined. We adopted the comprehensive interview as our research methodology, in which the research object is constructed through the theoretical elaboration of hypotheses arising in the field. The researcher seeks to master and personalize the instruments and theory for a concrete research project, whose closest image is what is called "intellectual craftsmanship". In the process of construction, we understood the need to grasp the record of a social knowledge incorporated by individuals into their historicity, their orientations, and the definitions of their action in relation to society as a whole. In this sense, the teachers interviewed revealed meanings concerning how to carry out planning that addresses the everyday realities of their students. Our analysis shows that part of the group of teachers is aware of making planning the basis of teaching work, directed not only at the practical aspect (elaboration and execution) but also at a process involving other, simultaneous aspects, such as reflection in action and reflection on action.
Thus, there is the possibility of improving planning by making it more dynamic and participatory through work projects as a teaching alternative, and by bringing pedagogical practice closer to the students' reality; for this reason, daily planning is of fundamental relevance, since the school space is complex and dynamic. Nevertheless, we perceive a lack of understanding of planning as a space for continuing education in the school, as a consequence of practices without reflection; the planning process therefore tends to be seen merely as technical, not as a political process oriented toward reflective action. For these reasons, external and internal tensions arise, joined to the uncertainties of everyday teaching work and to antagonistic feelings, which can hinder and limit the teachers' professional work, leading them to improvisation. The teachers suggest building a pedagogical proposal directed at continuing education, linked to the introduction of a reflective practice that takes the collective into account, including autonomy, flexibility, and openness in planning, and they stress the mediating role of the pedagogical coordinator as fundamentally important for strengthening collective work in the school and for emphasizing reflective practices.
Abstract:
Nowadays, telecommunications is one of the most dynamic and strategic areas in the world. Organizations are always seeking new management practices in an increasingly competitive environment where resources are scarce. In this scenario, data obtained from business and corporate processes gain even greater importance, although they are not yet adequately explored. Knowledge Discovery in Databases (KDD) then appears as an option for studying complex problems in different areas of management. This work proposes both a systematization of KDD activities, using concepts from different methodologies such as CRISP-DM, SEMMA, and Fayyad's approach, and a study of the viability of multivariate regression models to explain corporate telecommunications sales using performance indicators. Statistical methods were outlined to analyse the effects of such indicators on the behavior of business productivity. Based on business knowledge and standard statistical analysis, regression equations were defined and fitted, with their respective coefficients of determination. Hypothesis tests were also conducted on the parameters in order to validate the regression models. The results show that there is a relationship between these performance indicators and sales volume.
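The regression modeling described above can be sketched as an ordinary least squares fit with a coefficient of determination. The indicator data below are synthetic, not the company's, and the sketch is generic rather than the thesis's model:

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with intercept; returns coefficients and R^2."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
    return beta, r2

# Synthetic performance indicators (columns of X) driving sales volume y
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = 3.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(0.0, 0.1, 50)
beta, r2 = fit_ols(X, y)
print(r2 > 0.95)  # the two indicators explain nearly all sales variance here
```

The coefficient of determination plays the same role as the determination coefficients quoted for the fitted equations, and t-tests on the entries of `beta` would correspond to the parameter hypothesis tests mentioned above.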