13 results for 700103 Information processing services
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
In recent decades, changes in the telecommunications industry, combined with the competition driven by privatization and concession policies, have stimulated the world market and irrefutably brought about a new reality. The effects in Brazil have become evident in significant growth rates: in 2012 the sector reached a net operating income of 128 billion dollars, placing the country among the five major world powers in mobile communications. In this context, an issue of increasing importance for the financial health of companies is their ability to retain their customers and to turn them into loyal customers. Customer churn has been generating monthly disconnection rates of about two to four percent, making retention one of the biggest challenges for business management, since capturing a new customer costs more than five times as much as retaining one. To this end, models have been developed by means of structural equation modeling to identify the relationships between the various determinants of customer loyalty in the context of services. The original contribution of this thesis is to develop a loyalty model from the identification of relationships between determinants of satisfaction (latent variables) and the inclusion of attributes that determine the perception of service quality in the mobile communications industry, such as quality, satisfaction, value, trust, expectation, and loyalty. The research is to be conducted with customers of the operators through simple random sampling, using structured questionnaires. As a result, the proposed model and its statistical evaluation should enable operators to conclude that customer loyalty is directly influenced by the technical and operational quality of the services offered, as well as provide a satisfaction index for the mobile communication segment.
Abstract:
The present study investigated the impact of the treatment modalities of Acute Lymphoblastic Leukemia (ALL) on the neurocognitive abilities of child and adolescent survivors, aged between 6 and 16 years, followed in the pediatric oncology units of public health services in the cities of Campina Grande-PB and Natal-RN. The study included 52 children, 13 of them children and adolescents diagnosed with leukemia and 39 healthy children matched to the study group by gender, age, school type, and level of maternal education. The group of children with leukemia was then subdivided into two subgroups according to the treatment modality to which they had been submitted: Group 1A (chemotherapy only) and Group 1B (chemotherapy and radiotherapy). All participants completed a battery of neuropsychological tests investigating the following neurocognitive abilities: intellectual ability, memory, attention, visuospatial and visuoconstructive skills, processing speed, and executive functions. Data were analyzed using descriptive and inferential measures, with the Mann-Whitney U test and the t-test, considering the influence of the variables sex, age at diagnosis, time since completion of treatment, and maternal schooling on the children's performance. Overall, it is concluded that the illness and the treatment of acute lymphoblastic leukemia significantly favor the emergence of cognitive deficits, particularly in visuospatial, visuoconstructive, and executive skills. In turn, radiotherapy is associated with more severe deficits, with a marked impact on the speed of information processing. It is hoped that the results presented here will contribute to a better understanding of the nature and extent of the neurocognitive effects arising from ALL treatment.
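The inferential analysis described relies on the Mann-Whitney U test; a minimal SciPy sketch on synthetic scores (group sizes follow the study, 13 vs. 39, but all values are invented for illustration):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# Hypothetical processing-speed scores: treated group vs. matched healthy controls
treated = rng.normal(loc=90, scale=12, size=13)    # children treated for ALL
controls = rng.normal(loc=102, scale=12, size=39)  # matched healthy children

# Two-sided nonparametric comparison of the two independent samples
u_stat, p_value = mannwhitneyu(treated, controls, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```

The Mann-Whitney U test is appropriate here because neuropsychological scores in small samples often violate the normality assumption of the t-test.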
Abstract:
LOPES-DOS-SANTOS, V.; CONDE-OCAZIONEZ, S.; NICOLELIS, M. A. L.; RIBEIRO, S. T.; TORT, A. B. L. Neuronal assembly detection and cell membership specification by principal component analysis. PLoS ONE, v. 6, p. e20996, 2011.
Abstract:
TORT, A. B. L.; SCHEFFER-TEIXEIRA, R.; SOUZA, B. C.; DRAGUHN, A.; BRANKACK, J. Theta-associated high-frequency oscillations (110-160 Hz) in the hippocampus and neocortex. Progress in Neurobiology, v. 100, p. 1-14, 2013.
Abstract:
Different types of network oscillations occur in different behavioral, cognitive, or vigilance states. The rodent hippocampus expresses prominent theta oscillations at frequencies between 4 and 12 Hz, which are superimposed by phase-coupled gamma oscillations (30-100 Hz). These patterns entrain multineuronal activity over large distances and have been implicated in sensory information processing and memory formation. Here we report a new type of oscillation at near-theta frequencies (2-4 Hz) in the hippocampus of urethane-anesthetized mice. The rhythm is highly coherent with nasal respiration and with rhythmic field potentials in the olfactory bulb; hence, we called it hippocampal respiration-induced oscillations. Despite the similarity in frequency range, several features distinguish this pattern from locally generated theta oscillations: hippocampal respiration-induced oscillations have a unique laminar amplitude profile, are resistant to atropine, couple differently to gamma oscillations, and are abolished when nasal airflow is bypassed by tracheotomy. Hippocampal neurons are entrained by both the respiration-induced rhythm and concurrent theta oscillations, suggesting a direct interaction between endogenous activity in the hippocampus and nasal respiratory inputs. Our results demonstrate that nasal respiration strongly modulates hippocampal network activity in mice, providing a long-range synchronizing signal between olfactory and hippocampal networks.
Abstract:
This Master's dissertation investigated the performance and quality of websites. The goal of the research is to propose an integrated model for evaluating digital information services on educational websites. The research universe comprised eighteen Brazilian universities offering graduate programs, at master's and doctoral levels, in the area of Production Engineering. The methodology adopted was descriptive and exploratory research, using the techniques of systematic observation and focus group for data collection, with dependent and independent variables, through the application of two research instruments. An analysis protocol was the instrument adopted for the evaluation and collection of the qualitative results, and an analysis grid was applied for the evaluation and collection of the quantitative results. The qualitative results identified a lack of standardization of the websites with respect to the attributes of content, information hierarchy, and design of colors and fonts. The absence of accessibility for people with auditory and visual special needs was observed, as well as the lack of convergence of media and assistive technologies. The language of the sites was also evaluated, and all presented Portuguese as their only language. The overall result is presented in graphs and tables with a classification of the universities, with the grade "Good" predominating. For the quantitative results, the analysis method was statistical, in order to obtain descriptive and inferential results relating the dependent and independent variables. As a category of analysis of the services of the evaluated sites, scores and a general weighted index were obtained. These results served as a basis for ranking the universities with respect to the presence or absence of service information on their websites.
In the inferential analysis, tests of correlation or association were run between the independent variables (level, CAPES concept, and period of existence of the program) and the characteristics, called service categories. For this analysis the statistical methods used were Spearman's coefficient and Fisher's test. Only the category "disciplines of the Master's program" presented significance with the independent variable CAPES concept. The main conclusion of this study was the absence of standardization regarding the subjective aspects (design, information hierarchy, navigability, and content precision) and the lack of accessibility and media convergence. Regarding the quantitative aspects, the information services offered by the websites of the evaluated universities still do not present satisfactory and comprehensive quality. The absence of strategies, web tools, institutional marketing techniques, and services that would make the sites more interactive, more navigable, and of greater added value is noticeable.
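The correlation and association tests named above can be sketched with SciPy; all values below (CAPES concepts, service scores, and the 2x2 contingency table) are invented for illustration, not the dissertation's data:

```python
import numpy as np
from scipy.stats import spearmanr, fisher_exact

# Hypothetical CAPES concepts (3-6) and service-category scores for 18 programs
capes_concept = np.array([3, 4, 4, 5, 5, 6, 3, 4, 5, 6, 4, 5, 3, 6, 5, 4, 6, 5])
service_score = np.array([55, 60, 58, 72, 70, 85, 50, 63, 75, 88,
                          61, 74, 52, 90, 78, 65, 86, 77])

# Rank correlation between program concept and service information quality
rho, p_rho = spearmanr(capes_concept, service_score)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.4f}")

# Fisher's exact test on a hypothetical 2x2 table: program level
# (master's only vs. with doctorate) versus presence of service information
table = [[4, 2],
         [3, 9]]
odds, p_fisher = fisher_exact(table)
print(f"Fisher odds ratio = {odds:.2f}, p = {p_fisher:.4f}")
```

Spearman's coefficient suits the ordinal CAPES concept, while Fisher's exact test handles the small cell counts typical of an 18-university sample.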
Abstract:
The number of applications based on embedded systems grows significantly every year. Even though embedded systems have restrictions and simple processing units, their performance has been improving steadily. However, as the complexity of applications also increases, better performance will always be necessary. Thus, even with such advances, there are cases in which an embedded system with a single processing unit is not sufficient to perform the required information processing in real time. To improve the performance of these systems, parallel processing can be used in more complex applications that require high performance. The idea is to move beyond applications that already use embedded systems, exploring the use of a set of processing units working together to implement an intelligent algorithm. Many works exist in the areas of parallel processing, intelligent systems, and embedded systems; however, works linking these three areas to solve a problem are few. In this context, this work aimed to use the tools available for FPGA architectures to develop a platform with multiple processors for use in pattern classification with artificial neural networks.
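As a rough illustration of the workload the platform targets, pattern classification with an artificial neural network, the following NumPy sketch shows the forward pass of a single-hidden-layer network with invented weights and layer sizes; the thesis maps this kind of computation onto multiple processors in an FPGA, which this sketch does not model:

```python
import numpy as np

def mlp_forward(x, w1, b1, w2, b2):
    """Single-hidden-layer MLP forward pass: the per-sample arithmetic that
    a multiprocessor platform could partition across processing units."""
    h = np.tanh(w1 @ x + b1)   # hidden-layer activations
    out = w2 @ h + b2          # linear output scores
    return int(np.argmax(out)) # predicted class index

rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(8, 4)), np.zeros(8)  # 4 inputs -> 8 hidden units
w2, b2 = rng.normal(size=(3, 8)), np.zeros(3)  # 8 hidden units -> 3 classes
x = rng.normal(size=4)                         # one input pattern
print("predicted class:", mlp_forward(x, w1, b1, w2, b2))
```

The matrix-vector products dominate the cost, which is why they are the natural unit of work to distribute across processing elements.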
Abstract:
The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research concentrates most of the investment in the area. Acquisition, processing, and interpretation of seismic data are the stages of a seismic study. Seismic processing in particular is focused on imaging the geological structures in the subsurface. It has evolved significantly in recent decades due to the demands of the oil industry and to advances in hardware, which achieved higher storage and digital information processing capabilities and enabled the development of more sophisticated processing algorithms, such as those that use parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults and salt domes, among other structures of interest such as potential hydrocarbon reservoirs. However, a migration performed with quality and accuracy can be a very time-consuming process, due to the heuristics of the mathematical algorithm and the extensive amount of input and output data involved; it may take days, weeks, or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that can derail the use of these methods. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique.
Furthermore, analyses such as speedup and efficiency were performed and, ultimately, the degree of algorithmic scalability was identified with respect to the technological advancement expected in future processors.
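The speedup and efficiency analyses mentioned have standard definitions, speedup S_p = T_1 / T_p and efficiency E_p = S_p / p; a minimal sketch with hypothetical wall-clock times (not the thesis's measurements):

```python
def speedup(t_serial, t_parallel):
    """Classical speedup: serial run time over parallel run time."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Fraction of ideal linear speedup actually achieved."""
    return speedup(t_serial, t_parallel) / n_procs

# Hypothetical RTM run times (seconds) for 1, 4, and 8 threads
t1, t4, t8 = 3600.0, 1000.0, 620.0
print(f"4 threads: S = {speedup(t1, t4):.2f}, E = {efficiency(t1, t4, 4):.2f}")
print(f"8 threads: S = {speedup(t1, t8):.2f}, E = {efficiency(t1, t8, 8):.2f}")
```

Efficiency dropping as threads are added is the usual signature of serial fractions and memory-bandwidth limits, which is what a scalability study of an RTM kernel would quantify.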
Abstract:
This paper analyzes the performance of a parallel implementation of Coupled Simulated Annealing (CSA) for the unconstrained optimization of continuous-variable problems. Parallel processing is an efficient form of information processing that emphasizes the exploitation of simultaneous events in the execution of software. It arises primarily from the demand for high computational performance and the difficulty of increasing the speed of a single processing core. Although multicore processors are easily found nowadays, several algorithms are not yet suitable for running on parallel architectures. The CSA algorithm is characterized by a group of Simulated Annealing (SA) optimizers working together to refine the solution, each SA optimizer running in a single thread executed by a different processor. In the analysis of parallel performance and scalability, the following metrics were investigated: execution time; the speedup of the algorithm with respect to an increasing number of processors; and the efficient use of processing elements with respect to the increasing size of the treated problem. Furthermore, the quality of the final solution was verified. For this study, the paper proposes a parallel version of CSA and its equivalent serial version. Both algorithms were analyzed on 14 benchmark functions; for each function, CSA is evaluated using 2 to 24 optimizers. The results obtained are shown and discussed in light of these metrics. The paper concludes that CSA is a good parallel algorithm, both in the quality of its solutions and in its parallel scalability and efficiency.
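The coupling at the heart of CSA can be sketched in simplified form: each optimizer's acceptance probability is modulated by a term aggregating the energies of all current solutions, so the ensemble shares information while searching. The sketch below is a loose illustration under invented parameters (temperature schedules, the common Rastrigin benchmark, six optimizers), not the paper's exact formulation:

```python
import math
import random

def rastrigin(x):
    """A common continuous benchmark function (global minimum 0 at the origin)."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def csa_step(solutions, temp_acc, temp_gen, objective, rng):
    """One synchronous step of a simplified coupled SA: each optimizer proposes
    a Gaussian move; worse moves are accepted with a probability coupled to
    the energies of the whole ensemble."""
    energies = [objective(s) for s in solutions]
    e_max = max(energies)
    gamma = sum(math.exp((e - e_max) / temp_acc) for e in energies)  # coupling term
    new_solutions = []
    for sol, e in zip(solutions, energies):
        probe = [xi + rng.gauss(0, temp_gen) for xi in sol]
        accept_prob = math.exp((e - e_max) / temp_acc) / gamma
        if objective(probe) < e or rng.random() < accept_prob:
            new_solutions.append(probe)
        else:
            new_solutions.append(sol)
    return new_solutions

rng = random.Random(1)
sols = [[rng.uniform(-5, 5) for _ in range(2)] for _ in range(6)]  # 6 coupled optimizers
for step in range(200):
    sols = csa_step(sols, temp_acc=1.0 * 0.99 ** step,
                    temp_gen=0.5 * 0.99 ** step, objective=rastrigin, rng=rng)
best = min(rastrigin(s) for s in sols)
print(f"best energy after 200 steps: {best:.3f}")
```

Because the optimizers interact only through the aggregated coupling term, each one can run in its own thread with a single synchronization point per step, which is what makes the algorithm attractive for parallel execution.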
Abstract:
Information technology (IT) has, over the years, gained prominence as a strategic element and competitive edge in organizations, public or private. In the judiciary, with the implementation of actions related to the Judiciário Eletrônico, IT definitively earned its status as a strategic element and significantly raised the level of dependence of the organs on its services and products. Increasingly, the quality of the services provided by IT has a direct impact on the quality of the services provided by the agency as a whole. At the Ministério Público do Estado do Rio Grande do Norte (MPRN), the deployment of Electronic Government actions, along with an administrative reform, caused a large increase in the institutional demand for the products and services provided by the Diretoria de Tecnologia da Informação (DTI), the sector responsible for the provision of IT services. Taking as a starting point the strategic goal set by the MPRN of reaching an 85% level of user satisfaction in four years, we propose a method that assists in meeting this goal while respecting the capacity constraints of the IT sector. To achieve the proposed objective, the work was conducted in two distinct and complementary stages. In the first stage we conducted a case study at the MPRN in which, through an internal and external diagnosis of the DTI, accomplished by an internal consulting action and a user satisfaction survey, we sought to identify opportunities for change to raise the perceived quality of the services provided by the DTI from the viewpoint of its customers. The situational report drawn from the collected data fostered changes in the DTI, which were then evaluated with the managers.
In the second stage, building on the results of the initial process, empirical observation, the evaluation of side projects for quality improvement in the sector, and validation of the initial model with the managers, we developed an improved process which, beyond identifying service gaps, includes a strategy for selecting best management practices and deploying them in an incremental and adaptive way, allowing the process to be applied in organs with few staff allocated to the provision of information technology services.
Abstract:
Central nervous system tumors are the most common pediatric solid tumors. Sixty percent of these tumors arise in the posterior fossa, mainly in the cerebellum. The first therapeutic approach is surgical resection; malignant tumors require additional strategies, chemotherapy and radiotherapy. Increasing survival makes evident that childhood brain tumors result in academic and social difficulties that compromise the patients' quality of life. This study investigated the intellectual functioning of children aged 7 to 15 years diagnosed with posterior fossa tumors and treated at CEHOPE - Recife/PE. Twenty-one children were eligible, including 13 children with pilocytic astrocytoma (G1), who underwent surgical resection only, and eight children with medulloblastoma (G2), submitted to surgical resection, chemotherapy, and craniospinal radiotherapy. Participants were evaluated with the Wechsler Intelligence Scale for Children (WISC-III). Children in G1 scored better than children in G2. Inferential tools (Mann-Whitney U test) identified significant differences (p ≤ 0.05) in the Performance IQ (PIQ) and Processing Speed Index (PSI) as a function of treatment modality; in the Full Scale IQ (FSIQ), PIQ, and PSI as a function of parental educational level; and in the PIQ, FSIQ, PSI, and Freedom from Distractibility Index (FDI) as a function of the time between diagnosis and evaluation. These results show the late and progressive impact of radiotherapy on white matter and on information processing speed. Furthermore, children whose parents have a higher educational level showed better intellectual performance, indicating the influence of socio-cultural variables on cognitive development. The impact of cancer and its treatment on cognitive development and learning should not be underestimated. These results support the need to better understand such effects in order to propose therapeutic strategies which ensure, in addition to the cure, the full development of children with this pathology.
Abstract:
This study analyzes the tourist information provided by the official websites of the 2014 FIFA World Cup host cities. The framework developed by Díaz (2005) was applied to analyze aspects such as local tourist information, distribution of tourist services, communication and interaction between website and users, and foreign-language versions of the websites. The dissertation describes how society and tourism are related by analyzing the consequences of technological evolution in the travel and tourism sector, showing the importance of information and communication technology for providing accurate, up-to-date, and low-cost information on tourist destinations. Given the nature of the study, the research subjects are the 12 Brazilian host cities, represented by their respective official webpages (cities, states, and convention bureaus), plus Brazil's official website, totaling 36 elements to be analyzed. The methodology is characterized as descriptive and exploratory with quantitative analysis, also drawing on desk research and a survey of the literature. To analyze the data collected, parametric and nonparametric statistical tests were used: analysis of variance (ANOVA and Kruskal-Wallis) to compare means between groups, combined with multiple comparison tests (Tukey and Games-Howell); nonparametric correlation tests (Kendall's tau-b); and cluster analysis. Microsoft Excel was used to collect the data and SPSS to manage it through the quantitative analyses. Overall, the websites of the South region showed better results than those of the other Brazilian regions. Despite this result, the data analysis demonstrated that the available tourist information is incomplete, as it was verified that the host cities' websites are unable to provide all the information web visitors need to organize and plan their journey, meaning that visitors have to look for more information in other sources.
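The group-comparison tests named above, ANOVA and Kruskal-Wallis, can be sketched with SciPy on invented per-region scores; the group sizes, means, and variances below are all hypothetical:

```python
import numpy as np
from scipy.stats import f_oneway, kruskal

rng = np.random.default_rng(7)
# Hypothetical website evaluation scores grouped by Brazilian region
south = rng.normal(loc=80, scale=5, size=8)
southeast = rng.normal(loc=74, scale=5, size=12)
northeast = rng.normal(loc=70, scale=5, size=10)

# Parametric one-way ANOVA and its nonparametric counterpart
f_stat, p_anova = f_oneway(south, southeast, northeast)
h_stat, p_kw = kruskal(south, southeast, northeast)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.4f}")
```

Running both tests, as the study does, is a common robustness check: when group sizes are small or variances unequal, the nonparametric result guards against ANOVA's assumptions being violated.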
Abstract:
This study presents the results of an analysis, by remote sensing, of areas susceptible to degradation in the semi-arid region, a matter of concern that affects the whole population; the process is catalyzed by the deforestation of the caatinga and by improper soil use practices. The objective of this research is to use biophysical parameters from MODIS/Terra and TM/Landsat-5 images to determine areas susceptible to degradation in the semi-arid region of Paraíba. The study area is located in the central interior of Paraíba, in the sub-basin of the Taperoá River, with average annual rainfall below 400 mm and average annual temperature of 28 °C. To draw up the vegetation map, TM/Landsat-5 images were used, specifically the 5R4G3B color composition, commonly used for mapping land use. This map was produced by unsupervised maximum likelihood classification. The legend corresponds to the following targets: sparse and dense caatinga vegetation, riparian vegetation, and exposed soil. The biophysical parameters used from MODIS were emissivity, albedo, and the Normalized Difference Vegetation Index (NDVI). The GIS programs used were the MODIS Reprojection Tools and SPRING (Sistema de Processamento de Informações Georreferenciadas), in which the MODIS and TM data were organized and processed, together with the ArcGIS software for producing more customizable maps. Initially, the behavior of the vegetation emissivity was evaluated by adapting Bastiaanssen's equation on the NDVI to spatialize the emissivity and observe its changes during the year 2006. The albedo was used to observe its percentage increase between December 2003 and December 2004. The Landsat TM images were used for the month of December 2005, according to the availability of images and in periods of low emissivity. For these applications, programs were written in LEGAL (Linguagem Espacial para Geoprocessamento Algébrico), a programming routine of SPRING which allows performing various types of algebra on spatial data and maps.
For the detection of areas susceptible to environmental degradation, the behavior of the emissivity of the caatinga was taken into account: it showed a seasonality coinciding with the rainy season, reaching maximum emissivity from April to July and low emissivity in the remaining months. With the albedo images of December 2003 and 2004, the percentage increase was verified, which allowed the generation of two distinct classes: areas with a percentage variation of 1 to 11.6%, and areas with a percentage change in albedo of less than 1%. It was then possible to generate the map of susceptibility to environmental degradation by intersecting the exposed-soil class with the albedo percentage variation, resulting in classes of susceptibility to environmental degradation.
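The NDVI used as a biophysical parameter is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red); a minimal NumPy sketch with invented reflectance values (not data from the study):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Toy reflectance values for three cover types:
# dense caatinga, sparse caatinga, exposed soil (hypothetical)
nir = np.array([0.45, 0.30, 0.25])
red = np.array([0.05, 0.12, 0.20])
print(np.round(ndvi(nir, red), 2))  # higher values indicate denser vegetation
```

In practice this is computed pixel by pixel over whole MODIS or Landsat scenes, which is why low NDVI combined with high albedo variation flags candidate degraded areas.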