15 results for precision and accuracy

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Publisher:

Abstract:

New drug delivery systems have been used to increase chemotherapy efficacy in the face of the possible drug resistance of cancer cells. Poly(lactic acid) (PLA) microparticles are able to reduce toxicity and prolong methotrexate (MTX) release. In addition, the use of PLA/poloxamer polymer blends can improve drug release by changing how the particles interact with biological surfaces. The aim of this study was to develop spray-dried biodegradable MTX-loaded microparticles and to evaluate PLA interactions with different kinds of Pluronic® (PLU F127 and PLU F68) in order to modulate drug release. The variables included different drug:polymer ratios (1:10, 1:4.5, 1:3) and polymer:copolymer ratios (25:75, 50:50, 75:25). The precision and accuracy of the spray-drying method were confirmed by assessing drug loading in the particles (75.0-101.3%). The MTX/PLA microparticles showed a spherical shape with an apparently smooth surface, which depended on the PLU ratio used in the blend particles. XRD and thermal analysis demonstrated that the drug was homogeneously dispersed in the polymer matrix, whereas the miscibility among components depended on the polymer:copolymer ratio used. No new drug-polymer bond was identified by FTIR analysis. The in vitro performance of MTX-loaded PLA microparticles demonstrated an extended-release profile, fitted using the Korsmeyer-Peppas kinetic model. PLU accelerated the drug release rate, possibly because PLU leached from the matrix. Nevertheless, drug release studies carried out in cell culture demonstrated the ability of PLU to modulate drug release from blend microparticles. This effect was confirmed by the cytotoxicity observed according to the amount of drug released as a function of time. Thus, the studied PLU grades were able to improve the performance of spray-dried MTX-loaded PLA microparticles, which can be successfully used as carriers for modulated drug delivery with potential in vivo application.
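The Korsmeyer-Peppas model mentioned in the abstract, Mt/M∞ = k·t^n, is commonly fitted by linear regression in log-log space. A minimal sketch, using hypothetical release data rather than values from the study:

```python
import numpy as np

def fit_korsmeyer_peppas(t, release_fraction):
    """Fit log(Mt/Minf) = log(k) + n*log(t) by least squares.

    Returns (k, n); the exponent n characterizes the release mechanism."""
    log_t = np.log(t)
    log_f = np.log(release_fraction)
    # Linear regression in log-log space: slope = n, intercept = log(k)
    n, log_k = np.polyfit(log_t, log_f, 1)
    return np.exp(log_k), n

# Hypothetical release profile: fraction released at each time point (hours)
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
f = 0.2 * t ** 0.45          # synthetic data generated with k = 0.2, n = 0.45
k, n = fit_korsmeyer_peppas(t, f)   # recovers k ≈ 0.2, n ≈ 0.45
```

The fitted exponent n is what distinguishes Fickian diffusion from anomalous transport in this family of models.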

Relevance:

90.00%

Publisher:

Abstract:

The objective of this Doctoral Thesis was to monitor, on a quarterly scale, the coastal morphology of sections of the northeastern coast of Rio Grande do Norte State, Brazil, an area of the Potiguar Basin influenced by oil industry activities. The studied sections comprise coastal areas with intense sedimentary erosion and high environmental sensitivity to oil spills. In order to achieve the general objective of this study, the work was systematized in four steps. The first refers to the evaluation of the geomorphological data acquisition methodologies used for Digital Elevation Models (DEM) of sandy beaches. The data were obtained from Soledade beach, located on the northeastern coast of Rio Grande do Norte. The second step centered on expanding the reference geodetic infrastructure needed for the geodetic survey of the studied area, by implanting a station on Corta Cachorro Barrier Island and by conducting monitoring geodetic surveys to understand the beach system based on multitemporal analysis of the Coastline (CL) and of DEMs. The third phase applied the methodology developed by Santos and Amaro (2011) and Santos et al. (2012) for the surveying, processing, representation, integration and analysis of coastlines of sandy coasts obtained through geodetic positioning techniques, morphological change analysis and sediment transport assessment. The fourth stage represents an innovation in coastal surveying: the use of Terrestrial Laser Scanning (TLS), based on Light Detection and Ranging (LiDAR), to evaluate a highly eroded section of Soledade beach where oil industry structures are located. The evaluation was achieved through high-precision, high-accuracy DEMs built while modeling the changes in coastal morphology.
Analysis of the results of this integrated study of the spatial and temporal interrelations of the intense coastal processes, in areas subject to cycles of beach construction and destruction, made it possible to identify the causes and consequences of the intense coastal erosion in exposed beach sections and on barrier islands.

Relevance:

90.00%

Publisher:

Abstract:

Investigations in the field of pharmaceutical analysis and quality control of medicines require analytical procedures with good performance characteristics. Calibration is one of the most important steps in chemical analysis, being directly related to parameters such as linearity. This work consisted in the development of a new methodology for obtaining calibration curves for drug analysis: the stationary-cuvette method. It was compared to the currently used methodology, and possible sources of variation between them were evaluated. The results demonstrated that the proposed technique presented reproducibility similar to the traditional methodology. In addition, some advantages were observed, such as user-friendliness, cost-effectiveness, accuracy, precision and robustness. Therefore, the stationary-cuvette methodology may be considered the best choice for obtaining calibration curves for drug analysis by spectrophotometry.
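A spectrophotometric calibration curve of the kind discussed here is typically a least-squares line relating concentration to absorbance, with linearity judged by the coefficient of determination. A minimal sketch with hypothetical data (not values from the study):

```python
import numpy as np

# Hypothetical calibration points: concentration (ug/mL) vs absorbance
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
absorbance = np.array([0.101, 0.198, 0.305, 0.402, 0.498])

# Least-squares calibration line: A = slope * C + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)
predicted = slope * conc + intercept
ss_res = float(np.sum((absorbance - predicted) ** 2))
ss_tot = float(np.sum((absorbance - absorbance.mean()) ** 2))
r_squared = 1.0 - ss_res / ss_tot   # linearity check: should be close to 1

# Invert the calibration line to estimate an unknown sample's concentration
unknown = (0.250 - intercept) / slope
```

Inverting the fitted line, as in the last step, is how an unknown sample's concentration is read off the curve regardless of how the curve itself was acquired.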


Relevance:

80.00%

Publisher:

Abstract:

The research investigates the role and importance, for their users, of public squares located predominantly in residential areas. It presents the results of post-occupancy evaluations carried out in three squares whose physical and environmental characteristics, equipment and furniture differ in quality and quantity, taking into consideration aspects related to the physical and psychological comfort of the users and of nearby residents. Data collection involved physical and archival surveys, behavioral observations, questionnaires and interviews, analyzed both qualitatively and quantitatively for greater precision and validity of the investigation. The results were obtained by relating the users' perception of the environmental attributes to the different levels of appropriation/use of the studied places. They indicate that the compositional, physical aspects of the space closely affect the type and intensity of use of the squares, contributing positively or negatively to their valuation. It is thus evidenced that the low attendance of the public squares of Natal is due mainly to aspects concerning the physical quality and the quantity of the furniture and urban equipment. It is concluded that investments in, and the physical planning of, these public spaces should be based on real knowledge of the aspirations of the target population, so as to allow their fuller use and valuation.

Relevance:

80.00%

Publisher:

Abstract:

The general objective of this thesis was the seasonal monitoring (on a quarterly time scale) of coastal and estuarine areas of a section of the northern coast of Rio Grande do Norte, Brazil, an area that is environmentally sensitive and suffers intense sediment erosion amid oil industry activities, in order to underpin the implementation of erosion-containment projects and mitigate the impacts of coastal dynamics. To achieve the general objective, the work was carried out systematically in three stages, which constituted the specific objectives. The first stage was the implementation of the geodetic reference infrastructure for carrying out the geodetic survey of the study area. This process included the implementation of the RGLS (Northern Coast of the RN GPS Network), consisting of stations with precise geodetic coordinates and orthometric heights; the positioning of benchmarks and the evaluation of the available gravimetric geoid, for use in precise GPS altimetry; and the development of software for precise GPS altimetry. The second stage was the development and improvement of methodologies for the collection, processing, representation, integration and analysis of CoastLines (CL) and Digital Elevation Models (DEM) obtained by geodetic positioning techniques. This stage included the choice of the equipment and positioning methods to be used, as a function of the required precision and the implanted structure, and the definition of the CL indicator and the geodetic references best suited to precise coastal monitoring. The third stage was the seasonal geodetic monitoring of the study area.
The execution times of the geodetic surveys were defined by analyzing the pattern of sediment dynamics of the study area; the surveys were performed in order to calculate and locate areas and volumes of erosion and accretion (areal and volumetric sedimentary balance) occurring along the CL and on the beach and island surfaces throughout the year; and correlations were studied between the variations measured (in area and volume) between each survey and the action of the coastal dynamic agents. The results allowed an integrated study of the spatial and temporal interrelationships of the causes and consequences of the intense coastal processes operating in the area, especially the measurement of the variability of erosion, transport, sedimentary balance and supply over the annual cycle of construction and destruction of beaches. In the analysis of the results, it was possible to identify the causes and consequences of the severe coastal erosion occurring on exposed beaches, and to analyze the recovery of beaches and the accretion occurring in tidal inlets and estuaries. From the standpoint of the seasonal variations of the CL, human interventions for erosion containment were proposed with the aim of restoring the previous situation of the beaches undergoing erosion.

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate the performance of the Bh kit in carrying out breast biopsies. METHODS: A sample of 30 patients with breast cancer undergoing mastectomy was randomly selected, based on the results of a pilot study, from February 2008 to April 2010. Women whose tumors were not palpable, were not of stone-hard consistency, had undergone previous surgical manipulation, or contained liquid were excluded. Using the helicoid biopsy kit (Bh kit) and a core-biopsy device, with cannula and 14-gauge needle respectively, one fragment was collected with each device from healthy tissue and from the tumor in each specimen, totaling 120 fragments for histological study. For data analysis, a 95% confidence level was defined, and SPSS version 13, the kappa index and the parametric Student t test were used. RESULTS: The mean age of the patients was 51.6 years (±11.1 years). Infiltrating ductal carcinoma showed the highest incidence, with 26 cases (86.7%). The core biopsy had a sensitivity of 93.3%, a specificity of 100% and an accuracy of 96.7%, while the helicoid biopsy had a sensitivity of 96.7%, a specificity of 100% and an accuracy of 98.3%. Comparing the histology of the tumors with that of the biopsy fragments, there was a high degree of agreement in the diagnoses (kappa of 0.93, p < 0.05). CONCLUSION: Both devices provided the histological diagnosis of the lesions with high accuracy. The results of this study showed that the helicoid biopsy is a reliable alternative in the preoperative diagnosis of breast lesions. Further in vivo studies will better define the role of the Bh kit in the diagnosis of these lesions.
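Sensitivity, specificity, accuracy and Cohen's kappa, as reported above, follow directly from a 2x2 table. A minimal sketch; the counts below are hypothetical, chosen only to yield figures of the same order as those reported:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity and accuracy from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    return sensitivity, specificity, accuracy

def cohen_kappa(tp, fn, fp, tn):
    """Cohen's kappa: agreement beyond chance for a 2x2 table."""
    n = tp + fn + fp + tn
    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fn) * (tp + fp) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical counts: 29 true positives, 1 false negative,
# 0 false positives, 30 true negatives
sens, spec, acc = diagnostic_metrics(29, 1, 0, 30)
kappa = cohen_kappa(29, 1, 0, 30)
```

With these illustrative counts the sensitivity is 29/30 and the specificity 1.0, mirroring the structure of the figures in the abstract.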

Relevance:

80.00%

Publisher:

Abstract:

The research aimed to understand the general perception of men about their health care in a family health unit. This is an exploratory and descriptive study with a qualitative approach, developed at the Centro de Saúde Dr. Vulpiano Cavalcante in the city of Parnamirim/RN. Twelve men enrolled in the Estratégia de Saúde da Família (ESF, Family Health Strategy), aged 20 to 59 years and living in the catchment area of one of the unit's ESF teams, participated in the investigation. Data were collected from July to August 2009 through structured interviews which, after transcription, underwent a process of identification of meaning units, coded and categorized according to the precepts of Bardin's content analysis. Following the steps of this method, the following themes emerged: "Revealing what motivates men to seek assistance in the Estratégia de Saúde da Família", "Expressing knowledge of the Estratégia de Saúde da Família" and "Giving an opinion about health care and the relationship with the ESF". The analysis proceeded according to the principles of symbolic interactionism as formulated by Blumer. The discussion was supported by literature on men in the context of public health policies, men in the family, and the influences of gender. When their properties and dimensions were analyzed, these themes gave rise to the central category "Man in the Estratégia de Saúde da Família". The results show that the respondents have limited knowledge about the ESF, turning to the service only when compelled by discomfort and the need for care. Moreover, they voice concern for their health, yet acknowledge that they do not practice self-care. Given this reality, we can conclude that men's perception of their health in the ESF is permeated by gender issues that influence their behavior toward disease prevention and health promotion.
This situation demands from health professionals and managers initiatives for the inclusion of men in ESF care actions, starting from an understanding of their conceptions of health care.

Relevance:

80.00%

Publisher:

Abstract:

The transplantation of organs and tissues presents itself as an important therapeutic option from the medical, social and economic standpoints. Thus, the variables that can interfere with the effectiveness of organ and tissue donation for transplantation need to be investigated adequately, given the increasing incidence of chronic and degenerative diseases in the population, which makes the transplant waiting list grow disproportionately while patients die without the opportunity to undergo treatment because of the lack of donors. In this context, the objective of this study was to evaluate the factors associated with the effectiveness of organ and tissue donation for transplantation. It is an evaluative, quantitative, prospective study with a longitudinal design, developed at the central office for the procurement, notification and donation of organs for transplantation, at the Organ Procurement Organization, and in six hospitals accredited for organ and tissue procurement and transplantation in Natal/RN, between August 2010 and February 2011, after approval by the Research Ethics Committee (No. 414/10, CAAE 007.0.294.000-10). The probabilistic sample, drawn without replacement, comprised 65 potential donors. A structured, checklist-type script for non-participant observation was used as the data collection instrument. Data were analyzed using descriptive statistics and presented in tables, charts, graphs and figures, using Microsoft Excel 2007 and the statistical program SPSS version 20.0. To check significance, the chi-square (χ2) and Mann-Whitney tests were applied; for cells with expected counts below five, Fisher's exact test was considered. A p-value < 0.05 was adopted as the significance level. Among those surveyed, most individuals were male (50.8%) and in the age group up to 45 years (53.8%), with a mean age of 42.3 years, minimum 5 and maximum 73 years (±17.32 years).
Most were single/widowed/divorced (56.9%), had at most completed elementary school (60.0%), exercised a professional activity (86.2%), were Catholic (83.1%) and lived in the metropolitan region of Natal (52.3%). A donation effectiveness of 27.7% was obtained. There was no statistical significance between structure and effectiveness of donation, but inadequacies were observed in physical resources (36.9%), materials (30.8%), organizational structure (29.2%) and human resources (18.5%). In the process, the maintenance phase (p = 0.004), the diagnosis of brain death (p = 0.032), the family interview (p ≤ 0.001) and the documentation (p = 0.001) showed statistical significance with effectiveness. Thus, the alternative hypothesis of the study is accepted: the adequacy of the factors related to structure and process is associated with the effectiveness of organ and tissue donation for transplantation. The effectiveness of organ and tissue donation thus depends essentially on the rapidity and accuracy with which the donation process is conducted, requiring an appropriate structure, with adequate physical and material resources and skilled human resources, in order to reduce both the waiting time and the suffering of those queued for an organ or tissue transplant in Brazil.
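The Fisher exact test mentioned above, used when a 2x2 cell has an expected count below five, can be computed directly from the hypergeometric distribution. A minimal, pure-Python sketch; the table below is illustrative, not data from the study:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    With margins fixed, each table's probability is hypergeometric; the
    p-value sums the probabilities of all tables no more likely than the
    observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):                       # P(cell (1,1) = x | fixed margins)
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1) if p_table(x) <= p_obs + 1e-12)

# Illustrative table [[3, 1], [1, 3]]: the classic small-sample example
p = fisher_exact_2x2(3, 1, 1, 3)
```

For the illustrative table the two-sided p-value is 34/70 ≈ 0.486, so no significance would be claimed at the 0.05 level used in the study.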

Relevance:

80.00%

Publisher:

Abstract:

This Master's dissertation investigated the performance and quality of web sites. The aim of the research was to propose an integrated model for the evaluation of digital information services on educational web sites. The universe of the research consisted of eighteen Brazilian universities offering graduate programs, at master's and doctoral levels, in the area of Production Engineering. The methodology adopted was a descriptive and exploratory study, using the techniques of systematic observation and focus groups for data collection, with dependent and independent variables, through the application of two research instruments. The analysis protocol was the instrument adopted for the evaluation and the obtaining of the qualitative results, and the analysis grid was applied for the evaluation and the obtaining of the quantitative results. The qualitative results identified a lack of standardization of the web sites in the attributes of content, information hierarchy, and the design of colors and type. The absence of accessibility for people with hearing and visual impairments was observed, as well as the lack of convergence of media and assistive technologies. The language of the sites was also evaluated, and all of them are presented only in Portuguese. The overall result was presented in a graph and in tables with a ranking of the universities, in which the rating "Good" predominated. For the quantitative results, the method of analysis was statistical, in order to obtain descriptive and inferential results relating the dependent and independent variables. As categories of analysis of the services of the evaluated sites, scores and a general weighted index were obtained. These results served as the basis for ranking the universities according to the presence or absence of service information on their web sites.
In the inferential analysis, the correlation or association of the independent variables (level, CAPES concept and period of existence of the program) with the characteristics, called service categories, was tested using the following statistical methods: Spearman's coefficient and Fisher's test. Only the category "disciplines of the master's program" showed significance with the independent variable CAPES concept. The main conclusion of this study was the absence of standardization in the subjective aspects of design, information hierarchy, navigability and content precision, and the lack of accessibility and convergence. As for the quantitative aspects, the information services offered by the web sites of the evaluated universities do not yet present a satisfactory and comprehensive quality. An absence is perceived of strategies, web tools, institutional marketing techniques and services that would make the sites more interactive, more navigable and of greater added value.

Relevance:

80.00%

Publisher:

Abstract:

This work develops a mathematical foundation for digital signal processing from the point of view of interval mathematics. It addresses the open problem of the precision and representation of data in digital systems with an interval version of signal representation. Signal processing is a rich and complex area, so this work restricts its focus to linear time-invariant systems. A vast literature exists in the area, but some concepts in interval mathematics need to be redefined or elaborated for the construction of a solid theory of interval signal processing. We construct basic foundations for signal processing in the interval setting, such as the basic properties of linearity, stability and causality, and an interval version of linear systems with its properties. Interval versions of the convolution and of the Z-transform are presented. Convergence of systems is analyzed using the interval Z-transform, an essentially interval distance, interval complex numbers, and an application to an interval filter.
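An interval version of the discrete convolution described above can be sketched with intervals represented as (lower, upper) pairs and endpoint arithmetic. This is an illustrative sketch, not the formalism developed in the work:

```python
def iadd(x, y):
    """Interval addition: [a, b] + [c, d] = [a + c, b + d]."""
    return (x[0] + y[0], x[1] + y[1])

def imul(x, y):
    """Interval multiplication: min/max over the four endpoint products."""
    p = (x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1])
    return (min(p), max(p))

def iconvolve(x, h):
    """Discrete convolution of two interval-valued signals."""
    out = [(0.0, 0.0)] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            out[i + j] = iadd(out[i + j], imul(xi, hj))
    return out

# A point-valued signal convolved with an uncertain (interval) impulse response
y = iconvolve([(1.0, 1.0), (2.0, 2.0)], [(1.0, 1.0), (0.5, 1.5)])
```

Degenerate intervals (lower = upper) recover ordinary convolution, which is the sense in which the interval theory extends the classical one.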

Relevance:

80.00%

Publisher:

Abstract:

Simulations based on cognitively rich agents can become a very intensive computing task, especially when the simulated environment represents a complex system. This situation becomes worse when time constraints are present. Such simulations would benefit from a mechanism that improves the way agents perceive and react to changes in these environments; in other words, an approach that improves the efficiency (performance and accuracy) of the decision process of autonomous agents in a simulation would be useful. In complex environments, full of variables, not every piece of information available to the agent is necessarily needed for its decision-making process; it depends on the task being performed. The agent therefore needs to filter incoming perceptions, much as we do with our focus of attention. By using a focus of attention, only the information that really matters to the agent's running context is perceived (cognitively processed), which can improve the decision-making process. The architecture proposed herein structures cognitive agents in two parts: 1) a main part containing the reasoning/planning process, the knowledge and the affective state of the agent; and 2) a set of behaviors that are triggered by planning in order to achieve the agent's goals. Each of these behaviors has a dynamically adjustable runtime focus of attention, tuned according to the variation of the agent's affective state. The focus of each behavior is divided into a qualitative focus, responsible for the quality of the perceived data, and a quantitative focus, responsible for the quantity of the perceived data. Thus, the behavior is able to filter the information sent by the agent's sensors and build a list of perceived elements containing only the information the agent needs, according to the context of the behavior currently running.
Inspired by the human attention focus, the agent is also endowed with an affective state, based on theories of human emotion, mood and personality. This model serves as the basis for the mechanism of continuous adjustment of the agent's attention focus, both qualitative and quantitative. With this mechanism, the agent can adjust its focus of attention during the execution of a behavior, becoming more efficient in the face of environmental changes. The proposed architecture can be used very flexibly: the focus of attention can work in a fixed way (neither the qualitative nor the quantitative focus changes), or with different combinations of qualitative and quantitative focus variation. The architecture was built on a platform for BDI agents, but its design allows it to be used with any other type of agent, since the implementation resides only in the agent's perception layer. In order to evaluate the contribution proposed in this work, an extensive series of experiments was conducted on an agent-based simulation of a fire-growing scenario. In the simulations, agents using the proposed architecture are compared with similar agents (with the same reasoning model) that process all the information sent by the environment. Intuitively, the omniscient agents would be expected to be more efficient, since they can weigh every possible option before taking a decision. However, the experiments showed that attention-focus-based agents can be as efficient as the omniscient ones, with the advantage of solving the same problems in significantly less time. Thus, the experiments indicate the efficiency of the proposed architecture.
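The two-part focus described above can be illustrated by a small perception filter, where the qualitative focus selects attributes and the quantitative focus limits how many percepts pass through. This is a hypothetical sketch; the names and the arousal-to-limit rule are illustrative, not the thesis's model:

```python
def filter_percepts(percepts, relevant_keys, arousal):
    """Apply a qualitative focus (attribute subset) and a quantitative
    focus (length limit scaled by arousal in [0, 1]) to raw percepts.

    percepts: list of dicts, assumed pre-sorted by decreasing salience."""
    # Qualitative focus: keep only the attributes the behavior cares about
    trimmed = [{k: p[k] for k in relevant_keys if k in p} for p in percepts]
    # Quantitative focus: higher arousal narrows attention to fewer items
    limit = max(1, int(len(trimmed) * (1.0 - 0.5 * arousal)))
    return trimmed[:limit]

# Four percepts from a hypothetical fire scenario, most salient first
raw = [{"heat": 9, "color": "red", "wind": 3},
       {"heat": 7, "color": "orange", "wind": 3},
       {"heat": 2, "color": "grey", "wind": 1},
       {"heat": 1, "color": "grey", "wind": 0}]
focused = filter_percepts(raw, ["heat", "wind"], arousal=1.0)
```

At maximum arousal the filter passes only the two most salient percepts, stripped to the attributes the running behavior declared relevant.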

Relevance:

80.00%

Publisher:

Abstract:

The precise and fast identification of bottom-hole abnormalities is essential to prevent damage and increase production in the oil industry. This work presents a study of a new automatic approach to the detection and classification of the operation mode in sucker-rod pumping through bottom-hole dynamometer cards. The main idea is to recognize the well's production status through image processing of the bottom-hole dynamometer card (boundary descriptors) together with statistical and similarity tools, namely Fourier descriptors, Principal Component Analysis (PCA) and the Euclidean distance. In order to validate the proposal, real data from sucker-rod pumping systems are used.
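Boundary-based recognition of the kind described can be sketched with Fourier descriptors of the card contour compared by Euclidean distance. An illustrative sketch; the contour below is synthetic, not a real dynamometer card:

```python
import numpy as np

def fourier_descriptors(boundary, n_desc=8):
    """Translation- and scale-invariant Fourier descriptors of a closed
    contour given as a complex array x + 1j*y.

    Skipping the DC term removes translation; dividing by the first
    harmonic's magnitude removes scale; taking magnitudes discards the
    starting-point phase."""
    F = np.fft.fft(boundary)
    mags = np.abs(F[1:n_desc + 1])
    return mags / mags[0]

def card_distance(desc_a, desc_b):
    """Euclidean distance between two descriptor vectors."""
    return float(np.linalg.norm(desc_a - desc_b))

# Synthetic closed contour (a circle) and a scaled copy of it
t = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
contour = np.exp(1j * t)
d_small = fourier_descriptors(contour)
d_large = fourier_descriptors(3.0 * contour)
```

The scaled copy yields the same descriptor vector, which is the invariance that lets cards from wells of different load ranges be compared by shape alone.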

Relevance:

80.00%

Publisher:

Abstract:

This paper presents an evaluative study of the effects of using a machine learning technique on the main features of a self-organizing, multiobjective genetic algorithm (GA). A typical GA can be seen as a search technique usually applied to problems of non-polynomial complexity. Originally, these algorithms were designed as methods for seeking acceptable solutions to problems where the global optimum is inaccessible or difficult to obtain. At first, GAs considered only one evaluation function and single-objective optimization. Today, however, implementations that consider several optimization objectives simultaneously (multiobjective algorithms) are common, as well as implementations that allow many components of the algorithm to change dynamically (self-organizing algorithms). Combinations of GAs with machine learning techniques to improve some of their performance and usability characteristics are also common. In this work, a GA with a machine learning technique was analyzed and applied to an antenna design problem. We used a variant of the bicubic interpolation technique, called 2D Spline, as the machine learning technique, to estimate the behavior of a dynamic fitness function based on the knowledge obtained from a set of laboratory experiments. This fitness function, also called the evaluation function, is responsible for determining the degree of fitness of a candidate solution (individual) relative to the others in the same population. The algorithm can be applied in many areas, including telecommunications, for example in the design of antennas and frequency-selective surfaces. In this particular work, the algorithm was developed to optimize the design of a microstrip antenna, commonly used in wireless communication systems, for Ultra-Wideband (UWB) applications.
The algorithm optimized two variables of the antenna geometry, the length (Ls) and the width (Ws) of a slit in the ground plane, with respect to three objectives: radiated signal bandwidth, return loss and central frequency deviation. These two dimensions (Ws and Ls) are used as variables in three different interpolation functions, one spline for each optimization objective, composing a multiobjective, aggregate fitness function. The final result proposed by the algorithm was compared with the result of the simulation program and with the measured result of a physical prototype of the antenna built in the laboratory. In the present study, the algorithm was analyzed with respect to its degree of success on four important characteristics of a self-organizing multiobjective GA: performance, flexibility, scalability and accuracy. At the end of the study, an increase in execution time compared to a plain GA was observed, due to the time required by the machine learning process. On the plus side, a marked gain in the flexibility and accuracy of the results was noticed, along with a promising path indicating how to extend the algorithm to optimization problems with n variables.
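The surrogate-fitness idea above, interpolating a measured objective surface over (Ws, Ls) instead of re-evaluating each individual experimentally, can be sketched as follows, using bilinear interpolation as a simple stand-in for the bicubic 2D Spline; the grid and values are hypothetical:

```python
import numpy as np

def surrogate_fitness(ws_grid, ls_grid, measured, ws, ls):
    """Estimate an objective at (ws, ls) by interpolating a grid of
    measured evaluations (bilinear stand-in for the bicubic 2D Spline)."""
    i = int(np.clip(np.searchsorted(ws_grid, ws) - 1, 0, len(ws_grid) - 2))
    j = int(np.clip(np.searchsorted(ls_grid, ls) - 1, 0, len(ls_grid) - 2))
    tx = (ws - ws_grid[i]) / (ws_grid[i + 1] - ws_grid[i])
    ty = (ls - ls_grid[j]) / (ls_grid[j + 1] - ls_grid[j])
    z00, z01 = measured[i, j], measured[i, j + 1]
    z10, z11 = measured[i + 1, j], measured[i + 1, j + 1]
    return ((1 - tx) * (1 - ty) * z00 + (1 - tx) * ty * z01
            + tx * (1 - ty) * z10 + tx * ty * z11)

# Hypothetical measured grid: objective f(ws, ls) = ws + 2*ls
ws_grid = np.array([0.0, 1.0, 2.0])
ls_grid = np.array([0.0, 1.0, 2.0])
measured = ws_grid[:, None] + 2.0 * ls_grid[None, :]
estimate = surrogate_fitness(ws_grid, ls_grid, measured, 0.5, 1.5)
```

In the GA loop, one such interpolant per objective replaces the expensive evaluation, which is exactly the source of the speed/accuracy trade-off discussed in the abstract.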

Relevance:

80.00%

Publisher:

Abstract:

The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research attracts most of the investment in the area. Acquisition, processing and interpretation of seismic data are the parts that make up a seismic study. Seismic processing, in particular, is focused on imaging the geological structures in the subsurface. Seismic processing has evolved significantly in recent decades due to the demands of the oil industry, and also due to hardware advances that delivered higher storage and digital processing capacities, enabling the development of more sophisticated processing algorithms such as those that use parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults and salt domes, among other structures of interest such as potential hydrocarbon reservoirs. However, a migration performed with quality and accuracy may be very time-consuming, due to the heuristics of the mathematical algorithm and the extensive amount of input and output data involved; the process may take days, weeks or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that could make these methods impracticable. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique.
Furthermore, analyses such as speedup and efficiency were performed, and, finally, the degree of algorithmic scalability with respect to the technological advances expected from future processors was identified.
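The speedup and efficiency analyses mentioned above reduce to two standard ratios, S = T1/Tp and E = S/p. A minimal sketch; the timings are hypothetical, not measurements from this work:

```python
def speedup(t_serial, t_parallel):
    """Speedup S = T1 / Tp of a parallel run over the serial baseline."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_threads):
    """Parallel efficiency E = S / p: the fraction of ideal scaling achieved."""
    return speedup(t_serial, t_parallel) / n_threads

# Hypothetical RTM timings: 800 s serial, 160 s on 8 OpenMP threads
s = speedup(800.0, 160.0)
e = efficiency(800.0, 160.0, 8)
```

An efficiency below 1.0 on a fixed problem size is the usual signature of serial fractions and memory-bandwidth limits, which is what a scalability study of an RTM kernel probes.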