68 results for "fuzzy equivalence"
Abstract:
In scientific computing, data must be as precise and accurate as possible; however, the imprecision of the input data of this kind of computation may stem from measurements obtained by equipment that delivers truncated or rounded values, so that calculations over these data produce imprecise results. The most common errors in scientific computing are truncation errors, which arise when infinite data are truncated, or cut short, and rounding errors, which are responsible for the imprecision of calculations over finite sequences of arithmetic operations. Facing this kind of problem, Moore introduced interval mathematics in the 1960s, defining a data type that makes it possible to work with continuous data and, moreover, to bound the maximum size of the error. Interval mathematics is a way out of this problem, since it allows automatic error control and analysis. However, the algebraic properties of intervals are not the same as those of the real numbers, even though real numbers can be seen as degenerate intervals, and the algebraic properties of degenerate intervals are exactly those of the reals. Starting from this observation, and with algebraic specification techniques in mind, one needs a language able to implement an auxiliary notion of equivalence introduced by Santiago [6] that "simulates" the algebraic properties of the real numbers on intervals. The Common Algebraic Specification Language (CASL) [1] is an algebraic specification language for the description of functional requirements and modular software designs, which has been developed since 1996 by CoFI, The Common Framework Initiative [2]. The development of CASL is still in progress and represents a joint effort of leading researchers in algebraic specification to create a standard for the field. The proposed dissertation presents a CASL specification of the interval type, equipped with Moore arithmetic, so that it can extend systems that manipulate continuous data, enabling not only the control and analysis of approximation errors, but also the algebraic verification of properties of the kind of system mentioned here. The interval specification presented here was built upon the specification of the rational numbers proposed by Mossakowski in 2001 [3] and introduces the notion of local equality proposed by Santiago [6, 5, 4].
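To make the interval idea concrete, here is a minimal Python sketch (not from the dissertation) of Moore's interval arithmetic. Degenerate intervals behave like real numbers, while proper intervals are only sub-distributive, one instance of the algebraic difference the abstract alludes to:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # Moore addition: [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Moore multiplication: take min and max over all endpoint products
        ps = (self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi)
        return Interval(min(ps), max(ps))

# Degenerate intervals recover ordinary real arithmetic:
print(Interval(2.0, 2.0) * Interval(3.0, 3.0))   # Interval(lo=6.0, hi=6.0)

# Proper intervals are only sub-distributive: X*(Y+Z) is contained in X*Y + X*Z
X, Y, Z = Interval(-1.0, 1.0), Interval(1.0, 2.0), Interval(-2.0, -1.0)
print(X * (Y + Z))      # Interval(lo=-1.0, hi=1.0)
print(X * Y + X * Z)    # Interval(lo=-4.0, hi=4.0)
```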
Abstract:
With the increasing complexity of software systems, there is also increased concern about their faults. These faults can cause financial losses and even loss of life. Therefore, we propose in this work the minimization of faults in software by using formally specified tests. The combination of testing and formal specifications has been gaining strength in research, mainly through MBT (Model-Based Testing). The development of software from formal specifications, when the whole refinement process is done rigorously, ensures that what is specified will be implemented in the application. Thus, an implementation generated from these specifications would accurately reflect what was specified. However, the specification is not always refined down to the level of implementation and code generation, and in these cases tests generated from the specification tend to find faults. Additionally, the generation of so-called "invalid tests", i.e., tests that exercise application scenarios that were not addressed in the specification, complements the formal development process more significantly. Therefore, this work proposes a method for generating tests from B formal specifications. The method was structured in pseudo-code and is based on the systematization of the black-box testing techniques of boundary value analysis and equivalence partitioning, as well as the orthogonal pairs technique. The method was applied to a B specification, and B test machines that generate test cases independent of the implementation language were produced. To validate the method, the test cases were manually transformed into JUnit test cases, and the application, created from the B specification and developed in Java, was tested. Faults were found during the execution of the JUnit test cases.
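As a rough, hypothetical illustration of the black-box techniques the method systematizes (not the method itself), equivalence partitioning and boundary value analysis over a bounded integer input look like this in Python:

```python
def equivalence_classes(lo, hi):
    """One representative per class for an input constrained to [lo, hi]:
    below the range (invalid), inside it (valid), above it (invalid)."""
    return {"invalid_low": lo - 1, "valid": (lo + hi) // 2, "invalid_high": hi + 1}

def boundary_values(lo, hi):
    """Classic five-point boundary value analysis on [lo, hi]."""
    return [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]

# Hypothetical constraint taken from a specification: 0 <= grade <= 100
print(equivalence_classes(0, 100))  # {'invalid_low': -1, 'valid': 50, 'invalid_high': 101}
print(boundary_values(0, 100))      # [0, 1, 50, 99, 100]
```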
Abstract:
Formal methods and software testing are tools for obtaining and controlling software quality. When used together, they provide mechanisms for software specification, verification and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the verification and validation process of a system. Model-Based Testing techniques allow tests to be generated from other software artifacts such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an industrial example of a B specification. From this case study we obtained insights to improve the method. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information about the operation's behavior in the test case generation process, and to use new coverage criteria. Besides, we implemented a tool to automate the method and submitted it to more complex case studies.
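The positive/negative split described above can be pictured with a toy predicate standing in for an operation's precondition; the predicate and input grid are hypothetical, not taken from the method:

```python
from itertools import product

def precondition(balance, amount):
    # Hypothetical precondition of a B operation withdraw(amount)
    return balance >= 0 and 0 < amount <= balance

positive, negative = [], []
for balance, amount in product([-1, 0, 50, 100], repeat=2):
    (positive if precondition(balance, amount) else negative).append((balance, amount))

print(positive)  # inputs for positive tests: the specified behavior must hold
print(negative)  # inputs for negative tests: outside the specified scenarios
```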
Abstract:
Automation is an important activity of the testing process and can significantly reduce development time and cost. Some tools have been proposed to automate acceptance testing in Web applications. However, most of them have important limitations, such as the need for manual valuation of test cases, refactoring of the generated code, and strong dependence on the structure of the HTML pages. In this work, we present a test specification language and a tool designed to minimize the impact of these limitations. The proposed language supports the equivalence class criterion, and the tool, developed as a plug-in for the Eclipse platform, allows the generation of test cases through different combination strategies. To evaluate the approach, we used a module of the Sistema Unificado de Administração Pública (SUAP) of the Instituto Federal do Rio Grande do Norte (IFRN). Systems analysts and a computer technician who work as developers of that system participated in the evaluation.
Abstract:
Automation has become increasingly necessary during the software testing process due to the high cost and time associated with this activity. Some tools have been proposed to automate the execution of acceptance tests in Web applications. However, many of them have important limitations, such as strong dependence on the structure of the HTML pages and the need for manual valuation of the test cases. In this work, we present a language for specifying acceptance test scenarios for Web applications, called IFL4TCG, and a tool that allows the generation of test cases from these scenarios. The proposed language supports the Equivalence Class Partitioning criterion, and the tool allows the generation of test cases that follow different combination strategies (i.e., Each-Choice, Base-Choice and All Combinations). In order to evaluate the effectiveness of the proposed solution, we used the language and the associated tool for designing and executing acceptance tests on a module of the Sistema Unificado de Administração Pública (SUAP) of the Instituto Federal do Rio Grande do Norte (IFRN). Four systems analysts and one computer technician, who work as developers of that system, participated in the evaluation. Preliminary results showed that IFL4TCG can indeed help to detect defects in Web applications.
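The three combination strategies named above can be sketched generically; this is an illustrative Python fragment with a made-up parameter model, not IFL4TCG's actual generator:

```python
from itertools import product

def all_combinations(domains):
    """All-Combinations: Cartesian product of every equivalence class."""
    return list(product(*domains.values()))

def each_choice(domains):
    """Each-Choice: every class of every parameter appears in at least one test."""
    cols = list(domains.values())
    width = max(len(c) for c in cols)
    return [tuple(col[i % len(col)] for col in cols) for i in range(width)]

def base_choice(domains, base):
    """Base-Choice: start from a base test and vary one parameter at a time."""
    tests, keys = [base], list(domains)
    for i, k in enumerate(keys):
        for v in domains[k]:
            if v != base[i]:
                t = list(base); t[i] = v
                tests.append(tuple(t))
    return tests

# Hypothetical equivalence classes for a login form
domains = {"user": ["valid", "unknown"], "password": ["ok", "wrong", "empty"]}
print(len(all_combinations(domains)))         # 6 tests
print(each_choice(domains))                   # 3 tests cover every class
print(base_choice(domains, ("valid", "ok")))  # base test plus 3 variations
```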
Abstract:
The segmentation of an image aims to subdivide it into constituent regions or objects that have some relevant semantic content. This subdivision can also be applied to videos; in this case, however, the objects appear in the various frames that compose the video. The task of segmenting an image becomes more complex when it is composed of objects defined by textural features, where color information alone is not a good descriptor of the image. Fuzzy Segmentation is a region-growing segmentation algorithm that uses affinity functions to assign to each element of an image a grade of membership (between 0 and 1) for each object. This work presents a modification of the Fuzzy Segmentation algorithm aimed at improving its time and space complexity. The algorithm was adapted to segment color videos, treating them as 3D volumes. To segment the videos, either a conventional color model or a hybrid model obtained by a method for choosing the best channels was used. The Fuzzy Segmentation algorithm was also applied to texture segmentation by using adaptive affinity functions defined for the texture of each object. Two types of affinity functions were used: one defined using the normal (Gaussian) probability distribution, and the other using the Skew Divergence, a variation of the Kullback-Leibler Divergence, which is a measure of the difference between two probability distributions. Finally, the algorithm was tested on some videos and also on texture mosaic images composed of images from the Brodatz album.
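For reference, the Skew Divergence is usually defined as s_alpha(p, q) = KL(p || alpha*q + (1 - alpha)*p), which stays finite where plain KL diverges; a small NumPy sketch with made-up distributions:

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence; assumes q > 0 wherever p > 0
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def skew_divergence(p, q, alpha=0.99):
    """Mixing a little of p into q keeps the divergence finite even
    where q has zero probability and p does not."""
    return kl(p, alpha * q + (1 - alpha) * p)

p = np.array([0.7, 0.3, 0.0])
q = np.array([0.1, 0.0, 0.9])   # KL(p || q) would diverge at the second entry
print(skew_divergence(p, q))     # finite, usable as a texture-affinity ingredient
```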
Abstract:
I thank my advisor, João Marcos, for the intellectual support and patience that he devoted to me throughout my graduate years. With his friendship, his ability to look at problems from the best point of view and his love of doing Logic, he became a great inspiration for me. I thank my committee members, Claudia Nalon, Elaine Pimentel and Benjamin Bedregal, who made a rigorous reading of my work and gave me valuable suggestions to improve it. I am grateful to the Post-Graduate Program in Systems and Computation, which accepted me as a student and provided me with the right environment to develop my research. I also thank CAPES for a 21-month fellowship. Thanks to my research group, LoLITA (Logic, Language, Information, Theory and Applications), in which I had the opportunity to make some friends. Some of them I met in my early classes: Sanderson, Haniel and Carol Blasio. Others I met during the course, among them Patrick, Claudio, Flaulles and Ronildo. I thank Severino Linhares and Maria Linhares, who kindly hosted me at their home during my first months in Natal. This couple, together with my student-flat colleagues Fernando, Donátila and Aline, are my nuclear family in Natal. I thank my fiancée Luclécia for her precious affective support and for understanding my absence from home during my master's. I also thank my parents, Manoel and Zenilda, and my siblings Alexandre, Paulo and Paula; without their confidence and encouragement I would not have achieved success in this journey. "If you want the hits, be prepared for the misses." (Carl Yastrzemski)
Abstract:
The Rio do Peixe Basin is located on the border of Paraíba and Ceará states, immediately north of the Patos shear zone, encompassing an area of 1,315 km². This is one of the main basins of Early Cretaceous age in Northeast Brazil, associated with the rifting event that shaped the present continental margin. The basin can be divided into four sub-basins, corresponding to the Pombal, Sousa, Brejo das Freiras and Icozinho half-grabens. This dissertation was based on the analysis and interpretation of remote sensing products, stratigraphic and structural field data, seismic sections and gravity data. Field work detailed the lithofacies characterization of the three formations previously recognized in the basin: Antenor Navarro, Sousa and Rio Piranhas. Unlike the classical vertical stacking, field relations and seismostratigraphic analysis highlighted the interfingering and lateral equivalence between these units. On bio/chronostratigraphic and tectonic grounds, they correlate with the Rift Tectonosequence of Neocomian age. The Antenor Navarro Formation overlies the crystalline basement in nonconformity. It comprises lithofacies originated by a braided fluvial system, dominated by immature, coarse and conglomeratic sandstones, with polymict conglomerates at the base. Its exposures occur in the different half-grabens, along their flexural margins. Paleocurrent data indicate source areas in the basement to the north/NW, or input along strike ramps. The Sousa Formation is composed of fine-grained sandstones, siltstones and reddish, locally grey-greenish, laminated shales presenting ripple marks, mudcracks and, locally, carbonate beds. This formation shows a major influence of a fluvial floodplain system, with seismostratigraphic evidence of lacustrine facies in the subsurface. Its distribution occupies the central part of the Sousa and Brejo das Freiras half-grabens, which constitute the main depocenters of the basin. Paleocurrent analysis shows that sediment transport was also from north/NW to south/SE.
Abstract:
Data visualization is widely used to facilitate the comprehension of information and to find relationships between data. One of the most widely used techniques for visualizing multivariate data (4 or more variables) is the 2D scatterplot. This technique associates each data item to a visual mark in the following way: two variables are mapped to Cartesian coordinates, so that the mark can be placed on the Cartesian plane; the other variables are mapped gradually to visual properties of the mark, such as size, color and shape, among others. As the number of variables to be visualized increases, the number of visual properties associated with the mark increases as well, and so does the complexity of the final visualization. However, increasing the complexity of the visualization does not necessarily imply a better visualization; sometimes it does the opposite, producing a visually polluted and confusing result. This problem is called visual channel overload. This work investigates whether it is possible to work around the overload of the visual channel and improve insight into multivariate data through a modification of the 2D scatterplot technique. In this modification, we map the variables of the data items to multisensory marks, composed not only of visual properties but also of haptic properties, such as vibration, viscosity and elastic resistance. We believed that this approach could ease the insight process by transposing properties from the visual channel to the haptic channel. The hypothesis was verified through experiments in which we analyzed (a) the accuracy of the answers, (b) the response time, and (c) the degree of personal satisfaction with the proposed approach. However, the hypothesis was not validated. The results suggest an equivalence between the investigated visual and haptic properties in all analyzed aspects, although in strictly numeric terms the multisensory visualization achieved better results in response time and personal satisfaction.
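The visual half of the mapping described above is the familiar scatterplot pattern; a minimal matplotlib sketch with synthetic data (the haptic channel, of course, has no plotting analogue):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.random((100, 4))  # 100 items, 4 variables

# Variables 0 and 1 -> Cartesian coordinates; 2 -> mark size; 3 -> mark color
plt.scatter(data[:, 0], data[:, 1],
            s=20 + 180 * data[:, 2],   # size channel
            c=data[:, 3],              # color channel
            cmap="viridis")
plt.colorbar(label="variable 3")
plt.xlabel("variable 0"); plt.ylabel("variable 1")
plt.show()
```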
Abstract:
Brazil is the world's second largest producer of cassava, most of which is used to make flour and starch, generating large amounts of a liquid residue called manipueira (cassava wastewater). In general, this waste is disposed of directly into the soil and waterways, causing serious environmental impacts. In view of this, the aim of this work was to evaluate the use of cassava wastewater as an organic fertilizer in a Brachiaria brizantha cv. Marandu pasture. The experiment was conducted at the Macaíba Campus of the Federal University of Rio Grande do Norte. The treatments were increasing doses of cassava wastewater applied to the soil as organic fertilizer. The experimental design was a randomized block design with six treatments and four replications: five doses of cassava wastewater (0, 15, 30, 60 and 120 m³ ha⁻¹) and one treatment with mineral fertilizer (MF) in the form of NPK (140:30:120 kg ha⁻¹). Three cuts were carried out at 60-day intervals. The variables evaluated were: plant height, accumulation of the morphological components of the forage, light interception (LI), leaf area index (LAI), total chlorophyll (TC) and dry matter production (DMP). Dry matter production at the 120 m³ ha⁻¹ dose showed a quantitative increase, with a total production of 2,796 kg ha⁻¹ DM in the second cut, an increase of 493% compared to the control; the residual effect observed in the third cut produced a 100% increase compared to the 0 m³ ha⁻¹ dose. Comparing the DMP obtained with MF against the other treatments, in the second cut it was equivalent to the 120 m³ ha⁻¹ dose, and in the third cut to the 60 and 120 m³ ha⁻¹ doses. For plant height, LI, LAI, TC and leaf mass, adding cassava wastewater to the soil promoted a positive linear increase in all three cuts; with MF, however, the LAI was superior to the other treatments. Culm mass reached its highest production (838 kg ha⁻¹ DM) in the second cut at the 120 m³ ha⁻¹ dose. Dead material mass increased linearly in the second and third cuts, with total increases of 322% and 452%, respectively, compared to the 0 m³ ha⁻¹ dose. Cassava wastewater showed a herbicidal effect on the mass of undesirable plants, with a negative linear response, this mass decreasing as the doses increased. Manipueira can therefore be used as an organic fertilizer in Brachiaria brizantha cv. Marandu, as it promoted significant improvements in most of the variables studied, especially at the 120 m³ ha⁻¹ dose, and its use benefits the environment by offering an alternative for the disposal of this waste.
Abstract:
The purpose of this research is to analyze different daylighting systems for schools in the city of Natal/RN. Although daylight is abundantly available locally, architectural recommendations relating sky conditions, the dimensions of daylighting systems, shading, fraction of visible sky, required illuminance, glare, occupation period and depth of the lit area are scarce and diffuse. This research examines different selected aperture systems in order to assess the potential of natural light for each one. The method is divided into three phases. The first phase is modeling, which involves the construction of a three-dimensional model of a classroom in the SketchUp 2014 software, following recommendations presented in the literature for obtaining a good level of environmental comfort in school settings. The second phase is the dynamic computer simulation of daylighting performance with the Daysim software. The input data are the 2009 climate file of the city of Natal/RN, the classroom volume in 3ds format with the optical properties assigned to each surface, the sensor mapping file and the user load file. The results produced in the simulation are organized in a spreadsheet prepared by Carvalho (2014) to determine the occurrence of useful daylight illuminance (UDI) in the range of 300 to 3,000 lux, and to build illuminance curves and UDI contours to identify the uniformity of light distribution, compliance with the minimum illuminance level, and the occurrence of glare.
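Useful daylight illuminance is simply the share of occupied hours whose simulated illuminance falls inside the target band; a sketch of the 300-3,000 lux tally in Python (the sensor series is made up):

```python
import numpy as np

def udi(illuminance_lux, low=300.0, high=3000.0):
    """Fraction of occupied hours in which illuminance falls within [low, high]."""
    e = np.asarray(illuminance_lux, dtype=float)
    return float(np.mean((e >= low) & (e <= high)))

# Hypothetical hourly illuminance at one sensor during occupied hours
series = [120.0, 450.0, 900.0, 2800.0, 3500.0, 600.0]
print(f"UDI(300-3000 lux) = {udi(series):.0%}")  # 4 of 6 hours -> 67%
```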
Abstract:
Clays are materials with specific properties that make them promising for various studies. In this work, vermiculite clay was used as a support for iron compounds, in order to obtain promising materials for application in the heterogeneous photo-Fenton process. In all, the study comprised six solids: starting from vermiculite (V0), we obtained calcined vermiculite (V0-C); the mixed material (V0/β-FeOOH), formed by vermiculite plus akaganeite; ion-exchanged vermiculite (V0t-C); wet-impregnated vermiculite (V0u-C); and V0u-CL, the solid obtained by impregnation followed by a washing step. The physical and chemical characteristics of the solids were investigated by the following characterization techniques: X-Ray Diffraction (XRD), Infrared Spectroscopy (IR), Energy Dispersive Spectroscopy (EDS), X-Ray Fluorescence Spectroscopy (XRF), UV-Vis Diffuse Reflectance (DR UV-Vis), Thermogravimetric Analysis (TGA) and Scanning Electron Microscopy (SEM). The V0 material showed three distinct phases: vermiculite itself, hydrobiotite and biotite, the last two being part of the geological formation process of vermiculite. The solids obtained after modification showed an increase in the amount of iron present in the clay, in quantities relevant for application in photocatalysis. The micrographs and EDS data show that, after the metal addition treatment, iron was intercalated into the vermiculite structure for the solids V0t-C and V0u-C; however, this did not occur with the mixed material. In the photo-Fenton process, a maximum removal of 88.8% of the methylene blue dye was observed for the V0/β-FeOOH catalyst, while for the other solids values between 62.6% and 76.8% were obtained, compared to 37.8% of discoloration without a catalyst. Therefore, it is concluded that vermiculite clay is a good catalyst and iron support, besides presenting low cost owing to its high abundance.
Abstract:
Nursing Homes are an important care alternative worldwide, but Brazil still has no valid instrument to monitor the quality of these institutions. In the United States, the Observable Indicators of Nursing Home Care Quality Instrument (OIQ) is used to assess the quality of Nursing Home care using 30 indicators of structure (2 dimensions) and process (5 dimensions) related to quality, person-centered care. The present study aimed at cross-culturally adapting the OIQ in order to evaluate the quality of Nursing Home care in Brazil. Conceptual and item equivalence were determined to assess the relevance and viability of the OIQ in the Brazilian context, using the Content Validity Index (CVI) and a group of specialists composed of 10 participants directly involved with the object of study. Next, operational, idiomatic and semantic equivalence were carried out concurrently, in 5 phases: (1) two translations and (2) their respective back-translations; (3) formal appraisal of referential and general meaning; (4) review by a second group of specialists; (5) application of the pretest at three Nursing Homes by different social actors: health professionals, sanitary surveillance regulators and potential consumers. Measurement equivalence was evaluated with Cronbach's alpha to verify the internal consistency of the instrument; to measure inter-rater agreement, the General Agreement Index (ICG) and the Kappa coefficient were used. Timely compliance and the 95% confidence intervals of indicators, dimensions and the total construct were estimated. The CVI obtained high results for both relevance (95.3%) and viability (94.3%) in the Brazilian context. With respect to referential meaning, similarity was observed, ranging between 90-100% for the first back-translation and 70-100% for the second. In relation to general meaning, version 1 was better, classified as "unchanged" in 80% of the items, whereas version 2 reached only 47%. In the pretest, the OIQ was easy to understand and apply. The following outcomes were obtained: a high Cronbach's alpha (0.93), a satisfactory ICG (75%) and substantial agreement between the pairs of evaluators (health professionals, regulators from the Superintendency of Sanitary Surveillance (SUVISA) and potential consumers), according to the Kappa coefficient (0.65). Operational equivalence was achieved, since the Brazilian version preserved the original layout, maintaining the application mode, response options, number of items, statements and scores. The performance of the Nursing Homes obtained an approximate average score of 87, with a variation from 55 to 111, on a scale ranging from 30 to 150 points. The worst outcomes were related to process indicators, with a mean of 2.8 per item, while structure scored 3.75, on a scale from 1 to 5. The lowest score was obtained for the care dimension (mean 2). The Brazilian OIQ version was deemed a valid and reliable instrument. Health professionals, regulators and potential consumers are encouraged to adopt it to assess and monitor the quality of Nursing Home care and reveal opportunities for improvement.
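Cronbach's alpha, used above to verify internal consistency, is computed from the item variances and the variance of the total score; a short NumPy sketch with made-up ratings:

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, n_items).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    x = np.asarray(scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)
    total_var = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical: 5 evaluators rating 4 indicators on a 1-5 scale
ratings = [[4, 5, 4, 4],
           [3, 4, 3, 3],
           [5, 5, 4, 5],
           [2, 3, 2, 2],
           [4, 4, 4, 3]]
print(round(cronbach_alpha(ratings), 2))  # about 0.96 for this toy data
```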
Development of the base cell of periodic composite microstructures under topology optimization
Abstract:
This thesis develops a new technique for the design of composite microstructures via topology optimization, aiming to maximize stiffness. It makes use of the strain energy method and an h-adaptive refinement scheme to better define the topological contours of the microstructure. This is done by optimally distributing materials in a pre-established design region named the Base Cell. The Finite Element Method is used to describe the field and to solve the governing equations. The mesh is refined iteratively, so that refinement is applied to all elements representing solid material and to all empty elements containing at least one node in a solid material region. The finite element chosen for the model is the three-node linear triangle. The resulting nonlinear programming problem with constraints is solved by the Augmented Lagrangian method, with a minimization algorithm based on quasi-Newton search directions and the Armijo-Wolfe conditions assisting in the descent process. The Base Cell that represents the composite is found from the equivalence between a fictitious material and a prescribed material, optimally distributed in the design region. The use of the strain energy method is justified by its lower computational cost, due to a formulation simpler than that of the traditional homogenization method. Results are presented for changes in the prescription, changes in displacement, changes in the volume constraint, and various initial values of the relative densities.
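In broad strokes, stiffness maximization with a volume constraint and an Augmented Lagrangian treatment has the classic form below; this is a generic statement with an assumed density variable ρ, not necessarily the thesis's exact formulation:

```latex
% Compliance (strain energy) minimization over the Base Cell:
\min_{\rho}\; C(\rho) = \mathbf{u}^{\mathsf{T}} \mathbf{K}(\rho)\,\mathbf{u}
\quad \text{s.t.} \quad
\mathbf{K}(\rho)\,\mathbf{u} = \mathbf{f}, \qquad
g(\rho) = \int_{\Omega} \rho \, d\Omega - V_{\max} \le 0, \qquad
0 < \rho_{\min} \le \rho \le 1.

% Augmented Lagrangian on which a quasi-Newton descent with
% Armijo-Wolfe line searches would operate for fixed (\lambda, r):
L_{A}(\rho, \lambda, r) = C(\rho) + \lambda\, g(\rho) + \tfrac{r}{2}\, g(\rho)^{2}
```

Minimizing the compliance C for the prescribed load f is equivalent to maximizing stiffness, which is the objective the abstract states.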