892 results for "defeitos topológicos" (topological defects)


Relevance: 10.00%

Abstract:

The neurovascular systems of the pulp and of the periodontium are interconnected, and among the possible communication routes between these two tissues is the cavo inter-radicular canal: a small canal that crosses the inter-radicular dentine and opens in the furcation region of multi-rooted teeth. Its prevalence has been studied in the literature by several methodologies, with divergent results. The objective of this work was to establish, in vitro, the prevalence of the cavo inter-radicular canal in human lower molars, through the diaphanization technique and dye leakage. For this research, 140 teeth (100 first and 40 second lower molars) were selected, extracted for different reasons and belonging to the tooth bank of the Endodontics discipline of the Dentistry College at Federal University of Rio Grande do Norte. The teeth were preserved in formalin until the moment of use and then immersed in physiological solution. Endodontic access was performed and the whole external surface, except for the furcation, was sealed with two layers of nail enamel. The pulp chamber floor was cleaned with 5% sodium hypochlorite solution, renewed every 5 minutes, for 1 hour. The teeth were immersed in India ink and, after drying of the ink, their crowns were sectioned at the amelo-cemental junction. They were then examined under a stereomicroscope, where marks of the ink were observed in the furcation and on the pulp chamber floor. After this recording, the sample was diaphanized and, with the teeth rendered transparent, it was possible to observe the true inter-radicular canals under the stereomicroscope. As a result of this experiment, these canals were observed in 13% of the first and 7.5% of the second molars evaluated. The study showed both that the presence of the cavo inter-radicular canal is real and that diaphanization combined with dye leakage is an efficient method for this type of research.

Relevance: 10.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 10.00%

Abstract:

One of the main limitations of the ceramic grinding process is the reliability of the material, owing to the defects introduced during processing. Understanding the mechanisms involved in material removal during grinding, and how they interact with the process parameters and the microstructure, is fundamental to minimizing these defects. The purpose of this review is to present the material-removal models and the way they affect the mechanical properties of the final part.

Relevance: 10.00%

Abstract:

Heterogeneous catalysts of the aluminophosphate and silicoaluminophosphate families, molecular sieves with the AEL structure (ALPO-11 and SAPO-11), were synthesized by the hydrothermal method with the following molar compositions: 2.9 Al + 3.2 P + 3.5 DIPA + 32.5 H2O (ALPO-11) and 2.9 Al + 3.2 P + 0.5 Si + 3.5 DIPA + 32.5 H2O (SAPO-11), starting from silica (only for SAPO-11), pseudoboehmite, orthophosphoric acid (85%) and water, in the presence of di-isopropylamine (DIPA) as the organic template. Crystallization occurred after the reactive hydrogel was charged into a vessel and autoclaved at 170 °C for 48 hours under autogenous pressure. The obtained materials were washed, dried and calcined to remove the DIPA from the molecular sieves. The samples were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), infrared spectroscopy (FT-IR), thermogravimetric and differential thermal analysis (TG/DTA) and nitrogen adsorption (BET). The acidic properties were determined by adsorption of n-butylamine followed by temperature-programmed desorption. This method revealed that ALPO-11 has weaker acid sites due to structural defects, while SAPO-11 shows acidity ranging from weak to moderate; a small quantity of strong acid sites was also detected. The deactivation of the catalysts was studied through n-hexane cracking in a fixed-bed continuous-flow microreactor coupled on-line to a gas chromatograph. The main products obtained were ethane, propane, isobutene, n-butane, n-pentane and isopentane. The Vyazovkin (model-free) kinetic method was used to study the regeneration of the catalysts and the removal of the organic template.
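
The Vyazovkin method evaluates the activation energy at each conversion level without assuming a reaction model. As a rough illustration of the isoconversional idea (using the Kissinger-Akahira-Sunose linearization rather than Vyazovkin's numerical integral, and invented heating rates and temperatures, not the study's data):

# Model-free (isoconversional) kinetics sketch in the spirit of the
# Vyazovkin approach, via the Kissinger-Akahira-Sunose linearization:
# ln(beta / T^2) = const - Ea / (R * T_alpha). Illustrative data only.
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Heating rates (K/min) of hypothetical TG runs.
betas = np.array([5.0, 10.0, 20.0])

# T_alpha[i, j]: temperature (K) at which run j reaches conversion alphas[i].
alphas = np.array([0.2, 0.4, 0.6, 0.8])
T_alpha = np.array([
    [610.0, 622.0, 635.0],
    [628.0, 641.0, 655.0],
    [645.0, 659.0, 674.0],
    [663.0, 678.0, 694.0],
])

def ea_kas(betas, temps):
    """Activation energy (J/mol) from a linear fit of ln(beta/T^2) vs 1/T."""
    x = 1.0 / temps
    y = np.log(betas / temps**2)
    slope, _ = np.polyfit(x, y, 1)
    return -slope * R

for a, temps in zip(alphas, T_alpha):
    print(f"alpha = {a:.1f}: Ea ~ {ea_kas(betas, temps) / 1000:.0f} kJ/mol")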

Relevance: 10.00%

Abstract:

This work is a study of coordination compounds by the quantum theory of atoms in molecules (QTAIM), based on the topological analysis of the electron density of molecular systems obtained both theoretically and experimentally. The coordination chemistry topics studied are the chelate effect, bent titanocenes and the chemical bond in coordination complexes. The chelate effect was investigated according to topological and thermodynamic parameters. Exchanging monodentate ligands for polydentate ligands on the same transition metal increases the stability of the complex through both entropy and enthalpy contributions; in some cases, the latter contributed more to the stability of the complex than entropy. This enthalpic contribution is explained by the topological analysis of the M-ligand bonds, where the polydentate complexes had higher values of the electron density at the bond critical point, the Laplacian of the electron density at the bond critical point, and the delocalization index (the number of electrons shared between two atoms). The second chapter studies bent titanocenes with bulky cyclopentadienyl-derived π-ligands. The topological study showed the presence of secondary interactions between the atoms of the π-ligands, or between atoms of a π-ligand and a σ-ligand. It was found that, for titanocenes with small differences in point-group symmetry and with bulky ligands, there was a nearly linear relationship between stability and the delocalization index involving the ring carbon atoms (Cp) and the titanium. However, titanocene stability is not related only to the interaction between the Ti and the C atoms of the Cp ring; secondary interactions also play an important role in the stability of bulky titanocenes. The third chapter deals with the chemical bond in coordination compounds by means of QTAIM. The quantum theory of atoms in molecules has so far classified bonds and chemical interactions into two categories: closed-shell interactions (ionic bond, hydrogen bond, van der Waals interaction, etc.) and shared interactions (covalent bond). Based on topological parameters such as the electron density, the Laplacian of the electron density and the delocalization index, among others, the chemical bond in coordination compounds was classified as intermediate between closed-shell and shared interactions.
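
As an aside, the closed-shell/shared/intermediate classification mentioned above is often applied through simple rules of thumb on bond-critical-point descriptors. A minimal sketch with invented values (the criteria follow the usual signs of the Laplacian and of the total energy density H, not the dissertation's actual data):

# Toy classifier for QTAIM bond critical point (BCP) data: a negative
# Laplacian of the electron density indicates a shared (covalent)
# interaction; a positive Laplacian with non-negative total energy density
# H indicates a pure closed-shell interaction; a positive Laplacian with
# negative H marks the intermediate regime typical of metal-ligand bonds.

def classify_bcp(rho: float, laplacian: float, h: float) -> str:
    """Classify an interaction from BCP descriptors (atomic units)."""
    if laplacian < 0:
        return "shared (covalent)"
    if h < 0:
        return "intermediate (e.g. metal-ligand coordination)"
    return "closed-shell (ionic, H-bond, van der Waals)"

# Hypothetical BCPs: a C-H bond, a Ti-C(Cp) contact and an ionic contact.
for name, rho, lap, h in [
    ("C-H",   0.28, -1.00, -0.300),
    ("Ti-C",  0.06,  0.15, -0.010),
    ("Na-Cl", 0.03,  0.20,  0.002),
]:
    print(f"{name}: {classify_bcp(rho, lap, h)}")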

Relevance: 10.00%

Abstract:

Microporous (Beta-type zeolite) and mesoporous (MCM-41 and AlMCM-41) materials were synthesized hydrothermally and characterized by X-ray diffraction, Fourier-transform infrared spectroscopy, scanning electron microscopy, surface acidity measurements, nitrogen adsorption and thermal analysis (TG/DTG). A kinetic study of sunflower oil over the micro- and mesoporous catalysts was also performed. The microporous material, zeolite Beta, showed lower crystallinity due to the presence of smaller crystals and a larger number of structural defects. The mesoporous MCM-41 and AlMCM-41 samples showed the formation of a one-dimensional hexagonal structure. The kinetic study of sunflower oil with the zeolite Beta, AlMCM-41 and MCM-41 catalysts showed a lower activation energy than that of pure sunflower oil, most notably for zeolite Beta. The thermal and thermocatalytic cracking of sunflower oil yielded two liquid fractions: an aqueous phase and an organic liquid fraction (FLO). The first FLO collected, in both thermal and thermocatalytic cracking, showed a very high acidity level; the physicochemical properties of the second fraction were characterized in accordance with ANP specifications. The second FLO collected from the thermocatalytic cracking of sunflower oil presented results in the range of diesel oil, making it a promising alternative liquid biofuel similar to diesel, to be used either pure or blended with it.
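
A minimal sketch of the kind of activation-energy comparison reported above, assuming hypothetical rate constants for the pure oil and for the zeolite Beta run (only the Arrhenius fitting pattern is shown, not the study's data):

# Arrhenius comparison: a smaller slope of ln k vs 1/T means a lower
# activation energy, as reported for cracking over zeolite Beta.
import numpy as np

R = 8.314  # J/(mol K)

T = np.array([573.0, 623.0, 673.0, 723.0])             # temperatures, K
k_pure = np.array([2.1e-5, 1.5e-4, 8.9e-4, 4.2e-3])    # hypothetical, 1/s
k_beta = np.array([6.3e-4, 2.4e-3, 7.8e-3, 2.1e-2])    # hypothetical, 1/s

def activation_energy(T, k):
    """Ea (J/mol) from the slope of ln k versus 1/T."""
    slope, _ = np.polyfit(1.0 / T, np.log(k), 1)
    return -slope * R

print(f"Ea (pure oil)     ~ {activation_energy(T, k_pure) / 1000:.0f} kJ/mol")
print(f"Ea (zeolite Beta) ~ {activation_energy(T, k_beta) / 1000:.0f} kJ/mol")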

Relevance: 10.00%

Abstract:

The regeneration of bone defects with loss of substance remains a therapeutic challenge in the medical field. There are basically four types of grafts: autologous, allogeneic, xenogeneic and isogenic. It is a consensus that autologous bone is the most suitable material for this purpose, but there are limitations to its use, especially the insufficient amount available from the donor. Research shows that the components of the extracellular matrix (ECM) are largely conserved between different species and are well tolerated even by xenogeneic recipients. Thus, several studies have been conducted in the search for a replacement for autogenous bone grafts using scaffolds produced by the decellularization technique. To obtain these scaffolds, the tissue must undergo a cell removal process that causes minimal adverse effects on the composition, biological activity and mechanical integrity of the remaining extracellular matrix. There is, however, no consensus among researchers about the best decellularization protocol, since each treatment interferes differently with the biochemical composition, ultrastructure and mechanical properties of the extracellular matrix, affecting the type of immune response to the material. The still limited body of research involving the decellularization of bone tissue is another obstacle to reaching a consensus protocol. The present study aimed to evaluate the influence of decellularization methods on the production of biological scaffolds from skeletal organs of mice, for use in grafting. This laboratory study was conducted in two distinct stages. In the first stage, 12 mouse hemi-calvariae were evaluated, divided into three groups (n = 4) and submitted to three different decellularization protocols (SDS [Group I], trypsin [Group II], Triton X-100 [Group III]). We sought to identify the protocol that promotes the most efficient cell removal while best preserving the structure of the bone extracellular matrix. To this end, we performed a quantitative analysis of the number of remaining cells and a descriptive analysis of the scaffolds by microscopy. In the second stage, a study was conducted to evaluate the in vitro adhesion of mouse bone marrow mesenchymal cells cultured on these previously decellularized scaffolds. Manual cell counting on the scaffolds showed complete cell removal in Group II, practically complete removal in Group I, and remaining cells in Group III; a significant difference was observed only between Groups II and III (p = 0.042). Better maintenance of the collagen structure was obtained with Triton X-100, whereas decellularization with trypsin was responsible for the major structural changes in the scaffolds. After culture, adhesion of mesenchymal cells was only observed in specimens decellularized with trypsin. Due to its potential for total cell removal and its ability to allow cell adhesion, the trypsin-based protocol (Group II) was considered the most suitable for future experiments involving the grafting of decellularized bone scaffolds.
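
The group comparison reported above (p = 0.042 between Groups II and III, n = 4 per group) can be illustrated with a small non-parametric test; the remaining-cell counts below are invented, and only the analysis pattern mirrors the study design:

# Compare remaining-cell counts between two decellularization protocols
# with a Mann-Whitney U test, as is usual for small samples.
from scipy.stats import mannwhitneyu

group_ii  = [0, 0, 0, 0]      # trypsin: complete removal (hypothetical)
group_iii = [14, 9, 22, 11]   # Triton X-100: cell remains (hypothetical)

stat, p = mannwhitneyu(group_ii, group_iii, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")  # p < 0.05 -> significant difference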

Relevance: 10.00%

Abstract:

A 3D binary image is considered well-composed if, and only if, the union of the faces shared by the foreground and background voxels of the image is a surface in R³. Well-composed images have desirable topological properties, which allow us to simplify and optimize algorithms widely used in computer graphics, computer vision and image processing. These advantages have fostered the development of algorithms that repair two-dimensional (2D) and three-dimensional (3D) images that are not well-composed; such algorithms are known as repairing algorithms. In this dissertation, we propose two repairing algorithms, one randomized and one deterministic. Both algorithms are capable of making topological repairs in 3D binary images, producing well-composed images similar to the original ones. The key idea behind both algorithms is to iteratively change the assigned color of some points of the input image from 0 (background) to 1 (foreground) until the image becomes well-composed. The points whose colors are changed are chosen according to their values in the fuzzy connectivity map resulting from the image segmentation process. The use of the fuzzy connectivity map ensures that the subset of points chosen by the algorithm at any given iteration is the one with the least affinity with the background among all possible choices.
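
A minimal sketch of the deterministic repairing idea, under simplifying assumptions: only the planar critical configuration (a 2x2 square whose diagonal voxels are foreground and whose anti-diagonal voxels are background) is checked, over the three axis-aligned orientations; the full well-composedness criterion also forbids an antipodal foreground/background pair in a 2x2x2 cube, omitted here for brevity. The affinity array stands in for the fuzzy connectivity map (higher values meaning stronger affinity with the foreground); this is an illustration, not the dissertation's actual algorithm:

import numpy as np

def c1_violations(img):
    """List critical 2x2 squares: the two voxels on one diagonal are
    foreground and the two on the other are background. Each violation
    is reported as the pair of background voxel coordinates."""
    viol = []
    for axis in range(3):          # axis perpendicular to the 2x2 square
        a, b = [d for d in range(3) if d != axis]
        for p in range(img.shape[axis]):
            for i in range(img.shape[a] - 1):
                for j in range(img.shape[b] - 1):
                    def at(di, dj):
                        c = [0, 0, 0]
                        c[axis], c[a], c[b] = p, i + di, j + dj
                        return tuple(c)
                    q = [at(0, 0), at(0, 1), at(1, 0), at(1, 1)]
                    v = [bool(img[x]) for x in q]
                    if v[0] and v[3] and not (v[1] or v[2]):
                        viol.append((q[1], q[2]))
                    elif v[1] and v[2] and not (v[0] or v[3]):
                        viol.append((q[0], q[3]))
    return viol

def repair(img, affinity):
    """Flip, one at a time, the background voxel of a critical square with
    the highest foreground affinity, until no violation remains. Each
    flip only adds foreground, so the loop always terminates."""
    img = img.astype(bool).copy()
    while True:
        viol = c1_violations(img)
        if not viol:
            return img.astype(np.uint8)
        candidates = {v for pair in viol for v in pair}
        img[max(candidates, key=lambda v: affinity[v])] = True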

Relevance: 10.00%

Abstract:

Through the adoption of the software product line (SPL) approach, several benefits are achieved when compared to conventional development processes based on creating a single software system at a time. The process of developing an SPL differs from traditional software construction in that it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented, and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental; it aims to detect defects in the artifacts produced during SPL development. However, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proven limited, providing only general guidelines. In addition, there is a lack of tools to support variability management and the customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration and system levels in domain and application engineering; and (iii) tool support for automating variability management and the customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
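
As an illustration of the reuse idea (not the dissertation's tool), test cases written once in domain engineering can be bound to features, so that application engineering derives a product-specific suite from the product's feature configuration; all names below are hypothetical:

import unittest

# Domain engineering: tests are registered with the feature they exercise.
TEST_REGISTRY = []

def spl_test(feature):
    def register(test_cls):
        TEST_REGISTRY.append((feature, test_cls))
        return test_cls
    return register

@spl_test("payment.credit_card")
class CreditCardTest(unittest.TestCase):
    def test_charge(self):
        self.assertTrue(True)  # placeholder assertion

@spl_test("payment.invoice")
class InvoiceTest(unittest.TestCase):
    def test_issue(self):
        self.assertTrue(True)  # placeholder assertion

# Application engineering: derive the suite for one product configuration.
def suite_for(product_features):
    suite = unittest.TestSuite()
    loader = unittest.TestLoader()
    for feature, cls in TEST_REGISTRY:
        if feature in product_features:
            suite.addTests(loader.loadTestsFromTestCase(cls))
    return suite

if __name__ == "__main__":
    unittest.TextTestRunner().run(suite_for({"payment.credit_card"}))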

Relevance: 10.00%

Abstract:

Formal methods and software testing are tools to obtain and control software quality. When used together, they provide mechanisms for software specification, verification and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the verification and validation process of a system. Model-based testing techniques allow tests to be generated from other software artifacts, such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an industrial example of a B specification, and this case study provided the insights needed to improve the method. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process, and to use new coverage criteria. In addition, we implemented a tool to automate the method and applied it to more complex case studies.
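
A minimal sketch of the test data derivation described above, for a hypothetical B-style precondition of the form xx >= 1 & xx <= 100: equivalence class partitioning yields one valid and two invalid classes, and boundary value analysis picks values at and around the limits:

def boundary_values(lo, hi):
    """Positive and negative test inputs for the precondition lo <= x <= hi."""
    positive = [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]  # valid class
    negative = [lo - 1, hi + 1]                          # invalid classes
    return positive, negative

pos, neg = boundary_values(1, 100)
print("positive (precondition holds):", pos)     # [1, 2, 50, 99, 100]
print("negative (precondition violated):", neg)  # [0, 101]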

Relevance: 10.00%

Abstract:

Automation has become increasingly necessary in the software testing process due to the high cost and time associated with this activity. Some tools have been proposed to automate the execution of acceptance tests in web applications. However, many of them have important limitations, such as a strong dependence on the structure of the HTML pages and the need to manually assign values to test cases. In this work, we present IFL4TCG, a language for specifying acceptance test scenarios for web applications, and a tool that generates test cases from these scenarios. The proposed language supports the Equivalence Class Partitioning criterion, and the tool generates test cases that follow different combination strategies (Each-Choice, Base-Choice and All Combinations). To evaluate the effectiveness of the proposed solution, we used the language and the associated tool for designing and executing acceptance tests on a module of the Sistema Unificado de Administração Pública (SUAP) of Instituto Federal Rio Grande do Norte (IFRN). Four systems analysts and one computer technician, who work as developers of that system, participated in the evaluation. Preliminary results showed that IFL4TCG can indeed help detect defects in web applications.
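
The three combination strategies can be summarized as follows: All Combinations enumerates the full cartesian product of the parameter domains, Each-Choice covers every value of every parameter at least once, and Base-Choice starts from a base test and varies one parameter at a time. A sketch over hypothetical domains (not the tool's actual input format):

from itertools import product

domains = {
    "browser": ["firefox", "chrome"],
    "role": ["admin", "user", "guest"],
    "locale": ["pt-BR", "en-US"],
}

def all_combinations(domains):
    """Full cartesian product of all parameter values."""
    keys = list(domains)
    return [dict(zip(keys, vs)) for vs in product(*domains.values())]

def each_choice(domains):
    """Cover every value of every parameter at least once."""
    keys = list(domains)
    n = max(len(v) for v in domains.values())
    return [{k: domains[k][i % len(domains[k])] for k in keys} for i in range(n)]

def base_choice(domains, base):
    """Base test plus variants changing one parameter at a time."""
    tests = [dict(base)]
    for k, values in domains.items():
        for v in values:
            if v != base[k]:
                tests.append({**base, k: v})
    return tests

print(len(all_combinations(domains)))  # 12 tests
print(len(each_choice(domains)))       # 3 tests
base = {"browser": "firefox", "role": "admin", "locale": "pt-BR"}
print(len(base_choice(domains, base))) # 5 tests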

Relevance: 10.00%

Abstract:

OBJECTIVE: To compare a commercial polyester mesh with glycerin-preserved bovine pericardium in the reconstruction of abdominal wall defects. METHODS: Thirty rats were used, divided into two equal groups. A rectangular excision of 2.5 x 2 cm was performed, including the entire abdominal musculature and the peritoneum. In group I the abdominal wall was repaired with polyester mesh and in group II with glycerin-preserved bovine pericardium. The animals were sacrificed at 15, 60 and 90 days after surgery, and the surgical site was evaluated macroscopically and histologically. RESULTS: Group I animals presented more severe and more numerous adhesions than group II animals, although without functional impairment. Histological analysis revealed incorporation of the tissues into the implants, with a stronger fibroblastic response in group I animals. CONCLUSION: The polyester mesh offers greater structural strength and a more intense fibroblastic response; however, it promotes a large number of adhesions to the abdominal viscera when compared with bovine pericardium.

Relevance: 10.00%

Abstract:

Checking the conformity between implementation and design rules in a system is an important activity to ensure that no degradation occurs between the architectural patterns defined for the system and what is actually implemented in the source code. Especially for systems that require a high level of reliability, it is important to define specific design rules for exceptional behavior. Such rules describe how exceptions should flow through the system, defining which elements are responsible for catching the exceptions thrown by other system elements. However, current approaches to automatically checking design rules do not provide suitable mechanisms to define and verify design rules related to the exception handling policy of applications. This work proposes a practical approach to preserve the exceptional behavior of an application or family of applications, based on the definition and automatic runtime checking of design rules for the exception handling of systems developed in Java or AspectJ. To support this approach, a tool called VITTAE (Verification and Information Tool to Analyze Exceptions) was developed in the context of this work; it extends the JUnit framework and automates the testing of exceptional design rules. We conducted a case study with the primary objective of evaluating the effectiveness of the proposed approach on a software product line. In addition, an experiment was conducted to compare the proposed approach with one based on a tool called JUnitE, which also proposes testing exception handling code with JUnit tests. The results showed how exception handling design rules evolve across different versions of a system, and that VITTAE can aid in the detection of defects in exception handling code.
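
VITTAE itself extends JUnit; as a language-neutral illustration of what a test for an exceptional design rule checks, the sketch below encodes a hypothetical rule ("only DomainError may escape the service boundary") and fails when any other exception propagates. All names are invented:

class DomainError(Exception):
    """Only exception type allowed to escape the service layer (by rule)."""

ALLOWED_AT_SERVICE_BOUNDARY = (DomainError,)

def check_rule(callable_, *args):
    """Run callable_ and fail if a non-allowed exception escapes it."""
    try:
        callable_(*args)
    except ALLOWED_AT_SERVICE_BOUNDARY:
        pass                      # conforming exceptional flow
    except Exception as exc:      # rule violation: unexpected propagation
        raise AssertionError(
            f"{type(exc).__name__} escaped the service boundary"
        ) from exc

def fetch_user(user_id):
    if user_id < 0:
        raise KeyError(user_id)   # violates the rule: should be DomainError
    return {"id": user_id}

check_rule(fetch_user, 1)         # passes: no exception escapes
try:
    check_rule(fetch_user, -1)    # KeyError escapes -> rule violation
except AssertionError as err:
    print(err)                    # "KeyError escaped the service boundary"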

Relevance: 10.00%

Abstract:

Mining Software Repositories (MSR) is a research area that analyses software repositories in order to derive information relevant to the research and practice of software engineering. The main goal of repository mining is to turn the static information stored in repositories (e.g. a code repository or a change request system) into valuable information that supports the decision making of software projects. Another research area, Process Mining (PM), aims to discover the characteristics of the underlying processes of business organizations, supporting process improvement and documentation. Recent works have performed several analyses combining MSR and PM techniques: (i) to investigate the evolution of software projects; (ii) to understand the real underlying process of a project; and (iii) to create defect prediction models. However, few research works have focused on analyzing the contributions of software developers by means of MSR and PM techniques. In this context, this dissertation develops two empirical studies assessing the contribution of software developers to an open-source and a commercial project using those techniques. The contributions of developers are assessed through three different perspectives: (i) buggy commits; (ii) the size of commits; and (iii) the most important bugs. For the open-source project, 12,827 commits and 8,410 bugs were analyzed, while 4,663 commits and 1,898 bugs were analyzed for the commercial project. Our results indicate that, for the open-source project, the developers classified as core developers contributed more buggy commits (although they also contributed the majority of commits), more code to the project (commit size) and more solved important bugs, while for the commercial project the results could not indicate statistically significant differences between developer groups.
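
A minimal sketch of the first assessment perspective (buggy commits): given commit records labeled with the author's group and whether the commit was later linked to a bug, compare the buggy-commit ratio across groups. The records below are invented for illustration:

from collections import Counter

commits = [
    {"author_group": "core", "buggy": True},
    {"author_group": "core", "buggy": False},
    {"author_group": "core", "buggy": False},
    {"author_group": "peripheral", "buggy": False},
    {"author_group": "peripheral", "buggy": True},
]

total = Counter(c["author_group"] for c in commits)
buggy = Counter(c["author_group"] for c in commits if c["buggy"])

for group in total:
    n_buggy = buggy.get(group, 0)
    print(f"{group}: {n_buggy}/{total[group]} buggy ({n_buggy / total[group]:.0%})")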

Relevance: 10.00%

Abstract:

Congenital hypothyroidism (CH) is the most frequent congenital endocrine disorder, with an incidence ranging from 1:2,000 to 1:4,000 live births, and one of the main preventable causes of mental retardation. Neonatal screening programs for the disease allow the early identification of affected children and their treatment, so as to avoid the complications of hormone deficiency. Most cases of congenital hypothyroidism result from thyroid dysgenesis (85%), including thyroid ectopy, hypoplasia or agenesis; the remaining cases result from defects of hormone synthesis. Affected children (> 95%) usually show no symptoms suggestive of the disease at birth. The most common symptoms and signs are prolonged neonatal jaundice, hoarse cry, lethargy, slow movements, constipation, macroglossia, umbilical hernia, wide fontanelles, hypotonia and dry skin. Several strategies are used for CH screening. In Brazil, screening is mandatory by law and is generally done by measuring TSH in dried blood collected from the heel. The recommended age for the test is from 48 hours to the fourth day of life. Diagnostic confirmation with measurements of TSH and free T4 or total T4 is mandatory.