966 results for lateral flow tests


Relevance:

20.00%

Publisher:

Abstract:

Emissions of CO2 into the atmosphere have increased steadily through various mechanisms driven by human action, especially fossil fuel combustion and industrial chemical processes. This leads to an increase in the average temperature of the atmosphere, which we call global warming. New technologies to minimize the environmental impacts arising from this phenomenon have been investigated. CO2 capture is one of the alternatives that can help reduce greenhouse gas emissions. CO2 can be captured through selective adsorption using adsorbents designed for this purpose. Materials of the MCM-41 and Al-MCM-41 type, with a Si/Al molar ratio of 50, were synthesized by the hydrothermal method. The synthesis gels were prepared from sources of silicon, sodium, and water, plus aluminum in the case of Al-MCM-41. The materials were synthesized for 5 days in an autoclave at 100°C. After that time, the materials were filtered, washed, dried in an oven at 100°C for 4 hours, and then calcined at 450°C. The calcined material was then functionalized with diisopropylamine (DIPA) by the wet impregnation method, using 0.5 g of mesoporous material to 3.5 mL of DIPA. The materials were functionalized in a closed container for 24 hours and, after this period, dried at room temperature for 2 hours. They were subsequently subjected to heat treatment at 250°C for 1 hour. These materials were used for CO2 adsorption and were characterized by XRD, FT-IR, BET/BJH, SEM, EDX, and TG/DTG. The CO2 adsorption tests were carried out under the following conditions: 100 mg of adsorbent at a temperature of 75°C under a flow of 100 mL/min of CO2 for 2 hours. CO2 desorption was carried out by thermogravimetry from ambient temperature to 900°C under a flow of 25 mL/min of He at a heating rate of 10°C/min. The X-ray diffractograms of the synthesized samples showed the characteristic peaks of MCM-41, indicating that the intended structure was obtained. For the functionalized samples, the intensities of these peaks decreased, with a consequent reduction in the structural ordering of the material; however, the mesoporous structure was preserved. The adsorption tests showed that functionalized MCM-41 is a promising adsorbent material for CO2 capture, with a mass loss on CO2 desorption of 7.52%, whereas functionalized Al-MCM-41 showed no such loss.
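
The reported 7.52% desorption mass loss can be restated as an adsorption capacity; the short calculation below assumes the percentage is expressed relative to the adsorbent mass (the abstract does not state the basis):

```latex
% CO2 capacity implied by a 7.52% desorption mass loss
% (assuming the loss is relative to the adsorbent mass):
q_m = 0.0752 \times 1000~\mathrm{mg\,g^{-1}} = 75.2~\mathrm{mg\,g^{-1}}
\qquad
q_n = \frac{75.2~\mathrm{mg\,g^{-1}}}{44.01~\mathrm{g\,mol^{-1}}} \approx 1.71~\mathrm{mmol\,g^{-1}}
```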

Relevance:

20.00%

Publisher:

Abstract:

Developmental dyslexia, a specific reading difficulty, is characterized by difficulty in performing phoneme-grapheme decoding and in perceiving acoustically similar phonemes. This study aimed to characterize the performance of children with dyslexia in auditory and phonological awareness skills and to correlate the two. Children with dyslexia and children with good school performance took part in the study and underwent audiological, auditory processing, and phonological skills assessments. The results indicated statistically significant differences in the auditory skills of sequencing for verbal sounds, ipsilateral and contralateral competing messages, dichotic digits, and staggered spondaic words, as well as in the synthesis, segmentation, manipulation, and transposition subtests. The findings showed a correlation between auditory memory tasks and syllabic and phonemic manipulation, and an association between auditory and phonological skills, suggesting that auditory processes directly affect the perception of the acoustic, temporal, and sequential aspects of sounds needed to form a stable phonological representation.

Relevance:

20.00%

Publisher:

Abstract:

Auditory processing disorder is a clinical entity that may be associated with several human communication disorders, among them learning disabilities. OBJECTIVE: To characterize and compare the performance of schoolchildren with and without learning disabilities in the Speech-in-Noise and the Dichotic Digits and Staggered Spondaic Word listening tests. MATERIAL AND METHOD: Forty schoolchildren of both genders, aged 8 to 12 years, took part, divided into two groups: GI, composed of 20 schoolchildren diagnosed with learning disabilities, and GII, composed of 20 schoolchildren with good school performance, matched to GI by gender, age, and schooling. Basic audiological assessments and the Dichotic Digits, Staggered Spondaic Word (SSW), and Speech-in-Noise tests were performed. STUDY DESIGN: Cross-sectional study with a historical cohort. RESULTS: GI schoolchildren performed worse than GII schoolchildren in the Dichotic Digits and Staggered Spondaic Word tests, with no statistically significant difference in the Speech-in-Noise test. CONCLUSION: The findings suggest that the group of schoolchildren with learning disabilities performs worse than the group without difficulties, reflecting difficulties in processing auditory information.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To characterize and compare, through behavioral tests, the auditory processing of schoolchildren with an interdisciplinary diagnosis of (I) learning disability, (II) dyslexia, and (III) schoolchildren with good academic performance. METHODS: Thirty schoolchildren aged 8 to 16 years, of both genders, from the 2nd to 4th grades of elementary school, took part in this study, divided into three groups: GI, composed of 10 schoolchildren with an interdisciplinary diagnosis of learning disability; GII, composed of 10 schoolchildren with an interdisciplinary diagnosis of dyslexia; and GIII, composed of 10 schoolchildren without learning difficulties, matched to GI and GII by gender and age. Audiological and auditory processing assessments were performed. RESULTS: GIII schoolchildren outperformed GI and GII schoolchildren in the auditory processing tests. GI performed worse in the auditory skills assessed by the dichotic digits and staggered spondaic word tests, pediatric speech audiometry, sound localization, and verbal and non-verbal memory, whereas GII showed the same alterations as GI except in pediatric speech audiometry. CONCLUSION: Schoolchildren with learning disorders performed worse in the auditory processing tests; those with learning disability showed a larger number of altered auditory skills than those with dyslexia, owing to reduced sustained attention. The group of schoolchildren with dyslexia showed alterations arising from difficulty in encoding and decoding sound stimuli.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To characterize and compare the performance in temporal pattern tests of children diagnosed with stuttering against children without complaints and/or signs of psychiatric or neurological disorders or difficulties in speech, hearing, language, and/or learning. METHOD: Thirty children aged 9 to 12 years, of both genders, took part, divided into two groups: GI, 15 children with persistent developmental stuttering; GII, 15 children without complaints and/or signs of psychiatric or neurological disorders or difficulties in speech, hearing, language, and/or learning. Temporal auditory processing was assessed with the Pitch Pattern Sequence Test (PPS) and the Duration Pattern Sequence Test (DPS). RESULTS: Group II outperformed group I in both the pitch pattern and the duration pattern tests, and the difference between the groups was statistically significant. CONCLUSION: Group I participants showed altered performance in the temporal pattern tests, indicating a relationship between stuttering and auditory processing disorder.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVES: In patients under tracheal intubation or tracheostomy, humidification and warming of the inhaled gas are necessary to prevent injuries to the respiratory system caused by the contact of cold, dry gas with the airways. The aim of this study was to evaluate the circle breathing system with carbon dioxide absorber of the Dräger Cícero anesthesia machine regarding its capacity to warm and humidify inhaled gases under low (1 L.min-1) or minimal (0.5 L.min-1) fresh gas flow. METHODS: This randomized study was conducted on 24 patients, physical status ASA I, aged 18 to 65 years, undergoing general anesthesia for abdominal surgery with the Dräger Cícero workstation (Germany). Patients were randomly allocated to two groups: low flow (LF), which received 0.5 L.min-1 of oxygen and 0.5 L.min-1 of nitrous oxide, and minimal flow (MF), which received only oxygen at 0.5 L.min-1. The attributes studied were the temperature and the relative and absolute humidity of the operating room and of the gas in the inspiratory limb. RESULTS: Temperature and relative and absolute humidity in the inspiratory limb, both at the anesthesia machine outlet and next to the tracheal tube, did not differ significantly between groups, but increased over time in both groups (LF and MF), with the operating room temperature influencing the inhaled gas temperature in both groups. Humidity and temperature levels close to ideal were reached in both groups after 90 minutes. CONCLUSIONS: There is no significant difference in the humidity and temperature of the inhaled gas between low and minimal fresh gas flows.

Relevance:

20.00%

Publisher:

Abstract:

The increase in application complexity has demanded hardware that is ever more flexible and able to achieve higher performance. Traditional hardware solutions have not been successful in meeting these applications' constraints. General-purpose processors are inherently flexible, since they perform several tasks; however, they cannot reach high performance when compared to application-specific devices. Conversely, since application-specific devices perform only a few tasks, they achieve high performance but have less flexibility. Reconfigurable architectures emerged as an alternative to the traditional approaches and have become an area of rising interest over the last decades. The purpose of this new paradigm is to modify the device's behavior according to the application. Thus, it is possible to balance flexibility and performance and to meet the applications' constraints. This work presents the design and implementation of a coarse-grained hybrid reconfigurable architecture for stream-based applications. The architecture, named RoSA, consists of reconfigurable logic attached to a processor. Its goal is to exploit the instruction-level parallelism of data-flow-intensive applications to accelerate their execution on the reconfigurable logic. The instruction-level parallelism is extracted at compile time, so this work also presents an optimization phase for the RoSA architecture to be included in the GCC compiler. To design the architecture, this work also presents a methodology based on hardware reuse of datapaths, named RoSE. RoSE aims to view the reconfigurable units through reusability levels, which provides area savings and datapath simplification. The architecture was implemented in a hardware description language (VHDL) and validated through simulation and prototyping. Benchmarks were used to characterize performance, demonstrating a speedup of 11x in the execution of some applications.
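
The abstract describes the architecture only at a high level; as a rough illustration of the central idea (configuring the same coarse-grained unit per application and streaming data through it), here is a minimal Java sketch with invented names. RoSA itself is implemented in VHDL, so this is conceptual only.

```java
import java.util.function.IntBinaryOperator;

// Hypothetical illustration of a coarse-grained reconfigurable datapath:
// the same hardware "unit" is configured with a different operation per
// application, trading ASIC-like specialization for processor-like flexibility.
public class ReconfigurableUnit {
    private IntBinaryOperator op;          // current configuration

    public void configure(IntBinaryOperator newOp) {
        this.op = newOp;                   // the "reconfiguration" step
    }

    // Process a data stream with the currently configured operation.
    public int[] run(int[] streamA, int[] streamB) {
        int[] out = new int[streamA.length];
        for (int i = 0; i < streamA.length; i++) {
            out[i] = op.applyAsInt(streamA[i], streamB[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        ReconfigurableUnit unit = new ReconfigurableUnit();
        unit.configure((a, b) -> a + b);   // configure as an adder
        System.out.println(unit.run(new int[]{1, 2}, new int[]{3, 4})[1]); // 6
        unit.configure((a, b) -> a * b);   // reconfigure as a multiplier
        System.out.println(unit.run(new int[]{1, 2}, new int[]{3, 4})[1]); // 8
    }
}
```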

Relevance:

20.00%

Publisher:

Abstract:

Some programs may have their input data specified by formalized context-free grammars. This formalization facilitates the use of tools to systematize and raise the quality of their test process. Within this category of programs, compilers were the first to use this kind of tool to automate their tests. In this work we present an approach for defining tests from the formal description of a program's inputs. Sentence generation is performed taking into account the syntactic aspects defined by the input specification, the grammar. For optimization, coverage criteria are used to limit the quantity of tests without diminishing their quality. Our approach uses these criteria to drive generation toward sentences that satisfy a specific coverage criterion. The approach is based on the Lua language, relying heavily on its coroutines and dynamic construction of functions. With these resources, we propose a simple and compact implementation that can be optimized and controlled in different ways in order to satisfy the different implemented coverage criteria. To make the tool simpler to use, the EBNF notation was adopted for specifying inputs; its parser was specified in the Meta-Environment tool for rapid prototyping.
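
The tool itself is written in Lua, using coroutines and dynamically built functions; as a language-neutral illustration of the core idea (deriving sentences from a grammar until a coverage criterion is met), here is a minimal Java sketch over an invented toy grammar. Production coverage stands in for whichever criterion the tool actually implements.

```java
import java.util.*;

// Hypothetical sketch of grammar-based sentence generation driven by a
// coverage criterion (here: every production used at least once). The real
// approach described in the abstract is implemented in Lua with coroutines.
public class SentenceGenerator {
    // Toy grammar: EXPR -> NUM | EXPR "+" EXPR | "(" EXPR ")"
    static final Map<String, List<String[]>> GRAMMAR = Map.of(
        "EXPR", List.of(new String[]{"NUM"},
                        new String[]{"EXPR", "+", "EXPR"},
                        new String[]{"(", "EXPR", ")"}));
    static final Set<String> covered = new HashSet<>();
    static final Random rnd = new Random(42);

    static String derive(String symbol, int depth) {
        List<String[]> prods = GRAMMAR.get(symbol);
        // Terminals: "NUM" is rendered with a representative numeral.
        if (prods == null) return symbol.equals("NUM") ? "1" : symbol;
        // Near the depth limit, force the shortest production to terminate.
        int i = depth <= 0 ? 0 : rnd.nextInt(prods.size());
        covered.add(symbol + "#" + i);                 // record production coverage
        StringBuilder sb = new StringBuilder();
        for (String s : prods.get(i)) sb.append(derive(s, depth - 1));
        return sb.toString();
    }

    public static void main(String[] args) {
        int total = GRAMMAR.values().stream().mapToInt(List::size).sum();
        while (covered.size() < total)                 // stop once criterion is met
            System.out.println(derive("EXPR", 4));
    }
}
```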

Relevance:

20.00%

Publisher:

Abstract:

The use of middleware technology in various types of systems, in order to abstract low-level details related to the distribution of application logic, is increasingly common. Among the systems that can benefit from these components, we highlight distributed systems, where communication between software components located on different physical machines must be supported. An important issue related to communication between distributed components is the provision of mechanisms for managing quality of service. This work presents a metamodel for modeling component-based middleware that provides an application with an abstraction of the communication between the components involved in a data stream, regardless of their location. Another feature of the metamodel is self-adaptation of the communication mechanism, either by updating the values of its configuration parameters or by replacing it with another mechanism when the specified quality-of-service restrictions are not being met. To this end, the state of the communication is monitored (applying techniques such as a feedback control loop) and the related performance metrics are analyzed. The Model-Driven Development paradigm was used to generate the implementation of a middleware that serves as a proof of concept of the metamodel, together with the configuration and reconfiguration policies related to the dynamic adaptation processes. In this context, the metamodel associated with the process of configuring a communication was defined. The MDD application also comprises the definition of the following transformations: from the architectural model of the middleware to Java code, and from the configuration model to XML.
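
As an illustration of the self-adaptation mechanism described above (a feedback control loop that monitors a metric and replaces the communication mechanism when a QoS restriction is violated), here is a minimal Java sketch; the names and the latency-based policy are assumptions, not part of the metamodel.

```java
import java.util.function.Supplier;

// Hypothetical sketch of QoS-driven self-adaptation: monitor, compare
// against the restriction, reconfigure, then act.
interface CommMechanism {
    void send(byte[] data);
}

public class AdaptiveChannel {
    private CommMechanism current;
    private final CommMechanism fallback;
    private final Supplier<Double> latencyProbe;   // measured latency in ms
    private final double maxLatencyMs;             // the QoS restriction

    AdaptiveChannel(CommMechanism primary, CommMechanism fallback,
                    Supplier<Double> latencyProbe, double maxLatencyMs) {
        this.current = primary;
        this.fallback = fallback;
        this.latencyProbe = latencyProbe;
        this.maxLatencyMs = maxLatencyMs;
    }

    // One iteration of the feedback control loop.
    void sendMonitored(byte[] data) {
        double observed = latencyProbe.get();
        if (observed > maxLatencyMs && current != fallback) {
            current = fallback;                    // dynamic reconfiguration
        }
        current.send(data);
    }

    public static void main(String[] args) {
        AdaptiveChannel ch = new AdaptiveChannel(
            d -> System.out.println("fast transport"),
            d -> System.out.println("reliable transport"),
            () -> 120.0, 100.0);                   // probe reports 120 ms > 100 ms
        ch.sendMonitored(new byte[0]);             // switches, prints "reliable transport"
    }
}
```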

Relevance:

20.00%

Publisher:

Abstract:

Through the adoption of the software product line (SPL) approach, several benefits are achieved when compared to conventional development processes based on creating a single software system at a time. The process of developing an SPL differs from traditional software construction, since it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental and aims to detect defects in the artifacts produced during SPL development. However, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proved limited and provide only general guidelines. In addition, there is a lack of tools to support variability management and the customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration, and system levels in domain and application engineering; and (iii) tool support for automating variability management and the customization of test cases. The approach is evaluated by applying it to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
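
One of the guidelines, reusing domain-engineering test cases in application engineering, can be pictured as follows; this minimal Java sketch uses invented names and a hand-rolled assertion, and is not taken from the dissertation's tooling.

```java
// Hypothetical sketch of reusing a domain-engineering test across derived
// products: the common check lives in an abstract base test, and each
// product of the SPL binds its own variant.
interface PaymentService { int charge(int amount); }

abstract class PaymentServiceTestBase {
    abstract PaymentService createProductVariant();   // variability hook

    // Common test written once in domain engineering.
    void testChargeNeverNegative() {
        if (createProductVariant().charge(10) < 0)
            throw new AssertionError("charge must not be negative");
    }
}

// Application engineering: one concrete binding per derived product.
class CreditCardProductTest extends PaymentServiceTestBase {
    PaymentService createProductVariant() { return amount -> amount + 1; }
}

public class SplTestDemo {
    public static void main(String[] args) {
        new CreditCardProductTest().testChargeNeverNegative();
        System.out.println("domain-level test reused for product variant: OK");
    }
}
```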

Relevance:

20.00%

Publisher:

Abstract:

Formal methods and software testing are tools for obtaining and controlling software quality. When used together, they provide mechanisms for software specification, verification, and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the verification and validation of a system. Model-Based Testing techniques allow tests to be generated from other software artifacts, such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better-quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an example of a B specification from industry. Based on this case study, we obtained the insights needed to improve the method. In our work we evolved it, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process, and to use new coverage criteria. Besides this, we implemented a tool to automate the method and submitted it to more complex case studies.
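
The two techniques the method builds on can be illustrated concretely. The Java sketch below applies equivalence class partitioning and boundary value analysis to an invented precondition, 0 <= x <= 100; it is not the tool described above, just the flavor of the generated test values.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the two classic techniques the abstract mentions,
// applied to an operation whose (invented) precondition is 0 <= x <= 100.
public class BTestValues {
    static final int MIN = 0, MAX = 100;   // from the assumed precondition

    // Equivalence class partitioning: one representative per class,
    // the valid range plus the two invalid ranges (negative tests).
    static List<Integer> equivalencePartitioning() {
        return List.of(50, MIN - 10, MAX + 10);
    }

    // Boundary value analysis: the values just below, exactly on, and just
    // above each bound; which side is valid depends on whether the bound
    // is the lower or the upper one.
    static List<Integer> boundaryValues() {
        List<Integer> v = new ArrayList<>();
        for (int b : new int[]{MIN, MAX}) {
            v.add(b - 1);
            v.add(b);
            v.add(b + 1);
        }
        return v;
    }

    public static void main(String[] args) {
        System.out.println("partition representatives: " + equivalencePartitioning());
        System.out.println("boundary values: " + boundaryValues());
    }
}
```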

Relevance:

20.00%

Publisher:

Abstract:

Automation is an important activity in the testing process and can significantly reduce development time and cost. Several tools have been proposed to automate acceptance testing of Web applications. However, most of them have important limitations, such as the need to manually assign values to test cases, the need to refactor the generated code, and a strong dependence on the structure of the HTML pages. In this work, we present a test specification language and a tool designed to minimize the impact of these limitations. The proposed language supports equivalence class criteria, and the tool, developed as a plug-in for the Eclipse platform, allows test cases to be generated through different combination strategies. To evaluate the approach, we used a module of the Sistema Unificado de Administração Pública (SUAP) of the Instituto Federal do Rio Grande do Norte (IFRN). Systems analysts and a computer technician who work as developers of the system took part in the evaluation.

Relevance:

20.00%

Publisher:

Abstract:

Automation has become increasingly necessary during the software testing process due to the high cost and time associated with this activity. Some tools have been proposed to automate the execution of acceptance tests in Web applications. However, many of them have important limitations, such as a strong dependence on the structure of the HTML pages and the need to manually assign values to the test cases. In this work, we present IFL4TCG, a language for specifying acceptance test scenarios for Web applications, and a tool that allows the generation of test cases from these scenarios. The proposed language supports the Equivalence Class Partitioning criterion, and the tool allows the generation of test cases that follow different combination strategies (i.e., Each-Choice, Base-Choice, and All Combinations). In order to evaluate the effectiveness of the proposed solution, we used the language and the associated tool to design and execute acceptance tests on a module of the Sistema Unificado de Administração Pública (SUAP) of the Instituto Federal do Rio Grande do Norte (IFRN). Four systems analysts and one computer technician, who work as developers of that system, participated in the evaluation. Preliminary results showed that IFL4TCG can indeed help detect defects in Web applications.
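
The combination strategies named above are standard; the Java sketch below illustrates two of them (All Combinations and Each-Choice) over invented equivalence classes of two login parameters. It is not IFL4TCG's implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of two combination strategies over the equivalence
// classes of two (invented) parameters of a login form.
public class CombinationStrategies {
    static final List<String> USER = List.of("admin", "regular", "anonymous");
    static final List<String> PASS = List.of("valid", "invalid");

    // All Combinations: the full cartesian product (3 x 2 = 6 cases).
    static List<String[]> allCombinations() {
        List<String[]> cases = new ArrayList<>();
        for (String u : USER)
            for (String p : PASS)
                cases.add(new String[]{u, p});
        return cases;
    }

    // Each-Choice: every class of every parameter appears in at least one
    // case; max(3, 2) = 3 cases suffice, cycling the shorter list.
    static List<String[]> eachChoice() {
        List<String[]> cases = new ArrayList<>();
        int n = Math.max(USER.size(), PASS.size());
        for (int i = 0; i < n; i++)
            cases.add(new String[]{USER.get(i % USER.size()),
                                   PASS.get(i % PASS.size())});
        return cases;
    }

    public static void main(String[] args) {
        System.out.println("All Combinations: " + allCombinations().size()); // 6
        System.out.println("Each-Choice: " + eachChoice().size());           // 3
    }
}
```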

Relevance:

20.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

20.00%

Publisher:

Abstract:

Checking the conformity between implementation and design rules in a system is an important activity for ensuring that no degradation occurs between the architectural patterns defined for the system and what is actually implemented in the source code. Especially for systems that require a high level of reliability, it is important to define specific design rules for exceptional behavior. Such rules describe how exceptions should flow through the system, defining which elements are responsible for catching the exceptions thrown by other system elements. However, current approaches to automatically checking design rules do not provide suitable mechanisms to define and verify design rules related to an application's exception handling policy. This paper proposes a practical approach to preserving the exceptional behavior of an application or family of applications, based on the definition and automatic runtime checking of exception handling design rules for systems developed in Java or AspectJ. To support this approach, a tool called VITTAE (Verification and Information Tool to Analyze Exceptions) was developed in the context of this work; it extends the JUnit framework and automates the testing of exceptional design rules. We conducted a case study whose primary objective was to evaluate the effectiveness of the proposed approach on a software product line. In addition, an experiment was conducted to compare the proposed approach with one based on a tool called JUnitE, which also proposes testing exception handling code with JUnit tests. The results showed how exception handling design rules evolve across different versions of a system and that VITTAE can aid in the detection of defects in exception handling code.
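
The abstract does not show VITTAE's rule syntax, but the flavor of an exception handling design rule can be conveyed with an ordinary JUnit 5 test (requires JUnit 5 on the classpath): the rule below, invented for illustration, says the persistence layer must catch low-level SQLExceptions and re-signal them as its own exception type.

```java
import static org.junit.jupiter.api.Assertions.assertThrows;
import org.junit.jupiter.api.Test;

// Hypothetical exception-flow design rule expressed as a JUnit 5 test; class
// and method names are invented, not taken from VITTAE. The rule: callers of
// the persistence layer must never see a raw SQLException escape.
class DataAccessException extends RuntimeException {
    DataAccessException(Throwable cause) { super(cause); }
}

class UserRepository {
    void save(String user) {
        try {
            throw new java.sql.SQLException("connection refused"); // simulated failure
        } catch (java.sql.SQLException e) {
            throw new DataAccessException(e);  // the rule: wrap, do not leak
        }
    }
}

class ExceptionFlowRuleTest {
    @Test
    void persistenceLayerWrapsSqlExceptions() {
        // Passes only if the designated element catches the low-level
        // exception and re-signals it as the architecturally allowed type.
        assertThrows(DataAccessException.class, () -> new UserRepository().save("ana"));
    }
}
```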