919 results for Tools and techniques
Abstract:
Structural Health Monitoring (SHM) has diverse potential applications, and many groups work on the development of tools and techniques for monitoring structural performance. These systems use arrays of sensors and can be integrated with remote or local computers. Several different approaches can be used to obtain information about the existence, location, and extent of faults through non-destructive tests. In this paper, an experimental technique for damage location based on an observability Gramian matrix is proposed. The dynamic properties of the structure are identified from experimental data using the Eigensystem Realization Algorithm (ERA). Experimental tests were carried out on a structure by varying the mass of some of its elements, and the output signals were obtained with accelerometers.
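As a rough illustration of the identification step described above, the sketch below builds a discrete-time realization from impulse-response (Markov) parameters with a textbook ERA and then computes the observability Gramian of the identified model. It is a minimal single-channel sketch, assuming scalar Markov parameters and a known model order; the actual damage index used in the paper (how the Gramian is compared between the reference and the mass-modified structure) is not reproduced here.

    import numpy as np
    from scipy.linalg import svd, solve_discrete_lyapunov

    def era(markov, n_states, rows=20, cols=20):
        """Eigensystem Realization Algorithm from impulse-response (Markov)
        parameters of a single measurement channel, markov[k] ~ C A^k B.
        Returns a discrete-time realization (A, B, C) of order n_states."""
        # Block Hankel matrices H(0) and H(1) built from the Markov parameters
        H0 = np.array([[markov[i + j] for j in range(cols)] for i in range(rows)])
        H1 = np.array([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])
        U, s, Vt = svd(H0, full_matrices=False)
        U, s, Vt = U[:, :n_states], s[:n_states], Vt[:n_states, :]
        S_half = np.diag(np.sqrt(s))
        S_half_inv = np.diag(1.0 / np.sqrt(s))
        A = S_half_inv @ U.T @ H1 @ Vt.T @ S_half_inv
        B = (S_half @ Vt)[:, :1]   # first input column
        C = (U @ S_half)[:1, :]    # first output row
        return A, B, C

    def observability_gramian(A, C):
        """Observability Gramian Wo of the identified model, i.e. the solution
        of A' Wo A - Wo + C' C = 0; comparing Wo between the reference and the
        modified configuration is the kind of index the paper builds on."""
        return solve_discrete_lyapunov(A.T, C.T @ C)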
Abstract:
The text analyzes Archival Science and its relation to information mediation. It argues that the theoretical basis of Archival Science provides the fundamental conditions for developing, in practice, methodological operations that result in the proper treatment of documents. In this way, highlighting archival praxis, the use of instruments and techniques is understood as a mediation of systems, in which the stages of the archival methodology serve the primary goal of organizing document masses, enabling their treatment so that the information of the respective document collections can be retrieved and made available. The technical work of information professionals, specifically the archivist, in this context already constitutes a mediation, but one that deals, above all, with proto-information. It is therefore argued that it is necessary to understand how this proto-information becomes information. In this sense, information mediation presents itself as the object through which such an understanding can be reached, taking as its parameter the appropriation of information by the archive's user-researchers and the production and/or alteration of knowledge resulting from their relation with that environment, so as to genuinely guarantee a mediation of archival information. The text advocates that this innovative perspective on information mediation in archives characterizes an approach that still calls for further reflection in the field.
Abstract:
Over the last two decades, growing interest in the Six Sigma methodology has intensified the application of statistical and other quantitative approaches aimed not only at improving the quality of products, services, and processes, but also at enhancing organizational performance and decision-making. This article addresses the application of the statistical approach in the context of quality management in medium and large food companies in the State of São Paulo, with the purpose of: identifying which statistical tools and techniques are most widely employed by companies in the sector to assure and control quality; assessing the interdependence between the successful implementation of quality and food-safety programs, such as Good Manufacturing Practices (GMP) and the Hazard Analysis and Critical Control Points (HACCP) system, and the use of statistics; and analyzing estimates of the relevance of statistical thinking and of its benefits as a quality-improvement tool. An exploratory-descriptive survey was carried out, and the results revealed that the statistical approach is beginning to be more highly valued in the food industry because of the relevance of its benefits, as already occurs in other sectors. There is evidence that the successful implementation of food-safety programs is a prerequisite for the effective use of statistics and other quantitative approaches.
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
In the Brazilian context, research on special educational needs concentrates on the difficulties of and possibilities for including these students in regular classes, emphasizing teaching and learning processes. Few studies in Brazil take the family as the object of analysis, even though its importance to child development is not questioned. Thus, based on the bioecological model and on structural systems theory, and taking the family as a developmental context shared by all of its members, it is necessary to understand how the family organizes itself to meet the demands arising from its child's special need and the effects of this dynamic on the other members. The objective was therefore to describe the structure and dynamics of families of children with special educational needs and, further, to analyze the interactions and relations established within and between each subsystem (conjugal, fraternal, parental), and to identify the family organization through the mechanisms of cohesion and hierarchy according to the structural systems model. The research strategy was a multiple-case study of two families of children with special educational needs: a ten-year-old deaf girl and a twelve-year-old boy with learning difficulties. The instruments and techniques applied were a semi-structured interview script, the Routine Inventory (IR), systematic observation, a field diary, the Family System Test (FAST), and the genogram. The proximity scores obtained with the FAST were consistent with the IR results, showing greater cohesion in the mother-child dyad than in the father-child dyad in both families. Regarding boundary flexibility, the families generally perceived rigid boundaries in the family, parental, and fraternal systems, and the distribution of hierarchy was perceived by the parental dyad, in both families, as a sign of favoritism in the fraternal subsystem and of domination in the parental one, which interfered with the relational structures of these subsystems as perceived by the members. In the assessment of the fraternal subsystem, the absence of power as represented by the parents and the children's own representation of this variable resulted in differences of perception within the group. This study thus made it possible, by identifying the relations and perceptions of the family members, to understand the family dynamics and their influence on the developmental trajectory of the children and of the group, considering the demands arising from the diagnosis and the strategies each family adopts to cope with its children's special needs. The family, as the school's main partner in education, needs to be seen as a system whose relational strategies are fundamental for the child to have his or her abilities stimulated and thus overcome his or her difficulties.
Abstract:
Although technological development has created several tools and techniques of graphic representation, we highlight here the importance of manual drawing skills for the creative design process. Freehand drawing is used to facilitate the development of projects and to present them more quickly and efficiently, and it is an essential technique for any designer, whether in informational, product, or fashion design.
Abstract:
The teaching/learning activities of the daylit built environment require Architecture and Urbanism undergraduate students to abstract the effects of daylight distributed in the three-dimensional space being designed. Several tools and techniques can be used to facilitate the understanding of the phenomena involved, among them computational simulation. This paper reports the digital inclusion of daylighting teaching in the Architecture and Urbanism undergraduate course at the School of Architecture, Arts and Social Communication of Bauru (FAAC) of UNESP – Sao Paulo State University, which began in 2010. The inclusion process involved the use of free software, specifically the programs DIALux and SketchUp+Radiance, both with graphical output for visualizing the illuminated scenes and analyzing results. The graphic model is converted from SketchUp to Radiance by a plugin, and a user-friendly Windows interface was developed to run the lighting simulation. The process of digital inclusion is consolidated, with wide acceptance by the students, for whom computational simulation facilitates the understanding of the relation between daylight and the built environment and supports the design of daylighting control elements.
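As a rough sketch of the simulation step only (the SketchUp-to-Radiance conversion itself is handled by the plugin mentioned in the abstract), the snippet below compiles an exported Radiance scene and evaluates illuminance at sensor points with the standard Radiance command-line tools; the file names, ambient parameters, and irradiance-to-illuminance conversion follow common Radiance practice and are assumptions, not the course material.

    import subprocess

    # Compile the scene exported from SketchUp into a Radiance octree.
    # (File names here, scene.rad and points.inp, are placeholders.)
    subprocess.run("oconv scene.rad > scene.oct", shell=True, check=True)

    # points.inp holds one sensor per line: "x y z dx dy dz" (position + direction).
    # -I asks rtrace for irradiance at those points; -ab 2 enables ambient bounces.
    result = subprocess.run("rtrace -h -I -ab 2 scene.oct < points.inp",
                            shell=True, check=True, capture_output=True, text=True)

    for line in result.stdout.splitlines():
        r, g, b = map(float, line.split()[:3])
        # Usual Radiance conversion from RGB irradiance to illuminance (lux).
        lux = 179.0 * (0.265 * r + 0.670 * g + 0.065 * b)
        print(f"{lux:.1f} lx")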
Abstract:
Many tools and techniques for addressing software maintenance problems rely on code coverage information. Often, this coverage information is gathered for a specific version of a software system, and then used to perform analyses on subsequent versions of that system without being recalculated. As a software system evolves, however, modifications to the software alter the software’s behavior on particular inputs, and code coverage information gathered on earlier versions of a program may not accurately reflect the coverage that would be obtained on later versions. This discrepancy may affect the success of analyses dependent on code coverage information. Despite the importance of coverage information in various analyses, in our search of the literature we find no studies specifically examining the impact of software evolution on code coverage information. Therefore, we conducted empirical studies to examine this impact. The results of our studies suggest that even relatively small modifications can greatly affect code coverage information, and that the degree of impact of change on coverage may be difficult to predict.
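A minimal sketch of the underlying notion of coverage drift, assuming hypothetical per-file line-coverage sets (for example, exported from a coverage tool) rather than the measures used in the studies:

    # cov_v1 and cov_v2 map file names to the set of line numbers covered by the
    # same test suite on two program versions.
    def coverage_drift(cov_v1, cov_v2):
        """Per-file fraction of coverage that does NOT carry over between versions."""
        drift = {}
        for f in set(cov_v1) | set(cov_v2):
            a, b = cov_v1.get(f, set()), cov_v2.get(f, set())
            union = a | b
            drift[f] = 1.0 - len(a & b) / len(union) if union else 0.0
        return drift

    old = {"parser.py": {3, 4, 5, 10}}
    new = {"parser.py": {3, 4, 12, 13}}       # a small edit moved and added lines
    print(coverage_drift(old, new))           # {'parser.py': 0.666...}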
Abstract:
Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end-users contain faults, and stories abound of spreadsheet faults that have led to multi-million dollar losses. Despite such evidence, until recently, relatively little research had been done to help end-users create more dependable software.
Abstract:
Not long ago, most software was written by professional programmers, who could be presumed to have an interest in software engineering methodologies and in tools and techniques for improving software dependability. Today, however, a great deal of software is written not by professionals but by end-users, who create applications such as multimedia simulations, dynamic web pages, and spreadsheets. Applications such as these are often used to guide important decisions or aid in important tasks, and it is important that they be sufficiently dependable, but evidence shows that they frequently are not. For example, studies have shown that a large percentage of the spreadsheets created by end-users contain faults. Despite such evidence, until recently, relatively little research had been done to help end-users create more dependable software. We have been working to address this problem by finding ways to provide at least some of the benefits of formal software engineering techniques to end-user programmers. In this talk, focusing on the spreadsheet application paradigm, I present several of our approaches, concentrating on methodologies that utilize source-code-analysis techniques to help end-users build more dependable spreadsheets. Behind the scenes, our methodologies use static analyses such as dataflow analysis and slicing, together with dynamic analyses such as execution monitoring, to support user tasks such as validation and fault localization. I show how, to accommodate the user base of spreadsheet languages, an interface to these methodologies can be provided in a manner that does not require an understanding of the theory behind the analyses, yet supports the interactive, incremental process by which spreadsheets are created. Finally, I present empirical results gathered in the use of our methodologies that highlight several cost-benefit trade-offs and many opportunities for future work.
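As a hedged illustration of the kind of dataflow reasoning mentioned above (not the actual system presented in the talk), the sketch below computes a backward slice over spreadsheet formulas: the set of cells a suspect cell transitively depends on, which is a natural starting point for fault localization. The cell names and formulas are invented.

    import re

    CELL_REF = re.compile(r"[A-Z]+[0-9]+")

    def backward_slice(formulas, target):
        """Return every cell the target cell (transitively) depends on:
        the cells a user would inspect when the target's value looks wrong."""
        sliced, work = set(), [target]
        while work:
            cell = work.pop()
            for ref in CELL_REF.findall(formulas.get(cell, "")):
                if ref not in sliced:
                    sliced.add(ref)
                    work.append(ref)
        return sliced

    sheet = {"C1": "=A1+B1", "D1": "=C1*2", "E1": "=D1-B2"}
    print(backward_slice(sheet, "E1"))   # {'A1', 'B1', 'B2', 'C1', 'D1'}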
Abstract:
Self-organisation is increasingly regarded as an effective approach to tackling the complexity of modern systems. It allows the development of systems that exhibit complex dynamics and adapt to environmental perturbations without requiring complete knowledge of the future surrounding conditions. However, the development of self-organising systems (SOS) is driven by principles different from those of traditional software engineering. For instance, engineers typically design systems by combining smaller elements, where the composition rules depend on the reference paradigm but typically produce predictable results. Conversely, SOS display non-linear dynamics, which can hardly be captured by deterministic models, and, although robust with respect to external perturbations, are quite sensitive to changes in their inner working parameters. In this thesis, we describe methodological aspects concerning the early design stage of SOS built on the multiagent paradigm: in particular, we refer to the A&A metamodel, in which a MAS is composed of agents and artefacts, i.e. environmental resources. We then describe an architectural pattern extracted from a recurrent solution in designing self-organising systems: this pattern is based on a MAS environment formed by artefacts, modelling non-proactive resources, and environmental agents acting on the artefacts so as to enable self-organising mechanisms. In this context, we propose a scientific approach for the early design stage of the engineering of self-organising systems: the process is iterative, and each cycle is articulated in four stages: modelling, simulation, formal verification, and tuning. During the modelling phase we mainly rely on the existence of a self-organising strategy observed in nature and, ideally, encoded as a design pattern. Simulations of an abstract system model are used to drive design choices until the required quality properties are obtained, thus providing guarantees that the subsequent design steps will lead to a correct implementation. However, system analysis based exclusively on simulation results does not provide sound guarantees for the engineering of complex systems: to this purpose, we envision the application of formal verification techniques, specifically model checking, in order to characterise the system behaviours exactly. During the tuning stage, parameters are tweaked in order to meet the target global dynamics and feasibility constraints. In order to evaluate the methodology, we analysed several systems; in this thesis we describe only three of them, the most representative one from each of the three years of the PhD course. We analyse each case study using the presented method and describe the formal tools and techniques employed.
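A minimal sketch of the architectural pattern described above, assuming invented class names rather than the A&A or any agent-platform API: a non-proactive artefact holds shared state, and an environmental agent acts on it to realise a self-organising mechanism, here pheromone-style evaporation with an explicitly tunable rate.

    class PheromoneArtefact:
        """Non-proactive environmental resource: exposes operations, takes no initiative."""
        def __init__(self, size):
            self.field = [0.0] * size
        def deposit(self, pos, amount):
            self.field[pos] += amount
        def read(self, pos):
            return self.field[pos]

    class EvaporationAgent:
        """Environmental agent: periodically acts on the artefact so that
        unreinforced marks fade; the rate is the kind of parameter adjusted
        in the tuning stage of the process described above."""
        def __init__(self, artefact, rate=0.1):
            self.artefact, self.rate = artefact, rate
        def step(self):
            self.artefact.field = [v * (1.0 - self.rate) for v in self.artefact.field]

    field = PheromoneArtefact(10)
    field.deposit(3, 1.0)
    agent = EvaporationAgent(field, rate=0.2)
    for _ in range(5):
        agent.step()
    print(round(field.read(3), 3))   # 0.328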
Abstract:
In orthopaedic and dental implantology, novel tools and techniques are being sought to improve the regeneration of bone tissue. Numerous attempts have been made to enhance the osteoconductivity of titanium prostheses, including modifications in their surface properties and coating with layers of calcium phosphate. The technique whereby such layers are produced has recently undergone a revolutionary change, which has had profound consequences for their potential to serve as drug-carrier systems. Hitherto, calcium phosphate layers were deposited upon the surfaces of metal implants under highly unphysiological physical conditions, which precluded the incorporation of proteinaceous osteoinductive drugs. These agents could only be adsorbed, superficially, upon preformed layers. Such superficially adsorbed molecules are released too rapidly within a biological milieu to be effective in their osteoinductive capacity. Now, it is possible to deposit calcium phosphate layers under physiological conditions of temperature and pH by the so-called biomimetic process, during which bioactive agents can be coprecipitated. Since these molecules are integrated into the inorganic latticework, they are released gradually in vivo as the layer undergoes degradation. This feature enhances the capacity of these coatings to act as a carrier system for osteogenic agents.
Abstract:
Code queries focus mainly on the static structure of a system. To comprehend the dynamic behavior of a system, however, a software engineer needs to be able to reason about the dynamics of this system, for instance by querying a database of dynamic information. Such a querying mechanism should be directly available in the IDE where the developer implements, navigates, and reasons about the software system. We propose (i) concepts to gather dynamic information, (ii) the means to query this information, and (iii) tools and techniques to integrate querying of dynamic information into the IDE, including the presentation of the results generated by queries.
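As a rough sketch of gathering and then querying dynamic information (not the proposed IDE integration), the snippet below records caller/callee pairs with Python's tracing hook and answers a simple query over them; the example program and the query itself are invented for illustration.

    import sys
    from collections import defaultdict

    calls = []   # (caller, callee) pairs recorded while the program runs
    stack = []

    def tracer(frame, event, arg):
        if event == "call":
            callee = frame.f_code.co_name
            if stack:
                calls.append((stack[-1], callee))
            stack.append(callee)
        elif event == "return" and stack:
            stack.pop()
        return tracer

    def callees_of(name):
        """A simple query over the recorded dynamic information."""
        index = defaultdict(set)
        for caller, callee in calls:
            index[caller].add(callee)
        return index[name]

    def helper():
        return 1

    def handle():
        return helper() + helper()

    sys.settrace(tracer)
    handle()
    sys.settrace(None)
    print(callees_of("handle"))   # {'helper'}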
Abstract:
Wireless Mesh Networks (WMN) have proven to be a key technology for increasing the network coverage of Internet infrastructures. The development process for new protocols and architectures in the area of WMN is typically split into evaluation by network simulation and testing of a prototype in a test-bed. Testing a prototype in a real test-bed is time-consuming and expensive. Uncontrollable external interference can occur, which makes debugging difficult. Moreover, the test-bed usually supports only a limited number of test topologies. Finally, mobility tests are impractical. Therefore, we propose VirtualMesh as a new testing architecture which can be used before going to a real test-bed. It provides the means to test the real communication software, including the network stack, inside a controlled environment. VirtualMesh is implemented by capturing real traffic through a virtual interface at the mesh nodes. The traffic is then redirected to the network simulator OMNeT++. In our experiments, VirtualMesh has proven to be scalable and introduces moderate delays. Therefore, it is suitable for pre-deployment testing of communication software for WMNs.
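A minimal sketch of the capture-and-redirect idea, assuming a Linux TAP device and an invented UDP endpoint for the simulation host (this is not the VirtualMesh implementation): frames emitted by the unmodified network stack onto the virtual interface are read in user space and tunnelled to the machine running OMNeT++.

    import fcntl, os, socket, struct

    TUNSETIFF = 0x400454ca
    IFF_TAP, IFF_NO_PI = 0x0002, 0x1000
    SIM_HOST, SIM_PORT = "192.0.2.10", 4444   # placeholder simulation endpoint

    # Attach to a TAP device (requires root and a preconfigured interface).
    tap = os.open("/dev/net/tun", os.O_RDWR)
    fcntl.ioctl(tap, TUNSETIFF, struct.pack("16sH", b"mesh0", IFF_TAP | IFF_NO_PI))

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        frame = os.read(tap, 2048)                 # Ethernet frame from the real stack
        sock.sendto(frame, (SIM_HOST, SIM_PORT))   # hand it over to the simulator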