993 results for "Engenharia de software experimental"


Relevance: 30.00%

Abstract:

This project aims to create a teaching network specialising in geographic information technologies (GIT), with particular emphasis on geomatics and on the capture, analysis and publication of geographic data. These systems must be sustainable, so special emphasis is placed on the use of free software, services and data, in order to provide teaching that gives students a different profile: not only analysts, but also producers and disseminators of geographic content in the form of cartography. All of this benefits students by facilitating access to the knowledge generated by this teaching network, improving their employment prospects and encouraging entrepreneurship. The collaborative and methodological actions therefore centre on creating and customising an experimental classroom for cartography and Geographic Information Systems (GIS) where students can practise the services and processes that society is demanding and that producers are supplying. These efforts require a data and services server, which also forms part of this project.

Relevance: 30.00%

Abstract:

Technical and scientific development in armour has sought to counter the constant improvement of projectiles and their penetrating power. Meeting this objective requires innovative solutions both in the materials used in manufacturing and in the geometry of the armour itself. This work takes the first steps in producing and testing materials under live-fire conditions, as well as in modelling and simulating them, with the aim of acquiring the capabilities, techniques and procedures needed for future work. The work began with a literature review covering aspects ranging from manufacturing to the performance evaluation of ballistic armour. The theoretical part of this thesis focused on finite element modelling of the ballistic elements (protections and projectile), with the goal of numerically simulating their interaction under impact conditions. The experimental work was carried out at the firing range of the Escola das Armas, where targets made of the composite materials produced, and of aluminium plates, were tested in different combinations. The analysis of the results made it possible to compare the theoretical estimates, and their limitations, against the experimental observations from live fire.

Relevance: 30.00%

Abstract:

The initial phase of this dissertation reviews the most common chip-removal machining operations, namely turning, drilling and milling, with milling covered in greater detail since it is the central topic of this work. A more detailed study was also carried out on milling fundamentals such as climb and conventional milling, factors that influence the final surface finish, the effective cutting diameter of a ball-nose end mill, and tool deflection, among others. Topics such as CAD/CAM, CNC and reverse engineering were also covered, since familiarity with them is important for understanding the work developed later. The following part of the work was experimental. A first task was based on a two-dimensional geometry: the geometries were drawn in AUTOCAD® and MASTERCAM® was then used to create the machining cycles. A second task, this time based on a three-dimensional geometry, used SOLIDWORKS® to extract the core and cavity and MASTERCAM® to create the machining cycles. Both parts were produced on the machining centre of the Department of Mechanical Engineering at the Instituto Superior de Engenharia de Coimbra.
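The effective cutting diameter of a ball-nose end mill, one of the milling fundamentals reviewed above, follows from simple geometry: at shallow axial depths of cut only part of the spherical tip engages the material. A minimal sketch of the standard formula follows; the numbers are illustrative, not values from the dissertation.

```python
import math

def effective_diameter(tool_diameter, depth_of_cut):
    """Effective cutting diameter of a ball-nose end mill at a given
    axial depth of cut (valid for depth_of_cut <= tool radius)."""
    d, ap = tool_diameter, depth_of_cut
    # Chord of the sphere at the engaged depth: Deff = 2 * sqrt(ap * (d - ap))
    return 2.0 * math.sqrt(ap * (d - ap))

deff = effective_diameter(10.0, 1.0)  # 10 mm tool, 1 mm depth -> 6.0 mm
```

At full radius engagement (`depth_of_cut == d / 2`) the formula recovers the nominal diameter, which is a quick sanity check on the geometry.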

Relevance: 30.00%

Abstract:

The data structure of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. This research develops a methodology for evaluating, ex ante, the relative desirability of alternative data structures for end user queries. It theorizes that the data structure yielding the lowest weighted average complexity for a representative sample of information requests is the most desirable data structure for end user queries. The theory was tested in an experiment that compared queries from two different relational database schemas. As theorized, end users querying the data structure associated with the less complex queries performed better. Complexity was measured using three different Halstead metrics, each of which provided excellent predictions of end user performance. This research supplies strong evidence that organizations can use complexity metrics to evaluate, ex ante, the desirability of alternative data structures, and can use these evaluations to enhance the efficient and effective retrieval of information by creating data structures that minimize end user query complexity.
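The abstract does not reproduce the Halstead formulas, so a minimal sketch of how volume, difficulty and effort can be computed for a toy SQL query is given below. The operator/operand classification is a simplification introduced here for illustration, not the coding scheme used in the study.

```python
import math

# Simplified operator set; real Halstead counting for SQL is more nuanced.
SQL_OPERATORS = {"SELECT", "FROM", "WHERE", "JOIN", "ON", "AND", "OR", "="}

def halstead(tokens):
    ops = [t for t in tokens if t.upper() in SQL_OPERATORS]
    opnds = [t for t in tokens if t.upper() not in SQL_OPERATORS]
    n1, n2 = len(set(ops)), len(set(opnds))  # distinct operators / operands
    N1, N2 = len(ops), len(opnds)            # total operators / operands
    volume = (N1 + N2) * math.log2(n1 + n2)  # program length * log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"volume": volume, "difficulty": difficulty,
            "effort": volume * difficulty}

tokens = "SELECT name FROM employees WHERE dept = sales".split()
metrics = halstead(tokens)
```

Under this scheme the sample query has four distinct operators and four distinct operands, so the metrics come out as exact round numbers, which makes the arithmetic easy to verify by hand.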

Relevance: 30.00%

Abstract:

A hydrogel intervertebral disc (IVD) model consisting of an inner nucleus core and an outer anulus ring was manufactured from 30 and 35% by weight poly(vinyl alcohol) hydrogel (PVA-H) concentrations and subjected to axial compression between saturated porous endplates at 200 N for 11 h 30 min. Repeat experiments (n = 4) on different samples (N = 2) show good reproducibility of fluid loss and axial deformation. An axisymmetric nonlinear poroelastic finite element model with variable permeability was developed using commercial finite element software to compare axial deformation and predicted fluid loss with the experimental data. The FE predictions indicate differential fluid loss similar to that of biological IVDs, with the nucleus losing more water than the anulus, and there is overall good agreement between experimental and finite element predicted fluid loss. The stress distribution pattern indicates important similarities with the biological IVD, including stress transference from the nucleus to the anulus upon sustained loading, and renders it suitable as a model that can be used in future studies to better understand the role of fluid and stress in biological IVDs. (C) 2005 Springer Science + Business Media, Inc.

Relevance: 30.00%

Abstract:

This paper arises out of a research study into the online help facilities provided in popular software applications such as word processors. Its particular focus is on experimental methods of evaluating the effectiveness and usability of those facilities. Focus groups, questionnaires, and online surveys had already been used in other phases of the study, but it was judged that these approaches would be unsuitable for measuring effectiveness and usability because they are susceptible to respondents' subjectivity. Direct observation of people working on set word-processing tasks was ruled out initially because of a lack of trained observers; it would have taken too long for the investigator to observe a large enough sample by himself. Automatic recording of users' actions was also rejected, as it would have demanded equipment and/or software that was not available and seemed too expensive to acquire. The approach and techniques described here were an attempt to overcome these difficulties by using observers drawn from the same population of students that provided the test subjects; as a by-product, this may also have enhanced the acceptability (and hence possibly the validity) of the experiments by reducing the exam pressure perceived by participants.

Relevance: 30.00%

Abstract:

The results of empirical studies are limited to particular contexts and difficult to generalise, and the studies themselves are expensive to perform. Despite these problems, empirical studies in software engineering can be made effective, and they are important to both researchers and practitioners. The key to their effectiveness lies in maximising the information that can be gained: by examining existing studies, by conducting power analyses for an accurate minimum sample size, and by benefiting from previous studies through replication. This approach was applied in a controlled experiment examining the combination of automated static analysis tools and code inspection in the context of verification and validation (V&V) of concurrent Java components. The combination of these V&V technologies was shown to be cost-effective despite the size of the study, which thus contributes to research in V&V technology evaluation.
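The power analysis mentioned above can be sketched with the standard normal-approximation formula for a two-sample comparison. The study does not state which power-analysis procedure it used, so this is illustrative only.

```python
import math
from statistics import NormalDist

def min_sample_size(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation minimum sample size per group for a
    two-sided, two-sample comparison of means (Cohen's d effect size).

    n >= 2 * ((z_{1-alpha/2} + z_{power}) / d)^2
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # e.g. 0.84 for power=0.80
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

n_large = min_sample_size(0.8)  # "large" effect -> about 25 per group
n_medium = min_sample_size(0.5)  # "medium" effect needs far more subjects
```

The z-based approximation slightly understates the t-test requirement for small groups, but it shows the key point of the abstract: the smaller the expected effect, the larger the minimum sample a credible experiment needs.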

Relevance: 30.00%

Abstract:

The results of an experimental study of retail investors' use of eXtensible Business Reporting Language (XBRL) tagged (interactive) data and the PDF format for making investment decisions are reported. The main finding is that data format made no difference to participants' ability to locate and integrate information from statement footnotes to improve investment decisions. Interactive data were perceived by participants as quick and 'accurate', but they failed to facilitate the identification of the adjustment needed to make the ratios accurate for comparison. An important implication is that regulators and software designers should work to reduce user reliance on the comparability of ratios generated automatically from interactive data.

Relevance: 30.00%

Abstract:

In data mining, efforts have focused on finding methods for efficient and effective cluster analysis in large databases. Active research themes include the scalability of clustering methods, the effectiveness of methods for clustering complex shapes and types of data, high-dimensional clustering techniques, and methods for clustering mixed numerical and categorical data in large databases. One of the most accurate approaches, based on dynamic modeling of cluster similarity, is Chameleon. In this paper we present a modified hierarchical clustering algorithm that builds on the main idea of Chameleon; the effectiveness of the suggested approach is demonstrated by the experimental results.
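Chameleon-style clustering is hierarchical at its core. The sketch below shows only the agglomerative merge loop, on 1-D points with single-link distance; Chameleon's actual contribution, scoring candidate merges by the *relative* interconnectivity and closeness of the clusters, is omitted for brevity.

```python
def agglomerate(points, k):
    """Repeatedly merge the two closest clusters until k remain.

    Single-link distance on 1-D points; a stand-in for Chameleon's
    dynamic-modeling merge criterion, which this sketch does not implement.
    """
    clusters = [[p] for p in points]

    def dist(a, b):
        # Single link: distance between the closest pair of members.
        return min(abs(x - y) for x in a for y in b)

    while len(clusters) > k:
        i, j = min(
            ((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
            key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i].extend(clusters.pop(j))  # j > i, so index i stays valid
    return clusters

groups = agglomerate([1.0, 1.2, 5.0, 5.1, 9.0], k=3)
```

On this toy input the two tight pairs merge first and the outlier 9.0 remains a singleton, which is the behavior any sensible merge criterion should reproduce.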

Relevance: 30.00%

Abstract:

Making a reasonable choice is a critical success factor for decision-making in the field of software engineering (SE). A case-driven comparative analysis is introduced and a procedure for its systematic application is suggested. The paper describes how the proposed method can be built into a general framework for SE activities. Some examples of experimental versions of the framework are briefly presented.

Relevance: 30.00%

Abstract:

The raster-graphic ampelometric software GRA.LE.D. was developed not only for the estimation of leaf area but also for the characterization of grapevine (Vitis vinifera L.) leaves. The software was written in the C++ programming language, using C++ Builder 2007, for Windows 95-XP and Linux operating systems. It handles desktop-scanned images. On the image analysed with the GRA.LE.D., the user has to mark 11 points; these points are then connected and the distances between them calculated. The GRA.LE.D. supports standard ampelometric measurements such as leaf area, angles between the veins and lengths of the veins. These measurements are recorded by the software and exported into plain ASCII text files for single or multiple samples. Twenty-two biometric data points of each leaf are identified by the GRA.LE.D. It offers the opportunity to statistically analyse experimental data, allows comparison of cultivars and enables graphic reconstruction of leaves using the Microsoft Excel Chart Wizard. The GRA.LE.D. was thoroughly calibrated and compared with other widely used instruments and methods such as photo-gravimetry, LiCor L0100, WinDIAS2.0 and ImageTool. By comparison, the GRA.LE.D. gave the most accurate measurements of leaf area, although the LiCor L0100 and the WinDIAS2.0 were faster, while the photo-gravimetric method proved the most time-consuming and the WinDIAS2.0 instrument was the least reliable. The GRA.LE.D. is uncomplicated, user-friendly, accurate, consistent, reliable and has wide practical application.
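The abstract does not describe GRA.LE.D.'s internal area algorithm. One plausible approach, computing the area of the polygon spanned by the user-marked points with the shoelace formula, is sketched below with hypothetical coordinates; it is an illustration of the idea, not the software's actual method.

```python
def polygon_area(points):
    """Shoelace formula: area of a simple polygon given its vertices
    in order, as (x, y) pairs in any consistent unit (e.g. pixels)."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1]
            for i in range(n))
    return abs(s) / 2.0

# Hypothetical marked points; a real leaf outline would use the 11 points
# the user places on the scanned image, scaled from pixels to cm^2.
unit_square = [(0, 0), (1, 0), (1, 1), (0, 1)]
area = polygon_area(unit_square)
```

Converting a pixel-space area to physical units would additionally require the scan resolution, which the formula itself does not know about.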

Relevance: 30.00%

Abstract:

The phenomenal growth of the Internet has connected us to a vast amount of computation and information resources around the world. However, making use of these resources is difficult due to the unparalleled massiveness, high communication latency, shared-nothing architecture and unreliable connections of the Internet. In this dissertation, we present a distributed software agent approach, which brings a new distributed problem-solving paradigm to Internet computing research with an enhanced client-server scheme, inherent scalability and heterogeneity. Our study discusses the role of a distributed software agent in Internet computing and classifies it into three major categories by the objects it interacts with: computation agent, information agent and interface agent. The discussion of the problem domain and the deployment of the computation agent and the information agent are presented with the analysis, design and implementation of experimental systems in high performance Internet computing and in scalable Web searching. In the computation agent study, high performance Internet computing can be achieved with our proposed Java massive computation agent (JAM) model. We analyzed the JAM computing scheme and built a brute-force ciphertext decryption prototype. In the information agent study, we discuss the scalability problem of existing Web search engines and design an approach to Web searching with distributed collaborative index agents. This approach can be used to construct a more accurate, reusable and scalable solution to deal with the growth of the Web and of the information on the Web. Our research reveals that with the deployment of distributed software agents in Internet computing, we can take a more cost-effective approach to making better use of the gigantic-scale network of computation and information resources on the Internet. The case studies in our research show that we are now able to solve many practically hard or previously unsolvable problems caused by the inherent difficulties of Internet computing.
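The JAM prototype's brute-force decryption ran as Java agents distributed across the network. The sketch below is only a single-machine analogue of the idea, partitioning a key space among workers, and uses a toy Caesar cipher rather than the real cryptosystem attacked in the dissertation.

```python
from concurrent.futures import ThreadPoolExecutor

def caesar_decrypt(text, key):
    """Shift lowercase letters back by `key`; everything else passes through."""
    return "".join(chr((ord(c) - 97 - key) % 26 + 97) if c.isalpha() else c
                   for c in text)

def search_keys(ciphertext, keys, known_word):
    """Try every key in one partition; report the key whose plaintext
    contains a known crib word, or None if the partition has no hit."""
    for k in keys:
        if known_word in caesar_decrypt(ciphertext, k):
            return k
    return None

ciphertext = caesar_decrypt("attack at dawn", -7)  # encrypt with key 7
partitions = [range(0, 13), range(13, 26)]  # key space split between two "agents"
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(lambda p: search_keys(ciphertext, p, "attack"),
                            partitions))
found = next(k for k in results if k is not None)
```

Each partition is independent, which is exactly what makes the problem attractive for agents scattered across a high-latency network: the only coordination needed is distributing the ranges and collecting a single answer.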

Relevance: 30.00%

Abstract:

Software product line engineering promotes large-scale software reuse by developing a system family that shares a set of core features and enables the selection and customization of a set of variabilities that distinguish each product of the family from the others. To address time-to-market pressure, the software industry has been using the clone-and-own technique to create and manage new software products or product lines. Despite its advantages, the clone-and-own approach brings several difficulties for the evolution and reconciliation of software product lines, especially because of the code conflicts generated by the simultaneous evolution of the original software product line, called Source, and its cloned products, called Target. This thesis proposes an approach to evolve and reconcile cloned products based on mining software repositories and code conflict analysis techniques. The approach supports the identification of different kinds of code conflicts – lexical, structural and semantic – that can occur when development tasks – bug fixes, enhancements and new use cases – are integrated from the original evolved software product line into the cloned product line. We also conducted an empirical study characterizing the code conflicts produced during the evolution and merging of two large-scale web information system product lines. The results of our study demonstrate the approach's potential to automatically or semi-automatically resolve several existing code conflicts, thus helping to reduce the complexity and costs of reconciling cloned software product lines.
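Of the three conflict classes, the lexical one can be sketched with a simple diff-based check: a base line edited both in the Source product line and in its cloned Target is a candidate conflict. This only illustrates the idea, not the thesis's actual tooling; structural and semantic conflicts require parsing and are omitted.

```python
import difflib

def changed_lines(base, revision):
    """Indices of base lines that a revision modifies or deletes."""
    matcher = difflib.SequenceMatcher(a=base, b=revision)
    changed = set()
    for op, i1, i2, _, _ in matcher.get_opcodes():
        if op != "equal":
            changed.update(range(i1, i2))
    return changed

def lexical_conflicts(base, source, target):
    """Base lines edited by both Source and the cloned Target."""
    return changed_lines(base, source) & changed_lines(base, target)

# Hypothetical three-way scenario: both sides touch line 0, only Target
# touches line 2, so only line 0 is a lexical conflict.
base   = ["a = 1", "b = 2", "c = 3"]
source = ["a = 10", "b = 2", "c = 3"]
target = ["a = 99", "b = 2", "c = 30"]
conflicts = lexical_conflicts(base, source, target)
```

Mining the repositories supplies the `base`/`source`/`target` snapshots per integrated task; the conflict check itself is the cheap part, which is why a characterization study over whole product lines is feasible.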

Relevance: 30.00%

Abstract:

The principal effluent in the oil industry is produced water, which accompanies the produced oil. Its volume is considerable and, if discharged inappropriately, it can affect the environment and society; careful management is therefore indispensable. The traditional treatment of produced water usually combines two techniques, flocculation and flotation. In flocculation processes, the traditional flocculant agents are poorly characterized in technical data sheets and are still expensive. Flotation is the step in which the particles suspended in the effluent are separated. Dissolved air flotation (DAF) is a technique that has been consolidating itself economically and environmentally, showing great reliability compared with other processes, and it is widely used in many fields of water and wastewater treatment around the globe. In this regard, this study evaluated the potential of an alternative natural flocculant agent based on Moringa oleifera to reduce the total oil and grease (TOG) content of produced water from the oil industry by the flocculation/DAF method. The natural flocculant agent was evaluated for its efficacy, as well as for its efficiency compared with two commercial flocculant agents normally used by the petroleum industry. The experiments followed an experimental design, and the overall efficiencies of all flocculants were analysed statistically using STATISTICA software version 10.0. Contour surfaces were obtained from the experimental design and interpreted in terms of the response variable, TOG removal efficiency, and the design also yielded mathematical models for calculating the response variable under the studied conditions.
The commercial flocculants showed similar behavior, with an average overall efficiency of 90% for oil removal; the economic analysis is therefore the decisive factor in choosing between them. The natural alternative flocculant agent based on Moringa oleifera showed lower separation efficiency than the commercial ones (70% on average), but it causes less environmental impact and is less expensive.
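The efficiency figures above follow from the usual removal-efficiency formula. The inlet and outlet concentrations in the sketch below are hypothetical, chosen only to reproduce the reported 90% and 70% averages, and are not values from the study.

```python
def removal_efficiency(tog_in, tog_out):
    """Percent of TOG (total oil and grease) removed by flocculation/DAF:
    100 * (C_in - C_out) / C_in, with concentrations in the same unit."""
    return 100.0 * (tog_in - tog_out) / tog_in

commercial = removal_efficiency(200.0, 20.0)  # hypothetical mg/L values
moringa = removal_efficiency(200.0, 60.0)     # hypothetical mg/L values
```

The same one-line formula is what a design-of-experiments package fits its response surfaces against; the statistics change, the response variable does not.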