959 results for automated software testing
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Development of a software tool for pressure ulcer risk assessment in an Intensive Care Unit
Abstract:
Graduate Program in Nursing (professional master's) - FMB
Abstract:
Graduate Program in Computer Science - IBILCE
Self-adaptive Software (SaS) presents specific characteristics compared to traditional software, as it allows adaptations to be incorporated at runtime. When performed manually, these adaptations normally become an onerous, error-prone activity. In this scenario, automated approaches have been proposed to support such adaptations; however, the development of SaS is not a trivial task. In parallel, reference architectures are reusable artifacts that aggregate the architectural knowledge of software systems in specific domains, and they have facilitated the development, standardization, and evolution of systems in those domains. Despite their relevance, reference architectures that could support a more systematic development of SaS are not yet found in the SaS domain. In this context, the main contribution of this paper is a reference architecture based on reflection for SaS, named RA4SaS (Reference Architecture for SaS), whose main purpose is to support the development of SaS that adapts at runtime. To show the viability of this reference architecture, a case study is presented. The results suggest that RA4SaS can contribute effectively to the SaS area.
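The core idea of runtime adaptation via reflection can be sketched in a few lines. This is a minimal illustration only; the class and method names below are hypothetical and are not taken from RA4SaS.

```python
# Minimal sketch of runtime adaptation via reflection: a component is
# inspected and swapped while the system runs. All names are illustrative.

class Sorter:
    def run(self, data):
        return sorted(data)

class ReverseSorter:
    def run(self, data):
        return sorted(data, reverse=True)

class AdaptiveSystem:
    """Holds a component and can replace it at runtime."""
    def __init__(self, component):
        self.component = component

    def adapt(self, new_component_cls):
        # Reflection: verify the replacement exposes the expected interface
        if not callable(getattr(new_component_cls, "run", None)):
            raise TypeError("replacement lacks a 'run' method")
        self.component = new_component_cls()

    def execute(self, data):
        return self.component.run(data)

system = AdaptiveSystem(Sorter())
print(system.execute([3, 1, 2]))  # [1, 2, 3]
system.adapt(ReverseSorter)       # adaptation incorporated at runtime
print(system.execute([3, 1, 2]))  # [3, 2, 1]
```

The interface check in `adapt` stands in for the reflective inspection that makes automated (rather than manual) adaptation safe.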
Abstract:
The objective of this study was to develop a model that allows wind-tunnel testing at high angles of attack and to validate its most critical components by analyzing the results of simulations in finite element software. During the project, this structure was subjected to the major loads identified under flight conditions and, from these, we calculated the stresses in critical regions, defined as the parts of the model with the highest failure probabilities. All aspects associated with load methods, mesh refinement, and stress analysis were taken into account in this approach. The selection of the analysis software was based on project needs, seeking greater ease of modeling and simulation. We opted for ANSYS®, since the entire project is being developed on CAD platforms, enabling friendly integration between the modeling and analysis software.
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Graduate Program in Agronomy (Energy in Agriculture) - FCA
Abstract:
Budgeting a building quickly and accurately is a challenge faced by companies in the sector. Cost estimation is performed from the quantity takeoff, and this quantification has historically been done through analysis of the project, the scope of work, and project information contained in 2D drawings, text files, and spreadsheets. This method is often flawed, affecting management decision-making, since it is closely coupled to time and cost management. In this scenario, this work presents a critical analysis of the conventional quantity takeoff process, based on quantification from 2D drawings, compared with the use of Autodesk Revit 2016, which applies building information modeling concepts for automated quantity takeoff from a 3D construction model. It is noted that the 3D modeling process should be aligned with the goals of budgeting. BIM-based programs provide several benefits over the traditional quantity takeoff process, representing gains in productivity, transparency, and assertiveness.
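The contrast with manual 2D takeoff can be sketched as follows: once element quantities live in a 3D model, takeoff reduces to grouping and summing. The record fields below are hypothetical and do not correspond to the Revit API.

```python
# Illustrative sketch of automated quantity takeoff: element records
# exported from a 3D building model (field names are hypothetical) are
# summed per element type instead of being measured off 2D drawings.
from collections import defaultdict

elements = [
    {"type": "Wall",   "volume_m3": 12.4},
    {"type": "Wall",   "volume_m3": 9.8},
    {"type": "Column", "volume_m3": 1.6},
    {"type": "Slab",   "volume_m3": 30.2},
]

takeoff = defaultdict(float)
for e in elements:
    takeoff[e["type"]] += e["volume_m3"]

for etype, vol in sorted(takeoff.items()):
    print(f"{etype}: {vol:.1f} m3")
```

Because the quantities are derived from the model itself, a design change updates the takeoff automatically, which is where the productivity and transparency gains come from.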
Abstract:
Spreadsheets are widely used but often contain faults. Thus, in prior work we presented a data-flow testing methodology for use with spreadsheets, which studies have shown can be used cost-effectively by end-user programmers. To date, however, the methodology has been investigated across a limited set of spreadsheet language features. Commercial spreadsheet environments are multiparadigm languages, utilizing features not accommodated by our prior approaches. In addition, most spreadsheets contain large numbers of replicated formulas that severely limit the efficiency of data-flow testing approaches. We show how to handle these two issues with a new data-flow adequacy criterion and automated detection of areas of replicated formulas, and report results of a controlled experiment investigating the feasibility of our approach.
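The detection of replicated formulas mentioned above can be illustrated by normalizing each formula's cell references to offsets from the host cell (R1C1 style), so that copies of one formula collapse to a single template. This is an illustrative sketch under simplified assumptions (single-letter columns, no absolute references or ranges), not the authors' algorithm.

```python
import re

def normalize(formula, row, col):
    """Rewrite A1-style references as offsets from the host cell,
    R1C1-style, so copied formulas share one template. Simplified:
    absolute refs ($) and ranges are not handled."""
    def repl(m):
        ref_col = ord(m.group(1)) - ord("A")
        ref_row = int(m.group(2)) - 1
        return f"R[{ref_row - row}]C[{ref_col - col}]"
    return re.sub(r"([A-Z])(\d+)", repl, formula)

# Cells: (row, col) -> formula. B1 = A1*2 copied down columns B2, B3
# yields the same normalized template for all three cells.
cells = {(0, 1): "=A1*2", (1, 1): "=A2*2", (2, 1): "=A3*2",
         (0, 2): "=B1+A1"}

groups = {}
for (r, c), f in cells.items():
    groups.setdefault(normalize(f, r, c), []).append((r, c))

for template, members in groups.items():
    print(template, members)
```

Each group can then be treated as one test obligation rather than one per cell, which is what restores the efficiency of data-flow testing in the presence of replication.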
Abstract:
Nearly all biologic tissues exhibit viscoelastic behavior. This behavior is characterized by hysteresis in the response of the material to load or strain. This information can be utilized in extrapolating the life expectancy of vascular implant materials, including native tissues and synthetic materials. This behavior is exhibited in many engineering materials as well, such as the polymers PTFE, polyamide, polyethylene, etc. While procedures have been developed for evaluating the engineering polymers, the techniques for biologic tissues are not as mature. There are multiple reasons for this; a major one is a cultural divide between the medical and engineering communities. Biomedical engineers are beginning to fill that void. A digitally controlled drivetrain designed to evaluate both elastic and viscoelastic characteristics of biologic tissues has been developed. The initial impetus for the development of this device was to evaluate the potential for human umbilical tissue to serve as a vascular graft material. The consequence is that the load frame is configured for membrane-type specimens with rectangular dimensions of no more than 25 mm per side. The drivetrain is designed to impose an axial load of 40 N on the specimen. This drivetrain is capable of assessing the viscoelastic response of the specimens in four different test modes: stress relaxation, creep, harmonic induced oscillations, and controlled strain rate tests. The fluorocarbon PTFE has mechanical properties commensurate with vascular tissue. In fact, it has been used for vascular grafts in patients who have been victims of various traumas. Hardware and software validation of the device was accomplished by testing PTFE and comparing the results to properties that have been published by both researchers and manufacturers.
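One of the four test modes, stress relaxation, can be sketched with the standard Maxwell idealization of viscoelasticity: under a step strain, stress decays as sigma(t) = sigma0 * exp(-t / tau) with tau = eta / E. The parameter values below are illustrative, not measured PTFE or tissue properties.

```python
# Stress relaxation of a Maxwell element (spring E in series with
# dashpot eta). All parameter values are illustrative.
import math

E = 0.5e9        # elastic modulus, Pa (illustrative)
eta = 5.0e9      # viscosity, Pa*s (illustrative)
tau = eta / E    # relaxation time, s
sigma0 = 1.0e6   # initial stress from the step strain, Pa

def stress(t):
    """Stress at time t after a step strain is applied and held."""
    return sigma0 * math.exp(-t / tau)

# After one relaxation time the stress has fallen to ~36.8% of sigma0
print(stress(tau) / sigma0)
```

Fitting this decay to the recorded load history is one common way such a device's relaxation data are reduced to material parameters.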
Abstract:
Bound-constrained minimization is a subject of active research. To assess the performance of existent solvers, numerical evaluations and comparisons are carried on. Arbitrary decisions that may have a crucial effect on the conclusions of numerical experiments are highlighted in the present work. As a result, a detailed evaluation based on performance profiles is applied to the comparison of bound-constrained minimization solvers. Extensive numerical results are presented and analyzed.
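The performance-profile methodology referred to above can be sketched briefly: for each solver s, rho_s(tau) is the fraction of problems it solved within a factor tau of the best solver's cost on that problem. The solver names and timings below are illustrative only.

```python
# Sketch of performance profiles for comparing solvers.
# times[solver][problem] holds solve times; inf marks a failure.
import math

times = {
    "solver_A": [1.0, 2.0, math.inf],
    "solver_B": [2.0, 2.0, 4.0],
}
n_problems = 3
best = [min(times[s][p] for s in times) for p in range(n_problems)]

def rho(solver, tau):
    """Fraction of problems solved within a factor tau of the best."""
    ratios = [times[solver][p] / best[p] for p in range(n_problems)]
    return sum(r <= tau for r in ratios) / n_problems

print(rho("solver_A", 1.0))  # fraction of problems where A was fastest
print(rho("solver_B", 2.0))  # fraction B solved within 2x of the best
```

Plotting rho against tau gives the profile curves; a failure (infinite time) never falls below any tau, which is how robustness and speed are read off the same plot.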
Abstract:
Purpose: To evaluate the relationship between glaucomatous structural damage assessed by the Cirrus Spectral Domain OCT (SDOCT) and functional loss as measured by standard automated perimetry (SAP). Methods: Four hundred twenty-two eyes (78 healthy, 210 suspects, 134 glaucomatous) of 250 patients were recruited from the longitudinal Diagnostic Innovations in Glaucoma Study and from the African Descent and Glaucoma Evaluation Study. All eyes underwent testing with the Cirrus SDOCT and SAP within a 6-month period. The relationship between parapapillary retinal nerve fiber layer thickness (RNFL) sectors and corresponding topographic SAP locations was evaluated using locally weighted scatterplot smoothing and regression analysis. SAP sensitivity values were evaluated using both linear as well as logarithmic scales. We also tested the fit of a model (Hood) for structure-function relationship in glaucoma. Results: Structure was significantly related to function for all but the nasal thickness sector. The relationship was strongest for superotemporal RNFL thickness and inferonasal sensitivity (R² = 0.314, P < 0.001). The Hood model fitted the data relatively well with 88% of the eyes inside the 95% confidence interval predicted by the model. Conclusions: RNFL thinning measured by the Cirrus SDOCT was associated with correspondent visual field loss in glaucoma.
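The regression step of such a structure-function analysis can be sketched as a simple linear fit of visual-field sensitivity on RNFL thickness, reporting R². The numbers below are synthetic, for illustration only, and are not study data.

```python
# Linear regression of SAP sensitivity (dB) on RNFL thickness (um),
# computed from first principles. Values are synthetic.

rnfl = [60.0, 70.0, 80.0, 90.0, 100.0]   # RNFL thickness, um (synthetic)
sens = [22.0, 25.0, 27.0, 30.0, 31.0]    # SAP sensitivity, dB (synthetic)

n = len(rnfl)
mx = sum(rnfl) / n
my = sum(sens) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(rnfl, sens))
sxx = sum((x - mx) ** 2 for x in rnfl)
syy = sum((y - my) ** 2 for y in sens)

slope = sxy / sxx             # dB of sensitivity per um of RNFL
r2 = sxy ** 2 / (sxx * syy)   # coefficient of determination

print(round(slope, 3), round(r2, 3))
```

Repeating such a fit per RNFL sector against its topographically matched field locations is what yields sector-wise R² values like the one reported above.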
Abstract:
Among the ongoing attempts to enhance cognitive performance, an emergent and still underrepresented avenue is hemoencephalographic neurofeedback (HEG). This paper presents three related advances in HEG neurofeedback for cognitive enhancement: a) a new HEG protocol for cognitive enhancement; b) the results of independent measures of biological efficacy (EEG brain maps) extracted in three phases during a one-year follow-up case study; and c) the results of the first controlled clinical trial of HEG, designed to assess the efficacy of the technique for cognitive enhancement in an adult, neurologically intact population. The new protocol was developed in a software environment that organizes digital signal-processing algorithms in a flowchart format. Brain maps were produced from 10 brain recordings. The clinical trial used a working memory test as its independent measure of achievement. The main conclusion of this study is that the technique appears to be clinically promising, and approaches to cognitive performance from a metabolic viewpoint should be explored further. However, it is important to note that, to our knowledge, this is the first controlled clinical study on the matter, and it is still early for an ultimate evaluation of the technique.