995 results for Integration testing


Relevance:

70.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics.

Relevance:

70.00%

Publisher:

Abstract:

Nowadays, software testing and quality assurance are highly valued in the software development process. Software testing is not a single concrete discipline; it is a process of validation and verification that starts with the idea for a future product and ends only when the product's maintenance ends. Industry places great importance on testing methods and tools that can be applied at the different testing phases. The initial objectives of this thesis were to provide a sufficient literature review of the different testing phases and, for each phase, to identify a method that can be used effectively to improve software quality. The testing phases chosen for study are unit testing, integration testing, functional testing, system testing, acceptance testing, and usability testing. The research showed that many testing methods can be applied at the different phases and that, in most cases, the choice of method should depend on the type of software and its specification. For each phase, the thesis identifies a characteristic problem, then suggests and describes in detail a method that can help eliminate it.
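As a concrete illustration of the first two of these phases, here is a minimal pytest-style sketch (all class and function names are hypothetical, not taken from the thesis): a unit test isolates one component behind a stub, while an integration test exercises the real components together.

# Minimal sketch of unit vs. integration testing (all names hypothetical).

class TaxCalculator:
    def tax(self, amount: float) -> float:
        return round(amount * 0.24, 2)  # flat 24% rate, for illustration only

class InvoiceService:
    def __init__(self, calculator):
        self.calculator = calculator

    def total(self, net: float) -> float:
        return net + self.calculator.tax(net)

def test_invoice_total_unit():
    # Unit test: isolate InvoiceService from the real calculator with a stub.
    class StubCalculator:
        def tax(self, amount):
            return 10.0  # canned response, independent of TaxCalculator's logic
    assert InvoiceService(StubCalculator()).total(100.0) == 110.0

def test_invoice_total_integration():
    # Integration test: exercise both units together through their real interface.
    assert InvoiceService(TaxCalculator()).total(100.0) == 124.0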

Relevance:

70.00%

Publisher:

Abstract:

The ability to integrate sensory inputs deriving from different sensory modalities, but related to the same external event, into a unified percept is called multisensory integration, and it might represent an efficient mechanism of sensory compensation when a sensory modality is impaired by a cortical lesion. This hypothesis is discussed in the present dissertation. Experiment 1 explored the role of the superior colliculus (SC) in multisensory integration by testing patients with collicular lesions, patients with subcortical lesions not involving the SC, and healthy control subjects on a multisensory task. Paralleling the evidence from animal studies, patients with collicular lesions demonstrated a loss of multisensory enhancement, in contrast with control subjects, providing the first lesion evidence in humans of the essential role of the SC in mediating audio-visual integration. Experiment 2 investigated the role of the cortex in mediating multisensory integrative effects by inducing virtual lesions with inhibitory theta-burst stimulation over temporo-parietal, occipital, and posterior parietal cortex, demonstrating that only the temporo-parietal cortex was causally involved in modulating the integration of audio-visual stimuli at the same spatial location. Given the involvement of the retino-colliculo-extrastriate pathway in mediating audio-visual integration, the functional sparing of this circuit in hemianopic patients is extremely relevant for a multisensory-based approach to the recovery of unisensory defects. Experiment 3 demonstrated the spared functional activity of this circuit in a group of hemianopic patients, revealing implicit recognition of the fearful content of unseen visual stimuli (i.e., affective blindsight), an ability mediated by the retino-colliculo-extrastriate pathway and its connections with the amygdala. Finally, Experiment 4 provided evidence that systematic audio-visual stimulation is effective in inducing long-lasting clinical improvements in patients with visual field defects, and revealed that the activity of the spared retino-colliculo-extrastriate pathway is responsible for the observed clinical amelioration, as suggested by the greater improvement, in tasks highly demanding in terms of spatial orienting, observed in patients with cortical lesions limited to the occipital cortex compared to patients with lesions extending to other cortical areas. Overall, the present results indicate that multisensory integration is mediated by the retino-colliculo-extrastriate pathway and that systematic audio-visual stimulation, by activating this spared neural circuit, can affect orientation towards the blind field in hemianopic patients and might therefore constitute an effective and innovative approach to the rehabilitation of unisensory visual impairments.

Relevance:

60.00%

Publisher:

Abstract:

This Master's thesis examines the implementation of the Chain 2000 project, under way in UPM-Kymmene's paper divisions, during the integration of a new market into the SAP ERP system, from the perspective of a single paper mill. Because integrating all areas of the mill system is an extremely demanding and long-lasting process, this work concentrates on the integration of one market. The goal is to design, implement, and test the system changes required for this integration and to describe the tools, operating models, and processes used in it. The problems this demanding project brought with it, and their solutions, are also described from the viewpoint of one paper mill. The thesis presents various means and tools for managing and carrying out IT projects in particular, and reviews software testing, ERP systems, and data warehouses. It shows how challenging it is to carry out a global IT project. The analysis finds that the use of standard tools causes problems in special situations and that incorrect information brings the company extra costs. As the project proceeds, the emphasis of implementing the operations, and with it the responsibility for data correctness, shifts steadily towards the mill. Integration testing and the required changes were handled commendably during the thesis work, but full certainty that all systems participating in the integration work correctly will only be obtained at the actual go-live in summer 2004. Maintenance after the go-live will also require resources.

Relevance:

60.00%

Publisher:

Abstract:

The importance of software testing has grown as software products increasingly affect our everyday lives, so the link between companies' investments and quality assurance is evident. Organizations invest more and more in non-functional testing, such as security, performance, and usability testing. The purpose of this thesis is to examine the current state of software testing in Finland, with the aim of renewing and improving the software testing course offering at the University of Turku to best match the needs of companies. The thesis was carried out as a replication study. The main part of the survey contains questions on software testing methods and tools used during the activities of the testing process; in addition, there are general questions about the companies and their software testing environments. The survey also addresses the various testing levels and types used by the companies and the challenges encountered in testing. The thesis is grounded in testing process standards. Software testing standards play a central role in this work, even though they have recently been the target of strong criticism; doubts about their necessity have arisen from changes in software development. The thesis presents the results on software testing practices and compares them with the results of an earlier related survey (Lee, Kang, & Lee, 2011). Lack of time is found to be a major challenge in software testing. Agile software development has gained popularity in all of the respondents' companies. Methods and tools for test estimation, planning, and reporting are in very limited use, whereas the use of methods and tools for automated test execution and defect management has increased. System, acceptance, unit, and integration testing are used in all of the companies the respondents represent, and all respondents consider regression, exploratory, and non-functional testing to be important techniques.

Relevance:

60.00%

Publisher:

Abstract:

Aspect-oriented programming (AOP) is a promising technology that supports the separation of crosscutting concerns (i.e., functionality that tends to be tangled with, and scattered through, the rest of the system). In AOP, a method-like construct named advice is applied at join points in the system through a special construct named a pointcut. This mechanism supports the modularization of crosscutting behavior; however, since the added interactions are not explicit in the source code, it is hard to ensure their correctness. To tackle this problem, this paper presents a rigorous coverage analysis approach to ensure that the logic of each advice (statements, branches, and def-use pairs) is exercised at each affected join point. To make this analysis possible, a structural model based on Java bytecode, called the PointCut-based Def-Use Graph (PCDU), is proposed, along with three integration testing criteria. Theoretical, empirical, and exploratory studies involving 12 aspect-oriented programs and several fault examples provide evidence of the feasibility and effectiveness of the proposed approach. (C) 2010 Elsevier Inc. All rights reserved.
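The paper works with AspectJ-style Java constructs; purely as an analogy, the advice/pointcut mechanism can be sketched in Python with a decorator weaving "advice" around the methods a name-based "pointcut" selects (all names are hypothetical, and this is not the paper's tooling):

# Decorator-based analogy of AOP: "advice" woven around "join points"
# (method executions) selected by a "pointcut" (a name predicate).
import functools

def weave(cls, pointcut, advice):
    """Wrap every method of cls whose name satisfies the pointcut with advice."""
    for name, attr in list(vars(cls).items()):
        if callable(attr) and pointcut(name):
            setattr(cls, name, advice(attr))
    return cls

def logging_advice(method):
    @functools.wraps(method)
    def wrapper(*args, **kwargs):             # "around" advice body
        print(f"entering {method.__name__}")  # crosscutting concern: logging
        result = method(*args, **kwargs)
        print(f"leaving {method.__name__}")
        return result
    return wrapper

class Account:
    def deposit(self, amount): ...
    def withdraw(self, amount): ...
    def balance(self): ...

# Pointcut: all methods whose names look like transactions.
weave(Account, lambda n: n in ("deposit", "withdraw"), logging_advice)

In the paper's terms, the coverage criteria would then require test cases that exercise the statements, branches, and def-use pairs of the advice body (the wrapper above) at each woven method.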

Relevance:

40.00%

Publisher:

Abstract:

Currently, a high penetration level of Distributed Generation (DG) is observed in Danish distribution systems, and even more DG is foreseen in the upcoming years. How to utilize it to maintain the security of the power supply in emergency situations has been of great interest for study. This master project develops a control architecture for the study of distribution systems with large-scale integration of solar power. As part of the EcoGrid EU Smart Grid project, it focuses on the modelling and simulation of a representative Danish LV network located on the island of Bornholm. Within this control architecture, two types of reactive power control techniques are implemented and compared; in addition, network voltage control based on a tap-changing transformer is tested. After applying a genetic algorithm to five typical Danish domestic load profiles, the optimized results show lower power losses and voltage deviation using Q(U) control, especially with large consumption. Finally, a communication and information exchange system is developed with the objective of regulating the reactive power, and thereby the network voltage, remotely and in real time. Validation tests of the simulated parameters are performed as well.
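In essence, Q(U) control is a droop curve mapping the locally measured voltage to a reactive power set-point; a minimal sketch under assumed parameter values (the dead band, slope, and limits here are illustrative, not the project's actual settings):

# Minimal Q(U) droop sketch: reactive power set-point as a function of
# per-unit voltage (dead band and limits are illustrative assumptions).
def q_setpoint(u_pu: float, q_max: float = 0.44,
               dead_band: float = 0.02, u_limit: float = 0.08) -> float:
    """Return reactive power in per unit: absorb above 1+db, inject below 1-db."""
    dev = u_pu - 1.0
    if abs(dev) <= dead_band:
        return 0.0                      # inside the dead band: no compensation
    span = u_limit - dead_band
    excess = min(abs(dev) - dead_band, span)
    q = q_max * excess / span           # linear droop up to the limit
    return -q if dev > 0 else q         # overvoltage -> absorb (negative Q)

for u in (0.95, 0.99, 1.00, 1.03, 1.08):
    print(f"U = {u:.2f} p.u. -> Q = {q_setpoint(u):+.3f} p.u.")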

Relevance:

40.00%

Publisher:

Abstract:

Adenoviral vectors are currently the most widely used gene therapy vectors, but their inability to integrate into host chromosomal DNA shortens their transgene expression and has limited their use in clinical trials. In this project, we initially planned to develop a technique to test the effect of the early region 1 (E1) on adenovirus integration by comparing the integration efficiencies of an E1-deleted adenoviral vector (SubE1) and an E1-containing vector (SubE3). However, we did not harvest any SubE3 virus, even though we repeated the transfection and successfully rescued the SubE1 virus (2/4 transfections generated viruses) and the positive control virus (6/6). The failure to rescue SubE3 could be caused by the instability of the genomic plasmid pFG173, as it showed frequent internal deletions while we were purifying it. Therefore, we developed techniques to test the effect of E1 on homologous recombination (HR), since the literature suggests that adenovirus integration is initiated by HR. We attempted to silence E1 in 293 cells by transfecting E1A/B-specific small interfering RNA (siRNA). However, no silenced phenotype was observed, even when we varied the concentration of E1A/B siRNA (from 30 nM to 270 nM) and checked the silencing effects at different time points (48, 72, 96 h). One possible explanation is that the E1A/B siRNA sequences are not potent enough to induce the silenced phenotype. For evaluating HR efficiencies, an HR assay system based on bacterial transformation was designed. We constructed two plasmids (designated pUC19-dl1 and pUC19-dl2) containing different defective lacZα cassettes (forming white colonies after transformation) that can generate a functional lacZα cassette (forming blue colonies) through HR after transfection into 293 cells. The HR efficiencies would be expressed as the percentage of blue colonies among all colonies. Unfortunately, after transformation of plasmid isolated from 293 cells, no colony was found, even at a transformation efficiency of 1.8x10^ colonies/µg pUC19, suggesting that the sensitivity of this system was low. To enhance the sensitivity, PCR was used. We designed a set of primers that can only amplify the recombinant plasmid formed through HR. Therefore, the HR efficiencies among different treatments can be evaluated from the amplification results, and this system could be used to test the effect of the E1 region on adenovirus integration. In addition, to our knowledge there are no previous studies using PCR/real-time PCR to evaluate HR efficiency, so this system also provides a PCR-based method for carrying out HR assays.
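For the blue/white readout described above, the efficiency computation itself is straightforward; a minimal sketch with hypothetical colony counts:

# HR efficiency from blue/white screening (counts are hypothetical).
def hr_efficiency(blue: int, white: int) -> float:
    """Blue colonies carry the functional lacZ-alpha cassette restored by HR."""
    total = blue + white
    return 100.0 * blue / total if total else 0.0

print(f"{hr_efficiency(blue=12, white=4988):.2f}% recombinants")  # -> 0.24%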

Relevance:

40.00%

Publisher:

Abstract:

This paper examines the linkage between two parallel stock exchanges trading the same shares in Colombia, namely the Bogotá Stock Exchange and the Medellín Stock Exchange. We provide empirical evidence to support the hypothesis that these two markets can be best described as fully integrated over a period of almost four decades, which is consistent with the view that arbitrage opportunities are only possible in the short but not in the long run. In addition, we find evide
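A long-run integration claim of this kind is typically backed by a cointegration test; as an illustration only (simulated series standing in for the two exchanges' prices, not the paper's data), an Engle-Granger test with statsmodels:

# Engle-Granger cointegration test on two simulated log-price series that
# share a common stochastic trend (stand-ins for the two exchanges' prices).
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
n = 2000
common = np.cumsum(rng.normal(size=n))                 # shared random-walk trend
p_bogota = common + rng.normal(scale=0.5, size=n)      # short-run deviations
p_medellin = 0.98 * common + rng.normal(scale=0.5, size=n)

t_stat, p_value, _ = coint(p_bogota, p_medellin)
# A small p-value rejects "no cointegration": deviations between the two
# markets are transient, i.e. arbitrage is possible only in the short run.
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")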

Relevance:

40.00%

Publisher:

Abstract:

The negative dimensional integration method (NDIM) is a technique for dealing with D-dimensional Feynman loop integrals. Since most physical quantities in perturbative Quantum Field Theory (pQFT) require solving such integrals, the quicker and easier the evaluation method, the better. NDIM is a novel and promising technique, and this very fact requires that we put it to the test in different contexts and situations and compare the results it yields with those already known from other well-established methods. It is in this perspective that we consider here the calculation of an on-shell two-loop three-point function in a massless theory. Surprisingly, this approach provides twelve non-trivial results in terms of double power series. More astonishing still, we can show these twelve solutions to be different representations of the same well-known single result obtained via other methods: the solution for the particular integral we are dealing with is twelvefold degenerate.
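For orientation, the seed identity on which NDIM is usually built in the literature (background, not this paper's specific two-loop computation) follows from expanding the Gaussian integral in D = 2ω dimensions:

\int d^{2\omega} q \, e^{-\alpha q^{2}} = \pi^{\omega} \alpha^{-\omega}
\quad\Longrightarrow\quad
\sum_{n \ge 0} \frac{(-\alpha)^{n}}{n!} \int d^{2\omega} q \, (q^{2})^{n} = \pi^{\omega} \alpha^{-\omega}
\quad\Longrightarrow\quad
\int d^{2\omega} q \, (q^{2})^{n} = (-1)^{n} \, n! \, \pi^{\omega} \, \delta_{n+\omega,0}.

The Kronecker delta pairs a polynomial integrand (positive n) with a negative dimension D = 2ω = −2n; the finite sums evaluated in that regime are then analytically continued back to positive D, which is how multiple power-series representations of a single integral can arise.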

Relevance:

40.00%

Publisher:

Abstract:

The current study investigated the cognitive workload of sentence and clause wrap-up in younger and older readers. A large number of studies have demonstrated the presence of wrap-up effects: peaks in processing time at clause and sentence boundaries that some argue reflect attention to organizational and integrative semantic processes. However, the exact nature of these wrap-up effects is still not entirely clear, with some arguing that wrap-up is not related to processing difficulty but rather is triggered by a low-level oculomotor response or the implicit monitoring of intonational contour. The notion that wrap-up effects are resource-demanding was directly tested by examining the degree to which sentence and clause wrap-up affects the parafoveal preview benefit. Older and younger adults read passages in which a target word N occurred in a sentence-internal, clause-final, or sentence-final position. A gaze-contingent boundary change paradigm was used in which, on some trials, a non-word preview of word N+1 was replaced by a target word once the eyes crossed an invisible boundary located between words N and N+1. All measures of reading time on word N were longer at clause and sentence boundaries than in the sentence-internal position. In the earliest measures of reading time, sentence and clause wrap-up reduced the magnitude of the preview benefit similarly for younger and older adults. However, this effect was moderated by age in gaze duration, such that older adults showed a complete reduction of the preview benefit in the sentence-final condition. Additionally, sentence and clause wrap-up were negatively associated with the preview benefit. Collectively, these findings suggest that wrap-up is cognitively demanding and may become less efficient with age, thus resulting in a reduction of the parafoveal preview benefit during normal reading.

Relevance:

30.00%

Publisher:

Abstract:

Wastewater control at bulk liquid chemical storage terminals is very difficult because the variety of products handled in these facilities generates effluents of variable composition. The main objective of this work was to verify whether the Vibrio fischeri acute toxicity test could be routinely included in the wastewater management of these facilities, alongside physical and chemical analyses, in order to evaluate and improve the quality of the generated effluents. The study was performed in two phases, before and after the implementation of better operational practices and treatment technologies. Chemical oxygen demand (COD) and toxicity of treated effluents did not correlate, showing that effluents with low COD may still contain toxic substances and non-biodegradable organic matter that will not be degraded when discharged into the aquatic environment. Segregation of influents, or pre-treatment based on toxicity results and the biodegradability index, was implemented in the facilities, generating significant improvements in the quality of final effluents, with reductions in biochemical oxygen demand (BOD) and toxicity. The integration of physical and chemical analyses with the V. fischeri toxicity test turned out to be an excellent tool for wastewater management in chemical terminals, allowing rapid decision making for pollution control and prevention measures. Reuse of rain water was also proposed and, when implemented by the facilities, resulted in economic and environmental benefits. (C) 2010 Elsevier B.V. All rights reserved.
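The biodegradability index mentioned above is commonly taken as the BOD5/COD ratio; a minimal sketch of a segregation rule based on it (the 0.3 threshold is a common rule of thumb and the numbers are illustrative, not the terminals' actual criteria):

# Effluent screening by biodegradability index (BOD5/COD); the threshold is
# a common rule of thumb, not the terminals' actual acceptance criterion.
def biodegradability_index(bod5: float, cod: float) -> float:
    return bod5 / cod

def route(bod5: float, cod: float, toxic: bool) -> str:
    bi = biodegradability_index(bod5, cod)
    if toxic or bi < 0.3:          # toxic or poorly biodegradable stream
        return "segregate / pre-treat"
    return "send to biological treatment"

print(route(bod5=120.0, cod=800.0, toxic=False))  # BI = 0.15 -> segregate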

Relevance:

30.00%

Publisher:

Abstract:

Graphical user interfaces (GUIs) make software easy to use by providing the user with visual controls. Correctness of the GUI code is therefore essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their more important aspects. This paper presents a generic model for language-independent reverse engineering of graphical-user-interface-based applications, and we explore the integration of model-based testing techniques into our approach, allowing us to perform fault detection. A prototype tool has been constructed which is already capable of deriving and testing a user interface behavioral model of applications written in Java/Swing.
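The combination of a reverse-engineered behavioral model with model-based testing can be pictured as enumerating event paths through a window/event state machine and replaying them against the running application; a minimal sketch in Python (the prototype itself targets Java/Swing, and all names here are hypothetical):

# Sketch of model-based GUI testing: walk a reverse-engineered state machine
# and check that the application under test follows the same transitions.

model = {  # state -> {event: next_state}
    "LoginWindow": {"submit_ok": "MainWindow", "cancel": "Closed"},
    "MainWindow": {"logout": "LoginWindow", "quit": "Closed"},
}

def event_paths(state, depth):
    """Enumerate bounded-length event sequences from the model."""
    if depth == 0 or state not in model:
        yield []
        return
    for event, nxt in model[state].items():
        for rest in event_paths(nxt, depth - 1):
            yield [(state, event, nxt)] + rest

class ModelFollowingApp:
    # Trivial stand-in for the driver of the real GUI (hypothetical).
    def reset(self):
        self.state = "LoginWindow"
    def fire(self, event):
        self.state = model[self.state][event]
        return self.state

def run_tests(app, start="LoginWindow", depth=2):
    for path in event_paths(start, depth):
        app.reset()
        for state, event, expected in path:
            actual = app.fire(event)   # drive the application under test
            assert actual == expected, f"fault: {state} --{event}--> {actual}"

run_tests(ModelFollowingApp())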

Relevance:

30.00%

Publisher:

Abstract:

This is a study of a state-of-the-art implementation of a new computer integrated testing (CIT) facility within a company that designs and manufactures transport refrigeration systems. The aim was to use state-of-the-art hardware, software, and planning procedures in the design and implementation of three CIT systems. Typical CIT system components include data acquisition (DAQ) equipment, application and analysis software, communication devices, computer-based instrumentation, and computer technology. It is shown that the introduction of computer technology into the area of testing can have a major effect on issues such as efficiency, flexibility, data accuracy, test quality, and data integrity. The findings reaffirm how computer integration continues to benefit any organisation; with more recent advances in computer technology, communication methods, and software capabilities, less expensive and more sophisticated test solutions are now possible, allowing more organisations to benefit from the many advantages associated with CIT. Examples of computer-integrated test set-ups and their associated benefits are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. 
In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach proved highly effective in decreasing the number of iterations required to draw independent samples from the Bayesian posterior distribution.
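The gradual-deformation proposal described above can be sketched compactly: under a Gaussian prior, combining the current model with an independent realization through a rotation yields a proposal that still honors the prior, with the angle setting the perturbation strength (a minimal sketch under these assumptions, not the thesis code):

# Gradual-deformation proposal for MCMC (minimal sketch, Gaussian prior).
# A small angle theta yields a gentle perturbation of the current model;
# theta = pi/2 replaces it entirely with an independent realization.
import numpy as np

def gradual_deformation(m_current, m_independent, theta):
    """Combine two standard-Gaussian fields; cos^2 + sin^2 = 1 preserves the prior."""
    return np.cos(theta) * m_current + np.sin(theta) * m_independent

rng = np.random.default_rng(42)
m = rng.normal(size=(64, 64))            # current model realization
proposal = gradual_deformation(m, rng.normal(size=(64, 64)), theta=0.1)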