988 results for UNCONSTRAINED TESTING ENVIRONMENT


Relevance: 100.00%

Abstract:

The emergence of an all-composite passenger airframe marks a major advance in the development of aerostructures. Underpinning this milestone is over four decades of intensive research in this area. Nonetheless, the first generation of composite aerostructures is very conservative. This paper discusses the need to develop a virtual testing capability that would enable better exploitation of the material's full potential in future designs. Recent progress by the author in this area is presented, followed by a discussion of current limitations and opportunities for further research.

Relevance: 100.00%

Abstract:

Software environment emulation provides a means of simulating the operational environment of a system. This involves approximating the external behaviors of other systems and their communications with the system to be tested in the environment. Developing such an environment is a tedious task that involves complex low-level coding. Model-driven engineering is an avenue for raising the level of abstraction beyond programming by specifying the solution directly in terms of problem-domain concepts. In this paper we propose a novel domain-specific modeling tool for generating complex testing environments. Our tool employs a suite of domain-specific visual modeling languages for modeling emulation environments at a high level of abstraction. These high-level specifications are then automatically transformed into a runtime environment for application integration testing, boosting development productivity and ease of use.
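
To illustrate the model-to-runtime transformation idea in miniature, the sketch below turns a small declarative endpoint model into a running TCP stub. The model format, the ENDPOINT_MODEL contents and the make_handler helper are hypothetical illustrations and are not drawn from the tool described in this abstract.

```python
import socketserver

# A hypothetical, declarative "model" of an emulated endpoint: each request
# keyword is mapped to the canned reply the emulated system should return.
ENDPOINT_MODEL = {
    "name": "payment-service-stub",
    "port": 9999,
    "responses": {
        b"CHECK_BALANCE": b"BALANCE 100.00\n",
        b"DEBIT":         b"OK\n",
    },
    "default": b"ERROR unknown request\n",
}

def make_handler(model):
    """Transform the declarative model into a concrete request handler class."""
    class GeneratedHandler(socketserver.StreamRequestHandler):
        def handle(self):
            request = self.rfile.readline().strip()
            reply = model["responses"].get(request, model["default"])
            self.wfile.write(reply)
    return GeneratedHandler

if __name__ == "__main__":
    # The generated runtime artefact: a stub server derived from the model.
    model = ENDPOINT_MODEL
    with socketserver.TCPServer(("localhost", model["port"]), make_handler(model)) as srv:
        print(f"{model['name']} listening on port {model['port']}")
        srv.serve_forever()
```

A real model-driven tool would of course generate far richer behaviour (protocols, state, timing) from its visual models; the point here is only the separation between a declarative specification and the runtime artefact generated from it.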

Relevance: 100.00%

Abstract:

Software integration testing plays an increasingly important role as the software industry has shifted from isolated applications to highly distributed computing environments. Conducting integration testing is challenging because it is often very difficult to replicate a real enterprise environment. Emulating the testing environment is one of the key solutions to this problem. However, existing specification-based emulation techniques require manual coding of their message-processing engines and therefore incur high development cost. In this paper, we present a suite of domain-specific visual modeling languages for describing emulated testing environments at a high level of abstraction. Our solution allows domain experts to model a testing environment from abstract interface layers. These layer models are then transformed into a runtime environment for application testing. Our user study shows that our visual languages are easy to use, yet have sufficient expressive power to model complex testing applications.

Relevance: 100.00%

Abstract:

Bound-constrained minimization is a subject of active research. To assess the performance of existing solvers, numerical evaluations and comparisons are carried out. The present work highlights arbitrary decisions that may have a crucial effect on the conclusions of such numerical experiments. As a result, a detailed evaluation based on performance profiles is applied to the comparison of bound-constrained minimization solvers. Extensive numerical results are presented and analyzed.
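
Performance profiles can be computed from a matrix of per-problem solver costs. The sketch below assumes the standard formulation (ratio to the best solver on each problem, then the cumulative fraction of problems solved within a factor tau, in the sense popularised by Dolan and Moré); it is not the evaluation code of this paper, and the solver timings are invented.

```python
import numpy as np

def performance_profiles(times, taus):
    """times[p, s]: cost of solver s on problem p (np.inf marks a failure).
    Returns rho[s, k]: fraction of problems on which solver s is within a
    factor taus[k] of the best solver for that problem."""
    best = times.min(axis=1, keepdims=True)   # best cost on each problem
    ratios = times / best                     # performance ratios r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                     for s in range(times.shape[1])])

# Invented results: 4 problems x 3 solvers, one failure marked with inf.
times = np.array([[1.0, 2.0, 1.5],
                  [3.0, 3.0, np.inf],
                  [0.5, 1.0, 0.6],
                  [2.0, 8.0, 2.5]])
taus = np.array([1.0, 2.0, 4.0, 8.0])
for s, profile in enumerate(performance_profiles(times, taus)):
    print(f"solver {s}: rho(tau) = {profile}")
```

Plotting rho against tau for each solver yields the familiar performance-profile curves used to compare solvers; choices such as how failures and time limits are scored directly shape these curves.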

Relevance: 90.00%

Abstract:

The continuously growing number of mobile phone users and the development of the Internet into a general source of information and entertainment have created the need for a service that connects a mobile workstation to computer networks. GPRS is a new technology that offers a faster, more efficient and more economical connection to packet data networks, such as the Internet and intranets, than existing mobile networks (e.g. NMT and GSM). The goal of this thesis was to implement, for a workstation environment, the communication drivers needed in testing the GPRS Packet Control Unit (PCU). Real mobile networks are too expensive and do not provide sufficient log output to be used for GPRS testing in the early stages of software development. For this reason, the PCU software is tested in a more flexible and more easily controlled environment that does not impose hard real-time requirements. The new operating environment and connection media required a new implementation of the communication drivers, the parts of the software that handle the connections between the PCU and the other units of the GPRS network. The result of this work was the workstation versions of the required communication drivers. The thesis examines different data transfer methods and protocols from the point of view of the requirements of the software under test, the implemented drivers and the testing. For each driver, the interface it implements and the degree of implementation, i.e. which functions were implemented and which were left out, are presented. The structure and operation of the drivers are described to the extent that is relevant to the operation of the software.

Relevance: 80.00%

Abstract:

The purpose of this paper is to document and explain the allocation of takeover purchase price to identifiable intangible assets (IIAs), purchased goodwill, and/or target net tangible assets in an accounting environment unconstrained with respect to IIA accounting policy choice. Using a sample of Australian acquisitions during the unconstrained accounting environment from 1988 to 2004, we find that the percentage allocation of purchase price to IIAs averaged 19.09%. The percentage allocation to IIAs is significantly positively related to return on assets and insignificantly related to leverage, contrary to opportunism. Efficiency suggests an explanation: profitable firms acquire and capitalise a higher percentage of IIAs in acquisitions. The target's investment opportunity set is significantly positively related to the percentage allocation to IIAs, consistent with information-signalling. The paper contributes to the accounting policy choice literature by showing how Australian firms make the one-off accounting policy choice regarding the allocation of takeover purchase price (often a substantial dollar amount) in an environment where accounting for IIAs was unconstrained.

Relevance: 80.00%

Abstract:

This paper proposes applying virtual prototyping technology to the simulation study of industrial robots, and the research and development of a virtual prototyping system for industrial robots. It first introduces virtual prototyping technology and analyses its key techniques, and then describes the composition of the industrial robot virtual prototyping system and the research topics the system addresses in robot simulation studies. In summary, the technology mainly solves two problems: system integration in robot simulation research, and the provision of an underlying digital environment for the virtual design and verification of robot-centred production lines.

Relevance: 80.00%

Abstract:

Shape memory NiTi alloys have been used extensively for medical device applications such as orthopedic, dental, vascular and cardiovascular devices on account of their unique shape memory effect (SME) and super-elasticity (SE). Laser welding is found to be the most suitable method for fabricating NiTi-based medical components. However, the performance of laser-welded NiTi alloys in corrosive environments is not fully understood, and a specific focus on understanding their corrosion fatigue behaviour is not evident in the literature. This study compares the corrosion fatigue behaviour of laser-welded and bare NiTi alloys using a bending rotation fatigue (BRF) test integrated with a specifically designed corrosion cell. The testing environment was Hanks' solution (simulated body fluid) at 37.5 °C. Electrochemical impedance spectroscopy (EIS) measurements were carried out to monitor the change in corrosion resistance at different stages of the BRF test. The experiments indicate that the laser-welded NiTi alloy is more susceptible to corrosion fatigue attack than the bare NiTi alloy. This finding can serve as a benchmark for product designers and engineers in determining the factor of safety of NiTi medical devices fabricated by laser welding.

Relevance: 80.00%

Abstract:

The development of a virtual testing environment, as a cost-effective industrial design tool for the design and analysis of composite structures, requires the ability to create models efficiently and to accelerate the analysis by reducing the number of degrees of freedom, while still accurately tracking the evolution of a debond, delamination or crack front. The eventual aim is to simulate both damage initiation and propagation in components with realistic geometrical features, where crack propagation paths are not trivial. Meshless approaches, such as the Element-Free Galerkin (EFG) method, are particularly suitable for problems involving changes in topology and have been successfully applied to simulate damage in homogeneous materials and concrete. In this work, the method is utilized to model the initiation and mixed-mode propagation of cracks in composite laminates, and to simulate experimentally observed crack migration which is difficult to model using standard finite element analysis.
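
As a flavour of the meshless approximation underlying EFG, the following is a textbook-style sketch of one-dimensional moving-least-squares shape functions with a linear basis and cubic-spline weights. It is not the composite damage model of this work, and the node spacing, support size and evaluation point are arbitrary choices made for illustration.

```python
import numpy as np

def cubic_spline_weight(r):
    """Cubic spline weight as a function of normalized distance r = |x - x_I| / d_I."""
    w = np.zeros_like(r)
    m1 = r <= 0.5
    m2 = (r > 0.5) & (r <= 1.0)
    w[m1] = 2.0/3.0 - 4.0*r[m1]**2 + 4.0*r[m1]**3
    w[m2] = 4.0/3.0 - 4.0*r[m2] + 4.0*r[m2]**2 - (4.0/3.0)*r[m2]**3
    return w

def mls_shape_functions(x, nodes, support):
    """Moving-least-squares shape functions phi_I(x) with linear basis p = [1, x]."""
    w = cubic_spline_weight(np.abs(x - nodes) / support)   # weights w_I(x)
    P = np.column_stack([np.ones_like(nodes), nodes])      # p(x_I) for each node
    A = (P * w[:, None]).T @ P                             # moment matrix A(x)
    p_x = np.array([1.0, x])
    return (P * w[:, None]) @ np.linalg.solve(A, p_x)      # phi_I(x)

nodes = np.linspace(0.0, 1.0, 11)   # regularly spaced meshless nodes
phi = mls_shape_functions(0.37, nodes, support=0.25)
print("sum of shape functions (should be ~1):", phi.sum())
```

Summing the shape functions checks the partition-of-unity property; in an EFG analysis, approximations of this kind replace the element shape functions of a standard finite element mesh, which is what makes changing topologies easier to handle.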

Relevance: 80.00%

Abstract:

This study examined the effectiveness of motor-encoding activities on the memory and performance of students in a Grade One reading program. There were two experiments in the study. Experiment 1 replicated a study by Eli Saltz and David Dixon (1982). The effect of motoric enactment (i.e., pretend play) of sentences on memory for the sentences was investigated. Forty Grade One students performed a "memory-for-sentences" technique devised by Saltz and Dixon. Only the experimental group used motoric enactment of the sentences. Although quantitative findings revealed no significant difference between the mean scores of the experimental group and the control group, aspects of the experimental design could have affected the results. It was suggested that Saltz and Dixon's study could be replicated again, with more attention given to variables such as population size, the nature of the test sentences, subjects' previous educational experience and conditions related to the testing environment. The second experiment was an application of Saltz and Dixon's theory that motoric imagery should facilitate memory for sentences. The intent was to apply this theory to Grade One students' ability to remember words from their reading program. An experimental gym program was developed using kinesthetic activities to reinforce the skills of the classroom reading program. The same subject group was used in Experiment 2. It was hypothesized that the subjects who experienced the experimental gym program would show greater progress in reading ability, as evidenced by their scores on Form G of the Woodcock Reading Mastery Test--Revised (WRM--R). The data from the WRM--R were analyzed with a 3-way split-plot analysis of variance in which group (experimental vs. control) and sex were the between-subjects variables and test time (pre-test vs. post-test) was the within-subjects variable. Findings revealed the following: (a) both groups made substantial gains over time on the visual-auditory learning sub-test, and the three-way interaction of group × sex × time was also significant; (b) children in the experimental and control groups performed similarly on both the pre- and post-test of the letter identification test; (c) time was the only significant effect on subjects' performance on the word identification task; (d) word attack scores showed marked improvement over time for both the experimental and control groups; (e) passage comprehension scores indicated an improvement in performance for both groups over time. As in Experiment 1, it is suggested that several modifications to the experimental design could produce significant results. These factors are addressed with suggestions for further research in the area of active learning, more specifically the effect of motor-encoding activities on the memory and academic performance of children.
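
A design like the one above (group and sex between subjects, time within subjects) can also be analysed as a linear mixed model with a random intercept per subject. The sketch below uses simulated data and standard statsmodels calls purely to illustrate that general approach; it is not the study's split-plot ANOVA, and all variable names, sample sizes and effect sizes are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects = 40

# Simulated data in the shape of the described design: group and sex vary
# between subjects, time (pre-test vs. post-test) varies within subjects.
rows = []
for s in range(n_subjects):
    group = rng.choice(["experimental", "control"])
    sex = rng.choice(["F", "M"])
    for time in ["pre", "post"]:
        score = rng.normal(100, 10) + (5 if time == "post" else 0)
        rows.append({"subject": s, "group": group, "sex": sex,
                     "time": time, "score": score})
long = pd.DataFrame(rows)

# A random intercept per subject accounts for the repeated measurements.
model = smf.mixedlm("score ~ group * sex * time", long, groups=long["subject"])
print(model.fit().summary())
```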

Relevance: 80.00%

Abstract:

This article describes the proposed development of an emulator application for digital TV that operates on a local computer network, so that it can serve as a testing environment for students and researchers. The emulator application is presented as a useful tool for reducing the cost of validating applications developed for digital TV.

Relevance: 80.00%

Abstract:

Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data have increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third tool is an efficient algorithm for data segmentation implemented as part of Insight.

Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining data from different sources at runtime, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, of which two examples are presented: the use of Insight as a WMS (web map service) server, and the automatic production of image sequences for the visualization of cyclone simulations.

The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction in the size of the considered data. This enables practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and results of the segmentation of upper-tropospheric jet streams and of cyclones as full 3D objects.

Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, aimed primarily at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
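
In a much simplified form, the kind of feature segmentation and tracking described here can be sketched by thresholding a 3D field, labelling connected regions, and linking features across time steps by voxel overlap. The example below is a generic illustration using scipy, not the thesis's algorithm or its handling of under- and over-segmentation; the synthetic field, grid size and threshold are arbitrary.

```python
import numpy as np
from scipy import ndimage

def segment(field, threshold):
    """Label connected 3D regions where the field exceeds the threshold."""
    labels, n_features = ndimage.label(field > threshold)
    return labels, n_features

def link_by_overlap(labels_t0, labels_t1):
    """Associate features of consecutive time steps via voxel overlap."""
    links = {}
    for lab in range(1, labels_t0.max() + 1):
        overlap = labels_t1[labels_t0 == lab]
        successors = np.unique(overlap[overlap > 0])
        links[lab] = successors.tolist()
    return links

# Two synthetic time steps of a smooth 3D scalar field on a regular grid.
rng = np.random.default_rng(1)
field_t0 = ndimage.gaussian_filter(rng.normal(size=(20, 40, 40)), sigma=3)
field_t1 = np.roll(field_t0, shift=2, axis=2)   # features drift along one axis

labels_t0, n0 = segment(field_t0, threshold=0.05)
labels_t1, n1 = segment(field_t1, threshold=0.05)
print(f"{n0} features at t0, {n1} at t1")
print("feature links t0 -> t1:", link_by_overlap(labels_t0, labels_t1))
```

In such an overlap-based linking, a feature with no successor ends (lysis), a new label with no predecessor appears (genesis), and several labels sharing one successor, or one label with several successors, indicate merging and splitting respectively.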

Relevance: 80.00%

Abstract:

The cybernetics revolution of recent years has greatly improved our lives, giving us immediate access to services and a huge amount of information over the Internet. Nowadays the user is increasingly asked to enter sensitive information on the Internet, leaving traces everywhere. But there are categories of people who cannot risk revealing their identities on the Internet. Although originally created to protect U.S. intelligence communications online, Tor is nowadays the most famous low-latency network that guarantees both the anonymity and the privacy of its users. The aim of this thesis project is to understand in depth how the Tor protocol works, not only by studying its theory but also by putting those concepts into practice, with particular attention to security topics. In order to run a private Tor network that emulates the real one, a virtual testing environment was configured. This setup allows experiments to be conducted without putting the anonymity and privacy of real users at risk. We used a Tor patch that stores TLS and circuit keys, which are given as inputs to a Tor dissector for Wireshark in order to obtain decrypted and decoded traffic. Observing clear traffic allowed us to verify the protocol outline and to confirm the format of each cell. Moreover, these tools allowed us to identify a traffic pattern, which was used to conduct a traffic correlation attack to passively deanonymize hidden service clients. The attacker, controlling two nodes of the Tor network, is able to link a request for a given hidden service to the client who made it, thereby deanonymizing the client. The robustness of the traffic pattern and the statistics of the attack, such as the true positive rate and the false positive rate, are left as potential future work.
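
The timing-correlation principle behind such an attack can be sketched independently of Tor: bin the packet timestamps observed at two vantage points and correlate the resulting count series. The example below uses synthetic timestamps and made-up parameters (bin size, jitter, flow sizes); it only illustrates the statistical idea, not the traffic pattern or attack implemented in the thesis.

```python
import numpy as np

def packet_rate_series(timestamps, duration, bin_size=0.5):
    """Bin packet timestamps (seconds) into a per-interval packet-count series."""
    bins = np.arange(0.0, duration + bin_size, bin_size)
    counts, _ = np.histogram(timestamps, bins=bins)
    return counts

def correlation(series_a, series_b):
    """Pearson correlation between two packet-count series of equal length."""
    return np.corrcoef(series_a, series_b)[0, 1]

rng = np.random.default_rng(42)
duration = 60.0

# Synthetic flow seen at the first observation point ...
client_side = np.sort(rng.uniform(0, duration, 400))
# ... the same flow observed downstream with small network jitter ...
same_flow = np.clip(client_side + rng.normal(0.05, 0.02, client_side.size), 0, duration)
# ... and an unrelated flow for comparison.
unrelated_flow = np.sort(rng.uniform(0, duration, 400))

a = packet_rate_series(client_side, duration)
print("correlation with the same flow:     ", correlation(a, packet_rate_series(same_flow, duration)))
print("correlation with an unrelated flow: ", correlation(a, packet_rate_series(unrelated_flow, duration)))
```

A markedly higher correlation for one candidate flow than for the others is the kind of signal a correlation attacker looks for when it observes traffic at two positions along a path.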

Relevance: 80.00%

Abstract:

Inert gas washout tests, performed using the single- or multiple-breath washout technique, were first described over 60 years ago. As measures of ventilation distribution inhomogeneity, they offer complementary information to standard lung function tests, such as spirometry, as well as improved feasibility across wider age ranges and improved sensitivity in the detection of early lung damage. These benefits have led to a resurgence of interest in these techniques from manufacturers, clinicians and researchers, yet detailed guidelines for washout equipment specifications, test performance and analysis are lacking. This manuscript provides recommendations on these aspects, applicable to both the paediatric and the adult testing environment, while outlining the principles that are essential for the reader to understand. These recommendations are evidence-based where possible, but in many places represent expert opinion from a working group with large collective experience in the techniques discussed. Finally, the important issues that remain unanswered are highlighted. By addressing these issues and directing future research, the hope is to facilitate the incorporation of these promising tests into routine clinical practice.

Relevance: 80.00%

Abstract:

Skull-stripping (or brain extraction) is an important pre-processing step in neuroimage analysis. This document describes a skull-stripping filter implemented using the Insight Toolkit (ITK), which we named itk::StripTsImageFilter. It is a composite filter based on existing ITK classes. The filter has been implemented with usability, robustness, speed and versatility in mind, rather than accuracy, which makes it useful for many pre-processing tasks in neuroimage analysis. This paper is accompanied by the source code, input data and a testing environment.