834 results for test-process features


Relevance:

80.00%

Publisher:

Abstract:

The purpose of this paper is to conceptualise and operationalise the concept of supply chain management sustainability practices. Based on a multi-stage procedure involving a literature review, an expert Q-sort and pre-test process, a pilot test, and a survey of 156 supply chain directors and managers in Ireland, we develop a multidimensional conceptualisation and measure of social and environmental supply chain management sustainability practices. The research findings show theoretically sound constructs based on four underlying sustainable supply chain management practices: monitoring, implementing systems, new product and process development, and strategy redefinition. A two-factor model, comprising process-based and market-based practices, is then identified as the most reliable.

Relevance:

80.00%

Publisher:

Abstract:

Four experiments consider some of the circumstances under which children follow two different rule pairs when sorting cards. Previous research has repeatedly found that 3-year-olds encounter substantial difficulties implementing the second of two conflicting rule sets, despite their knowledge of these rules. One interpretation of this phenomenon [Cognitive Complexity and Control (CCC) theory] is that 3-year-olds have problems establishing an appropriate hierarchical ordering for rules. The present data suggest an alternative account of children's card sorting behaviour, according to which the cognitive salience of test card features may be more important than inflexibility with respect to rule representation.

Relevance:

80.00%

Publisher:

Abstract:

Highly heterogeneous mountain snow distributions strongly affect soil moisture patterns; local ecology; and, ultimately, the timing, magnitude, and chemistry of stream runoff. Capturing these vital heterogeneities in a physically based distributed snow model requires appropriately scaled model structures. This work looks at how model scale—particularly the resolutions at which the forcing processes are represented—affects simulated snow distributions and melt. The research area is in the Reynolds Creek Experimental Watershed in southwestern Idaho. In this region, where there is a negative correlation between snow accumulation and melt rates, overall scale degradation pushed simulated melt to earlier in the season. The processes mainly responsible for snow distribution heterogeneity in this region—wind speed, wind-affected snow accumulations, thermal radiation, and solar radiation—were also independently rescaled to test process-specific spatiotemporal sensitivities. It was found that in order to accurately simulate snowmelt in this catchment, the snow cover needed to be resolved to 100 m. Wind and wind-affected precipitation—the primary influence on snow distribution—required similar resolution. Thermal radiation scaled with the vegetation structure (~100 m), while solar radiation was adequately modeled with 100–250-m resolution. Spatiotemporal sensitivities to model scale were found that allowed for further reductions in computational costs through the winter months with limited losses in accuracy. It was also shown that these modeling-based scale breaks could be associated with physiographic and vegetation structures to aid a priori modeling decisions.
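The rescaling experiments described above amount to coarsening one forcing field at a time while holding the others at the base resolution, then rerunning the model. A minimal sketch of that coarsening step, assuming gridded NumPy fields (the `coarsen` helper and the synthetic wind field are illustrative, not taken from the study):

```python
import numpy as np

def coarsen(field: np.ndarray, factor: int) -> np.ndarray:
    """Degrade the resolution of a 2-D field by block-averaging.

    E.g. a 50 m grid coarsened with factor=2 yields a 100 m grid.
    """
    ny, nx = field.shape
    ny, nx = ny - ny % factor, nx - nx % factor   # trim to a multiple of factor
    blocks = field[:ny, :nx].reshape(ny // factor, factor, nx // factor, factor)
    return blocks.mean(axis=(1, 3))

# Rescale one forcing process at a time (e.g. wind speed) while holding
# the others fixed, then compare the simulated melt timing.
wind_50m = np.random.rand(400, 400)   # hypothetical 50 m wind field
wind_100m = coarsen(wind_50m, 2)      # 100 m resolution
wind_250m = coarsen(wind_50m, 5)      # 250 m resolution
```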

Relevance:

80.00%

Publisher:

Abstract:

Many system development projects today exceed both their budget and their time plan, often because of defects in the information systems that could have been prevented. Testing can in some cases account for as much as 50% of a project's total cost, and it is at the same time an important part of development. Testing as such has shifted its focus from the software itself and its faults to a wider perspective on whole infrastructures of information systems, where assuring good quality is important. Sogeti in the Netherlands has developed a test method called TMap (Test Management approach) that can be used for structured testing of information systems. TMap has not been used as much as desired in the office in Borlänge. Because Microsoft is releasing a new version of its platform, Visual Studio Team System (VSTS 2010), some colleagues at Sogeti in the Netherlands are developing a template to support the use of TMap in VSTS 2010. At the time of writing, the template is still under development. Sogeti's goal was to find out the differences between the test functionality in VSTS 2008 and 2010. The purpose of this essay, to analyze the test process in VSTS 2008 with TMap against the test process in VSTS 2010 together with the template, served that goal. The analysis was carried out from four different aspects: the TPI and TMMi models, problem and strength analyses, and a few research questions. The TPI and TMMi models were used to analyze and evaluate the test process. The analysis showed that there were differences between the two test processes: VSTS 2010 together with the template gave better support for using TMap and performing tests. In VSTS 2010 the test tool Camano is connected to TFS and is designed to make the execution and logging of tests easier. This leads to a test process that is easier to handle and has better support for TMap.

Relevance:

80.00%

Publisher:

Abstract:

INTRODUCTION With the advent of Web 2.0, social networking websites like Facebook, MySpace and LinkedIn have become hugely popular. According to Nilsen (2009), the top five social networking websites reach almost 250 million unique users globally, and the time people spend on those networks increased 63% between 2007 and 2008; Facebook alone saw a massive growth of 566% in number of minutes over the same period. Their appeal is clear: they enable users to easily form persistent networks of friends with whom they can interact and share content. Users then use those networks to keep in touch with current friends and to reconnect with old ones. However, online social network services have rapidly evolved into highly complex systems containing a large amount of personally salient information derived from large networks of friends. Since that information ranges from simple links to music, photos and videos, users have to deal not only with the huge amount of data generated by them and their friends but also with the fact that it is composed of many different media forms. Users face increasing challenges, especially as the number of friends on Facebook rises. One example is a simple task like finding a specific friend in a group of 100 or more friends: the user would most likely have to go through several pages and make several clicks until finding the one he is looking for. Another example is a user with more than 100 friends, each making a status update or another action per day, which results in 10 updates per hour to keep up with. That is plausible, especially since Facebook changed direction to rival Twitter by encouraging users to update their status as they do on Twitter. As a result, better visualizations are essential to present the web of information connected to a user. The visualizations used nowadays on social networking sites have not gone through major changes during their lifetimes. The sites have added more functionality and given more tools to their users, but the core of their visualization has not changed: the information is still presented in a flat way, in lists or groups of text and images that cannot show the extra connections between pieces of information. Those extra connections can give new meaning and insights, allowing the user to see more easily whether content is important to him and what information is related to it. However, showing those extra connections while still allowing the user to navigate easily and get the needed information at a quick glance is difficult; the use of color coding, clusters and shapes then becomes essential. Considering the advances in computer hardware in the last decade and the software platforms available today, there is also the opportunity to take advantage of 3D: we are at a phase where the available hardware and software are ready for the use of 3D on the web. With the extra dimension brought by 3D, visualizations can be constructed that show the content and its related information on the same screen and in a clear way, while also allowing a great deal of interactivity. Another opportunity to create better information visualizations presents itself in the form of open APIs, specifically the ones made available by the social networking sites.
Those APIs allow any developer to create applications or sites that take advantage of the huge amount of information in those networks; in this case specifically, they open the door to the creation of new social network visualizations. Nevertheless, the third dimension is by itself not enough to create a better interface for a social networking website; there are challenges to overcome. One of them is making the user understand what the system is doing during the interaction. Although that is important in 2D visualizations, it becomes essential in 3D because of the extra dimension. Overcoming that challenge requires applying the principles of animation defined by the artists at Walt Disney Studios (Johnston, et al., 1995); applied to the development of the interface, these principles make the actions of the system in response to user input clear and understandable. Furthermore, a user study needs to be performed to reveal users' main goals and motivations while navigating the social network. Those goals and motivations matter for building an interface that reflects users' expectations, and they also help in the development of appropriate metaphors. Metaphors play an important role in the interface because, if correctly chosen, they help the user understand its elements instead of having to memorize them. The last challenge is the use of 3D visualization on the web: there have been several attempts to bring 3D to it, mainly the various versions of VRML, which were doomed by the hardware limitations of the time. In the last couple of years, however, there has been a movement to provide the tools that finally allow developers to use 3D in a useful way, using X3D or OpenGL but especially Flash. This thesis argues that there is a need for a better social network visualization that shows all the dimensions of the information connected to the user and allows him to move through it. But the new visualization has to possess several characteristics in order to present a real gain in usability to Facebook's users: the first is to have the friends at the core of its design, and the second is to use the metaphor of circles of friends to separate users into groups according to the order of friendship. Several methods have to be used to achieve that, from 3D, which provides an extra dimension for presenting relevant information, to direct manipulation, which makes the interface comprehensible, predictable and controllable. Moreover, animation has to be used to make all the action on the screen perceptible to the user. Additionally, with the opportunity given by 3D-enabled hardware, the Flash platform, through the Flash 3D engine Papervision3D, and the Facebook platform, everything is in place to make the visualization possible. Even so, there are challenges to overcome, such as making the system's actions in 3D understandable to the user and creating correct metaphors that let the user understand the information and options available to him. This thesis document is divided into six chapters. Chapter 2 reviews the literature relevant to the work described in this thesis, and Chapter 3 describes the design stage that resulted in the application presented here.
Chapter 4 covers the development stage, describing the architecture and the components that compose the application. Chapter 5 explains the usability test process and presents and analyzes the results obtained through it. Finally, Chapter 6 presents the conclusions reached in this thesis.

Relevance:

80.00%

Publisher:

Abstract:

This research investigated the pattern of antibody response by means of the enzyme-linked immunosorbent assay (ELISA) and the indirect fluorescent antibody test (IFAT) through the course of experimental Trypanosoma evansi infection in dogs. Clinical and parasitological features were also studied. The average prepatent period was 11.2 days, and parasitaemia followed an undulating course. Biometrical study of the parasites revealed a mean total length of 21.68 µm. The disease was characterized by intermittent fever closely related to the degree of parasitaemia, and the main clinical signs consisted of pallor of the mucous membranes, edema, progressive emaciation and enlargement of palpable lymph nodes. Diagnostic antibody was detected within 12 to 15 days and 15 to 19 days of infection by IFAT and ELISA, respectively. High and persistent antibody levels were detected by both tests and appeared not to correlate with control of parasitaemia.

Relevance:

80.00%

Publisher:

Abstract:

Some programs may have their input data specified by formalized context-free grammars. This formalization facilitates the use of tools that systematize and raise the quality of their test process. Within this category of programs, compilers were the first to use this kind of tool to automate their tests. In this work we present an approach for defining tests from the formal description of a program's input. Sentence generation is performed taking into account the syntactic aspects defined by the specification of the input, i.e., the grammar. For optimization, coverage criteria are used to limit the quantity of tests without diminishing their quality. Our approach uses these criteria to drive generation so that it produces sentences satisfying a specific coverage criterion. The approach is based on the Lua language, relying heavily on its coroutines and dynamic construction of functions. With these resources, we propose a simple and compact implementation that can be optimized and controlled in different ways in order to satisfy the different implemented coverage criteria. To make the tool simpler to use, the EBNF notation was adopted for the specification of inputs; its parser was specified in the Meta-Environment tool for rapid prototyping.
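The abstract pairs grammar-driven sentence generation with coverage criteria, implemented in Lua with coroutines. As a rough analogue under stated assumptions (Python generators stand in for Lua coroutines; the toy grammar and the `generate` and `sentences` helpers are hypothetical, not the paper's code), a sketch of coverage-driven generation:

```python
import random

# Hypothetical toy grammar: nonterminals map to alternative productions
# (tuples of symbols); anything not in GRAMMAR is a terminal. By
# convention the first alternative of each nonterminal is non-recursive,
# so it can serve as a fallback when the depth limit is reached.
GRAMMAR = {
    "expr": [("term",), ("term", "+", "expr")],
    "term": [("num",), ("(", "expr", ")")],
    "num":  [("0",), ("1",)],
}

def generate(symbol, used, depth=0, max_depth=8):
    """Expand one symbol, preferring productions not yet covered."""
    if symbol not in GRAMMAR:                      # terminal symbol
        return symbol
    alts = GRAMMAR[symbol]
    if depth >= max_depth:
        prod = alts[0]                             # non-recursive fallback
    else:
        fresh = [a for a in alts if (symbol, a) not in used]
        prod = random.choice(fresh or alts)
    used.add((symbol, prod))
    return " ".join(generate(s, used, depth + 1) for s in prod)

def sentences(start="expr"):
    """Yield test sentences until every production has been exercised
    (an all-productions coverage criterion)."""
    used = set()
    total = sum(len(alts) for alts in GRAMMAR.values())
    while len(used) < total:
        yield generate(start, used)

for sentence in sentences():
    print(sentence)
```

Coverage stops generation as soon as every production has appeared in some sentence, which is the sense in which the criterion limits test quantity without losing syntactic variety.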

Relevance:

80.00%

Publisher:

Abstract:

Automation has become increasingly necessary during the software test process due to the high cost and time associated with such activity. Some tools have been proposed to automate the execution of acceptance tests in Web applications, but many of them have important limitations, such as a strong dependence on the structure of the HTML pages and the need to assign test case values manually. In this work, we present a language for specifying acceptance test scenarios for Web applications, called IFL4TCG, and a tool that generates test cases from these scenarios. The proposed language supports the Equivalence Class Partitioning criterion, and the tool generates test cases that meet different combination strategies (i.e., Each-Choice, Base-Choice and All Combinations). To evaluate the effectiveness of the proposed solution, we used the language and the associated tool to design and execute acceptance tests on a module of Sistema Unificado de Administração Pública (SUAP) of Instituto Federal Rio Grande do Norte (IFRN). Four systems analysts and one computer technician, who work as developers of that system, participated in the evaluation. Preliminary results showed that IFL4TCG can actually help to detect defects in Web applications.
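To illustrate two of the combination strategies named above (this sketch is not IFL4TCG itself; the `classes` example and the `all_combinations` and `each_choice` helpers are hypothetical), a minimal sketch in Python:

```python
from itertools import product

# Hypothetical equivalence classes for three input parameters of a web form.
classes = {
    "age":    ["minor", "adult", "senior"],
    "role":   ["student", "staff"],
    "status": ["active", "blocked"],
}

def all_combinations(classes):
    """All Combinations: full cartesian product of the classes."""
    names = list(classes)
    return [dict(zip(names, combo)) for combo in product(*classes.values())]

def each_choice(classes):
    """Each-Choice: every class value appears in at least one test case.
    (Base-Choice, not shown, fixes a base test and varies one parameter
    at a time.)"""
    names = list(classes)
    width = max(len(v) for v in classes.values())
    return [
        {n: classes[n][i % len(classes[n])] for n in names}
        for i in range(width)
    ]

print(len(all_combinations(classes)))  # 3 * 2 * 2 = 12 test cases
print(len(each_choice(classes)))       # max(3, 2, 2) = 3 test cases
```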

Relevance:

80.00%

Publisher:

Abstract:

This paper seeks to apply a routine for highway detection using mathematical morphology tools in high-resolution images. Mathematical Morphology theory consists of quantitatively describing geometric structures present in the image (targets or features), which explains its use in this work. Since high-resolution images are used, the greatest difficulty in the highway detection process is the presence of trees and automobiles along the track borders. Thus, obtaining good results with the morphological tools required an appropriate choice of the structuring element to be used in the functions. Through the appropriate choice of morphological operators and structuring elements it was possible to detect the highway tracks. Linear feature detection using mathematical morphology techniques can contribute to cartographic applications, such as the updating of cartographic products.
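The abstract does not list the exact operators used, so the following is only a plausible sketch of how structuring-element choice drives road extraction (scikit-image is assumed; the image is synthetic and the element sizes are made up):

```python
import numpy as np
from skimage.morphology import opening, closing, disk, rectangle

# Hypothetical grayscale aerial image; in practice this would be loaded
# from a high-resolution orthophoto.
image = np.random.rand(512, 512)

# Closing with a small disk suppresses dark obstructions (e.g. cars) on
# the bright road surface; the structuring element must be larger than
# the obstruction but smaller than the track width.
cleaned = closing(image, disk(3))

# Opening with an elongated structuring element keeps only structures
# that are long in the chosen direction, favouring road-like features.
roads_horizontal = opening(cleaned, rectangle(1, 15))
roads_vertical = opening(cleaned, rectangle(15, 1))
roads = np.maximum(roads_horizontal, roads_vertical)
```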

Relevance:

80.00%

Publisher:

Abstract:

It is of interest in some applications to determine whether there is a relationship between a hazard rate function (or a cumulative incidence function) and a mark variable which is only observed at uncensored failure times. We develop nonparametric tests for this problem when the mark variable is continuous. Tests are developed for the null hypothesis that the mark-specific hazard rate is independent of the mark, versus ordered and two-sided alternatives expressed in terms of mark-specific hazard functions and mark-specific cumulative incidence functions. The test statistics are based on functionals of a bivariate test process equal to a weighted average of differences between a Nelson–Aalen-type estimator of the mark-specific cumulative hazard function and a nonparametric estimator of this function under the null hypothesis. The weight function in the test process can be chosen so that the test statistics are asymptotically distribution-free. Asymptotically correct critical values are obtained through a simple simulation procedure. The testing procedures are shown to perform well in numerical studies and are illustrated with an AIDS clinical trial example. Specifically, the tests are used to assess whether the instantaneous or absolute risk of treatment failure depends on the amount of accumulation of drug resistance mutations in a subject's HIV virus. This assessment helps guide the development of anti-HIV therapies that surmount the problem of drug resistance.
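To make the construction concrete, a notational sketch (the symbols W, Ĥ, Ĥ₀, Z and the supremum form are hypothetical names, since the abstract does not display its formulas): the test process is a weighted difference of the two cumulative-hazard estimators, and a two-sided statistic can take its supremum:

```latex
Z(t,v) \;=\; \int_0^t \int_0^v W(s,u)\,
        \bigl\{ \hat{H}(ds,du) - \hat{H}_0(ds,du) \bigr\},
\qquad
T \;=\; \sup_{t \le \tau,\; v}\, \bigl| Z(t,v) \bigr| .
```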

Relevance:

80.00%

Publisher:

Abstract:

This dissertation established a standard foam index test: the absolute foam index test. This test characterized a wide range of coal fly ash by the absolute volume of air-entraining admixture (AEA) necessary to produce a 15-second metastable foam in a coal fly ash-cement slurry in a specified time. The absolute foam index test was used to characterize fly ash samples with loss on ignition (LOI) values ranging from 0.17 to 23.3 wt.%. It characterized the samples by the absolute volume of AEA, defined as the amount of undiluted AEA solution added to reach a 15-minute endpoint signified by a 15-second metastable foam. Results were compared from several foam index test time trials that used different initial test concentrations to reach termination at selected times. Based on the coefficient of variation (CV), a 15-minute endpoint with limits of 12 to 18 minutes was chosen. Various initial test concentrations were used to obtain consistent contact times and concentration gradients for the 15-minute endpoint. A set of four standard concentrations for the absolute foam index test was then defined by regression analyses, together with a procedure simplifying the test process. The set of standard concentrations was determined by analyzing the experimental results of 80 tests on coal fly ashes with LOI values ranging from 0.39 to 23.3 wt.%. A regression analysis informed the selection of four concentrations (2, 6, 10, and 15 vol.% AEA) that are expected to accommodate fly ashes with 0.39 to 23.3 wt.% LOI, depending on the AEA type; higher concentrations should be used for high-LOI fly ash when necessary. A procedure developed using these standard concentrations is expected to require only 1-3 trials to meet the specified endpoint criteria for most fly ashes. The AEA solution concentration that achieved the metastable foam in the foam index test was compared to the AEA equilibrium concentration obtained from the direct adsorption isotherm test with the same fly ash. The results showed that the AEA concentration that satisfied the absolute foam index test was much less than the equilibrium concentration, indicating that the test was not at or near equilibrium; rather, it was a dynamic test in which the duration of the test played an important role in the results. Even though the absolute foam index did not represent an equilibrium condition, a correlation was made between the absolute foam index and adsorption isotherms. Equilibrium isotherm equations obtained from direct isotherm tests were used to calculate the equilibrium concentrations and capacities of fly ash from 0.17 to 10.5% LOI. The calculated fly ash capacity was much less than the capacities obtained from isotherm tests conducted with higher initial concentrations, again indicating that the absolute foam index is a dynamic rather than an equilibrium test; nonetheless, a correlation was made between the absolute foam index and adsorption isotherms for fly ash of 0.17 to 10.5% LOI. Several batches of mortar were mixed for the same fly ash type, increasing only the AEA concentration (dosage) in each subsequent batch.
Mortar air test results showed that air content increased with each increase in AEA concentration until a point where a further increase produced no additional air. This was the maximum air content that could be achieved by the particular mortar system; the system reached its air capacity at the saturation limit. This concentration of AEA was compared to the critical micelle concentration (CMC) of the AEA and to the absolute foam index.
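As a way to picture the trial procedure (everything here is hypothetical: the `run_trial` stand-in, its ash "demand" parameter, and the control flow are illustrative, not the dissertation's protocol), a sketch of walking the four standard concentrations to a 12-18 minute endpoint:

```python
STANDARD_CONCENTRATIONS = [2, 6, 10, 15]   # vol.% AEA
ENDPOINT_WINDOW = (12.0, 18.0)             # minutes around the 15-min target

def run_trial(conc_vol_pct: float, ash_demand: float = 8.0) -> float:
    """Hypothetical stand-in for one bench trial: returns the minutes
    needed to reach a 15-second metastable foam. Modeled so the endpoint
    arrives faster as the concentration approaches the ash's AEA demand."""
    return 15.0 * ash_demand / conc_vol_pct

def absolute_foam_index():
    """Walk up the standard concentrations until the endpoint falls in
    the 12-18 minute window; 1-3 trials are expected for most ashes."""
    for conc in STANDARD_CONCENTRATIONS:
        minutes = run_trial(conc)
        if ENDPOINT_WINDOW[0] <= minutes <= ENDPOINT_WINDOW[1]:
            return conc, minutes
    return None   # high-LOI ash may need a concentration above 15 vol.%

print(absolute_foam_index())   # (10, 12.0) with the toy model above
```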

Relevance:

80.00%

Publisher:

Abstract:

In this paper we present a solution to the problem of action and gesture recognition using sparse representations. The dictionary is modelled as a simple concatenation of features computed for each action or gesture class from the training data, and test data are classified by finding a sparse representation of the test video features over this dictionary. Our method does not impose any explicit training procedure on the dictionary. We experiment with two kinds of features, obtained by projecting (i) Gait Energy Images (GEIs) and (ii) motion descriptors to a lower dimension using random projection. Experiments have shown a 100% recognition rate on standard datasets, and the results are compared with those obtained with the widely used SVM classifier.
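A minimal sketch of this classify-by-sparse-coding scheme under stated assumptions (the data are synthetic, and scikit-learn's OMP solver is used here since the abstract does not name one):

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp
from sklearn.random_projection import GaussianRandomProjection

# Hypothetical data: 20 training clips per class, 5 classes, 4096-dim
# features (e.g. flattened Gait Energy Images).
rng = np.random.default_rng(0)
n_classes, per_class, dim = 5, 20, 4096
train = rng.normal(size=(n_classes * per_class, dim))
labels = np.repeat(np.arange(n_classes), per_class)

# Random projection to a lower dimension, then build the dictionary as a
# plain concatenation of the projected training features (no training).
proj = GaussianRandomProjection(n_components=128, random_state=0)
D = proj.fit_transform(train).T                  # shape (128, 100)
D /= np.linalg.norm(D, axis=0)                   # unit-norm atoms

def classify(test_feature):
    y = proj.transform(test_feature[None, :])[0]
    x = orthogonal_mp(D, y, n_nonzero_coefs=10)  # sparse code over D
    # Pick the class whose atoms reconstruct the sample with least residual.
    residuals = [
        np.linalg.norm(y - D[:, labels == c] @ x[labels == c])
        for c in range(n_classes)
    ]
    return int(np.argmin(residuals))

print(classify(train[0]))   # a training sample recovers its own class (0)
```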

Relevance:

80.00%

Publisher:

Abstract:

Embedded context management in resource-constrained devices (e.g. mobile phones, autonomous sensors or smart objects) imposes special requirements in terms of lightness for data modelling and reasoning. In this paper, we explore the state of the art in data representation and reasoning tools for embedded mobile reasoning and propose a light inference system (LIS) that aims at simplifying embedded inference processes by offering a set of functionalities to avoid redundancy in context management operations. The system is part of a service-oriented mobile software framework, conceived to facilitate the creation of context-aware applications; it decouples sensor data acquisition and context processing from the application logic. LIS, composed of several modules, encapsulates existing lightweight tools for ontology data management and rule-based reasoning, and it is ready to run on Java-enabled handheld devices. Data management and reasoning processes are designed to handle a general ontology that enables communication among framework components. Both the applications running on top of the framework and the framework components themselves can configure the rule and query sets in order to retrieve the information they need from LIS. To test LIS features in a real application scenario, an 'Activity Monitor' has been designed and implemented: a personal health-persuasive application that provides feedback on the user's lifestyle, combining data from physical and virtual sensors. In this use case, LIS is used to evaluate the user's activity level in a timely manner, to decide on the convenience of triggering notifications, and to determine the best interface or channel for delivering these context-aware alerts.

Relevance:

80.00%

Publisher:

Abstract:

Testing concurrent software is difficult due to problems with inherent nondeterminism. In previous work, we have presented a method and tool support for the testing of concurrent Java components. In this paper, we extend that work by presenting and discussing techniques for testing Java thread interrupts and timed waits. Testing thread interrupts is important because every Java component that calls wait must have code dealing with these interrupts. For a component that uses interrupts and timed waits to provide its basic functionality, the ability to test these features is clearly even more important. We discuss the application of the techniques and tool support to one such component, which is a nontrivial implementation of the readers-writers problem.