996 results for Concept testing
Abstract:
Specific demand for service concept creation has come about from industrial organizations’ desire to find new and innovative ways to differentiate their offering by increasing the level of customer services. Providers of professional services have also demanded new concepts and approaches for their businesses as these industries have become increasingly competitive. Firms are now seeking better ways to understand and segment their customers, to ensure the delivery of quality services and to strengthen their position in aggressively competitive markets. This thesis is intended to provide management consulting companies with a new work method that enables service concept creation in a business-to-business environment. The model defines the service concept as a combination of delivered value and target customers; the operating model is introduced as a third dimension when the service concept creation guidelines are tested in the target organization. For testing, service concepts for a management consulting company are created. The service concepts are designed to serve as a solid foundation for further service improvements. Recommendations and proposals for further action related to service development in the target organization are presented, and recommendations to further improve the created model are given.
Abstract:
Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular frequentist and Bayesian, have promoted radically different solutions for taking a decision about the plausibility of competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel extensive debate in the literature. More recently, controversial discussion was initiated by the editorial decision of a scientific journal [1] to refuse any paper submitted for publication that contains null hypothesis testing procedures. Since the large majority of papers published in forensic journals propose the evaluation of statistical evidence based on so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
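As a concrete illustration of the frequentist procedure discussed in this abstract, a minimal sketch of a two-sided z-test p-value can be computed from the standard normal survival function; all numbers below are illustrative, not from the paper:

```python
import math

def two_sided_p_value(sample_mean, mu0, sigma, n):
    """Two-sided p-value for H0: mu = mu0 with known sigma (z-test)."""
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    # erfc(|z|/sqrt(2)) equals twice the standard normal upper tail beyond |z|
    return math.erfc(abs(z) / math.sqrt(2))

# Illustrative numbers: sample of 25, observed mean 10.4 vs H0 mean 10.0
p = two_sided_p_value(sample_mean=10.4, mu0=10.0, sigma=1.0, n=25)
# z = 2.0, so p ≈ 0.0455 — below the conventional 5% threshold
```

A Bayesian analysis of the same data would instead report a posterior probability or Bayes factor for the competing hypotheses, which is the heart of the disagreement the paper discusses.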
Abstract:
Background: None of the HIV T-cell vaccine candidates that have reached advanced clinical testing have been able to induce protective T-cell immunity. A major reason for these failures may have been suboptimal T-cell immunogen designs. Methods: To overcome this problem, we used a novel immunogen design approach that is based on functional T-cell response data from more than 1,000 HIV-1 clade B and C infected individuals and which aims to direct the T-cell response to the most vulnerable sites of HIV-1. Results: Our approach identified 16 regions in Gag, Pol, Vif and Nef that were relatively conserved and predominantly targeted by individuals with reduced viral loads. These regions formed the basis of the HIVACAT T-cell Immunogen (HTI) sequence, which is 529 amino acids in length, includes more than 50 optimally defined CD4+ and CD8+ T-cell epitopes restricted by a wide range of HLA class I and II molecules, and covers viral sites where mutations led to a dramatic reduction in viral replicative fitness. In both C57BL/6 mice and Indian rhesus macaques, immunization with an HTI-expressing DNA plasmid (DNA.HTI) induced broad and balanced T-cell responses to several segments within Gag, Pol, and Vif. DNA.HTI induced robust CD4+ and CD8+ T-cell responses that were increased by a booster vaccination using modified vaccinia Ankara (MVA.HTI), expanding the DNA.HTI-induced response to up to 3.2% IFN-γ T-cells in macaques. HTI-specific T cells showed a central and effector memory phenotype, with a significant fraction of the IFN-γ+ CD8+ T cells being Granzyme B+ and able to degranulate (CD107a+). Conclusions: These data demonstrate the immunogenicity of a novel HIV-1 T-cell vaccine concept that induced broadly balanced responses to vulnerable sites of HIV-1 while avoiding the induction of responses to potential decoy targets that may divert effective T-cell responses towards variable and less protective viral determinants.
Abstract:
The purpose of this two-phase study was to define the concept of vaccination competence and assess the vaccination competence of graduating public health nurse students (PHN students) and public health nurses (PHNs) in Finland, with the goal of promoting and maintaining vaccination competence and developing vaccination education. The first phase of the study included semi-structured interviews with vaccination professionals, graduating PHN students and clients (a total of n=40), asking them to describe vaccination competence as well as the factors strengthening and weakening it. The data were analyzed through content analysis. In the second phase of the study, structured instruments were developed, and the vaccination competence of PHN students (n=129) in Finland and PHNs (n=405) was assessed using a self-assessment scale (VAS) and a knowledge test. PHNs were used as a reference group, enabling us to determine whether a satisfactory level of vaccination competence was achieved by the end of studies, or whether it was gained through work experience vaccinating clients. The data were collected from five polytechnic institutions and seven health centers located in various parts of the country. The data were collected using instruments developed for this study and were analyzed statistically. In the first phase, based on the results of the interviews, vaccination competence was defined as a large multi-faceted entity comprising the concepts of a competent vaccinator, competent implementation of the vaccination, and the outcome of the implementation. The semi-structured interviews revealed that the factors strengthening and weakening vaccination competence were connected to the vaccinator, the client being vaccinated, the vaccination environment and vaccinator education. On the whole, the factors strengthening and weakening vaccination competence were the opposites of each other.
In the second phase, on the self-assessment of vaccination competence, students rated themselves significantly lower than working professionals. On the knowledge test, the percentage of correct answers was lower for students than for PHNs. When all background variables were taken into account in multivariate analysis, there was no longer a significant difference between the students and PHNs on the self-assessment. However, in multivariate analysis the PHNs still performed better than students on the knowledge test. For this study, a satisfactory level of vaccination competence was defined as a mean of 8.0 on the self-assessment and 80% correct answers on the knowledge test. Based on these criteria, students fell just short of the satisfactory level in their overall self-assessment, while PHNs reached it. Both groups, however, did rate themselves as satisfactory on some sum variables. On the knowledge test the students did not achieve a satisfactory total score (80%), though PHNs did. Again, both groups did achieve a satisfactory level on several sum variables. Further research and development should focus on vaccination education, the testing of vaccination competence and vaccination practices in clinical practice, as well as on developing the measurement tools.
Abstract:
The problem of software (SW) defects is becoming more and more topical because of the increasing amount of software and its growing complexity. The majority of these defects are found during the testing phase, which consumes about 40-50% of the development effort. Test automation allows reducing the cost of this process and increasing testing effectiveness. In the mid-1980s the first tools for automated testing appeared, and the automated process was implemented in different kinds of SW testing. In a short time it became obvious that automated testing can cause many problems, such as increased product cost, decreased reliability and even project failure. This thesis describes the automated testing process and its concept, lists the main problems, and gives an algorithm for automated test tool selection. This work also presents an overview of the main automated test tools for embedded systems.
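A tool-selection algorithm of the kind the thesis describes can be sketched as a weighted-criteria ranking; the tool names, criteria, scores and weights below are hypothetical placeholders, not values from the thesis:

```python
def rank_tools(tools, weights):
    """Rank candidate test-automation tools by a weighted sum of
    per-criterion scores (0-10). Higher total = better fit."""
    total = lambda scores: sum(weights[c] * s for c, s in scores.items())
    return sorted(tools, key=lambda name: total(tools[name]), reverse=True)

# Hypothetical candidates scored against hypothetical criteria
tools = {
    "ToolA": {"cost": 6, "embedded_support": 9, "maintainability": 7},
    "ToolB": {"cost": 8, "embedded_support": 5, "maintainability": 6},
}
weights = {"cost": 0.2, "embedded_support": 0.5, "maintainability": 0.3}
ranking = rank_tools(tools, weights)  # ToolA scores 7.8, ToolB scores 5.9
```

Weighting embedded-system support most heavily reflects the thesis's focus on embedded targets; in practice the criteria and weights would come from project requirements.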
Abstract:
Point-of-care (POC) diagnostics is a field with a rapidly growing market share. As these applications become more widely used, there is increasing pressure to improve their performance to match that of central laboratory tests. Lanthanide luminescence has been widely utilized in diagnostics because of the numerous advantages gained by the utilization of time-resolved or anti-Stokes detection. So far the use of lanthanide labels in POC has been scarce due to limitations set by the instrumentation required for their detection and the shortcomings, e.g. low brightness, of these labels. Along with the advances in the research of lanthanide luminescence and in the field of semiconductors, these materials are becoming a feasible alternative for signal generation in future POC assays. The aim of this thesis was to explore ways of utilizing time-resolved detection or anti-Stokes detection in POC applications. The long-lived fluorescence for the time-resolved measurement can be produced with lanthanide chelates. The ultraviolet (UV) excitation required by these chelates is cumbersome to produce with POC-compatible fluorescence readers. In this thesis the use of a novel light-harvesting ligand was studied. This molecule can be used to excite Eu(III) ions at wavelengths extending up to the visible part of the spectrum. An enhancement solution based on this ligand showed good performance in a proof-of-concept bioaffinity assay and produced a bright signal upon 365 nm excitation thanks to the high molar absorptivity of the chelate. These features are crucial when developing miniaturized readers for the time-resolved detection of fluorescence. Upconverting phosphors (UCPs) were studied as an internal light source in glucose-sensing dry-chemistry test strips, and ways of utilizing their various emission wavelengths and near-infrared excitation were explored.
The use of nanosized NaYF4:Yb3+,Tm3+ particles enabled the replacement of an external UV light source with a NIR laser and gave an additional degree of freedom in the optical setup of the detector instrument. The new method enabled a blood glucose measurement with results comparable to the current standard method of measuring reflectance. Microsized visible-emitting UCPs were used in a similar manner, but with a broadly absorbing indicator compound filtering the excitation and emission wavelengths of the UCP. This approach resulted in a novel way of benefiting from the non-linear relationship between the excitation power and emission intensity of the UCPs, and enabled the amplification of the signal response from the indicator dye.
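The non-linear power dependence exploited here can be sketched with a simple model; the quadratic exponent (typical of a two-photon upconversion process) and all constants are illustrative assumptions, not measured values from the thesis:

```python
def ucp_emission(power, n=2.0, c=1.0):
    """Upconversion emission intensity I = c * P**n, with n ≈ 2 for a
    two-photon process. c and n are assumed, illustrative constants."""
    return c * power ** n

# A 10% attenuation of the excitation (e.g. absorbed by an indicator dye)
I0 = ucp_emission(1.0)
I1 = ucp_emission(0.9)
relative_drop = 1 - I1 / I0  # ≈ 0.19: the 10% power change appears amplified
```

Because the emission scales roughly with the square of the excitation power, a small change in power absorbed by the indicator produces an approximately doubled relative change in emission, which is the signal-amplification effect the abstract describes.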
Abstract:
In standard multivariate statistical analysis, common hypotheses of interest concern changes in mean vectors and subvectors. In compositional data analysis it is now well established that compositional change is most readily described in terms of the simplicial operation of perturbation, and that subcompositions replace the marginal concept of subvectors. To motivate the statistical developments of this paper we present two challenging compositional problems from food production processes. Against this background the relevance of perturbations and subcompositions can be clearly seen. Moreover, we can identify a number of hypotheses of interest involving the specification of particular perturbations or differences between perturbations, and also hypotheses of subcompositional stability. We identify the two problems as the counterparts of the analysis of paired comparison or split-plot experiments and of separate-sample comparative experiments, in the jargon of standard multivariate analysis. We then develop appropriate estimation and testing procedures for a complete lattice of relevant compositional hypotheses.
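The simplicial perturbation operation central to this abstract is just component-wise multiplication followed by re-closure; a minimal sketch, assuming unit-sum closure:

```python
def closure(x):
    """Rescale a positive vector so its parts sum to 1 (a composition)."""
    total = sum(x)
    return [xi / total for xi in x]

def perturb(x, p):
    """Simplicial perturbation: component-wise product, then re-closure."""
    return closure([xi * pi for xi, pi in zip(x, p)])

x = closure([1, 2, 1])   # composition [0.25, 0.5, 0.25]
p = closure([2, 1, 1])   # perturbing composition
y = perturb(x, p)        # [0.4, 0.4, 0.2] — still sums to 1
```

Perturbation plays the role for compositions that vector addition plays for unconstrained data, which is why compositional change is described through it rather than through differences of raw proportions.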
Abstract:
A novel series of polyaromatic ionomers with similar equivalent weights but very different sulphonic acid distributions along the ionomer backbone has been designed and prepared. By synthetically organising the sequence distribution so that it consists of fully defined ionic segments (containing singlets, doublets or quadruplets of sulphonic acid groups) alternating strictly with equally well-defined nonionic spacer segments, a new class of polymers which may be described as microblock ionomers has been developed. These materials exhibit very different properties and morphologies from analogous randomly substituted systems. Progressively extending the nonionic spacer length in the repeat unit (maintaining a constant equivalent weight by increasing the degree of sulphonation of the ionic segment) leads to an increasing degree of nanophase separation between hydrophilic and hydrophobic domains in these materials. Membranes cast from ionomers with the more highly phase-separated morphologies show significantly higher onset temperatures for uncontrolled swelling in water. This new type of ionomer design has enabled the fabrication of swelling-resistant hydrocarbon membranes, suitable for fuel cell operation, with very much higher ion exchange capacities (>2 meq g^-1) than those previously reported in the literature. When tested in a fuel cell at high temperature (120 °C) and low relative humidity (35% RH), the best microblock membrane matched the performance of Nafion 112. Moreover, comparative low-load cycle testing of membrane-electrode assemblies suggests that the durability of the new membranes under conditions of high temperature and low relative humidity is superior to that of conventional perfluorinated materials.
Abstract:
Many weeds occur in patches but farmers frequently spray whole fields to control the weeds in these patches. Given a geo-referenced weed map, technology exists to confine spraying to these patches. Adoption of patch spraying by arable farmers has, however, been negligible partly due to the difficulty of constructing weed maps. Building on previous DEFRA and HGCA projects, this proposal aims to develop and evaluate a machine vision system to automate the weed mapping process. The project thereby addresses the principal technical stumbling block to widespread adoption of site specific weed management (SSWM). The accuracy of weed identification by machine vision based on a single field survey may be inadequate to create herbicide application maps. We therefore propose to test the hypothesis that sufficiently accurate weed maps can be constructed by integrating information from geo-referenced images captured automatically at different times of the year during normal field activities. Accuracy of identification will also be increased by utilising a priori knowledge of weeds present in fields. To prove this concept, images will be captured from arable fields on two farms and processed offline to identify and map the weeds, focussing especially on black-grass, wild oats, barren brome, couch grass and cleavers. As advocated by Lutman et al. (2002), the approach uncouples the weed mapping and treatment processes and builds on the observation that patches of these weeds are quite stable in arable fields. There are three main aspects to the project. 1) Machine vision hardware. Hardware component parts of the system are one or more cameras connected to a single board computer (Concurrent Solutions LLC) and interfaced with an accurate Global Positioning System (GPS) supplied by Patchwork Technology. The camera(s) will take separate measurements for each of the three primary colours of visible light (red, green and blue) in each pixel. 
The basic proof of concept can be achieved in principle using a single-camera system, but in practice systems with more than one camera may need to be installed so that larger fractions of each field can be photographed. Hardware will be reviewed regularly during the project in response to feedback from other work packages and updated as required. 2) Image capture and weed identification software. The machine vision system will be attached to toolbars of farm machinery so that images can be collected during different field operations. Images will be captured at different ground speeds, in different directions and at different crop growth stages as well as in different crop backgrounds. Having captured geo-referenced images in the field, image analysis software will be developed to identify weed species by Murray State and Reading Universities with advice from The Arable Group. A wide range of pattern recognition techniques, and in particular Bayesian networks, will be used to advance the state of the art in machine vision-based weed identification and mapping. Weed identification algorithms used by others are inadequate for this project as we intend to collect and correlate images captured at different growth stages. Plants grown for this purpose by Herbiseed will be used in the first instance. In addition, our image capture and analysis system will include plant characteristics such as leaf shape, size, vein structure, colour and textural pattern, some of which are not detectable by other machine vision systems or are omitted by their algorithms. Using such a list of features observable with our machine vision system, we will determine those that can be used to distinguish the weed species of interest. 3) Weed mapping. Geo-referenced maps of weeds in arable fields (Reading University and Syngenta) will be produced with advice from The Arable Group and Patchwork Technology.
Natural infestations will be mapped in the fields but we will also introduce specimen plants in pots to facilitate more rigorous system evaluation and testing. Manual weed maps of the same fields will be generated by Reading University, Syngenta and Peter Lutman so that the accuracy of automated mapping can be assessed. The principal hypothesis and concept to be tested is that by combining maps from several surveys, a weed map with acceptable accuracy for end-users can be produced. If the concept is proved and can be commercialised, systems could be retrofitted at low cost onto existing farm machinery. The outputs of the weed mapping software would then link with the precision farming options already built into many commercial sprayers, allowing their use for targeted, site-specific herbicide applications. Immediate economic benefits would, therefore, arise directly from reducing herbicide costs. SSWM will also reduce the overall pesticide load on the crop and so may reduce pesticide residues in food and drinking water, and reduce adverse impacts of pesticides on non-target species and beneficials. Farmers may even choose to leave unsprayed some non-injurious, environmentally-beneficial, low-density weed infestations. These benefits fit very well with the anticipated legislation emerging in the new EU Thematic Strategy for Pesticides, which will encourage more targeted use of pesticides and greater uptake of Integrated Crop (Pest) Management approaches, and also with the requirements of the Water Framework Directive to reduce levels of pesticides in water bodies. The greater precision of weed management offered by SSWM is therefore a key element in preparing arable farming systems for the future, where policy makers and consumers want to minimise pesticide use and the carbon footprint of farming while maintaining food production and security.
The mapping technology could also be used on organic farms to identify areas of fields needing mechanical weed control, thereby reducing both carbon footprints and also damage to crops by, for example, spring tines. Objectives: (i) to develop a prototype machine vision system for automated image capture during agricultural field operations; (ii) to prove the concept that images captured by the machine vision system over a series of field operations can be processed to identify and geo-reference specific weeds in the field; (iii) to generate weed maps from the geo-referenced weed plants/patches identified in objective (ii).
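The way Bayesian reasoning can combine image features with a priori knowledge of weeds present in a field, as proposed in this project, can be sketched with a toy naive-Bayes classifier; the species, features and every probability below are invented for illustration:

```python
def naive_bayes_classify(features, priors, likelihoods):
    """Toy naive-Bayes weed classifier: multiply a prior over species
    (a stand-in for a priori field knowledge) by per-feature likelihoods,
    then normalise to a posterior distribution."""
    scores = {}
    for species, prior in priors.items():
        score = prior
        for feature, value in features.items():
            score *= likelihoods[species][feature][value]
        scores[species] = score
    total = sum(scores.values())
    return {s: v / total for s, v in scores.items()}

# Hypothetical prior: black-grass is known to occur in this field
priors = {"black-grass": 0.6, "wild-oat": 0.4}
likelihoods = {
    "black-grass": {"leaf_shape": {"narrow": 0.9, "broad": 0.1}},
    "wild-oat":    {"leaf_shape": {"narrow": 0.5, "broad": 0.5}},
}
posterior = naive_bayes_classify({"leaf_shape": "narrow"}, priors, likelihoods)
# posterior["black-grass"] = 0.54 / 0.74 ≈ 0.73
```

A full Bayesian network, as the project proposes, would additionally model dependencies between features (leaf shape, vein structure, colour, texture) rather than assuming them independent.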
Abstract:
This paper proposes a Dual-Magnet Magnetic Compliance Unit (DMCU) for use in medium sized space rover platforms to enhance terrain handling capabilities and speed of traversal. An explanation of magnetic compliance and how it can be applied to space robotics is shown, along with an initial mathematical model for this system. A design for the DMCU is proposed along with a 4-wheeled DMCU Testing Rig.
Abstract:
Area-wide development viability appraisals are undertaken to determine the economic feasibility of policy targets in relation to planning obligations. Essentially, development viability appraisals consist of a series of residual valuations of hypothetical development sites across a local authority area at a particular point in time. The valuations incorporate the estimated financial implications of the proposed level of planning obligations. To determine viability, the output land values are benchmarked against a threshold land value, and therefore the basis on which this threshold is established and the level at which it is set are critical to development viability appraisal at the policy-setting (area-wide) level. Essentially it is an estimate of the value at which a landowner would be prepared to sell. If the estimated site values are higher than the threshold land value, the policy target is considered viable. This paper investigates the effectiveness of existing methods of determining threshold land value. These methods are tested against the relationship between development value and costs. Modelling reveals that a threshold land value that is not related to shifts in development value renders marginal sites unviable and fails to collect proportionate planning obligations from high value/low cost sites. Testing the model against national average house prices and build costs reveals the high degree of volatility in residual land values over time and underlines the importance of making threshold land value relative to the main driver of this volatility, namely development value.
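The residual valuation logic described above can be sketched as follows; all figures, and the convention of taking developer's profit as a fixed share of gross development value (GDV), are illustrative assumptions rather than the paper's model:

```python
def residual_land_value(gdv, build_cost, obligations, profit_share=0.2):
    """Residual valuation: what is left for land after build costs,
    planning obligations and developer's profit (assumed here to be a
    fixed share of GDV) are deducted from gross development value."""
    return gdv - build_cost - obligations - profit_share * gdv

def is_viable(gdv, build_cost, obligations, threshold_land_value):
    """The policy target is 'viable' if the residual meets the threshold."""
    return residual_land_value(gdv, build_cost, obligations) >= threshold_land_value

# With a fixed threshold, a fall in development value can tip a marginal
# site into unviability even though costs are unchanged.
viable_high = is_viable(10_000_000, 6_000_000, 500_000, 1_000_000)  # True
viable_low = is_viable(8_500_000, 6_000_000, 500_000, 1_000_000)    # False
```

This is exactly the sensitivity the paper highlights: because the residual is a small difference between large numbers, modest shifts in development value produce large swings in land value relative to any fixed threshold.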
Abstract:
This thesis project is part of the all-round automation of production of the concentrating solar PV/T system Absolicon X10. ABSOLICON Solar Concentrator AB has invented and started production of the promising concentrated solar system Absolicon X10. The aims of this thesis project are designing, assembling, calibrating and putting into operation an automatic measurement system intended to evaluate the shape of concentrating parabolic reflectors. On the basis of the requirements of the company administration and the needs of the real production process, the operating conditions for the laser testing rig were formulated, and the basic concept of using laser radiation was defined. As a first step, the overall design of the whole system was made and its division into parts was defined. After preliminary simulations, the function and operating conditions of all the parts were formulated. In the next steps, the detailed design of all the parts was carried out. Most components were ordered from the respective companies; some of the mechanical components were made in the company's workshop. All parts of the laser testing rig were assembled and tested. The software controlling the laser testing rig was created in LabVIEW; to tune and test it, a special simulator was designed and assembled. When all parts had been assembled into the complete system, the laser testing rig was tested, calibrated and tuned. In the workshop of Absolicon AB, trial measurements were conducted, and the laser testing rig was installed in the production line at the plant in Soleftea.
Abstract:
At the beginning of 2003 the four-year research project REBUS on education, research, development and demonstration of competitive solar combisystems was launched. Research groups in Norway, Denmark, Sweden and Latvia are working together with partners from industry on innovative solutions for solar heating in the Nordic countries. Existing system concepts have been analyzed, and based on the results new system designs have been developed. The proposed solutions have to fulfill country-specific technical, sociological and cost requirements. Due to the similar demands on the systems in Denmark and Sweden, it has been decided to develop a common system concept for both countries, which increases the market potential for the manufacturer. The focus of the development is on systems for the large number of rather well-insulated existing single-family houses. In close collaboration with the industrial partners, a system concept has been developed that is characterized by its high compactness and flexibility. It allows the use of different types of boilers and heating distribution systems and a variable store and collector size. Two prototypes have been built, one for the Danish market with a gas boiler, and one for the Swedish market with a pellet boiler as auxiliary heater. After intensive testing and possible further improvements, at least two systems will be installed and monitored in demonstration houses. The systems have been modeled in TRNSYS, and the simulation results will be used to further improve the system and evaluate the system performance.
Abstract:
It is well known that cointegration between the levels of two variables (labeled Yt and yt in this paper) is a necessary condition to assess the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. The basis of this result is the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article has a few novel contributions, but two stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced rank restrictions) before additional tests on the values of parameters. We show that PV relationships entail a weak-form common feature relationship as in Hecq, Palm, and Urbain (2006) and in Athanasopoulos, Guillén, Issler and Vahid (2011), and also a polynomial serial-correlation common feature relationship as in Cubadda and Hecq (2001), which represent restrictions on dynamic models that allow several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to propose robust tests in the presence of heteroskedasticity. We also propose a robust Wald test developed to investigate the presence of reduced rank models. Their performance is evaluated in a Monte Carlo exercise.
Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig, and Verdelhan (2010). Here again we can exploit the results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking these two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles.
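In a standard textbook parameterization (an assumption here; the paper's exact notation may differ), the present-value relation between Yt and yt and the rational-expectations orthogonality condition it discusses can be written as:

```latex
% Present-value relation between Y_t (e.g. asset price) and y_t (e.g. dividend),
% with discount factor 0 < \delta < 1 and proportionality constant \theta:
Y_t \;=\; \theta \sum_{i=1}^{\infty} \delta^{\,i}\, \mathrm{E}_t\!\left[\, y_{t+i} \,\right].
% Under rational expectations the one-step forecast error must be
% orthogonal to the information set \mathcal{I}_t at time t:
\mathrm{E}\!\left[\, Y_{t+1} - \mathrm{E}_t\, Y_{t+1} \,\middle|\, \mathcal{I}_t \,\right] \;=\; 0 .
```

The second condition is the one the authors stress: if it fails, the present-value equation acquires an extra term for the non-zero conditional expectation of future errors, regardless of whether Yt and yt are cointegrated.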
Abstract:
This paper introduces the concept of common deterministic shifts (CDS). This concept is simple and intuitive, and relates to the common structure of shifts or policy interventions. We propose a reduced rank technique to investigate the presence of CDS. The proposed testing procedure has standard asymptotics and good small-sample properties. We further link the concept of CDS to that of super-exogeneity. It is shown that CDS tests can be constructed which allow testing for super-exogeneity. The Monte Carlo evidence indicates that the CDS test for super-exogeneity dominates testing procedures proposed in the literature.