890 results for "Scale validation process"
Abstract:
Mitochondrial DNA (mtDNA) analysis is usually a last resort in routine forensic DNA casework. However, it has become a powerful tool for the analysis of highly degraded samples or samples containing too little or no nuclear DNA, such as old bones and hair shafts. The gold-standard methodology is still the direct sequencing of polymerase chain reaction (PCR) products or cloned amplicons from the HVS-1 and HVS-2 (hypervariable segment) control region segments. Identifications using mtDNA are time-consuming, expensive and can be very complex, depending on the amount and nature of the material being tested. The main goal of this work is to develop a less labour-intensive and less expensive screening method for mtDNA analysis, in order to aid in the exclusion of non-matching samples and to serve as a presumptive test prior to final confirmatory DNA sequencing. We selected 14 highly discriminatory single nucleotide polymorphisms (SNPs), based on simulations performed by Salas and Amigo (2010) [1], to be typed using SNaPshot™ (Applied Biosystems, Foster City, CA, USA). The assay was validated by typing more than 100 HVS-1/HVS-2 sequenced samples. No differences were observed between SNP typing and DNA sequencing when results were compared, with the exception of allelic dropouts observed in a few haplotypes. Haplotype diversity simulations were performed using 172 mtDNA sequences representative of the Brazilian population, and a score of 0.9794 was obtained when the 14 SNPs were used, showing that the theoretical prediction approach for the selection of highly discriminatory SNPs suggested by Salas and Amigo (2010) [1] was confirmed in the population studied. As the main goal of the work is to develop a screening assay that avoids sequencing every sample in a particular case, a pairwise comparison of the sequences was done using the selected SNPs.
When both HVS-1/HVS-2 SNPs were used in the simulations, at least two differences were observed in 93.2% of the comparisons performed. The assay was validated with casework samples. Results show that the method is straightforward and can be used for exclusionary purposes, saving time and laboratory resources. The assay confirms the theoretical prediction suggested by Salas and Amigo (2010) [1]. All forensic advantages, such as high sensitivity and power of discrimination, as well as the disadvantages, such as the occurrence of allele dropouts, are discussed throughout the article. © 2013 Elsevier B.V.
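The 0.9794 score cited above is a haplotype (gene) diversity statistic in the form popularised by Nei, h = n/(n-1) · (1 - Σp_i²). A minimal sketch with invented haplotypes, not the study's 172 Brazilian sequences:

```python
from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's haplotype diversity: h = n/(n-1) * (1 - sum of squared frequencies)."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    return n / (n - 1) * (1 - sum_p2)

# Toy example: 5 individuals typed at 3 hypothetical SNP positions.
samples = ["ACG", "ACG", "ATG", "GCG", "GTA"]
h = haplotype_diversity(samples)
```

The closer h is to 1, the more likely two randomly drawn samples carry different haplotypes, which is what makes a high-diversity SNP panel useful for exclusionary screening.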
Abstract:
The present study aims to assess goal-orientation motivation using the Task and Ego Orientation in Sport Questionnaire (TEOSQ) developed by Duda (1992) and translated, adapted and validated by Hirota and De Marco (2006), following the experimental research methodology proposed by Marconi and Lakatos (2006). The sample comprised 37 basketball practitioners aged 11 to 17 (mean age 14.02 ± 1.42 years) from the city of São Caetano do Sul, São Paulo, Brazil. SPSS version 15.0 was used to obtain Cronbach's alpha and the mean, standard deviation and median of each goal orientation (task and ego). The results show that the scale has good alpha values: 0.69 for task and 0.67 for ego. For ages 11 to 13, mean task orientation was 4.60 (±0.62) and ego orientation 3.11 (±0.84); for ages 14 to 15, mean task orientation was 4.23 (±0.78) and ego 2.74 (±1.02); for ages 16 and 17, mean task orientation was 4.78 (±0.1) and ego 2.66 (±0.47). For the group as a whole, task orientation was 4.36 (±0.75) and ego 2.83 (±0.97). We conclude that the validation process met expectations, showing consistent alpha values, which indicate that students learning basketball are sure of their actions, optimistic, persistent in their goals, and adopt a position of responsibility towards the rest of the team.
Abstract:
This article develops an integrative framework of the concept of perceived brand authenticity (PBA) and sheds light on PBA’s (1) measurement, (2) drivers, (3) consequences, as well as (4) an underlying process of its effects and (5) boundary conditions. A multi-phase scale development process resulted in a 15-item PBA scale to measure its four dimensions of credibility, integrity, symbolism, and continuity. PBA is influenced by indexical, existential, and iconic cues, whereby the latter’s influence is moderated by consumers’ level of marketing skepticism. Results also suggest that PBA drives brand choice likelihood through self-congruence for consumers high in self-authenticity.
Abstract:
Although brand authenticity is gaining increasing interest in consumer behavior research and managerial practice, literature on its measurement and contribution to branding theory is still limited. This article develops an integrative framework of the concept of brand authenticity and reports the development and validation of a scale measuring consumers' perceived brand authenticity (PBA). A multi-phase scale development process resulted in a 15-item PBA scale measuring four dimensions: credibility, integrity, symbolism, and continuity. This scale is reliable across different brands and cultural contexts. We find that brand authenticity perceptions are influenced by indexical, existential, and iconic cues, whereby some of the latter's influence is moderated by consumers' level of marketing skepticism. Results also suggest that PBA increases emotional brand attachment and word-of-mouth, and that it drives brand choice likelihood through self-congruence for consumers high in self-authenticity.
Abstract:
Providing accurate maps of coral reefs where the spatial scale and labels of the mapped features correspond to map units appropriate for examining biological and geomorphic structures and processes is a major challenge for remote sensing. The objective of this work is to assess the accuracy and relevance of the process used to derive geomorphic zone and benthic community zone maps for three western Pacific coral reefs produced from multi-scale, object-based image analysis (OBIA) of high-spatial-resolution multi-spectral images, guided by field survey data. Three Quickbird-2 multi-spectral data sets from reefs in Australia, Palau and Fiji and georeferenced field photographs were used in a multi-scale segmentation and object-based image classification to map geomorphic zones and benthic community zones. A per-pixel approach was also tested for mapping benthic community zones. Validation of the maps and comparison to past approaches indicated the multi-scale OBIA process enabled field data, operator field experience and a conceptual hierarchical model of the coral reef environment to be linked to provide output maps at geomorphic zone and benthic community scales on coral reefs. The OBIA mapping accuracies were comparable with previously published work using other methods; however, the classes mapped were matched to a predetermined set of features on the reef.
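Map accuracy assessments of the kind described above are conventionally reported from an error (confusion) matrix of reference versus mapped classes. A small sketch with invented counts and placeholder class names, not the paper's actual figures:

```python
import numpy as np

# Hypothetical counts. Rows: reference (field survey) classes;
# columns: classes assigned by the OBIA classification.
confusion = np.array([
    [48,  2,  0],   # e.g. reef flat
    [ 5, 40,  5],   # e.g. reef crest
    [ 1,  4, 45],   # e.g. lagoon
])

# Overall accuracy: correctly mapped pixels/objects over all samples.
overall_accuracy = np.trace(confusion) / confusion.sum()

# Producer's accuracy (per reference class, row-wise) and
# user's accuracy (per mapped class, column-wise).
producers = np.diag(confusion) / confusion.sum(axis=1)
users = np.diag(confusion) / confusion.sum(axis=0)
```

Producer's accuracy reflects omission error for the field-observed class; user's accuracy reflects commission error in the map, which is usually what a reef manager reading the map cares about.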
Abstract:
This paper introduces the experience of using videoconferencing and recording as a mechanism to support courses which need to be promoted or discontinued within the framework of the European convergence process. Our objective is to make these courses accessible as live streaming during the lessons, and to make the recorded lectures and associated documents available to students as soon as each lesson has finished. The technology used has been developed in our university and it is all open source. Although this is a technical project, the key is the human factor involved. The people managing the virtual sessions are students of the courses being recorded. However, they lack technical knowledge, so we had to train them in audiovisuals and enhance the usability of the videoconferencing tool and platform. The validation process is being carried out in five real scenarios at our university. During the whole period we are evaluating technical and pedagogical issues of this experience for both students and teachers, to guide the future development of the service. Depending on the final results, the lecture recording service will be made available as an educational resource for all of the teaching staff of our university.
Abstract:
The arrival of new mobile phone operating systems often leads to a flood of mobile applications rushing into the market without taking into account the needs of the most vulnerable user groups: people with disabilities. The need for accessible mobile applications is especially pressing when it comes to accessing basic functions such as making calls through a contact manager. This paper presents the technical validation process and results for an Accessible Contact Manager for mobile phones, as part of the evaluation of accessible mobile applications for people with disabilities.
Abstract:
Purpose: To provide a basis for collecting strength training data using a rigorously validated injury report form. Methods: A group of specialists designed a questionnaire of 45 items grouped into 4 dimensions. Six stages were used to assess the face, content, and criterion validity of the weight training injury report form. A 13-member panel assessed the form for face validity, and an expert panel assessed it for content and criterion validity. Panel members were consulted until consensus was reached. A yardstick developed by an expert panel using the intraclass correlation technique was used to assess the reliability of the form. Test-retest reliability was assessed with the intraclass correlation coefficient (ICC). The strength training injury report form was developed, and its face, content, and criterion validity were successfully assessed. A six-step protocol to create a yardstick was also developed to assist in the validation process. Both inter-rater and intra-rater reliability results indicated 98% agreement across three injuries. Results: The Cronbach's alpha of the questionnaire was 0.944 (p < 0.01) and the ICC of the entire questionnaire was 0.894 (p < 0.01). Conclusion: The questionnaire gathers enough psychometric properties to be considered a valid and reliable tool for recording injury data in strength training, providing researchers with a basis for future studies in this area. Key words: data collection; validation; injury prevention; strength training
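Cronbach's alpha, reported above as 0.944, is computed from the item variances and the variance of the total score: α = k/(k-1) · (1 - Σvar_i / var_total). A minimal sketch with invented Likert responses, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Invented 4-item, 6-respondent example (5-point Likert responses).
data = [[4, 5, 4, 4],
        [3, 3, 3, 4],
        [5, 5, 4, 5],
        [2, 2, 3, 2],
        [4, 4, 5, 4],
        [3, 4, 3, 3]]
alpha = cronbach_alpha(data)
```

When items covary strongly, the total-score variance dominates the sum of item variances and alpha approaches 1; values above roughly 0.9, as in the questionnaire above, indicate high internal consistency.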
Abstract:
The aim of this paper is to present the experience of using lecture recordings to support curriculum changes within the framework of the European convergence process, mainly courses that need to be promoted or discontinued. We will explain an integrated solution for recording lectures consisting of a web portal, a videoconferencing tool and an economical and easily transportable kit. The validation process was performed by recording three different courses at the Universidad Politécnica de Madrid (UPM) and using different diffusion channels, such as Moodle, an open source web portal called GlobalPlaza that supports streaming and recordings, and the YouTube UPM channel. To assess the efficiency of our solution, a formal evaluation was conducted and is also presented in this paper. The results show that lecture recordings allow teachers to support discontinued and new courses and enable students from remote areas to participate in international educational programmes, and the resulting recordings will be used as learning objects for future virtual courses.
Abstract:
Medical microbiology and virology laboratories use nucleic acid tests (NAT) to detect genomic material of infectious organisms in clinical samples. Laboratories choose to perform assembled (or in-house) NAT if commercial assays are not available or if assembled NAT are more economical or accurate. One reason commercial assays are more expensive is because extensive validation is necessary before the kit is marketed, as manufacturers must accept liability for the performance of their assays, assuming their instructions are followed. On the other hand, it is a particular laboratory's responsibility to validate an assembled NAT prior to using it for testing and reporting results on human samples. There are few published guidelines for the validation of assembled NAT. One procedure that laboratories can use to establish a validation process for an assay is detailed in this document. Before validating a method, laboratories must optimise it and then document the protocol. All instruments must be calibrated and maintained throughout the testing process. The validation process involves a series of steps including: (i) testing of dilution series of positive samples to determine the limits of detection of the assay and their linearity over concentrations to be measured in quantitative NAT; (ii) establishing the day-to-day variation of the assay's performance; (iii) evaluating the sensitivity and specificity of the assay as far as practicable, along with the extent of cross-reactivity with other genomic material; and (iv) assuring the quality of assembled assays using quality control procedures that monitor the performance of reagent batches before introducing new lots of reagent for testing.
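Step (i), the dilution series, is commonly evaluated for a quantitative NAT by regressing the instrument's cycle-threshold (Ct) values against log10 input copies: the fit's R² quantifies linearity over the measured range, and the slope yields amplification efficiency. A sketch with hypothetical numbers only:

```python
import numpy as np

# Hypothetical qPCR dilution series: input copies per reaction and observed Ct.
copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2, 1e1])
ct     = np.array([15.1, 18.5, 21.9, 25.3, 28.8, 32.4])

log_copies = np.log10(copies)
slope, intercept = np.polyfit(log_copies, ct, 1)   # standard-curve fit

# Linearity (R^2) over the measured range.
predicted = slope * log_copies + intercept
ss_res = np.sum((ct - predicted) ** 2)
ss_tot = np.sum((ct - ct.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# Amplification efficiency from the slope (a slope of about -3.32
# corresponds to 100 % efficiency, i.e. perfect doubling per cycle).
efficiency = 10 ** (-1 / slope) - 1
```

The lowest dilution still detected reproducibly approximates the assay's limit of detection; day-to-day variation (step ii) is then assessed by repeating such runs across separate days.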
Abstract:
Creative sourcing strategies, designed to extract more value from the supply base, have become a competitive, strategic differentiator. To fuel creativity, companies install sourcing teams that can capitalize on the specialized knowledge and expertise of their employees across the company. This article introduces the concept of a team creativity climate (TCC) - team members' shared perceptions of their joint policies, procedures, and practices with respect to developing creative sourcing strategies – as a means to address the unique challenges associated with a collective, cross-functional approach to develop value-enhancing sourcing strategies. Using a systematic scale development process that validates the proposed concept, the authors confirm its ability to predict sourcing team performance, and suggest some research avenues extending from this concept.
Abstract:
With the advent of peer-to-peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly in dynamically varying conditions. Yet, existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To simplify the complexity of the validation process, the developed solution maps the requirements of the application onto a geometrical space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system by ascertaining that segregating the data adaptation and prediction processes will augment the data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast-convergent adaptation process is deployed, data reduction rates are significantly improved.
Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
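As a toy illustration of validating a reading against spatially associated sources, a value that disagrees sharply with nearby sensors can be treated as corrupted and replaced by the neighbours' consensus. This simplified median filter is only a stand-in for the dissertation's richer geometric scheme; all values and the threshold are invented:

```python
import statistics

def clean_reading(value, neighbour_values, max_deviation=5.0):
    """Accept a reading if it agrees with spatial neighbours; else substitute
    the neighbours' median. A simplified sketch of spatio-temporal validation,
    not the dissertation's actual algorithm."""
    ref = statistics.median(neighbour_values)
    if abs(value - ref) > max_deviation:
        return ref      # treat as corrupted, substitute the consensus value
    return value        # plausible reading, keep as-is

# Hypothetical temperature readings (°C) from nearby sensor nodes.
neighbours = [21.8, 22.1, 22.4, 21.9]
kept = clean_reading(22.0, neighbours)       # consistent with neighbours
replaced = clean_reading(55.3, neighbours)   # outlier, replaced by median
```

A production scheme would also exploit temporal correlation (each node's own history) rather than spatial agreement alone, which is the spatio-temporal association the dissertation formalises.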
Abstract:
Several environmental, social and individual factors can influence adherence to healthy eating. Among the individual determinants, nutrition knowledge has a role to play. Indeed, many programmes promoting healthy eating rely on improving nutrition knowledge to bring about positive changes in the population's dietary behaviours and intakes. Several studies have positively associated the level of nutrition knowledge with healthy eating, but it has been observed that the associations are stronger when validated nutrition knowledge questionnaires are used. Rigorous validation of questionnaires is therefore essential to ensure the validity of the results obtained. In Canada, there is no tool specifically designed to measure nutrition knowledge, and more precisely adherence to healthy eating as presented by Canada's Food Guide (Guide alimentaire canadien, GAC), and this thesis illustrates the relevance of such an instrument for the Canadian population. The validation results yielded a valid and reliable questionnaire for the study population. The use of documents recognized in the literature for the design of the questionnaire, together with the application of several validation methods as used by other authors in the field, allowed the instrument to be properly validated. The questionnaire developed could therefore allow an adequate measurement of nutrition knowledge in a French-Canadian context.
Abstract:
Objectives: Because there is scientific evidence that an appropriate intake of dietary fibre should be part of a healthy diet, given its importance in promoting health, the present study aimed to develop and validate an instrument to evaluate the general population's knowledge about dietary fibres. Study design: The present study was a cross-sectional study. Methods: The methodological psychometric validation study was conducted with 6010 participants residing in ten countries across 3 continents. The instrument is a self-response questionnaire aimed at collecting information on knowledge about food fibres. For the exploratory factor analysis (EFA), principal component analysis with varimax orthogonal rotation and eigenvalues greater than 1 was chosen. In the confirmatory factor analysis by structural equation modelling (SEM), the covariance matrix was considered and the maximum likelihood estimation algorithm was adopted for parameter estimation. Results: Exploratory factor analysis retained two factors. The first was called Dietary Fibre and Promotion of Health (DFPH) and included 7 questions that explained 33.94% of total variance (α = 0.852). The second was named Sources of Dietary Fibre (SDF) and included 4 questions that explained 22.46% of total variance (α = 0.786). The model was tested by SEM, giving a final solution with four questions in each factor. This model showed a very good fit on practically all the indexes considered, except for the χ²/df ratio. The values of average variance extracted (0.458 and 0.483) demonstrate the existence of convergent validity; the results also prove the existence of discriminant validity of the factors (r² = 0.028); and, finally, good internal consistency was confirmed by the composite reliability values (0.854 and 0.787). Conclusions: This study validated the KADF scale, increasing the degree of confidence in the information obtained through this instrument in this and in future studies.
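The eigenvalue-greater-than-1 (Kaiser) retention rule used in the EFA above can be sketched directly from the item correlation matrix. The response matrix here is random placeholder data, so the retained-factor count is illustrative only:

```python
import numpy as np

# Hypothetical response matrix: rows = respondents, columns = questionnaire
# items (5-point scale); random placeholder data, not the study's responses.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, 8)).astype(float)

# Eigenvalues of the item correlation matrix. The Kaiser criterion retains
# components whose eigenvalue exceeds 1, i.e. those explaining more variance
# than a single standardised item.
corr = np.corrcoef(responses, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]   # sort into descending order
n_factors = int(np.sum(eigenvalues > 1))
```

Because the correlation matrix has ones on its diagonal, the eigenvalues always sum to the number of items, so each eigenvalue can be read as "items' worth" of explained variance (e.g. the 33.94% reported above corresponds to the first factor's eigenvalue divided by the item count).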