991 results for Program validation
Abstract:
Our objective was to validate a new device dedicated to measuring the light disturbances surrounding bright sources of light under different sources of potential variability. Twenty subjects were involved in the study. Light distortion was measured using an experimental prototype (light distortion analyzer, CEORLab, University of Minho, Portugal) comprising a panel of twenty-four LED arrays at 2 m. Sources of variability included: intrasession and intersession repeated measures, pupil size (3 versus 6 mm), defocus (+0.50) correction for the working distance, angular resolution (15 deg versus 30 deg), and temporal stimuli presentation. Size, shape, location, and irregularity parameters were obtained. At a low speed of stimulus presentation, changes in angular resolution had no effect on the parameters measured, and results did not change with pupil size. The intensity of the central glare source significantly influenced the outcomes. Examination time was reduced by 30% when a 30 deg angular resolution was explored instead of 15 deg. Measurements were fast and repeatable under the same experimental conditions. Size and shape parameters showed the highest consistency, whereas location and irregularity parameters showed lower consistency. The system was sensitive to changes in the intensity of the central glare source but not to pupil changes in this sample of healthy subjects.
Abstract:
Conventional methods of gene prediction rely on the recognition of DNA-sequence signals, the coding potential or the comparison of a genomic sequence with a cDNA, EST, or protein database. Reasons for limited accuracy in many circumstances are species-specific training and the incompleteness of reference databases. Lately, comparative genome analysis has attracted increasing attention. Several analysis tools that are based on human/mouse comparisons are already available. Here, we present a program for the prediction of protein-coding genes, termed SGP-1 (Syntenic Gene Prediction), which is based on the similarity of homologous genomic sequences. In contrast to most existing tools, the accuracy of SGP-1 depends little on species-specific properties such as codon usage or the nucleotide distribution. SGP-1 may therefore be applied to nonstandard model organisms in vertebrates as well as in plants, without the need for extensive parameter training. In addition to predicting genes in large-scale genomic sequences, the program may be useful to validate gene structure annotations from databases. To this end, SGP-1 output also contains comparisons between predicted and annotated gene structures in HTML format. The program can be accessed via a Web server at http://soft.ice.mpg.de/sgp-1. The source code, written in ANSI C, is available on request from the authors.
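Since SGP-1's output compares predicted and annotated gene structures, such comparisons can be summarized with the standard exon-level accuracy measures. The sketch below is generic gene-prediction bookkeeping on invented coordinates, not SGP-1's own code:

```python
def exon_level_accuracy(predicted, annotated):
    """Exon-level sensitivity and specificity for one sequence.

    Exons are (start, end) pairs; an exon counts as correct only when
    both boundaries match exactly, the usual exon-level criterion.
    """
    pred, annot = set(predicted), set(annotated)
    correct = len(pred & annot)
    sensitivity = correct / len(annot)   # fraction of real exons found
    specificity = correct / len(pred)    # fraction of predictions correct
    return sensitivity, specificity

# Hypothetical coordinates: one boundary error, one spurious exon
annotated = [(100, 180), (300, 420), (600, 690)]
predicted = [(100, 180), (310, 420), (600, 690), (800, 850)]
print(exon_level_accuracy(predicted, annotated))
```

A comparison tool would report these two numbers per gene, flagging annotations that disagree with the cross-species prediction.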
Abstract:
This exploratory, descriptive, cross-sectional, quantitative study aimed to develop and validate an index of family vulnerability to disability and dependence (FVI-DD). The study adapted the Family Development Index, adding social and health indicators of disability and dependence. The instrument was applied to 248 families in the city of Sao Paulo, followed by exploratory factor analysis. Factor validation was performed by assessing concurrent and discriminant validity against the Lawton scale and the Katz Index. The significance level adopted for the study was p < 0.05. The final vulnerability index comprised 50 questions classified into seven factors covering social and health dimensions, and it exhibited good internal consistency (Cronbach's alpha = 0.82). The FVI-DD was validated against both the Lawton scale and the Katz Index. We conclude that the FVI-DD can accurately and reliably assess family vulnerability to disability and dependence.
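The internal consistency reported above (Cronbach's alpha = 0.82) is computable directly from the raw item responses; a minimal sketch on made-up questionnaire data:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    `items` holds k columns, each listing the n respondents' scores
    on one questionnaire item.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Toy data: three items answered by five respondents
items = [
    [3, 4, 3, 5, 4],
    [3, 5, 3, 4, 4],
    [2, 4, 3, 5, 5],
]
print(round(cronbach_alpha(items), 2))  # 0.86
```

Values near or above 0.8, as in the study, are conventionally read as good internal consistency.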
Abstract:
BACKGROUND: The hospital readmission rate has been proposed as an important outcome indicator computable from routine statistics. However, most commonly used measures raise conceptual issues. OBJECTIVES: We sought to evaluate the usefulness of a computerized algorithm for identifying avoidable readmissions on the basis of minimum bias, criterion validity, and measurement precision. RESEARCH DESIGN AND SUBJECTS: A total of 131,809 hospitalizations of patients discharged alive from 49 hospitals were used to compare the predictive performance of risk adjustment methods. A random sample of 570 medical records of discharge/readmission pairs in 12 hospitals was reviewed to estimate the predictive value of the screening of potentially avoidable readmissions. MEASURES: Potentially avoidable readmissions, defined as readmissions related to a condition of the previous hospitalization, not expected as part of a program of care, and occurring within 30 days after the previous discharge, were identified by a computerized algorithm. Unavoidable readmissions were treated as censored events. RESULTS: A total of 5.2% of hospitalizations were followed by a potentially avoidable readmission, 17% of them in a different hospital. The predictive value of the screen was 78%; 27% of screened readmissions were judged clearly avoidable. The correlations between the hospital rate of clearly avoidable readmissions and the all-readmissions rate, the potentially avoidable readmissions rate, and the ratio of observed to expected readmissions were 0.42, 0.56, and 0.66, respectively. Adjustment models using clinical information performed better. CONCLUSION: Adjusted rates of potentially avoidable readmissions are scientifically sound enough to warrant their inclusion in hospital quality surveillance.
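The screening rule (readmission related to a condition of the index stay, unplanned, within 30 days of discharge) can be sketched as a small classifier. The field names and the first-character diagnosis-relatedness test below are illustrative assumptions, not the published algorithm's clinical logic:

```python
from datetime import date

def is_potentially_avoidable(discharge_dx, discharge_date,
                             readmit_dx, readmit_date,
                             planned=False):
    """Screen one discharge/readmission pair.

    Hypothetical simplification: 'related' means the readmission shares
    a diagnosis chapter (here, the first ICD character) with the index
    stay; the real algorithm uses much richer clinical groupings.
    """
    within_30_days = 0 <= (readmit_date - discharge_date).days <= 30
    related = any(r[0] == d[0] for r in readmit_dx for d in discharge_dx)
    return within_30_days and related and not planned

flag = is_potentially_avoidable(
    discharge_dx=["I50.0"], discharge_date=date(2023, 3, 1),
    readmit_dx=["I21.4"], readmit_date=date(2023, 3, 15))
print(flag)  # True: related cardiac condition, 14 days later, unplanned
```

Readmissions failing any of the three conditions would be treated as unavoidable (censored) in the study's framework.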
Abstract:
In most pathology laboratories worldwide, formalin-fixed, paraffin-embedded (FFPE) samples are the only tissue specimens available for routine diagnostics. Although commercial kits for diagnostic molecular pathology testing are becoming available, most current diagnostic tests are laboratory-based assays. Thus, there is a need for standardized procedures in molecular pathology, starting with the extraction of nucleic acids. To evaluate current methods for extracting nucleic acids from FFPE tissues, 13 European laboratories participating in the European FP6 program IMPACTS (www.impactsnetwork.eu) isolated nucleic acids from four diagnostic FFPE tissues using their routine methods, followed by quality assessment. The DNA-extraction protocols ranged from homemade protocols to commercial kits. Except for one homemade protocol, the majority gave comparable results in terms of the quality of the extracted DNA, measured as the ability to amplify control gene fragments of different sizes by PCR. For array applications or tests that require an accurately determined DNA input, we recommend using silica-based adsorption columns for DNA recovery. For RNA extractions, the best results were obtained using chromatography-column-based commercial kits, which yielded the highest quantity of assayable RNA of the best quality. Quality testing using RT-PCR gave successful amplification of 200-250 bp PCR products from most tested tissues. Modifying the proteinase K digestion time led to better results, even when commercial kits were used. The results of the study emphasize the need for quality control of nucleic acid extracts with standardized methods to prevent false-negative results and to allow data comparison among different diagnostic laboratories.
Abstract:
Most warning systems for plant disease control are based on weather models dependent on the relationships between leaf wetness duration and mean air temperature in this period, considering the target disease intensity. For the development of a warning system to control grapevine downy mildew, the equation generated by Lalancette et al. (7) was used. This equation was employed to elaborate a critical-period table and to program a computerized device which records, through electronic sensors, leaf wetness duration and mean temperature in this period, and automatically calculates the daily probability of infection occurrence. The system was validated at Embrapa Uva e Vinho, in Bento Gonçalves - RS, during the growing seasons 2000/01, 2002/03 and 2003/04, using the grape cultivar Isabel. The conventional system used by local growers was compared with the new warning system, using different cumulative daily disease severity values (CDDSV) as the criterion to schedule fungicide application and reapplication. In experiments conducted in 2003/04, a CDDSV of 12-14 showed promise for scheduling the first spraying and the interval between fungicide applications, reducing the number of applications by 37.5% while maintaining the same control efficiency in leaves and bunches as the conventional system.
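The spray-scheduling criterion can be sketched as an accumulator that triggers an application once cumulative daily disease severity reaches the chosen threshold (12-14 in the 2003/04 trials). The daily severity inputs below stand in for the output of the Lalancette et al. equation, which is not reproduced here:

```python
def spray_days(daily_severity, threshold=14):
    """Return the days on which a fungicide application is triggered.

    `daily_severity` holds one disease-severity value per day (as the
    device would compute from leaf wetness duration and temperature).
    The accumulator resets after each triggered application.
    """
    days, total = [], 0.0
    for day, severity in enumerate(daily_severity, start=1):
        total += severity
        if total >= threshold:
            days.append(day)
            total = 0.0  # restart accumulation until the next spray
    return days

# Toy season: severity accumulates fastest during wet spells
print(spray_days([2, 3, 1, 5, 6, 4, 0, 1, 7, 8], threshold=14))  # [5, 10]
```

Under dry conditions the threshold is reached slowly and sprays are spaced out, which is how the criterion cut applications by 37.5%.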
Abstract:
In 1995, a pioneering MD-PhD program was initiated in Brazil for the training of medical scientists in experimental sciences at the Federal University of Rio de Janeiro. The program’s aim was achieved with respect to publication of theses in the form of papers with international visibility and also in terms of fostering the scientific careers of the graduates. The expansion of this type of program is one of the strategies for improving the preparation of biomedical researchers in Brazil. A noteworthy absence of interest in carrying out clinical research limits the ability of young Brazilian physicians to solve biomedical problems. To understand the students’ views of science, we used qualitative and quantitative triangulation methods, as well as participant observation to evaluate the students’ concepts of science and common sense. Subjective aspects were clearly less evident in their concepts of science. There was a strong concern about "methodology", "truth" and "usefulness". "Intuition", "creativity" and "curiosity" were the least mentioned thematic categories. Students recognized the value of intuition when it appeared as an explicit option but they did not refer to it spontaneously. Common sense was associated with "consensus", "opinion" and ideas that "require scientific validation". Such observations indicate that MD-PhD students share with their senior academic colleagues the same reluctance to consider common sense as a valid adjunct for the solution of scientific problems. Overcoming this difficulty may be an important step toward stimulating the interest of physicians in pursuing experimental research.
Abstract:
A method for the determination of organohalogen pesticides in strawberry by gas chromatography with electron capture detection was validated and applied in a monitoring program. Linearity, matrix effects, and day effects were evaluated for the analytes alpha-endosulfan, beta-endosulfan, endosulfan sulphate, lambda-cyhalothrin, procymidone, and trifluralin. The linear range varied according to the chromatographic response of the analyte. Significant matrix effects were observed. Mean recoveries ranged from 74.6 to 115.4%, with repeatability standard deviations between 1.6 and 21.0% and intermediate precision between 5.9 and 21.0%. The detection limit, quantification limit, decision limit, and detection capacity ranged from 0.003 to 0.007 mg/kg; 0.005 to 0.013 mg/kg; 0.003 to 3.128 mg/kg; and 0.005 to 3.266 mg/kg, respectively. The method was fit for the purpose of monitoring organohalogen residues in strawberries. Residues of these pesticides were detected in 124 of the 186 samples analyzed between 2009 and 2011 in the state of Minas Gerais. Nine samples did not comply with current legislation: seven (3.8%) had residues of a pesticide not authorized for strawberry crops, one (0.5%) had residues above the maximum residue limit, and one (0.5%) exhibited both non-conformities.
Abstract:
Validation of an Ice Skating Protocol to Predict Aerobic Power in Hockey Players. In assessing the physiological capacity of ice hockey players, researchers have often reported the outcomes of different anaerobic skate tests and the general physical fitness of participants. However, with respect to measuring the aerobic power of ice hockey players, few studies have reported a sport-specific protocol, and there is currently a lack of cohort-specific information describing aerobic power based on evaluations using an on-ice protocol. The Faught Aerobic Skating Test (FAST) uses an on-ice continuous skating protocol to place a physical stress on a participant's aerobic energy system. The FAST incorporates the principle of increasing workloads at measured time intervals during continuous skating exercise. Regression analysis was used to estimate aerobic power within gender and age level. Data were collected on 532 hockey players (males = 384, females = 148) ranging in age from 9 to 25 years. Participants completed a laboratory test to measure aerobic power using a modified Bruce protocol, and the on-ice FAST. Regression equations were developed separately for six age-specific male and female cohorts. The most consistent predictors were weight and the final stage completed on the FAST. These results support the application of the FAST to estimate aerobic power among hockey players, with R² values ranging from 0.174 to 0.396 and SEE ranging from 5.65 to 8.58 ml·kg⁻¹·min⁻¹ depending on the cohort. We therefore conclude the FAST to be an accurate predictor of aerobic power in age- and gender-specific hockey-playing cohorts.
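At prediction time, a regression estimate of this kind reduces to a linear combination of the measured predictors. The coefficients below are invented placeholders for illustration only; the cohort-specific FAST equations are not given in the abstract:

```python
def estimate_vo2max(weight_kg, final_stage, intercept=55.0,
                    b_weight=-0.15, b_stage=1.8):
    """Linear estimate of aerobic power (ml·kg⁻¹·min⁻¹).

    The predictors mirror those the abstract reports as most
    consistent (body weight and final FAST stage completed); the
    coefficient values are made up for this example.
    """
    return intercept + b_weight * weight_kg + b_stage * final_stage

print(round(estimate_vo2max(weight_kg=70, final_stage=9), 1))  # 60.7
```

In practice, one such equation would be fitted per age/sex cohort, matching the six cohort-specific models the study developed.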
Abstract:
The current classification system for spinal cord injury (SCI) considers only somatic information and neglects autonomic damage after injury. Heart rate variability (HRV) has the potential to be a valuable measure of cardiac autonomic control after SCI. Five individuals with tetraplegia and four able-bodied controls underwent 1-min continuous ECG recordings during rest, after Metoprolol administration (max dose = 3 × 5 mg), and after Atropine administration (0.02 mg/kg), in both supine and 40° head-up tilt positions. After Metoprolol administration there was a 61.8% decrease in the LF:HF ratio in the SCI participants, suggesting that the LF:HF ratio reflects cardiac sympathetic outflow. After Atropine administration there was a 99.1% decrease in HF power in the SCI participants, suggesting that HF power is highly representative of cardiac parasympathetic outflow. There were no significant differences between the SCI and able-bodied participants. Thus, HRV measures are a valid index of cardiac autonomic control after SCI.
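The LF:HF ratio behind these findings is a ratio of spectral band powers. This pure-Python sketch applies a direct DFT to a synthetic, evenly resampled RR series; real HRV pipelines interpolate the true R-R intervals and typically use Welch-type estimators:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Sum of DFT power in [f_lo, f_hi) Hz for an evenly sampled signal."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [x - mean for x in signal]
    total = 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n < f_hi:
            re = sum(centered[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(centered[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / n
    return total

def lf_hf_ratio(rr_signal, fs=4.0):
    """LF (0.04-0.15 Hz) over HF (0.15-0.40 Hz) power of an evenly
    resampled RR-interval series."""
    lf = band_power(rr_signal, fs, 0.04, 0.15)
    hf = band_power(rr_signal, fs, 0.15, 0.40)
    return lf / hf

# Synthetic 60 s series: a 0.1 Hz (LF) wave plus a weaker 0.3 Hz (HF) wave
fs, seconds = 4.0, 60
rr = [0.8 + 0.05 * math.sin(2 * math.pi * 0.1 * t / fs)
          + 0.02 * math.sin(2 * math.pi * 0.3 * t / fs)
      for t in range(int(fs * seconds))]
print(lf_hf_ratio(rr, fs) > 1)  # True: the LF component dominates
```

A beta-blocker such as Metoprolol would damp the LF component and pull this ratio down, mirroring the 61.8% decrease the study observed.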
Abstract:
Proactive and behavioral classroom interventions are empirically recognized for their effectiveness in improving the behavior and academic performance of children with ADHD (DuPaul & Eckert, 1997; Hoza, Kaiser, & Hurt, 2008; Pelham & Fabiano, 2008; Zentall, 2005). However, the gap between evidence-based interventions and those found in general education settings underscores the importance of replicating results obtained in controlled environments within a realistic delivery format. The main objective of this thesis is to develop and evaluate an individual consultation program (PCI) based on a problem-solving and functional-assessment approach, to support elementary school teachers in the coherent planning and implementation of the interventions favored for helping children with ADHD. First, a review of the main intervention modalities for children with ADHD is conducted to identify the interventions to include in developing the program. Next, solutions promoting the transfer of evidence-based interventions to the regular classroom are detailed through the proposed PCI, delivered between a psychosocial practitioner and the teacher. Finally, an evaluation of the PCI with thirty-seven child-teacher pairs is presented. All children have an ADHD diagnosis and take medication (M). The parents of some children participated in a parent training program (PEHP). The final sample is: M (n = 4), M and PEHP (n = 11), M and PCI (n = 11), and M, PEHP, and PCI (n = 11). The results confirm the effectiveness of the PCI, beyond M and M + PEHP, in preventing the worsening of inappropriate behaviors and improving the academic performance of children with ADHD.
Moreover, teachers' use of effective strategies increases when they have both participated in the PCI and received in-service training on ADHD. The clinical implications of the intervention for children with ADHD and their regular classroom teachers are discussed.
Abstract:
Confounding is a major challenge in observational studies, especially when it is induced by characteristics that are difficult or even impossible to measure in administrative healthcare databases. One confounding bias often present in pharmacoepidemiologic studies is prescription channeling, which occurs when the choice of treatment depends on the patient's health status and/or prior experience with various therapeutic options. Among the methods for controlling this bias is the comorbidity score, which characterizes a patient's health status from dispensed medications or from medical diagnoses reported in physician billing data. The performance of comorbidity scores is controversial, however, because it appears to vary substantially with the population of interest. The objectives of this thesis were to develop, validate, and compare the performance of two comorbidity scores (one predicting death, the other predicting institutionalization), built from the pharmaceutical services databases of the Régie de l'assurance-maladie du Québec (RAMQ) for use in the elderly population. The thesis also aimed to determine whether including characteristics that are unreported, or poorly recorded, in administrative databases (sociodemographic characteristics, mental disorders, or sleep disorders) improves the performance of comorbidity scores in the elderly population. A nested case-control study was conducted. The source cohort consisted of a random sample of 87,389 community-dwelling elderly people, split into a development cohort (n = 61,172; 70%) and a validation cohort (n = 26,217; 30%). Data were obtained from the RAMQ databases.
To be included in the study, subjects had to be aged 66 years or older and be members of Quebec's public drug insurance plan between January 1, 2000 and December 31, 2009. The scores were developed using the Framingham Heart Study method, and their performance was evaluated with the c-statistic and the area under the receiver operating characteristic curves. For the last objective, documenting the impact of adding variables that are unmeasured or poorly recorded in the databases to the comorbidity score, a prospective cohort study (2005-2008) was conducted. The study population and data came from the Étude sur la Santé des Aînés (n = 1,494). The variables of interest included marital status, social support, and the presence of mental health and sleep disorders. As described in article 1, the Geriatric Comorbidity Score (GCS), based on death, was developed and showed good performance (c-statistic = 0.75; 95% CI 0.73-0.78). This performance proved superior to that of the Chronic Disease Score (CDS) when applied to the study population (CDS c-statistic: 0.47; 95% CI 0.45-0.49). An exhaustive literature review showed that the factors associated with death differ greatly from those associated with institutionalization, justifying the development of a specific score to predict the risk of institutionalization. The performance of the latter was not statistically different from that of the death score (institutionalization c-statistic: 0.79; 95% CI 0.77-0.81). Including variables not reported in administrative databases improved the performance of the death score by only 11%, with marital status and social support contributing most of the observed improvement. In conclusion, this thesis makes three major contributions.
First, it was shown that the performance of death-based comorbidity scores depends on the target population, hence the value of the Geriatric Comorbidity Score, developed for the community-dwelling elderly population. Second, the medications associated with the risk of institutionalization differ from those associated with the risk of death in the elderly population, justifying the development of two distinct scores; the two scores nevertheless perform similarly. Finally, the results indicate that, in the elderly population, the absence of certain characteristics does not substantially compromise the performance of comorbidity scores derived from prescription claims databases. Comorbidity scores therefore remain an important research tool for observational studies.
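The c-statistic used to evaluate these scores is the probability that a randomly chosen case (e.g. a decedent) receives a higher score than a randomly chosen non-case, with ties counting half; a minimal pure-Python sketch on made-up scores:

```python
def c_statistic(case_scores, control_scores):
    """Concordance probability: P(case score > control score),
    counting ties as one half. Equivalent to the area under the
    ROC curve for a binary outcome."""
    pairs = concordant = 0.0
    for c in case_scores:
        for nc in control_scores:
            pairs += 1
            if c > nc:
                concordant += 1
            elif c == nc:
                concordant += 0.5
    return concordant / pairs

cases = [4, 6, 7, 7, 9]     # comorbidity scores of subjects who died
controls = [2, 3, 4, 6, 8]  # scores of survivors
print(c_statistic(cases, controls))  # 0.76
```

A value of 0.5 means the score discriminates no better than chance; the GCS's 0.75 sits in the range usually described as good discrimination.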
Abstract:
Pain is a multidimensional experience with sensory, emotional, and cognitive components. In theory, behavioral, physiological, neurophysiological, and sensory measurement methods can quantify pain, but few studies have examined the validation of the measures used in veterinary medicine. This research, combining Master's and doctoral work, addresses in part the validity of such methods. To this end, we studied the validity of common behavioral, physiological, and neurophysiological methods for measuring pain by comparing pain expression (in cattle and dogs) between control animals and animals under preventive analgesia or curative treatment, following pain induced by surgery (bovine visceral and canine orthopedic pain models) or caused by natural disease (canine osteoarthritis). A first study compared pain measures between cows in a placebo group and cows under postoperative analgesia over 21 days following the induction of chronic visceral pain. Cows in the placebo group showed greater pain sensitivity; decreases in noradrenaline and transthyretin measured in cerebrospinal fluid; decreased motor activity (MA) (lower than in the analgesia groups) and agitation recorded by video analysis; and increased stress as measured by electrodermal activity (EDA). The methods of interest identified were spinal markers and the measurement of sensitization, of behaviors by video analysis, and of MA by biotelemetry.
Using similar methods, two experimental orthopedic pain studies were conducted to compare pain responses between dogs treated with preventive analgesia (opioids and anti-inflammatories, study #2) or a bisphosphonate (tiludronate, study #3) and control dogs. Only the pain scales differed between the studies. In study #2, dogs under analgesia showed lower pain scores measured with the 4A-VET pain scale, together with a low EDA response one hour after trochleoplasty surgery. The frequency of the spontaneous behavior 'walking with full weight-bearing on the operated limb', measured by video analysis, increased in dogs under preventive analgesia 24 hours after surgery. Study #3 mainly demonstrated the appearance of central sensitization (detected both by quantitative sensory testing and by spinal markers) in control dogs 56 days after surgical induction of osteoarthritis. Dogs treated with tiludronate showed differences in cerebrospinal substance P and transthyretin, decreased peripheral sensitization, more weight-bearing on the operated limb during walking as measured by peak vertical force (PVF), and an increased frequency of 'walking with full weight-bearing on the operated limb'. Central sensitization was associated with decreased PVF and with increases in EDA and in this spontaneous behavior. In study #4, the validity and sensitivity of the methods were evaluated in dogs with naturally occurring osteoarthritis treated with a diet enriched with green-lipped mussel, a product with expected anti-inflammatory and chondroprotective effects.
Treated dogs showed decreased pain scores on the CSOM scale, increased PVF, and increased MA. Overall, the results confirm that video analysis assessed pain objectively across different pain models and that spinal markers are promising. PVF was specific to orthopedic pain. Sensitization was present in pathological pain. EDA was not valid for measuring pain. Decreased MA may indicate pain behavior. The studies were exploratory with respect to the pain scales, given their early stage of development and the lack of information on the metrological qualities of these measures.
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals; this can significantly improve software quality and is still a challenging field.
This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults and optimize the code, thereby improving both the debugging process and code quality. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of the compiler/assembler and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices toward correct use of difficult microcontroller features when developing embedded systems.
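The redundant bank-switch detection can be illustrated by tracking the active-bank state along one execution path and flagging bank-select instructions that re-select the current bank. This toy abstracts the PIC16F87X's RP0/RP1 bank-select bits into a single bank number and ignores control-flow joins, which the full analysis handles on the control flow graph:

```python
def redundant_bank_switches(instructions):
    """Indices of bank-select instructions that re-select the
    currently active bank along one straight-line execution path.

    Each instruction is ('BANKSEL', n) or ('OP', text); the real
    analysis walks every path of the control flow graph.
    """
    active = None  # bank state unknown at path entry
    redundant = []
    for i, (kind, arg) in enumerate(instructions):
        if kind == 'BANKSEL':
            if arg == active:
                redundant.append(i)  # state transition is a self-loop
            active = arg
    return redundant

path = [('BANKSEL', 1), ('OP', 'movwf TRISB'),
        ('BANKSEL', 1), ('OP', 'movwf TRISC'),
        ('BANKSEL', 0), ('OP', 'movwf PORTB')]
print(redundant_bank_switches(path))  # [2]
```

Removing the flagged instruction is safe on this path because the bank state it establishes already holds, which is exactly the self-loop condition the state transition diagram encodes.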
Abstract:
Motivation: DNA assembly programs classically perform an all-against-all comparison of reads to identify overlaps, followed by a multiple sequence alignment and generation of a consensus sequence. If the aim is to assemble a particular segment, instead of a whole genome or transcriptome, a target-specific assembly is a more sensible approach. GenSeed is a Perl program that implements a seed-driven recursive assembly consisting of cycles comprising a similarity search, read selection and assembly. The iterative process results in a progressive extension of the original seed sequence. GenSeed was tested and validated on many applications, including the reconstruction of nuclear genes or segments, full-length transcripts, and extrachromosomal genomes. The robustness of the method was confirmed through the use of a variety of DNA and protein seeds, including short sequences derived from SAGE and proteome projects.
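The seed-driven cycle (similarity search, read selection, assembly, extended seed) can be illustrated with a toy greedy extender that repeatedly merges reads overlapping the contig's ends; GenSeed itself delegates these steps to external search and assembly programs, and this sketch assumes non-repetitive, error-free reads:

```python
def extend_seed(seed, reads, min_overlap=4):
    """Greedily extend a seed sequence with overlapping reads.

    Each cycle selects one read whose prefix overlaps the contig's
    3' end (or whose suffix overlaps the 5' end) and merges it in;
    the loop repeats until no remaining read extends the contig.
    """
    contig = seed
    pending = list(reads)
    grown = True
    while grown and pending:
        grown = False
        for read in pending:
            for k in range(len(read) - 1, min_overlap - 1, -1):
                if contig.endswith(read[:k]):
                    contig += read[k:]           # extend the 3' end
                    grown = True
                elif contig.startswith(read[-k:]):
                    contig = read[:-k] + contig  # extend the 5' end
                    grown = True
                if grown:
                    pending.remove(read)  # each read is used once
                    break
            if grown:
                break
    return contig

reads = ["GATTACAG", "ACAGGTTC", "TTGAGATT"]
print(extend_seed("GATTACA", reads))  # TTGAGATTACAGGTTC
```

Each pass of the while loop corresponds to one GenSeed iteration: the freshly extended contig becomes the seed for the next round of read selection.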