Abstract:
The public primary school system in the State of Geneva, Switzerland, is characterized by centrally evaluated pupil performance measured with standardized tests. As a result, consistent data are collected across the system. The 2010-2011 dataset is used to develop a two-stage data envelopment analysis (DEA) of school efficiency. In the first stage, DEA is employed to calculate an individual efficiency score for each school. It shows that, on average, each school could reduce its inputs by 7% whilst maintaining the same quality of pupil performance. The cause of inefficiency lies in perfectible management. In the second stage, efficiency is regressed on school characteristics and environmental variables, i.e. external factors outside of the control of headteachers. The model is tested for multicollinearity, heteroskedasticity and endogeneity. Four variables are identified as statistically significant. School efficiency is negatively influenced by (1) the provision of special education, (2) the proportion of disadvantaged pupils enrolled at the school and (3) operations being held on multiple sites, but positively influenced by school size (captured by the number of pupils). The proportion of allophone pupils, location in an urban area and the provision of reception classes for immigrant pupils are not significant. Although the significant variables influencing school efficiency are outside of the control of headteachers, it is still possible to either boost the positive impact or curb the negative impact.
In the canton of Geneva (Switzerland), public primary schools are characterized by funding provided by the public authorities (canton and municipalities) and by the assessment of pupils through standardized tests at three distinct points in their schooling. This makes it possible to gather consistent statistical information. The 2010-2011 dataset is used in a two-stage analysis of school efficiency. In a first stage, data envelopment analysis (DEA) is used to compute an efficiency score for each school. This analysis shows that the average efficiency of the schools amounts to 93%. Each school could, on average, reduce its resources by 7% while keeping pupils' results on the standardized tests constant. The source of the inefficiency lies in perfectible school management. In a second stage, the efficiency scores are regressed on school characteristics and on environmental variables. These variables are not under the control (or influence) of headteachers. The model is tested for multicollinearity, heteroskedasticity and endogeneity. Four variables are statistically significant. School efficiency is negatively influenced by (1) the provision of special education in separate classes, (2) the proportion of disadvantaged pupils and (3) operating on several different sites. School efficiency is positively influenced by school size, measured by the number of pupils. The proportion of allophone pupils, location in an urban area and the provision of reception classes for immigrant pupils are all non-significant variables. The fact that the variables influencing school efficiency are not under the control of headteachers does not mean that fatalism is warranted. Several avenues are proposed to either curb the negative impact or capitalize on the positive impact of the significant variables.
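The first-stage computation described above, an input-oriented efficiency score for each school, can be sketched as one linear program per decision-making unit. This is a minimal constant-returns (CCR) formulation and not necessarily the exact specification used in the study:

```python
# Minimal input-oriented CCR DEA sketch: the efficiency of a unit is the
# smallest theta such that a non-negative combination of peers uses at most
# theta times its inputs while producing at least its outputs.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y):
    """X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns theta per DMU."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1 .. lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0                         # minimise theta
        A_ub = np.zeros((m + s, 1 + n))
        b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -X[o]                # sum(lam_j * x_j) <= theta * x_o
        A_ub[:m, 1:] = X.T
        A_ub[m:, 1:] = -Y.T                # sum(lam_j * y_j) >= y_o
        b_ub[m:] = -Y[o]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
        scores.append(res.x[0])
    return np.array(scores)
```

In this toy setting, a school whose input use is dominated by a peer with equal outputs receives a score below 1, mirroring the "reduce inputs by 7% at constant performance" reading of an average score of 93%.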
Abstract:
Early studies in patients with systemic lupus erythematosus (SLE) reported increased incidence of tuberculosis. The tuberculin skin test (TST) is the technique of choice to detect latent tuberculosis infection (LTBI) but has several limitations. OBJECTIVES We compared TST and the newer T.SPOT.TB test to diagnose LTBI in SLE patients. METHODS In this observational cohort study conducted between August 2009 and February 2012, we recruited 92 patients from those attending the SLE clinic of our university hospital. Data recorded were epidemiological and sociodemographic characteristics. Laboratory analyses included TST and T.SPOT.TB tests. RESULTS Of the patients studied, 92% were women with an average age of 42.7 years. Overall, the degree of correlation between the two tests was low (Kappa index = 0.324) but was better in patients not receiving corticosteroids (CTC)/immunosuppressive (IS) therapy (Kappa = 0.436) and in those receiving hydroxychloroquine (Kappa = 0.473). While TST results were adversely affected by those receiving CTC and/or IS drugs (P = 0.021), the T.SPOT.TB results were not. CONCLUSION Although the TST test remains a useful tool for diagnosing LTBI in SLE patients, the T.SPOT.TB test is perhaps better employed when the patient is receiving CTC and/or IS drugs.
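The Kappa indices reported above measure agreement between the two tests beyond chance. A minimal sketch of Cohen's kappa for paired binary results follows; the function name and the data in the test are invented, not the study's:

```python
# Cohen's kappa for agreement between two binary diagnostic tests
# (e.g. TST vs T.SPOT.TB), computed from paired 0/1 results.
def cohens_kappa(a, b):
    """a, b: equal-length sequences of 0/1 results from two tests."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa = sum(a) / n                              # positive rate, test A
    pb = sum(b) / n                              # positive rate, test B
    pe = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    return (po - pe) / (1 - pe)
```

A kappa near 0.32, as in the study, indicates only fair agreement; values above roughly 0.4 are conventionally read as moderate.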
Abstract:
Objectives: We are interested in the numerical simulation of the anastomotic region comprised between the outflow cannula of the LVAD and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise of the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create suitable grids for numerical simulations. Methods: Preliminary treatment of images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Then, watershed segmentation algorithms and mathematical morphology filters allow reconstructing the patient geometry. This is done using the InsightToolKit library (www.itk.org). Finally, the Vascular Modeling ToolKit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify a priori the boundary layers. The method was tested on five different patients with left ventricular assistance who underwent a CT-scan exam. Results: This method produced good results in four patients. The anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution. Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that have been generated are suitable for simulations that take fluid-structure interactions into account. Finally, the presented method features good reproducibility and fast application.
Abstract:
Lab tests are frequently used in primary care to guide patient care. This is particularly the case when a severe disorder, or one that will affect patients' initial care, needs to be excluded rapidly. At the PMU-FLON walk-in clinic the use of HIV testing as recommended by the Swiss Office of Public Health was hampered by the delay in obtaining test results. This led us to introduce rapid HIV testing which provides results within 30 minutes. Following the first 250 tests the authors discuss the results as well as the benefits of rapid HIV testing in an urban walk-in clinic.
Abstract:
Introduction: Patients with cystic fibrosis (CF) are more susceptible to pathogens like P. aeruginosa (PA). PA primo-infections require particular attention because, if eradication fails, lung deterioration accelerates. The main aim of this study is to assess the rate of PA eradication under our particular protocol of inhaled tobramycin and oral ciprofloxacin, as there is no consensus in the literature on which eradication protocol is best. Methods: Retrospective single-centre study with data analysis from June 1st 2007 to June 1st 2011 of patients whose primo-infections were exclusively treated with 3 x 28 days of inhaled tobramycin, plus oral ciprofloxacin for the first and last 21 days. Successful eradication is defined as ≥ 3 negative bacteriological cultures during the 6 months after the beginning of the protocol. If ≥ 1 culture is positive, we consider the eradication a failure. Results: Out of 41 patients, 18 were included in our analysis. Seven girls (38.9%) and 11 boys (61.1%) followed the eradication protocol. Boys had 12 primo-infections and girls had 8. Among these 20 primo-infections, 16 (80%) ended in successful eradication and 4 (20%) in failure. There was no statistically significant difference between the success and failure groups in age (t = 0.07, p = 0.94), FEV1% (t = 0.96, p = 0.41) or BMI (t = 1.35, p = 0.27). The success rate was 100% for girls and 66.6% for boys. Conclusion: Our protocol achieved an overall eradication rate of 80%, without a statistically significant difference in FEV1% or BMI values between success and failure groups. However, there is a sex difference in eradication rates between girls (100%) and boys (66.6%). Such a sex difference has not yet been reported in the literature and should be evaluated in further studies.
Abstract:
Conservation of the function of open reading frames recently identified in fungal genome projects can be assessed by complementation of deletion mutants of putative Saccharomyces cerevisiae orthologs. A parallel complementation assay expressing the homologous wild type S. cerevisiae gene is generally performed as a positive control. However, we and others have found that failure of complementation can occur in this case. We investigated the specific cases of S. cerevisiae TBF1 and TIM54 essential genes. Heterologous complementation with Candida glabrata TBF1 or TIM54 gene was successful using the constitutive promoters TDH3 and TEF. In contrast, homologous complementation with S. cerevisiae TBF1 or TIM54 genes failed using these promoters, and was successful only using the natural promoters of these genes. The reduced growth rate of S. cerevisiae complemented with C. glabrata TBF1 or TIM54 suggested a diminished functionality of the heterologous proteins compared to the homologous proteins. The requirement of the homologous gene for the natural promoter was alleviated for TBF1 when complementation was assayed in the absence of sporulation and germination, and for TIM54 when two regions of the protein presumably responsible for a unique translocation pathway of the TIM54 protein into the mitochondrial membrane were deleted. Our results demonstrate that the use of different promoters may prove necessary to obtain successful complementation, with use of the natural promoter being the best approach for homologous complementation.
Abstract:
The recent advance in high-throughput sequencing and genotyping protocols allows rapid investigation of Mendelian and complex diseases on a scale not previously possible. In my thesis research I took advantage of these modern techniques to study retinitis pigmentosa (RP), a rare inherited disease characterized by progressive loss of photoreceptors and leading to blindness; and hypertension, a common condition affecting 30% of the adult population. Firstly, I compared the performance of different next-generation sequencing (NGS) platforms in the sequencing of the RP-linked gene PRPF31. The gene contained a mutation in an intronic repetitive element, which presented difficulties for both classic sequencing methods and NGS. We showed that all NGS platforms are powerful tools to identify rare and common DNA variants, also in the case of more complex sequences. Moreover, we evaluated the features of different NGS platforms that are important in re-sequencing projects. The main focus of my thesis was then to investigate the involvement of pre-mRNA splicing factors in autosomal dominant RP (adRP). I screened 5 candidate genes in a large cohort of patients by using long-range PCR as an enrichment step, followed by NGS. We tested two different approaches: in one, all target PCRs from all patients were pooled and sequenced as a single DNA library; in the other, PCRs from each patient were separated within the pool by DNA barcodes. The first solution was more cost-effective, while the second yielded faster and more accurate results, but overall both proved to be effective strategies for gene screening in many samples. We could in fact identify novel missense mutations in the SNRNP200 gene, encoding an RNA helicase essential for splicing catalysis. Interestingly, one of these mutations showed incomplete penetrance in one family with adRP.
Thus, we started to study the possible molecular causes underlying phenotypic differences between asymptomatic and affected members of this family. For the study of hypertension, I joined a European consortium to perform genome-wide association studies (GWAS). Thanks to the use of very informative genotyping arrays and of phenotypically well-characterized cohorts, we could identify a novel susceptibility locus for hypertension in the promoter region of the endothelial nitric oxide synthase gene (NOS3). Moreover, we have proven the direct causality of the associated SNP using three different methods: 1) targeted resequencing, 2) luciferase assay, and 3) population study.
Recent progress in high-throughput sequencing and genotyping protocols has allowed Mendelian and multifactorial diseases to be studied more broadly and rapidly, on a scale never reached before. During my thesis research, I used these new sequencing techniques to study retinitis pigmentosa (RP), a rare hereditary disease characterized by a progressive loss of the eye's photoreceptors leading to blindness, and hypertension, a common disease affecting 30% of the adult population. First, I compared the performance of different next-generation sequencing (NGS) platforms on the sequencing of PRPF31, a gene linked to RP. This gene contained a mutation in an intronic repetitive element, which caused sequencing difficulties for both the classical method and NGS. We showed that the NGS platforms analysed are very powerful tools for identifying rare and common DNA variants, including in the case of complex sequences. We also explored the characteristics of the different NGS platforms that are important in re-sequencing projects. The main objective of my thesis was then to examine the effect of pre-mRNA splicing factors in an autosomal dominant form of RP (adRP). Five candidate genes were screened in a large cohort of patients using long-range PCR as an enrichment step, followed by NGS sequencing. We tested two different approaches: in the first, all PCR targets from all patients were pooled and sequenced as a single DNA library; in the second, the PCRs of each patient were separated by DNA barcodes. The first solution was the more economical, while the second yielded faster and more accurate results. Overall, both strategies proved effective for screening genes in many samples. We were able to identify novel missense mutations in the SNRNP200 gene, a helicase with an essential function in splicing. Interestingly, one of these mutations shows incomplete penetrance in a family affected by adRP. We therefore began a study of the molecular causes underlying the phenotypic differences between affected and asymptomatic members of this family. For the study of hypertension, I joined a European consortium to carry out a genome-wide association study. Thanks to the use of very informative genotyping arrays and of phenotypically extremely well-characterized cohorts, a new hypertension-related locus was identified in the promoter region of the endothelial nitric oxide synthase gene (NOS3). We furthermore proved the direct causality of the associated SNP by three different methods: i) targeted re-sequencing with NGS, ii) luciferase assays and iii) a population study.
Abstract:
OBJECTIVE: To identify specific major congenital malformations associated with use of carbamazepine in the first trimester of pregnancy. DESIGN: A review of all published cohort studies to identify key indications and a population based case-control study to test these indications. SETTING: Review of PubMed, Web of Science, and Embase for papers about carbamazepine exposure in the first trimester of pregnancy and specific malformations, and the EUROCAT Antiepileptic Study Database, including data from 19 European population based congenital anomaly registries, 1995-2005. PARTICIPANTS: The literature review covered eight cohort studies of 2680 pregnancies with carbamazepine monotherapy exposure, and the EUROCAT dataset included 98 075 registrations of malformations covering over 3.8 million births. MAIN OUTCOME MEASURES: Overall prevalence for a major congenital malformation after exposure to carbamazepine monotherapy in the first trimester. Odds ratios for malformations with exposure to carbamazepine among cases (five types of malformation identified in the literature review) compared with two groups of controls: other non-chromosomal registrations of malformations and chromosomal syndromes. RESULTS: The literature review yielded an overall prevalence for a major congenital malformation of 3.3% (95% confidence interval 2.7 to 4.2) after exposure to carbamazepine monotherapy in the first trimester. In 131 registrations of malformations, the fetus had been exposed to carbamazepine monotherapy. Spina bifida was the only specific major congenital malformation significantly associated with exposure to carbamazepine monotherapy (odds ratio 2.6 (95% confidence interval 1.2 to 5.3) compared with no antiepileptic drug), but the risk was smaller for carbamazepine than for valproic acid (0.2, 0.1 to 0.6). 
There was no evidence for an association with total anomalous pulmonary venous return (no cases with carbamazepine exposure), cleft lip (with or without palate) (0.2, 0.0 to 1.3), diaphragmatic hernia (0.9, 0.1 to 6.6), or hypospadias (0.7, 0.3 to 1.6) compared with no exposure to antiepileptic drugs. Further exploratory analysis suggested a higher risk of single ventricle and atrioventricular septal defect. CONCLUSION: Carbamazepine teratogenicity is relatively specific to spina bifida, though the risk is less than with valproic acid. Despite the large dataset, there was not enough power to detect moderate risks for some rare major congenital malformations.
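The odds ratios and confidence intervals quoted above are the standard statistics of a 2x2 case-control table. A minimal sketch using the Woolf (log) method follows; the counts in the test are arbitrary examples, not the EUROCAT data:

```python
# Odds ratio with a 95% CI (Woolf / log method) from a 2x2 case-control table,
# as used to relate first-trimester exposure to a specific malformation.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a: exposed cases, b: unexposed cases,
    c: exposed controls, d: unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

An interval excluding 1, like the study's 2.6 (1.2 to 5.3) for spina bifida, marks the association as significant at the 5% level; intervals spanning 1, as for hypospadias, do not.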
Abstract:
Research project carried out during a stay at the Max Planck Institute for Human Cognitive and Brain Sciences, Germany, between 2010 and 2012. The main objective of this project was to study the subcortical structures in detail, specifically the role of the basal ganglia in cognitive control during linguistic and non-linguistic processing. Ultra-high-field, high-resolution magnetic resonance imaging (7T MRI) was used to achieve a fine-grained differentiation of the different basal ganglia nuclei. The lateral prefrontal cortex and the basal ganglia work together to mediate working memory and the top-down regulation of cognition. This circuit regulates the balance between automatic and higher-order cognitive responses. Three main experimental conditions were created: non-ambiguous, ungrammatical and ambiguous sentences/sequences. Non-ambiguous sentences/sequences should elicit an automatic response, whereas ambiguous and ungrammatical sentences/sequences produce a conflict with the automatic response and therefore require a higher-order cognitive response. Within the domain of control responses, ambiguity and ungrammaticality represent two different dimensions of conflict resolution: while a temporarily ambiguous sentence/sequence has a correct interpretation, this is not the case for ungrammatical sentences/sequences. In addition, the experimental design included a linguistic and a non-linguistic manipulation, which tested the hypothesis that the effects are domain-general, as well as a semantic and a syntactic manipulation that assessed the differences between the processing of "intrinsic" vs. "structural" ambiguity/error. The results of the first experiment (linguistic syntax) showed a rostroventral-caudodorsal gradient of cognitive control within the caudate nucleus, that is, with the most rostral regions supporting the highest levels of cognitive processing.
Abstract:
In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from the low-baseline stereo pairs that shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene similar, thus avoiding occlusions and illumination changes, which are the main disadvantages of the commonly accepted large-baseline configuration. There remain two crucial technological challenges: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first is solved here by a piecewise affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact, this theory allows us to reduce the number of parameters to be adjusted and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, e.g., vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition, we propose an extension of the classical "quantized" Gestalt theory to continuous measurements, thus combining its reliability with the precision of the variational robust estimation and fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
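The piecewise affine representation rests on fitting an affine elevation model z ≈ a·x + b·y + c within each segmented region. A minimal least-squares sketch of that building block follows; the sample points in the test are illustrative, not drawn from any real DEM:

```python
# Least-squares fit of an affine plane z = a*x + b*y + c to the elevation
# samples of one region, the elementary step of a piecewise-affine DEM.
import numpy as np

def fit_affine(x, y, z):
    """x, y, z: 1-D arrays of sample coordinates and elevations.
    Returns the coefficients (a, b, c)."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef
```

In the full method, whether such a plane is an acceptable explanation of a region (as opposed to, say, vegetation) is decided by the Gestalt-theoretic false-detection control described above, not by the fit alone.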
Abstract:
In this article, we present the current state of our work on a linguistically motivated model for automatic summarization of medical articles in Spanish. The model takes into account the results of an empirical study which reveals that, on the one hand, domain-specific summarization criteria can often be derived from the summaries of domain specialists, and, on the other hand, adequate summarization strategies must be multidimensional, i.e., cover various types of linguistic clues. We take into account the textual, lexical, discursive, syntactic and communicative dimensions. This is novel in the field of summarization. The experiments carried out so far indicate that our model is suitable for producing high-quality summaries.
Abstract:
Automatic classification of makams from symbolic data is a rarely studied topic. In this paper, we first review an n-gram based approach using various representations of the symbolic data. While a high degree of precision can be obtained, confusion arises mainly between makams that use (almost) the same scale and pitch hierarchy but differ in overall melodic progression, the seyir. To improve the system, n-gram based classification is first tested on various sections of the piece, to take into account a feature of the seyir, namely that the melodic progression starts in a certain region of the scale. In a second test, a hierarchical classification structure is designed which uses n-grams and seyir features at different levels to further improve the system.
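The n-gram classification reviewed above can be sketched as per-makam n-gram models compared by smoothed log-likelihood. The makam labels and pitch sequences below are invented, and this simplification omits the seyir-based refinements the paper adds:

```python
# Toy n-gram classifier over symbolic pitch sequences: one n-gram count
# model per class, scored by add-one smoothed log-likelihood.
import math
from collections import Counter

def ngrams(seq, n=2):
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

def train(corpus, n=2):
    """corpus: {label: [sequence, ...]} -> {label: Counter of n-grams}."""
    return {lab: Counter(g for s in seqs for g in ngrams(s, n))
            for lab, seqs in corpus.items()}

def classify(seq, models, n=2, alpha=1.0):
    best, best_ll = None, float("-inf")
    for lab, cnt in models.items():
        total = sum(cnt.values())
        v = len(cnt) + 1                    # crude vocabulary size for smoothing
        ll = sum(math.log((cnt[g] + alpha) / (total + alpha * v))
                 for g in ngrams(seq, n))
        if ll > best_ll:
            best, best_ll = lab, ll
    return best
```

Restricting `seq` to the opening section of a piece before calling `classify` is one way to mimic the paper's section-wise test, since the seyir constrains where the melody begins.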
Abstract:
Lexical resources are a critical component of Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to obtaining richer resources with a broad range of potential uses for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method for automating the merging of resources, with special emphasis on what we call the mapping step. This mapping step, which converts the resources into a common format that later allows the merging, is usually performed with huge manual effort and thus makes the whole process very costly. We therefore propose a method to perform this mapping fully automatically. To test our method, we addressed the merging of two verb subcategorization frame lexica for Spanish. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
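The mapping-then-merge idea described above can be sketched as follows: each resource is first mapped into one common schema (here, lemma to set of frames), after which merging is a trivial union. The two input formats and the frame labels are invented for illustration, not the paper's actual resources:

```python
# Sketch of resource merging via a common format: two subcategorization
# lexica in different source formats are mapped to {lemma: set(frames)}
# and then merged per lemma.
def map_lexicon_a(entries):
    """Format A (hypothetical): lines of the form 'lemma:frame1|frame2'."""
    out = {}
    for line in entries:
        lemma, frames = line.split(":")
        out[lemma] = set(frames.split("|"))
    return out

def map_lexicon_b(entries):
    """Format B (hypothetical): a list of (lemma, frame) pairs."""
    out = {}
    for lemma, frame in entries:
        out.setdefault(lemma, set()).add(frame)
    return out

def merge(*lexica):
    """Union the frame sets of any number of mapped lexica, per lemma."""
    merged = {}
    for lex in lexica:
        for lemma, frames in lex.items():
            merged.setdefault(lemma, set()).update(frames)
    return merged
```

The costly part the paper automates is writing the `map_lexicon_*` converters themselves; once every resource speaks the common schema, `merge` is mechanical.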
Abstract:
The West Liberty Foods turkey cooperative was formed in 1996 to purchase the assets and assume operations of Louis Rich Foods (an investor-owned processing firm), which, at the time, announced the imminent shutdown of its West Liberty, Iowa, processing facility. We study the creation and performance of this "new generation" cooperative using field interviews with grower members and company management. We describe changes, before and after the buyout, in the contractual apparatus used for procuring live turkeys, and in the communication requirements, work expectations, and financial positions of growers. During the private ownership period, most of the inputs (except labor and facilities) were provided by the firm; there was substantial supervision of the growers' actions; growers faced little price and production risk; and growers' equity was due largely to ownership of land and other farm assets. Our interviews reveal that, after cooperative formation, growers were exposed to considerable additional risk; monitoring of growers by the firm was less intensive; grower time and effort commitments to turkey production increased substantially; and a significant fraction of firm (cooperative) equity came from growers' willingness to leverage their farm and personal assets (and hence indirectly their existing relationships with local lenders). We argue that some of these changes are consistent with a financial contract where asset pledging and its corollary risk generate higher work effort by growers and a reduction in agency rents. These economies likely compensate for an organizational deadweight loss traditionally associated with cooperative governance.
Abstract:
Contact theory and group threat theory offer contradictory hypotheses regarding the effect of contact with immigrants. Despite recent efforts to test the validity of both approaches, we still lack a definitive conclusion. This article integrates both approaches and tests the effect of contact with immigrants and how this effect changes when different contexts are considered. In particular, we investigate how the economic environment and the immigrant group size modify attitudes toward immigration. The hypotheses, which are tested in Catalonia, show that contact with immigrants reduces negative attitudes towards immigration, especially friendship and family contact. However, mixed results are reported regarding the effect of economic environment and immigrant group size: whereas the former positively modifies the effect of workplace contact, the latter has no effect. The findings have implications for the role of context when assessing the impact of contact on attitudes towards immigration.