26 results for query rewriting
at Université de Lausanne, Switzerland
Abstract:
Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, the data format is often the first obstacle. The lack of standardized ways of exploring different data layouts forces each problem to be solved from scratch. The ability to access data in a rich, uniform manner, e.g. using the Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) is one of the most common data storage formats. Despite its simplicity, handling it becomes non-trivial as file size grows. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if the horizontal dimension reaches thousands of columns. Most databases are optimized for handling a large number of rows rather than columns; performance on datasets with non-typical layouts is therefore often unacceptable. Other challenges include schema creation, updates, and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It is characterized by: a "no copy" approach (data stay mostly in the CSV files); "zero configuration" (no need to specify a database schema); a small footprint (written in C++ with boost [1], SQLite [2], and Qt [3], it requires no installation); efficient plan execution, ensured by query rewriting, dynamic creation of indices for appropriate columns, and static data retrieval directly from the CSV files; effortless support for millions of columns; easy handling of mixed text/number data thanks to per-value typing; and a very simple network protocol that provides an efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware along with educational videos on its website [4]. It needs no prerequisites to run, as all of the libraries are included in the distribution package.
I test it against existing database solutions using a battery of benchmarks and discuss the results.
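The combination of query rewriting and on-demand index creation described above can be illustrated with a toy sketch in Python using the standard-library sqlite3 and csv modules. All names and data below are invented for illustration; the actual system is a C++ tool and works quite differently in detail. The idea shown: only the column a query touches is pulled out of the CSV and indexed, while the full rows stay in their original (here in-memory) CSV form.

```python
import csv
import io
import sqlite3

# Hypothetical CSV content; in the real system, data stay in files on disk.
CSV_DATA = """id,city,population
1,Lausanne,140000
2,Geneva,200000
3,Bern,134000
"""

def query_csv(csv_text, column, predicate):
    """Load a single CSV column into SQLite, index it, and filter rows.

    Only the queried column is copied into the database ("no copy" for the
    rest); the index is created dynamically, mimicking the index-on-demand
    strategy described in the abstract.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (rowid_ INTEGER, val TEXT)")
    con.executemany("INSERT INTO t VALUES (?, ?)",
                    [(i, r[column]) for i, r in enumerate(rows)])
    # Index created only for the column the query actually touches.
    con.execute("CREATE INDEX idx_val ON t (val)")
    matching = [i for (i,) in con.execute(
        f"SELECT rowid_ FROM t WHERE {predicate}")]
    # Remaining values are fetched straight from the CSV rows, mimicking
    # static data retrieval directly from the file.
    return [rows[i] for i in matching]

hits = query_csv(CSV_DATA, "city", "val = 'Geneva'")
print(hits[0]["population"])  # prints: 200000
```

A query over a different column would build a fresh single-column table and index, so a million-column file never has to be imported wholesale.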
Abstract:
This article examines the interplay of text and image in The Fairy Tales of Charles Perrault (1977), translated by Angela Carter and illustrated by Martin Ware, as a form of intersemiotic dialogue that sheds new light on Carter's work. It argues that Ware's highly original artwork based on the translation not only calls into question the association of fairy tales with children's literature (which still characterizes Carter's translation), but also captures an essential if heretofore neglected aspect of Carter's creative process, namely the dynamics between translating, illustrating, and rewriting classic tales. Several elements from Ware's illustrations are indeed taken up and elaborated on in The Bloody Chamber and Other Stories (1979), the collection of "stories about fairy stories" that made Carter famous. These include visual details and strategies that she transposed to the realm of writing, giving rise to reflections on the relation between visuality and textuality.
Abstract:
In "Reading, Translating, Rewriting: Angela Carter's Translational Poetics", author Martine Hennard Dutheil de la Rochère delves into Carter's The Fairy Tales of Charles Perrault (1977) to illustrate that this translation project had a significant impact on Carter's own writing practice. Hennard combines close analyses of both texts with an attention to Carter's active role in the translation and composition process to explore this previously unstudied aspect of Carter's work. She further uncovers the role of female fairy-tale writers and folktales associated with the Grimms' Kinder- und Hausmärchen in the rewriting process, unlocking new doors to The Bloody Chamber. Hennard begins by considering the editorial evolution of The Fairy Tales of Charles Perrault from 1977 to the present day, as Perrault's tales have been rediscovered and repurposed. In the chapters that follow, she examines specific linkages between Carter's Perrault translation and The Bloody Chamber, including targeted analysis of the stories of Red Riding Hood, Bluebeard, Puss-in-Boots, Beauty and the Beast, Sleeping Beauty, and Cinderella. Hennard demonstrates how, even before The Bloody Chamber, Carter intervened in the fairy-tale debate of the late 1970s by reclaiming Perrault for feminist readers when she discovered that the morals of his worldly tales lent themselves to her own materialist and feminist goals. Hennard argues that The Bloody Chamber can therefore be seen as the continuation of and counterpoint to The Fairy Tales of Charles Perrault, as it explores the potential of the familiar stories for alternative retellings. While the critical consensus reads into Carter an imperative to subvert classic fairy tales, the book shows that Carter valued in Perrault a practical educator as well as a proto-folklorist and went on to respond to more hidden aspects of his texts in her rewritings. 
Reading, Translating, Rewriting is informative reading for students and teachers of fairy-tale studies and translation studies.
Abstract:
The article reopens the question of the sources, parallels, and rewritings of 1 Cor 2.9, a saying that Paul attributes to some written source, while other sources put it into Jesus' mouth (e.g. GosThom 17). A survey of the research shows that the hypothesis of an oral source is generally preferred, but a close study of 1 Clem 34.8, a parallel too often neglected, supports the existence of a written source predating 1 Cor 2.9. GosJud 47.10-13 helps to explain the attribution of the saying to Jesus. The last major part of this article studies its parallel in Islamic traditions, a ḥadīth qudsī.
Abstract:
Due to the growing use of biometric technologies in our modern society, spoofing attacks are becoming a serious concern. Many solutions have been proposed to detect the use of fake "fingerprints" on an acquisition device. In this paper, we propose to take advantage of an intrinsic feature of friction ridge skin: pores. The aim of this study is to investigate the potential of using pores to detect spoofing attacks. Results show that the use of pores is a promising approach. Four major observations were made. First, the results confirmed that the reproduction of pores on fake "fingerprints" is possible. Second, the distributions of the total number of pores in fake and genuine fingerprints cannot be discriminated. Third, the difference in pore quantity between a query image and a reference image (genuine or fake) can be used as a discriminating factor in a linear discriminant analysis. In our sample, the observed error rates were as follows: 45.5% false positives (the fake passed the test) and 3.8% false negatives (a genuine print was rejected). Finally, performance is improved by using the difference in pore quantity between a distorted query fingerprint and a non-distorted reference fingerprint. With this approach, the error rates improved to a 21.2% false acceptance rate and an 8.3% false rejection rate.
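Using a pore-quantity difference as a single feature in a linear discriminant analysis can be sketched as follows. For a one-dimensional feature with equal priors and equal class variances, Fisher's LDA reduces to thresholding at the midpoint of the two class means; the data values below are invented for illustration and are not the study's measurements.

```python
# Toy 1-D LDA on pore-count differences between a query print and its
# reference. All numbers are hypothetical, for illustration only.

def lda_threshold(class_a, class_b):
    """Equal-prior, equal-variance 1-D Fisher LDA: the decision boundary
    is the midpoint between the two class means."""
    mean_a = sum(class_a) / len(class_a)
    mean_b = sum(class_b) / len(class_b)
    return (mean_a + mean_b) / 2.0

# Hypothetical |query - reference| pore-count differences:
genuine_diffs = [2, 3, 1, 4, 2]   # genuine prints reproduce pores closely
fake_diffs = [12, 9, 15, 11, 10]  # fakes lose (or add) many pores

t = lda_threshold(genuine_diffs, fake_diffs)

def classify(diff, threshold=t):
    """Small difference -> genuine; large difference -> fake."""
    return "genuine" if diff < threshold else "fake"

print(classify(3), classify(13))  # prints: genuine fake
```

In the study itself the two classes overlap far more than in this toy data, which is why the reported false-positive rate remains high.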
Abstract:
An elliptical continuation of the Prose Tristan, set in the interval separating the birth of Tristan from Meliadus's remarriage to King Hoël's daughter, the Roman de Meliadus (1235-1240) is a fundamentally open work, on account of both its incompleteness and the constant dialogue it establishes with other Arthurian romances. While it claims its filiation and assumes its status as a later-born narrative, the reminiscences it displays also mask the distortions and inflections that allow it to make something new out of something old. It is this play, in both senses of the term, that this study sets out to bring to light and to observe in operation, not only in the Roman de Meliadus itself but also in three of its rereadings, which update and profoundly renew the meaning of the romance. The first is a continuation dating from the very end of the 13th century or the beginning of the 14th, preserved today in a single manuscript, Ferrell 5. The second is the version offered by Meliadus de Leonnoys, printed in 1528 by Galliot du Pré and again in 1532 by Denis Janot, the product of meticulous cutting and reassembling. The last is the excerpt published in 1776 in the Bibliothèque Universelle des Romans under the title Méliadus de Léonnois.
Abstract:
Uveal melanoma metastases occur most commonly in the liver. Given the 50% mortality rate in patients at high risk of developing liver metastases, we tested adjuvant intra-arterial hepatic (i.a.h.) chemotherapy with fotemustine after proton beam irradiation of the primary tumour. We treated 22 high-risk patients with adjuvant i.a.h. fotemustine. The planned treatment duration was 6 months, starting with four weekly doses of 100 mg/m², followed, after a 5-week rest, by doses repeated every 3 weeks. The survival of this patient group was compared with that of a 3:1 matched control group randomly selected from our institutional database. Half of the patients experienced ≥ grade 3 hepatotoxicity (one patient developing cholangitis 8 years later). Catheter-related complications occurred in 18%. With a median follow-up of 4.6 years for the fotemustine group and 8.5 years for the control group, median overall survival was 9 years [95% confidence interval (CI) 2.2-12.7] and 7.4 years (95% CI 5.4-12.7; P = 0.5), respectively, with 5-year survival rates of 75% and 56%. Treatment with adjuvant i.a.h. fotemustine is feasible; its toxicities, however, are substantial. Although our data suggest a survival benefit, it was not statistically significant. Confirming such a benefit would require a large, internationally coordinated, prospective randomized trial.
Abstract:
The purpose of this study was to examine the relationship between skeletal muscle monocarboxylate transporter 1 and 4 (MCT1 and MCT4) expression, skeletal muscle oxidative capacity, and endurance performance in trained cyclists. Ten well-trained cyclists (mean ± SD: age 24.4 ± 2.8 years, body mass 73.2 ± 8.3 kg, VO2max 58 ± 7 ml·kg⁻¹·min⁻¹) completed three endurance performance tasks [an incremental exercise test to exhaustion and 2 and 10 min time trials (TT)]. In addition, a muscle biopsy sample from the vastus lateralis muscle was analysed for MCT1 and MCT4 expression levels together with the activity of citrate synthase (CS) and 3-hydroxyacyl-CoA dehydrogenase (HAD). There was a tendency for VO2max and the peak power output obtained in the incremental exercise test to be correlated with MCT1 (r = -0.71 to -0.74; P < 0.06), but not MCT4. The average power output (P_average) in the 2 min TT was significantly correlated with MCT4 (r = -0.74; P < 0.05) and HAD (r = -0.92; P < 0.01). The P_average in the 10 min TT was only correlated with CS activity (r = 0.68; P < 0.05). These results indicate that the relationship between MCT1 and MCT4 expression and cycle TT performance may be influenced by the length and intensity of the task.
Abstract:
PURPOSE: Most existing methods for accelerated parallel imaging in MRI require additional data, which are used to derive information about the sensitivity profile of each radiofrequency (RF) channel. In this work, a method is presented to avoid the acquisition of separate coil calibration data for accelerated Cartesian trajectories. METHODS: Quadratic phase is imparted to the image to spread the signals in k-space (also known as phase scrambling). By rewriting the Fourier transform as a convolution operation, a window can be introduced to the convolved chirp function, allowing a low-resolution image to be reconstructed from phase-scrambled data without prominent aliasing. This image (for each RF channel) can be used to derive coil sensitivities to drive existing parallel imaging techniques. As a proof of concept, the quadratic phase was applied by introducing an offset to the x² − y² shim, and the data were reconstructed using adapted versions of the image-space-based sensitivity encoding and GeneRalized Autocalibrating Partially Parallel Acquisitions algorithms. RESULTS: The method is demonstrated in a phantom (1 × 2, 1 × 3, and 2 × 2 acceleration) and in vivo (2 × 2 acceleration) using a 3D gradient echo acquisition. CONCLUSION: Phase scrambling can be used to perform parallel imaging acceleration without the acquisition of separate coil calibration data, demonstrated here for a 3D Cartesian trajectory. Further research is required to prove the applicability to other 2D and 3D sampling schemes. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
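The Fourier-transform-as-convolution step can be sketched, in one dimension, by completing the square in the exponent. Here ρ(x) is the object, α the quadratic-phase coefficient, and c(x) the chirp; this is a standard identity written in notation chosen here, not the article's own derivation:

```latex
S(k) = \int \rho(x)\, e^{i\alpha x^2}\, e^{-ikx}\, dx
     = e^{-i k^2/(4\alpha)} \int \rho(x)\, e^{i\alpha\left(x - \frac{k}{2\alpha}\right)^2} dx
     = e^{-i k^2/(4\alpha)}\, (\rho * c)\!\left(\tfrac{k}{2\alpha}\right),
\qquad c(x) = e^{i\alpha x^2}.
```

Because each k-space sample is a chirp-weighted convolution centred at x = k/(2α), truncating (windowing) c restricts each sample to a neighbourhood of that position, which is why a low-resolution image can be recovered from the phase-scrambled data without prominent aliasing.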
Abstract:
This study investigated the influence of two warm-up protocols on neural and contractile parameters of the knee extensors. A series of neuromuscular tests including voluntary and electrically evoked contractions were performed before and after running-based (RWU: slow running, athletic drills, and sprints) and strength-based (SWU: bilateral 90° back squats, Olympic lifting movements, and reactivity exercises) warm-ups (duration ~40 min) in ten trained subjects. The estimated overall mechanical work was comparable between protocols. Maximal voluntary contraction torque (+15.6%, P < 0.01 and +10.9%, P < 0.05) and muscle activation (+10.9% and +12.9%, P < 0.05) increased to the same extent after RWU and SWU, respectively. Both protocols caused a significant shortening of the time to contract (-12.8% and -11.8% after RWU and SWU, P < 0.05), while the other twitch parameters did not change significantly. Running- and strength-based warm-ups induce a similar increase in knee extensor force-generating capacity by improving muscle activation. Both protocols have similar effects on M-wave and isometric twitch characteristics.
Abstract:
In this paper, we propose two active learning algorithms for semiautomatic definition of training samples in remote sensing image classification. Based on predefined heuristics, the classifier ranks the unlabeled pixels and automatically chooses those that are considered the most valuable for its improvement. Once the pixels have been selected, the analyst labels them manually and the process is iterated. Starting with a small and nonoptimal training set, the model itself builds the optimal set of samples which minimizes the classification error. We have applied the proposed algorithms to a variety of remote sensing data, including very high resolution and hyperspectral images, using support vector machines. Experimental results confirm the consistency of the methods. The required number of training samples can be reduced to 10% using the methods proposed, reaching the same level of accuracy as larger data sets. A comparison with a state-of-the-art active learning method, margin sampling, is provided, highlighting advantages of the methods proposed. The effect of spatial resolution and separability of the classes on the quality of the selection of pixels is also discussed.
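The rank-query-label-retrain loop described above can be sketched in a few lines of pure Python. This is not the paper's algorithm or data: it uses an invented nearest-centroid classifier on synthetic 1-D "pixels" with a margin-style uncertainty heuristic, purely to show the shape of the loop (in the paper, the classifier is an SVM and the samples are remote sensing pixels).

```python
# Minimal active-learning loop: rank unlabeled samples by uncertainty,
# query the most uncertain one, retrain, repeat. Everything here (data,
# classifier, budget) is hypothetical.

def centroids(labeled):
    """Per-class mean of the labeled 1-D samples."""
    by_class = {}
    for x, y in labeled:
        by_class.setdefault(y, []).append(x)
    return {c: sum(v) / len(v) for c, v in by_class.items()}

def margin(x, cents):
    """Difference between the two smallest distances to class centroids.
    A small margin means the sample lies near the decision boundary."""
    d = sorted(abs(x - m) for m in cents.values())
    return d[1] - d[0]

def active_learn(pool, oracle, seed, budget):
    """Start from a small seed set and spend `budget` label queries."""
    labeled = [(x, oracle(x)) for x in seed]
    unlabeled = [x for x in pool if x not in seed]
    for _ in range(budget):
        cents = centroids(labeled)
        # Rank the unlabeled pool; query the most uncertain sample.
        x = min(unlabeled, key=lambda u: margin(u, cents))
        unlabeled.remove(x)
        labeled.append((x, oracle(x)))  # the "analyst" labels it
    cents = centroids(labeled)
    return lambda x: min(cents, key=lambda c: abs(x - cents[c]))

oracle = lambda x: int(x >= 5.0)  # hidden ground truth (the analyst)
pool = [0.5, 1.0, 2.0, 4.4, 4.6, 5.2, 6.0, 8.0, 9.5]
clf = active_learn(pool, oracle, seed=[1.0, 9.5], budget=3)
print(clf(4.0), clf(7.0))  # prints: 0 1
```

Note that the three queried samples all fall near the class boundary at 5.0, which is the mechanism by which such heuristics reach full-dataset accuracy with a fraction of the labels.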
Abstract:
The purpose of this research is to assess the vulnerabilities of a high-resolution fingerprint sensor when confronted with fake fingerprints. The study focused not on the decision outcome of the biometric device but essentially on the scores obtained by comparing a query (genuine or fake) against a template using an AFIS system. To do this, fake fingerprints of 12 subjects were produced with and without their cooperation. These fake fingerprints were used alongside real fingers. The study led to three major observations. First, genuine fingerprints produced higher scores than fake fingers (reflecting a closer proximity to the template), and this tendency is observed for each subject considered separately. Second, scores alone are nevertheless not sufficient to differentiate fake from genuine samples, given the variation due to the donors themselves; that explains why fingerprint readers without vitality detection can be fooled. Third, production methods and subjects greatly influence the scores obtained for fake fingerprints.
Abstract:
BACKGROUND: When fructose is ingested together with glucose (GLUFRU) during exercise, plasma lactate and exogenous carbohydrate oxidation rates are higher than with glucose alone. OBJECTIVE: The objective was to investigate to what extent GLUFRU increased lactate kinetics and oxidation rate and gluconeogenesis from lactate (GNG_L) and from fructose (GNG_F). DESIGN: Seven endurance-trained men performed 120 min of exercise at approximately 60% VO2max (maximal oxygen consumption) while ingesting 1.2 g glucose/min plus 0.8 g of either glucose or fructose/min (GLUFRU). In 2 trials, the effects of glucose and GLUFRU on lactate and glucose kinetics were investigated with glucose and lactate tracers. In a third trial, labeled fructose was added to GLUFRU to assess fructose disposal. RESULTS: In GLUFRU, lactate appearance (120 ± 6 μmol·kg⁻¹·min⁻¹), lactate disappearance (121 ± 7 μmol·kg⁻¹·min⁻¹), and oxidation (127 ± 12 μmol·kg⁻¹·min⁻¹) rates increased significantly (P < 0.001) in comparison with glucose alone (94 ± 16, 95 ± 16, and 97 ± 16 μmol·kg⁻¹·min⁻¹, respectively). GNG_L was negligible in both conditions. In GLUFRU, GNG_F and exogenous fructose oxidation increased with time and leveled off at 18.8 ± 3.7 and 38 ± 4 μmol·kg⁻¹·min⁻¹, respectively, at 100 min. The plasma glucose appearance rate was significantly higher (P < 0.01) in GLUFRU (91 ± 6 μmol·kg⁻¹·min⁻¹) than with glucose alone (82 ± 9 μmol·kg⁻¹·min⁻¹). The carbohydrate oxidation rate was higher (P < 0.05) in GLUFRU. CONCLUSIONS: Fructose increased total carbohydrate oxidation, lactate production and oxidation, and GNG_F. Fructose oxidation was explained equally by fructose-derived lactate and glucose oxidation, most likely in skeletal and cardiac muscle. This trial was registered at clinicaltrials.gov as NCT01128647.
Abstract:
The determination of the age of a document is a very frequent request; however, it is also one of the most challenging and controversial areas of questioned document examination. Several approaches have been defined to address this problem. The first is based on the date at which the raw constituents (such as paper or ink) were introduced on the market. The second approach is based on the aging of documents, which is unfortunately influenced not only by passing time but also by storage conditions and document composition. The third approach considers the relative age of documents and aims at reconstructing their chronology. The three approaches are equally complex to develop and encounter a number of non-negligible problems. This article aims to expose the potential applications and limitations of current ink dating methods. Method development and validation, together with the interpretation of evidence, prove to be essential criteria for the dating of documents.