980 results for Syntactic Projection
Abstract:
In recent years, applications that integrate advanced dialogue modules have been booming. However, the process of making these systems universal quickly becomes discouraging: since they are inherently dependent on the language for which they were designed, each new language to be integrated requires its own development time. This observation worsens when one considers that quality is often tied to the size of the training set. This project therefore seeks to speed up the process. It reports on various methods for generating multilingual versions of an initial working system with the help of statistical machine translation. The information attached to the source data is projected in order to generate related target data, which correspondingly reduces the subsequent development time. To this end, several approaches were tested and analysed. In particular, a method that groups the data before reordering the different translation candidates achieves good results.
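The abstract above describes projecting annotations attached to source-language training data onto machine-translated target data, but it does not specify the mechanism. The sketch below shows one common alignment-based way such projection is done; the function name, label scheme, and alignment format are illustrative assumptions, not the thesis's actual implementation.

```python
def project_labels(src_labels, alignment, tgt_len, default="O"):
    """Project token-level labels from a source sentence onto its translation.

    src_labels : list of labels, one per source token (e.g. BIO slot tags)
    alignment  : list of (src_idx, tgt_idx) word-alignment pairs
    tgt_len    : number of tokens in the target sentence
    """
    tgt_labels = [default] * tgt_len
    for s, t in alignment:
        if tgt_labels[t] == default:        # keep the first label projected onto each target token
            tgt_labels[t] = src_labels[s]
    return tgt_labels

# toy example: "book a table" -> "réserver une table"
src = ["O", "O", "B-object"]
align = [(0, 0), (1, 1), (2, 2)]
print(project_labels(src, align, 3))        # ['O', 'O', 'B-object']
```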
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The ability to control both the minimum size of holes and the minimum size of structural members is an essential requirement in the topology optimization design process for manufacturing. This paper addresses both requirements by means of a unified approach involving mesh-independent projection techniques. An inverse projection is developed to control the minimum hole size while a standard direct projection scheme is used to control the minimum length of structural members. In addition, a heuristic scheme combining both contrasting requirements simultaneously is discussed. Two topology optimization implementations are contributed: one in which the projection (either inverse or direct) is used at each iteration; and the other in which a two-phase scheme is explored. In the first phase, the compliance minimization is carried out without any projection until convergence. In the second phase, the chosen projection scheme is applied iteratively until a solution is obtained while satisfying either the minimum member size or minimum hole size. Examples demonstrate the various features of the projection-based techniques presented.
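In the standard density-based formulation, the direct projection mentioned above amounts to a linear filter followed by a smoothed Heaviside mapping that enforces a minimum member size. The sketch below illustrates only that standard scheme (a Guest-style projection); the paper's inverse projection for minimum hole size is its own contribution and is not reproduced here. Grid size, filter radius, and the sharpness parameter are assumptions.

```python
import numpy as np

def density_filter(rho, rmin):
    """Linear (cone) filter of an n-by-m density field defined on a unit-spaced grid."""
    n, m = rho.shape
    out = np.zeros_like(rho)
    r = int(np.ceil(rmin))
    for i in range(n):
        for j in range(m):
            wsum = vsum = 0.0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < n and 0 <= jj < m:
                        w = max(0.0, rmin - np.hypot(di, dj))  # cone weight
                        wsum += w
                        vsum += w * rho[ii, jj]
            out[i, j] = vsum / wsum
    return out

def heaviside_project(rho_tilde, beta):
    """Smoothed Heaviside projection that pushes filtered densities toward solid."""
    return 1.0 - np.exp(-beta * rho_tilde) + rho_tilde * np.exp(-beta)

rho = np.random.rand(20, 20)                                   # stand-in design variables
rho_bar = heaviside_project(density_filter(rho, rmin=2.5), beta=8.0)
```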
Distributed Estimation Over an Adaptive Incremental Network Based on the Affine Projection Algorithm
Abstract:
We study the problem of distributed estimation based on the affine projection algorithm (APA), which is developed from Newton's method for minimizing a cost function. The proposed solution is formulated to ameliorate the limited convergence properties of least-mean-square (LMS) type distributed adaptive filters with colored inputs. The analysis of transient and steady-state performances at each individual node within the network is developed by using a weighted spatial-temporal energy conservation relation and confirmed by computer simulations. The simulation results also verify that the proposed algorithm provides not only a faster convergence rate but also an improved steady-state performance as compared to an LMS-based scheme. In addition, the new approach attains an acceptable misadjustment performance with lower computational and memory cost, provided the number of regressor vectors and filter length parameters are appropriately chosen, as compared to a distributed recursive-least-squares (RLS) based method.
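For reference, the affine projection update itself has a compact standard form. The sketch below shows a single APA step and a toy incremental pass in which the running estimate visits each node in turn; the step size, regularization, and data shapes are illustrative assumptions, not the paper's exact distributed formulation.

```python
import numpy as np

def apa_step(w, X, d, mu=0.5, eps=1e-6):
    """One affine projection (APA) update.
    w : (M,) current weight estimate
    X : (K, M) matrix of the K most recent regressor vectors
    d : (K,) corresponding desired responses
    """
    e = d - X @ w                                                # a priori error vector
    w = w + mu * X.T @ np.linalg.solve(X @ X.T + eps * np.eye(len(d)), e)
    return w

# incremental flavour: the running estimate is refined with each node's local
# data in turn and then passed along the cycle to the next node
nodes = [(np.random.randn(4, 8), np.random.randn(4)) for _ in range(5)]
w = np.zeros(8)
for X_k, d_k in nodes:
    w = apa_step(w, X_k, d_k)
```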
Abstract:
There is now considerable evidence to suggest that non-demented people with Parkinson's disease (PD) experience difficulties using the morphosyntactic aspects of language. It remains unclear, however, at precisely which point in the processing of morphosyntax these difficulties emerge. The major objective of the present study was to examine the impact of PD on the processes involved in accessing morphosyntactic information in the lexicon. Nineteen people with PD and 19 matched control subjects participated in the study, which employed on-line word recognition tasks to examine morphosyntactic priming for local grammatical dependencies that occur both within (e.g. is going) and across (e.g. she gives) phrasal boundaries (Experiments 1 and 2, respectively). The control group evidenced robust morphosyntactic priming effects that were consistent with the involvement of both pre- (Experiment 1) and post-lexical (Experiment 2) processing routines. Whilst the participants with PD also recorded priming for dependencies within phrasal boundaries (Experiment 1), priming effects were observed over an abnormally brief time course. Further, in contrast to the controls, the PD group failed to record morphosyntactic priming for constructions that crossed phrasal boundaries (Experiment 2). The results demonstrate that attentionally mediated mechanisms operating at both the pre- and post-lexical stages of processing are able to contribute to morphosyntactic priming effects. In addition, the findings support the notion that, whilst people with PD are able to access morphosyntactic information in a normal manner, the time frame in which this information remains available for processing is altered. Deficits may also be experienced at the post-lexical integrational stage of processing.
Abstract:
Recent advances in the control of molecular engineering architectures have allowed unprecedented ability of molecular recognition in biosensing, with a promising impact for clinical diagnosis and environment control. The availability of large amounts of data from electrical, optical, or electrochemical measurements requires, however, sophisticated data treatment in order to optimize sensing performance. In this study, we show how an information visualization system based on projections, referred to as Projection Explorer (PEx), can be used to achieve high performance for biosensors made with nanostructured films containing immobilized antigens. As a proof of concept, various visualizations were obtained with impedance spectroscopy data from an array of sensors whose electrical response could be specific toward a given antibody (analyte) owing to molecular recognition processes. In addition to discussing the distinct methods for projection and normalization of the data, we demonstrate that an excellent distinction can be made between real samples tested positive for Chagas disease and Leishmaniasis, which could not be achieved with conventional statistical methods. Such high performance probably arose from the possibility of treating the data in the whole frequency range. Through a systematic analysis, it was inferred that Sammon's mapping with standardization to normalize the data gives the best results, where distinction could be made of blood serum samples containing 10^-7 mg/mL of the antibody. The method inherent in PEx and the procedures for analyzing the impedance data are entirely generic and can be extended to optimize any type of sensor or biosensor.
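Sammon's mapping with standardization, identified above as the best-performing combination, can be sketched directly from its standard definition. The code below is a minimal gradient-descent version with stand-in data; it is not the PEx implementation, and the learning rate, iteration count, and data shapes are assumptions.

```python
import numpy as np

def sammon(X, n_iter=300, lr=0.1, eps=1e-9, seed=0):
    """Minimal Sammon mapping: gradient descent on the Sammon stress, 2-D output."""
    rng = np.random.default_rng(seed)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)        # input-space distances
    np.fill_diagonal(D, 1.0)                                    # avoid division by zero
    D = np.maximum(D, eps)
    c = D[np.triu_indices(len(X), 1)].sum()                     # Sammon normalizing constant
    Y = rng.normal(scale=1e-2, size=(len(X), 2))                # random 2-D start
    for _ in range(n_iter):
        d = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
        np.fill_diagonal(d, 1.0)
        d = np.maximum(d, eps)
        W = (D - d) / (D * d)                                   # pairwise stress weights
        np.fill_diagonal(W, 0.0)
        grad = (-2.0 / c) * (W[:, :, None] * (Y[:, None] - Y[None, :])).sum(axis=1)
        Y -= lr * grad
    return Y

X = np.random.default_rng(1).normal(size=(30, 50))              # stand-in for impedance spectra
Xs = (X - X.mean(axis=0)) / X.std(axis=0)                       # standardization, as in the abstract
Y2d = sammon(Xs)
```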
Abstract:
The interaction of pragmatic and syntactic constraints continues to be a central issue in models of sentence processing. It is well established that object relatives (1) are harder to process than subject relatives (2). Passivization, other things being equal, increases sentence complexity. However, one of the functions of the passive construction is to promote an NP into the role of subject so that it can be more easily bound to the head NP in a higher clause. Thus, (3) is predicted to be marginally preferred over (1). Passivization in this instance may be seen as a way of avoiding the object relative construction. 1. The pipe that the traveller smoked annoyed the passengers. 2. The traveller that smoked the pipe annoyed the passengers. 3. The pipe that was smoked by the traveller annoyed the passengers. 4. The traveller that the pipe was smoked by annoyed the passengers. 5. The traveller that the lady was assaulted by annoyed the passengers. In (4) we have relativization of an NP which has been demoted by passivization to the status of a by-phrase. Such relative clauses may only be obtained under quite restrictive pragmatic conditions. Many languages do not permit relativization of a constituent as low as a by-phrase on the NP accessibility hierarchy (Comrie, 1984). The factors which determine the acceptability of demoted NP relatives like (4-5) reflect the ease with which the NP promoted to subject position can be taken as a discourse topic. We explored the acceptability of sentences such as (1-5) using pair-wise judgements of same/different meaning, accompanied by ratings of ease of understanding. Results are discussed with reference to Gibson's DLT model of linguistic complexity and sentence processing (Gibson, 2000).
Abstract:
Objective: Summarize all relevant findings in the published literature regarding the potential dose reduction related to image quality using Sinogram-Affirmed Iterative Reconstruction (SAFIRE) compared to Filtered Back Projection (FBP). Background: Computed Tomography (CT) is one of the most used radiographic modalities in clinical practice, providing high spatial and contrast resolution. However, it also delivers a relatively high radiation dose to the patient. Reconstructing raw data using Iterative Reconstruction (IR) algorithms has the potential to iteratively reduce image noise while maintaining or improving the image quality of low-dose standard FBP reconstructions. Nevertheless, long reconstruction times made IR impractical for clinical use until recently. Siemens Medical developed a new IR algorithm called SAFIRE, which uses up to 5 different strength levels and offers an alternative to conventional IR with a significantly shorter reconstruction time. Methods: The MEDLINE, ScienceDirect and CINAHL databases were used for gathering literature. Eleven articles were included in this review (from 2012 to July 2014). Discussion: This narrative review summarizes the results of eleven articles (covering studies on both patients and phantoms) and describes the strengths of SAFIRE for noise reduction in low-dose acquisitions while providing acceptable image quality. Conclusion: Even though the results differ slightly, the literature gathered for this review suggests that the dose in current CT protocols can be reduced by at least 50% while maintaining or improving image quality. There is, however, a lack of literature concerning the paediatric population (which has increased radiation sensitivity). Further studies should also assess the impact of SAFIRE on diagnostic accuracy.
Abstract:
Background: Computed tomography (CT) is one of the most used modalities for diagnostics in paediatric populations, which is a concern as it also delivers a high patient dose. Research has focused on developing computer algorithms that provide better image quality at lower dose. The iterative reconstruction algorithm Sinogram-Affirmed Iterative Reconstruction (SAFIRE) was introduced as a new technique that reduces noise to increase image quality. Purpose: The aim of this study is to compare SAFIRE with the current gold standard, Filtered Back Projection (FBP), and assess whether SAFIRE alone permits a reduction in dose while maintaining image quality in paediatric head CT. Methods: Images of a paediatric head phantom were collected on a SIEMENS SOMATOM PERSPECTIVE 128 using a modulated acquisition. A total of 54 images were reconstructed using FBP and the 5 different strengths of SAFIRE. Objective measures of image quality were determined by measuring SNR and CNR. Visual measures of image quality were determined by 17 observers with different levels of radiographic experience. Images were randomized and displayed in a two-alternative forced-choice (2AFC) setup; observers scored the images by answering 5 questions on a Likert scale. Results: At different dose levels, SAFIRE significantly increased SNR (up to 54%) in the acquired images compared to FBP at 80 kVp (5.2-8.4), 110 kVp (8.2-12.3), and 130 kVp (8.8-13.1). Visual image quality was higher with increasing SAFIRE strength. The highest image quality was scored for SAFIRE level 3 and higher. Conclusion: The SAFIRE algorithm is suitable for image noise reduction in paediatric head CT. Our data demonstrate that SAFIRE enhances SNR while reducing noise, with a possible dose reduction of 68%.
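The objective measures mentioned above (SNR and CNR) are conventionally computed from region-of-interest statistics. The sketch below uses those conventional ROI-based definitions, which are an assumption here since the abstract does not state its exact formulas.

```python
import numpy as np

def roi_stats(image, mask):
    """Mean and standard deviation of pixel values inside a boolean ROI mask."""
    vals = image[mask]
    return vals.mean(), vals.std()

def snr(image, signal_mask):
    """SNR = mean(signal ROI) / std(signal ROI)."""
    m, s = roi_stats(image, signal_mask)
    return m / s

def cnr(image, signal_mask, background_mask):
    """CNR = |mean(signal ROI) - mean(background ROI)| / std(background ROI)."""
    ms, _ = roi_stats(image, signal_mask)
    mb, sb = roi_stats(image, background_mask)
    return abs(ms - mb) / sb
```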
Abstract:
The development of an algorithm for the construction of auxiliary projection nets (conformal, equal-area, and orthographic), in both equatorial and polar versions, is presented. The algorithm for drawing the "IGAREA 220" counting net (ALYES & MENDES, 1972) is also presented. These algorithms form the basis of the STEGRAPH program (vers. 2.0) for MS-DOS computers, which has further applications.
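The radial laws underlying such nets are standard: the conformal (Wulff) net uses r = R tan(θ/2) and the equal-area (Schmidt) net uses r = R √2 sin(θ/2), with θ the angular distance from the vertical. The sketch below projects a single linear element from its trend and plunge using those standard formulas; it is an illustration only and is not taken from the STEGRAPH program.

```python
import numpy as np

def project_line(trend_deg, plunge_deg, kind="equal_area", R=1.0):
    """Lower-hemisphere projection of a line given trend and plunge (degrees).
    kind = "equal_area" (Schmidt net) or "equal_angle" (conformal Wulff net)."""
    trend = np.radians(trend_deg)
    theta = np.radians(90.0 - plunge_deg)           # angular distance from the vertical
    if kind == "equal_angle":
        r = R * np.tan(theta / 2.0)                 # conformal (stereographic) radial law
    else:
        r = R * np.sqrt(2.0) * np.sin(theta / 2.0)  # equal-area law, primitive circle at r = R
    # trend is measured clockwise from north, so north is +y and east is +x
    return r * np.sin(trend), r * np.cos(trend)

print(project_line(30.0, 40.0))                     # example line: trend 030, plunge 40
```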
Abstract:
Magdeburg, Univ., Faculty of Mathematics, Diss., 2011
Abstract:
This work deals with Bulgarian nominalizations within the theoretical framework of Principles and Parameters. It is proposed that the formation of nominals results from the merger of a nominalizing head [nº] with a root or stem, where [nº] is a gender morpheme or a derivational suffix with inherent gender. The differing behaviour of the nominalizations is explained by a difference in syntactic structure. I also show that without event structure there is no argument structure, and that we can distinguish between argument-structure nominals, participant-structure nominals, and result nominals. Finally, I propose that the correct sequence of both prefixes and suffixes is obtained exclusively through movements of maximal projections.