988 results for "first pass effect"


Relevance: 80.00%

Abstract:

Elastic net regularizers have shown much promise in designing sparse classifiers for linear classification. In this work, we propose an alternating optimization approach to solving the dual problems of elastic-net-regularized linear classification with Support Vector Machines (SVMs) and logistic regression (LR). One of the sub-problems turns out to be a simple projection. The other sub-problem can be solved using dual coordinate descent methods developed for non-sparse L2-regularized linear SVMs and LR, without altering their iteration complexity or convergence properties. Experiments on very large datasets indicate that the proposed dual coordinate descent-projection (DCD-P) methods are fast and achieve comparable generalization performance after the first pass through the data, with extremely sparse models.
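The projection sub-problem above is specific to the authors' dual formulation and is not reproduced here. As a generic illustration of how the elastic-net penalty induces sparsity, the following minimal NumPy sketch implements the elastic-net proximal operator (soft-thresholding followed by shrinkage); the function name and parameters are ours, not the paper's.

```python
import numpy as np

def elastic_net_prox(w, alpha, l1_ratio, step=1.0):
    """Proximal map of the elastic-net penalty
    alpha * (l1_ratio * ||w||_1 + 0.5 * (1 - l1_ratio) * ||w||_2^2):
    soft-threshold by the L1 part, then shrink by the L2 part."""
    t = step * alpha * l1_ratio                           # L1 threshold
    shrunk = np.sign(w) * np.maximum(np.abs(w) - t, 0.0)  # coordinates below t become exactly 0
    return shrunk / (1.0 + step * alpha * (1.0 - l1_ratio))

w = np.array([0.3, -0.05, 1.2, 0.0])
print(elastic_net_prox(w, alpha=0.2, l1_ratio=0.5))  # small coordinates are zeroed out
```

Applying this map inside any gradient or coordinate-descent loop is what produces the extremely sparse models the abstract refers to.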

Relevance: 80.00%

Abstract:

In this thesis we propose a new approach to deduction methods for temporal logic. Our proposal is based on an inductive definition of eventualities that differs from the usual one. On the basis of this non-customary inductive definition of eventualities, we first provide dual systems of tableaux and sequents for Propositional Linear-time Temporal Logic (PLTL). Then, we adapt the deductive approach introduced by means of these dual tableau and sequent systems to the resolution framework and present a clausal temporal resolution method for PLTL. Finally, we use this new clausal temporal resolution method to establish logical foundations for declarative temporal logic programming languages. The key difficulty in deduction systems for temporal logic is dealing with eventualities and with hidden invariants that may prevent their fulfillment. Different ways of addressing this issue can be found in the literature on deduction systems for temporal logic. Traditional tableau systems for temporal logic generate an auxiliary graph in a first pass. Then, in a second pass, unsatisfiable nodes are pruned; in particular, the second pass must check whether the eventualities are fulfilled. The one-pass tableau calculus introduced by S. Schwendimann requires additional bookkeeping to detect cyclic branches that contain unfulfilled eventualities. In traditional sequent calculi for temporal logic, the issue of eventualities and hidden invariants is tackled with inference rules (mainly invariant-based or infinitary rules) that complicate automation.
A remarkable consequence of using either a two-pass approach based on auxiliary graphs or a one-pass approach that requires additional bookkeeping in the tableau framework, and either invariant-based or infinitary rules in the sequent framework, is that the classical correspondence between tableaux and sequents fails to carry over to temporal logic. In this thesis, we first provide a one-pass tableau method TTM that, instead of a graph, builds a cyclic tree to decide whether a set of PLTL-formulas is satisfiable. In TTM, tableaux are classical-like. For unsatisfiable sets of formulas, TTM produces tableaux whose leaves contain a formula and its negation. For satisfiable sets of formulas, TTM builds tableaux in which each fully expanded open branch characterizes a collection of models for the set of formulas in the root. The tableau method TTM is complete and yields a decision procedure for PLTL. This tableau method is directly associated with a one-sided sequent calculus called TTC. Since TTM is free from all the structural rules that hinder the mechanization of deduction, e.g. weakening and contraction, the resulting sequent calculus TTC is also free from this kind of structural rule. In particular, TTC is free of any kind of cut, including invariant-based cut. From the deduction system TTC, we obtain a two-sided sequent calculus GTC that preserves all these freeness properties and is finitary, sound and complete for PLTL. Therefore, we show that the classical correspondence between tableaux and sequent calculi can be extended to temporal logic. The most fruitful approach in the literature on resolution methods for temporal logic, which began with the seminal paper of M. Fisher, deals with PLTL and requires generating invariants for performing resolution on eventualities. In this thesis, we present a new approach to resolution for PLTL.
The main novelty of our approach is that we do not generate invariants for performing resolution on eventualities. Our method is based on the dual methods of tableaux and sequents for PLTL mentioned above. Our resolution method involves translation into a clausal normal form that is a direct extension of classical CNF. We first show that any PLTL-formula can be transformed into this clausal normal form. Then, we present our temporal resolution method, called TRS-resolution, which extends classical propositional resolution. Finally, we prove that TRS-resolution is sound and complete. In fact, it terminates for any input formula, deciding its satisfiability; hence it yields a new decision procedure for PLTL. In the field of temporal logic programming, the declarative proposals that provide a completeness result do not allow eventualities, whereas the proposals that follow the imperative future approach either restrict the use of eventualities or deal with them by calculating an upper bound based on the small model property for PLTL. In the latter, when the length of a derivation reaches the upper bound, the derivation is given up and backtracking is used to try another possible derivation. In this thesis we present a declarative propositional temporal logic programming language, called TeDiLog, that is a combination of the temporal and disjunctive paradigms in Logic Programming. We establish the logical foundations of our proposal by formally defining operational and logical semantics for TeDiLog and by proving their equivalence. Since TeDiLog is, syntactically, a sublanguage of PLTL, the logical semantics of TeDiLog is supported by PLTL logical consequence. The operational semantics of TeDiLog is based on TRS-resolution. TeDiLog allows both eventualities and always-formulas to occur in clause heads and also in clause bodies.
To the best of our knowledge, TeDiLog is the first declarative temporal logic programming language that achieves this high degree of expressiveness. Since the tableau method presented in this thesis is able to detect that the fulfillment of an eventuality is prevented by a hidden invariant without checking for it by means of an extra process, since our finitary sequent calculi do not include invariant-based rules and since our resolution method dispenses with invariant generation, we say that our deduction methods are invariant-free.
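The thesis's own inductive definition of eventualities is not reproduced in this abstract. For orientation, the standard fixpoint unfolding that any such tableau rule refines, phi U psi == psi OR (phi AND X(phi U psi)), can be sketched in a few lines of Python; the datatype and function names below are ours, for illustration only.

```python
from dataclasses import dataclass

# Minimal PLTL abstract syntax (illustrative names, not from the thesis).
@dataclass(frozen=True)
class Prop:
    name: str

@dataclass(frozen=True)
class Or:
    left: object
    right: object

@dataclass(frozen=True)
class And:
    left: object
    right: object

@dataclass(frozen=True)
class Next:
    arg: object

@dataclass(frozen=True)
class Until:
    left: object
    right: object

def unfold(f):
    """One-step fixpoint expansion of an eventuality:
    phi U psi  ==  psi OR (phi AND X(phi U psi))."""
    if isinstance(f, Until):
        return Or(f.right, And(f.left, Next(f)))
    return f

print(unfold(Until(Prop("p"), Prop("q"))))
```

A tableau rule based on this unfolding either fulfills the eventuality now (the left disjunct) or postpones it one step (the right disjunct); the hidden-invariant problem discussed above arises precisely when postponement can repeat forever.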

Relevance: 80.00%

Abstract:

Numerous studies have shown that accentuation and implicit verb causality influence pronoun resolution. However, researchers disagree on the time course, and little is known about how the two types of information interact during Chinese sentence comprehension. This study explored the effects of accentuation and implicit verb causality on pronoun processing during spoken Chinese sentence comprehension, and their time courses, using the auditory moving-window technique and the cross-modal probe paradigm. The main results were: 1) The reading time of the second clause was significantly longer in the stressed-pronoun condition than in the unstressed-pronoun condition; accentuation influenced the activation level of candidate antecedents. 2) Implicit verb causality influenced pronoun interpretation during spoken Chinese sentence comprehension, and affected the activation level of candidate antecedents immediately after listeners heard the pronoun. 3) There was a first-mention effect in spoken Chinese sentence comprehension, which appears to be a general phenomenon in pronoun processing. 4) Accentuation, implicit verb causality and the first-mention effect interacted during pronoun processing and spoken Chinese sentence comprehension. The study supports the focus hypothesis, indicating that accentuation can shift the center of attention even in non-parallel-structure sentences, that implicit verb causality influences pronoun processing immediately, and that accentuation and implicit verb causality interact during spoken sentence comprehension.

Relevance: 80.00%

Abstract:

Postgraduate project/dissertation submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance: 80.00%

Abstract:

We conducted a pilot study on 10 patients undergoing general surgery to test the feasibility of diffuse reflectance spectroscopy in the visible wavelength range as a noninvasive monitoring tool for blood loss during surgery. Ratios of raw diffuse reflectance at pairs of wavelengths were tested as a first-pass estimate of hemoglobin concentration. Ratios can be calculated easily and rapidly with limited post-processing, so this can be considered a near real-time monitoring device. The best hemoglobin correlations were obtained when ratios at isosbestic points of oxy- and deoxyhemoglobin were used, specifically 529/500 nm. Baseline subtraction improved correlations, specifically at 520/509 nm. These results demonstrate proof of concept for the ability of this noninvasive device to monitor hemoglobin concentration changes due to surgical blood loss. The 529/500 nm ratio also appears to account for variations in probe pressure, as determined from measurements on two volunteers.
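The wavelength-ratio computation described above is simple enough to sketch directly. In this minimal NumPy example the spectrum is made up and featureless; only the 529/500 nm wavelength pair comes from the study, and the function signature is ours.

```python
import numpy as np

def reflectance_ratio(wl, refl, num_nm, den_nm, baseline_nm=None):
    """Ratio of diffuse reflectance at two wavelengths (nearest samples);
    optionally subtract a baseline value taken at baseline_nm first."""
    def at(nm):
        return refl[np.argmin(np.abs(wl - nm))]
    base = at(baseline_nm) if baseline_nm is not None else 0.0
    return (at(num_nm) - base) / (at(den_nm) - base)

# Hypothetical spectrum sampled every 1 nm from 450 to 650 nm.
wl = np.arange(450, 651, 1.0)
refl = 0.4 + 0.001 * (wl - 450)   # made-up, featureless reflectance
print(reflectance_ratio(wl, refl, 529, 500))
```

Because the computation is a couple of array lookups and a division, it can run on every acquired spectrum, which is what makes near real-time monitoring plausible.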

Relevance: 80.00%

Abstract:

A periodic finite-difference time-domain (FDTD) analysis is presented and applied for the first time in the study of a two-dimensional (2-D) leaky-wave planar antenna based on dipole frequency selective surfaces (FSSs). First, the effect of certain aspects of the FDTD modeling on the modal analysis of complex waves is studied in detail. Then, the FDTD model is used for the dispersion analysis of the antenna of interest. The calculated values of the leaky-wave attenuation constants suggest that, for an antenna of this type and moderate length, a significant amount of power reaches the edges of the antenna, and thus diffraction can play an important role. To test the validity of our dispersion analysis, measured radiation patterns of a fabricated prototype are presented and compared with those predicted by a leaky-wave approach based on the periodic FDTD results.
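The claim that significant power reaches the antenna edges follows from the exponential decay of a leaky wave along the aperture. A minimal sketch, using a hypothetical attenuation constant rather than a value from the paper:

```python
import numpy as np

def power_fraction_at_edge(alpha, length):
    """Fraction of guided power remaining after propagating `length`
    along a leaky wave with attenuation constant `alpha` (nepers per
    unit length): the field decays as exp(-alpha*z), power as
    exp(-2*alpha*z)."""
    return np.exp(-2.0 * alpha * length)

# Hypothetical: alpha = 0.05 Np per wavelength, antenna 10 wavelengths long.
print(power_fraction_at_edge(0.05, 10.0))   # ~37% of the power reaches the edge
```

When this fraction is not negligible, the truncation of the aperture matters and edge diffraction contributes to the radiation pattern, which is the situation the abstract describes.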

Relevance: 80.00%

Abstract:

Patterns of arsenic excretion were followed in a cohort (n = 6) eating a defined rice diet (300 g dry weight per day) whose arsenic speciation was characterized in the cooked rice, following a period of abstinence from rice and other high-arsenic foods. A control group that did not consume rice was also monitored. The rice consumed in the study contained inorganic arsenic and dimethylarsinic acid (DMA) at a ratio of 1:1, yet urine speciation was dominated by DMA (90%). At steady state (rice consumption/urinary excretion), approximately 40% of rice-derived arsenic was excreted via urine. Monitoring each urine pass throughout the day revealed considerable variation (up to 13-fold) in an individual's total urinary arsenic content, with a time-dependent pattern. This calls into question the robustness of routinely used first-pass/spot-check urine sampling for arsenic analysis. (C) 2014 Elsevier Ltd. All rights reserved.

Relevance: 80.00%

Abstract:

INTRODUCTION: Transdermal drug delivery offers a number of advantages for the patient, not only due to its non-invasive and convenient nature, but also due to factors such as avoidance of first-pass metabolism and prevention of gastrointestinal degradation. It has been demonstrated that microneedles (MNs) can increase the number of compounds amenable to transdermal delivery by penetrating the skin's protective barrier, the stratum corneum, and creating a pathway for drug permeation to the dermal tissue below.

AREAS COVERED: MNs have been extensively investigated for drug and vaccine delivery. The different types of MN arrays and their delivery capabilities are discussed in terms of drugs, including biopharmaceutics and vaccines. Patient usage and effects on the skin are also considered.

EXPERT OPINION: MN research and development is now at the stage where commercialisation is a viable possibility. There are a number of long-term safety questions relating to patient usage which will need to be addressed moving forward. Regulatory guidance is awaited to direct the scale-up of the manufacturing process alongside provision of clearer patient instruction for safe and effective use of MN devices.

Relevance: 80.00%

Abstract:

Transdermal drug delivery offers a number of advantages for the patient, due not only to its non-invasive and convenient nature but also to factors such as avoidance of first-pass metabolism and prevention of gastrointestinal degradation. It has been demonstrated that microneedle arrays can increase the number of compounds amenable to transdermal delivery by penetrating the skin's protective barrier, the stratum corneum, and creating a pathway for drug permeation to the dermal tissue below. Microneedles have been extensively investigated in recent decades for drug and vaccine delivery as well as minimally invasive patient monitoring/diagnosis. This review focuses on a range of critically important aspects of microneedle technology, namely their material composition, manufacturing techniques, methods of evaluation and commercial translation to the clinic for patient benefit and economic return. Microneedle research and development is now at the stage where commercialisation is a realistic possibility. However, progress is still required in the areas of scaled-up manufacture and regulatory approval.

Relevance: 80.00%

Abstract:

A gas turbine is made up of three basic components: a compressor, a combustion chamber and a turbine. Air is drawn into the engine by the compressor, which compresses it and delivers it to the combustion chamber. There, the air is mixed with the fuel and the mixture ignited, producing a rise of temperature and therefore an expansion of the gases. These are expelled through the engine nozzle, but first pass through the turbine, designed to extract energy to keep the compressor rotating [1]. The work described here uses data recorded from a Rolls Royce Spey MK 202 turbine, whose simplified diagram can be seen in Fig. 1. Both the compressor and the turbine are split into low pressure (LP) and high pressure (HP) stages. The HP turbine drives the HP compressor and the LP turbine drives the LP compressor. They are connected by concentric shafts that rotate at different speeds, denoted as NH and NL.
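The compressor-combustor-turbine energy balance described above can be illustrated with a textbook ideal Brayton-cycle calculation. The operating points and air properties below are generic assumptions for illustration, not Spey MK 202 data.

```python
# Ideal-gas Brayton cycle sketch: isentropic compression 1->2,
# constant-pressure heat addition 2->3, isentropic expansion 3->4.
# The turbine work must at least cover the compressor work, as the
# passage above explains.
GAMMA = 1.4     # ratio of specific heats for air
CP = 1005.0     # specific heat at constant pressure, J/(kg K)

def brayton_specific_work(t1, t3, pressure_ratio):
    """Net specific work (J/kg) of the ideal cycle for inlet
    temperature t1, turbine entry temperature t3 (both in K)."""
    r = pressure_ratio ** ((GAMMA - 1.0) / GAMMA)
    t2 = t1 * r          # after the compressor
    t4 = t3 / r          # after the turbine
    w_comp = CP * (t2 - t1)   # work absorbed by the compressor
    w_turb = CP * (t3 - t4)   # work extracted by the turbine
    return w_turb - w_comp

print(brayton_specific_work(t1=288.0, t3=1400.0, pressure_ratio=20.0))
```

The split into LP and HP spools on concentric shafts (NL and NH in the text) divides this same work balance between two compressor/turbine pairs, letting each rotate at its own best speed.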

Relevance: 80.00%

Abstract:

Thesis (Ph.D.)--Brock University, 2010.

Relevance: 80.00%

Abstract:

This lexical decision study with eye tracking of Japanese two-kanji-character words investigated the order in which a whole two-character word and its morphographic constituents are activated in the course of lexical access, the relative contributions of the left and the right characters in lexical decision, the depth to which semantic radicals are processed, and how nonlinguistic factors affect lexical processes. Mixed-effects regression analyses of response times and subgaze durations (i.e., first-pass fixation time spent on each of the two characters) revealed joint contributions of morphographic units at all levels of the linguistic structure with the magnitude and the direction of the lexical effects modulated by readers’ locus of attention in a left-to-right preferred processing path. During the early time frame, character effects were larger in magnitude and more robust than radical and whole-word effects, regardless of the font size and the type of nonwords. Extending previous radical-based and character-based models, we propose a task/decision-sensitive character-driven processing model with a level-skipping assumption: Connections from the feature level bypass the lower radical level and link up directly to the higher character level.

Relevance: 80.00%

Abstract:

In China, the economic growth observed over the last three decades has been accompanied by major social changes. Until 2008, a labour law inherited from the socialist era and ill-suited to the market economy served the interests of growth at the expense of workers. The new Labour Contract Law of 2008 aims to correct this situation by rebalancing labour relations in a context of fairer redistribution of the new wealth. The objective of this thesis is to understand how foreign companies operating in China are responding to this institutional change. Does it affect their management and their location strategy? This question is addressed through a case study of a foreign company that has been established in Shanghai for 10 years. The first effect observed is a professionalization of the human resources function. The increase in operating costs, to which the new law contributes, also leads to a relocation of production activities to lower-cost regions in central China. The expertise specifically acquired locally is one of the major reasons ruling out relocation to a third country.

Relevance: 80.00%

Abstract:

This paper presents a parallel Linear Hashtable Motion Estimation Algorithm (LHMEA). Most parallel video compression algorithms focus on the Group of Pictures (GOP); based on the LHMEA we proposed earlier [1][2], we developed a parallel motion estimation algorithm that works within a single frame. We divide each reference frame into equally sized regions, which are processed in parallel to increase encoding speed significantly. The theoretical and measured speed-ups of the parallel LHMEA as a function of the number of PCs in the cluster are compared and discussed. Motion vectors (MVs) generated by the first-pass LHMEA are used as predictors for the second-pass Hexagonal Search (HEXBS) motion estimation, which searches only a small number of macroblocks (MBs). We evaluated a distributed parallel implementation of the LHMEA of TPA for real-time video compression.
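The region-splitting idea can be sketched as follows. This is a toy exhaustive block search over two equal strips run in parallel threads; the paper's actual setup (a PC cluster, the linear-hashtable first pass, and HEXBS refinement) is not reproduced here.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def best_offset(region, block):
    """Exhaustive search: offset of `block` inside `region` minimizing
    the sum of absolute differences (SAD)."""
    h, w = block.shape
    best, best_cost = (0, 0), float("inf")
    for y in range(region.shape[0] - h + 1):
        for x in range(region.shape[1] - w + 1):
            cost = np.abs(region[y:y + h, x:x + w] - block).sum()
            if cost < best_cost:
                best, best_cost = (y, x), cost
    return best

frame = np.zeros((32, 32))
frame[20:24, 5:9] = 1.0                    # toy frame with one bright block
block = frame[20:24, 5:9].copy()
strips = [frame[0:16], frame[16:32]]       # two equally sized regions
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(best_offset, strips, [block, block]))
print(results)   # best match in the lower strip at (4, 5)
```

Each region is searched independently, so the work scales out across workers; in the paper's setting the workers are cluster nodes rather than threads, and the per-region search is the hashtable-accelerated LHMEA rather than this brute-force SAD scan.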

Relevance: 80.00%

Abstract:

Dynamics affects the distribution and abundance of stratospheric ozone directly through transport of ozone itself and indirectly through its effect on ozone chemistry via temperature and transport of other chemical species. Dynamical processes must be considered in order to understand past ozone changes, especially in the northern hemisphere where there appears to be significant low-frequency variability which can look “trend-like” on decadal time scales. A major challenge is to quantify the predictable, or deterministic, component of past ozone changes. Over the coming century, changes in climate will affect the expected recovery of ozone. For policy reasons it is important to be able to distinguish and separately attribute the effects of ozone-depleting substances and greenhouse gases on both ozone and climate. While the radiative-chemical effects can be relatively easily identified, this is not so evident for dynamics — yet dynamical changes (e.g., changes in the Brewer-Dobson circulation) could have a first-order effect on ozone over particular regions. Understanding the predictability and robustness of such dynamical changes represents another major challenge. Chemistry-climate models have recently emerged as useful tools for addressing these questions, as they provide a self-consistent representation of dynamical aspects of climate and their coupling to ozone chemistry. We can expect such models to play an increasingly central role in the study of ozone and climate in the future, analogous to the central role of global climate models in the study of tropospheric climate change.