883 results for First-line



Abstract:

The first terrestrial Pb-isotope paradox refers to the fact that on average, rocks from the Earth's surface (i.e. the accessible Earth) plot significantly to the right of the meteorite isochron in a common Pb-isotope diagram. The Earth as a whole, however, should plot close to the meteorite isochron, implying the existence of at least one terrestrial reservoir that plots to the left of the meteorite isochron. The core and the lower continental crust are the two candidates that have been widely discussed in the past. Here we propose that subducted oceanic crust and associated continental sediment stored as garnetite slabs in the mantle Transition Zone or mid-lower mantle are an additional potential reservoir that requires consideration. We present evidence from the literature that indicates that neither the core nor the lower crust contains sufficient unradiogenic Pb to balance the accessible Earth. Of all mantle magmas, only rare alkaline melts plot significantly to the left of the meteorite isochron. We interpret these melts to be derived from the missing mantle reservoir that plots to the left of the meteorite isochron but, significantly, above the mid-ocean ridge basalt (MORB)-source mantle evolution line. Our solution to the paradox predicts the bulk silicate Earth to be more radiogenic in Pb-207/Pb-204 than present-day MORB-source mantle, which opens the possibility that undegassed primitive mantle might be the source of certain ocean island basalts (OIB). Further implications for mantle dynamics and oceanic magmatism are discussed based on a previously justified proposal that lamproites and associated rocks could derive from the Transition Zone.
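
For orientation, "left" and "right" of the meteorite isochron follow from the standard single-stage Pb evolution relations. The expressions below are textbook equations with conventional constants, given as a hedged sketch for context rather than taken from this abstract:

\[
\frac{^{206}\mathrm{Pb}}{^{204}\mathrm{Pb}} = \left(\frac{^{206}\mathrm{Pb}}{^{204}\mathrm{Pb}}\right)_{0} + \mu\,\bigl(e^{\lambda_{238}T}-1\bigr),
\qquad
\frac{^{207}\mathrm{Pb}}{^{204}\mathrm{Pb}} = \left(\frac{^{207}\mathrm{Pb}}{^{204}\mathrm{Pb}}\right)_{0} + \frac{\mu}{137.88}\,\bigl(e^{\lambda_{235}T}-1\bigr),
\]

where mu = U-238/Pb-204 of the reservoir, T is its age (about 4.57 Ga for the Earth), lambda-238 is about 1.55 x 10^-10 per year and lambda-235 about 9.85 x 10^-10 per year. Eliminating mu gives the meteorite isochron, a line through the primordial composition with slope (e^{lambda_235 T} - 1) / [137.88 (e^{lambda_238 T} - 1)]; reservoirs plotting to its right record a higher time-integrated mu than the bulk Earth, so mass balance requires a complementary reservoir with lower time-integrated mu to its left, which is the missing reservoir discussed above.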


Abstract:

The Swinfen Charitable Trust has used email for some years as a low-cost telemedicine medium to provide consultant support for doctors in developing countries. A scalable, automatic message-routing system was constructed that automates many of the tasks involved in message handling. During the first 12 months of its use, 1510 messages were processed automatically. There were 128 referrals from 18 hospitals in nine countries. Of these 128 queries, 89 (70%) were replied to within 72 h; the median delay was 1.1 days. The 39 unanswered queries were sent to backup specialists for reply, and 36 of them (92%) were replied to within 72 h. In the remaining three cases, a second-line (backup) specialist was required. The referrals were handled by 54 volunteer specialists from a panel of over 70. Two system operators, located 10 time zones apart, managed the system. The median time from receipt of a new referral to its allocation to a specialist was 0.2 days (interquartile range, IQR, 0.1-0.8). The median interval between receipt of a new referral and first reply was 2.6 days (IQR 0.8-5.9). Automatic message handling solves many of the problems of manual email telemedicine systems and represents a potentially scalable way of delivering low-cost telemedicine in the developing world.
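
The 72 h escalation rule described above is simple to express in code. The sketch below is a minimal illustration of that routing logic, not the Trust's actual software; the attribute and function names are assumptions made for the example:

```python
from datetime import datetime, timedelta

ESCALATION_WINDOW = timedelta(hours=72)  # reply window quoted in the abstract


def route_referral(referral, first_line_specialist, backup_specialist, now=None):
    """Return who should currently hold an unanswered referral.

    `referral` is assumed to expose `received_at` (datetime) and `replied` (bool).
    """
    now = now or datetime.utcnow()
    if referral.replied:
        return None                      # already answered, nothing to route
    if now - referral.received_at < ESCALATION_WINDOW:
        return first_line_specialist     # still within the 72 h reply window
    return backup_specialist             # escalate to the backup specialist
```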


Abstract:

Cultured human choriocarcinoma cells of the BeWo line exhibited saturable accumulation of radioiodide. Inhibition by competing anions followed the affinity series perchlorate >= iodide >= thiocyanate, consistent with uptake through the thyroid iodide transporter, NIS, whose messenger RNA was found in BeWo cells, and whose protein was distributed towards the apical pole of the cells. Efflux obeyed first-order kinetics and was inhibited by DIDS, an antagonist of anion exchangers including pendrin, whose messenger RNA was also present. In cultures where iodide uptake through NIS was blocked with excess perchlorate, radioiodide accumulation was stimulated by exposure to medium in which physiological anions were replaced by 2-morpholinoethanesulfonic acid (MES), consistent with the operation of an anion exchange mechanism taking up iodide. Chloride in the medium was more effective than sulfate at inhibiting this uptake, matching the ionic specificity of pendrin. These studies provide evidence that the trophoblast accumulates iodide through NIS and releases it to the fetal compartment through pendrin. (c) 2004 Elsevier Ltd. All rights reserved.
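
"First-order kinetics" here means the intracellular radioiodide content decays exponentially with time. As a generic sketch (the symbols and rate constant are not given in the abstract):

\[
\frac{dC}{dt} = -kC \quad\Longrightarrow\quad C(t) = C_{0}\,e^{-kt},\qquad t_{1/2} = \frac{\ln 2}{k},
\]

where C is the cellular iodide content, k the efflux rate constant and t_{1/2} the corresponding half-time; DIDS-sensitive efflux corresponds to a reduction in k.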


Abstract:

Saturated fat plays a role in common debilitating diseases such as obesity, type 2 diabetes, and coronary heart disease. It is also clear that certain fatty acids act as regulators of metabolism via both direct and indirect signalling of target tissues. As the molecular mechanisms of saturated fatty acid signalling in the liver are poorly defined, hepatic gene expression analysis was undertaken in a human hepatocyte cell line after incubation with palmitate. Profiling of mRNA expression using cDNA microarray analysis revealed that 162 of approximately 18,000 genes tested were differentially expressed after incubation with palmitate for 48 h. Altered transcription profiles were observed in a wide variety of genes, including genes involved in lipid and cholesterol transport, cholesterol catabolism, cell growth and proliferation, cell signalling, beta-oxidation, and oxidative stress response. While palmitate signalling has been examined in pancreatic beta-cells, this is the first report showing that palmitate regulates expression of numerous genes via direct molecular signalling mechanisms in liver cells. (C) 2005 Elsevier Inc. All rights reserved.
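
The abstract does not state the criteria used to call the 162 genes differentially expressed. Purely as an illustration of the kind of filtering involved, a common generic approach combines a fold-change cut-off with a per-gene t-test; the thresholds and variable names below are assumptions, not the authors' method:

```python
import numpy as np
from scipy import stats


def differentially_expressed(control, treated, fold_change=2.0, alpha=0.05):
    """Flag genes whose mean log2 expression shifts by at least `fold_change`
    and whose two-sample t-test p-value falls below `alpha`.

    control, treated: arrays of shape (n_replicates, n_genes) of log2 intensities.
    """
    log2_fc = treated.mean(axis=0) - control.mean(axis=0)   # log2 fold change per gene
    _, pvals = stats.ttest_ind(treated, control, axis=0)    # per-gene t-test
    return (np.abs(log2_fc) >= np.log2(fold_change)) & (pvals < alpha)
```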


Abstract:

We demonstrate a portable process for developing a triple bottom line model to measure the knowledge production performance of individual research centres. For the first time, this study also empirically illustrates how a fully units-invariant model of Data Envelopment Analysis (DEA) can be used to measure the relative efficiency of research centres by capturing the interaction amongst a common set of multiple inputs and outputs. This study is particularly timely given the increasing transparency required by governments and industries that fund research activities. The process highlights the links between organisational objectives, desired outcomes and outputs while the emerging performance model represents an executive managerial view. This study brings consistency to current measures that often rely on ratios and univariate analyses that are not otherwise conducive to relative performance analysis.
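
The abstract does not specify which DEA formulation was used beyond it being fully units-invariant. As a hedged sketch of the general idea, the standard input-oriented CCR multiplier model below scores one research centre against the rest; the input/output examples in the comments and the use of scipy are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import linprog


def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of research centre `o`.

    X: (n_centres, n_inputs) inputs, e.g. funding and staff numbers.
    Y: (n_centres, n_outputs) outputs, e.g. publications and completions.
    """
    n, m = X.shape
    _, s = Y.shape
    # Decision variables: output weights u (length s) then input weights v (length m).
    # Maximise u . y_o, i.e. minimise its negative.
    c = np.concatenate([-Y[o], np.zeros(m)])
    # Efficiency constraint for every centre j: u . y_j - v . x_j <= 0.
    A_ub = np.hstack([Y, -X])
    b_ub = np.zeros(n)
    # Normalisation: weighted input of the assessed centre equals one.
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun  # efficiency score in (0, 1]
```

Running the function once per centre gives the kind of relative-efficiency ranking that ratio or univariate measures cannot provide.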


Abstract:

Automatic signature verification is a well-established and active area of research with numerous applications such as bank check verification and ATM access. This paper proposes a novel approach to the problem of automatic off-line signature verification and forgery detection. The proposed approach is based on fuzzy modeling that employs the Takagi-Sugeno (TS) model. Signature verification and forgery detection are carried out using angle features extracted with a box approach. Each feature corresponds to a fuzzy set. The features are fuzzified by an exponential membership function involved in the TS model, which is modified to include structural parameters. The structural parameters are devised to take account of possible variations due to handwriting styles and moods. The membership functions constitute weights in the TS model. Optimizing the output of the TS model with respect to the structural parameters yields the solution for those parameters. We also derive two TS models: the first formulation uses a rule for each input feature (multiple rules), while the second uses a single rule for all input features. We find that the TS model with multiple rules is better than the TS model with a single rule at detecting three types of forgery (random, skilled and unskilled) from a large database of sample signatures, in addition to verifying genuine signatures. We also devise three approaches, namely one innovative approach and two intuitive approaches, using the TS model with multiple rules for improved performance. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
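
As a rough sketch of the multiple-rule formulation, the snippet below fuzzifies each angle feature with a generic exponential membership function and combines one linear consequent per feature into a weighted TS output. The paper's structural parameters and exact membership form are not reproduced here; all parameter names are assumptions:

```python
import numpy as np


def exp_membership(features, centres, spreads):
    """Generic exponential (Gaussian-like) membership for each angle feature."""
    return np.exp(-((features - centres) ** 2) / (2.0 * spreads ** 2 + 1e-12))


def ts_output(features, centres, spreads, coeffs, intercepts):
    """Takagi-Sugeno output with one rule per feature (multiple-rule form):
    memberships act as rule weights over linear consequents."""
    w = exp_membership(features, centres, spreads)    # firing strength of each rule
    consequents = coeffs * features + intercepts      # linear consequent per rule
    return np.sum(w * consequents) / (np.sum(w) + 1e-12)
```

A signature would then be accepted or flagged as a forgery by comparing this output against a threshold fitted on genuine samples.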


Abstract:

The study aims to show that semantic clarity and easy navigability on Investor Relations websites are essential for communication with individual investors, as well as for their understanding of the Good Corporate Governance Practices adopted by companies that have joined the Bovespa's Novo Mercado. The work is divided into four stages. The first chapter explains what Corporate Governance is and how the concept was implemented in Brazil, presents the stock market and the Novo Mercado, and addresses topics related to the financial sector. It then covers the evolution of corporate communication and how organizations have had to adapt their organizational culture and communication channels in response to the constant, uninterrupted series of transactions (acquisitions, mergers and takeovers) that have taken place in Brazil since 1994, with the launch of the Plano Real. This process brought about geopolitical, cultural, economic and social change within corporations. Still in this chapter, the study presents the characteristics of the channels that organizations use to communicate with their key audiences. It also shows how the internet and other digital media have become integrated into this corporate process, and discusses investor relations, the IR websites of Novo Mercado companies, and the profile of the individual investor. Lastly, the study presents the evaluation of the IR websites and the criteria adopted to analyse the construction of their homepages and other pages. Here the objective was to assess semantic clarity, that is, the way information is conveyed to individual investors; the accessibility of this communication channel, such as the number of clicks needed to reach any given piece of information; and whether the sites offer spaces dedicated to this audience. Finally, the results are presented, together with an analysis of these companies' corporate communication before and after their entry into the Bovespa's Novo Mercado, and the concluding remarks.


Abstract:

We present an analytic solution to the problem of on-line gradient-descent learning for two-layer neural networks with an arbitrary number of hidden units in both teacher and student networks. The technique, demonstrated here for the case of adaptive input-to-hidden weights, becomes exact as the dimensionality of the input space increases.
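
The result above is analytic, but the learning scenario it describes is easy to simulate. The sketch below trains a student soft committee machine on-line against a fixed teacher; the tanh activation, learning rate and sizes are assumptions chosen for the example (the statistical-mechanics literature typically uses an error-function activation):

```python
import numpy as np


def online_gd_soft_committee(N=500, K=3, M=3, eta=0.1, steps=20000, seed=0):
    """On-line gradient descent for a student with K hidden units learning a
    teacher with M hidden units; one fresh random example per update."""
    rng = np.random.default_rng(seed)
    B = rng.normal(size=(M, N))            # fixed teacher input-to-hidden weights
    W = 0.01 * rng.normal(size=(K, N))     # student weights, small random start
    for _ in range(steps):
        x = rng.normal(size=N)
        y_teacher = np.tanh(B @ x / np.sqrt(N)).sum()
        h = W @ x / np.sqrt(N)             # student hidden-unit fields
        delta = np.tanh(h).sum() - y_teacher
        # exact gradient of 0.5 * delta**2 with respect to each student weight vector
        W -= eta * delta * ((1.0 - np.tanh(h) ** 2) / np.sqrt(N))[:, None] * x[None, :]
    return W
```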


Abstract:

We analyse the dynamics of a number of second-order on-line learning algorithms training multi-layer neural networks, using the methods of statistical mechanics. We first consider on-line Newton's method, which is known to provide optimal asymptotic performance. We determine the asymptotic generalization error decay for a soft committee machine, which is shown to compare favourably with the result for standard gradient descent. Matrix momentum provides a practical approximation to this method by allowing an efficient inversion of the Hessian. We consider an idealized matrix momentum algorithm which requires access to the Hessian and find close correspondence with the dynamics of on-line Newton's method. In practice, the Hessian will not be known on-line and we therefore consider matrix momentum using a single-example approximation to the Hessian. In this case good asymptotic performance may still be achieved, but the algorithm is now sensitive to parameter choice because of noise in the Hessian estimate. On-line Newton's method is not appropriate during the transient learning phase, since a suboptimal unstable fixed point of the gradient descent dynamics becomes stable for this algorithm. A principled alternative is to use Amari's natural gradient learning algorithm, and we show how this method provides a significant reduction in learning time when compared to gradient descent, while retaining the asymptotic performance of on-line Newton's method.
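
For concreteness, the sketch below contrasts the plain gradient step with a natural-gradient-style step that preconditions the gradient with a running, single-example curvature estimate; it is a generic illustration of the ideas discussed above, not the analysed algorithms, and the damping and decay constants are assumptions:

```python
import numpy as np


def update_curvature(curv, grad, decay=0.99):
    """Running single-example (outer-product) curvature estimate; this is the
    kind of noisy estimate that makes the practical algorithm parameter-sensitive."""
    return decay * curv + (1.0 - decay) * np.outer(grad, grad)


def natural_gradient_step(w, grad, curv, eta=0.05, damping=1e-3):
    """Precondition the gradient with the damped inverse curvature estimate."""
    direction = np.linalg.solve(curv + damping * np.eye(len(w)), grad)
    return w - eta * direction


def gradient_step(w, grad, eta=0.05):
    """Plain on-line gradient descent step, for comparison."""
    return w - eta * grad
```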


Abstract:

On-line learning is one of the most powerful and commonly used techniques for training large layered networks and has been used successfully in many real-world applications. Traditional analytical methods have been recently complemented by ones from statistical physics and Bayesian statistics. This powerful combination of analytical methods provides more insight and deeper understanding of existing algorithms and leads to novel and principled proposals for their improvement. This book presents a coherent picture of the state-of-the-art in the theoretical analysis of on-line learning. An introduction relates the subject to other developments in neural networks and explains the overall picture. Surveys by leading experts in the field combine new and established material and enable non-experts to learn more about the techniques and methods used. This book, the first in the area, provides a comprehensive view of the subject and will be welcomed by mathematicians, scientists and engineers, whether in industry or academia.


Abstract:

In this contribution, certain aspects of the nonlinear dynamics of magnetic field lines are reviewed. First, the basic facts (known from the literature) concerning the Hamiltonian structure are briefly summarized. The paper then concentrates on the following subjects: (i) transition from the continuous description to discrete maps; (ii) characteristics of incomplete chaos; (iii) control of chaos. The presentation is concluded by some remarks on the motion of particles in stochastic magnetic fields.
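
A convenient concrete example of item (i) is the Chirikov standard map, the archetypal area-preserving map often used as a model of field-line dynamics; the snippet below is a generic illustration and is not one of the specific maps derived in the paper:

```python
import numpy as np


def standard_map(theta0, p0, K, steps):
    """Iterate the Chirikov standard map; increasing the stochasticity
    parameter K drives the transition from regular to chaotic field lines."""
    theta, p = theta0, p0
    trajectory = np.empty((steps, 2))
    for n in range(steps):
        p = (p + K * np.sin(theta)) % (2.0 * np.pi)
        theta = (theta + p) % (2.0 * np.pi)
        trajectory[n] = theta, p
    return trajectory
```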


Abstract:

The orientations of lines and edges are important in defining the structure of the visual environment, and observers can detect differences in line orientation within the first few hundred milliseconds of scene viewing. The present work is a psychophysical investigation of the mechanisms of early visual orientation-processing. In experiments with briefly presented displays of line elements, observers indicated whether all the elements were uniformly oriented or whether a uniquely oriented target was present among uniformly oriented nontargets. The minimum difference between nontarget and target orientations that was required for effective target-detection (the orientation increment threshold) varied little with the number of elements and their spatial density, but the percentage of correct responses in detection of a large orientation-difference increased with increasing element density. The differing variations with element density of thresholds and percent-correct scores may indicate the operation of more than one mechanism in early visual orientation-processing. Reducing element length caused threshold to increase with increasing number of elements, showing that the effectiveness of rapid, spatially parallel orientation-processing depends on element length. Orientational anisotropy in line-target detection has been reported previously: a coarse periodic variation and some finer variations in orientation increment threshold with nontarget orientation have been found. In the present work, the prominence of the coarse variation in relation to finer variations decreased with increasing effective viewing duration, as if the operation of coarse orientation-processing mechanisms precedes the operation of finer ones. Orientational anisotropy was prominent even when observers lay horizontally and viewed displays by looking upwards through a black cylinder that excluded all possible visual references for orientation. So, gravitational and visual cues are not essential to the definition of an orientational reference frame for early vision, and such a reference can be well defined by retinocentric neural coding, awareness of body-axis orientation, or both.


Abstract:

Nearest feature line-based subspace analysis is first proposed in this paper. Compared with conventional methods, the newly proposed one brings better generalization performance and supports incremental analysis. The projection point and feature line distance are expressed as a function of a subspace, which is obtained by minimizing the mean square feature line distance. Moreover, by adopting a stochastic approximation rule to minimize the objective function in a gradient manner, the new method can be performed in an incremental mode, which makes it work well on future data. Experimental results on the FERET face database and the UCI satellite image database demonstrate its effectiveness.
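
The feature line distance at the heart of the method is the point-to-line distance between a query and the line through two prototype feature points. A minimal sketch of that computation (variable names are assumptions):

```python
import numpy as np


def feature_line_distance(x, y1, y2):
    """Distance from query x to the feature line through prototypes y1 and y2,
    together with the projection point used by the subspace objective."""
    d = y2 - y1
    t = np.dot(x - y1, d) / (np.dot(d, d) + 1e-12)   # position along the line
    p = y1 + t * d                                   # projection point
    return np.linalg.norm(x - p), p
```

In the proposed method this distance is evaluated on projected features, and the subspace is chosen to minimize its mean square over the training data.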