995 results for Dependent Translation


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a contrastive approach to three different ways of building concepts, after showing the similar syntactic possibilities that coexist in terms. From the semantic point of view, however, each language family distributes meaning differently. The most important point we try to show is that the differences found in the psychological process of communicating concepts should guide the translator and the terminologist in target-text production and in the terminology planning process. Differences between languages in the transmission of information are due to the different roles played by different types of knowledge; here we distinguish, among others, analytic-descriptive knowledge and analogical knowledge. We also argue that neither is inherently best for determining the correctness of a term: there must be adequacy criteria in the selection process. The success of this concept building, or term building, matters when looking at the linguistic map of the information society.

Relevance:

20.00%

Publisher:

Abstract:

For some years now, translation theorist and educator Anthony Pym has been trying to establish a dialogue between the academic tradition he comes from and the world of the language industries into which he is meant to introduce his students: in other words, between the Translation Studies discipline and the localisation sector. This rapprochement is also the stated aim of his new book The Moving Text (p. 159). Rather than collect and synthesise what was previously dispersed over several articles, Pym has rewritten his material completely, both literally and conceptually, all in the light of the more than three decades of research he has conducted into the field of cross-cultural communication. The theoretical arguments are ably supported by a few short but telling and well-exploited examples.

Relevance:

20.00%

Publisher:

Abstract:

Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
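The generative model the abstract describes can be sketched in a few lines. This is a minimal illustration, not the DECA inference algorithm itself: the dimensions, Dirichlet parameters, and noise level below are invented for the example. It shows why modelling abundances as a mixture of Dirichlet densities automatically satisfies the non-negativity and constant-sum constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: p endmembers, d spectral bands, n pixels.
p, d, n = 3, 50, 1000

# Endmember signatures (columns of the mixing matrix M), non-negative.
M = rng.uniform(0.1, 1.0, size=(d, p))

# Abundances drawn from a 2-component mixture of Dirichlet densities:
# each abundance vector is non-negative and sums to one by construction.
weights = np.array([0.6, 0.4])
alphas = np.array([[9.0, 3.0, 3.0],
                   [2.0, 2.0, 11.0]])
component = rng.choice(2, size=n, p=weights)
S = np.vstack([rng.dirichlet(alphas[c]) for c in component])  # (n, p)

# Linear mixing model: each observed spectrum is M @ s plus sensor noise.
X = S @ M.T + rng.normal(0.0, 0.01, size=(n, d))
```

Because the constraints hold by construction of the prior, an EM-type procedure fitting this model never has to project its abundance estimates back onto the feasible set, which is the advantage claimed over ICA-based unmixing.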

Relevance:

20.00%

Publisher:

Abstract:

Plácido Castro's work has aroused our interest because it revolves around the question of Galician personality and identity. Whether working as a journalist and translator or writing essays on different literary issues, Plácido Castro never forgot his roots or his nation; one could even say that his whole life turned around Galicia. Our purpose is to make a critical analysis of his work, especially as a translator, to show how he used translation to develop national conscience and identity, and to see how far his ideology interfered in his interpretation and translation of Rossetti's poetry, in which he found a great similarity with Rosalía de Castro's work.

Relevance:

20.00%

Publisher:

Abstract:

Daily access to broadcast news is something that, generally speaking, we do not do without, whether to keep up with what is going on in our country or to stay informed about international events. But are we attentive to how the information about those events, namely those that occur outside Portugal, reaches us? How is that information handled, and by whom, before it is placed at our disposal? Is the audience aware that a large part of the news must be translated and given linguistic treatment? And how can we describe that translation and the way it is presented? This case study is just one example of translation's role and its crucial presence in TV news broadcasts, considering the way translation is processed, how it goes barely noticed (or not), how it influences the construction of the news story, and how the story influences the translation process. The case study was presented at the 2nd International Conference "Media for All", under the theme "Text on Air, Text on Screen", held at the Polytechnic Institute of Leiria on 7-9 November 2007.

Relevance:

20.00%

Publisher:

Abstract:

Due to the growing complexity and adaptability requirements of real-time systems, which often exhibit unrestricted Quality of Service (QoS) inter-dependencies among supported services and user-imposed quality constraints, it is increasingly difficult to optimise the level of service of a dynamic task set within a useful and bounded time. This is even more difficult when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand and tasks may be inter-dependent. This paper focuses on optimising a dynamic local set of inter-dependent tasks that can be executed at varying levels of QoS, to achieve an efficient resource usage that is constantly adapted to the specific constraints of devices and users, the nature of the executing tasks, and dynamically changing system conditions. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and to effectively optimise the rate at which the quality of the current solution improves as they are given more time to run, with minimal overhead compared with their traditional versions.
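The anytime property the abstract relies on can be sketched generically: quickly produce a feasible solution, then keep improving it for as long as the time budget allows, so that interrupting the search at any point still yields the best solution found so far. This is a minimal local-search skeleton under invented numbers (a toy QoS-level assignment with a shared resource cap standing in for inter-dependency constraints), not the paper's actual algorithm.

```python
import time
import random

def anytime_optimise(evaluate, neighbour, initial, budget_s=0.05):
    """Generic anytime local search: keep the best-so-far solution and
    improve it until the time budget expires."""
    best = initial
    best_value = evaluate(best)
    deadline = time.monotonic() + budget_s
    while time.monotonic() < deadline:
        candidate = neighbour(best)
        value = evaluate(candidate)
        if value > best_value:
            best, best_value = candidate, value
    return best, best_value

# Toy problem: choose a QoS level (0..4) for each of 5 tasks, maximising
# total utility subject to a shared resource cap.
random.seed(1)
utility = [[0, 2, 3, 5, 6]] * 5   # utility per task per QoS level
cost = [0, 1, 2, 4, 7]            # resource cost per QoS level
CAP = 12

def evaluate(levels):
    if sum(cost[l] for l in levels) > CAP:
        return float("-inf")      # infeasible assignment
    return sum(utility[i][l] for i, l in enumerate(levels))

def neighbour(levels):
    levels = list(levels)
    i = random.randrange(len(levels))
    levels[i] = random.randrange(5)
    return levels

solution, value = anytime_optimise(evaluate, neighbour, [0, 0, 0, 0, 0])
```

The key design choice, mirroring the abstract, is that solution quality is monotonically non-decreasing in the time granted, so the caller can trade optimality for bounded response time.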

Relevance:

20.00%

Publisher:

Abstract:

Due to the growing complexity and dynamism of many embedded application domains (including consumer electronics, robotics, automotive, and telecommunications), it is increasingly difficult to react to load variations and adapt the system's performance in a controlled fashion within a useful and bounded time. This is particularly noticeable when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand and tasks may exhibit unrestricted QoS inter-dependencies. This paper proposes a novel anytime adaptive QoS control policy in which the online search for the best set of QoS levels is combined with each user's personal preferences on their services' adaptation behaviour. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and to effectively optimise the rate at which the quality of the current solution improves as they are given more time to run, with minimal overhead compared with their traditional versions.

Relevance:

20.00%

Publisher:

Abstract:

Aspergillus fumigatus (Af) and Pseudomonas aeruginosa (Pa) are leading fungal and bacterial pathogens, respectively, in many clinical situations, and their interface and co-existence have therefore been studied. In some in vitro experiments, Pa products that are inhibitory to Af have been defined. In some clinical situations both can be biofilm producers, and biofilm could alter their physiology and affect their interaction; that may be most relevant to the airways in cystic fibrosis (CF), where both are often prominent residents. We have studied clinical Pa isolates from several sources for their effects on Af, including testing involving their biofilms. We show that the described inhibition of Af is related to the source and phenotype of the Pa isolate. Pa cells inhibited the growth and formation of Af biofilm from conidia, with CF isolates more inhibitory than non-CF isolates, and non-mucoid CF isolates the most inhibitory. Inhibition did not require live Pa contact, as culture filtrates were also inhibitory, again in the order non-mucoid CF > mucoid CF > non-CF. Preformed Af biofilm was more resistant to Pa, and the inhibition that did occur could be reproduced with filtrates. Inhibition of Af biofilm also appears dependent on bacterial growth conditions: filtrates from Pa grown as biofilm were more inhibitory than those from Pa grown planktonically. The differences shown among Pa from these different sources are consistent with the extensive evolutionary changes described in Pa in association with chronic residence in CF airways, and may reflect adaptive changes to life in a polymicrobial environment.

Relevance:

20.00%

Publisher:

Abstract:

The iterative simulation of the Brownian bridge is well known. In this article, we present a vectorial simulation alternative, based on Gaussian process regression, that is suitable for implementation in interpreted programming languages. We extend the vectorial simulation of path-dependent trajectories to other Gaussian processes, namely sequences of Brownian bridges, geometric Brownian motion, fractional Brownian motion, and the Ornstein-Uhlenbeck mean-reversion process.
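One way to see the vectorial (non-iterative) idea is to sample an entire bridge path in a single matrix operation from its Gaussian-process covariance, K(s, t) = min(s, t) - st/T, instead of stepping through the usual per-point recursion. This sketch assumes a standard bridge from a to b on [0, T]; the article's own GP-regression formulation may differ in detail.

```python
import numpy as np

def brownian_bridge_paths(n_paths, n_steps, a=0.0, b=0.0, T=1.0, seed=0):
    """Simulate Brownian bridge paths from a (at t=0) to b (at t=T) in one
    vectorised draw, using the bridge covariance K(s, t) = min(s, t) - s*t/T."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, T, n_steps + 2)[1:-1]           # interior times
    K = np.minimum.outer(t, t) - np.outer(t, t) / T      # bridge covariance
    L = np.linalg.cholesky(K + 1e-12 * np.eye(len(t)))   # jitter for stability
    z = rng.standard_normal((n_paths, len(t)))
    mean = a + (b - a) * t / T                           # bridge mean
    interior = mean + z @ L.T
    # Pin the endpoints; result has shape (n_paths, n_steps + 2).
    return np.hstack([np.full((n_paths, 1), a), interior,
                      np.full((n_paths, 1), b)])

paths = brownian_bridge_paths(n_paths=500, n_steps=100)
```

In an interpreted language this trades the Python-level loop of the iterative scheme for one Cholesky factorisation and one matrix product, which is exactly the kind of vectorisation the abstract advocates.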

Relevance:

20.00%

Publisher:

Abstract:

In order to cater for an extended readership, crime fiction, like most popular genres, is based on the repetition of a formula allowing for the reader's immediate identification. This first domestication is followed, at the time of its translation, by a second process, which wipes out those characteristics of the source text that may come into conflict with the dominant values of the target culture. An analysis of the textual and paratextual strategies used in the English translation of José Carlos Somoza's La caverna de las ideas (2000) shows the efforts to make the novel more easily marketable in the English-speaking world through the elimination of most of the obstacles to easy readability.

Relevance:

20.00%

Publisher:

Abstract:

The decision on endocrine breast cancer treatment relies on IHC-based assessment of the estrogen receptor alpha (ERα). However, ER positivity does not predict response in all cases, partly due to the methodological limitations of IHC. We investigated whether ESR1 and ESR2 gene expression, and the methylation of their respective promoters, may be related to the non-favourable outcome of a proportion of tamoxifen-treated patients, as well as to ERα and ERß loss. Formalin-fixed, paraffin-embedded breast cancer samples from 211 patients diagnosed between 1988 and 2004 were submitted to IHC-based ERα and ERß protein determination. ESR1 whole mRNA and promoter C-specific transcript (ESR1_C) levels, as well as the ESR2_ß1, ESR2_ß2/cx, and ESR2_ß5 transcripts, were assessed by real-time PCR. ESR1 promoters A and C, and ESR2 promoters 0N and 0K, were investigated by CpG methylation analysis using bisulfite-PCR for restriction analysis, or methylation-specific PCR. Given the promising results related to ESR1 promoter methylation, we quantified methylation at promoters A and C by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) together with the Epityper software. mRNA stability was assessed in actinomycin D-treated MCF-7 and MDA-MB-231 cells. ERα protein was quantified using transiently transfected breast cancer cells. Low ESR1_C transcript levels were associated with better overall survival (p = 0.017). High ESR1_C levels were associated with a non-favourable response in tamoxifen-treated patients (HR = 2.48; 95% CI 1.24-4.99), an effect that was more pronounced in patients with ERα/PgR double-positive tumours (HR = 3.41; 95% CI 1.45-8.04). The ESR1_C isoform had a prolonged mRNA half-life and a more relaxed 5'UTR secondary structure than the ESR1_A isoform. Western blot analysis showed that, at the protein level, promoter selectivity is indistinguishable. There was no correlation between the levels of the ESR2 isoforms, or ESR2 promoter methylation, and ERß protein staining. CpG methylation of ESR1 promoter C, and not promoter A, was responsible for ERα loss. We propose ESR1_C levels as a putative novel marker for breast cancer prognosis and for the prediction of tamoxifen response.

Relevance:

20.00%

Publisher:

Abstract:

The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures.
As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, to feature extraction, and to unsupervised recognition [28, 29]. ICA consists in finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward. In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance.
IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Considering the linear mixing model, hyperspectral observations are in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR).
Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55]. We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant sum (full additivity) constraints. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm.
This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need to have pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates the spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief resume of ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the ICA and IFA limitations in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
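When the endmember signatures are known, the constrained least-squares formulation mentioned above can be sketched as a projected-gradient method that minimises the reconstruction error while keeping each abundance vector on the probability simplex (non-negativity plus full additivity). This is a minimal illustration under invented toy data, not any specific cited algorithm; the simplex projection follows the standard sort-based scheme.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of each row of v onto the probability simplex,
    enforcing non-negativity and full additivity."""
    u = np.sort(v, axis=1)[:, ::-1]                    # sort descending
    css = np.cumsum(u, axis=1) - 1.0
    idx = np.arange(1, v.shape[1] + 1)
    rho = (u - css / idx > 0).sum(axis=1)
    theta = css[np.arange(v.shape[0]), rho - 1] / rho
    return np.maximum(v - theta[:, None], 0.0)

def constrained_unmix(X, M, n_iter=500):
    """Fully constrained least-squares unmixing by projected gradient:
    minimise ||X - S M^T||^2 with each row of S on the simplex."""
    n, p = X.shape[0], M.shape[1]
    S = np.full((n, p), 1.0 / p)                       # uniform start
    step = 1.0 / np.linalg.norm(M.T @ M, 2)            # safe step size
    for _ in range(n_iter):
        grad = (S @ M.T - X) @ M
        S = project_simplex(S - step * grad)
    return S

# Toy check: recover known abundances from noiseless mixtures.
rng = np.random.default_rng(0)
M = rng.uniform(0.1, 1.0, size=(30, 3))                # 30 bands, 3 endmembers
S_true = rng.dirichlet([2.0, 2.0, 2.0], size=200)
X = S_true @ M.T
S_hat = constrained_unmix(X, M)
```

The projection step is what distinguishes this from plain least squares: every iterate is a physically valid abundance map, mirroring the constraints the chapter later enforces through the Dirichlet source model.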

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces a new method to blindly unmix hyperspectral data, termed dependent component analysis (DECA). The method decomposes a hyperspectral image into a collection of reflectance (or radiance) spectra of the materials present in the scene (endmember signatures) and the corresponding abundance fractions at each pixel. DECA assumes that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. These abundances are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The method overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometry-based approaches. Its effectiveness is illustrated using simulated data based on U.S.G.S. laboratory spectra and real hyperspectral data collected by the AVIRIS sensor over Cuprite, Nevada.

Relevance:

20.00%

Publisher:

Abstract:

The effect of colour group on morbidity due to Schistosoma mansoni was examined in two endemic areas in the State of Minas Gerais, Brazil. Of the 2773 eligible inhabitants, 1971 (71.1%) participated in the study: 545 (27.6%) were classified as white, 719 (36.5%) as intermediate, and 707 (35.9%) as black. For each colour group, the signs and symptoms of individuals who eliminated S. mansoni eggs (cases) were compared with those of individuals who did not present eggs in the faeces (controls). The odds ratios were adjusted for age, gender, previous treatment for schistosomiasis, endemic area, and quality of the household. There was no evidence of a modifier effect of colour on diarrhoea, bloody faeces, or abdominal pain. A modifier effect of colour on hepatomegaly was evident among the most heavily infected (> 400 epg): the adjusted odds ratios for a palpable liver at the midclavicular and midsternal lines were smaller among blacks (5.4 and 6.5, respectively) and higher among whites (10.6 and 12.9) and intermediates (10.4 and 10.1, respectively). These results point to some degree of protection against hepatomegaly among the most heavily infected blacks in the studied areas.
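The crude odds ratio underlying comparisons like these can be computed directly from a 2x2 table; the study's adjusted ratios additionally control for covariates, typically via logistic regression. The counts below are hypothetical, chosen only to illustrate the arithmetic, and the Wald interval shown is the standard large-sample approximation.

```python
import math

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Crude odds ratio for a 2x2 table with a Wald 95% confidence interval."""
    a, b, c, d = exposed_cases, exposed_controls, unexposed_cases, unexposed_controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: palpable liver (cases/controls) by heavy infection status.
or_, ci = odds_ratio(40, 60, 10, 90)   # OR = (40 * 90) / (60 * 10) = 6.0
```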

Relevance:

20.00%

Publisher:

Abstract:

The intensification of agricultural productivity is an important challenge worldwide; however, environmental stressors can challenge this intensification. The progressive occurrence of the cyanotoxins cylindrospermopsin (CYN) and microcystin-LR (MC-LR), as a potential consequence of eutrophication and climate change, is of increasing concern in the agricultural sector because these cyanotoxins have been reported to exert harmful effects in crop plants. A proteomic-based approach has been shown to be a suitable tool for detecting and identifying the primary responses of organisms exposed to cyanotoxins. The aim of this study was to compare the leaf-proteome profiles of lettuce plants exposed to environmentally relevant concentrations of CYN and of a MC-LR/CYN mixture. Lettuce plants were exposed to 1, 10, and 100 µg/l CYN and MC-LR/CYN mixture for five days. The proteins of lettuce leaves were separated by two-dimensional electrophoresis (2-DE), and those that were differentially abundant were then identified by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF/TOF MS). The biological functions most represented among the identified proteins in both experiments were photosynthesis and carbon metabolism, and stress/defense response. Proteins involved in protein synthesis and signal transduction were also frequently observed in the MC-LR/CYN experiment. Although distinct protein-abundance patterns were observed in the two experiments, the effects appear to be concentration-dependent, and the effects of the mixture were clearly stronger than those of CYN alone. The results highlight the putative tolerance of lettuce to CYN at concentrations up to 100 µg/l. Furthermore, the combination of CYN with MC-LR at low concentrations (1 µg/l) stimulated a significant increase in the fresh weight (fr. wt) of lettuce leaves and, at the proteomic level, an increase in the abundance of a large number of proteins. 
In contrast, many proteins exhibited decreased abundance, or were absent from the gels, upon simultaneous exposure to 10 and 100 µg/l MC-LR/CYN; in the latter case, a significant decrease in the fr. wt of lettuce leaves was also obtained. These findings provide important insights into the molecular mechanisms of the lettuce response to CYN and MC-LR/CYN and may contribute to the identification of potential protein markers of exposure and of proteins that may confer tolerance to CYN and MC-LR/CYN. Furthermore, because lettuce is an important crop worldwide, this study may improve our understanding of the potential impact of these cyanotoxins on its quality traits (e.g., the presence of allergenic proteins).