986 results for text-dependent speaker recognition


Relevance:

30.00%

Publisher:

Abstract:

Neospora caninum is an apicomplexan parasite responsible for major economic losses due to abortions in cattle. Toll-like receptors (TLRs) sense specific microbial products and direct downstream signaling pathways in immune cells, linking innate and adaptive immunity. Here, we analyze the role of TLR2 in innate and adaptive immune responses during N. caninum infection. Inflammatory peritoneal macrophages and bone marrow-derived dendritic cells exposed to N. caninum-soluble antigens presented upregulated expression of TLR2. Increased receptor expression was correlated with TLR2/MyD88-dependent antigen-presenting cell maturation and pro-inflammatory cytokine production after stimulation by antigens. Impaired innate responses observed after infection of mice genetically deficient for TLR2 (TLR2(-/-)) were followed by downregulation of adaptive T helper 1 (Th1) immunity, represented by diminished parasite-specific CD4(+) and CD8(+) T-cell proliferation, IFN-gamma:interleukin (IL)-10 ratio, and IgG subclass synthesis. In parallel, TLR2(-/-) mice presented a higher parasite burden than wild-type (WT) mice at acute and chronic stages of infection. These results show that initial recognition of N. caninum by TLR2 participates in the generation of effector immune responses against the parasite and imply that the receptor may be a target for future prophylactic strategies against neosporosis. Immunology and Cell Biology (2010) 88, 825-833; doi:10.1038/icb.2010.52; published online 20 April 2010


T-cell cytokine profiles, anti-Porphyromonas gingivalis antibodies and Western blot analysis of antibody responses were examined in BALB/c, CBA/CaH, C57BL6 and DBA/2J mice immunized intraperitoneally with different doses of P. gingivalis outer membrane antigens. Splenic CD4 and CD8 cells were examined for intracytoplasmic interleukin (IL)-4, interferon (IFN)-gamma and IL-10 by FACS analysis, and levels of anti-P. gingivalis antibodies in the serum samples were determined by enzyme-linked immunosorbent assay. Western blot analysis was performed on the sera from mice immunized with 100 µg of P. gingivalis antigens. The four strains of mice demonstrated varying degrees of T-cell immunity, although the T-cell cytokine profiles exhibited by each strain were not affected by different immunizing doses. While BALB/c and DBA/2J mice exhibited responses that peaked at immunizing doses of 100-200 µg of P. gingivalis antigens, CBA/CaH and C57BL6 demonstrated weak T-cell responsiveness compared with control mice. Like the T-cell responses, serum antibody levels were not dose dependent. DBA/2J mice exhibited the lowest levels of anti-P. gingivalis antibodies, followed by BALB/c, with CBA/CaH and C57BL6 mice demonstrating the highest levels. Western blot analysis showed that there were differences in reactivity between the strains to a group of 13 antigens ranging in molecular weight from 15 to 43 kDa. Antibody responses to a number of these bands in BALB/c mice were of low density, whereas CBA/CaH and C57BL6 mice demonstrated high-density bands and DBA/2J mice showed medium to high responses. In conclusion, different immunizing doses of P. gingivalis outer membrane antigens had little effect on the T-cell cytokine responses and serum anti-P. gingivalis antibody levels. Western blot analysis, however, indicated that the four strains of mice exhibited different reactivity to some lower-molecular-weight antigens. Future studies are required to determine the significance of these differences, which may affect the outcome of P. gingivalis infection.


Epstein-Barr virus (EBV)-encoded nuclear antigen 1 (EBNA1) includes a unique glycine-alanine repeat domain that inhibits the endogenous presentation of cytotoxic T lymphocyte (CTL) epitopes through the class I pathway by blocking proteasome-dependent degradation of this antigen. This immune evasion mechanism has been implicated in the pathogenesis of EBV-associated diseases. Here, we show that cotranslational ubiquitination combined with N-end rule targeting enhances the intracellular degradation of EBNA1, resulting in a dramatic reduction in the half-life of the antigen. In vivo studies using DNA expression vectors encoding different forms of ubiquitinated EBNA1 revealed that this rapid degradation, remarkably, leads to induction of a very strong CTL response to an EBNA1-specific CTL epitope. Furthermore, this targeting also restored the endogenous processing of HLA class I-restricted CTL epitopes within EBNA1 for immune recognition by human EBV-specific CTLs. These observations provide, for the first time, evidence that the glycine-alanine repeat-mediated proteasomal block on EBNA1 can be reversed by specifically targeting this antigen for rapid degradation, resulting in enhanced CD8+ T cell-mediated recognition in vitro and in vivo.


Blasting has been the most frequently used method for rock breakage since black powder was first used to fragment rocks, more than two hundred years ago. This paper is an attempt to reassess standard design techniques used in blasting by providing an alternative approach to blast design. The new approach has been termed asymmetric blasting. Based on real-time rock recognition provided by measurement-while-drilling (MWD) techniques, asymmetric blasting is an approach that deals with rock properties as they occur in nature, i.e., randomly and asymmetrically spatially distributed. It is well accepted that the performance of basic mining operations, such as excavation and crushing, relies on a broken rock mass which has been pre-conditioned by the blast. By pre-conditioned we mean well fragmented, sufficiently loose and with an adequate muckpile profile. These muckpile characteristics affect loading and hauling [1]. The influence of blasting does not end there. Under the Mine to Mill paradigm, blasting has significant leverage on downstream operations such as crushing and milling. There is a body of evidence that blasting affects mineral liberation [2]. Thus, the importance of blasting has increased from simply fragmenting and loosening the rock mass to a broader role that encompasses many aspects of mining and affects the cost of the end product. A new approach is proposed in this paper which facilitates this trend: to treat non-homogeneous media (the rock mass) in a non-homogeneous manner (an asymmetrical pattern) in order to achieve an optimal result (in terms of muckpile size distribution). It is postulated that there are no logical reasons (besides the current lack of means to infer rock mass properties in the blind zones of the bench and onsite precedents) for drilling a regular blast pattern over a rock mass that is inherently heterogeneous. Real and theoretical examples of such a method are presented.


Dental implant recognition in patients without available records is a time-consuming and far from straightforward task. The traditional method is a completely user-dependent process, in which the expert compares a 2D X-ray image of the dental implant with a generic database. Due to the high number of implants available and the similarity between them, automatic/semi-automatic frameworks to aid implant model detection are essential. In this study, a novel computer-aided framework for dental implant recognition is suggested. The proposed method relies on image processing concepts, namely: (i) a segmentation strategy for semi-automatic implant delineation; and (ii) a machine learning approach for implant model recognition. Although the segmentation technique is the main focus of the current study, preliminary details of the machine learning approach are also reported. Two different scenarios are used to validate the framework: (1) comparison of the semi-automatic contours against manual contours of the implants in 125 X-ray images; and (2) classification of 11 known implants using a large reference database of 601 implants. In experiment 1, a Dice metric of 0.97±0.01, a mean absolute distance of 2.24±0.85 pixels and a Hausdorff distance of 11.12±6 pixels were obtained. In experiment 2, 91% of the implants were successfully recognized while reducing the reference database to 5% of its original size. Overall, the segmentation technique achieved accurate implant contours. Although the preliminary classification results prove the concept of the current work, more features and an extended database should be used in future work.
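The three contour metrics reported in experiment 1 are standard; as an illustrative sketch (not the authors' implementation), they can be computed from binary masks and contour point sets with NumPy as follows:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice overlap between two binary masks (1.0 = perfect agreement)."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())

def mean_absolute_distance(points_a, points_b):
    """Mean distance from each point of one contour to the nearest point of the other."""
    d = np.linalg.norm(points_a[:, None, :] - points_b[None, :, :], axis=-1)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def hausdorff(points_a, points_b):
    """Symmetric Hausdorff distance: the worst-case nearest-point deviation."""
    d = np.linalg.norm(points_a[:, None, :] - points_b[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())
```

The Dice metric measures area overlap, while the mean absolute and Hausdorff distances quantify the average and worst-case contour deviation in pixels.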


The 2008 economic crisis challenged accounting, either by demanding recognition and measurement criteria well adjusted to this scenario or even by questioning its ability to appropriately inform on entities' financial situation before the crisis occurred. Our purpose was thus to verify whether, during economic crises, listed companies in the Brazilian capital market tended to adopt earnings management (EM) practices. Our sample consisted of 3,772 firm-year observations over 13 years (1997 to 2009). We developed regression models considering discretionary accruals as the EM proxy (dependent variable), crisis as a macroeconomic factor (the dummy variable of interest), and ROA, market-to-book, size, leverage, foreign direct investment (FDI) and sector as control variables. Unlike previous EM studies, two approaches were used in the panel data regression models and multiple crises were observed simultaneously. Statistical tests revealed a significant relation between economic crises and EM practices among listed companies in Brazil under both approaches used.
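As a hedged illustration of the study's design (the variable names, coefficients and synthetic data below are hypothetical, not the authors' dataset), the crisis effect on discretionary accruals can be estimated by regressing the EM proxy on a crisis dummy plus controls:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
crisis = rng.integers(0, 2, n)           # dummy of interest: 1 = crisis year
roa = rng.normal(0.05, 0.02, n)          # control: return on assets
size = rng.normal(14.0, 1.5, n)          # control: log of total assets
# Synthetic discretionary accruals with a planted crisis effect of 0.03
accruals = 0.03 * crisis + 0.5 * roa - 0.002 * size + rng.normal(0.0, 0.01, n)

X = np.column_stack([np.ones(n), crisis, roa, size])  # intercept + regressors
beta, *_ = np.linalg.lstsq(X, accruals, rcond=None)   # OLS estimate
print(f"estimated crisis coefficient: {beta[1]:.3f}")
```

A significant coefficient on the dummy corresponds to the relation the paper reports between crises and EM practices; the actual study additionally handles the panel structure and sector effects.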


OBJECTIVE: To describe the overall transmission of malaria through a compartmental model, considering the human host and the mosquito vector. METHODS: A mathematical model was developed based on the following parameters: human host immunity, assuming the existence of acquired immunity and immunological memory, which boosts the protective response upon reinfection; the mosquito vector, taking into account that the average period of development from egg to adult mosquito and the extrinsic incubation period of the parasite (the transformation of infected but non-infectious mosquitoes into infectious mosquitoes) are dependent on the ambient temperature. RESULTS: The steady-state equilibrium values obtained with the model allowed the calculation of the basic reproduction ratio in terms of the model's parameters. CONCLUSIONS: The model allowed the calculation of the basic reproduction ratio, one of the most important epidemiological variables.
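The abstract does not give the paper's exact expression; as a rough illustration only (parameter values made up), a classical Ross-Macdonald-style basic reproduction ratio for malaria combines entomological and epidemiological parameters, with the extrinsic incubation period entering through the mosquito survival factor:

```python
import math

def basic_reproduction_ratio(m, a, b, c, mu, tau, r):
    """Illustrative Ross-Macdonald-style R0 for malaria (not the paper's formula).

    m: mosquitoes per human          a: human-biting rate
    b, c: transmission probabilities (vector->human, human->vector)
    mu: mosquito mortality rate      tau: extrinsic incubation period
    r: human recovery rate
    """
    # exp(-mu * tau): fraction of infected mosquitoes surviving incubation,
    # the term through which temperature dependence enters such models.
    return (m * a**2 * b * c * math.exp(-mu * tau)) / (r * mu)

r0 = basic_reproduction_ratio(m=10, a=0.3, b=0.5, c=0.5, mu=0.12, tau=10, r=0.01)
```

Since warmer temperatures shorten tau, the survival factor exp(-mu*tau) grows and R0 rises, which is the qualitative mechanism the model captures.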


Final Master's project submitted for the degree of Master in Electronics and Telecommunications Engineering


Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
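A minimal sketch of the data model underlying DECA (with hypothetical dimensions, signatures and Dirichlet parameters) generates abundance fractions that satisfy the non-negativity and constant-sum constraints by construction:

```python
import numpy as np

rng = np.random.default_rng(1)
bands, endmembers, pixels = 50, 3, 1000

# Hypothetical endmember signatures (columns of M) and Dirichlet abundances:
M = rng.uniform(0.0, 1.0, (bands, endmembers))
S = rng.dirichlet([2.0, 1.0, 0.5], size=pixels).T   # non-negative, columns sum to 1
Y = M @ S + rng.normal(0.0, 0.01, (bands, pixels))  # observed spectra with sensor noise
```

DECA fits mixtures of such Dirichlet densities to the abundances and infers the mixing matrix M with a GEM-type algorithm; the sketch only shows the generative side of the model.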


The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, via the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures.
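The orthogonal subspace projection concept described above can be sketched in a few lines (a toy example with made-up signatures, not the cited implementation): the projector annihilates the undesired signatures, so the detector responds only to the target component of a pixel:

```python
import numpy as np

def osp_projector(U):
    """Projector onto the subspace orthogonal to the columns of U (undesired signatures)."""
    return np.eye(U.shape[0]) - U @ np.linalg.pinv(U)

# Toy example: 4 bands, one undesired signature u, one target signature d.
u = np.array([[1.0], [1.0], [0.0], [0.0]])
d = np.array([1.0, 0.0, 1.0, 0.0])
P = osp_projector(u)
y = 0.7 * d + 2.0 * u[:, 0]          # pixel = target + undesired background
score = d @ P @ y                    # the undesired component is annihilated
```

Because P @ u = 0, the detector output equals 0.7 * (d @ P @ d) regardless of how strong the undesired background is.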
As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended to three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward. In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance.
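The dependence induced by the constant-sum constraint is easy to verify empirically; in this sketch (assuming uniform Dirichlet abundances, a choice made here for illustration), two abundance fractions of a three-endmember mixture are strongly negatively correlated, so they cannot be statistically independent:

```python
import numpy as np

rng = np.random.default_rng(0)
# Abundance fractions that sum to one are necessarily dependent:
S = rng.dirichlet([1.0, 1.0, 1.0], size=100_000)
corr = np.corrcoef(S[:, 0], S[:, 1])[0, 1]
print(f"correlation between two abundance fractions: {corr:.2f}")
```

For a symmetric Dirichlet with unit parameters over three components, the theoretical correlation between any two fractions is -0.5, which is why the independence assumption behind ICA fails for abundance sources.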
IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises, as in the ICA case, the IFA performance. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often, the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR).
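A minimal PPI-style sketch of the pure-pixel assumption (synthetic data, not the cited algorithms): because a linear functional over a simplex attains its extremes at vertices, pixels that are repeatedly extremal under random projections are candidate pure pixels:

```python
import numpy as np

rng = np.random.default_rng(2)
bands, pixels = 10, 500
endmembers = np.abs(rng.normal(1.0, 0.3, (bands, 3)))  # three made-up signatures
abund = rng.dirichlet([1.0, 1.0, 1.0], size=pixels).T
abund[:, :3] = np.eye(3)              # plant one pure pixel per endmember (indices 0-2)
Y = endmembers @ abund                # noiseless linear mixtures

counts = np.zeros(pixels, dtype=int)
for _ in range(2000):                 # PPI-style: score pixels extremal under random projections
    w = rng.normal(size=bands)
    proj = w @ Y
    counts[proj.argmin()] += 1
    counts[proj.argmax()] += 1
candidates = counts.argsort()[-3:]    # the most frequently extremal pixels
```

In this idealized setting the planted pure pixels are the only ones ever selected; with real, noisy data the counts are spread out, which is why VCA and N-FINDR use more robust geometric criteria.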
Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications—namely, signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to data. The MOG parameters (number of components, means, covariances, and weights) are inferred using the minimum description length (MDL) based algorithm [55]. We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, in which abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm.
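The dimensionality reduction step can be sketched with PCA via the SVD (synthetic mixture data with hypothetical dimensions): after centering, noiseless linear mixtures of p endmembers span only a (p-1)-dimensional subspace, so projecting onto the top p-1 components loses almost nothing:

```python
import numpy as np

rng = np.random.default_rng(3)
bands, pixels, p = 100, 2000, 3                 # p = number of endmembers
M = rng.uniform(size=(bands, p))
S = rng.dirichlet(np.ones(p), size=pixels).T
Y = M @ S + rng.normal(0.0, 1e-4, (bands, pixels))

Yc = Y - Y.mean(axis=1, keepdims=True)          # remove the mean spectrum
U, s, _ = np.linalg.svd(Yc, full_matrices=False)
k = p - 1                                       # centered mixtures span a (p-1)-dim subspace
Z = U[:, :k].T @ Yc                             # reduced representation (k x pixels)
rel_err = np.linalg.norm(Yc - U[:, :k] @ Z) / np.linalg.norm(Yc)
```

Here 100 bands are compressed to 2 components with negligible relative reconstruction error, which also averages out part of the noise and improves the SNR.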
This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometry-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.


Arguably, the most difficult task in text classification is choosing an appropriate set of features that allows machine learning algorithms to provide accurate classification. Most state-of-the-art techniques for this task involve careful feature engineering and a pre-processing stage, which may be too expensive in the emerging context of massive collections of electronic texts. In this paper, we propose efficient methods for text classification based on information-theoretic dissimilarity measures, which are used to define dissimilarity-based representations. These methods dispense with any feature design or engineering by mapping texts into a feature space using universal dissimilarity measures; in this space, classical classifiers (e.g. nearest neighbor or support vector machines) can then be used. The reported experimental evaluation of the proposed methods, on sentiment polarity analysis and authorship attribution problems, reveals that they approximate, and sometimes even outperform, previous state-of-the-art techniques, despite being much simpler in the sense that they require no text pre-processing or feature engineering.
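One well-known information-theoretic dissimilarity of this kind is the normalized compression distance; the sketch below (using zlib as the compressor and a toy corpus) shows how a 1-NN classifier can operate with no feature engineering at all. This illustrates the general idea, not necessarily the paper's exact measure:

```python
import zlib

def ncd(x: str, y: str) -> float:
    """Normalized compression distance: a parameter-free dissimilarity between texts."""
    cx = len(zlib.compress(x.encode()))
    cy = len(zlib.compress(y.encode()))
    cxy = len(zlib.compress((x + y).encode()))
    return (cxy - min(cx, cy)) / max(cx, cy)

def nearest_neighbor_label(text, labeled_corpus):
    """1-NN classification in the dissimilarity space: no features, no pre-processing."""
    return min(labeled_corpus, key=lambda item: ncd(text, item[0]))[1]
```

Texts that share structure compress well together, so their NCD is small; a nearest-neighbor rule over these distances is exactly a dissimilarity-based representation in the sense described above.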


The effect of colour group on morbidity due to Schistosoma mansoni was examined in two endemic areas situated in the State of Minas Gerais, Brazil. Of the 2773 eligible inhabitants, 1971 (71.1%) participated in the study: 545 (27.6%) were classified as white, 719 (36.5%) as intermediate and 707 (35.9%) as black. For each colour group, signs and symptoms of individuals who eliminated S. mansoni eggs (cases) were compared to those of individuals who did not present eggs in the faeces (controls). The odds ratios were adjusted for age, gender, previous treatment for schistosomiasis, endemic area and quality of the household. There was no evidence of a modifying effect of colour on diarrhoea, bloody faeces or abdominal pain. A modifying effect of colour on hepatomegaly was evident among the most heavily infected (> 400 epg): the adjusted odds ratios for a palpable liver at the mid-clavicular and mid-sternal lines were smaller among blacks (5.4 and 6.5, respectively) and higher among whites (10.6 and 12.9) and intermediates (10.4 and 10.1, respectively). These results point to the existence of some degree of protection against hepatomegaly among the most heavily infected blacks in the studied areas.


Dissertation submitted for the degree of Master in Biomedical Engineering