936 results for Pattern recognition, cluster finding, calibration and fitting methods
Abstract:
The conjugation of antigens with ligands of pattern recognition receptors (PRR) is emerging as a promising strategy for the modulation of specific immunity. Here, we describe a new Escherichia coli system for the cloning and expression of heterologous antigens in fusion with the OprI lipoprotein, a TLR ligand from the Pseudomonas aeruginosa outer membrane (OM). Analysis of the OprI expressed by this system reveals a triacylated lipid moiety mainly composed of palmitic acid residues. By offering tight regulation of expression and allowing for antigen purification by metal affinity chromatography, the new system circumvents the major drawbacks of former versions. In addition, the anchoring of OprI to the OM of the host cell is further explored for the production of novel recombinant bacterial cell wall-derived formulations (OM fragments and OM vesicles) with distinct potential for PRR activation. As an example, the African swine fever virus ORF A104R was cloned and the recombinant antigen was obtained in the three formulations. Overall, our results validate a new system suitable for the production of immunogenic formulations that can be used for the development of experimental vaccines and for studies on the modulation of acquired immunity.
Abstract:
In this work, the liver contour is semi-automatically segmented and quantified in order to help the identification and diagnosis of diffuse liver disease. The features extracted from the liver contour are jointly used with clinical and laboratory data in the staging process. The classification results of a support vector machine, a Bayesian classifier, and a k-nearest neighbor classifier are compared. A population of 88 patients at five different stages of diffuse liver disease and a leave-one-out cross-validation strategy are used in the classification process. The best results are obtained using the k-nearest neighbor classifier, with an overall accuracy of 80.68%. The good performance of the proposed method makes it a reliable indicator that can improve the staging of diffuse liver disease.
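As an illustration of the classification protocol described above, the following Python sketch runs a k-nearest neighbor classifier under leave-one-out cross-validation; the feature matrix, labels, and neighbor count are hypothetical stand-ins, not the paper's data.

```python
# Minimal sketch of k-NN staging with leave-one-out cross-validation,
# assuming X holds contour/clinical/laboratory features and y the stage labels.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(88, 12))          # 88 patients, 12 hypothetical features
y = rng.integers(0, 5, size=88)        # 5 hypothetical disease stages

clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())  # one fold per patient
print(f"leave-one-out accuracy: {scores.mean():.4f}")
```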
Abstract:
Steatosis, also known as fatty liver, corresponds to an abnormal retention of lipids within the hepatic cells and reflects an impairment of the normal processes of synthesis and elimination of fat. Several causes may lead to this condition, namely obesity, diabetes, or alcoholism. In this paper, an automatic classification algorithm is proposed for the diagnosis of liver steatosis from ultrasound images. The features are selected in order to capture the same characteristics used by the physicians in the diagnosis of the disease based on visual inspection of the ultrasound images. The algorithm, designed in a Bayesian framework, computes two images: i) a despeckled one, containing the anatomic and echogenic information of the liver, and ii) an image containing only the speckle, used to compute the textural features. These images are computed from the estimated RF signal generated by the ultrasound probe, where the dynamic range compression performed by the equipment is taken into account. A Bayes classifier, trained with data manually classified by expert clinicians and used as ground truth, reaches an overall accuracy of 95% and a sensitivity of 100%. The main novelties of the method are the estimation of the RF and speckle images, which makes it possible to accurately compute textural features of the liver parenchyma relevant for the diagnosis.
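A minimal sketch of the final classification stage (a Gaussian Bayes classifier over textural features, reporting accuracy and sensitivity) is given below; it assumes the despeckling and RF-estimation steps have already produced a feature matrix, and all names and data are synthetic.

```python
# Sketch: Gaussian Bayes classification of textural features with
# accuracy/sensitivity reporting, assuming features were extracted beforehand.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))                  # hypothetical textural features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # 1 = steatosis (synthetic label)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
model = GaussianNB().fit(X_tr, y_tr)
y_hat = model.predict(X_te)
print("accuracy   :", accuracy_score(y_te, y_hat))
print("sensitivity:", recall_score(y_te, y_hat))  # true-positive rate
```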
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
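The linear mixing model that unmixing inverts can be written as Y = MA + N, where the columns of A are non-negative and sum to one. The numpy sketch below generates data consistent with that model, using Dirichlet-distributed abundances as DECA assumes; the dimensions are illustrative, and this is not the DECA inference itself.

```python
# Sketch of the linear mixing model Y = M A + noise used in unmixing:
# columns of A are abundance vectors on the simplex (non-negative, sum to 1),
# matching DECA's Dirichlet modeling assumption.
import numpy as np

rng = np.random.default_rng(2)
L, p, n = 50, 3, 1000                    # bands, endmembers, pixels (illustrative)
M = rng.uniform(0.0, 1.0, size=(L, p))   # endmember signatures (columns)
A = rng.dirichlet(alpha=[2.0, 1.0, 0.5], size=n).T  # p x n abundance fractions
Y = M @ A + 0.01 * rng.normal(size=(L, n))          # noisy mixed pixels

assert np.all(A >= 0) and np.allclose(A.sum(axis=0), 1.0)  # simplex constraints
print("observed data cube (bands x pixels):", Y.shape)
```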
Abstract:
Chapter in book proceedings with peer review: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
The indiscriminate use of antibiotics in food-producing animals has received increasing attention as a contributory factor in the international emergence of antibiotic-resistant bacteria (Woodward in Pesticide, veterinary and other residues in food, CRC Press, Boca Raton, 2004). Numerous analytical methods for quantifying antibacterial residues in edible animal products have been developed over the years (Woodward in Pesticide, veterinary and other residues in food, CRC Press, Boca Raton, 2004; Botsoglou and Fletouris in Handbook of food analysis, residues and other food component analysis, Marcel Dekker, Ghent, 2004). Amoxicillin (AMOX) being one of those critical veterinary drugs, efforts have been made to develop simple and expeditious methods for its control in food samples. In the literature, only one AMOX-selective electrode has been reported so far. In that work, a phosphotungstate:amoxycillinium ion exchanger was used as the electroactive material (Shoukry et al. in Electroanalysis 6:914-917, 1994). Designing new materials based on molecularly imprinted polymers (MIPs) that are complementary to the size and charge of AMOX could lead to very selective interactions, thus enhancing the selectivity of the sensing unit. The AMOX-selective electrodes reported here used imprinted polymers as electroactive materials, with AMOX as the target molecule for designing a biomimetic imprinted cavity. Poly(vinyl chloride) sensors based on methacrylic acid displayed Nernstian slopes (60.7 mV/decade) and low detection limits (2.9×10-5 mol/L). The potentiometric responses were not affected by pH within 4-5 and showed good selectivity. The electrodes were applied successfully to the analysis of real samples.
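For context, a potentiometric sensor with a Nernstian slope is read out through a calibration line E = E0 + S·log10(C). The short sketch below inverts that line using the slope reported above (60.7 mV/decade); the intercept and measured potential are hypothetical illustration values.

```python
# Sketch: converting a measured electrode potential to an AMOX concentration
# via the calibration line E = E0 + S*log10(C). S is the reported Nernstian
# slope; E0 and E_measured are hypothetical illustration values.
S = 60.7           # mV per decade (reported slope)
E0 = 310.0         # mV, hypothetical calibration intercept
E_measured = 95.0  # mV, hypothetical reading

log_C = (E_measured - E0) / S
C = 10 ** log_C
print(f"estimated concentration: {C:.2e} mol/L")  # ~2.9e-4 mol/L here
```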
Abstract:
Time-sensitive Wireless Sensor Network (WSN) applications require finite delay bounds in critical situations. This paper provides a methodology for the modeling and the worst-case dimensioning of cluster-tree WSNs. We provide a fine-grained model of the worst-case cluster-tree topology characterized by its depth, the maximum number of child routers, and the maximum number of child nodes for each parent router. Using Network Calculus, we derive "plug-and-play" expressions for the end-to-end delay bounds, buffering, and bandwidth requirements as a function of the WSN cluster-tree characteristics and traffic specifications. The cluster-tree topology has been adopted by many cluster-based solutions for WSNs. We demonstrate how to apply our general results to the dimensioning of IEEE 802.15.4/ZigBee cluster-tree WSNs. We believe that this paper shows the fundamental performance limits of cluster-tree wireless sensor networks by providing a simple and effective methodology for the design of such WSNs.
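As a reference point, Network Calculus gives a classic closed-form delay bound when a token-bucket arrival curve (burst b, rate r) crosses a rate-latency service curve (rate R, latency T): D ≤ T + b/R, provided r ≤ R. The sketch below computes that textbook bound with illustrative parameters; it is not the paper's cluster-tree expressions.

```python
# Sketch: worst-case delay bound from Network Calculus for a token-bucket
# arrival curve (burst b, rate r) served by a rate-latency curve (rate R,
# latency T): D <= T + b/R, valid when r <= R. Parameters are illustrative.
def delay_bound(b: float, r: float, R: float, T: float) -> float:
    if r > R:
        raise ValueError("unstable: arrival rate exceeds service rate")
    return T + b / R

# e.g. 2 kbit burst, 5 kbit/s flow, 40 kbit/s service with 12 ms latency
print(f"delay bound: {delay_bound(b=2000, r=5000, R=40000, T=0.012):.4f} s")
```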
Abstract:
OBJECTIVE: To analyze whether dietary patterns during the third gestational trimester are associated with birth weight. METHODS: Longitudinal study conducted in the cities of Petropolis and Queimados, Rio de Janeiro (RJ), Southeastern Brazil, between 2007 and 2008. We analyzed data from the first and second follow-up waves of a prospective cohort. The food consumption of 1,298 pregnant women was assessed using a semi-quantitative food frequency questionnaire. Dietary patterns were obtained by exploratory factor analysis, using the Varimax rotation method. We also applied a multivariate linear regression model to estimate the association between food consumption patterns and birth weight. RESULTS: Four consumption patterns, which together explain 36.4% of the variability, were identified and divided as follows: (1) prudent pattern (milk, yogurt, cheese, fruit and fresh-fruit juice, crackers, and chicken/beef/fish/liver), which explained 14.9% of the variance; (2) traditional pattern, consisting of beans, rice, vegetables, breads, butter/margarine, and sugar, which explained 8.8% of the variance; (3) Western pattern (potato/cassava/yams, macaroni, flour/farofa/grits, pizza/hamburger/deep-fried pastries, soft drinks/cool drinks, and pork/sausages/egg), which accounts for 6.9% of the variance; and (4) snack pattern (sandwich cookies, salty snacks, chocolate, and chocolate drink mix), which explains 5.7% of the variance. The snack dietary pattern was positively associated with birth weight (β = 56.64; p = 0.04) in pregnant adolescents. CONCLUSIONS: For pregnant adolescents, the greater the adherence to the snack pattern during pregnancy, the greater the baby's birth weight.
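As a sketch of the statistical pipeline described above (exploratory factor analysis with Varimax rotation, followed by linear regression of birth weight on the pattern scores), the following uses scikit-learn on synthetic stand-in data; all variable names and values are hypothetical.

```python
# Sketch: exploratory factor analysis with Varimax rotation on food-frequency
# data, then regressing birth weight on the factor scores. Data are synthetic.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(1298, 25))        # 1,298 women x 25 hypothetical food items
birth_weight = 3200 + 40 * X[:, 0] + rng.normal(scale=400, size=1298)  # grams

fa = FactorAnalysis(n_components=4, rotation="varimax").fit(X)
scores = fa.transform(X)               # per-woman scores on the 4 patterns

reg = LinearRegression().fit(scores, birth_weight)
print("pattern coefficients (g per unit score):", np.round(reg.coef_, 2))
```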
Abstract:
Liquid crystalline cellulosic-based solutions with distinctive properties are at the origin of different kinds of multifunctional materials with unique characteristics. These solutions can form chiral nematic phases at rest, with tunable photonic behavior, and exhibit a complex behavior associated with the onset of a network of director field defects under shear. Techniques such as Nuclear Magnetic Resonance (NMR), rheology coupled with NMR (Rheo-NMR), rheology, optical methods, Magnetic Resonance Imaging (MRI), and Wide Angle X-ray Scattering (WAXS) were extensively used to elucidate the liquid crystalline characteristics of these cellulosic solutions. Cellulosic films produced by shear casting and fibers produced by electrospinning from these liquid crystalline solutions have regained wider attention due to the recognition of their innovative properties and their biocompatibility. Electrospun membranes composed of helical and spiral-shaped fibers achieve large surface areas, improving the performance of this kind of system. The moisture response, light-modulated wettability, and the capability of orienting protein and cellulose crystals opened a wide range of new applications for the shear-cast films. Characterization by NMR, X-rays, tensile tests, AFM, and optical methods allowed a detailed characterization of these soft cellulosic materials. In this work, special attention is given to recent developments, including, among others, a moisture-driven cellulosic motor and electro-optical devices.
Abstract:
OBJECTIVE: To analyze explanations for tuberculosis and the therapeutic itineraries of Brazilian indigenous people. METHODS: Case study with a qualitative-descriptive approach. We conducted semi-structured interviews with 11 Munduruku indigenous people, including direct observation of treatment for tuberculosis in the municipality of Jacareacanga, southwestern region of the state of Para, Brazil. To identify explanations for tuberculosis and therapeutic itineraries, we performed thematic content analysis. RESULTS: Traditional medicine was the first therapeutic option chosen by the indigenous people. However, biomedicine was also employed, which indicates a circulation between different therapeutic contexts and health concepts among the Munduruku. The explanations provided ranged from recognition of the signs and symptoms specific to tuberculosis to attribution of the disease to a spirit that leaves the body and wanders in the woods, returning ill into the body. Unlike the biomedical model, which links tuberculosis transmission strictly to interpersonal contact in closed spaces without natural lighting and ventilation (typically domestic environments), the Munduruku associate the disease with indirect contact between socially distant people (enemies or adversaries) in public and open places. CONCLUSIONS: The explanations given by the indigenous people are unique and deserve the attention of those responsible for developing public health policies, as well as of the teams who work in the villages. To guarantee efficient control of tuberculosis in these regions, the actions developed must integrate biomedical knowledge and the traditional medicine of the indigenous people, in addition to respecting and welcoming local cultural manifestations.
Abstract:
Many learning problems require handling high dimensional datasets with a relatively small number of instances. Learning algorithms are thus confronted with the curse of dimensionality, and need to address it in order to be effective. Examples of these types of data include the bag-of-words representation in text classification problems and gene expression data for tumor detection/classification. Usually, among the high number of features characterizing the instances, many may be irrelevant (or even detrimental) for the learning tasks. It is thus clear that there is a need for adequate techniques for feature representation, reduction, and selection, to improve both the classification accuracy and the memory requirements. In this paper, we propose combined unsupervised feature discretization and feature selection techniques, suitable for medium and high-dimensional datasets. The experimental results on several standard datasets, with both sparse and dense features, show the efficiency of the proposed techniques as well as improvements over previous related techniques.
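A minimal illustration of the general idea, unsupervised equal-frequency discretization followed by ranking features with a simple entropy-based dispersion criterion, is sketched below; it is not the specific combination proposed in the paper, and the criterion choice is an assumption.

```python
# Sketch: unsupervised equal-frequency discretization of each feature, then
# ranking features by the entropy of their discretized values (a simple
# dispersion/relevance proxy; the paper's actual criteria may differ).
import numpy as np

def discretize_equal_frequency(x: np.ndarray, n_bins: int = 8) -> np.ndarray:
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])  # inner edges
    return np.digitize(x, edges)

def entropy(codes: np.ndarray) -> float:
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 2000))   # few instances, many features
H = np.array([entropy(discretize_equal_frequency(X[:, j]))
              for j in range(X.shape[1])])
top = np.argsort(H)[::-1][:50]     # keep the 50 highest-entropy features
print("selected feature indices:", top[:10])
```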
Abstract:
Feature selection is a central problem in machine learning and pattern recognition. On large datasets (in terms of dimension and/or number of instances), using search-based or wrapper techniques can be computationally prohibitive. Moreover, many filter methods based on relevance/redundancy assessment also take a prohibitively long time on high-dimensional datasets. In this paper, we propose efficient unsupervised and supervised feature selection/ranking filters for high-dimensional datasets. These methods use low-complexity relevance and redundancy criteria, applicable to supervised, semi-supervised, and unsupervised learning, and are able to act as pre-processors for computationally intensive methods, focusing their attention on smaller subsets of promising features. The experimental results, with up to 10^5 features, show the time efficiency of our methods, with lower generalization error than state-of-the-art techniques, while being dramatically simpler and faster.
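A compact sketch in the same spirit follows: rank features by absolute correlation with the label (relevance) and greedily drop candidates that are too correlated with features already kept (redundancy). The specific criteria here are illustrative stand-ins, not the paper's.

```python
# Sketch of a relevance/redundancy filter: relevance = |corr(feature, label)|,
# redundancy = |corr(feature, already-selected feature)|. Illustrative only.
import numpy as np

def rr_filter(X: np.ndarray, y: np.ndarray, k: int, max_redundancy: float = 0.8):
    relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    selected = []
    for j in np.argsort(relevance)[::-1]:          # most relevant first
        if all(abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) < max_redundancy
               for s in selected):                 # skip redundant candidates
            selected.append(j)
        if len(selected) == k:
            break
    return selected

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 500))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
print("kept features:", rr_filter(X, y, k=10))
```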
Abstract:
Journal of Human Evolution, vol. 55, pp. 148-163.
Abstract:
The search for patterns in data in order to form groups is known as data clustering, one of the most common tasks in data mining and pattern recognition. This dissertation addresses the concept of entropy and uses algorithms with entropic criteria to cluster biomedical data. The use of entropy for clustering is relatively recent and arises from an attempt to exploit entropy's ability to extract higher-order information from the data distribution, either to use it as the criterion for forming groups (clusters) or to complement and improve existing algorithms in pursuit of better results. Some works involving algorithms based on entropic criteria have shown positive results in the analysis of real data. In this work, several algorithms based on entropic criteria were explored, along with their applicability to biomedical data, in an attempt to assess the suitability of these algorithms for this type of data. The results of the tested algorithms are compared with those obtained by other, more "conventional" algorithms, such as k-means, spectral clustering algorithms, and a density-based algorithm.
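To make the entropic criterion concrete, a minimal sketch follows: estimate the Shannon entropy of each cluster from histogram bins and score a partition by the weighted within-cluster entropy, where lower values indicate more concentrated clusters. This illustrates the idea only and is not one of the dissertation's algorithms.

```python
# Sketch: scoring a clustering by weighted within-cluster Shannon entropy,
# estimated with histograms. Illustrative of the entropic criterion only.
import numpy as np
from sklearn.cluster import KMeans

def shannon_entropy(points: np.ndarray, bins: int = 10) -> float:
    hist, _ = np.histogramdd(points, bins=bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def within_cluster_entropy(X: np.ndarray, labels: np.ndarray) -> float:
    return sum((labels == c).mean() * shannon_entropy(X[labels == c])
               for c in np.unique(labels))

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(-2, 0.5, (100, 2)), rng.normal(2, 0.5, (100, 2))])
labels = KMeans(n_clusters=2, n_init=10, random_state=6).fit_predict(X)
print("weighted within-cluster entropy:", within_cluster_entropy(X, labels))
```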
Abstract:
A new interaction paradigm, known as the Natural User Interface (NUI), is currently emerging for the recognition of gestures produced with the user's body. The Microsoft Kinect interaction device was initially designed for video game control on the Xbox 360 console. This device has proven to be a viable option for exploring other areas, such as supporting the teaching and learning process for primary school children. The prototype developed aims to define an interaction mode based on drawing letters in the air and to interpret the drawn symbols using the Kernel Discriminant Analysis (KDA), Support Vector Machines (SVM), and $N pattern recognizers. The development of this project was based on a study of the different NUI devices available on the market, of the NUI development libraries for this type of device, and of pattern recognition algorithms. Based on the first two elements, it was possible to obtain a more concrete view of which available hardware and software were suited to the intended goal. Pattern recognition is a very broad and complex topic, so it was necessary to select a limited set of these algorithms and carry out the corresponding tests to determine which one was best suited to the intended goal. Applying the same conditions to the three pattern recognition algorithms made it possible to evaluate their capabilities and identify $N as the one with the highest recognition accuracy. Finally, the viability of the developed prototype was assessed by testing it with subjects from two age groups to determine their capacity for adaptation and learning. In this study, the older group initially performed better with the interaction mode; however, the younger group showed a growing ability to adapt to this interaction mode, progressively improving its results.
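To illustrate the template-matching family that $N belongs to, below is a stripped-down, $1-style recognizer: resample a stroke to a fixed number of points, normalize translation and scale, and pick the template with the smallest mean point-to-point distance. This is a simplification for illustration; the full $N algorithm also handles multistroke input and rotation invariance.

```python
# Sketch of a $1-style gesture matcher (the simpler ancestor of $N):
# resample to fixed length, normalize translation/scale, nearest template wins.
import numpy as np

def resample(stroke: np.ndarray, n: int = 64) -> np.ndarray:
    d = np.r_[0, np.cumsum(np.linalg.norm(np.diff(stroke, axis=0), axis=1))]
    t = np.linspace(0, d[-1], n)   # equally spaced arc-length positions
    return np.column_stack([np.interp(t, d, stroke[:, 0]),
                            np.interp(t, d, stroke[:, 1])])

def normalize(stroke: np.ndarray) -> np.ndarray:
    pts = stroke - stroke.mean(axis=0)            # translate to centroid
    scale = np.ptp(pts, axis=0).max() or 1.0      # scale to unit box
    return pts / scale

def recognize(stroke, templates):
    q = normalize(resample(stroke))
    dists = {name: np.linalg.norm(q - normalize(resample(t)), axis=1).mean()
             for name, t in templates.items()}
    return min(dists, key=dists.get)

# hypothetical letter templates as point sequences
templates = {
    "L": np.array([[0, 2], [0, 0], [1.5, 0]], float),
    "C": np.array([[1, 2], [0, 1.6], [0, 0.4], [1, 0]], float),
}
print(recognize(np.array([[0, 1.9], [0.05, 0.1], [1.4, 0]], float), templates))
```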