Abstract:
Antigen recognition by cytotoxic CD8 T cells depends on a number of critical steps in MHC class I antigen processing, including proteasomal cleavage, TAP transport into the endoplasmic reticulum, and MHC class I binding. Based on extensive experimental data relating to each of these steps, there is now the capacity to model individual antigen processing steps with a high degree of accuracy. This paper demonstrates the potential to bring together models of individual antigen processing steps, for example proteasome cleavage, TAP transport, and MHC binding, to build highly informative models of functional pathways. In particular, we demonstrate how an artificial neural network model of TAP transport was used to mine an HLA-binding database so as to identify HLA-binding peptides transported by TAP. This integrated model of antigen processing provided the unique insight that HLA class I alleles apparently constitute two separate classes: those that are TAP-efficient for peptide loading (HLA-B27, -A3, and -A24) and those that are TAP-inefficient (HLA-A2, -B7, and -B8). Hence, using this integrated model we were able to generate novel hypotheses regarding antigen processing, and these hypotheses can now be tested experimentally. This model confirms the feasibility of constructing a virtual immune system, whereby each additional step in antigen processing is incorporated into a single modular model. Accurate models of antigen processing have implications for the study of basic immunology as well as for the design of peptide-based vaccines and other immunotherapies. (C) 2004 Elsevier Inc. All rights reserved.
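The staged pipeline described above can be sketched as follows. The scoring functions here are hypothetical placeholders (the paper used a trained artificial neural network for TAP transport and established cleavage and binding predictors); the sketch only illustrates how per-step models chain into one modular predictor.

```python
# Hedged sketch: chain per-step antigen-processing predictors into one
# pipeline. Every scoring rule below is a crude illustrative stand-in,
# NOT the models used in the paper.

def proteasomal_cleavage_score(peptide):
    # Placeholder: favour peptides ending in hydrophobic residues,
    # a rough proxy for C-terminal cleavage preference.
    return 1.0 if peptide[-1] in "LIVFMWY" else 0.2

def tap_transport_score(peptide):
    # Placeholder for the ANN model of TAP transport.
    return 0.9 if len(peptide) in (9, 10) else 0.3

def mhc_binding_score(peptide, allele):
    # Placeholder for an allele-specific MHC class I binding predictor.
    return 0.8 if peptide[1] in "LM" and allele == "HLA-A2" else 0.4

def pipeline_score(peptide, allele):
    # The steps are sequential, so combined likelihood is multiplicative.
    return (proteasomal_cleavage_score(peptide)
            * tap_transport_score(peptide)
            * mhc_binding_score(peptide, allele))

candidates = ["SLYNTVATL", "GILGFVFTL", "AAAKAAAAK"]
ranked = sorted(candidates, key=lambda p: pipeline_score(p, "HLA-A2"),
                reverse=True)
```

Because each stage is an independent module, any single predictor (e.g. the TAP model) can be replaced without touching the rest, which is the modularity the virtual-immune-system argument relies on.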
Abstract:
The settling characteristics of cell debris and inclusion bodies prior to, and following, fractionation in a disc-stack centrifuge were measured using Cumulative Sedimentation Analysis (CSA) and Centrifugal Disc Photosedimentation (CDS). The impact of centrifuge feedrate and repeated homogenisation on both cell debris and inclusion body collection efficiency was investigated. Increasing the normalised centrifuge feedrate (Q/Sigma) from 1.32 x 10(-9) m s(-1) to 3.97 x 10(-9) m s(-1) leads to a 36% increase in inclusion body paste purity. Purity may also be improved by repeated homogenisation: increasing the number of homogeniser passes results in a smaller cell debris size whilst leaving inclusion body size unaltered. At a normalised centrifuge feedrate of 2.65 x 10(-9) m s(-1), increasing the number of homogeniser passes from two (2) to ten (10) improved overall inclusion body paste purity by 58%. Grade-efficiency curves for both the cell debris and inclusion bodies have also been generated in this study. The data are described using an equation developed by Mannweiler (1989) with parameters of k = 0.15-0.26 and n = 2.5-2.6 for inclusion bodies, and k = 0.12-0.14 and n = 2.0-2.2 for cell debris. This is the first accurate experimentally determined grade-efficiency curve for cell debris. Previous studies have simply estimated debris grade-efficiency curves using an approximate debris size distribution and grade-efficiency curves determined with 'ideal particles' (e.g. spherical PVA particles). The findings of this study may be used to simulate and optimise the centrifugal fractionation of inclusion bodies from cell debris.
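The abstract quotes the fitted parameters k and n but not the functional form of the Mannweiler (1989) equation itself; the sketch below assumes a generic sigmoidal grade-efficiency parameterization purely to illustrate how such fitted curves separate inclusion bodies from debris.

```python
# Assumed sigmoidal grade-efficiency curve, T(d) = (k*d)^n / (1 + (k*d)^n),
# where d is particle size in micrometres. This functional form is an
# illustrative assumption; the abstract does not state Mannweiler's equation.

def grade_efficiency(d, k, n):
    """Fraction of particles of size d collected by the centrifuge."""
    x = (k * d) ** n
    return x / (1.0 + x)

# With the reported parameter ranges, inclusion bodies (larger k, steeper n)
# are collected more efficiently than cell debris at the same size:
ib = grade_efficiency(5.0, k=0.20, n=2.5)
debris = grade_efficiency(5.0, k=0.13, n=2.1)
```

Plotting two such curves against a measured size distribution is what lets the fractionation be simulated and the feedrate optimised.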
Abstract:
The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the focus of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI to avoid temporal information loss during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI) and, in turn, may lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping.
By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first principal components estimation from ROIs). The usefulness of the CGA approach in real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
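The core contrast the abstract draws (a single representative series versus eigen-time series per ROI) can be sketched as follows. This is a minimal lag-1 stand-in, not the paper's partial-canonical-correlation framework: PCA is done via SVD, and Granger influence is measured as the residual-variance reduction from adding lagged values of the other series.

```python
import numpy as np

def principal_components(roi, n_pc):
    """Top n_pc eigen-time series of an ROI array (time x voxels), via SVD."""
    roi = roi - roi.mean(axis=0)
    u, s, _ = np.linalg.svd(roi, full_matrices=False)
    return u[:, :n_pc] * s[:n_pc]

def granger_stat(x, y, lag=1):
    """Relative reduction in residual variance of y when lagged x is added.

    A minimal lag-1 illustration of Granger causality between two 1-D
    series (e.g. first eigen-time series of two ROIs)."""
    t = len(y)
    y_t, y_l, x_l = y[lag:], y[:-lag], x[:-lag]
    # Restricted model: y_t regressed on its own lag only.
    A = np.column_stack([np.ones(t - lag), y_l])
    r_res = y_t - A @ np.linalg.lstsq(A, y_t, rcond=None)[0]
    # Full model: y_t regressed on its own lag plus the lagged x.
    B = np.column_stack([A, x_l])
    r_full = y_t - B @ np.linalg.lstsq(B, y_t, rcond=None)[0]
    return (r_res @ r_res - r_full @ r_full) / (r_full @ r_full)
```

In the CGA framework this pairwise statistic is generalized to sets of eigen-time series per ROI, with significance assessed by bootstrapping rather than a parametric F test.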
Abstract:
We assess the effects of chemical processing, ethylene oxide sterilization, and threading on the bone surface and mechanical properties of bovine undecalcified bone screws. In addition, we evaluate the possibility of manufacturing bone screws with predefined dimensions. Scanning electron microscopy images show that chemical processing and ethylene oxide treatment cause collagen fiber amalgamation on the bone surface. Processed screws withstand higher ultimate loads under bending and torsion than the in natura bone group, with no change in pull-out strength between groups. Threading significantly reduces deformation and bone strength under torsion. Metrological data demonstrate the possibility of manufacturing bone screws with standardized dimensions.
Abstract:
Two hazard risk assessment matrices for the ranking of occupational health risks are described. The qualitative matrix uses qualitative measures of probability and consequence to determine risk assessment codes for hazard-disease combinations. A walk-through survey of an underground metalliferous mine and concentrator is used to demonstrate how the qualitative matrix can be applied to determine priorities for the control of occupational health hazards. The semi-quantitative matrix uses attributable risk as a quantitative measure of probability and uses qualitative measures of consequence. A practical application of this matrix is the determination of occupational health priorities using existing epidemiological studies. Calculated attributable risks from epidemiological studies of hazard-disease combinations in mining and minerals processing are used as examples. These historic response data do not reflect the risks associated with current exposures. A method using current exposure data, known exposure-response relationships and the semi-quantitative matrix is proposed for more accurate and current risk rankings.
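The qualitative matrix described above can be sketched as a simple lookup from probability and consequence ratings to a risk assessment code. The category labels and code boundaries below are illustrative assumptions, not the specific matrices used in the paper.

```python
# Hedged sketch of a qualitative hazard risk assessment matrix.
# Labels and thresholds are assumed for illustration only.

PROBABILITY = ["rare", "unlikely", "possible", "likely", "almost certain"]
CONSEQUENCE = ["negligible", "minor", "moderate", "major", "catastrophic"]

def risk_assessment_code(probability, consequence):
    """Combine qualitative ratings into a code from 1 (highest priority)
    to 4 (lowest priority)."""
    score = PROBABILITY.index(probability) + CONSEQUENCE.index(consequence)
    if score >= 6:
        return 1
    if score >= 4:
        return 2
    if score >= 2:
        return 3
    return 4
```

In the semi-quantitative variant, the probability axis would instead be an attributable risk computed from epidemiological data, while the consequence axis remains qualitative.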
Abstract:
Models of plant architecture allow us to explore how genotype-environment interactions affect the development of plant phenotypes. Such models generate masses of data organised in complex hierarchies. This paper presents a generic system for creating and automatically populating a relational database from data generated by the widely used L-system approach to modelling plant morphogenesis. Techniques from compiler technology are applied to generate attributes (new fields) in the database and to simplify query development for the recursively structured branching relationship. Use of biological terminology in an interactive query builder contributes towards making the system biologist-friendly. (C) 2002 Elsevier Science Ireland Ltd. All rights reserved.
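The transformation described above (L-system output into relational form) can be sketched as follows. The rewrite rule and bracket conventions are the usual L-system ones, but the row layout is an illustrative assumption, not the paper's schema.

```python
# Hedged sketch: derive an L-system string and flatten its branching
# structure into relational rows (id, parent_id, symbol), in the spirit
# of populating a database from L-system model output.

def derive(axiom, rules, steps):
    """Apply the rewrite rules to the axiom for the given number of steps."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def to_rows(lstring):
    """Flatten a bracketed L-system string ('[' opens a branch, ']' closes
    it) into rows encoding the parent-child branching relationship."""
    rows, stack, parent, next_id = [], [], None, 0
    for ch in lstring:
        if ch == "[":
            stack.append(parent)
        elif ch == "]":
            parent = stack.pop()
        else:
            rows.append((next_id, parent, ch))
            parent = next_id
            next_id += 1
    return rows
```

Once the branching relationship is explicit in `parent_id`, recursive queries over branches become ordinary relational joins, which is what the generated attributes simplify.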
Abstract:
Frequency deviation is a common problem in power system signal processing. Many power system measurements are carried out at a fixed sampling rate, assuming the system operates at its nominal frequency (50 or 60 Hz). However, the actual frequency may deviate from the nominal value from time to time for various reasons, such as disturbances and subsequent system transients. Measurement of signals based on a fixed sampling rate may introduce errors in such situations. In order to achieve high-precision signal measurement, appropriate algorithms need to be employed to reduce the impact of frequency deviation in the power system data acquisition process. This paper proposes an advanced algorithm that enhances the Fourier transform for power system signal processing. The algorithm is able to effectively correct for frequency deviation under a fixed sampling rate. Accurate measurement of power system signals is essential for the secure and reliable operation of power systems, and the algorithm is readily applicable to occasions where signal processing is affected by frequency deviation. Both mathematical proof and numerical simulation are given in this paper to illustrate the robustness and effectiveness of the proposed algorithm. Crown Copyright (C) 2003 Published by Elsevier Science B.V. All rights reserved.
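The problem and one simple remedy can be sketched as follows. This is not the paper's enhanced algorithm (the abstract does not specify it); it is an illustrative stand-in in which a crude zero-crossing estimate of the actual frequency replaces the nominal bin in a single-frequency DFT, removing most of the amplitude error caused by the deviation.

```python
import math

def dft_amplitude(samples, fs, f):
    """Single-frequency DFT amplitude estimate at f (Hz), sampling rate fs."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * f * k / fs) for k, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * f * k / fs) for k, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n

def estimate_frequency(samples, fs):
    """Crude rising-zero-crossing frequency estimate (clean sinusoid assumed)."""
    crossings = [k for k in range(1, len(samples))
                 if samples[k - 1] < 0 <= samples[k]]
    return fs * (len(crossings) - 1) / (crossings[-1] - crossings[0])

# A 49 Hz signal sampled on a grid fixed for a 50 Hz nominal frequency:
fs, nominal, actual = 3200.0, 50.0, 49.0
x = [math.sin(2 * math.pi * actual * k / fs) for k in range(640)]  # 0.2 s window
naive = dft_amplitude(x, fs, nominal)        # biased by the 1 Hz deviation
corrected = dft_amplitude(x, fs, estimate_frequency(x, fs))
```

Evaluating the DFT at the nominal 50 Hz bin underestimates the unit amplitude by several percent, while evaluating it at the estimated frequency recovers it to within about a percent.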
Abstract:
Image segmentation is a ubiquitous task in medical image analysis, required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice. Indeed, challenging data often require user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to efficiently interact with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application where user information can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, thus contributing to higher segmentation accuracy while keeping total analysis time competitive. To this end, we formalize the user interaction paradigm using a geometrical approach, in which the user input is mapped to a non-Cartesian space and used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term which improves the interaction with the segmented surface, thereby making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy and total analysis time, contributing to a more efficient use of existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire algorithm.
Abstract:
The characteristics of tunable wavelength filters based on a-SiC:H multilayered stacked pin cells are studied both theoretically and experimentally. The optical transducers were produced by PECVD and tested for proper fine tuning of the cyan and yellow fluorescent protein emissions. The active device consists of a p-i'(a-SiC:H)-n/p-i(a-Si:H)-n heterostructure sandwiched between two transparent contacts. Experimental data on spectral response analysis, current-voltage characteristics, and color and transmission rate discrimination are reported. Cyan and yellow fluorescent input channels were transmitted together, each one with a specific transmission rate and different intensities. The multiplexed optical signal was analyzed by reading out the generated photocurrents under positive and negative applied voltages. Results show that the optimized optical transducer is capable of combining the transient fluorescent signals onto a single output signal without losing any specificity (color and intensity). It acts as a voltage-controlled optical filter: when the applied voltages are chosen appropriately, the transducer can separately select the cyan and yellow channel emissions (wavelength and frequency) and also quantify their relative intensities. A theoretical analysis supported by a numerical simulation is presented.
Abstract:
This paper presents a methodology based on the knowledge discovery in databases (KDD) process to determine the failure probability of electrical equipment belonging to a real high-voltage electrical network. Data Mining (DM) techniques are used to discover a set of failure probabilities and, therefore, to extract knowledge concerning the unavailability of electrical equipment such as power transformers and high-voltage power lines. The framework includes several steps: the analysis of the real database, data pre-processing, the application of DM algorithms, and, finally, the interpretation of the discovered knowledge. To validate the proposed methodology, a case study based on real databases is used. These data carry considerable uncertainty due to climate conditions; for this reason, fuzzy logic was used to determine the set of electrical component failure probabilities in order to re-establish service. The results reflect the interesting potential of this approach and encourage further research on the topic.
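The fuzzy-logic step can be sketched as follows. The membership breakpoints, the wind-speed variable, and the rule multipliers are all illustrative assumptions, not values from the case study; the sketch only shows how fuzzy climate-severity rules can modulate a historical failure rate.

```python
# Hedged Sugeno-style sketch: weight an equipment failure rate by fuzzy
# climate-severity rules. All breakpoints and multipliers are assumed.

def tri(x, a, b, c):
    """Triangular fuzzy membership rising over [a, b], falling over [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def failure_probability(base_rate, wind_kmh):
    """Scale a historical failure rate by a weighted average of per-rule
    multipliers (mild x1, severe x3, extreme x10)."""
    mild = tri(wind_kmh, -50.0, 0.0, 60.0)
    severe = tri(wind_kmh, 40.0, 80.0, 120.0)
    extreme = min(1.0, max(0.0, (wind_kmh - 100.0) / 50.0))
    weights = mild + severe + extreme
    multiplier = (1.0 * mild + 3.0 * severe + 10.0 * extreme) / weights
    return base_rate * multiplier
```

The smooth overlap between memberships is what lets the estimated failure probability vary gradually with the climate conditions instead of jumping at crisp thresholds.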
Abstract:
Over time, the XML markup language has acquired considerable importance in application development, standards definition, and the representation of large volumes of data, such as databases. Today, processing XML documents in a short period of time is a critical activity in a large range of applications, which makes it essential to choose the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language for XML processing, such as Java, it becomes necessary to use effective mechanisms, e.g. APIs, which allow reading and processing of large documents in an appropriate manner. This paper presents a performance study of the main existing Java APIs that deal with XML documents, in order to identify the most suitable one for processing large XML files.
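The benchmarking methodology can be sketched as follows. The paper compares Java APIs; purely to illustrate the measurement approach, this sketch uses Python's standard-library parsers as stand-ins for the two dominant processing models (tree-based, like DOM, versus event-based, like SAX). The document generator is an assumption.

```python
import time
import xml.etree.ElementTree as ET
import xml.sax

def make_document(n_records):
    """Synthetic test document with a flat list of <rec> elements."""
    body = "".join(f"<rec id='{i}'><v>{i}</v></rec>" for i in range(n_records))
    return f"<root>{body}</root>"

class CountHandler(xml.sax.ContentHandler):
    """Event-based pass that only counts <rec> elements (SAX-style)."""
    def __init__(self):
        super().__init__()
        self.count = 0
    def startElement(self, name, attrs):
        if name == "rec":
            self.count += 1

def time_parsers(doc):
    """Run the same counting task through both models and time each."""
    t0 = time.perf_counter()
    tree_count = sum(1 for _ in ET.fromstring(doc).iter("rec"))  # DOM-style
    t1 = time.perf_counter()
    handler = CountHandler()
    xml.sax.parseString(doc.encode(), handler)                   # SAX-style
    t2 = time.perf_counter()
    return {"tree": (tree_count, t1 - t0), "sax": (handler.count, t2 - t1)}
```

The key design point carries over to the Java comparison: tree-based APIs materialize the whole document in memory, while event-based APIs stream it, so their relative cost diverges as documents grow.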
Abstract:
Amorphous SiC tandem heterostructures are used to filter a specific band in the visible range. Experimental and simulated results are compared to validate the use of SiC multilayered structures in applications where gain compensation is needed or where unwanted wavelengths must be attenuated. Spectral response data acquired under different frequencies, optical wavelength control, and side irradiations are analyzed, and transfer function characteristics are discussed. Color pulsed communication channels are transmitted together and the output signal is analyzed under different background conditions. Results show that under controlled wavelength backgrounds, the device sensitivity is enhanced in a precise wavelength range and quenched in the others, tuning or suppressing a specific band. Depending on the background wavelength and irradiation side, the device acts as a long-pass, a short-pass, or a band-rejection filter. An optoelectronic model supports the experimental results and gives insight into the physics of the device.