68 results for Two dimensional fuzzy fault tree analysis


Relevance:

100.00%

Publisher:

Abstract:

Following the success of the first round table in 2001, the Swiss Proteomic Society has organized two additional specific events during its last two meetings: a proteomic application exercise in 2002 and a round table in 2003. The main objective of these events is to bring together, around a challenging topic in mass spectrometry, two groups of specialists: those who develop and commercialize mass spectrometry equipment and software, and expert MS users in peptidomics and proteomics. The first round table (Geneva, 2001), entitled "Challenges in Mass Spectrometry", was supported by brief oral presentations that stressed critical questions in the field of MS development and applications (Stöcklin and Binz, Proteomics 2002, 2, 825-827). Topics included (i) direct analysis of complex biological samples; (ii) the status and perspectives of MS investigations of noncovalent peptide-ligand interactions; (iii) whether complementary instruments are more appropriate than a single universal instrument; (iv) standardization and improvement of MS signals for protein identification; (v) what the next generation of equipment might look like; and finally (vi) how to keep MS hardware and software up to date and accessible to all. For the SPS'02 meeting (Lausanne, 2002), a full-session alternative event, the "Proteomic Application Exercise", was proposed. Two different samples were prepared and sent to the participants: 100 µg of snake venom (a complex mixture of peptides and proteins) and 10-20 µg of an almost pure recombinant polypeptide derived from the shrimp Penaeus vannamei carrying a heterogeneous post-translational modification (PTM). Of the 15 participants who received the samples blind, eight returned results, and most were asked to present them at the congress, emphasizing the strategy, manpower and instrumentation used (Binz et al., Proteomics 2003, 3, 1562-1566). It appeared that, for the snake venom extract, the quality of the results was not particularly dependent on the strategy used, as all approaches allowed the identification of a number of protein families. The genus of the snake was identified in most cases, but the species remained ambiguous. Surprisingly, the precise identification of the almost pure recombinant polypeptide proved much more complicated than expected, as only one group reported the full sequence. Finally, the SPS'03 meeting reported here included a round table on the difficult and challenging task of "Quantification by Mass Spectrometry", a discussion supported by four selected oral presentations on the use of stable isotopes, electrospray ionization versus matrix-assisted laser desorption/ionization approaches to quantifying peptides and proteins in biological fluids, the handling of differential two-dimensional liquid chromatography tandem mass spectrometry data from high-throughput experiments, and the quantitative analysis of PTMs. During these three events, the impressive quality and quantity of exchanges between the developers and providers of mass spectrometry equipment and software, expert users and the audience were a key element in their success and have paved the way for future round tables and challenging exercises at SPS meetings.

Relevance:

100.00%

Publisher:

Abstract:

The present study deals with the analysis and mapping of Swiss franc interest rates. Interest rates depend on time and maturity, defining the term structure of interest rate curves (IRC). In the present study, IRC are considered in a two-dimensional feature space: time and maturity. Exploratory data analysis includes a variety of tools widely used in econophysics and geostatistics. Geostatistical models and machine learning algorithms (multilayer perceptrons and Support Vector Machines) were applied to produce interest rate maps. IR maps can be used for visualisation and pattern perception, to develop and explore economic hypotheses, to produce dynamic asset-liability simulations, and for financial risk assessment. The feasibility of applying the interest rate mapping approach to IRC forecasting is considered as well. (C) 2008 Elsevier B.V. All rights reserved.
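
The abstract above describes producing interest rate maps over a (time, maturity) feature space with machine learning models such as Support Vector Machines. Below is a minimal sketch of that mapping idea using Support Vector Regression; the observations, kernel choice and hyperparameters are illustrative assumptions, not the study's actual data or settings.

```python
# Sketch: map interest rates over a (time, maturity) feature space with SVR.
# All data values and hyperparameters below are hypothetical.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical observations: (time index, maturity in years) -> observed rate (%)
X_obs = np.array([[0, 0.25], [0, 2], [0, 10], [50, 0.25], [50, 2], [50, 10]], dtype=float)
y_obs = np.array([0.8, 1.2, 2.1, 0.9, 1.4, 2.3])

scaler = StandardScaler().fit(X_obs)
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(scaler.transform(X_obs), y_obs)

# Predict on a regular grid to obtain the interest rate "map"
times = np.linspace(0, 50, 51)
maturities = np.linspace(0.25, 10, 40)
tt, mm = np.meshgrid(times, maturities)
grid = np.column_stack([tt.ravel(), mm.ravel()])
ir_map = model.predict(scaler.transform(grid)).reshape(tt.shape)
print(ir_map.shape)  # (40, 51): one rate estimate per (maturity, time) grid cell
```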

Relevance:

100.00%

Publisher:

Abstract:

A conductometric micromethod combined with an image analysis system has been developed that allows determination of the CO2 production within 'two-dimensional' tissues, i.e., flat and thin cell layers or epithelial sheets. The preparation was mounted in an airtight chamber separated into two compartments by a thin silicone membrane permeable to gases. The lower compartment contained the nutritive medium and the preparation. The upper compartment and a conductivity-measuring capillary connected in series were perfused with a solution of Ba(OH)2. The CO2 produced by the tissue precipitated as BaCO3, and the resulting decrease in electrical conductivity was linearly related to the total CO2 production. In addition, the pattern of CO2 production was directly observable, as the BaCO3 crystals formed on the silicone membrane over the regions that produced CO2. The spatial distribution of the crystals was quantified by video image processing, and the regional CO2 production was evaluated with a spatial resolution of 100 µm. This new microtechnique was originally developed to study CO2 production in the early chick blastoderm, which is a disc 1-5 cells thick. At the young neurula stage, the CO2 production was found to be 235 ± 37 nmol/h (mean ± SD; n = 10) per blastoderm, and large variations in local CO2 production were detected from one region to another (from 0.6 to 6.5 nmol/h/mm2). These results indicate a high metabolic and functional differentiation of cells within the blastoderm. The possible applications and improvements of such a microtechnique are discussed.
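
As a hedged illustration of the image analysis step, the sketch below converts a thresholded map of BaCO3 crystal coverage into regional CO2 production estimates at 100 µm resolution, normalised to the conductometrically measured total. The frame, pixel size and threshold are illustrative assumptions, not the authors' actual pipeline.

```python
# Sketch: apportion the measured total CO2 production over 100 µm blocks
# according to local BaCO3 crystal coverage. All inputs are hypothetical.
import numpy as np

pixel_um = 10.0          # assumed pixel size (µm)
block_px = 10            # 10 px x 10 px blocks -> 100 µm spatial resolution
total_co2 = 235.0        # total production from the conductivity trace (nmol/h)

# Hypothetical 8-bit video frame of the membrane after the experiment
frame = np.random.default_rng(0).integers(0, 256, size=(400, 400)).astype(float)

crystals = frame > 128   # crude threshold separating BaCO3 crystals from background

# Fraction of crystal-covered pixels in each 100 µm x 100 µm block
h, w = crystals.shape
blocks = crystals.reshape(h // block_px, block_px, w // block_px, block_px)
coverage = blocks.mean(axis=(1, 3))

# Regional CO2 production proportional to local crystal coverage
regional_co2 = total_co2 * coverage / coverage.sum()     # nmol/h per block
area_mm2 = (block_px * pixel_um / 1000.0) ** 2
print(regional_co2.max() / area_mm2)  # peak local rate in nmol/h/mm^2
```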

Relevance:

100.00%

Publisher:

Abstract:

A monoclonal antibody, LAU-A1, which selectively reacts with all cells of the T-lineage, was derived from a fusion between spleen cells of a mouse immunized with paediatric thymocytes and mouse myeloma P X 63/Ag8 cells. As shown by an antibody-binding radioimmunoassay and analysis by flow microfluorometry of cells labelled by indirect immunofluorescence, the LAU-A1 antibody reacted with all six T-cell lines but not with any of the B-cell lines or myeloid cell lines tested from a panel of 17 human hematopoietic cell lines. The LAU-A1 antibody was also shown to react with the majority of thymocytes and E-rosette-enriched peripheral blood lymphocytes. Among the malignant cell populations tested, the blasts from all 20 patients with acute T-cell lymphoblastic leukemia (T-ALL) were found to react with the LAU-A1 antibody, whereas blasts from 85 patients with common ALL and 63 patients with acute myeloid leukemias were entirely negative. Examination of frozen tissue sections from fetal and adult thymuses stained by an indirect immunoperoxidase method revealed that cells expressing the LAU-A1 antigen were localized in both the cortex and the medulla. From the very broad reactivity spectrum of LAU-A1 antibody, we conclude that this antibody is directed against a T-cell antigen expressed throughout the T-cell differentiation lineage. SDS-PAGE analysis of immunoprecipitates formed by LAU-A1 antibody with detergent lysates of radiolabeled T-cells showed that the LAU-A1 antigen had an apparent mol. wt of 76,000 under non-reducing conditions. Under reducing conditions a single band with an apparent mol. wt of 40,000 was observed. Two-dimensional SDS-PAGE analysis confirmed that the 76,000 mol. wt component consisted of an S-S-linked dimeric complex. The surface membrane expression of LAU-A1 antigen on HSB-2 T-cells was modulated when these cells were cultured in the presence of LAU-A1 antibody. Re-expression of LAU-A1 antigen occurred within 24 hr after transfer of the modulated cells into antibody-free medium.

Relevance:

100.00%

Publisher:

Abstract:

To permit the tracking of turbulent flow structures in an Eulerian frame from single-point measurements, we make use of a generalization of conventional two-dimensional quadrant analysis to three-dimensional octants. We characterize flow structures using the sequences of these octants and show how significance may be attached to particular sequences using statistical null models. We analyze an example experiment and show how a particular dominant flow structure can be identified from the conditional probability of octant sequences. The frequency of this structure corresponds to the dominant peak in the velocity spectra, and the structure accounts for a high proportion of the total shear stress. We link this structure explicitly to the propensity for sediment entrainment and show that greater insight into sediment entrainment can be obtained by disaggregating the octants that occur within the identified macroturbulence structure from those that do not. Hence, this work goes beyond critiques of Reynolds stress approaches to bed-load entrainment that highlight the importance of outward interactions, to identifying and prioritizing the quadrants/octants that define particular flow structures. Key Points: (i) a new method for analysing single-point velocity data is presented; (ii) flow structures are identified by a sequence of flow states (termed octants); (iii) the identified structure exerts high stresses and causes bed-load entrainment.
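
As a concrete illustration of the octant idea, the sketch below classifies single-point velocity fluctuations by the signs of (u', v', w') and tabulates the conditional probabilities of consecutive octants. The synthetic velocity series and the sequence length of two are assumptions for illustration only, not the paper's experimental data or full significance testing.

```python
# Sketch: octant classification of velocity fluctuations and counting of
# consecutive octant pairs; the input series is synthetic.
from collections import Counter
import numpy as np

rng = np.random.default_rng(1)
u, v, w = rng.standard_normal((3, 5000))      # hypothetical single-point velocity series

# Fluctuations about the mean; each sample maps to one of 8 octants (0-7)
up, vp, wp = u - u.mean(), v - v.mean(), w - w.mean()
octants = (up > 0).astype(int) * 4 + (vp > 0).astype(int) * 2 + (wp > 0).astype(int)

# Frequency of consecutive octant pairs (length-2 sequences)
pairs = Counter(zip(octants[:-1], octants[1:]))

# Conditional probability P(next octant | current octant), used to flag dominant sequences
counts = np.zeros((8, 8))
for (a, b), n in pairs.items():
    counts[a, b] = n
cond_prob = counts / counts.sum(axis=1, keepdims=True)
print(cond_prob.round(2))
```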

Relevance:

100.00%

Publisher:

Abstract:

Because natural selection is likely to act on multiple genes underlying a given phenotypic trait, we study here the potential effect of ongoing and past selection on the genetic diversity of human biological pathways. We first show that genes included in gene sets are generally under stronger selective constraints than other genes and that their evolutionary response is correlated. We then introduce a new procedure to detect selection at the pathway level based on a decomposition of the classical McDonald-Kreitman test extended to multiple genes. This new test, called 2DNS, detects outlier gene sets and takes into account past demographic effects and evolutionary constraints specific to gene sets. Selective forces acting on gene sets can be easily identified by a mere visual inspection of the position of the gene sets relative to their two-dimensional null distribution. We thus find several outlier gene sets that show signals of positive, balancing, or purifying selection but also others showing an ancient relaxation of selective constraints. The principle of the 2DNS test can also be applied to other genomic contrasts. For instance, the comparison of patterns of polymorphisms private to African and non-African populations reveals that most pathways show a higher proportion of nonsynonymous mutations in non-Africans than in Africans, potentially due to different demographic histories and selective pressures.
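
The exact form of the 2DNS statistic is not given in the abstract. The sketch below is only a hedged illustration of a gene-set level, two-dimensional McDonald-Kreitman-style contrast compared against a resampling null; the counts, the two axes and the outlier score are assumptions chosen for illustration.

```python
# Sketch: a two-dimensional, gene-set level MK-style contrast with a resampling
# null. The per-gene counts and the chosen coordinates are hypothetical.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-gene counts: Pn, Ps (polymorphisms), Dn, Ds (fixed differences)
genes = rng.integers(1, 30, size=(2000, 4)).astype(float)

def set_statistics(counts):
    """Sum counts over a gene set and return two summary coordinates."""
    pn, ps, dn, ds = counts.sum(axis=0)
    return pn / (pn + ps), dn / (dn + ds)

# Observed statistic for one gene set of 50 genes (membership is hypothetical)
gene_set = genes[:50]
obs = set_statistics(gene_set)

# Two-dimensional null distribution from random gene sets of the same size
null = np.array([set_statistics(genes[rng.choice(len(genes), 50, replace=False)])
                 for _ in range(5000)])

# A simple outlier score: fraction of null points with both coordinates below the observed
score = np.mean((null[:, 0] < obs[0]) & (null[:, 1] < obs[1]))
print(obs, score)
```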

Relevance:

100.00%

Publisher:

Abstract:

A version of cascaded systems analysis was developed specifically with the aim of studying quantum noise propagation in x-ray detectors. Signal and quantum noise propagation was then modelled in four types of x-ray detectors used for digital mammography: four flat panel systems, one computed radiography system and one slot-scan silicon-wafer-based photon counting device. As required inputs to the model, the two-dimensional (2D) modulation transfer function (MTF), noise power spectra (NPS) and detective quantum efficiency (DQE) were measured for six mammography systems that utilized these different detectors. A new method to reconstruct anisotropic 2D presampling MTF matrices from 1D radial MTFs measured along different angular directions across the detector is described; an image of a sharp, circular disc was used for this purpose. The effective pixel fill factor for the flat panel systems was determined from the axial 1D presampling MTFs measured with a square sharp edge along the two orthogonal directions of the pixel lattice. Expectation MTFs (EMTFs) were then calculated by averaging the radial MTFs over all possible phases, and the 2D EMTF was formed with the same reconstruction technique used for the 2D presampling MTF. The quantum NPS was then established by noise decomposition from homogeneous images acquired as a function of detector air kerma. This was further decomposed into correlated and uncorrelated quantum components by fitting the radially averaged quantum NPS with the square of the radially averaged EMTF. This whole procedure allowed a detailed analysis of the influence of aliasing, signal and noise decorrelation, x-ray capture efficiency and global secondary gain on the NPS and detector DQE. The influence of noise statistics, pixel fill factor and additional electronic and fixed-pattern noise on the DQE was also studied. The 2D cascaded model and the decompositions performed on the acquired images also explained the observed quantum NPS and DQE anisotropy.
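
For readers unfamiliar with the quantities involved, the sketch below evaluates the standard textbook relation DQE(f) = MTF(f)^2 / (q * NNPS(f)) on placeholder curves. It does not reproduce the paper's cascaded-systems model, its noise decomposition, or its measured data; the curves, fluence and frequency grid are assumptions.

```python
# Sketch: the standard DQE relation evaluated on placeholder MTF and normalised
# NPS curves; all numerical values are assumed, not measured.
import numpy as np

f = np.linspace(0.0, 5.0, 101)            # spatial frequency (cycles/mm), assumed grid
mtf = np.exp(-0.5 * f)                    # placeholder radial MTF
nnps = 2.9e-5 * (0.3 + 0.7 * mtf**2)      # placeholder normalised NPS (mm^2)
q = 5.0e4                                 # assumed photon fluence (photons/mm^2)

dqe = mtf**2 / (q * nnps)
print(f"DQE(0) ~ {dqe[0]:.2f}, DQE at 5 cycles/mm ~ {dqe[-1]:.2f}")
```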

Relevance:

100.00%

Publisher:

Abstract:

The extension of traditional data mining methods to time series has been effectively applied to a wide range of domains such as finance, econometrics, biology, security, and medicine. Many existing mining methods deal with the task of change-point detection, but very few provide a flexible approach. Querying specific change points with linguistic variables is particularly useful in crime analysis, where intuitive, understandable, and appropriate detection of changes can significantly improve the allocation of resources for timely and concise operations. In this paper, we propose an on-line method for detecting and querying change points in crime-related time series using a meaningful representation and a fuzzy inference system. Change-point detection is based on a shape-space representation, and linguistic terms describing geometric properties of the change points are used to express queries, offering the advantage of intuitiveness and flexibility. An empirical evaluation is first conducted on a crime data set to confirm the validity of the proposed method and then on a financial data set to test its general applicability. A comparison with a similar change-point detection algorithm and a sensitivity analysis are also conducted. Results show that the method is able to accurately detect change points at very low computational cost. More broadly, the detection of specific change points within time series of virtually any domain is made more intuitive and more understandable, even for domain experts unfamiliar with data mining.
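
As a hedged illustration of the two ingredients described above, the sketch below computes a simple shape-space representation (slope and curvature from a sliding quadratic fit) and evaluates a fuzzy linguistic query such as "sharp increase". The window length, membership functions and query are assumptions, not the paper's actual inference system.

```python
# Sketch: shape-space coefficients from a sliding quadratic fit, queried with
# triangular fuzzy memberships. The crime series is synthetic.
import numpy as np

def shape_space(series, window=12):
    """Fit a quadratic to each sliding window; return (slope, curvature) per position."""
    t = np.arange(window) - (window - 1) / 2.0
    coords = []
    for i in range(len(series) - window + 1):
        curvature, slope, _ = np.polyfit(t, series[i:i + window], deg=2)
        coords.append((slope, curvature))
    return np.array(coords)

def tri(x, a, b, c):
    """Triangular fuzzy membership on [a, c] with peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical monthly crime counts with an abrupt rise halfway through
rng = np.random.default_rng(3)
series = np.concatenate([rng.poisson(20, 60), rng.poisson(35, 60)]).astype(float)

coords = shape_space(series)
slope, curvature = coords[:, 0], coords[:, 1]

# Linguistic query "sharp increase": high positive slope AND roughly flat curvature
degree = np.minimum(tri(slope, 0.3, 1.5, 4.0), tri(curvature, -0.3, 0.0, 0.3))
print("best match at window index", int(degree.argmax()),
      "with degree", round(float(degree.max()), 2))
```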