243 results for Fusion approaches
Abstract:
In mammalian cells, fusion between early endocytic vesicles has been shown to require the ubiquitous intracellular fusion factors N-ethylmaleimide-sensitive factor (NSF) and α-SNAP, as well as a factor specific for early endosomes, the small GTPase Rab5 [1-3]. We have previously demonstrated an additional requirement for phosphatidylinositol 3-kinase (PI 3-kinase) activity [4]. The membrane association of early endosomal antigen 1 (EEA1), a specific marker of early endosomes [5,6], has recently been shown to be similarly dependent on PI 3-kinase activity [7], and we therefore postulated that it might be involved in endosome fusion. Here, we present evidence that EEA1 has an important role in determining the efficiency of endosome fusion in vitro. Both the carboxy-terminal domain of EEA1 (residues 1098-1411) and specific antibodies against EEA1 inhibited endosome fusion when included in an in vitro assay. Furthermore, depletion of EEA1, both from the membrane fraction used in the assay by washing with salt and from the cytosol using an EEA1-specific antibody, resulted in inhibition of endosome fusion. The involvement of EEA1 in endosome fusion accounts for the sensitivity of the endosome fusion assay to inhibitors of PI 3-kinase.
Abstract:
Rab5-dependent endosome fusion is sensitive to the phosphoinositide 3-kinase inhibitor, wortmannin. It has been proposed that phosphoinositide 3-kinase activity may be required for activation of rab5 by influencing its nucleotide cycle such as to promote its active GTP state. In this report we demonstrate that endosome fusion remains sensitive to wortmannin despite preloading of endosomes with stimulatory levels of a GTPase-defective mutant rab5(Q79L) or of a xanthosine triphosphate-binding mutant, rab5(D136N), in the presence of the nonhydrolysable analogue XTPγS. These results suggest that activation of rab5 cannot be the principal function of the wortmannin-sensitive factor on the endosome fusion pathway. This result is extrapolated to all GTPases by demonstrating that endosome fusion remains wortmannin sensitive despite prior incubation with the nonhydrolysable nucleotide analogue GTPγS. Consistent with these results, direct measurement of clathrin-coated vesicle-stimulated nucleotide dissociation from exogenous rab5 was insensitive to the presence of wortmannin. A large excess of rab5(Q79L), beyond levels required for maximal stimulation of the fusion assay, afforded protection against wortmannin inhibition, and partial protection was also observed with an excess of wild-type rab5 independent of GTPγS.
Abstract:
CCTV (closed-circuit television) systems are widely deployed today. To ensure timely reaction in intelligent surveillance, determining the gender of people of interest is a fundamental task for real-world applications. However, typical video algorithms for gender profiling (usually face profiling) have three drawbacks. First, the profiling result is always uncertain. Second, the profiling result is not stable: the degree of certainty usually varies over time, sometimes even to the extent that a male is classified as a female, and vice versa. Third, for a robust profiling result in cases where a person's face is not visible, other features, such as body shape, are required. These algorithms may provide different recognition results - at the very least, they will provide different degrees of certainty. To overcome these problems, in this paper we introduce a Dempster-Shafer (DS) evidential approach that fuses the profiling results of multiple algorithms over a period of time; in particular, Denoeux's cautious rule is applied to combine mass functions along the time line. Experiments show that this approach provides better results than single profiling results and classic fusion results. However, if severe mis-classification occurs at the beginning of the time line, the combination can yield undesirable results. To remedy this weakness, we further propose three extensions to the evidential approach, incorporating notions of time-window, time-attenuation, and time-discounting, respectively. These extensions also apply Denoeux's rule along the time line and include the DS approach as a special case. Experiments show that these three extensions provide better results than their predecessor when mis-classifications occur.
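To make the evidential machinery concrete, here is a minimal sketch of combining two mass functions over the frame {male, female}. It uses Dempster's classic rule rather than Denoeux's cautious rule discussed in the abstract (the cautious rule additionally requires a canonical decomposition into simple mass functions, omitted here), and the classifier outputs and their names are hypothetical, not taken from the paper.

```python
from itertools import product

FRAME = frozenset({"male", "female"})

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset focal element -> mass)
    over the same frame using Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for (a, va), (b, vb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # compatible focal elements reinforce their intersection
            combined[inter] = combined.get(inter, 0.0) + va * vb
        else:      # disjoint focal elements contribute to the conflict mass
            conflict += va * vb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully contradict each other")
    k = 1.0 - conflict  # normalisation constant
    return {s: v / k for s, v in combined.items()}

# Hypothetical outputs from a face classifier and a body-shape classifier;
# mass assigned to FRAME represents ignorance (uncommitted belief).
male, female = frozenset({"male"}), frozenset({"female"})
m_face = {male: 0.6, female: 0.2, FRAME: 0.2}
m_body = {male: 0.5, female: 0.1, FRAME: 0.4}

fused = dempster_combine(m_face, m_body)
# The fused mass function commits more strongly to "male" than either source.
```

Note that Dempster's rule is not idempotent, which is exactly why the paper turns to the cautious rule for repeated fusion of correlated evidence along a time line.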
Abstract:
The growing accessibility of genomic resources through next-generation sequencing (NGS) technologies has revolutionized the application of molecular genetic tools to ecological and evolutionary studies in non-model organisms. Here we present the case study of the European hake (Merluccius merluccius), one of the most important demersal resources of European fisheries. Two sequencing platforms, the Roche 454 FLX (454) and the Illumina Genome Analyzer (GAII), were used for Single Nucleotide Polymorphism (SNP) discovery in the hake muscle transcriptome. De novo transcriptome assembly into unique contigs, annotation, and in silico SNP detection were carried out in parallel for the 454 and GAII sequence data. High-throughput genotyping using the Illumina GoldenGate assay was performed to validate 1,536 putative SNPs. Validation results were analysed to compare the performances of the 454 and GAII methods and to evaluate the role of several variables (e.g. sequencing depth, intron-exon structure, sequence quality and annotation). Despite well-known differences in sequence length and throughput, the two approaches showed similar assay conversion rates (approximately 43%) and percentages of polymorphic loci (67.5% and 63.3% for GAII and 454, respectively). Both NGS platforms therefore proved suitable for large-scale identification of SNPs in transcribed regions of non-model species, although the lack of a reference genome profoundly affects the genotyping success rate. The overall efficiency, however, can be improved using strict quality and filtering criteria for SNP selection (sequence quality, intron-exon structure, target region score).
Abstract:
Background: Large-scale biological jobs on high-performance computing systems require manual intervention if one or more computing cores on which they execute fail. This places not only a cost on the maintenance of the job, but also a cost on the time taken to reinstate the job, along with the risk of losing data and execution accomplished by the job before it failed. Approaches which can proactively detect computing core failures and take action to relocate the computing core's job onto reliable cores can make a significant step towards automating fault tolerance. Method: This paper describes an experimental investigation into the use of multi-agent approaches for fault tolerance. Two approaches are studied, the first at the job level and the second at the core level. The approaches are investigated for single-core failure scenarios that can occur in the execution of parallel reduction algorithms on computer clusters. A third approach is proposed that incorporates multi-agent technology at both the job and core level. Experiments are pursued in the context of genome searching, a popular computational biology application. Result: The key conclusion is that the approaches proposed are feasible for automating fault tolerance in high-performance computing systems with minimal human intervention. In a typical experiment in which fault tolerance is studied, centralised and decentralised checkpointing approaches on average add 90% to the actual time for executing the job. In the same experiment, the multi-agent approaches add only 10% to the overall execution time.