997 results for membrane computing
Abstract:
Artemisinin-induced dormancy is a proposed mechanism for failures of monotherapy and is linked with artemisinin resistance in Plasmodium falciparum. The biological characterization and dynamics of dormant parasites are not well understood. Here we report that following dihydroartemisinin (DHA) treatment in vitro, a small subset of morphologically dormant parasites was stained with rhodamine 123 (RH), a mitochondrial membrane potential (MMP) marker, and persisted to recovery. FACS-sorted RH-positive parasites resumed growth at 10,000 parasites/well, while RH-negative parasites failed to recover even at 5 million/well. Furthermore, transcriptional activity for mitochondrial enzymes was detected only in RH-positive dormant parasites. Importantly, after dormant parasites were treated with different concentrations of atovaquone, a mitochondrial inhibitor, their recovery was delayed or stopped. This demonstrates that mitochondrial activity is critical for the survival and regrowth of dormant parasites and that RH staining provides a means of identifying them. These findings provide novel paths for studying and eradicating this dormant stage.
Abstract:
Macrophages have the capacity to rapidly secrete a wide range of inflammatory mediators that influence the development and extent of an inflammatory response. Newly synthesized and/or preformed, stored cytokines and other inflammatory mediators are released upon stimulation, and the timing and volume of this release are highly regulated. To finely tune this process, secretion is controlled at many levels: at the level of transcription and translation, and post-translationally at the endoplasmic reticulum (ER), the Golgi, and at or near the cell surface. Here, we discuss recent advances in deciphering these cytokine pathways in macrophages, focusing on recent discoveries regarding the cellular machinery and mechanisms implicated in the synthesis, trafficking, and secretion of cytokines. The specific roles of trafficking machinery, including chaperones, GTPases, cytoskeletal proteins, and SNARE membrane-fusion proteins, will be discussed.
Abstract:
The trans-activator of transcription (TAT) peptide is regarded as the “gold standard” for cell-penetrating peptides, capable of passively traversing a mammalian membrane into the cytosolic space. This characteristic has been exploited through conjugation of TAT for applications such as drug delivery. However, the process by which TAT achieves membrane penetration remains ambiguous and unresolved. Mechanistic details of TAT peptide action are revealed herein by using three complementary methods: quartz crystal microbalance with dissipation (QCM-D), scanning electrochemical microscopy (SECM) and atomic force microscopy (AFM). Combined, these three scales of measurement establish that membrane uptake of the TAT peptide occurs by trans-membrane insertion through a “worm-hole” pore that leads to ion permeability across the membrane layer. AFM data provided nanometre-scale visualisation of membrane puncture by TAT in a mammalian-mimetic membrane bilayer. The TAT peptide does not show the same specificity towards a bacterial-mimetic membrane; QCM-D and SECM showed that it instead acts disruptively on these membranes. This investigation supports the energy-independent uptake of the cationic TAT peptide and provides empirical data that clarify the mechanism by which the TAT peptide achieves its membrane activity. The novel use of these three biophysical techniques provides valuable insight into the mechanism of TAT peptide translocation, which is essential for improving the cellular delivery of TAT-conjugated cargoes, including therapeutic agents required to target specific intracellular locations.
Abstract:
Biological systems are typically complex and adaptive, involving large numbers of entities, or organisms, and many-layered interactions among them. System behaviour evolves over time and typically benefits from previous experience by retaining memory of past events. Given the dynamic nature of these phenomena, it is non-trivial to provide a comprehensive description of complex adaptive systems and, in particular, to define the importance and contribution of low-level unsupervised interactions to the overall evolution process. In this chapter, the authors focus on the application of the agent-based paradigm in the context of the immune response to HIV. Explicit implementation of lymph nodes and the associated lymph network, including lymphatic chain structure, is a key objective and requires parallelisation of the model. Steps taken towards an optimal communication strategy are detailed.
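To make the agent-based design above concrete, the following minimal sketch models lymph nodes as sites on a lymphatic chain, each holding local populations of healthy and infected cells, with a migration step moving infected cells between neighbouring nodes. In a parallel version, each node would map to a separate process, and the migration loop would become the inter-process communication step whose optimisation the chapter discusses. All class names, rates, and population sizes here are invented for illustration; this is not the authors' implementation.

```python
# Minimal, hypothetical sketch of an agent-based lymph-network model.
# All rates and structures are invented; the real model is far richer
# (cell types, antigen presentation, full lymphatic topology).

class LymphNode:
    def __init__(self, node_id, healthy=1000, infected=5):
        self.node_id = node_id
        self.healthy = healthy
        self.infected = infected

    def step(self, infect_rate=0.002, clear_rate=0.05):
        # Local, low-level interactions: infection and immune clearance.
        new_infections = min(int(infect_rate * self.healthy * self.infected),
                             self.healthy)
        cleared = int(clear_rate * self.infected)
        self.healthy -= new_infections
        self.infected += new_infections - cleared

def migrate(chain, rate=0.01):
    # Movement along the lymphatic chain; in a parallel implementation
    # this loop is the communication step between processes.
    for upstream, downstream in zip(chain, chain[1:]):
        moving = int(rate * upstream.infected)
        upstream.infected -= moving
        downstream.infected += moving

chain = [LymphNode(i) for i in range(8)]  # a simple lymphatic chain
for t in range(100):
    for node in chain:
        node.step()
    migrate(chain)
print([node.infected for node in chain])
```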
Abstract:
Background: Recent advances in immunology have highlighted the importance of local properties in the overall progression of HIV infection. In particular, the gastrointestinal (GI) tract is seen as a key area during early infection, and the massive cell depletion associated with it may influence subsequent disease progression. This motivated the development of a large-scale agent-based model.
Results: Lymph nodes are explicitly implemented, and considerations on parallel computing permit large simulations and the inclusion of local features. The results obtained show that including the GI tract in the model leads to accelerated disease progression, during both the early stages and the long-term evolution, compared to a theoretical, uniform model.
Conclusions: These results confirm the potential of treatment policies currently under investigation, which focus on this region. They also highlight the potential of this modelling framework, incorporating both agent-based and network-based components, in the context of complex systems where scaling up alone does not yield additional insights.
Abstract:
Biomedical systems involve a large number of entities and intricate interactions between them. Their direct analysis is therefore difficult, and it is often necessary to rely on computational models. These models require significant resources, and parallel computing solutions are particularly well suited to them, given the inherently parallel nature of biomedical systems. Model hybridisation also permits the integration and simultaneous study of multiple aspects and scales of these systems, thus providing an efficient platform for multidisciplinary research.
Abstract:
Several algorithms and techniques widely used in Computer Science have been adapted from, or inspired by, known biological phenomena. This is a consequence of the multidisciplinary background of most early computer scientists. The field has now matured, and permits the development of tools and collaborative frameworks that play a vital role in advancing current biomedical research. In this paper, we briefly present examples of the former, and elaborate upon two of the latter, applied to immunological modelling and as a new paradigm in gene expression.
Abstract:
One of the main challenges in data analytics is that discovering structures and patterns in complex datasets is a compute-intensive task. Recent advances in high-performance computing provide part of the solution, as multicore systems are now more affordable and more accessible. In this paper, we investigate how this can be used to develop more advanced methods for data analytics. We focus on two specific areas: model-driven analysis and data mining using optimisation techniques.
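As a concrete illustration of data mining via optimisation on a multicore system, the sketch below parallelises a hyperparameter search for ridge regression across cores using Python's multiprocessing; each core evaluates part of the regularisation grid. The dataset, the model, and the grid are synthetic assumptions of ours, not the paper's case studies.

```python
import numpy as np
from multiprocessing import Pool

# Synthetic dataset (fixed seed so every worker sees identical data).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = X @ rng.normal(size=20) + 0.5 * rng.normal(size=500)
X_tr, X_va, y_tr, y_va = X[:400], X[400:], y[:400], y[400:]

def ridge_val_error(lam):
    # Closed-form ridge fit on the training split, scored on validation.
    w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(X_tr.shape[1]),
                        X_tr.T @ y_tr)
    return lam, float(np.mean((X_va @ w - y_va) ** 2))

if __name__ == "__main__":
    grid = np.logspace(-3, 3, 64)          # regularisation grid
    with Pool() as pool:                   # one worker per core by default
        results = pool.map(ridge_val_error, grid.tolist())
    best_lam, best_mse = min(results, key=lambda r: r[1])
    print(f"best lambda = {best_lam:.4g}, validation MSE = {best_mse:.4g}")
```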
Abstract:
As computational models in fields such as medicine and engineering become more refined, their resource requirements grow. In the first instance, these needs have been met using parallel computing and HPC clusters. However, such systems are often costly and lack flexibility, so HPC users are tempted to move to elastic HPC using cloud services. One difficulty in making this transition is that HPC and cloud systems differ, and performance may vary. The purpose of this study is to evaluate cloud services as a means of minimising both cost and computation time for large-scale simulations, and to identify which system properties have the most significant impact on performance. Our simulation results show that, while virtual CPU (vCPU) performance is satisfactory, network throughput may lead to difficulties.
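The two system properties the study highlights, virtual CPU performance and network throughput, can be probed with simple microbenchmarks. The sketch below is our own generic illustration, not the study's benchmark suite: it times a dense matrix multiplication as a compute proxy and a loopback socket transfer as a crude stand-in for network throughput (on a real cloud deployment, sender and receiver would run on separate instances).

```python
import socket
import threading
import time
import numpy as np

def cpu_gflops(n=1500):
    # Dense matmul as a compute proxy: ~2*n**3 floating-point operations.
    a, b = np.random.rand(n, n), np.random.rand(n, n)
    start = time.perf_counter()
    a @ b
    return 2 * n**3 / (time.perf_counter() - start) / 1e9

def loopback_mb_per_s(nbytes=200 * 1024 * 1024):
    # Loopback TCP transfer; substitute two cloud instances for a real test.
    server = socket.socket()
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    port = server.getsockname()[1]

    def send():
        s = socket.socket()
        s.connect(("127.0.0.1", port))
        s.sendall(b"x" * nbytes)
        s.close()

    sender = threading.Thread(target=send)
    sender.start()
    conn, _ = server.accept()
    start, received = time.perf_counter(), 0
    while received < nbytes:
        chunk = conn.recv(1 << 20)
        if not chunk:
            break
        received += len(chunk)
    elapsed = time.perf_counter() - start
    conn.close(); server.close(); sender.join()
    return received / elapsed / 1e6

if __name__ == "__main__":
    print(f"compute: {cpu_gflops():.1f} GFLOP/s")
    print(f"loopback transfer: {loopback_mb_per_s():.0f} MB/s")
```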
Abstract:
The research field of urban computing – defined as “the integration of computing, sensing, and actuation technologies into everyday urban settings and lifestyles” – considers the design and use of ubiquitous computing technology in public and shared urban environments. Its impact on cities, buildings, and spaces evokes innumerable kinds of change. Embedded into our everyday lived environments, urban computing technologies have the potential to alter the meaning of physical space and affect the activities performed in those spaces. This paper starts a multi-themed discussion of various aspects that make up the at times messy and certainly transdisciplinary field of urban computing and urban informatics.
Abstract:
The laz gene of Neisseria meningitidis is predicted to encode a lipid-modified azurin (Laz). Laz is very similar to azurin, a periplasmic copper-containing protein of the cupredoxin superfamily. In other bacteria, azurin is an electron donor to nitrite reductase, an important enzyme in the denitrifying process. It is not known whether Laz could function as an electron-transfer protein in this important pathogen. Laz protein was heterologously expressed in Escherichia coli and purified. Electrospray mass spectrometry indicated that the Laz protein contains one copper ion. Laz was shown to be redox-active in the presence of its redox-centre copper ion. When oxidized, Laz exhibits an intense blue colour and absorbs visible light at around 626 nm. The absorption is lost when the protein is exposed to diethyldithiocarbamate, a copper-chelating agent. Polyclonal antibodies were raised against purified Laz to detect expression of Laz under different growth conditions and to determine the orientation of Laz on the outer membrane. The expression of Laz under microaerobic and microaerobic denitrifying conditions was slightly higher than under aerobic conditions. However, Laz expression was similar between the wild-type strain and an fnr mutant, suggesting that the Fumarate/Nitrate reduction regulator (FNR) does not regulate Laz expression despite the presence of a partial FNR box upstream of the laz gene. We propose that some Laz protein is exposed on the outer-membrane surface of N. meningitidis, as anti-Laz antibodies increase complement-mediated killing of a capsule-deficient N. meningitidis strain in a dose-dependent fashion.
Abstract:
Cloud computing has significantly impacted a broad range of industries, but these technologies and services have been absorbed throughout the marketplace unevenly. Some industries have moved aggressively towards cloud computing, while others have moved much more slowly. For the most part, the energy sector has approached cloud computing in a measured and cautious way, with progress often taking the form of private cloud solutions rather than public ones, or hybridized information technology systems that combine cloud and existing non-cloud architectures. By moving towards cloud computing so slowly and tentatively, however, the energy industry may prevent itself from reaping the full benefits that a more complete migration to the public cloud has brought about in several other industries. This short communication is accordingly intended to offer a high-level overview of cloud computing, and to argue that the energy sector should make a more complete migration to the public cloud in order to unlock the major system-wide efficiencies that cloud computing can provide. It also argues that assets within the energy sector should be designed with as much modularity and flexibility as possible, so that they are not locked out of cloud-friendly options in the future.
Abstract:
The efficient computation of matrix function vector products has become an important area of research in recent times, driven in particular by two important applications: the numerical solution of fractional partial differential equations and the integration of large systems of ordinary differential equations. In this work we consider a problem that combines these two applications, in the form of a numerical solution algorithm for fractional reaction–diffusion equations that, after spatial discretisation, is advanced in time using the exponential Euler method. We focus on the efficient implementation of the algorithm on Graphics Processing Units (GPUs), as we wish to make use of the increased computational power available with this hardware. We compute the matrix function vector products using the contour integration method in [N. Hale, N. Higham, and L. Trefethen. Computing A^α, log(A), and related matrix functions by contour integrals. SIAM J. Numer. Anal., 46(5):2505–2523, 2008]. Multiple levels of preconditioning are applied to reduce the GPU memory footprint and to further accelerate convergence. We also derive an error bound for the convergence of the contour integral method that allows us to pre-determine the appropriate number of quadrature points. Results are presented that demonstrate the effectiveness of the method for large two-dimensional problems, showing a speedup of more than an order of magnitude compared to a CPU-only implementation.
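The contour-integral idea at the core of the cited approach can be sketched compactly: f(A)b = (1/2πi) ∮ f(z)(zI − A)^{-1} b dz, approximated by the trapezoidal rule on a circle enclosing the spectrum of A, with one shifted linear solve per quadrature point. The sketch below is a generic illustration under our own assumptions (a plain circular contour and dense solves), not the authors' GPU implementation or the conformal-map quadrature of Hale, Higham, and Trefethen; in the paper, the shifted solves are the preconditioned, GPU-accelerated kernel.

```python
import numpy as np
from scipy.linalg import expm

def contour_f_of_A_times_b(A, b, f, center, radius, num_points=48):
    """Approximate f(A) @ b via the Cauchy integral
        f(A) b = (1 / (2*pi*i)) * integral over Gamma of f(z) (zI - A)^{-1} b dz,
    using the trapezoidal rule on a circle Gamma enclosing the spectrum of A.
    For f(z) = z**alpha the circle must also avoid the branch cut at z <= 0."""
    n = A.shape[0]
    acc = np.zeros(n, dtype=complex)
    for k in range(num_points):
        theta = 2 * np.pi * (k + 0.5) / num_points  # midpoints avoid the real axis
        z = center + radius * np.exp(1j * theta)
        # One shifted solve per quadrature point: the dominant cost, and the
        # part that the paper preconditions and offloads to the GPU.
        x = np.linalg.solve(z * np.eye(n) - A, b)
        acc += f(z) * np.exp(1j * theta) * x
    return (radius / num_points) * acc.real  # real for real A, b and suitable f

# Quick sanity check against a dense reference, with f = exp.
rng = np.random.default_rng(1)
M = rng.normal(size=(50, 50))
A = M @ M.T / 50 + 5 * np.eye(50)   # spectrum roughly in [5, 9]
b = rng.normal(size=50)
approx = contour_f_of_A_times_b(A, b, np.exp, center=7.0, radius=6.5)
exact = expm(A) @ b
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```

Because the integrand is analytic and periodic in θ, the trapezoidal rule converges geometrically, which is why a modest, pre-determined number of quadrature points suffices, as the paper's error bound makes precise.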
Abstract:
Asset management has broadened from a focus on maintenance management to whole-of-life-cycle asset management, requiring a suite of new competencies spanning asset procurement, management, and disposal. Well-developed skills and competencies, as well as practical experience, are prerequisites for maintaining capability, managing demand, planning and setting priorities, and ensuring ongoing asset sustainability. This paper focuses on establishing critical understandings of data, information, and knowledge for asset management, along with the way in which benchmarking these attributes through computer-aided design may aid a strategic approach to asset management. The paper provides suggestions to improve the sharing, integration, and creation of asset-related knowledge through the application of codification and personalization approaches.
Abstract:
Fair Use Week has celebrated the evolution and development of the defence of fair use under copyright law in the United States. As Krista Cox noted, ‘As a flexible doctrine, fair use can adapt to evolving technologies and new situations that may arise, and its long history demonstrates its importance in promoting access to information, future innovation, and creativity.’ While the defence of fair use has flourished in the United States, the adoption of the defence of fair use in other jurisdictions has often been stymied. Professor Peter Jaszi has reflected: ‘We can only wonder (with some bemusement) why some of our most important foreign competitors, like the European Union, haven’t figured out that fair use is, to a great extent, the “secret sauce” of U.S. cultural competitiveness.’ Jurisdictions such as Australia have been at a dismal disadvantage, because they lack the freedoms and flexibilities of the defence of fair use.