996 results for COMPUTATIONAL CELLS
Abstract:
This study examines the applicability of a micromechanics approach based upon the computational cell methodology incorporating the Gurson-Tvergaard (GT) model and the CTOA criterion to describe ductile crack extension of longitudinal crack-like defects in high pressure pipeline steels. A central focus is to gain additional insight into the effectiveness and limitations of both approaches to describe crack growth response and to predict the burst pressure for the tested cracked pipes. A verification study conducted on burst testing of large-diameter, precracked pipe specimens with varying crack depth to thickness ratio (a/t) shows the potential predictive capability of the cell approach even though both the GT model and the CTOA criterion appear to depend on defect geometry. Overall, the results presented here lend additional support for further developments in the cell methodology as a valid engineering tool for integrity assessments of pipelines with axial defects. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Running hydrodynamic models interactively allows both visual exploration and change of model state during simulation. One of the main characteristics of an interactive model is that it should provide immediate feedback to the user, for example by responding to changes in model state or view settings. For this reason, such features are usually only available for models with a relatively small number of computational cells, which are used mainly for demonstration and educational purposes. It would be useful if interactive modeling were also available for the models typically used in consultancy projects involving large-scale simulations. This poses a number of technical challenges related to the combination of the model itself and the visualisation tools (scalability, implementation of an appropriate API for control and access to the internal state). While model parallelisation is increasingly addressed by the environmental modeling community, little effort has been spent on developing a high-performance interactive environment. What can we learn from other high-end visualisation domains such as 3D animation, gaming, and virtual globes (Autodesk 3ds Max, Second Life, Google Earth) that also focus on efficient interaction with 3D environments? In these domains high efficiency is usually achieved by computer graphics algorithms such as surface simplification depending on current view and distance to objects, and efficient caching of the aggregated representation of object meshes. We investigate how these algorithms can be re-used in the context of interactive hydrodynamic modeling without significant changes to the model code, allowing model operation on both multi-core CPU personal computers and high-performance computer clusters.
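The view-dependent simplification idea described in this abstract can be sketched as a simple distance-based level-of-detail rule; the function name, thresholds, and halving scheme below are illustrative assumptions, not taken from any actual model or visualisation code:

```python
def select_lod(distance, base_cell_size, levels=5, threshold=100.0):
    """Pick a mesh aggregation level from viewer distance (hypothetical
    scheme): nearby cells are drawn at full resolution, distant ones are
    aggregated into coarser blocks. Each level doubles the distance band
    it covers and doubles the effective cell size."""
    level = 0
    while level < levels - 1 and distance > threshold * (2 ** level):
        level += 1
    return level, base_cell_size * (2 ** level)
```

A renderer would call this per mesh region each frame and fetch the cached aggregated mesh for the returned level, so only nearby cells cost full resolution.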
Abstract:
The determination of the local Lagrangian evolution of the flow topology in wall-bounded turbulence, and of the Lagrangian evolution associated with entrainment across the turbulent/non-turbulent interface into a turbulent boundary layer, requires accurate tracking of a fluid particle and its local velocity gradients. This paper addresses the implementation of fluid-particle tracking in both a turbulent boundary layer direct numerical simulation and a fully developed channel flow simulation. Determination of the sub-grid particle velocity is performed using cubic B-spline, four-point Hermite spline, and higher-order Hermite spline interpolation. Both wall-bounded flows show similar oscillations in the Lagrangian tracers of both velocity and velocity gradients, corresponding to the movement of particles across the boundaries of computational cells. While these oscillations in the particle velocity are relatively small and have a negligible effect on the particle trajectories for time steps of the order of CFL = 0.1, they appear to be the cause of significant oscillations in the evolution of the invariants of the velocity gradient tensor.
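A four-point Hermite spline of the kind mentioned above can be sketched in one dimension as the standard Catmull-Rom form, with slopes at the two inner samples from centred differences; this is the generic textbook scheme, not necessarily the paper's exact implementation:

```python
def hermite4(f, x):
    """Four-point (Catmull-Rom) Hermite interpolation of uniformly spaced
    samples f at integer grid points, evaluated at fractional position x."""
    i = int(x)
    t = x - i
    p0, p1, p2, p3 = f[i - 1], f[i], f[i + 1], f[i + 2]
    m1 = 0.5 * (p2 - p0)           # centred-difference slope at p1
    m2 = 0.5 * (p3 - p1)           # centred-difference slope at p2
    h00 = 2*t**3 - 3*t**2 + 1      # cubic Hermite basis functions
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return h00*p1 + h10*m1 + h01*p2 + h11*m2
```

Particle tracking then alternates such an interpolation of the velocity at the particle position with a time integrator; the oscillations the paper reports arise because the interpolant's higher derivatives jump as the particle crosses cell boundaries.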
Abstract:
During the epoch when the first collapsed structures formed (6<z<50) our Universe went through an extended period of changes. Some of the radiation from the first stars and accreting black holes in those structures escaped and changed the state of the Intergalactic Medium (IGM). The era of this global phase change, in which the state of the IGM was transformed from cold and neutral to warm and ionized, is called the Epoch of Reionization. In this thesis we focus on numerical methods to calculate the effects of this escaping radiation. We start by considering the performance of the cosmological radiative transfer code C2-Ray. We find that although this code efficiently and accurately solves for the changes in the ionized fractions, it can yield inaccurate results for the temperature changes. We introduce two new elements to improve the code. The first element, an adaptive time step algorithm, quickly determines an optimal time step by considering only the computational cells relevant for this determination. The second element, asynchronous evolution, allows different cells to evolve with different time steps. An important constituent of methods to calculate the effects of ionizing radiation is the transport of photons through the computational domain, or "ray-tracing". We devise a novel ray-tracing method called PYRAMID, which uses a new geometry - the pyramidal geometry. This geometry shares properties with both the standard Cartesian and spherical geometries, which makes it on the one hand easy to use in conjunction with a Cartesian grid and on the other hand ideally suited to trace radiation from a radially emitting source. A time-dependent photoionization calculation requires not only tracing the path of photons but also solving the coupled set of photoionization and thermal equations. Several different solvers for these equations are in use in cosmological radiative transfer codes.
We conduct a detailed and quantitative comparison of four different standard solvers, in which we evaluate how their accuracy depends on the choice of the time step. This comparison shows that their performance can be characterized by two simple parameters and that the C2-Ray solver generally performs best.
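The adaptive time-step idea described above can be illustrated by limiting the fractional change of each cell's tracked quantity (e.g. ionized fraction) per step. This is a simplified global version under assumed names; the actual C2-Ray algorithm additionally restricts attention to only the relevant cells:

```python
def adaptive_time_step(rates, values, max_change=0.1, dt_max=1.0):
    """Choose the largest dt such that no cell's quantity changes by more
    than a fraction max_change in one step: dt <= max_change*|v|/|dv/dt|.
    rates[i] is the current time derivative in cell i, values[i] its value."""
    dt = dt_max
    for v, r in zip(values, rates):
        if r != 0.0:
            dt = min(dt, max_change * abs(v) / abs(r))
    return dt
```

Asynchronous evolution then generalizes this by letting each cell advance with its own such step instead of forcing the global minimum on all cells.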
Abstract:
Our docking program, Fitted, implemented in our computational platform, Forecaster, has been modified to carry out automated virtual screening of covalent inhibitors. With this modified version of the program, virtual screening and further docking-based optimization of a selected hit led to the identification of potential covalent reversible inhibitors of prolyl oligopeptidase activity. After visual inspection, a virtual hit molecule together with four analogues were selected for synthesis and made in one to five chemical steps. Biological evaluations on recombinant POP and FAPα enzymes, cell extracts, and living cells demonstrated high potency and selectivity for POP over FAPα and DPPIV. Three compounds even exhibited high nanomolar inhibitory activities in intact living human cells and acceptable metabolic stability. This small set of molecules also demonstrated that covalent binding and/or geometrical constraints on the ligand/protein complex may lead to an increase in bioactivity.
Abstract:
A two-phase three-dimensional computational model of an intermediate temperature (120-190 °C) proton exchange membrane (PEM) fuel cell is presented. This represents the first attempt to model PEM fuel cells employing intermediate temperature membranes, in this case, phosphoric acid doped polybenzimidazole (PBI). To date, mathematical modeling of PEM fuel cells has been restricted to low temperature operation, especially to those employing Nafion® membranes, while research on PBI as an intermediate temperature membrane has been solely at the experimental level. This work is an advancement in the state of the art of both these fields of research. With a growing trend toward higher temperature operation of PEM fuel cells, mathematical modeling of such systems is necessary to help hasten the development of the technology and highlight areas where research should be focused. This mathematical model accounted for all the major transport and polarization processes occurring inside the fuel cell, including the two phase phenomenon of gas dissolution in the polymer electrolyte. Results were presented for polarization performance, flux distributions, concentration variations in both the gaseous and aqueous phases, and temperature variations for various heat management strategies. The model predictions matched well with published experimental data, and were self-consistent. The major finding of this research was that, due to the transport limitations imposed by the use of phosphoric acid as a doping agent, namely low solubility and diffusivity of dissolved gases and anion adsorption onto catalyst sites, the catalyst utilization is very low (∼1-2%). Significant cost savings were predicted with the use of advanced catalyst deposition techniques that would greatly reduce the eventual thickness of the catalyst layer, and subsequently improve catalyst utilization.
The model also predicted that an increase in power output on the order of 50% is expected if alternative doping agents to phosphoric acid can be found, which afford better transport properties of dissolved gases, reduced anion adsorption onto catalyst sites, and which maintain stability and conductive properties at elevated temperatures.
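The polarization processes listed in this abstract (activation, ohmic, and concentration losses) can be illustrated with a standard zero-dimensional polarization-curve formula. All parameter values below are generic assumptions for illustration; the thesis itself uses a full three-dimensional two-phase model, not this lumped form:

```python
import math

def cell_voltage(i, ocv=1.0, i0=1e-4, b=0.06, r_ohm=0.1, i_lim=1.5):
    """Zero-dimensional polarization model (illustrative parameters):
    cell voltage = open-circuit voltage minus Tafel activation loss,
    ohmic loss, and concentration loss, for current density i (A/cm^2)."""
    eta_act = b * math.log10(i / i0)           # Tafel activation loss
    eta_ohm = r_ohm * i                        # ohmic (resistive) loss
    eta_conc = -b * math.log10(1 - i / i_lim)  # concentration loss near i_lim
    return ocv - eta_act - eta_ohm - eta_conc

v = cell_voltage(0.5)
```

Sweeping i from near zero to i_lim traces the familiar polarization curve; the very low catalyst utilization reported above would show up in such a model as a small effective exchange current density i0.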
Abstract:
Humans and robots have complementary strengths in performing assembly operations. Humans are very good at perception tasks in unstructured environments: they are able to recognize and locate a part in a box of miscellaneous parts, and they are very good at complex manipulation in tight spaces. Their sensory characteristics, motor abilities, knowledge, and skills give humans the ability to react to unexpected situations and resolve problems quickly. In contrast, robots are very good at pick-and-place operations and are highly repeatable in placement tasks. Robots can perform tasks at high speeds while maintaining precision, can operate for long periods of time, and are very good at applying high forces and torques. Typically, robots are used in mass production, while small-batch and custom production operations predominantly use manual labor. The high labor cost is making it difficult for small and medium manufacturers, which are mainly involved in small-batch and custom production, to remain cost competitive in high-wage markets. They need to find a way to reduce the labor cost of assembly operations. Purely robotic cells will not provide them the necessary flexibility. Creating hybrid cells where humans and robots can collaborate in close physical proximity is a potential solution. The underlying idea behind such cells is to decompose assembly operations into tasks such that humans and robots can collaborate by performing the sub-tasks that are suitable for them. Realizing hybrid cells that enable effective human-robot collaboration is challenging. This dissertation addresses the following three computational issues involved in developing and utilizing hybrid assembly cells: - We should be able to automatically generate plans to operate hybrid assembly cells to ensure efficient cell operation.
This requires generating feasible assembly sequences and instructions for robots and human operators, respectively. Automated planning poses two challenges. First, generating operation plans for complex assemblies is difficult; the complexity can arise from the combinatorial explosion caused by the size of the assembly or from the complex paths needed to perform the assembly. Second, generating feasible plans requires accounting for robot and human motion constraints. The first objective of the dissertation is to develop the underlying computational foundations for automatically generating plans for the operation of hybrid cells, addressing both assembly complexity and motion-constraint issues. - The collaboration between humans and robots in the assembly cell will only be practical if human safety can be ensured during the assembly tasks that require collaboration. The second objective of the dissertation is to evaluate different options for real-time monitoring of the state of the human operator with respect to the robot, and to develop strategies for taking appropriate measures to ensure human safety when a planned robot move may compromise the safety of the human operator. To be competitive in the market, the developed solution must take cost into account without significantly compromising quality. - In the envisioned hybrid cell, we will be relying on human operators to bring parts into the cell. If the human operator selects the wrong part or fails to place it correctly, the robot will be unable to correctly perform the task assigned to it. If the error goes undetected, it can lead to a defective product and inefficiencies in cell operation. The cause of human error can be either confusion due to poor-quality instructions or the operator not paying adequate attention to the instructions.
In order to ensure smooth and error-free operation of the cell, we will need to monitor the state of the assembly operations in the cell. The third objective of the dissertation is to identify and track parts in the cell and to automatically generate instructions for corrective actions if a human operator deviates from the selected plan. Potential corrective actions may involve re-planning if it is possible to continue the assembly from the current state; they may also involve issuing warnings and generating instructions to undo the current task.
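The first objective above, generating feasible assembly sequences with tasks assigned to the suitable agent, can be sketched as a topological sort over precedence constraints. This is a toy illustration under assumed task names, not the dissertation's actual planner, which must also handle motion constraints:

```python
from collections import deque

def plan_sequence(tasks, precedence, suited):
    """Order assembly tasks so every precedence pair (a, b), meaning
    'a must precede b', is respected (Kahn's topological sort), and
    tag each task with the agent ('human' or 'robot') suited to it."""
    indeg = {t: 0 for t in tasks}
    succ = {t: [] for t in tasks}
    for a, b in precedence:
        succ[a].append(b)
        indeg[b] += 1
    ready = deque(t for t in tasks if indeg[t] == 0)
    plan = []
    while ready:
        t = ready.popleft()
        plan.append((t, suited[t]))
        for nxt in succ[t]:          # release tasks whose preconditions are met
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                ready.append(nxt)
    return plan
```

For example, fetching a part (a perception task suited to the human) must precede the robot's insertion and fastening steps; the planner emits the tasks in a feasible order with their assigned agents.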
Abstract:
The arteriovenous fistula (AVF) is characterized by enhanced blood flow and is the most widely used vascular access for chronic haemodialysis (Sivanesan et al., 1998). A large proportion of late AVF failures are related to local haemodynamics (Sivanesan et al., 1999a). As in the AVF, blood flow dynamics plays an important role in the growth, rupture, and surgical treatment of aneurysms. Several techniques have been used to study the flow patterns in simplified models of vascular anastomoses and aneurysms. In the present investigation, Computational Fluid Dynamics (CFD) is used to analyze the flow patterns in AVF and aneurysm models, using velocity waveforms obtained from experimental surgeries in dogs (Galego et al., 2000), intra-operative blood flow recordings of patients with radiocephalic AVF (Sivanesan et al., 1999b), and physiological pulses (Aires, 1991), respectively. The flow patterns for the dog and patient surgery data are qualitatively similar: perturbation, recirculation, and separation zones appeared during the cardiac cycle, and these were intensified in the diastolic phase for the AVF and aneurysm models. The wall shear stress values found in this investigation of AVF and aneurysm models oscillated within a range that can both damage endothelial cells and promote the development of atherosclerosis.
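As a point of reference for the wall shear stress values discussed above, the steady Poiseuille solution for a straight tube gives a closed-form baseline, tau = 4*mu*Q/(pi*R^3). The numbers below are generic physiological assumptions, not values from this study, whose recirculating AVF and aneurysm flows require full CFD:

```python
import math

def poiseuille_wss(mu, q, r):
    """Wall shear stress (Pa) for steady Poiseuille flow in a straight
    rigid tube: tau = 4*mu*Q/(pi*R^3), with viscosity mu (Pa*s),
    volumetric flow Q (m^3/s), and radius R (m). Baseline estimate only."""
    return 4.0 * mu * q / (math.pi * r ** 3)

# Illustrative (assumed) values: blood viscosity ~3.5e-3 Pa*s,
# flow ~10 mL/s through the fistula, vessel radius ~3 mm.
tau = poiseuille_wss(3.5e-3, 10e-6, 3e-3)
```

In a real AVF geometry the local shear departs strongly from this baseline, dropping toward zero in separation zones and spiking near the anastomosis, which is exactly the oscillation range the abstract links to endothelial damage and atherosclerosis.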
Abstract:
The human brain is often considered to be the most cognitively capable among mammalian brains and to be much larger than expected for a mammal of our body size. Although the number of neurons is generally assumed to be a determinant of computational power, and despite the widespread quotes that the human brain contains 100 billion neurons and ten times more glial cells, the absolute number of neurons and glial cells in the human brain remains unknown. Here we determine these numbers by using the isotropic fractionator and compare them with the expected values for a human-sized primate. We find that the adult male human brain contains on average 86.1 +/- 8.1 billion NeuN-positive cells ("neurons") and 84.6 +/- 9.8 billion NeuN-negative ("nonneuronal") cells. With only 19% of all neurons located in the cerebral cortex, greater cortical size (representing 82% of total brain mass) in humans compared with other primates does not reflect an increased relative number of cortical neurons. The ratios between glial cells and neurons in the human brain structures are similar to those found in other primates, and their numbers of cells match those expected for a primate of human proportions. These findings challenge the common view that humans stand out from other primates in their brain composition and indicate that, with regard to numbers of neuronal and nonneuronal cells, the human brain is an isometrically scaled-up primate brain. J. Comp. Neurol. 513:532-541, 2009. (c) 2009 Wiley-Liss, Inc.
Abstract:
Malaria, caused by Plasmodium falciparum (P. falciparum), ranks as one of the deadliest infectious diseases worldwide. New antimalarial treatments are needed to face existing or emerging drug-resistant strains. Protein degradation appears to play a significant role during the asexual intraerythrocytic developmental cycle (IDC) of P. falciparum. Inhibition of the ubiquitin proteasome system (UPS), a major intracellular proteolytic pathway, effectively reduces infection and parasite replication. The P. falciparum and erythrocyte UPS coexist during the IDC, but the nature of their relationship is largely unknown. We used an approach based on Tandem Ubiquitin-Binding Entities (TUBEs) and 1D gel electrophoresis followed by mass spectrometry to identify major components of the TUBEs-associated ubiquitin proteome of both host and parasite during the ring, trophozoite, and schizont stages. Ring-exported protein 1 (REX1), a P. falciparum protein located in Maurer's clefts and important for parasite nutrient import, was found to reach its maximum level of ubiquitylation in the trophozoite stage. The Homo sapiens (H. sapiens) TUBEs-associated ubiquitin proteome decreased during the infection, whereas its P. falciparum counterpart increased. Major cellular processes such as DNA repair, replication, stress response, vesicular transport, and catabolic events appear to be regulated by ubiquitylation throughout the P. falciparum IDC.
Abstract:
Dissertation presented to obtain the Ph.D. degree in Ciências da Engenharia e Tecnologia (Engineering and Technology Sciences), speciality in Biotechnology
Analysis of metabolic flux distributions in relation to the extracellular environment in Avian cells
Abstract:
Continuous cell lines that proliferate in chemically defined and simple media have been highly regarded as suitable alternatives for vaccine production. One such cell line is the AG1.CR.pIX avian cell line developed by PROBIOGEN. This cell line can be cultivated in a fully scalable suspension culture and adapted to grow in chemically defined, calf-serum-free medium [1]-[5]. The medium composition and cultivation strategy are important factors for reaching high virus titers. In this project, a series of computational methods was used to simulate the cell's response to different environments. The study is based on the metabolic model of central metabolism proposed in [1]. In a first step, Metabolic Flux Analysis (MFA) was used along with measured uptake and secretion fluxes to estimate intracellular flux values; the network and data were found to be consistent. In a second step, Flux Balance Analysis (FBA) was performed to assess the cell's biological objective. The objective that produced the best fit between predicted results and experimental data was the minimization of oxidative phosphorylation. Employing this objective, Flux Variability Analysis (FVA) was then used to characterize the flux solution space. Furthermore, various scenarios were simulated in which a reaction was deleted (the corresponding compound eliminated from the medium), and the flux solution space for each scenario was calculated. Growth restrictions caused by essential and non-essential amino acids were accurately predicted. Fluxes related to essential amino acid uptake and catabolism, lipid synthesis, and ATP production via the TCA cycle were found to be essential for exponential growth. Finally, the data gathered during the previous steps were analyzed using principal component analysis (PCA) in order to assess potential changes in the physiological state of the cell. Three metabolic states were found, corresponding to zero, partial, and maximum biomass growth rate.
Elimination of non-essential amino acids or pyruvate from the medium showed no impact on the cell's assumed normal metabolic state.
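The core of MFA, as used in the first step above, is a steady-state mass balance at each intracellular metabolite node. The following toy balance at a pyruvate branch point is an invented minimal network for illustration only, not the avian-cell model from the study:

```python
def mfa_pyruvate_node(v_glc, v_lac):
    """Toy MFA mass balance at the pyruvate node: glycolysis yields
    2 pyruvate per glucose, which is split between lactate secretion
    and the TCA cycle. At steady state 2*v_glc = v_lac + v_tca, so the
    measured uptake (v_glc) and secretion (v_lac) fluxes fix the
    unmeasured intracellular TCA flux."""
    v_tca = 2.0 * v_glc - v_lac
    if v_tca < 0:
        raise ValueError("measured fluxes violate the steady-state balance")
    return v_tca
```

The full method assembles one such balance per metabolite into a stoichiometric system S*v = 0; FBA then adds an objective (here, minimal oxidative phosphorylation) to select among the remaining degrees of freedom, and FVA reports the min/max each flux can take.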
Abstract:
Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach, including entropic terms, provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for TCR-p-MHC binding are analyzed, and possible side-chain replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of individual side chains to binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from the residues that contribute most to the vibrational amplitude of the normal modes.
The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
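The correlations of 0.67-0.74 quoted above are Pearson coefficients between computed binding free-energy changes and experimental activity differences. The generic formula can be sketched as follows (the study's actual data are not reproduced here):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples,
    e.g. computed ddG values vs. experimental activity differences:
    r = cov(x, y) / (std(x) * std(y))."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

A value near 0.7, as reported, indicates that the per-residue decomposition ranks alanine mutants largely in the same order as experiment, which is what makes it useful for guiding TCR engineering.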
Abstract:
BACKGROUND: Systemic lupus erythematosus (SLE) is a prototypical autoimmune disease in which increased apoptosis and decreased removal of apoptotic cells have been described as central to the pathogenesis. Long-chain acyl-coenzyme A synthetases (ACSLs) have been implicated in the immunological dysfunction of mouse models of lupus-like autoimmunity and in apoptosis in different in vitro cell systems. The aim of this work was to assess, among the ACSL isoforms, the involvement of ACSL2, ACSL4 and ACSL5 in SLE pathogenesis. FINDINGS: To this end, we determined the ACSL2, ACSL4 and ACSL5 transcript levels in peripheral blood mononuclear cells (PBMCs) of 45 SLE patients and 49 healthy controls by quantitative real-time PCR (q-PCR). We found that patients with SLE had higher ACSL5 transcript levels than healthy controls [median (range), healthy controls = 16.5 (12.3-18.0) vs. SLE = 26.5 (17.8-41.7), P = 3.9 x 10^-5], but no differences were found for ACSL2 and ACSL4. In in vitro experiments, ACSL5 mRNA expression was greatly increased when apoptosis was induced in Jurkat T cells and PBMCs by phorbol myristate acetate plus ionomycin (PMA+Io). On the other hand, short interfering RNA (siRNA)-mediated silencing of ACSL5 decreased induced apoptosis in Jurkat T cells down to control levels and decreased mRNA expression of FAS, FASLG and TNF. CONCLUSIONS: These findings indicate that ACSL5 may play a role in the apoptosis that takes place in SLE. Our results point to ACSL5 as a potential novel functional marker of pathogenesis and a possible therapeutic target in SLE.