917 results for C. computational simulation
Abstract:
The interaction of comets with the solar wind has been the focus of many studies including numerical modeling. We compare the results of our multifluid MHD simulation of comet 1P/Halley to data obtained during the flyby of the European Space Agency's Giotto spacecraft in 1986. The model solves the full set of MHD equations for the individual fluids representing the solar wind protons, the cometary light and heavy ions, and the electrons. The mass loading, charge exchange, dissociative ion-electron recombination, and collisional interactions between the fluids are taken into account. The computational domain spans several million kilometers, and the close vicinity of the comet is resolved down to the scale of the magnetic cavity. The model is validated by comparison to the corresponding Giotto observations obtained by the Ion Mass Spectrometer, the Neutral Mass Spectrometer, the Giotto magnetometer experiment, and the Johnstone Plasma Analyzer instrument. The model shows the formation of the bow shock, the ion pile-up, and the diamagnetic cavity and is able to reproduce the observed temperature differences between the pick-up ion populations and the solar wind protons. We give an overview of the global interaction of the comet with the solar wind and then show the effects of the Lorentz force interaction between the different plasma populations.
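The mass-loading and recombination balance described above can be illustrated with a deliberately simple zero-dimensional sketch. This is not the authors' multifluid MHD model: it evolves a single cometary ion density under photoionization production and dissociative recombination loss, and every parameter value is an assumption chosen for illustration.

```python
import math

# Toy sketch (not the paper's model): evolve the cometary ion density n_i
# at a fixed cometocentric distance, balancing photoionization of neutrals
# against dissociative ion-electron recombination. Quasi-neutrality is
# assumed (n_e = n_i). All parameter values are illustrative assumptions.

NU_ION = 1e-6          # photoionization rate [1/s] (assumed)
ALPHA_REC = 7e-7       # recombination coefficient [cm^3/s] (assumed)
N_NEUTRAL = 1e6        # neutral density [1/cm^3] (assumed)

def step(n_i, dt):
    """Forward-Euler step of dn_i/dt = nu*n_n - alpha*n_i^2."""
    return n_i + dt * (NU_ION * N_NEUTRAL - ALPHA_REC * n_i * n_i)

def equilibrium():
    """Analytic steady state where production balances recombination."""
    return math.sqrt(NU_ION * N_NEUTRAL / ALPHA_REC)

n = 0.0
for _ in range(200000):
    n = step(n, dt=10.0)
print(round(n, 1), round(equilibrium(), 1))
```

The density relaxes to the photochemical equilibrium value, the same balance that produces the observed ion pile-up region in full simulations.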
Abstract:
This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems and is increasingly being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While the technology is potentially highly beneficial, methodological, regulatory, education- and service-related challenges remain, which a number of academic and commercial groups are addressing.
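Wall shear stress, the example metric named above, does have a closed form in the idealized case of steady Poiseuille flow in a rigid straight tube, and that formula is a common sanity check for CFD solvers. The sketch below uses it; the coronary-like parameter values are assumptions for illustration, not from the paper.

```python
import math

# Idealized check: for steady Poiseuille flow in a rigid straight tube,
# wall shear stress is tau_w = 4*mu*Q / (pi*R^3). Real vessels are curved,
# compliant, and carry pulsatile non-Newtonian flow, which is why CFD is
# needed; this closed form is only a reference case.

def wall_shear_stress(mu, q, r):
    """Poiseuille wall shear stress [Pa]; mu [Pa s], q [m^3/s], r [m]."""
    return 4.0 * mu * q / (math.pi * r ** 3)

mu = 3.5e-3        # blood viscosity, Pa s (assumed Newtonian)
q = 1.0e-6         # flow rate, m^3/s (~60 mL/min, assumed)
r = 1.5e-3         # vessel radius, m (assumed)
print(wall_shear_stress(mu, q, r))  # ~1.32 Pa
```

Values of this order (roughly 1 to 2 Pa) are typical of the arterial wall shear stresses that patient-specific CFD studies resolve spatially.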
Abstract:
Femoroacetabular impingement (FAI) is a dynamic conflict of the hip defined by a pathological, early abutment of the proximal femur onto the acetabulum or pelvis. In the past two decades, FAI has received increasing focus in both research and clinical practice as a cause of hip pain and prearthrotic deformity. Anatomical abnormalities such as an aspherical femoral head (cam-type FAI), a focal or general overgrowth of the acetabulum (pincer-type FAI), a high-riding greater or lesser trochanter (extra-articular FAI), or abnormal torsion of the femur have been identified as underlying pathomorphologies. Open and arthroscopic treatment options are available to correct the deformity and to allow impingement-free range of motion. In routine practice, diagnosis and treatment planning of FAI are based on clinical examination and conventional imaging modalities such as standard radiography, magnetic resonance arthrography (MRA), and computed tomography (CT). Modern software tools allow three-dimensional analysis of the hip joint by extracting pelvic landmarks from two-dimensional antero-posterior pelvic radiographs. An object-oriented cross-platform program (Hip2Norm) has been developed and validated to standardize pelvic rotation and tilt on conventional AP pelvis radiographs. It has been shown that Hip2Norm is an accurate, consistent, reliable and reproducible tool for the correction of selected hip parameters on conventional radiographs. In contrast to conventional imaging modalities, which provide only static visualization, novel computer-assisted tools have been developed to allow the dynamic analysis of FAI pathomechanics. In this context, a validated, CT-based software package (HipMotion) has been introduced. HipMotion is based on polygonal three-dimensional models of the patient's pelvis and femur. The software includes simulation methods for range of motion, collision detection and accurate mapping of impingement areas.
A preoperative treatment plan can be created by performing a virtual resection of any mapped impingement zones both on the femoral head-neck junction and on the acetabular rim using the same three-dimensional models. The following book chapter provides a summarized description of current computer-assisted tools for the diagnosis and treatment planning of FAI, highlighting the possibility of both static and dynamic evaluation, their reliability and reproducibility, and their applicability to routine clinical use.
Abstract:
We model Callisto's exosphere based on its ice as well as non-ice surface via the use of a Monte Carlo exosphere model. For the ice component we implement two putative compositions that have been computed from two possible extreme formation scenarios of the satellite. One composition represents the oxidizing state and is based on the assumption that the building blocks of Callisto were formed in the protosolar nebula, and the other represents the reducing state of the gas, based on the assumption that the satellite accreted from solids condensed in the Jovian sub-nebula. For the non-ice component we implemented the compositions of typical CI as well as L type chondrites; both chondrite types have been suggested to best represent Callisto's non-ice composition. As release processes we consider surface sublimation, ion sputtering and photon-stimulated desorption. Particles are followed on their individual trajectories until they either escape Callisto's gravitational attraction, return to the surface, are ionized, or are fragmented. Our density profiles show that whereas the sublimated species dominate close to the surface on the sunlit side, their density profiles (with the exception of H and H2) decrease much more rapidly than those of the sputtered particles. The Neutral gas and Ion Mass (NIM) spectrometer, which is part of the Particle Environment Package (PEP), will investigate Callisto's exosphere during the JUICE mission. Our simulations show that NIM will be able to detect sublimated and sputtered particles from both the ice and non-ice surface. NIM's measured chemical composition will allow us to distinguish between different formation scenarios.
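The fate of a launched particle (escape versus return) is decided, to first order, by its launch speed relative to the escape speed. The following minimal sketch captures only that first-order picture and is not the authors' model: it draws launch speeds from a Maxwell-Boltzmann distribution and classifies particles, whereas the real model follows full 3D trajectories and also treats ionization and fragmentation. Callisto's escape speed of about 2.44 km/s is the only non-assumed constant.

```python
import math, random

# Minimal sketch (assumptions, not the paper's model): launch test particles
# from the surface with speeds drawn from a 3D Maxwell-Boltzmann distribution
# and classify each as escaping or returning by comparing its speed to
# Callisto's escape speed.

K_B = 1.380649e-23     # Boltzmann constant [J/K]
V_ESC = 2440.0         # Callisto escape speed [m/s]
M_H2O = 2.99e-26       # mass of one H2O molecule [kg]

def sample_speed(temp, mass, rng):
    """Draw a speed from a 3D Maxwell-Boltzmann distribution at temp [K]."""
    s = math.sqrt(K_B * temp / mass)
    vx, vy, vz = (rng.gauss(0.0, s) for _ in range(3))
    return math.sqrt(vx * vx + vy * vy + vz * vz)

def escape_fraction(temp, mass, n=100000, seed=1):
    rng = random.Random(seed)
    escaped = sum(1 for _ in range(n) if sample_speed(temp, mass, rng) > V_ESC)
    return escaped / n

# At dayside temperatures (~150 K, assumed) essentially no sublimated H2O
# escapes ballistically, while light hydrogen escapes readily.
print(escape_fraction(150.0, M_H2O))
```

This mass dependence is one reason H and H2 behave differently from the heavier sublimated species in the density profiles above.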
Abstract:
The social processes that lead to destructive behavior in celebratory crowds can be studied through an agent-based computer simulation. Riots are an increasingly common outcome of sports celebrations, and pose the potential for harm to participants, bystanders, property, and the reputation of the groups with whom participants are associated. Rioting cannot necessarily be attributed to the negative emotions of individuals, such as anger, rage, frustration and despair. For instance, the celebratory behavior (e.g., chanting, cheering, singing) during UConn’s “Spring Weekend” and after the 2004 NCAA Championships resulted in several small fires and overturned cars. Further, not every individual in the area of a riot engages in violence, and those who do, do not do so continuously. Instead, small groups carry out the majority of violent acts in relatively short-lived episodes. Agent-based computer simulations are an ideal method for modeling complex group-level social phenomena, such as celebratory gatherings and riots, which emerge from the interaction of relatively “simple” individuals. By making simple assumptions about individuals’ decision-making and behaviors and allowing actors to affect one another, behavioral patterns emerge that cannot be predicted by the characteristics of individuals. The computer simulation developed here models celebratory riot behavior by repeatedly evaluating a single algorithm for each individual, the inputs of which are affected by the characteristics of nearby actors. Specifically, the simulation assumes that (a) actors possess 1 of 5 distinct social identities (group memberships), (b) actors will congregate with actors who possess the same identity, (c) the degree of social cohesion generated in the social context determines the stability of relationships within groups, and (d) actors’ level of aggression is affected by the aggression of other group members. 
This simulation not only provides a systematic investigation of the effects of the initial distribution of aggression, social identification, and cohesiveness on riot outcomes, but also offers an analytic tool others may use to investigate, visualize, and predict how various individual characteristics affect emergent crowd behavior.
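Assumptions (a) through (d) above can be sketched in a few dozen lines of agent-based code. This is a hedged illustration of the described mechanism, not the authors' simulation: the grid size, interaction radius, coupling strength, and riot threshold are all invented for the example.

```python
import random

# Hedged sketch of the listed assumptions, not the authors' code:
# (a) agents hold one of five identities, (b) they drift toward
# same-identity neighbors, (d) their aggression relaxes toward the mean
# aggression of nearby group members. Parameters are illustrative.

SIZE, N_AGENTS, N_IDENTITIES = 20, 60, 5

class Agent:
    def __init__(self, rng):
        self.x, self.y = rng.randrange(SIZE), rng.randrange(SIZE)
        self.identity = rng.randrange(N_IDENTITIES)   # (a)
        self.aggression = rng.random()

def neighbors(agent, agents, radius=3):
    return [o for o in agents if o is not agent
            and abs(o.x - agent.x) <= radius and abs(o.y - agent.y) <= radius]

def step(agents, rng, coupling=0.2):
    for a in agents:
        same = [o for o in neighbors(a, agents) if o.identity == a.identity]
        if same:   # (b) congregate: step toward a same-identity neighbor
            t = rng.choice(same)
            a.x += (t.x > a.x) - (t.x < a.x)
            a.y += (t.y > a.y) - (t.y < a.y)
        else:      # otherwise wander randomly
            a.x = (a.x + rng.choice((-1, 0, 1))) % SIZE
            a.y = (a.y + rng.choice((-1, 0, 1))) % SIZE
        if same:   # (d) aggression pulled toward the local group mean
            mean = sum(o.aggression for o in same) / len(same)
            a.aggression += coupling * (mean - a.aggression)

rng = random.Random(7)
agents = [Agent(rng) for _ in range(N_AGENTS)]
for _ in range(50):
    step(agents, rng)
rioting = sum(a.aggression > 0.9 for a in agents)
print("agents above riot threshold:", rioting)
```

Even at this toy scale, behavioral clusters and locally elevated aggression emerge from the interaction rules rather than from any single agent's attributes, which is the point the abstract makes about emergence.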
Abstract:
The FUS1 tumor suppressor gene (TSG) has been found to be deficient in many human non-small cell lung cancer (NSCLC) tissue samples and cell lines (1,2,3). Studies have shown potent anti-tumor activity of FUS1 in animal models where FUS1 was delivered through a liposomal vector (4), and the use of FUS1 as a therapeutic agent is currently being studied in human clinical trials (5). The mechanisms of FUS1 activity are currently being investigated, and my studies have shown that c-Abl tyrosine kinase is inhibited by the FUS1 TSG. Considering that many NSCLC cell lines are FUS1 deficient, my studies further identified that FUS1-deficient NSCLC cells have an activated c-Abl tyrosine kinase. C-Abl is a known proto-oncogene, and while c-Abl kinase is tightly regulated in normal cells, constitutively active Abl kinase is known to contribute to the oncogenic phenotype in some types of hematopoietic cancers. My studies show that the active c-Abl kinase contributes to the oncogenicity of NSCLC cells, particularly in tumors that are deficient in FUS1, and that c-Abl may prove to be a viable target in NSCLC therapy. Current studies have shown that growth factor receptors play a role in NSCLC. Over-expression of the epidermal growth factor receptor (EGFR) plays a significant role in the aggressiveness of NSCLC. Current late-stage treatments include EGFR tyrosine kinase inhibitors or EGFR antibodies. Platelet-derived growth factor receptor (PDGFR) has also been shown to play a role in NSCLC. Of note, both growth factor receptors are known upstream activators of c-Abl kinase. My studies indicate that growth factor receptor stimulation, along with deficiency in FUS1 expression, contributes to the activation of c-Abl kinase in NSCLC cells.
Abstract:
The purpose of this research is to develop a new statistical method to determine the minimum set of rows (R) in an R x C contingency table of discrete data that explains the dependence of observations. The statistical power of the method will be determined empirically by computer simulation to judge its efficiency over presently existing methods. The method will be applied to data on DNA fragment length variation at six VNTR loci in over 72 populations from five major racial groups of humans (total sample size over 15,000 individuals; each sample having at least 50 individuals). DNA fragment lengths grouped in bins will form the basis for studying inter-population DNA variation within the racial groups and, where that variation is significant, will provide a rigorous re-binning procedure for forensic computation of DNA profile frequencies that takes into account intra-racial DNA variation among populations.
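One natural building block for such a method is the per-row contribution to the overall Pearson chi-square statistic for independence: rows that contribute most are candidates for the minimal explanatory set. The sketch below illustrates only that building block, with a made-up table; it is not the proposed method itself.

```python
# Illustrative building block, not the proposed method: score how much each
# row of an R x C contingency table contributes to the overall Pearson
# chi-square statistic for independence. The table values are invented.

def chi_square_contributions(table):
    """Return the total chi-square statistic and each row's contribution."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    contrib = []
    for i, row in enumerate(table):
        s = 0.0
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n   # expected count under independence
            s += (obs - exp) ** 2 / exp
        contrib.append(s)
    return sum(contrib), contrib

table = [
    [30, 10, 10],   # row deviating strongly from independence
    [20, 20, 20],
    [19, 21, 20],
]
total, per_row = chi_square_contributions(table)
print(round(total, 2), [round(c, 2) for c in per_row])
```

In this example the first row dominates the statistic, so a row-subset search would examine it first.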
Abstract:
Direct Simulation Monte Carlo (DSMC) is a powerful numerical method for studying rarefied gas flows such as cometary comae and has been used by several authors over the past decade to study cometary outflow. However, exploring the parameter space of such simulations can be time consuming, since 3D DSMC is computationally highly intensive. For the target of ESA's Rosetta mission, comet 67P/Churyumov-Gerasimenko, we have identified the extent to which modifying several parameters influences the 3D flow and gas temperature fields, and have attempted to establish the reliability of inferences about the initial conditions from in situ and remote sensing measurements. A large number of DSMC runs have been completed with varying input parameters. In this work, we present the simulation results and draw conclusions about the sensitivity of the solutions to certain inputs. For the water outgassing cases considered, the surface production rate distribution is found to be the most influential variable for the flow field.
Abstract:
To address the question of how a developing country can attract FDI, this paper explored the factors and policies that may help bring FDI into a developing country by utilizing an extended version of the knowledge-capital model. With a special focus on the effects of FTAs/EPAs between market countries and developing countries, simulations with the model revealed the following: (1) although an FTA/EPA generally tends to increase FDI to a developing country, the possibility of improving welfare through increased demand for skilled and unskilled labor becomes higher as the size of the country declines; (2) because the additional implementation of cost-saving policies to reduce firm-type/trade-link specific fixed costs tends to lower the price of skilled labor by saving its input, a developing country that is extremely scarce in skilled labor is better off avoiding the additional option; (3) if a country hopes to enjoy larger welfare gains with an EPA, efforts to increase skilled labor in the country, such as investing in education, may be beneficial.
Abstract:
An efficient approach for the simulation of ion scattering from solids is proposed. For every encountered atom, we take multiple samples of its thermal displacements among those which result in scattering with high probability to finally reach the detector. As a result, the detector is illuminated by intensive “showers,” where each event of detection must be weighted according to the actual probability of the atom displacement. The computational cost of such simulation is orders of magnitude lower than in the direct approach, and a comprehensive analysis of multiple and plural scattering effects becomes possible. We use this method for two purposes. First, the accuracy of the approximate approaches, developed mainly for ion-beam structural analysis, is verified. Second, the possibility to reproduce a wide class of experimental conditions is used to analyze some basic features of ion-solid collisions: the role of double violent collisions in low-energy ion scattering; the origin of the “surface peak” in scattering from amorphous samples; the low-energy tail in the energy spectra of scattered medium-energy ions due to plural scattering; and the degradation of blocking patterns in two-dimensional angular distributions with increasing depth of scattering. As an example of simulation for ions of MeV energies, we verify the time reversibility for channeling and blocking of 1-MeV protons in a W crystal. The possibilities of analysis that our approach offers may be very useful for various applications, in particular, for structural analysis with atomic resolution.
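The weighting idea above, sampling displacements preferentially from the region that leads to detection and then down-weighting each event by the true-to-biased density ratio, is standard importance sampling. The generic sketch below is not the authors' scattering code: the "event" is simply a Gaussian thermal displacement exceeding a threshold, which makes the exact answer available for comparison.

```python
import math, random

# Generic illustration of the weighting scheme described above (not the
# authors' code): to estimate the probability of a rare event, sample the
# thermal displacement x from a biased density q that favors the event, and
# weight each detected event by p(x)/q(x), the true-to-biased density ratio.

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def estimate_tail(threshold, n, seed=0):
    """Importance-sampling estimate of P(X > threshold), X ~ N(0, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)       # biased toward the rare region
        if x > threshold:                   # "detected" event
            total += gauss_pdf(x, 0.0, 1.0) / gauss_pdf(x, threshold, 1.0)
    return total / n

est = estimate_tail(4.0, 50000)
exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))   # P(X > 4) for standard normal
print(est, exact)
```

Direct sampling would need on the order of 10^7 draws to see this event even a few hundred times; the weighted scheme estimates it to a few percent with 5 x 10^4 draws, which is the "orders of magnitude" saving the abstract refers to.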
Abstract:
Monte Carlo simulations have been carried out to study the effect of temperature on the growth kinetics of a circular grain. This work demonstrates the importance of roughening fluctuations on the growth dynamics. As predicted by d=3 theories of domain kinetics, the circular domain shrinks linearly with time as A(t) = A(0) - αt, where A(0) and A(t) are the initial and instantaneous areas, respectively. However, in contrast to d=3, the slope α is strongly temperature dependent for T ≥ 0.6Tc, since the effect of thermal fluctuations is stronger in d=2 than in d=3. An analytical theory which considers the thermal fluctuations agrees with the T dependence of the Monte Carlo data in this regime, and this model shows that these fluctuations are responsible for the strong temperature dependence of the growth rate for d=2. Our results are particularly relevant to the problem of domain growth in surface science.
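The shrinking-droplet setup can be reproduced at toy scale with a 2D Ising model under Metropolis dynamics. This sketch uses assumed parameters (lattice size, temperature, initial radius) and is not the paper's simulation; it only shows that a circular minority domain loses area under thermal dynamics, the quantity whose time dependence A(t) = A(0) - αt the study measures.

```python
import math, random

# Toy version of the setup above (assumed parameters, not the paper's code):
# a circular minority domain of +1 spins in a -1 background evolves under
# single-spin-flip Metropolis dynamics at T below Tc (~2.27 J/k_B in 2D),
# and its area shrinks under curvature-driven motion plus thermal noise.

L = 40                      # lattice size (assumed)
T = 1.0                     # temperature in units of J/k_B (assumed)

def make_lattice(radius):
    c = L / 2.0
    return [[1 if (x - c) ** 2 + (y - c) ** 2 < radius ** 2 else -1
             for y in range(L)] for x in range(L)]

def sweep(s, rng):
    for _ in range(L * L):
        x, y = rng.randrange(L), rng.randrange(L)
        nb = (s[(x + 1) % L][y] + s[(x - 1) % L][y]
              + s[x][(y + 1) % L] + s[x][(y - 1) % L])
        dE = 2.0 * s[x][y] * nb         # energy change of flipping s[x][y]
        if dE <= 0.0 or rng.random() < math.exp(-dE / T):
            s[x][y] = -s[x][y]

def area(s):
    return sum(row.count(1) for row in s)

rng = random.Random(3)
s = make_lattice(radius=12)
a0 = area(s)
for _ in range(60):
    sweep(s, rng)
print(a0, "->", area(s))
```

Recording the area after every sweep and fitting the early-time decay would give the slope α whose temperature dependence is the subject of the abstract.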
Abstract:
The CERES-Maize model is the most widely used maize (Zea mays L.) model and is a recognized reference for comparing new developments in maize growth, development, and yield simulation. The objective of this study was to present and evaluate CSM-IXIM, a new maize simulation model for DSSAT version 4.5. Code from CSM-CERES-Maize, the modular version of the model, was modified to include a number of model improvements. Model enhancements included the simulation of leaf area, C assimilation and partitioning, ear growth, kernel number, grain yield, and plant N acquisition and distribution. The addition of two genetic coefficients to simulate per-leaf foliar surface produced 32% smaller root mean square error (RMSE) values estimating leaf area index than did CSM-CERES. Grain yield and total shoot biomass were correctly simulated by both models. Carbon partitioning, however, showed differences. The CSM-IXIM model simulated leaf mass more accurately, reducing the CSM-CERES error by 44%, but overestimated stem mass, especially after stress, resulting in similar average RMSE values as CSM-CERES. Excessive N uptake after fertilization events as simulated by CSM-CERES was also corrected, reducing the error by 16%. The accuracy of N distribution to stems was improved by 68%. These improvements in CSM-IXIM provided a stable basis for more precise simulation of maize canopy growth and yield and a framework for continuing future model developments.
Abstract:
Although several profiling techniques for identifying performance bottlenecks in logic programs have been developed, they are generally not automatic, and in most cases they do not provide enough information for identifying the root causes of such bottlenecks. This complicates using their results to guide performance improvement. We present a profiling method and tool that provides such explanations. Our profiler associates cost centers with certain program elements and can measure different types of resource-related properties that affect performance, preserving the precedence of cost centers in the call graph. It includes an automatic method for detecting procedures that are performance bottlenecks. The profiling tool has been integrated in a previously developed run-time checking framework to allow verification of certain properties when they cannot be verified statically. The approach allows checking global computational properties which require complex instrumentation tracking information about previous execution states, such as that the execution time accumulated by a given procedure is not greater than a given bound. We have built a prototype implementation, integrated it in the Ciao/CiaoPP system, and successfully applied it to performance improvement, automatic optimization (e.g., resource-aware specialization of programs), run-time checking, and debugging of global computational properties (e.g., resource usage) in Prolog programs.
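The core ideas of cost centers and run-time checking of an accumulated-time bound can be sketched in a few lines. Note the language shift: the actual tool targets Ciao/Prolog programs, so the Python below is only a conceptual illustration with invented names and an invented bound, not the tool's interface.

```python
import time
from collections import defaultdict
from contextlib import contextmanager

# Conceptual sketch only (the real tool works on Ciao/Prolog programs):
# associate "cost centers" with program regions, accumulate wall-clock time
# per center, and check a global property such as "center X stays under a
# time bound" at run time. Names and the 5 s bound are illustrative.

ACCUMULATED = defaultdict(float)

@contextmanager
def cost_center(name):
    """Attribute the wall-clock time of the enclosed region to `name`."""
    start = time.perf_counter()
    try:
        yield
    finally:
        ACCUMULATED[name] += time.perf_counter() - start

def check_bound(name, seconds):
    """Run-time check of a global computational property."""
    if ACCUMULATED[name] > seconds:
        raise RuntimeError(f"cost center {name!r} exceeded {seconds}s")

with cost_center("sort"):
    sorted(range(100000, 0, -1))
check_bound("sort", 5.0)    # raises only if sorting took more than 5 s
print(dict(ACCUMULATED))
```

The described tool goes well beyond this, preserving cost-center precedence in the call graph and tracking resources other than time, but the accumulate-then-check structure is the same.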
Abstract:
We describe lpdoc, a tool which generates documentation manuals automatically from one or more logic program source files, written in Ciao, ISO-Prolog, and other (C)LP languages. It is particularly useful for documenting library modules, for which it automatically generates a rich description of the module interface. However, it can also be used quite successfully to document full applications. A fundamental advantage of using lpdoc is that it helps maintain a true correspondence between the program and its documentation, and also helps identify precisely to which version of the program a given printed manual corresponds. The quality of the documentation generated can be greatly enhanced by including within the program text assertions (declarations with types, modes, etc.) for the predicates in the program, and machine-readable comments. One of the main novelties of lpdoc is that these assertions and comments are written using the Ciao system assertion language, which is also the language of communication between the compiler and the user and between the components of the compiler. This allows a significant synergy among specification, debugging, documentation, optimization, etc. A simple compatibility library allows conventional (C)LP systems to ignore these assertions and comments and to treat programs documented in this way normally.
The documentation can be generated interactively from emacs or from the command line, in many formats including texinfo, dvi, ps, pdf, info, ascii, html/css, Unix nroff/man, and Windows help, and can include bibliographic citations and images. lpdoc can also generate "man" pages (Unix man page format), nicely formatted plain ASCII "readme" files, installation scripts useful when the manuals are included in software distributions, brief descriptions in html/css or info formats suitable for inclusion in on-line indices of manuals, and even complete WWW and info sites containing on-line catalogs of documents and software distributions. The lpdoc manual, all other Ciao system manuals, and parts of this paper are generated by lpdoc.