918 results for Computer simulation, Colloidal systems, Nucleation
Abstract:
The dynamics of focusing weak bases using a transient pH boundary were examined with high-resolution computer simulation software. Emphasis was placed on the mechanism and impact that the presence of salt, namely NaCl, has on the ability to focus weak bases. A series of weak bases with mobilities ranging from 5 × 10⁻⁹ to 30 × 10⁻⁹ m²/V·s and pKa values between 3.0 and 7.5 were examined using a combination of 65.6 mM formic acid, pH 2.85, as the separation electrolyte and 65.6 mM formic acid, pH 8.60, as the sample matrix. Simulation data show that it is possible to focus weak bases with a pKa value similar to that of the separation electrolyte, but only for weak bases having an electrophoretic mobility of 20 × 10⁻⁹ m²/V·s or faster. This mobility range can be extended by the addition of NaCl, with 50 mM NaCl allowing stacking of weak bases down to a mobility of 15 × 10⁻⁹ m²/V·s and 100 mM extending the range to 10 × 10⁻⁹ m²/V·s. The addition of NaCl does not adversely influence focusing of more mobile bases, but it does prolong the existence of the transient pH boundary. This allows analytes to migrate extensively through the capillary as a single focused band around the transient pH boundary until the boundary is dissipated, which reduces the length of capillary available for separation and, in extreme cases, causes multiple analytes to be detected as a single highly efficient peak.
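As a minimal sketch only, the mobility thresholds reported in this abstract can be encoded in a small helper; the function names and the lookup structure are illustrative assumptions, not part of the simulation software described.

```python
# Minimal sketch (not from the paper): encodes the focusing thresholds
# reported in the abstract as a simple lookup, purely for illustration.

def min_focusable_mobility(nacl_mM: float) -> float:
    """Smallest electrophoretic mobility (in units of 1e-9 m^2/V/s)
    reported as still focusable at a given NaCl concentration."""
    if nacl_mM >= 100:
        return 10.0   # 100 mM NaCl extends stacking down to 10e-9 m^2/V/s
    if nacl_mM >= 50:
        return 15.0   # 50 mM NaCl allows stacking down to 15e-9 m^2/V/s
    return 20.0       # without added salt, ~20e-9 m^2/V/s or faster is required

def focuses(mobility_e9: float, nacl_mM: float = 0.0) -> bool:
    """True if a weak base with the given mobility (x 1e-9 m^2/V/s)
    is expected to focus under the reported conditions."""
    return mobility_e9 >= min_focusable_mobility(nacl_mM)

# Example: a base with mobility 15e-9 m^2/V/s focuses only once 50 mM NaCl is added.
assert not focuses(15, nacl_mM=0) and focuses(15, nacl_mM=50)
```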
Abstract:
The use of navigation systems in general orthopaedic surgery, and in knee replacement in particular, is becoming increasingly accepted. This paper describes the basic technological concepts of modern computer-assisted surgical systems, explains the variation among currently available systems, and outlines research activities that will potentially influence future products. In general, each navigation system is defined by three components: (1) the therapeutic object is the anatomical structure that is operated on using the navigation system; (2) the virtual object is an image of the therapeutic object, for which radiological images or computer-generated models may be used; and (3) the navigator acquires the spatial position and orientation of instruments and anatomy, providing the data needed to replay the surgical action in real time on the navigation system's screen.
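The three components named above can be summarized, purely as an illustrative sketch, with plain data structures; all class and method names here are hypothetical and not taken from any particular navigation product.

```python
# Minimal sketch (illustrative only): the three components the text says
# define every navigation system, expressed as simple Python classes.

from dataclasses import dataclass
from typing import Tuple

@dataclass
class TherapeuticObject:
    """The anatomical structure operated on (e.g. the distal femur)."""
    name: str

@dataclass
class VirtualObject:
    """Image or computer-generated model representing the therapeutic object."""
    source: str  # e.g. "CT", "fluoroscopy", "statistical bone model"

class Navigator:
    """Tracker that acquires spatial position/orientation of tools and anatomy."""
    def acquire_pose(self, marker_id: str) -> Tuple[Tuple[float, float, float],
                                                    Tuple[float, float, float]]:
        # A real system would query an optical or electromagnetic tracker;
        # here we return a fixed position and orientation (Euler angles).
        return (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)
```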
Abstract:
Reliable data transfer is one of the most difficult tasks to accomplish in multihop wireless networks. Traditional transport protocols such as TCP suffer severe performance degradation over multihop networks because of the noisy nature of wireless media and unstable connectivity conditions. The success of TCP in wired networks nevertheless motivates its extension to wireless networks. A crucial challenge faced by TCP over these networks is how to operate smoothly with the 802.11 wireless MAC protocol, which implements its own link-level retransmission mechanism in addition to short RTS/CTS control frames for avoiding collisions. These features make the transmission of TCP acknowledgments (ACKs) quite costly: data and ACK packets incur similar medium-access overheads despite the much smaller size of the ACKs. In this paper, we further evaluate our dynamic adaptive strategy for reducing ACK-induced overhead and the collisions it causes. Our approach mirrors the sender's congestion control: the receiver adapts itself, delaying more ACKs when the channel is not constrained and fewer otherwise. This improves not only throughput but also power consumption. Simulation evaluations show significant improvements in several scenarios.
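The general idea of a receiver that adapts its delayed-ACK factor to channel conditions can be sketched as follows; this is not the authors' algorithm, and the loss-rate signal and thresholds are assumptions made for illustration.

```python
# Minimal sketch (not the paper's algorithm): a receiver-side rule that delays
# more ACKs when the channel appears unconstrained and fewer when it appears
# congested. Thresholds and the loss-rate estimate are assumptions.

class AdaptiveDelayedAck:
    def __init__(self, max_delay: int = 4):
        self.max_delay = max_delay  # upper bound on packets covered per ACK
        self.ack_every = 2          # current delayed-ACK factor
        self.unacked = 0

    def update_channel_state(self, observed_loss_rate: float) -> None:
        """Grow the ACK delay under good conditions, shrink it under loss."""
        if observed_loss_rate < 0.01:
            self.ack_every = min(self.max_delay, self.ack_every + 1)
        else:
            self.ack_every = max(1, self.ack_every - 1)

    def on_data_packet(self) -> bool:
        """Return True when a cumulative ACK should be sent now."""
        self.unacked += 1
        if self.unacked >= self.ack_every:
            self.unacked = 0
            return True
        return False
```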
Abstract:
Large power transformers, an aging and vulnerable part of our energy infrastructure, sit at choke points in the grid and are key to its reliability and security. Damage or destruction due to vandalism, misoperation, or other unexpected events is of great concern, given replacement costs upward of $2M and lead times of about 12 months. Transient overvoltages can cause great damage, and there is much interest in improving computer simulation models to correctly predict and avoid the consequences. EMTP (the Electromagnetic Transients Program) was developed for computer simulation of power system transients, and component models for most equipment have been developed and benchmarked. Power transformers would appear to be simple; however, owing to their nonlinear and frequency-dependent behavior, they can be among the most complex system components to model. It is imperative that the applied models be appropriate for the range of frequencies and excitation levels that the system experiences. Transformer modeling is therefore not yet a mature field, and newer, improved models must be made available. In this work, improved topologically correct duality-based models are developed for three-phase autotransformers having five-legged, three-legged, and shell-form cores. The main problem in implementing detailed models is the lack of complete and reliable data, as no international standard specifies how to measure and calculate the parameters. Parameter estimation methods are therefore developed here to determine the parameters of a given model when the available information is incomplete: the transformer nameplate data are required, and the relative physical dimensions of the core are estimated. The models include a separate representation of each segment of the core, including core hysteresis, the λ-i saturation characteristic, capacitive effects, and the frequency dependence of winding resistance and core loss. Steady-state excitation as well as de-energization and re-energization transients are simulated and compared with an earlier BCTRAN-based model. Black-start energization cases are also simulated as a means of model evaluation and compared with actual event records. The simulated results obtained with the model developed here are reasonable and more accurate than those of the BCTRAN-based model. Simulation accuracy depends on the accuracy of the equipment model and its parameters. This work is significant in that it advances existing parameter estimation methods for cases where the available data and measurements are incomplete, and it thus enhances the accuracy of EMTP simulation for power systems that include three-phase autotransformers. The theoretical results provide a sound foundation for developing transformer parameter estimation methods based on engineering optimization. In addition, it should be possible to refine which information and measurement data are necessary for complete duality-based transformer models. To further refine and develop the models and the parameter estimation methods presented here, iterative full-scale laboratory tests using high-voltage, high-power three-phase transformers would be helpful.
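One element mentioned above, the λ-i saturation characteristic of a core segment, can be illustrated with a piecewise-linear interpolation; this is only a sketch with made-up sample points, not EMTP code or the dissertation's model.

```python
# Minimal sketch (illustrative, not EMTP code): a piecewise-linear lambda-i
# saturation characteristic for one core segment, evaluated by interpolation.
# The sample points are invented; a real model would derive them from test data.

import numpy as np

# (magnetizing current in A, flux linkage in Wb-turns), knee-shaped and odd-symmetric
i_pts      = np.array([0.0, 0.5, 1.0, 5.0, 20.0])
lambda_pts = np.array([0.0, 0.8, 1.0, 1.15, 1.25])

def flux_linkage(i: float) -> float:
    """Flux linkage for a given magnetizing current, saturating at high current."""
    sign = 1.0 if i >= 0 else -1.0
    return sign * float(np.interp(abs(i), i_pts, lambda_pts))

# Example: deep saturation -- a 4x increase in current adds little extra flux.
print(flux_linkage(5.0), flux_linkage(20.0))
```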
Abstract:
PURPOSE: To assess the literature on the accuracy and clinical performance of computer technology applications in surgical implant dentistry. MATERIALS AND METHODS: Electronic and manual literature searches were conducted to collect information about (1) the accuracy and (2) the clinical performance of computer-assisted implant systems. Meta-regression analysis was performed to summarize the accuracy studies. Failure/complication rates were analyzed using random-effects Poisson regression models to obtain summary estimates of 12-month proportions. RESULTS: Twenty-nine different image guidance systems were included. From 2,827 articles, 13 clinical and 19 accuracy studies were included in this systematic review. The meta-analysis of accuracy (19 clinical and preclinical studies) revealed a total mean error of 0.74 mm (maximum of 4.5 mm) at the entry point in the bone and 0.85 mm at the apex (maximum of 7.1 mm). For the 5 included clinical studies (506 implants in total) using computer-assisted implant dentistry, the mean failure rate was 3.36% (range 0% to 8.45%) after an observation period of at least 12 months. Intraoperative complications were reported in 4.6% of the treated cases; these included limited interocclusal distance for performing guided implant placement, limited primary implant stability, and the need for additional grafting procedures. CONCLUSION: Differing levels and quantities of evidence were available for computer-assisted implant placement, revealing high implant survival rates after only 12 months of observation in different indications and a reasonable level of accuracy. However, long-term clinical data are needed to identify clinical indications and to justify the additional radiation doses, effort, and costs associated with computer-assisted implant surgery.
Abstract:
This tutorial gives a step-by-step explanation of how to use experimental data to construct a biologically realistic multicompartmental model. Special emphasis is placed on the many ways in which this process can be imprecise. The tutorial is intended both for experimentalists who want to get into computer modeling and for computer scientists who use abstract neural network models but are curious about biologically realistic modeling. It does not depend on a specific simulation engine; rather, it covers the kinds of data needed to construct a model, how those data are used, and potential pitfalls in the process.
Abstract:
Almost all regions of the brain receive one or more neuromodulatory inputs, and disrupting these inputs produces deficits in neuronal function. Neuromodulators act through intracellular second messenger pathways to influence the electrical properties of neurons, the integration of synaptic inputs, the spatio-temporal firing dynamics of neuronal networks, and, ultimately, systems-level behavior. Second messenger pathways consist of series of bimolecular reactions, enzymatic reactions, and diffusion. Calcium is the second messenger with the most effectors and is therefore highly regulated by buffers, pumps, and intracellular stores. Computational modeling provides an innovative yet practical method for evaluating the spatial extent, time course, and interaction among second messenger pathways, as well as the interaction of second messengers with the electrical properties of neurons. These processes occur both in compartments where the number of molecules is large enough to describe reactions deterministically (e.g. the cell body) and in compartments where the number of molecules is small enough that reactions occur stochastically (e.g. spines). In this tutorial, I explain how to develop models of second messenger pathways and calcium dynamics. The first part explains the equations used to model bimolecular reactions, enzyme reactions, calcium release channels, calcium pumps, and diffusion. The second part explains some of the GENESIS, Kinetikit, and Chemesis objects that implement the appropriate equations. An in-depth explanation of calcium and second messenger models is provided by reviewing code, in XPP, Chemesis, and Kinetikit, that implements simple models of calcium dynamics and second messenger cascades.
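As a minimal sketch of the kind of equation involved, mass-action kinetics for a single bimolecular reaction A + B ⇌ AB can be integrated with forward Euler; this is illustrative Python, not GENESIS, Kinetikit, Chemesis, or XPP code, and the rate constants and concentrations are invented.

```python
# Minimal sketch (illustrative): mass-action kinetics for A + B <-> AB,
# integrated with forward Euler. All numbers are made up for demonstration.

kf, kb = 1.0, 0.1          # forward (1/(uM*s)) and backward (1/s) rate constants
A, B, AB = 10.0, 5.0, 0.0  # concentrations in uM
dt, t_end = 1e-3, 5.0      # time step and total simulated time (s)

t = 0.0
while t < t_end:
    flux = kf * A * B - kb * AB   # net forward flux (uM/s)
    A  -= flux * dt
    B  -= flux * dt
    AB += flux * dt
    t  += dt

print(f"approximate steady state: A={A:.3f} uM, B={B:.3f} uM, AB={AB:.3f} uM")
```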
Abstract:
This book is the latest in a series growing out of the International Joint Conferences on Computer, Information and Systems Sciences and Engineering. It includes chapters in the most advanced areas of Computing, Informatics, Systems Sciences, and Engineering, and it is accessible to a wide readership, including professors, researchers, practitioners, and students. The book comprises a set of rigorously reviewed, world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Informatics, and Systems Sciences and Engineering, selected from the proceedings of the Ninth International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2013). Coverage includes topics in: Industrial Electronics, Technology & Automation; Telecommunications and Networking; Systems, Computing Sciences and Software Engineering; Engineering Education, Instructional Technology, Assessment, and E-learning.
Abstract:
Opportunistic routing (OR) employs a list of candidates to improve the reliability of wireless transmission. However, list-based OR restricts the freedom of opportunism, since only the listed nodes can compete for packet forwarding. Additionally, the list is generated statically, based on a single metric, prior to data transmission, which is not appropriate for mobile ad hoc networks. This paper provides a thorough performance evaluation of a new protocol, Context-aware Opportunistic Routing (COR). The contributions of COR are threefold. First, it uses several types of context information simultaneously, such as link quality, geographic progress, and residual energy of nodes, to make routing decisions. Second, it allows all qualified nodes to participate in packet forwarding. Third, it exploits the relative mobility of nodes to further improve performance. Simulation results show that COR can provide efficient routing in mobile environments and that it outperforms existing solutions that rely solely on a single metric by roughly 20-40%.
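One hypothetical way to combine the three context metrics into a forwarding score is sketched below; the weights, normalization, and function names are assumptions for illustration and do not reproduce COR's actual decision logic.

```python
# Minimal sketch (not the COR protocol): combine link quality, geographic
# progress, and residual energy into a single score for ranking candidate
# forwarders. Weights and the progress normalization are assumptions.

def forwarding_score(link_quality: float,     # delivery probability in [0, 1]
                     progress_m: float,       # geographic progress toward the destination (m)
                     residual_energy: float,  # fraction of battery left in [0, 1]
                     max_progress_m: float = 250.0,
                     weights=(0.4, 0.4, 0.2)) -> float:
    """Higher score = better candidate; unqualified nodes would not compete."""
    w_lq, w_geo, w_en = weights
    geo = max(0.0, min(1.0, progress_m / max_progress_m))
    return w_lq * link_quality + w_geo * geo + w_en * residual_energy

# Example: rank two candidate neighbors by score.
neighbors = {"n1": forwarding_score(0.9, 120, 0.8),
             "n2": forwarding_score(0.6, 200, 0.3)}
best = max(neighbors, key=neighbors.get)
```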
Abstract:
Recent advances in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. Without an efficient mechanism for scaling services in response to changing environmental conditions and numbers of users, however, application performance may suffer, leading to Service Level Agreement (SLA) violations and inefficient use of hardware resources. We introduce a system that manages the complexity of scaling applications composed of multiple services using mechanisms based on the fulfillment of SLAs. We show how service monitoring information can be combined with service level objectives, predictions, and correlations between performance indicators to optimize the allocation of the services belonging to distributed applications. We validate our models through experiments and simulations involving a distributed enterprise information system, and we show how discovering correlations between application performance indicators can serve as a basis for creating refined service level objectives, which can then be used for scaling the application and improving its overall performance under similar conditions.
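The basic shape of an SLA-driven scaling rule can be sketched as follows; the threshold, headroom margin, and function names are assumptions and do not describe the system's actual controller.

```python
# Minimal sketch (illustrative only): compare a monitored performance indicator
# against a service level objective (SLO) and add or remove service replicas.

def scale_decision(current_replicas: int,
                   observed_latency_ms: float,
                   slo_latency_ms: float,
                   headroom: float = 0.2,
                   min_replicas: int = 1,
                   max_replicas: int = 20) -> int:
    """Return the new replica count for one service of the application."""
    if observed_latency_ms > slo_latency_ms:                     # SLO violated: scale out
        return min(max_replicas, current_replicas + 1)
    if observed_latency_ms < (1.0 - headroom) * slo_latency_ms:  # well under SLO: scale in
        return max(min_replicas, current_replicas - 1)
    return current_replicas                                      # within the target band

# Example: observed latency of 240 ms against a 200 ms SLO triggers a scale-out.
print(scale_decision(3, 240.0, 200.0))  # -> 4
```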
Abstract:
This booklet contains abstracts of papers presented at a biochemical engineering symposium held at the University of Nebraska-Lincoln on April 29, 1972. This was the second annual symposium on the subject, the first having been held at Kansas State University on June 4, 1971; it is expected that future symposia will alternate between the two campuses.
Contents:
- S.H. Lin, Kansas State University, "Enzyme Reaction in a Tubular Reactor with Laminar Flow"
- Gregory C. Martin, University of Nebraska, "Estimation of Parameters in Population Models for Schizosaccharomyces pombe from Chemostat Data"
- Jaiprakash S. Shastry and Prakash N. Mishra, Kansas State University, "Immobilized Enzymes: Analysis of Ultrafiltration Reactors"
- Mark D. Young, University of Nebraska, "Modelling Unsteady-State Two-Species Data Using Ramkrishna's Staling Model"
- G.C.Y. Chu, Kansas State University, "Optimization of Step Aeration Waste Treatment Systems Using EVOP"
- Shinji Goto, University of Nebraska, "Growth of the Blue-Green Alga Microcystis aeruginosa under Defined Conditions"
- Prakash N. Mishra and Thomas M.C. Kuo, Kansas State University, "Digital Computer Simulation of the Activated Sludge System: Effect of Primary Clarifier on System Performance"
- Mark D. Young, University of Nebraska, "Aerobic Fermentation of Paunch Liquor"
Abstract:
It is often claimed that scientists can obtain new knowledge about nature by running computer simulations. How is this possible? I answer this question by arguing that computer simulations are arguments. This view parallels Norton's argument view of thought experiments. I show that computer simulations can be reconstructed as arguments that fully capture the epistemic power of the simulations. Assuming the extended mind hypothesis, I further argue that to run a computer simulation is to execute the reconstructing argument. I discuss some objections and reject the view that computer simulations produce knowledge because they are experiments. I conclude by comparing thought experiments and computer simulations, on the assumption that both are arguments.