899 results for Simulation analysis


Relevance: 30.00%

Abstract:

The research work reported in this Thesis was carried out along two main lines of research. The first and main line concerns the synthesis of heteroaromatic compounds with increasing steric hindrance, with the aim of preparing stable atropisomers. The main tools used for the study of these dynamic systems, as described in the Introduction, are DNMR coupled with line shape simulation, and DFT calculations aimed at conformational analysis, i.e. the prediction of the geometries and of the energy barriers to the transition states. These techniques have been applied to research projects on:
• atropisomers of arylmaleimides;
• atropisomers of 4-arylpyrazolo[3,4-b]pyridines;
• the intramolecular NO2/CO interaction in solution;
• 2-arylpyridines.
In parallel with the main project, and in collaboration with other groups, a research line on the determination of absolute configuration was pursued. The products, derived from organocatalytic reactions, in many cases could not be analyzed by X-ray diffraction, making it necessary to develop a protocol based on spectroscopic methodologies: NMR, circular dichroism and computational tools (DFT, TD-DFT) were combined for this purpose. This Thesis reports the determination of the absolute configuration of:
• substituted 1,2,3,4-tetrahydroquinolines;
• compounds from the enantioselective Friedel-Crafts alkylation-acetalization cascade of naphthols with α,β-unsaturated cyclic ketones;
• substituted 3,4-annulated indoles.
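
For orientation on how such barriers are quantified: in DNMR studies the free-energy barrier is conventionally obtained from the measured exchange rate constant via the Eyring equation, k = (kB·T/h)·exp(−ΔG‡/RT). The short sketch below (illustrative numbers, not data from the Thesis) converts a rate constant at a given temperature into ΔG‡:

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def eyring_barrier(k: float, T: float) -> float:
    """Free-energy barrier (kJ/mol) from an exchange rate constant k (1/s)
    at temperature T (K), assuming a transmission coefficient of 1."""
    return R * T * math.log(KB * T / (H * k)) / 1000.0

# Example: an exchange rate of 100 1/s at 300 K, typical of the
# DNMR-accessible window, corresponds to a barrier of about 62 kJ/mol.
print(f"dG = {eyring_barrier(100.0, 300.0):.1f} kJ/mol")
```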

Relevance: 30.00%

Abstract:

Since the Three Mile Island Unit 2 (TMI-2) accident in 1979, which led to the meltdown of about one half of the reactor core and to limited releases of radioactive materials to the environment, an important international effort has been devoted to severe accident research. The present work investigates the behaviour of a Small Modular Reactor (SMR) under severe accident conditions. In order to perform these analyses, an SMR model has been developed for the European reference severe accident analysis code ASTEC, developed by IRSN and GRS. The thesis describes in detail the IRIS Small Modular Reactor, the reference reactor chosen for the ASTEC input deck. The IRIS model was developed in the framework of a research collaboration with the IRSN development team. The creation of the ASTEC IRIS input deck is described systematically: the nodalization scheme adopted, the solutions used to simulate the passive safety systems, and the strong interaction between the reactor vessel and the containment. The ASTEC SMR model is tested against a coupled RELAP-GOTHIC model on a Design Basis Accident, to evaluate the capability of the ASTEC code to correctly reproduce the behaviour of the nuclear system. Once the model has been validated, a severe accident scenario is simulated and the results, together with the nuclear system response, are analysed.

Relevance: 30.00%

Abstract:

Over the past two decades, a growing number of robotics researchers have focused on a particular group of machines belonging to the family of parallel manipulators: cable robots. Although these robots share several theoretical elements with the better-known parallel robots, they still present completely (or partly) unsolved issues. In particular, the study of their kinematics, already a difficult subject for conventional parallel manipulators, is further complicated by the non-linear nature of cables, which can exert only pure tensile forces. The work presented in this thesis therefore focuses on the kinematics of these robots and on the development of numerical techniques able to address some of the related problems. Most of the work concerns an interval-analysis-based procedure for the solution of the direct geometric problem of a generic cable manipulator. Besides allowing a rapid solution of the problem, this technique guarantees the results against rounding and elimination errors and can take into account uncertainties in the model of the problem. The developed code has been tested with the help of a small manipulator, whose realization is described in this dissertation together with the auxiliary work done during its design and simulation phases.
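
For a flavour of why interval analysis can certify results (a toy sketch, unrelated to the thesis code): every arithmetic operation is carried out on guaranteed lower/upper bounds, so a branch-and-prune search can provably discard boxes that contain no solution of a constraint equation; a real implementation would additionally round the bounds outward to absorb floating-point error. The hypothetical example below encloses the roots of f(x) = x^2 - 2:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o): return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = (self.lo*o.lo, self.lo*o.hi, self.hi*o.lo, self.hi*o.hi)
        return Interval(min(p), max(p))
    def contains_zero(self): return self.lo <= 0.0 <= self.hi
    def width(self): return self.hi - self.lo

def f(x: Interval) -> Interval:
    # Toy constraint f(x) = x^2 - 2, with roots at +/- sqrt(2)
    return x * x - Interval(2.0, 2.0)

def solve(box: Interval, tol=1e-6, out=None):
    """Branch-and-prune: discard boxes where f provably has no zero,
    bisect the rest until they are narrower than tol."""
    if out is None: out = []
    if not f(box).contains_zero():
        return out                  # certified: no root in this box
    if box.width() < tol:
        out.append(box)             # tiny box that may contain a root
        return out
    mid = 0.5 * (box.lo + box.hi)
    solve(Interval(box.lo, mid), tol, out)
    solve(Interval(mid, box.hi), tol, out)
    return out

print([(b.lo, b.hi) for b in solve(Interval(-3.0, 3.0))])
```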

Relevance: 30.00%

Abstract:

In condensed matter systems, the interfacial tension plays a central role in a multitude of phenomena. It is the driving force for nucleation processes, determines the shape and structure of crystals, and is important for industrial applications. Despite its importance, the interfacial tension is hard to determine in experiments and also in computer simulations. While sophisticated simulation methods exist to compute liquid-vapor interfacial tensions, current methods for solid-liquid interfaces produce unsatisfactory results.

As a first approach to this topic, the influence of the interfacial tension on nuclei is studied within the three-dimensional Ising model. This model is well suited because, despite its simplicity, one can learn much about the nucleation of crystalline nuclei. Below the so-called roughening temperature, nuclei in the Ising model are no longer spherical but become cubic because of the anisotropy of the interfacial tension. This is similar to crystalline nuclei, which are in general not spherical but rather convex polyhedra with flat facets on the surface. In this context, the problem of distinguishing between the two bulk phases in the vicinity of the diffuse droplet surface is addressed. A new definition is found which correctly determines the volume of a droplet in a given configuration when compared to the volume predicted by simple macroscopic assumptions.

To compute the interfacial tension of solid-liquid interfaces, a new Monte Carlo method called the "ensemble switch method" is presented, which allows one to compute the interfacial tension of liquid-vapor as well as solid-liquid interfaces with great accuracy. In the past, the dependence of the interfacial tension on the finite size and shape of the simulation box has often been neglected, although there is a nontrivial dependence on the box dimensions. As a consequence, one needs to systematically increase the box size and extrapolate to infinite volume in order to accurately predict the interfacial tension. Therefore, a thorough finite-size scaling analysis is established in this thesis. Logarithmic corrections to the finite-size scaling are motivated and identified; they are of leading order and therefore must not be neglected. The astounding feature of these logarithmic corrections is that they do not depend at all on the model under consideration. Using the ensemble switch method, the validity of a finite-size scaling ansatz containing the aforementioned logarithmic corrections is carefully tested and confirmed. Combining the finite-size scaling theory with the ensemble switch method, the interfacial tension of several model systems, ranging from the Ising model to colloidal systems, is computed with great accuracy.
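
To illustrate the extrapolation step (a sketch with an assumed scaling form and invented data, not the thesis's actual ansatz or results): for a three-dimensional system with an L × L interface one may posit γ(L) = γ∞ + (a·ln L + b)/L², fit this form to tensions measured at several box sizes, and read off the infinite-volume value γ∞:

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_fss(L, gamma_inf, a, b):
    """Assumed finite-size scaling form for a 3D system with an L x L
    interface: leading logarithmic plus constant corrections ~ 1/L^2."""
    return gamma_inf + (a * np.log(L) + b) / L**2

# Hypothetical measurements: interfacial tension at several box sizes
L   = np.array([8, 12, 16, 24, 32, 48], dtype=float)
gam = np.array([0.1250, 0.1170, 0.1130, 0.1095, 0.1075, 0.1060])
err = np.full_like(gam, 5e-4)         # assumed statistical errors

popt, pcov = curve_fit(gamma_fss, L, gam, sigma=err, absolute_sigma=True)
g_inf = popt[0]
print(f"gamma_inf = {g_inf:.4f} +/- {np.sqrt(pcov[0, 0]):.4f}")
```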

Relevance: 30.00%

Abstract:

Numerical simulation of Oldroyd-B type viscoelastic fluids is a very challenging problem. The well-known "High Weissenberg Number Problem" has haunted mathematicians, computer scientists and engineers for more than 40 years. When the Weissenberg number, which represents the ratio of elasticity to viscosity, exceeds some limit, simulations done by standard methods break down exponentially fast in time. However, some approaches, such as the logarithm transformation technique, can significantly improve the limits of the Weissenberg number up to which the simulations stay stable.

We should point out that the global existence of weak solutions for the Oldroyd-B model is still open. Note that in the evolution equation of the elastic stress tensor, the terms describing diffusive effects are typically neglected in the modelling due to their smallness. However, when these diffusive terms are kept in the constitutive law, the global existence of weak solutions in two space dimensions can be shown.

The main part of the thesis is devoted to the stability study of the Oldroyd-B viscoelastic model. Firstly, we show that the free energy of the diffusive Oldroyd-B model, as well as of its logarithm transformation, is dissipative in time. Further, we have developed free-energy-dissipative schemes based on the characteristic finite element and finite difference frameworks. In addition, the global linear stability analysis of the diffusive Oldroyd-B model is also discussed. The next part of the thesis deals with error estimates for the combined finite element and finite volume discretization of a special Oldroyd-B model which covers the limiting case of the Weissenberg number going to infinity. The theoretical results are confirmed by a series of numerical experiments, which are also presented in the thesis.
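
For context, a minimal sketch of the logarithm transformation idea (in the spirit of the log-conformation approach of Fattal and Kupferman, not the scheme developed in the thesis): instead of evolving the conformation tensor c directly, one evolves ψ = log c, whose entries grow only linearly where c grows exponentially. Since c is symmetric positive definite, the matrix logarithm and exponential can be taken through an eigendecomposition:

```python
import numpy as np

def sym_logm(c: np.ndarray) -> np.ndarray:
    """Matrix log of a symmetric positive-definite tensor via
    eigendecomposition: c = Q diag(w) Q^T  =>  log c = Q diag(log w) Q^T."""
    w, Q = np.linalg.eigh(c)
    return Q @ np.diag(np.log(w)) @ Q.T

def sym_expm(psi: np.ndarray) -> np.ndarray:
    """Inverse map: recover the (automatically SPD) conformation tensor."""
    w, Q = np.linalg.eigh(psi)
    return Q @ np.diag(np.exp(w)) @ Q.T

# A strongly stretched conformation state, as arises at high Weissenberg number
c = np.array([[50.0, 12.0],
              [12.0,  4.0]])
psi = sym_logm(c)
print(np.allclose(sym_expm(psi), c))   # round trip: True
```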

Relevance: 30.00%

Abstract:

Coarse graining is a popular technique used in physics to speed up the computer simulation of molecular fluids. An essential part of this technique is a method that solves the inverse problem of determining the interaction potential, or its parameters, from given structural data. Due to discrepancies between model and reality, the potential is not unique, so the stability of such a method and its convergence to a meaningful solution are issues.

In this work, we investigate empirically whether coarse graining can be improved by applying the theory of inverse problems from applied mathematics. In particular, we use singular value analysis to reveal the weak interaction parameters, which have a negligible influence on the structure of the fluid and which cause non-uniqueness of the solution. Further, we apply a regularizing Levenberg-Marquardt method, which is stable against the mentioned discrepancies. We then compare it to the existing physical methods, the Iterative Boltzmann Inversion and the Inverse Monte Carlo method, which are fast and well adapted to the problem but sometimes have convergence problems.

From an analysis of the Iterative Boltzmann Inversion, we elaborate a meaningful approximation of the structure and use it to derive a modification of the Levenberg-Marquardt method. We employ the latter to reconstruct the interaction parameters from experimental data for liquid argon and nitrogen. We show that the modified method is stable, convergent and fast. Further, the singular value analysis of the structure and its approximation makes it possible to determine the crucial interaction parameters, that is, to simplify the modeling of interactions. Our results therefore build a rigorous bridge between the inverse problem from physics and the powerful solution tools from mathematics.
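
As a generic illustration of the damped update at the heart of a Levenberg-Marquardt iteration (a sketch on a toy least-squares problem, not the thesis's modified method): the damping parameter lambda interpolates between Gauss-Newton and gradient descent, stabilizing steps when the Jacobian is ill-conditioned, which is exactly the situation the singular value analysis diagnoses:

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-2, n_iter=50):
    """Minimize ||residual(x)||^2 with damped Gauss-Newton steps:
    (J^T J + lam*I) dx = -J^T r, adapting lam multiplicatively."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r, J = residual(x), jacobian(x)
        # np.linalg.svd(J) here would expose weakly determined directions
        dx = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -J.T @ r)
        if np.linalg.norm(residual(x + dx)) < np.linalg.norm(r):
            x, lam = x + dx, lam * 0.5   # accept step, trust model more
        else:
            lam *= 2.0                   # reject step, increase damping
    return x

# Toy inverse problem: recover (a, b) of the model y = a*exp(-b*t)
t = np.linspace(0.0, 4.0, 30)
y = 2.0 * np.exp(-1.3 * t)
res = lambda p: p[0] * np.exp(-p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(-p[1] * t),
                                 -p[0] * t * np.exp(-p[1] * t)])
print(levenberg_marquardt(res, jac, x0=[1.0, 1.0]))   # ~ [2.0, 1.3]
```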

Relevance: 30.00%

Abstract:

The performance of the H → ZZ* → 4l analysis is studied in the context of the High Luminosity upgrade of the LHC collider, with the CMS detector. The high luminosity (up to L = 5 × 10^34 cm^-2 s^-1) of the accelerator poses very challenging experimental conditions. In particular, the number of overlapping events per bunch crossing will increase to 140. To cope with this difficult environment, the CMS detector will be upgraded in two stages: Phase-I and Phase-II. The tools used in the analysis are the CMS Full Simulation and the fast parametrized Delphes simulation. A validation of Delphes against the Full Simulation is performed, using reference Phase-I detector samples. Delphes is then used to simulate the Phase-II detector response. The Phase-II configuration is compared with the Phase-I detector and with the same Phase-I detector affected by aging processes, both modeled with the Full Simulation framework. Conclusions on these three scenarios are drawn: the degradation in performance observed in the "aged" scenario shows that a major upgrade of the detector is mandatory. The specific upgrade configuration studied makes it possible to keep the same performance as in Phase-I and, in the four-muon channel, even to exceed it.
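
As a rough cross-check of the quoted pileup (a back-of-the-envelope sketch; every input except the luminosity is an assumption, not a number from the paper): the mean number of interactions per crossing is mu = sigma_inel × L / f_crossing, and with typical LHC fill parameters the quoted luminosity indeed gives a pileup of order 140:

```python
# Back-of-the-envelope pileup estimate (all inputs except the
# luminosity are assumptions, not numbers from the paper).
sigma_inel = 85e-27    # inelastic pp cross-section, cm^2 (~85 mb, assumed)
lumi       = 5e34      # instantaneous luminosity, cm^-2 s^-1 (quoted)
n_bunches  = 2808      # assumed number of filled bunches
f_rev      = 11245.0   # LHC revolution frequency, Hz

rate = sigma_inel * lumi             # inelastic interaction rate, 1/s
mu   = rate / (n_bunches * f_rev)    # mean interactions per bunch crossing
print(f"pileup mu ~ {mu:.0f}")       # ~135, same ballpark as the quoted 140
```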

Relevance: 30.00%

Abstract:

In dentistry, the restoration of decayed teeth is challenging and makes great demands on both the dentist and the materials. Hence, fiber-reinforced posts have been introduced. The effects of different variables on the ultimate load of teeth restored using fiber-reinforced posts are controversial, possibly because the published results are mostly based on non-standardized in vitro tests and are therefore inhomogeneous. This study combines the advantages of in vitro tests and finite element analysis (FEA) to clarify the effects of ferrule height, post length and the cementation technique used for restoration. Sixty-four single-rooted premolars were decoronated (ferrule height 1 or 2 mm), endodontically treated and restored using fiber posts (length 2 or 7 mm), composite fillings and metal crowns (resin bonded or conventionally cemented). After thermocycling and chewing simulation, the samples were loaded until fracture, recording first damage events. Univariate analysis of variance (UNIANOVA) of the recorded fracture loads showed ferrule height and cementation technique to be significant: increased ferrule height and resin bonding of the crown resulted in higher fracture loads. Post length had no significant effect. All conventionally cemented crowns with a 1-mm ferrule height failed during artificial ageing, in contrast to resin-bonded crowns (75% survival rate). FEA confirmed these results and provided information about stress and force distribution within the restoration. Based on the findings of the in vitro tests and the computations, we concluded that crowns, especially those with a small ferrule height, should be resin bonded. Finally, centrally positioned fiber-reinforced posts did not contribute to load transfer as long as the bond between the tooth and the composite core was intact.

Relevance: 30.00%

Abstract:

Image-based modeling of tumor growth combines methods from cancer simulation and medical imaging. In this context, we present a novel approach to adapt a healthy brain atlas to MR images of tumor patients. In order to establish correspondence between a healthy atlas and a pathologic patient image, tumor growth modeling is employed in combination with registration algorithms. In a first step, the tumor is grown in the atlas based on a new multi-scale, multi-physics model that includes growth simulation from the cellular level up to the biomechanical level, accounting for cell proliferation and tissue deformations. Large-scale deformations are handled with an Eulerian approach for finite element computations, which can operate directly on the image voxel mesh. Subsequently, dense correspondence between the modified atlas and the patient image is established using nonrigid registration. The method offers opportunities for atlas-based segmentation of tumor-bearing brain images as well as for improved patient-specific simulation and prognosis of tumor progression.
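
For intuition about the growth step, a deliberately simplified sketch (the paper's multi-scale, multi-physics model is far richer): macroscopic tumor growth is often described by a reaction-diffusion equation for the normalized cell density c, ∂c/∂t = ∇·(D∇c) + ρ·c·(1 − c), which can be stepped directly on a voxel grid:

```python
import numpy as np

def grow_tumor(c, D=0.1, rho=0.05, dt=0.1, steps=500):
    """Explicit finite-difference stepping of dc/dt = D*lap(c) + rho*c*(1-c)
    on a 2D voxel grid with unit spacing (periodic boundaries via np.roll;
    dt respects the explicit stability limit dt <= h^2/(4D))."""
    for _ in range(steps):
        lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
               np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c)
        c = c + dt * (D * lap + rho * c * (1.0 - c))
    return c

# Seed a small tumor in the middle of a 64x64 "atlas" slice
c0 = np.zeros((64, 64))
c0[30:34, 30:34] = 0.5
print(grow_tumor(c0).max())   # density saturates toward 1 at the seed
```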

Relevance: 30.00%

Abstract:

Molecular dynamics simulations have been used to explore the conformational flexibility of a PNA·DNA·PNA triple helix in aqueous solution. Three 1.05 ns trajectories starting from different but reasonable conformations have been generated and analyzed in detail. All three trajectories converge within about 300 ps to produce stable and very similar conformational ensembles, which resemble the crystal structure conformation in many details. However, in contrast to the crystal structure, there is a tendency for the direct hydrogen bonds observed between the amide hydrogens of the Hoogsteen-binding PNA strand and the phosphate oxygens of the DNA strand to be replaced by water-mediated hydrogen bonds, which also involve pyrimidine O2 atoms. This structural transition does not appear to weaken the triplex structure, but it alters groove widths and so may relate to the potential for recognition of such structures by other ligands (small molecules or proteins). Energetic analysis leads us to conclude that the hybrid PNA/DNA triplex has quite different helical characteristics from the all-DNA triplex not because the additional flexibility imparted by replacing the sugar-phosphate backbones with PNA allows motions that improve base stacking, but rather because base-stacking interactions are very similar in both types of triplex and the driving force comes from weak but definite conformational preferences of the PNA strands.

Relevance: 30.00%

Abstract:

Simulation is an important resource for researchers in diverse fields. However, many researchers have found flaws in the methodology of published simulation studies and have described the state of the simulation community as being in a crisis of credibility. This work describes the Simulation Automation Framework for Experiments (SAFE) project, which addresses the issues that undermine credibility by automating the workflow in the execution of simulation studies. Automation reduces the number of opportunities for users to introduce errors into the scientific process, thereby improving the credibility of the final results. Automation also eases the job of simulation users and allows them to focus on the design of models and the analysis of results rather than on the complexities of the workflow.
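
To make the idea concrete (a generic sketch of workflow automation, not SAFE's actual interface): automating a study means the experiment design, parameter sweep, random seeds, and result recording all live in code rather than in manual steps, so every run is reproducible and no transcription step can introduce error:

```python
import csv, itertools, random

# Hypothetical experiment definition: every factor and replication count
# is declared up front, so the full design is reproducible from this file.
factors = {"arrival_rate": [0.5, 1.0, 2.0], "buffer_size": [10, 50]}
n_replications = 5

def run_simulation(arrival_rate, buffer_size, seed):
    """Stand-in for a real simulator run; returns one scalar metric."""
    rng = random.Random(seed)
    return rng.gauss(mu=arrival_rate * buffer_size, sigma=1.0)

with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["arrival_rate", "buffer_size", "seed", "metric"])
    for (rate, buf), rep in itertools.product(
            itertools.product(*factors.values()), range(n_replications)):
        seed = hash((rate, buf, rep)) & 0xFFFFFFFF   # deterministic seed
        writer.writerow([rate, buf, seed, run_simulation(rate, buf, seed)])
```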

Relevance: 30.00%

Abstract:

Long Term Evolution (LTE) is a cellular technology foreseen to extend the capacity and improve the performance of current 3G cellular networks. A key mechanism in LTE traffic handling is the packet scheduler, which is in charge of allocating resources to active flows in both the frequency and time dimensions. In this paper we present a performance comparison of three distinct scheduling schemes for the LTE uplink, with the main focus on the impact of flow-level dynamics resulting from random user behaviour. We apply a combined analytical/simulation approach which enables fast evaluation of flow-level performance measures. The results show that by considering flow-level dynamics we are able to observe performance trends that would otherwise stay hidden if only packet-level analysis were performed.
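
A minimal flavour of flow-level evaluation (an illustrative sketch, not the paper's model): flows arrive at random and each brings a random amount of data; the scheduler is abstracted as a fair split of the uplink capacity among concurrent flows (processor sharing), and the metric of interest is the mean flow transfer time:

```python
import random

def flow_level_sim(lam=0.8, mean_size=1.0, capacity=1.0,
                   n_flows=50_000, seed=1):
    """Flow-level abstraction of a fair scheduler: capacity is split
    equally over concurrent flows. Returns the mean flow transfer time."""
    rng = random.Random(seed)
    t, next_id = 0.0, 0
    next_arrival = rng.expovariate(lam)
    remaining, start, times = {}, {}, []   # residual data / start time per flow
    while next_id < n_flows or remaining:
        rate = capacity / len(remaining) if remaining else 0.0
        dt_done = min(remaining.values()) / rate if remaining else float("inf")
        dt_arr = (next_arrival - t) if next_id < n_flows else float("inf")
        dt = min(dt_done, dt_arr)
        for fid in remaining:                  # serve all flows for dt
            remaining[fid] -= rate * dt
        t += dt
        for fid in [f for f, r in remaining.items() if r <= 1e-9]:
            times.append(t - start[fid])       # flow finished
            del remaining[fid]
        if dt_arr <= dt_done and next_id < n_flows:
            remaining[next_id] = rng.expovariate(1.0 / mean_size)
            start[next_id] = t
            next_id += 1
            next_arrival = t + rng.expovariate(lam)
    return sum(times) / len(times)

# Load rho = 0.8: M/M/1 processor sharing predicts mean time 1/(1-rho) = 5
print(f"mean flow transfer time ~ {flow_level_sim():.2f}")
```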

Relevance: 30.00%

Abstract:

BACKGROUND: After bovine spongiform encephalopathy (BSE) emerged in European cattle livestock in 1986, a fundamental question was whether the agent had also established itself in the small ruminant population. In Switzerland, transmissible spongiform encephalopathies (TSEs) in small ruminants have been monitored since 1990. While a BSE infection could be excluded in the most recent TSE cases, techniques to discriminate scrapie from BSE had not been available at the time of diagnosis of the historical cases, so their status remained unclear. We herein applied state-of-the-art techniques to retrospectively classify these animals and to re-analyze the affected flocks for secondary cases. These results were the basis for models simulating the course of TSEs over a period of 70 years. The aim was to arrive at a statistically based overall assessment of the TSE situation in the domestic small ruminant population in Switzerland. RESULTS: In total, 16 TSE cases were identified in small ruminants in Switzerland since 1981, of which eight were atypical scrapie and six were classical scrapie. In two animals, retrospective analysis did not allow any further classification due to the lack of appropriate tissue samples. We found no evidence for an infection with the BSE agent in the cases under investigation. In none of the affected flocks were secondary cases identified. A Bayesian prevalence calculation resulted in most likely estimates of one case of BSE, five cases of classical scrapie and 21 cases of atypical scrapie per 100'000 small ruminants. According to our models, none of the TSEs is expected to cause a broader epidemic in Switzerland. In a closed population, they are rather expected to fade out in the next decades or, in case of a sporadic origin, to remain at a very low level. CONCLUSIONS: In summary, these data indicate that, despite a significant epidemic of BSE in cattle, there is no evidence that BSE established itself in the small ruminant population in Switzerland. Classical and atypical scrapie both occur at a very low level and are not expected to escalate into an epidemic. In this situation, the extent of TSE surveillance in small ruminants requires reevaluation based on a cost-benefit analysis.
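
To illustrate the style of calculation (a simplified sketch with invented numbers, not the paper's actual model, which must also account for imperfect test sensitivity and sampling): with a Beta prior on the prevalence and k detected cases among n tested animals, the posterior is Beta(α+k, β+n−k), whose mode gives a "most likely" prevalence estimate:

```python
from scipy import stats

# Hypothetical surveillance data: k cases among n tested animals
k, n = 6, 120_000
alpha0, beta0 = 1.0, 1.0           # uniform Beta(1,1) prior on prevalence

posterior = stats.beta(alpha0 + k, beta0 + n - k)
mode = (alpha0 + k - 1) / (alpha0 + beta0 + n - 2)   # posterior mode
lo, hi = posterior.ppf([0.025, 0.975])               # 95% credible interval

print(f"most likely prevalence: {mode * 1e5:.1f} per 100'000")
print(f"95% credible interval: [{lo * 1e5:.1f}, {hi * 1e5:.1f}] per 100'000")
```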

Relevance: 30.00%

Abstract:

The design of a high-density neural recording system targeting epilepsy monitoring is presented. Circuit challenges and techniques are discussed to optimize the amplifier topology and the included OTA. A new platform supporting active recording devices, targeting wireless and high-resolution focus localization in epilepsy diagnosis, is also proposed. The post-layout simulation results of an amplifier dedicated to this application are presented. The amplifier is designed in a UMC 0.18 µm CMOS technology, has an NEF of 2.19 and occupies a silicon area of 0.038 mm^2, while consuming 5.8 µW from a 1.8-V supply.
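
For reference, the noise efficiency factor compares an amplifier's input-referred noise and total current draw against an ideal single bipolar transistor: NEF = V_rms,in · sqrt(2·I_tot / (π · U_T · 4kT · BW)). The sketch below uses this standard definition with hypothetical operating-point numbers, not the paper's measured values:

```python
import math

def nef(v_rms_in, i_total, bandwidth, T=300.0):
    """Noise efficiency factor: v_rms_in in volts (input-referred rms noise),
    i_total in amps (total supply current), bandwidth in Hz."""
    k = 1.380649e-23          # Boltzmann constant, J/K
    q = 1.602176634e-19       # elementary charge, C
    U_T = k * T / q           # thermal voltage, ~25.9 mV at 300 K
    return v_rms_in * math.sqrt(2.0 * i_total /
                                (math.pi * U_T * 4.0 * k * T * bandwidth))

# Hypothetical operating point: 3 uVrms noise, 3.2 uA, 10 kHz bandwidth
print(f"NEF = {nef(3e-6, 3.2e-6, 10e3):.2f}")   # ~2.1, near the paper's 2.19
```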

Relevance: 30.00%

Abstract:

In this work, electrophoretically mediated micro-analysis (EMMA) is used in conjunction with short-end injection to improve the in-capillary Jaffé assay for creatinine. Key advances over prior work include (i) using simulation to ensure intimate overlap of the reagent plugs, (ii) using OH- to drive the reaction, and (iii) using short-end injection to minimize analysis time and in-line product degradation. The potential-driven overlapping time of the EMMA approach, as well as the borate buffer background electrolyte (BGE) concentration and pH, are optimized for the short-end approach. The best conditions for short-end analyses would not have been predicted from the prior long-end work, owing to a complex interplay of separation time and product degradation rates. Raw peak areas and flow-adjusted peak areas for the Jaffé reaction product (at 505 nm) are used to assess the sensitivity of the short-end EMMA approach. Optimal overlap conditions depend heavily on local conductivity differences within the reagent zone(s), as these differences cause dramatic voltage field differences, which affect reagent overlap dynamics. Simul 5.0, a dynamic simulation program for capillary electrophoresis (CE) systems, is used to understand the ionic boundaries and profiles that give rise to the experimentally obtained data for EMMA analysis. Overall, fast migration of hydroxide ions from the picrate zone makes reagent overlap difficult. In addition, the challenges associated with the simultaneous overlapping of three reagent zones are considered, and experimental results validate the predictions made by the simulation. With one set of "optimized" conditions, including OH- (253 mM) as the third reagent zone, the response was linear with creatinine concentration (R^2 = 0.998) and reproducible over the clinically relevant range (0.08 to 0.1 mM) of standard creatinine concentrations. An LOD (S/N = 3) of 0.02 mM and an LOQ (S/N = 10) of 0.08 mM were determined. A significant improvement (43%) in assay sensitivity was obtained compared to prior work that considered only two reagents in the overlap.
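
As a generic illustration of how such figures of merit are derived (a sketch with invented calibration data, not the study's measurements): fit a linear calibration of peak area versus creatinine concentration, then estimate the LOD and LOQ from the baseline noise and the calibration slope using the S/N = 3 and S/N = 10 criteria:

```python
import numpy as np

# Hypothetical calibration: creatinine standards (mM) vs. peak area (a.u.)
conc = np.array([0.02, 0.04, 0.06, 0.08, 0.10])
area = np.array([0.41, 0.83, 1.22, 1.65, 2.04])
noise_sd = 0.13               # assumed baseline noise (a.u., std. dev.)

slope, intercept = np.polyfit(conc, area, 1)
r2 = np.corrcoef(conc, area)[0, 1] ** 2

lod = 3 * noise_sd / slope    # S/N = 3 criterion
loq = 10 * noise_sd / slope   # S/N = 10 criterion
print(f"R^2 = {r2:.4f}, LOD = {lod:.3f} mM, LOQ = {loq:.3f} mM")
```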