996 results for POLARIZABLE CONTINUUM MODEL
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-03
Abstract:
Strain localisation is a widespread phenomenon often observed in shear and compressive loading of geomaterials, for example in fault gouge. It is believed that the main mechanisms of strain localisation are strain softening and the mismatch between dilatancy and pressure sensitivity. Observations show that gouge deformation is accompanied by considerable rotations of grains. In our previous work, as a model for gouge material, we proposed a continuum description for an assembly of particles of equal radius in which particle rotation is treated as an independent degree of freedom. We showed that there exist critical values of the model parameters for which the displacement gradient exhibits a pronounced localisation at the mid-surface layers of the fault, even in the absence of inelasticity. Here, we generalise the model to the case of finite deformations characteristic of gouge deformation. We derive objective constitutive relationships relating the Jaumann rates of stress and moment stress to the relative strain and curvature rates, respectively. The model suggests that the pattern of localisation remains the same as in the linear case. However, the presence of the Jaumann terms leads to the emergence of non-zero normal stresses acting along and perpendicular to the shear layer (with zero hydrostatic pressure), localised along the mid-line of the gouge; these stress components are absent in the linear model of simple shear. These additional normal stresses, albeit small, change the direction in which the maximal normal stresses act and in which en-echelon fracturing forms.
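The role of the Jaumann terms can be illustrated with a minimal numerical sketch. This is not the paper's Cosserat/micropolar model, just a standard 2-D hypoelastic law with arbitrary moduli integrated under simple shear, but it shows how the corotational spin terms generate trace-free normal stresses that the linear small-strain solution lacks:

```python
import numpy as np

# Hypoelastic update with the Jaumann (corotational) stress rate under
# simple shear.  Moduli and shear rate are illustrative choices, not
# parameters from the paper.
mu, lam = 1.0, 1.0                     # shear modulus, Lame parameter (assumed)
gdot, dt, nsteps = 1.0, 1e-3, 2000     # shear rate, time step, steps (gamma = 2)

L = np.array([[0.0, gdot], [0.0, 0.0]])    # velocity gradient of simple shear
D = 0.5 * (L + L.T)                        # stretching (strain-rate) tensor
W = 0.5 * (L - L.T)                        # spin tensor

sigma = np.zeros((2, 2))
for _ in range(nsteps):
    # Jaumann rate: sigma_dot = W@sigma - sigma@W + lam*tr(D)*I + 2*mu*D
    sigma_dot = W @ sigma - sigma @ W + lam * np.trace(D) * np.eye(2) + 2 * mu * D
    sigma += dt * sigma_dot

# Normal stresses sigma_xx = -sigma_yy have appeared (zero hydrostatic
# pressure), alongside the shear stress sigma_xy.
print(sigma)
```

For this classical benchmark the closed-form solution is sigma_xy = mu*sin(gamma) and sigma_xx = -sigma_yy = mu*(1 - cos(gamma)), which makes the normal-stress effect of the Jaumann terms easy to see.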
Abstract:
Starting from a continuum description, we study the nonequilibrium roughening of a thermal re-emission model for etching in one and two spatial dimensions. Using standard analytical techniques, we map our problem to a generalized version of an earlier nonlocal KPZ (Kardar-Parisi-Zhang) model. In 2 + 1 dimensions, the values of the roughness and the dynamic exponents calculated from our theory go like α ≈ z ≈ 1 and in 1 + 1 dimensions, the exponents resemble the KPZ values for low vapor pressure, supporting experimental results. Interestingly, Galilean invariance is maintained throughout.
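A quick consistency check (ours, not the authors') ties the quoted exponents to the scaling relations they rely on: Galilean invariance of KPZ-type equations enforces alpha + z = 2, and the growth exponent obeys beta = alpha / z.

```python
# Textbook 1+1-D KPZ values and the paper's 2+1-D result, checked against
# the Galilean-invariance relation alpha + z = 2.
alpha_1d, beta_1d = 0.5, 1.0 / 3.0     # textbook 1+1-D KPZ values
z_1d = alpha_1d / beta_1d              # dynamic exponent z = alpha/beta = 3/2
assert abs(alpha_1d + z_1d - 2.0) < 1e-12

alpha_2d, z_2d = 1.0, 1.0              # the paper's 2+1-D result
assert abs(alpha_2d + z_2d - 2.0) < 1e-12
print(z_1d)
```

The 2 + 1-dimensional values alpha = z = 1 thus sit exactly on the line alpha + z = 2, consistent with the abstract's closing remark that Galilean invariance is maintained.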
Abstract:
This paper presents a new interpretation of the Superpave IDT strength test based on a viscoelastic-damage framework. The framework is based on continuum damage mechanics and the thermodynamics of irreversible processes with an anisotropic damage representation. The new approach introduces considerations for the viscoelastic effects and the damage accumulation that accompany the fracture process into the interpretation of the Superpave IDT strength test for the identification of the Dissipated Creep Strain Energy (DCSE) limit from the test result. The viscoelastic model is implemented in a Finite Element Method (FEM) program for the simulation of the Superpave IDT strength test. The DCSE values obtained using the new approach are compared with the values obtained using the conventional approach to evaluate the validity of the assumptions made in the conventional interpretation of the test results. The results show that the conventional approach over-estimates the DCSE value, with the estimation error increasing at higher deformation rates.
Abstract:
The main objective of this work is to develop a quasi three-dimensional numerical model to simulate stony debris flows, considering a continuum fluid phase, composed of water and fine sediments, and a non-continuum phase including large particles such as pebbles and boulders. Large particles are treated in a Lagrangian frame of reference using the Discrete Element Method, while the fluid phase is based on the Eulerian approach, using the Finite Element Method to solve the depth-averaged Navier–Stokes equations in two horizontal dimensions. The particles' equations of motion are solved in three dimensions. The model simulates particle-particle and wall-particle collisions, taking into account that the particles are immersed in a fluid. Bingham and Cross rheological models are used for the continuum phase. Both formulations provide very stable results, even in the range of very low shear rates, and the Bingham formulation is better able to simulate the stopping stage of the fluid when the applied shear stresses are low. Results of numerical simulations have been compared with data from laboratory experiments on a flume-fan prototype. They show that the model is capable of simulating the motion of large particles moving in the fluid flow, handling dense particulate flows and avoiding overlap among particles. An application to the debris flow events that occurred in Northern Venezuela in 1999 shows that the model could replicate the main boulder accumulation areas surveyed by the USGS. The uniqueness of this research is the integration of mud flow and stony debris movement in a single modeling tool that can be used for planning and management of debris flow prone areas.
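The two rheological laws named above can be sketched as apparent-viscosity functions eta(gdot). Parameter values are illustrative, not the paper's, and the Bingham law is written with the common Papanastasiou-style regularisation so that it stays bounded at the very low shear rates the abstract mentions:

```python
import numpy as np

# Apparent viscosity for the regularised Bingham and the Cross models.
# All parameter values below are assumed for illustration.

def bingham_viscosity(gdot, tau_y=10.0, mu_p=0.5, m=1000.0):
    """Regularised Bingham: eta = mu_p + tau_y * (1 - exp(-m*gdot)) / gdot."""
    gdot = np.asarray(gdot, dtype=float)
    return mu_p + tau_y * (1.0 - np.exp(-m * gdot)) / gdot

def cross_viscosity(gdot, eta0=100.0, eta_inf=0.5, K=1.0, n=0.8):
    """Cross model: eta = eta_inf + (eta0 - eta_inf) / (1 + (K*gdot)**n)."""
    gdot = np.asarray(gdot, dtype=float)
    return eta_inf + (eta0 - eta_inf) / (1.0 + (K * gdot) ** n)

gdots = np.logspace(-3, 2, 6)          # shear rates from 1e-3 to 1e2 s^-1
print(bingham_viscosity(gdots))        # high apparent viscosity at low gdot
print(cross_viscosity(gdots))          # plateaus at eta0 (low) and eta_inf (high)
```

At vanishing shear rate the regularised Bingham viscosity tends to the finite value mu_p + tau_y*m rather than diverging, which is what makes the stopping stage of the flow numerically tractable.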
Abstract:
Trials in a temporal two-interval forced-choice discrimination experiment consist of two sequential intervals presenting stimuli that differ from one another in magnitude along some continuum. The observer must report in which interval the stimulus had the larger magnitude. The standard difference model from signal detection theory posits that order of presentation should not affect the results of the comparison, a property known as the balance condition (J.-C. Falmagne, 1985, in Elements of Psychophysical Theory). But empirical data prove otherwise and consistently reveal what Fechner (1860/1966, in Elements of Psychophysics) called time-order errors, whereby the magnitude of the stimulus presented in one of the intervals is systematically underestimated relative to the other. Here we discuss sensory factors (temporary desensitization) and procedural glitches (short interstimulus or intertrial intervals and response bias) that might explain the time-order error, and we derive a formal model indicating how these factors make observed performance vary with presentation order despite a single underlying mechanism. Experimental results are also presented illustrating the conventional failure of the balance condition and testing the hypothesis that time-order errors result from contamination by the factors included in the model.
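A minimal sketch of the standard difference model makes the balance condition concrete; an additive constant delta stands in for a time-order error. Parameter values are illustrative, not taken from the experiments in the paper:

```python
from math import erf, sqrt

def p_second_larger(mu1, mu2, sigma=1.0, delta=0.0):
    """P(report 'second interval larger') under the difference model.

    delta = 0 recovers the standard model; delta != 0 shifts the internal
    comparison, e.g. through underestimation of the first stimulus."""
    z = (mu2 - mu1 + delta) / (sigma * sqrt(2.0))   # two noisy observations
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))         # standard normal CDF

# Balance condition: with delta = 0, swapping presentation order mirrors
# the response probability, so the two orders sum to 1.
pA = p_second_larger(0.0, 1.0)   # larger stimulus in the second interval
pB = p_second_larger(1.0, 0.0)   # larger stimulus in the first interval
print(pA + pB)                   # sums to 1 (up to rounding)

# A non-zero delta breaks the symmetry: the signature of a time-order error.
print(p_second_larger(0.0, 1.0, delta=0.3) + p_second_larger(1.0, 0.0, delta=0.3))
```

With delta fixed across orders, observed performance differs between the two presentation orders even though a single underlying comparison mechanism is at work, which is the structure of the formal model the abstract describes.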
Abstract:
To solve problems in polymer fluid dynamics, one needs the equations of continuity, motion, and energy. The last two contain the stress tensor and the heat-flux vector for the material. There are two ways to formulate the stress tensor: (1) one can write a continuum expression for the stress tensor in terms of kinematic tensors, or (2) one can select a molecular model that represents the polymer molecule and then develop an expression for the stress tensor from kinetic theory. The advantage of the kinetic theory approach is that it yields information about the relation between the molecular structure of the polymers and their rheological properties. In this review, we restrict the discussion primarily to the simplest stress tensor expressions, or “constitutive equations”, containing from two to four adjustable parameters, although we do indicate how these formulations may be extended to give more complicated expressions. We also explore how these simplest expressions are recovered as special cases of a more general framework, the Oldroyd 8-constant model. The virtue of studying the simplest models is that we can discover some general notions as to which types of empiricisms or which types of molecular models seem to be worth investigating further. We also explore equivalences between continuum and molecular approaches. We restrict the discussion to several types of simple flows, such as shearing flows and extensional flows, since these are the flows of greatest importance in industrial operations. Furthermore, if these simple flows cannot be well described by continuum or molecular models, then there is little point in lavishing time and energy on applying them to more complex flow problems.
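As a concrete example of a few-parameter constitutive equation of the kind reviewed above, the steady simple-shear material functions of the Oldroyd-B model (a special case of the Oldroyd 8-constant framework) are standard textbook results; the parameter values below are illustrative:

```python
# Steady simple-shear material functions of the Oldroyd-B model.
eta_s, eta_p, lam1 = 0.1, 1.0, 0.5   # solvent viscosity, polymer viscosity, relaxation time (assumed)

def steady_shear(gdot):
    """Return (shear stress, first and second normal-stress differences)."""
    eta = eta_s + eta_p              # Oldroyd-B: shear viscosity is rate-independent
    psi1 = 2.0 * eta_p * lam1        # first normal-stress coefficient
    psi2 = 0.0                       # Oldroyd-B predicts a vanishing Psi_2
    return eta * gdot, psi1 * gdot**2, psi2 * gdot**2

print(steady_shear(1.0))
```

The constant viscosity and quadratic first normal-stress difference in shear are the classic signatures of this small family; richer behaviour such as shear thinning or a non-zero second normal-stress difference requires more of the eight Oldroyd constants.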
Abstract:
An increase in Balmer continuum radiation during solar flares was predicted by various authors, but has never been firmly confirmed observationally using ground-based slit spectrographs. Here we describe a new post-focal instrument, the image selector, with which the Balmer continuum flux can be measured from the whole flare area, in analogy to successful detections on flaring dMe stars. The system was developed and put into operation at the horizontal solar telescope HSFA2 of the Ondřejov Observatory. We measure the total flux with a fast spectrometer from a limited but well-defined region on the solar disk. Using a system of diaphragms, the disturbing contribution of the bright solar disk is eliminated as much as possible. Light curves of the measured flux in the spectral range 350 – 440 nm are processed together with Hα images of the flaring area delimited by the appropriate diaphragm. The spectral flux data are flat-fielded, calibrated, and processed for comparison with model predictions. Our analysis of the data proves that the described device is sufficiently sensitive to detect variations in the Balmer continuum during solar flares. Assuming that the Balmer-continuum kernels are at least of a size similar to those visible in Hα, we find the flux increase in the Balmer continuum to reach 230 – 550 % of the quiet continuum during the observed X-class flare. We also found temporal changes in the Balmer continuum flux starting well before the onset of the flare in Hα.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
We study spatially localized states of a spiking neuronal network populated by a pulse-coupled phase oscillator known as the lighthouse model. We show that, in the limit of slow synaptic interactions, the continuum dynamics reduce to those of the standard Amari model. For non-slow synaptic connections we are able to go beyond the standard firing-rate analysis of localized solutions, allowing us to explicitly construct a family of co-existing one-bump solutions and then track bump width and firing pattern as a function of system parameters. We also present an analysis of the model on a discrete lattice. We show that multiple-width bump states can co-exist and uncover a mechanism for bump wandering linked to the speed of synaptic processing. Moreover, beyond a wandering transition point we show that the bump undergoes an effective random walk with a diffusion coefficient that scales exponentially with the rate of synaptic processing and linearly with the lattice spacing.
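The standard Amari field that the slow-synapse continuum limit reduces to can be sketched numerically: du/dt = -u + w * H(u - theta), with a Heaviside firing rate H and a lateral-inhibition kernel w. The kernel, threshold and discretisation below are illustrative choices, not parameters of the lighthouse network itself:

```python
import numpy as np

# One-bump solution of the Amari neural field on a discretised 1-D domain.

N, Lx, theta = 512, 20.0, 0.1
x = np.linspace(-Lx / 2, Lx / 2, N, endpoint=False)
dx = x[1] - x[0]

def w(r):                                   # "Mexican hat": local excitation,
    return np.exp(-np.abs(r)) - 0.25 * np.exp(-np.abs(r) / 4.0)   # broad inhibition

W = w(x[:, None] - x[None, :]) * dx         # discretised convolution operator

u = np.exp(-x**2)                           # localized initial condition
dt = 0.1
for _ in range(1000):                       # Euler integration to steady state
    u = u + dt * (-u + W @ (u > theta))

active = x[u > theta]                       # the "bump": suprathreshold region
print("bump width:", active.max() - active.min())
```

For this kernel the Amari construction predicts a stable bump of width a solving the threshold condition exp(-a/4) - exp(-a) = theta, i.e. a of roughly 9.2, which the discretised simulation should reproduce up to lattice effects; on a much coarser lattice the same setup can also exhibit the pinning and wandering effects the abstract discusses.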
Abstract:
The Hybrid Monte Carlo algorithm is adapted to the simulation of a system of classical degrees of freedom coupled to non-self-interacting lattice fermions. The diagonalization of the Hamiltonian matrix is avoided by introducing a path-integral formulation of the problem in d + 1 Euclidean space–time. A perfect-action formulation allows us to work in continuous Euclidean time, without the need for a Trotter–Suzuki extrapolation. To demonstrate the feasibility of the method we study the Double Exchange Model in three dimensions. The complexity of the algorithm grows only with the system volume, allowing us to simulate lattices as large as 16³ on a personal computer. We conclude that the second-order paramagnetic–ferromagnetic phase transition of Double Exchange materials close to half-filling belongs to the universality class of the three-dimensional classical Heisenberg model.
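The algorithm named above can be sketched in its simplest setting, a single classical degree of freedom with a quadratic action S(x) = x^2/2 (a unit Gaussian), rather than the classical-spin-plus-fermion problem of the paper:

```python
import numpy as np

# Bare-bones Hybrid (Hamiltonian) Monte Carlo: momentum refresh, leapfrog
# trajectory, Metropolis accept/reject on the energy change.

rng = np.random.default_rng(0)

def grad_S(x):
    return x                                 # dS/dx for S(x) = x^2 / 2

def hmc_step(x, n_leap=20, eps=0.1):
    p = rng.normal()                         # refresh momentum
    x_new, p_new = x, p
    p_new -= 0.5 * eps * grad_S(x_new)       # leapfrog: initial half kick
    for _ in range(n_leap - 1):
        x_new += eps * p_new                 # drift
        p_new -= eps * grad_S(x_new)         # full kick
    x_new += eps * p_new
    p_new -= 0.5 * eps * grad_S(x_new)       # final half kick
    dH = (0.5 * p_new**2 + 0.5 * x_new**2) - (0.5 * p**2 + 0.5 * x**2)
    return x_new if rng.random() < np.exp(-dH) else x   # Metropolis test

samples = []
x = 0.0
for _ in range(5000):
    x = hmc_step(x)
    samples.append(x)
print(np.mean(samples), np.var(samples))    # should approach 0 and 1
```

Because the leapfrog integrator is reversible and volume-preserving, the accept/reject step makes the chain exact regardless of the step size; in the paper the expensive part is instead the fermionic contribution to the action, which the path-integral formulation keeps tractable.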
Abstract:
We present a multiscale model bridging length and time scales from molecular to continuum levels with the objective of predicting the yield behavior of amorphous glassy polyethylene (PE). Constitutive parameters are obtained from molecular dynamics (MD) simulations, decreasing the need for ad hoc experiments. Consequently, we achieve: (1) the identification of multisurface yield functions; (2) the upscaling of the high strain rates involved in MD simulations to the continuum via quasi-static simulations, with validation demonstrating that the entire set of multisurface yield functions can be scaled to quasi-static rates, where the yield stresses can be predicted by a proposed scaling law; and (3) a hierarchical multiscale model that predicts the temperature- and strain-rate-dependent yield strength of PE.
Abstract:
In this thesis, we explore three methods for the geometrico-static modelling of continuum parallel robots. Inspired by biological trunks, tentacles and snakes, continuum robot designs can reach confined spaces, manipulate objects in complex environments and conform to curvilinear paths in space. In addition, parallel continuum manipulators have the potential to inherit some of the compactness and compliance of continuum robots while retaining some of the precision, stability and strength of rigid-link parallel robots. The foundation of our work is laid on slender beams by applying the Cosserat rod theory, which is appropriate for modelling continuum robots. Three different approaches are then developed on a case study of a planar parallel continuum robot constituted of two connected flexible links. We solve the forward and inverse geometrico-static problems using (a) shooting methods to obtain a numerical solution, (b) an elliptic method to find a quasi-analytical solution, and (c) the Corde model to perform further model analysis. The performance of each of the studied methods is evaluated and their limits are highlighted. This thesis is organised as follows. Chapter one introduces the field of continuum robotics and the parallel continuum robots studied in this work. Chapter two describes the geometrico-static problem and gives its mathematical formulation. Chapter three explains the numerical approach based on the shooting method, and chapter four introduces the quasi-analytical solution. Chapter five then introduces the analytic method inspired by the Corde model, and chapter six gives the conclusions of this work.
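The shooting approach can be sketched for a single planar Cosserat rod, reduced here to the inextensible elastica special case: given a tip force, shoot on the unknown base moment until the free-tip condition m(L) = 0 is met. Length, stiffness and load are illustrative values, not the two-link robot of the thesis:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

EI, L = 1.0, 1.0                   # bending stiffness, rod length (assumed)
fx, fy = 0.0, -1.0                 # external tip force (downward)

def rod_ode(s, y):
    x, z, th, m = y                # position, tangent angle, bending moment
    # Planar Kirchhoff rod: r' = (cos th, sin th), th' = m/EI,
    # m' = fx*sin(th) - fy*cos(th) from moment balance with the tip force.
    return [np.cos(th), np.sin(th), m / EI,
            fx * np.sin(th) - fy * np.cos(th)]

def shoot(m0):
    sol = solve_ivp(rod_ode, (0.0, L), [0.0, 0.0, 0.0, m0],
                    rtol=1e-9, atol=1e-12)
    return sol.y[:, -1]            # state at the tip

m0 = brentq(lambda m: shoot(m)[3], -3.0, 0.0)   # enforce zero tip moment
x1, y1, th1, m1 = shoot(m0)
print("base moment:", m0, "tip position:", (x1, y1))
```

For several rods coupled at a platform, shooting generalises to a vector of unknowns (one base wrench per rod plus platform equilibrium residuals) solved by a multivariate root finder, which is essentially how the numerical approach of chapter three scales to the two-link case.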
Abstract:
Cancer is a challenging disease that involves multiple types of biological interactions across different time and space scales. Computational modelling often faces problems that, at the current level of technology, are impracticable to represent in a single space-time continuum. To handle this sort of problem, complex orchestrations of multiscale models are frequently used. PRIMAGE is a large EU project that aims to support personalised childhood cancer diagnosis and prognosis. The goal is to do so by predicting the growth of the solid tumour using multiscale in-silico technologies. The project proposes an open cloud-based platform to support decision making in the clinical management of paediatric cancers. The orchestration of predictive models is in general complex and requires a software framework that supports and facilitates the task. The present work proposes the development of an updated framework, referred to herein as the VPH-HFv3, as part of the PRIMAGE project. This framework, a complete rewrite with respect to the previous versions, aims to orchestrate several models, which are in concurrent development, using an architecture that is as simple as possible, easy to maintain and highly reusable. Problems of this sort generally entail unfeasible execution times. To overcome this, a strategy was developed combining particularisation, which maps the upper-scale model results onto a reduced number of lower-scale simulations, and homogenisation, which performs the inverse mapping; the accuracy of this approach was then analysed.