993 results for 2D elasticity problems
Abstract:
We study the role of connectivity in the linear and nonlinear elastic behavior of amorphous systems, using a two-dimensional random network of harmonic springs as a model system. A natural characterization of these systems arises in terms of the network coordination relative to that of an isostatic network, $\delta z$; a floppy network has $\delta z<0$, while a stiff network has $\delta z>0$. Under an externally applied load we observe that the responses of both floppy and stiff networks are controlled by the same critical point, corresponding to the onset of rigidity. We use numerical simulations to compute the exponents that characterize the shear modulus, the amplitude of non-affine displacements, and the network stiffening as functions of $\delta z$; we derive these exponents theoretically and make predictions for the mechanical response of glasses and fibrous networks.
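A minimal sketch of the kind of model system described: dilute the bonds of a triangular lattice (coordination $z=6$ when complete) and measure the excess coordination $\delta z = z - z_{\mathrm{iso}}$, where $z_{\mathrm{iso}} = 2d = 4$ in two dimensions. The lattice size and dilution probability are illustrative assumptions, not the paper's protocol.

    # Sketch (assumed construction, not the paper's exact protocol): build a
    # triangular lattice of harmonic springs, randomly remove bonds, and
    # measure the coordination z relative to the isostatic value 2d = 4.
    import numpy as np

    rng = np.random.default_rng(0)
    L = 32                 # lattice size (assumed)
    keep_prob = 0.70       # bond-dilution probability (assumed)

    def triangular_bonds(L):
        """Bond list of a sheared-square (topologically triangular) lattice."""
        idx = lambda i, j: i * L + j
        bonds = []
        for i in range(L):
            for j in range(L):
                if j + 1 < L:
                    bonds.append((idx(i, j), idx(i, j + 1)))      # horizontal
                if i + 1 < L:
                    bonds.append((idx(i, j), idx(i + 1, j)))      # vertical
                if i + 1 < L and j + 1 < L:
                    bonds.append((idx(i, j), idx(i + 1, j + 1)))  # diagonal
        return bonds

    kept = [b for b in triangular_bonds(L) if rng.random() < keep_prob]
    z = 2 * len(kept) / L**2            # each bond contributes to two nodes
    print(f"z = {z:.3f}, delta_z = z - 4 = {z - 4:.3f}")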
Abstract:
Numerous structures uplift under the influence of strong ground motion. Although many researchers have investigated the effects of base uplift on very stiff (ideally rigid) structures, the rocking response of flexible structures has received less attention. Related practical analysis methods treat these structures with simplified 'equivalent' oscillators without directly addressing the interaction between elasticity and rocking. This paper addresses the fundamental dynamics of flexible rocking structures. The nonlinear equations of motion, derived using a Lagrangian formulation for large rotations, are presented for an idealized structural model. Particular attention is devoted to the transition between successive phases; a physically consistent classical impact framework is utilized alongside an energy approach. The fundamental dynamic properties of the flexible rocking system are compared with those of similar linear elastic oscillators and rigid rocking structures, revealing the distinct characteristics of flexible rocking structures. In particular, parametric analysis is performed to quantify the effect of elasticity on uplift, overturning instability, and harmonic response, from which an uplifted resonance emerges. The contribution of stability and strength to the collapse of flexible rocking structures is discussed. © 2012 John Wiley & Sons, Ltd.
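For reference, the rigid limit against which the flexible rocking structures are compared can be sketched with Housner's classical model of a free-rocking rigid block, with energy loss concentrated at each impact. The slenderness angle, frequency parameter, and time step below are illustrative assumptions, not values from the paper.

    # Free rocking of a rigid block (Housner's classical model): the rigid
    # baseline for flexible rocking structures. Parameters are assumed.
    import numpy as np

    alpha = 0.25                        # slenderness angle [rad] (assumed)
    p = 2.0                             # frequency parameter sqrt(m g R / I0) (assumed)
    e = 1.0 - 1.5 * np.sin(alpha)**2    # Housner angular-velocity restitution

    dt, T = 1e-4, 10.0
    theta, omega = 0.8 * alpha, 0.0     # released from rest in an uplifted position
    for _ in range(int(T / dt)):
        # equation of motion for rocking about the active corner
        omega += -p**2 * np.sin(alpha * np.sign(theta) - theta) * dt
        theta_next = theta + omega * dt
        if theta * theta_next < 0:      # rotation changes sign: impact, pivot swaps
            omega *= e                  # energy is dissipated only at impacts
        theta = theta_next
    print(f"theta({T:.0f} s) = {theta:.4e} rad")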
Abstract:
This paper presents explicit solutions for a class of decentralized LQG problems in which players communicate their states with delays. A method for decomposing the Bellman equation into a hierarchy of independent subproblems is introduced. Using this decomposition, all of the gains for the optimal controller are computed from the solution of a single algebraic Riccati equation. © 2012 AACC (American Automatic Control Council).
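The centralized building block that the decomposition reuses can be sketched as follows: one continuous-time algebraic Riccati solve yields an optimal state-feedback gain. The system matrices here are illustrative assumptions, not the paper's example; the paper's contribution is that a single such solve suffices for all the decentralized gains.

    # One algebraic Riccati solve -> optimal state-feedback gain (LQR).
    # Matrices are illustrative assumptions, not the paper's example.
    import numpy as np
    from scipy.linalg import solve_continuous_are

    A = np.array([[0.0, 1.0],
                  [-1.0, -0.5]])
    B = np.array([[0.0],
                  [1.0]])
    Q = np.eye(2)                            # state cost
    R = np.array([[1.0]])                    # input cost

    P = solve_continuous_are(A, B, Q, R)     # A'P + PA - P B R^-1 B' P + Q = 0
    K = np.linalg.solve(R, B.T @ P)          # optimal feedback u = -K x
    print(K)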
Abstract:
Most quasi-static ultrasound elastography methods image only the axial strain, derived from displacements measured in the direction of ultrasound propagation. In other directions the beam lacks high-resolution phase information, and displacement estimation is therefore less precise. However, these estimates can be improved by steering the ultrasound beam through multiple angles and combining displacements measured along the different beam directions. Previously, beamsteering has been applied only in 2D, to improve the lateral displacement estimates. In this paper, we extend this to 3D, using a simulated 2D array to steer both laterally and elevationally in order to estimate the full 3D displacement vector over a volume. The method is tested on simulated and phantom data using a simulated 6-10 MHz array, and the precision of displacement estimation is measured with and without beamsteering. In simulations, we found a statistically significant improvement in the precision of lateral and elevational displacement estimates: lateral precision 35.69 μm unsteered, 3.70 μm steered; elevational precision 38.67 μm unsteered, 3.64 μm steered. Similar results were found in the phantom data: lateral precision 26.51 μm unsteered, 5.78 μm steered; elevational precision 28.92 μm unsteered, 11.87 μm steered. We conclude that volumetric 3D beamsteering improves the precision of lateral and elevational displacement estimates. © 2012 Elsevier B.V. All rights reserved.
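The geometric idea can be sketched briefly: each steered beam measures the projection of the tissue displacement onto its own axis, so with three or more non-coplanar steering directions the full 3D vector can be recovered by least squares. The steering angles, displacement, and noise level below are illustrative assumptions, not the paper's settings.

    # Recover a 3D displacement vector from per-beam axial estimates by
    # least squares. Angles, displacement, and noise are assumed values.
    import numpy as np

    rng = np.random.default_rng(1)
    d_true = np.array([5.0, 3.0, 40.0])   # lateral, elevational, axial [um]

    steers = [(0, 0), (10, 0), (-10, 0), (0, 10), (0, -10)]  # (lat, elev) [deg]
    U = []
    for lat, elev in steers:
        a, b = np.radians(lat), np.radians(elev)
        U.append([np.sin(a) * np.cos(b), np.sin(b), np.cos(a) * np.cos(b)])
    U = np.asarray(U)                     # unit vector along each steered beam

    proj = U @ d_true + rng.normal(0.0, 0.5, len(U))  # noisy per-beam estimates
    d_hat, *_ = np.linalg.lstsq(U, proj, rcond=None)
    print(d_hat)                          # close to d_true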
Abstract:
A three-dimensional (3D) numerical model is proposed to solve electromagnetic problems involving the transport current and background field of a high-$T_c$ superconducting (HTS) system. The model is characterized by the E-J power law and H-formulation, and is successfully implemented using finite element software. We first discuss the model in detail, including the mesh methods, boundary conditions and computing time. To validate the 3D model, we calculate the ac loss and trapped-field solution for a bulk material and compare the results with previously verified 2D solutions and an analytical solution. We then apply our model to some typical problems that cannot be tackled by 2D models, such as superconducting bulk arrays and twisted conductors. The new 3D model could be a powerful tool for researchers and engineers to investigate problems of greater complexity.
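The E-J power law named in the abstract is a one-line constitutive relation; a minimal sketch with typical assumed parameter values (not the paper's):

    # E-J power law used in H-formulation models of HTS conductors.
    # E_c, J_c and n below are typical assumed values, not the paper's.
    import numpy as np

    E_c = 1e-4   # electric-field criterion [V/m]
    J_c = 1e8    # critical current density [A/m^2] (assumed)
    n = 25       # power-law index (assumed); n -> infinity recovers the Bean model

    def E(J):
        """Electric field as a function of current density (sign-preserving)."""
        return E_c * np.sign(J) * (np.abs(J) / J_c) ** n

    print(E(np.array([0.5, 0.9, 1.0, 1.1]) * J_c))  # sharp rise near J = J_c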
Abstract:
Graphene is at the center of an ever-growing research effort due to its unique properties, which are interesting for both fundamental science and applications. A key requirement for applications is the development of industrial-scale, reliable, inexpensive production processes. Here we review the state of the art of graphene preparation, production, placement and handling. Graphene is just the first of a new class of two-dimensional materials derived from layered bulk crystals. Most of the approaches used for graphene can be extended to these crystals, accelerating their journey towards applications. © 2012 Elsevier Ltd.
Abstract:
Free software and open source projects are often perceived to be of high quality. It has been suggested that the high level of quality found in some free software projects is related to the open development model, which promotes peer review. While the quality of some free software projects is comparable to, if not better than, that of closed source software, not all free software projects are successful and of high quality. Even mature and successful projects face quality problems; some of these are related to the unique characteristics of free software and open source as a distributed development model led primarily by volunteers. In exploratory interviews with free software and open source developers, several common quality practices as well as actual quality problems were identified. The results of these interviews are presented in this paper in order to take stock of the current status of quality in free software projects and to act as a starting point for the implementation of quality process improvement strategies.
Abstract:
Underground space is commonly exploited both to maximise the utility of costly land in urban development and to reduce the vertical load acting on the ground. Deep excavations are carried out to construct various types of underground infrastructure such as deep basements, subways and service tunnels. Although the soil response to excavation is known in principle, designers lack practical calculation methods for predicting both short- and long-term ground movements. As the understanding of how soil behaves around an excavation in both the short and long term is insufficient and usually empirical, the judgements used in design are also empirical and serious accidents are common. To gain a better understanding of the mechanisms involved in soil excavation, a new apparatus for the centrifuge model testing of deep excavations in soft clay has been developed. This apparatus simulates the field construction sequence of a multi-propped retaining wall during centrifuge flight. A comparison is given between the new technique and the previously used method of draining heavy fluid to simulate excavation in a centrifuge model. The new system has the benefit of giving the correct initial ground conditions before excavation and the proper earth pressure distribution on the retaining structures during excavation, whereas heavy fluid only gives an earth pressure coefficient of unity and is unable to capture any changes in the earth pressure coefficient of soil inside the zone of excavation, for example owing to wall movements. Settlements of the ground surface, changes in pore water pressure, variations in earth pressure, prop forces and bending moments in the retaining wall are all monitored during excavation. Furthermore, digital images taken of a cross-section during the test are analysed using particle image velocimetry to illustrate ground deformation and soil–structure interaction mechanisms. The significance of these observations is discussed.
Abstract:
The loss mechanisms which control 2D incidence range are discussed, with an emphasis on determining which real in-service geometric variations have the largest impact. For the majority of engine compressor blades ($M_{\mathrm{inlet}} > 0.55$), both the negative and positive incidence limits are controlled by supersonic patches. It is shown that these patches are highly sensitive to geometric variations close to, and around, the leading edge. The variations used in this study were measured from newly manufactured as well as ex-service blades. Over most of the high-pressure compressor considered, manufacturing variations were shown to dominate. The first part of the paper shows that, despite large geometric variations (~10% of leading-edge thickness), the incidence range responded in a linear way. The result of this is that the geometric variations have little effect on the mean incidence range of a row of blades. In the second part of the paper a region of the design space is identified where non-linear behavior can result in a 10% reduction in positive incidence range. The mechanism for this is reported and design guidelines for its avoidance are offered. In the final part of the paper, the linear behavior at negative incidence and the transonic nature of the flow are exploited to design a robust asymmetric leading edge with a 5% increase in incidence range.
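A toy calculation of the linearity argument: if the incidence range responds linearly to zero-mean geometric scatter, the mean range of a row of blades is unchanged, whereas a non-linear response shifts it. All numbers below are illustrative assumptions, not measured sensitivities.

    # Why a linear response leaves the row-mean incidence range unchanged.
    # Sensitivities and scatter magnitude are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(2)
    g = rng.normal(0.0, 0.10, 100_000)   # zero-mean geometric scatter (~10%)
    r0, c = 10.0, 4.0                    # nominal range [deg], linear sensitivity

    linear = r0 + c * g                  # linear response of incidence range
    nonlin = linear - 8.0 * np.clip(g, 0.0, None) ** 2  # one-sided extra loss

    print(linear.mean() - r0)            # ~0: scatter cancels in the row mean
    print(nonlin.mean() - r0)            # < 0: non-linearity biases the mean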
Abstract:
Variational methods are a key component of the approximate inference and learning toolbox. These methods fill an important middle ground: they retain distributional information about uncertainty in latent variables, unlike maximum a posteriori (MAP) methods, and yet generally require less computational time than Markov chain Monte Carlo methods. In particular, the variational expectation-maximisation (vEM) and variational Bayes algorithms, both involving variational optimisation of a free energy, are widely used in time-series modelling. Here, we investigate the success of vEM in simple probabilistic time-series models. First we consider the inference step of vEM, and show that a consequence of the well-known compactness property of variational inference is a failure to propagate uncertainty in time, thus limiting the usefulness of the retained distributional information. In particular, the uncertainty may appear to be smallest precisely when the approximation is poorest. Second, we consider parameter learning and analytically reveal systematic biases in the parameters found by vEM. Surprisingly, simpler variational approximations (such as mean-field) can lead to less bias than more complicated structured approximations.
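The compactness property has a closed form in the Gaussian case: a mean-field Gaussian fitted by minimising KL(q||p) to a correlated Gaussian posterior recovers variances $1/[\Sigma^{-1}]_{ii}$, which understate the true marginals $\Sigma_{ii}$ most severely when correlations are strongest. A two-variable sketch, with the covariance values assumed for illustration:

    # Mean-field compactness: fitted variances shrink as correlation grows.
    # The posterior covariance below is an illustrative assumption.
    import numpy as np

    rho = 0.95                                  # strong temporal correlation
    Sigma = np.array([[1.0, rho],
                      [rho, 1.0]])              # true posterior covariance

    q_var = 1.0 / np.diag(np.linalg.inv(Sigma)) # optimal mean-field variances
    print(np.diag(Sigma))                       # true marginals: [1. 1.]
    print(q_var)                                # 1 - rho^2 = 0.0975: overconfident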
Abstract:
Ideally, one would like to perform image search using an intuitive and friendly approach. Many existing image search engines, however, present users only with sets of images arranged on the screen in some default order, typically relevance to a query. While this certainly has its advantages, arguably a more flexible and intuitive way would be to sort images into arbitrary structures such as grids, hierarchies, or spheres, so that images that are visually or semantically alike are placed together. This paper focuses on designing such a navigation system for image browsers. This is a challenging task because an arbitrary layout structure makes it difficult, if not impossible, to compute cross-similarities between images and structure coordinates, the main ingredient of traditional layout approaches. For this reason, we resort to a recently developed machine learning technique: kernelized sorting. It is a general technique for matching pairs of objects from different domains without requiring cross-domain similarity measures, and hence elegantly allows sorting images into arbitrary structures. Moreover, we extend it so that some images can be preselected, for instance to form the top of the hierarchy, allowing the user to subsequently navigate through the search results in the lower levels in an intuitive way. Copyright 2010 ACM.
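A compact sketch of kernelized sorting as used here: given a kernel matrix K over images and a kernel matrix L over layout coordinates, one seeks the permutation P maximising tr(K Pᵀ L P), which can be approached by repeatedly solving linear assignment problems. The data, kernels, and iteration budget below are illustrative assumptions.

    # Kernelized sorting sketch: align an image kernel with a grid-layout
    # kernel by iterated linear assignment. Data and kernels are assumed.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def centered_rbf(Z, gamma=0.5):
        """Centered Gaussian kernel matrix of the rows of Z."""
        D2 = np.sum((Z[:, None] - Z[None]) ** 2, axis=-1)
        n = len(Z)
        H = np.eye(n) - np.ones((n, n)) / n
        return H @ np.exp(-gamma * D2) @ H

    rng = np.random.default_rng(3)
    n = 30
    X = rng.normal(size=(n, 5))                       # image features (assumed)
    Y = np.array([(i, j) for i in range(5) for j in range(6)], float)  # 5x6 grid

    K, L = centered_rbf(X), centered_rbf(Y)
    perm = np.arange(n)                    # position i holds image perm[i]
    for _ in range(50):
        profit = L @ K[perm]               # profit matrix of the linearized LAP
        _, new = linear_sum_assignment(profit, maximize=True)
        if np.array_equal(new, perm):
            break                          # fixed point reached
        perm = new
    # perm[i] is the image placed at grid position Y[i]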