220 results for 2D elasticity problems


Relevance: 20.00%

Abstract:

This paper presents an ultrasonic velocity measurement method for investigating the possible effects of high voltage, high frequency pulsed power on the elasticity of cortical bone material. Before applying a pulsed power signal to live bone, it is essential to determine, non-destructively, safe parameters for the pulsed power applied to bone. Therefore, the possible changes in cortical bone material elasticity due to a specified pulsed power excitation have been investigated. A controllable positive buck-boost converter with adjustable output voltage and frequency was used to generate high voltage pulses (500 V magnitude at 10 kHz). To determine bone elasticity, ultrasonic velocity measurements were conducted on two groups of cortical bone samples: a control group (unexposed to pulsed power but kept in the same environmental conditions) and a group exposed to pulsed power. Young's modulus of the cortical bone samples was determined and compared before and after applying the pulsed power signal. After applying the high voltage pulses, no significant variation in the elastic properties of the cortical bone specimens was found compared to the control. The results show that pulsed power with the nominated parameters can be applied to cortical bone tissue without any considerable negative effect on the elasticity of the bone material.
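The abstract does not state which velocity-to-modulus relation was used; a minimal sketch of the ultrasonic approach, assuming the standard thin-rod relation E = ρv² (all numbers invented, not the paper's data):

```python
# Hedged sketch: Young's modulus from an ultrasonic through-transmission
# measurement, assuming the thin-rod relation E = rho * v^2. The paper may
# use a different relation; the inputs below are illustrative only.

def youngs_modulus(thickness_m, transit_time_s, density_kg_m3):
    """Estimate Young's modulus (Pa) from specimen thickness and pulse time."""
    velocity = thickness_m / transit_time_s      # ultrasonic velocity (m/s)
    return density_kg_m3 * velocity ** 2         # E = rho * v^2

E = youngs_modulus(thickness_m=5e-3, transit_time_s=1.25e-6,
                   density_kg_m3=1900.0)
print(f"E = {E / 1e9:.1f} GPa")                  # 30.4 GPa for these inputs
```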

Relevance: 20.00%

Abstract:

For the analysis of material nonlinearity, an effective shear modulus approach based on the strain control method is proposed in this paper, using the point collocation method. Hencky's total deformation theory is used to evaluate the effective shear modulus, Young's modulus and Poisson's ratio, which are treated as spatial field variables. These effective properties are obtained by the strain-controlled projection method in an iterative manner. To evaluate the second-order derivatives of the shape functions at the field points, radial basis functions (RBFs) in the local support domain are used. Several numerical examples are presented to demonstrate the efficiency and accuracy of the proposed method, and comparisons are made with analytical solutions and the finite element method (ABAQUS).
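As one plausible instance of the RBF derivative evaluation the abstract describes (the paper's basis choice and support-selection rule are not given in the abstract), a minimal sketch computing a second derivative from scattered nodes with a multiquadric RBF:

```python
# Hedged sketch: second-derivative evaluation via multiquadric RBF
# interpolation over a set of nodes. Basis, shape parameter c, and the
# global (rather than local-support) solve are illustrative choices.
import numpy as np

def rbf_second_derivative(x_nodes, f_nodes, x_eval, c=0.5):
    """Approximate f''(x_eval) from scattered 1D nodes."""
    r = x_nodes[:, None] - x_nodes[None, :]
    A = np.sqrt(r**2 + c**2)                 # multiquadric phi(r)
    coeffs = np.linalg.solve(A, f_nodes)     # interpolation weights
    d = x_eval - x_nodes
    # d2/dx2 sqrt(d^2 + c^2) = c^2 / (d^2 + c^2)^(3/2)
    phi_xx = c**2 / (d**2 + c**2) ** 1.5
    return phi_xx @ coeffs

x = np.linspace(0.0, 1.0, 15)
print(rbf_second_derivative(x, np.sin(np.pi * x), 0.5))  # exact: -pi^2 ≈ -9.87
```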

Relevance: 20.00%

Abstract:

A novel m-ary tree-based approach is presented for solving asset management decision problems that are combinatorial in nature. The approach introduces a new dynamic constraint-based control mechanism capable of excluding infeasible solutions from the solution space. It also addresses the challenges of ordering asset decisions.
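A minimal sketch of the general idea, with all names and the example constraint invented: decision vectors as paths in an m-ary tree, with infeasible subtrees pruned by a constraint check at each node:

```python
# Hedged sketch: enumerate combinatorial asset decisions as paths in an
# m-ary tree, pruning subtrees that violate a dynamic constraint. The tree
# layout, constraint form, and names are illustrative, not the paper's.

def enumerate_decisions(options_per_asset, feasible, partial=()):
    """Yield complete decision vectors; prune infeasible partial branches."""
    if not feasible(partial):          # dynamic constraint check at each node
        return                         # whole subtree excluded
    if len(partial) == len(options_per_asset):
        yield partial
        return
    for choice in options_per_asset[len(partial)]:
        yield from enumerate_decisions(options_per_asset, feasible,
                                       partial + (choice,))

# Toy example: 3 assets, 2 actions each, budget allowing at most 2 repairs.
options = [("keep", "repair")] * 3
budget_ok = lambda p: p.count("repair") <= 2
print(list(enumerate_decisions(options, budget_ok)))   # 7 of 8 vectors survive
```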

Relevance: 20.00%

Abstract:

Since the availability of 3D full-body scanners and the associated software systems for operating on large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by the representatives of the 3D paradigm need to be critically reviewed, though. 3D anthropometry has advantages as well as shortfalls, which need to be carefully weighed. While it is apparent that the measurement of a full-body point cloud allows for easier storage of raw data and improves quality control, the difficulties in calculating standardized measurements from the point cloud are widely underestimated. Early studies that used 3D point clouds to derive anthropometric dimensions showed unacceptable deviations from the standardized results measured manually. While 3D human point clouds provide a valuable tool for replicating specific single persons for further virtual studies, or for personalizing garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections; a 1D/2D approach is therefore sufficient to solve an ergonomic design problem. As a consequence, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach, designing for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution functions of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is to be seen as a two-dimensional integral or distribution function of length and girth, the calculation of combined percentiles (a common ergonomic requirement) is undefined, as the sketch below illustrates. Consequently, we suggest critically reviewing the cost and use of 3D anthropometry, and we recommend making proper use of the widely available one- and two-dimensional anthropometric data in ergonomic design.
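One way to see the combined-percentile problem numerically; a hedged sketch with an invented correlation between two body dimensions, showing that a design covering "the 95th percentile" of each dimension separately accommodates well under 95% of the population:

```python
# Hedged sketch of the combined-percentile problem: with two correlated
# dimensions, the fraction of people below the 95th percentile in BOTH is
# well under 95%. The correlation value is invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.7                                   # assumed stature-girth correlation
cov = [[1.0, rho], [rho, 1.0]]
sample = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)

p95 = np.percentile(sample, 95, axis=0)     # per-dimension 95th percentiles
both = np.mean((sample[:, 0] <= p95[0]) & (sample[:, 1] <= p95[1]))
print(f"accommodated by a 'both at p95' design: {both:.1%}")   # ≈ 92-93%
```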

Relevance: 20.00%

Abstract:

Children who have suffered physical or sexual abuse are as vulnerable as adult trauma victims to "secondary trauma", in which the reactions of the family or broader system exacerbate the child's difficulties. Three clinical cases (a 7-year-old male, an 8-year-old male, and a 7-year-old female) are presented which suggest that this secondary trauma can be made worse by either excessive or insufficient provision of individual child psychotherapy, and by the way the system interprets and reacts to these clinical decisions. Types of secondary trauma and their interactions with clinical decisions are discussed, and ways of framing clinical decisions to minimize potential secondary trauma are presented.

Relevance: 20.00%

Abstract:

A system is described for calculating volume from a sequence of multiplanar 2D ultrasound images. Ultrasound images are captured using a video digitising card (Hauppauge Win/TV card) installed in a personal computer, and regions of interest are transformed into 3D space using position and orientation data obtained from an electromagnetic device (Polhemus Fastrak). The accuracy of the system was assessed by scanning 10 water-filled balloons (13-141 ml), 10 kidneys (147-200 ml) and 16 fetal livers (8-37 ml) in water using an Acuson 128XP/10 (5 MHz curvilinear probe). Volume was calculated using the ellipsoid, planimetry, tetrahedral and ray tracing methods and compared with the actual volume measured by weighing (balloons) and water displacement (kidneys and livers). The mean percentage error for the ray tracing method was 0.9 ± 2.4%, 2.7 ± 2.3% and 6.6 ± 5.4% for balloons, kidneys and livers, respectively. So far the system has been used clinically to scan fetal livers and lungs, neonatal brain ventricles and adult prostate glands.
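Of the four volume methods compared, planimetry is the simplest to sketch; a minimal illustration in its parallel-slice special case (the described system handles arbitrary slice orientations via the Polhemus data, which this sketch does not reproduce, and all numbers are invented):

```python
# Hedged sketch of the planimetry idea in its simplest (parallel-slice)
# form: volume ≈ sum of traced cross-sectional areas times slice spacing.

def planimetry_volume(areas_mm2, spacing_mm):
    """Volume in ml from traced slice areas (mm^2) and slice spacing (mm)."""
    return sum(areas_mm2) * spacing_mm / 1000.0   # mm^3 -> ml

# Illustrative numbers only: 10 traced slices, 5 mm apart.
print(planimetry_volume([120, 340, 560, 700, 760, 740, 640, 460, 260, 90],
                        5.0))                     # 23.35 ml
```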

Relevance: 20.00%

Abstract:

This paper discusses commonly encountered diesel engine problems and the underlying combustion-related faults. Also discussed are the methods used in previous studies to simulate diesel engine faults, and the initial results of an experimental simulation of a common combustion-related fault, namely diesel engine misfire. This experimental fault simulation represents the first step towards a comprehensive investigation of the characteristics of acoustic emission signals arising from combustion-related diesel engine faults. Data corresponding to different engine running conditions were captured using in-cylinder pressure, vibration and acoustic emission transducers, along with both crank-angle encoder and top-dead-centre signals. Using these signals, it was possible to characterise the diesel engine in-cylinder pressure profiles and the effect of different combustion conditions on both the vibration and acoustic emission signals.
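A hedged sketch of one routine step implied by this instrumentation: re-indexing a time-sampled signal onto the crank-angle domain using the encoder pulses. All names, rates and the interpolation scheme here are assumptions, not the paper's processing chain:

```python
# Hedged sketch: map a time-sampled vibration/AE trace onto crank angle
# using encoder pulse times, so events can be related to combustion phase.
import numpy as np

def to_crank_angle_domain(t, signal, encoder_pulse_times, pulses_per_rev=360):
    """Interpolate a signal onto a uniform 1-degree crank-angle grid."""
    # Angle is known exactly at each encoder pulse: pulse k is at k*360/ppr deg.
    pulse_angles = np.arange(len(encoder_pulse_times)) * 360.0 / pulses_per_rev
    angle_at_t = np.interp(t, encoder_pulse_times, pulse_angles)
    grid = np.arange(0.0, pulse_angles[-1], 1.0)
    return grid, np.interp(grid, angle_at_t, signal)

# Invented example: one revolution at a steady 1200 rpm, 50 kHz sampling.
fs = 50_000.0
t = np.arange(0.0, 0.05, 1.0 / fs)
signal = np.sin(2 * np.pi * 500 * t)            # stand-in for an AE trace
pulses = np.arange(0.0, 0.05, 0.05 / 360)       # 360 pulses over the rev
deg, sig_deg = to_crank_angle_domain(t, signal, pulses)
```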

Relevance: 20.00%

Abstract:

Public dialogue regarding the high concentration of drug use and crime in inner city locations is frequently legitimised through the visibility of drug-using populations and a perception of high crime rates. The public space known as the Brunswick Street Mall (Valley Mall), located in the inner city Brisbane suburb of Fortitude Valley, has long provided the focal point for discussions regarding the problem of illicit drug use and antisocial behaviour in Brisbane. During the late 1990s a range of stakeholders in Fortitude Valley became mobilised to tackle crime and illicit drugs. In particular, they wanted to dismantle popular perceptions of the area as representing the dark and unsafe side of Brisbane. The aim of this campaign was to instil a sense of safety in the area and dislodge Fortitude Valley from its reputation as a 'symbolic location of danger'. This thesis is a case study of an urban site contested by the diverse aims of a range of stakeholders invested in an urban renewal program and community safety project. The case study makes visible a number of actors that were lured from their existing roles in an indeterminable number of heterogeneous networks in order to create a community safety network. The following analysis of the community safety network emphasises some specific actors: history, ideas, technologies, materialities and displacements. The case study relies on the work of Foucault, Latour, Callon and Law to draw out the rationalities, the background contingencies, and the attempts to impose order and translate a number of entities into the community safety project in Fortitude Valley. The results of this research show that the community safety project is a case of ontological politics. Specifically, the data indicate that both the (reality) problem of safety and the (knowledge) solution to safety were created simultaneously. This thesis explores the idea that while violence continues to occur in the Valley, evidence that community safety 'got done' is located through mapping its displacement and eventual disappearance. As such, this thesis argues that community safety is a 'collateral reality'.

Relevance: 20.00%

Abstract:

This thesis develops a detailed conceptual design method and a system software architecture, defined with a parametric and generative evolutionary design system, to support an integrated interdisciplinary building design approach. The research recognises the need to shift design effort toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication for the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements across a wide range of environmental and social circumstances, so a rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and their ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research proposes a design method and system that promote a collaborative relationship between human creativity and computer capability. The tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary system to performance assessment applications, which are used as prioritised fitness functions, producing design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach will produce solutions through a design process that considers and balances the requirements of all aspects of the design. Since the thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined, and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, in which key aspects of the system that had not previously been proven in the literature were implemented to test its feasibility. By combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the base for a future software development project. The evaluation stage, which includes building a prototype to test and evaluate system performance against the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. It consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components.
The design schema provides constraints on the generation of designs, enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. It supports the digital representation of the designer's creativity within a dynamic design framework that can be encoded and then executed through evolutionary genetic algorithms. The design schema incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that standard evolutionary techniques do not, and the process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded into the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem so that the requirements of each level are dealt with separately, and then reassembling them in a bottom-up approach, reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach of exploring the range of design solutions through modification of the design schema, as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions into the genetic algorithm, each relevant to a specific level, supporting an integrated multi-level, multi-disciplinary approach. The HEAD system thus promotes a collaborative relationship between human creativity and computer capability: the design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity, and by focusing on finding solutions for the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists the design decision-making process.
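None of the names below come from the thesis; a minimal sketch of the pattern the Synthesis Algorithms component describes, i.e. evolving one level ('Room') under several fitness terms before passing survivors up to the next level:

```python
# Hedged sketch of the multi-level, multi-fitness pattern described above.
# Encodings, operators and fitness terms are invented; the thesis's actual
# schema, graph inputs and operators are far richer.
import random

def evolve_level(population, fitness_fns, generations=50, keep=10):
    """Generic GA level: rank by summed fitness terms, mutate survivors."""
    for _ in range(generations):
        scored = sorted(population,
                        key=lambda ind: sum(f(ind) for f in fitness_fns),
                        reverse=True)
        survivors = scored[:keep]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(len(population) - keep)]
    return population[:keep]

def mutate(ind):
    ind = list(ind)
    i = random.randrange(len(ind))
    ind[i] = max(0.5, ind[i] + random.uniform(-1.0, 1.0))   # keep dims positive
    return tuple(ind)

# 'Room' level: individuals are (width, depth) in metres; fitness terms
# target an area near 20 m^2 and a square-ish aspect ratio (both invented).
rooms = [(random.uniform(2, 8), random.uniform(2, 8)) for _ in range(40)]
area_fit = lambda r: -abs(r[0] * r[1] - 20.0)
shape_fit = lambda r: -abs(r[0] / r[1] - 1.0)
best_rooms = evolve_level(rooms, [area_fit, shape_fit])
print(best_rooms[0])    # candidate rooms passed up to the 'Layout' level
```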

Relevance: 20.00%

Abstract:

In this paper we extend the ideas of Brugnano, Iavernaro and Trigiante in their development of HBVM($s,r$) methods to construct symplectic Runge-Kutta methods for all values of $s$ and $r$ with $s\geq r$. However, these methods do not see the dramatic performance improvement that HBVMs can attain. Nevertheless, in the case of additive stochastic Hamiltonian problems an extension of these ideas, which requires the simulation of an independent Wiener process at each stage of a Runge-Kutta method, leads to methods that have very favourable properties. These ideas are illustrated by some simple numerical tests for the modified midpoint rule.
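A hedged sketch of one plausible reading of a stochastic midpoint step for an additive-noise Hamiltonian problem, i.e. an implicit midpoint step with a fresh Wiener increment drawn for its single stage; the paper's exact stage-noise construction is not given in the abstract and is not reproduced here:

```python
# Hedged sketch: implicit midpoint step with additive noise, applied to a
# harmonic oscillator with H = (p^2 + q^2)/2. An illustration only.
import numpy as np

def f(y):
    q, p = y
    return np.array([p, -q])            # Hamiltonian vector field

def implicit_midpoint_step(y, h, sigma, rng, iters=50):
    dW = rng.normal(0.0, np.sqrt(h), size=2)     # Wiener increment (one stage)
    y_new = y.copy()
    for _ in range(iters):                       # fixed-point iteration
        y_mid = 0.5 * (y + y_new)
        y_new = y + h * f(y_mid) + sigma * dW
    return y_new

rng = np.random.default_rng(1)
y, h, sigma = np.array([1.0, 0.0]), 0.01, 0.1
for _ in range(1000):
    y = implicit_midpoint_step(y, h, sigma, rng)
print(y)
```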

Relevance: 20.00%

Abstract:

While governments are engaged in developing social policy responses to wicked issues such as poverty, homelessness, drug addiction and crime, long-term resolution of these issues through government policy making and state-based programmatic action has remained elusive. Vehicles for joint action and partnership between government and the community sector, such as co-management, have been offered as a way of harnessing the productive capability and innovative capacity of both sectors to resolve these complex problems. However, while the agenda of intent for collaboration and partnership is well advanced, the models for undertaking this joint action are not well understood and have not been fully developed or evaluated. This chapter examines new approaches to resolving the wicked issue of homelessness by applying the lens of co-management to understand the complexities of this issue and its resolution. The chapter analyses an attempt to move away from the traditional bureaucratic structures of welfare departments, operating through single functional 'silos', to a new horizontal 'hub-based' model of service delivery that seeks to integrate actors across many different service areas and organizations. The chapter explores case studies of co-management in the establishment, development and operation of service hubs to address homelessness. We argue that the response to homelessness needs a 'wicked solution' that goes beyond simply providing shelter to those in need. The case of hub models of community sector organizations working across organizational boundaries is evaluated to determine whether this approach can be considered successful co-management of an innovative initiative, and to understand the requirements for developing, improving and extending this model. The role of the third sector in co-managing public services is examined through the in-depth case studies, and the results are presented together with an assessment of how co-management can contribute to service quality and service management in public services.

Relevance: 20.00%

Abstract:

Prior in vitro studies, utilizing 31P nuclear magnetic resonance (31P NMR) to measure the chemical shift (δ) of β-ATP and the lengthening of the phosphocreatine spin-spin (T2) relaxation time, suggested an assessment of their efficacy in measuring magnesium depletion in vivo. Dietary magnesium depletion (Mg2+↓) produced markedly lower magnesium in plasma (0.44 vs 1.13 mmol/liter) and bone (130 vs 190 µmol/g) but much smaller changes in muscle (41 vs 45 µmol/g, P < 0.01), heart (42.5 vs 44.6 µmol/g), and brain (30 vs 32 µmol/g). NMR experiments on anesthetized rats in a Bruker 7-T vertical bore magnet showed that in Mg2+↓ rats there was a significant change in the brain β-ATP shift (16.15 vs 16.03 ppm, P < 0.05). These chemical shifts gave a calculated free [Mg2+] of 0.71 mM (control) and 0.48 mM (Mg2+↓). In muscle the change in β-ATP shift was not significant (Mg2+↓ 15.99 ppm, controls 15.96 ppm), corresponding to a calculated free [Mg2+] of 0.83 and 0.95 mM, respectively. Phosphocreatine T2 (Carr-Purcell spin-echo pulse sequence) was no different with Mg2+↓ in muscle in vivo (surface coil) (Mg2+↓ 136, control 142 ms) or in isolated perfused hearts (Helmholtz coil) (control 83, Mg2+↓ 92 ms). 31P NMR is severely limited in its ability to detect dietary magnesium depletion in vivo. Measurement of the β-ATP shift in brain may allow studies of the effects of interaction in group studies but does not allow prediction of an individual's magnesium status.
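The conversion from observed β-ATP shift to free [Mg2+] is not spelled out in the abstract; under the usual two-site fast-exchange assumption it takes the form below, where the limiting shifts and the MgATP dissociation constant $K_D$ are calibration values the paper would supply:

```latex
% Hedged sketch: standard fast-exchange relation, not necessarily the
% paper's exact calibration.
\phi = \frac{\delta_{\mathrm{obs}} - \delta_{\mathrm{ATP}}}
            {\delta_{\mathrm{MgATP}} - \delta_{\mathrm{ATP}}}, \qquad
[\mathrm{Mg}^{2+}]_{\mathrm{free}} = K_D \, \frac{\phi}{1 - \phi}
```

Here $\phi$ is the fraction of ATP complexed with Mg2+, and $\delta_{\mathrm{ATP}}$, $\delta_{\mathrm{MgATP}}$ are the limiting shifts of free and fully complexed ATP.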

Relevance: 20.00%

Abstract:

This paper describes a novel method for determining the extrinsic calibration parameters between 2D and 3D LIDAR sensors with respect to a vehicle base frame. To recover the calibration parameters, we attempt to optimize the quality of a 3D point cloud produced by the vehicle as it traverses an unknown, unmodified environment. The point cloud quality metric is derived from Rényi Quadratic Entropy and quantifies the compactness of the point distribution using only a single tuning parameter. We also present a fast approximate method to reduce the computational requirements of the entropy evaluation, allowing unsupervised calibration in vast environments with millions of points. The algorithm is analyzed using real-world data gathered in many locations, showing robust calibration performance and substantial speed improvements from the approximations.
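A minimal sketch of the exact (non-approximate) Rényi Quadratic Entropy of a Gaussian-kernel density estimate over a point cloud, the quantity a calibration search would minimise; the kernel form is the standard one for this metric, and the O(N²) pair sum is precisely what the paper's fast approximation avoids:

```python
# Hedged sketch: Renyi Quadratic Entropy of a Parzen density estimate with
# isotropic Gaussian kernels. Lower H2 means a more compact (crisper) point
# cloud, so a calibration search would minimise H2 over the extrinsics.
import numpy as np

def renyi_quadratic_entropy(points, sigma):
    """H2 = -log( (1/N^2) sum_ij G(x_i - x_j; 2*sigma^2 I) )."""
    n, d = points.shape
    sq = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    norm = (4.0 * np.pi * sigma**2) ** (d / 2.0)   # Gaussian of variance 2*sigma^2
    info_potential = np.exp(-sq / (4.0 * sigma**2)).sum() / (n * n * norm)
    return -np.log(info_potential)

pts = np.random.default_rng(0).normal(size=(200, 3))
print(renyi_quadratic_entropy(pts, sigma=0.1))     # sigma is the single tuning parameter
```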