982 results for Finite element method
Abstract:
When a new form is inserted in an existing townscape, its consonance within the urban fabric depends on the attention paid to the evaluation and management of its architectural elements. However, despite the established principles and methods of urban morphology that enable systematic analysis of the built environment, a formula for ensuring that new development relates to its context so as to achieve congruent outcomes is still lacking. This paper proposes a new method of evaluating and measuring architectural elements within evolving urban forms, with particular emphasis on a three-dimensional study of buildings. In a case study, detailed mapping of both current and past forms provides the basis for identifying predominant characteristics that have changed over time. Using this method, it is possible to demonstrate objectively how the townscape has been affected by changes in its architectural configuration.
Abstract:
Background Context There are differences in definitions of end plate lesions (EPLs), often referred to as Schmorl’s nodes, that may, to some extent, account for the large range of reported prevalence (3.8% to 76%). Purpose To develop a technique to measure the size, prevalence and location of EPLs in a consistent manner. Study Design/Setting This study proposed a method using a detection algorithm, which was applied to five adolescent females (average age 15.1 years, range 13.0 to 19.2 years) with idiopathic scoliosis (average major Cobb angle 60°, range 55 to 67°). Methods Existing low-dose computed tomography scans were segmented semi-automatically to extract the 3D morphology of each vertebral endplate. Any remaining attachments to the posterior elements of adjacent vertebrae or endplates were then manually sectioned. An automatic algorithm was used to determine the presence and position of EPLs. Results EPLs were identified in 15 of the 170 (8.8%) endplates analysed, with an average depth of 3.1 mm. Eleven of the 15 EPLs were seen in the lumbar spine. The algorithm was found to be most sensitive to changes in the minimum EPL gradient at the edges of the EPL. Conclusions This study describes an imaging analysis technique for consistent measurement of the prevalence, location and size of EPLs. The technique can be used to analyse large populations without observer errors in EPL definitions.
Abstract:
INTRODUCTION There is a large range in the reported prevalence of end plate lesions (EPLs), sometimes referred to as Schmorl's nodes, in the general population (3.8-76%). One possible reason for this large range is the differences in the definitions used by authors. Previous research has suggested that EPLs may be a primary disturbance of growth plates that leads to the onset of scoliosis. The aim of this study was to develop a technique to measure the size, prevalence and location of EPLs on computed tomography (CT) images of scoliosis patients in a consistent manner. METHODS A detection algorithm was developed and applied to measure EPLs for five adolescent females with idiopathic scoliosis (average age 15.1 years, average major Cobb angle 60°). In this algorithm, the EPL definition was based on the lesion depth, the distance from the edge of the vertebral body and the gradient of the lesion edge. Existing low-dose CT scans of the patients' spines were segmented semi-automatically to extract 3D vertebral endplate morphology. Manual sectioning of any attachments between posterior elements of adjacent vertebrae and, if necessary, endplates was carried out before the automatic algorithm was used to determine the presence and position of EPLs. RESULTS EPLs were identified in 15 of the 170 (8.8%) endplates analysed, with an average depth of 3.1 mm. 73% of the EPLs (11/15) were seen in the lumbar spine. A sensitivity study demonstrated that the algorithm was most sensitive to changes in the minimum gradient required at the lesion edge. CONCLUSION An imaging analysis technique for consistent measurement of the prevalence, location and size of EPLs on CT images has been developed. Although the technique was tested on scoliosis patients, it can be used to analyse other populations without observer errors in EPL definitions.
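The algorithm above is specified by three criteria: lesion depth, distance from the vertebral body edge, and gradient at the lesion edge. A minimal sketch of how such a threshold-based detector might be organised on a gridded endplate height map follows; the height-map representation, the threshold values and the function names are illustrative assumptions, not the study's implementation.

```python
import numpy as np
from scipy import ndimage

def detect_epl(heights, spacing_mm=1.0, min_depth_mm=1.0,
               min_edge_dist_mm=5.0, min_edge_gradient=0.5):
    """Flag endplate lesion (EPL) candidates on a gridded endplate height map.

    heights: 2D array of endplate surface heights in mm, NaN outside the
    endplate. The three criteria mirror the abstract: lesion depth, distance
    from the vertebral body edge, and gradient at the lesion edge. All
    threshold values are illustrative placeholders.
    """
    valid = ~np.isnan(heights)
    baseline = np.nanmedian(heights)            # reference endplate level
    depth = baseline - heights                  # depressions are positive

    # Distance (mm) of every grid point from the endplate boundary.
    edge_dist = ndimage.distance_transform_edt(valid) * spacing_mm

    # Magnitude of the surface gradient (mm of height per mm in-plane).
    gy, gx = np.gradient(np.where(valid, heights, baseline), spacing_mm)
    grad = np.hypot(gx, gy)

    # Points that are deep enough and far enough from the vertebral body edge.
    deep = valid & (depth > min_depth_mm) & (edge_dist > min_edge_dist_mm)

    # Keep connected regions whose rim is steep enough to count as a lesion.
    labels, n = ndimage.label(deep)
    lesions = []
    for i in range(1, n + 1):
        region = labels == i
        rim = ndimage.binary_dilation(region) & ~region
        if grad[rim].size and grad[rim].max() >= min_edge_gradient:
            lesions.append({"max_depth_mm": float(depth[region].max()),
                            "area_px": int(region.sum())})
    return lesions
```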
Abstract:
A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those elements with large throughput rates as well as greater connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions and under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains, based on whether there is a fairly even spread in the individual scores of the top few key elements or a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to four additional real-world Australian commercial fishery supply chains and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical.
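The abstract does not give the exact functional form of the SCI, so the sketch below is only one plausible reading: each element is scored by its throughput weighted by its connectivity, the chain score is the sum, and per-element shares are standardised for cross-comparison. The function name and the toy lobster-chain data are assumptions.

```python
# Sketch of one plausible reading of the SCI: each supply-chain element is
# scored by throughput weighted by connectivity (number of links); the chain
# score is the sum of element scores.

def sci_scores(throughput, links):
    """throughput: {element: units per period}; links: {element: neighbours}."""
    raw = {e: throughput[e] * len(links.get(e, ())) for e in throughput}
    chain_score = sum(raw.values())
    # Standardise so chains of different size and volume can be compared.
    shares = {e: s / chain_score for e, s in raw.items()} if chain_score else raw
    return shares, chain_score

# Toy lobster-chain example (all numbers invented):
throughput = {"fishers": 100, "processors": 95, "airports": 90, "consumers_CN": 80}
links = {"fishers": {"processors"},
         "processors": {"fishers", "airports"},
         "airports": {"processors", "consumers_CN"},
         "consumers_CN": {"airports"}}
shares, total = sci_scores(throughput, links)
print(sorted(shares.items(), key=lambda kv: -kv[1]))  # key elements first
```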
Abstract:
A wet-milling protocol was employed to produce pressed powder tablets with excellent cohesion and homogeneity suitable for laser ablation (LA) analysis of volatile and refractory elements in sediment. The influence of sample preparation on analytical performance was also investigated, including sample homogeneity, accuracy and limit of detection. Milling in a volatile solvent for 40 min ensured the sample was well mixed and gave reasonable recoveries of both a volatile element (Hg) and a refractory one (Zr). With the exception of Cr (−52%) and Nb (+26%), major, minor and trace elements in STSD-1 and MESS-3 could be analysed within ±20% of the certified values. The method was compared with a total-digestion method using HF by analysing 10 different sediment samples. The laser method recovers significantly higher amounts of analytes such as Ag, Cd and Sn than the total-digestion method, making it a more robust method for elements across the periodic table. LA-ICP-MS also eliminates interferences from chemical reagents, as well as the health and safety risks associated with digestion processes. It can therefore be considered an enhanced method for the analysis of heterogeneous matrices such as river sediments.
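The ±20% accuracy criterion reduces to simple arithmetic against certified reference values; a small sketch of that check is given below. The concentration values shown are placeholders, not the actual STSD-1 or MESS-3 data.

```python
# Minimal sketch of the accuracy check implied above: measured concentrations
# are compared with certified reference values and flagged when the relative
# deviation exceeds +/-20%. All values are hypothetical placeholders.

certified = {"Cr": 67.0, "Nb": 12.0, "Zn": 178.0}   # mg/kg, hypothetical
measured  = {"Cr": 32.2, "Nb": 15.1, "Zn": 171.0}

for element, ref in certified.items():
    dev = 100.0 * (measured[element] - ref) / ref
    flag = "" if abs(dev) <= 20 else "  <-- outside +/-20%"
    print(f"{element}: {dev:+.0f}%{flag}")
```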
Abstract:
Quantifying the stiffness properties of soft tissues is essential for the diagnosis of many cardiovascular diseases such as atherosclerosis. In these pathologies it is widely agreed that arterial wall stiffness is an indicator of vulnerability. The present paper focuses on the carotid artery and proposes a new inversion methodology for deriving the stiffness properties of the wall from cine-MRI (magnetic resonance imaging) data. We address this problem by setting up a cost function defined as the distance between the modeled pixel signals and the measured ones. Minimizing this cost function yields the unknown stiffness properties of both the arterial wall and the surrounding tissues. The sensitivity of the identified properties to various sources of uncertainty is studied. Validation of the method is performed on a rubber phantom; the elastic modulus identified using the developed methodology lies within a mean error of 9.6%. The method is then applied to two young healthy subjects as a proof of practical feasibility, with identified values of 625 kPa and 587 kPa for one carotid artery of each subject.
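As a rough sketch of the inversion idea, assuming a generic stand-in forward model in place of the paper's arterial wall model, the cost function can be minimised with an off-the-shelf optimiser:

```python
import numpy as np
from scipy.optimize import minimize

def forward_model(params, loads):
    """Hypothetical placeholder forward model: maps candidate stiffness
    parameters (wall, surroundings) to predicted signals."""
    e_wall, e_surround = params
    return loads / e_wall + loads / e_surround

def cost(params, loads, measured):
    # Squared distance between modeled and measured signals, as in the paper.
    return np.sum((forward_model(params, loads) - measured) ** 2)

loads = np.linspace(1.0, 10.0, 20)                # synthetic loading states
true_params = np.array([625.0, 50.0])             # kPa, for this demo only
measured = forward_model(true_params, loads)
measured += np.random.default_rng(0).normal(0, 1e-4, measured.shape)

result = minimize(cost, x0=[300.0, 30.0], args=(loads, measured),
                  method="Nelder-Mead")
print(result.x)  # recovered stiffness parameters
```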
Abstract:
A finite element analysis of laminated shells reinforced with laminated stiffeners is described in this paper. A rectangular laminated anisotropic shallow thin shell finite element with 48 d.o.f. is used in conjunction with a laminated anisotropic curved beam and shell stiffening finite element having 16 d.o.f. Compatibility between the shell and the stiffener is maintained all along their junction line. Some problems of symmetrically stiffened isotropic plates and shells have been solved to evaluate the performance of the present method. The behaviour of an eccentrically stiffened laminated cantilever cylindrical shell has been predicted to show the ability of the present program. General shells amenable to rectangular meshes can also be solved in a similar manner.
Abstract:
In this paper we present a novel application of scenario methods to engage a diverse constituency of senior stakeholders, with limited time availability, in debate to inform planning and policy development. Our case study project explores post-carbon futures for the Latrobe Valley region of the Australian state of Victoria. Our approach involved initial deductive development of two ‘extreme scenarios’ by a multi-disciplinary research team, based upon an extensive research programme. Over four workshops with the stakeholder constituency, these initial scenarios were discussed, challenged, refined and expanded through an inductive process, whereby participants took ‘ownership’ of a final set of three scenarios that were both comfortable and challenging to them. The outcomes of this process subsequently informed public policy development for the region. Whilst this process did not follow a single extant structured, multi-stage scenario approach, neither was it devoid of form. Here, we seek to theorise and codify elements of our process, which we term ‘scenario improvisation’, so that others may adopt it.
Abstract:
Predatory insects and spiders are key elements of integrated pest management (IPM) programmes in agricultural crops such as cotton. Management decisions in IPM programmes should be based on a reliable and efficient method for counting both predators and pests. Knowledge of the temporal constraints that influence sampling is required because arthropod abundance estimates are likely to vary over a growing season and within a day. Few studies have adequately quantified this effect for the beat sheet, a potentially important sampling method. We compared the commonly used suction and visual sampling methods with the beat sheet, with reference to an absolute cage clamp method, for determining the abundance of various arthropod taxa over 5 weeks. Significantly more entomophagous arthropods were recorded using the beat sheet and cage clamp methods than using suction or visual sampling, and these differences became more pronounced as the plants grew. In a second trial, relative estimates of entomophagous and phytophagous arthropod abundance were made using beat sheet samples collected over a day. Beat sheet estimates of abundance varied significantly over a day for only eight of the 43 taxa examined. Beat sheet sampling is recommended for further studies of arthropod abundance in cotton, but researchers and pest management advisors should bear in mind the time-of-season and time-of-day effects.
Abstract:
Near the boundaries of shells, thin shell theories cannot always provide a satisfactory description of the kinematic situation. This imposes severe limitations on simulating the boundary conditions in theoretical shell models. Here an attempt is made to overcome this limitation: the three-dimensional theory of elasticity is used near the boundaries, while thin shell theory covers the major part of the shell away from the boundaries. The two regions are connected by means of an “interphase element.” This method is used to study typical static stress and natural vibration problems.
Abstract:
This work deals with the formulation and implementation of finite deformation viscoplasticity within the framework of stress-based hybrid finite element methods. Hybrid elements, which are based on a two-field variational formulation, are much less susceptible to locking than conventional displacement-based elements. The conventional return-mapping scheme cannot be used in the context of hybrid stress methods since the stress is known, and the strain and the internal plastic variables have to be recovered from this known stress field. We discuss the formulation and implementation of the consistent tangent tensor and the return-mapping algorithm within the context of the hybrid method, and demonstrate the efficacy of the algorithm on a wide range of problems.
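As a toy illustration of the stress-driven update described above, the following sketch solves a one-dimensional, small-strain plasticity problem with linear isotropic hardening, recovering the strain and internal variables from a known stress. The paper's finite-deformation viscoplastic setting is considerably more general, and all material parameters here are hypothetical.

```python
def stress_driven_update(sigma, eps_p_n, alpha_n, E=200e3, sigma_y=250.0, H=10e3):
    """Return (total strain, plastic strain, hardening variable) at step n+1.

    sigma: known stress at n+1; eps_p_n, alpha_n: plastic state at step n.
    Units: MPa for stresses and moduli (hypothetical values).
    """
    f_trial = abs(sigma) - (sigma_y + H * alpha_n)   # yield function at old state
    if f_trial <= 0.0:                               # elastic: state unchanged
        eps_p, alpha = eps_p_n, alpha_n
    else:                                            # plastic: enforce f = 0
        alpha = (abs(sigma) - sigma_y) / H           # consistency condition
        d_gamma = alpha - alpha_n                    # plastic multiplier
        eps_p = eps_p_n + d_gamma * (1.0 if sigma >= 0 else -1.0)
    eps = sigma / E + eps_p                          # additive split (small strain)
    return eps, eps_p, alpha

print(stress_driven_update(sigma=300.0, eps_p_n=0.0, alpha_n=0.0))
```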
Abstract:
The safety of food has become an issue of increasing interest to consumers and the media. It has also become a source of concern, as the amount of information on the risks related to food safety continues to expand. Today, risk and safety are permanent elements within the concept of food quality. Safety, in particular, is an attribute that consumers find very difficult to assess. The literature reviewed in this study covers three main themes: traceability; consumer behaviour related to quality, safety and risk perception; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as of attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kind of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying consumers' willingness to pay makes it possible to determine whether consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. Respondents who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results also showed that certain risk factors affect consumer willingness to pay: respondents who considered genetic modification of food or foodborne zoonotic diseases to be harmful or extremely harmful risk factors were more likely to be willing to pay for quality information. The models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information. Safety-related quality cues were also significant to consumers. Above all, consumers would like to receive information on the control of zoonotic diseases that are transmissible to humans; other process-control-related information also ranked high among the top responses. Information on any potential genetic modification was considered important as well, even though genetic modification was not regarded as a high risk factor.
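A common way to implement a contingent valuation analysis of this kind is a logit model of the yes/no willingness-to-pay response on the offered bid and risk-perception indicators. The sketch below follows that generic pattern on synthetic data; it is not the study's actual model or dataset.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic contingent-valuation data: a binary "would pay" response driven by
# the offered price increase (bid) and a risk-perception dummy.
rng = np.random.default_rng(1)
n = 500
bid = rng.uniform(0, 50, n)                  # offered price increase, %
risk = rng.integers(0, 2, n)                 # perceives GM/zoonoses as harmful
latent = 2.0 - 0.08 * bid + 0.9 * risk + rng.logistic(0, 1, n)
yes = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([bid, risk]))
fit = sm.Logit(yes, X).fit(disp=0)
b0, b_bid, b_risk = fit.params

# Mean WTP for a logit linear in the bid: -(intercept + other effects) / bid coef.
print("mean WTP (risk-averse):", -(b0 + b_risk) / b_bid)
print("mean WTP (others):     ", -b0 / b_bid)
```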
Abstract:
A direct method of preparing cast aluminium alloy-graphite particle composites using uncoated graphite particles is reported. The method consists of introducing and dispersing uncoated but suitably pretreated graphite particles in aluminium alloy melts, and casting the resulting composite melts in suitable permanent moulds. The optimal pretreatment required for the dispersion of the uncoated graphite particles in aluminium alloy melts consists of heating the graphite particles to 400°C in air for 1 h just prior to their dispersion in the melts. The effects of alloying elements such as Si, Cu and Mg on the dispersability of pretreated graphite in molten aluminium are also reported. It was found that additions of about 0.5% Mg or 5% Si significantly improve the dispersability of graphite particles in aluminium alloy melts, as indicated by the high recoveries of graphite in the castings of these composites. It was also possible to disperse up to 3% graphite in LM 13 alloy melts and retain the graphite particles in a well-distributed fashion in the castings using the pre-heat-treated graphite particles. The observations in this study have been related to the information presently available on wetting between graphite and molten aluminium in the presence of different elements, and to our own thermogravimetric analysis studies on graphite particles. The physical and mechanical properties of the LM 13-3% graphite composite made using pre-heat-treated graphite powder were found to be adequate for many applications, including pistons, which have been successfully used in internal combustion engines.
Abstract:
This thesis presents methods for locating and analyzing cis-regulatory DNA elements involved in the regulation of gene expression in multicellular organisms. The regulation of gene expression is carried out by the combined effort of several transcription factor proteins collectively binding the DNA at the cis-regulatory elements. Only sparse knowledge of the 'genetic code' of these elements exists today. An automatic tool for the discovery of putative cis-regulatory elements could help their experimental analysis, which would result in a more detailed view of cis-regulatory element structure and function. We have developed a computational model for the evolutionary conservation of cis-regulatory elements. The elements are modeled as evolutionarily conserved clusters of sequence-specific transcription factor binding sites. We give an efficient dynamic programming algorithm that locates the putative cis-regulatory elements and scores them according to the conservation model. A notable proportion of the high-scoring DNA sequences show transcriptional enhancer activity in transgenic mouse embryos. The conservation model includes four parameters whose optimal values are estimated with simulated annealing. With good parameter values the model discriminates well between DNA sequences with evolutionarily conserved cis-regulatory elements and DNA sequences that have evolved neutrally. On further inquiry, the set of highest-scoring putative cis-regulatory elements was found to be sensitive to small variations in the parameter values. The statistical significance of the putative cis-regulatory elements is estimated with the Two Component Extreme Value Distribution: the p-values grade the conservation of the cis-regulatory elements above the neutral expectation, and the parameter values for the distribution are estimated by simulating neutral DNA evolution. The conservation of the transcription factor binding sites can be used in the upstream analysis of regulatory interactions. This approach may provide mechanistic insight into transcription-level data from, e.g., microarray experiments. Here we give a method to predict shared transcriptional regulators for a set of co-expressed genes. The EEL (Enhancer Element Locator) software implements the method for locating putative cis-regulatory elements; it facilitates both interactive use and distributed batch processing. We have used it to analyze the non-coding regions around all human genes with respect to the orthologous regions in various other species, including mouse. The data from these genome-wide analyses are stored in a relational database which is used in the publicly available web services for upstream analysis and visualization of the putative cis-regulatory elements in the human genome.
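As a rough illustration of the dynamic-programming idea, assuming a toy scoring scheme in place of EEL's conservation model, conserved clusters can be found by chaining binding-site pairs that occur for the same factor, in the same order, in both species:

```python
# Simplified chaining DP: pairs of predicted sites for the same transcription
# factor accumulate score when colinear in both species, with a linear
# penalty on the gap between consecutive pairs. Toy stand-in for EEL's model.

def chain_sites(sites_a, sites_b, gap_penalty=0.01):
    """sites_*: list of (factor, position, score), sorted by position."""
    pairs = [(sa, sb, sa[2] + sb[2])
             for sa in sites_a for sb in sites_b if sa[0] == sb[0]]
    pairs.sort(key=lambda p: (p[0][1], p[1][1]))
    best = [p[2] for p in pairs]          # best chain score ending at each pair
    for j in range(len(pairs)):
        for i in range(j):
            (ai, bi, _), (aj, bj, sj) = pairs[i], pairs[j]
            if ai[1] < aj[1] and bi[1] < bj[1]:      # colinear in both species
                gap = (aj[1] - ai[1]) + (bj[1] - bi[1])
                best[j] = max(best[j], best[i] + sj - gap_penalty * gap)
    return max(best, default=0.0)

human = [("SOX9", 100, 2.1), ("TCF4", 180, 1.5), ("SOX9", 400, 1.9)]
mouse = [("SOX9", 90, 2.0), ("TCF4", 175, 1.7)]
print(chain_sites(human, mouse))
```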
Abstract:
We propose a new type of high-order element that incorporates mesh-free Galerkin formulations into the framework of the finite element method. Traditional polynomial interpolation is replaced by mesh-free interpolation in the present high-order elements, and the strain smoothing technique is used for integration of the governing equations based on smoothing cells. The properties of the high-order elements, which are influenced by the basis function of the mesh-free interpolation and the boundary nodes, are discussed through numerical examples. The basis function is found to have a significant influence on the computational accuracy and on the upper and lower bounds of the energy norm when the strain smoothing technique retains the softening phenomenon. The new elements show good performance when quadratic basis functions are used in the mesh-free interpolation, and they prove advantageous in adaptive mesh and node refinement schemes. Furthermore, they are less sensitive to element quality because they use mesh-free interpolation and obey the Weakened Weak (W2) formulation introduced in [3, 5].
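As a brief illustration of the strain smoothing step, assuming a polygonal smoothing cell with one displacement sample per edge midpoint, the smoothed strain is the area average of the compatible strain, evaluated as a boundary integral via the divergence theorem; shape-function details of the mesh-free interpolation are omitted.

```python
import numpy as np

def smoothed_strain(vertices, u_mid):
    """Smoothed strain [eps_xx, eps_yy, gamma_xy] over one smoothing cell.

    vertices: (n, 2) polygon corners, counter-clockwise.
    u_mid: (n, 2) displacement at each edge midpoint (edge i runs from
    vertex i to vertex i+1). Midpoint rule is used on each edge.
    """
    v = np.asarray(vertices, float)
    # Shoelace formula for the cell area.
    area = 0.5 * abs(np.sum(v[:, 0] * np.roll(v[:, 1], -1)
                            - np.roll(v[:, 0], -1) * v[:, 1]))
    eps = np.zeros((2, 2))
    for i in range(len(v)):
        a, b = v[i], v[(i + 1) % len(v)]
        edge = b - a
        length = np.hypot(*edge)
        normal = np.array([edge[1], -edge[0]]) / length  # outward for CCW order
        eps += np.outer(u_mid[i], normal) * length       # integral of u (x) n
    eps = 0.5 * (eps + eps.T) / area                     # symmetrise and average
    return np.array([eps[0, 0], eps[1, 1], 2 * eps[0, 1]])

# Uniform eps_xx = 0.01 field (u_x = 0.01 x) on a unit square cell:
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
u = [(0.005, 0), (0.01, 0), (0.005, 0), (0.0, 0)]
print(smoothed_strain(square, u))  # -> [0.01, 0.0, 0.0]
```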