569 results for Scotchbond Multi-Purpose Plus


Relevance: 20.00%

Publisher:

Abstract:

Structural identification (St-Id) can be considered the process of updating a finite element (FE) model of a structural system to match the measured response of the structure. This paper presents the St-Id of a laboratory-based steel through-truss cantilevered bridge with a suspended span. There are a total of 600 degrees of freedom (DOFs) in the superstructure plus additional DOFs in the substructure. The St-Id of the bridge model used the modal parameters from a preliminary modal test in the objective function of a global optimisation technique: a layered genetic algorithm with a pattern search step (GAPS). Each layer of the St-Id process involved grouping the structural parameters into a number of updating parameters and running parallel optimisations, with the number of updating parameters increased at each layer. To accelerate the optimisation and improve diversity within the population, a pattern search step was applied to the fittest individuals at the end of each generation of the GA. The GAPS process replicated the mode shapes of the first two lateral sway modes and the first vertical bending mode to a high degree of accuracy and, to a lesser degree, the mode shape of the first lateral bending mode. The mode shape and frequency of the torsional mode did not match well. The frequencies of the first lateral bending mode, the first longitudinal mode and the first vertical mode matched very well. The frequency of the first sway mode was lower, and that of the second sway mode higher, than the true values, indicating a possible problem with the FE model. Improvements to the model and the St-Id process will be presented at the upcoming conference and compared with the results presented in this paper. These improvements will include the use of multiple FE models in a multi-layered, multi-solution GAPS St-Id approach.
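The core GAPS loop — a genetic algorithm whose fittest individuals are polished by a pattern search at the end of each generation — can be sketched on a toy objective. Everything below (the quadratic stand-in for the modal-parameter residual, the population sizes, step lengths and operators) is illustrative, not the authors' implementation.

```python
import random

def objective(x):
    # Toy stand-in for the modal-residual objective: squared distance of the
    # candidate updating parameters from an assumed "true" parameter set.
    true = [1.0, 2.0, 3.0]
    return sum((xi - ti) ** 2 for xi, ti in zip(x, true))

def pattern_search(x, f, step=0.1, shrink=0.5, iters=20):
    """Greedy coordinate pattern search used to polish one individual."""
    x, fx = list(x), f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink  # no move accepted: refine the search mesh
    return x, fx

def gaps(f, dim=3, pop_size=30, gens=40, elite=3, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        # The "PS" in GAPS: pattern search applied to the fittest individuals
        # at the end of each generation.
        for i in range(elite):
            pop[i], _ = pattern_search(pop[i], f)
        # Refill the rest of the population by crossover + Gaussian mutation.
        children = []
        while len(children) < pop_size - elite:
            a, b = rng.sample(pop[: pop_size // 2], 2)
            children.append([(ai + bi) / 2 + rng.gauss(0, 0.1)
                             for ai, bi in zip(a, b)])
        pop = pop[:elite] + children
    return min(pop, key=f)

best = gaps(objective)
```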


Purpose: Director selection is an important yet under-researched topic. The purpose of this paper is to contribute to the extant literature by gaining a greater understanding of how and why new board members are recruited.

Design/methodology/approach: This exploratory study uses in-depth interviews with Australian non-executive directors to identify which selection criteria are deemed most important when selecting new director candidates and how selection practices vary between organisations.

Findings: The findings indicate that appointments to the board are based on two key attributes: first, the candidate's ability to contribute complementary skills and, second, the candidate's ability to work well with the existing board. Despite commonality in these broad criteria, board selection approaches vary considerably between organisations. As a result, some boards do not adequately assess both criteria when appointing a new director, increasing the chance of a mis-fit between the position and the appointed director.

Research limitations/implications: The study highlights the importance of both individual technical capabilities and social compatibility in director selection. The authors introduce a new perspective through which future research may consider director selection: fit.

Originality/value: The in-depth analysis of the director selection process highlights some less obvious and more nuanced issues surrounding directors' appointment to the board. Recurrent patterns indicate the need for both technical and social considerations. The study is thus a first step in synthesising the current literature and illustrates the need for a multi-theoretical approach in future director selection research.


Despite substantial progress in measuring the 3D profile of anatomical variations in the human brain, their genetic and environmental causes remain enigmatic. We developed an automated system to identify and map genetic and environmental effects on brain structure in large brain MRI databases. We applied our multi-template segmentation approach ("Multi-Atlas Fluid Image Alignment") to fluidly propagate hand-labeled parameterized surface meshes into 116 scans of twins (60 identical, 56 fraternal), labeling the lateral ventricles. Mesh surfaces were averaged within subjects to minimize segmentation error. We fitted quantitative genetic models at each of 30,000 surface points to measure the proportion of shape variance attributable to (1) genetic differences among subjects, (2) environmental influences unique to each individual, and (3) shared environmental effects. Surface-based statistical maps revealed 3D heritability patterns, and their significance, with and without adjustments for global brain scale. These maps visualized detailed profiles of environmental versus genetic influences on the brain, extending genetic models to spatially detailed, automatically computed 3D maps.
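The per-point variance decomposition can be illustrated with Falconer's classical approximation, which recovers the three components from twin-pair correlations: h² = 2(r_MZ − r_DZ), c² = 2r_DZ − r_MZ, e² = 1 − r_MZ. The paper fits full quantitative genetic models; the closed-form estimator and the synthetic twin data below are only a sketch of the idea, with invented variance shares.

```python
import random

def correlation(pairs):
    """Pearson correlation of (twin1, twin2) trait pairs."""
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    cov = sum((a - mx) * (b - my) for a, b in pairs) / n
    vx = sum((a - mx) ** 2 for a, _ in pairs) / n
    vy = sum((b - my) ** 2 for _, b in pairs) / n
    return cov / (vx * vy) ** 0.5

def simulate_twins(n_pairs, h2, c2, genetic_corr, rng):
    """Trait = A + C + E; genetic_corr is 1.0 for identical twins, 0.5 for fraternal."""
    e2 = 1.0 - h2 - c2
    pairs = []
    for _ in range(n_pairs):
        a_shared, c = rng.gauss(0, 1), rng.gauss(0, 1)
        twins = []
        for _ in range(2):
            a = (genetic_corr ** 0.5 * a_shared
                 + (1 - genetic_corr) ** 0.5 * rng.gauss(0, 1))
            twins.append(h2 ** 0.5 * a + c2 ** 0.5 * c + e2 ** 0.5 * rng.gauss(0, 1))
        pairs.append(tuple(twins))
    return pairs

def falconer_ace(r_mz, r_dz):
    """Falconer estimates of (genetic, shared-env, unique-env) variance shares."""
    return 2 * (r_mz - r_dz), 2 * r_dz - r_mz, 1 - r_mz

rng = random.Random(42)
mz = simulate_twins(20000, h2=0.6, c2=0.2, genetic_corr=1.0, rng=rng)
dz = simulate_twins(20000, h2=0.6, c2=0.2, genetic_corr=0.5, rng=rng)
h2_hat, c2_hat, e2_hat = falconer_ace(correlation(mz), correlation(dz))
```

In the surface-mapping setting, this decomposition would be repeated independently at each of the 30,000 surface points.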


We developed and validated a new method to create automated 3D parametric surface models of the lateral ventricles in brain MRI scans, providing an efficient approach to monitor degenerative disease in clinical studies and drug trials. First, we used a set of parameterized surfaces to represent the ventricles in four subjects' manually labeled brain MRI scans (atlases). We fluidly registered each atlas and mesh model to MRIs from 17 Alzheimer's disease (AD) patients and 13 age- and gender-matched healthy elderly control subjects, and 18 asymptomatic ApoE4-carriers and 18 age- and gender-matched non-carriers. We examined genotyped healthy subjects with the goal of detecting subtle effects of a gene that confers heightened risk for Alzheimer's disease. We averaged the meshes extracted for each 3D MR data set, and combined the automated segmentations with a radial mapping approach to localize ventricular shape differences in patients. Validation experiments comparing automated and expert manual segmentations showed that (1) the Hausdorff labeling error rapidly decreased, and (2) the power to detect disease- and gene-related alterations improved, as the number of atlases, N, was increased from 1 to 9. In surface-based statistical maps, we detected more widespread and intense anatomical deficits as we increased the number of atlases. We formulated a statistical stopping criterion to determine the optimal number of atlases to use. Healthy ApoE4-carriers and those with AD showed local ventricular abnormalities. This high-throughput method for morphometric studies further motivates the combination of genetic and neuroimaging strategies in predicting AD progression and treatment response.
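The benefit of increasing the number of atlases can be sketched by the averaging step alone: fusing meshes propagated from N atlases lets independent registration errors partially cancel. The synthetic "true" surface and the Gaussian noise model below are illustrative stand-ins for the fluidly registered, hand-labeled atlas meshes.

```python
import math
import random

def fuse_meshes(meshes):
    """Average corresponding vertices across independently propagated meshes."""
    n = len(meshes)
    return [tuple(sum(mesh[v][d] for mesh in meshes) / n for d in range(3))
            for v in range(len(meshes[0]))]

def mean_vertex_error(mesh, truth):
    """Mean Euclidean distance between corresponding vertices."""
    return sum(math.dist(p, q) for p, q in zip(mesh, truth)) / len(mesh)

rng = random.Random(7)
# Synthetic "true" surface: 500 points on a unit sphere.
truth = []
for _ in range(500):
    x, y, z = (rng.gauss(0, 1) for _ in range(3))
    r = (x * x + y * y + z * z) ** 0.5
    truth.append((x / r, y / r, z / r))

def noisy_copy(sigma=0.5):
    # One atlas propagation = truth corrupted by independent registration error.
    return [tuple(c + rng.gauss(0, sigma) for c in p) for p in truth]

err_1 = mean_vertex_error(noisy_copy(), truth)                       # N = 1
err_9 = mean_vertex_error(fuse_meshes([noisy_copy() for _ in range(9)]), truth)  # N = 9
```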


Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N = 2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses as well as a mixture of the two - combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
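The meta-analytic side of this comparison can be sketched with standard inverse-variance weighting: each site contributes only its estimate and standard error, and the pooled standard error is necessarily smaller than any single site's (mega-analysis would instead pool the raw FA data before fitting). The per-cohort numbers below are invented for illustration.

```python
def inverse_variance_meta(estimates, std_errors):
    """Inverse-variance weighted pooled estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    total = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, estimates)) / total
    return pooled, (1.0 / total) ** 0.5

# Hypothetical per-cohort FA heritability estimates and standard errors.
h2 = [0.55, 0.62, 0.48, 0.70, 0.58]
se = [0.08, 0.05, 0.10, 0.12, 0.06]
pooled_h2, pooled_se = inverse_variance_meta(h2, se)
```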


The ENIGMA (Enhancing NeuroImaging Genetics through Meta-Analysis) Consortium was set up to analyze brain measures and genotypes from multiple sites across the world to improve the power to detect genetic variants that influence the brain. Diffusion tensor imaging (DTI) yields quantitative measures sensitive to brain development and degeneration, and some common genetic variants may be associated with white matter integrity or connectivity. DTI measures, such as the fractional anisotropy (FA) of water diffusion, may be useful for identifying genetic variants that influence brain microstructure. However, genome-wide association studies (GWAS) require large populations to obtain sufficient power to detect and replicate significant effects, motivating a multi-site consortium effort. As part of an ENIGMA-DTI working group, we analyzed high-resolution FA images from multiple imaging sites across North America, Australia, and Europe, to address the challenge of harmonizing imaging data collected at multiple sites. Four hundred images of healthy adults aged 18-85 from four sites were used to create a template and corresponding skeletonized FA image as a common reference space. Using twin and pedigree samples of different ethnicities, we used our common template to evaluate the heritability of tract-derived FA measures. We show that our template is reliable for integrating multiple datasets by combining results through meta-analysis and unifying the data through exploratory mega-analyses. Our results may help prioritize regions of the FA map that are consistently influenced by additive genetic factors for future genetic discovery studies. Protocols and templates are publicly available at http://enigma.loni.ucla.edu/ongoing/dti-working-group/.


Combining datasets across independent studies can boost statistical power by increasing the number of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint analyses of the rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool SOLAR-Eclipse to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate across-population heritability estimates. We performed a leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time, to understand the variability of the estimates. Overall, meta- and mega-genetic analyses produced robust estimates of heritability.
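The leave-one-out check on the pooled estimate can be sketched with sample-size weighting (one of the two weighting schemes mentioned): recompute the joint estimate with each cohort dropped in turn and inspect the spread. The cohort values and sample sizes below are invented for illustration.

```python
def pooled_estimate(estimates, sizes):
    """Sample-size weighted pooled heritability estimate."""
    total = sum(sizes)
    return sum(n * e for n, e in zip(sizes, estimates)) / total

def leave_one_out(estimates, sizes):
    """Pooled estimate recomputed with each cohort removed in turn."""
    results = []
    for i in range(len(estimates)):
        rest_e = estimates[:i] + estimates[i + 1:]
        rest_n = sizes[:i] + sizes[i + 1:]
        results.append(pooled_estimate(rest_e, rest_n))
    return results

# Hypothetical per-cohort heritability estimates and sample sizes.
h2 = [0.55, 0.62, 0.48, 0.70, 0.58]
n = [500, 420, 380, 600, 348]
loo = leave_one_out(h2, n)
spread = max(loo) - min(loo)  # how much any single cohort moves the estimate
```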


Diffusion-weighted magnetic resonance (MR) imaging is a powerful tool for studying white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of 6 directions, second-order tensors can be computed to model dominant diffusion processes. However, conventional DTI is not sufficient to resolve crossing fiber tracts. Recently, a number of high-angular resolution schemes with more than 6 gradient directions have been employed to address this issue. In this paper, we introduce the Tensor Distribution Function (TDF), a probability function defined on the space of symmetric positive definite matrices. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF, and the TDF that best explains the observed diffusion signal is found by optimization. Once this optimal TDF is determined, the diffusion orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function.
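The forward model underlying the TDF — the measured signal as a weighted mixture of Gaussian diffusion processes, S(g)/S0 = sum_i w_i * exp(-b g^T D_i g) — can be sketched for a discrete two-tensor crossing. The tensor values and b-value are typical but illustrative; the paper optimizes a continuous distribution over the tensor manifold rather than a fixed two-component mixture.

```python
import math

def mixture_signal(weights, tensors, b, g):
    """S/S0 = sum_i w_i * exp(-b * g^T D_i g) for a discrete tensor mixture."""
    total = 0.0
    for w, D in zip(weights, tensors):
        quad = sum(g[r] * D[r][c] * g[c] for r in range(3) for c in range(3))
        total += w * math.exp(-b * quad)
    return total

# Two fiber populations crossing at 90 degrees (diagonal tensors, mm^2/s).
D_x = [[1.7e-3, 0, 0], [0, 0.2e-3, 0], [0, 0, 0.2e-3]]  # fiber along x
D_y = [[0.2e-3, 0, 0], [0, 1.7e-3, 0], [0, 0, 0.2e-3]]  # fiber along y
weights, b = [0.5, 0.5], 1000.0

# Strong diffusion along a fiber attenuates the signal; across both fibers
# (the z direction) diffusion is weak, so the signal stays high.
s_along_x = mixture_signal(weights, [D_x, D_y], b, (1.0, 0.0, 0.0))
s_along_z = mixture_signal(weights, [D_x, D_y], b, (0.0, 0.0, 1.0))
```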


High-angular resolution diffusion imaging (HARDI) can reconstruct fiber pathways in the brain with extraordinary detail, identifying anatomical features and connections not seen with conventional MRI. HARDI overcomes several limitations of standard diffusion tensor imaging, which fails to model diffusion correctly in regions where fibers cross or mix. As HARDI can accurately resolve sharp signal peaks in angular space where fibers cross, we studied how many gradients are required in practice to compute accurate orientation density functions, to better understand the tradeoff between longer scanning times and more angular precision. We computed orientation density functions analytically from tensor distribution functions (TDFs), which model the HARDI signal at each point as a unit-mass probability density on the 6D manifold of symmetric positive definite tensors. In simulated two-fiber systems with varying Rician noise, we assessed how many diffusion-sensitized gradients were sufficient to (1) accurately resolve the diffusion profile, and (2) measure the exponential isotropy (EI), a TDF-derived measure of fiber integrity that exploits the full multidirectional HARDI signal. At lower SNR, the reconstruction accuracy, measured using the Kullback-Leibler divergence, rapidly increased with additional gradients, and EI estimation accuracy plateaued at around 70 gradients.
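The reconstruction-accuracy measure can be illustrated with a Kullback-Leibler divergence between a reference ODF and a reconstruction, both discretized on a common set of directions. The toy two-peak profiles below are invented for illustration.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two ODFs sampled on the same
    direction set; both are renormalized to sum to 1 first."""
    sp, sq = sum(p), sum(q)
    p = [v / sp for v in p]
    q = [v / sq for v in q]
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Toy two-peak ODF (peaks where the two fibers lie) and two reconstructions.
reference = [0.4, 0.1, 0.4, 0.1]
close_fit = [0.35, 0.15, 0.35, 0.15]  # mildly blurred reconstruction
poor_fit = [0.25, 0.25, 0.25, 0.25]   # isotropic reconstruction (peaks lost)

d_close = kl_divergence(reference, close_fit)
d_poor = kl_divergence(reference, poor_fit)
```

A smaller divergence means a more faithful reconstruction, which is why accuracy improves as the divergence falls with added gradients.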


This paper presents data on residents' use of common stairways and lifts (vertical circulation spaces) in multi-storey apartment buildings (MSABs) in Brisbane, Australia. Vertical movement is a defining aspect of multi-storey living, and the energy consumed by lifts contributes significantly to the energy budget of the typical MSAB. The purpose is to investigate whether a reappraisal of vertical circulation design, through the lens of residents' requirements, might contribute to energy reductions in this building type. Data were gathered on a theoretical sample of MSABs ranging from five decades old to very recent schemes. Ninety residents were surveyed about their day-to-day experiences of circulation and access systems. The results showed that residents mainly chose to use the stairs for convenience and exercise. Building management regimes that limited residents' access to collective spaces were the main impediment to discretionary stair use. Only two buildings did not have fully enclosed stairwells, and these had the highest stair usage, suggesting that stair design and building governance are two areas worthy of attention. The more that circulation design is focussed on limiting access, the fewer opportunities there are for personal choice, incidental social interaction and casual surveillance of collective spaces. The more that the design of vertical circulation spaces in MSABs meets residents' needs, the less likely residents are to be reliant on continuous energy supply for normal functioning.


The Australian housing sector contributes about a fifth of national greenhouse gas (GHG) emissions. GHG emissions contribute to climate change, which leads to an increase in the occurrence or intensity of natural disasters and damage to houses. To ensure housing performance in the face of climate change, various rating tools for residential property have been introduced in different countries. The aim of this paper is to present a preliminary comparison between international and Australian rating tools in terms of purpose, use and sustainability elements for residential property. The methodology is to review, classify, compare and identify similarities and differences between rating tools. Two international tools, Building Research Establishment Environmental Assessment Methodology (BREEAM) (UK) and Leadership in Energy and Environmental Design for Homes (LEED-Homes) (USA), will be compared to two Australian tools, Green Star – Multi Unit Residential v1 and EnviroDevelopment. All four rating tools include management, energy, water and material aspects. The findings reveal thirteen elements that fall under three categories: spatial planning, occupants' health and comfort, and environmental conditions. The variations between tools may result from differences in local prevailing climate. Not all sustainability elements covered by the international rating tools are included in the Australian rating tools. The voluntary nature of the tools implies they are not broadly applied in their respective markets and that there is a policy implementation gap. A comprehensive rating tool could be developed in Australia to promote sustainable housing and lessen confusion about it, which would in turn assist in improving the supply of, and demand for, sustainable housing.


The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed in which it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, which is made possible by the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered to demonstrate the performance of total entropy in comparison to utilities for model discrimination and parameter estimation alone. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals, with little compromise in achieving either goal. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
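The Monte Carlo structure of utility estimation — simulate data from the prior predictive, score the resulting posterior, and average — can be sketched for the parameter-estimation component alone, using the prior-to-posterior information gain in a one-parameter conjugate model. The linear-Gaussian model and all numbers are illustrative; the paper estimates the total entropy utility (discrimination plus estimation) and uses sequential Monte Carlo particles where posteriors are intractable.

```python
import math
import random

def kl_gauss(mu1, v1, mu0, v0):
    """KL divergence between the Gaussians N(mu1, v1) and N(mu0, v0)."""
    return 0.5 * (v1 / v0 + (mu1 - mu0) ** 2 / v0 - 1.0 + math.log(v0 / v1))

def expected_info_gain(d, n_draws=4000, sigma2=1.0, var0=4.0, seed=1):
    """Monte Carlo estimate of the expected prior-to-posterior information
    gain for design d in the model y = theta * d + noise, theta ~ N(0, var0)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        theta = rng.gauss(0.0, var0 ** 0.5)            # draw from the prior
        y = theta * d + rng.gauss(0.0, sigma2 ** 0.5)  # prior-predictive draw
        v_post = 1.0 / (1.0 / var0 + d * d / sigma2)   # conjugate update
        mu_post = v_post * d * y / sigma2
        total += kl_gauss(mu_post, v_post, 0.0, var0)
    return total / n_draws

u_weak = expected_info_gain(0.5)    # weakly informative design
u_strong = expected_info_gain(2.0)  # strongly informative design
```

In a sequential setting this expectation would be re-estimated at each stage from the current particle approximation, and the design maximizing it chosen for the next observation.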


Oleaginous microorganisms have the potential to produce oils as an alternative feedstock for biodiesel production. Microalgae (Chlorella protothecoides and Chlorella zofingiensis), yeasts (Cryptococcus albidus and Rhodotorula mucilaginosa), and fungi (Aspergillus oryzae and Mucor plumbeus) were investigated for their ability to produce oil from glucose, xylose and glycerol. Multi-criteria analysis (MCA) using the analytic hierarchy process (AHP) and the preference ranking organization method for enrichment evaluations (PROMETHEE), with graphical analysis for interactive aid (GAIA), was used to rank and select the preferred microorganisms for oil production for biodiesel application. The ranking was based on a number of criteria: oil concentration, oil content, production rate and yield, substrate consumption rate, fatty acid composition, biomass harvesting cost and nutrient cost. PROMETHEE selected A. oryzae, M. plumbeus and R. mucilaginosa as the most prospective species for oil production. However, further analysis by GAIA webs identified A. oryzae and M. plumbeus as the best-performing microorganisms.
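The PROMETHEE ranking step can be sketched with the "usual" preference function (a weighted count of criteria on which one alternative strictly beats another) and net outranking flows. The species scores, the two criteria and the weights below are invented for illustration; the paper ranked six species on many more criteria.

```python
def promethee_net_flows(scores, weights, maximize):
    """Net outranking flows phi(a) = phi_plus(a) - phi_minus(a) using the
    'usual' preference function: P_j(a, b) = 1 if a strictly beats b on
    criterion j, else 0, aggregated with the criterion weights."""
    names = list(scores)
    n = len(names)

    def preference(a, b):
        total = 0.0
        for j, w in enumerate(weights):
            va, vb = scores[a][j], scores[b][j]
            if not maximize[j]:  # cost criterion: lower is better
                va, vb = vb, va
            if va > vb:
                total += w
        return total

    return {a: (sum(preference(a, b) for b in names if b != a)
                - sum(preference(b, a) for b in names if b != a)) / (n - 1)
            for a in names}

# Invented scores per species: (oil content %, nutrient cost index).
scores = {"A. oryzae": (40.0, 2.0),
          "M. plumbeus": (35.0, 3.0),
          "C. albidus": (20.0, 5.0)}
flows = promethee_net_flows(scores, weights=[0.6, 0.4], maximize=[True, False])
ranking = sorted(flows, key=flows.get, reverse=True)
```

Alternatives with higher net flow outrank more of their competitors than outrank them; GAIA then visualizes the same pairwise information geometrically.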