653 results for multi-theoretical


Relevance: 20.00%

Publisher:

Abstract:

Particle swarm optimization (PSO), a population-based algorithm, has recently been applied to multi-robot systems. Although the algorithm is used to solve many optimization problems as well as multi-robot tasks, it has drawbacks when applied to multi-robot search for a target in a space containing large static obstacles. One defect is premature convergence: a basic property of PSO is that particles spread across a search space tend, as time increases, to converge to a small area. This shortcoming is especially evident in a multi-robot search system when large static obstacles prevent the robots from finding the target easily; the robots converge to a small area that may not contain the target and become entrapped there. A second shortcoming is that basic PSO cannot guarantee global convergence: particles initially explore different areas, but in some cases they are not good at exploiting promising areas, which increases the search time. This study proposes a PSO-based method for a multi-robot system searching for a target in a space containing large static obstacles. The method not only overcomes the premature convergence problem but also establishes an efficient balance between exploration and exploitation and guarantees global convergence, reducing the search time by combining PSO with a local search method such as A-star. To validate the effectiveness and usefulness of the algorithms, a simulation environment was developed for conducting simulation-based experiments in different scenarios and reporting experimental results. These results demonstrate that the proposed method overcomes premature convergence and guarantees global convergence.
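A minimal sketch of the basic (gbest) PSO the abstract builds on, showing the velocity/position update that drives all particles toward the global best and thus can cause the premature convergence described above. The sphere objective and all parameter values are illustrative, not from the paper:

```python
import random

def pso(objective, dim=2, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0)):
    """Minimize `objective` with basic global-best PSO."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal bests
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Inertia + cognitive pull (pbest) + social pull (gbest):
                # the social term is what concentrates the swarm over time.
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function: global minimum 0 at the origin.
best, best_val = pso(lambda p: sum(x * x for x in p))
```

On an obstacle-free convex function this convergence is desirable; the paper's point is that the same concentration behaviour traps robots behind large obstacles, which is why it hybridizes PSO with a local planner such as A-star.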


The work of French sociologist, anthropologist and philosopher Pierre Bourdieu has been influential across a set of cognate disciplines that can be classified as physical culture studies. Concepts such as field, capital, habitus and symbolic violence have been used as theoretical tools by scholars and students looking to understand the nature and purpose of sport, leisure, physical education and human movement within wider society. Pierre Bourdieu and Physical Culture is the first book to focus on the significance of Bourdieu’s work for, and in, physical culture. Bringing together the work of leading and emerging international researchers, it introduces the core concepts in Bourdieu’s thought and work, and presents a series of fascinating demonstrations of the application of his theory to physical culture studies. A concluding section discusses the inherent difficulties of choosing and using theory to understand the world around us. By providing an in-depth and multi-layered example of how theory can be used across the many and varied components of sport, leisure, physical education and human movement, this book should help all serious students and researchers in physical culture to better understand the importance of social theory in their work.


This paper presents an uncertainty quantification study of the performance analysis of the high pressure ratio single stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multi-purpose Small Power Unit. A deterministic 3D volume-averaged Computational Fluid Dynamics (CFD) solver is coupled with a non-statistical generalized Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. One advantage of this approach is that it does not require any modification of the CFD code to propagate random disturbances in the aerodynamic and geometric fields. The stochastic results highlight the importance of the blade thickness and trailing edge tip radius on the total-to-static efficiency of the turbine, compared to the angular velocity and trailing edge tip length. From a theoretical point of view, the use of the gPC representation on an arbitrary grid also allows investigation of the sensitivity of the turbine efficiency to the blade thickness profiles. The gPC approach is also applied to coupled random parameters. The results show that the most influential coupled random variables are the trailing edge tip radius coupled with the angular velocity.
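The non-intrusive pseudo-spectral projection idea can be sketched in one dimension: the deterministic solver is treated as a black box evaluated at quadrature nodes, and gPC coefficients are recovered by numerical integration. This toy example uses a single standard-normal input with a probabilists' Hermite basis and a quadratic response in place of the CFD model; it is an illustration of the projection mechanics, not the paper's multi-parameter setup:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def gpc_coefficients(model, order, n_quad=None):
    """Project a black-box model of one standard-normal input xi onto
    probabilists' Hermite polynomials He_n via Gauss quadrature.
    No modification of `model` is needed (non-intrusive)."""
    n_quad = n_quad or order + 1
    x, w = hermegauss(n_quad)           # nodes/weights for weight e^{-x^2/2}
    w = w / math.sqrt(2 * math.pi)      # normalize to the N(0,1) density
    y = np.array([model(xi) for xi in x])
    coeffs = []
    for n in range(order + 1):
        basis = hermeval(x, [0] * n + [1])          # He_n at the nodes
        # c_n = E[model(xi) He_n(xi)] / E[He_n^2],  E[He_n^2] = n!
        coeffs.append(np.sum(w * y * basis) / math.factorial(n))
    return np.array(coeffs)

# Quadratic response y = xi^2 = He_0 + He_2 exactly, so the projection
# should return c_0 = 1, c_2 = 1 and zeros elsewhere.
c = gpc_coefficients(lambda xi: xi ** 2, order=4)
mean = c[0]                                          # stochastic mean
var = sum(c[n] ** 2 * math.factorial(n) for n in range(1, 5))  # variance
```

Mean and variance of the output fall directly out of the coefficients, which is how sensitivity rankings like the ones in the abstract are obtained from the expansion.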


Despite substantial progress in measuring the 3D profile of anatomical variations in the human brain, their genetic and environmental causes remain enigmatic. We developed an automated system to identify and map genetic and environmental effects on brain structure in large brain MRI databases. We applied our multi-template segmentation approach ("Multi-Atlas Fluid Image Alignment") to fluidly propagate hand-labeled parameterized surface meshes into 116 scans of twins (60 identical, 56 fraternal), labeling the lateral ventricles. Mesh surfaces were averaged within subjects to minimize segmentation error. We fitted quantitative genetic models at each of 30,000 surface points to measure the proportion of shape variance attributable to (1) genetic differences among subjects, (2) environmental influences unique to each individual, and (3) shared environmental effects. Surface-based statistical maps revealed 3D heritability patterns, and their significance, with and without adjustments for global brain scale. These maps visualized detailed profiles of environmental versus genetic influences on the brain, extending genetic models to spatially detailed, automatically computed 3D maps.
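The three variance components above are the classic ACE decomposition from twin designs. The paper fits full quantitative genetic models; as a simpler illustration of the same logic, Falconer-style estimates can be read off the identical (MZ) and fraternal (DZ) twin correlations at a single surface point (the correlations below are made up):

```python
def ace_from_twin_correlations(r_mz, r_dz):
    """Falconer-style ACE decomposition from twin correlations.

    A (additive genetic)    = 2 * (r_mz - r_dz)
    C (shared environment)  = 2 * r_dz - r_mz
    E (unique environment)  = 1 - r_mz
    MZ twins share ~100% of genes, DZ twins ~50%, which is where the
    factor of 2 comes from."""
    a2 = 2.0 * (r_mz - r_dz)
    c2 = 2.0 * r_dz - r_mz
    e2 = 1.0 - r_mz
    return a2, c2, e2

# Hypothetical shape correlations at one of the 30,000 surface points:
a2, c2, e2 = ace_from_twin_correlations(r_mz=0.8, r_dz=0.5)
# a2 + c2 + e2 partitions the total variance (sums to 1 here)
```

Repeating such a fit at every surface point, with significance testing, is what produces the surface-based heritability maps the abstract describes.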


We developed and validated a new method to create automated 3D parametric surface models of the lateral ventricles in brain MRI scans, providing an efficient approach to monitor degenerative disease in clinical studies and drug trials. First, we used a set of parameterized surfaces to represent the ventricles in four subjects' manually labeled brain MRI scans (atlases). We fluidly registered each atlas and mesh model to MRIs from 17 Alzheimer's disease (AD) patients and 13 age- and gender-matched healthy elderly control subjects, and 18 asymptomatic ApoE4-carriers and 18 age- and gender-matched non-carriers. We examined genotyped healthy subjects with the goal of detecting subtle effects of a gene that confers heightened risk for Alzheimer's disease. We averaged the meshes extracted for each 3D MR data set, and combined the automated segmentations with a radial mapping approach to localize ventricular shape differences in patients. Validation experiments comparing automated and expert manual segmentations showed that (1) the Hausdorff labeling error rapidly decreased, and (2) the power to detect disease- and gene-related alterations improved, as the number of atlases, N, was increased from 1 to 9. In surface-based statistical maps, we detected more widespread and intense anatomical deficits as we increased the number of atlases. We formulated a statistical stopping criterion to determine the optimal number of atlases to use. Healthy ApoE4-carriers and those with AD showed local ventricular abnormalities. This high-throughput method for morphometric studies further motivates the combination of genetic and neuroimaging strategies in predicting AD progression and treatment response. © 2007 Elsevier Inc. All rights reserved.
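The within-subject mesh averaging step has a simple numerical core: given several fluidly propagated surface meshes that share one parameterization, corresponding vertices are averaged. A minimal sketch with toy coordinates (the meshes below are made up, not the paper's atlases):

```python
import numpy as np

def average_meshes(meshes):
    """Vertex-wise average of parameterized surface meshes.

    Assumes every mesh shares the same parameterization, so vertex i
    on one mesh corresponds to vertex i on every other mesh, as in the
    abstract's combination of per-atlas segmentations."""
    return np.mean(np.stack(meshes), axis=0)

# Two toy meshes, each with two (x, y, z) vertices:
m1 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
m2 = np.array([[0.2, 0.0, 0.0], [1.2, 0.0, 0.0]])
avg = average_meshes([m1, m2])   # vertex-wise mean of the two meshes
```

Averaging over N independent atlas propagations is what drives the reported drop in Hausdorff labeling error as N grows from 1 to 9: independent registration errors partially cancel.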


Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N=2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses as well as a mixture of the two, combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
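The meta-analytic arm of such a comparison is typically an inverse-variance (standard-error weighted) fixed-effect combination of per-site estimates. A minimal sketch; the site estimates and standard errors below are made up for illustration:

```python
import math

def inverse_variance_meta(estimates, std_errors):
    """Fixed-effect meta-analysis: weight each site's estimate by the
    inverse of its squared standard error, so precise sites dominate."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical per-site heritability estimates with standard errors:
pooled, pooled_se = inverse_variance_meta(
    estimates=[0.60, 0.70, 0.55],
    std_errors=[0.10, 0.05, 0.08],
)
```

The pooled standard error is always smaller than any single site's, which is the power argument for multi-site consortia; a mega-analysis goes further by modeling the pooled raw data directly instead of only the site-level summaries.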


The ENIGMA (Enhancing NeuroImaging Genetics through Meta-Analysis) Consortium was set up to analyze brain measures and genotypes from multiple sites across the world to improve the power to detect genetic variants that influence the brain. Diffusion tensor imaging (DTI) yields quantitative measures sensitive to brain development and degeneration, and some common genetic variants may be associated with white matter integrity or connectivity. DTI measures, such as the fractional anisotropy (FA) of water diffusion, may be useful for identifying genetic variants that influence brain microstructure. However, genome-wide association studies (GWAS) require large populations to obtain sufficient power to detect and replicate significant effects, motivating a multi-site consortium effort. As part of an ENIGMA-DTI working group, we analyzed high-resolution FA images from multiple imaging sites across North America, Australia, and Europe, to address the challenge of harmonizing imaging data collected at multiple sites. Four hundred images of healthy adults aged 18-85 from four sites were used to create a template and corresponding skeletonized FA image as a common reference space. Using twin and pedigree samples of different ethnicities, we used our common template to evaluate the heritability of tract-derived FA measures. We show that our template is reliable for integrating multiple datasets by combining results through meta-analysis and unifying the data through exploratory mega-analyses. Our results may help prioritize regions of the FA map that are consistently influenced by additive genetic factors for future genetic discovery studies. Protocols and templates are publicly available at http://enigma.loni.ucla.edu/ongoing/dti-working-group/.
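The FA measure at the center of this harmonization effort has a standard closed form in the eigenvalues of the fitted diffusion tensor. A small sketch (the example eigenvalues are illustrative, not from any cohort):

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the three eigenvalues of a diffusion tensor:
    FA = sqrt(3/2) * ||lambda - mean|| / ||lambda||, ranging from
    0 (isotropic diffusion) to 1 (diffusion along a single axis)."""
    mean = (l1 + l2 + l3) / 3.0
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    if den == 0.0:
        return 0.0
    return math.sqrt(1.5 * num / den)

fa_iso = fractional_anisotropy(1.0, 1.0, 1.0)     # isotropic: FA = 0
fa_fiber = fractional_anisotropy(1.7, 0.2, 0.2)   # coherent fiber: high FA
```

Because FA is a scalar per voxel, maps from different scanners can be projected onto one skeletonized template and compared, which is what makes the consortium-wide meta- and mega-analyses possible.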


Combining datasets across independent studies can boost statistical power by increasing the number of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint analyses of the rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages: 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool, SOLAR-Eclipse, to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate heritability estimates across populations. We performed leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time to understand the estimate variability. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability.
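The sample-size weighted pooling and the leave-one-out robustness check can both be sketched in a few lines. The per-cohort heritabilities and sample sizes below are hypothetical (only the total of 2248 matches the abstract):

```python
def sample_size_weighted_meta(estimates, sample_sizes):
    """Pool per-cohort estimates, weighting each by its sample size
    (one of the two meta-analytic weightings named in the abstract)."""
    total = sum(sample_sizes)
    return sum(n * h for n, h in zip(sample_sizes, estimates)) / total

def leave_one_out(estimates, sample_sizes):
    """Recompute the pooled estimate with each cohort removed in turn,
    to see how much any single cohort drives the joint estimate."""
    pooled = []
    for i in range(len(estimates)):
        est = [e for j, e in enumerate(estimates) if j != i]
        ns = [n for j, n in enumerate(sample_sizes) if j != i]
        pooled.append(sample_size_weighted_meta(est, ns))
    return pooled

# Hypothetical heritability estimates for five family-based cohorts:
h2 = [0.55, 0.70, 0.60, 0.65, 0.50]
n = [400, 600, 300, 500, 448]          # sums to 2248 subjects
pooled = sample_size_weighted_meta(h2, n)
loo = leave_one_out(h2, n)             # five leave-one-out estimates
```

If the leave-one-out estimates stay close to the full pooled value, no single cohort dominates, which is the robustness claim the abstract's analysis tests.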


Diffusion weighted magnetic resonance (MR) imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of 6 directions, second-order tensors can be computed to model dominant diffusion processes. However, conventional DTI is not sufficient to resolve crossing fiber tracts. Recently, a number of high-angular resolution schemes with greater than 6 gradient directions have been employed to address this issue. In this paper, we introduce the Tensor Distribution Function (TDF), a probability function defined on the space of symmetric positive definite matrices. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF. The optimal TDF is the one that best explains the observed diffusion signals; once it is determined, the diffusion orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function.
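The conventional single-tensor fit that the TDF generalizes can be sketched directly: with at least 6 gradient directions, the log of the Stejskal-Tanner signal equation is linear in the 6 unique tensor entries, so the tensor follows from a least-squares solve. This is the standard DTI baseline, not the paper's TDF method; the gradient scheme and tensor below are synthetic:

```python
import numpy as np

def fit_tensor(signals, s0, bvals, bvecs):
    """Least-squares fit of a symmetric second-order diffusion tensor
    from >= 6 diffusion-weighted signals (Stejskal-Tanner model:
    S_i = S0 * exp(-b_i * g_i^T D g_i))."""
    y = -np.log(np.asarray(signals) / s0) / np.asarray(bvals)
    g = np.asarray(bvecs, dtype=float)
    # Design matrix columns correspond to Dxx, Dyy, Dzz, Dxy, Dxz, Dyz.
    A = np.column_stack([
        g[:, 0] ** 2, g[:, 1] ** 2, g[:, 2] ** 2,
        2 * g[:, 0] * g[:, 1], 2 * g[:, 0] * g[:, 2], 2 * g[:, 1] * g[:, 2],
    ])
    d, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.array([[d[0], d[3], d[4]],
                     [d[3], d[1], d[5]],
                     [d[4], d[5], d[2]]])

# Synthetic check: recover a known tensor from 6 noise-free signals.
s2 = 1.0 / np.sqrt(2.0)
bvecs = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [s2, s2, 0], [s2, 0, s2], [0, s2, s2]])
D_true = np.diag([1.5e-3, 0.4e-3, 0.4e-3])    # one coherent fiber
bvals = np.full(6, 1000.0)
signals = [float(np.exp(-b * g @ D_true @ g)) for b, g in zip(bvals, bvecs)]
D_fit = fit_tensor(signals, s0=1.0, bvals=bvals, bvecs=bvecs)
```

A single tensor like this averages crossing fibers into one blurred ellipsoid; the TDF instead places a probability distribution over many such tensors, which is how it resolves the crossings.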


High-angular resolution diffusion imaging (HARDI) can reconstruct fiber pathways in the brain with extraordinary detail, identifying anatomical features and connections not seen with conventional MRI. HARDI overcomes several limitations of standard diffusion tensor imaging, which fails to model diffusion correctly in regions where fibers cross or mix. As HARDI can accurately resolve sharp signal peaks in angular space where fibers cross, we studied how many gradients are required in practice to compute accurate orientation density functions, to better understand the tradeoff between longer scanning times and more angular precision. We computed orientation density functions analytically from tensor distribution functions (TDFs), which model the HARDI signal at each point as a unit-mass probability density on the 6D manifold of symmetric positive definite tensors. In simulated two-fiber systems with varying Rician noise, we assessed how many diffusion-sensitized gradients were sufficient to (1) accurately resolve the diffusion profile, and (2) measure the exponential isotropy (EI), a TDF-derived measure of fiber integrity that exploits the full multidirectional HARDI signal. At lower SNR, the reconstruction accuracy, measured using the Kullback-Leibler divergence, rapidly increased with additional gradients, and EI estimation accuracy plateaued at around 70 gradients.
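The Kullback-Leibler divergence used as the reconstruction-accuracy metric compares a reconstructed orientation profile against the ground truth. A minimal sketch on coarsely discretized toy ODFs (the four-direction profiles are made up; real ODFs are sampled on a dense sphere):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discretized ODFs.
    Both are normalized to unit mass first; eps guards against log(0)."""
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]
    q = [x / sq for x in q]
    return sum(pi * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q))

# A sharply peaked profile vs. a flat one over four directions:
odf_true = [0.1, 0.4, 0.4, 0.1]
odf_flat = [0.25, 0.25, 0.25, 0.25]
d_same = kl_divergence(odf_true, odf_true)   # identical profiles: 0
d_diff = kl_divergence(odf_true, odf_flat)   # mismatch: positive
```

A reconstruction that smears the true peaks toward a flat profile accrues positive divergence, so "accuracy increased with additional gradients" corresponds to this number falling toward zero.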


Industry-school partnerships (ISPs) are increasingly recognised as a new way of providing vocational education opportunities. However, there is limited research investigating their impact on systemic (organisational and structural) and human resource (teachers and education managers) capacity to support school-to-work transitions. This paper reports on a government-led ISP established by the Queensland state government. ISPs across three industry sectors, minerals and energy, building and construction, and aviation, are included in this study. The research adopted a qualitative case study methodology and draws upon boundary crossing theory to understand how each industry sector responded to the systemic and human resource issues that emerged in each ISP. The main finding is that the systematic application of boundary crossing mechanisms by all partners produced mutually beneficial outcomes. ISPs from the three sectors adopted different models and leveraged different boundary crossing objects, but all maintained the joint vision and mutually agreed outcomes. All three ISPs genuinely crossed boundaries, albeit in different ways, and assisted teachers to co-produce industry-based curricula and share sector-specific knowledge and skills that help enhance the school-to-work transition for school graduates.


In the global construction context, the Best Value or Most Economically Advantageous Tender is becoming a widespread approach for contractor selection, as an alternative to traditional awarding criteria such as the Lowest Price. In these multi-attribute tenders, the owner or auctioneer solicits proposals containing both a price bid and additional technical features. Once the proposals are received, each bidder's price bid is given an economic score according to a scoring rule, generally called an Economic Scoring Formula (ESF), and a technical score according to pre-specified criteria. Eventually, the contract is awarded to the bidder with the highest weighted overall score (economic + technical). However, ESF selection by auctioneers is invariably, and paradoxically, a highly intuitive process in practice, involving few theoretical or empirical considerations, despite traditionally, and mistakenly, being considered objective due to its mathematical nature. This paper provides a taxonomic classification of a wide variety of ESFs and Abnormally Low Bid Criteria (ALBC) gathered in several countries with different tendering approaches. Practical implications concern the optimal design of price scoring rules in construction contract tenders, as well as future analyses of the effects of ESF and ALBC on competitive bidding behaviour.
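The weighted-score mechanism can be made concrete with one common proportional ESF, in which the lowest price receives the full economic score and others score in proportion to it. This formula, the 60/40 weights, and the bids are all hypothetical illustrations; the paper's point is precisely that many different ESFs exist and the choice among them matters:

```python
def weighted_tender_score(price, best_price, technical_score,
                          w_econ=0.6, w_tech=0.4):
    """Toy Best Value score: a proportional economic scoring formula
    (lowest bid earns 100) combined with a 0-100 technical score,
    using made-up 60/40 weights."""
    econ_score = 100.0 * best_price / price
    return w_econ * econ_score + w_tech * technical_score

# Three hypothetical bids: (price, technical score out of 100).
bids = [(900_000, 60.0), (1_000_000, 85.0), (950_000, 75.0)]
best_price = min(p for p, _ in bids)
totals = [weighted_tender_score(p, best_price, t) for p, t in bids]
```

Under these made-up numbers the cheapest bid does not win, which is the behavioural difference between Best Value awarding and the Lowest Price criterion; a different ESF or weighting could reverse the ranking, which is why ESF design deserves more than intuition.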


A central dimension of the State’s responsibility in a liberal democracy and any just society is the protection of individuals’ central rights and freedoms, and the creation of the minimum conditions under which each individual has an opportunity to lead a life of sufficient equality, dignity and value. A special subset of this responsibility is to protect those who are unable to protect themselves from genuine harm. Substantial numbers of children suffer serious physical, emotional and sexual abuse, and neglect at the hands of their parents and caregivers or by other known parties. Child abuse and neglect occurs in a situation of extreme power asymmetry. The physical, social, behavioural and economic costs to the individual, and the social and economic costs to communities, are vast. Children are not generally able to protect themselves from serious abuse and neglect. This enlivens both the State’s responsibility to protect the child, and the debate about how that responsibility can and should be discharged. A core question arises for all societies, given that most serious child maltreatment occurs in the family sphere, is unlikely to be disclosed, causes substantial harm to both individual and community, and infringes fundamental individual rights and freedoms. The question is: how can society identify these situations so that the maltreatment can be interrupted, the child’s needs for security and safety, and health and other rehabilitation can be met, and the family’s needs can be addressed to reduce the likelihood of recurrence? This chapter proposes a theoretical framework applicable for any society that is considering justifiable and effective policy approaches to identify and respond to cases of serious child abuse and neglect. 
The core of the theoretical framework is based on major principles from both classical liberal political philosophy (Locke and Mill), and leading political philosophers from the twentieth century and the first part of the new millennium (Rawls, Rorty, Okin, Nussbaum), and is further situated within fundamental frameworks of civil and criminal law, and health and economics.