Abstract:
The Jericho kimberlite (173.1 ± 1.3 Ma) is a small (~130 × 70 m), multi-vent system that preserves products from deep (>1 km?) portions of kimberlite vents. Pit mapping, drill core examination, petrographic study, image analysis of olivine crystals (grain size distributions and shape studies), and compositional and mineralogical studies are used to reconstruct processes from near-surface magma ascent to kimberlite emplacement and alteration. The Jericho kimberlite formed by multiple eruptions through an Archean granodiorite batholith that was overlain by mid-Devonian limestones ~1 km in thickness. Kimberlite magma ascended through the granodiorite basement by dyke propagation but ascended through the limestone, at least in part, by locally brecciating the host rocks. After the first explosive breakthrough to surface, vent deepening and widening occurred by the erosive forces of the waxing phase of the eruption, by gravitationally induced failures as portions of the vent margins slid into the vent and, in the deeper portions of the vent (>1 km), by scaling, as thin slabs burst from the walls into the vent. At currently exposed levels, coherent kimberlite (CK) dykes (<40 cm thick) are found to the north and south of the vent complex and represent the earliest preserved in-situ products of Jericho magmatism. Timing of CK emplacement on the eastern side of the vent complex is unclear; some thick CK (15-20 m) may have been emplaced after the central vent was formed. Explosive eruptive products are preserved in four partially overlapping vents that are roughly aligned along strike with the coherent kimberlite dyke. The volcaniclastic kimberlite (VK) facies are massive and poorly sorted, with matrix- to clast-supported textures. The VK facies fragmented by dry, volatile-driven processes and were emplaced by eruption column collapse back into the volcanic vents.
The first explosive products, poorly preserved because of partial destruction by later eruptions, are found in the central-east vent and were formed by eruption column collapse after the vent was largely cleared of country rock debris. The next active vent was either the north or south vent. Collapse of the eruption column, linked to a vent widening episode, resulted in coeval avalanching of pipe margin walls into the north vent, forming interstratified lenses of country rock-rich boulder breccias in finer-grained volcaniclastic kimberlite. South vent kimberlite has similar characteristics to kimberlite of the north vent and likely formed by similar processes. The final eruptive phase formed olivine-rich and moderately sorted deposits of the central vent. Better sorting is attributed to recycling of kimberlite debris by multiple eruptions through the unconsolidated volcaniclastic pile and associated collapse events. Post-emplacement alteration varies in intensity but in all cases has overprinted the primary groundmass (in CK) and matrix (in VK). Erosion has since removed all limestone cover.
Abstract:
Dwellings in multi-storey apartment buildings (MSAB) are predicted to increase dramatically as a proportion of housing stock in subtropical cities over coming decades. The problem of designing comfortable and healthy high-density residential environments while minimising energy consumption must be addressed urgently in subtropical cities globally. This paper explores private residents’ experiences of privacy and comfort, and their perceptions of how well their apartment dwelling modulated the external environment in subtropical conditions, through analysis of 636 survey responses and 24 interviews with residents of MSAB in inner urban neighbourhoods of Brisbane, Australia. The findings show that the availability of natural ventilation and outdoor private living spaces plays an important role in resident perceptions of liveability in the subtropics, where the climate is conducive to year-round “outdoor living”. Residents valued choice with regard to climate control methods in their apartments. They overwhelmingly preferred natural ventilation to manage thermal comfort, and turned to the air-conditioner for limited periods, particularly when external conditions were too noisy. These findings provide a unique evidence base for reducing the environmental impact of MSAB and increasing the acceptability of apartment living, through incorporating residential attributes positioned around climate-responsive architecture.
Abstract:
There is an increasing demand for Unmanned Aerial Systems (UAS) to carry suspended loads, as this can provide significant benefits to several applications in agriculture, law enforcement and construction. The load's impact on the underlying system dynamics should not be neglected, as significant feedback forces may be induced on the vehicle during certain flight manoeuvres. The constant variation in operating point induced by the slung load also causes conventional controllers to demand increased control effort. Much research has focused on standard multi-rotor position and attitude control with and without a slung load. However, predictive control schemes, such as Nonlinear Model Predictive Control (NMPC), have not yet been fully explored. To this end, we present a novel controller for safe and precise operation of multi-rotors with a heavy slung load in three dimensions. The paper describes a System Dynamics and Control Simulation Toolbox for use with MATLAB/SIMULINK which includes a detailed simulation of the multi-rotor and slung load, as well as a predictive controller that manages the nonlinear dynamics whilst accounting for system constraints. It is demonstrated that the controller simultaneously tracks specified waypoints and actively damps large slung load oscillations. A linear-quadratic regulator (LQR) is derived and its control performance compared. Results show the improved performance of the predictive controller over a larger flight envelope, including aggressive manoeuvres and large slung load displacements. The computational cost remains relatively small, making the approach amenable to practical implementation.
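The paper's MATLAB/SIMULINK toolbox and NMPC formulation are not reproduced in the abstract. As a rough, hypothetical illustration of the LQR baseline it compares against, the sketch below derives a discrete-time LQR gain by backward Riccati iteration for a toy double integrator, a stand-in for one translational axis of the multi-rotor, not the paper's vehicle/slung-load model:

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR gain via backward Riccati iteration."""
    P = Q.copy()
    for _ in range(iters):
        # K = (R + B'PB)^-1 B'PA, then Riccati update P = Q + A'P(A - BK)
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Toy double integrator (position, velocity), dt = 0.1 s. The weights
# Q and R below are illustrative choices, not the paper's tuning.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
Q = np.diag([10.0, 1.0])   # penalise position error more than velocity
R = np.array([[0.1]])      # control-effort penalty

K = dlqr(A, B, Q, R)
eigs = np.linalg.eigvals(A - B @ K)
print(np.all(np.abs(eigs) < 1.0))  # True: closed loop x+ = (A - BK)x is stable
```

For a controllable pair (A, B) with positive-definite weights, the iteration converges to the stabilising gain; an NMPC controller would instead re-solve a constrained optimisation at each step.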
Abstract:
Public buildings and large infrastructure are typically monitored by tens or hundreds of cameras, all capturing different physical spaces and observing different types of interactions and behaviours. To date, however, in large part due to limited data availability, crowd monitoring and operational surveillance research has focused on single-camera scenarios, which are not representative of real-world applications. In this paper we present a new, publicly available database for large-scale crowd surveillance. Footage from 12 cameras covering a full work day on the main floor of a busy university campus building, including an internal and external foyer, elevator foyers, and the main external approach, is provided, alongside annotations for crowd counting (single or multi-camera) and pedestrian flow analysis for 10 and 6 sites respectively. We describe how this large dataset can be used to perform distributed monitoring of building utilisation, and demonstrate its potential for understanding and learning the relationship between different areas of a building.
Abstract:
Particle swarm optimization (PSO), a population-based algorithm, has recently been applied to multi-robot systems. Although the algorithm solves many optimization problems well, it has drawbacks when applied to multi-robot search for a target in a space containing big static obstacles. One defect is premature convergence: a property of basic PSO is that particles spread across a search space tend, as time increases, to converge into a small area. The same behaviour appears in a multi-robot search system, particularly when big static obstacles prevent the robots from finding the target easily; the robots converge to a small area that may not contain the target and become entrapped there. Another shortcoming is that basic PSO cannot guarantee global convergence of the algorithm: particles initially explore different areas, but in some cases they are poor at exploiting promising areas, which increases the search time. This study proposes a PSO-based method for a multi-robot system searching for a target in a space containing big static obstacles. The method not only overcomes the premature convergence problem but also establishes an efficient balance between exploration and exploitation and guarantees global convergence, reducing the search time by combining PSO with a local search method such as A-star. To validate the effectiveness and usefulness of the algorithms, a simulation environment was developed for conducting experiments in different scenarios and reporting results. These experiments demonstrate that the proposed method overcomes premature convergence and guarantees global convergence.
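The canonical ("basic") PSO update the abstract refers to, the inertia plus cognitive/social velocity terms whose contraction causes premature convergence, can be sketched as follows. The objective, parameter values, and target location are illustrative, not from the paper, and no obstacles or A-star refinement are modelled here:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(objective, n_particles=30, dim=2, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0)):
    """Basic PSO: each particle is pulled toward its personal best and
    the swarm's global best; with w < 1 the swarm contracts over time,
    which is exactly the premature-convergence risk near obstacles."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()          # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())

# Hypothetical target at (3, -2); fitness is distance to the target,
# a proxy for an obstacle-free multi-robot search.
target = np.array([3.0, -2.0])
best, best_f = pso(lambda p: np.linalg.norm(p - target))
```

In open space the swarm homes in on the target; the paper's point is that with large obstacles the same contraction traps the robots away from it.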
Abstract:
Despite substantial progress in measuring the 3D profile of anatomical variations in the human brain, their genetic and environmental causes remain enigmatic. We developed an automated system to identify and map genetic and environmental effects on brain structure in large brain MRI databases. We applied our multi-template segmentation approach ("Multi-Atlas Fluid Image Alignment") to fluidly propagate hand-labeled parameterized surface meshes into 116 scans of twins (60 identical, 56 fraternal), labeling the lateral ventricles. Mesh surfaces were averaged within subjects to minimize segmentation error. We fitted quantitative genetic models at each of 30,000 surface points to measure the proportion of shape variance attributable to (1) genetic differences among subjects, (2) environmental influences unique to each individual, and (3) shared environmental effects. Surface-based statistical maps revealed 3D heritability patterns, and their significance, with and without adjustments for global brain scale. These maps visualized detailed profiles of environmental versus genetic influences on the brain, extending genetic models to spatially detailed, automatically computed, 3D maps.
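The quantitative genetic models above partition shape variance into additive genetic (A), unique environmental (E), and shared environmental (C) components. As a simplified, hypothetical stand-in for the full models fitted at each surface point, Falconer's closed-form formulas estimate these components from identical (MZ) and fraternal (DZ) twin-pair correlations:

```python
import numpy as np

def falconer_ace(mz_pairs, dz_pairs):
    """Closed-form ACE variance components from twin-pair correlations:
    a2 = 2(rMZ - rDZ), c2 = 2rDZ - rMZ, e2 = 1 - rMZ. A rough estimate,
    not the maximum-likelihood model fitting used in the study."""
    r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
    r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
    return 2.0 * (r_mz - r_dz), 2.0 * r_dz - r_mz, 1.0 - r_mz

# Simulate a trait with true a2=0.6, c2=0.2, e2=0.2 (arbitrary values).
rng = np.random.default_rng(1)
n = 20000

def trait(A, C, E, a2=0.6, c2=0.2, e2=0.2):
    return np.sqrt(a2) * A + np.sqrt(c2) * C + np.sqrt(e2) * E

# MZ twins share all additive genetic and shared-environment factors.
A_mz, C_mz = rng.standard_normal(n), rng.standard_normal(n)
mz = np.column_stack([trait(A_mz, C_mz, rng.standard_normal(n)),
                      trait(A_mz, C_mz, rng.standard_normal(n))])

# DZ twins share half the additive genetic variance (corr(A1, A2) = 0.5).
A_sh, C_dz = rng.standard_normal(n), rng.standard_normal(n)
A1 = np.sqrt(0.5) * A_sh + np.sqrt(0.5) * rng.standard_normal(n)
A2 = np.sqrt(0.5) * A_sh + np.sqrt(0.5) * rng.standard_normal(n)
dz = np.column_stack([trait(A1, C_dz, rng.standard_normal(n)),
                      trait(A2, C_dz, rng.standard_normal(n))])

a2_hat, c2_hat, e2_hat = falconer_ace(mz, dz)
```

With 20,000 simulated pairs the recovered components land close to the generating values, which is the per-vertex quantity the heritability maps display.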
Abstract:
We developed and validated a new method to create automated 3D parametric surface models of the lateral ventricles in brain MRI scans, providing an efficient approach to monitor degenerative disease in clinical studies and drug trials. First, we used a set of parameterized surfaces to represent the ventricles in four subjects' manually labeled brain MRI scans (atlases). We fluidly registered each atlas and mesh model to MRIs from 17 Alzheimer's disease (AD) patients and 13 age- and gender-matched healthy elderly control subjects, and 18 asymptomatic ApoE4-carriers and 18 age- and gender-matched non-carriers. We examined genotyped healthy subjects with the goal of detecting subtle effects of a gene that confers heightened risk for Alzheimer's disease. We averaged the meshes extracted for each 3D MR data set, and combined the automated segmentations with a radial mapping approach to localize ventricular shape differences in patients. Validation experiments comparing automated and expert manual segmentations showed that (1) the Hausdorff labeling error rapidly decreased, and (2) the power to detect disease- and gene-related alterations improved, as the number of atlases, N, was increased from 1 to 9. In surface-based statistical maps, we detected more widespread and intense anatomical deficits as we increased the number of atlases. We formulated a statistical stopping criterion to determine the optimal number of atlases to use. Healthy ApoE4-carriers and those with AD showed local ventricular abnormalities. This high-throughput method for morphometric studies further motivates the combination of genetic and neuroimaging strategies in predicting AD progression and treatment response.
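The validation above scores segmentations by a Hausdorff labeling error. A minimal sketch of the symmetric Hausdorff distance between two point sets, using toy surfaces rather than the study's meshes:

```python
import numpy as np

def hausdorff(P, Q):
    """Symmetric Hausdorff distance between point sets P (n,3) and
    Q (m,3): the worst-case distance from a point in one set to its
    nearest neighbour in the other. Brute-force pairwise distances,
    fine for small point counts."""
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Two toy "surfaces": an 11x11 grid of points with 0.1 spacing, and
# the same grid shifted by 0.1 along x.
P = np.array([[x, y, 0.0] for x in np.linspace(0, 1, 11)
                          for y in np.linspace(0, 1, 11)])
Q = P + np.array([0.1, 0.0, 0.0])
print(round(hausdorff(P, Q), 3))  # 0.1: only the boundary columns disagree
```

Averaging over N independent atlas-propagated meshes reduces segmentation error, which is why the measured Hausdorff error falls as N grows from 1 to 9.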
Abstract:
Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N = 2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses as well as a mixture of the two, combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
Abstract:
The ENIGMA (Enhancing NeuroImaging Genetics through Meta-Analysis) Consortium was set up to analyze brain measures and genotypes from multiple sites across the world to improve the power to detect genetic variants that influence the brain. Diffusion tensor imaging (DTI) yields quantitative measures sensitive to brain development and degeneration, and some common genetic variants may be associated with white matter integrity or connectivity. DTI measures, such as the fractional anisotropy (FA) of water diffusion, may be useful for identifying genetic variants that influence brain microstructure. However, genome-wide association studies (GWAS) require large populations to obtain sufficient power to detect and replicate significant effects, motivating a multi-site consortium effort. As part of an ENIGMA-DTI working group, we analyzed high-resolution FA images from multiple imaging sites across North America, Australia, and Europe, to address the challenge of harmonizing imaging data collected at multiple sites. Four hundred images of healthy adults aged 18-85 from four sites were used to create a template and corresponding skeletonized FA image as a common reference space. Using twin and pedigree samples of different ethnicities, we used our common template to evaluate the heritability of tract-derived FA measures. We show that our template is reliable for integrating multiple datasets by combining results through meta-analysis and unifying the data through exploratory mega-analyses. Our results may help prioritize regions of the FA map that are consistently influenced by additive genetic factors for future genetic discovery studies. Protocols and templates are publicly available at http://enigma.loni.ucla.edu/ongoing/dti-working-group/.
Abstract:
Combining datasets across independent studies can boost statistical power by increasing the number of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint analyses of the rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating meta- and mega-genetic analytical approaches for obtaining pooled estimates of heritability, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool SOLAR-Eclipse to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate heritability estimates across populations. We performed a leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time, to understand the variability of the estimates. Overall, the meta- and mega-genetic analyses produced robust estimates of heritability.
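The two meta-analytical approaches named above, sample-size weighted and standard-error (inverse-variance) weighted pooling, can be sketched as follows. The per-cohort heritability estimates, sample sizes, and standard errors below are hypothetical, not the paper's values:

```python
import numpy as np

def meta_sample_size(h2, n):
    """Sample-size-weighted pooled estimate of per-cohort values h2."""
    w = np.asarray(n, float)
    return float(np.sum(w * h2) / w.sum())

def meta_inverse_variance(h2, se):
    """Standard-error (inverse-variance) weighted pooled estimate and
    the standard error of the pooled estimate."""
    w = 1.0 / np.asarray(se, float) ** 2
    pooled = float(np.sum(w * h2) / w.sum())
    return pooled, float(np.sqrt(1.0 / w.sum()))

# Hypothetical FA-heritability estimates from five cohorts.
h2 = np.array([0.55, 0.62, 0.48, 0.70, 0.58])
n  = np.array([300, 500, 250, 400, 798])
se = np.array([0.08, 0.06, 0.09, 0.07, 0.05])

h2_ss = meta_sample_size(h2, n)
h2_iv, h2_iv_se = meta_inverse_variance(h2, se)
```

A mega-analysis would instead pool the raw family data into one "mega-family" and fit a single variance-components model, rather than combining cohort-level summaries.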
Abstract:
Diffusion weighted magnetic resonance (MR) imaging is a powerful tool that can be employed to study white matter microstructure by examining the 3D displacement profile of water molecules in brain tissue. By applying diffusion-sensitized gradients along a minimum of 6 directions, second-order tensors can be computed to model dominant diffusion processes. However, conventional DTI is not sufficient to resolve crossing fiber tracts. Recently, a number of high-angular resolution schemes with greater than 6 gradient directions have been employed to address this issue. In this paper, we introduce the Tensor Distribution Function (TDF), a probability function defined on the space of symmetric positive definite matrices. Here, fiber crossing is modeled as an ensemble of Gaussian diffusion processes with weights specified by the TDF. Once the optimal TDF is determined by fitting this mixture to the measured signal, the diffusion orientation distribution function (ODF) can easily be computed by analytic integration of the resulting displacement probability function.
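Under the ensemble-of-Gaussians model above, the diffusion signal along a gradient direction g is S(g) = sum_i w_i * exp(-b g^T D_i g). A minimal forward simulation of a two-fiber crossing under this model, with illustrative eigenvalues and b-value rather than fitted values:

```python
import numpy as np

def tensor(evals, angle_deg):
    """Diagonal diffusion tensor with given eigenvalues, rotated in the
    x-y plane so its principal axis points along angle_deg."""
    c, s = np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return R @ np.diag(evals) @ R.T

def mixture_signal(g, tensors, weights, b=1000.0):
    """Signal from a discrete tensor distribution: weighted sum of
    Gaussian attenuations, S(g) = sum_i w_i * exp(-b g^T D_i g)."""
    return float(sum(w * np.exp(-b * g @ D @ g)
                     for w, D in zip(weights, tensors)))

# Two fibres crossing at 90 degrees, equal weights; typical white-matter
# eigenvalues in mm^2/s, b in s/mm^2 (illustrative values).
evals = (1.5e-3, 3.0e-4, 3.0e-4)
D1, D2 = tensor(evals, 0.0), tensor(evals, 90.0)
gx = np.array([1.0, 0.0, 0.0])   # along fibre 1
gz = np.array([0.0, 0.0, 1.0])   # perpendicular to both fibres
s_x = mixture_signal(gx, [D1, D2], [0.5, 0.5])
s_z = mixture_signal(gz, [D1, D2], [0.5, 0.5])
print(s_z > s_x)  # True: least diffusion along z, hence least attenuation
```

Fitting the TDF inverts this forward model, recovering weights over tensor space from signals measured along many gradient directions; the ODF then follows by analytic integration.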
Abstract:
High-angular resolution diffusion imaging (HARDI) can reconstruct fiber pathways in the brain with extraordinary detail, identifying anatomical features and connections not seen with conventional MRI. HARDI overcomes several limitations of standard diffusion tensor imaging, which fails to model diffusion correctly in regions where fibers cross or mix. As HARDI can accurately resolve sharp signal peaks in angular space where fibers cross, we studied how many gradients are required in practice to compute accurate orientation density functions, to better understand the tradeoff between longer scanning times and more angular precision. We computed orientation density functions analytically from tensor distribution functions (TDFs), which model the HARDI signal at each point as a unit-mass probability density on the 6D manifold of symmetric positive definite tensors. In simulated two-fiber systems with varying Rician noise, we assessed how many diffusion-sensitized gradients were sufficient to (1) accurately resolve the diffusion profile, and (2) measure the exponential isotropy (EI), a TDF-derived measure of fiber integrity that exploits the full multidirectional HARDI signal. At lower SNR, the reconstruction accuracy, measured using the Kullback-Leibler divergence, rapidly increased with additional gradients, and EI estimation accuracy plateaued at around 70 gradients.
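Reconstruction accuracy above is scored with the Kullback-Leibler divergence. For distributions discretised on a common set of directions (as a sampled ODF would be), it reduces to the sketch below, using toy profiles rather than HARDI data:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two distributions sampled on the
    same discrete support (e.g. ODF values on shared sphere directions).
    Both inputs are normalised to sum to 1; eps guards against log(0)."""
    p = np.asarray(p, float); q = np.asarray(q, float)
    p = p / p.sum(); q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Toy "ODF" samples on 8 directions: a two-peak ground-truth profile
# and a noisy reconstruction of it (hypothetical numbers).
truth = np.array([4.0, 1.0, 1.0, 1.0, 4.0, 1.0, 1.0, 1.0])
recon = np.array([3.6, 1.2, 0.9, 1.1, 4.3, 0.8, 1.1, 1.0])
print(kl_divergence(truth, truth) == 0.0)  # True: identical profiles
print(kl_divergence(truth, recon) > 0.0)   # True: any mismatch is penalised
```

A falling divergence as gradient count grows is what the simulations report: the reconstructed ODF approaches the ground-truth profile.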
Abstract:
Purpose: The purpose of this paper is to explore the concept of service quality for settings where several customers are involved in the joint creation and consumption of a service. The approach is to provide first insights into the implications of simultaneous multi-customer integration on service quality.
Design/methodology/approach: This conceptual paper undertakes a thorough review of the relevant literature before developing a conceptual model regarding service co-creation and service quality in customer groups.
Findings: Group service encounters must be set up carefully to account for the dynamics (social activity) in a customer group and the skill set and capabilities (task activity) of each of the individual participants involved in a group service experience.
Research limitations/implications: Future research should undertake empirical studies to validate and/or modify the suggested model presented in this contribution.
Practical implications: Managers of service firms should be made aware of the implications and the underlying factors of group services in order to create and manage a group experience successfully. Particular attention should be given to those factors that service providers can influence when managing encounters with multiple customers.
Originality/value: This article introduces a new conceptual approach for service encounters with groups of customers in a proposed service quality model. In particular, the paper focuses on integrating the impact of customers' co-creation activities on service quality in a multiple-actor model.