112 results for Projection cortico-corticale

in the Queensland University of Technology - ePrints Archive


Relevance: 20.00%

Abstract:

Summary: Generalized Procrustes analysis and thin plate splines were employed to create an average 3D shape template of the proximal femur, which was warped to the size and shape of a single 2D radiographic image of a subject. Mean absolute depth errors are comparable with those of previous approaches utilising multiple 2D input projections.

Introduction: Several approaches have been adopted to derive volumetric density (g cm⁻³) from a conventional 2D representation of areal bone mineral density (BMD, g cm⁻²). Such approaches have generally aimed at deriving an average depth across the areal projection rather than creating a formal 3D shape of the bone.

Methods: Generalized Procrustes analysis and thin plate splines were employed to create an average 3D shape template of the proximal femur that was subsequently warped to suit the size and shape of a single 2D radiographic image of a subject. CT scans of excised human femora, 18 and 24 scanned at pixel resolutions of 1.08 mm and 0.674 mm, respectively, were equally split into training (used to create the 3D shape template) and test cohorts.

Results: The mean absolute depth errors of 3.4 mm and 1.73 mm, respectively, for the two CT pixel sizes are comparable with those of previous approaches based upon multiple 2D input projections.

Conclusions: This technique has the potential to derive volumetric density from BMD and to facilitate 3D finite element analysis for prediction of the mechanical integrity of the proximal femur. It may further be applied to other anatomical bone sites such as the distal radius and lumbar spine.
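
The core alignment step of generalized Procrustes analysis can be sketched briefly. The following is a minimal illustration on synthetic landmark data, not the authors' implementation: ordinary Procrustes alignment (translation, rotation, scale via SVD) is iterated against a running mean to form a shape template, as GPA does before the thin-plate-spline warp.

```python
import numpy as np

def procrustes_align(X, Y):
    """Ordinary Procrustes analysis: optimally translate, rotate and
    scale landmark set Y onto X.  This is the step that generalized
    Procrustes analysis (GPA) iterates to build a mean shape template."""
    muX, muY = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - muX, Y - muY
    U, s, Vt = np.linalg.svd(Yc.T @ Xc)         # cross-covariance SVD
    Q = U @ Vt                                  # optimal rotation
    scale = s.sum() / (Yc ** 2).sum()           # optimal scaling
    return scale * Yc @ Q + muX

def gpa_mean_shape(shapes, iters=10):
    """Generalized Procrustes analysis: repeatedly align every shape to
    the running mean, then update the mean."""
    mean = shapes[0]
    for _ in range(iters):
        aligned = [procrustes_align(mean, shp) for shp in shapes]
        mean = np.mean(aligned, axis=0)
    return mean

# toy data: noisy rotated/scaled copies of a synthetic 3D landmark set
rng = np.random.default_rng(0)
base = rng.normal(size=(20, 3))
shapes = []
for _ in range(5):
    th = rng.uniform(0.0, np.pi)
    Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0, 0.0, 1.0]])
    shapes.append(1.5 * base @ Rz.T + rng.normal(scale=0.01, size=(20, 3)))
template = gpa_mean_shape(shapes)
```

In the study, the resulting mean shape would then be warped by thin plate splines to match a subject's 2D radiograph; that warping step is not shown here.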

Relevance: 20.00%

Abstract:

In this paper, a generic decoupled image-based control scheme for calibrated cameras obeying the unified projection model is proposed. The proposed decoupled scheme is based on the surface of object projections onto the unit sphere. Such features are invariant to rotational motions, which allows the translational motion to be controlled independently of the rotational motion. Finally, the proposed scheme is validated with experiments using a classical perspective camera as well as a fisheye camera mounted on a 6-DOF robot platform.

Relevance: 20.00%

Abstract:

This paper proposes a generic decoupled image-based control scheme for cameras obeying the unified projection model. The scheme is based on the spherical projection model. Invariants to rotational motion are computed from this projection and used to control the translational degrees of freedom. Importantly, we form invariants that decrease the sensitivity of the interaction matrix to object depth variation. Finally, the proposed scheme is validated with experiments using a classical perspective camera as well as a fisheye camera mounted on a 6-DOF robotic platform.
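
Both image-based schemes above rest on the unified projection model's spherical projection. The sketch below illustrates that model with assumed parameter values (the mirror parameter `xi` and the test points are illustrative), and shows one simple rotation-invariant feature of spherical projections, the inner product between projected points; this is an illustration of the invariance principle, not necessarily the surface-based invariant used in the papers.

```python
import numpy as np

def project_to_sphere(P):
    """First step of the unified projection model: project a 3D point
    onto the unit sphere centred on the single viewpoint."""
    P = np.asarray(P, dtype=float)
    return P / np.linalg.norm(P)

def unified_image_point(P, xi):
    """Second step: shift the viewpoint along the axis by the mirror
    parameter xi, then apply a perspective projection.  xi = 0 recovers
    the classical pinhole camera; larger xi models fisheye/catadioptric
    lenses (the value here is an illustrative choice)."""
    Ps = project_to_sphere(P)
    return Ps[:2] / (Ps[2] + xi)

p_persp = unified_image_point([1.0, 2.0, 4.0], xi=0.0)

# rotational invariance: the angle between two spherical projections is
# unchanged by a rotation of the camera frame, which is what allows
# translation to be controlled independently of rotation
th = 0.3
Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
               [np.sin(th),  np.cos(th), 0.0],
               [0.0, 0.0, 1.0]])
a, b = np.array([1.0, 0.0, 2.0]), np.array([0.0, 1.0, 2.0])
inv_before = project_to_sphere(a) @ project_to_sphere(b)
inv_after = project_to_sphere(Rz @ a) @ project_to_sphere(Rz @ b)
```

With `xi = 0` the image point of `[1, 2, 4]` reduces to the pinhole projection `[0.25, 0.5]`, confirming that the perspective camera is a special case of the model.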

Relevance: 20.00%

Abstract:

Cognitive load theory was used to generate a series of three experiments investigating the effects of various worked example formats on learning orthographic projection. Experiments 1 and 2 investigated the benefits of presenting problems, conventional worked examples incorporating the final 2-D and 3-D representations only, and modified worked examples with several intermediate stages of rotation between the 2-D and 3-D representations. Modified worked examples proved superior to conventional worked examples without intermediate stages, while conventional worked examples were, in turn, superior to problems. Experiment 3 investigated the consequences of varying the number and location of intermediate stages in the rotation trajectory and found three stages to be superior to one; a single intermediate stage was superior when placed nearer the 2-D than the 3-D end of the trajectory. It was concluded that (a) orthographic projection is learned best using worked examples with several intermediate stages and (b) a linear relation between angle of rotation and problem difficulty did not hold for orthographic projection material. Cognitive load theory could be used to suggest the ideal location of the intermediate stages.

Relevance: 20.00%

Abstract:

Use of ball projection machines in the acquisition of interceptive skill has recently been questioned. The use of projection machines in developmental and elite fast ball sports programmes is not a trivial issue, since they play a crucial role in reducing injury incidence in players and coaches. A compelling challenge for sports science is to provide theoretical principles to guide how and when projection machines might be used for acquisition of ball skills and preparation for competition in developmental and elite sport performance programmes. Here, we propose how principles from an ecological dynamics theoretical framework could be adopted by sports scientists, pedagogues and coaches to underpin the design of interventions, practice and training tasks, including the use of hybrid video-projection technologies. The assessment of representative learning design during practice may provide ways to optimize developmental programmes in fast ball sports and inform the principled use of ball projection machines.

Relevance: 20.00%

Abstract:

Biochemical reactions underlying genetic regulation are often modelled as a continuous-time, discrete-state Markov process, and the evolution of the associated probability density is described by the so-called chemical master equation (CME). However, the CME is typically difficult to solve, since the state space involved can be very large or even countably infinite. Recently, a finite state projection (FSP) method that truncates the state space was suggested and shown to be effective on a model of the Pap-pili epigenetic switch. In that example, however, both the model and the final time at which the solution was computed were relatively small. Presented here is a Krylov FSP algorithm based on a combination of state-space truncation and inexact matrix-vector product routines. This allows larger-scale models to be studied and solutions for larger final times to be computed in a realistic execution time. Additionally, the new method computes the solution at intermediate times at virtually no extra cost, since it is derived from Krylov-type methods for computing matrix exponentials. For the purpose of comparison, the new algorithm is applied to the model of the Pap-pili epigenetic switch on which the original FSP was first demonstrated. The method is also applied to a more sophisticated model of regulated transcription. Numerical results indicate that the new approach is significantly faster and extendable to larger biological models.
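
The FSP idea can be sketched on a toy birth-death model of transcription. The rates, truncation size, and the plain scaled-Taylor matrix exponential below are illustrative stand-ins (the paper's contribution is a Krylov-subspace routine with inexact matrix-vector products, which this sketch does not implement); the point is that after truncation the CME becomes a finite linear ODE dp/dt = A p, solved by the action of a matrix exponential, and the lost probability mass 1 - sum(p) bounds the truncation error.

```python
import numpy as np

def cme_generator(k_prod=5.0, k_deg=1.0, n_max=40):
    """FSP-truncated CME generator A (dp/dt = A @ p) for a birth-death
    model of transcript copy number n = 0..n_max.  Rates and the
    truncation size are illustrative, not taken from the paper."""
    n = n_max + 1
    A = np.zeros((n, n))
    for i in range(n):
        if i < n_max:
            A[i + 1, i] = k_prod               # production n -> n+1
        if i > 0:
            A[i - 1, i] = k_deg * i            # degradation n -> n-1
        # the diagonal keeps the full outflow, so probability heading
        # past the boundary leaks out: 1 - p.sum() is the FSP error bound
        A[i, i] = -(k_prod + k_deg * i)
    return A

def expm_action(A, p0, t):
    """p(t) = expm(A t) @ p0 via scaling plus a truncated Taylor series,
    a plain stand-in for the inexact Krylov routines in the abstract."""
    s = max(1, int(np.ceil(np.linalg.norm(A, 1) * t)))
    B = A * (t / s)
    p = p0.astype(float)
    for _ in range(s):
        term, acc = p.copy(), p.copy()
        for j in range(1, 60):
            term = B @ term / j
            acc = acc + term
            if np.abs(term).max() < 1e-16:
                break
        p = acc
    return p

A = cme_generator()
p0 = np.zeros(A.shape[0]); p0[0] = 1.0          # start at zero transcripts
pt = expm_action(A, p0, t=10.0)
fsp_error = 1.0 - pt.sum()                      # mass lost past the boundary
```

For these rates the distribution relaxes towards a Poisson-like steady state with mean k_prod/k_deg = 5, well inside the truncation, so the FSP error bound stays negligible.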

Relevance: 20.00%

Abstract:

The aim of this work is to develop software capable of back-projecting primary fluence images obtained from EPID measurements through phantom and patient geometries in order to calculate 3D dose distributions. In the first instance, we aim to develop a tool for pre-treatment verification in IMRT. In our approach, a Geant4 application is used to back-project primary fluence values from each EPID pixel towards the source. Each beam is considered to be polyenergetic, with a spectrum obtained from Monte Carlo calculations for the LINAC in question. At each step of the ray-tracing process, the energy differential fluence is corrected for attenuation and beam divergence. Subsequently, the TERMA is calculated and accumulated into an energy differential 3D TERMA distribution. This distribution is then convolved with monoenergetic point spread kernels, generating energy differential 3D dose distributions. The resulting dose distributions are accumulated to yield the total dose distribution, which can then be used for pre-treatment verification of IMRT plans. Preliminary results were obtained for a test EPID image comprising 100 × 100 pixels of unity fluence. Back projection of this field into a 30 cm × 30 cm × 30 cm water phantom was performed, with TERMA distributions obtained in approximately 10 min (running on a single core of a 3 GHz processor). Point spread kernels for monoenergetic photons in water were calculated using a separate Geant4 application. Following convolution and summation, the resulting 3D dose distribution produced familiar build-up and penumbral features. In order to validate the dose model, we will use EPID images recorded without any attenuating material in the beam for a number of MLC-defined square fields. The dose distributions in water will be calculated and compared to TPS predictions.
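
The per-energy-bin ray-tracing step described above can be reduced to a one-dimensional sketch: attenuate the energy fluence exponentially with depth, correct for beam divergence with an inverse-square factor, and multiply by the mass attenuation coefficient to get TERMA. All numerical values below (fluence, attenuation coefficient, source distance) are illustrative, and this monoenergetic, single-ray sketch omits the Geant4 voxel accumulation and the kernel convolution that produce dose.

```python
import numpy as np

def terma_along_ray(psi0, mu_over_rho, rho, ssd, depths):
    """TERMA (total energy released per unit mass) for one energy bin
    along a single back-projected ray: the energy fluence is attenuated
    exponentially, corrected for beam divergence (inverse square), and
    multiplied by the mass attenuation coefficient (cm^2/g)."""
    mu = mu_over_rho * rho                        # linear attenuation, 1/cm
    psi = psi0 * np.exp(-mu * depths) * (ssd / (ssd + depths)) ** 2
    return mu_over_rho * psi

depths = np.linspace(0.0, 30.0, 61)               # 30 cm of water
terma = terma_along_ray(psi0=1.0, mu_over_rho=0.05, rho=1.0,
                        ssd=100.0, depths=depths)
```

In the full pipeline this quantity is accumulated per voxel for every EPID pixel and energy bin, then convolved with the monoenergetic point spread kernels to yield the 3D dose distribution.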

Relevance: 20.00%

Abstract:

Timely and comprehensive scene segmentation is often a critical step for many high-level mobile robotic tasks. This paper examines a projected-area-based neighbourhood lookup approach, motivated by the need for faster unsupervised segmentation of dense 3D point clouds. The proposed algorithm exploits the projection geometry of a depth camera to find nearest neighbours in a time that is independent of the input data size. Points near depth discontinuities are also detected to reinforce object boundaries in the clustering process. The search method presented is evaluated using both indoor and outdoor dense depth images and demonstrates significant improvements in speed and precision compared to the commonly used Fast Library for Approximate Nearest Neighbors (FLANN) [Muja and Lowe, 2009].
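
The core observation can be illustrated briefly: a depth image back-projected through a pinhole model yields an organised point cloud, so a point's spatial neighbours can be read straight off the pixel grid instead of being searched for in a KD-tree. The intrinsics, image size, and depth-jump threshold below are illustrative values, not parameters from the paper.

```python
import numpy as np

def backproject(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project a depth image (metres) into an organised 3D point
    cloud with a pinhole model; the intrinsics are illustrative values
    for a 640x480 depth camera."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)

def grid_neighbours(v, u, h, w, r=1):
    """Neighbours of pixel (v, u) read straight off the image grid:
    O(1) per query regardless of cloud size, unlike a tree search."""
    return [(i, j)
            for i in range(max(0, v - r), min(h, v + r + 1))
            for j in range(max(0, u - r), min(w, u + r + 1))
            if (i, j) != (v, u)]

depth = np.full((480, 640), 2.0)        # flat synthetic 2 m depth image
cloud = backproject(depth)
nbrs = grid_neighbours(240, 320, 480, 640)
# a large depth jump between grid neighbours marks an object boundary
boundary = abs(cloud[240, 320, 2] - cloud[240, 321, 2]) > 0.05
```

The same grid adjacency also gives the depth-discontinuity test for free: comparing the depth of adjacent pixels flags the boundary points that the paper uses to reinforce object edges during clustering.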

Relevance: 20.00%

Abstract:

This practice-led research project investigated the use of digital projection as a compositional tool in live performance. The project was carried out through the creation of a new Australian theatre work called Genesis that poetically integrated digital projection and live performance. The investigation produced a framework for creating powerful theatrical sequences in which the themes and ideas of the show were embedded inside particular performance gestures, prompting an expanded aesthetic perception of the content.

Relevance: 20.00%

Abstract:

Biological factors underlying individual variability in fearfulness and anxiety have important implications for stress-related psychiatric illness, including PTSD and major depression. Using an advanced intercross line (AIL) derived from C57BL/6 and DBA/2J mouse strains and behavioral selection over three generations, we established two lines exhibiting High or Low fear behavior after fear conditioning. Across the selection generations, the two lines showed clear differences in training and in tests of contextual and conditioned fear. Before fear conditioning training, there were no differences between lines in baseline freezing to a novel context. After fear conditioning, however, High line mice demonstrated pronounced freezing in a new context, suggestive of poor context discrimination. Fear generalization was not restricted to contextual fear: High fear mice froze to a novel acoustic stimulus, while freezing in the Low line did not increase over baseline. Enhanced fear learning and generalization are consistent with transgenic and pharmacological disruption of the hypothalamic-pituitary-adrenal (HPA) axis (Brinks, 2009; Thompson, 2004; Kaouane, 2012). To determine whether there were differences in HPA-axis regulation between the lines, morning urine samples were collected to measure basal corticosterone. Levels of secreted corticosterone in the circadian trough were analyzed by corticosterone ELISA. High fear mice were found to have higher basal corticosterone levels than Low line animals. Examination of hormonal stress response components by qPCR revealed increased expression of CRH mRNA and decreased mRNA for MR and CRHR1 in the hypothalamus of High fear mice. These alterations may contribute to both the behavioral phenotype and the higher basal corticosterone in High fear mice. To determine basal brain activity in vivo in High and Low fear mice, we used manganese-enhanced magnetic resonance imaging (MEMRI). Analysis revealed a pattern of basal brain activity, comprising amygdala, cortical and hippocampal circuits, that was elevated in the High line. Ongoing studies also seek to determine the relative balance of excitatory and inhibitory tone in the amygdala and hippocampus, and the structure of neurons within these regions. While these heterogeneous lines were selected on fear memory expression, HPA-axis alterations and differences in hippocampal activity segregate with the behavioral phenotypes. These differences are detectable in a basal state, strongly suggesting that these are biological traits underlying the behavioral phenotype (Johnson et al., 2011).

Relevance: 20.00%

Abstract:

Domain-invariant representations are key to addressing the domain shift problem, where the training and test examples follow different distributions. Existing techniques that have attempted to match the distributions of the source and target domains typically compare these distributions in the original feature space. This space, however, may not be directly suitable for such a comparison, since some of the features may have been distorted by the domain shift, or may be domain specific. In this paper, we introduce a Domain Invariant Projection approach: an unsupervised domain adaptation method that overcomes this issue by extracting the information that is invariant across the source and target domains. More specifically, we learn a projection of the data to a low-dimensional latent space where the distance between the empirical distributions of the source and target examples is minimized. We demonstrate the effectiveness of our approach on the task of visual object recognition and show that it outperforms state-of-the-art methods on a standard domain adaptation benchmark dataset.
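
The distribution-distance criterion behind such a projection can be sketched on synthetic data. The sketch below only evaluates a linear-kernel maximum mean discrepancy for two fixed projections; the actual method learns the projection by optimisation (and typically uses a richer kernel), so treat everything here, including the data and the choice of projections, as illustrative.

```python
import numpy as np

def linear_mmd(Xs, Xt, W):
    """Squared maximum mean discrepancy (linear kernel) between
    projected source and target samples -- the kind of distribution
    distance a domain-invariant projection seeks to minimise over W."""
    d = (Xs @ W).mean(axis=0) - (Xt @ W).mean(axis=0)
    return float(d @ d)

rng = np.random.default_rng(1)
Xs = rng.normal(size=(500, 3))              # source-domain features
Xt = rng.normal(size=(500, 3))              # target drawn the same way...
Xt[:, 2] += 5.0                             # ...but feature 2 is distorted
W_all = np.eye(3)                           # keep every feature
W_inv = np.eye(3)[:, :2]                    # project out the shifted feature
```

Comparing `linear_mmd(Xs, Xt, W_all)` with `linear_mmd(Xs, Xt, W_inv)` shows the effect the paper exploits: discarding the domain-specific dimension collapses the measured distribution distance, so minimising this distance over projections recovers the domain-invariant subspace.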

Relevance: 10.00%

Abstract:

With the accelerating trend of global warming, the thermal behavior of existing buildings, which were typically designed based on current weather data, may not be able to cope with the future climate. This paper quantifies, through computer simulations, the increased cooling loads imposed by potential global warming and the probable indoor temperature increases due to possibly undersized air-conditioning systems. It is found from the sample office building examined that existing buildings would generally be able to adapt to the increased warmth of the 2030 Low and High scenario projections and the 2070 Low scenario projection. For the 2070 High scenario, however, the study indicates that existing office buildings in all capital cities except Hobart will suffer from overheating problems. When the annual average temperature increase exceeds 2°C, the risk of current office buildings being subject to overheating increases significantly. For existing buildings designed for current climate conditions, it is shown that there is a nearly linear correlation between the increase of average external air temperature and the increase of building cooling load. For new buildings, in which possible global warming has been taken into account in the design, a 28-59% increase of cooling capacity under the 2070 High scenario would be required to improve the building thermal comfort level to an acceptable standard.

Relevance: 10.00%

Abstract:

The occurrence and levels of airborne polycyclic aromatic hydrocarbons and volatile organic compounds in selected non-industrial environments in Brisbane have been investigated as part of an integrated indoor air quality assessment program. The most abundant and most frequently encountered compounds include nonanal, decanal, texanol, phenol, 2-ethyl-1-hexanol, ethanal, naphthalene, 2,6-di-tert-butyl-4-methylphenol (BHT), salicylaldehyde, toluene, hexanal, benzaldehyde, styrene, ethyl benzene, o-, m- and p-xylenes, benzene, n-butanol, 1,2-propanediol, and n-butyl acetate. Many of the 64 compounds usually included in the European Collaborative Action method of TVOC analysis were below detection limits in the samples analysed. In order to extract the maximum amount of information from the data collected, multivariate data projection methods have been employed. The implications of the extracted information for source identification and exposure control are discussed.
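
The abstract does not name the specific projection method used. Principal component analysis is the most common multivariate data projection for exploring sample-by-compound concentration matrices, so a minimal sketch under that assumption (with a purely hypothetical data matrix) is given here: samples are mean-centred and projected onto the leading principal components, where clusters can hint at common emission sources.

```python
import numpy as np

def pca_scores(X, k=2):
    """Project mean-centred samples onto the first k principal
    components -- one common multivariate data projection method for
    exploratory analysis of concentration data."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(2)
# hypothetical matrix: 30 air samples x 6 compound concentrations
X = rng.lognormal(size=(30, 6))
scores = pca_scores(X)
```

Samples that group together in the score plot share a concentration profile, which is the kind of structure used for source identification in the study.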

Relevance: 10.00%

Abstract:

The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of global warming on the built environment, the use of building simulation techniques together with forecast weather data is often necessary. Since all building simulation programs require hourly meteorological input data for thermal comfort and energy evaluation, the provision of suitable weather data becomes critical. Based on a review of existing weather data generation models, this paper presents an effective method to generate approximate future hourly weather data suitable for studying the impact of global warming. Depending on the level of information available for the prediction of future weather conditions, it is shown that the method of retaining the current level, the constant offset method, or the diurnal modelling method may be used to generate the future hourly variation of an individual weather parameter. An example of the application of this method to the different global warming scenarios in Australia is presented. Since there is no reliable projection of possible changes in air humidity, solar radiation or wind characteristics, as a first approximation these parameters have been assumed to remain at the current level. A sensitivity test of their impact on building energy performance shows that there is generally a good linear relationship between building cooling load and changes in solar radiation, relative humidity or wind speed.
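
The two simpler generation methods named above can be sketched directly. The constant offset method shifts every hour of a current-climate record by the projected average warming; a diurnal variant applies different offsets by time of day. The synthetic temperature series, the offset values, and the 6am-6pm day/night split below are all illustrative assumptions, not the paper's data.

```python
import numpy as np

def constant_offset(hourly_temp, delta):
    """Constant offset method: shift every hour of the current-climate
    record by the projected average warming delta (in °C)."""
    return np.asarray(hourly_temp, dtype=float) + delta

def diurnal_offset(hourly_temp, day_delta, night_delta):
    """Simplified diurnal modelling method: apply different offsets to
    daytime and night-time hours (the 6am-6pm split is illustrative)."""
    t = np.asarray(hourly_temp, dtype=float)
    hour = np.arange(t.size) % 24
    day = (hour >= 6) & (hour < 18)
    return t + np.where(day, day_delta, night_delta)

# two days of synthetic hourly temperatures with a diurnal cycle
current = 20.0 + 5.0 * np.sin(2 * np.pi * np.arange(48) / 24.0)
future = constant_offset(current, delta=2.0)
future_d = diurnal_offset(current, day_delta=2.5, night_delta=1.5)
```

Either adjusted series can then be fed to a building simulation program in place of the current-climate hourly record; the diurnal variant preserves the same average warming while reshaping the daily cycle.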