995 results for "3D integration"


Relevance: 20.00%

Abstract:

The impact of navigator spatial resolution and navigator evaluation time on image quality in free-breathing, navigator-gated 3D coronary magnetic resonance angiography (MRA) with real-time motion correction was investigated in a moving phantom. The objective image quality parameters signal-to-noise ratio (SNR) and vessel sharpness were compared. It was found that a short navigator evaluation time is of crucial importance for improved image quality, whereas navigator spatial resolution had minimal influence on image quality.
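The objective metrics named above can be computed from image regions of interest. A minimal sketch follows; the ROI values, the 20%-to-80% edge-width definition of sharpness, and all array contents are illustrative assumptions, not the study's actual measurement protocol:

```python
import numpy as np

# SNR: mean signal in a vessel ROI divided by the standard deviation
# of a background (noise) ROI -- one common magnitude-image convention.
def snr(signal_roi, noise_roi):
    return signal_roi.mean() / noise_roi.std()

# Vessel sharpness: inverse of the distance (in pixels) over which an
# edge profile rises from 20% to 80% of its plateau intensity.
def vessel_sharpness(positions, edge_profile):
    lo = np.interp(0.2, edge_profile, positions)  # edge_profile must be increasing
    hi = np.interp(0.8, edge_profile, positions)
    return 1.0 / (hi - lo)

rng = np.random.default_rng(0)
sig = np.full(100, 50.0)             # synthetic vessel ROI intensities
noise = rng.normal(0.0, 5.0, 100)    # synthetic background ROI
print(snr(sig, noise))

pos = np.arange(5, 16)               # pixel positions across an edge
ramp = np.linspace(0.0, 1.0, 11)     # linear edge rising over 10 px
print(vessel_sharpness(pos, ramp))   # 20%->80% spans 6 px, so 1/6
```

A sharper vessel edge shrinks the 20%-to-80% distance and raises the sharpness score, which is why the metric is sensitive to residual motion blur.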

Relevance: 20.00%

Abstract:

Games are powerful and engaging. On average, one billion people spend at least one hour a day playing computer and video games, and this is even truer of the younger generations. Our students have become the "digital natives", the "gamers", the "virtual generation". Research shows that those who are most at risk of failure in the traditional classroom setting also spend more time than their counterparts playing video games. They might thrive, given a different learning environment. Educators have a responsibility to align their teaching style with the learning styles of these younger generations. However, many academics resist the use of computer-assisted learning that has been "created elsewhere". This can be extrapolated to game-based teaching: even if educational games were more widely authored, their adoption would still be limited to the educators who feel a match between the authored games and their own beliefs and practices. Consequently, game-based teaching would be much more widespread if teachers could develop their own games, or at least customize them. Yet the development and customization of teaching games are complex and costly. This research uses a design science methodology, leveraging gamification techniques, active and cooperative learning theories, and immersive sandbox 3D virtual worlds, to develop a method that allows management instructors to transform any off-the-shelf case study into an engaging, collaborative, gamified experience. The method is applied to marketing case studies and uses the sandbox virtual world of Second Life.

Relevance: 20.00%

Abstract:

Executive Summary. The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields of economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.

Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation intended to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.

Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than the realized returns of portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to a single measure such as the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, or the sequence of expected shortfalls for a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures was above the one corresponding to each individual measure, we were led to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those from virtually all individual performance measures considered.

Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in any attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
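The two dominance checks used in Chapter 2, a Kolmogorov-Smirnov test followed by pointwise comparison of absolute Lorenz curves, are mechanical to reproduce. A minimal sketch on synthetic return series (the data, the uniform shift standing in for the "aggregated" strategy, and the 5% level are illustrative assumptions, not the chapter's empirical setup):

```python
import numpy as np
from scipy.stats import ks_2samp

# Absolute Lorenz curve: cumulative sum of sorted returns divided by n,
# i.e. the sequence of (signed) expected shortfalls over all quantiles.
def absolute_lorenz(returns):
    q = np.sort(returns)
    return np.cumsum(q) / len(q)

rng = np.random.default_rng(1)
base = rng.normal(0.005, 0.02, 1000)   # "single-measure" strategy returns
aggregated = base + 0.01               # hypothetical aggregated-measure returns

# Step 1: are the two distributions different at all?
stat, pvalue = ks_2samp(aggregated, base)

# Step 2: second-order stochastic dominance holds if the absolute Lorenz
# curve of the dominating series lies pointwise at or above the other.
sosd = np.all(absolute_lorenz(aggregated) >= absolute_lorenz(base))
print(pvalue < 0.05, sosd)
```

A uniform upward shift dominates by construction; on real portfolio returns the curves can cross, in which case neither ordering holds.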

Relevance: 20.00%

Abstract:

Crop-livestock integration represents an interesting alternative for soil management, especially in regions where maintaining cover crops in no-tillage systems is difficult. The objective of this study was to evaluate soil physical and chemical properties, based on the hypothesis that a well-managed crop-livestock integration system improves soil quality and stabilizes the system. The experiment was set up in a completely randomized design with five replications. The treatments were arranged in a 6 × 4 factorial design to assess five crop rotation systems under crop-livestock integration, plus native forest as a reference for soil undisturbed by agriculture, in four layers (0.00-0.05, 0.05-0.10, 0.10-0.15, and 0.15-0.20 m). The crop rotation systems under crop-livestock integration promoted changes in soil physical and chemical properties, and the effects of the different systems were detected mainly in the surface layer. The crops in integrated crop-livestock systems maintained soil carbon at levels equal to those of the native forest, demonstrating the efficiency of these systems in terms of soil conservation. The systems influenced environmental stability positively; among the soil quality indicators, mineral-associated organic matter was best related to aggregate stability.

Relevance: 20.00%

Abstract:

We provide analytical evidence of stochastic resonance in polarization-switching vertical-cavity surface-emitting lasers (VCSELs). We describe the VCSEL by a two-mode stochastic rate equation model and apply a multiple-time-scale analysis. We are able to reduce the dynamical description to a single stochastic differential equation, which is the starting point of the analytical study of stochastic resonance. We compare our results with numerical simulations of the original rate equations, validating the use of multiple-time-scale analysis on stochastic equations as an analytical tool.
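The reduction described above ends in a single stochastic differential equation. As a hedged illustration of the kind of object involved — a generic double-well SDE with weak periodic forcing, the textbook setting for stochastic resonance, not the authors' reduced VCSEL equation — an Euler-Maruyama integration looks like:

```python
import numpy as np

# Euler-Maruyama integration of dx = (x - x^3 + A*sin(w*t)) dt + sigma dW,
# a standard bistable model in which stochastic resonance is studied.
# The parameters A, w, sigma and the time grid are illustrative values.
def euler_maruyama(x0, A, w, sigma, dt, n_steps, rng):
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        t = i * dt
        drift = x[i] - x[i] ** 3 + A * np.sin(w * t)
        x[i + 1] = x[i] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

rng = np.random.default_rng(2)
path = euler_maruyama(x0=1.0, A=0.3, w=0.05, sigma=0.4, dt=0.01,
                      n_steps=20000, rng=rng)

# Count crossings between the two wells (sign changes of x); at the
# resonant noise level these crossings synchronize with the forcing.
switches = np.count_nonzero(np.diff(np.sign(path)) != 0)
print(switches)
```

Sweeping sigma and plotting the switching statistics against noise strength is how the resonance (a maximum at intermediate noise) is usually exhibited numerically.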

Relevance: 20.00%

Abstract:

TCRep 3D is an automated, systematic approach for TCR-peptide-MHC class I structure prediction, based on homology and ab initio modeling. It has been considerably generalized from former studies to be applicable to large TCR repertoires. First, the locations of the complementarity-determining regions (CDRs) of the target sequences are automatically identified by a sequence alignment strategy against a database of TCR Vα and Vβ chains. A structure-based alignment ensures automated identification of CDR3 loops. The CDRs are then modeled in the environment of the complex, in an ab initio approach based on a simulated annealing protocol. During this step, dihedral restraints are applied to drive the CDR1 and CDR2 loops towards their canonical conformations, as described by Al-Lazikani et al. We developed a new automated algorithm that determines additional restraints to iteratively converge towards TCR conformations that make frequent hydrogen bonds with the pMHC. We demonstrated that our approach outperforms popular scoring methods (Anolea, Dope and Modeller) in predicting relevant CDR conformations. Finally, this modeling approach has been successfully applied to experimentally determined sequences of TCRs that recognize the NY-ESO-1 cancer-testis antigen. This analysis revealed a mechanism of TCR selection through the presence of a single conserved amino acid in all CDR3β sequences. The important structural modifications predicted in silico, and the associated dramatic loss of experimental binding affinity upon mutation of this amino acid, show the good correspondence between the predicted structures and their biological activities. To our knowledge, this is the first systematic approach developed for large-scale structural modeling of TCR repertoires.
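CDR3 identification conventionally exploits the conserved anchor residues that bracket the loop: a cysteine at the end of the V region and the FGXG motif of the J region. As a toy illustration only — this regex, and the sequence below, are assumptions for the example, not TCRep 3D's structure-based alignment:

```python
import re

# Locate a CDR3-like segment: the stretch starting at the conserved
# cysteine and ending just before the FGXG motif of the J region.
def find_cdr3(v_region):
    match = re.search(r"C[A-Z]+?(?=FG[A-Z]G)", v_region)
    return match.group(0) if match else None

# Hypothetical TCR beta-chain fragment, for illustration only.
seq = "AVYLCASSIRSSYEQYFGPGTRLTVT"
print(find_cdr3(seq))  # -> CASSIRSSYEQY
```

Real pipelines handle multiple cysteines, degenerate J motifs, and numbering schemes; a lone regex is only the skeleton of the idea.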

Relevance: 20.00%

Abstract:

Two-dimensional (2D) breath-hold coronary magnetic resonance angiography (MRA) has been shown to be a fast and reliable method to depict the proximal coronary arteries. Recent developments, however, allow for free-breathing, navigator-gated, and navigator-corrected three-dimensional (3D) coronary MRA. These 3D approaches have the potential for improved signal-to-noise ratio (SNR) and allow the acquisition of adjacent thin slices without the misregistration problems known from 2D approaches. Still, a major impediment of a 3D acquisition is the increased scan time. The purpose of this study was the implementation of a free-breathing, navigator-gated and -corrected, ultra-fast 3D coronary MRA technique that allows for scan times of less than 5 minutes. Twelve healthy adult subjects were examined in the supine position using a navigator-gated and -corrected, ECG-triggered, ultra-fast 3D interleaved gradient-echo planar imaging sequence (TFE-EPI). A 3D slab consisting of 20 slices with a reconstructed slice thickness of 1.5 mm was acquired during free breathing. The diastolic TFE-EPI acquisition block was preceded by a T2prep pre-pulse, a diaphragmatic navigator pulse, and a fat-suppression pre-pulse. With a TR of 19 ms and an effective TE of 5.4 ms, the duration of the data acquisition window was 38 ms. The in-plane spatial resolution was 1.0-1.3 mm × 1.5-1.9 mm. In all cases, the entire left main (LM) and extensive portions of the left anterior descending (LAD) and right coronary artery (RCA) could be visualized, with an average scan time for the entire 3D volume data set of 2:57 +/- 0:51 minutes. The average contiguous vessel length visualized was 53 +/- 11 mm (range: 42 to 75 mm) for the LAD and 84 +/- 14 mm (range: 62 to 112 mm) for the RCA. The contrast-to-noise ratio between coronary blood and myocardium was 5.0 +/- 2.3 for the LM/LAD and 8.0 +/- 2.9 for the RCA, resulting in excellent suppression of myocardium.
We present a new approach for free-breathing 3D coronary MRA that achieves scan times comparing favorably with corresponding 2D coronary MRA approaches while taking advantage of the enhanced SNR of 3D acquisitions and the post-processing benefits of thin adjacent slices. The robust image quality and the short average scan time suggest that this approach may be useful for screening the major coronary arteries or identifying anomalous coronary arteries. J. Magn. Reson. Imaging 1999;10:821-825.

Relevance: 20.00%

Abstract:

RATIONALE AND OBJECTIVES: The purpose of this study was to investigate the impact of real-time adaptive motion correction on image quality in navigator-gated, free-breathing, double-oblique, three-dimensional (3D), submillimeter right coronary magnetic resonance angiography (MRA). MATERIALS AND METHODS: Free-breathing 3D right coronary MRA with real-time navigator technology was performed in 10 healthy adult subjects with an in-plane spatial resolution of 700 × 700 µm. Identical double-oblique coronary MR angiograms were acquired with navigator gating alone and with combined navigator gating and real-time adaptive motion correction. The quantitative objective parameters contrast-to-noise ratio (CNR) and vessel sharpness, as well as subjective image quality scores, were compared. RESULTS: Superior vessel sharpness, increased CNR, and superior image quality scores were found with combined navigator gating and real-time adaptive motion correction (vs. navigator gating alone; P < 0.01 for all comparisons). CONCLUSION: Real-time adaptive motion correction objectively and subjectively improves image quality in 3D navigator-gated, free-breathing, double-oblique, submillimeter right coronary MRA.

Relevance: 20.00%

Abstract:

We use cryo-electron microscopy to compare the 3D shapes of 158-bp DNA minicircles that differ only in the sequence within an 18-bp block containing either a TATA box or a catabolite activator protein binding site. We present a sorting algorithm that correlates the reconstructed shapes and groups them into distinct categories. We conclude that the presence of the TATA box sequence, which is believed to be easily bent, does not significantly affect the observed shapes.
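A correlation-based grouping of shape descriptors can be sketched as follows; the descriptor vectors, the average-linkage choice, and the two-cluster cut are assumptions for illustration, since the paper's actual sorting algorithm operates on reconstructed 3D shapes:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic "shape descriptor" vectors: two well-separated families.
rng = np.random.default_rng(3)
family_a = rng.normal(0.0, 0.1, (10, 8)) + np.array([1, 0, 1, 0, 1, 0, 1, 0])
family_b = rng.normal(0.0, 0.1, (10, 8)) + np.array([0, 1, 0, 1, 0, 1, 0, 1])
descriptors = np.vstack([family_a, family_b])

# Correlation distance (1 - Pearson r) between every pair of shapes,
# then average-linkage clustering cut into two categories.
dist = pdist(descriptors, metric="correlation")
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(labels)
```

With cleanly separated families the cut recovers the two groups; noisy experimental shapes would need an alignment step before any such pairwise comparison.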

Relevance: 20.00%

Abstract:

In this paper we present a new method to track bone movements in stereoscopic X-ray image series of the knee joint. The method is based on two different X-ray image sets: a rotational series of acquisitions of the still subject knee that will allow the tomographic reconstruction of the three-dimensional volume (model), and a stereoscopic image series of orthogonal projections as the subject performs movements. Tracking the movements of bones throughout the stereoscopic image series means to determine, for each frame, the best pose of every moving element (bone) previously identified in the 3D reconstructed model. The quality of a pose is reflected in the similarity between its simulated projections and the actual radiographs. We use direct Fourier reconstruction to approximate the three-dimensional volume of the knee joint. Then, to avoid the expensive computation of digitally rendered radiographs (DRR) for pose recovery, we reformulate the tracking problem in the Fourier domain. Under the hypothesis of parallel X-ray beams, we use the central-slice-projection theorem to replace the heavy 2D-to-3D registration of projections in the signal domain by efficient slice-to-volume registration in the Fourier domain. Focusing on rotational movements, the translation-relevant phase information can be discarded and we only consider scalar Fourier amplitudes. The core of our motion tracking algorithm can be implemented as a classical frame-wise slice-to-volume registration task. Preliminary results on both synthetic and real images confirm the validity of our approach.
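The Fourier-domain reformulation rests on the central-slice theorem: under parallel beams, the 2D Fourier transform of a projection equals a central slice of the volume's 3D Fourier transform. A minimal numerical check (synthetic random volume and an axis-aligned projection; the actual method registers arbitrarily oriented slices):

```python
import numpy as np

rng = np.random.default_rng(4)
volume = rng.normal(size=(16, 16, 16))  # stand-in for the reconstructed knee volume

# Parallel projection along axis 0 (an idealized orthogonal radiograph).
projection = volume.sum(axis=0)

# Central-slice theorem: the k0 = 0 slice of the 3D spectrum equals
# the 2D spectrum of the projection.
central_slice = np.fft.fftn(volume)[0, :, :]
assert np.allclose(central_slice, np.fft.fft2(projection))
print("central-slice theorem verified on a toy volume")
```

This identity is what lets a 2D radiograph be matched against one slice of the precomputed 3D spectrum instead of rendering a full DRR per candidate pose.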

Relevance: 20.00%

Abstract:

Hippocampal adult neurogenesis results in the continuous formation of new neurons in the adult hippocampus, which participate in learning and memory. Manipulations that increase adult neurogenesis have huge clinical potential in pathologies involving memory loss. Intriguingly, most of the newborn neurons die during their maturation. Thus, increasing newborn neuron survival during maturation may be a powerful way to increase overall adult neurogenesis. The factors governing this neuronal death are still poorly known. In my PhD project, we hypothesized that synaptogenesis and synaptic activity play a role in the survival of newborn hippocampal neurons. We studied three factors potentially involved in the regulation of the synaptic integration of adult-born neurons. First, we used propofol anesthesia to provoke a global increase in the GABAergic activity of the network and evaluated the outcome for newborn neuron synaptic integration, morphological development, and survival. Propofol anesthesia impaired the dendritic maturation and survival of adult-born neurons in an age-dependent manner. Next, we examined the development of astrocytic ensheathment at the synapses formed by newborn neurons, hypothesizing that astrocytes are involved in their synaptic integration. Astrocytic processes ensheathed the synapses of newborn neurons very early in their development, and these processes modulated synaptic transmission onto these cells. Finally, we studied the cell-autonomous effects of the overexpression of synaptic adhesion molecules on the development, synaptic integration, and survival of newborn neurons, and found that manipulating a single adhesion molecule was sufficient to modify synaptogenesis and/or synapse function, and to modify newborn neuron survival.
Together, these results suggest that the activity of the neuronal network, the modulation of glutamate transport by astrocytes, and the synapse formation and activity of the neuron itself may regulate the survival of newborn neurons. Thus, the survival of newborn neurons may depend on their ability to communicate with the network. This knowledge is crucial for finding ways to increase neurogenesis in patients. More generally, understanding how the neurogenic niche works and which factors are important for the generation, maturation, and survival of neurons is fundamental to being able, perhaps one day, to replace neurons in any region of the brain.