Abstract:
Map units directly related to soil-landscape properties are delineated by local soil classes; incorporating farmers' knowledge is therefore essential when automating the procedure. The aim of this study was to map local soil classes by computer-assisted cartography (CAC), using several combinations of topographic attributes derived in a GIS (digital elevation model, aspect, slope, and profile curvature). A decision tree was used to determine the number of topographic attributes required for digital cartography of the local soil classes. The resulting maps were evaluated on two map-quality attributes, defined as the precision and accuracy of the CAC-based maps. The evaluation was carried out in Central Mexico using three maps of local soil classes under contrasting landscape and climatic conditions (desert, temperate, and tropical). In all three areas, the precision of the CAC maps based on elevation as the topographic attribute (56 %) was higher than that of maps based on slope, aspect, and profile curvature. The accuracy of the maps (boundary locations), however, was low (33 %); further research is required to improve this indicator.
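The attribute-selection step described above can be illustrated with a toy information-gain ranking. This is a minimal sketch with made-up soil data, not the study's actual decision-tree implementation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def info_gain(values, labels, threshold):
    """Information gain of splitting the samples at `threshold`."""
    left = [l for v, l in zip(values, labels) if v <= threshold]
    right = [l for v, l in zip(values, labels) if v > threshold]
    if not left or not right:
        return 0.0
    n = len(labels)
    return entropy(labels) - (len(left) / n * entropy(left)
                              + len(right) / n * entropy(right))

def best_gain(values, labels):
    """Best single-split information gain over all candidate thresholds."""
    return max(info_gain(values, labels, t) for t in sorted(set(values))[:-1])

# Toy data: the soil class tracks elevation closely, the other attributes weakly.
soil_class = ["A", "A", "A", "B", "B", "B"]
features = {
    "elevation": [100, 120, 140, 300, 320, 340],   # m
    "slope":     [5, 12, 3, 9, 4, 11],             # degrees
    "aspect":    [90, 180, 270, 45, 135, 315],     # degrees
    "curvature": [-0.1, 0.2, 0.0, 0.1, -0.2, 0.05],
}

ranking = sorted(features, key=lambda f: best_gain(features[f], soil_class),
                 reverse=True)
print(ranking[0])  # prints "elevation": it splits the toy classes perfectly
```

In this toy dataset the soil class follows elevation, so a single-split information-gain ranking puts elevation first, mirroring the study's finding that elevation was the most useful topographic attribute.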
Abstract:
The impact of navigator spatial resolution and navigator evaluation time on image quality in free-breathing, navigator-gated 3D coronary magnetic resonance angiography (MRA) with real-time motion correction was investigated in a moving phantom. The objective image-quality parameters signal-to-noise ratio (SNR) and vessel sharpness were compared. It was found that a short navigator evaluation time is of crucial importance for improved image quality, whereas navigator spatial resolution showed minimal influence on image quality.
Abstract:
The uncertainties inherent in experimental differential scanning calorimetry data are evaluated. A new procedure is developed to perform the kinetic analysis of continuous-heating calorimetric data when the heat capacity of the sample changes during crystallization. The accuracy of isothermal calorimetric data is analyzed in terms of the peak-to-peak noise of the calorimetric signal and the baseline drift typical of differential scanning calorimetry equipment. Their influence on the evaluation of the kinetic parameters is discussed. An empirical construction of the time-temperature and temperature-heating-rate transformation diagrams, based on the kinetic parameters, is presented. The method is applied to the kinetic study of the primary crystallization of Te in an amorphous alloy of nominal composition Ga20Te80, obtained by rapid solidification.
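One standard way to extract kinetic parameters from continuous-heating data recorded at several rates is the Kissinger method, which fits ln(β/Tp²) against 1/Tp so that the slope gives −Ea/R. The sketch below uses synthetic peak temperatures, not the authors' data or their specific procedure:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def kissinger_activation_energy(betas, peak_temps):
    """Kissinger plot: fit ln(beta/Tp^2) vs 1/Tp; slope = -Ea/R."""
    xs = [1.0 / T for T in peak_temps]
    ys = [math.log(b / T ** 2) for b, T in zip(betas, peak_temps)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * R  # activation energy, J/mol

# Synthetic peak data generated to be consistent with Ea = 150 kJ/mol
Ea_true, C = 150e3, 20.0
peak_temps = [550.0, 560.0, 570.0, 580.0]  # K
betas = [T ** 2 * math.exp(C - Ea_true / (R * T)) for T in peak_temps]

Ea = kissinger_activation_energy(betas, peak_temps)
print(round(Ea / 1e3, 1))  # recovers 150.0 kJ/mol from the synthetic peaks
```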
Abstract:
Games are powerful and engaging. On average, one billion people spend at least one hour a day playing computer and video games, and this is even more true of the younger generations. Our students have become the "digital natives", the "gamers", the "virtual generation". Research shows that those most at risk of failure in the traditional classroom setting also spend more time than their counterparts playing video games. They might thrive, given a different learning environment. Educators have a responsibility to align their teaching style with these younger generations' learning styles. However, many academics resist the use of computer-assisted learning that has been "created elsewhere". This can be extrapolated to game-based teaching: even if educational games were more widely authored, their adoption would still be limited to the educators who feel a match between the authored games and their own beliefs and practices. Consequently, game-based teaching would be much more widespread if teachers could develop their own games, or at least customize them. Yet the development and customization of teaching games are complex and costly. This research uses a design science methodology, leveraging gamification techniques, active and cooperative learning theories, and immersive sandbox 3D virtual worlds, to develop a method that allows management instructors to transform any off-the-shelf case study into an engaging, collaborative, gamified experience. The method is applied to marketing case studies and uses the sandbox virtual world of Second Life.
Abstract:
We provide analytical evidence of stochastic resonance in polarization-switching vertical-cavity surface-emitting lasers (VCSELs). We describe the VCSEL by a two-mode stochastic rate-equation model and apply a multiple-time-scale analysis, which allows us to reduce the dynamical description to a single stochastic differential equation; this equation is the starting point of the analytical study of stochastic resonance. We compare our results with numerical simulations of the original rate equations, validating the use of multiple-time-scale analysis on stochastic equations as an analytical tool.
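Stochastic resonance of the kind analyzed here is commonly illustrated with a periodically forced bistable stochastic differential equation. The following Euler-Maruyama sketch uses a generic double-well model, not the VCSEL rate equations: with subthreshold forcing, noise is what enables switching between the two states.

```python
import math
import random

def simulate(sigma, amp=0.2, omega=0.05, dt=0.01, steps=200_000, seed=1):
    """Euler-Maruyama integration of a forced bistable SDE:
       dx = (x - x^3 + amp*sin(omega*t)) dt + sigma dW."""
    rng = random.Random(seed)
    x, t, xs = -1.0, 0.0, []
    sqdt = math.sqrt(dt)
    for _ in range(steps):
        drift = x - x ** 3 + amp * math.sin(omega * t)
        x += drift * dt + sigma * sqdt * rng.gauss(0.0, 1.0)
        t += dt
        xs.append(x)
    return xs

def switches(xs):
    """Count crossings between the two wells (sign changes of x)."""
    signs = [1 if x > 0 else -1 for x in xs]
    return sum(a != b for a, b in zip(signs, signs[1:]))

# amp = 0.2 is below the deterministic switching threshold (~0.385),
# so without noise the trajectory stays trapped in the starting well.
n_quiet = switches(simulate(sigma=0.0))
n_noisy = switches(simulate(sigma=0.5))
print(n_quiet, n_noisy)  # 0 switches without noise; many with noise
```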
Abstract:
A systematic assessment of global neural network connectivity through direct electrophysiological assays has remained technically infeasible, even in simpler systems like dissociated neuronal cultures. We introduce an improved algorithmic approach based on Transfer Entropy to reconstruct structural connectivity from network activity monitored through calcium imaging. In this study we focus on the inference of excitatory synaptic links. Being grounded in information theory, our method requires no prior assumptions on the statistics of neuronal firing and neuronal connections. The performance of our algorithm is benchmarked on surrogate time series of calcium fluorescence generated by the simulated dynamics of a network with known ground-truth topology. We find that the functional network topology revealed by Transfer Entropy depends qualitatively on the time-dependent dynamic state of the network (bursting or non-bursting). Thus, by conditioning on the global mean activity, we improve the performance of our method, which allows us to restrict the analysis to specific dynamical regimes of the network in which the inferred functional connectivity is shaped by monosynaptic excitatory connections rather than by collective synchrony. Our method can discriminate between actual causal influences between neurons and spurious non-causal correlations due to light-scattering artifacts, which inherently affect the quality of fluorescence imaging. Compared to other reconstruction strategies, such as cross-correlation or Granger causality methods, our method based on improved Transfer Entropy is remarkably more accurate. In particular, it provides a good estimation of the excitatory network clustering coefficient, allowing for discrimination between weakly and strongly clustered topologies.
Finally, we demonstrate the applicability of our method to analyses of real recordings of in vitro disinhibited cortical cultures where we suggest that excitatory connections are characterized by an elevated level of clustering compared to a random graph (although not extreme) and can be markedly non-local.
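A minimal version of the Transfer Entropy estimator underlying this approach, for binary (spiking/non-spiking) time series with one-step histories, can be sketched as follows. The signals are synthetic and the estimator is a bare-bones plug-in illustration, not the improved, conditioned algorithm of the study:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in TE from x to y (bits), with one-step histories:
       sum over (y1, y0, x0) of p(y1,y0,x0) * log2( p(y1|y0,x0) / p(y1|y0) )."""
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles_y = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y1 | y0, x0)
        p_cond_hist = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y1 | y0)
        te += (c / n) * math.log2(p_cond_full / p_cond_hist)
    return te

rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(10_000)]  # driver spike train
y = [0] + x[:-1]                                # y copies x with a one-step delay
z = [rng.randint(0, 1) for _ in range(10_000)]  # independent train

te_xy = transfer_entropy(x, y)  # ~1 bit: x fully determines y's next state
te_zy = transfer_entropy(z, y)  # ~0 bits: no causal influence
print(round(te_xy, 2), round(te_zy, 2))
```

The asymmetry is the point: TE(x→y) is large because x's past predicts y's future beyond y's own past, while TE(z→y) is near zero, which a symmetric measure such as cross-correlation of independent noise would also give, but without directionality.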
Abstract:
TCRep 3D is an automated, systematic approach for TCR-peptide-MHC class I structure prediction, based on homology and ab initio modeling. It has been considerably generalized from former studies to be applicable to large repertoires of TCR. First, the locations of the complementarity-determining regions (CDR) of the target sequences are automatically identified by a sequence alignment strategy against a database of TCR Vα and Vβ chains. A structure-based alignment ensures automated identification of CDR3 loops. The CDR are then modeled in the environment of the complex, in an ab initio approach based on a simulated annealing protocol. During this step, dihedral restraints are applied to drive the CDR1 and CDR2 loops towards their canonical conformations, as described by Al-Lazikani et al. We developed a new automated algorithm that determines additional restraints to iteratively converge towards TCR conformations making frequent hydrogen bonds with the pMHC. We demonstrated that our approach outperforms popular scoring methods (Anolea, Dope and Modeller) in predicting relevant CDR conformations. Finally, this modeling approach has been successfully applied to experimentally determined sequences of TCR that recognize the NY-ESO-1 cancer testis antigen. This analysis revealed a mechanism of selection of TCR through the presence of a single conserved amino acid in all CDR3β sequences. The important structural modifications predicted in silico, and the associated dramatic loss of experimental binding affinity upon mutation of this amino acid, show the good correspondence between the predicted structures and their biological activities. To our knowledge, this is the first systematic approach developed for large-scale TCR repertoire structural modeling.
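The ab initio loop modeling step above relies on a simulated annealing protocol. The following is a generic Metropolis simulated-annealing sketch on a toy one-dimensional energy landscape, purely to illustrate the accept/reject-with-cooling idea, not the TCRep 3D implementation or its restraint energies:

```python
import math
import random

def simulated_annealing(energy, x0, step=0.5, t0=2.0,
                        cooling=0.999, iters=20_000, seed=0):
    """Generic Metropolis simulated annealing on one scalar coordinate:
       always accept downhill moves, accept uphill moves with
       probability exp(-dE / T), and cool T geometrically."""
    rng = random.Random(seed)
    x, e, temp = x0, energy(x0), t0
    best_x, best_e = x, e
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        ce = energy(cand)
        if ce < e or rng.random() < math.exp((e - ce) / temp):
            x, e = cand, ce
            if e < best_e:
                best_x, best_e = x, e
        temp *= cooling
    return best_x, best_e

# Toy rugged landscape: quadratic well near x = 2 plus an oscillatory term
# that creates local minima for the annealer to escape.
energy = lambda x: (x - 2) ** 2 + 0.3 * math.sin(8 * x)
x_min, e_min = simulated_annealing(energy, x0=-5.0)
print(round(x_min, 2), round(e_min, 3))
```

Started far from the well (x0 = -5), the high initial temperature lets the walker cross the oscillatory barriers; as the temperature decays, it settles into the global basin near x = 2.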
Abstract:
Two-dimensional (2D) breath-hold coronary magnetic resonance angiography (MRA) has been shown to be a fast and reliable method to depict the proximal coronary arteries. Recent developments, however, allow for free-breathing, navigator-gated and navigator-corrected three-dimensional (3D) coronary MRA. These 3D approaches have potential for improved signal-to-noise ratio (SNR) and allow for the acquisition of adjacent thin slices without the misregistration problems known from 2D approaches. Still, a major impediment of a 3D acquisition is the increased scan time. The purpose of this study was the implementation of a free-breathing, navigator-gated and corrected, ultra-fast 3D coronary MRA technique that allows for scan times of less than 5 minutes. Twelve healthy adult subjects were examined in the supine position using a navigator-gated and corrected, ECG-triggered, ultra-fast 3D interleaved gradient echo planar imaging sequence (TFE-EPI). A 3D slab consisting of 20 slices with a reconstructed slice thickness of 1.5 mm was acquired during free breathing. The diastolic TFE-EPI acquisition block was preceded by a T2prep pre-pulse, a diaphragmatic navigator pulse, and a fat suppression pre-pulse. With a TR of 19 ms and an effective TE of 5.4 ms, the duration of the data acquisition window was 38 ms. The in-plane spatial resolution was 1.0-1.3 mm × 1.5-1.9 mm. In all cases, the entire left main (LM) and extensive portions of the left anterior descending (LAD) and right coronary artery (RCA) could be visualized, with an average scan time for the entire 3D-volume data set of 2:57 +/- 0:51 minutes. Average contiguous vessel length visualized was 53 +/- 11 mm (range: 42 to 75 mm) for the LAD and 84 +/- 14 mm (range: 62 to 112 mm) for the RCA. Contrast-to-noise between coronary blood and myocardium was 5.0 +/- 2.3 for the LM/LAD and 8.0 +/- 2.9 for the RCA, resulting in an excellent suppression of myocardium.
We present a new approach for free-breathing 3D coronary MRA, which allows for scan times superior to corresponding 2D coronary MRA approaches, and which takes advantage of the enhanced SNR of 3D acquisitions and the post-processing benefits of thin adjacent slices. The robust image quality and the short average scanning time suggest that this approach may be useful for screening the major coronary arteries or identification of anomalous coronary arteries. J. Magn. Reson. Imaging 1999;10:821-825.
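Contrast-to-noise figures like those reported above are typically computed from region-of-interest (ROI) statistics: the difference of mean tissue intensities normalized by the background noise standard deviation. A minimal sketch with made-up ROI intensities (the specific numbers are illustrative, not from this study):

```python
def contrast_to_noise(blood, myocardium, noise):
    """CNR = (mean blood signal - mean myocardium signal) / SD of noise ROI."""
    mean = lambda xs: sum(xs) / len(xs)
    mu = mean(noise)
    sd = (sum((v - mu) ** 2 for v in noise) / (len(noise) - 1)) ** 0.5
    return (mean(blood) - mean(myocardium)) / sd

# Toy ROI intensity samples (arbitrary units)
blood_roi = [180, 185, 178, 182, 181]   # bright coronary blood pool
myo_roi   = [120, 118, 123, 121, 119]   # suppressed myocardium
noise_roi = [8, 12, 10, 9, 11]          # background air region

cnr = contrast_to_noise(blood_roi, myo_roi, noise_roi)
print(round(cnr, 1))  # → 38.6 for these toy values
```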
Abstract:
RATIONALE AND OBJECTIVES: The purpose of this study was to investigate the impact of real-time adaptive motion correction on image quality in navigator-gated, free-breathing, double-oblique, three-dimensional (3D), submillimeter right coronary magnetic resonance angiography (MRA). MATERIALS AND METHODS: Free-breathing 3D right coronary MRA with real-time navigator technology was performed in 10 healthy adult subjects with an in-plane spatial resolution of 700 x 700 microm. Identical double-oblique coronary MR angiograms were acquired with navigator gating alone and with combined navigator gating and real-time adaptive motion correction. The quantitative objective parameters contrast-to-noise ratio (CNR) and vessel sharpness, as well as subjective image quality scores, were compared. RESULTS: Superior vessel sharpness, increased CNR, and superior image quality scores were found with combined navigator gating and real-time adaptive motion correction (vs. navigator gating alone; P < 0.01 for all comparisons). CONCLUSION: Real-time adaptive motion correction objectively and subjectively improves image quality in 3D, navigator-gated, free-breathing, double-oblique, submillimeter right coronary MRA.