73 results for SPECIAL VIRTUAL FIELDS


Relevance: 20.00%

Abstract:

Recently, modern cross-sectional imaging techniques such as multi-detector computed tomography (MDCT) have pioneered post-mortem investigations, especially in forensic medicine. Such approaches can also be used to investigate bones non-invasively for anthropological purposes. Long bones are often examined in forensic cases because they are frequently discovered and transferred to medico-legal departments for investigation. To estimate age, the trabecular structure must be examined. This study aimed to compare the performance of MDCT with conventional X-rays in investigating the trabecular structure of long bones. Fifty-two dry bones (24 humeri and 28 femora) from anthropological collections were first examined by conventional X-ray and then by MDCT. Trabecular structure was evaluated by seven observers (two experienced and five inexperienced in anthropology) who analyzed the images obtained by both radiological methods. Analyses comprised the measurement of one quantitative parameter (the caput diameter of the humerus and femur) and the staging of the trabecular structure of each bone. The precision of each technique was indicated by describing areas of trabecular destruction and particularities of the bones, such as pathological changes. Concerning the quantitative parameter, the measurements demonstrated comparable results for the MDCT and conventional X-ray techniques. In contrast, the overall inter-observer reliability of the staging was low for both MDCT and conventional X-ray. Reliability increased significantly when only the staging results of the two experienced observers were compared, particularly for the MDCT analysis. Our results also indicate that MDCT appears better suited to a detailed examination of the trabecular structure. In our opinion, MDCT is an adequate tool with which to examine the trabecular structure of long bones; however, adequate methods should be developed, or existing methods adapted, for MDCT.
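
The abstract does not name the agreement statistic used for the staging. As a minimal sketch, inter-observer reliability between two observers' ordinal stage ratings could be quantified with Cohen's kappa; the observer data below are invented for illustration.

```python
from collections import Counter

def cohens_kappa(stages_a, stages_b):
    """Chance-corrected agreement between two observers' stage ratings."""
    n = len(stages_a)
    # Observed proportion of exact agreements.
    p_o = sum(a == b for a, b in zip(stages_a, stages_b)) / n
    # Agreement expected by chance, from each observer's marginal stage frequencies.
    freq_a, freq_b = Counter(stages_a), Counter(stages_b)
    p_e = sum(freq_a[s] * freq_b[s] for s in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical stagings of ten bones by the two experienced observers.
obs1 = [1, 2, 2, 3, 1, 4, 3, 2, 1, 3]
obs2 = [1, 2, 3, 3, 1, 4, 3, 2, 2, 3]
print(f"kappa = {cohens_kappa(obs1, obs2):.2f}")
```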

Relevance: 20.00%

Abstract:

Games are powerful and engaging. One billion people spend, on average, at least one hour a day playing computer and video games, and this is even more true of the younger generations. Our students have become the "digital natives", the "gamers", the "virtual generation". Research shows that students who are most at risk of failure in the traditional classroom setting also spend more time than their counterparts playing video games; they might thrive given a different learning environment. Educators have a responsibility to align their teaching style with the learning styles of these younger generations. However, many academics resist the use of computer-assisted learning that has been "created elsewhere". This can be extrapolated to game-based teaching: even if educational games were more widely authored, their adoption would still be limited to the educators who feel a match between the authored games and their own beliefs and practices. Consequently, game-based teaching would be much more widespread if teachers could develop their own games, or at least customize them. Yet the development and customization of teaching games are complex and costly. This research uses a design-science methodology, leveraging gamification techniques, active and cooperative learning theories, and immersive sandbox 3D virtual worlds, to develop a method that allows management instructors to transform any off-the-shelf case study into an engaging, collaborative, gamified experience. The method is applied to marketing case studies and uses the sandbox virtual world of Second Life.

Relevance: 20.00%

Abstract:

Executive Summary: The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we applied an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds.

Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation conducted to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one.

Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on the ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than the realized returns of portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to a single measure such as the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all individual performance measures considered.

Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market: the bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the euro. That counterintuitive result suggests an inherent weakness in any attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
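
As a sketch of the second-order stochastic dominance check described above, the absolute Lorenz curve (the cumulative expected shortfall over a quantile grid) can be compared pointwise between two return series; the simulated returns below are purely illustrative, not the thesis data.

```python
import numpy as np

def absolute_lorenz(returns, quantiles):
    """L(p) = integral of the quantile function from 0 to p,
    i.e. the cumulative expected shortfall."""
    x = np.sort(np.asarray(returns, dtype=float))
    n = len(x)
    grid = np.arange(n + 1) / n                      # p = 0, 1/n, ..., 1
    cum = np.concatenate(([0.0], np.cumsum(x))) / n  # L at each grid point
    return np.interp(quantiles, grid, cum)

def ssd_dominates(r_a, r_b, n_points=100):
    """A second-order stochastically dominates B if A's absolute Lorenz
    curve lies weakly above B's at every quantile."""
    p = np.linspace(0.0, 1.0, n_points)
    return bool(np.all(absolute_lorenz(r_a, p) >= absolute_lorenz(r_b, p)))

# Illustrative: an aggregated-measure portfolio vs. a single-measure one.
rng = np.random.default_rng(0)
aggregated = rng.normal(0.008, 0.02, 1000)  # hypothetical realized returns
single = rng.normal(0.006, 0.03, 1000)
print(ssd_dominates(aggregated, single))
```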

Relevance: 20.00%

Abstract:

Gel electrophoresis allows one to separate knotted DNA molecules (nicked circular) of equal length according to knot type. At low electric fields, complex knots, being more compact, drift faster than simpler knots. Recent experiments have shown that the dependence of drift velocity on knot type is inverted when changing from low to high electric fields. We present a lattice computer simulation of a closed, knotted, charged DNA chain drifting in an external electric field through a topologically restricted medium. Using a Monte Carlo algorithm, we investigate the dependence of the electrophoretic migration of the DNA molecules on the knot type and on the electric field intensity. The results are in qualitative and quantitative agreement with electrophoretic experiments performed under conditions of low and high electric fields.
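
The abstract does not give the move set or energy function; a minimal sketch of the field-biased Metropolis step that produces a net drift of a charged chain might look as follows, with chain connectivity, excluded volume, gel obstacles and knot preservation all abstracted away.

```python
import math
import random

def accept_move(dx, q=1.0, E=0.1, kT=1.0):
    """Metropolis rule for a trial displacement dx along the field axis:
    the field contributes an energy change dU = -q*E*dx, so moves in the
    field direction are favoured, which yields a net drift."""
    dU = -q * E * dx
    return dU <= 0 or random.random() < math.exp(-dU / kT)

# Illustrative: net displacement of a single charge over many trial steps.
x = 0
for _ in range(10_000):
    dx = random.choice([-1, 1])
    if accept_move(dx):
        x += dx
print("net drift along the field:", x)
```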

Relevance: 20.00%

Abstract:

This research note presents a set of strategies for conducting small-N comparisons in policy research that include the Swiss case. Even though every country can be considered "special" to some extent, the Swiss political system is often viewed as a particularly difficult case for comparison because of the impact of its idiosyncratic institutional features (most notably direct democracy). To deal with this problem, the note sets out two possible strategies, the use of functional equivalents and of counterfactual reasoning, and explains how to implement them empirically through process tracing and the establishment of causal chains. As an illustration, these strategies are applied to a comparison of the process of electricity market liberalisation in Switzerland and Belgium.

Relevance: 20.00%

Abstract:

The aim of this study was to determine potential relationships between anthropometric parameters and athletic performance, with special consideration of repeated-sprint ability (RSA). Sixteen players of the senior male Qatar national soccer team performed a series of anthropometric and physical tests including countermovement jumps without (CMJ) and with free arms (CMJwA), a straight-line 20 m sprint, an RSA test (6 × 35 m with 10 s recovery) and an incremental field test. Significant (P < 0.05) relationships were observed between the muscle-to-bone ratio and both CMJ heights (r ranging from 0.56 to 0.69), as well as all RSA-related variables (r < -0.53 for sprinting times and r = 0.54 for maximal sprinting speed), with the exception of the sprint decrement score (Sdec). The sum of six skinfolds and the adipose mass index were largely correlated with Sdec (r = 0.68, P < 0.01 and r = 0.55, P < 0.05, respectively) but not with total time (TT; r = 0.44 and 0.33, P > 0.05, respectively) or any standard athletic test. Multiple regression analyses indicated that the muscular cross-sectional area of the mid-thigh, the adipose index, straight-line 20 m time, maximal sprinting speed and CMJwA were the strongest predictors of Sdec (r² = 0.89) and TT (r² = 0.95) during our RSA test. In the Qatar national soccer team, players' power-related qualities and RSA are associated with a high muscular profile and low adiposity. This supports the relevance of explosive power for soccer players and the greater importance of neuromuscular qualities in determining RSA.
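
The abstract does not spell out how Sdec and TT were computed; a commonly used formulation of the percentage sprint decrement score, applied here to invented 6 × 35 m sprint times, is sketched below.

```python
def rsa_scores(times):
    """Total time and percentage sprint decrement for one RSA set,
    using Sdec(%) = (sum(t) / (best * n) - 1) * 100."""
    total = sum(times)
    best = min(times)
    sdec = (total / (best * len(times)) - 1) * 100
    return total, sdec

# Hypothetical 6 x 35 m sprint times in seconds.
times = [4.95, 5.02, 5.10, 5.18, 5.21, 5.30]
tt, sdec = rsa_scores(times)
print(f"TT = {tt:.2f} s, Sdec = {sdec:.1f} %")
```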

Relevance: 20.00%

Abstract:

Since the early days of functional magnetic resonance imaging (fMRI), retinotopic mapping has emerged as a powerful and widely accepted tool, allowing the identification of individual visual cortical fields and furthering the study of visual processing. In contrast, tonotopic mapping in the auditory cortex proved more challenging, primarily because of the smaller size of auditory cortical fields. The spatial resolution capabilities of fMRI have since advanced, and recent reports from our labs and several others demonstrate the reliability of tonotopic mapping in human auditory cortex. Here we review the wide range of stimulus procedures and analysis methods that have been used to successfully map tonotopy in human auditory cortex. We point out that recent studies provide a remarkably consistent view of human tonotopic organisation, although the interpretation of the maps continues to vary. In particular, there remains controversy over the exact orientation of the primary gradients with respect to Heschl's gyrus, which leads to different predictions about the location of human A1, R, and surrounding fields. We discuss the development of this debate and argue that the literature is converging towards an interpretation in which the core fields A1 and R fold across the rostral and caudal banks of Heschl's gyrus, with tonotopic gradients laid out in a distinctive V-shaped manner. This suggests an organisation that is largely homologous with that of non-human primates. This article is part of a Special Issue entitled Human Auditory Neuroimaging.

Relevance: 20.00%

Abstract:

The rationale of this study was to investigate molecular flexibility and its influence on physicochemical properties, with a view to uncovering additional information on the fuzzy concept of dynamic molecular structure. Indeed, it is now known that computed molecular interaction fields (MIFs) such as molecular electrostatic potentials (MEPs) and molecular lipophilicity potentials (MLPs) are conformation-dependent, as are dipole moments. A database of 125 compounds was used; their conformational space was explored, and conformation-dependent parameters were computed for each non-redundant conformer found in the conformational space of each compound. These parameters were the virtual log P (log P(MLP), calculated by an MLP approach), the apolar surface area (ASA), the polar surface area (PSA), and the solvent-accessible surface (SAS). For each compound, the range covered by each parameter (its property space) was divided by the number of rotors, taken as an index of flexibility, yielding a parameter termed 'molecular sensitivity'. This parameter was poorly correlated with the others (i.e., it contains novel information) and showed the compounds to fall into two broad classes. 'Sensitive' molecules are those whose computed property ranges are markedly sensitive to conformational effects, whereas 'insensitive' (in fact, less sensitive) molecules have property ranges that are comparatively less affected by conformational fluctuations. A pharmacokinetic application is presented.
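
On the definition stated above, molecular sensitivity is simply the range of a conformation-dependent property across the non-redundant conformers divided by the number of rotors; a minimal sketch with invented conformer values:

```python
def molecular_sensitivity(conformer_values, n_rotors):
    """Property-space range across conformers, normalised by the
    number of rotatable bonds (the flexibility index)."""
    return (max(conformer_values) - min(conformer_values)) / n_rotors

# Hypothetical virtual log P values for the conformers of one compound.
logp_conformers = [2.10, 2.35, 1.95, 2.60, 2.42]
print(molecular_sensitivity(logp_conformers, n_rotors=4))
```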

Relevance: 20.00%

Abstract:

Purpose: Previous studies of the visual outcome in bilateral non-arteritic anterior ischemic optic neuropathy (NAION) have yielded conflicting results, specifically regarding congruity between fellow eyes. Prior studies have used measures of acuity and computerized perimetry, but none has compared Goldmann visual field outcomes between fellow eyes. To better define the concordance of visual loss in this condition, we reviewed our cases of bilateral sequential NAION, including measures of visual acuity, pupillary function and both the pattern and severity of visual field loss.

Methods: We performed a retrospective chart review of 102 patients with a diagnosis of bilateral sequential NAION. Of the 102 patients, 86 were included in the study for analysis of the final visual outcome between the affected eyes. Visual function was assessed using visual acuity, Goldmann visual fields, color vision and RAPD. A quantitative total visual field score and a score per quadrant were computed for each eye using the numerical Goldmann visual field scoring method previously described by Esterman and colleagues. Based on these scores, we calculated the total deviation and pattern deviation between fellow eyes and between eyes of different patients. Statistical significance was determined using nonparametric tests.

Results: A statistically significant correlation was found between fellow eyes for multiple parameters, including logMAR visual acuity (P = 0.0101), global visual field (P = 0.0001), superior visual field (P = 0.0001), and inferior visual field (P = 0.0001). In addition, the mean deviation of both the total (P = 0.0000000007) and pattern (P = 0.000000004) deviation analyses was significantly smaller between fellow eyes ("intra"-eyes) than between eyes of different patients ("inter"-eyes).

Conclusions: Visual function between fellow eyes showed a fair to moderate correlation that was statistically significant. The pattern of vision loss was also more similar in fellow eyes than between eyes of different patients. These results may allow better prediction of the visual outcome for the second eye in patients with NAION. These findings may also be useful for evaluating the efficacy of therapeutic interventions.
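
As an illustrative sketch of the "intra"-eye versus "inter"-eye comparison, the mean absolute score difference between fellow eyes can be set against that between eyes of different patients; the field scores below are invented, not the study data.

```python
import itertools
import statistics

def intra_vs_inter(scores):
    """scores: one (right_eye, left_eye) field-score tuple per patient.
    Returns mean absolute deviation between fellow eyes and between
    eyes of different patients."""
    intra = [abs(r - l) for r, l in scores]
    inter = [abs(a - b)
             for (r1, l1), (r2, l2) in itertools.combinations(scores, 2)
             for a, b in ((r1, r2), (r1, l2), (l1, r2), (l1, l2))]
    return statistics.mean(intra), statistics.mean(inter)

# Hypothetical total field scores per patient (right eye, left eye).
scores = [(62, 58), (40, 45), (75, 70), (55, 48)]
print(intra_vs_inter(scores))
```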