954 results for Model of the semantic fields
Abstract:
Electrical deep brain stimulation (DBS) is an effective method to treat movement disorders. Many models of DBS, based mostly on finite elements, have recently been proposed to better understand the interaction between the electrical stimulation and the brain tissues. In monopolar DBS, which is widely used clinically, the implanted pulse generator (IPG) serves as the reference electrode (RE). In this paper, the influence of the RE model on monopolar DBS is investigated. For that purpose, a finite element model of the full electric loop, including the head, the neck and the superior chest, is used. The head, neck and superior chest are made of simple structures such as parallelepipeds and cylinders. The tissues surrounding the electrode are accurately modelled from data provided by diffusion tensor magnetic resonance imaging (DT-MRI). Three different RE configurations are compared with a commonly used model of reduced size. The electrical impedance seen by the DBS system and the potential distribution are computed for each model. Moreover, axons are modelled to compute the area of tissue activated by the stimulation. Results show that these indicators are influenced by the surface area and position of the RE. Using a RE model corresponding to the implanted device, rather than the usual simplified model, leads to an increase in the system impedance (+48%) and a reduction of the area of activated tissue (-15%).
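Why the RE enters the impedance at all can be illustrated with a back-of-the-envelope sketch (this is not the paper's finite element model; the resistivity and radii below are assumed values for illustration): treating each contact as a sphere in a homogeneous medium with access resistance R = ρ/(4πr), the RE contributes a series term that shrinks as its surface grows.

```python
import math

def sphere_access_resistance(rho_ohm_m, radius_m):
    """Access resistance of a spherical contact in an infinite homogeneous
    medium: R = rho / (4 * pi * r)."""
    return rho_ohm_m / (4 * math.pi * radius_m)

RHO = 3.0            # assumed tissue resistivity (ohm*m), illustrative only
R_STIM = 0.635e-3    # assumed stimulation-contact radius (m)

for r_ref in (5e-3, 25e-3):   # small simplified RE vs large IPG-like RE
    z = sphere_access_resistance(RHO, R_STIM) + sphere_access_resistance(RHO, r_ref)
    print(f"RE radius {r_ref * 1e3:.0f} mm -> total access resistance ~ {z:.0f} ohm")
```

In the paper, the realistic device-shaped RE actually increased impedance relative to the commonly used simplified model; the sketch only shows that the RE's surface area and position contribute a series term to the total impedance, not the specific geometry of either model.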
Abstract:
The impact of Alzheimer's disease (AD) is devastating for the daily life of affected patients, with progressive loss of memory and other cognitive skills until dementia. We still lack a disease-modifying treatment, and there is also great uncertainty regarding the accuracy of diagnostic classification in the early stages of AD. The anatomical signature of AD, in particular the medial temporal lobe (MTL) atrophy measured with neuroimaging, can be used as an early in vivo biomarker of the first stages of AD. However, despite the evident role of the MTL in memory, we know that predictive anatomical models based only on measures of brain atrophy in the MTL do not explain all clinical cases. Throughout my thesis, I have conducted three projects to understand the anatomy and the functioning of the MTL in (1) disease progression, (2) memory processes and (3) learning processes. I was interested in a population with mild cognitive impairment (MCI), at risk for AD. The objective of the first project was to test the hypothesis that factors other than cognitive ones, such as personality traits, can explain inter-individual differences in the MTL. Moreover, the phenotypic diversity in the manifestations of preclinical AD also arises from the limited knowledge of memory and learning processes in the healthy brain.
The objective of the second project concerns the investigation of the sub-regions of the MTL, and more particularly their contributions to the different components of recognition memory in healthy subjects. To study that, I used a new multivariate method as well as high-resolution MRI to test the contribution of those sub-regions to the processes of familiarity and recollection. Finally, the objective of the third project was to test the contribution of the MTL as a memory system in learning, and the dynamic interaction between memory systems during learning. The results of the first project show that, beyond the cognitive impairment observed in the population with MCI, personality traits can explain the inter-individual differences in the MTL, notably with a higher contribution of neuroticism linked to proneness to stress and depression. My study identified a pattern of anatomical abnormality in the MTL related to personality, based on measures of volume and mean diffusivity of the tissue. That pattern is characterized by a right-left asymmetry and an anterior-posterior gradient within the MTL. I have interpreted that result in terms of tissue and neurochemical properties that are differentially sensitive to stress. The results of my second project contributed to the current debate on the contribution of MTL sub-regions to the processes of familiarity and recollection. Using a new multivariate method, the results support, firstly, a dissociation of the sub-regions associated with the different memory components: the hippocampus was mostly associated with recollection, and the surrounding parahippocampal cortex with familiarity. Secondly, the activation corresponding to the memory trace for each type of memory is characterized by a distinct spatial distribution.
The specific "sparse-distributed" neuronal representation associated with recollection in the hippocampus would be the best way to rapidly encode detailed memories without overwriting previously stored ones. In the third project, I created a learning task with functional MRI to study the learning of probabilistic associations based on feedback/reward. That study allowed me to highlight the role of the MTL in learning and the interaction between different memory systems, such as procedural memory, perceptual memory or priming, and working memory. We found activations in the MTL corresponding to a process of episodic memory; in the basal ganglia (BG), to procedural memory and reward; in the occipito-temporal (OT) cortex, to perceptual memory or priming; and in the prefrontal cortex, to working memory. We also observed that those regions can interact; the relation between the MTL and the BG was interpreted as a competition, as already reported in recent studies. In addition, with a dynamic causal model, I demonstrated a "top-down" influence from cortical regions associated with higher-level processing, such as the prefrontal cortex, on lower-level cortical regions such as the OT cortex. That influence decreases during learning, which could correspond to a mechanism linked to a diminution of prediction error. My interpretation is that this is at the origin of semantic knowledge. I also showed that the subject's choices and the associated brain activation are influenced by personality traits and negative affective states. Overall, the results of this thesis led me to propose (1) a model explaining the possible mechanisms linked to the influence of personality on the MTL in a population with MCI, and (2) a dissociation of MTL sub-regions in different memory types, with a neuronal representation specific to each region, which could help resolve the current debates on recognition memory.
Finally, (3) the MTL is also a memory system involved in learning that can interact with the BG through competition. We also showed a dynamic interaction of the "top-down" and "bottom-up" types between the prefrontal cortex and the OT cortex. In conclusion, these results could provide clues to better understand some memory dysfunctions in aging and Alzheimer's disease, and to improve the development of treatments.
Abstract:
Despite extensive genetic and immunological research, the complex etiology and pathogenesis of type I diabetes remains unresolved. During the last few years, our attention has been focused on factors such as abnormalities of islet function and/or microenvironment that could interact with immune partners in the spontaneous model of the disease, the non-obese diabetic (NOD) mouse. Intriguingly, the first anomalies that we noted in NOD mice, compared to control strains, are already present at birth and consist of 1) higher numbers of paradoxically hyperactive β cells, assessed by in situ preproinsulin II expression; 2) high percentages of immature islets, representing islet neogenesis related to neonatal β-cell hyperactivity and suggestive of in utero β-cell stimulation; 3) elevated levels of some types of antigen-presenting cells and FasL+ cells; and 4) abnormalities of extracellular matrix (ECM) protein expression. However, the colocalization in all control mouse strains studied of fibroblast-like cells (anti-TR-7 labeling), some ECM proteins (particularly fibronectin and collagen I), antigen-presenting cells and a few FasL+ cells at the periphery of islets undergoing neogenesis suggests that remodeling phenomena that normally take place during postnatal pancreas development could be disturbed in NOD mice. These data show that from birth onwards there is an intricate relationship between endocrine and immune events in the NOD mouse. They also suggest that tissue-specific autoimmune reactions could arise from developmental phenomena taking place during fetal life, in which ECM-immune cell interaction(s) may play a key role.
Abstract:
The healthcare sector is currently on the verge of a reform, and thus medical games provide an interesting area of research. The aim of this study is to explore the critical elements underpinning the emergence of the medical game ecosystem, with three sub-objectives: (1) to identify the key actors involved in the medical game ecosystem and their needs, (2) to scrutinise what types of resources are required in medical game development and what types of relationships are needed to secure those resources, and (3) to identify the existing institutions (‘the rules of the game’) affecting the emergence of the medical game ecosystem. The theoretical background consists of the service ecosystems literature. The empirical study is based on semi-structured theme interviews with 25 experts in three relevant fields: games and technology, health, and funding. The data were analysed through a theoretical framework designed upon the service ecosystems literature. The study proposes that the key actors are divided into five groups: medical game companies, customers, funders, regulatory parties and complementors. Their needs are linked to improving patient motivation and enhancing healthcare processes, resulting in lower costs. Several types of resources, especially skills and knowledge, are required to create a medical game. To gain access to those resources, medical game companies need to build complex networks of relationships, and proficiency in managing those value networks is crucial. In addition, the company should take into account the underlying institutions in the healthcare sector affecting the medical game ecosystem. Three crucial institutions were identified: validation, the lack of innovation-supporting structures in healthcare, and rising consumerisation. Based on the findings, medical games cannot be made in isolation.
A developmental trajectory model of the emerging medical game ecosystem was created based on the empirical data. The relevance of relationships and resources depends on the stage of the trajectory at which the medical game company resides at a given time. Furthermore, creating an official, documented database of clinically validated medical games was proposed to establish the medical game market and ensure an adequate status for effective medical games. Finally, the ecosystems approach provides interesting future opportunities for research on medical game ecosystems.
Abstract:
The topic of this thesis is marginal/minority popular music and the question of identity; the term "marginal/minority" specifically refers to members of racial and cultural minorities who are socially and politically marginalized. The thesis argument is that popular music produced by members of cultural and racial minorities establishes cultural identity and resists racist discourse. Three marginal/minority popular music artists and their songs have been chosen for analysis in support of the argument: Gil Scott-Heron's "Gun," Tracy Chapman's "Fast Car" and Robbie Robertson's "Sacrifice." The thesis will draw from two fields of study: popular music and postcolonialism. Within the area of popular music, Theodor Adorno's "Standardization" theory is the focus. Within the area of postcolonialism, this thesis concentrates on two specific topics: 1) Stuart Hall's and Homi Bhabha's overlapping perspectives that identity is a process of cultural signification, and 2) Homi Bhabha's concept of the "Third Space." For Bhabha (1995a), the Third Space defines cultures in the moment of their use, at the moment of their exchange. The idea of identities arising out of cultural struggle suggests that identity is a process as opposed to a fixed center, an enclosed totality. Cultures arise from historical memory and memory has no center. Historical memory is de-centered and thus cultures are also de-centered; they are not enclosed totalities. This is what Bhabha means by "hybridity" of culture - that cultures are not unitary totalities, they are ways of knowing and speaking about a reality that is in constant flux. In this regard, the language of "Otherness" depends on suppressing or marginalizing the productive capacity of culture in the act of enunciation.
The Third Space represents a strategy of enunciation that disrupts, interrupts and dislocates the dominant discursive construction of US and THEM (a construction explained by Hall's concept of binary oppositions, detailed in Chapter 2). Bhabha uses the term "enunciation" as a linguistic metaphor for how cultural differences are articulated through discourse and thus how differences are discursively produced. Like Hall, Bhabha views culture as a process of understanding and of signification, because Bhabha sees traditional cultures' struggle against colonizing cultures as transforming them. Adorno's theory of Standardization will be understood as a theoretical position of Western authority. The thesis will argue that Adorno's theory rests on the assumption that there is an "essence" to music, an essence that Adorno rationalizes as structure/form. The thesis will demonstrate that constructing music as possessing an essence is connected to ideology and power, and in this regard Adorno's Standardization theory is a discourse of White Western power. It will be argued that "essentialism" is at the root of the Western "rationalization" of music, and that the definition of what constitutes music is an extension of Western racist "discourses" of the Other. The methodological framework of the thesis entails a) applying semiotics to each of the three songs examined and b) applying Bhabha's model of the Third Space to each of the songs. In this thesis, semiotics specifically refers to Stuart Hall's retheorized semiotics, which recognizes the dual function of semiotics in the analysis of marginal racial/cultural identities: it simultaneously represents embedded racial/cultural stereotypes and the marginal racial/cultural first person voice that disavows and thus reinscribes stereotyped identities. (Here, and throughout this thesis, "first person voice" is used not to denote the voice of the songwriter, but rather the collective voice of a marginal racial/cultural group.)
This dual function fits with Hall's and Bhabha's idea that cultural identity emerges out of cultural antagonism, cultural struggle. Bhabha's Third Space is also applied to each of the songs to show that cultural "struggle" between colonizers and colonized produces cultural hybridities, musically expressed as fusions of styles/sounds. The purpose of combining semiotics and postcolonialism in the three songs to be analyzed is to show that marginal popular music, produced by members of cultural and racial minorities, establishes cultural identity and resists racist discourse by overwriting identities of racial/cultural stereotypes with identities shaped by the first person voice enunciated in the Third Space, to produce identities of cultural hybridities. Semiotic codes of embedded "Black" and "Indian" stereotypes in each song's musical and lyrical text will be read and shown to be overwritten by the semiotic codes of the first person voice, which are decoded with the aid of postcolonial concepts such as "ambivalence," "hybridity" and "enunciation."
Abstract:
Multi-country models have not been very successful in replicating important features of the international transmission of business cycles. Standard models predict cross-country correlations of output and consumption which are respectively too low and too high. In this paper, we build a multi-country model of the business cycle with multiple sectors in order to analyze the role of sectoral shocks in the international transmission of the business cycle. We find that a model with multiple sectors generates a higher cross-country correlation of output than standard one-sector models, and a lower cross-country correlation of consumption. In addition, it predicts cross-country correlations of employment and investment that are closer to the data than the standard model. We also analyze the relative effects of multiple sectors, trade in intermediate goods, imperfect substitution between domestic and foreign goods, home preference, capital adjustment costs, and capital depreciation on the international transmission of the business cycle.
Abstract:
The objects with which the hand interacts may significantly change the dynamics of the arm. How does the brain adapt control of arm movements to these new dynamics? We show that adaptation is via composition of a model of the task's dynamics. By exploring the generalization capabilities of this adaptation we infer some of the properties of the computational elements with which the brain formed this model: the elements have broad receptive fields and encode the learned dynamics as a map structured in an intrinsic coordinate system closely related to the geometry of the skeletomusculature. The low-level nature of these elements suggests that they may represent a set of primitives with which a movement is represented in the CNS.
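The idea that broad receptive fields support generalization can be sketched with a toy illustration (not the authors' experiment; the tuning widths, primitive centers and "task dynamics" below are assumptions): fit a learned force as a weighted sum of broad Gaussian primitives over a joint angle, then query the fit at a configuration not used during training.

```python
import numpy as np

centers = np.linspace(-1.0, 1.0, 9)   # assumed primitive centers (rad)
WIDTH = 0.5                            # broad receptive fields (rad)

def features(q):
    """Gaussian primitive activations for an array of joint angles q."""
    return np.exp(-((q[:, None] - centers[None, :]) ** 2) / (2 * WIDTH ** 2))

q_train = np.linspace(-0.8, 0.8, 40)   # trained configurations
f_train = np.sin(2 * q_train)          # assumed task dynamics to be learned

# Least-squares weights over the primitives
w, *_ = np.linalg.lstsq(features(q_train), f_train, rcond=None)

q_new = np.array([0.5])                # configuration not in the training set
print("predicted force at q = 0.5:", float(features(q_new) @ w))
```

Because each primitive is broadly tuned, the fit interpolates smoothly to nearby untrained configurations, which is the generalization signature the abstract describes.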
Abstract:
FAMOUS is an ocean-atmosphere general circulation model of low resolution, capable of simulating approximately 120 years of model climate per wallclock day using current high performance computing facilities. It uses most of the same code as HadCM3, a widely used climate model of higher resolution and computational cost, and has been tuned to reproduce the same climate reasonably well. FAMOUS is useful for climate simulations where the computational cost makes the application of HadCM3 unfeasible, either because of the length of simulation or the size of the ensemble desired. We document a number of scientific and technical improvements to the original version of FAMOUS. These improvements include changes to the parameterisations of ozone and sea-ice which alleviate a significant cold bias from high northern latitudes and the upper troposphere, and the elimination of volume-averaged drifts in ocean tracers. A simple model of the marine carbon cycle has also been included. A particular goal of FAMOUS is to conduct millennial-scale paleoclimate simulations of Quaternary ice ages; to this end, a number of useful changes to the model infrastructure have been made.
Abstract:
The entropy budget of the coupled atmosphere–ocean general circulation model HadCM3 is calculated. Estimates of the different entropy sources and sinks of the climate system are obtained directly from the diabatic heating terms, and an approximate estimate of the planetary entropy production is also provided. The rate of material entropy production of the climate system is found to be ∼50 mW m−2 K−1, a value intermediate in the range 30–70 mW m−2 K−1 previously reported from different models. The largest part of this is due to sensible and latent heat transport (∼38 mW m−2 K−1). Another 13 mW m−2 K−1 is due to dissipation of kinetic energy in the atmosphere by friction and Reynolds stresses. Numerical entropy production in the atmosphere dynamical core is found to be about 0.7 mW m−2 K−1. The material entropy production within the ocean due to turbulent mixing is ∼1 mW m−2 K−1, a very small contribution to the material entropy production of the climate system. The rate of change of entropy of the model climate system is about 1 mW m−2 K−1 or less, which is comparable with the typical size of the fluctuations of the entropy sources due to interannual variability, and a more accurate closure of the budget than achieved by previous analyses. Results are similar for FAMOUS, which has a lower spatial resolution but a similar formulation to HadCM3, while more substantial differences are found with respect to other models, suggesting that the formulation of the model has an important influence on the climate entropy budget. Since this is the first diagnosis of the entropy budget in a climate model of the type and complexity used for projection of twenty-first century climate change, it would be valuable if similar analyses were carried out for other such models.
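As a quick consistency check, the material entropy sources quoted in the abstract can be summed (a sketch using only those quoted values; the 0.7 mW m−2 K−1 numerical term is excluded because it is a model artefact rather than a physical source):

```python
# Entropy-production terms quoted in the abstract, in mW m^-2 K^-1
terms = {
    "sensible and latent heat transport": 38.0,
    "frictional dissipation of kinetic energy": 13.0,
    "ocean turbulent mixing": 1.0,
}
total = sum(terms.values())
print(f"sum of listed material sources: {total:.0f} mW m^-2 K^-1")
```

The sum (~52 mW m−2 K−1) is consistent with the quoted total of ∼50 mW m−2 K−1, given that each term is itself approximate.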
Abstract:
Models of the dynamics of nitrogen in soil (soil-N) can be used to aid the fertilizer management of a crop. The predictions of soil-N models can be validated by comparison with observed data. Validation generally involves calculating non-spatial statistics of the observations and predictions, such as their means, their mean squared-difference, and their correlation. However, when the model predictions are spatially distributed across a landscape the model requires validation with spatial statistics. There are three reasons for this: (i) the model may be more or less successful at reproducing the variance of the observations at different spatial scales; (ii) the correlation of the predictions with the observations may be different at different spatial scales; (iii) the spatial pattern of model error may be informative. In this study we used a model, parameterized with spatially variable input information about the soil, to predict the mineral-N content of soil in an arable field, and compared the results with observed data. We validated the performance of the N model spatially with a linear mixed model of the observations and model predictions, estimated by residual maximum likelihood. This novel approach allowed us to describe the joint variation of the observations and predictions as: (i) independent random variation that occurred at a fine spatial scale; (ii) correlated random variation that occurred at a coarse spatial scale; (iii) systematic variation associated with a spatial trend. The linear mixed model revealed that, in general, the performance of the N model changed depending on the spatial scale of interest. At the scales associated with random variation, the N model underestimated the variance of the observations, and the predictions were correlated poorly with the observations. At the scale of the trend, the predictions and observations shared a common surface. 
The spatial pattern of the error of the N model suggested that the observations were affected by the local soil condition, but this was not accounted for by the N model. In summary, the N model would be well-suited to field-scale management of soil nitrogen, but poorly suited to management at finer spatial scales. This information was not apparent from a non-spatial validation.
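The scale-dependence described above can be illustrated with a toy sketch (synthetic transect data, not the paper's REML linear mixed model): predictions and observations share a coarse-scale trend but differ in fine-scale variation, so the validation correlation improves when both are block-averaged to a coarser scale.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
trend = np.linspace(0.0, 10.0, n)          # shared coarse-scale trend
obs = trend + rng.normal(0.0, 2.0, n)      # observations with fine-scale variation
pred = trend + rng.normal(0.0, 0.5, n)     # model misses the fine-scale variation

def corr_at_scale(x, y, block):
    """Correlation after averaging consecutive samples into blocks."""
    xb = x.reshape(-1, block).mean(axis=1)
    yb = y.reshape(-1, block).mean(axis=1)
    return float(np.corrcoef(xb, yb)[0, 1])

print("fine-scale r:  ", round(corr_at_scale(obs, pred, 1), 2))
print("coarse-scale r:", round(corr_at_scale(obs, pred, 32), 2))
```

A single non-spatial correlation would hide exactly this contrast, which is the motivation for the spatial validation the study performs.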
Abstract:
ERA-Interim is the latest global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The ERA-Interim project was conducted in part to prepare for a new atmospheric reanalysis to replace ERA-40, which will extend back to the early part of the twentieth century. This article describes the forecast model, data assimilation method, and input datasets used to produce ERA-Interim, and discusses the performance of the system. Special emphasis is placed on various difficulties encountered in the production of ERA-40, including the representation of the hydrological cycle, the quality of the stratospheric circulation, and the consistency in time of the reanalysed fields. We provide evidence for substantial improvements in each of these aspects. We also identify areas where further work is needed and describe opportunities and objectives for future reanalysis projects at ECMWF.
Abstract:
Acrylamide is formed from reducing sugars and asparagine during the preparation of French fries. The commercial preparation of French fries is a multi-stage process involving the preparation of frozen, par-fried potato strips for distribution to catering outlets where they are finish fried. The initial blanching, treatment in glucose solution and par-frying steps are crucial since they determine the levels of precursors present at the beginning of the finish frying process. In order to minimize the quantities of acrylamide in cooked fries, it is important to understand the impact of each stage on the formation of acrylamide. Acrylamide, amino acids, sugars, moisture, fat and color were monitored at time intervals during the frying of potato strips which had been dipped in varying concentrations of glucose and fructose during a typical pretreatment. A mathematical model of the finish-frying was developed based on the fundamental chemical reaction pathways, incorporating moisture and temperature gradients in the fries. This showed the contribution of both glucose and fructose to the generation of acrylamide, and accurately predicted the acrylamide content of the final fries.
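The competition between formation and loss described above can be sketched as a minimal kinetic scheme: a single formation step (reducing sugar + asparagine → acrylamide) alongside first-order thermal elimination, integrated by forward Euler. This is not the paper's model, which incorporates the full chemical pathways and moisture and temperature gradients in the fries; the rate constants `k_form` and `k_elim` here are hypothetical placeholders.

```python
# Minimal sketch of acrylamide kinetics during finish frying.
# k_form, k_elim and all concentrations are illustrative placeholders,
# not fitted values from the study; spatial gradients are ignored.
def simulate(glucose0, asparagine0, k_form=0.05, k_elim=0.01,
             t_end=300.0, dt=0.1):
    g, a, acr = glucose0, asparagine0, 0.0
    for _ in range(int(t_end / dt)):
        r_form = k_form * g * a          # second-order formation
        r_elim = k_elim * acr            # first-order elimination
        g -= r_form * dt
        a -= r_form * dt
        acr += (r_form - r_elim) * dt
    return acr

# More glucose taken up in the dipping pretreatment yields more acrylamide.
low = simulate(0.5, 1.0)
high = simulate(2.0, 1.0)
print(low, high)
```

Even this crude scheme reproduces the qualitative point of the pretreatment study: the sugar level set before finish frying controls how much acrylamide the final fries contain.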
Abstract:
The Canadian Middle Atmosphere Modelling (MAM) project is a collaboration between the Atmospheric Environment Service (AES) of Environment Canada and several Canadian universities. Its goal is the development of a comprehensive General Circulation Model of the troposphere-stratosphere-mesosphere system, starting from the AES/CCCma third-generation atmospheric General Circulation Model. This paper describes the basic features of the first-generation Canadian MAM and some aspects of its radiative-dynamical climatology. Standard first-order mean diagnostics are presented for monthly means and for the annual cycle of zonal-mean winds and temperatures. The mean meridional circulation is examined, and a comparison is made between the steady diabatic, downward-controlled, and residual stream functions. It is found that downward control holds quite well in the monthly mean through most of the middle atmosphere, even during equinoctial periods. The relative roles of different drag processes in determining the mean downwelling over the wintertime polar middle stratosphere are examined, and the vertical structure of the drag is quantified.
Abstract:
In this paper the origin and evolution of the Sun’s open magnetic flux are considered for single magnetic bipoles as they are transported across the Sun. The effects of magnetic flux transport on the radial field at the surface of the Sun are modeled numerically by developing earlier work by Wang, Sheeley, and Lean (2000). The paper considers how the initial tilt of the bipole axis (α) and its latitude of emergence affect the variation and magnitude of the surface and open magnetic flux. The amount of open magnetic flux is estimated by constructing potential coronal fields. It is found that the open flux may evolve independently from the surface field for certain ranges of the tilt angle. For a given tilt angle, the lower the latitude of emergence, the higher the magnitude of the surface and open flux at the end of the simulation. In addition, three types of behavior are found for the open flux depending on the initial tilt angle of the bipole axis. When the tilt is such that α ≥ 2° the open flux is independent of the surface flux and initially increases before decaying away. In contrast, for tilt angles in the range −16° < α < 2° the open flux follows the surface flux and continually decays. Finally, for α ≤ −16° the open flux first decays and then increases in magnitude towards a second maximum before decaying away. This behavior of the open flux can be explained in terms of two competing effects produced by differential rotation. First, differential rotation may increase or decrease the open flux by rotating the centers of each polarity of the bipole at different rates when the axis is tilted. Second, it decreases the open flux by increasing the length of the polarity inversion line, where flux cancellation occurs.
The results suggest that, in order to reproduce a realistic model of the Sun’s open magnetic flux over a solar cycle, it is important to have accurate input data on the latitude of emergence of bipoles along with the variation of their tilt angles as the cycle progresses.
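The three tilt-angle regimes reported in the abstract can be written down directly; the thresholds (2° and −16°) come from the text, while the function itself is only a reading aid, not part of the authors' flux-transport model.

```python
# Classify the open-flux behaviour by initial bipole tilt angle alpha
# (in degrees). Thresholds are those quoted in the abstract.
def open_flux_regime(alpha_deg: float) -> str:
    if alpha_deg >= 2:
        # Regime 1: open flux decouples from the surface flux.
        return "independent of surface flux: rises, then decays"
    if alpha_deg > -16:
        # Regime 2: open flux tracks the decaying surface flux.
        return "follows surface flux: continual decay"
    # Regime 3 (alpha <= -16 degrees): non-monotonic evolution.
    return "decays, recovers to a second maximum, then decays"

print(open_flux_regime(5))
print(open_flux_regime(0))
print(open_flux_regime(-20))
```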