953 results for Complex models
Abstract:
This study addresses the role of EFL education, its potential and shortcomings, and the challenges the future of EFL education will bring. It is argued that new societal demands and the limited time we have at our disposal in the classroom make it necessary to rethink goals and content and move away from the transmission of limited sets of facts and information to helping students develop awareness and competences that can be applied in many different situations, including in a lifelong learning perspective. The overall aim of the current study is to problematize and increase understanding of the implementation of cultural aspects in the language classroom by addressing the interrelated what, why and how of the cultural dimension within EFL education. This has been conducted by means of theoretical explorations into the area, alongside an attempt at promoting intercultural competence (IC) in a more systematic and insightful manner within my own educational praxis. The focus of the intercultural work in the classroom was on the promotion of awareness of difference and diversity, as well as respect for such difference through the ability to decenter from cultural norms and behavior that previously have been taken for granted. These are two elements that have been suggested as fundamental for other work with IC in the classroom and for the realization of important aspects of the underlying values of basic education. In the context of this study, IC comprises several interconnected components supporting each other in a variety of ways, with the further aim being interaction with and respect for difference in general, not only concerning, e.g., representatives of certain English-speaking communities. The methodology was informed by action research, with myself in the role of the teacher-researcher or the reflective practitioner. For the purpose of the project I was authorized to take on the EFL education, for the three years of upper comprehensive school, of one random class of students originally assigned to one of the language teachers of the selected Finland-Swedish school. Thus, the class of 17 students was not specifically chosen for the project, and the aims and contents chosen for the development project were placed within the framework of the ordinary curriculum. By exploring the students' insights concerning different English-speaking cultural groups, mainly through a set of questionnaires, it was possible to outline the work with the cultural dimension in the classroom for the following three years. Work progress was evaluated at specific stages, and the final project evaluations were conducted through individual student interviews in grade 9. The interviews focused on the possible development of students' insights concerning different aspects of the cultural dimension. In particular, this concerned awareness of difference and diversity, including the modification of stereotypes, as well as the ability to decenter in order to be better able to respect such difference. I also explored students' awareness and views of the activities and approaches used in class, as well as affordances both inside and outside the EFL classroom in relation to these intended insights. A further focus area was the perceived relevance to students of different aspects of the cultural dimension.
The frameworks and approaches adopted for the work in the classroom all have in common that they are based on a constructivist framework, where knowledge is constructed and reconstructed through interaction with one's social and cultural environment, including interaction with others. Reflective processes precede or are simultaneous with the learning of basic factual knowledge. This entails a view of learning as a progression from simple to more complex models rather than as a progression from facts to understanding and analysis. Here, the development of intercultural competence is seen as a cyclical process, or along a spiral curriculum, from simple to more complex levels through a combination of cognitive, affective and behavioral elements within a framework of experiential learning. This project has shown one possible way forward concerning the development of intercultural competence within EFL education through a more systematic and comprehensive approach regarding linguistic and cultural aspects. The evaluation of the educational process explored in the study suggests the possibilities for work with the promotion of awareness of difference and diversity concerning some specific context that, based on students' prior knowledge and preconceptions, would benefit from further work. In this case, the specific context primarily concerned different aspects of both cultural and linguistic conditions in the UK. It is also suggested that many students developed the ability to decenter, described in the study as integral to being able to respect otherness. What still remains to be explored are more individualized approaches considering students' different points of departure. Further work also needs to be put into how to apply insights gained in these specific situations to more general contexts. It is also necessary to explore the use of the suggested approaches in a wider range of different contexts.
Abstract:
Horizontal gene transfers (HGT) have been shown to play an important role in the evolution of prokaryotes. Their impact has been the subject of intense debate, some even going as far as calling for the abandonment of the species tree. According to several studies, a dominant historical signal is present in prokaryotes, since stable and functional horizontal transmissions appear to be much rarer than vertical transmissions (tens versus billions). However, the cumulative effect of HGT is non-negligible and can potentially affect phylogenetic inference. Consequently, most researchers base their phylogenetic inferences on a small number of rarely transferred genes, such as ribosomal proteins. They do not, however, give as much importance to the model of evolution used, even though it has been shown to matter when resolving certain divergences between ancestral species, as in animals for example. In this thesis, we used simulations and analyzed archaeal data sets in order to study the relative impact of HGT as well as the impact of evolutionary models on phylogenetic accuracy. Our simulations show that (1) HGT has a limited impact on phylogenies, given a realistic transfer rate, and that (2) the supermatrix approach is more accurate than the supertree approach. We also observed that complex models not only explain the data better than standard models, but can have a direct impact on different phylogenetic groups and on the robustness of the resulting tree. Our results contradict a recent publication proposing that Thaumarchaeota branch at the base of the archaeal tree.
Abstract:
Changes in both the mean and the variability of climate, whether naturally forced or due to human activities, pose a threat to crop production globally. This paper summarizes discussions of this issue at a meeting of the Royal Society in April 2005. Recent advances in understanding the sensitivity of crops to weather, climate and the levels of particular gases in the atmosphere indicate that the impact of these factors on crop yields and quality may be more severe than previously thought. There is increasing information on the importance to crop yields of extremes of temperature and rainfall at key stages of crop development. Agriculture will itself have an impact on the climate system, and a greater understanding of these feedbacks is needed. Complex models are required to perform simulations of climate variability and change, together with predictions of how crops will respond to different climate variables. Variability of climate, such as that associated with El Niño events, has large impacts on crop production. If skilful predictions of the probability of such events occurring can be made a season or more in advance, then agricultural and other societal responses can be planned. The development of strategies to adapt to variations in the current climate may also build resilience to changes in future climate. Africa will be the part of the world that is most vulnerable to climate variability and change, but knowledge of how to use climate information, and of the regional impacts of climate variability and change in Africa, is rudimentary. In order to develop appropriate adaptation strategies globally, predictions about changes in the quantity and quality of food crops need to be considered in the context of the entire food chain from production to distribution, access and utilization. Recommendations for future research priorities are given.
Abstract:
The D2 dopamine receptor exists as dimers or as higher-order oligomers, as determined from data from physical experiments. In this study, we sought evidence that this oligomerization leads to cooperativity by examining the binding of three radioligands ([3H]nemonapride, [3H]raclopride, and [3H]spiperone) to D2 dopamine receptors expressed in membranes of Sf9 cells. In saturation binding experiments, the three radioligands exhibited different Bmax values, and the Bmax values could be altered by the addition of sodium ions to assays. Despite labeling different numbers of sites, the different ligands were able to achieve full inhibition in competition experiments. Some ligand pairs also exhibited complex inhibition curves in these experiments. In radioligand dissociation experiments, the rate of dissociation of [3H]nemonapride or [3H]spiperone depended on the sodium ion concentration but was independent of the competing ligand. Although some of the data in this study are consistent with the behavior of a cooperative oligomeric receptor, not all of the data are in agreement with this model. It may, therefore, be necessary to consider more complex models for the behavior of this receptor.
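For context, saturation binding data of this kind are commonly analyzed with the one-site hyperbolic model B = Bmax·[L]/(Kd + [L]); diverging Bmax estimates across radioligands, as reported above, are one signature that this simple model is inadequate. Below is a minimal, illustrative sketch of such a fit on synthetic data (the concentrations and parameter values are invented, not taken from the study):

```python
import numpy as np
from scipy.optimize import curve_fit

def one_site(L, Bmax, Kd):
    """One-site saturation binding: specific binding vs. free ligand."""
    return Bmax * L / (Kd + L)

# Synthetic radioligand concentrations (nM) and noisy "measured" binding
# (illustrative values only, not data from the study).
rng = np.random.default_rng(0)
L = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])   # nM
true_Bmax, true_Kd = 1200.0, 0.8                            # fmol/mg, nM
B = one_site(L, true_Bmax, true_Kd) * (1 + 0.05 * rng.standard_normal(L.size))

# Fit Bmax and Kd. Different radioligands yielding different fitted Bmax
# values, as in the abstract, would argue against a simple one-site,
# non-cooperative receptor model.
popt, pcov = curve_fit(one_site, L, B, p0=[1000.0, 1.0])
print(f"Bmax = {popt[0]:.0f} fmol/mg, Kd = {popt[1]:.2f} nM")
```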
Abstract:
This paper introduces a new neurofuzzy model construction and parameter estimation algorithm for observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships among these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored through the derivation of an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. This new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, whereby it is computationally desirable to decompose complex models into a few submodels rather than a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
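The extended algorithm builds on the classical Gram-Schmidt orthogonal least squares procedure, in which candidate regressors are orthogonalized one at a time and ranked by the fraction of output energy each explains (the error reduction ratio). The sketch below illustrates only this classical building block on a toy regression problem, not the paper's extended subspace decomposition:

```python
import numpy as np

def gram_schmidt_ols(P, y):
    """Classical Gram-Schmidt orthogonal least squares.

    Orthogonalizes the columns of the regression matrix P one at a time
    and computes each column's error reduction ratio (ERR), i.e. the
    fraction of the output energy y'y that it explains.
    """
    n, m = P.shape
    W = np.zeros_like(P, dtype=float)   # orthogonalized regressors
    g = np.zeros(m)                     # coefficients in the orthogonal basis
    err = np.zeros(m)                   # error reduction ratios
    for k in range(m):
        w = P[:, k].astype(float)
        for j in range(k):              # subtract projections onto earlier w_j
            w -= (W[:, j] @ P[:, k]) / (W[:, j] @ W[:, j]) * W[:, j]
        W[:, k] = w
        g[k] = (w @ y) / (w @ w)
        err[k] = g[k] ** 2 * (w @ w) / (y @ y)
    return W, g, err

# Toy example: three candidate regressors, the second carries most of y.
rng = np.random.default_rng(1)
P = rng.standard_normal((200, 3))
y = 0.2 * P[:, 0] + 2.0 * P[:, 1] + 0.05 * rng.standard_normal(200)
W, g, err = gram_schmidt_ols(P, y)
print("error reduction ratios:", np.round(err, 3))  # second ERR dominates
```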
Abstract:
Large-scale air pollution models are powerful tools, designed to meet the increasing demand in different environmental studies. The atmosphere is the most dynamic component of the environment, where pollutants can be transported quickly over long distances. Therefore, air pollution modeling must be done in a large computational domain. Moreover, all relevant physical, chemical and photochemical processes must be taken into account. In such complex models, operator splitting is very often applied in order to achieve sufficient accuracy as well as efficiency of the numerical solution. The Danish Eulerian Model (DEM) is one of the most advanced such models. Its space domain (4800 × 4800 km) covers Europe, most of the Mediterranean, and neighboring parts of Asia and the Atlantic Ocean. Efficient parallelization is crucial for the performance and practical capabilities of this huge computational model. Different splitting schemes, based on the main processes mentioned above, have been implemented and tested with respect to accuracy and performance in the new version of DEM. Some numerical results of these experiments are presented in this paper.
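As a minimal illustration of the operator-splitting idea (not DEM's actual scheme), the sketch below advances a toy one-dimensional advection-chemistry problem with Strang splitting: half a step of transport, a full step of chemistry, then another half step of transport:

```python
import numpy as np

# Toy advection-chemistry problem on a periodic 1-D domain:
#   dc/dt = -u dc/dx - k c
# solved with Strang operator splitting: A(dt/2) R(dt) A(dt/2).
nx, u, k = 100, 1.0, 0.5
dx = 1.0 / nx
dt = 0.4 * dx / u                      # CFL-limited time step
x = np.arange(nx) * dx
c = np.exp(-200 * (x - 0.3) ** 2)      # initial pollutant puff

def advect(c, dt):
    """First-order upwind advection over one (sub)step, periodic BCs."""
    return c - u * dt / dx * (c - np.roll(c, 1))

def react(c, dt):
    """Exact solution of the linear chemistry term dc/dt = -k c."""
    return c * np.exp(-k * dt)

for _ in range(200):                   # Strang splitting loop
    c = advect(c, dt / 2)
    c = react(c, dt)
    c = advect(c, dt / 2)
print(f"total mass after transport + chemistry: {c.sum() * dx:.4f}")
```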
Abstract:
The Aqua-Planet Experiment (APE) was first proposed by Neale and Hoskins (2000a) as a benchmark for atmospheric general circulation models (AGCMs) on an idealised water-covered Earth. The experiment and its aims are summarised, and its context within a modelling hierarchy used to evaluate complex models and to provide a link between realistic simulation and conceptual models of atmospheric phenomena is discussed. The simplified aqua-planet configuration bridges a gap in the existing hierarchy. It is designed to expose differences between models and to focus attention on particular phenomena and their response to changes in the underlying distribution of sea surface temperature.
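For reference, the aqua-planet boundary condition is a prescribed, zonally symmetric sea surface temperature. The sketch below evaluates the control profile commonly attributed to Neale and Hoskins (2000a), SST = 27(1 − sin²(3φ/2)) °C within 60° of the equator and 0 °C poleward; the exact functional form here is an assumption quoted from the APE literature, not restated in this abstract:

```python
import numpy as np

# Zonally symmetric aqua-planet SST, following the commonly cited APE
# "control" profile (assumed form; see the APE specification):
#   SST(lat) = 27 * (1 - sin^2(3*lat/2)) degC for |lat| < 60 deg, else 0.
def control_sst(lat_deg):
    phi = np.radians(lat_deg)
    sst = 27.0 * (1.0 - np.sin(1.5 * phi) ** 2)
    return np.where(np.abs(lat_deg) < 60.0, sst, 0.0)

lats = np.array([-90, -60, -30, 0, 30, 60, 90])
print(dict(zip(lats.tolist(), np.round(control_sst(lats), 1).tolist())))
# SST peaks at 27 degC on the equator and falls to 0 degC poleward of 60 deg.
```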
Abstract:
The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions.
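A classic instance of the first class of problems, a simple physically based model of the global climate, is the zero-dimensional energy balance model, which sets absorbed solar radiation against outgoing longwave radiation. The sketch below integrates it with illustrative parameter values (the specific numbers are assumptions, not taken from the Theme Issue):

```python
# Zero-dimensional energy balance model:
#   C dT/dt = S0 (1 - alpha) / 4 - eps * sigma * T^4,
# balancing absorbed sunlight against outgoing longwave radiation.
S0, alpha, eps, sigma = 1361.0, 0.30, 0.61, 5.67e-8  # W/m2, -, -, W/m2/K4
C = 4.0e8                     # J/m2/K, roughly a 100 m ocean mixed layer

T, dt = 255.0, 86400.0        # start cold, daily time steps
for _ in range(100 * 365):    # integrate ~100 years to equilibrium
    T += dt / C * (S0 * (1 - alpha) / 4 - eps * sigma * T**4)
print(f"equilibrium temperature ~ {T:.1f} K")  # ~288 K with these values
```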
Abstract:
During the last termination (from ~18,000 to ~9,000 years ago), the climate warmed significantly and the ice sheets melted. Simultaneously, atmospheric CO2 increased from ~190 ppm to ~260 ppm. Although this CO2 rise plays an important role in the deglacial warming, the reasons for its evolution are difficult to explain. Only box models have been used to run transient simulations of this carbon cycle transition, and only by forcing the model with data-constrained scenarios of the evolution of temperature, sea level, sea ice, NADW formation, Southern Ocean vertical mixing and the biological carbon pump. More complex models (including GCMs) have investigated some of these mechanisms, but they have only been used to explain LGM versus present-day steady-state climates. In this study we use a coupled climate-carbon model of intermediate complexity to explore the role of three oceanic processes in transient simulations: the sinking of brines, stratification-dependent diffusion and iron fertilization. Carbonate compensation is accounted for in these simulations. We show that neither iron fertilization nor the sinking of brines alone can account for the evolution of CO2, and that only the combination of the sinking of brines and interactive diffusion can simultaneously simulate the increase in deep Southern Ocean δ13C. The scenario that agrees best with the data takes into account all mechanisms and favours a rapid cessation of the sinking of brines around 18,000 years ago, when the Antarctic ice sheet extent was at its maximum. In this scenario, we hypothesize that sea ice formation was then shifted to the open ocean, where the salty water is quickly mixed with fresher water; this prevents deep sinking of salty water, and therefore breaks down the deep stratification and releases carbon from the abyss. Based on this scenario, it is possible to simulate both the amplitude and timing of the long-term CO2 increase during the last termination in agreement with ice core data. The atmospheric δ13C appears to be highly sensitive to changes in the terrestrial biosphere, underlining the need to better constrain the vegetation evolution during the termination.
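The stratification mechanism invoked here can be caricatured with a two-box ocean (this toy is ours, not the study's intermediate-complexity model): while vertical exchange is weak, carbon stays stored in the enriched deep box; when stratification breaks down and exchange strengthens, the stored carbon mixes back toward the surface:

```python
# Toy two-box illustration of the stratification argument (not the
# study's model): a carbon-enriched deep box exchanges with a surface
# box at a rate k; weak k (strong stratification) keeps carbon in the
# abyss, and a larger k releases it toward the surface.
Cs, Cd = 1.0, 1.4          # arbitrary carbon "concentrations", deep box enriched
Vs, Vd = 1.0, 10.0         # relative box volumes
dt = 1.0                   # one "year" per step
for t in range(8000):
    k = 0.0002 if t < 2000 else 0.002   # weak, then strong, vertical exchange
    flux = k * (Cd - Cs)                # exchange proportional to the gradient
    Cs += dt * flux / Vs
    Cd -= dt * flux / Vd
    if t in (0, 1999, 3999, 7999):
        print(f"year {t:4d}: surface C = {Cs:.3f}, deep C = {Cd:.3f}")
```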
Abstract:
The FitzHugh-Nagumo (FN) mathematical model characterizes the action potential of the membrane. The dynamics of the FitzHugh-Nagumo model have been extensively studied, both with a view to their biological implications and as a test bed for numerical methods that can be applied to more complex models. This paper deals with the dynamics of the FN model. Here, the dynamics are analyzed qualitatively through stability diagrams for the membrane action potential. Furthermore, we also analyze the problem quantitatively through the evaluation of Floquet multipliers. Finally, the nonlinear periodic problem is controlled, based on Chebyshev polynomial expansion, the Picard iterative method and the Lyapunov-Floquet (L-F) transformation.
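In its common dimensionless form, the FitzHugh-Nagumo model couples a fast membrane potential v to a slow recovery variable w. The sketch below integrates it with standard illustrative parameter values (not necessarily those analyzed in the paper), producing the familiar relaxation oscillation:

```python
import numpy as np
from scipy.integrate import solve_ivp

# FitzHugh-Nagumo model in its common dimensionless form:
#   dv/dt = v - v^3/3 - w + I      (fast membrane potential)
#   dw/dt = eps * (v + a - b * w)  (slow recovery variable)
a, b, eps, I = 0.7, 0.8, 0.08, 0.5   # standard illustrative parameters

def fn(t, y):
    v, w = y
    return [v - v**3 / 3 - w + I, eps * (v + a - b * w)]

sol = solve_ivp(fn, (0, 200), [-1.0, 1.0], max_step=0.1)
v = sol.y[0]
print(f"v oscillates between {v.min():.2f} and {v.max():.2f}")  # stable limit cycle
```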
Abstract:
This thesis presents a new seismic data stacking technique for obtaining the normal-incidence (zero source-receiver offset) section, applicable to two-dimensional media with lateral velocity variations. This new technique, called Seismic Stacking by Plane-Wave Composition (PWC stacking), was developed from the physical and mathematical concepts of decomposing the wavefield into plane waves. The work can be divided into three parts. The first part reviews the conventional seismic stacking technique and the process of decomposing the wavefield produced by point sources into its corresponding plane waves. The second part presents the mathematical formulation and the application procedure of the plane-wave composition stacking method. The third part presents the application of this new stacking technique to the Marmousi data set, together with an analysis of noise attenuation. The mathematical formulation of this new stacking technique was developed from scattering theory applied to seismic waves under the restriction of the Born approximation. Accordingly, we first derive the solution of the acoustic wave equation for the finite source-receiver offset configuration, which is subsequently reduced to the zero source-receiver offset configuration. Based on these solutions, the mathematical expression of this new stacking process is then solved within the context of the Born approximation. The solutions obtained by the two procedures, that is, by solving the wave equation and by the proposed stacking process, were verified to be equal, thus showing that stacking by plane-wave composition produces a zero source-receiver offset section. This new stacking technique basically consists of applying a double decomposition of the wavefield into plane waves by means of two slant stacks, one along the source array and the other along the receiver array, followed by the composition of the plane waves by means of an inverse slant stack. Based on these operations, and with the help of an example applied to data generated from a simple model, the fundamentals and the application procedure (or algorithm) of this new technique for obtaining the zero-offset section are described. As an example of applying PWC stacking to data from a medium with lateral velocity variations, the technique was applied to the Marmousi data set, generated with the multiple-coverage technique from a model representing a realistic geological setting. Comparing the resulting section with the analogous section produced by the conventional stacking method shows that the zero-offset section from this new technique presents better definition and continuity of the reflectors, as well as a better characterization of diffractions. Finally, the random-noise attenuation performed on the same data shows that this stacking technique also attenuates the noise present in the signal, implying an increase in the signal-to-noise ratio.
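The elementary building block of the method is the slant stack (τ-p transform), which decomposes a record d(t, x) into plane waves by summing along lines t = τ + p·x; PWC stacking applies such transforms along both the source and receiver coordinates and then composes the plane waves back with an inverse slant stack. The sketch below illustrates only the forward transform on a made-up linear event (geometry and values are invented):

```python
import numpy as np

# Synthetic shot record with one linear event of slowness 3.0e-4 s/m.
nt, nx, dt_s, dx = 500, 64, 0.004, 25.0
t = np.arange(nt) * dt_s
x = np.arange(nx) * dx
d = np.zeros((nt, nx))
for ix, xi in enumerate(x):
    it = int(round((0.2 + 3.0e-4 * xi) / dt_s))
    d[it, ix] = 1.0

def slant_stack(d, t, x, slownesses):
    """Sum d(t, x) along t = tau + p*x for each slowness p (nearest sample)."""
    out = np.zeros((len(t), len(slownesses)))
    for ip, p in enumerate(slownesses):
        for ix, xi in enumerate(x):
            shift = int(round(p * xi / dt_s))
            out[: len(t) - shift, ip] += d[shift:, ix]
    return out

p_axis = np.linspace(0.0, 6.0e-4, 31)
taup = slant_stack(d, t, x, p_axis)
ip_max = np.unravel_index(taup.argmax(), taup.shape)[1]
print(f"event focused at p = {p_axis[ip_max]:.1e} s/m")  # ~3.0e-4 s/m
```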
Abstract:
We performed 2-D forward modeling of the magnetotelluric (MT) method with the edge finite element method (FEM), formulated in terms of primary and secondary fields. To handle more complex models and reduce the computational cost, we used unstructured meshes. In the meshes used, we introduced four nodes around each MT station, forming a square aligned with the Cartesian axes.
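In primary/secondary-field formulations of this kind, the primary field comes from a simple background model with a known analytic solution. As an illustration of such a background ingredient (not the thesis's 2-D edge-FEM solver), the sketch below evaluates the 1-D MT impedance of a homogeneous half-space, Z = sqrt(iωμ0ρ), whose apparent resistivity ρa = |Z|²/(ωμ0) recovers ρ exactly:

```python
import numpy as np

# 1-D magnetotelluric response of a homogeneous half-space:
#   Z = sqrt(i*omega*mu0*rho),  rho_a = |Z|^2 / (omega*mu0),
# with the skin depth sqrt(2*rho/(omega*mu0)) setting the depth of
# investigation at each frequency.
mu0 = 4e-7 * np.pi
rho = 100.0                          # ohm.m background resistivity
for f in (0.01, 1.0, 100.0):         # Hz
    omega = 2 * np.pi * f
    Z = np.sqrt(1j * omega * mu0 * rho)
    rho_a = abs(Z) ** 2 / (omega * mu0)
    skin_km = np.sqrt(2 * rho / (omega * mu0)) / 1000.0
    print(f"f = {f:6.2f} Hz: rho_a = {rho_a:.1f} ohm.m, skin depth = {skin_km:.1f} km")
```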
Abstract:
Surface water quality models have been developed since 1925, when Streeter and Phelps created first-order equations that represent the balance between dissolved oxygen (DO) and biochemical oxygen demand (BOD). Since then, and especially after the 1960s, new computational technologies evolved, making it possible to create more complex models, which try to represent, through mathematics, natural phenomena such as eutrophication and river self-purification. The main objective of such models is to understand aquatic systems and their relationship with the environment, so that they can support decision makers in creating water management plans and in designing environmental projects for such resources. In this regard, understanding the structure of the models is of crucial importance, so that one can choose the most appropriate model for the river in question. While one-dimensional models like QUAL2K are more appropriate for long and narrow rivers, two- or three-dimensional models (CE-QUAL-W2, WASP and CE-QUAL-ICM) are more commonly used in wide and slower rivers, with higher lateral and/or vertical mixing rates. Moreover, the more complex the river studied, the more complex the model should be, which demands more cost and time to apply it.
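This model lineage begins with the Streeter-Phelps oxygen-sag equation, which gives the dissolved-oxygen deficit D(t) downstream of a BOD load L0 with deoxygenation rate kd and reaeration rate ka. The sketch below evaluates it with illustrative parameter values (assuming kd ≠ ka; the numbers are invented):

```python
import numpy as np

# Streeter-Phelps oxygen sag (illustrative parameter values):
#   D(t) = kd*L0/(ka - kd) * (exp(-kd*t) - exp(-ka*t)) + D0*exp(-ka*t)
# where D is the DO deficit below saturation and t is travel time.
kd, ka = 0.35, 0.70               # deoxygenation / reaeration rates, 1/day
L0, D0, DO_sat = 12.0, 1.0, 9.0   # initial BOD, initial deficit, saturation DO (mg/L)

t = np.linspace(0, 15, 151)       # days of travel time downstream
D = kd * L0 / (ka - kd) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)
DO = DO_sat - D
print(f"minimum DO = {DO.min():.2f} mg/L at t = {t[DO.argmin()]:.1f} days")
```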