951 results for Initial conditions


Relevance:

60.00%

Publisher:

Abstract:

In Brazil, the special education population poses a challenge to all teachers, especially Physical Education teachers. Among others, it encompasses students with disabilities, students with intellectual giftedness, and students with pervasive developmental disorders. Besides posing challenges, the inclusion process causes concern and generates debate about the problems that prevent such pupils from fully taking part in school practices related to Physical Education. This thesis presents research that addressed these matters through collaborative work between the researcher and a Physical Education teacher in regular classrooms, following the co-teaching perspective. The starting point of the research is the following question: what contributions can collaborative work between a Physical Education teacher and a researcher provide to people with disabilities and to Physical Education teachers in regular schools attended by students who are the target population of special education? The research aimed to discuss and analyze the development of such collaborative work between the researcher and a Physical Education teacher. It followed the co-teaching perspective and was carried out in a public school in Uberlândia, state of Minas Gerais. A participant qualitative approach, which recognizes the relation between the social sciences and intervention in social reality, was the methodological choice; the research was developed in three phases: 1) conducting the research; 2) intervening in social reality; 3) assessing/diagnosing it. Data-gathering strategies included semi-structured interviews, a questionnaire, participant observation, and a group interview. Data come, above all, from oral accounts as well as from the work of the group of research participants, namely the researcher, the Physical Education teacher who works at a regular school, and three teachers involved in AEE (Atendimento Educacional Especializado), a special educational teaching program. The concept of inclusion is discussed according to authors such as Miranda (2001), Mantoan (2001), Duarte and Santos (2003), Mittler (2003), Rodrigues (2006), and Bueno (2008). The conception of collaborative work is developed in the light of studies by Capellini (2004) and Mendes (2009), among others. Results point not only to initial conditions of anguish, doubt, and hardship, but also to a willingness to debate the difficulties Physical Education teachers face in their daily pedagogical activities at school. Likewise, the results showed that the teachers who took part in the research are interested in continuing their training in co-teaching as a strategy for teaching Physical Education at inclusive schools.

Relevance:

60.00%

Publisher:

Abstract:

The presence of deaf teachers in higher education triggers a series of reactions due to cultural differences; they feel the discomfort. Cultural difference defies established power relations, and from this emerge negotiation spaces, with their constant clashes over the problems that affect deaf teachers' participation. The thesis examines the practice, resistance, resilience, and political thinking of deaf teachers in higher education. Authors such as Foucault (2004), Hall (2009), Bhabha (1998), Touraine (2009), and Veiga-Netto (2010) underlie the concept of power relations that permeates this study; Perlin (2003) and Ladd (2002) contribute the cultural focus. The investigation stems from the question: how do deaf teachers take their political stands within the established power relations in the construction of their narratives in higher education? Its goal was to identify and chart the narratives of deaf teachers in higher education. Starting from a qualitative narrative-interview approach, a corpus was constituted from the collected narratives. These narratives were coded in order to produce a thematic map, presented in the last chapter, which unfolds the recurrent clashes in the negotiation spaces of higher education. The results point to a multitude of debates: deaf teachers present not only initial conditions of distress, doubt, and difficulty in higher education, but also a disposition to further discuss the everyday power chains waged in these negotiation spaces. The identification of the narratives was vitally important to confirm the value of cultural and linguistic recognition as a strategy for new politics within the structural power relations of the university context.

Relevance:

60.00%

Publisher:

Abstract:

Aberrant behavior of biological signaling pathways has been implicated in diseases such as cancers. Therapies have been developed to target proteins in these networks in the hope of curing the illness or bringing about remission. However, identifying targets for drug inhibition that exhibit a good therapeutic index has proven challenging, since signaling pathways have a large number of components and many interconnections such as feedback, crosstalk, and divergence. Unfortunately, some characteristics of these pathways, such as redundancy, feedback, and drug resistance, reduce the efficacy of single-drug-target therapy and necessitate the employment of more than one drug to target multiple nodes in the system. However, choosing multiple targets with a high therapeutic index poses more challenges, since the combinatorial search space can be huge. To cope with the complexity of these systems, computational tools such as ordinary differential equations have been used to successfully model some of these pathways. Regrettably, building these models requires experimentally measured initial concentrations of the components and reaction rates, which are difficult to obtain and, in very large networks, may simply not be available. Fortunately, there exist other modeling tools, though not as powerful as ordinary differential equations, which do not need the rates and initial conditions to model signaling pathways; Petri nets and graph theory are among these tools. In this thesis, we introduce a methodology based on Petri net siphon analysis and graph network centrality measures for identifying prospective targets for single and multiple drug therapies. In this methodology, potential targets are first identified in the Petri net model of a signaling pathway using siphon analysis. Then, graph-theoretic centrality measures are employed to prioritize the candidate targets. An algorithm is also developed to check whether the candidate targets can disable the intended outputs in the graph model of the system. We implement structural and dynamical models of ErbB1-Ras-MAPK pathways and use them to assess and evaluate this methodology. The identified drug targets, single and multiple, correspond to clinically relevant drugs. Overall, the results suggest that this methodology, using siphons and centrality measures, shows promise in identifying and ranking drugs. Since it only uses the structural information of the signaling pathways and needs neither initial conditions nor dynamical rates, it can be applied to larger networks.
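
As a concrete illustration of the prioritization and output-disabling steps of this methodology, the following minimal Python sketch uses the networkx library on a hypothetical toy fragment of an ErbB1-Ras-MAPK-like cascade; the node names, the candidate set, and the knock-out test are illustrative stand-ins for the thesis' siphon-derived candidates and Petri net model, not the actual thesis code.

```python
import networkx as nx

# Hypothetical toy fragment of an ErbB1-Ras-MAPK-like cascade (illustrative
# node names, not the thesis' model).
edges = [
    ("EGF", "ErbB1"), ("ErbB1", "Grb2"), ("Grb2", "SOS"),
    ("SOS", "Ras"), ("Ras", "Raf"), ("Raf", "MEK"), ("MEK", "ERK"),
]
G = nx.DiGraph(edges)

# In the thesis, candidate targets come from siphon analysis of the Petri net
# model; here every internal node stands in for that candidate set.
candidates = [n for n in G.nodes if G.in_degree(n) > 0 and G.out_degree(n) > 0]

# Prioritize candidates by a graph-theoretic centrality measure (betweenness
# here; degree or closeness centrality could be used the same way).
bc = nx.betweenness_centrality(G)
ranking = sorted(candidates, key=lambda n: bc[n], reverse=True)
print("ranked candidates:", ranking)

def disables(graph, target, source="EGF", output="ERK"):
    """Check whether removing `target` disconnects the input from the output,
    i.e. whether inhibiting it can disable the intended output."""
    H = graph.copy()
    H.remove_node(target)
    return not nx.has_path(H, source, output)

print({t: disables(G, t) for t in ranking})
```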

Relevance:

60.00%

Publisher:

Abstract:

Models of neutrino-driven core-collapse supernova explosions have matured considerably in recent years. Explosions of low-mass progenitors can routinely be simulated in 1D, 2D, and 3D. Nucleosynthesis calculations indicate that these supernovae could be contributors of some lighter neutron-rich elements beyond iron. The explosion mechanism of more massive stars remains under investigation, although first 3D models of neutrino-driven explosions employing multi-group neutrino transport have become available. Together with earlier 2D models and more simplified 3D simulations, these have elucidated the interplay between neutrino heating and hydrodynamic instabilities in the post-shock region that is essential for shock revival. However, some physical ingredients may still need to be added/improved before simulations can robustly explain supernova explosions over a wide range of progenitors. Solutions recently suggested in the literature include uncertainties in the neutrino rates, rotation, and seed perturbations from convective shell burning. We review the implications of 3D simulations of shell burning in supernova progenitors for the ‘perturbations-aided neutrino-driven mechanism,’ whose efficacy is illustrated by the first successful multi-group neutrino hydrodynamics simulation of an 18 solar mass progenitor with 3D initial conditions. We conclude with speculations about the impact of 3D effects on the structure of massive stars through convective boundary mixing.

Relevance:

60.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

60.00%

Publisher:

Abstract:

Given the large amount of crude glycerol generated in biodiesel synthesis and its low commercial value, it is essential to find alternative ways to convert this substrate into value-added products. In this context, this work aimed to evaluate different oleaginous yeasts capable of metabolizing the crude glycerol generated as a co-product of biodiesel synthesis, in order to produce biomass as a lipid source. All cultivations were carried out in shake flasks, under conditions established according to each stage of the work, yielding data on cell growth and lipid production that were treated statistically according to the purpose at hand. Lipomyces lipofer NRRL Y-1155 showed significant differences from the other culture-collection yeasts, reaching 57.64% lipids in the biomass. These yeasts presented distinctive fatty-acid profiles, similar to those of the main vegetable oils used in biodiesel synthesis, with a predominance of polyunsaturated fatty acids, especially linoleic acid (68.3% in the yeast Rhodotorula glutinis NRRL YB-252). Gamma-linolenic acid, an essential ω6 fatty acid, was detected in all yeasts analyzed, reaching 23.1% in the biomass of Candida cylindracea NRRL Y-17506. Through a Plackett-Burman experimental design, the concentrations of yeast extract and of MgSO4·7H2O were found to have the greatest influence on lipid production by a wild strain of Rhodotorula mucilaginosa. For this yeast, the effects analysis made it possible to establish the following condition for lipid production: 30.0 g·L⁻¹ glycerol; 5.0 g·L⁻¹ KH2PO4; 1.0 g·L⁻¹ Na2HPO4; 3.0 g·L⁻¹ MgSO4·7H2O; 1.2 g·L⁻¹ yeast extract; initial pH 4.5; temperature 25°C. Under these conditions, a lipid content of 59.96% and total lipids of 5.51 g·L⁻¹ were achieved. An increase in the lipid content of the biomass over the cultivation time was also observed, as well as an increase in the relative content of linoleic acid, which reached 52%. Among the yeasts isolated from environmental samples from southernmost Brazil, the one identified as Cryptococcus humicola stood out from the others, presenting 23.5% saturated, 14.8% monounsaturated, and 54.9% polyunsaturated fatty acids, notably linoleic acid. The Plackett-Burman design was also used for this yeast, with the concentrations of yeast extract and crude glycerol showing the greatest influence on lipid production. Subsequently, a central composite rotatable design (CCRD) was proposed to optimize lipid production. The predictive empirical models obtained for maximum biomass and total lipids made it possible to establish the following optimized condition for lipid production by Cryptococcus humicola: 100.0 g·L⁻¹ glycerol; 5.0 g·L⁻¹ KH2PO4; 1.0 g·L⁻¹ Na2HPO4; 4.8 g·L⁻¹ yeast extract; initial pH 4.5; temperature 25°C. This condition represented an increase of about 2-fold in total lipids compared with the best condition established by the Plackett-Burman design, and of about 4.8-fold compared with the conditions tested initially, reaching 37.61% lipids and 8.85 g·L⁻¹ total lipids.
Thus, the goals of valorizing a co-product of biodiesel synthesis and of producing an oil with potential for biodiesel production were both fulfilled.
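
To make the screening-then-optimization workflow concrete, here is a minimal Python sketch assuming the third-party pyDOE2 package (an assumption; the study does not state which software was used). The factor names come from the abstract, while the low/high levels are illustrative placeholders, not the study's values.

```python
import numpy as np
from pyDOE2 import pbdesign, ccdesign  # assumed third-party DoE package

# Factor names from the abstract; low/high levels are hypothetical placeholders.
factors = ["glycerol", "KH2PO4", "Na2HPO4", "MgSO4.7H2O", "yeast extract"]
low = np.array([20.0, 2.0, 0.5, 1.0, 0.5])    # g/L, illustrative
high = np.array([40.0, 8.0, 2.0, 5.0, 2.0])   # g/L, illustrative

# Plackett-Burman screening design: a coded -1/+1 matrix, one column per factor.
coded = pbdesign(len(factors))
runs = low + (coded + 1) / 2 * (high - low)   # decode to real concentrations
print("screening runs (g/L):\n", runs)

# Central composite rotatable design (CCRD) for, say, the two factors retained
# after screening (yeast extract and crude glycerol in the C. humicola case).
ccd = ccdesign(2, center=(2, 2), alpha="rotatable", face="circumscribed")
print("coded CCRD matrix:\n", ccd)
```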

Relevance:

60.00%

Publisher:

Abstract:

A new type of space debris was recently discovered by Schildknecht in near-geosynchronous orbit (GEO). These objects were later identified as exhibiting properties associated with high area-to-mass ratio (HAMR) objects. Based on their brightness magnitudes (light curves), high rotation rates, and composition properties (albedo, amount of specular and diffuse reflection, colour, etc.), it is thought that these objects are multilayer insulation (MLI). Observations have shown that this debris type is very sensitive to environmental disturbances, particularly solar radiation pressure, because their shapes are easily deformed, leading to changes in the area-to-mass ratio (AMR) over time. This thesis proposes a simple, effective flexible model of the thin, deformable membrane, using two different methods. Firstly, the debris is modelled with Finite Element Analysis (FEA) using Bernoulli-Euler beam theory (the "Bernoulli model"). The Bernoulli model is constructed with beam elements consisting of two nodes, each with six degrees of freedom (DoF); the mass of the membrane is distributed over the beam elements. Secondly, based on multibody dynamics theory, the debris is modelled as a series of lumped masses connected through flexible joints (the "Multibody model"), representing the flexibility of the membrane itself; the mass of the membrane, albeit low, is taken into account through the lumped masses at the joints. The dynamic equations for the masses, including the constraints defined by the connecting rigid rods, are derived using fundamental Newtonian mechanics. The physical properties required by both flexible models (membrane density, reflectivity, composition, etc.) are assumed to be those of multilayer insulation. Both flexible membrane models are then propagated, together with classical orbital and attitude equations of motion, in the near-GEO region to predict the orbital evolution under the perturbations of solar radiation pressure, the Earth's gravity field, luni-solar gravitational fields, and the self-shadowing effect. These results are then compared with two rigid-body models (cannonball and flat rigid plate). When compared with a rigid model, the evolution of the orbital elements of the flexible models shows differences in the inclination and secular eccentricity evolutions, rapid irregular attitude motion, and an unstable cross-sectional area due to deformation over time. Monte Carlo simulations varying the initial attitude dynamics and deformation angle are then investigated and compared with the rigid models over 100 days. The simulations show that different initial conditions produce unique orbital motions, which differ significantly from the orbital motions of both rigid models. Furthermore, this thesis presents a methodology to determine the dynamic material properties of thin membranes and validates the deformation of the multibody model with real MLI materials. Experiments are performed in a high-vacuum chamber (10⁻⁴ mbar) replicating the space environment. A thin membrane is hinged at one end but free at the other. The first experiment, the free-motion experiment, is a free-vibration test to determine the damping coefficient and natural frequency of the thin membrane. In this test, the membrane is allowed to fall freely in the chamber, with the motion tracked and captured through high-speed video frames. A Kalman filter technique is implemented in the tracking algorithm to reduce noise and increase the tracking accuracy of the oscillating motion.
The last test, the forced-motion experiment, is performed to determine the deformation characteristics of the object. A high-power spotlight (500-2000 W) is used to illuminate the MLI, and the displacements are measured by means of a high-resolution laser sensor. Finite Element Analysis (FEA) and multibody dynamics models of the experimental setups are used to validate the flexible models by comparison with the experimental displacements and natural frequencies.
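
To illustrate the lumped-mass idea behind the "Multibody model", the following minimal Python/SciPy sketch integrates a hinged chain of lumped masses connected by spring-damper joints under a constant stand-in acceleration. All parameter values are hypothetical; the actual model uses rigid-rod constraints, 3D attitude dynamics, and the full set of orbital perturbations.

```python
import numpy as np
from scipy.integrate import solve_ivp

N = 5                     # lumped masses along a strip of membrane
m = 1e-4                  # kg per lump (MLI is very light) - hypothetical
k, c = 0.5, 1e-3          # joint stiffness (N/m), damping (N s/m) - hypothetical
L0 = 0.1                  # rest length between lumps, m
a_ext = np.array([1e-3, 0.0])   # constant stand-in for solar radiation pressure

def rhs(t, y):
    pos = y[:2 * N].reshape(N, 2)
    vel = y[2 * N:].reshape(N, 2)
    acc = np.tile(a_ext, (N, 1))
    for i in range(N - 1):              # spring-damper joint between neighbours
        d = pos[i + 1] - pos[i]
        r = np.linalg.norm(d)
        f = k * (r - L0) * d / r + c * (vel[i + 1] - vel[i])
        acc[i] += f / m
        acc[i + 1] -= f / m
    acc[0] = 0.0                        # first lump hinged, as in the experiments
    return np.concatenate([vel.ravel(), acc.ravel()])

pos0 = np.column_stack([np.arange(N) * L0, np.zeros(N)])  # straight membrane
y0 = np.concatenate([pos0.ravel(), np.zeros(2 * N)])
sol = solve_ivp(rhs, (0.0, 10.0), y0, max_step=1e-2)
print("deformed shape at t = 10 s:\n", sol.y[:2 * N, -1].reshape(N, 2))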

Relevance:

60.00%

Publisher:

Abstract:

When designing systems that are complex, dynamic, and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision-making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but is run, and the changes of system states can be observed at any point in time. This provides an insight into system dynamics, rather than just predicting the output of a system based on specific inputs. Simulation is not a decision-making tool but a decision-support tool, allowing better informed decisions to be made. Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system's performance. It can be viewed as an artificial white-room which allows one to gain insight, and also to test new theories and practices, without disrupting the daily routine of the focal organisation. What you can expect to gain from a simulation study is well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, this would allow you to answer some of the following questions:

· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?

The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends, rather than giving precise and absolute predictions of the target system's performance. The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best-practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments. The goal of this chapter is to introduce the newcomer to what we think is a valuable asset in the toolset of analysts and decision makers. We will give you a summary of the information we have gathered from the literature, and of the first-hand experience we have gained during the last five years while obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some pitfalls that we have unwittingly encountered.
Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements, to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system. Section 6 provides a collection of resources for further studies, and finally in Section 7 we conclude the chapter with a short summary.
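
As a taste of what such a model looks like in code, here is a minimal, self-contained Python sketch of the core idea stated above: a set of rules that advance system state over time, run rather than solved. It is a toy agent-based opinion model, purely illustrative and not the example developed in Section 5.

```python
import random

random.seed(1)

class Agent:
    """An agent holding an opinion; its step rule defines how state changes."""
    def __init__(self, opinion):
        self.opinion = opinion

    def step(self, agents):
        # Rule: with small probability, adopt a randomly met agent's opinion.
        other = random.choice(agents)
        if random.random() < 0.1:
            self.opinion = other.opinion

agents = [Agent(random.choice(["A", "B"])) for _ in range(50)]
for t in range(200):      # the model is run, not solved: observe state over time
    for a in agents:
        a.step(agents)

print(sum(a.opinion == "A" for a in agents), "of", len(agents),
      "agents hold opinion A")
```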

Relevance:

60.00%

Publisher:

Abstract:

Nowadays we observe social transformations that have no counterparts in previous ages. Social dependences are expressed in forms that change and increase with unprecedented intensity. Human beings, in their individual experience, particularly religious experience, realize themselves through dynamic relations to the surrounding world, other people, and God. People desire a deepened interpretation of what they experience. In this article I seek research tools which can help investigations in the area of fundamental theology, which is expected to transform or translate individual experience in a rational way into an objective interpretative pattern. We seek models of reality which will ground theological investigations in the social relation of an individual person to the world [in general] and to transcendence. Two investigative categories from the Christian tradition are accepted in the article: logos spermatikos and the assembly of God (qehal), which especially take into account the salvific perspective of the history of creation. These notions allow us to describe phenomena not only as static; they also allow us to interpret dynamic relations underway in our time. In the accepted investigative model (model 2), both the beginning and the aim of creation are qualified as dynamic realities. They influence and react to everything that happens in creation. We should therefore interpret the precondition of individual religious experience as dynamic. This will let us describe the 'transient relations' as a basis in these investigations. Such an approach will prevent us from deprecating individuality in favor of the community, or the reverse.

Relevance:

60.00%

Publisher:

Abstract:

The Theoretical and Experimental Tomography in the Sea Experiment (THETIS 1) took place in the Gulf of Lion to observe the evolution of the temperature field and the process of deep convection during the 1991-1992 winter. The temperature measurements consist of moored sensors, conductivity-temperature-depth and expendable bathythermograph surveys, and acoustic tomography. Because of this diverse data set, and since the field evolves rather fast, the analysis uses a unified framework based on estimation theory and implementing a Kalman filter. The resolution and the errors associated with the model are systematically estimated. Temperature is a good tracer of water masses. The time-evolving three-dimensional view of the field resulting from the analysis shows the details of the three classical convection phases: preconditioning, vigorous convection, and relaxation. In all phases, there is strong spatial nonuniformity, with mesoscale activity, short timescales, and sporadic evidence of advective events (surface capping, intrusions of Levantine Intermediate Water (LIW)). Deep convection, reaching 1500 m, was observed in late February; by late April the field had not yet returned to its initial conditions (strong deficit of LIW). Comparison with available atmospheric flux data shows that advection acts to delay the occurrence of convection and confirms the essential role of buoyancy fluxes. For this winter, the deep mixing results in an injection of anomalously warm water (ΔT ≃ 0.03°) to a depth of 1500 m, compatible with the deep warming previously reported.
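
For readers unfamiliar with the estimation-theory machinery, a minimal scalar Kalman filter sketch in Python is given below. The numbers are illustrative, not THETIS 1 values, and the actual analysis applies the filter within a far richer unified framework.

```python
import numpy as np

rng = np.random.default_rng(0)

# Truth: a temperature drifting slowly; observations corrupted by sensor noise.
T_true = 13.0 + np.cumsum(rng.normal(0.0, 0.01, 200))
obs = T_true + rng.normal(0.0, 0.1, 200)

q, r = 0.01 ** 2, 0.1 ** 2    # process and observation noise variances
x, p = obs[0], 1.0            # state estimate and its error variance
estimates = []
for z in obs:
    p = p + q                 # predict: uncertainty grows by process noise
    gain = p / (p + r)        # Kalman gain weighs model against observation
    x = x + gain * (z - x)    # update with the innovation
    p = (1.0 - gain) * p
    estimates.append(x)

rmse = np.sqrt(np.mean((np.array(estimates) - T_true) ** 2))
print(f"RMSE of the filtered estimate: {rmse:.3f} (raw obs noise: 0.100)")
```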

Relevance:

60.00%

Publisher:

Abstract:

The anticipated growth of air traffic worldwide requires enhanced Air Traffic Management (ATM) technologies and procedures to increase the system capacity, efficiency, and resilience, while reducing environmental impact and maintaining operational safety. To deal with these challenges, new automation and information exchange capabilities are being developed through different modernisation initiatives toward a new global operational concept called Trajectory Based Operations (TBO), in which aircraft trajectory information becomes the cornerstone of advanced ATM applications. This transformation will lead to higher levels of system complexity, requiring enhanced Decision Support Tools (DST) to aid humans in the decision-making processes. These will rely on accurate predicted aircraft trajectories, provided by advanced Trajectory Predictors (TP). The trajectory prediction process is subject to stochastic effects that introduce uncertainty into the predictions. Regardless of the assumptions that define the aircraft motion model underpinning the TP, deviations between predicted and actual trajectories are unavoidable. This thesis proposes an innovative method to characterise the uncertainty associated with a trajectory prediction, based on the mathematical theory of Polynomial Chaos Expansions (PCE). Assuming univariate PCEs of the trajectory prediction inputs, the method describes how to generate multivariate PCEs of the prediction outputs that quantify their associated uncertainty. Arbitrary PCE (aPCE) was chosen because it allows a higher degree of flexibility to model input uncertainty. The obtained polynomial description can be used in subsequent prediction sensitivity analyses thanks to the relationship between polynomial coefficients and Sobol indices. The Sobol indices enable ranking the input parameters according to their influence on trajectory prediction uncertainty. The applicability of the aPCE-based uncertainty quantification detailed herein is analysed through a case study. This case study represents a typical aircraft trajectory prediction problem in ATM, in which uncertain parameters regarding aircraft performance, aircraft intent description, weather forecast, and initial conditions are considered simultaneously. Numerical results are compared to those obtained from a Monte Carlo simulation, demonstrating the advantages of the proposed method. The thesis includes two examples of DSTs (Demand and Capacity Balancing tool, and Arrival Manager) to illustrate the potential benefits of exploiting the proposed uncertainty quantification method.
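
The flavour of the method can be conveyed with a minimal non-intrusive sketch: for a toy trajectory prediction with a single Gaussian-uncertain input, fit a Hermite polynomial chaos expansion by regression and read the output moments off the coefficients, comparing against Monte Carlo. This uses a classical Hermite basis rather than the thesis' arbitrary PCE, and all numbers are illustrative.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(42)

t = 600.0                     # prediction horizon, s (illustrative)
v_mean, v_std = 230.0, 5.0    # uncertain ground speed, m/s (illustrative)

def predict(v):
    """Toy trajectory predictor: along-track distance x = v * t."""
    return v * t

# Non-intrusive PCE: sample the standard Gaussian germ, run the predictor,
# fit the coefficients of a probabilists' Hermite expansion by least squares.
xi = rng.standard_normal(500)
y = predict(v_mean + v_std * xi)
degree = 3
A = hermevander(xi, degree)               # design matrix of He_0 .. He_3
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Output moments follow from orthogonality: mean = c_0, var = sum_k k! c_k^2.
mean_pce = coef[0]
var_pce = sum(factorial(k) * coef[k] ** 2 for k in range(1, degree + 1))

# Monte Carlo reference, mirroring the comparison made in the thesis.
mc = predict(v_mean + v_std * rng.standard_normal(200_000))
print(f"PCE mean {mean_pce:.1f} m, std {np.sqrt(var_pce):.1f} m")
print(f"MC  mean {mc.mean():.1f} m, std {mc.std():.1f} m")
```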

Relevance:

60.00%

Publisher:

Abstract:

Planar cell polarity (PCP) occurs in the epithelia of many animals and can lead to the alignment of hairs, bristles and feathers; physiologically, it can organise ciliary beating. Here we present two approaches to modelling this phenomenon. The aim is to discover the basic mechanisms that drive PCP, while keeping the models mathematically tractable. We present a feedback and diffusion model, in which adjacent cell sides of neighbouring cells are coupled by a negative feedback loop and diffusion acts within the cell. This approach can give rise to polarity, but also to period-two patterns. Polarisation arises via an instability, provided the feedback is sufficiently strong and the diffusion sufficiently weak. Moreover, we discuss a conservative model in which proteins within a cell are redistributed depending on the amount of proteins in the neighbouring cells, coupled with intracellular diffusion. In this case polarity can arise from weakly polarised initial conditions, or via a wave, provided the diffusion is weak enough. Both models can overcome small anomalies in the initial conditions. Furthermore, the range of the effects of groups of cells with different properties than the surrounding cells depends on the strength of the initial global cue and the intracellular diffusion.
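
A minimal sketch of the conservative model's flavour is given below, under strong simplifying assumptions: a 1D ring of cells, two protein pools per cell, and an illustrative redistribution rule that is not the paper's equations. It shows the two ingredients named above: neighbour-dependent redistribution (conserving the per-cell total) plus intracellular diffusion, starting from weakly polarised initial conditions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, steps = 40, 5000
alpha, D = 0.05, 0.005   # intercellular feedback strength; intracellular diffusion

# L[i], R[i]: protein on the left/right side of cell i on a periodic ring.
L = 0.5 + 0.01 * rng.standard_normal(N)   # weakly polarised initial conditions
R = 1.0 - L                               # per-cell total starts at 1, conserved

for _ in range(steps):
    # If the right neighbour's facing side carries more protein than the left
    # neighbour's facing side, protein shifts from R to L within the cell;
    # the L*R factor keeps amounts positive and the per-cell total conserved.
    flux = alpha * (np.roll(L, -1) - np.roll(R, 1)) * L * R
    mix = D * (R - L)                     # intracellular diffusion between sides
    L = L + flux + mix
    R = R - flux - mix

print("per-cell total (conserved):", round(float((L + R).mean()), 6))
print("polarity R - L per cell:", np.round(R - L, 2))
```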

Relevance:

60.00%

Publisher:

Abstract:

In the past few years, there has been a concern among economists and policy makers that increased openness to international trade affects some regions in a country more than others. Recent research has found that local labor markets more exposed to import competition through their initial employment composition experience worse outcomes in several dimensions, such as employment, wages, and poverty. Although there is evidence that regions within a country exhibit variation in the intensity with which they trade with each other and with other countries, trade linkages have been ignored in empirical analyses of the regional effects of trade, which focus on differences in employment composition. In this dissertation, I investigate how local labor markets' trade linkages shape the response of wages to international trade shocks. In the second chapter, I lay out a standard multi-sector general equilibrium model of trade, where domestic regions trade with each other and with the rest of the world. Using this benchmark, I decompose a region's wage change resulting from a national import cost shock into a direct effect on prices, holding other endogenous variables constant, and a series of general equilibrium effects. I argue the direct effect provides a natural measure of exposure to import competition within the model, since it summarizes the effect of the shock on a region's wage as a function of initial conditions given by its trade linkages. I call my proposed measure linkage exposure, while I refer to the measures used in previous studies as employment exposure. My theoretical analysis also shows that the assumptions previous studies make on trade linkages are not consistent with the standard trade model. In the third chapter, I calibrate the model to the Brazilian economy in 1991--at the beginning of a period of trade liberalization--to perform a series of experiments. In each of them, I reduce the Brazilian import cost by 1 percent in a single sector, and I calculate how much of the cross-regional variation in counterfactual wage changes is explained by exposure measures. Over this set of experiments, employment exposure explains, for the median sector, 2 percent of the variation in counterfactual wage changes, while linkage exposure explains 44 percent. In addition, I propose an estimation strategy that incorporates trade linkages in the analysis of the effects of trade on observed wages. In the model, changes in wages are completely determined by changes in market access, an endogenous variable that summarizes the real demand faced by a region. I show that a linkage measure of exposure is a valid instrument for changes in market access within Brazil. Using observed wage changes in Brazil between 1991 and 2000, my estimates imply that a region at the 25th percentile of the change in domestic market access induced by trade liberalization experiences a 0.6-log-point larger wage decline (or smaller wage increase) than a region at the 75th percentile. The estimates from a regression of wage changes on exposure imply that a region at the 25th percentile of exposure experiences a 3-log-point larger wage decline (or smaller wage increase) than a region at the 75th percentile. I conclude that estimates based on employment exposure overstate the negative impact of trade liberalization on wages in Brazil. In the fourth chapter, I extend the standard model to allow for two types of workers according to their education levels: skilled and unskilled.
I show that there is substantial variation across Brazilian regions in the skill premium. I use the exogenous variation provided by tariff changes to estimate the impact of market access on the skill premium. I find that decreased domestic market access resulting from trade liberalization resulted in a higher skill premium. I propose a mechanism to explain this result: that the manufacturing sector is relatively more intensive in unskilled labor and I show empirical evidence that supports this hypothesis.

Relevance:

60.00%

Publisher:

Abstract:

We describe and evaluate two reduced models for nonlinear chemical reactions in a chaotic laminar flow. Each model involves two separate steps to compute the chemical composition at a given location and time. The “manifold tracking model” first tracks backwards in time a segment of the stable manifold of the requisite point. This then provides a sample of the initial conditions appropriate for the second step, which requires solving one-dimensional problems for the reaction in Lagrangian coordinates. By contrast, the first step of the “branching trajectories model” simulates both the advection and diffusion of fluid particles that terminate at the appropriate point; the chemical reaction equations are then solved along each of the branched trajectories in a second step. Results from each model are compared with full numerical simulations of the reaction processes in a chaotic laminar flow.
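
The following minimal Python sketch conveys the shared second step of both models under stated assumptions: the chaotic flow is the standard alternating sine map (a common model flow, a stand-in rather than the paper's flow), the reaction is a simple autocatalytic one, and a small particle ensemble crudely plays the role of the branched trajectories (the full models instead track stable manifolds backwards in time or branch diffusive trajectories).

```python
import numpy as np
from scipy.integrate import solve_ivp

def sine_flow(x, y, A=0.8):
    """One period of the alternating sine map on the periodic unit square."""
    x = (x + A * np.sin(2 * np.pi * y)) % 1.0   # horizontal shear half-period
    y = (y + A * np.sin(2 * np.pi * x)) % 1.0   # vertical shear half-period
    return x, y

rng = np.random.default_rng(0)
pts = 0.3 + 0.01 * rng.random((10, 2))   # small ensemble near a point of interest

for _ in range(20):                      # advect the ensemble through the flow
    for i, (x, y) in enumerate(pts):
        pts[i] = sine_flow(x, y)

# Second step of both reduced models: solve the reaction in Lagrangian
# coordinates along each trajectory; here dc/dt = c(1 - c), autocatalytic.
finals = []
for _ in pts:
    c0 = 0.01 * (1.0 + 0.1 * rng.standard_normal())   # sampled initial condition
    sol = solve_ivp(lambda t, c: c * (1.0 - c), (0.0, 20.0), [c0], max_step=0.5)
    finals.append(sol.y[0, -1])

print("endpoint spread (chaotic stretching):", pts.std(axis=0))
print("mean final concentration:", float(np.mean(finals)))
```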

Relevance:

60.00%

Publisher:

Abstract:

We study networks of nonlocally coupled electronic oscillators that can be described approximately by a Kuramoto-like model. The experimental networks show long complex transients from random initial conditions on the route to network synchronization. The transients display complex behaviors, including resurgence of chimera states, which are network dynamics where order and disorder coexist. The spatial domain of the chimera state moves around the network and alternates with desynchronized dynamics. The fast time scale of our oscillators (on the order of 100 ns) allows us to study the scaling of the transient time of large networks of more than a hundred nodes, which had not previously been confirmed in experiment and could potentially be important in many natural networks. We find that the average transient time increases exponentially with the network size and can be modeled as a Poisson process in experiment and simulation. This exponential scaling is a result of a synchronization rate that follows a power law of the phase-space volume.
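
A minimal Python sketch of the kind of nonlocally coupled Kuramoto-like model referred to above is given below, with a phase-lag parameter near π/2, the regime in which chimera states are typically reported; all parameters are illustrative, not those of the electronic experiment.

```python
import numpy as np

rng = np.random.default_rng(7)
N, R = 128, 32                # oscillators on a ring; nonlocal coupling radius
K, alpha = 0.1, 1.46          # coupling strength; phase lag near pi/2
theta = rng.uniform(0.0, 2.0 * np.pi, N)   # random initial conditions

# Index table of each oscillator's 2R nonlocal neighbours on the ring.
offsets = np.concatenate([np.arange(-R, 0), np.arange(1, R + 1)])
idx = (np.arange(N)[:, None] + offsets[None, :]) % N

dt, steps = 0.05, 10000
for _ in range(steps):        # explicit Euler integration, adequate for a sketch
    coupling = np.mean(np.sin(theta[idx] - theta[:, None] - alpha), axis=1)
    theta += dt * K * coupling          # identical oscillators, co-rotating frame

# Global Kuramoto order parameter: r -> 1 means full synchronization; during a
# chimera transient, part of the ring is coherent while the rest is not.
r = np.abs(np.mean(np.exp(1j * theta)))
print(f"global order parameter after the transient: r = {r:.3f}")
```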