40 results for Computational modelling by homology
Abstract:
Some points of the paper by N.K. Nichols (see ibid., vol. AC-31, p. 643-5, 1986), concerning the robust pole assignment of linear multi-input systems, are clarified. It is stressed that minimizing the condition number of the closed-loop eigenvector matrix does not necessarily lead to robustness of the pole assignment. It is shown why the computational method, which Nichols claims is robust, is in fact numerically unstable with respect to the determination of the gain matrix. In reply, Nichols presents arguments to support the choice of the conditioning of the closed-loop poles as a measure of robustness, and to show that the methods of J. Kautsky, N.K. Nichols and P. Van Dooren (1985) are stable in the sense that they produce accurate solutions to well-conditioned problems.
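The robust pole-assignment approach of Kautsky, Nichols and Van Dooren discussed in this exchange is implemented in SciPy as `scipy.signal.place_poles`. A minimal sketch, with invented system matrices chosen purely for illustration, showing how the gain matrix and the conditioning of the closed-loop eigenvector matrix can be examined:

```python
# Sketch (illustrative matrices, not from the papers): robust pole
# assignment for a multi-input system via scipy.signal.place_poles,
# which implements the Kautsky-Nichols-Van Dooren (1985) approach.
import numpy as np
from scipy.signal import place_poles

A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])      # chain of integrators (toy example)
B = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])           # two inputs
poles = np.array([-1.0, -2.0, -3.0])

fsf = place_poles(A, B, poles)       # default 'YT' method seeks robustness
K = fsf.gain_matrix                  # state-feedback gain
X = fsf.X                            # closed-loop eigenvector matrix
kappa = np.linalg.cond(X)            # the conditioning debated in the exchange

print("gain matrix:\n", K)
print("cond(X):", kappa)
```

A small condition number of `X` is the robustness measure advocated by Nichols; the comment being replied to argues that this alone does not guarantee accurate computation of the gain matrix.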
Abstract:
A recent paper published in this journal considers the numerical integration of the shallow-water equations using the leapfrog time-stepping scheme [Sun Wen-Yih, Sun Oliver MT. A modified leapfrog scheme for shallow water equations. Comput Fluids 2011;52:69–72]. The authors of that paper propose using the time-averaged height in the numerical calculation of the pressure-gradient force, instead of the instantaneous height at the middle time step. The authors show that this modification doubles the maximum Courant number (and hence the maximum time step) at which the integrations are stable, thereby doubling the computational efficiency. Unfortunately, the pressure-averaging technique proposed by the authors is not original: it was devised and published by Shuman [5] and has been widely used in the atmosphere and ocean modelling community for over 40 years.
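The pressure-averaging idea can be sketched for the 1-D linearized shallow-water equations. The grid, parameters and the explicit update ordering below are assumptions for demonstration only, not the code of either paper: because the height equation does not involve the pressure gradient, the new height can be computed first and its three-level time average then used in the velocity update.

```python
# Illustrative sketch: leapfrog integration of the 1-D linearized
# shallow-water equations with Shuman-style pressure averaging.
# All parameters and the explicit realization are assumptions.
import numpy as np

g, H = 9.81, 10.0            # gravity, mean depth
nx, dx = 64, 1.0e3           # periodic grid
c = np.sqrt(g * H)           # gravity-wave speed
dt = 0.5 * dx / c            # Courant number 0.5

x = np.arange(nx) * dx
h0 = 0.1 * np.sin(2 * np.pi * x / (nx * dx))
u0 = np.zeros(nx)

def ddx(f):
    """Centred difference on a periodic grid."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2 * dx)

# Forward-Euler first step to start the three-level leapfrog scheme
h_nm1, u_nm1 = h0, u0
h_n = h0 - dt * H * ddx(u0)
u_n = u0 - dt * g * ddx(h0)

for _ in range(200):
    # Height update first: its right-hand side involves only u
    h_np1 = h_nm1 - 2 * dt * H * ddx(u_n)
    # Time-averaged height in the pressure-gradient term (Shuman averaging),
    # replacing the instantaneous height at the middle time step
    h_bar = 0.25 * (h_np1 + 2 * h_n + h_nm1)
    u_np1 = u_nm1 - 2 * dt * g * ddx(h_bar)
    h_nm1, h_n = h_n, h_np1
    u_nm1, u_n = u_n, u_np1
```

With the averaged height in the gradient term, the stability limit on the Courant number is extended relative to the unmodified leapfrog scheme, which is the effect both papers discuss.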
Abstract:
A mechanism for amplification of mountain waves, and their associated drag, by parametric resonance is investigated using linear theory and numerical simulations. This mechanism, which is active when the Scorer parameter oscillates with height, was recently classified by previous authors as intrinsically nonlinear. Here it is shown that, if friction is included in the simplest possible form as a Rayleigh damping, and the solution to the Taylor-Goldstein equation is expanded in a power series of the amplitude of the Scorer parameter oscillation, linear theory can replicate the resonant amplification produced by numerical simulations with some accuracy. The drag is significantly altered by resonance in the vicinity of n/l_0 = 2, where l_0 is the unperturbed value of the Scorer parameter and n is the wave number of its oscillation. Depending on the phase of this oscillation, the drag may be substantially amplified or attenuated relative to its non-resonant value, displaying either single maxima or minima, or double extrema, near n/l_0 = 2. Both non-hydrostatic effects and friction tend to reduce the magnitude of the drag extrema. However, in exactly inviscid conditions, the single drag maximum and minimum are suppressed. Since friction in the atmosphere is often small but non-zero outside the boundary layer, modelling of the drag-amplification mechanism addressed here should be quite sensitive to the type of turbulence closure employed in numerical models, or to computational dissipation in nominally inviscid simulations.
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Because of the complexity of the climate system, and because the regional manifestations of climate change occur mainly through changes in the statistics of regional weather variations, the scientific and computational requirements for reliable prediction are enormous. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, each with computer capability of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data-analysis infrastructure.
Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current computing power has placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather-climate interactions using 1-km resolution models, and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data-analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
Abstract:
Relating the measurable, large-scale effects of anaesthetic agents to their molecular and cellular targets of action is necessary to better understand the principles by which they affect behavior, to enable the design and evaluation of more effective agents, and to improve clinical monitoring of existing and future drugs. Volatile and intravenous general anaesthetic agents (GAs) are now known to exert their effects on a variety of protein targets, the most important of which appear to be neuronal ion channels. It is hence unlikely that anaesthetic effect results from a unitary mechanism at the single-cell level. However, by altering the behavior of ion channels, GAs are believed to change the overall dynamics of distributed networks of neurons. This disruption of regular network activity can be hypothesized to cause the hypnotic and analgesic effects of GAs, and may well present more stereotypical characteristics than its underlying microscopic causes. Nevertheless, surprisingly few theories have attempted to integrate, in a quantitative manner, the empirically well-documented alterations in neuronal ion channel behavior with the corresponding macroscopic effects. Here we outline one such approach, and show that a range of well-documented effects of anaesthetics on the electroencephalogram (EEG) can be provisionally accounted for. In particular, we parameterize, on the basis of detailed empirical data, the effects of halogenated volatile ethers (a clinically widely used class of general anaesthetic agent). The resulting model is able to account for a range of anaesthetically induced EEG phenomena, including EEG slowing, biphasic changes in EEG power, and the dose-dependent appearance of anomalous ictal activity, as well as providing a basis for novel approaches to monitoring brain function in both health and disease.
Abstract:
Activating transcription factor 3 (Atf3) is rapidly and transiently upregulated in numerous systems, and is associated with various disease states. Atf3 is required for negative feedback regulation of other genes, but is itself subject to negative feedback regulation possibly by autorepression. In cardiomyocytes, Atf3 and Egr1 mRNAs are upregulated via ERK1/2 signalling and Atf3 suppresses Egr1 expression. We previously developed a mathematical model for the Atf3-Egr1 system. Here, we adjusted and extended the model to explore mechanisms of Atf3 feedback regulation. Introduction of an autorepressive loop for Atf3 tuned down its expression and inhibition of Egr1 was lost, demonstrating that negative feedback regulation of Atf3 by Atf3 itself is implausible in this context. Experimentally, signals downstream from ERK1/2 suppress Atf3 expression. Mathematical modelling indicated that this cannot occur by phosphorylation of pre-existing inhibitory transcriptional regulators because the time delay is too short. De novo synthesis of an inhibitory transcription factor (ITF) with a high affinity for the Atf3 promoter could suppress Atf3 expression, but (as with the Atf3 autorepression loop) inhibition of Egr1 was lost. Developing the model to include newly-synthesised miRNAs very efficiently terminated Atf3 protein expression and, with a 4-fold increase in the rate of degradation of mRNA from the mRNA/miRNA complex, profiles for Atf3 mRNA, Atf3 protein and Egr1 mRNA approximated to the experimental data. Combining the ITF model with that of the miRNA did not improve the profiles suggesting that miRNAs are likely to play a dominant role in switching off Atf3 expression post-induction.
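The miRNA mechanism favoured by the abstract can be illustrated with a toy ODE system: a transient stimulus induces Atf3 mRNA, Atf3 protein represses Egr1 and induces a miRNA, and the miRNA accelerates Atf3 mRNA degradation, switching expression off post-induction. Every rate constant below is a made-up placeholder, not a parameter from the published model.

```python
# Toy sketch (invented parameters, not the authors' model): negative
# feedback on Atf3 via a newly-synthesised miRNA that accelerates
# Atf3 mRNA degradation; Atf3 protein represses Egr1 transcription.
import numpy as np

def step(state, dt, stim):
    m_atf3, p_atf3, mi, m_egr1 = state
    d_m  = stim - 0.2 * m_atf3 - 1.0 * mi * m_atf3   # miRNA-enhanced mRNA decay
    d_p  = 0.5 * m_atf3 - 0.1 * p_atf3               # translation / protein decay
    d_mi = 0.05 * p_atf3 - 0.05 * mi                 # miRNA induced downstream of Atf3
    d_e  = stim / (1.0 + p_atf3) - 0.2 * m_egr1      # Atf3 protein represses Egr1
    return state + dt * np.array([d_m, d_p, d_mi, d_e])

state = np.zeros(4)
dt = 0.01
traj = []
for i in range(20000):
    stim = 1.0 if i * dt < 30.0 else 0.0             # transient ERK1/2 stimulus
    state = step(state, dt, stim)
    traj.append(state.copy())
traj = np.array(traj)
```

Even in this crude sketch, Atf3 mRNA rises transiently and is then driven down as the miRNA accumulates, reproducing the qualitative "rapid and transient upregulation" the abstract describes.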
Abstract:
In recent years, computational fluid dynamics (CFD) has been widely used to simulate airflow and address indoor-environment problems. The complexity of airflows within the indoor environment makes experimental investigation difficult to undertake and also imposes significant challenges on turbulence modelling for flow prediction. This research examines, through CFD visualization, how air is distributed within a room. Measurements of air temperature and air velocity were performed at a number of points in an environmental test chamber with a human occupant. To complement the experimental results, CFD simulations were carried out; the results enabled detailed analysis and visualization of the spatial distribution of airflow patterns, and the effect of different parameters to be predicted. The results demonstrate the complexity of modelling human exhalation within a ventilated enclosure and shed some light on how to achieve more realistic predictions of the airflow within an occupied enclosure.
Abstract:
Using the novel technique of topic modelling, this paper examines thematic patterns and their changes over time in a large corpus of corporate social responsibility (CSR) reports produced in the oil sector. Whereas previous research on corporate communications has been small-scale or focused on selected lexical aspects and thematic categories identified ex ante, our approach allows thematic patterns to emerge from the data. The analysis reveals a number of major trends and topic shifts pointing to changing practices of CSR. Nowadays ‘people’, ‘communities’ and ‘rights’ seem to be given more prominence, whereas ‘environmental protection’ appears to be less relevant. Using more established corpus-based methods, we subsequently explore two top phrases, ‘human rights’ and ‘climate change’, that were identified as representative of the shifting thematic patterns. Our approach strikes a balance between purely quantitative and qualitative methodologies and offers applied linguists new ways of exploring discourse in large collections of texts.