134 results for Full-scale Physical Modelling


Relevance: 30.00%

Abstract:

Recent coordinated observations of interplanetary scintillation (IPS) from EISCAT, MERLIN, and STELab, together with stereoscopic white-light imaging from the two heliospheric imagers (HIs) onboard the twin STEREO spacecraft, make it possible to track continuously the propagation and evolution of solar eruptions through interplanetary space. To better understand the observational signatures of these two remote-sensing techniques, the magnetohydrodynamics of the macro-scale interplanetary disturbance and the radio-wave scattering by micro-scale electron-density fluctuations are coupled and investigated using a newly constructed multi-scale numerical model. The model is then applied to the case of an interplanetary shock propagating within the ecliptic plane. The shock can be nearly invisible to an HI once it enters that HI's Thomson-scattering sphere. The asymmetry between the optical images from the western and eastern HIs indicates that the shock propagates away from the Sun–Earth line. Meanwhile, an IPS signal, strongly dependent on the local electron density, is insensitive to the density cavity far downstream of the shock front. When this cavity (or the shock nose) is cut through by an IPS ray-path, a single speed component at the flank (or the nose) of the shock can be recorded; when an IPS ray-path penetrates the sheath between the shock nose and this cavity, two speed components, at the sheath and the flank, can be detected. Moreover, once a shock front touches an IPS ray-path, the position and speed derived at the irregularity source of the IPS signal, together with the assumption of radial propagation at constant speed, can be used to estimate when the shock front will later appear at a given elongation in the HI field of view. The results of synthetic measurements from forward modelling are helpful in inferring the in-situ properties of coronal mass ejections from real observational data via an inverse approach.
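
As a rough illustration of the last point, the sketch below (not from the paper) computes the elongation at which a radially moving, constant-speed point feature would appear to an observer at 1 AU, given an assumed starting distance, speed and angle from the Sun-observer line; all numbers are hypothetical.

```python
import numpy as np

AU_KM = 1.495978707e8  # kilometres per astronomical unit

def elongation_of_radial_point(r0_au, speed_kms, hours_elapsed,
                               beta_deg, d_obs_au=1.0):
    """Elongation (deg) of a point feature moving radially outward at constant
    speed, seen by an observer at heliocentric distance d_obs_au.  beta_deg is
    the angle at the Sun between the feature's radial direction and the
    Sun-observer line."""
    r_au = r0_au + speed_kms * hours_elapsed * 3600.0 / AU_KM
    beta = np.radians(beta_deg)
    # Plane triangle Sun-observer-feature; elongation is measured from the Sun.
    eps = np.arctan2(r_au * np.sin(beta), d_obs_au - r_au * np.cos(beta))
    return np.degrees(eps)

# Hypothetical case: a shock crossing an IPS ray path at 0.4 AU,
# 30 degrees off the Sun-observer line, travelling at 600 km/s.
for h in (0, 12, 24, 48):
    print(f"{h:2d} h : elongation = "
          f"{elongation_of_radial_point(0.4, 600.0, h, 30.0):.1f} deg")
```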

Relevance: 30.00%

Abstract:

A methodology is presented for the development of a combined seasonal weather and crop productivity forecasting system. The first stage of the methodology is the determination of the spatial scale(s) on which the system could operate; this determination has been made for the case of groundnut production in India. Rainfall is a dominant climatic determinant of groundnut yield in India. The relationship between yield and rainfall has been explored using data from 1966 to 1995. On the all-India scale, seasonal rainfall explains 52% of the variance in yield. On the subdivisional scale, correlations vary between an explained variance of r² = 0.62 (significance level p < 10⁻⁴) and a negative correlation with r² = 0.1 (p = 0.13). The spatial structure of the relationship between rainfall and groundnut yield has been explored using empirical orthogonal function (EOF) analysis. A coherent, large-scale pattern emerges for both rainfall and yield. On the subdivisional scale (~300 km), the first principal component (PC) of rainfall correlates well with the first PC of yield (r² = 0.53, p < 10⁻⁴), demonstrating that the large-scale patterns picked out by the EOFs are related. The physical significance of this result is demonstrated. Use of larger averaging areas for the EOF analysis resulted in lower and (over time) less robust correlations. Because of this loss of detail when using larger spatial scales, the subdivisional scale is suggested as an upper limit on the spatial scale for the proposed forecasting system. Further, district-level EOFs of the yield data demonstrate the validity of upscaling these data to the subdivisional scale. Similar patterns have been produced using data on both of these scales, and the first PCs are very highly correlated (r² = 0.96). Hence, a working spatial scale has been identified, typical of that used in seasonal weather forecasting, that can form the basis of crop modeling work for the case of groundnut production in India. Last, the change in correlation between yield and seasonal rainfall during the study period has been examined using seasonal totals and monthly EOFs. A further link between yield and subseasonal variability is demonstrated via analysis of dynamical data.
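
A minimal sketch of the kind of EOF/PC comparison described above, using synthetic rainfall and yield anomalies rather than the Indian data; the leading mode of each field is extracted by SVD and the squared correlation of the two leading PCs is reported.

```python
import numpy as np

def leading_pc(field):
    """First principal component time series and EOF of a (years x regions)
    anomaly matrix, obtained via SVD."""
    anomalies = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    return u[:, 0] * s[0], vt[0]   # PC time series, spatial pattern

rng = np.random.default_rng(0)
years, regions = 30, 20                          # e.g. 1966-1995, subdivisions
driver = rng.standard_normal(years)              # shared large-scale signal
rain = np.outer(driver, rng.uniform(0.5, 1.5, regions)) \
       + 0.5 * rng.standard_normal((years, regions))
crop = np.outer(driver, rng.uniform(0.5, 1.5, regions)) \
       + 0.5 * rng.standard_normal((years, regions))

pc_rain, _ = leading_pc(rain)
pc_crop, _ = leading_pc(crop)
r2 = np.corrcoef(pc_rain, pc_crop)[0, 1] ** 2
print(f"r^2 between leading PCs: {r2:.2f}")
```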

Relevance: 30.00%

Abstract:

The rate and scale of human-driven changes can exert profound impacts on ecosystems, the species that make them up and the services they provide that sustain humanity. Given the speed at which these changes are occurring, one of society's major challenges is to coexist within ecosystems and to manage ecosystem services in a sustainable way. The effect of possible scenarios of global change on ecosystem services can be explored using ecosystem models. Such models should adequately represent ecosystem processes above and below the soil surface (aboveground and belowground) and the interactions between them. We explore possibilities to include such interactions in ecosystem models at scales that range from global to local. At the regional to global scale we suggest expanding the plant functional type concept (aggregating plants into groups according to their physiological attributes) to include functional types of aboveground-belowground interactions. At the scale of discrete plant communities, process-based and organism-oriented models could be combined into "hybrid approaches" that include an organism-oriented mechanistic representation of a limited number of trophic interactions in an otherwise process-oriented approach. Under global change the density and activity of the organisms determining these processes may change non-linearly, and therefore explicit knowledge of the organisms and their responses should ideally be included. At the individual plant scale a common organism-based conceptual model of aboveground-belowground interactions has emerged. This conceptual model facilitates the formulation of research questions to guide experiments aiming to identify patterns that are common within, but differ between, ecosystem types and biomes. Such experiments inform modelling approaches at larger scales. Future ecosystem models should better include this evolving knowledge of common patterns of aboveground-belowground interactions. Improved ecosystem models are necessary tools to reduce the uncertainty in the information that assists us in the sustainable management of our environment in a changing world.

Relevance: 30.00%

Abstract:

Networks are ubiquitous in natural, technological and social systems. They are of increasing relevance for improved understanding and control of infectious diseases of plants, animals and humans, given the interconnectedness of today's world. Recent modelling work on disease development in complex networks shows: the relative rapidity of pathogen spread in scale-free compared with random networks, unless there is high local clustering; the theoretical absence of an epidemic threshold in scale-free networks of infinite size, which implies that diseases with low infection rates can spread in them, but the emergence of a threshold when realistic features are added to networks (e.g. finite size, household structure or deactivation of links); and the influence on epidemic dynamics of asymmetrical interactions. Models suggest that control of pathogens spreading in scale-free networks should focus on highly connected individuals rather than on mass random immunization. A growing number of empirical applications of network theory in human medicine and animal disease ecology confirm the potential of the approach, and suggest that network thinking could also benefit plant epidemiology and forest pathology, particularly in human-modified pathosystems linked by commercial transport of plant and disease propagules. Potential consequences for the study and management of plant and tree diseases are discussed.
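
A simple illustration of the scale-free versus random comparison, assuming a basic discrete-time SIR process and standard networkx graph generators; the epidemic parameters and network sizes are arbitrary and the model is far simpler than those reviewed above.

```python
import random
import networkx as nx

def sir_final_size(g, beta, gamma, seed_node, rng):
    """Discrete-time SIR: each infected node infects each susceptible
    neighbour with probability beta per step, then recovers with probability
    gamma.  Returns the fraction of nodes ever infected."""
    status = {n: "S" for n in g}
    status[seed_node] = "I"
    infected = {seed_node}
    while infected:
        newly = {nb for n in infected for nb in g.neighbors(n)
                 if status[nb] == "S" and rng.random() < beta}
        for n in infected:
            if rng.random() < gamma:
                status[n] = "R"
        for n in newly:
            status[n] = "I"
        infected = {n for n, s in status.items() if s == "I"}
    return sum(s != "S" for s in status.values()) / g.number_of_nodes()

def average_final_size(g, beta, gamma=1.0, trials=20):
    rng = random.Random(42)
    return sum(sir_final_size(g, beta, gamma,
                              rng.randrange(g.number_of_nodes()), rng)
               for _ in range(trials)) / trials

n = 5000
scale_free = nx.barabasi_albert_graph(n, 3, seed=1)   # heavy-tailed degrees
random_net = nx.gnp_random_graph(n, 6.0 / n, seed=1)  # similar mean degree
for beta in (0.1, 0.2, 0.3):
    print(f"beta={beta}: scale-free {average_final_size(scale_free, beta):.3f}, "
          f"random {average_final_size(random_net, beta):.3f}")
```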

Relevance: 30.00%

Abstract:

In previous empirical and modelling studies of rare species and weeds, evidence of fractal behaviour has been found. We propose that weeds in modern agricultural systems may be managed close to critical population dynamic thresholds, below which their rates of increase will be negative and where scale invariance may be expected as a consequence. We collected detailed spatial data on five contrasting species over a period of three years in a primarily arable field. Counts in 20×20 cm contiguous quadrats, 225,000 in 1998 and 84,375 thereafter, could be re-structured into a wide range of larger quadrat sizes. These were analysed using three methods based on the correlation sum, incidence and conditional incidence. We found non-trivial scale invariance for species that occurred at low mean densities and were strongly aggregated. The fact that scale invariance was not found for widespread species occurring at higher densities suggests that the scaling in agricultural weed populations may indeed be related to critical phenomena.
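
The sketch below illustrates the general idea of re-structuring quadrat counts into larger quadrats and examining how incidence changes with quadrat size; it uses a synthetic aggregated pattern and is not the correlation-sum or conditional-incidence analysis used in the study.

```python
import numpy as np

def incidence_by_block(counts, block):
    """Fraction of block x block aggregated quadrats containing at least one
    individual.  counts is a 2-D array of per-quadrat counts whose dimensions
    are assumed divisible by block."""
    r, c = counts.shape
    agg = counts[: r - r % block, : c - c % block]
    agg = agg.reshape(agg.shape[0] // block, block,
                      agg.shape[1] // block, block).sum(axis=(1, 3))
    return (agg > 0).mean()

# Synthetic, strongly aggregated, low-density pattern on a 480 x 480 grid
rng = np.random.default_rng(3)
counts = np.zeros((480, 480), dtype=int)
for _ in range(40):                        # 40 small clusters of individuals
    cy, cx = rng.integers(0, 480, size=2)
    n = rng.poisson(30)
    ys = np.clip(cy + rng.normal(0, 4, n).astype(int), 0, 479)
    xs = np.clip(cx + rng.normal(0, 4, n).astype(int), 0, 479)
    np.add.at(counts, (ys, xs), 1)

# Scale invariance would show up as power-law behaviour of incidence vs area
for b in (1, 2, 4, 8, 16, 32):
    print(f"block {b:2d}: incidence = {incidence_by_block(counts, b):.4f}")
```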

Relevance: 30.00%

Abstract:

1. There is concern over the possibility of unwanted environmental change following transgene movement from genetically modified (GM) rapeseed Brassica napus to its wild and weedy relatives. 2. The aim of this research was to develop a remote sensing-assisted methodology to help quantify gene flow from crops to their wild relatives over wide areas. Emphasis was placed on locating sites of sympatry, where the frequency of gene flow is likely to be highest, and on measuring the size of rapeseed fields to allow spatially explicit modelling of wind-mediated pollen-dispersal patterns. 3. Remote sensing was used as a tool to locate rapeseed fields, and a variety of image-processing techniques was adopted to facilitate the compilation of a spatially explicit profile of sympatry between the crop and Brassica rapa. 4. Classified satellite images containing rapeseed fields were first used to infer the spatial relationship between donor rapeseed fields and recipient riverside B. rapa populations. Such images also have utility for improving the efficiency of ground surveys by identifying probable sites of sympatry. The same data were then also used for the calculation of mean field size. 5. This paper forms a companion paper to Wilkinson et al. (2003), in which these elements were combined to produce a spatially explicit profile of hybrid formation over the UK. The current paper demonstrates the value of remote sensing and image processing for large-scale studies of gene flow, and describes a generic method that could be applied to a variety of crops in many countries. 6. Synthesis and applications. The decision to approve or prevent the release of a GM cultivar is made at a national rather than regional level. It is highly desirable that data relating to the decision-making process are collected at the same scale, rather than relying on extrapolation from smaller experiments designed at the plot, field or even regional scale. It would be extremely difficult and labour intensive to attempt to carry out such large-scale investigations without the use of remote-sensing technology. This study used rapeseed in the UK as a model to demonstrate the value of remote sensing in assembling empirical information at a national level.
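
As an illustration of one step of such a workflow, the sketch below estimates mean field size from a classified raster by labelling contiguous patches of the crop class; the scene, class codes and pixel size are invented and this is not the authors' processing chain.

```python
import numpy as np
from scipy import ndimage

def mean_field_area(class_raster, crop_value, pixel_area_m2):
    """Mean area (ha) and count of contiguous patches of a given class in a
    classified raster, using 4-connectivity."""
    mask = class_raster == crop_value
    labels, n_fields = ndimage.label(mask)
    if n_fields == 0:
        return 0.0, 0
    sizes = ndimage.sum(mask, labels, index=range(1, n_fields + 1))
    return sizes.mean() * pixel_area_m2 / 1e4, n_fields

# Toy classified scene: 0 = other, 1 = rapeseed, 25 m pixels
scene = np.zeros((200, 200), dtype=int)
scene[20:60, 30:80] = 1          # one rectangular field
scene[120:150, 100:160] = 1      # another
mean_ha, n = mean_field_area(scene, crop_value=1, pixel_area_m2=25 * 25)
print(f"{n} fields, mean size {mean_ha:.1f} ha")
```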

Relevance: 30.00%

Abstract:

Time-resolved studies of germylene, GeH2, and dimethylgermylene, GeMe2, generated by the 193 nm laser flash photolysis of appropriate precursor molecules, have been carried out to obtain rate coefficients for their bimolecular reactions with dimethylgermane, Me2GeH2, in the gas phase. GeH2 + Me2GeH2 was studied over the pressure range 1-100 Torr with SF6 as bath gas and at five temperatures in the range 296-553 K. Only slight pressure dependences were found (at 386, 447 and 553 K). RRKM modelling was carried out to fit these pressure dependences. The high-pressure rate coefficients gave the Arrhenius parameters log(A/cm³ molecule⁻¹ s⁻¹) = −10.99 ± 0.07 and Ea = −(7.35 ± 0.48) kJ mol⁻¹. No reaction could be found between GeMe2 and Me2GeH2 at any temperature up to 549 K, and upper limits of ca. 10⁻¹⁴ cm³ molecule⁻¹ s⁻¹ were set for the rate coefficients. A rate coefficient of (1.33 ± 0.04) × 10⁻¹¹ cm³ molecule⁻¹ s⁻¹ was also obtained for GeH2 + MeGeH3 at 296 K. No reaction was found between GeMe2 and MeGeH3. Rate coefficient comparisons showed, inter alia, that Me-for-H substitution in the substrate germane increased the rate coefficients significantly, while Me-for-H substitution in the germylene decreased the rate coefficients by at least four orders of magnitude. Quantum chemical calculations (at the G2(MP2,SVP)//B3LYP level) supported these findings and showed that the lack of reactivity of GeMe2 is caused by a positive energy barrier for rearrangement of the initially formed complexes. Full details of the structures of the intermediate complexes and a discussion of their stabilities are given in the paper.
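
Using the high-pressure Arrhenius parameters quoted above for GeH2 + Me2GeH2, a short sketch of the corresponding rate-coefficient calculation (the negative activation energy makes k fall as temperature rises):

```python
import math

R = 8.314462618e-3  # gas constant, kJ mol^-1 K^-1

def arrhenius_k(log10_A, Ea_kJ_mol, T):
    """Rate coefficient k = A * exp(-Ea / (R T)), in cm^3 molecule^-1 s^-1."""
    return 10.0 ** log10_A * math.exp(-Ea_kJ_mol / (R * T))

# High-pressure parameters reported for GeH2 + Me2GeH2
log10_A, Ea = -10.99, -7.35          # note the negative activation energy
for T in (296, 386, 447, 553):
    print(f"T = {T:3d} K  k = {arrhenius_k(log10_A, Ea, T):.2e} "
          "cm^3 molecule^-1 s^-1")
```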

Relevance: 30.00%

Abstract:

Time-resolved studies of silylene, SiH2, generated by the 193 nm laser flash photolysis of phenylsilane, have been carried out to obtain rate coefficients for its bimolecular reactions with methyl-, dimethyl- and trimethyl-silanes in the gas phase. The reactions were studied over the pressure range 3-100 Torr with SF6 as bath gas and at five temperatures in the range 300-625 K. Only slight pressure dependences were found for SiH2 + MeSiH3 (485 and 602 K) and for SiH2 + Me2SiH2 (600 K). The high-pressure rate constants gave Arrhenius parameters consistent with fast, near collision-controlled association processes. RRKM modelling calculations are consistent with the observed pressure dependences (and also with the lack of a pressure dependence for SiH2 + Me3SiH). Ab initio calculations at both second-order perturbation theory (MP2) and coupled cluster (CCSD(T)) levels showed the presence of weakly-bound complexes along the reaction pathways. In the case of SiH2 + MeSiH3, two complexes with different geometries were obtained, consistent with earlier studies of SiH2 + SiH4. These complexes were stabilised by methyl substitution in the substrate silane, but all had exceedingly low barriers to rearrangement to the product disilanes. Although methyl groups in the substrate silane enhance the intrinsic SiH2 insertion rates, it is doubtful whether the intermediate complexes have a significant effect on the kinetics. A further calculation on the reaction MeSiH + SiH4 shows that methyl substitution in the silylene should have a much more significant kinetic effect (as observed in other studies).

Relevance: 30.00%

Abstract:

Design management research usually deals with the processes within the professional design team and yet, in the UK, the volume of the total project information produced by the specialist trade contractors equals or exceeds that produced by the design team. There is a need to understand the scale of this production task and to plan and manage it accordingly. The model of the process on which the plan is to be based, while generic, must be sufficiently robust to cover the majority of instances. An approach using design elements, in sufficient depth to possibly develop tools for a predictive model of the process, is described. The starting point is that each construction element and its components have a generic sequence of design activities. Specific requirements tailor the element's application to the building. Then there are the constraints produced due to the interaction with other elements. Therefore, the selection of a component within the element may impose a set of constraints that will affect the choice of other design elements. Thus, a design decision can be seen as an interrelated element-constraint-element (ECE) sub-net. To illustrate this approach, an example of the process within precast concrete cladding has been used.
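
A schematic sketch of how an element-constraint-element (ECE) sub-net might be represented in code; the element names, components and constraints are invented for illustration and this is not a tool from the paper.

```python
from collections import defaultdict

class ECENet:
    """Schematic element-constraint-element sub-net: selecting a component
    for one design element records the constraints that the selection
    imposes on other elements."""
    def __init__(self):
        self.selection = {}                    # element -> chosen component
        self.constraints = defaultdict(list)   # element -> imposed constraints

    def select(self, element, component, imposes=()):
        self.selection[element] = component
        for target, constraint in imposes:
            self.constraints[target].append((element, component, constraint))

    def constraints_on(self, element):
        return self.constraints[element]

# Hypothetical precast concrete cladding example
net = ECENet()
net.select("cladding panel", "precast concrete, 150 mm",
           imposes=[("structural frame", "edge beam must carry panel dead load"),
                    ("fixings", "cast-in sockets required at panel edges")])
net.select("fixings", "stainless cast-in sockets",
           imposes=[("structural frame", "socket positions fixed before frame detailing")])
for owner, comp, c in net.constraints_on("structural frame"):
    print(f"from {owner} ({comp}): {c}")
```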

Relevance: 30.00%

Abstract:

Several studies have highlighted the importance of the cooling period in oil absorption in deep-fat fried products. Specifically, it has been established that the largest proportion of the oil that ends up in the food is sucked into the porous crust region after the fried product is removed from the oil bath, stressing the importance of this time interval. The main objective of this paper was to develop a predictive mechanistic model that can be used to understand the principles behind post-frying cooling oil absorption kinetics, and that can also help identify the key parameters affecting the final oil uptake of the fried product. The model was developed for two geometries, an infinite slab and an infinite cylinder, and was divided into two main sub-models, one describing the immersion frying period itself and the other describing the post-frying cooling period. The immersion frying period was described by a transient moving-front model that tracked the movement of the crust/core interface, whereas post-frying cooling oil absorption was treated as a pressure-driven flow mediated by capillary forces. A key element of the model was the hypothesis that oil suction begins only once a positive pressure driving force has developed. The mechanistic model was based on measurable physical and thermal properties and process parameters, with no need for empirical data fitting, and can be used to study oil absorption in any deep-fat fried product that satisfies the assumptions made.
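
As a much-simplified illustration of pressure-driven capillary flow during cooling (not the paper's moving-front model), the sketch below applies a Washburn-type penetration law with an assumed pore radius, oil properties and an extra suction term representing the cooling-induced pressure drop; all values are hypothetical.

```python
import math

def penetration_depth(t_s, pore_radius_m, delta_p_pa, viscosity_pa_s):
    """Depth reached by a liquid in a cylindrical pore after time t for
    Poiseuille flow driven by a constant pressure difference:
    L(t) = sqrt(r^2 * dP * t / (4 * mu))."""
    return math.sqrt(pore_radius_m ** 2 * delta_p_pa * t_s / (4.0 * viscosity_pa_s))

# Hypothetical crust pore and oil properties
r = 5e-6                              # pore radius (m)
gamma, theta = 0.030, 0.0             # oil surface tension (N/m), contact angle
mu = 0.05                             # oil viscosity at crust temperature (Pa s)
p_capillary = 2 * gamma * math.cos(theta) / r   # Laplace capillary pressure
p_vacuum = 2000.0                     # assumed extra suction as pore vapour cools (Pa)
for t in (1, 10, 60, 300):
    L = penetration_depth(t, r, p_capillary + p_vacuum, mu)
    print(f"t = {t:4d} s  depth = {L * 1e3:.2f} mm")
```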

Relevance: 30.00%

Abstract:

Objective: There were two aims to this study: first, to examine whether emotional abuse and neglect are significant predictors of psychological and somatic symptoms, and of lifetime trauma exposure, in women presenting to a primary care practice; and second, to examine the strength of these relationships after controlling for the effects of other types of childhood abuse and trauma. Method: Two hundred and five women completed the Childhood Trauma Questionnaire (Bernstein et al., 1994), the Trauma History Questionnaire (Green, 1996), the Symptom Checklist-Revised (Derogatis, 1997), and the Revised Civilian Mississippi Scale for posttraumatic stress disorder (Norris & Perilla, 1996) when presenting to their primary care physician for a visit. Hierarchical multiple regression analyses were conducted to examine the unique contributions of the emotional abuse and neglect variables to the symptom measures while controlling for childhood sexual and physical abuse and lifetime trauma exposure. Results: A history of emotional abuse and neglect was associated with increased anxiety, depression, posttraumatic stress and physical symptoms, as well as with lifetime trauma exposure. Physical and sexual abuse and lifetime trauma were also significant predictors of physical and psychological symptoms. Hierarchical multiple regressions demonstrated that emotional abuse and neglect predicted symptomatology in these women even when controlling for the other types of abuse and for lifetime trauma exposure. Conclusions: Long-standing behavioral consequences may arise from childhood emotional abuse and neglect, specifically poorer emotional and physical functioning and vulnerability to further trauma exposure.
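
A minimal sketch of a hierarchical (blockwise) regression of the kind described, run on synthetic data with the same sample size; the variable structure and effect sizes are invented and only the incremental R² logic is illustrated.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 205                                    # sample size matching the study
phys, sex, trauma = rng.normal(size=(3, n))
emo = 0.4 * phys + rng.normal(size=n)      # emotional abuse, correlated with other abuse
neglect = 0.3 * emo + rng.normal(size=n)
symptoms = (0.3 * phys + 0.2 * sex + 0.3 * trauma
            + 0.4 * emo + 0.3 * neglect + rng.normal(size=n))

# Block 1: control variables only; Block 2: add emotional abuse and neglect
block1 = sm.add_constant(np.column_stack([phys, sex, trauma]))
block2 = sm.add_constant(np.column_stack([phys, sex, trauma, emo, neglect]))
m1 = sm.OLS(symptoms, block1).fit()
m2 = sm.OLS(symptoms, block2).fit()
print(f"R2, controls only       : {m1.rsquared:.3f}")
print(f"R2, + emotional/neglect : {m2.rsquared:.3f} "
      f"(delta R2 = {m2.rsquared - m1.rsquared:.3f})")
```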

Relevance: 30.00%

Abstract:

The Stokes drift induced by surface waves distorts turbulence in the wind-driven mixed layer of the ocean, leading to the development of streamwise vortices, or Langmuir circulations, on a wide range of scales. We investigate the structure of the resulting Langmuir turbulence, and contrast it with the structure of shear turbulence, using rapid distortion theory (RDT) and kinematic simulation of turbulence. Firstly, these linear models show clearly why elongated streamwise vortices are produced in Langmuir turbulence, when Stokes drift tilts and stretches vertical vorticity into horizontal vorticity, whereas elongated streaky structures in the streamwise velocity fluctuations (u) are produced in shear turbulence, because there is a cancellation in the streamwise vorticity equation and instead it is vertical vorticity that is amplified. Secondly, we develop scaling arguments, illustrated by analysing data from LES, that indicate that Langmuir turbulence is generated when the deformation of the turbulence by the mean shear is much weaker than the deformation by the Stokes drift. These scalings motivate a quantitative RDT model of Langmuir turbulence, accounting for deformation of the turbulence by the Stokes drift and blocking by the air–sea interface, that is shown to yield profiles of the velocity variances in good agreement with LES. The physical picture that emerges, at least in the LES, is as follows. Early in the life cycle of a Langmuir eddy, initial turbulent disturbances of vertical vorticity are amplified algebraically by the Stokes drift into elongated streamwise vortices, the Langmuir eddies. The turbulence is thus in a near two-component state, with the streamwise velocity fluctuations suppressed. Near the surface, over a depth of order the integral length scale of the turbulence, the vertical velocity (w) is brought to zero by blocking by the air–sea interface. Since the turbulence is nearly two-component, this vertical energy is transferred into the spanwise fluctuations, considerably enhancing them at the interface. After a time of order half the eddy decorrelation time, nonlinear processes, such as distortion by the strain field of the surrounding eddies, arrest the deformation and the Langmuir eddy decays. Presumably, Langmuir turbulence then consists of a statistically steady state of such Langmuir eddies. The analysis thus provides a dynamical connection between the flow structures in LES of Langmuir turbulence and the dominant balance between Stokes production and dissipation in the turbulent kinetic energy budget, found by previous authors.
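
For orientation, the sketch below evaluates the standard deep-water Stokes drift profile, its shear relative to a law-of-the-wall estimate of the mean shear, and the turbulent Langmuir number; the formulas are textbook definitions rather than the RDT model of the paper, and the wave and friction-velocity values are assumed.

```python
import numpy as np

g, kappa = 9.81, 0.4

def stokes_drift(z, amplitude, wavelength):
    """Deep-water monochromatic Stokes drift u_s(z) = sigma*k*a^2*exp(2kz), z <= 0."""
    k = 2 * np.pi / wavelength
    sigma = np.sqrt(g * k)
    return sigma * k * amplitude ** 2 * np.exp(2 * k * z)

# Assumed wave amplitude (m), wavelength (m) and water-side friction velocity (m/s)
a, lam, u_star = 1.0, 60.0, 0.005
k = 2 * np.pi / lam
us0 = stokes_drift(0.0, a, lam)
print(f"surface Stokes drift   : {us0:.3f} m/s")
print(f"turbulent Langmuir no. : {np.sqrt(u_star / us0):.2f}")
for z in (-1.0, -5.0, -10.0):
    stokes_shear = 2 * k * stokes_drift(z, a, lam)   # d(u_s)/dz
    mean_shear = u_star / (kappa * abs(z))           # law-of-the-wall estimate
    print(f"z = {z:5.1f} m  Stokes shear / mean shear = {stokes_shear / mean_shear:.1f}")
```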

Relevance: 30.00%

Abstract:

This paper introduces a new fast, effective and practical model structure construction algorithm for a mixture of experts network system utilising only process data. The algorithm is based on a novel forward constrained regression procedure. Given a full set of the experts as potential model bases, the structure construction algorithm, formed on the forward constrained regression procedure, selects the most significant model base one by one so as to minimise the overall system approximation error at each iteration, while the gate parameters in the mixture of experts network system are accordingly adjusted so as to satisfy the convex constraints required in the derivation of the forward constrained regression procedure. The procedure continues until a proper system model is constructed that utilises some or all of the experts. A pruning algorithm of the consequent mixture of experts network system is also derived to generate an overall parsimonious construction algorithm. Numerical examples are provided to demonstrate the effectiveness of the new algorithms. The mixture of experts network framework can be applied to a wide variety of applications ranging from multiple model controller synthesis to multi-sensor data fusion.
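
A simplified sketch of the greedy idea behind such a construction, not the authors' forward constrained regression algorithm: at each step the expert whose inclusion most reduces the error of a convex combination (non-negative weights summing to one) is added.

```python
import numpy as np
from scipy.optimize import minimize

def fit_convex_weights(P, y):
    """Least-squares fit of y by a convex combination of the columns of P
    (weights >= 0, summing to 1)."""
    m = P.shape[1]
    res = minimize(lambda w: np.sum((P @ w - y) ** 2), np.full(m, 1.0 / m),
                   bounds=[(0.0, 1.0)] * m,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x, res.fun

def forward_select_experts(expert_outputs, y, tol=1e-3):
    """Greedily add the expert whose inclusion gives the largest drop in the
    convex-combination approximation error; stop when the gain is small."""
    chosen, err = [], float(np.sum(y ** 2))
    remaining = list(range(expert_outputs.shape[1]))
    while remaining:
        trials = [(fit_convex_weights(expert_outputs[:, chosen + [j]], y)[1], j)
                  for j in remaining]
        best_err, best_j = min(trials)
        if err - best_err < tol * err:
            break
        chosen.append(best_j)
        remaining.remove(best_j)
        err = best_err
    weights, _ = fit_convex_weights(expert_outputs[:, chosen], y)
    return chosen, weights, err

# Toy problem: experts are fixed basis predictions of a target signal
x = np.linspace(0, 1, 200)
y = 0.6 * np.sin(2 * np.pi * x) + 0.4 * x
experts = np.column_stack([np.sin(2 * np.pi * x), x, np.cos(2 * np.pi * x),
                           x ** 2, np.ones_like(x)])
idx, w, err = forward_select_experts(experts, y)
print("selected experts:", idx, "weights:", np.round(w, 3), "error:", round(err, 4))
```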

Relevance: 30.00%

Abstract:

Terahertz (THz) frequency radiation, 0.1 THz to 20 THz, is being investigated for biomedical imaging applications following the introduction of pulsed THz sources that produce picosecond pulses and function at room temperature. Owing to the broadband nature of the radiation, spectral and temporal information is available from radiation that has interacted with a sample; this information is exploited in the development of biomedical imaging tools and sensors. In this work, models to aid interpretation of broadband THz spectra were developed and evaluated. THz radiation lies on the boundary between regions best considered using a deterministic electromagnetic approach and those better analysed using a stochastic approach incorporating quantum mechanical effects, so two computational models to simulate the propagation of THz radiation in an absorbing medium were compared. The first was a thin film analysis and the second a stochastic Monte Carlo model. The Cole–Cole model was used to predict the variation with frequency of the physical properties of the sample and scattering was neglected. The two models were compared with measurements from a highly absorbing water-based phantom. The Monte Carlo model gave a prediction closer to experiment over 0.1 to 3 THz. Knowledge of the frequency-dependent physical properties, including the scattering characteristics, of the absorbing media is necessary. The thin film model is computationally simple to implement but is restricted by the geometry of the sample it can describe. The Monte Carlo framework, despite being initially more complex, provides greater flexibility to investigate more complicated sample geometries.
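
A short sketch of the Cole–Cole permittivity model and the absorption coefficient derived from it, using illustrative water-like Debye parameters that are assumed rather than taken from the paper:

```python
import numpy as np

c = 2.998e8  # speed of light, m/s

def cole_cole_permittivity(freq_hz, eps_inf, eps_s, tau_s, alpha):
    """Cole-Cole model: eps(w) = eps_inf + (eps_s - eps_inf) / (1 + (i w tau)^(1 - alpha))."""
    w = 2 * np.pi * freq_hz
    return eps_inf + (eps_s - eps_inf) / (1 + (1j * w * tau_s) ** (1 - alpha))

def absorption_coefficient(freq_hz, eps):
    """Power absorption coefficient (1/m) from the imaginary part of the
    complex refractive index: alpha_abs = 2 * w * kappa / c."""
    kappa = np.abs(np.sqrt(eps).imag)
    return 2 * 2 * np.pi * freq_hz * kappa / c

# Illustrative water-like parameters (assumed, not from the paper);
# alpha = 0 reduces Cole-Cole to a single Debye relaxation.
eps_inf, eps_s, tau, alpha = 4.9, 78.4, 8.3e-12, 0.0
for f_thz in (0.1, 0.5, 1.0, 2.0):
    eps = cole_cole_permittivity(f_thz * 1e12, eps_inf, eps_s, tau, alpha)
    a = absorption_coefficient(f_thz * 1e12, eps)
    print(f"{f_thz:.1f} THz: eps = {eps.real:.2f}{eps.imag:+.2f}j, "
          f"absorption = {a / 100:.0f} cm^-1")
```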