910 results for Large system


Relevance: 30.00%

Abstract:

The influence matrix is used in ordinary least-squares applications for monitoring statistical multiple-regression analyses. Concepts related to the influence matrix provide diagnostics on the influence of individual data on the analysis - the analysis change that would occur by leaving one observation out, and the effective information content (degrees of freedom for signal) in any sub-set of the analysed data. In this paper, the corresponding concepts have been derived in the context of linear statistical data assimilation in numerical weather prediction. An approximate method to compute the diagonal elements of the influence matrix (the self-sensitivities) has been developed for a large-dimension variational data assimilation system (the four-dimensional variational system of the European Centre for Medium-Range Weather Forecasts). Results show that, in the boreal spring 2003 operational system, 15% of the global influence is due to the assimilated observations in any one analysis, and the complementary 85% is the influence of the prior (background) information, a short-range forecast containing information from earlier assimilated observations. About 25% of the observational information is currently provided by surface-based observing systems, and 75% by satellite systems. Low-influence data points usually occur in data-rich areas, while high-influence data points are in data-sparse areas or in dynamically active regions. Background-error correlations also play an important role: high correlation diminishes the observation influence and amplifies the importance of the surrounding real and pseudo observations (prior information in observation space). Incorrect specifications of background and observation-error covariance matrices can be identified, interpreted and better understood by the use of influence-matrix diagnostics for the variety of observation types and observed variables used in the data assimilation system. Copyright © 2004 Royal Meteorological Society
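For readers less familiar with influence diagnostics, the following minimal ordinary least-squares sketch in numpy (a toy illustration, not the paper's approximate computation for the ECMWF 4D-Var system) shows how the diagonal of the influence (hat) matrix gives the self-sensitivities, how its trace gives the degrees of freedom for signal, and how the leave-one-out analysis change follows from both.

```python
import numpy as np

# Toy OLS illustration of influence-matrix diagnostics (all data are synthetic).
# The influence ("hat") matrix H = X (X^T X)^{-1} X^T maps observations y to
# fitted values; its diagonal gives the self-sensitivities and its trace the
# effective information content (degrees of freedom for signal).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                 # design matrix: 100 observations, 5 predictors
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=100)

H = X @ np.linalg.solve(X.T @ X, X.T)         # influence matrix
self_sensitivities = np.diag(H)               # influence of each observation on its own analysis
dfs = np.trace(H)                             # degrees of freedom for signal (= 5 here)

# Change in the fitted value at point i if observation i is left out:
residuals = y - H @ y
loo_change = residuals * self_sensitivities / (1.0 - self_sensitivities)
print(dfs, self_sensitivities[:3], loo_change[:3])
```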

Relevance: 30.00%

Abstract:

Spontaneous mutants of Rhizobium leguminosarum bv. viciae 3841 were isolated that grow faster than the wild type on gamma-aminobutyric acid (GABA) as the sole carbon and nitrogen source. These strains (RU1736 and RU1816) have frameshift mutations (gtsR101 and gtsR102, respectively) in a GntR-type regulator (GtsR) that result in a high rate of constitutive GABA transport. Tn5 mutagenesis and quantitative reverse transcription-PCR showed that GtsR regulates expression of a large operon (pRL100242 to pRL100252) on the Sym plasmid that is required for GABA uptake. An ABC transport system, GtsABCD (for GABA transport system) (pRL100248-51), of the spermidine/putrescine family is part of this operon. GtsA is a periplasmic binding protein, GtsB and GtsC are integral membrane proteins, and GtsD is an ATP-binding subunit. Expression of gtsABCD from a lacZ promoter confirmed that it alone is responsible for high rates of GABA transport, enabling rapid growth of strain 3841 on GABA. Gts transports open-chain compounds with four or five carbon atoms with carboxyl and amino groups at, or close to, opposite termini. However, aromatic compounds with similar spacing between carboxyl and amino groups are excellent inhibitors of GABA uptake, so they may also be transported. In addition to the ABC transporter, the operon contains two putative mono-oxygenases, a putative hydrolase, a putative aldehyde dehydrogenase, and a succinate semialdehyde dehydrogenase. This suggests the operon may be involved in the transport and breakdown of a more complex precursor to GABA. Gts is not expressed in pea bacteroids, and gtsB mutants are unaltered in their symbiotic phenotype, suggesting that Bra is the only GABA transport system available for amino acid cycling.

Relevance: 30.00%

Abstract:

A new system for the generation of hydrodynamic modulated voltammetry (HMV) is presented. This system consists of an oscillating jet produced through the mechanical vibration of a large diaphragm. The structure of the cell is such that a relatively small vibration is transferred to a large fluid flow at the jet outlet. Positioning of an electrode (Pt, 0.5 mm or 25 μm diameter) over the exit of this jet enables the detection of the modulated flow of liquid. While this flow creates modest mass transfer rates (time-averaged ~0.015 cm s⁻¹), it can also be used to create a HMV system where a 'lock-in' approach is adopted to investigate the redox chemistry in question. This is demonstrated for the Fe(CN)₆³⁻/⁴⁻ redox system. Here 'lock-in' to the modulated hydrodynamic signal is achieved through the deployment of bespoke software. The apparatus and procedure are shown to produce a simple and efficient way to obtain the desired signal. In addition, the spatial variation of the HMV signal, phase correction and time-averaged current with respect to the jet orifice is presented. (C) 2008 Elsevier B.V. All rights reserved.
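As a rough sketch of the 'lock-in' idea (not the authors' bespoke software; the modulation frequency, amplitudes and noise level are invented), the example below recovers the amplitude and phase of a modulated component from a noisy simulated current by multiplying with in-phase and quadrature references and averaging over whole modulation cycles.

```python
import numpy as np

# Generic digital lock-in sketch with illustrative numbers only.
fs, f_mod, t_end = 1000.0, 5.0, 20.0         # sampling rate (Hz), modulation frequency (Hz), duration (s)
t = np.arange(0.0, t_end, 1.0 / fs)

# Simulated electrode current: DC term + hydrodynamically modulated term + noise
i_dc, i_hmv = 1.0e-6, 2.0e-7
current = i_dc + i_hmv * np.sin(2 * np.pi * f_mod * t) + 5e-8 * np.random.randn(t.size)

# Multiply by in-phase and quadrature references and average (acts as a low-pass filter)
ref_i = np.sin(2 * np.pi * f_mod * t)
ref_q = np.cos(2 * np.pi * f_mod * t)
x = np.mean(current * ref_i)                 # in-phase component
y = np.mean(current * ref_q)                 # quadrature component

amplitude = 2.0 * np.hypot(x, y)             # recovered modulated amplitude (close to i_hmv)
phase = np.degrees(np.arctan2(y, x))         # phase relative to the reference
print(amplitude, phase)
```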

Relevance: 30.00%

Abstract:

A whole life-cycle information management vision is proposed, and the organizational requirements for realizing this scenario are investigated. Preliminary interviews with construction professionals are reported. Discontinuities in information transfer throughout the life-cycle of built environments result from a lack of coordination and from multiple data collection and storage practices. A more coherent history of these activities can improve the work practices of various teams by augmenting decision-making processes and creating organizational learning opportunities. There is therefore a need to unify these fragmented bits of data into a meaningful, semantically rich and standardized information repository for the built environment. The proposed vision utilizes embedded technologies and distributed building information models. Two diverse construction project types (large one-off design, small repetitive design) are investigated for the applicability of the vision. A functional prototype software/hardware system demonstrating the practical use of this vision is developed and discussed. Plans for case studies to validate the proposed model at a large PFI hospital and at housing association projects are also discussed.

Relevance: 30.00%

Abstract:

The rheological properties of dough and gluten are important for end-use quality of flour, but there is a lack of knowledge of the relationships between fundamental and empirical tests and how they relate to flour composition and gluten quality. Dough and gluten from six breadmaking wheat qualities were subjected to a range of rheological tests. Fundamental (small-deformation) rheological characterizations (dynamic oscillatory shear and creep recovery) were performed on gluten to avoid the nonlinear influence of the starch component, whereas large-deformation tests were conducted on both dough and gluten. A number of variables from the various curves were considered and subjected to a principal component analysis (PCA) to get an overview of relationships between the various variables. The first component represented variability in protein quality, associated with elasticity and tenacity in large deformation (large positive loadings for resistance to extension and initial slope of dough and gluten extension curves recorded by the SMS/Kieffer dough and gluten extensibility rig, and the tenacity and strain hardening index of dough measured by the Dobraszczyk/Roberts dough inflation system), the elastic character of the hydrated gluten proteins (large positive loading for elastic modulus [G'], large negative loadings for tan δ and steady-state compliance [J_e^0]), the presence of high molecular weight glutenin subunits (HMW-GS) 5+10 vs. 2+12, and a size distribution of glutenin polymers shifted toward the high-end range. The second principal component was associated with flour protein content. Certain rheological data were influenced by protein content in addition to protein quality (area under dough extension curves and dough inflation curves [W]). The approach made it possible to bridge the gap between fundamental rheological properties, empirical measurements of physical properties, protein composition, and size distribution. The interpretation of this study gave indications of the molecular basis for differences in breadmaking performance.
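For orientation, a generic PCA of the kind used here can be sketched as below; the sample-by-variable matrix is a random placeholder standing in for the rheological variables (resistance to extension, strain hardening index, G', tan δ, and so on), not the study's measurements.

```python
import numpy as np

# Generic PCA sketch: rows = flour/dough samples, columns = rheological variables.
# The data are random placeholders, not the measurements from this study.
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 10))                  # 6 samples, 10 variables

Xc = (X - X.mean(axis=0)) / X.std(axis=0)     # centre and scale each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U * s                                # sample coordinates on the principal components
loadings = Vt.T                               # variable loadings (sign and size show associations)
explained = s**2 / np.sum(s**2)               # fraction of variance per component
print(explained[:2])                          # variance captured by PC1 and PC2
print(loadings[:, 0])                         # loadings on the first component
```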

Relevance: 30.00%

Abstract:

Three large-deformation rheological tests, the Kieffer dough extensibility system, the D/R dough inflation system and the 2 g mixograph test, were carried out on doughs made from a large number of winter wheat lines and cultivars grown in Poland. These lines and cultivars represented a broad spread in baking performance in order to assess their suitability as predictors of baking volume. The parameters most closely associated with baking volume were strain hardening index, bubble failure strain, and mixograph bandwidth at 10 min. Simple correlations with baking volume indicate that bubble failure strain and strain hardening index give the highest correlations, whilst the use of best subsets regression, which selects the best combination of parameters, gave increased correlations with R² = 0.865 for dough inflation parameters, R² = 0.842 for Kieffer parameters and R² = 0.760 for mixograph parameters. (c) 2007 Elsevier Ltd. All rights reserved.
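The sketch below illustrates best-subsets selection on placeholder data. One assumption is worth flagging: the sketch ranks subsets by adjusted R² so that smaller subsets can win (plain R² never decreases when parameters are added), whereas the paper reports R² for its selected combinations.

```python
import numpy as np
from itertools import combinations

# Best-subsets regression sketch on synthetic data (not the Polish wheat data).
rng = np.random.default_rng(2)
X = rng.normal(size=(40, 6))                                 # 40 doughs, 6 rheological parameters
y = X[:, 0] + 0.5 * X[:, 2] + 0.2 * rng.normal(size=40)      # stand-in for baking volume

def adj_r_squared(Xs, y):
    """Adjusted R^2 of an OLS fit of y on the columns of Xs (plus an intercept)."""
    n, k = len(y), Xs.shape[1]
    Xd = np.column_stack([np.ones(n), Xs])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)

# Exhaustively score every non-empty subset of predictors and keep the best one
subsets = (c for k in range(1, X.shape[1] + 1) for c in combinations(range(X.shape[1]), k))
best = max(subsets, key=lambda c: adj_r_squared(X[:, c], y))
print(best, adj_r_squared(X[:, best], y))
```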

Relevance: 30.00%

Abstract:

We present a novel approach to intrusion detection using an intelligent data classifier based on a self-organizing map (SOM). We have surveyed other unsupervised intrusion detection methods, alternative SOM-based techniques and KDD-winner IDS methods. This paper provides a robustly designed and implemented intelligent data classifier based on a single large (30x30) self-organizing map that is capable of detecting all types of attacks given in the DARPA 1999 archive, with the lowest false positive rate (0.04%) and the highest detection rate (99.73%) when tested using the full KDD data sets, and a comparable detection rate of 89.54% with a false positive rate of 0.18% when tested using the corrected data sets.
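A minimal SOM sketch is given below to illustrate the classifier structure; it trains a small 10x10 map on random placeholder vectors for speed, rather than the paper's 30x30 map trained on KDD connection-record features.

```python
import numpy as np

# Minimal self-organizing map sketch (toy stand-in for a 30x30 SOM on KDD data).
rng = np.random.default_rng(3)
grid, n_features, n_iter = 10, 6, 2000
data = rng.normal(size=(500, n_features))        # placeholder "connection records"
weights = rng.normal(size=(grid, grid, n_features))
gy, gx = np.mgrid[0:grid, 0:grid]                # node coordinates for the neighbourhood function

for it in range(n_iter):
    lr = 0.5 * np.exp(-it / n_iter)              # decaying learning rate
    sigma = (grid / 2) * np.exp(-it / n_iter)    # decaying neighbourhood radius
    x = data[rng.integers(len(data))]
    dist = np.linalg.norm(weights - x, axis=2)   # distance of each node's weights to the sample
    by, bx = np.unravel_index(np.argmin(dist), dist.shape)           # best-matching unit
    h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma**2))  # Gaussian neighbourhood
    weights += lr * h[..., None] * (x - weights) # pull nearby nodes toward the sample

# After training, each record maps to its best-matching unit; units dominated by
# attack records (established from labelled data) can then flag intrusions.
d0 = np.linalg.norm(weights - data[0], axis=2)
print(np.unravel_index(np.argmin(d0), d0.shape))
```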

Relevance: 30.00%

Abstract:

We discuss the feasibility of wireless terahertz communications links deployed in a metropolitan area and model the large-scale fading of such channels. The model takes into account reception through direct line of sight, ground and wall reflection, as well as diffraction around a corner. The movement of the receiver is modeled by an autonomous dynamic linear system in state space, whereas the geometric relations involved in the attenuation and multipath propagation of the electric field are described by a static nonlinear mapping. A subspace algorithm in conjunction with polynomial regression is used to identify a single-output Wiener model from time-domain measurements of the field intensity when the receiver motion is simulated using a constant angular speed and an exponentially decaying radius. The identification procedure is validated by using the model to perform q-step ahead predictions. The sensitivity of the algorithm to small-scale fading, detector noise, and atmospheric changes is discussed. The performance of the algorithm is tested in the diffraction zone assuming a range of emitter frequencies (2, 38, 60, 100, 140, and 400 GHz). Extensions of the simulation results to situations where a more complicated trajectory describes the motion of the receiver are also implemented, providing information on the performance of the algorithm under a worst case scenario. Finally, a sensitivity analysis to model parameters for the identified Wiener system is proposed.
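The sketch below shows the Wiener structure assumed in this work, a linear state-space block followed by a static polynomial nonlinearity, together with a simple polynomial regression of the output map; the subspace identification step itself is not reproduced, and all matrices and coefficients are illustrative.

```python
import numpy as np

# Toy Wiener-model sketch: linear state-space dynamics (receiver motion / channel
# geometry) followed by a static polynomial output nonlinearity (field intensity).
rng = np.random.default_rng(4)

A = np.array([[0.9, 0.1], [0.0, 0.8]])   # state transition
B = np.array([1.0, 0.5])                 # input gain
C = np.array([1.0, 0.0])                 # linear output map

u = rng.normal(size=500)                 # excitation signal
x = np.zeros(2)
z = np.empty_like(u)                     # intermediate (unobserved) linear output
for k, uk in enumerate(u):
    z[k] = C @ x
    x = A @ x + B * uk

# Static output nonlinearity plus measurement noise
y = 1.0 - 0.5 * z + 0.2 * z**2 + 0.01 * rng.normal(size=z.size)

# Polynomial regression of the measured intensity on the linear-block output
coeffs = np.polyfit(z, y, deg=2)         # highest-degree coefficient first
y_hat = np.polyval(coeffs, z)
print(coeffs, np.mean((y - y_hat) ** 2)) # recovered polynomial and fit error
```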

Relevance: 30.00%

Abstract:

The large-scale fading of wireless mobile communications links is modelled assuming that the mobile receiver motion is described by a dynamic linear system in state space. The geometric relations involved in the attenuation and multi-path propagation of the electric field are described by a static non-linear mapping. A Wiener-system subspace identification algorithm, in conjunction with polynomial regression, is used to identify a model from time-domain estimates of the field intensity, assuming a multitude of emitters and an antenna array at the receiver end.

Relevance: 30.00%

Abstract:

The phase diagram of a series of poly(1,2-octylene oxide)-poly(ethylene oxide) (POO-PEO) diblock copolymers is determined by small-angle X-ray scattering. The Flory-Huggins interaction parameter was measured by small-angle neutron scattering. The phase diagram is highly asymmetric due to large conformational asymmetry that results from the hexyl side chains in the POO block. Non-lamellar phases (hexagonal and gyroid) are observed near f_PEO = 0.5, and the lamellar phase is observed for f_PEO ≥ 0.5.

Relevance: 30.00%

Abstract:

The Group on Earth Observations System of Systems, GEOSS, is a co-ordinated initiative by many nations to address the needs for earth-system information expressed by the 2002 World Summit on Sustainable Development. We discuss the role of earth-system modelling and data assimilation in transforming earth-system observations into the predictive and status-assessment products required by GEOSS, across many areas of socio-economic interest. First we review recent gains in the predictive skill of operational global earth-system models, on time-scales of days to several seasons. We then discuss recent work to develop from the global predictions a diverse set of end-user applications which can meet GEOSS requirements for information of socio-economic benefit; examples include forecasts of coastal storm surges, floods in large river basins, seasonal crop yield forecasts and seasonal lead-time alerts for malaria epidemics. We note ongoing efforts to extend operational earth-system modelling and assimilation capabilities to atmospheric composition, in support of improved services for air-quality forecasts and for treaty assessment. We next sketch likely GEOSS observational requirements in the coming decades. In concluding, we reflect on the cost of earth observations relative to the modest cost of transforming the observations into information of socio-economic value.

Relevance: 30.00%

Abstract:

The next couple of years will see the need for replacement of a large amount of life-expired switchgear on the UK 11 kV distribution system. Latest technology and alternative equipment have made the choice of replacement a complex task. The authors present an expert system as an aid to the decision process for the design of the 11 kV power distribution network.

Relevance: 30.00%

Abstract:

A key strategy to improve the skill of quantitative predictions of precipitation, as well as of hazardous weather such as severe thunderstorms and flash floods, is to exploit observations of convective activity (e.g. from radar). In this paper, a convection-permitting ensemble prediction system (EPS) aimed at addressing the problems of forecasting localized weather events with relatively short predictability time scales, based on a 1.5 km grid-length version of the Met Office Unified Model, is presented. Particular attention is given to the impact of using predicted observations of radar-derived precipitation intensity in the ensemble transform Kalman filter (ETKF) used within the EPS. Our initial results, based on a 24-member ensemble of forecasts for two summer case studies, show that the convective-scale EPS produces fairly reliable forecasts of temperature, horizontal winds and relative humidity at 1 h lead time, as evident from the inspection of rank histograms. On the other hand, the rank histograms also seem to show that the EPS generates too much spread for forecasts of (i) surface pressure and (ii) surface precipitation intensity. These may indicate that, for (i), the surface pressure observation-error standard deviation used to generate the rank histograms is too large, and that (ii) may be the result of non-Gaussian precipitation observation errors. However, further investigations are needed to better understand these findings. Finally, the inclusion of predicted observations of precipitation from radar in the 24-member EPS considered in this paper does not seem to improve the 1 h lead-time forecast skill.
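For reference, a rank histogram of the kind inspected here can be computed as in the sketch below; the 24-member ensemble and the verifying observations are synthetic placeholders.

```python
import numpy as np

# Rank-histogram sketch: for each case, rank the verifying observation within the
# sorted ensemble. A flat histogram suggests a reliable ensemble, U-shaped too
# little spread, dome-shaped too much spread. All values below are synthetic.
rng = np.random.default_rng(5)
n_cases, n_members = 1000, 24
ensemble = rng.normal(size=(n_cases, n_members))   # forecasts
obs = rng.normal(size=n_cases)                     # verifying observations

ranks = np.sum(ensemble < obs[:, None], axis=1)    # rank of each observation (0..24)
hist = np.bincount(ranks, minlength=n_members + 1)
print(hist / n_cases)                              # roughly 1/25 per bin if reliable
```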

Relevance: 30.00%

Abstract:

Organic agriculture is becoming widespread due to increased consumer demand and regulatory and political support. Organic agriculture can increase arthropod diversity, but the response of pests and their natural enemies is variable. Fertiliser is an important component of agricultural systems, and its effects on pests and natural enemies will influence agroecosystems. In this study, meta-analysis and vote-counting techniques were used to compare farming system (organic and conventional) and fertiliser effects on arthropod pests and their natural enemies. The meta-analyses indicated that pests generally benefitted from organic techniques; this was particularly evident when experiments were carried out on a smaller scale. Pest responses to organic and conventional fertiliser types were divergent: plant composts benefitted pest arthropods while the opposite was true for manures, which has implications for pest management. Most natural enemy groups responded positively to organic farming, although this was not true for Coleopterans. Experimental scale had a prominent impact on natural enemy responses, with farm-scale studies showing particularly positive effects of organic agriculture on natural enemies. This suggests that it is large-scale features of organic agriculture, such as landscape heterogeneity, that are beneficial to natural enemies. Natural enemy responses to organic fertilisers were positive, indicating that field-scale management practices including fertiliser can also be important in pest management.
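As a pointer to the method, the sketch below shows a basic inverse-variance (fixed-effect) meta-analysis of hypothetical effect sizes; the effect-size metric, weighting scheme and random-effects details of the actual study are not reproduced.

```python
import numpy as np

# Fixed-effect meta-analysis sketch with invented effect sizes and variances
# (e.g. log response ratios for organic vs. conventional treatments).
effect = np.array([0.30, -0.10, 0.45, 0.20, 0.05])
var = np.array([0.04, 0.02, 0.09, 0.03, 0.05])

w = 1.0 / var                                  # inverse-variance weights
pooled = np.sum(w * effect) / np.sum(w)        # pooled effect size
se = np.sqrt(1.0 / np.sum(w))                  # standard error of the pooled effect
ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% confidence interval
print(pooled, ci)
```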

Relevance: 30.00%

Abstract:

In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of atmosphere. Using 2D fields at the top of atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes to only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic—but conceptually correct—description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m−2 K−1, with discrepancies on the order of 5%, and CMs’ baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m−2, which implies that the derived inequality is rather stringent. When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
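For orientation, the material entropy production that these diagnostics estimate can be written in its standard steady-state form (a general identity, not the paper's specific two-dimensional approximate splitting):

$$ \dot{S}_{\mathrm{mat}} \;=\; -\int_V \frac{\rho\,\dot{q}_{\mathrm{rad}}}{T}\,\mathrm{d}V \;\ge\; 0, $$

where $\dot{q}_{\mathrm{rad}}$ is the radiative heating rate per unit mass and $T$ the local temperature: at steady state the entropy generated by irreversible material processes (convection, turbulent diffusion, dissipation of kinetic energy, the hydrological cycle) balances the entropy given up by matter to the radiation field, which is what the paper approximates from two-dimensional radiative fields at the surface and at the top of the atmosphere.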