892 results for EXPLOITING MULTICOMMUTATION


Relevance: 10.00%

Publisher:

Abstract:

A theoretical framework for the joint conservation of energy and momentum in the parameterization of subgrid-scale processes in climate models is presented. The framework couples a hydrostatic resolved (planetary-scale) flow to a nonhydrostatic subgrid-scale (mesoscale) flow. The temporal and horizontal spatial scale separation between the planetary scale and the mesoscale is imposed using multiple-scale asymptotics. Energy and momentum are exchanged through subgrid-scale flux convergences of heat, pressure, and momentum. The generation and dissipation of subgrid-scale energy and momentum are understood using wave-activity conservation laws that are derived by exploiting the (mesoscale) temporal and horizontal spatial homogeneities in the planetary-scale flow. The relations between these conservation laws and the planetary-scale dynamics represent generalized nonacceleration theorems. A derived relationship between the wave-activity fluxes, which represents a generalization of the second Eliassen-Palm theorem, is key to ensuring consistency between energy and momentum conservation. The framework includes a consistent formulation of heating and entropy production due to kinetic energy dissipation.
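The wave-activity conservation laws invoked above have, in schematic form, a standard structure (this is the generic textbook shape, not the paper's exact coupled equations):

```latex
% Generic wave-activity conservation law: A is a wave activity
% (e.g. pseudomomentum or pseudoenergy), \mathbf{F} its flux, and
% \mathcal{D} its generation/dissipation by nonconservative processes.
\frac{\partial A}{\partial t} + \nabla\cdot\mathbf{F} = \mathcal{D}
```

A nonacceleration theorem then follows wherever $\mathcal{D}=0$ and the waves are statistically steady, since $\nabla\cdot\mathbf{F}=0$ leaves the mean flow unforced; the generalized theorems described above extend this logic to the coupled planetary-scale/mesoscale system.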

An ensemble forecast is a collection of runs of a numerical dynamical model, initialized with perturbed initial conditions. In modern weather prediction for example, ensembles are used to retrieve probabilistic information about future weather conditions. In this contribution, we are concerned with ensemble forecasts of a scalar quantity (say, the temperature at a specific location). We consider the event that the verification is smaller than the smallest, or larger than the largest ensemble member. We call these events outliers. If a K-member ensemble accurately reflected the variability of the verification, outliers should occur with a base rate of 2/(K + 1). In operational forecast ensembles though, this frequency is often found to be higher. We study the predictability of outliers and find that, exploiting information available from the ensemble, forecast probabilities for outlier events can be calculated which are more skilful than the unconditional base rate. We prove this analytically for statistically consistent forecast ensembles. Further, the analytical results are compared to the predictability of outliers in an operational forecast ensemble by means of model output statistics. We find the analytical and empirical results to agree both qualitatively and quantitatively.
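For a statistically consistent ensemble, the verification is exchangeable with the K members and hence equally likely to fall in any of the K + 1 rank positions; the two outermost positions give the 2/(K + 1) base rate. A minimal Monte Carlo sketch of this claim (the Gaussian distribution and all names are illustrative):

```python
import random

def outlier_rate(K, trials=100_000, seed=0):
    """Fraction of trials in which the verification falls outside the
    range of a K-member ensemble, with members and verification drawn
    from the same distribution (a statistically consistent ensemble)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        ensemble = [rng.gauss(0, 1) for _ in range(K)]
        verification = rng.gauss(0, 1)
        if verification < min(ensemble) or verification > max(ensemble):
            hits += 1
    return hits / trials

K = 9
print(outlier_rate(K), 2 / (K + 1))  # empirical rate is close to the 0.2 base rate
```

Operational ensembles exceed this rate precisely because they are not statistically consistent, which is what makes outliers partially predictable.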

As part of its contribution to the 1951 Festival of Britain, the Arts Council ran what can be seen in retrospect to be an important playwriting competition. Disregarding the London stage entirely, it invited regional theatres throughout the UK to put forward nominations for new plays within their repertoire for 1950-1951. Each of the five winning plays would receive what was then the substantial sum of £100. Originality and innovation featured highly amongst the selection criteria, with 40 per cent of the judges' marks being awarded for "interest of subject matter and inventiveness of treatment". This article will assess some of the surprising outcomes of the competition and argue that it served as an important nexus point in British theatrical historiography between two key moments in post-war Britain: the first being the inauguration of the Festival of Britain in 1951, the other being the debut of John Osborne's Look Back in Anger in May 1956. The article will also argue that the Arts Council's play competition was significant for two other reasons. By circumventing the London stage, it provides a useful tool by which to reassess the state of new writing in regional theatre at the beginning of the 1950s and to question how far received views of parochialism and conservatism held true. The paper will also put forward a case for the competition significantly anticipating the work of George Devine at the English Stage Company, which during its early years established a reputation for itself by heavily exploiting the repertoire of new plays originally commissioned by regional theatres. This article forms part of a five-year funded Arts and Humanities Research Council (AHRC) project, 'Giving Voice to the Nation: The Arts Council of Great Britain and the Development of Theatre and Performance in Britain 1945-1994'.
Details of the Arts Council's archive, which is housed at the Victoria & Albert Museum in London, can be found at http://www.vam.ac.uk/vastatic/wid/ead/acgb/acgbf.html
Keywords: Arts Council of Great Britain, regional theatre, playwriting, Festival of Britain, English Stage Company (Royal Court), Yvonne Mitchell

In this paper, we present a polynomial-based noise variance estimator for multiple-input multiple-output single-carrier block transmission (MIMO-SCBT) systems. It is shown that the optimal pilots for noise variance estimation satisfy the same condition as that for channel estimation. Theoretical analysis indicates that the proposed estimator is statistically more efficient than the conventional sum of squared residuals (SSR) based estimator. Furthermore, we obtain an efficient implementation of the estimator by exploiting its special structure. Numerical results confirm our theoretical analysis.
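For reference, the conventional SSR-based estimator against which the proposed scheme is compared divides the sum of squared residuals of a least-squares fit by the residual degrees of freedom. A minimal real-valued sketch (the model, dimensions and numbers below are illustrative stand-ins, not the MIMO-SCBT signal model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear model y = H x + n; names and dimensions are illustrative only.
N, P = 64, 8                      # observations, unknowns
H = rng.standard_normal((N, P))
x = rng.standard_normal(P)
sigma2 = 0.25                     # true noise variance
y = H @ x + np.sqrt(sigma2) * rng.standard_normal(N)

# Conventional SSR-based estimate: project y onto the orthogonal
# complement of the column space of H, then normalise the residual
# energy by the residual degrees of freedom N - P.
x_hat = np.linalg.lstsq(H, y, rcond=None)[0]
ssr = np.sum((y - H @ x_hat) ** 2)
sigma2_hat = ssr / (N - P)
print(sigma2_hat)                 # close to the true value 0.25
```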

Sea surface temperature (SST) can be estimated from day and night observations of the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) by optimal estimation (OE). We show that exploiting the 8.7 μm channel, in addition to the "traditional" wavelengths of 10.8 and 12.0 μm, improves OE SST retrieval statistics in validation. However, the main benefit is an improvement in the sensitivity of the SST estimate to variability in true SST. In a fair, single-pixel comparison, the 3-channel OE gives better results than the SST estimation technique presently operational within the Ocean and Sea Ice Satellite Application Facility. The operational technique applies SST retrieval coefficients, followed by a bias-correction step informed by radiative transfer simulation. However, the operational technique also includes an "atmospheric correction smoothing", which improves its noise performance and hitherto had no analogue within the OE framework. Here, we propose an analogue to atmospheric correction smoothing, based on the expectation that atmospheric total column water vapour has a longer spatial correlation length scale than SST features. The approach extends the observations input to the OE to include the averaged brightness temperatures (BTs) of nearby clear-sky pixels, in addition to the BTs of the pixel for which SST is being retrieved. The retrieved quantities are then the single-pixel SST and the clear-sky total column water vapour averaged over the vicinity of the pixel. This reduces the noise in the retrieved SST significantly. The robust standard deviation of the new OE SST compared to matched drifting buoys becomes 0.39 K for all data. The smoothed OE gives an SST sensitivity of 98% on average. This means that diurnal temperature variability and ocean frontal gradients are more faithfully estimated, and that the influence of the prior SST used is minimal (2%). This benefit is not available using traditional atmospheric correction smoothing.
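The OE update underlying such retrievals can be sketched in a few lines. Every number below (state vector, Jacobian, covariances) is invented for illustration and is not the SEVIRI configuration; the code only shows the standard linear OE step and the averaging-kernel sensitivity diagnostic:

```python
import numpy as np

# Minimal linear optimal-estimation (OE) retrieval sketch. The state is
# [SST, TCWV]; the 3 "channels" and the Jacobian K are invented numbers.
x_a = np.array([290.0, 30.0])          # prior state (SST in K, TCWV in kg m^-2)
S_a = np.diag([2.0**2, 10.0**2])       # prior covariance
K = np.array([[0.9, -0.10],            # d(BT)/d(state) for 3 channels
              [0.8, -0.15],
              [0.6, -0.25]])
S_e = np.diag([0.1**2] * 3)            # observation-error covariance

x_true = np.array([292.0, 25.0])
y = K @ (x_true - x_a)                 # noise-free simulated BT departures from prior

# OE update: x_hat = x_a + (K^T S_e^-1 K + S_a^-1)^-1 K^T S_e^-1 y
S_hat = np.linalg.inv(K.T @ np.linalg.inv(S_e) @ K + np.linalg.inv(S_a))
x_hat = x_a + S_hat @ K.T @ np.linalg.inv(S_e) @ y

# Sensitivity of retrieved SST to true SST: diagonal of the averaging kernel.
A = S_hat @ K.T @ np.linalg.inv(S_e) @ K
print(x_hat, A[0, 0])                  # retrieval near truth; SST sensitivity near 1
```

The paper's smoothing analogue corresponds to enlarging `y` with spatially averaged BTs of nearby clear-sky pixels and retrieving an area-averaged TCWV, which shrinks the noise term without degrading `A[0, 0]`.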

This paper describes the techniques used to obtain sea surface temperature (SST) retrievals from the Geostationary Operational Environmental Satellite 12 (GOES-12) at the National Oceanic and Atmospheric Administration's Office of Satellite Data Processing and Distribution. Previous SST retrieval techniques relying on channels at 11 and 12 μm are not applicable because GOES-12 lacks the latter channel. Cloud detection is performed using a Bayesian method exploiting fast forward modeling of prior clear-sky radiances based on numerical weather predictions. The basic retrieval algorithm used at nighttime is based on a linear combination of brightness temperatures at 3.9 and 11 μm. In comparison with traditional split-window SSTs (using 11- and 12-μm channels), simulations show that this combination has maximum scatter when observing drier, colder scenes, with a comparable overall performance. For daytime retrieval, the same algorithm is applied after estimating and removing the contribution to brightness temperature in the 3.9-μm channel from solar irradiance. The correction is based on radiative transfer simulations and comprises a parameterization for atmospheric scattering and a calculation of ocean surface reflected radiance. Potential use of the 13-μm channel for SST is shown in a simulation study: in conjunction with the 3.9-μm channel, it can reduce the retrieval error by 30%. Some validation results are shown here, while a companion paper by Maturi et al. presents a detailed analysis of the validation results for the operational algorithms described in the present article.
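The form of the nighttime algorithm, a linear combination of 3.9- and 11-μm brightness temperatures, can be sketched by fitting coefficients to synthetic matchups. All numbers below are invented for illustration and are not the operational GOES-12 coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch of a coefficient-based dual-band SST retrieval of the form
#   SST = a0 + a1*BT3.9 + a2*BT11
# fitted to synthetic "matchups"; the attenuation model is a crude toy.
n = 500
sst = 285 + 10 * rng.random(n)            # true SSTs (K)
atm = 2 * rng.random(n)                   # crude water-vapour attenuation (K)
bt39 = sst - 0.4 * atm + 0.05 * rng.standard_normal(n)  # 3.9 um: less attenuated
bt11 = sst - 1.0 * atm + 0.05 * rng.standard_normal(n)  # 11 um: more attenuated

X = np.column_stack([np.ones(n), bt39, bt11])
a = np.linalg.lstsq(X, sst, rcond=None)[0]  # retrieval coefficients a0, a1, a2
resid = sst - X @ a
print(a, resid.std())   # residual scatter well below the 'atm' spread
```

The differing channel attenuations are what let the linear combination cancel the atmospheric term, which is why the fitted band coefficients sum to roughly one.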

Ecological theory predicts that communities using the same resources should have similar structure, but evolutionary constraints on colonization and niche shifts may hamper such convergence. Multitrophic communities of wasps exploiting fig fruits, which first evolved about 75 million years ago, do not show long-term "inheritance" of taxonomic (lineage) composition or species diversity. However, communities on three continents have converged ecologically in the presence and relative abundance of five insect guilds that we define. Some taxa fill the same niches in each community (phylogenetic niche conservatism). However, we show that overall convergence in ecological community structure depends also on a combination of niche shifts by resident lineages and local colonizations of figs by other insect lineages. Our study explores new ground, and develops new heuristic tools, in combining ecology and phylogeny to address patterns in the complex multitrophic communities of insects on plants, which comprise a large part of terrestrial biodiversity.

Mathematics in Defence 2011 Abstract. We review transreal arithmetic and present transcomplex arithmetic. These arithmetics have no exceptions. This leads to incremental improvements in computer hardware and software. For example, the range of real numbers, encoded by floating-point bits, is doubled when all of the Not-a-Number (NaN) states in IEEE 754 arithmetic are replaced with real numbers. The task of programming such systems is simplified, and made safer, by discarding the unordered relational operator, leaving only the operators less-than, equal-to, and greater-than. The advantages of using a transarithmetic in a computation, or transcomputation as we prefer to call it, may be had by making small changes to compilers and processor designs. However, radical change is possible by exploiting the reliability of transcomputations to make pipelined dataflow machines with a large number of cores. Our initial designs are for a machine with order one million cores. Such a machine can complete the execution of multiple in-line programs each clock tick.
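To put a number on how many encodings IEEE 754 spends on Not-a-Number, the following sketch counts the NaN bit patterns of the binary32 format (the "range is doubled" claim above is the authors'; this only quantifies the encodings that replacing NaN would free):

```python
import struct

# Distinct IEEE 754 binary32 bit patterns that decode to NaN:
# 2 sign choices x exponent all-ones x nonzero significand (2^23 - 1).
nan_patterns = 2 * (2**23 - 1)
total_patterns = 2**32
print(nan_patterns, nan_patterns / total_patterns)  # 16777214, ~0.39% of encodings

# Spot-check one such pattern with the struct module.
bits = 0x7FC00001                        # exponent all ones, significand != 0
value = struct.unpack('>f', struct.pack('>I', bits))[0]
print(value != value)                    # True: NaN is the only value unequal to itself
```

The last line also illustrates why the unordered relational operator exists in IEEE 754: every comparison involving NaN, including equality with itself, is unordered.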

Epidemiological and clinical trials reveal compelling evidence for the ability of dietary flavonoids to lower cardiovascular disease risk. The mechanisms of action of these polyphenolic compounds are diverse, and of particular interest is their ability to function as protein and lipid kinase inhibitors. We have previously described structure-activity studies that reinforce the possibility of using flavonoid structures as templates for drug design. In the present study, we aim to begin constructing rational screening strategies for exploiting these compounds as templates for the design of clinically relevant antiplatelet agents. We used the platelet as a model system to dissect the structural influence of flavonoids, stilbenes, anthocyanidins, and phenolic acids on inhibition of cell signaling and function. Functional groups identified as relevant for potent inhibition of platelet function included at least two benzene rings, a hydroxylated B ring, a planar C ring, a C ring ketone group, and a C-2 positioned B ring. Hydroxylation of the B ring with either a catechol group or a single C-4' hydroxyl may be required for efficient inhibition of collagen-stimulated tyrosine-phosphorylated proteins of 125 to 130 kDa, but may not be necessary for that of phosphotyrosine proteins at approximately 29 kDa. The removal of the C ring C-3 hydroxyl together with a hydroxylated B ring (apigenin) may confer selectivity for 37 to 38 kDa phosphotyrosine proteins. We conclude that this study may form the basis for construction of maps of flavonoid inhibitory activity on kinase targets that may allow a multitargeted therapeutic approach with analogue counterparts and parent compounds.

Owing to continuous advances in the computational power of handheld devices like smartphones and tablet computers, it has become possible to perform Big Data operations, including modern data mining processes, onboard these small devices. A decade of research has proved the feasibility of what has been termed Mobile Data Mining, with a focus on a single mobile device running data mining processes. It was not until 2010, however, that the authors of this book initiated the Pocket Data Mining (PDM) project, exploiting the seamless communication among handheld devices to perform data analysis tasks that were infeasible until recently. PDM is the process of collaboratively extracting knowledge from distributed data streams in a mobile computing environment. This book provides the reader with an in-depth treatment of this emerging area of research. Details of the techniques used and thorough experimental studies are given. More importantly, and exclusive to this book, the authors provide a detailed practical guide on the deployment of PDM in the mobile environment. An important extension to the basic implementation of PDM, dealing with concept drift, is also reported. In the era of Big Data, potential applications of paramount importance offered by PDM in a variety of domains, including security, business and telemedicine, are discussed.

Food security depends on enhancing production and reducing loss to pests and pathogens. A promising alternative to agrochemicals is the use of plant growth-promoting rhizobacteria (PGPR), which are commonly associated with many, if not all, plant species. However, exploiting the benefits of PGPR requires knowledge of bacterial function and an in-depth understanding of plant-bacteria associations. Motility is important for colonization efficiency and microbial fitness in the plant environment, but the mechanisms employed by bacteria on and around plants are not well understood. We describe and investigate an atypical mode of motility in Pseudomonas fluorescens SBW25 that was revealed only after flagellum production was eliminated by deletion of the master regulator fleQ. Our results suggest that this 'spidery spreading' is a type of surface motility. Transposon mutagenesis of SBW25ΔfleQ (SBW25Q) produced mutants defective in viscosin production, in which surface spreading was also abolished. Genetic analysis indicated growth dependency of the spidery spreading phenotype and implicated viscosin production and several potential regulatory and secretory systems in it. Moreover, viscosin both increases the efficiency of surface spreading over the plant root and protects germinating seedlings in soil infected with the plant pathogen Pythium. Thus, viscosin could be a useful target for biotechnological development of plant growth promotion agents.

For much of lowland Britain during the Holocene, one important factor in determining environmental change was sea-level fluctuation. A net rise of circa 20 m, within an oscillating short-term picture of transgression and regression, posed significant short- to medium-term challenges for the people exploiting these environments. During transgression phases, estuarine creek systems extended landwards, and during the final transgression phase widespread sedimentation took place, allowing for the development of saltmarshes on tidal flats. In later prehistory the exploitation of lowlands and estuarine wetlands was predominantly for fishing, waterfowling and pastoral use, and this paper explores the human ecodynamics of the intertidal zone in the Humber estuary during the Bronze Age. Results of the Humber Wetlands Project's recent estuarine survey will be used to argue that, following a marine transgression circa 1500 cal BC, the foreshore was fully exploited in terms of food procurement. Furthermore, the construction of hurdle trackways allowed access across expanding tidal creek systems to be maintained. This not only shows continued use of the most productive environments and provides evidence for selective use of woodland; it also suggests that the continued exploitation of the intertidal zone may have played a role in the evolution of social and political structures in this area during the Bronze Age.

Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems on covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising way to reduce the model's dimensionality to a manageable level, thus leading to efficient estimation. Most existing tensor-based methods independently estimate each individual regression problem based on tensor decomposition, which allows the simultaneous projection of an input tensor onto more than one direction along each mode. In practice, however, multi-dimensional data are collected under the same or very similar conditions, so that the data share some common latent components but can also have their own independent parameters for each regression task. It is therefore beneficial to analyse the regression parameters of all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker decomposition, which simultaneously identifies both the common components of the parameters across all the regression tasks and the independent factors contributing to each particular regression task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, with lower memory cost than tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
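The Tucker structure referred to above can be sketched in a few lines of NumPy: the coefficient array is composed of a small core interacting with per-mode factor matrices, so the parameter count drops below that of the full array. Shapes, ranks and the single-task setup below are illustrative only, not the linked multi-task model proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tucker-structured coefficient tensor for tensor regression:
# B = G x1 U1 x2 U2, prediction y = <X, B> (matrix case for brevity).
I, J, R1, R2 = 6, 5, 2, 2
G = rng.standard_normal((R1, R2))         # core: interactions of latent factors
U1 = rng.standard_normal((I, R1))         # mode-1 factor matrix
U2 = rng.standard_normal((J, R2))         # mode-2 factor matrix
B = np.einsum('rs,ir,js->ij', G, U1, U2)  # low-rank coefficient array

X = rng.standard_normal((I, J))           # one matrix-valued covariate
y = np.sum(X * B)                         # inner product <X, B>

# Tucker form needs R1*R2 + I*R1 + J*R2 parameters instead of I*J.
print(G.size + U1.size + U2.size, B.size)
```

In the linked multi-task setting described above, the factor matrices would be shared across regressions while each task keeps its own (sparsity-constrained) independent components.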

The notion that learning can be enhanced when a teaching approach matches a learner's learning style has been widely accepted in classroom settings, since learning style is a predictor of a student's attitude and preferences. As such, the traditional 'one-size-fits-all' approach to teaching delivery in Educational Hypermedia Systems (EHSs) has to be replaced with an approach that responds to users' needs by exploiting their individual differences. However, establishing and implementing reliable approaches for matching teaching delivery and modalities to learning styles still represents an innovation challenge that has to be tackled. In this paper, seventy-six studies are objectively analysed with several goals: to reveal the value of integrating learning styles in EHSs by discussing the different perspectives in this context; to identify the most effective learning style models incorporated within AEHSs; and to investigate the effectiveness of different approaches for modelling students' individual learning traits. The paper then highlights a number of theoretical and technical issues of LS-BAEHSs, to serve as comprehensive guidance for researchers with an interest in this area.

In this paper, we develop an energy-efficient resource-allocation scheme with proportional fairness for downlink multiuser orthogonal frequency-division multiplexing (OFDM) systems with distributed antennas. Our aim is to maximize energy efficiency (EE) under the constraints of the overall transmit power of each remote access unit (RAU), proportional fairness of data rates, and bit error rates (BERs). Because of the nonconvex nature of the optimization problem, obtaining the optimal solution is extremely computationally complex. Therefore, we develop a low-complexity suboptimal algorithm that separates subcarrier allocation and power allocation. For the low-complexity algorithm, we first allocate subcarriers by assuming equal power distribution. Then, by exploiting the properties of fractional programming, we transform the nonconvex optimization problem in fractional form into an equivalent optimization problem in subtractive form, which admits a tractable solution. Next, an optimal energy-efficient power-allocation algorithm is developed to maximize EE while maintaining proportional fairness. Through computer simulation, we demonstrate the effectiveness of the proposed low-complexity algorithm and illustrate the fundamental trade-off between energy- and spectral-efficient transmission designs.
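The fractional-to-subtractive transformation mentioned above is commonly realised with Dinkelbach's method: a ratio f(p)/g(p) is maximised by repeatedly maximising f(p) - q*g(p) and updating q. A scalar sketch with invented rate and power models (not the paper's multiuser objective):

```python
import math

# Dinkelbach's method for a scalar fractional programme:
# maximise f(p)/g(p) by iterating argmax_p [f(p) - q*g(p)], q = f/g.
# The rate/power functions below are illustrative toys.

def rate(p):       # f(p): achievable rate for transmit power p
    return math.log2(1 + 10.0 * p)

def power(p):      # g(p): total consumed power (circuit + transmit)
    return 0.5 + p

def dinkelbach(p_grid, tol=1e-9):
    q = 0.0
    while True:
        # Inner step: maximise the subtractive objective f(p) - q*g(p).
        p_best = max(p_grid, key=lambda p: rate(p) - q * power(p))
        val = rate(p_best) - q * power(p_best)
        if val < tol:              # converged: q equals the optimal EE
            return q, p_best
        q = rate(p_best) / power(p_best)

p_grid = [i / 1000 for i in range(1, 2001)]   # candidate transmit powers
q_opt, p_opt = dinkelbach(p_grid)
print(q_opt, p_opt)    # maximal energy efficiency and its maximiser
```

Each inner maximisation is a convex-friendly subtractive problem, which is exactly why the transformation makes the EE objective tractable; the circuit-power term in g(p) is what makes full transmit power suboptimal.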