914 results for "Deterministic nanofabrication"


Abstract:

The Chartered Institution of Building Services Engineers (CIBSE) produced a technical memorandum (TM36) presenting research on the impact of future climate on building energy use and thermal comfort. One climate projection for each of four CO2 emissions scenarios was used in TM36, providing a deterministic outlook. As part of the UK Climate Impacts Programme (UKCIP), probabilistic climate projections are being studied in relation to building energy simulation techniques. Including uncertainty in climate projections is considered an important advance in climate impacts modelling and is included in the latest UKCIP data (UKCP09). Incorporating the stochastic nature of these new climate projections in building energy modelling requires a significant increase in data handling and careful statistical interpretation of the results to provide meaningful conclusions. This paper compares the results from building energy simulations when applying deterministic and probabilistic climate data. This is based on two case study buildings: (i) a mixed-mode office building with exposed thermal mass and (ii) a mechanically ventilated, lightweight office building. Building (i) represents an energy-efficient building design that provides passive and active measures to maintain thermal comfort. Building (ii) relies entirely on mechanical means for heating and cooling, with its lightweight construction raising concern over increased cooling loads in a warmer climate. Devising an effective probabilistic approach highlighted greater uncertainty in predicting building performance. Results indicate that the range of calculated quantities depends not only on the building type but also, and strongly, on the performance parameters of interest. Uncertainty is likely to be particularly marked with regard to thermal comfort in naturally ventilated buildings.
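
A toy illustration of the deterministic-versus-probabilistic point (this is not the TM36/UKCP09 methodology or a real building model; the comfort threshold, warming offsets and synthetic weather series are purely illustrative): a single projection yields one performance figure, whereas a set of equally plausible projections yields a distribution whose spread quantifies the uncertainty in predicted performance.

```python
import numpy as np

def annual_overheating_days(daily_peak_temp, comfort_threshold=28.0):
    """Toy performance metric: number of days above a comfort threshold."""
    return int(np.sum(daily_peak_temp > comfort_threshold))

rng = np.random.default_rng(6)
days = np.arange(365)
# synthetic present-day daily peak temperatures (illustrative only)
baseline = 18.0 + 10.0 * np.sin(2 * np.pi * (days - 110) / 365) + 3.0 * rng.standard_normal(365)

# Deterministic outlook: one climate projection per scenario (a single warming offset).
deterministic = annual_overheating_days(baseline + 2.0)

# Probabilistic outlook: many sampled warming offsets give a distribution of the metric.
offsets = rng.normal(loc=2.0, scale=0.8, size=1000)
probabilistic = [annual_overheating_days(baseline + dT) for dT in offsets]
lo, hi = np.percentile(probabilistic, [10, 90])
print(f"deterministic estimate: {deterministic} days above threshold")
print(f"probabilistic 10th-90th percentile range: {lo:.0f}-{hi:.0f} days")
```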

Abstract:

We investigate the error dynamics of cycled data assimilation systems, in which the inverse problem of state determination is solved at times t_k, k = 1, 2, 3, ..., with a first guess given by the state propagated via a dynamical system model from time t_{k−1} to time t_k. In particular, for nonlinear dynamical systems that are Lipschitz continuous with respect to their initial states, we provide deterministic estimates for the evolution of the error ||e_k|| := ||x_k^(a) − x_k^(t)|| between the estimated state x_k^(a) and the true state x_k^(t) over time. Clearly, an observation error of size δ > 0 leads to an estimation error in every assimilation step. These errors can accumulate if they are not (a) controlled in the reconstruction and (b) damped by the dynamical system under consideration. A data assimilation method is called stable if the error in the estimate is bounded in time by some constant C. The key task of this work is to provide estimates for the error ||e_k|| depending on the size δ of the observation error, the reconstruction operator R_α, the observation operator H, and the Lipschitz constants K^(1) and K^(2) of the lower and higher modes, which control the damping behaviour of the dynamics. We show that systems can be stabilized by choosing α sufficiently small, but the bound C will then depend on the data error δ in the form c||R_α||δ for some constant c. Since ||R_α|| → ∞ as α → 0, the constant might be large. Numerical examples of this behaviour in the nonlinear case are provided using the (low-dimensional) Lorenz '63 system.
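
A minimal sketch of such a cycled scheme on the Lorenz '63 system, assuming a Tikhonov-type reconstruction operator R_α = (H^T H + α I)^{-1} H^T and a partial, noisy observation operator H; the specific operators, step sizes and noise level are illustrative choices, not the paper's configuration.

```python
import numpy as np

# Lorenz '63 model with standard parameters, integrated with a simple RK4 step.
def lorenz63(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return np.array([sigma * (x[1] - x[0]), x[0] * (rho - x[2]) - x[1], x[0] * x[1] - beta * x[2]])

def propagate(x, dt=0.01, n_steps=25):
    for _ in range(n_steps):
        k1 = lorenz63(x); k2 = lorenz63(x + 0.5 * dt * k1)
        k3 = lorenz63(x + 0.5 * dt * k2); k4 = lorenz63(x + dt * k3)
        x = x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return x

rng = np.random.default_rng(0)
H = np.eye(3)[:2]           # observe only the first two state components
alpha, delta = 0.1, 0.1     # regularization parameter and observation-error size

# Tikhonov-type reconstruction operator R_alpha = (H^T H + alpha I)^{-1} H^T
R_alpha = np.linalg.solve(H.T @ H + alpha * np.eye(3), H.T)

x_true = np.array([1.0, 1.0, 1.0])
x_est = x_true + 0.5         # perturbed first guess

for k in range(1, 101):
    x_true = propagate(x_true)
    x_bg = propagate(x_est)                          # background: propagated analysis
    y = H @ x_true + delta * rng.standard_normal(2)  # noisy observation
    x_est = x_bg + R_alpha @ (y - H @ x_bg)          # cycled, regularized update
    e_k = np.linalg.norm(x_est - x_true)             # error ||e_k|| tracked over cycles
    if k % 20 == 0:
        print(f"cycle {k:3d}: ||e_k|| = {e_k:.3f}")
```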

Abstract:

Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.

Abstract:

Price movements in many commodity markets exhibit significant seasonal patterns. However, given an observed futures price, a deterministic seasonal component in the price level is not relevant for the pricing of commodity options. This is not the case for the seasonal pattern observed in the volatility of the commodity price. Analyzing an extensive sample of soybean, corn, heating oil and natural gas options, we find that seasonality in volatility is an important aspect to consider when valuing these contracts. The inclusion of an appropriate seasonality adjustment significantly reduces pricing errors in these markets and yields a greater improvement in valuation accuracy than increasing the number of stochastic factors.
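
As a hedged illustration of how a seasonal volatility adjustment can enter option valuation (the sinusoidal form and every parameter below are assumptions for the sketch, not the paper's calibrated model), one can replace the flat volatility in a Black-76 formula with the root-mean-square of a seasonally varying instantaneous volatility over the option's life:

```python
import numpy as np
from scipy.stats import norm

def black76_call(F, K, T, r, sigma):
    """Black-76 price of a call on a futures price F."""
    d1 = (np.log(F / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return np.exp(-r * T) * (F * norm.cdf(d1) - K * norm.cdf(d2))

def seasonal_avg_vol(t0, T, sigma_bar=0.30, amp=0.25, peak=0.75, n=252):
    """Root-mean-square volatility over [t0, t0+T] of an assumed sinusoidal seasonal
    instantaneous volatility sigma(t) = sigma_bar * (1 + amp * cos(2*pi*(t - peak)))."""
    t = np.linspace(t0, t0 + T, n)
    sig_t = sigma_bar * (1.0 + amp * np.cos(2 * np.pi * (t - peak)))
    return np.sqrt(np.mean(sig_t**2))

F, K, r = 100.0, 100.0, 0.03
T = 0.5  # six-month option
flat = black76_call(F, K, T, r, 0.30)
seasonal = black76_call(F, K, T, r, seasonal_avg_vol(t0=0.25, T=T))
print(f"flat-vol price: {flat:.2f}, seasonal-vol price: {seasonal:.2f}")
```

Two otherwise identical options then receive different values depending on where their lives fall relative to the seasonal volatility peak, which is the effect the seasonality adjustment is meant to capture.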

Abstract:

Purpose – This paper seeks to critically review the conceptual frameworks that have been developed for assessing the impact of information and communications technology (ICT) on real estate. Design/methodology/approach – The research is based on a critical review of existing literature and draws from examples of previous empirical research in the field. Findings – The paper suggests that a “socio-technical framework” is more appropriate to examine ICT impact in real estate than other “deterministic” frameworks. Therefore, ICT is an important part of the new economy, but must be seen in the context of a number of other social and economic factors. Research limitations/implications – The research is based on a qualitative assessment of existing frameworks, and by using examples from commercial real estate, assesses the extent to which a “socio-technical” framework can aid understanding of ICT impact. Practical implications – The paper is important in highlighting a number of the main issues in conceptualising ICT impact in real estate and also critically examines the emergence of a new economy in the information society within the general context of real estate. The paper also highlights research gaps in the field. Originality/value – The paper deconstructs the myths of the “death of real estate” and “productivity increase means jobs loss”, in relation to office real estate. Finally, it examines some of the ways in which ICT is impacting on real estate and suggests the most important components for a future research agenda in the field of ICT and real estate impact, and will be of value to property investors, facilities managers, developers, financiers, and others.

Abstract:

Geophysical time series sometimes exhibit serial correlations that are stronger than can be captured by the commonly used first‐order autoregressive model. In this study we demonstrate that a power law statistical model serves as a useful upper bound for the persistence of total ozone anomalies on monthly to interannual timescales. Such a model is usually characterized by the Hurst exponent. We show that the estimation of the Hurst exponent in time series of total ozone is sensitive to various choices made in the statistical analysis, especially whether and how the deterministic (including periodic) signals are filtered from the time series, and the frequency range over which the estimation is made. In particular, care must be taken to ensure that the estimate of the Hurst exponent accurately represents the low‐frequency limit of the spectrum, which is the part that is relevant to long‐term correlations and the uncertainty of estimated trends. Otherwise, spurious results can be obtained. Based on this analysis, and using an updated equivalent effective stratospheric chlorine (EESC) function, we predict that an increase in total ozone attributable to EESC should be detectable at the 95% confidence level by 2015 at the latest in southern midlatitudes, and by 2020–2025 at the latest over 30°–45°N, with the time to detection increasing rapidly with latitude north of this range.
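
A minimal sketch of the kind of estimate discussed here, under simplifying assumptions: the deterministic annual cycle is removed by subtracting monthly means, the spectral slope β of S(f) ∝ f^(−β) is fitted only over the lowest frequencies, and the Hurst exponent is recovered as H = (β + 1)/2 for stationary power-law noise. The filtering choice and frequency cutoff are illustrative, which is exactly the sensitivity the abstract warns about.

```python
import numpy as np

def hurst_from_spectrum(series, months_per_year=12, low_freq_fraction=0.2):
    """Estimate the Hurst exponent from the low-frequency slope of the periodogram.

    The deterministic annual cycle is removed by subtracting monthly means; the
    slope beta of S(f) ~ f^(-beta) is fitted over the lowest frequencies only,
    and H = (beta + 1) / 2 for stationary power-law noise."""
    x = np.asarray(series, dtype=float)
    anomalies = x.copy()
    for m in range(months_per_year):                       # remove monthly climatology
        anomalies[m::months_per_year] -= x[m::months_per_year].mean()
    f = np.fft.rfftfreq(anomalies.size)[1:]
    power = np.abs(np.fft.rfft(anomalies - anomalies.mean()))[1:] ** 2
    keep = f <= low_freq_fraction * f.max()                # low-frequency limit only
    beta = -np.polyfit(np.log(f[keep]), np.log(power[keep]), 1)[0]
    return (beta + 1.0) / 2.0

# Synthetic red-noise (AR(1)) series, just to exercise the estimator.
rng = np.random.default_rng(1)
x = np.zeros(600)
for i in range(1, x.size):
    x[i] = 0.8 * x[i - 1] + rng.standard_normal()
print(f"estimated Hurst exponent: {hurst_from_spectrum(x):.2f}")
```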

Abstract:

Dynamics affects the distribution and abundance of stratospheric ozone directly through transport of ozone itself and indirectly through its effect on ozone chemistry via temperature and transport of other chemical species. Dynamical processes must be considered in order to understand past ozone changes, especially in the northern hemisphere where there appears to be significant low-frequency variability which can look “trend-like” on decadal time scales. A major challenge is to quantify the predictable, or deterministic, component of past ozone changes. Over the coming century, changes in climate will affect the expected recovery of ozone. For policy reasons it is important to be able to distinguish and separately attribute the effects of ozone-depleting substances and greenhouse gases on both ozone and climate. While the radiative-chemical effects can be relatively easily identified, this is not so evident for dynamics — yet dynamical changes (e.g., changes in the Brewer-Dobson circulation) could have a first-order effect on ozone over particular regions. Understanding the predictability and robustness of such dynamical changes represents another major challenge. Chemistry-climate models have recently emerged as useful tools for addressing these questions, as they provide a self-consistent representation of dynamical aspects of climate and their coupling to ozone chemistry. We can expect such models to play an increasingly central role in the study of ozone and climate in the future, analogous to the central role of global climate models in the study of tropospheric climate change.

Abstract:

Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while maintaining the same deterministic nature of the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
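
For reference, a sketch of the straightforward parallel formulation that the abstract contrasts against, with the per-iteration global reduction simulated by a sum over node-local partial results (the node count and synthetic data are illustrative):

```python
import numpy as np

def parallel_kmeans_step(node_data, centroids):
    """One iteration of the straightforward parallel k-means formulation: every node
    computes local sums/counts for its partition, and a global reduction (here
    simulated by summing over nodes) combines them into new centroids."""
    k, dim = centroids.shape
    local_sums = np.zeros((len(node_data), k, dim))
    local_counts = np.zeros((len(node_data), k))
    for n, data in enumerate(node_data):              # each node works on its own partition
        labels = np.argmin(((data[:, None, :] - centroids) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = data[labels == j]
            local_sums[n, j] = members.sum(axis=0) if len(members) else 0.0
            local_counts[n, j] = len(members)
    # global reduction step: this all-to-all sum is what limits scalability
    global_sums = local_sums.sum(axis=0)
    global_counts = local_counts.sum(axis=0)
    return global_sums / np.maximum(global_counts, 1)[:, None]

rng = np.random.default_rng(2)
nodes = [rng.normal(loc=c, size=(200, 2)) for c in (-3.0, 0.0, 3.0)]  # 3 simulated nodes
centroids = rng.normal(size=(3, 2))
for _ in range(10):
    centroids = parallel_kmeans_step(nodes, centroids)
print(np.round(centroids, 2))
```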

Abstract:

This work proposes a unified neurofuzzy modelling scheme. To begin with, the initial fuzzy base construction method is based on fuzzy clustering utilising a Gaussian mixture model (GMM) combined with the analysis of variance (ANOVA) decomposition in order to obtain more compact univariate and bivariate membership functions over the subspaces of the input features. The means and covariances of the Gaussian membership functions are found by the expectation-maximisation (EM) algorithm, with the merit of revealing the underlying density distribution of the system inputs. The resultant set of membership functions forms the basis of the generalised fuzzy model (GFM) inference engine. The model structure and parameters of this neurofuzzy model are identified via supervised subspace orthogonal least squares (OLS) learning. Finally, instead of providing a deterministic class label as the model output, as is conventional, a logistic regression model is applied to produce the classifier's output, in which the sigmoid-type logistic transfer function scales the outputs of the neurofuzzy model to class probabilities. Experimental validation results are presented to demonstrate the effectiveness of the proposed neurofuzzy modelling scheme.
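
A hedged sketch of two of the ingredients described above, using scikit-learn in place of the paper's GFM/OLS machinery: an EM-fitted Gaussian mixture supplies the means and variances of Gaussian membership functions, and a logistic output stage maps the resulting basis to class probabilities rather than a hard label. The component count and toy data are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

# Toy two-class data standing in for the system inputs.
X, y = make_classification(n_samples=400, n_features=4, random_state=0)

# Step 1: EM-fitted Gaussian mixture gives per-feature means/variances that
# define univariate Gaussian membership functions over each input feature.
gmm = GaussianMixture(n_components=3, covariance_type="diag", random_state=0).fit(X)

def membership_features(X, gmm):
    """Evaluate Gaussian membership functions (one per mixture component and
    feature) and use them as the fuzzy-basis representation of the inputs."""
    mu, var = gmm.means_, gmm.covariances_          # shapes: (components, features)
    feats = np.exp(-0.5 * (X[:, None, :] - mu) ** 2 / var)
    return feats.reshape(len(X), -1)

Phi = membership_features(X, gmm)

# Step 2: a logistic (sigmoid) output stage maps the basis outputs to class
# probabilities instead of a hard deterministic label.
clf = LogisticRegression(max_iter=1000).fit(Phi, y)
print("class probabilities for the first sample:", clf.predict_proba(Phi[:1]).round(3))
```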

Abstract:

Despite many decades investigating scalp-recordable 8–13-Hz (alpha) electroencephalographic activity, no consensus has yet emerged regarding its physiological origins or its functional role in cognition. Here we outline a detailed, physiologically meaningful theory for the genesis of this rhythm that may provide important clues to its functional role. In particular, we find that electroencephalographically plausible model dynamics, obtained with physiologically admissible parameterisations, reveal a cortex perched on the brink of stability, which when perturbed gives rise to a range of unanticipated complex dynamics that include 40-Hz (gamma) activity. Preliminary experimental evidence, involving the detection of weak nonlinearity in resting EEG using an extension of the well-known surrogate data method, suggests that nonlinear (deterministic) dynamics are more likely to be associated with weakly damped alpha activity. Thus, rather than the “alpha rhythm” being an idling rhythm, it may be more profitable to conceive of it as a readiness rhythm.
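
A minimal sketch of the basic surrogate data method referred to above (the paper uses an extension of it; the test statistic, mock signal and surrogate count here are illustrative): phase-randomized surrogates preserve the power spectrum while destroying nonlinear structure, so a nonlinearity statistic that stands out against the surrogate distribution is evidence of nonlinear (deterministic) dynamics.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum as x but randomized Fourier phases,
    destroying any nonlinear (deterministic) structure."""
    spec = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, spec.size)
    phases[0] = 0.0                       # keep the zero-frequency component real
    if x.size % 2 == 0:
        phases[-1] = 0.0                  # keep the Nyquist component real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

def time_reversal_asymmetry(x, lag=1):
    """A simple nonlinearity statistic; linear Gaussian processes give values near zero."""
    d = x[lag:] - x[:-lag]
    return np.mean(d**3) / np.mean(d**2) ** 1.5

rng = np.random.default_rng(3)
t = np.arange(2048) / 256.0                               # mock 256-Hz "EEG" segment
signal = ((10 * t) % 1.0) + 0.2 * rng.standard_normal(t.size)  # time-asymmetric 10-Hz ramp + noise

stat_orig = time_reversal_asymmetry(signal)
stat_surr = [time_reversal_asymmetry(phase_randomized_surrogate(signal, rng)) for _ in range(99)]
rank = np.sum(np.abs(stat_orig) > np.abs(stat_surr))
print(f"original statistic exceeds {rank}/99 surrogates")  # high rank -> evidence of nonlinearity
```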

Abstract:

Ensemble-based data assimilation is rapidly proving itself as a computationally efficient and skilful assimilation method for numerical weather prediction, which can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. In this paper, a data selection strategy is presented that aims to assimilate only the observational components that matter most and that can be used with both stochastic and deterministic ensemble filters. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias into the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data selection procedure allows the use of larger localization domains, which may lead to a more balanced analysis. Results from the use of this data selection technique with a two-dimensional linear advection model and a nonlinear advection model, using both in situ and remote sounding observations, are discussed.
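
The abstract does not spell out the selection criterion, so the following is only one plausible sketch, not the paper's method: project the observations onto the eigenbasis of the ensemble-predicted observation covariance, retain only the components the ensemble can actually constrain (ensemble variance above the observation-error level), and apply a stochastic (perturbed-observation) EnKF update to those components. All names and thresholds below are illustrative.

```python
import numpy as np

def select_and_assimilate(X, y, H, r_var, rng, var_ratio=1.0):
    """Stochastic EnKF update that first discards observation components the
    ensemble cannot constrain: components (in the eigenbasis of the projected
    ensemble covariance) with ensemble variance below var_ratio * r_var."""
    n, m = X.shape                                  # state dimension, ensemble size
    A = (X - X.mean(axis=1, keepdims=True)) / np.sqrt(m - 1)   # state perturbations
    HA = H @ A                                      # observation-space perturbations
    w, V = np.linalg.eigh(HA @ HA.T)                # eigenbasis of H P H^T
    keep = w > var_ratio * r_var                    # data selection: informative components only
    Vk = V[:, keep]
    Hk, yk = Vk.T @ H, Vk.T @ y                     # reduced observation operator and data
    R = r_var * np.eye(keep.sum())
    K = A @ (Hk @ A).T @ np.linalg.inv((Hk @ A) @ (Hk @ A).T + R)
    # perturbed-observation (stochastic) update of every ensemble member
    Y = yk[:, None] + np.sqrt(r_var) * rng.standard_normal((keep.sum(), m))
    return X + K @ (Y - Hk @ X)

rng = np.random.default_rng(4)
X = rng.standard_normal((20, 8))                    # 8-member ensemble, 20 state variables
H = np.eye(20)[:15]                                 # 15 high-density observations
truth = np.zeros(20)
y = H @ truth + 0.3 * rng.standard_normal(15)
Xa = select_and_assimilate(X, y, H, r_var=0.09, rng=rng)
print("analysis-mean RMSE:", np.round(np.linalg.norm(Xa.mean(axis=1) - truth) / np.sqrt(20), 3))
```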

Abstract:

This article examines the potential to improve numerical weather prediction (NWP) by estimating upper and lower bounds on predictability, revisiting the original study of Lorenz (1982) but applying it to the most recent version of the European Centre for Medium-Range Weather Forecasts (ECMWF) forecast system, for both the deterministic and ensemble prediction systems (EPS). These bounds are contrasted with those from an older version of the same NWP system to see how they have changed as the system has improved. The computations were performed for the earlier seasons of DJF 1985/1986 and JJA 1986 and the later seasons of DJF 2010/2011 and JJA 2011 using the 500-hPa geopotential height field. Results indicate that, for this field, we may be approaching the limit of deterministic forecasting, so that further improvements might only be obtained by improving the initial state. The results also show that predictability calculations with earlier versions of the model may overestimate potential forecast skill, which may be due to insufficient internal variability in the model and because recent versions of the model are more realistic in representing the true atmospheric evolution. The same methodology is applied to the EPS to calculate upper and lower bounds of predictability of the ensemble-mean forecast in order to explore how ensemble forecasting could extend the limits of the deterministic forecast. The results show that there is large potential to improve the ensemble predictions, but the increased predictability of the ensemble mean comes with a trade-off in information, as the forecasts become increasingly smoothed with time. From around the 10-d forecast time, the ensemble mean begins to converge towards climatology. Until this point, the ensemble mean is able to predict the main features of the large-scale flow accurately and with high consistency from one forecast cycle to the next. By the 15-d forecast time, the ensemble mean has lost information, with the anomaly of the flow strongly smoothed out. In contrast, the control forecast is much less consistent from run to run, and provides more detailed (unsmoothed) but less useful information.
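
A sketch of the Lorenz (1982)-style calculation mentioned above, run on synthetic data: the error of the d-day forecast against the verifying analysis measures current skill, while the difference between d-day and (d−1)-day forecasts valid at the same time measures the model's own error growth and hence indicates the skill achievable with better initial states. The synthetic series and error model are illustrative only.

```python
import numpy as np

def lorenz1982_error_curves(forecasts, verification):
    """forecasts[d] holds the d-day forecasts valid at the verification times.

    Returns two RMS error curves in the spirit of Lorenz (1982):
    - actual:       d-day forecast vs. the verifying analysis (current skill);
    - model_growth: d-day vs. (d-1)-day forecast valid at the same time, i.e. the
      model's own error growth, used to bound the achievable skill."""
    n_lead = forecasts.shape[0]
    actual = [np.sqrt(np.mean((forecasts[d] - verification) ** 2)) for d in range(n_lead)]
    model_growth = [0.0] + [np.sqrt(np.mean((forecasts[d] - forecasts[d - 1]) ** 2))
                            for d in range(1, n_lead)]
    return np.array(model_growth), np.array(actual)

# Synthetic illustration with a scalar field whose forecast error grows with lead time.
rng = np.random.default_rng(5)
truth = rng.standard_normal(90)                                     # 90 verification days
leads = np.arange(11)                                               # 0..10-day leads
forecasts = truth + np.array([0.1 * d * rng.standard_normal(90) for d in leads])
model_growth, actual = lorenz1982_error_curves(forecasts, truth)
print("actual RMSE by lead day:   ", np.round(actual, 2).tolist())
print("forecast-difference growth:", np.round(model_growth, 2).tolist())
```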

Abstract:

In accord with the general program of researching factors relating to ultimate attainment and maturational constraints in adult language acquisition, this commentary highlights the importance of differences in the amount, type, and setting of input between naturalistic and classroom learners of an L2. It is suggested that these variables are often confounded with age factors. Herein, we wish to call attention to the possibly deterministic role that differences in the grammatical quality of classroom input play in development and in competence outcomes. Given what we see as the greater formal complexity of the learning task for classroom learners, we suggest that one might benefit from focusing less on difference and more on how classroom L2 learners, at least some of them, come to acquire all that they do despite crucial qualitative differences in their input.

Abstract:

One central question in the formal linguistic study of adult multilingual morphosyntax (i.e., L3/Ln acquisition) involves determining the role(s) the L1 and/or the L2 play(s) at the L3 initial state (e.g., Bardel & Falk, Second Language Research 23: 459–484, 2007; Falk & Bardel, Second Language Research: forthcoming; Flynn et al., The International Journal of Multilingualism 8: 3–16, 2004; Rothman, Second Language Research: forthcoming; Rothman & Cabrelli, On the initial state of L3 (Ln) acquisition: Selective or absolute transfer?: 2007; Rothman & Cabrelli Amaro, Second Language Research 26: 219–289, 2010). The present article adds to this general program, testing Rothman's (Second Language Research: forthcoming) model for L3 initial state transfer, which, when relevant in light of specific language pairings, maintains that typological proximity between the languages is the most deterministic variable in the selection of syntactic transfer. Herein, I present empirical evidence from the later part of the beginning stages of L3 Brazilian Portuguese (BP) by native speakers of English and Spanish who have attained an advanced level of proficiency in either English or Spanish as an L2. Examining the related domains of syntactic word order and relative clause attachment preference in L3 BP, the data clearly indicate that Spanish is transferred for both experimental groups, irrespective of whether it was the L1 or the L2. These results are expected under Rothman's (Second Language Research: forthcoming) model, but not necessarily predicted by other current hypotheses of multilingual syntactic transfer; the implications of this are discussed.

Abstract:

We develop and analyze a class of efficient Galerkin approximation methods for uncertainty quantification of nonlinear operator equations. The algorithms are based on sparse Galerkin discretizations of tensorized linearizations at nominal parameters. Specifically, we consider abstract, nonlinear, parametric operator equations J(\alpha, u) = 0 for random input \alpha(\omega) with almost sure realizations in a neighborhood of a nominal input parameter \alpha_0. Under some structural assumptions on the parameter dependence, we prove existence and uniqueness of a random solution u(\omega) = S(\alpha(\omega)). We derive a multilinear, tensorized operator equation for the deterministic computation of k-th order statistical moments of the random solution's fluctuations u(\omega) - S(\alpha_0). We introduce and analyze sparse tensor Galerkin discretization schemes for the efficient, deterministic computation of the k-th statistical moment equation. We prove a shift theorem for the k-point correlation equation in anisotropic smoothness scales and deduce that sparse tensor Galerkin discretizations of this equation converge, in accuracy versus complexity, at a rate which equals, up to logarithmic terms, that of the Galerkin discretization of a single instance of the mean field problem. We illustrate the abstract theory for nonstationary diffusion problems in random domains.
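
To make the structure of these moment equations concrete, here is a sketch of the first-order linearization and the resulting second-moment (two-point correlation) equation; the operators A and B are notation introduced here for illustration only and are not spelled out in the abstract.

```latex
% Sketch only: linearize J(\alpha, u) = 0 at (\alpha_0, u_0) with u_0 = S(\alpha_0),
% writing A := D_u J(\alpha_0, u_0) and B := D_\alpha J(\alpha_0, u_0)
% (A and B are introduced here for illustration).
\[
  A\,\bigl(u(\omega) - u_0\bigr) \approx -\,B\,\bigl(\alpha(\omega) - \alpha_0\bigr),
\]
\[
  (A \otimes A)\,\mathbb{E}\!\left[\bigl(u(\omega) - u_0\bigr)^{\otimes 2}\right]
    \approx (B \otimes B)\,\mathbb{E}\!\left[\bigl(\alpha(\omega) - \alpha_0\bigr)^{\otimes 2}\right].
\]
% The two-point correlation of the solution fluctuation thus solves a deterministic,
% tensorized operator equation; the k-th moment case replaces \otimes 2 by \otimes k,
% which is the type of equation the sparse tensor Galerkin schemes discretize.
```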