914 results for Deterministic nanofabrication


Relevance: 10.00%

Abstract:

Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
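As a purely illustrative sketch (not taken from the review), the idea that unresolved degrees of freedom can be represented by a combination of deterministic and stochastic components can be reduced to its simplest form: an Euler–Maruyama integration of a one-dimensional Ornstein–Uhlenbeck model, where deterministic damping stands in for the resolved dynamics and additive noise for the aggregate effect of the subgrid scales. All parameter names and values below are hypothetical:

```python
import numpy as np

def integrate_reduced_model(x0, gamma, sigma, dt, n_steps, seed=0):
    """Euler-Maruyama integration of dx = -gamma*x dt + sigma dW.
    The -gamma*x term is the deterministic (resolved) part; the noise
    term crudely represents unresolved subgrid-scale variability."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = (x[k] - gamma * x[k] * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal())
    return x

# 50 time units of a damped, stochastically forced scalar mode.
traj = integrate_reduced_model(x0=1.0, gamma=0.5, sigma=0.2, dt=0.01, n_steps=5000)
```

In a stationary state this toy model fluctuates around zero with variance sigma^2 / (2 * gamma); richer reduced-order models add non-Markovian memory terms on top of this Markovian skeleton.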

Relevance: 10.00%

Abstract:

We propose a topological approach to the problem of determining a curve from its iterated integrals. In particular, we prove that a family of terms in the signature series of a two-dimensional closed curve with finite p-variation, 1≤p<2, are in fact moments of its winding number. This relation allows us to prove that the signature series of a class of simple non-smooth curves uniquely determine the curves. This implies that outside a chordal SLEκ null set, where 0<κ≤4, the signature series of curves uniquely determine the curves. Our calculations also enable us to express the Fourier transform of the n-point functions of SLE curves in terms of the expected signature of SLE curves. Although the techniques used in this article are deterministic, the results provide a platform for studying SLE curves through the signatures of their sample paths.

Relevance: 10.00%

Abstract:

This chapter has two goals: (a) to discuss the Spanish-Portuguese interface in current formal language acquisition research and (b) to highlight the contributions of this language pairing in the emerging field of formal third language acquisition. The authors discuss two L3 acquisition studies (Montrul, Dias, & Santos, 2011; Giancaspro, Halloran, & Iverson, in press) examining Differential Object Marking, a morphological case marker present in Spanish but not in Portuguese, arguing that the results show how data from Spanish-English bilinguals learning Brazilian Portuguese as an L3 illuminate the deterministic role of structural and typological similarity in linguistic transfer. The data provide supportive evidence for only one of three existing L3 transfer models: the Typological Proximity Model (Rothman, 2010, 2011, 2013).

Relevance: 10.00%

Abstract:

This thesis explores human-environment interactions during the Mid-Late Holocene in raised bogs in central Ireland. The raised bogs of central Ireland are widely recognised for their considerable palaeoenvironmental and archaeological resources: research over the past few decades has established the potential for such sites to preserve sensitive records of Holocene climatic variability expressed as changes in bog surface wetness (BSW); meanwhile, archaeological investigations over the past century have uncovered hundreds of peatland archaeological features dating from the Neolithic through to the Post-Medieval period, including wooden trackways, platforms, and deposits of high-status metalwork. Previous studies have attempted to explore the relationship between records of past environmental change and the occurrence of peatland archaeological sites, reaching varying conclusions. More recently, environmentally-deterministic models of human-environment interaction in Irish raised bogs at the regional scale have been explicitly tested, leading to the conclusion that there is no relationship between BSW and past human activity. These relationships are examined in more detail on a site-by-site basis in this thesis. To that end, testate amoebae-derived BSW records from nine milled former raised bogs in central Ireland were produced from sites with known and dated archaeological records. Relationships between BSW records and environmental conditions within the study area were explored through both the development of a new central Ireland testate amoebae transfer function and through comparisons between recent BSW records and instrumental weather data. Compilation of the BSW records from the nine fossil study sites shows evidence both for climate forcing, particularly during 3200-2400 cal BP, and for considerable inter-site variability. Considerable inter-site variability was also evident in the archaeological records of the same sites.
Whilst comparisons between BSW and archaeological records do not show a consistent linear relationship, examination of the records on a site-by-site basis was shown to reveal interpretatively important contingent relationships. It is concluded, therefore, that future research on human-environment interactions should focus on individual sites and should utilise theoretical approaches from the humanities in order to avoid the twin pitfalls of masking important local patterns of change and of environmental determinism.

Relevance: 10.00%

Abstract:

Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961–2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño–Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
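The regression core of such a system can be caricatured in a few lines: a trend predictor plus a mode-of-variability predictor, fitted by least squares and verified deterministically by correlation. The data below are synthetic stand-ins, not the paper's hindcast set, and the coefficients and predictor names are hypothetical:

```python
import numpy as np

# Synthetic illustration: a "temperature" series driven by a CO2-like
# trend plus an ENSO-like oscillation plus weather noise.
rng = np.random.default_rng(1)
years = np.arange(1961, 2014)
co2 = 320.0 + 1.5 * (years - 1961)         # hypothetical CO2-equivalent trend
enso = np.sin(2 * np.pi * years / 4.0)     # hypothetical mode of variability
temp = 0.01 * co2 + 0.3 * enso + 0.1 * rng.standard_normal(years.size)

# Multiple linear regression: CO2 as primary predictor, the mode of
# variability as a secondary predictor, plus an intercept.
X = np.column_stack([np.ones_like(co2), co2, enso])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
hindcast = X @ coef

# Deterministic verification: correlation of hindcast against "observations".
skill = np.corrcoef(hindcast, temp)[0, 1]
```

In this toy setup most of the skill comes from the trend predictor, mirroring the paper's emphasis on the climate change signal as a source of skill; a probabilistic version would issue a predictive distribution around `hindcast` rather than a point value.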

Relevance: 10.00%

Abstract:

This paper shows that radiometer channel radiances for cloudy atmospheric conditions can be simulated with an optimised frequency grid derived under clear-sky conditions. A new clear-sky optimised grid is derived for AVHRR channel 5 (12 μm, 833 cm⁻¹). For HIRS channel 11 (7.33 μm, 1364 cm⁻¹) and AVHRR channel 5, radiative transfer simulations using an optimised frequency grid are compared with simulations using a reference grid, where the optimised grid has roughly 100–1000 times fewer frequencies than the full grid. The root mean square error between the optimised and the reference simulation is found to be less than 0.3 K for both comparisons, with the magnitude of the bias less than 0.03 K. The simulations have been carried out with the radiative transfer model Atmospheric Radiative Transfer Simulator (ARTS), version 2, using a backward Monte Carlo module for the treatment of clouds. With this module, the optimised simulations are more than 10 times faster than the reference simulations. Although the number of photons is the same, the smaller number of frequencies reduces the overhead for preparing the optical properties for each frequency. With deterministic scattering solvers, the relative decrease in runtime would be even greater. The results allow for new radiative transfer applications, such as the development of new retrievals, because it becomes much quicker to carry out a large number of simulations. The conclusions are applicable to any downlooking infrared radiometer.

Relevance: 10.00%

Abstract:

This study examines convection-permitting numerical simulations of four cases of terrain-locked quasi-stationary convective bands over the UK. For each case, a 2.2-km grid-length 12-member ensemble and 1.5-km grid-length deterministic forecast are analyzed, each with two different initialization times. Object-based verification is applied to determine whether the simulations capture the structure, location, timing, intensity and duration of the observed precipitation. These verification diagnostics reveal that the forecast skill varies greatly between the four cases. Although the deterministic and ensemble simulations captured some aspects of the precipitation correctly in each case, they never simultaneously captured all of them satisfactorily. In general, the models predicted banded precipitation accumulations at approximately the correct time and location, but the precipitating structures were more cellular and less persistent than the coherent quasi-stationary bands that were observed. Ensemble simulations from the two different initialization times were not significantly different, which suggests a potential benefit of time-lagging subsequent ensembles to increase ensemble size. The predictive skill of the upstream larger-scale flow conditions and the simulated precipitation on the convection-permitting grids were strongly correlated, which suggests that more accurate forecasts from the parent ensemble should improve the performance of the convection-permitting ensemble nested within it.

Relevance: 10.00%

Abstract:

The derivation of time evolution equations for slow collective variables starting from a microscopic model system is demonstrated for the tutorial example of the classical, two-dimensional XY model. Projection operator techniques are used within a nonequilibrium thermodynamics framework together with molecular simulations in order to establish the building blocks of the hydrodynamic equations: Poisson brackets that determine the deterministic drift, the driving forces from the macroscopic free energy and the friction matrix. The approach is rather general and can be applied for deriving the equations of slow variables for a broad variety of systems.

Relevance: 10.00%

Abstract:

Based on a large dataset from eight Asian economies, we test the impact of post-crisis regulatory reforms on the performance of depository institutions in countries at different levels of financial development. We allow for technological heterogeneity and estimate a set of country-level stochastic cost frontiers followed by a deterministic bootstrapped meta-frontier to evaluate cost efficiency and cost technology. Our results support the view that liberalization policies have a positive impact on bank performance, while the reverse is true for prudential regulation policies. The removal of activities restrictions, bank privatization and foreign bank entry have a positive and significant impact on technological progress and cost efficiency. In contrast, prudential policies, which aim to protect the banking sector from excessive risk-taking, tend to adversely affect banks' cost efficiency but not cost technology.

Relevance: 10.00%

Abstract:

Following the 1997 crisis, banking sector reforms in Asia have been characterised by the emphasis on prudential regulation, associated with increased financial liberalisation. Using a panel data set of commercial banks from eight major Asian economies over the period 2001-2010, this study explores how the coexistence of liberalisation and prudential regulation affects banks’ cost characteristics. Given the presence of heterogeneity of technologies across countries, we use a stochastic frontier approach followed by the estimation of a deterministic meta-frontier to provide ‘true’ estimates of bank cost efficiency measures. Our results show that the liberalization of bank interest rates and the increase in foreign banks' presence have had a positive and significant impact on technological progress and cost efficiency. On the other hand, we find that prudential regulation might adversely affect bank cost performance. When designing an optimal regulatory framework, policy makers should combine policies which aim to foster financial stability without hindering financial intermediation.

Relevance: 10.00%

Abstract:

The Plant–Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme with a simple stochastic scheme only, from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant–Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant–Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.

Relevance: 10.00%

Abstract:

Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called “How much are you prepared to pay for a forecast?”. The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydrometeorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants’ willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.
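The difficulty of turning an event probability into a binary decision is classically framed with the cost-loss rule, the kind of trade-off such a game probes: a rational user pays for protection whenever the forecast probability exceeds the ratio of protection cost to avoidable loss. The sketch below assumes that standard framing and is not code from the study:

```python
def should_protect(prob_flood: float, cost: float, loss: float) -> bool:
    """Cost-loss decision rule: pay `cost` for mitigation whenever the
    forecast probability of the damaging event exceeds cost/loss."""
    if not 0.0 <= prob_flood <= 1.0:
        raise ValueError("probability must lie in [0, 1]")
    return prob_flood > cost / loss

# A user facing a potential loss of 100 (in some currency unit) and a
# protection cost of 20 should act whenever the probability exceeds 0.2.
```

The rule makes explicit why the same probabilistic forecast has different value for different users: the decision threshold depends on each user's own cost-loss ratio, not on the forecast alone.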

Relevance: 10.00%

Abstract:

A particle filter method is presented for the discrete-time filtering problem with nonlinear Itô stochastic ordinary differential equations (SODEs) with additive noise, supposed to be analytically integrable as a function of the underlying vector Wiener process and time. The Diffusion Kernel Filter is arrived at by a parametrization of small noise-driven state fluctuations within branches of prediction and a local use of this parametrization in the Bootstrap Filter. The method applies for small noise and short prediction steps. With explicit numerical integrators, the operations count in the Diffusion Kernel Filter is shown to be smaller than in the Bootstrap Filter whenever the initial state for the prediction step has sufficiently few moments. The established parametrization is a dual formula for the analysis of sensitivity to Gaussian initial perturbations and the analysis of sensitivity to noise perturbations in deterministic models, showing in particular how the stability of a deterministic dynamics is modeled by noise on short times and how the diffusion matrix of an SODE should be modeled (i.e. defined) for a Gaussian-initial deterministic problem to be cast into an SODE problem. From it, a novel definition of prediction may be proposed that coincides with the deterministic path within the branch of prediction whose information entropy at the end of the prediction step is closest to the average information entropy over all branches. Tests are made with the Lorenz-63 equations, showing good results both for the filter and the definition of prediction.
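To situate the comparison, a minimal twin-experiment sketch of the baseline Bootstrap Filter on the Lorenz-63 equations is given below: propagate particles through the noisy model, weight them by the observation likelihood, resample. The Diffusion Kernel Filter itself is not reproduced, and all numerical settings (integrator, noise levels, observation operator) are illustrative choices, not those of the paper:

```python
import numpy as np

def lorenz63_step(x, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One explicit Euler step of the Lorenz-63 equations,
    vectorised over particles (x has shape (..., 3))."""
    dx = sigma * (x[..., 1] - x[..., 0])
    dy = x[..., 0] * (rho - x[..., 2]) - x[..., 1]
    dz = x[..., 0] * x[..., 1] - beta * x[..., 2]
    return x + dt * np.stack([dx, dy, dz], axis=-1)

def bootstrap_filter(obs, x0, n_particles=500, dt=0.01, steps_per_obs=10,
                     model_noise=0.2, obs_noise=1.0, seed=0):
    """Plain Bootstrap Filter observing the first state component."""
    rng = np.random.default_rng(seed)
    particles = x0 + rng.standard_normal((n_particles, 3))
    means = []
    for y in obs:
        for _ in range(steps_per_obs):  # prediction step with additive noise
            particles = lorenz63_step(particles, dt)
            particles += model_noise * np.sqrt(dt) * rng.standard_normal(particles.shape)
        logw = -0.5 * ((particles[:, 0] - y) / obs_noise) ** 2
        w = np.exp(logw - logw.max())   # stabilised Gaussian likelihood weights
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)  # resampling
        particles = particles[idx]
        means.append(particles.mean(axis=0))
    return np.array(means)

# Twin experiment: generate a truth run, observe x with noise, filter it.
rng = np.random.default_rng(1)
truth = [np.array([1.0, 1.0, 25.0])]
for _ in range(400):
    truth.append(lorenz63_step(truth[-1], 0.01))
truth = np.array(truth)
obs_states = truth[10::10][:40]                       # true states at obs times
obs = obs_states[:, 0] + rng.standard_normal(40)      # noisy obs of x
est = bootstrap_filter(obs, truth[0])
rmse_x = np.sqrt(np.mean((est[:, 0] - obs_states[:, 0]) ** 2))
```

The Diffusion Kernel Filter replaces the brute-force noisy propagation with a parametrization of the small noise-driven fluctuations around each deterministic branch, which is where its operations-count advantage comes from.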

Relevance: 10.00%

Abstract:

This paper presents a GIS-based multicriteria flood risk assessment and mapping approach applied to coastal drainage basins where hydrological data are not available. It involves risk to different types of possible processes: coastal inundation (storm surge), river, estuarine and flash flood, either at urban or natural areas, and fords. Based on the causes of these processes, several environmental indicators were taken to build up the risk assessment. Geoindicators include geological-geomorphologic properties of Quaternary sedimentary units, water table, drainage basin morphometry, coastal dynamics, beach morphodynamics and microclimatic characteristics. Bioindicators involve coastal plain and low slope native vegetation categories and two alteration states. Anthropogenic indicators encompass land use category properties such as type, occupation density, urban structure type and occupation consolidation degree. The selected indicators were stored within an expert Geoenvironmental Information System developed for the State of Sao Paulo Coastal Zone (SIIGAL), whose attributes were mathematically classified through deterministic approaches in order to estimate natural susceptibilities (Sn), human-induced susceptibilities (Sa), the return period of rain events (Ri), potential damages (Dp) and the risk classification (R), according to the equation R = (Sn · Sa · Ri) · Dp. Thematic maps were automatically processed within the SIIGAL, in which automata cells ("geoenvironmental management units") aggregating geological-geomorphologic and land use/native vegetation categories were the units of classification. The method has been applied to the Northern Littoral of the State of Sao Paulo (Brazil) in 32 small drainage basins, demonstrating to be very useful for coastal zone public policies, civil defense programs and flood management.
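Read numerically, the risk equation R = (Sn · Sa · Ri) · Dp is a simple product of indicator scores per management unit. The ordinal 1–3 scoring scale used below is a hypothetical illustration, not the SIIGAL classification itself:

```python
def risk_class(sn: int, sa: int, ri: int, dp: int) -> int:
    """Hypothetical numeric reading of R = (Sn . Sa . Ri) . Dp, with each
    indicator scored on an ordinal scale (e.g. 1 = low to 3 = high):
    sn = natural susceptibility, sa = human-induced susceptibility,
    ri = return period class of rain events, dp = potential damage."""
    return (sn * sa * ri) * dp

# A basin unit with high natural susceptibility (3), moderate human-induced
# susceptibility (2), frequent critical rain events (3) and high potential
# damage (3) scores 3 * 2 * 3 * 3 = 54, the upper end of this 1-81 scale.
```

Because the factors multiply, a low score on any single indicator (for example, negligible potential damage in an uninhabited unit) pulls the whole unit's risk class down, which is the usual rationale for multiplicative rather than additive risk aggregation.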

Relevance: 10.00%

Abstract:

Conventional procedures employed in the modeling of viscoelastic properties of polymers rely on the determination of the polymer's discrete relaxation spectrum from experimentally obtained data. In the past decades, several analytical regression techniques have been proposed to determine an explicit equation which describes the measured spectra. Taking a different approach, the procedure introduced herein constitutes a simulation-based computational optimization technique built on a non-deterministic search method arising from the field of evolutionary computation. Rather than comparing numerical results, the purpose of this paper is to highlight some subtle differences between both strategies and to focus on which properties of the exploited technique emerge as new possibilities for the field. To illustrate this, the cases essayed show how the employed technique can outperform conventional approaches in terms of fitting quality. Moreover, in some instances, it produces equivalent results with far fewer fitting parameters, which is convenient for computational simulation applications. The problem formulation and the rationale of the highlighted method are discussed herein and constitute the main intended contribution. © 2009 Wiley Periodicals, Inc. J Appl Polym Sci 113: 122–135, 2009.
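As a sketch of this kind of non-deterministic search (the paper's specific evolutionary scheme is not reproduced here), a minimal (1+λ) evolution strategy can fit a two-mode Prony series, i.e. a discrete relaxation spectrum G(t) = Σ gᵢ exp(-t/τᵢ), to synthetic data. The mutation scheme, population size and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic relaxation-modulus "data" from a known two-mode Prony series.
t = np.logspace(-2, 2, 50)

def prony(params, t):
    """Two-mode discrete relaxation spectrum: g1*exp(-t/tau1) + g2*exp(-t/tau2)."""
    g1, tau1, g2, tau2 = params
    return g1 * np.exp(-t / tau1) + g2 * np.exp(-t / tau2)

target = prony([1.0, 0.1, 0.5, 10.0], t)

def fitness(params):
    """Sum of squared residuals against the synthetic data (lower is better)."""
    return float(np.sum((prony(params, t) - target) ** 2))

# Minimal (1+lambda) evolution strategy: mutate the incumbent, keep the best.
# Log-normal mutation keeps moduli and relaxation times strictly positive.
best = np.array([0.5, 1.0, 0.5, 1.0])
best_f = fitness(best)
for _ in range(3000):
    children = best * np.exp(0.1 * rng.standard_normal((20, 4)))
    f = np.array([fitness(c) for c in children])
    if f.min() < best_f:
        best, best_f = children[f.argmin()], float(f.min())
```

Unlike analytical regression, this search needs only fitness evaluations (no derivatives or closed-form model structure), which is the flexibility the paper attributes to evolutionary approaches.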