75 results for Deterministic walkers


Relevance: 10.00%

Abstract:

Field configured assembly is a programmable force field method that permits rapid, "hands-free" manipulation, assembly, and integration of mesoscale objects and devices. In this method, electric fields, configured by specific addressing of receptor and counter electrode sites pre-patterned at a silicon chip substrate, drive the field assisted transport, positioning, and localization of mesoscale devices at selected receptor locations. Using this approach, we demonstrate field configured deterministic and stochastic self-assembly of model mesoscale devices, i.e., 50 µm diameter, 670 nm emitting GaAs-based light emitting diodes, at targeted receptor sites on a silicon chip. The versatility of the field configured assembly method suggests that it is applicable to self-assembly of a wide variety of functionally integrated nanoscale and mesoscale systems.

Relevance: 10.00%

Abstract:

Long-range dependence in volatility is one of the most prominent examples in financial market research involving universal power laws. Its characterization has recently spurred attempts to provide some explanations of the underlying mechanism. This paper contributes to this recent line of research by analyzing a simple market fraction asset pricing model with two types of traders: fundamentalists, who trade on the price deviation from estimated fundamental value, and trend followers, whose conditional mean and variance of the trend are updated through a geometric learning process. Our analysis shows that agent heterogeneity, risk-adjusted trend chasing through the geometric learning process, and the interplay of the noisy fundamental and demand processes with the underlying deterministic dynamics can be the source of power-law distributed fluctuations. In particular, the noisy demand plays an important role in generating the insignificant autocorrelations (ACs) of returns, while the significant decaying AC patterns of the absolute and squared returns are influenced more by the noisy fundamental process. A statistical analysis based on Monte Carlo simulations is conducted to characterize the decay rate. Realistic estimates of the power-law decay indices and the (FI)GARCH parameters are presented.
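
As a rough illustration of the kind of mechanism described in this abstract, the sketch below simulates a market-fraction model with fundamentalists, trend followers whose trend estimates are updated by geometric (exponentially weighted) learning, a noisy fundamental, and noisy demand. The functional forms, parameter names (mu, alpha, gamma, delta, lam) and values are illustrative assumptions, not the paper's specification.

```python
import numpy as np

# Illustrative market-fraction model: fundamentalists vs. trend followers with
# geometric learning, a noisy fundamental and noisy demand (all assumptions).
rng = np.random.default_rng(0)

T = 10_000          # time steps
mu = 0.5            # market fraction of fundamentalists
alpha = 1.0         # fundamentalist reaction strength
gamma = 2.1         # trend-follower extrapolation strength
delta = 0.85        # geometric decay used in the learning process
lam = 0.05          # market-maker price adjustment speed
sigma_f = 0.01      # std of the noisy fundamental process
sigma_d = 0.05      # std of the noisy (liquidity) demand

p = np.zeros(T)     # log price
f = np.zeros(T)     # log fundamental value
u, v = 0.0, 1e-4    # learned conditional mean and variance of the trend

for t in range(1, T):
    f[t] = f[t - 1] + sigma_f * rng.standard_normal()   # random-walk fundamental
    ret = p[t - 1] - p[t - 2] if t > 1 else 0.0
    # geometric learning of the trend's conditional mean and variance
    u = delta * u + (1 - delta) * ret
    v = delta * v + (1 - delta) * (ret - u) ** 2
    d_fund = alpha * (f[t] - p[t - 1])                   # fundamentalist demand
    d_trend = gamma * u / (1.0 + v)                      # risk-adjusted trend chasing
    noise = sigma_d * rng.standard_normal()              # noisy demand
    # market maker moves the price in proportion to aggregate excess demand
    p[t] = p[t - 1] + lam * (mu * d_fund + (1 - mu) * d_trend + noise)

r = np.diff(p)
print("AC(1) of returns:   %+.3f" % np.corrcoef(r[1:], r[:-1])[0, 1])
print("AC(1) of |returns|: %+.3f" % np.corrcoef(np.abs(r[1:]), np.abs(r[:-1]))[0, 1])
```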

Relevance: 10.00%

Abstract:

This paper is a contribution to the literature on the explanatory power and calibration of heterogeneous asset pricing models. We set out a new stochastic market-fraction asset pricing model of fundamentalists and trend followers under a market maker. Our model explains key features of financial market behaviour such as market dominance, convergence to the fundamental price and under- and over-reaction. We use the dynamics of the underlying deterministic system to characterize these features and statistical properties, including convergence of the limiting distribution and autocorrelation structure. We confirm these properties using Monte Carlo simulations.

Relevance: 10.00%

Abstract:

Traditionally, the Internet provides only a “best-effort” service, treating all packets going to the same destination equally. However, providing differentiated services to different users based on their quality requirements is becoming an increasingly pressing issue. To do this, routers need the capability to distinguish and isolate traffic belonging to different flows. This ability to determine the flow each packet belongs to is called packet classification. Technology vendors are reluctant to support algorithmic solutions for classification due to their non-deterministic performance. Although CAMs are favoured by technology vendors due to their deterministic, high lookup rates, they suffer from high power dissipation and high silicon cost. This paper provides a new algorithmic-architectural solution for packet classification that mixes CAMs with algorithms based on multi-level cutting of the classification space into smaller spaces. The solution exploits the geometrical distribution of rules in the classification space. It provides the deterministic performance of CAMs, support for dynamic updates, and added flexibility for system designers.
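
As a toy illustration of the cutting idea (not the paper's architecture), the sketch below recursively halves a two-dimensional (source, destination) rule space until each leaf holds only a few rules, which can then be searched linearly the way a small CAM would. The rule format and the fixed binary-cut heuristic are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    src: tuple        # (lo, hi) range on the source-address axis
    dst: tuple        # (lo, hi) range on the destination-address axis
    action: str

def overlaps(rule, region):
    (slo, shi), (dlo, dhi) = region
    return (rule.src[0] <= shi and rule.src[1] >= slo and
            rule.dst[0] <= dhi and rule.dst[1] >= dlo)

def build(rules, region, leaf_size=2, depth=0, axis=0):
    """Recursively cut the region in half along alternating axes."""
    if len(rules) <= leaf_size or depth > 16:
        return ("leaf", rules)
    (slo, shi), (dlo, dhi) = region
    if axis == 0:
        mid = (slo + shi) // 2
        lo_reg, hi_reg = ((slo, mid), (dlo, dhi)), ((mid + 1, shi), (dlo, dhi))
    else:
        mid = (dlo + dhi) // 2
        lo_reg, hi_reg = ((slo, shi), (dlo, mid)), ((slo, shi), (mid + 1, dhi))
    return ("node", axis, mid,
            build([r for r in rules if overlaps(r, lo_reg)], lo_reg, leaf_size, depth + 1, 1 - axis),
            build([r for r in rules if overlaps(r, hi_reg)], hi_reg, leaf_size, depth + 1, 1 - axis))

def classify(node, src, dst):
    while node[0] == "node":
        _, axis, mid, lo_child, hi_child = node
        node = lo_child if (src if axis == 0 else dst) <= mid else hi_child
    for r in node[1]:                      # small leaf: linear, CAM-like search
        if r.src[0] <= src <= r.src[1] and r.dst[0] <= dst <= r.dst[1]:
            return r.action
    return "default"

rules = [Rule((0, 63), (0, 255), "deny"),
         Rule((64, 255), (0, 127), "permit"),
         Rule((0, 255), (128, 255), "log")]
tree = build(rules, ((0, 255), (0, 255)))
print(classify(tree, 10, 200))             # -> "deny" (first matching rule in the leaf)
```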

Relevance: 10.00%

Abstract:

The paper is primarily concerned with the modelling of aircraft manufacturing cost. The aim is to establish an integrated life cycle balanced design process through a systems engineering approach to interdisciplinary analysis and control. The cost modelling is achieved using the genetic causal approach that enforces product family categorisation and the subsequent generation of causal relationships between deterministic cost components and their design source. This utilises causal parametric cost drivers and the definition of the physical architecture from the Work Breakdown Structure (WBS) to identify product families. The paper presents applications to the overall aircraft design with a particular focus on the fuselage as a subsystem of the aircraft, including fuselage panels and localised detail, as well as engine nacelles. The higher level application to aircraft requirements and functional analysis is investigated and verified relative to life cycle design issues for the relationship between acquisition cost and Direct Operational Cost (DOC), for a range of both metal and composite subsystems. Maintenance is considered in some detail as an important contributor to DOC and life cycle cost. The lower level application to aircraft physical architecture is investigated and verified for the WBS of an engine nacelle, including a sequential build stage investigation of the materials, fabrication and assembly costs. The studies are then extended by investigating the acquisition cost of aircraft fuselages, including the recurring unit cost and the non-recurring design cost of the airframe sub-system. The systems costing methodology is facilitated by the genetic causal cost modelling technique as the latter is highly generic, interdisciplinary, flexible, multilevel and recursive in nature, and can be applied at the various analysis levels required of systems engineering. Therefore, the main contribution of the paper is a methodology for applying systems engineering costing, supported by the genetic causal cost modelling approach, whether at the requirements, functional or physical level.

Relevance: 10.00%

Abstract:

Controlling coherent electromagnetic interactions in molecular systems is a problem of both fundamental interest and significant applied potential in the development of photonic and opto-electronic devices. The strength of these interactions determines both the absorption and emission properties of molecules coupled to nanostructures, effectively governing the optical properties of such a composite metamaterial. Here we report on the observation of strong coupling between a plasmon supported by an assembly of oriented gold nanorods (ANR) and a molecular exciton. We show that the coupling is easily engineered and is deterministic, as both the spatial and spectral overlap between the plasmonic structure and the molecular aggregates are controlled. We believe that these results, in conjunction with the flexible geometry of the ANR, are of potential significance for the development of plasmonic molecular devices.
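
For reference, the strong-coupling regime reported here is commonly summarised with the textbook two-coupled-oscillator picture (a standard result, not a formula taken from the paper): the hybrid plasmon-exciton branches anticross, and the coupling is called strong when the Rabi splitting exceeds the mean linewidth.

```latex
% Standard coupled-oscillator description (not the paper's own formula).
% E_pl, E_ex: plasmon and exciton energies; g: coupling strength;
% gamma_pl, gamma_ex: linewidths.
\[
E_{\pm} = \frac{E_{\mathrm{pl}} + E_{\mathrm{ex}}}{2}
          \pm \sqrt{g^{2} + \frac{(E_{\mathrm{pl}} - E_{\mathrm{ex}})^{2}}{4}},
\qquad
\hbar\Omega_{R} = 2g > \frac{\gamma_{\mathrm{pl}} + \gamma_{\mathrm{ex}}}{2}.
\]
```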

Relevance: 10.00%

Abstract:

A flexible, mass-conservative numerical technique for solving the advection-dispersion equation for miscible contaminant transport is presented. The method combines features of puff transport models from air pollution studies with features from the random walk particle method used in water resources studies, providing a deterministic time-marching algorithm that is independent of the grid Péclet number and scales simply from one to higher dimensions. The concentration field is discretised into a number of particles, each of which is treated as a point release which advects and disperses over the time interval. The dispersed puff is itself discretised into a spatial distribution of particles whose masses can be pre-calculated. Concentration within the simulation domain is then calculated from the mass distribution as an average over some small volume. Comparison with analytical solutions for a one-dimensional fixed-duration concentration pulse and for two-dimensional transport in an axisymmetric flow field indicates that the algorithm performs well. For a given level of accuracy the new method has lower computation times than the random walk particle method.
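
A rough one-dimensional sketch of the idea described in this abstract: each mass-carrying particle is advected deterministically, its mass is then spread over a small pre-computed Gaussian "puff" of child particles, and concentration is recovered by averaging the mass over small cells. The step sizes, puff discretisation and merging rule below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# 1-D illustration of the deterministic puff/particle approach (all numbers
# are illustrative). Each particle advects with the flow, then releases its
# mass onto a pre-computed Gaussian puff of child particles; particles are
# merged onto a fine grid so the count stays bounded, and mass is conserved.
u, D, dt, steps = 1.0, 0.01, 0.1, 50          # velocity, dispersion, time step

offsets = np.linspace(-3.0, 3.0, 7)           # puff discretisation (in sigmas)
weights = np.exp(-0.5 * offsets**2)
weights /= weights.sum()                      # pre-calculated puff mass fractions

x = np.array([0.0])                           # initial unit-mass pulse at x = 0
m = np.array([1.0])
sigma = np.sqrt(2.0 * D * dt)                 # puff spread over one time step

for _ in range(steps):
    x = (x + u * dt)[:, None] + sigma * offsets[None, :]   # advect, then disperse
    m = m[:, None] * weights[None, :]
    x, m = x.ravel(), m.ravel()
    # merge nearby particles (mass-weighted position) to bound the particle count
    bins = np.round(x / (0.5 * sigma)).astype(int)
    _, inv = np.unique(bins, return_inverse=True)
    m_new = np.bincount(inv, weights=m)
    x = np.bincount(inv, weights=m * x) / m_new
    m = m_new

# concentration = mass averaged over small cells
edges = np.linspace(x.min() - 0.5, x.max() + 0.5, 81)
conc, _ = np.histogram(x, bins=edges, weights=m)
conc /= np.diff(edges)
print("total mass:", m.sum())                        # stays at 1: mass conservative
print("peak near x = u*t:", edges[np.argmax(conc)])  # expect ~ u*dt*steps = 5
```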

Relevance: 10.00%

Abstract:

Subspace monitoring has recently been proposed as a condition monitoring tool that requires considerably fewer variables to be analysed than dynamic principal component analysis (PCA). This paper analyses the use of subspace monitoring for identifying and isolating fault conditions and reveals that existing work suffers from inherent limitations when complex fault scenarios arise. Based on the assumption that the fault signature is deterministic while the monitored variables are stochastic, the paper introduces a regression-based reconstruction technique to overcome these limitations. The utility of the proposed fault identification and isolation method is shown using a simulation example and the analysis of experimental data from an industrial reactive distillation unit.
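
One common way to reconstruct a deterministic fault signature against a stochastic residual model is sketched below: for each candidate fault direction, the fault magnitude is estimated by least squares and the direction whose reconstruction restores a normal squared prediction error (SPE) is isolated. The data, projector and unit-sensor fault directions here are illustrative; the paper's subspace-monitoring formulation may differ in detail.

```python
import numpy as np

# Reconstruction-based fault isolation against a PCA residual model (generic
# illustration, not the paper's exact method).
rng = np.random.default_rng(0)

# normal operating data: 5 correlated stochastic variables
t = rng.standard_normal((500, 2))
X = t @ rng.standard_normal((2, 5)) + 0.1 * rng.standard_normal((500, 5))
X -= X.mean(axis=0)

# PCA residual projector estimated from the normal data
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:2].T                                  # retained loadings
C_res = np.eye(5) - P @ P.T                   # projector onto the residual subspace

def spe(x):
    r = C_res @ x
    return float(r @ r)

# a fault: deterministic step of size 3 on sensor 3
x_fault = X[0] + 3.0 * np.eye(5)[3]

for i in range(5):                            # candidate fault directions (unit sensor axes)
    xi = np.eye(5)[i]
    f_hat = (xi @ C_res @ x_fault) / (xi @ C_res @ xi)   # least-squares fault magnitude
    print(f"sensor {i}: reconstructed SPE = {spe(x_fault - f_hat * xi):.3f}")
# the smallest reconstructed SPE should point at sensor 3
```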

Relevance: 10.00%

Abstract:

Traditionally, the Internet provides only a “best-effort” service, treating all packets going to the same destination equally. However, providing differentiated services to different users based on their quality requirements is becoming an increasingly pressing issue. To do this, routers need the capability to distinguish and isolate traffic belonging to different flows. This ability to determine the flow each packet belongs to is called packet classification. Technology vendors are reluctant to support algorithmic solutions for classification due to their nondeterministic performance. Although content addressable memories (CAMs) are favoured by technology vendors due to their deterministic, high lookup rates, they suffer from high power consumption and high silicon cost. This paper provides a new algorithmic-architectural solution for packet classification that mixes CAMs with algorithms based on multilevel cutting of the classification space into smaller spaces. The solution exploits the geometrical distribution of rules in the classification space. It provides the deterministic performance of CAMs, support for dynamic updates, and added flexibility for system designers.

Relevance: 10.00%

Abstract:

Modelling patient flow in health care systems is vital to understanding system activity and may therefore prove useful in improving how such systems function. An extensively used measure is the average length of stay, which, although easy to calculate and quantify, is not considered appropriate when the distribution is very long-tailed. In fact, simple deterministic models are generally considered inadequate because models need to reflect the complex, variable, dynamic and multidimensional nature of these systems. This paper focuses on modelling length of stay and the flow of patients. An overview of such modelling techniques is provided, with particular attention to their impact and suitability for managing a hospital service.
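
To illustrate why the average length of stay misleads when the distribution is long-tailed, here is a small numerical example using a lognormal stay distribution (the distribution and its parameters are illustrative, not taken from the paper):

```python
import numpy as np

# Why the average length of stay misleads for long-tailed distributions:
# a lognormal stay distribution (parameters are purely illustrative).
rng = np.random.default_rng(0)
los = rng.lognormal(mean=1.0, sigma=1.2, size=100_000)   # days

print(f"mean   = {los.mean():5.1f} days")
print(f"median = {np.median(los):5.1f} days")
print(f"95th   = {np.percentile(los, 95):5.1f} days")
# a small tail of very long stays pulls the mean far above the typical stay
```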

Relevance: 10.00%

Abstract:

This research investigated unconfined flow through dams. The hydraulic conductivity was modeled as a spatially random field following a lognormal distribution. Results showed that the seepage flow produced by the stochastic solution was smaller than its deterministic value. In addition, the free surface was observed to exit at a point lower than that obtained from the deterministic solution. When the hydraulic conductivity was more strongly correlated in the horizontal direction than in the vertical direction, the flow through the dam increased markedly. It is suggested that it may not be necessary to construct a core in dams made from soils that exhibit a high degree of variability.
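
A simple numerical illustration of why a spatially variable conductivity can pass less flow than a deterministic run at the mean value: for a lognormal field, the harmonic mean (which governs flow across layers in series) lies below the geometric and arithmetic means. The parameters below are illustrative only and do not reproduce the paper's seepage analysis.

```python
import numpy as np

# Effective conductivity of a lognormal field: harmonic <= geometric <= arithmetic.
# A deterministic run at the (arithmetic) mean K therefore tends to overestimate
# the flow relative to a stochastic treatment. Values are illustrative only.
rng = np.random.default_rng(0)
K = rng.lognormal(mean=np.log(1e-5), sigma=1.0, size=100_000)   # m/s

arith = K.mean()
geom = np.exp(np.log(K).mean())
harm = 1.0 / np.mean(1.0 / K)
print(f"arithmetic mean K = {arith:.2e}  (deterministic-style input)")
print(f"geometric  mean K = {geom:.2e}")
print(f"harmonic   mean K = {harm:.2e}  (series flow; lowest)")
# stronger horizontal correlation pushes the effective K for horizontal flow
# toward the arithmetic mean, consistent with the increased flow reported above.
```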

Relevance: 10.00%

Abstract:

The traditional Time Division Multiple Access (TDMA) protocol provides deterministic, periodic, collision-free data transmissions. However, TDMA lacks flexibility and exhibits low efficiency in dynamic environments such as wireless LANs. On the other hand, contention-based MAC protocols such as the IEEE 802.11 DCF are adaptive to network dynamics but are generally inefficient in heavily loaded or large networks. To take advantage of both types of protocols, the D-CVDMA protocol is proposed. It is based on the k-round elimination contention (k-EC) scheme, which provides fast contention resolution for wireless LANs. D-CVDMA uses a contention mechanism to achieve TDMA-like collision-free data transmissions without needing to reserve time slots for forthcoming transmissions. These features make D-CVDMA robust and adaptive to network dynamics such as nodes leaving and joining and changes in packet size and arrival rate, which in turn makes it suitable for the delivery of hybrid traffic including multimedia and data content. Analyses and simulations demonstrate that D-CVDMA outperforms the IEEE 802.11 DCF and k-EC in terms of network throughput, delay, jitter, and fairness.
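
A generic sketch of elimination-style contention resolution is shown below; the exact rules of the k-EC scheme used by D-CVDMA are not given in this abstract, so the slot counts and survival rule here are assumptions for illustration only.

```python
import random

# Generic k-round elimination contention (the paper's k-EC rules may differ):
# in each round every surviving node signals in a randomly chosen mini-slot,
# and only nodes that picked the earliest occupied slot survive. After k
# rounds, ideally one node remains and transmits collision-free, without any
# pre-reserved time slots.
random.seed(0)

def elimination_contention(nodes, k=3, slots=8):
    survivors = list(nodes)
    for _ in range(k):
        picks = {n: random.randrange(slots) for n in survivors}
        winning_slot = min(picks.values())
        survivors = [n for n in survivors if picks[n] == winning_slot]
        if len(survivors) == 1:
            break
    return survivors

contenders = [f"node{i}" for i in range(20)]
print(elimination_contention(contenders))        # usually a single winner
trials = [len(elimination_contention(contenders)) for _ in range(1000)]
print("residual collision probability ~", sum(1 for s in trials if s > 1) / len(trials))
```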

Relevance: 10.00%

Abstract:

This paper adds to the growing literature on market competition involving remanufacturers by analyzing a model in which the remanufacturer and the manufacturer collaborate with each other in the same channel. The paper investigates a single-period deterministic model, which keeps the analysis simple so as to obtain sharper insights. The results characterize the optimal remanufacturing and pricing strategies for the remanufacturer and the manufacturer in the collaborative model.