868 results for temperature-based models
Abstract:
We present unified, systematic derivations of schemes in the two known measurement-based models of quantum computation. The first model (introduced by Raussendorf and Briegel, [Phys. Rev. Lett. 86, 5188 (2001)]) uses a fixed entangled state, adaptive measurements on single qubits, and feedforward of the measurement results. The second model (proposed by Nielsen, [Phys. Lett. A 308, 96 (2003)] and further simplified by Leung, [Int. J. Quant. Inf. 2, 33 (2004)]) uses adaptive two-qubit measurements that can be applied to arbitrary pairs of qubits, and feedforward of the measurement results. The underlying principle of our derivations is a variant of teleportation introduced by Zhou, Leung, and Chuang, [Phys. Rev. A 62, 052316 (2000)]. Our derivations unify these two measurement-based models of quantum computation and provide significantly simpler schemes.
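As a reminder of the teleportation primitive these derivations rest on, the sketch below shows the standard one-bit teleportation identity in our own notation; it is a textbook building block, not an excerpt from the paper.

```latex
% One-bit teleportation (standard identity; our notation, not quoted from the paper).
% Input |psi> = a|0> + b|1> on qubit 1, ancilla |+> on qubit 2.
\[
  \mathrm{CZ}_{12}\,\bigl(|\psi\rangle_1 \otimes |{+}\rangle_2\bigr)
    = a\,|0\rangle_1|{+}\rangle_2 + b\,|1\rangle_1|{-}\rangle_2 .
\]
% Measuring qubit 1 in the X basis with outcome m \in {0,1} leaves qubit 2 in
\[
  X^{m} H\,|\psi\rangle ,
\]
% so single-qubit gates can be driven by adaptive measurements plus feedforward of m.
```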
Abstract:
Study Objective: Community-based models for injury prevention have become an accepted part of the overall injury control strategy. This systematic review of the scientific literature examines the evidence for their effectiveness in reducing injury due to inadequate car seat restraint use in children 0-16 years of age. Methods: A comprehensive search of the literature was performed using the following study selection criteria: community-based intervention study; target population of children aged 0-16 years; outcome measure of either injury rates due to motor vehicle crashes or observed changes in child restraint use; and use of a community control or historical control in the study design. Quality assessment and data abstraction were guided by a standardized procedure and performed independently by two authors. Data synthesis was in tabular and text form, with meta-analysis not being possible due to the discrepancy in methods and measures between the studies. Results: This review found eight studies that met all the inclusion criteria. In the studies that measured injury outcomes, significant reductions in the risk of motor vehicle occupant injury (33-55%) were reported in the study communities. For those studies reporting observed car seat restraint use, the community-based programs were successful in increasing toddler restraint use in children aged 1-5 years by up to 11%; child booster seat use in children aged 4-8 years by up to 13%; rear restraint use in children aged 0-15 years by 8%; restraint use in pre-school aged children in a high-risk community by 50%; and restraint use in children aged 5-11 years by 44%. Conclusion: While this review highlights that there is some evidence to support the effectiveness of community-based programs in promoting car restraint use and/or reducing motor vehicle occupant injury, limitations in the evaluation methodologies of the studies require the results to be interpreted with caution. There is clearly a need for further high quality program evaluation research to develop an evidence base. (C) 2004 Elsevier Ltd. All rights reserved.
Abstract:
An appreciation of the physical mechanisms that cause the observed complexity of seismicity is fundamental to understanding the temporal behaviour of faults and of single slip events. Numerical simulation of fault slip can provide insight into fault processes by allowing exploration of the parameter spaces that influence the microscopic and macroscopic physics of these processes, and may thereby help answer these questions. Particle-based models such as the Lattice Solid Model have been used previously for the simulation of the stick-slip dynamics of faults, although mainly in two dimensions. Recent increases in computer power and the ability to exploit parallel computer systems have made it possible to extend particle-based fault simulations to three dimensions. In this paper a particle-based numerical model of a rough planar fault embedded between two elastic blocks in three dimensions is presented. A very simple friction law, with no rate dependency and no spatial heterogeneity in the intrinsic coefficient of friction, is used in the model. To simulate earthquake dynamics the model is sheared in a direction parallel to the fault plane, with a constant velocity applied at the driving edges. Spontaneous slip occurs on the fault when the shear stress is large enough to overcome the frictional forces on the fault. Slip events with a wide range of sizes are observed. Investigation of the temporal evolution and spatial distribution of slip during each event shows a high degree of variability between events. In some of the larger events highly complex slip patterns are observed.
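For readers unfamiliar with how constant driving plus a slip threshold yields a broad distribution of event sizes, the toy sketch below is a much simpler 1-D spring-block analogue of such stick-slip models; it is not the 3-D Lattice Solid Model, and every parameter is invented.

```python
# Minimal 1-D spring-block stick-slip sketch (Burridge-Knopoff style), offered only as an
# illustrative analogue of particle-based fault models; NOT the 3-D Lattice Solid Model.
import numpy as np

rng = np.random.default_rng(0)
n_blocks, k_plate, k_coupling, f_static = 200, 1.0, 3.0, 1.0
x = rng.uniform(-0.01, 0.01, n_blocks)    # block displacements along the "fault"
plate = 0.0                               # position of the constant-velocity driver
event_sizes = []

def forces(x, plate):
    # elastic force on each block from the driving plate and its nearest neighbours
    f = k_plate * (plate - x)
    f[1:]  += k_coupling * (x[:-1] - x[1:])
    f[:-1] += k_coupling * (x[1:] - x[:-1])
    return f

for step in range(20000):
    plate += 1e-3                         # constant driving velocity
    f = forces(x, plate)
    slipped = 0
    while np.any(np.abs(f) > f_static):   # cascade of slips = one "event"
        i = np.argmax(np.abs(f))
        x[i] += f[i] / (k_plate + 2 * k_coupling)   # relax the most loaded block
        f = forces(x, plate)
        slipped += 1
    if slipped:
        event_sizes.append(slipped)

print("events:", len(event_sizes), "largest event size:", max(event_sizes))
```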
Abstract:
On a global scale, basalts from mid-ocean ridges are strikingly more homogeneous than basalts from intraplate volcanism. The observed geochemical heterogeneity argues strongly for the existence of distinct reservoirs in the Earth's mantle. It is an unresolved problem in geodynamics how these findings can be reconciled with large-scale convection. We review observational constraints and investigate the stirring properties of numerical models of mantle convection. Conditions in the early Earth may have supported layered convection with rapid stirring in the upper layers. Material that has been altered near the surface is transported downwards by small-scale convection, so that a layer of homogeneous depleted material develops above pristine mantle. As the mantle cools over Earth history, the effects leading to layering become weaker and the models show the large-scale convection favoured for the Earth today. Laterally averaged, the upper mantle below the lithosphere is least affected by material that has experienced near-surface differentiation. The geochemical signature acquired during the earlier episode of small-scale convection may therefore be preserved there for the longest time. Additionally, stirring is less effective in the high-viscosity layer of the central lower mantle [1, 2], supporting the survival of medium-scale heterogeneities there. These models are the first, using 3-D spherical geometry and mostly Earth-like parameters, to address the suggested change of convective style. Although the models are still far from reproducing our planet, we find that this proposal might be helpful in reconciling geochemical and geophysical constraints.
Abstract:
In this paper, we propose and demonstrate a novel scheme for simultaneous measurement of liquid level and temperature based on a simple uniform fiber Bragg grating (FBG) by monitoring both the short-wavelength-loss peaks and its Bragg resonance. The liquid level can be measured from the amplitude changes of the short-wavelength-loss peaks, while temperature can be measured from the wavelength shift of the Bragg resonance. Both theoretical simulation results and experimental results are presented. Such a scheme has several advantages, including robustness, simplicity, flexibility in choosing the sensitivity, and simultaneous temperature measurement capability.
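For context, the temperature channel of such a sensor follows the standard FBG relations sketched below in generic form; the symbols are ours and no calibration values from the paper are implied.

```latex
% Generic FBG relations (not the paper's calibration): Bragg condition and its
% first-order temperature response.
\[
  \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda , \qquad
  \frac{\Delta \lambda_B}{\lambda_B} \approx (\alpha + \xi)\,\Delta T ,
\]
% where \alpha is the thermal-expansion coefficient and \xi the thermo-optic coefficient
% of the fibre; the liquid level is read independently from the amplitude of the
% short-wavelength-loss peaks.
```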
Abstract:
This paper compares the UK/US exchange rate forecasting performance of linear and nonlinear models based on monetary fundamentals with that of a random walk (RW) model. Structural breaks are identified and taken into account. The exchange rate forecasting framework is also used for assessing the relative merits of the official Simple Sum and the weighted Divisia measures of money. Overall, there are four main findings. First, the majority of the models with fundamentals are able to beat the RW model in forecasting the UK/US exchange rate. Second, the most accurate forecasts of the UK/US exchange rate are obtained with a nonlinear model. Third, taking into account structural breaks reveals that the Divisia aggregate performs better than its Simple Sum counterpart. Finally, Divisia-based models provide more accurate forecasts than Simple Sum-based models provided they are constructed within a nonlinear framework.
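The sketch below illustrates the kind of out-of-sample comparison described above: a random-walk benchmark versus a linear model on a monetary fundamental. The synthetic data, the single fundamental, and the lag structure are illustrative assumptions, not the paper's specification.

```python
# Out-of-sample RMSE comparison: random walk vs. a simple fundamentals regression.
# Synthetic data and model form are invented for illustration only.
import numpy as np

rng = np.random.default_rng(1)
T = 300
fundamental = np.cumsum(rng.normal(0, 0.01, T))               # e.g. relative money/output
rate = 0.8 * fundamental + np.cumsum(rng.normal(0, 0.02, T))  # synthetic log exchange rate

split = 200
errors_rw, errors_fund = [], []
for t in range(split, T - 1):
    # random-walk forecast: tomorrow's rate equals today's
    errors_rw.append(rate[t + 1] - rate[t])
    # fundamentals forecast: regress past changes on the lagged deviation from fundamentals
    dev = rate[:t] - fundamental[:t]
    X = np.column_stack([np.ones(t - 1), dev[:-1]])
    beta, *_ = np.linalg.lstsq(X, np.diff(rate[:t]), rcond=None)
    forecast = rate[t] + beta[0] + beta[1] * (rate[t] - fundamental[t])
    errors_fund.append(rate[t + 1] - forecast)

rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
print("RMSE  random walk:", round(rmse(errors_rw), 4),
      " fundamentals:", round(rmse(errors_fund), 4))
```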
Abstract:
We study the comparative importance of thermal to nonthermal fluctuations for membrane-based models in the linear regime. Our results, both in 1+1 and 2+1 dimensions, suggest that nonthermal fluctuations dominate thermal ones only when the relaxation time τ is large. For moderate to small values of τ, the dynamics is defined by a competition between these two forces. The results are expected to act as a quantitative benchmark for biological modeling in systems involving cytoskeletal and other nonthermal fluctuations. © 2011 American Physical Society.
Abstract:
An approach to realizing simultaneous measurement of refractive index (RI) and temperature based on a microfiber-based dual inline Mach-Zehnder interferometer (MZI) is proposed and demonstrated. The two interferometers rely on different interference mechanisms: one between the core mode and a lower-order cladding mode in the sensing single-mode fiber, and the other between the fundamental mode and a higher-order mode in the multimode microfiber. The former interferometer achieves an RI sensitivity of -23.67 nm/RIU and a temperature sensitivity of 81.2 pm/°C, whereas those of the latter are 3820.23 nm/RIU and -465.7 pm/°C, respectively. The large sensitivity differences provide a more accurate demodulation of RI and temperature. The sensor features multiparameter measurement, a compact structure, high sensitivity, low cost, and easy fabrication.
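A common way to demodulate such a dual-parameter sensor is sensitivity-matrix inversion; the sketch below applies that standard technique to the sensitivities quoted in the abstract. The measured wavelength shifts are hypothetical, so treat this as an illustration of the technique rather than the paper's procedure.

```python
# Sensitivity-matrix demodulation for a dual-parameter (RI, temperature) sensor.
# Sensitivities are the abstract's quoted values; the measured shifts are invented.
import numpy as np

# rows: [SMF-MZI dip, microfibre-MZI dip]; columns: [d_lambda/d_RI (nm/RIU), d_lambda/dT (nm/degC)]
K = np.array([[ -23.67,  0.0812],
              [3820.23, -0.4657]])

d_lambda = np.array([0.05, 3.90])          # hypothetical measured wavelength shifts (nm)
d_ri, d_temp = np.linalg.solve(K, d_lambda)
print(f"delta RI = {d_ri:.5f} RIU, delta T = {d_temp:.2f} degC")
```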
Abstract:
For metal and metal halide vapor lasers excited by a high-frequency pulsed discharge, the thermal effect caused mainly by the radial temperature distribution is of considerable importance for stable laser operation and for improving the laser output characteristics. A short survey of the analytical and numerical-analytical mathematical models obtained for the temperature profile in a high-powered He-SrBr2 laser is presented. The models are described by the steady-state heat conduction equation with mixed-type nonlinear boundary conditions for an arbitrary form of the volume power density. A complete model of the radial heat flow between the two tubes is established for precise calculation of the inner wall temperature. The models are applied to simulate temperature profiles for a newly designed laser. The author's software prototype LasSim is used to implement the mathematical models and carry out the simulations.
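For orientation, the generic form of such a radial problem is sketched below in our own notation; the convective-radiative wall condition is only one example of a mixed nonlinear boundary condition and is not claimed to be the paper's exact formulation.

```latex
% Generic steady-state radial heat-conduction problem (our notation, illustrative only):
% cylindrical symmetry, arbitrary volume power density q(r), mixed nonlinear wall condition.
\[
  \frac{1}{r}\frac{d}{dr}\!\left( r\,k(T)\,\frac{dT}{dr} \right) + q(r) = 0 ,
  \qquad
  \frac{dT}{dr}\Big|_{r=0} = 0 ,
  \qquad
  -\,k(T)\,\frac{dT}{dr}\Big|_{r=R}
    = h\,\bigl(T(R) - T_{\mathrm{w}}\bigr)
    + \varepsilon\sigma\,\bigl(T(R)^{4} - T_{\mathrm{w}}^{4}\bigr).
\]
```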
Abstract:
Analysis of risk measures associated with price series movements and their prediction is of strategic importance in the financial markets as well as to policy makers, in particular for short- and long-term planning and for setting economic growth targets. For example, oil price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument to measure risk and is evaluated by analysing the negative/positive tail of the probability distribution of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, being free of large outliers, and satisfying the Gauss-Markov assumptions. However, in practical situations the LSE-based linear regression models often fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow fat-tailed distributions and may not possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus, applications of the LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1-, L2- and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
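The sketch below contrasts L2 (least squares) and L1 (least absolute deviations) linear regression on fat-tailed synthetic data, in the spirit of the Lp-norm comparison above. The data and the simple IRLS solver are illustrative assumptions, not the authors' method or dataset.

```python
# L2 vs. L1 linear regression on fat-tailed synthetic returns (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
y = 1.5 * x + 0.2 + rng.standard_t(df=2, size=n)   # Student-t errors: fat tails
X = np.column_stack([np.ones(n), x])

# L2: ordinary least squares
beta_l2, *_ = np.linalg.lstsq(X, y, rcond=None)

# L1: iteratively reweighted least squares approximation to least absolute deviations
beta_l1 = beta_l2.copy()
for _ in range(50):
    w = 1.0 / np.maximum(np.abs(y - X @ beta_l1), 1e-6)   # weights ~ 1/|residual|
    sw = np.sqrt(w)
    beta_l1, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)

print("L2 (intercept, slope):", np.round(beta_l2, 3))
print("L1 (intercept, slope):", np.round(beta_l1, 3))
```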
Abstract:
In recent years, there has been increasing interest in learning distributed representations of word senses. Traditional context-clustering based models usually require careful tuning of model parameters and typically perform worse on infrequent word senses. This paper presents a novel approach that addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are then used by a context-clustering based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on half of the metrics in the word similarity task and on 6 out of 13 subtasks in the analogical reasoning task, and give the best overall accuracy in the word sense effect classification task, which shows the effectiveness of our proposed distributed representation learning model.
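The sketch below shows one plausible shape for the gloss-encoding step: a small convolutional network that maps a WordNet gloss (a sequence of token ids) to a fixed-size vector used to initialise a sense embedding. All layer sizes and hyper-parameters are invented; this is not the paper's architecture.

```python
# Illustrative gloss encoder: CNN over token embeddings with max-over-time pooling.
# Hyper-parameters are assumptions, not the paper's settings.
import torch
import torch.nn as nn

class GlossEncoder(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, n_filters=128, kernel=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=kernel, padding=1)
        self.proj = nn.Linear(n_filters, emb_dim)   # project back to embedding space

    def forward(self, token_ids):                   # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)   # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))                # (batch, n_filters, seq_len)
        x = x.max(dim=2).values                     # max-over-time pooling
        return self.proj(x)                         # (batch, emb_dim) sense initialisation

gloss = torch.randint(0, 10000, (1, 12))            # one toy gloss of 12 token ids
print(GlossEncoder()(gloss).shape)                  # torch.Size([1, 100])
```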
Abstract:
There has been increasing interest in the use of agent-based simulation and some discussion of the relative merits of this approach compared to discrete-event simulation. There are differing views on whether agent-based simulation offers capabilities that discrete-event simulation cannot provide, or whether all agent-based applications can, at least in theory, be undertaken using a discrete-event approach. This paper presents a simple agent-based NetLogo model and corresponding discrete-event versions implemented in the widely used ARENA software. The two versions of the discrete-event model use, respectively, the traditional process-flow approach normally adopted in discrete-event simulation software and an agent-based approach to the model build. In addition, a real-time spatial visual display facility is provided using a spreadsheet platform controlled by VBA code embedded within the ARENA model. Initial findings from this investigation are that discrete-event simulation can indeed be used to implement agent-based models and, with suitable integration elements such as VBA, can provide the spatial displays associated with agent-based software.
Abstract:
The spatial and temporal distributions of modern diatom assemblages in surface sediments, on the most dominant macrophytes, and in the water column at 96 locations in Florida Bay, Biscayne Bay and adjacent regions were examined in order to develop paleoenvironmental prediction models for this region. Analyses of these distributions revealed distinct temporal and spatial differences in assemblages among the locations. The differences among diatom assemblages living on subaquatic vegetation, on sediments, and in the water column were significant. Because concentrations of salts, total phosphorus (WTP), total nitrogen (WTN) and total organic carbon (WTOC) are partly controlled by water management in this region, diatom-based models were produced to assess these variables. Discriminant function analyses showed that diatoms can also be successfully used to reconstruct changes in the abundance of diatom assemblages typical of different habitats and life habits. To interpret paleoenvironmental changes, changes in salinity, WTN, WTP and WTOC were inferred from diatoms preserved in sediment cores collected along environmental gradients in Florida Bay (4 cores) and from nearshore and offshore locations in Biscayne Bay (3 cores). The reconstructions showed that water quality conditions in these estuaries have been fluctuating for thousands of years due to natural processes and sea-level changes, but almost synchronized shifts in diatom assemblages occurred in the mid-1960s at all coring locations (except Ninemile Bank and Bob Allen Bank in Florida Bay). These alterations correspond to the major construction of numerous water management structures on the mainland. Additionally, all the coring sites (except Card Sound Bank, Biscayne Bay and Trout Cove, Florida Bay) showed decreasing salinity and fluctuations in nutrient levels in the last two decades, corresponding to increased rainfall in the 1990s and increased freshwater discharge to the bays, a result of increased freshwater deliveries to the Everglades by the South Florida Water Management District in the 1980s and 1990s. Reconstructions of the abundance of diatom assemblages typical of different habitats and life habits revealed multiple sources of diatoms to the coring locations and showed that epiphytic assemblages in both bays have increased in abundance since the early 1990s.
Abstract:
This dissertation aimed to improve travel time estimation for transportation planning by developing a travel time estimation method that incorporates the effects of signal timing plans, which are difficult to consider in planning models. For this purpose, an analytical model has been developed. The model parameters were calibrated based on data from CORSIM microscopic simulation, with signal timing plans optimized using the TRANSYT-7F software. Independent variables in the model are link length, free-flow speed, and traffic volumes from the competing turning movements. The developed model has three advantages compared to traditional link-based or node-based models. First, the model considers the influence of signal timing plans for a variety of traffic volume combinations without requiring signal timing information as input. Second, the model describes the non-uniform spatial distribution of delay along a link, making it possible to estimate the impacts of queues at different upstream locations of an intersection and to attribute delays to a subject link and its upstream link. Third, the model shows promise of improving the accuracy of travel time prediction. The mean absolute percentage error (MAPE) of the model is 13% for a set of field data from the Minnesota Department of Transportation (MDOT); this is close to the MAPE of the uniform delay in the HCM 2000 method (11%). The HCM is the industry-accepted analytical model in the existing literature, but it requires signal timing information as input for calculating delays. The developed model also outperforms the HCM 2000 method for a set of Miami-Dade County data representing congested traffic conditions, with a MAPE of 29% compared to 31% for the HCM 2000 method. The advantages of the proposed model make it feasible for application to a large network without the burden of signal timing input, while improving the accuracy of travel time estimation. An assignment model with the developed travel time estimation method has been implemented in a South Florida planning model, which improved the assignment results.
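The accuracy figures quoted above are mean absolute percentage errors; the short sketch below shows the standard MAPE computation with invented observed and estimated travel times, purely to make the metric concrete.

```python
# Mean absolute percentage error (MAPE) between observed and estimated travel times.
# The travel-time values are hypothetical, for illustration only.
import numpy as np

def mape(observed, estimated):
    observed = np.asarray(observed, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return 100.0 * np.mean(np.abs((estimated - observed) / observed))

observed_tt = [42.0, 55.0, 63.0, 38.0]    # hypothetical field travel times (s)
model_tt    = [45.5, 49.8, 70.1, 36.2]    # hypothetical model estimates (s)
print(f"MAPE = {mape(observed_tt, model_tt):.1f}%")
```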