82 results for Theory of electronic transport

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

We model the behavior of rational, forward-looking agents in a spatial economy. The economic geography structure builds on the racetrack economy of Fujita et al. (1999). Workers choose optimally what to consume in each period, as well as which spatial itinerary to follow in the geographical space. The spatial extent of the resulting agglomerations increases with the taste for variety and the expenditure share on manufactured goods, and decreases with transport costs. Because forward-looking agents anticipate the future formation of agglomerations, they are more responsive to spatial utility differentials than myopic agents. As a consequence, the emerging agglomerations are larger under perfect-foresight spatial adjustments than under myopic ones.
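As a loose illustration of the myopic adjustment process that the paper contrasts with perfect foresight, the sketch below lets workers on a circular "racetrack" of locations relocate toward sites offering higher current utility. The number of locations, the toy utility function, the replicator-style migration rule, and all parameter values are assumptions made for illustration only, not the calibration or functional forms of the paper.

    import numpy as np

    # Stylized racetrack economy: N locations on a circle (illustrative setup only).
    N = 12          # number of locations on the racetrack
    mu = 0.4        # weight on market access, loosely mirroring the expenditure share (assumed)
    tau = 0.1       # transport-cost parameter (assumed)
    speed = 0.05    # speed of myopic adjustment (assumed)

    rng = np.random.default_rng(0)
    share = rng.dirichlet(np.ones(N))      # initial spatial distribution of workers

    def utility(share):
        # Toy indirect utility: market access discounted by circular distance,
        # so utility is higher where workers are already concentrated nearby.
        dist = np.abs(np.arange(N)[:, None] - np.arange(N)[None, :])
        dist = np.minimum(dist, N - dist)            # shortest arc on the circle
        access = share @ np.exp(-tau * dist)
        return mu * np.log(access + 1e-9)

    for _ in range(500):
        u = utility(share)
        # Myopic rule: locations with above-average utility gain workers.
        share += speed * share * (u - share @ u)
        share = np.clip(share, 0, None)
        share /= share.sum()

    print(np.round(share, 3))   # worker shares after adjustment; mass tends to concentrate

Under such myopic dynamics workers respond only to current utility differentials; the paper's forward-looking agents additionally anticipate where agglomerations will form, which is why the resulting agglomerations are larger under perfect foresight.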

Relevance:

100.00%

Publisher:

Abstract:

The Technology Acceptance Model (TAM) posits that Perceived Ease of Use (PEOU) and Perceived Usefulness (PU) influence the ‘intention to use’. The Post-Acceptance Model (PAM) posits that continued use is influenced by prior experience. In order to study the factors that influence how professionals use complex systems, we create a tentative research model that builds on PAM and TAM. Specifically, we include PEOU and the construct ‘Professional Association Guidance’. We postulate that feature usage is enhanced when professional associations influence PU by highlighting additional benefits. We explore the theory in the context of post-adoption use of Electronic Medical Records (EMRs) by primary care physicians in Ontario. The methodology can be extended to other professional environments, and we suggest directions for future research.

Relevance:

100.00%

Publisher:

Abstract:

A new method of clear-air turbulence (CAT) forecasting based on the Lighthill–Ford theory of spontaneous imbalance and emission of inertia–gravity waves has been derived and applied on episodic and seasonal time scales. A scale analysis of this shallow-water theory for midlatitude synoptic-scale flows identifies advection of relative vorticity as the leading-order source term. Examination of leading- and second-order terms elucidates previous, more empirically inspired CAT forecast diagnostics. Application of the Lighthill–Ford theory to the Upper Mississippi and Ohio Valleys CAT outbreak of 9 March 2006 results in good agreement with pilot reports of turbulence. Application of Lighthill–Ford theory to CAT forecasting for the 3 November 2005–26 March 2006 period, using 1-h forecasts of the Rapid Update Cycle 2 (RUC2) 1500 UTC model run, leads to superior forecasts compared to the current operational version of the Graphical Turbulence Guidance (GTG1) algorithm, the most skillful operational CAT forecasting method in existence. The results suggest that major improvements in CAT forecasting could result if the methods presented herein become operational.
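For context, the "advection of relative vorticity" named above as the leading-order source term can be written out explicitly; the notation below (horizontal wind components u, v and relative vorticity ζ) is standard meteorological usage assumed here, not copied from the paper:

\[
  \mathbf{V}\cdot\nabla\zeta
  \;=\; u\,\frac{\partial \zeta}{\partial x} \;+\; v\,\frac{\partial \zeta}{\partial y},
  \qquad
  \zeta \;=\; \frac{\partial v}{\partial x} \;-\; \frac{\partial u}{\partial y}.
\]

According to the scale analysis described above, regions where this quantity is large are where the forcing of inertia–gravity waves, and hence the diagnosed CAT risk, is expected to be greatest.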

Relevance:

100.00%

Publisher:

Abstract:

In this paper, modern techniques such as satellite image analysis and tools provided by geographic information systems (GIS) are exploited in order to extend and improve existing techniques for mapping the spatial distribution of sediment transport processes. The processes of interest comprise mass movements such as solifluction, slope wash, dirty avalanches, and rock and boulder falls. They differ considerably in nature, and therefore different approaches for the derivation of their spatial extent are required. A major challenge is addressing the difference between the comparatively coarse resolution of the available satellite data (Landsat TM/ETM+, 30 m x 30 m) and the actual scale of sediment transport in this environment. A three-step approach has been developed, based on the concept of Geomorphic Process Units (GPUs): parameterization, process area delineation, and combination. Parameters include land cover from satellite data and digital elevation model derivatives. Process areas are identified using a hierarchical classification scheme that applies thresholds and a definition of topology. The approach was developed for the Karkevagge in Sweden and was successfully transferred to the Rabotsbekken catchment at Okstindan, Norway, using similar input data.
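A minimal sketch of the kind of threshold-based, hierarchical process-area delineation described above, assuming two illustrative raster inputs (a land-cover class grid from the satellite classification and a slope grid derived from the DEM). The class codes, threshold values, and process labels are invented for illustration and are not those used in the Karkevagge or Okstindan studies.

    import numpy as np

    # Hypothetical co-registered rasters: land-cover classes and slope in degrees.
    landcover = np.array([[1, 1, 2, 3],
                          [1, 2, 2, 3],
                          [4, 2, 3, 3],
                          [4, 4, 3, 1]])   # 1=vegetation, 2=debris, 3=bare rock, 4=water (assumed codes)
    slope = np.array([[ 5., 12., 28., 40.],
                      [ 8., 20., 33., 45.],
                      [ 2., 25., 38., 50.],
                      [ 1.,  4., 42., 55.]])

    # Hierarchical, threshold-based rules (illustrative): water is excluded first,
    # then steep bare-rock cells are assigned to rock/boulder fall, moderately steep
    # debris to slope wash, and gentle vegetated slopes to solifluction.
    # Earlier rules take precedence, mimicking a decision hierarchy.
    process = np.full(landcover.shape, "none", dtype=object)
    rules = [
        ("excluded",          landcover == 4),
        ("rock/boulder fall", (landcover == 3) & (slope >= 35.0)),
        ("slope wash",        (landcover == 2) & (slope >= 15.0)),
        ("solifluction",      (landcover == 1) & (slope >= 2.0) & (slope < 15.0)),
    ]
    for label, mask in rules:
        process[(process == "none") & mask] = label

    print(process)   # per-cell process assignment, the raw material for GPU delineation

In the full approach the resulting process areas would additionally be constrained by topology (e.g. connectivity to source areas) and combined into Geomorphic Process Units, steps omitted from this sketch.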

Relevance:

100.00%

Publisher:

Abstract:

Calculations are reported of the magnetic anisotropy energy of two-dimensional (2D) Co nanostructures on a Pt(111) substrate. The perpendicular magnetic anisotropy (PMA) of the 2D Co clusters depends strongly on their size and shape, and decreases rapidly with increasing cluster size. The calculated PMA is in reasonable agreement with experimental results. The sensitivity of the results to the Co-Pt spacing at the interface is also investigated; in particular, for a complete Co monolayer, the value of the spacing at the interface determines whether PMA or in-plane anisotropy occurs. We find that the PMA can be greatly enhanced by the addition of Pt adatoms to the top surface of the 2D Co clusters: a single Pt atom can add in excess of 5 meV to the anisotropy energy of a cluster. In the absence of Pt adatoms, the PMA of the Co clusters falls below 1 meV/Co atom for clusters of about 10 atoms, whereas with Pt atoms added to the surface of the clusters, a PMA of 1 meV/Co atom can be maintained for clusters as large as about 40 atoms. The effect of placing Os atoms on top of the Co clusters is also considered. The addition of 5d atoms and clusters on top of ferromagnetic nanoparticles may provide an approach to tune the magnetic anisotropy and moment separately.

Relevance:

100.00%

Publisher:

Abstract:

Aims: To investigate the effects of electronic prescribing (EP) on prescribing quality, as indicated by prescribing errors and pharmacists' clinical interventions, in a UK hospital. Methods: Prescribing errors and pharmacists' interventions were recorded by the ward pharmacist during a 4-week period both pre- and post-EP, with a second check by the principal investigator. The percentage of new medication orders with a prescribing error and/or pharmacist's intervention was calculated for each study period. Results: Following the introduction of EP, there was a significant reduction in both pharmacists' interventions and prescribing errors. Interventions fell from 73 (3.0% of all medication orders) to 45 (1.9%) (95% confidence interval (CI) for the absolute reduction 0.2% to 2.0%), and errors from 94 (3.8%) to 48 (2.0%) (95% CI 0.9% to 2.7%). Ten EP-specific prescribing errors were identified. Only 52% of pharmacists' interventions related to a prescribing error pre-EP, and 60% post-EP; only 40% and 56% of prescribing errors resulted in an intervention pre- and post-EP, respectively. Conclusions: EP improved the quality of prescribing by reducing both prescribing errors and pharmacists' clinical interventions. Prescribers and pharmacists need to be aware of new types of error with EP, so that they can best target their activities to reduce clinical risk. Pharmacists may need to change the way they work to complement, rather than duplicate, the benefits of EP.
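As a rough illustration of how a confidence interval like the one quoted for the reduction in error rate can be reproduced, the sketch below computes a Wald 95% confidence interval for the difference of two proportions. The denominators are back-calculated from the reported counts and percentages (roughly 2,470 and 2,400 medication orders) and are therefore approximations, not figures taken from the paper.

    import math

    def wald_ci_diff(x1, n1, x2, n2, z=1.96):
        """95% Wald confidence interval for the difference p1 - p2 of two proportions."""
        p1, p2 = x1 / n1, x2 / n2
        se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
        diff = p1 - p2
        return diff, diff - z * se, diff + z * se

    # Prescribing errors: 94 pre-EP and 48 post-EP; the denominators below are
    # approximated from the reported rates (3.8% and 2.0%) and are assumptions.
    n_pre, n_post = 2474, 2400
    diff, lo, hi = wald_ci_diff(94, n_pre, 48, n_post)
    print(f"absolute reduction {diff:.1%} (95% CI {lo:.1%} to {hi:.1%})")

With these assumed denominators the sketch yields an absolute reduction of about 1.8% with a 95% CI of roughly 0.9% to 2.7%, consistent with the interval reported in the abstract; the paper's own analysis may use a different interval method.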