9 results for video on demand

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

This work concerns the development of a proton induced X-ray emission (PIXE) analysis system and a multi-sample scattering chamber facility. The characteristics of the beam pulsing system and its counting rate capabilities were evaluated by observing the ion-induced X-ray emission from pure thick copper targets, with and without beam pulsing operation. The characteristic X-rays were detected with a high resolution Si(Li) detector coupled to a multi-channel analyser. The removal of the pile-up continuum by the use of on-demand beam pulsing is clearly demonstrated in this work. This new on-demand pulsing system, with counting rate capabilities of 25, 18 and 10 kpps at main amplifier time constants of 2, 4 and 8 µs respectively, enables thick targets to be analysed more readily. The reproducibility of the on-demand beam pulsing system was checked by repeated measurements of the system throughput curves, with and without beam pulsing. The reproducibility of the analysis performed using this system was also checked by repeated measurements of the intensity ratios from a number of standard binary alloys during the experimental work. A computer programme has been developed to calculate the X-ray yields from thick targets bombarded by protons, taking into account the secondary X-ray yield produced when the characteristic X-rays of one element lie above the absorption edge energy of the other element present in the target. This effect was studied on metallic binary alloys such as Fe/Ni and Cr/Fe. The quantitative analysis of Fe/Ni and Cr/Fe alloy samples to determine their elemental composition, taking this enhancement into account, has been demonstrated in this work. Furthermore, the usefulness of the Rutherford backscattering (RBS) technique for obtaining depth profiles of the elements in the upper micron of the sample is discussed.
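The enhancement correction described above can be sketched roughly as follows. This is an illustrative reconstruction, not the thesis's computer programme: the function, the simple multiplicative enhancement factor and all numbers are invented assumptions.

```python
# Illustrative sketch only: a simple multiplicative correction for
# secondary-fluorescence enhancement in a binary alloy. The function,
# parameters and numbers are invented, not the thesis's programme.

def corrected_concentration_ratio(i_a, i_b, k_ab, enhancement_b):
    """Estimate the concentration ratio C_A/C_B of a binary alloy.

    i_a, i_b      -- measured characteristic X-ray intensities
    k_ab          -- relative thick-target yield factor from standards
    enhancement_b -- fractional extra yield of B caused by fluorescence
                     from A's characteristic X-rays (0.0 = none)
    """
    # Strip the secondary-fluorescence contribution from B's intensity
    i_b_primary = i_b / (1.0 + enhancement_b)
    # Convert the primary-intensity ratio to a concentration ratio
    return (i_a / i_b_primary) / k_ab

# Hypothetical Fe/Ni-style case with an assumed 8% enhancement
ratio = corrected_concentration_ratio(i_a=1200.0, i_b=980.0,
                                      k_ab=1.1, enhancement_b=0.08)
```

Ignoring the enhancement term would understate the concentration ratio here, which is the kind of bias the thesis quantifies for Fe/Ni and Cr/Fe alloys.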

Relevance:

100.00%

Publisher:

Abstract:

The project arose during a period in which the world was still coming to terms with the effects and implications of the so-called 'energy crisis' of 1973/74. Serck Heat Transfer is a manufacturer of heat exchangers, which transfer heat between fluids of various sorts. As such, the company felt that past and possible future changes in the energy situation could have an impact upon the demand for its products. The thesis represents the first attempt to examine the impact of changes in the energy situation (a major economic variable) on the long term demand for heat exchangers. The scope of the work was limited to the United Kingdom, this being the largest single market for Serck's products. The thesis analyses industrial heat exchanger markets and identifies those trends which are related to both the changing energy situation and the usage of heat exchangers. These trends have been interpreted in terms of projected values of heat exchanger demand. The projections cover the period 1975 to the year 2000. Also examined in the thesis is the future energy situation, both internationally and nationally, and it is found that in the long term there will be increasing pressure on consumers to conserve energy through rising real prices. The possibility of a connection between energy consumption and heat exchanger demand is investigated and no significant correlation found. This appears to be because there are a number of determinants of demand besides energy related factors, and also because there is a wide diversity of individual markets for heat exchangers. Conclusions are that in all markets bar one, the changing energy situation should lead to a higher level of heat exchanger demand than would otherwise have been the case had the energy situation not changed. It is also pointed out that it is misleading to look at changes in one influence on the demand for a product and ignore others.

Relevance:

80.00%

Publisher:

Abstract:

The present work describes the development of a proton induced X-ray emission (PIXE) analysis system, especially designed and built for routine quantitative multi-elemental analysis of a large number of samples. The historical and general developments of the analytical technique and the physical processes involved are discussed. The philosophy, design, constructional details and evaluation of a versatile vacuum chamber, an automatic multi-sample changer, an on-demand beam pulsing system and an ion beam current monitoring facility are described. The system calibration using thin standard foils of Si, P, S, Cl, K, Ca, Ti, V, Fe, Cu, Ga, Ge, Rb, Y and Mo was undertaken at proton beam energies of 1 to 3 MeV in steps of 0.5 MeV and compared with theoretical calculations. An independent calibration check using bovine liver Standard Reference Material was performed. The minimum detectable limits have been experimentally determined at detector positions of 90° and 135° with respect to the incident beam for the above range of proton energies as a function of atomic number Z. The system has detection limits of typically 10⁻⁷ to 10⁻⁹ g for elements in the range 14 ≤ Z ≤ 42, and determinations of the areal density of thin foils using Rutherford backscattering data are described. Amniotic fluid samples supplied by South Sefton Health Authority were successfully analysed for their low base line elemental concentrations. In conclusion, the findings of this work are discussed with suggestions for further work.

Relevance:

80.00%

Publisher:

Abstract:

Successful innovation of prescription drugs requires a substantial amount of marketing support. There is, however, much concern about the effects of marketing expenditures on the demand for pharmaceutical products (Manchanda et al., Market Lett 16(3/4):293–308, 2005). For example, excessive marketing could stimulate demand for products in the absence of a fundamental need. It has also been suggested that increased marketing expenditures may reduce the price elasticity of demand and allow firms to charge higher prices (Windmeijer et al., Health Econ 15(1):5–18, 2005). In this paper, we present the outcomes of an empirical study in which we determine the effects of pharmaceutical marketing expenditures using a number of frequently used "standardized" models. We determine which models perform best in terms of predictive validity and adequate descriptions of reality. We demonstrate, among other things, that the effects of promotional efforts are brand specific and that most standardized models do not provide adequate descriptions of reality. We find that marketing expenditures have no or only moderate effects on demand for pharmaceutical products in The Netherlands.
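A minimal sketch of the kind of "standardized" response model the abstract refers to, under illustrative assumptions: a multiplicative (log-log) form in which the slope is the marketing elasticity of demand. The data and the single-regressor OLS fit below are invented for illustration and are not the paper's models or estimates.

```python
import math

# Made-up monthly data: marketing spend and unit sales for one brand
marketing = [10.0, 12.0, 15.0, 18.0, 22.0, 27.0]
sales = [100.0, 104.0, 109.0, 112.0, 117.0, 121.0]

# Multiplicative model: log(sales) = a + b * log(marketing),
# so b is a constant marketing elasticity of demand
x = [math.log(m) for m in marketing]
y = [math.log(s) for s in sales]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
# Ordinary least squares slope and intercept for a single regressor
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
    (xi - mx) ** 2 for xi in x)
a = my - b * mx
```

Under this toy fit, a 1% increase in spend is associated with roughly a b% increase in sales; the paper's point is that such standardized forms often describe reality poorly and that estimated effects differ by brand.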

Relevance:

80.00%

Publisher:

Abstract:

The chapter examines possibilities for industrial policy in African countries through the lens of lessons that can be learned from the industrial policy approaches pursued in Ireland as well as in East Asia. We suggest that, as latecomers to industrialization, the small African economies are well positioned to undertake such an exercise. This chapter provides some novel insights by comparing Ireland with the small African economies; to our knowledge such a comparison offers a unique contribution. Cognizant of the fact that a "one size fits all" approach to industrial policy is not appropriate in the African context, we argue in favor of the adoption of a more "holistic" approach to industrial policy in these economies. Such an approach, we argue, should focus simultaneously on demand and supply factors of industrial development, and on microeconomic as well as macroeconomic factors.

Relevance:

80.00%

Publisher:

Abstract:

Although the importance of dataset fitness-for-use evaluation and intercomparison is widely recognised within the GIS community, no practical tools have yet been developed to support such interrogation. GeoViQua aims to develop a GEO label which will visually summarise and allow interrogation of key informational aspects of geospatial datasets upon which users rely when selecting datasets for use. The proposed GEO label will be integrated in the Global Earth Observation System of Systems (GEOSS) and will be used as a value and trust indicator for datasets accessible through the GEO Portal. As envisioned, the GEO label will act as a decision support mechanism for dataset selection and thereby hopefully improve user recognition of the quality of datasets. To date we have conducted three user studies to (1) identify the informational aspects of geospatial datasets upon which users rely when assessing dataset quality and trustworthiness, (2) elicit initial user views on a GEO label and its potential role, and (3) evaluate prototype label visualisations. Our first study revealed that, when evaluating quality of data, users consider eight facets: dataset producer information; producer comments on dataset quality; dataset compliance with international standards; community advice; dataset ratings; links to dataset citations; expert value judgements; and quantitative quality information. Our second study confirmed the relevance of these facets in terms of the community-perceived function that a GEO label should fulfil: users and producers of geospatial data supported the concept of a GEO label that provides a drill-down interrogation facility covering all eight informational aspects. Consequently, we developed three prototype label visualisations and evaluated their comparative effectiveness and user preference via a third user study to arrive at a final graphical GEO label representation.
When integrated in the GEOSS, an individual GEO label will be provided for each dataset in the GEOSS clearinghouse (or other data portals and clearinghouses) based on its available quality information. Producer and feedback metadata documents are being used to dynamically assess information availability and generate the GEO labels. The producer metadata document can either be a standard ISO compliant metadata record supplied with the dataset, or an extended version of a GeoViQua-derived metadata record, and is used to assess the availability of a producer profile, producer comments, compliance with standards, citations and quantitative quality information. GeoViQua is also currently developing a feedback server to collect and encode (as metadata records) user and producer feedback on datasets; these metadata records will be used to assess the availability of user comments, ratings, expert reviews and user-supplied citations for a dataset. The GEO label will provide drill-down functionality which will allow a user to navigate to a GEO label page offering detailed quality information for its associated dataset. At this stage, we are developing the GEO label service that will be used to provide GEO labels on demand based on supplied metadata records. In this presentation, we will provide a comprehensive overview of the GEO label development process, with specific emphasis on the GEO label implementation and integration into the GEOSS.
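As a rough illustration of the availability assessment described above, the drill-down decision can be sketched as below. The facet keys and metadata field layout are assumptions made for illustration; they do not follow the actual GeoViQua or ISO 19115 metadata encodings.

```python
# Rough sketch: decide which of the eight informational facets a GEO
# label could display for a dataset, given producer metadata and
# feedback metadata. Field names are illustrative assumptions.

FACETS = [
    "producer_profile", "producer_comments", "standards_compliance",
    "community_advice", "ratings", "citations",
    "expert_reviews", "quantitative_quality",
]

def assess_facets(producer_md, feedback_md):
    """Map each facet to True/False depending on whether either the
    producer metadata or the feedback metadata provides content."""
    merged = {**producer_md, **feedback_md}
    return {facet: bool(merged.get(facet)) for facet in FACETS}

example = assess_facets(
    producer_md={"producer_profile": "Example producer",
                 "standards_compliance": "ISO 19115",
                 "quantitative_quality": {"rmse": 0.3}},
    feedback_md={"ratings": [4, 5], "expert_reviews": []},
)
available = sum(example.values())  # facets the label can fill in
```

A label service along these lines would render each unavailable facet greyed out while keeping the drill-down links for the facets that have content.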

Relevance:

80.00%

Publisher:

Abstract:

Humans are especially good at taking another's perspective: representing what others might be thinking or experiencing. This "mentalizing" capacity is apparent in everyday human interactions and conversations. We investigated its neural basis using magnetoencephalography. We focused on whether mentalizing was engaged spontaneously and routinely to understand an utterance's meaning, or largely on demand, to restore "common ground" when expectations were violated. Participants conversed with one of two confederate speakers and established tacit agreements about objects' names. In a subsequent "test" phase, some of these agreements were violated by either the same or a different speaker. Our analysis of the neural processing of test phase utterances revealed recruitment of neural circuits associated with language (temporal cortex), episodic memory (e.g., medial temporal lobe), and mentalizing (temporo-parietal junction and ventromedial prefrontal cortex). Theta oscillations (3–7 Hz) were modulated most prominently, and we observed phase coupling between functionally distinct neural circuits. The episodic memory and language circuits were recruited in anticipation of upcoming referring expressions, suggesting that context-sensitive predictions were spontaneously generated. In contrast, the mentalizing areas were recruited on demand, as a means of detecting and resolving perceived pragmatic anomalies, with little evidence that they were activated to make partner-specific predictions about upcoming linguistic utterances.

Relevance:

80.00%

Publisher:

Abstract:

This paper introduces a joint load balancing and hotspot mitigation protocol for mobile ad-hoc networks (MANETs), which we term the 'load_energy balance + hotspot mitigation protocol (LEB+HM)'. We argue that although ad-hoc wireless networks have limited network resources (bandwidth and power), are prone to frequent link/node failures and carry high security risk, existing ad-hoc routing protocols do not put emphasis on maintaining robust links/nodes, on efficient use of network resources, or on maintaining the security of the network. Typical route selection metrics used by existing ad-hoc routing protocols are shortest hop, shortest delay and loop avoidance. These routing philosophies tend to concentrate traffic on certain regions or nodes, leading to heavy contention, congestion and resource exhaustion, which in turn may result in increased end-to-end delay, packet loss and faster battery depletion, degrading the overall performance of the network. Also, in most existing on-demand ad-hoc routing protocols, intermediate nodes are allowed to send a route reply (RREP) to the source in response to a route request (RREQ). In such a situation a malicious node can send a false optimal route to the source so that data packets will be directed to or through it, and tamper with them as it wishes. It is therefore desirable to adopt routing schemes which can dynamically disperse traffic load, detect and remove any possible bottlenecks, and provide some form of security to the network. In this paper we propose a combined adaptive load_energy balancing and hotspot mitigation scheme that aims at evenly distributing network traffic load and energy, mitigating any possible occurrence of hotspots, and providing some form of security to the network. This combined approach is expected to yield the high reliability, availability and robustness that best suit any dynamic and scalable ad-hoc network environment.
Dynamic source routing (DSR) was used as the underlying protocol for the implementation of our algorithm. Simulation comparison of our protocol with the original DSR shows that our protocol reduces node/link failures, distributes battery energy more evenly, and provides better network service efficiency.
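The load/energy-aware route selection idea can be illustrated with a toy metric: prefer the route whose worst bottleneck and weakest battery are least stressed, rather than the shortest route. The weights and node fields below are invented assumptions for illustration, not the LEB+HM metric itself.

```python
# Toy illustration of load/energy-aware route selection, as opposed to
# shortest-hop. Weights and node fields are invented assumptions.

def route_cost(route, w_load=0.5, w_energy=0.5):
    """route -- list of nodes, each {'queue': 0..1, 'battery': 0..1}."""
    max_load = max(node["queue"] for node in route)       # hottest node
    min_battery = min(node["battery"] for node in route)  # weakest node
    # Penalise congested bottlenecks and nearly drained batteries
    return w_load * max_load + w_energy * (1.0 - min_battery)

routes = {
    "short_but_hot": [{"queue": 0.9, "battery": 0.8},
                      {"queue": 0.2, "battery": 0.9}],
    "longer_but_balanced": [{"queue": 0.3, "battery": 0.7},
                            {"queue": 0.2, "battery": 0.8},
                            {"queue": 0.3, "battery": 0.9}],
}
best = min(routes, key=lambda name: route_cost(routes[name]))
```

A shortest-hop metric would pick the two-hop route here; the combined load/energy cost picks the three-hop route and so avoids reinforcing an emerging hotspot.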

Relevance:

80.00%

Publisher:

Abstract:

This paper presents an assessment of the technical and economic performance of thermal processes to generate electricity from a wood chip feedstock by combustion, gasification and fast pyrolysis. The scope of the work begins with the delivery of a wood chip feedstock at a conversion plant and ends with the supply of electricity to the grid, incorporating wood chip preparation, thermal conversion, and electricity generation in dual fuel diesel engines. Net generating capacities of 1–20 MWe are evaluated. The techno-economic assessment is achieved through the development of a suite of models that are combined to give cost and performance data for the integrated system. The models include feed pretreatment, combustion, atmospheric and pressure gasification, fast pyrolysis with pyrolysis liquid storage and transport (an optional step in de-coupled systems) and diesel engine or turbine power generation. The models calculate system efficiencies, capital costs and production costs. An identical methodology is applied in the development of all the models so that all of the results are directly comparable. The electricity production costs have been calculated for 10th plant systems, indicating the costs that are achievable in the medium term after the high initial costs associated with novel technologies have reduced. The costs converge at the larger scale with the mean electricity price paid in the EU by a large consumer, and there is therefore potential for fast pyrolysis and diesel engine systems to sell electricity directly to large consumers or for on-site generation. However, competition will be fierce at all capacities since electricity production costs vary only slightly between the four biomass to electricity systems that are evaluated. Systems de-coupling is one way that the fast pyrolysis and diesel engine system can distinguish itself from the other conversion technologies. 
Evaluations in this work show that situations requiring several remote generators are much better served by a large fast pyrolysis plant that supplies fuel to de-coupled diesel engines than by constructing an entire close-coupled system at each generating site. Another advantage of de-coupling is that the fast pyrolysis conversion step and the diesel engine generation step can operate independently, with intermediate storage of the fast pyrolysis liquid fuel, increasing overall reliability. Peak load or seasonal power requirements would also benefit from de-coupling, since a small fast pyrolysis plant could operate continuously to produce fuel that is stored for use in the engine on demand. Current electricity production costs for a fast pyrolysis and diesel engine system are 0.091/kWh at 1 MWe when learning effects are included. These systems are handicapped by the typical characteristics of a novel technology: high capital cost, high labour requirements, and low reliability. As such, the more established combustion and steam cycle produces lower cost electricity under current conditions. The fast pyrolysis and diesel engine system is a low capital cost option, but it also suffers from relatively low system efficiency, particularly at high capacities. This low efficiency is the result of a low conversion efficiency of feed energy into the pyrolysis liquid, because of the energy in the char by-product. A sensitivity analysis has highlighted the high impact of the fast pyrolysis liquids yield on electricity production costs. The liquids yield should be set realistically during design, and it should be maintained in practice by careful attention to plant operation and feed quality. Another problem is the high power consumption during feedstock grinding. Efficiencies may be enhanced in ablative fast pyrolysis, which can tolerate a chipped feedstock. This has yet to be demonstrated at commercial scale.
In summary, the fast pyrolysis and diesel engine system has great potential to generate electricity at a profit in the long term, and at a lower cost than any other biomass to electricity system at small scale. This future viability can only be achieved through the construction of early plant that could, in the short term, be more expensive than the combustion alternative. Profitability in the short term can best be achieved by exploiting niches in the market place and specific features of fast pyrolysis. These include:
• countries or regions with fiscal incentives for renewable energy such as premium electricity prices or capital grants;
• locations with high electricity prices, so that electricity can be sold direct to large consumers or generated on-site by companies who wish to reduce their consumption from the grid;
• waste disposal opportunities, where feedstocks can attract a gate fee rather than incur a cost;
• the ability to store fast pyrolysis liquids as a buffer against shutdowns or as a fuel for peak-load generating plant;
• de-coupling opportunities, where a large, single pyrolysis plant supplies fuel to several small and remote generators;
• small-scale combined heat and power opportunities;
• sales of the excess char, although a market has yet to be established for this by-product; and
• potential co-production of speciality chemicals and fuel for power generation in fast pyrolysis systems.
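The "learning effects" behind the 10th-plant costings above can be illustrated with a standard experience curve, in which unit capital cost falls by a fixed fraction with every doubling of plants built. The 15% learning rate and first-plant cost below are invented assumptions for illustration, not the paper's values.

```python
import math

# Sketch of an experience (learning) curve: cost of the n-th plant
# falls by `learning_rate` with every doubling of cumulative builds.
# The 15% rate and first-plant cost are illustrative assumptions.

def nth_plant_cost(first_plant_cost, n, learning_rate=0.15):
    """Capital cost of the n-th plant on an experience curve."""
    progress_exponent = math.log(1.0 - learning_rate, 2)
    return first_plant_cost * n ** progress_exponent

first = 5.0e6                      # hypothetical cost of plant #1
tenth = nth_plant_cost(first, 10)  # the "10th plant" cost basis
```

Under these assumptions the tenth plant costs roughly 58% of the first, which is why 10th-plant figures are used to indicate costs achievable in the medium term once the high initial costs of a novel technology have reduced.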