12 results for Corporative use of the territory

in Aston University Research Archive


Relevance: 100.00%

Abstract:

This thesis is concerned with the use of the synoptic approach within decision making concerning nuclear waste management. The synoptic approach refers to an approach to rational decision making that assumes comprehensiveness of information and analysis as an ideal. Two case studies are examined in which a high degree of synoptic analysis was used within the decision-making process: the Windscale Inquiry into the decision to build the THORP reprocessing plant, and the Nirex safety assessment of nuclear waste disposal. The case studies are used to test Lindblom's hypothesis that a synoptic approach to decision making is not achievable. In the first case study, Lindblom's hypothesis is tested through an evaluation of the decision to build the THORP plant, taken following the Windscale Inquiry. It is concluded that the incongruity of this decision supports Lindblom's hypothesis. However, it has been argued that the Inquiry should be seen as a legitimisation exercise for a decision that was effectively predetermined, rather than as a rigorous synoptic analysis; the Windscale Inquiry therefore does not provide a robust test of the synoptic method. It was concluded that a methodology was required that allowed robust conclusions to be drawn despite the ambiguity of the role of the synoptic method in decision making, and the methodology adopted for the second case study was modified accordingly. In this case study the synoptic method was evaluated directly, through an analysis of the cogency of the Nirex safety assessment. It was concluded that the failure of Nirex to provide a cogent synoptic analysis supported Lindblom's criticism of the synoptic method. Moreover, the synoptic method was found to fail in the way that Lindblom predicted it would.

Relevance: 100.00%

Abstract:

The Type I and Type II classification scheme, first introduced and used by fiber Bragg grating researchers, has recently been adopted by the ultrafast laser direct-write photonics community to classify the physical geometry of waveguides written into glasses and crystals. This has created confusion between the fiber Bragg grating and direct-write photonics communities. Here we propose a return to the original basis of the classification, founded on the characteristics of the material modification rather than the physical geometry of the waveguide.

Relevance: 100.00%

Abstract:

Lichenometry is one of the most widely used methods of dating the surface age of substrata, including rock surfaces, boulders, walls, and archaeological remains, and has been particularly important in dating late Holocene glacial events. Yellow-green species of the crustose genus Rhizocarpon have been the most useful lichens in lichenometry because of their low growth rates and longevity. This review describes: (1) the biology of the genus Rhizocarpon, (2) growth rates and longevity, (3) environmental growth effects, (4) methods of estimating lichen age, (5) the methodology of lichenometry, (6) applications to dating glacial events, and (7) future research. Lichenometry depends on many assumptions, most critically that if the lag time before colonisation of a substratum is known and lichen age can be estimated, then a minimum surface age can be obtained by measuring the size of the largest Rhizocarpon thallus. Lichen age can be estimated by calibrating thallus size against surfaces of known age (‘indirect lichenometry’), by constructing a growth rate-size curve from direct measurement of growth (‘direct lichenometry’), by radiocarbon dating, or from lichen ‘growth rings’. Future research should include a more rigorous investigation of the assumptions of lichenometry, especially whether the largest thallus present at a site is a good indicator of substratum age, and further studies on the establishment, development, growth, senescence, and mortality of Rhizocarpon lichens.
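To make the logic of indirect lichenometry concrete, the sketch below calibrates largest-thallus diameter against surfaces of known age and inverts the fitted curve to date an undated surface. It is a minimal illustration only: the calibration numbers are invented, the growth curve is assumed linear (real studies often use more complex curves), and the colonisation lag is an assumed placeholder.

```python
# Indirect lichenometry sketch: calibrate largest-thallus size against
# surfaces of known age, then invert the curve to date an unknown surface.
# All numbers are hypothetical placeholders, not data from the review.
import numpy as np

# Calibration surfaces: (age in years, largest Rhizocarpon diameter in mm)
ages = np.array([50.0, 100.0, 150.0, 200.0, 300.0])
diameters = np.array([18.0, 35.0, 50.0, 63.0, 88.0])

# Fit a simple linear growth curve: diameter = a * age + b
a, b = np.polyfit(ages, diameters, deg=1)

def estimate_surface_age(largest_thallus_mm, colonisation_lag_years=10.0):
    """Minimum surface age = inferred lichen age + colonisation lag."""
    lichen_age = (largest_thallus_mm - b) / a
    return lichen_age + colonisation_lag_years

print(f"Estimated minimum surface age: {estimate_surface_age(70.0):.0f} years")
```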

Relevance: 100.00%

Abstract:

The present article describes a standard instrument for the continuous online determination of retinal vessel diameters, the commercially available retinal vessel analyzer. This report is intended to provide informed guidelines for measuring ocular blood flow with this system. The report describes the principles underlying the method and the instruments currently available, and discusses clinical protocol and the specific parameters measured by the system. Unresolved questions and the possible limitations of the technique are also discussed.

Relevance: 100.00%

Abstract:

This paper contributes to the literature on the intra-firm diffusion of innovations by investigating the factors that affect a firm's decision to adopt and use sets of complementary innovations. We define as complementary those innovations whose joint use generates super-additive gains, i.e. the gain from joint adoption is higher than the sum of the gains derived from adopting each innovation in isolation. From a theoretical perspective, we present a simple decision model whereby the firm decides ‘whether’ and ‘how much’ to invest in each of the innovations under investigation, based upon the expected profit gain from each possible combination of adoption and use. The model shows how the extent of complementarity among the innovations can affect the firm's profit gains and therefore the likelihood that the firm will adopt these innovations jointly rather than individually. From an empirical perspective, we focus on four sets of management practices, namely operating (OMP), monitoring (MMP), targets (TMP) and incentives (IMP) management practices. We show that these sets of practices, although to different extents, are complementary to each other. We then construct a synthetic indicator of the depth of their use; the resulting intra-firm index reflects not only the number of practices adopted but also the depth of their individual use and the extent of their complementarity. The empirical testing of the decision model is carried out using evidence from the adoption behaviour of a sample of 1,238 UK establishments present in the 2004 Workplace Employment Relations Survey (WERS). Our empirical results show that the intra-firm profitability-based model explains more of the variability of joint adoption than models based upon the adoption and use of individual practices. We also investigate whether a number of firm-specific and market characteristics may, by affecting the size of the gains that joint adoption can generate, drive the intensity of use of the four innovations. We find that establishment size, foreign ownership, exposure to an international market and the degree of homogeneity of the final product are important determinants of the intensity of joint adoption of the four innovations. Most importantly, our results point out that the factors that the economics of innovation literature has shown to affect the intensity of use of a technological innovation also affect the intensity of use of sets of innovative management practices. However, these factors can explain only a small part of the diversity of joint adoption and use by the firms in the sample.
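As a minimal illustration of the super-additivity definition above, the sketch below checks whether the joint gain from two practices exceeds the sum of their stand-alone gains. The payoff figures are invented placeholders, not estimates from the paper.

```python
# Sketch of the complementarity (super-additivity) condition described above.
# Hypothetical expected profit gains for each subset of two practices A and B,
# measured relative to adopting neither (gain of the empty set is 0).
gain = {
    frozenset(): 0.0,
    frozenset({"A"}): 2.0,
    frozenset({"B"}): 3.0,
    frozenset({"A", "B"}): 7.0,  # > 2 + 3, so A and B are complements
}

def super_additive(a, b):
    """True if adopting a and b jointly beats adopting them in isolation."""
    joint = gain[frozenset({a, b})]
    return joint > gain[frozenset({a})] + gain[frozenset({b})]

print(super_additive("A", "B"))  # True: joint gain 7 > 2 + 3 = 5
```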

Relevance: 100.00%

Abstract:

The further development of the use of NMR relaxation times in chemical, biological and medical research has perhaps been curtailed by the length of time these measurements often take. The DESPOT (Driven Equilibrium Single Pulse Observation of T1) method has been developed, which reduces the time required to make a T1 measurement by a factor of up to 100. The technique has been studied extensively herein, and the thesis contains recommendations for its successful experimental application. Modified DESPOT-type equations are also presented for use when T2 relaxation is incomplete or where off-resonance effects are thought to be significant. A recently reported application of the DESPOT technique to MR imaging gave good initial results but suffered from the fact that the images were derived from spin systems that were not driven to equilibrium. An approach that allows equilibrium to be obtained with only one non-acquisition sequence is presented herein and should prove invaluable in variable-contrast imaging. A DESPOT-type approach has also been successfully applied to the measurement of T1ρ, which can be measured using this approach significantly faster than by the classical method. The new method also provides a value for T1 simultaneously, and the technique should therefore prove valuable in intermediate-energy-barrier chemical exchange studies. It also raises the possibility of obtaining simultaneous T1 and T1ρ MR images. The DESPOT technique depends on rapid multipulsing at nutation angles normally less than 90°. Work in this area has highlighted the possible time saving for spectral acquisition over the classical (90°-5T1)n technique. A new method based on these principles has been developed which permits the rapid multipulsing of samples to give T1 and M0 ratio information in a time only slightly longer than would be required to determine the M0 ratio alone using the classical technique. In 1H-decoupled 13C spectroscopy the method also gives nOe ratio information for the individual absorptions in the spectrum.
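The DESPOT calculation rests on the standard steady-state signal equation, which can be linearised so that a straight-line fit of S/sin(α) against S/tan(α) yields T1 from the slope. The sketch below demonstrates this on synthetic data; the repetition time, nutation angles and signal values are illustrative assumptions, not figures from the thesis.

```python
# Minimal sketch of the DESPOT T1 calculation: signals acquired at several
# nutation angles obey S = M0*(1-E1)*sin(a)/(1-E1*cos(a)) with
# E1 = exp(-TR/T1), which linearises as S/sin(a) = E1*(S/tan(a)) + M0*(1-E1).
import numpy as np

TR = 0.05          # repetition time in seconds (assumed for illustration)
T1_true = 0.8      # "unknown" T1 used to synthesise the demo data
M0 = 100.0

angles = np.deg2rad([5, 10, 15, 20, 30])
E1 = np.exp(-TR / T1_true)
signals = M0 * (1 - E1) * np.sin(angles) / (1 - E1 * np.cos(angles))

# Straight-line fit of S/sin(a) against S/tan(a): slope = E1
x = signals / np.tan(angles)
y = signals / np.sin(angles)
slope, intercept = np.polyfit(x, y, deg=1)

T1_est = -TR / np.log(slope)
M0_est = intercept / (1 - slope)
print(f"T1 = {T1_est:.3f} s, M0 = {M0_est:.1f}")  # recovers 0.800 s, 100.0
```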

Relevance: 100.00%

Abstract:

For more than forty years, research has been ongoing into the use of the computer in the processing of natural language. During this period methods have evolved, with various parsing techniques and grammars coming to prominence. Problems still exist, not least in the field of Machine Translation; however, one of the successes in this field is the translation of sublanguage. The present work reports on Deterministic Parsing, a relatively new parsing technique, and its application to the sublanguage of an aircraft maintenance manual for Machine Translation. The aim has been to investigate the practicability of using Deterministic Parsers in the analysis stage of a Machine Translation system. Machine Translation, sublanguage and parsing are described in general terms, and Deterministic Parsing systems pertinent to this research are reviewed in detail. The interaction between Machine Translation, sublanguage and parsing, including Deterministic Parsing, is also highlighted. Two types of Deterministic Parser have been investigated: a Marcus-type parser, based on the basic design of the original Deterministic Parser (Marcus, 1980), and an LR-type Deterministic Parser for natural language, based on the LR parsing algorithm. In total, four Deterministic Parsers have been built and are described in the thesis. Two of them are prototypes from which the remaining two parsers, used on the sublanguage, have been developed. This thesis reports the results of parsing by the prototypes: a Marcus-type parser and an LR-type parser with a grammatical and linguistic range similar to that of the original Marcus parser. The Marcus-type parser uses a grammar of production rules, whereas the LR-type parser employs a Definite Clause Grammar (DCG).
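To give a flavour of the shift-reduce behaviour underlying an LR-type Deterministic Parser, the sketch below parses a toy maintenance-manual-style imperative. The grammar, lexicon and control strategy (reduce greedily by rule order, otherwise shift) are invented simplifications, far cruder than the parsers described in the thesis.

```python
# Toy deterministic shift-reduce parse of a sublanguage imperative such as
# "remove the panel". Rule order makes each step deterministic here.
GRAMMAR = [
    ("S", ["V", "NP"]),   # imperative: verb + noun phrase
    ("NP", ["Det", "N"]),
    ("NP", ["N"]),
]
LEXICON = {"remove": "V", "check": "V", "the": "Det", "panel": "N", "valve": "N"}

def parse(words):
    """Deterministic shift-reduce parse: reduce greedily, otherwise shift."""
    stack = []
    buffer = [(LEXICON[w], w) for w in words]
    while True:
        for lhs, rhs in GRAMMAR:
            if len(stack) >= len(rhs) and [c for c, _ in stack[-len(rhs):]] == rhs:
                children = stack[-len(rhs):]   # reduce: pop rhs, push lhs node
                del stack[-len(rhs):]
                stack.append((lhs, children))
                break
        else:
            if not buffer:
                break
            stack.append(buffer.pop(0))        # shift the next word
    return stack

# Yields a single S node spanning the whole sentence:
print(parse(["remove", "the", "panel"]))
```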

Relevance: 100.00%

Abstract:

In previous sea-surface variability studies, researchers have failed to utilise the full ERS-1 mission owing to the varying orbital characteristics in each mission phase, and most have simply ignored the Ice and Geodetic phases. This project aims to introduce a technique, based upon single-satellite crossovers, which allows the straightforward use of all orbital phases regardless of orbit type. Unfortunately the ERS-1 orbital height is still poorly resolved (due to higher air drag and stronger gravitational effects) when compared with that of TOPEX/Poseidon (T/P), so, to make best use of the ERS-1 crossover data, corrections to the ERS-1 orbital heights are calculated by fitting a cubic spline to dual-crossover residuals with T/P. This correction is validated by comparison of dual-satellite crossovers with tide gauge data. The crossover processing technique is validated by comparing the extracted sea-surface variability information with that from T/P repeat-pass data. The two data sets are then combined into a single consistent data set for analysis of sea-surface variability patterns. These patterns are simplified by the use of an empirical orthogonal function decomposition, which breaks the signals into spatial modes that are then discussed separately. Further studies carried out on these data include an analysis of the characteristics of the annual signal, a discussion of the evidence for Rossby wave propagation on a global basis, and finally an analysis of the evidence for global mean sea level rise.
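As a minimal sketch of the empirical orthogonal function decomposition mentioned above, the code below factors an anomaly field into spatial modes and time-varying amplitudes with an SVD. The random field is a stand-in for the real combined ERS-1/T/P data set.

```python
# EOF decomposition sketch: the time-mean is removed at each grid point,
# then the anomaly matrix is factored so rows of Vt are spatial modes (EOFs)
# and the scaled columns of U are the corresponding amplitude time series.
import numpy as np

rng = np.random.default_rng(0)
n_times, n_points = 120, 500                     # e.g. time samples x grid points
ssh = rng.standard_normal((n_times, n_points))   # stand-in sea-surface anomalies

ssh -= ssh.mean(axis=0)                          # remove the time mean per point
U, s, Vt = np.linalg.svd(ssh, full_matrices=False)

eofs = Vt                          # rows: spatial modes (EOFs)
amplitudes = U * s                 # columns: principal-component time series
explained = s**2 / np.sum(s**2)    # fraction of variance per mode

print(f"Mode 1 explains {explained[0]:.1%} of the variance")
```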

Relevance: 100.00%

Abstract:

Urban regions present some of the most challenging areas for the remote sensing community. Many different types of land cover have similar spectral responses, making them difficult to distinguish from one another. Traditional per-pixel classification techniques suffer particularly badly because they use only these spectral properties to determine a class, and no other properties of the image, such as context. This project presents the results of the classification of a deeply urban area of Dudley, West Midlands, using four methods: Supervised Maximum Likelihood, SMAP, ECHO and Unsupervised Maximum Likelihood. An accuracy assessment method is then developed to allow a fair representation of each procedure and a direct comparison between them. Subsequently, a classification procedure is developed that makes use of the context in the image through a per-polygon classification. The imagery is broken up into a series of polygons extracted with the Marr-Hildreth zero-crossing edge detector. These polygons are then refined using a region-growing algorithm and classified according to the mean class of the fine polygons. The imagery produced by this technique is shown to be of better quality and higher accuracy than that of the other, conventional methods. Further refinements are suggested and examined to improve the aesthetic appearance of the imagery. Finally, a comparison is made with the results produced in a previous study of the James Bridge catchment in Darlaston, West Midlands, showing that the polygon-classified ATM imagery performs significantly better than the Maximum Likelihood-classified videography used in the initial study, despite the presence of geometric correction errors.
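A minimal sketch of the per-polygon idea follows, assuming each pixel already carries a per-pixel class label and a polygon id, and that a polygon takes the majority class of its pixels. The arrays are invented, and majority voting is a simplified stand-in for the study's combination of refined fine polygons.

```python
# Per-polygon classification sketch: reassign every pixel the most common
# per-pixel class within its polygon, so context smooths spectral confusion.
import numpy as np

per_pixel_class = np.array([[1, 1, 2, 2],
                            [1, 3, 2, 2],
                            [3, 3, 3, 2]])
polygon_id = np.array([[0, 0, 1, 1],
                       [0, 0, 1, 1],
                       [2, 2, 2, 1]])

per_polygon_class = np.empty_like(per_pixel_class)
for pid in np.unique(polygon_id):
    mask = polygon_id == pid
    majority = np.bincount(per_pixel_class[mask]).argmax()
    per_polygon_class[mask] = majority

print(per_polygon_class)   # the stray class-3 pixel in polygon 0 is smoothed out
```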

Relevance: 100.00%

Abstract:

This paper considers the use of general performance measures in evaluating specific planning and design decisions in higher education, and reflects on the students' learning process. Specifically, it concerns the use of the MENTOR multimedia computer-aided learning package for helping students learn about OR as part of a general business degree. It covers the transfer of responsibility for a learning module to a new staff member and a change from a single tutor to a system involving multiple tutors. Student satisfaction measures, learning outcome measures and MENTOR usage patterns are examined in monitoring the effects of the changes in course delivery. The results raise some questions about the effectiveness of general performance measures in supporting specific decisions relating to course design and planning.

Relevance: 100.00%

Abstract:

Incorporation of technology within physical stores affords opportunities for the retailer to reduce costs while enhancing the service provided to consumers; service innovations in retailing therefore have the potential to benefit consumers as well as retailers. This research models key factors associated with the trial and continuous use of a specific self-service technology (SST) in the retail context, the personal shopping assistant (PSA), and estimates retailer benefits from implementing that innovation. In so doing, the study contributes to the nascent area of research on SSTs in the retail sector. Based on theoretical insights from prior SST studies, the diffusion of innovation literature, and the technology acceptance model (TAM), the study develops specific hypotheses regarding (1) the antecedent effects of technology anxiety, novelty seeking, market mavenism, and trust in the retailer on trial of the service innovation; (2) the effects of ease of use, perceived waiting time, and need for interaction on continuous use of the innovation; and (3) the effect of use of the innovation on consumer spending at the store. The hypotheses were tested on a sample of 104 actual users of the PSA and 345 nonusers who shopped at the retail store offering the PSA device, one of the early adopters of the PSA in Germany. Data were analyzed using logistic regression (antecedents of trial), multiple regression (antecedents of continuous use), and propensity score matching (assessing retailer benefits). Results indicate that the factors affecting initial trial differ from those affecting continuous use. More specifically, consumers' trust toward the retailer, novelty seeking, and market mavenism are positively related to trial, while technology anxiety hinders the likelihood of trying the PSA. Perceived ease of use of the device positively affects continuous use, while consumers' need for interaction in shopping environments reduces the likelihood of continuous use. Importantly, there is evidence of retailer benefits from introducing the innovation, since consumers using the PSA tend to spend more during each shopping trip. However, given the high costs of the technology, the payback period for recovery of the investment depends largely upon continued use of the innovation by consumers. Important implications are provided for retailers considering investments in new in-store service innovations. The study contributes to the literature through its (1) simultaneous examination of the antecedents of trial and continuous use of a specific SST, (2) demonstration of the economic benefits of SST introduction for the retailer, and (3) contribution to the stream of research on service innovation, as distinct from product innovation.
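As a hedged sketch of the first analysis step reported above, the code below fits a logistic regression of PSA trial on its hypothesised antecedents. The variable names and synthetic data are placeholders, not the study's survey items or results.

```python
# Logistic regression of trial (1 = tried the PSA, 0 = did not) on the four
# hypothesised antecedents, on synthetic placeholder data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 449  # 104 users + 345 nonusers, as in the reported sample
df = pd.DataFrame({
    "technology_anxiety": rng.normal(size=n),
    "novelty_seeking": rng.normal(size=n),
    "market_mavenism": rng.normal(size=n),
    "trust_in_retailer": rng.normal(size=n),
})
df["trial"] = rng.binomial(1, 104 / 449, size=n)  # placeholder outcome

X = sm.add_constant(df[["technology_anxiety", "novelty_seeking",
                        "market_mavenism", "trust_in_retailer"]])
model = sm.Logit(df["trial"], X).fit(disp=0)
print(model.summary())
```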