4 results for Analysis of precipitable water vapor from GPS measurements

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

We propose a fibre-based approach for the generation of optical frequency combs (OFCs) aimed at the calibration of astronomical spectrographs in the low- and medium-resolution range. The approach comprises two steps: in the first, an appropriate state of optical pulses is generated; in the second, this state is moulded into the desired OFC. More precisely, the first step is realised by injecting two continuous-wave (CW) lasers into a conventional single-mode fibre, whereas the second step generates a broad OFC by using the optical solitons produced in the first step as the initial condition. We investigate the conversion of the bichromatic input wave produced by the two initial CW lasers into a train of optical solitons, which takes place in the fibre used in the first step. In particular, we are interested in the soliton content of the pulses created in this fibre. To this end, we study different initial conditions (a single cosine hump, an Akhmediev breather, and a deeply modulated bichromatic wave) by means of soliton radiation beat analysis and compare the results to draw conclusions about the soliton content of the state generated in the first step. In the case of a deeply modulated bichromatic wave, we observed the formation of a collective soliton crystal at low input powers and the appearance of separated solitons at high input powers. An intermediate state showing features of both the soliton crystal and the separated solitons turned out to be most suitable for generating OFCs for the calibration of astronomical spectrographs.
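The first step described above, the reshaping of two beating CW lasers into solitons inside a single-mode fibre, is governed by the nonlinear Schrödinger equation. The following Python sketch is not the authors' code: it only illustrates, under assumed fibre parameters and a made-up line spacing, how a bichromatic input can be propagated with the standard split-step Fourier method and its output spectrum inspected.

```python
# Illustrative sketch only: split-step Fourier propagation of a bichromatic
# (two-CW-line) input through a single-mode fibre, modelled by the focusing
# nonlinear Schrodinger equation. All parameter values are assumptions.
import numpy as np

beta2 = -20e-27      # group-velocity dispersion, s^2/m (anomalous, assumed)
gamma = 1.3e-3       # Kerr nonlinearity, 1/(W*m) (assumed)
L, nz = 500.0, 2000  # fibre length (m) and number of propagation steps
dz = L / nz

nt, T = 2**12, 200e-12                        # time samples and window (s)
t = np.linspace(-T/2, T/2, nt, endpoint=False)
f_beat = 100e9                                # beat frequency between the CW lines, Hz (assumed)
P0 = 1.0                                      # power per CW line, W (assumed)
A = np.sqrt(P0) * (np.exp(1j*np.pi*f_beat*t) + np.exp(-1j*np.pi*f_beat*t))

w = 2*np.pi*np.fft.fftfreq(nt, d=t[1] - t[0])       # angular frequency grid
half_linear = np.exp(0.5j * (beta2/2) * w**2 * dz)  # half-step dispersion operator

for _ in range(nz):                                  # symmetric split-step loop
    A = np.fft.ifft(half_linear * np.fft.fft(A))     # half dispersion step
    A *= np.exp(1j * gamma * np.abs(A)**2 * dz)      # full nonlinear (Kerr) step
    A = np.fft.ifft(half_linear * np.fft.fft(A))     # half dispersion step

comb = np.fft.fftshift(np.abs(np.fft.fft(A))**2)     # output spectrum: comb lines spaced by f_beat
```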

Relevance:

100.00%

Publisher:

Abstract:

Purpose – This paper seeks answers to four questions. Two of these questions have been borrowed (but adapted) from the work of Defee et al.: RQ1. To what extent is theory used in purchasing and supply chain management (P&SCM) research? RQ2. What are the prevalent theories to be found in P&SCM research? Following on from these questions, an additional question is posed: RQ3. Are theory-based papers more highly cited than papers with no theoretical foundation? Finally, drawing on the work of Harland et al., the authors have added a fourth question: RQ4. To what extent does P&SCM meet the tests of coherence, breadth and depth, and quality necessary to make it a scientific discipline?

Design/methodology/approach – A systematic literature review was conducted in accordance with the model outlined by Tranfield et al. for three journals within the field of "purchasing and supply chain management". In total, 1,113 articles were reviewed. In addition, a citation analysis was completed covering 806 articles.

Findings – The headline features of the results suggest that, nearly a decade and a half on from its development, the field still lacks coherence. Theory is absent from much of the work and, although theory-based articles achieved on average a higher number of citations than non-theoretical papers, there is no obvious contender as an emergent paradigm for the discipline. Furthermore, it is evident that P&SCM does not meet Fabian's tests necessary to make it a scientific discipline and is still some way from being a normal science.

Research limitations/implications – This study would have benefited from the analysis of further journals; however, the analysis of 1,113 articles from three leading journals in the field of P&SCM was deemed sufficient in scope. In addition, a further significant line of enquiry to follow is the rigour vs relevance debate.

Practical implications – This article is of interest to both an academic and a practitioner audience as it highlights the use of theories in P&SCM. Furthermore, it raises a number of important questions: should research in this area draw more heavily on theory and, if so, which theories are appropriate?

Social implications – The broader social implications relate to the discussion of how a scientific discipline develops, building on the work of Fabian and Amundson.

Originality/value – The data set for this study is significant and builds on a number of previous literature reviews. This review is both greater in scope and broader in subject focus than previous reviews. In addition, the citation analysis (not previously conducted in any of the reviews) and the accompanying statistical test highlight that theory-based articles are more highly cited than non-theoretically based papers. This could indicate that researchers are attempting to build on one another's work.
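RQ3 above asks whether theory-based papers are more highly cited. Purely as an aside (this is not the authors' analysis, and the citation counts are made up), a comparison of that kind is often run with a non-parametric test, because citation-count distributions tend to be heavily skewed:

```python
# Illustrative only: comparing citation counts of theory-based vs
# non-theory-based papers with a Mann-Whitney U test (hypothetical data).
from scipy.stats import mannwhitneyu

theory_citations = [12, 45, 3, 28, 60, 7, 19, 33]        # hypothetical counts
non_theory_citations = [5, 2, 14, 0, 9, 21, 1, 6]        # hypothetical counts

stat, p_value = mannwhitneyu(theory_citations, non_theory_citations,
                             alternative="greater")
print(f"U = {stat:.1f}, one-sided p = {p_value:.3f}")
```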

Relevance:

100.00%

Publisher:

Abstract:

Quantitative analysis of solid-state processes from isothermal microcalorimetric data is straightforward if data for the total process have been recorded and problematic (in the more likely case) when they have not. Data are usually plotted as a function of fraction reacted (α); for calorimetric data, this requires knowledge of the total heat change (Q) upon completion of the process. Determination of Q is difficult in cases where the process is fast (initial data missing) or slow (final data missing). Here we introduce several mathematical methods that allow the direct calculation of Q by selection of data points when only partial data are present, based on analysis with the Pérez-Maqueda model. All methods in addition allow direct determination of the reaction mechanism descriptors m and n and, from these, the rate constant k. The validity of the methods is tested with the use of simulated calorimetric data, and we introduce a graphical method for generating solid-state power-time data. The methods are then applied to the crystallization of indomethacin from a glass. All methods correctly recovered the total reaction enthalpy (16.6 J) and suggested that the crystallization followed an Avrami model. The rate constants for crystallization were determined to be 3.98 × 10⁻⁶, 4.13 × 10⁻⁶, and 3.98 × 10⁻⁶ s⁻¹ with methods 1, 2, and 3, respectively. © 2010 American Chemical Society.
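As an illustration of the underlying idea only (these are not the paper's calculation methods, and every parameter value is an assumption), the sketch below simulates an Avrami-type power-time curve, deliberately discards part of the record, and recovers the total heat Q, the rate constant k, and the Avrami exponent by direct nonlinear fitting of the partial data:

```python
# Illustrative sketch, not the paper's methods: recover Q and k from a
# partial isothermal power-time record by fitting Avrami kinetics,
# alpha(t) = 1 - exp(-(k t)^n), so that P(t) = Q * dalpha/dt.
import numpy as np
from scipy.optimize import curve_fit

def avrami_power(t, Q, k, n):
    """Heat flow P(t) = Q * n * k**n * t**(n-1) * exp(-(k*t)**n)."""
    return Q * n * k**n * t**(n - 1) * np.exp(-(k * t)**n)

# simulated "true" crystallisation (assumed values; Q matches the 16.6 J quoted above)
Q_true, k_true, n_true = 16.6, 4.0e-6, 3.0
t = np.linspace(1e4, 3e5, 400)                      # s; the record is incomplete at both ends
rng = np.random.default_rng(1)
P = avrami_power(t, Q_true, k_true, n_true)
P_noisy = P + rng.normal(0.0, 0.02 * P.max(), P.size)

# fit only the partial record: Q is recovered without observing the whole process
(Q_fit, k_fit, n_fit), _ = curve_fit(avrami_power, t, P_noisy, p0=[10.0, 5e-6, 2.5])
print(f"Q = {Q_fit:.1f} J, k = {k_fit:.2e} 1/s, n = {n_fit:.1f}")
```

The paper's own methods additionally work from selected data points and return the mechanism descriptors m and n of the Pérez-Maqueda model; the fit above only conveys why a complete record is not strictly required to obtain Q.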

Relevance:

100.00%

Publisher:

Abstract:

An important parameter of an integrated optical device is the propagation loss of its waveguides. Characterising this loss gives information about the fabrication quality as well as about the other passive devices on the chip, since the waveguide is the basic building block of those devices. Although many methods have been developed over the last three decades, no single standard has yet emerged. This paper presents a comparative analysis of long-established methods as well as methods developed very recently, in order to provide a complete picture of the pros and cons of the different types of method; from this comparison, the method the authors consider best is suggested. To support this claim, in addition to the analytical comparison, the paper also presents an experimental comparison between the suggested best method, recently proposed by Massachusetts Institute of Technology (MIT) researchers and based on an undercoupled all-pass microring structure, and the popular cut-back method.
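For context on the cut-back method mentioned above, here is a generic sketch (not taken from the paper; all numbers are hypothetical): insertion loss is measured for several nominally identical waveguides of different length, and a straight-line fit separates propagation loss (slope, dB/cm) from coupling loss (intercept, dB).

```python
# Generic cut-back illustration with hypothetical measurements.
import numpy as np

lengths_cm = np.array([0.5, 1.0, 2.0, 4.0])    # waveguide lengths (cm)
loss_db = np.array([3.1, 4.2, 6.3, 10.4])      # measured insertion loss (dB)

# linear fit: total loss = (propagation loss) * length + (coupling loss)
slope, intercept = np.polyfit(lengths_cm, loss_db, 1)
print(f"propagation loss ≈ {slope:.2f} dB/cm, coupling loss ≈ {intercept:.2f} dB")
```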