17 results for Well-established techniques
in Aston University Research Archive
Abstract:
Summary form only given. Broadly tunable, compact visible laser sources in the 500-650 nm spectral region are valuable in biophotonics and photomedicine, and for many applications including spectroscopy, laser projection and confocal microscopy. Unfortunately, commercially available lasers in this spectral range are bulky and inconvenient to use. An attractive route to portable visible laser sources is frequency doubling of infrared laser diodes in a nonlinear crystal containing a waveguide [1]. Nonlinear crystal waveguides, which offer an order-of-magnitude increase in IR-to-visible conversion efficiency, also enable a very different approach to second-harmonic generation (SHG) tunability in periodically poled crystals, promising an order-of-magnitude increase in the wavelength range available for SHG conversion. This is made possible by exploiting the significant difference between the effective refractive indices of high-order and low-order modes in multimode waveguides [2]. The recent availability of low-cost, good-quality semiconductor diode lasers covering a broad spectral range between 1 µm and 1.3 µm [3,4], combined with well-established techniques for fabricating good-quality waveguides in nonlinear crystals, allows compact tunable CW laser sources in the visible spectral region to be realized [2].
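For context, the modal quasi-phase-matching condition underlying this tunability can be written in standard QPM notation (a generic sketch, not taken from the abstract itself):

\beta_{2\omega}^{(p)} - 2\,\beta_{\omega}^{(q)} = \frac{2\pi m}{\Lambda}, \qquad \beta = \frac{2\pi n_{\mathrm{eff}}}{\lambda},

where \Lambda is the poling period and m the QPM order. Because n_eff differs substantially between low-order and high-order waveguide modes, different mode pairs (q, p) satisfy the condition at different pump wavelengths, so a single poling period can support phase-matched SHG over a much wider tuning range than in a bulk periodically poled crystal.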
Abstract:
Compact and tunable semiconductor terahertz sources providing direct electrical control, efficient operation at room temperature and device integration opportunities are of great current interest. One of the most well-established techniques for terahertz generation utilises photoconductive antennas driven by ultrafast pulsed or dual-wavelength continuous-wave laser systems, though limitations such as a confined optical pump wavelength range and thermal breakdown remain. The use of quantum dot-based semiconductor materials, with their unique carrier dynamics and material properties, can help to overcome these limitations and enable efficient optical-to-terahertz signal conversion at room temperature. Here we discuss the construction of novel and versatile terahertz transceiver systems based on quantum dot semiconductor devices. Configurable, energy-dependent optical and electronic characteristics of quantum-dot-based semiconductors are described, and the resonant response to optical pump wavelength is revealed. Terahertz signal generation and detection at energies that resonantly excite only the implanted quantum dots opens the potential for using compact quantum dot-based semiconductor lasers as pump sources. Proof-of-concept experiments demonstrate that quantum dot-based samples have higher optical pump damage thresholds and reduced carrier lifetimes with increasing pump power.
Abstract:
Orally disintegrating tablets (ODTs) offer many advantages over conventional oral dosage forms in terms of convenience and ease of use. Over the last decade, substantial advances in the formulation of ODTs have been achieved in academia and industry, resulting in a large number of patents. The aim of this review is to summarise the most recent patents in ODT formulation and to highlight their motivations, inventive steps and significance in the development of ODT formulations. Five major techniques have been applied in the manufacture of ODTs, namely conventional tablet pressing, moulding, freeze drying, tablet loading and pulverization, with the majority of patents dedicated to conventional tablet pressing. The patents address various issues concerning the manufacture of robust and practical ODT formulations by disclosing new manufacturing techniques, advantageous materials and innovative formulation steps. However, future developments are required to reduce the cost and widen the application of the newer manufacturing techniques, while simplifying and shortening the formulation steps will be crucial for the well-established ones.
Abstract:
The contact lens represents a well-established and important class of biomaterials. This thesis brings together the literature, mostly Japanese and American patents, concerned with an important group of polymers, `rigid gas permeable contact lens materials'. A comparison is made of similarities in the underlying chemical themes, centring on the use of variants of highly branched siloxy compounds with polymerizable methacrylate groups. There is a need for standard techniques to assess laboratory behaviour in relation to in vitro performance. A major part of the present work is dedicated to the establishment of such standardised techniques. It is apparent that property design requirements in this field (i.e. oxygen permeability, surface and mechanical properties) are to some extent conflicting. In principle, the structural approaches used to obtain high oxygen permeability lead to surface properties that are less than ideal in terms of compatibility with tears. PMMA is known to have uniquely good (but not perfect) surface properties in this respect; it has been used as a starting point in attempting to design new materials that possess a more acceptable compromise of transport and surface properties for ocular use. Initial examination of the oxygen permeabilities of relatively simple alkyl methacrylates shows that butyl methacrylate, which has a permeability some fifty times greater than that of PMMA, represents an interesting and hitherto unexplored group of materials for ophthalmic applications. Consideration was similarly given to surface modification techniques that would produce materials able to sustain a coherent tear film in the eye without markedly impairing oxygen transport properties. Particular attention is paid to the use of oxygen plasma techniques in this respect. Finally, similar design considerations were applied to an extended-wear hydrogel lens material in an attempt to overcome mechanical stability deficiencies which manifest themselves `in vivo' but not `in vitro'. A relatively simple structural modification, involving steric shielding of the amide substituent group, proved to be an effective solution to the problem.
Abstract:
Hypercoiling poly(styrene-alt-maleic anhydride) (PSMA) is known to undergo conformational transitions in response to environmental stimuli. The association of PSMA with the lipid 1,2-dilauroyl-sn-glycero-3-phosphocholine (DLPC) produces polymer-lipid complexes analogous to the lipoprotein assemblies found in lung surfactant. These complexes represent a new bio-mimetic delivery vehicle with applications in the cosmetic and pharmaceutical industries. The primary aim of this study was to develop a better understanding of PSMA-DLPC association using physical and spectroscopic techniques. Ternary phase diagrams were constructed to examine the effects of factors such as molecular weight, pH and temperature on PSMA-DLPC association. 31P-NMR spectroscopy was used to investigate the polymorphic changes of DLPC upon association with PSMA. The Langmuir trough technique and surface tension measurements were used to explore the association behaviour of PSMA both at the interface and in the bulk of solution, as well as its interaction with DLPC membranes. The ultimate aim of this study was to investigate the potential use of PSMA-DLPC complexes to improve the bioavailability and therapeutic efficacy of a range of drugs. Typical compounds of ophthalmic interest range from new drugs such as pirenzepine, which has attracted clinical interest for the control of myopia progression, to the well-established family of non-steroidal anti-inflammatory drugs. These drugs have widely differing structures, sizes, solubility profiles and pH sensitivities. In order to understand the ways in which these characteristics influence incorporation and release behaviour, the marker molecules Rhodamine B and Oil Red O were chosen. PSMA-DLPC complexes incorporating the marker molecules and pirenzepine were encapsulated in hydrogels of the types used for soft contact lenses. Release studies were conducted to examine whether this smart drug delivery system can retain such compounds and deliver them at a slow rate over a prolonged period of time.
Abstract:
Qualitative reasoning has traditionally been applied in the domain of physical systems, where there are well-established and well-understood laws governing the behaviour of each `component' in the system. Such application has shown that it is possible to produce models which can be used for explaining and predicting the behaviour of physical phenomena, as well as for trouble-shooting. The principles underlying the theory ensure that the models are robust and exhibit consistent behaviour under all conditions. This research examines the validity of applying the theory in the financial domain, where such laws may not exist or, if they do, may not be universally applicable. In particular, it investigates how far these principles and techniques may be applied in the construction of financial analysis models. Because of the inherent differences in the nature of these two domains, it is argued that a different qualitative value system ought to be employed. The dissertation enlarges on the constraints this places on model descriptions and the effect it may have on the power and usefulness of the resulting models. It also describes the implementation of a system that investigates the implications of applying this theory by testing it on situations drawn from both textbooks and published financial information.
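For readers unfamiliar with qualitative value systems, the following minimal Python sketch (hypothetical, not taken from the thesis) shows the standard sign-based system used in qualitative physics and why it is often too coarse for financial quantities:

import itertools

def q_add(a, b):
    """Qualitative addition over the sign landmarks {'-', '0', '+'}; '?' marks an ambiguous result."""
    if a == '0':
        return b
    if b == '0':
        return a
    if a == b:
        return a
    return '?'  # '+' plus '-' cannot be resolved without magnitude information

# Financial constraints routinely add quantities of opposite sign (e.g. revenue and -costs),
# so most results collapse to '?', which is one motivation for a richer qualitative value system.
for a, b in itertools.product('-0+', repeat=2):
    print(a, '+', b, '=', q_add(a, b))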
Abstract:
The use of immunological adjuvants has been established since 1924, and many candidates have since been extensively researched in vaccine development. The controlled release of vaccines is another area of biotechnology research that is advancing rapidly with great potential and success. Encapsulation of peptide and protein drugs within biodegradable microspheres has been amongst the most successful of these approaches within the past decade. The present studies focused on combining the advantages of microsphere delivery systems composed of biodegradable polylactide (PLLA) and polylactide-co-glycolide (PLGA) polymers with those of safe and effective adjuvants. The research efforts were directed to the development of single-dose delivery vehicles which can be manufactured easily, safely, and under mild conditions favourable to the encapsulated antigens. In pursuing this objective, non-ionic block copolymers (NIBCs) (Pluronic® L101 and L121) were incorporated within poly-dl-lactide (PDLA) microspheres prepared by the emulsification-diffusion method. L101 and L121 served both as adjuvants and as stabilising agents within these vaccine delivery vehicles. Formulations encapsulating the model antigens lysozyme, ovalbumin (OVA) and diphtheria toxoid (DT) resulted in high entrapment efficiency (99%) and yield (96.7%), and elicited a high and sustained immune response (IgG titres up to 9427) over nine months after a single administration. The structural integrity of the antigens was preserved within these formulations. In evaluating new approaches for the use of well-established adjuvants such as alum, alum particles were incorporated within PLLA and PLGA microspheres at much lower quantities (5-10 times lower) than those contained within conventional alum-adsorbed vaccines. These studies focused on the incorporation of the clinically relevant tetanus toxoid (TT) antigen within biodegradable microspheres. The encapsulation of both alum particles and TT antigen within these microspheres resulted in preparations with high encapsulation efficiency (95%) and yield (91.2%). The immune response to these particles was also investigated by evaluating the secretion of serum IgG, IgG1, IgG2a and IgG2b after a single administration of these vaccines. Splenic cell proliferation was also investigated as an indication of the induction of cell-mediated immunity. These particles produced a high and sustained immune response over a period of 14 months. The stability of TT within the particles was also investigated under dry storage over a period of several months. NIBC microspheres were also investigated as potential DNA vaccine delivery systems using a hepatitis B plasmid. These preparations resulted in microspheres of 3-5 μm diameter and were shown to preserve the integrity of the encapsulated hepatitis B plasmid (27.7% entrapment efficiency).
Abstract:
Type 1 cannabinoid receptors (CB1R) have a well-established role in modulating GABAergic signalling within the central nervous system, and are thought to be the only type present at GABAergic presynaptic terminals. In the medial entorhinal cortex (mEC), some cortical layers show high levels of ongoing GABAergic signalling (namely layer II) while others show relatively low levels (layer V). Using whole-cell patch clamp techniques, I have, for the first time, demonstrated the presence of functional CB1Rs in both deep and superficial layers of the mEC. Furthermore, using a range of highly specific ligands for both CB1R and CB2R, I present strong pharmacological evidence for CB2Rs being present in both deep and superficial layers of the mEC in the adult rat brain. In brain slices taken at earlier points in CNS development (P8-12), I have shown that while both CB1R- and CB2R-specific ligands do modulate GABAergic signalling at early developmental stages, antagonists/inverse agonists and full agonists have similar effects, and serve only to reduce GABAergic signalling. These data suggest that the full cannabinoid signalling mechanisms are not yet in place at this early stage of synaptogenesis. During these whole-cell studies, I developed and refined a novel recording technique, using an amantadine derivative (IEM-1460), which allows inhibitory postsynaptic currents to be recorded under conditions in which glutamate receptors are not blocked and network activity remains high. Finally, I have shown that bath-applied CB1 and CB2 receptor antagonists/inverse agonists are capable of modulating kainic acid-induced persistent oscillatory activity in the mEC. Inverse agonists suppressed oscillatory activity in the superficial layers of the mEC while enhancing it in the deeper layers. It seems likely that cannabinoid receptors modulate the inhibitory neuronal activity that underlies network oscillations.
Abstract:
The oxidation of lipids has long been a topic of interest in the biological and food sciences, and the fundamental principles of non-enzymatic free radical attack on phospholipids are well established, although questions about the detail of the mechanisms remain. The number of end products formed following the initiation of phospholipid peroxidation is large, and is continually growing as new structures of oxidized phospholipids are elucidated. Common products are phospholipids with esterified isoprostane-like structures and chain-shortened products containing hydroxy, carbonyl or carboxylic acid groups; the carbonyl-containing compounds are reactive and readily form adducts with proteins and other biomolecules. Phospholipids can also be attacked by reactive nitrogen and chlorine species, further expanding the range of products to nitrated and chlorinated phospholipids. Key to understanding the mechanisms of oxidation is the development of advanced and sensitive technologies that enable structural elucidation. Tandem mass spectrometry has proved invaluable in this respect and is generally the method of choice for structural work. A number of studies have investigated whether individual oxidized phospholipid products occur in vivo, and mass spectrometry techniques have been instrumental in detecting a variety of oxidation products in biological samples such as atherosclerotic plaque material, brain tissue, intestinal tissue and plasma, although relatively few have achieved absolute quantitative analysis. The level of oxidized phospholipids in vivo is a critical question, as there is now substantial evidence that many of these compounds are bioactive and could contribute to pathology. The challenges for the future will be to adopt lipidomic approaches to map the profile of oxidized phospholipid formation in different biological conditions, and to relate this to their effects in vivo. This article is part of a Special Issue entitled: Oxidized phospholipids - their properties and interactions with proteins.
Abstract:
This chapter reports on a framework that has been successfully used to analyze the e-business capabilities of an organization with a view to developing its e-capability maturity levels. This should be the first stage of any systems development project. The framework has been used widely within start-up companies and well-established companies, both large and small; it has been deployed in the service and manufacturing sectors. It has been applied by practitioners and consultants to help improve e-business capability levels, and by academics for teaching and research purposes at graduate and undergraduate levels. This chapter will provide an account of the unique e-business planning and analysis framework (E-PAF) and demonstrate how it works via an abridged version of a case study (selected from hundreds that have been produced). This will include a brief account of the three techniques that are integrated to form the analysis framework: quality function deployment (QFD) (Akao, 1972), the balanced scorecard (BSC) (Kaplan & Norton, 1992), and value chain analysis (VCA) (Porter, 1985). The case study extract is based on an online community and dating agency service identified as VirtualCom, which was produced through a consulting assignment with the founding directors of that company and has not been published previously. It has been chosen because it gives a concise, comprehensive example from an industry that is relatively easy to relate to.
Abstract:
The incretin hormone glucagon-like peptide-1(7-36)amide (GLP-1) is considered to be of considerable importance in the regulation of blood glucose. Its effects, mediated through the regulation of insulin, glucagon and somatostatin, are glucose-dependent and contribute to the tight control of glucose levels. Much enthusiasm has surrounded a possible role for GLP-1 in the treatment of type 2 diabetes. Unfortunately, GLP-1's action is limited by enzymatic inactivation caused by dipeptidyl peptidase IV (DPP IV). It is now well established that modifying GLP-1 at the N-terminal amino acids, His7 and Ala8, can greatly improve resistance to this enzyme. Little research has assessed what effect Glu9 substitution has on GLP-1 activity and its degradation by DPP IV. Here, we report that replacement of Glu9 of GLP-1 with Lys dramatically increased resistance to DPP IV. This analogue, (Lys9)GLP-1, exhibited preserved GLP-1 receptor affinity, but the usual stimulatory effects of GLP-1 were completely eliminated, a trait shared by the other established GLP-1 antagonists, exendin(9-39) and GLP-1(9-36)amide. We investigated the in vivo antagonistic actions of (Lys9)GLP-1 in comparison with GLP-1(9-36)amide and exendin(9-39) and revealed that this novel analogue may serve as a functional antagonist of the GLP-1 receptor.
Abstract:
The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations on the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins and ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.
Abstract:
Liquid-level sensing technologies have attracted great interest, because such measurements are essential to industrial applications such as fuel storage, flood warning and the biochemical industry. Traditional liquid-level sensors are based on electromechanical techniques; however, they suffer from intrinsic safety concerns in explosive environments. In recent years, given that optical fiber sensors have many well-established advantages such as high accuracy, cost-effectiveness, compact size and ease of multiplexing, several optical fiber liquid-level sensors have been investigated based on different operating principles, such as side-polishing the cladding and a portion of the core, using a spiral side-emitting optical fiber, or using silica fiber gratings. The present work proposes a novel and highly sensitive liquid-level sensor making use of polymer optical fiber Bragg gratings (POFBGs). The key elements of the system are a set of POFBGs embedded in silicone rubber diaphragms. This is a new development building on the idea of determining liquid level by measuring the pressure at the bottom of a liquid container, but it has a number of critical advantages. The system features several FBG-based pressure sensors, as described above, placed at different depths. Any sensor above the surface of the liquid will read the same ambient pressure. Sensors below the surface of the liquid will read pressures that increase linearly with depth. The position of the liquid surface can therefore be approximately identified as lying between the first sensor to read an above-ambient pressure and the next higher sensor. This level of precision would not in general be sufficient for most liquid-level monitoring applications; however, a much more precise determination of liquid level can be made by linear regression of the pressure readings from the sub-surface sensors. There are numerous advantages to this multi-sensor approach. First, linear regression over multiple sensors is inherently more accurate than using a single pressure reading to estimate depth. Second, common-mode temperature-induced wavelength shifts in the individual sensors are automatically compensated. Third, temperature-induced changes in the sensor pressure sensitivity are also compensated. Fourth, the approach provides the possibility to detect and compensate for malfunctioning sensors. Finally, the system is immune to changes in the density of the monitored fluid and even to changes in the effective force of gravity, as might be encountered in an aerospace application. The performance of an individual sensor was characterized and displays a sensitivity (54 pm/cm) enhanced by more than a factor of 2 compared to a sensor head configuration based on a silica FBG published in the literature, resulting from the much lower elastic modulus of POF. Furthermore, the temperature/humidity behavior and measurement resolution were also studied in detail. The proposed configuration also displays a highly linear response, high resolution and good repeatability. The results suggest the new configuration can be a useful tool in many different applications, such as aircraft fuel monitoring, and biochemical and environmental sensing, where accuracy and stability are fundamental.
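As an illustration of the regression step described above, the following Python sketch estimates the liquid level from pressure readings taken at known sensor heights; the sensor heights, noise threshold and units are hypothetical and not taken from the paper.

import numpy as np

def estimate_level(heights_cm, pressures, noise=0.5):
    """Estimate liquid level from pressure sensors mounted at known heights.

    heights_cm: sensor heights above the container bottom (cm).
    pressures:  corresponding pressure readings (arbitrary but common units).
    noise:      hypothetical threshold separating ambient from sub-surface readings.
    """
    h = np.asarray(heights_cm, dtype=float)
    p = np.asarray(pressures, dtype=float)

    # The highest sensor is assumed to sit above the liquid, so it reads ambient pressure.
    p_ambient = p[np.argmax(h)]

    # Sub-surface sensors read above ambient; hydrostatics gives
    # P = (P_amb + rho*g*L) - rho*g*h, a straight line in h, so the fitted
    # slope and intercept yield the level L without knowing density or gravity.
    submerged = p > p_ambient + noise
    slope, intercept = np.polyfit(h[submerged], p[submerged], 1)
    return (p_ambient - intercept) / slope

# Synthetic check: sensors every 10 cm, true level 27 cm, rho*g = 1 unit/cm.
heights = [0, 10, 20, 30, 40, 50]
readings = [127.0, 117.0, 107.0, 100.0, 100.0, 100.0]
print(estimate_level(heights, readings))   # ~27.0

Because the fluid density and local gravity enter only through the fitted slope, the estimate is insensitive to both, which is the immunity claimed in the abstract.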
Abstract:
In dimensional metrology, the largest source of measurement uncertainty is often thermal variation. Dimensional measurements are currently scaled linearly, using ambient temperature measurements and coefficients of thermal expansion, to the ideal metrology condition of 20 °C. This scaling is particularly difficult to implement with confidence in large volumes, as the temperature is unlikely to be uniform, resulting in thermal gradients. A number of well-established computational methods are used in the design phase of product development for the prediction of thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach to metrology which promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
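For reference, the linear scaling mentioned above is the standard thermal-expansion correction, written here in generic form (the paper's exact formulation is not given in the abstract):

L_{20} = \frac{L_T}{1 + \alpha\,(T - 20\,^{\circ}\mathrm{C})},

where L_T is the length measured at temperature T and \alpha is the coefficient of thermal expansion of the part. The difficulty in large-volume metrology is that a single measured T, and hence a single correction factor, cannot represent a non-uniform temperature field with gradients across the measurement volume.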
Abstract:
The integrability of the nonlinear Schrödinger equation (NLSE) by the inverse scattering transform, shown in a seminal work [1], gave an interesting opportunity to treat the corresponding nonlinear channel similarly to a linear one by using the nonlinear Fourier transform. The integrability of the NLSE is in the background of the old idea of eigenvalue communications [2], which was resurrected in recent works [3-7]. In [6, 7] a new method for coherent optical transmission employing the continuous nonlinear spectral data (nonlinear inverse synthesis) was introduced. It assumes the modulation and detection of data directly using the continuous part of the nonlinear spectrum associated with an integrable transmission channel (the NLSE in the case considered). Although such a transmission method is inherently free from nonlinear impairments, the noisy signal corruptions arising from amplifier spontaneous emission inevitably degrade the optical system performance. We study the properties of the noise-corrupted channel model in the nonlinear spectral domain attributed to the NLSE. We derive the general stochastic equations governing the signal evolution inside the nonlinear spectral domain and elucidate the properties of the emerging nonlinear spectral noise using well-established methods of perturbation theory based on the inverse scattering transform [8]. It is shown that in the presence of small noise the communication channel in the nonlinear domain is an additive Gaussian channel with memory and a signal-dependent correlation matrix. We demonstrate that the effective spectral noise acquires "colouring": its autocorrelation function becomes slowly decaying and non-diagonal as a function of "frequencies", and the noise loses its circular symmetry, becoming elliptically polarized. We then derive a lower bound for the spectral efficiency of such a channel. Our main result is that by using nonlinear spectral techniques one can significantly increase the achievable spectral efficiency compared to the currently available methods [9].
REFERENCES
1. Zakharov, V. E. and A. B. Shabat, Sov. Phys. JETP, Vol. 34, 62-69, 1972.
2. Hasegawa, A. and T. Nyu, J. Lightwave Technol., Vol. 11, 395-399, 1993.
3. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4312-4328, 2014.
4. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4329-4345, 2014.
5. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4346-4369, 2014.
6. Prilepsky, J. E., S. A. Derevyanko, K. J. Blow, I. Gabitov, and S. K. Turitsyn, Phys. Rev. Lett., Vol. 113, 013901, 2014.
7. Le, S. T., J. E. Prilepsky, and S. K. Turitsyn, Opt. Express, Vol. 22, 26720-26741, 2014.
8. Kaup, D. J. and A. C. Newell, Proc. R. Soc. Lond. A, Vol. 361, 413-446, 1978.
9. Essiambre, R.-J., G. Kramer, P. J. Winzer, G. J. Foschini, and B. Goebel, J. Lightwave Technol., Vol. 28, 662-701, 2010.
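For context, the noise-perturbed NLSE channel model discussed in the abstract above is commonly written in normalized units (the exact normalization used by the authors is an assumption here) as

i\,\frac{\partial q}{\partial z} + \frac{1}{2}\frac{\partial^2 q}{\partial t^2} + |q|^2 q = n(t, z),

where q(t, z) is the complex signal envelope, z the propagation distance, t the retarded time, and n(t, z) the amplifier spontaneous emission noise treated as a small perturbation. With n = 0 the equation is integrable [1], which is what allows the channel to be described exactly through its nonlinear (scattering) spectrum.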