15 results for Aragonite (integrated peak area)
in Aston University Research Archive
Abstract:
The aim of this study was to use the transformation of anionic to metathesis polymerization to produce block co-polymers of styrene-b-pentenylene using WCl6/PStLi and WCl6/PStLi/AlEtCl2 catalyst systems. Analysis of the products using SEC and 1H and 13C NMR spectroscopy enabled mechanisms for metathesis initiation reactions to be proposed. The initial work involved preparation of the constituent homo-polymers. Solutions of polystyryllithium in cyclohexane were prepared and diluted so that [PStLi]0 < 2 × 10⁻³ M. The dilution produced an initial rapid decay of the active species, followed by slower spontaneous decay over a period of days. This was investigated using UV/visible spectrophotometry, and the wavelength of maximum absorbance of the PStLi was found to change with the decay from an initial value of 328 nm to a λmax of approximately 340 nm after 4-7 days. SEC analysis of solutions of polystyrene, using RI and UV/visible (set at 254 nm) detectors, showed that the UV:RI peak area ratio was constant for a range of polystyrene samples of different molecular weight. Samples of polypentenylene were prepared and analysed using SEC. Unexpectedly, these solutions showed an absorbance at 254 nm, which had to be taken into account when this technique was subsequently used to analyse polymer samples to determine their styrene/pentenylene co-polymer composition. Cyclohexane was found to be a poor solvent for these ring-opening metathesis polymerizations of cyclopentene. Attempts to produce styrene-b-pentenylene block co-polymers, using a range of co-catalyst systems, were generally unsuccessful, as the products were shown to be mainly homopolymers. The character of the polymers did suggest that several catalytic species are present in these systems, and mechanisms have been suggested for the formation of initiating carbenes. Evidence of some low molecular weight product with co-polymer character has been obtained. Further investigation indicated that this is most likely to be ABA block copolymer, which led to a mechanism being proposed for the termination of the polymerization.
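A minimal sketch of why the polypentenylene absorbance at 254 nm matters for the dual-detector composition analysis: once both monomers contribute to both detector responses, a small 2x2 linear system must be solved. All response factors and peak areas below are hypothetical placeholders, not values from the study.

    # Sketch: estimating styrene/pentenylene co-polymer composition from SEC peak areas
    # measured with RI and UV (254 nm) detectors. The calibration constants are
    # hypothetical; the point is only that both monomers contribute to both signals.
    def composition(ri_area, uv_area, k_ri_st, k_ri_pe, k_uv_st, k_uv_pe):
        """Return the styrene weight fraction from RI and UV peak areas, where
        ri_area = k_ri_st*m_st + k_ri_pe*m_pe and uv_area = k_uv_st*m_st + k_uv_pe*m_pe."""
        det = k_ri_st * k_uv_pe - k_ri_pe * k_uv_st
        m_st = (ri_area * k_uv_pe - uv_area * k_ri_pe) / det
        m_pe = (uv_area * k_ri_st - ri_area * k_uv_st) / det
        return m_st / (m_st + m_pe)

    # Example with made-up response factors (styrene absorbs strongly at 254 nm,
    # pentenylene weakly but not negligibly):
    print(composition(ri_area=100.0, uv_area=40.0,
                      k_ri_st=1.0, k_ri_pe=1.0, k_uv_st=0.9, k_uv_pe=0.1))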
Abstract:
Due to its wide applicability and ease of use, the analytic hierarchy process (AHP) has been studied extensively for the last 20 years. Recently, it has been observed that the focus has shifted to applications of integrated AHPs rather than the stand-alone AHP. The five tools most commonly combined with the AHP are mathematical programming, quality function deployment (QFD), meta-heuristics, SWOT analysis, and data envelopment analysis (DEA). This paper reviews the literature on applications of the integrated AHPs. Related articles appearing in international journals from 1997 to 2006 are gathered and analysed so that the following three questions can be answered: (i) which type of integrated AHP has received the most attention? (ii) in which areas have the integrated AHPs been most prevalently applied? (iii) are there any inadequacies in the approaches? Based on any inadequacies identified, improvements and possible future work are recommended. This research not only provides evidence that the integrated AHPs perform better than the stand-alone AHP, but also aids researchers and decision makers in applying the integrated AHPs effectively.
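For readers unfamiliar with the method being integrated, the core AHP step derives priority weights from a pairwise comparison matrix and checks its consistency. The sketch below is a generic textbook-style illustration of that step (principal-eigenvector weights and Saaty's consistency ratio), not a technique taken from the reviewed paper.

    # Generic sketch of the core AHP step: priority weights from a pairwise comparison
    # matrix via the principal eigenvector, plus Saaty's consistency ratio.
    import numpy as np

    def ahp_weights(matrix):
        a = np.asarray(matrix, dtype=float)
        eigvals, eigvecs = np.linalg.eig(a)
        k = np.argmax(eigvals.real)                 # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                # normalised priority weights
        n = a.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)        # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.32)  # Saaty's random index
        return w, ci / ri                           # weights, consistency ratio (< 0.1 is acceptable)

    # Example: three criteria compared pairwise on Saaty's 1-9 scale.
    weights, cr = ahp_weights([[1, 3, 5],
                               [1/3, 1, 2],
                               [1/5, 1/2, 1]])
    print(weights, cr)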
Abstract:
Communication in Forensic Contexts provides in-depth coverage of the complex area of communication in forensic situations. Drawing on expertise from forensic psychology, linguistics and law enforcement worldwide, the text bridges the gap between these fields in a definitive guide to best practice.
• Offers best practice for understanding and improving communication in forensic contexts, including interviewing of victims, witnesses and suspects, discourse in courtrooms, and discourse via interpreters
• Bridges the knowledge gaps between forensic psychology, forensic linguistics and law enforcement, with chapters written by teams bringing together expertise from each field
• Published in collaboration with the International Investigative Interviewing Research Group, dedicated to furthering evidence-based practice and practice-based research amongst researchers and practitioners
• International, cross-disciplinary team includes contributors from North America, Europe and Asia Pacific, and from psychology, linguistics and forensic practice
Abstract:
B-ISDN is a universal network which supports diverse mixes of services, applications and traffic. ATM has been accepted world-wide as the transport technique for future use in B-ISDN. ATM, being a simple packet-oriented transfer technique, provides a flexible means for supporting a continuum of transport rates and is efficient due to the possible statistical sharing of network resources by multiple users. In order to fully exploit the potential statistical gain, while at the same time supporting diverse service and traffic mixes, an efficient traffic control must be designed. Traffic controls, which include congestion and flow control, are a fundamental necessity to the success and viability of future B-ISDN. Congestion and flow control are difficult in the broadband environment due to the high-speed links, the wide-area distances, diverse service requirements and diverse traffic characteristics. Most congestion and flow control approaches in conventional packet-switched networks are reactive in nature and are not applicable in the B-ISDN environment. In this research, traffic control procedures mainly based on preventive measures for a private ATM-based network are proposed and their performance evaluated. The various traffic controls include connection admission control (CAC), traffic flow enforcement, priority control and an explicit feedback mechanism. These functions operate at call level and cell level. They are carried out distributively by the end terminals, the network access points and the internal elements of the network. During the connection set-up phase, the CAC decides the acceptance or denial of a connection request and allocates bandwidth to the new connection according to three schemes: peak bit rate, statistical rate and average bit rate. The statistical multiplexing rate is based on a 'bufferless fluid flow model' which is simple and robust. The allocation of an average bit rate to data traffic, at the expense of delay, improves the network bandwidth utilisation.
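As an illustration of the admission decision made at connection set-up, the sketch below compares the three bandwidth allocation schemes named in the abstract. The simple mean-plus-one-standard-deviation allocation for an on/off source stands in for the thesis's bufferless fluid-flow model and is an assumption made here for illustration, as are all the numbers.

    # Illustrative connection admission control (CAC) decision at call set-up.
    def allocated_rate(peak, mean, scheme):
        if scheme == "peak":
            return peak
        if scheme == "average":
            return mean
        if scheme == "statistical":
            p = mean / peak                       # activity factor of an on/off source
            std = peak * (p * (1 - p)) ** 0.5     # per-source rate standard deviation
            return min(peak, mean + std)          # mean plus a safety margin, capped at peak
        raise ValueError(scheme)

    def admit(link_capacity, allocated_so_far, peak, mean, scheme):
        """Accept the new connection only if its allocation still fits on the link."""
        return allocated_so_far + allocated_rate(peak, mean, scheme) <= link_capacity

    # Example: 108 Mbit/s link with 100 Mbit/s already allocated; new source with
    # 10 Mbit/s peak and 2 Mbit/s mean rate.
    for scheme in ("peak", "statistical", "average"):
        print(scheme, admit(108e6, 100e6, 10e6, 2e6, scheme))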
Abstract:
Computer integrated manufacture has brought about great advances in manufacturing technology and its recognition is world-wide. Cold roll forming of thin-walled sections, and in particular the design and manufacture of form-rolls, the special tooling used in the cold roll forming process, is but one such area where computer integrated manufacture can make a positive contribution. The work reported in this thesis, concerned with the development of an integrated manufacturing system for assisting the design and manufacture of form-rolls, was undertaken in collaboration with a leading manufacturer of thin-walled sections. A suite of computer programs, written in FORTRAN 77, has been developed to provide computer aids for every aspect of work in form-roll design and manufacture, including cost estimation and stock control aids. The first phase of the development programme dealt with the establishment of CAD facilities for form-roll design, comprising the design of the finished section, the flower pattern, the roll design and the interactive roll editor program. Concerning the CAM facilities, dealt with in the second phase, an expert system roll machining processor and a general post-processor have been developed for considering the roll geometry and automatically generating NC tape programs for any required CNC lathe system. These programs have been successfully implemented, as an integrated manufacturing software system, on the VAX 11/750 super-minicomputer with graphics facilities for displaying drawings interactively on the terminal screen. The development of the integrated system has been found beneficial in all aspects of form-roll design and manufacture. Design and manufacturing lead times have been reduced by several weeks, quality has improved considerably and productivity has increased. The work has also demonstrated the promising nature of the expert systems approach to computer integrated manufacture.
Abstract:
Computer integrated monitoring is a very large area in engineering where on-line, real-time data acquisition with the aid of sensors is the solution to many problems in the manufacturing industry, as opposed to the old method of data logging followed by graphical analysis. The raw data collected in this way, however, is useless in the absence of a proper computerized management system. The transfer of data between the management and the shop-floor processes has been impossible in the past unless all the computers in the system were totally compatible with each other. This limits the efficiency of such systems because they become governed by the limitations of the computers. General Motors of the U.S.A. has recently started research on a new standard called the Manufacturing Automation Protocol (MAP), which is expected to allow data transfer between different types of computers. This is still in the early stages of development and is also currently very expensive. This research programme shows how such a shop-floor data acquisition system and a complete management system on entirely different computers can be integrated together to form a single system, achieving data transfer communications using a cheaper but superior alternative to MAP. Standard communication character sets and hardware, such as ASCII and UARTs, have been used in this method, but the technique is so powerful that totally incompatible computers are shown to run different programs (in different languages) simultaneously and yet receive data from each other and process it in their own CPUs with no human intervention.
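Because the abstract names only ASCII and UARTs as the building blocks, the sketch below illustrates the general idea of exchanging data between otherwise incompatible machines over a serial link using a plain ASCII line protocol with a checksum. The frame layout, field names and checksum choice are assumptions made for illustration and are not the thesis's actual protocol.

    # Sketch of a plain ASCII framing scheme: since every machine agrees on ASCII,
    # a reading such as "STATION=3,TEMP=71.5" can be passed between dissimilar
    # computers over a UART serial link.
    def encode_frame(payload: str) -> bytes:
        """Wrap an ASCII payload as STX <payload> '*' <2-hex checksum> ETX."""
        checksum = 0
        for ch in payload:
            checksum ^= ord(ch)                      # simple XOR checksum
        return b"\x02" + payload.encode("ascii") + f"*{checksum:02X}".encode("ascii") + b"\x03"

    def decode_frame(frame: bytes) -> str:
        """Strip framing, verify the checksum and return the ASCII payload."""
        if not (frame.startswith(b"\x02") and frame.endswith(b"\x03")):
            raise ValueError("missing STX/ETX")
        body, _, received = frame[1:-1].rpartition(b"*")
        payload = body.decode("ascii")
        checksum = 0
        for ch in payload:
            checksum ^= ord(ch)
        if int(received, 16) != checksum:
            raise ValueError("checksum mismatch")
        return payload

    # The encoded bytes would be written to, and read back from, a UART on each machine.
    frame = encode_frame("STATION=3,TEMP=71.5")
    print(frame, decode_frame(frame))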
Abstract:
This doctoral thesis originates from an observed incongruence between the perennial aims and aspirations of economic endeavour and actually recorded outcomes, which frequently seem contrary to those intended and of a recurrent, cyclical type. The research hypothesizes a parallel movement between unstable business environments through time, as expressed by periodically fluctuating levels of economic activity, and the precipitation rates of industrial production companies. A major problem arose from the need to provide theoretical and empirical cohesion from the conflicting, partial and fragmented interpretations of several hundred historians and economists, without which the research question would remain unanswerable. An attempt to discover a master cycle, or superimposition theorem, failed, but was replaced by minute analysis of both the concept of cycles and their underlying data bases. A novel technique of congregational analysis emerged, resulting in an integrated matrix of numerical history. Two centuries of industrial revolution history in England and Wales were then explored and recomposed for the first time in a single account of change, thereby providing a factual basis for the matrix. The accompanying history of the Birmingham area provided the context for research into the failure rates and longevities of firms in the city's staple metal industries. Sample-specific results are obtained for company longevities in the Birmingham area. Some novel presentational forms are deployed for the results of a postal questionnaire to surviving firms. Practical demonstration of the new index of national economic activity (INEA) in relation to company insolvencies leads to conclusions and suggestions for further applications of research into the tempo of change. Substantial appendices support the thesis and provide a compendium of information covering immediately contiguous domains.
Abstract:
In the area of international environmental law, this thesis proposes the formulation of one-step planning and permitting regulation for the integrated utilisation of new surface mines as depositories for municipal solid waste. Additionally, the utilisation of abandoned and currently operated surface mines as solid waste landfills is proposed as an integral step in their reclamation. Existing laws, litigation and issues in the United Kingdom, the U.S. and Canada are discussed because of their common legal system, language and heritage. The critical shortage of approved space for the disposal of solid waste has caused an urgent and growing problem for both the waste disposal industry and society. Surface mining can serve three important environmental and societal functions inuring to the health and welfare of the public: (1) providing basic minerals for goods and construction; (2) sequentially, providing critically needed, safe burial sites for society's wastes; and (3) conserving land by dual-purpose use and restoring derelict land to beneficial surface use. Currently, the first two functions are treated environmentally, and in regulation, as two different siting problems, yet both are earth-disturbing and excavating industries requiring surface restoration. The processes are largely duplicative and should be combined for better efficiency, less earth disturbance, conservation of land, and fuller and better reclamation of completed surface mines, returning the surfaces to greater utility than present mined-land reclamation procedures. While both industries are viewed by a developed society and its communities as "bad neighbours", they remain essential and critical for mankind's existence and welfare. The study offers successful examples of the integrated process in each country. The study argues that most non-fuel surface mine openings, if not already safe, can economically, through present containment technology, be made environmentally safe for use as solid waste landfills. Simultaneously, the procedure safeguards and monitors the protection of ground and surface waters from landfill contamination.
Abstract:
This thesis describes the geology, geochemistry and mineralogy of a Lower Proterozoic, metamorphosed volcanogenic Cu-Zn deposit, situated at the western end of the Flin Flon greenstone belt. Stratabound copper mineralisation occurs in silicified and chloritoid-bearing alteration assemblages within felsic tuffs and is mantled by thin (< 3 m) high-grade sphalerite layers. Mineralisation is underlain by garnet-hornblende bearing Lower Iron Formation (LIF), and overlain by garnet-grunerite bearing Upper Iron Formation (UIF). Distinctive trace element trends, involving Ti and Zr, in mineralised and footwall felsic tuffs are interpreted to have formed by fractionation associated with a high-level magma chamber in a caldera-type environment. Discrimination diagrams for basaltic rocks are interpreted to indicate their formation in an environment similar to that of recent, primitive, tholeiitic island arcs. Microprobe studies of key mineral phases demonstrate large- and small-scale chemical variations in silicate phases related to primary lithological, rather than metamorphic, controls. LIF is characterised by alumino-ferro-tschermakite and relatively Mn-poor, Ca-rich garnets, whereas UIF contains manganoan grunerite and Mn-rich garnets. Metamorphic mineral reactions are considered and possible precursor assemblages identified for garnet- and chloritoid-bearing rocks. Chloritoid-bearing rocks are interpreted as the metamorphosed equivalents of iron-rich feeder zones formed near the surface. The iron-formations are thought to represent iron-rich sediments formed on the sea floor from the venting of the ore fluids. Consideration of various mineral assemblages leads to an estimate for peak metamorphic conditions of 450-500 °C and > 4 kbar total pressure. Comparisons with other volcanogenic deposits indicate affinities with deposits of 'Mattabi-type' from the Archean of Ontario. An extrapolation of the main conclusions of the thesis to adjacent areas points to the presence of a number of geologically similar localities with potential for mineralisation.
Abstract:
The development of a system that integrates reverse osmosis (RO) with a horticultural greenhouse has been advanced through laboratory experiments. In this concept, intended for the inland desalination of brackish groundwater in dry areas, the RO concentrate will be reduced in volume by passing it through the evaporative cooling pads of the greenhouse. The system will be powered by solar photovoltaics (PV). Using a solar array simulator, we have verified that the RO can operate with varying power input and recovery rates to meet the water demands for irrigation and cooling of a greenhouse in north-west India. Cooling requires ventilation by a fan which has also been built, tested and optimised with a PV module outdoors. Results from the experiments with these two subsystems (RO and fan) are compared to theoretical predictions to reach conclusions about energy usage, sizing and cost. For example, the optimal sizing for the RO system is 0.12–1.3 m² of PV module per m² of membrane, depending on feed salinity. For the fan, the PV module area equals that of the fan aperture. The fan consumes <30 J of electrical energy per m³ of air moved, which is 3 times less than that of standard fans. The specific energy consumption of the RO, at 1–2.3 kWh m⁻³, is comparable to that reported by others. Now that the subsystems have been verified, the next step will be to integrate and test the whole system in the field.
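A back-of-envelope calculation shows how PV area, solar yield and the quoted specific energy consumption combine to give daily permeate output. The array size, insolation and efficiency below are assumed round numbers for a sunny inland site, not values from the paper; only the 1–2.3 kWh m⁻³ range comes from the abstract.

    # Rough link between PV area and daily RO permeate output.
    pv_area_m2 = 10.0             # assumed PV array area
    insolation_kwh_m2_day = 6.0   # assumed daily solar energy on the array plane
    pv_efficiency = 0.15          # assumed module + system efficiency

    daily_energy_kwh = pv_area_m2 * insolation_kwh_m2_day * pv_efficiency

    for sec_kwh_m3 in (1.0, 2.3):  # specific energy consumption range from the abstract
        permeate_m3_day = daily_energy_kwh / sec_kwh_m3
        print(f"SEC {sec_kwh_m3} kWh/m3 -> about {permeate_m3_day:.1f} m3 of permeate per day")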
Abstract:
Fiber Bragg gratings can be used for monitoring different parameters in a wide variety of materials and constructions. The interrogation of fiber Bragg gratings traditionally relies on an expensive and bulky peak-tracking or spectrum-analyzing unit which needs to be deployed outside the monitored structure. We present a dynamic low-cost interrogation system for fiber Bragg gratings which can be integrated with the fiber itself, limiting the fragile optical in- and outcoupling interfaces and providing a compact, unobtrusive driving and read-out unit. The reported system is based on an embedded Vertical Cavity Surface Emitting Laser (VCSEL) which is tuned dynamically at 1 kHz and an embedded photodiode. Fiber coupling is provided through a dedicated 45° micromirror yielding a 90° in-the-plane coupling and limiting the total thickness of the fiber-coupled optoelectronic package to 550 µm. The red-shift of the VCSEL wavelength provides a full reconstruction of the spectrum over a range of 2.5 nm. A few-mode fiber with fiber Bragg gratings at 850 nm is used to prove the feasibility of this low-cost and ultra-compact interrogation approach.
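The principle behind this interrogation approach is that sweeping the VCSEL wavelength while recording the photodiode signal traces out the grating's reflection spectrum, from which the Bragg wavelength is located. The sketch below illustrates that idea with a simple centroid peak estimator; the estimator choice and all numbers are assumptions for illustration, not the paper's signal processing.

    # Sketch of the interrogation principle: a 2.5 nm wavelength sweep around 850 nm
    # samples the FBG reflection spectrum; the Bragg wavelength is taken from the peak.
    import numpy as np

    def bragg_wavelength(wavelengths_nm, photodiode_samples, threshold=0.5):
        """Centroid of the reflection peak above a fractional threshold."""
        wl = np.asarray(wavelengths_nm, dtype=float)
        samples = np.asarray(photodiode_samples, dtype=float)
        samples -= samples.min()                         # remove background level
        mask = samples >= threshold * samples.max()      # keep only the main peak
        return np.sum(wl[mask] * samples[mask]) / np.sum(samples[mask])

    # Simulated sweep: Gaussian-shaped FBG reflection near 850 nm plus a little noise.
    wl = np.linspace(848.75, 851.25, 200)
    reflection = np.exp(-0.5 * ((wl - 850.12) / 0.08) ** 2) + 0.02 * np.random.rand(wl.size)
    print(f"estimated Bragg wavelength: {bragg_wavelength(wl, reflection):.3f} nm")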
Abstract:
To extend our understanding of the early visual hierarchy, we investigated the long-range integration of first- and second-order signals in spatial vision. In our first experiment we performed a conventional area summation experiment where we varied the diameter of (a) luminance-modulated (LM) noise and (b) contrast-modulated (CM) noise. Results from the LM condition replicated previous findings with sine-wave gratings in the absence of noise, consistent with long-range integration of signal contrast over space. For CM, the summation function was much shallower than for LM, suggesting, at first glance, that the signal integration process was spatially less extensive than for LM. However, an alternative possibility was that the high spatial frequency noise carrier for the CM signal was attenuated by the peripheral retina (or cortex), thereby impeding our ability to observe area summation of CM in the conventional way. To test this, we developed the "Swiss cheese" stimulus of Meese and Summers (2007), in which signal area can be varied without changing the stimulus diameter, providing some protection against inhomogeneity of the retinal field. Using this technique and a two-component subthreshold summation paradigm we found that (a) CM is spatially integrated over at least five stimulus cycles (possibly more), (b) spatial integration follows square-law signal transduction for both LM and CM and (c) the summing device integrates over spatially interdigitated LM and CM signals when they are co-oriented, but not when cross-oriented. The spatial pooling mechanism that we have identified would be a good candidate component for a module involved in representing visual textures, including their spatial extent.
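As a generic illustration of what "square-law signal transduction" implies for two-component subthreshold summation, the sketch below works through the textbook prediction: squaring each component's contrast before linear pooling predicts about 3 dB of summation for two equal components, compared with 6 dB for linear transduction. This is a standard calculation offered for orientation, not the paper's model fit.

    # Predicted summation for n equal components pooled linearly after transduction
    # with a given exponent (exponent 2 = square law).
    import math

    def summation_db(n_components, exponent=2.0):
        """Threshold improvement (in dB of contrast) for n equal components."""
        # criterion = n * (c / c0)^exponent  =>  c = c0 * n**(-1/exponent)
        factor = n_components ** (-1.0 / exponent)
        return -20.0 * math.log10(factor)

    print(f"square-law (exponent 2): {summation_db(2):.1f} dB")       # ~3.0 dB
    print(f"linear (exponent 1):     {summation_db(2, 1.0):.1f} dB")  # ~6.0 dB for comparison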
Abstract:
Discrepancies of materials, tools, and factory environments, as well as human intervention, make variation an integral part of the manufacturing process of any component. In particular, the assembly of large-volume aerospace parts is an area where significant levels of form and dimensional variation are encountered. Corrective actions can usually be taken to reduce the defects when the sources and levels of variation are known. For the unknown dimensional and form variations, a tolerancing strategy is typically put in place in order to minimize the effects of production inconsistencies related to geometric dimensions. This generates a challenging problem for the automation of the corresponding manufacturing and assembly processes. Metrology is becoming a major contributor to the ability to predict, in real time, the automated assembly problems related to the dimensional variation of parts and assemblies. This is done by continuously measuring dimensions and coordinate points, focusing on the product's key characteristics. In this paper, a number of metrology-focused activities for large-volume aerospace products, including their implementation and application in the automation of manufacturing and assembly processes, are reviewed. This is done by using a case study approach within the assembly of large-volume aircraft wing structures.