900 results for LARGE-AREA TELESCOPE


Relevance: 30.00%

Abstract:

This thesis describes the geology, geochemistry and mineralogy of a Lower Proterozoic, metamorphosed volcanogenic Cu-Zn deposit situated at the western end of the Flin Flon greenstone belt. Stratabound copper mineralisation occurs in silicified and chloritoid-bearing alteration assemblages within felsic tuffs and is mantled by thin (<3 m) high-grade sphalerite layers. Mineralisation is underlain by garnet-hornblende-bearing Lower Iron Formation (LIF) and overlain by garnet-grunerite-bearing Upper Iron Formation (UIF). Distinctive trace element trends, involving Ti and Zr, in mineralised and footwall felsic tuffs are interpreted to have formed by fractionation associated with a high-level magma chamber in a caldera-type environment. Discrimination diagrams for basaltic rocks are interpreted to indicate their formation in an environment similar to that of recent, primitive, tholeiitic island arcs. Microprobe studies of key mineral phases demonstrate large- and small-scale chemical variations in silicate phases related to primary lithological, rather than metamorphic, controls. LIF is characterised by alumino-ferro-tschermakite and relatively Mn-poor, Ca-rich garnets, whereas UIF contains manganoan grunerite and Mn-rich garnets. Metamorphic mineral reactions are considered and possible precursor assemblages identified for garnet- and chloritoid-bearing rocks. Chloritoid-bearing rocks are interpreted as the metamorphosed equivalents of iron-rich feeder zones formed near the surface. The iron-formations are thought to represent iron-rich sediments deposited on the sea floor from the venting of the ore fluids. Consideration of various mineral assemblages leads to an estimate for peak metamorphic conditions of 450–500 °C and >4 kbar total pressure. Comparisons with other volcanogenic deposits indicate affinities with deposits of 'Mattabi-type' from the Archean of Ontario.
An extrapolation of the main conclusions of the thesis to adjacent areas points to the presence of a number of geologically similar localities with potential for mineralisation.

Relevance: 30.00%

Abstract:

This work is the result of an action-research-type study of the diversification effort of part of a major U.K. industrial company. Work in contingency theory concerning the impact of environmental factors on organizational design, and the systemic model of viable systems put forward by Stafford Beer, form the theoretical basis of the work. The two streams of thought are compared and found to offer similar conclusions about the design of effective organizations. These findings are taken as the framework for an analysis both of organization structures for promoting innovation described in the literature, and of those employed by the company for this purpose in recent years. Much attention is given to the use of venture groups, and conclusions are drawn on particular factors which may influence their success or failure. Both theoretical considerations and the examination of the company's recent experience suggested that the formation of the policy of diversification, as well as the method of implementation of the policy, might affect its outcome. Attention is therefore focused on the policy-making and planning process, and in particular on possible problems that this process could generate in a multi-division company. The view finally taken of the diversification effort is that it should be regarded as a learning system. This view helps to expose some ambiguities in the concepts of success and failure in this area, and demonstrates considerable weaknesses in traditional project evaluation procedures.

Relevance: 30.00%

Abstract:

An equivalent step index fibre with a silica core and air cladding is used to model photonic crystal fibres with large air holes. We model this fibre for linear polarisation (we focus on the lowest few transverse modes of the electromagnetic field). The equivalent step index radius is obtained by equating the lowest two eigenvalues of the model to those calculated numerically for the photonic crystal fibres. The step index parameters thus obtained can then be used to calculate nonlinear parameters like the nonlinear effective area of a photonic crystal fibre or to model nonlinear few-mode interactions using an existing model.

Relevance: 30.00%

Abstract:

In this paper, new architectural approaches that improve the energy efficiency of a cellular radio access network (RAN) are investigated. The aim of the paper is to characterize both the energy consumption ratio (ECR) and the energy consumption gain (ECG) of a cellular RAN when the cell size is reduced for a given user density and service area. The paper shows that reducing the cell size reduces the cell ECR as desired while increasing the capacity density, but the overall RAN energy consumption remains unchanged. In order to trade the increase in capacity density for a reduction in RAN energy consumption, without degrading the cell capacity provision, a sleep mode is introduced. In sleep mode, cells without active users are powered off, thereby saving energy. By combining a sleep mode with a small-cell deployment architecture, the paper shows that the ECG can be increased by a factor n = (R/r), where R and r denote the original and reduced cell radii, while the cell ECR continues to decrease with decreasing cell size.
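As a numerical illustration of these metrics, a minimal sketch follows. All power and capacity figures, the cell count, and the active fraction are made-up values, not numbers from the paper:

```python
# Illustrative sketch (hypothetical numbers): energy consumption ratio (ECR)
# and energy consumption gain (ECG) for a cellular deployment with and
# without a sleep mode that powers off cells with no active users.

def ecr(power_w, capacity_bps):
    """Energy consumption ratio: power per unit capacity (J/bit)."""
    return power_w / capacity_bps

def ecg(reference_power_w, test_power_w):
    """Energy consumption gain of a test deployment over a reference."""
    return reference_power_w / test_power_w

# Reference: one macro cell; test: 16 small cells covering the same area,
# each with lower power (assumed values throughout).
macro_power, macro_capacity = 1500.0, 100e6   # W, bit/s
small_power, small_capacity = 120.0, 100e6    # W, bit/s per small cell
n_cells = 16

# Without sleep mode every small cell is always on, so total RAN power
# grows with the cell count even though the per-cell ECR has dropped.
always_on_power = n_cells * small_power

# With sleep mode only cells currently serving users consume power;
# here we assume on average 25% of the small cells are active.
active_fraction = 0.25
sleep_power = n_cells * small_power * active_fraction

print(ecr(macro_power, macro_capacity))   # macro-cell ECR
print(ecr(small_power, small_capacity))   # small-cell ECR (lower)
print(ecg(macro_power, always_on_power))  # ECG without sleep mode (< 1)
print(ecg(macro_power, sleep_power))      # ECG with sleep mode (> 1)
```

With these assumed numbers the always-on small-cell RAN actually consumes more than the macro cell (ECG below one), and only the sleep mode turns densification into a net energy gain, which mirrors the trade-off the abstract describes.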

Relevance: 30.00%

Abstract:

Calibration of stochastic traffic microsimulation models is a challenging task. This paper proposes a fast iterative probabilistic precalibration framework and demonstrates how it can be successfully applied to a real-world traffic simulation model of a section of the M40 motorway and its surrounding area in the U.K. The efficiency of the method stems from the use of emulators of the stochastic microsimulator, which provide fast surrogates of the traffic model. The use of emulators minimizes the number of microsimulator runs required, and the emulators' probabilistic construction allows for the consideration of the extra uncertainty introduced by the approximation. It is shown that automatic precalibration of this real-world microsimulator, using turn-count observational data, is possible, considering all parameters at once, and that this precalibrated microsimulator improves on the fit to observations compared with the traditional expertly tuned microsimulation. © 2000-2011 IEEE.
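The emulator idea can be shown with a toy sketch. Everything here is hypothetical: the paper builds probabilistic emulators of a real microsimulator, whereas below a noisy one-parameter function stands in for the simulator and a simple least-squares quadratic stands in for the emulator:

```python
import random

# Toy sketch of emulator-based precalibration (not the authors' code):
# the "stochastic microsimulator" is expensive, so we fit a cheap
# quadratic emulator to a handful of runs and search the emulator instead.

random.seed(0)

def microsimulator(theta):
    """Stand-in for an expensive stochastic traffic model: returns a
    noisy discrepancy between simulated and observed counts (made up)."""
    return (theta - 2.5) ** 2 + random.gauss(0.0, 0.05)

# 1. Run the simulator at a small design of parameter values.
design = [0.0, 1.0, 2.0, 3.0, 4.0]
runs = [(t, microsimulator(t)) for t in design]

# 2. Fit a quadratic emulator y ~ a*t^2 + b*t + c by least squares
#    (normal equations solved with a tiny Gauss-Jordan elimination).
def fit_quadratic(points):
    sx = [sum(t ** k for t, _ in points) for k in range(5)]
    sy = [sum(y * t ** k for t, y in points) for k in range(3)]
    A = [[sx[4], sx[3], sx[2], sy[2]],
         [sx[3], sx[2], sx[1], sy[1]],
         [sx[2], sx[1], sx[0], sy[0]]]
    for i in range(3):
        p = A[i][i]
        A[i] = [v / p for v in A[i]]
        for j in range(3):
            if j != i:
                f = A[j][i]
                A[j] = [vj - f * vi for vj, vi in zip(A[j], A[i])]
    return A[0][3], A[1][3], A[2][3]

a, b, c = fit_quadratic(runs)

# 3. Minimise the emulator analytically instead of re-running the
#    simulator: the vertex of a*t^2 + b*t + c.
theta_precal = -b / (2 * a)
print(theta_precal)   # close to the true optimum 2.5
```

Five simulator runs replace what would otherwise be a dense parameter sweep; a probabilistic emulator would additionally quantify how much the surrogate's approximation error widens the calibration uncertainty.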

Relevance: 30.00%

Abstract:

Large-scale massively parallel molecular dynamics (MD) simulations of the human class I major histocompatibility complex (MHC) protein HLA-A*0201 bound to a decameric tumor-specific antigenic peptide GVYDGREHTV were performed using a scalable MD code on high-performance computing platforms. Such computational capabilities put us in reach of simulations of various scales and complexities. The supercomputing resources available for this study allow us to compare directly differences in the behavior of very large molecular models; in this case, the entire extracellular portion of the peptide–MHC complex vs. the isolated peptide binding domain. Comparison of the results from the partial and the whole system simulations indicates that the peptide is less tightly bound in the partial system than in the whole system. From a detailed study of conformations, solvent-accessible surface area, the nature of the water network structure, and the binding energies, we conclude that, when considering the conformation of the α1–α2 domain, the α3 and β2m domains cannot be neglected. © 2004 Wiley Periodicals, Inc. J Comput Chem 25: 1803–1813, 2004

Relevance: 30.00%

Abstract:

Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). For a large-scale data set, especially one with negative measures, DEA inevitably demands substantial computer resources in terms of memory and CPU time. In recent years, a wide range of studies has been conducted in the area of combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has some computational advantages over the corresponding DEA models; therefore, it can be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
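A minimal sketch of the general idea follows. The DMU data (including a negative measure) and the efficiency targets are invented, and the one-hidden-layer architecture is an assumption for illustration, not the paper's network:

```python
import math
import random

# Minimal sketch (not the paper's network): a one-hidden-layer
# feed-forward network trained to approximate precomputed efficiency
# scores of DMUs. The data below, including a negative measure, are
# made up; in practice the targets would come from a DEA model.

random.seed(1)

# (input measures, efficiency score in (0, 1]) per DMU -- hypothetical.
data = [([2.0, -1.0], 0.9), ([3.0, -0.5], 0.7),
        ([1.0, -2.0], 1.0), ([4.0, -0.2], 0.5)]

H = 4                                           # hidden units
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    """Forward pass; returns (predicted efficiency, hidden activations)."""
    h = [sigmoid(sum(wi * xi for wi, xi in zip(w1[j], x)) + b1[j])
         for j in range(H)]
    return sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2), h

# Plain stochastic gradient descent on squared error (backpropagation).
lr = 0.5
for _ in range(2000):
    for x, y in data:
        yhat, h = forward(x)
        d_out = (yhat - y) * yhat * (1 - yhat)
        for j in range(H):
            d_h = d_out * w2[j] * h[j] * (1 - h[j])
            w2[j] -= lr * d_out * h[j]
            for i in range(2):
                w1[j][i] -= lr * d_h * x[i]
            b1[j] -= lr * d_h
        b2 -= lr * d_out

mse = sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)
print(mse)   # small after training
```

Once trained, scoring a new DMU is a single forward pass, which is where the computational advantage over re-solving a DEA linear programme per unit would come from; negative measures pose no special difficulty for the network's inputs.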

Relevance: 30.00%

Abstract:

We consider the problems of finding two optimal triangulations of a convex polygon: MaxMin area and MinMax area. These are the triangulations that maximize the area of the smallest area triangle in a triangulation, and respectively minimize the area of the largest area triangle in a triangulation, over all possible triangulations. The problem was originally solved by Klincsek by dynamic programming in cubic time [2]. Later, Keil and Vassilev devised an algorithm that runs in O(n^2 log n) time [1]. In this paper we describe new geometric findings on the structure of MaxMin and MinMax Area triangulations of convex polygons in two dimensions and their algorithmic implications. We improve the algorithm’s running time to quadratic for large classes of convex polygons. We also present experimental results on MaxMin area triangulation.
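The cubic-time dynamic programme in the spirit of Klincsek's solution can be sketched as follows; this illustrates the MaxMin variant only, not the authors' improved quadratic algorithm:

```python
# Cubic-time dynamic programming for the MaxMin area triangulation of a
# convex polygon: dp[i][j] is the largest achievable minimum-triangle
# area over all triangulations of the convex sub-polygon p_i ... p_j.

def triangle_area(a, b, c):
    """Area of triangle abc via the cross product."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def maxmin_area(poly):
    """MaxMin-area triangulation value of a convex polygon (vertex list
    in convex position, in order)."""
    n = len(poly)
    INF = float("inf")
    # Sub-polygons with fewer than 3 vertices impose no constraint.
    dp = [[INF] * n for _ in range(n)]
    for gap in range(2, n):                  # sub-polygon size grows
        for i in range(n - gap):
            j = i + gap
            best = 0.0
            for k in range(i + 1, j):        # apex of triangle (i, k, j)
                t = triangle_area(poly[i], poly[k], poly[j])
                best = max(best, min(dp[i][k], dp[k][j], t))
            dp[i][j] = best
    return dp[0][n - 1]

# Unit square: every triangulation splits it into two triangles of
# area 0.5, so the MaxMin value is 0.5.
print(maxmin_area([(0, 0), (1, 0), (1, 1), (0, 1)]))  # 0.5
```

Swapping the inner `max(..., min(...))` for `min(..., max(...))` (with `dp` initialised to `-INF` and `best` to `INF`) gives the MinMax variant; both run in O(n³) time and O(n²) space.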

Relevance: 30.00%

Abstract:

This chapter discusses network protection of high-voltage direct current (HVDC) transmission systems for large-scale offshore wind farms where the HVDC system utilizes voltage-source converters. The multi-terminal HVDC network topology and protection allocation and configuration are discussed, with DC circuit breaker and protection relay configurations studied for different fault conditions. A detailed protection scheme is designed with a solution that does not require relay communication. Advanced understanding of protection system design and operation is necessary for reliable and safe operation of the meshed HVDC system under fault conditions. Meshed HVDC systems are important as they will be used to interconnect large-scale offshore wind generation projects. Offshore wind generation is growing rapidly and offers a means of securing energy supply and addressing emissions targets whilst minimising community impacts. There are ambitious plans concerning such projects in Europe and in the Asia-Pacific region, which will all require a reliable yet economic system to generate, collect, and transmit electrical power from renewable resources. Collectively, offshore wind farms are efficient and have potential as a significant low-carbon energy source. However, this requires a reliable collection and transmission system. Offshore wind power generation is a relatively new area and lacks the systematic analysis of faults and associated operational experience needed to enhance further development. Appropriate fault protection schemes are required, and this chapter highlights the process of developing and assessing such schemes. The chapter illustrates the basic meshed topology, identifies the need for distance evaluation and appropriate cable models, then details the design and operation of the protection scheme, with simulation results used to illustrate operation. © Springer Science+Business Media Singapore 2014.

Relevance: 30.00%

Abstract:

Advances in the area of industrial metrology have generated new technologies that are capable of measuring components with complex geometry and large dimensions. However, no standard or best-practice guides are available for the majority of such systems. Therefore, these new systems require appropriate testing and verification in order for the users to understand their full potential prior to their deployment in a real manufacturing environment. This is a crucial stage, especially when more than one system can be used for a specific measurement task. In this paper, two relatively new large-volume measurement systems, the mobile spatial co-ordinate measuring system (MScMS) and the indoor global positioning system (iGPS), are reviewed. These two systems utilize different technologies: the MScMS is based on ultrasound and radiofrequency signal transmission and the iGPS uses laser technology. Both systems have components with small dimensions that are distributed around the measuring area to form a network of sensors allowing rapid dimensional measurements to be performed in relation to large-size objects, with typical dimensions of several decametres. The portability, reconfigurability, and ease of installation make these systems attractive for many industries that manufacture large-scale products. In this paper, the major technical aspects of the two systems are briefly described and compared. Initial results of the tests performed to establish the repeatability and reproducibility of these systems are also presented. © IMechE 2009.

Relevance: 30.00%

Abstract:

The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on production of intact functional proteins. To achieve this, improved tools are needed for bio-processing. For example, implementation of process modeling and high-throughput technologies can be used to achieve quality by design, leading to improvements in productivity. Commercially, the most sought after targets are secreted proteins due to the ease of handling in downstream procedures. This chapter outlines different approaches for production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.

Relevance: 30.00%

Abstract:

Objective: Vomiting in pregnancy is a common condition affecting 80% of pregnant women. Hyperemesis is at one end of the spectrum, seen in 0.5–2% of the pregnant population. Known factors such as nulliparity, younger age and high body mass index are associated with an increased risk of this condition in the first trimester. Late pregnancy complications attributable to hyperemesis, the pathogenesis of which is poorly understood, have not been studied in large population-based studies in the United Kingdom. The objective of this study was to determine a plausible association between hyperemesis and pregnancy complications, such as pregnancy-related hypertension, gestational diabetes and liver problems in pregnancy, and the rates of elective (ElCS) and emergency caesarean section (EmCS). Methods: Using a database based on the ICD-10 classification, anonymised data of admissions to a large multi-ethnic hospital in Manchester, UK between 2000 and 2012 were examined. Notwithstanding the obvious limitations of hospital database-based research, this large volume of data allows powerful studies of disease trends and complications. Results: Between 2000 and 2012, 156 507 women aged 45 or under were admitted to hospital. Of these, 1111 women were coded for hyperemesis (0.4%). A greater proportion of women with hyperemesis than without hyperemesis were coded for hypertensive disorders in pregnancy such as pregnancy-induced hypertension, pre-eclampsia and eclampsia (2.7% vs. 1.5%; P=0.001). The proportion of gestational diabetes and liver disorders in pregnancy was similar for both groups (diabetes: 0.5% vs. 0.4%; P=0.945, liver disorders: 0.2% vs. 0.1%; P=0.662). Hyperemesis patients had a higher proportion of elective and emergency caesarean sections compared with the non-hyperemesis group (ElCS: 3.3% vs. 2%; P=0.002, EmCS: 5% vs. 3%; P=0.00).
Conclusions: There was a higher rate of emergency and elective caesarean section in women with hyperemesis, which could reflect the higher prevalence of pregnancy-related hypertensive disorders (but not diabetes or liver disorders) in this group. The factors contributing to the higher prevalence of hypertensive disorders are not known, but these findings lead us to question whether there is a similar pathogenesis in the development of both conditions and hence whether further study in this area is warranted.

Relevance: 30.00%

Abstract:

It is important to help researchers find valuable papers from a large literature collection. To this end, many graph-based ranking algorithms have been proposed. However, most of these algorithms suffer from the problem of ranking bias. Ranking bias hurts the usefulness of a ranking algorithm because it returns a ranking list with an undesirable time distribution. This paper is a focused study on how to alleviate ranking bias by leveraging the heterogeneous network structure of the literature collection. We propose a new graph-based ranking algorithm, MutualRank, that integrates mutual reinforcement relationships among networks of papers, researchers, and venues to achieve a more synthetic, accurate, and less-biased ranking than previous methods. MutualRank provides a unified model that involves both intra- and inter-network information for ranking papers, researchers, and venues simultaneously. We use the ACL Anthology Network as the benchmark data set and construct the gold standard from computer linguistics course websites of well-known universities and two well-known textbooks. The experimental results show that MutualRank greatly outperforms state-of-the-art competitors, including PageRank, HITS, CoRank, FutureRank, and P-Rank, in ranking papers, both in improving ranking effectiveness and in alleviating ranking bias. Rankings of researchers and venues by MutualRank are also quite reasonable.
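The mutual-reinforcement idea can be illustrated with a toy power iteration. This is a hypothetical sketch on a made-up citation/authorship graph, not the authors' MutualRank model, which additionally couples venues and tunes the inter-network weights:

```python
# Illustrative sketch (not the authors' MutualRank implementation):
# a power iteration in which paper scores and author scores reinforce
# one another through citation (paper-paper) and authorship
# (author-paper) links. The tiny graph below is made up.

# papers 0..3: cites[i] = papers that paper i cites
cites = {0: [1, 2], 1: [2], 2: [], 3: [2]}
# authors 0..1: writes[a] = papers written by author a
writes = {0: [0, 1], 1: [2, 3]}

def normalize(v):
    s = sum(v)
    return [x / s for x in v]

papers = normalize([1.0] * 4)
authors = normalize([1.0] * 2)

for _ in range(50):
    # Intra-network step: a paper inherits score from papers citing it,
    # split evenly over each citing paper's outgoing links.
    p_new = [0.0] * 4
    for i, outs in cites.items():
        for j in outs:
            p_new[j] += papers[i] / len(outs)
    # Inter-network mutual reinforcement: authors gain from their
    # papers' scores, and papers gain back from their authors' scores.
    a_new = [sum(papers[p] for p in writes[a]) for a in (0, 1)]
    for a, ps in writes.items():
        for p in ps:
            p_new[p] += a_new[a] / len(ps)
    papers = normalize(p_new)
    authors = normalize(a_new)

# Paper 2 is cited by every other paper, so it should rank first,
# and its author (author 1) should outrank author 0.
print(max(range(4), key=lambda i: papers[i]))
print(max(range(2), key=lambda a: authors[a]))
```

Because the two score vectors feed each other every iteration, a well-cited paper lifts its author, and a strong author in turn lifts that author's other papers; that cross-network flow is what a single-network method like plain PageRank cannot express.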

Relevance: 30.00%

Abstract:

OBJECTIVES: To evaluate the implementation of the National Health Service (NHS) Health Check programme in one area of England from the perspective of general practitioners (GPs). DESIGN: A qualitative exploratory study was conducted with GPs and other healthcare professionals involved in delivering the NHS Health Check and with patients. This paper reports the experience of GPs and focuses on the management of the Health Check programme in primary care. SETTING: Primary care surgeries in the Heart of Birmingham region (now under the auspices of the Birmingham Cross City Clinical Commissioning Group) were invited to take part in the larger scale evaluation. This study focuses on a subset of those surgeries whose GPs were willing to participate. PARTICIPANTS: 9 GPs from different practices volunteered. GPs served an ethnically diverse region with areas of socioeconomic deprivation. Ethnicities of participant GPs included South Asian, South Asian British, white, black British and Chinese. METHODS: Individual semistructured interviews were conducted with GPs face to face or via telephone. Thematic analysis was used to analyse verbatim transcripts. RESULTS: Themes were generated which represent GPs' experiences of managing the NHS Health Check: primary care as a commercial enterprise; 'buy in' to concordance in preventive healthcare; following protocol and support provision. These themes represent the key issues raised by GPs. They reveal variability in the implementation of NHS Health Checks. GPs also need support in allocating resources to the Health Check, including training on how to conduct checks in a concordant (or collaborative) way. CONCLUSIONS: The variability observed in this small-scale evaluation corroborates existing findings suggesting a need for more standardisation. Further large-scale research is needed to determine how that could be achieved.
Work needs to be done to further develop a concordant approach to lifestyle advice which involves tailored individual goal setting rather than a paternalistic advice-giving model.

Relevance: 30.00%

Abstract:

Star formation occurs when the gas (mostly atomic hydrogen; H I) in a galaxy becomes disturbed, forming regions of high-density gas, which then collapse to form stars. In dwarf galaxies it is still uncertain which processes contribute to star formation and how much each contributes. Blue compact dwarf (BCD) galaxies are low-mass, low-shear, gas-rich galaxies that have high star formation rates compared to other dwarf galaxies. What triggers the dense burst of star formation in BCDs but not in other dwarfs is not well understood. It is often suggested that BCDs may have their starburst triggered by gravitational interactions with other galaxies, dwarf-dwarf galaxy mergers, or consumption of intergalactic gas. However, there are BCDs that appear isolated with respect to other galaxies, making an external disturbance unlikely.

Here, I study six apparently isolated BCDs from the LITTLE THINGS sample in an attempt to understand what has triggered their burst of star formation. LITTLE THINGS is an H I survey of 41 dwarf galaxies. Each galaxy has high angular and velocity resolution H I data from the Very Large Array (VLA) telescope and ancillary stellar data. I use these data to study the detailed morphology and kinematics of each galaxy, looking for signatures of starburst triggers. In addition to the VLA data, I have collected Green Bank Telescope (GBT) data for the six BCDs. These high-sensitivity, low-resolution data are used to search the surrounding area of each galaxy for extended emission and possible nearby companion galaxies.

The VLA data show evidence that each BCD has likely experienced some form of external disturbance despite its apparent isolation. The external disturbances potentially seen in the sample include ongoing or advanced dwarf-dwarf mergers, an interaction with an unknown external object, and external gas consumption. The GBT data reveal no nearby, separate H I companions at the sensitivity of the data. These data therefore suggest that even though these BCDs appear isolated, they have not been evolving in isolation. It is possible that these external disturbances may have triggered the starbursts that define them as BCDs.