27 results for Collection development (Libraries)--Ireland
in Aston University Research Archive
Abstract:
Purpose – The purpose of this paper is to explore the role and relevance of external standards in demonstrating the value and impact of academic library services to their stakeholders. Design/methodology/approach – Two UK standards, Charter Mark and Customer Service Excellence, are evaluated via an exploratory case study employing multiple data collection techniques. Methods and results of phases 1-2 of a three-phase research project are outlined. Findings – Despite some limitations, standards may assist the manager in demonstrating the value, impact and quality of academic libraries in a recessionary environment. Active engagement and partnership with customers is imperative if academic libraries are to be viewed as vital to their parent organisations and thus survive. Originality/value – This paper provides a systematic evaluation of the role of external accreditation standards in measuring academic library service value and impact.
Abstract:
The purpose of this research is to propose a procurement system that spans disciplines and shares retrieved information with the relevant parties, so as to achieve better co-ordination between the supply and demand sides. This paper demonstrates how to analyse data with an agent-based procurement system (APS) to re-engineer and improve the existing procurement process. The intelligent agents take responsibility for searching for potential suppliers, negotiating with the short-listed suppliers and evaluating supplier performance against the selection criteria using a mathematical model. Manufacturing firms and trading companies spend more than half of their sales dollar on the purchase of raw materials and components. Efficient, accurate data collection is one of the key success factors for quality procurement, which means purchasing the right material at the right quality from the right suppliers. In general, enterprises spend a significant amount of resources on data collection and storage, but too little on facilitating data analysis and sharing. To validate the feasibility of the approach, a case study on a manufacturing small and medium-sized enterprise (SME) was conducted. The APS supports data and information analysis to facilitate decision making, such that the agent can improve negotiation and supplier evaluation efficiency by saving time and cost.
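The supplier-evaluation step described in the abstract can be sketched as a simple weighted-sum scoring model, one common multi-criteria technique; the criteria names, weights and ratings below are illustrative assumptions, not details of the actual APS:

```python
# Hypothetical sketch of multi-criteria supplier evaluation by an agent.
# Criteria, weights and ratings are invented for illustration.

def score_supplier(ratings, weights):
    """Weighted-sum score for one supplier.

    ratings: dict criterion -> rating (0-10)
    weights: dict criterion -> weight, summing to 1.0
    """
    return sum(weights[c] * ratings[c] for c in weights)

suppliers = {
    "A": {"price": 8, "quality": 6, "delivery": 9},
    "B": {"price": 5, "quality": 9, "delivery": 7},
}
weights = {"price": 0.5, "quality": 0.3, "delivery": 0.2}

# Rank suppliers by descending weighted score.
ranked = sorted(suppliers,
                key=lambda s: score_supplier(suppliers[s], weights),
                reverse=True)
print(ranked[0])  # prints "A" (score 7.6 vs 6.6 for B)
```

A real agent would feed such scores back into negotiation and short-listing; the weighted sum is only the simplest of the selection-criteria models the abstract alludes to.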
Abstract:
The research reported in this paper arose from collaboration with Brian Ashcroft (Fraser of Allander Institute, University of Strathclyde) and Stephen Roper (Northern Ireland Economic Research Centre, Queen's University of Belfast). The author is, however, solely responsible for the views expressed. The following sections are included: Introduction; An Economics Perspective on Innovation Networks; The Product Development Survey; Discussion: Innovation, Networks and Institutions; Conclusions; References. Read more: http://www.worldscientific.com/doi/abs/10.1142/9781848161481_0005
Abstract:
A two-tier study is presented in this thesis. The first involves the commissioning of an extant but, at the time, unproven bubbling fluidised bed fast pyrolysis unit. The unit was designed for an intended nominal throughput of 300 g/h of biomass, and came complete with solids separation, pyrolysis vapour quenching and oil collection systems. Modifications were carried out on various sections of the system, including the reactor heating, quenching and liquid collection systems, allowing fast pyrolysis experiments to be carried out at the appropriate temperatures. Bio-oil was generated using conventional biomass feedstocks including willow, beechwood, pine and Miscanthus. Results from this phase of the research showed, however, that although the rig was capable of processing biomass to bio-oil, it was characterised by low mass balance closures and recurrent operational problems, including blockages, poor reactor hydrodynamics and reduced organic liquid yields. The less than optimal performance of individual sections, particularly the feed and reactor systems, culminated in a poor overall performance of the rig. The second phase of this research involved the redesign of two key components of the unit. An alternative feeding system was commissioned, incorporating an off-the-shelf gravimetric system for accurate metering and efficient delivery of biomass. Similarly, a new bubbling fluidised bed reactor with an intended nominal throughput of 500 g/h of biomass was designed and constructed. The design leveraged experience from the initial commissioning phase, together with proven kinetic and hydrodynamic studies. These units were commissioned as part of the optimisation phase of the study.
Also as part of this study, two varieties each of the previously unreported feedstocks Jatropha curcas and Moringa oleifera oil seed press cakes were characterised to determine their suitability as feedstocks for liquid fuel production via fast pyrolysis. The feedstocks were subsequently used for the production of pyrolysis liquids, whose quality was investigated via a number of analytical techniques. The oils from the press cakes showed high levels of stability and reduced pH values. The improvements to the design of the fast pyrolysis unit led to higher mass balance closures and increased organic liquid yields. The maximum liquid yield obtained from the press cakes was from African Jatropha press cake, at 66 wt% on a dry basis.
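The mass balance closure figure on which both phases of the study are judged is simply the fraction of the fed biomass recovered as collected products; a minimal sketch, with all masses invented for illustration:

```python
# Sketch of a pyrolysis mass balance closure calculation.
# The feed and product masses below are invented examples, not thesis data.

def mass_balance_closure(feed_g, liquid_g, char_g, gas_g):
    """Fraction of the fed biomass accounted for by collected products."""
    return (liquid_g + char_g + gas_g) / feed_g

closure = mass_balance_closure(feed_g=300.0, liquid_g=186.0,
                               char_g=45.0, gas_g=51.0)
print(round(100 * closure, 1))  # 94.0, i.e. 94 wt% of the feed accounted for
```

A low closure, as reported for the original rig, means a large share of the feed leaves the system unaccounted for (losses, blockages, uncollected vapours).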
Abstract:
Garner seeks to explain the absence of far-right political formations in the history of the Republic of Ireland, especially in relation to immigration. He argues that the ‘mainstream’ nationalist parties have implemented a racialized governance of Ireland via the issue of citizenship (in the referendum of 2004). While hegemonic ideas on the racial purity of indigenous populations and the highly ambivalent attitudes and policies on immigration pursued over the last decade are characteristic of a broader European trend, this has not, in the Republic, been accompanied by meaningful far-right political mobilization. Ireland has frequently been seen as sui generis in political terms, and indeed emerges in some ways as a counter-case: increasing hostility towards Others has been identified in the midst of rapid economic growth and political stability. A variety of issues related to the country’s political development have given rise to an especially small left-wing vote, a nationalist centre ground and long-lasting domination by a single populist party, Fianna Fáil. This party has been partnered in government since 1997 by a free-market party, the Progressive Democrats, who have contributed to Ireland’s movement towards neo-liberal policies and a highly functional approach to immigration. The transition from country of emigration to country of immigration has thus taken place against an ideological backdrop in which the imperatives of labour demand and consolidating domestic support for reform have made an uneasy match, resulting in the racialization of Irishness. The state has, however, amended the Constitution in order to qualify jus soli citizenship entitlement in the case of particular categories of people: those whose parents are not Irish nationals. The significant stakes of these changes are analysed in the context of state responses to Éire’s transition to a country of immigration, and the role of nationalist-populism in the country’s political culture.
Abstract:
Although techniques such as biopanning rely heavily upon the screening of randomized gene libraries, there is surprisingly little information available on the construction of those libraries. In general, it is based on the cloning of 'randomized' synthetic oligonucleotides, in which given position(s) contain an equal mixture of all four bases. Yet, many supposedly 'randomized' libraries contain significant elements of bias and/or omission. Here, we report the development and validation of a new, PCR-based assay that enables rapid examination of library composition both prior to and after cloning. By using our assay to analyse model libraries, we demonstrate that the cloning of a given distribution of sequences does not necessarily result in a similarly composed library of clones. Thus, while bias in randomized synthetic oligonucleotide mixtures can be virtually eliminated by using unequal ratios of the four phosphoramidites, the use of such mixtures does not ensure retrieval of a truly randomized library. We propose that in the absence of a technique to control cloning frequencies, the ability to analyse the composition of libraries after cloning will enhance significantly the quality of information derived from those libraries.
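The kind of post-cloning composition check the paper argues for amounts to counting base frequencies at a randomised position across sequenced clones and comparing them with the 25% expected of a truly random mixture; a minimal sketch, with toy sequences rather than data from the assay:

```python
# Hypothetical sketch of checking the composition of a randomised position.
# The clone sequences below are invented examples.

from collections import Counter

def base_frequencies(sequences, position):
    """Observed frequency of each base at one position across all clones."""
    counts = Counter(seq[position] for seq in sequences)
    total = sum(counts.values())
    return {base: counts.get(base, 0) / total for base in "ACGT"}

clones = ["ATGA", "ACGA", "AGGA", "ATGA", "AAGA", "ATGA", "ACGA", "ATGA"]
freqs = base_frequencies(clones, 1)  # randomised position 1 (0-based)
print(freqs)  # {'A': 0.125, 'C': 0.25, 'G': 0.125, 'T': 0.5} -- T over-represented
```

A skew like the 50% T here, against the 25% expected, is exactly the post-cloning bias the assay is designed to reveal.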
Abstract:
The objective of this work was to design, construct and commission a new ablative pyrolysis reactor and a high efficiency product collection system. The reactor was to have a nominal throughput of 10 kg/hr of dry biomass and be inherently scalable up to an industrial scale application of 10 tonnes/hr. The whole process consists of a bladed ablative pyrolysis reactor, two high efficiency cyclones for char removal and a disk and doughnut quench column combined with a wet walled electrostatic precipitator, which is directly mounted on top, for liquids collection. In order to aid design and scale-up calculations, detailed mathematical modelling was undertaken of the reaction system, enabling sizes, efficiencies and operating conditions to be determined. Specifically, a modular approach was taken due to the iterative nature of some of the design methodologies, with the output from one module being the input to the next. Separate modules were developed for the determination of the biomass ablation rate, specification of the reactor capacity, cyclone design, quench column design and electrostatic precipitator design. These models enabled a rigorous design protocol to be developed, capable of specifying the required reactor and product collection system size for specified biomass throughputs, operating conditions and collection efficiencies. The reactor proved capable of generating an ablation rate of 0.63 mm/s for pine wood at a temperature of 525 °C with a relative velocity between the heated surface and reacting biomass particle of 12.1 m/s. The reactor achieved a maximum throughput of 2.3 kg/hr, which was the maximum the biomass feeder could supply. The reactor is capable of being operated at a far higher throughput, but this would require a new feeder and drive motor to be purchased. Modelling showed that the reactor is capable of achieving a throughput of approximately 30 kg/hr.
This is an area that should be considered in the future, as the reactor is currently operating well below its theoretical maximum. Calculations show that the current product collection system could operate efficiently up to a maximum feed rate of 10 kg/hr, provided the inert gas supply was adjusted accordingly to keep the vapour residence time in the electrostatic precipitator above one second. Operation above 10 kg/hr would require some modifications to the product collection system. Eight experimental runs were documented and considered successful; more were attempted but had to be abandoned due to equipment failure. This does not detract from the fact that the reactor and product collection system design was extremely efficient. The maximum total liquid yield was 64.9 wt% on a dry wood feed basis. It is considered that the liquid yield would have been higher had there been sufficient development time to overcome certain operational difficulties, and had longer operating runs been attempted to offset the product losses incurred in collecting all available product from a large scale collection unit. The liquids collection system was highly efficient, with modelling determining a liquid collection efficiency above 99% on a mass basis. This was validated experimentally: a dry ice/acetone condenser and a cotton wool filter downstream of the collection unit enabled mass measurement of the condensable product leaving the unit, confirming a collection efficiency in excess of 99% on a mass basis.
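How an ablation rate translates into reactor throughput can be illustrated with a back-of-envelope calculation. The ablation rate and temperature are those quoted in the abstract, while the wood density and effective contact area below are assumptions chosen purely for illustration, not figures from the thesis:

```python
# Back-of-envelope sketch: throughput = ablation rate x contact area x density.
# Only the ablation rate (0.63 mm/s for pine at 525 °C) comes from the text;
# the density and contact-area figures are illustrative assumptions.

ablation_rate = 0.63e-3   # m/s, measured for pine at 525 °C (from the text)
wood_density = 500.0      # kg/m^3, assumed bulk density of pine
contact_area = 2.0e-3     # m^2, assumed effective particle/surface contact area

throughput_kg_per_hr = ablation_rate * contact_area * wood_density * 3600
print(round(throughput_kg_per_hr, 2))  # 2.27 kg/hr at this assumed contact area
```

Under these assumptions the figure happens to sit near the 2.3 kg/hr feeder-limited maximum reported; raising the effective contact area is what lets the modelled capacity reach tens of kg/hr.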
Abstract:
This essay explores the development of public libraries in the context of an increasingly market-dominated economy and marketised society. It argues that although neo-liberalism as a policy goal and practice has taken different forms over time, there are common themes in terms of its emphasis on market values, privatisation, and the support of measures that reduce the role of public funding and the state in the provision of public services. This has led some commentators to express concerns that the meaning and practice of citizenship and democracy are being transformed, managed or otherwise diminished. These concerns are compounded by changes effected by new digital technology. Imbricated with this issue are debates surrounding the future of the public library, and attempts by librarians and others to reinvent and reimagine its purpose. With reference to some innovative initiatives in the USA and Scandinavia, it is suggested that public libraries, through their service and spatial rearticulation, can conceivably help strengthen and revitalise public democracy and the public sphere.
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. 
The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
Abstract:
The aim of the research is to develop an e-business selection framework for small and medium enterprises (SMEs) by integrating established planning techniques. The research is case based, comprising four case studies carried out in the printing industry for the purpose of evaluating the framework. Two of the companies are from Singapore, while the other two are from Guangzhou, China and Jinan, China respectively. To determine the need for an e-business selection framework for SMEs, extensive literature reviews were carried out in the areas of e-business, business planning frameworks, SMEs and the printing industry. An e-business selection framework is then proposed by integrating the three established techniques of the Balanced Scorecard (BSC), Value Chain Analysis (VCA) and Quality Function Deployment (QFD). The newly developed selection framework was pilot tested using a published case study before actual evaluation was carried out in the four case study companies. The case study methodology was chosen because of its ability to integrate the diverse data collection techniques required to generate the BSC, VCA and QFD for the selection framework. The findings of the case studies revealed that the three techniques of BSC, VCA and QFD can be integrated seamlessly to complement each other's strengths in e-business planning. The eight-step methodology of the selection framework can provide SMEs with a step-by-step approach to e-business through structured planning. The project has also provided better understanding of, and deeper insights into, SMEs in the printing industry.
Abstract:
During the last decade the use of randomised gene libraries has had an enormous impact in the field of protein engineering. Such libraries comprise many variations of a single gene, in which codon replacements are used to substitute key residues of the encoded protein. The expression of such libraries generates a library of randomised proteins, which can subsequently be screened for desired or novel activities. Randomisation in this fashion has predominantly been achieved by the inclusion of the codons NNN or NNG/C or T, in which N represents any of the four bases A, C, G or T. The use of these codons, however, necessitates the cloning of redundant codons at each position of randomisation, in addition to those required to encode the twenty possible amino acid substitutions. As degenerate codons must be included at each position of randomisation, this results in a progressive loss of randomisation efficiency as the number of randomised positions is increased. The ratio of genes to proteins in these libraries rises exponentially with each position of randomisation, creating large gene libraries which generate protein libraries of limited diversity upon expression. In addition to these problems of library size, the cloning of redundant codons also results in the generation of protein libraries in which substituted amino acids are unevenly represented. As several of the randomised codons may encode the same amino acid, for example serine, which is encoded six times using the codon NNN, an inherent bias may be introduced into the resulting protein library during the randomisation procedure. The work outlined here describes the development of a novel randomisation technique aimed at eliminating codon redundancy from randomised gene libraries, thus addressing the problems of library size and bias associated with the cloning of redundant codons.
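The redundancy the abstract quantifies (serine encoded six times by NNN, against once for tryptophan) can be checked directly from the standard genetic code; a minimal sketch:

```python
# Count how many NNN codons encode each amino acid under the standard
# genetic code, illustrating the built-in bias of NNN randomisation.

from itertools import product
from collections import Counter

BASES = "TCAG"
# One-letter amino acids for the 64 codons in TCAG x TCAG x TCAG order
# ('*' marks the three stop codons).
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): a for c, a in zip(product(BASES, repeat=3), AMINO)}

counts = Counter(CODON_TABLE.values())
print(len(CODON_TABLE))  # 64 codons in total
print(counts["S"])       # serine is encoded 6 times, as the text notes
print(counts["W"])       # tryptophan only once -- a 6:1 built-in bias
```

With three stop codons also present, an NNN library wastes 64 - 20 = 44 codons per randomised position, which is precisely the redundancy the proposed technique sets out to eliminate.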
Abstract:
Combinatorial libraries continue to play a key role in drug discovery. To increase structural diversity, several experimental methods have been developed. However, limited effort has so far been devoted to quantifying the diversity of the broadly used diversity-oriented synthesis (DOS) libraries. Herein we report a comprehensive characterization of 15 bis-diazacyclic combinatorial libraries obtained through libraries from libraries, which is a DOS approach. Using MACCS keys, radial and different pharmacophoric fingerprints, as well as six molecular properties, the increased structural and property diversity of the libraries from libraries over the individual libraries was demonstrated. Comparison of the libraries with existing drugs, NCI Diversity and the Molecular Libraries Small Molecule Repository revealed the structural uniqueness of the combinatorial libraries (mean similarity < 0.5 for any fingerprint representation). In particular, bis-cyclic thiourea libraries were the most structurally dissimilar to drugs while retaining drug-like character in property space. This study represents the first comprehensive quantification of the diversity of libraries from libraries, providing a solid quantitative approach for comparing and contrasting the diversity of DOS libraries with existing drugs or any other compound collection.
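Mean-similarity figures like the "< 0.5 for any fingerprint representation" quoted above are conventionally computed with the Tanimoto (Jaccard) coefficient on binary fingerprints; a minimal sketch using toy bit vectors rather than real MACCS keys:

```python
# Tanimoto similarity between two binary molecular fingerprints:
# shared on-bits divided by bits set in either fingerprint.
# The bit vectors below are toy examples, not real MACCS keys.

def tanimoto(fp1, fp2):
    """Tanimoto coefficient between two same-length binary fingerprints."""
    on_both = sum(a & b for a, b in zip(fp1, fp2))
    on_either = sum(a | b for a, b in zip(fp1, fp2))
    return on_both / on_either if on_either else 1.0

a = [1, 1, 0, 1, 0, 0, 1, 0]
b = [1, 0, 0, 1, 1, 0, 1, 0]
print(tanimoto(a, b))  # 0.6 (3 shared bits / 5 bits set in either)
```

Averaging this coefficient over all library-vs-reference pairs gives the mean similarity used to argue that the libraries are structurally distinct from existing drugs.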
Abstract:
This thesis examines the ways that libraries have employed computers to assist with housekeeping operations. It considers the relevance of such applications to company libraries in the construction industry, and describes more specifically the development of an integrated cataloguing and loan system. A review of the main features in the development of computerised ordering, cataloguing and circulation control systems shows that fully integrated packages are beginning to be completed, and that some libraries are introducing second generation programs. Cataloguing is the most common activity to be computerised, both at national and company level. Results from a sample of libraries in the construction industry suggest that the only computerised housekeeping system is at Taylor Woodrow. Most of the firms have access to an in-house computer, and some of the libraries, particularly those in firms of consulting engineers, might benefit from computerisation, but there are differing attitudes amongst the librarians towards the computer. A detailed study of the library at Taylor Woodrow resulted in a feasibility report covering all the areas of its activities. One of the main suggestions was the possible use of a computerised loans and cataloguing system. An integrated system to cover these two areas was programmed in Fortran and implemented. This new system provides certain benefits and saves staff time, but at the cost of time on the computer. Some improvements could be made by reprogramming, but it provides a general system for small technical libraries. A general equation comparing costs for manual and computerised operations is progressively simplified to a form where the annual saving from the computerised system is expressed in terms of staff and computer costs and the size of the library. This equation gives any library an indication of the savings or extra cost which would result from using the computerised system.
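The cost-comparison equation described at the end of this abstract can be sketched in the following form, with all costs in pence and every figure an illustrative assumption rather than the thesis's actual model:

```python
# Hedged sketch of a manual-vs-computerised cost comparison for a library:
# annual saving in terms of staff cost, computer cost and library size
# (items handled per year). All figures are illustrative assumptions.

def annual_saving_pence(items_per_year, manual_staff_cost_p,
                        comp_staff_cost_p, computer_cost_p):
    """Annual saving (pence) from computerisation: manual cost minus
    the computerised system's combined staff and computer cost."""
    manual = items_per_year * manual_staff_cost_p
    computerised = items_per_year * (comp_staff_cost_p + computer_cost_p)
    return manual - computerised

# A library handling 5000 loans/catalogue entries a year, costed per item:
print(annual_saving_pence(5000, 50, 20, 15))  # 75000, i.e. a net saving
```

The sign of the result captures the abstract's point: whether a given library saves or loses money depends on its size and on the relative staff and computer costs per item.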
Abstract:
A need was indicated for the identification of a possible new solar energy product to improve the sales potential of a metal film with a selective surface, manufactured by the industrial sponsor of this project (INCO). A possible way of overcoming the disadvantageous economics of solar energy collection was identified. This utilised the collection of solar energy by the walls of buildings constructed in such a manner as to allow the transfer of energy into the building, whilst providing adequate thermal insulation in the absence of sunlight. The actual collection element of the wall, being metallic, is also capable of performing the function of a low temperature heating system in the absence of sunlight. As a result, the proposed system, by displacing both the wall and the central heating system which would otherwise be necessary, demonstrates economic benefits over systems constructed solely for the purpose of collecting solar energy. The necessary thermodynamic and meteorological characteristics and data are established, and applied to a typical urban site in the North of England, for a typical average year, with and without a shading device incorporated into the construction. It is concluded that the proposed system may offer considerable benefit in reducing the effective heating season in all orientations of wall.