904 results for Policy Design, Analysis, and Evaluation


Relevance: 100.00%

Abstract:

The SAR of asperlicin analogues is reported, leading to bioactive 1,4-benzodiazepin-2-ones prepared in a three-step reaction sequence. The asperlicin substructure was built up from tryptophan and readily available 2-aminoacetophenones. This template, containing a 1,4-benzodiazepin-2-one moiety with a 3-indolylmethyl side chain, was transformed into mono- and di-substituted 3-(indol-3'-yl)methyl-1,4-benzodiazepin-2-ones by selective alkylation and acylation reactions. The SAR optimization of the 1,4-benzodiazepine scaffold included variations at the 5-, 7- and 8-positions, at N1 and the indole nitrogen, and in the configuration at C3. The most active asperlicin analogue, with an IC50 of 1.6 μM at the CCK-A receptor subtype, was obtained from tryptophan in only three steps and 48% overall yield.
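
A quick sanity check on the reported efficiency, using standard yield arithmetic rather than anything stated in the abstract: the overall yield of a linear sequence is the product of the step yields, so 48% over three steps corresponds to an average step yield of roughly 78%:

$Y_{\mathrm{overall}} = \prod_{i=1}^{3} y_i = 0.48 \quad\Rightarrow\quad \bar{y} = 0.48^{1/3} \approx 0.78$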

Relevance: 100.00%

Abstract:

The topic of bioenergy, biofuels and bioproducts remains at the top of the current political and research agenda. Identification of the optimum processing routes for biomass, in terms of efficiency, cost, environment and socio-economics, is vital as concern grows over the remaining fossil fuel resources, climate change and energy security. Biomass is the only renewable route to conventional hydrocarbon fuels and organic chemicals, but the problem remains of identifying the best product mix and the most efficient way of processing biomass into products. The aim is to move Europe towards a biobased economy, and it is widely accepted that biorefineries are key to this development. A methodology was therefore required for generating and evaluating biorefinery process chains that convert biomass into one or more valuable products, one that properly considers performance, cost, environment, socio-economics and the other factors that influence the commercial viability of a process. This thesis describes a methodology to achieve that objective. The completed methodology includes process chain generation, process modelling and the subsequent analysis and comparison of results in order to evaluate alternative process routes. A modular structure was chosen for flexibility, allowing the user to generate a large number of different biorefinery configurations. The significance of the approach is that the methodology is explicitly defined, and is thus rigorous, consistent and readily re-examined if circumstances change. Consistency in structure and use was required, particularly for multiple analyses: analyses had to be quick and easy to carry out, for example for different scales, configurations and product portfolios, and previous outcomes had to be easy to reconsider. The result of the completed methodology is the identification of the most promising biorefinery chains from those considered as part of the European Biosynergy Project.
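
The abstract gives no implementation detail for the chain-generation step; the sketch below is a hypothetical illustration of the modular idea, in which each processing step is a module with an assumed yield and cost (all module names and figures are invented), and candidate chains are enumerated and scored by composing modules.

from dataclasses import dataclass

@dataclass
class Module:
    """One processing step: converts an input stream into an output stream."""
    name: str
    consumes: str        # input material class
    produces: str        # output material class
    yield_frac: float    # mass fraction retained (0-1)
    cost_per_t: float    # processing cost per tonne of input

# Hypothetical module library -- names and figures are illustrative only.
LIBRARY = [
    Module("pyrolysis", "biomass", "bio-oil", 0.65, 40.0),
    Module("gasification", "biomass", "syngas", 0.80, 55.0),
    Module("upgrading", "bio-oil", "fuel", 0.70, 90.0),
    Module("fischer_tropsch", "syngas", "fuel", 0.50, 120.0),
]

def chains(start, goal, lib, depth=3):
    """Enumerate module sequences converting `start` into `goal`."""
    if depth == 0:
        return
    for m in lib:
        if m.consumes != start:
            continue
        if m.produces == goal:
            yield [m]
        for tail in chains(m.produces, goal, lib, depth - 1):
            yield [m] + tail

def evaluate(chain, tonnes=1.0):
    """Return (product tonnes, total cost) for a chain fed with `tonnes`."""
    mass, cost = tonnes, 0.0
    for m in chain:
        cost += mass * m.cost_per_t
        mass *= m.yield_frac
    return mass, cost

for c in chains("biomass", "fuel", LIBRARY):
    out, cost = evaluate(c)
    print(" -> ".join(m.name for m in c), f"{out:.2f} t", f"cost {cost:.0f}")

A real chain would carry multiple streams, co-products and environmental scores per module, but the compositional structure is the same.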

Relevance: 100.00%

Abstract:

The object of this work was to develop further the idea introduced by Muaddi et al (1981), which enables some of the disadvantages of earlier, destructive adhesion test methods to be overcome. The test is non-destructive in nature, although it must be calibrated against a destructive method. Adhesion is determined by measuring the effect of plating on internal friction, achieved by determining the damping of vibrations of a resonating specimen before and after plating; the level of adhesion was considered by the above authors to influence the degree of damping. In the major portion of the research the electrodeposited metal was Watts nickel, which is ductile and therefore suitable for peel adhesion testing. The base metals chosen were the aluminium alloys S1C and HE9, since it is relatively easy to produce varying levels of adhesion between these substrates and an electrodeposited coating by choosing the appropriate process sequence. S1C is commercially pure aluminium and was used to produce good adhesion; HE9 is a more difficult alloy to plate and was chosen to produce poorer adhesion. The "Modal Testing" method of studying vibrations was investigated as a possible means of evaluating adhesion but was not successful, so research was concentrated on the "Q" meter. The "Q" meter method involves exciting vibrations in a sample, interrupting the driving signal, and counting the number of oscillations of the freely decaying vibrations between two preselected amplitudes. It was not possible to reconstruct a working instrument from Muaddi's thesis (1982), which contained either a serious error or incomplete information. A modified "Q" meter therefore had to be designed and constructed, but it proved difficult to resonate non-magnetic materials such as aluminium, so a comparison before and after plating could not be made. A new "Q" meter was then developed, based on an impulse technique: a regulated miniature hammer was used to excite the test piece at the fundamental mode instead of an electronic hammer, with test pieces supported at the two predetermined nodal points on nylon threads. The instrument was not very successful at detecting differences between good and poor pretreatments given before plating, but it was more sensitive to changes at the surface, such as room-temperature oxidation. Statistical analysis of results from untreated aluminium alloys showed that the instrument was not always consistent; the variation was even bigger when readings were taken on different days. Although aluminium is said to form protective oxides at room temperature, there was evidence that the surface changes continuously through film formation, growth and breakdown. Nickel-plated and zinc-alloy immersion-coated samples also showed variation in Q with time. To prove that the variations in Q were mainly due to surface oxidation, aluminium samples were lacquered and anodised. Such treatments enveloped the active surfaces reacting with the environment, and the Q variation with time was almost eliminated, especially after hard anodising. The instrument detected major differences between different untreated aluminium substrates, and Q values decreased progressively as coating thickness was increased. It was also able to detect changes in Q due to heat treatment of the aluminium alloys.
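
The decay-counting principle described above maps directly onto the textbook relation between the quality factor Q and the logarithmic decrement: over n counted cycles, ln(A1/A2) = n*delta and Q ≈ pi/delta. A minimal sketch of that calculation (my illustration, not the instrument's actual electronics):

import math

def q_factor(n_cycles, a_start, a_end):
    """Quality factor from free-decay counting.

    For light damping the amplitude falls by the logarithmic decrement
    delta each cycle, so ln(a_start/a_end) = n_cycles * delta and
    Q ~ pi / delta = pi * n_cycles / ln(a_start / a_end).
    """
    return math.pi * n_cycles / math.log(a_start / a_end)

# Example: 500 cycles counted while the envelope decays from 1.0 V to 0.5 V.
print(q_factor(500, 1.0, 0.5))  # ~2266

A drop in Q after plating then signals increased internal friction, which the method attributes, after calibration, to the state of adhesion.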

Relevance: 100.00%

Abstract:

Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigation using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls, and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems; fixed, dynamic and hybrid channel assignments are considered, and the formulas agree with previously published simulation results. Simulation programs are described for evaluating the speech traffic of mobiles and for investigating a possible computer network for the control of that traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels performing various control functions, including the setting-up and clearing-down of calls, the hand-over of calls between cells and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers interconnected via voice-grade data channels would be capable of providing satisfactory control.
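
The modified formulas themselves are not reproduced in the abstract; for orientation, the classical Erlang B blocking probability that they extend can be computed with the standard numerically stable recursion B(0) = 1, B(n) = A*B(n-1) / (n + A*B(n-1)). This is the textbook formula, not the thesis's variant:

def erlang_b(traffic_erlangs, channels):
    """Blocking probability of an M/M/c/c loss system (classical Erlang B)."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic_erlangs * b / (n + traffic_erlangs * b)
    return b

# Example: 10 erlangs of offered traffic on a 15-channel cell.
print(f"{erlang_b(10.0, 15):.4f}")  # ~0.0365, i.e. about 3.7% blocking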

Relevance: 100.00%

Abstract:

The rapid developments in computer technology have resulted in widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques used to model large systems, and introduces an object-oriented methodology to enable the modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. The thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software toolkit was developed and used to implement a Petri net analysis tool together with Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can-sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
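
The C++ toolkit itself is not shown in the abstract; as a hypothetical illustration of the formalism it implements, a place/transition net reduces to a marking vector plus enabling and firing rules, and a shared-resource token makes the conflict between competing jobs explicit:

class PetriNet:
    """Minimal place/transition Petri net: marking, enabling, firing."""

    def __init__(self, places):
        self.marking = dict(places)     # place name -> token count
        self.transitions = {}           # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= w for p, w in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] += w

# Two jobs competing for one resource token (mutual exclusion).
net = PetriNet({"waiting": 2, "resource": 1, "busy": 0, "done": 0})
net.add_transition("start", {"waiting": 1, "resource": 1}, {"busy": 1})
net.add_transition("finish", {"busy": 1}, {"done": 1, "resource": 1})

net.fire("start")
print(net.enabled("start"))  # False: resource held, the second job must wait
net.fire("finish")
print(net.marking)           # {'waiting': 1, 'resource': 1, 'busy': 0, 'done': 1}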

Relevance: 100.00%

Abstract:

We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, the two most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on concepts of noise and nonlinearity management. We demonstrate the efficiency of the novel approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, using nonlinearity management considerations, we showed that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails, and accurately computed marginal probability density functions for soliton parameters by numerical modelling of the Fokker-Planck equation with the MMC simulation technique. Applying the MMC method, we also studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration, and demonstrated that in such systems an analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and the BER penalty. Finally, we present a statistical analysis of the RZ-DPSK optical signal at a direct-detection receiver with Mach-Zehnder interferometer demodulation.
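
MMC itself is an involved iterative scheme; as a simpler illustration of the underlying problem (plain Monte Carlo cannot resolve the distribution tails that determine the BER), compare a naive estimator with exponentially tilted importance sampling for a Gaussian tail. This is a generic rare-event sketch, not the thesis's MMC implementation:

import numpy as np

rng = np.random.default_rng(0)
threshold = 5.0        # tail event {X > 5} for X ~ N(0, 1); true P ~ 2.9e-7
n = 100_000

# Naive Monte Carlo: with 1e5 samples the event is essentially never hit,
# so the estimate is 0 with enormous relative error.
x = rng.standard_normal(n)
print("naive estimate:", np.mean(x > threshold))

# Importance sampling: draw from N(threshold, 1) and reweight each sample
# by the likelihood ratio phi(y) / phi(y - threshold)
#   = exp(threshold**2 / 2 - threshold * y).
y = rng.standard_normal(n) + threshold
w = np.exp(0.5 * threshold**2 - threshold * y)
print("tilted estimate:", np.mean((y > threshold) * w))  # ~2.9e-7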

Relevance: 100.00%

Abstract:

A notable feature of the recent commercialisation of biotechnology has been the success of some 200 new firms, established in America since 1976, in exploiting specialised market niches. A key factor in their formation has been the ready availability of venture capital funding. These firms have been instrumental in establishing America's lead in exploiting biotechnology, and it is this example which Britain has attempted to emulate as part of its strategy for developing its own biotechnology capabilities. This thesis investigated aspects of the relationship between biotechnology and venture capital, concentrating on the determinants of the venture capitalist's investment decision. Following an extensive literature survey, two hypothetical business proposals were used to establish what venture capitalists themselves consider to be the key elements of this decision. It was found that venture capitalists invest in people, not products, and in businesses, not industries. It was concluded that venture capital-backed small firms should therefore be seen as an adjunct to the development of biotechnology in Britain, rather than as a substitute for a co-ordinated, co-operative strategy involving Government, the financial institutions, industry and academia. This is chiefly because the small size of the UK's domestic market means that many potentially important innovations in biotechnology may continue to be lost: short-term identification of market opportunities will dictate that many biotechnology products are not supportable in Britain alone. In addition, the data analysis highlighted some interesting methodological issues concerning the investigation of investment decision making, relating especially to shortcomings in the use of scoresheets and questionnaires in research in this area. The conclusion here was that future research should concentrate on the reasons why an individual reaches an investment decision; only in this way can the nature of the evaluation procedures employed by venture capitalists be properly understood.

Relevance: 100.00%

Abstract:

A series of N1-benzylideneheteroarylcarboxamidrazones was prepared in an automated fashion and tested against Mycobacterium fortuitum in a rapid screen for antimycobacterial activity. Many of the compounds from this series were also tested against Mycobacterium tuberculosis, and the usefulness of M. fortuitum as a rapid initial screen for anti-tubercular activity was evaluated. Various deletions were made to the N1-benzylideneheteroarylcarboxamidrazone structure in order to establish the minimum structural requirements for activity. The N1-benzylideneheteroarylcarboxamidrazones were then subjected to molecular modelling studies, and their activities against M. fortuitum and M. tuberculosis were analysed using quantitative structure-activity relationship (QSAR) techniques in the computational package TSAR (Oxford Molecular Ltd.). A set of equations predictive of antimycobacterial activity was thereby obtained. The series was also tested against a multidrug-resistant strain of Staphylococcus aureus (MRSA), followed, where activity against MRSA was observed, by a panel of Gram-positive and Gram-negative bacteria. A set of antimycobacterial N1-benzylideneheteroarylcarboxamidrazones was thereby discovered, the best of which had MICs against M. fortuitum in the range 4-8 μg ml⁻¹ and displayed 94% inhibition of M. tuberculosis at a concentration of 6.25 μg ml⁻¹. The antimycobacterial activity of these compounds appeared to be specific, since the same compounds were inactive against other classes of organism. Compounds found to be sufficiently active in any screen were also tested for toxicity against human mononuclear leucocytes. Polyethylene glycol (PEG) was used as a soluble polymeric support for the synthesis of some fatty acid derivatives containing an isoxazoline group, which may inhibit mycolic acid synthesis in mycobacteria. Both the PEG-bound products and the cleaved, isolated products were tested against M. fortuitum, and some low levels of antimycobacterial activity were observed; these may serve as lead compounds for further studies.
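
The predictive equations are not quoted in the abstract. As a generic sketch of the QSAR technique (hypothetical descriptors and activity values, fitted by ordinary least squares in the way packages such as TSAR do):

import numpy as np

# Hypothetical training set: one row per analogue, columns are calculated
# descriptors (logP, molar refractivity, Hammett sigma) -- invented values.
X = np.array([[2.1, 45.0, 0.23],
              [3.0, 52.1, 0.06],
              [1.7, 40.3, 0.66],
              [2.8, 49.9, -0.17],
              [3.4, 55.2, 0.45]])
y = np.array([1.9, 2.6, 1.4, 2.9, 2.2])   # activity as log(1/MIC), invented

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("log(1/MIC) = "
      f"{coef[0]:.2f} + {coef[1]:.2f}*logP + {coef[2]:.2f}*MR + {coef[3]:.2f}*sigma")

# Predicted activity of a new analogue from its descriptors.
print("prediction:", np.array([1.0, 2.5, 48.0, 0.10]) @ coef)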

Relevance: 100.00%

Abstract:

The aim of this research was to investigate the integration of computer-aided drafting and finite-element analysis in a linked computer-aided design procedure, and to develop the necessary software. The Bézier surface patch was used for surface representation to bridge the gap between the rather separate fields of drafting and finite-element analysis, because such surfaces are defined by analytical functions which allow systematic, controlled variation of shape and provide continuous derivatives up to any required degree. The objectives were achieved by establishing: (i) a package which interprets engineering drawings of plate and shell structures and prepares the Bézier net necessary for surface representation; (ii) a general-purpose stand-alone meshed-surface modelling package for the surface representation of plates and shells using the Bézier surface patch technique; and (iii) a translator which adapts the geometric description of plate and shell structures, as given by the meshed-surface modeller, to the form needed by the finite-element analysis package. The translator was extended to suit fan impellers by taking advantage of their sectorial symmetry. The linking processes were carried out for simple test structures and for simplified and actual fan impellers to verify the flexibility and usefulness of the linking technique adopted. Finite-element results for thin plate and shell structures showed excellent agreement with those obtained by other investigators, while results for the simplified and actual fan impellers showed good agreement with those obtained in an earlier investigation in which the finite-element input data were prepared manually. Some extensions of this work are also discussed.
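
The analytic definition that makes Bézier patches suitable here is the Bernstein-polynomial form S(u,v) = sum_i sum_j B_{i,n}(u) B_{j,m}(v) P_ij over the control net P. A minimal evaluator (an illustration, not the thesis's package):

import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1.0 - t) ** (n - i)

def bezier_patch(control, u, v):
    """Point on a Bezier patch; control has shape (n+1, m+1, 3)."""
    n, m = control.shape[0] - 1, control.shape[1] - 1
    point = np.zeros(3)
    for i in range(n + 1):
        for j in range(m + 1):
            point += bernstein(n, i, u) * bernstein(m, j, v) * control[i, j]
    return point

# Bicubic example: a 4x4 control net over the unit square with a raised interior.
P = np.array([[[i, j, 1.0 if 0 < i < 3 and 0 < j < 3 else 0.0]
               for j in range(4)] for i in range(4)], dtype=float)
print(bezier_patch(P, 0.5, 0.5))  # centre of the patch

Derivatives needed for meshing follow by differentiating the Bernstein basis, which is what lets one geometric description serve both drafting and finite-element preparation.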

Relevance: 100.00%

Abstract:

The purpose of this paper is to present the outcomes of an "employability management needs analysis" intended to provide insight into current employability management activities and their possible benefits for Information and Communication Technology (ICT) professionals working in small and medium-sized enterprises (SMEs) throughout Europe. A considerable series of interviews (N=107) was conducted with managers in SMEs in seven European countries: Germany, Greece, Italy, the Netherlands, Norway, Poland and the UK. A semi-structured interview protocol was used to cover three issues: employability (13 items), ageing (8 items), and future developments and requirements (13 items). All final interview transcriptions were analysed at national level using an elaborate common coding scheme. Although an interest in employability emerged, actual policy and action lagged behind. The recession in the ICT sector at the time of the investigation and the developmental stage of the sector in each participating country appeared to be connected. Ageing was not seen as a major issue in the ICT sector because managers considered ICT to be a relatively young sector. There appeared to be a serious lack of investment in the development of expertise of ICT professionals. Generalisation of the results to large organisations in the ICT sector should be made with caution. The interview protocol developed is of value for further research and complements survey research undertaken within the employability field of study. It can be concluded that proactive Human Resource Management (HRM) policies and strategies are essential, even in times of economic downturn, and employability management activities are especially important in the light of current career issues. The study advances knowledge of HRM practices adopted by SMEs in the ICT sector, especially as there is a gap in knowledge about career development issues in that sector.

Relevance: 100.00%

Abstract:

This thesis describes the development of a simple and accurate method for estimating the quantity and composition of household waste arisings. The method is based on the fundamental tenet that waste arisings can be predicted from information on the demographic and socio-economic characteristics of households, reducing the need for direct measurement of waste arisings to that necessary for calibrating a prediction model. The aim of the research is twofold: firstly, to investigate the generation of waste arisings at the household level; and secondly, to devise a method of supplying information on waste arisings to meet the needs of waste collection and disposal authorities, policy makers at both national and European level, and the manufacturers of plant and equipment for waste sorting and treatment. The research was carried out in three phases: theoretical, empirical and analytical. In the theoretical phase, specific testable hypotheses were formulated concerning the process of waste generation at the household level. The empirical phase involved an initial questionnaire survey of 1277 households to obtain data on their socio-economic characteristics, and the subsequent sorting of the waste arisings from each of the households surveyed. The analytical phase was divided between (a) testing the research hypotheses by matching each household's waste against its demographic and socio-economic characteristics, (b) developing statistical models capable of predicting the waste arisings from an individual household, and (c) developing a practical method for obtaining area-based estimates of waste arisings using readily available data from the national census. The latter method was found to represent a substantial improvement over conventional methods of waste estimation in terms of both accuracy and spatial flexibility. The research therefore represents a substantial contribution both to scientific knowledge of the process of household waste generation and to the practical management of waste arisings.
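
The prediction model itself is not reproduced in the abstract; the sketch below illustrates the area-based estimation idea under assumed numbers: per-household arisings rates (standing in for rates calibrated from the waste-sorting survey) are weighted by census counts of household types in an area.

# Hypothetical per-household arisings rates (kg/week) by household type,
# standing in for rates fitted from the calibration survey.
rate_kg_per_week = {
    "1 adult": 6.2,
    "2 adults": 10.8,
    "2 adults + children": 15.4,
    "pensioner household": 5.1,
}

# Census counts of each household type in one collection area (invented).
census_counts = {
    "1 adult": 1240,
    "2 adults": 2310,
    "2 adults + children": 1780,
    "pensioner household": 960,
}

# Area estimate = sum over household types of count * per-household rate.
total_kg = sum(census_counts[t] * rate_kg_per_week[t] for t in census_counts)
print(f"estimated arisings: {total_kg / 1000:.1f} tonnes/week")  # ~64.9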

Relevance: 100.00%

Abstract:

Productivity growth has long been associated with, among other things, the contestability of markets, which in turn depends on the ease with which potential competitors to the incumbent firms can enter the product market. There is a growing consensus that in emerging markets regulatory and institutional factors may have a greater influence on a firm's ability to enter a product market than the strategic positions adopted by incumbent firms. We examine this proposition in the context of India, where the industrial policies of the 1980s and the 1990s are widely believed to be pro-incumbent and pro-competition respectively, providing the setting for a natural experiment with 1991 as the watershed year. In our analysis we also take into consideration the possibility that the greater economic federalism associated with the reforms of the 1990s may have affected the distribution of industrial units across states after 1991. Our paper, which uses the experiences of the textiles and electrical machinery sectors during the two decades as the basis for the analysis, finds broad support for both these hypotheses.
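
The estimating equation is not given in the abstract; a stylised difference-in-differences specification of the kind such natural experiments typically use (my illustration only) would be

$y_{ist} = \alpha_i + \mu_s + \gamma_t + \beta \,(D_i \times \mathbf{1}[t \ge 1991]) + X_{ist}'\delta + \varepsilon_{ist}$

where $y_{ist}$ measures entry (for example, the count of new industrial units) in industry $i$ and state $s$ in year $t$, $D_i$ marks industries more exposed to the pro-competition reforms, and $\beta$ captures the post-1991 shift; the state effects $\mu_s$ absorb the redistribution of units across states that greater economic federalism may have caused.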

Relevance: 100.00%

Abstract:

Purpose: The purpose of this paper is to investigate and benchmark green operations initiatives in the automotive industry, as documented in the environmental reports of selected companies. The investigation roadmaps the main environmental initiatives taken by the world's three major car manufacturers and benchmarks them against each other. The categorisation of green operations initiatives provided in the paper can also help companies in other sectors to evaluate their green practices. Design/methodology/approach: The first part of the paper is based on the existing literature on green and sustainable operations and the "unsustainable" context of automotive production. The second part roadmaps and benchmarks green operations initiatives based on an analysis of secondary data from the automotive industry. Findings: The findings show that the world's three major car manufacturers are pursuing various environmental initiatives involving the following green operations practices: green buildings, eco-design, green supply chains, green manufacturing, reverse logistics and innovation. Research limitations/implications: The limitations of this paper begin with its selection of companies, which was made using production volume and country of origin as the principal criteria. There is ample evidence that other, smaller companies are pursuing more sophisticated and original environmental initiatives. Also, there might be a gap between what companies say they do in their environmental reports and what they actually do. Practical implications: This paper helps practitioners in the automotive industry to benchmark themselves against the major volume manufacturers on three different continents. Practitioners from other industries will also find it valuable to discover how the automotive industry is pursuing environmental initiatives beyond manufacturing, given that green operations practices broadly cover all the activities of the operations function. Originality/value: The originality of the paper lies in its up-to-date analysis of the environmental reports of automotive companies. The paper offers value for researchers and practitioners through its contribution to the green operations literature; for instance, the inclusion of green buildings among green operations practices has so far been neglected by most researchers and authors in the field of green and sustainable operations. © Emerald Group Publishing Limited.

Relevance: 100.00%

Abstract:

Bio-impedance analysis (BIA) provides a rapid, non-invasive technique for body composition estimation, offering a convenient alternative to standard techniques such as MRI, CT and DEXA scans for selected types of body composition analysis. The accuracy of BIA is limited because it is an indirect method of composition analysis: it relies on linear relationships between measured impedance and morphological parameters such as height and weight to derive its estimates. To overcome these underlying limitations, a multi-frequency segmental bio-impedance device was constructed through a series of iterative enhancements and improvements of existing BIA instrumentation; key features of the design included an easy-to-construct current source and a compact PCB design. The final device was trialled with 22 human volunteers, and the measured impedance was compared against body composition estimates obtained by DEXA scan, enabling the development of new techniques for making BIA predictions. To add a visual aspect to BIA, volunteers were also scanned in 3D using an inexpensive structured-light device (an Xbox Kinect sensor), and the 3D volumes of their limbs were compared with BIA measurements to further improve the predictions. A three-stage digital filtering scheme was implemented to enable the extraction of heart-rate data from the recorded bio-electrical signals. Additionally, modifications were introduced to measure changes in bio-impedance with motion; these could be adapted to further improve the accuracy and veracity of limb composition analysis. The findings in this thesis aim to give new direction to the prediction of body composition using BIA. The design development and refinement applied to BIA in this research programme suggest new opportunities to enhance the accuracy and clinical utility of BIA for the prediction of body composition; in particular, using bio-impedance to predict limb volumes would provide an additional metric for body composition measurement and help distinguish between fat and muscle content.
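
The filter design is not specified in the abstract; a hypothetical three-stage scheme (high-pass to remove baseline drift, low-pass to isolate the roughly 0.5-3 Hz cardiac band, then smoothing before peak counting), sketched with SciPy and an assumed 100 Hz sampling rate:

import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 100.0  # sampling rate in Hz (assumed)

def extract_heart_rate(z):
    """Estimate heart rate (bpm) from a bio-impedance trace z."""
    b, a = butter(2, 0.5 / (fs / 2), btype="high")   # stage 1: remove drift
    x = filtfilt(b, a, z)
    b, a = butter(2, 3.0 / (fs / 2), btype="low")    # stage 2: reject noise
    x = filtfilt(b, a, x)
    x = np.convolve(x, np.ones(5) / 5, mode="same")  # stage 3: smooth
    peaks, _ = find_peaks(x, height=0.0, distance=fs * 0.4)  # beats >=0.4 s apart
    return 60.0 * len(peaks) / (len(z) / fs)

# Synthetic test: a 1.2 Hz (72 bpm) pulse riding on drift and noise.
rng = np.random.default_rng(1)
t = np.arange(0, 30, 1 / fs)
z = (0.05 * np.sin(2 * np.pi * 1.2 * t) + 0.5 * t / 30
     + 0.02 * rng.standard_normal(len(t)))
print(f"{extract_heart_rate(z):.0f} bpm")  # ~72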

Relevance: 100.00%

Abstract:

JPEG2000 is a recently introduced image coding standard. In this paper we analyze the performance of the error resilience tools in JPEG2000 and present an analytical model to estimate the quality of JPEG2000-encoded images transmitted over wireless channels. The effectiveness of the analytical model is validated by simulation results. Furthermore, the analytical model is utilized by the base station to design efficient unequal error protection schemes for JPEG2000 transmission. In the design, a utility function is defined to trade off image quality against the cost of transmitting the image over the wireless channel. © 2002 IEEE.
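
The utility function is not reproduced in the abstract; a hypothetical form, maximised over candidate protection schemes, with all numbers invented as if produced by the analytical model:

LAMBDA = 0.5  # assumed relative weight of transmission cost

# (channel code rate, predicted quality in dB PSNR, relative cost) per scheme.
candidates = [
    (8 / 9, 28.1, 1.00),   # weak protection: cheap but error-prone
    (2 / 3, 31.4, 1.33),
    (1 / 2, 32.0, 1.78),   # strong protection: robust but costly
]

def utility(psnr_db, cost):
    """Trade image quality off against transmission cost."""
    return psnr_db - LAMBDA * 10.0 * cost  # cost scaled to comparable units

best = max(candidates, key=lambda c: utility(c[1], c[2]))
print(f"chosen code rate {best[0]:.2f}, utility {utility(best[1], best[2]):.1f}")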