84 results for Tree based intercrop systems
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis starts from the observation that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
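The 'classic' counts named above are simple enough to sketch. The toy below (in Python, with generic keyword-based counting rules invented purely for illustration; the thesis's Prolog-specific redefinitions are not reproduced here) shows the shape of McCabe's cyclomatic complexity and Halstead's program length:

```python
# Illustrative sketch only: McCabe's V(G) counted as "decision points + 1",
# and Halstead's program length N = N1 + N2 (total operators + operands).
# The token set below is an assumption for a generic language, not Prolog.

DECISION_TOKENS = {"if", "elif", "while", "for", "case", "and", "or"}

def cyclomatic_complexity(tokens):
    """V(G) for a single routine: one plus the number of decision points."""
    return 1 + sum(1 for t in tokens if t in DECISION_TOKENS)

def halstead_length(operators, operands):
    """Halstead length N = total operator occurrences + total operand occurrences."""
    return len(operators) + len(operands)

tokens = ["if", "x", ">", "0", "and", "y", "while", "z"]
print(cyclomatic_complexity(tokens))  # 4: 'if', 'and', 'while', plus 1
```

Error-proneness studies of the kind reported then correlate such counts, taken over many programs, against post-release reported errors.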
Abstract:
We report for the first time on the limitations in the operational power range of few-mode fiber based transmission systems, employing 28 Gbaud quadrature phase shift keying transponders, over 1,600 km. It is demonstrated that if an additional mode is used on a pre-existing few-mode transmission link, and allowed to optimize its performance, it will have a significant impact on the pre-existing mode. In particular, we show that for low mode coupling strengths (weak coupling regime), the newly added variable-power mode does not considerably impact the fixed-power existing mode, with performance penalties of less than 2 dB (in Q-factor). On the other hand, as mode coupling strength is increased (strong coupling regime), the individual launch power optimization significantly degrades the system performance, with penalties up to ∼6 dB. Our results further suggest that mutual power optimization of both the fixed-power and variable-power modes reduces power-allocation-related penalties to less than 3 dB, for any given coupling strength, for both high and low differential mode delays. © 2013 Optical Society of America.
Abstract:
We demonstrate a novel and simple sensor interrogation scheme for fiber Bragg grating (FBG) based sensing systems. In this scheme, a chirped-FBG-based Sagnac loop is used as a wavelength-dependent receiver, and a stable and linear readout response is realised. It is a significant advantage of this scheme that the sensitivity and the measurement wavelength range can be easily adjusted by controlling the chirp of the FBG or using an optical delay line in the Sagnac loop.
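The interrogation principle reduces to a linear map between the sensing FBG's Bragg wavelength and the received power, which can then be inverted. A minimal sketch, with a slope and reference wavelength that are assumed values rather than figures from the paper:

```python
# Sketch of a linear wavelength-dependent receiver: within the working range,
# received power varies linearly with the sensor's Bragg wavelength, so the
# wavelength (hence strain/temperature) is recovered by inverting the line.
# SLOPE and LAMBDA_REF are illustrative assumptions.

SLOPE = 0.5          # normalised power per nm of Bragg shift (assumed)
LAMBDA_REF = 1550.0  # nm, centre of the linear working range (assumed)

def readout_power(bragg_wavelength_nm):
    """Normalised received power for a given sensor Bragg wavelength."""
    return 0.5 + SLOPE * (bragg_wavelength_nm - LAMBDA_REF)

def recover_wavelength(power):
    """Invert the linear response to recover the Bragg wavelength (nm)."""
    return LAMBDA_REF + (power - 0.5) / SLOPE

p = readout_power(1550.4)      # sensor strained: Bragg peak at 1550.4 nm
print(recover_wavelength(p))   # recovers ~1550.4 nm
```

Adjusting the FBG chirp (or the loop's optical delay) changes the effective `SLOPE`, which is how the scheme trades sensitivity against measurement range.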
Abstract:
We present novel terahertz (THz) emitting, optically pumped quantum dot (QD) photoconductive (PC) materials and antenna structures based on them, for both pulsed and CW pumping regimes. Quantum dot and microantenna design - Presented here are design considerations for the semiconductor materials in our novel QD-based photoconductive antenna (PCA) structures, metallic microantenna designs, and their implementation as part of a complete THz source or transceiver system. Layers of implanted QDs can be used as the photocarrier lifetime shortening mechanism [1,2]. In our research we use InAs:GaAs QD structures of varying dot layer number and distributed Bragg reflector (DBR) reflectivity range. According to the observed dependence of carrier lifetimes on QD layer periodicity [3], it is reasonable to assume that electron lifetimes can potentially be reduced down to 0.45 ps in such structures. Both of these features, long excitation wavelength and short carrier lifetime, suggest the feasibility of QD antennas for THz generation and detection. In general, relatively simple antenna configurations were used here, including coplanar striplines (CPS), Hertzian-type dipoles and bow-ties for broadband operation, and log-spiral (LS) or log-periodic (LP) 'toothed' geometries for a CW operation regime. Experimental results - Several lasers are used for antenna pumping: a Ti:Sapphire femtosecond laser, as well as single- [4] and double- [5] wavelength, and pulsed [6] QD lasers. For detection of the THz signal, different schemes and devices were used, e.g. a helium-cooled bolometer, a Golay cell and a second PCA for coherent THz detection in a traditional time-domain measurement scheme. Fig. 1 shows the typical THz output power trend from a 5 µm-gap LP QD PCA pumped using a tunable QD LD, with the optical pump spectrum shown in (b). Summary - QD-based THz systems have been demonstrated as a feasible and highly versatile solution.
The implementation of QD LDs as pump sources could be a major step towards an ultra-compact, electrically controllable transceiver system that would increase the scope of data analysis due to the high pulse repetition rates of such LDs [3], allowing real-time THz TDS and data acquisition. Future steps in the development of such systems now lie in the further investigation of QD-based THz PCA structures and devices, particularly with regard to their compatibility with QD LDs as pump sources. [1] E. U. Rafailov et al., "Fast quantum-dot saturable absorber for passive mode-locking of solid-state lasers," IEEE Photon. Technol. Lett., vol. 16, pp. 2439-2441 (2004). [2] E. Estacio, "Strong enhancement of terahertz emission from GaAs in InAs/GaAs quantum dot structures," Appl. Phys. Lett., vol. 94, p. 232104 (2009). [3] C. Kadow et al., "Self-assembled ErAs islands in GaAs: Growth and subpicosecond carrier dynamics," Appl. Phys. Lett., vol. 75, pp. 3548-3550 (1999). [4] T. Kruczek, R. Leyman, D. Carnegie, N. Bazieva, G. Erbert, S. Schulz, C. Reardon, and E. U. Rafailov, "Continuous wave terahertz radiation from an InAs/GaAs quantum-dot photomixer device," Appl. Phys. Lett., vol. 101 (2012). [5] R. Leyman, D. I. Nikitichev, N. Bazieva, and E. U. Rafailov, "Multimodal spectral control of a quantum-dot diode laser for THz difference frequency generation," Appl. Phys. Lett., vol. 99 (2011). [6] K. G. Wilcox, M. Butkus, I. Farrer, D. A. Ritchie, A. Tropper, and E. U. Rafailov, "Subpicosecond quantum dot saturable absorber mode-locked semiconductor disk laser," Appl. Phys. Lett., vol. 94, 2511. © 2014 IEEE.
Abstract:
Cellular thiols are critical moieties in signal transduction and the regulation of gene expression, and ultimately are determinants of specific protein activity. Whilst protein-bound thiols are the critical effector molecules, low molecular weight thiols, such as glutathione, play a central role in cytoprotection through (1) direct consumption of oxidants, (2) regeneration of protein thiols and (3) export of glutathione-containing mixed disulphides. The brain is particularly vulnerable to oxidative stress, as it consumes 20% of the oxygen load, contains high concentrations of polyunsaturated fatty acids and, in certain regions, iron, and expresses low concentrations of enzymic antioxidants. There is substantial evidence for a role for oxidative stress in neurodegenerative disease, where excitotoxicity, redox cycling and mitochondrial dysfunction have been postulated to contribute to the enhanced oxidative load. Others have suggested that loss of important trophic factors may underlie neurodegeneration. However, the two are not mutually exclusive: using cell-based model systems, low molecular weight antioxidants have been shown to play an important neuroprotective role in vitro, where neurotrophic factors have been suggested to modulate glutathione levels. Glutathione levels are regulated by substrate availability, synthetic and metabolic enzyme activity, and the presence of other antioxidants, which, according to the redox potential, consume or regenerate GSH from its oxidised partner. We have therefore investigated the hypothesis that amyloid beta neurotoxicity is mediated by reactive oxygen species, and that trophic factor cytoprotection against oxidative stress is achieved through regulation of glutathione levels. Using PC12 cells as a model system, amyloid beta 25-35 caused a shift in DCF fluorescence after four hours in culture. This fluorescence shift was attenuated by both desferrioxamine and NGF.
After four hours, cellular glutathione levels were depleted by as much as 75%; however, 24 hours after oxidant exposure, glutathione concentration was restored to twice the concentration seen in controls. NGF prevented both the loss of viability seen after 24 hours of amyloid beta treatment and the depletion of glutathione. NGF decreased the total cellular glutathione concentration but did not affect expression of GCS. In conclusion, loss of glutathione precedes cell death in PC12 cells. However, at sublethal doses the surviving fraction responds to oxidative stress by increasing glutathione levels, and this is achieved, at least in part, at the gene level through upregulation of GCS. Whilst NGF does protect against oxidative toxicity, this is not achieved through upregulation of GCS or glutathione.
Abstract:
We propose a computationally efficient method for per-channel dispersion optimisation applied to 50 GHz-spaced N × 20-Gbit/s wavelength division multiplexing return-to-zero differential phase shift keying transmission in non-zero dispersion-shifted fibre based submarine systems. Crown Copyright © 2010.
Abstract:
Computer-Based Learning systems of one sort or another have been in existence for almost 20 years, but they have yet to achieve real credibility within commerce, industry or education. A variety of reasons could be postulated for this, typically: cost, complexity, inefficiency, inflexibility and tedium. Obviously different systems deserve different levels and types of criticism, but it still remains true that Computer-Based Learning (CBL) is falling significantly short of its potential. Experience of a small but highly successful CBL system within a large, geographically distributed industry (the National Coal Board) prompted an investigation into currently available packages, the original intention being to purchase the most suitable software and run it on existing computer hardware, alongside existing software systems. It became apparent that none of the available CBL packages were suitable, and a decision was taken to develop an in-house Computer-Assisted Instruction system according to the following criteria: cheap to run; easy to author course material; easy to use; requiring no computing knowledge to use (as either an author or student); efficient in the use of computer resources; and offering a comprehensive range of facilities at all levels. This thesis describes the initial investigation, resultant observations and the design, development and implementation of the SCHOOL system. One of the principal characteristics of SCHOOL is that it uses a hierarchical database structure for the storage of course material, thereby inherently providing a great deal of the power, flexibility and efficiency originally required. Trials using the SCHOOL system on IBM 303X series equipment are also detailed, along with proposed and current development work on what is essentially an operational CBL system within a large-scale industrial environment.
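The hierarchical storage idea can be pictured as a tree of course material addressed by a path of keys. The sketch below is purely illustrative: the levels (course, module, frame), names and frame text are invented, not taken from SCHOOL:

```python
# Toy hierarchical course store in the spirit described: course material held
# as a tree (course -> module -> frame), retrieved by walking a key path.
# All names and frame texts here are invented for illustration.

course = {
    "Safety": {
        "Module 1": {"Frame 1": "Always wear a helmet underground."},
        "Module 2": {"Frame 1": "Report gas alarms immediately."},
    }
}

def fetch(tree, path):
    """Walk the hierarchy along a path of keys and return the stored item."""
    node = tree
    for key in path:
        node = node[key]
    return node

print(fetch(course, ["Safety", "Module 1", "Frame 1"]))
```

A hierarchy like this lets authors address, share and reorganise whole subtrees of material, which is the source of the flexibility and efficiency the abstract attributes to the design.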
Abstract:
Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm, for segmenting non-textured regions; and the Granlund method, for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate, comparisons are drawn between different implementations.
The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications; and on the issues related to the engineering of concurrent image processing applications.
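The Row-Column method mentioned above decomposes the 2-D transform into independent 1-D FFTs, which is what makes it natural to map onto an array of processors. A minimal serial sketch of the decomposition (in Python/NumPy rather than Occam on Transputers, purely to show the structure):

```python
# Row-Column method: a 2-D DFT computed as 1-D FFTs along every row,
# followed by 1-D FFTs along every column of the intermediate result.
import numpy as np

def fft2_row_column(image):
    rows_done = np.fft.fft(image, axis=1)  # 1-D FFT of each row
    return np.fft.fft(rows_done, axis=0)   # then of each column

rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
# The decomposition agrees with the library's direct 2-D FFT
assert np.allclose(fft2_row_column(img), np.fft.fft2(img))
```

On a processor array, each processor transforms its assigned rows, the data is redistributed (a transpose), and each processor then transforms its assigned columns; the row and column passes are each embarrassingly parallel.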
Abstract:
'British Racial Discourse' is a study of political discourse about race and race-related matters. The explanatory theory is adapted from current sociological studies of ideology, with a heavy emphasis on the tradition developed from Marx and Engels's 'Feuerbach'. The empirical data are drawn from the parliamentary debates on immigration and the Race Relations Bills, Conservative and Labour Party Conference Reports, and a set of interviews with Wolverhampton Borough councillors. Although the thesis has broader significance for British political discourse about race, it is particularly concerned with the responses of members of the two main political parties, rather than with the more overt and sensational racism of certain extreme Right-wing groups. Indeed, as the study progresses, it focuses more and more narrowly on the phenomenon of 'deracialised' discourse, and the details of the predominantly class-based justificatory systems of the Conservative and Labour Parties. Of particular interest are the argument forms (used in the debates on immigration and race relations) which manage to obscure the white electorate's responsibility for prejudice and discrimination. Such discursive forms are of major significance for understanding British race relations, and their detailed examination provides an insight into the way in which 'ideological facades' are created and maintained.
Abstract:
The enhanced immune responses for DNA and subunit vaccines potentiated by the surfactant-vesicle-based delivery systems outlined in the present study provide proof of principle for the beneficial aspects of vesicle-mediated vaccination. The dehydration-rehydration technique was used to entrap plasmid DNA or subunit antigens into lipid-based (liposome) or non-ionic surfactant-based (niosome) dehydration-rehydration vesicles (DRV). Using this procedure, it was shown that both these types of antigen can be effectively entrapped in DRV liposomes and DRV niosomes. The vesicle size of DRV niosomes was shown to be twice the diameter (~2 µm) of that of their liposome counterparts. Incorporation of cryoprotectants such as sucrose in the DRV procedure resulted in reduced vesicle sizes while retaining high DNA incorporation efficiency (~95%). Transfection studies in COS 7 cells demonstrated that the choice of cationic lipid, the helper lipid, and the method of preparation all influenced transfection efficiency, indicating a strong interdependency of these factors. This phenomenon was further reinforced when 1,2-dioleoyl-sn-glycero-3-phosphoethanolamine (DOPE):cholesteryl 3β-[N-(N′,N′-dimethylaminoethane)-carbamoyl]cholesterol (DC-Chol)/DNA complexes were supplemented with non-ionic surfactants. Morphological analysis of these complexes using transmission electron microscopy and environmental scanning electron microscopy (ESEM) revealed the presence of heterogeneous structures which, in addition to the fusogenic properties of DOPE, may be essential for efficient transfection. In vivo evaluation of these DNA-incorporating vesicle systems in BALB/c mice showed weak antibody and cell-mediated immune (CMI) responses. A subsequent mock challenge with hepatitis B antigen demonstrated that 1-monopalmitoyl glycerol (MP) based DRV is a more promising DNA vaccine adjuvant.
Studying these DRV systems as adjuvants for the hepatitis B subunit antigen (HBsAg) revealed a balanced antibody/CMI response profile, with HBsAg-specific antibody and cytokine responses higher than those of unadjuvanted antigen. The effect of the addition of MP, cholesterol and trehalose 6,6′-dibehenate (TDB) on the stability and immuno-efficacy of dimethyldioctadecylammonium bromide (DDA) vesicles was investigated. Differential scanning calorimetry showed a reduction in the transition temperature of DDA vesicles by ~12°C when surfactants were incorporated. ESEM of the MP-based DRV system indicated increased vesicle stability upon incorporation of antigen. The adjuvant activity of these systems, tested in C57BL/6j mice against three subunit antigens, i.e. the mycobacterial fusion protein Ag85B-ESAT-6 and two malarial antigens, merozoite surface protein-1 (MSP1) and glutamate-rich protein (GLURP), revealed that while MP- and DDA-based systems induced comparable antibody responses, DDA-based systems induced powerful CMI responses.
Abstract:
This thesis is based upon a case study of the adoption of digital, electronic, microprocessor-based control systems by Albright & Wilson Limited, a UK chemical producer. It offers an explanation of the company's changing technology policy between 1978 and 1981, by examining its past development, internal features and industrial environment. Part One of the thesis gives an industry-level analysis which relates the development of process control technology to changes in the economic requirements of production. The rapid diffusion of microcomputers and other microelectronic equipment in the chemical industry is found to be a response to the general need to raise the efficiency of all processes, imposed by the economic recession following 1973. Part Two examines the impact of these technical and economic changes upon Albright & Wilson Limited. The company's slowness in adopting new control technology is explained by its long history, in which trends are identified which produced the conservatism of the 1970s. By contrast, a study of Tenneco Incorporated, a much more successful adopter of automating technology, is offered, with an analysis of the new technology policy of adoption of such equipment which it imposed upon Albright & Wilson following the latter's takeover by Tenneco in 1978. Some indications of the consequences of this new policy of widespread adoption of microprocessor-based control equipment are derived from a study of the first Albright & Wilson plant to use such equipment. The thesis concludes that companies which fail to adopt the new control technology rapidly may not survive in the recessionary environment, that long-established British companies may lack the flexibility to make such necessary changes, and that multinational companies may have an important role in the planned transfer and adoption of new production technology through their subsidiaries in the UK.
Abstract:
Introduction: Adjuvants potentiate immune responses, reducing the amount and dosing frequency of antigen required for inducing protective immunity. Adjuvants are of special importance when considering subunit, epitope-based or more unusual vaccine formulations lacking significant innate immunogenicity. While numerous adjuvants are known, only a few are licensed for human use, principally alum and squalene-based oil-in-water adjuvants. Alum, the most commonly used, is suboptimal. There are many varieties of adjuvant: proteins, oligonucleotides, drug-like small molecules and liposome-based delivery systems with intrinsic adjuvant activity being perhaps the most prominent. Areas covered: This article focuses on small molecules acting as adjuvants, with the author reviewing their current status while highlighting their potential for systematic discovery and rational optimisation. Known small molecule adjuvants (SMAs) can be synthetically complex natural products, small oligonucleotides or drug-like synthetic molecules. The author provides examples of each class, discussing adjuvant mechanisms relevant to SMAs, and exploring the high-throughput discovery of SMAs. Expert opinion: SMAs, particularly synthetic drug-like adjuvants, are amenable to the plethora of drug-discovery techniques able to optimise the properties of biologically active small molecules. These range from laborious synthetic modifications to modern, rational, effort-efficient computational approaches, such as QSAR and structure-based drug design. In principle, any property or characteristic can thus be designed in or out of compounds, allowing us to tailor SMAs to specific biological functions, such as targeting specific cells or pathways, in turn affording the power to tailor SMAs to better address different diseases.
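As a toy illustration of the QSAR approach mentioned in the expert opinion, activity can be modelled as a linear function of molecular descriptors. Every number below (descriptors, activities) is invented purely to show the mechanics of the fit, not real adjuvant data:

```python
# Minimal QSAR-style sketch: least-squares fit of (synthetic) activity
# against (synthetic) molecular descriptors, plus an intercept term.
import numpy as np

# rows: hypothetical compounds; columns: hypothetical descriptors
# (e.g. logP, MW/100, H-bond donor count) - all values invented
X = np.array([[1.2, 2.5, 1.0],
              [0.8, 3.1, 2.0],
              [2.0, 1.9, 0.0],
              [1.5, 2.8, 1.0]])
y = np.array([0.6, 0.9, 0.3, 0.7])    # invented measured activities

X1 = np.hstack([X, np.ones((4, 1))])  # append an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
predicted = X1 @ coef                 # in-sample predictions
print(np.round(predicted, 2))
```

A real QSAR campaign would of course use many more compounds than parameters, cross-validate, and often use nonlinear models; the point is only that a fitted model lets properties be "designed in or out" by steering descriptors.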
Abstract:
Inference and optimisation of real-valued edge variables in sparse graphs are studied using tree-based Bethe approximation optimisation algorithms. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained for networks in various cases. These include different cost functions, connectivity values, constraints on the edge bandwidth, and the case of multiclass optimisation.
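The class of problems studied can be made concrete with a toy instance: real variables live on edges, with a quadratic cost, coupled by linear constraints at the nodes. The sketch below solves one such instance by a direct least-squares solve as a reference point; the paper's contribution is message-passing (Bethe approximation) updates that reach such solutions distributedly on large sparse graphs, which is not reproduced here:

```python
# Toy edge-variable problem: find edge values x_e satisfying node constraints
# incidence @ x = b (each node sources/sinks a fixed amount). Solved here by
# least squares as a reference, not by the paper's message-passing algorithm.
import numpy as np

# Path graph 0-1-2 with edges e01, e12 (node-edge incidence, oriented)
incidence = np.array([[ 1.0,  0.0],   # node 0 emits onto e01
                      [-1.0,  1.0],   # node 1 relays e01 -> e12
                      [ 0.0, -1.0]])  # node 2 absorbs e12
b = np.array([2.0, 0.0, -2.0])        # node 0 sources 2 units, node 2 sinks 2

x, *_ = np.linalg.lstsq(incidence, b, rcond=None)
print(x)  # both edges carry 2 units
```

On a tree such as this path graph, the Bethe approximation is exact, which is why tree-based updates are a natural starting point for sparse (locally tree-like) graphs.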
Abstract:
This paper explores attitudes and perceptions towards entrepreneurs in three Central Eastern European (CEE) countries undergoing transition from planned to market-based economic systems. Entrepreneurs and small and medium-sized enterprises (SMEs) play a critical role in this transformation process. Study one examines whether governments and the general public are perceived as supportive of entrepreneurs. Such perceptions might eventually increase the number of entrepreneurs, as entrepreneurship would be seen as a legitimate career choice (cf. Etzioni, 1987). Study two explores, using a student sample, whether the concept 'entrepreneur' is interpreted in the same way in the three cultures. Cross-cultural aspects and support measures for entrepreneurship are discussed.