832 results for Deployment of Federal Institutes
Abstract:
This thesis presents a theoretical investigation of three topics concerning nonlinear optical pulse propagation in optical fibres. The techniques used are mathematical analysis and numerical modelling. Firstly, dispersion-managed (DM) solitons in fibre lines employing a weak dispersion map are analysed by means of a perturbation approach. In the case of small dispersion map strengths the average pulse dynamics is described by a perturbed nonlinear Schrödinger (NLS) equation. Applying a perturbation theory, based on the Inverse Scattering Transform method, an analytic expression for the envelope of the DM soliton is derived. This expression correctly predicts the power enhancement arising from the dispersion management. Secondly, autosoliton transmission in DM fibre systems with periodic in-line deployment of nonlinear optical loop mirrors (NOLMs) is investigated. The use of in-line NOLMs is addressed as a general technique for all-optical passive 2R regeneration of return-to-zero data in high-speed transmission systems with strong dispersion management. By system optimisation, the feasibility of ultra-long single-channel and wavelength-division-multiplexed data transmission at bit rates ≥ 40 Gbit/s in standard fibre-based systems is demonstrated. The tolerance limits of the results are defined. Thirdly, solutions of the NLS equation with gain and normal dispersion, which describes optical pulse propagation in an amplifying medium, are examined. A self-similar parabolic solution in the energy-containing core of the pulse is matched through Painlevé functions to the linear low-amplitude tails. The analysis provides a full description of the features of high-power pulses generated in an amplifying medium.
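For reference, the propagation model underlying all three topics can be written, in a standard dimensionless form (the notation here is assumed, not taken from the thesis), as an NLS equation with periodically varying dispersion and optional gain:

```latex
i\,\frac{\partial u}{\partial z} + \frac{d(z)}{2}\,\frac{\partial^2 u}{\partial t^2} + |u|^2 u = i\,g\,u
```

Here u(z,t) is the field envelope, d(z) the periodic group-velocity dispersion of the map, and g the distributed gain: g = 0 gives the DM-soliton setting of the first two topics, while g > 0 with normal dispersion gives the amplifier problem of the third, whose energy-containing core has the characteristic parabolic intensity profile |u|² ≈ a(z)²[1 − t²/τ(z)²] for |t| ≤ τ(z).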
Abstract:
The research described in this thesis investigates three issues related to the use of expert systems (ESs) for decision making in organizations. These are: the effectiveness of ESs when used in different roles, either to replace a human decision maker or to advise one; users' behaviour and opinions towards using an expert advisory system; and the possibility of organization-wide deployment of expert systems, including the role of an ES at different organizational levels. The research was based on the development of expert systems within a business game environment, a simulation of a manufacturing company. This was chosen to give more control over the 'experiments' than would be possible in a real organization. An expert system (EXGAME) was developed, based on a structure derived from Anthony's three levels of decision making, to manage the simulated company in the business game itself with little user intervention. On the basis of EXGAME, an expert advisory system (ADGAME) was built to help game players make better decisions in managing the game company. EXGAME and ADGAME are thus two expert systems in the same domain performing different roles; it was found that ADGAME had, in places, to be different from EXGAME, not simply an extension of it. EXGAME was tested several times against human rivals and was evaluated by measuring its performance. ADGAME was also tested by different users and was assessed by measuring the users' performance and analysing their opinions of it as a helpful decision-making aid. The results showed that an expert system was able to replace a human at the operational level, but had difficulty at the strategic level. They also showed the success of the organization-wide deployment of expert systems in this simulated company.
Abstract:
Predictive models of peptide-Major Histocompatibility Complex (MHC) binding affinity are important components of modern computational immunovaccinology. Here, we describe the development and deployment of a reliable peptide-binding prediction method for a previously poorly-characterized human MHC class I allele, HLA-Cw*0102.
Abstract:
This work focuses on translated political speeches made by Canada's prime minister during times of national crisis. Delivered orally in both English and French, this translation-based political discourse is examined in a tripartite manner, offering the reader contextualisation of the corpus researched; description of the translation shifts encountered; and interpretation of the findings. The translation of this discourse varies greatly depending on the era observed. Since the latter half of the 20th century, for instance, different text types have been assigned to different categories of translators. As for the translative shifts revealed in the corpus, they have been categorised as either paratextual or textual divergences. Paratextual differences indicate that the Canadian prime minister's national statements in English and French do not necessarily seek to portray symmetry between what is presented in each language. Each version of a national speech thus retains a relative degree of visual autonomy. In sum, accumulated instances of paratextual divergence suggest an identifiable paratextual strategy, whereby translation contributes to the illusion that there is only one federal language: the reader's. The deployment of this paratextual strategy obscures the fact that such federal expression occurs in two official languages. The illusion of monolingualism generates two different world views, one for each linguistic community. Similarly, another strategy is discerned in the analysis of translative textual shifts: a textual strategy useful in highlighting some of the power struggles inherent in translated federal expression. Textual interpretation of the data identifies four federal translation tendencies: legitimisation and characterisation of linguistic communities; dislocation of the speech-event; neutralisation of (linguistic) territory; and valorisation of federalism.
Abstract:
There were three principal research aims. Firstly, since Lean is, and should always be, regarded as a business model, as exemplified by Toyota's dedication to finding better ways of producing cars, the first aim was to investigate whether organisations embracing Lean as a philosophy were indeed more successful. An adapted balanced scorecard was used which embraced strategic and operational indices as well as indices focused on the future prospects of an organisation. Secondly, it was necessary to determine explicitly and precisely whether an organisation had espoused Lean as a philosophy as opposed to merely another process or strategy. Thirdly, since Lean has to be envisaged as a never-ending journey, it was important to map out the Lean journey and to be able to identify the stage an organisation occupies at any particular point in its overall implementation. This affords an opportunity to advise an organisation of the specific requirements it needs to satisfy should it wish to embrace Lean as a philosophy. The methodological approach focused on the deployment of survey questionnaires in sixty-eight organisations and on seven extensive case studies in manufacturing organisations of varying sizes. The CIMA organisational classification, the Puttick grid and the Product-Process matrix were used to analyse the range of organisations used in this investigation. Since there was a requirement to investigate whether Lean indeed equates to success, pertinent performance measurement was considered decisive; the DMP Model (Maltz et al., 2003) was modified to perform this role. A recurring theme, both in the literature concerning the implementation of Lean and in the research itself, revolves around the notion of corporate culture; its relevance is explored further within the analysis. In accepting the premise that Lean constitutes a journey, it was fundamental to map that journey. Prevalent frameworks are deficient in identifying the sustainability and ideological facets of Lean. Consequently, an extensive Lean audit was developed and piloted in twenty disparate organisations.
Abstract:
Firstly, we numerically model a practical 20 Gb/s undersea configuration employing the Return-to-Zero Differential Phase Shift Keying data format. The modelling is completed using the Split-Step Fourier Method to solve the Generalised Nonlinear Schrödinger Equation. We optimise the dispersion map and per-channel launch power of these channels and investigate how the choice of pre/post compensation can influence the performance. After obtaining these optimal configurations, we investigate the Bit Error Rate estimation of these systems and we see that estimation based on Gaussian statistics of the electrical current is appropriate for systems of this type, indicating quasi-linear behaviour. The introduction of narrower pulses due to the deployment of quasi-linear transmission decreases the tolerance to chromatic dispersion and intra-channel nonlinearity. We use tools from mathematical statistics to study the behaviour of these channels in order to develop new methods to estimate Bit Error Rate. In the final section, we consider the estimation of Eye Closure Penalty, a popular measure of signal distortion. Using a numerical example and assuming the symmetry of eye closure, we see that we can simply estimate Eye Closure Penalty using Gaussian statistics. We also see that the statistics of the logical ones dominate the statistics of signal distortion in the case of Return-to-Zero On-Off Keying configurations.
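The split-step Fourier method mentioned above advances the field by alternating a linear (dispersive) step applied in the frequency domain with a nonlinear step applied in the time domain. Below is a minimal symmetric split-step sketch for a scalar NLSE; the function and parameter names are illustrative and do not reproduce the thesis's actual 20 Gb/s configuration:

```python
import numpy as np

def split_step_nlse(u0, dt, dz, n_steps, beta2, gamma, alpha=0.0):
    """Propagate envelope u0 with the symmetric split-step Fourier method
    for u_z = -i*(beta2/2)*u_tt + i*gamma*|u|^2*u - (alpha/2)*u."""
    n = len(u0)
    w = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)          # angular frequencies
    half_lin = np.exp((1j * (beta2 / 2.0) * w**2 - alpha / 2.0) * dz / 2.0)
    u = u0.astype(complex)
    for _ in range(n_steps):
        u = np.fft.ifft(half_lin * np.fft.fft(u))      # half dispersive step
        u *= np.exp(1j * gamma * np.abs(u)**2 * dz)    # full nonlinear step
        u = np.fft.ifft(half_lin * np.fft.fft(u))      # half dispersive step
    return u
```

The symmetric (Strang) splitting used here is second-order accurate in the step size dz, which is why it is the usual workhorse for this class of simulation.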
Abstract:
TCP remains the dominant transport protocol of choice for transporting data reliably across the heterogeneous Internet, and this thesis explores the end-to-end performance issues and behaviours of TCP senders when transferring data to wireless end-users. The focus throughout is on end-users located specifically within 802.11 WLANs at the edges of the Internet, a largely untapped area of work. For researchers wanting to study the performance of TCP accurately over heterogeneous conditions, this thesis proposes a flexible wired-to-wireless experimental testbed that better reflects conditions in the real world. To examine the interplay between TCP in the wired domain and the IEEE 802.11 WLAN protocols, this thesis proposes a more accurate methodology for gauging the transmission and error characteristics of real-world 802.11 WLANs, and aims to correlate any findings with the functionality of fixed TCP senders. Given the popularity of Linux as the operating system of many of the Internet's data servers, this thesis studies and evaluates various sender-side TCP congestion control implementations within the recent Linux v2.6. A selection of the implementations is put under systematic testing using real-world wired-to-wireless conditions in order to screen and present viable candidates for further development and usage in the modern-day heterogeneous Internet. Overall, this thesis comprises a set of systematic evaluations of TCP senders over 802.11 WLANs, incorporating measurements in the form of simulations, emulations, and a real-world-like experimental testbed. The goal of the work is to ensure that all aspects concerned are comprehensively investigated in order to establish rules that can help to decide under which circumstances the deployment of TCP is optimal, i.e. a set of paradigms for advancing the state of the art in data transport across the Internet.
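As background to that congestion-control comparison, the Linux sender-side algorithms are variations on the classic AIMD (additive-increase, multiplicative-decrease) window dynamics of TCP Reno. The per-RTT sketch below is a textbook idealisation for orientation only; it is not any of the implementations evaluated in the thesis, and actual Linux v2.6 algorithms depart from it substantially:

```python
def reno_cwnd_trace(loss_rtts, n_rtts, init_cwnd=1, ssthresh=64):
    """Idealised per-RTT congestion-window trace for a Reno-style sender.
    loss_rtts: set of RTT indices where a triple-duplicate-ACK loss is
    detected (fast retransmit / fast recovery, heavily simplified)."""
    cwnd, trace = init_cwnd, []
    for rtt in range(n_rtts):
        trace.append(cwnd)
        if rtt in loss_rtts:             # multiplicative decrease: halve
            ssthresh = max(cwnd // 2, 2)
            cwnd = ssthresh
        elif cwnd < ssthresh:            # slow start: double per RTT
            cwnd = min(cwnd * 2, ssthresh)
        else:                            # congestion avoidance: +1 per RTT
            cwnd += 1
    return trace

# Example: losses at RTTs 8 and 15 produce the classic sawtooth.
print(reno_cwnd_trace({8, 15}, 20))
```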
Abstract:
Financial institutes are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of financial institutes (the banking sector) in GCC countries. Since the selected variables include negative data for some banks and positive data for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform this evaluation. Furthermore, since the SORM evaluation result provides limited information for any decision maker (bankers, investors, etc.), we proposed a second-stage analysis using the classification and regression (C&R) method to obtain further results, combining the SORM results with other environmental data (financial, economic and political) to set rules for the efficient banks; hence, the results will be useful to bankers seeking to improve their banks' performance and to investors seeking to maximise their returns. There are two main approaches to evaluating the performance of Decision Making Units (DMUs), and under each there are different methods with different assumptions. The parametric approach is based on econometric regression theory, while the nonparametric approach is based on mathematical linear programming theory. Under the nonparametric approach there are two methods: Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH); under the parametric approach there are three: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA) and Distribution-Free Analysis (DFA). The literature shows that DEA and SFA are the most applicable methods in the banking sector, with DEA seemingly the most popular among researchers. However, DEA, like SFA, still faces many challenges; one of these is how to deal with negative data, since DEA requires the assumption that all input and output values are non-negative, while in many applications negative outputs can appear, e.g. losses in contrast with profits. Although a few models have been developed under DEA to deal with negative data, we believe that each of them has its own limitations; therefore we developed the Semi-Oriented Radial Model (SORM), which can handle the negativity issue in DEA. The application results using SORM show that the overall performance of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007), owing to the second Gulf War and the international financial crisis, it remained higher than the efficiency scores of counterparts in other countries. Banks operating in Saudi Arabia appear to be the most efficient, followed by UAE, Omani and Bahraini banks, while banks operating in Qatar and Kuwait appear to be the least efficient; this is because these two countries were the most affected by the second Gulf War. The results also show that there is no statistical relationship between operating style (Islamic or conventional) and bank efficiency. Even though there are no statistically significant differences due to operating style, Islamic banks appear to be more efficient than conventional banks, with an average efficiency score of 86.33% compared with 85.38% for conventional banks. Furthermore, the Islamic banks seem to have been more affected by the political crisis (the second Gulf War), whereas conventional banks seem to have been more affected by the financial crisis.
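To illustrate where that non-negativity assumption enters, the following is a minimal sketch of the standard input-oriented CCR DEA envelopment model (the baseline model, not the SORM variant developed in this study); function names and the example data are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.
    X: (m inputs x n units), Y: (s outputs x n units); the classical
    model assumes every entry is non-negative, which is exactly the
    assumption that fails for banks reporting losses (hence SORM).
    Decision vector: [theta, lambda_1, ..., lambda_n]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # minimise theta
    A_in = np.hstack([-X[:, [j0]], X])            # sum lam*x <= theta*x0
    A_out = np.hstack([np.zeros((s, 1)), -Y])     # sum lam*y >= y0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(0, None)] * (n + 1),
                  method="highs")
    return res.x[0]                               # theta == 1.0 => efficient

# Example with illustrative data: 2 inputs, 1 output, 4 banks.
X = np.array([[20.0, 30.0, 40.0, 20.0],
              [150.0, 200.0, 100.0, 200.0]])
Y = np.array([[100.0, 150.0, 160.0, 120.0]])
print([round(dea_ccr_efficiency(X, Y, j), 3) for j in range(4)])
```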
Abstract:
This article is a contribution to an emerging scholarship on the role of rhetoric, persona and celebrity, and on the effects of performance on the political process. We analyse party leader Ed Miliband at the UK Labour Party Conference in Manchester in 2012. Our analysis identifies how, through the performance of himself and the beginnings of the deployment of an alternative party narrative centred on 'One Nation', Ed Miliband began to revise his received persona. By using a range of rhetorical and other techniques, Miliband began to adapt the Labour narrative to the personalized political. The article sets out the theoretical framework for the analysis and returns to the implications for the theory of leadership performance in its conclusion.
Abstract:
The UK Police Force is required to operate communications centres under increased funding constraints. Staff represent the main cost of operating the facility, and the key issue for the efficient deployment of staff, in this case call handlers, is to ensure that sufficient staff are available to make a timely response to customer calls when the timing of individual calls is difficult to predict. A discrete-event simulation study is presented, investigating a new shift pattern for call handler staff that aims to improve operational efficiency. The communications centre can be considered a specialised case of a call centre, but an important issue for Police Force management is the particularly stressful nature of the work staff are involved in when responding to emergency calls. Thus decisions regarding changes to the shift system were made in the context of both attempting to improve efficiency by matching staff supply with customer demand, and ensuring a reasonable workload pattern for staff over time.
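As an illustration of the modelling approach (not the study's actual model, whose shift patterns, demand profile and performance measures are richer), a call-handling operation can be sketched as an M/M/c-style queue with the simpy discrete-event simulation library; all rates and the shift length here are assumed for the example:

```python
import random
import simpy

RNG = random.Random(42)

def caller(env, handlers, service_rate, waits):
    arrived = env.now
    with handlers.request() as req:      # queue for a free call handler
        yield req
        waits.append(env.now - arrived)  # time the caller spent queueing
        yield env.timeout(RNG.expovariate(service_rate))  # handling time

def arrivals(env, handlers, arrival_rate, service_rate, waits):
    while True:                          # Poisson arrival stream
        yield env.timeout(RNG.expovariate(arrival_rate))
        env.process(caller(env, handlers, service_rate, waits))

def mean_wait(n_handlers, arrival_rate, service_rate, shift_mins=480.0):
    """Mean queueing delay over one simulated shift (in minutes)."""
    env = simpy.Environment()
    handlers = simpy.Resource(env, capacity=n_handlers)
    waits = []
    env.process(arrivals(env, handlers, arrival_rate, service_rate, waits))
    env.run(until=shift_mins)
    return sum(waits) / len(waits) if waits else 0.0

# Example: 6 vs 8 handlers, 1 call/min arriving, 6 min mean handling time.
print(mean_wait(6, 1.0, 1.0 / 6.0), mean_wait(8, 1.0, 1.0 / 6.0))
```

Running such a model for candidate shift patterns lets supply be matched against a time-varying demand profile before any rota change is trialled with real staff.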
Abstract:
The deployment of bioenergy technologies is a key part of UK and European renewable energy policy, and a key barrier to their deployment is the management of biomass supply chains, including the evaluation of suppliers and the contracting of biomass. In the undeveloped biomass-for-energy market, buyers of biomass face three major challenges during the development of new bioenergy projects: what characteristics a certain supply of biomass will have; how to evaluate biomass suppliers; and which suppliers to contract with in order to provide a portfolio of suppliers that best satisfies the needs of the project and its stakeholder group while also satisfying crisp and non-crisp technological constraints. The problem description is taken from the situation faced by the industrial partner in this research, Express Energy Ltd. This research tackles these three areas separately, then combines them to form a decision framework to assist biomass buyers with the strategic sourcing of biomass: the BioSS framework. The BioSS framework consists of three modes which mirror the development stages of bioenergy projects: BioSS.2 mode for the early development stage, BioSS.3 mode for the financial close stage and BioSS.Op mode for the operational phase of the project. BioSS is formed of a fuels library, a supplier evaluation module and an order allocation module; a Monte-Carlo analysis module is also included to evaluate the accuracy of the recommended portfolios. In each mode BioSS can recommend which suppliers should be contracted with and how much material should be purchased from each. The recommended blend should have chemical characteristics within the technological constraints of the conversion technology and should also best satisfy the stakeholder group. The fuels library is built from a wide variety of sources and contains around 100 unique descriptions of potential biomass sources that a developer may encounter. The library takes a wide data-collection approach, with the aim of allowing estimates to be made of biomass characteristics without expensive and time-consuming testing. The supplier evaluation part of BioSS uses a QFD-AHP method to give importance weightings to 27 different evaluating criteria. The evaluating criteria have been compiled from interviews with stakeholders and from policy and position documents, and the weightings have been assigned using a mixture of workshops and expert interviews. The weighted importance scores allow potential suppliers to better tailor their business offering and provide a robust framework for decision makers to better understand the requirements of the bioenergy project stakeholder groups. The order allocation part of BioSS uses a chance-constrained programming approach to assign orders of material between potential suppliers based on the chemical characteristics and preference scores of those suppliers (a minimal sketch of this approach is given below). The optimisation program finds the portfolio of orders to allocate to suppliers that gives the highest-performance portfolio in the eyes of the stakeholder group while also complying with technological constraints. The technological constraints can be breached, if the decision maker requires, by setting the constraint as a chance-constraint. This allows a wider range of biomass sources to be procured and a greater overall performance to be realised than crisp constraints or deterministic programming approaches would permit. BioSS is demonstrated against two scenarios faced by UK bioenergy developers. The first is a large-scale combustion power project, the second a small-scale gasification project. BioSS is applied in each mode for both scenarios and is shown to adapt the solution to the stakeholder group's importance weightings and to the different constraints of the different conversion technologies, while finding a globally optimal portfolio for stakeholder satisfaction.
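The chance-constrained idea referenced above can be illustrated as follows. This is a deliberately minimal, single-characteristic sketch under independence and normality assumptions; BioSS itself handles many criteria, QFD-AHP preference weightings and Monte-Carlo validation, and all names and numbers here are illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def allocate_orders(pref, mu, sigma, limit, demand, eps=0.05):
    """Split an order of `demand` tonnes across suppliers, maximising the
    preference-weighted allocation subject to a chance constraint on one
    blended fuel characteristic c_i ~ N(mu_i, sigma_i^2), independent:
        Pr( sum_i c_i x_i <= limit * demand ) >= 1 - eps,
    whose deterministic equivalent is
        mu.x + z_{1-eps} * sqrt(sum_i sigma_i^2 x_i^2) <= limit * demand."""
    pref, mu, sigma = map(np.asarray, (pref, mu, sigma))
    z = norm.ppf(1.0 - eps)                      # safety factor, ~1.645
    cons = [
        {"type": "eq", "fun": lambda x: x.sum() - demand},
        {"type": "ineq", "fun": lambda x: limit * demand
            - (mu @ x + z * np.sqrt(((sigma * x) ** 2).sum()))},
    ]
    x0 = np.full(len(pref), demand / len(pref))  # even-split start point
    res = minimize(lambda x: -(pref @ x), x0,
                   bounds=[(0.0, None)] * len(pref), constraints=cons)
    return res.x

# Example: supplier 0 is preferred but has high, variable chlorine content.
print(allocate_orders(pref=[0.9, 0.6, 0.5], mu=[0.35, 0.15, 0.10],
                      sigma=[0.10, 0.02, 0.02], limit=0.20, demand=1000.0))
```

Loosening eps widens the set of admissible blends, which is precisely how a chance-constraint lets more biomass sources into the portfolio than a crisp limit would.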
Abstract:
With their compact spectrum and high tolerance to residual chromatic dispersion, duobinary formats are attractive for the deployment of 40 Gb/s technology on 10 Gb/s WDM Long-Haul transmission infrastructures. Here, we compare the robustness of various duobinary formats when facing 40 Gb/s transmission impairments.
Abstract:
A compact high-power yellow-green continuous wave (CW) laser source based on second-harmonic generation (SHG) in a 5% MgO doped periodically poled congruent lithium niobate (PPLN) waveguide crystal pumped by a quantum-dot fiber Bragg grating (QD-FBG) laser diode is demonstrated. A frequency-doubled power of 90.11 mW at the wavelength of 560.68 nm with a conversion efficiency of 52.4% is reported. To the best of our knowledge, this represents the highest output power and conversion efficiency achieved to date in this spectral region from a diode-pumped PPLN waveguide crystal, which could prove extremely valuable for the deployment of such a source in a wide range of biomedical applications.
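As a consistency check, and assuming the quoted conversion efficiency is the usual ratio of second-harmonic output power to pump power coupled into the waveguide (a common convention, though definitions vary), the reported figures imply a coupled pump power of roughly

```latex
P_{\mathrm{pump}} \approx \frac{P_{\mathrm{SH}}}{\eta}
                 = \frac{90.11\ \mathrm{mW}}{0.524}
                 \approx 172\ \mathrm{mW}.
```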
Abstract:
This article discusses the findings of a study tracing the incorporation of claims about infant brain development into English family policy as part of the longer-term development of a ‘parent training’, early intervention agenda. The main focus is on the ways in which the deployment of neuroscientific discourse in family policy creates the basis for a new governmental oversight of parents. We argue that advocacy of ‘early intervention’, in particular that which deploys the authority of ‘the neuroscience’, places parents at the centre of the policy stage but simultaneously demotes and marginalises them. So we ask: what becomes of the parent when, politically and culturally, the child is spoken of as infinitely and permanently neurologically vulnerable to parental influence? In particular, the policy focus on parental emotions and their impact on infant brain development indicates that this represents a biologisation of ‘therapeutic’ governance.
Abstract:
Composite Web Services (CWS) aggregate multiple Web Services into one logical unit to accomplish a complex task (e.g. a business process). This aggregation is achieved by defining a workflow that orchestrates the underlying Web Services in a manner consistent with the desired functionality. Since CWS can aggregate atomic services and other CWS, they foster the development of service layers and the reuse of already existing functionality. An important issue in the deployment of services is their run-time performance under various loads. Due to the complex interactions of the underlying services, a CWS can exhibit problematic and often difficult-to-predict behaviours in overload situations. This paper focuses on the use of request scheduling for improving CWS performance in overload situations. Different scheduling policies are investigated with regard to their effectiveness in coping with bulk arrivals.
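To make the scheduling question concrete, the toy single-server sketch below contrasts FIFO service with shortest-job-first (SJF) ordering for a bulk of simultaneously arriving requests; the single-server assumption and the service-time figures are illustrative only, not the paper's workload model or its actual policies:

```python
def mean_response_time(service_times, policy="fifo"):
    """Single-server sketch: a bulk of requests arrives at t = 0 and is
    served one at a time. Compares FIFO order with shortest-job-first
    (SJF), which minimises mean response time for such a bulk."""
    order = sorted(service_times) if policy == "sjf" else list(service_times)
    clock, total = 0.0, 0.0
    for s in order:
        clock += s          # request finishes after its service demand
        total += clock      # response time = completion time (arrival at 0)
    return total / len(order)

# Example: one long request and four short ones arriving together.
bulk = [8.0, 1.0, 1.0, 1.0, 1.0]
print(mean_response_time(bulk, "fifo"))  # 10.0
print(mean_response_time(bulk, "sjf"))   # 4.4
```

The gap between the two means shows why the choice of policy matters most under bulk arrivals: with FIFO, one long-running request delays every request queued behind it.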