878 results for "improve acoustic performance"
Abstract:
The research analyses the effectiveness of public spending, its efficiency and their determinants in the Health, Education and Research sectors for 33 OECD countries. The analysis has a twofold objective: on the one hand a cross-country comparison, and on the other a comparison over time, covering the period from 1992 to 2011. The evaluation of the effectiveness and efficiency of public spending is a highly topical issue, especially in Europe, both because such spending amounts to almost 50% of GDP and because the financial crisis of 2008 pushed governments to reduce budgets and use them more carefully. The choice to focus the analysis on the Health, Education and Research and Development sectors stems on the one hand from their distinctive character as client-oriented activities (schools, hospitals, courts) and on the other from the strategic role they play in a country's economic development. The work is organised in three sections: 1. A review of the main methodological tools used in the literature to measure the performance and efficiency of public spending in the three sectors. 2. Evaluation and comparison of the efficiency and performance of public spending, both over time and cross-country, through the construction of indicators of public spending performance and efficiency (to study the efficiency index in depth I applied the output-oriented bootstrap DEA technique with non-simultaneous output and input indicators, while the evolution of efficiency between the periods 2002-2011 and 1992-2001 was analysed by computing the Malmquist index). 3. Analysis of the exogenous variables that influence the efficiency of public spending in the Health, Education and Research and Development sectors through a Tobit regression, with the output-oriented DEA efficiency scores as the dependent variable and, as exogenous variables, a set of indicators chosen from those present in the literature: the indicator of households' socioeconomic conditions (constructed and applied by OECD PISA to assess the impact of family background on learning performance), the indicator of trust in the country's legal system, the indicator of property rights protection, the indicator of corruption control, the indicator of government effectiveness, the indicator of regulatory quality, and per-capita GDP. Interesting results emerge from this work: the quantity of resources employed does not always yield the maximum attainable level of performance. The DEA results show a mean bias-corrected efficiency score of 0.712; employing the same quantity of resources, the output generated could therefore potentially be improved by about 29%. Sweden, Japan, Finland and Germany turn out to be the most efficient countries, closest to the frontier, while Slovakia, Portugal and Hungary lie furthest from it, with an inefficiency measure of about 40%. As for the comparison of public spending efficiency in the three sectors between the periods 1992-2001 and 2002-2011, the Malmquist index shows interesting results: the countries that improved their efficiency are Eastern European countries such as Estonia, Slovakia and Lithuania, while the Netherlands, Belgium and the United States worsened their position.
Countries found to be efficient in the DEA, such as Finland, Germany and Sweden, remained essentially unchanged, with a Malmquist index close to one. In conclusion, the Tobit results contain important indications for guiding government choices. The analysis shows that trust in the legal system, the fight against corruption, government effectiveness, the protection of property rights and the socioeconomic conditions of the families of OECD PISA students all positively influence the efficiency of public spending in the three sectors investigated. Beyond the spending review, in order to increase the efficiency and improve the performance of public spending in the three sectors, states must be capable of implementing reforms that guarantee the proper functioning of their institutions.
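To make the second step concrete, the sketch below sets up the output-oriented DEA linear program that underlies efficiency scores of this kind. It is a minimal illustration in Python using scipy, with invented data for five hypothetical countries; the bootstrap bias correction used for the corrected scores (e.g. Simar-Wilson) is omitted, and the variable-returns-to-scale convexity constraint is an assumption.

```python
import numpy as np
from scipy.optimize import linprog

def dea_output_oriented(X, Y, vrs=True):
    """Output-oriented DEA expansion factor (phi >= 1) for each DMU.

    X: (n_dmus, n_inputs) input matrix, Y: (n_dmus, n_outputs) output matrix.
    The efficiency score of a DMU is 1/phi (1 = on the frontier).
    """
    n, m = X.shape
    _, s = Y.shape
    phis = np.empty(n)
    for o in range(n):
        # Decision variables: [phi, lambda_1, ..., lambda_n]
        c = np.zeros(n + 1)
        c[0] = -1.0                       # maximise phi -> minimise -phi
        A_ub, b_ub = [], []
        for i in range(m):                # peer inputs must not exceed DMU o's inputs
            A_ub.append(np.concatenate(([0.0], X[:, i])))
            b_ub.append(X[o, i])
        for r in range(s):                # peers must cover phi times DMU o's outputs
            A_ub.append(np.concatenate(([Y[o, r]], -Y[:, r])))
            b_ub.append(0.0)
        A_eq = [np.concatenate(([0.0], np.ones(n)))] if vrs else None
        b_eq = [1.0] if vrs else None
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (n + 1))
        phis[o] = res.x[0]
    return phis

# Toy data: 5 countries, 2 inputs (spending shares), 1 composite output indicator
X = np.array([[4.0, 1.2], [6.0, 2.0], [5.0, 1.5], [7.0, 2.5], [5.5, 1.8]])
Y = np.array([[80.0], [95.0], [88.0], [90.0], [85.0]])
print(1.0 / dea_output_oriented(X, Y))    # scores in (0, 1]; 1 = efficient
```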
Abstract:
We obtain phase diagrams of regular and irregular finite-connectivity spin glasses. Contact is first established between properties of the phase diagram and the performance of low-density parity check (LDPC) codes within the replica symmetric (RS) ansatz. We then study the location of the dynamical and critical transition points of these systems within the one step replica symmetry breaking theory (RSB), extending similar calculations that have been performed in the past for the Bethe spin-glass problem. We observe that the location of the dynamical transition line does change within the RSB theory, in comparison with the results obtained in the RS case. For LDPC decoding of messages transmitted over the binary erasure channel we find, at zero temperature and rate R = 1/4, an RS critical transition point at p_c ≈ 0.67, while the critical RSB transition point is located at p_c = 0.7450 ± 0.0050, to be compared with the corresponding Shannon bound 1 - R. For the binary symmetric channel we show that the low temperature reentrant behavior of the dynamical transition line, observed within the RS ansatz, changes its location when the RSB ansatz is employed; the dynamical transition point occurs at higher values of the channel noise. Possible practical implications to improve the performance of the state-of-the-art error correcting codes are discussed. © 2006 The American Physical Society.
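As background for the decoding problem analysed above, the following Python sketch implements the classic iterative "peeling" decoder for an LDPC code over the binary erasure channel: any parity check with exactly one erased bit determines that bit. The tiny parity-check matrix and erasure pattern are invented for illustration; the paper's replica-theoretic transition points concern the large-system limit, not this toy decoder.

```python
import numpy as np

def peel_decode(H, y):
    """Iterative 'peeling' decoder over the binary erasure channel.

    H: (n_checks, n_bits) parity-check matrix over GF(2).
    y: received word with 0/1 for known bits and None for erasures.
    Returns the decoded word, or None if decoding stalls.
    """
    x = list(y)
    progress = True
    while progress:
        progress = False
        for row in H:
            idx = np.flatnonzero(row)
            erased = [i for i in idx if x[i] is None]
            if len(erased) == 1:                  # check pins down one unknown bit
                known_sum = sum(x[i] for i in idx if x[i] is not None) % 2
                x[erased[0]] = known_sum          # each check's parity must be 0
                progress = True
    return None if any(b is None for b in x) else x

# (7,4) Hamming-style parity checks; erase two bits of the all-zero codeword
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
y = [0, None, 0, None, 0, 0, 0]
print(peel_decode(H, y))   # -> [0, 0, 0, 0, 0, 0, 0]
```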
Abstract:
Improving healthcare quality is a growing need of any society. Although various quality improvement projects are routinely deployed by healthcare professionals, they are characterised by a fragmented approach, i.e. they are not linked with the strategic intent of the organisation. This study introduces a framework which integrates all quality improvement projects with the strategic intent of the organisation. It first derives the strengths, weaknesses, opportunities and threats (SWOT) matrix of the system with the involvement of the concerned stakeholders (clinical professionals), which helps identify a few projects whose implementation ensures achievement of the desired quality. The projects are then prioritised using the analytic hierarchy process, again with the involvement of the concerned stakeholders (clinical professionals), and implemented in order to improve system performance. The effectiveness of the method has been demonstrated using a case study in the intensive care unit of Queen Elizabeth Hospital in Bridgetown, Barbados.
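The prioritisation step above relies on the analytic hierarchy process. As a minimal sketch, the Python fragment below derives priority weights for three hypothetical improvement projects from a pairwise comparison matrix on Saaty's 1-9 scale and checks the consistency ratio; the matrix entries are invented for illustration, not taken from the case study.

```python
import numpy as np

# Hypothetical pairwise comparison of three quality-improvement projects:
# A[i, j] = how much more important project i is than project j (Saaty scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Principal-eigenvector method: the weight vector is the eigenvector of the
# largest eigenvalue, normalised to sum to one.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI, where RI
# is Saaty's random index (0.58 for n = 3). CR < 0.10 is usually acceptable.
n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)
cr = ci / 0.58
print(f"weights = {np.round(w, 3)}, CR = {cr:.3f}")
```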
Abstract:
Purpose - The purpose of this paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation and helps to evaluate healthcare service performance dynamically. Design/methodology/approach - This study used the logical framework analysis (LFA) to improve the performance of healthcare service processes. LFA has three major steps - problem identification, solution derivation, and formation of a planning matrix for implementation. LFA has been applied in a case-study environment to three acute healthcare services (Operating Room utilisation, Accident and Emergency, and Intensive Care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management of hospital-based healthcare services. Research limitations/implications - This study shows LFA application in three service processes in one hospital. This very limited population sample needs to be extended. Practical implications - The proposed model can be implemented in hospital-based healthcare services in order to improve performance. It may also be applied to other services. Originality/value - Quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, they are not without flaws. There is an absence of an integrated approach which can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement those solutions. This study introduces an integrated and uniform quality management tool for healthcare services. © Emerald Group Publishing Limited.
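The planning matrix at the heart of LFA is essentially a structured table: objectives at several levels set against indicators, means of verification and assumptions. A minimal sketch of that structure in Python follows; the A&E example rows are invented to illustrate the format, not taken from the case study.

```python
from dataclasses import dataclass, field

@dataclass
class LogframeRow:
    level: str          # "Goal", "Purpose", "Output" or "Activity"
    summary: str        # narrative summary of the objective
    indicators: list    # objectively verifiable indicators
    verification: list  # means of verification (data sources)
    assumptions: list   # external conditions that must hold

@dataclass
class Logframe:
    project: str
    rows: list = field(default_factory=list)

    def add(self, *args):
        self.rows.append(LogframeRow(*args))

lf = Logframe("Reduce A&E waiting time")
lf.add("Purpose", "Patients triaged within 15 minutes",
       ["% of arrivals triaged <= 15 min"], ["A&E information system logs"],
       ["Staffing levels remain stable"])
lf.add("Output", "Revised triage protocol in use",
       ["Protocol compliance rate"], ["Monthly audit"],
       ["Clinical staff accept the new protocol"])
for row in lf.rows:
    print(f"{row.level}: {row.summary} | indicators: {row.indicators}")
```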
Abstract:
Purpose - The purpose of the paper is to develop an integrated quality management model which identifies problems, suggests solutions, develops a framework for implementation and helps evaluate the performance of healthcare services dynamically. Design/methodology/approach - This paper uses logical framework analysis (LFA), a matrix approach to project planning for managing quality. This has been applied to three acute healthcare services (Operating room utilization, Accident and emergency, and Intensive care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management of hospital-based healthcare services. Research limitations/implications - This paper shows LFA application in three service processes in one hospital. Ideally, it should also be tested in several hospitals and in other services. Practical implications - The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value - The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, and corrective measures are taken for superior performance, there is an absence of an integrated approach which can identify and analyze issues, provide solutions to resolve those issues, and develop a project management framework (planning, monitoring, and evaluating) to implement those solutions in order to improve process performance. This study introduces an integrated and uniform quality management tool. It integrates operations with organizational strategies. © Emerald Group Publishing Limited.
Abstract:
With the advent of globalisation, companies all around the world must improve their performance in order to survive. The threats come from everywhere and in different forms, such as low-cost products, high-quality products, new technologies and new products. Companies in different countries use various techniques and quality criteria items to strive for excellence. Continuous improvement techniques are used to enable companies to improve their operations. Companies therefore use techniques such as TQM, Kaizen, Six-Sigma, Lean Manufacturing, and quality award criteria items such as Customer Focus, Human Resources, Information & Analysis, and Process Management. The purpose of this paper is to compare the use of these techniques and criteria items in two countries, Mexico and the United Kingdom, which differ in culture and industrial structure. In terms of the use of continuous improvement tools and techniques, Mexico formally started to deal with continuous improvement by creating its National Quality Award soon after the Americans established the Malcolm Baldrige National Quality Award. The United Kingdom formally started by using the European Quality Award (EQA), later modified and renamed as the EFQM Excellence Model. The methodology used in this study was to undertake a literature review of the subject matter and to study some general applications around the world. A questionnaire survey was then designed and a survey undertaken based on the same scale, about the same sample size, and about the same industrial sector within the two countries. The survey presents a brief definition of each of the constructs to facilitate understanding of the questions. The analysis of the data was then conducted with the assistance of a statistical software package. The survey results indicate both similarities and differences in the strengths and weaknesses of the companies in the two countries. One outcome of the analysis is that it enables the companies to use the results to benchmark themselves and thus act to reinforce their strengths and to reduce their weaknesses.
Abstract:
Healthcare services available these days deploy high technology to satisfy both internal and external customers by continuously improving various quality parameters. Quality improvement in healthcare services is a complex and multidimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, there is an absence of an integrated approach which can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement and evaluate those solutions. This study introduces an integrated and uniform quality management framework for healthcare services. It uses the Logical Framework Analysis (LFA) to improve the performance of healthcare services. LFA has three major steps - problem identification, solution derivation and formation of a planning matrix for implementation and evaluation. LFA has been applied in a case study environment to three acute healthcare services (Operating Room (OR) utilisation, Accident and Emergency (A&E) and intensive care) in order to demonstrate its effectiveness. Copyright © 2007 Inderscience Enterprises Ltd.
Abstract:
Purpose: The purpose of this paper is to investigate the use of 802.11e MAC to resolve transmission control protocol (TCP) unfairness. Design/methodology/approach: The paper shows how a TCP sender may adapt its transmission rate using the number of hops and the standard deviation of recently measured round-trip times to address TCP unfairness. Findings: Simulation results show that the proposed techniques provide even throughput by providing TCP fairness as the number of hops increases over a wireless mesh network (WMN). Research limitations/implications: Future work will examine the performance of TCP over routing protocols which use different routing metrics. Another area of future work is scalability over WMNs. Since scalability is a problem with multi-hop communication, carrier sense multiple access (CSMA) will be compared with time division multiple access (TDMA), and a hybrid of TDMA and code division multiple access (CDMA) will be designed that works with TCP and other traffic. Finally, to further improve network performance and increase the network capacity of TCP for WMNs, the use of multiple channels instead of only a single fixed channel will be exploited. Practical implications: By allowing the tuning of 802.11e MAC parameters that have previously been constant in 802.11 MAC, the paper proposes using 802.11e MAC on a per-class basis by collecting the TCP ACKs into a single class, together with a novel congestion control method for TCP over a WMN. The key feature of the proposed TCP algorithm is the detection of congestion by measuring the fluctuation of the RTT of the TCP ACK samples via their standard deviation, combined with the 802.11e AIFS and CWmin parameters, which allow the TCP ACKs to be prioritised so that they match the volume of the TCP data packets. While 802.11e MAC provides flexibility and flow/congestion control mechanisms, the challenge is to take advantage of these features. Originality/value: Because 802.11 MAC lacks the flexibility and flow/congestion control mechanisms needed when carrying TCP, it contributes to TCP unfairness among competing flows. © Emerald Group Publishing Limited.
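To illustrate the congestion test described in the practical implications, here is a minimal Python sketch that flags congestion when a new RTT sample deviates strongly from the recent mean relative to the standard deviation. The window size, the threshold factor k and the sample trace are illustrative assumptions, not values from the paper.

```python
import statistics
from collections import deque

class RttCongestionDetector:
    """Flags congestion when recent RTT samples fluctuate strongly, in the
    spirit of the RTT-standard-deviation test described above. All thresholds
    and the window size are illustrative assumptions."""

    def __init__(self, window=16, k=2.0):
        self.samples = deque(maxlen=window)
        self.k = k          # how many std-devs above the mean counts as congestion

    def update(self, rtt_ms):
        self.samples.append(rtt_ms)
        if len(self.samples) < self.samples.maxlen:
            return False                      # not enough history yet
        mean = statistics.fmean(self.samples)
        sd = statistics.stdev(self.samples)
        return rtt_ms > mean + self.k * sd    # spike relative to recent history

det = RttCongestionDetector()
rtts = [20, 21, 19, 22, 20, 21, 20, 19, 21, 20, 22, 21, 20, 19, 21, 60]
for r in rtts:
    if det.update(r):
        print(f"congestion suspected at RTT = {r} ms -> reduce sending rate")
```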
Abstract:
Wireless Mesh Networks (WMNs) have emerged as a key technology for the next generation of wireless networking. Rather than being just another type of ad-hoc networking, WMNs diversify the capabilities of ad-hoc networks. Many kinds of protocols work over WMNs, such as IEEE 802.11a/b/g, 802.15 and 802.16. To achieve high throughput under varying conditions, these protocols have to adapt their transmission rate. While rate adaptation is a significant component, only a few algorithms, such as Auto Rate Fallback (ARF) and Receiver Based Auto Rate (RBAR), have been published. In this paper we show that MAC, packet loss and physical layer conditions all play an important role in achieving good channel conditions. We also perform rate adaptation along with multiple packet transmission for better throughput. Improvements in performance can be obtained by dynamically monitoring the channel, transmitting multiple packets, and adapting to changes in channel quality by adjusting the packet transmission rates according to certain optimization criteria. The proposed method detects channel congestion by measuring the fluctuation of the signal via its standard deviation, and detects packet loss before channel performance diminishes. We show that the use of such techniques in a WMN can significantly improve performance. The effectiveness of the proposed method is demonstrated in an experimental wireless network testbed via packet-level simulation. Our simulation results show that, regardless of the channel condition, throughput performance is improved.
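Since the abstract cites Auto Rate Fallback as one of the few published rate-adaptation algorithms, the sketch below shows the core ARF idea in Python: step the PHY rate up after a run of consecutive successful transmissions and back off after consecutive failures. The 802.11a rate set and the up/down thresholds are common illustrative choices, not parameters from the paper.

```python
class AutoRateFallback:
    """Minimal ARF-style rate adaptation sketch: step up after a run of
    consecutive successes, step down after consecutive failures."""

    RATES_MBPS = [6, 12, 24, 36, 48, 54]      # illustrative 802.11a rate set

    def __init__(self, up_after=10, down_after=2):
        self.i = 0
        self.ok = self.fail = 0
        self.up_after, self.down_after = up_after, down_after

    def report(self, success):
        if success:
            self.ok += 1
            self.fail = 0
            if self.ok >= self.up_after and self.i < len(self.RATES_MBPS) - 1:
                self.i += 1          # channel looks good: try the next rate
                self.ok = 0
        else:
            self.fail += 1
            self.ok = 0
            if self.fail >= self.down_after and self.i > 0:
                self.i -= 1          # repeated losses: back off to a safer rate
                self.fail = 0
        return self.RATES_MBPS[self.i]

arf = AutoRateFallback()
outcomes = [True] * 12 + [False] * 3 + [True] * 5
for ok in outcomes:
    rate = arf.report(ok)
print(f"current rate: {rate} Mbit/s")
```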
Abstract:
A recently proposed colour-based tracking algorithm has been established to track objects in real circumstances [Zivkovic, Z., Krose, B. 2004. An EM-like algorithm for color-histogram-based object tracking. In: Proc. IEEE Conf. on Computer Vision and Pattern Recognition, pp. 798-803]. To improve the performance of this technique in complex scenes, in this paper we propose a new algorithm for optimally adapting the ellipse outlining the objects of interest. This paper presents a Lagrangian-based method to integrate a regularising component into the covariance matrix to be computed. Technically, we intend to reduce the residuals between the estimated probability distribution and the expected one. We argue that, by doing this, the shape of the ellipse can be properly adapted in the tracking stage. Experimental results show that the proposed method has favourable performance in shape adaptation and object localisation.
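The ellipse that outlines a tracked object is determined by a (weighted) covariance matrix of candidate pixel positions, and regularising that matrix keeps the ellipse from collapsing or stretching pathologically. The Python sketch below illustrates the general idea with a standard shrinkage regulariser; this is an assumption for illustration, not the paper's Lagrangian formulation, and the data are synthetic.

```python
import numpy as np

def regularised_covariance(points, weights, alpha=0.1):
    """Weighted covariance of tracked pixel positions, blended with a scaled
    identity so the ellipse cannot collapse. The shrinkage form used here is
    a standard regulariser, offered only to illustrate the idea; the paper
    derives its regularising term via Lagrange multipliers."""
    w = weights / weights.sum()
    mu = w @ points                              # weighted centroid
    d = points - mu
    cov = (w[:, None] * d).T @ d                 # weighted sample covariance
    target = np.trace(cov) / cov.shape[0] * np.eye(cov.shape[0])
    return mu, (1 - alpha) * cov + alpha * target

rng = np.random.default_rng(0)
pts = rng.normal([50, 80], [6, 2], size=(200, 2))    # candidate object pixels
wts = rng.uniform(0.5, 1.0, size=200)                # e.g. colour-likelihood weights
mu, cov = regularised_covariance(pts, wts)
# Ellipse axes follow from the eigen-decomposition of the covariance.
evals, evecs = np.linalg.eigh(cov)
print("centre:", np.round(mu, 1), "axis lengths:", np.round(2 * np.sqrt(evals), 1))
```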
Abstract:
This thesis presents several advanced optical techniques that are crucial for improving high-capacity transmission systems. The basic theory of optical fibre communications is introduced before optical solitons and their use in optically amplified fibre systems are discussed. The design, operation, limitations and importance of the recirculating loop are illustrated. The crucial role of dispersion management in transmission systems is then considered. Two of the most popular dispersion compensation methods - dispersion compensating fibres and fibre Bragg gratings - are emphasised. A tunable dispersion compensator is fabricated using linearly chirped fibre Bragg gratings and a bending rig. Results show that it is capable of compensating not only second order dispersion but also higher order dispersion. Stimulated Raman Scattering (SRS) is studied and discussed. Different dispersion maps are evaluated for an all-Raman-amplified standard fibre link to obtain maximum transmission distances. Raman amplification is used in most of our loop experiments since it improves the optical signal-to-noise ratio (OSNR) and significantly reduces the nonlinear intrachannel effects of the transmission systems. The main body of the experimental work is concerned with nonlinear optical switching using nonlinear optical loop mirrors (NOLMs). A number of different types of optical loop mirrors are built, tested and implemented in the transmission systems for noise suppression and 2R regeneration. The results show that for 2R regeneration the NOLM does improve system performance, while the NILM degrades system performance due to its sensitivity to the input pulse width, and the NALM built is unstable and therefore affects system performance.
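As a back-of-the-envelope illustration of the dispersion management discussed above, the snippet below sizes a dispersion-compensating fibre so that a span's accumulated dispersion sums to zero. The fibre coefficients and span length are typical textbook values, not numbers from the thesis.

```python
# Dispersion-map arithmetic: the accumulated dispersion of a span is D * L
# summed over its fibres; a dispersion-compensating fibre (DCF) is sized so
# the span total is (near) zero. Typical textbook coefficients are assumed.
D_SMF = 17.0     # ps/(nm km), standard single-mode fibre
D_DCF = -100.0   # ps/(nm km), dispersion-compensating fibre
L_SMF = 80.0     # km span length

accumulated = D_SMF * L_SMF                 # ps/nm built up over the span
L_DCF = -accumulated / D_DCF                # DCF length that cancels it
print(f"span dispersion: {accumulated:.0f} ps/nm -> DCF needed: {L_DCF:.1f} km")
# span dispersion: 1360 ps/nm -> DCF needed: 13.6 km
```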
Abstract:
The initial aim of this project was to improve the performance of a chromatographic bioreactor-separator (CBRS). In such a system, a dilute enzyme solution is pumped continuously through a preparative chromatographic column, while pulses of substrate are periodically injected onto the column. Enzymic reaction and separation are therefore performed in a single unit operation. The chromatographic columns used were jacketed glass columns ranging from 1 to 2 metres long with an internal diameter of 1.5 cm. Linking these columns allowed 1, 2, 3 and 4 metre long CBRS systems to be constructed. The hydrolysis of lactose in the presence of β-galactosidase was the reaction under study. From previous work at Aston University, there appeared to be no difficulties in achieving complete lactose hydrolysis in a CBRS. There did, however, appear to be scope for improving the separative performance, so this was adopted as an initial goal. Reducing the particle size of the stationary phase was identified as a way of achieving this improvement. A cation exchange resin was selected which had an average particle size of around half that previously used when studying this reaction. A CBRS system was developed which overcame the operational problems (such as high pressure drop development) associated with the use of such a particle size. A significant improvement in separative power was achieved. This was shown by an increase in the number of theoretical plates (N) from about 500 to about 3000 for a 2 metre long CBRS, coupled with higher resolution. A simple experiment with the 1 metre column showed that combined bioreaction and separation was achievable in this system. Having improved the separative performance of the system, the factors affecting enzymic reaction in a CBRS were investigated, including pulse volume and the degree of mixing between enzyme and substrate. The progress of reaction in a CBRS was then studied. This information was related to the interaction of reaction and separation over the reaction zone. The effect of injecting a pulse over a length of time, as in CBRS operation, was simulated by fed-batch experiments. These experiments were performed in parallel with normal batch experiments where the substrate is mixed almost instantly with the enzyme. The batch experiments enabled samples to be taken every minute and revealed that reaction is very rapid. The hydrodynamic characteristics of the two injector configurations used in CBRS construction were studied using Magnetic Resonance Imaging, combined with hydrodynamic calculations. During the optimisation studies, galactooligosaccharides (GOS) were detected as intermediates in the hydrolysis process. GOS are valuable products with potential and existing applications in food manufacture (as nutraceuticals), medicine and drug targeting. The focus of the research was therefore turned to GOS production. A means of controlling reaction to arrest breakdown of GOS was required. Raising the temperature was identified as a possible means of achieving this within a CBRS. Studies were undertaken to optimise the yield of oligosaccharides, culminating in the design, construction and evaluation of a Dithermal Chromatographic Bioreactor-separator.
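The reported gain in separative power is expressed as a rise in the number of theoretical plates N. A common way to estimate N from a chromatogram is the half-height formula N = 5.54 (t_R / w_1/2)^2, and the short Python sketch below shows how sharper peaks at the same retention time translate into the roughly 500-to-3000 improvement quoted above. The peak values are invented for illustration, not taken from the thesis.

```python
# Column efficiency from a chromatographic peak: N = 5.54 * (t_R / w_half)^2,
# where t_R is the retention time and w_half the peak width at half height.
def theoretical_plates(t_r, w_half):
    return 5.54 * (t_r / w_half) ** 2

# A sharper peak (smaller w_half at the same t_R) means more plates,
# i.e. better separative performance.
print(round(theoretical_plates(t_r=24.0, w_half=2.5)))   # ~ 511 plates
print(round(theoretical_plates(t_r=24.0, w_half=1.0)))   # ~ 3191 plates
```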
Abstract:
With the advent of distributed computer systems with a largely transparent user interface, new questions have arisen regarding the management of such an environment by an operating system. One fertile area of research is that of load balancing, which attempts to improve system performance by redistributing the workload submitted to the system by the users. Early work in this field concentrated on static placement of computational objects to improve performance, given prior knowledge of process behaviour. More recently this has evolved into studying dynamic load balancing with process migration, thus allowing the system to adapt to varying loads. In this thesis, we describe a simulated system which facilitates experimentation with various load balancing algorithms. The system runs under UNIX and provides functions for user processes to communicate through software ports; processes reside on simulated homogeneous processors, connected by a user-specified topology, and a mechanism is included to allow migration of a process from one processor to another. We present the results of a study of adaptive load balancing algorithms, conducted using the aforementioned simulated system, under varying conditions; these results show the relative merits of different approaches to the load balancing problem, and we analyse the trade-offs between them. Following from this study, we present further novel modifications to suggested algorithms, and show their effects on system performance.
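As a flavour of the kind of adaptive policy such a simulation can explore, the Python sketch below implements a toy threshold-based, sender-initiated load-balancing round with process migration. The threshold, the topology-free setting and the random loads are assumptions for illustration, not the algorithms studied in the thesis.

```python
import random

def balance_step(loads, threshold=2):
    """One round of a simple sender-initiated policy: any processor whose
    load exceeds the average by more than `threshold` migrates one process
    to the currently least-loaded processor. A toy sketch of an adaptive
    load-balancing round, not the thesis's actual algorithms."""
    avg = sum(loads) / len(loads)
    for i, load in enumerate(loads):
        if load > avg + threshold:
            j = loads.index(min(loads))     # destination: least-loaded node
            loads[i] -= 1                   # migrate a single process
            loads[j] += 1
    return loads

random.seed(1)
loads = [random.randint(0, 12) for _ in range(6)]
print("before:", loads)
for _ in range(5):                          # iterate until roughly even
    loads = balance_step(loads)
print("after: ", loads)
```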
Abstract:
The English writing system is notoriously irregular in its orthography at the phonemic level. It was therefore proposed that focusing beginner-spellers’ attention on sound-letter relations at the sub-syllabic level might improve spelling performance. This hypothesis was tested in Experiments 1 and 2 using a ‘clue word’ paradigm to investigate the effect of analogy teaching intervention / non-intervention on the spelling performance of an experimental group and controls. The results overall showed the intervention to be effective in improving spelling, and this effect to be enduring. Experiment 3 demonstrated a greater application of analogy in spelling, when clue words, which participants used in analogy to spell test words, remained in view during testing. A series of regression analyses, with spelling entered as the criterion variable and age, analogy and phonological plausibility (PP) as predictors, showed both analogy and PP to be highly predictive of spelling. Experiment 4 showed that children could use analogy to improve their spelling, even without intervention, by comparing their performance in spelling words presented in analogous categories or in random lists. Consideration of children’s patterns of analogy use at different points of development showed three age groups to use similar patterns of analogy, but contrasting analogy patterns for spelling different words. This challenges stage theories of analogy use in literacy. Overall the most salient units used in analogy were the rime and, to a slightly lesser degree, the onset-vowel and vowel. Finally, Experiment 5 showed analogy and phonology to be fairly equally influential in spelling, but analogy to be more influential than phonology in reading. Five separate experiments therefore found analogy to be highly influential in spelling. Experiment 5 also considered the role of memory and attention in literacy attainment. The important implications of this research are that analogy, rather than purely phonics-based strategy, is instrumental in correct spelling in English.
Abstract:
The coagulase-negative staphylococci are the most frequent cause of sepsis associated with indwelling intravascular catheters. Current microbiological investigations to support the diagnosis of catheter-related sepsis (CRS) include the culture of blood and catheter tips; however, positive results may reflect specimen contamination, or colonisation of the catheter rather than true sepsis. Previous serological approaches to assist in the diagnosis of CRS based on cellular staphylococcal antigens have been of limited value. In this study, the serodiagnostic potential of an exocellular antigen produced by 7 strains of coagulase-negative staphylococci cultured in brain heart infusion broth was investigated. Antigenic material isolated by gel permeation from liquid culture was characterised by immunological techniques and chemical analysis. Characterisation of the exocellular antigen revealed a novel glycerophosphoglycolipid, termed lipid S, which shared antigenic determinants with lipoteichoic acid, but differed by comprising a glycerophosphate chain length of only 6 units. In addition, lipid S was immunologically distinct from diphosphatidyl glycerol, a constituent cell membrane phospholipid. An indirect enzyme-linked immunosorbent assay (ELISA) based on lipid S was subsequently developed and used to determine serum antibody levels (IgM and IgG) in 67 patients with CRS due to staphylococci, and 67 patients with a central venous catheter (CVC) in situ who exhibited no evidence of sepsis. The sensitivity and specificity of the lipid S IgG ELISA were 75% and 90% respectively, whilst the IgM assay had a sensitivity and specificity of 52% and 85%. The addition of GullSORB reagent to the ELISA procedure to remove competing serum IgG and rheumatoid factor did not significantly improve the performance of the IgM assay. The serological response in serial serum samples of 13 patients with CRS due to staphylococci was investigated. Elevated levels of antibody were detected at an early stage of infection, prior to the isolation of microorganisms by standard culture methods, and before the clinical presentation of sepsis in 3 patients. The lipid S ELISA was later optimised and a rapid 4-hour assay developed for the serodiagnosis of CRS. Serum IgG levels were determined in 40 patients with CRS due to staphylococci and 40 patients with a CVC in situ who exhibited no evidence of sepsis. The sensitivity and specificity of the rapid IgG assay were 70% and 100% respectively. Elevated serum antibody levels in patients with endocarditis, prosthetic joint infection and pyogenic spondylodiscitis due to Gram-positive cocci were also detected with the lipid S ELISA, suggesting that the assay may facilitate the diagnosis of these infections. Unexpectedly increased levels of anti-lipid S IgG in 31% of control patients with sciatica suggested a possible microbial aetiology of this condition. Further investigation of some of these patients by culture of microdiscectomy tissue removed at operation revealed the presence of low-virulence microorganisms in 37% of patients, of which Propionibacterium acnes accounted for 85% of the positive culture isolates. The results suggested a previously unrecognised association between P. acnes and sciatica, which may have implications for the future management of the condition.
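For readers unfamiliar with the two assay metrics quoted above, sensitivity is TP/(TP+FN) over the septic patients and specificity is TN/(TN+FP) over the controls. The snippet below back-calculates plausible counts from the quoted 75%/90% figures for the 67+67 cohort; the exact counts are rounded reconstructions for illustration only.

```python
# Sensitivity and specificity as reported for the lipid S IgG ELISA:
# sensitivity = TP / (TP + FN) over the 67 CRS patients, and
# specificity = TN / (TN + FP) over the 67 non-septic controls.
tp, fn = 50, 17      # 50/67 ~ 0.75 of septic patients test positive
tn, fp = 60, 7       # 60/67 ~ 0.90 of controls test negative

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
# sensitivity = 0.75, specificity = 0.90
```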