22 results for Cost Overrun
in Aston University Research Archive
Abstract:
We demonstrate the first experimental implementation of intensity-modulated, direct-detection 7.6 Gb/s DBPSK-based DSB optical Fast-OFDM with a reduced subcarrier spacing equal to half of the symbol rate per subcarrier over 40 km of SMF. © 2012 OSA.
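Fast-OFDM can halve the subcarrier spacing because cosine subcarriers spaced at half the symbol rate per subcarrier remain orthogonal over one symbol period. A minimal numerical sketch of that orthogonality (the parameters below are illustrative, not the paper's):

```python
import numpy as np

# Cosine subcarriers at frequencies k/(2T) stay orthogonal over [0, T),
# which is what allows Fast-OFDM's halved subcarrier spacing.
T = 1.0            # symbol period (arbitrary units)
N = 4096           # samples per symbol
t = np.arange(N) * (T / N)

def subcarrier(k):
    """k-th cosine subcarrier, frequency k/(2T)."""
    return np.cos(np.pi * k * t / T)

dt = T / N
ip_same = np.sum(subcarrier(3) * subcarrier(3)) * dt  # self inner product: T/2
ip_diff = np.sum(subcarrier(3) * subcarrier(5)) * dt  # cross inner product: 0
print(ip_same, ip_diff)
```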
Abstract:
Background: As Internet use grows, health interventions are increasingly being delivered online. Pioneering researchers are using the networking potential of the Internet, and several of them have evaluated these interventions. Objective: The objective was to review the reasons why health interventions have been delivered on the Internet and to reflect on the work of the pioneers in this field in order to inform future research. Methods: We conducted a qualitative systematic review of peer-reviewed evaluations of health interventions delivered to a known client/patient group using networked features of the Internet. Papers were reviewed for the reasons given for using the Internet, and these reasons were categorized. Results: We included studies evaluating 28 interventions plus 9 interventions that were evaluated in pilot studies. The interventions were aimed at a range of health conditions. Reasons for Internet delivery included low cost and resource implications due to the nature of the technology; reducing cost and increasing convenience for users; reduction of health service costs; overcoming isolation of users; the need for timely information; stigma reduction; and increased user and supplier control of the intervention. A small number of studies gave the existence of Internet interventions as the only reason for undertaking an evaluation of this mode of delivery. Conclusions: One must remain alert for the unintended effects of Internet delivery of health interventions due to the potential for reinforcing the problems that the intervention was designed to address. Internet delivery overcomes isolation of time, mobility, and geography, but it may not be a substitute for face-to-face contact. Future evaluations need to incorporate the evaluation of cost, not only to the health service but also to users and their social networks.
When researchers report the outcomes of Internet-delivered health care interventions, it is important that they clearly state why they chose to use the Internet, preferably backing up their decision with theoretical models and exploratory work. Evaluation of the effectiveness of a health care intervention delivered by the Internet needs to include comparison with more traditional modes of delivery to answer the following question: What are the added benefits or disadvantages of Internet use that are particular to this mode of delivery? © Griffiths, Frances, Lindenmeyer, Antje, Powell, John, Thorogood, Margaret.
Abstract:
This study employs stochastic frontier analysis to analyze Malaysian commercial banks during 1996-2002, and particularly focuses on determining the impact of Islamic banking on performance. We derive both net and gross efficiency estimates, thereby demonstrating that differences in operating characteristics explain much of the difference in costs between Malaysian banks. We also decompose productivity change into efficiency, technical, and scale change using a generalised Malmquist productivity index. On average, Malaysian banks experience moderate scale economies and annual productivity change of 2.68 percent, with the latter driven primarily by technical change, which has declined over time. Our gross efficiency estimates suggest that Islamic banking is associated with higher input requirements. However, our productivity estimates indicate that full-fledged Islamic banks have overcome some of these cost disadvantages with rapid technical change, although this is not the case for conventional banks operating Islamic windows. Merged banks are found to have higher input usage and lower productivity change, suggesting that bank mergers have not contributed positively to bank performance. Finally, our results suggest that while the East Asian financial crisis had a short-term cost-reducing effect in 1998, the crisis triggered a more lasting negative impact by increasing the volume of non-performing loans.
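The decomposition referred to above splits the Malmquist productivity index into efficiency change and technical change (the generalised form adds a scale term). A minimal sketch of the standard two-period decomposition; the distance-function values below are illustrative, not the study's estimates:

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Two-period Malmquist productivity index and its decomposition.

    d_a_b = distance function of the period-b input/output bundle
    evaluated against the period-a frontier.
    """
    # Efficiency change: movement relative to the contemporaneous frontier
    ec = d_t1_t1 / d_t_t
    # Technical change: geometric mean of the frontier shift at both bundles
    tc = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return ec * tc, ec, tc

# Illustrative values only
m, ec, tc = malmquist(d_t_t=0.80, d_t_t1=0.95, d_t1_t=0.78, d_t1_t1=0.85)
print(round(m, 4), round(ec, 4), round(tc, 4))
```

By construction the index is exactly the product of its two components, which is what makes the attribution of productivity change to technical change (rather than efficiency change) in the study well defined.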
Abstract:
This paper presents a new method for the optimisation of the mirror element spacing arrangement and operating temperature of linear Fresnel reflectors (LFR). The specific objective is to maximise available power output (i.e. exergy) and operational hours whilst minimising cost. The method is described in detail and compared to an existing design method prominent in the literature. Results are given in terms of the exergy per total mirror area (W/m2) and cost per exergy (US $/W). The new method is applied principally to the optimisation of an LFR in Gujarat, India, for which cost data have been gathered. It is recommended to use a spacing arrangement such that the onset of shadowing among mirror elements occurs at a transversal angle of 45°. This results in a cost per exergy of 2.3 $/W. Compared to the existing design approach, the exergy averaged over the year is increased by 9% to 50 W/m2 and an additional 122 h of operation per year are predicted. The ideal operating temperature at the surface of the absorber tubes is found to be 300 °C. It is concluded that the new method is an improvement over existing techniques and a significant tool for any future design work on LFR systems.
Abstract:
A key objective of autonomic computing is to reduce the cost and expertise required for the management of complex IT systems. As a growing number of these systems are implemented as hierarchies or federations of lower-level systems, techniques that support the development of autonomic systems of systems are required. This article introduces one such technique, which involves the run-time synthesis of autonomic system connectors. These connectors are specified by means of a new type of autonomic computing policy termed a resource-definition policy, and enable the dynamic realisation of collections of collaborating autonomic systems, as envisaged by the original vision of autonomic computing. We propose a framework for the formal specification of autonomic computing policies, and use it to define the new policy type and to describe its application to the development of autonomic systems of systems. To validate the approach, we present a sample data-centre application that was built using connectors synthesised from resource-definition policies.
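As a loose illustration of the idea (the class and function names below are hypothetical, not the article's actual framework), a resource-definition policy can be thought of as declaring the resources a lower-level system exposes, from which a connector is synthesised at run time:

```python
# Hypothetical sketch only: a policy object declaring exposed resources,
# and a connector synthesised from it at run time. Structure and names
# are illustrative, not the article's framework.
from dataclasses import dataclass, field

@dataclass
class ResourceDefinitionPolicy:
    """Declares the resources a lower-level system exposes to peers."""
    system: str
    resources: dict = field(default_factory=dict)  # name -> initial value

def synthesise_connector(policy):
    """Build a connector exposing exactly the policy-declared resources."""
    state = dict(policy.resources)
    return {
        "system": policy.system,
        "read": lambda name: state[name],
        "write": lambda name, value: state.__setitem__(name, value),
    }

policy = ResourceDefinitionPolicy("server-1", {"cpu_quota": 0.5})
conn = synthesise_connector(policy)
conn["write"]("cpu_quota", 0.8)  # a peer autonomic system adjusts the quota
print(conn["read"]("cpu_quota"))
```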
Abstract:
Low-cost, high-capacity optical transmission systems are required for metropolitan area networks. Direct-detected multi-carrier systems are attractive candidates, but polarization mode dispersion (PMD) is one of the major impairments that limits their performance. In this paper, we report the first experimental analysis of the PMD tolerance of a 288 Gbit/s NRZ-OOK Coherent Wavelength Division Multiplexing system. The results show that this impairment is determined primarily by the subcarrier baud rate. We confirm the robustness of the system to PMD by demonstrating error-free performance over an unrepeatered 124 km field-installed single-mode fiber with a negligible penalty of 0.3 dB compared to the back-to-back measurements. © 2010 Optical Society of America
Abstract:
This article examines cost economies, productivity growth and cost efficiency of the Chinese banks using a unique panel dataset that identifies banks' four outputs and four input prices over the period of 1995-2001. By assessing the appropriateness of model specification, and making use of alternative methodologies in evaluating the performance of banks, we find that the joint-stock commercial banks outperform state-owned commercial banks in productivity growth and cost efficiency. Under the variable cost assumption, Chinese banks display economies of scale, with state-owned commercial banks enjoying cost advantages over the joint-stock commercial banks. Consequently, our results highlight the ownership advantage of these two types of banks and generally support the ongoing banking reform and transformation that is currently taking place in China.
Abstract:
This study employs Stochastic Frontier Analysis (SFA) to analyse Malaysian commercial banks during 1996–2002, and particularly focuses on determining the impact of Islamic banking on performance. We derive both net and gross efficiency estimates, thereby demonstrating that differences in operating characteristics explain much of the difference in costs between Malaysian banks. We also decompose productivity change into efficiency, technical, and scale change using a generalized Malmquist productivity index. On average, Malaysian banks experience moderate scale economies and annual productivity change of 2.68%, with the latter driven primarily by Technical Change (TC), which has declined over time. Our gross efficiency estimates suggest that Islamic banking is associated with higher input requirements. However, our productivity estimates indicate that full-fledged Islamic banks have overcome some of these cost disadvantages with rapid TC, although this is not the case for conventional banks operating Islamic windows. Merged banks are found to have higher input usage and lower productivity change, suggesting that bank mergers have not contributed positively to bank performance. Finally, our results suggest that while the East Asian financial crisis had a short-term cost-reducing effect in 1998, the crisis triggered a long-lasting negative impact by increasing the volume of non-performing loans.
Abstract:
Introduction-The design of the UK MPharm curriculum is driven by the Royal Pharmaceutical Society of Great Britain (RPSGB) accreditation process and the EU directive (85/432/EEC).[1] Although the RPSGB is informed about teaching activity in UK Schools of Pharmacy (SOPs), there is no database which aggregates information to provide the whole picture of pharmacy education within the UK. The aim of the teaching, learning and assessment study [2] was to document and map current programmes in the 16 established SOPs. Recent developments in programme delivery have resulted in a focus on deep learning (for example, through problem-based learning approaches) and on being more student-centred and less didactic through lectures. The specific objectives of this part of the study were (a) to quantify the content and modes of delivery of material as described in course documentation and (b) having categorised the range of teaching methods, ask students to rate how important they perceived each one for their own learning (using a three-point Likert scale: very important, fairly important or not important). Material and methods-The study design compared three datasets: (1) quantitative course document review, (2) qualitative staff interview and (3) quantitative student self-completion survey. All 16 SOPs provided a set of their undergraduate course documentation for the year 2003/4. The documentation variables were entered into Excel tables. A self-completion questionnaire was administered to all year four undergraduates, using a pragmatic mixture of methods (n=1847), in 15 SOPs within Great Britain. The survey data were analysed (n=741) using SPSS, excluding non-UK students who may have undertaken part of their studies within a non-UK university. Results and discussion-Interviews showed that individual teachers and course module leaders determine the choice of teaching methods used.
Content review of the documentary evidence showed that 51% of the taught element of the course was delivered using lectures, 31% using practicals (including computer-aided learning) and 18% using small-group or interactive teaching. There was high uniformity across the schools for the first three years; variation in the final year was due to the project. The average number of hours per year across 15 schools (data for one school were not available) was: year 1: 408 hours; year 2: 401 hours; year 3: 387 hours; year 4: 401 hours. The survey showed that students perceived lectures to be the most important method of teaching after dispensing or clinical practicals. Taking the very important rating only: 94% (n=694) dispensing or clinical practicals; 75% (n=558) lectures; 52% (n=386) workshops; 50% (n=369) tutorials; 43% (n=318) directed study. Scientific laboratory practicals were rated very important by only 31% (n=227). The study shows that teaching of pharmacy to undergraduates in the UK is still essentially didactic, with a high proportion of formal lectures and high levels of staff-student contact. Schools consider lectures still to be the most cost-effective means of delivering the core syllabus to large cohorts of students. However, this limits the scope for optionality within teaching; the scope for small-group work is reduced, as is the opportunity to develop multi-professional learning or practice placements. Although novel teaching and learning techniques such as e-learning have expanded considerably over the past decade, schools of pharmacy have concentrated on lectures as the best way of coping with the huge expansion in student numbers. References [1] Council Directive. Concerning the coordination of provisions laid down by law, regulation or administrative action in respect of certain activities in the field of pharmacy. Official Journal of the European Communities 1985;85/432/EEC. [2] Wilson K, Jesson J, Langley C, Clarke L, Hatfield K.
MPharm Programmes: Where are we now? Report commissioned by the Pharmacy Practice Research Trust, 2005.
Abstract:
JPEG2000 is an emerging image coding standard. In this paper we analyze the performance of the error resilience tools in JPEG2000, and present an analytical model to estimate the quality of JPEG2000-encoded images transmitted over wireless channels. The effectiveness of the analytical model is validated by simulation results. Furthermore, the analytical model is utilized by the base station to design efficient unequal error protection schemes for JPEG2000 transmission. In the design, a utility function is defined to make a tradeoff between the image quality and the cost of transmitting the image over the wireless channel. © 2002 IEEE.
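The utility-based trade-off can be sketched as selecting, among candidate protection schemes, the one maximising quality minus weighted cost. The scheme names, quality estimates and costs below are hypothetical, not values from the paper:

```python
# Illustrative utility-based selection of an error-protection scheme,
# in the spirit of the trade-off described above. All numbers are made up.
LAMBDA = 0.5  # weight of transmission cost against image quality

schemes = [
    # (name, estimated quality, e.g. expected PSNR in dB, transmission cost)
    ("no-protection", 28.0, 10.0),
    ("light-UEP",     33.5, 13.0),
    ("heavy-UEP",     34.0, 20.0),
]

def utility(quality, cost, lam=LAMBDA):
    """Utility trades off received image quality against channel cost."""
    return quality - lam * cost

best = max(schemes, key=lambda s: utility(s[1], s[2]))
print(best[0])  # heavier protection is rejected once its cost is weighed in
```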
Abstract:
Because memories are not always accurate, people rely on a variety of strategies to verify whether the events that they remember really did occur. Several studies have examined which strategies people tend to use, but none to date has asked why people opt for certain strategies over others. Here we examined the extent to which people's beliefs about the reliability and the cost of different strategies would determine their strategy selection. Subjects described a childhood memory and then suggested strategies they might use to verify the accuracy of that memory. Next, they rated the reliability and cost of each strategy, and the likelihood that they might use it. Reliability and cost each predicted strategy selection, but a combination of the two ratings provided even greater predictive value. Cost was significantly more influential than reliability, which suggests that a tendency to seek and to value "cheap" information more than reliable information could underlie many real-world memory errors. © 2013 Elsevier B.V.
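The finding that a combination of the two ratings predicts selection better than either alone is, in effect, a two-predictor regression. A minimal sketch with synthetic, noiseless data, constructed so that cost carries the larger (negative) weight, mirroring the abstract's conclusion; the numbers are illustrative only:

```python
import numpy as np

# Synthetic ratings: likelihood-of-use generated from reliability and cost
# with a deliberately larger cost weight. Not the study's data.
reliability = np.array([5.0, 4.0, 3.0, 5.0, 2.0, 4.0])
cost        = np.array([1.0, 3.0, 2.0, 4.0, 1.0, 2.0])
likelihood  = 0.3 * reliability - 0.6 * cost + 2.0  # noiseless by construction

# Least-squares fit of likelihood on both predictors plus an intercept
X = np.column_stack([reliability, cost, np.ones_like(cost)])
beta, *_ = np.linalg.lstsq(X, likelihood, rcond=None)
print(np.round(beta, 3))
```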
Abstract:
Traditionally, research on model-driven engineering (MDE) has mainly focused on the use of models at the design, implementation, and verification stages of development. This work has produced relatively mature techniques and tools that are currently being used in industry and academia. However, software models also have the potential to be used at runtime, to monitor and verify particular aspects of runtime behavior, and to implement self-* capabilities (e.g., adaptation technologies used in self-healing, self-managing, self-optimizing systems). A key benefit of using models at runtime is that they can provide a richer semantic base for runtime decision-making related to runtime system concerns associated with autonomic and adaptive systems. This book is one of the outcomes of the Dagstuhl Seminar 11481 on models@run.time held in November/December 2011, discussing foundations, techniques, mechanisms, state of the art, research challenges, and applications for the use of runtime models. The book comprises four research roadmaps, written by the original participants of the Dagstuhl Seminar over the course of two years following the seminar, and seven research papers from experts in the area. The roadmap papers provide insights to key features of the use of runtime models and identify the following research challenges: the need for a reference architecture, uncertainty tackled by runtime models, mechanisms for leveraging runtime models for self-adaptive software, and the use of models at runtime to address assurance for self-adaptive systems.
Abstract:
AIM: To assess the suitability and potential cost savings, from both the hospital and community perspective, of prescribed oral liquid medicine substitution with acceptable solid forms for children over 2 years. METHOD: Oral liquid medicines dispensed from a paediatric hospital (UK) in 1 week were assessed by screening for existence of the solid form alternative and evaluating the acceptability of the available solid form, firstly related to the prescribed dose and secondly to acceptable size depending on the child's age. Costs were calculated based on providing treatment for 28 days or the prescribed duration for short-term treatments. RESULTS: Over 90% (440/476) of liquid formulations were available as a marketed solid form. Considering dosage acceptability (maximum of 10% deviation from prescribed dosage or 0% for narrow therapeutic range drugs, maximum tablet divisions into quarters), 80% of liquids could be substituted with a solid form. The main limitation for liquid substitution would be solid form size. However, two-thirds of prescribed liquids could have been substituted with a solid form suitable in both dosage and size, with estimated savings of 5K and 8K in 1 week based on hospital and community costs respectively, corresponding to a projected annual saving of 238K and 410K (single institution). CONCLUSION: Whilst not all children over 2 years will be able to swallow tablets, drug cost savings if oral liquid formulations were substituted with suitable solid dosage forms would be considerable. Given the numerous advantages of solid forms compared with liquids, this study may provide a theoretical basis for investing in supporting children to swallow tablets/capsules.
Abstract:
Background/aims: Retinal screening programmes in England and Scotland have similar photographic grading schemes for background (non-proliferative) and proliferative diabetic retinopathy, but diverge over maculopathy. We looked for the most cost-effective method of identifying diabetic macular oedema from retinal photographs, including the role of automated grading and optical coherence tomography, a technology that directly visualises oedema. Methods: Patients from seven UK centres were recruited. The following features in at least one eye were required for enrolment: microaneurysms/dot haemorrhages or blot haemorrhages within one disc diameter, or exudates within one or two disc diameters of the centre of the macula. Subjects had optical coherence tomography and digital photography. Manual and automated grading schemes were evaluated. Costs and QALYs were modelled using microsimulation techniques. Results: 3540 patients were recruited; 3170 were analysed. For diabetic macular oedema, England's scheme had a sensitivity of 72.6% and specificity of 66.8%; Scotland's had a sensitivity of 59.5% and specificity of 79.0%. When applying a ceiling ratio of £30 000 per quality-adjusted life year (QALY) gained, Scotland's scheme was preferred. Assuming automated grading could be implemented without increasing grading costs, automation produced a greater number of QALYs for a lower cost than England's scheme, but was not cost-effective, at the study's operating point, compared with Scotland's. The addition of optical coherence tomography to each scheme resulted in cost savings without reducing health benefits. Conclusions: Retinal screening programmes in the UK should reconsider the screening pathway to make best use of existing and new technologies.
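Applying a ceiling ratio amounts to ranking schemes by net monetary benefit at £30 000 per QALY. A sketch with made-up costs and QALYs (not the study's modelled values), in which a slightly less effective but cheaper scheme is preferred at this threshold:

```python
# Illustrative net-monetary-benefit comparison at the £30,000/QALY ceiling
# ratio used in the study. The costs and QALYs below are hypothetical.
CEILING = 30_000  # £ per QALY gained

def net_monetary_benefit(qalys, cost, ceiling=CEILING):
    """NMB = ceiling * QALYs - cost; higher is better at the given threshold."""
    return ceiling * qalys - cost

schemes = {
    "England":  {"qalys": 10.00, "cost": 120_000},
    "Scotland": {"qalys": 9.98,  "cost": 110_000},
}

preferred = max(schemes, key=lambda k: net_monetary_benefit(**schemes[k]))
print(preferred)
```

The cheaper scheme wins here because its £10,000 cost saving outweighs the £600 monetary value of the 0.02 QALYs forgone at this ceiling.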
Abstract:
A novel modulator array integrating eight GaAs electro-optic IQ modulators is characterized and tested over long-reach direct-detected multi-band OFDM-PONs. The GaAs IQ modulators exhibit > 22 GHz bandwidth with a 3 V Vπ, making them suitable for a 100-km 40-Gb/s OOFDM-PON supporting up to 1024 users.