Abstract:
Rationale, aims and objectives: Intermediate care (IC) describes a range of services targeted at older people, aimed at preventing unnecessary hospitalisation, promoting faster recovery and maximising independence. The introduction of IC has created a new interface between primary and secondary care. Older people are known to be at an increased risk of medication-related problems when transferring between healthcare settings and pharmacists are often not included as part of IC multidisciplinary teams. This study aimed to explore community pharmacists’ (CPs) awareness of IC services and to investigate their views of and attitudes towards the medicines management aspects of such services, including the transfer of medication information.
Method: Semi-structured interviews were conducted with CPs practising in the vicinity of IC facilities in Northern Ireland, UK; interviews were recorded, transcribed verbatim and analysed using a constant comparative approach.
Results: Interviews were conducted with 16 CPs. Three themes were identified and named ‘left out of the loop’, ‘chasing things up’ and ‘closing the loop’. CPs felt that they were often ‘left out of the loop’ with regard to both their involvement with local IC services and communication across the healthcare interfaces. As a result, CPs resorted to ‘chasing things up’, as they had to proactively try to obtain information relating to patients’ medications. CPs viewed themselves as ideally placed to facilitate medicines management across the healthcare interfaces (i.e., ‘closing the loop’), but several barriers to potential services were identified.
Conclusion: CPs have limited involvement with IC services. Effective communication of patients’ medication information between secondary care, IC and community pharmacy needs to improve. Increasing CP involvement may contribute to improving continuity of care across such healthcare interfaces, thereby increasing the person-centredness of service provision.
Abstract:
BACKGROUND: Glaucoma is a leading cause of avoidable blindness worldwide. Open angle glaucoma is the most common type of glaucoma. No randomised controlled trials have been conducted evaluating the effectiveness of glaucoma screening for reducing sight loss. It is unclear what the most appropriate intervention to be evaluated in any glaucoma screening trial would be. The purpose of this study was to develop the clinical components of an intervention for evaluation in a glaucoma (open angle) screening trial that would be feasible and acceptable in a UK eye-care service.
METHODS: A mixed-methods study, based on the Medical Research Council (MRC) framework for complex interventions, integrating qualitative (semi-structured interviews with 46 UK eye-care providers, policy makers and health service commissioners) and quantitative (economic modelling) methods. Interview data were synthesised and used to revise the screening interventions compared within an existing economic model.
RESULTS: The qualitative data indicated broad-based support for a glaucoma screening trial taking place in primary care, using ophthalmic-trained technical assistants supported by optometry input. The precise location should be tailored to local circumstances. Opinion varied around the choice of screening test and target population. Integrating the interview findings with cost-effectiveness criteria reduced 189 potential components to a two-test intervention comprising either optic nerve photography or screening-mode perimetry (a measure of visual field sensitivity), with or without tonometry (a measure of intraocular pressure). It would be more cost-effective, and thus acceptable in a policy context, to target screening for open angle glaucoma at those at highest risk, but on both practicality and equity grounds the optimal strategy was to screen a general population cohort beginning at age 40.
CONCLUSIONS: Interventions for screening for open angle glaucoma that would be feasible from a service delivery perspective were identified. Integration within an economic modelling framework explicitly highlighted the trade-off between cost-effectiveness, feasibility and equity. This study exemplifies the MRC recommendation to integrate qualitative and quantitative methods in developing complex interventions. The next step in the development pathway should encompass the views of service users.
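The economic modelling referred to above rests on incremental cost-effectiveness comparisons between screening strategies. Purely as a minimal sketch of that arithmetic, in Python, with hypothetical numbers that do not come from the study's model:

```python
def icer(cost_new: float, qaly_new: float, cost_old: float, qaly_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    when moving from the old strategy to the new one."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-person figures: no screening vs. general-population
# screening from age 40 (illustrative values only, not study results).
print(f"ICER = {icer(450.0, 12.30, 300.0, 12.28):,.0f} GBP per QALY gained")
```

A strategy computed this way is then judged against a willingness-to-pay threshold per QALY, which is where the cost-effectiveness, feasibility and equity trade-off highlighted in the conclusions arises.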
Abstract:
This paper presents a thorough performance analysis of dual-hop cognitive amplify-and-forward (AF) relaying networks under a spectrum-sharing mechanism over independent non-identically distributed (i.n.i.d.) generalized fading channels. To guarantee the quality of service (QoS) of the primary network, both the maximum tolerable peak interference power Q at the primary users (PUs) and the maximum allowable transmit power P at the secondary users (SUs) are considered to constrain the transmit power of the cognitive transmitters. For integer-valued fading parameters, a closed-form lower bound for the outage probability (OP) of the considered networks is obtained. Moreover, for arbitrary-valued fading parameters, a lower bound in integral form for the OP is derived. To obtain further insights into the OP performance, asymptotic expressions for the OP at high SNRs are derived, from which the diversity and coding gains and the diversity-multiplexing trade-off (DMT) of the secondary network can be readily deduced. It is shown that the diversity gain and the DMT are determined solely by the fading parameters of the secondary network, whereas the primary network affects only the coding gain. The derived results include several results from previously published works as special cases, such as those for Nakagami-m fading channels. In addition, performance evaluation results obtained by Monte Carlo simulations verify the accuracy of the theoretical analysis.
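As a minimal sketch of the kind of Monte Carlo check mentioned above, the following Python snippet assumes Nakagami-m fading (one of the special cases the analysis covers) and the min(P, Q/g) transmit-power constraint described in the abstract; the OP lower bound comes from the standard min(SNR1, SNR2) upper bound on the end-to-end AF SNR. Function names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def nakagami_power_gain(m, omega, size):
    # Squared Nakagami-m envelope with spread omega is Gamma(m, omega/m).
    return rng.gamma(shape=m, scale=omega / m, size=size)

def outage_probability(P, Q, gamma_th, m=2.0, n_trials=200_000, noise=1.0):
    """Monte Carlo OP of a dual-hop cognitive AF link under the
    interference constraint Q and the transmit-power limit P."""
    # Interference links: SU-Tx -> PU and relay -> PU.
    g_sp = nakagami_power_gain(m, 1.0, n_trials)
    g_rp = nakagami_power_gain(m, 1.0, n_trials)
    # Secondary links: SU-Tx -> relay and relay -> SU-Rx.
    g_sr = nakagami_power_gain(m, 1.0, n_trials)
    g_rd = nakagami_power_gain(m, 1.0, n_trials)
    # Each cognitive transmitter obeys min(P, Q / interference gain).
    P_s = np.minimum(P, Q / g_sp)
    P_r = np.minimum(P, Q / g_rp)
    snr1 = P_s * g_sr / noise
    snr2 = P_r * g_rd / noise
    # min(snr1, snr2) upper-bounds the end-to-end AF SNR, so the
    # resulting outage estimate is a lower bound on the true OP.
    return np.mean(np.minimum(snr1, snr2) < gamma_th)

print(outage_probability(P=10.0, Q=5.0, gamma_th=1.0))
```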
Abstract:
Background A 2014 national audit used the English General Practice Patient Survey (GPPS) to compare service users’ experience of out-of-hours general practitioner (GP) services, yet there is no published evidence on the validity of these GPPS items. Objectives To establish the construct and concurrent validity of GPPS items evaluating service users’ experience of GP out-of-hours care. Methods Cross-sectional postal survey of service users (n=1396) of six English out-of-hours providers. Participants reported on four GPPS items evaluating out-of-hours care (three items modified following cognitive interviews with service users) and 14 evaluative items from the Out-of-hours Patient Questionnaire (OPQ). Construct validity was assessed through correlations between any reliable (Cronbach's α>0.7) scales, as suggested by a principal component analysis of the modified GPPS items, and the ‘entry access’ (four items) and ‘consultation satisfaction’ (10 items) OPQ subscales. Concurrent validity was determined by investigating whether each modified GPPS item was associated with thematically related items from the OPQ using linear regressions. Results The modified GPPS item set formed a single scale (α=0.77), which summarised the two-component structure of the OPQ moderately well, explaining 39.7% of the variation in ‘entry access’ scores (r=0.63) and 44.0% of the variation in ‘consultation satisfaction’ scores (r=0.66), demonstrating acceptable construct validity. Concurrent validity was verified, as each modified GPPS item was highly associated with a distinct set of related items from the OPQ. Conclusions Minor modifications are required to the English GPPS items evaluating out-of-hours care to improve comprehension by service users. The modified question set was demonstrated to be a valid measure of service users’ overall satisfaction with the out-of-hours care received. This demonstrates the potential for using as few as four items to benchmark providers and to assist services in identifying, implementing and assessing quality improvement initiatives.
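For reference, the reliability coefficient used above is Cronbach's α. A minimal Python sketch of its computation on a hypothetical response matrix (not GPPS data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point ratings from 6 respondents on 4 survey items.
ratings = np.array([
    [4, 5, 4, 4],
    [3, 3, 3, 2],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 2, 2, 3],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # values > 0.7 are conventionally treated as reliable
```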
Abstract:
Changes in the economic climate and in the delivery of health care require that pre-operative information programmes are effective and efficiently implemented. To be effective, a pre-operative programme must meet the information needs of intensive care unit (ICU) patients and their relatives. Efficiency can be achieved through a structured pre-operative programme which provides a framework for teaching. Developing an ICU information booklet in a large teaching hospital in Northern Ireland became essential to provide relevant information and improve the quality of service for patients and relatives, as set out in the White Paper ‘Working for Patients’ (DoH, 1989). The first step in establishing a patient education programme was to ascertain patients' and relatives' information needs. A ‘needs assessment’ identified the pre-operative information needs of ICU patients and their relatives (McGaughey, 1994), and the findings were used to plan and publish an information booklet. The ICU booklet provides a structure for pre-operative visits to ensure that patients' and relatives' information needs are met.
Abstract:
Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems means that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called ‘reduced complexity’ models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. However, there is little consensus on exactly what constitutes a reduced complexity model and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed ‘appropriate complexity modelling’ of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) The system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.
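As an illustration only (this example is not from the paper): one much-used style of reduced-complexity coastal model is the one-line shoreline model, in which the shoreline position y(x, t) diffuses alongshore, ∂y/∂t = D ∂²y/∂x². A minimal explicit finite-difference sketch in Python, with hypothetical parameter values:

```python
import numpy as np

def one_line_shoreline(y0, D, dx, dt, n_steps):
    """Explicit finite-difference solution of dy/dt = D * d2y/dx2,
    a classic reduced-complexity model of alongshore shoreline change."""
    assert D * dt / dx**2 <= 0.5, "explicit-scheme stability condition"
    y = y0.copy()
    for _ in range(n_steps):
        # Interior update; endpoints stay fixed (Dirichlet boundaries).
        y[1:-1] += D * dt / dx**2 * (y[2:] - 2 * y[1:-1] + y[:-2])
    return y

# Hypothetical beach-nourishment bump diffusing along a 5 km coast.
x = np.linspace(0.0, 5000.0, 101)                   # alongshore positions (m)
y0 = 50.0 * np.exp(-((x - 2500.0) / 400.0) ** 2)    # initial shoreline offset (m)
y = one_line_shoreline(y0, D=0.01, dx=50.0, dt=3600.0, n_steps=24 * 365)
print(f"peak offset after one year: {y.max():.1f} m")
```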
Abstract:
In the last decade, mobile wireless communications have witnessed explosive growth in user penetration rates and widespread deployment around the globe. In particular, a research topic of particular relevance in telecommunications nowadays is the design and implementation of fourth-generation (4G) mobile communication systems. 4G networks will be characterized by the support of multiple radio access technologies in a core network fully compliant with the Internet Protocol (the all-IP paradigm). Such networks will sustain the stringent quality of service (QoS) requirements and the high data rates expected from the kind of multimedia applications (e.g., YouTube and Skype) to be available in the near future. Therefore, 4G wireless communication systems will be of paramount importance to the development of the information society. As 4G wireless services continue to increase, they will put more and more pressure on spectrum availability. There is worldwide recognition that current methods of spectrum management have reached their limit and are no longer optimal, so new paradigms must be sought. Studies show that most of the assigned spectrum is under-utilized; the problem in most cases is thus inefficient spectrum management rather than spectrum shortage. There are currently trends towards a more liberalized approach to spectrum management, tightly linked to what is commonly termed Cognitive Radio (CR). Furthermore, conventional deployments of 4G wireless systems (a single base station (BS) per cell, with mobiles deployed around it) are known to have problems in providing fairness (users closer to the BS benefit more than cell-edge users) and in covering zones affected by shadowing, so the use of relays has been proposed as a solution. To evaluate and analyse the performance of 4G wireless systems, software tools are normally used. Such tools have matured considerably in recent years, and their role in providing a high-level evaluation of proposed algorithms and protocols is now more important than ever. System-level simulation (SLS) tools provide a fundamental and flexible way to test all the envisioned algorithms and protocols under realistic conditions, without the need to deal with the problems of live networks or reduced-scope prototypes. Furthermore, these tools allow network designers to rapidly collect a wide range of performance metrics useful for the analysis and optimization of different algorithms. This dissertation proposes the design and implementation of a conventional system-level simulator (SLS), which is afterwards enhanced for 4G wireless technologies, namely cognitive radio (IEEE 802.22) and relays (IEEE 802.16j). The SLS is then used for the analysis of proposed algorithms and protocols.
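As a minimal sketch of what a single SLS "snapshot" can look like (under my simplifying assumptions, not the dissertation's simulator: uniform user drop, the 3GPP macro-cell path-loss model, no interference, Shannon spectral efficiency), the following Python snippet reproduces the fairness gap between cell-centre and cell-edge users that motivates relays:

```python
import numpy as np

rng = np.random.default_rng(1)

def snapshot(n_users=1000, radius=500.0, tx_dbm=43.0, noise_dbm=-104.0):
    """One system-level snapshot: drop users uniformly in a circular cell
    and map each user's downlink SNR to Shannon spectral efficiency."""
    r = radius * np.sqrt(rng.random(n_users))    # sqrt => uniform over cell area
    d_km = np.maximum(r, 35.0) / 1000.0
    pl_db = 128.1 + 37.6 * np.log10(d_km)        # 3GPP macro-cell path loss (~2 GHz)
    snr_db = tx_dbm - pl_db - noise_dbm
    se = np.log2(1 + 10 ** (snr_db / 10))        # bit/s/Hz per user
    return r, se

r, se = snapshot()
print(f"cell centre (r < 100 m): {se[r < 100].mean():.1f} bit/s/Hz")
print(f"cell edge   (r > 450 m): {se[r > 450].mean():.1f} bit/s/Hz")
```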
Abstract:
Generally speaking, networked services have been moving away from a monolithic model towards a service-creation model that allows or, as is more often the case, requires cooperation among several Service Providers. The Internet, which has been forcing the convergence of services, shows that it is becoming virtually impossible for a single operator to provide any service of real interest to users. This thesis focuses on transport services (e.g., connectivity) and discusses the impact of the boundaries that service offerings have with the business. The central question is the following: what changes when the same service is offered not by one but by more than one Service Provider. On the one hand, this thesis covers, in the abstract, the notion of Service Provider, how it has evolved and in which direction it is evolving, particularly in a context of many Service Providers. The first chapters of this thesis analyse and propose architectures for inter-Service-Provider cooperation and for common services such as multimedia. On the other hand, practical solutions are offered, with their respective evaluations, for several problems that remain open today, such as inter-domain routing, Quality of Service, mobility and content distribution, including contributions on the impact of the administrative notion of Autonomous Systems on inter-domain routing, an inter-domain transport architecture, and the inefficiency that arises from the non-cooperative planning of Content Delivery Networks.
Abstract:
The expectations citizens have of Information Technologies (ITs) keep increasing, as ITs have become an integral part of our society, serving all kinds of activities, whether professional, leisure, safety-critical or business. Hence, the limitations of traditional network designs in providing innovative and enhanced services and applications motivated a consensus to integrate all services over packet-switching infrastructures, using the Internet Protocol, so as to leverage flexible control and economic benefits in Next Generation Networks (NGNs). However, the Internet is not capable of treating services differently, even though each service has its own requirements (e.g., Quality of Service, QoS). Therefore, the need for more evolved forms of communication has driven radical changes in architectural and layering designs, which demand appropriate solutions for service admission and network resource control. This Thesis addresses QoS and network control issues, aiming to improve overall control performance in current and future networks which classify services into classes. The Thesis is divided into three parts. In the first part, we propose two resource over-reservation algorithms: Class-based bandwidth Over-Reservation (COR) and an Enhanced COR (ECOR). Over-reservation means reserving more bandwidth than a Class of Service (CoS) needs, so that the QoS reservation signalling rate is reduced. COR and ECOR allow over-reservation parameters for CoSs to be defined dynamically based on the resource conditions of network interfaces; they aim to reduce QoS signalling and the related overhead without incurring CoS starvation or waste of bandwidth. ECOR differs from COR in allowing control overhead minimization to be optimized. Further, we propose a centralized control mechanism called the Advanced Centralization Architecture (ACA), which uses a single stateful Control Decision Point (CDP) that maintains a good view of its underlying network topology and the related link resource statistics on a real-time basis to control the overall network. It is important to mention that, in this Thesis, we use multicast trees as the basis for session transport, not only for group communication purposes, but mainly to pin the packets of a session mapped to a tree so that they follow the desired tree. Our simulation results show a drastic reduction in QoS control signalling and the related overhead without QoS violation or waste of resources. In addition, we provide a general-purpose analytical model to assess the impact of various parameters (e.g., link capacity, session dynamics, etc.) that generally challenge resource over-provisioning control. In the second part of this Thesis, we propose a decentralized control mechanism called Advanced Class-based resource OverpRovisioning (ACOR), which aims to achieve better scalability than the ACA approach. ACOR enables multiple CDPs, distributed at the network edge, to cooperate and exchange appropriate control data (e.g., trees and bandwidth usage information) such that each CDP is able to maintain good knowledge of the network topology and the related link resource statistics on a real-time basis. From a scalability perspective, ACOR cooperation is selective, meaning that control information is exchanged dynamically only among the CDPs that are concerned (correlated). Moreover, synchronization is carried out through our proposed concept of the Virtual Over-Provisioned Resource (VOPR), which is a share of the over-reservations of each interface allocated to each tree that uses the interface.
Thus, each CDP can process several session requests over a tree without requiring synchronization between the correlated CDPs, as long as the VOPR of the tree is not exhausted. Analytical and simulation results demonstrate that aggregate over-reservation control in decentralized scenarios keeps signalling low without QoS violations or waste of resources. We also introduce a control signalling protocol called the ACOR Protocol (ACOR-P) to support the centralized and decentralized designs in this Thesis. Further, we propose an Extended ACOR (E-ACOR), which aggregates the VOPRs of all trees originating at the same CDP, so that more session requests can be processed without synchronization than with ACOR. In addition, E-ACOR introduces a mechanism to efficiently track network congestion information, preventing unnecessary synchronization during congestion periods when VOPRs would be exhausted by every session request. Performance evaluation through analytical and simulation results proves the superiority of E-ACOR in minimizing overall control signalling overhead while keeping all the advantages of ACOR, that is, without incurring QoS violations or waste of resources. The last part of this Thesis presents the Survivable ACOR (SACOR) proposal, to support stable operation of the QoS and network control mechanisms in case of failures and recoveries (e.g., of links and nodes). The performance results show flexible survivability, characterized by fast convergence and differentiated traffic re-routing under efficient resource utilization, i.e., without wasting bandwidth. In summary, the QoS and architectural control mechanisms proposed in this Thesis provide efficient and scalable support for key network control sub-systems (e.g., QoS and resource control, traffic engineering, multicasting, etc.), and thus allow overall network control performance to be optimized.
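As a minimal Python sketch of the class-based over-reservation idea as described in the abstract (the class name, chunk policy and counters are my assumptions, not the thesis code): bandwidth is reserved for a class in chunks larger than individual session demands, so QoS signalling fires only when the reserved cushion is exhausted.

```python
class CoSReservation:
    """Per-class bandwidth reservation with over-reservation:
    signalling happens only when the reserved cushion runs out."""

    def __init__(self, link_capacity: float, chunk: float):
        self.capacity = link_capacity
        self.chunk = chunk          # over-reservation granularity (e.g., Mb/s)
        self.reserved = 0.0         # bandwidth reserved for this class
        self.used = 0.0             # bandwidth actually committed to sessions
        self.signals = 0            # count of QoS signalling events

    def admit(self, demand: float) -> bool:
        if self.used + demand > self.reserved:
            # Cushion exhausted: signal to extend the reservation by whole chunks.
            needed = self.used + demand - self.reserved
            extra = -(-needed // self.chunk) * self.chunk   # ceil to chunk size
            if self.reserved + extra > self.capacity:
                return False        # admission rejected: no capacity left
            self.reserved += extra
            self.signals += 1
        self.used += demand
        return True

cos = CoSReservation(link_capacity=100.0, chunk=10.0)
admitted = sum(cos.admit(1.0) for _ in range(95))
# 10 signalling events for 95 admitted sessions, versus 95 events
# with plain per-session reservation.
print(f"admitted={admitted}, signalling events={cos.signals}")
```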