22 results for leave to withdraw admissions


Relevance: 30.00%

Abstract:

Objective: To describe the use of a multifaceted strategy for recruiting general practitioners (GPs) and community pharmacists to be interviewed about medication errors that had resulted in preventable drug-related admissions to hospital. This is a potentially sensitive subject with medicolegal implications. Setting: Four primary care trusts and one teaching hospital in the UK. Method: Letters were mailed to community pharmacists and general practitioners asking for provisional consent to be interviewed and permission to contact them again should a patient be admitted to hospital as a result of a medication error. In addition, GPs were asked for permission to approach their patients should they be admitted to hospital. A multifaceted approach to recruitment was used, including gaining support for the study from professional defence agencies and local champions. Key findings: Eighty-one percent (310/385) of GPs and 62% (93/149) of community pharmacists responded to the letters. Eighty-six percent (266/310) of GPs who responded and 81% (75/93) of community pharmacists who responded gave provisional consent to participate in interviews. All GPs (14 out of 14) and community pharmacists (10 out of 10) who were subsequently asked to participate, when patients were admitted to hospital, agreed to be interviewed. Conclusion: The multifaceted approach to recruitment was associated with a high response rate when asking healthcare professionals to be interviewed about medication errors that had resulted in preventable drug-related morbidity.

Relevance: 30.00%

Abstract:

Objective: To explore the causes of preventable drug-related admissions (PDRAs) to hospital. Design: Qualitative case studies using semi-structured interviews and medical record review; data analysed using a framework derived from Reason's model of organisational accidents and cascade analysis. Participants: 62 participants, including 18 patients, 8 informal carers, 17 general practitioners, 12 community pharmacists, 3 practice nurses and 4 other members of healthcare staff, involved in events leading up to the patients' hospital admissions. Setting: Nottingham, UK. Results: PDRAs are associated with problems at multiple stages in the medication use process, including prescribing, dispensing, administration, monitoring and help seeking. The main causes of these problems are communication failures (between patients and healthcare professionals and between different groups of healthcare professionals) and knowledge gaps (about drugs and patients' medical and medication histories). The causes of PDRAs are similar irrespective of whether the hospital admission is associated with a prescribing, monitoring or patient adherence problem. Conclusions: The causes of PDRAs are multifaceted and complex. Technical solutions to PDRAs will need to take account of this complexity and are unlikely to be sufficient on their own. Interventions targeting the human causes of PDRAs are also necessary - for example, improving methods of communication.

Relevance: 30.00%

Abstract:

This paper presents an efficient construction algorithm for obtaining sparse kernel density estimates based on a regression approach that directly optimizes model generalization capability. Computational efficiency of the density construction is ensured using an orthogonal forward regression, and the algorithm incrementally minimizes the leave-one-out test score. A local regularization method is incorporated naturally into the density construction process to further enforce sparsity. An additional advantage of the proposed algorithm is that it is fully automatic and the user is not required to specify any criterion to terminate the density construction procedure. This is in contrast to an existing state-of-the-art kernel density estimation method using the support vector machine (SVM), where the user is required to specify some critical algorithm parameter. Several examples are included to demonstrate the ability of the proposed algorithm to effectively construct a very sparse kernel density estimate with comparable accuracy to that of the full sample optimized Parzen window density estimate. Our experimental results also demonstrate that the proposed algorithm compares favorably with the SVM method, in terms of both test accuracy and sparsity, for constructing kernel density estimates.
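The abstract does not spell out the orthogonal forward regression itself, but the leave-one-out scoring idea it builds on can be illustrated with the baseline it compares against: a Gaussian Parzen window estimate whose bandwidth is chosen by maximizing the mean LOO log-likelihood. The names and the bandwidth grid below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def parzen_density(x, centers, h):
    """Gaussian Parzen window estimate at points x, built from the given centers."""
    diff = x[:, None] - centers[None, :]
    k = np.exp(-0.5 * (diff / h) ** 2) / (h * np.sqrt(2.0 * np.pi))
    return k.mean(axis=1)

def loo_log_score(data, h):
    """Mean leave-one-out log-likelihood: each point is scored by a density
    built from the remaining n-1 samples."""
    n = len(data)
    diff = data[:, None] - data[None, :]
    k = np.exp(-0.5 * (diff / h) ** 2) / (h * np.sqrt(2.0 * np.pi))
    np.fill_diagonal(k, 0.0)          # exclude each point from its own estimate
    loo = k.sum(axis=1) / (n - 1)
    return np.log(loo).mean()

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=200)
bandwidths = np.linspace(0.05, 1.0, 20)
scores = [loo_log_score(data, h) for h in bandwidths]
best_h = bandwidths[int(np.argmax(scores))]
```

Zeroing the diagonal of the kernel matrix yields every LOO density in one vectorized pass rather than n refits; the paper's construction applies the same style of LOO criterion while greedily selecting a sparse subset of kernels instead of using all samples.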

Relevance: 30.00%

Abstract:

A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross-validation is often used to estimate generalization error when choosing among different network architectures (M. Stone, "Cross-validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of a LOO criterion, either the mean of squared LOO errors or the LOO misclassification rate, we present two backward elimination algorithms as model post-processing procedures for regression and classification problems respectively. The proposed backward elimination procedures exploit an orthogonalization procedure to maintain orthogonality between the subspace spanned by the pruned model and the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared with most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) model structure selection is based directly on model generalization performance. Illustrative regression and classification examples demonstrate that the proposed algorithms are viable post-processing methods for pruning a model to gain extra sparsity and improved generalization.
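The paper's own recursive formulae are not reproduced in the abstract; as a minimal sketch of why LOO pruning need not split the data, the snippet below uses the classical closed-form LOO residual for ordinary least squares, e_i / (1 - h_ii) (the PRESS identity), inside a greedy backward-elimination loop. The function names, greedy scheme, and synthetic data are illustrative assumptions, not the authors' orthogonalized algorithm:

```python
import numpy as np

def loo_mse(X, y):
    """Closed-form leave-one-out MSE for linear least squares (PRESS / n):
    loo residual e_i / (1 - h_ii), no need to refit n times."""
    H = X @ np.linalg.pinv(X)                  # hat matrix
    resid = y - H @ y
    loo_resid = resid / (1.0 - np.diag(H))
    return np.mean(loo_resid ** 2)

def backward_eliminate(X, y):
    """Repeatedly drop the regressor whose removal most reduces the LOO MSE;
    stop when no single removal improves it."""
    cols = list(range(X.shape[1]))
    best = loo_mse(X[:, cols], y)
    while len(cols) > 1:
        trials = [(loo_mse(X[:, [j for j in cols if j != c]], y), c)
                  for c in cols]
        score, c = min(trials)
        if score < best:
            best = score
            cols.remove(c)
        else:
            break
    return cols, best

# Hypothetical demo: y depends only on columns 0 and 2 of five candidates.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - X[:, 2] + 0.1 * rng.normal(size=100)
kept, press = backward_eliminate(X, y)
```

The stopping rule here is exactly property (ii) in the abstract: elimination halts automatically once the LOO criterion stops improving, with no validation set or tuning parameter; the paper obtains the same effect more cheaply through its orthogonalized recursion.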

Relevance: 30.00%

Abstract:

This article examines the early evolution of British policy towards inward foreign direct investment (FDI), prior to the Second World War. The British government adopted an ‘open’ policy towards FDI, despite periodic fears that some foreign acquisitions of UK firms in key sectors might be detrimental to the national interest, and a few ad hoc attempts to deal with particular instances of this kind. During the 1930s, when the inflow of foreign firms accelerated following Britain's adoption of general tariff protection, the government developed a sophisticated admissions policy, based on an assessment of the likely net benefit of each applicant to the British economy. Its limited regulatory powers were used to maximize the potential of immigrant firms for technology transfer, enhanced competition, industrial diversification, and employment creation (particularly in the depressed regions), while protecting British industries suffering from excess capacity.

Relevance: 30.00%

Abstract:

Widespread commercial use of the internet has significantly increased the volume and scope of data being collected by organisations. ‘Big data’ has emerged as a term to encapsulate both the technical and commercial aspects of this growing data collection activity. To date, much of the discussion of big data has centred upon its transformational potential for innovation and efficiency, yet there has been less reflection on its wider implications beyond commercial value creation. This paper builds upon normal accident theory (NAT) to analyse the broader ethical implications of big data. It argues that the strategies behind big data require organisational systems that leave them vulnerable to normal accidents, that is to say some form of accident or disaster that is both unanticipated and inevitable. Whilst NAT has previously focused on the consequences of physical accidents, this paper suggests a new form of system accident that we label data accidents. These have distinct, less tangible and more complex characteristics and raise significant questions over the role of individual privacy in a ‘data society’. The paper concludes by considering the ways in which the risks of such data accidents might be managed or mitigated.

Relevance: 30.00%

Abstract:

The military offers a form of welfare-for-work, but when personnel leave they lose this safety net, a loss exacerbated by the rollback neoliberalism of the contemporary welfare state. Increasingly the third sector has stepped in to address veterans’ welfare needs through operating within and across military/civilian and state/market/community spaces and cultures. In this paper we use both veterans’ and military charities’ experiences to analyse the complex politics that govern the liminal boundary zone of post-military welfare. Through exploring ‘crossing’ and ‘bridging’ we conceptualise military charities as ‘boundary subjects’, active yet dependent on the continuation of the civilian-military binary, and argue that the latter is better understood as a multidirectional, multiscalar and contextual continuum. Post-military welfare emerges as a competitive, confused and confusing assemblage that needs to be made more navigable in order to better support the ‘heroic poor’.