911 results for critical approaches
Abstract:
Mitigation of diffuse nutrient and sediment delivery to streams requires successful identification and management of critical source areas within catchments. Approaches to predicting high-risk areas for sediment loss have typically relied on structural drivers of connectivity and risk, with little consideration given to process-driven water quality responses. To assess the applicability of structural metrics to predict critical source areas, geochemical tracing of land use sources was conducted in three headwater agricultural catchments in Co. Down and Co. Louth, Ireland, within a Monte Carlo framework. Outputs were applied to the inverse optimisation of a connectivity model, based on LiDAR DEM data, to assess the efficacy of land use risk weightings to predict sediment source contributions over the 18 month study period in the Louth Upper, Louth Lower and Down catchments. Results of the study indicated sediment proportions over the study period varied from 6 to 10%, 84 to 87%, 4%, and 2 to 3% for the Down Catchment, 79 to 85%, 9 to 17%, 1 to 3% and 2 to 3% in the Louth Upper and 2 to 3%, 79 to 85%, 10 to 17% and 2 to 3% in the Louth Lower for arable, channel bank, grassland, and woodland sources, respectively. Optimised land use risk weightings for each sampling period showed that at the larger catchment scale, no variation in median land use weightings was required to predict land use contributions. However, for the two smaller study catchments, variation in median risk weightings was considerable, which may indicate the importance of functional connectivity processes at this spatial scale. In all instances, arable land consistently generated the highest risk of sediment loss across all catchments and sampling times. This study documents some of the first data on sediment provenance in Ireland and indicates the need for cautious consideration of land use as a tool to predict critical source areas at the headwater scale.
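The geochemical un-mixing step described above can be illustrated with a minimal Monte Carlo sketch in Python (all tracer concentrations and source signatures below are invented for illustration, not the study's data): candidate source proportions are sampled at random and scored by how closely their mixed tracer signature matches the stream sediment.

```python
import random

# Hypothetical mean tracer concentrations for each land-use source
# (three geochemical tracers per source; values are illustrative only).
sources = {
    "arable":       [12.0, 3.5, 40.0],
    "channel bank": [ 8.0, 6.0, 55.0],
    "grassland":    [15.0, 2.0, 30.0],
    "woodland":     [10.0, 4.5, 35.0],
}
mixture = [9.1, 5.3, 51.0]  # hypothetical stream-sediment tracer signature

def random_proportions(n, rng):
    """Draw a random composition summing to 1 (flat Dirichlet via exponentials)."""
    draws = [rng.expovariate(1.0) for _ in range(n)]
    total = sum(draws)
    return [d / total for d in draws]

def misfit(props):
    """Sum of squared differences between predicted and observed tracers."""
    names = list(sources)
    predicted = [sum(p * sources[s][t] for p, s in zip(props, names))
                 for t in range(len(mixture))]
    return sum((pred - obs) ** 2 for pred, obs in zip(predicted, mixture))

def monte_carlo_unmix(iterations=20000, seed=1):
    """Keep the best-fitting proportion vector seen over many random draws."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(iterations):
        props = random_proportions(len(sources), rng)
        err = misfit(props)
        if err < best_err:
            best, best_err = props, err
    return dict(zip(sources, best)), best_err

proportions, err = monte_carlo_unmix()
for name, p in proportions.items():
    print(f"{name}: {p:.2f}")
```

In practice, sediment fingerprinting studies retain a distribution of acceptable solutions rather than a single best fit, which is what makes the Monte Carlo framing informative about uncertainty.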
Abstract:
Mobile malware has been growing in scale and complexity spurred by the unabated uptake of smartphones worldwide. Android is fast becoming the most popular mobile platform, resulting in a sharp increase in malware targeting the platform. Additionally, Android malware is evolving rapidly to evade detection by traditional signature-based scanning. Despite current detection measures in place, timely discovery of new malware is still a critical issue. This calls for novel approaches to mitigate the growing threat of zero-day Android malware. Hence, the authors develop and analyse proactive machine-learning approaches based on Bayesian classification aimed at uncovering unknown Android malware via static analysis. The study, which is based on a large malware sample set covering the majority of existing families, demonstrates detection capabilities with high accuracy. Empirical results and comparative analysis are presented, offering useful insight towards development of effective static-analytic Bayesian classification-based solutions for detecting unknown Android malware.
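The kind of Bayesian classification described above can be sketched with a tiny Bernoulli naive Bayes classifier over binary static features (for example, presence of particular permissions or API calls in a decompiled APK); the feature vectors and labels below are invented toy data, not the paper's sample set.

```python
import math

# Toy binary feature vectors (1 = static feature present), with label
# 1 = malware, 0 = benign. Entirely invented for illustration.
train = [
    ([1, 1, 0, 1], 1),
    ([1, 0, 1, 1], 1),
    ([0, 0, 1, 0], 0),
    ([0, 1, 0, 0], 0),
]

def fit_bernoulli_nb(data, n_features):
    """Estimate class priors and per-feature Bernoulli parameters
    with Laplace smoothing."""
    counts = {0: 0, 1: 0}
    feat = {0: [0] * n_features, 1: [0] * n_features}
    for x, y in data:
        counts[y] += 1
        for j, v in enumerate(x):
            feat[y][j] += v
    model = {}
    for y in (0, 1):
        prior = counts[y] / len(data)
        probs = [(feat[y][j] + 1) / (counts[y] + 2) for j in range(n_features)]
        model[y] = (prior, probs)
    return model

def predict(model, x):
    """Return the class with the higher log-posterior."""
    scores = {}
    for y, (prior, probs) in model.items():
        s = math.log(prior)
        for v, p in zip(x, probs):
            s += math.log(p if v else 1 - p)
        scores[y] = s
    return max(scores, key=scores.get)

model = fit_bernoulli_nb(train, 4)
print(predict(model, [1, 1, 1, 1]))  # sample rich in "suspicious" features
```

A production detector would extract thousands of static features and handle class imbalance, but the posterior-comparison core is the same.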
Abstract:
1. Quantitative reconstruction of past vegetation distribution and abundance from sedimentary pollen records provides an important baseline for understanding long term ecosystem dynamics and for the calibration of earth system process models such as regional-scale climate models, widely used to predict future environmental change. Most current approaches assume that the amount of pollen produced by each vegetation type, usually expressed as a relative pollen productivity term, is constant in space and time.
2. Estimates of relative pollen productivity can be extracted from extended R-value analysis (Parsons and Prentice, 1981) using comparisons between pollen assemblages deposited into sedimentary contexts, such as moss polsters, and measurements of the present day vegetation cover around the sampled location. The vegetation survey method has been shown to have a profound effect on estimates of model parameters (Bunting and Hjelle, 2010); therefore a standard method is an essential pre-requisite for testing some of the key assumptions of pollen-based reconstruction of past vegetation, such as the assumption that relative pollen productivity is effectively constant in space and time within a region or biome.
3. This paper systematically reviews the assumptions and methodology underlying current models of pollen dispersal and deposition, and thereby identifies the key characteristics of an effective vegetation survey method for estimating relative pollen productivity in a range of landscape contexts.
4. It then presents the methodology used in a current research project, developed during a practitioner workshop. The method selected is pragmatic, designed to be replicable by different research groups, usable in a wide range of habitats, and requiring minimum effort to collect adequate data for model calibration rather than representing some ideal or required approach. Using this common methodology will allow project members to collect multiple measurements of relative pollen productivity for major plant taxa from several northern European locations in order to test the assumption of uniformity of these values within the climatic range of the main taxa recorded in pollen records from the region.
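The extended R-value relationship referred to in point 2 can be written, in one common linear formulation (notation varies between ERV submodels, so this is a generic sketch rather than the exact form used in the cited papers):

```latex
p_{ik} = \alpha_i \, v_{ik} + \omega_i
```

where \(p_{ik}\) is the pollen loading of taxon \(i\) at site \(k\), \(v_{ik}\) is the distance-weighted vegetation abundance of that taxon around the site, \(\alpha_i\) is its relative pollen productivity, and \(\omega_i\) is a background pollen term; extended R-value analysis estimates \(\alpha_i\) and \(\omega_i\) by fitting this relation across many surveyed sites, which is why a standardised vegetation survey matters so much.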
Abstract:
Background: There is growing interest in the potential utility of real-time polymerase chain reaction (PCR) in diagnosing bloodstream infection by detecting pathogen deoxyribonucleic acid (DNA) in blood samples within a few hours. SeptiFast (Roche Diagnostics GmBH, Mannheim, Germany) is a multipathogen probe-based system targeting ribosomal DNA sequences of bacteria and fungi. It detects and identifies the commonest pathogens causing bloodstream infection. As background to this study, we report a systematic review of Phase III diagnostic accuracy studies of SeptiFast, which reveals uncertainty about its likely clinical utility based on widespread evidence of deficiencies in study design and reporting with a high risk of bias.
Objective: Determine the accuracy of SeptiFast real-time PCR for the detection of health-care-associated bloodstream infection, against standard microbiological culture.
Design: Prospective multicentre Phase III clinical diagnostic accuracy study using the standards for the reporting of diagnostic accuracy studies criteria.
Setting: Critical care departments within NHS hospitals in the north-west of England.
Participants: Adult patients requiring blood culture (BC) when developing new signs of systemic inflammation.
Main outcome measures: SeptiFast real-time PCR results at species/genus level compared with microbiological culture in association with independent adjudication of infection. Metrics of diagnostic accuracy were derived including sensitivity, specificity, likelihood ratios and predictive values, with their 95% confidence intervals (CIs). Latent class analysis was used to explore the diagnostic performance of culture as a reference standard.
Results: Of 1006 new patient episodes of systemic inflammation in 853 patients, 922 (92%) met the inclusion criteria and provided sufficient information for analysis. Index test assay failure occurred on 69 (7%) occasions. Adult patients had been exposed to a median of 8 days (interquartile range 4–16 days) of hospital care, had high levels of organ support activities and recent antibiotic exposure. SeptiFast real-time PCR, when compared with culture-proven bloodstream infection at species/genus level, had better specificity (85.8%, 95% CI 83.3% to 88.1%) than sensitivity (50%, 95% CI 39.1% to 60.8%). When compared with pooled diagnostic metrics derived from our systematic review, our clinical study revealed lower test accuracy of SeptiFast real-time PCR, mainly as a result of low diagnostic sensitivity. There was a low prevalence of BC-proven pathogens in these patients (9.2%, 95% CI 7.4% to 11.2%) such that the post-test probabilities of both a positive (26.3%, 95% CI 19.8% to 33.7%) and a negative SeptiFast test (5.6%, 95% CI 4.1% to 7.4%) indicate the potential limitations of this technology in the diagnosis of bloodstream infection. However, latent class analysis indicates that BC has a low sensitivity, questioning its relevance as a reference test in this setting. Using this analysis approach, the sensitivity of the SeptiFast test was low but also appeared significantly better than BC. Blood samples identified as positive by either culture or SeptiFast real-time PCR were associated with a high probability (> 95%) of infection, indicating higher diagnostic rule-in utility than was apparent using conventional analyses of diagnostic accuracy.
Conclusion: SeptiFast real-time PCR on blood samples may have rapid rule-in utility for the diagnosis of health-care-associated bloodstream infection, but the lack of sensitivity is a significant limiting factor. Innovations aimed at improved diagnostic sensitivity of real-time PCR in this setting are urgently required. Future work recommendations include technology developments to improve the efficiency of pathogen DNA extraction, the capacity to detect a much broader range of pathogens and drug resistance genes, and the application of new statistical approaches able to more reliably assess test performance in situations where the reference standard (e.g. blood culture in the setting of high antimicrobial use) is prone to error.
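The diagnostic accuracy metrics reported above (sensitivity, specificity, likelihood ratios, predictive values) all derive from a 2×2 table of index test result versus reference standard. A minimal sketch, using invented counts chosen only to mimic a low-prevalence setting, not the study's actual data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic accuracy metrics from
    true/false positives and negatives."""
    sens = tp / (tp + fn)        # sensitivity (true positive rate)
    spec = tn / (tn + fp)        # specificity (true negative rate)
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    ppv = tp / (tp + fp)         # post-test probability given a positive
    npv = tn / (tn + fn)         # 1 - post-test probability given a negative
    return {"sensitivity": sens, "specificity": spec,
            "LR+": lr_pos, "LR-": lr_neg, "PPV": ppv, "NPV": npv}

# Invented counts for a cohort where only ~9% are culture-positive.
m = diagnostic_metrics(tp=42, fp=120, fn=42, tn=718)
for name, value in m.items():
    print(f"{name}: {value:.3f}")
```

With low prevalence, even a reasonably specific test yields a modest positive predictive value, which is the arithmetic behind the abstract's caution about post-test probabilities.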
Abstract:
In this paper, we consider the variable selection problem for a nonlinear non-parametric system. Two approaches are proposed: a top-down approach and a bottom-up approach. The top-down algorithm selects a variable by detecting whether the corresponding partial derivative is zero at the point of interest. The algorithm is shown to achieve not only parameter convergence but also set convergence. This is critical because the variable selection problem is binary: a variable is either selected or not. The bottom-up approach is based on forward/backward stepwise selection, which is designed to work when the data length is limited. Both approaches determine the most important variables locally and allow the unknown non-parametric nonlinear system to have different local dimensions at different points of interest. Further, two potential applications along with numerical simulations are provided to illustrate the usefulness of the proposed algorithms.
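A forward stepwise scheme of the general kind mentioned above can be sketched as greedy residual fitting (this is a generic illustration, not the authors' algorithm): at each step, add the variable that most reduces the residual sum of squares.

```python
import random

def univariate_fit(x, r):
    """Least-squares slope of residual r regressed on variable x (no intercept)."""
    num = sum(xi * ri for xi, ri in zip(x, r))
    den = sum(xi * xi for xi in x) or 1e-12
    return num / den

def forward_stepwise(X, y, max_vars):
    """Greedy forward selection: repeatedly add the variable that most
    reduces the residual sum of squares; stop if nothing improves."""
    n_vars = len(X[0])
    selected, residual = [], list(y)
    for _ in range(max_vars):
        best_j, best_rss, best_res = None, sum(r * r for r in residual), None
        for j in range(n_vars):
            if j in selected:
                continue
            col = [row[j] for row in X]
            b = univariate_fit(col, residual)
            new_res = [r - b * c for r, c in zip(residual, col)]
            rss = sum(r * r for r in new_res)
            if rss < best_rss:
                best_j, best_rss, best_res = j, rss, new_res
        if best_j is None:  # no candidate improves the fit: stop early
            break
        selected.append(best_j)
        residual = best_res
    return selected

# Synthetic data: y depends only on variables 0 and 2 out of five.
rng = random.Random(0)
X = [[rng.uniform(-1, 1) for _ in range(5)] for _ in range(200)]
y = [2 * row[0] - 3 * row[2] + rng.gauss(0, 0.05) for row in X]
print(forward_stepwise(X, y, max_vars=2))
```

On this synthetic example the procedure recovers the two truly active variables; the local flavour in the paper comes from restricting such fits to data near the point of interest.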
Abstract:
Shared decision-making (SDM) is a high priority in healthcare policy and is complementary to the recovery philosophy in mental health care. This agenda has been operationalised within the Values-Based Practice (VBP) framework, which offers a theoretical and practical model to promote democratic interprofessional approaches to decision-making. However, these are limited by a lack of recognition of the implications of power implicit within the mental health system. This study considers issues of power within the context of decision-making and examines to what extent decisions about patients' care on acute in-patient wards are perceived to be shared. Focus groups were conducted with 46 mental health professionals, service users, and carers. The data were analysed using the framework of critical narrative analysis (CNA). The findings of the study suggested each group constructed different identity positions, which placed them as inside or outside of the decision-making process. This reflected their view of themselves as best placed to influence a decision on behalf of the service user. In conclusion, the discourse of VBP and SDM needs to take account of how differentials of power and the positioning of speakers affect the context in which decisions take place.
Abstract:
This article explores policy approaches to educating populations for potential critical infrastructure collapse in five different countries: the UK, the US, Germany, Japan and New Zealand. ‘Critical infrastructure’ is not always easy to define, and indeed is defined slightly differently across countries – it includes entities vital to life, such as utilities (water, energy), transportation systems and communications, and may also include social and cultural infrastructure. The article is a mapping exercise of different approaches to critical infrastructure protection and preparedness education by the five countries. The exercise facilitates a comparison of the countries and enables us to identify distinctive characteristics of each country’s approach. We argue that contrary to what most scholars of security have argued, these national approaches diverge greatly, suggesting that they are shaped more by internal politics and culture than by global approaches.
Abstract:
Doctoral thesis, Pharmacy (Pharmaceutical and Therapeutic Chemistry), Universidade de Lisboa, Faculdade de Farmácia, 2016
Abstract:
This thesis analyses how dominant policy approaches to peacebuilding have moved away from a single and universalised understanding of peace to be achieved through a top-down strategy of democratisation and economic liberalisation, prevalent at the beginning of the 1990s. Instead, throughout the 2000s, peacebuilders have increasingly adopted a commitment to cultivating a bottom-up and hybrid peacebuilding process that is context-sensitive and intended to be more respectful of the needs and values of post-war societies. The projects of statebuilding in Kosovo and, to a lesser extent, in Bosnia are examined to illustrate the shift. By capturing this shift, I seek to argue that contemporary practitioners of peace are sharing the sensibility of the theoretical critics of liberalism. These critics have long contended that post-war societies cannot be governed from 'above' and have advocated the adoption of a bottom-up approach to peacebuilding. Now, both peace practitioners and their critics share the tendency to embrace difference in peacebuilding operations, but this shift has failed to address meaningfully the problems and concerns of post-conflict societies. The conclusion of this research is that, drawing on the assumption that these societies are not capable of undertaking sovereign acts because of their problematic inter-subjective frames, the discourses of peacebuilding (in policy-making and academic critique) have increasingly legitimised an open-ended role of interference by external agencies, which now operate from 'below'. Peacebuilding has turned into a long-term process, in which international and local actors engage relationally in the search for ever-more emancipatory hybrid outcomes, but in which self-government and self-determination are constantly deferred.
Processes of emphasising difference have thus denied the political autonomy of post-war societies and have continuously questioned the political and human equality of these populations in a hierarchically divided world.
Abstract:
The key argument set out in this article is that historical and comparative forms of investigation are necessary if we are to better understand the ambitions and scope of contemporary housing interventions. To demonstrate the veracity of our claim we have set out an analysis of UK housing policies enacted in the mid-1970s as a basis for comparison with those pursued forty years later. The article begins with a critical summary of some of the methodological approaches researchers have adopted to interpret housing policy. In the main section we present our critical analysis of housing policy reforms (implemented by the Labour government between 1974 and 1979), noting both their achievements and limitations. In the concluding section, we use our interpretation of this period as a basis to judge contemporary housing policy and reflect on the methodological issues that arise from our analysis.
Abstract:
Pelvic floor anatomy is complex and its three-dimensional organization is often difficult to understand for both undergraduate and postgraduate students. Here, we focus on several critical points that need to be considered when teaching the perineum. We have to deal with a mixed population of students with a variety of interests. Yet, a perfect knowledge of the pelvic floor is the basis for any gynecologist and for any surgical intervention. Our objectives are several-fold: i) to establish the objectives and the best way of teaching; ii) to identify and localize areas in the female pelvic floor that are susceptible to generate problems in understanding the three-dimensional organization; iii) to create novel approaches by respecting the anatomical surroundings; and iv) prospectively, to identify elements that may create problems during surgery, i.e. to have a closer look at nerve trajectories and at compression sites that may cause neuralgia or postoperative pain. Feedback from students indicates that they have difficulty assimilating this much information, especially the different imaging techniques. Eventually, this will require a strict selection of what has to be taught and included in lectures or practicals. Another consequence is that more time needs to be given to studying prosected pelves.
Abstract:
INTRODUCTION: Dendritic cells (DCs) are the most important antigen-presenting cell population for activating antitumor T-cell responses; therefore, they offer a unique opportunity for specific targeting of tumors. AREAS COVERED: We will discuss the critical factors for the enhancement of DC vaccine efficacy: different DC subsets, types of in vitro DC manufacturing protocol, types of tumor antigen to be loaded and finally different adjuvants for activating them. We will cover potential combinatorial strategies with immunomodulatory therapies: depleting T-regulatory (Treg) cells, blocking VEGF and blocking inhibitory signals. Furthermore, recommendations to incorporate these criteria into DC-based tumor immunotherapy will be suggested. EXPERT OPINION: Monocyte-derived DCs are the most widely used DC subset in the clinic, whereas Langerhans cells and plasmacytoid DCs are two emerging DC subsets that are highly effective in eliciting cytotoxic T lymphocyte responses. Depending on the type of tumor antigens selected for loading DCs, it is important to optimize a protocol that will generate highly potent DCs. The future aim of DC-based immunotherapy is to combine it with one or more immunomodulatory therapies, for example, Treg cell depletion, VEGF blockage and T-cell checkpoint blockage, to elicit the most optimal antitumor immunity to induce long-term remission or even cure cancer patients.
Abstract:
This thesis uses critical discourse analysis (CDA) to explore and examine direct-to-consumer (DTC) pharmaceutical drug advertisements appearing in four issues of O, The Oprah Magazine in 2006. The theoretical underpinnings of this thesis emerge from social scientific and feminist analyses regarding the medicalization of everyday life. The findings of this study highlight three types of discourses used by pharmaceutical companies. First, I explore the use of historical and contemporary gender norms to sell pharmacological products; second, I examine discourses which normalize the use of chemical solutions as the first line of defense to address a wide range of everyday problems; and finally, I assess how pharmaceutical advertisements provide an illusion of autonomy by responsibilizing individuals as patients, at the same time as they suggest that real independence can only be achieved with medication. My discussion of these themes also includes an analysis of why O Magazine, which explicitly promotes women's empowerment through holistic approaches to health and personal growth, might support such advertising. Thus I explore: how does DTC advertising benefit both pharmaceutical companies and O Magazine itself? I conclude with a brief discussion of the larger implications of DTC advertising for women's health.
Abstract:
Many international, political, and economic influences led to increased demands for development of new quality assurance systems for universities. Like many policies and processes that aim to assure quality, Ontario’s Quality Assurance Framework (QAF) did not define quality. This study sought to explore conceptions of quality and approaches to quality assurance used within Ontario’s universities. A document analysis of the QAF’s rationale and structure suggested that quality was conceived primarily as fitness for purpose, while suggested indicators represented an exceptional conception of quality. Ontario universities perpetuated such confusion by adopting the framework without customizing it to their institutional conceptions of quality. Drawing upon phenomenographic traditions, a qualitative investigation was conducted to better understand various conceptions of quality held by university administrators and to appreciate ways in which they implemented the QAF. Three main approaches to quality assurance were identified: (a) Defending Quality, characterized by conceptions of quality as exceptional, which focuses on administrative accountability and uses a hands-off strategy to defend traditional notions of quality inputs and resources; (b) Demonstrating Quality, characterized by conceptions of quality as fitness for purpose and value for money, which focuses on accountability to students and uses centralized engaged strategies to demonstrate how programs meet current priorities and intended outcomes; and (c) Enhancing Quality, characterized by conceptions of quality as transformation, which focuses on reflection and learning experience and uses engaged strategies to find new ways of improving learning and teaching. The development of a campus culture that values the institution’s function in student learning and quality teaching would benefit from Enhancing Quality approaches to quality assurance. 
This would require holistic consideration of the beliefs held by members of the institution, a clear articulation of the institution’s conceptions of quality, and a critical analysis of how these conceptions align with institutional practices and policies.
Abstract:
This paper develops some theoretical and methodological considerations for the development of a critical competence model (CCM). The model is defined as a functionally organized set of skills and knowledge that allows measurable results with positive consequences for strategic business objectives. The theoretical approaches of the classical model of competences, the contemporary model of competencies and the human competencies model were reviewed in developing the proposal. Implementation of the model includes five steps: 1) conduct a job analysis, considering which dimensions or facets are subject to revision; 2) identify people with contrasting performance (higher performers and lower performers); 3) identify the critical incidents most relevant to the job position; 4) develop behavioral expectation scales (BES); and 5) have the resulting BES validated by experts in the field. As a final consideration, it is determined that competence models require accurate measurement. Approaches marked by excessive theoreticism may cause the issue of competence to become a business fashion with low or minimal impact, affecting its validity, reliability and deployment in organizations.