Abstract:
Purpose: The purpose of this paper is to theorise and empirically examine the views of various NGO stakeholders on the role of donors in facilitating beneficiary accountability.
Method: The paper adopts a case study design and draws primarily on semi-structured interviews with the officials of a large development NGO, donor representatives and regulators.
Findings: We find that donor accountability contains both enabling and constraining features in relation to beneficiary accountability. Our evidence shows that, while legitimising their own actions, donors’ accountability requirements embed some enabling provisions of beneficiary accountability, such as participation, monitoring, evaluation and lesson learning, which facilitate beneficiary accountability (Ebrahim, 2003b). We argue that, by exerting the attributes of power, legitimacy and urgency, donors are in a position to realise their accountability claims (Mitchell, Agle, & Wood, 1997) and can hold funded NGOs to account. Given beneficiaries’ lack of power and regulators’ unwillingness to hold NGOs to account, donors’ accountability can play a complementary role in making an NGO accountable to its beneficiaries. Finally, we capture and illustrate some constraining features of donor accountability which limit the promotion of beneficiary accountability.
Research limitations/implications: The findings have significant implications for policy makers and donors in the context of the current drive by NGOs for self-sustainability via commercial activities, which donors actively encourage.
Originality: This paper provides an alternative theorisation of donor accountability in a development NGO context. It draws on rare qualitative empirical data incorporating the views of multiple groups (including donors, whose views are hitherto rare in the NGO accountability literature) that are directly and/or indirectly involved in setting and negotiating the NGO–donor accountability relationship. It enhances our understanding by providing a more nuanced portrayal of donor accountability.
Abstract:
Transforming Post-Catholic Ireland is the first major book to explore the dynamic religious landscape of contemporary Ireland, north and south, and to analyse the island’s religious transition. It confirms that the Catholic Church’s long-standing ‘monopoly’ has well and truly disintegrated, replaced by a mixed, post-Catholic religious ‘market’ featuring new and growing expressions of Protestantism, as well as other religions. It describes how people of faith are developing ‘extra-institutional’ expressions of religion, keeping their faith alive outside or in addition to the institutional Catholic Church.
Drawing on island-wide surveys of clergy and laypeople, as well as more than 100 interviews, this book describes how people of faith are engaging with key issues such as increased diversity, reconciliation to overcome the island’s sectarian past, and ecumenism. It argues that extra-institutional religion is especially well-suited to address these and other issues due to its freedom and flexibility when compared to traditional religious institutions. It describes how those who practice extra-institutional religion have experienced personal transformation, and analyses the extent to which they have contributed to wider religious, social, and political change. On an island where religion has caused much pain, from clerical sexual abuse scandals, to sectarian violence, to a frosty reception for some immigrants, those who practice their faith outside traditional religious institutions may hold the key to transforming post-Catholic Ireland into a more reconciled society.
Abstract:
Background
Although the General Medical Council recommends that United Kingdom medical students are taught ‘whole person medicine’, spiritual care is variably recognised within the curriculum. Data on teaching delivery and attainment of learning outcomes are lacking. This study ascertained the views of Faculty and students about spiritual care and how to teach and assess competence in delivering such care.
Methods
A questionnaire comprising 28 questions, exploring attitudes to whole person medicine, spirituality and illness, and the training of healthcare staff in providing spiritual care, was designed using a five-point Likert scale. Free-text comments were studied by thematic analysis. The questionnaire was distributed to 1300 students and 106 Faculty at Queen’s University Belfast Medical School.
Results
351 responses (54 staff, 287 students; 25%) were obtained. Over 90% agreed that whole person medicine included physical, psychological and social components; 60% supported inclusion of a spiritual component within the definition. Most supported availability of spiritual interventions for patients, including access to chaplains (71%), counsellors (62%), or members of the patient’s faith community (59%). 90% felt that personal faith/spirituality was important to some patients and 60% agreed that this influenced health. However, 80% felt that doctors should never or rarely share their own spiritual beliefs with patients, and 67% felt they should only do so when specifically invited. Most supported including training on the provision of spiritual care within the curriculum; 40–50% felt this should be optional and 40% mandatory. Small-group teaching was the favoured delivery method. 64% felt that teaching should not be assessed, but among assessment methods, reflective portfolios were most favoured (30%). Students tended to hold more polarised viewpoints but were generally more favourably disposed towards spiritual care than Faculty. Respecting patients’ values and beliefs, and the need for guidance in the provision of spiritual care, were themes identified in the free-text comments.
Conclusions
Students and Faculty generally recognise a spiritual dimension to health and support the provision of spiritual care to appropriate patients. There is a lack of consensus over whether this should be delivered by doctors or left to others. Spiritual issues impacting patient management should be included in the curriculum; agreement is lacking on how such teaching should be delivered and assessed.
Abstract:
Modern cancer research on prognostic and predictive biomarkers demands the integration of established and emerging high-throughput technologies. However, these data are meaningless unless carefully integrated with patient clinical outcome and epidemiological information. Integrated datasets hold the key to discovering new biomarkers and therapeutic targets in cancer. We have developed a novel approach and set of methods, implemented in PICan, for integrating and interrogating phenomic, genomic and clinical datasets to facilitate cancer biomarker discovery and patient stratification. Applied to a known paradigm, the biological and clinical relevance of TP53, PICan was able to recapitulate the known biomarker status and prognostic significance at the DNA, RNA and protein levels.
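The kind of patient-keyed integration described can be sketched in a few lines; the record fields, values and helper function below are hypothetical illustrations, not PICan's actual schema or API:

```python
# Illustrative sketch of multi-omic data integration keyed on patient ID.
# All field names and values are invented for the example.

def integrate(patients, *datasets):
    """Merge per-patient records from several sources into one dict each."""
    merged = {pid: {"patient_id": pid} for pid in patients}
    for data in datasets:
        for pid, record in data.items():
            if pid in merged:
                merged[pid].update(record)
    return merged

genomic = {"P1": {"TP53_mutated": True}, "P2": {"TP53_mutated": False}}
protein = {"P1": {"p53_ihc": "high"}, "P2": {"p53_ihc": "low"}}
clinical = {"P1": {"survival_months": 18}, "P2": {"survival_months": 60}}

cohort = integrate(["P1", "P2"], genomic, protein, clinical)

# Stratify by TP53 status for a downstream survival comparison.
mutant = [p for p in cohort.values() if p["TP53_mutated"]]
```

Joining every data type on a shared patient identifier is what makes the subsequent stratification (here, by TP53 status) a one-line query.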
Abstract:
This chapter explores the extent to which courts can contribute to the countering of terrorism. It suggests that the contribution will depend on the type of actor the courts are attempting to hold to account as well as on the powers that are conferred on courts by national and international legal regimes. It concludes that courts are most legitimate and effective in relation to terrorist suspects and law enforcers, but less so in relation to counter-terrorism operatives and law-makers.
Abstract:
In Contingent Valuation studies, researchers often base their definition of the environmental good on scientific/expert consensus. However, respondents may not hold this same commodity definition prior to the transaction. This raises questions as to the potential for staging a satisfactory transaction, based on Fischhoff and Furby's (1988) criteria. Some unresolved issues regarding the provision of information to respondents to facilitate such a transaction are highlighted. In this paper, we apply content analysis to focus group discussions and develop a set of rules, which take account of the non-independence of group data, to explore whether researchers’ and respondents' prior definitions are in any way similar. We use the results to guide information provision in a subsequent questionnaire.
Abstract:
The impending and increasing threat of antimicrobial resistance has led to a greater focus on developing alternative therapies as substitutes for traditional antibiotics in the treatment of multi-drug-resistant infections [1]. Our group has developed a library of short, cost-effective, diphenylalanine-based peptides (X1-FF-X2) which selectively eradicate (viability reduced >90% in 24 hours) the most resistant biofilm forms of a range of Gram-positive and Gram-negative pathogens, including methicillin-resistant and -sensitive Staphylococcus aureus and Staphylococcus epidermidis, Pseudomonas aeruginosa, Proteus mirabilis and Escherichia coli. They demonstrate a reduced cell cytotoxicity profile (NCTC929 murine fibroblasts) and limited haemolysis [2]. Our molecules have the ability to respond to subtle changes in pH associated with bacterial infection, self-assembling to form β-sheet secondary structures and supramolecular hydrogels at low concentrations (~0.5% w/v). Conjugation of a variety of aromatic drugs at the X1 position, including non-steroidal anti-inflammatories (NSAIDs), confers further pharmacological properties on the peptide motif, enhancing its therapeutic potential. In vivo studies using waxworms (Galleria mellonella) provide promising preliminary results demonstrating the low toxicity and high antimicrobial activity of these low-molecular-weight gelators in animal models. This work shows that biofunctional peptide-based nanomaterials hold great promise for future translation to patients as antimicrobial drug delivery and biomaterial platforms [3].
[1] G. Laverty, S.P. Gorman and B.F. Gilmore. Int. J. Mol. Sci. 2011, 12, 6566–6596.
[2] G. Laverty, A.P. McCloskey, B.F. Gilmore, D.S. Jones, J. Zhou and B. Xu. Biomacromolecules. 2014, 15(9), 3429–3439.
[3] A.P. McCloskey, B.F. Gilmore and G. Laverty. Pathogens. 2014, 3, 791–821.
Abstract:
Biodegradable polymers such as PLA (polylactide) come from renewable resources like corn starch and, if disposed of correctly, degrade and become harmless to the ecosystem, making them attractive alternatives to petroleum-based polymers. PLA in particular is used in a variety of applications, including medical devices, food packaging and waste-disposal packaging. However, the industry faces challenges in melt processing of PLA due to its poor thermal stability, which is influenced by processing temperatures and shearing.
Identification and control of suitable processing conditions is extremely challenging, usually relying on trial and error, and is often sensitive to batch-to-batch variations. Off-line assessment in a lab environment can result in high scrap rates, long lead times, and lengthy and expensive process development. Scrap rates are typically in the region of 25–30% for medical-grade PLA costing between €2,000 and €5,000/kg.
Additives are used to enhance material properties, such as mechanical performance, and may also play a therapeutic role in the case of bioresorbable medical devices; for example, the release of calcium from orthopaedic implants such as fixation screws promotes healing. Additives can also reduce costs, as less of the polymer resin is required.
This study investigates the scope for monitoring, modelling and optimising processing conditions for twin-screw extrusion of PLA and PLA with calcium carbonate to achieve desired material properties. A DAQ system has been constructed to gather data from a bespoke measurement die, comprising melt temperature, pressure drop along the length of the die, and UV–Vis spectral data, which is shown to correlate with filler dispersion. Trials were carried out under a range of processing conditions using a Design of Experiments approach, and samples were tested for mechanical properties, degradation rate and the release rate of calcium. Relationships between recorded process data and material characterisation results are explored.
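For illustration, a two-level full-factorial design of the sort used in such Design of Experiments trials can be generated in a few lines; the factor names and level values below are hypothetical, not the study's actual settings:

```python
import itertools

# Minimal two-level full-factorial design for a twin-screw extrusion trial.
# Factors and levels are illustrative placeholders only.
factors = {
    "screw_speed_rpm":   [100, 200],
    "barrel_temp_C":     [180, 200],
    "caco3_loading_pct": [0, 10],
}

# Every combination of low/high levels: 2**3 = 8 experimental runs.
runs = [dict(zip(factors, levels))
        for levels in itertools.product(*factors.values())]
```

Each run dictionary then pairs naturally with the DAQ measurements and material test results for that condition, which is what makes the later process-property modelling possible.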
Abstract:
We present a fully-distributed self-healing algorithm, dex, that maintains a constant-degree expander network in a dynamic setting. To the best of our knowledge, our algorithm provides the first efficient distributed construction of expanders whose expansion properties hold deterministically, one that works even under an all-powerful adaptive adversary that controls the dynamic changes to the network (the adversary has unlimited computational power and knowledge of the entire network state, can decide which nodes join and leave and at what time, and knows the past random choices made by the algorithm). Previous distributed expander constructions typically provide only probabilistic guarantees on the network expansion, which rapidly degrade in a dynamic setting; in particular, the expansion properties can degrade even more rapidly under adversarial insertions and deletions. Our algorithm provides efficient maintenance and incurs a low overhead per insertion/deletion by an adaptive adversary: only O(log n) rounds and O(log n) messages are needed with high probability (n is the number of nodes currently in the network). The algorithm requires only a constant number of topology changes. Moreover, our algorithm allows for an efficient implementation and maintenance of a distributed hash table on top of dex with only a constant additional overhead. Our results are a step towards implementing efficient self-healing networks that have guaranteed properties (constant bounded degree and expansion) despite dynamic changes.
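The deterministic expansion property being maintained can be checked directly on small graphs. The brute-force sketch below computes edge expansion h(G) = min over |S| ≤ n/2 of |∂S|/|S|; it is a verification aid for intuition, not the distributed dex algorithm itself:

```python
import itertools

def edge_expansion(n, edges):
    """Brute-force edge expansion of an n-node graph given as an edge list."""
    best = float("inf")
    for k in range(1, n // 2 + 1):
        for subset in itertools.combinations(range(n), k):
            S = set(subset)
            # edges crossing the cut (exactly one endpoint inside S)
            boundary = sum(1 for u, v in edges if (u in S) != (v in S))
            best = min(best, boundary / len(S))
    return best

# A cycle is a poor expander: any arc of 4 nodes is cut by only 2 edges.
cycle = [(i, (i + 1) % 8) for i in range(8)]
# The complete graph is an excellent (if high-degree) expander.
complete = [(u, v) for u in range(8) for v in range(u + 1, 8)]

# edge_expansion(8, cycle) -> 2/4 = 0.5
# edge_expansion(8, complete) -> 16/4 = 4.0
```

The point of a constant-degree expander, as maintained by dex, is to get expansion bounded away from zero (unlike the cycle) without the complete graph's linear degree.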
Gopal Pandurangan has been supported in part by Nanyang Technological University Grant M58110000, Singapore Ministry of Education (MOE) Academic Research Fund (AcRF) Tier 2 Grant MOE2010-T2-2-082, MOE AcRF Tier 1 Grant MOE2012-T1-001-094, and the United States-Israel Binational Science Foundation (BSF) Grant 2008348. Peter Robinson has been supported by Grant MOE2011-T2-2-042 “Fault-tolerant Communication Complexity in Wireless Networks” from the Singapore MoE AcRF-2. Work done in part while the author was at the Nanyang Technological University and at the National University of Singapore. Amitabh Trehan has been supported by the Israeli Centers of Research Excellence (I-CORE) program (Center No. 4/11). Work done in part while the author was at Hebrew University of Jerusalem and at the Technion and supported by a Technion fellowship.
Abstract:
We study how ownership structure and management objectives interact in determining company size, without assuming information constraints or any explicit costs of management. In symmetric agent economies, the optimal company size balances the returns to scale of the production function and the returns to collaboration efficiency. For a general class of payoff functions, we characterize the optimal company size, and we compare the optimal company size across different managerial objectives. We demonstrate the restrictiveness of common assumptions on effort aggregation (e.g., constant elasticity of effort substitution), and we show that common intuition (e.g., that corporate companies are more efficient and therefore will be larger than equal-share partnerships) might not hold in general.
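The central trade-off, returns to scale in production against collaboration efficiency, can be illustrated with a deliberately simple numeric toy; the functional form below is a hypothetical example, not the paper's general payoff class:

```python
# Toy model: per-member payoff with increasing returns to scale in production
# (output n**alpha shared equally, alpha > 1) against a collaboration-
# efficiency loss that grows linearly with company size.

def per_member_payoff(n, alpha=1.5, c=0.05):
    # output per member n**(alpha - 1), minus coordination cost c*n
    return n ** (alpha - 1) - c * n

# Interior optimum where marginal scale gains equal marginal coordination cost:
# d/dn [n**0.5 - 0.05*n] = 0  =>  0.5 / sqrt(n) = 0.05  =>  n = 100.
best_n = max(range(1, 500), key=per_member_payoff)
```

Changing the objective (e.g., maximizing total rather than per-member payoff) shifts the optimum, which is the kind of comparison across managerial objectives the paper formalizes.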
Religious actions speak louder than words: exposure to credibility-enhancing displays predicts theism
Abstract:
One of the central aims of the cognitive science of religion (CSR) is to explain why supernatural agent beliefs are so widespread. A related but distinct aim is to explain why some individuals hold supernatural agent beliefs but others do not. Here, we aim to provide an initial test of the power of exposure to what Henrich calls “credibility enhancing displays” (or “CREDs”) in determining whether or not an individual holds explicit supernatural agent beliefs. We present evidence from two studies of Americans suggesting that exposure to CREDs, as measured by a scale we developed and validated, predicts current theism vs. non-theism, certainty of God’s existence/non-existence, and religiosity while controlling for overall religious socialization. These results are among the first to empirically support the theorized significance of CREDs for the acquisition of supernatural agent beliefs.
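The analysis described, predicting theism from CRED exposure while controlling for overall religious socialization, has the shape of a logistic regression. A self-contained sketch on synthetic data follows; the simulated data and coefficients are illustrations only, not the study's data, scale items, or results:

```python
import math
import random

# Simulate correlated predictors: CRED exposure and religious socialization,
# with theism generated from a logistic model (true effects are invented).
random.seed(42)
data = []
for _ in range(400):
    soc = random.gauss(0, 1)                      # religious socialization
    creds = 0.7 * soc + random.gauss(0, 1)        # CRED exposure, correlated
    logit = 1.5 * creds + 0.5 * soc
    theist = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    data.append((creds, soc, theist))

# Fit P(theist) = sigmoid(b0 + b1*creds + b2*soc) by batch gradient ascent.
b0 = b1 = b2 = 0.0
lr = 0.5 / len(data)
for _ in range(500):
    g0 = g1 = g2 = 0.0
    for x1, x2, y in data:
        p = 1 / (1 + math.exp(-(b0 + b1 * x1 + b2 * x2)))
        g0 += y - p
        g1 += (y - p) * x1
        g2 += (y - p) * x2
    b0, b1, b2 = b0 + lr * g0, b1 + lr * g1, b2 + lr * g2

# b1 recovers a clearly positive CRED effect with the control present.
```

Including the socialization term in the model is what operationalizes "while controlling for overall religious socialization" in the abstract.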
Abstract:
This report summarizes our results from a security analysis covering all 57 first-round candidates of the Competition for Authenticated Encryption: Security, Applicability, and Robustness (CAESAR) and over 210 implementations. We have manually identified security issues with three candidates, two of which are more serious, and these ciphers have been withdrawn from the competition. We have developed a testing framework, BRUTUS, to facilitate automatic detection of simple security lapses and susceptible statistical structures across all ciphers. From this testing, we have security usage notes on four submissions and statistical notes on a further four. We highlight that some of the CAESAR algorithms pose an elevated risk if employed in real-life protocols due to a class of adaptive chosen-plaintext attacks. Although authenticated encryption with associated data (AEAD) schemes are often defined (and are best used) as discrete primitives that authenticate and transmit only complete messages, in practice these algorithms are easily implemented in a fashion that outputs observable ciphertext data before the algorithm has received all of the (attacker-controlled) plaintext. For an implementor, this strategy appears to offer harmless and compliant storage and latency advantages. If the algorithm uses the same state for secret keying information, encryption, and integrity protection, and the internal mixing permutation is not cryptographically strong, an attacker can exploit the ciphertext–plaintext feedback loop to reveal secret state information or even keying material. We conclude that the main advantages of exhaustive, automated cryptanalysis are that it acts as a very necessary sanity check for implementations and gives the cryptanalyst insights that can be used to focus more specific attack methods on given candidates.
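A minimal example of the kind of automated statistical sanity check such a framework can run is a chi-squared test on ciphertext byte frequencies; BRUTUS's actual test battery is far broader, and the "weak cipher" here is a deliberately trivial stand-in (XOR with a fixed byte):

```python
import random

def chi_squared_bytes(data):
    """Chi-squared statistic of byte frequencies vs. a uniform distribution.

    For uniformly random bytes the statistic should be near its 255 degrees
    of freedom; a strongly skewed distribution signals structure leaking
    from plaintext into ciphertext.
    """
    expected = len(data) / 256
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(0)
good = bytes(rng.randrange(256) for _ in range(4096))      # uniform "ciphertext"
weak = bytes(b ^ 0x5A for b in b"attack at dawn " * 273)   # ASCII XOR fixed key
```

Statistical flatness is only a necessary condition, not proof of security, which is why the report treats such testing as a sanity check rather than a verdict.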
Abstract:
The current model of mid-latitude late Quaternary terrace sequences is that they are uplift-driven but climatically controlled terrace staircases, relating both to regional-scale crustal and tectonic factors and to palaeohydrological variations forced by quasi-cyclic climatic conditions in the 100 kyr world (post-Mid-Pleistocene Transition). This model appears to hold for the majority of the river valleys draining into the English Channel, which exhibit 8–15 terrace levels over approximately 60–100 m of altitudinal elevation. However, one valley, the Axe, has only one major morphological terrace and has long been regarded as anomalous. This paper uses both conventional and novel stratigraphical methods (digital granulometry and terrestrial laser scanning) to show that this terrace is a stacked sedimentary sequence of 20–30 m thickness with a quasi-continuous (i.e. with hiatuses), pulsed record of fluvial and periglacial sedimentation over at least the last 300–400 kyr, as determined principally by OSL dating of the upper two thirds of the sequence. Since uplift has been regional, there is no evidence of anomalous neotectonics, and the climatic history must be comparable to that of the adjacent catchments (both of which have staircase sequences), a catchment-specific mechanism is required. The Axe is the only valley in North West Europe incised entirely into the near-horizontally bedded chert (crypto-crystalline quartz) and sand-rich Lower Cretaceous rocks, creating a buried valley. Mapping of the valley slopes has identified many large landslide scars associated with past and present springs. It is proposed that these are thaw-slump scars representing large hill-slope failures caused by Vauclusian water pressures and hydraulic fracturing of the chert during rapid permafrost melting.
A simple 1D model of this thermokarstic process is used to explore this mechanism, and it is proposed that the resultant anomalously high input of chert and sand into the valley during terminations caused pulsed aggradation until the last termination. It is also proposed that interglacial and interstadial incision may have been prevented by the over-sized and interlocking nature of the sub-angular chert clasts until the Lateglacial when confinement of the river overcame this immobility threshold. One result of this hydrogeologically mediated valley evolution was to provide a sequence of proximal Palaeolithic archaeology over two MIS cycles. This study demonstrates that uplift tectonics and climate alone do not fully determine Quaternary valley evolution and that lithological and hydrogeological conditions are a fundamental cause of variation in terrestrial Quaternary records and landform evolution.
Abstract:
One of the main challenges faced by the nuclear industry is the long-term confinement of nuclear waste. Because it is inexpensive and easy to manufacture, cement is the material of choice to store large volumes of radioactive materials, in particular the low-level medium-lived fission products. It is therefore of utmost importance to assess the chemical and structural stability of cement containing radioactive species. Here, we use ab initio calculations based on density functional theory (DFT) to study the effects of 90Sr insertion and decay in C-S-H (calcium-silicate-hydrate) in order to test the ability of cement to trap and hold this radioactive fission product and to investigate the consequences of its β-decay on the cement paste structure. We show that 90Sr is stable when it substitutes the Ca2+ ions in C-S-H, and so is its daughter nucleus 90Y after β-decay. Interestingly, 90Zr, daughter of 90Y and final product in the decay sequence, is found to be unstable compared to the bulk phase of the element at 0 K but stable when compared to the solvated ion in water. Therefore, cement appears to be a suitable waste form for 90Sr storage.
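The decay chain under study, 90Sr → 90Y → 90Zr (stable), can be made concrete with the Bateman solution, using standard half-life values (90Sr ≈ 28.8 years, 90Y ≈ 64.1 hours); this is a textbook calculation added for context, not part of the DFT study:

```python
import math

H_PER_Y = 24 * 365.25                   # hours per year
lam_sr = math.log(2) / (28.8 * H_PER_Y) # 90Sr decay constant, 1/h
lam_y = math.log(2) / 64.1              # 90Y decay constant, 1/h

def activity_ratio(t_hours):
    """A(90Y)/A(90Sr) for an initially pure 90Sr source (Bateman solution)."""
    return (lam_y / (lam_y - lam_sr)) * (1 - math.exp(-(lam_y - lam_sr) * t_hours))

# Within a few weeks the ratio approaches 1 (secular equilibrium): the
# short-lived 90Y decays as fast as it is produced, so the cement matrix
# must retain not only 90Sr but also the in-grown 90Y and final 90Zr.
```

This is why the stability of all three species in the C-S-H structure, and not just that of the stored 90Sr, matters for the waste form's integrity.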
Abstract:
This paper explores the theme of exhibiting architectural research through a particular example, the development of the Irish pavilion for the 14th architectural biennale, Venice 2014. Responding to Rem Koolhaas’s call to investigate the international absorption of modernity, the Irish pavilion became a research project that engaged with the development of the architectures of infrastructure in Ireland in the twentieth and twenty-first centuries. Central to this proposition was that infrastructure is simultaneously a technological and cultural construct, one that for Ireland occupied a critical position in the building of a new, independent post-colonial nation state, after 1921.
Presupposing infrastructure as consisting of both visible and invisible networks, the idea of a matrix became a central conceptual and visual tool in the curatorial and design process for the exhibition and pavilion. To begin with, this was a two-dimensional grid used to identify and order what became described as a series of ten ‘infrastructural episodes’. These were determined chronologically across the decades between 1914 and 2014, and their spatial manifestations were articulated in terms of scale: micro, meso and macro. At this point ten academics were approached as researchers. Their purpose was twofold: to establish the broader narratives around which the infrastructures developed, and to scrutinise relevant archives for compelling visual material. Defining the meso scale as that of the building, the media unearthed was further filtered and edited according to a range of categories – filmic/image, territory, building detail, and model – which sought to communicate the relationship between the pieces of architecture and the larger systems to which they connect. New drawings realised by the design team further iterated these relationships, filling gaps in the narrative by providing composite, strategic or detailed drawings.
Conceived as an open-ended and extendable matrix, the pavilion was influenced by a series of academic writings, curatorial practices, artworks and other installations, including Frederick Kiesler’s City in Space (1925), Edoardo Persico and Marcello Nizzoli’s Medaglia d’Oro room (1934), Sol LeWitt’s Incomplete Open Cubes (1974) and Rosalind Krauss’s seminal text ‘Grids’ (1979). A modular frame whose structural bays would each hold and present an ‘episode’, the pavilion became both a visual analogue of the unseen networks embodying infrastructural systems and a reflection on the predominance of framed structures within the buildings exhibited. Sharing the aspiration of adaptability of many of these schemes, its white-painted timber components are connected by easily dismantled steel fixings. These, and its modularity, allow the structure to be taken down and re-erected subsequently in different iterations. The pavilion itself is therefore imagined as essentially provisional and – as with infrastructure – as having no fixed form. Presenting archives and other material over time, the transparent nature of the space allowed these to overlap visually, conveying the nested nature of infrastructural production. Pursuing a means to evoke the qualities of infrastructural space while conveying a historical narrative, the exhibition’s termination in the present is designed to provoke in the visitor a perceptual extension of the matrix to engage with the future.