871 results for Information Risk
Abstract:
Objective To undertake a process evaluation of pharmacists' recommendations arising in the context of a complex IT-enabled pharmacist-delivered randomised controlled trial (PINCER trial) to reduce the risk of hazardous medicines management in general practices. Methods PINCER pharmacists manually recorded patients' demographics, details of interventions recommended, actions undertaken by practice staff and time taken to manage individual cases of hazardous medicines management. Data were coded and double entered into SPSS v15, and then summarised using percentages for categorical data (with 95% CI) and, as appropriate, means (SD) or medians (IQR) for continuous data. Key findings Pharmacists spent a median of 20 minutes (IQR 10, 30) reviewing medical records, recommending interventions and completing actions in each case of hazardous medicines management. Pharmacists judged 72% (95% CI 70, 74) (1463/2026) of cases of hazardous medicines management to be clinically relevant. Pharmacists recommended 2105 interventions in 74% (95% CI 73, 76) (1516/2038) of cases, and 1685 actions were taken in 61% (95% CI 59, 63) (1246/2038) of cases; 66% (95% CI 64, 68) (1383/2105) of interventions recommended by pharmacists were completed, and 5% (95% CI 4, 6) (104/2105) of recommendations were accepted by general practitioners (GPs) but not completed by the end of the pharmacists' placement; the remaining recommendations were rejected or considered not relevant by GPs. Conclusions The outcome measures were used to target pharmacist activity in general practice towards patients at risk from hazardous medicines management. Recommendations from trained PINCER pharmacists were found to be broadly acceptable to GPs and led to ameliorative action in the majority of cases. It seems likely that the approach used by the PINCER pharmacists could be employed by other practice pharmacists following appropriate training.
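For readers checking the arithmetic: the proportion intervals quoted above can be reproduced with the standard normal-approximation (Wald) formula, although the trial's exact CI method is not stated in the abstract. A minimal sketch in Python:

```python
# Normal-approximation (Wald) 95% CI for a proportion, e.g. the 1463/2026
# cases judged clinically relevant; the trial's exact CI method isn't stated,
# so this standard formula is an assumption.
import math

def proportion_ci(successes, n, z=1.96):
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, p - half_width, p + half_width

p, lo, hi = proportion_ci(1463, 2026)
print(f"{p:.0%} (95% CI {lo:.0%} to {hi:.0%})")  # ~72% (95% CI 70% to 74%)
```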
Abstract:
Ensemble-based data assimilation is rapidly proving itself as a computationally efficient and skilful assimilation method for numerical weather prediction, which can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. This paper presents a data selection strategy, usable with both stochastic and deterministic ensemble filters, that aims to assimilate only the observational components that matter most. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias into the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data-selection procedure allows the use of larger localization domains, which may lead to a more balanced analysis. Results from the use of this data selection technique with two-dimensional linear and nonlinear advection models, using both in situ and remote sounding observations, are discussed.
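The abstract does not spell out the selection criterion; one simple way to realise such a strategy in a serial stochastic ensemble Kalman filter is to skip observation components whose projected ensemble variance is small relative to the observation error, since they could barely constrain the analysis anyway. A hedged sketch (criterion, thresholds and dimensions are illustrative, not the paper's exact procedure):

```python
# Hedged sketch of observation selection in a serial stochastic EnKF: skip
# observation components whose projection onto the ensemble subspace carries
# little signal relative to observation error. The selection criterion here
# (ensemble variance vs. obs-error variance) is one simple realisation of the
# idea, not necessarily the paper's.
import numpy as np

def serial_enkf_with_selection(X, y, H, r_var, min_ratio=0.1, rng=np.random.default_rng(0)):
    """X: (n_state, n_ens) ensemble; y: obs vector; H: obs operator; r_var: obs error variances."""
    for j in range(len(y)):
        h = H[j]                       # row of the observation operator
        hx = h @ X                     # ensemble mapped to obs space, (n_ens,)
        s2 = hx.var(ddof=1)            # projected ensemble variance (HPH^T)
        if s2 / r_var[j] < min_ratio:
            continue                   # obs barely constrains the ensemble: skip it
        # Scalar Kalman gain K = cov(x, hx) / (HPH^T + R), computed from anomalies.
        gain = (X - X.mean(1, keepdims=True)) @ (hx - hx.mean()) / ((len(hx) - 1) * (s2 + r_var[j]))
        perturbed = y[j] + rng.normal(0.0, np.sqrt(r_var[j]), hx.shape)  # stochastic filter
        X = X + np.outer(gain, perturbed - hx)
    return X

# Toy use: 8-member ensemble, 3-variable state, direct observations of variables 0 and 1.
X = np.random.default_rng(1).normal(size=(3, 8))
Xa = serial_enkf_with_selection(X, y=np.array([0.5, -0.2]), H=np.eye(3)[:2], r_var=np.array([0.1, 0.1]))
```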
Abstract:
Climate model ensembles are widely heralded for their potential to quantify uncertainties and generate probabilistic climate projections. However, such technical improvements to modeling science will do little to deliver on their ultimate promise of improving climate policymaking and adaptation unless the insights they generate can be effectively communicated to decision makers. While some of these communicative challenges are unique to climate ensembles, others are common to hydrometeorological modeling more generally, and to the tensions arising between the imperatives for saliency, robustness, and richness in risk communication. The paper reviews emerging approaches to visualizing and communicating climate ensembles and compares them to the more established and thoroughly evaluated communication methods used in the numerical weather prediction domains of day-to-day weather forecasting (in particular probabilities of precipitation), hurricane and flood warning, and seasonal forecasting. This comparative analysis informs recommendations on best practice for climate modelers, as well as prompting some further thoughts on key research challenges to improve the future communication of climate change uncertainties.
Abstract:
The financial crisis of 2008 led to new international regulatory controls for the governance, risk and compliance of financial services firms. Information systems play a critical role here as political, functional and social pressures may lead to the deinstitutionalization of existing structures, processes and practices. This research examines how an investment management system is introduced by a leading IT vendor across eight client sites in the post-crisis era. Using institutional theory, it examines changes in working practices occurring at the environmental and organizational levels and the ways in which technological interventions are used to apply disciplinary effects in order to prevent inappropriate behaviors. The results extend the constructs of deinstitutionalization and identify empirical predictors for the deinstitutionalization of compliance and trading practices within financial organizations.
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported and being replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational, and ultimately reputational, risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost structured method for identifying obsolete software and the risk of its obsolescence, in which the structure of a business and its supporting IT resources can be captured, modelled and analysed, and the risk to the business of technology obsolescence identified, enabling remedial action using qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in three consulting studies carried out by Capgemini involving four UK police forces. The generic technique could, however, be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
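The abstract does not reproduce the heatmap algorithm itself; a minimal sketch of the general idea, scoring hypothetical application elements by support horizon and business criticality and banding them for a heatmap (all names, dates and thresholds are illustrative):

```python
# Minimal sketch of an obsolescence heatmap over an enterprise-architecture
# model; element names, dates and thresholds are hypothetical, and the paper's
# actual meta-model and algorithm are not reproduced here.
from datetime import date

# Each application element carries vendor-support metadata.
applications = {
    "crime-recording": {"end_of_support": date(2024, 6, 30), "business_criticality": 3},
    "custody-suite":   {"end_of_support": date(2027, 1, 1),  "business_criticality": 2},
    "duty-rostering":  {"end_of_support": None,              "business_criticality": 1},  # unknown = worst case
}

def obsolescence_score(end_of_support, today=date(2026, 1, 1)):
    """Score 0 (supported for years) .. 3 (already unsupported or unknown)."""
    if end_of_support is None or end_of_support <= today:
        return 3
    years_left = (end_of_support - today).days / 365.25
    return 2 if years_left < 1 else (1 if years_left < 3 else 0)

def heat(app):
    # Risk = obsolescence likelihood x business impact, banded for the heatmap.
    score = obsolescence_score(app["end_of_support"]) * app["business_criticality"]
    return "red" if score >= 6 else ("amber" if score >= 2 else "green")

for name, app in applications.items():
    print(f"{name}: {heat(app)}")
```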
Abstract:
This report provides case studies of Early Warning Systems (EWSs) and risk assessments encompassing three main hazard types: drought, flood and cyclone. The case studies are taken from ten countries across three continents (focusing on Africa, South Asia and the Caribbean). They have been developed to assist the UK Department for International Development (DFID) to prioritise areas for EWS-related research under its ‘Science for Humanitarian Emergencies and Resilience’ (SHEAR) programme. The aim of these case studies is to ensure that DFID SHEAR research is informed by the views of Non-Governmental Organisations (NGOs) and communities engaged with EWSs and risk assessments (including community-based EWSs). The case studies highlight a number of challenges facing EWSs, relating to financing, integration, responsibilities, community interpretation, politics, dissemination, accuracy, capacity and focus. They also summarise a number of priority areas for EWS-related research:
• Priority 1: Contextualising and localising early warning information
• Priority 2: Climate-proofing current EWSs
• Priority 3: How best to sustain effective EWSs between hazard events?
• Priority 4: Optimising the dissemination of risk and warning information
• Priority 5: Governance and financing of EWSs
• Priority 6: How to support EWSs under challenging circumstances
• Priority 7: Improving EWSs through monitoring and evaluating the impact and effectiveness of those systems
Abstract:
Objectives. While older adults often display memory deficits, with practice they can sometimes selectively remember valuable information at the expense of less valuable information. We examined age-related differences and similarities in memory for health-related information under conditions where some information was critical to remember. Method. In Experiment 1, participants studied three lists of allergens, ranging in severity from 0 (not a health risk) to 10 (potentially fatal), with the instruction that it was particularly important to remember items to which a fictional relative was most severely allergic. After each list, participants received feedback regarding their recall of the high-value allergens. Experiment 2 examined memory for health benefits, presenting foods that were potentially beneficial to the relative’s immune system. Results. While younger adults exhibited better overall memory for the allergens, both age groups in Experiment 1 developed improved selectivity across the lists, with no evident age differences in severe-allergen recall by List 2. Selectivity also developed in Experiment 2, although age differences for items of high health benefit were present. Discussion. The results have implications for models of selective memory in older age, and for how aging influences the ability to strategically remember important information within health-related contexts.
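The abstract does not give the scoring rule behind "selectivity"; a metric commonly used in the value-directed-remembering literature is the selectivity index, which compares the value of what was recalled against chance and ideal baselines. A sketch under that assumption:

```python
# A minimal sketch of a selectivity index, a metric common in the
# value-directed-remembering literature (not necessarily the exact scoring
# used in these experiments). Item values are the 0-10 allergen severities.
def selectivity_index(list_values, recalled_values):
    """(actual - chance) / (ideal - chance); 1 = perfectly value-selective recall."""
    n = len(recalled_values)
    actual = sum(recalled_values)
    ideal = sum(sorted(list_values, reverse=True)[:n])  # best possible n items
    chance = n * sum(list_values) / len(list_values)    # value-blind recall baseline
    return (actual - chance) / (ideal - chance)

# A participant recalls 4 of 10 allergens, mostly the severe ones.
severities = [0, 1, 2, 3, 4, 5, 6, 7, 8, 10]
print(selectivity_index(severities, [10, 8, 7, 2]))  # ~0.68, well above the 0 of value-blind recall
```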
Abstract:
Dystrophin, the product of the Duchenne muscular dystrophy (DMD) gene, was studied in muscle from 16 human fetuses at risk for the disease. Eleven high-risk (greater than 95% probability) and five low-risk (less than 25% probability) fetuses were studied with antibodies raised to different regions of the protein. All low-risk fetuses showed a pattern similar to that of normal fetuses of a comparable age: on Western blot analysis, a protein was detected of similar size and abundance to that of normal fetuses (i.e. of smaller molecular weight than that of adult muscle); immunocytochemistry showed uniform sarcolemmal staining in fetuses older than 18 weeks' gestation, and differential staining of myotubes at different stages of development (distinguished by size) in younger fetuses (less than 15 weeks' gestation). In contrast, Western blot analysis of high-risk fetuses detected low levels of dystrophin in 4 cases; 7 fetuses had no detectable protein. Immunocytochemistry with some dystrophin antibodies showed weak staining of the sarcolemma and around central nuclei in younger fetuses; in older fetuses there was little sarcolemmal staining with any antibody, other than occasional positive fibres. These results indicate that careful study of dystrophin in fetuses at risk for DMD can be used to establish the clinical phenotype and provide additional information for future family counselling.
Abstract:
1. Comparative analyses are used to address the key question of what makes a species more prone to extinction by exploring the links between vulnerability and intrinsic species’ traits and/or extrinsic factors. This approach requires comprehensive species data, but information is rarely available for all species of interest. As a result, comparative analyses often rely on subsets of relatively few species that are assumed to be representative samples of the overall studied group.
2. Our study challenges this assumption and quantifies the taxonomic, spatial and data-type biases associated with the quantity of data available for 5415 mammalian species using the freely available life-history database PanTHERIA.
3. Moreover, we explore how existing biases influence the results of comparative analyses of extinction risk by using subsets of data that attempt to correct for detected biases. In particular, we focus on links between four species’ traits commonly linked to vulnerability (distribution range area, adult body mass, population density and gestation length) and conduct univariate and multivariate analyses to understand how biases affect model predictions.
4. Our results show important biases in data availability, with c. 22% of mammals completely lacking data. Missing data, which appear not to be missing at random, occur frequently in all traits (14–99% of cases missing). Data availability is explained by intrinsic traits, with larger mammals occupying bigger range areas being the best studied. Importantly, we find that existing biases affect the results of comparative analyses by overestimating the risk of extinction and changing which traits are identified as important predictors.
5. Our results raise concerns over our ability to draw general conclusions regarding what makes a species more prone to extinction. Missing data represent a prevalent problem in comparative analyses and, unfortunately, because data are not missing at random, conventional approaches to filling data gaps are either not valid or present important challenges. These results show the importance of making appropriate inferences from comparative analyses by focusing on the subset of species for which data are available. Ultimately, addressing the data bias problem requires greater investment in data collection and dissemination, as well as the development of methodological approaches to effectively correct existing biases.
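A sketch of the kind of missingness audit described in points 2 and 4, on a toy PanTHERIA-like table (column names and values are illustrative, not the database's own):

```python
# Sketch of the missing-data audit described above on a toy PanTHERIA-like
# table; column names and values are illustrative, not the database's own.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "adult_body_mass_g":  [4500.0, 12.0, 350000.0, 8.0, 60.0],
    "range_area_km2":     [1.2e6, np.nan, 5.0e6, np.nan, np.nan],
    "population_density": [np.nan, 30.0, np.nan, np.nan, 110.0],
    "gestation_length_d": [210.0, 21.0, np.nan, np.nan, 24.0],
})

# 1. Share of species missing each trait (the study reports 14-99% per trait).
print(df.isna().mean().round(2))

# 2. Check whether data are missing at random: compare body mass of species
#    with and without range-area data; a gap suggests size-biased study effort.
has_range = df["range_area_km2"].notna()
print(df.loc[has_range, "adult_body_mass_g"].median(),
      df.loc[~has_range, "adult_body_mass_g"].median())
```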
Abstract:
Traditional approaches to the way people react to food risks often focus on ways in which the media distort information about risk, or on the deficiencies in people’s interpretation of this information. In this chapter Jones offers an alternative model which sees decisions regarding food risk as taking place at a complex nexus where different people, texts, objects and practices, each with their own histories, come together. Based on a case study of a food scandal involving a particular brand of Chinese candy, Jones argues that understanding why people respond the way they do to food risk requires tracing the itineraries along which different people, texts, objects and practices have traveled to converge at particular moments, and understanding the kinds of concrete social actions that these convergences make possible.
Abstract:
Self-report underpins our understanding of falls among people with Parkinson’s (PwP), as falls largely happen unwitnessed at home. In this qualitative study, we used an ethnographic approach to investigate which in-home sensors, in which locations, could gather useful data about fall risk. Over six weeks, we observed five independently mobile PwP at high risk of falling, at home. We made field notes about falls (prior events and concerns) and recorded movement with video, Kinect, and wearable sensors. The three women and two men (aged 71 to 79 years), who had moderate or severe Parkinson’s, were dependent on others and highly sedentary. We most commonly noted balance protection, loss, and restoration during chair transfers, walks across open spaces and through gaps, turns, steps up and down, and tasks in standing (all evident, for example, when walking between chair and stairs). Our unobtrusive sensors were acceptable to participants: they could detect instability during everyday activity at home and potentially guide intervention. Monitoring the route between chair and stairs is likely to give information without invading the privacy of people at high risk of falling, with very limited mobility, who spend most of the day in their sitting rooms.
Abstract:
Remotely sensed rainfall is increasingly being used to manage climate-related risk in gauge sparse regions. Applications based on such data must make maximal use of the skill of the methodology in order to avoid doing harm by providing misleading information. This is especially challenging in regions, such as Africa, which lack gauge data for validation. In this study, we show how calibrated ensembles of equally likely rainfall can be used to infer uncertainty in remotely sensed rainfall estimates, and subsequently in assessment of drought. We illustrate the methodology through a case study of weather index insurance (WII) in Zambia. Unlike traditional insurance, which compensates proven agricultural losses, WII pays out in the event that a weather index is breached. As remotely sensed rainfall is used to extend WII schemes to large numbers of farmers, it is crucial to ensure that the indices being insured are skillful representations of local environmental conditions. In our study we drive a land surface model with rainfall ensembles, in order to demonstrate how aggregation of rainfall estimates in space and time results in a clearer link with soil moisture, and hence a truer representation of agricultural drought. Although our study focuses on agricultural insurance, the methodological principles for application design are widely applicable in Africa and elsewhere.
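The aggregation step is easy to illustrate: given an ensemble of equally likely rainfall fields, sum over the season, average over the insured area, and read off the fraction of members breaching the index. A sketch with purely illustrative numbers (the trigger and ensemble are not from the study):

```python
# Minimal sketch of the aggregation idea: given an ensemble of equally likely
# rainfall fields (members x days x cells), aggregate in space and time before
# evaluating a hypothetical weather-index-insurance trigger. The array sizes
# and the trigger value are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
ensemble = rng.gamma(shape=0.4, scale=6.0, size=(50, 90, 25))  # mm/day: 50 members, 90-day season, 25 cells

# Aggregate: seasonal total per cell, averaged over the insured area, per member.
seasonal_total = ensemble.sum(axis=1).mean(axis=1)             # one value per member

trigger_mm = 210.0                                             # hypothetical payout trigger (mm/season)
p_payout = (seasonal_total < trigger_mm).mean()
print(f"Probability the drought index is breached: {p_payout:.2f}")
```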
Abstract:
Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called “How much are you prepared to pay for a forecast?”. The game was played at several workshops in 2015, attended by operational forecasters and academics working in the field of hydrometeorology. The aim of the game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on participants’ willingness to pay for a forecast, the results show that the value (or usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
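The abstract does not state the game's payoff rules, but the classic cost-loss model that underlies such flood-protection games is simple: protect whenever the forecast probability exceeds the cost/loss ratio C/L. A sketch with illustrative numbers:

```python
# Classic cost-loss decision model often used to frame such games (the game's
# actual payoff rules are not given in the abstract; numbers are illustrative).
def expected_expense(p_flood, cost_protect, loss_if_flood, protect):
    """Expected outlay for one decision under forecast probability p_flood."""
    return cost_protect if protect else p_flood * loss_if_flood

def decide(p_flood, cost_protect, loss_if_flood):
    # Protect exactly when p exceeds the cost/loss ratio C/L.
    return p_flood > cost_protect / loss_if_flood

C, L = 20.0, 100.0                 # protection cost and flood loss (C/L = 0.2)
for p in (0.1, 0.2, 0.5, 0.9):
    act = decide(p, C, L)
    print(p, act, expected_expense(p, C, L, act))
```

In this framing, a forecast's value, and hence a rational willingness to pay for it, is the expected expense it saves relative to deciding on climatology alone.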
Abstract:
This paper presents a GIS-based multicriteria flood risk assessment and mapping approach applied to coastal drainage basins where hydrological data are not available. It covers the risk of several types of process: coastal inundation (storm surge), river, estuarine and flash flood, in either urban or natural areas, and fords. Based on the causes of these processes, several environmental indicators were selected to build up the risk assessment. Geoindicators include geological-geomorphologic properties of Quaternary sedimentary units, water table, drainage basin morphometry, coastal dynamics, beach morphodynamics and microclimatic characteristics. Bioindicators involve coastal plain and low-slope native vegetation categories and two alteration states. Anthropogenic indicators encompass land use categories and their properties, such as type, occupation density, urban structure type and degree of occupation consolidation. The selected indicators were stored within an expert Geoenvironmental Information System developed for the State of Sao Paulo Coastal Zone (SIIGAL), whose attributes were classified mathematically through deterministic approaches in order to estimate natural susceptibilities (Sn), human-induced susceptibilities (Sa), the return period of rain events (Ri), potential damages (Dp) and the risk classification (R), according to the equation R = (Sn · Sa · Ri) · Dp. Thematic maps were automatically processed within the SIIGAL, in which automata cells (‘geoenvironmental management units’) aggregating geological-geomorphologic and land use/native vegetation categories were the units of classification. The method has been applied to 32 small drainage basins on the Northern Littoral of the State of Sao Paulo (Brazil), proving very useful for coastal zone public policies, civil defense programs and flood management.
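The quoted rule R = (Sn · Sa · Ri) · Dp is straightforward to apply per management unit; a toy sketch on illustrative raster grids (the SIIGAL class scales are not reproduced here):

```python
# Sketch of the multiplicative risk rule quoted above, R = (Sn * Sa * Ri) * Dp,
# on toy raster grids; the class ranges and banding are illustrative, not the
# SIIGAL's own scales.
import numpy as np

shape = (4, 4)                                   # tiny grid of management units
rng = np.random.default_rng(1)
Sn = rng.integers(1, 5, shape)                   # natural susceptibility class
Sa = rng.integers(1, 5, shape)                   # human-induced susceptibility class
Ri = rng.integers(1, 4, shape)                   # rain-event return-period class
Dp = rng.integers(1, 5, shape)                   # potential damage class

R = (Sn * Sa * Ri) * Dp                          # risk score per cell

# Band the scores into a thematic map, e.g. low / medium / high thirds.
bands = np.digitize(R, np.quantile(R, [1/3, 2/3]))
print(bands)  # 0 = low, 1 = medium, 2 = high risk
```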