10 results for Informed decisions

in Aston University Research Archive


Relevance:

60.00%

Publisher:

Abstract:

Very large spatially-referenced datasets, for example those derived from satellite-based sensors which sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over short time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful in less time-critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms are responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly in the case where maximum likelihood based inference is required. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively. Most modern commodity hardware has at least two processor cores, if not more, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics. By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic data sets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake data set.
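
To make the scaling argument above concrete, the sketch below contrasts the exact Gaussian-process log-likelihood, whose Cholesky factorisation costs O(n^3) time and O(n^2) memory, with a block composite likelihood in the spirit of the Vecchia/Tresp approximations, in which independent blocks can be evaluated on separate processor cores. This is an illustrative sketch only, not the paper's implementation; the squared-exponential covariance, the block partitioning and the synthetic data are assumptions.

```python
# Minimal sketch: exact vs. block-approximate Gaussian-process log-likelihood.
# The covariance function, block size and synthetic data are illustrative
# assumptions, not the settings used in the paper.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def sq_exp_cov(x, variance=1.0, lengthscale=0.3, nugget=1e-6):
    """Squared-exponential covariance matrix for 1-D locations x."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2) + nugget * np.eye(len(x))

def gp_loglik(x, y):
    """Exact zero-mean GP log-likelihood: O(n^2) memory, O(n^3) time."""
    K = sq_exp_cov(x)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(y) * np.log(2 * np.pi))

def block_loglik(args):
    """Log-likelihood of a single block, evaluated independently."""
    xb, yb = args
    return gp_loglik(xb, yb)

def approximate_loglik(x, y, n_blocks=4, workers=2):
    """Block composite likelihood: blocks are treated as independent, so each
    O((n/B)^3) factorisation can run on a separate core."""
    blocks = list(zip(np.array_split(x, n_blocks), np.array_split(y, n_blocks)))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(block_loglik, blocks))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 1, 800))
    y = np.sin(8 * x) + 0.1 * rng.standard_normal(len(x))
    print("exact      :", gp_loglik(x, y))
    print("block appr.:", approximate_loglik(x, y))
```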

Relevance:

60.00%

Publisher:

Abstract:

Increasingly in the UK, companies that have traditionally considered themselves manufacturers are being advised to see themselves instead as service providers and to reconsider whether to have any production capability. A key challenge is to translate this strategy into a selection of product- and service-centred activities within the company's supply chain networks. Strategic positioning is concerned with the choice of business activities a company carries out itself, compared to those provided by suppliers, partners, distributors and even customers. In practice, strategic positioning is directly impacted by such decisions as outsourcing, off-shoring, partnering, technology innovation, acquisition and exploitation. If companies can better understand their strategic positioning, they can make more informed decisions about the adoption of alternative manufacturing and supply chain activities. Similarly, they are more likely to reject those that, like off-shoring, are currently in vogue but are highly likely to erode competitive edge and business success. Our research has developed a new concept we call 'competitive space' as a means of appreciating the strategic positioning of companies, along with a structured decision process for managing competitive space. Our ideas about competitive space, along with the decision process itself, have been developed and tested on a range of manufacturers. As more and more manufacturers are encouraged to move towards system integration and a service-based business model, the challenge is to identify the appropriate strategic position for their organisations, or in other words, to identify their optimum competitive space for manufacture.

Relevance:

60.00%

Publisher:

Abstract:

Enabling a Simulation Capability in the Organisation addresses the application of simulation modelling techniques to enable better-informed decisions in business and industrial organisations. The book's unique approach treats simulation not just as a technical tool but as a support for organisational decision making, showing the results of a survey of current and potential users of simulation to suggest reasons why the technique is not used as much as it should be and what the barriers to its further use are. Incorporating the author's evaluation of six detailed case studies of the application of simulation in industry, the book teaches readers: • the role of simulation in decision making; • how to introduce simulation as a tool for decision making; and • how to undertake simulation studies that lead to change in the organisation. Enabling a Simulation Capability in the Organisation provides an introduction to the state of the art in simulation modelling for researchers in business studies and engineering, as well as a useful guide for practitioners and managers in business and industry.
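
The abstract describes simulation as a support for decision making; as a purely illustrative sketch (not an example from the book), the following single-server queue simulation, based on the Lindley recursion, shows the kind of small model that can inform a capacity decision. The arrival and service rates and the decision being compared are invented assumptions.

```python
# Illustrative sketch only: a single-server queue simulated with the Lindley
# recursion, used to compare two hypothetical service-rate options.
import random

def average_wait(arrival_rate, service_rate, n_customers=100_000, seed=1):
    """Mean time customers spend waiting before their service starts."""
    rng = random.Random(seed)
    wait, total_wait = 0.0, 0.0
    prev_service = rng.expovariate(service_rate)  # service time of first customer
    for _ in range(1, n_customers):
        interarrival = rng.expovariate(arrival_rate)
        # Lindley recursion: this customer's wait depends on the previous
        # customer's wait and service time, less the gap between arrivals.
        wait = max(0.0, wait + prev_service - interarrival)
        total_wait += wait
        prev_service = rng.expovariate(service_rate)
    return total_wait / n_customers

if __name__ == "__main__":
    # Hypothetical decision: is it worth upgrading capacity from 11 to 14
    # jobs per hour when 10 jobs arrive per hour on average?
    for rate in (11.0, 14.0):
        print(f"service rate {rate:>4}: mean wait {average_wait(10.0, rate):.2f} h")
```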

Relevance:

60.00%

Publisher:

Abstract:

There is an increasing number of reports of propylene glycol (PG) toxicity in the literature, despite its inclusion on the Generally Recognized as Safe (GRAS) list.1 PG is an excipient used in many medications as a solvent for water-insoluble drugs. Polypharmacy may increase PG exposure in vulnerable PICU patients, who may accumulate PG due to compromised liver and renal function. The study aim was to quantify PG intake in PICU patients and to assess clinicians' attitudes towards PG. Method: A snapshot of the oral and intravenous medication intake of 50 PICU patients was collected. Other data collected included age, weight, diagnosis, lactate levels and renal function. Manufacturers were contacted for PG content, which was then converted to mg/kg. Excipients in formulations that compete with the PG metabolism pathway were recorded. Intensivists' opinions on PG intake were sought via e-survey. Results: The 50 patients were prescribed 62 drugs in 83 formulations; 43/83 (52%) were parenteral formulations. Median patient weight was 5.5 kg (range 2–50 kg) and ages ranged from 1 day to 13 years. Eleven of the patients were classed as renally impaired (defined as 1.5 times the baseline creatinine). Sixteen formulations contained PG; 2/16 were parenteral and 6/16 were unlicensed preparations. Thirty-eight patients received at least one prescription containing PG, and 29/38 of these patients were receiving formulations containing excipients that may have competed with the metabolic pathways of PG. PG intake ranged from 0.002 mg/kg/day to 250 mg/kg/day. Total intake was inconclusive for 2 patients due to a lack of information from the manufacturer; these formulations were licensed but used for off-label indications. Five commonly used formulations contributed to higher intakes of PG, namely co-trimoxazole, dexamethasone, potassium chloride, dipyridamole and phenobarbitone. Lactate levels were difficult to interpret due to the underlying conditions of the patients. One of the sixteen intensivists was aware of the PG content of drugs; 16/16 would actively change therapy if intake was above European Medicines Agency recommendations. Conclusions: Certain formulations used on PICU can considerably increase patients' PG exposure. Given the lack of awareness of PG content, such formulations should be highlighted to the clinician to assist with making informed decisions regarding the risks versus benefits of continuing that drug, route of administration or formulation.

Relevance:

60.00%

Publisher:

Abstract:

There is an increasing number of reports of propylene glycol (PG) toxicity in the literature, despite its inclusion on the Generally Recognized as Safe (GRAS) list.1 PG is an excipient used in many medications as a solvent for water-insoluble drugs. Polypharmacy may increase PG exposure in vulnerable PICU patients, who may accumulate PG due to compromised liver and renal function. The study aim was to quantify PG intake in PICU patients and to assess clinicians' attitudes towards PG. Method: A snapshot of the oral and intravenous medication intake of 50 PICU patients was collected. Other data collected included age, weight, diagnosis, lactate levels and renal function. Manufacturers were contacted for PG content, which was then converted to mg/kg. Excipients in formulations that compete with the PG metabolism pathway were recorded. Intensivists' opinions on PG intake were sought via e-survey. Results: The 50 patients were prescribed 62 drugs in 83 formulations; 43/83 (52%) were parenteral formulations. Median patient weight was 5.5 kg (range 2–50 kg) and ages ranged from 1 day to 13 years. Eleven of the patients were classed as renally impaired (defined as 1.5 times the baseline creatinine). Sixteen formulations contained PG; 2/16 were parenteral and 6/16 were unlicensed preparations. Thirty-eight patients received at least one prescription containing PG, and 29/38 of these patients were receiving formulations containing excipients that may have competed with the metabolic pathways of PG. PG intake ranged from 0.002 mg/kg/day to 250 mg/kg/day. Total intake was inconclusive for 2 patients due to a lack of information from the manufacturer; these formulations were licensed but used for off-label indications. Five commonly used formulations contributed to higher intakes of PG, namely co-trimoxazole, dexamethasone, potassium chloride, dipyridamole and phenobarbitone. Lactate levels were difficult to interpret due to the underlying conditions of the patients. One of the sixteen intensivists was aware of the PG content of drugs; 16/16 would actively change therapy if intake was above European Medicines Agency recommendations. Conclusions: Certain formulations used on PICU can considerably increase patients' PG exposure. Given the lack of awareness of PG content, such formulations should be highlighted to the clinician to assist with making informed decisions regarding the risks versus benefits of continuing that drug, route of administration or formulation.
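
The mg/kg conversion described in the Method is a straightforward unit calculation; the sketch below shows one way such an intake summary could be computed. It is illustrative only: the formulation names, daily volumes and PG concentrations are hypothetical values, not data from the study (only the 5.5 kg median weight is taken from the abstract).

```python
# Illustrative only: converting a medication schedule into a daily propylene
# glycol (PG) dose per kilogram of body weight. All values are hypothetical.

def pg_intake_mg_per_kg_per_day(ml_per_day, pg_mg_per_ml, weight_kg):
    """Total PG intake (mg/kg/day) for one formulation."""
    return ml_per_day * pg_mg_per_ml / weight_kg

if __name__ == "__main__":
    weight_kg = 5.5  # median weight reported in the abstract
    # Hypothetical schedule: (formulation, total mL given per day, PG mg per mL)
    schedule = [
        ("oral liquid A", 4.0, 103.0),
        ("oral liquid B", 2.5, 52.0),
        ("IV infusion C", 10.0, 0.4),
    ]
    total = 0.0
    for name, ml_per_day, pg_mg_per_ml in schedule:
        intake = pg_intake_mg_per_kg_per_day(ml_per_day, pg_mg_per_ml, weight_kg)
        total += intake
        print(f"{name:<14} {intake:7.2f} mg/kg/day")
    print(f"{'total':<14} {total:7.2f} mg/kg/day")
```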

Relevance:

60.00%

Publisher:

Abstract:

Flooding can have a devastating impact on businesses, especially on small- and medium-sized enterprises (SMEs), which may be unprepared and vulnerable to the range of both direct and indirect impacts. SMEs may tend to focus on the direct tangible impacts of flooding, limiting their ability to realise the true costs of flooding. Greater understanding of the impacts of flooding is likely to contribute towards increased uptake of flood protection measures by SMEs, particularly during post-flood property reinstatement. This study sought to investigate the full range of impacts experienced by SMEs located in Cockermouth following the floods of 2009. The findings of a questionnaire survey of SMEs revealed that even businesses not directly affected by the flooding experienced a range of impacts, and that short-term impacts were assigned higher significance. A strong correlation was observed between direct, physical flood impacts and post-flood costs of insurance. Significant increases in the costs of property insurance and excesses were noted, meaning that SMEs will be exposed to increased losses in the event of a future flood. The findings from the research will enable policy makers and professional bodies to make informed decisions to improve the status of advice given to SMEs. The study also adds weight to the case for SMEs to consider investing in property-level flood risk adaptation measures, especially during the post-flood reinstatement process. © 2012 Blackwell Publishing Ltd and The Chartered Institution of Water and Environmental Management (CIWEM).

Relevance:

60.00%

Publisher:

Abstract:

The world's population is ageing. Older people are healthier and more active than previous generations. Living in a hypermobile world, people want to stay connected to dispersed communities as they age. Staying connected to communities and social networks enables older people to contribute to and connect with society and is associated with positive mental and physical health, facilitating independence and physical activity while reducing social isolation. Changes in physiology and cognition associated with later life mean that longer journeys may have to be curtailed. A shift in focus is needed to fully explore older people, transport and health: the approach must be multidisciplinary and must embrace the social sciences and the arts and humanities. A full understanding of ageing, transport and health also requires embracing different types of mobility, moving from literal or corporeal through virtual and potential to imaginative mobility, taking into account aspirations and emotions. Mobility in later life is more than a means of getting to destinations and includes more affective or emotive associations. Cycling and walking are facilitated not just by improving safety but through social and cultural norms. Car driving can be continued safely in later life if people make appropriate and informed decisions about when and how to stop driving; stringent testing of driver ability and skill has as yet had little effect on safety. Bus use facilitates physical activity and keeps people connected, but there are concerns about the future viability of buses. The future of transport may be more community led and involve more sharing of transport modes.

Relevance:

60.00%

Publisher:

Abstract:

Organizations can use data envelopment analysis (DEA) as a valuable tool for making informed decisions on developing successful strategies, setting specific goals, and identifying underperforming activities in order to improve the output or outcome of performance measurement. The Handbook of Research on Strategic Performance Management and Measurement Using Data Envelopment Analysis highlights the advantages of using DEA as a tool to improve business performance and to identify sources of inefficiency in public and private organizations. The recently developed theories and applications of DEA presented here will be useful for policymakers, managers, and practitioners working on the sustainable development of our society, including the environment, agriculture, finance, and higher education sectors.
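
As a purely illustrative sketch of what a DEA efficiency score is (not an example from the handbook), the following solves the input-oriented CCR multiplier model for a small, made-up set of decision-making units using scipy's linear-programming solver; the inputs, outputs and unit labels are assumptions.

```python
# Illustrative sketch only: input-oriented CCR DEA efficiency scores for a
# small, made-up set of decision-making units (DMUs). Data are assumptions.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs; columns = inputs (staff, budget) and outputs (services delivered)
inputs = np.array([[20.0, 300.0],
                   [30.0, 200.0],
                   [40.0, 500.0],
                   [20.0, 250.0]])
outputs = np.array([[120.0],
                    [ 90.0],
                    [150.0],
                    [ 70.0]])

def ccr_efficiency(o):
    """Efficiency of DMU o from the CCR multiplier model:
       max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    n, m = inputs.shape
    s = outputs.shape[1]
    # decision variables z = [u (s output weights), v (m input weights)]
    c = np.concatenate([-outputs[o], np.zeros(m)])            # maximise u.y_o
    A_ub = np.hstack([outputs, -inputs])                      # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), inputs[o]])[None, :]  # v.x_o = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

if __name__ == "__main__":
    for o in range(len(inputs)):
        print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```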

Relevance:

60.00%

Publisher:

Abstract:

Engineering education in the United Kingdom is at the point of embarking upon an interesting journey into uncharted waters. At no point in the past have there been so many drivers for change and so many opportunities for the development of engineering pedagogy. This paper will look at how Engineering Education Research (EER) has developed within the UK and what differentiates it from the many small-scale practitioner interventions, perhaps without a clear research question or with little evaluation, which are presented at numerous staff development sessions, workshops and conferences. From this position some examples of current projects will be described, outcomes of funding opportunities will be summarised and the benefits of collaboration with other disciplines illustrated. In this study, I will account for how the design of the task structure according to variation theory, as well as the probe-ware technology, makes the laws of force and motion visible and learnable and, especially, in the lab studied makes Newton's third law visible and learnable. I will also, as a comparison, include data from a mechanics lab that uses the same probe-ware technology and deals with the same topics in mechanics, but uses a differently designed task structure. I will argue that the lower achievements on the FMCE test in this latter case can be attributed to these differences in the task structure of the lab instructions: according to my analysis, the necessary pattern of variation is not included in the design. I will also present a microanalysis of 15 hours of video recordings of engineering students' activities in a lab about impulse and collisions. The important object of learning in this lab is the development of an understanding of Newton's third law. The approach to analysing students' interaction using video data is inspired by ethnomethodology and conversation analysis; that is, I will focus on students' practical, contingent and embodied inquiry in the setting of the lab. I argue that my results corroborate variation theory and show that this theory can be used as a 'tool' for designing labs as well as for analysing labs and lab instructions. Thus my results have implications outside the domain of this study, in particular for understanding the critical features for student learning in labs. Engineering higher education is well used to change. As technology develops, the abilities employers expect of graduates expand, yet our understanding of how to make informed decisions about learning and teaching strategies does not, without a conscious effort to develop it. With the numerous demands of academic life, we often fail to acknowledge our incomplete understanding of how our students learn within our discipline. The journey facing engineering education in the UK is being driven by two classes of driver: firstly, those we have been working to understand better, such as retention and employability; and secondly, new challenges such as substantial changes to funding systems allied with an increase in student expectations. Only through continued research can priorities be identified and addressed, and a coherent and strong voice for informed change be heard within the wider engineering education community.
This new position makes it even more important that, through EER, we acquire the knowledge and understanding needed to make informed decisions regarding approaches to teaching, curriculum design and measures to promote effective student learning. This then raises the question: how does EER function within a diverse academic community? Within an existing community of academics interested in taking meaningful steps towards understanding the ongoing challenges of engineering education, a Special Interest Group (SIG) has formed in the UK. The formation of this group has itself been part of the rapidly changing environment, through its facilitation by the Higher Education Academy's Engineering Subject Centre, an entity which, through the Academy's current restructuring, will no longer exist as a discrete Centre dedicated to supporting engineering academics. The aims of this group, the activities it is currently undertaking and how it expects to network and collaborate with the global EER community will be reported in this paper. This will include an explanation of how the group has identified barriers to the progress of EER and how it is seeking, through a series of activities, to facilitate recognition and growth of EER both within the UK and with our valued international colleagues.

Relevance:

30.00%

Publisher:

Abstract:

Designers of self-adaptive systems often formulate adaptive design decisions, making unrealistic or myopic assumptions about the system's requirements and environment. The decisions taken during this formulation are crucial for satisfying requirements. In environments characterized by uncertainty and dynamism, deviation from these assumptions is the norm and may trigger 'surprises'. Our method allows designers to make explicit links between the possible emergence of surprises, risks and design trade-offs. The method can be used to explore the design decisions for self-adaptive systems and to choose among decisions that better fulfil (or rather partially fulfil) non-functional requirements and address their trade-offs. The analysis can also provide designers with valuable input for refining the adaptation decisions to balance, for example, resilience (i.e. satisfiability of non-functional requirements and their trade-offs) and stability (i.e. minimizing the frequency of adaptation). The objective is to provide designers of self-adaptive systems with a basis for multi-dimensional what-if analysis to revise and improve their understanding of the environment and its effect on non-functional requirements, and thereafter their decision-making. We have applied the method to a wireless sensor network for flood prediction. The application shows that the method gives rise to questions that were not explicitly asked before at design time and assists designers in the process of risk-aware, what-if and trade-off analysis.
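
As a loose illustration of multi-dimensional what-if analysis over design decisions (not the authors' method or data), the sketch below scores a few hypothetical adaptation designs for a flood-prediction sensor network across uncertain environment scenarios, exposing the resilience/stability trade-off; all designs, scenarios, scores and weights are invented.

```python
# Illustrative only: a toy what-if analysis over design decisions for a
# self-adaptive sensor network. Designs, scenarios, scores and weights are
# invented assumptions, not the method or data of the paper.
from dataclasses import dataclass

@dataclass
class Design:
    name: str
    resilience: dict   # scenario -> satisfaction of non-functional reqs (0..1)
    adaptations: dict  # scenario -> expected adaptations per day (lower = more stable)

SCENARIOS = {"calm": 0.6, "storm": 0.3, "sensor_failures": 0.1}  # assumed likelihoods

DESIGNS = [
    Design("aggressive adaptation",
           {"calm": 0.95, "storm": 0.90, "sensor_failures": 0.85},
           {"calm": 20, "storm": 60, "sensor_failures": 45}),
    Design("conservative adaptation",
           {"calm": 0.90, "storm": 0.70, "sensor_failures": 0.60},
           {"calm": 2, "storm": 8, "sensor_failures": 6}),
    Design("scenario-tuned adaptation",
           {"calm": 0.92, "storm": 0.85, "sensor_failures": 0.75},
           {"calm": 5, "storm": 25, "sensor_failures": 15}),
]

def what_if(design, w_resilience=0.7, w_stability=0.3, max_adaptations=60):
    """Expected scores across scenarios, exposing the resilience/stability trade-off."""
    resilience = sum(p * design.resilience[s] for s, p in SCENARIOS.items())
    stability = sum(p * (1 - design.adaptations[s] / max_adaptations)
                    for s, p in SCENARIOS.items())
    return resilience, stability, w_resilience * resilience + w_stability * stability

if __name__ == "__main__":
    for d in DESIGNS:
        r, s, score = what_if(d)
        print(f"{d.name:<28} resilience={r:.2f} stability={s:.2f} combined={score:.2f}")
```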