980 results for error management


Relevance:

30.00%

Publisher:

Abstract:

Wind energy has been identified as key to the European Union’s 2050 low-carbon economy. However, as wind is a variable resource and stochastic by nature, it is difficult to plan and schedule the power system under varying wind power generation. This paper investigates the impacts of offshore wind power forecast error on the operation and management of a pool-based electricity market in 2050. The impact of the magnitude and variance of the offshore wind power forecast error on system generation costs, emission costs, dispatch-down of wind, number of start-ups and system marginal price is analysed. The main findings of this research are that the magnitude of the offshore wind power forecast error has the largest impact on system generation costs and dispatch-down of wind, whereas the variance of the offshore wind power forecast error has the biggest impact on emissions costs and system marginal price. Overall, offshore wind power forecast error variance results in a system marginal price increase of 9.6% in 2050.

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the impacts of offshore wind power forecast error on the operation and management of a pool-based electricity market in 2050. The impact of offshore wind power forecast errors of up to 2000 MW on system generation costs, emission costs, dispatch-down of wind, number of start-ups and system marginal price is analysed. The main finding of this research is an increase in system marginal prices of approximately 1% for every percentage point rise in the offshore wind power forecast error, regardless of the sign of the average forecast error. If offshore wind power generates less than forecast (−13%), generation costs and system marginal prices increase by 10%. However, if offshore wind power generates more than forecast (4%), generation costs decrease yet system marginal prices increase by 3%. The dispatch-down of large quantities of wind power highlights the need for flexible interconnector capacity. From a system operator's perspective, when scheduling wind ahead of the trading period it is more beneficial to forecast less wind than will be generated.

Relevance:

30.00%

Publisher:

Abstract:

This paper explores the performance of sliding-window-based training, termed semi-batch, using a multilayer perceptron (MLP) neural network in the presence of correlated data. Sliding-window training is a form of higher-order instantaneous learning strategy that does not require a covariance matrix and is usually employed for modelling and tracking purposes. The sliding-window framework is implemented to combine the robustness of offline learning algorithms with the ability to track online the underlying process of a function. This paper adopts sliding-window training with recent advances in conjugate gradient direction, applying data-store management techniques such as a simple distance measure, angle evaluation and a novel prediction error test. The simulation results show that the best convergence performance is gained by using store-management techniques. © 2012 Springer-Verlag.
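The sliding-window ("semi-batch") idea can be sketched as follows: a small MLP is updated online, but each update takes gradient steps over the last W samples rather than a single instantaneous example. This is an illustrative sketch only; the window size, learning rate, number of inner steps and the FIFO store-management rule are assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

W = 20          # sliding-window length
LR = 0.05       # learning rate
H = 8           # hidden units

# One-hidden-layer tanh MLP, 1 input -> H hidden -> 1 output.
W1 = rng.normal(0, 0.5, (1, H))
b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1))
b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

def train_step(X, y):
    """One gradient step on the current window (mean squared error loss)."""
    global W1, b1, W2, b2
    h, yhat = forward(X)
    err = yhat - y                      # (n, 1)
    n = len(X)
    gW2 = h.T @ err / n
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)      # backprop through tanh
    gW1 = X.T @ dh / n
    gb1 = dh.mean(axis=0)
    W1 -= LR * gW1; b1 -= LR * gb1
    W2 -= LR * gW2; b2 -= LR * gb2

# Stream of correlated data: consecutive samples from a smooth sine target.
xs = np.linspace(-2, 2, 400).reshape(-1, 1)
ys = np.sin(1.5 * xs)

window_X, window_y = [], []
for x, y in zip(xs, ys):
    window_X.append(x); window_y.append(y)
    if len(window_X) > W:               # simple FIFO store management
        window_X.pop(0); window_y.pop(0)
    for _ in range(3):                  # a few "semi-batch" steps per arrival
        train_step(np.array(window_X), np.array(window_y))

_, pred = forward(xs)
mse = float(np.mean((pred - ys) ** 2))
print(f"final MSE over the stream: {mse:.4f}")
```

A covariance-free update like this is what makes the strategy cheap enough for online tracking; a real implementation would replace plain gradient descent with the conjugate-gradient direction the paper discusses.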

Relevance:

30.00%

Publisher:

Abstract:

Social work in the United Kingdom remains embroiled in concerns about child protection error. The serious injury or death of vulnerable children continues to evince much consternation in the public and private spheres. Governmental responses to these concerns invariably draw on technocratic solutions involving more procedures, case management systems, information technology and bureaucratic regulation. Such solutions flow from an implicit use of instrumental rationality based on a ‘means-end’ logic. While bringing an important perspective to the problem of child protection error, instrumental rationality has been overused, limiting discretion and other modes of rational inquiry. This paper argues that the social work profession should apply an enlarged form of rationality comprising not only the instrumental-rational mode but also the critical-rational, affective-rational and communicative-rational forms. It is suggested that this combined conceptual arsenal of rational inquiry leads to a gestalt which has been termed the holistic-rational perspective. It is also argued that embracing a more rounded perspective such as this might offer greater opportunities for reducing child protection error.

Relevance:

30.00%

Publisher:

Abstract:

Currently, wind power in the British Isles is dominated by onshore wind farms, but both the United Kingdom and the Republic of Ireland have high renewable energy targets, expected to be met mostly from wind power. As the demand for wind power grows, to ensure security of energy supply, to provide a potentially cheaper alternative to fossil fuels and to meet greenhouse gas emissions reduction targets, offshore wind power will grow rapidly as the availability of suitable onshore sites decreases. However, wind is variable and stochastic by nature and thus difficult to schedule. In order to plan for these uncertainties, market operators use wind forecasting tools, reserve plant and ancillary service agreements. Onshore wind power forecasting techniques have improved dramatically and continue to advance, but offshore wind power forecasting is more difficult due to limited datasets and knowledge. So, as the amount of offshore wind power in the British Isles increases, robust forecasting and planning techniques become even more critical. This paper presents a methodology to investigate the impacts of better offshore wind forecasting on the operation and management of the single wholesale electricity market in the Republic of Ireland and Northern Ireland using PLEXOS for Power Systems. © 2013 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

Distributed Embedded Systems (DES) have been used over recent years in many application domains, from robotics and industrial process control to avionics and automotive applications, and this trend is expected to continue in the coming years. Dependability is an important property in these domains, since services must be executed in a timely and predictable manner; otherwise, economic damage may occur or human lives may be put at risk. At design time it is impossible to foresee every fault scenario, owing to the non-determinism of the surrounding environment, so fault-tolerance mechanisms must be included. Additionally, some of these applications require high bandwidth, which can also be used to evolve the systems by adding new functionality. Flexibility is an important property of a system, since it allows the system to adapt to the surrounding conditions and requirements, and it also contributes to simpler maintenance and repair. In embedded systems, flexibility is also important because it enables better use of the often scarce resources available. An obvious way to increase both the bandwidth and the fault tolerance of distributed embedded systems is to replicate the system buses. Some existing solutions, both commercial and academic, propose bus replication to increase bandwidth or to increase fault tolerance. However, almost invariably the purpose is only one of the two; solutions that provide both higher bandwidth and increased fault tolerance are rare. One of these rare examples is FlexRay, with the limitation that only two buses may be used.
This thesis presents and discusses a proposal to use bus replication in a flexible way with the dual objective of increasing bandwidth and fault tolerance. The flexibility of the proposed protocols also allows dynamic management of the network topology, with the number of buses limited only by the hardware/software. The proposals of this thesis were validated on the CAN (Controller Area Network) fieldbus, chosen for its wide market penetration. More specifically, the proposed solutions were implemented and validated using a paradigm that combines flexibility with event-triggered and time-triggered communication: FTT (Flexible Time-Triggered). A generalisation to native CAN is, however, also presented and discussed. The inclusion of bus-replication mechanisms requires changes to the previous master-node replication and replacement protocols, as well as the definition of new protocols for this purpose. This work takes advantage of the centralised architecture and of master-node replication to support bus replication efficiently and flexibly. If a fault occurs on a bus (or buses) that could cause a system failure, the protocols and components proposed in this thesis make the system react by switching to a degraded operating mode: messages that were being transmitted on the buses where the fault occurred are rerouted to the other buses. Master-node replication is based on a leader-followers strategy, in which the leader controls the whole system while the followers serve as backup nodes. If an error occurs in the leader node, one of the follower nodes takes over control of the system transparently, maintaining the same functionality. The proposals of this thesis were also generalised to native CAN, for which two additional components were proposed.
It is thus possible to have the same bus-level fault-tolerance capabilities together with dynamic management of the network topology. All the proposals of this thesis were implemented and evaluated. An initial single-bus implementation was evaluated in a real application, a robotic football team in which the FTT-CAN protocol was used for motion control and odometry. The evaluation of the multi-bus system was carried out on a laboratory test platform. For this purpose, a fault-injection system that imposes faults on the buses and on the master nodes was developed, together with a delay-measurement system designed to measure the response time after the occurrence of a fault.
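The two mechanisms the thesis combines, leader-followers master replication and rerouting of traffic from a failed bus to the surviving replicas, can be sketched in a few lines. This is a hypothetical Python illustration of the control logic only; class and method names are invented, and a real FTT-CAN implementation lives in node firmware.

```python
class ReplicatedSystem:
    """Toy model of replicated buses plus leader-followers master replication."""

    def __init__(self, masters, buses):
        self.masters = list(masters)        # masters[0] is the current leader
        self.bus_ok = {b: True for b in buses}

    @property
    def leader(self):
        return self.masters[0]

    def master_failed(self, name):
        """Leader-followers policy: the next follower takes over transparently."""
        self.masters.remove(name)
        if not self.masters:
            raise RuntimeError("no master left: system failure")

    def bus_failed(self, bus):
        """Mark a bus faulty; its traffic is rerouted (degraded operating mode)."""
        self.bus_ok[bus] = False

    def route(self, message):
        """Pick the first healthy bus for a message."""
        for bus, ok in self.bus_ok.items():
            if ok:
                return bus
        raise RuntimeError("all buses failed")


sys_ = ReplicatedSystem(["M1", "M2", "M3"], ["CAN-A", "CAN-B"])
sys_.bus_failed("CAN-A")       # fault injected on bus A
sys_.master_failed("M1")       # leader fails
print(sys_.leader, sys_.route("odometry"))   # -> M2 CAN-B
```

The point of the sketch is the separation of concerns: bus failover and master failover are independent mechanisms that compose, which is what allows the degraded mode to tolerate faults on either level.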

Relevance:

30.00%

Publisher:

Abstract:

In Wireless Sensor Networks (WSNs), neglecting the effects of varying channel quality can lead to unnecessary wastage of precious battery resources, which in turn can result in the rapid depletion of sensor energy and the partitioning of the network. Fairness is a critical issue when accessing a shared wireless channel, and fair scheduling must be employed to provide the proper flow of information in a WSN. In this paper, we develop a channel-adaptive MAC protocol with a traffic-aware dynamic power management algorithm for efficient packet scheduling and queuing in a sensor network, with the time-varying characteristics of the wireless channel also taken into consideration. The proposed protocol calculates a combined weight value based on the channel state and link quality. Transmission is then allowed only for those nodes with weights greater than a minimum quality threshold, and nodes attempting to access the wireless medium with a low weight will be allowed to transmit only when their weight becomes high. This results in many poor-quality nodes being deprived of transmission for a considerable amount of time. To avoid buffer overflow and to achieve fairness for the poor-quality nodes, we design a load-prediction algorithm. We also design a traffic-aware dynamic power management scheme to minimise energy consumption by continuously turning off the radio interface of all the unnecessary nodes that are not included in the routing path. Simulation results show that our proposed protocol achieves higher throughput and fairness while also reducing delay.
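The weight-and-threshold idea in the abstract can be sketched directly: each node gets a combined weight from its channel state and link quality, and only nodes above a minimum threshold may transmit in a round. The weighting formula (a convex combination), the threshold value and the node data are illustrative assumptions, not the paper's exact metric.

```python
THRESHOLD = 0.5   # assumed minimum quality threshold

def combined_weight(channel_state, link_quality, alpha=0.6):
    """Convex combination of normalised channel state and link quality."""
    return alpha * channel_state + (1 - alpha) * link_quality

def schedule(nodes):
    """Return the nodes allowed to transmit this round, best weight first."""
    weighted = [(combined_weight(c, l), name) for name, c, l in nodes]
    return [name for w, name in sorted(weighted, reverse=True) if w >= THRESHOLD]

# (name, channel_state, link_quality), each normalised to [0, 1]
nodes = [("n1", 0.9, 0.8), ("n2", 0.3, 0.2), ("n3", 0.6, 0.7)]
print(schedule(nodes))   # -> ['n1', 'n3']; n2 is deferred until its weight improves
```

It is exactly this deferral of low-weight nodes such as n2 that motivates the paper's load-prediction algorithm, which steps in to prevent buffer overflow at the starved nodes.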

Relevance:

30.00%

Publisher:

Abstract:

Models of the dynamics of nitrogen in soil (soil-N) can be used to aid the fertilizer management of a crop. The predictions of soil-N models can be validated by comparison with observed data. Validation generally involves calculating non-spatial statistics of the observations and predictions, such as their means, their mean squared difference, and their correlation. However, when the model predictions are spatially distributed across a landscape the model requires validation with spatial statistics. There are three reasons for this: (i) the model may be more or less successful at reproducing the variance of the observations at different spatial scales; (ii) the correlation of the predictions with the observations may be different at different spatial scales; (iii) the spatial pattern of model error may be informative. In this study we used a model, parameterized with spatially variable input information about the soil, to predict the mineral-N content of soil in an arable field, and compared the results with observed data. We validated the performance of the N model spatially with a linear mixed model of the observations and model predictions, estimated by residual maximum likelihood. This novel approach allowed us to describe the joint variation of the observations and predictions as: (i) independent random variation that occurred at a fine spatial scale; (ii) correlated random variation that occurred at a coarse spatial scale; (iii) systematic variation associated with a spatial trend. The linear mixed model revealed that, in general, the performance of the N model changed depending on the spatial scale of interest. At the scales associated with random variation, the N model underestimated the variance of the observations, and the predictions were correlated poorly with the observations. At the scale of the trend, the predictions and observations shared a common surface.
The spatial pattern of the error of the N model suggested that the observations were affected by the local soil condition, but this was not accounted for by the N model. In summary, the N model would be well suited to field-scale management of soil nitrogen, but poorly suited to management at finer spatial scales. This information was not apparent with a non-spatial validation. © 2007 Elsevier B.V. All rights reserved.
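The non-spatial validation statistics that the abstract contrasts with the spatial (REML mixed-model) approach take a simple form. The data below are synthetic, purely to show the shape of the calculation; note how each statistic collapses all spatial scales into a single number, which is precisely the limitation the spatial approach addresses.

```python
import numpy as np

rng = np.random.default_rng(1)
observed = rng.normal(50, 10, 100)                    # synthetic soil mineral-N
predicted = observed * 0.8 + rng.normal(10, 5, 100)   # imperfect model of it

mean_obs, mean_pred = observed.mean(), predicted.mean()
msd = np.mean((predicted - observed) ** 2)    # mean squared difference
r = np.corrcoef(observed, predicted)[0, 1]    # correlation

print(f"means: {mean_obs:.1f} vs {mean_pred:.1f}, MSD: {msd:.1f}, r: {r:.2f}")
```

A spatial validation would instead partition the joint variation of `observed` and `predicted` into fine-scale, coarse-scale and trend components, which requires fitting a linear mixed model with spatially correlated random effects rather than the summary statistics above.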

Relevance:

30.00%

Publisher:

Abstract:

Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost-effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention, compared with simple feedback, in reducing the proportion of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods: Research subject group: "at-risk" patients registered with computerised general practices in two geographical regions in England. Design: parallel-group pragmatic cluster randomised trial. Interventions: practices will be randomised to either (i) computer-generated feedback or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: the proportion of patients in each practice at six and 12 months post-intervention:

- with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs;
- with a computer-recorded diagnosis of asthma being prescribed beta-blockers;
- aged 75 years and older receiving long-term prescriptions for angiotensin-converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months.

Secondary outcome measures: these relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: an economic evaluation will be done of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: a qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and to investigate possible reasons why the interventions prove effective or, conversely, prove ineffective.
Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm, compared with an 11% reduction in the simple feedback arm. Discussion: at the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.
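The form of the two-proportion power calculation underlying a sample-size statement like this can be sketched with the usual normal approximation. Everything numeric here is an assumption for illustration: the 15% baseline error rate is invented, and a cluster trial like this one must additionally inflate the sample size by a design effect for within-practice correlation, which is omitted below.

```python
import math

Z_ALPHA = 1.959964   # two-tailed alpha = 0.05
Z_BETA = 0.841621    # power = 0.80

def n_per_arm(p_control, p_treat):
    """Individuals per arm to detect p_control vs p_treat (normal approximation)."""
    p_bar = (p_control + p_treat) / 2
    num = (Z_ALPHA * math.sqrt(2 * p_bar * (1 - p_bar))
           + Z_BETA * math.sqrt(p_control * (1 - p_control)
                                + p_treat * (1 - p_treat))) ** 2
    return math.ceil(num / (p_control - p_treat) ** 2)

baseline = 0.15                        # assumed illustrative baseline error rate
p_feedback = baseline * (1 - 0.11)     # 11% relative reduction (simple feedback)
p_pharmacist = baseline * (1 - 0.50)   # 50% relative reduction (pharmacist arm)
print(n_per_arm(p_feedback, p_pharmacist), "patients per arm before design effect")
```

Dividing the inflated individual-level n by the average cluster size then gives the number of practices per arm; the trial's own calculation will differ in its assumed rates and intracluster correlation.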

Relevance:

30.00%

Publisher:

Abstract:

The question as to whether active management adds any value above that of the fund’s investment policy is one of continual interest to investors. In order to investigate this issue in the UK real estate market we examine a number of related questions. First, how much return variability is explained by investment policy? Second, how similar are the policies across funds? Third, how much of a fund’s return is determined by investment policy? Finally, how was this added value achieved? Using data for 19 real estate funds, we find that investment policy explains less than half of the variability in returns over time, none of the variation across funds, and that more than 100% of the level of return is attributed to investment policy. The results also show that UK real estate funds focus exclusively on trying to pick winners to add value, and that in pursuit of active return fund managers incur high tracking error risk; consequently, successful active management is very difficult to achieve. In addition, the results are dependent on the benchmark used to represent the investment policy of the fund. Nonetheless, active management can indeed add value to a real estate fund’s performance. This is the good news. The bad news is that adding value is much more difficult to achieve than is generally accepted.
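Tracking error, the active-risk measure the abstract says managers incur in pursuit of active return, is just the standard deviation of the difference between fund and benchmark returns, usually annualised. The monthly return series below are synthetic, purely to show the calculation.

```python
import numpy as np

rng = np.random.default_rng(2)
benchmark = rng.normal(0.006, 0.02, 36)          # 3 years of monthly returns
fund = benchmark + rng.normal(0.001, 0.01, 36)   # active bets around the benchmark

active = fund - benchmark
tracking_error = active.std(ddof=1) * np.sqrt(12)   # annualised active risk
print(f"annualised tracking error: {tracking_error:.2%}")
```

A fund that merely implemented its investment policy would have active returns near zero and a tracking error near zero; a high tracking error is the statistical signature of the winner-picking behaviour the study documents.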

Relevance:

30.00%

Publisher:

Abstract:

Recent research has suggested that forecast evaluation on the basis of standard statistical loss functions could prefer models which are sub-optimal when used in a practical setting. This paper explores a number of statistical models for predicting the daily volatility of several key UK financial time series. The out-of-sample forecasting performance of various linear and GARCH-type models of volatility are compared with forecasts derived from a multivariate approach. The forecasts are evaluated using traditional metrics, such as mean squared error, and also by how adequately they perform in a modern risk management setting. We find that the relative accuracies of the various methods are highly sensitive to the measure used to evaluate them. Such results have implications for any econometric time series forecasts which are subsequently employed in financial decision-making.
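The paper's central point, that a statistical loss and a risk-management criterion can disagree, can be illustrated by scoring two volatility forecasts both ways: by mean squared error against squared returns, and by the frequency of 95% Value-at-Risk exceedances. The data and both "forecasts" below are synthetic stand-ins; real GARCH or multivariate forecasts would replace them.

```python
import numpy as np

rng = np.random.default_rng(3)
true_vol = 0.01 + 0.005 * np.abs(np.sin(np.arange(500) / 25))  # slowly varying vol
returns = rng.normal(0, true_vol)                              # daily returns

forecast_a = np.full(500, 0.02)                       # flat, badly calibrated
forecast_b = true_vol + rng.normal(0, 0.0005, 500)    # tracks vol, slightly noisy

def mse(f):
    """Statistical loss: MSE of forecast variance against squared returns."""
    return np.mean((returns**2 - f**2) ** 2)

def var_exceed_rate(f, z=1.645):
    """Risk-management loss: share of days breaching the 95% VaR."""
    return np.mean(returns < -z * f)

for name, f in [("A", forecast_a), ("B", forecast_b)]:
    print(name, f"MSE={mse(f):.2e}", f"VaR exceedances={var_exceed_rate(f):.1%}")
```

Forecast A looks safe on the VaR criterion (almost no exceedances, because its VaR is far too conservative) while scoring badly on MSE; which model a practitioner should prefer depends on the loss function, which is the sensitivity the paper documents.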

Relevance:

30.00%

Publisher:

Abstract:

Background: Medication errors in general practice are an important source of potentially preventable morbidity and mortality. Building on previous descriptive, qualitative and pilot work, we sought to investigate the effectiveness, cost-effectiveness and likely generalisability of a complex pharmacist-led IT-based intervention aiming to improve prescribing safety in general practice. Objectives: We sought to:

• Test the hypothesis that a pharmacist-led IT-based complex intervention using educational outreach and practical support is more effective than simple feedback in reducing the proportion of patients at risk from errors in prescribing and medicines management in general practice.
• Conduct an economic evaluation of the cost per error avoided, from the perspective of the National Health Service (NHS).
• Analyse data recorded by pharmacists, summarising the proportions of patients judged to be at clinical risk, the actions recommended by pharmacists, and actions completed in the practices.
• Explore the views and experiences of healthcare professionals and NHS managers concerning the intervention; investigate potential explanations for the observed effects, and inform decisions on the future roll-out of the pharmacist-led intervention.
• Examine secular trends in the outcome measures of interest, allowing for informal comparison between trial practices and practices that did not participate in the trial but contributed to the QRESEARCH database.

Methods: Two-arm cluster randomised controlled trial of 72 English general practices with embedded economic analysis and longitudinal descriptive and qualitative analysis. Informal comparison of the trial findings with a national descriptive study investigating secular trends, undertaken using data from practices contributing to the QRESEARCH database. The main outcomes of interest were prescribing errors and medication monitoring errors at six and 12 months following the intervention.
Results: Participants in the pharmacist intervention arm practices were significantly less likely to have been prescribed a non-selective NSAID without a proton pump inhibitor (PPI) if they had a history of peptic ulcer (OR 0.58, 95% CI 0.38, 0.89), to have been prescribed a beta-blocker if they had asthma (OR 0.73, 95% CI 0.58, 0.91) or (in those aged 75 years and older) to have been prescribed an ACE inhibitor or diuretic without a measurement of urea and electrolytes in the preceding 15 months (OR 0.51, 95% CI 0.34, 0.78). The economic analysis suggests that the PINCER pharmacist intervention has a 95% probability of being cost-effective if the decision-maker’s ceiling willingness to pay reaches £75 (six months) or £85 (12 months) per error avoided. The intervention addressed an issue that was important to professionals and their teams and was delivered in a way that was acceptable to practices, with minimum disruption of normal work processes. Comparison of the trial findings with changes seen in QRESEARCH practices indicated that any reductions achieved in the simple feedback arm were likely, in the main, to have been related to secular trends rather than to the intervention. Conclusions: Compared with simple feedback, the pharmacist-led intervention resulted in reductions in the proportions of patients at risk of prescribing and monitoring errors for the primary outcome measures and the composite secondary outcome measures at six months and (with the exception of the NSAID/peptic ulcer outcome measure) 12 months post-intervention. The intervention is acceptable to pharmacists and practices, and is likely to be seen as cost-effective by decision makers.

Relevance:

30.00%

Publisher:

Abstract:

A dynamic, mechanistic model of enteric fermentation was used to investigate the effect of type and quality of grass forage, dry matter intake (DMI) and proportion of concentrates in dietary dry matter (DM) on variation in methane (CH(4)) emission from enteric fermentation in dairy cows. The model represents substrate degradation and microbial fermentation processes in the rumen and hindgut and, in particular, the effects of the type of substrate fermented and of pH on the production of individual volatile fatty acids and CH(4) as end-products of fermentation. Effects of type and quality of fresh and ensiled grass were evaluated by distinguishing two N fertilization rates of grassland and two stages of grass maturity. Simulation results indicated a strong impact of the amount and type of grass consumed on CH(4) emission, with a maximum difference (across all forage types and all levels of DMI) of 49 and 77% in g CH(4)/kg fat and protein corrected milk (FCM) for diets with a proportion of concentrates in dietary DM of 0.1 and 0.4, respectively (values ranging from 10.2 to 19.5 g CH(4)/kg FCM). The lowest emission was established for early cut, high-fertilized grass silage (GS) and high-fertilized grass herbage (GH). The highest emission was found for late cut, low-fertilized GS. The N fertilization rate had the largest impact, followed by stage of grass maturity at harvesting and by the distinction between GH and GS. Emission expressed in g CH(4)/kg FCM declined on average 14% with an increase of DMI from 14 to 18 kg/day for grass forage diets with a proportion of concentrates of 0.1, and on average 29% with an increase of DMI from 14 to 23 kg/day for diets with a proportion of concentrates of 0.4. Simulation results indicated that a high proportion of concentrates in dietary DM may lead to a further reduction of CH(4) emission per kg FCM, mainly as a result of a higher DMI and milk yield, in comparison to low-concentrate diets.
Simulation results were evaluated against independent data obtained at three different laboratories in indirect calorimetry trials with cows consuming mainly GH. The model predicted the average of observed values reasonably, but systematic deviations remained between individual laboratories, and the root mean squared prediction error was a proportion of 0.12 of the observed mean. Both observed and predicted emission expressed in g CH(4)/kg DM intake decreased upon an increase in the dietary N:organic matter (OM) ratio. The model reproduced reasonably well the variation in measured CH(4) emission in cattle sheds on Dutch dairy farms and indicated that on average a fraction of 0.28 of the total emissions must have originated from manure under these circumstances.
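The evaluation statistic quoted above, root mean squared prediction error expressed as a proportion of the observed mean (0.12 in the study), has a simple form. The numbers below are made up solely to show the calculation, not taken from the study's data.

```python
import numpy as np

# Synthetic observed and model-predicted emissions (g CH4/kg DMI, illustrative)
observed = np.array([18.5, 20.1, 16.8, 19.2, 21.0])
predicted = np.array([17.9, 21.5, 15.2, 19.8, 23.1])

rmspe = np.sqrt(np.mean((predicted - observed) ** 2))   # root MSPE
relative_rmspe = rmspe / observed.mean()                # proportion of observed mean
print(f"RMSPE = {rmspe:.2f}, proportion of observed mean = {relative_rmspe:.2f}")
```

Expressing the error relative to the observed mean makes model performance comparable across datasets whose absolute emission levels differ, which is why the study reports it this way.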

Relevance:

30.00%

Publisher:

Abstract:

Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) or Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools, and their adoption continues to grow. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires the adoption of pre-processing, post-processing and visualisation techniques in complex data workflows. Currently, a main problem for the integrated pre-processing and mining of MRI data is the lack of comprehensive platforms able to avoid the manual invocation of pre-processing and mining tools, which leads to an error-prone and inefficient process. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench, which automates the pre-processing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionalities available in the KNIME workbench.

Relevance:

30.00%

Publisher:

Abstract:

The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modelled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial, and its management is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design in which the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage numerous software libraries in a variety of languages, while retaining the legacy option of custom tab-separated text formats where that is the preferred access arrangement for the researcher. By decoupling the data model from data persistence, it is much easier to interchangeably use, for instance, relational databases to provide stricter provenance and audit trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text-based formats are usually inadequate. A schema derived from CF conventions has been designed to handle time series efficiently for SWIFT.
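The design described, a configuration data model decoupled from its on-disk persistence, can be sketched as a plain in-memory object plus an interchangeable store. The class and field names below are hypothetical illustrations, not SWIFT's actual API; the point is that a relational-database store could honour the same save/load contract as the JSON one.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SubareaConfig:
    """In-memory data model for one conceptual subarea (illustrative fields)."""
    name: str
    area_km2: float
    routing_node: str

class JsonStore:
    """One persistence mechanism; a database-backed store could offer the
    same dumps/loads contract for operational provenance requirements."""
    def dumps(self, cfg):
        return json.dumps(asdict(cfg))
    def loads(self, text):
        return SubareaConfig(**json.loads(text))

store = JsonStore()
cfg = SubareaConfig("upper_catchment", 142.5, "node_07")
roundtrip = store.loads(store.dumps(cfg))
print(roundtrip == cfg)   # -> True: dataclass equality survives the round trip
```

Because callers only ever touch the data model, swapping `JsonStore` for a tab-separated-text or database store changes nothing upstream, which is the interchangeability the redesign aims for.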