952 results for evaluation algorithm


Relevance: 60.00%

Abstract:

A prospective study with a quantitative approach involving three distinct groups of subjects. Group 1 comprised 56 patients assessed for venous thromboembolism (VTE) risk and followed for 30 months to record the outcomes death, readmission, and VTE prophylaxis. Group 2 comprised 50 staff nurses who answered questionnaires on VTE, with the purpose of assessing their knowledge of the risks and prophylaxis of this disease in hospitalized clinical patients. Group 3 comprised 100 staff nurses who answered questionnaires similar to those of group 2, before and after training on VTE prophylaxis. The general objective was to determine nurses' degree of knowledge about venous thromboembolism in view of their role in the risk-prevention process. The specific objectives were: to propose and implement a training strategy to qualify nurses in screening for VTE risk in hospitalized patients; to assess the impact of VTE training on nurses' ability to identify risk factors for the disease; and to describe VTE-related outcomes in patients hospitalized for more than 24 hours in a quaternary hospital over a 30-month follow-up. In group 1, VTE showed high mortality (63.6%) among patients who did not receive prophylaxis. It was also found that most of these subjects (89.2%) remain under medical follow-up, 53.6% were readmitted, and 28.6% continue to use some form of VTE prophylaxis.
In group 2, the professionals could not correctly identify VTE risk factors, showing a marked knowledge deficit regarding the risk factors and prophylaxis of the disease: 90% of the sample could not name more than 5 of the 24 risk factors covered by international consensus guidelines, an insufficient degree of knowledge as measured by an interval scale proposed in this study. Group 3, like group 2, showed a similar deficit: 100% could not name more than 4 risk factors for the disease. Training these nurses in VTE prophylaxis had a strong impact on their retention of information about VTE, and is an action easily replicated for professionals in hospital institutions. It was concluded that an evaluation algorithm/protocol for VTE risk screening by nurses, as proposed in this study, is an important tool in the screening and prevention of this disease in clinical patients, since the results showed that 97% of the nurses in group 3 were unfamiliar with any protocol for VTE risk prevention. The subjects showed an excellent level of knowledge about mechanical prophylaxis, yet most (63%) had never given guidance on prophylaxis while providing care, reinforcing the finding that they are not part of the VTE risk-prevention process for hospitalized patients. Bringing nurses into this risk-identification process should reduce the high morbidity and mortality and lower the incidence of this disease in hospital units.

Relevance: 60.00%

Abstract:

For trust evaluation in peer-to-peer computing, a new quantitative trust-management algorithm is proposed that addresses problems existing algorithms handle poorly, such as the time-decay property of trust and node coalitions. The paper systematically reviews and analyzes representative network trust evaluation algorithms, classifies current research topics in China and abroad, gives trust-related definitions and the issues such an algorithm should consider, and presents a complete algorithmic solution. A trust time-correction function, a domain trust-correction function, a trust-value calibration function, and an accuracy function are defined, and trust time-correction and domain-correction algorithms are constructed. Derivations show that the algorithm exhibits good time decay, correlation with historical experience, a reward property for newly joined nodes, and a coalition property; a general natural trust-decay curve and the coefficient ranges of eight typical characteristic domains are also given. Experiments evaluate the correctness and effectiveness of the algorithm, and a comparison with the Azzedin algorithm shows significant improvements in efficiency and accuracy.
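The abstract's correction functions are not given in detail, so they are not reproduced here. As an illustrative sketch only, a common way to realize the time-decay property it describes is to weight each past rating by an exponentially decreasing factor of its age; the half-life parameter and the [0, 1] rating scale below are assumptions, not taken from the paper:

```python
import math

def decayed_trust(interactions, now, half_life=30.0):
    """Aggregate (timestamp, rating) pairs into one trust value,
    weighting each rating by exp(-lam * age) so that older experience
    counts less; lam is derived from a chosen half-life."""
    lam = math.log(2) / half_life
    num = den = 0.0
    for t, rating in interactions:
        w = math.exp(-lam * (now - t))
        num += w * rating
        den += w
    return num / den if den else 0.0
```

With this form, a recent bad experience outweighs an old good one, which is the qualitative behavior the time-decay property calls for.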

Relevance: 60.00%

Abstract:

One objective of artificial intelligence is to model the behavior of an intelligent agent interacting with its environment. The environment's transformations can be modeled as a Markov chain, whose state is partially observable to the agent and affected by its actions; such processes are known as partially observable Markov decision processes (POMDPs). While the environment's dynamics are assumed to obey certain rules, the agent does not know them and must learn. In this dissertation we focus on the agent's adaptation as captured by the reinforcement learning framework. This means learning a policy---a mapping of observations into actions---based on feedback from the environment. The learning can be viewed as browsing a set of policies while evaluating them by trial through interaction with the environment. The set of policies is constrained by the architecture of the agent's controller. POMDPs require a controller to have a memory. We investigate controllers with memory, including controllers with external memory, finite state controllers and distributed controllers for multi-agent systems. For these various controllers we work out the details of the algorithms which learn by ascending the gradient of expected cumulative reinforcement. Building on statistical learning theory and experiment design theory, a policy evaluation algorithm is developed for the case of experience re-use. We address the question of sufficient experience for uniform convergence of policy evaluation and obtain sample complexity bounds for various estimators. Finally, we demonstrate the performance of the proposed algorithms on several domains, the most complex of which is simulated adaptive packet routing in a telecommunication network.
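The dissertation's estimators are not reproduced here; as a minimal sketch of policy evaluation with experience re-use, ordinary importance sampling re-weights returns collected under a behavior policy to estimate the value of a target policy (the function names and episode format are assumptions):

```python
def is_policy_value(episodes, pi_target, pi_behavior, gamma=0.95):
    """Off-policy evaluation by importance sampling: each episode is a
    list of (observation, action, reward); pi_* map (obs, action) to
    the probability of choosing that action."""
    total = 0.0
    for ep in episodes:
        w, ret, disc = 1.0, 0.0, 1.0
        for obs, a, r in ep:
            w *= pi_target(obs, a) / pi_behavior(obs, a)  # likelihood ratio
            ret += disc * r                               # discounted return
            disc *= gamma
        total += w * ret
    return total / len(episodes)
```

When target and behavior policies coincide, the weights are all 1 and the estimate reduces to the average observed return, a quick sanity check on such estimators.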

Relevance: 60.00%

Abstract:

This thesis makes a methodological contribution to the problem of operating a hydropower reservoir during flood events under a stochastic, multiobjective approach. A methodology is proposed to assess flood control strategies in a multiobjective and probabilistic framework. In addition, a dynamic flood control environment for real-time operation with forecasts is developed, combining an optimization model with simulation algorithms. These tools assist dam managers in deciding which reservoir operation is most appropriate. A detailed literature review showed that most existing studies of flood control reservoir operation use a reduced number of inflow series or hydrographs to characterize the possible scenarios, so the satisfactory functioning of a given model is limited to similar hydrological situations. Moreover, most available studies address multipurpose reservoir operation over an entire flood season lasting several months, which differs from the reality of reservoir management in Spain. With computational advances in real-time data management, there is a growing trend toward real-time operation tools with forecasts for short-term operation, including flood control. The strategy evaluation methodology proposed in this thesis determines the behavior of each strategy over a spectrum of floods representative of the hydrological forcing of the dam. To that end, an indicator-based evaluation system is combined with a stochastic flood generation environment, yielding an implicitly stochastic framework.
The evaluation system consists of three stages: characterization, synthesis, and comparison, so that the complex structure of results can be handled and the evaluation carried out. In the first stage, characterization variables are defined, linked to the aspects to be evaluated (dam safety, flood control, power generation, etc.); each variable characterizes the behavior of a strategy for a given aspect and event. In the second stage, this information is synthesized into a set of indicators, kept as small as possible. Finally, the indicators are compared, either by aggregating the objectives into a single indicator or by applying the Pareto dominance criterion, which yields a set of non-dominated solutions. The methodology was applied to calibrate the parameters of a reservoir flood control optimization model and to compare it with another operating rule, using the aggregation approach. It was then extended to assess and compare existing flood control operating rules for hydropower reservoirs, using the dominance criterion. The versatility of the methodology allows other applications, such as determining safety levels or volumes, or selecting spillway dimensions among several alternatives. The dynamic flood control environment, with its combined optimization-simulation approach, exploits the advantages of both types of model and facilitates interaction with dam operators. Results improve on those obtained with a reactive operating rule, even when the forecasts deviate considerably from the observed hydrograph. This helps to narrow the oft-cited gap between theoretical development and practical application of optimal reservoir management models.
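The thesis's actual indicators are not specified in the abstract; as a sketch of the dominance-criterion stage only, a Pareto filter over hypothetical indicator vectors (all objectives taken as minimized) might look like:

```python
def dominates(a, b):
    """True if indicator vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(strategies):
    """Keep only non-dominated strategies; each strategy is a
    (name, indicator_vector) pair."""
    return [(n, v) for n, v in strategies
            if not any(dominates(w, v) for _, w in strategies if w != v)]
```

The aggregation approach, by contrast, would collapse each indicator vector to a single weighted score and return the one minimizer instead of a set of good solutions.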

Relevance: 40.00%

Abstract:

Concurrency control (CC) algorithms are important in distributed database systems to ensure consistency of the database. A number of such algorithms are available in the literature, and their performance evaluation is recognized as important, yet only a few such studies have been carried out. This paper presents a detailed simulation study of the performance of a CC algorithm proposed by Rosenkrantz et al. In doing so, the algorithm has been modified so that it can, within itself, handle redundancy in the database. The influence of various system parameters and of the transaction profile on the response time and on the degree of conflict is considered. The entire study was carried out using the programming language SIMULA on a DEC-1090 system.
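The paper's simulation is not reproduced here; as an illustrative sketch, the timestamp-based wound-wait rule from the Rosenkrantz et al. line of work (assuming a lower timestamp means an older, higher-priority transaction) resolves a lock conflict as follows:

```python
def wound_wait(requester_ts, holder_ts):
    """Wound-wait conflict resolution: an older requester wounds
    (aborts) the younger lock holder; a younger requester waits for
    the older holder. Deadlock-free, since a transaction only ever
    waits for strictly older transactions."""
    if requester_ts < holder_ts:  # requester is older
        return "wound"            # abort the holder and take the lock
    return "wait"                 # requester blocks until release
```

A simulator of the kind described would apply this rule at every lock conflict and then measure response time and conflict rate under varying transaction profiles.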

Relevance: 40.00%

Abstract:

This paper presents image reconstruction using the fan-beam filtered backprojection (FBP) algorithm with no backprojection weight, applied to truncated projection data completed by windowed linear prediction (WLP). Image reconstruction from truncated projections aims to reconstruct the object accurately from the limited projection data available. Because the projection data are incomplete, the reconstructed image contains truncation artifacts that extend into the region of interest (ROI), making it unsuitable for further use. Data completion techniques have been shown to be effective in such situations. We use windowed linear prediction for projection completion, then apply the fan-beam FBP algorithm with no backprojection weight for 2-D image reconstruction, and evaluate the quality of the images reconstructed after WLP completion.
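The paper's exact WLP scheme is not reproduced here; as a rough sketch of the underlying idea, a truncated 1-D projection can be extended by fitting autoregressive coefficients to the known samples and extrapolating past the truncation point (the model order and least-squares fit are assumptions, not the paper's windowing details):

```python
import numpy as np

def lp_extend(signal, order, n_extend):
    """Extend a truncated 1-D projection by linear prediction: fit AR
    coefficients a[k] with x[n] ~ sum_k a[k] * x[n-1-k] by least
    squares on the known samples, then extrapolate n_extend samples."""
    x = np.asarray(signal, dtype=float)
    rows = [x[n - order:n][::-1] for n in range(order, len(x))]
    a, *_ = np.linalg.lstsq(np.array(rows), x[order:], rcond=None)
    out = list(x)
    for _ in range(n_extend):
        out.append(float(np.dot(a, out[-1:-order - 1:-1])))
    return np.array(out)
```

Each row of a fan-beam sinogram would be completed this way (with suitable windowing and tapering) before filtering and backprojection.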

Relevance: 40.00%

Abstract:

Numerical solution of realistic 2-D and 3-D inverse problems may require a very large amount of computation. A two-level parallelism concept is often used to solve such problems. The primary level uses problem partitioning, a decomposition based on the mathematical/physical structure of the problem. The secondary level uses the widely applied data partitioning concept. A theoretical performance model is built on this two-level parallelism, and the observed performance obtained from a network of general-purpose Sun SPARCstations is compared with the theoretical values. Restrictions of the theoretical model are also discussed.
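The paper's performance model is not given in the abstract; as an illustrative stand-in only, an Amdahl-style model with a per-worker communication penalty shows how the two processor counts might enter such a model (the functional form and parameter names are assumptions):

```python
def two_level_speedup(n_primary, n_secondary, serial_frac, comm_overhead):
    """Hypothetical speedup for two-level parallelism: n_primary
    problem partitions, each split over n_secondary data partitions.
    serial_frac is the unparallelizable fraction of normalized work;
    comm_overhead is the added cost per worker."""
    p = n_primary * n_secondary                       # total workers
    t_parallel = serial_frac + (1 - serial_frac) / p  # normalized runtime
    return 1.0 / (t_parallel + comm_overhead * p)
```

A model of this shape predicts the trade-off the paper investigates: speedup grows with p until the communication term dominates, which is where the network-of-workstations restrictions bite.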

Relevance: 40.00%

Abstract:

OBJECTIVE - To evaluate an algorithm guiding responses of continuous subcutaneous insulin infusion (CSII)-treated type 1 diabetic patients using real-time continuous glucose monitoring (RT-CGM). RESEARCH DESIGN AND METHODS - Sixty CSII-treated type 1 diabetic participants (aged 13-70 years, including adult and adolescent subgroups, with A1C ≤9.5%) were randomized in age-, sex-, and A1C-matched pairs. Phase 1 was an open 16-week multicenter randomized controlled trial. Group A was treated with CSII/RT-CGM with the algorithm, and group B was treated with CSII/RT-CGM without the algorithm. The primary outcome was the difference in time in target (4-10 mmol/l) glucose range on 6-day masked CGM. Secondary outcomes were differences in A1C, low (≤3.9 mmol/l) glucose CGM time, and glycemic variability. Phase 2 was the week 16-32 follow-up. Group A was returned to usual care, and group B was provided with the algorithm. Glycemia parameters were as above. Comparisons were made between baseline, 16 weeks, and 32 weeks. RESULTS - In phase 1, after withdrawals, 29 of 30 subjects remained in group A and 28 of 30 in group B. The change in target glucose time did not differ between groups. A1C fell (mean 7.9% [95% CI 7.7-8.2] to 7.6% [7.2-8.0]; P < 0.03) in group A but not in group B (7.8% [7.5-8.1] to 7.7% [7.3-8.0]; NS), with no difference between groups. More subjects in group A achieved A1C ≤7% than in group B (2 of 29 to 14 of 29 vs. 4 of 28 to 7 of 28; P = 0.015). In phase 2, one participant was lost from each group. In group A, A1C returned to baseline with RT-CGM discontinuation but did not change in group B, who continued RT-CGM with addition of the algorithm. CONCLUSIONS - Early but not late algorithm provision to type 1 diabetic patients using CSII/RT-CGM did not increase the target glucose time but increased achievement of A1C ≤7%. Upon RT-CGM cessation, A1C returned to baseline. © 2010 by the American Diabetes Association.

Relevance: 40.00%

Abstract:

Background: Clostridium difficile (C. difficile) is a leading cause of infectious diarrhoea in hospitals. Sending faecal samples for testing expedites diagnosis and appropriate treatment. Clinical suspicion of C. difficile based on patient history, signs and symptoms is the basis for sampling. Sending faecal samples from patients with diarrhoea ‘just in case’ the patient has C. difficile may be an indication of poor clinical management.

Aim: To evaluate the effectiveness of an intervention by an Infection Prevention and Control Team (IPCT) in reducing inappropriate faecal samples sent for C. difficile testing.

Method: An audit of the numbers of faecal samples sent before and after a decision-making algorithm was introduced. The number of samples received in the laboratory was counted retrospectively for 12-week periods before and after the algorithm was introduced.

Findings: There was a statistically significant reduction in the mean number of faecal samples sent after the algorithm was introduced. Results were compared with a similar intervention carried out in 2009, in which the same message was delivered by a memorandum; the memorandum had no effect on the overall number of weekly samples sent.

Conclusion: An algorithm intervention had an effect on the number of faecal samples being sent for C. difficile testing and thus contributed to the effective use of the laboratory service.

Relevance: 40.00%

Abstract:

This paper describes the results of an investigation which examined the efficacy of a feedback equalization algorithm incorporated into the Central Institute for the Deaf Wearable Digital Hearing Aid. The study examined whether the feedback equalization would allow for greater usable gains when subjects listened to soft speech signals, and if so, whether or not this would improve speech intelligibility.

Relevance: 40.00%

Abstract:

For an increasing number of applications, mesoscale modelling systems now aim to better represent urban areas. The complexity of processes resolved by urban parametrization schemes varies with the application. The concept of fitness-for-purpose is therefore critical for both the choice of parametrizations and the way in which the scheme should be evaluated. A systematic and objective model response analysis procedure (Multiobjective Shuffled Complex Evolution Metropolis (MOSCEM) algorithm) is used to assess the fitness of the single-layer urban canopy parametrization implemented in the Weather Research and Forecasting (WRF) model. The scheme is evaluated regarding its ability to simulate observed surface energy fluxes and the sensitivity to input parameters. Recent amendments are described, focussing on features which improve its applicability to numerical weather prediction, such as a reduced and physically more meaningful list of input parameters. The study shows a high sensitivity of the scheme to parameters characterizing roof properties in contrast to a low response to road-related ones. Problems in partitioning of energy between turbulent sensible and latent heat fluxes are also emphasized. Some initial guidelines to prioritize efforts to obtain urban land-cover class characteristics in WRF are provided. Copyright © 2010 Royal Meteorological Society and Crown Copyright.
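MOSCEM itself is not sketched here; as a much simpler illustration of the kind of parameter-sensitivity screening reported (e.g. roof- versus road-related parameters), a one-at-a-time normalized sensitivity index can be computed as follows (the model interface and parameter names are hypothetical):

```python
def sensitivity_index(model, params, name, delta=0.1):
    """One-at-a-time normalized sensitivity: relative change in a
    scalar model output per relative change in one input parameter.
    `model` maps a dict of parameters to a scalar (e.g. a surface
    energy flux); `name` is the parameter to perturb."""
    base = model(params)
    bumped = dict(params, **{name: params[name] * (1 + delta)})
    return ((model(bumped) - base) / base) / delta
```

A high index for roof parameters and a low one for road parameters would reproduce, in miniature, the contrast the study reports; MOSCEM explores the parameter space far more systematically than this local perturbation.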

Relevance: 40.00%

Abstract:

Hepatocellular carcinoma (HCC) is a primary tumor of the liver. After local therapies, tumor response is evaluated using the mRECIST criteria, which involve measuring the maximum diameter of the viable lesion. This paper describes a computational methodology to measure the maximum diameter of the tumor through the contrast-enhanced area of the lesions. Sixty-three computed tomography (CT) slices from 23 patients were assessed. Non-contrasted liver and typical HCC nodules were evaluated, and a virtual phantom was developed for this purpose. Detection and quantification by the algorithm were optimized using the virtual phantom. We then compared the algorithm's maximum-diameter measurements of the target lesions against radiologists' measurements. The computed maximum diameters are in good agreement with the radiologists' results, indicating that the algorithm properly detected the tumor limits. A comparison of the maximum diameter estimated by radiologists versus the algorithm revealed differences on the order of 0.25 cm for large tumors (diameter > 5 cm), whereas agreement to within 1.0 cm was found for small tumors. Differences between algorithm and radiologist measurements were small for small tumors, with a trend toward a slight increase for tumors greater than 5 cm. Therefore, traditional methods for measuring lesion diameter should be complemented with non-subjective measurement methods, which would allow a more accurate evaluation of the contrast-enhanced areas of HCC according to the mRECIST criteria.
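The paper's algorithm is not reproduced here; as a brute-force sketch of the measurement itself, once the enhancing region has been segmented into a binary mask, its maximum diameter can be taken as the largest pairwise pixel distance (segmentation and pixel spacing are assumed given):

```python
import numpy as np

def max_diameter(mask, pixel_mm=1.0):
    """Maximum diameter (largest pairwise distance, in mm) of the
    enhancing region in a 2-D binary mask. Brute force via
    broadcasting, which is fine for lesion-sized regions."""
    pts = np.argwhere(mask)  # (row, col) coordinates of lesion pixels
    if len(pts) < 2:
        return 0.0
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return float(d.max()) * pixel_mm
```

For large regions, restricting the pairwise search to the convex hull of the mask gives the same answer far more cheaply, since the maximum distance is always attained between hull vertices.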