852 results for kernel estimation
Abstract:
Background: Among nosocomial infections, methicillin-resistant Staphylococcus aureus (MRSA) is the most commonly identified pathogen in hospitals worldwide. The MRSA control strategy at the CHUV involves screening patients at risk. With culture-based screening, results take several days to arrive, which disrupts patient flow management, mainly because of the isolation measures required. To shorten this waiting time, the hospital is considering a rapid diagnostic method based on the polymerase chain reaction (PCR). Methods: Data on the screenings performed in three departments during 2007 were used. The number of isolation days was first determined per patient and per department. A cost analysis was then carried out to evaluate the cost difference between the two methods for each department. Results: The economic impact of the PCR method depends mainly on the number of isolation days avoided compared with the culture method. In the care wards, the analysis covered 192 screenings: when the difference amounts to two isolation days, screening costs fall by more than 12 kCHF and isolation time by 384 days. In the interdisciplinary emergency centre, over 96 screenings, the potential gain with the PCR method is 6 kCHF, with 192 fewer isolation days. In the adult intensive care unit, PCR is the most cost-effective screening method, with cost reductions between 4 kCHF and 20 kCHF and between 170 and 310 fewer isolation days. For the three departments analysed, the results show a favourable cost-effectiveness ratio for the PCR method whenever the reduction in isolation days exceeds 1.3 days. When the difference in isolation days is below 1.3, other parameters must be taken into account for PCR to remain the most attractive option: the cost of materials must exceed 45.5 CHF and the number of analyses per screening must stay below 3. Conclusions: The PCR method shows substantial potential benefits, both economic and organisational, that limit or reduce the constraints of the MRSA control strategy at the CHUV. [Author, p. 3]
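The reported 1.3-day break-even can be reproduced with a back-of-the-envelope calculation. The sketch below uses assumed unit costs chosen only so that they reproduce that threshold; they are placeholders, not the actual CHUV figures behind the analysis.

```python
# Break-even sketch: PCR pays off once the extra cost per screening
# is offset by avoided isolation days. Both constants are ASSUMED
# placeholders, picked to match the reported 1.3-day threshold.
ISOLATION_COST_PER_DAY = 300.0   # assumed CHF per isolation day
EXTRA_PCR_COST = 390.0           # assumed extra cost per PCR screening

days_saved = 2.0                 # isolation days avoided per screening
net_saving = days_saved * ISOLATION_COST_PER_DAY - EXTRA_PCR_COST
break_even = EXTRA_PCR_COST / ISOLATION_COST_PER_DAY

print(f"net saving per screening: {net_saving:.0f} CHF")
print(f"break-even at {break_even:.1f} isolation days avoided")
```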
Abstract:
[Table of contents] 1. Why take an interest in the inappropriate occupancy of acute-care beds at the CHUV? - 1.1. Current situation. - 1.1.1. The figures for the CHUV. - 1.1.2. The patient-flow management unit. - 1.1.3. The unit for patients awaiting placement. - 1.1.4. The shortage of beds in Vaud EMS nursing homes. - 1.1.5. The ageing of the Vaud population. - 1.2. National and international evidence. - 2. Cost estimation. - 2.1. Quantifiable costs. - 2.1.1. Direct financial loss. - 2.1.2. Costs of transfers due to overcrowding. - 2.1.3. Opportunity cost. - 2.2. Non-quantifiable costs. - 2.2.1. Patients. - 2.2.2. Medical staff. - 2.2.3. CHUV. - 3. Proposals. - 3.1. Alternative forms of care. - 3.1.1. Integrated service networks for the elderly. - 3.1.2. Short geriatric stays. - 3.1.3. Other solutions. - 3.2. Prevention. - 3.2.1. Fall prevention. - 3.2.2. Prevention through information for the elderly. - 3.2.3. Prevention through information for the population as a whole
Abstract:
In this paper a colour texture segmentation method that unifies region and boundary information is proposed. The algorithm uses a coarse detection of the perceptual (colour and texture) edges of the image to adequately place and initialise a set of active regions. The colour texture of a region is modelled by combining non-parametric kernel density estimation (which captures the colour behaviour) with classical co-occurrence matrix based texture features. Region information is thus defined, and accurate boundary information can be extracted to guide the segmentation process. Regions then compete concurrently for the image pixels in order to segment the whole image, taking both information sources into account. Furthermore, experimental results are shown that demonstrate the performance of the proposed method.
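As a rough illustration of the non-parametric colour model, the sketch below fits a kernel density estimate to the RGB values of one hypothetical region using `scipy.stats.gaussian_kde`; the co-occurrence texture features and the active-region competition of the paper are not reproduced here.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical example: RGB pixels of one active region, shape (N, 3).
rng = np.random.default_rng(0)
region_pixels = rng.normal(loc=[120, 80, 60], scale=10, size=(500, 3))

# gaussian_kde expects the data as (dimensions, samples).
colour_model = gaussian_kde(region_pixels.T)

# Density of new pixels under this region's colour model; competing
# regions would compare such densities pixel by pixel.
candidates = np.array([[118, 82, 58], [30, 200, 150]]).T
print(colour_model(candidates))  # higher density => better colour match
```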
Abstract:
In mathematical modeling the estimation of the model parameters is one of the most common problems. The goal is to seek parameters that fit the measurements as well as possible. There is always error in the measurements, which implies uncertainty in the model estimates. In Bayesian statistics all the unknown quantities are represented as probability distributions. If there is knowledge about the parameters beforehand, it can be formulated as a prior distribution. Bayes' rule combines the prior and the measurements into a posterior distribution. Mathematical models are typically nonlinear, so producing statistics for them requires efficient sampling algorithms. This thesis introduces the Metropolis-Hastings (MH) and Adaptive Metropolis (AM) algorithms as well as Gibbs sampling. Different ways to specify prior distributions are also presented. The main issue is measurement error estimation and how to obtain prior knowledge of the variance or covariance. Variance and covariance sampling is combined with the algorithms above. Example hyperprior models are applied to the estimation of model parameters and error in a case with outliers.
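A minimal random-walk Metropolis-Hastings sampler, sketched below for a toy Gaussian prior and likelihood, illustrates the sampling machinery discussed; the adaptive (AM) variant, Gibbs sampling, and the hyperprior variance sampling are beyond this sketch.

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_steps=5000, step=0.5, seed=0):
    """Random-walk Metropolis: the AM variant would tune `step` online."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    chain = []
    for _ in range(n_steps):
        prop = x + step * rng.normal(size=x.shape)   # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:      # accept/reject
            x, lp = prop, lp_prop
        chain.append(x.copy())
    return np.array(chain)

# Toy posterior via Bayes' rule: log posterior = log prior
# + log likelihood + const, with invented data and variances.
data = np.array([1.2, 0.8, 1.1])
log_post = lambda th: -0.5 * th[0]**2 - 0.5 * np.sum((data - th[0])**2 / 0.1)

samples = metropolis_hastings(log_post, x0=[0.0])
print(samples.mean(), samples.std())  # posterior mean and spread
```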
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis: we examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
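The Internal Model idea can be sketched as follows: simulate the one-year result per line of business under a copula, then read the capital requirement off the simulated distribution. The correlation matrix, lognormal marginals, and the 99.5% VaR shortcut below are illustrative assumptions, not the Spanish market data or the paper's exact model.

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(1)
corr = np.array([[1.0, 0.3],
                 [0.3, 1.0]])                 # assumed LoB correlation
L = np.linalg.cholesky(corr)

z = rng.normal(size=(100_000, 2)) @ L.T       # correlated normals
u = norm.cdf(z)                               # Gaussian copula uniforms

# Assumed lognormal loss marginals for the two lines of business.
losses = np.column_stack([
    lognorm.ppf(u[:, 0], s=0.4, scale=10.0),
    lognorm.ppf(u[:, 1], s=0.6, scale=6.0),
])
total = losses.sum(axis=1)

# SCR approximated as the 99.5% quantile over the expected result.
scr = np.quantile(total, 0.995) - total.mean()
print(f"simulated SCR: {scr:.2f}")
```

Swapping the Gaussian copula for, say, a t copula with the same marginals changes the tail dependence and hence this quantile, which is exactly the sensitivity the paper investigates.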
Abstract:
Learning of preference relations has recently received significant attention in the machine learning community. It is closely related to classification and regression analysis and can be reduced to these tasks. However, preference learning involves prediction of an ordering of the data points rather than prediction of a single numerical value, as in regression, or a class label, as in classification. Therefore, studying preference relations within a separate framework not only facilitates better theoretical understanding of the problem, but also motivates the development of efficient algorithms for the task. Preference learning has many applications in domains such as information retrieval, bioinformatics, and natural language processing. For example, algorithms that learn to rank are frequently used in search engines for ordering documents retrieved by a query. Preference learning methods have also been applied to collaborative filtering problems for predicting individual customer choices from the vast amount of user-generated feedback. In this thesis we propose several algorithms for learning preference relations. These algorithms stem from the well-founded and robust class of regularized least-squares methods and have many attractive computational properties. In order to improve the performance of our methods, we introduce several non-linear kernel functions. Thus, the contribution of this thesis is twofold: kernel functions for structured data that take advantage of various non-vectorial data representations, and preference learning algorithms that are suitable for different tasks, namely efficient learning of preference relations, learning with large amounts of training data, and semi-supervised preference learning. The proposed kernel-based algorithms and kernels are applied to the parse ranking task in natural language processing, document ranking in information retrieval, and remote homology detection in bioinformatics. Training of kernel-based ranking algorithms can be infeasible when the training set is large. This problem is addressed by proposing a preference learning algorithm whose computational complexity scales linearly with the number of training data points. We also introduce a sparse approximation of the algorithm that can be efficiently trained with large amounts of data. For situations where only a small amount of labeled data but a large amount of unlabeled data is available, we propose a co-regularized preference learning algorithm. To conclude, the methods presented in this thesis address not only the problem of efficient training but also fast regularization parameter selection, multiple output prediction, and cross-validation. Furthermore, the proposed algorithms lead to notably better performance in many of the preference learning tasks considered.
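As a toy illustration of the regularized least-squares view of preference learning, the sketch below fits a linear scoring function from pairwise preferences by ridge regression on difference vectors. It is a simplification for intuition only, not the thesis' exact algorithms or its structured-data kernels.

```python
import numpy as np

def fit_preferences(X, pairs, lam=1.0):
    """Least-squares preference learner (linear kernel, sketch only).
    `pairs` lists (i, j) meaning item i is preferred over item j;
    we fit w so that w.(x_i - x_j) ~ 1 via regularized least squares."""
    D = np.array([X[i] - X[j] for i, j in pairs])
    t = np.ones(len(pairs))
    return np.linalg.solve(D.T @ D + lam * np.eye(X.shape[1]), D.T @ t)

# Toy data: six items with random features and a chain of preferences.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))
pairs = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]

w = fit_preferences(X, pairs)
print(np.argsort(-(X @ w)))  # ranking induced by the learned scores
```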
Abstract:
Several models for the estimation of thermodynamic properties of layered double hydroxides (LDHs) are presented. The thermodynamic quantities predicted by the proposed models agree with experimental thermodynamic data. A thermodynamic study of the anion exchange process on LDHs is also carried out using the described models. Tables for the prediction of monovalent anion exchange selectivities on LDHs are provided. Reasonable agreement is found between the predicted and the experimental monovalent anion exchange selectivities.
Abstract:
Variations in water volume in small depressions in Mediterranean salt marshes in Girona (Spain) are described and the potential causes for these variations analysed. Although the basins appear to be endorheic, groundwater circulation is intense, as estimated from the difference between the water volume observed and that expected from the precipitation/evaporation balance. The rate of variation in volume (V_R = ΔV / (V·Δt)) may be used to estimate groundwater supply ('circulation'), since direct measurements of this parameter are impossible. Volume-conductivity figures can also be used to estimate the quantity of circulation and to investigate the origin of water supplied to the system. The relationships between variations in the volume of water in the basins and the main causes of flooding are also analysed. Sea storms, rainfall levels and strong, dry northerly winds are suggested as the main causes of the variations in the volumes of the basins. The relative importance assigned to these factors has changed following the recent regulation of freshwater flows entering the system.
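For clarity, the rate of variation, reconstructed on the assumption that the garbled "AV / VAt" in the source stands for the relative volume change per unit time, reads:

```latex
% Rate of variation in water volume: relative change per unit time.
\[
  V_R \;=\; \frac{\Delta V}{V \,\Delta t}
\]
```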
Abstract:
During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. When detailed information is not available, as was the case for the region of Catalonia (Spain), one may derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth the data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain Catalan estimates. Results: We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. On the other hand, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
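The hazard-ratio step can be sketched as a Poisson regression on event counts with person-time as an offset. The counts below are invented placeholders, and the cubic-spline smoothing of the rates is omitted; this is a sketch of the modelling idea, not the paper's full method.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical event counts and person-years for two regions over
# three follow-up intervals (Catalonia indicator = 1, USA = 0).
deaths    = np.array([30, 25, 22, 55, 40, 33])
pyears    = np.array([1000, 1100, 1150, 2500, 2600, 2700])
catalonia = np.array([1, 1, 1, 0, 0, 0])   # region indicator
interval  = np.array([0, 1, 2, 0, 1, 2])   # time as a covariate

X = sm.add_constant(np.column_stack([catalonia, interval]))
fit = sm.GLM(deaths, X, family=sm.families.Poisson(),
             offset=np.log(pyears)).fit()

# exp(coefficient of the region indicator) = hazard (rate) ratio,
# which would then rescale the US age/stage-specific hazard rates.
print(np.exp(fit.params[1]))
```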
Abstract:
Cost estimation is an important but challenging process when designing a new product or one of its features, verifying the product prices given by suppliers, or planning cost-saving actions for existing products. It is even more challenging when the product is highly modular rather than a bulk product. In general, cost estimation techniques can be divided into two main groups, qualitative and quantitative techniques, which can further be classified into more detailed methods. Generally, qualitative techniques are preferable when comparing alternatives and quantitative techniques when cost relationships can be found. The main objective of this thesis was to develop a method for estimating the costs of internally manufactured and commercial elevator landing doors. Because of the challenging product structure, the proposed cost estimation framework is developed on three different levels based on the past cost information available. The framework combines features from both qualitative and quantitative cost estimation techniques. The starting point for the whole cost estimation process is an unambiguous, hierarchical product structure, so that the product can be divided into controllable parts and is then easier to handle. Those controllable parts can then be compared with existing cost knowledge of similar parts, yielding cost estimates that are as accurate as possible.
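A minimal sketch of this roll-up idea: traverse a hierarchical product structure, pricing leaf parts from past cost data where available and from a stand-in analogy estimate otherwise. The structure, part names and figures below are invented for illustration, not the thesis' actual framework or data.

```python
# Known per-part costs from past data (assumed figures, in CHF).
past_costs = {"door panel": 120.0, "frame": 80.0}

# Hypothetical hierarchical product structure: children of each part.
landing_door = {
    "door panel": {},                               # leaf with past data
    "frame": {},                                    # leaf with past data
    "locking device": {"latch": {}, "sensor": {}},  # no past data
}

def estimate(part, children, analogy_default=50.0):
    """Leaf parts: past cost if known, else an analogy estimate.
    Assemblies: sum of their children's estimates."""
    if not children:
        return past_costs.get(part, analogy_default)
    return sum(estimate(p, c) for p, c in children.items())

total = sum(estimate(p, c) for p, c in landing_door.items())
print(f"estimated door cost: {total} CHF")
```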
Abstract:
Sensor-based robot control allows manipulation in dynamic environments with uncertainties. Vision is a versatile, low-cost sensory modality, but its low sample rate, high sensor delay and uncertain measurements limit its usability, especially in strongly dynamic environments. Force is a complementary sensory modality allowing accurate measurements of local object shape when a tooltip is in contact with the object. In multimodal sensor fusion, several sensors measuring different modalities are combined to give a more accurate estimate of the environment. As force and vision are fundamentally different sensory modalities that do not share a common representation, combining the information from these sensors is not straightforward. In this thesis, methods for fusing proprioception, force and vision are proposed. By making assumptions about the object shape and modeling the uncertainties of the sensors, the measurements can be fused in an extended Kalman filter. The fusion of force and visual measurements makes it possible to estimate the pose of a moving target with a moving end-effector-mounted camera at high rate and accuracy. The proposed approach takes the latency of the vision system into account explicitly to provide high-sample-rate estimates. The estimates also allow a smooth transition from vision-based motion control to force control. The velocity of the end-effector can be controlled by estimating the distance to the target by vision and determining the velocity profile that gives rapid approach and minimal force overshoot. Experiments with a 5-degree-of-freedom parallel hydraulic manipulator and a 6-degree-of-freedom serial manipulator show that integrating several sensor modalities can increase the accuracy of the measurements significantly.
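The fusion idea can be illustrated with a minimal one-dimensional Kalman filter in which slow, noisy vision and fast, accurate force measurements correct the same predicted state. The thesis uses a full extended Kalman filter over pose with explicit vision-latency handling; this sketch, with invented noise figures, only shows how differing sensor variances weight the fused estimate.

```python
# 1-D Kalman fusion sketch: one scalar target position corrected by
# two sensors with different (assumed) measurement variances.
x, P = 0.0, 1.0                 # state estimate and its variance
Q = 0.01                        # assumed process noise (target motion)
R_VISION, R_FORCE = 0.25, 0.01  # assumed sensor variances

def predict(x, P):
    return x, P + Q             # constant-position motion model

def update(x, P, z, r):
    K = P / (P + r)             # Kalman gain: low-noise sensors weigh more
    return x + K * (z - x), (1 - K) * P

x, P = predict(x, P)
x, P = update(x, P, z=0.90, r=R_VISION)  # slow, noisy camera fix
x, P = update(x, P, z=1.02, r=R_FORCE)   # accurate tooltip contact
print(x, P)  # estimate dominated by the low-variance force reading
```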
Abstract:
Background: Assessing the costs of treating disease is necessary to demonstrate cost-effectiveness and to estimate the budget impact of new interventions and therapeutic innovations. However, there are few comprehensive studies on resource use and costs associated with lung cancer patients in clinical practice in Spain or internationally. The aim of this paper was to assess the hospital cost associated with lung cancer diagnosis and treatment by histology, type of cost and stage at diagnosis in the Spanish National Health Service. Methods: A retrospective, descriptive analysis of resource use and a direct medical cost analysis were performed. Resource utilisation data were collected from patient files at nine teaching hospitals. From a hospital budget impact perspective, the aggregate and mean costs per patient were calculated over the first three years following diagnosis or up to death. Both aggregate and mean costs per patient were analysed by histology, stage at diagnosis and cost type. Results: A total of 232 cases of lung cancer were analysed, of which 74.1% corresponded to non-small cell lung cancer (NSCLC) and 11.2% to small cell lung cancer (SCLC); 14.7% had no cytohistologic confirmation. The mean cost per patient in NSCLC ranged from 13,218 Euros in Stage III to 16,120 Euros in Stage II. The main cost components were chemotherapy (29.5%) and surgery (22.8%). Advanced disease stages were associated with a decrease in the relative weight of surgical and inpatient care costs but an increase in chemotherapy costs. In SCLC patients, the mean cost per patient was 15,418 Euros for limited disease and 12,482 Euros for extensive disease. The main cost components were chemotherapy (36.1%) and other inpatient costs (28.7%). In both groups, the Kruskal-Wallis test did not show statistically significant differences in mean cost per patient between stages. Conclusions: This study provides the costs of lung cancer treatment based on patient file reviews, with chemotherapy and surgery accounting for the major components of costs. This cost analysis is a baseline study that will provide a useful source of information for future studies on cost-effectiveness and on the budget impact of different therapeutic innovations in Spain.
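The between-stage comparison reported above can be reproduced in outline with `scipy.stats.kruskal`; the per-patient cost samples below are invented placeholders, not the study data.

```python
from scipy.stats import kruskal

# Illustrative per-patient cost samples (Euros) by NSCLC stage;
# figures are invented, not taken from the study.
stage_i   = [11800, 14200, 13500, 15100]
stage_ii  = [16500, 15800, 16300]
stage_iii = [12900, 13600, 13100, 13400]

stat, p = kruskal(stage_i, stage_ii, stage_iii)
print(f"H = {stat:.2f}, p = {p:.3f}")  # p > 0.05 => no significant difference
```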
Estimation of surface area and pore volume of activated carbons by methylene blue and iodine numbers
Abstract:
Data on the methylene blue number and iodine number of activated carbon samples were calibrated against the respective surface area, micropore volume and total pore volume using multiple regression. The models obtained from the calibrations were used to predict these physical properties for a test group of activated carbon samples produced from several raw materials. In all cases, the predicted values were in good agreement with the expected values. The method allows more information to be extracted from methylene blue and iodine adsorption studies than is normally obtained for this type of material.
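The calibration step amounts to an ordinary multiple regression of a measured property on the two adsorption numbers. The sketch below fits such a model for surface area by least squares; the data are invented placeholders, not the paper's measurements, and the paper fits analogous models for micropore and total pore volume.

```python
import numpy as np

# Invented calibration data: adsorption numbers vs BET surface area.
mb_number = np.array([180.0, 220.0, 150.0, 260.0, 200.0])   # mg/g
i_number  = np.array([850.0, 1000.0, 700.0, 1100.0, 900.0])  # mg/g
area      = np.array([750.0, 950.0, 600.0, 1080.0, 840.0])   # m^2/g

# Multiple linear regression: area ~ b0 + b1*MB + b2*iodine.
X = np.column_stack([np.ones_like(mb_number), mb_number, i_number])
coef, *_ = np.linalg.lstsq(X, area, rcond=None)

# Predict the surface area of a new sample from its two numbers.
new_sample = np.array([1.0, 210.0, 950.0])
print(f"predicted surface area: {new_sample @ coef:.0f} m^2/g")
```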