118 results for subset sum problems


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: In most emergency departments (EDs) in developed countries, a subset of patients visits the ED frequently. Despite their small numbers, these patients are the source of a disproportionately high number of all ED visits and use a significant proportion of healthcare resources, placing a heavy economic burden on hospital and healthcare system budgets. Several interventions have been carried out to improve the management of these frequent ED users. Case management has been shown in some North American studies to reduce ED utilization and costs. In these studies, cost analyses were carried out from the hospital perspective without examining the costs induced by healthcare consumed in the community. However, case management might reduce ED visits and costs from the hospital's perspective yet induce substitution effects and increase health service utilization outside the hospital. This study examined whether an interdisciplinary case-management intervention, compared to standard ED care, reduced costs generated by frequent ED users not only from the hospital perspective but also from the healthcare system perspective, that is, from a broader perspective taking into account the costs of healthcare services used outside the hospital. METHODS: In this randomized controlled trial, 250 adult frequent ED users (5 or more visits during the previous 12 months) who visited the ED of the University Hospital of Lausanne, Switzerland, between May 2012 and July 2013 were allocated to one of two groups, case management intervention (CM) or standard ED care (SC), and followed up for 12 months. Costs were evaluated differently depending on the perspective of the analysis. For the analysis from the hospital's perspective, the true value of the resources used to provide services was used as the cost estimate. These data were obtained from the hospital's analytical accounting system.
For the analysis from the healthcare system perspective, all healthcare services consumed by users and charged for were used as the cost estimate. These data were obtained from health insurance providers for a subsample of participants. To allow comparisons over the same time period, individual monthly average costs were calculated. Multivariate linear models including a fixed effect "group" were run using socio-demographic characteristics and health-related variables as controlling variables (age, gender, educational level, citizenship, marital status, somatic and mental health problems, and risk behaviors).
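The cost comparison described above amounts to an ordinary least-squares model with a binary "group" fixed effect and covariate adjustment. The following is a minimal sketch on simulated data; the cost values, the effect size and the reduced covariate set (age, gender only) are invented for illustration and are not results from the trial:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 250  # trial size, as in the abstract

# Hypothetical data: group indicator (1 = case management, 0 = standard care),
# two illustrative controls, and individual monthly average costs.
group = rng.integers(0, 2, n)
age = rng.normal(45, 12, n)
gender = rng.integers(0, 2, n)
cost = 3000 - 400 * group + 10 * age + rng.normal(0, 500, n)

# Design matrix: intercept, fixed effect "group", controlling variables
X = np.column_stack([np.ones(n), group, age, gender])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)

print(f"estimated group effect: {beta[1]:.1f} CHF/month")
```

The coefficient on `group` estimates the adjusted difference in monthly costs between the two arms; in the study this model was run separately for each cost perspective.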


BACKGROUND: In 2008, the Swiss Civil Code was amended. Since 1 January 2013, each Swiss canton may propose specific provisions for involuntary outpatient treatment (community treatment orders, CTOs) for individuals with mental disorders. AIM: This review catalogues the legal provisions of the various Swiss cantons for CTOs and outlines the differences between them, setting these in the context of variations in clinical provision between the cantons. METHODS: Databases were searched for relevant publications about CTOs in Switzerland. The Swiss Medical Association, the Swiss Federal Statistical Office, the Swiss Health Observatory and all 26 cantonal medical officers were contacted to complete the information. The Conférence des cantons en matière de protection des mineurs et des adultes (COPMA), the authority which monitors guardianship legislation, and Pro Mente Sana, a patients' rights association, were also approached. RESULTS: Three articles about CTOs in Switzerland were identified. Psychiatric provision varies considerably between cantons, and only a few cantons could provide complete or even partial figures for rates of compulsion in previous years. Prior to 2013, only 6 of the 20 cantons for which information was returned had any provision for CTOs. Now every canton has some form of legal basis, but the level of detail is often limited. In eight cantons, the powers of the measure (for example, use of medication) are not specified. In 12 cantons, the maximum duration of the CTO is not specified. German-speaking cantons and rural cantons are more likely to specify the details of CTOs. CONCLUSION: Highly variable Swiss provision for CTOs is being introduced despite the absence of convincing international evidence for their effectiveness or of good-quality data on current coercive practice. Careful monitoring and assessment of these new cantonal provisions are essential.


BACKGROUND: Hallux valgus is one of the most common forefoot problems in females. Previous studies have examined gait alterations due to hallux valgus deformity by assessing temporal, kinematic or plantar pressure parameters individually. The present study, however, aims to assess all of these parameters at once and to isolate the most clinically relevant gait parameters for moderate to severe hallux valgus deformity, with the intent of improving post-operative patient prognosis and rehabilitation. METHODS: The study included 26 feet with moderate to severe hallux valgus deformity and 30 feet with no sign of hallux valgus in female participants. Initially, weight-bearing radiographs and foot and ankle clinical scores were assessed. Gait assessment was then performed using pressure insoles (PEDAR®) and inertial sensors (Physilog®), and the two groups were compared using a non-parametric statistical hypothesis test (Wilcoxon rank sum, P<0.05). Furthermore, forward stepwise regression was used to reduce the number of gait parameters to the most clinically relevant, and the correlation of these parameters with the clinical score was assessed. FINDINGS: Overall, the results showed clear deterioration in several gait parameters in the hallux valgus group compared to controls, and nine gait parameters (effect sizes between 1.03 and 1.76) were isolated that best describe the altered gait in hallux valgus deformity (r² = 0.71) and showed good correlation with clinical scores. INTERPRETATION: Our results, and the nine listed parameters, could serve as a benchmark for the characterization of hallux valgus and the objective evaluation of treatment efficacy.
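The group comparison above relies on the Wilcoxon rank-sum test. A self-contained sketch of the test's normal approximation (without tie correction) is shown below; the stance-time values and all names are made up for illustration and are not the study's measurements:

```python
import numpy as np
from math import erf, sqrt

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.
    Assumes no ties; no continuity or tie correction is applied."""
    data = np.concatenate([x, y])
    order = data.argsort()
    ranks = np.empty(len(data))
    ranks[order] = np.arange(1, len(data) + 1)
    w = ranks[:len(x)].sum()                  # rank sum of the first sample
    n1, n2 = len(x), len(y)
    mu = n1 * (n1 + n2 + 1) / 2               # mean of W under H0
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical stance times (s) for hallux valgus vs control feet
hv = np.array([0.78, 0.81, 0.80, 0.84, 0.79, 0.83])
ctrl = np.array([0.70, 0.72, 0.69, 0.73, 0.71, 0.68])
print(f"p = {rank_sum_p(hv, ctrl):.4f}")
```

In practice a library routine such as `scipy.stats.ranksums` would be preferred, since it also handles ties and small samples more carefully.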


Our consumption of groundwater, in particular as drinking water or for irrigation, has increased considerably over the years. Many problems have appeared as a result, ranging from the prospection of new resources to the remediation of polluted aquifers. Independently of the hydrogeological problem considered, the main challenge remains the characterization of the subsurface properties. A stochastic approach is therefore necessary to represent this uncertainty, by considering multiple geological scenarios and generating a large number of geostatistical realizations. Here we encounter the main limitation of these approaches: the computational cost of simulating complex flow processes for each of these realizations. In the first part of the thesis, this problem is investigated in the context of uncertainty propagation, where an ensemble of realizations is identified as representing the subsurface properties. To propagate this uncertainty to the quantity of interest while limiting the computational cost, current methods rely on approximate flow models. This allows a subset of realizations representing the variability of the initial ensemble to be identified. The complex flow model is then evaluated only for this subset, and inference is made on the basis of these complex responses. Our objective is to improve the performance of this approach by using all the available information. To this end, the subset of approximate and exact responses is used to construct an error model, which then serves to correct the remaining approximate responses and to predict the response of the complex model. This method maximizes the use of the available information without any perceptible increase in computation time, making uncertainty propagation more accurate and more robust.
The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the approximate and the complex flow models. In the second part of the thesis, this methodology is formalized mathematically by introducing a regression model between the functional responses. As this problem is ill-posed, its dimensionality must be reduced. In this respect, the novelty of the work presented comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also makes it possible to diagnose the quality of the error model in this functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid, and the results show that the error model allows a strong reduction in computation time while correctly estimating the uncertainty. Moreover, for each approximate response, a prediction of the complex response is provided by the error model. The concept of a functional error model is therefore relevant not only for uncertainty propagation, but also for Bayesian inference problems. Markov chain Monte Carlo (MCMC) methods are the algorithms most commonly used to generate geostatistical realizations in agreement with the observations. However, these methods suffer from a very low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. A two-step approach, two-stage MCMC, was introduced to avoid unnecessary simulations of the complex model through a preliminary evaluation of each proposal. In the third part of the thesis, the approximate flow model coupled with an error model serves as the preliminary evaluation for two-stage MCMC.
We demonstrate an increase in the acceptance rate by a factor of 1.5 to 3 compared with a classical MCMC implementation. One question remains open: how should the size of the training set be chosen, and how can the realizations that optimize the construction of the error model be identified? This requires an iterative strategy so that, with each new flow simulation, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, where the methodology is applied to a saltwater intrusion problem in a coastal aquifer. -- Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization.
Due to computational constraints, state-of-the-art methods make use of approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine learning approach. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning from a subset of realizations the relationship between proxy and exact curves. In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem by a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty.
The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, to perform Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor of three with respect to one-stage MCMC results. An open question remains: how do we choose the size of the learning set and identify the realizations that optimize the construction of the error model? This requires devising an iterative strategy to construct the error model, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
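The proxy-correction idea described in this abstract can be sketched numerically: run the exact model on a small training subset, reduce the proxy curves with (functional) PCA, and regress the exact curves on the proxy scores so that every remaining proxy response can be corrected. The following is a minimal illustration on synthetic decay curves, not the thesis code; the curve shapes, the proxy bias and all parameter values are invented for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_real, n_train, n_t = 200, 30, 50      # realizations, exact runs, time steps
t = np.linspace(0.0, 1.0, n_t)

# Synthetic ensemble: "exact" responses and cheap "proxy" responses that
# share structure but carry a systematic bias (standing in for a coarse
# flow model versus the full one).
amp = rng.uniform(0.5, 1.5, n_real)
exact = amp[:, None] * np.exp(-3.0 * t) + 0.02 * rng.normal(size=(n_real, n_t))
proxy = 0.75 * amp[:, None] * np.exp(-2.0 * t)

train = np.arange(n_train)               # subset where the exact model was run

# Functional PCA of the proxy curves: centered SVD, keep a few components
mean_p = proxy.mean(axis=0)
_, _, Vt = np.linalg.svd(proxy - mean_p, full_matrices=False)
k = 3
scores = (proxy - mean_p) @ Vt[:k].T     # FPCA scores of every proxy curve

# Error model: regress the exact curves on proxy scores, training subset only
A_train = np.column_stack([np.ones(n_train), scores[train]])
coef, *_ = np.linalg.lstsq(A_train, exact[train], rcond=None)

# Predict the exact response for ALL realizations from their proxy scores
A_all = np.column_stack([np.ones(n_real), scores])
predicted = A_all @ coef

rmse_proxy = np.sqrt(np.mean((proxy - exact) ** 2))
rmse_model = np.sqrt(np.mean((predicted - exact) ** 2))
print(f"RMSE raw proxy: {rmse_proxy:.3f}, corrected: {rmse_model:.3f}")
```

The corrected curves can then stand in for the exact model, either for uncertainty propagation or as the cheap first stage of a two-stage MCMC.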


Tumor antigen-specific CD4(+) T cells generally orchestrate and regulate immune cells to provide immune surveillance against malignancy. However, activation of antigen-specific CD4(+) T cells is restricted at local tumor sites where antigen-presenting cells (APCs) are frequently dysfunctional, which can cause rapid exhaustion of anti-tumor immune responses. Herein, we characterize anti-tumor effects of a unique human CD4(+) helper T-cell subset that directly recognizes the cytoplasmic tumor antigen, NY-ESO-1, presented by MHC class II on cancer cells. Upon direct recognition of cancer cells, tumor-recognizing CD4(+) T cells (TR-CD4) potently induced IFN-γ-dependent growth arrest in cancer cells. In addition, direct recognition of cancer cells triggers TR-CD4 to provide help to NY-ESO-1-specific CD8(+) T cells by enhancing cytotoxic activity, and improving viability and proliferation in the absence of APCs. Notably, the TR-CD4 either alone or in collaboration with CD8(+) T cells significantly inhibited tumor growth in vivo in a xenograft model. Finally, retroviral gene-engineering with T cell receptor (TCR) derived from TR-CD4 produced large numbers of functional TR-CD4. These observations provide mechanistic insights into the role of TR-CD4 in tumor immunity, and suggest that approaches to utilize TR-CD4 will augment anti-tumor immune responses for durable therapeutic efficacy in cancer patients.


This review presents the evolution of steroid analytical techniques, including gas chromatography coupled to mass spectrometry (GC-MS), immunoassay (IA) and targeted liquid chromatography coupled to mass spectrometry (LC-MS), and it evaluates the potential of extended steroid profiles by a metabolomics-based approach, namely steroidomics. Steroids regulate essential biological functions including growth and reproduction, and perturbations of the steroid homeostasis can generate serious physiological issues; therefore, specific and sensitive methods have been developed to measure steroid concentrations. GC-MS measuring several steroids simultaneously was considered the first historical standard method for analysis. Steroids were then quantified by immunoassay, allowing a higher throughput; however, major drawbacks included the measurement of a single compound instead of a panel and cross-reactivity reactions. Targeted LC-MS methods with selected reaction monitoring (SRM) were then introduced for quantifying a small steroid subset without the problems of cross-reactivity. The next step was the integration of metabolomic approaches in the context of steroid analyses. As metabolomics tends to identify and quantify all the metabolites (i.e., the metabolome) in a specific system, appropriate strategies were proposed for discovering new biomarkers. Steroidomics, defined as the untargeted analysis of the steroid content in a sample, was implemented in several fields, including doping analysis, clinical studies, in vivo or in vitro toxicology assays, and more. This review discusses the current analytical methods for assessing steroid changes and compares them to steroidomics. Steroids, their pathways, their implications in diseases and the biological matrices in which they are analysed will first be described. Then, the different analytical strategies will be presented with a focus on their ability to obtain relevant information on the steroid pattern. 
The future technical requirements for improving steroid analysis will also be presented.


People living in precarious conditions are likely to develop a wide range of medical problems, and skin conditions are very common among them. We present four clinical situations of skin problems frequently found in vulnerable populations. Their early diagnosis and management are essential to prevent subsequent spread in crowded living conditions. People living in poor conditions are at high risk of developing various medical diseases, of which dermatological diseases are very common. We present 4 clinical cases of skin diseases that are the most prevalent amongst the majority of socially and economically vulnerable patients. Early diagnosis and appropriate treatment are of paramount importance in order to avoid their spread in the close-knit communities where these patients often live.


Delirium is an acute disorder of attention and cognition seen relatively commonly in people aged 65 years or older. The prevalence is estimated to be between 11 and 42 per cent for elderly patients on medical wards, and it is also high in nursing homes and long-term care (LTC) facilities. The consequences of delirium can be significant, including increased in-hospital mortality, long-term cognitive decline, loss of autonomy and an increased risk of institutionalization. Despite being a common condition, it remains under-recognised, poorly understood and inadequately managed. Advanced age and dementia are the most important risk factors; pain, dehydration, infections, stroke, metabolic disturbances and surgery are the most common triggering factors. Delirium is preventable in a large proportion of cases, and it is therefore also important from a public health perspective to implement interventions that reduce further complications and the substantial costs associated with them. Since the aetiology is in most cases multifactorial, a multi-component approach to management, both pharmacological and non-pharmacological, should be considered. Detection and treatment of triggering causes must have high priority in cases of delirium. The aim of this review is to highlight the importance of delirium in the elderly population, given the increasing number of ageing people and their increasing age.


The study aimed to identify different patterns of gambling activities (PGAs) and to investigate how PGAs differed in gambling problems, substance use outcomes, personality traits and coping strategies. A representative sample of 4989 young Swiss males completed a questionnaire assessing seven distinct gambling activities, gambling problems, substance use outcomes, personality traits and coping strategies. PGAs were identified using latent class analysis (LCA). Differences between PGAs in gambling and substance use outcomes, personality traits and coping strategies were tested. LCA identified six different PGAs. With regard to gambling and substance use outcomes, the three most problematic PGAs were extensive gamblers, followed by private gamblers, and electronic lottery and casino gamblers, respectively. By contrast, the three least detrimental PGAs were rare or non-gamblers, lottery only gamblers and casino gamblers. With regard to personality traits, compared with rare or non-gamblers, private and casino gamblers reported higher levels of sensation seeking. Electronic lottery and casino gamblers, private gamblers and extensive gamblers had higher levels of aggression-hostility. Extensive and casino gamblers reported higher levels of sociability, whereas casino gamblers reported lower levels of anxiety-neuroticism. Extensive gamblers used more maladaptive and less adaptive coping strategies than other groups. Results suggest that gambling is not a homogeneous activity since different types of gamblers exist according to the PGA they are engaged in. Extensive gamblers, electronic and casino gamblers and private gamblers may have the most problematic PGAs. Personality traits and coping skills may predispose individuals to PGAs associated with more or less negative outcomes.
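Latent class analysis of the kind used above fits a finite mixture of independent yes/no indicators, typically by expectation-maximization. A compact sketch on simulated data follows; the two simulated classes, the activity probabilities and the sample size are invented for illustration and do not reproduce the study's six-class solution:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical yes/no indicators for 7 gambling activities (as in the survey),
# simulated from two latent classes: frequent and rare gamblers.
n, n_act, k = 1000, 7, 2
true_class = rng.integers(0, 2, n)
p_true = np.where(true_class[:, None] == 0, 0.8, 0.1)   # activity probability
X = (rng.random((n, n_act)) < p_true).astype(float)

# EM for a latent class (Bernoulli mixture) model
pi = np.full(k, 1 / k)                      # class weights
theta = rng.uniform(0.3, 0.7, (k, n_act))   # per-class activity probabilities
for _ in range(100):
    # E-step: class responsibilities from the per-class log-likelihood
    log_p = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T + np.log(pi)
    log_p -= log_p.max(axis=1, keepdims=True)
    resp = np.exp(log_p)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update class weights and activity probabilities
    nk = resp.sum(axis=0)
    pi = nk / n
    theta = np.clip(resp.T @ X / nk[:, None], 1e-6, 1 - 1e-6)

print("class sizes:", np.round(pi, 2))
print("activity profiles:", np.round(theta, 2))
```

In a real analysis the number of classes would be chosen by fitting several candidate models and comparing fit indices, which is how LCA solutions such as the six PGAs above are usually selected.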


Early warning systems (EWSs) rely on the capacity to forecast a dangerous event with a certain lead time by defining warning criteria on which the safety of the population will depend. Monitoring of landslides is being facilitated by new technologies, decreasing prices and easier data processing. At the same time, predicting the onset of a rapid failure, or the sudden transition from slow to rapid failure and subsequent collapse, and its consequences is challenging for scientists, who must deal with uncertainties and have limited tools to do so. Furthermore, EWSs and warning criteria are increasingly a subject of concern among technical experts, researchers, stakeholders and the decision makers responsible for the activation, enforcement and approval of civil protection actions. EWSs also imply a sharing of responsibilities that is often avoided by technical staff, managers of technical offices and governing institutions. We organized the First International Workshop on Warning Criteria for Active Slides (IWWCAS) to promote sharing and networking among members of specialized institutions and relevant experts on EWSs. In this paper, we summarize the event to stimulate discussion and collaboration between organizations dealing with the complex task of managing the hazard and risk related to active slides.