104 results for over-generalization and under-generalization problems

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

The project of articulating a theological ethics on the basis of liturgical anthropology is bound to fail if the necessary consequence is that one has to quit the forum of critical modern rationality. The risk of Engelhardt's approach is to limit rationality to a narrow vision of reason. Sin is not to be understood as the negation of human holiness, but as the negation of divine holiness. The only way to renew theological ethics is to understand sin as the anthropological and ethical expression of the biblical message of justification by faith alone. Sin is therefore a secondary category, which can only be interpreted in light of the positive manifestation of liberation, justification, and grace. The central issue of Christian ethics is not ritual purity or morality, but the experience, confession and recognition of our own injustice in our dealings with God and men.

Relevance:

100.00%

Publisher:

Abstract:

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other hand, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
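
As a toy, non-cryptographic sketch of the classical trusted-third-party (TTP) approach recalled in the abstract: the TTP collects each participant's item and forwards the whole set only when everyone has deposited, so either all participants obtain an item or no one learns anything. Class and method names are illustrative, not taken from the protocol described in the thesis.

# Toy sketch of fair exchange via a trusted third party (all-or-nothing release).
from typing import Dict, Optional, Set


class TrustedThirdParty:
    def __init__(self, participants: Set[str]) -> None:
        self.participants = participants
        self.deposits: Dict[str, str] = {}  # sender -> item offered for exchange

    def deposit(self, sender: str, item: str) -> None:
        if sender in self.participants:
            self.deposits[sender] = item

    def release(self) -> Optional[Dict[str, str]]:
        # All-or-nothing safety: release only once every participant has deposited.
        if set(self.deposits) == self.participants:
            return dict(self.deposits)
        return None  # a deposit is missing: nothing is revealed to anyone


ttp = TrustedThirdParty({"p1", "p2"})
ttp.deposit("p1", "item-from-p1")
print(ttp.release())   # None: p2 has not deposited yet
ttp.deposit("p2", "item-from-p2")
print(ttp.release())   # {'p1': 'item-from-p1', 'p2': 'item-from-p2'}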

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND AND AIMS: Evidence-based and reliable measures of addictive disorders are needed in general population-based assessments. One study suggested that heavy use over time (UOT) should be used instead of self-reported addiction scales (AS). This study compared UOT and AS regarding video gaming and internet use empirically, using associations with comorbid factors. DESIGN: Cross-sectional data from the 2011 French Survey on Health and Consumption on Call-up and Preparation for Defence-Day (ESCAPAD), cross-sectional data from the 2012 Swiss ado@internet.ch study and two waves of longitudinal data (2010-13) of the Swiss Longitudinal Cohort Study on Substance Use Risk Factors (C-SURF). SETTING: Three representative samples from the general population of French and Swiss adolescents and young Swiss men, aged approximately 17, 14 and 20 years, respectively. PARTICIPANTS: ESCAPAD: n = 22 945 (47.4% men); ado@internet.ch: n = 3049 (50% men); C-SURF: n = 4813 (baseline + follow-up, 100% men). MEASUREMENTS: We assessed video gaming/internet UOT [ESCAPAD and ado@internet.ch: number of hours spent online per week; C-SURF: latent score of time spent gaming/using the internet] and AS (ESCAPAD: Problematic Internet Use Questionnaire; ado@internet.ch: Internet Addiction Test; C-SURF: Gaming AS). Comorbidities were assessed with health outcomes (ESCAPAD: physical health evaluation with a single item, suicidal thoughts and appointment with a psychiatrist; ado@internet.ch: WHO-5 and somatic health problems; C-SURF: Short Form 12 Health Survey (SF-12) and Major Depression Inventory (MDI)). FINDINGS: UOT and AS were moderately correlated (ESCAPAD: r = 0.40; ado@internet.ch: r = 0.53; C-SURF: r = 0.51). Associations of AS with comorbidity factors were higher than those of UOT in cross-sectional (AS: 0.005 ≤ |b| ≤ 2.500; UOT: 0.001 ≤ |b| ≤ 1.000) and longitudinal analyses (AS: 0.093 ≤ |b| ≤ 1.079; UOT: 0.020 ≤ |b| ≤ 0.329). The results were similar across genders in ESCAPAD and ado@internet.ch (men: AS: 0.006 ≤ |b| ≤ 0.211, UOT: 0.001 ≤ |b| ≤ 0.061; women: AS: 0.004 ≤ |b| ≤ 0.155, UOT: 0.001 ≤ |b| ≤ 0.094). CONCLUSIONS: The measurement of heavy use over time captures part of addictive video gaming/internet use without overlapping to a large extent with measurement by self-reported addiction scales (AS). Measuring addictive video gaming/internet use via self-reported addiction scales relates more strongly to comorbidity factors than heavy use over time.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Allostatic load reflects cumulative exposure to stressors throughout the lifetime and has been associated with several adverse health outcomes. It is hypothesized that people with low socioeconomic status (SES) are exposed to higher chronic stress and therefore have greater levels of allostatic load. OBJECTIVE: To assess the association of receiving social transfers and low education with allostatic load. METHODS: We included 3589 participants (1812 women) aged over 35 years and under retirement age from the population-based CoLaus study (Lausanne, Switzerland, 2003-2006). We computed an allostatic load index aggregating cardiovascular, metabolic, dyslipidemic and inflammatory markers. A novel index additionally including markers of oxidative stress was also examined. RESULTS: Men with low vs. high SES were more likely to have higher levels of allostatic load (odds ratio (OR) = 1.93/2.34 for social transfers/education, 95% CI from 1.45 to 4.17). The same patterns were observed among women. Associations persisted after controlling for health behaviors and marital status. CONCLUSIONS: Low education and receiving social transfers independently and cumulatively predict high allostatic load and dysregulation of several homeostatic systems in a Swiss population-based study. Participants with low SES are at higher risk of oxidative stress, which may justify its inclusion as a separate component of allostatic load.
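
For illustration, a count-based allostatic load index can be sketched as follows: each biomarker contributes one point when it falls in its high-risk quartile, and the points are summed per participant. This is a generic construction under that assumption, not necessarily the exact scoring used in the CoLaus analysis; names and thresholds are illustrative.

import numpy as np

def allostatic_load(markers: np.ndarray, higher_is_worse: list) -> np.ndarray:
    """markers: (n_subjects, n_markers) array of biomarker values.
    higher_is_worse: one flag per marker; False for markers such as HDL
    cholesterol, where low values carry the risk. Returns one score per subject."""
    n_subjects, n_markers = markers.shape
    load = np.zeros(n_subjects, dtype=int)
    for j in range(n_markers):
        col = markers[:, j]
        if higher_is_worse[j]:
            load += (col >= np.percentile(col, 75)).astype(int)  # top quartile at risk
        else:
            load += (col <= np.percentile(col, 25)).astype(int)  # bottom quartile at risk
    return load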

Relevance:

100.00%

Publisher:

Abstract:

Background and aims. Limited data from large cohorts are available on switching of tumor necrosis factor (TNF) antagonists (infliximab, adalimumab, certolizumab pegol) over time. We aimed to evaluate the prevalence of switching from one TNF antagonist to another and to identify associated risk factors. Methods. Data from the Swiss Inflammatory Bowel Diseases Cohort Study (SIBDCS) were analyzed. Results. Of 1731 patients included in the SIBDCS (956 with Crohn's disease [CD] and 775 with ulcerative colitis [UC]), 347 CD patients (36.3%) and 129 UC patients (16.6%) were treated with at least one TNF antagonist. A total of 53/347 (15.3%) CD patients (median disease duration 9 years) and 20/129 (15.5%) UC patients (median disease duration 7 years) needed to switch to a second and/or a third TNF antagonist. Median treatment duration was longest for the first TNF antagonist used (CD 25 months; UC 14 months), followed by the second (CD 13 months; UC 4 months) and third TNF antagonist (CD 11 months; UC 15 months). Primary nonresponse, loss of response and side effects were the major reasons to stop and/or switch TNF antagonist therapy. A low body mass index, a short diagnostic delay and extraintestinal manifestations at inclusion were identified as risk factors for a switch of the first TNF antagonist within 24 months of its use in CD patients. Conclusion. Switching of the TNF antagonist over time is a common issue. The median treatment duration with a specific TNF antagonist diminishes with an increasing number of TNF antagonists being used.

Relevance:

100.00%

Publisher:

Abstract:

This article reviews nanoparticulate-chemotherapeutic systems that have been developed for human therapy, considering the components of the nanoparticles, the therapeutic agents associated with the nanoparticles and the clinical indications these therapeutic nanoparticles have been developed for. In this evaluation we have put into perspective the types of nanomaterials and their therapeutic indications. We have reviewed the nanoparticulate-chemotherapeutic systems that have been published, approved and marketed and that are currently in clinical use. We have also analyzed the nanoparticulate-chemotherapeutic systems that are in clinical trials and under preclinical development.

Relevance:

100.00%

Publisher:

Abstract:

Coordinated interactions between T and B cells are crucial for inducing physiological B cell responses. Mutant mice in which tyrosine 136 of the linker for activation of T cells (LAT) is replaced by a phenylalanine (Lat(Y136F)) exhibit a strong CD4(+) T cell proliferation in the absence of intended immunization. The resulting effector T cells produce high amounts of T(H)2 cytokines and are extremely efficient at inducing polyclonal B cell activation. As a consequence, these Lat(Y136F) mutant mice show massive germinal center formation and hypergammaglobulinemia. Here, we analyzed the involvement of different costimulators and their ligands in such T-B interactions both in vitro and in vivo, using blocking antibodies, knockout mice, and adoptive transfer experiments. Surprisingly, we showed in vitro that although B cell activation required contact with T cells, CD40 and inducible T cell costimulator ligand (ICOSL) signaling were not necessary for this process. These observations were further confirmed in vivo, where none of these molecules were required for the unfolding of the Lat(Y136F) CD4(+) T cell expansion and the subsequent polyclonal B cell activation, although the absence of CD40 led to a reduction of the follicular B cell response. These results indicate that the crucial functions played by CD40 and ICOSL in germinal center formation and isotype switching in physiological humoral responses are partly overcome in Lat(Y136F) mice. By comparison, the absence of CD80-CD86 was found to almost completely block the in vitro B cell activation mediated by Lat(Y136F) CD4(+) T cells. The role of CD80-CD86 in T-B cooperation in vivo remained elusive due to the upstream implication of these costimulatory molecules in the expansion of Lat(Y136F) CD4(+) T cells. Together, our data suggest that CD80 and CD86 costimulators play a key role in the polyclonal B cell activation mediated by Lat(Y136F) CD4(+) T cells, even though additional costimulatory molecules or cytokines are likely to be required in this process.


Relevance:

100.00%

Publisher:

Abstract:

For several years now, substantial efforts have been devoted to the development and implementation of a screening program for breast cancer in the Canton of Vaud. A four-year pilot phase is now starting, involving two regional hospitals and their catchment areas; women over 50 and under 70 years old will be invited to participate in the program. A two-view mammography will be performed, with a double reading by the hospital radiologists; a third reading will be carried out in case of discrepancy between the first two radiologists. Patients classified as positive at screening (e.g., with a suspicious radiological image) will be referred to their practitioner for further diagnosis and treatment. The medical and public health background of this program is discussed, more specifically the reasons for developing a screening program, the choice of mammography rather than other tools, and the need to implement screening as an organized program.
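
The double-reading rule described above amounts to a simple decision procedure; the sketch below is purely illustrative and is not the programme's actual operating protocol (function names are hypothetical).

def screening_outcome(reading1: bool, reading2: bool, third_reading) -> bool:
    """Each reading is True when the mammogram is judged suspicious.
    Concordant readings are final; a discrepancy is arbitrated by a third reader."""
    if reading1 == reading2:
        return reading1
    return third_reading()


# Usage: in practice the callback would query the third radiologist.
positive = screening_outcome(True, False, third_reading=lambda: True)
print("refer to practitioner for further diagnosis" if positive else "no referral")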

Relevance:

100.00%

Publisher:

Abstract:

Thyroid substitution is generally considered easy by general practitioners and specialists alike, since a single hormone, levothyroxine, is recommended and laboratory tests are readily available for measurement of free T4 and TSH. However, cross-sectional studies have shown that about 45% of patients are either over-treated or under-treated. This paper summarizes the critical information needed to facilitate better management of hypothyroid patients by promoting long-lasting euthyroidism.

Relevance:

100.00%

Publisher:

Abstract:

Epidemiological data indicate that 75% of subjects with major psychiatric disorders have their onset in the age range of 17-24 years. An estimated 35-50% of college and university students drop out prematurely due to insufficient coping skills under chronic stress, while 85% of students receiving a psychiatric diagnosis withdraw from college/university prior to the completion of their education. In this study we aimed at developing standardized means for identifying students with insufficient coping skills under chronic stress and at risk for mental health problems. A sample of 1,217 college students from 3 different sites in the U.S. and Switzerland completed 2 self-report questionnaires: the Coping Strategies Inventory "COPE" and the Zurich Health Questionnaire "ZHQ" which assesses "regular exercises", "consumption behavior", "impaired physical health", "psychosomatic disturbances", and "impaired mental health". The data were subjected to structure analyses by means of a Neural Network approach. We found 2 highly stable and reproducible COPE scales that explained the observed inter-individual variation in coping behavior sufficiently well and in a socio-culturally independent way. The scales reflected basic coping behavior in terms of "activity-passivity" and "defeatism-resilience", and in the sense of stable, socio-culturally independent personality traits. Correlation analyses carried out for external validation revealed a close relationship between high scores on the defeatism scale and impaired physical and mental health. This underlined the role of insufficient coping behavior as a risk factor for physical and mental health problems. The combined COPE and ZHQ instruments appear to constitute powerful screening tools for insufficient coping skills under chronic stress and for risks of mental health problems.

Relevance:

100.00%

Publisher:

Abstract:

The proportion of the population living in or around cities is larger than ever and still growing. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems like air pollution, land waste or noise, and health problems for the inhabitants are the result of this still continuing process. Urban planners have to find solutions to these complex problems and at the same time ensure the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. In order to get a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed. A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scale laws are used for characterising urban clusters. In a last section, population evolution is modelled using a model close to the well-established gravity model. The work covers quite a wide range of methods useful in urban geography. These methods should be developed further and at the same time find their way into the daily work and decision processes of urban planners.
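
As an example of the fractal statistics mentioned above, a gliding-box lacunarity computation on a binary urban/non-urban grid can be sketched as follows; this is a minimal illustration, and the thesis' own implementation may differ in details such as box sizes and normalisation.

import numpy as np

def lacunarity(grid: np.ndarray, box_size: int) -> float:
    """Gliding-box lacunarity of a 2-D array of 0/1 cells for one box size:
    the ratio of the second moment to the squared first moment of box masses."""
    n_rows, n_cols = grid.shape
    masses = []
    for i in range(n_rows - box_size + 1):
        for j in range(n_cols - box_size + 1):
            masses.append(grid[i:i + box_size, j:j + box_size].sum())
    masses = np.asarray(masses, dtype=float)
    mean_mass = masses.mean()
    if mean_mass == 0:
        return 0.0
    return float((masses ** 2).mean() / mean_mass ** 2)


# Values close to 1 indicate a homogeneous pattern; larger values indicate
# gappier, more heterogeneous urban clusters.
rng = np.random.default_rng(0)
urban_grid = (rng.random((100, 100)) < 0.3).astype(int)
print(lacunarity(urban_grid, box_size=5))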

Relevance:

100.00%

Publisher:

Abstract:

The use of specific terms under different meanings and with varying definitions has always been a source of confusion in science. When we direct our efforts towards an evidence-based medicine for inflammatory bowel diseases (IBD), the same is true: terms such as "mucosal healing" or "deep remission", used as endpoints in clinical trials or treatment goals in daily patient care, may contribute to misconceptions if meanings change over time or definitions are altered. It appears useful to first look at the development of these terms and their definitions, to assess their intrinsic and context-independent problems, and then to analyze their differing relevance in present-day clinical studies and trials. The purpose of such an attempt is to gain clearer insight into the true impact of the clinical findings behind the terms. It may also lead to a better defined use of those terms in future studies. The terms "mucosal healing" and "deep remission" have been introduced in recent years as new therapeutic targets in the treatment of IBD patients. Several clinical trials, cohort studies and inception cohorts have provided data showing that the long-term disease course is better when mucosal healing is achieved. However, it is still unclear whether continued or increased therapeutic measures will aid or improve mucosal healing in patients in clinical remission. Clinical trials are under way to answer this question. Attention should be paid to clearly stating which levels of IBD activity are being looked at. In the present review article, the authors aim to summarize the current evidence available on mucosal healing and deep remission and try to highlight their value and position in the everyday decision making of gastroenterologists.

Relevance:

100.00%

Publisher:

Abstract:

Major burns are characterized by an initial capillary leak, which requires fluid resuscitation for hemodynamic stabilization. While under-resuscitation was the major cause of death until the 1980s, over-resuscitation has become an important source of complications, including abdominal compartment syndrome, escharosis, and impaired gas exchange with prolonged mechanical ventilation and hospital stay. Fluid over-infusion started in the 1990s, with an increasing proportion of the fluid delivered within the first 24 h being well above the 4 ml/kg/% burned surface area (BSA) of the Parkland formula. The first alerts were published in the form of case reports of increased mortality due to abdominal compartment syndrome and respiratory failure. This paper analyses the causes of this fluid over-infusion and the ways to prevent it, which include rationing prehospital fluid delivery, avoiding early administration of colloids, and prevention by permissive hypovolemia.
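
For illustration, the Parkland formula cited above (4 ml of crystalloid per kg of body weight per % burned BSA over the first 24 h, with half of the volume classically given in the first 8 h) can be computed as follows; the weight and burn-size values are illustrative only and not taken from the paper.

def parkland_24h_ml(weight_kg: float, burned_bsa_percent: float) -> float:
    """Parkland formula: 4 ml of crystalloid per kg per % burned body surface
    area over the first 24 h; half is classically given in the first 8 h."""
    return 4.0 * weight_kg * burned_bsa_percent


total_ml = parkland_24h_ml(weight_kg=70, burned_bsa_percent=30)
print(total_ml)        # 8400 ml over 24 h
print(total_ml / 2)    # 4200 ml in the first 8 h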