818 results for Gavarnie, Cirque de (Hautes-Pyrénées)
Abstract:
Bibliographic reference: Rol, 57492
Abstract:
Objectives: To evaluate a simple and rapid method for measuring left atrial volume. Materials and methods: Fifty patients were examined with gated CT for calcium score measurement. Three methods were used to calculate left atrial volume: 1) an orthogonal method measuring areas/diameters in the axial/coronal/sagittal planes, 2) a biplane method inspired by echocardiography, and 3) a volumetric method. The measurements were repeated by the same observer one month later and by three other observers. The cardiac axis was also measured. Bland-Altman analysis and Spearman correlations were used. Results: The volumetric method showed the lowest intra-/interobserver variation, with a variability of 6.1/7.4 ml, respectively. For the orthogonal method (areas/diameters), the intra-/interobserver variations were 12.3/13.5 ml and 14.6/11.6 ml, respectively. For the biplane method, the intra-/interobserver variations were higher: 23.9/19.8 ml. Compared with the volumetric reference method, the orthogonal method using areas was better correlated (R=0.959, p<0.001) than the other methods. The cardiac axis had little influence on the orthogonal method using areas. Conclusion: The volumetric method is the gold standard in terms of variability, but it is time-consuming to perform. The orthogonal method using areas is a simple alternative, except in obese patients with a horizontalized heart.
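The agreement analysis used in this abstract can be sketched with a minimal Bland-Altman computation. The function and the measurement values below are illustrative assumptions, not data from the study:

```python
import statistics

def bland_altman(method_a, method_b):
    """Return the mean difference (bias) between two paired measurement
    series and the 95% limits of agreement (bias +/- 1.96 * SD)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical left atrial volumes (ml) from two methods
volumetric = [95.0, 110.0, 102.0, 88.0, 120.0]
orthogonal = [98.0, 107.0, 105.0, 90.0, 118.0]

bias, (lo, hi) = bland_altman(volumetric, orthogonal)
```

A small bias with narrow limits of agreement would indicate that the two methods can be used interchangeably, which is the kind of comparison the study reports as intra-/interobserver variability.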
Abstract:
Bibliographic reference: Rol, 58266
Abstract:
Bibliographic reference: Rol, 58264
Abstract:
SUMMARY This paper analyses the outcomes of the EEA and bilateral agreements votes at the level of the 3025 communes of the Swiss Confederation by simultaneously modelling the vote and the participation decisions. The regressions include economic and political factors. The economic variables are the aggregated shares of people employed in the losing, winning and neutral sectors, according to the classification of BRUNETTI, JAGGI and WEDER (1998), which follows a Ricardo-Viner logic, and the average education levels, which follow a Heckscher-Ohlin approach. The political factors are those used in the recent literature. The results are remarkably precise and consistent. Most of the variables have the predicted sign and are significant at the 1% level. More than 80% of the variance of the communes' vote is explained by the model, substantially reducing the residuals compared with former studies. The political variables also have the expected signs and are significant. Our results underline the importance of the interaction between electoral choice and participation decisions, as well as the importance of dealing with those issues simultaneously. Finally, they reveal the electorate's high level of information and rationality.
Abstract:
Executive summary: The increasing prevalence of chronic diseases is one of the major causes of rising health expenditure, as stated by the WHO. Not only are chronic diseases very costly, they are also by far the leading cause of mortality in the world, representing 60% of all deaths. Diabetes in particular is becoming a major burden of disease. In Switzerland, around 5% of the population suffers from type 2 diabetes, and 5 to 10% of the annual health care budget is attributable to diabetes. If the WHO's predictions are realised, the prevalence of diabetes will double by 2030, as will the attributable health expenditure. The objective of this thesis is to provide policy recommendations to slow down the disease progression and its costly complications. Using systems thinking and system dynamics, we study the factors that influence diabetes dynamics and the interventions that improve health outcomes while decreasing costs over different time horizons. Our results show that managing diabetes requires integrated care interventions that are effective on three fronts: (1) delaying the onset of complications, (2) slowing down the disease progression and (3) accelerating the time to diagnosis of diabetes and its complications. We recommend, first, the implementation of those interventions targeted at changing patients' behaviour, which are also less expensive but require a change in the delivery of care and medical practices. Policies targeted at an earlier diagnosis of diabetes, its prevention and the diagnosis of complications should then be considered. This sequence of interventions saves money, as total costs decrease even when the costs of the interventions are included, and results in a longer life expectancy for diabetics in the long term. In diabetes management there is therefore a trade-off between medical costs and patients' benefits on the one hand, and between the objectives of obtaining results in the short or the long term on the other.
Decision makers need to deliver acceptable outcomes in the short term. By this criterion, the preferred policy may be to focus only on diagnosed diabetics, attempting to slow down the progression of their disease, rather than an integrated care approach addressing all aspects of the disease. Such a policy also yields desirable results in terms of costs and patients' benefits.
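The system dynamics approach mentioned above can be sketched as a minimal stock-and-flow simulation. The stocks, rates, and function below are illustrative assumptions, not the thesis's calibrated model:

```python
def simulate(years, dt=1.0, incidence=0.005, diagnosis_rate=0.2,
             complication_rate=0.05, population=1_000_000):
    """Minimal stock-and-flow sketch: people move from healthy to
    undiagnosed diabetes, then to diagnosed, then to complications.
    Stocks are updated by Euler integration with step `dt` (years)."""
    healthy = float(population)
    undiagnosed = diagnosed = complications = 0.0
    for _ in range(int(years / dt)):
        new_cases = incidence * healthy * dt
        newly_diagnosed = diagnosis_rate * undiagnosed * dt
        new_complications = complication_rate * diagnosed * dt
        healthy -= new_cases
        undiagnosed += new_cases - newly_diagnosed
        diagnosed += newly_diagnosed - new_complications
        complications += new_complications
    return healthy, undiagnosed, diagnosed, complications
```

Raising `diagnosis_rate` in this sketch drains the undiagnosed stock faster, which mirrors the thesis's third front: accelerating the time to diagnosis shifts people into the stage where disease progression can be managed.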
Abstract:
Abstract This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks. Typical P2P applications are video streaming, file sharing, etc. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file-sharing application, while the user is downloading a file, the P2P application is in parallel serving that file to other users. Such peers could have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user could decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network typically operates in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution increases availability and reduces the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. Our broadcast solutions typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer.
Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that gives us an approximate view of the system or part of it. This approximate view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays, permitting the broadcast reliability to be maximized. Here, the broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the available memory at processes into account by limiting the view they have to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, based on a propagation overlay that tends towards the global tree overlay and adapts to some constraints of the underlying system. At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize the reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
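The idea of broadcast reliability over a tree overlay can be sketched as follows: if link failures are independent, every node receives the message exactly when every link in the tree delivers it, so the overall reliability is the product of the link reliabilities. The function, tree, and link values below are an illustrative assumption, not the thesis's actual protocol:

```python
def broadcast_reliability(tree, link_reliability, root):
    """Probability that every node of a tree overlay receives a broadcast
    from `root`, assuming independent link failures: the product of the
    delivery probabilities of all tree links reachable from the root."""
    prob = 1.0
    stack = [root]
    while stack:
        node = stack.pop()
        for child in tree.get(node, []):
            prob *= link_reliability[(node, child)]
            stack.append(child)
    return prob

# Hypothetical 4-node tree overlay rooted at "A"
tree = {"A": ["B", "C"], "B": ["D"]}
links = {("A", "B"): 0.99, ("A", "C"): 0.95, ("B", "D"): 0.9}

reliability = broadcast_reliability(tree, links, "A")
```

Maximizing this quantity over candidate tree overlays, subject to per-node message quotas, is one way to make the path-selection trade-off described above concrete.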
Abstract:
Bibliographic reference: Rol, 57493
Abstract:
Abstract This paper presents a model of executive compensation in which the executive is risk-averse and has specific knowledge, that is, knowledge about the optimal actions to take that is costly to transfer to the principal. The model generates predictions that are consistent with the available evidence and provides a rationale for a number of unresolved puzzles in executive compensation. Notably, we find that relative performance evaluation is optimal only if the quality of specific knowledge is low. We also show (1) why some common risk components are not filtered out of executives' pay, (2) why performance is more likely to be evaluated relative to aggregate market movements than relative to industry movements, and (3) why executives with higher perceived abilities are given stronger incentives. Finally, we demonstrate that the relation between risk and incentives may be positive or negative, depending on the quality of the executive's specific knowledge.
Abstract:
Bibliographic reference: Rol, 58271
Abstract:
Bibliographic reference: Rol, 58268
Abstract:
Bibliographic reference: Rol, 58263