987 results for Flow Theory
Abstract:
BACKGROUND: For over 50 years, radiocephalic wrist arteriovenous fistulae (RCAVF) have been the primary and best vascular access for haemodialysis. Nevertheless, early failure due to thrombosis or non-maturation is a major complication resulting in their abandonment. This prospective study was designed to investigate the predictive value of intra-operative blood flow for early failure of primary RCAVF before the first effective dialysis. METHODS: We enrolled patients undergoing creation of a primary RCAVF for haemodialysis based on pre-operative ultrasound vascular mapping discussed in a multidisciplinary setting. Intra-operative blood flow measurement was performed systematically once the anastomosis had been completed, using a transit-time ultrasonic flowmeter. During follow-up, blood flow was estimated by colour flow ultrasound at various intervals. Any events related to the RCAVF were recorded. RESULTS: Autogenous RCAVFs (n = 58) in 58 patients were constructed and followed up for an average of 30 days. Thrombosis and non-maturation occurred in eight (14%) and four (7%) patients, respectively. The intra-operative blood flow in functioning RCAVFs was significantly higher than in non-functioning RCAVFs (230 vs 98 mL/min; P = 0.007), as well as 1 week (753 vs 228 mL/min; P = 0.0008) and 4 weeks (915 vs 245 mL/min; P < 0.0001) later. Blood flow volume measurements with a cut-off value of 120 mL/min had a sensitivity of 67%, a specificity of 75% and a positive predictive value of 91%. CONCLUSIONS: Blood flow <120 mL/min has good predictive value for early failure of RCAVF. During the procedure, this cut-off value may be used to select which RCAVF should be investigated in the operating theatre in order to correct any abnormality in real time.
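The sensitivity, specificity and positive predictive value quoted for the 120 mL/min cut-off are standard 2x2-table quantities. A minimal sketch of how such metrics are derived, with hypothetical counts (not the study's raw data):

```python
# Sketch: deriving sensitivity, specificity and positive predictive value
# from a 2x2 contingency table, as used for a flow cut-off classifier.
# The counts below are hypothetical, NOT taken from the study.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return (sensitivity, specificity, PPV) from 2x2 table counts."""
    sensitivity = tp / (tp + fn)   # flagged failures among actual failures
    specificity = tn / (tn + fp)   # unflagged among functioning fistulae
    ppv = tp / (tp + fp)           # actual failures among flagged cases
    return sensitivity, specificity, ppv

# Hypothetical example: 8 true positives, 2 false positives,
# 4 false negatives, 44 true negatives.
sens, spec, ppv = diagnostic_metrics(tp=8, fp=2, fn=4, tn=44)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, PPV={ppv:.2f}")
```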
Abstract:
Debris flow susceptibility mapping at a regional scale has been the subject of various studies. The complexity of the phenomenon and the variability of local controlling factors limit the use of process-based models for a first assessment. GIS-based approaches that combine automatic detection of the source areas with a simple assessment of debris flow spreading may provide a substantial basis for a preliminary susceptibility assessment at the regional scale. The use of a digital elevation model with a 10 m resolution for the Canton de Vaud territory (Switzerland), together with a lithological map and a land use map, has allowed automatic identification of the potential source areas. The spreading estimates are based on basic probabilistic and energy calculations that make it possible to define the maximal runout distance of a debris flow.
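A common energy-based runout estimate of the general kind the abstract alludes to is the energy-line (travel-angle) method: the flow is assumed to stop where a line drawn from the source at a constant angle intersects the terrain. A minimal sketch, in which the travel-angle value is illustrative and not taken from the study:

```python
# Energy-line sketch of maximal debris flow runout: a flow descending a
# total elevation drop H is assumed to stop at horizontal distance L
# where tan(travel_angle) = H / L. The angle used here is illustrative.

import math

def max_runout_distance(elevation_drop_m, travel_angle_deg):
    """Horizontal runout L such that tan(angle) = H / L."""
    return elevation_drop_m / math.tan(math.radians(travel_angle_deg))

L = max_runout_distance(elevation_drop_m=500.0, travel_angle_deg=11.0)
print(f"maximal runout ~ {L:.0f} m")
```

In a GIS workflow this calculation is applied cell by cell downslope of each detected source area, which is why it stays tractable at the regional scale.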
Abstract:
Lamella formation and emigration from the water were investigated in juvenile Biomphalaria glabrata reared at two temperatures in aquaria with a constant water flow. Most snails (97.4%) reared at the lower temperature (21 °C) formed a lamella at the shell aperture and emigrated from the water, whereas only 10.1% did so at 25 °C. Eighty percent of emigrations at 21 °C occurred within a period of 15 days, 70-85 days after hatching. A comparison of the studies done so far indicates that the phenomenon may be affected by the ageing of snail colonies kept in the laboratory and by their geographic origin, rather than by the rearing conditions. This hypothesis, however, requires experimental confirmation.
Abstract:
Most opinion favours the origin of the malaria parasites from a coccidial ancestor. It is assumed that, whatever the process through which the coccidia differentiated into a Plasmodium, this phenomenon very probably occurred millions of years ago, that the original coccidia vanished during that differentiation process, and that it has therefore never been repeated. In the light of some experiments, the existence at the present time of a coccidial cycle of development in the malaria parasites is proposed. The connection routes and mechanisms through which the malaria parasite changes to a coccidial life, and the routes in reverse, are described. Transmission of the malaria-coccidial forms is suggested.
Abstract:
Fluorescence flow cytometry was employed to assess the potential of a vital dye, hydroethidine, for use in the detection and monitoring of the viability of hemoparasites in infected erythrocytes, using Babesia bovis as a model parasite. The studies demonstrated that hydroethidine is taken up by B. bovis and metabolically converted to the DNA-binding fluorochrome, ethidium. Following uptake of the dye, erythrocytes containing viable parasites were readily distinguished and quantitated. Timed studies with the parasiticidal drug Ganaseg showed that it is possible to use the fluorochrome assay to monitor the effects of the drug on the rate of replication and viability of B. bovis in culture. The assay provides a rapid method for evaluating the in vitro effect of drugs on hemoparasites and for analysing the effect of various components of the immune response, such as lymphokines, monocyte products, antibodies, and effector cells (T, NK, LAK, ADCC), on the growth and viability of intraerythrocytic parasites.
Abstract:
We present existence, uniqueness and continuous dependence results for some kinetic equations motivated by models for the collective behavior of large groups of individuals. Models of this kind have recently been proposed to study the behavior of large groups of animals, such as flocks of birds, swarms, or schools of fish. Our aim is to give a well-posedness theory for general models which may include a variety of effects: an interaction through a potential, such as a short-range repulsion and long-range attraction; a velocity-averaging effect, where individuals try to adapt their own velocity to that of other individuals in their surroundings; and self-propulsion effects, which take into account effects on one individual that are independent of the others. We develop our theory in a space of measures, using mass transportation distances. As consequences of our theory, we also show the convergence of particle systems to their corresponding kinetic equations, and the local-in-time convergence to the hydrodynamic limit for one of the models.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
BACKGROUND: According to recent guidelines, patients with coronary artery disease (CAD) should undergo revascularization if significant myocardial ischemia is present. Both cardiovascular magnetic resonance (CMR) and fractional flow reserve (FFR) allow for a reliable ischemia assessment, and in combination with the anatomical information provided by invasive coronary angiography (CXA), such a work-up sets the basis for a decision to revascularize or not. The cost-effectiveness ratios of these two strategies are compared. METHODS: Strategy 1) CMR to assess ischemia followed by CXA in ischemia-positive patients (CMR + CXA); Strategy 2) CXA followed by FFR in angiographically positive stenoses (CXA + FFR). The costs, evaluated from the third-party payer perspective in Switzerland, Germany, the United Kingdom (UK), and the United States (US), included public prices of the different outpatient procedures and costs induced by procedural complications and by diagnostic errors. The effectiveness criterion was the correct identification of hemodynamically significant coronary lesion(s) (= significant CAD) complemented by full anatomical information. Test performances were derived from the published literature. Cost-effectiveness ratios for both strategies were compared for hypothetical cohorts with different pretest likelihoods of significant CAD. RESULTS: CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 65% in Germany, 83% in the UK, and 82% in the US, with costs of CHF 5'794, euro 1'517, £ 2'680, and $ 2'179 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. CONCLUSIONS: The CMR + CXA strategy is more cost-effective than CXA + FFR below a CAD prevalence of 62%, 65%, 83%, and 82% for the Swiss, German, UK, and US health care systems, respectively. These findings may help to optimize resource utilization in the diagnosis of CAD.
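The core quantity compared above, cost per patient correctly diagnosed as a function of pretest likelihood, can be sketched generically. All sensitivities, specificities and costs below are hypothetical placeholders, not the paper's inputs:

```python
# Illustrative sketch (not the paper's model): cost per correctly diagnosed
# patient for a diagnostic strategy, as a function of the pretest likelihood
# of disease. All performance figures and costs are hypothetical.

def cost_per_correct_diagnosis(pretest, sens, spec, cost):
    """Expected cost divided by the probability of a correct diagnosis."""
    p_correct = pretest * sens + (1 - pretest) * spec
    return cost / p_correct

# Two hypothetical strategies compared across pretest likelihoods;
# the cheaper-per-correct-diagnosis strategy can flip as pretest rises.
strategy_a = dict(sens=0.89, spec=0.87, cost=1200.0)  # e.g. imaging-first
strategy_b = dict(sens=0.95, spec=0.90, cost=1800.0)  # e.g. invasive-first
for pretest in (0.3, 0.6, 0.9):
    ca = cost_per_correct_diagnosis(pretest, **strategy_a)
    cb = cost_per_correct_diagnosis(pretest, **strategy_b)
    print(f"pretest={pretest:.1f}: A={ca:.0f}, B={cb:.0f}")
```

The break-even pretest likelihoods reported in the abstract correspond to the crossing points of two such curves for the country-specific cost inputs.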
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which describes a game rather sparsely, merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. It is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. The ultimate objective of the classical approach to game theory, which is normative in character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character.
This rather recent discipline analyzes the relation between the knowledge, beliefs and choices of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered.
In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
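Backward induction, the solution concept the thesis repeatedly returns to, solves a finite perfect-information game by working up from the leaves. A minimal sketch on an invented two-stage game tree (the tree and payoffs are purely illustrative):

```python
# Minimal backward induction for a finite perfect-information game tree.
# A node is either a terminal payoff tuple, e.g. (2, 1), or a pair
# (player_index, {action: child_node}). Tree and payoffs are invented.

def backward_induction(node):
    """Return (payoff_vector, chosen_action) for a node."""
    if all(isinstance(x, (int, float)) for x in node):
        return node, None                       # terminal: payoffs reached
    player, children = node
    best_action, best_payoffs = None, None
    for action, child in children.items():
        payoffs, _ = backward_induction(child)  # solve subgame first
        if best_payoffs is None or payoffs[player] > best_payoffs[player]:
            best_action, best_payoffs = action, payoffs
    return best_payoffs, best_action

# Two-stage game: player 0 moves first ("L" or "R"), then player 1 replies.
game = (0, {
    "L": (1, {"l": (2, 1), "r": (0, 0)}),
    "R": (1, {"l": (1, 2), "r": (3, 0)}),
})
payoffs, first_move = backward_induction(game)
print(first_move, payoffs)  # player 0 plays "L", anticipating 1's reply
```

Anticipating that player 1 picks "l" after either move, player 0 prefers "L" (payoff 2) over "R" (payoff 1), which is exactly the subgame-by-subgame reasoning the epistemic conditions in Chapters 1 and 3 are meant to justify.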
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
In this article, we present a new approach to Nekhoroshev theory for a generic unperturbed Hamiltonian which completely avoids small-divisor problems. The proof is an extension of a method introduced by P. Lochak, which combines averaging along periodic orbits with simultaneous Diophantine approximation, and uses geometric arguments designed by the second author to handle generic integrable Hamiltonians. This method makes it possible to deal with generic non-analytic Hamiltonians and to obtain new results on generic stability around linearly stable tori.
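For orientation, a Nekhoroshev-type stability statement of the kind referred to takes the following schematic form; the stability exponents and constants depend on the steepness or genericity assumptions on the unperturbed Hamiltonian, which are not specified here:

```latex
% Schematic Nekhoroshev-type estimate for a near-integrable Hamiltonian
% H(I, \theta) = h(I) + \varepsilon f(I, \theta): the action variables
% drift only polynomially in \varepsilon over exponentially long times.
\[
  \| I(t) - I(0) \| \le C \, \varepsilon^{b}
  \qquad \text{for all } |t| \le T_0 \exp\!\left( c \, \varepsilon^{-a} \right),
\]
\[
  \text{with exponents } a, b > 0 \text{ and constants } C, c, T_0
  \text{ depending on } h \text{ and on the size of } f .
\]
```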
Abstract:
Is there a link between decentralized governance and conflict prevention? This article tries to answer that question by presenting the state of the art at the intersection of the two concepts. Given that social conflict is inevitable, and given the appearance of new threats and types of violence, as well as new demands for security centred on people (human security), our societies should focus on promoting peaceful change. Through an extensive analysis of the existing literature and the study of several cases, this paper suggests that decentralized governance can contribute to these efforts by transforming conflicts and by bringing about power-sharing and incentives for the inclusion of minority groups. Despite the complexity of assessing its impact on conflict prevention, it can be contended that decentralized governance may have very positive effects on reducing the causes that bring about conflicts, owing to its ability to foster the creation of war/violence preventers. More specifically, this paper argues that decentralization can have a positive impact on the so-called triggers and accelerators (short- and medium-term causes).