979 results for Point of Purchase (POP)


Relevance: 100.00%

Abstract:

After a study of the population dynamics of Biomphalaria glabrata snails in several breeding places in the Dominican Republic, the snail Thiara granifera was introduced into some B. glabrata habitats. T. granifera became established at one point in one habitat in the town of Quisqueya, in the east of the country. Around this point of establishment, six points were selected in order to observe the population dynamics of both species of snails and the chemical and biological characteristics at each point. Four of these points already harbored B. glabrata. A control point, also harboring B. glabrata, was selected as well. After 14 months of observations, the results showed that T. granifera was competing with and displacing B. glabrata. This competition does not seem to be for food or vital space. Rather, B. glabrata avoids the presence of T. granifera and moves away to new areas, possibly because of a chemical substance (or substances) secreted by T. granifera or because of physical contact with the large number of T. granifera individuals.

Relevance: 100.00%

Abstract:

Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. It is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. The ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players: before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework.

In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated; Aumann's sufficient conditions for backward induction are also presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness; sufficient conditions for backward induction are then derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated; in particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
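
To make backward induction concrete, the following minimal Python sketch (not from the thesis) solves a toy two-player extensive-form game of perfect information by recursing from the leaves to the root; the game tree and payoffs are invented for illustration.

```python
# Minimal sketch (not from the thesis): backward induction on a finite
# extensive-form game of perfect information, represented as a tree.
from dataclasses import dataclass, field


@dataclass
class Node:
    player: int = None             # index of the deciding player (None at a leaf)
    payoffs: tuple = None          # payoff vector at a leaf, one entry per player
    children: dict = field(default_factory=dict)  # action label -> child Node


def backward_induction(node):
    """Return the payoff vector induced by rational play in the subgame at node."""
    if node.payoffs is not None:   # leaf: nothing left to choose
        return node.payoffs
    best = None
    for action, child in node.children.items():
        payoffs = backward_induction(child)
        # the player moving here picks the action maximizing *their own* payoff
        if best is None or payoffs[node.player] > best[node.player]:
            best = payoffs
    return best


# Tiny two-stage example: player 0 moves first, player 1 replies.
game = Node(player=0, children={
    "L": Node(player=1, children={
        "l": Node(payoffs=(2, 1)),
        "r": Node(payoffs=(0, 0)),
    }),
    "R": Node(payoffs=(1, 2)),
})

print(backward_induction(game))  # (2, 1): player 0 plays L, anticipating reply l
```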

Relevance: 100.00%

Abstract:

The usual way to investigate the statistical properties of finitely generated subgroups of free groups, and of finite presentations of groups, is based on the so-called word-based distribution: subgroups are generated (finite presentations are determined) by randomly chosen k-tuples of reduced words, whose maximal length is allowed to tend to infinity. In this paper we adopt a different, though equally natural point of view: we investigate the statistical properties of the same objects, but with respect to the so-called graph-based distribution, recently introduced by Bassino, Nicaud and Weil. Here, subgroups (and finite presentations) are determined by randomly chosen Stallings graphs whose number of vertices tends to infinity. Our results show that these two distributions behave quite differently from each other, shedding new light on which properties of finitely generated subgroups can be considered frequent or rare. For example, we show that malnormal subgroups of a free group are negligible in the graph-based distribution, while they are exponentially generic in the word-based distribution. Quite surprisingly, a random finite presentation generically presents the trivial group in this new distribution, while in the classical one it is known to generically present an infinite hyperbolic group.
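
As an illustration of the word-based distribution, here is a minimal Python sketch (not from the paper) that samples a uniformly random reduced word over the free group on k generators; a random k-tuple of such words then generates a random subgroup in the word-based sense.

```python
# Minimal sketch (not from the paper): sampling a uniformly random reduced
# word of length n in the free group F_k, i.e. the word-based distribution.
import random


def random_reduced_word(n, k):
    """Uniform random reduced word of length n; generator i in 1..k, -i its inverse."""
    letters = [g for i in range(1, k + 1) for g in (i, -i)]
    word = []
    for _ in range(n):
        # any letter is allowed except the inverse of the previous one
        choices = [g for g in letters if not word or g != -word[-1]]
        word.append(random.choice(choices))
    return word


# A randomly chosen 3-tuple of reduced words of length 10 in F_2:
print([random_reduced_word(10, 2) for _ in range(3)])
```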

Relevance: 100.00%

Abstract:

An ELISA test for the serological diagnosis of amoebic liver abscess (ALA) was standardized and evaluated in sera from three groups of patients: (1) three patients with diagnosis confirmed by isolation of the parasite, (2) thirty-seven patients with diagnosis established by clinical findings and ultrasound studies, and (3) seven patients whose diagnosis was established by clinical findings and a positive double immunodiffusion test. Ninety-one serum samples from healthy subjects and 22 from patients with other liver or parasitic diseases were also included in the study. The optimum concentration of Entamoeba histolytica antigen was 1.25 µg/ml, and the optimum dilutions of serum and anti-human IgG-alkaline phosphatase conjugate were 1:400 and 1:4000, respectively. The cut-off point of the ELISA test in this study was an absorbance value of 0.34. The test parameters were: sensitivity = 95.7 per cent, specificity = 100 per cent, positive predictive value = 100 per cent and negative predictive value = 98.2 per cent. The ELISA test was found to be of great use as a diagnostic tool for establishing an amoebic etiology in patients with a clinical suspicion of ALA. The test could also be used for seroepidemiological surveys of the prevalence of invasive amoebiasis in a given population, since it allows the processing of a greater number of samples at a lower cost than other serological tests.
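
For illustration, the following Python sketch (not from the paper) shows how sera would be classified against the reported cut-off of 0.34 and how the four test parameters are computed from the resulting confusion matrix; the absorbance readings are hypothetical.

```python
# Minimal sketch (not from the paper): classifying sera by the reported ELISA
# cut-off and computing the usual diagnostic test parameters.
def diagnostic_parameters(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }


CUTOFF = 0.34  # absorbance cut-off reported in the study

# Hypothetical (absorbance, truly_infected) pairs:
sera = [(0.81, True), (0.12, False), (0.55, True), (0.29, False), (0.30, True)]
tp = sum(1 for a, ill in sera if a >= CUTOFF and ill)
fp = sum(1 for a, ill in sera if a >= CUTOFF and not ill)
tn = sum(1 for a, ill in sera if a < CUTOFF and not ill)
fn = sum(1 for a, ill in sera if a < CUTOFF and ill)

print(diagnostic_parameters(tp, fp, tn, fn))
```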

Relevance: 100.00%

Abstract:

Introduction

In my thesis I argue that economic policy is all about economics and politics. Consequently, analysing and understanding economic policy ideally has at least two parts. The economics part is centered on the expected impact of a specific policy on the real economy, in terms of both efficiency and equity; its insights indicate the direction in which the fine-tuning of economic policies should go. However, this fine-tuning will most likely be subject to political constraints. That is why, in the politics part, a much better understanding can be gained by taking into account how the incentives of politicians and special interest groups, as well as the role played by different institutional features, affect the formation of economic policies.

The first part and chapter of my thesis concentrates on the efficiency-related impact of economic policies: how does corporate income taxation in general, and corporate income tax progressivity in particular, affect the creation of new firms? Reduced progressivity and flat-rate taxes are in vogue. By 2009, 22 countries were operating flat-rate income tax systems, as were 7 US states and 14 Swiss cantons (for corporate income only). Tax reform proposals in the spirit of the "flat tax" model typically aim to reduce three parameters: the average tax burden, the progressivity of the tax schedule, and the complexity of the tax code. In joint work, Marius Brülhart and I explore the implications of changes in these three parameters for entrepreneurial activity, measured by counts of firm births in a panel of Swiss municipalities. Our results show that lower average tax rates and reduced complexity of the tax code promote firm births. Controlling for these effects, reduced progressivity inhibits firm births. Our reading of these results is that tax progressivity has an insurance effect that facilitates entrepreneurial risk taking. The positive effects of lower tax levels and reduced complexity are estimated to be significantly stronger than the negative effect of reduced progressivity. To the extent that firm births reflect desirable entrepreneurial dynamism, it is not the flattening of tax schedules that is key to successful tax reforms, but the lowering of average tax burdens and the simplification of tax codes. Flatness per se is of secondary importance and even appears to be detrimental to firm births.

The second part of my thesis, which corresponds to the second and third chapters, concentrates on how economic policies are formed. By the nature of the analysis, these two chapters draw on a broader literature than the first chapter. Both economists and political scientists have done extensive research on how economic policies are formed, and researchers in both disciplines have recognised the importance of special interest groups trying to influence policy-making through various channels. In general, economists base their analysis on a formal and microeconomically founded approach, while abstracting from institutional details. In contrast, political scientists' frameworks are generally richer in institutional features but lack the theoretical rigour of economists' approaches. I start from the economist's point of view but try to borrow as much as possible from the findings of political science to gain a better understanding of how economic policies are formed in reality.

In the second chapter, I take a theoretical approach and focus on the institutional policy framework to explore how interactions between different political institutions affect the outcome of trade policy in the presence of lobbying by special interest groups. Standard political economy theory treats the government as a single institutional actor which sets tariffs by trading off social welfare against contributions from special interest groups seeking industry-specific protection from imports. However, these models lack important (institutional) features of reality. That is why, in my model, I split the government into a legislative and an executive branch, both of which can be lobbied by special interest groups. Furthermore, the legislature has the option to delegate its trade policy authority to the executive, and I allow the executive to compensate the legislature in exchange for delegation. Despite ample anecdotal evidence, bargaining over the delegation of trade policy authority has not yet been formally modelled in the literature. I show that delegation has an impact on policy formation in that it leads to lower equilibrium tariffs compared to a standard model without delegation. I also show that delegation will only take place if the lobby is not strong enough to prevent it. Furthermore, the option to delegate increases the bargaining power of the legislature at the expense of the lobbies. The findings of this model can therefore shed light on why the U.S. Congress often delegates trade policy authority to the executive.

In the final chapter of my thesis, my coauthor, Antonio Fidalgo, and I take a narrower approach and focus on policy-making at the level of the individual politician, exploring how connections to private firms and networks within parliament affect individual politicians' decision-making. Theories in the spirit of the model of the second chapter show how campaign contributions from lobbies to politicians can influence economic policies, and there is an abundant empirical literature that analyses ties between firms and politicians based on campaign contributions. However, the evidence on the impact of campaign contributions is mixed, at best. In our paper, we analyse an alternative channel of influence in the shape of personal connections between politicians and firms through board membership. We identify a direct effect of board membership on individual politicians' voting behaviour and an indirect leverage effect when politicians with board connections influence non-connected peers. We assess the importance of these two effects using a vote in the Swiss parliament on a government bailout of the national airline, Swissair, in 2001, which serves as a natural experiment. We find that both the direct effect of connections to firms and the indirect leverage effect had a strong and positive impact on the probability that a politician supported the government bailout.
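
As a rough illustration of the kind of empirical strategy described for the first chapter, the sketch below (not the authors' code) runs a Poisson regression of municipal firm-birth counts on the three tax parameters, with municipality and year fixed effects; the file name and column names are hypothetical.

```python
# Minimal sketch (not the authors' code): count-data panel regression of
# municipal firm births on tax parameters. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("municipalities_panel.csv")  # hypothetical panel data set

# Poisson regression of firm-birth counts on the three tax parameters,
# with municipality and year fixed effects via categorical dummies.
model = smf.poisson(
    "firm_births ~ avg_tax_rate + progressivity + tax_code_complexity"
    " + C(municipality) + C(year)",
    data=df,
)
print(model.fit().summary())
```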

Relevance: 100.00%

Abstract:

Since 1895, when X-rays were discovered, ionizing radiation has become part of our lives. Its use in medicine has brought significant health benefits to the population globally. The benefit of any diagnostic procedure is to reduce the uncertainty about the patient's health. However, radiation exposure also has potential detrimental effects, and radiation protection authorities have therefore become strict regarding the control of radiation risks.

There are various situations where the radiation risk needs to be evaluated. International authorities point to the increasing number of radiologic procedures and recommend population surveys. These surveys provide valuable data to public health authorities, helping them to prioritize and focus on the patient groups in the population that are most highly exposed. On the other hand, physicians need to be aware of the radiation risks of diagnostic procedures in order to justify and optimize each procedure and inform the patient. The aim of this work was to examine the different aspects of radiation protection and to investigate a new method to estimate patient radiation risks.

The first part of this work concerned radiation risk assessment from the regulatory authority's point of view. A population dose survey was performed to evaluate the annual population exposure, determining the contribution of different imaging modalities to the total collective dose as well as the annual effective dose per caput. Although interventional procedures are not very frequent, they contribute significantly to the collective dose. Among the main results of this work, it was shown that interventional cardiology procedures are dose-intensive, and more attention should therefore be paid to optimizing these exposures.

The second part of the project was related to patient- and physician-oriented risk assessment. In this part, interventional cardiology procedures were studied by means of Monte Carlo simulations, and organ doses as well as effective doses were estimated. Cancer incidence risks for different organs were calculated for each sex and age at exposure using the lifetime attributable risks provided by the Biological Effects of Ionizing Radiation (BEIR) VII report. The advantages and disadvantages of this approach as an alternative method to estimate radiation risks were examined. The results show that this method is the most accurate currently available for estimating radiation risks. The conclusions of this work may guide future studies in the field of radiation protection in medicine.
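
For illustration, the following Python sketch (not from the thesis) combines organ doses with BEIR VII-style lifetime attributable risk (LAR) coefficients; all numbers are placeholders, not actual BEIR VII values, which are tabulated by sex, age at exposure, and organ.

```python
# Minimal sketch (not from the thesis): combining organ doses with lifetime
# attributable risk coefficients. All numbers are hypothetical placeholders.
organ_doses_mgy = {"lung": 12.0, "breast": 4.5, "stomach": 2.0}  # hypothetical

# Hypothetical LAR coefficients: cancer cases per 100,000 persons per 100 mGy,
# for a given sex and age at exposure (BEIR VII tabulates the real values).
lar_per_100k_per_100mgy = {"lung": 300.0, "breast": 250.0, "stomach": 100.0}

total_lar = sum(
    dose / 100.0 * lar_per_100k_per_100mgy[organ]
    for organ, dose in organ_doses_mgy.items()
)
print(f"Estimated lifetime attributable risk: {total_lar:.1f} cases per 100,000")
```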

Relevance: 100.00%

Abstract:

The subject of this conference reflects the scientific community's interest in understanding the complex causal web whose social, economic, and biological components interact in the production and reproduction of schistosomiasis, and in its control in relation to community participation. From the outset, the author stresses the impossibility of dealing separately with community participation, as if social components were just one more "weapon" in the arsenal for schistosomiasis control. The study begins with a brief historical review of the 71 years of control activities against this endemic disease, stressing the enormous efforts and huge expenditures in this field vis-à-vis the limited results, despite the extraordinary technological development of specific, classical control inputs such as new treatment drugs and molluscicides. The article then discusses the various strategies used in control programs, emphasizing ideological consistencies and contradictions. Interactions at the macro and micro levels are discussed, as are the determinants and risk factors involved in producing the disease's endemicity. Unequal occupation of space leaves the segregated portion of the population exposed to conditions extremely favorable to transmission of the disease. This raises the issue of how to control an endemic disease so closely linked to the way of life imposed on the population. The study challenges the classical control model and suggests an alternative model now undergoing medium-term investigation in the states of Espírito Santo and Pernambuco, Brazil. The author concludes that we do not need new strategies but a new control model, contrary to the prevailing classical model in both concept and practice. From the conceptual point of view, the new model differs from others in that schistosomiasis control is seen from a social perspective stressing the population's accumulated knowledge in addition to the building of shared knowledge. The model's praxis has the following characteristics: (1) it is integrated with and financed by research agencies and health services; (2) it operates at the local health services level; (3) use of molluscicides has been eliminated; and (4) emphasis is given to individual medical treatment and improvement of sanitary conditions.

Relevance: 100.00%

Abstract:

The link between energy consumption and economic growth has been widely studied in the economic literature. Understanding this relationship is important from both an environmental and a socio-economic point of view, as energy consumption is crucial to economic activity and human environmental impact. This relevance is even higher for developing countries, since energy consumption per unit of output varies through the phases of development, increasing from an agricultural stage to an industrial one and then decreasing for certain service-based economies. In the Argentinean case, the relevance of energy consumption to economic development seems to be particularly important. While energy intensity seems to exhibit a U-shaped curve from 1990 to 2003, decreasing slightly after that year, total energy consumption increases over the whole period of analysis. Why does this happen? How can we relate this result to the sustainability debate? These questions are very important given Argentina's dependence on hydrocarbons and the recent reduction in oil and natural gas reserves, which can lead to a lack of security of supply. In this paper we study the Argentinean energy consumption pattern for the period 1990-2007, in order to discuss current and future energy and economic sustainability. For this purpose, we developed a conventional analysis, studying energy intensity, and a non-conventional analysis, using the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) accounting methodology. Both methodologies show that the development process followed by Argentina has not been good enough to assure sustainability in the long term. Instead of improving its energy use, Argentina has seen its energy intensity increase. The current composition of its energy mix, the recent economic crisis in Argentina, and its development path are some of the possible explanations.
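
Energy intensity itself is a simple ratio of energy consumption to output. The sketch below (not from the paper) computes it from made-up figures for energy use and GDP:

```python
# Minimal sketch (not from the paper): energy intensity = total energy
# consumption divided by GDP. All figures below are made up.
years = [1990, 1995, 2000, 2005]
energy_pj = [1500.0, 1700.0, 2000.0, 2400.0]  # hypothetical, petajoules
gdp_busd = [140.0, 180.0, 190.0, 220.0]       # hypothetical, billion USD

for year, e, y in zip(years, energy_pj, gdp_busd):
    print(f"{year}: energy intensity = {e / y:.2f} PJ per billion USD")
```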

Relevance: 100.00%

Abstract:

Determining the time since deposition of fingermarks may prove necessary in order to assess their relevance to criminal investigations. The crucial factor is the initial composition of fingermarks, because it represents the starting point of any ageing model. This study mainly aimed to characterize the initial composition of fingermarks, which shows a high variability between donors (inter-variability), but also to investigate the variations among fingermarks from the same donor (intra-variability). Solutions to reduce this initial variability using squalene and cholesterol as target compounds are proposed and should be further investigated. The influence of substrates was also evaluated, and the initial amounts were observed to be larger on porous surfaces than on non-porous ones. Finally, preliminary ageing of fingermarks over 30 days was studied on a porous and a non-porous substrate to evaluate the potential for dating fingermarks. Squalene was observed to decrease at a faster rate on the non-porous substrate.
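
As an illustration of how substrate-dependent ageing rates could be compared, the Python sketch below (not from the study) fits a first-order decay model to hypothetical squalene abundances:

```python
# Minimal sketch (not from the study): fitting a first-order decay model to
# hypothetical squalene signals to compare ageing rates across substrates.
import numpy as np
from scipy.optimize import curve_fit


def first_order_decay(t, a0, k):
    """Relative abundance after t days, assuming exponential loss at rate k."""
    return a0 * np.exp(-k * t)


days = np.array([0.0, 3.0, 7.0, 14.0, 30.0])
# Hypothetical relative squalene signals on a non-porous substrate:
signal = np.array([1.00, 0.72, 0.48, 0.25, 0.06])

(a0, k), _ = curve_fit(first_order_decay, days, signal, p0=(1.0, 0.1))
print(f"fitted decay rate k = {k:.3f} per day")
```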

Relevance: 100.00%

Abstract:

BACKGROUND: Clinical small-caliber vascular prostheses are unsatisfactory. The reasons for failure are early thrombosis and late intimal hyperplasia. We therefore prepared biodegradable small-caliber vascular prostheses using electrospun polycaprolactone (PCL) with slow-releasing paclitaxel (PTX), an antiproliferative drug. METHODS AND RESULTS: PCL solutions containing PTX were used to prepare nonwoven nanofibre-based 2-mm ID prostheses. Mechanical and morphological properties, as well as drug loading, distribution, and release, were studied in vitro. Infrarenal abdominal aortic replacement was carried out with non-drug-loaded and drug-loaded prostheses in 18 rats and followed for 6 months. Patency, stenosis, tissue reaction, and the drug's effect on endothelialization, vascular remodeling, and neointima formation were studied in vivo. In vitro, the prostheses showed controlled morphology mimicking the extracellular matrix, with mechanical properties similar to those of native vessels. PTX-loaded grafts with suitable mechanical properties and controlled drug release were obtained by factorial design. In vivo, both groups showed 100% patency, no stenosis, and no aneurysmal dilatation. Endothelial coverage and cell ingrowth were significantly reduced at 3 weeks and delayed at 12 and 24 weeks in PTX grafts but, as envisioned, neointima formation was significantly reduced in these grafts at 12 weeks and delayed at 6 months. CONCLUSIONS: Biodegradable, electrospun, nanofibre polycaprolactone prostheses are promising because in vitro they maintain their mechanical properties (regardless of PTX loading), and in vivo they show good patency, re-endothelialize, and remodel with autologous cells. PTX loading delays endothelialization and cellular ingrowth; conversely, it reduces neointima formation until the end point of our study and may thus be an interesting option for small-caliber vascular grafts.

Relevance: 100.00%

Abstract:

Overview of the tax coordination of the regions in Spain, from the point of view of the role of the courts in developing the present system.

Relevance: 100.00%

Abstract:

BACKGROUND: Improved survival after prophylactic implantation of a defibrillator in patients with reduced left ventricular ejection fraction (EF) after myocardial infarction (MI) has been demonstrated in patients who experienced remote MIs in the 1990s. The absolute survival benefit conferred by this recommended strategy must be related to the current risk of arrhythmic death, which is evolving. This study evaluates the mortality rate in survivors of MI with impaired left ventricular function and its relation to baseline characteristics assessed before hospital discharge. METHODS: The clinical records of patients who had sustained an acute MI between 1999 and 2000 and had been discharged from the hospital with an EF of < or = 40% were included. Baseline characteristics, drug prescriptions, and invasive procedures were recorded. Bivariate and multivariate analyses were performed using a primary end point of total mortality. RESULTS: One hundred sixty-five patients were included. During a median follow-up period of 30 months (interquartile range, 22 to 36 months), 18 patients died. The 1-year and 2-year mortality rates were 6.7% and 8.6%, respectively. Variables reflecting coronary artery disease and its management (i.e., prior MI, acute reperfusion, and complete revascularization) had a greater impact on mortality than variables reflecting mechanical dysfunction (i.e., EF and Killip class). CONCLUSIONS: The mortality rate among survivors of MIs with reduced EF was substantially lower than that reported in the 1990s. The strong decrease in arrhythmic risk implies a proportional increase in the number of patients needed to treat with a prophylactic defibrillator to prevent one adverse event. The risk of an event may even be sufficiently low to limit the detectable benefit of defibrillators in patients with the prognostic features identified in our study. This argues for additional risk stratification prior to the prophylactic implantation of a defibrillator.
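
The inverse relation between baseline event risk and the number needed to treat (NNT) can be made explicit with a short calculation. The Python sketch below uses illustrative numbers only, including an assumed 30% relative risk reduction that is not taken from this paper:

```python
# Minimal sketch (not from the paper): how a lower baseline event risk
# inflates the number needed to treat. Numbers are illustrative only.
def nnt(baseline_risk, relative_risk_reduction):
    """NNT = 1 / absolute risk reduction."""
    arr = baseline_risk * relative_risk_reduction
    return 1.0 / arr


# Same assumed 30% relative risk reduction from a prophylactic defibrillator,
# applied to a 1990s-era mortality risk vs. this study's 6.7% 1-year rate:
for annual_mortality in (0.20, 0.067):
    print(f"risk {annual_mortality:.1%} -> NNT = {nnt(annual_mortality, 0.30):.0f}")
```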

Relevance: 100.00%

Abstract:

Schistosoma mansoni infection induces in its hosts a marked and sustained eosinophilia, which is influenced or modulated by complex mechanisms that vary according to the phase of infection. To address this phenomenon, we used the air pouch (AP) model in control and infected Swiss Webster mice, analyzing the cellular and tissue response and the local expression of adhesion molecules [CD18 (beta 2-chain), CD44, ICAM-1 (CD54), L-selectin (CD62L), CD49d (alpha 4-chain), LFA-1 (CD11a)]. Infected animals were studied at 3 (pre-oviposition phase), 7 (acute phase), and 14 (chronic phase) weeks after infection (5-6 mice per period of infection); normal mice were age-matched. The results showed that, after egg stimulation and compared with matched controls, infected mice had a lower eosinophil response in the acute (7 weeks) and chronic (14 weeks) phases of infection. However, when the infected mice were in the pre-oviposition phase (3 weeks), their eosinophil response surpassed that of the controls. In the AP wall of infected mice, a significant decrease in the expression of ICAM-1 and CD44 in fibroblast-like cells and a reduction in the number of CD18- and CD11a-expressing migratory cells were observed. The other adhesion molecules were negative or weakly expressed. The results indicate that, in the air pouch model in S. mansoni-infected mice: (1) the eosinophil response is strikingly down-regulated during the acute ovular phase; (2) in the pre-oviposition phase, by contrast, an up-regulation of the eosinophil response occurs, whose mechanisms are completely unknown; (3) in the chronic phase of the infection, the down-modulation of the eosinophil response is less pronounced; and (4) the down-regulation of adhesion molecules, especially ICAM-1, appears to be associated with the lower eosinophil response.

Relevance: 100.00%

Abstract:

The history of tax havens during the decades before World War II is still little known. To date, the studies that have focused on the 1920s and 1930s have presented either a very general perspective on the development of tax havens or a narrow national point of view. Based on unpublished historical archives from five countries (Switzerland, Great Britain, Belgium, France, Germany), this paper therefore offers a new comparative appraisal of international tax competition during this period, in order to answer the following question: what was the specificity of the Swiss case, already considered a quintessential tax haven at the time, in comparison to other banking centres? The findings of this research are twofold. First, the 1920s and 1930s appear as something of a golden age of opportunity for avoiding taxation through the relocation of assets. Most financial centres granted substantial tax benefits for imported capital, while the limited degree of international cooperation and the usual guarantee of banking secrecy in European countries prevented the taxation of exported assets. Second, within this general environment, the fiscal strategies of a tax haven like Switzerland differed from those of a great financial power like Great Britain. Whereas the Swiss administration readily placed itself at the service of the banking community, British policy struck more of a balance between the contradictory interests of the Board of Inland Revenue, the Treasury, and English business circles.

Relevance: 100.00%

Abstract:

The authors present morphogenetic and biomechanical approaches to the concept of the Schistosoma mansoni granuloma, considering granulomas as organoid structures that depend on cellular adhesion and sorting, rearranging into hierarchical concentric layers and creating tension-dependent structures that tend to acquire a round shape, since this is the minimal-energy form, in which opposing forces pull equally from all directions and are in balance. From the morphogenetic point of view, granulomas function as little organs, presenting maturative and involutional stages in their development, with final disappearance (pre-granulomatous stages, subdivided into weakly and/or initially reactive, and exudative; granulomatous stages: exudative-productive, productive, and involutional). A model for the development of granulomas was suggested, comprising the following stages: encapsulation, focal histolysis, fiber production, orientation and compacting, and involution and disintegration. The authors conclude that the schistosomal granuloma is not a tangled web of individual cells and fibers but an organized structure composed of host and parasite components, which is formed not to attack the miracidia but to function as a hybrid interface between two phylogenetically different beings.