979 results for step-up -hakkuri


Relevance: 20.00%

Abstract:

We present dual-wavelength Digital Holographic Microscopy (DHM) measurements on a certified 8.9 nm high chromium thin step sample and demonstrate sub-nanometer axial accuracy. We introduce a modified DHM Reference Calibrated Hologram (RCH) reconstruction algorithm that takes amplitude contributions into account. By combining this with a temporal averaging procedure and a specific dual-wavelength DHM arrangement, we show that specimen topography can be measured with an accuracy, defined as the axial standard deviation, of 0.9 nm or better. Averaging each of the two wavefronts recorded with real-time dual-wavelength DHM provides up to 30% spatial noise reduction for the given configuration, thanks to their non-correlated nature.
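
A simple consistency check, assuming two equally noisy and uncorrelated phase maps, shows why averaging them yields a noise reduction of this order:

\[
\sigma_{\text{avg}} = \sqrt{\tfrac{1}{4}\sigma^{2} + \tfrac{1}{4}\sigma^{2}} = \frac{\sigma}{\sqrt{2}} \approx 0.71\,\sigma ,
\]

i.e. a reduction of about 29%, consistent with the reported figure of up to 30%.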

Relevance: 20.00%

Abstract:

General Introduction This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part (the fourth chapter) is concerned with Anti-Dumping (AD) measures. Despite the wide-ranging preferential access granted to developing countries by industrial ones under North-South trade agreements (whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA), it has been claimed that the benefits from improved market access keep falling short of their full potential. RoOs are widely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent. Part I In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Celine Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, it shows, in the case of PANEURO, that the R-index is useful for summarizing how countries are affected differently by the same set of RoOs because of their different export baskets to the EU. Second, it shows that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also help move regionalism toward greater openness and hence make it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Changing all instruments into an MFC would therefore bring improved transparency, much like the "tariffication" of NTBs.
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of the good's value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with their current sets of RoOs, could be the model for future EU-centered PTAs. First, I have studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs (used before 1997) and the "Single List" RoOs (used since 1997). Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly allocate sales between the EU and the rest of the world by comparing producer prices on each market, I have estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument that has the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years after their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis.
First, using Poisson and Negative Binomial regressions, the count of AD measure revocations is regressed on (inter alia) the count of "initiations" lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the product sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
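
To make the count-data approach concrete, the following is a minimal sketch (hypothetical file layout and variable names, not the authors' code) of a Poisson regression of revocation counts on initiations lagged five years, with the lag coefficient allowed to differ before and after the 1995 agreement:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical panel: one row per (country, year) with counts of AD measure
# revocations and initiations; variable names are assumptions for illustration.
df = pd.read_csv("ad_measures_panel.csv")
df = df.sort_values(["country", "year"])
df["init_lag5"] = df.groupby("country")["initiations"].shift(5)
df["post_ada"] = (df["year"] >= 1995).astype(int)

# Poisson count model: revocations on five-year-lagged initiations,
# interacted with a post-1995 dummy for the ADA sunset-review clause.
model = smf.glm(
    "revocations ~ init_lag5 * post_ada",
    data=df.dropna(subset=["init_lag5"]),
    family=sm.families.Poisson(),
).fit()
print(model.summary())
```

A Negative Binomial family (sm.families.NegativeBinomial()) can be substituted to allow for overdispersion; a coefficient on the lagged initiations close to one would correspond to the one-for-one five-year cycle discussed above.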

Relevance: 20.00%

Abstract:

Abstract The object of game theory lies in the analysis of situations where different social actors have conflicting requirements and where their individual decisions will all influence the global outcome. In this framework, several games have been invented to capture the essence of various dilemmas encountered in many common and important socio-economic situations. Even though these games often succeed in helping us understand human or animal behavior in interactive settings, some experiments have shown that people tend to cooperate with each other in situations for which classical game theory strongly recommends them to do the exact opposite. Several mechanisms have been invoked to try to explain the emergence of this unexpected cooperative attitude. Among them, repeated interaction, reputation, and belonging to a recognizable group have often been mentioned. However, the work of Nowak and May (1992) showed that the simple fact of arranging the players according to a spatial structure and only allowing them to interact with their immediate neighbors is sufficient to sustain a certain amount of cooperation even when the game is played anonymously and without repetition. Nowak and May's study and much of the following work was based on regular structures such as two-dimensional grids. Axelrod et al. (2002) showed that by randomizing the choice of neighbors, i.e. by actually giving up a strictly local geographical structure, cooperation can still emerge, provided that the interaction patterns remain stable in time. This is a first step towards a social network structure. However, following pioneering work by sociologists in the sixties, such as that of Milgram (1967), in the last few years it has become apparent that many social and biological interaction networks, and even some technological networks, have particular, and partly unexpected, properties that set them apart from regular or random graphs. Among other things, they usually display broad degree distributions and a small-world topological structure. Roughly speaking, a small-world graph is a network where any individual is relatively close, in terms of social ties, to any other individual, a property also found in random graphs but not in regular lattices. However, in contrast with random graphs, small-world networks also have a certain amount of local structure, as measured, for instance, by a quantity called the clustering coefficient. In the same vein, many real conflicting situations in economics and sociology are well described neither by a fixed geographical position of the individuals in a regular lattice nor by a random graph. Furthermore, it is well known that network structure can strongly influence dynamical phenomena such as the way diseases spread across a population and the way ideas or information are transmitted. Therefore, in the last decade, research attention has naturally shifted from random and regular graphs towards better models of social interaction structures. The primary goal of this work is to discover whether or not the underlying graph structure of real social networks can help explain why one finds higher levels of cooperation in populations of human beings or animals than classical game theory prescribes. To meet this objective, I start by thoroughly studying a real scientific coauthorship network and showing, using diverse statistical measurements, how it differs from biological or technological networks.
Furthermore, I extract and describe its community structure, taking into account the intensity of collaborations. Finally, I investigate the temporal evolution of the network, from its inception to its state at the time of the study in 2006, suggesting also an effective view of it as opposed to a historical one. Thereafter, I combine evolutionary game theory with several network models, as well as with the studied coauthorship network, in order to highlight which specific network properties foster cooperation and to shed some light on the various mechanisms responsible for maintaining it. I point out that, to resist defection, cooperators take advantage, whenever possible, of the degree heterogeneity of social networks and of their underlying community structure. Finally, I show that the level and stability of cooperation depend not only on the game played, but also on the evolutionary dynamic rules used and on the individual payoff calculations.
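
To illustrate the kind of network comparison and evolutionary dynamics discussed above, here is a toy sketch using standard graph models and one common update rule; it is not the code used in the thesis:

```python
import random
import networkx as nx

# Compare a regular lattice, a small-world graph and a scale-free graph on
# clustering and degree heterogeneity, then run a minimal spatial
# prisoner's dilemma with synchronous imitate-the-best-neighbour updating.
N, K = 1000, 4
graphs = {
    "ring lattice": nx.watts_strogatz_graph(N, K, 0.0),
    "small world": nx.watts_strogatz_graph(N, K, 0.1),
    "scale free": nx.barabasi_albert_graph(N, K // 2),
}
for name, g in graphs.items():
    degrees = [d for _, d in g.degree()]
    print(f"{name:12s} clustering={nx.average_clustering(g):.3f} "
          f"max_degree={max(degrees)}")

def play_pd(g, rounds=50, b=1.5):
    """Weak prisoner's dilemma: 1 for mutual cooperation, b for exploiting
    a cooperator, 0 otherwise. Returns the final fraction of cooperators."""
    strategy = {n: random.random() < 0.5 for n in g}  # True = cooperate
    for _ in range(rounds):
        payoff = {
            n: sum(1.0 if strategy[n] and strategy[m]
                   else (b if (not strategy[n]) and strategy[m] else 0.0)
                   for m in g[n])
            for n in g
        }
        # Each node copies the strategy of its best-scoring neighbour (or itself).
        strategy = {
            n: strategy[max(list(g[n]) + [n], key=payoff.get)]
            for n in g
        }
    return sum(strategy.values()) / len(strategy)

for name, g in graphs.items():
    print(f"{name:12s} final cooperation ≈ {play_pd(g):.2f}")
```

In such toy runs, cooperation typically fares better on the degree-heterogeneous (scale-free) graph than on the ring lattice, in line with the observation above that cooperators exploit degree heterogeneity.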

Relevance: 20.00%

Abstract:

BACKGROUND: A device to perform sutureless end-to-side coronary artery anastomosis has been developed by means of stent technology (GraftConnector). The present study assesses the long-term quality of the GraftConnector anastomosis in a sheep model. METHODS: In 8 adult sheep weighing 40-55 kg, the right internal mammary artery (RIMA) was prepared through a left anterior thoracotomy and connected to the left anterior descending artery (LAD) by means of the GraftConnector, on the beating heart, without the use of any stabilizer. Ticlopidine 250 mg/day was given for anticoagulation for 4 weeks and aspirin 100 mg/day for 6 months. The animals were sacrificed after 6 months and histological examination of the anastomoses was carried out, after slicing with the connector in situ, for morphological analysis. RESULTS: All animals survived to 6 months. All anastomoses were patent; mean luminal width at histology was 1.8 +/- 0.2 mm and mean myointimal hyperplasia thickness was 0.21 +/- 0.1 mm. CONCLUSIONS: The long-term results demonstrate that OPCABGs performed with the GraftConnector had a 100% patency rate. The mean anastomotic luminal width corresponds to the mean LAD diameter of adult sheep. We may speculate that the myointimal hyperplasia occurred as a result of local device oversizing.

Relevance: 20.00%

Abstract:

Introduction An impaired ability to oxidize fat may be a factor in the aetiology of obesity (3). Moreover, the exercise intensity (Fatmax) eliciting the maximal fat oxidation rate (MFO) has been reported to be lower in obese (O) than in lean (L) individuals (4). However, differences in fat oxidation rate (FOR) during exercise between O and L remain equivocal, and little is known about FORs at high intensities (>60%) in O compared with L. This study aimed to characterize fat oxidation kinetics over a large range of intensities in L and O. Methods 12 healthy L [body mass index (BMI): 22.8±0.4] and 16 healthy O men (BMI: 38.9±1.4) performed a submaximal incremental test (Incr) to determine whole-body fat oxidation kinetics using indirect calorimetry. After a 15-min resting period (Rest) and a 10-min warm-up at 20% of maximal power output (MPO, determined by a maximal incremental test), the power output was increased by 7.5% MPO every 6 min until the respiratory exchange ratio reached 1.0. Venous lactate and glucose and plasma concentrations of epinephrine (E), norepinephrine (NE), insulin and non-esterified fatty acids (NEFA) were assessed at each step. A mathematical model (SIN) (1), including three variables (dilatation, symmetry, translation), was used to characterize fat oxidation kinetics (normalized by fat-free mass) and to determine Fatmax and MFO. Results FOR at Rest and MFO were not significantly different between groups (p≥0.1). FORs were similar from 20-60% (p≥0.1) and significantly lower from 65-85% in O than in L (p≤0.04). Fatmax was significantly lower in O than in L (46.5±2.5 vs 56.7±1.9%, respectively; p=0.005). Fat oxidation kinetics were characterized by a similar translation (p=0.2), a significantly lower dilatation (p=0.001), and a tendency toward a left-shifted symmetry in O compared with L (p=0.09). Plasma E, insulin and NEFA were significantly higher in L than in O (p≤0.04). There were no significant differences in glucose, lactate or plasma NE between groups (p≥0.2). Conclusion The study showed that O presented a lower Fatmax and a lower reliance on fat oxidation at high, but not at moderate, intensities. This may be linked to: i) higher insulin and lower E concentrations in O, which may induce blunted lipolysis; ii) a higher percentage of type II and a lower percentage of type I fibres (5); and iii) a decreased mitochondrial content (2), which may reduce FORs at high intensities and Fatmax. These findings may have implications for prescribing appropriate exercise intensities to optimize fat oxidation in O. References 1. Cheneviere et al. Med Sci Sports Exerc. 2009 2. Holloway et al. Am J Clin Nutr. 2009 3. Kelley et al. Am J Physiol. 1999 4. Perez-Martin et al. Diabetes Metab. 2001 5. Tanner et al. Am J Physiol Endocrinol Metab. 2002
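
For readers unfamiliar with indirect calorimetry, whole-body fat and carbohydrate oxidation rates are commonly derived from gas-exchange data with stoichiometric equations such as Frayn's (1983); the abstract does not specify the exact equations used, so the snippet below is only an illustrative sketch:

```python
def substrate_oxidation(vo2_l_min: float, vco2_l_min: float) -> dict:
    """Whole-body oxidation rates (g/min) from gas exchange, using the
    widely used Frayn (1983) equations and neglecting protein oxidation."""
    fat = 1.67 * vo2_l_min - 1.67 * vco2_l_min
    cho = 4.55 * vco2_l_min - 3.21 * vo2_l_min
    return {"fat_g_min": fat, "cho_g_min": cho, "rer": vco2_l_min / vo2_l_min}

# Example: VO2 = 2.0 L/min, VCO2 = 1.7 L/min (RER = 0.85)
print(substrate_oxidation(2.0, 1.7))
# -> fat ≈ 0.50 g/min, carbohydrate ≈ 1.32 g/min
```

MFO and Fatmax are then read off the curve of fat oxidation rate against exercise intensity, here modelled with the SIN function cited as reference 1.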

Relevance: 20.00%

Abstract:

All-trans retinoic acid (ATRA) combined with anthracycline-based chemotherapy is the reference treatment for acute promyelocytic leukemia (APL). Whereas in high-risk patients cytarabine (AraC) is often considered useful in combination with anthracyclines to prevent relapse, its usefulness in standard-risk APL is uncertain. In the APL 2000 trial, patients with standard-risk APL [i.e., with a baseline white blood cell (WBC) count <10,000/mm(3)] were randomized between ATRA with daunorubicin (DNR) and AraC (AraC group) and ATRA with DNR but without AraC (no-AraC group). All patients subsequently received combined maintenance treatment. The trial was prematurely terminated because of significantly more relapses in the no-AraC group (J Clin Oncol 24, 2006, 5703-10), but follow-up was still relatively short at that time. With long-term follow-up (median 103 months), the 7-year cumulative incidence of relapse was 28.6% in the no-AraC group, compared with 12.9% in the AraC group (P = 0.0065). In standard-risk APL, at least when the anthracycline used is DNR, avoiding AraC may lead to an increased risk of relapse, suggesting that the need for AraC is regimen-dependent.

Relevance: 20.00%

Abstract:

The problem of Small Island Developing States (SIDS) is quite recent, dating from the end of the 1980s and the 1990s, and is still in search of theoretical consolidation. SIDS, as small developing states formed by one or several geographically dispersed islands, have small populations, markets, territories and natural resources, including drinking water, and, in many cases, low levels of economic activity, factors that together hinder the achievement of economies of scale. To these diseconomies are added higher transport and communication costs which, combined with lower productivity and with the lower quality and diversification of their output, hinder their integration into the world economy. In some SIDS these factors cannot be dissociated from the scarcity of investment in infrastructure, in the training of human resources and in production, just as in most developing countries. In ecological terms, many of them are short of natural resources yet host ecosystems that are important at the national and global level, while remaining highly fragile with respect to pollution, overfishing and the uncontrolled development of tourism, factors that, taken together and combined with the greenhouse effect, affect the climate and the rise of the mean sea level and could therefore threaten the very survival of some of them. The growing awareness of the international community of their problems culminated in the United Nations Barbados Conference of 1994, where the right to development was emphasized through the drawing up of appropriate strategies and the Programme of Action for the Sustainable Development of SIDS. Orienting regional and international cooperation in that direction, sharing technology (namely clean technology and environmental monitoring and management technology) and information, building capacity, supplying means, including financial resources, and creating non-discriminatory and fair trade rules would lead to the establishment of an economically more equal world system, in which production, consumption, pollution levels and demographic policies are guided towards sustainability. The Conference constituted an important step towards the international community's recognition of the specificities of those states and allowed the definition of a set of norms and policies to be implemented at the national, regional and international levels, and it was important that these continued in the direction of sustainable development. This Conference itself had its origins in previous summits: the Rio de Janeiro Summit on Environment and Development, held in 1992, which produced an important document, Agenda 21; the Stockholm Conference of 1972; and even the Ramsar Convention of 1971 on wetlands. Later, the Valletta Declaration (Malta, 1998) and the Forum of Small States (2002) again drew the international community's attention to the problems of SIDS, urging action to increase their resilience.
If "vulnerability" is defined as the inability of countries to withstand external shocks economically, ecologically and socially, and "resilience" as their potential to absorb and minimize the impact of those shocks by presenting a structure that allows them to be little affected by them, then part of the available literature, dating from the 1990s, indicates that SIDS are more vulnerable than other developing countries. The vulnerability of SIDS results from the fact that they present a set of characteristics that makes them less able to resist, or to put forward strategies that would give them greater resilience to, external shocks, whether anthropogenic (economic, financial, environmental) or natural, connected with the vicissitudes of nature. When these vulnerability factors are combined with the worldwide expansion of the capitalist economic system, economic and financial globalisation, the incessant search for growing profits by multinational enterprises and accelerated technological change, the result is a situation that disfavours the poorest. Building resilience to external shocks and to the process of globalisation demands from SIDS, and from many other developing countries, the endogenous definition of strategies and of solid but flexible programmes of integrated development. These must be taken on by the established authorities, but also by the other stakeholders, including companies, civil society organizations and the population in general. But that demands strong investment in the training of human resources, in infrastructure and in research centres; it demands the capacity not only to produce, but also to produce differently and to market internationally. It demands institutional capacity. Cape Verde is on its way to this stage.

Relevance: 20.00%

Abstract:

Neuroblastoma (NBL) is the commonest extra-cranial solid tumor in children and the leading cause of cancer-related deaths in childhood between the ages of 1 and 4 years. NBL may behave in very different ways, from the less aggressive stage 4S NBL or congenital forms, which may resolve without treatment in up to 90% of children, to high-risk disseminated stage 4 disease in older children, with a cure rate of 35 to 40%. Initial staging is crucial for effective management, and metaiodobenzylguanidine (MIBG) radiolabeled with iodine-123 is a powerful tool, with a sensitivity of around 90% and a specificity close to 100% for the diagnosis of NBL. MIBG scintigraphy is used routinely and is mandatory in most investigational clinical trials for the initial staging of the disease, the evaluation of the response to treatment, and the detection of recurrence during follow-up. With respect to the outcome of children presenting with disseminated stage 4 NBL, the role of the post-therapeutic [(123)I]MIBG scan has been investigated by several groups, but so far there is no consensus on whether a complete or very good partial response, as assessed by MIBG, is of prognostic value. NBL requires a multimodality approach at diagnosis and during follow-up, and MIBG scintigraphy keeps its pivotal role, in particular with respect to bone marrow involvement and/or cortical bone metastases.

Relevance: 20.00%

Abstract:

Intraoperative examination of sentinel axillary lymph nodes can be done by imprint cytology, frozen section, or, most recently, by PCR-based amplification of a cytokeratin signal. With this technique, benign epithelial inclusions, representing mammary tissue displaced along the milk line, are likely to generate a positive PCR signal and lead to a false-positive diagnosis of metastatic disease. To better appreciate the incidence of ectopic epithelial inclusions in axillary lymph nodes, we performed an autopsy study, examining, on 100 μm step sections, 3,904 lymph nodes obtained from 160 axillary dissections in 80 patients. The median number of lymph nodes per axilla was 23 (15, 6, and 1 in levels 1, 2, and 3, respectively). A total of 30,450 hematoxylin-eosin stained slides were examined, as well as 8,825 slides immunostained with pan-cytokeratin antibodies. Despite this meticulous work-up, not a single epithelial inclusion was found, suggesting that the incidence of such inclusions is much lower than the 5% assumed in the literature.
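
As a rough statistical reading of this negative result (an illustration added here, not part of the original study), the "rule of three" gives an approximate upper 95% confidence bound of 3/n for an event rate when zero events are observed in n independent units:

\[
\hat{p}_{95} \approx \frac{3}{n}:\qquad
\frac{3}{160} \approx 1.9\% \text{ per axillary dissection},\qquad
\frac{3}{3904} \approx 0.08\% \text{ per lymph node},
\]

assuming independence across units, both well below the 5% figure assumed in the literature.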

Relevance: 20.00%

Abstract:

The study first assesses the evaluation process of the first generation of asylum instruments, underlining possibilities for improving it. It then analyses the asylum "acquis" regarding the distribution of refugees between Member States, eligibility for protection, the status of protected persons with respect to detention and vulnerability, asylum procedures, and the external dimension, formulating short-term recommendations for each area. Its last part is devoted to the long-term evolution of the Common European Asylum System with regard to the legal context, including the accession of the EU to the Geneva Convention; the institutional perspective, including the new European Asylum Support Office; and the jurisdictional, substantive, distributive and external perspectives.

Relevance: 20.00%

Abstract:

The alpha1-adrenergic agonist phenylephrine stimulated phospholipase D (PLD) activity in Rat 1 fibroblasts transfected to express either the wild-type hamster alpha1B-adrenoceptor or a constitutively active mutant (CAM) form of this receptor. The EC50 for agonist stimulation of PLD activity was substantially lower at the CAM receptor than at the wild-type receptor, as previously noted for phenylephrine stimulation of phosphoinositidase C activity. Sustained treatment of cells expressing the CAM alpha1B-adrenoceptor with phentolamine resulted in a marked up-regulation of levels of this receptor, with half-maximal effects produced within 24 h and with an EC50 of approximately 40 nM. Such an up-regulation could be produced with a range of other ligands generally viewed as alpha1-adrenoceptor antagonists, but equivalent treatment of cells expressing the wild-type alpha1B-adrenoceptor was unable to mimic these effects. After sustained treatment of the CAM alpha1B-adrenoceptor-expressing cells with phentolamine, basal PLD activity was increased, and phenylephrine was now able to stimulate PLD activity to greater levels than in vehicle-treated CAM alpha1B-adrenoceptor-expressing cells. The EC50 for phenylephrine stimulation of PLD activity was not altered, however, by phentolamine pretreatment and the associated up-regulation of the receptor. After phentolamine-induced up-regulation of basal PLD activity, a range of alpha1-antagonists were shown to possess the characteristics of inverse agonists at the CAM alpha1B-adrenoceptor, as they were able to substantially decrease the elevated basal PLD activity.
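
For context on how EC50 values such as those reported above are typically estimated, the sketch below fits a four-parameter logistic (Hill) curve to hypothetical concentration-response data with scipy; it is not the analysis pipeline used in the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, n_h):
    """Four-parameter logistic (Hill) concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** n_h)

# Hypothetical PLD-activity readout at increasing agonist concentrations (M).
conc = np.array([1e-9, 3e-9, 1e-8, 3e-8, 1e-7, 3e-7, 1e-6, 3e-6])
resp = np.array([102, 110, 135, 190, 280, 340, 365, 370])  # arbitrary units

params, _ = curve_fit(hill, conc, resp, p0=[100, 370, 4e-8, 1.0])
bottom, top, ec50, n_h = params
print(f"EC50 ≈ {ec50:.2e} M, Hill slope ≈ {n_h:.2f}")
```

The fitted ec50 is the concentration giving a half-maximal response; a lower EC50 at the CAM receptor, as reported above, corresponds to a left-shifted curve.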

Relevance: 20.00%

Abstract:

Melanoma is the most common lethal cutaneous neoplasm. In order to harmonize the treatment and follow-up of melanoma patients, guidelines for the management of melanoma in Switzerland were first issued in 2001. These have been approved by all Swiss medical societies involved in the care of melanoma patients. New data have necessitated changes concerning the safety margins (reduced to a maximum of 2 cm) and modifications of the follow-up recommendations.