519 results for "Puzzle unforgeability"


Relevance: 10.00%

Abstract:

Background: The prognosis of prostate cancer (PCa) is based mainly on histological features together with serum PSA levels, which do not always reflect the real aggressive potential of the neoplasia. The microRNA (miRNA) miR-21 has been shown to regulate invasiveness in cancer through translational repression of the metalloproteinase (MMP) inhibitor RECK. Our aim was to investigate the expression levels of RECK and miR-21 in PCa, comparing them with classical prognostic factors and disease outcome, and also to test whether RECK is a target of miR-21 in an in vitro study using a PCa cell line. Materials and methods: To determine whether RECK is a target of miR-21 in prostate cancer, we performed an in vitro assay with the PCa cell line DU-145 transfected with pre-miR-21 and anti-miR-21. To determine miR-21 and RECK expression levels in PCa samples, we performed quantitative real-time polymerase chain reaction (qRT-PCR). Results: The in vitro assays showed a decrease in RECK expression levels after transfection with pre-miR-21, and an increase of MMP-9, which is regulated by RECK, compared to PCa cells treated with anti-miR-21. We defined three profiles to compare the prognostic factors. The first was characterized by miR-21 and RECK underexpression (N = 25), the second by miR-21 overexpression and RECK underexpression (N = 12), and the third by miR-21 underexpression and RECK overexpression (N = 16). Of the men who presented the second profile (miR-21 overexpression and RECK underexpression), 91.7% were staged pT3, compared with 48.0% and 46.7% of patients in the other two groups (p = 0.025). Conclusions: Our results demonstrate that RECK is a target of miR-21. We believe that miR-21 may be important in PCa progression through its regulation of RECK, a known regulator of tumor cell invasion.
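
As an aside, the abstract does not state how the qRT-PCR data were quantified; a common approach for this kind of relative expression comparison is the 2^(-ΔΔCt) method, sketched below purely as an illustration with placeholder Ct values.

```python
# Hedged sketch: the abstract does not specify the quantification method;
# the 2^(-ddCt) method shown here is an assumption, with made-up Ct values.
def fold_change(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """Relative expression (fold change) via the 2^(-ddCt) method."""
    d_ct_sample = ct_target - ct_reference            # normalise to reference gene
    d_ct_control = ct_target_ctrl - ct_reference_ctrl
    dd_ct = d_ct_sample - d_ct_control                # compare to control sample
    return 2.0 ** (-dd_ct)

# Example: miR-21 Ct in a tumour sample vs. a reference small RNA,
# relative to a control sample (all values are placeholders).
print(fold_change(ct_target=24.0, ct_reference=20.0,
                  ct_target_ctrl=26.0, ct_reference_ctrl=20.0))  # -> 4.0
```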

Relevance: 10.00%

Abstract:

In 1603, the Italian shoemaker Vincenzo Cascariolo found that a stone (baryte) from the outskirts of Bologna emitted light in the dark without any external excitation source. However, calcination of the baryte was needed prior to this observation. The stone, later named the Bologna Stone, was among the first luminescent materials and the first documented material to show persistent luminescence. The mechanism behind the persistent emission in this material has remained a mystery ever since. In this work, the Bologna Stone (BaS) was prepared from the natural baryte (Bologna, Italy) used by Cascariolo. Its properties, e.g. impurities (dopants) and their valences, luminescence, persistent luminescence and trap structure, were compared to those of pure BaS materials doped with different (transition) metals (Cu, Ag, Pb) known to yield strong luminescence. The work was carried out using different methods (XANES, TL, VUV-UV-vis luminescence, TGA-DTA, XPD). A plausible mechanism for the persistent luminescence from the Bologna Stone, with Cu+ as the emitting species, was constructed based on the results obtained. The puzzle of the Bologna Stone can thus be considered resolved after some 400 years of study.

Relevance: 10.00%

Abstract:

The maintenance of biodiversity is a long-standing puzzle in ecology. It is a classical result that if the interactions of the species in an ecosystem are chosen at random, then complex ecosystems cannot sustain themselves, meaning that the structure of the interactions between species must be a central component in the preservation of biodiversity and in the stability of ecosystems. The rock-paper-scissors model is one of the paradigmatic models used to study how biodiversity is maintained. In this model three species dominate each other in a cyclic way (mimicking a trophic cycle): rock dominates scissors, which dominates paper, which dominates rock. In the original version of this model, this dominance obeys a Z_3 symmetry, in the sense that the strength of dominance is always the same. In this work, we break this symmetry and study the effects of adding an asymmetry parameter. In the usual model, on a two-dimensional lattice, the species distribute themselves according to spiral patterns that can be explained by the complex Ginzburg-Landau equation. With the addition of asymmetry, new spatial patterns appear during the transient, and the system ends either in a state with spirals similar to those of the original model, in a state where unstable spatial patterns dominate, or in a state where only one species survives (and biodiversity is lost).
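
A minimal sketch of such a simulation is given below. It is an illustrative assumption, not the authors' code: lattice size, rates and the update rule are chosen only for demonstration, with `epsilon` playing the role of the asymmetry parameter that breaks the Z_3 symmetry.

```python
# Stochastic rock-paper-scissors model on a 2D lattice with an asymmetry
# parameter epsilon; epsilon = 0 recovers the symmetric (Z_3) cycle.
import numpy as np

rng = np.random.default_rng(1)
L = 64                                    # lattice side length (assumption)
EMPTY = 3                                 # empty-site marker
grid = rng.integers(0, 3, size=(L, L))    # species 0, 1, 2 on every site

epsilon = 0.2                             # asymmetry parameter
# species s preys on species (s + 1) % 3 with these relative strengths
predation = np.array([1.0 + epsilon, 1.0, 1.0])
predation = predation / predation.max()   # convert strengths to probabilities
reproduction = 1.0                        # probability of filling an empty site

neighbours = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def update(grid):
    """One random sequential update with periodic boundary conditions."""
    x, y = rng.integers(0, L, size=2)
    dx, dy = neighbours[rng.integers(4)]
    nx, ny = (x + dx) % L, (y + dy) % L
    s, n = grid[x, y], grid[nx, ny]
    if s == EMPTY:
        return
    if n == EMPTY and rng.random() < reproduction:
        grid[nx, ny] = s                  # reproduce into the empty site
    elif n == (s + 1) % 3 and rng.random() < predation[s]:
        grid[nx, ny] = EMPTY              # predation leaves an empty site

for sweep in range(100):                  # 100 Monte Carlo sweeps (assumption)
    for _ in range(L * L):
        update(grid)

abundances = [(grid == s).sum() for s in range(3)]
print("species abundances after 100 sweeps:", abundances)
```

Tracking the abundances over many sweeps (or plotting `grid`) shows whether spirals persist, unstable patterns take over, or a single species fixates as the asymmetry grows.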

Relevance: 10.00%

Abstract:

This work is based on a study of the Maspalomas dune field (figure 1) on the island of Gran Canaria (Spain). Its sedimentary processes are identified and represented separately in order to delimit the different sedimentary and aeolian sub-units and obtain a "spectral" analysis. The analytical series developed allows us to construct a sequence of thematic maps. The cartographic puzzle is then integrated, once the physical varieties of the sedimentary dynamics acting on the territory are well known and clearly understood. The integrated cartography, or general view of the transport and depositional processes, contains enough information, from a physical perspective (based on the dune biotope), to support decisions on the arrangement, planning and management of the territory.

Relevance: 10.00%

Abstract:

The present research aims at shedding light on the demanding puzzle of child undernutrition in India. The so-called 'Indian development paradox' refers to the phenomenon whereby higher per capita income is recorded alongside a sluggish reduction in the proportion of underweight children aged below three years. In the period from 2000 to 2005, real Gross Domestic Product per capita grew at 5.4% annually, whereas the proportion of children who are underweight declined from 47% to 46%, a mere one percentage point. This trend opens up space for discussing the traditionally assumed link between income poverty and undernutrition, as well as food intervention as the main focus of policies designed to fight child hunger. It also opens the door to evaluating the role of an alternative economic approach to explaining undernutrition, the Capability Approach. The Capability Approach argues for widening the informational basis to account not only for resources but also for variables related to liberties, opportunities and autonomy in pursuing what individuals value. The econometric analysis highlights the relevance of including behavioral factors when explaining child undernutrition. In particular, the ability of the mother to move freely in the community without having to ask permission from her husband or mother-in-law is statistically significant when included in the model, which also accounts for confounding traditional variables such as economic wealth and food security. Focusing on agency, the results also indicate the necessity of measuring autonomy in different domains and the need to improve the measurement scale for agency data, especially with regard to the domain of household duties. Finally, future research is required to investigate policy avenues for increasing agency among women and in the communities they live in as a viable strategy for reducing the plague of child undernutrition in India.
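
A minimal sketch of the kind of model described is shown below. It is illustrative only: the abstract does not give the estimator or the variable names, so the logistic specification, the variable names and the synthetic data are all assumptions.

```python
# Hedged sketch: logistic regression of child underweight status on a
# maternal-autonomy indicator plus conventional controls (wealth, food
# security), loosely mirroring the analysis described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "underweight": rng.integers(0, 2, n),          # 1 = child underweight
    "mother_moves_freely": rng.integers(0, 2, n),  # 1 = no permission needed
    "wealth_index": rng.normal(size=n),            # household wealth proxy
    "food_secure": rng.integers(0, 2, n),          # household food security
})

model = smf.logit(
    "underweight ~ mother_moves_freely + wealth_index + food_secure", data=df
).fit(disp=False)
print(model.summary())
```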

Relevance: 10.00%

Abstract:

This work contributes to the field of spatial economics by embracing three distinct modelling approaches belonging to different strands of the theoretical literature. In the first chapter I present a theoretical model in which changes in an urban system's degree of functional specialisation are linked to (i) firms' organisational choices and (ii) firms' location decisions. The interplay between firms' internal communication and managing costs (between headquarters and production plants) and the cost of communicating with distant business services providers drives the transition from an "integrated" urban system, where each city hosts every function, to a "functionally specialised" urban system, where each city is either a primary business center (hosting advanced business services providers), a secondary business center or a pure manufacturing city, and all these city types coexist in equilibrium. The second chapter investigates the impact of free trade on welfare in a two-country world modelled as an international Hotelling duopoly with quadratic transport costs and asymmetric countries, where a negative environmental externality is associated with the consumption of the good produced in the smaller country. The countries' relative sizes as well as the intensity of the negative environmental externality affect the potential welfare gains of trade liberalisation. The third chapter focuses on the paradox by which, contrary to theoretical predictions, empirical evidence shows that a decrease in international transport costs causes an increase in foreign direct investments (FDIs). Here we propose an explanation for this apparent puzzle by exploiting an approach that delivers a continuum of Bertrand-Nash equilibria ranging above marginal cost pricing. In our setting, two Bertrand firms, supplying a homogeneous good with a convex cost function, enter the market of a foreign country. We show that allowing for softer price competition may indeed more than offset the standard effect generated by a decrease in trade costs, thereby restoring FDI incentives.
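
For orientation, the symmetric textbook baseline of the Hotelling linear city with quadratic transport costs reads as follows; this is a sketch only, and the chapter's asymmetric, two-country extension modifies this setup.

```latex
% Hotelling linear city with quadratic transport costs (symmetric baseline).
% Consumers are uniform on [0,1]; firms sit at 0 and 1 and charge p_1, p_2.
\[
  p_1 + t\hat{x}^{\,2} \;=\; p_2 + t\,(1-\hat{x})^{2}
  \quad\Longrightarrow\quad
  \hat{x} \;=\; \frac{1}{2} + \frac{p_2 - p_1}{2t},
\]
% With a common marginal cost c, the price equilibrium of the symmetric game:
\[
  p_1^{*} \;=\; p_2^{*} \;=\; c + t, \qquad \hat{x}^{*} \;=\; \tfrac12 .
\]
```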

Relevance: 10.00%

Abstract:

This work is devoted to the study of the imaginary geography created by the Indian English-language writer R.K. Narayan (1906-2001), with the aim not only of investigating the relationship established between space, characters and narrative, but also of highlighting the interaction between the narrative world and the dominant representations of Indian space elaborated in the colonial and postcolonial context. After a first, theoretical-methodological chapter (which examines the main reflections that followed the "spatial turn" in the humanities during the twentieth century, the fundamental concepts formulated within the theory of "fictional worlds", and the most recent approaches to the relationship between space and literature), the research is organised into two further sections, which address the author's fourteen novels through an interpretative practice of geocritical and "spatialised" inspiration. The second chapter, concerning the "vertical" dimension that extends from the chronotope of the novels to that of the author and the readers, identifies three macro-landscapes within the narrative world, which are then compared with the endogenous and exogenous representations of extratextual space; from this comparison, the town of Malgudi emerges as an authorial proposal for social and urban reorganisation, innovative in character and heterotopic in status, both in relation to the literary tradition from which it originates and with respect to the environmental circumstances of southern India in which it is fictionally set. Following a "horizontal" dynamic, the third chapter finally examines the relationship between the fragmented space of Malgudi, the places frequented by its inhabitants, and the relationship these establish with the territory beyond its borders and with the figure of the stranger; moreover, in order to establish the extent to which the nature of narrative space affects the form of the narrative, it observes the correspondences between the theme of incompleteness that pervades the characters' stories and the open form of the novels' endings.

Relevance: 10.00%

Abstract:

In many areas of industrial manufacturing, such as the automotive industry, digital mock-ups are used to support the development of complex machines with computer systems as effectively as possible. Motion planning algorithms play an important role here in ensuring that these digital prototypes can be assembled without collisions. In recent decades, sampling-based methods have proven particularly successful. They generate a large number of random poses for the object to be installed or removed and use a collision detection mechanism to check the validity of each pose. Collision detection therefore plays an essential role in the design of efficient motion planning algorithms. A difficulty for this class of planners is posed by so-called "narrow passages", which occur wherever the freedom of movement of the objects to be planned is severely restricted. In such regions it can be difficult to find a sufficient number of collision-free samples, and more sophisticated techniques may be needed to achieve good performance.
This thesis is divided into two parts. In the first part we investigate parallel collision detection algorithms. Since we are aiming at an application in sampling-based motion planners, we choose a problem setting in which we always test the same two objects for collision, but in a large number of different poses. We implement and compare several methods that use bounding volume hierarchies (BVHs) and hierarchical grids as acceleration structures. All of the described methods were parallelised across multiple CPU cores. In addition, we compare different CUDA kernels for performing BVH-based collision tests on the GPU. Besides different distributions of the work among the parallel GPU threads, we investigate the effect of different memory access patterns on the performance of the resulting algorithms. We further present a number of approximate collision tests based on the described methods; if a lower test accuracy is tolerable, a further performance improvement can be achieved.
In the second part of the thesis we describe a parallel, sampling-based motion planner of our own design for handling highly complex problems with multiple narrow passages. The method works in two phases. The basic idea is to conceptually allow small errors in the first planning phase in order to increase planning efficiency, and then to repair the resulting path in a second phase. The planner used in phase I is based on so-called Expansive Space Trees. In addition, we equipped the planner with a push-free operation that allows minor collisions to be resolved, thereby increasing efficiency in regions with restricted freedom of movement. Optionally, our implementation allows the use of approximate collision tests; this further reduces the accuracy of the first planning phase, but also yields a further performance gain. The motion paths resulting from phase I may then not be completely collision-free. To repair these paths, we designed a novel planning algorithm that plans a new, collision-free motion path locally, restricted to a small neighbourhood of the existing path.
We tested the described algorithm on a class of new, difficult metal puzzles, some of which feature several narrow passages. To the best of our knowledge, a collection of comparably complex benchmarks is not publicly available, nor did we find a description of comparably complex benchmarks in the motion-planning literature.
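
To make the first part's setting concrete, here is a minimal sketch (an assumption for illustration, not the thesis code) of testing the same two triangle meshes for collision in many random poses using axis-aligned bounding box (AABB) hierarchies; exact triangle-triangle tests are replaced by conservative leaf-box overlap, so this corresponds to an approximate test.

```python
# Conservative BVH (AABB-tree) collision testing of two fixed objects in many
# random poses. Triangle soups, leaf size and pose sampling are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def build_bvh(tris):
    """Recursively build a binary AABB tree over a (n, 3, 3) triangle array."""
    lo, hi = tris.min(axis=(0, 1)), tris.max(axis=(0, 1))
    if len(tris) <= 4:                        # leaf size: assumption
        return (lo, hi, None, None)
    axis = np.argmax(hi - lo)                 # split along the longest axis
    order = np.argsort(tris[:, :, axis].mean(axis=1))
    half = len(tris) // 2
    return (lo, hi, build_bvh(tris[order[:half]]), build_bvh(tris[order[half:]]))

def boxes_overlap(a, b):
    return np.all(a[0] <= b[1]) and np.all(b[0] <= a[1])

def bvh_collide(a, b):
    """Conservative overlap test between two AABB trees."""
    if not boxes_overlap(a, b):
        return False
    if a[2] is None and b[2] is None:         # two overlapping leaves
        return True
    if a[2] is None:                          # descend into the non-leaf node
        return bvh_collide(a, b[2]) or bvh_collide(a, b[3])
    return bvh_collide(a[2], b) or bvh_collide(a[3], b)

def random_pose():
    """A random rigid transform: rotation (via QR) plus translation."""
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1                         # ensure a proper rotation
    return q, rng.uniform(-1.0, 1.0, size=3)

# Two random triangle soups stand in for the CAD parts (assumption).
obj_a = rng.uniform(-0.5, 0.5, size=(200, 3, 3))
obj_b = rng.uniform(-0.5, 0.5, size=(200, 3, 3))
bvh_a = build_bvh(obj_a)                      # object A stays fixed

hits = 0
for _ in range(1000):                         # many random poses of object B
    r, t = random_pose()
    bvh_b = build_bvh(obj_b @ r.T + t)        # rebuild the tree in the new pose
    hits += bvh_collide(bvh_a, bvh_b)
print(f"{hits} of 1000 sampled poses are (conservatively) in collision")
```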

Relevance: 10.00%

Abstract:

A central goal of modern physics is the discovery of physics beyond the Standard Model. One of the most significant hints of New Physics can be seen in the anomalous magnetic moment of the muon - one of the most precisely measured quantities in modern physics and the main motivation of this work. This quantity is associated with the coupling of the muon, an elementary particle, to an external electromagnetic field and is defined as a = (g - 2)/2, where g is the gyromagnetic factor of the muon. The muon anomaly has been measured with a relative accuracy of 0.5·10^-6. However, a difference of 3.6 standard deviations between the direct measurement and the Standard Model prediction can be observed. This could be a hint of the existence of New Physics. It is, however, not yet significant enough to claim an observation, and thus more precise measurements and calculations have to be performed. The muon anomaly has three contributions; those from quantum electrodynamics and the weak interaction can be determined from perturbative calculations, but this cannot be done for the hadronic contributions at low energies. The leading-order contribution - the hadronic vacuum polarization - can be computed via a dispersion integral, which takes as input hadronic cross section measurements from electron-positron annihilation. Hence, it is essential for a precise prediction of the muon anomaly to measure these hadronic cross sections, σ(e+e-→hadrons), with high accuracy. With a contribution of more than 70%, the final state containing two charged pions is the most important one in this context. In this thesis, a new measurement of the σ(e+e-→π+π-) cross section and the pion form factor is performed with an accuracy of 0.9% in the dominant ρ(770) resonance region between 600 and 900 MeV at the BESIII experiment. The two-pion contribution to the leading-order (LO) hadronic vacuum polarization contribution to (g - 2) obtained in this work from the BESIII result is a(ππ, LO, 600-900 MeV) = (368.2 ± 2.5_stat ± 3.3_sys)·10^-10. With the result presented in this thesis, we make an important contribution on the way to solving the (g - 2) puzzle.
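
For reference (this is a standard-form sketch, not quoted from the thesis), one common way of writing the dispersion integral for the leading-order hadronic vacuum polarization is:

```latex
% Leading-order hadronic vacuum polarization via a dispersion integral over
% the measured hadronic cross section; s_thr denotes the hadronic threshold.
\[
  a_\mu^{\mathrm{had,\,LO}}
  \;=\; \frac{\alpha^{2}}{3\pi^{2}}
        \int_{s_{\mathrm{thr}}}^{\infty} \frac{\mathrm{d}s}{s}\, K(s)\, R(s),
  \qquad
  R(s) \;=\; \frac{\sigma\!\left(e^{+}e^{-}\to \mathrm{hadrons}\right)}
                  {\sigma\!\left(e^{+}e^{-}\to \mu^{+}\mu^{-}\right)},
\]
% K(s) is a known, positive QED kernel that falls roughly like 1/s, which is
% why the low-energy region, and in particular the pi+pi- channel, dominates.
```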

Relevance: 10.00%

Abstract:

Dynamic sexual signals often show a diel rhythm and may vary substantially with time of day. Diel and short-term fluctuations in such sexual signals pose a puzzle for condition-capture models of mate choice, which assume a female preference for male traits that reliably reflect a male's quality. Here we experimentally manipulated the food supply of individual male field crickets Gryllus campestris in their natural habitat in two consecutive seasons to determine (i) the effect of male nutritional condition on the fine-scale variation of diel investment in acoustic signalling and (ii) the temporal association between the diel variation in male signalling and female mate-searching behaviour. Overall, food-supplemented males signalled more often, but the effect was only visible during the daytime. In the evening and at night, signal output was still high, but the time spent signalling was unrelated to a male's nutritional condition. Females' mate-searching behaviour also showed a diel rhythm, with peak activity during the afternoon, when differences among calling males were highest and signal output reliably reflected male quality. These findings suggest that males differing in nutritional condition may optimize their investment in signalling in relation to time of day so as to maximize mating success.

Relevance: 10.00%

Abstract:

In the dual ex vivo perfusion of an isolated human placental cotyledon, it takes on average 20-30 min to set up stable perfusion circuits for the maternal and fetal vascular compartments. In vivo, placental tissue of all species maintains a highly active metabolism, and it continues to puzzle investigators how this tissue can survive 30 min of ischemia, with more or less complete anoxia, following expulsion of the organ from the uterus without severe damage. There seem to be parallels between the "depressed metabolism" seen in the fetus and the immature neonate in the peripartum period and survival strategies described in animals with increased tolerance of severe hypoxia, such as hibernators in the state of torpor or deep-diving sea turtles. Increased tolerance of hypoxia in both is explained by "partial metabolic arrest" in the sense of a temporary suspension of Kleiber's rule. Furthermore, the fetus can react to major changes in surrounding oxygen tension by decreasing or increasing the rate of specific basal metabolism, providing protection against severe hypoxia as well as oxidative stress. There is some evidence that adaptive mechanisms allowing increased tolerance of severe hypoxia in the fetus or immature neonate can also be found in placental tissue, of which at least the villous portion is of fetal origin. A better understanding of the molecular details of the reprogramming of fetal and placental tissues in late pregnancy may be of clinical relevance for an improved risk assessment of the individual fetus during the critical transition from intrauterine life to the outside world and for the development of potential prophylactic measures against severe ante- or intrapartum hypoxia. Responses of the tissue to reperfusion deserve intensive study, since they may provide a rational basis for preventive measures against reperfusion injury and related oxidative stress. Modification of the handling of placental tissue during postpartum ischemia, and adaptation of the artificial reperfusion, may lead to an improvement of the ex vivo perfusion technique.

Relevance: 10.00%

Abstract:

The purpose of this research was to develop a working physical model of the focused plenoptic camera and software that can process the measured image intensity, reconstruct it into a full-resolution image, and derive a depth map from the corresponding rendered image. The plenoptic camera is a specialized imaging system designed to acquire spatial, angular, and depth information in a single intensity measurement. This camera can also computationally refocus an image by adjusting the patch size used to reconstruct the image. The published methods have been vague and conflicting, so the motivation behind this research was to decipher the work that has been done in order to develop a working proof-of-concept model. This thesis outlines the theory behind plenoptic camera operation and shows how the measured intensity from the image sensor can be turned into a full-resolution rendered image with its corresponding depth map. The depth map can be created by a cross-correlation of adjacent sub-images created by the microlenslet array (MLA). The full-resolution image reconstruction can be done by taking a patch from each MLA sub-image and piecing them together like a puzzle. The patch size determines which object plane will be in focus. This thesis also gives a rigorous explanation of the design constraints involved in building a plenoptic camera. Plenoptic camera data from Adobe were used to help develop the algorithms written to create a rendered image and its depth map. Finally, using the algorithms developed from these tests and the knowledge gained in developing the plenoptic camera, a working experimental system was built, which successfully generated a rendered image and its corresponding depth map.
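
The patch-and-tile rendering step described above can be illustrated with a short sketch; this is an assumption for illustration, not the thesis software, and `render`, `sub` and `patch` are hypothetical names. The patch size passed in selects which object plane is rendered in focus.

```python
# Render a focused-plenoptic image by cutting a central patch out of every
# microlens sub-image and tiling the patches side by side.
import numpy as np

def render(raw, sub, patch):
    """Tile a (patch x patch) crop from the centre of each sub-image.

    raw   : 2D sensor image
    sub   : sub-image pitch in pixels (one microlens image per sub x sub cell)
    patch : crop size; changing it computationally refocuses the image
    """
    rows, cols = raw.shape[0] // sub, raw.shape[1] // sub
    off = (sub - patch) // 2                       # centre the crop
    out = np.zeros((rows * patch, cols * patch), dtype=raw.dtype)
    for r in range(rows):
        for c in range(cols):
            tile = raw[r*sub + off : r*sub + off + patch,
                       c*sub + off : c*sub + off + patch]
            out[r*patch:(r+1)*patch, c*patch:(c+1)*patch] = tile
    return out

# Example with synthetic data: a 10 x 10 grid of 20-pixel sub-images.
raw = np.random.default_rng(0).random((200, 200))
image_far  = render(raw, sub=20, patch=7)   # one object plane in focus
image_near = render(raw, sub=20, patch=9)   # computational refocus
print(image_far.shape, image_near.shape)
```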

Relevance: 10.00%

Abstract:

Telecommunications have developed at an incredible speed over the last couple of decades. The decreasing size of our phones and the increasing number of ways in which we can communicate are hardly the only results of this (r)evolutionary development. The latter has indeed multiple implications. The change of paradigm for telecommunications regulation, epitomised by the processes of liberalisation and reregulation, was not sufficient to answer all regulatory questions pertinent to communications. Today, after the transition from monopoly to competition, we are perhaps faced with an even harder regulatory puzzle, since we must figure out how to regulate a sector that is as dynamic and as unpredictable as electronic communications have proven to be, and as vital and fundamental to the economy and to society at large. The present book addresses the regulatory puzzle of contemporary electronic communications and suggests the outlines of a coherent model for their regulation. The search for such a model essentially involves deliberations on the question "Can competition law do it all?", since generic competition rules are largely seen as the appropriate regulatory tool for the communications domain. The latter perception has been the gist of the 2002 reform of the European Community (EC) telecommunications regime, which envisages a withdrawal of sectoral regulation as communications markets become effectively competitive, ultimately bestowing the regulation of the sector upon competition law alone. The book argues that the question of whether competition law is the appropriate tool needs to be examined not in the conventional contexts of sector-specific rules versus competition rules or deregulation versus regulation, but in a broader governance context. Consequently, the reader is provided with an insight into the workings and specific characteristics of the communications sector as network-bound, converging, dynamic and endowed with a special societal role and function. A thorough evaluation of the regulatory objectives in the communications environment contributes further to the comprehensive picture of the communications industry. Upon this carefully prepared basis, the book analyses the communications regulatory toolkit. It explores the interplay between sectoral communications regulation, competition rules (in particular Article 82 of the EC Treaty) and the rules of the World Trade Organization (WTO) relevant to telecommunications services. The in-depth analysis of the multilevel construct of EC communications law is up to date and takes into account important recent developments in EC competition law practice, in particular in the fields of refusal to supply and tying, the reform of the EC electronic communications framework, and new decisions of the WTO dispute settlement body, notably the Mexico-Telecommunications Services Panel Report. Upon these building elements, an assessment of the regulatory potential of the EC competition rules is made. The conclusions drawn reach beyond the current situation of EC electronic communications and the applicable law, and explore the possible contours of an optimal regulatory framework for modern communications. The book is of particular interest to communications and antitrust law experts, as well as policy makers, government agencies, consultancies and think-tanks active in the field.
Experts on other network industries (such as electricity or postal communications) can also profit from the substantial experience gathered in the communications sector as the most advanced one in terms of liberalisation and reregulation.

Relevance: 10.00%

Abstract:

In the first decades of the 20th century, aerological observations were performed for the first time in tropical regions. One of the most prominent endeavours in this respect was ARTHUR BERSON's aerological expedition to East Africa. Although the main target was the East African monsoon circulation, the expedition also provided other insights that profoundly changed meteorology and climatology. BERSON observed that the tropical tropopause was much higher and colder than that over midlatitudes. Moreover, westerly winds were observed in the lower stratosphere, apparently contradicting the high-altitude equatorial easterly winds that had been known since the Krakatoa eruption ("Krakatoa easterlies"). The puzzle was only resolved five decades later with the discovery of the Quasi-Biennial Oscillation (QBO). In this paper we briefly summarize BERSON's expedition and review the results in a historical context and in the light of current research. In the second part of the paper we revisit BERSON's early aerological observations, which we have digitized. We compare the observed wind profiles with corresponding profiles extracted from the "Twentieth Century Reanalysis", which provides global three-dimensional weather information back to 1871 based on an assimilation of sea-level and surface pressure data. The comparison shows good agreement at the coast but poorer agreement further inland, at the shore of Lake Victoria, where the circulation is more complex. These results demonstrate that BERSON's observations are still valuable today as input to current reanalysis systems or for their validation.
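
A minimal sketch of such a profile comparison is shown below; the numbers are made-up placeholders, not the paper's data or code. It interpolates the reanalysis winds to the observed pressure levels and computes bias and RMSE.

```python
# Compare a digitized wind profile with a reanalysis profile by interpolating
# the reanalysis to the observed pressure levels (placeholder values only).
import numpy as np

# observed profile: pressure (hPa) and zonal wind (m/s), placeholders
p_obs = np.array([900.0, 800.0, 700.0, 600.0, 500.0])
u_obs = np.array([-3.0, -2.0, 1.0, 4.0, 6.0])

# reanalysis profile on its own pressure levels, placeholders
p_rea = np.array([1000.0, 925.0, 850.0, 700.0, 600.0, 500.0, 400.0])
u_rea = np.array([-4.0, -3.5, -2.5, 0.5, 3.0, 5.5, 8.0])

# np.interp needs increasing abscissae, so interpolate in reversed log-pressure
u_rea_on_obs = np.interp(np.log(p_obs)[::-1], np.log(p_rea)[::-1],
                         u_rea[::-1])[::-1]

bias = np.mean(u_rea_on_obs - u_obs)
rmse = np.sqrt(np.mean((u_rea_on_obs - u_obs) ** 2))
print(f"bias = {bias:+.2f} m/s, rmse = {rmse:.2f} m/s")
```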

Relevance: 10.00%

Abstract:

Contents:
Building a business
Diver displays artistry in science
Event offers food for thought
Raw milk stirs up hot debate
Gymnasts piece together puzzle before regionals
Global gas tank nears empty
Birds' nest treats for Easter time