471 results for puzzle
Abstract:
A central goal of modern physics is the discovery of physics beyond the Standard Model. One of the most significant hints for New Physics can be seen in the anomalous magnetic moment of the muon, one of the most precisely measured quantities in modern physics and the main motivation of this work. This quantity is associated with the coupling of the muon, an elementary particle, to an external electromagnetic field and is defined as a = (g − 2)/2, where g is the gyromagnetic factor of the muon. The muon anomaly has been measured with a relative accuracy of 0.5·10⁻⁶. However, the direct measurement and the Standard Model prediction differ by 3.6 standard deviations. This could be a hint for the existence of New Physics, but it is not yet significant enough to claim an observation, and thus more precise measurements and calculations have to be performed.
The muon anomaly has three contributions, of which the ones from quantum electrodynamics and the weak interaction can be determined from perturbative calculations. This is not possible for the hadronic contributions at low energies. The leading-order contribution, the hadronic vacuum polarization, can instead be computed via a dispersion integral, which requires measured hadronic cross sections from electron-positron annihilation as input. Hence, it is essential for a precise prediction of the muon anomaly to measure these hadronic cross sections, σ(e+e− → hadrons), with high accuracy. With a contribution of more than 70%, the final state containing two charged pions is the most important one in this context.
In this thesis, a new measurement of the σ(e+e− → π+π−) cross section and the pion form factor is performed with an accuracy of 0.9% in the dominant ρ(770) resonance region between 600 and 900 MeV at the BESIII experiment. The two-pion contribution to the leading-order (LO) hadronic vacuum polarization contribution to (g − 2), computed in this work from the BESIII result, is a(ππ, LO, 600-900 MeV) = (368.2 ± 2.5_stat ± 3.3_sys)·10⁻¹⁰. With the result presented in this thesis, we make an important contribution toward solving the (g − 2) puzzle.
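For orientation, the leading-order dispersion integral referred to above has a standard form in the literature (the abstract itself does not spell it out; this is the textbook expression, not a quotation from the thesis):

\[
a_\mu^{\mathrm{had,LO}} \;=\; \frac{\alpha^{2}}{3\pi^{2}} \int_{m_{\pi}^{2}}^{\infty} \frac{\mathrm{d}s}{s}\, K(s)\, R(s),
\qquad
R(s) \;=\; \frac{\sigma(e^{+}e^{-} \to \mathrm{hadrons})}{\sigma(e^{+}e^{-} \to \mu^{+}\mu^{-})},
\]

where K(s) is a known kernel function that, together with the 1/s factor, weights low energies most strongly; this is why the π+π− channel in the ρ(770) region dominates both the value and the uncertainty of the hadronic contribution.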
Abstract:
Dynamic sexual signals often show a diel rhythm and may vary substantially with time of day. Diel and short-term fluctuations in such sexual signals pose a puzzle for condition-capture models of mate choice, which assume a female preference for male traits that reliably reflect a male's quality. Here we experimentally manipulated the food supply of individual male field crickets Gryllus campestris in their natural habitat in two consecutive seasons to determine (i) the effect of male nutritional condition on the fine-scaled variation of diel investment in acoustic signalling and (ii) the temporal association between the diel variation in male signalling and female mate-searching behaviour. Overall, food-supplemented males signalled more often, but the effect was only visible during the daytime. In the evening and at night, signal output was still high, but the time spent signalling was unrelated to a male's nutritional condition. Females' mate-searching behaviour also showed a diel rhythm, with peak activity during the afternoon, when differences among calling males were greatest and signal output reliably reflected male quality. These findings suggest that males differing in nutritional condition may optimize their investment in signalling in relation to time of day so as to maximize mating success.
Abstract:
In the dual ex vivo perfusion of an isolated human placental cotyledon, it takes on average 20-30 min to set up stable perfusion circuits for the maternal and fetal vascular compartments. In vivo, placental tissue of all species maintains a highly active metabolism, and it continues to puzzle investigators how this tissue can survive 30 min of ischemia, with more or less complete anoxia, following expulsion of the organ from the uterus without severe damage. There seem to be parallels between the "depressed metabolism" seen in the fetus and the immature neonate in the peripartum period and the survival strategies described in mammals with increased tolerance of severe hypoxia, such as hibernators in the state of torpor or deep-sea diving turtles. Increased tolerance of hypoxia in both is explained by "partial metabolic arrest" in the sense of a temporary suspension of Kleiber's rule. Furthermore, the fetus can react to major changes in surrounding oxygen tension by decreasing or increasing the rate of specific basal metabolism, providing protection against severe hypoxia as well as oxidative stress. There is some evidence that the adaptive mechanisms allowing increased tolerance of severe hypoxia in the fetus or immature neonate can also be found in placental tissue, of which at least the villous portion is of fetal origin. A better understanding of the molecular details of the reprogramming of fetal and placental tissues in late pregnancy may be of clinical relevance for an improved risk assessment of the individual fetus during the critical transition from intrauterine life to the outside world and for the development of potential prophylactic measures against severe ante- or intrapartum hypoxia. Responses of the tissue to reperfusion deserve intensive study, since they may provide a rational basis for preventive measures against reperfusion injury and related oxidative stress. Modification of the handling of placental tissue during postpartum ischemia, and adaptation of the artificial reperfusion, may lead to an improvement of the ex vivo perfusion technique.
Abstract:
The purpose of this research was to develop a working physical model of the focused plenoptic camera and to develop software that can process the measured image intensity, reconstruct it into a full-resolution image, and derive a depth map from the corresponding rendered image. The plenoptic camera is a specialized imaging system designed to acquire spatial, angular, and depth information in a single intensity measurement. This camera can also computationally refocus an image by adjusting the patch size used to reconstruct the image. The published methods have been vague and conflicting, so the motivation behind this research was to decipher the work that has been done in order to develop a working proof-of-concept model. This thesis outlines the theory behind plenoptic camera operation and shows how the measured intensity from the image sensor can be turned into a full-resolution rendered image with its corresponding depth map. The depth map can be created by cross-correlating adjacent sub-images created by the microlenslet array (MLA). The full-resolution image reconstruction can be done by taking a patch from each MLA sub-image and piecing the patches together like a puzzle; the patch size determines which object plane will be in focus. This thesis also goes through a rigorous explanation of the design constraints involved in building a plenoptic camera. Plenoptic camera data from Adobe was used to help develop the algorithms written to create a rendered image and its depth map. Finally, using the algorithms developed from these tests and the knowledge gained in designing the plenoptic camera, a working experimental system was built, which successfully generated a rendered image and its corresponding depth map.
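The patch-based reconstruction described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical implementation (not the thesis code), assuming a square microlens array aligned with the sensor grid; sub_size and patch_size are illustrative parameter names:

```python
import numpy as np

def render_full_resolution(raw, sub_size, patch_size):
    """Assemble a rendered image from a focused-plenoptic raw capture by
    taking the central patch of each MLA sub-image and tiling the patches.
    Varying patch_size refocuses the rendering on a different object plane."""
    rows, cols = raw.shape[0] // sub_size, raw.shape[1] // sub_size
    off = (sub_size - patch_size) // 2  # centre the patch in each sub-image
    out = np.empty((rows * patch_size, cols * patch_size), dtype=raw.dtype)
    for r in range(rows):
        for c in range(cols):
            sub = raw[r * sub_size:(r + 1) * sub_size,
                      c * sub_size:(c + 1) * sub_size]
            out[r * patch_size:(r + 1) * patch_size,
                c * patch_size:(c + 1) * patch_size] = \
                sub[off:off + patch_size, off:off + patch_size]
    return out

# Example: a synthetic 64x64 capture behind a 4x4 grid of 16-pixel microlenses.
raw = np.random.rand(64, 64)
image = render_full_resolution(raw, sub_size=16, patch_size=8)  # 32x32 output
```

The depth map mentioned in the abstract would come from a separate step: cross-correlating adjacent sub-images to estimate their relative shift (disparity), which varies with object depth.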
Abstract:
Telecommunications have developed at an incredible speed over the last couple of decades. The decreasing size of our phones and the increasing number of ways in which we can communicate are hardly the only results of this (r)evolutionary development, which has multiple implications. The change of paradigm for telecommunications regulation, epitomised by the processes of liberalisation and reregulation, was not sufficient to answer all regulatory questions pertinent to communications. Today, after the transition from monopoly to competition, we face perhaps an even harder regulatory puzzle: how to regulate a sector that is as dynamic and unpredictable as electronic communications have proven to be, and as vital and fundamental to the economy and to society at large. The present book addresses the regulatory puzzle of contemporary electronic communications and suggests the outlines of a coherent model for their regulation. The search for such a model essentially involves deliberations on the question "Can competition law do it all?", since generic competition rules are largely seen as the appropriate regulatory tool for the communications domain. This perception was the gist of the 2002 reform of the European Community (EC) telecommunications regime, which envisages a withdrawal of sectoral regulation as communications markets become effectively competitive and ultimately bestows the regulation of the sector upon competition law alone. The book argues that the question of whether competition law is the appropriate tool needs to be examined not in the conventional contexts of sector-specific rules versus competition rules, or deregulation versus regulation, but in a broader governance context. Consequently, the reader is provided with an insight into the workings and specific characteristics of the communications sector as network-bound, converging, dynamic, and endowed with a special societal role and function. A thorough evaluation of the regulatory objectives in the communications environment contributes further to this comprehensive picture of the communications industry. Upon this carefully prepared basis, the book analyses the communications regulatory toolkit. It explores the interplay between sectoral communications regulation, competition rules (in particular Article 82 of the EC Treaty) and the rules of the World Trade Organization (WTO) relevant to telecommunications services. The in-depth analysis of the multilevel construct of EC communications law is up to date and takes into account important recent developments: in EC competition law practice, in particular in the fields of refusal to supply and tying; in the reform of the EC electronic communications framework; and in new decisions of the WTO dispute settlement body, notably the Mexico-Telecommunications Services Panel Report. On these building blocks, an assessment of the regulatory potential of the EC competition rules is made. The conclusions drawn reach beyond the current situation of EC electronic communications and the applicable law, and explore the possible contours of an optimal regulatory framework for modern communications. The book is of particular interest to communications and antitrust law experts, as well as policy makers, government agencies, consultancies and think-tanks active in the field.
Experts on other network industries (such as electricity or postal communications) can also profit from the substantial experience gathered in the communications sector as the most advanced one in terms of liberalisation and reregulation.
Abstract:
In the first decades of the 20th century, aerological observations were performed for the first time in tropical regions. One of the most prominent endeavours in this respect was ARTHUR BERSON's aerological expedition to East Africa. Although the main target was the East African monsoon circulation, the expedition also provided other insights that profoundly changed meteorology and climatology. BERSON observed that the tropical tropopause was much higher and colder than that over midlatitudes. Moreover, westerly winds were observed in the lower stratosphere, apparently contradicting the high-altitude equatorial easterly winds that had been known since the Krakatoa eruption ("Krakatoa easterlies"). The puzzle was only resolved five decades later with the discovery of the Quasi-Biennial Oscillation (QBO). In this paper we briefly summarize BERSON's expedition and review its results in a historical context and in the light of current research. In the second part of the paper we revisit BERSON's early aerological observations, which we have digitized. We compare the observed wind profiles with corresponding profiles extracted from the "Twentieth Century Reanalysis", which provides global three-dimensional weather information back to 1871 based on an assimilation of sea-level and surface pressure data. The comparison shows good agreement at the coast but poorer agreement farther inland, at the shore of Lake Victoria, where the circulation is more complex. These results demonstrate that BERSON's observations are still valuable today as input to current reanalysis systems or for their validation.
Abstract:
Contents: Building a business; Diver displays artistry in science; Event offers food for thought; Raw milk stirs up hot debate; Gymnasts piece together puzzle before regionals; Global gas tank nears empty; Birds' nest treats for Easter time.
Abstract:
Prostate cancer is the second leading cause of cancer-related death and the most common non-skin cancer in men in the USA. Considerable advancements in the practice of medicine have allowed a significant improvement in the diagnosis and treatment of this disease, and in recent years both incidence and mortality rates have been slightly declining. However, it is still estimated that 1 man in 6 will be diagnosed with prostate cancer during his lifetime, and 1 man in 35 will die of the disease. In order to identify novel strategies and effective therapeutic approaches in the fight against prostate cancer, it is imperative to improve our understanding of its complex biology, since many aspects of prostate cancer initiation and progression still remain elusive. The study of tumor biomarkers, due to their specific altered expression in tumor versus normal tissue, is a valid tool for elucidating key aspects of cancer biology and may provide important insights into the molecular mechanisms underlying the tumorigenesis of prostate cancer. PCA3 is considered the most specific prostate cancer biomarker; however, its biological role has until now remained unknown. PCA3 is a long non-coding RNA (lncRNA) expressed from chromosome 9q21, and its study led us to the discovery of a novel human gene, PC-TSGC, transcribed from the opposite strand in an antisense orientation to PCA3. With the work presented in this thesis, we demonstrate that PCA3 exerts a negative regulatory role over PC-TSGC, and we propose PC-TSGC to be a new tumor suppressor gene that counteracts the transformation of prostate cells by inhibiting Rho-GTPase signaling pathways. Our findings provide a biological role for PCA3 in prostate cancer and suggest a new mechanism of tumor suppressor gene inactivation mediated by non-coding RNA. The characterization of PCA3 and PC-TSGC also leads us to propose a new molecular pathway involving both genes in the transformation of the prostate, thus providing a new piece of the jigsaw puzzle that is the complex biology of prostate cancer.
Abstract:
Solving the riddle of a thrombocytopenic patient is a difficult and fascinating task. The spectrum of possible aetiologies is wide, ranging from an in vitro artefact to severe treatment-resistant thrombocytopenic bleeding conditions, or even life-threatening prothrombotic states. Moreover, thrombocytopenia by itself does not protect from thrombosis, and sometimes a patient with a low platelet count requires concomitant antithrombotic treatment as well. In order to identify and treat the cause and the effects of the thrombocytopenia, you have to put together several pieces of information, solving a unique jigsaw puzzle. The present work is not a textbook article about thrombocytopenia, but rather a collection of differential diagnostic thoughts, treatment concepts, and some basic knowledge that you can retrieve when facing your next thrombocytopenic patient. Enjoy reading it, but most importantly enjoy taking care of patients with a low platelet count. I bet the present work will assist you in this challenging and rewarding clinical task.
Abstract:
This paper analyses the World Trade Organization within a principal-agent framework. The concept of complex agency is introduced to capture the variety of actors that comprise an international organization. Special attention is paid to the relationship between the contracting parties' representatives and the Secretariat. In the empirical part, the paper analyses the role of the Secretariat in assisting negotiations and presents evidence of its declining influence. It is shown how principal-agent theory can contribute to addressing this 'puzzle of missing delegation'. The paper concludes with a cautionary note as to the 'location' of international organizations' emerging pathologies and calls for additional research addressing the relationship between material and social sources of explanation for the behaviour of the key actors within the complex agency.
Abstract:
The World Trade Organization (WTO) operates one of the most judicialized dispute settlement systems in international politics. While a general appreciation has developed that the system has worked quite well, research has not paid sufficient attention to the weakest actors in the system. This paper addresses the puzzle of missing cases of least-developed countries initiating WTO dispute settlement procedures. It challenges the existing literature on developing countries in WTO dispute settlement, which predominantly focuses on legal capacity and economic interests. The paper argues that the small universe of 'actionable cases', the option of free riding and the assessment of the perceived opportunity costs related to other foreign policy priorities better explain the absence of cases. In addition (and somewhat counterintuitively), we argue that the absence of cases is not necessarily bad news, and we show how the weakest actors can use the dispute settlement system in a 'lighter version' or in indirect ways. The argument is empirically assessed by a case study of four West African cotton-producing countries (C4) and their involvement in dispute settlement.
Abstract:
Goal evaluation is an essential element of the process of designing regulatory frameworks. Lawyers and legal scholars do, however, tend to ignore it. The present paper stresses the importance of pinpointing the precise regulatory objectives in the fluid environment of electronic communications, since, owing to their technological and economic development, electronic communications have become the vital basis for communication and the distribution of information in modern societies. The paper attempts an analysis of the underlying regulatory objectives in contemporary communications and seeks to put together the complex puzzle of economic and societal issues.
Abstract:
Public broadcasting has always been a regulatory field somewhat zealously guarded within the nation states' sphere and willingly kept untouched by regional or international rules. Values inherent in the role of public broadcasting, such as cultural and national identity, social cohesion, pluralism and a sustained public sphere, were thought too critical and too historically connected with the particular society to allow any "outside" influence. Different regulatory models have emerged to reflect these specificities within the national boundaries of European countries. Yet, as media evolved technologically and economically, the constraints of state borders were rendered obsolete and the inner tension between culture and commerce in the television medium became more pronounced. This tension was only intensified with the formulation of a European Community (EC) layer of regulation, which had as its primary objective the creation of a single market for audiovisual services (or, as the EC Directive beautifully put it, a "Television without Frontiers"), while also including some provisions catering for cultural concerns, such as the infamous quota system for European and independent productions. Against this backdrop, public broadcasting makes a particularly intriguing subject for a study of regulatory dilemmas: national versus supranational, integration versus intergovernmentalism, culture versus commerce, intervention versus liberalisation, and all this in the dynamic setting of contemporary media. The present paper reviews Irini Katsirea's book PUBLIC BROADCASTING AND EUROPEAN LAW and seeks to identify whether all elements of the complex governance puzzle of European public service broadcasting rules are analytically well fitted together.
Abstract:
The goal of this work has been to calibrate the sensitivities and fragmentation patterns of various molecules, as well as to further characterize the lab model of the ROSINA Double Focusing Mass Spectrometer (DFMS) on board ESA's Rosetta spacecraft, bound for comet 67P/Churyumov-Gerasimenko. Detailed calibration and characterization of the instrument is key to understanding and interpreting the results obtained in the coma of the comet. A static calibration was performed for the following species: Ne, Ar, Kr, Xe, H2O, N2, CO2, CH4, C2H6, C3H8, C4H10, and C2H4. The purpose of the calibration was to obtain sensitivities for all detectors and emission settings, to determine the fragmentation behavior of the ion source, and to demonstrate the capability to measure isotopic ratios at the comet. The calibration included the recording of different correction factors needed to evaluate the data, including a detailed investigation of the detector gain. The quality of the calibration could be tested with different gas mixtures, including the calibration of the density inside the ion source when calibration gas is introduced from the gas calibration unit. In conclusion, the calibration shows that DFMS meets its design requirements and will be able to measure the D/H ratio at the comet, helping to shed more light on the puzzle of the origin of water on Earth.
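As an illustration of the kind of quantity derived in such a static calibration, the sketch below computes a per-species sensitivity as detected ion current per unit gas number density, normalised to the emission current. This is a generic mass-spectrometry convention with hypothetical names and numbers, not the actual DFMS analysis pipeline or its values:

```python
# Minimal sketch of a per-species sensitivity calculation, using the generic
# convention S = I_ion / (n * I_emission). All names and numbers below are
# illustrative assumptions, not ROSINA/DFMS values.

def sensitivity(ion_current_a, number_density_cm3, emission_current_a):
    """Return the sensitivity in cm^3: detected ion current per unit gas
    number density, normalised to the ion-source emission current."""
    return ion_current_a / (number_density_cm3 * emission_current_a)

if __name__ == "__main__":
    # Hypothetical calibration point for N2, introduced from a gas
    # calibration unit at a known ion-source number density.
    s_n2 = sensitivity(
        ion_current_a=2.0e-12,      # detected ion current [A]
        number_density_cm3=1.0e10,  # gas density in the ion source [cm^-3]
        emission_current_a=2.0e-4,  # electron emission current [A]
    )
    print(f"S(N2) = {s_n2:.3e} cm^3")
```

In practice such a sensitivity, together with the recorded fragmentation pattern and detector-gain corrections, is what converts a measured spectrum back into gas densities in the coma.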