44 results for modern physics
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
One of the unresolved questions of modern physics is the nature of Dark Matter. Strong experimental evidence suggests that this elusive component makes up a significant share of the energy budget of the Universe, without, however, providing conclusive information about its nature. The most plausible scenario is that of weakly interacting massive particles (WIMPs), which comprises a large class of non-baryonic Dark Matter candidates with masses typically between a few tens of GeV and a few TeV, and cross sections of the order of the weak interactions. The search for Dark Matter particles with very high energy gamma-ray Cherenkov telescopes rests on the assumption that WIMPs can self-annihilate, producing detectable species such as photons. These photons are very energetic and, since they are undeflected by the Universe's magnetic fields, they can be traced straight back to the source of their creation. The downside of the approach is the large amount of background radiation from conventional astrophysical objects, which usually hides clear signals of Dark Matter particle interactions. A good choice of observational candidates is therefore the crucial factor in the search for Dark Matter. With MAGIC (Major Atmospheric Gamma-ray Imaging Cherenkov Telescopes), a two-telescope ground-based system located on La Palma, Canary Islands, we choose objects such as dwarf spheroidal satellite galaxies of the Milky Way and galaxy clusters for our search. The idea is to increase the chances of WIMP detection by pointing at objects that are relatively close, rich in Dark Matter, and with as little stellar pollution as possible. At the moment, several observation projects are ongoing and analyses are being performed.
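For context, the expected gamma-ray flux from WIMP self-annihilation in such a target is usually written as the product of a particle-physics term and an astrophysical "J-factor". The following is the standard expression from the indirect-detection literature, included here only as an illustration (it is not spelled out in the abstract; the factor 8π assumes self-conjugate, i.e. Majorana, WIMPs):

```latex
\frac{d\Phi_\gamma}{dE} \;=\;
\underbrace{\frac{\langle \sigma v \rangle}{8\pi\, m_\chi^2}\,\frac{dN_\gamma}{dE}}_{\text{particle physics}}
\;\times\;
\underbrace{\int_{\Delta\Omega} d\Omega \int_{\mathrm{l.o.s.}} \rho_\chi^2(l)\, dl}_{\text{astrophysical J-factor}}
```

Because the J-factor grows with the square of the Dark Matter density along the line of sight and falls with distance, nearby, Dark-Matter-dominated, star-poor targets such as dwarf spheroidals maximize the expected signal relative to the astrophysical background.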
Abstract:
The 1st chapter of this work presents the different experiments and collaborations in which I have been involved during my PhD studies in Physics. Following those descriptions, the 2nd chapter is dedicated to how radiation affects silicon sensors, together with experimental measurements carried out at the CERN (Geneva, Switzerland) and IFIC (Valencia, Spain) laboratories. Besides these results, the chapter covers the most recent scientific papers that appeared in the latest RD50 (Research & Development #50) Status Report, published in January 2007, as well as some others published this year. The 3rd and 4th chapters are dedicated to the simulation of the electrical behavior of solid-state detectors. Chapter 3 reports the results obtained for the illumination of edgeless detectors irradiated at different fluences, in the framework of the TOSTER Collaboration. Chapter 4 reports on the design, simulation and fabrication of a novel 3D detector developed at CNM for ion detection in the future ITER fusion reactor. This chapter will be extended with irradiation simulations and experimental measurements in my PhD Thesis.
Abstract:
Study carried out during a stay at the Physics Department of New York University, United States, between 2006 and 2008. One of the most influential observations in modern cosmology has been the empirical determination that the Universe is currently in a phase of Accelerated Expansion (AE). This phenomenon implies that either the Universe is dominated by a new matter/energy sector, or General Relativity ceases to be valid on cosmological scales. The first possibility comprises the Dark Energy (DE) models, whose main problem is that the DE must have properties so special that they are difficult to justify theoretically. The second possibility requires the construction of consistent theories of gravity modified at large distances (Large-Distance Modified Gravity, LDMG), which generalize massive gravity models. Phenomenological interest in these theories resurged with the appearance of the first examples of LDMG models, such as the model of Dvali, Gabadadze and Porrati (DGP), which consists of a type of brane in an extra dimension. Unfortunately, however, this model cannot consistently explain the AE of the Universe. One of the goals of this project has been to establish the internal and phenomenological viability of LDMG models. From the phenomenological point of view, we have focused on the most important question in practice: finding observational signatures that make it possible to distinguish LDMG models from DE models. At a more theoretical level, we have also investigated the meaning of the instabilities of the DGP model. The other major goal we set ourselves was the construction of new LDMG theories. In the second part of this project, we developed the "Cascading DGP" model and demonstrated its consistency; it generalizes the DGP model to additional extra dimensions and represents the second known consistent, Lorentz-invariant model in flat space. The existence of LDMG models beyond DGP is of great interest, since it could make it possible to obtain the AE of the Universe in a purely geometric way.
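As a hedged illustration of how DGP-type models produce acceleration geometrically (this is the standard cosmological result for DGP from the literature, not reproduced from the abstract), the brane construction modifies the Friedmann equation through a crossover scale r_c:

```latex
H^2 - \epsilon\,\frac{H}{r_c} = \frac{8\pi G}{3}\,\rho , \qquad \epsilon = \pm 1 .
```

On the self-accelerating branch (ε = +1), H → 1/r_c as ρ → 0, so late-time acceleration arises with no Dark Energy at all; the instabilities mentioned above are known to afflict precisely this branch.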
Abstract:
We give a survey of some recent results on Grothendieck duality. We begin with a brief reminder of the classical theory, and then launch into an overview of some of the striking developments since 2005.
Abstract:
The quantitative estimation of Sea Surface Temperatures from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern coretop samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as the distance measure. Modern coretop datasets are characterised by a large number of zeros; zero replacement was carried out using a Bayesian approach based on posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by a multiple approach, considering the proxies correlation matrix, the Standardized Residual Sum of Squares and the Mean Squared Distance. The new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea.
Key words: Modern analogues, Aitchison distance, Proxies correlation matrix, Standardized Residual Sum of Squares
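As a minimal sketch of the distance measure involved (illustrative only; function names are ours, and the full CODAMAT pipeline of Bayesian zero replacement and analogue selection is not reproduced here), the Aitchison distance between two compositions is the Euclidean distance between their centred log-ratio (clr) transforms:

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform of a strictly positive composition."""
    logx = np.log(np.asarray(x, dtype=float))
    return logx - logx.mean()

def aitchison_distance(x, y):
    """Aitchison distance: Euclidean distance in clr coordinates."""
    return float(np.linalg.norm(clr(x) - clr(y)))

# Toy example: two 4-part faunal compositions (parts sum to 1, no zeros;
# in practice zeros must first be replaced, e.g. by a Bayesian-multiplicative rule).
a = np.array([0.50, 0.30, 0.15, 0.05])
b = np.array([0.45, 0.35, 0.10, 0.10])
print(aitchison_distance(a, b))
```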
Abstract:
Taking on the challenge of understanding and explaining the Symphony of (today's) New World in realistic terms (not realist), this essay aims to analyse the post-Cold War era by devising a multi-conceptual framework that combines different theoretical contributions not yet linked in a fully explanatory way. The paper suggests two inter-related analytical contexts (or background melodies) for understanding Dvořák's "New World". First, the socio-economic structural context that falls under the controversial category of Globalization and, second, the post-modern political structural context built on Robert Cooper's threefold analysis (Pre-modern, Modern and Post-modern) of today's world [Cooper, R: 1997, 1999]. Lastly, the closing movement (allegro con fuoco) enters the normative arena to assess American foreign policy options in the light of the theoretical framework devised in the first part of the essay.
Abstract:
An alternative approach to some fundamental concepts of general physics is proposed. We demonstrate that the electrostatic potential energy of a discrete or continuous system of charges should be stored by the charges and not by the field. It is found that the electric field may carry no energy density, and likewise the magnetic field, and that there is no direct relation between the electric or magnetic energy and photons. An alternative derivation of the blackbody radiation formula is proposed. It is also found that the zero-point energy of electromagnetic radiation may not exist.
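For context on where this debate lies (a standard textbook identity, not a result of the paper): for localized electrostatic sources the "field" and "charge" expressions for the total energy are numerically equal, differing only by a surface term that vanishes at infinity, so the total energy alone cannot decide where the energy resides:

```latex
U \;=\; \frac{\varepsilon_0}{2} \int |\mathbf{E}|^2 \, d^3r
\;=\; \frac{1}{2} \int \rho(\mathbf{r})\,\phi(\mathbf{r}) \, d^3r .
```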
Abstract:
This project was carried out at the American Museum of Natural History (AMNH, New York) between 31 December 2010 and 30 December 2012. Its objective was to elucidate the evolutionary history of the human hand: to trace the evolutionary changes in its shape and proportions that gave rise to its modern structure, which allows humans to manipulate with precision. The work included data collection and analysis, writing up of results, and training in specific analytical methods. During this time, the author completed his existing database of linear hand measurements in hominoids. Data were also collected on the foot, so that the database now comprises more than 500 individuals, with more than 200 measurements each. Three-dimensional data were also collected using a laser scanner, and techniques of 3D geometric morphometrics were learned directly from the pioneers of the field at the AMNH. This work has produced 10 abstracts (published at international conferences) and 9 manuscripts (many of them already published in international journals) with highly relevant results: the human hand possesses relatively primitive proportions, more similar to those of the Miocene fossil hominoids than to those of the living great apes. The latter have elongated hands with very short thumbs, reflecting the use of the hand as a suspensory tool below branches. By contrast, Miocene hominoids had relatively short hands with a long thumb, which they used to stabilize their weight when walking on top of branches. Once the first hominids appeared at the end of the Miocene (about 6 Ma) and adopted bipedalism as their most common mode of locomotion, their hands were "freed" from locomotor functions. Natural selection, now acting only on manipulation, turned the already existing hand proportions of these primates into the manipulative organ that the human hand represents today.
Abstract:
Taking the Royal College of Barcelona (1760-1843) as a case study, this paper shows the development of modern surgery in Spain, initiated by the Bourbon Monarchy through the founding of new kinds of institutions and their academic activities spreading scientific knowledge. Antoni Gimbernat was the most famous and internationally recognised Spanish surgeon. He was trained as a surgeon at the Royal College of Surgery in Cadiz and was later appointed professor of Anatomy at the College of Barcelona. He then became Royal Surgeon to King Carlos IV and, from that esteemed position in Madrid, worked resiliently to improve the quality of the Royal Colleges in Spain. Learning human body structure by performing hands-on dissections in the anatomical theatre became a fundamental element of modern medical education. Gimbernat favoured the study of the natural sciences, the new chemistry of Lavoisier and experimental physics in the academic programmes of surgery. According to the study of a very relevant set of documents preserved in the library, the so-called "juntas literarias", among the main subjects debated in the clinical sessions was the concept of human beings and diseases in relation to the development of the new experimental sciences. These documents show that chemistry and experimental physics were considered crucial tools for understanding the unexplained processes occurring in the diseased and healthy human body in a medico-surgical context. It is important to stress that through these manuscripts we can examine the role and the reception of the new sciences applied to the healing arts.
Abstract:
A contemporary perspective on the tradeoff between transmit antenna diversity and spatial multiplexing is provided. It is argued that, in the context of most modern wireless systems and for the operating points of interest, transmission techniques that utilize all available spatial degrees of freedom for multiplexing outperform techniques that explicitly sacrifice spatial multiplexing for diversity. In the context of such systems, therefore, there essentially is no decision to be made between transmit antenna diversity and spatial multiplexing in MIMO communication. Reaching this conclusion, however, requires that the channel and some key system features be adequately modeled and that suitable performance metrics be adopted; failure to do so may bring about starkly different conclusions. As a specific example, this contrast is illustrated using the 3GPP Long-Term Evolution system design.
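As a hedged toy illustration of the comparison the abstract describes (an i.i.d. Rayleigh Monte Carlo of our own, not the paper's 3GPP LTE evaluation), one can contrast the ergodic rate of open-loop spatial multiplexing on a 2x2 channel with that of a rate-1 Alamouti scheme that spends both transmit antennas on diversity:

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 10 ** (10.0 / 10)        # 10 dB operating point
nt, nr, trials = 2, 2, 10000

rate_mux = rate_div = 0.0
for _ in range(trials):
    # i.i.d. Rayleigh-fading 2x2 channel, unit average gain per entry
    H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
    # open-loop spatial multiplexing: equal power per transmit antenna
    rate_mux += np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * H @ H.conj().T).real)
    # Alamouti (rate-1 orthogonal STBC): all spatial degrees of freedom spent on diversity
    rate_div += np.log2(1 + (snr / nt) * np.linalg.norm(H, "fro") ** 2)

print(f"multiplexing: {rate_mux / trials:.2f} b/s/Hz, Alamouti: {rate_div / trials:.2f} b/s/Hz")
```

Under these assumptions the multiplexing average exceeds the Alamouti average at typical operating SNRs, mirroring the paper's argument; the abstract's caveat is that conclusions can flip if the channel model or performance metric is chosen poorly.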
Abstract:
There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics, even though these are not necessarily deterministic and stationary. In the present study we proceed in this direction by addressing an important problem our modern society is facing: the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose we propose a recurrence quantification analysis measure that allows tracking potentially curved and disrupted traces in cross recurrence plots. We apply this measure to cross recurrence plots constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with higher accuracy than previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. As one concrete example, we study coupled Rössler dynamics with stochastically modulated mean frequencies.
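As a minimal sketch of the underlying construction (a generic cross recurrence plot under our own choice of embedding parameters; the paper's specific trace-tracking measure is not reproduced), two descriptor time series are delay-embedded and a binary recurrence matrix records where their states come close:

```python
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Time-delay embedding of a 1-D descriptor time series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def cross_recurrence_plot(x, y, dim=3, tau=1, eps=0.5):
    """Binary CRP: 1 where embedded states of x and y are closer than eps."""
    X, Y = delay_embed(x, dim, tau), delay_embed(y, dim, tau)
    dists = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return (dists < eps).astype(np.uint8)

# Toy usage: two renditions of the same "melody", one slightly time-stretched,
# produce a curved diagonal trace of 1s that reveals the shared dynamics.
t = np.linspace(0, 8 * np.pi, 400)
crp = cross_recurrence_plot(np.sin(t), np.sin(1.05 * t))
print(crp.shape, crp.mean())
```

A tempo difference between two renditions bends the diagonal trace, which is why a measure tolerant of curved and disrupted traces is needed.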
Abstract:
A contemporary perspective on the tradeoff between transmit antenna diversity and spatial multiplexing is provided. It is argued that, in the context of modern cellular systems and for the operating points of interest, transmission techniques that utilize all available spatial degrees of freedom for multiplexing outperform techniques that explicitly sacrifice spatial multiplexing for diversity. Reaching this conclusion, however, requires that the channel and some key system features be adequately modeled; failure to do so may bring about starkly different conclusions. As a specific example, this contrast is illustrated using the 3GPP Long-Term Evolution system design.
Abstract:
In this article we present a hybrid approach to the automatic summarization of Spanish medical texts. There are many systems for automatic summarization based on statistics or on linguistics, but only a few combine both techniques. Our idea is that to produce a good summary we need to use the linguistic aspects of texts, but we should also benefit from the advantages of statistical techniques. We have integrated the Cortex (Vector Space Model) and Enertex (statistical physics) systems, coupled with the Yate term extractor, and the Disicosum system (linguistics). We have compared these systems and then integrated them in a hybrid approach. Finally, we have applied this hybrid system to a corpus of medical articles and evaluated its performance, obtaining good results.
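As a hedged sketch of one way such a hybrid combination can work (the weights and function names are illustrative; the actual integration of Cortex, Enertex, Yate and Disicosum is not specified at this level of detail in the abstract), per-sentence scores from each system can be normalized and fused before extracting the top-ranked sentences:

```python
import numpy as np

def normalize(scores):
    """Min-max normalize one system's sentence scores to [0, 1]."""
    s = np.asarray(scores, dtype=float)
    span = s.max() - s.min()
    return (s - s.min()) / span if span > 0 else np.zeros_like(s)

def hybrid_summary(sentences, system_scores, weights, k=2):
    """Fuse several scorers by weighted sum and keep the top-k sentences in order."""
    fused = sum(w * normalize(s) for w, s in zip(weights, system_scores))
    top = sorted(np.argsort(fused)[-k:])  # restore original sentence order
    return [sentences[i] for i in top]

# Toy usage with three hypothetical scorers (statistical, statistical-physics, linguistic).
sents = ["s1", "s2", "s3", "s4"]
scores = [[0.1, 0.9, 0.4, 0.3], [0.2, 0.8, 0.5, 0.1], [0.0, 0.7, 0.9, 0.2]]
print(hybrid_summary(sents, scores, weights=[0.3, 0.3, 0.4]))
```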
Abstract:
The present paper describes recent research on two central themes of Keynes's General Theory: (i) the social waste associated with recessions, and (ii) the effectiveness of fiscal policy as a stabilization tool. The paper also discusses evidence on the extent to which fiscal policy has been used as a stabilizing tool in industrial economies over the past two decades.