5 results for investing in the future
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
In the history of modern communication, after the development of the printing press, the telegraph unleashed a revolution in communications. Today, the Internet is in many ways its heir. Reflections on the telegraph may open up perspectives concerning tendencies, possibilities and pitfalls of the Internet. The telegraph has been well explored in important literature on communication and media, which tends to emphasize the history of this technology, its social context and institutional meaning (e.g., Robert L. Thompson, 1947; Tom Standage, 2007 [1998]). James W. Carey, the mentor of North American critical cultural studies, in his essay "Technology and Ideology. The Case of the Telegraph" (2009 [1983]), suggests a distinctive approach. In the telegraph, Carey sees the prototype of many subsequent commercial empires based on science and technology; a pioneer model for complex business management; an example of the struggle of interests for control of patents; an inducer of changes both in language and in structures of knowledge; and a promoter of futurist and utopian thinking about information technologies. With the communications revolution brought about by the Internet in mind, this paper revisits this seminal essay to explore its great achievement, as well as the problems of an approach that conceives the innovation of the telegraph as a metaphor for all the innovations that announced the modern stage of history and that still determine today the major lines of development of modern communication systems.
Abstract:
The Gulf of Cadiz, as part of the Azores-Gibraltar plate boundary, is recognized as a potential source of large earthquakes and tsunamis that may affect the bordering countries, as occurred on 1 November 1755. Preparing for the future, Portugal is establishing a national tsunami warning system in which the threat posed by any large-magnitude earthquake in the area is estimated from a comprehensive database of scenarios. In this paper we summarize the knowledge about the active tectonics in the Gulf of Cadiz and integrate the available seismological information in order to propose a generation model for destructive tsunamis to be applied in tsunami warnings. The derived fault model is then used to estimate the recurrence of large earthquakes using the fault slip rates obtained by Cunha et al. (2012) from thin-sheet neotectonic modelling. Finally, we evaluate the consistency of seismicity rates derived from historical and instrumental catalogues with the convergence rates between Eurasia and Nubia given by plate kinematic models.
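As a rough illustration of the recurrence estimate described above, the sketch below applies a simple characteristic-earthquake model: the seismic moment M0 = mu * A * D fixes the magnitude, and the mean recurrence interval follows as T = D / (slip rate). This is not the authors' method; all numerical values (fault dimensions, rigidity, coseismic slip, slip rate) are hypothetical placeholders, not the figures used in the paper or by Cunha et al. (2012).

```python
# Minimal characteristic-earthquake recurrence sketch (illustrative values only).
# Seismic moment: M0 = mu * A * D (N*m); moment magnitude: Mw = (2/3)*log10(M0) - 6.07.
# If the fault releases its accumulated slip in characteristic events of coseismic
# slip D, the mean recurrence interval is T = D / slip_rate.

import math

MU = 3.0e10  # crustal rigidity, Pa (typical assumed value)

def moment_magnitude(length_m, width_m, slip_m, mu=MU):
    """Moment magnitude of a rupture with the given dimensions and coseismic slip."""
    m0 = mu * length_m * width_m * slip_m  # seismic moment, N*m
    return (2.0 / 3.0) * math.log10(m0) - 6.07

def recurrence_years(slip_m, slip_rate_mm_per_yr):
    """Mean recurrence interval if all slip is released in characteristic events."""
    return slip_m / (slip_rate_mm_per_yr * 1e-3)

# Hypothetical thrust fault: 180 km x 50 km rupture, 10 m characteristic slip,
# loaded at an assumed 4 mm/yr long-term slip rate.
mw = moment_magnitude(180e3, 50e3, 10.0)
t = recurrence_years(10.0, 4.0)
print(f"Mw ~ {mw:.1f}, recurrence ~ {t:.0f} yr")  # roughly Mw 8.2 every ~2500 yr
```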
Abstract:
The analysis of the Higgs boson data by the ATLAS and CMS Collaborations appears to exhibit an excess of h -> gamma gamma events above the Standard Model (SM) expectations, whereas no significant excess is observed in h -> ZZ* -> four-lepton events, albeit with large statistical uncertainty due to the small data sample. These results (assuming they persist with further data) could be explained by a pair of nearly mass-degenerate scalars, one of which is an SM-like Higgs boson and the other a scalar with suppressed couplings to W+W- and ZZ. In the two-Higgs-doublet model, the observed gamma gamma and ZZ* -> four-lepton data can be reproduced by approximately degenerate CP-even (h) and CP-odd (A) Higgs bosons for values of sin(beta - alpha) near unity and 0.70 ≲ tan beta ≲ 1. An enhanced gamma gamma signal can also arise in cases where m(h) ≃ m(H), m(H) ≃ m(A), or m(h) ≃ m(H) ≃ m(A). Since the ZZ* -> four-lepton signal derives primarily from an SM-like Higgs boson, whereas the gamma gamma signal receives contributions from two (or more) nearly mass-degenerate states, one would expect slightly different invariant mass peaks in the ZZ* -> four-lepton and gamma gamma channels. The phenomenological consequences of such models can be tested with additional Higgs data that will be collected at the LHC in the near future. DOI: 10.1103/PhysRevD.87.055009.
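The peak-shift argument in the final sentences can be made concrete with a toy calculation: when the detector resolution is much larger than the mass splitting, the fitted gamma gamma peak is approximately the signal-weighted mean of the two masses, while the ZZ* peak tracks the SM-like state alone. The sketch below is a minimal illustration; the masses and signal strengths are hypothetical, not fitted values from ATLAS or CMS.

```python
# Toy illustration of the mass-peak shift: if the gamma-gamma signal is a sum of
# two nearly degenerate states while the ZZ* signal comes almost entirely from the
# SM-like state, the fitted gamma-gamma peak sits between the two masses, offset
# from the ZZ* peak. All masses and signal strengths below are hypothetical.

m_h, m_A = 125.0, 126.0      # GeV, assumed nearly degenerate CP-even/CP-odd pair

# Assumed relative signal strengths in each channel:
w_gg = {"h": 1.0, "A": 0.8}  # both states contribute to gamma-gamma
w_zz = {"h": 1.0, "A": 0.0}  # the CP-odd A has no tree-level ZZ coupling

def peak_position(weights):
    """Signal-weighted mean mass; approximates the fitted peak when the detector
    resolution is common to both states and much larger than their splitting."""
    total = weights["h"] + weights["A"]
    return (weights["h"] * m_h + weights["A"] * m_A) / total

shift = peak_position(w_gg) - peak_position(w_zz)
print(f"gamma-gamma peak: {peak_position(w_gg):.2f} GeV")
print(f"ZZ* peak:         {peak_position(w_zz):.2f} GeV -> shift ~ {shift*1000:.0f} MeV")
```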
Abstract:
We suggest that the weak-basis-independent condition det(M_nu) = 0 for the effective neutrino mass matrix can be used to remove the ambiguities in the reconstruction of the neutrino mass matrix from the input data available from present and feasible future experiments. In this framework, we study the full reconstruction of M_nu with special emphasis on the correlation between the Majorana CP-violating phase and the various mixing angles. The impact of the recent KamLAND results on the effective neutrino mass parameter is also briefly discussed. (C) 2003 Elsevier Science B.V. All rights reserved.
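To make the reconstruction concrete, the sketch below builds M_nu = U diag(m1, m2, m3) U^T in a convention where the charged-lepton mass matrix is diagonal, with m1 = 0 so that det(M_nu) = 0 holds by construction (normal ordering; only one Majorana phase then remains physical). This is an illustration of the constraint, not the paper's reconstruction procedure, and the oscillation parameters are illustrative values.

```python
# Sketch of reconstructing M_nu under det(M_nu) = 0 (normal ordering: m1 = 0).
# With one vanishing mass only one Majorana phase is physical; the parameters
# below are illustrative, roughly in the experimentally allowed range.
# Convention assumed here: M_nu = U . diag(m1, m2, m3) . U^T, U the PMNS matrix.

import numpy as np

def pmns(t12, t13, t23, delta, alpha):
    """PMNS matrix with one Dirac phase delta and one Majorana phase alpha."""
    s12, c12 = np.sin(t12), np.cos(t12)
    s13, c13 = np.sin(t13), np.cos(t13)
    s23, c23 = np.sin(t23), np.cos(t23)
    ed = np.exp(-1j * delta)
    u = np.array([
        [c12 * c13,                         s12 * c13,                         s13 * ed],
        [-s12 * c23 - c12 * s23 * s13 / ed, c12 * c23 - s12 * s23 * s13 / ed,  s23 * c13],
        [s12 * s23 - c12 * c23 * s13 / ed,  -c12 * s23 - s12 * c23 * s13 / ed, c23 * c13],
    ])
    return u @ np.diag([1.0, np.exp(1j * alpha / 2), 1.0])

# Illustrative parameters (angles in radians, mass-squared differences in eV^2):
U = pmns(t12=0.58, t13=0.15, t23=0.78, delta=1.2, alpha=0.7)
m = np.array([0.0, np.sqrt(7.5e-5), np.sqrt(2.5e-3)])  # m1 = 0 enforces det = 0

M_nu = U @ np.diag(m) @ U.T
print("det(M_nu) =", np.linalg.det(M_nu))                # ~0 by construction
print("effective mass |m_ee| =", abs(M_nu[0, 0]), "eV")  # neutrinoless double-beta decay
```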
Abstract:
In this article, physical layer awareness in access, core, and metro networks is addressed, and a Physical Layer Aware Network Architecture Framework for the Future Internet, as proposed within the European ICT Project 4WARD, is presented and discussed. Current limitations and shortcomings of the Internet architecture are driving research trends at a global scale toward a novel, secure, and flexible architecture. This Future Internet architecture must allow for the coexistence and cooperation of multiple networks on common platforms, through the virtualization of network resources. Possible solutions embrace a full range of technologies, from fiber backbones to wireless access networks. The virtualization of physical networking resources will enhance the possibility of handling different profiles while providing the impression of mutual isolation. This abstraction strategy implies the use of well-elaborated mechanisms to deal with channel impairments and requirements in both wireless (access) and optical (core) environments.
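As a toy illustration of the virtualization-with-isolation idea described above (not the 4WARD framework's actual interfaces, which the abstract does not specify), the sketch below slices a physical link's capacity among virtual networks so that one slice's demand cannot encroach on another's reservation.

```python
# Toy sketch of network-resource virtualization: several virtual networks share
# one physical link, each with a guaranteed bandwidth slice, so that load in one
# slice cannot degrade the others (the "impression of mutual isolation").
# All names and numbers are hypothetical.

class PhysicalLink:
    def __init__(self, capacity_mbps):
        self.capacity = capacity_mbps
        self.slices = {}  # virtual network name -> reserved bandwidth (Mb/s)

    def provision(self, vnet, mbps):
        """Reserve a bandwidth slice for a virtual network, refusing overcommit."""
        reserved = sum(self.slices.values())
        if reserved + mbps > self.capacity:
            raise ValueError(f"cannot provision {mbps} Mb/s: only "
                             f"{self.capacity - reserved} Mb/s left")
        self.slices[vnet] = mbps

    def admit(self, vnet, demand_mbps):
        """Admit traffic only up to the slice's own reservation."""
        return min(demand_mbps, self.slices.get(vnet, 0))

link = PhysicalLink(capacity_mbps=10_000)  # hypothetical 10 Gb/s optical core link
link.provision("vnet-A", 6_000)
link.provision("vnet-B", 3_000)
print(link.admit("vnet-A", 8_000))         # capped at 6000: B's slice is untouched
```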