107 results for single-event upset
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Tropical cyclones are affected by a large number of climatic factors, which translates into complex patterns of occurrence. The variability of annual metrics of tropical-cyclone activity has been intensively studied, in particular since the sudden activation of the North Atlantic in the mid-1990s. We first provide a brief overview of previous work by various authors on these annual metrics for the North Atlantic basin, where the natural variability of the phenomenon, the existence of trends, the drawbacks of the records, and the influence of global warming have been the subject of interesting debates. Next, we present an alternative approach that does not focus on seasonal features but on the characteristics of single events [Corral et al., Nature Phys. 6, 693 (2010)]. It is argued that the individual-storm power dissipation index (PDI) constitutes a natural way to describe each event and, further, that the statistics of the PDI yield a robust power law for the occurrence of tropical cyclones. In this context, methods for fitting these distributions are discussed. As an important extension of this work, we introduce a distribution function that models the whole range of the PDI density (excluding incompleteness effects at the smallest values): the gamma distribution, consisting of a power law with an exponential decay at the tail. The characteristic scale of this decay, represented by the cutoff parameter, provides valuable information on the finite size of the basin, via the largest PDI values that the basin can sustain. We use the gamma fit to evaluate the influence of sea surface temperature (SST) on the occurrence of extreme PDI values, finding an increase of around 50% in the values of these basin-wide events for an average SST difference of 0.49 °C. Similar findings are observed for the effects of the positive phase of the Atlantic Multidecadal Oscillation and of the number of hurricanes in a season on the PDI distribution. In the case of the El Niño–Southern Oscillation (ENSO), positive and negative values of the multivariate ENSO index do not have a significant effect on the PDI distribution; however, when only extreme values of the index are used, we find that the presence of El Niño decreases the PDI of the most extreme hurricanes.
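To make the gamma description concrete, here is a minimal sketch, assuming NumPy and SciPy, of fitting a gamma density (a power law with an exponential cutoff) to a sample of individual-storm PDI values by maximum likelihood; the synthetic sample and variable names are illustrative, not the authors' catalogue or code.

```python
# Minimal sketch: fit a gamma distribution (power law with exponential cutoff)
# to a sample of storm PDI values by maximum likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative stand-in for a catalogue of individual-storm PDI values;
# a gamma sample with shape < 1 mimics a decaying power law truncated
# by an exponential tail.
pdi = stats.gamma.rvs(a=0.5, scale=5e10, size=500, random_state=rng)

# Maximum-likelihood gamma fit with the location pinned at zero.
shape, loc, scale = stats.gamma.fit(pdi, floc=0)

# In f(x) ~ x^(-tau) * exp(-x / cutoff), tau = 1 - shape and cutoff = scale.
tau = 1.0 - shape
print(f"power-law exponent tau ~ {tau:.2f}")
print(f"exponential cutoff     ~ {scale:.2e}")
```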
Abstract:
Within the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, an intercomparison of limited-area models (LAMs) is performed for intense events that caused severe damage to people and territory. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors for producing good forecasts of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event also known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For the statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial-shift forecast error and to identify the error sources that affected each model forecast. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
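As an illustration of the contingency-table verification mentioned above, here is a minimal sketch of the usual non-parametric skill scores (probability of detection, false alarm ratio, frequency bias, equitable threat score); the threshold and counts are invented for the example and are not taken from the Montserrat-2000 verification.

```python
# Minimal sketch: QPF skill scores from a 2x2 contingency table
# (hits, false alarms, misses, correct negatives).
def skill_scores(hits, false_alarms, misses, correct_negatives):
    n = hits + false_alarms + misses + correct_negatives
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    hits_random = (hits + false_alarms) * (hits + misses) / n
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return {"POD": pod, "FAR": far, "BIAS": bias, "ETS": ets}

# Example: counts of gauge boxes exceeding an illustrative 50 mm/24 h threshold.
print(skill_scores(hits=18, false_alarms=7, misses=5, correct_negatives=70))
```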
Abstract:
The medial temporal lobe (MTL), comprising the hippocampus and the surrounding neocortical regions, is a brain area targeted by several neurological diseases. Although functional magnetic resonance imaging (fMRI) has been widely used to assess brain functional abnormalities, detecting MTL activation has been technically challenging. The aim of our study was to provide an fMRI paradigm that reliably activates MTL regions at the individual level, thus providing a useful tool for future research in clinical memory-related studies. Twenty young healthy adults underwent an event-related fMRI study consisting of three encoding conditions: word pairs, face-name associations and complex visual scenes. A region-of-interest analysis at the individual level, comparing novel and repeated stimuli independently for each task, was performed. This analysis yielded activations in the hippocampal and parahippocampal regions in most of the participants. Specifically, 95% of participants showed significant activation in the left hippocampus during face-name encoding, and 100% in the right parahippocampus during scene encoding. Additionally, a whole-brain analysis, also comparing novel versus repeated stimuli at the group level, showed mainly left frontal activation during the word task. In this group analysis, the face-name association engaged the hippocampus and fusiform gyri bilaterally, along with the left inferior frontal gyrus, and the complex visual scenes activated mainly the parahippocampus and hippocampus bilaterally. In sum, our task design represents a rapid and reliable way to study and explore MTL activity at the individual level, thus providing a useful tool for future research in clinical memory-related fMRI studies.
Abstract:
Markowitz's portfolio theory (1952) has stimulated research into the efficiency of portfolio management. This paper studies existing nonparametric efficiency measurement approaches for single-period portfolio selection from a theoretical perspective and generalises currently used efficiency measures to the full mean-variance space. To this end, we introduce the efficiency improvement possibility function (a variation on the shortage function), study its axiomatic properties in the context of the Markowitz efficient frontier, and establish a link to the indirect mean-variance utility function. This framework allows us to distinguish between portfolio efficiency and allocative efficiency. Furthermore, it permits retrieving information about the revealed risk aversion of investors. The efficiency improvement possibility function thus provides a more general framework for gauging the efficiency of portfolio management using nonparametric frontier envelopment methods based on quadratic optimisation.
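For intuition, here is a minimal sketch, assuming NumPy and SciPy, of a shortage-function-style measure in mean-variance space: the largest delta by which the mean can be raised and the variance reduced, in a chosen direction, while staying attainable for a long-only, fully invested portfolio. The data, direction vector and solver are illustrative assumptions, not the paper's actual efficiency improvement possibility function.

```python
# Minimal sketch of a directional (shortage-function-style) efficiency measure
# in mean-variance space, evaluated for a given portfolio w0.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.06, 0.10, 0.14])                 # expected returns (illustrative)
sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])            # covariance matrix (illustrative)
w0 = np.array([1/3, 1/3, 1/3])                    # portfolio under evaluation
m0, v0 = mu @ w0, w0 @ sigma @ w0
g_mu, g_var = abs(m0), v0                         # direction vector

def min_variance(target_mean):
    """Smallest attainable variance for a long-only portfolio with mean >= target."""
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: mu @ w - target_mean}]
    res = minimize(lambda w: w @ sigma @ w, w0, bounds=[(0, 1)] * len(mu),
                   constraints=cons, method="SLSQP")
    return res.fun if res.success else np.inf

# Bisection on delta: feasible iff the minimum variance at the boosted mean
# does not exceed the contracted variance budget.
lo, hi = 0.0, 1.0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    if min_variance(m0 + mid * g_mu) <= v0 - mid * g_var:
        lo = mid
    else:
        hi = mid

print(f"efficiency improvement possibility delta ~ {lo:.3f}")
```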
Abstract:
We identify in this paper two conditions that characterize the domain of single-peaked preferences on the line in the following sense: a preference profile satisfies these two properties if and only if there exists a linear order $L$ over the set of alternatives such that these preferences are single-peaked with respect to $L$. The first property states that, for any subset of alternatives, the set of alternatives considered worst by all agents cannot contain more than two elements. The second property states that two agents cannot disagree on the relative ranking of two alternatives with respect to a third alternative while agreeing on the (relative) ranking of a fourth one. JEL classification: D71, C78
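For concreteness, the sketch below (plain Python, illustrative data) checks the underlying notion the characterization is about, namely whether a given profile is single-peaked with respect to a given linear order $L$; it does not implement the paper's two characterizing conditions themselves.

```python
# Minimal sketch: test whether a preference profile is single-peaked with
# respect to a given linear order over the alternatives. Rankings are
# lists ordered from best to worst.
def is_single_peaked(profile, order):
    pos = {alt: i for i, alt in enumerate(order)}
    for ranking in profile:
        rank = {alt: i for i, alt in enumerate(ranking)}  # lower = better
        peak = pos[ranking[0]]
        for a in order:
            for b in order:
                # If a lies strictly between the peak and b on the line,
                # a must be preferred to b.
                between = (peak <= pos[a] < pos[b]) or (pos[b] < pos[a] <= peak)
                if between and rank[a] > rank[b]:
                    return False
    return True

order = ["x1", "x2", "x3", "x4"]
profile = [["x2", "x3", "x1", "x4"],      # single-peaked w.r.t. the order
           ["x4", "x3", "x2", "x1"]]      # single-peaked w.r.t. the order
print(is_single_peaked(profile, order))   # True
profile.append(["x1", "x4", "x2", "x3"])  # peak at x1 but x4 ranked above x2
print(is_single_peaked(profile, order))   # False
```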
Abstract:
We consider the problem of allocating an infinitely divisible commodity among a group of agents with single-peaked preferences. A rule that has played a central role in the analysis of the problem is the so-called uniform rule. Chun (2001) proves that the uniform rule is the only rule satisfying Pareto optimality, no-envy, separability, and continuity (with respect to the social endowment). We obtain an alternative characterization by using a weak replication-invariance condition, called duplication-invariance, instead of continuity. Furthermore, we prove that Pareto optimality, the equal division lower bound, and separability imply no-envy. Using this result, we strengthen one of Chun's (2001) characterizations of the uniform rule by showing that the uniform rule is the only rule satisfying Pareto optimality, the equal division lower bound, separability, and either continuity or duplication-invariance.
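A minimal sketch of the uniform rule itself follows: each agent receives min(peak, lambda) under excess demand or max(peak, lambda) under excess supply, with lambda chosen so the shares exhaust the endowment. The peaks and endowment are illustrative, and the sketch assumes SciPy for the root-finding step.

```python
# Minimal sketch of the uniform rule for single-peaked preferences,
# summarised by the agents' peaks.
from scipy.optimize import brentq

def uniform_rule(peaks, endowment):
    excess_demand = sum(peaks) >= endowment
    if excess_demand:                 # cap everyone at lam
        share = lambda lam: sum(min(p, lam) for p in peaks)
    else:                             # top everyone up to lam
        share = lambda lam: sum(max(p, lam) for p in peaks)
    # share(lam) is continuous and monotone, so a root exists in this bracket.
    lam = brentq(lambda l: share(l) - endowment,
                 0.0, max(max(peaks), endowment))
    if excess_demand:
        return [min(p, lam) for p in peaks]
    return [max(p, lam) for p in peaks]

print(uniform_rule(peaks=[2.0, 4.0, 9.0], endowment=9.0))   # ~ [2.0, 3.5, 3.5]
print(uniform_rule(peaks=[1.0, 2.0, 3.0], endowment=12.0))  # ~ [4.0, 4.0, 4.0]
```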
Abstract:
A challenge when executing applications on a cluster is to improve performance while using the resources efficiently, and this challenge is greater in a distributed environment. With this challenge in mind, we propose a set of rules for carrying out the computation on each node, based on an analysis of the computation and communications of the applications; we analyse a cell-mapping scheme and a method for scheduling the execution order that takes priority-based execution into account, where border cells have a higher priority than internal cells. The experiments show the overlap of internal computation with the communication of the border cells, with results in which the speedup increases and efficiency levels remain above 85%; finally, gains in execution time are obtained, leading to the conclusion that an overlap scheme can indeed be designed that allows SPMD applications to run efficiently on a cluster.
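The overlap of border communication with internal computation described above follows a common SPMD pattern; a minimal sketch, assuming mpi4py and a 1-D strip decomposition (the array size and stencil are illustrative, not the paper's scheme), is:

```python
# Minimal sketch: exchange border cells with non-blocking messages while
# the internal cells are computed. Run with e.g. "mpirun -np 4 python overlap.py".
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size

cells = np.random.rand(1000)   # this node's strip of the domain
halo_l = np.empty(1)
halo_r = np.empty(1)

# Border cells have priority: post their exchange first (non-blocking).
reqs = [comm.Isend(cells[0:1].copy(), dest=left, tag=0),
        comm.Isend(cells[-1:].copy(), dest=right, tag=1),
        comm.Irecv(halo_l, source=left, tag=1),
        comm.Irecv(halo_r, source=right, tag=0)]

# Overlap: internal cells need no halo data, so compute them while the
# border messages are in flight.
new = np.empty_like(cells)
new[1:-1] = (cells[:-2] + cells[1:-1] + cells[2:]) / 3.0

# Wait for the halos, then finish the border cells that depend on them.
MPI.Request.Waitall(reqs)
new[0] = (halo_l[0] + cells[0] + cells[1]) / 3.0
new[-1] = (cells[-2] + cells[-1] + halo_r[0]) / 3.0
cells = new
```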
Abstract:
We characterize the class of strategy-proof social choice functions on the domain of symmetric single-peaked preferences. This class is strictly larger than the set of generalized median voter schemes (the class of strategy-proof and tops-only social choice functions on the domain of single-peaked preferences characterized by Moulin (1980)) since, under the domain of symmetric single-peaked preferences, generalized median voter schemes can be disturbed by discontinuity points and remain strategy-proof on this smaller domain. Our result identifies the specific nature of these discontinuities, which make it possible to design non-onto social choice functions to deal with feasibility constraints.
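As a point of reference for the baseline class mentioned above, here is a minimal sketch of an anonymous generalized median voter scheme in Moulin's (1980) form: the outcome is the median of the n reported peaks together with n − 1 fixed phantom peaks. The numbers are illustrative, and the sketch does not capture the discontinuous rules the paper characterizes.

```python
# Minimal sketch: anonymous generalized median voter scheme with phantom peaks.
import statistics

def median_voter_scheme(peaks, phantoms):
    assert len(phantoms) == len(peaks) - 1
    return statistics.median(list(peaks) + list(phantoms))

peaks = [0.2, 0.5, 0.9]        # agents' ideal points on [0, 1]
phantoms = [0.0, 1.0]          # phantoms at the extremes -> plain median
print(median_voter_scheme(peaks, phantoms))    # 0.5

# Phantoms clustered near 0 cap the outcome from above:
print(median_voter_scheme(peaks, [0.0, 0.3]))  # 0.3
```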
Abstract:
We compare rain event size distributions derived from measurements in climatically different regions, which we find to be well approximated by power laws of similar exponents over broad ranges. Differences can be seen in the large-scale cutoffs of the distributions. Event duration distributions suggest that the scale-free aspects are related to the absence of characteristic scales in the meteorological mesoscale.
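To illustrate the kind of power-law description referred to above, the sketch below (assuming NumPy; the synthetic sample stands in for real gauge data) estimates the exponent of event sizes above a lower cutoff with the standard continuous maximum-likelihood estimator.

```python
# Minimal sketch: continuous power-law exponent estimate for event sizes
# above a lower cutoff s_min, alpha-hat = 1 + n / sum(ln(s / s_min)).
import numpy as np

rng = np.random.default_rng(1)

def powerlaw_mle(sizes, s_min):
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    return 1.0 + s.size / np.sum(np.log(s / s_min))

# Synthetic event sizes drawn from a pure power law with alpha = 1.5
# via inverse-CDF sampling, with s_min = 0.1.
u = rng.random(5000)
sizes = 0.1 * (1.0 - u) ** (-1.0 / (1.5 - 1.0))

print(f"estimated exponent ~ {powerlaw_mle(sizes, s_min=0.1):.2f}")   # ~ 1.5
```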
Abstract:
Recent research in the field of study abroad shows that study abroad participation among all U.S. students has increased 20% since 2001 and that nearly 200,000 U.S. students currently go abroad each year. Additionally, about 8% of all undergraduate degree recipients receive part of their education abroad. Although quantitative studies have dominated research on study abroad, my research project calls for a qualitative approach, since the goal is to understand what study abroad is as a cultural event, what authentic cultural immersion is, how program stakeholders understand and perceive cultural immersion, and how cultural immersion in programs can be improved. Following the tradition of ethnographic and case study approaches in study abroad research, my study also pivots on ethnography. As an ethnographer I collected data mainly through participant observation, semi-structured interviews, and document analysis. The study abroad participants were a group of undergraduate native speakers of English studying Spanish for seven weeks in Cádiz, a small coastal city in southern Spain, as well as program coordinators, host community members, and professors. I also examined the specific program design features, particularly the in-class and out-of-class activities that students participated in. The goal was to understand whether these features were conducive to authentic immersion in the language and culture. Eventually, I elaborated an ethnographic evaluation of the study abroad program and its design features, suggesting improvements in order to enhance the significance and value of study abroad as a cultural event. Among other things, I discussed the difficulties that students had at the beginning of their sojourn in understanding local people, getting used to their host families' small apartments, adjusting to new schedules and eating habits, and venturing out from the main group to individually explore the new social and cultural fabric and interact with the host community. The program evaluation revealed the need for carefully designed pre-departure preparation sessions, pre-departure credit-bearing courses in intercultural communication, and additional language practice abroad and opportunities to come into contact with the local community through internships, volunteer work or field work. My study makes an important contribution to study abroad research and education. It benefits students, teachers, and study abroad directors and coordinators by suggesting ideas on how to improve the program and optimize students' cultural experiences abroad. This study is also important because it investigated how U.S. undergraduate learners studying the Spanish language and culture approach and perceive the study abroad experience in Spain.
Abstract:
The control of optical fields on the nanometre scale is becoming an increasingly important tool in many fields, ranging from channelling light delivery in photovoltaics and light-emitting diodes to increasing the sensitivity of chemical sensors to single-molecule levels. The ability to design and manipulate light fields with specific frequency and spatial characteristics is explored in this project. We present an alternative realisation of Extraordinary Optical Transmission (EOT) that requires only a single aperture and a coupled waveguide. We show how this waveguide-resonant EOT improves the transmissivity of single apertures. An important technique in imaging is Near-Field Scanning Optical Microscopy (NSOM); we show how waveguide-resonant EOT and the novel probe design improve the efficiency of NSOM probes by two orders of magnitude, and allow the imaging of single molecules with an optical resolution as good as 50 nm. We show how optical antennas are fabricated at the apex of sharp tips and can be used in a near-field configuration.
Abstract:
We explore in depth the validity of a recently proposed scaling law for earthquake inter-event time distributions in the case of Southern California, using the waveform cross-correlation catalog of Shearer et al. Two statistical tests are used: on the one hand, the standard two-sample Kolmogorov-Smirnov test is in agreement with the scaling of the distributions. On the other hand, the one-sample Kolmogorov-Smirnov statistic, complemented with Monte Carlo simulation of the inter-event times as done by Clauset et al., supports the validity of the gamma distribution as a simple model of the scaling function appearing in the scaling law, for rescaled inter-event times above 0.01, except for the largest data set (magnitude greater than 2). A discussion of these results is provided.
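A minimal sketch of the two tests described above, assuming SciPy: (a) a two-sample Kolmogorov-Smirnov test between two sets of rescaled inter-event times, and (b) a one-sample KS statistic against a fitted gamma distribution with a Monte Carlo p-value in the spirit of Clauset et al. The synthetic samples are illustrative, not the Shearer et al. catalog.

```python
# Minimal sketch: KS-based tests of a gamma scaling function for
# rescaled inter-event times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
times_a = stats.gamma.rvs(a=0.7, scale=1.4, size=800, random_state=rng)
times_b = stats.gamma.rvs(a=0.7, scale=1.4, size=600, random_state=rng)

# (a) Do the two rescaled samples share the same distribution?
print(stats.ks_2samp(times_a, times_b))

# (b) One-sample KS against the gamma fit, with a Monte Carlo p-value that
# accounts for the parameters having been estimated from the data.
a_hat, loc_hat, sc_hat = stats.gamma.fit(times_a, floc=0)
d_obs = stats.kstest(times_a, "gamma", args=(a_hat, loc_hat, sc_hat)).statistic

d_sim = []
for _ in range(200):    # use more replicates in practice
    synth = stats.gamma.rvs(a_hat, loc=loc_hat, scale=sc_hat,
                            size=times_a.size, random_state=rng)
    a_s, l_s, s_s = stats.gamma.fit(synth, floc=0)
    d_sim.append(stats.kstest(synth, "gamma", args=(a_s, l_s, s_s)).statistic)

p_mc = np.mean(np.array(d_sim) >= d_obs)
print(f"Monte Carlo p-value ~ {p_mc:.2f}")
```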
Abstract:
Review of the WePreserve 2009 conference, held from 23 to 27 March in Barcelona and organized by the Facultad de Biblioteconomía y Documentación of the Universitat de Barcelona, with the collaboration of the Institut d'Estudis Catalans, the Biblioteca de Catalunya and the Consorci de Biblioteques de Barcelona. The WePreserve seminar has been held annually since 2007 and brings together all the leading European figures in research on systems and methodologies that guarantee the digital preservation of documents: Digital Preservation Europe (DPE), Preservation and Long-term Access Through Networked Services (Planets), Cultural Artistic and Scientific Knowledge for Preservation, Access and Retrieval (CASPAR), and Network of expertise in Digital long-term preservation (nestor).
Abstract:
Omnidirectional cameras offer a much wider field of view than perspective cameras and alleviate the problems due to occlusions. However, both types of camera suffer from a lack of depth perception. A practical method for obtaining depth in computer vision is to project a known structured-light pattern onto the scene, avoiding the problems and costs involved in stereo vision. This paper focuses on the idea of combining omnidirectional vision and structured light with the aim of providing 3D information about the scene. The resulting sensor is formed by a single catadioptric camera and an omnidirectional light projector. It is also discussed how this sensor can be used in robot navigation applications.
Abstract:
Most network operators have considered reducing LSR label spaces (the number of labels used) as a way of simplifying the management of underlying virtual private networks (VPNs) and therefore reducing operational expenditure (OPEX). The IETF outlined the label merging feature in MPLS, allowing the configuration of multipoint-to-point (MP2P) connections, as a means of reducing label space in LSRs. We found two main drawbacks in this label space reduction: a) it must be applied separately to each set of LSPs with the same egress LSR, which decreases the options for better reductions, and b) LSRs close to the edge of the network experience a greater label space reduction than those close to the core. The latter implies that MP2P connections reduce the number of labels asymmetrically.