952 results for Over-representation
Abstract:
Random access (RA) protocols are normally used in satellite networks for initial terminal access and are particularly effective since no coordination is required. On the other hand, contention resolution diversity slotted Aloha (CRDSA), irregular repetition slotted Aloha (IRSA) and coded slotted Aloha (CSA) have been shown to be more efficient than classic RA schemes such as slotted Aloha, and can also be exploited when short packet transmissions take place over a shared medium. In particular, they rely on burst repetition and on successive interference cancellation (SIC) applied at the receiver. The SIC process can be well described using a bipartite graph representation and by exploiting tools used to analyze iterative decoding. The scope of my Master's thesis has been to describe the performance of such RA protocols when Rayleigh fading is taken into account. In this context, each user may correctly decode a packet even in the presence of a collision, and when SIC is considered this may result in multi-packet reception. The SIC procedure under Rayleigh fading has been analyzed analytically for the asymptotic case (infinite frame length), supporting the analysis of both throughput and packet loss rate. An upper bound on the achievable performance has also been derived analytically. It can be shown that under particular channel conditions the throughput of the system can exceed one packet per slot, which is the theoretical limit in the collision channel case.
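The SIC iteration on the bipartite graph can be illustrated on the simpler collision-channel model (no fading or capture, which the thesis relaxes): any slot left with exactly one uncancelled burst is decoded, and all replicas of that user are then removed, possibly turning other slots into singletons. A minimal sketch, assuming the receiver knows which users' replicas landed in each slot:

```python
def sic_decode(frame):
    """Iterative successive interference cancellation (SIC).

    frame: list of slots, each a set of user ids whose burst replicas
    landed there (collision-channel model: only singleton slots decode).
    Returns the set of users recovered by the SIC process.
    """
    decoded = set()
    progress = True
    while progress:
        progress = False
        for slot in frame:
            pending = slot - decoded          # bursts not yet cancelled
            if len(pending) == 1:             # singleton slot: decodable
                decoded |= pending            # cancel its replicas everywhere
                progress = True
    return decoded

# Toy CRDSA-style frame: 3 users, 4 slots, 2 replicas per user.
frame = [{1, 2}, {2, 3}, {1}, {3}]
print(sic_decode(frame))  # -> {1, 2, 3}: all users recovered via SIC
```

Under Rayleigh fading the singleton condition is replaced by a capture condition on the signal-to-interference ratio, which is what enables multi-packet reception and throughputs above one packet per slot.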
Abstract:
The traditional view of a predominant inferior parietal representation of gestures has been recently challenged by neuroimaging studies demonstrating that gesture production and discrimination may critically depend on inferior frontal lobe function. The aim of the present work was therefore to investigate the effect of transient disruption of these brain sites by continuous theta burst stimulation (cTBS) on gesture production and recognition.
Abstract:
Background Young children are known to use hospitals more frequently than older children and young adults. They are therefore an important population from the economic and policy perspectives of health care delivery. In Switzerland, complete hospitalization discharge records for children [<5 years] over four consecutive years [2002–2005] were evaluated in order to analyze variation in patterns of hospital use. Methods Stationary and outpatient hospitalization rates at aggregated ZIP code level were calculated based on census data provided by the Swiss Federal Statistical Office (BfS). Thirty-seven hospital service areas for children [HSAP] were created with the method of "small area analysis", reflecting user-based health markets. Descriptive statistics and general linear models were applied to analyze the data. Results The mean stationary hospitalization rate over the four years was 66.1 discharges per 1000 children. Hospitalizations for respiratory problems are the most frequent in young children (25.9%), and the highest hospitalization rates are associated with geographical factors of urban areas and specific language regions. Statistical models yielded significant effect estimates for these factors and a significant association between ambulatory/outpatient and stationary hospitalization rates. Conclusion The utilization-based approach, using HSAP as a spatial representation of user-based health markets, is a valid instrument and allows assessing the supply and demand of children's health care services. The study provides, for the first time, estimates for several factors associated with the large variation in the utilization and provision of paediatric health care resources in Switzerland.
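The kind of association the study reports between outpatient and stationary rates can be sketched with a linear model. The sketch below is purely illustrative: the data, variable names and coefficients are hypothetical, and plain ordinary least squares stands in for the general linear models actually used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 37                                    # e.g. one row per hospital service area
urban = rng.integers(0, 2, size=n)        # hypothetical urban-area indicator
outpatient = rng.normal(50, 10, size=n)   # hypothetical outpatient rate per 1000
# Hypothetical "true" relationship, used only to generate toy data.
stationary = 20 + 0.5 * outpatient + 8 * urban + rng.normal(0, 2, size=n)

# Design matrix with intercept; coefficients via ordinary least squares.
X = np.column_stack([np.ones(n), outpatient, urban])
beta, *_ = np.linalg.lstsq(X, stationary, rcond=None)
print(beta)  # estimates close to the generating values [20, 0.5, 8]
```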
Abstract:
The African great lakes are of utmost importance for the local economy (fishing), as well as being essential to the survival of the local people. During the past decades, these lakes experienced fast changes in ecosystem structure and functioning, and their future evolution is a major concern. In this study, for the first time a set of one-dimensional lake models is evaluated for Lake Kivu (2.28°S; 28.98°E), East Africa. The unique limnology of this meromictic lake, with the importance of salinity and subsurface springs in a tropical high-altitude climate, presents a worthy challenge to the seven models involved in the Lake Model Intercomparison Project (LakeMIP). Meteorological observations from two automatic weather stations are used to drive the models, whereas a unique dataset, containing over 150 temperature profiles recorded since 2002, is used to assess the models' performance. Simulations are performed over the freshwater layer only (60 m) and over the average lake depth (240 m), since salinity increases with depth below 60 m in Lake Kivu and some lake models do not account for the influence of salinity upon lake stratification. All models are able to reproduce the mixing seasonality in Lake Kivu, as well as the magnitude and seasonal cycle of the lake enthalpy change. Differences between the models can be ascribed to variations in the treatment of the radiative forcing and the computation of the turbulent heat fluxes. Fluctuations in wind velocity and solar radiation explain inter-annual variability of observed water column temperatures. The good agreement between the deep simulations and the observed meromictic stratification also shows that a subset of models is able to account for the salinity- and geothermal-induced effects upon deep-water stratification. Finally, based on the strengths and weaknesses discerned in this study, an informed choice of a one-dimensional lake model for a given research purpose becomes possible.
Abstract:
In this paper we present a solution to the problem of action and gesture recognition using sparse representations. The dictionary is modelled as a simple concatenation of features computed for each action or gesture class from the training data, and test data are classified by finding a sparse representation of the test video features over this dictionary. Our method does not impose any explicit training procedure on the dictionary. We evaluate our model with two kinds of features, by projecting (i) Gait Energy Images (GEIs) and (ii) motion descriptors to a lower dimension using random projection. Experiments show a 100% recognition rate on standard datasets, and the results are compared to those obtained with the widely used SVM classifier.
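The classification idea can be sketched in a few lines. This is a simplification, not the paper's pipeline: the class labels and feature dimensions are made up, and each class sub-dictionary is fit by least squares (minimum-residual rule) rather than by a full sparse solver over the concatenated dictionary.

```python
import numpy as np

def classify(D_by_class, y):
    """Assign y to the class whose sub-dictionary reconstructs it
    with the smallest least-squares residual."""
    best_label, best_residual = None, np.inf
    for label, D in D_by_class.items():
        x, *_ = np.linalg.lstsq(D, y, rcond=None)  # coefficients over class atoms
        residual = np.linalg.norm(y - D @ x)
        if residual < best_residual:
            best_label, best_residual = label, residual
    return best_label

rng = np.random.default_rng(0)
# Two hypothetical gesture classes living in different random subspaces
# of a 20-dimensional (e.g. randomly projected) feature space.
D = {"wave": rng.normal(size=(20, 5)), "point": rng.normal(size=(20, 5))}
y = D["wave"] @ rng.normal(size=5)   # test sample drawn from the "wave" subspace
print(classify(D, y))                # -> wave
```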
Abstract:
Olfactory glomeruli are the loci where the first odor-representation map emerges. The glomerular layer comprises exquisite local synaptic circuits for the processing of olfactory coding patterns immediately after their emergence. To understand how an odor map is transferred from afferent terminals to postsynaptic dendrites, it is essential to directly monitor the odor-evoked glomerular postsynaptic activity patterns. Here we report the use of a transgenic mouse expressing a Ca(2+)-sensitive green fluorescent protein (GCaMP2) under a Kv3.1 potassium-channel promoter. Immunostaining revealed that GCaMP2 was specifically expressed in mitral and tufted cells and a subpopulation of juxtaglomerular cells but not in olfactory nerve terminals. Both in vitro and in vivo imaging combined with glutamate receptor pharmacology confirmed that odor maps reported by GCaMP2 were of a postsynaptic origin. These mice thus provided an unprecedented opportunity to analyze the spatial activity pattern reflecting purely postsynaptic olfactory codes. The odor-evoked GCaMP2 signal had both focal and diffuse spatial components. The focalized hot spots corresponded to individually activated glomeruli. In GCaMP2-reported postsynaptic odor maps, different odorants activated distinct but overlapping sets of glomeruli. Increasing odor concentration increased both individual glomerular response amplitude and the total number of activated glomeruli. Furthermore, the GCaMP2 response displayed a fast time course that enabled us to analyze the temporal dynamics of odor maps over consecutive sniff cycles. In summary, with cell-specific targeting of a genetically encoded Ca(2+) indicator, we have successfully isolated and characterized an intermediate level of odor representation between olfactory nerve input and principal mitral/tufted cell output.
Abstract:
The ability of the one-dimensional lake model FLake to represent the mixolimnion temperatures for tropical conditions was tested for three locations in East Africa: Lake Kivu and Lake Tanganyika's northern and southern basins. Meteorological observations from surrounding automatic weather stations were corrected and used to drive FLake, whereas a comprehensive set of water temperature profiles served to evaluate the model at each site. Careful forcing data correction and model configuration made it possible to reproduce the observed mixed layer seasonality at Lake Kivu and Lake Tanganyika (northern and southern basins), with correct representation of both the mixed layer depth and water temperatures. At Lake Kivu, mixolimnion temperatures predicted by FLake were found to be sensitive both to minimal variations in the external parameters and to small changes in the meteorological driving data, in particular wind velocity. In each case, small modifications may lead to a regime switch, from the correctly represented seasonal mixed layer deepening to either completely mixed or permanently stratified conditions from ~10 m downwards. In contrast, model temperatures were found to be robust close to the surface, with acceptable predictions of near-surface water temperatures even when the seasonal mixing regime is not reproduced. FLake can thus be a suitable tool to parameterise tropical lake water surface temperatures within atmospheric prediction models. Finally, FLake was used to attribute the seasonal mixing cycle at Lake Kivu to variations in the near-surface meteorological conditions. It was found that the annual mixing down to 60 m during the main dry season is primarily due to enhanced lake evaporation and secondarily to the decreased incoming long-wave radiation, both causing a significant heat loss from the lake surface and associated mixolimnion cooling.
Abstract:
Simulating surface wind over complex terrain is a challenge in regional climate modelling. Therefore, this study aims at identifying a set-up of the Weather Research and Forecasting (WRF) model that minimises systematic errors of surface winds in hindcast simulations. Major factors of the model configuration are tested to find a suitable set-up: the horizontal resolution, the planetary boundary layer (PBL) parameterisation scheme and the way WRF is nested to the driving data set. Hence, a number of sensitivity simulations at a spatial resolution of 2 km are carried out and compared to observations. Given the importance of wind storms, the analysis is based on case studies of 24 historical wind storms that caused great economic damage in Switzerland. Each of these events is downscaled using eight different model set-ups, all sharing the same driving data set. The results show that the lack of representation of the unresolved topography leads to a general overestimation of wind speed in WRF. However, this bias can be substantially reduced by using a PBL scheme that explicitly considers the effects of non-resolved topography, which also improves the spatial structure of wind speed over Switzerland. The wind direction, although generally well reproduced, is not very sensitive to the PBL scheme. Further sensitivity tests include four types of nesting methods: nesting only at the boundaries of the outermost domain, analysis nudging, spectral nudging, and the so-called re-forecast method, where the simulation is frequently restarted. These simulations show that restricting the freedom of the model to develop large-scale disturbances slightly increases the temporal agreement with the observations, while further reducing the overestimation of wind speed, especially for maximum wind peaks. The model performance is also evaluated in the outermost domains, where the resolution is coarser.
The results demonstrate the important role of horizontal resolution, where the step from 6 to 2 km significantly improves model performance. In summary, the combination of a grid size of 2 km, the non-local PBL scheme modified to explicitly account for non-resolved orography, as well as analysis or spectral nudging, is a superior combination when dynamical downscaling is aimed at reproducing real wind fields.
Abstract:
We analyse the variability of the probability distribution of daily wind speed in wintertime over Northern and Central Europe in a series of global and regional climate simulations covering the last centuries, and in reanalysis products covering approximately the last 60 years. The focus of the study lies on identifying the link of the variations in the wind speed distribution to the regional near-surface temperature, to the meridional temperature gradient and to the North Atlantic Oscillation. Our main result is that the link between the daily wind distribution and the regional climate drivers is strongly model dependent. The global models tend to behave similarly, although they show some discrepancies. The two regional models also tend to behave similarly to each other, but surprisingly the results derived from each regional model strongly deviate from the results derived from its driving global model. In addition, considering multi-centennial timescales, we find in two global simulations a long-term tendency for the probability distribution of daily wind speed to widen through the last centuries. The cause of this widening is likely the effect of the deforestation prescribed in these simulations. We conclude that no clear systematic relationship between the mean temperature, the temperature gradient and/or the North Atlantic Oscillation and the daily wind speed statistics can be inferred from these simulations. The understanding of past and future changes in the distribution of wind speeds, and thus of wind speed extremes, will require a detailed analysis of the representation of the interaction between large-scale and small-scale dynamics.
Abstract:
Data from three previous experiments were analyzed to test the hypothesis that brain waves of spoken or written words can be represented by the superposition of a few sine waves. First, we averaged the data over trials and a set of subjects, and, in one case, over experimental conditions as well. Next we applied a Fourier transform to the averaged data and selected those frequencies with high energy, in no case more than nine in number. The superpositions of these selected sine waves were taken as prototypes. The averaged unfiltered data were the test samples. The prototypes were used to classify the test samples according to a least-squares criterion of fit. The results were seven of seven correct classifications for the first experiment using only three frequencies, six of eight for the second experiment using nine frequencies, and eight of eight for the third experiment using five frequencies.
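The prototype construction and least-squares classification described above can be sketched directly with a discrete Fourier transform. The signals, "word" labels and frequency counts below are toy assumptions, not the experimental data:

```python
import numpy as np

def make_prototype(avg_signal, n_freqs):
    """Superpose the n_freqs highest-energy frequencies of the
    trial-averaged signal; this superposition is the prototype."""
    spec = np.fft.rfft(avg_signal)
    keep = np.argsort(np.abs(spec))[-n_freqs:]   # highest-energy bins
    filtered = np.zeros_like(spec)
    filtered[keep] = spec[keep]
    return np.fft.irfft(filtered, n=len(avg_signal))

def classify(prototypes, test_sample):
    """Assign the (unfiltered) test sample to the prototype with the
    smallest sum of squared differences (least-squares criterion)."""
    errors = {word: np.sum((test_sample - p) ** 2)
              for word, p in prototypes.items()}
    return min(errors, key=errors.get)

# Toy "words": two averaged signals dominated by different frequencies.
t = np.arange(128) / 128
signals = {"word_a": np.sin(2 * np.pi * 5 * t),
           "word_b": np.sin(2 * np.pi * 9 * t)}
prototypes = {w: make_prototype(s, n_freqs=3) for w, s in signals.items()}
print(classify(prototypes, signals["word_a"]))  # -> word_a
```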
Abstract:
In this study, we implement chronic optical imaging of intrinsic signals in rat barrel cortex and repeatedly quantify the functional representation of a single whisker over time. The success of chronic imaging for more than 1 month enabled an evaluation of the normal dynamic range of this sensory representation. In individual animals for a period of several weeks, we found that: (i) the average spatial extent of the quantified functional representation of whisker C2 is surprisingly large, 1.71 mm² (area at half-height); (ii) the location of the functional representation is consistent; and (iii) there are ongoing but nonsystematic changes in spatiotemporal characteristics such as the size, shape, and response amplitude of the functional representation. These results support a modified description of the functional organization of barrel cortex, where although a precisely located module corresponds to a specific whisker, this module is dynamic, large, and overlaps considerably with the modules of many other whiskers.
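The "area at half-height" measure is simple to compute from an imaged response map: count the pixels whose response is at least half the peak amplitude and scale by the pixel area. A minimal sketch on a synthetic Gaussian response (the map and pixel size are made up for illustration):

```python
import numpy as np

def area_at_half_height(response_map, pixel_area_mm2):
    """Area (mm^2) of the region where the imaged response is at
    least half of the peak response amplitude."""
    half_max = response_map.max() / 2.0
    return np.count_nonzero(response_map >= half_max) * pixel_area_mm2

# Toy map: a 2-D Gaussian response centred on the whisker representation.
y, x = np.mgrid[-50:50, -50:50]
gauss = np.exp(-(x**2 + y**2) / (2 * 15.0**2))
print(area_at_half_height(gauss, pixel_area_mm2=0.001))  # roughly 1 mm^2
```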
Abstract:
In this article, a new methodology is presented to obtain representation models for an a priori relation z = u(x1, x2, . . . , xn) (1), given an experimental dataset {zi; x1i, x2i, . . . , xni}, i = 1, 2, . . . , p. In this methodology, a potential energy is initially defined over each possible model for the relationship (1), which allows the application of Lagrangian mechanics to the derived system. Solving the Euler–Lagrange equations of this system yields the optimal solution according to the principle of least action. The defined Lagrangian corresponds to a continuous medium, to which an n-dimensional finite element model has been applied, so a solution to the problem can be obtained by solving a compatible, determined, symmetric linear equation system. The computational implementation of the methodology has improved the process of obtaining the representation models previously derived and published by the authors.
Abstract:
When the act of 'drawing' became what can only be called formalised (a development that can be said to have blossomed during the Renaissance), a separation developed between the drawing and its procurement. Recently, David Ross Scheer, in his book 'The Death of Drawing: Architecture in the Age of Simulation', wrote: '…whereas architectural drawings exist to represent construction, architectural simulations exist to anticipate building performance.' Meanwhile, Paolo Belardi, in his work 'Why Architects Still Draw', likens a drawing to an acorn: 'It is the paradox of the acorn: a project emerges from a drawing – even from a sketch, rough and inchoate – just as an oak tree emerges from an acorn.' He tells us that Giorgio Vasari would work late at night 'seeking to solve the problems of perspective', and he makes a passionate plea that this reflective process allows the concept to evolve, grow and develop. However, without belittling Belardi, the virtual model now needs this self-same treatment, where it is nurtured, coaxed and encouraged to be the inchoate blueprint of the resultant oak tree. The model, too, can now embrace the creative process, going through a first phase of preparation in which it focuses on the problem. The manipulation of the available material can then be incubated so that it is reasoned and generates feedback. This paper traces this shift in perception and methodology, and assesses whether the 2D paper abstraction still has a purpose and role in today's digital world.
Abstract:
Mode of access: Internet.
Abstract:
Head- and tailpieces, initials.