70 results for Real data
Abstract:
The project will be developed as a web application, following the J2EE standard within what we could call MVC1, so that it can be consulted online, taking advantage of new technologies and Internet access.
Abstract:
Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and forecasts the density of deaths in the life table. Since these values obey a unit sum constraint for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), death densities are transformed into the real space so that the full range of multivariate statistics can be applied, then back-transformed to positive values so that the unit sum constraint is honoured. The structure of the best-known single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
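The transform-and-back-transform step described here can be illustrated with the centred log-ratio (clr) transform from Aitchison's Compositional Data Analysis; the following is a minimal NumPy sketch assuming unit-sum input, not the authors' actual forecasting code:

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform: maps a positive, unit-sum
    composition into unconstrained real space."""
    logx = np.log(x)
    return logx - logx.mean()

def clr_inverse(z):
    """Back-transform: exponentiate and re-close, so the result is
    positive and honours the unit sum constraint."""
    ex = np.exp(z)
    return ex / ex.sum()

# Example: a density of deaths spread over three age groups.
d = np.array([0.2, 0.5, 0.3])
z = clr(d)          # real-valued; ordinary multivariate statistics apply here
assert np.allclose(clr_inverse(z), d)
```

Forecasting then takes place on the transformed values (e.g., with a Lee-Carter-style structure) before back-transforming.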
Abstract:
Traditionally, compositional data has been identified with closed data, and the simplex has been considered the natural sample space for this kind of data. In our opinion, the emphasis on the constrained nature of compositional data has contributed to masking its real nature. More crucial than the constraining property of compositional data is its scale-invariance property. Indeed, when we consider only a few parts of a full composition we are not working with constrained data, yet our data are still compositional. We believe it is necessary to give a more precise definition of composition. This is the aim of this oral contribution.
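The scale-invariance property can be made concrete: statistics built from ratios of parts are unchanged when the vector of parts is rescaled, whether or not the data are closed. A minimal NumPy sketch (the numbers are illustrative):

```python
import numpy as np

x = np.array([10.0, 30.0, 60.0])   # raw amounts, not closed to a constant sum
y = 0.01 * x                        # the same composition on a different scale

# Pairwise log-ratios carry only relative information, so they are
# identical for x and y.
logratios = lambda v: np.log(v[:, None] / v[None, :])
assert np.allclose(logratios(x), logratios(y))
```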
Abstract:
The automatic interpretation of conventional traffic signs is very complex and time-consuming. The paper concerns an automatic warning system for driving assistance. It does not interpret the standard traffic signs on the roadside; the proposal is to incorporate into the existing signs another type of traffic sign whose information will be more easily interpreted by a processor. The information to be added is profuse, and therefore the most important objective is the robustness of the system. The basic proposal of this new philosophy is that the co-pilot system for automatic warning and driving assistance can interpret with greater ease the information contained in the new sign, whilst the human driver only has to interpret the "classic" sign. One of the codings that has been tested with good results, and which seems to us easy to implement, is a rectangular shape with 4 vertical bars of different colours. The size of these signs is equivalent to the size of conventional signs (approximately 0.4 m²). The colour information from the sign can be easily interpreted by the proposed processor, and this interpretation is much easier and quicker than that of the information shown by the pictographs of the classic signs.
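As an illustration of how such a sign might be read, a decoder could split the detected, rectified sign region into four columns and classify the dominant colour of each; a hypothetical sketch (the palette, function name and four-colour alphabet are assumptions, not taken from the paper):

```python
import numpy as np

# Hypothetical palette: nominal RGB values for the bar colours.
PALETTE = {"red": (255, 0, 0), "green": (0, 255, 0),
           "blue": (0, 0, 255), "yellow": (255, 255, 0)}

def decode_sign(region):
    """region: an HxWx3 RGB array of the rectified sign.
    Returns the colour label of each of the 4 vertical bars."""
    h, w, _ = region.shape
    labels = []
    for i in range(4):
        bar = region[:, i * w // 4:(i + 1) * w // 4]
        mean_rgb = bar.reshape(-1, 3).mean(axis=0)
        # Assign the nearest palette colour by Euclidean distance.
        labels.append(min(PALETTE,
                          key=lambda c: np.linalg.norm(mean_rgb - np.array(PALETTE[c]))))
    return labels
```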
Abstract:
This article presents recent WMR (wheeled mobile robot) navigation experiments using local perception knowledge provided by monocular and odometer systems. A local, narrow perception horizon is used to plan safe trajectories towards the objective. To this end, monocular data are proposed as a way to obtain real-time local information by building two-dimensional occupancy grids through time integration of the frames. Path planning is accomplished using attraction potential fields, while trajectory tracking is performed using model predictive control techniques. The results are demonstrated in indoor situations using the available lab platform, a differentially driven mobile robot.
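The attraction potential field used for path planning is commonly a quadratic bowl centred on the goal, whose negative gradient pulls the robot towards it; a minimal sketch of that standard formulation (not necessarily the exact field or gains used by the authors):

```python
import numpy as np

def attraction_force(position, goal, gain=1.0):
    """Negative gradient of U(p) = 0.5 * gain * ||p - goal||^2:
    a force pointing straight at the goal, proportional to distance."""
    return -gain * (np.asarray(position) - np.asarray(goal))

# One simple planning loop: step a short distance along the force.
p, goal = np.array([0.0, 0.0]), np.array([4.0, 3.0])
for _ in range(50):
    p = p + 0.1 * attraction_force(p, goal)
```

In practice repulsive terms derived from the occupancy grid's obstacle cells are added to the same field.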
Abstract:
The objective of this final degree project (PFC) is to develop a rain system for video games and virtual reality applications that is accurate, both in its visual realism and in its behaviour. The project will allow video game developers to incorporate rain zones of varying intensity into their applications using the most modern graphics hardware, so that the rain is not processed by the CPU and therefore cannot slow down the game being created. Two systems have been developed: a rain editor and a real-time viewer.
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
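The two-stage construction (first where the zeros fall, then how the unit is shared among the non-zero parts) can be sketched as a generative simulation; a minimal illustration of the independent-binomial idea with a softmax-style logistic-normal stage, all parameter values being invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, p_present, mu, cov):
    """Stage 1: a Bernoulli incidence vector decides which parts are
    non-zero.  Stage 2: a logistic-normal draw distributes the unit
    among the parts marked present."""
    D = len(p_present)
    samples = np.zeros((n, D))
    for k in range(n):
        present = rng.random(D) < np.asarray(p_present)   # incidence vector
        if not present.any():
            continue                                       # degenerate: all zeros
        z = rng.multivariate_normal(mu, cov)
        w = np.exp(z[present])
        samples[k, present] = w / w.sum()                  # closes to 1 over non-zero parts
    return samples

# Invented parameters: 4 parts, each present with probability 0.8.
X = simulate(5, p_present=[0.8] * 4, mu=np.zeros(4), cov=np.eye(4))
```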
Abstract:
One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…
Abstract:
Most central banks perceive a trade-off between stabilizing inflation and stabilizing the gap between output and desired output. However, the standard new Keynesian framework implies no such trade-off. In that framework, stabilizing inflation is equivalent to stabilizing the welfare-relevant output gap. In this paper, we argue that this property of the new Keynesian framework, which we call the divine coincidence, is due to a special feature of the model: the absence of non-trivial real imperfections. We focus on one such real imperfection, namely, real wage rigidities. When the baseline new Keynesian model is extended to allow for real wage rigidities, the divine coincidence disappears, and central banks indeed face a trade-off between stabilizing inflation and stabilizing the welfare-relevant output gap. We show that not only does the extended model have more realistic normative implications, but it also has appealing positive properties. In particular, it provides a natural interpretation for the dynamic inflation-unemployment relation found in the data.
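The divine coincidence can be stated compactly: in the baseline model, inflation obeys a Phillips curve driven only by the welfare-relevant output gap, so stabilizing one stabilizes the other. A sketch of the standard formulation (the notation is the textbook one and is assumed, not quoted from the paper):

```latex
% Baseline new Keynesian Phillips curve: inflation \pi_t depends only on
% expected inflation and the welfare-relevant output gap x_t.
\pi_t = \beta \, \mathbb{E}_t \pi_{t+1} + \kappa \, x_t
% With a real imperfection such as real wage rigidity, a cost-push
% term u_t appears, breaking the divine coincidence and forcing a
% trade-off between stabilizing \pi_t and stabilizing x_t:
\pi_t = \beta \, \mathbb{E}_t \pi_{t+1} + \kappa \, x_t + u_t
```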
Abstract:
The well-known lack of power of unit root tests has often been attributed to the short length of macroeconomic variables and also to DGPs that depart from the I(1)-I(0) alternatives. This paper shows that by using long spans of annual real GNP and GNP per capita (133 years) high power can be achieved, leading to the rejection of both the unit root and the trend-stationary hypothesis. This suggests that possibly neither model provides a good characterization of these data. Next, more flexible representations are considered, namely, processes containing structural breaks (SB) and fractional orders of integration (FI). Economic justification for the presence of these features in GNP is provided. It is shown that the latter models (FI and SB) are in general preferred to the ARIMA (I(1) or I(0)) ones. As a novelty in this literature, new techniques are applied to discriminate between FI and SB models. It turns out that the FI specification is preferred, implying that GNP and GNP per capita are non-stationary, highly persistent but mean-reverting series. Finally, it is shown that the results are robust when breaks in the deterministic component are allowed for in the FI model. Some macroeconomic implications of these findings are also discussed.
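The fractional alternative generalizes the unit root: the filter (1 - L)^d has a binomial expansion whose weights decay hyperbolically, which is what yields series that are highly persistent yet mean-reverting for 0 < d < 1. A minimal sketch of applying the filter (illustrative, not the paper's estimation code):

```python
import numpy as np

def frac_diff(x, d):
    """Apply the fractional difference filter (1 - L)^d via its
    binomial expansion: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    x = np.asarray(x, dtype=float)
    w = np.empty(len(x))
    w[0] = 1.0
    for k in range(1, len(x)):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return np.array([w[: t + 1] @ x[t::-1] for t in range(len(x))])

# d = 1 reproduces the ordinary first difference (the I(1) case);
# d = 0 leaves the series untouched (the I(0) case).
```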
Abstract:
A biplot, which is the multivariate generalization of the two-variable scatterplot, can be used to visualize the results of many multivariate techniques, especially those that are based on the singular value decomposition. We consider data sets consisting of continuous-scale measurements, their fuzzy coding and the biplots that visualize them, using a fuzzy version of multiple correspondence analysis. Of special interest is the way quality of fit of the biplot is measured, since it is well-known that regular (i.e., crisp) multiple correspondence analysis seriously under-estimates this measure. We show how the results of fuzzy multiple correspondence analysis can be defuzzified to obtain estimated values of the original data, and prove that this implies an orthogonal decomposition of variance. This permits a measure of fit to be calculated in the familiar form of a percentage of explained variance, which is directly comparable to the corresponding fit measure used in principal component analysis of the original data. The approach is motivated initially by its application to a simulated data set, showing how the fuzzy approach can lead to diagnosing nonlinear relationships, and finally it is applied to a real set of meteorological data.
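The fuzzy coding referred to typically replaces each continuous measurement by its memberships in a few overlapping categories that sum to one; a minimal sketch using triangular membership functions over three categories (a common scheme, assumed here for illustration):

```python
import numpy as np

def fuzzy_code(x, lo, mid, hi):
    """Triangular fuzzy coding of x into (low, medium, high) with
    hinge points lo < mid < hi; the memberships sum to 1."""
    x = np.clip(x, lo, hi)
    if x <= mid:
        m = (x - lo) / (mid - lo)
        return np.array([1 - m, m, 0.0])
    m = (x - mid) / (hi - mid)
    return np.array([0.0, 1 - m, m])

# A temperature of 17.5 with hinges at 10 (low), 15 (medium), 25 (high):
print(fuzzy_code(17.5, 10, 15, 25))   # -> [0.   0.75 0.25]
```

Defuzzification goes the other way, recovering an estimated value from the fitted memberships.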
Abstract:
Nowadays, electronic data-processing systems are increasingly significant within the industrial sector. Many needs arise in the world of authentication systems, aeronautical electronics, data storage equipment, telecommunications, etc. These technological needs must be handled by a system that is reliable, robust, fully responsive to external events and correctly meets the imposed timing constraints in order to fulfil its purpose efficiently. This is where real-time embedded systems come into play: they offer high reliability and availability, a fast response to the system's external events, strong operational guarantees and a wide range of applications. This project is intended as an introduction to the world of embedded systems, and also explains the operation of the FreeRTOS real-time operating system, whose programming model is based on tasks that are independent of one another. We give an overview of its operating characteristics, of how it organizes tasks by means of a scheduler, and some examples for designing applications with it.
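The task-plus-scheduler model described here can be illustrated conceptually; the sketch below is a toy round-robin scheduler in Python standing in for FreeRTOS's tick-driven task switching, not the C API (xTaskCreate and friends) that FreeRTOS actually exposes:

```python
from collections import deque

def blink_task():
    while True:
        print("blink: toggle LED")
        yield                     # yield control back to the scheduler

def sensor_task():
    while True:
        print("sensor: read value")
        yield

# Toy round-robin scheduler: each ready task runs until it yields.
ready = deque([blink_task(), sensor_task()])
for _ in range(6):
    task = ready.popleft()
    next(task)
    ready.append(task)
```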
Abstract:
Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in the quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields and calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying the parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere, but although intrinsically very efficient, it is not sufficiently fast to be practicable for near-real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for the improvement of the model profiles in the lowest levels of the troposphere.
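The first step, turning the NWP temperature, humidity and pressure fields into refractive index structure, uses the standard radio refractivity relation; a minimal sketch (this is the widely used formula, not necessarily the exact constants of this work):

```python
def refractivity(T, P, e):
    """Radio refractivity N from temperature T (K), total pressure
    P (hPa) and water vapour pressure e (hPa):
    N = 77.6 / T * (P + 4810 * e / T)."""
    return 77.6 / T * (P + 4810.0 * e / T)

def modified_refractivity(T, P, e, z):
    """Modified refractivity M at height z (m), accounting for Earth
    curvature; ducting (anaprop) occurs in layers where dM/dz < 0."""
    return refractivity(T, P, e) + 0.157 * z

print(modified_refractivity(T=288.0, P=1013.0, e=10.0, z=100.0))
```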
Abstract:
Monitoring thunderstorm activity is an essential part of operational weather surveillance given the potential hazards involved, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, where different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground lightning data and intra-cloud lightning data). In the framework proposed, these objects are the building blocks of a higher-level object, the thunderstorm. The methodology is demonstrated with a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. In contrast, the duration of the maturity phase is much more variable and related to the thunderstorm intensity, defined here in terms of lightning flash rate. Most of the IC and CG flash activity is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase a few more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to the thunderstorm's total duration and to the maximum value of the variables considered. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
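The normalization used to compare life cycles across storms of different durations and intensities can be sketched simply: rescale time by the total duration and each variable by its own maximum. An illustrative snippet (not the authors' code):

```python
import numpy as np

def normalize_life_cycle(times, values):
    """Map a storm's time axis to [0, 1] (fraction of total duration)
    and a variable such as flash rate to [0, 1] of its own maximum,
    so different storms can be overlaid and compared."""
    t = np.asarray(times, dtype=float)
    v = np.asarray(values, dtype=float)
    return (t - t[0]) / (t[-1] - t[0]), v / v.max()
```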
Abstract:
Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements and we illustrate a data flow proposal.
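A concurrent, pipelined data system of the kind described can be illustrated with one thread per stage connected by queues, so that acquisition, star data processing and downstream handling overlap in time; a purely conceptual Python toy, with invented stage roles and no relation to the actual Gaia payload design:

```python
import queue, threading

def stage(inbox, outbox, work):
    """Generic pipeline stage: consume items, process, pass along."""
    while True:
        item = inbox.get()
        if item is None:                  # sentinel: shut the stage down
            if outbox is not None:
                outbox.put(None)
            break
        result = work(item)
        if outbox is not None:
            outbox.put(result)

q1, q2 = queue.Queue(), queue.Queue()
acquire = threading.Thread(target=stage, args=(q1, q2, lambda x: x * 2))
process = threading.Thread(target=stage, args=(q2, None, print))
acquire.start(); process.start()
for sample in range(3):                   # feed the front of the pipeline
    q1.put(sample)
q1.put(None)
acquire.join(); process.join()
```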