76 results for Sub-seafloor modeling
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The Bajo Segura fault zone (BSFZ) is the northern terminal splay of the Eastern Betic shear zone (EBSZ), a large left-lateral strike-slip fault system of sigmoid geometry stretching more than 450 km from Alicante to Almería. The BSFZ extends from the onshore Bajo Segura basin into the Mediterranean Sea and shows moderate instrumental seismic activity characterized by small earthquakes. Nevertheless, the zone was affected by large historical earthquakes, the largest of which was the 1829 Torrevieja earthquake (IEMS98 X). The onshore area of the BSFZ is marked by active transpressive structures (faults and folds), whereas the offshore area has scarcely been explored from a tectonic point of view. During the EVENT-SHELF cruise, a total of 10 high-resolution single-channel sparker seismic profiles were obtained along and across the offshore Bajo Segura basin. Analysis of these profiles resulted in (a) the identification of six Quaternary seismo-stratigraphic units bounded by five horizons corresponding to regional erosional surfaces related to global sea-level lowstands; and (b) the mapping of the active sub-seafloor structures and their correlation with those described onshore. Moreover, the results suggest that either the Bajo Segura blind thrust fault or the Torrevieja left-lateral strike-slip fault, both of which extend offshore, could be considered the source of the 1829 Torrevieja earthquake. These data improve our understanding of present deformation along the BSFZ and provide new insights into the seismic hazard of the area.
Abstract:
We explore the determinants of usage of six different types of health care services, using Medical Expenditure Panel Survey data for the years 1996-2000. We apply a number of models for univariate count data, including semiparametric, semi-nonparametric, and finite mixture models. We find that the complexity of the model required to fit the data well depends upon the way in which the data are pooled across sexes and over time, and upon the characteristics of the usage measure. Pooling across time and sexes is almost always favored, but when more heterogeneous data are pooled, a more complex statistical model is often required.
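For readers who want to experiment with this kind of model comparison, a minimal sketch in Python is given below. It fits Poisson and negative binomial models to simulated overdispersed counts and compares information criteria; the data, covariates, and parameter values are illustrative stand-ins, not the MEPS variables used in the paper.

```python
import numpy as np
import statsmodels.api as sm

# Simulated stand-in for a usage measure (e.g. doctor visits):
# overdispersed counts, as is typical of health care utilization data.
rng = np.random.default_rng(0)
n_obs = 2000
X = sm.add_constant(rng.normal(size=(n_obs, 2)))   # intercept + 2 covariates
mu = np.exp(X @ np.array([0.5, 0.3, -0.2]))        # conditional mean
y = rng.negative_binomial(2, 2.0 / (2.0 + mu))     # NB counts with mean mu

poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin = sm.NegativeBinomial(y, X).fit(disp=0)     # also estimates dispersion
print(poisson.aic, negbin.aic)                     # compare information criteria
```

With overdispersed data the negative binomial fit will typically show the lower AIC, which is the kind of criterion-based comparison the paper carries out across its richer model set.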
Abstract:
We review recent likelihood-based approaches to modeling demand for medical care. A semi-nonparametric model along the lines of Cameron and Johansson's Poisson polynomial model, but using a negative binomial baseline model, is introduced. We apply these models, as well as semiparametric Poisson, hurdle semiparametric Poisson, and finite mixture of negative binomial models, to six measures of health care usage taken from the Medical Expenditure Panel Survey. We conclude that most of the models lead to statistically similar results, in terms of both information criteria and conditional and unconditional prediction. This suggests that applied researchers may not need to be overly concerned with the choice of which of these models to use when analyzing data on health care demand.
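As a point of reference, the generic form of a hurdle count model, of the kind applied here, can be written as follows (notation mine): a binary process governs whether usage is zero, and a zero-truncated count density governs positive usage,

$$
P(y = 0) = \pi, \qquad P(y = k) = (1 - \pi)\,\frac{f(k)}{1 - f(0)}, \quad k \ge 1,
$$

where $f$ is, for example, a Poisson or negative binomial probability mass function and $\pi$ is typically modeled with a binary-choice specification such as a logit.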
Abstract:
Variational steepest descent approximation schemes for the modified Patlak-Keller-Segel equation with a logarithmic interaction kernel in any dimension are considered. We prove the convergence of the suitably time-interpolated implicit Euler scheme, defined in terms of the Euclidean Wasserstein distance, associated with this equation for sub-critical masses. As a consequence, we recover the recent result on the global-in-time existence of weak solutions to the modified Patlak-Keller-Segel equation with logarithmic interaction kernel in any dimension in the sub-critical case. Moreover, we show how this method performs numerically in one dimension. In this particular case, the numerical scheme corresponds to a standard implicit Euler method for the pseudo-inverse of the cumulative distribution function. We demonstrate its ability to reproduce the blow-up of solutions for super-critical masses easily and without the need for mesh refinement.
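For orientation, the variational steepest descent (minimizing movement) step referred to above has, schematically, the form below, where $\tau$ is the time step, $W_2$ the Euclidean Wasserstein distance, and $\mathcal{F}$ the free-energy functional associated with the equation; in one dimension $W_2$ has an explicit expression in terms of the pseudo-inverses $F^{-1}$ of the cumulative distribution functions, which is what the implicit Euler discretization acts on:

$$
\rho^{k+1} \in \operatorname*{arg\,min}_{\rho} \left\{ \frac{1}{2\tau}\, W_2^2(\rho, \rho^k) + \mathcal{F}[\rho] \right\},
\qquad
W_2^2(\rho, \nu) = \int_0^1 \left| F_\rho^{-1}(s) - F_\nu^{-1}(s) \right|^2 \mathrm{d}s \ \ \text{(in one dimension)}.
$$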
Abstract:
We prove a double commutant theorem for hereditary subalgebras of a large class of C*-algebras, partially resolving a problem posed by Pedersen [8]. Double commutant theorems originated with von Neumann, whose seminal result evolved into an entire field now called von Neumann algebra theory. Voiculescu proved a C*-algebraic double commutant theorem for separable subalgebras of the Calkin algebra. We prove a similar result for hereditary subalgebras which holds for arbitrary corona C*-algebras. (It is not clear how generally Voiculescu's double commutant theorem holds.)
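For context (standard material, not from the abstract): von Neumann's original theorem concerns the commutant $S' = \{\,T \in B(H) : Tx = xT \ \text{for all } x \in S\,\}$ of a subset $S \subseteq B(H)$, and states that for any unital *-subalgebra $M \subseteq B(H)$

$$
M'' = \overline{M}^{\,\mathrm{WOT}},
$$

i.e. the double commutant coincides with the closure of $M$ in the weak operator topology. The C*-algebraic results discussed above are analogues of this statement with $B(H)$ replaced by the Calkin or corona algebras.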
Abstract:
We show how to calibrate CES production and utility functions when indirect taxation affecting inputs and consumption is present. These calibrated functions can then be used in computable general equilibrium models. Taxation modifies the standard calibration procedures since any taxed good has two associated prices and a choice of reference value units has to be made. We also provide an example of computer code to solve the calibration of CES utilities under two alternate normalizations. To our knowledge, this paper fills a methodological gap in the CGE literature.
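A minimal illustrative sketch of one common share-parameter calibration under ad valorem taxes is given below, in Python; the function name, the particular normalization (shares summing to one, calibration at consumer prices gross of tax), and the benchmark numbers are assumptions of this sketch, not the paper's code.

```python
import numpy as np

def calibrate_ces_shares(p_net, t, x, sigma):
    """Calibrate CES share parameters alpha_i from benchmark data.

    p_net : producer (net-of-tax) prices
    t     : ad valorem tax rates, so consumer prices are p_net*(1+t)
    x     : benchmark consumption quantities
    sigma : elasticity of substitution
    """
    p_cons = p_net * (1.0 + t)        # tax-inclusive prices faced by the consumer
    rho = (sigma - 1.0) / sigma       # CES exponent in u = (sum alpha_i x_i^rho)^(1/rho)
    raw = p_cons * x ** (1.0 - rho)   # first-order conditions: alpha_i proportional to p_i x_i^(1-rho)
    return raw / raw.sum()            # normalize shares to sum to one

# Illustrative benchmark: two goods, 10% and 20% taxes, sigma = 0.5.
alpha = calibrate_ces_shares(np.array([1.0, 1.0]),
                             np.array([0.10, 0.20]),
                             np.array([60.0, 40.0]), sigma=0.5)
print(alpha)
```

The key point the paper makes is visible in the sketch: because each taxed good has both a net and a gross price, the calibrated shares change depending on which price is chosen as the reference value unit.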
Abstract:
Research project carried out by a secondary school student and awarded a CIRIT Prize in 2009 to encourage scientific spirit among young people. Given the period of economic turbulence the globalized world is going through, this investigation studies the relationship between episodes of financial euphoria and financial and economic crises, and the periodicity with which they occur. It approaches the question from a historical-economic perspective, through the analysis and comparison of two events, the 1929 stock market crash and the sub-prime crisis, in order to demonstrate the existence of common denominators and, in light of the results, to weigh the conclusions that History offers. It is this periodicity and its implications that the work sets out to test against reality through the practical analysis of two relevant and paradigmatic episodes, supported by the authority of the comparative model established by the economist John Kenneth Galbraith in his book "Breve historia de la euforia financiera".
Abstract:
We study the properties of the well-known replicator dynamics when applied to a finitely repeated version of the Prisoners' Dilemma game. We characterize the behavior of the dynamics under strongly simplifying assumptions (i.e., only three strategies are available) and show that the basin of attraction of defection shrinks as the number of repetitions increases. After discussing the difficulties involved in relaxing these strongly simplifying assumptions, we approach the same model by means of simulations based on genetic algorithms. The resulting simulations describe a behavior of the system very close to the one predicted by the replicator dynamics, without imposing any of the assumptions of the mathematical model. Our main conclusion is that mathematical and computational models are good complements for research in the social sciences. Indeed, while computational models are extremely useful for extending the scope of the analysis to complex scenarios that are hard to analyze mathematically, formal models can be used to verify and explain the outcomes of computational models.
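A minimal sketch of replicator dynamics for a three-strategy repeated Prisoners' Dilemma is given below, under assumptions of mine: the strategies are Always-Defect, Tit-for-Tat, and Always-Cooperate, with stage payoffs T=5, R=3, P=1, S=0 over 10 rounds; the resulting payoff matrix is illustrative, not the paper's.

```python
import numpy as np

# Payoff matrix (row = own strategy, column = opponent) for a
# 10-round repeated Prisoners' Dilemma with stage payoffs
# T=5, R=3, P=1, S=0 and strategies AllD, TFT, AllC (my assumptions).
A = np.array([[10.0, 14.0, 50.0],   # AllD vs AllD / TFT / AllC
              [ 9.0, 30.0, 30.0],   # TFT
              [ 0.0, 30.0, 30.0]])  # AllC

def replicator_step(x, A, dt=0.01):
    """One Euler step of dx_i/dt = x_i * ((A x)_i - x.(A x))."""
    fitness = A @ x               # expected payoff of each strategy
    avg = x @ fitness             # population-average payoff
    return x + dt * x * (fitness - avg)

x = np.array([0.1, 0.2, 0.7])     # initial population shares
for _ in range(20000):
    x = replicator_step(x, A)
print(x)                          # long-run shares of AllD, TFT, AllC
```

Sweeping the initial shares (or the number of rounds used to build the matrix) makes it possible to trace how the basin of attraction of defection changes, which is the effect the paper characterizes analytically.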
Abstract:
This paper is concerned with the modeling and analysis of quantum dissipation phenomena in the Schrödinger picture. More precisely, we investigate in detail a dissipative, nonlinear Schrödinger equation that accounts for quantum Fokker–Planck effects, and show how it can be drastically reduced to a simpler logarithmic equation via a nonlinear gauge transformation in such a way that the physics underlying both problems remains unaltered. From a mathematical viewpoint, this allows for a more tractable analysis of the local well-posedness of the initial–boundary value problem. The simplification requires performing the polar (modulus–argument) decomposition of the wavefunction, which is attained rigorously (for the first time, to the best of our knowledge) under quite reasonable assumptions.
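Schematically, and with notation of mine (the paper's precise equation may differ), a logarithmic Schrödinger equation and the polar decomposition in question take the form

$$
i\,\partial_t \psi = -\tfrac{1}{2}\,\Delta \psi + \lambda \log\!\left(|\psi|^2\right)\psi, \qquad \psi = \sqrt{n}\, e^{iS},
$$

where $n = |\psi|^2$ is the position density, $S$ is the real phase, and $\lambda$ is a real parameter; the gauge transformation acts on the phase so that the dissipative terms of the original equation are absorbed into the logarithmic nonlinearity.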
Abstract:
Observations in daily practice are sometimes registered as positive values larger than a given threshold α. The sample space is in this case the interval (α,+∞), α ≥ 0, which can be structured as a real Euclidean space in different ways. This fact opens the door to alternative statistical models depending not only on the assumed distribution function, but also on the metric considered appropriate, i.e., the way differences are measured and, thus, how variability is quantified.
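One such alternative structure, included here only as an illustration (the paper may consider others), transfers the usual Euclidean structure of the real line to $(\alpha,+\infty)$ through the logarithm of the shifted observation:

$$
x \oplus y = \alpha + (x - \alpha)(y - \alpha), \qquad
\lambda \odot x = \alpha + (x - \alpha)^{\lambda}, \qquad
d(x, y) = \left| \ln(x - \alpha) - \ln(y - \alpha) \right|,
$$

under which differences near the threshold count for more than equal absolute differences far from it, i.e. variability is measured in relative rather than absolute terms.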
Abstract:
This paper is a first draft of the principle of statistical modelling on coordinates. Several causes, which it would take too long to detail, have led to this situation close to the deadline for submitting papers to CODAWORK'03. The main one is the rapid development of the approach over the last few months, which has made previous drafts appear obsolete. The present paper contains the essential parts of the state of the art of this approach from my point of view. I would like to acknowledge many clarifying discussions with the group of people working in this field in Girona, Barcelona, Carrick Castle, Firenze, Berlin, Göttingen, and Freiberg. They have given a lot of suggestions and ideas. Nevertheless, there might still be errors or unclear aspects which are exclusively my fault. I hope this contribution serves as a basis for further discussions and new developments.
Abstract:
Photo-mosaicing techniques have become popular for seafloor mapping in various marine science applications. However, common methods cannot accurately map regions with high relief and topographical variations. Ortho-mosaicing, borrowed from photogrammetry, is an alternative technique that takes the 3-D shape of the terrain into account. A serious bottleneck is the volume of elevation information that needs to be estimated from the video data, fused, and processed to generate a composite ortho-photo covering a relatively large seafloor area. We present a framework that combines the advantages of dense depth-map and 3-D feature estimation techniques based on visual motion cues. The main goal is to identify and reconstruct certain key terrain feature points that adequately represent the surface with minimal complexity, in the form of piecewise planar patches. The proposed implementation uses local depth maps for feature selection, while tracking over several views enables 3-D reconstruction by bundle adjustment. Experimental results with synthetic and real data validate the effectiveness of the proposed approach.
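One ingredient of such a pipeline, fitting a planar patch to reconstructed 3-D feature points, can be sketched as below; this is a generic total-least-squares plane fit in Python, not the paper's specific implementation.

```python
import numpy as np

def fit_plane(points):
    """Total-least-squares plane through an (m, 3) array of 3-D points.

    Returns (centroid, unit_normal): the best-fit plane passes through
    the centroid, and the normal is the direction of least variance.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]    # last right-singular vector = plane normal

# Usage: fit a patch to noisy samples of the plane z = 0.1*x + 0.2*y.
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(200, 2))
z = 0.1 * xy[:, 0] + 0.2 * xy[:, 1] + 0.01 * rng.normal(size=200)
c, n = fit_plane(np.column_stack([xy, z]))
print(c, n)
```

In an ortho-mosaicing context, each such patch approximates the local seafloor surface so that the corresponding image region can be rectified onto it with minimal elevation data.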
Abstract:
This research work deals with the modeling and design of a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational tool. On the one hand, the interest in using the open mobile platform PRIM lies in integrating, in an educational context, several subjects closely related to automatic control theory, embracing communications, signal processing, sensor fusion, and hardware design, among others. On the other hand, the idea is to implement useful navigation strategies so that the robot can serve as a mobile multimedia information point. It is in this context, with navigation strategies oriented toward goal achievement, that a local model predictive control scheme is developed. Such a scheme is presented as a very interesting control strategy for developing the future capabilities of the system.
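As a rough illustration of the model predictive control idea (not the controller identified for PRIM), the sketch below computes one receding-horizon input for a scalar first-order speed model; the model coefficients, horizon, and weight are assumed values.

```python
import numpy as np

def mpc_speed_step(v0, v_ref, a=0.9, b=0.5, N=10, lam=0.1):
    """One receding-horizon input for the scalar speed model
    v[k+1] = a*v[k] + b*u[k].

    Minimizes sum_k (v[k] - v_ref)^2 + lam*u[k]^2 over the horizon N
    in closed form (unconstrained case) and returns the first input.
    """
    F = np.array([a ** (i + 1) for i in range(N)])   # free response of v
    G = np.zeros((N, N))                             # forced response of v
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
    u = np.linalg.solve(G.T @ G + lam * np.eye(N),
                        G.T @ (v_ref - F * v0))
    return u[0]   # apply only the first input, then re-plan

# Illustrative closed loop: drive the speed from 0 toward 1 m/s.
v = 0.0
for _ in range(30):
    u = mpc_speed_step(v, v_ref=1.0)
    v = 0.9 * v + 0.5 * u
print(v)
```

At each sampling instant only the first element of the optimal input sequence is applied and the optimization is repeated with the newly measured speed, which is what makes the scheme a receding-horizon controller.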
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
In previous work we proposed a multi-objective traffic engineering scheme (MHDB-S model) that uses different distribution trees to multicast several flows. In this paper, we propose a heuristic algorithm to create multiple point-to-multipoint (p2mp) LSPs based on the optimum sub-flow values obtained with our MHDB-S model. Moreover, a general problem in supporting multicasting in MPLS networks is the scarcity of labels. To reduce the number of labels used, a label space reduction algorithm is also considered.