957 results for "Analyze space"
Abstract:
Research analysis of electrocardiograms (ECG) is today carried out mostly on time-dependent signals from the different leads, displayed as graphs. ECG parameters are determined by qualified personnel and require particular skills. To support decoding of the cardiac depolarization phase of the ECG, there are methods that analyze space-time convolution charts in three dimensions, in which the heartbeat is described by the trajectory of its electrical vector. On this basis, it can be assumed that all the options available in classical ECG analysis of this time segment can also be obtained with this technique. The investigated three-dimensional ECG visualization techniques, combined with quantitative methods, yield additional features of cardiac depolarization and allow better exploitation of the information content of the given ECG signals.
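The trajectory representation described above can be illustrated with a minimal sketch, assuming three orthogonal leads (a vectorcardiography-style X, Y, Z decomposition) and one simple scalar feature, the loop length; the signals and the feature below are hypothetical illustrations, not the paper's method.

```python
import numpy as np

def vector_trajectory(lead_x, lead_y, lead_z):
    """Stack three orthogonal lead signals into an (N, 3) electrical-vector trajectory."""
    return np.column_stack([lead_x, lead_y, lead_z])

def loop_length(traj):
    """Total path length of the vector loop -- one simple quantitative feature."""
    return float(np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1)))

# Synthetic one-beat loop (hypothetical signals, not real ECG data)
t = np.linspace(0.0, 2.0 * np.pi, 200)
traj = vector_trajectory(np.cos(t), np.sin(t), 0.1 * np.sin(2.0 * t))
print(traj.shape, round(loop_length(traj), 2))
```

Further features (loop area, planarity, principal axes) could be computed from the same (N, 3) array.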
Abstract:
We analyze finite-thickness spacetime bubbles in general relativity. The energy conditions are used to derive a set of criteria constraining the structure of the bubble boundary. For static, spherically symmetric bubbles, we obtain four differential inequalities equivalent to the three most common energy conditions. We show that they reduce to a set of two simple differential inequalities when the effective gravitational potential has a particular form. We then parametrize the spacetime so as to make these inequalities easier to verify when dealing with spacetime bubbles. We treat in particular four bubble profiles, all characterized by a Schwarzschild-de Sitter exterior. We show that our method yields the correct results when the limit of vanishing bubble thickness is taken. We conclude with a brief treatment of the problem of a gravitational wave propagating through a cloud of spacetime bubbles.
Abstract:
The goal of the present work is to analyze space missions that use the terrestrial atmosphere to accomplish orbital maneuvers involving a plane change. A set of analytical solutions is presented for the variation of the orbital elements due to a single passage through the atmosphere, assuming that the interval during which the spacecraft travels through the atmosphere is not too large. The study considers the influence of both lift and drag on the spacecraft orbit. The final equations are tested against numerical integration and agree with the numerical results whenever the perigee height is larger than a critical value. Next, a numerical study is presented of the ratio between the velocity increment required to correct the semimajor-axis decay due to the atmospheric passage and the velocity variation required to obtain the change in inclination. This analysis can be used to decide whether a maneuver passing through the atmosphere can decrease the fuel consumption of the mission and, where the technique applies, whether a multiple passage is more efficient than a single passage.
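As a point of reference for the velocity-variation comparison above, the standard impulsive cost of a pure plane change at speed v is Δv = 2 v sin(Δi/2); a maneuric passage through the atmosphere competes against this figure. A minimal sketch with illustrative numbers (not taken from the paper):

```python
import math

def plane_change_dv(v, delta_i_deg):
    """Impulsive delta-v (same units as v) for a pure plane change of delta_i degrees."""
    return 2.0 * v * math.sin(math.radians(delta_i_deg) / 2.0)

# Circular low orbit at ~7.7 km/s, 10-degree inclination change (illustrative values)
v = 7.7
dv = plane_change_dv(v, 10.0)
print(round(dv, 3))  # ~1.342 km/s
```

The steep cost of impulsive plane changes is what makes aeroassisted alternatives attractive in the first place.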
Abstract:
Graduate Program in Geography - IGCE
Abstract:
At least 10% of glioblastoma relapses occur at distant and even contralateral locations. This disseminated growth limits surgical intervention and contributes to neurological morbidity. Preclinical data pointed toward a role for temozolomide (TMZ) in reducing radiotherapy-induced glioma cell invasiveness. Our objective was to develop and validate a new analysis tool for MRI data to examine the clinical recurrence pattern of glioblastomas. MRIcro software was used to map the location and extent of initial preoperative and recurrent tumors on MRI of 63 patients in the European Organisation for Research and Treatment of Cancer (EORTC) 26981/22981/National Cancer Institute of Canada (NCIC) CE.3 study into the same stereotaxic space. This allowed us to examine changes of site and distance between the initial and the recurrent tumor at the group level. Thirty of the 63 patients were treated using radiotherapy alone, while the other patients completed a radiotherapy-plus-TMZ treatment. Baseline characteristics (median age, KPS) and outcome data (progression-free survival, overall survival) of the patients included in this analysis resemble those of the general study cohort. The patient groups did not differ in the promoter methylation status of O6-methylguanine-DNA methyltransferase (MGMT). The overall frequency of distant recurrences was 20%. Analysis of recurrence patterns revealed no difference between the groups in the size of the recurrent tumor or in the differential effect on the distance of the recurrences from the preoperative tumor location. The data show the feasibility of groupwise recurrence pattern analysis. An effect of TMZ treatment on the recurrence pattern in the EORTC 26981/22981/NCIC CE.3 study could not be demonstrated.
Abstract:
This paper presents and estimates a dynamic choice model in attribute space for rational consumers. In light of the evidence of several state-dependence patterns, the standard attribute-based model is extended by considering a general utility function in which pure-inertia and pure-variety-seeking behaviors arise as particular linear cases. The dynamics of the model are fully characterized by standard dynamic programming techniques. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or variety-seeking, where the consumer shifts among varied products. We run simulations to analyze the consumption paths out of the steady state. Under the hybrid utility assumption, the consumer behaves inertially among the unfamiliar brands for several periods, eventually switching to variety-seeking behavior as the stationary levels are approached. An empirical analysis is run using scanner databases for three product categories: fabric softener, saltine crackers, and catsup. Non-linear specifications provide the best fit to the data, as hybrid functional forms are found in all product categories for most attributes and segments. These results reveal the statistical superiority of the non-linear structure and confirm the gradual trend to seek variety as familiarity with the purchased items increases.
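A toy illustration of how a state-dependent utility can generate the two stationary patterns described above; the familiarity-stock formulation, the greedy decision rule, and all parameters here are hypothetical simplifications, not the paper's estimated model.

```python
import numpy as np

def simulate_choices(theta, base_utils, n_periods=50, decay=0.8):
    """Greedy dynamic choice with utility = base + theta * familiarity stock.
    theta > 0 reproduces inertia (repeat buying); theta < 0, variety seeking."""
    stock = np.zeros(len(base_utils))
    choices = []
    for _ in range(n_periods):
        c = int(np.argmax(base_utils + theta * stock))
        choices.append(c)
        stock *= decay            # familiarity fades over time
        stock[c] += 1.0           # and accumulates on the purchased product
    return choices

base = np.array([1.0, 0.9, 0.8])
inertial = simulate_choices(theta=0.5, base_utils=base)
varied = simulate_choices(theta=-0.5, base_utils=base)
print(len(set(inertial)), len(set(varied)))  # 1 distinct product vs. all 3
```

The sign of the state-dependence term alone flips the stationary pattern, which is the qualitative distinction the abstract draws between the two linear cases.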
Space Competition and Time Delays in Human Range Expansions. Application to the Neolithic Transition
Abstract:
Space competition effects are well known in many microbiological and ecological systems. Here we analyze such an effect in human populations. The Neolithic transition (the change from foraging to farming) was mainly the outcome of a demographic process that spread gradually throughout Europe from the Near East. In Northern Europe, archaeological data show a slowdown in the Neolithic rate of spread that can be related to a high indigenous (Mesolithic) population density hindering the advance, as a result of the space competition between the two populations. We measure this slowdown from a database of 902 Early Neolithic sites and develop a time-delayed reaction-diffusion model with space competition between Neolithic and Mesolithic populations to predict the observed speeds. The comparison of the predicted speed with the observations, and with a previous non-delayed model, shows that both effects, the time delay due to the generation lag and the space competition between populations, are crucial for understanding the observations.
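For context, a commonly used time-delayed front speed in this literature divides Fisher's classical result by a generation-lag correction, v = 2√(aD)/(1 + aT/2); the exact model in the paper may differ, and the parameter values below are illustrative only.

```python
import math

def fisher_speed(a, D):
    """Classical Fisher front speed v = 2*sqrt(a*D) (no delay)."""
    return 2.0 * math.sqrt(a * D)

def delayed_speed(a, D, T):
    """Time-delayed front speed v = 2*sqrt(a*D) / (1 + a*T/2);
    one commonly used hyperbolic reaction-diffusion form (assumed here)."""
    return fisher_speed(a, D) / (1.0 + a * T / 2.0)

# Illustrative values: growth rate a (1/yr), mobility D (km^2/yr), generation lag T (yr)
a, D, T = 0.03, 15.0, 25.0
print(round(fisher_speed(a, D), 2), round(delayed_speed(a, D, T), 2))
```

With a generation lag of a few decades, the delayed speed is noticeably below the classical one, which is the direction of the slowdown the site database exhibits.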
Abstract:
The existing protein sequences span only a very small fraction of sequence space. Natural proteins have overcome a strong negative selective pressure to avoid the formation of insoluble aggregates. Stably folded globular proteins and intrinsically disordered proteins (IDPs) use alternative solutions to the aggregation problem: while in globular proteins folding minimizes access to aggregation-prone regions, IDPs on average display large exposed contact areas. Here, we introduce the concept of the average meta-structure correlation map to analyze sequence space. Using this novel conceptual view, we show that representative ensembles of folded and ID proteins show distinct characteristics and respond differently to sequence randomization. By studying the way evolutionary constraints act on IDPs to disable a negative function (aggregation), we may gain insight into the mechanisms by which function-enabling information is encoded in IDPs.
Abstract:
This paper aims at detecting spatio-temporal clustering in fire sequences using space–time scan statistics, a powerful statistical framework for the analysis of point processes. The methodology is applied to active fire detections in the state of Florida (US) identified by MODIS (Moderate Resolution Imaging Spectroradiometer) during the period 2003–06. Results of the present study show that statistically significant clusters can be detected and localized in specific areas and periods of the year. Three of the five most likely clusters detected for the entire study period are localized in the north of the state and cover forest areas; the other two clusters cover a large zone in the south, corresponding to agricultural land and the prairies in the Everglades. In order to analyze whether the wildfires recur each year during the same period, the analyses were also performed separately for the 4 years: it emerges that clusters of forest fires are more frequent in the hot seasons (spring and summer), while in the southern areas they are present throughout the whole year. The recognition of overdensities of events and the ability to locate them in space and time can help in supporting fire management and in focusing prevention measures.
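The scan-statistic machinery above rests on a Poisson likelihood ratio evaluated over candidate space-time cylinders. A minimal sketch of that ratio for a single cylinder (Kulldorff-style; zone enumeration and Monte Carlo significance testing are omitted, and the counts below are hypothetical):

```python
import math

def poisson_llr(n_in, mu_in, n_total, mu_total):
    """Log likelihood ratio for one space-time cylinder under the Poisson model.
    n_in: observed cases inside the cylinder; mu_in: expected cases inside;
    n_total, mu_total: observed and expected over the whole study region."""
    if n_in <= mu_in:          # simplified check: only elevated-risk zones matter
        return 0.0
    n_out = n_total - n_in
    mu_out = mu_total - mu_in
    return (n_in * math.log(n_in / mu_in)
            + n_out * math.log(n_out / mu_out))

# A cylinder with 50 fires observed where 20 were expected, out of 500 total
print(round(poisson_llr(50, 20.0, 500, 500.0), 2))  # ~16.77
```

The most likely cluster is the cylinder maximizing this ratio; its significance is then assessed by Monte Carlo replication of the null.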
Abstract:
With the aim of better understanding avalanche risk in the Catalan Pyrenees, the present work focuses on the analysis of major (destructive) avalanches. For this purpose, major avalanche cartography was produced by exhaustive photointerpretation of several flights, winter and summer field surveys, and inquiries to the local population. Major avalanche events were used to quantify the magnitude of the episodes during which they occurred, and a Major Avalanche Activity Magnitude Index (MAAMI) was developed. This index is based on the number of major avalanches registered and their estimated frequency in a given time period; hence it quantifies the magnitude of a major avalanche episode or winter. Furthermore, it permits comparison of the magnitude of major avalanche episodes within a given mountain range, or between mountain ranges, and, for a long enough period, it should allow analysis of temporal trends. Major episodes from winter 1995/96 to 2013/14 were reconstructed, and their magnitude, frequency, and extent were assessed. During the last 19 winters, the episodes of January 22-23 and February 6-8, 1996 were those with the highest MAAMI values, followed by January 30-31, 2003, January 29, 2006, and January 24-25, 2014. To analyze the whole twentieth century, a simplified MAAMI was defined in order to attain the same purpose with a less complete dataset. With less accuracy, the same parameters were obtained at winter time resolution throughout the twentieth century. Again, the 1995/96 winter had the highest MAAMI value, followed by the 1971/72, 1974/75 and 1937/38 winter seasons. The analysis of the spatial extent of the different episodes allowed refining the demarcation of nivological regions and improving our knowledge of the atmospheric patterns that cause major episodes, and of their climatic interpretation. In some cases, the importance of considering a major avalanche episode as the result of a preparatory period followed by a triggering one was revealed.
Abstract:
This thesis is concerned with state and parameter estimation in state-space models. The estimation of states and parameters is an important task when mathematical modeling is applied in many different application areas, such as global positioning systems, target tracking, navigation, brain imaging, the spread of infectious diseases, biological processes, telecommunications, audio signal processing, stochastic optimal control, machine learning, and physical systems. In Bayesian settings, the estimation of states or parameters amounts to the computation of the posterior probability density function. Except for a very restricted class of models, it is impossible to compute this density function in closed form; hence, approximation methods are needed. A state estimation problem involves estimating the states (latent variables) that are not directly observed in the output of the system. In this thesis, we use the Kalman filter, the extended Kalman filter, Gauss–Hermite filters, and particle filters to estimate the states based on the available measurements. Among these filters, particle filters are numerical methods that approximate the filtering distributions of non-linear, non-Gaussian state-space models via Monte Carlo. The performance of a particle filter depends heavily on the chosen importance distribution; for instance, an inappropriate choice of importance distribution can lead to failure of convergence of the particle filter algorithm. In this thesis, we analyze the theoretical Lᵖ convergence of the particle filter with general importance distributions, where p ≥ 2 is an integer. The parameter estimation problem is concerned with inferring the model parameters from measurements. For high-dimensional complex models, parameter estimation can be done with Markov chain Monte Carlo (MCMC) methods. In its operation, an MCMC method requires the unnormalized posterior distribution of the parameters and a proposal distribution.
In this thesis, we show how the posterior density function of the parameters of a state-space model can be computed by filtering-based methods, in which the states are integrated out. This type of computation is then applied to estimate the parameters of stochastic differential equations. Furthermore, we compute the partial derivatives of the log-posterior density function and use hybrid Monte Carlo and scaled conjugate gradient methods to infer the parameters of stochastic differential equations. The computational efficiency of MCMC methods depends heavily on the chosen proposal distribution. A commonly used proposal distribution is Gaussian, in which case the covariance matrix must be well tuned. To tune it, adaptive MCMC methods can be used. In this thesis, we propose a new way of updating the covariance matrix using the variational Bayesian adaptive Kalman filter algorithm.
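As a baseline for the filters discussed above, a minimal Kalman filter for a scalar linear-Gaussian state-space model can be sketched as follows; the random-walk model and noise levels are illustrative, not from the thesis.

```python
import numpy as np

def kalman_filter(ys, A, Q, H, R, m0, P0):
    """Kalman filter for x_k = A x_{k-1} + q_k, y_k = H x_k + r_k (scalar case)."""
    m, P = m0, P0
    means = []
    for y in ys:
        # Predict step
        m_pred = A * m
        P_pred = A * P * A + Q
        # Update step
        S = H * P_pred * H + R              # innovation variance
        K = P_pred * H / S                  # Kalman gain
        m = m_pred + K * (y - H * m_pred)
        P = (1.0 - K * H) * P_pred
        means.append(m)
    return np.array(means)

rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(0.0, 0.1, 100))    # latent random walk
ys = x + rng.normal(0.0, 0.5, 100)          # noisy observations
est = kalman_filter(ys, A=1.0, Q=0.01, H=1.0, R=0.25, m0=0.0, P0=1.0)
print(np.mean((est - x) ** 2) < np.mean((ys - x) ** 2))  # filtering reduces error
```

The extended and Gauss–Hermite variants replace the predict/update moments with linearized or quadrature approximations, and particle filters replace them with weighted Monte Carlo samples.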
Abstract:
My thesis consists of three chapters related to the estimation of state-space and stochastic-volatility models. In the first article, we develop a computationally efficient state-smoothing procedure for linear Gaussian state-space models. We show how to exploit the special structure of state-space models to draw the latent states efficiently. We analyze the computational efficiency of methods based on the Kalman filter, of the Cholesky factor algorithm, and of our new method, using operation counts and computational experiments. We show that our method is more efficient in many important cases. The gains are particularly large when the dimension of the observed variables is large, or when repeated draws of the states are needed for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, which is used to analyze transaction count data from financial markets. In the second chapter, we propose a new technique for analyzing multivariate stochastic-volatility models. The proposed method is based on drawing the volatility efficiently from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the returns equation, where the factors are independent stochastic-volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, permitting different Student-t marginals with specific degrees of freedom to capture the heterogeneity of the returns.
We draw the volatility as a single block in the time dimension and one series at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of one return's volatility given the volatilities of the other returns, the parameters, and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic-volatility models and for two multivariate models. In the third chapter, we assess the information contributed by realized-volatility measures to the estimation and forecasting of volatility when prices are measured with and without error. We use stochastic-volatility models. We take the point of view of an investor for whom volatility is an unknown latent variable and realized volatility is a sample quantity carrying information about it. We employ Bayesian Markov chain Monte Carlo methods to estimate the models; these allow the computation not only of posterior densities of the volatility but also of predictive densities of future volatility. We compare volatility forecasts, and the hit rates of those forecasts, with and without the information contained in realized volatility. This approach differs from existing work in the empirical literature, which is mostly limited to documenting the ability of realized volatility to forecast itself. We present empirical applications using daily returns of stock indices and exchange rates. The competing models are applied to the second half of 2008, a notable period in the recent financial crisis.
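The special structure mentioned in the first chapter can be illustrated with a sketch of one standard precision-based approach (not necessarily the thesis's exact algorithm, nor McCausland's method): in a linear-Gaussian model the posterior precision of the states is band tridiagonal, so the posterior mean and a posterior draw both cost O(n) via a tridiagonal Cholesky factorization. The local-level model and its parameters below are illustrative.

```python
import numpy as np

def tridiag_chol(d, e):
    """O(n) Cholesky (lower bidiagonal) of a symmetric tridiagonal matrix."""
    n = len(d)
    ld = np.empty(n); le = np.empty(n - 1)
    ld[0] = np.sqrt(d[0])
    for t in range(1, n):
        le[t - 1] = e[t - 1] / ld[t - 1]
        ld[t] = np.sqrt(d[t] - le[t - 1] ** 2)
    return ld, le

def solve_lower(ld, le, b):
    """Forward-solve L u = b for lower-bidiagonal L."""
    u = np.empty_like(b)
    u[0] = b[0] / ld[0]
    for t in range(1, len(b)):
        u[t] = (b[t] - le[t - 1] * u[t - 1]) / ld[t]
    return u

def solve_upper(ld, le, b):
    """Back-solve L^T x = b."""
    n = len(b)
    x = np.empty_like(b)
    x[-1] = b[-1] / ld[-1]
    for t in range(n - 2, -1, -1):
        x[t] = (b[t] - le[t] * x[t + 1]) / ld[t]
    return x

def draw_states(y, q2, r2, rng):
    """Posterior mean and one posterior draw of the states of a local-level model
    x_t = x_{t-1} + N(0, q2), y_t = x_t + N(0, r2), x_1 ~ N(0, q2), in O(n)."""
    n = len(y)
    d = np.full(n, 1.0 / r2 + 2.0 / q2)   # diagonal of the posterior precision
    d[-1] = 1.0 / r2 + 1.0 / q2           # the last state enters only one transition
    e = np.full(n - 1, -1.0 / q2)         # off-diagonal of the precision
    ld, le = tridiag_chol(d, e)
    mean = solve_upper(ld, le, solve_lower(ld, le, y / r2))
    draw = mean + solve_upper(ld, le, rng.standard_normal(n))  # cov = precision^{-1}
    return mean, draw

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0.0, 0.3, 200)) + rng.normal(0.0, 0.5, 200)
mean, draw = draw_states(y, q2=0.09, r2=0.25, rng=rng)
```

Repeated draws at fixed parameters reuse the same factorization, which is exactly the regime where the chapter reports the largest gains.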
Abstract:
The present article analyzes the urban transformations in the San Victorino and Santa Inés sector of the city of Bogotá D.C. between 1948 and 2010, using a "genealogical methodology" in the process of inquiry. This allows the commonly accepted visions of "progress" and "urban renewal" in the urban-market context to be contrasted with the existence of an informal economy and a population living in conditions of marginality, which together make up a good part of the "popular urban culture" of Bogotá in the 20th and 21st centuries. This approach makes it possible to observe from various perspectives the changes that occurred in this sector of the city, the impacts of the historical events of this period and, in particular, the real effects of an urban-reordering process that began in 1998 and has continued to date, leaving a significant mark on the urban and social physiognomy of the place.
Abstract:
Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, mixing local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. Then, we add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and the stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allowed us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of growth and Gaussians at long times. We interpret these outcomes in light of cellular division events: in the early stages, the phenomena depend on one another in a multiplicative, geometric-based process, whereas at long times they are independent. We conclude that the main ingredients of a good minimalist model of tumor growth at the mesoscopic level are intrinsic cooperative mechanisms and competitive search for space. © 2013 Elsevier Ltd.
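The Eden-like dynamical Monte Carlo step can be sketched as follows; the lattice, the update-rule details, and the parameters are illustrative assumptions, not the paper's calibrated model.

```python
import random

def eden_growth(n_steps, seed=42):
    """Eden-like aggregate via dynamical Monte Carlo: repeatedly pick an occupied
    boundary site and let it occupy a random empty neighbor (competition for space)."""
    rng = random.Random(seed)
    origin = (0, 0)
    occupied = {origin}
    frontier = [origin]                      # occupied sites that may still grow
    for _ in range(n_steps):
        while True:
            i = rng.randrange(len(frontier))
            x, y = frontier[i]
            empty = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if (x + dx, y + dy) not in occupied]
            if empty:
                break
            frontier[i] = frontier[-1]       # fully surrounded: retire this site
            frontier.pop()
        site = rng.choice(empty)             # cell division into the free space
        occupied.add(site)
        frontier.append(site)
    return occupied

colony = eden_growth(500)
print(len(colony))  # 501 cells: the seed plus one per growth step
```

Recording the simulated time between successive divisions of the same site would give the waiting-time distributions the abstract analyzes.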
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)