951 results for Space-time analysis
Abstract:
Spatio-temporal variability in settlement and recruitment, high mortality during the first life-history stages, and selection may determine the genetic structure of cohorts of long-lived marine invertebrates at small scales. We conducted a spatial and temporal analysis of the common Mediterranean sea urchin Paracentrotus lividus to determine the genetic structure of cohorts at different scales. In Tossa de Mar (NW Mediterranean), recruitment was followed over 5 consecutive springs (2006-2010). In spring 2008, recruits and two-year-old individuals were collected at 6 locations along the East and South Iberian coasts separated by 200 to over 1,100 km. All cohorts presented a high genetic diversity based on a fragment of mtCOI. Our results showed a marked genetic homogeneity in the temporal monitoring and a low degree of spatial structure in 2006. In 2008, coupled with an abnormality in the usual circulation patterns in the area, the genetic structure of the southern populations studied changed markedly, with the arrival of many private haplotypes. This highlights the importance of point events in renewing the genetic makeup of populations, which can only be detected through analysis of cohort structure coupling temporal and spatial perspectives.
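The genetic diversity reported for such cohorts is conventionally summarized with Nei's haplotype diversity, Hd = n/(n-1) * (1 - sum of squared haplotype frequencies). A minimal sketch (the haplotype labels below are hypothetical, not the study's data):

```python
from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's haplotype (gene) diversity: Hd = n/(n-1) * (1 - sum p_i^2),
    where p_i is the frequency of haplotype i in a sample of size n."""
    n = len(haplotypes)
    if n < 2:
        raise ValueError("need at least two sequences")
    freqs = [count / n for count in Counter(haplotypes).values()]
    return n / (n - 1) * (1 - sum(p * p for p in freqs))

# Hypothetical cohort: mtCOI haplotype labels observed in a sample of recruits.
cohort = ["H1", "H1", "H2", "H3", "H3", "H3", "H4", "H5"]
print(round(haplotype_diversity(cohort), 3))  # → 0.857
```

A cohort dominated by one haplotype gives Hd near 0; many equally frequent (e.g. private) haplotypes push Hd toward 1, which is why the 2008 influx is visible in this statistic.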
Abstract:
The extension of traditional data mining methods to time series has been effectively applied to a wide range of domains such as finance, econometrics, biology, security, and medicine. Many existing mining methods deal with the task of change-point detection, but very few provide a flexible approach. Querying specific change points with linguistic variables is particularly useful in crime analysis, where intuitive, understandable, and appropriate detection of changes can significantly improve the allocation of resources for timely and concise operations. In this paper, we propose an on-line method for detecting and querying change points in crime-related time series with the use of a meaningful representation and a fuzzy inference system. Change-point detection is based on a shape space representation, and linguistic terms describing geometric properties of the change points are used to express queries, offering the advantage of intuitiveness and flexibility. An empirical evaluation is first conducted on a crime data set to confirm the validity of the proposed method and then on a financial data set to test its general applicability. A comparison to a similar change-point detection algorithm and a sensitivity analysis are also conducted. Results show that the method is able to accurately detect change points at very low computational cost. More broadly, the detection of specific change points within time series of virtually any domain is made more intuitive and more understandable, even for experts not versed in data mining.
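The idea of querying change points with linguistic terms can be illustrated with a single triangular fuzzy set; the sketch below uses an invented "steep increase" term and hypothetical change points, not the paper's actual shape-space representation or inference system:

```python
def tri_membership(x, a, b, c):
    """Triangular fuzzy membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical detected change points: (time index, local slope at the change).
change_points = [(12, 0.1), (40, 0.9), (77, 0.5)]

def query_steep(points, threshold=0.5):
    """Return change points matching the linguistic term 'steep increase',
    modeled here as a triangular fuzzy set on the slope."""
    return [(t, tri_membership(s, 0.3, 1.0, 1.7))
            for t, s in points
            if tri_membership(s, 0.3, 1.0, 1.7) >= threshold]

matches = query_steep(change_points)
print([t for t, _ in matches])  # → [40]
```

The threshold acts as an alpha-cut: analysts tune it to ask for "very steep" versus "somewhat steep" changes without touching the detector itself.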
Abstract:
With the aim of better understanding avalanche risk in the Catalan Pyrenees, the present work focuses on the analysis of major (or destructive) avalanches. For this purpose, major avalanche cartography was produced by exhaustive photointerpretation of several flights, winter and summer field surveys, and inquiries to the local population. Major avalanche events were used to quantify the magnitude of the episodes during which they occurred, and a Major Avalanche Activity Magnitude Index (MAAMI) was developed. This index is based on the number of major avalanches registered and their estimated frequency in a given time period; hence it quantifies the magnitude of a major avalanche episode or winter. Furthermore, it permits a comparison of the magnitude between major avalanche episodes in a given mountain range, or between mountain ranges, and for a long enough period it should allow analysis of temporal trends. Major episodes from winter 1995/96 to 2013/14 were reconstructed. Their magnitude, frequency and extent were also assessed. During the last 19 winters, the episodes of January 22-23 and February 6-8 in 1996 were those with the highest MAAMI values, followed by January 30-31, 2003, January 29, 2006, and January 24-25, 2014. To analyze the whole twentieth century, a simplified MAAMI was defined to attain the same purpose with a less complete dataset. With less accuracy, the same parameters were obtained at winter time resolution throughout the twentieth century. Again, the 1995/96 winter had the highest MAAMI value, followed by the 1971/72, 1974/75 and 1937/38 winter seasons. The analysis of the spatial extent of the different episodes allowed refining the demarcation of nivological regions and improving our knowledge of the atmospheric patterns that cause major episodes and of their climatic interpretation. In some cases, the analysis revealed the importance of considering a major avalanche episode as the result of a preparatory period followed by a triggering one.
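The abstract does not give the MAAMI formula, so the sketch below is only one plausible reading of "number of registered major avalanches and their estimated frequency": each avalanche contributes the inverse of its estimated annual frequency, so rarer (higher-return-period) events count more. The episode labels and frequencies are hypothetical:

```python
def maami(events):
    """Hypothetical activity-magnitude index: each registered major avalanche
    in an episode contributes 1/f, where f is its estimated annual frequency.
    This is an assumed form for illustration, not the published definition."""
    return {episode: sum(1.0 / f for f in freqs)
            for episode, freqs in events.items()}

episodes = {
    "1996-01-22/23": [0.05, 0.05, 0.1, 0.1, 0.2],  # many rare avalanches
    "2003-01-30/31": [0.2, 0.25, 0.5],
}
index = maami(episodes)
print(max(index, key=index.get))  # → 1996-01-22/23
```

Any index of this shape is additive over avalanches, which is what makes episodes and whole winters comparable on the same scale.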
Abstract:
This thesis presents an approach for formulating and validating a space-averaged drag model for coarse mesh simulations of gas-solid flows in fluidized beds using the two-fluid model. Proper modeling of fluid dynamics is central to understanding any industrial multiphase flow. The gas-solid flows in fluidized beds are heterogeneous and usually simulated with the Eulerian description of phases. Such a description requires the use of fine meshes and small time steps for the proper prediction of its hydrodynamics. This constraint on the mesh and time step size results in a large number of control volumes and long computational times, which are unaffordable for simulations of large scale fluidized beds. If proper closure models are not included, coarse mesh simulations of fluidized beds do not give reasonable results. The coarse mesh simulation fails to resolve the mesoscale structures and results in uniform solids concentration profiles. For a circulating fluidized bed riser, such predicted profiles result in a higher drag force between the gas and solid phases and an overestimated solids mass flux at the outlet. Thus, there is a need to formulate closure correlations which can accurately predict the hydrodynamics using coarse meshes. This thesis uses the space averaging modeling approach in the formulation of closure models for coarse mesh simulations of the gas-solid flow in fluidized beds using Geldart group B particles. In formulating the closure correlation for the space averaged drag model, the main parameters were found to be the averaging size, solid volume fraction, and distance from the wall. The closure model for the gas-solid drag force was formulated and validated for coarse mesh simulations of the riser, verifying this modeling approach. Coarse mesh simulations using the corrected drag model resulted in lowered values of solids mass flux.
Such an approach is a promising tool in the formulation of appropriate closure models which can be used in coarse mesh simulations of large scale fluidized beds.
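Space-averaged drag closures of this family are typically applied as a multiplicative correction to a microscopic drag law, beta_eff = H * beta_micro with a heterogeneity index 0 < H <= 1 depending on the averaging size, solids fraction and wall distance. The functional form of H below is purely an illustrative assumption, not the correlation developed in the thesis:

```python
import math

def h_factor(filter_size, eps_s, d_wall, c0=0.8, l0=0.05):
    """Purely illustrative heterogeneity index H in (0, 1]: the drag
    reduction grows with the averaging (filter) size and with intermediate
    solids fractions, and is damped near the wall. Assumed form only."""
    reduction = c0 * (1 - math.exp(-filter_size / l0)) * 4 * eps_s * (1 - eps_s)
    reduction *= min(d_wall / l0, 1.0)   # weaken the correction near the wall
    return 1.0 - reduction

beta_micro = 1.2e4   # microscopic drag coefficient from a homogeneous law, kg/(m^3 s)
beta_eff = h_factor(filter_size=0.08, eps_s=0.3, d_wall=0.2) * beta_micro
assert 0 < beta_eff < beta_micro   # coarse-mesh drag is reduced, lowering mass flux
```

The key qualitative behavior, matching the abstract, is that H shrinks as the filter (mesh) size grows, so coarser meshes receive a stronger drag reduction.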
Abstract:
We propose to show in this paper that time series obtained from biological systems such as the human brain are invariably nonstationary because of the different time scales involved in the dynamical process. This makes the invariant parameters time dependent. We made a global analysis of EEG data obtained from eight locations on the scalp and simultaneously studied the dynamical characteristics of various parts of the brain. We have proved that the dynamical parameters are sensitive to the time scales, and hence in the study of the brain one must identify all relevant time scales involved in the process to gain insight into its working.
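The time dependence of supposedly invariant parameters can be made visible with simple windowed statistics: on a nonstationary series, a statistic computed per window drifts instead of staying constant. A toy sketch with a synthetic signal (not EEG data):

```python
import math

def windowed_std(x, w):
    """Standard deviation in non-overlapping windows of length w.
    For a stationary series these stay roughly constant; for a
    nonstationary one they drift, so any 'invariant' estimated from a
    single window becomes time dependent."""
    out = []
    for i in range(0, len(x) - w + 1, w):
        seg = x[i:i + w]
        m = sum(seg) / w
        out.append(math.sqrt(sum((v - m) ** 2 for v in seg) / w))
    return out

# Toy signal: a sine whose amplitude changes slowly over time.
sig = [(1 + t / 500) * math.sin(0.3 * t) for t in range(1000)]
stds = windowed_std(sig, 200)
assert stds[-1] > stds[0]   # the windowed 'invariant' grows with time
```

The same windowing logic applies to nonlinear invariants (correlation dimension, Lyapunov exponents), which is what makes the choice of analysis time scale critical.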
Abstract:
Three existing models of Interplanetary Coronal Mass Ejection (ICME) transit between the Sun and the Earth are compared to coronagraph and in situ observations: all three models are found to perform with a similar level of accuracy (i.e. an average error between observed and predicted 1AU transit times of approximately 11 h). To improve long-term space weather prediction, factors influencing CME transit are investigated. Both the removal of the plane of sky projection (as suffered by coronagraph derived speeds of Earth directed CMEs) and the use of observed values of solar wind speed, fail to significantly improve transit time prediction. However, a correlation is found to exist between the late/early arrival of an ICME and the width of the preceding sheath region, suggesting that the error is a geometrical effect that can only be removed by a more accurate determination of a CME trajectory and expansion. The correlation between magnetic field intensity and speed of ejecta at 1AU is also investigated. It is found to be weak in the body of the ICME, but strong in the sheath, if the upstream solar wind conditions are taken into account.
Abstract:
An analysis was made of the risk of disease for premises in the most heavily affected parts of the county of Cumbria during the foot-and-mouth disease epidemic in the UK in 2001. In over half the cases the occurrence of the disease was not directly attributable to a recently infected premises located within 1.5 km. Premises more than 1.5 km from recently infected premises faced sufficiently high infection risks that culling within a 1.5 km radius of the infected premises alone could not have prevented the progress of the epidemic. A comparison of the final outcome in two areas of the county, south Penrith and north Cumbria, indicated that focusing on controlling the potential spread of the disease over short distances by culling premises contiguous to infected premises, while the disease continued to spread over longer distances, may have resulted in excessive numbers of premises being culled. Even though the contiguous cull in south Penrith appeared to have resulted in a smaller proportion of premises becoming infected, the overall proportion of premises culled was considerably greater than in north Cumbria, where, because of staff and resource limitations, a smaller proportion of premises contiguous to infected premises was culled.
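The 1.5 km attribution question reduces to a within-radius test: for each new case, is any recently infected premises inside the radius? A sketch with hypothetical premise coordinates (planar x/y in km, not the Cumbria data):

```python
import math

def fraction_within(cases, sources, radius_km=1.5):
    """Fraction of case premises having at least one recently infected
    premises within radius_km. Coordinates are hypothetical planar km."""
    def near(c):
        return any(math.dist(c, s) <= radius_km for s in sources)
    return sum(near(c) for c in cases) / len(cases)

infected_sources = [(0.0, 0.0), (5.0, 5.0)]
new_cases = [(0.5, 0.5), (1.0, 1.2), (3.0, 3.0), (8.0, 1.0)]
print(fraction_within(new_cases, infected_sources))  # → 0.25
```

A low fraction, as reported for Cumbria, means most transmission was over longer range, so a cull policy defined by the 1.5 km radius alone cannot contain the epidemic.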
Abstract:
We present the symbolic resonance analysis (SRA) as a viable method for addressing the problem of enhancing a weakly dominant mode in a mixture of impulse responses obtained from a nonlinear dynamical system. We demonstrate this using results from a numerical simulation with Duffing oscillators in different domains of their parameter space, and by analyzing event-related brain potentials (ERPs) from a language processing experiment in German as a representative application. In this paradigm, the averaged ERPs exhibit an N400 followed by a sentence final negativity. Contemporary sentence processing models predict a late positivity (P600) as well. We show that the SRA is able to unveil the P600 evoked by the critical stimuli as a weakly dominant mode from the covering sentence final negativity. (c) 2007 American Institute of Physics.
Abstract:
In 1984 and 1985 a series of experiments was undertaken in which dayside ionospheric flows were measured by the EISCAT “Polar” experiment, while observations of the solar wind and interplanetary magnetic field (IMF) were made by the AMPTE UKS and IRM spacecraft upstream from the Earth's bow shock. As a result, 40 h of simultaneous data were acquired, which are analysed in this paper to investigate the relationship between the ionospheric flow and the North-South (Bz) component of the IMF. The ionospheric flow data have 2.5 min resolution, and cover the dayside local time sector from ∼ 09:30 to ∼ 18:30 M.L.T. and the latitude range from 70.8° to 74.3°. Using cross-correlation analysis it is shown that clear relationships do exist between the ionospheric flow and IMF Bz, but that the form of the relations depends strongly on latitude and local time. These dependencies are readily interpreted in terms of a twin-vortex flow pattern in which the magnitude and latitudinal extent of the flows become successively larger as Bz becomes successively more negative. Detailed maps of the flow are derived for a range of Bz values (between ± 4 nT) which clearly demonstrate the presence of these effects in the data. The data also suggest that the morning reversal in the East-West component of flow moves to earlier local times as Bz declines in value and becomes negative. The correlation analysis also provides information on the ionospheric response time to changes in IMF Bz; the response is found to be very rapid indeed. The most rapid response occurs in the noon to mid-afternoon sector, where the westward flows of the dusk cell respond with a delay of 3.9 ± 2.2 min to changes in the North-South field at the subsolar magnetopause. The flows appear to evolve in form over the subsequent ~ 5 min interval, however, as indicated by the longer response times found for the northward component of flow in this sector (6.7 ± 2.2 min), and in data from earlier and later local times.
No evidence is found for a latitudinal gradient in response time; changes in flow take place coherently in time across the entire radar field-of-view.
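The lag-finding step of such a cross-correlation analysis amounts to scanning candidate lags for the one maximizing the Pearson correlation between driver and response. A minimal sketch on toy series (not the EISCAT or AMPTE data):

```python
def xcorr_lag(x, y, max_lag):
    """Return the lag (in samples) at which the Pearson correlation between
    x and y shifted by that lag is largest; a positive lag means y responds
    after x (e.g. ionospheric flow lagging IMF Bz)."""
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        da = sum((u - ma) ** 2 for u in a) ** 0.5
        db = sum((v - mb) ** 2 for v in b) ** 0.5
        return num / (da * db)
    return max(range(max_lag + 1),
               key=lambda k: pearson(x[:len(x) - k], y[k:]))

# Toy driver and a response delayed by exactly 3 samples.
driver = [0, 0, 1, 3, 2, 0, -1, -3, -2, 0, 1, 2, 1, 0, -1]
response = [0] * 3 + driver[:-3]
print(xcorr_lag(driver, response, max_lag=5))  # → 3
```

With 2.5 min resolution, a best lag of k samples corresponds to a response delay of 2.5k minutes, which is how delays like 3.9 ± 2.2 min are localized.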
Abstract:
This work aims at combining the postulates of Chaos theory and the classification and predictive capability of Artificial Neural Networks in the field of financial time series prediction. Chaos theory provides valuable qualitative and quantitative tools to decide on the predictability of a chaotic system. Quantitative measurements based on Chaos theory are used to decide a priori whether a time series, or a portion of a time series, is predictable, while Chaos theory based qualitative tools are used to provide further observations and analysis on the predictability in cases where measurements provide negative answers. Phase space reconstruction is achieved by time delay embedding, resulting in multiple embedded vectors. The suggested cognitive approach is inspired by the capability of some chartists to predict the direction of an index by looking at the price time series. Thus, in this work, the calculation of the embedding dimension and the separation in Takens' embedding theorem for phase space reconstruction is not limited to False Nearest Neighbor, Differential Entropy or any other specific method; rather, this work is interested in all embedding dimensions and separations, regarded as different ways of looking at a time series by different chartists, based on their expectations. Prior to the prediction, the embedded vectors of the phase space are classified with Fuzzy-ART; then, for each class, a back propagation Neural Network is trained to predict the last element of each vector, whereas all previous elements of a vector are used as features.
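The time delay embedding that produces the embedded vectors can be sketched in a few lines; the series below is a toy example, and dim/tau play the roles of the embedding dimension and separation discussed above:

```python
def delay_embed(x, dim, tau):
    """Takens-style delay embedding: map a scalar series into
    dim-dimensional vectors whose components are samples separated
    by tau steps."""
    n = len(x) - (dim - 1) * tau
    return [[x[i + j * tau] for j in range(dim)] for i in range(n)]

series = [0.1, 0.4, 0.9, 0.2, 0.5, 0.8, 0.3, 0.6]
vectors = delay_embed(series, dim=3, tau=2)
print(vectors[0])  # → [0.1, 0.9, 0.5]
```

In the scheme described above, each (dim, tau) pair yields a different set of vectors, i.e. a different "chartist's view"; the last component of each vector is the prediction target and the rest are features.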
Abstract:
This paper proposes to use a state-space technique to represent a frequency dependent line for simulating electromagnetic transients directly in the time domain. The distributed nature of the line is represented by a network of multiple π sections made up of lumped parameters, and the frequency dependence of the per-unit-length longitudinal parameters is matched by using a rational function. The rational function is represented by its equivalent circuit with passive elements. This passive circuit is then inserted in each π circuit of the cascade that represents the line. Because the system is very sparse, it is possible to use a sparsity technique to store only the nonzero elements of the matrix, saving space and running time. The model was used to simulate the energization process of a 10 km single-phase line. ©2008 IEEE.
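Splitting the line into identical lumped π sections can be sketched as follows; the per-unit-length values are illustrative, and the rational-function fitting of the frequency dependence is not shown:

```python
def pi_sections(length_km, r_per_km, l_per_km, c_per_km, n):
    """Split a line of the given length into n identical nominal-pi
    sections: each section carries the full series R and L of its
    segment and half the segment's shunt C at each end."""
    seg = length_km / n
    return {
        "R_series": r_per_km * seg,       # ohm per section
        "L_series": l_per_km * seg,       # H per section
        "C_half_shunt": c_per_km * seg / 2,  # F at each end of a section
        "sections": n,
    }

# 10 km single-phase line as in the paper; parameter values are made up.
params = pi_sections(10, 0.05, 1e-3, 11e-9, 100)
print(params["sections"])  # → 100
```

Each section contributes two state variables (inductor current, capacitor voltage), so the cascade yields a large but very sparse state matrix, which is why the sparsity storage mentioned above pays off.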
Abstract:
This work provides a forward step in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we brought the reader through the fundamental notions of probability and stochastic processes, stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focused on fractional Brownian motion (fBm) and its discrete-time increment process, the fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modeling and estimation of long memory, or long-range dependence (LRD). Time series manifesting long-range dependence are often observed in nature, especially in physics, meteorology and climatology, but also in hydrology, geophysics, economics and many other fields. We deeply studied LRD, giving many real data examples, providing statistical analysis and introducing parametric methods of estimation. Then, we introduced the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modeling systems with long-memory properties. After having introduced the basic concepts, we provided many examples and applications. For instance, we investigated the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems deviating from the classical exponential Debye pattern. Then, we focused on the study of generalizations of the standard diffusion equation, passing through the preliminary study of the fractional forward drift equation. Such generalizations have been obtained by using fractional integrals and derivatives of distributed orders.
In order to find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduced and studied the generalized grey Brownian motion (ggBm), which is actually a parametric class of H-sssi processes whose marginal probability density function evolves in time according to a partial integro-differential equation of fractional type. The ggBm is of course non-Markovian. Throughout the work, we have remarked many times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t). All these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focused on a subclass made up of processes with stationary increments. The ggBm has been defined canonically in the so-called grey noise space. However, we have been able to provide a characterization independent of the underlying probability space. We also pointed out that the generalized grey Brownian motion is a direct generalization of a Gaussian process, and in particular it generalizes Brownian motion and fractional Brownian motion as well. Finally, we introduced and analyzed a more general class of diffusion type equations related to certain non-Markovian stochastic processes. We started from the forward drift equation, which has been made non-local in time by the introduction of a suitably chosen memory kernel K(t). The resulting non-Markovian equation has been interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then considered the subordinated process Y(t)=X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation which involves the same memory kernel K(t).
We developed several applications and derived the exact solutions. Moreover, we considered different stochastic models for the given equations, providing path simulations.
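The long-range dependence of fGn discussed above is visible directly in its autocovariance, gamma(k) = sigma^2/2 * (|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H}), which for H > 1/2 decays so slowly that it is not summable. A minimal sketch of this standard formula:

```python
def fgn_autocovariance(k, hurst, sigma2=1.0):
    """Autocovariance of fractional Gaussian noise at integer lag k:
    gamma(k) = sigma^2/2 * (|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H}).
    For H = 1/2 this reduces to white noise (zero for k >= 1); for
    H > 1/2 the lags carry persistent positive correlation (LRD)."""
    h2 = 2 * hurst
    return sigma2 / 2 * (abs(k + 1) ** h2 - 2 * abs(k) ** h2 + abs(k - 1) ** h2)

print(fgn_autocovariance(0, 0.75))       # → 1.0 (the variance)
print(fgn_autocovariance(5, 0.5))        # → 0.0 (H = 1/2: uncorrelated)
print(fgn_autocovariance(5, 0.75) > 0)   # → True (long-range dependence)
```

Asymptotically gamma(k) ~ H(2H-1) k^{2H-2}, the hyperbolic decay that distinguishes LRD from the exponential decay of short-memory processes.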
A Phase Space Box-counting based Method for Arrhythmia Prediction from Electrocardiogram Time Series
Abstract:
Arrhythmia is a kind of cardiovascular disease that contributes to a large number of deaths and poses a potentially untreatable danger. Arrhythmia is a life-threatening condition originating from disorganized propagation of electrical signals in the heart, resulting in desynchronization among its different chambers. Fundamentally, the synchronization process means that the phase relationship of electrical activities between the chambers remains coherent, maintaining a constant phase difference over time. If desynchronization occurs due to arrhythmia, the coherent phase relationship breaks down, resulting in a chaotic rhythm affecting the regular pumping mechanism of the heart. This phenomenon was explored using the phase space reconstruction technique, a standard analysis technique for time series data generated by nonlinear dynamical systems. In this project a novel index is presented for predicting the onset of ventricular arrhythmias. Analysis of continuously captured long-term ECG recordings was conducted up to the onset of arrhythmia by the phase space reconstruction method, obtaining 2-dimensional images that were analysed by the box counting method. The method was tested using ECG data of three different kinds, normal (NR), Ventricular Tachycardia (VT) and Ventricular Fibrillation (VF), extracted from the Physionet ECG database. Statistical measures like the mean (μ), standard deviation (σ) and coefficient of variation (σ/μ) of the box counts in phase space diagrams are derived for a sliding window of 10 beats of the ECG signal. From the results of these statistical analyses, a threshold was derived as an upper bound on the Coefficient of Variation (CV) for box-counting of ECG phase portraits, which is capable of reliably predicting the impending arrhythmia long before its actual occurrence.
As future work, it is planned to validate this prediction tool over a wider population of patients affected by different kinds of arrhythmia, such as atrial fibrillation and bundle branch block, and to set different thresholds for them, in order to confirm its clinical applicability.
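The box-counting and coefficient-of-variation steps can be sketched as follows; the phase-portrait points and the per-window box counts below are hypothetical, not Physionet data:

```python
def box_count(points, box_size):
    """Number of occupied boxes when the plane is partitioned into square
    boxes of the given side length (the count taken on each 2-D phase
    portrait)."""
    return len({(int(x // box_size), int(y // box_size)) for x, y in points})

def coeff_variation(values):
    """Coefficient of variation sigma/mu of a sequence of box counts."""
    m = sum(values) / len(values)
    sd = (sum((v - m) ** 2 for v in values) / len(values)) ** 0.5
    return sd / m

# Toy 2-D phase portrait: two clusters of points -> 2 occupied boxes.
portrait = [(0.1, 0.2), (0.15, 0.22), (0.8, 0.9), (0.82, 0.88)]
print(box_count(portrait, 0.5))  # → 2

# Hypothetical box counts over sliding 10-beat windows.
counts_normal = [41, 40, 42, 41, 40]   # regular rhythm: stable counts
counts_pre_vt = [38, 55, 30, 60, 25]   # pre-arrhythmic: erratic counts
assert coeff_variation(counts_pre_vt) > coeff_variation(counts_normal)
```

The prediction rule described above is then a simple comparison of the windowed CV against the derived upper-bound threshold.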
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is the verification of safety clearances between individual components, the so-called clearance analysis. For selected components, engineers determine whether they maintain a prescribed safety clearance to the surrounding components, both at rest and during a motion. If components fall below the safety clearance, their shape or position must be changed. For this, it is important to know exactly which regions of the components violate the safety clearance.

In this work we present a solution for the real-time computation of all regions between two geometric objects that fall below the safety clearance. Each object is given as a set of primitives (e.g. triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety clearance and call it the set of all tolerance-violating primitives. We present a holistic solution, which can be divided into the following three major topics.

In the first part of this work we investigate algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that specialized tolerance tests are significantly more performant than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space.

In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach proves to be the most performant.

The second part of this work deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient runtimes, it is particularly important to account for the required safety clearance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. Furthermore, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the previously used uniform grids. We call this data structure Shrubs. Previous approaches to the memory optimization of uniform grids mostly rely on hashing methods, which, however, do not reduce the memory consumption of the cell contents. In our use case, neighboring cells often have similar contents. Our approach is able to losslessly compress the memory footprint of the cell contents of a uniform grid, exploiting the redundant cell contents, to one fifth of the original size, and to decompress it at runtime.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Besides pure clearance analysis, we show applications to various path-planning problems.
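A triangle-triangle tolerance test is typically preceded by a cheap conservative check; the sketch below shows only the vertex-distance pruning idea with toy coordinates, not the dual-space test developed in the thesis:

```python
import math

def vertex_tolerance_prune(tri_a, tri_b, eps):
    """Conservative triangle-triangle tolerance pre-test: if any pair of
    vertices is closer than eps, the true triangle-triangle distance is
    certainly below eps and the pair is tolerance-violating. The converse
    does not hold (edges or faces can be close while all vertices are far),
    so a negative result must still go through an exact tolerance test."""
    return any(math.dist(p, q) < eps for p in tri_a for q in tri_b)

a = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
b = [(0.2, 0.2, 0.3), (1, 1, 1), (2, 0, 1)]
print(vertex_tolerance_prune(a, b, eps=0.5))  # → True
```

This asymmetry (sufficient but not necessary) is exactly why the thesis distinguishes cheap recognition strategies from the exact primitive-primitive tolerance test.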
Abstract:
In this study we provide baseline data on semidemersal fish assemblages and their biology in a heterogeneous and as yet little studied portion of the shelf of Antalya Gulf. The distribution of fish abundance in three transects subject to different fisheries regulations (fishery vs non-fishery areas), and including depths of 10, 25, 75, 125 and 200 m, was studied between May 2014 and February 2015 in months representative of the winter, spring, summer and autumn seasons. A total of 76 fish species belonging to 40 families was collected, and the distribution of semidemersal species was analyzed in comparison with the whole community. The spatial distribution of fish was driven mainly by depth, and two main assemblages were observed: shallow waters (10-25; 75 m) and deep waters (125-200 m). Significant differences among transects were found for the whole community but not for the semidemersal species. Analysis showed that this was due to a strong relation of these species with local environmental characteristics rather than to different fishing pressure over the transects. All species were distributed firstly according to the bathymetric gradient and secondly according to the bottom type structure. Semidemersal species were in turn found to be more related to zooplankton and suspended matter availability. The main morphological characteristics, sex and size distribution of the target semidemersal species Spicara smaris (Linnaeus, 1758), Saurida undosquamis (Richardson, 1848) and Pagellus acarne (Risso, 1827) were also investigated.