879 results for "Computation time delay"
Abstract:
Adding to the ongoing debate regarding vegetation recolonisation in Europe (more particularly its timing) and climate change since the Lateglacial, this study investigates a long sediment core (LL081) from Lake Ledro (652 m a.s.l., southern Alps, Italy). Environmental changes were reconstructed using multiproxy analyses (pollen-based vegetation and climate reconstructions, lake levels, magnetic susceptibility and X-ray fluorescence (XRF) measurements) that recorded climate and land-use changes during the Lateglacial and early-middle Holocene. The well-dated and high-resolution pollen record of Lake Ledro is compared with vegetation records from the southern and northern Alps to trace the history of tree species distribution. An altitude-dependent progressive time delay of the first continuous occurrence of Abies (fir) and of the Larix (larch) development has been observed since the Lateglacial in the southern Alps. This pattern suggests that the mid-altitude Lake Ledro area was not a refuge and that trees originated from lowlands or hilly areas (e.g. the Euganean Hills) in northern Italy. Preboreal oscillations (ca. 11 000 cal BP), Boreal oscillations (ca. 10 200 and 9300 cal BP) and the 8.2 kyr cold event suggest a centennial-scale climate forcing in the studied area. Picea (spruce) expansion occurred preferentially around 10 200 and 8200 cal BP in the south-eastern Alps, and therefore reflects the long-lasting cumulative effects of the successive Boreal oscillations and the 8.2 kyr cold event. The extension of Abies is contemporaneous with the 8.2 kyr event, but its development in the southern Alps benefited from the wettest interval, 8200-7300 cal BP, evidenced by high lake levels, flood activity and pollen-based climate reconstructions. Since ca. 7500 cal BP, a weak signal of pollen-based anthropogenic indicators suggests limited human impact. The period between ca. 5700 and ca. 4100 cal BP is considered a transition to colder and wetter conditions (particularly during summers) that favoured the development of a dense beech (Fagus) forest, which in turn caused a distinctive yew (Taxus) decline. We conclude that climate was the dominant factor controlling vegetation changes and erosion processes during the early and middle Holocene (up to ca. 4100 cal BP).
Abstract:
Brain tumors are among the most aggressive types of cancer in humans, with an estimated median survival time of 12 months and only 4% of patients surviving more than 5 years after diagnosis. Until recently, brain tumor prognosis has been based only on clinical information such as tumor grade and patient age, but there are reports indicating that molecular profiling of gliomas can reveal subgroups of patients with distinct survival rates. We hypothesize that coupling molecular profiling of brain tumors with clinical information might improve predictions of patient survival time and, consequently, better guide future treatment decisions. In order to evaluate this hypothesis, the general goal of this research is to build models for survival prediction of glioma patients using DNA molecular profiles (U133 Affymetrix gene expression microarrays) along with clinical information. First, a predictive Random Forest model is built for binary outcomes (i.e. short- vs. long-term survival) and a small subset of genes whose expression values can be used to predict survival time is selected. Next, a new statistical methodology is developed for predicting time-to-death outcomes using Bayesian ensemble trees. Owing to the large heterogeneity observed within the prognostic classes obtained by the Random Forest model, prediction can be improved by relating time-to-death directly to the gene expression profile. We propose a Bayesian ensemble model for survival prediction that is appropriate for high-dimensional data such as gene expression data. Our approach is based on the ensemble "sum-of-trees" model, which is flexible enough to incorporate additive and interaction effects between genes. We specify a fully Bayesian hierarchical approach and illustrate our methodology for the Cox proportional hazards (CPH), Weibull, and accelerated failure time (AFT) survival models. We overcome the lack of conjugacy using a latent variable formulation to model the covariate effects, which decreases the computation time for model fitting. In addition, our proposed models provide a model-free way to select important predictive prognostic markers based on controlling false discovery rates. We compare the performance of our methods with baseline reference survival methods and apply our methodology to an unpublished data set of brain tumor survival times and gene expression data, selecting genes potentially related to the development of the disease under study. A closing discussion compares the results obtained by the Random Forest and Bayesian ensemble methods from a biological/clinical perspective and highlights the statistical advantages and disadvantages of the new methodology in the context of DNA microarray data analysis.
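As an illustration of the first modelling step described above (a Random Forest classifier for short- vs. long-term survival with an importance-based gene ranking), a minimal sketch follows. It assumes scikit-learn and uses simulated expression data; it is not the thesis's actual pipeline, models or dataset.

```python
# Minimal sketch: Random Forest classification of short- vs. long-term survival
# from gene expression data, with a crude importance-based gene ranking.
# Data shapes and values are simulated placeholders, not the study's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 5000))          # 120 patients x 5000 probe sets (simulated)
y = rng.integers(0, 2, size=120)          # 1 = long-term survivor, 0 = short-term

rf = RandomForestClassifier(n_estimators=300, max_features="sqrt", random_state=0)
print("CV accuracy:", cross_val_score(rf, X, y, cv=5).mean())

rf.fit(X, y)
top_genes = np.argsort(rf.feature_importances_)[::-1][:20]   # candidate prognostic probes
print("Top-ranked probe indices:", top_genes)
```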
Abstract:
Medical instrumentation used in diagnosis and treatment relies on the accurate detection and processing of various physiological events and signals. While signal detection technology has improved greatly in recent years, there remain inherent delays in signal detection/processing. These delays may have significant negative clinical consequences during various pathophysiological events. Reducing or eliminating such delays would increase the ability to provide successful early intervention in certain disorders, thereby increasing the efficacy of treatment. In recent years, a physical phenomenon referred to as Negative Group Delay (NGD), demonstrated in simple electronic circuits, has been shown to temporally advance the detection of analog waveforms. Specifically, the output is temporally advanced relative to the input, as the time delay through the circuit is negative: the circuit output precedes the complete detection of the input signal. This process is referred to as signal advance (SA) detection. An SA circuit model incorporating NGD was designed, developed and tested. It imparts a constant temporal signal advance over a pre-specified spectral range in which the output is almost identical to the input signal (i.e., it has minimal distortion). Certain human patho-electrophysiological events are good candidates for the application of temporally advanced waveform detection. SA technology has potential in early arrhythmia and epileptic seizure detection and intervention. Demonstrating reliable and consistent temporally advanced detection of electrophysiological waveforms may enable intervention in a pathological event (much) earlier than previously possible. SA detection could also be used to improve the performance of neural computer interfaces, neurotherapy applications, radiation therapy and imaging. In this study, the performance of a single-stage SA circuit model on a variety of constructed input signals and human ECGs is investigated. The data obtained are used to quantify and characterize the temporal advances and circuit gain, as well as distortions in the output waveforms relative to their inputs. This project combines elements of physics, engineering, signal processing, statistics and electrophysiology. Its success has important consequences for the development of novel interventional methodologies in cardiology and neurophysiology, as well as significant potential in a broader range of both biomedical and non-biomedical areas of application.
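For intuition about negative group delay, the sketch below computes the group delay of a simple first-order lead network, H(s) = (1 + sτ1)/(1 + sτ2) with τ1 > τ2, which is negative at low frequencies. The time constants are assumed values and the network is only illustrative; it is not the single-stage SA circuit studied in the thesis.

```python
# Sketch: group delay of a lead network H(s) = (1 + s*tau1)/(1 + s*tau2), tau1 > tau2,
# which exhibits negative group delay (phase advance) at low frequencies.
import numpy as np
from scipy import signal

tau1, tau2 = 1e-3, 1e-4                     # assumed time constants (seconds)
w = np.logspace(1, 6, 2000)                 # angular frequencies (rad/s)
_, h = signal.freqs([tau1, 1.0], [tau2, 1.0], worN=w)

phase = np.unwrap(np.angle(h))
group_delay = -np.gradient(phase, w)        # tau_g(w) = -d(phase)/d(omega)

print("Low-frequency group delay (s):", group_delay[0])   # ~ tau2 - tau1 < 0
```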
Abstract:
In contrast to preoperative brain tumor segmentation, the problem of postoperative brain tumor segmentation has rarely been approached so far. We present a fully automatic segmentation method using multimodal magnetic resonance image data and patient-specific semi-supervised learning. The idea behind our semi-supervised approach is to effectively fuse information from both pre- and postoperative image data of the same patient to improve segmentation of the postoperative image. We pose image segmentation as a classification problem and solve it by adopting a semi-supervised decision forest. The method is evaluated on a cohort of 10 high-grade glioma patients, with segmentation performance and computation time comparable or superior to those of a state-of-the-art brain tumor segmentation method. Moreover, our results confirm that the inclusion of preoperative MR images leads to better performance in postoperative brain tumor segmentation.
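A rough analogy to the semi-supervised idea, not the authors' actual decision-forest algorithm: treat labelled voxel features from the preoperative scan as training data and unlabelled postoperative voxels as additional data for a self-trained random forest. The multimodal MRI feature extraction is assumed and replaced by simulated arrays here.

```python
# Hedged sketch: self-training a random forest on labelled preoperative voxel
# features plus unlabelled postoperative voxel features (simulated placeholders).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.semi_supervised import SelfTrainingClassifier

rng = np.random.default_rng(1)
X_pre = rng.normal(size=(5000, 8)); y_pre = rng.integers(0, 2, size=5000)  # labelled preop voxels
X_post = rng.normal(size=(5000, 8))                                        # unlabelled postop voxels

X = np.vstack([X_pre, X_post])
y = np.concatenate([y_pre, -np.ones(len(X_post), dtype=int)])   # -1 marks unlabelled samples

model = SelfTrainingClassifier(RandomForestClassifier(n_estimators=100, random_state=0))
model.fit(X, y)
postop_labels = model.predict(X_post)    # tumour / non-tumour prediction per postop voxel
print(postop_labels.shape)
```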
Abstract:
Cataloging geocentric objects can be put in the framework of Multiple Target Tracking (MTT). Current work tends to focus on the S = 2 MTT problem because of its favorable computational complexity of O(n²). The MTT problem becomes NP-hard for dimensions S > 3. The challenge is to find an approximation to the solution within a reasonable computation time. To efficiently approximate this solution, a Genetic Algorithm is used. The algorithm is applied to a simulated test case. These results represent the first steps towards a method that can treat the S > 3 problem efficiently and with minimal manual intervention.
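A toy sketch of a genetic-algorithm approximation to a 3-frame (S = 3) assignment problem follows; the frame sizes, cost function and GA settings are all illustrative assumptions, not the author's setup.

```python
# Toy GA for a 3-frame assignment: each chromosome holds one permutation per later
# frame, linking the detections in that frame to the tracks started in frame 1.
import numpy as np

rng = np.random.default_rng(0)
n = 8                                            # detections per frame (assumed)
tracks = rng.normal(size=(n, 2))                 # frame-1 detections (track seeds)
frames = [tracks + 0.1 * rng.normal(size=(n, 2)) for _ in range(2)]  # frames 2 and 3

def cost(chrom):
    # total squared distance of every later-frame detection to its assigned track
    return sum(np.sum((frames[k][perm] - tracks) ** 2) for k, perm in enumerate(chrom))

def mutate(chrom):
    new = [p.copy() for p in chrom]
    k = rng.integers(len(new))
    i, j = rng.choice(n, size=2, replace=False)
    new[k][i], new[k][j] = new[k][j], new[k][i]  # swap two assignments in one frame
    return new

pop = [[rng.permutation(n) for _ in range(2)] for _ in range(50)]
for _ in range(300):                             # simple (mu + lambda)-style evolution
    pop += [mutate(pop[rng.integers(len(pop))]) for _ in range(50)]
    pop = sorted(pop, key=cost)[:50]

print("best association cost:", cost(pop[0]))
```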
Abstract:
A study of the polarimetric backscattering response of newly formed sea ice types under a large assortment of surface coverage was conducted using a ship-based C-band polarimetric radar system. Polarimetric backscattering results and physical data for 40 stations during the fall freeze-ups of 2003, 2006, and 2007 are presented. Analysis of the copolarized correlation coefficient showed its sensitivity to both sea ice thickness and surface coverage and resulted in a statistically significant separation of ice thickness into two regimes: ice less than 6 cm thick and ice greater than 8 cm thick. A case study quantified the backscatter of a layer of snow-infiltrated frost flowers on new sea ice, showing that the presence of the old frost flowers can enhance the backscatter by more than 6 dB. Finally, a statistical analysis of a series of temporal-spatial measurements over a visually homogeneous frost-flower-covered ice floe identified temperature as a significant, but not exclusive, factor in the backscattering measurements.
Abstract:
The newly introduced temperature proxy, the tetraether index of archaeal lipids with 86 carbon atoms (TEX86), is based on the number of cyclopentane moieties in the glycerol dialkyl glycerol tetraether (GDGT) lipids of marine Crenarchaeota. The composition of sedimentary GDGTs used for TEX86 paleothermometry is thought to reflect sea surface temperature (SST). However, marine Crenarchaeota occur ubiquitously in the world's oceans over the entire depth range and not just in surface waters. We analyzed the GDGT distribution in settling particulate organic matter collected in sediment traps from the northeastern Pacific Ocean and the Arabian Sea to investigate the seasonal and spatial distribution of the fluxes of crenarchaeotal GDGTs and the origin of the TEX86 signal transported to the sediment. In both settings, the TEX86 measured at all trap deployment depths reflects SST. In the Arabian Sea, analysis of an annual time series showed that the SST estimate based on TEX86 in the shallowest trap, at 500 m, followed the in situ SST with a 1 to 3 week time delay, likely caused by the relatively low settling speed of sinking particles. This indicates that the GDGT signal that reaches deeper water is derived from the upper water column rather than from in situ production of GDGTs. The GDGT temperature signal in the deeper traps at 1500 m and 3000 m did not show the seasonal cyclicity observed in the 500 m trap but rather reflected the annual mean SST. This is probably due to a homogenization of the TEX86 SST signal carried by particles as they ultimately reach the interior of the ocean. Our data support the use of TEX86 as a temperature proxy for surface ocean waters.
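For reference, TEX86 is commonly defined from the relative abundances of the cyclopentane-bearing GDGTs, with Cren′ denoting the crenarchaeol regio-isomer; the abstract does not restate the formula, so the standard form is reproduced here as a reader's note:

\[
\mathrm{TEX}_{86} \;=\; \frac{[\mathrm{GDGT\text{-}2}] + [\mathrm{GDGT\text{-}3}] + [\mathrm{Cren'}]}{[\mathrm{GDGT\text{-}1}] + [\mathrm{GDGT\text{-}2}] + [\mathrm{GDGT\text{-}3}] + [\mathrm{Cren'}]}
\]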
Abstract:
Many context-aware applications rely on knowledge of the position of the user and of the surrounding objects to provide advanced, personalized and real-time services. In wide-area deployments, a routing protocol is needed to collect the location information from distant nodes. In this paper, we propose a new source-initiated (on-demand) routing protocol for location-aware applications in IEEE 802.15.4 wireless sensor networks. This protocol uses a low-power MAC layer to maximize the lifetime of the network while keeping the communication delay low. Its performance is assessed through experimental tests that show a good trade-off between power consumption and time delay in the localization of a mobile device.
Abstract:
The time delay of arrival (TDOA) between multiple microphones has been used since 2006 as a source of information (localization) to complement the spectral features for speaker diarization. In this paper, we propose a new localization feature, the intensity channel contribution (ICC), based on the relative energy of the signal arriving at each channel compared to the sum of the energy of all the channels. We demonstrate that by combining the ICC features with the TDOA features, the robustness of the localization features is improved and the diarization error rate (DER) of the complete system (using localization and spectral features) is reduced. Using this new localization feature, we achieve a 5.2% relative DER improvement on our development data, a 3.6% relative DER improvement on the RT07 evaluation data and a 7.9% relative DER improvement on last year's RT09 evaluation data.
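A minimal sketch of the ICC feature as described above: for each analysis frame, the feature for channel i is its energy divided by the total energy over all channels. The frame length and normalisation details are assumptions rather than the paper's exact settings.

```python
# Sketch: per-frame intensity channel contribution (ICC) features.
import numpy as np

def icc_features(channels, frame_len=400):
    """channels: array of shape (n_channels, n_samples) with time-aligned signals."""
    n_ch, n_samp = channels.shape
    n_frames = n_samp // frame_len
    x = channels[:, :n_frames * frame_len].reshape(n_ch, n_frames, frame_len)
    energy = np.sum(x ** 2, axis=2)                             # (n_channels, n_frames)
    return (energy / np.sum(energy, axis=0, keepdims=True)).T   # (n_frames, n_channels)

# Example: 4 distant microphones, 1 s of 16 kHz audio (simulated)
mics = np.random.default_rng(0).normal(size=(4, 16000))
print(icc_features(mics).shape)    # (frames, 4); each row sums to 1
```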
Abstract:
In this paper we show how the efficiency of multibody system (MBS) simulations can be improved in two different ways, by considering both an explicit and an implicit semi-recursive formulation. The explicit method is based on a double velocity transformation that involves the solution of a redundant but compatible system of equations. The high computational cost of this operation has been drastically reduced by taking into account the sparsity pattern of the system. To this end, the method introduces MA48, a high-performance mathematical library provided by the Harwell Subroutine Library. The second method proposed in this paper has the particularity that, depending on the case, between 70% and 85% of the computation time is devoted to the evaluation of force derivatives with respect to the relative position and velocity vectors. Keeping in mind that evaluating these derivatives can be decomposed into concurrent tasks, the main contribution of this paper is a successful and straightforward parallel implementation that distributes the workload among the cores of a quad-core processor, keeping all of them busy and achieving a substantial time reduction with a speedup of 3.2 and near-ideal CPU usage.
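To make the parallelisation idea concrete, the sketch below evaluates independent force-element derivatives concurrently with a process pool; the finite-difference Jacobian and the spring-damper force element are placeholders, not the paper's formulation or library.

```python
# Sketch: force-element derivatives are mutually independent, so they can be
# evaluated concurrently and assembled afterwards. Placeholder force model.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def force_derivative(args):
    """Finite-difference Jacobian of one force element w.r.t. its relative coordinates."""
    f, z, eps = args
    base = f(z)
    return np.column_stack([(f(z + eps * e) - base) / eps for e in np.eye(len(z))])

def spring_damper(z):                        # placeholder force element, z = (x, xdot)
    k, c = 1e4, 50.0
    return np.array([-k * z[0] - c * z[1]])

if __name__ == "__main__":
    states = [np.array([0.01 * i, 0.1]) for i in range(8)]    # one state per force element
    with ProcessPoolExecutor(max_workers=4) as pool:          # e.g. a quad-core machine
        jacobians = list(pool.map(force_derivative,
                                  [(spring_damper, z, 1e-8) for z in states]))
    print(len(jacobians), jacobians[0].shape)                 # 8 Jacobians of shape (1, 2)
```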
Abstract:
The simulation of interest rate derivatives is a powerful tool to face current market fluctuations. However, the complexity of the financial models and the way they are processed require exorbitant computation times, which is in clear conflict with the need for a processing time as short as possible to operate in the financial market. To shorten the computation time of financial derivatives, the use of hardware accelerators becomes a must.
Abstract:
Several methods to improve multiple distant microphone (MDM) speaker diarization based on Time Delay of Arrival (TDOA) features are evaluated in this paper. All of them avoid the use of a single reference channel to calculate the TDOA values and, based on different criteria, select among all possible pairs of microphones a set of pairs that will be used to estimate the TDOAs. The evaluated methods have been named "Dynamic Margin" (DM), "Extreme Regions" (ER), "Most Common" (MC), "Cross Correlation" (XCorr) and "Principal Component Analysis" (PCA). It is shown that all methods improve on the baseline results for the development set and that four of them also improve the results for the evaluation set. Relative DER improvements of 3.49% and 10.77% are obtained for DM and ER, respectively, on the test set. The XCorr and PCA methods achieve relative DER improvements of 36.72% and 30.82% on the test set. Moreover, the computational cost of the XCorr method is 20% lower than that of the baseline.
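All of the pair-selection methods above start from per-pair TDOA estimates. A minimal GCC-PHAT sketch for one microphone pair is given below; the sample rate and search window are assumptions, and the pair-selection criteria (DM, ER, MC, XCorr, PCA) are not reproduced.

```python
# Sketch: TDOA estimation for one microphone pair via GCC-PHAT.
import numpy as np

def gcc_phat_tdoa(x, y, fs=16000, max_delay_s=0.02):
    """Positive return value: y is delayed with respect to x (seconds)."""
    n = 2 * max(len(x), len(y))
    X, Y = np.fft.rfft(x, n), np.fft.rfft(y, n)
    cross = np.conj(X) * Y
    cc = np.fft.irfft(cross / (np.abs(cross) + 1e-12), n)      # PHAT weighting
    max_shift = int(max_delay_s * fs)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1])) # lags -max_shift..+max_shift
    return (np.argmax(np.abs(cc)) - max_shift) / fs

# Example: the second channel lags the first by 10 samples (0.625 ms at 16 kHz)
rng = np.random.default_rng(0)
sig = rng.normal(size=8000)
print(gcc_phat_tdoa(sig, np.roll(sig, 10)))    # ~ 10 / 16000 s
```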
Abstract:
Based on a previously reported logic cell structure (see SPIE, vol. 2038, p. 67-77, 1993), the two types of cells present at the inner and ganglion cell layers of the vertebrate retina, their intracellular responses, and their connections with each other have been simulated. These cells are amacrine and ganglion cells. The main scheme of the authors' configuration is shown in a figure. These two types of cells, as well as some of their possible interconnections, have been implemented with the authors' previously reported optical processing element. As has been shown, the authors' logic structure is able to process two optical binary input signals, the outputs being two logic functions. Moreover, if a delayed feedback from one of the two possible outputs to one or both of the inputs is introduced, a very different behaviour is obtained. Depending on the value of the time delay, an oscillatory output can be obtained from a constant optical input signal. The pulse period and length depend on the delay values, both external and internal, as well as on other control signals. Moreover, chaotic behaviour can also be obtained under certain conditions.
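A toy digital illustration of the delayed-feedback effect described above: a binary cell whose output is fed back to one input after d steps turns a constant input into an oscillation whose period scales with d. The NAND-like update rule is an arbitrary stand-in, not the optical cell's actual function.

```python
# Toy sketch: constant input + delayed feedback -> oscillatory output.
def simulate(delay, steps=40, const_input=1):
    out = [0] * delay                                # assumed initial history on the feedback line
    for _ in range(steps):
        feedback = out[-delay]
        out.append(1 - (const_input and feedback))   # NAND of constant input and feedback
    return out[delay:]

for d in (2, 5):
    print(f"delay={d}:", simulate(d))                # square wave of period 2*d
```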
Abstract:
In informatics there is one kind of complexity that is perceived by everyone. It is the complexity of a concrete, isolated object, normally situated completely within one of the branches universally recognized by the scientific and technical community. Examples of this are the complexity of integrated electronic circuits, the complexity of algorithms and the complexity of software. The first complexity deals with the number of circuit components, the second with computation time and the third with the number of necessary mental discriminations. In order to illustrate my point, I will take up the last complexity, which, moreover, is the least well known.