921 results for Brownian Motion with Returns to Zero


Relevance:

100.00%

Abstract:

An extensive sample (2%) of private vehicles in Italy is equipped with a GPS device that periodically measures their position and dynamical state for insurance purposes. Access to this type of data makes it possible to develop theoretical and practical applications of great interest: the real-time reconstruction of the traffic state in a given region, the development of accurate models of vehicle dynamics, and the study of the cognitive dynamics of drivers. For these applications to be possible, we first need the ability to reconstruct the paths taken by vehicles on the road network from the raw GPS data. These data are affected by positioning errors and the measurements are often far apart (~2 km), so the task of path identification is not straightforward. This thesis describes the approach we followed to reliably identify vehicle paths from this kind of low-sampling-rate data. The problem of matching data points to roads is solved with a Bayesian maximum-likelihood approach, while the identification of the path taken between two consecutive GPS measurements is performed with a purpose-built optimal routing algorithm based on the A* algorithm. The procedure was applied to an off-line urban data sample and proved to be robust and accurate. Future developments will extend the procedure to real-time execution and nation-wide coverage.
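The thesis's routing code is not reproduced here; a minimal sketch of an A*-based shortest-path search over a weighted road graph, assuming a hypothetical graph/coordinate representation and a straight-line distance heuristic, might look like this:

import heapq
import itertools
import math

def astar(graph, coords, start, goal):
    """A* shortest path on a weighted road graph.
    graph:  {node: [(neighbour, edge_length_m), ...]}
    coords: {node: (x_m, y_m)} used for the straight-line heuristic."""
    def h(n):                                  # admissible heuristic: straight-line distance
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return math.hypot(x1 - x2, y1 - y2)

    tie = itertools.count()                    # breaks ties without comparing nodes
    frontier = [(h(start), next(tie), 0.0, start, None)]
    parent, g_best = {}, {start: 0.0}
    while frontier:
        _, _, g, node, prev = heapq.heappop(frontier)
        if node in parent:
            continue                           # already settled with a shorter path
        parent[node] = prev
        if node == goal:                       # walk the parent chain back to start
            path = [node]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
        for nbr, w in graph.get(node, []):
            ng = g + w
            if ng < g_best.get(nbr, float("inf")):
                g_best[nbr] = ng
                heapq.heappush(frontier, (ng + h(nbr), next(tie), ng, nbr, node))
    return None                                # goal unreachable

Because the straight-line distance never exceeds the true road distance, the heuristic is admissible and the returned path is a genuine shortest path.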

Relevance:

100.00%

Abstract:

This thesis aims to present a packet-level code with performance very close to optimal for satellite communication systems. Its other purpose is to understand whether it remains much harder to handle errors directly rather than erasures. Current satellite communication applications all use packet erasure coding to encode and decode the information. The structure of erasure decoding is very simple, because only a Cyclic Redundancy Check (CRC) is needed to implement it. The problem arises with packets of medium or small size (for example, smaller than 100 bits), because in these situations the cost of the CRC becomes too high. A solution can be found in Vector Symbol Decoding (VSD), which achieves the same performance as erasure codes but without the need for a CRC. First, a brief introduction is given on how packet-level coding originated and evolved. The q-ary Symmetric Channel (qSC) is then introduced, together with the derivation of its capacity and of its Random Coding Bound (RCB). VSD is then proposed, with the aim of outperforming Verification Based Decoding (VBD) on the qSC channel. Finally, the actual performance of VSD is estimated via numerical simulations. Possible performance improvements with respect to VBD are discussed, as well as possible future applications. We also answer the question of whether it is still so much harder to handle errors rather than erasures.
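For reference (the derivation itself is in the thesis, not in this abstract), the capacity of a q-ary symmetric channel with symbol error probability p, the errors being spread uniformly over the remaining q - 1 symbols, takes the standard form, in bits per symbol:

C = \log_2 q - h_2(p) - p \log_2(q - 1), \qquad h_2(p) = -p \log_2 p - (1 - p) \log_2(1 - p).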

Relevance:

100.00%

Abstract:

Network Theory is a prolific and lively field, especially where it meets Biology. New concepts from this theory find application in areas where extensive datasets are already available for analysis, without the need to invest money in collecting them. The only tools necessary to carry out an analysis are easily accessible: a computing machine and a good algorithm. As these two tools progress, thanks to technological advances and human effort, ever larger datasets can be analysed. The aim of this paper is twofold. First, it provides an overview of one of these concepts, which originates at the meeting point between Network Theory and Statistical Mechanics: the entropy of a network ensemble. This quantity has been described from different angles in the literature, and our approach tries to be a synthesis of the different points of view. The second part of the work is devoted to presenting a parallel algorithm that can evaluate this quantity over an extensive dataset. Finally, the algorithm will also be used to analyse high-throughput data coming from biology.
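The paper's parallel algorithm is not reproduced in this abstract; as a minimal sketch, assuming the ensemble is a maximum-entropy one specified by independent link probabilities p_ij, so that its Shannon entropy is S = -sum_{i<j} [p_ij ln p_ij + (1 - p_ij) ln(1 - p_ij)], the evaluation can be split across worker processes row by row:

import numpy as np
from multiprocessing import Pool

def row_entropy(args):
    """Entropy contribution of the links in one row of the probability matrix."""
    i, p_row = args
    p = p_row[i + 1:]                      # upper triangle only: each pair counted once
    p = p[(p > 0) & (p < 1)]               # entries equal to 0 or 1 contribute nothing
    return float(-np.sum(p * np.log(p) + (1 - p) * np.log(1 - p)))

def ensemble_entropy(P, processes=4):
    """Shannon entropy of a graph ensemble with independent link probabilities
    P[i, j], split across worker processes row by row."""
    with Pool(processes) as pool:
        parts = pool.map(row_entropy, [(i, P[i]) for i in range(len(P))])
    return sum(parts)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.uniform(0, 1, size=(200, 200))
    P = (P + P.T) / 2                      # symmetric probabilities: undirected ensemble
    print(ensemble_entropy(P))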

Relevance:

100.00%

Abstract:

The development of a multibody model of a motorbike engine cranktrain is presented in this work, with an emphasis on flexible component model reduction. A modelling methodology based upon the adoption of non-ideal joints at interface locations, and the inclusion of component flexibility, is developed: both are necessary if one wants to capture the dynamic effects that arise in lightweight, high-speed applications. With regard to the first topic, both a ball bearing model and a journal bearing model are implemented in order to properly capture the dynamic effects of the main connections in the system: angular-contact ball bearings are modelled according to a five-DOF nonlinear scheme to capture the behaviour of the crankshaft main bearings, while an impedance-based hydrodynamic bearing model provides enhanced prediction of the operating conditions at the conrod big-end locations. Concerning the second matter, flexible models of the crankshaft and the connecting rod are produced. The well-established Craig-Bampton reduction technique is adopted as a general framework to obtain reduced model representations suitable for the subsequent multibody analyses. A particular component mode selection procedure, based on the concept of Effective Interface Mass, is implemented, allowing an assessment of the accuracy of the reduced models prior to the nonlinear simulation phase. In addition, a procedure to alleviate the effects of modal truncation, based on the Modal Truncation Augmentation approach, is developed. In order to assess the performance of the proposed modal reduction schemes, numerical tests are performed on the crankshaft and conrod models in both the frequency and modal domains. A multibody model of the cranktrain is eventually assembled and simulated using commercial software. Numerical results are presented, demonstrating the effectiveness of the implemented flexible model reduction techniques, and the advantages over the conventional frequency-based truncation approach are discussed.
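The reduction code used in the thesis is not shown here; a minimal sketch of the basic Craig-Bampton transformation, assuming dense mass and stiffness matrices and a given list of boundary (interface) DOFs, and omitting the Effective Interface Mass selection and the Modal Truncation Augmentation steps, could read:

import numpy as np
from scipy.linalg import eigh

def craig_bampton(M, K, boundary, n_modes):
    """Craig-Bampton reduction of (M, K), keeping the boundary DOFs `boundary`
    plus `n_modes` fixed-interface normal modes."""
    n = M.shape[0]
    b = np.asarray(boundary)
    i = np.setdiff1d(np.arange(n), b)

    Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
    Mii = M[np.ix_(i, i)]

    # Static constraint modes: interior response to unit boundary displacements
    Psi = -np.linalg.solve(Kii, Kib)

    # Fixed-interface normal modes (boundary clamped), mass-normalised
    w2, Phi = eigh(Kii, Mii)
    Phi = Phi[:, :n_modes]

    # Transformation from (boundary DOFs, modal coordinates) to physical DOFs
    T = np.zeros((n, len(b) + n_modes))
    T[b, :len(b)] = np.eye(len(b))
    T[np.ix_(i, np.arange(len(b)))] = Psi
    T[np.ix_(i, np.arange(len(b), len(b) + n_modes))] = Phi

    return T.T @ M @ T, T.T @ K @ T, T

The reduced matrices couple the retained physical boundary DOFs with the generalized coordinates of the fixed-interface modes, which is what makes the representation convenient for assembly into a multibody model.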

Relevance:

100.00%

Abstract:

Finite element techniques for solving the problem of fluid-structure interaction of an elastic solid material in a laminar incompressible viscous flow are described. The mathematical problem consists of the Navier-Stokes equations in the Arbitrary Lagrangian-Eulerian formulation coupled with a nonlinear structure model, treating the problem as one continuum. The coupling between the structure and the fluid is enforced inside a monolithic framework which solves simultaneously for the fluid and structure unknowns within a single solver. We used the well-known Crouzeix-Raviart finite element pair for discretization in space and the method of lines for discretization in time. A stability result has been proved for the Backward-Euler time-stepping scheme applied to both the fluid and the solid part, combined with the finite element method for the space discretization. The resulting linear system is solved by multilevel domain decomposition techniques: our strategy is to solve several local subproblems over subdomain patches using the Schur-complement or GMRES smoother within a multigrid iterative solver. For validation and evaluation of the accuracy of the proposed methodology, we present results for two FSI benchmark configurations describing the self-induced elastic deformation of a beam attached to a cylinder in a laminar channel flow, allowing stationary as well as periodically oscillating deformations, and for a benchmark proposed by COMSOL Multiphysics in which a narrow vertical structure attached to the bottom wall of a channel bends under the force due to both viscous drag and pressure. Then, as an example of fluid-structure interaction in biomedical problems, we considered the academic numerical test of simulating pressure wave propagation through a straight compliant vessel. All the tests show the applicability and the numerical efficiency of our approach for both two-dimensional and three-dimensional problems.
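The monolithic FSI solver itself cannot be sketched at abstract length; purely as an illustration of the implicit (Backward-Euler) time stepping mentioned above, the following minimal sketch applies it to a generic nonlinear first-order system u' = f(u), with a Newton solve at every step (function names and the example system are hypothetical, not taken from the thesis):

import numpy as np

def backward_euler(f, jac, u0, dt, n_steps, newton_tol=1e-10, newton_max=20):
    """Implicit (backward) Euler for u' = f(u): at each step solve
    u_{n+1} - u_n - dt*f(u_{n+1}) = 0 with Newton's method."""
    u = np.array(u0, dtype=float)
    history = [u.copy()]
    I = np.eye(len(u))
    for _ in range(n_steps):
        v = u.copy()                           # initial Newton guess: previous state
        for _ in range(newton_max):
            r = v - u - dt * f(v)              # residual of the implicit step
            if np.linalg.norm(r) < newton_tol:
                break
            J = I - dt * jac(v)                # Jacobian of the residual
            v -= np.linalg.solve(J, r)
        u = v
        history.append(u.copy())
    return np.array(history)

# Example: damped oscillator u'' + 0.1 u' + u = 0 written as a first-order system
f = lambda u: np.array([u[1], -u[0] - 0.1 * u[1]])
jac = lambda u: np.array([[0.0, 1.0], [-1.0, -0.1]])
traj = backward_euler(f, jac, [1.0, 0.0], dt=0.01, n_steps=1000)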

Relevance:

100.00%

Abstract:

Virgin olive oil (VOO) is a product of high economic and nutritional value, owing to its superior sensory characteristics and its content of minor compounds (phenols and tocopherols). Since the original quality of VOO may change during storage, this study investigated the influence of different storage and shipment conditions on VOO quality, examining solutions such as filtration, dark storage and shipment inside insulated containers to protect it. Different analytical techniques were used to follow the quality changes during virgin olive oil storage and simulated shipments, in terms of basic quality parameters, sensory analysis and evaluation of minor components (phenolic compounds, diglycerides, volatile compounds). Four main research streams are presented in this PhD thesis. The results from the first experimental section revealed that filtration and/or clarification can reduce the unavoidable quality loss of the oil samples during storage, in comparison with unfiltered oil samples. The second section indicated that virgin olive oil freshness, evaluated by diglyceride content, was mainly affected by storage time and temperature. The third section revealed that temperature fluctuations during storage may adversely affect virgin olive oil quality, in terms of hydrolytic rancidity and oxidation. The fourth section showed that virgin olive oil shipped inside insulated containers underwent less hydrolytic and oxidative degradation than oil shipped without an insulation cover. Overall, this PhD thesis highlighted that adequate treatment, such as filtration or clarification, together with good protection against external variables such as temperature and light, will improve the stability of virgin olive oil during storage.

Relevance:

100.00%

Abstract:

This thesis studies some features of multiplex networks; in particular, the analysis focuses on quantifying the differences between the layers of the multiplex. Dissimilarities are evaluated both by observing the connections of single nodes in different layers and by comparing the partitions of the layers. Some important measures for the characterisation of multiplexes are then introduced and subsequently used to build community detection methods. The difference between the partitions of two layers is quantified with a mutual information measure. The use of the hypergeometric test for detecting nodes that are over-represented in a layer is also examined, showing the effectiveness of the test as a function of the similarity of the layers. These methods for characterising the properties of multiplex networks are applied to real biological data. The data were collected by the DILGOM study with the aim of determining the genetic, transcriptomic and metabolic implications of obesity and the metabolic syndrome, and are used by the Mimomics project to determine relations between different omics. In this thesis the metabolic data are analysed with a multiplex network approach to check for differences between the relations among blood compounds of obese and normal-weight individuals.
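The thesis's implementation is not given in this summary; a minimal sketch of the hypergeometric over-representation test for a single node, assuming its neighbour sets in two layers are available as Python sets, might be:

from scipy.stats import hypergeom

def overlap_pvalue(neigh_a, neigh_b, n_nodes):
    """Hypergeometric test for a node's neighbourhood overlap between two layers.
    neigh_a, neigh_b: sets of neighbours of the same node in layers A and B;
    n_nodes: number of possible neighbours (network size minus the node itself).
    Returns P(overlap >= observed) under random, independent neighbourhoods."""
    k = len(neigh_a & neigh_b)                       # observed overlap
    return hypergeom.sf(k - 1, n_nodes, len(neigh_a), len(neigh_b))

# Hypothetical example: 6 shared neighbours out of degrees 10 and 12 in a 500-node multiplex
print(overlap_pvalue(set(range(10)), set(range(4, 16)), 499))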

Relevance:

100.00%

Abstract:

Proper sample size estimation is an important part of clinical trial methodology and is closely related to the precision and power of a trial's results. Trials with sufficient sample sizes are scientifically and ethically justified and more credible than trials with insufficient sizes. Planning clinical trials with inadequate sample sizes can be considered a waste of time and resources, as well as unethical, since patients may be enrolled in a study whose expected results will not be trusted and are unlikely to have an impact on clinical practice. Because of the low emphasis on sample size calculation in orthodontic clinical trials, the objective of this article is to introduce the orthodontic clinician to the importance and the general principles of sample size calculations for randomized controlled trials, to serve as guidance for study design and as a tool for quality assessment when reviewing published clinical trials in our specialty. Examples of calculations are shown for 2-arm parallel trials applicable to orthodontics. The working examples are analyzed, and the implications of design or inherent complexities in each category are discussed.
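The article's worked examples are not reproduced in this abstract; a minimal sketch of the standard normal-approximation formula for a 2-arm parallel trial with a continuous outcome, n per arm = 2 (z_{1-alpha/2} + z_{1-beta})^2 (sigma/delta)^2, with purely hypothetical numbers, might look like this:

from math import ceil
from scipy.stats import norm

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Sample size per arm for a 2-arm parallel trial with a continuous outcome,
    two-sided significance level `alpha`, target `power`, expected difference
    `delta` between the group means and common standard deviation `sd`."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ceil(2 * (z_a + z_b) ** 2 * (sd / delta) ** 2)

# Hypothetical example: detect a 1.0 mm difference, SD 2.0 mm, alpha 0.05, power 80%
print(n_per_arm(delta=1.0, sd=2.0))   # about 63 patients per arm

For a binary outcome or for unequal allocation the formula changes, so this sketch should not be used beyond the continuous, equal-allocation case it states.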

Relevance:

100.00%

Abstract:

Carbonyl sulfide is the most abundant sulfur gas in the atmosphere. We have used MP2 and CCSD(T) theory to study the structures and thermochemistries of carbonyl sulfide interacting with one to four water molecules. We have completed an extensive search for clusters of OCS(H2O)n, where n = 1−4. We located three dimers, two trimers, five tetramers, and four pentamers with the MP2/aug-cc-pVDZ method. In each of the complexes with two or more waters, OCS preferentially interacts with low-energy water clusters. Our results match current theoretical and experimental literature, showing correlation with available geometries and frequencies for the OCS(H2O) species. The CCSD(T)/aug-cc-pVTZ thermochemical values, combined with the average amount of OCS and the saturated concentration of H2O in the troposphere, lead to the prediction of 10^6 OCS(H2O) clusters·cm^−3 and 10^2 OCS(H2O)2 clusters·cm^−3 at 298 K. We predict the structures of OCS(H2O)n, n = 1−4, that should predominate in a low-temperature molecular beam and identify specific infrared vibrations that can be used to identify these different clusters.
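The computed free energies are not quoted in this abstract; a minimal sketch of the law-of-mass-action step that converts a standard Gibbs free energy of association into a cluster number density is given below. All numbers are hypothetical placeholders (they are not the paper's values), chosen only so the output lands near the 10^6 cm^-3 order of magnitude quoted above:

from math import exp

def cluster_concentration(n_a, n_b, dG_kcal, T=298.0):
    """Law-of-mass-action estimate of the number density of an A·B cluster
    (molecules per cm^3) from the standard Gibbs free energy of association
    dG_kcal (kcal/mol, 1 atm standard state) and the monomer number densities."""
    R = 1.987204e-3            # kcal/(mol K)
    kB = 1.380649e-16          # erg/K (cgs units, so densities come out per cm^3)
    P0 = 1.01325e6             # 1 atm in dyn/cm^2
    K = exp(-dG_kcal / (R * T))            # dimensionless equilibrium constant
    return n_a * n_b * kB * T / P0 * K     # n_AB = n_A n_B (kT/P0) K

# Hypothetical inputs: ~500 pptv OCS, saturated water vapour at 298 K, dG = +3.5 kcal/mol
print(f"{cluster_concentration(1.2e10, 7.7e17, 3.5):.1e}")   # ~1e6 clusters per cm^3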

Relevance:

100.00%

Abstract:

Background—Pathology studies on fatal cases of very late stent thrombosis have described incomplete neointimal coverage as a common substrate, in some cases appearing at side-branch struts. Intravascular ultrasound studies have described the association between incomplete stent apposition (ISA) and stent thrombosis, but the mechanism explaining this association remains unclear. Whether the neointimal coverage of nonapposed side-branch and ISA struts is delayed with respect to well-apposed struts is unknown. Methods and Results—Optical coherence tomography studies from 178 stents implanted in 99 patients from 2 randomized trials were analyzed at 9 to 13 months of follow-up. The sample included 38 sirolimus-eluting, 33 biolimus-eluting, 57 everolimus-eluting, and 50 zotarolimus-eluting stents. Optical coherence tomography coverage of nonapposed side-branch and ISA struts was compared with well-apposed struts of the same stent by statistical pooled analysis with a random-effects model. A total of 34 120 struts were analyzed. The risk ratio of delayed coverage was 9.00 (95% confidence interval, 6.58 to 12.32) for nonapposed side-branch versus well-apposed struts, 9.10 (95% confidence interval, 7.34 to 11.28) for ISA versus well-apposed struts, and 1.73 (95% confidence interval, 1.34 to 2.23) for ISA versus nonapposed side-branch struts. Heterogeneity of the effect was observed in the comparison of ISA versus well-apposed struts (H=1.27; I²=38.40) but not in the other comparisons. Conclusions—Coverage of ISA and nonapposed side-branch struts is delayed with respect to well-apposed struts in drug-eluting stents, as assessed by optical coherence tomography.
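The abstract does not state which random-effects estimator was used; the sketch below shows a standard DerSimonian-Laird pooling of per-stent log risk ratios, with hypothetical per-stent inputs, as one way such a pooled risk ratio and confidence interval can be obtained:

import numpy as np

def pool_log_rr(log_rr, var, z=1.96):
    """DerSimonian-Laird random-effects pooling of per-stent log risk ratios.
    log_rr, var: arrays of log risk ratios and their variances (one per stent).
    Returns the pooled risk ratio with its 95% confidence interval."""
    y, v = np.asarray(log_rr, float), np.asarray(var, float)
    w = 1.0 / v                                        # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fe) ** 2)                    # heterogeneity statistic
    tau2 = max(0.0, (Q - (len(y) - 1)) /
               (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                            # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(mu), (np.exp(mu - z * se), np.exp(mu + z * se))

# Hypothetical per-stent estimates, for illustration only
rr, ci = pool_log_rr(np.log([8.1, 9.5, 10.2, 7.4]), [0.04, 0.05, 0.03, 0.06])
print(rr, ci)

The heterogeneity statistic Q computed along the way is also what the I² index reported above is derived from, I² = max(0, (Q - df)/Q).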