852 results for Initial data problem


Relevance:

30.00%

Publisher:

Abstract:

Assuming that the heat capacity of a body is negligible outside certain inclusions, the heat equation degenerates to a parabolic-elliptic interface problem. In this work we aim to detect these interfaces from thermal measurements on the surface of the body. We deduce an equivalent variational formulation for the parabolic-elliptic problem and give a new proof of the unique solvability based on Lions’s projection lemma. For the case that the heat conductivity is higher inside the inclusions, we develop an adaptation of the factorization method to this time-dependent problem. In particular, this shows that the locations of the interfaces are uniquely determined by boundary measurements. The method also yields a numerical algorithm to recover the inclusions and thus the interfaces. We demonstrate how measurement data can be simulated numerically by coupling a finite element method with a boundary element method, and finally we present some numerical results for the inverse problem.
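The degenerate problem described above can be sketched as follows; this is a hedged reconstruction in generic notation (Ω the body, D the union of inclusions, κ the conductivity, c the heat capacity supported in D), not copied from the thesis:

```latex
c\,\partial_t u - \nabla\cdot(\kappa\nabla u) = 0
  \quad \text{in } D\times(0,T), \qquad
-\nabla\cdot(\kappa\nabla u) = 0
  \quad \text{in } (\Omega\setminus\overline{D})\times(0,T),
```

together with the transmission conditions $[u]=0$ and $[\kappa\,\partial_\nu u]=0$ across the interface $\partial D$, boundary data on $\partial\Omega$, and an initial condition for $u$ inside $D$. The parabolic equation holds only inside the inclusions and the elliptic one outside, which is what makes the problem parabolic-elliptic.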

Relevance:

30.00%

Publisher:

Abstract:

The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory, and the CMB remains one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for the data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package, the problem of the aliasing effect in the generation of low resolution maps, and the comparison of the Angular Power Spectrum (APS) extraction performance of the optimal QML method, implemented in the code BolPol, with that of the pseudo-Cl method, implemented in Cromaster. The QML method has then been applied to the Planck data at large angular scales to extract the CMB APS. The same method has also been applied to analyze the TT parity and Low Variance anomalies in the Planck maps, showing a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code has instead been applied to the 408 MHz and 1.42 GHz surveys, focusing on the analysis of the APS of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a new technology, called Photonic Crystals, is exploited to develop a new polarization splitter device, and its performance is compared to that of the devices used today.
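For background, both APS estimation methods mentioned above start from the standard harmonic decomposition of a temperature map; a minimal full-sky sketch in generic notation (not tied to BolPol or Cromaster):

```latex
a_{\ell m} = \int \mathrm{d}\Omega\, T(\hat n)\, Y_{\ell m}^{*}(\hat n),
\qquad
\hat C_{\ell} = \frac{1}{2\ell+1} \sum_{m=-\ell}^{\ell} \left| a_{\ell m} \right|^{2}.
```

On a cut sky this simple estimator becomes a "pseudo-Cl" that must be corrected for the mode coupling induced by the mask, while the QML approach weights the data by the inverse covariance to reach minimum variance at the largest angular scales, at a higher computational cost.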

Relevance:

30.00%

Publisher:

Abstract:

The Internet of Things (IoT) is the next industrial revolution: we will interact naturally with real and virtual devices as a key part of our daily life. This technology shift is expected to be greater than the Web and Mobile combined. As extremely different technologies are needed to build connected devices, the Internet of Things field is a junction between electronics, telecommunications and software engineering. Internet of Things application development happens in silos, often using proprietary and closed communication protocols. It is commonly believed that only by solving the interoperability problem can we have a real Internet of Things. After a deep analysis of the IoT protocols, we identified a set of primitives for IoT applications. We argue that each IoT protocol can be expressed in terms of those primitives, thus solving the interoperability problem at the application protocol level. Moreover, the primitives are network- and transport-independent and make no assumption in that regard. This dissertation presents our implementation of an IoT platform: the Ponte project. Privacy issues follow the rise of the Internet of Things: it is clear that the IoT must ensure resilience to attacks, data authentication, access control and client privacy. We argue that it is not possible to solve the privacy issue without solving the interoperability problem: enforcing privacy rules implies the need to limit and filter the data delivery process. However, filtering data requires knowledge of the format and the semantics of the data: after an analysis of the possible data formats and representations for the IoT, we identify JSON-LD and the Semantic Web as the best solution for IoT applications. Finally, this dissertation presents our approach to increasing the throughput of filtering semantic data by a factor of ten.
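To make the primitives idea concrete, here is a hypothetical, minimal sketch of a transport-independent broker exposing both push-style (publish/subscribe) and pull-style (read of a last-value cache) access to the same data. All names are illustrative and not taken from the Ponte project; the point is only that different protocol interaction styles can be mapped onto a small common set of primitives.

```python
# Hypothetical sketch: a minimal, transport-independent set of IoT
# primitives (publish/subscribe plus read of a last-value cache).
# Names are illustrative, not taken from the Ponte project.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.retained = {}                    # topic -> last published payload
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def publish(self, topic, payload):
        # Push primitive: store the last value and notify subscribers.
        self.retained[topic] = payload
        for cb in self.subscribers[topic]:
            cb(topic, payload)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def read(self, topic):
        # Pull primitive: a request/response access (an HTTP GET or
        # CoAP GET could be mapped onto this) to the retained value.
        return self.retained.get(topic)

broker = Broker()
seen = []
broker.subscribe("sensors/temp", lambda t, p: seen.append(p))
broker.publish("sensors/temp", 21.5)  # MQTT-style push
print(broker.read("sensors/temp"))    # REST-style pull; prints 21.5
```

A protocol adapter per transport (MQTT, CoAP, HTTP, …) would translate its wire operations into these primitives, which is the interoperability claim made in the abstract.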

Relevance:

30.00%

Publisher:

Abstract:

This thesis is a collection of works focused on the topic of Earthquake Early Warning, with special attention to large magnitude events. The topic is addressed from different points of view, and the structure of the thesis reflects the variety of the aspects which have been analyzed. The first part is dedicated to the giant 2011 Tohoku-Oki earthquake. The main features of the rupture process are first discussed. The earthquake is then used as a case study to test the feasibility of Early Warning methodologies for very large events. Limitations of the standard approaches for large events emerge in this chapter; the difficulties are related to the real-time magnitude estimate from the first few seconds of recorded signal. An evolutionary strategy for the real-time magnitude estimate is proposed and applied to the Tohoku-Oki earthquake. In the second part of the thesis a larger number of earthquakes is analyzed, including small, moderate and large events. Starting from the measurement of two Early Warning parameters, the behavior of small and large earthquakes in the initial portion of recorded signals is investigated. The aim is to understand whether small and large earthquakes can be distinguished from the initial stage of their rupture process. A physical model and a plausible interpretation justifying the observations are proposed. The third part of the thesis is focused on practical, real-time approaches for the rapid identification of the potentially damaged zone during a seismic event. Two different approaches for the rapid prediction of the damage area are proposed and tested. The first one is a threshold-based method which uses traditional seismic data; then an innovative approach using continuous GPS data is explored. Both strategies improve the prediction of the large-scale effects of strong earthquakes.
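As a rough illustration of the threshold-based idea described above, the sketch below monitors the running peak of a ground-motion record after the P-wave onset and reports when it first exceeds a threshold. The signal, units and threshold value are all made up for illustration; the thesis's actual parameters and decision rules are not reproduced here.

```python
# Hypothetical sketch of a threshold-based alert: track the peak
# absolute amplitude in an expanding window after the P-wave onset and
# return the time at which it first exceeds a threshold.
def alarm_time(displacement, dt, threshold):
    """Return the time (s after onset) at which the running peak of
    |displacement| first exceeds the threshold, or None if it never does."""
    peak = 0.0
    for i, d in enumerate(displacement):
        peak = max(peak, abs(d))
        if peak > threshold:
            return i * dt
    return None

# Synthetic record sampled at 100 Hz: amplitude growing after onset.
dt = 0.01
record = [0.001 * i for i in range(400)]
print(alarm_time(record, dt, threshold=0.2))  # about 2.01 s after onset
```

In a real system the threshold would be calibrated against ground-motion levels associated with damage, and the expanding window is what makes the estimate evolutionary as more signal arrives.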

Relevance:

30.00%

Publisher:

Abstract:

The Vrancea region, at the south-eastern bend of the Carpathian Mountains in Romania, represents one of the most puzzling seismically active zones of Europe. Besides some shallow seismicity spread across the whole Romanian territory, Vrancea is the site of intense seismicity, with a cluster of intermediate-depth foci located in a narrow, nearly vertical volume. Although large-scale mantle seismic tomographic studies have revealed the presence of a narrow, almost vertical, high-velocity body in the upper mantle, the nature and geodynamics of this deep intra-continental seismicity are still debated. High-resolution seismic tomography could help to reveal more details of the subcrustal structure of Vrancea. Recent developments in computational seismology, as well as the availability of parallel computing, now make it possible to retrieve more information from seismic waveforms and to reach such high-resolution models. This study aimed to evaluate the application of full waveform inversion tomography at regional scale to the Vrancea lithosphere, using data from the six-month temporary local network CALIXTO deployed in 1999. Starting from a detailed 3D Vp, Vs and density model, built by classical travel-time tomography together with gravity data, I evaluated the improvements obtained with the full waveform inversion approach. The latter proved to be highly problem-dependent and computationally expensive. The model retrieved after the first two iterations does not show large variations with respect to the initial model but remains in agreement with previous tomographic models. It presents a well-defined, downgoing, slab-shaped high-velocity anomaly, composed of an N-S horizontal anomaly at depths between 40 and 70 km linked to a nearly vertical NE-SW anomaly from 70 to 180 km.
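Full waveform inversion of the kind described above typically minimizes a waveform misfit functional; a generic least-squares sketch (the misfit actually used in the thesis may differ) is:

```latex
\chi(\mathbf{m}) = \frac{1}{2} \sum_{r} \int_{0}^{T}
\left\| \mathbf{u}_{\mathrm{obs}}(\mathbf{x}_r,t)
      - \mathbf{u}_{\mathrm{syn}}(\mathbf{x}_r,t;\mathbf{m}) \right\|^{2}
\mathrm{d}t ,
```

where $\mathbf{m}$ collects the model parameters (here Vp, Vs and density) and $r$ runs over the receivers. The gradient of $\chi$ is usually computed with the adjoint method, so each iteration requires full forward and adjoint wave simulations, which is why the approach is computationally expensive.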

Relevance:

30.00%

Publisher:

Abstract:

In many application domains data can be naturally represented as graphs. When the application of analytical solutions to a given problem is unfeasible, machine learning techniques can be a viable way to solve it. Classical machine learning techniques are defined for data represented in vectorial form; recently, some of them have been extended to deal directly with structured data. Among those techniques, kernel methods have shown promising results both from the computational complexity and the predictive performance points of view. Kernel methods avoid an explicit mapping into vectorial form by relying on kernel functions, which informally are functions calculating a similarity measure between two entities. However, the definition of good kernels for graphs is a challenging problem because of the difficulty of finding a good tradeoff between computational complexity and expressiveness. Another problem we face is learning on data streams, where a potentially unbounded sequence of data is generated by some source. There are three main contributions in this thesis. The first contribution is the definition of a new family of kernels for graphs based on Directed Acyclic Graphs (DAGs). We analyzed two kernels from this family, achieving state-of-the-art results from both the computational and the classification points of view on real-world datasets. The second contribution consists in making the application of learning algorithms to streams of graphs feasible; moreover, we defined a principled way to manage memory. The third contribution is the application of machine learning techniques for structured data to non-coding RNA function prediction. In this setting, the secondary structure is thought to carry relevant information. However, existing methods considering the secondary structure have prohibitively high computational complexity. We propose to apply kernel methods to this domain, obtaining state-of-the-art results.
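The thesis's DAG-based kernels are not reproduced here, but the core idea of a kernel as a similarity function computed without an explicit feature mapping can be illustrated with a toy example: represent each graph by the multiset of its node labels and take the dot product of the count vectors. Since this is an inner product in feature space, it is a valid (positive semi-definite) kernel, though far less expressive than the DAG kernels the abstract refers to.

```python
# Toy illustration of a graph kernel: the dot product of node-label
# count vectors. Valid kernel (inner product), minimal expressiveness.
from collections import Counter

def label_histogram_kernel(labels_g1, labels_g2):
    c1, c2 = Counter(labels_g1), Counter(labels_g2)
    return sum(c1[label] * c2[label] for label in c1)

g1 = ["C", "C", "O", "H"]   # node labels of a first toy graph
g2 = ["C", "O", "O"]        # node labels of a second toy graph
print(label_histogram_kernel(g1, g2))  # 2*1 (C) + 1*2 (O) = 4
```

Real graph kernels refine exactly this trade-off: richer substructures (walks, subtrees, DAG visits) give more expressive feature spaces at higher computational cost.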

Relevance:

30.00%

Publisher:

Abstract:

One of the most precisely measured quantities in particle physics is the magnetic moment of the muon, which describes its coupling to an external magnetic field. It is expressed in the form of the anomalous magnetic moment of the muon, a_mu = (g_mu-2)/2, and has been determined experimentally with a precision of 0.5 parts per million. The current direct measurement and the theoretical prediction of the standard model differ by more than 3.5 standard deviations. Concerning theory, the contributions of QED and the weak interaction to a_mu can be calculated with very high precision in a perturbative approach. At low energies, however, perturbation theory cannot be used to determine the hadronic contribution a^had_mu. On the other hand, a^had_mu may be derived via a dispersion relation from the sum of measured cross sections of exclusive hadronic reactions. Decreasing the experimental uncertainty on these hadronic cross sections is of utmost importance for an improved standard model prediction of a_mu.

In addition to traditional energy scan experiments, the method of Initial State Radiation (ISR) is used to measure hadronic cross sections. This approach allows experiments at colliders running at a fixed centre-of-mass energy to access smaller effective energies by studying events which contain a high-energetic photon emitted from the initial electron or positron. Using the technique of ISR, the energy range from threshold up to 4.5 GeV can be accessed at Babar.

The cross section e+e- -> pi+pi- contributes approximately 70% of the hadronic part of the anomalous magnetic moment of the muon, a_mu^had. This important channel has been measured with a precision of better than 1%. Therefore, the leading contribution to the uncertainty of a_mu^had at present stems from the invariant mass region between 1 GeV and 2 GeV. In this energy range, the channels e+e- -> pi+pi-pi+pi- and e+e- -> pi+pi-pi0pi0 dominate the inclusive hadronic cross section.
The measurement of the process e+e- -> pi+pi-pi+pi- will be presented in this thesis. This channel has previously been measured by Babar based on 25% of the total dataset. The new analysis includes a more detailed study of the background contamination from other ISR and non-radiative background reactions. In addition, sophisticated studies of the track reconstruction, as well as of the photon efficiency difference between the data and the simulation of the Babar detector, are performed. With these auxiliary studies, a reduction of the systematic uncertainty from 5.0% to 2.4% in the peak region was achieved.

The pi+pi-pi+pi- final state has a rich internal structure. Hints are seen for the intermediate states rho(770)^0 f_2(1270), rho(770)^0 f_0(980), as well as a_1(1260)pi. In addition, the branching ratios BR(jpsi -> pi+pi-pi+pi-) and BR(psitwos -> jpsi pi+pi-) are extracted.
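The dispersion relation referred to above is commonly written in the following standard textbook form (conventions for the kernel vary, and this is not copied from the thesis):

```latex
a_\mu = \frac{g_\mu - 2}{2}, \qquad
a_\mu^{\mathrm{had,LO}} = \frac{\alpha^{2}}{3\pi^{2}}
\int_{m_\pi^{2}}^{\infty} \frac{\mathrm{d}s}{s}\, K(s)\, R(s), \qquad
R(s) = \frac{\sigma(e^{+}e^{-}\to \mathrm{hadrons})}
            {\sigma(e^{+}e^{-}\to \mu^{+}\mu^{-})},
```

where $K(s)$ is a known QED kernel that weights low energies most strongly. This weighting is why the pi+pi- channel near threshold dominates a_mu^had and why the 1-2 GeV region, where the four-pion channels dominate, now drives the remaining uncertainty.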

Relevance:

30.00%

Publisher:

Abstract:

This thesis is on the flavor problem of Randall-Sundrum models and their strongly coupled dual theories. These models are particularly well-motivated extensions of the Standard Model, because they simultaneously address the gauge hierarchy problem and the hierarchies in the quark masses and mixings. In order to put this into context, special attention is given to the concepts underlying the theories which can explain the hierarchy problem and the flavor structure of the Standard Model (SM). The AdS/CFT duality is introduced and its implications for the Randall-Sundrum model with fermions in the bulk and general bulk gauge groups are investigated. It will be shown that the different terms in the general 5D propagator of a bulk gauge field can be related to the corresponding diagrams of the strongly coupled dual, which allows for a deeper understanding of the origin of the flavor changing neutral currents generated by the exchange of the Kaluza-Klein excitations of these bulk fields. In the numerical analysis, different observables which are sensitive to corrections from the tree-level exchange of these resonances will be presented on the basis of updated experimental data from the Tevatron and LHC experiments. This includes electroweak precision observables, namely corrections to the S and T parameters, followed by corrections to the Zbb vertex; flavor changing observables with flavor changes at one vertex, viz. BR(Bd -> mu+mu-) and BR(Bs -> mu+mu-), and at two vertices, viz. S_psiphi and |eps_K|; as well as bounds from direct detection experiments. The analysis will show that all of these bounds can be brought into agreement with a new physics scale Lambda_NP in the TeV range, except for the CP violating quantity |eps_K|, which requires Lambda_NP = O(10) TeV in the absence of fine-tuning.
The numerous modifications of the Randall-Sundrum model in the literature which try to attenuate this bound are reviewed and categorized. Subsequently, a novel solution to this flavor problem, based on an extended color gauge group in the bulk, and its thorough implementation in the RS model will be presented, together with an analysis of the observables mentioned above in the extended model. This solution is especially motivated from the point of view of the strongly coupled dual theory, and the implications for strongly coupled models of new physics which do not possess a holographic dual are examined. Finally, the top quark plays a special role in models with a geometric explanation of flavor hierarchies, and the predictions in the Randall-Sundrum model, with and without the proposed extension, for the forward-backward asymmetry A_FB^t in top pair production are computed.

Relevance:

30.00%

Publisher:

Abstract:

When designing metaheuristic optimization methods, there is a trade-off between application range and effectiveness. For large real-world instances of combinatorial optimization problems, out-of-the-box metaheuristics often fail, and optimization methods need to be adapted to the problem at hand. Knowledge about the structure of high-quality solutions can be exploited by introducing a so-called bias into one of the components of the metaheuristic used; these problem-specific adaptations increase search performance. This thesis analyzes the characteristics of high-quality solutions for three constrained spanning tree problems: the optimal communication spanning tree problem, the quadratic minimum spanning tree problem and the bounded diameter minimum spanning tree problem. Several relevant tree properties that should be explored when analyzing a constrained spanning tree problem are identified. Based on the insights gained into the structure of high-quality solutions, efficient and robust solution approaches are designed for each of the three problems. Experimental studies analyze the performance of the developed approaches compared to the current state of the art.
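To illustrate what a "bias" in a metaheuristic component can look like, here is a hypothetical sketch of a randomized Prim-style spanning tree construction whose edge choice is biased toward short edges, a property high-quality solutions often share. The bias strength and the toy instance are illustrative and not taken from the thesis.

```python
# Hypothetical sketch of a biased solution construction for spanning
# tree problems: edges are sampled with probability ~ 1 / weight**bias,
# so short edges are favoured but diversity is preserved.
import random

def biased_spanning_tree(n, weight, bias=2.0, rng=random):
    """Grow a spanning tree of the complete graph on n nodes,
    sampling each new edge with probability proportional to
    1 / weight(u, v) ** bias."""
    in_tree, out = {0}, set(range(1, n))
    edges = []
    while out:
        candidates = [(u, v) for u in in_tree for v in out]
        weights = [1.0 / weight(u, v) ** bias for u, v in candidates]
        u, v = rng.choices(candidates, weights=weights, k=1)[0]
        edges.append((u, v))
        in_tree.add(v)
        out.remove(v)
    return edges

# Toy instance: nodes on a line, weight = distance; the bias favours
# edges between adjacent nodes.
tree = biased_spanning_tree(5, lambda u, v: abs(u - v))
print(len(tree))  # a spanning tree on 5 nodes has 4 edges
```

Setting `bias=0` recovers uniform random construction, while larger values push the sampler toward greedy behaviour; tuning this knob per problem is one concrete form of the problem-specific adaptation described above.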

Relevance:

30.00%

Publisher:

Abstract:

Protein adsorption occurs immediately following the implantation of biomaterials. It is unknown to what extent protein adsorption impacts cellular events at the bone-implant interface. To investigate this question, we compared the in-vitro outcome of osteoblastic cells grown on titanium substrates and on glass as a control, modulating the exposure to serum-derived proteins. Substrates consisted of 1) polished titanium disks; 2) polished disks nanotextured with H2SO4/H2O2; 3) glass. In the pre-adsorption phase, substrates were treated for 1 h with αMEM alone (M-noFBS) or supplemented with 10% foetal bovine serum (M-FBS). MC3T3 osteoblastic cells were cultured on the pre-treated substrates for 3 h and 24 h, in M-noFBS and M-FBS. Subsequently, the culture medium was replaced with M-FBS and cultures were maintained for 3 and 7 days. Cell number was evaluated by the Alamar Blue and MTT assays. Mitotic and osteogenic activities were evaluated by fluorescence optical microscopy through immunolabeling for the Ki-67 nuclear protein and Osteopontin. Cellular morphology was evaluated by SEM imaging. Data were statistically analyzed using ANOVA (p<0.05). At day 3 and day 7, the presence or absence of serum-derived proteins during the pre-adsorption phase had no significant effect on cell number. Only the absence of FBS during 24 h of culture significantly affected cell number (p<0.0001). Titanium surfaces performed better than glass (p<0.01). The growth rate of cells between days 3 and 7 was not affected by the initial absence of FBS. Immunolabeling for Ki-67 and Osteopontin showed that mitotic and osteogenic activity were ongoing at 72 h. SEM analysis revealed that the absence of FBS had no major influence on cell shape. • Physico-chemical interactions without mediation by proteins are sufficient to sustain the initial phase of culture and guide osteogenic cells toward differentiation.
• The challenge is avoiding the adsorption of ‘undesirable’ molecules that negatively impact the cues cells receive from the surface. This may not be a problem in healthy patients, but may play an important role in medically compromised individuals in whom the composition of tissue fluids is altered.

Relevance:

30.00%

Publisher:

Abstract:

This work is focused on the study of saltwater intrusion in coastal aquifers, and in particular on the development of conceptual schemes to evaluate the risk associated with it. Saltwater intrusion depends on different natural and anthropic factors, both exhibiting strongly random behaviour, which should be considered for an optimal management of the territory and water resources. Given the uncertainty of the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, key hydrogeological problem parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model reduction technique, based on Polynomial Chaos Expansion, able to accurately describe the model response without a great computational burden. When the assumptions of classical analytical models are not respected, as often happens in applications to real case studies, including the area analyzed in the present work, one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
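A classic example of the sharp-interface formulations alluded to above is the Ghyben-Herzberg approximation, given here as general background (the formulation actually adopted in the thesis may be more elaborate):

```latex
\xi = \frac{\rho_f}{\rho_s - \rho_f}\, h \;\approx\; 40\, h ,
```

where $h$ is the freshwater head above mean sea level, $\xi$ the depth of the freshwater-saltwater interface below sea level, and $\rho_f \approx 1000$ kg/m³ and $\rho_s \approx 1025$ kg/m³ the freshwater and saltwater densities. The factor of about 40 shows how strongly the interface position responds to small changes in head, which is why uncertainty in the hydrogeological parameters propagates so heavily to the salinization risk.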

Relevance:

30.00%

Publisher:

Abstract:

Classic group recommender systems focus on providing suggestions for a fixed group of people. Our work takes an inside look at designing a new recommender system that is capable of making suggestions for a sequence of activities, dividing people into subgroups in order to boost overall group satisfaction. However, this idea increases the problem complexity in more dimensions and poses a great challenge to the algorithm's performance. To assess the effectiveness, given the enhanced complexity of precise problem solving, we implemented an experimental system based on data collected from a variety of web services concerning the city of Paris. The system recommends activities to a group of users using two different approaches: Local Search and Constraint Programming. The general results show that the number of subgroups can significantly influence the Constraint Programming approach's computational time and efficacy. Generally, Local Search finds results much more quickly than Constraint Programming. Over a lengthy period of time, Local Search performs better than Constraint Programming, with similar final results.
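The local-search idea above can be sketched in a few lines: start from a trivial assignment of users to activity subgroups and greedily apply single-user moves while total satisfaction improves. The satisfaction table is a toy stand-in; the real system's objective couples users within a subgroup and across the activity sequence, which this sketch deliberately ignores.

```python
# Hypothetical sketch of local search for subgroup assignment:
# hill-climbing over single-user moves on a toy satisfaction table.
def total_satisfaction(assignment, satisfaction):
    return sum(satisfaction[user][group] for user, group in assignment.items())

def local_search(users, groups, satisfaction):
    assignment = {u: groups[0] for u in users}  # trivial starting point
    improved = True
    while improved:
        improved = False
        for u in users:
            for g in groups:
                if satisfaction[u][g] > satisfaction[u][assignment[u]]:
                    assignment[u] = g
                    improved = True
    return assignment

# Toy instance: 3 users, 2 activity subgroups, tabulated satisfaction.
sat = {"ann": {"museum": 5, "park": 2},
       "bob": {"museum": 1, "park": 4},
       "eve": {"museum": 3, "park": 3}}
best = local_search(list(sat), ["museum", "park"], sat)
print(best)  # each user ends up in a preferred subgroup
```

A Constraint Programming model would instead declare the assignment variables and the satisfaction objective and let a solver search exhaustively, which explains the trade-off reported above: guaranteed quality at higher computational cost.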

Relevance:

30.00%

Publisher:

Abstract:

Over the past twenty years, new technologies have required an increasing use of mathematical models in order to better understand structural behavior; the finite element method is the most widely used. However, the reliability of this method has to be verified for each situation to which it is applied. Since it is not possible to model reality completely, different hypotheses must be made: these are the main problems of FE modeling. The following work deals with this problem and tries to find a way to identify some of the unknown main parameters of a structure. This research focuses on a particular path of study and development, but the same concepts can be applied to other objects of research. The main purpose of this work is the identification of the unknown boundary conditions of a bridge pier using data acquired experimentally in field tests and an FEM modal updating process. This work does not claim to be new or innovative: much work has been done over the past years on this problem, and many solutions have been shown and published. This thesis simply reworks some of the main aspects of the structural optimization process, using a real structure as a fitting model.
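The model-updating loop described above can be caricatured with a single-degree-of-freedom stand-in for the FE model: tune an unknown boundary-stiffness parameter until the model's natural frequency matches the measured one. All numbers below are made up for illustration, and a real updating process would use an FE solver, several modes and a proper optimizer rather than a grid search.

```python
# Hypothetical sketch of FE modal updating: choose the stiffness k that
# minimizes the squared relative residual between the model's natural
# frequency and a measured frequency. A 1-DOF oscillator stands in for
# the FE model of the pier.
import math

def natural_frequency(k, m=1.0e4):
    """Natural frequency in Hz of a 1-DOF oscillator, k [N/m], m [kg]."""
    return math.sqrt(k / m) / (2.0 * math.pi)

def update_stiffness(f_measured, k_grid, m=1.0e4):
    """Grid search minimizing the squared relative frequency residual."""
    return min(k_grid,
               key=lambda k: ((natural_frequency(k, m) - f_measured)
                              / f_measured) ** 2)

f_meas = 2.5  # Hz, stand-in for a frequency identified from field tests
k_grid = [i * 1.0e5 for i in range(1, 100)]
k_best = update_stiffness(f_meas, k_grid)
print(k_best, natural_frequency(k_best))
```

The same structure (forward model, residual between measured and computed modal quantities, parameter search) carries over directly to the FEM modal updating of boundary conditions.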

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an automated solution for the precise detection of fiducial screws in three-dimensional (3D) Computerized Tomography (CT)/Digital Volume Tomography (DVT) data for image-guided ENT surgery. Unlike previously published solutions, we regard the detection of the fiducial screws in the CT/DVT volume data as a pose estimation problem and have thus developed a model-based solution. Starting from a user-supplied initialization, our solution detects the fiducial screws by iteratively matching a computer-aided design (CAD) model of the fiducial screw to features extracted from the CT/DVT data. We validated our solution on one conventional CT dataset and on five DVT volume datasets, for a total of 24 detected fiducial screws. Our experimental results indicate that the proposed solution achieves much higher reproducibility and precision than manual detection. A further comparison shows that the proposed solution produces better results on the DVT datasets than on the conventional CT dataset.

Relevance:

30.00%

Publisher:

Abstract:

The acquired enamel pellicle that forms on the tooth surface serves as a natural protective barrier against dental erosion. Numerous proteins composing the pellicle serve different functions within this thin layer. Our study examined the effect of incorporated mucin and casein on the erosion-inhibiting potential of the acquired enamel pellicle. Cyclic acidic conditions were applied to mimic the erosive environment present at the human enamel interface during the consumption of soft drinks. One hundred enamel specimens were prepared for microhardness tests and distributed randomly into 5 groups (n = 20) that received the following treatments: deionized water, humidity chamber, mucin, casein, or a combination of mucin and casein. Each group was exposed to 3 cycles of a 2-hour incubation in human saliva, followed by a 2-hour treatment in the testing solution and a 1-minute exposure to citric acid. The microhardness analysis demonstrated that the mixture of casein and mucin significantly improved the erosion-inhibiting properties of the human pellicle layer. The addition of the individual proteins did not have a statistically significant impact on the function of the pellicle. These data suggest that protein-protein interactions may play an important role in the effectiveness of the pellicle in preventing erosion.