850 results for High-dimensional data visualization


Relevance: 100.00%

Abstract:

Alongside its mass, the lifetime is one of the principal characteristics of a particle. The mean lifetime of the Xi0 hyperon, which can be predicted theoretically from the mean lifetime of the Xi- hyperon via the Delta I = 1/2 rule, has been determined experimentally several times. However, the most recent measurement, from 1977, has a relative uncertainty of 5%, which can be improved considerably with data from newer experiments. The mean lifetime is an important parameter in the determination of the matrix element Vus of the Cabibbo-Kobayashi-Maskawa matrix in semileptonic Xi0 decays. In 2002, a high-intensity run was carried out with the NA48 detector, during which, among other data, about 10^9 Xi0 decay candidates were recorded. Of these, 192,000 events of the type "Xi0 → Lambda pi0" were reconstructed in this work, and 107,000 events were used to determine the mean lifetime by comparison with simulated events. To avoid systematic errors, the lifetime was determined in ten energy intervals by comparing measured and simulated data. The result is considerably more precise than previous measurements and deviates from the literature value (tau = (2.90 ± 0.09)×10^(-10) s) by (+4.99 ± 0.50(stat) ± 0.58(syst))%, corresponding to 1.7 standard deviations. The lifetime is found to be tau = (3.045 ± 0.015(stat) ± 0.017(syst))×10^(-10) s. In the same way, the available data allowed the first measurement of the lifetime of the anti-Xi0 hyperon, with the result tau = (3.042 ± 0.045(stat) ± 0.017(syst))×10^(-10) s.
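
How the bin-wise comparison was implemented is not given in the abstract. Purely as an illustration of the idea of estimating a lifetime per energy bin and then combining, here is a minimal sketch with toy exponential samples standing in for the real, acceptance-corrected NA48 data (the actual analysis compares data with full detector simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for measured proper decay times (units of 1e-10 s),
# split into ten energy bins.
TRUE_TAU = 3.045
bins = [rng.exponential(TRUE_TAU, size=10_000) for _ in range(10)]

# Maximum-likelihood lifetime per bin: tau_hat = mean(t),
# with statistical error tau_hat / sqrt(N).
taus = np.array([t.mean() for t in bins])
errs = taus / np.sqrt([t.size for t in bins])

# Inverse-variance-weighted combination of the ten bins.
w = 1.0 / errs**2
tau_comb = np.sum(w * taus) / np.sum(w)
tau_err = 1.0 / np.sqrt(np.sum(w))
print(f"combined tau = ({tau_comb:.3f} +- {tau_err:.3f}) x 1e-10 s")
```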

Relevance: 100.00%

Abstract:

This thesis is devoted to the study of the properties of high-redshift galaxies in the epoch 1 < z < 3, when a substantial fraction of galaxy mass was assembled and the evolution of the star-formation rate density peaked. Following a multi-perspective approach and using the most recent, high-quality data available (spectra, photometry and imaging), the morphologies and star-formation properties of high-redshift galaxies were investigated. Through an accurate morphological analysis, the build-up of the Hubble sequence was placed around z ~ 2.5. High-redshift galaxies appear, in general, much more irregular and asymmetric than local ones. Moreover, the morphological k-correction is less pronounced than in the local Universe. Different star-formation rate indicators were also studied. Comparing ultraviolet- and optical-based estimates with values derived from the infrared luminosity showed that the traditional way of addressing dust obscuration is problematic at high redshift, and that new models of dust geometry and composition are required. Finally, by means of stacking techniques applied to rest-frame ultraviolet spectra of star-forming galaxies at z ~ 2, the warm phase of galactic-scale outflows was studied. Evidence was found of escaping gas at velocities of ~100 km/s. By studying the correlation of interstellar absorption-line equivalent widths with galaxy physical properties, the intensity of the outflow-related spectral features was shown to depend strongly on a combination of the velocity dispersion of the gas and its geometry.
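
The stacking recipe itself is not spelled out in the abstract. A minimal sketch of the usual approach, assuming idealized inputs (observed-frame wavelength/flux arrays plus spectroscopic redshifts); real spectra would also need continuum fitting and masking:

```python
import numpy as np

def stack_rest_frame(wavelengths, fluxes, redshifts, grid):
    """Shift observed spectra to the rest frame and median-stack them on a
    common rest-frame wavelength grid (Angstrom)."""
    resampled = []
    for wl, fl, z in zip(wavelengths, fluxes, redshifts):
        rest_wl = wl / (1.0 + z)               # observed -> rest frame
        fl_norm = fl / np.median(fl)           # crude continuum normalization
        resampled.append(np.interp(grid, rest_wl, fl_norm))
    return np.median(resampled, axis=0)        # median suppresses outliers

# An outflow appears as a blueshifted absorption centroid in the stack:
# v ~ c * (lambda_centroid - lambda_rest) / lambda_rest, with c ~ 3e5 km/s.
```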

Relevance: 100.00%

Abstract:

Different types of proteins exist, with diverse functions that are essential for living organisms. An important class is represented by transmembrane proteins, which are specifically designed to be inserted into biological membranes and perform very important cellular functions such as cell communication and active transport across the membrane. Transmembrane β-barrels (TMBBs) are a sub-class of membrane proteins largely under-represented in structure databases because of the extreme difficulty of experimental structure determination. For this reason, computational tools able to predict the structure of TMBBs are needed. In this thesis, two computational problems related to TMBBs were addressed: the detection of TMBBs in large protein datasets and the prediction of the topology of TMBB proteins. First, a method for TMBB detection was presented, based on a novel neural network framework for variable-length sequence classification. The proposed approach was validated on a non-redundant dataset of proteins, and genome-wide detection was carried out on the entire Escherichia coli proteome. In both experiments the method significantly outperformed existing state-of-the-art approaches, reaching a very high PPV (92%) and MCC (0.82). Second, a method was introduced for TMBB topology prediction, based on grammatical modelling and probabilistic discriminative models for sequence labeling. The method was evaluated on a newly generated dataset of 38 TMBB proteins obtained from high-resolution structures in the PDB, and correctly predicted the topologies of 25 out of 38 protein chains. When tested on previously released datasets, its performance was comparable or superior to the current state of the art in TMBB topology prediction.
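
PPV and MCC are standard confusion-matrix scores; as a small reference sketch, with counts invented to land near the reported values:

```python
import math

def ppv_mcc(tp, fp, tn, fn):
    """Positive predictive value and Matthews correlation coefficient
    from binary confusion-matrix counts."""
    ppv = tp / (tp + fp)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return ppv, mcc

# Illustrative counts only: PPV = 92/100 = 0.92, MCC close to the
# reported 0.82.
print(ppv_mcc(tp=92, fp=8, tn=880, fn=20))
```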

Relevance: 100.00%

Abstract:

This thesis consists of three self-contained papers. In the first paper I analyze the labor-supply behavior of Bologna pizza delivery vendors. Recent influential papers analyze the labor-supply behavior of taxi drivers (Camerer et al., 1997; Crawford and Meng, 2011) and suggest that reference-dependent preferences have an important influence on drivers' labor-supply decisions. Unlike previous papers, I am able to identify an exogenous and transitory change in labor demand. Using high-frequency data on orders, and rainfall as an exogenous demand shifter, I invariably find that reference-dependent preferences play no role in the vendors' labor-supply decisions: the behavior of pizza vendors is fully consistent with the predictions of the standard model of labor supply. In the second paper, I investigate how the voting behavior of Members of Parliament is influenced by the Members seated nearby. Exploiting the random seating arrangements in the Icelandic Parliament, I show that being seated next to Members of a different party increases the probability of not being aligned with one's own party. Using the exact spatial orientation of the peers, I provide evidence supporting the hypothesis that interaction is the main channel explaining these results. In the third paper, I estimate the trade flows that would have occurred between the UK and Europe if the UK had joined the Euro. As an alternative to the standard log-linear gravity equation, I employ the synthetic control method and show that aggregate trade flows between Britain and Europe would have been 13% higher had the UK adopted the Euro.
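
The synthetic control step can be sketched as constrained least squares: choose nonnegative donor weights summing to one that best reproduce the UK's pre-treatment series, then use the weighted donors as the counterfactual. A toy version with invented data (not the paper's country panel):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Pre-treatment outcome series: treated unit (UK) and J donor countries.
T0, J = 40, 8
donors = rng.normal(size=(T0, J)).cumsum(axis=0)
uk = donors @ np.array([0.5, 0.3, 0.2, 0, 0, 0, 0, 0]) \
     + rng.normal(scale=0.1, size=T0)

def pretreatment_gap(w):
    # Squared fit error between the treated series and the weighted donors.
    return np.sum((uk - donors @ w) ** 2)

res = minimize(pretreatment_gap, np.full(J, 1 / J),
               bounds=[(0.0, 1.0)] * J,
               constraints=({"type": "eq", "fun": lambda w: w.sum() - 1.0},),
               method="SLSQP")
w = res.x  # synthetic-UK weights over the donor pool
# Post-treatment, the counterfactual is donors_post @ w; the estimated
# effect is the gap between actual UK trade and this synthetic series.
```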

Relevance: 100.00%

Abstract:

The behaviour of a polymer depends strongly on the length and time scale, as well as on the temperature, at which it is probed. In this work, I describe investigations of polymer surfaces using scanning probe microscopy with heatable probes. With these probes, surfaces can be heated on timescales from seconds down to microseconds. I introduce experiments for the local and fast determination of glass transition and melting temperatures, and I developed a method which allows their determination on films with thicknesses below 100 nm: a background measurement on the substrate was performed, and the resulting curve was subtracted from the measurement on the polymer film. This differential measurement on polystyrene films with thicknesses between 35 nm and 160 nm showed characteristic signals at 95 ± 1 °C, in accordance with the glass transition of polystyrene. Pressing heated probes into polymer films causes plastic deformation. Nanometer-sized deformations are currently investigated in novel concepts for high-density data storage. A suitable medium for such a storage system has to be easily indentable on the one hand, but on the other hand it also has to be very stable towards surface-induced wear. For developing such a medium I investigated a new approach: a comparably soft material, namely polystyrene, was protected with a thin but very hard layer made of plasma-polymerized norbornene. The resulting bilayered media were tested for surface stability and deformability. I showed that the bilayered material combines the deformability of polystyrene with the surface stability of the plasma polymer, and that the material is therefore a very good storage medium. In addition, we investigated the glass transition temperature of polystyrene at timescales of 10 µs and found it to be approximately 220 °C. The increase of this characteristic temperature results from the short time at which the polymer was probed and reflects the well-known time-temperature superposition principle. Heatable probes were also used for the characterization of silver-azide-filled nanocapsules: they allowed the decomposition temperature of the capsules to be determined from a few nanograms of material. The measured decomposition temperatures ranged from 180 °C to 225 °C, in accordance with literature values. The investigation of small amounts of sample was necessary due to the limited availability of the material; furthermore, investigating larger amounts of the capsules using conventional thermogravimetric analysis could lead to contamination or even damage of the instrument. Besides the analysis of material parameters, I used the heatable probes for the local thermal decomposition of a pentacene precursor material in order to form nanoscale conductive structures; here, the thickness of the precursor layer was important for complete thermal decomposition. Another aspect of my work was the investigation of a redox-active polymer, poly-10-(4-vinylbenzyl)-10H-phenothiazine (PVBPT), for data storage. Data is stored by changing the local conductivity of the material through a voltage applied between tip and surface. The generated structures were stable for more than 16 h, and it was shown that the presence of water is essential for successful patterning.
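
The differential method described above amounts to interpolating the substrate curve onto the film measurement's temperature axis and subtracting. A minimal numpy sketch, assuming both measurements arrive as temperature/signal arrays:

```python
import numpy as np

def differential_curve(T_bg, sig_bg, T_film, sig_film):
    """Subtract the background (bare substrate) measurement from the film
    measurement after interpolating onto the film's temperature axis."""
    return sig_film - np.interp(T_film, T_bg, sig_bg)

def transition_temperature(T_film, diff_signal):
    """Locate the characteristic feature (e.g. the glass transition of
    polystyrene near 95 C) as the extremum of the differential signal."""
    return T_film[np.argmax(np.abs(diff_signal))]
```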

Relevance: 100.00%

Abstract:

One of the most important challenges in chemistry and materials science is the connection between the composition of a compound and its chemical and physical properties. In solids, these are greatly influenced by the crystal structure.

The prediction of hitherto unknown crystal structures with regard to external conditions like pressure and temperature is therefore one of the most important goals in theoretical chemistry. The stable structure of a compound is the global minimum of the potential energy surface, the high-dimensional representation of the enthalpy of the investigated system with respect to its structural parameters. Because the complexity of the problem grows exponentially with system size, it can only be tackled via heuristic strategies.

Improvements to the artificial bee colony method, in which the local exploration of the potential energy surface is carried out by a large number of independent walkers, are developed and implemented. The result is an improved communication scheme between these walkers, which directs the search towards the most promising areas of the potential energy surface.

The minima hopping method uses short molecular dynamics simulations at elevated temperatures to direct the structure search from one local minimum of the potential energy surface to the next. A modification is developed and implemented in which the local information around each minimum is extracted and used to optimize the search direction. Our method uses this local information to increase the probability of finding new, lower local minima, which leads to enhanced performance of the global optimization algorithm.

Hydrogen is a highly relevant system, owing to the possibility of finding a metallic phase and even a superconductor with a high critical temperature. Applying a structure prediction method to SiH12 yields stable crystal structures in this material, which additionally becomes metallic at relatively low pressures.
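
The thesis modifications are not reproduced here; as a toy illustration of the basic minima hopping loop (random kicks standing in for the short MD runs, on a 1D test potential rather than a crystal-structure landscape):

```python
import numpy as np
from scipy.optimize import minimize

def potential(x):
    # Rugged 1D test surface standing in for a real enthalpy landscape.
    x = float(np.asarray(x).ravel()[0])
    return 0.05 * x**2 + np.sin(3.0 * x)

def minima_hopping(x0, n_hops=50, kick=1.0, seed=0):
    """Toy minima hopping: escape the current minimum with a random kick
    (standing in for a short high-temperature MD run), then relax locally.
    The kick adapts: it grows when a hop falls back into a known minimum
    and shrinks when a new one is found."""
    rng = np.random.default_rng(seed)
    x = minimize(potential, x0).x
    visited = {round(float(x[0]), 3)}
    best = x
    for _ in range(n_hops):
        trial = minimize(potential, x + rng.normal(scale=kick)).x
        key = round(float(trial[0]), 3)
        if key in visited:
            kick *= 1.1           # known minimum: push harder next time
        else:
            visited.add(key)
            kick *= 0.9           # new minimum: explore more gently
        if potential(trial) < potential(best):
            best = trial          # keep the lowest minimum found so far
        x = trial                 # continue the walk from the new minimum
    return best, sorted(visited)

best, minima = minima_hopping(np.array([2.0]))
```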

Relevance: 100.00%

Abstract:

The aim of this thesis is the design and development of several modules of a software system for high-throughput data acquisition from electrophysiology devices produced by the company Elements s.r.l. Elements makes high-precision amplifiers for electrophysiology, capable of measuring the low-intensity currents produced by ion channels. Given the company's rapid growth, and its plan to bring to market new devices with ever better precision and functionality, Elements expressed the need for a software system able to fully support the devices already in production and, above all, the upcoming ones, with much better performance than the data-reading software it had previously developed. The required software must provide a graphical interface that communicates with the device over USB to read its data, displays the data on screen, and allows the user to record them and perform basic analysis operations. This thesis presents the analysis, design and development of the modules that interface directly with the device: the device-detection, connection, acquisition and data-processing modules.
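
The module architecture is only outlined in the abstract. A common pattern for this kind of high-throughput reader is a producer thread that drains the device into a bounded queue while a consumer decodes and displays; a minimal, device-agnostic sketch in which read_chunk and process are hypothetical stand-ins for the real USB driver call and the real processing chain:

```python
import queue
import threading
import time

def read_chunk() -> bytes:
    """Hypothetical stand-in for the USB bulk read of one block of samples."""
    return b"\x00" * 4096

def process(chunk: bytes) -> None:
    """Placeholder for decoding, plotting and recording."""
    pass

stop = threading.Event()
buf: queue.Queue[bytes] = queue.Queue(maxsize=256)  # bounded: back-pressure

def producer() -> None:
    # Drain the device as fast as possible; never block on processing.
    while not stop.is_set():
        buf.put(read_chunk())

def consumer() -> None:
    # Decode/display/record at its own pace, independent of the reader.
    while not stop.is_set() or not buf.empty():
        try:
            process(buf.get(timeout=0.1))
        except queue.Empty:
            pass

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
time.sleep(1.0)   # acquire for one second, then shut down
stop.set()
for t in threads:
    t.join()
```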

Relevance: 100.00%

Abstract:

In this work we study a polyenergetic and multimaterial model for breast image reconstruction in Digital Tomosynthesis, taking into consideration the variety of materials forming the object and the polyenergetic nature of the X-ray beam. Modelling the problem leads to a high-dimensional nonlinear least-squares problem that, being an ill-posed inverse problem, requires some form of regularization. We test two main classes of methods: the Levenberg-Marquardt method (together with the Conjugate Gradient method for the computation of the descent direction) and two limited-memory BFGS-like methods (L-BFGS). We perform experiments for different values of the regularization parameter (constant or varying at each iteration), tolerances and stopping conditions. Finally, we analyse the performance of the various methods by comparing relative errors, iteration counts, runtimes and the quality of the reconstructed images.
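
As a compact illustration of the two solver families on a regularized nonlinear least-squares problem (a toy forward model, not the tomosynthesis operator):

```python
import numpy as np
from scipy.optimize import least_squares, minimize

rng = np.random.default_rng(2)

# Toy nonlinear forward model standing in for the polyenergetic projector:
# data d = f(x_true) + noise; reconstruct x with Tikhonov regularization.
A = rng.normal(size=(200, 50))
x_true = rng.normal(size=50)
f = lambda x: np.exp(-0.1 * (A @ x))
d = f(x_true) + rng.normal(scale=1e-3, size=200)
lam = 1e-2  # regularization parameter (constant here; could vary per iteration)

def residuals(x):
    # Stacked residual so that ||r(x)||^2 = ||f(x) - d||^2 + lam * ||x||^2.
    return np.concatenate([f(x) - d, np.sqrt(lam) * x])

# Levenberg-Marquardt on the stacked residual ...
x_lm = least_squares(residuals, np.zeros(50), method="lm").x
# ... versus a limited-memory BFGS method on the equivalent scalar objective.
objective = lambda x: 0.5 * np.sum(residuals(x) ** 2)
x_lbfgs = minimize(objective, np.zeros(50), method="L-BFGS-B").x
```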

Relevance: 100.00%

Abstract:

Design and implementation of the visualization, storage and analysis modules of a software system for real-time data acquisition from devices produced by Elements s.r.l. The thesis covers all phases of analysis, design, implementation and testing of the modules developed.

Relevance: 100.00%

Abstract:

Background: Abstractor training is a key element in creating valid and reliable data collection procedures. The choice between in-person vs. remote, or simultaneous vs. sequential, abstractor training has considerable consequences for time and resource utilization. We conducted a web-based (webinar) abstractor training session to standardize training across six individual Cancer Research Network (CRN) sites for a study of breast cancer treatment effects in older women (BOWII). The goals of this manuscript are to describe the training session, its participants, and the participants' evaluation of webinar technology for abstraction training.

Findings: A webinar was held for all six sites with the primary purpose of simultaneously training staff and ensuring consistent abstraction across sites. The training session involved sequential review of over 600 data elements outlined in the coding manual, in conjunction with the display of data entry fields in the study's electronic data collection system. Post-training evaluation was conducted via SurveyMonkey. Inter-rater reliability measures for abstractors within each site were conducted three months after the commencement of data collection. Ten of the 16 people who participated in the training completed the online survey. Almost all (90%) of the 10 trainees had previous medical record abstraction experience, and nearly two-thirds reported over 10 years of experience. Half of the respondents had previously participated in a webinar, of whom three had done so for training purposes. All rated the knowledge and information delivered through the webinar as useful and reported that it adequately prepared them for data collection. Moreover, all participants would recommend this platform for multi-site abstraction training. Consistent with participant-reported training effectiveness, inter-rater agreement within sites ranged from 89% to 98%, with a weighted average of 95% agreement across sites.

Conclusions: Conducting training via web-based technology was an acceptable and effective approach to standardizing medical record review across multiple sites for this group of experienced abstractors. Given the substantial time and cost savings achieved with the webinar, coupled with participants' positive evaluation of the training session, researchers should consider this instructional method as part of training efforts to ensure high-quality data collection in multi-site studies.
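
The weighted average quoted above is a simple calculation; a sketch with invented per-site counts (only the 89-98% range and the weighting idea come from the text):

```python
# Percent agreement per site and number of double-abstracted records per
# site; the counts are illustrative placeholders.
agreement = [0.89, 0.93, 0.95, 0.96, 0.97, 0.98]
n_records = [40, 60, 55, 70, 45, 80]

weighted_avg = sum(a * n for a, n in zip(agreement, n_records)) / sum(n_records)
print(f"weighted average agreement: {weighted_avg:.0%}")  # -> 95%
```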

Relevance: 100.00%

Abstract:

The immune system exhibits enormous complexity. High-throughput methods such as the "-omics" technologies generate vast amounts of data that facilitate dissection of immunological processes at ever finer resolution. Using high-resolution data-driven systems analysis, causal relationships between complex molecular processes and particular immunological phenotypes can be constructed. However, processes in tissues, organs, and the organism itself (so-called higher-level processes) also control and regulate the molecular (lower-level) processes. Reverse systems engineering approaches, which focus on the examination of the structure, dynamics and control of the immune system, can help to understand its construction principles. Such integrative mechanistic models can properly describe, explain, and predict the behavior of the immune system in health and disease by combining both higher- and lower-level processes. Moving from molecular and cellular levels to a multiscale systems understanding requires the development of methodologies that integrate data from different biological levels into multiscale mechanistic models. In particular, 3D imaging techniques and 4D modeling of the spatiotemporal dynamics of immune processes within lymphoid tissues are central to such integrative approaches. Both dynamic and global organ imaging technologies will be instrumental in facilitating comprehensive multiscale systems immunology analyses, as discussed in this review.

Relevance: 100.00%

Abstract:

Early and Mid-Pleistocene climate, ocean hydrography and ice-sheet dynamics have been reconstructed using a high-resolution data set (planktonic and benthic δ18O time series, faunal-based sea surface temperature (SST) reconstructions, and an ice-rafted debris (IRD) record) from a high-deposition-rate sedimentary succession recovered at the Gardar Drift formation in the subpolar North Atlantic (Integrated Ocean Drilling Program Leg 306, Site U1314). Our sedimentary record spans from late Marine Isotope Stage (MIS) 31 to MIS 19 (1069–779 ka). The different trends of the benthic and planktonic oxygen isotope, SST and IRD records before and after MIS 25 (~940 ka) evidence the large increase in Northern Hemisphere ice volume linked to the change from 41-kyr to 100-kyr cyclicity that occurred during the Mid-Pleistocene Transition (MPT). Besides the longer glacial-interglacial (G-IG) variability, millennial-scale fluctuations are a pervasive feature throughout our record. Negative excursions in the benthic δ18O time series observed at the times of IRD events may be related to glacio-eustatic changes due to ice-sheet retreats and/or to changes in deep hydrography. Time series analysis of the surface-water proxies (IRD, SST and planktonic δ18O) for the interval between MIS 31 and MIS 26 shows that the timing of these millennial-scale climate changes is related to half-precessional (10-kyr) components of the insolation forcing, interpreted as cross-equatorial heat transport toward high latitudes during both equinox insolation maxima at the equator.
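
The spectral method is not named in the abstract; for unevenly sampled sediment-core series, a Lomb-Scargle periodogram is a standard way to pick out components such as the half-precessional 10-kyr band. A toy sketch with a synthetic proxy series:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)

# Unevenly sampled toy proxy series containing a 10-kyr cycle.
t = np.sort(rng.uniform(0.0, 300.0, 400))          # age, kyr
y = np.sin(2 * np.pi * t / 10.0) + rng.normal(scale=0.5, size=t.size)

periods = np.linspace(5.0, 50.0, 500)              # trial periods, kyr
ang_freqs = 2 * np.pi / periods                    # lombscargle wants rad/kyr
power = lombscargle(t, y - y.mean(), ang_freqs, normalize=True)
print(f"peak at ~{periods[np.argmax(power)]:.1f} kyr")
```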

Relevance: 100.00%

Abstract:

One of the major challenges for a mission to the Jovian system is the radiation tolerance of the spacecraft and the payload. Moreover, achieving science observations with high signal-to-noise ratios (SNR) while passing through the high-flux radiation zones requires additional ingenuity on the part of the instrument provider. Consequently, radiation mitigation is closely intertwined with the payload, spacecraft and trajectory design, and requires a systems-level approach. This paper presents a design for the Io Volcano Observer (IVO), a Discovery mission concept that makes multiple close encounters with Io while orbiting Jupiter. The mission aims to answer key outstanding questions about Io, especially the nature of its intense active volcanism and the internal processes that drive it. The payload includes narrow-angle and wide-angle cameras (NAC and WAC), dual fluxgate magnetometers (FGM), a thermal mapper (ThM), dual ion and neutral mass spectrometers (INMS), and dual plasma ion analyzers (PIA). The radiation mitigation draws upon experience from designs and studies for missions such as the Radiation Belt Storm Probes (RBSP) and Jupiter Europa Orbiter (JEO). At the core of the radiation mitigation is IVO's inclined and highly elliptical orbit, which leads to rapid passes through the most intense radiation near Io, minimizing the total ionizing dose (177 krad behind 100 mils of aluminum with a radiation design margin (RDM) of 2 after 7 encounters). The payload and the spacecraft are designed specifically to accommodate the fast flyby velocities (e.g. the spacecraft is radioisotope powered, remaining small and agile without any flexible appendages). The science instruments, which collect the majority of the high-priority data when close to Io and thus near the peak flux, also have to mitigate transient noise in their detectors. The cameras use a combination of shielding and CMOS detectors with extremely fast readout to minimize noise. The INMS microchannel plate detectors and PIA channel electron multipliers require additional shielding. The FGM is not sensitive to noise induced by energetic particles, and the ThM microbolometer detector is nearly insensitive. Detailed SNR calculations are presented. To facilitate targeting agility, all of the spacecraft components are shielded separately, since this approach is more mass-efficient than using a radiation vault. IVO uses proven radiation-hardened parts (rated at 100 krad behind equivalent shielding of 280 mils of aluminum with an RDM of 2) and is expected to have ample mass margin to increase shielding if needed.

Relevance: 100.00%

Abstract:

Generalized linear mixed models (GLMM) are generalized linear models with normally distributed random effects in the linear predictor. Penalized quasi-likelihood (PQL), an approximate method of inference in GLMMs, involves repeated fitting of linear mixed models with “working” dependent variables and iterative weights that depend on parameter estimates from the previous cycle of iteration. The generality of PQL, and its implementation in commercially available software, has encouraged the application of GLMMs in many scientific fields. Caution is needed, however, since PQL may sometimes yield badly biased estimates of variance components, especially with binary outcomes. Recent developments in numerical integration, including adaptive Gaussian quadrature, higher order Laplace expansions, stochastic integration and Markov chain Monte Carlo (MCMC) algorithms, provide attractive alternatives to PQL for approximate likelihood inference in GLMMs. Analyses of some well known datasets, and simulations based on these analyses, suggest that PQL still performs remarkably well in comparison with more elaborate procedures in many practical situations. Adaptive Gaussian quadrature is a viable alternative for nested designs where the numerical integration is limited to a small number of dimensions. Higher order Laplace approximations hold the promise of accurate inference more generally. MCMC is likely the method of choice for the most complex problems that involve high dimensional integrals.
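
For reference, the model class and the PQL working response can be written compactly (standard textbook notation, not taken from this paper):

```latex
% Conditional on the random effects b, the outcomes y follow an
% exponential-family distribution whose mean mu = E[y | b] satisfies
\[
  g(\mu) = X\beta + Zb, \qquad b \sim \mathcal{N}(0, D).
\]
% PQL repeatedly fits a linear mixed model to the working variable
\[
  z = X\beta + Zb + (y - \mu)\, g'(\mu),
\]
% with iterative weights
\[
  W^{-1} = \operatorname{diag}\bigl\{ g'(\mu)^{2} \, \operatorname{Var}(y \mid b) \bigr\},
\]
% recomputed from the parameter estimates of the previous iteration.
```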