853 results for high dimensional data, call detail records (CDR), wireless telecommunication industry


Relevance:

100.00%

Publisher:

Abstract:

The objective of this paper is to verify and analyze the existence in Brazil of the stylized facts observed in financial time series: volatility clustering, probability distributions with fat tails, long-range memory in absolute return series, absence of linear return autocorrelation, gain/loss asymmetry, aggregational Gaussianity, slow decay of the absolute return autocorrelation, trading volume/volatility correlation and the leverage effect. We analyzed intraday prices for 10 stocks traded at BM&FBovespa, responsible for 52.1% of the Ibovespa portfolio on Sept. 01, 2009. The data analysis confirms the stylized facts, whose behavior is consistent with what is observed in international markets.
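
Two of the listed facts can be checked numerically with little code: linear returns should be close to uncorrelated while absolute returns stay correlated, and the return distribution should show positive excess kurtosis (fat tails). The sketch below uses a simulated GARCH(1,1) series purely as a stand-in for the intraday BM&FBovespa returns, which are not reproduced here:

```python
import numpy as np

def autocorr(x, lag):
    """Sample autocorrelation of series x at a given lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

def excess_kurtosis(x):
    """Excess kurtosis; > 0 indicates fatter tails than a Gaussian."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return float(np.mean(z**4) - 3.0)

# Synthetic returns with volatility clustering (a GARCH(1,1) process),
# used here only as a stand-in for a real intraday return series.
rng = np.random.default_rng(0)
n = 20000
r = np.empty(n)
sigma2 = 1.0
for t in range(n):
    r[t] = np.sqrt(sigma2) * rng.standard_normal()
    sigma2 = 0.05 + 0.1 * r[t]**2 + 0.85 * sigma2  # GARCH(1,1) recursion

# Linear returns are nearly uncorrelated, |r| is not, and tails are fat.
print(autocorr(r, 1), autocorr(np.abs(r), 1), excess_kurtosis(r))
```

The same three diagnostics, applied to actual return series, are the usual first step in confirming volatility clustering and fat tails.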

Relevance:

100.00%

Publisher:

Abstract:

Thermal treatment (thermal rectification) is a process in which the technological properties of wood are modified using thermal energy, the result of which is often value-added wood. Thermally treated wood takes on color shades similar to tropical woods and offers considerable resistance to destructive microorganisms and climate action, in addition to having high dimensional stability and low hygroscopicity. Wood samples of Eucalyptus grandis were subjected to various thermal treatments, performed in the presence (140 °C; 160 °C; 180 °C) or absence of oxygen (160 °C; 180 °C; 200 °C) inside a thermal treatment chamber, and then studied as to their chemical characteristics. Increasing the maximum treatment temperature led to a reduction in the holocellulose content of the samples as a result of the degradation and volatilization of hemicelluloses, also leading to an increase in the relative lignin content. Except for glucose, all monosaccharide levels were found to decrease in samples after the thermal treatment at a maximum temperature of 200 °C. Thermal treatment above 160 °C led to increased levels of total extractives in the wood samples, probably ascribable to the emergence of low-molecular-weight substances as a result of thermal degradation. Overall, it was not possible to clearly determine the effect of the presence or absence of oxygen during thermal treatment on the chemical characteristics of the wood samples.

Relevance:

100.00%

Publisher:

Abstract:

This article investigates the effect of product market liberalisation on employment allowing for interactions between policies and institutions in product and labour markets. Using panel data for OECD countries over the period 1980–2002, we present evidence that product market deregulation is more effective at the margin when labour market regulation is high. The data also suggest that product market liberalisation may promote employment-enhancing labour market reforms.

Relevance:

100.00%

Publisher:

Abstract:

Background: Low birth weight affects child growth and development, requiring the intensive use of health services. Inversely proportional associations between prematurity and academic performance have been reported around the world. In this study we evaluated factors involved in the weight and neuropsychomotor profile of one- and two-year-old children discharged from Intensive Care Units (ICU). Methods/Design: We investigated 203 children from the ICU who were followed for 24 +/- 4 months. The research was conducted by collecting data from the medical records of patients in a follow-up program. We investigated the following variables: inadequate weight at one year of age, inadequate weight at two years of age and severe neurological disorder at two years of age. Results: From one to two years of age, we observed an increase of almost 20% in the proportion of children whose weight was between the 10th and 90th percentiles and a decrease of around 40% in the proportion of children below the 15th percentile. In almost 60% of the cases neuropsychomotor development was normal at 2 years of age, and less than 15% of children presented abnormal development. The variables that remained influential for the clinical outcome at 1 and 2 years of age were related to birth weight and gestational age, except for hypoglycemia. The neurological examination was the most influential variable for severe neurological disturbance. Conclusion: Hypoglycemia emerged as a new factor explaining inadequate weight. The results, the first of their kind in Brazil and therefore difficult to compare, could be used to identify risk factors and to improve the care of newborns discharged from ICUs.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to evaluate the immunoexpression of MMP-2, MMP-9 and CD31/microvascular density in squamous cell carcinomas of the floor of the mouth and to correlate the results with demographic, survival, clinical (TNM staging) and histopathological variables (tumor grade, perineural invasion, embolization and bone invasion). Data from the medical records and diagnoses of 41 patients were reviewed. Histological sections were subjected to immunostaining using primary antibodies against human MMP-2, MMP-9 and CD31 and the streptavidin-biotin-immunoperoxidase system. Histomorphometric analyses quantified positivity for MMPs (20 fields per slide, 100-point grid, ×200) and for CD31 (microvessels <50 µm in the area of highest vascularization, 5 fields per slide, 100-point grid, ×400). The statistical design comprised the non-parametric Mann-Whitney U test (investigating the association between numerical variables and immunostaining), the chi-square frequency test (in contingency tables), Fisher's exact test (when at least one expected frequency was less than 5 in 2×2 tables), the Kaplan-Meier method (estimated probabilities of overall survival) and the log-rank test (comparison of survival curves), all with a significance level of 5%. There was a statistically significant correlation between immunostaining for MMP-2 and lymph node metastasis. Factors negatively associated with survival were N stage, histopathological grade, perineural invasion and immunostaining for MMP-9. There was no significant association between the immunoexpression of CD31 and the other variables. The intensity of immunostaining for MMP-2 can be indicative of metastasis in lymph nodes, and that for MMP-9 of a lower probability of survival.
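
A minimal sketch of two of the tests in this battery, run with `scipy.stats` on made-up immunostaining scores (the study's actual data are not reproduced here):

```python
from scipy import stats

# Hypothetical immunostaining scores (% positive area) in two groups of
# patients split by lymph-node status; all values are illustrative only.
n0 = [12.1, 15.3, 9.8, 11.0, 14.2, 10.5, 13.7, 12.9]   # N0 (no metastasis)
n_plus = [22.4, 19.8, 25.1, 21.0, 23.6, 20.7, 24.3]    # N+ (metastasis)

# Mann-Whitney U: association between a numerical variable and group.
u, p_mw = stats.mannwhitneyu(n0, n_plus, alternative="two-sided")

# Fisher's exact test on a 2x2 table (used when an expected count < 5),
# e.g. high/low MMP-2 staining vs. lymph-node status.
table = [[2, 6], [6, 1]]
odds, p_fisher = stats.fisher_exact(table)

print(round(p_mw, 4), round(p_fisher, 4))
```

At the 5% significance level used in the study, both illustrative tests reject the null hypothesis.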

Relevance:

100.00%

Publisher:

Abstract:

The research is part of a survey for the detection of the hydraulic and geotechnical conditions of river embankments, funded by the Reno River Basin Regional Technical Service of the Emilia-Romagna Region. The hydraulic safety of the Reno River, one of the main rivers in north-eastern Italy, is indeed of primary importance to the Emilia-Romagna regional administration. The large longitudinal extent of the banks (several hundred kilometres) has generated great interest in non-destructive geophysical methods, which, compared to other methods such as drilling, allow for the faster and often less expensive acquisition of high-resolution data. The present work aims to assess Ground Penetrating Radar (GPR) for the detection of local non-homogeneities (mainly stratigraphic contacts, cavities and conduits) inside the embankments of the Reno River and its tributaries, taking into account supplementary data collected with traditional destructive tests (boreholes, cone penetration tests, etc.). A comparison with other non-destructive methodologies, namely electrical resistivity tomography (ERT), Multi-channel Analysis of Surface Waves (MASW) and FDEM induction, was also carried out in order to verify the usability of GPR and to support the integration of various geophysical methods into the regular maintenance and inspection of embankment conditions. The first part of this thesis is dedicated to the state of the art concerning the geographic, geomorphologic and geotechnical characteristics of the embankments of the Reno River and its tributaries, as well as to the description of some geophysical applications on embankments of European and North American rivers, which served as the bibliographic basis for this thesis.
The second part is an overview of the geophysical methods employed in this research (with particular attention to GPR), also covering their theoretical basis and examining in depth some techniques for the analysis and representation of geophysical data when applied to river embankments. The subsequent chapters, following the main scope of this research, namely to highlight the advantages and drawbacks of Ground Penetrating Radar applied to the embankments of the Reno River and its tributaries, show the results obtained by analyzing different cases that could lead to the formation of weakness zones and, subsequently, to embankment failure. Among the advantages, a considerable acquisition speed and a spatial resolution of the obtained data unmatched by the other methodologies were recorded. With regard to the drawbacks, some factors related to the attenuation of wave propagation, due to different contents of clay, silt and sand, as well as surface effects, significantly limited the correlation between GPR profiles and geotechnical information and therefore compromised the embankment safety assessment. In summary, Ground Penetrating Radar could represent a suitable tool for checking river dike conditions, but its use is significantly limited by the geometric and geotechnical characteristics of the levees of the Reno River and its tributaries. In fact, only the shallower part of the embankment could be investigated, and the information obtained related only to changes in electrical properties, without any numerical measurement. Consequently, GPR is ineffective for a preliminary assessment of embankment safety conditions, whereas for detailed campaigns at shallow depth, which aim to achieve immediate results with optimal precision, its use is highly recommended.
The cases where a multidisciplinary approach was tested revealed an excellent interplay of the various geophysical methodologies employed, producing qualitative results in the preliminary phase (FDEM), a quantitative and highly reliable description of the subsoil (ERT) and, finally, fast and highly detailed analysis (GPR). As a recommendation for future research, the simultaneous use of several geophysical devices to assess the safety conditions of river embankments is strongly suggested, especially when facing a probable flood event, when the entire extent of the embankments must be investigated.

Relevance:

100.00%

Publisher:

Abstract:

In 2002, a high-intensity data-taking run with K_S mesons and neutral hyperons was carried out with the NA48/1 detector, during which, among other things, about 10^9 Xi^0 decay candidates were recorded. In this work, 6657 Xi^0 -> Sigma^+ e^- anti-nu and 581 Anti-Xi^0 -> Anti-Sigma^+ e^+ nu events were selected from this data set, and the branching ratios BR1(Gamma(Xi^0 -> Sigma^+ e^- anti-nu)/Gamma(Xi^0 total)) = ( 2.533 +-0.032(stat) -0.076+0.089(syst) )×10^-4 and BR2(Gamma(Anti-Xi^0 -> Anti-Sigma^+ e^+ nu)/Gamma(Anti-Xi^0 total)) = ( 2.57 +-0.12(stat) -0.09+0.10(syst) )×10^-4 were determined. This result for BR1 is about 3.5 times more precise than the previously published measurement. The analysis of the Anti-Xi^0 beta decays constitutes the first measurement of BR2. Both results agree with the theoretical prediction of 2.6×10^-4. From the Xi^0 beta branching ratio, using the experimental value of the form factor ratio g1/f1, the CKM matrix element |Vus| = 0.209 +- 0.004(exp) +- 0.026(syst) follows, where the dominant uncertainty comes from g1/f1. In addition, 99 Xi^0 -> Sigma^+ mu^- anti-nu decay candidates were reconstructed in this work, with an estimated background of 30 events, and the branching ratio was likewise extracted: BR3(Gamma(Xi^0 -> Sigma^+ mu^- anti-nu)/Gamma(Xi^0 total)) = ( 2.11 +- 0.31(stat) +- 0.15(syst) )×10^-6.

Relevance:

100.00%

Publisher:

Abstract:

Alongside mass, lifetime is one of the principal characteristics of a particle. The mean lifetime of the Xi0 hyperon, which can be predicted theoretically from the mean lifetime of the Xi- hyperon via the Delta I=1/2 rule, has already been measured experimentally several times. However, the most recent measurement, from 1977, has a relative uncertainty of 5%, which can be clearly improved using data from newer experiments. The mean lifetime is an important parameter in the determination of the matrix element Vus of the Cabibbo-Kobayashi-Maskawa matrix in semileptonic Xi0 decays. In 2002, a high-intensity data-taking run was carried out with the NA48 detector, during which, among other things, about 10^9 Xi0 decay candidates were recorded. Of these, 192000 events of the type "Xi0 -> Lambda pi0" were reconstructed in this work, and 107000 events were used to determine the mean lifetime by comparison with simulated events. To avoid systematic errors, the lifetime was determined in ten energy intervals by comparing measured and simulated data. The result is considerably more precise than previous measurements and deviates from the literature value (tau=(2.90+-0.09)×10^(-10) s) by (+4.99+-0.50(stat)+-0.58(syst))%, corresponding to 1.7 standard deviations. The lifetime is found to be tau=(3.045+-0.015(stat)+-0.017(syst))×10^(-10) s. In the same way, the lifetime of the anti-Xi0 hyperon could be measured for the first time with the available data. The result of this measurement is tau=(3.042+-0.045(stat)+-0.017(syst))×10^(-10) s.

Relevance:

100.00%

Publisher:

Abstract:

This thesis is devoted to the study of the properties of high-redshift galaxies in the epoch 1 < z < 3, when a substantial fraction of galaxy mass was assembled, and when the evolution of the star-formation rate density peaked. Following a multi-perspective approach and using the most recent and highest-quality data available (spectra, photometry and imaging), the morphologies and the star-formation properties of high-redshift galaxies were investigated. Through accurate morphological analyses, the build-up of the Hubble sequence was placed around z ~ 2.5. High-redshift galaxies appear, in general, much more irregular and asymmetric than local ones. Moreover, the occurrence of morphological k-correction is less pronounced than in the local Universe. Different star-formation rate indicators were also studied. The comparison of ultraviolet- and optical-based estimates with the values derived from the infrared luminosity showed that the traditional way of addressing dust obscuration is problematic at high redshift, and new models of dust geometry and composition are required. Finally, by means of stacking techniques applied to rest-frame ultraviolet spectra of star-forming galaxies at z ~ 2, the warm phase of galactic-scale outflows was studied. Evidence was found of escaping gas at velocities of ~100 km/s. By studying the correlation of interstellar absorption-line equivalent widths with galaxy physical properties, the intensity of the outflow-related spectral features was shown to depend strongly on a combination of the velocity dispersion of the gas and its geometry.
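
The stacking technique mentioned above can be illustrated with a toy example: after normalizing each rest-frame spectrum by its continuum, a median stack of many noisy spectra suppresses the noise and reveals a weak common absorption feature that is invisible in any single spectrum. All numbers below are invented for illustration:

```python
import numpy as np

def stack_spectra(spectra, continuum_window):
    """Median-stack rest-frame spectra after normalizing each one by the
    median flux in a line-free continuum window (pixel indices lo:hi)."""
    lo, hi = continuum_window
    spectra = np.asarray(spectra, dtype=float)
    norm = np.median(spectra[:, lo:hi], axis=1, keepdims=True)
    return np.median(spectra / norm, axis=0)

# Toy example: 50 noisy spectra sharing a weak absorption line at pixel 30.
rng = np.random.default_rng(1)
n_spec, n_pix = 50, 100
flux = np.ones((n_spec, n_pix))
flux[:, 30] -= 0.2                        # common absorption feature
flux += rng.normal(0.0, 0.3, flux.shape)  # per-spectrum noise

stacked = stack_spectra(flux, (60, 90))
# The feature, buried in noise in single spectra, stands out in the stack.
print(stacked[30])
```

A median stack (rather than a mean) is the common choice because it is robust against outliers such as sky residuals in individual spectra.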

Relevance:

100.00%

Publisher:

Abstract:

Different types of proteins exist, with diverse functions that are essential for living organisms. An important class is represented by transmembrane proteins, which are specifically designed to be inserted into biological membranes, where they perform very important functions in the cell such as cell communication and active transport across the membrane. Transmembrane β-barrels (TMBBs) are a sub-class of membrane proteins largely under-represented in structure databases because of the extreme difficulty of experimental structure determination. For this reason, computational tools that are able to predict the structure of TMBBs are needed. In this thesis, two computational problems related to TMBBs were addressed: the detection of TMBBs in large datasets of proteins and the prediction of the topology of TMBB proteins. Firstly, a method for TMBB detection was presented, based on a novel neural network framework for variable-length sequence classification. The proposed approach was validated on a non-redundant dataset of proteins. Furthermore, we carried out genome-wide detection using the entire Escherichia coli proteome. In both experiments, the method significantly outperformed existing state-of-the-art approaches, reaching a very high PPV (92%) and MCC (0.82). Secondly, a method was also introduced for TMBB topology prediction. The proposed approach is based on grammatical modelling and probabilistic discriminative models for sequence data labeling. The method was evaluated using a newly generated dataset of 38 TMBB proteins obtained from high-resolution data in the PDB. Results have shown that the model is able to correctly predict the topologies of 25 out of 38 protein chains in the dataset. When tested on previously released datasets, the performance of the proposed approach was measured as comparable or superior to the current state of the art in TMBB topology prediction.
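
Both reported metrics, PPV and MCC, are simple functions of the confusion-matrix counts; MCC is the more demanding of the two on strongly unbalanced data such as a proteome scan, where true barrels are rare. A small sketch with illustrative counts (not the thesis's actual confusion matrix):

```python
import math

def ppv_mcc(tp, fp, tn, fn):
    """Positive predictive value and Matthews correlation coefficient
    computed from confusion-matrix counts."""
    ppv = tp / (tp + fp)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom
    return ppv, mcc

# Hypothetical counts for a detector screening an unbalanced proteome;
# the numbers are illustrative, chosen to give a PPV of 92%.
ppv, mcc = ppv_mcc(tp=92, fp=8, tn=4880, fn=20)
print(round(ppv, 2), round(mcc, 2))
```

Note how MCC stays well below 1 even at 92% PPV: unlike accuracy, it penalizes the false negatives that an unbalanced dataset would otherwise hide.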

Relevance:

100.00%

Publisher:

Abstract:

This thesis consists of three self-contained papers. In the first paper I analyze the labor supply behavior of Bologna pizza delivery vendors. Recent influential papers analyze the labor supply behavior of taxi drivers (Camerer et al., 1997; Crawford and Meng, 2011) and suggest that reference-dependent preferences have an important influence on drivers' labor supply decisions. Unlike previous papers, I am able to identify an exogenous and transitory change in labor demand. Using high-frequency data on orders and rainfall as an exogenous demand shifter, I invariably find that reference-dependent preferences play no role in labor supply decisions and that the behavior of pizza vendors is perfectly consistent with the predictions of the standard model of labor supply. In the second paper, I investigate how the voting behavior of Members of Parliament is influenced by the Members seated nearby. Exploiting the random seating arrangements in the Icelandic Parliament, I show that being seated next to Members of a different party increases the probability of not being aligned with one's own party. Using the exact spatial orientation of the peers, I provide evidence supporting the hypothesis that interaction is the main channel explaining these results. In the third paper, I provide an estimate of the trade flows there would have been between the UK and Europe if the UK had joined the Euro. As an alternative to the standard log-linear gravity equation, I employ the synthetic control method. I show that the aggregate trade flows between Britain and Europe would have been 13% higher if the UK had adopted the Euro.
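
The synthetic control method builds the counterfactual as a convex combination of donor units, with weights fitted on pre-treatment data. The sketch below uses simulated series, and non-negative least squares followed by normalization as a simple stand-in for the usual constrained optimization; the actual application would use UK and donor-country trade series:

```python
import numpy as np
from scipy.optimize import nnls

# Simulated pre-treatment panel: donor outcome paths as random walks and
# a "treated" path that is truly a weighted average of three donors.
rng = np.random.default_rng(2)
T_pre, n_donors = 30, 6
X = rng.normal(0.0, 1.0, (T_pre, n_donors)).cumsum(axis=0)  # donor paths
true_w = np.array([0.5, 0.3, 0.2, 0.0, 0.0, 0.0])
y = X @ true_w + rng.normal(0.0, 0.05, T_pre)               # treated path

# Non-negative least squares, then rescale so weights sum to one.
w, _ = nnls(X, y)
w = w / w.sum()
synthetic = X @ w          # synthetic control's pre-treatment path
print(np.round(w, 2))
```

The post-treatment gap between the actual treated path and `synthetic` is then read as the treatment effect, e.g. the missing trade from staying outside the Euro.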

Relevance:

100.00%

Publisher:

Abstract:

The behaviour of a polymer depends strongly on the length and time scale as well as on the temperature at which it is probed. In this work, I describe investigations of polymer surfaces using scanning probe microscopy with heatable probes. With these probes, surfaces can be heated within seconds down to microseconds. I introduce experiments for the local and fast determination of glass transition and melting temperatures. I developed a method which allows the determination of glass transition and melting temperatures on films with thicknesses below 100 nm: a background measurement on the substrate was performed, and the resulting curve was subtracted from the measurement on the polymer film. The differential measurement on polystyrene films with thicknesses between 35 nm and 160 nm showed characteristic signals at 95 ± 1 °C, in accordance with the glass transition of polystyrene. Pressing heated probes into polymer films causes plastic deformation. Nanometer-sized deformations are currently investigated in novel concepts for high-density data storage. A suitable medium for such a storage system has to be easily indentable on the one hand, but on the other hand it also has to be very stable against surface-induced wear. For developing such a medium I investigated a new approach: a comparably soft material, namely polystyrene, was protected with a thin but very hard layer made of plasma-polymerized norbornene. The resulting bilayered media were tested for surface stability and deformability. I showed that the bilayered material combines the deformability of polystyrene with the surface stability of the plasma polymer, and that the material therefore is a very good storage medium. In addition, we investigated the glass transition temperature of polystyrene at timescales of 10 µs and found it to be approx. 220 °C. The increase of this characteristic temperature of the polymer results from the short time at which the polymer was probed and reflects the well-known time-temperature superposition principle. Heatable probes were also used for the characterization of silver-azide-filled nanocapsules. The use of heatable probes allowed determining the decomposition temperature of the capsules from a few nanograms of material. The measured decomposition temperatures ranged from 180 °C to 225 °C, in accordance with literature values. The investigation of small amounts of sample was necessary due to the limited availability of the material. Furthermore, investigating larger amounts of the capsules using conventional thermal gravimetric analysis could lead to contamination or even damage of the instrument. Besides the analysis of material parameters, I used the heatable probes for the local thermal decomposition of pentacene precursor material in order to form nanoscale conductive structures. Here, the thickness of the precursor layer was important for complete thermal decomposition. Another aspect of my work was the investigation of redox-active polymers, poly-10-(4-vinylbenzyl)-10H-phenothiazine (PVBPT), for data storage. Data is stored by changing the local conductivity of the material by applying a voltage between tip and surface. The generated structures were stable for more than 16 h. It was shown that the presence of water is essential for successful patterning.

Relevance:

100.00%

Publisher:

Abstract:

One of the most important challenges in chemistry and materials science is the connection between the composition of a compound and its chemical and physical properties. In solids, these are greatly influenced by the crystal structure.

The prediction of hitherto unknown crystal structures with regard to external conditions like pressure and temperature is therefore one of the most important goals in theoretical chemistry. The stable structure of a compound is the global minimum of the potential energy surface, which is the high-dimensional representation of the enthalpy of the investigated system with respect to its structural parameters. The fact that the complexity of the problem grows exponentially with the system size is the reason why it can only be solved via heuristic strategies.

Improvements to the artificial bee colony method, where the local exploration of the potential energy surface is done by a high number of independent walkers, are developed and implemented. This results in an improved communication scheme between these walkers, which directs the search towards the most promising areas of the potential energy surface.

The minima hopping method uses short molecular dynamics simulations at elevated temperatures to direct the structure search from one local minimum of the potential energy surface to the next. A modification is developed and implemented in which the local information around each minimum is extracted and used to optimize the search direction. Our method uses this local information to increase the probability of finding new, lower local minima, which leads to enhanced performance of the global optimization algorithm.

Hydrogen is a highly relevant system, due to the possibility of finding a metallic phase and even a superconductor with a high critical temperature. An application of a structure prediction method to SiH12 finds stable crystal structures in this material. Additionally, it becomes metallic at relatively low pressures.
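
The minima hopping idea, escape a local minimum, relax into a neighboring one, and keep track of the lowest energy found, can be illustrated with the closely related basin-hopping algorithm available in SciPy. This is not the method developed in the thesis: the MD-based escape step is replaced here by a random jump, and the "potential energy surface" is a toy one-dimensional function:

```python
from scipy.optimize import basinhopping

# A tilted double well standing in for a potential energy surface:
# two local minima near x = +/-2, the one near x = -2 being global.
def potential(x):
    return (x[0]**2 - 4.0)**2 + 0.3 * x[0]

# Start in the basin of the *higher* minimum; the jump + local-relaxation
# cycle must cross the barrier near x = 0 to reach the global minimum.
result = basinhopping(potential, x0=[2.0], niter=100, stepsize=3.0, seed=3)
print(result.x[0], result.fun)
```

The `stepsize` plays the role of the escape "temperature" in minima hopping: too small and the walker never leaves its basin, too large and the local information around each minimum is wasted.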

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis is the design and development of several modules of a software package for high-throughput data readout from electrophysiology devices developed by the company Elements s.r.l. Elements produces high-precision amplifiers for electrophysiology, capable of measuring the low-intensity currents produced by ion channels. Given the company's rapid growth, and in view of its plan to bring to market new devices with ever better precision and functionality, Elements expressed the need for a software system that could best support the devices already in production and, above all, provide for the support of the new ones, with much better performance than the data-readout software it had already developed. The required software must provide a graphical interface that, communicating with the device over USB to read its data, displays them on screen and allows recording them and performing basic analysis operations. This thesis presents the analysis, design and development of the software modules that interface directly with the device: the device-detection, connection, data-acquisition and data-processing modules.
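
The acquisition/processing split described above can be sketched as a classic bounded producer-consumer pipeline. The snippet below simulates the device with a fake buffer generator; the real acquisition module would read from the USB endpoint instead, and the bounded queue applies back-pressure so a slow consumer cannot exhaust memory:

```python
import queue
import threading

SENTINEL = None  # marks the end of the sample stream

def acquire(q, n_buffers):
    """Producer: push fixed-size buffers of samples into the queue
    (stands in for reads from the device's USB endpoint)."""
    for i in range(n_buffers):
        buf = [i * 64 + k for k in range(64)]   # fake 64-sample buffer
        q.put(buf)
    q.put(SENTINEL)                             # signal end of stream

def process(q, out):
    """Consumer: drain the queue and compute a simple per-buffer statistic
    (stands in for display/recording/analysis)."""
    while True:
        buf = q.get()
        if buf is SENTINEL:
            break
        out.append(sum(buf) / len(buf))         # e.g. mean current per buffer

q = queue.Queue(maxsize=8)   # bounded: acquisition blocks if processing lags
means = []
t1 = threading.Thread(target=acquire, args=(q, 100))
t2 = threading.Thread(target=process, args=(q, means))
t1.start(); t2.start()
t1.join(); t2.join()
print(len(means))
```

Decoupling the reader thread from the processing thread in this way keeps acquisition latency bounded even when display or analysis is momentarily slow.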