912 results for simplicity


Relevância:

10.00%

Publicador:

Resumo:

Within the framework of Sustainable Chemistry and the application of its principles to environmental protection, this doctoral project concerned the development of innovative materials and the study of their interaction with biological and biomimetic systems. In particular, the work focused on the synthesis of ionic liquids and the investigation of their interactions with cell membranes, and on the use and isolation of molecules from renewable sources. Ionic liquids are organic salts that are liquid below 100 °C; they are regarded as promising low-toxicity solvents, but their modes of interaction with biological systems and their toxicity mechanisms still need to be fully clarified. To this end, a battery of biochemical tests, based on fluorescence and colorimetric assays, was employed, which made it possible to discriminate the different types of interaction with various membrane structures. The information gathered was used to design substances less harmful to cell structures, in order to select the molecular functionalities that allow ionic liquids to retain their activity while being less damaging to the environment. As regards the use and isolation of molecules from renewable sources, pyrolysis was used to obtain starting materials and to employ them in the synthesis of chemicals as alternatives to compounds derived from fossil sources. Conventional pyrolysis of cellulose yields an interesting molecule, referred to for simplicity as LAC, in quantities insufficient for practical applications. In the course of this research it was discovered that pyrolysis carried out in the presence of meso-structured catalysts (MCM-41) doped with transition metals affords good yields of LAC. LAC proved promising both for the production of new molecules with possible applications in fine chemistry and pharmaceuticals, and as a monomer for new polymers (a copolymer and a homopolymer).

Relevância:

10.00%

Publicador:

Resumo:

Beamforming entails the joint processing of multiple signals received or transmitted by an array of antennas. This thesis addresses the implementation of beamforming in two distinct systems, namely a distributed network of independent sensors and a broad-band multi-beam satellite network. With the rising popularity of wireless sensors, scientists are taking advantage of the flexibility of these devices, which come with very low implementation costs. Simplicity, however, is intertwined with scarce power resources, which must be carefully rationed to ensure successful measurement campaigns throughout the whole duration of the application. In this scenario, distributed beamforming is a cooperative communication technique that allows nodes in the network to emulate a virtual antenna array, seeking power gains on the order of the size of the network itself when required to deliver a common message signal to the receiver. To achieve a desired beamforming configuration, however, all nodes in the network must agree upon the same phase reference, which is challenging in a distributed set-up where all devices are independent. The first part of this thesis presents new algorithms for phase alignment, which prove to be more energy efficient than existing solutions. With the ever-growing demand for broad-band connectivity, satellite systems have great potential to guarantee service where terrestrial systems cannot penetrate. In order to satisfy the constantly increasing demand for throughput, satellites are equipped with multi-fed reflector antennas to resolve spatially separated signals. However, increasing the number of feeds on the payload burdens the link between the satellite and the gateway with an extensive amount of signaling, and possibly calls for much more expensive multiple-gateway infrastructures.
This thesis focuses on an on-board non-adaptive signal processing scheme denoted as Coarse Beamforming, whose objective is to reduce the communication load on the link between the ground station and space segment.
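The phase-alignment problem described above can be made concrete with the classic one-bit feedback scheme, a standard baseline from the distributed-beamforming literature (not one of the thesis's new algorithms): each node randomly perturbs its carrier phase, and the receiver broadcasts a single bit indicating whether the received signal strength improved. A minimal simulation sketch:

```python
import numpy as np

def one_bit_feedback_alignment(n_nodes=50, n_iters=4000, sigma=0.15, seed=0):
    """Toy simulation of one-bit-feedback phase alignment for
    distributed beamforming. Each iteration, every node perturbs its
    phase by a small random amount; the receiver feeds back one bit
    (did the received signal strength improve?), and the nodes keep
    or discard the perturbation accordingly."""
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0, 2 * np.pi, n_nodes)   # initially incoherent
    best_gain = np.abs(np.exp(1j * phases).sum())
    for _ in range(n_iters):
        trial = phases + rng.normal(0, sigma, n_nodes)
        gain = np.abs(np.exp(1j * trial).sum())
        if gain > best_gain:                      # 1-bit feedback: keep
            phases, best_gain = trial, gain
    return best_gain / n_nodes                    # 1.0 = perfect coherence

# the normalized coherent gain climbs toward the ideal N-fold array gain
print(one_bit_feedback_alignment())
```

The monotone accept-if-better rule is what makes the scheme need only one feedback bit per iteration, at the cost of a convergence time that grows with the network size.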

Relevância:

10.00%

Publicador:

Resumo:

The primary goals of this study were to develop a cell-free in vitro assay for the assessment of nonthermal electromagnetic field (EMF) bioeffects and to develop theoretical models in accord with current experimental observations. Based upon the hypothesis that EMF effects operate by modulating Ca2+/CaM binding, an in vitro nitric oxide (NO) synthesis assay was developed to assess the effects of a pulsed radiofrequency (PRF) signal used for the treatment of postoperative pain and edema. No effects of PRF on NO synthesis were observed. Effects of PRF on Ca2+/CaM binding were also assessed using a Ca2+-selective electrode, likewise yielding no EMF effect on Ca2+/CaM binding. However, a PRF effect was observed on the interaction of hemoglobin (Hb) with tetrahydrobiopterin, leading to the development of an in vitro Hb deoxygenation assay, which showed a reduction in the rate of Hb deoxygenation for exposures to both PRF and a static magnetic field (SMF). Structural studies using pyranine fluorescence, Gd3+ vibronic sideband luminescence, and attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopy were conducted in order to ascertain the mechanism of this EMF effect on Hb. The effect of SMF on Hb oxygen saturation (SO2) was also assessed under gas-controlled conditions. These studies showed no definitive changes in protein/solvation structure or SO2 under equilibrium conditions, suggesting the need for real-time instrumentation or other means of observing out-of-equilibrium Hb dynamics. Theoretical models were developed for EMF transduction, effects on ion binding, neuronal spike timing, and the dynamics of Hb deoxygenation. The EMF sensitivity and simplicity of the Hb deoxygenation assay suggest a new tool with which to further establish basic biophysical EMF transduction mechanisms.
If an EMF-induced increase in the rate of deoxygenation can be demonstrated in vivo, then enhancement of oxygen delivery may be a new therapeutic method by which clinically relevant EMF-mediated enhancement of growth and repair processes can occur.

Relevância:

10.00%

Publicador:

Resumo:

The ability of block copolymers to spontaneously self-assemble into a variety of ordered nano-structures not only makes them a scientifically interesting system for the investigation of order-disorder phase transitions, but also offers a wide range of nano-technological applications. The diblock architecture is the simplest among block copolymer systems, hence it is often used as a model system in both experiment and theory. We introduce a new soft-tetramer model for efficient computer simulations of diblock copolymer melts. The instantaneous non-spherical shape of polymer chains in the molten state is incorporated by modeling each of the two blocks as two soft spheres. The interactions between the spheres are modeled such that the diblock melt tends to microphase-separate with decreasing temperature. Using Monte Carlo simulations, we determine the equilibrium structures at variable values of the two relevant control parameters, the diblock composition and the incompatibility of unlike components. The simplicity of the model allows us to scan the control parameter space with a completeness that has not been reached in previous molecular simulations. The resulting phase diagram shows clear similarities with the phase diagram found in experiments. Moreover, we show that structural details of block copolymer chains can be reproduced by our simple model. We develop a novel method for the identification of the observed diblock copolymer mesophases that formalizes the usual approach of direct visual observation, using the characteristic geometry of the structures.
A cluster analysis algorithm is used to determine clusters of each component of the diblock, and the number and shape of the clusters can be used to determine the mesophase. We also employ methods from integral geometry for the identification of mesophases and compare their usefulness to the cluster analysis approach. To probe the properties of our model in confinement, we perform molecular dynamics simulations of atomistic polyethylene melts confined between graphite surfaces. The results from these simulations are used as input for an iterative coarse-graining procedure that yields a surface interaction potential for the soft-tetramer model. Using the interaction potential derived in this way, we perform an initial study of the behavior of the soft-tetramer model in confinement. Comparing with experimental studies, we find that our model can reproduce basic features of confined diblock copolymer melts.
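The cluster-analysis step can be illustrated with a toy connected-component count over a set of occupied lattice sites (a generic sketch only; the thesis's algorithm operates on soft-sphere configurations and also evaluates cluster shape). Many compact clusters of the minority block would suggest a spherical phase, while a single percolating cluster would suggest a bicontinuous one.

```python
from collections import deque

def count_clusters(occupied, L, periodic=True):
    """Count 6-connected clusters of occupied sites in an L x L x L
    periodic box, via breadth-first flood fill."""
    occupied = set(occupied)
    seen, n_clusters = set(), 0
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
             (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for site in occupied:
        if site in seen:
            continue
        n_clusters += 1                      # found a new cluster seed
        queue = deque([site])
        seen.add(site)
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in steps:
                nb = (x + dx, y + dy, z + dz)
                if periodic:
                    nb = (nb[0] % L, nb[1] % L, nb[2] % L)
                if nb in occupied and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
    return n_clusters

def blob(x0):
    """A 2x2x2 cube of occupied sites starting at x = x0."""
    return {(x0 + i, j, k) for i in range(2) for j in range(2) for k in range(2)}

# two well-separated cubes of the minority component -> 2 clusters
print(count_clusters(blob(0) | blob(5), L=10))  # → 2
```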

Relevância:

10.00%

Publicador:

Resumo:

In the race to obtain protons with higher energies while using more compact systems, laser-driven plasma accelerators are becoming an interesting possibility. For now, however, only beams with extremely broad energy spectra and high divergence have been produced. The driving line of this PhD thesis was the study and design of a compact system to extract a high-quality beam from the initial bunch of protons produced by the interaction of a laser pulse with a thin solid target, using experimentally reliable technologies so that such a system can be tested as soon as possible. In this thesis, different transport lines are analyzed. The first is based on a high-field pulsed solenoid, some collimators and, for perfect filtering and post-acceleration, a compact high-field, high-frequency linear accelerator originally designed to accelerate a 30 MeV beam extracted from a cyclotron. The second is based on a quadruplet of permanent magnetic quadrupoles: thanks to its greater simplicity and reliability, it is of great interest for experiments, but it is less effective than the solenoid-based line; in fact, the final beam intensity drops by an order of magnitude. A further appreciable decrease in intensity occurs in the third case, where the energy selection is achieved using a chicane, because of its very low efficiency for off-axis protons. The proposed schemes have all been analyzed with 3D simulations, and all the significant results are presented. Future experimental work based on the outcome of this thesis can now be planned and is currently under discussion.

Relevância:

10.00%

Publicador:

Resumo:

The close relationship between climate change and anthropogenic influence has long drawn attention to the greenhouse effect and global warming, as well as to the rising atmospheric concentrations of climatically active gases, first and foremost CO2. Radiocarbon is currently the environmental tracer of choice: through a "top-down" approach it provides a valid control tool for discriminating and quantifying the fossil and biogenic fractions of the carbon dioxide present in the atmosphere. Thus, alongside the traditional application fields of 14C, such as archaeometric dating, new areas are emerging, linked on the one hand to the energy sector (issues associated with plant emissions, fuels, and geological CO2 storage) and on the other to the fast-growing market of so-called bio-based products made from renewable raw materials. In this thesis, radiocarbon was therefore explored both from a strictly technical and methodological standpoint and from an application standpoint across these many and diverse fields of investigation. An analysis system based on the radiometric method, using direct absorption of CO2 followed by liquid scintillation counting, was built and validated, with technological improvements and procedural refinements aimed at enhancing the method's performance in terms of simplicity, sensitivity and reproducibility. While generally a good compromise with respect to the methodologies traditionally used for 14C analysis, the method is at present still inadequate for those application fields requiring very high precision, but it is competitive for the analysis of modern samples with a high 14C concentration.
Finally, the experiments carried out on some ionic liquids, although preliminary and not conclusive, open new lines of research into the possibility of using this new class of compounds as media for CO2 capture and 14C analysis by LSC.

Relevância:

10.00%

Publicador:

Resumo:

The analysis of tandemly repeated DNA sequences is firmly established as a genetic typing method in phylogenetic studies, kinship analysis and, above all, forensic stain analysis, where multiplex PCR analysis of short tandem repeat (STR) systems brought a breakthrough in the identification and reliable attribution of biological crime-scene traces. In the sequencing of the human genome, particular attention is paid to the genetically polymorphic sequence variations in the genome, the SNPs (single nucleotide polymorphisms). Two of their properties, their frequent occurrence throughout the human genome and their comparatively low mutation rate, make them particularly well-suited tools for both forensics and population genetics. The goal of the EU project "SNPforID", from which the present work arose, was the establishment of new methods for the valid typing of SNPs in multiplex assays, with particular emphasis on sensitivity in stain analysis and on statistical power in forensic casework. To this end, 52 autosomal SNPs were selected and examined for their maximum individualization power. The investigation of the first 23 selected markers forms the first part of this work; it covers the establishment of the multiplex assay and of the SNaPshot™ typing method as well as their statistical evaluation. The results of this investigation form part of the subsequent study of the 52-SNP multiplex method, carried out in close collaboration with the partner laboratories. Also within the project, and as the main goal of this dissertation, a microarray-based single-base extension assay on glass slides was established and evaluated.
Starting from a limited amount of DNA, the possibility of simultaneously hybridizing as many SNP systems as possible was investigated. The SNP markers used here were chosen on the basis of the preparatory work successfully carried out for the establishment of the 52-SNP multiplex. Among the many methods for genotyping biallelic markers, this assay stands out through its parallelism and the simplicity of its experimental design, offering considerable savings in time and cost. In the present work, the "array of arrays" principle was used to type twelve DNA samples on one glass slide at the same time under identical experimental conditions. Based on a total of 1419 typed alleles from 33 markers, the validation was completed with a typing success rate of 86.75%. In addition, a number of boundary conditions relating to probe and primer design, hybridization conditions, and physical parameters of the laser-induced fluorescence measurement of the signals were tested and optimized.

Relevância:

10.00%

Publicador:

Resumo:

In this work, the remarkable versatility and usefulness of Xe-129 NMR experiments are further extended. The application of Xe-129 NMR spectroscopy to very different systems is studied, including dynamic and static, solid and liquid, porous and non-porous systems. Using the large non-equilibrium polarization created by hyperpolarization of Xe-129, time-resolved NMR measurements can be used for the online monitoring of dynamic systems. In the first part of this work, several improvements for medical applications of hyperpolarized Xe-129 are achieved and their feasibility shown experimentally. A large gain in speed and reproducibility of the accumulation process of Xe-129 as ice and an enhancement of the usable polarization in any experiment requiring prior accumulation are achieved. An enhancement of the longitudinal relaxation time of Xe-129 is realized by admixture of a buffer gas during the storage of hyperpolarized Xe-129. Continued efforts to simplify the accumulation process and to enhance the storage time of hyperpolarized Xe-129 will allow for a wider use of the hyperpolarized gas in (medical) MRI experiments. Concerning the use of hyperpolarized Xe-129 in MRI, the influence of the diffusion coefficient of the gas on image-contrast parameters is experimentally demonstrated here by admixing a buffer gas and thus changing the diffusion coefficient. In the second part of this work, a polymer system with unique features is probed by Xe-129 NMR spectroscopy, proving the method to be a valuable tool for the characterization of the anisotropic properties of semicrystalline, syndiotactic polystyrene films. The polymer films contain hollow cavities or channels with sizes in the sub-nanometer range, allowing for adsorption of Xe-129 and subsequent NMR measurements.
Despite the use of a 'real-world' system, the transfer of the anisotropic properties from the material to adsorbed Xe-129 atoms is shown, which was previously known only for fully crystalline materials. The anisotropic behavior towards atomic guests inside the polymer films is proven here for the first time for one of the phases. For the polymer phase containing nanochannels, the dominance of interactions between Xe-129 atoms in the channels over interactions between Xe atoms and the channel walls is proven by measurements of a powder sample of the polymer material and by experiments involving rotation of the films in the external magnetic field as well as temperature-dependent measurements. The characterization of 'real-world' systems showing very high degrees of anisotropy by Xe-129 is deemed to be very valuable for future applications. In the last part of this work, a new method for the online monitoring of chemical reactions is proposed and its feasibility and validity are experimentally proven. The dependence of the chemical shift of dissolved Xe-129 on the composition of a reaction mixture is used for the online monitoring of free-radical miniemulsion polymerization reactions. Xe-129 NMR spectroscopy provides an excellent method for the online monitoring of polymerization reactions, owing to the simplicity of the Xe-129 NMR spectra and the simple relationship between the Xe-129 chemical shift and the reaction conversion. The results of the time-resolved Xe-129 NMR measurements are compared with those from calorimetric measurements, showing good qualitative agreement. The applicability of the new method to reactions other than polymerizations is investigated by the online monitoring of an enzymatic reaction in a miniemulsion.
The successful combination of the large sensitivity of Xe-129, the NMR signal enhancements due to hyperpolarization, and the solubility of Xe-129 gives access to the large new field of investigations of chemical reaction kinetics in dynamic and complex systems like miniemulsions.
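In the simplest reading, the "simple relationship between the Xe-129 chemical shift and the reaction conversion" mentioned above amounts to a linear interpolation between the shift observed in the unreacted mixture and the shift at full conversion. The sketch below encodes that assumption; the numbers are invented for illustration, and the actual calibration used in the thesis may be more involved.

```python
def conversion_from_shift(delta_t, delta_0, delta_inf):
    """Map an observed Xe-129 chemical shift delta_t (ppm) to reaction
    conversion, assuming the shift varies linearly between delta_0
    (shift at zero conversion) and delta_inf (shift at full
    conversion). Returns a value in [0, 1] for shifts within range."""
    return (delta_t - delta_0) / (delta_inf - delta_0)

# hypothetical calibration: 200 ppm at 0% conversion, 190 ppm at 100%
print(conversion_from_shift(delta_t=195.0, delta_0=200.0, delta_inf=190.0))  # → 0.5
```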

Relevância:

10.00%

Publicador:

Resumo:

In the present thesis, we study quantization of classical systems with non-trivial phase spaces using the group-theoretical quantization technique proposed by Isham. Our main goal is a better understanding of global and topological aspects of quantum theory. In practice, the group-theoretical approach enables direct quantization of systems subject to constraints and boundary conditions in a natural and physically transparent manner -- cases for which the canonical quantization method of Dirac fails. First, we provide a clarification of the quantization formalism. In contrast to prior treatments, we introduce a sharp distinction between the two group structures that are involved and explain their physical meaning. The benefit is a consistent and conceptually much clearer construction of the Canonical Group. In particular, we shed light upon the 'pathological' case for which the Canonical Group must be defined via a central Lie algebra extension and emphasise the role of the central extension in general. In addition, we study direct quantization of a particle restricted to a half-line with 'hard wall' boundary condition. Despite the apparent simplicity of this example, we show that a naive quantization attempt based on the cotangent bundle over the half-line as classical phase space leads to an incomplete quantum theory; the reflection which is a characteristic aspect of the 'hard wall' is not reproduced. Instead, we propose a different phase space that realises the necessary boundary condition as a topological feature and demonstrate that quantization yields a suitable quantum theory for the half-line model. The insights gained in the present special case improve our understanding of the relation between classical and quantum theory and illustrate how contact interactions may be incorporated.
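The failure of Dirac's method on the half-line, and the kind of structure group-theoretic quantization substitutes for it, can be summarized in one standard calculation (a textbook illustration of the obstruction; the thesis itself proceeds via a modified phase space rather than the cotangent bundle):

```latex
% On L^2(\mathbb{R}_+), the momentum operator admits no self-adjoint
% extension, so the canonical pair cannot be exponentiated to a unitary
% group action; the affine pair can.
\begin{aligned}
  &\text{canonical pair (fails):} &&
    [\hat{x},\hat{p}] = i\hbar, \qquad
    \hat{p} = -i\hbar\,\partial_x \ \text{not self-adjoint on } L^2(\mathbb{R}_+),\\
  &\text{affine pair (works):} &&
    \hat{d} = \tfrac{1}{2}\bigl(\hat{x}\hat{p}+\hat{p}\hat{x}\bigr), \qquad
    [\hat{x},\hat{d}] = i\hbar\,\hat{x}.
\end{aligned}
```

The exponentials of the affine pair, translations in $\ln x$ and dilations, act unitarily on the half-line, which is why the affine group is the standard Canonical Group candidate for this configuration space.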

Relevância:

10.00%

Publicador:

Resumo:

Cardiotocography (CTG) is a widespread foetal diagnostic method. However, it lacks objectivity and reproducibility because of its dependence on the observer's expertise. To overcome these limitations, more objective methods for CTG interpretation have been proposed. In particular, many of the techniques developed aim to assess foetal heart rate variability (FHRV). Among them, some methodologies from nonlinear systems theory have been applied to the study of FHRV. All of these techniques have proved helpful in specific cases; nevertheless, none of them is more reliable than the others, so an in-depth study is necessary. The aim of this work is to deepen FHRV analysis through Symbolic Dynamics Analysis (SDA), a nonlinear technique already successfully employed for HRV analysis. Thanks to its simplicity of interpretation, it could be a useful tool for clinicians. We performed a literature study involving about 200 references on HRV and FHRV analysis; approximately 100 of these works focused on nonlinear techniques. Then, in order to compare linear and nonlinear methods, we carried out a multiparametric study in which 580 antepartum recordings of healthy foetuses were examined. Signals were processed using an updated software package for CTG analysis and a newly developed tool for generating simulated CTG traces. Finally, statistical tests and regression analyses were carried out to estimate relationships among the extracted indexes and other clinical information. The results confirm that none of the employed techniques is more reliable than the others. Moreover, in agreement with the literature, each analysis should take into account two relevant parameters, the foetal status and the week of gestation. Regarding SDA, the results show its promising capabilities in FHRV analysis: it allows recognition of foetal status, gestation week and the global variability of FHR signals, even better than other methods.
Nevertheless, further studies, which should also involve pathological cases, are necessary to establish its reliability.
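As an illustration of why SDA is simple to interpret, the sketch below implements one common variant of the technique: beat-to-beat intervals are quantized into a small symbol alphabet, consecutive symbols are grouped into words, and the entropy of the word distribution serves as a global variability index. The bin width, alphabet size and word length here are illustrative choices, not the parameters used in the thesis.

```python
import numpy as np

def sda_word_entropy(rr_ms, bin_ms=25.0, n_levels=3, word_len=3):
    """Symbolic Dynamics Analysis sketch: quantize beat-to-beat
    intervals (ms) into integer symbols in [-n_levels, n_levels]
    around the series mean, build overlapping words of word_len
    symbols, and return the Shannon entropy (bits) of the word
    distribution. Low entropy = low variability."""
    rr = np.asarray(rr_ms, dtype=float)
    symbols = np.clip(np.round((rr - rr.mean()) / bin_ms),
                      -n_levels, n_levels).astype(int)
    words = np.lib.stride_tricks.sliding_window_view(symbols, word_len)
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
flat = 430 + rng.normal(0, 2, 1000)     # low-variability interval series
varied = 430 + rng.normal(0, 60, 1000)  # high-variability interval series
print(sda_word_entropy(flat) < sda_word_entropy(varied))  # → True
```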

Relevância:

10.00%

Publicador:

Resumo:

The full blood count (FBC) is the most common indicator of disease. At present, hematology analyzers are used for blood cell characterization, but there has recently been interest in techniques that take advantage of microscale devices and the intrinsic properties of cells for increased automation and decreased cost. Microfluidic technologies offer solutions for handling and processing small volumes of blood (2-50 µL, taken by finger prick) for point-of-care (PoC) applications. Several PoC blood analyzers are in use and may have applications in the fields of telemedicine, outpatient monitoring and medical care in resource-limited settings. They have the advantage of being easy to move and much cheaper than traditional analyzers, which require bulky instruments and consume large amounts of reagents. The development of miniaturized point-of-care diagnostic tests may be enabled by chip-based technologies for cell separation and sorting. Many current diagnostic tests depend on fractionated blood components: plasma, red blood cells (RBCs), white blood cells (WBCs), and platelets. Specifically, white blood cell differentiation and counting provide valuable information for diagnostic purposes. For example, a low number of WBCs, called leukopenia, may be an indicator of bone marrow deficiency or failure, collagen-vascular diseases, or disease of the liver or spleen. Leukocytosis, a high number of WBCs, may be due to anemia, infectious diseases, leukemia or tissue damage. In the laboratory of hybrid biodevices at the University of Southampton, a functioning micro-impedance cytometer technology was developed for WBC differentiation and counting. It can classify cells and particles on the basis of their dielectric properties, in addition to their size, without the need for labeling, in a flow format similar to that of a traditional flow cytometer.
It was demonstrated that the micro-impedance cytometer system can detect and differentiate monocytes, neutrophils and lymphocytes, which are the three major human leukocyte populations. The simplicity and portability of the microfluidic impedance chip offer a range of potential applications in cell analysis, including point-of-care diagnostic systems. The microfluidic device has been integrated into a sample-preparation cartridge that semi-automatically performs erythrocyte lysis before leukocyte analysis. Erythrocytes are generally lysed manually according to a specific chemical lysis protocol, but this process has been automated in the cartridge. In this research work the chemical lysis protocol, defined in patent US 5155044 A, was optimized in order to improve the white blood cell differentiation and counting performed by the integrated cartridge.

Relevância:

10.00%

Publicador:

Resumo:

The activity carried out during my PhD was principally devoted to the development of portable microfluidic analytical devices based on biospecific molecular recognition reactions and chemiluminescence (CL) detection. In particular, the development of biosensors required the study of different materials and procedures for their construction, with particular attention to the development of suitable immobilization procedures and fluidic systems and to the selection of suitable detectors. Different methods were exploited, such as gene-probe hybridization assays and immunoassays, based on different platforms (functionalized glass slides or nitrocellulose membranes), while trying to improve the simplicity of the assay procedure. Different CL detectors were also employed and compared with each other in search of the best compromise between portability and sensitivity. The work was therefore aimed at the miniaturization and simplification of analytical devices, and the study involved all aspects of the system, from the analytical methodology to the type of detector, in order to combine high sensitivity with ease of use and rapidity. The latest development, involving the use of a smartphone as the chemiluminescence detector, paves the way for a new generation of analytical devices in the clinical diagnostic field, thanks to the ideal combination of the sensitivity and simplicity of CL with the steadily increasing performance of new-generation smartphone cameras. Moreover, the connectivity and data processing offered by smartphones can be exploited to perform analyses directly at home with simple procedures. The system could eventually be used to monitor patient health and directly notify the physician of the analysis results, allowing a decrease in costs and an increase in healthcare availability and accessibility.

Relevância:

10.00%

Publicador:

Resumo:

In condensed matter systems, the interfacial tension plays a central role in a multitude of phenomena. It is the driving force for nucleation processes, determines the shape and structure of crystalline structures, and is important for industrial applications. Despite its importance, the interfacial tension is hard to determine in experiments and also in computer simulations. While sophisticated simulation methods exist to compute liquid-vapor interfacial tensions, current methods for solid-liquid interfaces produce unsatisfactory results.

As a first approach to this topic, the influence of the interfacial tension on nuclei is studied within the three-dimensional Ising model. This model is well suited because, despite its simplicity, one can learn much about the nucleation of crystalline nuclei. Below the so-called roughening temperature, nuclei in the Ising model are no longer spherical but become cubic because of the anisotropy of the interfacial tension. This is similar to crystalline nuclei, which are in general not spherical but rather resemble a convex polyhedron with flat facets on the surface. In this context, the problem of distinguishing between the two bulk phases in the vicinity of the diffuse droplet surface is addressed. A new definition is found which correctly determines the volume of a droplet in a given configuration when compared to the volume predicted by simple macroscopic assumptions.

To compute the interfacial tension of solid-liquid interfaces, a new Monte Carlo method called the "ensemble switch method" is presented, which allows the interfacial tension of liquid-vapor as well as solid-liquid interfaces to be computed with great accuracy. In the past, the dependence of the interfacial tension on the finite size and shape of the simulation box has often been neglected, although there is a nontrivial dependence on the box dimensions.
As a consequence, one needs to systematically increase the box size and extrapolate to infinite volume in order to accurately predict the interfacial tension. Therefore, a thorough finite-size scaling analysis is established in this thesis. Logarithmic corrections to the finite-size scaling are motivated and identified, which are of leading order and therefore must not be neglected. The astounding feature of these logarithmic corrections is that they do not depend at all on the model under consideration. Using the ensemble switch method, the validity of a finite-size scaling ansatz containing the aforementioned logarithmic corrections is carefully tested and confirmed. Combining the finite-size scaling theory with the ensemble switch method, the interfacial tension of several model systems, ranging from the Ising model to colloidal systems, is computed with great accuracy.
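The extrapolation step described above can be sketched as a linear least-squares fit, assuming an ansatz of the form gamma(L) = gamma_inf + (a ln L + c) / L^(d-1), i.e. a logarithmic correction on top of the leading 1/L^(d-1) term (the precise ansatz used in the thesis may differ in its constants):

```python
import numpy as np

def extrapolate_gamma(L, gamma_L, d=3):
    """Extrapolate finite-size interfacial tensions gamma(L) to the
    thermodynamic limit by a linear least-squares fit of the ansatz
        gamma(L) = gamma_inf + (a*ln L + c) / L**(d-1).
    Returns (gamma_inf, a, c)."""
    L = np.asarray(L, dtype=float)
    y = np.asarray(gamma_L, dtype=float)
    # design matrix: [1, ln(L)/L^(d-1), 1/L^(d-1)]
    A = np.column_stack([np.ones_like(L),
                         np.log(L) / L**(d - 1),
                         1.0 / L**(d - 1)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return tuple(coef)

# synthetic data generated from the ansatz itself: the fit must
# recover gamma_inf = 1.2
Ls = np.array([8, 12, 16, 24, 32], dtype=float)
data = 1.2 + (0.5 * np.log(Ls) - 0.8) / Ls**2
g_inf, a, c = extrapolate_gamma(Ls, data)
print(round(g_inf, 6))  # → 1.2
```

Because the correction terms enter linearly in the unknown coefficients, no nonlinear fitting is needed; only the exponent d - 1 is fixed by the dimensionality.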

Relevância:

10.00%

Publicador:

Resumo:

Nowadays, communication is switching from a centralized scenario, in which media like newspapers, radio and TV programs produce information and people are just consumers, to a completely different, decentralized scenario in which everyone is potentially an information producer, through the use of social networks, blogs and forums that allow real-time worldwide information exchange. These new instruments, as a result of their widespread diffusion, have started playing an important socio-economic role. They are the most used communication media and, as a consequence, they constitute the main source of information that enterprises, political parties and other organizations can rely on. Analyzing data stored in servers all over the world is feasible by means of Text Mining techniques like Sentiment Analysis, which aims to extract opinions from huge amounts of unstructured text. This could make it possible to determine, for instance, the degree of user satisfaction with products, services, politicians and so on. In this context, this dissertation presents new Document Sentiment Classification methods based on the mathematical theory of Markov chains. All these approaches rely on a Markov-chain-based model, which is language independent and whose key features are simplicity and generality, making it attractive with respect to previous, more sophisticated techniques. Every discussed technique has been tested in both Single-Domain and Cross-Domain Sentiment Classification settings, comparing its performance with that of two previous works. The performed analysis shows that some of the examined algorithms produce results comparable with the best methods in the literature, with reference to both single-domain and cross-domain tasks, in 2-class (i.e. positive and negative) Document Sentiment Classification.
However, there is still room for improvement: this work also indicates how performance could be enhanced, namely, a good novel feature-selection process would be enough to outperform the state of the art. Furthermore, since some of the proposed approaches show promising results in 2-class Single-Domain Sentiment Classification, future work will also address validating these results in tasks with more than two classes.
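The core idea of scoring a document by per-class word-transition likelihoods can be sketched in a few lines (an illustrative toy with add-one smoothing, not the model construction actually used in the dissertation):

```python
import math
from collections import defaultdict

class MarkovSentiment:
    """Toy Markov-chain document classifier: one word-transition chain
    is estimated per class, and a document is assigned to the class
    whose chain gives it the higher log-likelihood."""

    def __init__(self):
        self.trans = {}    # class label -> {(w1, w2): bigram count}
        self.vocab = set()

    def fit(self, docs, labels):
        for doc, lab in zip(docs, labels):
            words = doc.lower().split()
            self.vocab.update(words)
            t = self.trans.setdefault(lab, defaultdict(int))
            for w1, w2 in zip(words, words[1:]):
                t[(w1, w2)] += 1

    def _loglik(self, words, lab):
        t = self.trans[lab]
        totals = defaultdict(int)          # outgoing counts per word
        for (w1, _), c in t.items():
            totals[w1] += c
        V = len(self.vocab)                # add-one smoothing over vocab
        return sum(math.log((t.get((w1, w2), 0) + 1) / (totals[w1] + V))
                   for w1, w2 in zip(words, words[1:]))

    def predict(self, doc):
        words = doc.lower().split()
        return max(self.trans, key=lambda lab: self._loglik(words, lab))

clf = MarkovSentiment()
clf.fit(["great movie really great acting", "really bad plot bad acting"],
        ["pos", "neg"])
print(clf.predict("great acting"))  # → pos
```

Because only transition counts are stored, the model is language independent in the sense emphasized above: nothing in it depends on stemming, lexicons or any other language-specific resource.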

Relevância:

10.00%

Publicador:

Resumo:

We propose a novel methodology to generate realistic network flow traces to enable systematic evaluation of network monitoring systems in various traffic conditions. Our technique uses a graph-based approach to model the communication structure observed in real-world traces and to extract traffic templates. By combining extracted and user-defined traffic templates, realistic network flow traces that comprise normal traffic and customized conditions are generated in a scalable manner. A proof-of-concept implementation demonstrates the utility and simplicity of our method to produce a variety of evaluation scenarios. We show that the extraction of templates from real-world traffic leads to a manageable number of templates that still enable accurate re-creation of the original communication properties on the network flow level.
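A highly simplified sketch of the template idea: flows are grouped by edge of the communication graph, each edge is summarized as a template, and synthetic traces are generated by sampling edges and replaying their profiles. The real method builds richer graph-based templates and supports user-defined ones; the field layout and addresses below are invented for illustration.

```python
import random
from collections import defaultdict

def extract_templates(flows):
    """Group observed flows by (src, dst) edge of the communication
    graph; each edge's template is simply the list of observed
    (packets, bytes) profiles."""
    templates = defaultdict(list)
    for src, dst, pkts, byts in flows:
        templates[(src, dst)].append((pkts, byts))
    return templates

def generate(templates, n_flows, seed=0):
    """Generate synthetic flows by sampling an edge uniformly and
    replaying one of its recorded (packets, bytes) profiles."""
    rng = random.Random(seed)
    edges = list(templates)
    out = []
    for _ in range(n_flows):
        e = rng.choice(edges)
        out.append((*e, *rng.choice(templates[e])))
    return out

real = [("10.0.0.1", "10.0.0.2", 10, 5000),
        ("10.0.0.1", "10.0.0.2", 12, 6100),
        ("10.0.0.3", "10.0.0.2", 3, 400)]
synth = generate(extract_templates(real), 5)
print(len(synth))  # → 5
```

Since generation only ever samples observed edges, the synthetic trace preserves the communication structure of the original graph by construction, which mirrors the accuracy claim made above.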