928 results for Nuclear structure models and methods
Abstract:
The diagnosis of mixed genotype hepatitis C virus (HCV) infection is rare and information on incidence in the UK, where genotypes 1a and 3 are the most prevalent, is sparse. Considerable variations in the efficacies of direct-acting antivirals (DAAs) for the HCV genotypes have been documented and the ability of DAAs to treat mixed genotype HCV infections remains unclear, with the possibility that genotype switching may occur. In order to estimate the prevalence of mixed genotype 1a/3 infections in Scotland, a cohort of 512 samples was compiled and then screened using a genotype-specific nested PCR assay. Mixed genotype 1a/3 infections were found in 3.8% of samples tested, with a significantly higher prevalence rate of 6.7% (p<0.05) observed in individuals diagnosed with genotype 3 infections than in those diagnosed with genotype 1a (0.8%). An analysis of the samples using genotype-specific qPCR assays found that in two-thirds of samples tested, the minor strain contributed <1% of the total viral load. The potential of deep sequencing methods for the diagnosis of mixed genotype infections was assessed using two newly developed pan-genotypic PCR assays, compatible with the Illumina MiSeq platform, targeting the E1-E2 and NS5B regions of the virus. The E1-E2 assay detected 75% of the mixed genotype infections, proving to be more sensitive than the NS5B assay, which identified only 25% of the mixed infections. Studies of sequence data and linked patient records also identified significantly more neurological disorders in genotype 3 patients. Evidence of distinctive dinucleotide expression within the genotypes was also uncovered. Taken together, these findings raise interesting questions about the evolutionary history of the virus and indicate that there is still more to understand about the different genotypes. In an era where clinical medicine is increasingly personalised, the development of diagnostic methods for HCV providing increased patient stratification is increasingly important.
This project has shown that sequence-based genotyping methods can be highly discriminatory and informative, and their use should be encouraged in diagnostic laboratories. Mixed genotype infections were challenging to identify and current deep sequencing methods were not as sensitive or cost-effective as Sanger-based approaches in this study. More research is needed to evaluate the clinical prognosis of patients with mixed genotype infection and to develop clinical guidelines on their treatment.
Abstract:
This publication presents a study of Plautus' As Báquides (Bacchides), analysing the context and characteristics of the play, its structure and characters, as well as the problem of Plautine originality in the conception of the 'third deception'.
Abstract:
2008
Abstract:
Analytics is the technology of manipulating data to produce information able to change the world we live in every day. Analytics has been widely used within the last decade to cluster people's behaviour and predict their preferences for items to buy, music to listen to, movies to watch and even electoral preference. The most advanced companies have succeeded in influencing people's behaviour using analytics. Despite the evidence of the power of analytics, it is rarely applied to the big data collected within supply chain systems (i.e. distribution networks, storage systems and production plants). This PhD thesis explores the fourth research paradigm (i.e. the generation of knowledge from data) applied to supply chain system design and operations management. An ontology defining the entities and the metrics of supply chain systems is used to design data structures for data collection in supply chain systems. The consistency of these data is ensured by mathematical demonstrations inspired by factory physics theory. The availability, quantity and quality of the data within these data structures define different decision patterns. Ten decision patterns are identified, and validated in the field, to address ten different classes of design and control problems in the field of supply chain systems research.
Abstract:
In this work we address Zipf's law from both an applied and a theoretical point of view. This empirical law states that the rank-frequency (RF) distribution of the words in a text follows a power law with exponent -1. On the theoretical side, we treated two classes of models capable of producing power laws in their probability distributions. In particular, we considered generalizations of Pólya urns and SSR (Sample Space Reducing) processes. For the latter we gave a formalization in terms of Markov chains. Finally, we proposed a population-dynamics model capable of unifying and reproducing the results of the three SSR processes found in the literature. We then moved on to a quantitative analysis of the RF behaviour of the words in a corpus of texts. In this case one observes that the RF does not follow a pure power law but exhibits a double regime, which can be represented by a power law with a changing exponent. We investigated whether the analysis of the RF behaviour could be linked to the topological properties of a graph. In particular, starting from a corpus of texts we built an adjacency network in which each word was linked to the word following it. A topological analysis of the graph structure yielded results that seem to confirm the hypothesis that its structure is related to the change of slope of the RF. This result may lead to developments in the study of language and the human mind. Moreover, since the graph structure appears to contain components that group words by meaning, a deeper study could lead to developments in automatic text comprehension (text mining).
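The rank-frequency analysis described above can be sketched in a few lines of Python. The toy sentence, the whitespace tokenization and the function name are illustrative assumptions, not the thesis's actual pipeline:

```python
from collections import Counter

def rank_frequency(text):
    """Return (rank, frequency) pairs for the words of a text,
    most frequent first (rank 1 = highest frequency)."""
    counts = Counter(text.lower().split())
    freqs = sorted(counts.values(), reverse=True)
    return list(enumerate(freqs, start=1))

# Zipf's law predicts frequency ~ rank**(-1), i.e. f(r) * r roughly constant.
text = "the cat sat on the mat and the dog sat on the log"
pairs = rank_frequency(text)
print(pairs[0])  # → (1, 4): "the" is the most frequent word
```

On a real corpus one would plot log-frequency against log-rank and look for the change of slope between the two power-law regimes the abstract describes.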
Abstract:
Silicon-based discrete high-power devices need to be designed with optimal performance up to several thousand volts and amperes to reach power ratings ranging from a few kWs to beyond the 1 GW mark. To this purpose, a key element is the improvement of the junction termination (JT), since it drastically reduces the surface electric field peaks which may lead to early device failure. This thesis is mostly focused on the negative bevel termination, which has for several years been a standard processing step in bipolar production lines. A simple methodology to realize its counterpart, a planar JT with variation of the lateral doping concentration (VLD), will also be described. On the JT a thin layer of a semi-insulating material is usually deposited, which acts as a passivation layer, reducing interface defects and contributing to increased device reliability. A thorough understanding of how the passivation layer properties affect the breakdown voltage and the leakage current of a fast-recovery diode is fundamental to preserve the ideal termination effect and provide a stable blocking capability. More recently, amorphous carbon, also called diamond-like carbon (DLC), has been used as a robust surface passivation material. By using a commercial TCAD tool, a detailed physical explanation of DLC electrostatic and transport properties has been provided. The proposed approach is able to predict the breakdown voltage and the leakage current of a negative beveled power diode passivated with DLC, as confirmed by successful validation against the available experiments. In addition, the VLD JT, proposed to overcome the limitations of the negative bevel architecture, has been simulated, showing a breakdown voltage very close to the ideal one with a much smaller area consumption. Finally, the effect of a low junction depth on the formation of current filaments has been analyzed by performing reverse-recovery simulations.
Abstract:
Understanding why market manipulation is conducted, under which conditions it is most profitable, and investigating the magnitude of these practices are crucial questions for financial regulators. Closing price manipulation induced by derivatives' expiration is the primary subject of this thesis. The first chapter provides a mathematical framework in continuous time to study the incentive to manipulate a set of securities induced by a derivative position. An agent holding a European-type contingent claim, depending on the price of a basket of underlying securities, is considered. The agent can affect the price of the underlying securities by trading on each of them before expiration. The elements of novelty are at least twofold: (1) a multi-asset market is considered; (2) the problem is solved by means of both classic optimisation and stochastic control techniques. Both linear and option payoffs are considered. In the second chapter an empirical investigation is conducted on the existence of expiration day effects on the UK equity market. Intraday data on FTSE 350 stocks over a six-year period from 2015 to 2020 are used. The results show that the expiration of index derivatives is associated with a rise in both trading activity and volatility, together with significant price distortions. The expiration of single stock options appears to have little to no impact on the underlying securities. The last chapter examines the existence of patterns in line with closing price manipulation of UK stocks on option expiration days. The main contributions are threefold: (1) this is one of the few empirical studies on manipulation induced by the options market; (2) proprietary equity orderbook and transaction data sets are used to define manipulation proxies, providing a more detailed analysis; (3) the behaviour of proprietary trading firms is studied. Despite industry concerns, no evidence is found of this type of manipulative behaviour.
Diffusive models and chaos indicators for non-linear betatron motion in circular hadron accelerators
Abstract:
Understanding the complex dynamics of beam-halo formation and evolution in circular particle accelerators is crucial for the design of current and future rings, particularly those utilizing superconducting magnets such as the CERN Large Hadron Collider (LHC), its luminosity upgrade HL-LHC, and the proposed Future Circular Hadron Collider (FCC-hh). A recent diffusive framework, which describes the evolution of the beam distribution by means of a Fokker-Planck equation with a diffusion coefficient derived from the Nekhoroshev theorem, has been proposed to describe the long-term behaviour of beam dynamics and particle losses. In this thesis, we discuss the theoretical foundations of this framework and propose an original measurement protocol based on collimator scans, with the aim of measuring the Nekhoroshev-like diffusion coefficient from beam-loss data. The available LHC collimator scan data, unfortunately collected without the proposed measurement protocol, have been successfully analysed using the proposed framework. This approach is also applied to datasets from detailed measurements of the impact of so-called long-range beam-beam compensators on beam losses, also at the LHC. Furthermore, dynamic indicators have been studied as a tool for exploring the phase-space properties of realistic accelerator lattices in single-particle tracking simulations. By first examining the classification performance of known and new indicators in detecting the chaotic character of initial conditions for a modulated Hénon map, and then applying this knowledge to study the properties of realistic accelerator lattices, we tried to identify a connection between the presence of chaotic regions in the phase space and Nekhoroshev-like diffusive behaviour, providing new tools to the accelerator physics community.
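The diffusive framework referred to above describes the evolution of the beam distribution ρ(I, t) in the action variable I with a Fokker-Planck equation. A minimal sketch of its standard form, with a Nekhoroshev-like diffusion coefficient, is given below; the parameters I* and κ, and the exact normalization of D, vary across the literature and are stated here only schematically:

```latex
\frac{\partial \rho(I,t)}{\partial t}
  = \frac{\partial}{\partial I}\!\left(\frac{D(I)}{2}\,
    \frac{\partial \rho(I,t)}{\partial I}\right),
\qquad
D(I) \propto \exp\!\left[-2\left(\frac{I_*}{I}\right)^{\frac{1}{2\kappa}}\right]
```

The exponential suppression of D(I) at small action is what produces the extremely slow, long-term diffusion of halo particles that the collimator-scan protocol is designed to probe.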
Abstract:
This thesis deals with the efficient solution of optimization problems of practical interest. The first part of the thesis deals with bin packing problems. The bin packing problem (BPP) is one of the oldest and most fundamental combinatorial optimization problems. The bin packing problem and its generalizations arise often in real-world applications, from the manufacturing industry, logistics and transportation of goods, to scheduling. After an introductory chapter, I will present two applications of two of the most natural extensions of bin packing: Chapter 2 is dedicated to an application of two-dimensional bin packing to a problem of scheduling a set of computational tasks on a computer cluster, while Chapter 3 deals with the generalization of the BPP to three dimensions that arises frequently in logistics and transportation, often complemented with additional constraints on the placement of items and the characteristics of the solution, such as guarantees on the stability of the items (to avoid potential damage to the transported goods), on the distribution of the total weight across the bins, and on compatibility with loading and unloading operations. The second part of the thesis, and in particular Chapter 4, considers the Transmission Expansion Problem (TEP), where an electrical transmission grid must be expanded so as to satisfy future energy demand at minimum cost, while maintaining some guarantees of robustness to potential line failures. These problems are gaining importance in a world where the shift towards renewable energy can impose a significant geographical reallocation of generation capacities, resulting in the necessity of expanding current power transmission grids.
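The one-dimensional BPP mentioned above can be illustrated with the textbook first-fit-decreasing heuristic. This Python sketch is a generic illustration, not one of the thesis's algorithms, and the item sizes and capacity are made up:

```python
def first_fit_decreasing(items, capacity):
    """Pack item sizes into bins of the given capacity using the classic
    first-fit-decreasing heuristic: sort items largest first, then place
    each item into the first bin with enough residual space, opening a
    new bin only when none fits."""
    bins = []  # each bin is a list of item sizes
    for size in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)
                break
        else:
            bins.append([size])
    return bins

packing = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
print(len(packing))  # → 2 bins for a total size of 20
```

The 2D and 3D generalizations treated in Chapters 2 and 3 replace the scalar capacity check with geometric placement constraints, which is where the additional stability and weight-distribution requirements enter.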
Abstract:
This thesis investigates, from both a theoretical and a computational point of view, the fundamental properties of phonons. To this end, the quantum models of Einstein and Debye are presented, which allow the analytical derivation of the main macroscopic observables of a solid, such as the average energy and the heat capacity. This is possible through a statistical-mechanical treatment based on the harmonic approximation of the normal vibrational modes of the lattice ions. Accordingly, the main results concerning the quantum harmonic oscillator are briefly reviewed at the outset. Subsequently, phonon dispersion and the vibrational density of states are examined for 1D and 3D crystal lattices. It is found that the former can be considered linear only in the long-wavelength limit, and that the latter can exhibit singular points related to the shape of the dispersion relation. Finally, ab initio computational analyses of the phonon dispersion, the vibrational density of states and the Debye frequency of carbon (diamond) were carried out with the VASP and Phonopy programs, comparing the results with experimental data from the literature.
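The Debye-model heat capacity discussed above can be sketched numerically. Below is a minimal pure-Python midpoint integration of the Debye integral, normalized to the Dulong-Petit limit; the Debye temperature of roughly 2230 K used for diamond is a commonly quoted literature value, not a result of this thesis:

```python
import math

def debye_heat_capacity(T, theta_D, n=10000):
    """Heat capacity C_V of the Debye model in units of 3*N*k_B
    (so the Dulong-Petit limit is 1), via midpoint integration of
    C_V/(3Nk_B) = 3 (T/theta_D)^3 * integral_0^{theta_D/T} of
    x^4 e^x / (e^x - 1)^2 dx."""
    if T <= 0:
        return 0.0
    x_max = theta_D / T
    dx = x_max / n
    integral = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        ex = math.exp(x)
        integral += x**4 * ex / (ex - 1.0)**2 * dx
    return 3.0 * (T / theta_D)**3 * integral

# Diamond's high Debye temperature (~2230 K) gives a heat capacity at
# room temperature well below the Dulong-Petit limit.
print(round(debye_heat_capacity(300.0, 2230.0), 3))
```

At low temperature the same integral reproduces the T³ law, while for T ≫ θ_D the result approaches 1 (the classical Dulong-Petit value in these units).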
Abstract:
Nowadays the idea of injecting world or domain-specific structured knowledge into pre-trained language models (PLMs) is becoming an increasingly popular approach for addressing problems such as biases, hallucinations, huge architectural sizes, and lack of explainability, all critical for real-world natural language processing applications in sensitive fields like bioinformatics. One recent work that has garnered much attention in neuro-symbolic AI is QA-GNN, an end-to-end model for multiple-choice open-domain question answering (MCOQA) tasks via interpretable text-graph reasoning. Unlike previous publications, QA-GNN mutually informs PLMs and graph neural networks (GNNs) on top of relevant facts retrieved from knowledge graphs (KGs). However, taking a more holistic view, existing PLM+KG contributions mainly consider commonsense benchmarks and ignore, or only shallowly analyze, performance on biomedical datasets. This thesis proposes a deep investigation of QA-GNN for biomedicine, comparing existing and brand-new PLMs, KGs, edge-aware GNNs, preprocessing techniques, and initialization strategies. By combining the insights that emerged from DISI's research, we introduce Bio-QA-GNN, which includes a KG. This work has led to an improvement over the state-of-the-art MCOQA models on biomedical/clinical text, largely outperforming the original one (+3.63\% accuracy on MedQA). Our findings also contribute to a better understanding of the degree of explanation allowed by joint text-graph reasoning architectures and of their effectiveness on different medical subjects and reasoning types. Code, models, datasets, and demos to reproduce the results are freely available at: \url{https://github.com/disi-unibo-nlp/bio-qagnn}.
Abstract:
The evolution of the historiography of psychology in Brazil is surveyed, to describe how the field has evolved from the seminal works of the pioneering, mostly self-taught, psychologists to the now professional historians working from a variety of theoretical models and methods of inquiry. The first accounts of the history of psychology written by Brazilians and by foreigners are surveyed, as well as recent works by researchers linked to the Work Group on the History of Psychology of the Brazilian Association of Research and Graduate Education in Psychology and published in periodicals such as Memorandum and Mnemosine. The present historiography focuses mainly on the relationship of psychological knowledge to specific social and cultural conditions, emphasizing themes such as women's participation in the construction of the field, the development of psychology as a science and as a profession in education and health, and the development of psychology as an expression of Brazilian culture and of the experience of resistance of local communities to domination. To reveal this process of identity construction, a cultural historiography is an important tool, coupled with methodological pluralism.
Abstract:
Demand response is an energy resource that has gained increasing importance in the context of competitive electricity markets and smart grids. New business models and methods designed to integrate demand response into electricity markets and smart grids have been published, reporting the need for additional work in this field. In order to adequately remunerate the participation of consumers in demand response programs, improved methods for evaluating consumers' performance are needed. The methodology proposed in the present paper determines the characterization of the baseline approach that best fits the consumer's historical consumption, in order to determine the expected consumption in the absence of participation in a demand response event and then determine the actual consumption reduction. The defined baseline can then be used to better determine the remuneration of the consumer. The paper includes a case study with real data to illustrate the application of the proposed methodology.
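One common family of baselines that a methodology like the one above would compare is the "High X of Y" rule. The following Python sketch is only an illustration of that generic rule; the rule choice, the numbers and the function name are assumptions, not the paper's selected baseline:

```python
def high_x_of_y_baseline(history, x=5, y=10):
    """Customer baseline load: the average of the x highest-consumption
    days among the y most recent non-event days (the generic
    'High X of Y' baseline rule)."""
    recent = history[-y:]
    top = sorted(recent, reverse=True)[:x]
    return sum(top) / len(top)

# Hypothetical daily consumption (kWh) on recent non-event days.
history = [20.0, 22.0, 19.5, 21.0, 23.0, 20.5, 24.0, 22.5, 21.5, 20.0]
baseline = high_x_of_y_baseline(history)

# The demand response reduction is credited as baseline minus the
# consumption actually measured during the event.
actual_during_event = 17.5
reduction = baseline - actual_during_event
print(round(reduction, 2))  # → 5.1
```

Selecting, among several such rules, the one that best fits each consumer's historical consumption is precisely the kind of characterization step the paper's methodology performs before computing the remunerated reduction.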
Abstract:
At the beginning of the 21st century, a new social arrangement of work poses a series of questions and challenges to scholars who aim to help people develop their working lives. Given the globalization of career counseling, we decided to address these issues and then to formulate potentially innovative responses in an international forum. We used this approach to avoid the difficulties of creating models and methods in one country and then trying to export them to other countries where they would be adapted for use. This article presents the initial outcome of this collaboration, a counseling model and methods. The life-designing model for career intervention endorses five presuppositions about people and their work lives: contextual possibilities, dynamic processes, non-linear progression, multiple perspectives, and personal patterns. Thinking from these five presuppositions, we have crafted a contextualized model based on the epistemology of social constructionism, particularly recognizing that an individual's knowledge and identity are the product of social interaction and that meaning is co-constructed through discourse. The life-design framework for counseling implements the theories of self-constructing [Guichard, J. (2005). Life-long self-construction. International Journal for Educational and Vocational Guidance, 5, 111-124] and career construction [Savickas, M. L. (2005). The theory and practice of career construction. In S. D. Brown & R. W. Lent (Eds.), Career development and counselling: putting theory and research to work (pp. 42-70). Hoboken, NJ: Wiley] that describe vocational behavior and its development. Thus, the framework is structured to be life-long, holistic, contextual, and preventive.
Abstract:
The aim of this master's thesis was to study how the dependency relationships between actors in a business process can be developed using a development method tailored specifically for this purpose, and what parts such a development method should contain. The method devised in this work is intended for the development of business processes already in use. The work began by studying the theoretical background of business processes and of existing methods for developing organizations and business processes. Earlier work on methods and models carried out by the commissioner of the thesis was also used in defining the content of the development method. The content of the method was limited to three phases: planning of the development project, analysis of the process and of the cooperation relationships between actors, and design and implementation of development solutions. The method was tested in two public-sector service processes, and on the basis of the experience gained it was refined into its final form. The actual results of the work were a development method tested in practice and several potential topics for further development concerning individual phases of the method. In addition, the development work also produced broader development ideas concerning the combination of process and network models, process management, and the coordination of development projects.