933 results for non-trivial data structures
Abstract:
The aim of this project is to configure and implement the data structures and to develop the applications needed to enable the exchange of information between two software environments, SAP R/3 and Knapp, each a leader in its field. Applying these changes will allow the organisation not only to centralise its information in the ERP, but also to improve its business processes and speed up decision-making by those responsible. A study of the current situation is carried out and, after a detailed analysis, a solution is proposed to achieve the stated objectives. Once the proposal has been designed, presented and approved, SAP R/3 is configured, the IDOC segments and types are defined, and the functions and programs that process the information sent by Knapp are coded. When these tasks are complete, test data sets for the business processes are prepared and run in a test environment, in collaboration with the key users, to verify the soundness of the implemented solution. The results are analysed and any shortcomings are corrected. Finally, all the changes are transported to the production system and the correct execution of the organisation's business processes is verified.
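As an illustration of the kind of processing involved, the sketch below parses a simplified IDOC-like flat file in Python. The fixed-width layout, field width and segment names (E1KNAPP_*) are hypothetical stand-ins, not the actual IDOC definitions used in the project.

# Hypothetical sketch: parsing a simplified IDOC-like flat file sent by the
# warehouse system. Record layout and segment names are illustrative only.
SEGMENT_NAME_WIDTH = 30  # assumed width of the segment-name field

def parse_idoc_records(lines):
    """Split each fixed-width record into (segment_name, payload)."""
    records = []
    for line in lines:
        segment = line[:SEGMENT_NAME_WIDTH].strip()
        payload = line[SEGMENT_NAME_WIDTH:].rstrip("\n")
        records.append((segment, payload))
    return records

if __name__ == "__main__":
    sample = [
        "E1KNAPP_HDR".ljust(30) + "ORDER0001  2024-01-15",
        "E1KNAPP_ITM".ljust(30) + "MAT-4711   QTY 12",
    ]
    for segment, payload in parse_idoc_records(sample):
        print(segment, "->", payload)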
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and systems of interacting agents as the fundamental abstractions for designing, developing and managing, at runtime, typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a cornerstone of the scientific activity. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage, still within the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models thus becomes fundamental for comparing and evaluating methodologies: a meta-model specifies the concepts, rules and relationships used to define a methodology. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; it is at least clear, however, that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions - entities of the environment encapsulating some functions - and topology abstractions - entities of the environment that represent its (either logical or physical) spatial structure.
In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology, in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
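A minimal sketch of the two ingredients named above, written in Python for illustration only (SODA itself prescribes no programming-language mapping); the class names and the two-layer example are hypothetical.

# Illustrative sketch: environment abstractions encapsulate functions,
# topology abstractions represent the (logical or physical) spatial structure.
class EnvironmentAbstraction:
    """An environment entity encapsulating some function of the system."""
    def __init__(self, name, function):
        self.name = name
        self.function = function  # callable encapsulated by the entity

    def invoke(self, *args):
        return self.function(*args)

class TopologyAbstraction:
    """A spatial node of the environment, linked to neighbouring nodes."""
    def __init__(self, name):
        self.name = name
        self.neighbours = set()

    def connect(self, other):
        self.neighbours.add(other)
        other.neighbours.add(self)

# A layered description: each layer offers a coarser view of the same system.
layers = {
    0: ["agent-level entities and environment abstractions"],
    1: ["society-level aggregates"],
}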
Abstract:
Trying to understand the importance that the Eugenio Morelli sanatorium village had in the past is essential to grasp what needs drove the construction of such a colossus on the slopes of the Alps. This is precisely where my thesis started: from historical research, from the analysis of the political, health-related and above all social developments that took place between the end of the nineteenth century and the first three decades of the twentieth, the period in which tuberculosis went from being a little-regarded disease, affecting only the poorest and most disadvantaged classes, forced to live in the overcrowded outskirts of the big cities, to being considered a true social scourge that had to be eradicated. This initial research did not neglect the architectural aspect, examining the evolution of sanatoria, first in Europe and then in Italy in particular, and assessing their stylistic, formal and above all functional features, including the many standardisations introduced under Fascism, which we find precisely in the sanatorium village. I then moved on to analyse the post-war works carried out on the village, such as the removal of the cableway and the construction of the pedestrian walkways directly connecting the various pavilions to one another, which were added in the 1970s, when the complex was converted to hospital use. This, however, happened for only one part of the complex, the part once assigned to the women's quarters, while the westernmost portion was left unused. The debate on how to exploit the thousands of square metres offered by the disused buildings has been taken up in various forums, but so far no new vocation has been found for them. The most widely supported solution is to hand over part of the village to a Swiss research company that would use it as a campus for its scientists. It is precisely with this possibility in mind, and taking into account the portion currently devoted to hospital activity, that the thesis project for the new pedestrian walkways was developed. These are meant to act as an element uniting the entire village and integrating with the multitude of routes already present in it, rationalising them with respect to the existing ones and transforming the individual pavilions from closed units, complete in themselves, into elements of a functional network that, with the reactivation of the cableways, runs through the whole village.
Abstract:
In this thesis we present our work on some generalisations of ideas, techniques and physical interpretations typical of integrable models to one of the most outstanding advances in theoretical physics of recent years: the AdS/CFT correspondences. We have undertaken the problem of testing this conjectured duality from various points of view, but with a clear starting point - integrability - and with a clear, ambitious task in mind: to study the finite-size effects in the energy spectrum of certain string solutions on one side and in the anomalous dimensions of the gauge theory on the other. Of course, the final goal would be the exact comparison between these two faces of the gauge/string duality. In a few words, the original part of this work consists in the application of well-known integrability technologies, largely borrowed from the study of relativistic (1+1)-dimensional integrable quantum field theories, to the highly non-relativistic and much more complicated case of the theories involved in the recently conjectured AdS5/CFT4 and AdS4/CFT3 correspondences. In detail, exploiting the spin-chain nature of the dilatation operator of N = 4 Super-Yang-Mills theory, we concentrated our attention on one of the most important sectors, namely the SL(2) sector - which is also very interesting for the understanding of QCD - by formulating a new type of nonlinear integral equation (NLIE) based on a previously conjectured asymptotic Bethe Ansatz. The solutions of this Bethe Ansatz are characterised by the length L of the corresponding spin chain and by the number s of its excitations. An NLIE allows one, at least in principle, to perform analytical and numerical calculations for arbitrary values of these parameters. The results have been rather exciting. In the important regime of high Lorentz spin, the NLIE reduces to a linear integral equation which governs the subleading order in s, O(s^0). This also holds in the regime L → ∞ with L/ln s finite (the case of long operators). This region of parameters has been particularly investigated in the literature, especially because of an intriguing limit onto the O(6) sigma model defined on the string side. One of the most powerful methods to keep the finite-size spectrum of an integrable relativistic theory under control is the so-called thermodynamic Bethe Ansatz (TBA). We proposed a highly non-trivial generalisation of this technique to the non-relativistic case of AdS5/CFT4 and made the first steps towards determining its full spectrum - of energies on the AdS side, of anomalous dimensions on the CFT side - at arbitrary values of the coupling constant and of the size. At leading order in the size parameter, the calculation of the finite-size corrections is much simpler and does not require the TBA; it consists in adapting to the non-relativistic case a method, first invented by Lüscher, for computing the finite-size effects on the mass spectrum of relativistic theories. We have thus formulated a new version of this approach, adapted to the recently found classical string solutions on AdS4 × CP3, within the new conjecture of an AdS4/CFT3 correspondence. Our results in part confirm the string and algebraic-curve calculations, and in part are completely new and may be better understood through the rapidly evolving developments of this extremely exciting research field.
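For orientation, the high-spin behaviour referred to above can be stated schematically (a standard result in this literature, quoted here only as background):

% Large Lorentz spin s: twist-operator anomalous dimensions grow
% logarithmically, the coupling dependence being carried by the cusp
% anomalous dimension f(g); the NLIE governs the O(s^0) correction:
\gamma(s) \simeq f(g)\,\ln s + O(s^0)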
Abstract:
The increasing precision of current and future experiments in high-energy physics requires a corresponding increase in the accuracy of theoretical predictions, in order to find evidence for possible deviations from the generally accepted Standard Model of elementary particles and interactions. Calculating the experimentally measurable cross sections of scattering and decay processes to a higher accuracy directly translates into including higher-order radiative corrections in the calculation. The large number of particles and interactions in the full Standard Model results in an exponentially growing number of Feynman diagrams contributing to any given process at higher orders. Additionally, the appearance of multiple independent mass scales makes even the calculation of single diagrams non-trivial. For over two decades now, the only way to cope with these issues has been to rely on the assistance of computers. The aim of the xloops project is to provide the necessary tools to automate the calculation procedures as far as possible, including the generation of the contributing diagrams and the evaluation of the resulting Feynman integrals. The latter is based on the techniques developed in Mainz for solving one- and two-loop diagrams in a general and systematic way using parallel/orthogonal space methods. These techniques involve a considerable amount of symbolic computation. During the development of xloops it was found that conventional computer algebra systems were not a suitable implementation environment. For this reason, a new system called GiNaC has been created, which allows the development of large-scale symbolic applications in an object-oriented fashion within the C++ programming language. This system, which is now also in use for other projects besides xloops, is the main focus of this thesis. The implementation of GiNaC as a C++ library sets it apart from other algebraic systems. Our results prove that a highly efficient symbolic manipulator can be designed in an object-oriented way, and that having a very fine granularity of objects is also feasible. The xloops-related parts of this work consist of a new implementation, based on GiNaC, of functions for calculating one-loop Feynman integrals that already existed in the original xloops program, as well as the addition of supplementary modules belonging to the interface between the library of integral functions and the diagram generator.
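The design point - a fine granularity of expression objects manipulated in an object-oriented language - can be suggested with a small sketch. GiNaC itself is a C++ library, so the Python below only mirrors the idea, not its actual API:

# Illustrative sketch of fine-grained expression objects (not GiNaC's C++ API):
# every node of an expression tree is a small object with its own behaviour.
class Expr:
    def diff(self, var):
        raise NotImplementedError

class Symbol(Expr):
    def __init__(self, name):
        self.name = name
    def diff(self, var):
        return Const(1 if var is self else 0)

class Const(Expr):
    def __init__(self, value):
        self.value = value
    def diff(self, var):
        return Const(0)

class Add(Expr):
    def __init__(self, left, right):
        self.left, self.right = left, right
    def diff(self, var):
        return Add(self.left.diff(var), self.right.diff(var))

class Mul(Expr):
    def __init__(self, left, right):
        self.left, self.right = left, right
    def diff(self, var):  # product rule, built from small objects
        return Add(Mul(self.left.diff(var), self.right),
                   Mul(self.left, self.right.diff(var)))

x = Symbol("x")
d = Mul(x, x).diff(x)  # d/dx (x*x) -> 1*x + x*1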
Abstract:
This dissertation analyses the middleware technologies CORBA (Common Object Request Broker Architecture), COM/DCOM (Component Object Model/Distributed Component Object Model), J2EE (Java 2 Enterprise Edition) and Web Services (including .NET) with respect to their suitability for tightly and loosely coupled distributed applications. In addition, primarily for CORBA, the dynamic CORBA components DII (Dynamic Invocation Interface) and IFR (Interface Repository) and the generic data types Any and DynAny (dynamic Any) are examined in detail. The goals are: a. to obtain concrete statements about these components and to determine in which settings these generic approaches are justified; b. to analyse the timing behaviour of the dynamic components with respect to obtaining information about unknown objects; c. to measure the timing behaviour of the dynamic components with respect to their communication; d. to measure and analyse the timing behaviour of creating generic data types and populating them with data; e. to measure and analyse the timing behaviour of constructing, at runtime, data types that are unknown, i.e. not described in IDL; f. to identify the advantages and disadvantages of the dynamic components, to define their fields of application and to compare them with other technologies such as COM/DCOM, J2EE and Web Services with respect to their capabilities; g. to make statements about tight and loose coupling. CORBA is chosen as a standardised and complete distribution platform for investigating the problems stated above. With regard to its dynamic behaviour, which at the time of this work had been investigated insufficiently or not at all, CORBA and Web Services point the way with respect to: a. working with unknown objects, which may well have implications for the development of intelligent software agents; b. the integration of legacy applications; c. the possibilities in connection with B2B (business-to-business). These problems also involve general questions about the marshalling/unmarshalling of data and the effort this requires, as well as general statements about the real-time capability of CORBA-based distributed applications. The results are then transferred, as far as admissible, to other technologies such as COM/DCOM, J2EE and Web Services. The comparisons of CORBA with DCOM, CORBA with J2EE and CORBA with Web Services show in detail the suitability of these technologies for loose and tight coupling. Furthermore, general concepts concerning the architecture and the optimisation of communication are derived from the results obtained. These recommendations apply without restriction to all the technologies investigated in the context of distributed processing.
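The flavour of the static-versus-dynamic timing comparison can be suggested with a generic sketch; no real ORB is involved, and both callees below are plain local Python stand-ins, so the numbers only illustrate the kind of measurement, not CORBA's actual DII overhead:

# Illustrative timing sketch: a stub-like static call versus a dynamic,
# reflection-based dispatch (analogous in spirit to a DII lookup).
import timeit

class Server:
    def echo(self, payload):
        return payload

server = Server()

def static_call():
    return server.echo("ping")      # operation known at "compile time"

def dynamic_call():
    op = getattr(server, "echo")    # operation discovered at runtime
    return op("ping")

print("static :", timeit.timeit(static_call, number=100_000))
print("dynamic:", timeit.timeit(dynamic_call, number=100_000))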
Abstract:
The association between celiac disease (CD) and dental enamel defects (DED) is well known. AIM: This study was designed to investigate the prevalence of DED in CD children and, specifically, to look for a possible correlation between DED and the period of gluten exposure, the clinical form of CD, and the HLA class II haplotype. MATERIALS AND METHODS: This study was designed as a matched case-control study: 374 children were enrolled (187 celiac and 187 non-celiac). Data about age at CD diagnosis, CD clinical form and HLA haplotype were recorded. RESULTS: DED were detected in 87 celiac subjects, while no dental lesions were found in the remaining 100 patients; in the 187 healthy controls enamel lesions were significantly less frequent (5.3% versus 46.5%; p < 0.005). We found a correlation between DED and the period of gluten exposure, since among CD patients the mean age at CD diagnosis was significantly (p = 0.0004) higher in the group with DED (3.41 ± 1.27) than in the group without DED (1.26 ± 0.7). DED were more frequent in the atypical and silent forms than in the typical one. The presence of the HLA DR52-53 and DQ7 antigens significantly increased the risk of DED (p = 0.0017). CONCLUSIONS: Our results confirm a possible correlation between CD clinical form, age at CD diagnosis, HLA antigens and DED. DED in CD children have a multifactorial origin, and further studies are needed to investigate other determinants.
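Using the reported proportions (46.5% of 187 celiac children versus 5.3% of 187 controls, i.e. 87 versus roughly 10 subjects with DED), a standard 2x2 test reproduces the order of magnitude of the quoted significance. The sketch below assumes control counts inferred from the reported percentage:

# Sketch: 2x2 contingency test on counts inferred from the reported
# percentages (87/187 celiac vs ~10/187 control children with DED).
from scipy.stats import chi2_contingency

table = [[87, 187 - 87],   # celiac:   DED yes / DED no
         [10, 187 - 10]]   # controls: DED yes / DED no
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")  # p far below the quoted 0.005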
Abstract:
The aim of this work is to explore, within the framework of the presumably asymptotically safe Quantum Einstein Gravity, quantum corrections to black hole spacetimes, in particular in the case of rotating black holes. We have analysed this problem by exploiting the scale-dependent Newton's constant implied by the renormalization group equation for the effective average action, and by introducing an appropriate "cutoff identification" which relates the renormalization scale to the geometry of the spacetime manifold. We used these two ingredients to "renormalization group improve" the classical Kerr metric, which describes the spacetime generated by a rotating black hole. We have focused our investigation on four basic subjects of black hole physics. The main results related to these topics can be summarized as follows. Concerning the critical surfaces, i.e. the horizons and static limit surfaces, the improvement leads to a smooth deformation of the classical critical surfaces; their number remains unchanged. In relation to the Penrose process for energy extraction from black holes, we have found that there exists a non-trivial correlation between the regions of negative energy states in the phase space of rotating test particles and the configurations of the critical surfaces of the black hole. As for the vacuum energy-momentum tensor and the energy conditions, we have shown that no model with "normal" matter, in the sense of matter fulfilling the usual energy conditions, can simulate the quantum fluctuations described by the improved Kerr spacetime that we have derived. Finally, in the context of black hole thermodynamics, we have calculated the mass and angular momentum of the improved Kerr black hole, applying the standard Komar integrals. The results reflect the antiscreening character of the quantum fluctuations of the gravitational field. Furthermore, we calculated approximations to the entropy and the temperature of the improved Kerr black hole to leading order in the angular momentum. More generally, we have proven that the temperature can no longer be proportional to the surface gravity if an entropy-like state function is to exist.
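Schematically, the two ingredients mentioned above take the following form, quoted here in the standard notation of the renormalization-group-improvement literature (the precise distance function d(x) embodies the author's choice of cutoff identification):

% Running Newton constant from the effective average action, and the
% cutoff identification relating the RG scale k to the geometry:
G(k) = \frac{G_0}{1 + \omega\, G_0\, k^2}\,, \qquad k = \frac{\xi}{d(x)}\,,
% so that the improved Kerr metric is obtained by the substitution
% G_0 \to G(k(x)) in the classical solution.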
Abstract:
The research has a twofold objective: 1. to test, by applying a traditional theoretical method of economic-financial analysis, the optimal level of financial equilibrium between access to external credit and equity capital; 2. to show the usefulness of certain participatory financial instruments for the recapitalisation of the cooperative firm. The object of study is the cooperative firm engaged in one or more stages of the processing, transformation and first marketing of the agricultural product contributed by its members, compared with joint-stock companies carrying out the same activity. The cooperative and the capitalistic firm will therefore be analysed in terms of liquidity generated, profitability produced and degree of indebtedness, through the calculation and analysis of a series of ratios drawn from their respective financial statements. It should be stressed that space will be devoted to the search for value in the cooperative firm, understood as an expression of the wealth created by business processes over a given period of time, attempting to determine whether an optimal financial structure exists, i.e. a specific ratio between financial debt and equity that maximises the value of the firm. Attention to the financial structure will therefore not be limited to the explicit cost of debt or equity, but will also extend to the implications of financing choices for the governance of the firm. Indeed, many studies in business economics, and in particular in business management and corporate finance, have addressed corporate governance as an element capable of contributing to value creation not only through the selection of investment projects but also through the composition of the financial structure.
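The ratio analysis described above can be illustrated with a minimal sketch; the balance-sheet figures and the particular ratio set below are invented for illustration, not taken from the thesis:

# Illustrative sketch: liquidity / profitability / leverage ratios used to
# compare the cooperative with capitalistic peers. Figures are invented.
def ratios(current_assets, current_liabilities, net_income, equity, debt):
    return {
        "current_ratio": current_assets / current_liabilities,  # liquidity
        "roe": net_income / equity,                             # profitability
        "debt_to_equity": debt / equity,                        # indebtedness
    }

print(ratios(current_assets=1200, current_liabilities=800,
             net_income=90, equity=600, debt=900))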
Abstract:
In this thesis we investigate some properties of one-dimensional quantum systems. From a theoretical point of view, quantum models in one dimension are particularly interesting because they are strongly interacting, since particles cannot avoid each other in their motion and collisions can never be ignored. Yet integrable models often yield new and non-trivial solutions which could not be found perturbatively. In this dissertation we focus on two important aspects of integrable one-dimensional models: their entanglement properties at equilibrium and their dynamical correlators after a quantum quench. The first part of the thesis is therefore devoted to the study of the entanglement entropy in one-dimensional integrable systems, with a special focus on the XYZ spin-1/2 chain, which, in addition to being integrable, is also an interacting model. We derive its Rényi entropies in the thermodynamic limit and analyse their behaviour in the different phases and for different values of the mass gap. In the second part of the thesis we instead study the dynamics of correlators after a quantum quench, which provide a powerful tool to measure how perturbations and signals propagate through a quantum chain. The emphasis is on the Transverse Field Ising Chain and the O(3) non-linear sigma model, both of which are studied by means of a semi-classical approach. Moreover, in the last chapter we demonstrate a general result about the dynamics of correlation functions of local observables after a quantum quench in integrable systems. In particular, we show that if there are no long-range interactions in the final Hamiltonian, then the dynamics of the model (non-equal-time correlations) is described by the same statistical ensemble that describes its static properties (equal-time correlations).
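For reference, the Rényi entropies studied in the first part are defined, for a subsystem A with reduced density matrix \rho_A, by the standard expressions

S_\alpha(\rho_A) = \frac{1}{1-\alpha}\,\ln \operatorname{Tr} \rho_A^{\alpha},
\qquad
\lim_{\alpha \to 1} S_\alpha(\rho_A) = -\operatorname{Tr}\,\rho_A \ln \rho_A ,

the \alpha → 1 limit reproducing the von Neumann entanglement entropy.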
Abstract:
This thesis deals with algebraic cycles on complex abelian varieties of dimension 4. The aim of the thesis is to construct a non-trivial element in $\mathrm{Griff}^{3,2}(A^4)$. Here $A^4$ denotes the \emph{generic} abelian variety of dimension 4 with a polarisation of type $(1,2,2,2)$. The first three chapters review elementary definitions and notions, thereby fixing the notation. In them we recall elementary properties of the filtrations $F_S$ and $Z$ on the Chow groups defined by Saito (cf. \cite{Sa0} and \cite{Sa}). We also recall a relation, originating in \cite{Mu}, between the $F_S$-filtration and Beauville's decomposition of the Chow groups (cf. \cite{Be2} and \cite{DeMu}). The most important notions in this part are the \emph{higher Griffiths groups} and the \emph{infinitesimal invariants of higher order}. We then turn to \emph{generalised Prym varieties} associated with $(2:1)$ coverings of curves. We give their construction and important geometric properties, and compute the type of their polarisation. Chapter \ref{p-moduli} contains a result from \cite{BCV} on the dominance of the map $p(3,2)\colon \mathcal{R}(3,2)\longrightarrow \mathcal{A}_4(1,2,2,2)$. This result is relevant for us because it states that the generic abelian variety of dimension 4 with a polarisation of type $(1,2,2,2)$ is a generalised Prym variety associated with a $(2:1)$ covering of a curve of genus 7 over a curve of genus 3. The second part of the dissertation is the actual work and is structured as follows. Chapter \ref{Deg} contains the construction of the degeneration of $A^4$; that is, in this chapter we construct a family $X\longrightarrow S$ of generalised Prym varieties such that the classifying map $S\longrightarrow \mathcal{A}_4(1,2,2,2)$ is dominant. Furthermore, a relative cycle $Y/S$ on $X/S$ is constructed, together with a subvariety $T\subset S$ such that we can give an explicit description of the embedding $Y\vert_T \hookrightarrow X\vert_T$. The last and most important chapter contains the following: we prove that the \emph{infinitesimal invariant of second order} $\delta_2(\alpha)$ of $\alpha$ is non-trivial, where $\alpha$ denotes the component of $Y$ in $\mathrm{Ch}^3_{(2)}(X/S)$ under the Beauville decomposition. With this, and with the help of the results of Chapter \ref{Cohm}, we can show that \[ 0 \neq [\alpha] \in \mathrm{Griff}^{3,2}(X/S). \] We can refine this statement and show (cf. Theorem \ref{a4}): \begin{theorem}\label{maintheorem} For generic $s\in S$, \[ 0 \neq [\alpha_s] \in \mathrm{Griff}^{3,2}(A^4), \] where $A^4$ is the generic abelian variety of dimension $4$ with a polarisation of type $(1,2,2,2)$. \end{theorem}
Abstract:
The efficient emulation of a many-core architecture is a challenging task: each core could be emulated by a dedicated thread, and such threads would be interleaved on either a single-core or a multi-core processor, but the high number of context switches would result in unacceptable performance. To support this kind of application, the computational power of the GPU is exploited in order to schedule the emulation threads on the GPU cores. This presents a non-trivial divergence issue, since GPU computational power is offered through SIMD processing elements, which are forced to synchronously execute the same instruction on different memory portions. Thus, a new emulation technique is introduced to overcome this limitation: instead of providing a routine for each ISA opcode, the emulator mimics the behaviour of the micro-architecture level, where instructions are data that a single generic routine takes as input. Our new technique has been implemented and compared with the classic emulation approach, in order to assess the viability of a hybrid solution.
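A minimal sketch of the idea in Python (the GPU/SIMD mapping is not shown, and the opcode encoding is invented): rather than one routine per opcode, a single generic routine interprets each instruction as data, so every emulated core follows the same code path and per-opcode divergence disappears.

# Sketch: instructions as data consumed by one generic routine.
# Opcode table and instruction encoding are invented for illustration.
import operator

ALU = {0: operator.add, 1: operator.sub, 2: operator.mul}

def step(regs, instr):
    """One generic routine: decode the instruction tuple and execute it."""
    opcode, dst, src_a, src_b = instr
    regs[dst] = ALU[opcode](regs[src_a], regs[src_b])

regs = [5, 3, 0, 0]
program = [(0, 2, 0, 1),   # r2 = r0 + r1
           (2, 3, 2, 2)]   # r3 = r2 * r2
for instr in program:
    step(regs, instr)
print(regs)  # [5, 3, 8, 64]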
Abstract:
A permutation is said to avoid a pattern if it does not contain any subsequence which is order-isomorphic to it. Donald Knuth, in the first volume of his celebrated book "The Art of Computer Programming", observed that the permutations that can be computed (or, equivalently, sorted) by certain particular data structures can be characterized in terms of pattern avoidance. In more recent years the topic has been reopened several times, though often in terms of sortable permutations rather than computable ones. The idea of sorting permutations by using one of Knuth's devices suggests looking for a deterministic procedure that decides, in linear time, whether there exists a sequence of operations able to convert a given permutation into the identity. In this thesis we show that, for the stack and the restricted deques, there exists a unique way to implement such a procedure. Moreover, we use these sorting procedures to create new sorting algorithms, and we prove some unexpected commutation properties between these procedures and the base step of bubblesort. We also show that the permutations that can be sorted by a combination of the base step of bubblesort and its dual can be expressed, once again, in terms of pattern avoidance. In the final chapter we give an alternative proof of some enumerative results, in particular for the classes of permutations that can be sorted by the two restricted deques. It is well known that the permutations that can be sorted through a restricted deque are counted by the Schröder numbers. In the thesis, we show how the deterministic sorting procedures yield a bijection between sortable permutations and Schröder paths.
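For the stack case, the deterministic procedure admits a compact linear-time sketch (a standard formulation, not the thesis's own code): pop whenever the next value needed for the identity output sits on top of the stack; the permutation is sortable iff the stack empties.

# Linear-time check of stack-sortability (equivalently, 231-avoidance):
# push each element, then pop while the top is the next needed value.
def stack_sortable(perm):
    stack, needed = [], 1
    for x in perm:
        stack.append(x)
        while stack and stack[-1] == needed:
            stack.pop()
            needed += 1
    return not stack  # sortable iff everything was output in order

assert stack_sortable([3, 1, 2])      # avoids 231 -> sortable
assert not stack_sortable([2, 3, 1])  # contains 231 -> not sortable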
Abstract:
The formation of a market price for an asset can be understood as a superposition of the individual actions of the market participants, which cumulatively generate supply and demand. This is comparable to the emergence of macroscopic properties in statistical physics, which are brought about by microscopic interactions between the system components involved. The distribution of price changes on financial markets differs markedly from a Gaussian distribution. This leads to empirical peculiarities of the price process, which include, besides the scaling behaviour, non-trivial correlation functions and temporally clustered volatility. The present work focuses on the analysis of financial market time series and the correlations they contain. A new method for quantifying pattern-based complex correlations in a time series is developed. Using this methodology, significant evidence is found that typical behavioural patterns of financial market participants manifest themselves on short time scales; that is, the reaction to a given price history is not purely random; rather, similar price histories provoke similar reactions. Starting from the investigation of complex correlations in financial market time series, the question is addressed of which properties change at the transition from a positive trend to a negative trend. An empirical quantification by means of rescaling yields the result that, independently of the time scale considered, new price extrema are accompanied by an increase in transaction volume and a reduction of the time intervals between transactions. These dependencies exhibit characteristics that are also found in other complex systems in nature, and in physical systems in particular. Over nine orders of magnitude in time, these properties are also independent of the market analysed: trends that persist only for seconds show the same characteristics as trends on time scales of months. This opens up the possibility of learning more about financial market bubbles and their collapses, since trends on short time scales occur far more frequently. In addition, a Monte Carlo based simulation of the financial market is analysed and extended in order to reproduce the empirical properties and to gain insight into their causes, which are to be sought partly in the financial market microstructure and partly in the risk aversion of the trading participants. For the computationally intensive methods, a substantial reduction in computing time is achieved by parallelisation on a graphics card architecture. To demonstrate the wide range of applications of graphics cards, a standard model of statistical physics - the Ising model - is also ported to the graphics card with significant runtime gains. Partial results of this work are published in [PGPS07, PPS08, Pre11, PVPS09b, PVPS09a, PS09, PS10a, SBF+10, BVP10, Pre10, PS10b, PSS10, SBF+11, PB10].
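As a CPU-side reference point for the GPU port mentioned above, a minimal Metropolis sweep for the 2D Ising model looks as follows (the textbook algorithm, not the thesis code; lattice size and temperature are arbitrary):

# Minimal 2D Ising Metropolis sweep (standard algorithm; illustrative only).
import numpy as np

rng = np.random.default_rng(0)
L, beta = 32, 0.4
spins = rng.choice([-1, 1], size=(L, L))

def sweep(spins, beta):
    for _ in range(spins.size):
        i, j = rng.integers(0, L, size=2)
        # Energy change of flipping spin (i, j), periodic boundaries, J = 1.
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

sweep(spins, beta)
print("magnetisation:", spins.mean())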
Abstract:
Chiroptical spectroscopies play a fundamental role in pharmaceutical analysis for the stereochemical characterisation of bioactive molecules, due to the close relationship between chirality and optical activity and the increasing evidence of stereoselectivity in the pharmacological and toxicological profiles of chiral drugs. The correlation between chiroptical properties and absolute stereochemistry, however, requires the development of accurate and reliable theoretical models. The present thesis will report the application of theoretical chiroptical spectroscopies in the field of drug analysis, with particular emphasis on the huge influence of conformational flexibility and solvation on chiroptical properties and on the main computational strategies available to describe their effects by means of electronic circular dichroism (ECD) spectroscopy and time-dependent density functional theory (TD-DFT) calculations. The combination of experimental chiroptical spectroscopies with state-of-the-art computational methods proved to be very efficient at predicting the absolute configuration of a wide range of bioactive molecules (fluorinated 2-arylpropionic acids, β-lactam derivatives, difenoconazole, fenoterol, mycoleptones, austdiol). The results obtained for the investigated systems showed that great care must be taken in describing the molecular system in the most accurate fashion, since chiroptical properties are very sensitive to small electronic and conformational perturbations. In the future, the improvement of theoretical models and methods, such as ab initio molecular dynamics, will benefit pharmaceutical analysis in the investigation of non-trivial effects on the chiroptical properties of solvated systems and in the characterisation of the stereochemistry of complex chiral drugs.
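The standard way conformational flexibility enters such calculations is Boltzmann averaging of the per-conformer spectra. The sketch below illustrates this step with invented relative energies and spectra; in practice both would come from the TD-DFT calculations described above.

# Sketch: Boltzmann-weighted average of per-conformer ECD spectra.
# Energies (kcal/mol) and spectral traces are invented for illustration.
import numpy as np

RT = 1.987204e-3 * 298.15  # kcal/mol at 298 K

def boltzmann_average(energies, spectra):
    """Weight each conformer spectrum by exp(-dE/RT), normalised."""
    e = np.asarray(energies, dtype=float)
    w = np.exp(-(e - e.min()) / RT)
    w /= w.sum()
    return np.tensordot(w, np.asarray(spectra), axes=1)

# Two hypothetical conformers: relative energies and ECD intensities.
avg = boltzmann_average([0.0, 0.8],
                        [[1.0, -2.0, 0.5], [0.2, -0.5, 1.5]])
print(avg)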