928 results for proof-of-concept
Abstract:
Most financial models assume that markets operate efficiently. Consequently, testing the efficient market hypothesis, and proving or disproving it, has been a central topic of much scientific research; to date, these attempts have remained inconclusive. The tests traditionally started from the price evolution of an asset and analyzed the returns derived from it. In recent years, however, the focus has shifted from price evolution to a more elementary factor: the limit order book. On order-driven markets, after all, prices are ultimately determined by the orders submitted to the limit order book. Since a substantial share of stock exchanges operate as order-driven markets, researchers considered it worthwhile to analyze the statistical properties of the limit order book instead, in the hope that this would bring them closer to a proof or disproof of the efficient market hypothesis. The purpose of this study is to summarize, on the basis of the scientific literature published so far, the fundamental statistical properties of the limit order book, and to assess whether this work has in fact contributed to proving or disproving the efficient market hypothesis.
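Since the abstract turns on how prices emerge from the order book, a minimal sketch of a price-time-priority limit order book may help; the class and method names below are purely illustrative, not drawn from the study.

```python
import heapq

class LimitOrderBook:
    """Minimal price-time-priority limit order book (illustrative only)."""
    def __init__(self):
        self.bids = []   # max-heap via negated prices: (-price, seq, size)
        self.asks = []   # min-heap: (price, seq, size)
        self.seq = 0     # arrival order breaks ties at equal prices

    def submit(self, side, price, size):
        self.seq += 1
        if side == "buy":
            heapq.heappush(self.bids, (-price, self.seq, size))
        else:
            heapq.heappush(self.asks, (price, self.seq, size))

    def best_bid(self):
        return -self.bids[0][0] if self.bids else None

    def best_ask(self):
        return self.asks[0][0] if self.asks else None

    def mid_price(self):
        bid, ask = self.best_bid(), self.best_ask()
        return (bid + ask) / 2 if bid is not None and ask is not None else None

book = LimitOrderBook()
book.submit("buy", 99.5, 100)
book.submit("sell", 100.5, 50)
print(book.mid_price())  # 100.0: the price emerges from the orders in the book
```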
Abstract:
The fundamental theorem of asset pricing states, somewhat loosely, that a securities market admits no arbitrage if and only if there exists a probability measure, equivalent to the original one, under which the process describing the securities' prices is, in a suitable sense, a "martingale". The first statement of this kind was proved by M. Harrison and S. R. Pliska for the case of a finitely generated probability space. The theorem has since been generalized in many directions. One of the best known generalizations is the Dalang–Morton–Willinger theorem, which starts from a completely general probability space but assumes that the time parameter is discrete and the time horizon is finite. A number of continuous-time versions of the theorem have also appeared. The fundamental theorem in the general case, that is, when the probability space is completely general and the process describing the market prices of the securities is a locally bounded semimartingale, was proved by Delbaen and Schachermayer. The Delbaen–Schachermayer fundamental theorem is a very general statement of its kind. Its proof is quite lengthy and relies on deep results from functional analysis and from the general theory of stochastic processes. Much of the latter field was developed by P. A. Meyer and the mathematicians of the French school in Strasbourg from the late 1960s onwards. Understanding the area is therefore made considerably harder by the fact that the mathematical apparatus involved is relatively recent, and part of it is available only in French. In our view, the original 1994 proof by Delbaen and Schachermayer is accessible only to a few. To our knowledge, no textbook treatment of the theorem has appeared since, even though the statement itself has become widely known in economics circles and the original paper is cited by many authors. The proof presented here is based on the writings of Delbaen and Schachermayer between 1992 and 2006. The Delbaen–Schachermayer theorem is one of the deepest results of mathematical finance. In this article we have tried to rethink and slightly simplify the original proof so as to make it understandable to nonspecialists who are familiar with the general theory of stochastic processes. We give a detailed proof of the theorem, along with new proofs of some of the statements it relies on.
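For orientation, a standard statement of the discrete-time, finite-horizon version mentioned above (the Dalang–Morton–Willinger theorem) can be sketched as follows; the notation is ours, not the article's.

```latex
\textbf{Theorem (Dalang--Morton--Willinger, informal).}
Let $(S_t)_{t=0,\dots,T}$ be an adapted, $\mathbb{R}^d$-valued price process
with $T<\infty$, and let the gain of a predictable strategy $H$ be
\[
  (H\cdot S)_T \;=\; \sum_{t=1}^{T} H_t\,(S_t - S_{t-1}).
\]
Then the market admits no arbitrage, i.e.
$\{(H\cdot S)_T : H\ \text{predictable}\} \cap L^0_{+} = \{0\}$,
if and only if there exists a probability measure $Q \sim P$ under which $S$
is a martingale:
\[
  \mathbb{E}_Q\!\left[\,S_t \mid \mathcal{F}_{t-1}\,\right] = S_{t-1},
  \qquad t = 1,\dots,T.
\]
```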
Abstract:
The article compares the conclusions of the models of Oliver Hart and his co-authors with Williamson's views on transaction costs. It shows that, on the question of firm versus market, the two schools use different tools but argue along similar lines. The article presents Williamson's criticism of Hart, namely that bargaining in Hart's models carries no transaction costs, as well as the criticism of that criticism. Hart's ideas are supported by the recently developed reference-point theory within the property-rights school, which also offers experimental means of testing the various assumptions.
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of the approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) this inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data and consequently undermines the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and suffer from inherent calibration and noise effects, or to software techniques based on filtering out the binning effect, which fail to preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is sufficiently promising that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that allows researchers in the field of flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, a proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect at the experimental assessment level, providing a data platform that preserves statistical content. In addition, a novel method for accumulating log-transformed data was developed. This method uses the properties of transformations of statistical distributions to accumulate the output histogram in a non-integer, multi-channel fashion. Although the mathematics of this new mapping technique may seem intricate, the concise nature of the derivations allows for an implementation that lends itself to real-time execution using lookup tables, a task also introduced in this dissertation.
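As a rough illustration of lookup-table accumulation of log-transformed data in a "non-integer and multi-channel fashion" (our sketch, not the dissertation's patented method), each linear channel's counts can be split fractionally between the two log-scale bins that its transformed value straddles:

```python
import numpy as np

def build_log_lut(n_linear=1024, n_log=256, decades=4.0):
    """Precompute, for each linear channel, a fractional log-bin position.

    Returns (lo_bin, hi_bin, hi_weight): each channel's counts are split
    between two adjacent log bins in proportion to the fractional position,
    rather than being dumped into a single integer bin.
    """
    ch = np.arange(1, n_linear + 1, dtype=float)    # avoid log(0)
    pos = np.log10(ch / ch[-1]) / decades + 1.0     # map onto [~0, 1]
    pos = np.clip(pos, 0.0, 1.0) * (n_log - 1)      # fractional bin index
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n_log - 1)
    return lo, hi, pos - lo                          # hi gets weight (pos - lo)

def accumulate(linear_counts, lut, n_log=256):
    lo, hi, w = lut
    out = np.zeros(n_log)
    np.add.at(out, lo, linear_counts * (1.0 - w))   # non-integer, multi-channel
    np.add.at(out, hi, linear_counts * w)
    return out

lut = build_log_lut()
counts = np.random.poisson(5.0, size=1024).astype(float)
log_hist = accumulate(counts, lut)
assert np.isclose(log_hist.sum(), counts.sum())     # total event count preserved
```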
Abstract:
This dissertation analyzes various types of non-canonical texts authored by women from a wide spectrum of classes and races in the Spanish colonies. The female voice, generally absent from official colonial documents of the sixteenth, seventeenth and eighteenth centuries, left a gap in the complex subject of women's history and social participation. Through the study of personal letters, autobiographies, journals, court documents, inquisitorial transcripts, wills and testaments, edicts, orders, proclamations and posters, that voice is recovered. The Indigenous, Spanish and African women, and their descendants, who lived during this period thus left their written legacy and proof of their participation. Beginning with a thorough history of native women's interest in writing, this study focuses on how women of all social levels used the few means of writing available to them to produce a testimonial, critical and sometimes fictional narrative of their surroundings. This investigation concludes that it is necessary to change the traditional image of the passive women of the colonies, subjected to patriarchal authority and unable to speak or grow on their own. The documents under study introduce women who were able to represent themselves as followers of tradition while their writings denied that very claim. They passed from the private arena to the public one with discourses that confessed their innermost feelings and concerns, challenged the authority of the Inquisitor or the Governor, exposed their sexual freedom and transvestite narratives, successfully developed stratagems that challenged the official ideology of the oppressive religious environment, and established their own authority, reaching at last the freedom of their souls.
Abstract:
In my thesis, "Commandeering Aesop's Bamboo Canon: A 19th Century Confederacy of Creole Fugitive Fables," I ask and answer the "Who? What? Where? When? Why?" of Creole literature, using the 19th-century production of Aesopian fables as clues to resolve a set of linguistic, historical, literary, and geographical enigmas concerning the 'birth-place(s)' of Creolophone literatures in the Caribbean Sea, North and South America, and the Indian Ocean. Focusing on the fables of Martinique (1846), Reunion Island (1826), and Mauritius (1822), my thesis should be read as an attempt to capture the links between these islands through the creation of a particular archive, defined as a cartulary-chronicle, a diplomatic codex, or simply a map, in which I chart and trace the flight of the founding documents relating to the lives of the individual authors, editors, and printers, in order to illustrate the articulation of a formal and informal confederation that enabled the global and local institutional promotion of Creole literature. While I integrate various genres and multi-polar networks between the authors of this 19th-century canon of sacred and secular texts, such as proclamations, catechisms, and proverbs, the principal literary genre charted in my thesis is the collection of fables inspired by the 17th-century French Classical fabulist Jean de La Fontaine. Often described as the 'matrix' of Creolophone literature, these blues and fables constitute the base of the canon, and are usually described as 'translated,' 'adapted,' or even 'cross-dressed' into Creole in all of the French Creolophone spaces. My documentation of their transnational sprouting offers proof of an opaque canonical formation of Creole popular literature. By constituting this archive, I emphasize that despite 200 years of critical reception, and despite major developments and discoveries on the part of Creole-language pedagogues, literary scholars, linguists, historians, librarians, archivists, and museum curators, no one has until now curated this literature as a formal canon. I also offer new empirical evidence in order to address the enigma of "How?" the fables materially circulated between the islands, and I seek to come to terms with the anonymous nature of the texts, some of which were published under pseudonyms. I argue that part of the confusion on the part of scholars has resulted from being willfully taken by surprise or defrauded by the authors, or 'bamboozled,' as I put it. The major paradigmatic shift in my thesis is that while I acknowledge La Fontaine as the base of this literary canon, I ultimately bypass him to trace the ancient literary genealogy of fables back to the infamous Aesop the Phrygian, whose biography, the first of a slave in the history of the world, and whose subsequent use of fables reflect a 'hidden transcript' of 'masked political critique' between 'master and slave classes' in fourth-century B.C.E. Greece.
This archive draws on, connects, and critiques the methodologies of several disciplinary fields. I use post-colonial literary studies to map the literary genealogies of Aesop; a comparative historical approach to the abolitions of slavery in the 19th-century Caribbean and Indian Ocean; and musicology and performance studies to chart the early appearance of folk music in colonial societies. Through sociolinguistics and theories of language revival, ecology, and change, I develop an approach of 'reflexive Creolistics' that I hope will offer new educational opportunities to Creole speakers. While it is my desire that this archive serve linguists, book collectors, and historians in further scientific inquiry into the innately international nature of Creole language, I also hope that this innovative material defense and illustration of Creole literature will transform the consciousness of Creolophones (native and non-native) who likewise remain 'bamboozled' by the archive. My goal is to erase the 'unthinkability' of the existence of this ancient maritime Creole literary canon from the collective cultural imaginary of readers around the globe.
Abstract:
Stroke is a leading cause of death and permanent disability worldwide, affecting millions of individuals. Traditional clinical scores for assessing stroke-related impairments are inherently subjective and limited by inter-rater and intra-rater reliability, as well as by floor and ceiling effects. In contrast, robotic technologies provide objective, highly repeatable tools for quantifying neurological impairments following stroke. KINARM is an exoskeleton robotic device that assesses sensorimotor, proprioceptive and cognitive brain function by means of a battery of behavioral tasks, making it particularly useful for assessing neurological impairments following stroke. This thesis introduces a computational framework for assessing neurological impairments using the data provided by KINARM, pursuing two main objectives. The first is to investigate how robotic measurements can be used to estimate current and future ability to perform daily activities in subjects with stroke. We are able to predict clinical scores related to activities of daily living, at present and future time points, using a set of robotic biomarkers. The findings of this analysis provide a proof of principle that robotic evaluation can be an effective tool for clinical decision support and target-based rehabilitation therapy. The second main objective is to address the emerging problem of long assessment times, which can lead to fatigue when assessing subjects with stroke. To address this issue, we examine two time-reduction strategies. The first focuses on task selection: KINARM tasks are arranged in a hierarchical structure so that an earlier task in the assessment procedure can be used to decide whether subsequent tasks should be performed. The second focuses on shortening the two longest individual KINARM tasks. Both strategies are shown to provide significant time savings, ranging from 30% to 90% for task selection and around 50% for individual task reduction, thereby establishing a framework for reducing assessment time on a broader set of KINARM tasks. All in all, the findings of this thesis establish an improved platform for diagnosis and prognosis of stroke using robot-based biomarkers.
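As a minimal illustration of the first objective (estimating clinical scores from robotic measurements), a regression sketch of this general kind follows; the biomarkers and data below are simulated stand-ins, not KINARM's actual output format or the thesis's model:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Hypothetical robotic biomarkers per subject (e.g., reach error, reaction
# time, position-sense variability) and a clinical score for daily activities.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))                       # 60 subjects, 3 biomarkers
y = 50 + X @ np.array([-8.0, -5.0, -3.0]) + rng.normal(scale=4.0, size=60)

model = Ridge(alpha=1.0)                           # regularized linear model
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.2f}")     # predictive value of biomarkers
```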
Abstract:
In the context of products from certain regions or countries being banned because of an identified or non-identified hazard, proof of geographical origin is essential with regard to feed and food safety issues. Usually, the product labeling of an affected feed lot shows origin, and the paper documentation shows traceability. Incorrect product labeling is common in embargo situations, however, and alternative analytical strategies for controlling feed authenticity are therefore needed. In this study, distillers' dried grains and solubles (DDGS) were chosen as the product on which to base a comparison of analytical strategies aimed at identifying the most appropriate one. Various analytical techniques were investigated for their ability to authenticate DDGS, including spectroscopic and spectrometric techniques combined with multivariate data analysis, as well as proven techniques for authenticating food, such as DNA analysis and stable isotope ratio analysis. An external validation procedure (called the system challenge) was used to analyze sample sets blind and to compare analytical techniques. All the techniques were adapted so as to be applicable to the DDGS matrix. They produced positive results in determining the botanical origin of DDGS (corn vs. wheat), and several of them were able to determine the geographical origin of the DDGS in the sample set. The maintenance and extension of the databanks generated in this study through the analysis of new authentic samples from a single location are essential in order to monitor developments and processing that could affect authentication.
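To make the phrase "spectroscopic and spectrometric techniques combined with multivariate data analysis" concrete, here is a minimal, hypothetical chemometric pipeline on simulated spectra; it is a generic sketch, not the validated techniques or databanks of the study above:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Simulated spectra: 80 DDGS samples x 500 wavenumbers, two botanical origins.
rng = np.random.default_rng(1)
corn = rng.normal(0.0, 1.0, size=(40, 500)) + np.linspace(0, 0.5, 500)
wheat = rng.normal(0.0, 1.0, size=(40, 500)) - np.linspace(0, 0.5, 500)
X = np.vstack([corn, wheat])
y = np.array(["corn"] * 40 + ["wheat"] * 40)

# Dimensionality reduction followed by a linear classifier: a common
# chemometric pattern for origin authentication from spectra.
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {acc.mean():.2f}")
```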
Abstract:
In this paper, we describe how the Pathfinder algorithm converts relatedness ratings of concept pairs into concept maps; we also present how this algorithm has been used to develop the Concept Maps for Learning website (www.conceptmapsforlearning.com) based on the principles of effective formative assessment. Pathfinder networks, a network representation tool, are claimed to help students memorize and recall relations between concepts better than spatial representation tools (such as multidimensional scaling) do. Pathfinder networks have therefore been used in various studies on knowledge structures, including the identification of students' misconceptions. To accomplish this, each student's knowledge map and the expert knowledge map are compared via the Pathfinder software, and the differences between the maps are highlighted. Once misconceptions are identified, however, the Pathfinder software fails to provide any feedback on them. To overcome this weakness, we have been developing a mobile-based concept mapping tool that provides visual, textual and remedial feedback (e.g., videos, website links and applets) on the concept relations. This information is placed on the expert concept map, but not on the student's concept map. Additionally, students are asked to note what they understand from the given feedback, and are given the opportunity to revise their knowledge maps after receiving the various types of feedback.
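For readers unfamiliar with the algorithm, below is a minimal sketch of Pathfinder pruning for the common parameter choice q = n - 1, r = infinity, under which an edge survives only if no indirect path has a smaller maximum edge weight; this is our illustration, independent of the Concept Maps for Learning implementation:

```python
import numpy as np

def pfnet(dist):
    """Pathfinder network for q = n-1, r = infinity.

    dist: symmetric matrix of distances (e.g., inverted relatedness ratings).
    Keeps edge (i, j) only if its direct distance does not exceed the minimax
    path distance between i and j, i.e., no detour is "shorter" in the
    max-edge sense.
    """
    n = dist.shape[0]
    minimax = dist.copy()
    for k in range(n):                  # Floyd-Warshall with max in place of +
        via_k = np.maximum(minimax[:, k][:, None], minimax[None, k, :])
        minimax = np.minimum(minimax, via_k)
    keep = dist <= minimax              # edges that survive pruning
    np.fill_diagonal(keep, False)
    return keep

# Toy relatedness ratings for 4 concepts, converted to distances.
ratings = np.array([[0, 9, 2, 1],
                    [9, 0, 8, 2],
                    [2, 8, 0, 1],
                    [1, 2, 1, 0]], dtype=float)
dist = 10.0 - ratings                   # higher rating -> shorter distance
np.fill_diagonal(dist, 0.0)
print(pfnet(dist).astype(int))          # adjacency of the pruned concept map
```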
Abstract:
Similarly to the case of LIF (laser-induced fluorescence), an equally revolutionary impact on science is expected from resonant X-ray photo-pumping. It will particularly contribute to progress in high-energy-density science: pumped core-hole states create X-ray transitions that can escape dense matter on a 10 fs time scale without significant photoabsorption, thus providing a unique possibility to study matter under extreme conditions. In the first proof-of-principle experiment at the X-ray Free Electron Laser LCLS at SLAC [Seely, J., Rosmej, F.B., Shepherd, R., Riley, D., Lee, R.W., "Proposal to Perform the 1st High Energy Density Plasma Spectroscopic Pump/Probe Experiment", approved LCLS proposal L332 (2010)], we successfully pumped inner-shell X-ray transitions in dense plasmas. The plasma was generated with a YAG laser irradiating solid Al and Mg targets attached to a rotating cylinder. In parallel to the optical laser beam, the XFEL was focused into the plasma plume at different delay times and pump energies. Pumped X-ray transitions were observed with a spherically bent crystal spectrometer coupled to a Princeton CCD. With this experimental configuration, we simultaneously achieved extremely high spectral (λ/δλ ≈ 5000) and spatial (δx ≈ 70 μm) resolution while maintaining high luminosity and covering a large spectral range (6.90–8.35 Å). By precisely measuring the variations in the spectra emitted from the plasma under the action of XFEL radiation, we successfully demonstrated transient X-ray pumping in a dense plasma.
Abstract:
Many engineers currently in professional practice will have gained a degree-level qualification that involved studying a curriculum heavy with mathematics and engineering science. While this knowledge is vital to the engineering design process, so too is manufacturing knowledge, if the resulting designs are to be both technically and commercially viable.
The methodology advanced by the CDIO Initiative aims to improve engineering education by teaching in the context of Conceiving, Designing, Implementing and Operating products, processes or systems. A key element of this approach is the use of Design-Build-Test (DBT) projects as the core of an integrated curriculum. This approach facilitates the development of professional skills as well as the application of technical knowledge and skills developed in other parts of the degree programme. It also changes the role of the lecturer to that of facilitator/coach in an active learning environment in which students gain concrete experiences that support their development.
The case study herein describes Mechanical Engineering undergraduate student involvement in the manufacture and assembly of concept and functional prototypes of a folding bicycle.
Abstract:
A Fourier transform infrared gas-phase method is described herein that is capable of deriving the vapour pressure of each pure component of a poorly volatile mixture and of determining the relative vapour-phase composition of each system. The performance of the method has been validated using two standards (naphthalene and ferrocene), and a Raoult's plot surface of a ternary system is reported as proof of principle. This technique is ideal for studying solutions comprising two, three, or more organic compounds dissolved in ionic liquids, since ionic liquids have no measurable vapour pressure.
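For context, the ideal-solution relation underlying such a Raoult's plot is the standard textbook form below; it is not specific to the FTIR method itself:

```latex
% Raoult's law for an ideal solution: each volatile component i contributes a
% partial pressure proportional to its mole fraction; an involatile ionic
% liquid contributes essentially nothing (p* ~ 0).
\[
  p_i = x_i\, p_i^{\ast}, \qquad
  p_{\mathrm{total}} = \sum_i x_i\, p_i^{\ast}, \qquad
  \sum_i x_i = 1 .
\]
```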
Abstract:
An epithermal neutron imager based on detecting alpha particles created via the boron neutron capture mechanism is discussed. The diagnostic consists mainly of a mm-thick boron nitride (BN) sheet (as an alpha converter) in contact with a non-borated cellulose nitrate film (LR115 type-II) detector. While the BN absorbs neutrons in the thermal and epithermal ranges, fast neutrons register insignificantly on the detector because of their low neutron capture and recoil cross-sections. The use of solid-state nuclear track detectors (SSNTDs), unlike image plates, micro-channel plates and scintillators, provides a safeguard against x-rays, gamma-rays and electrons. The diagnostic was tested on a proof-of-principle basis in front of a laser-driven source of moderated neutrons, which suggests the potential of this diagnostic (BN+SSNTD) for dosimetry and imaging applications.
Abstract:
The failure of various clinical trials underscores the need to develop new therapies for Alzheimer's disease (AD), the most common cause of dementia. MicroRNAs (miRNAs) are the most studied non-coding RNAs, and they play an important role in modulating gene expression and multiple signaling pathways. Previous studies, including those of my host laboratory, led to the hypothesis that certain members of the miR-15/107 family (i.e., miR-15ab, miR-16, miR-195, miR-424, and miR-497) could be used as therapeutic agents in AD. Indeed, this family has the potential to regulate multiple AD-associated genes, such as the amyloid precursor protein (APP), β-secretase (BACE1), and the Tau protein. As demonstrated in this thesis project, I chose miR-16 among all the family members as a potential therapeutic target for AD. The luciferase assay in this project confirms that miR-16 can regulate APP and BACE1 simultaneously and directly, through an interaction with the 3' untranslated region of the mRNA. Notably, we also observe a reduction in amyloid peptide production and in Tau phosphorylation after increasing miR-16 in cells. I then validated my results in vivo in the mouse, delivering miR-16 via an osmotic pump implanted in the brain. In this case, the expression of the proteins of interest (APP, BACE1, Tau) was measured by immunoblotting and real-time PCR. After validation, these results were complemented by a proteomic study (iTRAQ) of the brainstem and hippocampus, two regions associated with the disease. These data allowed me to identify other proteins regulated by miR-16 in vivo, including α-synuclein, transferrin receptor 1, and SRm300. Another interesting observation: the pathways regulated by miR-16 in vivo are directly linked to oxidative stress and neurodegeneration. In summary, this work demonstrates the efficacy and feasibility of using a miRNA as a therapeutic tool for Alzheimer's disease. These results fit into a broader effort to discover new targets for AD, in particular for the sporadic form of the disease, which represents more than 95% of all cases. Clearly, the discovery of a molecule that can simultaneously target both pathologies of the disease (amyloid plaques and tau hyperphosphorylation) is novel and interesting, and this field of research opens the door to other small non-coding RNAs in AD and related neurodegenerative diseases.