906 results for "sort"
Abstract:
A permutation is said to avoid a pattern if it does not contain any subsequence that is order-isomorphic to it. Donald Knuth, in the first volume of his celebrated book "The Art of Computer Programming", observed that the permutations that can be computed (or, equivalently, sorted) by certain data structures can be characterized in terms of pattern avoidance. In more recent years the topic has been reopened several times, often in terms of sortable rather than computable permutations. The idea of sorting permutations with one of Knuth's devices suggests looking for a deterministic procedure that decides, in linear time, whether there exists a sequence of operations able to convert a given permutation into the identity. In this thesis we show that, for the stack and for the restricted deques, there is a unique way to implement such a procedure. Moreover, we use these sorting procedures to build new sorting algorithms, and we prove some unexpected commutation properties between these procedures and the base step of bubblesort. We also show that the permutations that can be sorted by a combination of the base steps of bubblesort and of its dual can again be expressed in terms of pattern avoidance. In the final chapter we give an alternative proof of some enumerative results, in particular for the classes of permutations that can be sorted by the two restricted deques. It is well known that the permutations that can be sorted through a restricted deque are counted by the Schröder numbers. In the thesis we show how the deterministic sorting procedures yield a bijection between sortable permutations and Schröder paths.
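The deterministic stack-sorting procedure referred to here can be illustrated with Knuth's classic one-pass greedy stack sort. The following Python sketch is not taken from the thesis and covers only the single-stack case; it checks sortability, which for the stack is equivalent to avoidance of the pattern 231.

```python
def stack_sort(perm):
    """One pass of Knuth's greedy stack sort: push each input element,
    popping to the output whenever the top of the stack is smaller than
    the incoming element."""
    stack, output = [], []
    for x in perm:
        while stack and stack[-1] < x:
            output.append(stack.pop())
        stack.append(x)
    while stack:
        output.append(stack.pop())
    return output

def is_stack_sortable(perm):
    """The greedy pass yields the sorted sequence exactly when the
    permutation avoids the pattern 231."""
    return stack_sort(perm) == sorted(perm)

print(is_stack_sortable([3, 1, 2]))  # True: avoids 231
print(is_stack_sortable([2, 3, 1]))  # False: it is the pattern 231 itself
```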
Abstract:
Background. Neoangiogenesis is crucial in plaque progression and instability. Previous data from our group demonstrated that intra-plaque neovessels show both a Nestin+/WT1+ and a Nestin+/WT1- phenotype, the latter being correlated with complications and plaque instability. Aims. The aims of the present thesis are: (i) to confirm our previous results on the Nestin/WT1 phenotype in a larger series of carotid atheromatous plaques; (ii) to evaluate the relationship between the Nestin+/WT1- neoangiogenesis phenotype and plaque morphology; (iii) to evaluate the relationship between the immunohistochemical and histopathological characteristics and the clinical instability of the plaques. Materials and Methods. Seventy-three patients (53 males, 20 females, mean age 71 years) were consecutively enrolled. Symptoms, brain CT scans and 14 histological variables, including intraplaque hemorrhage and diffuse calcification, were collected. Immunohistochemistry for CD34, Nestin and WT1 was performed. RT-PCR was performed to evaluate Nestin and WT1 mRNA (with 5 healthy arteries as controls). Results. Diffusely calcified plaques (13 out of 73) were found predominantly in females (P=0.017), with a significantly lower incidence of symptoms (TIA/stroke) and brain focal lesions (P=0.019 and P=0.013, respectively) than non-calcified plaques, but with the same incidence of intraplaque complications (P=0.156). Accordingly, calcified and non-calcified plaques showed similar mean densities of positivity for CD34, Nestin and WT1. The density of Nestin and WT1 correlated with the occurrence of intra-plaque hemorrhage in all cases, whereas the density of CD34 correlated only in non-calcified plaques. Conclusions. We confirmed that the Nestin+/WT1- phenotype characterizes the neovessels of unstable plaques, regardless of the actual amount of CD34-positive neoangiogenesis. Calcified plaques show the same incidence of histological complications, although these do not influence symptomatology and plaque vulnerability. Female patients show a much higher incidence of non-complicated or calcified plaques, in effect receiving a sort of protection compared to male patients.
Abstract:
This project is centred on the palaeographic analysis of the handwriting in the charters of the Bolognese notaries of the twelfth century (from 1100 to 1164) and was conducted on a total of about 730 documents, almost all of them unpublished. The research falls within the project for the critical edition of the Carte bolognesi del secolo XII, currently under way at the Chair of Palaeography and Diplomatics of the University of Bologna. The work involved a detailed technical analysis of the graphic habits of each notary, with particular attention to the system of abbreviations (in order to provide a set of comparative data that will be useful at the editing stage). A sort of database of the different scripts present in the territory was thus created, organized by notary and in chronological order. The characteristics of the documentation were then examined both synchronically and in their diachronic development, and the output of the different notaries was compared, verifying the presence of "graphic" links and kinships, which made it possible to reconstruct groups of writers with similar characteristics. The analysis of the data made it possible to investigate in depth the development of the Bolognese Caroline minuscule and to observe how notarial practice was organized and learned. It was thus possible to grasp the dynamics by which the Caroline script, introduced by a few "innovator" notaries such as Angelo and Bonando, spread from the city to the countryside: it was a gradual process in which still archaic forms coexisted alongside already mature forms in transition towards Gothic. In line with what historiography has highlighted, the graphic analysis of Bolognese private documents also confirms that the process of renewal of the notarial guild must have come after Irnerio's enterprise, probably drawing nourishment also from the direct and documented relationships between Irnerio and some of the more advanced figures of the Bolognese notariat.
Abstract:
The benthic dinoflagellate O. ovata represents a serious threat to human health and to the ecology of the areas where it blooms: owing to its toxicity, this microalga has been responsible for several cases of human intoxication and for mass mortalities of benthic invertebrates. Despite the large number of studies on this dinoflagellate, the mechanisms underpinning O. ovata growth and toxin production are still far from fully understood. In this work we enriched the dataset on this species by carrying out a new experiment on an Adriatic O. cf. ovata strain. Data from this experiment (named Beta) and from a comparable experiment previously conducted on the same strain (named Alpha) revealed some interesting aspects of this dinoflagellate: it is able to grow even under strong intracellular nutrient deficiency (C:P molar ratio > 400; C:N > 25), reaching extremely low values of the chlorophyll-a to carbon ratio (0.0004). A significant inverse relationship (r > -0.7) was also found between the cellular toxin to carbon and cellular nutrient to carbon ratios of experiment Alpha. In light of these results, we hypothesized that in O. cf. ovata nutrient-stress conditions (intended as intracellular nutrient deficiency) can cause: i) an increase in toxin production; ii) a strong decrease in chlorophyll-a synthesis; iii) a lowering of metabolism associated with the formation of a sort of resting stage. We then used a modelling approach to test and critically evaluate these hypotheses in a mechanistic way: a newly developed formulation describing toxin production and fate, together with ad hoc changes to the existing formulations describing chlorophyll synthesis, rest respiration and mortality, was incorporated into a simplified version of the European Regional Seas Ecosystem Model (ERSEM), together with a new ad hoc parameterization. The adapted model was able to accurately reproduce many of the trends observed in the Alpha experiment, allowing us to support our hypotheses. The simulations of experiment Beta, by contrast, were not fully satisfactory in quantitative terms. We attributed this gap to presumed differences in physiological behaviour between the algae of the two experiments, due to the different pre-experimental acclimation periods: the model was not able to reproduce acclimation processes in its simulations of experiment Beta. We therefore attempted to simulate the acclimation of the algae to nutrient-stress conditions by manually adjusting some of the nutrient-stress threshold parameters, but obtained conflicting results. Further studies are required to shed light on this interesting aspect. In this work we also extended the range of applicability of a state-of-the-art marine biogeochemical model (ERSEM) by implementing in it an ecologically relevant process, the production of toxic compounds.
Abstract:
The present dissertation aims to analyze the construction of American adolescent culture through teen-targeted television series and the shift in perception that occurs as a consequence of the translation process. In light of the recent changes in television production and consumption modes, largely caused by new technologies, this project explores the evolution of Italian audiences, focusing on fansubbing (freely distributed amateur subtitles made by fans for fan consumption) and social viewing (the re-aggregation of television consumption around social networks and dedicated platforms rather than physical presence). These phenomena are symptoms of a sort of 'viewership 2.0' and of a new type of active viewing, which calls for a revision of traditional AVT strategies. Using a framework that combines television studies, new media studies and fandom studies with an approach to AVT based on Descriptive Translation Studies (Toury 1995), this dissertation analyzes the non-Anglophone audience's growing need for participation in the global dialogue and appropriation process, based on US scheduling and informed by the new paradigm of convergence culture, transmedia storytelling and affective economics (Jenkins 2006 and 2007), as well as the constraints intrinsic to multimodal translation and the different types of linguistic and cultural adaptation performed through dubbing (which tends to be more domesticating; Venuti 1995) and fansubbing (typically more foreignizing). The study analyzes a selection of episodes from six of the most popular teen television series aired between 1990 and 2013, divided into three ages based on the different modes of television consumption: top-down, pre-Internet consumption (Beverly Hills, 90210, 1990–2000); the emergence of audience participation (Buffy the Vampire Slayer, 1997–2003; Dawson's Creek, 1998–2003); and the age of convergence and viewership 2.0 (Gossip Girl, 2007–2012; Glee, 2009–present; The Big Bang Theory, 2007–present).
Abstract:
Con le "Imagini degli dei degli antichi", pubblicate a Venezia nel 1556 e poi in più edizioni arricchite e illustrate, l’impegnato gentiluomo estense Vincenzo Cartari realizza il primo, fortunatissimo manuale mitografico italiano in lingua volgare, diffuso e tradotto in tutta l’Europa moderna. Cartari rimodula, secondo accenti divulgativi ma fedeli, fonti latine tradizionali: come le ricche "Genealogie deorum gentilium" di Giovanni Boccaccio, l’appena precedente "De deis gentium varia et multiplex historia" di Lilio Gregorio Giraldi, i curiosi "Fasti" ovidiani, da lui stesso commentati e tradotti. Soprattutto, però, introduce il patrimonio millenario di favole ed esegesi classiche, di aperture egiziane, mediorientali, sassoni, a una chiave di lettura inedita, agile e vitalissima: l’ecfrasi. Le divinità e i loro cortei di creature minori, aneddoti leggendari e attributi identificativi si susseguono secondo un taglio iconico e selettivo. Sfilano, in trionfi intrisi di raffinato petrarchismo neoplatonico e di emblematica picta poesis rinascimentale, soltanto gli aspetti figurabili e distintivi dei personaggi mitici: perché siano «raccontate interamente» tutte le cose attinenti alle figure antiche, «con le imagini quasi di tutti i dei, e le ragioni perché fossero così dipinti». Così, le "Imagini" incontrano il favore di lettori colti e cortigiani eleganti, di pittori e ceramisti, di poeti e artigiani. Allestiscono una sorta di «manuale d’uso» pronto all’inchiostro del poeta o al pennello dell’artista, una suggestiva raccolta di «libretti figurativi» ripresi tanto dalla maniera di Paolo Veronese o di Giorgio Vasari, quanto dal classicismo dei Carracci e di Nicolas Poussin. Si rivelano, infine, summa erudita capace di attirare appunti e revisioni: l’antiquario padovano Lorenzo Pignoria, nel 1615 e di nuovo nel 1626, vi aggiunge appendici archeologiche e comparatistiche, interessate al remoto regno dei faraoni quanto agli esotici idoli orientali e dei Nuovi Mondi.
Abstract:
The research activity focused on the study, design and evaluation of innovative human-machine interfaces based on virtual three-dimensional environments, driven by the user's brain electrical activity recorded in real time. The goal achieved is to identify and classify, in real time, the user's different brain states and to adapt the interface and/or stimuli to the corresponding emotional state. An experimental facility based on an innovative "man-in-the-loop" simulation methodology was set up. It made it possible to involve both the pilot and the flight examiner during pilot training in virtually simulated flights, in order to compare the examiner's subjective evaluations with objective measurements of the pilot's brain activity, with all relevant information recorded against a common timeline. The different combinations of emotional intensities obtained led to an evaluation of the user's current situational awareness. These results have important implications for current pilot training methodology, and the approach could be extended into a tool that improves the evaluation of pilot/crew performance in interacting with the aircraft when performing tasks and procedures, especially in critical situations. This research also resulted in the design of an interface that adapts the control of the machine to the situation awareness of the user. The new concept developed here aims to improve the efficiency of the interaction between user and interface, gaining capacity by reducing the user's workload and hence improving overall system safety. This research, combining emotions measured through electroencephalography with interface design, resulted in a human-machine interface with three aeronautics-related applications: • an evaluation tool for pilot training; • an input for the cockpit environment; • an adaptation tool for cockpit automation.
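The abstract does not specify its signal-processing pipeline; as a purely illustrative sketch of the kind of real-time feature that can be derived from EEG to track a mental state, the Python snippet below computes band powers and the classic beta/(alpha+theta) engagement ratio. All function names, parameters and bands here are assumptions for illustration, not the method used in the thesis.

```python
import numpy as np
from scipy.signal import welch

def band_power(eeg: np.ndarray, fs: float, band: tuple) -> float:
    """Mean spectral power of one EEG channel in a frequency band (Welch PSD)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].mean())

def engagement_index(eeg: np.ndarray, fs: float) -> float:
    """Classic beta / (alpha + theta) ratio, a rough proxy for attention/workload."""
    theta = band_power(eeg, fs, (4.0, 8.0))
    alpha = band_power(eeg, fs, (8.0, 13.0))
    beta = band_power(eeg, fs, (13.0, 30.0))
    return beta / (alpha + theta)

if __name__ == "__main__":
    # Synthetic 10-second, 256 Hz channel: a 10 Hz (alpha-band) sine plus noise.
    fs = 256.0
    t = np.arange(0, 10, 1 / fs)
    channel = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    print(round(engagement_index(channel, fs), 3))
```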
Abstract:
The aim of this thesis is to investigate the nature of quantum computation and the question of the quantum speed-up over classical computation by comparing two different quantum computational frameworks: the traditional quantum circuit model and the cluster-state quantum computer. After an introductory survey of the theoretical and epistemological questions concerning quantum computation, the first part of the thesis provides a presentation of cluster-state computation suitable for a philosophical audience. In spite of the computational equivalence between the two frameworks, their differences can be considered structural. Entanglement is shown to play a fundamental role in both quantum circuits and cluster-state computers; this supports, from a new perspective, the argument that entanglement can reasonably explain the quantum speed-up over classical computation. However, quantum circuits and cluster-state computers diverge with regard to one of the explanations of quantum computation that actually accords a central role to entanglement, i.e. the Everett interpretation. It is argued that, while cluster-state quantum computation does not reveal an Everettian failure in accounting for the computational processes, it threatens that interpretation with being non-explanatory. The analysis presented here should be integrated into a more general work that also includes further frameworks of quantum computation, e.g. topological quantum computation. Nevertheless, what this work reveals is that the speed-up question does not capture all that is at stake: both quantum circuits and cluster-state computers achieve the speed-up, but the challenges they pose go beyond that specific question. The existence of alternative, equivalent quantum computational models then suggests that the ultimate question should be moved from the speed-up to a sort of "representation theorem" for quantum computation, understood as the general goal of identifying the physical features underlying these alternative frameworks that allow them to be labelled "quantum computation".
Abstract:
By far the greater part of authoritarian regimes worldwide now possess formally democratic institutions such as parliaments and elections. The introduction of such institutions is intended, among other things, to signal or feign a development towards democracy and thus to reduce international and domestic pressure on the respective government. This thesis addresses the question of whether these formally democratic institutions have an effect on government behaviour and whether they improve the human rights situation in the country. First, authoritarian regimes are defined using the minimalist approach of Cheibub et al. Then, drawing on existing research on the role of formally democratic institutions in authoritarian regimes, hypotheses on the relationship between these institutions and repressive government behaviour are derived and tested by means of an empirical analysis of time-series cross-sectional data covering all authoritarian regimes between 1979 and 2004. The results show, among other things, that as the competitiveness of parliamentary elections increases, the probability of the most drastic human rights violations decreases. There are also indications that the less fragmented the opposition parties are, the fewer human rights violations occur, whereas the level of repression rises as the formal powers of parliaments increase.
Abstract:
Data Distribution Management (DDM) is a component of the High Level Architecture standard. Its task is to detect overlaps between update and subscription extents efficiently. This thesis discusses the need for a framework and the reasons why one was implemented. Testing algorithms under fair conditions, libraries that ease the implementation of algorithms, and automation of the build phase were the main motivations for building the framework. The driving motivation was that, in surveying the scientific literature on DDM and its various algorithms, it emerged that each paper generated its own ad hoc data for testing. One objective of this framework is therefore to make it possible to compare the algorithms on a consistent dataset. It was decided to test the framework on the cloud in order to obtain a more reliable comparison between runs by different users. Two of the most widely used services were considered, Amazon AWS EC2 and Google App Engine; the advantages and disadvantages of each are presented, together with the reasons for choosing Google App Engine. Four algorithms were developed: Brute Force, Binary Partition, Improved Sort and Interval Tree Matching. Tests were carried out on execution time and peak memory usage. The results show that Interval Tree Matching and Improved Sort are the most efficient. All tests were performed on the sequential versions of the algorithms, so a further reduction in execution time is still possible for the Interval Tree Matching algorithm.
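As a point of reference for the overlap problem described above, here is a minimal Python sketch of a Brute Force baseline of the kind the thesis compares against: extents are modelled as lists of per-dimension intervals, and every update/subscription pair is tested. The data layout and function names are illustrative assumptions, not the thesis implementation.

```python
from typing import List, Tuple

Extent = List[Tuple[float, float]]  # one (lower, upper) interval per dimension

def overlaps(a: Extent, b: Extent) -> bool:
    # Two extents overlap iff their intervals overlap in every dimension.
    return all(lo_a <= hi_b and lo_b <= hi_a
               for (lo_a, hi_a), (lo_b, hi_b) in zip(a, b))

def brute_force_matching(updates: List[Extent],
                         subscriptions: List[Extent]) -> List[Tuple[int, int]]:
    """O(n*m) baseline: test every update/subscription pair."""
    matches = []
    for i, u in enumerate(updates):
        for j, s in enumerate(subscriptions):
            if overlaps(u, s):
                matches.append((i, j))
    return matches

if __name__ == "__main__":
    updates = [[(0.0, 2.0)], [(5.0, 7.0)]]
    subscriptions = [[(1.0, 3.0)], [(8.0, 9.0)]]
    print(brute_force_matching(updates, subscriptions))  # [(0, 0)]
```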
Abstract:
The aim of this study was to investigate whether the increases in plasma cell-free DNA observed under exercise can be explained by the mechanism of NETosis. In addition, the association between cell-free DNA and exercise-physiology parameters was to be clarified. Road cyclists were exercised using an incremental step protocol, and blood samples were analysed for DNA, MPO and elastase, together with exercise-physiology measurements. Based on the results, an origin of the DNA from NETs cannot be proven. The neutrophil granulocytes show a degranulation response, but it does not run parallel to the increases in DNA. In terms of exercise physiology, it was notable that absolute power output correlates with the DNA, and that cardiovascular parameters likewise rise in a similar way to the DNA in the blood.
Abstract:
This dissertation is part of the Language Toolkit project, a collaboration between the School of Foreign Languages and Literature, Interpreting and Translation of the University of Bologna, Forlì campus, and the Chamber of Commerce of Forlì-Cesena. The project aims to create an exchange between translation students and companies that want to pursue a process of internationalization. The purpose of this dissertation is to demonstrate the benefits that translation systems can bring to businesses. In particular, it consists of the translation into English of documents supplied by the Italian company Technologica S.r.l. and the creation of linguistic resources that can be integrated into computer-assisted translation (CAT) software in order to optimize the translation process. The latter is claimed to be a priority with respect to the actual translation products (the target texts), since the analysis conducted on the source texts highlighted that the company could streamline and optimize its English-language communication through the use of open-source CAT tools such as OmegaT. The work consists of five chapters. The first introduces the Language Toolkit project, the company (Technologica S.r.l.) and its products. The second chapter offers some considerations about technical translation, its features and some misconceptions about it. The difference between technical translation and scientific translation is then clarified, and an overview is given of translation aids such as computer-assisted translation, machine translation, termbases and translation memories. The third chapter contains the analysis of the texts commissioned by Technologica S.r.l. and their categorization. The fourth chapter describes the translation process, with particular attention to terminology extraction and the creation of a bilingual glossary based on a specialized corpus. The glossary was integrated into the OmegaT software in order to facilitate the translation process both for the present task and for future applications. The memory derived from the translation represents a sort of hybrid resource between a translation memory and a glossary, which was found to be the most appropriate format given the specific nature of the texts to be translated. Finally, chapter five offers conclusions about the importance of language training within a company, the potential of translation aids and the benefits they would bring to a company wishing to internationalize.
Abstract:
The aim of this dissertation is to demonstrate how theory and practice are linked in translation. The translation of the essay Light Years Ahead helped me to understand this connection and to develop the two main theses put forward in this work: the translator's freedom to choose among the different theories, without granting any one of them absolute supremacy, and the diversity of the non-fiction genre. The first chapter therefore focuses on the different theories of translation, presented in a way which suggests that each might be the completion and development of another. The second chapter deals with the specific issues of non-fiction translation, with particular attention to the way in which this genre gathers elements of other text types. Despite this variety, it is also claimed that the highest-level function of an essay is always the informative one. This principle led me to simplify the Italian version of the text I translated (Light Years Ahead) and make it more intelligible. The third chapter discusses this last point, together with my considerations about the function, the dominant aspect and the cultural analysis of the text, with particular regard to how the quality of the English translation affected my choices. In the fourth chapter I include some examples of translation which best demonstrate the distinctive variety of styles of non-fiction texts and the translator's possibility of choosing, each time, the theory that suits them best. Finally, I also include three examples which represent a sort of defeat for me: three points where the ambiguity of the text obliged me to omit the information in question for the sake of the dominant informative function.
Abstract:
Tomatoes are the most common crop in Italy. The production cycle requires operations in the field and in the factory that can cause musculoskeletal disorders due to the repetitive movements of the upper limbs of the workers employed in the sorting phase. This research aims to evaluate these risks using the OCRA (occupational repetitive actions) index method. The method is based firstly on the calculation of the maximum number of recommended actions, related to the way the operation is performed, and secondly on a comparison of the number of actions effectively carried out by the upper limb with the recommended value. The risk evaluation for workers who manually sort tomatoes during harvest showed a risk for the workers, with an exposure index greater than 20, whereas the OCRA method classifies an index higher than 3.5 as unacceptable. The present trend of replacing manual sorting on board the harvesting vehicle with optical sorters therefore seems appropriate to reduce the risk of work-related musculoskeletal disorders (WMSDs), and it is supported both from a financial point of view and as a quality-control measure.
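For readers unfamiliar with the method, the sketch below shows the general shape of an OCRA-style exposure index: observed technical actions per minute divided by a recommended value obtained by scaling a reference frequency with risk multipliers. The reference value of 30 actions/min, the multiplier names and the example numbers are illustrative assumptions, not the values used in this study.

```python
def ocra_index(actual_actions_per_min: float,
               reference_frequency: float = 30.0,  # assumed reference constant
               force_mult: float = 1.0,
               posture_mult: float = 1.0,
               repetitiveness_mult: float = 1.0,
               additional_mult: float = 1.0,
               recovery_mult: float = 1.0,
               duration_mult: float = 1.0) -> float:
    """Exposure index = observed technical actions per minute divided by the
    recommended number of actions per minute, where the recommended value is
    the reference frequency scaled down by risk multipliers."""
    recommended = (reference_frequency * force_mult * posture_mult *
                   repetitiveness_mult * additional_mult *
                   recovery_mult * duration_mult)
    return actual_actions_per_min / recommended

# Hypothetical sorting task with degraded posture and recovery multipliers.
index = ocra_index(60.0, posture_mult=0.6, recovery_mult=0.6)
print(round(index, 1))  # ~5.6, above the 3.5 threshold the abstract cites as unacceptable
```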
Abstract:
In traditional medicine, numerous plant preparations are used to treat inflammation both topically and systemically. Several anti-inflammatory plant extracts and a few natural product-based monosubstances have even found their way into the clinic. Unfortunately, a number of plant secondary metabolites have been shown to trigger detrimental pro-allergic immune reactions and are therefore considered to be toxic. In the phytotherapy research literature, numerous plants are also claimed to exert immunostimulatory effects. However, while the concepts of plant-derived anti-inflammatory agents and allergens are well established, the widespread notion of immunostimulatory plant natural products and their potential therapeutic use is rather obscure, often with the idea that the product is some sort of "tonic" for the immune system without actually specifying the mechanisms. In this commentary it is argued that the paradigm of oral plant immunostimulants lacks clinical evidence and may therefore be a myth, which has originated primarily from in vitro studies with plant extracts. The fact that no conclusive data on orally administered immunostimulants can be found in the scientific literature inevitably prompts us to challenge this paradigm.