927 results for Paradigm of complexity
Abstract:
[EN] Research background and hypothesis. Several attempts have been made to understand some sport modalities from the point of view of complexity. Most of these studies deal with the phenomenon with regard to the mechanics of the game itself (in isolation). Nevertheless, some research has been conducted from the perspective of competition between teams. Our hypothesis was that, for the study of competitiveness levels in a league competition system, our analysis model (Shannon entropy) is a useful and highly sensitive tool to determine the degree of global competitiveness of a league. Research aim. The aim of our study was to develop a model for the analysis of the competitiveness level in team sport competitions, based on the level of uncertainty that might exist for each confrontation. Research methods. The degree of uncertainty, or randomness, of the competition was analyzed as a factor of competitiveness. It was calculated on the basis of the Shannon entropy. Research results. We studied 17 NBA regular seasons, which showed a fairly steady entropic tendency. There were seasons less competitive (≤ 0.9800) than the overall average (0.9835), and periods where competitiveness remained at higher levels (range: 0.9851 to 0.9902). Discussion and conclusions. A league is more competitive when it is more random; thus, it is harder to predict the final outcome. However, when the competition is less random, the degree of competitiveness decreases significantly. The NBA is a very competitive league: there is a high degree of uncertainty about the final result.
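As an illustration of the measure described above, the following is a minimal sketch, assuming the competitiveness index is the Shannon entropy of the distribution of wins across teams, normalized by log2 of the number of teams so that a perfectly balanced league scores 1.0. The abstract does not specify the exact formulation; names and values here are illustrative.

```python
# Hypothetical sketch of a Shannon-entropy competitiveness index: wins are
# normalized into a probability distribution and the entropy is divided by
# log2(n) so that a perfectly balanced league scores 1.0.
import math

def league_competitiveness(wins):
    """Normalized Shannon entropy (0..1) of a list of per-team win counts."""
    total = sum(wins)
    probs = [w / total for w in wins if w > 0]
    entropy = -sum(p * math.log2(p) for p in probs)
    return entropy / math.log2(len(wins))  # 1.0 = maximal uncertainty

# Toy example: a perfectly balanced 4-team league vs. a lopsided one.
print(league_competitiveness([41, 41, 41, 41]))   # -> 1.0
print(league_competitiveness([70, 50, 30, 14]))   # -> below 1.0
```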
Abstract:
This Ph.D. thesis collects the research work I conducted under the supervision of Prof. Bruno Samorì in 2005, 2006, and 2007. Some parts of the work included in Part III were begun during my undergraduate thesis in the same laboratory and then completed during the initial part of my Ph.D.: the complete results have been included for the sake of understanding and completeness. During my graduate studies I worked on two very different protein systems. The theoretical link between these studies, at the biological level, is the acknowledgement that protein biophysical and structural studies must, in many cases, take into account the dynamical states of protein conformational equilibria and the local physico-chemical conditions where the system studied actually performs its function. This is introduced in the introductory part in Chapter 2. Two different examples of this are presented: the structural significance deriving from the action of mechanical forces in vivo (Chapter 3) and the complexity of conformational equilibria in intrinsically unstructured proteins and amyloid formation (Chapter 4). My experimental work investigated both these examples using, in both cases, the single-molecule force spectroscopy technique (described in Chapter 5 and Chapter 6). The work conducted on angiostatin focused on the characterization of the relationships between the mechanochemical properties and the mechanism of action of the angiostatin protein and, most importantly, their intertwining with the further layer of complexity due to disulfide redox equilibria (Part III). These studies were accompanied by the elaboration of a theoretical model for a novel signalling pathway that may be relevant in the extracellular space, detailed in Chapter 7.2. The work conducted on α-synuclein (Part IV) instead brought a whole new twist to the single-molecule force spectroscopy methodology, applying it as a structural technique to elucidate the conformational equilibria present in intrinsically unstructured proteins. These equilibria are of utmost interest from a biophysical point of view, but most importantly because of their direct relationship with amyloid aggregation and, consequently, the aetiology of relevant pathologies such as Parkinson's disease. The work characterized, for the first time, conformational equilibria in an intrinsically unstructured protein at the single-molecule level and, again for the first time, identified a monomeric folded conformation that is correlated with conditions leading to α-synuclein aggregation and, ultimately, Parkinson's disease. Also, during the research work, I found myself in need of a general-purpose data analysis application for single-molecule force spectroscopy that could solve some logistical and data analysis problems common in this technique. I developed an application that addresses some of these problems, herein presented (Part V), which aims to be publicly released soon.
Abstract:
The vast majority of known proteins have not yet been experimentally characterized and little is known about their function. The design and implementation of computational tools can provide insight into the function of proteins based on their sequence, their structure, their evolutionary history, and their association with other proteins. Knowledge of the three-dimensional (3D) structure of a protein can lead to a deep understanding of its mode of action and interaction, but currently the structures of <1% of sequences have been experimentally solved. For this reason, it has become urgent to develop new methods able to computationally extract relevant information from protein sequence and structure. The starting point of my work has been the study of the properties of contacts between protein residues, since they constrain protein folding and characterize different protein structures. Prediction of residue contacts in proteins is an interesting problem whose solution may be useful in protein fold recognition and de novo design. The prediction of these contacts requires the study of protein inter-residue distances, specific to each type of amino acid pair, which are encoded in the so-called contact map. An interesting new way of analyzing those structures emerged when network studies were introduced, with pivotal papers demonstrating that protein contact networks also exhibit small-world behavior. In order to highlight constraints for the prediction of protein contact maps, and for applications in the field of protein structure prediction and/or reconstruction from experimentally determined contact maps, I studied to what extent the characteristic path length and clustering coefficient of the protein contact network reveal characteristic features of protein contact maps. Provided that residue contacts are known for a protein sequence, the major features of its 3D structure can be deduced by combining this knowledge with correctly predicted motifs of secondary structure. In the second part of my work I focused on a particular protein structural motif, the coiled-coil, known to mediate a variety of fundamental biological interactions. Coiled-coils are found in a variety of structural forms and in a wide range of proteins including, for example, small units such as leucine zippers, which drive the dimerization of many transcription factors, and more complex structures such as the family of viral proteins responsible for virus-host membrane fusion. The coiled-coil structural motif is estimated to account for 5-10% of the protein sequences in the various genomes. Given their biological importance, in my work I introduced a Hidden Markov Model (HMM) that exploits the evolutionary information derived from multiple sequence alignments to predict coiled-coil regions and to discriminate coiled-coil sequences. The results indicate that the new HMM outperforms all the existing programs and can be adopted for coiled-coil prediction and for large-scale genome annotation. Genome annotation is a key issue in modern computational biology, being the starting point towards the understanding of the complex processes involved in biological networks. The rapid growth in the number of available protein sequences and structures poses new fundamental problems that still await interpretation. Nevertheless, these data are at the basis of the design of new strategies for tackling problems such as the prediction of protein structure and function.
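As an illustration of the network measures mentioned above, here is a minimal sketch that builds a protein contact network from an inter-residue distance matrix and computes its characteristic path length and clustering coefficient. The 8 Å contact threshold, the minimum sequence separation, and the use of NetworkX are assumptions for illustration, not details taken from the thesis.

```python
# Hypothetical sketch: protein contact network metrics from a distance matrix.
import networkx as nx

def contact_network_stats(distances, threshold=8.0, min_separation=3):
    """distances: (n, n) symmetric matrix of inter-residue distances (angstroms)."""
    n = len(distances)
    graph = nx.Graph()
    graph.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + min_separation, n):  # ignore trivial sequence neighbors
            if distances[i][j] <= threshold:
                graph.add_edge(i, j)
    # Characteristic path length assumes the network is connected, which holds
    # for typical single-domain protein contact networks.
    return (nx.average_shortest_path_length(graph),
            nx.average_clustering(graph))

# Small-world behavior: short path length combined with high clustering,
# relative to a random graph with the same number of nodes and edges.
```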
Experimental determination of the functions of all these proteins would be a hugely time-consuming and costly task and, in most instances, has not been carried out. As an example, currently only approximately 20% of the annotated proteins in the Homo sapiens genome have been experimentally characterized. A commonly adopted procedure for annotating protein sequences relies on "inheritance through homology," based on the notion that similar sequences share similar functions and structures. This procedure consists in assigning sequences to a specific group of functionally related sequences that have been grouped through clustering techniques. The clustering procedure is based on suitable similarity rules, since predicting protein structure and function from sequence largely depends on the value of sequence identity. However, additional levels of complexity are due to multi-domain proteins, to proteins that share common domains but do not necessarily share the same function, and to the finding that different combinations of shared domains can lead to different biological roles. In the last part of this study I developed and validated a system that contributes to sequence annotation by taking advantage of a validated procedure for transfer through inheritance of molecular functions and structural templates. After a cross-genome comparison with the BLAST program, clusters were built on the basis of two stringent constraints on sequence identity and on the coverage of the alignment. The adopted measure explicitly addresses the problem of annotating multi-domain proteins and allows a fine-grained division of the whole set of proteomes used, which ensures cluster homogeneity in terms of sequence length. A high coverage of structure templates over the length of the protein sequences within clusters ensures that multi-domain proteins, when present, can be templates for sequences of similar length. This annotation procedure includes the possibility of reliably transferring statistically validated functions and structures to sequences, considering the information available in the present databases of molecular functions and structures.
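A minimal sketch of the two clustering constraints described above, under stated assumptions: `hit` is a hypothetical record summarizing a BLAST alignment, and the 90% thresholds are placeholders (the abstract says only that the constraints are stringent).

```python
# Hypothetical sketch of the identity and coverage constraints used to decide
# whether a BLAST hit supports grouping two sequences into the same cluster.
def same_cluster(hit, min_identity=0.90, min_coverage=0.90):
    """hit: hypothetical record of a BLAST alignment with fields
    identity (fraction), align_len, query_len, subject_len."""
    coverage_query = hit.align_len / hit.query_len
    coverage_subject = hit.align_len / hit.subject_len
    # Requiring high coverage on *both* sequences prevents a single shared
    # domain from joining multi-domain and single-domain proteins, keeping
    # clusters homogeneous in sequence length.
    return (hit.identity >= min_identity
            and coverage_query >= min_coverage
            and coverage_subject >= min_coverage)
```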
Abstract:
This work analyzes the role played by the legislator and by the public administration in, respectively, outlining and implementing public policies aimed at promoting models of economic development characterized by a high degree of environmental sustainability. To this end, the work is divided into four chapters. The first chapter considers the main elements of general theory that frame the topic under discussion. This first phase of the research focuses, on the one hand, on the (historical-evolutionary) analysis of the concept of the environment in light of the prevailing legal scholarship and, on the other, on the formation of the concept of sustainable development, with particular regard to its environmental declination. In the central part of the work, consisting of the second and third chapters, the analysis turns to three areas of inquiry that are decisive for the systematic framing of public policies in this field: the system of relationships among the multiple actors (international, national, and local) involved in the search for solutions to the systemic environmental crisis; the identification and definition of the set of substantive principles that govern the system of environmental protection and guide policy choices in the field; and the main (legal and economic) protection instruments currently in force. The fourth and final chapter considers the policies relating to the procedures for authorizing the construction and operation of power plants fueled by renewable energy sources, analyzed as a specific case that can be taken as a paradigm of the role played by the legislator and the public administration in the field of sustainable development policies. The analysis shows a high degree of complexity in the institutional and organizational system, compounded by evident efficiency limits in the administrative authorization regime introduced by the national legislator.
Abstract:
This study investigates the historiographical tradition concerning the Seleucid queens of the third century BC, from Laodice I to the accession of Laodice III, in light of the documentary evidence. The aim is to establish the political horizon of the queens' actions in the least known and darkest period of the history of the kingdom of Syria. The Seleucid third century is indeed a period characterized by uncertainty in the reconstruction of events and in the analysis of socio-political dynamics, above all because of the limited and incidental interest that the surviving ancient historiographical sources show in the affairs of the Seleucid monarchy before the advent of Antiochus III the Great. In this panorama, the case of Laodice I, who is at the center of a considerable number of testimonies, represents an exception. Taking into account the different nature of the sources and the complexity of the historical context, which cannot always be reconstructed, the figure of Laodice I was studied with the intent of understanding the paradigm of female royalty established by the queen's actions, and hence the influence of this paradigm on the following Seleucid generation. The research is organized in three sections, tied to the main events of the Seleucid third century in which the queen was involved. The first part is devoted to Laodice I's activity after the second marriage of Antiochus II to the Ptolemaic princess Berenice. The second part concerns Laodice's role in the complex events that followed the death of her husband Antiochus II in 246 and in the Third Syrian War, which saw the queen, at the side of her son Seleucus II, oppose Ptolemy III and Berenice Syra. The third section deals with the actions of Laodice I in the War of the Brothers, at the side of Antiochus Hierax against Seleucus II and Laodice II. The conclusion reflects on the influence of Laodice I's political actions on the choice of Antiochus III to marry his cousin Laodice III, the first princess of Pontus to enter the reigning dynasty.
Abstract:
The complex process of gait is rendered partially automatic by central pattern generators (CPGs). To further our understanding of their role in gait control in healthy subjects, we applied a paradigm of anti-phase, or syncopated, movement to gait. To provide a context for our results, we reviewed the literature on in-phase, or synchronized, gait. The review results are as follows. Auditory cueing increased step/stride rate for older subjects, but not for younger ones. Stride-rate variability decreased for younger subjects, perhaps because the metronome's cue acted as a temporal 'anchor point' for each step. Step width increased in half of the treadmill studies, but in none of the overground ones, suggesting a cumulative effect of the attentional demands of synchronizing gait while on a treadmill. Time series analysis revealed that the α exponent was the most sensitive parameter reported, decreasing toward anti-persistence in almost all cued-gait studies. This project compares in-phase (IN) and anti-phase (ANTI) gait in young and old healthy subjects. We expected gait to be less disrupted during ANTI trials at preferred speed, when the facilitating effect of CPGs would be strongest. Step-time variability, the jerk index, and the harmonic ratio were used to quantify gait perturbation: none indicated that ANTI was easiest at preferred walking speed. Surprisingly, the gait of older subjects was no more perturbed than that of younger subjects. When they successfully matched the pace of the beat, they unwittingly synchronized to it: the temporal relationship of their steps to the beat was the same in the IN and ANTI conditions. Younger subjects, visibly struggling during ANTI trials, were able to walk in syncopation. This result suggests that cognitive resources available only to the younger group are required to resist synchronizing to the beat.
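For context on the α exponent mentioned above: in the cued-gait literature it usually denotes the scaling exponent from detrended fluctuation analysis (DFA) of a stride-time series, where α ≈ 0.5 indicates an uncorrelated series, α > 0.5 persistence, and α < 0.5 anti-persistence (the direction reported for most cued-gait studies). The following is a minimal DFA sketch under that assumption; the window sizes and linear detrending order are illustrative.

```python
# Hypothetical DFA sketch for a stride-time series; α is the slope of
# log F(n) against log n over a set of window sizes n.
import numpy as np

def dfa_alpha(series, window_sizes=(4, 8, 16, 32, 64)):
    """series: 1D array of stride times; needs at least a few hundred samples."""
    profile = np.cumsum(np.asarray(series) - np.mean(series))  # integrated series
    fluctuations = []
    for size in window_sizes:
        n_windows = len(profile) // size
        sq_residuals = []
        for w in range(n_windows):
            segment = profile[w * size:(w + 1) * size]
            t = np.arange(size)
            trend = np.polyval(np.polyfit(t, segment, 1), t)  # linear detrend
            sq_residuals.append(np.mean((segment - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(sq_residuals)))
    return np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)[0]
```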
Abstract:
The present dissertation aims at analyzing the construction of American adolescent culture through teen-targeted television series and the shift in perception that occurs as a consequence of the translation process. In light of the recent changes in television production and consumption modes, largely caused by new technologies, this project explores the evolution of Italian audiences, focusing on fansubbing (freely distributed amateur subtitles made by fans for fan consumption) and social viewing (the re-aggregation of television consumption based on social networks and dedicated platforms, rather than on physical presence). These phenomena are symptoms of a sort of 'viewership 2.0' and of a new type of active viewing, which calls for a revision of traditional AVT strategies. Using a framework that combines television studies, new media studies, and fandom studies with an approach to AVT based on Descriptive Translation Studies (Toury 1995), this dissertation analyzes the non-Anglophone audience's growing need to participate in the global dialogue and in the appropriation process based on US scheduling, informed by the new paradigm of convergence culture, transmedia storytelling, and affective economics (Jenkins 2006 and 2007), as well as the constraints intrinsic to multimodal translation and the different types of linguistic and cultural adaptation performed through dubbing (which tends to be more domesticating; Venuti 1995) and fansubbing (typically more foreignizing). The study analyzes a selection of episodes from six of the most popular teen television series aired between 1990 and 2013, divided into three ages based on the different modes of television consumption: top-down, pre-Internet consumption (Beverly Hills, 90210, 1990-2000); the emergence of audience participation (Buffy the Vampire Slayer, 1997-2003; Dawson's Creek, 1998-2003); and the age of convergence and viewership 2.0 (Gossip Girl, 2007-2012; Glee, 2009-present; The Big Bang Theory, 2007-present).
Abstract:
Pediatric acute myeloid leukemia (AML) is a molecularly heterogeneous disease that arises from genetic alterations in pathways that regulate self-renewal and myeloid differentiation. While the majority of patients carry recurrent chromosomal translocations, almost 20% of childhood AML cases do not show any recognizable cytogenetic alteration and are defined as cytogenetically normal (CN)-AML. CN-AML patients have always shown great variability in response to therapy and overall outcome, underlining the presence of unknown genetic changes, not detectable by conventional analyses but relevant for the pathogenesis and outcome of AML. The development of novel genome-wide techniques, such as next-generation sequencing, has tremendously improved our ability to interrogate the cancer genome. Against this background, the aim of this research study was to investigate, through whole-transcriptome sequencing (RNA-seq), the mutational landscape of pediatric CN-AML patients negative for all the currently known somatic mutations reported in AML. RNA-seq performed on diagnostic leukemic blasts from 19 pediatric CN-AML cases revealed a considerable incidence of cryptic chromosomal rearrangements, with the identification of 21 putative fusion genes. Several of the fusion genes identified in this study are recurrent and might have prognostic and/or therapeutic relevance. A paradigm of this is the CBFA2T3-GLIS2 fusion, which has been demonstrated to be a common alteration in pediatric CN-AML, predicting poor outcome. Important findings have also been obtained in the identification of novel therapeutic targets. On one side, the identification of the NUP98-JARID1A fusion suggests the use of disulfiram; on the other, we describe alterations activating tyrosine kinases, providing functional data supporting the use of tyrosine kinase inhibitors to specifically inhibit leukemia cells. This study provides new insights into the genetic alterations underlying pediatric AML, defines novel prognostic markers and putative therapeutic targets, and prospectively ensures correct risk stratification and risk-adapted therapy also for the "all-neg" AML subgroup.
Abstract:
Beside the traditional paradigm of "centralized" power generation, a new concept of "distributed" generation is emerging, in which the user becomes a prosumer. During this transition, Energy Storage Systems (ESS) can provide multiple services and features that are necessary for a higher quality of the electrical system and for the optimization of non-programmable Renewable Energy Source (RES) power plants. An ESS prototype was designed, developed, and integrated into a renewable energy production system in order to create a smart microgrid and consequently manage the energy flow efficiently and intelligently as a function of the power demand. The produced energy can be fed into the grid, supplied directly to the load, or stored in batteries. The microgrid is composed of a 7 kW wind turbine (WT) and a 17 kW photovoltaic (PV) plant. The load is given by the electrical utilities of a cheese factory. The ESS is composed of two subsystems: a Battery Energy Storage System (BESS) and a Power Control System (PCS). With the aim of sizing the ESS, a Remote Grid Analyzer (RGA) was designed, realized, and connected to the wind turbine, the photovoltaic plant, and the switchboard. Afterwards, different electrochemical storage technologies were studied and, taking into account the load requirements of the cheese factory, the most suitable solution was identified in the high-temperature Na-NiCl2 (molten salt) battery technology. Data acquisition from all electrical utilities provided a detailed load analysis, indicating an optimal storage size of 30 kW. Moreover, a container was designed and realized to house the BESS and PCS, meeting all requirements and safety conditions. Furthermore, a smart control system was implemented to handle the different applications of the ESS, such as peak shaving or load levelling.
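As an illustration of one of the ESS applications named above, here is a minimal peak-shaving sketch: discharge the battery when the net load exceeds a grid-import threshold, and recharge it from renewable surplus. All names, thresholds, and ratings are assumptions; the prototype's actual control logic is not described in the abstract.

```python
# Hypothetical peak-shaving / surplus-storage rule for one control step.
def peak_shaving_step(load_kw, generation_kw, soc_kwh,
                      threshold_kw=30.0, capacity_kwh=60.0, dt_h=0.25):
    """Returns (battery_power_kw, new_soc_kwh).
    Positive battery power = discharge to the load, negative = charge."""
    net_load = load_kw - generation_kw
    if net_load > threshold_kw and soc_kwh > 0.0:
        # Shave the peak: cover the excess, limited by the stored energy.
        power = min(net_load - threshold_kw, soc_kwh / dt_h)
    elif net_load < 0.0 and soc_kwh < capacity_kwh:
        # Store renewable surplus, limited by the remaining capacity.
        power = max(net_load, -(capacity_kwh - soc_kwh) / dt_h)
    else:
        power = 0.0
    return power, soc_kwh - power * dt_h
```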
Abstract:
This thesis regards the study and development of new techniques for the cognitive assessment and rehabilitation of subjects with traumatic brain injury (TBI). In particular, this thesis i) provides an overview of the state of the art of these new assessment and rehabilitation technologies, ii) suggests new methods for assessment and rehabilitation, and iii) contributes to the explanation of the neurophysiological mechanism involved in a rehabilitation treatment. Some chapters provide useful information to contextualize TBI and its outcome; they describe the methods used for its assessment/rehabilitation. The other chapters illustrate a series of experimental studies conducted in healthy subjects and TBI patients that suggest new approaches to assessment and rehabilitation. The newly proposed approaches have in common the use of electroencephalography (EEG). EEG was used in all the experimental studies, each time with a different purpose: as a diagnostic tool, as a signal to command a BCI system, as an outcome measure to evaluate the effects of a treatment, etc. The main results achieved concern: i) the study and development of a system for communication with patients with disorders of consciousness. It was possible to identify a paradigm of reliable activation during two imagery tasks using the EEG signal, or the EEG and NIRS signals; ii) the study of the effects of a neuromodulation technique (tDCS) on EEG patterns. This topic is of great importance and interest. The findings showed that tDCS can manipulate cortical network activity and that, through the search for optimal stimulation parameters, it is possible to move the working point of a neural network and bring it into a condition of maximum learning. In this way it could be possible to improve the performance of a BCI system or the efficacy of a rehabilitation treatment, such as neurofeedback.
Abstract:
The application of dexterous robotic hands outside research laboratories has been limited by the intrinsic complexity that these devices present. This is directly reflected in an economically unreasonable cost and a low overall reliability. The research reported in this thesis shows how the problem of complexity in the design of robotic hands can be tackled by taking advantage of modern technologies (i.e. rapid prototyping), leading to innovative concepts for the design of the mechanical structure and of the actuation and sensory systems. The solutions adopted drastically reduce prototyping and production costs and increase reliability by reducing the number of parts required and averaging their individual reliability factors. In order to derive guidelines for the design process, the problem of robotic grasp and manipulation by a dual arm/hand system has been reviewed. In this way, the requirements that should be fulfilled at the hardware level to guarantee successful execution of the task have been highlighted. The contribution of this research on the manipulation planning side focuses on the redundancy resolution that arises in the execution of a task by a dexterous arm/hand system. In the literature, the problem of coordinating arm and hand during the manipulation of an object has been widely analyzed in theory but often experimentally demonstrated only in simplified robotic setups. Our aim is to fill this gap in the study of the topic and to evaluate it experimentally in a complex system such as an anthropomorphic arm/hand system.
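For context on redundancy resolution as discussed above, here is a minimal sketch of the textbook pseudoinverse scheme, q̇ = J⁺ẋ + (I − J⁺J)q̇₀, which tracks a task-space velocity while projecting a secondary motion into the Jacobian null space. This is a standard illustration, not the specific method developed in the thesis.

```python
# Hypothetical sketch of pseudoinverse-based redundancy resolution.
import numpy as np

def resolve_redundancy(jacobian, task_velocity, secondary_velocity):
    """jacobian: (m, n) with n > m joints; returns joint velocities (n,)."""
    j_pinv = np.linalg.pinv(jacobian)
    null_projector = np.eye(jacobian.shape[1]) - j_pinv @ jacobian
    # The primary task is tracked exactly; the secondary motion (e.g. joint
    # limit avoidance) is projected into the null space so it cannot disturb
    # the end-effector trajectory.
    return j_pinv @ task_velocity + null_projector @ secondary_velocity

# Example: a planar 3-joint arm (n=3) tracking a 2D end-effector velocity (m=2).
J = np.array([[1.0, 0.5, 0.2],
              [0.0, 0.8, 0.4]])
q_dot = resolve_redundancy(J, np.array([0.1, 0.0]), np.array([0.0, 0.0, 0.05]))
```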
Abstract:
In traditional medicine, numerous plant preparations are used to treat inflammation both topically and systemically. Several anti-inflammatory plant extracts and a few natural product-based monosubstances have even found their way into the clinic. Unfortunately, a number of plant secondary metabolites have been shown to trigger detrimental pro-allergic immune reactions and are therefore considered to be toxic. In the phytotherapy research literature, numerous plants are also claimed to exert immunostimulatory effects. However, while the concepts of plant-derived anti-inflammatory agents and allergens are well established, the widespread notion of immunostimulatory plant natural products and their potential therapeutic use is rather obscure, often with the idea that the product is some sort of "tonic" for the immune system without actually specifying the mechanisms. In this commentary it is argued that the paradigm of oral plant immunostimulants lacks clinical evidence and may therefore be a myth, which has originated primarily from in vitro studies with plant extracts. The fact that no conclusive data on orally administered immunostimulants can be found in the scientific literature inevitably prompts us to challenge this paradigm.
Abstract:
Cognitive functioning is based on binding processes, by which different features and elements of neurocognition are integrated and coordinated. Binding is an essential ingredient of, for instance, Gestalt perception. We implemented a paradigm of causality perception based on the work of Albert Michotte, in which two identical discs move from opposite sides of a monitor, steadily toward, and then past, one another. Their coincidence generates an ambiguous percept of either "streaming" or "bouncing," which the subjects (34 schizophrenia spectrum patients and 34 controls, mean age 27.9 y) were instructed to report. The latter perception is a marker of the binding processes underlying perceived causality (type I binding). In addition to this visual task, acoustic stimuli were presented at different times during the task (150 ms before and after visual coincidence), which can modulate perceived causality. This modulation by intersensory and temporally delayed stimuli is viewed as a different type of binding (type II). We show here, using a mixed-effects hierarchical analysis, that type II binding distinguishes schizophrenia spectrum patients from healthy controls, whereas type I binding does not. Type I binding may even be excessive in some patients, especially those with positive symptoms; type II binding, however, was generally attenuated in patients. The present findings point to ways in which the disconnection (or Gestalt) hypothesis of schizophrenia can be refined, suggesting more specific markers of neurocognitive functioning and potential targets of treatment.
Abstract:
Partner notification (PN, or contact tracing) is an important aspect of treating bacterial sexually transmitted infections (STIs), such as Chlamydia trachomatis. It facilitates the identification of newly infected cases that can be treated through individual case management. PN also acts indirectly by limiting onward transmission in the general population. However, the impact of PN, both at the level of individuals and at that of the population, remains unclear. Since it is difficult to study the effects of PN empirically, mathematical and computational models are useful tools for investigating its potential as a public health intervention. To this end, we developed an individual-based modeling framework called Rstisim. It allows the implementation of different models of STI transmission with various levels of complexity and the reconstruction of the complete dynamic sexual partnership network over any time period. A key feature of this framework is that we can trace an individual's partnership history in detail and investigate the outcome of different PN strategies for C. trachomatis. For individual case management, the results suggest that notifying three or more partners from the preceding 18 months yields substantial numbers of new cases. In contrast, the successful treatment of current partners is most important for preventing re-infection of index cases and reducing further transmission of C. trachomatis at the population level. The findings of this study demonstrate the difference between individual-level and population-level outcomes of public health interventions for STIs.
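As an illustration of the notification rule evaluated above (notifying several partners from a preceding look-back window), here is a minimal sketch of the partner-selection step. Rstisim itself is a simulation framework; this standalone fragment, with all names and data structures assumed, only illustrates the look-back logic.

```python
# Hypothetical sketch of selecting partners to notify from a partnership
# history, given a look-back window and a cap on the number notified.
from datetime import timedelta

def partners_to_notify(history, diagnosis_date, lookback_months=18, max_partners=3):
    """history: list of (partner_id, partnership_end_date) tuples;
    dates are datetime.date objects."""
    cutoff = diagnosis_date - timedelta(days=30 * lookback_months)
    recent = sorted((p for p in history if p[1] >= cutoff),
                    key=lambda p: p[1], reverse=True)  # most recent first
    return [partner_id for partner_id, _ in recent[:max_partners]]
```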
Abstract:
Background: The release of quality data from acute care hospitals to the general public aims to inform the public, to provide transparency, and to foster quality-based competition among providers. Due to the expected mechanisms of action, and possibly the adverse consequences, of public quality comparison, it is a controversial topic. The perspective of physicians and nurses is of particular importance in this context: they are mainly responsible for the collection of quality-control data, and they are directly confronted with the results of public comparison. The research focus of this qualitative study was to discover the views and opinions of Swiss physicians and nurses regarding these issues. It was investigated how the two professional groups appraised the opportunities as well as the risks of the release of quality data in Switzerland. Methods: A qualitative approach was chosen to answer the research question. For data collection, four focus groups were conducted with physicians and nurses who were employed in Swiss acute care hospitals. Qualitative content analysis was applied to the data. Results: The results revealed that both occupational groups had a very critical and negative attitude toward the recent developments. The perceived risks dominated their view. In summary, their main concerns were: the reduction of complexity, the one-sided focus on measurable quality variables, risk selection, the threat of data manipulation, and the abuse of published information by the media. An additional concern was that the impression is given that the complex construct of quality can be reduced to a few key figures, constructing a false message that then influences society and politics. This critical attitude is associated with the different value system and professional self-concept that both physicians and nurses have, in comparison to the underlying principles of a market-based economy and the economic orientation of the health care business. Conclusions: The critical and negative attitude of Swiss physicians and nurses must by all means be heeded and investigated regarding its impact on work motivation and identification with the profession. At the same time, the two professional groups are obliged to reflect upon their critical attitude and take a proactive role in the development of appropriate quality indicators for the publication of quality data in Switzerland.