10 results for Incremental Information-content
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Most current ultra-miniaturized devices are obtained by the top-down approach, in which nanoscale components are fabricated by cutting down larger precursors. Since this physical-engineering method is reaching its limits, especially for components below 30 nm in size, alternative strategies are necessary. Of particular appeal to chemists is the supramolecular bottom-up approach to nanotechnology, a methodology that uses the principles of molecular recognition to build materials and devices from molecular components. The subject of this thesis is the photophysical and electrochemical investigation of nanodevices obtained by harnessing the principles of supramolecular chemistry. These systems operate in solution and are investigated at the ensemble level. The majority of the chemical systems discussed here are based on pseudorotaxanes and catenanes. Such supramolecular systems represent prototypes of molecular machines, since they are capable of performing simple controlled mechanical movements. Their properties and operation are strictly related to the supramolecular interactions between molecular components (generally photoactive or electroactive molecules) and to the possibility of modulating such interactions by means of external stimuli. The main issues addressed throughout the thesis are: (i) the analysis of the factors that can affect the architecture and perturb the stability of supramolecular systems; (ii) the possibility of controlling the direction of supramolecular motions by exploiting the molecular information content; (iii) the development of switchable supramolecular polymers starting from simple host-guest complexes; (iv) the capability of some molecular machines to process information at the molecular level, thus behaving as logic devices; (v) the behaviour of molecular machine components in a biological-type environment; (vi) the study of chemically functionalized metal nanoparticles by second-harmonic generation spectroscopy.
Abstract:
Among the scientific objectives addressed by the Radio Science Experiment hosted on board the ESA mission BepiColombo is the retrieval of the rotational state of planet Mercury. Estimates of the obliquity and of the libration amplitude have been proven fundamental for constraining the interior composition of Mercury. This task is accomplished by the Mercury Orbiter Radio science Experiment (MORE) through a close interaction among different payloads, which makes the experiment particularly challenging. The underlying idea is to capture images of the same landmark on the surface of the planet at different epochs and to observe the displacement of the identified features with respect to a nominal rotation, which allows the rotational parameters to be estimated. Observations must be planned accurately in order to obtain image pairs carrying the highest information content for the subsequent estimation process. This is not a trivial task, especially in light of the several dynamical constraints involved. Another delicate issue is the pattern-matching process between image pairs, for which the lowest correlation errors are desired. The research activity was conducted in the framework of the MORE rotation experiment and addressed the design and implementation of an end-to-end simulator of the experiment, with the final objective of establishing an optimal science planning of the observations. The thesis illustrates the implementation of the individual modules forming the simulator, along with the simulations performed. Results obtained with a preliminary release of the optimization algorithm are also presented; the software will be improved and refined in the future, also taking into account the developments of the mission.
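The pattern-matching step between image pairs can be illustrated with a toy template-matching routine based on normalized cross-correlation. This is a generic sketch, not the MORE pipeline: `best_shift` is a hypothetical helper performing an exhaustive search for the displacement of a landmark patch.

```python
import numpy as np

def normalized_cross_correlation(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized image patches.
    Returns a value in [-1, 1]; 1 means a perfect match."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)

def best_shift(image, template):
    """Exhaustively slide `template` over `image` and return the offset with
    the highest correlation score (a toy landmark-displacement search)."""
    H, W = image.shape
    h, w = template.shape
    best = (-2.0, (0, 0))
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            score = normalized_cross_correlation(image[i:i + h, j:j + w], template)
            if score > best[0]:
                best = (score, (i, j))
    return best

# synthetic check: the template is cut from the image itself at a known offset
rng = np.random.default_rng(0)
img = rng.random((20, 20))
tpl = img[5:10, 7:12].copy()
score, offset = best_shift(img, tpl)
```

In the real experiment the displacement of the matched feature with respect to the nominal rotation is what feeds the estimation of the rotational parameters.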
Abstract:
The idea of matching the resources spent in the acquisition and encoding of natural signals to their intrinsic information content has driven nearly a decade of research under the name of compressed sensing. In this doctoral dissertation we develop some extensions and improvements upon this technique's foundations, by modifying the random sensing matrices on which the signals of interest are projected in order to achieve different objectives. Firstly, we propose two methods for adapting sensing matrix ensembles to the second-order moments of natural signals. These techniques leverage the maximisation of different proxies for the quantity of information acquired by compressed sensing, and are efficiently applied to the encoding of electrocardiographic tracks with minimum-complexity digital hardware. Secondly, we focus on the possibility of using compressed sensing as a method to provide a partial, yet cryptanalysis-resistant, form of encryption; in this context, we show how a random matrix generation strategy with a controlled amount of perturbations can be used to distinguish between multiple user classes with different quality of access to the encrypted information content. Finally, we explore the application of compressed sensing to the design of a multispectral imager, by implementing an optical scheme that entails a coded aperture array and Fabry-Pérot spectral filters. The signal recoveries obtained by processing real-world measurements show promising results and leave room for improvements in the sensing matrix calibration problem of the devised imager.
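The acquisition model underlying compressed sensing, y = Ax with a random sensing matrix A and a sparse signal x, can be sketched as follows. Orthogonal Matching Pursuit is used here purely as a standard textbook recovery algorithm, not as the dissertation's own method; the sizes and sparsity level are illustrative.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x with y ~= A @ x."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # pick the column most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # re-fit all selected coefficients jointly by least squares
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

# toy acquisition: a 3-sparse signal of length 64 from 32 random projections
rng = np.random.default_rng(1)
n, m, k = 64, 32, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
x = np.zeros(n)
x[[5, 17, 40]] = [1.0, -2.0, 0.5]              # sparse "natural signal"
y = A @ x                                      # compressed measurements
x_rec = omp(A, y, k)
```

The dissertation's contribution concerns shaping the ensemble from which A is drawn (e.g. adapting it to second-order signal statistics, or perturbing it for multi-class encryption), which this sketch leaves purely random.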
Abstract:
Computational Biology and Bioinformatics presently integrate many different areas of expertise, including computer science and electronic engineering. A major aim in Data Science is the development and tuning of specific computational approaches to interpret the complexity of Biology. Molecular biologists and medical doctors heavily rely on interdisciplinary experts capable of understanding the biological background in order to apply algorithms that find optimal solutions to their problems. With this problem-solving orientation, I was involved in two basic research fields: Cancer Genomics and Enzyme Proteomics. What I developed and implemented can therefore be considered a general effort to support data analysis both in Cancer Genomics and in Enzyme Proteomics, focusing on enzymes, which catalyse all the biochemical reactions in cells. Specifically, in Cancer Genomics I contributed to the characterization of the intratumoral immune microenvironment in gastrointestinal stromal tumours (GISTs), correlating immune cell population levels with tumour subtypes. I was involved in the setup of strategies for the evaluation and standardization of different approaches for fusion transcript detection in sarcomas that can be applied in routine diagnostics. This was part of a coordinated effort of the Sarcoma working group of "Alleanza Contro il Cancro". In Enzyme Proteomics, I generated a derived database collecting all the human proteins and enzymes known to be associated with genetic diseases. I curated the data search in freely available databases such as PDB, UniProt, Humsavar and ClinVar, and I was responsible for searching, updating and handling the information content, and for computing statistics. I also developed a web server, BENZ, which allows researchers to annotate an enzyme sequence with the corresponding Enzyme Commission number, the key feature fully describing the catalysed reaction. In addition, I contributed substantially to the characterization of enzyme-genetic disease associations, towards a better classification of metabolic genetic diseases.
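Enzyme Commission numbers, which BENZ assigns, describe a catalysed reaction through four dot-separated hierarchical fields (class, subclass, sub-subclass, serial number). The following minimal parser is purely illustrative and is not BENZ's code; it handles complete EC numbers only (partial annotations using "-" or "n" placeholders are not covered).

```python
# Top-level EC classes (the first field of an EC number)
EC_CLASSES = {
    1: "Oxidoreductases", 2: "Transferases", 3: "Hydrolases",
    4: "Lyases", 5: "Isomerases", 6: "Ligases", 7: "Translocases",
}

def parse_ec(ec: str):
    """Split a complete EC number such as '2.7.11.1' into its four integer levels."""
    fields = ec.split(".")
    if len(fields) != 4:
        raise ValueError(f"expected 4 fields, got {ec!r}")
    return tuple(int(f) for f in fields)

def ec_class_name(ec: str) -> str:
    """Return the name of the top-level class of an EC number."""
    return EC_CLASSES[parse_ec(ec)[0]]
```

For example, `ec_class_name("1.1.1.1")` identifies alcohol dehydrogenase's class as an oxidoreductase.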
Abstract:
In this thesis, we investigate the role of applied physics in epidemiological surveillance through the application of mathematical models, network science and machine learning. The spread of a communicable disease depends on many biological, social and health factors. The large masses of data available make it possible, on the one hand, to monitor the evolution and spread of pathogenic organisms and, on the other hand, to study the behavior of people, their opinions and their habits. We present three lines of research in which we tackled real epidemiological problems through data analysis and the use of statistical and mathematical models. In Chapter 1, we applied language-inspired Deep Learning models to transform influenza protein sequences into vectors encoding their information content. We then attempted to reconstruct the antigenic properties of different viral strains using regression models and to identify the mutations responsible for vaccine escape. In Chapter 2, we constructed a compartmental model to describe the spread of a bacterium within a hospital ward. The model was informed and validated on time series of clinical measurements, and a sensitivity analysis was used to assess the impact of different control measures. Finally, in Chapter 3, we reconstructed the network of retweets among COVID-19-themed Twitter users in the early months of the SARS-CoV-2 pandemic. By means of community detection algorithms and centrality measures, we characterized users’ attention shifts in the network, showing that scientific communities, initially the most retweeted, lost influence over time to national political communities. In the Conclusion, we highlight the importance of this work in light of the main contemporary challenges for epidemiological surveillance. In particular, we reflect on the importance of nowcasting and forecasting, on the relationship between data and scientific research, and on the need to unite the different scales of epidemiological surveillance.
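The compartmental approach of Chapter 2 can be sketched, in a deliberately generic form, as a two-compartment SIS (susceptible-infected-susceptible) system integrated with forward Euler. The thesis's actual model and its clinically informed parameters are not reproduced here; all values below are hypothetical.

```python
import numpy as np

def simulate_sis(beta, gamma, N, I0, days, dt=0.1):
    """Forward-Euler integration of a two-compartment SIS model:
        dS/dt = -beta*S*I/N + gamma*I
        dI/dt =  beta*S*I/N - gamma*I
    Returns the trajectory of the infected count I(t)."""
    S, I = float(N - I0), float(I0)
    trajectory = [I]
    for _ in range(int(days / dt)):
        new_infections = beta * S * I / N * dt
        recoveries = gamma * I * dt
        S += recoveries - new_infections
        I += new_infections - recoveries
        trajectory.append(I)
    return np.array(trajectory)

# hypothetical 20-bed ward with one initial carrier
traj = simulate_sis(beta=0.5, gamma=0.1, N=20, I0=1, days=200)
```

With beta > gamma the infected count settles at the endemic equilibrium I* = N(1 - gamma/beta), here 16 of 20 beds; sensitivity analyses of the kind mentioned in the abstract vary beta and gamma to mimic control measures.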
Abstract:
The aim of this thesis is to investigate the topic of semantic under-determinacy, i.e. the failure of the semantic content of certain expressions to determine a truth-evaluable utterance content. In the first part of the thesis, I engage with the problem of setting semantic under-determinacy apart from other phenomena such as ambiguity, vagueness and indexicality. As I argue, the feature that distinguishes semantic under-determinacy from these phenomena is that it is explainable solely in terms of under-articulation. In the second part of the thesis, I discuss how communication is possible despite the semantic under-determinacy of language. I examine a number of answers that have been offered: (i) the Radical Contextualist explanation, which emphasises the role of pragmatic processes in utterance comprehension; (ii) the Indexicalist explanation in terms of hidden syntactic positions; (iii) the Relativist account, which regards sentences as true or false relative to extra coordinates in the circumstances of evaluation (besides possible worlds). In the final chapter, I propose an account of the comprehension of utterances of semantically under-determined sentences in terms of conceptual constraints, i.e. ways of organising information which regulate thought and discourse on certain matters. Conceptual constraints help the hearer to work out the truth-conditions of an utterance of a semantically under-determined sentence. Their role is clearly semantic, in that they contribute to “what is said” (rather than to “what is implied”); however, they do not respond to any syntactic constraint. The view I propose therefore differs, on the one hand, from Radical Contextualism, because it stresses the role of semantically governed processes as opposed to pragmatically governed ones; on the other hand, it differs from Indexicalism in not endorsing any commitment to hidden syntactic positions; and it differs from Relativism in that it maintains a monadic notion of truth.
Abstract:
In this thesis, the author presents a query language for an RDF (Resource Description Framework) database and discusses its applications in the context of the HELM project (the Hypertextual Electronic Library of Mathematics). This language aims at meeting the main requirements coming from the RDF community; in particular, it includes: a human-readable textual syntax and a machine-processable XML (Extensible Markup Language) syntax, both for queries and for query results; a rigorously specified formal semantics; a graph-oriented RDF data access model capable of exploring an entire RDF graph (including both RDF Models and RDF Schemata); a full set of Boolean operators to compose the query constraints; fully customizable and highly structured query results with a 4-dimensional geometry; and some constructions taken from ordinary programming languages that simplify the formulation of complex queries. The HELM project aims at integrating modern tools for the automation of formal reasoning with the most recent electronic publishing technologies, in order to create and maintain a hypertextual, distributed virtual library of formal mathematical knowledge. In the spirit of the Semantic Web, the documents of this library include RDF metadata describing their structure and content in a machine-understandable form. Using the author's query engine, HELM exploits this information to implement functionalities allowing the interactive and automatic retrieval of documents on the basis of content-aware requests that take into account the mathematical nature of these documents.
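To illustrate the kind of conjunctive, Boolean-composable constraints an RDF query language evaluates, here is a toy in-memory triple matcher. The syntax and semantics of the thesis's language differ; the `?variable` convention and the `dc:` property names below are illustrative assumptions, not taken from HELM.

```python
def match(triples, patterns):
    """Return all variable bindings satisfying every (s, p, o) pattern (logical AND).
    Terms starting with '?' are variables; anything else must match literally."""
    results = [{}]
    for pattern in patterns:
        next_results = []
        for binding in results:
            for triple in triples:
                new = dict(binding)
                ok = True
                for term, value in zip(pattern, triple):
                    if term.startswith("?"):
                        if term in new and new[term] != value:
                            ok = False
                            break
                        new[term] = value
                    elif term != value:
                        ok = False
                        break
                if ok:
                    next_results.append(new)
        results = next_results
    return results

triples = [
    ("doc1", "dc:type", "Theorem"),
    ("doc1", "dc:refines", "doc2"),
    ("doc2", "dc:type", "Definition"),
]
# "find every theorem together with the definition it refines"
bindings = match(triples, [("?t", "dc:type", "Theorem"),
                           ("?t", "dc:refines", "?d"),
                           ("?d", "dc:type", "Definition")])
```

A real engine would add disjunction and negation operators, schema-aware traversal, and structured result formatting, which is where the language described in the thesis goes well beyond this sketch.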
Abstract:
This research aims to contribute to a better understanding of changes in local governments’ accounting and reporting practices: in particular, why, what and how environmental aspects are included, and the significance of changes across time. It adopts an interpretative approach to conduct a longitudinal analysis of case studies. Pettigrew and Whipp’s framework on context, content and process is used as a lens to distinguish changes along each dimension and to analyse their interconnections. Data are collected from official documents and triangulated with semi-structured interviews. The legal framework defines the boundaries of the accounting information as the territory under local governments’ jurisdiction and its immediate surrounding area. Organisational environmental performance and externalities are excluded from the requirements. An interplay between the local outer context, political commitment and organisational culture justifies the implementation of changes beyond what is regulated, as well as the implementation of transformational changes. Local governments engage in international networks to gain access to funding and implement changes, which leads them to adopt the dominant environmental agenda. Key stakeholders, such as citizens, are not engaged in the accounting and reporting process. Thus, there is no evidence that the environmental aspects addressed, and the related changes, align with stakeholders’ needs and expectations, which jeopardises their significance. Findings from this research have implications for other EU member states, owing to the harmonisation of accounting and reporting practices and the common practice across the EU of using external funding to conceptualise and implement changes. This implies that other local governments may also be presenting a limited account of environmental aspects.
Abstract:
This dissertation consists of three standalone articles that contribute to the economics literature on technology adoption, information diffusion and network economics, using primary data sources from Ethiopia. The first empirical paper identifies the main behavioral factors affecting the adoption of brand-new (radical) and upgraded (incremental) bioenergy innovations in Ethiopia. The results highlight the importance of targeting different instruments to increase the adoption rate of the two types of innovations. The second and third empirical papers use primary data collected from 3,693 high school students in Ethiopia and shed light on how informants should be selected to disseminate new information, mainly concerning environmental issues, both effectively and equitably. Several well-recognized standard centrality measures are used to select informants. These standard measures, however, are based on the network topology (shaped only by the number of connections) and fail to incorporate the intrinsic motivations of the informants. This thesis introduces an augmented centrality measure (ACM) that modifies the eigenvector centrality measure by weighting the adjacency matrix with the altruism levels of connected nodes. The results from the two papers suggest that targeting informants based on both network position and behavioral attributes ensures a more effective and equitable (from a gender perspective) transmission of information in social networks than selecting informants on network centrality measures alone, notably when the information concerns environmental issues.
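The weighting idea behind the augmented centrality measure can be sketched as eigenvector-style centrality computed by power iteration on an adjacency matrix whose entries are scaled by the altruism score of the connected node. The exact ACM definition is given in the thesis; the network and scores below are illustrative.

```python
import numpy as np

def augmented_centrality(adj, altruism, iters=200):
    """Eigenvector-style centrality by power iteration on an adjacency matrix
    whose edge (i, j) is weighted by the altruism score of node j.
    `adj` is an n-by-n 0/1 matrix, `altruism` a length-n vector of weights."""
    W = adj * altruism[np.newaxis, :]   # scale each column j by altruism[j]
    x = np.ones(W.shape[0])
    for _ in range(iters):
        x = W @ x
        x /= np.linalg.norm(x)          # renormalize to avoid overflow
    return x

# toy network: a triangle (nodes 0-1-2) with a pendant node 3 attached to node 0
adj = np.array([[0, 1, 1, 1],
                [1, 0, 1, 0],
                [1, 1, 0, 0],
                [1, 0, 0, 0]], dtype=float)
scores = augmented_centrality(adj, altruism=np.ones(4))
```

With uniform altruism this reduces to plain eigenvector centrality (node 0, the best-connected, scores highest); heterogeneous altruism scores shift the ranking toward informants with motivated contacts, which is the dissertation's point.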
Abstract:
Most cognitive functions require the encoding and routing of information across distributed networks of brain regions. Information propagation is typically attributed to physical connections existing between brain regions, and contributes to the formation of spatially correlated activity patterns, known as functional connectivity. While structural connectivity provides the anatomical foundation for neural interactions, the exact manner in which it shapes functional connectivity is complex and not yet fully understood. Additionally, traditional measures of directed functional connectivity capture only the overall correlation between neural activity and provide no insight into the content of the transmitted information, limiting their usefulness for understanding the neural computations underlying the distributed processing of behaviorally relevant variables. In this work, we first study the relationship between structural and functional connectivity in simulated recurrent spiking neural networks with spike-timing-dependent plasticity. We use established measures of time-lagged correlation and overall information propagation to infer the temporal evolution of synaptic weights, showing that measures of dynamic functional connectivity can be used to reliably reconstruct the evolution of structural properties of the network. Then, we extend current methods for assessing directed causal communication between brain areas by deriving an information-theoretic measure of Feature-specific Information Transfer (FIT), quantifying the amount, content and direction of information flow. We test FIT on simulated data, showing its key properties and advantages over traditional measures of overall propagated information. We show applications of FIT to several neural datasets obtained with different recording methods (magneto- and electro-encephalography, spiking activity, local field potentials) during various cognitive functions, ranging from sensory perception to decision making and motor learning. Overall, these analyses demonstrate the ability of FIT to advance the investigation of communication between brain regions, uncovering the previously unaddressed content of directed information flow.
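The "overall propagated information" that FIT improves upon can be illustrated with the simplest such baseline: time-lagged mutual information between two discretized signals. FIT itself, which additionally resolves the stimulus feature being transmitted, is derived in the thesis and is not reproduced here; this sketch only shows the feature-blind quantity it is contrasted with.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from two paired discrete sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def lagged_mi(sender, receiver, lag):
    """Time-lagged MI: how much the sender's past tells about the receiver now.
    This is overall propagated information only; it says nothing about *what*
    feature of a stimulus is carried by the flow."""
    return mutual_information(sender[:-lag], receiver[lag:])

# toy channel: the receiver is an exact copy of the sender delayed by 2 samples
sender = [0, 1] * 50
receiver = [0, 0] + sender[:-2]
mi = lagged_mi(sender, receiver, lag=2)   # a balanced binary channel carries 1 bit
```

Because the receiver reproduces the sender perfectly at the matching lag, the lagged MI equals the sender's entropy (1 bit for a balanced binary sequence), yet the measure cannot say which behavioral variable that bit encodes, which is the gap FIT addresses.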