13 results for 10 Technology

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 30.00%

Abstract:

A systematic characterization of the composition and structure of the bacterial cell-surface proteome and its complexes can provide an invaluable tool for its comprehensive understanding. Knowledge of the composition and structure of protein complexes could offer new, more specific and consequently more effective targets for an immune response directed against a complex rather than a single protein. Large-scale protein-protein interaction screens are the first step towards the identification of complexes and their attribution to specific pathways. Several methods currently exist for identifying protein interactions, and protein microarrays provide the most appealing alternative to existing techniques for high-throughput screening of protein-protein interactions in vitro under reasonably straightforward conditions. In this study, approximately 100 proteins of Group A Streptococcus (GAS) predicted to be secreted or surface-exposed by genomic and proteomic approaches were purified in His-tagged form and used to generate protein microarrays on nitrocellulose-coated slides. To identify protein-protein interactions, each purified protein was labeled with biotin and hybridized to the microarray, and interactions were detected with Cy3-labelled streptavidin. Only reciprocal interactions, i.e. binding of the same two interactors irrespective of which of the two partners is in the solid phase or in solution, were taken as bona fide protein-protein interactions. Using this approach, we identified 20 interactors of one of the potent toxins secreted by GAS and known as superantigens. Several of these interactors belong to the molecular chaperone or protein folding catalyst families and are presumably involved in the secretion and folding of the superantigen. In addition, a very interesting interaction was found between the superantigen and the substrate-binding subunit of a well characterized ABC transporter. This finding opens a new perspective on the current understanding of how superantigens are modified by the bacterial cell in order to become major players in causing disease.
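The reciprocity criterion above is essentially a filtering step over detected signal pairs. As a rough illustration, here is a minimal Python sketch of such a filter, assuming each detected signal is recorded as a (solid-phase protein, solution protein) pair; the protein names are hypothetical placeholders, not actual GAS identifiers.

```python
# Minimal sketch of the reciprocal-interaction filter described above.
# Each detected signal is a (solid_phase, solution) pair of protein
# identifiers; only pairs observed in both orientations are kept.
# Protein names below are placeholders, not actual GAS identifiers.

def reciprocal_interactions(detected_pairs):
    """Keep only interactions seen in both orientations."""
    observed = set(detected_pairs)
    kept = set()
    for solid_phase, solution in observed:
        if (solution, solid_phase) in observed:
            # Store with a canonical ordering so each pair appears once.
            kept.add(tuple(sorted((solid_phase, solution))))
    return kept

detected = [
    ("SAg", "chaperone_A"), ("chaperone_A", "SAg"),  # reciprocal -> kept
    ("SAg", "ABC_substrate_binding"),                # one-way -> discarded
]
print(reciprocal_interactions(detected))
# {('SAg', 'chaperone_A')}
```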

Relevance: 30.00%

Abstract:

Ancient pavements are composed of a variety of preparatory or foundation layers constituting the substrate, and of a layer of tesserae, pebbles or marble slabs forming the surface of the floor. In other cases, the surface consists of a beaten and polished mortar layer. The term mosaic is associated with the presence of tesserae or pebbles, while the more general term pavement is used in all cases. As past and modern excavations of ancient pavements have demonstrated, not all pavements display the stratigraphy of the substrate described in the ancient literary sources. In fact, the number and thickness of the preparatory layers, as well as the nature and properties of their constituent materials, often vary among pavements located in different sites, in different buildings within the same site, or even within the same building. For this reason, an investigation that takes into account the whole structure of the pavement is important when studying the archaeological context of the site where it is located, when designing materials for its maintenance and restoration, when documenting it, and when presenting it to the public. Five case studies, represented by archaeological sites containing floor mosaics and other kinds of pavements dated to the Hellenistic and Roman periods, were investigated by means of in situ and laboratory analyses. The results indicated that the characteristics of the studied pavements, namely the number and thickness of the preparatory layers and the properties of the mortars constituting them, vary according to the ancient use of the room where the pavements are located and to the type of surface upon which they were built. The study contributed to the understanding of the function and technology of the pavements’ substrate and to the characterization of its constituent materials. Furthermore, the research underlined the importance of investigating the whole structure of the pavement, including the foundation surface, in the interpretation of the archaeological context where it is located. A series of practical applications of the results of the research has been suggested, in the design of repair mortars for pavements, in the documentation of ancient pavements in conservation practice, and in the presentation of ancient pavements to the public, in situ and in museums.

Relevance: 30.00%

Abstract:

Hybrid technologies, thanks to the convergence of integrated microelectronic devices and a new class of microfluidic structures, could open new perspectives on the way nanoscale events are discovered, monitored and controlled. The key point of this thesis is to evaluate the impact of such an approach on applications of ion-channel High Throughput Screening (HTS) platforms. This approach offers promising opportunities for the development of new classes of sensitive, reliable and cheap sensors. There are numerous advantages to embedding microelectronic readout structures tightly coupled to the sensing elements. On the one hand, the signal-to-noise ratio is increased as a result of scaling. On the other, the readout miniaturization allows organization of sensors into arrays, increasing the capability of the platform in terms of the number of acquired data points, as required in the HTS approach, to improve sensing accuracy and reliability. However, accurate interface design is required to establish efficient communication between ionic-based and electronic-based signals. The work presented in this thesis shows a first example of a complete parallel readout system with single ion channel resolution, using a compact and scalable hybrid architecture suitable for interfacing to large arrays of sensors, ensuring simultaneous signal recording and smart control of the signal-to-noise ratio and bandwidth trade-off. More specifically, an array of microfluidic polymer structures, hosting artificial lipid bilayer blocks in which single ion channel pores are embedded, is coupled with an array of ultra-low-noise current amplifiers for signal amplification and data processing. As a working demonstration, the platform was used to acquire the ultra-small currents arising from single non-covalent molecular binding events between alpha-hemolysin pores and beta-cyclodextrin molecules in artificial lipid membranes.
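As a rough illustration of the kind of signal processing such a readout platform performs, below is a minimal Python sketch of threshold-based blockade detection on a simulated single-channel current trace; all values (current levels, noise, sampling rate) are illustrative assumptions, not measurements from the platform.

```python
# Toy sketch of threshold-based event detection on a single-channel
# current trace, as a readout like the one described might perform.
# All values (currents, noise, threshold) are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
fs = 10_000                       # sampling rate, Hz (assumed)
t = np.arange(2 * fs) / fs
open_pore, blocked = 100.0, 60.0  # pA, hypothetical current levels

# Simulate a trace: pore blocked between 0.5 s and 0.8 s.
current = np.full_like(t, open_pore)
current[(t > 0.5) & (t < 0.8)] = blocked
current += rng.normal(0, 2.0, t.size)  # additive readout noise

# Detect blockade events as excursions below a mid-level threshold.
threshold = (open_pore + blocked) / 2
below = current < threshold
edges = np.diff(below.astype(int))
starts, ends = np.where(edges == 1)[0], np.where(edges == -1)[0]
for s, e in zip(starts, ends):
    print(f"event: {t[s]:.3f}-{t[e]:.3f} s, "
          f"mean current {current[s:e].mean():.1f} pA")
```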

Relevance: 30.00%

Abstract:

My PhD project focused on the Atlantic bluefin tuna, Thunnus thynnus, a fishery resource overexploited in recent decades. For better management of the stocks, it was necessary to improve scientific knowledge of this species and to develop novel tools to avoid the collapse of this important commercial resource. To do this, we used new high-throughput sequencing technologies, such as Next Generation Sequencing (NGS), and markers linked to expressed genes, such as SNPs (Single Nucleotide Polymorphisms). In this work we applied a combined approach: transcriptomic resources were used to build cDNA libraries from mRNA isolated from muscle, and genomic resources allowed us to create a reference backbone for this species, which lacks a reference genome. All cDNA reads obtained from mRNA were mapped against this genome and, employing several bioinformatics tools and stringent parameters, we obtained a set of contigs from which to detect SNPs. Once a final panel of 384 SNPs had been developed following the selection criteria, it was genotyped in 960 individuals of Atlantic bluefin tuna, including all size/age classes from larvae to adults, collected across the entire range of the species. The analysis of the resulting data aimed to evaluate the genetic diversity and population structure of Thunnus thynnus. We detected a low but significant signal of genetic differentiation among spawning samples, which suggests the presence of three genetically separate reproduction areas. The adult samples, by contrast, proved genetically undifferentiated both among themselves and from the spawning populations, indicating a panmictic population of adult bluefin tuna in the Mediterranean Sea, without distinct metapopulations.
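To illustrate the kind of per-SNP population analysis described, here is a minimal Python sketch computing allele frequencies and a simple Wright's FST estimate for one biallelic SNP in two sample groups; the genotype data are invented, and the estimator is the textbook (HT − HS)/HT form rather than the one actually used in the thesis.

```python
# Minimal sketch of per-SNP differentiation between two sample groups:
# allele frequencies and Wright's Fst = (Ht - Hs) / Ht for a biallelic
# locus. Genotypes are coded 0/1/2 = copies of the alternate allele;
# the data below are made up for illustration.
import numpy as np

def allele_freq(genotypes):
    """Alternate-allele frequency from 0/1/2 genotype codes."""
    g = np.asarray(genotypes, dtype=float)
    return g.sum() / (2 * g.size)

def fst(pop1, pop2):
    p1, p2 = allele_freq(pop1), allele_freq(pop2)
    hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2  # mean within-pop het.
    p_bar = (p1 + p2) / 2
    ht = 2 * p_bar * (1 - p_bar)                      # total heterozygosity
    return 0.0 if ht == 0 else (ht - hs) / ht

spawning_area_1 = [0, 0, 1, 0, 1, 0]  # hypothetical genotypes at one SNP
spawning_area_2 = [1, 2, 1, 2, 1, 1]
print(f"Fst = {fst(spawning_area_1, spawning_area_2):.3f}")
```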

Relevance: 30.00%

Abstract:

This Doctoral Thesis unfolds into a collection of three distinct papers that share an interest in institutional theory and technology transfer. Taking into account that organizations are increasingly exposed to a multiplicity of demands and pressures, we aim to analyze what renders this situation of institutional complexity more or less difficult for organizations to manage, and what makes organizations more or less successful in responding to it. The three studies offer a novel contribution both theoretically and empirically. In particular, the first paper, “The dimensions of organizational fields for understanding institutional complexity: A theoretical framework”, is a theoretical contribution that seeks to better understand the relationship between institutional complexity and fields by providing a framework. The second article, “Beyond institutional complexity: The case of different organizational successes in confronting multiple institutional logics”, is an empirical study which aims to explore the strategies that allow organizations facing multiple logics to respond to them more successfully. The third work, “How external support may mitigate the barriers to university-industry collaboration”, is oriented towards practitioners and presents a case study about technology transfer in Italy.

Relevance: 30.00%

Abstract:

Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmer’s responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages working at a high level of abstraction were proposed, which rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm on modern many-core systems, focusing on ease of programming. It focuses on OpenMP, the de facto standard for shared-memory programming. In the first part, the costs of algorithms for synchronization and data partitioning are analyzed, and these algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. In the second part of the thesis, the focus is on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders-of-magnitude gains in speedup and energy efficiency compared to the “pure software” version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares the memory banks with the cores, and a template for a scalable architecture is shown, which integrates the HWPUs through the shared-memory system. Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the accelerators of the platform. The OpenMP frontend is extended to interact with it.
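As a rough illustration of the data-partitioning algorithms analyzed in the first part, the sketch below reproduces in Python the contiguous, near-equal chunking that an OpenMP runtime typically applies for statically scheduled loops; it is a language-neutral model of the computation, not code from the runtime itself, and the core count is illustrative.

```python
# Minimal sketch of the static loop partitioning an OpenMP runtime
# performs for "#pragma omp for schedule(static)": iterations are split
# into near-equal contiguous chunks, one per thread.

def static_partition(n_iterations, n_threads):
    """Return per-thread (start, end) half-open iteration ranges."""
    base, extra = divmod(n_iterations, n_threads)
    ranges, start = [], 0
    for tid in range(n_threads):
        size = base + (1 if tid < extra else 0)  # spread the remainder
        ranges.append((start, start + size))
        start += size
    return ranges

# e.g. 100 iterations over a hypothetical 16-core cluster:
for tid, (lo, hi) in enumerate(static_partition(100, 16)):
    print(f"thread {tid:2d}: iterations [{lo}, {hi})")
```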

Relevance: 30.00%

Abstract:

Introduction. Glycomic analysis allows the investigation of the global glycome in body fluids (such as serum/plasma); this could eventually lead to the identification of new types of disease biomarkers or, as in this study, biomarkers of human aging obtained by studying specific aging models. Recent studies demonstrated that the plasma N-glycome is modified during human aging, suggesting that the log-ratio of two serum/plasma N-glycans (NGA2F and NA2F), named the GlycoAge test, could provide a non-invasive biomarker of aging. Down syndrome (DS) is a genetic disorder in which multiple major aspects of the senescent phenotype occur much earlier than in healthy age-matched subjects, and it has often been defined as an accelerated aging syndrome. The aim of this study was to compare the plasma N-glycome of patients affected by DS with age- and sex-matched non-affected controls, represented by their siblings (DSS), in order to assess whether DS is characterized by a specific N-glycomic pattern. Furthermore, in order to investigate whether the N-glycan changes that occur in DS reveal accelerated aging in DS patients, we enrolled the mothers (DSM) of the DS and DSS subjects, representing a non-affected control group with a different chronological age from the DS group. We applied two different N-glycomic approaches to the same samples: first, in order to study the complete plasma N-glycome, we applied a new high-sensitivity protocol based on a MALDI-TOF-MS approach; second, we used DSA-FACE technology. Results: MALDI-TOF/MS analysis detected a specific N-glycomic signature for DS, characterized by an increase in fucosylated and bisecting species. Moreover, in DS the abundance of agalactosylated species (such as NA2F) was similar to or higher than that of their mothers. The measurement of the GlycoAge test with DSA-FACE, also validated by MALDI-TOF, demonstrated a strong association with age; moreover, in DS its value was similar to that of their mothers and significantly higher than that of their age- and sex-matched non-affected siblings.
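For concreteness, here is a minimal sketch of the GlycoAge computation as defined above, i.e. the log-ratio of the NGA2F and NA2F peak abundances; the peak intensities used are invented for illustration, not measured data.

```python
# Minimal sketch of the GlycoAge test described above: the log-ratio of
# the NGA2F and NA2F N-glycan peak abundances, which rises with
# chronological age according to the abstract. Values are illustrative.
import math

def glycoage(nga2f, na2f):
    """GlycoAge = log(NGA2F / NA2F)."""
    return math.log(nga2f / na2f)

# Hypothetical relative peak intensities from a DSA-FACE profile:
print(f"GlycoAge = {glycoage(0.8, 4.0):.2f}")  # younger-like profile
print(f"GlycoAge = {glycoage(2.0, 2.5):.2f}")  # older-like profile
```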

Relevance: 30.00%

Abstract:

The present PhD dissertation is dedicated to the general topic of knowledge transfer from academia to industry and the role of various measures at both the institutional and university levels in support of the commercialization of university research. The overall contribution of the dissertation lies in presenting an in-depth and comprehensive analysis of the main critical issues that currently exist with regard to the commercial exploitation of academic research, while providing evidence on the role of previously underexplored areas (e.g. strategic use of academic patents; female academic patenting) in the general debate on the ways to successful knowledge transfer from academia to industry. The first paper included in the dissertation develops a taxonomy of the literature, based on a comprehensive review of the existing body of research on government measures in support of knowledge transfer from academia to industry. The results of the review reveal a considerable gap in the analysis of the impact and relative effectiveness of public policy measures, especially as regards measures aimed at building knowledge and expertise among academic faculty and technology transfer agents. The second paper focuses on the role of interorganizational collaborations and their effect on the likelihood of an academic patent remaining unused, and points to the strategic management of patents by universities. In the third paper I turn to the issue of female participation in patenting and commercialization; in particular, I find evidence of the positive role of the university and its internal support structures in closing the gender gap in female academic patenting. The results of the research carried out for this dissertation provide important implications for policy makers in crafting measures to increase the efficient use of the university knowledge stock.

Relevance: 30.00%

Abstract:

Despite the several issues faced in the past, the evolutionary trend of silicon has kept its constant pace. Today an ever-increasing number of cores is integrated onto the same die. Unfortunately, the extraordinary performance achievable by the many-core paradigm is limited by several factors. Memory bandwidth limitations, combined with inefficient synchronization mechanisms, can severely curtail the potential computational capabilities. Moreover, the huge HW/SW design space requires accurate and flexible tools to perform architectural explorations and to validate design choices. In this thesis we focus on the aforementioned aspects: a flexible and accurate Virtual Platform has been developed, targeting a reference many-core architecture. This tool has been used to perform architectural explorations, focusing on the instruction caching architecture and on hybrid HW/SW synchronization mechanisms. Besides the architectural implications, another issue of embedded systems is considered: energy efficiency. Near Threshold Computing (NTC) is a key research area in the Ultra-Low-Power domain, as it promises a tenfold improvement in energy efficiency compared to super-threshold operation and mitigates thermal bottlenecks. The physical implications of modern deep sub-micron technology severely limit the performance and reliability of modern designs. Reliability becomes a major obstacle when operating in NTC; in particular, memory operation becomes unreliable and can compromise system correctness. In the present work a novel hybrid memory architecture is devised to overcome these reliability issues and at the same time improve energy efficiency by means of aggressive voltage scaling when workload requirements allow it. Variability is another great drawback of near-threshold operation. The greatly increased sensitivity to threshold voltage variations is today a major concern for electronic devices. We introduce a variation-tolerant extension of the baseline many-core architecture. By means of micro-architectural knobs and a lightweight runtime control unit, the baseline architecture becomes dynamically tolerant to variations.
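As a first-order illustration of the near-threshold trade-off mentioned above, the sketch below combines the standard dynamic-energy scaling (energy per operation proportional to V²) with the Sakurai-Newton alpha-power delay model; the parameter values are illustrative assumptions, and leakage, which in practice limits NTC gains, is ignored.

```python
# First-order sketch of why near-threshold operation improves energy
# efficiency: dynamic energy per operation scales as V^2, while delay
# grows as Vdd approaches the threshold voltage (Sakurai-Newton
# alpha-power law, delay ~ V / (V - Vt)^alpha). Parameters and the
# omission of leakage are simplifying assumptions.

ALPHA, VT = 1.5, 0.35  # alpha-power exponent and threshold voltage (V)

def energy_per_op(v):
    """Dynamic energy per operation, ~ C * V^2 with C normalized to 1."""
    return v ** 2

def delay(v):
    """Relative gate delay from the alpha-power law (normalized)."""
    return v / (v - VT) ** ALPHA

for v in (1.1, 0.9, 0.7, 0.5):  # super-threshold down to near-threshold
    print(f"Vdd={v:.1f} V: energy/op={energy_per_op(v):.2f}, "
          f"relative delay={delay(v):.1f}")
```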

Relevance: 30.00%

Abstract:

Precision Agriculture (PA) and the more specific branch of Precision Horticulture are two very promising sectors. They focus on the use of technologies in agriculture to optimize the use of inputs, so as to reach better efficiency and minimize waste of resources. This important objective has motivated many researchers and companies to search for new technological solutions. Sometimes the effort proved to be a good seed, and sometimes an unfeasible idea. As a result, PA, from its birth roughly 25 years ago, is still a “new” form of management, interesting for the future, but experts and researchers still report a low adoption rate. This work aims to contribute to finding the causes of this low adoption rate and to propose a methodological solution to the problem. The first step was to examine prior research about Precision Agriculture adoption, through both ex ante and ex post approaches. It was considered important to find connections between these two phases of the purchase experience. In fact, ex ante studies deal with a potential consumer’s perceptions before a usage experience has occurred, therefore before purchasing a technology, while ex post studies describe the drivers that made a farmer become an end-user of PA technology. Then, an example of consumer research is presented. This was an ex ante study focused on a pre-prototype technology for fruit production. This kind of research can give valuable information about consumer acceptance before an advanced development phase of the technology is reached, and thus offer the possibility of changing the design with the least financial impact. The final step was to develop the pre-prototype technology that was the subject of the consumer acceptance research and to test its technical characteristics.

Relevance: 30.00%

Abstract:

Chapter 1 studies how consumers’ switching costs affect the pricing and profits of firms competing in two-sided markets, such as Apple and Google in the smartphone market. When two-sided markets are dynamic – rather than merely static – I show that switching costs lower the first-period price if network externalities are strong, in contrast to what has been found in one-sided markets. By contrast, switching costs soften price competition in the initial period if network externalities are weak and consumers are more patient than the platforms. Moreover, an increase in switching costs on one side decreases the first-period price on the other side. Chapter 2 examines firms’ incentives to invest in local and flexible resources when demand is uncertain and correlated. I find that the market power of the monopolist providing flexible resources distorts investment incentives, while competition mitigates this distortion. The extent of the improvement depends critically on demand correlation and the cost of capacity: under the social optimum and monopoly, if the flexible resource is cheap the relationship between investment and correlation is positive, and if it is costly the relationship becomes negative; under duopoly, the relationship is positive. The analysis also sheds light on some policy discussions in markets such as cloud computing. Chapter 3 develops a theory of sequential investments in cybersecurity. The regulator can use safety standards and liability rules to increase security. I show that the joint use of an optimal standard and a full liability rule leads to underinvestment ex ante and overinvestment ex post. Instead, switching to a partial liability rule can correct the inefficiencies. This suggests that to improve security, the regulator should encourage not only firms but also consumers to invest in security.

Relevance: 30.00%

Abstract:

Coastal sand dunes represent a resource, first of all in terms of defence against storm waves and saltwater ingression; moreover, these morphological elements constitute a unique transitional ecosystem between the marine and terrestrial environments. Research on dune systems has been a strong component of coastal science since the last century. Nowadays this branch has assumed even more importance, for two reasons: on one side, the emergence of brand-new technologies, especially those related to Remote Sensing, has expanded researchers’ possibilities; on the other side, today’s intense urbanization has strongly limited the dunes’ possibilities of development and fragmented what remained from the last century. This is particularly true in the Ravenna area, where industrialization, combined with the tourist economy and intense subsidence, has left only a few residual dune ridges still active. In this work, three different foredune ridges along the Ravenna coast were studied with Laser Scanner technology. The research was not limited to analyzing volumetric or spatial differences, but also tried to find new ways and new features to monitor this environment. Moreover, the author planned a series of tests to validate data from the Terrestrial Laser Scanner (TLS), with the additional aim of finalizing a methodology to test 3D survey accuracy. Data acquired by TLS were then used, on the one hand, to test some brand-new applications, such as the Digital Shoreline Analysis System (DSAS) and Computational Fluid Dynamics (CFD), to prove their efficacy in this field; on the other hand, the author used TLS data to look for correlations with meteorological indexes (forcing factors) linked to sea and wind (Fryberger’s method), applying statistical tools such as Principal Component Analysis (PCA).
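As a rough illustration of the PCA step, below is a minimal numpy sketch that extracts principal components from a small matrix whose rows are survey intervals and whose columns are morphological-change and forcing-factor variables; all the data and variable names are invented for illustration.

```python
# Minimal sketch of the PCA described above: rows are survey intervals,
# columns are morphological-change and forcing-factor variables (e.g.
# TLS volume change, wind drift potential, a sea-storm index).
# All numbers are made up for illustration.
import numpy as np

rng = np.random.default_rng(1)
# columns: [dune volume change, wind drift potential, sea storm index]
X = rng.normal(size=(12, 3))
X[:, 0] = 0.8 * X[:, 1] - 0.3 * X[:, 2] + 0.1 * X[:, 0]  # induce correlation

Xc = X - X.mean(axis=0)                 # center the variables
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]       # sort components by variance
explained = eigvals[order] / eigvals.sum()
print("explained variance ratio:", np.round(explained, 2))
print("PC1 loadings:", np.round(eigvecs[:, order[0]], 2))
```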

Relevance: 30.00%

Abstract:

The thesis aims to make the dynamics of the tradeoffs involving privacy more visible, both theoretically and in two of the central current policy debates in European data protection law: the right to be forgotten and online tracking. In doing so, it offers an explanation of data protection law from an economic perspective and provides a basis for the evaluation of further data protection measures.