866 results for 10 Technology


Relevance:

30.00%

Publisher:

Abstract:

My PhD project focused on the Atlantic bluefin tuna, Thunnus thynnus, a fishery resource overexploited in recent decades. Better stock management requires improved scientific knowledge of this species and novel tools to avert the collapse of this important commercial resource. To this end, we used new high-throughput sequencing technologies, such as Next Generation Sequencing (NGS), and markers linked to expressed genes, such as SNPs (Single Nucleotide Polymorphisms). In this work we applied a combined approach: transcriptomic resources were used to build cDNA libraries from mRNA isolated from muscle, while genomic resources allowed us to create a reference backbone for this species, which lacks a reference genome. All cDNA reads obtained from mRNA were mapped against this backbone and, employing several bioinformatics tools and stringent parameters, we obtained a set of contigs from which to detect SNPs. Once a final panel of 384 SNPs had been developed following the selection criteria, it was genotyped in 960 Atlantic bluefin tuna individuals spanning all size/age classes, from larvae to adults, collected across the entire range of the species. The analysis of the resulting data aimed to evaluate the genetic diversity and population structure of Thunnus thynnus. We detected a low but significant signal of genetic differentiation among spawning samples, suggesting the presence of three genetically separate reproduction areas. The adult samples, by contrast, proved genetically undifferentiated both from one another and from the spawning populations, indicating a panmictic population of adult bluefin tuna in the Mediterranean Sea with no distinct metapopulations.
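For readers new to population genomics, the differentiation signal mentioned above is typically quantified with statistics such as Fst. Below is a minimal C sketch of a per-locus Wright's Fst estimate from biallelic genotype counts in two samples; the genotype counts and the simple two-population estimator are illustrative assumptions, not the thesis's actual pipeline or its exact statistic.

    /* Minimal sketch (not the thesis pipeline): per-locus Wright's Fst
       from biallelic SNP genotype counts in two population samples.
       g[0..2] = counts of genotypes AA, Aa, aa (hypothetical input). */
    #include <stdio.h>

    static double allele_freq(const int g[3]) {
        /* frequency of allele A from genotype counts AA, Aa, aa */
        double n = g[0] + g[1] + g[2];
        return n > 0 ? (2.0 * g[0] + g[1]) / (2.0 * n) : 0.0;
    }

    /* Fst = (Ht - Hs) / Ht: Ht is the expected heterozygosity of the
       pooled sample, Hs the mean within-population heterozygosity. */
    static double fst_two_pops(const int g1[3], const int g2[3]) {
        double p1 = allele_freq(g1), p2 = allele_freq(g2);
        double hs = (2.0 * p1 * (1.0 - p1) + 2.0 * p2 * (1.0 - p2)) / 2.0;
        double pbar = (p1 + p2) / 2.0;
        double ht = 2.0 * pbar * (1.0 - pbar);
        return ht > 0.0 ? (ht - hs) / ht : 0.0;
    }

    int main(void) {
        int popA[3] = {40, 45, 15};   /* hypothetical genotype counts */
        int popB[3] = {25, 50, 25};
        printf("per-locus Fst = %.4f\n", fst_two_pops(popA, popB));
        return 0;
    }

A "low but significant" differentiation signal of the kind reported above corresponds to Fst values only slightly above zero at many loci.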

Relevance:

30.00%

Publisher:

Abstract:

The spread throughout healthcare facilities of an ever-growing number of biomedical devices and "advanced" technologies for diagnosis and therapy has radically changed the approach to healthcare. This process of "technologization" makes evident the need for specific expertise and adequate organizational structures to guarantee efficient and correct management of these technologies, from both a technical and an economic point of view; these are needs that Clinical Engineering and Clinical Engineering Services have been meeting for about 40 years. In industrialized countries, economic growth has made it possible to finance new investments and technologically state-of-the-art facilities; on the other hand, the massive entry of technology into hospitals has contributed, together with other factors (a rising standard of living, growing urbanization, population aging, ...), to making healthcare spending uncontrollable and difficult to manage. Faced with an ever-wider and by now indispensable diffusion of biomedical technologies, a healthcare facility must therefore be able to choose appropriate technologies and use its equipment correctly, to guarantee the safety of patients and operators as well as the quality of the service delivered, and to reduce and optimize purchase and management costs. Given the need to guarantee the same services with fewer resources, it is essential to adopt the Health Technology Assessment (HTA) approach, both when introducing innovations and when choosing to disinvest from inappropriate or obsolete services that add no value to the protection of citizens' health. This work, after defining and classifying health technologies and analyzing the market of this sector and the health expenditure of the various OECD countries, focuses on Clinical Engineering Services and the key role they play in guaranteeing efficiency and cost-effectiveness, aided also by HTA profiles for healthcare procurement planning.

Relevance:

30.00%

Publisher:

Abstract:

This Doctoral Thesis unfolds into a collection of three distinct papers that share an interest in institutional theory and technology transfer. Taking into account that organizations are increasingly exposed to a multiplicity of demands and pressures, we aim to analyze what renders this situation of institutional complexity more or less difficult for organizations to manage, and what makes organizations more or less successful in responding to it. The three studies offer novel contributions both theoretically and empirically. In particular, the first paper, “The dimensions of organizational fields for understanding institutional complexity: A theoretical framework”, is a theoretical contribution that seeks to better understand the relationship between institutional complexity and fields by providing a framework. The second article, “Beyond institutional complexity: The case of different organizational successes in confronting multiple institutional logics”, is an empirical study that explores the strategies allowing organizations facing multiple logics to respond to them more successfully. The third work, “How external support may mitigate the barriers to university-industry collaboration”, is oriented towards practitioners and presents a case study about technology transfer in Italy.

Relevance:

30.00%

Publisher:

Abstract:

Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmer's responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages working at a high level of abstraction have been proposed, but they rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm to modern many-core systems, focusing on ease of programming. It concentrates on OpenMP, the de facto standard for shared-memory programming. In the first part, the costs of algorithms for synchronization and data partitioning are analyzed, and the algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. In the second part of the thesis, the focus is on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders-of-magnitude gains in speedup and energy efficiency compared to the “pure software” version. However, three main issues arise: i) platform design complexity, ii) architectural scalability, and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares the memory banks with the cores, and a template for a scalable architecture is shown, which integrates HWPUs through the shared-memory system. Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the platform's accelerators. The OpenMP frontend is extended to interact with it.
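For readers unfamiliar with these constructs, the sketch below shows, in plain OpenMP C (compiled with e.g. gcc -fopenmp), the two forms of parallelism named above: multi-level parallelism via nested thread teams, and irregular parallelism via dynamically created tasks. It is a generic illustration, not code from the thesis's runtime.

    /* Generic OpenMP illustration of multi-level (nested teams) and
       irregular (task-based) parallelism; not the thesis's runtime. */
    #include <omp.h>
    #include <stdio.h>

    static long fib(long n) {
        /* Irregular parallelism: the recursion shape is data-dependent,
           so work is expressed as dynamically created tasks. */
        if (n < 2) return n;
        long a, b;
        #pragma omp task shared(a)
        a = fib(n - 1);
        #pragma omp task shared(b)
        b = fib(n - 2);
        #pragma omp taskwait
        return a + b;
    }

    int main(void) {
        omp_set_nested(1);              /* enable multi-level parallelism */
        #pragma omp parallel num_threads(2)
        {
            /* each outer thread opens a second, nested team */
            #pragma omp parallel num_threads(4)
            printf("outer %d / inner %d\n",
                   omp_get_ancestor_thread_num(1), omp_get_thread_num());
        }

        long r;
        #pragma omp parallel
        #pragma omp single              /* one thread seeds the task graph */
        r = fib(20);
        printf("fib(20) = %ld\n", r);
        return 0;
    }

On the embedded many-cores targeted by the thesis, the cost of creating teams and tasks like these is precisely what the runtime design must keep low.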

Relevance:

30.00%

Publisher:

Abstract:

Introduction. Glycomic analysis allows the investigation of the global glycome in body fluids (such as serum/plasma); this could eventually lead to the identification of new disease biomarkers or, as in this study, biomarkers of human aging investigated in specific aging models. Recent studies demonstrated that the plasma N-glycome is modified during human aging, suggesting that the log-ratio of two serum/plasma N-glycans (NGA2F and NA2F), named the GlycoAge test, could provide a non-invasive biomarker of aging. Down syndrome (DS) is a genetic disorder in which multiple major aspects of the senescent phenotype occur much earlier than in healthy age-matched subjects, and it has often been defined as an accelerated aging syndrome. The aim of this study was to compare the plasma N-glycome of patients affected by DS with age- and sex-matched non-affected controls, represented by their siblings (DSS), in order to assess whether DS is characterized by a specific N-glycomic pattern. Furthermore, to investigate whether the N-glycan changes occurring in DS reveal accelerated aging in DS patients, we enrolled the mothers (DSM) of the DS and DSS subjects, representing a non-affected control group of different chronological age with respect to DS. We applied two different N-glycomic approaches to the same samples: first, to study the complete plasma N-glycome, we applied a new highly sensitive protocol based on a MALDI-TOF-MS approach; second, we used DSA-FACE technology. Results: MALDI-TOF/MS analysis detected a specific N-glycomic signature for DS, characterized by an increase in fucosylated and bisecting species. Moreover, in DS the abundance of agalactosylated species (such as NA2F) was similar to or higher than in their mothers. The GlycoAge test measured with DSA-FACE, and also validated by MALDI-TOF, showed a strong association with age; moreover, in DS its value was similar to that of the mothers and significantly higher than that of the age- and sex-matched non-affected siblings.
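Since the GlycoAge test is defined above as the log-ratio of two N-glycan levels, its computation is simple; a minimal C sketch follows, assuming normalized peak intensities as input (the numeric values and the natural-log choice are illustrative assumptions).

    /* Minimal sketch of the GlycoAge test as described above: the
       logarithm of the ratio between the NGA2F and NA2F N-glycan peak
       intensities. Inputs are hypothetical normalized peak areas. */
    #include <math.h>
    #include <stdio.h>

    static double glycoage(double nga2f, double na2f) {
        if (nga2f <= 0.0 || na2f <= 0.0) return NAN;  /* guard empty peaks */
        return log(nga2f / na2f);
    }

    int main(void) {
        double nga2f = 4.2, na2f = 9.7;   /* hypothetical peak areas */
        printf("GlycoAge = %.3f\n", glycoage(nga2f, na2f));
        return 0;
    }

Expressing the statistic as a log-ratio makes it symmetric around zero and invariant to any common scaling of the two peak intensities, which is convenient when comparing measurements across platforms such as MALDI-TOF-MS and DSA-FACE.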

Relevance:

30.00%

Publisher:

Abstract:

The present PhD dissertation is dedicated to the general topic of knowledge transfer from academia to industry and the role of various measures, at both the institutional and university levels, in support of the commercialization of university research. The overall contribution of the dissertation lies in presenting an in-depth and comprehensive analysis of the main critical issues that currently surround the commercial exploitation of academic research, while providing evidence on the role of previously underexplored areas (e.g. the strategic use of academic patents; female academic patenting) in the general debate on the ways to successful knowledge transfer from academia to industry. The first paper included in the dissertation addresses this gap by developing a taxonomy of the literature, based on a comprehensive review of the existing body of research on government measures in support of knowledge transfer from academia to industry. The review reveals a considerable gap in the analysis of the impact and relative effectiveness of public policy measures, especially those aimed at building knowledge and expertise among academic faculty and technology transfer agents. The second paper focuses on the role of interorganizational collaborations and their effect on the likelihood that an academic patent remains unused, and points to the strategic management of patents by universities. In the third paper I turn to the issue of female participation in patenting and commercialization; in particular, I find evidence of the positive role of the university and its internal support structures in closing the gender gap in female academic patenting. The results of the research carried out for this dissertation provide important implications for policy makers in crafting measures to increase the efficient use of the university knowledge stock.

Relevance:

30.00%

Publisher:

Abstract:

Despite the several issues faced in the past, the evolutionary trend of silicon has kept its constant pace, and today an ever-increasing number of cores is integrated onto the same die. Unfortunately, the extraordinary performance achievable by the many-core paradigm is limited by several factors: memory bandwidth limitations, combined with inefficient synchronization mechanisms, can severely curtail the potential computational capabilities. Moreover, the huge HW/SW design space requires accurate and flexible tools to perform architectural explorations and validate design choices. This thesis focuses on these aspects: a flexible and accurate Virtual Platform has been developed, targeting a reference many-core architecture. This tool has been used to perform architectural explorations, focusing on the instruction-caching architecture and on hybrid HW/SW synchronization mechanisms. Besides architectural implications, another issue of embedded systems is considered: energy efficiency. Near-Threshold Computing (NTC) is a key research area in the Ultra-Low-Power domain, as it promises a tenfold improvement in energy efficiency compared to super-threshold operation and mitigates thermal bottlenecks. The physical implications of deep sub-micron technology, however, severely limit the performance and reliability of modern designs. Reliability becomes a major obstacle when operating in NTC: memory operation in particular becomes unreliable and can compromise system correctness. In the present work a novel hybrid memory architecture is devised to overcome these reliability issues and, at the same time, improve energy efficiency by means of aggressive voltage scaling whenever workload requirements allow it. Variability is another great drawback of near-threshold operation: the greatly increased sensitivity to threshold-voltage variations is today a major concern for electronic devices. We introduce a variation-tolerant extension of the baseline many-core architecture; by means of micro-architectural knobs and a lightweight runtime control unit, the baseline architecture becomes dynamically tolerant to variations.
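The tenfold energy-efficiency figure quoted for near-threshold operation is consistent with the first-order CMOS dynamic-energy model; a sketch in LaTeX with illustrative supply voltages (assumed here for the example, not taken from the thesis), ignoring leakage, which in practice limits how far the supply can be scaled:

    E_{\mathrm{dyn}} \propto C_{\mathrm{eff}}\, V_{dd}^{2}
    \quad\Longrightarrow\quad
    \frac{E_{\mathrm{nom}}}{E_{\mathrm{NTC}}}
    \approx \left(\frac{V_{\mathrm{nom}}}{V_{\mathrm{NTC}}}\right)^{2}
    \approx \left(\frac{1.1\,\mathrm{V}}{0.35\,\mathrm{V}}\right)^{2}
    \approx 10

This quadratic dependence on supply voltage is also why aggressive voltage scaling, when the workload allows it, is the main lever the hybrid memory architecture exploits.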

Relevance:

30.00%

Publisher:

Abstract:

Precision Agriculture (PA) and the more specific branch of Precision Horticulture are two very promising sectors. They focus on the use of technologies in agriculture to optimize the use of inputs, so as to reach better efficiency and minimize waste of resources. This important objective has motivated many researchers and companies to search for new technological solutions; sometimes the effort proved to be a good seed, sometimes an unfeasible idea. As a result, PA, roughly 25 years after its birth, is still a “new” form of management, promising for the future, yet experts and researchers still report a low adoption rate. This work aims to contribute to identifying the causes of this low adoption rate and to propose a methodological solution to the problem. The first step was to examine prior research on Precision Agriculture adoption, through both ex ante and ex post approaches, on the assumption that it is important to find connections between these two phases of the purchase experience. The ex ante studies deal with potential consumers' perceptions before a usage experience occurs, hence before purchasing a technology, while the ex post studies describe the drivers that made a farmer become an end-user of PA technology. An example of consumer research is then presented: an ex ante study focused on a pre-prototype technology for fruit production. This kind of research can give precious information about consumer acceptance before an advanced development phase of the technology is reached, and thus offers the possibility of making changes with the least financial impact. The final step was to develop the pre-prototype technology that was the subject of the consumer acceptance research and to test its technical characteristics.

Relevance:

30.00%

Publisher:

Abstract:

Chapter 1 studies how consumers’ switching costs affect the pricing and profits of firms competing in two-sided markets such as Apple and Google in the smartphone market. When two-sided markets are dynamic – rather than merely static – I show that switching costs lower the first-period price if network externalities are strong, which is in contrast to what has been found in one-sided markets. By contrast, switching costs soften price competition in the initial period if network externalities are weak and consumers are more patient than the platforms. Moreover, an increase in switching costs on one side decreases the first-period price on the other side. Chapter 2 examines firms’ incentives to invest in local and flexible resources when demand is uncertain and correlated. I find that market power of the monopolist providing flexible resources distorts investment incentives, while competition mitigates them. The extent of improvement depends critically on demand correlation and the cost of capacity: under social optimum and monopoly, if the flexible resource is cheap, the relationship between investment and correlation is positive, and if it is costly, the relationship becomes negative; under duopoly, the relationship is positive. The analysis also sheds light on some policy discussions in markets such as cloud computing. Chapter 3 develops a theory of sequential investments in cybersecurity. The regulator can use safety standards and liability rules to increase security. I show that the joint use of an optimal standard and a full liability rule leads to underinvestment ex ante and overinvestment ex post. Instead, switching to a partial liability rule can correct the inefficiencies. This suggests that to improve security, the regulator should encourage not only firms, but also consumers to invest in security.

Relevance:

30.00%

Publisher:

Abstract:

Coastal sand dunes are an asset first of all in terms of defense against storm waves and saltwater ingression; moreover, these morphological elements constitute a unique transitional ecosystem between the marine and terrestrial environments. Research on dune systems has been a strong component of coastal science since the last century. Nowadays this branch has assumed even more importance, for two reasons: on one side, the emergence of brand-new technologies, especially in Remote Sensing, has widened researchers' possibilities; on the other, today's intense urbanization has strongly limited the dunes' capacity to develop and fragmented what remained from the last century. This is particularly true in the Ravenna area, where industrialization, combined with the tourist economy and intense subsidence, has left only a few residual dune ridges still active. In this work three different foredune ridges along the Ravenna coast have been studied with Laser Scanner technology. The research was not limited to analyzing volumetric or spatial differences, but also sought new ways and new features to monitor this environment. Moreover, the author planned a series of tests to validate data from the Terrestrial Laser Scanner (TLS), with the additional aim of finalizing a methodology for testing 3D survey accuracy. Data acquired by TLS were then used, on one hand, to test some brand-new applications, such as the Digital Shoreline Analysis System (DSAS) and Computational Fluid Dynamics (CFD), to prove their efficacy in this field; on the other hand, the author used TLS data to look for correlations with meteorological indexes (forcing factors) linked to sea and wind (Fryberger's method), applying statistical tools such as Principal Component Analysis (PCA).
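For reference, the PCA mentioned above follows the standard formulation, written here generically in LaTeX with symbols of our choosing rather than the thesis's notation: given a column-centered data matrix X of n observations,

    C = \frac{1}{n-1}\, X^{\top} X, \qquad
    C\, v_k = \lambda_k v_k, \qquad
    \lambda_1 \ge \lambda_2 \ge \cdots

The leading eigenvectors v_k (the principal components) capture the dominant, mutually uncorrelated modes of variability, so correlations between TLS-measured dune changes and the wind- and sea-related forcing factors can be sought in a few components rather than in many raw variables.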

Relevance:

30.00%

Publisher:

Abstract:

The thesis aims to make the dynamics of the tradeoffs involving privacy more visible, both theoretically and in two of the central current policy debates in European data protection law: the right to be forgotten and online tracking. In doing so, it offers an explanation of data protection law from an economic perspective and provides a basis for the evaluation of further data protection measures.

Relevance:

30.00%

Publisher:

Abstract:

The Swiss Federal Office of Public Health mandated a nationwide health technology assessment registry for cervical and lumbar total disc arthroplasty and for balloon kyphoplasty (BKP) in order to decide on the reimbursement of these interventions.

Relevance:

30.00%

Publisher:

Abstract:

Ventricular tachycardia (VT) late after myocardial infarction is an important contributor to morbidity and mortality. This prospective multicenter study assessed the efficacy and safety of electroanatomical mapping combined with open saline-irrigated ablation technology for the ablation of chronic recurrent, mappable and unmappable VT in patients with remote myocardial infarction.

Relevance:

30.00%

Publisher:

Abstract:

SMARTDIAB is a platform designed to support the monitoring, management, and treatment of patients with type 1 diabetes mellitus (T1DM) by combining state-of-the-art approaches in the fields of database (DB) technologies, communications, simulation algorithms, and data mining. SMARTDIAB consists mainly of two units: 1) the patient unit (PU); and 2) the patient management unit (PMU), which communicate with each other for data exchange. The PMU can be accessed by the PU through the internet using devices such as PCs/laptops with direct internet access or mobile phones via a Wi-Fi/General Packet Radio Service access network. The PU consists of an insulin pump for subcutaneous insulin infusion to the patient and a continuous glucose measurement system. These devices, running a user-friendly application, gather patient-related information and transmit it to the PMU. The PMU consists of a diabetes data management system (DDMS), a decision support system (DSS) that provides risk assessment for long-term diabetes complications, and an insulin infusion advisory system (IIAS), all of which reside on a Web server. The DDMS can be accessed by both medical personnel and patients, with appropriate security access rights and front-end interfaces. The DDMS, apart from being used for data storage/retrieval, also provides advanced tools for the intelligent processing of the patient's data, supporting the physician in decision making regarding the patient's treatment. The IIAS is used to close the loop between the insulin pump and the continuous glucose monitoring system by providing the pump with the appropriate insulin infusion rate in order to keep the patient's glucose levels within predefined limits. The pilot version of SMARTDIAB has already been implemented, and the platform's evaluation in a clinical environment is in progress.
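To make the closed loop concrete, here is a deliberately toy C sketch of the kind of computation such a loop performs, mapping a continuous glucose reading to a pump infusion rate. Every constant and the proportional control law itself are hypothetical illustrations; this is not the IIAS algorithm and not clinical guidance.

    /* Toy illustration of a glucose-to-infusion-rate control loop.
       All constants and the proportional law are hypothetical; this is
       NOT the SMARTDIAB IIAS algorithm and NOT clinical guidance. */
    #include <stdio.h>

    #define GLUCOSE_TARGET_MG_DL 110.0
    #define BASAL_RATE_U_PER_H     0.8   /* hypothetical basal rate */
    #define GAIN_U_PER_H_PER_MG    0.01  /* hypothetical P-gain     */
    #define MAX_RATE_U_PER_H       3.0   /* pump safety limit       */

    static double infusion_rate(double glucose_mg_dl) {
        double rate = BASAL_RATE_U_PER_H
            + GAIN_U_PER_H_PER_MG * (glucose_mg_dl - GLUCOSE_TARGET_MG_DL);
        if (rate < 0.0) rate = 0.0;             /* never negative */
        if (rate > MAX_RATE_U_PER_H) rate = MAX_RATE_U_PER_H;
        return rate;
    }

    int main(void) {
        double readings[] = { 85.0, 110.0, 180.0, 240.0 };  /* mg/dL */
        for (int i = 0; i < 4; i++)
            printf("glucose %6.1f mg/dL -> %.2f U/h\n",
                   readings[i], infusion_rate(readings[i]));
        return 0;
    }

A real advisory system sits behind the DDMS's safety checks and the physician-facing DSS described above; the point of the sketch is only the direction of the data flow, from sensor reading to pump rate.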