10 results for Computer systems organization: general-emerging technologies

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

The consumer demand for natural, minimally processed, fresh-like and functional food has led to an increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector. Ultrasound-assisted freezing, vacuum impregnation and pulsed electric fields have been investigated through laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques have been developed to evaluate the quality of food and vegetable matrices obtained by traditional and emerging processes. Ultrasound was found to be a valuable technique for improving the freezing process of potatoes, bringing forward the onset of nucleation, mainly when applied during the supercooling phase. A study of the effects of pulsed electric fields on the phenolic and enzymatic profile of melon juice was carried out, and the statistical treatment of the data was performed through a response surface method. Next, flavour enrichment of apple sticks was performed using different techniques, such as atmospheric, vacuum and ultrasound technologies and their combinations. The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenolic compounds in vegetable matrices, such as chestnut bark extracts and olive mill waste water. The management of waste disposal in the olive mill sector has been approached with the aim of reducing the amount of waste and, at the same time, recovering valuable by-products to be used in different industrial sectors. Finally, the sensory analysis of boiled potatoes has been carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties. An update on flavour development in fresh and cooked potatoes has been compiled, and a sensory glossary, including general and specific definitions related to organic products, used in the European project Ecropolis, has been drafted.
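
As an illustration of the response surface treatment mentioned for the pulsed electric field data, the sketch below fits a second-order polynomial model to two hypothetical process factors by ordinary least squares; factor names, units and values are placeholders and are not taken from the thesis.

```python
# Minimal response-surface sketch: quadratic model
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2, fitted by least squares.
# Factor names and data are hypothetical, for illustration only.
import numpy as np

# Hypothetical design: PEF field strength (kV/cm) and treatment time (us)
field = np.array([1.0, 1.0, 3.0, 3.0, 2.0, 2.0, 2.0])
time_ = np.array([100., 300., 100., 300., 200., 200., 200.])
# Hypothetical response, e.g. total phenolic content
phenols = np.array([42.0, 45.5, 47.1, 43.8, 49.0, 48.6, 48.9])

# Second-order model matrix
X = np.column_stack([
    np.ones_like(field), field, time_,
    field * time_, field**2, time_**2,
])
coef, *_ = np.linalg.lstsq(X, phenols, rcond=None)

def predict(f, t):
    """Predicted response at field strength f and time t."""
    return coef @ np.array([1.0, f, t, f * t, f**2, t**2])

print(predict(2.5, 150.0))
```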

Relevance:

100.00%

Publisher:

Abstract:

Mainstream hardware is becoming parallel, heterogeneous, and distributed on every desk, in every home and in every pocket. As a consequence, in recent years software has been taking an epochal turn toward concurrency, distribution and interaction, pushed by the evolution of hardware architectures and the growing availability of networks. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers have to face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then we focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. Then, we shift the perspective from the development of intelligent software systems toward general-purpose software development. Using the expertise gained during the background phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and, at the same time, provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
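
To make the actor paradigm referred to above concrete, here is a minimal generic sketch in Python: an object whose private state is touched only by a single thread that processes asynchronous messages one at a time. This illustrates actors in general, not simpAL, whose syntax is not reproduced here.

```python
# Minimal actor sketch: each actor owns a mailbox (queue) and a thread that
# processes one message at a time, so its state is never accessed concurrently.
# Generic illustration of the actor model, not the simpAL language.
import threading
import queue

class CounterActor:
    def __init__(self):
        self._mailbox = queue.Queue()
        self._count = 0                       # private state
        self._thread = threading.Thread(target=self._run)
        self._thread.start()

    def send(self, message):
        """Asynchronous send: enqueue the message and return immediately."""
        self._mailbox.put(message)

    def _run(self):
        # Messages are handled strictly one after another.
        while True:
            msg = self._mailbox.get()
            if msg == "inc":
                self._count += 1
            elif msg == "report":
                print("count =", self._count)
            elif msg == "stop":
                break

actor = CounterActor()
for _ in range(3):
    actor.send("inc")
actor.send("report")   # prints: count = 3
actor.send("stop")
```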

Relevance:

100.00%

Publisher:

Abstract:

In the present work, qualitative aspects of products that fall outside the classic view of Italian food production will be investigated, with the exception of the apricot, a fruit that has nevertheless been little studied with the methods considered here. The development of computer systems and of advanced software dedicated to the statistical processing of data has permitted the application of advanced technologies to the analysis of niche products as well. Near-infrared spectroscopic analysis has been applied in the chemical industry for over twenty years and was subsequently applied with great success in the food industry for non-destructive in-line and off-line analysis. The work presented below ranges from the use of spectroscopy for the determination of some rheological indices of ice cream, to the characterization of the main quality indices of apricots and fresh dates, to the determination of the production areas of pistachio. Alongside the spectroscopy, different methods of multivariate analysis are illustrated for the interpretation of spectra or for the construction of qualitative estimation models. The thesis is divided into four separate studies, one for each product. Each is introduced by its own premise and closed with its own bibliography. These studies are preceded by a general discussion on the state of the art and the basics of NIR spectroscopy.
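
A common way to build the multivariate estimation models mentioned above is partial least squares (PLS) regression relating NIR spectra to a reference quality index; the sketch below uses scikit-learn on random placeholder data, as an assumption of how such a calibration could be set up, not as the models actually developed in the thesis.

```python
# Sketch of multivariate calibration on NIR spectra: a PLS regression model
# relating spectra to a reference quality index. Data here are random
# placeholders; real work would use measured spectra and reference values.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
spectra = rng.normal(size=(60, 700))      # 60 samples x 700 wavelengths
quality = rng.normal(size=60)             # reference index (e.g. soluble solids)

pls = PLSRegression(n_components=8)       # latent variables chosen by validation
scores = cross_val_score(pls, spectra, quality, cv=5, scoring="r2")
pls.fit(spectra, quality)
predicted = pls.predict(spectra)
print("cross-validated R2:", scores.mean())
```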

Relevance:

100.00%

Publisher:

Abstract:

Sustainable computer systems require some flexibility to adapt to unpredictable environmental changes. A solution lies in autonomous software agents, which can adapt autonomously to their environments. Though autonomy allows agents to decide which behaviour to adopt, a disadvantage is a lack of control and, as a side effect, even untrustworthiness: we want to keep some control over such autonomous agents. How can autonomous agents be controlled while respecting their autonomy? A solution is to regulate agents' behaviour by norms. The normative paradigm makes it possible to control autonomous agents while respecting their autonomy, limiting untrustworthiness and augmenting system compliance. It can also facilitate the design of the system, for example by regulating the coordination among agents. However, an autonomous agent will follow norms under some conditions and violate them under others. Under which conditions is a norm binding upon an agent? While autonomy is regarded as the driving force behind the normative paradigm, cognitive agents provide a basis for modeling the bindingness of norms. In order to cope with the complexity of modeling cognitive agents and normative bindingness, we adopt an intentional stance. Since agents are embedded in a dynamic environment, things may not happen at the same instant. Accordingly, our cognitive model is extended to account for some temporal aspects. Special attention is given to the temporal peculiarities of the legal domain such as, among others, the time in force and the time in efficacy of provisions. Some types of normative modifications are also discussed in the framework. It is noteworthy that our temporal account of legal reasoning is integrated with our commonsense temporal account of cognition. As our intention is to build sustainable reasoning systems running in unpredictable environments, we adopt a declarative representation of knowledge. A declarative representation of norms makes it easier to update their representation in the system, thus facilitating system maintenance, and improves system transparency, thus easing system governance. Since agents are bounded and embedded in unpredictable environments, and since conflicts may arise among mental states and norms, agent reasoning has to be defeasible, i.e. new pieces of information can invalidate formerly derivable conclusions. In this dissertation, our model is formalized into a non-monotonic logic, namely a temporal modal defeasible logic, in order to account for the interactions between normative systems and software cognitive agents.
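
The notion of defeasibility recalled above (new information invalidating previously derivable conclusions) can be illustrated with a deliberately simplified sketch in which a higher-priority, more specific rule defeats a general one; rules, facts and priorities are hypothetical and do not reproduce the temporal modal defeasible logic developed in the dissertation.

```python
# Minimal defeasible-reasoning sketch: a defeasible rule supports a conclusion
# unless a stronger, applicable rule concludes its negation. Rules and facts
# are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Rule:
    premises: frozenset     # facts that must hold
    conclusion: str         # literal derived if the rule is not defeated
    priority: int           # higher priority wins a conflict

def derive(facts, rules):
    """Return defeasible conclusions: a rule fires if its premises hold and
    no applicable higher-priority rule concludes the negation."""
    conclusions = set()
    for r in rules:
        if not r.premises <= facts:
            continue
        negation = r.conclusion[1:] if r.conclusion.startswith("~") else "~" + r.conclusion
        defeated = any(
            s.premises <= facts and s.conclusion == negation and s.priority > r.priority
            for s in rules
        )
        if not defeated:
            conclusions.add(r.conclusion)
    return conclusions

rules = [
    Rule(frozenset({"norm_in_force"}), "obliged_to_report", priority=1),
    Rule(frozenset({"norm_in_force", "emergency"}), "~obliged_to_report", priority=2),
]
print(derive({"norm_in_force"}, rules))               # {'obliged_to_report'}
print(derive({"norm_in_force", "emergency"}, rules))  # {'~obliged_to_report'}
```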

Relevance:

100.00%

Publisher:

Abstract:

This doctoral dissertation aims to establish fiber-optic technologies overcoming the limiting issues of data communications in indoor environments. Specific applications are broadband mobile distribution in different in-building scenarios and high-speed digital transmission over short-range wired optical systems. Two key enabling technologies are considered: Radio over Fiber (RoF) techniques over standard silica fibers for distributed antenna systems (DAS) and plastic optical fibers (POFs) for short-range communications. Hence, the objectives and achievements of this thesis are related to the application of RoF and POF technologies in different in-building scenarios. On one hand, a theoretical and experimental analysis combined with demonstration activities has been performed on cost-effective RoF systems. Extensive modeling of the impact of modal noise on both the linear and non-linear characteristics of RoF links over silica multimode fiber has been performed to derive link design rules for an optimum choice of the transmitter, receiver and launching technique. Successful transmission of Long Term Evolution (LTE) mobile signals on the resulting optimized RoF system over silica multimode fiber, employing a Fabry-Perot laser diode, the central launch technique and a photodiode with a built-in ball lens, was demonstrated up to 525 m with performance well compliant with the standard requirements. On the other hand, digital signal processing techniques to overcome the bandwidth limitation of POF have been investigated. An uncoded net bit-rate of 5.15 Gbit/s was obtained on a 50 m long POF link employing an eye-safe transmitter, a silicon photodiode and DMT modulation with a bit- and power-loading algorithm. With the insertion of 3×2^N quadrature amplitude modulation constellation formats, an uncoded net bit-rate of 5.4 Gbit/s was obtained on a 50 m long POF link employing an eye-safe transmitter and a silicon avalanche photodiode. Moreover, simultaneous transmission of a baseband 2 Gbit/s DMT signal and a 200 Mbit/s ultra-wideband radio signal has been validated over a 50 m long POF link.
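
The bit- and power-loading idea mentioned above can be illustrated with a simple greedy allocation over the per-subcarrier SNR, in the style of the Hughes-Hartogs algorithm: bits are added one at a time on the subcarrier that needs the least extra power. The SNR values, SNR gap and power budget below are placeholders, and the algorithm actually used in the thesis may differ.

```python
# Greedy bit-loading sketch for a DMT link. All numbers are hypothetical.
def bit_loading(snr_linear, total_power, gap_db=6.0, max_bits=10):
    gap = 10 ** (gap_db / 10)
    bits = [0] * len(snr_linear)
    power = [0.0] * len(snr_linear)
    used = 0.0
    while True:
        # Power needed for b bits: P(b) = gap * (2**b - 1) / SNR,
        # so the cost of one more bit on carrier i is gap * 2**bits[i] / SNR[i].
        candidates = [
            (gap * 2 ** bits[i] / snr_linear[i], i)
            for i in range(len(snr_linear)) if bits[i] < max_bits
        ]
        if not candidates:
            break
        delta, i = min(candidates)
        if used + delta > total_power:
            break
        bits[i] += 1
        power[i] += delta
        used += delta
    return bits, power

snr = [200.0, 120.0, 60.0, 25.0, 8.0]     # hypothetical per-carrier SNR (linear)
bits, power = bit_loading(snr, total_power=3.0)
print(bits)                                # more bits land on the higher-SNR carriers
```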

Relevance:

100.00%

Publisher:

Abstract:

The increase in aquaculture operations worldwide has provided new opportunities for the transmission of aquatic viruses. The occurrence of viral diseases remains a significant limiting factor for aquaculture production and for its sustainability. The ability to quickly identify the presence or absence of a pathogenic organism in fish would have significant advantages for aquaculture systems. Several molecular methods have found successful application in fish pathology, both for confirmatory diagnosis of overt diseases and for detection of asymptomatic infections. However, many different variants occur among fish host species and virus strains; consequently, specific methods need to be developed and optimized for each pathogen, and often also for each host species. The first chapter of this PhD thesis presents a complete description of the major viruses that infect fish and provides relevant information regarding the most common methods and emerging technologies for the molecular diagnosis of viral diseases of fish. The development and application of a real-time PCR assay for the detection and quantification of lymphocystivirus is described in the second chapter. The assay proved to be highly sensitive, specific, reproducible and versatile for the detection and quantification of lymphocystivirus. This technique can find multiple applications, such as asymptomatic carrier detection or pathogenesis studies of different LCDV strains. In the third chapter, a multiplex RT-PCR (mRT-PCR) assay was developed for the simultaneous detection of viral haemorrhagic septicaemia (VHS), infectious haematopoietic necrosis (IHN), infectious pancreatic necrosis (IPN) and sleeping disease (SD) in a single assay. This method was able to efficiently detect the viral RNA in tissue samples, showing the presence of single infections and co-infections in rainbow trout samples. The mRT-PCR proved to be an accurate and fast method to support traditional diagnostic techniques in the diagnosis of the major viral diseases of rainbow trout.
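
Quantification with a real-time PCR assay, as mentioned above, is commonly done by fitting a standard curve of Ct values against known copy numbers and inverting it for unknown samples; the sketch below shows only that general arithmetic with hypothetical numbers and is not the assay described in the thesis.

```python
# Sketch of absolute quantification from real-time PCR data: fit a standard
# curve (Ct vs log10 copy number) on serial dilutions, then convert sample Ct
# values to copy numbers. All numbers are hypothetical placeholders.
import numpy as np

std_copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3])     # known standards
std_ct = np.array([14.9, 18.3, 21.7, 25.2, 28.6])    # measured Ct values

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1                 # ~1.0 means 100 % efficiency

def copies_from_ct(ct):
    """Invert the standard curve: log10(copies) = (Ct - intercept) / slope."""
    return 10 ** ((ct - intercept) / slope)

print(f"efficiency = {efficiency:.2f}")
print(f"sample at Ct 23.0 -> {copies_from_ct(23.0):.2e} copies")
```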

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, licensing practices have increased in importance and relevance, driving the widespread diffusion of markets for technologies. Firms are shifting from a tactical to a strategic attitude towards licensing, addressing both business-level and corporate-level objectives. The Open Innovation Paradigm has been embraced. Firms rely more and more on collaboration and external sourcing of knowledge. This new model of innovation requires firms to leverage external technologies to unlock the potential of their internal innovative efforts. In this context, firms' competitive advantage depends both on their ability to recognize available opportunities inside and outside their boundaries and on their readiness to exploit them in order to fuel their innovation process dynamically. Licensing is one of the ways available to firms to reap the advantages associated with an open attitude in technology strategy. From the licensee's point of view, this implies challenging the so-called not-invented-here syndrome, which affects the more traditional firms that emphasize the myth of internal research and development supremacy. It also entails understanding the so-called cognitive constraints affecting the perfect functioning of markets for technologies, which are associated with the costs borne by recipient firms for the assimilation, integration and exploitation of external knowledge. My thesis aims at shedding light on new and interesting issues associated with in-licensing activities that have been neglected by the literature on licensing and markets for technologies. The reason for this gap is the "perspective bias" affecting the works within this stream of research. With very few notable exceptions, they have generally been concerned with the investigation of the so-called licensing dilemma of the licensor – whether to license out or to internally exploit the in-house developed technologies – while neglecting the licensee's perspective. In my opinion, this has left room for improving the understanding of the determinants and conditions affecting licensing-in practices. From the licensee's viewpoint, the licensing strategy deals with the search, integration, assimilation and exploitation of external technologies. As such, it lies at the very heart of a firm's technology strategy. Improving our understanding of this strategy is thus required to assess the full implications of in-licensing decisions, as they shape firms' innovation patterns and the evolution of their technological capabilities. It also allows for understanding the so-called cognitive constraints associated with the not-invented-here syndrome. In recognition of that, the aim of my work is to contribute to the theoretical and empirical literature explaining the determinants of the licensee's behavior, by providing a comprehensive theoretical framework as well as ad hoc conceptual tools to understand and overcome frictions and to ease the achievement of satisfactory technology transfer agreements in the marketplace. To this end, I investigate licensing-in in three different ways, developed in three research papers. In the first work, I investigate the links between licensing and the patterns of firms' technological search diversification, according to the frameworks of the search literature, the resource-based theory and the theory of general purpose technologies. In the second paper – which continues where the first one left off – I analyze the new concept of learning-by-licensing, in terms of the development of new knowledge inside the licensee firms (e.g. new patents) some years after the acquisition of the license, according to the dynamic capabilities perspective. Finally, in the third study, I deal with the determinants of the remuneration structure of patent licenses (form and amount), and in particular with the role of the upfront fee from the licensee's perspective. To this end, I combine the insights of two theoretical approaches: agency theory and real options theory.

Relevance:

100.00%

Publisher:

Abstract:

Proper hazard identification has become progressively more difficult to achieve, as witnessed by several major accidents that took place in Europe, such as the ammonium nitrate explosion at Toulouse (2001) and the vapour cloud explosion at Buncefield (2005), whose accident scenarios were not considered in the site safety cases. Furthermore, the rapid renewal of industrial technology has brought about the need to upgrade hazard identification methodologies. Accident scenarios of emerging technologies, which are still not properly identified, may remain unidentified until they take place for the first time. The consideration of atypical scenarios deviating from normal expectations of unwanted events or from worst-case reference scenarios is thus extremely challenging. A specific method named Dynamic Procedure for Atypical Scenarios Identification (DyPASI) was developed as a complementary tool to bow-tie identification techniques. The main aim of the methodology is to provide an easier but comprehensive hazard identification of the industrial process analysed, by systematizing information from early signals of risk related to past events, near misses and inherent studies. DyPASI was validated on two examples of new and emerging technologies: Liquefied Natural Gas regasification and Carbon Capture and Storage. The study broadened the knowledge of the related emerging risks and, at the same time, demonstrated that DyPASI is a valuable tool to obtain a complete and updated overview of potential hazards. Moreover, in order to tackle the underlying accident causes of atypical events, three methods for the development of early warning indicators were assessed: the Resilience-based Early Warning Indicator (REWI) method, the Dual Assurance method and the Emerging Risk Key Performance Indicator method. REWI was found to be the most complementary and effective of the three, demonstrating that its synergy with DyPASI would be an adequate strategy to improve hazard identification methodologies towards the capture of atypical accident scenarios.

Relevance:

100.00%

Publisher:

Abstract:

In this thesis work, a reference framework was developed for the combined use of two impact assessment methodologies, LCA and RA, for emerging technologies. The originality of the study lies in having proposed, and also applied, the framework to a case study, in particular to an innovative refrigeration technology based on nanofluids (NF) developed by partners of the European project Nanohex, who collaborated in the elaboration of the studies, especially as regards the inventory of the necessary data. The complexity of the study lies both in the difficult integration of two methodologies conceived for different purposes and structured to fulfil those purposes, and in the field of application, which, although rapidly expanding, has strong gaps in the information available on production processes and on the behaviour of the substances involved. The framework was applied to the production of alumina nanofluid (NF) via two production routes (single-stage and two-stage) in order to assess and compare the impacts on human health and the environment. It should be specified that the LCA was quantitative but did not consider the impacts of nanomaterials in the toxicity categories. As for the RA, a qualitative study was developed, owing to the above-mentioned lack of toxicological and exposure parameters, with workers as its focus; it was therefore assumed that releases into the environment during the production phase are negligible. For the qualitative RA a specific software tool, Stoffenmanager Nano, was used, which makes it possible to prioritize the risks associated with inhalation in the workplace. The framework consists of a procedure articulated in four phases: DEFINITION OF THE TECHNOLOGICAL SYSTEM, DATA COLLECTION, RISK ASSESSMENT AND IMPACT QUANTIFICATION, and INTERPRETATION.

Relevance:

100.00%

Publisher:

Abstract:

Digital evidence requires the adoption of precautions, just as in any other scientific investigation. An overview is provided of the methodological and applicative aspects of digital forensics in light of the recent ISO/IEC 27037:2012 standard on the handling of digital evidence during the phases of identification, collection, acquisition and preservation of digital data. These methodologies scrupulously comply with the integrity and authenticity requirements set by the regulations on digital forensics, in particular by Law 48/2008 ratifying the Budapest Convention on Cybercrime. With regard to the offence of child pornography, a review of EU and national legislation is offered, with emphasis on the aspects relevant to forensic analysis. Since file sharing over peer-to-peer networks is the channel on which the exchange of illicit material is most concentrated, an overview of the most widespread protocols and systems is provided, with emphasis on the eDonkey network and the eMule software, which are widely used among Italian users. The problems encountered in the investigation and repression of the phenomenon, which fall under the responsibility of the police forces, are briefly discussed, before focusing on the main contribution concerning the forensic analysis of computer systems seized from subjects under investigation (or on trial) for child pornography offences: the design and implementation of eMuleForensic makes it possible to carry out extremely precise and rapid analyses of the events that occur when the eMule file-sharing software is used; the software is available both online at http://www.emuleforensic.com and as a tool within the DEFT forensic distribution. Finally, a proposal is made for an operational protocol for the forensic analysis of computer systems involved in forensic investigations of child pornography.
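
One of the integrity requirements recalled above (ISO/IEC 27037) is typically met by computing and recording a cryptographic hash of the acquired data and verifying it again before analysis; the sketch below shows that generic practice with a hypothetical file name, and is not part of the eMuleForensic tool.

```python
# Minimal integrity-check sketch in the spirit of ISO/IEC 27037: hash an
# acquired image at acquisition time and verify it again before analysis.
# The file name is hypothetical; this is a generic illustration only.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

acquisition_hash = sha256_of("evidence_image.dd")   # recorded in the acquisition report
# ... later, before analysis ...
assert sha256_of("evidence_image.dd") == acquisition_hash, "integrity check failed"
```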