435 results for Potentialities


Relevance:

10.00%

Publisher:

Abstract:

ABSTRACT: The qualification of dimension stone through non-destructive tests (NDT) is a relevant research topic for the industrial characterisation of finished products, because competition from low-cost products can be countered by offering highly qualified products with strong guarantees. The synthesis of the potentialities offered by NDT is a qualification and guarantee scheme similar to the well-known agro-industrial PDO (Protected Designation of Origin): it becomes possible to guarantee both the origin and the quality of each stone element, even through an in-line Factory Production Control. A specific product specification (disciplinare) is needed. Research carried out at DICMA, University of Bologna, within the "OSMATER" INTERREG project identified good correlations between destructive and non-destructive tests for some types of materials from the Verbano-Cusio-Ossola region. For example, non-conventional ultrasonic tests, image-analysis parameters, water absorption and other measurements proved to be well correlated with bending strength, through relationships that vary from product to product. In conclusion, it has been demonstrated that a non-destructive approach achieves several goals, the most important being: 1) the identification of materials; 2) the selection of products; 3) the substitution of DT by NDT. It is now necessary to move from the research phase to industrial implementation, and to develop new ND technologies focused on specific aims.
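The DT-NDT correlations described above are typically product-specific regressions of a destructive quantity on one or more non-destructive measurements. A minimal sketch, with invented illustrative numbers rather than OSMATER data:

```python
import numpy as np

# Hypothetical paired measurements for one stone product:
# ultrasonic pulse velocity (km/s, non-destructive) and
# bending strength (MPa, destructive). Illustrative values only.
velocity = np.array([4.1, 4.5, 4.8, 5.0, 5.3, 5.6])
bending = np.array([9.8, 11.2, 12.5, 13.1, 14.4, 15.6])

# Least-squares fit of a product-specific linear relationship
slope, intercept = np.polyfit(velocity, bending, 1)

# Correlation coefficient to judge whether NDT can stand in for DT
r = np.corrcoef(velocity, bending)[0, 1]

# Estimate the strength of a new element from NDT alone
predicted = slope * 5.2 + intercept
print(round(slope, 2), round(r, 3))
```

A relationship like this only holds for the material it was calibrated on, which is why the abstract stresses that the regressions vary for each product.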

Relevance:

10.00%

Publisher:

Abstract:

The ongoing innovation of the microwave transistor technologies used to implement microwave circuits must be supported by the study and development of design methodologies which, depending on the application, fully exploit the potentialities of the technology. Once the technology for a particular application has been chosen, the circuit designer has few degrees of freedom in carrying out the design: in most cases, owing to technological constraints, foundries develop and provide customized processes optimized for a specific performance such as power, low noise, linearity or bandwidth. For these reasons circuit design is always a compromise, a search for the best trade-off between the desired performances. This approach becomes crucial in the design of microwave systems for satellite applications: tight space constraints impose reaching the best performance under electrical and thermal conditions de-rated with respect to the maximum ratings of the technology, in order to ensure adequate reliability. In particular, this work concerns one of the most critical components in the front end of a satellite antenna, the High Power Amplifier (HPA). The HPA is the main source of power dissipation and therefore the element that weighs most on the space, weight and cost of the telecommunication equipment; design strategies addressing the optimization of power density, efficiency and reliability are therefore of major concern. Many journal articles and publications present methods for the design of power amplifiers, demonstrating that very good levels of output power, efficiency and gain can be obtained.
Starting from existing knowledge, the goal of the research activities summarized in this dissertation was to develop a design methodology capable of optimizing power amplifier performance while complying with all the constraints imposed by space applications, taking thermal behaviour into account in the same manner as power and efficiency. After a review of existing theories of power amplifier design, the first section of this work describes the effectiveness of a methodology based on the accurate control and shaping of the dynamic load line, explaining every step of the design of two different kinds of high power amplifiers. With the trade-off between the main performances and reliability as the target of the design activity, we demonstrate that the expected results can be obtained by working on the characteristics of the load line at the intrinsic terminals of the selected active device. The methodology proposed in this first part assumes that the designer has an accurate electrical model of the device available; the many publications on this topic show how difficult it is to build a CAD model capable of capturing all the non-ideal phenomena that occur when the amplifier operates at such high frequencies and power levels. For this reason, and especially for the emerging Gallium Nitride (GaN) technology, the second section describes a new approach to power amplifier design based on the experimental characterization of the intrinsic load line by means of a low-frequency, high-power measurement bench. Thanks to the possibility of carrying out my Ph.D. in an academic spin-off, MEC – Microwave Electronics for Communications, the results of this activity have been applied to important research programs commissioned by space agencies, with the aim of supporting technology transfer from universities to industry and promoting science-based entrepreneurship. For these reasons the proposed design methodology is explained on the basis of many experimental results.
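The role of the load line can be illustrated with the idealized textbook estimates for a Class-A stage (optimum load resistance, output power, drain efficiency). This is a standard simplification with invented device ratings, not the dissertation's methodology:

```python
# Idealized Class-A load-line estimate for a power stage.
# Illustrative device ratings (not from the dissertation):
v_dd = 24.0     # drain bias voltage (V)
v_knee = 3.0    # knee voltage (V)
i_max = 2.0     # maximum drain current (A)

# Optimum load resistance seen at the intrinsic drain terminals:
# the load line swings from (v_knee, i_max) to (2*v_dd - v_knee, 0)
r_opt = (v_dd - v_knee) / (i_max / 2)

# Maximum linear output power and DC consumption
p_out = (v_dd - v_knee) * i_max / 4
p_dc = v_dd * i_max / 2

# Drain efficiency: at most 50% for ideal Class-A; the knee voltage
# (one of the de-rating effects mentioned above) lowers it further.
eta = p_out / p_dc
print(r_opt, p_out, round(eta, 4))
```

Even this toy calculation shows why shaping the load line at the intrinsic terminals matters: the knee voltage alone drops the ideal efficiency from 50% to well below it.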

Relevance:

10.00%

Publisher:

Abstract:

The elusive fiction of J. M. Coetzee is not a body of work in which fixed ethical stances can be read. I suggest testing the potentialities of a logic based on frames and double binds in Coetzee's novels. A double bind is a communicative dilemma consisting of two conflicting messages, with the result that one cannot successfully respond to either. Jacques Derrida highlighted the strategic value of a way of thinking based on the double bind (and on frames as well), which makes it possible to escape binary thinking and so opens an ethical space, in which one can make a choice outside a set of fixed rules and take responsibility for it. In Coetzee's fiction the author himself can be considered to be in a double bind, since he is a white South African writer who feels that his "task" cannot be as simple as choosing to represent faithfully the violence and racism of apartheid, or choosing to give a voice to the oppressed. Good intentions alone do not ensure protection against entering unwittingly into complicity with the dominant discourse, and this is why it is important to make the frame in which one is always situated clearly visible and explicit. The logic of the double bind also becomes the way in which moral problems are staged in Coetzee's fiction: the opportunity to give a voice to the oppressed through the very language that was co-opted to serve the cause of oppression, a relation with otherness that is never completed, and the representability of evil in literature, of the secret, and of the paradoxical implications of confession and forgiveness.

Relevance:

10.00%

Publisher:

Abstract:

Dielectric Elastomers (DE) are incompressible dielectrics which can undergo deviatoric (isochoric) finite deformations in response to large applied electric fields. Thanks to this strong electro-mechanical coupling, DE intrinsically offer great potentialities for conceiving novel solid-state mechatronic devices, in particular linear actuators, which are more integrated, lightweight, economical, silent, resilient and disposable than equivalent devices based on traditional technologies. Such systems may have a huge impact in applications where traditional technology cannot cope with limits on weight or size, or with problems involving interaction with humans or unknown environments. Fields such as medicine, home automation, entertainment, aerospace and transportation may profit. For actuation, DE are typically shaped into thin films coated with compliant electrodes on both sides and stacked one on the other to form a multilayered DE. DE-based Linear Actuators (DELA) are made entirely of polymeric materials, and their overall performance is strongly influenced by several interacting factors: first the electromechanical properties of the film, second the mechanical properties and geometry of the polymeric frame designed to support the film, and finally the driving circuits and activation strategies. In the last decade, much effort has been devoted to the development of analytical and numerical models that can explain and predict the hyperelastic behavior of different types of DE materials. Nevertheless, at present, the use of DELA is limited. The main reasons are 1) the lack of quantitative and qualitative models of the actuator as a whole system and 2) the lack of a simple and reliable design methodology. In this thesis, a new point of view in the study of DELA is presented which takes into account the interaction between the DE film and the film-supporting frame.
Hyperelastic models of the DE film are reported which are capable of modeling both the DE and the compliant electrodes. The supporting frames are analyzed and designed as compliant mechanisms using pseudo-rigid-body models followed by finite element analysis. A new design methodology is reported which optimizes the actuator performance and allows its inherent stiffness to be chosen explicitly. As a particular case, the methodology focuses on the design of constant-force actuators, a class of actuators that exemplifies how force control can be greatly simplified. Three new DE actuator concepts are proposed which demonstrate the effectiveness of the proposed method.
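The actuation principle underlying DELA can be illustrated with the commonly used equivalent electrostatic pressure p = ε0·εr·E² (Pelrine's expression for a free incompressible film). A minimal sketch with illustrative material values, not those of the thesis:

```python
# Equivalent electromechanical pressure on a dielectric elastomer
# film under an applied voltage. Illustrative numbers only.
eps0 = 8.854e-12      # vacuum permittivity (F/m)
eps_r = 3.0           # relative permittivity, typical of acrylic DE
voltage = 3000.0      # applied voltage (V)
thickness = 50e-6     # film thickness (m)

E = voltage / thickness          # electric field (V/m)
p = eps0 * eps_r * E**2          # equivalent electrostatic pressure (Pa)

# Rough small-strain estimate of thickness contraction for a film
# with Young's modulus Y (linearized; real DE are hyperelastic,
# which is why the thesis uses hyperelastic models instead)
Y = 1.0e6                        # illustrative modulus (Pa)
strain = p / Y
print(round(p), round(strain, 4))
```

The quadratic dependence of p on E explains why thin films and high fields are used, and the linearized strain estimate makes clear why a proper hyperelastic treatment is needed at large deformations.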

Relevance:

10.00%

Publisher:

Abstract:

Tissue engineering is a discipline that aims at regenerating damaged biological tissues by using a cell construct engineered in vitro, made of cells grown in a porous 3D scaffold. The role of the scaffold is to guide cell growth and differentiation by acting as a bioresorbable temporary substrate that will eventually be replaced by new tissue produced by the cells. As a matter of fact, obtaining a successful engineered tissue requires a multidisciplinary approach that integrates the basic principles of biology, engineering and materials science. The present Ph.D. thesis aimed at developing and characterizing innovative polymeric bioresorbable scaffolds made of hydrolysable polyesters. The potentialities of both commercial polyesters (i.e. poly-ε-caprolactone, polylactide and some lactide copolymers) and non-commercial polyesters (i.e. poly-ω-pentadecalactone and some of its copolymers) were explored and discussed. Two techniques were employed to fabricate scaffolds: supercritical carbon dioxide (scCO2) foaming and electrospinning (ES). The former is a powerful technology for producing 3D microporous foams that avoids the use of solvents toxic to mammalian cells. The scCO2 process, which is commonly applied to amorphous polymers, was successfully modified to foam a highly crystalline poly(ω-pentadecalactone-co-ε-caprolactone) copolymer, and the effect of the process parameters on scaffold morphology and thermo-mechanical properties was investigated. In the course of the present research activity, sub-micrometric fibrous non-woven meshes were also produced using ES technology. Electrospun materials are considered highly promising scaffolds because they resemble the 3D organization of the native extracellular matrix. Careful control of the process parameters allowed the fabrication of defect-free fibres with diameters ranging from hundreds of nanometers to several microns, with either smooth or porous surfaces.
Moreover, the versatility of ES technology made it possible to produce electrospun scaffolds from different polyesters, as well as "composite" non-woven meshes obtained by concomitantly electrospinning fibres differing in both morphology and polymer material. The 3D architecture of the electrospun scaffolds fabricated in this research was controlled in terms of mutual fibre orientation by suitably modifying the instrumental apparatus. This aspect is particularly interesting since the micro/nano-architecture of the scaffold is known to affect cell behaviour. Since last-generation scaffolds are expected to induce specific cell responses, the present research activity also explored the possibility of producing electrospun scaffolds that are bioactive towards cells. Bio-functionalized substrates were obtained by loading polymer fibres with growth factors (i.e. biomolecules that elicit specific cell behaviour), and it was demonstrated that, despite the high voltages applied during electrospinning, the growth factor retains its biological activity once released from the fibres upon contact with the cell culture medium. A second functionalization approach, aimed ultimately at controlling cell adhesion on electrospun scaffolds, consisted in covering the fibre surface with highly hydrophilic polymer brushes of glycerol monomethacrylate synthesized by Atom Transfer Radical Polymerization. Future investigations will exploit the hydroxyl groups of the polymer brushes to functionalize the fibre surface with desired biomolecules. The electrospun scaffolds were employed in cell culture experiments, performed in collaboration with biochemical laboratories, aimed at evaluating the biocompatibility of the new electrospun polymers and at investigating the effect of fibre orientation on cell behaviour.
Moreover, at a preliminary stage, electrospun scaffolds were also cultured with mammalian tumour cells to develop in vitro tumour models aimed at better understanding the role of the natural ECM in tumour malignancy in vivo.

Relevance:

10.00%

Publisher:

Abstract:

Interfacing electrically active, living cells with extracellular sensor systems opens up a wide range of possibilities in the field of biosensing. The present work contributes to a deeper understanding of the electrical coupling mechanisms between the biological and electronic parts of such hybrid systems. Three main areas were addressed. First, a system for extracellular signal recording from living cells, consisting of a sensor chip, a preamplifier head and a main amplifier, was further developed. Either metal microelectrode chips with 64 channels or field-effect transistor (FET) chips with 16 channels were used as sensors; in addition, special FET sensors with backside contacts were fabricated and employed. Second, the electrical coupling of individual nerve cells of the neuronal cell lines SH-SY5Y and TR14, or of primary cultured neurons from the brainstem or hippocampus of embryonic rats, with the extracellular sensors was investigated. Using the whole-cell patch-clamp technique, the contributions of the voltage-gated Na+ and K+ ion channels to the extracellular signal shape were identified. Simulation of the signals with an equivalent circuit (point-contact model) implemented in PSPICE indicates a strong dependence of the signal shapes on concentration changes of Na+ and K+ ions in the volume between the cell and the ion-sensitive transistors. An empirically extended point-contact model was therefore proposed. In the third part of the work, cell layers of cardiomyocytes from embryonic rats were cultured on the extracellular sensors. The suitability of such a hybrid sensor as a model heart for pharmaceutical screening was confirmed by measurements with cardiac stimulants and relaxants.
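In the point-contact picture, the voltage sensed by the transistor is, to first order, the membrane current of the cell-sensor junction flowing through the seal resistance of the cleft. A drastically simplified sketch of that idea, keeping only the capacitive current and using invented parameter values (not the empirically extended model of the thesis):

```python
import numpy as np

# Minimal point-contact sketch: junction voltage approximated by the
# capacitive membrane current times the seal resistance of the cleft.
dt = 1e-5                       # time step (s)
t = np.arange(0, 4e-3, dt)      # 4 ms window

# Toy intracellular action potential: a ~100 mV Gaussian pulse
v_m = 0.1 * np.exp(-((t - 2e-3) / 0.5e-3) ** 2)

c_m = 3e-12                     # junction membrane capacitance (F)
r_seal = 1e6                    # seal resistance (ohm)

# Capacitive membrane current and resulting junction voltage
i_cap = c_m * np.gradient(v_m, dt)
v_junction = r_seal * i_cap

peak_uv = v_junction.max() * 1e6
print(round(peak_uv, 1), "uV")  # biphasic, a few hundred microvolts
```

Even this caricature reproduces two qualitative features of extracellular recordings: the signal is roughly the derivative of the intracellular waveform, and it is orders of magnitude smaller, which is why ionic contributions in the cleft (the focus of the extended model) matter.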

Relevance:

10.00%

Publisher:

Abstract:

The central topics of this work are the new forms of territorial planning in use in the main European cities, with particular reference to the experience of strategic planning applied to territorial governance, and an in-depth analysis of the urban and territorial planning policies and instruments of the city of Bologna, from the Piano Regolatore Generale of 1985-'89 to the new Piano Strutturale Comunale of 2008. More precisely, the characteristics, potentialities and critical aspects of Bologna's new planning instrument are examined not only in relation to the typical characteristics of European strategic plans, but also in relation to the traditional instruments of Italian urban planning (the Piani Regolatori Generali), whose limits the new structural plan is expected to overcome, both in terms of operational effectiveness and in terms of its ability to build agreement and consensus among the different urban actors around the idea of the city it embodies.

Relevance:

10.00%

Publisher:

Abstract:

Nano(bio)science and nano(bio)technology attract growing and tremendous interest in both academia and industry. They are undergoing rapid development on many fronts, such as genomics, proteomics, systems biology, and medical applications. However, the lack of characterization tools for nano(bio)systems is currently considered a major limiting factor to the final establishment of nano(bio)technologies. Flow Field-Flow Fractionation (FlFFF) is a separation technique that is definitely emerging in the bioanalytical field, and the number of applications to nano(bio)analytes such as high-molar-mass proteins and protein complexes, sub-cellular units, viruses, and functionalized nanoparticles is constantly increasing. This can be ascribed to the intrinsic advantages of FlFFF for the separation of nano(bio)analytes. FlFFF is ideally suited to separating particles over a broad size range (1 nm-1 μm) according to their hydrodynamic radius (rh). The fractionation is carried out in an empty channel by a flow stream of a mobile phase of any composition. Fractionation therefore proceeds without surface interaction of the analyte with packing or gel media, and there is no stationary phase able to induce mechanical or shear stress on nanosized analytes, which are thus kept in their native state. Characterization of nano(bio)analytes is made possible after fractionation by interfacing the FlFFF system with detection techniques for morphological, optical or mass characterization. For instance, coupling FlFFF with multi-angle light scattering (MALS) detection allows absolute molecular weight and size determination, and coupling with mass spectrometry has brought FlFFF into the field of proteomics. The potentialities of coupling FlFFF with multi-detection systems are discussed in the first section of this dissertation.
The second and third sections are dedicated to new methods developed for the analysis and characterization of samples of interest in the fields of diagnostics, pharmaceutics, and nanomedicine. The second section focuses on biological samples such as protein complexes and protein aggregates. In particular, it focuses on FlFFF methods developed to give new insights into: a) the chemical composition and morphological features of blood serum lipoprotein classes, b) the time-dependent aggregation pattern of the amyloid protein Aβ1-42, and c) the aggregation state of antibody therapeutics in their formulation buffers. The third section is dedicated to the analysis and characterization of structured nanoparticles designed for nanomedicine applications. The results discussed indicate that FlFFF with on-line MALS and fluorescence detection (FD) may become an unparalleled methodology for the analysis and characterization of new, structured, fluorescent nanomaterials.
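The size-based separation above can be illustrated with standard normal-mode FFF retention theory, which links the hydrodynamic radius to the retention ratio through the Stokes-Einstein diffusion coefficient. A sketch with illustrative channel parameters, not instrument settings from this work:

```python
import math

# Normal-mode flow FFF retention (standard FFF theory):
# R = 6*lam*(coth(1/(2*lam)) - 2*lam), with lam = D*V0/(Vc*w**2)
# and D from the Stokes-Einstein relation.
kB = 1.380649e-23      # Boltzmann constant (J/K)
T = 298.15             # temperature (K)
eta = 0.89e-3          # viscosity of water (Pa*s)

r_h = 5e-9             # hydrodynamic radius of the analyte (m)
w = 250e-6             # channel thickness (m)
V0 = 1.0e-6            # channel void volume (m^3), illustrative
Vc = 1.0e-6 / 60       # cross-flow rate: 1 mL/min in m^3/s

D = kB * T / (6 * math.pi * eta * r_h)   # diffusion coefficient (m^2/s)
lam = D * V0 / (Vc * w**2)               # retention parameter
R = 6 * lam * (1 / math.tanh(1 / (2 * lam)) - 2 * lam)

print(round(lam, 4), round(R, 4))
```

Since D scales as 1/rh, larger analytes give smaller λ and smaller R, i.e. they elute later: this is the size selectivity that the hyphenated detectors then exploit.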

Relevance:

10.00%

Publisher:

Abstract:

Gels are materials that are easier to recognize than to define. For all practical purposes, a material is termed a gel if the whole volume of liquid is completely immobilized, as usually verified by the "tube inversion" method. Recently, supramolecular gels obtained from low-molecular-weight gelators (LMWGs) have attracted considerable attention in materials science, since they represent a new class of smart materials sensitive to external stimuli such as temperature, ultrasound, light and chemical species. Accordingly, over the past years a large variety of potential applications of these soft materials has been reported: in optoelectronics, as electronic devices, light-harvesting systems and sensors, in biomaterials, and in drug delivery. Spontaneous self-assembly of low-molecular-weight molecules is a powerful tool for building complex supramolecular nanoscale structures. Weak, non-covalent interactions such as hydrogen bonding, π–π stacking, coordination, electrostatic and van der Waals interactions are usually considered the most important features promoting sol-gel equilibria. However, gelation is also governed by further "external" factors, among which the temperature and the nature of the solvents employed are of crucial importance. For example, some gelators prefer aromatic or halogenated solvents, and in some cases both the gelation temperature and the type of solvent affect the morphology of the final aggregates. Functionalized cyclopentadienones are fascinating systems widely employed as building blocks for the synthesis of polyphenylene derivatives. In addition, structures containing π-extended conjugated chromophores with enhanced absorption properties are of current interest in materials science, since they can be used as "organic metals", as semiconductors, and as emissive or absorbing layers for OLEDs or photovoltaics.
The possibility of decorating the framework of such structures prompted us to study the synthesis of new hydroxy propargyl arylcyclopentadienone derivatives. Considering the ability of such systems to engage in π–π stacking interactions, the introduction onto a polyaromatic structure of polar substituents able to generate hydrogen bonding could open the possibility of forming gels, although gelation had never been observed for these extensively studied systems. We have synthesized a new class of 3,4-bis(4-(3-hydroxy-propynyl)phenyl)-2,5-diphenylcyclopentadienone derivatives, one of which (1a) proved to be, for the first time, a powerful organogelator. The experimental results indicated that the hydroxydimethylalkynyl substituents are fundamental to guaranteeing the gelation properties of the tetraarylcyclopentadienone unit. Combining the results of FT-IR, 1H NMR, UV-vis and fluorescence emission spectra, we believe that H-bonding and π–π interactions are the driving forces behind gel formation. The importance of soft materials lies in their ability to respond to external stimuli, which can also be of a chemical nature. In particular, much attention has recently been devoted to the anion-responsive properties of gels. Therefore, the behaviour of organogels of 1a in toluene, ACN and MeNO2 upon addition of one equivalent of various tetrabutylammonium salts was investigated. The rheological properties of the gels in toluene, ACN and MeNO2, with and without the addition of Bu4N+X− salts, were measured, and a qualitative analysis of cation recognition was performed. Finally, the nature of the cyclic core of the gelator was changed in order to verify whether the carbonyl group is essential for gelling solvents; so far, 4,5-diarylimidazoles have been synthesized.

Relevance:

10.00%

Publisher:

Abstract:

This work seeks to delineate the theoretical and practical boundaries of authentic (legislative) interpretation, in the clear awareness that behind this topic lies the more complex problem of correctly demarcating legis-latio from legis-executio. The phenomenon of interpretative statutes is in fact the crux and point of intersection of three distinct fields: the theory of interpretation, the theory of the sources of law, and the liberal doctrine of the separation of powers. Within the Italian legal system, recent years have seen an exponential increase in interpretative legislative interventions which, at present, are used mostly as instruments of ordinary legislation. In this respect, the ever more frequent recourse to interpretative statutes can be framed within the broader phenomenon of the "crisis of statute law", whose traditional requirements of generality, abstractness and non-retroactivity have been progressively abandoned by the legislator in parallel with the rise of the constitutional State. The legislator's abuse of the interpretative instrument, which seriously harms individual legal positions, has so far not been effectively countered within the legal system, despite the Constitutional Court's elaboration of a series of limits and requirements for the legitimacy of legislative exegesis. In this perspective, the search for and examination of judicial and institutional strategies and remedies capable of containing the "omnipotence" of the interpreting legislator become of fundamental importance. The analysis carried out has led to an appreciation of the potential inherent in giving greater weight to the case law of the European Court of Human Rights, which is more inclined to sanction the abuse of interpretative statutes.

Relevance:

10.00%

Publisher:

Abstract:

Metal dress accessories are one of the main material sources for approaching the social, cultural and economic reality of the population of the late antique Mediterranean. In the case of fifth- and sixth-century finds from the Iberian Peninsula and south-western France, numerous documentation problems have prevented their full potential from being extracted and developed, both with regard to the typological and chronological framing of these objects and in the subsequent interpretative phase. A new monographic study updating the state of research was therefore needed. This work catalogues, dates and typologically classifies more than four thousand brooches and belt fittings recovered from almost five hundred sites located in present-day Portugal, Spain, Andorra and France. The result makes it possible to approach the production areas and the modes of circulation and use of each of the individual types. Some twenty distinct dress styles, defined by combinations of different types of accessories in funerary contexts, have been identified. Some of these form the main basis of a chronological system organized into six distinct phases, covering a period from roughly the last decades of the fourth century to the last decades of the sixth century. The research also analyses the distribution of the accessories, and of the dress styles associated with them, in the late antique landscape of Hispania and Gaul. The result makes it possible to reconstruct regional sequences of dress evolution and to establish relationships between various types of funerary and settlement contexts and the previously defined dress styles. The findings allow a fresh look at this type of object and at the place it occupied in the daily life of many inhabitants of the early Visigothic regnum.

Relevance:

10.00%

Publisher:

Abstract:

The national context has recently changed with the introduction of the new geodetic system, coincident with the European one (ETRS89, frame ETRF00) and realized by the stations of the Rete Dinamica Nazionale. This geodetic system, associated with the UTM_ETRF00 map projection, has become mandatory by decree for public administrations. The change has made it possible to survey cartographic data in absolute ETRF00 coordinates with much higher accuracy. However, when data surveyed in this way are used for cartographic updates, they lose their original coordinates and are adapted to the surrounding map features. To design a modernization of cadastral maps and technical maps that would allow updates to be introduced without altering their original absolute coordinates, the study began by assessing how to exploit the structuring of topographic data in the Database Geotopografico, the 3D building models of the INSPIRE cadastral experience, and the integration, within the MUDE framework, of building projects and their as-built records. The study then evaluated the NRTK real-time positioning services available in Italy, and experiments were carried out to verify, also at the local level, the precision and reliability of the available positioning services. The critical state of the cadastral cartography essentially stems from two facts: it was originally framed in about 850 local systems and later transformed into Roma40 with a very sparse density of re-measured points, and until 1988 it was updated with non-rigorous, low-quality procedures. To resolve these problems, it was therefore proposed to exploit NRTK surveying to locally increase the density of re-measured points and re-frame the cadastral maps.
The test, carried out in Bologna, involved a preliminary analysis to identify which Punti Fiduciali (cadastral control points) could be considered consistent with the map specifications, and then to use them to locally increase the density of re-measured points. The experiment allowed the project to be realized, so that future updates can be inserted without altering the ETRF00 coordinates obtained from the positioning service.
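Re-framing a cadastral map on re-measured control points amounts, in the simplest case, to estimating a 2D similarity (Helmert) transformation by least squares. A minimal sketch with invented coordinates (the actual cadastral adjustment is considerably more elaborate):

```python
import numpy as np

# Least-squares estimate of a 2D similarity (Helmert) transformation
# from local cadastral coordinates to ETRF00-derived plane coordinates,
# using a handful of re-measured control points. Illustrative numbers.
# Model: X = a*x - b*y + tx ;  Y = b*x + a*y + ty
src = np.array([[100.0, 200.0], [500.0, 250.0], [300.0, 600.0], [700.0, 650.0]])
true_a, true_b, tx, ty = 0.99995, 0.00020, 12.34, -7.89
dst = np.column_stack([
    true_a * src[:, 0] - true_b * src[:, 1] + tx,
    true_b * src[:, 0] + true_a * src[:, 1] + ty,
])

# Design matrix for the four unknowns (a, b, tx, ty), two rows per point
rows, obs = [], []
for (x, y), (X, Y) in zip(src, dst):
    rows.append([x, -y, 1.0, 0.0]); obs.append(X)
    rows.append([y, x, 0.0, 1.0]); obs.append(Y)
A = np.array(rows)
params, *_ = np.linalg.lstsq(A, np.array(obs), rcond=None)
print(np.round(params, 5))  # recovers a, b, tx, ty
```

With real Punti Fiduciali the residuals of this adjustment are what tells the surveyor which control points are consistent with the map specifications and which must be discarded.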

Relevance:

10.00%

Publisher:

Abstract:

The thesis analyses the hydrodynamics induced by an array of Wave Energy Converters (WECs), from both an experimental and a numerical point of view. WECs can be considered an innovative solution able to contribute to the green energy supply and, at the same time, to protect the rear coastal area, in line with marine spatial planning considerations. This research activity essentially arises from this combined concept. The WEC under examination is a floating device belonging to the Wave Activated Bodies (WAB) class. Experiments were performed at Aalborg University at different scales and layouts, and the performance of the models was analysed under a variety of irregular wave attacks. The numerical simulations were performed with the codes MIKE 21 BW and ANSYS-AQWA. Experimental results were also used to calibrate the numerical parameters and were directly compared to the numerical results, in order to extend the experimental database. The results of the research activity are summarized in terms of device performance and guidelines for a future wave farm installation. The device length should be "tuned" to the local climate conditions. The wave transmission behind the devices is rather high, suggesting that the tested layout should be considered as a module of a wave farm installation. Indications on the minimum inter-distance among the devices are provided. Furthermore, a CALM mooring system leads to lower wave transmission and larger power production than a spread mooring. The two numerical codes have different potentialities: the hydrodynamics around single and multiple devices is obtained with MIKE 21 BW, while wave loads and motions for a single moored device are derived from ANSYS-AQWA. Combining the experimental and numerical results, it is suggested, for both coastal protection and energy production, to adopt a staggered layout, which maximises device density and minimises the marine space required for the installation.
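The wave transmission behind the devices is conventionally quantified by the transmission coefficient Kt, the ratio of transmitted to incident significant wave height. A minimal sketch with illustrative spectral moments (not the Aalborg measurements):

```python
import math

# Transmission behind a floating WEC row: Kt = Ht / Hi, with the
# significant wave heights obtained from the zeroth spectral moment.
def significant_height(m0):
    """Hm0 = 4*sqrt(m0), m0 being the zeroth moment of the spectrum."""
    return 4.0 * math.sqrt(m0)

m0_incident = 0.0625     # m^2, illustrative value
m0_transmitted = 0.0400  # m^2, measured behind the device row

h_i = significant_height(m0_incident)
h_t = significant_height(m0_transmitted)
kt = h_t / h_i
print(round(kt, 2))  # Kt close to 1 means little sheltering
```

A Kt of this order (0.8) illustrates the abstract's point: a single row of floating WABs transmits most of the incident energy, so several staggered rows are needed for appreciable coastal protection.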

Relevância:

10.00%

Publicador:

Resumo:

Nowadays the rise of non-recurring engineering (NRE) costs associated with complexity is becoming a major factor in SoC design, limiting both scaling opportunities and the flexibility advantages offered by the integration of complex computational units. The introduction of embedded programmable elements can represent an appealing solution, able both to guarantee the desired flexibility and upgradability and to widen the SoC market. In particular, embedded FPGA (eFPGA) cores can provide bit-level optimization for those applications which benefit from synthesis, paying on the other hand in terms of performance penalties and area overhead with respect to standard-cell ASIC implementations. In this scenario, this thesis proposes a design methodology for a synthesizable programmable device designed to be embedded in a SoC. A soft-core embedded FPGA (eFPGA) is hence presented and analyzed in terms of the opportunities offered by a fully synthesizable approach, following an implementation flow based on a standard-cell methodology. A key point of the proposed eFPGA template is that it adopts a Multi-Stage Switching Network (MSSN) as the foundation of the programmable interconnect, since it can be efficiently synthesized and optimized through a standard-cell implementation flow while ensuring an intrinsically congestion-free network topology. The flexibility potentialities of the eFPGA have been evaluated using different technology libraries (STMicroelectronics CMOS 65nm and BCD9s 0.11μm) through a design-space exploration in terms of area-speed-leakage trade-offs, enabled by the full synthesizability of the template. Since the most relevant disadvantage of the adopted soft approach, compared to a hard core, is a performance overhead, the eFPGA analysis has been carried out targeting small area budgets.
The generation of the configuration bitstream has been obtained through the implementation of a custom CAD flow environment, which has allowed functional verification and performance evaluation through an application-aware analysis.
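The area argument for a multi-stage interconnect can be illustrated with the classic crosspoint count of a symmetric three-stage Clos network versus a flat crossbar. This is a generic sketch of the underlying idea, not the MSSN sizing used in the thesis; the parameter values below are illustrative.

```python
def crossbar_crosspoints(n_ports):
    """A flat N x N crossbar needs N^2 crosspoints."""
    return n_ports * n_ports

def clos_crosspoints(n, r, m):
    """Symmetric three-stage Clos with N = r*n ports:
    r ingress switches (n x m), m middle switches (r x r),
    r egress switches (m x n)."""
    return 2 * r * n * m + m * r * r

def strict_nonblocking_m(n):
    """Clos (1953): the network is strict-sense non-blocking for m >= 2n - 1."""
    return 2 * n - 1
```

Already at 36 ports (n = 6, r = 6, m = 11) the non-blocking Clos network needs fewer crosspoints than the crossbar, and the gap widens with N; a congestion-free multi-stage topology also maps naturally onto a standard-cell synthesis flow.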

Relevância:

10.00%

Publicador:

Resumo:

The research project aims at developing an innovative decision-support methodology, based on performance indicators, for the selection among design alternatives. In particular, the work focused on the definition of indicators able to support decision making in debottlenecking interventions on a process plant. Two indicators, the “bottleneck indicators”, were developed: they make it possible to assess the actual need for debottlenecking, identifying the causes that limit production and the degree of exploitation of the equipment. They were validated by applying them to the analysis of an intervention on an existing plant and verifying that the exploitation of the equipment was correctly identified. Once the need for the debottlenecking intervention was established, the problem of selecting among the possible process alternatives to realize it was addressed. A method based on sustainability indicators, further developed in this thesis, was applied to the choice: it allows the alternatives to be compared considering not only the economic return on investment but also the impacts on environment and safety. Two indicators, the “area hazard indicators”, related to fugitive emissions, were defined in order to integrate these aspects in the sustainability analysis of the alternatives. To improve the accuracy in the quantification of the impacts, a new predictive model for estimating the fugitive emissions of a plant was developed, based solely on the data available at the design stage and taking into account the types of emitting sources, their leak mechanisms and their maintenance. Validated against experimental data from a production plant, this method proved indispensable for a correct comparison of the alternatives, since existing models grossly overestimate the actual emissions.
Finally, by applying the indicators to an existing plant, it was shown that they are fundamental in simplifying the decision process, providing clear and precise indications while requiring a limited amount of input information.
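The idea behind a bottleneck indicator can be sketched as a utilization ratio per equipment item, with the most exploited unit limiting plant throughput. This is a hedged illustration in the spirit of the abstract, not the indicators actually defined in the thesis; the equipment tags and figures are invented for the example.

```python
def utilization(actual_duty, design_capacity):
    """Fraction of design capacity currently exploited by an equipment item."""
    if design_capacity <= 0:
        raise ValueError("design capacity must be positive")
    return actual_duty / design_capacity

def find_bottleneck(equipment):
    """Return (tag, utilization) of the most exploited item.

    `equipment` maps a tag to a pair (actual duty, design capacity)
    in consistent units.
    """
    return max(((tag, utilization(duty, cap))
                for tag, (duty, cap) in equipment.items()),
               key=lambda item: item[1])
```

An item running near utilization 1.0 is the candidate for debottlenecking; the alternatives for relieving it can then be ranked with economic and sustainability indicators as described above.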