Abstract:
The theme of the house, and more generally of dwelling, has returned to the centre of the social debate more than it has in the technical-architectural field. The limits of the proposals usually developed in our cities in the recent past are indeed fairly evident: proposals very often unable to take into account the many dimensions that the evolution of customs and of the urban and social structure has introduced into the residential sphere as well, dimensions tied to changed working conditions, to the cultural and religious diversity of newly settled ethnic groups, to the structure of family units (where these still exist) and to many other factors. As needs have changed (needs once absorbed within the structure of the family), desires and demands have changed too, while the regulatory framework has remained built on outdated social and economic models. Today more than ever, the theme therefore involves strong relationships between functional, technological and symbolic issues. Prompted by these general observations, the research started from an analysis of projects built in the historical period when Italy's post-war housing emergency came to an end, with the aim of reconsidering the vital approach deployed in those dramatic circumstances, while already aware that its subsequent development would be far more limited. After a brief survey of the typological and architectural character of those interventions, carried out to draw suggestions for a credible new prototype to investigate, the thesis, through a comparative analysis of the tools available today for communicating and managing the design, focused on the potential of the new information technologies (IT).
One cannot fail to observe that these technologies have changed not only the way we live, work, produce documents and exchange information, but also the way the design process is controlled. The phenomenon is still under way, but it is quite evident that design activity too, even in a sector such as the building industry, marked by considerable inertia and resistance to innovation, has undergone profound transformations thanks to the new technologies (changes that began with the advent of CAD), accelerating the progressive shift in the procedures for digital representation and documentation of the design. Research and experimentation therefore concentrated on this theme, on the assessment that the "Integrated Project Database" (IPDB) is probably destined to replace the concept of CAD as used until now in the building sector, i.e. as a tool for digital processing, mainly but not only graphical. A first design experiment then explored the potential and characteristics of the Building Information Model (BIM), to verify whether it really can provide an information archive supporting the design throughout the building's life cycle, able to define the building's virtual three-dimensional model starting from its components and to collect information on geometry, the physical characteristics of materials, construction cost estimates, performance assessments of materials and components, maintenance schedules, and contract and tendering procedures. The research analyses the structuring of the design of a residential building and presents a theoretical model aimed at communicating and managing the planning, open to all the actors involved in the building process and based on the potential of the parametric approach.
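The life-cycle information archive described above can be sketched as a minimal, hypothetical data model (class names, fields and figures are illustrative, not taken from the thesis):

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """One building component in a BIM-style integrated project database."""
    name: str
    material: str
    volume_m3: float            # geometric quantity taken off the 3D model
    unit_cost_eur_m3: float     # construction cost estimate per unit volume
    maintenance_years: int      # scheduled maintenance interval

    def cost(self) -> float:
        return self.volume_m3 * self.unit_cost_eur_m3

@dataclass
class ProjectDatabase:
    """Collects components and answers life-cycle queries for the whole building."""
    components: list = field(default_factory=list)

    def add(self, c: Component) -> None:
        self.components.append(c)

    def total_cost(self) -> float:
        return sum(c.cost() for c in self.components)

    def due_for_maintenance(self, year: int) -> list:
        return [c.name for c in self.components if year % c.maintenance_years == 0]

db = ProjectDatabase()
db.add(Component("external wall", "brick", 48.0, 310.0, 25))
db.add(Component("flat roof", "concrete", 30.0, 280.0, 10))
print(db.total_cost())            # 48*310 + 30*280 = 23280.0
print(db.due_for_maintenance(20)) # ['flat roof']
```

The point of such a structure is that cost estimation and maintenance planning become queries over the same component archive that defines the 3D model, rather than separate documents.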
Abstract:
A machining centre is nowadays a complex mechanical, electronic and electrical system whose integrated design is often very time-consuming. Numerical techniques for designing and dimensioning the machine structure and components usually require different expertise depending on the subsystem to be designed. This Ph.D. thesis describes the author's efforts to develop a system that allows the complete design of a new machine, optimized in its dynamic behaviour. An integration of the different systems developed, each of which responds to specific needs of the designer, is presented here. In particular, a dynamic analysis system based on a lumped-mass approach, which allows the machine drives to be set up rapidly, and an Integrated Dynamic Simulation System based on an FEM approach, which permits dynamic optimization, are shown. A multilevel database and an operator interface module complete the design platform. The proposed approach represents a significant step toward virtual machining for predicting the quality of the machined surface.
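The lumped-mass approach mentioned above reduces the drive train to a few masses and stiffnesses, so the machine's natural frequencies come out of a small eigenvalue problem. A minimal two-mass sketch (all values illustrative, not from the thesis):

```python
import numpy as np

# Two-mass model of a machine drive train: motor-side mass and load-side mass
# coupled by a transmission of stiffness 4e6 N/m; the load also sees a
# grounded structural stiffness of 1e6 N/m.
m = np.diag([2.0, 5.0])                  # kg
k = np.array([[4.0e6, -4.0e6],
              [-4.0e6, 4.0e6 + 1.0e6]])  # N/m, symmetric stiffness matrix

# The eigenproblem K v = w^2 M v gives the squared angular natural frequencies.
w2 = np.linalg.eigvals(np.linalg.inv(m) @ k)
freqs_hz = np.sort(np.sqrt(w2.real)) / (2 * np.pi)
print(freqs_hz)  # two natural frequencies, lowest first
```

Keeping such a model small is what makes it suitable for the rapid drive set-up stage, before the full FEM optimization.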
Abstract:
"The pianist's touch: historical premises and scientific developments" aims to prove the polytimbral nature of the piano. To this end, I investigated the relationship between gesture, the piano action and sound, a problem touched upon by some twentieth-century masters but never examined in depth, for obvious reasons: the lack of suitable technology, and of skills rarely found in a single person. For this latter reason I collaborated both with the Laboratory of Functional Anatomy of the Locomotor Apparatus of the Department of Human Morphology of the University of Milan, where experts in the most modern motion-recording technologies work, and with the engineer Alberto Amendola, adjunct professor of Musical Acoustics at the University of Parma, for the sound analysis and the acoustic measurements. The thesis is organised in two parts and four chapters. In the first, "Piano pedagogy in the first three decades of the twentieth century: touch and timbre as key words", after outlining the concepts of 'touch' and 'timbre' found in eighteenth- and nineteenth-century methods and treatises, already addressed in my degree thesis, I analyse some of the most representative works written between the end of the nineteenth century and the 1930s (The Leschetizky Method: A Guide to Fine and Correct Piano Playing by Malwine Brée, Über die physiologischen Fehler und die Umgestaltung der Klaviertechnik by Albert Maria Steinhausen, Die Grundlagen der Klaviertechnik by Rudolph Maria Breithaupt and The Physiological Mechanics of Piano Technique by Otto Ortmann). These studies include a section on the different modes of sound production, and almost all of them reach the same conclusion: for all its richness, the piano is a monotimbral instrument, in which the difference between sounds lies in intensity and agogics.
In order to prove the piano's polytimbral nature, my studies had to engage both with the mechanics of the piano and with musical acoustics. I therefore preceded the scientific investigation, which flows into chapter IV, with a section presenting the evolution of the piano action up to a description of the modern action (chapter II, "The piano: action and modes of sound production"), and with another covering the fundamentals of musical acoustics, to give the reader the basic tools to approach the scientific part (chapter III, "Elements of musical acoustics"). Chapter IV is the organic and systematic account of the experiments carried out during the doctorate at the Laboratory of Functional Anatomy of the Locomotor Apparatus of the University of Milan. The presentation necessarily retraces the stages of the research, given the absolute novelty of the object under investigation. The presentation of the data from each phase is always followed by a discussion and interpretation of the results, to guarantee the validity of the experiment. Interest in the research was shared not only by the Department of Anatomy but also by the piano manufacturer Bechstein, which built a special action, and by the piano firm Angelo Fabbrini, which made a Bechstein baby grand available for the measurements. Chapter IV, which is therefore the heart of this doctoral dissertation, demonstrates that the piano is a polytimbral instrument: by relating the pianist's gesture, the reaction of the action and the sound, it emerged that the movement of the hammer, repeatably different according to the pianist's touch, corresponds to an acoustic response that varies repeatably, and differently, according to the touch.
Abstract:
This PhD thesis describes the set-up of technological models for obtaining foods and ingredients of high health value that preserve the characteristics of the final product while enriching it with nutritional components. In particular, the main object of my research has been Virgin Olive Oil (VOO) and its important antioxidant compounds, which differentiate it from all other vegetable oils. It is well known that the qualitative and quantitative presence of phenolic molecules extracted from olives during oil production is fundamental to its oxidative and nutritional quality. To this end, the agronomic and technological conditions of VOO production were investigated, as was the question of how this fraction can best be preserved during storage. Moreover, its relation to VOO sensory characteristics and its interaction with a protein in emulsified foods were also studied. Finally, an experimental study was carried out to determine the antioxidative and heat-resistance properties of a new antioxidant (EVS-OL) when used for high-temperature frying such as is typically employed in the preparation of french fries. The results of this research have been submitted for publication, and some data have already been published in national and international scientific journals.
Abstract:
This PhD thesis set out to validate and then apply innovative analytical methodologies for the determination of compounds with a harmful impact on human health, such as biogenic amines and ochratoxin A, in wines. The influence of production technology (pH, amino acid precursors and the use of different malolactic starters) on the biogenic amine content of wines was evaluated. An HPLC method for the simultaneous determination of amino acids and amines, with pre-column derivatization with 9-fluorenylmethoxycarbonyl chloride (FMOC-Cl) and UV detection, was developed. Initially, the influence of pH, derivatization time and gradient profile was studied. To improve the separation of amino acids and amines and to reduce the analysis time, the influence of different flow rates and different columns on the chromatographic method was then studied: first a C18 Luna column, and later two monolithic Chromolith columns in series. The method proved suitable for the easy, precise and accurate determination of a relatively large number of amino acids and amines in wines. It was then applied to different wines produced in the Emilia-Romagna region. The investigation made it possible to discriminate between red and white wines. Amino acid content is related to the winemaking process. The biogenic amine content of these wines does not represent a toxicological problem for human health. The study of the influence of technology and wine composition showed that wine pH and amino acid content are the most important factors: in particular, wines with pH > 3.5 show higher concentrations of biogenic amines than wines with lower pH. Enrichment of the wines with nutrients also influences the content of some biogenic amines, which are higher in wines supplemented with amino acid precursors.
In this study, amino acids and biogenic amines were not statistically affected by the strain of lactic acid bacteria inoculated as a starter for malolactic fermentation. Different clean-up procedures (SPE-MycoSep, IACs and LLE) and determination methods (HPLC and ELISA) for ochratoxin A were also evaluated. The results showed that the SPE clean-up procedures are equally reliable, while the LLE procedure gives the lowest recovery. The ELISA method gave lower determinations and poorer reproducibility than the HPLC method.
Abstract:
The aim of the first part of this thesis was to evaluate the effect of diets enriched with trans fatty acids (TFA), contaminants, polycyclic aromatic hydrocarbons (PAH) and oxidation products on the content of TFA and conjugated linoleic acid (CLA) isomers in the meat and liver of both poultry and rabbit. The enriched feeds were prepared with preselected fatty co- and by-products containing low and high levels of TFA (low, palm fatty acid distillate; high, hydrogenated palm fatty acid distillate), environmental contaminants (dioxins and PCBs; two different fish oils), PAH (olive oil acid oils and pomace olive oil from chemical refining, for low and high levels) and oxidation products (a sunflower-olive oil blend before and after frying), so as to obtain single feeds with three enrichment degrees (high, medium and low) of the compound of interest. This experimental set-up is part of a large collaborative European project (http://www.ub.edu/feedfat/), in which other chemical and health parameters are assessed. Lipids were extracted, methylated with diazomethane, then transmethylated with 2N KOH/methanol and analyzed by GC and silver-ion TLC-GC. TFA and CLA were determined in the fats, the feeds, and the meat and liver of both poultry and rabbit. In general, the levels of TFA and CLA in meat and liver varied mainly according to those originally found in the feed fats. It must be pointed out, though, that TFA and CLA accumulation differed between the two animal species, as well as between the two types of tissue. The TFA composition of meat and liver changes according to the composition of the oils added to the feeds, with some differences between species. Chicken meat with skin shows a higher TFA content (2.6-5.4 fold) than rabbit meat, except for the "PAH" trial. Chicken liver shows a higher TFA content (1.2-2.1 fold) than rabbit liver, except for the "TRANS" and "PAH" trials.
In both chicken and rabbit meat, the TFA content was highest for the "TRANS" trial, followed by the "DIOXIN" trial. Slight differences were found between the "OXIDATION" and "PAH" trends in both types of meat. In both chicken and rabbit liver, the TFA content was highest for the "TRANS" trial, followed by the "PAH", "DIOXIN" and "OXIDATION" trials. This trend, however, was not identical to that of the feeds, where the TFA content varied as follows: "TRANS" > "DIOXIN" > "PAH" > "OXIDATION". In chicken and rabbit meat samples, C18:1 TFA were the most abundant, followed by C18:2 TFA and C18:3 TFA, except for the "DIOXIN" trial, where C18:3 TFA > C18:2 TFA. In chicken and rabbit liver samples from the "TRANS" and "OXIDATION" trials, C18:1 TFA were the most abundant, followed by C18:2 TFA and C18:3 TFA, whereas C18:3 TFA > C18:2 TFA in the "DIOXIN" trial. Slight differences were found in the "PAH" trend in livers from both species. The second part of the thesis dealt with the study of lipid oxidation in washed turkey muscle with different added antioxidants. The evaluation of the oxidative stability of muscle foods found that oxidation could be measured by headspace solid-phase microextraction (SPME) of hexanal and propanal. To make this method effective, an antioxidant system was added to the stored muscle to stop the oxidative processes. The ionic strength of the sample was also increased, to raise the concentration of aldehydes in the headspace. This method was found to be more sensitive than the commonly used thiobarbituric acid reactive substances (TBARS) method. However, after antioxidants were added and oxidation was stopped, the concentration of aldehydes decreased. This decrease was found to be due to the binding of the aldehydes to muscle proteins, which lowers their volatility and makes them less detectable.
Abstract:
Maintaining the postharvest quality of whole and fresh-cut fruit during storage and distribution is the major challenge facing the fruit industry, which adopts a wide range of technologies to extend shelf-life. Many factors can lead to loss of quality in fresh products, hence their common description as 'perishable': normal processes such as transpiration and respiration ultimately lead to water loss and senescence. Fruits and vegetables are living commodities, and their respiration rate is of key importance to quality maintenance; it is commonly observed that the greater the respiration rate of a product, the shorter its shelf-life. The principal problem for the fresh-cut fruit industry is the shorter shelf-life of minimally processed fruit (MPF) compared with the intact product. This is closely connected with the higher ethylene production of fruit tissue stimulated during fresh-cut processing (peeling, cutting, dipping). 1-Methylcyclopropene (1-MCP) is an inhibitor of ethylene action, and several studies have shown its effectiveness in inhibiting ripening and senescence in intact fruit and consequently in extending their shelf-life. More recently, 1-MCP treatment has also been tested for shelf-life extension of MPF, but discordant results have been obtained. Considering that in some countries 1-MCP is already a commercial product registered for use on a number of horticultural products, the main aim of the present study was to improve our understanding of the effects of 1-MCP treatment on the quality maintenance of whole and fresh-cut climacteric and non-climacteric fruit (apple, kiwifruit and pineapple).
Concerning whole fruit, the effects of a semi-commercial postharvest 1-MCP treatment on the quality of Pink Lady apples were investigated as functions of fruit ripening stage, 1-MCP dose and storage time, and also in combination with controlled atmosphere (CA) storage, in order to better understand the relationships among these parameters and whether the treatment can be optimized to meet market and consumer needs. To this end, an incomplete three-level three-factor design was adopted. Several quality parameters were monitored during storage (firmness, ripening index, ethylene and carbon dioxide production), and a sensory evaluation was performed after 6 months of storage. The greatest retention of firmness at the end of storage was achieved by applying the highest 1-MCP concentration to fruit at the lowest maturity stage; under these semi-commercial conditions, fruit softening can be considered completely blocked. 1-MCP also delayed ethylene and CO2 production and the evolution of the maturity parameters (soluble solids content and total acidity). Only in some cases did 1-MCP generate a synergistic effect with CA storage. The sensory analyses indicated that the 1-MCP treatment did not affect sweetness or whole-fruit flavour, while it slightly decreased cut-fruit flavour; on the other hand, treated apples were more sour, crisp, firm and juicy. The effects of some treatments (dipping and MAP) on nutrient stability were also investigated, showing that in this case study the adopted treatments did not have drastic effects on the antioxidant compounds; on the contrary, dipping may enhance total antioxidant activity through the accumulation of ascorbic acid on the apple cut surface.
Results concerning the effects of 1-MCP in combination with MAP on the quality parameters of kiwifruit were not always consistent and clear: in terms of colour maintenance, 1-MCP seemed to have a synergistic effect with N2O MAP; as far as the ripening index is concerned, it had a preservative effect, but only for samples packed in air.
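The incomplete three-level three-factor design mentioned for the apple trial can be illustrated by enumerating the full 3^3 factorial and keeping only a fraction of the runs (a hypothetical sketch; the factor levels and the subset actually used in the study are not reproduced here):

```python
from itertools import product

maturity = ["early", "mid", "late"]   # fruit ripening stage
dose_ul_l = [0.0, 0.5, 1.0]           # 1-MCP dose, illustrative values
storage_m = [2, 4, 6]                 # storage time, months

full = list(product(maturity, dose_ul_l, storage_m))  # 27 runs in the full design
assert len(full) == 27

# Incomplete design: keep, say, every other run. In practice a balanced
# fraction would be chosen with design-of-experiments software.
incomplete = full[::2]
print(len(incomplete))  # 14
```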
Abstract:
Biological processes are very complex mechanisms, most of them accompanied by, or manifested as, signals that reflect their essential characteristics and qualities. The development of diagnostic techniques based on signal and image acquisition from the human body is commonly regarded as one of the driving factors behind the recent advances in medicine and the biosciences. The instruments used for biological signal and image recording, like any other acquisition system, are affected by non-idealities which, to different degrees, negatively impact the accuracy of the recording. This work discusses how these effects can be attenuated, and ideally removed, with particular attention to ultrasound imaging and extracellular recordings. Original algorithms developed during the Ph.D. research activity are examined and compared with those in the literature tackling the same problems; results are drawn from comparative tests on both synthetic and in-vivo acquisitions, evaluating standard metrics in the respective fields of application. All the developed algorithms share an adaptive approach to signal analysis, meaning that their behavior is driven not only by designer choices but also by the characteristics of the input signal. Performance comparisons based on state-of-the-art image quality assessment, contrast gain estimation and resolution gain quantification, as well as visual inspection, highlighted very good results for the proposed ultrasound image deconvolution and restoration algorithms: axial resolutions up to 5 times better than those of algorithms in the literature are possible. For extracellular recordings, the proposed denoising technique, compared with other signal processing algorithms, improved on the state of the art by almost 4 dB.
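As a generic illustration of the deconvolution problem discussed above (classical Wiener filtering, not the adaptive algorithms developed in the thesis; all signals are synthetic), a blurred 1-D reflectivity profile can be restored in a few lines:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "tissue reflectivity": a few point scatterers, convolved with a
# short Gaussian pulse (the system PSF), plus acquisition noise.
n = 256
x = np.zeros(n)
x[[40, 43, 128, 200]] = [1.0, -0.6, 0.8, 0.5]
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 1.5) ** 2)
psf /= psf.sum()
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(np.fft.ifftshift(psf))))
y += 0.002 * rng.standard_normal(n)

# Wiener deconvolution: X_hat = H* Y / (|H|^2 + noise-to-signal ratio).
H = np.fft.fft(np.fft.ifftshift(psf))
nsr = 1e-3
x_hat = np.real(np.fft.ifft(np.conj(H) * np.fft.fft(y) / (np.abs(H) ** 2 + nsr)))

# The estimate should be closer to the true reflectivity than the blurred data.
print(np.linalg.norm(x - x_hat), np.linalg.norm(x - y))
```

The two close scatterers at samples 40 and 43 are what an axial-resolution metric would probe: they merge in the blurred trace and separate again after deconvolution.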
Abstract:
Context-aware computing is currently considered the most promising approach to overcoming information overload and speeding up access to relevant information and services. Context-awareness may be derived from many sources, including user profile and preferences, network information and sensor analysis; usually it relies on the ability of computing devices to interact with the physical world, i.e. with the natural and artificial objects hosted within the "environment". Ideally, context-aware applications should not be intrusive and should be able to react according to the user's context with minimum user effort. Context is an application-dependent multidimensional space, and location has been an important part of it since the very beginning: location can be used to guide applications in providing the information or functions most appropriate for a specific position. Hence location systems play a crucial role. There are several technologies and systems for computing location to varying degrees of accuracy, each tailored to a specific space model, i.e. indoors or outdoors, structured or unstructured spaces. The research challenge faced by this thesis is pedestrian positioning in heterogeneous environments, with a particular focus on pedestrian identification, localization, orientation and activity recognition. This research was mainly carried out within the "mobile and ambient systems" workgroup of EPOCH, an FP6 Network of Excellence on the application of ICT to Cultural Heritage; applications in Cultural Heritage sites were therefore the main target of the context-aware services discussed. Cultural Heritage sites are significant test-beds for context-aware computing for many reasons: for example, building a smart environment in museums or protected sites is a challenging task, because localization and tracking are usually based on technologies that are difficult to hide or harmonize within the environment.
It is therefore expected that the experience gained with this research may also be useful in domains other than Cultural Heritage. This work presents three different approaches to pedestrian identification, positioning and tracking: pedestrian navigation by means of a wearable inertial sensing platform assisted by a vision-based tracking system for initial settings and real-time calibration; pedestrian navigation by means of a wearable inertial sensing platform augmented with GPS measurements; and pedestrian identification and tracking combining the vision-based tracking system with WiFi localization. The proposed localization systems have mainly been used to enhance Cultural Heritage applications by providing information and services that depend on the user's actual context, in particular on the user's location.
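The inertial pedestrian navigation described above is, at its core, dead reckoning: each detected step advances the position by a step length along the current heading, and external fixes (vision, GPS, WiFi) correct the accumulated drift. A minimal sketch with illustrative numbers and a simple complementary fusion rule (not the estimator actually used in the thesis):

```python
import math

def dead_reckon(start, steps):
    """Propagate a 2-D position from (step_length_m, heading_rad) pairs."""
    x, y = start
    for length, heading in steps:
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return x, y

# Four 0.7 m steps east, then four steps north.
pos = dead_reckon((0.0, 0.0), [(0.7, 0.0)] * 4 + [(0.7, math.pi / 2)] * 4)
print(pos)  # in practice heading drift accumulates and must be corrected

# An external fix (GPS/WiFi/vision) can be blended in with a fixed weight.
def fuse(dr, fix, alpha=0.3):
    """Complementary weighting between dead-reckoned position and external fix."""
    return tuple((1 - alpha) * d + alpha * f for d, f in zip(dr, fix))

print(fuse(pos, (3.0, 2.6)))
```

A Kalman filter would replace the fixed weight `alpha` with one derived from the uncertainties of the two sources, but the structure of the correction is the same.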
Abstract:
Technology scaling increasingly emphasizes the complexity and non-ideality of the electrical behavior of semiconductor devices and boosts interest in alternatives to the conventional planar MOSFET architecture. TCAD simulation tools are fundamental to the analysis and development of new technology generations. However, the increasing device complexity is reflected in an augmented dimensionality of the problems to be solved. The trade-off between accuracy and computational cost of the simulation is especially influenced by domain discretization: mesh generation is therefore one of the most critical steps, and automatic approaches are sought. Moreover, the problem size is further increased by process variations, which call for a statistical representation of the single device through an ensemble of microscopically different instances. The aim of this thesis is to present multi-disciplinary approaches to handling this increasing problem dimensionality from a numerical simulation perspective. The topic of mesh generation is tackled by presenting a new Wavelet-based Adaptive Method (WAM) for the automatic refinement of 2D and 3D domain discretizations. Multiresolution techniques and efficient signal processing algorithms are exploited to increase grid resolution in the domain regions where the relevant physical phenomena take place. Moreover, the grid is dynamically adapted to follow solution changes produced by bias variations, and quality criteria are imposed on the produced meshes. The further dimensionality increase due to variability in extremely scaled devices is considered with reference to two increasingly critical phenomena, namely line-edge roughness (LER) and random dopant fluctuations (RD).
The impact of such phenomena on FinFET devices, which represent a promising alternative to planar CMOS technology, is estimated through 2D and 3D TCAD simulations and statistical tools, taking into account matching performance of single devices as well as basic circuit blocks such as SRAMs. Several process options are compared, including resist- and spacer-defined fin patterning as well as different doping profile definitions. Combining statistical simulations with experimental data, potentialities and shortcomings of the FinFET architecture are analyzed and useful design guidelines are provided, which boost feasibility of this technology for mainstream applications in sub-45 nm generation integrated circuits.
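The wavelet-driven refinement idea behind the WAM can be sketched in 1-D: Haar detail coefficients are large where the solution varies sharply, and only those cells need refinement (a toy example, not the thesis's method, which works on 2D/3D grids with quality criteria):

```python
import numpy as np

def refine_flags(u, threshold):
    """Flag cell pairs whose Haar detail coefficient exceeds the threshold."""
    # Haar detail on adjacent sample pairs: d_i = (u[2i] - u[2i+1]) / sqrt(2).
    detail = (u[0::2] - u[1::2]) / np.sqrt(2)
    return np.abs(detail) > threshold

# A solution with a sharp, junction-like transition around x = 0.5, as in a
# depletion region of a scaled device (illustrative profile).
x = np.linspace(0.0, 1.0, 64)
u = np.tanh((x - 0.5) / 0.02)

flags = refine_flags(u, threshold=0.05)
print(flags.sum(), "of", flags.size, "cell pairs flagged for refinement")
```

Only the pairs straddling the transition are flagged, which is exactly the property that lets a multiresolution mesh concentrate nodes where the physics demands them.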
Abstract:
In the biological world, the life of cells is guaranteed by their ability to sense and respond to a large variety of internal and external stimuli. In particular, excitable cells, like muscle or nerve cells, produce quick depolarizations in response to electrical, mechanical or chemical stimuli: they can change their internal potential through a quick exchange of ions between the cytoplasm and the external environment. This is made possible by ion channels, proteins that span the lipid bilayer and act like switches, allowing ionic current to flow by opening and shutting in a stochastic way. For a particular class of ion channels, ligand-gated ion channels, the gating process is strongly influenced by the binding between receptive sites located on the channel surface and specific target molecules. These channels, inserted in biomimetic membranes and coupled with a proper electronic system for acquiring and processing the electrical signal, could make it possible to detect and quantify the concentrations of specific molecules in complex mixtures from the ionic currents across the membrane; this thesis investigates that possibility. In particular, it describes experiments focused on the creation and characterization of artificial lipid membranes, the reconstitution of ion channels and the analysis of their electrical and statistical properties. Moreover, after a chapter on the basics of modelling the kinetic behaviour of ligand-gated ion channels, a possible approach to estimating the target molecule concentration, based on a statistical analysis of the ion channel open probability, is proposed. The fifth chapter describes the kinetic characterization of a ligand-gated ion channel, the homomeric α2 isoform of the glycine receptor, involving both experimental acquisitions and signal analysis.
The last chapter presents the conclusions of this thesis, with some remarks on the effective performance that may be achieved using ligand-gated ion channels as sensing elements.
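The proposed read-out, concentration from open probability, can be illustrated with a generic Hill-type dose-response model (a sketch under simplifying assumptions, not the kinetic scheme characterized in the thesis; all parameter values are illustrative):

```python
def open_probability(conc, ec50, n_hill, p_max=1.0):
    """Hill-type open probability of a ligand-gated channel vs. agonist concentration."""
    return p_max * conc ** n_hill / (ec50 ** n_hill + conc ** n_hill)

def estimate_concentration(p_open, ec50, n_hill, p_max=1.0):
    """Invert the Hill curve to read concentration back from a measured open probability."""
    r = p_open / (p_max - p_open)
    return ec50 * r ** (1.0 / n_hill)

# Round trip with parameters of a plausible order of magnitude (illustrative).
ec50, n = 100e-6, 2.0   # EC50 = 100 uM, Hill coefficient 2
c_true = 60e-6
p = open_probability(c_true, ec50, n)
print(estimate_concentration(p, ec50, n))  # recovers the 60 uM input
```

In practice the open probability itself is a statistical estimate from the recorded current trace, so the concentration estimate inherits its variance; that is where the statistical analysis of the dwell times comes in.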
Abstract:
The electromagnetic spectrum can be seen as a resource for the designer, as well as for the manufacturer, from two complementary points of view: first, because it is a good in great demand by many different kinds of applications; second, because despite its scarce availability, it may be advantageous to use more spectrum than necessary. This is the case of spread-spectrum systems, in which the transmitted signal is spread over a frequency band much wider than the minimum bandwidth required to transmit the information being sent. Part I of this dissertation deals with Spread-Spectrum Clock Generators (SSCG), which aim to reduce the Electromagnetic Interference (EMI) of clock signals in integrated circuit (IC) design. In particular, the modulation of the clock and the consequent spreading of its spectrum are obtained through a random modulating signal produced by a chaotic map, i.e. a discrete-time dynamical system showing chaotic behavior. The advantages offered by this kind of modulation are highlighted. Three different prototypes of chaos-based SSCG are presented in all their aspects: design, simulation, and post-fabrication measurements. The third one, operating at 3 GHz, is intended for Serial ATA, the de facto standard for fast data transmission to and from hard disk drives. The most extreme example of spread-spectrum signalling is the emerging ultra-wideband (UWB) technology, which proposes the use of large sections of the radio spectrum at low amplitudes to transmit high-bandwidth digital data. Part II of the dissertation presents two UWB applications, both dealing with the advantages as well as the challenges of a wide-band system: a chaos-based sequence generation method for reducing Multiple Access Interference (MAI) in Direct-Sequence UWB Wireless Sensor Networks (WSNs), and the design and simulation of a Low-Noise Amplifier (LNA) for impulse-radio UWB. This latter topic was studied during a period abroad in collaboration with Delft University of Technology, Delft, Netherlands.
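The chaos-based modulation at the heart of the SSCG can be sketched with a tent map: iterating it yields a bounded, aperiodic sequence that spreads the clock energy around its nominal frequency (numbers are illustrative, not the prototype's parameters):

```python
def tent_map(x, mu=1.99):
    """One iterate of the tent map on [0, 1]; mu close to 2 gives chaotic orbits."""
    return mu * x if x < 0.5 else mu * (1.0 - x)

def modulation_sequence(x0, n, spread=0.005):
    """Map the chaotic orbit to per-cycle relative frequency deviations (here +/- 0.5%)."""
    seq, x = [], x0
    for _ in range(n):
        x = tent_map(x)
        seq.append(spread * (2.0 * x - 1.0))  # center on 0, scale to +/- spread
    return seq

f_nominal = 3.0e9  # 3 GHz, as in the Serial ATA prototype mentioned above
devs = modulation_sequence(0.123456, 1000)
freqs = [f_nominal * (1.0 + d) for d in devs]
print(min(freqs), max(freqs))  # clock energy spread around 3 GHz
```

Compared with a periodic (e.g. triangular) modulation profile, a chaotic drive has no discrete modulation tones of its own, which is the advantage exploited for EMI reduction.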