920 results for sottotitolaggio, pivot subtitling, subtitles, tecniche di sottotitolaggio, This is England
Abstract:
The aim of this thesis is to compare two subtitling techniques currently on the market by subtitling an excerpt of the film "This is England". The first approach is traditional subtitling, i.e. the creation of subtitles in a given language starting from the audio and the script in the language of the original dialogue. The second method is pivot subtitling, which consists in creating subtitles in a given language starting from existing subtitles in a language other than that of the original dialogue. This thesis grew out of the wish to try both techniques first-hand and to form a critical opinion about them. The next step was the choice of a film: "This is England", which is hard to understand because of the particularly strong accents of some of the characters, the density of some of the dialogue, and the cultural references to the English skinhead subculture born in the late 1960s. The main part of the work was the creation of the subtitles. In the first chapter I will introduce the film and explain the first phase of the work, i.e. how I carried out the traditional subtitling process; this is followed by background on pivot subtitles and by the second process, the creation of the Italian ones. The second chapter explains the concept of subtitle quality. Then comes the commentary on the subtitles, which compares the subtitles obtained with the two different techniques for the same line or scene. Finally, I analyse the differences that emerge between the two techniques, their impact on the audience, the advantages and disadvantages of each, and the reasons why one is used more than the other. These considerations serve to answer the research question of the thesis: what is the difference between the two techniques? Are both results acceptable even though they differ?
Abstract:
This Is England is a social realist film portraying racism and poverty in 1980s Britain through the eyes of Shaun, a 12-year-old boy who has lost his father in the Falklands War and has to come to terms with his own identity, the difficult transition from childhood to adolescence and the need to fit into a particular group/tribe/gang. The following article aims to analyse relevant aspects depicted in the film, emphasizing the much-debated reality of life during the 1980s. In This is England, Shane Meadows manages to rediscover his own personal geography by revisiting his adolescent years. It is a biographical film about the importance of peer pressure and the results of an excess of nationalism; at the same time it typifies some issues related to 1980s youth culture.
Abstract:
Human biomonitoring (HBM) is an ideal tool for evaluating toxicant exposure in health risk assessment. Chemical substances, or their metabolites related to environmental pollutants, can be detected as biomarkers of exposure in a wide variety of biological fluids. Individual exposure to aromatic hydrocarbon compounds (benzene, toluene and o-xylene, "BTX") was analysed with a liquid chromatography coupled to electrospray ionisation tandem mass spectrometry (μHPLC-ESI-MS/MS) method for the simultaneous quantitative detection of the BTX exposure biomarkers SPMA, SBMA and o-MBMA in human urine. Urinary S-phenylmercapturic acid (SPMA) is a biomarker proposed by the American Conference of Governmental Industrial Hygienists (ACGIH) for assessing occupational exposure to benzene (Biological Exposure Index of 25 µg/g creatinine). Urinary S-benzylmercapturic acid (SBMA) and o-methyl S-benzyl mercapturic acid (o-MBMA) are specific toluene and o-xylene metabolites of the glutathione detoxification pathways, proposed as reliable biomarkers of exposure. To this aim, a pre-treatment of the urine with solid phase extraction (SPE) and an evaporation step were necessary to concentrate the mercapturic acids before instrumental analysis. The liquid chromatography separation was carried out with a reversed-phase capillary column (Synergi 4u Max-RP) using a binary gradient composed of an aqueous solution of formic acid 0.07% v/v and methanol. The mercapturic acids were determined by negative-ion mass spectrometry and the data were corrected using isotope-labelled analogues as internal standards. The analytical method follows U.S. Food and Drug Administration guidance and was applied to assess exposure to BTX in a group of 396 traffic wardens. The association between biomarker results and individual factors, such as age, sex and tobacco smoke, was also investigated. The present work also included improvements to the methods used, obtained by modifying various chromatographic parameters and experimental procedures.
A partial validation was conducted to evaluate LOD, precision, accuracy and recovery, as well as matrix effects. Higher sensitivity will be possible in future biological monitoring programmes, allowing the evaluation of very low levels of BTX human exposure. Keywords: human biomonitoring, aromatic hydrocarbons, biomarker of exposure, HPLC-MS/MS.
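The correction with isotope-labelled internal standards mentioned above can be illustrated with a minimal sketch. The function names, peak areas and calibration coefficients below are hypothetical, not values from the study; they only show how an analyte signal is normalised to its labelled analogue and converted to a creatinine-corrected urinary concentration.

```python
# Sketch of isotope-dilution quantification: each analyte peak area is
# normalised to the area of its isotope-labelled internal standard, and the
# concentration is read off a linear calibration of that ratio.
# All numbers are illustrative, not data from the study.

def area_ratio(analyte_area: float, istd_area: float) -> float:
    """Peak-area ratio of analyte to its labelled internal standard."""
    return analyte_area / istd_area

def concentration(ratio: float, slope: float, intercept: float) -> float:
    """Invert a linear calibration: ratio = slope * conc + intercept."""
    return (ratio - intercept) / slope

def creatinine_corrected(conc_ug_l: float, creatinine_g_l: float) -> float:
    """Express a urinary concentration as ug per g creatinine."""
    return conc_ug_l / creatinine_g_l

# Hypothetical SPMA measurement:
r = area_ratio(analyte_area=1520.0, istd_area=3040.0)   # ratio = 0.5
c = concentration(r, slope=0.04, intercept=0.02)        # ug/L
print(creatinine_corrected(c, creatinine_g_l=1.2))      # ug/g creatinine
```

Normalising to the labelled analogue cancels out losses in the SPE and evaporation steps, since the internal standard undergoes the same sample preparation as the analyte.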
Abstract:
Ambient Intelligence (AmI) envisions a world where smart electronic environments are aware of and responsive to their context. People moving through these settings engage many computational devices and systems simultaneously, even when unaware of their presence. AmI stems from the convergence of three key technologies: ubiquitous computing, ubiquitous communication and natural interfaces. The dependence on a large number of fixed and mobile sensors embedded in the environment makes Wireless Sensor Networks one of the most relevant enabling technologies for AmI. WSNs are complex systems made up of a number of sensor nodes: simple devices that typically embed a low-power computational unit (microcontrollers, FPGAs, etc.), a wireless communication unit, one or more sensors and some form of energy supply (either batteries or energy-scavenging modules). Low cost, low computational power, low energy consumption and small size are characteristics that must be taken into consideration when designing and dealing with WSNs. In order to handle the large amount of data generated by a WSN, several multi-sensor data fusion techniques have been developed. The aim of multi-sensor data fusion is to combine data to achieve better accuracy and inferences than could be achieved by the use of a single sensor alone. In this dissertation we present our results in building several AmI applications suitable for a WSN implementation. The work can be divided into two main areas: multimodal surveillance and activity recognition. Novel techniques to handle data from a network of low-cost, low-power Pyroelectric InfraRed (PIR) sensors are presented. These techniques allow the detection of the number of people moving in the environment, their direction of movement and their position. We discuss how a mesh of PIR sensors can be integrated with a video surveillance system to increase its performance in people tracking.
Furthermore, we embed a PIR sensor within the design of a Wireless Video Sensor Node (WVSN) to extend its lifetime. Activity recognition is a fundamental building block of natural interfaces. A challenging objective is to design an activity recognition system able to exploit a redundant but unreliable WSN. We present our work in building a novel activity recognition architecture for such a dynamic system. The architecture has a hierarchical structure in which simple nodes perform gesture classification and a high-level meta-classifier fuses a changing number of classifier outputs. We demonstrate the benefits of such an architecture in terms of increased recognition performance and robustness to faults and noise. Furthermore, we show how network lifetime can be extended through a performance-power trade-off. Smart objects can enhance the user experience within smart environments. We present our work in extending the capabilities of the Smart Micrel Cube (SMCube), a smart object used as a tangible interface within a tangible computing framework, through the development of a gesture recognition algorithm suitable for this device of limited computational power. Finally, the development of activity recognition techniques can greatly benefit from the availability of shared datasets. We report our experience in building a dataset for activity recognition. The dataset is freely available to the scientific community for research purposes and can be used as a testbench for developing, testing and comparing different activity recognition techniques.
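The idea of a meta-classifier fusing a changing number of node-level outputs can be sketched as a confidence-weighted majority vote over whatever labels arrive. The function name, gesture labels and weights below are hypothetical illustrations, not the architecture actually used in the dissertation.

```python
from collections import Counter

def fuse(node_labels, node_confidences=None):
    """Meta-classifier sketch: fuse a variable number of per-node gesture
    labels by (optionally confidence-weighted) majority vote. Nodes that
    dropped out simply do not appear in the input, so the fusion naturally
    tolerates a changing number of classifiers."""
    if not node_labels:
        return None  # no node reported: no decision
    if node_confidences is None:
        node_confidences = [1.0] * len(node_labels)
    votes = Counter()
    for label, conf in zip(node_labels, node_confidences):
        votes[label] += conf
    return votes.most_common(1)[0][0]

# Three nodes report; a fourth node is offline and simply absent.
print(fuse(["wave", "wave", "push"], [0.9, 0.6, 0.8]))  # -> wave
```

Because the vote is taken only over the outputs actually received, a node failure degrades confidence gracefully instead of breaking the pipeline, which matches the fault- and noise-robustness goal described above.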
Abstract:
The development of the digital electronics market is founded on the continuous reduction of transistor size, which reduces the area, power and cost of integrated circuits while increasing their computational performance. This trend, known as technology scaling, is approaching the nanometre scale. The uncertainty of the lithographic process in the manufacturing stage grows as transistor size scales down, resulting in larger parameter variations in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of threshold and supply voltages, increasing power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects can no longer be addressed at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard CAD automated design flow: i) the implementation of new analysis algorithms able to predict the system's thermal behaviour and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system;
iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of the devices by acting on tunable parameters such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network on Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a NoC cycle-accurate simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates the need to integrate thermal analysis into the first design stages of embedded NoC design. Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyse the impact of self-timed asynchronous logic stages in an embedded microprocessor. The results confirmed the capability of self-timed logic to increase manufacturability and reliability.
Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we discovered the superior robustness of low-swing links to systematic process variation, together with their good response to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication for power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
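As a toy illustration of statistical process variation analysis, the sketch below Monte Carlo-samples gate delay under a systematic threshold-voltage shift (applied to every device, as a lithography gradient would be) plus random per-device variation, using the well-known alpha-power-law delay model. All parameter values are illustrative assumptions, not figures from the thesis or its tool.

```python
import random
import statistics

def gate_delay(vdd, vth, k=1.0, alpha=1.3):
    """Alpha-power-law delay model: delay ~ Vdd / (Vdd - Vth)^alpha."""
    return k * vdd / (vdd - vth) ** alpha

def monte_carlo_delay(n=10000, vdd=1.0, vth_nom=0.3,
                      sigma_random=0.02, systematic_shift=0.01, seed=42):
    """Sample gate delays under a systematic Vth shift (common to all
    samples) plus Gaussian random per-device Vth variation; return the
    mean and standard deviation of the resulting delay distribution."""
    rng = random.Random(seed)
    delays = [gate_delay(vdd, vth_nom + systematic_shift +
                         rng.gauss(0.0, sigma_random))
              for _ in range(n)]
    return statistics.mean(delays), statistics.stdev(delays)

mean_d, sigma_d = monte_carlo_delay()
print(f"mean delay {mean_d:.3f}, sigma {sigma_d:.3f}")
```

Separating the systematic and random components, as above, is what lets such a tool predict both the die-to-die shift (which compensation techniques like ASV/ABB can correct) and the within-die spread (which they cannot).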
Abstract:
In recent decades, the increase of industrial activities and of world food demand, together with the intensification of natural resource exploitation, directly connected to pollution, have aroused a growing interest of public opinion in initiatives linked to the regulation of food production, as well as in the institution of modern legislation for consumer protection. This work was planned taking into account some important themes related to the marine environment, collecting and presenting the data obtained from studies on different marine species of commercial interest (Chamelea gallina, Mytilus edulis, Ostrea edulis, Crassostrea gigas, Salmo salar, Gadus morhua). These studies evaluated, from a biochemical and biomolecular point of view, the effects of variations of important physical and chemical parameters (temperature, xenobiotics such as drugs, hydrocarbons and pesticides) on the cells involved in immune defence (haemocytes) and on some important enzymatic systems involved in xenobiotic biotransformation processes (the cytochrome P450 complex) and in the related antioxidant defence processes (superoxide dismutase, catalase, heat shock proteins). Oxygen is essential in the biological response of a living organism. Its consumption in the physiological processes of normal cellular respiration and in the biotransformation of foreign substances leads to the formation of reactive oxygen species (ROS), which are potentially toxic and responsible for damage to biological macromolecules, with consequent worsening of pathologies. Such processes can lead not only to a qualitative alteration of the derived products, but also to a general state of suffering that in the most serious cases can cause the death of the organism, with important economic repercussions on the yields of farming, fishing and aquaculture.
In this study it also seemed interesting to apply alternative methodologies currently in use in the medical field (flow cytometry) and in proteomic studies (two-dimensional electrophoresis, mass spectrometry), with the aim of identifying new biomarkers to place alongside traditional methods for the quality control of food of animal origin. From the results it is possible to point out some relevant aspects of each experiment: 1. The flow cytometric techniques applied to O. edulis and C. gigas could lead to important developments in the search for alternative methods that quickly and precisely identify the origin of a specific sample, helping to counter possible food frauds, in this case related, for example, to the presence of a different but morphologically similar species, also from a qualitative point of view. A concrete perspective for the application of this method in the inspection field has to be confirmed by further laboratory tests, including in vivo experiments to evaluate in the whole organism the effect of the factors evaluated only on haemocytes in vitro. These elements therefore suggest the possibility of adapting flow cytometric methods to the study of animal organisms of food interest even before they enter industrial processing, giving useful information about the possible presence of sources of contaminants that can induce an increase of the immune defence and an alteration of normal cellular parameter values. 2. The immune system of C. gallina showed an interesting dose- and time-dependent response to benzo[a]pyrene (B[a]P) exposure, with a significant decrease of the expression and activity of one of the most important enzymes involved in antioxidant defence in haemocytes and haemolymph.
The data obtained are confirmed by several measurements of physiological parameters which, together with the decrease of the activity of 7-ethoxyresorufin-O-deethylase (EROD, linked to xenobiotic biotransformation processes) during exposure, underline the major effects of B[a]P action. The identification of basal levels of EROD supports the possible presence of the CYP1A subfamily in invertebrates, still controversial today, never previously identified in C. gallina and never isolated in immune cells, as instead confirmed in this study with the identification of a CYP1A-immunopositive protein (CYP1A-IPP). This protein could prove a good biomarker at the basis of a simple and quick method able to give clear information about the presence of specific pollutants, even at low concentrations, in the environment where these organisms are usually fished before being commercialized. 3. This experiment evaluated the effect of the antibiotic chloramphenicol (CA) in an important species of commercial interest, Chamelea gallina. Chloramphenicol is a drug still used in some developing countries, also in the veterinary field. Controls to evaluate its presence in food products of animal origin can prove ineffective when its concentration is below the sensitivity limit of the instruments usually used in this type of analysis. The negative effects of CA on the CYP1A-IPP proteins, underlined in this work, seem to be due to the attack of free radicals resulting from the action of the antibiotic, which leads to a significant alteration of the biotransformation mechanisms. It seems particularly interesting to pay attention to the close relationships in C. gallina between the SOD/CAT and CYP450 systems, actively involved in detoxification mechanisms, especially in comparison with the few similar works available today on molluscs, a group composed of numerous species that enter the food chain and on which constant controls are necessary to evaluate rapidly and effectively the presence of possible contaminations. 4. The investigations on fishes (Gadus morhua and Salmo salar) and on a bivalve mollusc (Mytilus edulis) allowed the evaluation of different aspects related to the possibility of identifying, through 2DE methodologies, a biomarker for the health of organisms of food interest and consequently for the quality of the final product. In the seafood field these techniques are currently used with moderate success only for vertebrates (fishes), while their application to invertebrates (molluscs) presents many difficulties. The results obtained in this work have underlined several problems in the correct identification of proteins isolated from animal organisms for which a complete genomic sequence does not yet exist. This leads to attributing identities on the basis of comparison with similar proteins in other animal groups, with the risk of obtaining inaccurate data and, above all, data discordant with those obtained on the same animals by other authors. Nevertheless, the data obtained in this work after MALDI-ToF analysis remain objective, and the collected spectra could be re-analysed in the future once the genomic databases of the species studied are updated. 4-A. The investigation of the presence of HSP70 isoforms directly induced by different stress phenomena, such as the presence of B[a]P, used two-dimensional electrophoresis methods in C. gallina, which allowed the isolation of numerous proteins on 2DE gels and the collection of several spots currently undergoing MALDI-ToF-MS analysis.
This preliminary work has therefore made it possible to acquire and improve important methodologies for the study of cellular parameters and in the proteomic field, which have revealed great potential not only for applications in the medical and veterinary fields, but also in food inspection, with connections to toxicology and environmental pollution. This study therefore contributes to the search for new and rapid methodologies that can strengthen inspection strategies, integrating with the existing ones while improving the general background of information on the state of health of the animal organism considered, with the still hypothetical possibility of replacing, in particular cases, the traditional techniques used in the food field.
Abstract:
This study focuses on the different applications of thermal remote sensing in urban areas. It first describes infrared radiation and its interactions with the Earth's atmosphere, the main laws governing radiative heat exchange, the characteristics of the sensors and the different applications of thermography. It then covers in detail the characteristic aspects of satellite thermography, aimed mainly at assessing the Urban Heat Island phenomenon; the available sensors are described, together with the correction methods for atmospheric effects, for the estimation of surface emissivity and for the computation of pixel surface temperature. The experiment carried out over the Bologna area with ASTER multispectral images is then illustrated: the results show that an Urban Heat Island is detectable over the urban area, although its quantification is complex. The study then describes the potential and limits of airborne thermography, its different uses, the operational survey procedures and the algorithms used to compute the surface temperature of building roofs. Through the analysis of some previous experiences, the influence of the atmosphere, the modelling of its effects on the measured radiance and the different methods for estimating emissivity are discussed. The European project EnergyCity is then introduced, aimed at creating a GeoWeb spatial decision support system for the reduction of energy consumption and greenhouse gas production in seven Central European cities. The survey procedures and the processing of the digital datasets for the creation of surface temperature maps to be implemented in the SDSS are illustrated.
Finally, the experiment carried out on the thermal images acquired in February 2010 over the city of Treviso is described: the images were turned into a georeferenced radiometric temperature mosaic through geometric and radiometric corrections; after the emissivity correction, this mosaic will be converted into a surface temperature mosaic.
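The step from radiometric to surface temperature can be sketched, under strong simplifying assumptions, with a Stefan-Boltzmann inversion: a graybody at surface temperature Ts produces the same broadband radiance as a blackbody at radiometric temperature Trad, with Trad^4 = ε·Ts^4. Atmospheric transmittance and reflected downwelling radiance are neglected here, so this is illustrative only and not the correction procedure actually applied in the study; the pixel value and emissivity are hypothetical.

```python
# Emissivity correction sketch: invert Trad^4 = emissivity * Ts^4 to
# recover the surface temperature Ts from the radiometric temperature.
# Atmospheric and reflected-radiance terms are deliberately ignored.

def surface_temperature(t_rad_kelvin: float, emissivity: float) -> float:
    """Surface temperature (K) from radiometric temperature and emissivity."""
    return t_rad_kelvin / emissivity ** 0.25

# A hypothetical roof pixel measured at 278.0 K, assumed emissivity 0.92:
print(round(surface_temperature(278.0, 0.92), 1))  # ~283.9 K
```

Since ε ≤ 1, the corrected surface temperature is always at least the radiometric temperature, which is why uncorrected thermal mosaics systematically underestimate roof temperatures.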
Abstract:
In the municipality of Ravenna, over 6,800 hectares of farmland are at risk of salinization because of the high salinity of the groundwater in the coastal phreatic aquifer. The area is affected by natural subsidence, due to the compaction of alluvial sediments, and by anthropogenic subsidence, caused by gas extraction and by the over-exploitation of groundwater. As a result, most of this territory lies below mean sea level, and agriculture, like any other human activity, is possible thanks to a dense network of drainage canals that keep the land drained for cultivation. Agriculture is an important resource for the area, but the scarce availability of fresh water and the increase of soil salinization processes call for a change: sustainable agricultural practices are needed, with suitable requirements for irrigation, soil drainage, salinization resistance and soil control. After a general analysis of the aquifer conditions, a 10 km transect representative of the coastal part of Ravenna was monitored. Finally, in order to understand the interaction between an irrigation canal and the groundwater, a small agricultural area (12 hectares) was monitored during 2011 using hydrological, geochemical and geophysical methods. The results of this work show widespread salinization of the phreatic aquifer, but also the presence of a 5 m thick freshwater lens, 400 m from the shoreline, with chemical characteristics (hydrofacies) typical of continental waters and with seasonally variable size. This freshwater lens originated exclusively from infiltration from the existing irrigation canal, since the contribution of surface irrigation was nil.
By exploiting the existing network of drainage canals it would be possible to extend this canal infiltration process to other portions of the aquifer, in order to recharge the aquifer itself and limit soil salinization.
Abstract:
Forms 2 vols. of Enciclopedia portatile, compiled under the direction of C. Bailly.