7 results for Facial Object Based Method

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Abstract:

The objective of this thesis is the use of aerial photogrammetric and remotely sensed images for the qualitative and quantitative characterization of forest ecosystems and of their evolution. The topics addressed concern, on the one hand, the photogrammetric aspect, through the recovery, digitization and processing of historical aerial images from various epochs, and, on the other, the use of remote sensing for land-cover classification. Chapter 1 gives a brief introduction to the development of new survey technologies, with a focus on forestry applications; Chapter 2 addresses the acquisition of remotely sensed and photogrammetric data, with a brief description of their main characteristics and quantities; Chapter 3 covers the image processing and classification procedures used to extract meaningful information. The following three chapters present three case studies applying photogrammetry and remote sensing to the study of forest ecosystems. The first case (Chapter 4) concerns the Prado-Cusna mountain group, where a multitemporal analysis of the evolution of the altitudinal treeline over the last fifty years was carried out. The procedure for recovering the historical aerial photographs was analyzed and defined as a sequence of operations: digitizing the frames, determining known ground control points for image orientation, and finally orthorectifying and mosaicking the images with the aid of a Digital Terrain Model (DTM). This made it possible to compare these data with more recent digital images in order to detect changes that occurred over the intervening period. In the second case (Chapter 5), for the study of the Monte Giovo group, a classification procedure was defined for extracting vegetation cover and updating existing cartography, in this case the vegetation map. In particular, the aim was to classify the vegetation above the treeline, dominated by bilberry heaths and grasslands, mainly secondary grasslands of matgrass and false brome. Some areas also host communities colonizing stabilized debris deposits and sandstone cliffs. For this purpose, in addition to aerial images (the IT2000 flight), ASTER satellite images and other ancillary data (DTM and its derivatives) were used, and an object-based land-cover classification system was applied. The best segmentation parameters and the optimal number of samples for classification were investigated. On the one hand, a supervised classification of the vegetation was performed starting from a few reference samples; on the other, the method was tested as the basis of a procedure for the automatic updating of existing cartography. In the third case (Chapter 6), again in the Monte Giovo area, the timberline extracted by object-based segmentation was compared with the results of dedicated ground GPS surveys.
The objective is to define the altitudinal limit of the forest and to identify groups of isolated trees above it, by means of object-based segmentation and classification of digital aerial orthophotos, and to verify the results in the field in selected sample areas, by creating GPS profiles of the forest limit and determining the coordinates of the isolated tree groups. The final results of the work show that modern image analysis techniques are now mature enough to achieve the objectives set in the three applications considered, although careful data validation and operator intervention at several stages of the process remain necessary. In particular, image segmentation for the extraction of meaningful features showed great potential in all three cases. Object-based software simplifies the integration of classification results into a GIS environment, for example by exporting the classified objects in vector format. It also allows multiple sources of information, such as aerial photographs, satellite images, DTMs and their derivatives, to be used simultaneously in a single environment. The automatic procedures for the extraction of the timberline and of isolated tree groups and for land-cover classification are under continuous development to improve their performance; at present they should not be regarded as a stand-alone optimal solution but as a tool to set up and simplify the work of the photo-interpretation specialist.
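To make the object-based workflow described above concrete, the following minimal sketch segments an image into homogeneous objects and classifies them from a few labeled reference objects. It is an illustration only, using a synthetic stand-in for an orthophoto and generic scikit-image/scikit-learn APIs; the class labels and reference objects are hypothetical, and the random forest is a stand-in for whichever supervised classifier the thesis software actually used.

```python
# Minimal object-based classification sketch (synthetic stand-in for an orthophoto).
import numpy as np
from skimage import segmentation, measure
from sklearn.ensemble import RandomForestClassifier

# Synthetic 3-band "orthophoto"; in practice: load the orthorectified mosaic.
rng = np.random.default_rng(0)
image = rng.random((200, 200, 3)).astype(np.float32)
image[:, 100:, 1] += 0.5                      # a brighter "vegetated" half

# 1. Segmentation: group pixels into spectrally homogeneous objects.
segments = segmentation.felzenszwalb(image, scale=100, sigma=0.8, min_size=50)

# 2. One feature vector per object (here: mean value of each band).
regions = measure.regionprops(segments + 1)   # regionprops ignores label 0
features = np.array([
    image[r.coords[:, 0], r.coords[:, 1]].mean(axis=0) for r in regions
])

# 3. Supervised classification from a few labeled reference objects.
sample_ids = [0, len(regions) - 1]            # hypothetical reference objects
sample_labels = ["grassland", "heath"]        # hypothetical classes
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features[sample_ids], sample_labels)
object_classes = clf.predict(features)        # one class per segmented object
```

The two tuning knobs mirror those discussed in the abstract: the segmentation parameters (scale, sigma, minimum object size) and the number of reference samples used for the supervised classification.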

Relevance:

100.00%

Abstract:

Despite new methods and combined strategies, conventional cancer chemotherapy still lacks specificity and induces drug resistance. Gene therapy offers the potential for success in the clinical treatment of cancer, which can be achieved by replacing mutated tumour suppressor genes, inhibiting gene transcription, introducing new genes encoding therapeutic products, or specifically silencing any given target gene. Concerning gene silencing, attention has recently shifted to the RNA interference (RNAi) phenomenon. Gene silencing mediated by the RNAi machinery is based on short RNA molecules, small interfering RNAs (siRNAs) and microRNAs (miRNAs), which are fully or partially homologous, respectively, to the mRNA of the genes being silenced. On one hand, synthetic siRNAs are an important research tool for understanding the function of a gene, and the prospect of using siRNAs as potent and specific inhibitors of any target gene provides a new therapeutic approach for many untreatable diseases, particularly cancer. On the other hand, the discovery of the gene regulatory pathways mediated by miRNAs offered the research community important new perspectives for understanding the physiological and, above all, the pathological mechanisms underlying gene regulation. Indeed, changes in miRNA expression have been identified in several types of neoplasia, and it has also been proposed that the overexpression of genes in cancer cells may be due to the disruption of a control network in which the relevant miRNAs are implicated. For these reasons, I focused my research on a possible link between RNAi and the enzyme cyclooxygenase-2 (COX-2) in the field of colorectal cancer (CRC), since it has been established that the adenoma-adenocarcinoma transition and the progression of CRC depend on aberrant constitutive expression of the COX-2 gene. In fact, overexpressed COX-2 is involved in the block of apoptosis and the stimulation of tumour angiogenesis, and promotes cell invasion, tumour growth and metastasis. On the basis of data reported in the literature, the first aim of my research was to develop an innovative and effective tool, based on the RNAi mechanism, able to silence COX-2 expression strongly and specifically in human colorectal cancer cell lines. In this study, I first show that an siRNA sequence directed against COX-2 mRNA (siCOX-2) potently downregulated COX-2 gene expression in human umbilical vein endothelial cells (HUVEC) and inhibited PMA-induced angiogenesis in vitro in a specific, non-toxic manner. Moreover, I found that the insertion of a specific cassette carrying the anti-COX-2 shRNA sequence (shCOX-2, the precursor of the siCOX-2 tested previously) into a viral vector (pSUPER.retro) greatly increased silencing potency in a colon cancer cell line (HT-29) without activating any interferon response. Phenotypically, COX-2-deficient HT-29 cells showed a significant impairment of their in vitro malignant behaviour. Thus, the results reported here indicate an easy-to-use, powerful and highly selective virus-based method to knock down the COX-2 gene in colon cancer cells in a stable and long-lasting manner. Furthermore, they open up the possibility of an in vivo application of this anti-COX-2 retroviral vector as a therapeutic agent for human cancers overexpressing COX-2. In order to improve tumour selectivity, the shCOX-2 expression cassette of the pSUPER.retro vector was modified.
The aim was to obtain strong, specific transcription of shCOX-2, followed by siCOX-2-mediated COX-2 silencing, only in cancer cells. For this reason, the H1 promoter in the basic pSUPER.retro vector [pS(H1)] was substituted with the human COX-2 promoter [pS(COX2)] and with a promoter containing repeated copies of the TCF binding element (TBE) [pS(TBE)]. These promoters were chosen because they are particularly activated in colon cancer cells. COX-2 was effectively silenced in HT-29 and HCA-7 colon cancer cells by using the enhanced pS(COX2) and pS(TBE) vectors. In particular, higher siCOX-2 production, followed by stronger inhibition of the COX-2 gene, was achieved with the pS(TBE) vector, which represents not only the most effective but also the most specific system to downregulate COX-2 in colon cancer cells. Because of the many limitations that a retroviral therapy could have in a possible in vivo treatment of CRC, the next goal was to render the enhanced RNAi-mediated COX-2 silencing more suitable for this kind of application. Xiang et al. (2006) demonstrated that it is possible to induce RNAi in mammalian cells after infection with engineered E. coli strains expressing the Inv and HlyA genes, which encode two bacterial factors needed for the successful transfer of shRNA into mammalian cells. This system, called "trans-kingdom" RNAi (tkRNAi), could represent an optimal approach for the treatment of colorectal cancer, since E. coli is normally resident in the human intestinal flora and could easily be delivered to the tumour tissue. For this reason, I tested the improved COX-2 silencing mediated by the pS(COX2) and pS(TBE) vectors using the tkRNAi system. Results obtained in the HT-29 and HCA-7 cell lines were in close agreement with the data previously collected after transfection of the pS(COX2) and pS(TBE) vectors into the same cell lines. These findings suggest that the tkRNAi system for COX-2 silencing, in particular mediated by the pS(TBE) vector, could represent a promising tool for the treatment of colorectal cancer. Alongside the studies aimed at setting up an RNAi-mediated therapeutic strategy, I set out to advance the understanding of new molecular bases of human colorectal cancer. In particular, it is known that components of the miRNA/RNAi pathway may be altered during the progressive development of colorectal cancer, and it has already been demonstrated that some miRNAs work as tumour suppressors or oncomiRs in colon cancer. Thus, my hypothesis was that overexpressed COX-2 protein in colon cancer could be the result of decreased levels of one or more tumour suppressor miRNAs. In this thesis, I clearly show an inverse correlation between COX-2 expression and human miR-101(1) levels in colon cancer cell lines, tissues and metastases. I also demonstrate that in vitro modulation of miR-101(1) expression in colon cancer cell lines leads to significant variations in COX-2 expression, and that this phenomenon is based on a direct interaction between miR-101(1) and COX-2 mRNA. Moreover, I started to investigate miR-101(1) regulation in the hypoxic environment, since adaptation to hypoxia is critical for tumour cell growth and survival, and it is known that COX-2 can be induced directly by hypoxia-inducible factor 1 (HIF-1).
Surprisingly, I observed that COX-2 overexpression induced by hypoxia is always coupled with a significant decrease in miR-101(1) levels in colon cancer cell lines, suggesting that miR-101(1) regulation could be involved in the adaptation of cancer cells to the hypoxic environment that strongly characterizes CRC tissues.
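As a purely illustrative aside on the silencing mechanism described above: siRNAs are fully complementary to their target mRNA, while miRNAs pair only partially. The toy sketch below scans an mRNA for a site fully complementary to a given siRNA guide strand; both sequences are hypothetical fragments, not the actual siCOX-2 or COX-2 sequences used in this work.

```python
# Toy scan for a fully complementary siRNA target site (hypothetical sequences).
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna: str) -> str:
    """Return the reverse complement of an RNA sequence (5'->3')."""
    return "".join(COMPLEMENT[b] for b in reversed(rna))

def find_target_site(mrna: str, guide: str) -> int:
    """Index of the mRNA region fully complementary to the siRNA guide
    strand (the guide pairs antiparallel with the mRNA), or -1 if absent."""
    return mrna.find(reverse_complement(guide))

mrna = "AUGGCUUCCAAUGUCGAACGGAUUCUGA"   # hypothetical mRNA fragment
guide = "UCCGUUCGACAUUGGA"               # hypothetical guide strand
print(find_target_site(mrna, guide))     # position of the target site (6)
```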

Relevance:

100.00%

Abstract:

The continuous increase in genome sequencing projects has produced a huge amount of data over the last 10 years: currently more than 600 prokaryotic and 80 eukaryotic genomes are fully sequenced and publicly available. However, the sequencing process alone only determines raw nucleotide sequences. This is just the first step of the genome annotation process, which deals with the issue of assigning biological information to each sequence. The annotation process operates at every level of the biological information processing mechanism, from DNA to protein, and cannot be accomplished solely by in vitro analysis procedures, which are extremely expensive and time-consuming when applied at such a large scale. Thus, in silico methods are needed to accomplish the task. The aim of this work was the implementation of predictive computational methods that allow a fast, reliable and automated annotation of genomes and proteins starting from amino acid sequences. The first part of the work focused on the implementation of a new machine learning based method for the prediction of the subcellular localization of soluble eukaryotic proteins. The method, called BaCelLo, was developed in 2006. Its main peculiarity is its independence from the biases present in the training dataset, which cause the over-prediction of the most represented examples in all the other predictors developed so far. This important result was achieved by a modification, made by myself, to the standard Support Vector Machine (SVM) algorithm, resulting in the so-called Balanced SVM. BaCelLo predicts the most important subcellular localizations in eukaryotic cells, and three kingdom-specific predictors were implemented. In two extensive comparisons, carried out in 2006 and 2008, BaCelLo was shown to outperform all the currently available state-of-the-art methods for this prediction task. BaCelLo was subsequently used to completely annotate 5 eukaryotic genomes, by integrating it into a pipeline of predictors developed at the Bologna Biocomputing group by Dr. Pier Luigi Martelli and Dr. Piero Fariselli. An online database, called eSLDB, was developed by integrating, for each amino acid sequence extracted from the genomes, the predicted subcellular localization merged with experimental and similarity-based annotations. In the second part of the work a new machine learning based method was implemented for the prediction of GPI-anchored proteins. The method efficiently predicts from the raw amino acid sequence both the presence of the GPI anchor (by means of an SVM) and the position in the sequence of the post-translational modification event, the so-called ω-site (by means of a Hidden Markov Model, HMM). The method, called GPIPE, was shown to greatly improve the prediction of GPI-anchored proteins over all previously developed methods. GPIPE predicted up to 88% of the experimentally annotated GPI-anchored proteins while maintaining a false positive rate as low as 0.1%. GPIPE was used to completely annotate 81 eukaryotic genomes, and more than 15000 putative GPI-anchored proteins were predicted, 561 of which are found in H. sapiens. On average, 1% of a proteome is predicted as GPI-anchored. A statistical analysis of the composition of the regions surrounding the ω-site allowed the definition of specific amino acid abundances in the different regions considered.
Furthermore, the hypothesis, proposed in the literature, that compositional biases are present among the four major eukaryotic kingdoms was tested and rejected. All the developed predictors and databases are freely available at: BaCelLo, http://gpcr.biocomp.unibo.it/bacello; eSLDB, http://gpcr.biocomp.unibo.it/esldb; GPIPE, http://gpcr.biocomp.unibo.it/gpipe.
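The balancing idea behind BaCelLo can be illustrated with class weighting, a standard way to keep an SVM from over-predicting the most represented class. The sketch below is a minimal scikit-learn approximation under that assumption; it is not the actual Balanced SVM implementation described above, the feature encoding (plain amino acid composition) is simplified, and the sequences and labels are toy examples.

```python
# Sketch: class-balanced SVM on amino acid composition features (toy data).
import numpy as np
from sklearn.svm import SVC

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq: str) -> np.ndarray:
    """20-dimensional amino acid composition feature vector."""
    counts = np.array([seq.count(a) for a in AMINO_ACIDS], dtype=float)
    return counts / max(len(seq), 1)

# Hypothetical toy dataset: short fragments with localization labels.
train = [
    ("MKKLLPTAAAGLLLLAAQPAMA", "secretory"),
    ("MSDNGPQNQRNAPRITFGGPSD", "cytoplasm"),
    ("MKRPAATKKAGQAKKKKLDKED", "nucleus"),
    ("MSDNGHQNQRNAPRITFGGPTD", "cytoplasm"),
]
X = np.array([composition(s) for s, _ in train])
y = [label for _, label in train]

# class_weight="balanced" reweights errors inversely to class frequency,
# counteracting the over-prediction of the most represented class.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
print(clf.predict([composition("MKKLLPTLAAGLLLLAAQPAMA")]))
```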

Relevance:

100.00%

Abstract:

Motivation. An issue of great current interest, from both a theoretical and an applicative perspective, is the analysis of biological sequences to disclose the information they encode. The development of new genome sequencing technologies in recent years has opened fundamental new problems, since huge amounts of biological data still await interpretation. Indeed, sequencing is only the first step of the genome annotation process, which consists in the assignment of biological information to each sequence. Hence, given the large amount of available data, in silico methods have become useful and necessary for extracting relevant information from sequences. The availability of data from genome projects gave rise to new strategies for tackling the basic problems of computational biology, such as the determination of the three-dimensional structures of proteins, their biological function and their reciprocal interactions.

Results. The aim of this work has been the implementation of predictive methods that allow the extraction of information on the properties of genomes and proteins starting from the nucleotide and amino acid sequences, taking advantage of the information provided by the comparison of genome sequences from different species. In the first part of the work a comprehensive large-scale genome comparison of 599 organisms is described. 2.6 million sequences from 551 prokaryotic and 48 eukaryotic genomes were aligned and clustered on the basis of their sequence identity. This procedure led to the identification of classes of proteins that are peculiar to the different groups of organisms. Moreover, the adopted similarity threshold produced clusters that are structurally homogeneous and can be used for the structural annotation of uncharacterized sequences. The second part of the work focuses on the characterization of thermostable proteins and on the development of tools able to predict the thermostability of a protein from its sequence. By means of Principal Component Analysis, the codon composition of a non-redundant database comprising 116 prokaryotic genomes was analyzed, showing that a cross-genomic approach allows the extraction of common determinants of thermostability at the genome level, leading to an overall accuracy of 95% in discriminating thermophilic coding sequences. This result outperforms those obtained in previous studies. Moreover, we investigated the effect of multiple mutations on protein thermostability. This issue is of great importance in the field of protein engineering, since thermostable proteins are generally more suitable than their mesostable counterparts for technological applications. A Support Vector Machine based method was trained to predict whether a set of mutations can enhance the thermostability of a given protein sequence. The developed predictor achieves 88% accuracy.
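A minimal sketch of the cross-genomic idea: represent each coding sequence by its codon frequencies and project with PCA to look for a thermophilic/mesophilic separation. It assumes standard scikit-learn APIs and uses toy sequences; the actual analysis described above covered all the coding sequences of 116 prokaryotic genomes and a dedicated discrimination procedure.

```python
# Sketch: PCA on codon composition to separate thermophilic coding sequences.
import numpy as np
from itertools import product
from sklearn.decomposition import PCA

CODONS = ["".join(c) for c in product("ACGT", repeat=3)]  # the 64 codons

def codon_frequencies(cds: str) -> np.ndarray:
    """64-dimensional codon frequency vector of a coding sequence."""
    codons = [cds[i:i + 3] for i in range(0, len(cds) - 2, 3)]
    counts = np.array([codons.count(c) for c in CODONS], dtype=float)
    return counts / max(len(codons), 1)

# Hypothetical coding sequences (in practice: all CDS of the genome set).
cds_set = [
    "ATGGCTGCTAAAGAAGAAGGTATTAAA",   # toy thermophile-like sequence
    "ATGCTGCTGCAGCAGGACGACCTGTAA",   # toy mesophile-like sequence
    "ATGGCAGCAAAAGAAGAAGGCATCAAA",
    "ATGCTCCTGCAGCAAGATGATCTCTAA",
]
X = np.array([codon_frequencies(s) for s in cds_set])

# Project onto the leading principal components; in the cross-genomic
# analysis these carry the compositional signal linked to thermostability.
pcs = PCA(n_components=2).fit_transform(X)
print(pcs)
```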

Relevance:

100.00%

Abstract:

The aim of this thesis is to investigate contemporary fashion exhibitions as textual machines. If we consider the current fashion design landscape as characterized by constitutive complexity and by the rapid changes running through it, and if we start from the premise that the spectrum of meanings a clothing style and individual garments can assume is extremely elusive, it is probably more productive to ask how fashion works, what its mechanisms of meaning production are. The analysis of fashion exhibitions thus proves a useful way to address the question, since exhibition displays turn these mechanisms into discourse and represent three-dimensional reflections on specific themes. The fashion exhibition stages exceptional cases that magnify typical aspects of how the fashion system works, whether we look at fashion from the point of view of production or from that of reception. The investigation identified the exhibitions curated by Diana Vreeland at the Costume Institute of the Metropolitan Museum in New York as the reference model for contemporary fashion exhibitions. Vreeland, who from 1936 to 1971 was first fashion editor and then editor-in-chief of "Harper's Bazaar" and "Vogue USA" respectively, marked a fundamental turning point when, in 1972, she decided to accept the role of Special Consultant at the Costume Institute. It is now widely accepted among fashion critics and scholars that the exhibitions she organized over more than a decade changed the way clothes are displayed in museums. Alongside Vreeland's work we then examined a recent fashion exhibition that attracted considerable attention: Spectres. When Fashion Turns Back, curated by Judith Clark (2004). In investigating the relationship between contemporary fashion design and the history of fashion, this exhibition used display machines inhabited by the clothes to "build spatial ideas" and to stage non-obvious connections between past and present. This exhibition seemed central to us in highlighting the curator's semiotic gaze, concerned with the overall project of the exhibition design rather than simply with the study of the garments on display. In this way we outlined two positions: one represented by an object-based approach to the analysis of dress, directly linked to the tradition of museum conservators; the other represented by what can now be considered a discipline, fashion curation, which attaches great importance to all the aspects that make up the exhibition project. A comparative study of some of the most important recently organized fashion exhibitions allowed us to identify recurring elements and specific features of these textual devices. Drawing on the contribution of Manar Hammad (2006), we considered the different levels of a fashion exhibition: the clothes and their relationship with the mannequins; the exhibition design and the exhibition space; the route and the sequence, both from the point of view of the strategy of textual construction and deployment and from that of the model visitor. We thus identified four groups of fashion exhibitions: museum-archival exhibitions; monographic retrospectives; exhibitions tied to the figure of a curator; and mixed forms that cut across the first three models.
This systematization showed that one of the central dimensions of contemporary fashion exhibitions is precisely the question of curatorship, which can be read in terms of authorship and enunciation. The underlying value horizons also became clearer: the dimension of historical accuracy is associated with exhibitions that privilege the level of the objects (the clothes) and a purely visual involvement of the visitor; the dimension of visual pleasure can instead be associated with an exhibition model that assigns a central role to the exhibition design and "asks" the visitor to play a fully interactive role. The most accomplished curatorial approach seems to us to be the one that seeks to reconcile these two dimensions.

Relevance:

100.00%

Abstract:

Tracking activities during daily life and assessing movement parameters is essential for complementing the information gathered in confined environments, such as clinical and physical activity laboratories, for the assessment of mobility. Inertial measurement units (IMUs) are used to monitor human movement for prolonged periods of time and without space limitations. The focus of this study was to provide a robust, low-cost and unobtrusive solution for evaluating human motion using a single IMU. The first part of the study focused on the monitoring and classification of daily life activities. A simple method that analyses the variations in the signal was developed to distinguish two types of activity intervals: active and inactive. A neural classifier was used to classify active intervals; the angle with respect to gravity was used to classify inactive intervals. The second part of the study focused on the extraction of gait parameters using a single IMU attached to the pelvis. Two complementary methods were proposed for gait parameter estimation. The first was a wavelet-based method developed for the estimation of gait events. The second estimated step and stride length during level walking, using the estimates of the first method. A special integration algorithm was extended to operate on each gait cycle using a specially designed Kalman filter. The developed methods were also applied in various scenarios. The activity monitoring method was used in a PRIN'07 project to assess the mobility levels of individuals living in an urban area. The same method was applied to volleyball players to analyze their fitness levels by monitoring their daily life activities. The methods proposed in these studies provide a simple, unobtrusive and low-cost solution for monitoring and assessing activities outside of controlled environments.
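The active/inactive interval detection described above can be sketched as a moving-variance threshold on the accelerometer signal, with the tilt angle with respect to gravity labeling the inactive postures. This is a simplified sketch under assumed signal conventions (accelerometer in g units, pelvis-worn sensor); the thresholds and posture labels are illustrative, not those tuned in the study, and the per-activity classification of active intervals (done with a neural classifier in the study) is omitted.

```python
# Sketch: active/inactive detection from a single pelvis-worn IMU.
import numpy as np

def classify_intervals(acc: np.ndarray, fs: float,
                       var_thresh: float = 0.02, win_s: float = 1.0):
    """acc: (N, 3) accelerometer samples in g; fs: sampling rate in Hz.
    Returns one label per window: 'active', or a posture for inactive ones."""
    win = int(win_s * fs)
    magnitude = np.linalg.norm(acc, axis=1)
    labels = []
    for start in range(0, len(acc) - win + 1, win):
        seg = acc[start:start + win]
        if magnitude[start:start + win].var() > var_thresh:
            labels.append("active")   # in the study: refined by a neural classifier
        else:
            # Tilt of the mean acceleration vector w.r.t. gravity (sensor z-axis
            # assumed vertical when upright).
            g_axis = seg.mean(axis=0)
            tilt = np.degrees(np.arccos(g_axis[2] / np.linalg.norm(g_axis)))
            labels.append("upright" if tilt < 45 else "lying")
    return labels

# Hypothetical usage: 10 s of 100 Hz data while standing still.
acc = np.tile([0.0, 0.0, 1.0], (1000, 1)) + np.random.normal(0, 0.005, (1000, 3))
print(classify_intervals(acc, fs=100.0))
```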

Relevance:

100.00%

Abstract:

This thesis is a collection of works focused on the topic of Earthquake Early Warning, with special attention to large magnitude events. The topic is addressed from different points of view, and the structure of the thesis reflects the variety of aspects analyzed. The first part is dedicated to the giant 2011 Tohoku-Oki earthquake. The main features of the rupture process are discussed first. The earthquake is then used as a case study to test the feasibility of Early Warning methodologies for very large events. The limitations of the standard approaches for large events emerge in this chapter; the difficulties are related to the real-time magnitude estimate from the first few seconds of recorded signal. An evolutionary strategy for the real-time magnitude estimate is proposed and applied to the Tohoku-Oki earthquake. In the second part of the thesis a larger set of earthquakes is analyzed, including small, moderate and large events. Starting from the measurement of two Early Warning parameters, the behavior of small and large earthquakes in the initial portion of the recorded signals is investigated. The aim is to understand whether small and large earthquakes can be distinguished from the initial stage of their rupture process. A physical model and a plausible interpretation of the observations are proposed. The third part of the thesis is focused on practical, real-time approaches for the rapid identification of the potentially damaged zone during a seismic event. Two different approaches for the rapid prediction of the damage area are proposed and tested. The first is a threshold-based method which uses traditional seismic data; the second is an innovative approach using continuous GPS data. Both strategies improve the prediction of the large-scale effects of strong earthquakes.
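To make the threshold-based idea concrete, the sketch below implements a generic early-warning check of the kind used in the literature (a peak-displacement threshold over the first seconds after the P-wave pick). It is a hedged illustration, not the specific method developed in this thesis; the window length and threshold value are placeholders, and the input trace is synthetic.

```python
# Sketch: threshold-based alert from the initial seconds of a P-wave record.
import numpy as np

def early_warning_alert(displacement: np.ndarray, fs: float,
                        window_s: float = 3.0, pd_threshold: float = 0.5):
    """displacement: vertical displacement trace in cm, starting at the
    P-wave pick; fs: sampling rate in Hz. Returns (Pd, alert), where Pd is
    the peak displacement within the first window_s seconds."""
    n = int(window_s * fs)
    pd = np.abs(displacement[:n]).max()
    return pd, pd >= pd_threshold   # alert if the early peak exceeds threshold

# Hypothetical usage: 3 s of 100 Hz displacement data after the P pick.
t = np.arange(0, 3.0, 0.01)
trace = 0.8 * np.sin(2 * np.pi * 1.5 * t) * t / 3.0   # toy growing P-wave
pd, alert = early_warning_alert(trace, fs=100.0)
print(f"Pd = {pd:.2f} cm, alert = {alert}")
```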