907 results for Deep Geological Repository


Relevance:

20.00%

Publisher:

Abstract:

A numerical model for studying the influence of deep convective cloud systems on photochemistry was developed, based on a non-hydrostatic meteorological model and chemistry from a global chemistry transport model. The transport of trace gases, the scavenging of soluble trace gases, and the influence of lightning-produced nitrogen oxides (NOx = NO + NO2) on local ozone-related photochemistry were investigated in a multi-day case study for an oceanic region in the tropical western Pacific. Model runs accounting for large-scale flows, previously neglected in multi-day cloud-resolving and single-column model studies of tracer transport, showed that the influence of mesoscale subsidence (between clouds) on trace gas transport had been considerably overestimated in those studies. The simulated vertical transport and scavenging of highly soluble tracers were found to depend on the initial profiles, reconciling contrasting results from two previous studies. The modeled uptake of trace gases by hydrometeors in the liquid and ice phases was studied in some detail for a small number of atmospheric trace gases, and novel aspects of the role of the retention coefficient (i.e. the fraction of a dissolved trace gas that is retained in the ice phase upon freezing) in the vertical transport of highly soluble gases were illuminated. Including lightning NOx production inside a 500 km 2-D model domain was found to be important for the NOx budget and caused small to moderate changes in the domain-averaged ozone concentrations. A number of sensitivity studies showed that the fraction of lightning-associated NOx lost through photochemical reactions in the vicinity of the lightning source was considerable, but depended strongly on assumptions about the magnitude and altitude of the lightning NOx source.
In contrast to a suggestion from an earlier study, it was argued that the near-zero upper-tropospheric ozone mixing ratios observed close to the study region were most probably not caused by the formation of NO associated with lightning. Instead, it was argued, in agreement with suggestions from other studies, that the deep convective transport of ozone-poor air masses from the relatively unpolluted marine boundary layer, air that had most likely been advected horizontally over relatively large distances both before and after encountering deep convection, probably played a role. In particular, it was suggested that the ozone profiles observed during CEPEX (Central Equatorial Pacific Experiment) were strongly influenced by the deep convection and the larger-scale flow associated with the intra-seasonal oscillation.
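The retention coefficient mentioned above lends itself to a simple worked example. The sketch below (all numbers hypothetical, not taken from the study) shows how a retention coefficient partitions a dissolved trace-gas mass when a droplet freezes:

```python
def partition_on_freezing(dissolved_kg, retention_coeff):
    """Split a dissolved trace-gas mass when a cloud droplet freezes.

    retention_coeff is the fraction retained in the ice phase;
    the remainder is released back to the gas phase.
    """
    if not 0.0 <= retention_coeff <= 1.0:
        raise ValueError("retention coefficient must be in [0, 1]")
    retained_in_ice = dissolved_kg * retention_coeff
    released_to_gas = dissolved_kg * (1.0 - retention_coeff)
    return retained_in_ice, released_to_gas

# Hypothetical droplet load of a soluble gas (kg) and retention coefficient:
ice, gas = partition_on_freezing(1.0e-6, 0.64)
```

A retention coefficient near zero means freezing returns the gas to the interstitial air, where the updraft can loft it further; a value near one keeps the gas in the precipitating ice, which is why this parameter matters for the vertical transport of highly soluble species.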

Relevance:

20.00%

Publisher:

Abstract:

The modern stratigraphy of clastic continental margins results from the interaction of several geological processes acting on different time scales, among which sea-level oscillations, sediment-supply fluctuations and local tectonics are the main mechanisms. During the past three years my PhD focused on understanding the impact of each of these processes on the deposition of the central and northern Adriatic sedimentary successions, with the aim of reconstructing and quantifying the Late Quaternary eustatic fluctuations. In the last few decades, several authors have tried to quantify past eustatic fluctuations through the analysis of direct sea-level indicators, such as drowned barrier-island deposits or coral reefs, or through indirect methods, such as oxygen isotope ratios (δ18O) or modeling simulations. Sea-level curves obtained from direct indicators record a composite signal, formed by the contribution of the global eustatic change and of regional factors such as tectonic processes or glacial-isostatic rebound: the eustatic signal has to be isolated by removing the contribution of these other mechanisms. To obtain realistic sea-level reconstructions it is therefore important to quantify the tectonic regime of the central Adriatic margin. This was achieved by integrating a numerical approach with the analysis of high-resolution seismic profiles. In detail, the subsidence trend obtained from the geohistory analysis and the backstripping of borehole PRAD1.2 (a 71 m continuous borehole drilled in 185 m of water depth, south of the Mid Adriatic Deep - MAD - during the European Project PROMESS 1, Profile Across Mediterranean Sedimentary Systems, Part 1) was confirmed by the analysis of lowstand paleoshorelines and by the benthic foraminifera associations investigated through the borehole.
This work showed an evolution from an inner-shelf environment during Marine Isotope Stage (MIS) 10 to upper-slope conditions during MIS 2. Once the tectonic regime of the central Adriatic margin had been constrained, it was possible to investigate the impact of sea-level and sediment-supply fluctuations on the deposition of the Late Pleistocene-Holocene transgressive deposits. The Adriatic transgressive record (TST - Transgressive Systems Tract) is formed by three correlative sedimentary bodies deposited in less than 14 kyr since the Last Glacial Maximum (LGM): along the central Adriatic shelf and in the adjacent slope basin the TST is formed by marine units, while along the northern Adriatic shelf the TST is represented by coastal deposits in a backstepping configuration. The central Adriatic margin, characterized by a thick transgressive sedimentary succession, is the ideal site to investigate the impact of late Pleistocene climatic and eustatic fluctuations, among which Meltwater Pulses 1A and 1B and the Younger Dryas cold event. The central Adriatic TST is formed by a tripartite deposit bounded by two regional unconformities. In particular, the middle TST unit includes two prograding wedges, deposited in the interval between the two Meltwater Pulse events, as highlighted by several 14C age estimates, and likely records the Younger Dryas cold interval.
Modeling simulations, obtained with the two coupled models HydroTrend 3.0 and 2D-Sedflux 1.0C (developed by the Community Surface Dynamics Modeling System - CSDMS) and integrated with the analysis of high-resolution seismic profiles and core samples, indicate that: 1) the prograding middle TST unit, deposited during the Younger Dryas, formed as a consequence of an increase in sediment flux, likely connected to a decline in vegetation cover in the catchment area due to the establishment of sub-glacial arid conditions; 2) the two-stage prograding geometry was the consequence of a sea-level stillstand (or possibly a fall) during the Younger Dryas event. The northern Adriatic margin, characterized by a broad and gentle shelf (350 km wide, plunging at a low angle of 0.02° to the SE), is the ideal site to quantify the timing of each step of the post-LGM sea-level rise. The modern shelf is characterized by sandy deposits of barrier-island systems in a backstepping configuration, showing younger ages at progressively shallower depths, which record the step-wise nature of the last sea-level rise. The age-depth model, obtained from dated samples of basal peat layers, is in good agreement with previously published sea-level curves and highlights the post-glacial eustatic trend. The interval corresponding to the Younger Dryas cold reversal, instead, is more complex: two coeval coastal deposits characterize the northern Adriatic shelf at very different water depths. Several explanations and different models can be invoked to explain this conundrum, but the problem remains unsolved.
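The age-depth reasoning behind such basal-peat sea-level curves can be sketched as a back-of-the-envelope calculation. The data points below are hypothetical, not the thesis samples; only the method is illustrated (depth difference over age difference, where m/kyr is numerically equal to mm/yr):

```python
def mean_rise_rate(samples):
    """Average sea-level rise rate (mm/yr) between the oldest and youngest
    dated samples. samples: list of (age_kyr_BP, depth_m), with depth
    positive downward below present sea level (shallower = younger)."""
    samples = sorted(samples, key=lambda s: s[0])  # youngest first
    (age_young, d_young), (age_old, d_old) = samples[0], samples[-1]
    rise_m = d_old - d_young
    dt_kyr = age_old - age_young
    return rise_m / dt_kyr  # m/kyr == mm/yr

# Hypothetical basal-peat points (age in kyr BP, depth in m):
pts = [(14.0, 90.0), (11.5, 55.0), (9.0, 30.0)]
rate = mean_rise_rate(pts)  # (90 - 30) m / (14 - 9) kyr = 12 mm/yr
```

In a real backstepping record the step-wise nature of the rise shows up as changes in this rate between consecutive dated samples, rather than as a single average.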

Relevance:

20.00%

Publisher:

Abstract:

In this study, new tomographic models of Colombia were calculated using the seismicity recorded by the Colombian seismic network during the period 2006-2009. In this time period, the improvement of the seismic network yields more stable hypocentral results with respect to older data sets and allows new 3D Vp and Vp/Vs models to be computed. The final dataset consists of 10813 P- and 8614 S-arrival times associated with 1405 earthquakes. Tests with synthetic data and resolution analysis indicate that the velocity models are well constrained in central, western and southwestern Colombia down to a depth of 160 km; the resolution is poor in northern Colombia and close to Venezuela due to a lack of seismic stations and seismicity. The tomographic models and the relocated seismicity indicate the existence of E-SE subducting Nazca lithosphere beneath central and southern Colombia. North-south changes in the Wadati-Benioff zone, in the Vp and Vp/Vs patterns and in volcanism show that the downgoing plate is segmented by E-W-directed slab tears, suggesting the presence of three sectors. Earthquakes in the northernmost sector represent most of the Colombian seismicity and are concentrated in the 100-170 km depth interval, beneath the Eastern Cordillera. Here massive dehydration is inferred, resulting from a delay in the eclogitization of a thickened oceanic crust in a flat-subduction geometry. In this sector a cluster of intermediate-depth seismicity (the Bucaramanga Nest) is present beneath the elbow of the Eastern Cordillera, interpreted as the result of a massive and highly localized dehydration phenomenon caused by a hyper-hydrous oceanic crust. The central and southern sectors, although different in Vp pattern, show, conversely, a continuous, steep and more homogeneous Wadati-Benioff zone with overlying volcanic areas. Here a "normal-thickened" oceanic crust is inferred, allowing gradual and continuous metamorphic reactions to take place with depth and enabling fluid migration towards the mantle wedge.
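A classical way to estimate the Vp/Vs ratio from paired P and S arrivals of the kind used here is the Wadati diagram: S-minus-P times are regressed against P arrival times, and the slope m gives Vp/Vs = 1 + m, independently of the unknown origin time. A minimal sketch with synthetic arrivals (not the Colombian data):

```python
def vp_vs_from_wadati(tp, ts):
    """Estimate Vp/Vs via a least-squares fit of (ts - tp) against tp
    (Wadati diagram): the slope m gives Vp/Vs = 1 + m."""
    n = len(tp)
    d = [s - p for p, s in zip(tp, ts)]          # S-minus-P times
    mean_tp = sum(tp) / n
    mean_d = sum(d) / n
    num = sum((p - mean_tp) * (di - mean_d) for p, di in zip(tp, d))
    den = sum((p - mean_tp) ** 2 for p in tp)
    return 1.0 + num / den

# Synthetic arrivals for a true Vp/Vs of 1.75 and origin time 2.0 s,
# at four stations with P travel times 3, 5, 8 and 12 s:
travel = (3.0, 5.0, 8.0, 12.0)
tp = [2.0 + x for x in travel]
ts = [2.0 + 1.75 * x for x in travel]
ratio = vp_vs_from_wadati(tp, ts)  # recovers 1.75
```

With real picks the scatter of the diagram also gives a quick quality check on the arrival-time data before tomographic inversion.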

Relevance:

20.00%

Publisher:

Abstract:

Deep convection by pyro-cumulonimbus clouds (pyroCb) can transport large amounts of forest-fire smoke into the upper troposphere and lower stratosphere. Here, results from numerical simulations of such deep convective smoke transport are presented. The structure, shape and injection height of the pyroCb simulated for a specific case study are in good agreement with observations. The model results confirm that substantial amounts of smoke are injected into the lower stratosphere, and that small-scale mixing processes at the cloud top significantly enhance smoke injection into the stratosphere. Sensitivity studies show that the release of sensible heat by the fire plays an important role in the dynamics of the pyroCb. Furthermore, the convection is found to be very sensitive to background meteorological conditions. While the abundance of aerosol particles acting as cloud condensation nuclei (CCN) has a strong influence on the microphysical structure of the pyroCb, the CCN effect on the convective dynamics is rather weak. The release of latent heat dominates the overall energy budget of the pyroCb, but since most of the cloud water originates from moisture entrained from the background atmosphere, the fire-released moisture makes only a minor contribution to the convective dynamics. Sufficient fire heating, favorable meteorological conditions and small-scale mixing processes at the cloud top are identified as the key ingredients for troposphere-to-stratosphere transport by pyroCb convection.
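The point that entrained moisture, not fire moisture, dominates the latent heating can be illustrated with a toy budget. All numbers below are hypothetical, not from the simulations; the sketch only shows that if a small fraction of the condensed water is fire-released, its share of the latent heating is proportionally small:

```python
L_V = 2.5e6  # latent heat of condensation of water, J/kg (approximate)

def latent_heat_release(condensed_water_kg):
    """Latent heat released when this mass of water vapor condenses."""
    return condensed_water_kg * L_V

# Illustrative (hypothetical) partition: 10% of the cloud water from the
# fire itself, 90% entrained from the background atmosphere.
total_condensed = 1.0e9  # kg, hypothetical total condensate of the pyroCb
fire_fraction = 0.1
fire_latent = latent_heat_release(total_condensed * fire_fraction)
ambient_latent = latent_heat_release(total_condensed * (1.0 - fire_fraction))
```

Under this partition the entrained moisture supplies nine times the latent heating of the fire moisture, which is the sense in which the fire's own water vapor matters little for the convective dynamics even though latent heat dominates the overall budget.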

Relevance:

20.00%

Publisher:

Abstract:

The main objective of my thesis was to assess the technical-economic feasibility of an electricity-generation system integrated with CCS. The policy framework for this work is the recent political attention directed towards CCS technologies as a means of addressing ongoing climate change. Several technological options have been proposed to stabilize and reduce the atmospheric concentration of carbon dioxide (CO2); among them, the most promising for the IPCC (Intergovernmental Panel on Climate Change) are the CCS technologies (Carbon Capture and Storage, or Carbon Capture and Sequestration). The remedy proposed for large stationary CO2 sources such as thermoelectric power plants is to separate the flue gas, capturing the CO2 and storing it in deep subsurface geological formations (more than 800 meters deep). New studies are being developed to support the identification of potential CO2 storage reservoirs in Italy and in Europe within GeoCapacity (a European database). The literature data analyzed show that most of the CO2 emitted from large stationary sources comes from electricity generation (78% of total emissions), especially from plants using coal (about 60%). The aim of CCS is to return "to the sender", the ground, the carbon in oxidized form (CO2) after it has been burned by man starting from its reduced forms (CH4, oil and coal). Carbon dioxide is thus not a "pollutant" when injected into the subsurface: CO2 is an acid reagent that interacts with the rock and with the underground fluids, depending on the characteristics of the host rock. The results showed that CCS technologies are very urgent, because there are too many active industrial CO2 sources (power plants, refineries, cement plants, steel mills) in the world, which are driving the atmospheric CO2 concentration too quickly towards unacceptable levels.
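The ~800 m threshold cited above reflects the critical point of CO2 (about 7.38 MPa and 31.1 °C): below roughly that depth, hydrostatic pore pressure and a typical geothermal gradient keep injected CO2 in a dense supercritical state, which maximizes storage per unit pore volume. A rough check, with assumed surface temperature and geothermal gradient:

```python
def hydrostatic_pressure_mpa(depth_m, rho=1000.0, g=9.81):
    """Pore pressure (MPa) at depth, assuming a hydrostatic gradient."""
    return rho * g * depth_m / 1e6

CO2_CRITICAL_P_MPA = 7.38   # critical pressure of CO2
CO2_CRITICAL_T_C = 31.1     # critical temperature of CO2

def co2_supercritical(depth_m, surface_t_c=15.0, geotherm_c_per_km=25.0):
    """Rough check that both the critical pressure and the critical
    temperature of CO2 are exceeded, so CO2 is stored as a dense
    supercritical fluid (surface T and geotherm are assumptions)."""
    t = surface_t_c + geotherm_c_per_km * depth_m / 1000.0
    p = hydrostatic_pressure_mpa(depth_m)
    return p > CO2_CRITICAL_P_MPA and t > CO2_CRITICAL_T_C

# At ~800 m the pressure (~7.85 MPa) just exceeds the critical point,
# which is why reservoirs deeper than ~800 m are targeted.
```

At 500 m the same check fails on pressure, illustrating why shallower formations are not suitable for dense-phase storage.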

Relevance:

20.00%

Publisher:

Abstract:

In this thesis we present two deep, overlapping surveys of the Lockman Hole field at 1.4 GHz and 345 MHz, taken with the Westerbork Synthesis Radio Telescope. We extracted a catalogue of ~6000 radio sources from the 1.4 GHz mosaic down to a flux limit of ~55 μJy, and a catalogue of 334 radio sources down to a flux limit of ~4 mJy from the inner 7 square degrees of the 345 MHz image. The extracted catalogues were used to derive the source number counts at 1.4 GHz and at 345 MHz. The source counts were found to be fully consistent with previous determinations; in particular, the 1.4 GHz source counts derived from the present sample provide one of the most statistically robust determinations in the flux range 0.1 < S < 1 mJy. During the commissioning program of the LOFAR telescope, the Lockman Hole field was observed at 58 MHz and 150 MHz. The 150 MHz LOFAR observation is particularly relevant, as it allowed us to obtain the first flux-calibrated, high-resolution LOFAR image of a deep field. From this image we extracted a preliminary source catalogue down to a flux limit of ~15 mJy (~10σ), which can be considered complete down to 20-30 mJy. A spectral index study of the mJy sources in the Lockman Hole region was performed using the available catalogues (1.4 GHz, 345 MHz and 150 MHz) and a deep 610 MHz source catalogue available from the literature (Garn et al. 2008, 2010).
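The two-point spectral indices underlying such a multi-frequency study are straightforward to compute. A minimal sketch with a hypothetical source (not a catalogue entry), using the convention S ∝ ν^(-α):

```python
import math

def spectral_index(s1_mjy, nu1_mhz, s2_mjy, nu2_mhz):
    """Two-point spectral index alpha, with S proportional to nu**(-alpha).
    Any consistent flux and frequency units work, since only ratios enter."""
    return -math.log(s1_mjy / s2_mjy) / math.log(nu1_mhz / nu2_mhz)

# Hypothetical steep-spectrum synchrotron source with alpha = 0.8:
# back out its 345 MHz flux from an assumed 10 mJy at 1400 MHz ...
s_345 = 10.0 * (345.0 / 1400.0) ** -0.8
# ... and recover the index from the two flux measurements.
alpha = spectral_index(10.0, 1400.0, s_345, 345.0)
```

With four bands (150, 345, 610 and 1400 MHz), fitting a single power law per source instead of pairwise indices also flags curved spectra, e.g. flat-spectrum cores or spectral turnovers.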

Relevance:

20.00%

Publisher:

Abstract:

Despite the many issues faced in the past, silicon technology has kept evolving at a constant pace, and today an ever increasing number of cores is integrated onto the same die. Unfortunately, the extraordinary performance achievable by the many-core paradigm is limited by several factors: memory bandwidth limitations, combined with inefficient synchronization mechanisms, can severely curtail the potential computational capabilities. Moreover, the huge HW/SW design space requires accurate and flexible tools to perform architectural explorations and to validate design choices. This thesis focuses on these aspects: a flexible and accurate virtual platform has been developed, targeting a reference many-core architecture, and used to perform architectural explorations of the instruction-caching architecture and of hybrid HW/SW synchronization mechanisms. Besides architectural implications, another key issue of embedded systems is considered: energy efficiency. Near-Threshold Computing (NTC) is a key research area in the ultra-low-power domain, as it promises a tenfold improvement in energy efficiency compared to super-threshold operation while mitigating thermal bottlenecks. At the same time, the physical implications of modern deep sub-micron technology severely limit the performance and reliability of modern designs. Reliability becomes a major obstacle when operating in NTC: memory operation in particular becomes unreliable and can compromise system correctness. In the present work a novel hybrid memory architecture is devised to overcome these reliability issues and, at the same time, improve energy efficiency through aggressive voltage scaling when workload requirements allow it. Variability is another great drawback of near-threshold operation: the greatly increased sensitivity to threshold-voltage variations is today a major concern for electronic devices. We introduce a variation-tolerant extension of the baseline many-core architecture: by means of micro-architectural knobs and a lightweight runtime control unit, the baseline architecture becomes dynamically tolerant to variations.
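The tenfold energy-efficiency promise of NTC follows, to first order, from the quadratic dependence of switching energy on supply voltage. A sketch with hypothetical voltages (leakage energy and frequency effects are deliberately ignored in this first-order view):

```python
def dynamic_energy_ratio(v_nominal, v_ntc):
    """First-order per-operation energy saving from voltage scaling.

    Active (switching) energy scales roughly as C * V^2, so scaling the
    supply from v_nominal to v_ntc saves a factor (v_nominal / v_ntc)^2.
    Leakage and the accompanying frequency drop are not modeled here.
    """
    return (v_nominal / v_ntc) ** 2

# Scaling a core from a hypothetical 1.1 V nominal supply down to a
# near-threshold 0.35 V gives roughly the tenfold gain cited above:
gain = dynamic_energy_ratio(1.1, 0.35)  # ~9.9x
```

The same quadratic term is what makes the "aggressive voltage scaling when the workload allows" strategy pay off: even modest undervolting during slack periods yields super-linear energy savings.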

Relevance:

20.00%

Publisher:

Abstract:

The object of this work has been the analysis of the natural processes controlling the geological evolution of the Montenegro and Northern Albania Continental Margin (MACM) during the Late Quaternary. These include the modern sediment dispersal system and oceanographic regime, and the building and shaping of the shelf margin at the 100 kyr scale and across the most recent transition between glacial and interglacial periods. The analysis of the new data shows that the MACM is a shelf-slope system formed by a suite of physiographic elements, including: an inner and an outer continental shelf, separated by two tectonically controlled morphological highs; a lobate drowned mid-shelf paleodelta, formed during the last sea-level fall and lowstand; and an upper continental slope, affected by gravity-driven instability and by a system of extensional faults with surficial displacement, whose orientation is coherent with the regional tectonics. The stratigraphic study of the MACM shows a clear correspondence between the Late Pleistocene/Holocene mud wedge and the low-reflectivity sectors of the inner shelf. Conversely, most of the outer shelf and part of the continental slope expose deposits from the last sea-level lowstand, showing a general sediment-starved condition or a thin cover of postglacial sediments. The MACM shows uplift in correspondence with the Kotor and Bar ridges, and subsidence in the outer shelf and upper slope sectors. In fact, seaward of these tectonic ridges, the sparker seismic profiles show the presence of four well-defined seismo-stratigraphic sequences, interpreted as forced-regression deposits formed during the last four main glacial phases. In this way, the MACM records the 100 kyr scale sea-level fluctuations in its seismo-stratigraphic architecture over the last 350 kyr. Over this time range, through the identification of paleoshoreline deposits, we estimated an average subsidence rate of about 1.2 mm/yr.
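The subsidence-rate estimate can be illustrated with a simple worked example. The depths below are hypothetical, not the MACM data; only the arithmetic is shown (the extra depth of a drowned paleoshoreline beyond its expected eustatic position, divided by its age):

```python
def subsidence_rate_mm_yr(observed_depth_m, eustatic_depth_m, age_kyr):
    """Average subsidence rate from a drowned paleoshoreline: the depth
    in excess of the eustatic sea-level position at its age, divided by
    the age. m/kyr is numerically equal to mm/yr."""
    return (observed_depth_m - eustatic_depth_m) / age_kyr

# Hypothetical example: a 350 kyr old paleoshoreline found at 540 m,
# where eustatic sea-level history alone would place it at 120 m:
rate = subsidence_rate_mm_yr(540.0, 120.0, 350.0)  # 1.2 mm/yr
```

In practice the eustatic reference depth carries its own uncertainty, so the derived rate is an average over the full 350 kyr window rather than an instantaneous value.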

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis is to provide a geochemical characterization of the Seehausen territory, a neighborhood of Bremen, Germany. This territory hosts a landfill of dredged sediments coming both from Bremerhaven (North Sea) and from the Bremen harbor (directly on the river Weser). For this reason, this work also focused on the possible impacts of the landfill on the groundwaters (shallow and deep aquifers). The Seehausen landfill uses a dewatering technique to manage the dredged sediments: incoming sediments are placed in dewatering fields until they are completely dry (this takes almost a year). They are then randomly sampled and analyzed: if the pollutant content is acceptable, the sediments are treated with other materials and used in place of raw material for embankments, bricks, etc.; otherwise they are disposed of in the landfill. During this work, a study was made of the natural geology and hydrogeology of the whole area of interest, especially because it is characterized by ancient natural salt deposits. Then, together with the Geological Survey of Bremen and the Harbor Authority of Bremen, all the piezometers useful for a monitoring network around the landfill were identified. During the sampling campaign, data on the principal anions and cations, physical parameters and stable water isotopes were collected. Data analysis focused particularly on Cl, Na, SO4 and electrical conductivity (EC), because these parameters may help attribute geochemical trends either to the landfill or to the natural background. Furthermore, dataloggers were installed for a month in some piezometers, collecting EC, pressure, dissolved oxygen and temperature data. Finally, a thorough comparison was made between current and historical data (1996-2011) and between old and current interpolation maps, in order to identify time trends in the aquifer geochemistry.
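One common way to attribute a Cl trend to a saline source rather than to the fresh background, in the spirit of the analysis described above, is two-end-member mixing on conservative chloride. The end-member concentrations below are illustrative assumptions, not the Seehausen values:

```python
def seawater_fraction(cl_sample_mg_l, cl_fresh=20.0, cl_sea=19000.0):
    """Two-end-member mixing: fraction of saline end-member (seawater or
    brine) in a groundwater sample, from conservative Cl.
    End-member Cl concentrations (mg/L) are illustrative assumptions.
    The result is clamped to [0, 1]."""
    f = (cl_sample_mg_l - cl_fresh) / (cl_sea - cl_fresh)
    return min(max(f, 0.0), 1.0)

# A hypothetical sample with 970 mg/L Cl would be ~5% saline end-member:
f = seawater_fraction(970.0)
```

Because both marine dredged sediments and the natural salt deposits can supply Cl, a mixing fraction alone cannot separate the two sources; pairing it with a second conservative tracer (for example Br, or the stable water isotopes mentioned above) is what allows the attribution.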

Relevance:

20.00%

Publisher:

Abstract:

The thesis describes the implementation of calibration, format-translation and data-conditioning software for radiometric tracking data of deep-space spacecraft. All of the propagation-media noise rejection techniques available as features in the code are covered in their mathematical formulation, performance and software implementation. Some techniques are retrieved from the literature and the current state of the art, while other algorithms have been conceived ex novo. All three typical deep-space refractive environments (solar plasma, ionosphere, troposphere) are dealt with by specific subroutines. Particular attention has been devoted to the GNSS-based tropospheric path-delay calibration subroutine, since it is the bulkiest module of the software suite in terms of both the sheer number of lines of code and development time. The software is currently in its final stage of development and, once completed, will serve as a pre-processing stage for orbit determination codes. Calibration of transmission-media noise sources in radiometric observables proved to be an essential operation to perform on radiometric data in order to meet the increasingly demanding error-budget requirements of modern deep-space missions. A completely autonomous, all-around propagation-media calibration software package is a novelty in orbit determination, although standalone codes are currently employed by ESA and NASA. The described software is planned to be compatible with the current standards for tropospheric noise calibration used by both these agencies, such as the AMC, TSAC and ESA IFMS weather data, and it natively works with the Tracking Data Message (TDM) file format adopted by CCSDS as a standard aimed at promoting and simplifying inter-agency collaboration.
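For tropospheric calibration of this kind, the zenith hydrostatic delay is commonly modeled with the Saastamoinen formula from surface pressure, latitude and station height; whether this exact model is used in the described software is an assumption here. A minimal sketch:

```python
import math

def saastamoinen_zhd_m(pressure_hpa, lat_deg, height_km):
    """Zenith hydrostatic (dry) tropospheric delay, Saastamoinen model:
    ZHD = 0.0022768 * P / (1 - 0.00266*cos(2*lat) - 0.00028*H),
    with P in hPa, H in km, result in meters."""
    denom = (1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg))
                 - 0.00028 * height_km)
    return 0.0022768 * pressure_hpa / denom

# Sea-level standard pressure at the equator gives the familiar ~2.3 m:
zhd = saastamoinen_zhd_m(1013.25, 0.0, 0.0)
```

The much smaller and more variable wet component is what the GNSS data are really needed for; the hydrostatic part above is predictable from surface meteorology alone, which is why the two are calibrated separately.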

Relevance:

20.00%

Publisher:

Abstract:

Hydrothermal vents are often compared to desert oases because of the highly diverse and abundant biotic communities inhabiting these extreme environments. Nevertheless, the microbial communities associated with shallow hydrothermal systems have been poorly studied. Hydrothermal activity at Dominica Island is quite well known in its geological and geochemical aspects, but no previous information existed about the microbial communities associated with this area. This thesis therefore targets the microbiology of hydrothermal sediments, combining geochemical and molecular biological investigations and focusing on differences between hydrothermal vents and background (i.e. control) areas, and between hydrothermal sites. It was also intended to assess the relationship between geochemical parameters and microbial diversity at the two hydrothermally impacted sites. Two shallow-sea hydrothermal vents located south-west off Dominica Island (Lesser Antilles) were investigated in this study: Champagne Hot Springs and the Soufriere Bay offshore vent. Sediments for geochemical and molecular analyses were collected every 2 cm from the two impacted areas and from two control sites not associated with hydrothermal activity; in situ temperature measurements were also taken every 5 cm of sediment depth at all sites. A geochemical characterization of the sediment porewater was performed through the analysis of the concentrations of several species (H2S, Cl-, Br-, SO42-, Fe2+, Na+, K+, B+, Si+). Microbial communities at the different sites were studied by Automated Ribosomal Intergenic Spacer Analysis (ARISA). The distribution of operational taxonomic units (OTUs) was inspected, and statistical analyses were performed for differences in community structure and composition, and for changes in β-diversity along with sediment geochemistry.
The data suggest that mixing between hydrothermal fluids and seawater results in distinctly different environmental gradients and potential ecological niches at the two investigated hydrothermal vents, reflected in a difference in microbial community structure between them.
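β-diversity comparisons between OTU profiles of this kind are often based on the Bray-Curtis dissimilarity (the abstract does not state which metric was used, so this choice is illustrative). A minimal sketch with hypothetical OTU counts:

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two OTU abundance profiles:
    0 = identical composition, 1 = no shared OTUs.
    a and b are aligned lists of per-OTU counts."""
    shared = sum(min(x, y) for x, y in zip(a, b))
    total = sum(a) + sum(b)
    return 1.0 - 2.0 * shared / total

# Hypothetical OTU count tables for a vent sample vs. a control sample:
vent = [30, 0, 50, 20]
control = [10, 40, 5, 20]
d = bray_curtis(vent, control)  # fairly dissimilar communities
```

Computing this pairwise over all samples yields the dissimilarity matrix on which ordination and statistical tests of community-structure differences (e.g. vent vs. control, or along geochemical gradients) are typically run.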

Relevance:

20.00%

Publisher:

Abstract:

The study of the biogeochemical processes occurring at the water-sediment interface is of great importance for understanding which environmental factors are responsible for possible changes in the budget of organic carbon and of other major or minor elements, and can indicate which areas are most sensitive to these processes. In this study, the mechanisms driving the mineralization of organic matter were analyzed in areas of the central Mediterranean characterized by different hydrodynamic, bathymetric and trophic conditions. In particular, sediment cores were collected and porewaters analyzed at sites located in the central-southern Adriatic, characterized by shallow depths, high sedimentation rates and high inputs of organic matter, and at sites located in the central-northern Ionian, characterized by increasing depths, lower sedimentation rates and reduced fluvial inputs. The analysis of the degradation processes of organic matter highlights regional differences between the Adriatic and Ionian basins: oxic and suboxic mineralization processes appear intense in the Adriatic sediments, whereas the Ionian basin appears characterized mainly by oxic degradation of organic matter. Moreover, opposite benthic fluxes of Dissolved Inorganic Carbon (DIC) were recorded in the two basins: the Adriatic sediments behave as a source of DIC, while the Ionian sediments behave as a sink of DIC, suggesting possible carbonate precipitation in the Ionian basin.
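The source-versus-sink distinction for DIC can be sketched with Fick's first law applied to a porewater concentration gradient; all profile numbers below are hypothetical, not the measured values:

```python
def diffusive_flux(porosity, d_s, c_top, c_bottom, dz):
    """Fick's-first-law estimate of a benthic diffusive flux across the
    sediment-water interface from a two-point porewater profile.
    d_s: sediment diffusion coefficient; dz: depth separation of the two
    concentrations. Positive result = flux out of the sediment (source),
    i.e. concentration increases downward and solute diffuses upward."""
    gradient = (c_bottom - c_top) / dz
    return porosity * d_s * gradient

# Hypothetical DIC profiles (mmol/L over the top 0.05 m; Ds in m^2/s):
j_adriatic = diffusive_flux(0.8, 5e-10, 2.3, 3.1, 0.05)  # > 0: DIC source
j_ionian = diffusive_flux(0.8, 5e-10, 2.3, 2.1, 0.05)    # < 0: DIC sink
```

A downward-decreasing DIC profile (the Ionian case in this toy example) drives DIC into the sediment, consistent with removal at depth, for instance by carbonate precipitation.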

Relevance:

20.00%

Publisher:

Abstract:

A topical issue today is privacy and security on the Internet. Through the study of several documents and the testing of applications designed to guarantee anonymity, this thesis analyzes the current situation. Our privacy is compromised, and raising global awareness of the problem is important.

Relevance:

20.00%

Publisher:

Abstract:

This work began with a theoretical study of the main image-classification techniques known in the literature, with particular attention to the most widespread image-representation models, such as the Bag of Visual Words model, and to the main Machine Learning tools. Attention then focused on the current state of the art in image classification, namely Deep Learning. To experiment with this set of image-classification methodologies, Torch7 was used: an open-source numerical computing framework, scriptable through the Lua language, with broad support for state-of-the-art Deep Learning methods. The actual image classification was implemented with Torch7, since this framework, thanks also to the analysis previously carried out by some of my colleagues, proved to be very effective at categorizing objects in images. The images on which the experimental tests were based belong to a dataset created ad hoc for a 3D vision system intended to assist visually impaired and blind users; it contains some of the main obstacles that a visually impaired person may encounter in everyday life. In particular, the dataset consists of potential obstacles relating to a hypothetical outdoor scenario. Having thus established that Torch7 was the right support for classification, attention turned to the possibility of exploiting stereo vision to increase classification accuracy. The images in the dataset were in fact acquired with an FPGA-based stereo camera developed by the research group where this work was carried out. This made it possible to use 3D information, such as the depth of each object in the image, to segment the objects of interest through an algorithm implemented in C++, excluding the rest of the scene. The last phase of the work was to test Torch7 on the image dataset, previously segmented with the segmentation algorithm just described, in order to recognize the type of obstacle detected by the system.
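The depth-based segmentation step can be sketched as a simple band-pass on the disparity-derived depth map. The actual algorithm was implemented in C++; this Python toy, with hypothetical depths, only illustrates the idea of keeping pixels inside a depth band of interest:

```python
def segment_by_depth(depth_map, near_m, far_m):
    """Binary mask keeping only pixels whose depth falls inside the band
    [near_m, far_m], isolating a nearby obstacle from the background.
    depth_map: 2D list of depths in meters; 0 marks invalid pixels
    (no disparity), which are always excluded."""
    return [[1 if d > 0 and near_m <= d <= far_m else 0
             for d in row] for row in depth_map]

# Toy 3x4 depth map: an obstacle at ~1.5 m in front of a wall at 6 m,
# with one invalid pixel (0.0).
depth = [[6.0, 6.0, 6.0, 6.0],
         [6.0, 1.5, 1.4, 6.0],
         [6.0, 1.6, 1.5, 0.0]]
mask = segment_by_depth(depth, 0.5, 3.0)
```

The resulting mask crops the obstacle out of the scene before classification, so the network sees the object rather than the background, which is the accuracy gain the stereo data provide.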

Relevance:

20.00%

Publisher:

Abstract:

In recent years, Deep Learning techniques have been shown to perform well on a large variety of problems in both Computer Vision and Natural Language Processing, reaching and often surpassing the state of the art on many tasks. The rise of deep learning is also revolutionizing the entire field of Machine Learning and Pattern Recognition, pushing forward the concepts of automatic feature extraction and unsupervised learning in general. However, despite its strong success in both science and business, deep learning has its own limitations. It is often questioned whether such techniques are merely brute-force statistical approaches that only work in the context of High Performance Computing with enormous amounts of data. Another important question is whether they are really biologically inspired, as claimed in certain cases, and whether they can scale well in terms of "intelligence". This dissertation tries to answer these key questions in the context of Computer Vision and, in particular, Object Recognition, a task that has been heavily revolutionized by recent advances in the field. Practically speaking, the answers are based on an exhaustive comparison between two very different deep learning techniques on the aforementioned task: the Convolutional Neural Network (CNN) and Hierarchical Temporal Memory (HTM). They stand for two different approaches and points of view within the big tent of deep learning, and are good choices for understanding and pointing out the strengths and weaknesses of each. The CNN is considered one of the most classic and powerful supervised methods used today in machine learning and pattern recognition, especially in object recognition; CNNs are well received and accepted by the scientific community and are already deployed in large corporations like Google and Facebook to solve face recognition and image auto-tagging problems.
HTM, on the other hand, is an emerging paradigm: a new, mainly unsupervised method that is more biologically inspired. It tries to gain insights from the computational neuroscience community in order to incorporate concepts like time, context and attention, which are typical of the human brain, into the learning process. In the end, the thesis aims to show that in certain cases, with a smaller quantity of data, HTM can outperform CNN.