817 results for Framework development


Relevance: 30.00%

Abstract:

This abstract describes the development of a wildfire forecasting plugin for Capaware. Capaware is designed as an easy-to-use, open-source framework for developing 3D graphics applications over large geographic areas, offering high-performance 3D visualization and powerful interaction tools for the Geographic Information Systems (GIS) community.

Relevance: 30.00%

Abstract:

The present thesis treats the issue of gender equality in Macedonia during the transition from the socialist system to parliamentary democracy. The main aim is to mainstream the gender perspective in the analysis of transitional policies through an examination of the basic citizenship rights to which citizens are entitled and an evaluation of their capabilities to exercise these rights. Gender equality, one of the main strongholds of the concept of human development, is measured through the application of nine gender-relevant capabilities in a case study conducted in selected municipalities of the country. Through the analysis of the Macedonian constitutional and legal framework and the assessment of gender-based inequalities, the research questions the need for a process of engendering citizenship, which would integrate gender-based differences, contemplate the private sphere of citizens' lives and pledge participation in the political life of the country. The thesis finally analyses the gender-equality strategy of the Macedonian government in order to evaluate whether it is context-based, i.e. whether it tackles the main fields where inequalities emerge and, in this context, whether it envisages a process of engendering citizenship.

Relevance: 30.00%

Abstract:

With 16% of the world's population, India is a challenged country. More than a third of its citizens live below the poverty line, on less than a dollar a day. These people have no proper electricity, drinking-water supply or sanitary facilities, and well over 40% are illiterate. More than 65% live in rural areas and 60% earn their livelihood from agriculture. Only a meagre 3.63% have access to a telephone and less than 1% have access to a computer. Providing access to timely information on agriculture, weather, social issues, health care, employment and fishing is therefore of the utmost importance for improving the conditions of the rural poor. After some introductory chapters, whose function is to provide a comprehensive framework, both theoretical and practical, of current rural development policies and of the media situation in India and Uttar Pradesh, my dissertation presents the findings of the pilot project entitled “Enhancing development support to rural masses through community media activity”, launched in 2005 by the Department of Mass Communication and Journalism of the Faculty of Arts of the University of Lucknow (U.P.) and by the local NGO Bharosa. The project's aim was to involve rural people and farmers from two villages of the district of Lucknow (namely Kumhrava and Barhi Gaghi) in a three-year participatory community media project, based on the creation, implementation and use of a rural community newspaper and a rural community internet centre. Community media projects like this one have rarely been carried out in India because the country has no proper community media tradition: the development of the project was therefore a challenge for all the stakeholders involved.

Relevance: 30.00%

Abstract:

This study aims to provide a theoretical framework encompassing the two approaches to entrepreneurial opportunity (opportunity discovery and opportunity creation) by outlining a trajectory from firm creation to capability development, and on to firm performance in the short term (firm survival) and the medium/long term (growth rate). A set of empirically testable hypotheses is proposed and tested by performing qualitative analyses of interviews with a small sample of entrepreneurs and event history analysis on a large sample of firms founded in the United States in 2004.
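The event-history side of such an analysis rests on survival estimation over firm lifetimes with censoring. The sketch below is a minimal Kaplan-Meier estimator on invented lifetimes, not the study's 2004 US cohort; the function name and data are illustrative assumptions:

```python
import numpy as np

def kaplan_meier(durations, observed):
    """Kaplan-Meier survival curve. `observed[i]` is 1 if the firm's exit
    was observed, 0 if it was still alive at the end of the study (censored)."""
    d = np.asarray(durations, dtype=float)
    o = np.asarray(observed)
    times, surv = [], []
    s, at_risk = 1.0, len(d)
    for t in np.unique(d):           # event times in increasing order
        mask = d == t
        deaths = int(o[mask].sum())
        if deaths:
            s *= 1.0 - deaths / at_risk
            times.append(t)
            surv.append(s)
        at_risk -= int(mask.sum())   # exits and censorings leave the risk set
    return np.array(times), np.array(surv)

# toy firm lifetimes in years; observed=0 marks right-censored firms
durations = [1, 2, 2, 3, 5, 5, 6, 8, 8, 10]
observed  = [1, 1, 0, 1, 1, 1, 0, 1, 0, 0]
times, surv = kaplan_meier(durations, observed)
```

The estimator only steps down at observed exits, so censored firms contribute information while they remain in the risk set.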

Relevance: 30.00%

Abstract:

The main objective of this research is to demonstrate that the Clean Development Mechanism (CDM), an instrument created under a global international treaty, can achieve multiple objectives beyond those for which it was established. While it is already a powerful tool in the global fight against climate change, the CDM can also be successful when applied to sectors not contemplated before. In particular, this research aims to demonstrate that a wider use of the CDM in the tourism sector can be an innovative way to foster sustainable tourism and generate additional benefits. The CDM was created by Article 12 of the Kyoto Protocol of the United Nations Framework Convention on Climate Change (UNFCCC) and represents an innovative tool to reduce greenhouse-gas emissions through mitigation activities in developing countries which generate certified emission reductions (CERs), each of them equivalent to one ton of CO2 not emitted into the atmosphere. These credits can be used for compliance purposes by industrialised countries in meeting their reduction targets. The logical path of this research begins with an analysis of the scientific evidence of climate change and its impacts on different economic sectors, including tourism, and continues with a focus on the linkages between climate and the tourism sector. It then analyses the international responses to climate change and the activities in the international arena that address climate change and the tourism sector together. The concluding part of the work presents the objectives and achievements of the CDM and its links to the tourism sector, considering case studies of existing projects which demonstrate that the underlying question can be answered positively. New opportunities for the tourism sector are available.

Relevance: 30.00%

Abstract:

The purpose of this Thesis is to develop a robust and powerful method to classify galaxies from large surveys, in order to establish and confirm the connections between the principal observational parameters of galaxies (spectral features, colours, morphological indices), and to help unveil the evolution of these parameters from $z \sim 1$ to the local Universe. Within the framework of the zCOSMOS-bright survey, and making use of its large database of objects ($\sim 10\,000$ galaxies in the redshift range $0 < z \lesssim 1.2$) and its great reliability in the determination of redshifts and spectral properties, we first adopt and extend the \emph{classification cube method}, as developed by Mignoli et al. (2009), which exploits the bimodal properties of galaxies (spectral, photometric and morphological) separately and then combines the three subclassifications. We use this classification method as a test for a newly devised statistical classification, based on Principal Component Analysis and the Unsupervised Fuzzy Partition clustering method (PCA+UFP), which is able to classify the galaxy population by exploiting its natural global bimodality, considering simultaneously up to 8 different properties. The PCA+UFP analysis is a very powerful and robust tool to probe the nature and the evolution of galaxies in a survey. It allows the classification of galaxies to be defined with smaller uncertainties and adds the flexibility to be adapted to different parameters: being a fuzzy classification, it avoids the problems of a hard classification such as the classification cube presented in the first part of the work. The PCA+UFP method can be easily applied to different datasets: it does not rely on the nature of the data and can therefore be successfully employed with other observables (magnitudes, colours) or derived properties (masses, luminosities, SFRs, etc.). The agreement between the two cluster definitions is very high.
``Early'' and ``late'' type galaxies are well defined by the spectral, photometric and morphological properties, both when these are considered separately and the classifications then combined (classification cube) and when they are treated as a whole (PCA+UFP cluster analysis). Differences arise in the definition of outliers: the classification cube is much more sensitive to single measurement errors or misclassifications in one property than the PCA+UFP cluster analysis, in which errors are ``averaged out'' during the process. The method allowed us to observe the \emph{downsizing} effect taking place in the PC spaces: the migration from the blue cloud towards the red clump happens at higher redshifts for galaxies of larger mass. The transition mass $M_{\mathrm{cross}}$ we determine is in good agreement with other values in the literature.
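The PCA-plus-fuzzy-clustering pipeline can be sketched as follows. This is a toy illustration on synthetic bimodal data, using fuzzy c-means as a simple stand-in for the UFP algorithm; the sample sizes, property values and initialisation are assumptions, not zCOSMOS data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a galaxy catalogue: two populations ("early"/"late")
# described by 8 correlated observables (colours, spectral indices, morphology).
n, p = 500, 8
early = rng.normal(loc=1.0, scale=0.3, size=(n, p))
late = rng.normal(loc=-1.0, scale=0.5, size=(n, p))
X = np.vstack([early, late])

# --- PCA via SVD on the standardised data ---
Z = (X - X.mean(0)) / X.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T            # projection onto the first two PCs

# --- Fuzzy c-means, a stand-in for the UFP clustering step ---
def fuzzy_cmeans(data, c=2, m=2.0, iters=100):
    centers = data[[0, -1]]      # deterministic init: one point from each half
    for _ in range(iters):
        d = np.linalg.norm(data[:, None] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1))
        u = inv / inv.sum(1, keepdims=True)   # fuzzy memberships, rows sum to 1
        w = u ** m
        centers = (w.T @ data) / w.sum(0)[:, None]
    return centers, u

centers, memberships = fuzzy_cmeans(scores)
labels = memberships.argmax(1)
```

Because memberships are continuous, borderline objects carry intermediate grades instead of being forced into one class, which is the advantage over a hard classification that the abstract points out.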

Relevance: 30.00%

Abstract:

The hierarchical organisation of biological systems plays a crucial role in the pattern formation of gene expression resulting from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the emergence of the peculiar structures of the individual phenotype. Being able to reproduce the system's dynamics at the different levels of such a hierarchy can be very useful for studying such a complex phenomenon of self-organisation. The idea is to model the phenomenon in terms of a large and dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour resulting in the formation of spatial patterns. On these premises, the thesis reviews the different approaches already developed for modelling problems in developmental biology, as well as the main models and infrastructures available in the literature for modelling biological systems, analysing their capabilities in tackling multi-compartment / multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. This is based on (i) a computational model featuring networks of compartments and an enhanced model of chemical reactions addressing molecule transfer, (ii) a logic-oriented language to flexibly specify complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning in the model, the simulators are supplied with a module for parameter optimisation.
The task is defined as an optimisation problem over the parameter space in which the objective function to be minimised is the distance between the output of the simulator and a target output. The problem is tackled with a metaheuristic algorithm. As an example of the application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised. The goal of the model is to generate the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene-expression data, with spatial and temporal resolution, acquired from freely available online sources.
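The simulation engine mentioned above is built on Gillespie's direct method. A minimal, self-contained sketch of the plain direct method on a toy two-compartment system with molecule transfer is shown below; the species names, rate constants and reaction set are invented for illustration and are not MS-BioNET's model:

```python
import random

def gillespie(state, reactions, t_end, seed=1):
    """Gillespie's direct method. `reactions` is a list of
    (propensity_fn, stoichiometry_change) pairs over a dict state."""
    rng = random.Random(seed)
    t, trace = 0.0, [(0.0, dict(state))]
    while t < t_end:
        props = [rate(state) for rate, _ in reactions]
        a0 = sum(props)
        if a0 == 0:
            break                            # no reaction can fire
        t += rng.expovariate(a0)             # exponential waiting time
        r = rng.random() * a0                # pick a channel proportional to propensity
        for prop, (_, change) in zip(props, reactions):
            if r < prop:
                for species, dn in change.items():
                    state[species] += dn
                break
            r -= prop
        trace.append((t, dict(state)))
    return trace

# Two compartments A and B: synthesis in A, membrane transfer A->B, decay in B.
state = {"xA": 0, "xB": 0}
reactions = [
    (lambda s: 5.0,           {"xA": +1}),            # production in A
    (lambda s: 0.5 * s["xA"], {"xA": -1, "xB": +1}),  # inter-compartment transfer
    (lambda s: 0.2 * s["xB"], {"xB": -1}),            # degradation in B
]
trace = gillespie(state, reactions, t_end=50.0)
```

The "many-species/many-channels" optimisations the thesis uses (dependency graphs, partial propensity updates) change only how the propensities and the channel search are organised, not this basic loop.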

Relevance: 30.00%

Abstract:

This thesis explores the production of software systems for embedded systems using techniques from the world of Model-Driven Software Development. The most important phase of the work is the definition of a meta-model that captures the fundamental concepts of embedded systems. The model abstracts away from any particular platform and identifies the abstractions that characterise embedded systems in general; the meta-model is therefore platform-independent. For automatic code generation a reference platform was adopted, namely Arduino. Arduino is an embedded system that is becoming increasingly popular because it combines a good level of performance with a relatively low price. The platform supports the development of special-purpose systems that use sensors and actuators of various kinds, easily connected to the available pins. The meta-model defined here is an instance of the MOF meta-metamodel, formally defined by the OMG. This allows the developer to think of a system in the form of a model, an instance of the defined meta-model. A meta-model can also be regarded as the abstract syntax of a language, and can therefore be defined by a set of EBNF rules. The technology used to define the meta-model is Xtext: a framework that allows EBNF rules to be written and automatically generates the Ecore model associated with the defined meta-model. Ecore is the implementation of EMOF in the Eclipse environment. Xtext also generates plugins that provide a syntax-driven editor for the language defined by the meta-model. Automatic code generation was implemented using the Xtend2 language, which makes it possible to traverse the Abstract Syntax Tree produced by translating the model into Ecore and to generate all the necessary code files.
The generated code provides essentially the whole schematic part of the application, leaving the development of the business logic to the application designer. After the definition of the meta-model for a single embedded system, the level of abstraction was raised towards the part of the meta-model concerned with the interaction of an embedded system with other systems, moving to a System perspective, understood as a set of interacting individual systems; this definition is made from the point of view of the individual system whose model is being defined. The thesis also presents a case study which, although fairly simple, provides an example and a tutorial for developing applications with the meta-model, and shows how the application designer's task becomes straightforward, provided it rests on a sound analysis of the problem. The results obtained were of good quality, and the meta-model is translated into code that works correctly.
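The model-to-code step can be illustrated with a toy analogue of the Xtend-based generator: a platform-independent model (sensors and actuators bound to pins) is turned into the skeleton of an Arduino-style sketch by a template. The model shape and all names below are illustrative assumptions, not the thesis's actual meta-model:

```python
# Toy platform-independent model: an application with one sensor and one
# actuator, each bound to a pin. In the real pipeline this would be an
# Ecore model instance produced by the Xtext grammar.
model = {
    "name": "BlinkOnButton",
    "sensors":   [{"id": "button", "pin": 2}],
    "actuators": [{"id": "led",    "pin": 13}],
}

def generate_sketch(model):
    """Generate the 'schematic part' of an Arduino-style sketch;
    the business logic in loop() is left to the application designer."""
    decls = [f"const int {d['id']}Pin = {d['pin']};"
             for d in model["sensors"] + model["actuators"]]
    setup = [f"pinMode({s['id']}Pin, INPUT);" for s in model["sensors"]]
    setup += [f"pinMode({a['id']}Pin, OUTPUT);" for a in model["actuators"]]
    lines = decls + ["", "void setup() {"]
    lines += [f"  {line}" for line in setup]
    lines += ["}", "", "void loop() {",
              "  // business logic left to the application designer", "}"]
    return "\n".join(lines)

print(generate_sketch(model))
```

The division of labour matches the abstract: pin declarations and wiring come from the model, while `loop()` stays empty for the designer.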

Relevance: 30.00%

Abstract:

Flood disasters are a major cause of fatalities and economic losses, and several studies indicate that global flood risk is currently increasing. In order to reduce and mitigate the impact of river flood disasters, the current trend is to integrate existing structural defences with non-structural measures. This calls for a wider application of advanced hydraulic models for flood hazard and risk mapping, engineering design, and flood forecasting systems. Within this framework, two different hydraulic models for the large-scale analysis of flood events have been developed. The two models, named CA2D and IFD-GGA, adopt an integrated approach based on the diffusive shallow water equations and a simplified finite volume scheme. The models are also designed for massive code parallelization, which is of key importance in reducing run times in large-scale and high-detail applications. The two models were first applied to several numerical cases, to test the reliability and accuracy of different model versions. The most effective versions were then applied to different real flood events and flood scenarios. The IFD-GGA model showed serious problems that prevented further applications. On the contrary, the CA2D model proved to be fast and robust, and able to reproduce 1D and 2D flow processes in terms of water depth and velocity. In most applications the accuracy of the model results was good and adequate for large-scale analysis. Where complex flow processes occurred, local errors were observed due to the model approximations; however, they did not compromise the correct representation of the overall flow processes. In conclusion, the CA2D model can be a valuable tool for the simulation of a wide range of flood event types, including lowland and flash flood events.
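The model class in question (diffusive-wave shallow water with a simple finite-volume update) can be sketched in one dimension as below. This is only an illustration of the approach, not the CA2D code; the grid, roughness, time step and outflow limiter are all assumptions:

```python
import numpy as np

# 1D diffusive-wave finite-volume sketch: inertia is neglected and the flux
# is a Manning-type relation driven by the water-surface slope.
nx, dx, dt = 100, 10.0, 0.05   # cells, cell size [m], time step [s]
n_manning = 0.03               # Manning roughness coefficient
zb = np.linspace(1.0, 0.0, nx) # gently sloping bed elevation [m]
h = np.zeros(nx)               # water depth [m]
h[:5] = 1.0                    # initial flood volume in the upstream cells

for _ in range(4000):
    zw = zb + h                                    # water-surface elevation
    slope = -(zw[1:] - zw[:-1]) / dx               # surface slope at cell faces
    h_face = np.maximum(h[:-1], h[1:])             # conveying depth at the face
    q = h_face ** (5 / 3) / n_manning * np.sign(slope) * np.sqrt(np.abs(slope))
    # limit the flux so a cell cannot lose more water than it holds
    q = np.where(q > 0,
                 np.minimum(q, h[:-1] * dx / dt),
                 np.maximum(q, -h[1:] * dx / dt))
    dh = np.zeros(nx)
    dh[:-1] -= q * dt / dx                         # flux-form (conservative) update
    dh[1:] += q * dt / dx
    h = np.maximum(h + dh, 0.0)                    # clip round-off negatives
```

Because each interior flux is added to one cell and subtracted from its neighbour, the update is conservative by construction, which is the property the finite-volume formulation is chosen for.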

Relevance: 30.00%

Abstract:

Climate change has been acknowledged as a threat to humanity. Most scholars agree that to avert dangerous climate change and to transform economies into low-carbon societies, deep global emission reductions are required by the year 2050. Under the framework of the Kyoto Protocol, the Clean Development Mechanism (CDM) is the only market-based instrument that encourages industrialised countries to pursue emission reductions in developing countries. The CDM aims to pay the incremental finance necessary to operationalize emission reduction projects which are otherwise not financially viable. According to the objectives of the Kyoto Protocol, the CDM should finance projects that are additional to those which would have happened anyway, contribute to sustainable development in the countries hosting the projects, and be cost-effective. To enable the identification of such projects, an institutional framework has been established by the Kyoto Protocol which lays out responsibilities for public and private actors. This thesis examines whether the CDM has achieved these objectives in practice and can thus be considered an effective tool to reduce emissions. To complete this investigation, the book applies economic theory and analyses the CDM from two perspectives. The first perspective is the supply-dimension which answers the question of how, in practice, the CDM system identified additional, cost-effective, sustainable projects and, generated emission reductions. The main contribution of this book is the second perspective, the compliance-dimension, which answers the question of whether industrialised countries effectively used the CDM for compliance with their Kyoto targets. The application of the CDM in the European Union Emissions Trading Scheme (EU ETS) is used as a case-study. Where the analysis identifies inefficiencies within the supply or the compliance dimension, potential improvements of the legal framework are proposed and discussed.

Relevance: 30.00%

Abstract:

The aim of this work is to contribute to the development of new multifunctional nanocarriers for the improved encapsulation and delivery of anticancer and antiviral drugs. The work focused on water-soluble and biocompatible oligosaccharides, the cyclodextrins (CyDs), and on a new family of nanostructured, biodegradable carrier materials made of porous metal-organic frameworks (nanoMOFs). The drugs of choice were the anticancer doxorubicin (DOX), azidothymidine (AZT) and its phosphate derivatives, and artemisinin (ART). DOX has a pharmacological drawback due to its tendency to self-aggregate in water. The non-covalent binding of DOX to a series of CyD derivatives, such as γ-CyD, an epichlorohydrin-crosslinked β-CyD polymer (pβ-CyD) and a citric-acid-crosslinked γ-CyD polymer (pγ-CyD), was studied by UV-visible absorption, circular dichroism and fluorescence. Multivariate global analysis of multiwavelength data from spectroscopic titrations allowed the identification and characterization of the stable complexes. pγ-CyD proved to be the best carrier, showing both high association constants and the ability to monomerize DOX. AZT is an important antiretroviral drug. The active form is AZT-triphosphate (AZT-TP), formed through metabolic pathways of low efficiency. Direct administration of AZT-TP is limited by its poor stability in biological media, so the development of suitable carriers is highly important. In this context we studied the binding of some phosphorylated derivatives to nanoMOFs by spectroscopic methods. The results obtained with iron(III)-trimesate nanoMOFs showed that the binding of these drugs occurs mainly through strong iono-covalent bonds to iron(III) centres. On the basis of these and other results obtained in partner laboratories, it was possible to propose this highly versatile and “green” carrier system for the delivery of phosphorylated nucleoside analogues. The interaction of DOX with nanoMOFs was also studied.
Finally, the binding of the antimalarial drug artemisinin (ART) to two cyclodextrin-based carriers, the pβ-CyD polymer and a light-responsive bis(β-CyD) host, was also studied.
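The core of a spectroscopic titration analysis is fitting a binding isotherm to the signal as a function of host concentration. The sketch below is a deliberately simplified single-wavelength 1:1 fit (host in large excess), a stand-in for the multivariate global analysis used in the thesis; the concentrations, signals and the "true" association constant are invented:

```python
import numpy as np

def isotherm(H, F0, F1, K):
    """Observed signal for a 1:1 complex with host in excess:
    F = (F0 + F1*K*[H]) / (1 + K*[H])."""
    return (F0 + F1 * K * H) / (1.0 + K * H)

# synthetic titration: free-drug signal F0=1, complex signal F1=4, K=5000 1/M
rng = np.random.default_rng(3)
H = np.linspace(0.0, 2e-3, 20)                     # host (CyD) concentration [M]
F = isotherm(H, F0=1.0, F1=4.0, K=5000.0) + rng.normal(0, 0.02, H.size)

def fit_K(H, F, K_grid):
    """Grid over K; for each fixed K the model is linear in (F0, F1)."""
    best = None
    for K in K_grid:
        x = K * H / (1.0 + K * H)                  # bound fraction
        A = np.column_stack([1.0 - x, x])          # F = F0*(1-x) + F1*x
        coef = np.linalg.lstsq(A, F, rcond=None)[0]
        resid = ((A @ coef - F) ** 2).sum()
        if best is None or resid < best[0]:
            best = (resid, K, coef)
    return best

_, K_fit, (F0_fit, F1_fit) = fit_K(H, F, np.logspace(2, 5, 300))
```

The multivariate global analysis in the thesis does the same thing simultaneously over many wavelengths and species, which constrains K much more tightly than a single-wavelength fit.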

Relevance: 30.00%

Abstract:

The main objective of this thesis is to explore the short- and long-run causality patterns in the finance-growth nexus and the finance-growth-trade nexus before and after the global financial crisis, in the case of Albania. To this end we use quarterly data on real GDP, 13 proxy measures of financial development and the trade-openness indicator for the periods 1998Q1-2013Q2 and 1998Q1-2008Q3. Causality patterns are explored in a VAR-VECM framework, proceeding as follows: (i) testing for the order of integration of the variables; (ii) cointegration analysis; and (iii) Granger causality tests in a VAR-VECM framework. In the finance-growth nexus, the empirical evidence suggests a positive long-run relationship between finance and economic growth, with causality running from financial development to economic growth. The global financial crisis seems not to have affected the direction of causality in the finance-growth nexus, thus supporting the finance-led growth hypothesis in the long run for Albania. In the finance-growth-trade-openness nexus, we find evidence of a positive long-run relationship between the variables, with the direction of causality depending on the proxy used for financial development. When the pre-crisis sample is considered, we find evidence of causality running from financial development and trade openness to economic growth. The global financial crisis seems to have somewhat affected the direction of causality in the finance-growth-trade nexus, which has become sensitive to the proxy used for financial development. In the short run, the empirical evidence suggests a clear unidirectional relationship between finance and growth, with causality mostly running from economic growth to financial development. When we consider the pre-crisis sub-sample, results are mixed, depending on the proxy used for financial development. The same results are confirmed when trade openness is taken into account.
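Step (iii) above, the Granger causality test, amounts to an F-test of whether lags of one series improve the prediction of the other. The sketch below implements the bivariate version from scratch on synthetic "finance" and "GDP" series where causality runs from finance to growth by construction; the series, lag order and coefficients are invented, and the full thesis analysis additionally involves unit-root tests and the VECM error-correction terms:

```python
import numpy as np

def granger_F(y, x, p=2):
    """F statistic for H0: lags of x do not help predict y (lag order p)."""
    T = len(y)
    Y = y[p:]
    lag = lambda s, k: s[p - k:T - k]                    # k-th lag, aligned with Y
    X_r = np.column_stack([np.ones(T - p)] +
                          [lag(y, k) for k in range(1, p + 1)])   # restricted
    X_u = np.column_stack([X_r] +
                          [lag(x, k) for k in range(1, p + 1)])   # unrestricted
    rss = lambda X: ((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2).sum()
    rss_r, rss_u = rss(X_r), rss(X_u)
    df2 = len(Y) - X_u.shape[1]
    return (rss_r - rss_u) / p / (rss_u / df2)

rng = np.random.default_rng(7)
T = 400
fin = rng.normal(size=T).cumsum() * 0.1 + rng.normal(size=T)  # "finance" proxy
gdp = np.zeros(T)
for t in range(2, T):                      # growth driven by lagged finance
    gdp[t] = 0.3 * gdp[t - 1] + 0.5 * fin[t - 1] + rng.normal()

F_fin_to_gdp = granger_F(gdp, fin)
F_gdp_to_fin = granger_F(fin, gdp)
```

A large F in one direction and a small one in the other is exactly the unidirectional pattern the abstract reports for the short run.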

Relevance: 30.00%

Abstract:

In this thesis we develop further the functional renormalization group (RG) approach to quantum field theory (QFT) based on the effective average action (EAA) and on the exact flow equation that it satisfies. The EAA is a generalization of the standard effective action that interpolates smoothly between the bare action for $k \rightarrow \infty$ and the standard effective action for $k \rightarrow 0$. In this way, the problem of performing the functional integral is converted into the problem of integrating the exact flow of the EAA from the UV to the IR. The EAA formalism deals naturally with several different aspects of a QFT. One aspect is related to the discovery of non-Gaussian fixed points of the RG flow that can be used to construct continuum limits. In particular, the EAA framework is a useful setting in which to search for Asymptotically Safe theories, i.e. theories valid up to arbitrarily high energies. A second aspect in which the EAA reveals its usefulness is non-perturbative calculations. In fact, the exact flow that it satisfies is a valuable starting point for devising new approximation schemes. In the first part of this thesis we review and extend the formalism; in particular, we derive the exact RG flow equation for the EAA and the related hierarchy of coupled flow equations for the proper vertices. We show how standard perturbation theory emerges as a particular way of iteratively solving the flow equation when the starting point is the bare action. Next, we explore both technical and conceptual issues by means of three different applications of the formalism: to QED, to general non-linear sigma models (NL$\sigma$M) and to matter fields on curved spacetimes. In the main part of this thesis we construct the EAA for non-abelian gauge theories and for quantum Einstein gravity (QEG), using the background field method to implement the coarse-graining procedure in a gauge-invariant way.
We propose a new truncation scheme in which the EAA is expanded in powers of the curvature or field strength. Crucial to the practical use of this expansion is the development of new techniques to manage functional traces, such as the algorithm proposed in this thesis, which makes it possible to project the flow of all terms in the EAA that are analytic in the fields. As an application, we show how the low-energy effective action for quantum gravity emerges as the result of integrating the RG flow. In any treatment of theories with local symmetries that introduces a reference scale, the question of preserving gauge invariance along the flow becomes predominant. In the EAA framework this problem is dealt with through the background field formalism, at the cost of enlarging the theory space in which the EAA lives to the space of functionals of both fluctuation and background fields. In this thesis we study how the identities dictated by the symmetries are modified by the introduction of the cutoff, and we study so-called bimetric truncations of the EAA that contain both fluctuation and background couplings. In particular, we confirm the existence of a non-Gaussian fixed point for QEG, which is at the heart of the Asymptotic Safety scenario in quantum gravity, in the enlarged bimetric theory space where the running of the cosmological constant and of Newton's constant is influenced by the fluctuation couplings.
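The exact flow equation referred to above is the Wetterich equation. Writing $t = \ln k$ and $R_k$ for the IR cutoff kernel, it reads

```latex
\partial_t \Gamma_k[\varphi]
  = \frac{1}{2}\,\mathrm{Tr}\!\left[
      \left( \Gamma_k^{(2)}[\varphi] + R_k \right)^{-1} \partial_t R_k
    \right],
\qquad t = \ln k ,
```

where $\Gamma_k^{(2)}$ denotes the second functional derivative of the EAA with respect to the fields. Iterating this one-loop-like structure starting from the bare action recovers standard perturbation theory, as described in the abstract.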

Relevance: 30.00%

Abstract:

Volatile organic compounds (VOC) are present in the atmosphere only in trace amounts, yet they play an important role in air chemistry: they influence tropospheric ozone, urban smog and the oxidation capacity of the atmosphere, and have direct and indirect effects on global climate change. An important class of VOC are the non-methane hydrocarbons (NMHC), which come predominantly from anthropogenic sources. Atmospheric chemists therefore need an instrument that measures VOC, including NMHC, at high time resolution, in particular for real-time measurements on board a research aircraft. To this end FOTOS, a system for the fast observation of trace organic species, was designed and built for deployment on a new high-altitude, long-range scientific aircraft called HALO; FOTOS was subsequently tested in two ground-based measurement campaigns. FOTOS combines a custom-built, automated, three-trap cryogenic sampling system with an adapted commercial fast GC-MS. The aim of this design was to increase versatility and reduce the potential for interferences, so no chemical drying agents or adsorbent materials were used. FOTOS achieved a sampling frequency of one sample every 5.5 minutes while measuring at least 13 different C2 to C5 NMHC. The three-sigma detection limits for n- and iso-pentane were determined to be 2.6 and 2.0 pptv, respectively. Laboratory tests confirmed that FOTOS is a versatile, robust, highly automated, precise, accurate and sensitive instrument, suitable for real-time measurements of VOC at sampling frequencies appropriate for a research aircraft such as HALO.
To validate the performance of FOTOS, an intercomparison with the GC-FID system at the Hohenpeißenberg Meteorological Observatory, a WMO-GAW global station, was carried out from 26 January to 4 February 2010. Thirteen different NMHC were analysed and compared within the framework of the GAW Data Quality Objectives (DQO). More than 80% of the measurements of six C3 to C5 NMHC met these DQO. This first field campaign highlighted the robustness and measurement accuracy of FOTOS, in addition to the advantage of its higher sampling frequency, even in a ground-based measurement. To demonstrate the capabilities of the instrument in the field, FOTOS measured selected light NMHC during a campaign in the boreal forest, HUMPPA-COPEC 2010. From 12 July to 12 August 2010, an international group of institutes and instruments measured physical and chemical properties of the gas and particle phases of the air above the boreal forest at the SMEAR II station near Hyyttiälä, Finland. Several points of interest were identified in the alkane mixing ratios and the pentane isomer ratio, in particular markedly different periods of low and high variability, three biomass-burning plumes from Russian forest fires, and two days of extremely clean air from the polar region. Comparisons of the NMHC with other anthropogenic tracers revealed several sources of anthropogenic influence at the site and allowed local sources to be distinguished from more distant ones. A minimal natural contribution to the diel cycle of NOx was inferred from the correlation of NOx with the alkanes. Estimating the age of air masses from the pentane isomer ratio was complicated by changing source ratios and by the peculiarities of photochemistry during the high-latitude summer.
These measurements demonstrated the value of measuring light NMHC, even in remote regions, as an additional specific marker of anthropogenic influence.
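The three-sigma detection limit quoted above is conventionally estimated from the scatter of repeated blank (or lowest-standard) measurements divided by the calibration slope. The sketch below illustrates the arithmetic with invented numbers; these are not FOTOS calibration data:

```python
import numpy as np

# Illustrative 3-sigma detection-limit estimate: LOD = 3 * s_blank / slope,
# where s_blank is the standard deviation of replicate blank peak areas and
# the slope converts peak area to mixing ratio. All values are made up.
rng = np.random.default_rng(5)
blank_areas = rng.normal(loc=20.0, scale=4.0, size=12)  # replicate blank peak areas
slope = 6.0                                             # calibration: area per pptv

lod_pptv = 3.0 * blank_areas.std(ddof=1) / slope        # detection limit [pptv]
```

With a blank scatter of a few area counts and a slope of a few counts per pptv, this yields a limit of a few pptv, the same order as the n- and iso-pentane limits reported for FOTOS.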

Relevance: 30.00%

Abstract:

This thesis is set within the STOCKMAPPING project, one of the studies developed in the framework of the RITMARE Flagship project. The main goals of STOCKMAPPING were the creation of a genomic mapping for stocks of demersal target species and the assembly of a population-genomics database, in order to identify stocks and stock boundaries. The thesis focuses on three main objectives, which form the core of the initial assessment of the methodologies and structure to be applied to the entire STOCKMAPPING project: the definition of an analytical design to identify and locate stocks and stock boundaries of Mullus barbatus; the application of a multidisciplinary approach to validate biological methods; and an initial assessment and improvement of the genotyping-by-sequencing technique used (2b-RAD). The first step is the definition of an analytical design that takes into account the biological characteristics of red mullet and is representative of the STOCKMAPPING commitments. In this framework a reduction and selection step was needed owing to budget cuts, and sampling areas were ranked according to four priorities. To guarantee a multidisciplinary approach, the biological data associated with the collected samples were used to investigate differences between sampling areas and GSAs. Genomic techniques were applied to red mullet for the first time, so an initial assessment of the molecular protocols for DNA extraction and 2b-RAD processing was needed. In the end, 192 good-quality DNA samples were extracted and eight samples were processed with 2b-RAD. Using the software Stacks for sequence analysis, a large number of SNP markers were identified among the eight samples. Several tests were performed varying the main parameters of the Stacks pipeline in order to identify the most informative and functional sets of parameters.