882 results for mixed stock analysis


Relevance:

30.00%

Publisher:

Abstract:

The present study investigates the status of the anchovy (Engraulis encrasicolus) and sardine (Sardina pilchardus) populations of the Central and Northern Adriatic Sea using population dynamics methods. The attention paid to these species is due to their commercial importance: they are target species of the Italian fishing fleet, particularly in the Adriatic area. Population dynamics methods are among the most important tools of fishery science. Through stock assessment it is possible to obtain information on the abundance of the resources at sea in time and space, as well as on the mortality caused by fishing activity, both of primary importance for the adoption of management measures. Two population dynamics methods were examined and compared in this thesis: Virtual Population Analysis (VPA) and Integrated Catch-at-Age Analysis (ICA). First, however, it was necessary to examine how to obtain the input data: species growth rates, natural mortality, fishing effort, and catch data. It was then possible to reconstruct the history of the stock over time and assess its current status, providing indications for future exploitation aimed at conserving the stock itself. By determining the growth curve, the growth parameters of the species under study were obtained; these are needed to define natural mortality rates. The abundance of these stocks was evaluated with the Age Length Key (ALK) and Iterative Age Length Key (IALK) programs. In the stock assessment programs, the abundance estimate computed with the first method was preferred, as it is more representative of the stock under study. A parameter of fundamental importance, and one that is difficult to estimate, is mortality; in particular, this study focused on determining natural mortality. It was determined using two programs: ProdBiom (Abella et al., 1998) and the approach devised by Gislason et al. (2008). Although a conservative approach would suggest using the values obtained from ProdBiom, as they are lower, the natural mortality rates obtained with the second procedure were preferred. This preference is motivated by the fact that ProdBiom yields natural mortality indices that are too low when compared with those reported in the literature for the species under study. Moreover, although neither of the two programs was built specifically for pelagic species, the methodology of Gislason et al. (2008) is preferable, as it was derived from an examination of 367 publications, some of which contained data for these species. As for the catch data used in this work to compute the Catch Per Unit Effort (CPUE, i.e., the catch per unit of fishing effort), the data of the Porto Garibaldi fleet were used, since this fleet has a long time series of data, from 1975 to the present; moreover, it has always fished without quota restrictions and with high catch volumes. Once these data were determined, the stock assessment programs VPA and ICA could be applied. ICA turns out to be more reliable, especially for recent years, since it assumes a period over which selectivity is kept constant, reducing the computations required and, consequently, the errors.
In particular, ICA performs its computations under the assumption that both the catch data and the tuning indices may contain errors. Despite the various differences between the programs and their characteristics, both agree on the status of the stocks at sea. As regards the anchovy, the stock of this species in the Northern and Central Adriatic Sea, heavily exploited in the past, is today moderately exploited, since the current level of exploitation is achieved with a low level of fishing effort. It is nevertheless recommended not to increase fishing effort, so as to avoid new drastic declines of the stock with heavy consequences for the fishery. Sardines, instead, show a different trend: since the mid-1980s the stock of Sardina pilchardus has undergone a continuous, progressive decline, which only in the last decade has shown signs of reversal. This, however, should not encourage an increase in fishing pressure; rather, fishing effort should be kept constant at its current level so as to allow the full recovery of the stock (the catches of the Italian fleet are, in fact, still relatively low). Despite the various aspects that remain to be improved (such as sampling, the methodologies used, and the introduction of aspects not yet considered, e.g. discards) and the difficulties encountered, this work has provided an in-depth contribution on the thorny issue of defining the natural mortality rate, identifying a more suitable procedure for estimating this parameter. Moreover, it presented the novel aspect of comparing the ICA and VPA programs, showing good agreement between the results obtained. It is nevertheless necessary to continue investigating these aspects in order to obtain increasingly precise and reliable assessments, to achieve sound management of the fishery and the preservation of the stocks themselves.
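
As a minimal numerical sketch of the quantities discussed above (not the thesis code: the growth parameters and catch/effort numbers below are placeholders, and the mortality coefficients are those published by Gislason and co-authors in a later paper, used here purely for illustration):

```python
import numpy as np

def vbgf_length(t, L_inf, K, t0):
    """von Bertalanffy growth curve: length at age t."""
    return L_inf * (1.0 - np.exp(-K * (t - t0)))

def natural_mortality_gislason(L, L_inf, K):
    """Length-based natural mortality, ln M = 0.55 - 1.61 ln L
    + 1.44 ln L_inf + ln K (illustrative coefficients)."""
    return np.exp(0.55 - 1.61 * np.log(L) + 1.44 * np.log(L_inf) + np.log(K))

ages = np.arange(1, 6)
L = vbgf_length(ages, L_inf=18.5, K=0.6, t0=-0.5)     # cm, placeholder values
M = natural_mortality_gislason(L, L_inf=18.5, K=0.6)  # per year

# CPUE is simply catch divided by fishing effort, record by record.
catch = np.array([1200.0, 950.0, 1100.0])   # tonnes, placeholder
effort = np.array([300.0, 280.0, 310.0])    # fishing days, placeholder
cpue = catch / effort
print(L.round(1), M.round(2), cpue.round(2))
```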

Relevance:

30.00%

Publisher:

Abstract:

Mixed integer programming is today one of the most widely used techniques for dealing with hard optimization problems. On the one hand, many practical optimization problems arising from real-world applications (such as, e.g., scheduling, project planning, transportation, telecommunications, economics and finance, and timetabling) can be easily and effectively formulated as Mixed Integer linear Programs (MIPs). On the other hand, more than 50 years of intensive research have dramatically improved the capability of the current generation of MIP solvers to tackle hard problems in practice. However, many questions are still open and not fully understood, and the mixed integer programming community remains very active in trying to answer some of them. As a consequence, a huge number of papers is continuously produced and new intriguing questions arise every year. When dealing with MIPs, we have to distinguish between two different scenarios. The first occurs when we are asked to handle a general MIP and cannot assume any special structure for the given problem. In this case, a Linear Programming (LP) relaxation and some integrality requirements are all we have for tackling the problem, and we are "forced" to use general purpose techniques. The second occurs when mixed integer programming is used to address a somewhat structured problem. In this context, polyhedral analysis and other theoretical and practical considerations are typically exploited to devise special purpose techniques. This thesis tries to give some insights into both of the above mentioned situations. The first part of the work is focused on general purpose cutting planes, which are probably the key ingredient behind the success of the current generation of MIP solvers. Chapter 1 presents a quick overview of the main ingredients of a branch-and-cut algorithm, while Chapter 2 recalls some results from the literature on disjunctive cuts and their connections with Gomory mixed integer cuts. Chapter 3 presents a theoretical and computational investigation of disjunctive cuts. In particular, we analyze the connections between different normalization conditions (i.e., conditions to truncate the cone associated with disjunctive cutting planes) and other crucial aspects such as cut rank, cut density and cut strength. We give a theoretical characterization of weak rays of the disjunctive cone that lead to dominated cuts, and propose a practical method to strengthen the cuts arising from such weak extremal solutions. Further, we point out how redundant constraints can affect the quality of the generated disjunctive cuts, and discuss possible ways to cope with them. Finally, Chapter 4 presents some preliminary ideas in the context of multiple-row cuts. Very recently, a series of papers has drawn attention to the possibility of generating cuts using more than one row of the simplex tableau at a time. Several interesting theoretical results have been presented in this direction, often revisiting and recalling important results discovered more than 40 years ago. However, it is not at all clear how these results can be exploited in practice. As stated, the chapter is still a work in progress: it simply presents a possible way of generating two-row cuts from the simplex tableau based on lattice-free triangles, together with some preliminary computational results.
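
As a small illustration of the kind of general purpose cut discussed here, the sketch below reads a Gomory mixed integer cut off a single simplex tableau row (the standard textbook derivation; the toy data and tolerance are ours, not the thesis's):

```python
import math

def gmi_cut(row, is_integer, rhs, eps=1e-6):
    """Gomory mixed integer cut from the tableau row
    x_B + sum_j row[j] * x_j = rhs, nonbasic variables at 0.
    Returns coefficients c of the cut sum_j c[j] * x_j >= 1."""
    f0 = rhs - math.floor(rhs)
    assert eps < f0 < 1 - eps, "basic variable must take a fractional value"
    coeffs = []
    for a, integer in zip(row, is_integer):
        if integer:
            f = a - math.floor(a)
            c = f / f0 if f <= f0 else (1 - f) / (1 - f0)
        else:
            c = a / f0 if a >= 0 else -a / (1 - f0)
        coeffs.append(c)
    return coeffs

# Toy row: x_B + 0.3*x1 - 1.4*x2 + 0.5*s = 2.7, with x1, x2 integer, s continuous.
print(gmi_cut([0.3, -1.4, 0.5], [True, True, False], 2.7))
```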
The second part of the thesis is instead focused on the heuristic and exact exploitation of integer programming techniques for hard combinatorial optimization problems in the context of routing applications. Chapters 5 and 6 present an integer linear programming local search algorithm for Vehicle Routing Problems (VRPs). The overall procedure follows a general destroy-and-repair paradigm (i.e., the current solution is first randomly destroyed and then repaired in an attempt to find a new improved solution), in which a class of exponential neighborhoods is iteratively explored by heuristically solving an integer programming formulation through a general purpose MIP solver. Chapters 7 and 8 deal with exact branch-and-cut methods. Chapter 7 presents an extended formulation for the Traveling Salesman Problem with Time Windows (TSPTW), a generalization of the well known TSP in which each node must be visited within a given time window. The polyhedral approaches proposed for this problem in the literature typically follow the one that proved extremely effective in the classical TSP context. Here we present a (quite) general idea based on a relaxed discretization of time windows. This idea leads to a stronger formulation and to stronger valid inequalities, which are then separated within the classical branch-and-cut framework. Finally, Chapter 8 addresses branch-and-cut in the context of Generalized Minimum Spanning Tree Problems (GMSTPs), a class of NP-hard generalizations of the classical minimum spanning tree problem. In this chapter, we show how some basic ideas (in particular, the use of general purpose cutting planes) can help improve on the branch-and-cut methods proposed in the literature.
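
A bare-bones sketch of the destroy-and-repair loop described above (ours, not the thesis implementation; the repair step, which in the thesis heuristically solves an integer programming formulation with a MIP solver, is abstracted into a callback):

```python
import random

def destroy(solution, fraction=0.2, rng=random):
    """Randomly remove a fraction of customers from their routes."""
    removed = [c for route in solution for c in route if rng.random() < fraction]
    partial = [[c for c in route if c not in removed] for route in solution]
    return partial, removed

def destroy_and_repair(initial, cost, repair, iterations=1000, rng=random):
    """Keep the repaired solution whenever it improves on the incumbent."""
    best = initial
    for _ in range(iterations):
        partial, removed = destroy(best, rng=rng)
        candidate = repair(partial, removed)   # e.g., solve a small MIP here
        if cost(candidate) < cost(best):
            best = candidate
    return best
```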

Relevance:

30.00%

Publisher:

Abstract:

3D video-fluoroscopy is an accurate but cumbersome technique for estimating natural or prosthetic human joint kinematics. This dissertation proposes innovative methodologies to improve the reliability and usability of 3D fluoroscopic analysis. Being based on direct radiographic imaging of the joint, and avoiding the soft tissue artefacts that limit the accuracy of skin-marker-based techniques, fluoroscopic analysis has a potential accuracy of the order of mm/deg or better. It can provide fundamental information for clinical and methodological applications, but, notwithstanding the number of methodological protocols proposed in the literature, time-consuming user interaction is still required to obtain consistent results. This user-dependency has prevented a reliable quantification of the actual accuracy and precision of the methods and, consequently, has slowed down their translation to clinical practice. The objective of the present work was to speed up this process by introducing methodological improvements in the analysis. In the thesis, fluoroscopic analysis was characterized in depth, in order to evaluate its pros and cons and to provide reliable solutions to overcome its limitations. To this aim, an analytical approach was followed. The major sources of error were isolated with preliminary in-silico studies as: (a) geometric distortion and calibration errors, (b) 2D image and 3D model resolutions, (c) incorrect contour extraction, (d) bone model symmetries, (e) optimization algorithm limitations, (f) user errors. The effect of each criticality was quantified and verified with a preliminary in-vivo study on the elbow joint. The dominant source of error was identified in the limited extent of the convergence domain of the local optimization algorithms, which forced the user to manually specify the starting pose for the estimation process. To solve this problem, two different approaches were followed: to enlarge the convergence basin around the optimal pose, the local approach used sequential alignments of the 6 degrees of freedom in order of sensitivity, or a geometrical feature-based estimation of the initial conditions for the optimization; the global approach used an unsupervised memetic algorithm to optimally explore the search domain. The performance of the technique was evaluated with a series of in-silico studies and validated in-vitro with a phantom-based comparison against a radiostereometric gold standard. The accuracy of the method is joint-dependent; for the intact knee joint, the new unsupervised algorithm guaranteed a maximum error lower than 0.5 mm for in-plane translations, 10 mm for out-of-plane translations, and 3 deg for rotations in a mono-planar setup, and lower than 0.5 mm for translations and 1 deg for rotations in a bi-planar setup. The bi-planar setup is best suited when accurate results are needed, such as in methodological research studies. The mono-planar analysis may be sufficient for clinical applications where analysis time and cost are an issue. A further reduction of the user interaction was obtained for prosthetic joint kinematics. A mixed region-growing and level-set segmentation method was proposed, which halved the analysis time by delegating the computational burden to the machine. In-silico and in-vivo studies demonstrated that the reliability of the new semiautomatic method is comparable to a user-defined manual gold standard. The improved fluoroscopic analysis was finally applied to a first in-vivo methodological study on foot kinematics. Preliminary evaluations showed that the presented methodology represents a feasible gold standard for the validation of skin-marker-based foot kinematics protocols.
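
The sequential-alignment idea can be sketched as follows (a toy sketch, not the thesis code: the degree-of-freedom order and the cost function are stand-ins; a real cost would score the match between the projected 3D model and the extracted 2D contours):

```python
import numpy as np
from scipy.optimize import minimize_scalar

DOF_ORDER = ["tx", "ty", "rz", "rx", "ry", "tz"]  # hypothetical sensitivity order

def sequential_alignment(cost, pose, order=DOF_ORDER, sweeps=3, span=5.0):
    """Cyclically minimize `cost` over one pose parameter at a time."""
    pose = dict(pose)
    for _ in range(sweeps):
        for dof in order:
            res = minimize_scalar(
                lambda v: cost({**pose, dof: v}),
                bounds=(pose[dof] - span, pose[dof] + span),
                method="bounded",
            )
            pose[dof] = res.x
    return pose

# Toy demonstration with a synthetic quadratic cost whose minimum is known.
target = {"tx": 1.0, "ty": -2.0, "tz": 0.5, "rx": 0.1, "ry": -0.3, "rz": 0.7}
cost = lambda p: sum((p[k] - target[k]) ** 2 for k in p)
start = {k: 0.0 for k in target}
print(sequential_alignment(cost, start))
```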

Relevance:

30.00%

Publisher:

Abstract:

The present work is concerned with the forgotten elements of the Lebanese economy: agriculture and rural development. It investigates the main problems arising from this neglect, in particular the structure of the agricultural sector, production technology, income distribution, poverty, food security, territorial development and local livelihood strategies. It does so using quantitative Computable General Equilibrium (CGE) modeling and a qualitative phenomenological case study analysis, both embedded in a critical review of the historical development of the political economy of Lebanon and a structural analysis of its economy. The research shows that under-development in Lebanese rural areas is not due to a lack of resources, but is rather the consequence of political choices. It further suggests that agriculture, in both its mainstream conventional and its innovative locally initiated forms of production, still represents an important potential source of economic growth and development. In order to realize this potential, Lebanon has to take full advantage of its human and territorial capital by developing a rural development strategy based on two parallel sets of actions: one directed toward the support of local rural development initiatives, and the other directed toward intensive forms of production. In addition to its economic returns, such a strategy would promote social and political stability.

Relevance:

30.00%

Publisher:

Abstract:

The present work is devoted to the assessment of the physics of energy fluxes in the space of scales and in the physical space of wall-turbulent flows. The generalized Kolmogorov equation is applied to DNS data of a turbulent channel flow in order to describe the paths of the energy fluxes from production to dissipation in the augmented space of wall-turbulent flows. This multidimensional description is shown to be crucial for understanding the formation and sustainment of the turbulent fluctuations fed by the energy fluxes coming from the near-wall production region. An unexpected behavior of the energy fluxes emerges from this analysis, consisting of spiral-like paths in the combined physical/scale space in which the controversial reverse energy cascade plays a central role. The observed behavior conflicts with the classical notion of the Richardson/Kolmogorov energy cascade and may have strong repercussions on both theoretical and modeling approaches to wall-turbulence. To this aim, a new relation stating the leading physical processes governing the energy transfer in wall-turbulence is suggested and shown to be able to capture most of the rich dynamics of the shear-dominated region of the flow. Two dynamical processes are identified as driving mechanisms for the fluxes: one in the near-wall region and a second one further away from the wall. The former, stronger one is related to the dynamics involved in the near-wall turbulence regeneration cycle. The second suggests an outer self-sustaining mechanism, asymptotically expected to take place in the log-layer, which could explain the debated mixed inner/outer scaling of the near-wall statistics. The same approach is applied, for the first time, to a filtered velocity field. A generalized Kolmogorov equation specialized to filtered velocity fields is derived and discussed. The results show what effects the subgrid scales have on the resolved motion in both physical and scale space, singling out the prominent role of the filter length compared with the cross-over scale between production-dominated scales and the inertial range, lc, and with the reverse energy cascade region, lb. The systematic characterization of the resolved and subgrid physics as functions of the filter scale and of the wall-distance is shown to be instrumental for a correct use of LES models in the simulation of wall-turbulent flows. Taking inspiration from the new relation for the energy transfer in wall turbulence, a new class of LES models is also proposed. Finally, the generalized Kolmogorov equation specialized to filtered velocity fields is shown to be a helpful statistical tool for the assessment of LES models and for the development of new ones. As an example, some classical purely dissipative eddy viscosity models are analyzed via an a priori procedure.
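
For orientation, a schematic form of the generalized Kolmogorov equation for channel flow is sketched below (our notation, assembled from the standard literature rather than quoted from the thesis, so treat the exact terms as indicative): with velocity increments δu_i across a separation r, mid-point quantities marked by an asterisk, and δu² = δu_i δu_i,

\[
\frac{\partial \langle \delta u^2\, \delta u_j \rangle}{\partial r_j}
+ \frac{\partial \langle \delta u^2\, u_j^{*} \rangle}{\partial X_j}
= -2\,\langle \delta u\, \delta v \rangle \left(\frac{\mathrm{d}U}{\mathrm{d}y}\right)^{\!*}
- \frac{2}{\rho}\,\frac{\partial \langle \delta p\, \delta u_j \rangle}{\partial X_j}
+ 2\nu\, \frac{\partial^2 \langle \delta u^2 \rangle}{\partial r_j\, \partial r_j}
+ \frac{\nu}{2}\,\frac{\partial^2 \langle \delta u^2 \rangle}{\partial X_j\, \partial X_j}
- 4\langle \epsilon^{*} \rangle ,
\]

where the left-hand side collects the fluxes in the space of scales (r) and in physical space (X), and the right-hand side collects production by the mean shear, pressure transport, viscous terms and dissipation. The "energy flux paths" discussed above are the vector fields formed by these flux terms.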

Relevance:

30.00%

Publisher:

Abstract:

The present PhD thesis focused on the development and application of an analytical methodology (Py-GC-MS) together with data-processing methods based on multivariate data analysis (chemometrics). The chromatographic and mass spectrometric data obtained with this technique are particularly suitable for interpretation by chemometric methods such as PCA (Principal Component Analysis) for data exploration and SIMCA (Soft Independent Modelling of Class Analogy) for classification. As a first application, some issues related to the field of cultural heritage were addressed, with particular attention to the differentiation of binders used in paintings. A marker of egg tempera, esterified phosphoric acid, a pyrolysis product of lecithin, was determined using HMDS (hexamethyldisilazane) rather than TMAH (tetramethylammonium hydroxide) as the derivatizing reagent. The validity of analytical pyrolysis as a tool to characterize and classify different types of bacteria was then verified. The FAME chromatographic profiles represent an important tool for bacterial identification. Because of the complexity of the chromatograms, it was possible to characterize the bacteria only at the genus level, while differentiation at the species level was achieved by means of chemometric analysis. To perform this study, the normalized peak areas of the fatty acids were taken into account, and the chemometric methods were applied to the experimental datasets. The results obtained demonstrate the effectiveness of analytical pyrolysis and chemometric analysis for the rapid characterization of bacterial species. Application to samples of bacterial (Pseudomonas mendocina), fungal (Pleurotus ostreatus) and mixed biofilms was also performed. A comparison of the chromatographic profiles made it possible to:
• differentiate the bacterial and fungal biofilms according to their FAME profiles;
• characterize the fungal biofilm by means of the typical pattern of pyrolytic fragments derived from the saccharides present in the cell wall;
• identify the markers of the bacterial and fungal biofilms in the same mixed-biofilm sample.
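
A minimal sketch of the chemometric step (synthetic stand-in data, not the thesis measurements), showing PCA applied to normalized fatty-acid peak areas:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.random((12, 8))                 # 12 samples x 8 FAME peak areas (fake)
X = X / X.sum(axis=1, keepdims=True)    # normalize peak areas per sample

pca = PCA(n_components=2)
scores = pca.fit_transform(X)           # sample coordinates for exploration
print(pca.explained_variance_ratio_, scores[:3])
```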

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to apply multilevel regression models in the context of household surveys. The hierarchical structure in this type of data is characterized by many small groups. In recent years, comparative and multilevel analyses in the field of perceived health have grown in number. The purpose of this thesis is to develop a multilevel analysis with three levels of hierarchy for the Physical Component Summary outcome in order to: evaluate the magnitude of the within- and between-group variance at each level (individual, household and municipality); explore which covariates affect perceived physical health at each level; compare model-based and design-based approaches in order to establish the informativeness of the sampling design; and estimate a quantile regression for hierarchical data. The target population is Italian residents aged 18 years and older. Our study shows a high degree of homogeneity among level-1 units belonging to the same group, with an intraclass correlation of 27% in a two-level null model. Almost all of the variance is explained by level-1 covariates. In fact, in our model the explanatory variables with the greatest impact on the outcome are disability, inability to work, age and chronic diseases (18 pathologies). An additional analysis was performed using a novel estimation procedure, the Linear Quantile Mixed Model, here applied as a multilevel linear quantile regression. This gives us the possibility to describe the conditional distribution of the response more generally, through the estimation of its quantiles, while accounting for the dependence among the observations; this represents a great advantage of our models with respect to classic multilevel regression. The median regression with random effects proves to be more efficient than the mean regression in representing the central tendency of the outcome. A more detailed analysis of the conditional distribution of the response at other quantiles highlighted a differential effect of some covariates along the distribution.
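
A simplified two-level sketch of the null model and the intraclass correlation computed from it (synthetic data and statsmodels, for illustration only; the thesis works with three levels and real survey data):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_groups, n_per = 50, 8
g = np.repeat(np.arange(n_groups), n_per)
y = rng.normal(0, 1, n_groups)[g] + rng.normal(0, 1.6, n_groups * n_per)
df = pd.DataFrame({"pcs": y, "household": g})   # fake PCS outcome

null_model = smf.mixedlm("pcs ~ 1", df, groups=df["household"]).fit()
var_between = null_model.cov_re.iloc[0, 0]      # random-intercept variance
var_within = null_model.scale                   # residual variance
icc = var_between / (var_between + var_within)  # share of group-level variance
print(f"ICC = {icc:.2f}")
```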

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this research is to provide empirical evidence on the determinants of the economic use of patented inventions, in order to contribute to the literature on technology and innovation management. The work consists of three main parts, each of which constitutes a self-contained research paper. The first paper uses a meta-analytic approach to review and synthesize the existing body of empirical research on the determinants of technology licensing. The second paper investigates the factors affecting the choice among the following alternative economic uses of patented inventions: pure internal use, pure licensing, and mixed use. Finally, the third paper explores the least studied option for the economic use of patented inventions, namely the sale of patent rights. The data used to empirically test the hypotheses come from a large-scale survey of European Patent inventors resident in 21 European countries, Japan, and the US. The findings provided in this dissertation contribute to a better understanding of the economic use of patented inventions by expanding the limits of previous research in several different dimensions.

Relevance:

30.00%

Publisher:

Abstract:

The dissertation consists of five parts: an introduction, three major chapters, and a short conclusion. The first chapter starts with a survey and discussion of the literature on corporate law and financial development. The methods commonly used in these cross-sectional analyses are biased, as legal origins are no longer valid instruments; model uncertainty therefore becomes a salient problem. The Bayesian Model Averaging algorithm is applied to test the robustness of the empirical results in Djankov et al. (2008). The analysis finds that their constructed legal index is not robustly correlated with most of the stock market outcome variables. The second chapter looks into the effects of minority shareholder protection in the corporate governance regime on entrepreneurs' ex ante incentives to undertake an IPO. Most of the current literature focuses on the beneficial effect of minority shareholder protection on valuation, while overlooking its private costs for the entrepreneur's control. As a result, when minority shareholder protection improves, the entrepreneur trades off the costs of monitoring against the benefits of cheap sources of finance. The theoretical predictions are empirically tested using panel data and a GMM-sys estimator. The third chapter investigates corporate law and corporate governance reform in China. Corporate law in China regards shareholder control as the means to the end of pursuing the interests of stakeholders, which is inefficient. The chapter combines recent developments in theories of the firm, i.e., the team production theory and the property rights theory, to address this problem. The enlightened shareholder value approach, which emphasizes the long-term valuation of the firm, should be adopted as the objective of listed firms. In addition, a move from the mandatory division of power between the shareholder meeting and the board meeting to a default regime is proposed.
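
A toy sketch of BIC-based Bayesian Model Averaging of the kind referred to above (synthetic data and invented variable names, not the Djankov et al. dataset): every subset of regressors is fit by OLS, each model is weighted by exp(-BIC/2), and posterior inclusion probabilities are accumulated per variable.

```python
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, names = 200, ["legal_index", "gdp", "openness"]
X = rng.normal(size=(n, len(names)))
y = 0.8 * X[:, 1] + rng.normal(size=n)        # only "gdp" truly matters here

models = []
for k in range(len(names) + 1):
    for subset in itertools.combinations(range(len(names)), k):
        Xs = sm.add_constant(X[:, list(subset)]) if subset else np.ones((n, 1))
        models.append((subset, sm.OLS(y, Xs).fit().bic))

bics = np.array([b for _, b in models])
weights = np.exp(-(bics - bics.min()) / 2)    # shift BICs for numerical stability
weights /= weights.sum()

inclusion = np.zeros(len(names))
for (subset, _), w in zip(models, weights):
    for j in subset:
        inclusion[j] += w                      # posterior inclusion probability
print(dict(zip(names, inclusion.round(2))))
```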

Relevance:

30.00%

Publisher:

Abstract:

The aim of this research is the development and validation of a comprehensive multibody motorcycle model featuring rigid-ring tires, taking into account both the slope and the roughness of road surfaces. A novel parametrization of the general kinematics of the motorcycle is proposed, using a mixed reference-point and relative-coordinates approach. The resulting description, developed in terms of dependent coordinates, makes it possible to efficiently include rigid-ring kinematics as well as road elevation and slope. The equations of motion of the multibody system are derived symbolically, and the constraint equations arising from the dependent-coordinate formulation are handled using a projection technique. The resulting system of equations can therefore be integrated in the time domain using a standard ODE algorithm. The model is validated against maneuvers experimentally measured on the race track, showing consistent results and excellent computational efficiency. In particular, the model is capable of reproducing the chatter vibration of racing motorcycles. The chatter phenomenon, which appears during high-speed cornering maneuvers, consists of a self-excited vertical oscillation of both the front and rear unsprung masses in the frequency range between 17 and 22 Hz. A critical maneuver is numerically simulated, and a self-excited vibration appears, consistent with the experimentally measured chatter vibration. Finally, the driving mechanism of the self-excitation is highlighted and a physical interpretation is proposed.
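
As a schematic illustration of the projection technique mentioned above (our notation, following the usual null-space formulation for multibody systems; the thesis's actual formulation may differ in detail): with dependent coordinates q, constraints Φ(q) = 0, and a matrix S(q) whose columns span the null space of the constraint Jacobian Φ_q,

\[
M(q)\,\ddot{q} = f(q,\dot{q},t) + \Phi_q^{\top}\lambda , \qquad \Phi(q) = 0 ,
\]
\[
\dot{q} = S(q)\,v , \qquad
\left(S^{\top} M S\right)\dot{v} = S^{\top}\!\left(f - M\,\dot{S}\,v\right).
\]

Since Φ_q S = 0, the Lagrange multipliers λ drop out, and the reduced system can be integrated in the time domain with a standard ODE algorithm, as stated above.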

Relevance:

30.00%

Publisher:

Abstract:

This work deals with the car sequencing (CS) problem, a combinatorial optimization problem for sequencing mixed-model assembly lines. The aim is to find a production sequence for different variants of a common base product such that work overload of the respective line operators is avoided or minimized. The variants are distinguished by certain options (e.g., sun roof yes/no) and therefore require different processing times at the stations of the line. CS introduces a so-called sequencing rule H:N for each option, which restricts the occurrence of this option to at most H in any N consecutive variants. It seeks a sequence that leads to no, or a minimum number of, sequencing rule violations. In this work, the suitability of CS for workload-oriented sequencing is analyzed; its solution quality is therefore compared in experiments with that of the related mixed-model sequencing problem. A new sequencing rule generation approach as well as a new lower bound for the problem are presented. Different exact and heuristic solution methods for CS are developed and their efficiency is demonstrated in experiments. Furthermore, CS is adjusted and applied to a resequencing problem with pull-off tables.
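
A minimal sketch of how H:N rule violations can be counted for a given sequence (option names and rule values below are invented examples):

```python
def count_violations(sequence, rules):
    """sequence: list of option sets, one per variant in production order;
    rules: {option: (H, N)} meaning at most H occurrences of the option
    in any N consecutive variants. Returns the number of violated windows."""
    violations = 0
    for option, (H, N) in rules.items():
        for start in range(len(sequence) - N + 1):
            window = sequence[start:start + N]
            if sum(option in variant for variant in window) > H:
                violations += 1
    return violations

seq = [{"sunroof"}, set(), {"sunroof"}, {"sunroof"}, set()]
print(count_violations(seq, {"sunroof": (1, 2)}))  # one window holds 2 sunroofs
```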

Relevance:

30.00%

Publisher:

Abstract:

The European eel is a euryhaline catadromous species with a complex life cycle: its single spawning area lies very far from its distribution area. The species requires stock management for conservation purposes. The problem is a European one: the stock is single, distributed across Europe and North Africa, spawns in the Atlantic, and is panmictic. There is concern about the decline in recruitment and in catches of adults. The aim of the project is to identify possible stock units along the Italian peninsula. The research is based on the study of otoliths by means of morphometric and microchemical analyses. The otolith contours were subjected to elliptic Fourier analysis to identify possible groups. The otoliths were ground and polished in order to: read ages; perform SEM microstructural investigations of the larval phases; carry out LA-ICP-MS microchemical analyses of the core; study the origin of the individuals; and assess their environment of development. The morphometric investigations show clear ontogenetic patterns, but none related to locality, sex or year of birth. The microstructural investigations revealed a high organic content in the core, a common growth pattern, and key events of the larval phases, with an average of 212 daily rings. Microchemistry reveals that the larvae develop in salt water until metamorphosis and then migrate towards less saline waters. Analyses of specimens born in the same year reveal two groups: naturally recruited individuals and restocked individuals. Core-to-edge profiles show that the adults remain at intermediate salinities. The research proved fruitful from a technical point of view, with the development of innovative protocols that strongly reduce analysis times and costs. The weak signal of possible stock units will have to be verified in the future through more detailed analyses that better discriminate the history of each individual.
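
As an illustration of the contour step, the sketch below computes normalized elliptic Fourier descriptors with the third-party pyefd package on a synthetic contour (a stand-in for a digitized otolith outline):

```python
import numpy as np
from pyefd import elliptic_fourier_descriptors  # third-party package "pyefd"

# Synthetic closed contour: an ellipse sampled as (x, y) points.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
contour = np.column_stack([5.0 * np.cos(t), 3.0 * np.sin(t)])

# Normalized coefficients are invariant to size, rotation and starting point,
# so they can be compared across individuals (e.g., by PCA or MANOVA).
coeffs = elliptic_fourier_descriptors(contour, order=10, normalize=True)
print(coeffs.shape)  # (10, 4): harmonics a_n, b_n, c_n, d_n
```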

Relevance:

30.00%

Publisher:

Abstract:

Over time, Twitter has become a fundamental source of information for news. As a step forward, researchers have tried to analyse whether tweets contain predictive power. In the financial field, a lot of past research has been devoted to proposing functions that take as input all the tweets for a particular stock or index s, analyse them, and predict the stock or index price of s. In this work, we take an alternative approach: using stock price and tweet information, we investigate the following questions. 1. Is there any relation between the amount of tweets being generated and the volume of stock being exchanged? 2. Is there any relation between the sentiment of the tweets and stock prices? 3. What is the structure of the graph that describes the relationships between users?
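
Question 1 can be sketched as a simple correlation between daily tweet counts and daily traded volume (both series below are synthetic stand-ins, not real market or Twitter data):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
days = pd.date_range("2024-01-01", periods=60, freq="B")
tweets = pd.Series(rng.poisson(200, len(days)), index=days)        # fake counts
volume = pd.Series(tweets * 50 + rng.normal(0, 800, len(days)), index=days)

print(tweets.corr(volume))            # same-day Pearson correlation
print(tweets.shift(1).corr(volume))   # does yesterday's chatter lead today?
```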

Relevance:

30.00%

Publisher:

Abstract:

The atmospheric cycle of reactive nitrogen compounds concerns both natural scientists and policy makers. This is mainly because reactive nitrogen oxides control the formation of ground-level ozone. Reactive nitrogen compounds also play an important role as gaseous precursors of fine particulate matter, and the long-range transport of reactive nitrogen alters the planet's biogeochemical carbon cycle by fertilizing remote ecosystems with nitrogen. Measurements of stable nitrogen isotope ratios (15N/14N) provide a tool for identifying the sources of reactive nitrogen compounds and for studying the reactions involved in the nitrogen cycle in more detail via their reaction-specific isotope fractionation.
In this doctoral thesis, I demonstrate that nano-scale secondary ion mass spectrometry (NanoSIMS) makes it possible to analyze and identify, at a spatial resolution of less than one micrometer, various nitrogen-containing compounds that typically occur in atmospheric aerosol particles. The different nitrogen-containing compounds are distinguished on the basis of the relative signal intensities of the positive and negative secondary ion signals observed when the particle samples are bombarded with a Cs+ or O- primary ion beam. The particle samples can be introduced into the mass spectrometer directly on the sampling substrate, without chemical or physical pre-treatment. The method was tested on nitrate, nitrite, ammonium sulfate, urea, amino acids, biological particle samples (fungal spores) and imidazole. I showed that NO2 secondary ions are produced only upon bombardment of nitrate and nitrite (salts) with positive primary ions, whereas NH4+ secondary ions are released only upon bombardment of amino acids, urea and ammonium salts with positive primary ions, but not upon bombardment of biological samples such as fungal spores. CN- secondary ions are observed upon bombardment of all nitrogen-containing compounds with positive primary ions, since almost all samples carry traces of carbon contamination near the surface. The relative signal intensity of the CN- secondary ions is highest for carbon-containing organic nitrogen compounds.
Furthermore, I showed that species-specific stable nitrogen isotope ratios can be measured accurately and precisely on pure nitrate salt samples (NaNO3 and KNO3) deposited on gold foils, using the 15N16O2- / 14N16O2- secondary ion ratio. The measurement precision on fields with a raster size of 5×5 µm2, determined through long-term measurements of an in-house NaNO3 standard, was ± 0.6 ‰. The difference in matrix-specific instrumental mass fractionation between NaNO3 and KNO3 was 7.1 ± 0.9 ‰. 23Na12C2- secondary ions can be a serious interference when 15N16O2- secondary ions are used to measure nitrate-specific heavy nitrogen and sodium and carbon occur as an internal mixture in the same particle, or when the sodium-containing sample is deposited on a carbon-containing substrate.
Even when, as in the case of KNO3, no such interference is present, an internal mixture with carbon in the same particle leads to a matrix-specific instrumental mass fractionation that can be described by the following equation: 15Nbias = (101 ± 4) ∙ f − (101 ± 3) ‰, with f = 14N16O2- / (14N16O2- + 12C14N-).
If the 12C15N- / 12C14N- secondary ion ratio is used to measure the stable nitrogen isotope composition, the sample matrix does not influence the results, even when nitrogen and carbon occur in the particles at variable N/C ratios, and interferences play no role either. To ensure that the measurement remains specific to nitrate species, a 14N16O2- mask can be applied during data evaluation. Collecting the samples on a carbon-containing, nitrogen-free sampling substrate increases the signal intensity for pure nitrate particles.
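
A small sketch of how such measurements are reduced (the reference ratio is the standard atmospheric value; the count numbers are invented, and the bias correction is the regression quoted above):

```python
# delta-15N from measured secondary ion ratios, plus the matrix-bias
# correction quoted above: 15Nbias = 101*f - 101 permil,
# with f = 14N16O2- / (14N16O2- + 12C14N-).

R_AIR = 0.0036765  # 15N/14N of atmospheric N2, the usual reference value

def delta15N(ratio_sample, ratio_ref=R_AIR):
    """delta-15N in permil relative to the reference ratio."""
    return (ratio_sample / ratio_ref - 1.0) * 1000.0

def matrix_bias(no2_counts, cn_counts):
    """Carbon-mixing bias from the regression reported in the text
    (the stated uncertainties of +/-4 and +/-3 permil are omitted)."""
    f = no2_counts / (no2_counts + cn_counts)
    return 101.0 * f - 101.0   # permil

measured = delta15N(0.003695)                               # invented ratio
corrected = measured - matrix_bias(no2_counts=8e4, cn_counts=2e4)
print(round(measured, 1), round(corrected, 1))
```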

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis work was to analyze otolith samples of the two species of the genus Mullus (Mullus barbatus and Mullus surmuletus) by means of Elliptic Fourier Analysis (EFA) and classical morphometric analysis with shape indices, in order to verify the symmetry between the right and left otolith within each Mullus species and whether the shape varies with the size of the individual. EFA made it possible to compare otolith shapes through multiple comparisons by side, sex and size class; a comparison between the otolith shapes of the two species was also carried out. The EFA analyses also made it possible to assess whether the collected specimens all belonged to the same stock or to different stocks. The otoliths belong to red mullet specimens caught during the MEDITS 2012 experimental survey. For Mullus surmuletus, given the modest number of samples, otoliths from the MEDITS 2014 and GRUND 2002 surveys were also analyzed. The samples were cleaned and analyzed under a stereomicroscope equipped with a camera connected to a PC running image analysis software. From the classical morphometric analyses on the otoliths of the two species it can be concluded that, in general, there is symmetry between the right and left otolith. The EFA analyses instead found significant differences in all comparisons, including the comparison between the two species; the samples, however, appear to belong to the same stock. In conclusion, the classical morphometric analysis gave results consistent with expectations, while the EFA results highlighted significant differences that demonstrate its superior discriminating power. The particular sensitivity of contour analysis demands rigorous quality control during shape acquisition.
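
A minimal sketch of the classical shape indices mentioned above (definitions as commonly used in otolith morphometry; the measurements are invented examples, in mm and mm²):

```python
import numpy as np

def shape_indices(area, perimeter, length, width):
    """Standard otolith shape indices from basic size measurements."""
    return {
        "form_factor": 4 * np.pi * area / perimeter**2,
        "roundness": 4 * area / (np.pi * length**2),
        "circularity": perimeter**2 / area,
        "rectangularity": area / (length * width),
        "ellipticity": (length - width) / (length + width),
        "aspect_ratio": length / width,
    }

left = shape_indices(area=12.4, perimeter=14.8, length=5.1, width=3.2)
right = shape_indices(area=12.1, perimeter=14.5, length=5.0, width=3.2)
# Symmetry check: paired differences of each index between the two sides.
print({k: round(left[k] - right[k], 3) for k in left})
```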