862 results for machine tools and accessories


Abstract:

Biobanks represent key resources for clinico-genomic research and are needed to pave the way to personalised medicine. To achieve this goal, it is crucial that scientists can securely access and share high-quality biomaterial and related data. There is therefore a growing interest in integrating biobanks into larger biomedical information and communication technology (ICT) infrastructures. The European project p-medicine is currently building an innovative ICT infrastructure to meet this need. This platform provides tools and services for conducting research and clinical trials in personalised medicine. In this paper, we describe one of its main components, the biobank access framework p-BioSPRE (p-medicine Biospecimen Search and Project Request Engine). This generic framework enables and simplifies access to existing biobanks, allows institutions to offer their own biomaterial collections to research communities, and supports management of biobank specimens and related clinical data through the ObTiMA Trial Biomaterial Manager. p-BioSPRE takes into consideration all relevant ethical and legal standards, e.g., safeguarding donors' personal rights and enabling biobanks to keep control over the donated material and related data. The framework thus enables secure sharing of biomaterial within open and closed research communities, while flexibly integrating related clinical and omics data. Although the development of the framework is mainly driven by user scenarios from the cancer domain, in this case acute lymphoblastic leukaemia and Wilms tumour, it can be extended to further disease entities.

Abstract:

Online advertising has been growing rapidly since the mid-90s. In recent years, online advertising has become as relevant an advertising medium as print and television. However, despite this growth, the general consensus is that consumers are not interested in online advertisements. The arrival of online ad blocking tools has offered consumers a very effective way to avoid and block online advertisements. Although these tools have now been around for several years and the most popular programs have hundreds of millions of active users, the phenomenon of ad blocking has gathered surprisingly little attention from academic marketing research. For this reason, ad blocking was chosen as the topic of this thesis. The researcher was particularly interested in the reasons behind the usage of online ad blocking tools. The objective of the empirical part of this study is to provide new and valid information regarding the reasons behind the usage of online ad blocking tools. The empirical research consisted of a survey that mixed both quantitative and qualitative elements. Although the sample size of the study was fairly limited, the study provided useful answers to the individual research questions and new knowledge regarding the reasons behind the usage of online ad blocking and the phenomenon of ad blocking as a whole. The study provides further evidence that consumers are aware of online ad blocking tools and that a significant portion of them are interested in using them. It also indicates that many of the consumers who do not have previous knowledge of these tools would actually be interested in them. The reasons for the usage of online ad blocking tools vary from institutional reasons to instrumental reasons, such as the poor quality of online advertisements. However, there were participants who were not interested in using online ad blocking tools, and some of them reported finding online advertising useful. There also seems to be a concern among some consumers that online ad blocking will have a negative effect on online content in the future if ad blocking keeps growing as a phenomenon. This may well be the case, and it will be interesting to see how the development of online ad blocking shapes online advertising, how the advertising industry responds to the growth of online ad blocking, and how consumers respond to these changes.

Abstract:

Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data in such a way that the participants do not see each other's data; they only see the final output. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes repeatedly, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker -- the game will be divided into rounds of local decision-making (e.g. bidding) and joint interaction (e.g. dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language, while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications, or they lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run the MPC programs, leaving the potential for security holes that can compromise the privacy of parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC Domain Specific Language, called Wysteria, for writing rich mixed-mode MPC applications. Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured, verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) it enables programmers to formally verify the correctness and security properties of their programs -- as far as we know, Wys* is the first language to provide verification capabilities for MPC programs; (b) it provides a partially verified toolchain to run MPC programs; and (c) it enables MPC programs to use, with no extra effort, standard language constructs from the host language F*, thereby making it more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs, while providing privacy guarantees similar to those of the monolithic versions.
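As a purely illustrative sketch of the kind of computation MPC enables (emphatically not Wysteria or Wys* syntax, and not a real cryptographic protocol), additive secret sharing shows how a joint statistic can be computed without any party revealing its input; the modulus, party count, and input values below are hypothetical:

```python
import random

PRIME = 2**61 - 1  # arbitrary large modulus chosen for this toy example

def share(secret, n_parties):
    """Split a secret into n additive shares that sum to the secret mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def joint_sum(private_inputs):
    """Each party shares its input; each party locally adds the shares it holds,
    and only the combined partial sums (i.e., the total) are revealed."""
    n = len(private_inputs)
    all_shares = [share(x, n) for x in private_inputs]
    # party i holds the i-th share of every input and adds them locally
    partials = [sum(all_shares[j][i] for j in range(n)) % PRIME for i in range(n)]
    return sum(partials) % PRIME

if __name__ == "__main__":
    salaries = [52_000, 61_000, 47_000]   # hypothetical private inputs
    print(joint_sum(salaries))            # prints 160000; no single salary is revealed
```

In a mixed-mode program of the kind the dissertation targets, a call like joint_sum above would correspond to a "secure" block, while bidding or bookkeeping logic would run in per-party local mode.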

Abstract:

Increasingly, the main objectives in industry are low-cost production, maximum quality, and the shortest possible manufacturing time. To reach this goal, industry frequently turns to computer numerical control (CNC) machines, since this technology makes it possible to achieve high precision and shorter processing times. CNC machine tools can be applied to different machining processes, such as turning, milling and drilling, among others. Of all these processes, milling is the most widely used because of its versatility. It is normally used to machine metallic materials such as steel and cast iron. In this work, the effects of varying four milling parameters (cutting speed, feed rate, radial depth of cut and axial depth of cut), individually and through the interaction between some of them, on the surface roughness of a hardened steel (12738 steel) are analysed. Two optimisation methods are used for this analysis: the Taguchi method and the response surface method. The first, the Taguchi method, was used to reduce the number of possible combinations and, consequently, the number of tests to be carried out. The response surface method, or response surface methodology (RSM), was used in order to compare its results with those of the Taguchi method; according to some works in the specialised literature, RSM converges more quickly to an optimum value. The Taguchi method is well known in industry, where it is used for quality control. It introduces useful concepts, such as robustness and quality loss, and is very helpful for identifying variation in the production system during the industrial process, quantifying that variation and making it possible to eliminate undesirable factors. With this method, an L16 orthogonal array was built, two levels were defined for each parameter, and sixteen tests were carried out. After each test, the surface roughness of the part was measured. Based on the roughness measurements, a statistical analysis of variance (ANOVA) was performed to determine the influence of each parameter on surface roughness. The minimum roughness measured was 1.05 µm. The contribution of each machining parameter and of their interactions was also determined in this study. Analysis of the F-ratio values (ANOVA) shows that the most important factors for minimising surface roughness are the radial depth of cut and the interaction between radial and axial depth of cut, with contributions of about 30% and 24%, respectively. In a second stage, the same study was carried out using the response surface method, in order to compare the results of the two methods and determine which optimisation method is better for minimising roughness. Response surface methodology is based on a set of mathematical and statistical techniques that are useful for modelling and analysing problems in which the response of interest is influenced by several variables and whose objective is to optimise that response. For this method only five tests were carried out, unlike Taguchi, since in just five tests roughness values lower than the average roughness obtained with the Taguchi method were achieved.
The lowest value obtained with this method was 1.03 µm. It is therefore concluded that RSM is a more suitable optimisation method than Taguchi for the tests carried out. Better results were obtained with a smaller number of tests, which implies less tool wear, shorter processing time, and a significant reduction in the material used.
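As a rough illustration of the analysis described above, the sketch below computes main-effect percentage contributions for a two-level, four-factor design (16 runs, as in an L16 array); the roughness values are hypothetical placeholders rather than the thesis data, and the real study also assessed interaction terms:

```python
import itertools
import numpy as np

# Two-level full factorial for four factors (16 runs, equivalent to an L16 layout here):
# columns = cutting speed, feed rate, radial depth, axial depth (coded -1 / +1)
design = np.array(list(itertools.product([-1, 1], repeat=4)))

# Hypothetical measured surface roughness (µm) for the 16 runs -- placeholder data
Ra = np.array([1.40, 1.32, 1.25, 1.18, 1.36, 1.29, 1.21, 1.13,
               1.30, 1.24, 1.15, 1.09, 1.27, 1.20, 1.12, 1.07])

total_ss = ((Ra - Ra.mean()) ** 2).sum()

# For a balanced two-level design with N runs, the main-effect sum of squares is
# SS_factor = N/4 * (mean(high level) - mean(low level))^2
factors = ["cutting speed", "feed rate", "radial depth", "axial depth"]
for j, name in enumerate(factors):
    high = Ra[design[:, j] == 1].mean()
    low = Ra[design[:, j] == -1].mean()
    ss = len(Ra) / 4 * (high - low) ** 2
    print(f"{name:15s} contribution: {100 * ss / total_ss:5.1f} %")
# Note: contributions do not sum to 100% -- interactions and residual error are omitted.
```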

Abstract:

Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved upon. To further uncover how these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes about the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize a purely statistical or purely visual approach for comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery. Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges with running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts.
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
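A minimal sketch of the statistical side of this approach, assuming hypothetical per-record metrics for two cohorts: run one nonparametric test per metric and control the false discovery rate across the resulting battery of tests. This illustrates the general idea of testing many cohort-comparison metrics at once, not the dissertation's actual HVHT implementation:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Hypothetical per-record metrics for two cohorts of event sequences
metrics = {
    "events_per_record":    (rng.poisson(12, 200), rng.poisson(15, 180)),
    "record_duration_days": (rng.exponential(5, 200), rng.exponential(5, 180)),
    "distinct_event_types": (rng.poisson(4, 200), rng.poisson(4, 180)),
}

# One Mann-Whitney U test per metric
pvals = []
for name, (a, b) in metrics.items():
    _, p = mannwhitneyu(a, b, alternative="two-sided")
    pvals.append((name, p))

# Benjamini-Hochberg step-up procedure at FDR = 0.05
alpha, m = 0.05, len(pvals)
pvals.sort(key=lambda item: item[1])
cutoff = 0
for rank, (name, p) in enumerate(pvals, start=1):
    if p <= alpha * rank / m:
        cutoff = rank          # largest rank whose p-value passes its threshold
for rank, (name, p) in enumerate(pvals, start=1):
    verdict = "significant" if rank <= cutoff else "not significant"
    print(f"{name:22s} p = {p:.4g}  -> {verdict}")
```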

Abstract:

Does European integration influence national social policies? What is the use of EU laws, orientations and guidelines? Based on a systematic comparison of ten national cases, including both old and new member states and representing all families of welfare regimes, this volume explores and specifies the mechanisms through which the EU plays a role in domestic social policy changes. It focuses on where, when and how national actors use the tools and resources offered by the process of European integration to support the national welfare reforms they are engaged in. The comprehensive research design and the systematic comparisons provide a unique opportunity to fully grasp the mechanisms of domestic welfare state change within the context of the European Union's multilevel political system. This book represents a new step within both the Europeanization and welfare state literatures. It confirms the idea that Europe matters in a differential way, since EU social policy will be selectively used by domestic political actors in accordance with their political preferences. It provides a clear explanation of why no EU-induced social policy change can occur without overall support from key domestic decision-makers. (Publisher's abstract)

Abstract:

The European shellfish industry enjoys a privileged position on the global scene. Its social dimension is essential, as it employs a high number of people in more than 8000 companies, mostly micro-companies. Shellfish production in Europe is not very diversified and relies mainly on industrially produced mussels, oysters and clams. Over recent years, this sector has grown more slowly than other fish farming sectors, notably because it depends a great deal on environmental quality and is affected by the emergence of diseases. Mortality events, linked to pathogenic organisms such as viruses, bacteria and parasites (protozoa), tend to weaken the production's sustainability. In this context, the European project VIVALDI (PreVenting and mItigating farmed biVALve DIseases) aims at increasing the sustainability and competitiveness of the shellfish industry in Europe by developing tools and approaches for better preventing and controlling marine bivalve diseases. VIVALDI is a four-year European Horizon 2020 project coordinated by Ifremer (2016-2020): 21 mostly European, public and private partners are involved, representing the diversity of the European shellfish industry landscape.

Abstract:

Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
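As a sketch of the ensemble feature selection idea mentioned above (not ArrayMining.net's actual implementation), one simple strategy scores each feature with several selectors and aggregates the resulting rankings; the data here are a synthetic stand-in for a samples-by-genes expression matrix:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import f_classif, mutual_info_classif

# Synthetic stand-in for a microarray expression matrix: 60 samples x 500 "genes"
X, y = make_classification(n_samples=60, n_features=500, n_informative=20, random_state=0)

def ranks(scores):
    """Convert importance scores into ranks (0 = most important)."""
    return np.argsort(np.argsort(-scores))

# Three different selectors, each producing a ranking of the features
f_scores, _ = f_classif(X, y)
mi_scores = mutual_info_classif(X, y, random_state=0)
rf_scores = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y).feature_importances_

# Ensemble: aggregate by mean rank across selectors and keep the top features
mean_rank = np.mean([ranks(f_scores), ranks(mi_scores), ranks(rf_scores)], axis=0)
top_genes = np.argsort(mean_rank)[:10]
print("Top 10 features by ensemble rank:", top_genes)
```

Rank aggregation is only one of several possible combination rules; the point is that features favoured by multiple, methodologically different selectors tend to be more robust choices for downstream classification.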

Abstract:

Climate change communication has become a salient topic in science and society. It has grown into something like a booming industry alongside more established ‘communication enterprises’, such as health communication, risk communication, and science communication. This article situates the theory of climate change communication within theoretical developments in the field of science communication. It discusses the importance and difficulties inherent in talking about climate change to different types of publics using various types of communication tools and strategies. It engages with the difficult issue of the relationship between climate change communication and behavior change, and it focuses, in particular, on the role of language (metaphors, words, strategies, frames, and narratives) in conveying climate change issues to stakeholders. In the process, it attempts to provide an overview of emerging theories of climate change communication, theories that have recently begun to proliferate quite dramatically. In some cases, therefore, we can only provide signposts to the most relevant research being carried out on climate change communication, without being able to engage with all its aspects. We end with an assessment of how communication could be improved in light of the theories and practices discussed in this article.

Abstract:

Fungal fruit rots and insect pests are among the most important problems negatively affecting the yield and quality of mid-Atlantic wine. In pathogenicity trials of fungi recovered from diseased Chardonnay and Vidal blanc grapes, Alternaria alternata, Pestalotiopsis telopeae, and Aspergillus japonicus were found to be previously unreported fruit rot pathogens in the region. Additionally, P. telopeae and A. japonicus had virulence comparable to that of the region's common fruit rot pathogens. Furthermore, a timed-exclusion field study was implemented to evaluate vineyard insect-fruit rot relationships. Clusters exposed to early-season insect communities that included Paralobesia viteana had a significantly greater incidence of sour rot than clusters protected from insects all season. These results are contrary to the current assumption that fall insects are the primary drivers of sour rot in the region. This research provides diagnostic tools and information to develop management strategies against fungal and insect pests for mid-Atlantic grape growers.

Abstract:

Background: The grooved carpet shell clam Ruditapes decussatus is the autochthonous European clam and the most appreciated from a gastronomic and economic point of view. Production is in decline due to several factors, such as perkinsiosis and habitat invasion and competition from the introduced exotic species, the Manila clam Ruditapes philippinarum. After sequencing the R. decussatus transcriptome, we designed an oligo microarray that can help provide some clues about the molecular response of the clam to perkinsiosis. Results: A database consisting of 41,119 unique transcripts was constructed, of which 12,479 (30.3%) were annotated by similarity. An oligo-DNA microarray platform was then designed and applied to profile gene expression in R. decussatus heavily infected by Perkinsus olseni. Functional annotation of genes differentially expressed between these two conditions was performed by gene set enrichment analysis. As expected, the microarrays revealed genes related to stress and infectious agents, such as hydrolases and proteases, among others. The extensive role of the innate immune system was also analysed, and the effect of parasitosis on the expression of important molecules such as lectins was reviewed. Conclusions: This study represents a first attempt to characterize the Ruditapes decussatus transcriptome, an important marine resource for European aquaculture. The transcriptome sequencing and subsequent annotation will increase the tools and resources available for this species, introducing the possibility of high-throughput experiments such as microarray analyses. In this specific case, the microarray approach was used to unveil some important aspects of the host-parasite interaction between the carpet shell clam and Perkinsus, two non-model species, highlighting some genes associated with this interaction. Ample information was obtained to identify biological processes significantly enriched among differentially expressed genes in Perkinsus-infected versus non-infected gills. An overview of the genes related to the immune system in the R. decussatus transcriptome is also reported.
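As a minimal illustration of the gene set enrichment step mentioned above (with made-up counts, not the study's actual numbers), a hypergeometric test asks whether a functional category is over-represented among the differentially expressed transcripts:

```python
from scipy.stats import hypergeom

# Hypothetical counts
N = 12479   # annotated transcripts on the array
K = 310     # transcripts annotated to one gene set (e.g., an immune-related category)
n = 850     # transcripts differentially expressed in infected vs. non-infected gills
k = 46      # differentially expressed transcripts that belong to the gene set

# P(X >= k) under the hypergeometric null: the enrichment p-value for this gene set
p_enrich = hypergeom.sf(k - 1, N, K, n)
print(f"Enrichment p-value: {p_enrich:.3g}")
```

In practice this test would be repeated for every annotated category, followed by a multiple-testing correction across categories.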

Abstract:

Several modern-day cooling applications require the incorporation of mini/micro-channel shear-driven flow condensers. There are several design challenges that need to be overcome in order to meet those requirements. The difficulty in developing effective design tools for shear-driven flow condensers is exacerbated by the lack of a bridge between physics-based modelling of condensing flows and the current, popular approach based on semi-empirical heat transfer correlations. One of the primary contributors to this disconnect is that typical heat transfer correlations eliminate the dependence of the heat transfer coefficient on the method of cooling employed on the condenser surface, when that dependence may very well exist. This is in direct contrast to direct physics-based modeling approaches, where the thermal boundary conditions have a direct and large impact on the heat transfer coefficient values. Typical heat transfer correlations instead introduce vapor quality as one of the variables on which the value of the heat transfer coefficient depends. This study shows how, under certain conditions, a heat transfer correlation from direct physics-based modeling can be equivalent to typical engineering heat transfer correlations without making the same a priori assumptions. Another large factor that raises doubts about the validity of heat transfer correlations is the opacity associated with the application of flow regime maps for internal condensing flows. It is well known that flow regimes strongly influence heat transfer rates. However, several heat transfer correlations ignore flow regimes entirely and present a single heat transfer correlation for all flow regimes. This is believed to be inaccurate, since one would expect significant differences in the heat transfer correlations for different flow regimes. Several other studies present a heat transfer correlation for a particular flow regime; however, they ignore the method by which the extent of that flow regime is established. This thesis provides a definitive answer (in the context of stratified/annular flows) to: (i) whether a heat transfer correlation can always be independent of the thermal boundary condition and represented as a function of vapor quality, and (ii) whether a heat transfer correlation can be independently obtained for a flow regime without knowing the flow regime boundary (even if the flow regime boundary is represented through a separate and independent correlation). To obtain the results required to arrive at an answer to these questions, this study uses two numerical simulation tools: the approximate but highly efficient Quasi-1D simulation tool and the exact but more expensive 2D Steady Simulation tool. Using these tools and approximate values of flow regime transitions, a deeper understanding of the current state of knowledge in flow regime maps and heat transfer correlations for shear-driven internal condensing flows is obtained. The ideas presented here can be extended to other flow regimes of shear-driven flows as well. Analogous correlations can also be obtained for internal condensers in gravity-driven and mixed-driven configurations.
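To illustrate what "vapor quality as one of the variables" means in practice, the sketch below evaluates one widely cited quality-based correlation, the Shah (1979) correlation for in-tube condensation. The fluid properties and flow conditions are hypothetical placeholders, and the thesis questions precisely when such a boundary-condition-independent form is justified:

```python
def h_shah_1979(x, G, D, k_l, mu_l, cp_l, p_reduced):
    """Shah (1979) condensation correlation: heat transfer coefficient as a
    function of vapor quality x (illustrative use only)."""
    Re_lo = G * D / mu_l                              # liquid-only Reynolds number
    Pr_l = cp_l * mu_l / k_l                          # liquid Prandtl number
    h_lo = 0.023 * Re_lo**0.8 * Pr_l**0.4 * k_l / D   # Dittus-Boelter, all-liquid flow
    return h_lo * ((1 - x)**0.8 + 3.8 * x**0.76 * (1 - x)**0.04 / p_reduced**0.38)

# Hypothetical refrigerant-like liquid properties in a small channel (placeholder values)
for x in (0.2, 0.5, 0.8):
    h = h_shah_1979(x, G=400.0, D=3e-3, k_l=0.08, mu_l=2.0e-4, cp_l=1400.0, p_reduced=0.25)
    print(f"x = {x:.1f}: h = {h:,.0f} W/m^2-K")
```

Note that nothing in this functional form refers to the thermal boundary condition on the condenser wall, which is exactly the assumption the thesis examines.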

Abstract:

Intracochlear trauma from surgical insertion of bulky electrode arrays and inadequate pitch perception are areas of concern with current hand-assembled commercial cochlear implants. Parylene thin-film arrays with higher electrode densities and lower profiles are a potential solution, but they lack rigidity and hence depend on bulky, manually fabricated, permanently attached backing devices based on polyethylene terephthalate (PET) tubing. As a solution, we investigated a new backing device with two sub-systems. The first sub-system is a thin poly(lactic acid) (PLA) stiffener that will be embedded in the parylene array. The second sub-system is an attaching and detaching mechanism, utilizing a poly(N-vinylpyrrolidone)-block-poly(d,l-lactide) (PVP-b-PDLLA) copolymer-based biodegradable and water-soluble adhesive, that will allow the PET insertion tool to be retracted after implantation. As a proof of concept of sub-system one, a microfabrication process for patterning PLA stiffeners embedded in parylene has been developed. Conventional hot embossing, mechanical micromachining, and standard cleanroom processes were integrated for patterning fully released and discrete stiffeners coated with parylene. The released embedded stiffeners were thermoformed to demonstrate that imparting perimodiolar shapes to stiffener-embedded arrays will be possible. The developed process, when integrated with the array fabrication process, will allow fabrication of stiffener-embedded arrays in a single process. As a proof of concept of sub-system two, the feasibility of the attaching and detaching mechanism was demonstrated by adhering 1x and 1.5x scale PET tube-based insertion tools to PLA stiffeners embedded in parylene using the copolymer adhesive. The attached devices survived qualitative adhesion tests, thermoforming, and flexing. The viability of the detaching mechanism was tested by aging the assemblies in vitro in phosphate buffer solution. The average detachment times, 2.6 minutes and 10 minutes for the 1x and 1.5x scale devices respectively, were found to be clinically relevant with respect to the reported array insertion times during surgical implantation. Eventually, stiffener-embedded arrays would not need to be permanently attached to current insertion tools, which are left behind after implantation and congest the cochlear scala tympani chamber. Finally, a simulation-based approach for accelerated failure analysis of PLA stiffeners and characterization of the PVP-b-PDLLA copolymer adhesive has been explored. The residual functional life of embedded PLA stiffeners exposed to body fluid, and thereby subjected to degradation and erosion, has been estimated by simulating PLA stiffeners with different parylene coating failure types and different PLA types for a given parylene coating failure type. For characterizing the PVP-b-PDLLA copolymer adhesive, several formulations of the copolymer adhesive were simulated and compared based on the insertion tool detachment times predicted from the dissolution, degradation, and erosion behavior of the simulated adhesive formulations. Results indicate that these simulation-based approaches could be used to reduce the total number of time-consuming and expensive in-vitro tests that must be conducted.
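A minimal sketch of the kind of residual-life estimate described above, assuming pseudo-first-order hydrolytic decay of PLA molecular weight; the initial and threshold molecular weights and the rate constants are hypothetical and do not come from the thesis:

```python
import math

def pla_residual_life_days(Mw0_kda, Mw_min_kda, k_per_day):
    """Days until PLA molecular weight falls below a functional threshold,
    assuming first-order hydrolytic degradation: Mw(t) = Mw0 * exp(-k * t)."""
    return math.log(Mw0_kda / Mw_min_kda) / k_per_day

# Hypothetical parameters: a faster rate constant stands in for a larger
# parylene coating defect exposing more PLA to body fluid.
for label, k in (("pinhole defect", 0.004), ("large crack", 0.02)):
    t = pla_residual_life_days(Mw0_kda=100.0, Mw_min_kda=40.0, k_per_day=k)
    print(f"{label:15s}: ~{t:.0f} days of residual mechanical function")
```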

Abstract:

This thesis examines the behaviour of four Canadian deans whose faculties are in a precarious financial situation. In a context of increased accountability for university institutions and constant changes to internal power relations, this study is of particular importance for understanding how mid-level academic administrators actually exercise their administrative functions. Research on this subject credits deans with a vital contribution to the proper functioning of institutions; yet the dean remains one of their least studied components. The intermediate position that deans occupy places them in an obvious role conflict. Caught between contradictory roles and expectations coming from the faculty community on one side and from the institution's senior administration on the other, they must reconcile this ambiguity with growing responsibilities in financial management, leadership, and the operationalisation of the university's strategic aims. Understanding how deans manage to mobilise the institutional levers at their disposal to resolve critical situations will improve our understanding of the complexity of decision-making at the intermediate level and our overall understanding of how university institutions function. This study imports from cultural-historical activity theory (CHAT) a model for analysing collective activity in situations of systemic contradiction. Four deans from the same university were interviewed and asked to describe a budgetary difficulty they had experienced and the resolution process that followed. The data were analysed qualitatively in order to describe the interventions that deans make on the levers present in their environments. The results suggest that the participants relied on a fine-grained mastery of institutional machinery and on diversified interventions to resolve the contradictions of their activity systems.

Abstract:

Adults of most marine benthic and demersal fish are site-attached, with the dispersal of their larval stages ensuring connectivity among populations. In this study we aimed to infer spatial and temporal variation in population connectivity and dispersal of a marine fish species, using genetic tools and comparing these with oceanographic transport. We focused on an intertidal rocky reef fish species, the shore clingfish Lepadogaster lepadogaster, along the southwest Iberian Peninsula, in 2011 and 2012. We predicted high levels of self-recruitment and distinct populations, due to short pelagic larval duration and because all its developmental stages have previously been found near adult habitats. Genetic analyses based on microsatellites countered our prediction and a biophysical dispersal model showed that oceanographic transport was a good explanation for the patterns observed. Adult sub-populations separated by up to 300 km of coastline displayed no genetic differentiation, revealing a single connected population with larvae potentially dispersing long distances over hundreds of km. Despite this, parentage analysis performed on recruits from one focal site within the Marine Park of Arrábida (Portugal), revealed self-recruitment levels of 2.5% and 7.7% in 2011 and 2012, respectively, suggesting that both long- and short-distance dispersal play an important role in the replenishment of these populations. Population differentiation and patterns of dispersal, which were highly variable between years, could be linked to the variability inherent in local oceanographic processes. Overall, our measures of connectivity based on genetic and oceanographic data highlight the relevance of long-distance dispersal in determining the degree of connectivity, even in species with short pelagic larval durations.