848 results for "User friendly interface"


Relevance: 80.00%

Publisher:

Abstract:

Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is driven by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low cost of design, realization, and maintenance. This trend towards complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy boxed products such as food or cigarettes. Another indication of their complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine companies are located in Italy, notably in the packaging machine industry; in particular, a great concentration of such companies is found in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activities inherent in the automation of the machine cycles, the supervisory system is called on to perform other main functions: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and crucial functional flexibility; dynamically adapting the control strategies to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; directing the machine operator to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing diagnostic information in real time, in support of machine maintenance operations. The facilities that designers can find directly on the market, in terms of software component libraries, in fact provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers model and structure their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed empirically, without a clear distinction between functional and technological/implementation concepts and without a systematic method to deal organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have traditionally been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives, and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very "unstructured", way. No clear distinction is made between functions and implementations, or between functional architectures and technological architectures and platforms. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability, and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been adopting this approach, as testified by IEC standards such as IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, many contributions have already been proposed in the scientific and technical literature to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. Concerning logic control design, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability, and safety. In other words, the control system should not only deal with the nominal behaviour but should also deal with other important duties, such as diagnosis and fault isolation, recovery, and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and electronic devices are more vulnerable by their very nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function with the desired level of reliability and safety; the next step is to prevent faults and eventually reconfigure the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis, and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics of industrial automated systems in Chapter 1. Chapter 2 surveys the state of software engineering paradigms applied to industrial automation. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems that should help the reader understand some crucial points in Chapter 5, while Appendix B reports an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4, and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
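The diagnosis task described above can be illustrated with a toy discrete-event model. The sketch below is purely illustrative (the plant, its states, and its events are invented, not components from the thesis): it tracks the set of plant states consistent with the observed events and reports whether an unobservable fault event must, may, or cannot have occurred.

```python
# Toy discrete-event fault diagnoser (hypothetical plant, not a thesis
# component). The plant is a finite automaton whose fault event "f" is
# unobservable; the diagnoser tracks every (state, fault_flag) pair
# consistent with the events observed so far.

PLANT = {
    "idle":     {"start": "run"},
    "run":      {"stop": "idle", "f": "degraded"},  # "f" = unobservable fault
    "degraded": {"alarm": "halted"},
}
UNOBSERVABLE = {"f"}

def unobservable_reach(estimate):
    """Close the state estimate under unobservable (fault) transitions."""
    frontier = set(estimate)
    while frontier:
        state, _ = frontier.pop()
        for ev in UNOBSERVABLE:
            nxt = PLANT.get(state, {}).get(ev)
            if nxt is not None and (nxt, True) not in estimate:
                estimate.add((nxt, True))   # reached via the fault event
                frontier.add((nxt, True))
    return estimate

def diagnose(observed):
    """Return 'faulty', 'nominal' or 'uncertain' after an observed trace."""
    estimate = unobservable_reach({("idle", False)})
    for ev in observed:
        stepped = {(PLANT[s][ev], flag)
                   for s, flag in estimate if ev in PLANT.get(s, {})}
        estimate = unobservable_reach(stepped)
    flags = {flag for _, flag in estimate}
    if flags == {True}:
        return "faulty"
    if flags == {False}:
        return "nominal"
    return "uncertain"
```

After observing `start` the fault may or may not have happened, so the diagnoser answers "uncertain"; observing `alarm` confirms the fault, while observing `stop` rules it out.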


This work aims to analyze the technical feasibility and introduce the implementation of systems that allow the reuse of source code across applications with similar requirements on smartphones, in particular on Google Android systems. This is the concept of application customization: in practice, the construction of systems that make it possible to generate mobile applications through user-friendly interfaces and by means of modular code. The goal is to provide a global view of the solutions and problems in this field, set out as guidelines for anyone intending to study this context or to develop a project, even a complex one, related to application customization. As an example, a web service for the customization of Android applications, in particular webviews, will be implemented, focusing in particular on the problems related to software ownership and the digital signatures required for publication on the Android market. Some decisions to be made when developing applications for third parties that will later be released on the market will also be defined. The last part analyzes a customization strategy based on good practices which, through the use of library projects and directly within the development environment, makes it possible to produce modular code ready for the Android market in several versions.


The Aerodyne Time-of-Flight Aerosol Mass Spectrometer (ToF-AMS) is a further development of the Aerodyne aerosol mass spectrometer (Q-AMS), which is well characterized and deployed worldwide. Both instruments use an aerodynamic lens, aerodynamic particle sizing, thermal vaporization, and electron impact ionization. In contrast to the Q-AMS, where a quadrupole mass spectrometer is used to analyze the ions, the ToF-AMS employs a time-of-flight mass spectrometer. In the present work, laboratory experiments and field campaigns show that the ToF-AMS is suitable for the quantitative measurement of the chemical composition of aerosol particles with high time and size resolution. In addition, a complete scheme for ToF-AMS data analysis is presented, developed to obtain quantitative and meaningful results from the recorded raw data of both field campaigns and laboratory experiments. This scheme is based on the characterization experiments performed within the scope of this work. It includes the corrections that must be applied and the calibrations that must be carried out in order to extract reliable results from the raw data. Considerable work was also invested in the development of a reliable and user-friendly data analysis program, which can be used for automatic and systematic ToF-AMS data analysis and correction.


The work consists of a first part in which the physical and technological characteristics of steam, gas, and combined-cycle plants are analyzed in detail. A second part follows in which the Matlab programming environment is introduced, some functionalities of the program are explained, and some toolboxes relevant to the final part of the work are analyzed. The concluding part concerns the numerical analysis of the above-mentioned cycles, aimed at obtaining user-friendly interfaces that give even users with no programming experience a complete analytical approach to this type of system.
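As a flavour of the kind of computation such an interface wraps, the sketch below evaluates two textbook closed-form results: the ideal air-standard Brayton cycle efficiency and the composition of gas and steam cycle efficiencies in a combined plant. It is written in Python purely for illustration (the thesis's actual Matlab toolbox code is not reproduced here).

```python
# Textbook closed-form cycle efficiencies (illustrative sketch only).

def brayton_efficiency(pressure_ratio, gamma=1.4):
    """Ideal air-standard Brayton cycle thermal efficiency:
    eta = 1 - r^((1 - gamma) / gamma), with r the compressor pressure ratio."""
    return 1.0 - pressure_ratio ** ((1.0 - gamma) / gamma)

def combined_cycle_efficiency(eta_gas, eta_steam):
    """Ideal combined-cycle composition: the steam bottoming cycle recovers
    the gas turbine exhaust heat, so eta_cc = eta_g + eta_s * (1 - eta_g)."""
    return eta_gas + eta_steam * (1.0 - eta_gas)
```

For a pressure ratio of 10 the ideal Brayton efficiency is about 48%, and a 40%-efficient gas turbine topped by a 35%-efficient steam cycle yields a 61% combined efficiency, which is why combined plants dominate modern thermal generation.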


The determination of skeletal loading conditions in vivo and their relationship to the health of bone tissues remain open questions. Computational modeling of the musculoskeletal system is the only practicable method for muscle and joint loading analyses, although crucial shortcomings limit the translation of computational methods into orthopedic and neurological practice. Growing attention has focused on subject-specific modeling, particularly when pathological musculoskeletal conditions need to be studied. Nevertheless, subject-specific data cannot always be collected in research and clinical practice, and there is a lack of efficient methods and frameworks for building models and incorporating them in simulations of motion. The overall aim of the present PhD thesis was to introduce improvements to state-of-the-art musculoskeletal modeling for the prediction of physiological muscle and joint loads during motion. A threefold goal was articulated as follows: (i) develop state-of-the-art subject-specific models and analyze skeletal load predictions; (ii) analyze the sensitivity of model predictions to relevant musculotendon model parameters and kinematic uncertainties; (iii) design an efficient software framework simplifying the effort-intensive pre-processing phases of subject-specific modeling. The first goal underlined the relevance of subject-specific musculoskeletal modeling for determining physiological skeletal loads during gait, corroborating the choice of fully subject-specific modeling for the analysis of pathological conditions. The second goal characterized the sensitivity of skeletal load predictions to major musculotendon parameters and kinematic uncertainties, applying robust probabilistic methods for methodological and clinical purposes. The last goal produced an efficient software framework for subject-specific modeling and simulation that is practical, user-friendly, and effort-effective.
Future research will aim at the implementation of more accurate models describing lower-limb joint mechanics and musculotendon paths, and at assessing, through probabilistic modeling, an overall picture of the crucial model parameters affecting skeletal load predictions.
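A minimal illustration of the probabilistic sensitivity idea (all function names and numbers below are hypothetical stand-ins, not the thesis's actual musculotendon models): perturb one model parameter with Monte Carlo sampling and measure the spread it induces in a prediction.

```python
import random
import statistics

# Illustrative Monte Carlo sensitivity sketch; the toy "model" and its
# parameters are invented for the example.

def toy_peak_force(max_isometric_force, activation):
    """Toy stand-in for a musculotendon model output."""
    return max_isometric_force * activation

def monte_carlo_sensitivity(nominal=1000.0, cv=0.1, activation=0.8,
                            n=5000, seed=42):
    """Perturb one parameter (Gaussian, coefficient of variation cv)
    and report the mean and spread of the resulting prediction."""
    rng = random.Random(seed)
    outputs = [toy_peak_force(rng.gauss(nominal, cv * nominal), activation)
               for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)
```

Comparing the output spread induced by each parameter in turn is the simplest way to rank which uncertainties matter most for the load predictions.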


As a large and long-lived species with high economic value, restricted spawning areas, and short spawning periods, the Atlantic bluefin tuna (BFT; Thunnus thynnus) is particularly susceptible to over-exploitation. Although BFT have been targeted by fisheries in the Mediterranean Sea for thousands of years, only in recent decades has the exploitation rate reached far beyond sustainable levels. An understanding of the population structure, spatial dynamics, exploitation rates, and environmental variables that affect BFT is crucial for the conservation of the species. The aims of this PhD project were: 1) to assess the accuracy of larval identification methods; 2) to determine the genetic structure of modern BFT populations; 3) to assess the self-recruitment rate in the Gulf of Mexico and Mediterranean spawning areas; 4) to estimate the immigration rate of BFT to feeding aggregations from the various spawning areas; and 5) to develop tools capable of investigating the temporal stability of population structuring in the Mediterranean Sea. Several weaknesses of modern morphology-based taxonomy are reviewed, including the demographic decline of expert taxonomists, flawed identification keys, reluctance of the taxonomic community to embrace advances in digital communications, and a general scarcity of modern user-friendly materials. Barcoding of scombrid larvae revealed important differences in the accuracy of the taxonomic identifications carried out by different ichthyoplanktologists following morphology-based methods. Using a genotyping-by-sequencing approach, a panel of 95 SNPs was developed and used to characterize the population structuring of BFT and the composition of adult feeding aggregations. Using novel molecular techniques, DNA was extracted from bluefin tuna vertebrae excavated from Late Iron Age and ancient Roman settlements, Byzantine-era Constantinople, and a 20th-century collection.
A second panel of 96 SNPs was developed to genotype historical and modern samples in order to elucidate changes in population structuring and in the allele frequencies of loci associated with selective traits.
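As a small illustration of the kind of summary statistic such SNP panels support (illustrative code, not the project's actual pipeline), per-locus allele frequencies can be computed from diploid genotypes coded as alternate-allele counts:

```python
# Illustrative helper: per-locus alternate-allele frequency from diploid
# genotypes coded as alternate-allele counts (0, 1 or 2 copies per
# individual; None marks a missing call).

def allele_frequencies(genotypes):
    n_loci = len(genotypes[0])
    freqs = []
    for locus in range(n_loci):
        calls = [ind[locus] for ind in genotypes if ind[locus] is not None]
        freqs.append(sum(calls) / (2 * len(calls)))  # 2 alleles per individual
    return freqs
```

For three individuals typed at two SNPs, `allele_frequencies([[0, 2], [1, 1], [2, 0]])` gives a frequency of 0.5 at both loci; shifts in such frequencies between historical and modern samples are what the second panel is designed to detect.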


This study explores listeners' perception of disfluencies, i.e. ungrammatical pauses, filled pauses, repairs, false starts, and repetitions, which can irritate listeners and impede comprehension. As professional communicators, conference interpreters should be competent public speakers: their speech should be easily understood by listeners and should not contain elements that may be considered irritating. The aims of this study were to understand to what extent listeners notice disfluencies and consider them irritating, and to examine whether there are differences between interpreters and non-interpreters and between different age groups. A survey was therefore carried out among professional interpreters, students of interpreting, and people who regularly attend conferences. The respondents were asked to answer a questionnaire after listening to three speeches: three consecutive interpretations delivered during the final exams held at the Advanced School of Languages, Literature, Translation and Interpretation (SSLLTI) in Forlì. Since conference interpreters' public speaking skills should be at least as good as those of conference speakers, the speeches were presented to the listeners as speeches delivered during a conference, with no mention of interpreting. The study is divided into five chapters. Chapter I outlines the characteristics of the interpreter as a professional communicator. The quality criterion of "user-friendliness" is explored, with a focus on the features that make a speech more user-friendly: fluency, intonation, coherence, and cohesion. The chapter also focuses on listeners' quality expectations and evaluations. Chapter II describes the methodology of the study. Chapter III contains a detailed analysis of the texts used for the study, focusing on the elements that may irritate listeners or impede comprehension, namely disfluencies, incorrect use of intonation, and a lack of coherence or cohesion.
Chapter IV outlines the results of the survey, while Chapter V presents our conclusions.


This dissertation describes the development of a project, over a span of more than two years, carried out within the scope of the Arrowhead Framework and bearing my personal contribution in several sections. The final part of the project took place during a visiting period at the University of Luleå. The Arrowhead Project is a European project, belonging to the ARTEMIS association, which aims to foster new technologies and unify access to them within a single framework. Such technologies include the Internet of Things phenomenon, smart houses, electrical mobility, and renewable energy production. An application is considered compliant with the framework when it respects the Service-Oriented Architecture paradigm and is able to interact with a set of defined components called the Arrowhead Core Services. My personal contribution to this project consists of the development of several user-friendly APIs, published in the project's main repository, and the integration of a legacy system within the Arrowhead Framework. The implementation of this legacy system was initiated by me in 2012 and, after many improvements carried out by several developers at UniBO, it has been significantly modified again this year in order to achieve compatibility. The system consists of a simulation of an urban scenario in which a certain number of electric vehicles travel along specified routes. The vehicles consume their battery charge and thus need to recharge at the charging stations. Because of the long recharge process, the vehicles need to use a reservation mechanism in order to recharge while avoiding waiting lines. The integration with the above-mentioned framework consists in the publication of the services that the system provides to end users, through the instantiation of several Arrowhead Service Producers, together with a demo Arrowhead-compliant client application able to consume such services.
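The reservation mechanism mentioned above can be sketched as follows (class and method names are hypothetical, not the project's published API; the real system exposes this as Arrowhead services):

```python
# Illustrative sketch of a charging-station reservation mechanism: a station
# grants at most `slots` concurrent reservations, so vehicles can secure a
# charger before arriving instead of queueing.

class ChargingStation:
    def __init__(self, station_id, slots):
        self.station_id = station_id
        self.slots = slots
        self.reservations = set()

    def reserve(self, vehicle_id):
        """Grant a slot if one is free; the vehicle holds it until release."""
        if vehicle_id in self.reservations:
            return False  # already holds a slot
        if len(self.reservations) >= self.slots:
            return False  # station fully booked
        self.reservations.add(vehicle_id)
        return True

    def release(self, vehicle_id):
        """Free the slot once charging is complete."""
        self.reservations.discard(vehicle_id)
```

A rejected reservation would prompt the vehicle's routing logic to try the next station along its route.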


This thesis project is the development of a distributed system for the acquisition and interactive visualization of data. The system is used at CERN (the European Organization for Nuclear Research) to collect data on the operation of the LHC (Large Hadron Collider, the infrastructure where most of the experiments conducted at CERN take place) and to make them available to the public in real time through a user-friendly web dashboard. The infrastructure developed is based on a prototype designed and implemented at CERN in 2013. This prototype was born because, as CERN has become increasingly popular with the general public in recent years, the need was felt to make the data on the experiments and on the operation of the LHC available in real time to an ever larger number of users outside the technical and scientific staff. The problems to be addressed concern both the producers of the data, i.e. the LHC devices, and their consumers, i.e. the clients that want to access the data. On one side, the devices whose data we want to expose are critical systems that must not be overloaded with requests, reside in a protected network with restricted access, and use heterogeneous communication protocols and data formats. On the other side, user access to the data must be possible through a web interface (or web dashboard) that is rich and interactive yet at the same time simple and lightweight, usable also from mobile devices. The system we developed brings significant improvements over the solutions previously proposed to address these problems. In particular, it features a user interface made up of several configurable, reusable widgets that allow the data to be exported both graphically and in a machine-readable format.
A further novelty introduced is the architecture of the infrastructure we developed. Being based on Hazelcast, it is a distributed, modular, horizontally scalable infrastructure: agents interfacing with the LHC devices and web servers interfacing with the users can be added or removed in a way that is completely transparent to the system. Besides these new features and possibilities, our system, as discussed in the dissertation, offers many starting points for interesting future developments.
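The decoupling between data producers and consumers described above can be sketched in miniature. This is illustrative only: the real system shares state across processes through Hazelcast distributed maps, mimicked here by a lock-protected in-process dict, and the device names are invented.

```python
import json
import threading

# Conceptual sketch: acquisition agents publish device readings into a shared
# map; web servers take machine-readable snapshots for the dashboard widgets,
# so clients never touch the protected devices directly.

class DeviceDataMap:
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def publish(self, device, value):
        """Called by acquisition agents sitting next to the devices."""
        with self._lock:
            self._data[device] = value

    def snapshot_json(self):
        """Called by web servers: a machine-readable view for the widgets."""
        with self._lock:
            return json.dumps(self._data, sort_keys=True)

store = DeviceDataMap()
store.publish("lhc.beam_energy_GeV", 6800)
store.publish("lhc.beam_current_mA", 0.58)
```

Because agents only ever write and web servers only ever read snapshots, either side can be added or removed without the other noticing, which is the horizontal-scalability property the Hazelcast-based design provides.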


The thesis project has the task of making a mobile device communicate with an electromedical device via Bluetooth Low Energy. The patient uses this device to record an electrocardiogram autonomously; then, through the app, the results obtained from the measurement are displayed. Once the data have been sent from the electromedical device to the app, they are also forwarded to a server, where they are checked by the attending physician.


SMARTDIAB is a platform designed to support the monitoring, management, and treatment of patients with type 1 diabetes mellitus (T1DM) by combining state-of-the-art approaches in the fields of database (DB) technologies, communications, simulation algorithms, and data mining. SMARTDIAB consists mainly of two units: 1) the patient unit (PU); and 2) the patient management unit (PMU), which communicate with each other for data exchange. The PMU can be accessed by the PU through the internet using devices such as PCs/laptops with direct internet access or mobile phones via a Wi-Fi/General Packet Radio Service access network. The PU consists of an insulin pump for subcutaneous insulin infusion to the patient and a continuous glucose measurement system. These devices, running a user-friendly application, gather patient-related information and transmit it to the PMU. The PMU consists of a diabetes data management system (DDMS), a decision support system (DSS) that provides risk assessment for long-term diabetes complications, and an insulin infusion advisory system (IIAS), all of which reside on a web server. The DDMS can be accessed by both medical personnel and patients, with appropriate security access rights and front-end interfaces. The DDMS, apart from being used for data storage and retrieval, also provides advanced tools for the intelligent processing of the patient's data, supporting the physician in decision making regarding the patient's treatment. The IIAS is used to close the loop between the insulin pump and the continuous glucose monitoring system by providing the pump with the appropriate insulin infusion rate in order to keep the patient's glucose levels within predefined limits. The pilot version of SMARTDIAB has already been implemented, while the platform's evaluation in a clinical environment is in progress.
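A deliberately simplified illustration of closing the loop between sensor and pump (this is NOT the actual IIAS algorithm; all thresholds and gains below are invented, and real insulin dosing requires clinically validated algorithms): a proportional rule that suspends infusion at low glucose and caps the rate at a safety limit.

```python
def insulin_rate(glucose_mgdl, basal_rate=1.0, target=110.0, gain=0.01,
                 low_cutoff=70.0, max_rate=3.0):
    """Toy proportional rule (illustrative only, NOT a medical algorithm):
    suspend below low_cutoff, otherwise adjust the basal rate in proportion
    to the deviation from target, clamped to [0, max_rate] U/h."""
    if glucose_mgdl < low_cutoff:
        return 0.0  # suspend infusion on hypoglycemia
    rate = basal_rate + gain * (glucose_mgdl - target)
    return max(0.0, min(rate, max_rate))
```

The controller in the real IIAS would additionally account for insulin-on-board, meal announcements, and glucose trends; the sketch only shows the shape of the sensor-to-pump feedback path.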


STEPanizer is an easy-to-use computer-based software tool for the stereological assessment of digitally captured images from all kinds of microscopic (LM, TEM, LSM) and macroscopic (radiology, tomography) imaging modalities. The program design focuses on providing the user with a defined workflow adapted to the most basic stereological tasks. The software is compact and user-friendly without being bulky. STEPanizer comprises the creation of test systems, the appropriate display of digital images with superimposed test systems, a scaling facility, a counting module, and an export function for the transfer of results to spreadsheet programs. Here we describe the major workflow of the tool, illustrating its application on two examples from transmission electron microscopy and light microscopy, respectively.
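The counting module's core idea can be shown in a few lines (illustrative code, not STEPanizer's implementation): in point-counting stereology, a grid of test points is superimposed on the image, and the fraction of points hitting the structure of interest estimates its area or volume fraction.

```python
# Point-counting stereology sketch (illustrative, not STEPanizer code).

def area_fraction(points_on_structure, total_points):
    """The areal fraction A_A is estimated by the point fraction P_P."""
    return points_on_structure / total_points

def volume_fraction(hits_per_section, points_per_section):
    """Pool counts over several sections (ratio-of-sums estimator)."""
    return sum(hits_per_section) / sum(points_per_section)
```

With 25 of 100 test points landing on the structure, the estimated area fraction is 0.25; such counts are what the export function passes on to spreadsheet programs.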


Advances in laboratory techniques have led to a rapidly increasing use of biomarkers in epidemiological studies. Biomarkers of internal dose, early biological change, susceptibility, and clinical outcomes are used as proxies for investigating the interactions between external and/or endogenous agents and body components or processes. The need for improved reporting of scientific research led to influential statements of recommendations such as the STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) statement. The STROBE initiative, established in 2004, aimed to provide guidance on how to report observational research. Its guidelines provide a user-friendly checklist of 22 items to be reported in epidemiological studies, with items specific to the three main study designs: cohort studies, case-control studies, and cross-sectional studies. The present STrengthening the Reporting of OBservational studies in Epidemiology - Molecular Epidemiology (STROBE-ME) initiative builds on the STROBE statement, implementing 9 existing items of STROBE and providing 17 additional items to the 22 items of the STROBE checklist. The additions relate to the use of biomarkers in epidemiological studies, concerning collection, handling, and storage of biological samples; laboratory methods, validity, and reliability of biomarkers; specificities of study design; and ethical considerations. The STROBE-ME recommendations are intended to complement the STROBE recommendations.
