919 results for Tools and techniques
Abstract:
Background and Purpose: The circadian rhythm of melatonin in saliva or plasma, or of the melatonin metabolite 6-sulfatoxymelatonin (aMT6s) in urine, is a defining feature of suprachiasmatic nucleus (SCN) function, the body's endogenous oscillatory pacemaker. The primary objective of this review is to ascertain the clinical benefits and limitations of current methodologies employed for the detection and quantification of melatonin in biological fluids and tissues. Data Identification: A search of the English-language literature (Medline) and a systematic review of published articles were carried out. Study Selection: Articles that specified both the methodology for quantifying melatonin and its clinical purpose were chosen for inclusion in the review. Data Extraction: The authors critically evaluated the methodological issues associated with various tools and techniques (e.g. standards, protocols, and procedures). Results of Data Synthesis: Melatonin measurements are useful for evaluating problems related to the onset or offset of sleep and for assessing phase delays or advances of rhythms in entrained individuals. They have also become an important tool for psychiatric diagnosis, their use being recommended for phase typing in patients suffering from sleep and mood disorders. Additionally, there has been continuing interest in the use of melatonin as a marker for neoplasms of the pineal region. Melatonin decreases, such as those found with aging or after pinealectomy, can cause alterations in the sleep/wake cycle. The development of sensitive and selective methods for the precise detection of melatonin in tissues and fluids has increasingly been shown to have direct relevance for clinical decision making. Conclusions: Due to melatonin's low concentration, as well as the coexistence of numerous other compounds in the blood, the routine determination of melatonin has been an analytical challenge.
The available evidence indicates, however, that these challenges can be overcome and, consequently, that evaluation of melatonin's presence and activity can be an accessible and useful tool for clinical diagnosis. © Springer-Verlag 2010.
Abstract:
Includes bibliography
Abstract:
Three factors define the main difficulties faced by developing countries in the area of trade facilitation: (i) limited understanding and use by governments and business (especially SMEs) of trade facilitation and of ICT tools and techniques; (ii) developing countries' limited capacity for policy analysis and inadequate policy instruments for the implementation of trade facilitation; and (iii) inadequate policy coordination for negotiation on trade facilitation. These obstacles tend to reduce countries' development opportunities and to increase the costs of general economic development and social welfare. The United Nations, through its five regional commissions, is launching a project that seeks to disseminate the benefits of trade facilitation and the standards, tools and requirements for its successful implementation. The project will focus on trade facilitation promoted by: (a) enhanced knowledge and understanding of governments and business regarding trade facilitation and the role of ICT; (b) enhanced use of ICT by SMEs in trade facilitation; and (c) national capacity-building for trade facilitation negotiations.
Abstract:
Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. 
DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.
Abstract:
Abstract Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data.
Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
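The connector idea described above (access to heterogeneous sources plus transformation rules on exchanged data) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the class, field names, and the `geo:` ontology prefix are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Connector:
    """Hypothetical software connector: maps a source tool's field names
    onto shared reference-ontology terms and applies a per-field
    transformation rule to each exchanged value."""
    term_map: dict[str, str]                # source field -> ontology term
    rules: dict[str, Callable[[str], str]]  # per-field transformation rules

    def translate(self, record: dict[str, str]) -> dict[str, str]:
        out = {}
        for field, value in record.items():
            term = self.term_map.get(field, field)   # default: keep field name
            rule = self.rules.get(field, lambda v: v)  # default: identity
            out[term] = rule(value)
        return out

# Example: normalize a gene identifier and an expression value on exchange.
conn = Connector(
    term_map={"gene": "geo:gene_symbol", "expr": "geo:expression_level"},
    rules={"gene": str.upper, "expr": lambda v: f"{float(v):.2f}"},
)
print(conn.translate({"gene": "brca1", "expr": "3.14159"}))
# {'geo:gene_symbol': 'BRCA1', 'geo:expression_level': '3.14'}
```

The design point is that the transformation logic lives in the connector, not in either tool, so each tool keeps its native vocabulary while exchanged data stays semantically consistent.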
Abstract:
The popularity of herbal products, especially plant food supplements (PFS) and herbal medicine, is on the rise in Europe and other parts of the world, with increased use in the general population as well as among specific subgroups encompassing children, women, and those suffering from diseases such as cancer. The aim of this paper is to examine the PFS market structures in European Community (EC) Member States and to examine issues concerning methodologies and consumption data relating to PFS use in Europe. A review of recent reports on market data, trends, and main distribution channels is presented, together with an example of PFS consumption in Spain. An overview of the methods and administration techniques used...
Abstract:
The aims of this research were: - to identify the characteristics, properties and provenance of the building and decorative material found in three Hungarian Roman sites: Nagyharsány, Nemesvámos-Balácapuszta and Aquincum; - to provide a database of information on the different sites; - to give an overview of the main conservation strategies applied in Hungary. Geological studies, macroscopic and microscopic observations, XRD investigations, and physical and chemical analyses allowed us to define the characteristics and properties of the different kinds of collected materials. Building stones sampled from the Nagyharsány site showed two different kinds of massive limestone belonging to the areas surrounding the villa. Building stones sampled from the Nemesvámos-Balácapuszta Roman villa also proved to be compatible with limestone from local sources. Mural painting fragments show that all samples are units composed of multilayered structures. Mosaic tesserae can be classified as follows: - pale yellow, blackish and pink tesserae are comparable with local limestone; - the white tessera, composed of marble, was probably imported from distant regions of the Empire, as was the usual practice of the Romans. Mortars present different characteristics according to the age, the site and their functions: - building mortars are generally lime based, white or pale yellow in colour, and present a high percentage of aggregates represented by fine sand; - supporting mortars from both mosaics and mural paintings are reddish or pinkish in colour, due to the presence of a high percentage of brick dust and tile fragments, and present a higher content of MgO. Despite the condition of the sites, the content of soluble salts is insignificant. Database: The whole study has allowed us to provide worksheets for each sample, including all characteristics and properties.
Furthermore, all sites included in the frame of the research have been described and illustrated on the basis of their floor plans, materials and construction methodologies. It can be concluded that: 1. In the Nagyharsány archaeological site, it is possible to define a sequence of different construction phases on the basis of the study of building materials and mortars; the results are comparable with the chronology of the site provided by the archaeologists. 2. The material used for construction was of local origin, while the more precious materials, used for decorative elements, were probably imported from long distances. 3. Construction techniques in Hungary mainly follow the usual Roman knowledge and practice (Vitruvius); few differences have been found. 4. The database will represent an archive for archaeologists, historians and conservators dealing with the Roman period in Hungary.
Abstract:
In recent years, the importance of locating people and objects and communicating with them in real time has become a common feature of everyday life. Nowadays, indoor location systems lack a dominant technology, unlike outdoor location systems, where GPS dominates. In fact, each indoor location technology presents a set of features that does not allow its use across all application scenarios but, owing to its characteristics, it can coexist well with other similar technologies, without being dominant or more widely adopted than the other indoor location systems. In this context, the European project SELECT studies the opportunity of combining all these different features in an innovative system which can be used in a large number of application scenarios. The goal of this project is to realize a wireless system in which a network of fixed readers queries one or more tags attached to the objects to be located. The SELECT consortium is composed of European institutions and companies, including Datalogic S.p.A. and CNIT, which deal with the software and firmware development of the baseband receiving section of the readers, whose function is to acquire and process the information received from generic tagged objects. Since the SELECT project has a highly innovative content, one of the key stages of the system design is the debug phase. This work aims to study and develop tools and techniques for debugging the firmware of the baseband receiving section of the readers.
Abstract:
With this thesis I aim to illustrate the results of my research in the field of Semantic Publishing, consisting of the development of a set of methodologies, tools, and prototypes, combined with the study of a concrete use case, aimed at the application and focusing of Semantic Lenses.
Abstract:
This thesis analyses problems related to the applicability, in business environments, of Process Mining tools and techniques. The first contribution is a presentation of the state of the art of Process Mining and a characterization of companies in terms of their "process awareness". The work continues by identifying circumstances in which problems can emerge: data preparation, the actual mining, and the interpretation of results. Other problems are the configuration of parameters by non-expert users and computational complexity. We concentrate on two possible scenarios: "batch" and "on-line" Process Mining. Concerning batch Process Mining, we first investigated the data preparation problem and proposed a solution for the identification of "case-ids" whenever this field is not explicitly indicated. After that, we concentrated on problems at mining time and proposed a generalization of a well-known control-flow discovery algorithm in order to exploit non-instantaneous events. The use of interval-based recording leads to an important improvement in performance. Later on, we report our work on parameter configuration for non-expert users. We present two approaches to select the "best" parameter configuration: one is completely autonomous; the other requires human interaction to navigate a hierarchy of candidate models. Concerning data interpretation and results evaluation, we propose two metrics: a model-to-model metric and a model-to-log metric. Finally, we present an automatic approach for extending a control-flow model with social information, in order to simplify the analysis of these perspectives. The second part of this thesis deals with control-flow discovery algorithms in on-line settings. We propose a formal definition of the problem and two baseline approaches.
Two mining algorithms are proposed: the first adapts a frequency-counting algorithm to the control-flow discovery problem; the second provides a framework of models which can be used for different kinds of streams (stationary versus evolving).
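To make the first idea concrete, here is an illustrative sketch (not the thesis's own algorithm) of how a classic frequency-counting scheme, Lossy Counting, could be adapted to track directly-follows pairs on an event stream for on-line control-flow discovery. The class and parameter names are hypothetical.

```python
import math

class LossyDirectlyFollows:
    """Sketch: Lossy Counting over directly-follows pairs (a, b),
    i.e. activity b observed immediately after a within the same case."""
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.width = math.ceil(1 / epsilon)  # bucket width of Lossy Counting
        self.n = 0                           # number of pairs seen so far
        self.counts = {}                     # (a, b) -> [count, bucket_delta]
        self.last = {}                       # case id -> last activity seen

    def observe(self, case, activity):
        prev = self.last.get(case)
        self.last[case] = activity
        if prev is None:
            return                           # first event of the case: no pair yet
        self.n += 1
        bucket = math.ceil(self.n / self.width)
        pair = (prev, activity)
        if pair in self.counts:
            self.counts[pair][0] += 1
        else:
            self.counts[pair] = [1, bucket - 1]
        if self.n % self.width == 0:         # periodic pruning of rare pairs
            self.counts = {p: c for p, c in self.counts.items()
                           if c[0] + c[1] > bucket}

    def frequent(self, support):
        # pairs whose estimated frequency exceeds (support - epsilon) * n
        return {p for p, c in self.counts.items()
                if c[0] >= (support - self.epsilon) * self.n}

# Feed an alternating stream for a single case and query frequent pairs.
ldf = LossyDirectlyFollows(epsilon=0.2)
for act in ["A", "B", "A", "B", "A", "B"]:
    ldf.observe("case-1", act)
pairs = ldf.frequent(support=0.5)   # contains ("A", "B") and ("B", "A")
```

The appeal of this family of algorithms in a streaming setting is bounded memory: infrequent pairs are periodically evicted, with an error bounded by epsilon.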
Abstract:
The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory. The CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for the data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package; the problem of the aliasing effect in the generation of low-resolution maps; and the comparison of the Angular Power Spectrum (APS) extraction performance of the optimal QML method, implemented in the code called BolPol, with that of the pseudo-Cl method, implemented in Cromaster. The QML method has then been applied to the Planck data at large angular scales to extract the CMB APS. The same method has also been applied to analyze the TT parity and Low Variance anomalies in the Planck maps, showing a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code has instead been applied to the 408 MHz and 1.42 GHz surveys, focusing on the analysis of the APS of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a new technology, Photonic Crystals, is exploited to develop a new polarization splitter device, and its performance is compared to the devices used nowadays.
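As a brief illustration of the TT parity test mentioned above (a sketch under stated assumptions, not the thesis's code), the point-parity statistic is commonly formed as the ratio of even- to odd-multipole band power D_l = l(l+1) C_l / (2π); the function name and the multipole range below are illustrative.

```python
import math

def parity_ratio(cl, lmin=2, lmax=30):
    """Ratio P+/P- of even vs. odd multipole band power for a spectrum
    given as a mapping cl[l] = C_l over multipoles l."""
    d = lambda l: l * (l + 1) * cl[l] / (2 * math.pi)  # band power D_l
    p_even = sum(d(l) for l in range(lmin, lmax + 1) if l % 2 == 0)
    p_odd = sum(d(l) for l in range(lmin, lmax + 1) if l % 2 == 1)
    return p_even / p_odd

# Example: a flat band-power spectrum (D_l constant) gives a ratio equal
# to the count of even multipoles over the count of odd ones in the range.
flat_cl = {l: 1.0 / (l * (l + 1)) for l in range(2, 31)}
ratio = parity_ratio(flat_cl)   # 15 even vs 14 odd multipoles -> 15/14
```

A significant departure of this ratio from its expected value under the standard model is what parity-anomaly analyses quantify, typically against simulated spectra.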
Abstract:
The need to effectively manage the documentation covering the entire production process, from the concept phase right through to market release, constitutes a key issue in the creation of a successful and highly competitive product. For almost forty years the most commonly used strategies to achieve this have followed Product Lifecycle Management (PLM) guidelines. Translated into information management systems at the end of the '90s, this methodology is now widely used by companies operating all over the world in many different sectors. PLM systems and editor programs are the two principal types of software applications used by companies for their process automation. Editor programs allow information related to the production chain to be stored in documents, while the PLM system stores and shares this information so that it can be used within the company and made available to partners. Different software tools, which capture and store documents and information automatically in the PLM system, have been developed in recent years. One of them is the "DirectPLM" application, developed by the Italian company "Focus PLM". It is designed to ensure interoperability between many editors and the Aras Innovator PLM system. In this dissertation we present "DirectPLM2", a new version of the previous software application DirectPLM. It was designed and developed as a prototype during an internship at Focus PLM. Its new implementation separates the abstract business logic from the implementation of the actual commands, previously strongly dependent on Aras Innovator. Thanks to its new design, Focus PLM can easily develop different versions of DirectPLM2, each one devised for a specific PLM system. In fact, the company can focus the development effort only on a specific set of software components which provide specialized functions interacting with that particular PLM system.
This allows a shorter Time-To-Market and gives the company a significant competitive advantage.
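The separation described above (abstract business logic decoupled from PLM-specific commands) is essentially an adapter/strategy design. A minimal sketch follows; all class and method names are hypothetical, and the backend stub stands in for real Aras Innovator API calls.

```python
from abc import ABC, abstractmethod

class PLMBackend(ABC):
    """Interface the abstract business logic depends on."""
    @abstractmethod
    def store_document(self, name: str, payload: bytes) -> str: ...

class ArasInnovatorBackend(PLMBackend):
    """PLM-specific component; a real version would call the Aras API."""
    def store_document(self, name, payload):
        return f"aras://{name}"   # stub: pretend the document was stored

class DirectPLMCore:
    """Abstract capture logic: unchanged across target PLM systems."""
    def __init__(self, backend: PLMBackend):
        self.backend = backend

    def capture(self, name, payload):
        # ...validation, metadata extraction, etc. would go here...
        return self.backend.store_document(name, payload)

print(DirectPLMCore(ArasInnovatorBackend()).capture("spec.pdf", b"..."))
# aras://spec.pdf
```

Supporting another PLM system then means writing one new `PLMBackend` subclass, leaving the core untouched, which is what enables the shorter time to market claimed above.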
Abstract:
Background Through this paper, we present the initial steps for the creation of an integrated platform for the provision of a series of eHealth tools and services to both citizens and travelers in isolated areas of the southeast Mediterranean, and on board ships travelling across it. The platform was created through an INTERREG IIIB ARCHIMED project called INTERMED. Methods The support of primary healthcare, home care and the continuous education of physicians are the three major issues that the proposed platform is trying to facilitate. The proposed system is based on state-of-the-art telemedicine systems and is able to provide the following healthcare services: i) telecollaboration and teleconsultation services between remotely located healthcare providers; ii) telemedicine services in emergencies; iii) home telecare services for "at risk" citizens such as the elderly and patients with chronic diseases; and iv) eLearning services for the continuous training, through seminars, of both healthcare personnel (physicians, nurses, etc.) and persons supporting "at risk" citizens. These systems support data transmission over simple phone lines, internet connections, integrated services digital network/digital subscriber lines, satellite links, mobile networks (GPRS/3G), and wireless local area networks. The data include, among others, voice, vital biosignals, still medical images, video, and data used by eLearning applications. The proposed platform comprises several systems, each supporting different services. These were integrated using a common data storage and exchange scheme in order to achieve system interoperability in terms of software, language and national characteristics. Results The platform has been installed and evaluated in different rural and urban sites in Greece, Cyprus and Italy. The evaluation was mainly related to technical issues and user satisfaction.
The selected sites include rural health centers, ambulances, homes of "at-risk" citizens, and a ferry. Conclusions The results demonstrated the functionality and usefulness of the platform in various rural areas of Greece, Cyprus and Italy. However, further actions are needed to enable local healthcare systems and the different population groups to become familiar with, and use in their everyday lives, mature technological solutions for the provision of healthcare services.