940 results for Common data environment


Relevance:

80.00%

Publisher:

Abstract:

Background: Through this paper, we present the initial steps for the creation of an integrated platform for the provision of a series of eHealth tools and services to both citizens and travellers in isolated areas of the southeast Mediterranean, and on board ships travelling across it. The platform was created through an INTERREG IIIB ARCHIMED project called INTERMED. Methods: The support of primary healthcare, home care and the continuous education of physicians are the three major issues that the proposed platform is trying to facilitate. The proposed system is based on state-of-the-art telemedicine systems and is able to provide the following healthcare services: i) telecollaboration and teleconsultation services between remotely located healthcare providers, ii) telemedicine services in emergencies, iii) home telecare services for "at-risk" citizens such as the elderly and patients with chronic diseases, and iv) eLearning services for continuous training through seminars of both healthcare personnel (physicians, nurses, etc.) and persons supporting "at-risk" citizens. These systems support data transmission over simple phone lines, internet connections, integrated services digital network/digital subscriber lines, satellite links, mobile networks (GPRS/3G), and wireless local area networks. The data correspond, among others, to voice, vital biosignals, still medical images, video, and data used by eLearning applications. The proposed platform comprises several systems, each supporting different services. These were integrated using a common data storage and exchange scheme in order to achieve system interoperability in terms of software, language and national characteristics. Results: The platform has been installed and evaluated in different rural and urban sites in Greece, Cyprus and Italy. The evaluation was mainly related to technical issues and user satisfaction.
The selected sites include, among others, rural health centers, ambulances, homes of "at-risk" citizens, and a ferry. Conclusions: The results demonstrated the functionality and usability of the platform in various rural places in Greece, Cyprus and Italy. However, further actions are needed to enable the local healthcare systems and the different population groups to become familiar with mature technological solutions for the provision of healthcare services and to use them in their everyday lives.

Relevance:

80.00%

Publisher:

Abstract:

Adaptive and non-adaptive evolutionary processes are likely to play important roles in biological invasions, but their relative importance has hardly ever been quantified. Moreover, although genetic differences between populations in their native versus invasive ranges may simply reflect different positions along a genetic latitudinal cline, this has rarely been controlled for. To study non-adaptive evolutionary processes in the invasion of Mimulus guttatus, we used allozyme analyses on offspring of seven native populations from western North America, and three and four invasive populations from Scotland and New Zealand, respectively. To study quantitative genetic differentiation, we grew 2474 plants representing 17 native populations and the seven invasive populations in a common greenhouse environment under temporarily and permanently wet soil conditions. The absence of allozyme differentiation between the invasive and native range indicates that multiple genotypes had been introduced to Scotland and New Zealand, and suggests that founder effects and genetic drift played small, if any, roles in shaping the genetic structure of invasive M. guttatus populations. Plants from the invasive and native range did not differ in phenology, floral traits and sexual and vegetative reproduction, nor in their plastic responses to the watering treatments. However, plants from the invasive range produced twice as many flower-bearing upright side branches as plants from the native populations. Further, with increasing latitude of collection, vegetative reproduction of our experimental plants increased while sexual reproduction decreased. Plants from the invasive and native range shared these latitudinal clines. Because allozymes showed that the relatedness between native and invasive populations did not depend on latitude, this suggests that plants in the invasive regions have adapted to the local latitude. Overall, our study indicates that quantitative genetic variation of M.
guttatus in its two invasive regions is shaped by adaptive evolutionary processes rather than by non-adaptive ones. © 2007 Gesellschaft für Ökologie. Published by Elsevier GmbH. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

A large body of published work shows that proton (hydrogen 1 [¹H]) magnetic resonance (MR) spectroscopy has evolved from a research tool into a clinical neuroimaging modality. Herein, the authors present a summary of brain disorders in which MR spectroscopy has an impact on patient management, together with a critical consideration of common data acquisition and processing procedures. The article documents the impact of ¹H MR spectroscopy in the clinical evaluation of disorders of the central nervous system. The clinical usefulness of ¹H MR spectroscopy has been established for brain neoplasms, neonatal and pediatric disorders (hypoxia-ischemia, inherited metabolic diseases, and traumatic brain injury), demyelinating disorders, and infectious brain lesions. The growing list of disorders for which ¹H MR spectroscopy may contribute to patient management extends to neurodegenerative diseases, epilepsy, and stroke. To facilitate expanded clinical acceptance and standardization of MR spectroscopy methodology, guidelines are provided for data acquisition and analysis, quality assessment, and interpretation. Finally, the authors offer recommendations to expedite the use of robust MR spectroscopy methodology in the clinical setting, including incorporation of technical advances on clinical units. © RSNA, 2014. Online supplemental material is available for this article.

Relevance:

80.00%

Publisher:

Abstract:

Various applications for event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have a flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such a heterogeneous wireless sensor network during its lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information, such as battery state and node connectivity, of both the wireless sensor network and the sensor nodes has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed, and bug fixes have to be applied, during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Multicast communication is an efficient way to handle such a traffic pattern in wireless sensor networks. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close the gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receivers.
In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC supports end-to-end reliability using a NACK-based reliability mechanism. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement sent after successful reception of all data fragments by the receiver nodes. SNOMC integrates three different caching strategies for the efficient handling of necessary retransmissions, namely caching on each intermediate node, caching on branching nodes, or caching on the sender node only. Moreover, an option was included to proactively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed, and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, and PSFQ, as well as to UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks. A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and the offloading of functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, because the sensor nodes are organised into small sub-networks, each managed by a mesh node.
Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for the monitoring, configuration and code updating of sensor nodes. Integrating SNOMC into MARWIS further increases the efficiency of the management tasks. To our knowledge, our approach is the first to combine a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports the reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
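The NACK-based recovery described above can be illustrated with a small simulation. This is a hypothetical sketch of one of SNOMC's caching strategies (sender-side caching); the function names, fragment format, and loss model are invented for illustration and are not taken from the protocol specification.

```python
# Hypothetical sketch of NACK-based loss recovery with sender-side caching,
# loosely modeled on the mechanism described for SNOMC. All names are
# illustrative, not from the protocol spec.

def disseminate(fragments, receivers, lost):
    """Multicast all fragments; receivers NACK missing ones, the sender
    retransmits them from its cache, and each receiver acknowledges once
    it holds every fragment. Returns (total transmissions, ACKing nodes).
    `lost` is a set of (receiver, sequence-number) pairs dropped in the
    initial round."""
    cache = dict(enumerate(fragments))          # sender-side fragment cache
    got = {r: set() for r in receivers}

    # initial multicast round: one send per fragment, some copies are lost
    for seq in cache:
        for r in receivers:
            if (r, seq) not in lost:
                got[r].add(seq)
    transmissions = len(cache)

    # NACK-driven retransmission: one extra send per reported gap
    for r in receivers:
        for seq in sorted(set(cache) - got[r]):
            got[r].add(seq)                     # retransmitted from cache
            transmissions += 1

    acks = [r for r in receivers if got[r] == set(cache)]
    return transmissions, acks
```

With three fragments, two receivers, and one lost copy, this yields four transmissions in total (three multicast sends plus one retransmission), illustrating why caching close to the receivers can cut the retransmission cost further.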

Relevance:

80.00%

Publisher:

Abstract:

The Sensor Node Overlay Multicast (SNOMC) protocol supports the reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receivers, as needed for configuration, code update, and management operations in wireless sensor networks. SNOMC supports end-to-end reliability using negative acknowledgements. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. SNOMC supports three different caching strategies, namely caching on each intermediate node, caching on branching nodes, or caching on the sender node only. SNOMC was evaluated in our in-house real-world testbed and compared to a number of common data dissemination protocols. It outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption.

Relevance:

80.00%

Publisher:

Abstract:

Content Distribution Networks (CDNs) are mandatory components of modern web architectures, with plenty of vendors offering their services. Despite the field's maturity, new paradigms and architecture models are still being developed in this area. Cloud computing, on the other hand, is a more recent concept that has expanded extremely quickly, with new services being regularly added to cloud management software suites such as OpenStack. The main contribution of this paper is the architecture and development of an open-source CDN that can be provisioned in an on-demand, pay-as-you-go model, thereby enabling the CDN-as-a-Service paradigm. We describe our experience with the integration of the CDNaaS framework in a cloud environment, as a service for enterprise users. We emphasize the flexibility and elasticity of such a model, with each CDN instance being delivered on demand and associated with personalized caching policies as well as an optimized choice of Points of Presence based on the exact requirements of an enterprise customer. Our development is based on the framework developed in the Mobile Cloud Networking EU FP7 project, which offers its enterprise users a common framework to instantiate and control services. CDNaaS is one of the core support components in this project and is tasked with delivering different types of multimedia content to several thousand geographically distributed users. It integrates seamlessly into the MCN service life cycle and as such enjoys all the benefits of a common design environment, allowing for improved interoperability with the rest of the services within the MCN ecosystem.

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a shallow dialogue analysis model aimed at human-human dialogues in the context of staff or business meetings. Four components of the model are defined, and several machine learning techniques are used to extract features from dialogue transcripts: maximum entropy classifiers for dialogue acts, latent semantic analysis for topic segmentation, and decision tree classifiers for discourse markers. A rule-based approach is proposed for resolving cross-modal references to meeting documents. The methods are trained and evaluated on a common data set and annotation format. The integration of the components into an automated shallow dialogue parser opens the way to multimodal meeting processing and retrieval applications.
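The dialogue-act component mentioned above is a maximum entropy classifier, which for discrete labels is equivalent to multinomial logistic regression. Below is a minimal, dependency-free sketch of such a tagger over bag-of-words features; the toy utterances, act labels, and training hyperparameters are illustrative assumptions, not taken from the paper's corpus.

```python
# Minimal maximum-entropy (multinomial logistic regression) dialogue-act
# tagger trained by gradient ascent on bag-of-words features. Toy data only.
import math
from collections import defaultdict

TRAIN = [
    ("do you agree with the proposal", "question"),
    ("when is the next meeting", "question"),
    ("i think we should postpone", "statement"),
    ("the budget was approved yesterday", "statement"),
    ("uh huh", "backchannel"),
    ("okay right", "backchannel"),
]
ACTS = sorted({act for _, act in TRAIN})
weights = defaultdict(float)            # one weight per (act, word) feature

def scores(utt):
    """Softmax over per-act feature sums: the maxent class distribution."""
    raw = {a: sum(weights[(a, w)] for w in utt.split()) for a in ACTS}
    z = sum(math.exp(v) for v in raw.values())
    return {a: math.exp(v) / z for a, v in raw.items()}

# plain gradient ascent on the conditional log-likelihood
for _ in range(200):
    for utt, act in TRAIN:
        probs = scores(utt)
        for a in ACTS:
            grad = (1.0 if a == act else 0.0) - probs[a]
            for w in utt.split():
                weights[(a, w)] += 0.1 * grad

def predict(utt):
    s = scores(utt)
    return max(s, key=s.get)
```

On separable toy data like this, the model quickly learns to associate interrogative words with questions and fillers with backchannels; a real system would add regularisation and richer features (prosody, context).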

Relevance:

80.00%

Publisher:

Abstract:

Based on data from R.V. Pelagia, R.V. Sonne and R.V. Meteor multibeam sonar surveys, a high-resolution bathymetry was generated for the Mozambique Ridge. The mapping area is divided into five sheets: one overview sheet and four sub-sheets. The boundaries are (west/east/south/north): Sheet 1: 28°30' E/37°00' E/36°20' S/24°50' S; Sheet 2: 32°45' E/36°45' E/28°20' S/25°20' S; Sheet 3: 31°30' E/36°45' E/30°20' S/28°10' S; Sheet 4: 30°30' E/36°30' E/33°15' S/30°15' S; Sheet 5: 28°30' E/36°10' E/36°20' S/33°10' S. Each sheet was generated twice: once from swath sonar bathymetry only, and once completed with depths from the ETOPO2 predicted bathymetry. The basic outcome of the investigation is a set of Digital Terrain Models (DTMs): one for each sheet with 0.05 arcmin (~91 m) grid spacing, and one for the entire area (Sheet 1) with 0.1 arcmin grid spacing. The DTMs were used for contouring and for generating maps. The grid formats are NetCDF (Network Common Data Form) and ASCII (ESRI ArcGIS exchange format). The maps are provided as JPEG images and as small PNG (Portable Network Graphics) preview images. The provided maps have a paper size of DIN A0 (1189 x 841 mm).
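The quoted grid spacing can be sanity-checked with a quick back-of-the-envelope calculation: one arcminute of latitude corresponds to roughly one nautical mile (1852 m), so 0.05 arcmin is on the order of the ~91 m stated above. A minimal sketch:

```python
# Back-of-the-envelope check of the stated DTM grid spacing. One arcmin of
# latitude is about one nautical mile (1852 m by definition); the small
# discrepancy with the quoted ~91 m comes from the ellipsoidal Earth.
METERS_PER_ARCMIN_LAT = 1852.0

def grid_spacing_m(arcmin):
    """Approximate north-south ground distance for a grid step in arcmin."""
    return arcmin * METERS_PER_ARCMIN_LAT

fine = grid_spacing_m(0.05)    # sub-sheet DTMs, ~92.6 m
coarse = grid_spacing_m(0.1)   # overview-sheet DTM, twice as coarse
```

Note that the east-west spacing shrinks with the cosine of latitude, so at the Mozambique Ridge (~25-36° S) a 0.05 arcmin longitude step covers somewhat less ground than the latitude step.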

Relevance:

80.00%

Publisher:

Abstract:

This article presents software for determining the statistical behavior of qualitative survey data that has first been transformed into quantitative data with a Likert scale. The main intention is to offer users a tool for obtaining statistical characteristics and forecasts of financial risks in a fast and simple way. The paper also presents a definition of operational risk, and explains different techniques for conducting Likert-scale surveys (Avila, 2008) to capture expert opinion through the transformation of qualitative data into quantitative data. While a single expert's opinion about a risk is easy to interpret, results become difficult to obtain when many surveys and matrices must be compared, because a representative statistical value has to be extracted from the common data to obtain the weight of each risk. Finally, the article describes the development of the Qualitative Operational Risk Software (QORS), designed to determine the root of risks in organizations and their operational value at risk, OpVaR (Jorion, 2008; Chernobai et al., 2008), when the input data come from expert opinion and its associated matrices.
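The core transformation described above (Likert answers to numbers, then a representative statistic per risk) can be sketched in a few lines. This is an illustrative example, not the QORS implementation; the scale labels, risk names, and choice of mean/standard deviation as the representative statistics are assumptions.

```python
# Illustrative sketch of aggregating Likert-scale expert surveys into
# per-risk weights. Labels and statistics are invented for illustration.
from statistics import mean, pstdev

LIKERT = {"very low": 1, "low": 2, "medium": 3, "high": 4, "very high": 5}

def risk_weights(surveys):
    """surveys: list of dicts mapping risk name -> Likert answer.
    Returns, per risk, the mean score and population standard deviation."""
    scores = {}
    for survey in surveys:
        for risk, answer in survey.items():
            scores.setdefault(risk, []).append(LIKERT[answer])
    return {risk: (mean(v), pstdev(v)) for risk, v in scores.items()}

experts = [
    {"fraud": "high", "system outage": "medium"},
    {"fraud": "very high", "system outage": "medium"},
    {"fraud": "high", "system outage": "low"},
]
weights = risk_weights(experts)   # e.g. fraud scores [4, 5, 4]
```

The mean serves as the risk weight and the dispersion flags disagreement among experts; a loss-distribution step would then be needed to turn these weights into an OpVaR figure.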

Relevance:

80.00%

Publisher:

Abstract:

Today, in the post-genomic era, cancer clinical trials involve the collaboration of multiple institutions. Multicentric, retrospective analysis requires advanced methods to guarantee semantic interoperability. In this scenario, the goal of the EURECA and INTEGRATE projects is to provide an infrastructure for sharing knowledge and data from post-genomic cancer clinical trials. Largely because of the great complexity of the institutions' collaborative processes, managing such heterogeneous information is a challenge within the medical domain. Semantic technologies and the related research focus on extracting knowledge from the captured information, allowing greater flexibility and usability of the extracted data. Given the lack of standards adopted by these entities and the complexity of clinical trial data, a semantic layer is essential to ensure the homogeneous integration of this information; otherwise, end users would need to know every data model and data format of the institutions participating in each study. To provide a semantic interoperability layer, the first step is to propose a Common Data Model (CDM) representing the information to be stored, and a Core Dataset that enables the use of multiple terminologies as a shared vocabulary. Once the Core Dataset and the CDM have been selected, mapping the concepts of a given terminology onto the CDM requires a dedicated mechanism. That mechanism must define which concepts from the different vocabularies may be stored in which fields of the data model, in order to create a common representation of the information.
This final-year project presents the development of a service that implements such a mechanism to bind elements of the medical terminologies SNOMED CT, LOINC and HGNC to objects of the Health Level 7 Reference Information Model (HL7 RIM). The proposed service, named TermBinding, follows the recommendations of the HL7 TermInfo project, but also addresses important issues that arise when linking these terminologies to the proposed data model. In this process of developing semantic interoperability for cancer clinical trials, data from heterogeneous sources have to be integrated, and a homogeneous access interface to all of this information must be provided. To unify data coming from different applications and databases, it is essential to represent all of these data in a canonical or normalized form. Standardizing a given SNOMED CT concept simplifies the application of the HL7 TermInfo recommendations used to store each concept in the data model. Following this approach, semantic interoperability is successfully achieved for SNOMED CT concepts, whether pre- or post-coordinated, as well as for the LOINC and HGNC terminologies. Concepts are standardized into a normal form that can be used to bind the data to the Common Data Model based on the HL7 RIM. Although there are limitations due to the great heterogeneity of the data to be integrated, a first prototype of the proposed service is being used successfully in the context of the EURECA and INTEGRATE projects. Improving the semantic interoperability of cancer clinical trial data ultimately aims to improve oncology practice.
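The binding idea (constraining which terminology may populate which data-model field, then normalising the concept before storage) can be sketched abstractly. This is not the TermBinding service itself: the attribute names, the allowed-system table, and the normalisation step are invented for illustration.

```python
# Illustrative sketch of terminology binding: which code systems are
# allowed for which data-model attribute, plus a trivial normalisation.
# The bindings below are invented, not the TermInfo recommendations.

BINDINGS = {
    # data-model attribute   allowed code systems
    "Observation.code":      {"LOINC", "SNOMED CT"},
    "Observation.value":     {"SNOMED CT"},
    "Entity.code":           {"HGNC", "SNOMED CT"},
}

def bind(attribute, system, code):
    """Return a normalised concept entry if the binding allows it,
    otherwise raise ValueError."""
    allowed = BINDINGS.get(attribute, set())
    if system not in allowed:
        raise ValueError(f"{system} is not bound to {attribute}")
    return {"attribute": attribute, "system": system,
            "code": code.strip()}          # canonical form: trimmed code

entry = bind("Observation.code", "LOINC", " 718-7 ")
```

A production service would replace the dictionary with the project's Core Dataset and expand post-coordinated SNOMED CT expressions into a normal form before binding.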

Relevance:

80.00%

Publisher:

Abstract:

Consumer research has been at the core of the strategic planner's work since the birth of the profession in 1968. In particular, at the origin of the discipline of Strategic Planning lies the relevance of qualitative research as a reliable source for understanding the consumer in depth and for developing effective, relevant and distinctive communication campaigns. For this reason, and because of the impact that deep consumer knowledge has today, this article first reviews the literature on the functions the planner has traditionally taken on in relation to research, and then applies it to the current Spanish context through an empirical study of Spanish strategic planners. The article closes with a reflection on the prominent role the planner will play in the very near future in the Big Data landscape.

Relevance:

80.00%

Publisher:

Abstract:

Among Small and Medium-Sized Enterprises (SMEs) in particular, the UK Government's ambitions regarding BIM uptake and diffusion across the construction sector may be tempered by a realpolitik shaped in part by interactions between the industry, Higher Education (HE) and professional practice. That premise also has a global perspective. Building on the previous two papers, Architectural Technology and the BIM Acronym 1 and 2, this third iteration is a synthesis of research and investigations carried out over a number of years directly related to the practical implementation of BIM and its impact upon BE SMEs. First, the challenges, risks and potential benefits for SMEs and micros in facing up to the necessity to engage with digital tools in a competitive and volatile marketplace are discussed, including tailoring BIM to suit business models and filtering out achievable BIM outcomes from generic and bespoke aspects of practice. Second, the focus is on setting up and managing teams engaging with BIM scenarios, including the role of clients, and a range of paradigms is addressed, including lonely BIM and collaborative working. The significance of taking a whole-life view with BIM is investigated, including embedding soft landings principles into project planning and realisation. Third, procedures for setting up and managing common data environments are identified, and the value of achieving smooth information flow is addressed. The overall objective of this paper is to provide SMEs with a practical strategy for developing a toolkit for BIM implementation.

Relevance:

80.00%

Publisher:

Abstract:

The Baltic Sea is a seasonally ice-covered, marginal sea in central northern Europe. It is an essential waterway connecting highly industrialised countries. Because ship traffic is intermittently hindered by sea ice, the local weather services have been monitoring sea ice conditions for decades. In the present study we revisit a historical monitoring data set covering the winters 1960/1961 to 1978/1979. This data set, dubbed Data Bank for Baltic Sea Ice and Sea Surface Temperatures (BASIS) ice, is based on hand-drawn maps that were collected and then digitised in 1981 in a joint project of the Finnish Institute of Marine Research (today the Finnish Meteorological Institute (FMI)) and the Swedish Meteorological and Hydrological Institute (SMHI). BASIS ice was designed for storage on punch cards, and all ice information is encoded by five digits, which makes the data hard to access. Here we present a post-processed product based on the original five-digit code. Specifically, we convert it to standard ice quantities (including information on ice types), which we distribute in the current and free Network Common Data Form (NetCDF). Our post-processed data set will help to assess numerical ice models and provide easy-to-access, unique historical reference material for sea ice in the Baltic Sea. In addition, we provide statistics showcasing the data quality. The website http://www.baltic-ocean.org hosts the post-processed data and the conversion code.
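The conversion step above starts from fixed-width punch-card records, which can be unpacked mechanically once the code tables are known. The sketch below is purely hypothetical: the field layout (concentration, thickness class, ice type, quality flag) is invented for illustration, since the real BASIS code tables are documented with the data set itself.

```python
# Hypothetical sketch of unpacking a punch-card style five-digit ice code
# into named quantities. The field layout is INVENTED for illustration;
# consult the BASIS documentation for the actual code tables.

def decode_ice_code(code):
    """Split a zero-padded five-digit string into illustrative fields."""
    if len(code) != 5 or not code.isdigit():
        raise ValueError(f"expected a five-digit code, got {code!r}")
    return {
        "concentration_tenths": int(code[0]),  # 0-9 tenths of ice cover
        "thickness_class": int(code[1:3]),     # two-digit thickness class
        "ice_type": int(code[3]),              # e.g. level vs. ridged ice
        "quality_flag": int(code[4]),          # observation quality
    }

record = decode_ice_code("80312")
```

Decoded records of this kind map naturally onto NetCDF variables on a time/latitude/longitude grid, which is essentially what the post-processed product distributes.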

Relevance:

80.00%

Publisher:

Abstract:

While substance use problems are considered to be common in medical settings, they are not systematically assessed and diagnosed for treatment management. Research data suggest that the majority of individuals with a substance use disorder either do not use treatment or delay treatment-seeking for over a decade. The separation of substance abuse services from mainstream medical care and a lack of preventive services for substance abuse in primary care can contribute to under-detection of substance use problems. When fully enacted in 2014, the Patient Protection and Affordable Care Act 2010 will address these barriers by supporting preventive services for substance abuse (screening, counseling) and integration of substance abuse care with primary care. One key factor that can help to achieve this goal is to incorporate the standardized screeners or common data elements for substance use and related disorders into the electronic health records (EHR) system in the health care setting. Incentives for care providers to adopt an EHR system for meaningful use are part of the Health Information Technology for Economic and Clinical Health Act 2009. This commentary focuses on recent evidence about routine screening and intervention for alcohol/drug use and related disorders in primary care. Federal efforts in developing common data elements for use as screeners for substance use and related disorders are described. A pressing need for empirical data on screening, brief intervention, and referral to treatment (SBIRT) for drug-related disorders to inform SBIRT and related EHR efforts is highlighted.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this report is to give an overview of the results of Work Package 5, "Engineering Tools". In this work package, numerical tools have been developed for all relevant CHCP systems in the PolySMART demonstration projects (WP3). First, existing simulation platforms are described and their specific characteristics identified. Several different simulation platforms are in principle appropriate for the needs of the PolySMART project. The result is an evaluation of available simulation and engineering tools for CHCP simulation, and an agreement upon a common simulation environment within the PolySMART project. Next, numerical models for components in the demonstration projects have been developed. These models are available to the PolySMART consortium. For each modelled component, an overall and detailed working principle is formulated, including a parameter list and (in some cases) a control strategy. Finally, for four CHCP systems in the PolySMART project, a system simulation model has been developed. For each system simulation a separate deliverable is available (D5.5b to D5.5e); these deliverables replace deliverable 5.4, "System models". The numerical models for components and systems developed in the PolySMART project form a valuable basis for component development and optimisation and for system optimisation, both within and outside the project. Developers and researchers interested in more information about specific models can contact the institutes and contact persons involved in the model development.