36 results for computation- and data-intensive applications


Relevance:

100.00%

Publisher:

Abstract:

Spectra in the visible (VIS) and infrared (IR) regions contain a wide variety of information about inorganic and organic substances in sediments. This information enables a wide array of applications for the quantitative, semiquantitative, and qualitative characterization of sediment. Because the experimental setup is simple, rapid, and inexpensive, and only small sample quantities are required, the technique has become valuable in paleolimnology and Quaternary science. This article summarizes the theoretical background of VIS and IR spectroscopy, explains the analytical process, introduces the statistical tools used to interpret spectra, and provides examples of applications in Quaternary science.
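As a concrete, purely illustrative example of the statistical tools mentioned above, the sketch below fits a partial least squares (PLS) calibration of a sediment property against VIS/IR spectra; the data are synthetic stand-ins, not from the article.

```python
# Illustrative multivariate calibration: PLS regression of a sediment
# property on VIS/IR spectra (synthetic data, assumed setup).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 200                       # hypothetical calibration set
spectra = rng.normal(size=(n_samples, n_wavelengths))    # stand-in for measured absorbance
toc = spectra[:, 50] * 2.0 + rng.normal(scale=0.1, size=n_samples)  # stand-in property (e.g. TOC)

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, spectra, toc, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.2f}")
```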

Relevance:

100.00%

Publisher:

Abstract:

Cloud computing has emerged as an essential component of enterprise IT infrastructure. Migration towards a full-range, large-scale convergence of cloud and network services has become the current trend for addressing the requirements of the cloud environment. Our approach takes the infrastructure-as-a-service paradigm to build converged virtual infrastructures, which allow tailored performance to be offered and enable multi-tenancy over a common physical infrastructure. Thanks to virtualization, new ways of exploiting the physical infrastructure arise for both transport network and data centre services. This approach brings the network and data centre resources dedicated to cloud computing onto the same flexible and scalable footing. The work presented here is based on the automation of the virtual infrastructure provisioning service. On top of the virtual infrastructures, the different resources are operated and controlled in a coordinated way, with the objective of automatically tailoring connectivity services to the dynamics of the cloud service. Furthermore, to support the elasticity of cloud services through the optical network, dynamic re-planning features have been added to the virtual infrastructure service, allowing existing virtual infrastructures to be scaled up or down to optimize resource utilisation and dynamically adapt to users' demands. Dynamic re-planning thus becomes a key component for coordinating cloud and optical network resources optimally in terms of resource utilisation. The presented work is complemented by a use case in which the virtual infrastructure service is adopted in a distributed enterprise information system that scales up and down as a function of application requests.
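To make the re-planning idea concrete, here is a deliberately simplified sketch (the class and thresholds are hypothetical, not the paper's service interface) of scaling a virtual infrastructure up or down from observed utilisation:

```python
# Toy re-planning loop: grow or shrink a virtual infrastructure based on
# utilisation, with a hysteresis band to avoid oscillation.
from dataclasses import dataclass

@dataclass
class VirtualInfrastructure:          # hypothetical resource bundle
    vm_count: int
    link_capacity_gbps: float

def replan(vi: VirtualInfrastructure, utilisation: float,
           high: float = 0.8, low: float = 0.3) -> VirtualInfrastructure:
    """Scale up when utilisation is high, down when it is low."""
    if utilisation > high:
        return VirtualInfrastructure(vi.vm_count + 1, vi.link_capacity_gbps * 1.5)
    if utilisation < low and vi.vm_count > 1:
        return VirtualInfrastructure(vi.vm_count - 1, vi.link_capacity_gbps / 1.5)
    return vi

vi = VirtualInfrastructure(vm_count=4, link_capacity_gbps=10.0)
for u in (0.85, 0.90, 0.25):          # sample utilisation observations
    vi = replan(vi, u)
print(vi)
```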

Relevance:

100.00%

Publisher:

Abstract:

The liquid argon calorimeter is a key component of the ATLAS detector installed at the CERN Large Hadron Collider. The primary purpose of this calorimeter is the measurement of electron and photon kinematic properties; it also provides crucial input for measuring jets and missing transverse momentum. An advanced data-monitoring procedure was designed to quickly identify issues that would affect detector performance and to ensure that only the best-quality data are used for physics analysis. This article presents the validation procedure developed during the 2011 and 2012 LHC data-taking periods, in which more than 98% of the proton-proton luminosity recorded by ATLAS at centre-of-mass energies of 7 and 8 TeV had calorimeter data quality suitable for physics analysis.
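As a schematic illustration of the bookkeeping behind such a data-quality statement (not the actual ATLAS tooling), one can sum the luminosity of data blocks flagged as good and quote it as a fraction of the total:

```python
# Hypothetical (luminosity, quality-flag) pairs per data block.
blocks = [
    (120.0, "good"), (80.0, "good"), (2.5, "noise_burst"), (95.0, "good"),
]
total = sum(lumi for lumi, _ in blocks)
good = sum(lumi for lumi, flag in blocks if flag == "good")
print(f"fraction usable for physics: {good / total:.1%}")
```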

Relevance:

100.00%

Publisher:

Abstract:

Report on the project activities in 2003.

Relevance:

100.00%

Publisher:

Abstract:

Background: In this paper we present the initial steps in the creation of an integrated platform for the provision of a series of eHealth tools and services to both citizens and travellers in isolated areas of the southeast Mediterranean, and on board ships travelling across it. The platform was created through an INTERREG IIIB ARCHIMED project called INTERMED.

Methods: The support of primary healthcare, home care and the continuing education of physicians are the three major areas that the proposed platform addresses. The proposed system is based on state-of-the-art telemedicine systems and is able to provide the following healthcare services: i) telecollaboration and teleconsultation services between remotely located healthcare providers, ii) telemedicine services in emergencies, iii) home telecare services for "at-risk" citizens such as the elderly and patients with chronic diseases, and iv) eLearning services for the continuous training, through seminars, of both healthcare personnel (physicians, nurses, etc.) and persons supporting "at-risk" citizens. These systems support data transmission over simple phone lines, internet connections, integrated services digital network/digital subscriber lines, satellite links, mobile networks (GPRS/3G), and wireless local area networks. The data correspond, among others, to voice, vital biosignals, still medical images, video, and data used by eLearning applications. The proposed platform comprises several systems, each supporting different services, which were integrated using a common data storage and exchange scheme in order to achieve system interoperability in terms of software, language and national characteristics.

Results: The platform has been installed and evaluated in different rural and urban sites in Greece, Cyprus and Italy. The evaluation mainly addressed technical issues and user satisfaction. The selected sites include, among others, rural health centres, ambulances, homes of "at-risk" citizens, and a ferry.

Conclusions: The results demonstrated the functionality and usefulness of the platform in various rural places in Greece, Cyprus and Italy. However, further actions are needed to familiarize local healthcare systems and the different population groups with mature technological solutions for the provision of healthcare services, and to bring these into everyday use.
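As a rough illustration of the common data storage and exchange scheme mentioned in the Methods (field names are hypothetical, not the INTERMED schema), a platform like this might normalize every measurement into a single record format:

```python
# Hypothetical common exchange record for one vital-biosignal measurement.
import json
from datetime import datetime, timezone

record = {
    "patient_id": "anon-0042",                 # pseudonymised identifier
    "site": "rural-health-centre",             # e.g. health centre, ambulance, ferry
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "signal": "ecg",                           # vital biosignal type
    "samples_hz": 250,
    "payload_ref": "store://ecg/anon-0042.bin",  # reference to the stored waveform
    "language": "el",                          # national-characteristics metadata
}
print(json.dumps(record, indent=2))
```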

Relevance:

100.00%

Publisher:

Abstract:

A confocal imaging and image-processing scheme is introduced to visualize and evaluate the spatial distribution of spectral information in tissue. The image data are recorded using a confocal laser-scanning microscope equipped with a detection unit that provides high spectral resolution. The processing scheme is based on spectral data, is less error-prone than intensity-based visualization and evaluation methods, and provides quantitative information on the composition of the sample. The method is tested and validated in the context of the development of dermal drug delivery systems, and a quantitative uptake indicator is introduced to compare the performance of different delivery systems. A drug penetration study was performed in vitro. The results show that the method is able to detect, visualize and measure spectral information in tissue. In the penetration study, the uptake efficiencies of different experimental setups could be discriminated and quantitatively described. The developed uptake indicator is a step towards quantitative assessment and, more generally beyond pharmaceutical research, provides valuable information on tissue composition. It can potentially be used for clinical in vitro and in vivo applications.
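A simplified sketch of the general idea, under our own assumptions rather than the paper's exact algorithm: unmix each pixel spectrum against reference spectra and report the drug's share of the fitted signal as a ratio-style uptake indicator.

```python
# Per-pixel spectral unmixing via non-negative least squares, with the
# drug fraction of the fitted signal as a toy uptake indicator.
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(500, 700, 64)
drug_ref = np.exp(-((wavelengths - 560) / 15) ** 2)    # hypothetical reference spectra
tissue_ref = np.exp(-((wavelengths - 620) / 40) ** 2)
refs = np.column_stack([drug_ref, tissue_ref])

rng = np.random.default_rng(1)
pixel = 0.3 * drug_ref + 1.0 * tissue_ref + rng.normal(0, 0.01, 64)  # one noisy pixel
coeffs, _ = nnls(refs, pixel)                 # non-negative abundance estimates
uptake = coeffs[0] / coeffs.sum()             # drug fraction as uptake indicator
print(f"uptake indicator: {uptake:.2f}")
```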

Relevance:

100.00%

Publisher:

Abstract:

CONTEXT: Determination of arginine vasopressin (AVP) concentrations may be helpful to guide therapy in critically ill patients. A new assay analyzing copeptin, a stable peptide derived from the AVP precursor, has been introduced. OBJECTIVE: Our objective was to determine plasma copeptin concentrations in healthy volunteers and critically ill patients. DESIGN: We conducted a post hoc analysis of plasma samples and data from a prospective study. SETTING: The setting was a 12-bed general and surgical intensive care unit (ICU) in a tertiary university teaching hospital. PATIENTS: Our subjects were 70 healthy volunteers and 157 ICU patients with sepsis, with systemic inflammatory response syndrome (SIRS), or after cardiac surgery. INTERVENTIONS: There were no interventions. MAIN OUTCOME MEASURES: Copeptin plasma concentrations, demographic data, AVP plasma concentrations, and a multiple organ dysfunction syndrome score were documented 24 h after ICU admission. RESULTS: AVP (P < 0.001) and copeptin (P < 0.001) concentrations were significantly higher in ICU patients than in controls. Patients after cardiac surgery had higher AVP (P = 0.003) and copeptin (P = 0.003) concentrations than patients with sepsis or SIRS. Independent of critical illness, copeptin and AVP concentrations were highly correlated with each other. Critically ill patients with sepsis and SIRS exhibited a significantly higher ratio of copeptin to AVP plasma concentrations than patients after cardiac surgery (P = 0.012). The American Society of Anesthesiologists classification (P = 0.046) and C-reactive protein concentrations (P = 0.006) were significantly correlated with the copeptin/AVP ratio. CONCLUSIONS: Plasma concentrations of copeptin and AVP in healthy volunteers and critically ill patients correlate significantly with each other. The copeptin/AVP ratio is increased in patients with sepsis and SIRS, suggesting that copeptin may overestimate AVP plasma concentrations in these patients.
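Purely for illustration (synthetic numbers, not study data), the group comparison of copeptin/AVP ratios reported above is the kind of analysis a rank-based test supports:

```python
# Compare copeptin/AVP ratios between two hypothetical patient groups.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
ratio_sepsis = rng.lognormal(mean=1.2, sigma=0.4, size=60)    # synthetic ratios
ratio_cardiac = rng.lognormal(mean=0.9, sigma=0.4, size=60)
u, p = mannwhitneyu(ratio_sepsis, ratio_cardiac)
print(f"Mann-Whitney U = {u:.0f}, p = {p:.3g}")
```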

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: To describe the electronic medical databases used in antiretroviral therapy (ART) programmes in lower-income countries and to assess the measures such programmes employ to maintain and improve data quality and reduce the loss of patients to follow-up. METHODS: In 15 countries of Africa, South America and Asia, a survey was conducted from December 2006 to February 2007 on the use of electronic medical record systems in ART programmes. Patients enrolled in the sites at the time of the survey but not seen during the previous 12 months were considered lost to follow-up. Data quality was assessed by computing the percentage of missing key variables (age, sex, clinical stage of HIV infection, CD4+ lymphocyte count and year of ART initiation). Associations between site characteristics (such as the number of staff members dedicated to data management), measures to reduce loss to follow-up (such as the presence of staff dedicated to tracing patients), data quality and loss to follow-up were analysed using multivariate logit models. FINDINGS: Twenty-one sites that together provided ART to 50 060 patients were included (median number of patients per site: 1000; interquartile range, IQR: 72-19 320). Eighteen sites (86%) used an electronic database for medical record-keeping; 15 (83%) of these relied on software intended for personal or small-business use. The median percentage of missing data for key variables per site was 10.9% (IQR: 2.0-18.9%) and declined with training in data management (odds ratio, OR: 0.58; 95% confidence interval, CI: 0.37-0.90) and with the weekly hours a clerk spent on the database per 100 patients on ART (OR: 0.95; 95% CI: 0.90-0.99). About 10 weekly hours per 100 patients on ART were required to reduce missing data for key variables to below 10%. The median percentage of patients lost to follow-up 1 year after starting ART was 8.5% (IQR: 4.2-19.7%). Strategies to reduce loss to follow-up included outreach teams, community-based organizations and checking death registry data. Implementation of all three strategies substantially reduced losses to follow-up (OR: 0.17; 95% CI: 0.15-0.20). CONCLUSION: The quality of the data collected and the retention of patients in ART treatment programmes are unsatisfactory at many sites involved in the scale-up of ART in resource-limited settings, mainly because of insufficient staff trained to manage data and trace patients lost to follow-up.
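A small sketch of the missing-data metric described in the Methods (field names are illustrative, not the survey's database schema): the percentage of missing values among the five key variables across a site's records.

```python
# Percentage of missing cells among the five key variables per site.
KEY_VARS = ("age", "sex", "clinical_stage", "cd4_count", "art_start_year")

def pct_missing(records: list[dict]) -> float:
    cells = [r.get(v) for r in records for v in KEY_VARS]
    return 100.0 * sum(v is None for v in cells) / len(cells)

records = [  # two hypothetical patient records
    {"age": 34, "sex": "F", "clinical_stage": 3, "cd4_count": None, "art_start_year": 2005},
    {"age": None, "sex": "M", "clinical_stage": None, "cd4_count": 210, "art_start_year": 2006},
]
print(f"missing key variables: {pct_missing(records):.1f}%")
```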

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: This study developed percentile curves for anthropometric (waist circumference) and cardiovascular (lipid profile) risk factors for US children and adolescents. STUDY DESIGN: A representative sample of US children and adolescents from the National Health and Nutrition Examination Survey from 1988 to 1994 (NHANES III) and the current national series (NHANES 1999-2006) were combined. Percentile curves were constructed, nationally weighted, and smoothed using the Lambda, Mu, and Sigma (LMS) method. The percentile curves include age- and sex-specific percentile values that correspond with, and transition into, the adult abnormal cut-off values for each of these anthropometric and cardiovascular components. To increase the sample size, a second series of percentile curves was also created from the combination of the two NHANES databases along with cross-sectional data from the Bogalusa Heart Study, the Muscatine Study, the Fels Longitudinal Study and the Princeton Lipid Research Clinics Study. RESULTS: These analyses resulted in a series of growth curves for waist circumference, total cholesterol, LDL cholesterol, triglycerides, and HDL cholesterol from a combination of pediatric data sets. The cut-off for abnormal waist circumference in adult males (102 cm) was equivalent to the 94th percentile line in 18-year-olds, and the cut-off in adult females (88 cm) was equivalent to the 84th percentile line in 18-year-olds. Triglycerides showed a bimodal pattern among females, with an initial peak at age 11 and a second at age 20, whereas the curve for males increased steadily with age. The HDL curve for females was relatively flat, but the male curve declined starting at age 9 years. Similar curves for total and LDL cholesterol were constructed for both males and females. When data from the additional child studies were added to the national data, there was little difference in the patterns or rates of change from year to year. CONCLUSIONS: These curves represent waist and lipid percentiles for US children and adolescents, with identification of values that transition into adult abnormalities. They could be used conditionally for epidemiological and possibly clinical applications, although they need to be validated against longitudinal data.
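For reference, the Lambda, Mu, and Sigma (LMS) method named above rests on Cole's transformation, which converts a measurement x into a z-score given the fitted skewness (L), median (M) and coefficient of variation (S) at a given age and sex. The parameter values below are hypothetical, chosen only to show the computation:

```python
# Cole's LMS transformation: z = ((x/M)^L - 1) / (L*S), with the log form
# used in the limit L -> 0.
import math

def lms_z(x: float, L: float, M: float, S: float) -> float:
    if abs(L) > 1e-9:
        return ((x / M) ** L - 1.0) / (L * S)
    return math.log(x / M) / S

# Hypothetical L, M, S for illustration: a 100 cm waist measurement.
print(f"z = {lms_z(100.0, L=-0.2, M=81.0, S=0.15):.2f}")
```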

Relevance:

100.00%

Publisher:

Abstract:

The mid-Holocene (6 kyr BP; thousand years before present) is a key period for studying the consistency between model results and proxy-based reconstructions, as it is a standard test case for models and a reasonable number of proxy-based records is available. Taking advantage of this relatively large amount of information, we have compared a compilation of 50 air and sea surface temperature reconstructions with the results of three simulations performed with general circulation models and one carried out with LOVECLIM, a model of intermediate complexity. The conclusions derived from this analysis confirm that models and data agree on the large-scale spatial pattern, but that the models underestimate the magnitude of some observed changes and that large discrepancies are observed at the local scale. To further investigate the origin of those inconsistencies, we have constrained LOVECLIM to follow the signal recorded by the proxies selected in the compilation, using a data-assimilation method based on a particle filter. In one simulation all 50 proxy-based records are used, while in the other two only the continental or the oceanic proxy-based records constrain the model results. As expected, data assimilation improves the consistency between model results and the reconstructions. In particular, this is achieved in a robust way in all the experiments through a strengthening of the midlatitude westerlies, which warms northern Europe. Furthermore, comparing the LOVECLIM simulations with and without data assimilation has also objectively identified 16 proxy-based paleoclimate records whose reconstructed signal is incompatible either with the signal recorded by some other proxy-based records or with the model physics.
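A schematic particle-filter update of the kind used to constrain a model with proxy records (a sketch under generic Gaussian-error assumptions, not the LOVECLIM implementation): particles are weighted by their fit to the proxies and then resampled.

```python
# Weight simulated ensemble members ("particles") by their misfit to the
# proxy reconstructions, then resample in proportion to the weights.
import numpy as np

rng = np.random.default_rng(3)
n_particles = 96
simulated = rng.normal(0.0, 1.0, size=(n_particles, 50))  # per-particle values at 50 proxy sites
proxies = rng.normal(0.5, 1.0, size=50)                    # stand-in reconstructed anomalies
sigma = 1.0                                                # assumed proxy error

log_w = -0.5 * np.sum((simulated - proxies) ** 2, axis=1) / sigma**2
w = np.exp(log_w - log_w.max())
w /= w.sum()
kept = rng.choice(n_particles, size=n_particles, p=w)      # particles carried forward
print(f"effective sample size: {1.0 / np.sum(w**2):.1f}")
```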

Relevance:

100.00%

Publisher:

Abstract:

Several strategies relying on kriging have recently been proposed for adaptively estimating contour lines and excursion sets of functions under a severely limited evaluation budget. The recently released R package KrigInv 3 is presented; it offers a sound implementation of various sampling criteria for these kinds of inverse problems. KrigInv is based on the DiceKriging package and thus benefits from a number of options concerning the underlying kriging models. Six implemented sampling criteria are detailed in a tutorial and illustrated with graphical examples. The different functionalities of KrigInv are explained step by step. Additionally, two recently proposed criteria for batch-sequential inversion are presented, enabling advanced users to distribute function evaluations in parallel on clusters or clouds of machines. Finally, auxiliary problems are discussed, including the fine-tuning of the numerical integration and optimization procedures used in computing and optimizing the considered criteria.
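KrigInv itself is an R package; as a language-neutral sketch of the underlying idea (a toy function and design of our own, not one of KrigInv's criteria), the snippet below computes from a Gaussian-process posterior the probability that the function exceeds a threshold at candidate points, the basic quantity on which such inversion sampling criteria are built:

```python
# Posterior exceedance probability under a GP surrogate: points where this
# probability is near 0.5 are the most uncertain, hence natural sampling targets.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

f = lambda x: np.sin(3 * x).ravel()                # toy expensive function
X = np.array([[0.1], [0.4], [0.6], [0.9]])         # already-evaluated design points
gp = GaussianProcessRegressor().fit(X, f(X))

T = 0.5                                            # excursion threshold
Xc = np.linspace(0, 1, 5).reshape(-1, 1)           # candidate points
mu, sd = gp.predict(Xc, return_std=True)
p_exceed = norm.sf((T - mu) / np.maximum(sd, 1e-12))
print(np.round(p_exceed, 2))
```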

Relevance:

100.00%

Publisher:

Abstract:

This paper considers a framework where data from correlated sources are transmitted with the help of network coding in ad hoc network topologies. The correlated data are encoded independently at the sensors, and network coding is employed at the intermediate nodes in order to improve data delivery performance. In this setting, we focus on the problem of reconstructing the sources at the decoder when perfect decoding is not possible due to losses or bandwidth variations. We show that the similarity of the source data can be exploited at the decoder to permit decoding based on a novel and simple approximate decoding scheme. We analyze the influence of the network coding parameters, and in particular the size of the finite coding field, on the decoding performance. We further determine the optimal field size that maximizes the expected decoding performance as a trade-off between the information loss incurred by limiting the resolution of the source data and the error probability in the reconstructed data. Moreover, we show that the performance of approximate decoding improves when the accuracy of the source model increases, even with simple approximate decoding techniques. We provide illustrative examples showing how the proposed algorithm can be deployed in sensor networks and distributed imaging applications.
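One side of the field-size trade-off discussed above can be made concrete with a standard computation (ours, not the paper's full analysis): the probability that k random coding vectors over GF(q) are linearly independent, so that exact decoding from k received packets is possible, grows with the field size q.

```python
# P(k random k-vectors over GF(q) are independent) = prod_{i=1..k} (1 - q^-i).
def p_invertible(k: int, q: int) -> float:
    p = 1.0
    for i in range(1, k + 1):
        p *= 1.0 - q ** (-i)
    return p

for m in (1, 2, 4, 8):
    q = 2 ** m
    print(f"GF(2^{m}): P(exactly decodable with k=4 packets) = {p_invertible(4, q):.3f}")
```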