759 results for Spectroscopy computing


Relevance: 20.00%

Abstract:

This paper presents a proposal for a management model based on reliability requirements for Cloud Computing (CC). The proposal is grounded in a literature review of the problems, challenges, and ongoing studies related to the safety and reliability of Information Systems (IS) in this technological environment, examining the existing obstacles and challenges from the point of view of respected authors on the subject. The main issues are addressed and structured as a model, called the "Trust Model for Cloud Computing environment". This is a proactive proposal that aims to organize and discuss management solutions for the CC environment, targeting improved reliability of IS application operation for both providers and their customers. Central to trust, one of the CC challenges is the development of models for mutual audit management agreements, so that a formal relationship can be established covering the relevant legal responsibilities. To establish and control the appropriate contractual requirements, it is necessary to adopt technologies that can collect the data needed to inform risk decisions, such as access usage, security controls, location, and other references related to the use of the service. In this process, cloud service providers and consumers alike must have metrics and controls to support cloud-use management in compliance with the SLAs agreed between the parties. Organizing these studies and disseminating them in the market as a conceptual model that establishes parameters to regulate a reliable relationship between provider and user of IT services in a CC environment is a useful instrument for guiding providers, developers, and users toward secure and reliable services and applications.
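
The abstract's point about metrics and controls for SLA compliance can be made concrete with a small sketch. The following is a minimal, hypothetical illustration (the SlaTerm fields, thresholds, and metric names are assumptions for illustration, not taken from the paper):

```python
from dataclasses import dataclass

@dataclass
class SlaTerm:
    """One agreed SLA term, e.g. minimum monthly availability."""
    name: str
    threshold: float
    higher_is_better: bool = True

def check_compliance(metrics: dict, terms: list) -> list:
    """Return the names of SLA terms violated by the collected metrics."""
    violations = []
    for term in terms:
        value = metrics.get(term.name)
        if value is None:
            violations.append(term.name)  # missing telemetry is itself a trust issue
        elif term.higher_is_better and value < term.threshold:
            violations.append(term.name)
        elif not term.higher_is_better and value > term.threshold:
            violations.append(term.name)
    return violations

# Hypothetical monthly metrics collected from provider-side monitoring.
metrics = {"availability_pct": 99.89, "mean_response_ms": 180.0}
terms = [
    SlaTerm("availability_pct", 99.95),
    SlaTerm("mean_response_ms", 200.0, higher_is_better=False),
]
print(check_compliance(metrics, terms))  # ['availability_pct']
```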

Relevance: 20.00%

Abstract:

The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto.

Relevance: 20.00%

Abstract:

Kidney failure means that one's kidneys have unexpectedly stopped functioning; once chronic disease is suspected, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome diagnosed. Although the patient's history and physical examination are essential, key information must be obtained from an estimate of the glomerular filtration rate and the analysis of serum biomarkers. Chronic kidney disease denotes abnormal kidney function and/or structure, and there is evidence that treatment may prevent or delay its progression, in part by reducing or preventing associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular disease. Acute kidney injury appears abruptly, with rapid deterioration of renal function, but is often reversible if recognized early and treated promptly. In both situations, acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. Assessing these pathologies is therefore mandatory, although it is hard to do with traditional methodologies and existing problem-solving tools. Hence, this work focuses on the development of a hybrid decision support system whose knowledge representation and reasoning procedures are based on Logic Programming, allowing one to handle incomplete, unknown, and even contradictory information, complemented with an approach to computing centered on Artificial Neural Networks used to weigh the Degree-of-Confidence in a given outcome. The study involved 558 patients with a mean age of 51.7 years; chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables grouped into five main categories. The proposed model showed good performance in the diagnosis of chronic kidney disease, with sensitivity between 93.1% and 94.9% and specificity between 91.9% and 94.2%.
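
As a reminder of what the reported figures measure, here is a minimal sketch of computing sensitivity and specificity from a binary confusion matrix. The counts below are invented for illustration; only the class sizes (175 cases out of 558 patients) come from the study:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative outcomes for 175 CKD cases and 383 controls (hypothetical).
sens, spec = sensitivity_specificity(tp=164, fn=11, tn=355, fp=28)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")  # ~93.7%, ~92.7%
```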

Relevance: 20.00%

Abstract:

The development of ubiquitous computing (ubicomp) environments raises several challenges in terms of their evaluation. Ubicomp virtual reality prototyping tools enable users to experience the system to be developed and are of great help in facing those challenges, as they support developers in assessing the consequences of a design decision in the early phases of development. Given the situated nature of ubicomp environments, a particular issue to consider is the level of realism provided by the prototypes. This work presents a case study in which two ubicomp prototypes, featuring different levels of immersion (desktop-based versus CAVE-based), were developed and compared. The goal was to determine the cost/benefit relation of both solutions, which one provided better user experience results, and whether simpler solutions provide the same user experience results as more elaborate ones.

Relevance: 20.00%

Abstract:

One of the central topics of the project concerns the nature of computer science. The recent emergence of this discipline, together with its hybrid origin as both a formal science and a technological discipline, means that its characterization is not yet complete, much less agreed upon among scientists in the field. In "Three paradigms of Computer Science", A. Eden presents three admittedly exaggerated positions on how to understand the object of study (ontology), the working methods (methodology), and the structure of the theory and the justification of computing knowledge (epistemology): the so-called rationalist position, based on the idea that programs are logical formulas and that the way of working is deductive; the technocratic position, which presents computer science as an engineering discipline; and the position there called scientific, which would assimilate computing to the empirical sciences. Some problems of computer science are related to questions in the philosophy of mathematics, in particular the relation between abstract entities and the world. However, the prescriptive character of the axioms and theorems of programming theories may allow alternative interpretations and would strongly question the possibility of regarding computer science as an empirical science, at least in the traditional sense. On the other hand, the kind of analysis of computer science proposed in this project may contribute new ideas for thinking about problems in the philosophy of mathematics. An example of such possible contributions can be seen in Arkoudas's "Computers, Justification, and Mathematical Knowledge", which sheds new light on the problem of the meaning of mathematical proofs. The objectives of the project are: to characterize the field of computer science; to evaluate the ontological, epistemological, and methodological foundations of current computer science; and to analyze the relations between the different heuristic and epistemic perspectives and programming practices.

Relevance: 20.00%

Abstract:

The main idea of this project involves the study of physical parameters and phenomena. The results will be applied to the development of software and methodologies for material quantification by means of electron probe microanalysis and scanning electron microscopy. Electron probe microanalysis is not an absolute technique: it requires reference standards in order to avoid certain geometrical and atomic parameters that cannot be known with adequate precision. To arrive at a standardless method, atomic and instrumental parameters must be determined, which is one of the aspects this project addresses. In addition, the parameters studied will be incorporated into a quantification software developed by members of the project at FaMAF. Another objective of the work plan is to study the spatial resolution potential of a focused electron beam, with the aim of developing a methodology to characterize interphases, grain boundaries, and inclusions with submicron resolution, since traditional quantification procedures are restricted to samples that are flat and homogeneous within the interaction volume, whereas the characterization of inhomogeneities at the micrometer level has not yet been developed, with few exceptions.
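
For background on why standards matter here: conventional quantification rests on the measured intensity ratio between the unknown sample and a standard. A minimal sketch of this relation (Castaing's first approximation with a generic matrix-correction factor, textbook EPMA rather than the project's own formulation):

$$ k_i = \frac{I_i^{\mathrm{sample}}}{I_i^{\mathrm{standard}}}, \qquad C_i^{\mathrm{sample}} \approx k_i \, C_i^{\mathrm{standard}} \cdot \mathrm{ZAF}_i $$

where $\mathrm{ZAF}_i$ collects the atomic-number, absorption, and fluorescence corrections. A standardless method must replace the measured standard intensity with a prediction based on atomic and instrumental parameters, which is why their accurate determination is central to the project.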

Relevance: 20.00%

Abstract:

Monitoring, object-orientation, real-time, execution-time, scheduling

Relevance: 20.00%

Abstract:

University of Magdeburg, Faculty of Natural Sciences, dissertation, 2012

Relevance: 20.00%

Abstract:

University of Magdeburg, Faculty of Natural Sciences, dissertation, 2014

Relevance: 20.00%

Abstract:

Raman spectroscopy has been applied to characterize fiber dyes and determine the discriminating ability of the method. Black, blue, and red acrylic, cotton, and wool samples were analyzed. Four excitation sources were used to obtain complementary responses in the case of fluorescent samples. Fibers that did not provide informative spectra using a given laser were usually detected using another wavelength. For any colored acrylic, the 633-nm laser did not provide Raman information. The 514-nm laser provided the highest discrimination for blue and black cotton, but half of the blue cottons produced noninformative spectra. The 830-nm laser exhibited the highest discrimination for red cotton. Both visible lasers provided the highest discrimination for black and blue wool, and NIR lasers produced remarkable separation for red and black wool. This study shows that the discriminating ability of Raman spectroscopy depends on the fiber type, color, and the laser wavelength.
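
The "discriminating ability" referred to here is commonly quantified in forensic comparison studies as a discriminating power: the fraction of all sample pairs a method can tell apart. A minimal sketch under that standard definition (the sample labels and the distinguishability rule are hypothetical, and this is not the paper's code):

```python
from itertools import combinations

def discriminating_power(samples, distinguishable):
    """DP = discriminated pairs / total pairs over all pairwise comparisons."""
    pairs = list(combinations(range(len(samples)), 2))
    hits = sum(1 for i, j in pairs if distinguishable(samples[i], samples[j]))
    return hits / len(pairs)

# Hypothetical spectra, judged distinguishable when their labels differ.
spectra = ["blue_cotton_A", "blue_cotton_A", "blue_cotton_B", "blue_cotton_C"]
print(discriminating_power(spectra, lambda a, b: a != b))  # 5/6 pairs -> ~0.83
```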

Relevance: 20.00%

Abstract:

Cerebral metabolism is compartmentalized between neurons and glia. Although glial glycolysis is thought to largely sustain the energetic requirements of neurotransmission while oxidative metabolism takes place mainly in neurons, this hypothesis is a matter of debate. The compartmentalization of cerebral metabolic fluxes can be determined by ¹³C nuclear magnetic resonance (NMR) spectroscopy upon infusion of ¹³C-enriched compounds, especially glucose. Rats under light α-chloralose anesthesia were infused with [1,6-¹³C]glucose, and ¹³C enrichment in brain metabolites was measured by ¹³C NMR spectroscopy with high sensitivity and spectral resolution at 14.1 T. This allowed ¹³C enrichment curves of amino acid carbons to be determined with high reproducibility and cerebral metabolic fluxes to be estimated reliably (mean error of 8%). We further found that TCA cycle intermediates are not required for flux determination in mathematical models of brain metabolism. The neuronal tricarboxylic acid (TCA) cycle rate (V_TCA) and the neurotransmission rate (V_NT) were 0.45 ± 0.01 and 0.11 ± 0.01 μmol/g/min, respectively. Glial V_TCA was found to be 38 ± 3% of total cerebral oxidative metabolism, amounting to more than half of neuronal oxidative metabolism. Furthermore, the glial anaplerotic pyruvate carboxylation rate (V_PC) was 0.069 ± 0.004 μmol/g/min, i.e., 25 ± 1% of the glial TCA cycle rate. These results support a role of glial cells as active partners of neurons during synaptic transmission beyond glycolytic metabolism.
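
The quoted fractions can be sanity-checked from the rates themselves. Under the simplifying assumption that total oxidative metabolism is the sum of the neuronal and glial TCA cycle rates (a rough reading of the abstract, not the paper's full model):

$$ V_{\mathrm{TCA}}^{\mathrm{glia}} \approx \frac{V_{\mathrm{PC}}}{0.25} = \frac{0.069}{0.25} \approx 0.28\ \mu\mathrm{mol/g/min} $$

$$ \frac{0.28}{0.28 + 0.45} \approx 0.38, \qquad \frac{0.28}{0.45} \approx 0.61 $$

which reproduces the reported 38 ± 3% glial share and the statement that glial oxidative metabolism exceeds half of the neuronal rate.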

Relevance: 20.00%

Abstract:

This project examines two distinct distributed computing systems: Condor and BOINC. It explores the possibilities for getting the two systems to work together, taking the most effective part of each so that they complement one another.

Relevance: 20.00%

Abstract:

Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models. The methods considered include value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally, while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than that obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that the risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
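
As a generic illustration of the first method, here is a minimal value function iteration sketch for a toy stochastic growth problem (the grid, utility, shock process, and parameters are illustrative assumptions, not the paper's fiscal-policy model):

```python
import numpy as np

# Toy problem: choose next-period capital k' on a grid to maximize
# u(c) + beta * E[V(k', z')], with a two-state productivity shock z.
beta, alpha, delta = 0.95, 0.36, 0.1
k_grid = np.linspace(0.1, 2.0, 60)          # capital grid
z_vals = np.array([0.9, 1.1])               # shock values
P = np.array([[0.8, 0.2], [0.2, 0.8]])      # shock transition matrix

def utility(c):
    # log utility; infeasible (non-positive) consumption gets a large penalty
    return np.where(c > 1e-10, np.log(np.maximum(c, 1e-10)), -1e10)

V = np.zeros((len(k_grid), len(z_vals)))    # V[ik, iz]
for _ in range(1000):
    EV = V @ P.T                             # EV[ik', iz] = E[V(k', z') | z]
    resources = (z_vals[None, :, None] * k_grid[:, None, None]**alpha
                 + (1 - delta) * k_grid[:, None, None])
    c = resources - k_grid[None, None, :]    # c[ik, iz, ik']
    candidate = utility(c) + beta * EV.T[None, :, :]
    V_new = candidate.max(axis=2)            # maximize over k'
    if np.max(np.abs(V_new - V)) < 1e-8:     # sup-norm convergence check
        break
    V = V_new
print("value at median capital:", V[len(k_grid) // 2])
```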