943 results for software analysis


Relevance: 30.00%

Publisher:

Abstract:

The 21-day experimental gingivitis model, an established noninvasive model of inflammation in response to increasing bacterial accumulation in humans, is designed to enable the study of both the induction and resolution of inflammation. Here, we have analyzed gingival crevicular fluid, an oral fluid comprising a serum transudate and tissue exudates, by LC-MS/MS using Fourier transform ion cyclotron resonance mass spectrometry and iTRAQ isobaric mass tags, to establish meta-proteomic profiles of inflammation-induced changes in proteins in healthy young volunteers. Across the course of experimentally induced gingivitis, we identified 16 bacterial and 186 human proteins. Although abundances of the bacterial proteins identified did not vary temporally, Fusobacterium outer membrane proteins were detected. Fusobacterium species have previously been associated with periodontal health or disease. The human proteins identified spanned a wide range of compartments (both extracellular and intracellular) and functions, including serum proteins, proteins displaying antibacterial properties, and proteins with functions associated with cellular transcription, DNA binding, the cytoskeleton, cell adhesion, and cilia. PolySNAP3 clustering software was used in a multilayered analytical approach. Clusters of proteins that were associated with changes in the clinical parameters included neuronal and synapse-associated proteins.

Relevance: 30.00%

Publisher:

Abstract:

We investigate knowledge exchange among commercial organizations, the rationale behind it, and its effects on the market. Knowledge exchange is known to be beneficial for industry, but in order to explain it, authors have used high-level concepts like network effects, reputation, and trust. We attempt to formalize a plausible and elegant explanation of how and why companies adopt information exchange and why it benefits the market as a whole when this happens. This explanation is based on a multiagent model that simulates a market of software providers. Even though the model does not include any high-level concepts, information exchange naturally emerges during simulations as a successful profitable behavior. The conclusions reached by this agent-based analysis are twofold: 1) a straightforward set of assumptions is enough to give rise to exchange in a software market, and 2) knowledge exchange is shown to increase the efficiency of the market.
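
As a hedged illustration of the kind of agent-based model described (not the authors' actual model; all parameters and names below are invented), the following Python sketch shows knowledge-sharing agents out-earning hoarders in a toy software market:

```python
# Minimal sketch (not the authors' model): a toy market of software providers
# where agents that exchange knowledge tend to out-earn hoarders.
import random

random.seed(1)

class Provider:
    def __init__(self, shares):
        self.shares = shares      # whether this agent exchanges knowledge
        self.knowledge = 1.0      # proxy for product quality
        self.profit = 0.0

def step(agents):
    sharers = [a for a in agents if a.shares]
    # Sharers pool a fraction of their knowledge with each other.
    if len(sharers) > 1:
        pool = sum(a.knowledge for a in sharers) / len(sharers)
        for a in sharers:
            a.knowledge += 0.1 * pool
    # Each agent wins customers in proportion to relative quality.
    total = sum(a.knowledge for a in agents)
    for a in agents:
        a.profit += 100 * a.knowledge / total

agents = [Provider(shares=(i % 2 == 0)) for i in range(10)]
for _ in range(50):
    step(agents)

for label, group in [("sharers", True), ("hoarders", False)]:
    profits = [a.profit for a in agents if a.shares == group]
    print(label, round(sum(profits) / len(profits), 1))
```

Even in this stripped-down setting, sharing raises the sharers' product quality faster than the market average, so exchange pays off without any appeal to reputation or trust.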

Relevance: 30.00%

Publisher:

Abstract:

Biomass-To-Liquid (BTL) is one of the most promising low carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass, the so-called "second generation biofuels" that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes, including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect its output (i.e. production cost). This was the first time that an uncertainty analysis was included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification, due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
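
As a rough sketch of the Monte Carlo uncertainty analysis described (the cost model, parameter ranges and distributions below are illustrative assumptions, not values from the thesis):

```python
# Illustrative sketch only: Monte Carlo uncertainty analysis of a simple
# production-cost model. All parameters and distributions are assumptions.
import random
import statistics

random.seed(0)

def production_cost(capital, feedstock, efficiency):
    # Toy cost model: annualized capital plus feedstock cost per unit fuel.
    annual_capital = 0.1 * capital            # £/yr at an assumed 10% charge
    fuel_out = 100_000 * efficiency           # t/yr of fuel produced
    return (annual_capital + feedstock * 200_000) / fuel_out  # £/t fuel

samples = []
for _ in range(10_000):
    capital = random.triangular(150e6, 250e6, 200e6)   # plant capital, £
    feedstock = random.triangular(40, 70, 55)          # £/t dry biomass
    efficiency = random.triangular(0.45, 0.60, 0.52)   # conversion efficiency
    samples.append(production_cost(capital, feedstock, efficiency))

print("mean £/t:", round(statistics.mean(samples)))
print("5th-95th percentile:",
      [round(q) for q in statistics.quantiles(samples, n=20)[0::18]])
```

Propagating the input distributions through the cost model yields a production-cost distribution whose spread summarizes the effect of input uncertainty on the model output.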

Relevance: 30.00%

Publisher:

Abstract:

Increasingly, software systems are required to survive variations in their execution environment with little or no human intervention. Such systems are called "eternal software systems". In contrast to the traditional view of development and execution as separate cycles, these modern software systems should not exhibit such a separation. Research in MDE has been primarily concerned with the use of models during the first cycle, i.e. development (design, implementation, and deployment), and has shown excellent results. In this paper the author argues that an eternal software system must have a first-class representation of itself available to enable change. These runtime representations (or runtime models) will depend on the kinds of dynamic change that we want to make available during execution, or on the kinds of analysis we want the system to support; hence, different models can be conceived. Self-representation inevitably implies the use of reflection. The author briefly summarizes research that supports the use of runtime models, and points out open issues and research questions. © 2009 IEEE.
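
A minimal sketch of the runtime-model idea in Python (all names are illustrative, not from the paper): the system keeps a causally connected, first-class representation of its own components, which reflection lets it inspect and change during execution.

```python
# Minimal sketch: a runtime model as a self-representation that supports
# introspection (analysis) and adaptation (dynamic change) while running.
class Component:
    def handle(self, request):
        return f"handled {request}"

class RuntimeModel:
    """A causally connected self-representation of the running system."""
    def __init__(self):
        self.components = {}

    def deploy(self, name, component):
        self.components[name] = component

    def introspect(self):
        # Reflection: inspect the live objects, not a design-time artifact.
        return {name: type(c).__name__ for name, c in self.components.items()}

    def adapt(self, name, new_component):
        # Dynamic change enacted through the model during execution.
        self.components[name] = new_component

model = RuntimeModel()
model.deploy("frontend", Component())
print(model.introspect())             # {'frontend': 'Component'}
model.adapt("frontend", Component())  # swap the component at runtime
```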

Relevance: 30.00%

Publisher:

Abstract:

DEA literature continues apace, but software has lagged behind. This session uses suitably selected data to present newly developed software that includes many of the most recent DEA models. The software enables the user to address a variety of issues not frequently found in existing DEA software, such as:
- assessments under a variety of possible returns-to-scale assumptions, including NIRS and NDRS;
- scale elasticity computations;
- numerous input/output variables and a truly unlimited number of assessment units (DMUs);
- panel data analysis;
- analysis of categorical data (multiple categories);
- the Malmquist index and its decompositions;
- computation of super-efficiency;
- automated removal of super-efficient outliers under user-specified criteria;
- graphical presentation of results;
- integrated statistical tests.
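
As context for the models listed, here is a minimal sketch of the core computation, the input-oriented CCR efficiency score, solved as a linear program with SciPy; the data are made up, and the software described above layers the listed variants (NIRS/NDRS, Malmquist, super-efficiency and so on) on top of this basic model.

```python
# Minimal sketch of the input-oriented CCR DEA model (envelopment form),
# solved per DMU as a linear program. Data are invented for illustration.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [6.0, 5.0]])  # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [2.0]])                  # outputs, one row per DMU
n = len(X)

def ccr_efficiency(o):
    # Decision variables: theta, then lambda_1..lambda_n.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o].reshape(-1, 1), X.T]
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # theta: 1.0 = efficient, < 1.0 = inefficient

print([round(ccr_efficiency(o), 3) for o in range(n)])
```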

Relevance: 30.00%

Publisher:

Abstract:

This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, the thesis first provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed. A processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for the Simple Object Access Protocol (SOAP), the Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. The thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to address accessibility issues with such methods by providing a Web-based user interface and backend that ease the process of building and integrating emulators. These tools, plus the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is demonstrated by the implementation of a Web-based workflow for predicting future crop yields in the UK, which also demonstrates the abilities of the tools for emulator building and integration. Future directions for the development of the tools are discussed.
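
A minimal sketch of the emulator idea, assuming a Gaussian-process surrogate as is common in the uncertainty literature (the "expensive model" and all parameters below are stand-ins, not the thesis's crop model):

```python
# Sketch: fit a cheap statistical surrogate (a Gaussian process) to a
# handful of runs of an expensive model, then predict with uncertainty
# instead of re-running the model. The target function is invented.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    # Stand-in for a slow simulator (e.g. a crop-yield model).
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 2, size=(12, 1))          # a small design of runs
y_train = expensive_model(X_train).ravel()

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                    normalize_y=True).fit(X_train, y_train)

X_new = np.linspace(0, 2, 5).reshape(-1, 1)
mean, std = emulator.predict(X_new, return_std=True)
for x, m, s in zip(X_new.ravel(), mean, std):
    print(f"x={x:.2f}  prediction={m:.3f} +/- {2*s:.3f}")
```

Once fitted to a small design of simulator runs, the emulator returns predictions with uncertainty estimates at a fraction of the cost of re-running the model itself.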

Relevance: 30.00%

Publisher:

Abstract:

The author looks at trends in software and systems, and the current and likely implications of these trends for the discipline of performance engineering. In particular, he examines software complexity growth and its consequences for performance engineering: the need for enhanced understanding, more efficient analysis and effective performance improvement. The pressures for adaptive and autonomous systems introduce further opportunities for performance innovation. The promise of aspect-oriented software development technologies for assisting with some of these challenges is introduced.

Relevance: 30.00%

Publisher:

Abstract:

Background: Poor diet is thought to be a risk factor for many diseases, including age-related macular disease (ARMD), which is the leading cause of blind registration in those aged over 60 years in the developed world. The aims of this study were 1) to evaluate the dietary food intake of three subject groups: participants under the age of 50 years without ARMD (U50), participants over the age of 50 years without ARMD (O50), and participants with ARMD (ARMD), and 2) to obtain information on nutritional supplement usage. Methods: A prospective cross-sectional study designed in a clinical practice setting. Seventy-four participants were divided into three groups: U50, 20 participants aged under 50 years (range 21-40; mean ± SD 37.7 ± 10.1 years); O50, 27 participants aged over 50 years (range 52-77; 62.7 ± 6.8 years); and ARMD, 27 participants aged over 50 years with ARMD (range 55-79; 66.0 ± 5.8 years). Participants were issued with a three-day food diary and were also asked to provide details of any daily nutritional supplements. The diaries were analysed using FoodBase 2000 software. Data were input by one investigator and statistically analysed using Microsoft Excel for Microsoft Windows XP, employing unpaired t-tests. Results: Group O50 consumed significantly more vitamin C (t = 3.049, p = 0.005) and significantly more fibre (t = 2.107, p = 0.041) than group U50. Group ARMD consumed significantly more protein (t = 3.487, p = 0.001) and zinc (t = 2.252, p = 0.029) than group O50. The ARMD group consumed the highest percentage of specific ocular health supplements, and the U50 group consumed the most multivitamins. Conclusions: We did not detect a deficiency of any specific nutrient in the diets of those with ARMD compared with age- and gender-matched controls. ARMD patients may be aware of research into the use of nutritional supplementation to prevent progression of their condition.
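
As a minimal sketch of the statistical comparison described (the intake values below are invented; the study analysed three-day food diaries in FoodBase 2000 and Excel):

```python
# Sketch of an unpaired (independent samples) t-test between two groups'
# nutrient intakes, as used in the study. The numbers are hypothetical.
from scipy import stats

# Hypothetical daily vitamin C intake (mg) for two groups.
u50 = [45, 60, 52, 38, 70, 55, 48, 62]
o50 = [80, 95, 72, 88, 66, 90, 77, 85]

t, p = stats.ttest_ind(o50, u50)  # unpaired t-test
print(f"t = {t:.3f}, p = {p:.3f}")
```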

Relevance: 30.00%

Publisher:

Abstract:

Purpose: To determine whether curve-fitting analysis of the ranked segment distributions of topographic optic nerve head (ONH) parameters, derived using the Heidelberg Retina Tomograph (HRT), provides a more effective statistical descriptor to differentiate the normal from the glaucomatous ONH. Methods: The sample comprised 22 normal control subjects (mean age 66.9 years; S.D. 7.8) and 22 glaucoma patients (mean age 72.1 years; S.D. 6.9) confirmed by reproducible visual field defects on the Humphrey Field Analyser. Three 10° images of the ONH were obtained using the HRT. The mean topography image was determined, and the HRT software was used to calculate the rim volume, rim area to disc area ratio, normalised rim area to disc area ratio and retinal nerve fibre cross-sectional area for each patient at 10° sectoral intervals. The values were ranked in descending order, and each ranked-segment curve of ordered values was fitted using the least squares method. Results: There was no difference in disc area between the groups. The group mean cup-disc area ratio was significantly lower in the normal group (0.204 ± 0.16) compared with the glaucoma group (0.533 ± 0.083) (p < 0.001). The visual field indices, mean deviation and corrected pattern S.D., were significantly greater (p < 0.001) in the glaucoma group (-9.09 dB ± 3.3 and 7.91 ± 3.4, respectively) compared with the normal group (-0.15 dB ± 0.9 and 0.95 dB ± 0.8, respectively). Univariate linear regression provided the best overall fit to the ranked segment data. The equation parameters of the regression line manually applied to the normalised rim area-disc area and the rim area-disc area ratio data correctly classified 100% of normal subjects and glaucoma patients. In this study sample, the regression analysis of ranked segment parameters was more effective than conventional ranked segment analysis, in which glaucoma patients were misclassified in approximately 50% of cases. Further investigation in larger samples will enable the calculation of confidence intervals for normality. These reference standards will then need to be investigated in an independent sample to fully validate the technique. Conclusions: Using a curve-fitting approach to fit ranked segment curves retains information relating to the topographic nature of neural loss. Such methodology appears to overcome some of the deficiencies of conventional ranked segment analysis and, subject to validation in larger scale studies, may potentially be of clinical utility for detecting and monitoring glaucomatous damage. © 2007 The College of Optometrists.
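
A minimal sketch of the ranked-segment curve-fitting idea with invented sector values: rank the sectoral measurements in descending order, fit a least-squares line, and use its parameters as descriptors of the profile.

```python
# Sketch of ranked-segment curve fitting (illustrative data only).
import numpy as np

# Hypothetical rim-area/disc-area ratios for 36 ONH sectors (10° intervals).
rng = np.random.default_rng(2)
sectors = np.clip(rng.normal(0.6, 0.1, 36), 0, 1)

ranked = np.sort(sectors)[::-1]            # descending ranked-segment curve
rank = np.arange(1, len(ranked) + 1)
slope, intercept = np.polyfit(rank, ranked, deg=1)  # least-squares line

print(f"slope = {slope:.4f}, intercept = {intercept:.3f}")
# A steeper negative slope would indicate greater focal (sectoral) loss,
# while a lower intercept indicates diffuse loss across all sectors.
```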

Relevance: 30.00%

Publisher:

Abstract:

The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for verification of numerical software, supporting a substantially more expressive language for specifications than other publicly available automated tools. The additional expressivity in the specification language is provided by two constructs. First, the specification can feature inclusions between interval arithmetic expressions. Second, the integral operator from classical analysis can be used in the specifications, where the integration bounds can be arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned earlier. A key component of our method is an algorithm for proving numerical theorems. This algorithm is based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm and its source code is publicly available. In this paper we report on experiments using PolyPaver that indicate that the additional expressivity does not come at a performance cost when comparing with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger randomly generated programs. © 2014 Springer International Publishing Switzerland.
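
A minimal sketch of the interval-inclusion construct that the specification language supports (an illustration of the concept only, not PolyPaver's implementation):

```python
# Sketch: evaluate an expression over intervals and check that the result
# is included in a specified interval, as in the paper's specifications.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        ps = [a * b for a in (self.lo, self.hi) for b in (other.lo, other.hi)]
        return Interval(min(ps), max(ps))

    def includes(self, other):
        return self.lo <= other.lo and other.hi <= self.hi

x = Interval(0.9, 1.1)
result = x * x + Interval(-0.01, 0.01)   # x^2 plus a small error term
spec = Interval(0.5, 1.5)
print(result, spec.includes(result))     # does the result satisfy the spec?
```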

Relevance: 30.00%

Publisher:

Abstract:

AOSD'03 Practitioner Report. Performance analysis is motivated as an ideal domain for benefiting from the application of Aspect-Oriented (AO) technology. The experience of a ten-week project applying AO to the performance analysis domain is described. We show how all phases of a performance analyst's activities – initial profiling, problem identification, problem analysis and solution exploration – were candidates for AO technology assistance, some being addressed with more success than others. A Profiling Workbench is described that leverages the capabilities of AspectJ and delivers unique capabilities into the hands of developers exploring caching opportunities.
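
AspectJ is a Java technology, so as a language-neutral sketch of the same crosscutting idea, the following Python decorator weaves timing advice around selected functions, loosely analogous to a profiling aspect advising join points; all names are illustrative.

```python
# Sketch: crosscutting profiling advice woven around functions, without
# touching their bodies (the idea a profiling aspect captures in AspectJ).
import functools
import time

def profiled(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - start
            print(f"{fn.__name__}: {elapsed * 1000:.2f} ms")
    return wrapper

@profiled
def compute(n):
    return sum(i * i for i in range(n))

compute(100_000)
```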

Relevance: 30.00%

Publisher:

Abstract:

In data mining, efforts have focused on finding methods for efficient and effective cluster analysis in large databases. Active themes of research include the scalability of clustering methods, the effectiveness of methods for clustering complex shapes and types of data, high-dimensional clustering techniques, and methods for clustering mixed numerical and categorical data in large databases. One of the most accurate approaches, based on dynamic modeling of cluster similarity, is Chameleon. In this paper we present a modified hierarchical clustering algorithm that builds on the main idea of Chameleon; the effectiveness of the suggested approach is demonstrated by experimental results.
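
For contrast with the dynamic-modeling approach, a baseline sketch of classical agglomerative clustering with SciPy follows; Chameleon's key contribution, merging by dynamically modelled relative inter-connectivity and closeness, is exactly what this fixed-linkage baseline lacks.

```python
# Baseline sketch only: classical agglomerative clustering with a fixed
# linkage criterion, on synthetic 2-D data.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(3)
# Two synthetic blobs of 2-D points.
data = np.vstack([rng.normal(0, 0.3, (20, 2)),
                  rng.normal(3, 0.3, (20, 2))])

Z = linkage(data, method="average")        # agglomerative merge tree
labels = fcluster(Z, t=2, criterion="maxclust")
print(np.bincount(labels)[1:])             # sizes of the two clusters
```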

Relevance: 30.00%

Publisher:

Abstract:

Significance: Oxidized phospholipids are now well recognized as markers of biological oxidative stress and as bioactive molecules with both pro-inflammatory and anti-inflammatory effects. While analytical methods continue to be developed for studies of generic lipid oxidation, mass spectrometry (MS) has underpinned the advances in knowledge of specific oxidized phospholipids by allowing their identification and characterization, and is responsible for the expansion of oxidative lipidomics. Recent Advances: Studies of oxidized phospholipids in biological samples, both from animal models and clinical samples, have been facilitated by recent improvements in MS, especially targeted routines that depend on the fragmentation pattern of the parent molecular ion, together with improved resolution and mass accuracy. MS can be used to selectively identify individual compounds or groups of compounds with common features, which greatly improves the sensitivity and specificity of detection. Application of these methods has enabled important advances in understanding the mechanisms of inflammatory diseases such as atherosclerosis, steatohepatitis, leprosy and cystic fibrosis, and offers potential for developing biomarkers of molecular aspects of the diseases. Critical Issues and Future Directions: The future of this field will depend on the development of improved MS technologies, such as ion mobility, novel enrichment methods, and databases and software for data analysis, owing to the very large amount of data generated in these experiments. Imaging of oxidized phospholipids in tissues by MS is an additional exciting emerging direction that can be expected to advance understanding of physiology and disease.

Relevance: 30.00%

Publisher:

Abstract:

Software-based design of "virtual" technical systems is considered as a facility for simulation experiments for educational purposes. Such virtual systems are suitable for analysing the functioning of medical intrascopy systems. Virtual educational technical systems help to guarantee high-quality technical training for bioengineers.

Relevance: 30.00%

Publisher:

Abstract:

The main requirements for DRM platforms, namely implementing an effective user experience and strong security measures to prevent unauthorized use of content, are discussed. A comparison of hardware-based and software-based platforms is made, showing the general inherent advantages of hardware DRM solutions. Analysis and evaluation of the main flaws of hardware platforms are conducted, pointing out possibilities for overcoming them. An overview of existing concepts for the practical realization of hardware DRM protection reveals their advantages and disadvantages, and the increasing demand for a multi-core architecture that could assure effective DRM protection without decreasing user freedom or introducing risks to end-system security.