873 results for Mass based allocation
Abstract:
MS is an important analytical tool in clinical proteomics, primarily in the disease-specific discovery, identification and characterisation of proteomic biomarkers and patterns. MS-based proteomics is increasingly used in clinical validation and diagnostic method development. The latter departs from the typical application of MS-based proteomics by exchanging some analytical performance for the throughput, robustness and simplicity required for clinical diagnostics. Although conventional MS-based proteomics has become an important field in clinical applications, some of the most recent MS technologies have not yet been extensively applied in clinical proteomics. In this review, we describe the current state of MS in clinical proteomics and look to the future of the field.
Abstract:
Uncertainty plays a major part in the accuracy of a decision-making process, and its inconsistency is difficult to resolve with existing decision-making tools. Entropy has proved useful for evaluating the inconsistency of uncertainty among different respondents. This study demonstrates an entropy-based financial decision support system called e-FDSS. The integrated system provides decision support for evaluating the attributes (funding options and multiple risks) available in projects. Fuzzy logic theory is included in the system to deal with the qualitative aspects of these options and risks, and an adaptive genetic algorithm (AGA) is employed to solve the decision algorithm so as to provide optimal and consistent rates for these attributes. Seven simplified, parallel projects from a Hong Kong construction small and medium enterprise (SME) were assessed to evaluate the system. The results show that the system calculates risk-adjusted discount rates (RADR) of projects objectively, so that these rates discount project cash flows impartially. Inconsistency of uncertainty is also successfully evaluated by the entropy method. Finally, the system identifies the favourable funding options managed under the SME Loan Guarantee Scheme (SGS). Based on these results, resource allocation can be optimized and the best time to start a new project identified across the overall project life cycle.
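The abstract does not give the exact entropy formulation used in e-FDSS, but the underlying idea is standard. A minimal sketch (names and data are ours, purely illustrative) of how Shannon entropy can score the inconsistency of respondents' ratings of a single risk attribute:

```python
import math
from collections import Counter

def rating_entropy(ratings):
    """Shannon entropy (in bits) of a set of discrete ratings: 0 when all
    respondents agree, larger as their answers grow more inconsistent."""
    n = len(ratings)
    return -sum((c / n) * math.log2(c / n) for c in Counter(ratings).values())

# Full agreement carries no inconsistency; an even split carries one bit.
assert rating_entropy(["high", "high", "high", "high"]) == 0.0
assert rating_entropy(["high", "low", "high", "low"]) == 1.0
```

Attributes whose ratings carry high entropy would then warrant more cautious treatment in the discount-rate calculation.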
Abstract:
We previously reported sequence determination of neutral oligosaccharides by negative ion electrospray tandem mass spectrometry on a quadrupole-orthogonal time-of-flight instrument, with high sensitivity and without the need for derivatization. In the present report, we extend our strategies to sialylated oligosaccharides for analysis of chain and blood group types together with branching patterns. A main feature of the negative ion mass spectrometry approach is the unique double glycosidic cleavage induced by 3-glycosidic substitution, producing characteristic D-type fragments which can be used to distinguish the type 1 and type 2 chains, the blood group related Lewis determinants and 3,6-disubstituted core branching patterns, and to assign the structural details of each of the branches. Twenty mono- and disialylated linear and branched oligosaccharides were used for the investigation, and the sensitivity achieved is in the femtomole range. To demonstrate the efficacy of the strategy, we determined a novel complex disialylated and monofucosylated tridecasaccharide based on the lacto-N-decaose core. The structure and sequence assignment was corroborated by methylation analysis and 1H NMR spectroscopy.
Abstract:
A novel Swarm Intelligence method for best-fit search, Stochastic Diffusion Search (SDS), is presented that is capable of rapidly locating the optimal solution in the search space. Population-based search mechanisms employed by Swarm Intelligence methods can suffer from a lack of convergence, resulting in ill-defined stopping criteria and loss of the best solution. Conversely, as a result of its resource allocation mechanism, the solutions SDS discovers enjoy excellent stability.
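The abstract omits the mechanics of SDS. A toy sketch of the classic test-and-diffusion cycle, here applied to best-fit substring search (parameter choices and names are ours, not from the paper):

```python
import random

def sds_search(text, pattern, n_agents=100, iters=200, seed=0):
    """Minimal Stochastic Diffusion Search for best-fit substring location.

    Test phase: each agent partially evaluates its hypothesised offset by
    checking one randomly chosen character of the pattern.
    Diffusion phase: each inactive agent polls a random agent, copying its
    hypothesis if that agent is active and re-seeding randomly otherwise.
    The largest stable cluster of agents marks the best-fit position.
    """
    rng = random.Random(seed)
    max_off = len(text) - len(pattern)
    hyp = [rng.randint(0, max_off) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iters):
        for i in range(n_agents):                      # test phase
            j = rng.randrange(len(pattern))
            active[i] = text[hyp[i] + j] == pattern[j]
        for i in range(n_agents):                      # diffusion phase
            if not active[i]:
                other = rng.randrange(n_agents)
                hyp[i] = hyp[other] if active[other] else rng.randint(0, max_off)
    return max(set(hyp), key=hyp.count)

pos = sds_search("the quick brown fox jumps", "brown")
print(pos, "the quick brown fox jumps"[pos:pos + 5])
```

The resource allocation the abstract refers to is visible here: agents at good hypotheses survive testing and recruit the rest, so the population concentrates on the best fit instead of dispersing.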
Abstract:
Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for characterizing the texture of mass lesions and architectural distortion, and applied the concept of lacunarity to describe the spatial distribution of pixel intensities in mammographic images. These methods were tested on a set of 57 breast masses and 60 normal breast parenchyma regions (dataset1), and on another set of 19 architectural distortions and 41 normal breast parenchyma regions (dataset2). Support vector machines (SVM) were used as the pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve of the five methods for both datasets (Az = 0.839 for dataset1 and 0.828 for dataset2). Lacunarity analysis showed that ROIs depicting mass lesions and architectural distortion had higher lacunarity than ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875, respectively) than either feature alone for both datasets. The application of the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, generating higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images because its self-affinity assumption is a better approximation. Lacunarity is an effective counterpart to the fractal dimension in texture feature extraction from mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
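As an illustration of the lacunarity feature, here is a minimal gliding-box estimator for a small intensity grid; the paper's exact definition, normalisation and box sizes may differ:

```python
def lacunarity(image, box_size):
    """Gliding-box lacunarity of a 2-D intensity grid (list of lists).

    Lambda(r) = <M^2> / <M>^2, where M is the total intensity ("mass")
    inside an r x r box slid over every position.  Lambda is 1 for a
    perfectly homogeneous texture and grows as the texture becomes
    gappier and more heterogeneous.
    """
    h, w = len(image), len(image[0])
    masses = []
    for i in range(h - box_size + 1):
        for j in range(w - box_size + 1):
            masses.append(sum(image[i + di][j + dj]
                              for di in range(box_size)
                              for dj in range(box_size)))
    n = len(masses)
    m1 = sum(masses) / n
    m2 = sum(m * m for m in masses) / n
    return m2 / (m1 * m1)

uniform = [[1] * 8 for _ in range(8)]
sparse = [[0] * 8 for _ in range(8)]
sparse[0][0] = 5
print(lacunarity(uniform, 2))                           # 1.0
print(lacunarity(sparse, 2) > lacunarity(uniform, 2))   # True
```

This matches the qualitative finding above: heterogeneous, gappy intensity distributions (as in lesions) score higher lacunarity than homogeneous parenchyma.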
Abstract:
The Kodar Mountains in eastern Siberia accommodate 30 small, cold-based glaciers with a combined surface area of about 19 km². Very little is known about these glaciers, the first survey having been conducted in the late 1950s. In this paper, we use terrestrial photogrammetry to calculate changes in surface area, elevation, volume and geodetic mass balance of the Azarova Glacier between 1979 and 2007 and relate these to meteorological data from the nearby Chara weather station (1938-2007). The glacier surface area declined by 20±6.9% and the surface lowered on average by 20±1.8 m (mean thinning: 0.71 m a⁻¹), resulting in strongly negative cumulative and average mass balances of -18±1.6 m w.e. and -640±60 mm w.e. a⁻¹ respectively. The July-August air temperature increased at a rate of 0.036 °C a⁻¹ between 1979 and 2007, and the 1980-2007 period was, on average, around 1 °C warmer than 1938-1979. Regional climate projections for the A2 and B2 CO2 emission scenarios, developed using the PRECIS regional climate model, indicate that summer temperatures in 2071-2100 will increase by 2.6-4.7 °C and 4.9-6.2 °C respectively in comparison with 1961-1990. The annual total of solid precipitation will increase by 20% under the B2 scenario but decline by 3% under the A2 scenario. The length of the ablation season will extend from July-August to June-September. The Azarova Glacier exhibits high sensitivity to climatic warming due to its low elevation, exposure to comparatively high summer temperatures, and the absence of a compensating impact of cold-season precipitation. Further summer warming and the decline of solid precipitation projected under the A2 scenario will force Azarova to retreat further, while the impacts of the increase in solid precipitation projected under the B2 scenario require further investigation.
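The quoted balance figures are mutually consistent, as a quick check shows (the ice density used for the water-equivalent conversion is our assumption; the authors' exact value is not stated):

```python
# Consistency check on the Azarova Glacier numbers quoted above.
years = 2007 - 1979                          # 28-year survey interval
mean_lowering_m = 20.0                       # mean surface lowering, m of ice
ice_density, water_density = 900.0, 1000.0   # kg m^-3 (density assumed)

thinning_rate = mean_lowering_m / years                          # m a^-1
cumulative_mwe = mean_lowering_m * ice_density / water_density   # m w.e.
annual_mmwe = 1000 * cumulative_mwe / years                      # mm w.e. a^-1

print(round(thinning_rate, 2))   # 0.71  (matches the quoted mean thinning)
print(round(cumulative_mwe, 1))  # 18.0  (magnitude of -18 m w.e.)
print(round(annual_mmwe, -1))    # 640.0 (magnitude of -640 mm w.e. a^-1)
```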
Abstract:
Observations of a chemical at a point in the atmosphere typically show sudden transitions between episodes of high and low concentration. Often these are associated with a rapid change in the origin of air arriving at the site. Lagrangian chemical models riding along trajectories can reproduce such transitions, but small timing errors from trajectory phase errors dramatically reduce the correlation between modeled concentrations and observations. Here the origin averaging technique is introduced to obtain maps of average concentration as a function of air mass origin for the East Atlantic Summer Experiment 1996 (EASE96, a ground-based chemistry campaign). These maps are used to construct origin averaged time series which enable comparison between a chemistry model and observations with phase errors factored out. The amount of the observed signal explained by trajectory changes can be quantified, as can the systematic model errors as a function of air mass origin. The Cambridge Tropospheric Trajectory model of Chemistry and Transport (CiTTyCAT) can account for over 70% of the observed ozone signal variance during EASE96 when phase errors are side-stepped by origin averaging. The dramatic increase in correlation (from 23% without averaging) cannot be achieved by time averaging. The success of the model is attributed to the strong relationship between changes in ozone along trajectories and their origin and its ability to simulate those changes. The model performs less well for longer-lived chemical constituents because the initial conditions 5 days before arrival are insufficiently well known.
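A stripped-down sketch of the origin-averaging step: assuming each observation has already been labelled with an air-mass origin from back-trajectories (the labels, values and function name below are invented for illustration), the origin-averaged series replaces each observation with the mean over its origin class, so trajectory timing errors drop out of the comparison:

```python
from collections import defaultdict

def origin_averaged_series(origins, concentrations):
    """Map each observation to the mean concentration of all observations
    sharing the same air-mass origin label."""
    sums = defaultdict(lambda: [0.0, 0])
    for o, c in zip(origins, concentrations):
        sums[o][0] += c
        sums[o][1] += 1
    means = {o: s / n for o, (s, n) in sums.items()}
    return [means[o] for o in origins]

# Toy record: ozone is high in continental air, low in marine air.
origins = ["marine", "marine", "continental", "continental", "marine"]
obs = [30.0, 34.0, 60.0, 64.0, 32.0]
print(origin_averaged_series(origins, obs))  # [32.0, 32.0, 62.0, 62.0, 32.0]
```

Comparing a model's origin-averaged series against this observed one quantifies how much signal is explained by origin changes alone, which is the 70% figure reported for ozone.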
Abstract:
Estimating snow mass at continental scales is difficult but important for understanding land-atmosphere interactions, biogeochemical cycles and Northern-latitude hydrology. Remote sensing provides the only consistent global observations, but the uncertainty in the measurements is poorly understood. Existing techniques for the remote sensing of snow mass are based on the Chang algorithm, which relates the absorption of Earth-emitted microwave radiation by a snow layer to the snow mass within the layer. The absorption also depends on other factors, such as the snow grain size and density, which are assumed and fixed within the algorithm. We examine these assumptions, compare them to field measurements made at the NASA Cold Land Processes Experiment (CLPX) Colorado field site in 2002-2003, and evaluate the consequences of deviation and variability for snow mass retrieval. The accuracy of the emission model used to devise the algorithm also affects the accuracy of retrievals, so we test the model with the CLPX measurements of snow properties against SSM/I and AMSR-E satellite measurements.
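For context, a Chang-type retrieval reduces to a brightness-temperature difference between two microwave channels. The sketch below uses a commonly quoted coefficient of about 4.8 mm K⁻¹, which should be read as illustrative rather than the value used in this study:

```python
def chang_swe(tb_19h, tb_37h, coeff=4.8):
    """Chang-style snow water equivalent (SWE) retrieval sketch.

    Snow grains scatter 37 GHz emission more strongly than 19 GHz
    emission, so SWE is taken as proportional to the brightness-
    temperature difference between the horizontally polarised channels.
    The fixed coefficient is where the assumed grain size and density,
    criticised in the text, are baked in.
    """
    return max(0.0, coeff * (tb_19h - tb_37h))   # SWE in mm

print(round(chang_swe(250.0, 225.0)))  # 120 mm for a 25 K difference
print(chang_swe(225.0, 250.0))         # 0.0: no scattering signal
```

Because grain size strongly modulates the scattering, a real snowpack whose grains differ from the assumed size maps the same SWE onto a different temperature difference, which is the source of the large errors discussed above.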
Abstract:
Peak picking is an early key step in MS data analysis. We compare three commonly used approaches to peak picking and discuss their merits by means of statistical analysis. Methods investigated encompass signal-to-noise ratio, continuous wavelet transform, and a correlation-based approach using a Gaussian template. Functionality of the three methods is illustrated and discussed in a practical context using a mass spectral data set created with MALDI-TOF technology. Sensitivity and specificity are investigated using a manually defined reference set of peaks. As an additional criterion, the robustness of the three methods is assessed by a perturbation analysis and illustrated using ROC curves.
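Of the three approaches compared, the signal-to-noise method is the simplest to sketch. The following toy picker (threshold and noise estimator are our choices) flags local maxima that rise well above a median-based noise level:

```python
def snr_peaks(intensities, snr_threshold=3.0):
    """Toy signal-to-noise peak picker for a 1-D spectrum.

    Estimates the baseline as the median intensity and the noise as the
    median absolute deviation, then reports indices of local maxima whose
    height above baseline exceeds `snr_threshold` times the noise.
    """
    n = len(intensities)
    med = sorted(intensities)[n // 2]
    noise = sorted(abs(x - med) for x in intensities)[n // 2] or 1.0
    return [
        i for i in range(1, n - 1)
        if intensities[i] > intensities[i - 1]
        and intensities[i] >= intensities[i + 1]
        and (intensities[i] - med) / noise > snr_threshold
    ]

spectrum = [1, 2, 1, 1, 9, 1, 2, 1, 12, 2, 1, 1]
print(snr_peaks(spectrum))  # [4, 8]
```

The wavelet and template-correlation methods discussed in the text replace this pointwise threshold with a match against an expected peak shape, which is what buys their extra robustness under perturbation.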
Abstract:
We present stereoscopic images of an Earth-impacting Coronal Mass Ejection (CME). The CME was imaged by the Heliospheric Imagers onboard the twin STEREO spacecraft during December 2008. The apparent acceleration of the CME is used to provide independent estimates of its speed and direction from the two spacecraft. Three distinct signatures within the CME were all found to be closely Earth-directed. At the time that the CME was predicted to pass the ACE spacecraft, in-situ observations contained a typical CME signature. At Earth, ground-based magnetometer observations showed a small but widespread sudden response to the compression of the geomagnetic cavity at CME impact. In this case, STEREO could have given warning of CME impact at least 24 hours in advance. These stereoscopic observations represent a significant milestone for the STEREO mission and have significant potential for improving operational space weather forecasting.
Abstract:
Remote sensing is the only practicable means to observe snow at large scales. Measurements from passive microwave instruments have been used to derive snow climatology since the late 1970s, but the algorithms used were limited by the computational power of the era. Simplifications such as the assumption of constant snow properties enabled snow mass to be retrieved from the microwave measurements, but those assumptions, which are still used today, introduce large errors. A better approach is to perform retrievals within a data assimilation framework, where a physically based model of the snow properties can be used to produce the best estimate of the snow cover, in conjunction with multi-sensor observations of quantities such as grain size, surface temperature, and microwave radiation. We have extended an existing snow model, SNOBAL, to incorporate mass and energy transfer in the soil and to simulate the growth of snow grains. An evaluation of this model is presented and techniques for the development of new retrieval systems are discussed.
Abstract:
This article describes a number of velocity-based moving mesh numerical methods for multidimensional nonlinear time-dependent partial differential equations (PDEs). It consists of a short historical review followed by a detailed description of a recently developed multidimensional moving mesh finite element method based on conservation. Finite element algorithms are derived for both mass-conserving and non-mass-conserving problems, and results are shown for a number of multidimensional nonlinear test problems, including the second-order porous medium equation and the fourth-order thin film equation, as well as a two-phase problem. Further applications and extensions are referenced.
Abstract:
This paper examines whether the asset holdings and weights of an international real estate portfolio based on exchange rate adjusted returns are essentially the same as, or radically different from, those based on unadjusted returns. The results indicate that the portfolio compositions produced by exchange rate adjusted returns are markedly different from those based on unadjusted returns. However, following the introduction of the single currency, the differences in portfolio composition are much less pronounced. The findings have a practical consequence for investors: following the introduction of the single currency, international investors can concentrate on real estate fundamentals when making their portfolio choices rather than worrying about the implications of exchange rate risk.
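The exchange-rate adjustment in question compounds the local-currency asset return with the return on holding the foreign currency itself, e.g.:

```python
def exchange_rate_adjusted_return(local_return, currency_return):
    """Home-currency return on a foreign asset: compound the local asset
    return with the return on the foreign currency itself."""
    return (1 + local_return) * (1 + currency_return) - 1

# A 10% local property return eroded by a 5% currency depreciation:
r = exchange_rate_adjusted_return(0.10, -0.05)
print(round(r, 4))  # 0.045, i.e. 4.5% in the investor's home currency
```

Because the currency term shifts both the level and the volatility of returns, optimised portfolio weights computed from adjusted and unadjusted series can diverge, which is the comparison the paper runs.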
Abstract:
This research establishes the feasibility of using a network-centric technology, Jini, to provide a grid framework on which to perform parallel video encoding. A solution was implemented using Jini that achieved real-time, on-demand encoding of a 480 HD video stream. Further, a projection is made concerning the encoding of 1080 HD video in real time, as the current grid was not powerful enough to achieve this above 15 fps. The research found that Jini provides a number of tools and services highly applicable in a grid environment. It is also suitable in terms of performance and responds well to a varying number of grid nodes. The main performance limiter was found to be network bandwidth allocation, which was unable to handle the traffic when the grid was loaded with a large number of nodes.