63 results for Propagation prediction models

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

100.00%

Publisher:

Abstract:

This paper presents a scalable, statistical ‘black-box’ model for predicting the performance of parallel programs on multi-core non-uniform memory access (NUMA) systems. We derive a model with low overhead by reducing data collection and model training time. The model can accurately predict the behaviour of parallel applications in response to changes in their concurrency, thread layout on NUMA nodes, and core voltage and frequency. We present a framework that applies the model to achieve significant energy and energy-delay-squared (ED2) savings (9% and 25%, respectively) along with a mean performance improvement of 10% on an actual 16-core NUMA system running realistic application workloads. Our prediction model proves substantially more accurate than previous efforts.
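As an illustration only (not the authors' code), a statistical ‘black-box’ predictor of this kind can be sketched as a regression from configuration features to measured runtime; the feature set and training data below are hypothetical.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: each row is [thread_count, NUMA_nodes_used, core_freq_GHz],
# and the target is the measured runtime in seconds for that configuration.
X_train = np.array([
    [4, 1, 1.2], [4, 1, 2.4], [8, 1, 2.4],
    [8, 2, 2.4], [16, 2, 1.2], [16, 2, 2.4],
])
y_train = np.array([41.0, 22.5, 13.8, 12.1, 15.4, 8.9])

# Train a black-box regressor on the sampled configurations.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Query the model for an untested configuration before actually running it.
print("predicted runtime (s):", model.predict(np.array([[16, 1, 1.8]]))[0])

In a framework such as the one described, predictions of this kind would guide the choice of concurrency, thread placement and voltage/frequency setting at run time.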

Relevance:

100.00%

Publisher:

Abstract:

This paper contributes to the understanding of lime-mortar masonry strength and deformation (which determine durability and allowable stresses/stiffness in design codes) by measuring the mechanical properties of brick masonry bound with lime and lime-cement mortars. Based on regression analysis of the experimental results, models to estimate lime-mortar masonry compressive strength are proposed (less accurate for hydrated lime (CL90s) masonry due to the disparity between mortar and brick strengths). In addition, three relationships between masonry elastic modulus and compressive strength are proposed for cement-lime, hydraulic lime (NHL 3.5 and 5), and hydrated/feebly hydraulic lime masonries, respectively.

Disagreement between the experimental results and earlier mathematical prediction models (proposed primarily for cement masonry) is caused by a lack of provision for the significant deformation of lime masonry and for the relative changes in strength and stiffness between mortar and brick over time (at 6 months and 1 year, the NHL 3.5 and 5 mortars are often stronger than the brick). Eurocode 6 provided the best predictions of the compressive strength of lime and cement-lime masonry based on the strength of their components. All models vastly overestimated the strength of CL90s masonry at 28 days; however, Eurocode 6 became an accurate predictor after 6 months, when the mortar had acquired most of its final strength and stiffness.
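For context, the Eurocode 6 prediction referred to above takes, for general-purpose mortar masonry, the form below, where f_b and f_m are the normalised compressive strengths of the unit and the mortar and K is a constant depending on the unit and mortar type:

f_k = K \, f_b^{0.7} \, f_m^{0.3}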

The experimental results agreed with previously published stress-strain curves. It was evidenced that mortar strongly influences masonry deformation, and that the masonry stress-strain relationship becomes increasingly non-linear as mortar strength decreases. It was also noted that the influence of masonry stiffness on its compressive strength becomes smaller as mortar hydraulicity increases.

Relevance:

90.00%

Publisher:

Abstract:

We propose simple models to predict the performance degradation of disk requests due to storage device contention in consolidated virtualized environments. Model parameters can be deduced from measurements obtained inside Virtual Machines (VMs) from a system where a single VM accesses a remote storage server. The parameterized model can then be used to predict the effect of storage contention when multiple VMs are consolidated on the same server. We first propose a trace-driven approach that evaluates a queueing network with fair share scheduling using simulation. The model parameters consider Virtual Machine Monitor level disk access optimizations and rely on a calibration technique. We further present a measurement-based approach that allows a distinct characterization of read/write performance attributes. In particular, we define simple linear prediction models for I/O request mean response times, throughputs and read/write mixes, as well as a simulation model for predicting response time distributions. We found our models to be effective in predicting such quantities across a range of synthetic and emulated application workloads. 
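As a hedged illustration of the simple linear prediction models mentioned (the coefficients and measurements below are invented, not the paper's calibration), mean I/O response time can be regressed on the number of consolidated VMs and the read fraction of the request mix, then used to predict contention for a new consolidation level.

import numpy as np

# Hypothetical measurements: [intercept, number of consolidated VMs, read fraction]
X = np.array([
    [1, 1, 0.9], [1, 1, 0.5], [1, 2, 0.9],
    [1, 2, 0.5], [1, 4, 0.9], [1, 4, 0.5],
])
r = np.array([2.1, 2.6, 3.8, 4.9, 7.4, 9.8])  # mean response times in ms

# Ordinary least squares fit of the linear prediction model.
beta, *_ = np.linalg.lstsq(X, r, rcond=None)

# Predict the mean response time when three VMs share the storage server
# with a 70% read mix.
x_new = np.array([1, 3, 0.7])
print("predicted mean response time (ms):", x_new @ beta)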

Relevance:

90.00%

Publisher:

Abstract:

A novel model for indoor wireless communication, based on a dual image and ray-shooting approach, is presented. The model, capable of improved site-specific indoor propagation prediction, considers multiple human bodies moving within the environment. In a modern office at 2.45 GHz, the combined effect of pedestrian traffic and a moving receiver causes rapid temporal fading of up to 30 dB.
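As a greatly simplified sketch of the image-based ray idea behind such models (the geometry, wall reflection coefficient and antenna behaviour below are assumed, and the paper's dual image and ray-shooting engine is far more sophisticated), the received field at 2.45 GHz can be formed by coherently summing a direct ray and a wall-reflected ray obtained from an image source; small receiver movements then produce the kind of deep fast fading reported above.

import numpy as np

c = 3e8                         # speed of light (m/s)
f = 2.45e9                      # carrier frequency (Hz)
k = 2 * np.pi * f / c           # wavenumber
gamma = -0.7                    # assumed wall reflection coefficient

tx = np.array([0.0, 2.0])       # transmitter position (m)
wall_y = 0.0                    # reflecting wall along y = 0
tx_image = np.array([tx[0], 2 * wall_y - tx[1]])   # image of the transmitter in the wall

# Move the receiver in small steps (fractions of a wavelength) and observe fading.
for d in (5.00, 5.03, 5.06):
    rx = np.array([d, 2.0])
    d_direct = np.linalg.norm(rx - tx)
    d_reflect = np.linalg.norm(rx - tx_image)      # path length via the image source
    field = (np.exp(-1j * k * d_direct) / d_direct
             + gamma * np.exp(-1j * k * d_reflect) / d_reflect)
    print(f"rx at {d:.2f} m: relative power {20 * np.log10(abs(field)):.1f} dB")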

Relevance:

90.00%

Publisher:

Abstract:

Predicting the next location of a user based on their previous visiting pattern is one of the primary tasks over data from location-based social networks (LBSNs) such as Foursquare. Many different aspects of a user's so-called “check-in” profile have been exploited for this task, including the spatial and temporal information of check-ins as well as the user's social network. Building more sophisticated prediction models by enriching check-in data with information from other sources is challenging because LBSNs expose only limited data owing to privacy concerns. In this paper, we propose a framework that combines location data from LBSNs with map data in order to associate a set of venue categories with each location. For example, if a user checks in at a mall that, according to the map, contains cafes, cinemas and restaurants, all of these categories are associated with the check-in. This category information is then leveraged to predict the user's next check-in location. Our experiments with a publicly available check-in dataset show that this approach improves on state-of-the-art methods for location prediction.
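A toy sketch of the category-enrichment idea (the locations, categories and scoring below are hypothetical, not the paper's framework): each check-in is mapped to the venue categories the map lists at that location, a per-user preference over categories is accumulated, and candidate next locations are ranked by how well their categories match that preference.

from collections import Counter

# Assumed map data: venue categories present at each location.
location_categories = {
    "mall_a": {"cafe", "cinema", "restaurant"},
    "park_b": {"park", "cafe"},
    "museum_c": {"museum", "cafe"},
}

# Assumed check-in history of one user from an LBSN.
checkins = ["mall_a", "mall_a", "park_b"]

# Accumulate the user's category preferences from past check-ins.
preferences = Counter()
for loc in checkins:
    preferences.update(location_categories[loc])

def score(candidate):
    # Sum of the user's preference counts over the candidate's categories.
    return sum(preferences[c] for c in location_categories[candidate])

# Rank the candidate next locations by category match.
candidates = ["park_b", "museum_c"]
print("predicted next check-in:", max(candidates, key=score))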

Relevance:

90.00%

Publisher:

Abstract:

The identification of subjects at high risk for Alzheimer’s disease is important for prognosis and early intervention. We investigated the polygenic architecture of Alzheimer’s disease and the accuracy of Alzheimer’s disease prediction models, including and excluding the polygenic component in the model. This study used genotype data from a large dataset comprising 17 008 cases and 37 154 controls obtained from the International Genomics of Alzheimer’s Project (IGAP). Polygenic score analysis tested whether the alleles identified as associated with disease in one sample set were significantly enriched in the cases relative to the controls in an independent sample. Disease prediction accuracy was investigated in a subset of the IGAP data, a sample of 3049 cases and 1554 controls (for whom APOE genotype data were available), by means of sensitivity, specificity, area under the receiver operating characteristic curve (AUC) and positive and negative predictive values. We observed significant evidence for a polygenic component enriched in Alzheimer’s disease (P = 4.9 × 10⁻²⁶). This enrichment remained significant after APOE and other genome-wide associated regions were excluded (P = 3.4 × 10⁻¹⁹). The best prediction accuracy, AUC = 78.2% (95% confidence interval 77–80%), was achieved by a logistic regression model with APOE, the polygenic score, sex and age as predictors. In conclusion, Alzheimer’s disease has a significant polygenic component, which has predictive utility for Alzheimer’s disease risk and could be a valuable research tool complementing experimental designs, including preventative clinical trials, stem cell selection and high/low risk clinical studies. In modelling a range of sample disease prevalences, we found that polygenic scores almost double case prediction relative to chance, with increased prediction at the polygenic extremes.
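A hedged sketch of the prediction step (all data below are synthetic; the study used IGAP genotypes, and discrimination was assessed on independent samples): a polygenic score is formed as a weighted sum of risk-allele counts and entered alongside APOE, sex and age in a logistic regression whose accuracy is summarised by the AUC.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, n_snps = 2000, 500

genotypes = rng.integers(0, 3, size=(n, n_snps))     # 0/1/2 risk-allele counts
effect_sizes = rng.normal(0, 0.05, size=n_snps)      # assumed per-SNP log odds ratios
prs = genotypes @ effect_sizes                        # polygenic risk score

apoe_e4 = rng.integers(0, 3, size=n)                  # number of APOE e4 alleles
sex = rng.integers(0, 2, size=n)
age = rng.normal(75, 6, size=n)

# Synthetic case/control status driven by the same predictors, for illustration only.
logit = 0.8 * prs + 1.2 * apoe_e4 + 0.02 * (age - 75) - 1.0
status = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Logistic regression with APOE, polygenic score, sex and age as predictors.
X = np.column_stack([prs, apoe_e4, sex, age])
model = LogisticRegression(max_iter=1000).fit(X, status)
print("in-sample AUC:", roc_auc_score(status, model.predict_proba(X)[:, 1]))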

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.

DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.

SETTING: Primary and secondary care.

PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).

INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with monitoring biennially and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg and IOP measured annually in the community.

MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).

RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings by tonometry, true change was detected at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000 per QALY. The 'NICE intensive' pathway also avoided glaucoma conversion, but the NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs > £30,000 per QALY. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
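For reference, the incremental cost-effectiveness ratio used in these comparisons is the standard ratio of the difference in costs to the difference in QALYs between a pathway and its comparator, judged here against a £30,000 per QALY willingness-to-pay threshold:

\mathrm{ICER} = \frac{C_{\text{pathway}} - C_{\text{comparator}}}{\mathrm{QALY}_{\text{pathway}} - \mathrm{QALY}_{\text{comparator}}}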

LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon, which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.

CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.

FUNDING: The National Institute for Health Research Health Technology Assessment Programme.

Relevance:

80.00%

Publisher:

Abstract:

Virtual metrology (VM) aims to predict metrology values using sensor data from production equipment and physical metrology values of preceding samples. VM is a promising technology for the semiconductor manufacturing industry as it can reduce the frequency of in-line metrology operations and provide supportive information for other operations such as fault detection, predictive maintenance and run-to-run control. The prediction models for VM can be from a large variety of linear and nonlinear regression methods and the selection of a proper regression method for a specific VM problem is not straightforward, especially when the candidate predictor set is of high dimension, correlated and noisy. Using process data from a benchmark semiconductor manufacturing process, this paper evaluates the performance of four typical regression methods for VM: multiple linear regression (MLR), least absolute shrinkage and selection operator (LASSO), neural networks (NN) and Gaussian process regression (GPR). It is observed that GPR performs the best among the four methods and that, remarkably, the performance of linear regression approaches that of GPR as the subset of selected input variables is increased. The observed competitiveness of high-dimensional linear regression models, which does not hold true in general, is explained in the context of extreme learning machines and functional link neural networks.
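The comparison described can be reproduced in outline as follows (synthetic data stand in for the benchmark semiconductor process measurements, and the hyper-parameters are illustrative): each candidate regressor is trained on the same high-dimensional sensor features and scored on held-out metrology values.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))                                   # synthetic sensor features
y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=300)   # synthetic metrology value

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The four regression methods evaluated in the paper, with illustrative settings.
models = {
    "MLR": LinearRegression(),
    "LASSO": Lasso(alpha=0.01),
    "NN": MLPRegressor(hidden_layer_sizes=(50,), max_iter=2000, random_state=0),
    "GPR": GaussianProcessRegressor(),
}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    print(name, "test MSE:", mean_squared_error(y_te, m.predict(X_te)))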

Relevance:

80.00%

Publisher:

Abstract:

The global prevalence of diabetic nephropathy is rising in parallel with the increasing incidence of diabetes in most countries. Unfortunately, up to 40% of persons diagnosed with diabetes may develop kidney complications. Diabetic nephropathy is associated with substantially increased risks of cardiovascular disease and premature mortality. An inherited susceptibility to diabetic nephropathy exists, and progress is being made unravelling the genetic basis for nephropathy thanks to international research collaborations, shared biological resources and new analytical approaches. Multiple epidemiological studies have highlighted the clinical heterogeneity of nephropathy and the need for better phenotyping to help define important subgroups for analysis and increase the power of genetic studies. Collaborative genome-wide association studies for nephropathy have reported unique genes, highlighted novel biological pathways and suggested new disease mechanisms, but progress towards clinically relevant risk prediction models for diabetic nephropathy has been slow. This review summarises the current status, recent developments and ongoing challenges elucidating the genetics of diabetic nephropathy.

Relevance:

80.00%

Publisher:

Abstract:

The European Eye Epidemiology (E3) consortium is a recently formed consortium of 29 groups from 12 European countries. It already comprises 21 population-based studies and 20 other studies (case-control, cases only, randomized trials), providing ophthalmological data on approximately 170,000 European participants. The aim of the consortium is to promote and sustain collaboration and sharing of data and knowledge in the field of ophthalmic epidemiology in Europe, with particular focus on the harmonization of methods for future research, estimation and projection of frequency and impact of visual outcomes in European populations (including temporal trends and European subregions), identification of risk factors and pathways for eye diseases (lifestyle, vascular and metabolic factors, genetics, epigenetics and biomarkers) and development and validation of prediction models for eye diseases. Coordinating these existing data will allow a detailed study of the risk factors and consequences of eye diseases and visual impairment, including study of international geographical variation which is not possible in individual studies. It is expected that collaborative work on these existing data will provide additional knowledge, despite the fact that the risk factors and the methods for collecting them differ somewhat among the participating studies. Most studies also include biobanks of various biological samples, which will enable identification of biomarkers to detect and predict occurrence and progression of eye diseases. This article outlines the rationale of the consortium, its design and presents a summary of the methodology.

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: To assess the efficiency of alternative monitoring services for people with ocular hypertension (OHT), a glaucoma risk factor.

DESIGN: Discrete event simulation model comparing five alternative care pathways: treatment at OHT diagnosis with minimal monitoring; biennial monitoring (primary and secondary care) with treatment if baseline predicted 5-year glaucoma risk is ≥6%; monitoring and treatment aligned to National Institute for Health and Care Excellence (NICE) glaucoma guidance (conservative and intensive).

SETTING: UK health services perspective.

PARTICIPANTS: Simulated cohort of 10 000 adults with OHT (mean intraocular pressure (IOP) 24.9 mm Hg (SD 2.4)).

MAIN OUTCOME MEASURES: Costs, glaucoma detected, quality-adjusted life years (QALYs).

RESULTS: Treating at diagnosis was the least costly and least effective in avoiding glaucoma and progression. Intensive monitoring following NICE guidance was the most costly and effective. However, considering a wider cost-utility perspective, biennial monitoring was less costly and provided more QALYs than NICE pathways, but was unlikely to be cost-effective compared with treating at diagnosis (£86 717 per additional QALY gained). The findings were robust to risk thresholds for initiating monitoring but were sensitive to treatment threshold, National Health Service costs and treatment adherence.
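Applying the usual decision rule (a pathway is considered cost-effective only if its ICER lies below the willingness-to-pay threshold λ), and taking the conventional £30 000 per QALY threshold used in the companion evaluation above:

\mathrm{ICER}_{\text{biennial vs. treat at diagnosis}} = \pounds 86\,717 \text{ per QALY} > \lambda = \pounds 30\,000 \text{ per QALY}

so biennial monitoring is unlikely to be cost-effective relative to treating at diagnosis.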

CONCLUSIONS: For confirmed OHT, glaucoma monitoring more frequently than every 2 years is unlikely to be efficient. Primary treatment and minimal monitoring (assessing treatment responsiveness via IOP) could be considered; however, further data to refine glaucoma risk prediction models and to value patient preferences for treatment are needed. Consideration of innovative and affordable service redesign focused on treatment responsiveness rather than more glaucoma testing is recommended.

Relevance:

80.00%

Publisher:

Abstract:

Motivated by environmental protection concerns, monitoring of thermal power plant flue gas is now often mandatory to ensure that emission levels stay within safe limits. Optical gas sensing systems are increasingly employed for this purpose, with regression techniques used to relate gas optical absorption spectra to the concentrations of specific gas components of interest (NOx, SO2, etc.). Accurately predicting gas concentrations from absorption spectra remains a challenging problem due to nonlinearities in the relationships and the high-dimensional and correlated nature of the spectral data. This article proposes a generalized fuzzy linguistic model (GFLM) to address this challenge. The GFLM is made up of a series of “If-Then” fuzzy rules. The absorption spectra are the input variables in the rule antecedents, and each rule consequent is a general nonlinear polynomial function of the absorption spectra. Model parameters are estimated using least squares and gradient descent optimization algorithms. The performance of the GFLM is compared with that of other common prediction models, including partial least squares, support vector machines, multilayer perceptron neural networks and radial basis function networks, on two real flue gas spectral datasets: one from a coal-fired power plant and one from a gas-fired power plant. The experimental results show that the generalized fuzzy linguistic model has good predictive ability and is competitive with the alternative approaches, while having the added advantage of providing an interpretable model.
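A minimal Takagi-Sugeno-style sketch of the rule structure described (illustrative only; the GFLM's membership functions, rule generation and fitted coefficients are the paper's own, whereas everything below is randomly initialised rather than estimated by least squares or gradient descent): each rule fires with a membership degree computed from the spectrum, and the prediction is the firing-strength-weighted average of polynomial rule consequents.

import numpy as np

rng = np.random.default_rng(0)
n_wavelengths, n_rules = 20, 3

rule_centres = rng.normal(size=(n_rules, n_wavelengths))   # antecedent prototypes
width = 2.0                                                # Gaussian membership width
a = rng.normal(size=n_rules)                               # consequent constant terms
b = rng.normal(size=(n_rules, n_wavelengths))              # consequent linear terms
c = 0.1 * rng.normal(size=(n_rules, n_wavelengths))        # consequent quadratic terms

def predict_concentration(spectrum):
    # Firing strength of each "If" part from a Gaussian membership over the spectrum.
    dist2 = ((spectrum - rule_centres) ** 2).sum(axis=1)
    firing = np.exp(-dist2 / (2 * width ** 2))
    firing /= firing.sum()
    # Each "Then" part is a polynomial function of the absorption spectrum.
    consequents = a + b @ spectrum + c @ (spectrum ** 2)
    return float(firing @ consequents)

spectrum = rng.normal(size=n_wavelengths)   # stand-in absorption spectrum
print("predicted gas concentration (arbitrary units):", predict_concentration(spectrum))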