986 results for PREDICTION SERVER
Abstract:
This note announces the discovery of a tract on eclipse prediction in Paris, BnF, lat. 6400b, composed by an Irish scholar in AD 754. It is the earliest such text from the early Middle Ages, and it is here placed in its scientific context.
Abstract:
The separation of enantiomers and the confirmation of their absolute configurations are significant in the development of chiral drugs. The interactions between the enantiomers of a chiral pyrazole derivative and the polysaccharide-based chiral stationary phase cellulose tris(4-methylbenzoate) (Chiralcel OJ) in seven solvents and at different temperatures were studied using molecular dynamics simulations. The results show that the solvent effect has a remarkable influence on the interactions. Structural analysis discloses that the different interactions between the two isomers and the chiral stationary phase depend on the nature of the solvent, which may invert the elution order. The computational method in the present study can be used to predict the elution order and the absolute configurations of enantiomers in HPLC separations and would therefore be valuable in the development of chiral drugs.
Abstract:
Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems means that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called ‘reduced complexity’ models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. 
However, there is little consensus on exactly what constitutes a reduced complexity model and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed ‘appropriate complexity modelling’ of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) the system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.
Abstract:
BACKGROUND: Pre-eclampsia is a leading cause of maternal and perinatal morbidity and mortality. Women with type 1 diabetes are considered a high-risk group for developing pre-eclampsia. Much research has focused on biomarkers as a means of screening for pre-eclampsia in the general maternal population; however, there is a lack of evidence for women with type 1 diabetes.
OBJECTIVES: To undertake a systematic review to identify potential biomarkers for the prediction of pre-eclampsia in women with type 1 diabetes.
SEARCH STRATEGY: We searched Medline, EMBASE, Maternity and Infant Care, Scopus, Web of Science and CINAHL. SELECTION CRITERIA: Studies were included if they measured biomarkers in blood or urine of women who developed pre-eclampsia and had pre-gestational type 1 diabetes mellitus. DATA COLLECTION AND ANALYSIS: A narrative synthesis was adopted, as a meta-analysis could not be performed due to high study heterogeneity.
MAIN RESULTS: A total of 72 records were screened, with 21 eligible studies being included in the review. A wide range of biomarkers was investigated and study size varied from 34 to 1258 participants. No single biomarker appeared to be effective in predicting pre-eclampsia; however, glycaemic control was associated with an increased risk while a combination of angiogenic and anti-angiogenic factors seemed to be potentially useful.
CONCLUSIONS: Limited evidence suggests that combinations of biomarkers may be more effective in predicting pre-eclampsia than single biomarkers. Further research is needed to verify the predictive potential of biomarkers that have been measured in the general maternal population, as many studies exclude women with diabetes preceding pregnancy.
Abstract:
The high level of unemployment is one of the major problems in most European countries nowadays. Hence, the demand for small area labour market statistics has rapidly increased over the past few years. The Labour Force Survey (LFS) conducted by the Portuguese Statistical Office is the main source of official statistics on the labour market at the macro level (e.g. NUTS2 and national level). However, the LFS was not designed to produce reliable statistics at the micro level (e.g. NUTS3, municipalities or further disaggregated levels) due to small sample sizes. Consequently, traditional design-based estimators are not appropriate. A solution to this problem is to consider model-based estimators that "borrow information" from related areas or past samples by using auxiliary information. This paper reviews, under the model-based approach, Best Linear Unbiased Predictors and an estimator based on the posterior predictive distribution of a Hierarchical Bayesian model. The goal of this paper is to analyse whether accurate unemployment rate statistics can be produced at the micro level from the Portuguese LFS using these kinds of estimators. The paper discusses the advantages of each approach and the viability of its implementation.
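The "borrowing information" idea behind such model-based small area estimators can be sketched in its simplest composite form: shrink the noisy direct survey estimate towards a synthetic, model-based estimate according to their variances. The function and the figures below are illustrative assumptions, not values from the paper:

```python
def composite_estimate(direct, direct_var, synthetic, model_var):
    """Fay-Herriot-style shrinkage: weight the direct survey estimate
    against a synthetic (model-based) estimate by their variances.
    A small direct_var (a reliable survey estimate) pulls the weight
    towards the direct estimate; a large one pulls towards the model."""
    gamma = model_var / (model_var + direct_var)  # shrinkage weight in [0, 1]
    return gamma * direct + (1 - gamma) * synthetic

# Hypothetical municipality: unreliable direct unemployment rate of 12%,
# synthetic estimate of 9% borrowed from related areas.
estimate = composite_estimate(0.12, 0.0004, 0.09, 0.0001)
```

With these illustrative variances the shrinkage weight is 0.2, so the composite estimate sits much closer to the synthetic value than to the direct one.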
Abstract:
In this study, Artificial Neural Networks are applied to multi-step, long-term solar radiation prediction. The networks are trained as one-step-ahead predictors and iterated over time to obtain multi-step, longer-term predictions. Auto-regressive and auto-regressive with exogenous inputs solar radiation models are compared, considering cloudiness indices as inputs in the latter case. These indices are obtained through pixel classification of ground-to-sky images. The input-output structure of the neural network models is selected using evolutionary computation methods.
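The iterated one-step-ahead scheme described above is independent of the predictor itself: any trained one-step model can be rolled forward by feeding its own outputs back as inputs. A minimal sketch, using a fixed AR(2) map as a stand-in for the trained network (an illustrative assumption, not the paper's model):

```python
def iterate_forecast(predict_one, history, horizon):
    """Iterate a one-step-ahead predictor to produce a multi-step forecast.

    predict_one: callable mapping a list of past values to the next value
    history:     observed series, most recent value last
    horizon:     number of future steps to predict
    """
    window = list(history)
    forecasts = []
    for _ in range(horizon):
        y_next = predict_one(window)
        forecasts.append(y_next)
        window.append(y_next)  # feed the prediction back as an input
    return forecasts

def ar2(window):
    # Stand-in for a trained one-step-ahead network: fixed AR(2) coefficients.
    return 0.6 * window[-1] + 0.3 * window[-2]

multi_step = iterate_forecast(ar2, [1.0, 1.0], 3)
```

Because predictions are fed back as inputs, one-step errors compound over the horizon, which is why long-term iterated forecasts degrade faster than the underlying one-step accuracy suggests.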
Abstract:
Despite the importance of larval abundance in determining the recruitment of benthic marine invertebrates and as a major factor in marine benthic community structure, relating planktonic larval abundance with post-settlement post-larvae and juveniles in the benthos is difficult. It is hampered by several methodological difficulties, including sampling frequency, ability to follow larval and post-larval or juvenile cohorts, and ability to calculate growth and mortality rates. In our work, an intensive sampling strategy was used. Larvae in the plankton were collected at weekly intervals, while post-larvae that settled into collectors were analysed fortnightly. Planktonic larval and benthic post-larval/juvenile cohorts were determined, and growth and mortality rates calculated. Integration of all equations allowed the development of a theoretical formulation that, based on the abundance and planktonic larval duration, permits an estimation of the future abundance of post-larvae/juveniles during the first year of benthic life. The model can be applied to a sample in which only larval length needs to be measured.
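Reduced to its simplest possible form, the core of a formulation that projects larval abundance forward over the planktonic larval duration is an exponential survival model. The sketch below assumes a constant instantaneous mortality rate, a simplification of the paper's cohort-based integration; the numbers in the example are illustrative:

```python
from math import exp

def settlers_from_larvae(larval_abundance, mortality_rate, larval_duration_days):
    """Project planktonic larval abundance to expected settler abundance,
    assuming constant instantaneous mortality M over the planktonic
    larval duration t: N(t) = N0 * exp(-M * t)."""
    return larval_abundance * exp(-mortality_rate * larval_duration_days)

# Hypothetical cohort: 1000 larvae, daily mortality 0.1, 20-day duration.
expected_settlers = settlers_from_larvae(1000, 0.1, 20)
```

Under these illustrative parameters only about 13.5% of the cohort survives to settlement, which shows how sensitive recruitment estimates are to mortality rate and larval duration.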
Abstract:
Doctoral thesis, Electronics and Telecommunications Engineering (Signal Processing), Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014
Abstract:
Thesis (Master's)--University of Washington, 2016-03
Abstract:
BACKGROUND: Data for multiple common susceptibility alleles for breast cancer may be combined to identify women at different levels of breast cancer risk. Such stratification could guide preventive and screening strategies. However, empirical evidence for genetic risk stratification is lacking. METHODS: We investigated the value of using 77 breast cancer-associated single nucleotide polymorphisms (SNPs) for risk stratification, in a study of 33 673 breast cancer cases and 33 381 control women of European origin. We tested all possible pair-wise multiplicative interactions and constructed a 77-SNP polygenic risk score (PRS) for breast cancer overall and by estrogen receptor (ER) status. Absolute risks of breast cancer by PRS were derived from relative risk estimates and UK incidence and mortality rates. RESULTS: There was no strong evidence for departure from a multiplicative model for any SNP pair. Women in the highest 1% of the PRS had a three-fold increased risk of developing breast cancer compared with women in the middle quintile (odds ratio [OR] = 3.36, 95% confidence interval [CI] = 2.95 to 3.83). The ORs for ER-positive and ER-negative disease were 3.73 (95% CI = 3.24 to 4.30) and 2.80 (95% CI = 2.26 to 3.46), respectively. Lifetime risk of breast cancer for women in the lowest and highest quintiles of the PRS were 5.2% and 16.6% for a woman without family history, and 8.6% and 24.4% for a woman with a first-degree family history of breast cancer. CONCLUSIONS: The PRS stratifies breast cancer risk in women both with and without a family history of breast cancer. The observed level of risk discrimination could inform targeted screening and prevention strategies. Further discrimination may be achievable through combining the PRS with lifestyle/environmental factors, although these were not considered in this report.
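A polygenic risk score of this kind is, at its core, a sum of risk-allele counts weighted by per-SNP log odds ratios, so that the exponentiated difference between two scores is an odds ratio. A minimal sketch, using three hypothetical SNPs rather than the study's 77 published estimates:

```python
from math import log, exp

def polygenic_risk_score(genotypes, log_odds_ratios):
    """PRS = sum over SNPs of (risk-allele count 0/1/2) x (log odds ratio)."""
    assert len(genotypes) == len(log_odds_ratios)
    return sum(g * b for g, b in zip(genotypes, log_odds_ratios))

# Hypothetical per-allele odds ratios for three SNPs (illustrative only).
betas = [log(1.2), log(1.1), log(0.9)]

carrier = polygenic_risk_score([2, 1, 0], betas)     # two risk alleles at SNP1, one at SNP2
reference = polygenic_risk_score([0, 0, 0], betas)   # no risk alleles
odds_ratio_vs_reference = exp(carrier - reference)
```

In practice the population is then stratified by PRS percentile, as in the study's comparison of the top 1% against the middle quintile.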
Abstract:
Previous research on the prediction of fiscal aggregates has shown evidence that simple autoregressive models often provide better forecasts of fiscal variables than multivariate specifications. We argue that the multivariate models considered by previous studies are small-scale, probably burdened by overparameterization, and not robust to structural changes. Bayesian Vector Autoregressions (BVARs), on the other hand, allow the information contained in a large data set to be summarized efficiently, and can also allow for time variation in both the coefficients and the volatilities. In this paper we explore the performance of BVARs with constant and drifting coefficients for forecasting key fiscal variables such as government revenues, expenditures, and interest payments on the outstanding debt. We focus on both point and density forecasting, as assessments of a country’s fiscal stability and overall credit risk should typically be based on the specification of a whole probability distribution for the future state of the economy. Using data from the US and the largest European countries, we show that both the adoption of a large system and the introduction of time variation help in forecasting, with the former playing a relatively more important role in point forecasting, and the latter being more important for density forecasting.
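The coefficient shrinkage at the heart of a BVAR can be illustrated in its simplest scalar form: a Bayesian AR(1) whose posterior-mean coefficient, under a zero-mean Gaussian prior and unit noise variance, is a ridge estimate. This is an illustrative reduction, not the paper's full large-system, time-varying model:

```python
def bayes_ar1_coefficient(series, prior_precision=1.0):
    """Posterior-mean AR(1) coefficient under a zero-mean Gaussian prior
    with the given precision, assuming unit noise variance. This is the
    scalar analogue of Minnesota-style shrinkage in a BVAR: larger
    prior_precision shrinks the coefficient harder towards zero."""
    x = series[:-1]  # lagged values
    y = series[1:]   # current values
    sxy = sum(a * b for a, b in zip(x, y))
    sxx = sum(a * a for a in x)
    return sxy / (sxx + prior_precision)

def forecast_next(series, prior_precision=1.0):
    """One-step-ahead point forecast from the shrunken AR(1) coefficient."""
    phi = bayes_ar1_coefficient(series, prior_precision)
    return phi * series[-1]
```

With `prior_precision=0` this collapses to the ordinary least-squares AR(1) fit; the density forecasts discussed in the paper would additionally require the posterior variance, which this point-forecast sketch omits.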
Abstract:
Geostatistics has been successfully used to analyze and characterize the spatial variability of environmental properties. Besides giving estimated values at unsampled locations, it provides a measure of the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. In this work, universal block kriging is used for the first time to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge and estimating dilution. The results demonstrate that geostatistical methodology can provide good estimates of the dispersion of effluents that are very valuable in assessing the environmental impact and managing sea outfalls. Moreover, since accurate measurements of the plume’s dilution are rare, these studies might be very helpful in the future to validate dispersion models.
Abstract:
This paper proposes a dynamic scheduler that supports the coexistence of guaranteed and non-guaranteed bandwidth servers to efficiently handle overloads of soft tasks by making additional capacity available from two sources: (i) residual capacity, allocated but left unused when jobs complete in less than their budgeted execution time; and (ii) capacity stolen from inactive non-isolated servers used to schedule best-effort jobs. The effectiveness of the proposed approach in reducing the mean tardiness of periodic jobs is demonstrated through extensive simulations. The results become even more significant when tasks’ computation times have a large variance.
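The first source of additional capacity, residual reclaiming, can be sketched as a shared pool to which servers donate budget left over when jobs finish early, and from which overloaded servers draw before their next replenishment. This is an illustrative abstraction of the mechanism, not the paper's scheduler; all names and figures are assumptions:

```python
class CapacityPool:
    """Minimal sketch of residual-capacity reclaiming among bandwidth servers."""

    def __init__(self):
        self.residual = 0.0  # spare capacity currently available in the pool

    def donate(self, budget, actual_execution):
        """A job completed: donate any unused portion of its budget.
        Returns the amount donated (zero if the job overran its budget)."""
        leftover = max(0.0, budget - actual_execution)
        self.residual += leftover
        return leftover

    def request(self, amount):
        """An overloaded server asks for extra capacity.
        Returns what was actually granted (at most what the pool holds)."""
        granted = min(amount, self.residual)
        self.residual -= granted
        return granted

# A job budgeted 5 time units finishes after 3.5; an overloaded server
# then asks for 2 units but can only receive the 1.5 that were reclaimed.
pool = CapacityPool()
pool.donate(5.0, 3.5)
extra = pool.request(2.0)
```

A real scheduler would additionally track deadlines and server states so that reclaimed capacity is consumed without breaking the isolation guarantees of the guaranteed servers.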
Abstract:
The structural integrity of multi-component structures is usually determined by the strength and durability of their unions. Adhesive bonding is often chosen over welding, riveting and bolting, due to the reduction of stress concentrations, reduced weight penalty and easy manufacturing, amongst other advantages. In the past decades, the Finite Element Method (FEM) has been used for the simulation and strength prediction of bonded structures, by strength of materials or fracture mechanics-based criteria. Cohesive-zone models (CZMs) have already proved to be an effective tool in modelling damage growth, surpassing a few limitations of the aforementioned techniques. Despite this fact, they still suffer from the restriction of damage growth only at predefined growth paths. The eXtended Finite Element Method (XFEM) is a recent improvement of the FEM, developed to allow the growth of discontinuities within bulk solids along an arbitrary path, by enriching degrees of freedom with special displacement functions, thus overcoming the main restriction of CZMs. These two techniques were tested to simulate adhesively bonded single- and double-lap joints. The comparative evaluation of the two methods showed their capabilities and/or limitations for this specific purpose.