945 results for Advanced Transaction Models


Relevance:

30.00%

Publisher:

Abstract:

In this work, a method for building multiple-model structures is presented. A clustering algorithm that uses data from the system is employed to define the architecture of the multiple-model structure, including the size of the region covered by each model and the number of models. A heating, ventilation, and air conditioning (HVAC) system is used as a testbed for the proposed method.
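
Below is a minimal sketch of the clustering-based multiple-model idea described in the abstract, assuming k-means clustering and per-cluster linear models as illustrative stand-ins; the paper's actual clustering algorithm, local model class, and HVAC variables are not specified here, and all data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical operating data: regressors X (e.g. lagged temperatures, valve
# positions) and output y (e.g. supply-air temperature).
X = rng.uniform(-1.0, 1.0, size=(500, 3))
y = np.where(X[:, 0] > 0, 2.0 * X[:, 1], -1.5 * X[:, 2]) + 0.05 * rng.standard_normal(500)

# 1) Cluster the data to define the architecture: the number of local models
#    and the region each one covers.
n_models = 4
clusterer = KMeans(n_clusters=n_models, n_init=10, random_state=0).fit(X)

# 2) Fit one local model per cluster.
local_models = [
    LinearRegression().fit(X[clusterer.labels_ == k], y[clusterer.labels_ == k])
    for k in range(n_models)
]

# 3) Predict with the local model whose region contains the query point.
def predict(x_new):
    k = clusterer.predict(x_new.reshape(1, -1))[0]
    return local_models[k].predict(x_new.reshape(1, -1))[0]

print(predict(np.array([0.3, 0.5, -0.2])))
```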

Relevance:

30.00%

Publisher:

Abstract:

The ability of four operational weather forecast models [ECMWF, Action de Recherche Petite Echelle Grande Echelle model (ARPEGE), Regional Atmospheric Climate Model (RACMO), and Met Office] to generate a cloud at the right location and time (the cloud frequency of occurrence) is assessed in the present paper using a two-year time series of observations collected by profiling ground-based active remote sensors (cloud radar and lidar) located at three different sites in western Europe (Cabauw, Netherlands; Chilbolton, United Kingdom; and Palaiseau, France). Particular attention is given to potential biases that may arise from instrumentation differences (especially sensitivity) from one site to another and from intermittent sampling. In a second step, the statistical properties of the cloud variables involved in most advanced cloud schemes of numerical weather forecast models (ice water content and cloud fraction) are characterized and compared with their counterparts in the models. The two years of observations are first considered as a whole in order to evaluate the accuracy of the statistical representation of the cloud variables in each model. It is shown that all models tend to produce too many high-level clouds, with too-high cloud fraction and ice water content. The midlevel and low-level cloud occurrence is also generally overestimated, with too-low cloud fraction but a correct ice water content. The dataset is then divided into seasons to evaluate the potential of the models to generate different cloud situations in response to different large-scale forcings. Strong variations in cloud occurrence are found in the observations from one season to the same season the following year, as well as in the seasonal cycle. Overall, the model biases observed using the whole dataset are still found at the seasonal scale, but the models generally manage to reproduce the observed seasonal variations in cloud occurrence well. Overall, the models do not generate the same cloud fraction distributions, and these distributions do not agree with the observations. Another general conclusion is that the use of continuous ground-based radar and lidar observations is definitely a powerful tool for evaluating model cloud schemes and for a responsive assessment of the benefit achieved by changing or tuning a model cloud scheme.
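
As a hedged illustration of the "cloud frequency of occurrence" diagnostic mentioned in the abstract, the sketch below computes, from binary time-height cloud masks (1 = cloud detected), the fraction of profiles containing cloud at each level for observations and a model; the array sizes and occurrence rates are invented and do not correspond to the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
n_profiles, n_levels = 10_000, 60            # hourly profiles x height bins

obs_mask = rng.random((n_profiles, n_levels)) < 0.15
mod_mask = rng.random((n_profiles, n_levels)) < 0.20   # e.g. a model producing cloud too often

# Frequency of occurrence per height level = mean of the binary mask over time.
foc_obs = obs_mask.mean(axis=0)
foc_mod = mod_mask.mean(axis=0)

bias = foc_mod - foc_obs                     # positive = model clouds occur too often
print(f"mean occurrence bias over all levels: {bias.mean():+.3f}")
```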

Relevance:

30.00%

Publisher:

Abstract:

Using a water balance modelling framework, this paper analyses the effects of urban design on the water balance, with a focus on evapotranspiration and storm water. First, two quite different urban water balance models are compared: Aquacycle which has been calibrated for a suburban catchment in Canberra, Australia, and the single-source urban evapotranspiration-interception scheme (SUES), an energy-based approach with a biophysically advanced representation of interception and evapotranspiration. A fair agreement between the two modelled estimates of evapotranspiration was significantly improved by allowing the vegetation cover (leaf area index, LAI) to vary seasonally, demonstrating the potential of SUES to quantify the links between water sensitive urban design and microclimates and the advantage of comparing the two modelling approaches. The comparison also revealed where improvements to SUES are needed, chiefly through improved estimates of vegetation cover dynamics as input to SUES, and more rigorous parameterization of the surface resistance equations using local-scale suburban flux measurements. Second, Aquacycle is used to identify the impact of an array of water sensitive urban design features on the water balance terms. This analysis confirms the potential to passively control urban microclimate by suburban design features that maximize evapotranspiration, such as vegetated roofs. The subsequent effects on daily maximum air temperatures are estimated using an atmospheric boundary layer budget. Potential energy savings of about 2% in summer cooling are estimated from this analysis. This is a clear ‘return on investment’ of using water to maintain urban greenspace, whether as parks distributed throughout an urban area or individual gardens or vegetated roofs.
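
A rough sketch of the water-balance bookkeeping such comparisons rest on (precipitation = evapotranspiration + runoff + storage change), with evapotranspiration scaled by a seasonal LAI cycle in the spirit of the change that improved agreement between the two models; this is not Aquacycle or SUES, and all monthly values below are invented.

```python
import numpy as np

precip = np.array([60, 55, 50, 45, 40, 38, 35, 40, 48, 55, 58, 62], float)   # mm/month
lai = 1.0 + 0.8 * np.sin(np.linspace(0, 2 * np.pi, 12))                      # seasonal vegetation cover
et_ref = 45.0                                                                 # mm/month reference ET

# Let ET follow the seasonal LAI cycle rather than a constant vegetation cover.
et = et_ref * lai / lai.mean()
runoff = np.maximum(precip - et, 0.0) * 0.4          # crude runoff fraction
storage_change = precip - et - runoff                # closes the balance by construction

print(f"annual P  = {precip.sum():6.1f} mm")
print(f"annual ET = {et.sum():6.1f} mm")
print(f"annual R  = {runoff.sum():6.1f} mm")
print(f"annual dS = {storage_change.sum():6.1f} mm")
```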

Relevance:

30.00%

Publisher:

Abstract:

The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project, using PRACE (Partnership for Advanced Computing in Europe) resources, constructed and ran an ensemble of atmosphere-only global climate model simulations using the Met Office Unified Model GA3 configuration. Each simulation is 27 years in length, for both the present climate and an end-of-century future climate, at resolutions of N96 (130 km), N216 (60 km) and N512 (25 km), in order to study the impact of model resolution on high-impact climate features such as tropical cyclones. Increased model resolution is found to improve the simulated frequency of explicitly tracked tropical cyclones, and correlations of interannual variability in the North Atlantic and North West Pacific lie between 0.6 and 0.75. Improvements in the deficit of genesis in the eastern North Atlantic as resolution increases appear to be related to the representation of African Easterly Waves and the African Easterly Jet. However, the intensity of the modelled tropical cyclones, as measured by 10 m wind speed, remains weak, and there is no indication of convergence over this range of resolutions. In the future climate ensemble, there is a reduction of 50% in the frequency of Southern Hemisphere tropical cyclones, while in the Northern Hemisphere there is a reduction in the North Atlantic and a shift in the Pacific, with peak intensities becoming more common in the Central Pacific. There is also a change in tropical cyclone intensities, with the future climate having fewer weak storms and proportionally more strong storms.
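
As a small illustration of the interannual-variability metric quoted above, the sketch below correlates yearly tropical-cyclone counts from a model against observations for one basin; the 27 yearly counts are synthetic placeholders, not UPSCALE output.

```python
import numpy as np

rng = np.random.default_rng(2)
years = 27
obs_counts = rng.poisson(12, years)                    # e.g. observed North Atlantic TCs per year
model_counts = obs_counts + rng.poisson(3, years) - 1  # model with some skill plus noise

# Correlation of interannual variability between tracked model storms and observations.
r = np.corrcoef(obs_counts, model_counts)[0, 1]
print(f"interannual correlation: r = {r:.2f}")         # the paper reports 0.6-0.75
```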

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well established in some fields, it is still uncertain whether ABC will work with ecological IBMs. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC’s accepted values produced slightly better fits than the literature values did. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven parameters that were not narrowed, ABC revealed that three were correlated with other parameters, while the remaining four were found not to be estimable given the available data. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm’s movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
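
A minimal rejection-ABC sketch of the procedure described above: draw parameters from the prior, run the simulator, and retain the runs whose outputs are closest to the observations. The one-parameter growth simulator stands in for the 14-parameter earthworm IBM and is entirely invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(theta, n_days=30):
    """Toy growth simulator: a body-mass trajectory for growth rate theta."""
    return 0.5 * np.exp(theta * np.arange(n_days)) + 0.02 * rng.standard_normal(n_days)

observed = simulate(0.05)                    # pretend these are the experimental data

n_draws, keep_fraction = 20_000, 0.01
prior_draws = rng.uniform(0.0, 0.2, n_draws)             # prior for the growth rate
distances = np.array([np.linalg.norm(simulate(t) - observed) for t in prior_draws])

# Retain the simulations closest to the observations -> approximate posterior sample.
n_keep = int(keep_fraction * n_draws)
posterior = prior_draws[np.argsort(distances)[:n_keep]]

lo, hi = np.percentile(posterior, [2.5, 97.5])
print(f"posterior median {np.median(posterior):.3f}, 95% credible interval [{lo:.3f}, {hi:.3f}]")
```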

Relevance:

30.00%

Publisher:

Abstract:

Trading commercial real estate involves a process of exchange that is costly and which occurs over an extended and uncertain period of time. This has consequences for the performance and risk of real estate investments. Most research on transaction times has focused on residential rather than commercial real estate. We study the time taken to transact commercial real estate assets in the UK using a sample of 578 transactions over the period 2004 to 2013. We measure the average time to transact from a buyer and a seller perspective, distinguishing the search and due diligence phases of the process, and we conduct an econometric analysis to explain variation in due diligence times between assets. The median time for purchase of real estate from introduction to completion was 104 days and the median time for sale from marketing to completion was 135 days. There is considerable variation around these times, and the results suggest that some of this variation is related to market state, the type and quality of the asset, and the type of participants involved in the transaction. Our findings shed light on the drivers of liquidity at an individual asset level and can inform models that quantify the impact of uncertain time on market on real estate investment risk.
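
A hedged sketch of the kind of econometric model that could explain variation in due-diligence times, here a log-duration OLS regression on asset and market dummies; the covariate names, coefficients, and synthetic data are illustrative assumptions, not the paper's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 578
df = pd.DataFrame({
    "office": rng.integers(0, 2, n),        # asset-type dummy (hypothetical)
    "prime": rng.integers(0, 2, n),         # quality dummy (hypothetical)
    "hot_market": rng.integers(0, 2, n),    # market-state dummy (hypothetical)
})
# Synthetic log due-diligence times with made-up effects around a ~60-day baseline.
df["log_days"] = (np.log(60) + 0.20 * df["office"] - 0.15 * df["prime"]
                  - 0.25 * df["hot_market"] + 0.40 * rng.standard_normal(n))

model = smf.ols("log_days ~ office + prime + hot_market", data=df).fit()
print(model.params)   # coefficients approximate proportional effects on duration
```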

Relevance:

30.00%

Publisher:

Abstract:

Sea-ice concentrations in the Laptev Sea simulated by the coupled North Atlantic-Arctic Ocean-Sea-Ice Model and the Finite Element Sea-Ice Ocean Model are evaluated using sea-ice concentrations from Advanced Microwave Scanning Radiometer-Earth Observing System satellite data and a polynya classification method for winter 2007/08. While developed to simulate large-scale sea-ice conditions, both models are analysed here in terms of polynya simulation. The main modification of both models in this study is the implementation of a landfast-ice mask. Simulated sea-ice fields from different model runs are compared, with emphasis placed on the impact of this prescribed landfast-ice mask. We demonstrate that sea-ice models are not able to simulate flaw polynyas realistically when used without a fast-ice description. Our investigations indicate that, without landfast ice and with coarse horizontal resolution, the models overestimate the fraction of open water in the polynya. This is not because a realistic polynya appears, but rather because of a larger-scale reduction of ice concentrations and smoothed ice-concentration fields. After implementation of a landfast-ice mask, the polynya location is realistically simulated, but the total open-water area is still overestimated in most cases. The study shows that the fast-ice parameterization is essential for model improvements. However, further improvements are necessary in order to progress from the simulation of large-scale features in the Arctic towards a more detailed simulation of smaller-scale features (here polynyas) in an Arctic shelf sea.
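
The sketch below illustrates the core of a prescribed landfast-ice mask as discussed above: ice drift is suppressed wherever the mask is set, so offshore drift produces a divergence zone (where a flaw polynya would open) at the fast-ice edge. The grid, mask, and velocities are invented, and this is not the implementation used in either model.

```python
import numpy as np

ny, nx = 50, 80
u_ice = np.full((ny, nx), 0.10)           # m/s, uniform offshore ice drift
v_ice = np.zeros((ny, nx))

fastice_mask = np.zeros((ny, nx), dtype=bool)
fastice_mask[:, :15] = True               # landfast ice attached to the coast

# Landfast ice does not drift: zero the velocities where the mask is set.
u_ice[fastice_mask] = 0.0
v_ice[fastice_mask] = 0.0

# Divergence just seaward of the fast-ice edge is where ice concentration would
# drop, i.e. where the flaw polynya should appear.
divergence = np.gradient(u_ice, axis=1)
print("column of maximum divergence:", divergence.mean(axis=0).argmax())
```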

Relevance:

30.00%

Publisher:

Abstract:

Sea surface temperature (SST) data are often provided as gridded products, typically at resolutions of order 0.05 degrees and derived from satellite observations, to reduce data volume at the request of data users and to facilitate comparison against other products or models. Sampling uncertainty is introduced in gridded products where the full surface area of the ocean within a grid cell cannot be fully observed because of cloud cover. In this paper we parameterise uncertainties in SST as a function of the percentage of clear-sky pixels available and the SST variability within that subsample. This parameterisation is developed from Advanced Along Track Scanning Radiometer (AATSR) data, but is applicable to all gridded L3U SST products at resolutions of 0.05-0.1 degrees, irrespective of instrument and retrieval algorithm, provided that the instrument noise propagated into the SST is accounted for. We also calculate a sampling uncertainty of ~0.04 K in Global Area Coverage (GAC) Advanced Very High Resolution Radiometer (AVHRR) products, using related methods.
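
As a hedged illustration of a sampling-uncertainty estimate of the kind parameterised in the paper, the sketch below uses the simple random-sampling formula, in which the uncertainty of the cell-mean SST grows with the subsample SST variability and shrinks as the clear-sky fraction increases; the paper fits its own parameterisation from AATSR data, so this formula is only an assumed stand-in.

```python
import numpy as np

def sampling_uncertainty(sst_clear, n_total):
    """Uncertainty (K) of the cell-mean SST from n_clear of n_total pixels."""
    n_clear = sst_clear.size
    sigma = sst_clear.std(ddof=1)                  # SST variability in the clear-sky subsample
    return sigma * np.sqrt(1.0 / n_clear - 1.0 / n_total)

rng = np.random.default_rng(5)
cell = 290.0 + 0.3 * rng.standard_normal(400)      # hypothetical 20x20-pixel grid cell, K
clear = cell[rng.random(400) < 0.25]               # ~25% clear-sky coverage
print(f"sampling uncertainty ~ {sampling_uncertainty(clear, 400):.3f} K")
```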

Relevance:

30.00%

Publisher:

Abstract:

Aims: Over the past decade in particular, formal linguistic work within L3 acquisition has concentrated on hypothesizing and empirically determining the source of transfer from previous languages—L1, L2 or both—in L3 grammatical representations. In view of the progressive concern with more advanced stages, we aim to show that focusing on the L3 initial stages should remain one continued priority of the field, even—or especially—if the field is ready to shift towards modeling L3 development and ultimate attainment. Approach: We argue that L3 learnability is significantly affected by initial-stages transfer, as such transfer forms the basis of the initial L3 interlanguage. To illustrate our point, insights from studies using initial- and intermediary-stage L3 data are discussed in light of the developmental predictions that derive from the initial stages models. Conclusions: Despite a shared desire to understand the process of L3 acquisition as a whole, including offering developmental L3 theories, we argue that the field does not yet have—although it is ever closer to—the data basis needed to do so effectively. Originality: This article seeks to convince the readership of the need for conservatism in L3 acquisition theory building, offering a framework for how and why we can most effectively build on the accumulated knowledge of the L3 initial stages in order to make significant, steady progress. Significance: The arguments set out here are meant to provide an epistemological base for a tenable framework of formal approaches to L3 interlanguage development and, eventually, ultimate attainment.

Relevance:

30.00%

Publisher:

Abstract:

We analyze data obtained from a study designed to evaluate training effects on the performance of certain motor activities of Parkinson's disease patients. Maximum likelihood methods were used to fit beta-binomial/Poisson regression models tailored to evaluate the effects of training on the numbers of attempted and successful specified manual movements in 1-minute periods, controlling for disease stage and use of the preferred hand. We extend models previously considered by other authors in univariate settings to account for the repeated-measures nature of the data. The results suggest that the expected numbers of attempts and successes increase with training, except for patients in advanced stages of the disease using the non-preferred hand. Copyright (c) 2008 John Wiley & Sons, Ltd.
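
A simplified sketch of the count-regression component described above: a Poisson GLM for the number of attempted movements per one-minute period, with training, disease-stage, and preferred-hand indicators as covariates. The beta-binomial success model and the repeated-measures extension of the paper are not reproduced here, and the data and coefficients are synthetic.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 200
df = pd.DataFrame({
    "trained": rng.integers(0, 2, n),
    "advanced_stage": rng.integers(0, 2, n),
    "preferred_hand": rng.integers(0, 2, n),
})
# Synthetic counts: training raises, advanced stage lowers, the attempt rate.
rate = np.exp(2.5 + 0.3 * df.trained - 0.4 * df.advanced_stage + 0.2 * df.preferred_hand)
df["attempts"] = rng.poisson(rate)

fit = smf.poisson("attempts ~ trained + advanced_stage + preferred_hand", data=df).fit(disp=False)
print(np.exp(fit.params))    # rate ratios: values > 1 mean more attempts per minute
```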

Relevance:

30.00%

Publisher:

Abstract:

Data mining can be used in the healthcare industry to “mine” clinical data and discover hidden information for intelligent and effective decision making. The discovery of hidden patterns and relationships often remains out of reach, yet advanced data mining techniques can help remedy this. This thesis mainly deals with the Intelligent Prediction of Chronic Renal Disease (IPCRD). The data cover blood tests, urine tests, and external symptoms used to predict chronic renal disease. Data from the database are first imported into Weka (3.6), and the chi-square method is used for feature selection. After normalizing the data, three classifiers are applied and the quality of their output is evaluated. The three classifiers analysed are the Decision Tree, Naïve Bayes, and the K-Nearest Neighbour (KNN) algorithm. The results show that each technique has its unique strength in realizing the objectives of the defined mining goals. The efficiency of the Decision Tree and KNN was almost the same, but Naïve Bayes showed a comparative edge over the others. Sensitivity and specificity are further used as statistical measures to examine the performance of the binary classification: sensitivity (also called recall rate in some fields) measures the proportion of actual positives that are correctly identified, while specificity measures the proportion of negatives that are correctly identified. The CRISP-DM methodology is applied to build the mining models; it consists of six major phases: business understanding, data understanding, data preparation, modeling, evaluation, and deployment.
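
The thesis uses Weka 3.6; as a hedged illustration, the sketch below mirrors the same steps in scikit-learn on synthetic data: chi-square feature selection followed by Decision Tree, Naïve Bayes, and k-NN classifiers, evaluated with sensitivity and specificity computed from the confusion matrix.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

# Synthetic stand-in for the clinical dataset (blood, urine, symptom features).
X, y = make_classification(n_samples=400, n_features=20, n_informative=6, random_state=0)
X = X - X.min()                                   # chi2 requires non-negative features
X = SelectKBest(chi2, k=8).fit_transform(X, y)    # chi-square feature selection
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("Decision Tree", DecisionTreeClassifier(random_state=0)),
                  ("Naive Bayes", GaussianNB()),
                  ("k-NN", KNeighborsClassifier())]:
    tn, fp, fn, tp = confusion_matrix(y_te, clf.fit(X_tr, y_tr).predict(X_te)).ravel()
    sensitivity = tp / (tp + fn)                  # recall on the positive class
    specificity = tn / (tn + fp)
    print(f"{name:13s} sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```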

Relevance:

30.00%

Publisher:

Abstract:

In this study, we verify the existence of predictability in the Brazilian equity market. Unlike other studies of this kind, which evaluate the original series of each stock, we evaluate synthetic series created from linear models of the stocks. Following Burgess (1999), we use stepwise regression to form a model for each stock. We then use the variance ratio profile together with a Monte Carlo simulation to select models with potential predictability. Unlike Burgess (1999), we carry out White's (2000) Reality Check in order to verify the existence of positive returns in the out-of-sample period. We use the strategies proposed by Sullivan, Timmermann & White (1999) and Hsu & Kuan (2005), amounting to 26,410 simulated strategies. Finally, using the bootstrap methodology with 1,000 simulations, we find strong evidence of predictability in the models, even when transaction costs are included.
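
A minimal sketch of the variance-ratio statistic underlying the "variance ratio profile" mentioned above: VR(q) = Var(q-period returns) / (q x Var(1-period returns)), which is close to 1 for a random walk. The full profile, the Monte Carlo selection step, and White's Reality Check are not reproduced, and the return series is synthetic.

```python
import numpy as np

def variance_ratio(returns, q):
    """Simple variance-ratio estimator for horizon q (no bias correction)."""
    q_period = np.convolve(returns, np.ones(q), mode="valid")   # overlapping q-period returns
    return q_period.var(ddof=1) / (q * returns.var(ddof=1))

rng = np.random.default_rng(7)
r = rng.standard_normal(2000) * 0.01     # i.i.d. returns, so VR(q) should be near 1
print({q: round(variance_ratio(r, q), 3) for q in (2, 5, 10, 20)})
```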

Relevance:

30.00%

Publisher:

Abstract:

Markets are institutions created to facilitate trading activity. This is possible because a market is made up of institutions designed to reduce the transaction costs associated with this exchange process. Building on these two ideas, this thesis has three main objectives. (i) To analyse why the cointegration literature has measured these costs imprecisely; the main reason is a certain confusion between the concepts of transaction, transport, and marketing costs. (ii) To propose a procedure for indirectly measuring variable market transaction costs by combining regime-switching cointegration models with the theoretical framework offered by New Institutional Economics; this procedure is applied to quantify how much it costs to trade ethanol on the international market under its current institutions. (iii) Finally, using the same models and the same theoretical framework, this dissertation challenges the hypothesis that a well-developed international ethanol market already exists, as the literature has assumed. It likewise assesses the hypothesis that removing US trade barriers to Brazilian ethanol would be a sufficient condition for the development of this international market. The tests applied reject both hypotheses.

Relevance:

30.00%

Publisher:

Abstract:

Today, the trend within the electronics industry is towards the use of rapid and advanced simulation methodologies in association with synthesis toolsets. This paper presents an approach developed to support mixed-signal circuit design and analysis. The proposed methodology offers a novel approach to the problem of developing behavioural model descriptions of mixed-signal circuit topologies, by constructing a set of subsystems that supports the automated mapping of MATLAB®/SIMULINK® models to structural VHDL-AMS descriptions. The tool developed, named MS2SV, reads a SIMULINK® model file and translates it to structural VHDL-AMS code. It also creates the file structure required to simulate the translated model in SystemVision™. To validate the methodology and the developed program, the DAC08, AD7524 and AD5450 data converters were studied and initially modelled in MATLAB®/SIMULINK®. The VHDL-AMS code generated automatically by MS2SV (MATLAB®/SIMULINK® to SystemVision™) was then simulated in SystemVision™. The simulation results show that the proposed approach, which is based on VHDL-AMS descriptions of the original model library elements, allows for behavioural-level simulation of complex mixed-signal circuits.

Relevance:

30.00%

Publisher:

Abstract:

Using the U(4) hybrid formalism, manifestly N = (2,2) worldsheet supersymmetric sigma models are constructed for the type-IIB superstring in Ramond-Ramond backgrounds. The Kähler potential in these N = 2 sigma models depends on four chiral and antichiral bosonic superfields and two chiral and antichiral fermionic superfields. When the Kähler potential is quadratic, the model is a free conformal field theory which describes a flat ten-dimensional target space with Ramond-Ramond flux and non-constant dilaton. For more general Kähler potentials, the model describes curved target spaces with Ramond-Ramond flux that are not plane-wave backgrounds. Ricci-flatness of the Kähler metric implies the on-shell conditions for the background up to the usual four-loop conformal anomaly.
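
For reference, the standard definitions behind the Ricci-flatness statement above, written for a Kähler metric derived from the Kähler potential K; this is textbook notation only and does not reproduce the paper's superfield construction or background equations.

```latex
% Kähler metric from the Kähler potential, and the Ricci-flatness condition
g_{i\bar{j}} = \partial_i \partial_{\bar{j}} K ,
\qquad
R_{i\bar{j}} = -\,\partial_i \partial_{\bar{j}} \log \det\bigl(g_{k\bar{l}}\bigr) = 0 .
```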