978 results for Advanced Transaction Models


Relevance: 30.00%

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well-established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC’s accepted values produced slightly better fits than literature values do. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven unnarrowed parameters, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm’s movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. 
We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
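The rejection-ABC recipe summarized above (draw parameters from their priors, run the model, keep the simulations closest to the observations) is simple enough to sketch. Below, a toy Gaussian model stands in for the 14-parameter earthworm IBM; the priors, summary statistics, and retained fraction are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Toy stand-in for the IBM: normal data with unknown mean/sd."""
    return rng.normal(theta[0], theta[1], size=n)

def summary(x):
    """Summary statistics compared against the observations."""
    return np.array([x.mean(), x.std()])

# "Observed" data generated with true parameters (2.0, 1.0)
observed = summary(rng.normal(2.0, 1.0, size=50))

# Draw parameters from the prior, simulate, keep the closest runs
n_sims, keep_frac = 5000, 0.01
priors = np.column_stack([rng.uniform(0, 5, n_sims),     # prior on mean
                          rng.uniform(0.1, 3, n_sims)])  # prior on sd
dists = np.array([np.linalg.norm(summary(simulate(t)) - observed)
                  for t in priors])
accepted = priors[np.argsort(dists)[:int(n_sims * keep_frac)]]

print("posterior mean estimate:", accepted.mean(axis=0))
```

In practice the retained fraction trades off bias against posterior sample size, which is why the paper checks the accuracy of the analysis with cross-validation and coverage.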

Relevance: 30.00%

Abstract:

Trading commercial real estate involves a process of exchange that is costly and which occurs over an extended and uncertain period of time. This has consequences for the performance and risk of real estate investments. Most research on transaction times has occurred for residential rather than commercial real estate. We study the time taken to transact commercial real estate assets in the UK using a sample of 578 transactions over the period 2004 to 2013. We measure average time to transact from a buyer and seller perspective, distinguishing the search and due diligence phases of the process, and we conduct econometric analysis to explain variation in due diligence times between assets. The median time for purchase of real estate from introduction to completion was 104 days and the median time for sale from marketing to completion was 135 days. There is considerable variation around these times and results suggest that some of this variation is related to market state, type and quality of asset, and type of participants involved in the transaction. Our findings shed light on the drivers of liquidity at an individual asset level and can inform models that quantify the impact of uncertain time on market on real estate investment risk.

Relevance: 30.00%

Abstract:

Sea-ice concentrations in the Laptev Sea simulated by the coupled North Atlantic-Arctic Ocean-Sea-Ice Model and Finite Element Sea-Ice Ocean Model are evaluated using sea-ice concentrations from Advanced Microwave Scanning Radiometer-Earth Observing System satellite data and a polynya classification method for winter 2007/08. While developed to simulate large-scale sea-ice conditions, both models are analysed here in terms of polynya simulation. The main modification of both models in this study is the implementation of a landfast-ice mask. Simulated sea-ice fields from different model runs are compared with emphasis placed on the impact of this prescribed landfast-ice mask. We demonstrate that sea-ice models are not able to simulate flaw polynyas realistically when used without a fast-ice description. Our investigations indicate that without landfast ice and with coarse horizontal resolution the models overestimate the fraction of open water in the polynya. This is not because a realistic polynya appears but due to a larger-scale reduction of ice concentrations and smoothed ice-concentration fields. After implementation of a landfast-ice mask, the polynya location is realistically simulated but the total open-water area is still overestimated in most cases. The study shows that the fast-ice parameterization is essential for model improvements. However, further improvements are necessary in order to progress from the simulation of large-scale features in the Arctic towards a more detailed simulation of smaller-scale features (here polynyas) in an Arctic shelf sea.
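The key modification the study tests, prescribing a landfast-ice mask on top of the simulated concentration field, amounts to a simple overwrite of the masked cells. A minimal sketch with hypothetical field names (a real model would also immobilize the ice velocity in masked cells):

```python
import numpy as np

def apply_landfast_mask(ice_conc, fast_ice_mask):
    """Prescribe 100% ice concentration (immobile fast ice) where the
    landfast-ice mask is set; leave the model field elsewhere."""
    out = ice_conc.copy()
    out[fast_ice_mask] = 1.0
    return out

# 1D toy transect: a flaw polynya opening next to the fast-ice edge
conc = np.array([0.95, 0.9, 0.3, 0.1, 0.4, 0.8])
fast = np.array([True, True, False, False, False, False])
print(apply_landfast_mask(conc, fast))
```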

Relevance: 30.00%

Abstract:

Sea surface temperature (SST) data are often provided as gridded products, typically at resolutions of order 0.05 degrees from satellite observations, to reduce data volume at the request of data users and to facilitate comparison against other products or models. Sampling uncertainty is introduced in gridded products where the full surface area of the ocean within a grid cell cannot be fully observed because of cloud cover. In this paper we parameterise uncertainties in SST as a function of the percentage of clear-sky pixels available and the SST variability in that subsample. This parameterisation is developed from Advanced Along Track Scanning Radiometer (AATSR) data, but is applicable to all gridded L3U SST products at resolutions of 0.05-0.1 degrees, irrespective of instrument and retrieval algorithm, provided that instrument noise propagated into the SST is accounted for. We also calculate the sampling uncertainty of ~0.04 K in Global Area Coverage (GAC) Advanced Very High Resolution Radiometer (AVHRR) products, using related methods.
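The paper's parameterisation itself is fitted to AATSR data, but its two ingredients, the fraction of clear-sky pixels and the SST variability of the subsample, behave like the textbook finite-population standard error of a partially observed cell mean. The sketch below illustrates only that qualitative dependence (fewer clear pixels and more variability give larger uncertainty) and is not the paper's fitted formula; the pixel values are made up.

```python
import numpy as np

def sampling_uncertainty(clear_sky_sst, n_total):
    """Illustrative sampling uncertainty of a grid-cell mean SST when
    only a clear-sky subsample of the cell's n_total pixels is seen:
    the finite-population standard error of the subsample mean."""
    n = len(clear_sky_sst)
    s = np.std(clear_sky_sst, ddof=1)             # subsample variability
    fpc = np.sqrt((n_total - n) / (n_total - 1))  # finite-population corr.
    return s / np.sqrt(n) * fpc

# Hypothetical 0.05-degree cell: 10 clear-sky pixels out of 25
rng = np.random.default_rng(1)
pixels = 287.0 + 0.1 * rng.standard_normal(10)
print(round(float(sampling_uncertainty(pixels, 25)), 4))
```

Note the limiting behaviour: with every pixel observed the correction factor is zero, so the sampling uncertainty vanishes, as it should.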

Relevance: 30.00%

Abstract:

Aims: Over the past decade in particular, formal linguistic work within L3 acquisition has concentrated on hypothesizing and empirically determining the source of transfer from previous languages (L1, L2 or both) in L3 grammatical representations. In view of the progressive concern with more advanced stages, we aim to show that focusing on the L3 initial stages should remain one priority of the field, even, or especially, if the field is ready to shift towards modeling L3 development and ultimate attainment. Approach: We argue that L3 learnability is significantly impacted by initial stages transfer, as such transfer forms the basis of the initial L3 interlanguage. To illustrate our point, insights from studies using initial and intermediary stages L3 data are discussed in light of the developmental predictions that derive from the initial stages models. Conclusions: Despite a shared desire to understand the process of L3 acquisition as a whole, inclusive of offering developmental L3 theories, we argue that the field does not yet have, although it is ever closer to, the data basis needed to do so effectively. Originality: This article seeks to convince the readership of the need for conservatism in L3 acquisition theory building, offering a framework for how and why we can most effectively build on the accumulated knowledge of the L3 initial stages in order to make significant, steady progress. Significance: The arguments presented here are meant to provide an epistemological base for a tenable framework of formal approaches to L3 interlanguage development and, eventually, ultimate attainment.

Relevance: 30.00%

Abstract:

We analyze data obtained from a study designed to evaluate training effects on the performance of certain motor activities of Parkinson's disease patients. Maximum likelihood methods were used to fit beta-binomial/Poisson regression models tailored to evaluate the effects of training on the numbers of attempted and successful specified manual movements in 1-min periods, controlling for disease stage and use of the preferred hand. We extend models previously considered by other authors in univariate settings to account for the repeated measures nature of the data. The results suggest that the expected number of attempts and successes increase with training, except for patients with advanced stages of the disease using the non-preferred hand. Copyright (c) 2008 John Wiley & Sons, Ltd.
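The beta-binomial/Poisson structure can be written down directly: attempts in a period follow a Poisson law, and successes given attempts follow a beta-binomial law. A minimal joint log-likelihood sketch with made-up counts; the regression covariates (disease stage, preferred hand) and the repeated-measures extension are omitted here.

```python
import numpy as np
from scipy.special import betaln, gammaln

def betabinom_logpmf(k, n, a, b):
    """log P(k successes in n trials) under a beta-binomial model."""
    return (gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
            + betaln(k + a, n - k + b) - betaln(a, b))

def poisson_logpmf(n, lam):
    """log P(n attempts) under a Poisson model."""
    return n * np.log(lam) - lam - gammaln(n + 1)

def loglik(attempts, successes, lam, a, b):
    """Joint log-likelihood: attempts ~ Poisson(lam),
    successes | attempts ~ BetaBinomial(a, b)."""
    return float((poisson_logpmf(attempts, lam)
                  + betabinom_logpmf(successes, attempts, a, b)).sum())

# Hypothetical per-minute counts for one patient
attempts = np.array([12, 15, 9, 14])
successes = np.array([8, 11, 5, 10])
lam_hat = attempts.mean()  # Poisson MLE for the attempt rate
print(round(loglik(attempts, successes, lam_hat, 2.0, 1.0), 3))
```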

Relevance: 30.00%

Abstract:

Data mining can be used in the healthcare industry to "mine" clinical data and discover hidden information for intelligent and effective decision making. Hidden patterns and relationships often go unexploited, and advanced data mining techniques can help remedy this. This thesis mainly deals with Intelligent Prediction of Chronic Renal Disease (IPCRD). The data cover blood tests, urine tests, and external symptoms used to predict chronic renal disease. Data from the database are first imported into Weka (3.6) and the Chi-Square method is used for feature selection. After normalizing the data, three classifiers were applied and the efficiency of the output was evaluated: Decision Tree, Naïve Bayes, and the K-Nearest Neighbour algorithm. Results show that each technique has its unique strength in realizing the objectives of the defined mining goals. The efficiency of Decision Tree and KNN was almost the same, but Naïve Bayes showed a comparative edge over the others. Sensitivity and specificity tests are further used as statistical measures to examine the performance of the binary classification: sensitivity (also called recall in some fields) measures the proportion of actual positives that are correctly identified, while specificity measures the proportion of negatives that are correctly identified. The CRISP-DM methodology is applied to build the mining models; it consists of six major phases: business understanding, data understanding, data preparation, modeling, evaluation, and deployment.
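The two statistical measures used in the evaluation are straightforward to compute from a binary confusion matrix. A small sketch with made-up labels (1 = chronic renal disease, 0 = healthy):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity (recall) = TP/(TP+FN); specificity = TN/(TN+FP).
    y_true/y_pred are 0/1 labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative predictions from a hypothetical classifier
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
sens, spec = sensitivity_specificity(y_true, y_pred)
print(sens, spec)
```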

Relevance: 30.00%

Abstract:

In this study, we verify the existence of predictability in the Brazilian equity market. Unlike other studies of this kind, which evaluate the original series for each stock, we evaluate synthetic series created from linear models of stocks. Following Burgess (1999), we use stepwise regression to form the model for each stock. We then use the variance ratio profile together with a Monte Carlo simulation to select models with potential predictability. Unlike Burgess (1999), we carry out White's (2000) Reality Check in order to verify the existence of positive returns in the out-of-sample period. We use the strategies proposed by Sullivan, Timmermann & White (1999) and Hsu & Kuan (2005), amounting to 26,410 simulated strategies. Finally, using the bootstrap methodology with 1,000 simulations, we find strong evidence of predictability in the models, even after accounting for transaction costs.
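The variance ratio underlying the selection step compares the variance of q-period returns with q times the one-period variance; values near 1 indicate random-walk behaviour, while departures from 1 signal potential predictability. A sketch of the statistic on simulated returns (a Lo-MacKinlay-style estimator on overlapping sums, not the authors' exact profile construction):

```python
import numpy as np

def variance_ratio(returns, q):
    """Variance of overlapping q-period returns over q times the
    1-period variance. Near 1 for a random walk."""
    r = np.asarray(returns, dtype=float)
    rq = np.convolve(r, np.ones(q), mode="valid")  # overlapping q-sums
    return rq.var(ddof=1) / (q * r.var(ddof=1))

rng = np.random.default_rng(42)
rw = rng.standard_normal(5000)       # i.i.d. returns (random-walk prices)
ma = rw + 0.5 * np.roll(rw, 1)       # positively autocorrelated returns
print(variance_ratio(rw, 5), variance_ratio(ma, 5))
```

The i.i.d. series gives a ratio close to 1; the autocorrelated series pushes it well above 1, which is the kind of departure the variance ratio profile is designed to flag.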

Relevance: 30.00%

Abstract:

Markets are institutions created to facilitate trading activity. This is possible because a market is made up of institutions designed to reduce the transaction costs associated with this exchange process. Building on these two ideas, this thesis has three main objectives. (i) To analyse why the cointegration-analysis literature has measured these costs imprecisely; the main reason is a confusion between the concepts of transaction, transport and commercialization costs. (ii) To propose a procedure for indirectly measuring variable market transaction costs, combining regime-switching cointegration models with the theoretical framework offered by New Institutional Economics. This procedure is applied to quantify how much it costs to trade ethanol on the international market under its current institutions. (iii) Finally, using the same models and the same theoretical framework, this dissertation challenges the hypothesis, commonly assumed in the literature, that a well-developed international ethanol market already exists. Similarly, it also tests the hypothesis that removing US trade barriers against Brazilian ethanol would be a sufficient condition for the development of this international market. The tests applied reject both hypotheses.
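One common way the regime-switching cointegration literature turns a price spread into a transaction-cost estimate is a band-threshold (Band-TAR) model: the spread behaves like a random walk inside a cost band and mean-reverts only once it exceeds the band, so the band width can be estimated by grid search. The sketch below is a generic illustration on simulated data; the band width, reversion speed, and noise level are made-up numbers, not the thesis's estimator or the ethanol series.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate a spread that mean-reverts only outside a cost band of +/- 1.0
c_true, n = 1.0, 2000
z = np.zeros(n)
for t in range(1, n):
    pull = 0.3 if abs(z[t - 1]) > c_true else 0.0
    z[t] = (z[t - 1]
            - pull * np.sign(z[t - 1]) * (abs(z[t - 1]) - c_true)
            + 0.2 * rng.standard_normal())

def band_sse(z, c):
    """SSE of a two-regime model: random walk inside the band,
    linear reversion toward the band edge outside it."""
    dz, lag = np.diff(z), z[:-1]
    outside = np.abs(lag) > c
    excess = np.sign(lag) * (np.abs(lag) - c) * outside
    denom = excess @ excess
    beta = (excess @ dz) / denom if denom > 0 else 0.0
    return float(((dz - beta * excess) ** 2).sum())

# Estimate the band width by grid search over candidate thresholds
grid = np.linspace(0.2, 2.0, 91)
c_hat = grid[int(np.argmin([band_sse(z, c) for c in grid]))]
print("estimated transaction-cost band:", round(float(c_hat), 2))
```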

Relevance: 30.00%

Abstract:

Today, the trend within the electronics industry is for the use of rapid and advanced simulation methodologies in association with synthesis toolsets. This paper presents an approach developed to support mixed-signal circuit design and analysis. The proposed methodology offers a novel approach to the problem of developing behavioural model descriptions of mixed-signal circuit topologies by constructing a set of subsystems that supports the automated mapping of MATLAB®/SIMULINK® models to structural VHDL-AMS descriptions. The tool developed, named MS2SV, reads a SIMULINK® model file and translates it to structural VHDL-AMS code. It also creates the file structure required to simulate the translated model in SystemVision™. To validate the methodology and the developed program, the DAC08, AD7524 and AD5450 data converters were studied and initially modelled in MATLAB®/SIMULINK®. The VHDL-AMS code generated automatically by MS2SV (MATLAB®/SIMULINK® to SystemVision™) was then simulated in SystemVision™. The simulation results show that the proposed approach, which is based on VHDL-AMS descriptions of the original model library elements, allows for behavioural-level simulation of complex mixed-signal circuits.
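The mapping MS2SV automates is, at its core, a translation from block descriptions to structural VHDL-AMS text. The toy emitter below shows the flavour of such a translation only; the port naming, the electrical typing, and the entity layout are illustrative assumptions, not MS2SV's actual output format.

```python
def to_vhdl_ams_entity(name, inputs, outputs):
    """Emit a minimal VHDL-AMS entity skeleton for a SIMULINK-style
    block. A toy stand-in for one step of a SIMULINK-to-VHDL-AMS
    translator; names and types here are hypothetical."""
    ports = [f"terminal {p} : electrical" for p in inputs + outputs]
    lines = ["library IEEE;",
             "use IEEE.electrical_systems.all;",
             f"entity {name} is",
             "  port (" + ";\n        ".join(ports) + ");",
             f"end entity {name};"]
    return "\n".join(lines)

print(to_vhdl_ams_entity("dac08", ["vin"], ["vout"]))
```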

Relevance: 30.00%

Abstract:

Using the U(4) hybrid formalism, manifestly N = (2,2) worldsheet supersymmetric sigma models are constructed for the type-IIB superstring in Ramond-Ramond backgrounds. The Kähler potential in these N = 2 sigma models depends on four chiral and antichiral bosonic superfields and two chiral and antichiral fermionic superfields. When the Kähler potential is quadratic, the model is a free conformal field theory which describes a flat ten-dimensional target space with Ramond-Ramond flux and non-constant dilaton. For more general Kähler potentials, the model describes curved target spaces with Ramond-Ramond flux that are not plane-wave backgrounds. Ricci-flatness of the Kähler metric implies the on-shell conditions for the background up to the usual four-loop conformal anomaly.
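As a standard reminder (not specific to this paper's U(4) construction), the Ricci-flatness condition on a Kähler metric reduces to a condition on the determinant of the potential's second derivatives:

```latex
% Kähler metric from the potential K
g_{i\bar{\jmath}} = \partial_i \partial_{\bar{\jmath}} K ,
\qquad
% Ricci tensor of a Kähler metric
R_{i\bar{\jmath}} = -\,\partial_i \partial_{\bar{\jmath}} \log \det g ,
\qquad
% Ricci-flatness: \log\det g is pluriharmonic
R_{i\bar{\jmath}} = 0
\;\Longleftrightarrow\;
\det\!\big(\partial_i \partial_{\bar{\jmath}} K\big) = |f(z)|^{2}
\ \text{for some holomorphic } f .
```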

Relevance: 30.00%

Abstract:

The impact of new advanced technology on issues that concern meaningful information and its relation to studies of intelligence constitutes the main topic of the present paper. The advantages, disadvantages and implications of the synthetic methodology developed by cognitive scientists, according to which mechanical models of the mind, such as computer simulations or self-organizing robots, may provide good explanatory tools to investigate cognition, are discussed. A difficulty with this methodology is pointed out, namely the use of meaningless information to explain intelligent behavior that incorporates meaningful information. In this context, we ask what the contributions of cognitive science to contemporary studies of intelligent behavior are, and how technology may play a role in the analysis of the relationships established by organisms in their natural and social environments. © John Benjamins Publishing Company.


Relevance: 30.00%

Abstract:

This work evaluated kinetic and adsorption physicochemical models for the biosorption of lanthanum, neodymium, europium, and gadolinium by Sargassum sp. in batch systems. The results showed: (a) the pseudo-second order kinetic model was the best approximation for the experimental data, with the initial metal adsorption velocity parameter in the range 0.042-0.055 mmol g⁻¹ min⁻¹ (La < Nd < Gd < Eu); (b) the Langmuir adsorption model presented adequate correlation, with maximum metal uptake at 0.60-0.70 mmol g⁻¹ (Eu < La < Gd < Nd), and the metal-biomass affinity parameter showed distinct values (Gd < Nd < Eu < La: 183.1, 192.5, 678.3, and 837.3 L g⁻¹, respectively); and (c) preliminarily, the kinetics and adsorption evaluation did not reveal a well-defined metal selectivity behavior for RE biosorption by Sargassum sp., but they indicate a possible partition among the RE studied. © (2009) Trans Tech Publications.
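Fitting the Langmuir isotherm reported above, q = q_max·b·C/(1 + b·C), is a short nonlinear regression. The sketch uses synthetic, noiseless equilibrium data on roughly the reported uptake scale; the concentrations and the affinity value are made up for illustration, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, b):
    """Langmuir isotherm: uptake q as a function of equilibrium
    concentration c, with capacity q_max and affinity b."""
    return q_max * b * c / (1.0 + b * c)

# Synthetic equilibrium data (mmol/L vs mmol/g); values are made up
c_eq = np.array([0.02, 0.05, 0.1, 0.2, 0.5, 1.0])
q_eq = langmuir(c_eq, 0.65, 8.0)   # noiseless "data" for the sketch

(q_max_hat, b_hat), _ = curve_fit(langmuir, c_eq, q_eq, p0=[0.5, 5.0])
print(round(q_max_hat, 3), round(b_hat, 3))
```

With noiseless data the fit recovers the generating parameters; with real measurements one would also inspect the covariance matrix curve_fit returns.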

Relevance: 30.00%

Abstract:

This paper presents the development of an application created to assist the teaching of dental structures, providing rich content and different forms of interaction. An ontology was created to provide semantic information for the virtual models. We also used two gesture-based interaction devices: the Kinect and the Wii Remote. A system was developed that uses intuitive interaction and is able to generate three-dimensional images, making the teaching/learning experience motivating. The projection environment used by the system was called Mini CAVE. © 2012 IEEE.