910 results for Process-based model (PBM)


Relevance:

100.00%

Publisher:

Abstract:

Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for either the stock price level or its volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices: broad dividends replace the narrower traditional dividends that are more commonly used; a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends replaces the more common linear forecasting models; and a stochastic discount rate replaces the constant discount rate. Empirical results show that this model predicts fundamental prices better than alternative models that use a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices remain largely detached from fundamental prices, and the bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate Vector Autoregression (TVAR) model and a single-equation model to run cointegration tests between these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices. Such bubbles are found in the S&P data during the late 1990s.
Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The 'old economy' as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Materials and Telecommunication Services sectors and in the Real Estate industry group.
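The recursive sup-ADF idea used in the third chapter can be sketched compactly: compute a right-tailed Dickey-Fuller statistic over forward-expanding windows and take the supremum, which flags explosive (bubble-like) episodes. The sketch below is a minimal illustration on synthetic data, not the author's implementation; the lag structure, window fraction, and critical values are deliberately simplified.

```python
import numpy as np

def df_tstat(y):
    """Dickey-Fuller t-statistic: regress dy_t on a constant and y_{t-1};
    an explosive root pushes the statistic to large positive values."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

def sadf(y, min_window=30):
    """Sup-ADF statistic: the supremum of DF statistics over
    forward-expanding windows (a Phillips-style recursive test)."""
    return max(df_tstat(y[:end]) for end in range(min_window, len(y) + 1))

rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=200))               # unit root: no bubble
bubble = np.empty(200)
bubble[0] = 1.0
for t in range(1, 200):
    bubble[t] = 1.05 * bubble[t - 1] + rng.normal()  # mildly explosive

print(sadf(walk) < sadf(bubble))  # True: the explosive series is flagged
```

In practice the statistic is compared against simulated right-tail critical values rather than a fixed threshold.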

Relevance:

100.00%

Publisher:

Abstract:

Multi-cloud applications are composed of services offered by multiple cloud platforms, where the user/developer has full knowledge of which platforms are used. Using multiple cloud platforms avoids the following problems: (i) vendor lock-in, that is, the application's dependency on a particular cloud platform, which is harmful in the case of degradation or failure of platform services, or of price increases in service usage; (ii) degradation or failure of the application due to fluctuations in the quality of service (QoS) provided by some cloud platform, or due to the failure of any individual service. In a multi-cloud scenario it is possible to replace a failed service, or one with QoS problems, with an equivalent service from another cloud platform. For an application to adopt the multi-cloud perspective, mechanisms are needed that can select which cloud services/platforms should be used according to the requirements determined by the programmer/user. In this context, the major challenges in developing such applications include: (i) choosing the underlying cloud services and platforms based on the user-defined requirements in terms of functionality and quality; (ii) the need to continually monitor dynamic information (such as response time, availability, and price) about cloud services, given the wide variety of available services; and (iii) the need to adapt the application when QoS violations affect user-defined requirements. This PhD thesis proposes an approach for the dynamic adaptation of multi-cloud applications, applied when a service becomes unavailable or when the requirements set by the user/developer indicate that another available multi-cloud configuration would meet them more efficiently. The proposed strategy is composed of two phases.
The first phase consists of application modeling, exploiting the capacity for representing commonalities and variability proposed in the context of the Software Product Line (SPL) paradigm. In this phase, an extended feature model is used to specify the cloud service configuration used by the application (commonalities) and the different possible providers for each service (variability). Furthermore, the non-functional requirements associated with cloud services are specified as properties in this model, describing dynamic information about these services. The second phase consists of an autonomic process based on the MAPE-K control loop, which is responsible for optimally selecting a multi-cloud configuration that meets the established requirements and for performing the adaptation. The proposed adaptation strategy is independent of the programming technique used to perform the adaptation. In this work we implement the adaptation strategy using several programming techniques, such as aspect-oriented programming, context-oriented programming, and component- and service-oriented programming. Based on the proposed steps, we assess: (i) whether the modeling process and the specification of non-functional requirements can ensure effective monitoring of user satisfaction; (ii) whether the optimal selection process presents significant gains compared with a sequential approach; and (iii) which techniques offer the best trade-off between development/modularity effort and performance.
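A MAPE-K-style selection step of the kind described can be sketched as follows. This is a minimal, hypothetical illustration, not the thesis' actual mechanism: the `Offer` catalogue, the thresholds, and the cheapest-feasible policy are all assumptions. It filters provider offers against user requirements (Analyze), picks an optimal feasible one (Plan), and rebinds the service if it differs from the current binding (Execute).

```python
from dataclasses import dataclass

# Hypothetical provider catalogue: each cloud service of the application
# (a commonality in the feature model) can be bound to one of several
# providers (its variability), each with monitored dynamic properties.
@dataclass
class Offer:
    provider: str
    response_ms: float
    availability: float  # fraction, from monitoring
    price: float         # per request

def analyze(offers, max_ms, min_avail):
    """Analyze step: keep only offers meeting the user's requirements."""
    return [o for o in offers if o.response_ms <= max_ms and o.availability >= min_avail]

def plan(feasible):
    """Plan step: pick the cheapest feasible offer (one possible policy)."""
    return min(feasible, key=lambda o: o.price, default=None)

def mape_k_step(bound, offers, max_ms=200, min_avail=0.99):
    """One Monitor-Analyze-Plan-Execute pass for a single service."""
    best = plan(analyze(offers, max_ms, min_avail))
    if best and best.provider != bound:
        return best.provider   # Execute: rebind (adaptation)
    return bound               # keep the current binding

offers = [Offer("cloudA", 120, 0.999, 0.010),
          Offer("cloudB", 300, 0.995, 0.004),   # too slow: filtered out
          Offer("cloudC", 150, 0.992, 0.006)]
print(mape_k_step("cloudA", offers))  # cloudC: cheapest offer meeting both thresholds
```

The Knowledge component of MAPE-K would correspond to the feature model plus the monitored properties feeding `offers`.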

Relevance:

100.00%

Publisher:

Abstract:

Advanced Oxidation Processes (AOP) are techniques involving the formation of the hydroxyl radical (HO•), which oxidizes organic matter at a high rate. Their industrial application has been increasing due to their capacity to degrade recalcitrant substances that cannot be completely removed by traditional effluent treatment processes. In the present work, phenol degradation by the photo-Fenton process, based on the addition of H2O2, Fe2+ and luminous radiation, was studied. An experimental design was developed to analyze the effect of the phenol, H2O2 and Fe2+ concentrations on the fraction of total organic carbon (TOC) degraded. The experiments were performed in a 1.5 L batch photochemical parabolic reactor. Samples of the reaction medium were collected at different reaction times and analyzed in a Shimadzu TOC analyzer (TOC-VWP). The results showed a negative effect of the phenol concentration and a positive effect of the two other variables on the TOC degraded fraction. A statistical analysis of the experimental design showed that the hydrogen peroxide concentration was the most influential variable on the TOC degraded fraction at 45 minutes, and it generated a model with R² = 0.82 that predicted the experimental data with low precision. The Visual Basic for Applications (VBA) tool was then used to generate a neural network model and a photochemical database. This model presented R² = 0.96 and precisely predicted the response data used for testing. These results indicate the potential industrial application of the developed tool, mainly for its simplicity, low cost and ease of access.
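The neural-network surrogate idea, mapping reagent concentrations to the TOC degraded fraction, can be sketched with a small one-hidden-layer network. The data below are synthetic stand-ins that mimic only the reported signs of the effects (negative for phenol, positive for H2O2 and Fe2+); the thesis' actual VBA tool and experimental dataset are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the factorial-design data: coded levels of
# [phenol], [H2O2], [Fe2+] and a TOC-degraded fraction with a negative
# phenol effect and positive H2O2/Fe2+ effects, as reported.
X = rng.uniform(-1, 1, size=(60, 3))
y = 0.5 - 0.15 * X[:, 0] + 0.25 * X[:, 1] + 0.10 * X[:, 2] + 0.05 * rng.normal(size=60)

# One-hidden-layer tanh network trained by batch gradient descent.
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8);      b2 = 0.0
lr = 0.05
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)               # hidden activations
    pred = H @ W2 + b2
    err = pred - y
    gW2 = H.T @ err / len(y); gb2 = err.mean()
    gH = np.outer(err, W2) * (1 - H**2)    # backprop through tanh
    gW1 = X.T @ gH / len(y); gb1 = gH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

r2 = 1 - np.sum(err**2) / np.sum((y - y.mean())**2)
print(round(r2, 3))  # training R² approaches the noise ceiling of this synthetic response
```

A real application would hold out test points, as the thesis does when reporting R² = 0.96 on test data.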


Relevance:

100.00%

Publisher:

Abstract:

Bayesian nonparametric models, such as the Gaussian process and the Dirichlet process, have been extensively applied for target kinematics modeling in various applications including environmental monitoring, traffic planning, endangered species tracking, dynamic scene analysis, autonomous robot navigation, and human motion modeling. As shown by these successful applications, Bayesian nonparametric models are able to adjust their complexities adaptively from data as necessary, and are resistant to overfitting or underfitting. However, most existing works assume that the sensor measurements used to learn the Bayesian nonparametric target kinematics models are obtained a priori or that the target kinematics can be measured by the sensor at any given time throughout the task. Little work has been done on controlling a sensor with a bounded field of view to obtain measurements of mobile targets that are most informative for reducing the uncertainty of the Bayesian nonparametric models. To present the systematic sensor planning approach to learning Bayesian nonparametric models, the Gaussian process target kinematics model is first introduced, which is capable of describing time-invariant spatial phenomena, such as ocean currents, temperature distributions and wind velocity fields. The Dirichlet process-Gaussian process target kinematics model is subsequently discussed for modeling mixtures of mobile targets, such as pedestrian motion patterns.

Novel information theoretic functions are developed for these introduced Bayesian nonparametric target kinematics models to represent the expected utility of measurements as a function of sensor control inputs and random environmental variables. A Gaussian process expected Kullback Leibler divergence is developed as the expectation of the KL divergence between the current (prior) and posterior Gaussian process target kinematics models with respect to the future measurements. Then, this approach is extended to develop a new information value function that can be used to estimate target kinematics described by a Dirichlet process-Gaussian process mixture model. A theorem is proposed that shows the novel information theoretic functions are bounded. Based on this theorem, efficient estimators of the new information theoretic functions are designed, which are proved to be unbiased with the variance of the resultant approximation error decreasing linearly as the number of samples increases. Computational complexities for optimizing the novel information theoretic functions under sensor dynamics constraints are studied, and are proved to be NP-hard. A cumulative lower bound is then proposed to reduce the computational complexity to polynomial time.

Three sensor planning algorithms are developed according to the assumptions on the target kinematics and the sensor dynamics. For problems where the control space of the sensor is discrete, a greedy algorithm is proposed. The efficiency of the greedy algorithm is demonstrated by a numerical experiment with data of ocean currents obtained by moored buoys. A sweep line algorithm is developed for applications where the sensor control space is continuous and unconstrained. Synthetic simulations as well as physical experiments with ground robots and a surveillance camera are conducted to evaluate the performance of the sweep line algorithm. Moreover, a lexicographic algorithm is designed based on the cumulative lower bound of the novel information theoretic functions, for the scenario where the sensor dynamics are constrained. Numerical experiments with real data collected from indoor pedestrians by a commercial pan-tilt camera are performed to examine the lexicographic algorithm. Results from both the numerical simulations and the physical experiments show that the three sensor planning algorithms proposed in this dissertation based on the novel information theoretic functions are superior at learning the target kinematics with little or no prior knowledge.

Relevance:

100.00%

Publisher:

Abstract:

Four marine fish species are among the most important on the world market: cod, salmon, tuna, and sea bass. While the supply of North American and European markets for two of these species - Atlantic salmon and European sea bass - mainly comes from fish farming, Atlantic cod and tunas are mainly caught from wild stocks. We address the question of what the status of these wild stocks will be in the midterm future, specifically in the year 2048. Whereas the effects of climate change and ecological driving forces on fish stocks have already gained much attention, our prime interest is in studying the effects of changing economic drivers, as well as the impact of variable management effectiveness. Using a process-based ecological-economic multispecies optimization model, we assess the future stock status under different scenarios of change. We simulate (i) technological progress in fishing, (ii) increasing demand for fish, and (iii) increasing supply of farmed fish, as well as the interplay of these driving forces under different scenarios of (limited) fishery management effectiveness. We find that economic change has a substantial effect on fish populations. Increasing aquaculture production can dampen the fishing pressure on wild stocks, but this effect is likely to be overwhelmed by increasing demand and technological progress, both increasing fishing pressure. The only solution to avoid collapse of the majority of stocks is institutional change to improve management effectiveness significantly above the current state. We conclude that full recognition of economic drivers of change will be needed to successfully develop an integrated ecosystem management and to sustain the wild fish stocks until 2048 and beyond.
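The interplay of drivers can be caricatured with a Schaefer-type bioeconomic simulation: logistic stock growth, open-access effort that follows profit, and upward drifts in catchability (technological progress) and price (rising demand). All coefficients below are illustrative assumptions, not the study's calibrated multispecies model; the point is only the qualitative contrast between effective and absent management over a 2048-style horizon.

```python
def simulate(years=42, tech_growth=0.02, price_growth=0.02, managed=False):
    """Schaefer-type stock under open-access effort dynamics.
    Catchability and price drift upward (technology and demand);
    effective management, if enabled, caps the harvest rate."""
    r, K = 0.4, 1.0          # intrinsic growth, carrying capacity
    B, E = 0.6 * K, 1.0      # biomass, fishing effort
    q, p, cost = 0.1, 1.0, 0.08
    for _ in range(years):
        q *= 1 + tech_growth             # (i) technological progress
        p *= 1 + price_growth            # (ii) increasing demand -> price
        harvest = q * E * B
        if managed:                      # institutional change: cap exploitation
            harvest = min(harvest, 0.5 * r * B)
        B = max(B + r * B * (1 - B / K) - harvest, 1e-6)
        E = max(E + 0.5 * (p * harvest - cost * E), 0.0)  # effort follows profit
    return B

print(simulate(managed=False), simulate(managed=True))  # managed stock ends higher
```

Under the harvest cap the stock settles near half of carrying capacity, while the open-access run is driven toward the (falling) bionomic equilibrium as prices and catchability grow.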

Relevance:

100.00%

Publisher:

Abstract:

The distribution, abundance, behaviour, and morphology of marine species are affected by spatial variability in the wave environment. Maps of wave metrics (e.g. significant wave height Hs, peak energy wave period Tp, and benthic wave orbital velocity URMS) are therefore useful for predictive ecological models of marine species and ecosystems. A number of techniques are available to generate maps of wave metrics, with varying levels of complexity in terms of input data requirements, operator knowledge, and computation time. Relatively simple "fetch-based" models are generated using geographic information system (GIS) layers of bathymetry and dominant wind speed and direction. More complex, but computationally expensive, "process-based" models are generated using numerical models such as the Simulating Waves Nearshore (SWAN) model. We generated maps of wave metrics based on both fetch-based and process-based models and asked whether predictive performance in models of benthic marine habitats differed. Predictive models of seagrass distribution for Moreton Bay, Southeast Queensland, and Lizard Island, Great Barrier Reef, Australia, were generated using maps based on each type of wave model. For Lizard Island, performance of the process-based wave maps was significantly better for describing the presence of seagrass, based on Hs, Tp, and URMS. Conversely, for the predictive model of seagrass in Moreton Bay, based on benthic light availability and Hs, there was no difference in performance using the maps of the different wave metrics. For predictive models where wave metrics are the dominant factor determining ecological processes it is recommended that process-based models be used. Our results suggest that for models where wave metrics provide secondarily useful information, either fetch- or process-based models may be equally useful.
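A fetch-based wave metric is essentially one empirical growth law applied per grid cell. The sketch below uses a commonly quoted deep-water fetch-limited relation, gHs/U² = 0.0016·(gF/U²)^0.5, as an assumption standing in for whichever formulation a given GIS tool implements; a full fetch model would also measure fetch along the dominant wind direction to the nearest land for every cell.

```python
import math

G = 9.81  # gravitational acceleration, m s^-2

def fetch_limited_hs(wind_speed, fetch):
    """Deep-water fetch-limited significant wave height (m) from one
    common empirical growth law: gH/U^2 = 0.0016 * (gF/U^2)^0.5."""
    f_star = G * fetch / wind_speed**2            # dimensionless fetch
    return 0.0016 * math.sqrt(f_star) * wind_speed**2 / G

# A fetch-based wave map applies this per grid cell, with fetch taken
# from GIS layers of bathymetry/coastline and dominant wind direction.
for fetch_km in (1, 10, 50):
    print(fetch_km, round(fetch_limited_hs(10.0, fetch_km * 1000), 2))
```

This captures why sheltered (short-fetch) sites get systematically lower Hs, which is often enough when wave exposure is only a secondary predictor, as the Moreton Bay result suggests.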

Relevance:

100.00%

Publisher:

Abstract:

Government communication is an important management tool during a public health crisis, but understanding its impact is difficult. Strategies may be adjusted in reaction to developments on the ground and it is challenging to evaluate the impact of communication separately from other crisis management activities. Agent-based modeling is a well-established research tool in social science to respond to similar challenges. However, there have been few such models in public health. We use the example of the TELL ME agent-based model to consider ways in which a non-predictive policy model can assist policy makers. This model concerns individuals’ protective behaviors in response to an epidemic, and the communication that influences such behavior. Drawing on findings from stakeholder workshops and the results of the model itself, we suggest such a model can be useful: (i) as a teaching tool, (ii) to test theory, and (iii) to inform data collection. We also plot a path for development of similar models that could assist with communication planning for epidemics.
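The kind of non-predictive policy model discussed can be illustrated with a toy agent-based simulation in which communication intensity drives adoption of protective behaviour, which in turn damps transmission. This is a deliberately minimal stand-in, not the TELL ME model itself; all rates and population sizes are invented for illustration.

```python
import random

random.seed(42)

N, STEPS = 1000, 60
BETA, RECOVER = 0.30, 0.10     # per-step transmission and recovery probabilities
PROTECTION_EFFECT = 0.25       # protected agents cut their risk to a quarter

def run(comm_intensity):
    """Total infections in one epidemic, given a communication intensity
    that (with perceived prevalence) drives protective-behaviour adoption."""
    infected = set(random.sample(range(N), 5))
    protected_agents, recovered = set(), set()
    total_infected = len(infected)
    for _ in range(STEPS):
        prevalence = len(infected) / N
        for a in range(N):
            # behaviour: communication plus perceived risk drive adoption
            if a not in protected_agents and random.random() < comm_intensity + prevalence:
                protected_agents.add(a)
        new_inf = set()
        for a in range(N):
            if a in infected or a in recovered:
                continue
            risk = BETA * prevalence
            if a in protected_agents:
                risk *= PROTECTION_EFFECT
            if random.random() < risk:
                new_inf.add(a)
        recovered |= {a for a in infected if random.random() < RECOVER}
        infected = (infected | new_inf) - recovered
        total_infected += len(new_inf)
    return total_infected

print(run(0.0), run(0.2))  # stronger communication, smaller outbreak
```

Even a toy like this supports the three uses the paper identifies: it teaches the feedback between prevalence and behaviour, lets one test theory by swapping the adoption rule, and shows which quantities (adoption rates, protection efficacy) data collection would need to pin down.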

Relevance:

100.00%

Publisher:

Abstract:

In 2006, a large and prolonged bloom of the dinoflagellate Karenia mikimotoi occurred in Scottish coastal waters, causing extensive mortalities of benthic organisms including annelids and molluscs and some species of fish (Davidson et al., 2009). A coupled hydrodynamic-algal transport model was developed to track the progression of the bloom around the Scottish coast during June–September 2006 and hence investigate the processes controlling the bloom dynamics. Within this individual-based model, cells were capable of growth, mortality and phototaxis and were transported by physical processes of advection and turbulent diffusion, using current velocities extracted from operational simulations of the MRCS ocean circulation model of the North-west European continental shelf. Vertical and horizontal turbulent diffusion of cells were treated using a random walk approach. Comparison of model output with remotely sensed chlorophyll concentrations and cell counts from coastal monitoring stations indicated that it was necessary to include multiple spatially distinct seed populations of K. mikimotoi at separate locations on the shelf edge to capture the qualitative pattern of bloom transport and development. We interpret this as indicating that the source population was being transported northwards by the Hebridean slope current from where colonies of K. mikimotoi were injected onto the continental shelf by eddies or other transient exchange processes. The model was used to investigate the effects on simulated K. mikimotoi transport and dispersal of: (1) the distribution of the initial seed population; (2) algal growth and mortality; (3) water temperature; (4) the vertical movement of particles by diurnal migration and eddy diffusion; (5) the relative role of the shelf edge and coastal currents; (6) the role of wind forcing.
The numerical experiments emphasized the requirement for a physiologically based biological model and indicated that improved modelling of future blooms will potentially benefit from better parameterisation of temperature dependence of both growth and mortality and finer spatial and temporal hydrodynamic resolution.
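The particle-tracking core of such an individual-based model, advection by supplied currents, a random-walk step for turbulent diffusion, and net growth applied to the cells each particle carries, can be sketched as follows. The velocities, diffusivity, and growth rate below are illustrative values, not those of the MRCS-driven simulations.

```python
import numpy as np

rng = np.random.default_rng(7)

DT = 3600.0            # time step (s)
KH = 10.0              # horizontal eddy diffusivity (m^2 s^-1), illustrative

def step(x, y, cells, u, v, growth_rate):
    """Advance particle positions and their cell load by one time step."""
    # advection by the (externally supplied) current velocities
    x = x + u * DT
    y = y + v * DT
    # random walk for turbulence: per-axis displacement std sqrt(2*K*dt)
    sigma = np.sqrt(2.0 * KH * DT)
    x = x + rng.normal(0.0, sigma, size=x.shape)
    y = y + rng.normal(0.0, sigma, size=y.shape)
    # net biological rate (growth minus mortality), per second
    cells = cells * np.exp(growth_rate * DT)
    return x, y, cells

# Seed population at the shelf edge, carried by a northward slope current.
n = 500
x = rng.normal(0.0, 1000.0, n)
y = rng.normal(0.0, 1000.0, n)
cells = np.full(n, 1e6)
for _ in range(24):                       # one day of hourly steps
    x, y, cells = step(x, y, cells, u=0.05, v=0.20, growth_rate=2e-6)

print(round(y.mean() / 1000))  # ≈ 17 (km of northward drift after one day)
```

The full model adds phototaxis, vertical migration, temperature-dependent rates, and spatially varying velocity fields, but the advect-diffuse-grow step above is the basic building block.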



Relevance:

100.00%

Publisher:

Abstract:

Title: The £ for lb. Challenge - a lose-win-win scenario. Results from a novel workplace-based, peer-led weight management programme in 2016.

Names: Damien Bennett, Declan Bradley, Angela McComb, Amy Kiernan, Tracey Owen

Background: Tackling obesity is a public health priority. The £ for lb. Challenge is the first country wide, workplace-based peer-led weight management programme in the UK or Ireland with participants from a range of private and public businesses in Northern Ireland (NI).
Intervention: The intervention was workplace-based, led by workplace Champions and based on the NHS Choices 12 week weight loss guide. It operated from January to April 2016. Overweight and obese adult workers were eligible. Training of Peer Champions (staff volunteers) involved two half day workshops delivered by dieticians and physical activity professionals.
Outcome measurement: Weight was measured at enrolment and 12 weekly intervals. Changes in weight, % weight, BMI and % BMI were determined for the whole cohort and sex and deprivation subgroups.
Results: There were 1513 eligible participants from 35 companies. Engagement rate was 98%. 75% of participants completed the programme. Mean weight loss was 2.4 kg or 2.7%. Almost a quarter (24%) lost at least 5% initial bodyweight. Male participants were over twice as likely to complete the programme and three times more likely to lose 5% body weight or more. Over £17,000 was raised for NI charities.
Discussion: The £ for lb. Challenge is a successful health improvement programme with important weight loss for many participants, particularly male workers. With high levels of user engagement and ownership, and successful multidisciplinary collaboration between public health, voluntary bodies, and private and public companies, it is a novel workplace-based model with potential to expand.
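As a quick arithmetic check, the reported means imply a baseline weight: a 2.4 kg loss equal to 2.7% of body weight corresponds to roughly 89 kg at enrolment. The cohort figures below come from the abstract; the completer count is an estimate derived from them.

```python
# Back-of-envelope consistency check of the reported outcomes
# (illustrative arithmetic only; figures taken from the abstract).
eligible = 1513
completion = 0.75
mean_loss_kg = 2.4
mean_loss_pct = 2.7

completers = round(eligible * completion)
implied_baseline_kg = mean_loss_kg / (mean_loss_pct / 100)

print(completers)                      # ≈ 1135 completers
print(round(implied_baseline_kg, 1))  # implied mean baseline weight ≈ 88.9 kg
```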

Relevance:

100.00%

Publisher:

Abstract:

This work represents an original contribution to the methodology for ecosystem models' development as well as the first attempt of an end-to-end (E2E) model of the Northern Humboldt Current Ecosystem (NHCE). The main purpose of the developed model is to build a tool for ecosystem-based management and decision making, reason why the credibility of the model is essential, and this can be assessed through confrontation to data. Additionally, the NHCE exhibits a high climatic and oceanographic variability at several scales, the major source of interannual variability being the interruption of the upwelling seasonality by the El Niño Southern Oscillation, which has direct effects on larval survival and fish recruitment success. Fishing activity can also be highly variable, depending on the abundance and accessibility of the main fishery resources. This context brings the two main methodological questions addressed in this thesis, through the development of an end-to-end model coupling the high trophic level model OSMOSE to the hydrodynamics and biogeochemical model ROMS-PISCES: i) how to calibrate ecosystem models using time series data and ii) how to incorporate the impact of the interannual variability of the environment and fishing. First, this thesis highlights some issues related to the confrontation of complex ecosystem models to data and proposes a methodology for a sequential multi-phases calibration of ecosystem models. We propose two criteria to classify the parameters of a model: the model dependency and the time variability of the parameters. Then, these criteria along with the availability of approximate initial estimates are used as decision rules to determine which parameters need to be estimated, and their precedence order in the sequential calibration process.
Additionally, a new Evolutionary Algorithm designed for the calibration of stochastic models (e.g. Individual Based Models) and optimized for maximum likelihood estimation has been developed and applied to the calibration of the OSMOSE model to time series data. The environmental variability is explicit in the model: the ROMS-PISCES model forces the OSMOSE model and drives potential bottom-up effects up the foodweb through plankton and fish trophic interactions, as well as through changes in the spatial distribution of fish. The latter effect was taken into account using presence/absence species distribution models which are traditionally assessed through a confusion matrix and the statistical metrics associated to it. However, when considering the prediction of the habitat against time, the variability in the spatial distribution of the habitat can be summarized and validated using the emerging patterns from the shape of the spatial distributions. We modeled the potential habitat of the main species of the Humboldt Current Ecosystem using several sources of information (fisheries, scientific surveys and satellite monitoring of vessels) jointly with environmental data from remote sensing and in situ observations, from 1992 to 2008. The potential habitat was predicted over the study period with monthly resolution, and the model was validated using quantitative and qualitative information of the system using a pattern oriented approach. The final ROMS-PISCES-OSMOSE E2E ecosystem model for the NHCE was calibrated using our evolutionary algorithm and a likelihood approach to fit monthly time series data of landings, abundance indices and catch at length distributions from 1992 to 2008. To conclude, some potential applications of the model for fishery management are presented and their limitations and perspectives discussed.
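The calibration loop, an evolutionary search maximizing the likelihood of a stochastic model against time series data, can be sketched on a toy problem. The stochastic logistic model, noise levels, and (mu + lambda) settings below are assumptions for illustration, not the OSMOSE configuration; the structure (simulate replicates, score against observations, mutate and select) is the transferable part.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(r, steps=30, reps=20):
    """Stochastic logistic growth; returns the mean trajectory over
    replicate runs (the model is stochastic, like an IBM)."""
    out = np.zeros((reps, steps))
    x = np.full(reps, 0.1)
    for t in range(steps):
        x = x + r * x * (1 - x) + rng.normal(0, 0.01, reps)
        out[:, t] = x
    return out.mean(axis=0)

# "Observed" time series generated with a known parameter r = 0.5
observed = simulate(0.5)

def log_likelihood(r, sigma=0.05):
    """Gaussian log-likelihood of the observed series given r."""
    resid = simulate(r) - observed
    return -0.5 * np.sum(resid**2) / sigma**2

# (mu + lambda) evolutionary search for the maximum-likelihood r
pop = rng.uniform(0.05, 1.5, size=12)
for _ in range(25):
    children = np.clip(pop + rng.normal(0, 0.05, size=12), 0.01, 2.0)
    both = np.concatenate([pop, children])
    fitness = np.array([log_likelihood(r) for r in both])
    pop = both[np.argsort(fitness)[-12:]]   # keep the fittest half

best = pop[np.argmax([log_likelihood(r) for r in pop])]
print(round(best, 2))  # close to the true value 0.5
```

Because the fitness is itself noisy, replicate averaging inside `simulate` plays the role that repeated stochastic runs play in calibrating a real individual-based model.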

Relevance:

100.00%

Publisher:

Abstract:

This work evaluates the mechanical behavior of cementitious materials at different length scales. First, the mechanical properties of concrete produced with a bio-plasticizer based on effective microorganisms (EM) are studied by statistical nanoindentation and compared with those of concrete produced with an ordinary superplasticizer (SP). It is found that adding the EM-based bio-plasticizer improves the strength of the C–S–H by increasing the cohesion and friction of the solid nanograins. Statistical analysis of the indentation results suggests that the EM-based bio-plasticizer inhibits the precipitation of C–S–H with a higher solid volume fraction. Second, a micromechanics-based multiscale model is derived for the poroelastic behavior of cement paste at early age. The proposed approach yields the poroelastic properties required for modeling the partially saturated mechanical behavior of aging cement pastes. The model is shown to predict the percolation threshold and the undrained Young's modulus in agreement with experimental data. A stochastic metamodel based on polynomial chaos is constructed to propagate parameter uncertainty across length scales. A sensitivity analysis is conducted by post-processing the metamodel for cement pastes with water-to-cement ratios between 0.35 and 0.70. It is found that the underlying uncertainty of the homogenized poroelastic properties is mainly due to the activation energy of the calcium aluminates at early age and, later, to the elastic modulus of the low-density calcium silicate hydrates.
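A polynomial chaos metamodel with a variance-based sensitivity readout can be sketched for two standard-normal inputs, standing in hypothetically for an activation energy and an elastic modulus. The model function and its coefficients are invented for illustration; the mechanics are the Hermite basis, least-squares fitting, and reading Sobol-type indices off the chaos coefficients.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)

# Probabilists' Hermite polynomials (orthogonal under the standard normal)
def He(n, x):
    if n == 0: return np.ones_like(x)
    if n == 1: return x
    return x * He(n - 1, x) - (n - 1) * He(n - 2, x)

# Hypothetical model output depending on two uncertain inputs
def model(x1, x2):
    return 2.0 * x1 + 0.5 * x2 + 0.3 * x1 * x2

# Fit a degree-2 polynomial chaos expansion by least squares
n = 2000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = model(x1, x2)
idx = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]   # multi-indices, total degree <= 2
A = np.column_stack([He(i, x1) * He(j, x2) for i, j in idx])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Variance decomposition: E[(He_i He_j)^2] = i! * j!, so each term
# contributes coef^2 * i! * j! to the variance (the (0,0) term is the mean).
contrib = {ij: c**2 * factorial(ij[0]) * factorial(ij[1])
           for ij, c in zip(idx, coef) if ij != (0, 0)}
total = sum(contrib.values())
S1 = contrib[(1, 0)] / total   # first-order Sobol index of input 1
print(round(S1, 2))  # 0.92: input 1 dominates the output variance
```

In the thesis' setting the "model" is the multiscale poroelastic homogenization, and the dominant index shifts between parameters as hydration proceeds.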

Relevance:

100.00%

Publisher:

Abstract:

The focus of this paper is to assess and evaluate a new method for utilising coal combustion residues in the glass manufacturing process. A mathematical model of the glass manufacturing material balance was used to find a favourable proportion of normally used batch materials and coal ash. It was found that it is possible to substitute up to 20% of the batch with coal ash. At the scale of world glass production, there is a potential to save 8.4 million tons of silica sand, 6 million tons of dolomite, 3 million tons of clay and 0.2 million tons of lime borate. Furthermore, the suggested method has the potential to utilise 2% of coal combustion products.
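The batch substitution is simple mass-balance arithmetic, sketched below with a hypothetical simplified soda-lime recipe. The 20% substitution limit comes from the abstract; the per-material recipe is an assumption for illustration and not the paper's balance model.

```python
# Illustrative batch-substitution arithmetic (hypothetical batch recipe).
batch = {  # tonnes per 100 t of batch, a simplified soda-lime recipe
    "silica sand": 60.0,
    "soda ash": 18.0,
    "dolomite": 12.0,
    "limestone": 8.0,
    "other": 2.0,
}
ash_fraction = 0.20  # up to 20 % of the batch replaced by coal ash

coal_ash = ash_fraction * sum(batch.values())
reduced = {m: q * (1 - ash_fraction) for m, q in batch.items()}

print(coal_ash)                # 20.0 t of coal ash per 100 t of batch
print(reduced["silica sand"])  # 48.0 t of sand instead of 60.0
```

A real balance would also match the oxide composition of the ash against the target glass composition, which is what constrains the substitution to about 20%.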