16 results for "stock order flow model"

in Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

The main objective of this work is to develop a quasi-three-dimensional numerical model to simulate stony debris flows, considering a continuum fluid phase, composed of water and fine sediments, and a non-continuum phase including large particles such as pebbles and boulders. Large particles are treated in a Lagrangian frame of reference using the Discrete Element Method, while the fluid phase is treated in an Eulerian frame, using the Finite Element Method to solve the depth-averaged Navier–Stokes equations in two horizontal dimensions. The particles' equations of motion are solved in three dimensions. The model simulates particle–particle and wall–particle collisions, taking into account that the particles are immersed in a fluid. Bingham and Cross rheological models are used for the continuum phase. Both formulations provide very stable results, even in the range of very low shear rates, and the Bingham formulation is better able to simulate the stopping stage of the fluid when the applied shear stresses are low. Results of numerical simulations have been compared with data from laboratory experiments on a flume-fan prototype. The results show that the model is capable of simulating the motion of large particles moving in the fluid flow, handling dense particulate flows, and avoiding overlap among particles. An application to the debris flow events that occurred in northern Venezuela in 1999 shows that the model can replicate the main boulder accumulation areas surveyed by the USGS. The uniqueness of this research is the integration of mud flow and stony debris movement in a single modeling tool that can be used for planning and management of debris-flow-prone areas.
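The Lagrangian particle phase described in this abstract can be illustrated with a minimal soft-sphere Discrete Element step. This is only a hedged sketch, not the dissertation's implementation: the linear spring-dashpot contact law, the stiffness `k`, damping `c`, time step, and two-particle setup are illustrative assumptions, and the fluid coupling is omitted.

```python
import numpy as np

def dem_step(pos, vel, radius, mass, k=1e5, c=5.0, dt=1e-4):
    """One explicit time step of a 2-D soft-sphere DEM contact model.
    A linear spring-dashpot normal force acts while particles overlap."""
    force = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d)
            overlap = radius[i] + radius[j] - dist
            if overlap > 0.0:                         # contact detected
                normal = d / dist
                rel_vn = np.dot(vel[j] - vel[i], normal)
                fn = (k * overlap - c * rel_vn) * normal
                force[i] -= fn                        # equal and opposite
                force[j] += fn
    vel = vel + dt * force / mass[:, None]
    pos = pos + dt * vel
    return pos, vel

# Head-on collision of two equal spheres approaching at 1 m/s each.
pos = np.array([[0.0, 0.0], [0.04, 0.0]])
vel = np.array([[1.0, 0.0], [-1.0, 0.0]])
radius = np.array([0.015, 0.015])
mass = np.array([1.0, 1.0])
for _ in range(2000):
    pos, vel = dem_step(pos, vel, radius, mass)

separation = np.linalg.norm(pos[1] - pos[0])
momentum = (mass[:, None] * vel).sum(axis=0)
```

The repulsive spring force activates only while spheres overlap, which is how DEM codes avoid the particle interpenetration the abstract mentions.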

Relevance:

100.00%

Publisher:

Abstract:

A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using Lattice Boltzmann methods (LBMs), which contrast with other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed, and implementation issues are discussed. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains, and the solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM's flexibility as a solute transport solver. The LBM is then applied to simulate solute transport and fluid flow in porous media traversed by larger conduits. An LBM-based macroscopic flow solver (based on Darcy's law) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D); however, the new LBM-based model retains the ability to solve the inertial flows that are characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches. The analogy between Fick's second law (the diffusion equation) and the transient ground water flow equation is used to solve for the transient head distribution, and an altered-velocity flow solver with a source/sink term is applied to simulate a drawdown curve. Hydraulic parameters such as transmissivity and the storage coefficient are linked with LB parameters. These capabilities complete the LBM's effective treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
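The diffusion analogy used for the transient head distribution can be sketched with a minimal D1Q2 lattice Boltzmann solver, where hydraulic head plays the role of the diffusing quantity. Everything here (the relaxation time `tau`, grid size, periodic boundaries, and the pulse initial condition) is an illustrative assumption, not the dissertation's actual model.

```python
import numpy as np

def lbm_diffusion(h0, tau=1.0, steps=200):
    """D1Q2 lattice Boltzmann solver for the 1-D diffusion equation,
    used as an analogue of transient confined ground water flow.
    The lattice diffusivity is alpha = (tau - 0.5) in lattice units."""
    f = np.empty((2, len(h0)))
    f[0] = f[1] = 0.5 * h0              # start at equilibrium
    for _ in range(steps):
        h = f[0] + f[1]                 # macroscopic head
        feq = 0.5 * h                   # D1Q2 equilibrium (weights 1/2)
        f[0] += (feq - f[0]) / tau      # BGK collision
        f[1] += (feq - f[1]) / tau
        f[0] = np.roll(f[0], 1)         # stream right-moving population
        f[1] = np.roll(f[1], -1)        # stream left-moving population
    return f[0] + f[1]

h0 = np.zeros(101)
h0[50] = 1.0                            # instantaneous head pulse
h = lbm_diffusion(h0)
```

The collide-and-stream loop conserves the total head volume exactly and spreads the pulse symmetrically, the qualitative behavior expected of the transient flow equation.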

Relevance:

100.00%

Publisher:

Abstract:

Tropical rainforests account for more than a third of global net primary production and contain more than half of the global forest carbon. Though these forests are a disproportionately important component of the global carbon cycle, the relationship between rainforest productivity and climate remains poorly understood. Understanding the link between current climate and rainforest tree stem diameter increment, a major constituent of forest productivity, will be crucial to efforts at modeling future climate and rainforest response to climate change. This work reports the physiological and stem growth responses to micrometeorological and phenological states of ten species of canopy trees in a Costa Rican wet tropical forest at sub-annual time intervals. I measured tree growth using band dendrometers and estimated leaf and reproductive phenological states monthly. Electronic data loggers recorded xylem sap flow (an indicator of photosynthetic rate) and weather at half-hour intervals. An analysis of xylem sap flow showed that physiological responses were independent of species, which allowed me to construct a general model of weather-driven sap flow rates. This model predicted more than eighty percent of climate-driven sap flow variation. Leaf phenology influenced growth in three of the ten species, with two of these species showing a link between leaf phenology and weather. A combination of rainfall, air temperature, and irradiance likely provided the cues that triggered leaf drop in Dipteryx panamensis and Lecythis ampla. Combining the results of the sap flow model, growth, and the climate measures showed that tree growth was correlated with climate, though the majority of growth variation remained unexplained. Low variance in the environmental variables and growth rates likely contributed to the large amount of unexplained variation. A simple model that included the previous growth increment and three meteorological variables explained from four to nearly fifty percent of the growth variation. Significant growth carryover existed in six of the ten species, and rainfall was positively correlated with growth in eight of the ten species. Minimum nighttime temperature was also correlated with higher growth rates in five of the species, and irradiance in two species. These results indicate that tropical rainforest tree trunks could act as carbon sinks if the future climate becomes wetter and slightly warmer.
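The "simple model" of growth as a function of the previous increment plus three meteorological variables can be sketched as an ordinary least-squares regression. The data below are synthetic stand-ins generated with assumed coefficients; the variable names, magnitudes, and units are hypothetical, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly series standing in for the field data:
# rainfall, minimum nighttime temperature, and irradiance.
n = 120
rain = rng.gamma(2.0, 50.0, n)           # monthly rainfall (mm), assumed
tmin = rng.normal(22.0, 1.5, n)          # min nighttime temperature (C), assumed
par = rng.normal(30.0, 5.0, n)           # irradiance, assumed units
growth = np.zeros(n)
for t in range(1, n):                    # carryover + climate effects + noise
    growth[t] = (0.4 * growth[t - 1] + 0.002 * rain[t]
                 + 0.05 * tmin[t] + 0.01 * par[t] + rng.normal(0, 0.1))

# Design matrix: intercept, lagged growth (carryover), three climate terms.
X = np.column_stack([np.ones(n - 1), growth[:-1], rain[1:], tmin[1:], par[1:]])
y = growth[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
r2 = 1 - ((y - X @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

Fitting recovers a positive carryover coefficient, mirroring the "significant growth carryover" the abstract reports.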

Relevance:

100.00%

Publisher:

Abstract:

Freeway systems are becoming more congested each day. One contributor to freeway traffic congestion is platoons of on-ramp traffic merging into freeway mainlines. As a relatively low-cost countermeasure to the problem, ramp meters are being deployed in both directions of an 11-mile section of I-95 in Miami-Dade County, Florida. The local Fuzzy Logic (FL) ramp metering algorithm implemented in Seattle, Washington, has been selected for deployment. The FL ramp metering algorithm is powered by the Fuzzy Logic Controller (FLC), which depends on a series of parameters that can significantly alter the behavior of the controller, thus affecting the performance of the ramp meters. However, the most suitable values for these parameters are often difficult to determine, as they vary with current traffic conditions; for optimum performance, the parameter values must be fine-tuned. This research presents a new method of fine-tuning the FLC parameters using Particle Swarm Optimization (PSO). PSO attempts to optimize several important parameters of the FLC. The objective function of the optimization model incorporates the METANET macroscopic traffic flow model to minimize delay time, subject to constraints on reasonable ranges of ramp metering rates and FLC parameters. To further improve performance, a short-term traffic forecasting module using a discrete Kalman filter was incorporated to predict downstream freeway mainline occupancy, which helps to detect the presence of downstream bottlenecks. The CORSIM microscopic simulation model was selected as the platform to evaluate the performance of the proposed PSO tuning strategy. The ramp-metering algorithm incorporating the tuning strategy was implemented using CORSIM's run-time extension (RTE) and was tested on the aforementioned I-95 corridor. The performance of the FLC with PSO tuning was compared with that of the existing FLC without PSO tuning. The results show that the FLC with PSO tuning outperforms the existing FL metering, fixed-time metering, and existing conditions without metering in terms of total travel time savings, average speed, and system-wide throughput.
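The PSO tuning loop described above can be sketched in miniature. As a stand-in for the METANET-based delay objective, the example uses a simple quadratic "delay" surface in two FLC-like parameters; the swarm size, inertia, acceleration coefficients, and bounds are all illustrative assumptions, not the study's settings.

```python
import numpy as np

def pso(objective, bounds, n_particles=20, iters=60, seed=1):
    """Minimal global-best particle swarm optimizer (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))   # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()                # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                    # keep parameters in range
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Stand-in objective: total delay as a quadratic bowl in two hypothetical
# FLC parameters (the real objective would come from the METANET model).
delay = lambda p: (p[0] - 0.3) ** 2 + (p[1] - 0.8) ** 2 + 5.0
bounds = np.array([[0.0, 1.0], [0.0, 2.0]])
best, best_delay = pso(delay, bounds)
```

The clipping step plays the role of the "reasonable ranges" constraint on metering rates and FLC parameters mentioned in the abstract.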

Relevance:

100.00%

Publisher:

Abstract:

One of the most popular techniques for creating spatialized virtual sounds is based on the use of Head-Related Transfer Functions (HRTFs). HRTFs are signal processing models that represent the modifications undergone by the acoustic signal as it travels from a sound source to each of the listener's eardrums. These modifications are due to the interaction of the acoustic waves with the listener's torso, shoulders, head and pinnae, or outer ears. As such, HRTFs are somewhat different for each listener, and for a listener to perceive synthesized 3-D sound cues correctly, the synthesized cues must be similar to the listener's own HRTFs. One can measure individual HRTFs using specialized recording systems; however, these systems are prohibitively expensive and restrict the portability of the 3-D sound system. HRTF-based systems also face several computational challenges. This dissertation presents an alternative method for the synthesis of binaural spatialized sounds. The sound entering the pinna undergoes several reflective, diffractive and resonant phenomena, which determine the HRTF. Using signal processing tools, such as Prony's signal modeling method, an appropriate set of time delays and a resonant frequency were used to approximate the measured Head-Related Impulse Responses (HRIRs). Statistical analysis was used to derive empirical equations describing how the reflections and resonances are determined by the shape and size of the pinna features, obtained from 3-D images of the 15 experimental subjects modeled in the project. These equations were used to yield "Model HRTFs" that can create elevation effects. Listening tests conducted on 10 subjects show that these model HRTFs are 5% more effective than generic HRTFs at localizing sounds in the frontal plane. The number of reversals (perception of the sound source above the horizontal plane when it is actually below the plane, and vice versa) was also reduced by 5.7%, showing the perceptual effectiveness of this approach. The model is simple yet versatile because it relies on easy-to-measure parameters to create an individualized HRTF. This low-order parameterized model also reduces computational and storage demands while maintaining a sufficient number of perceptually relevant spectral cues.
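The structural HRIR idea (a direct impulse, a few delayed pinna reflections, and a damped resonance) can be sketched as follows. The delay positions, gains, resonance frequency, and decay rate are arbitrary illustrative placeholders; in the dissertation these values come from the pinna-geometry equations, which are not reproduced here.

```python
import numpy as np

fs = 44_100                          # sample rate in Hz (assumed)
n = 256
t = np.arange(n) / fs

# Toy structural HRIR: direct path, three pinna "reflections", and one
# exponentially damped resonance. All numbers below are illustrative.
hrir = np.zeros(n)
hrir[0] = 1.0                                    # direct-path impulse
for delay, gain in [(12, 0.5), (20, 0.35), (31, 0.2)]:
    hrir[delay] += gain                          # delayed reflections
f_res, decay = 4_000.0, 3_000.0                  # 4 kHz damped resonance
hrir += 0.3 * np.exp(-decay * t) * np.cos(2 * np.pi * f_res * t)

spectrum = np.abs(np.fft.rfft(hrir))             # resulting magnitude response
```

A handful of delays and one resonance already shape the spectrum with the comb-and-peak structure that elevation cues rely on, which is why such a low-order parameterized model can be cheap to store and compute.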

Relevance:

40.00%

Publisher:

Abstract:

This study investigated the utility of the Story Model for decision making at the jury level by examining the influence of evidence order and deliberation style on story consistency and guilt judgments. Participants were shown a videotaped trial stimulus and then provided case perceptions, including a guilt judgment and a narrative about what occurred during the incident. Participants then deliberated for approximately thirty minutes, using either an evidence-driven or a verdict-driven deliberation style, before again providing case perceptions, including a guilt determination, a narrative about what happened during the incident, and an evidence recognition test. Multi-level regression analyses revealed that evidence order, deliberation style, and sample interacted to influence both the story consistency measures and guilt. Among students, participants in the verdict-driven deliberation condition formed more consistent pro-prosecution stories when the prosecution presented its case in story order, while participants in the evidence-driven deliberation condition formed more consistent pro-prosecution stories when the defense's case was presented in story order. Findings were the opposite among community members: participants in the verdict-driven deliberation condition formed more consistent pro-prosecution stories when the defense's case was presented in story order, and participants in the evidence-driven deliberation condition formed more consistent pro-prosecution stories when the prosecution's case was presented in story order. Additionally, several story consistency measures influenced guilt decisions. Thus, there is some support for the hypothesis that story consistency mediates the influence of evidence order and deliberation style on guilt decisions.

Relevance:

40.00%

Publisher:

Abstract:

Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for both the stock price level and its volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower traditional dividends commonly used with broad dividends, a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends in place of the more common linear forecasting models, and a stochastic discount rate in place of a constant discount rate. Empirical results show that this model predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices remain largely detached from fundamental prices, and the bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate vector autoregression (TVAR) model and a single-equation model to run cointegration tests between these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey–Fuller test to check for the existence of periodically collapsing bubbles in stock prices; such bubbles are found in the S&P data during the late 1990s. Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The 'old economy' as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Material and Telecommunication Services sectors and the Real Estate industry group.
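The sup augmented Dickey–Fuller idea (take the largest ADF statistic over forward-expanding sample windows, flagging explosive episodes) can be sketched on synthetic data. This simplified version omits lag augmentation (so it is a plain Dickey–Fuller regression) and uses an arbitrary minimum window and an artificial explosive episode; none of the settings are from the dissertation.

```python
import numpy as np

def adf_tstat(y):
    """Dickey-Fuller t-statistic (no lag augmentation) for the regression
    dy_t = a + b * y_{t-1} + e_t; explosive roots push b above zero."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def sup_adf(y, min_window=40):
    """Sup of the DF t-statistic over forward-expanding windows."""
    return max(adf_tstat(y[:n]) for n in range(min_window, len(y) + 1))

rng = np.random.default_rng(0)
n = 200
random_walk = np.cumsum(rng.normal(size=n))
bubble = random_walk.copy()
bubble[150:] += 1.07 ** np.arange(50) - 1     # synthetic explosive episode

stat_rw = sup_adf(random_walk)
stat_bubble = sup_adf(bubble)
```

The series with the injected explosive run yields a larger sup statistic than the pure random walk, which is the signal the test exploits for periodically collapsing bubbles.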

Relevance:

40.00%

Publisher:

Abstract:

An integrated flow and transport model using the MIKE SHE/MIKE 11 software was developed to predict the flow and transport of mercury, Hg(II), under varying environmental conditions. The model analyzed the impact of remediation scenarios within the East Fork Poplar Creek watershed of the Oak Ridge Reservation with respect to the downstream concentration of mercury. The numerical simulations included the entire hydrological cycle: flow in rivers, overland flow, groundwater flow in the saturated and unsaturated zones, and evapotranspiration and precipitation time series. Stochastic parameters and hydrologic conditions drawn from a five-year period of historical hydrological data were used to analyze the hydrological cycle and to determine the prevailing mercury transport mechanism within the watershed. Simulations of remediation scenarios revealed that reduction of the highly contaminated point sources, rather than general remediation of the contaminant plume, has a more direct impact on downstream mercury concentrations.

Relevance:

40.00%

Publisher:

Abstract:

Secrecy is fundamental to computer security, but real systems often cannot avoid leaking some secret information. For this reason, the past decade has seen growing interest in quantitative theories of information flow that allow us to quantify the information being leaked. Within these theories, the system is modeled as an information-theoretic channel that specifies the probability of each output given each input. Given a prior distribution on those inputs, entropy-like measures quantify the amount of information leakage caused by the channel. This thesis presents new results in the theory of min-entropy leakage. First, we study the perspective of secrecy as a resource that is gradually consumed by a system, exploring this intuition through various models of min-entropy consumption. Next, we consider several composition operators that allow smaller systems to be combined into larger systems, and explore the extent to which the leakage of a combined system is constrained by the leakage of its constituents. Most significantly, we prove upper bounds on the leakage of a cascade of two channels, where the output of the first channel is used as input to the second; in addition, we show how to decompose a channel into a cascade of channels. We also establish fundamental new results about the recently proposed g-leakage family of measures. These results further highlight the significance of channel cascading. We prove that whenever channel A is composition refined by channel B, that is, whenever A is the cascade of B and R for some channel R, the leakage of A never exceeds that of B, regardless of the prior distribution or leakage measure (Shannon leakage, guessing entropy leakage, min-entropy leakage, or g-leakage). Moreover, we show that composition refinement is a partial order if we quotient away channel structure that is redundant with respect to leakage alone. These results are strengthened by the proof that composition refinement is the only way for one channel to never leak more than another with respect to g-leakage. Therefore, composition refinement robustly answers the question of when one channel is always at least as secure as another from a leakage point of view.
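The cascade bound can be checked numerically on a small example. Min-entropy leakage is the log-ratio of posterior to prior vulnerability; cascading two channels is matrix multiplication of their stochastic matrices. The particular matrices below are arbitrary illustrative channels, not examples from the thesis.

```python
import numpy as np

def min_entropy_leakage(prior, C):
    """Min-entropy leakage of channel C (rows: inputs, cols: outputs):
    log2 of posterior vulnerability over prior vulnerability."""
    v_prior = prior.max()                          # best guess a priori
    v_post = (prior[:, None] * C).max(axis=0).sum()  # best guess per output
    return np.log2(v_post / v_prior)

# Channel B followed by post-processing channel R; their cascade is the
# matrix product A = B @ R (output of B fed as input to R).
B = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
R = np.array([[0.9, 0.1],
              [0.3, 0.7],
              [0.5, 0.5]])
A = B @ R
prior = np.array([1 / 3, 1 / 3, 1 / 3])           # uniform prior on inputs

leak_B = min_entropy_leakage(prior, B)
leak_A = min_entropy_leakage(prior, A)
```

Consistent with the refinement result stated above, the cascade A = B·R never leaks more than B, whatever stochastic R is appended.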

Relevance:

40.00%

Publisher:

Abstract:

An awareness of mercury (Hg) contamination in various aquatic environments around the world has increased over the past decade, mostly due to its ability to concentrate in the biota. Because the presence and distribution of Hg in aquatic systems depend on many factors (e.g., pe, pH, salinity, temperature, organic and inorganic ligands, sorbents, etc.), it is crucial to understand its fate and transport in the presence of complexing constituents and natural sorbents under those different factors. An improved understanding of the subject will support the selection of monitoring, remediation, and restoration technologies. The coupling of equilibrium chemical reactions with transport processes in the model PHREEQC offers an advantage in simulating and predicting the fate and transport of aqueous chemical species of interest. Thus, a great variety of reactive transport problems can be addressed in aquatic systems with boundary conditions of specific interest. Nevertheless, PHREEQC lacks a comprehensive thermodynamic database for Hg. Therefore, in order to use PHREEQC to address the fate and transport of Hg in aquatic environments, it is necessary to expand its thermodynamic database, confirm it, and then evaluate it in applications where potential exists for its calibration and continued validation. The objectives of this study were twofold: 1) to develop, expand, and confirm the Hg database of the hydrogeochemical model PHREEQC, to enhance its capability to simulate the fate of Hg species in the presence of complexing constituents and natural sorbents under different conditions of pH, redox, salinity, and temperature; and 2) to apply and evaluate the new database in flow and transport scenarios at two field test beds, the Oak Ridge Reservation, Oak Ridge, TN, and Everglades National Park, FL, where Hg is present and of much concern. Overall, this research enhanced the capability of the PHREEQC model to simulate the coupling of the Hg reactions in transport conditions, and demonstrated its usefulness when applied to field situations.

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this research was to apply model checking, using a symbolic model checker, to Predicate Transition nets (PrT nets). A PrT net is a formal model of information flow that allows system properties to be modeled and analyzed. The aim of this thesis was to use the modeling and analysis power of PrT nets to provide a mechanism by which a system model can be verified. The Symbolic Model Verifier (SMV) was the model checker chosen for this thesis, and in order to verify the PrT net model of a system, the model was translated into the SMV input language. A software tool was implemented that translates a PrT net into the SMV language, thus enabling the process of model checking. The system includes two parts: the PrT net editor, where the representation of a system can be edited, and the translator, which converts the PrT net into an SMV program.
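The kind of verification a PrT-net-to-SMV translation enables can be illustrated with an explicit-state sketch: a tiny place/transition net whose reachable markings are enumerated by breadth-first search, with a token-conservation invariant checked over every reachable state. SMV would explore this state space symbolically; the net below and its property are hypothetical examples, not from the thesis.

```python
from collections import deque

# A tiny cyclic place/transition net: each transition is a pair of
# (tokens consumed, tokens produced) keyed by place name.
transitions = {
    "t1": ({"p1": 1}, {"p2": 1}),
    "t2": ({"p2": 1}, {"p3": 1}),
    "t3": ({"p3": 1}, {"p1": 1}),
}

def enabled(marking, consume):
    return all(marking.get(p, 0) >= k for p, k in consume.items())

def fire(marking, consume, produce):
    m = dict(marking)
    for p, k in consume.items():
        m[p] -= k
    for p, k in produce.items():
        m[p] = m.get(p, 0) + k
    return m

def reachable(initial):
    """Breadth-first search over the marking graph of the net."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions.values():
            if enabled(m, consume):
                nxt = fire(m, consume, produce)
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(nxt)
    return seen

states = reachable({"p1": 1, "p2": 0, "p3": 0})
```

Here the single token cycles p1 → p2 → p3 → p1, so exactly three markings are reachable and every one of them holds exactly one token, the invariant a model checker would verify as a safety property.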
