10 results for Pattern Taxonomy Model

in Digital Commons at Florida International University


Relevance:

80.00%

Publisher:

Abstract:

Extensive portions of the southern Everglades are characterized by series of elongated, raised peat ridges and tree islands oriented parallel to the predominant flow direction, separated by intervening sloughs. Tall herbs or woody species are associated with higher elevations and shorter emergent or floating species are associated with lower elevations. The organic soils in this “Ridge-and-Slough” landscape have been stable over millennia in many locations, but degrade over decades under altered hydrologic conditions. We examined soil, pore water, and leaf phosphorus (P) and nitrogen (N) distributions in six Ridge and Slough communities in Shark Slough, Everglades National Park. We found P enrichment to increase and N to decrease monotonically along a gradient from the most persistently flooded sloughs to rarely flooded ridge environments, with the most dramatic change associated with the transition from marsh to forest. Leaf N:P ratios indicated that the marsh communities were strongly P-limited, while data from several forest types suggested either N-limitation or co-limitation by N and P. Ground water stage in forests exhibited a daytime decrease and partial nighttime recovery during periods of surface exposure. The recovery phase suggested re-supply from adjacent flooded marshes or the underlying aquifer, and a strong hydrologic connection between ridge and slough. We therefore developed a simple steady-state model to explore a mechanism by which a phosphorus conveyor belt driven by both evapotranspiration and the regional flow gradient can contribute to the characteristic Ridge and Slough pattern. The model demonstrated that evapotranspiration sinks at higher elevations can draw in low concentration marsh waters, raising local soil and water P concentrations. 
Focusing of flow and nutrients at the evapotranspiration zone is not strong enough to overcome the regional gradient entirely, allowing the nutrient to spread downstream and creating an elongated concentration plume in the direction of flow. Our analyses suggest that autogenic processes involving the effects of initially small differences in topography, via their interactions with hydrology and nutrient availability, can produce persistent physiographic patterns in the organic sediments of the Everglades.
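The conveyor-belt mechanism can be sketched numerically. The following 1-D water and phosphorus mass balance is a toy illustration only, with hypothetical parameter values not taken from the study: an evapotranspiration (ET) sink removes water but not phosphorus, so P concentrates at the ridge and the regional flow carries the enriched water downstream, producing an elongated plume.

```python
import numpy as np

# 1-D steady-flow sketch: regional flow carries water with dissolved P;
# an ET sink at the "ridge" removes water but not P, so P concentrates
# there and is flushed downstream. All parameter values are hypothetical.
n = 100                      # grid cells along the flow direction
q_in = 1.0                   # regional water inflow per step (arbitrary units)
c_in = 0.1                   # inflow P concentration
et = np.zeros(n)
et[40:45] = 0.15             # ET sink located at the ridge (cells 40-44)

q = np.empty(n)              # water flux leaving each cell
c = np.empty(n)              # P concentration in each cell
q_prev, load_prev = q_in, q_in * c_in
for i in range(n):
    q[i] = max(q_prev - et[i], 1e-9)   # ET removes water...
    load = load_prev                    # ...but the P load is conserved
    c[i] = load / q[i]                  # so concentration rises at the sink
    q_prev, load_prev = q[i], load

peak = int(np.argmax(c))
print(peak, round(c[peak] / c_in, 2))
```

Concentration rises through the ET zone and stays elevated for every cell downstream of it, which is the elongated-plume behavior the model in the abstract describes.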

Relevance:

30.00%

Publisher:

Abstract:

With advances in science and technology, computing and business intelligence (BI) systems are steadily becoming more complex, with an increasing variety of heterogeneous software and hardware components. They are thus becoming progressively more difficult to monitor, manage, and maintain. Traditional approaches to system management have largely relied on domain experts through a knowledge acquisition process that translates domain knowledge into operating rules and policies. This is widely acknowledged to be a cumbersome, labor-intensive, and error-prone process that also struggles to keep pace with rapidly changing environments. In addition, many traditional business systems deliver primarily pre-defined historic metrics for long-term strategic or mid-term tactical analysis, and lack the flexibility to support evolving metrics or data collection for real-time operational analysis. There is thus a pressing need for automatic and efficient approaches to monitoring and managing complex computing and BI systems. To realize the goal of autonomic management and enable self-management capabilities, we propose to mine the historical log data generated by computing and BI systems and automatically extract actionable patterns from it. This dissertation focuses on the development of data mining techniques to extract actionable patterns from various types of log data in computing and BI systems. Four key problems are studied: log data categorization and event summarization, leading indicator identification, pattern prioritization by exploring link structures, and tensor models for three-way log data. Case studies and comprehensive experiments on real application scenarios and datasets demonstrate the effectiveness of the proposed approaches.
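As an illustration of the first problem, log categorization and event summarization is commonly approached by reducing raw log lines to message templates (variable fields masked out) and counting occurrences. The sketch below uses hypothetical log lines and masking rules, not the dissertation's actual algorithms:

```python
import re
from collections import Counter

# Hypothetical log lines; the masking rules below are illustrative only.
logs = [
    "2024-01-01 10:00:01 disk /dev/sda1 usage 91%",
    "2024-01-01 10:00:05 disk /dev/sdb1 usage 47%",
    "2024-01-01 10:00:09 connection from 10.0.0.7 refused",
    "2024-01-01 10:00:12 connection from 10.0.0.9 refused",
    "2024-01-01 10:00:20 disk /dev/sda1 usage 92%",
]

def template(line: str) -> str:
    line = re.sub(r"^\S+ \S+ ", "", line)          # drop the timestamp
    line = re.sub(r"/dev/\S+", "<dev>", line)      # mask device names
    line = re.sub(r"\d+(\.\d+)*%?", "<*>", line)   # mask numbers, IPs, percents
    return line

# Identical templates are grouped and counted to summarize the event stream.
summary = Counter(template(l) for l in logs)
for tmpl, count in summary.most_common():
    print(count, tmpl)
```

Five raw lines collapse into two event categories, which is the kind of compression that makes downstream pattern mining on log data tractable.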

Relevance:

30.00%

Publisher:

Abstract:

In essay 1 we develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shaped) pattern of volatility. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant, and specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise factor component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying; furthermore, the covariance of the informational and noise components is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure developed in the first essay. The resulting seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
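The variance decomposition in essay 2 can be illustrated numerically. The sketch below uses simulated data (not the essays' series) and assumes the standard identity Var(r) = Var(info) + Var(noise) + 2 Cov(info, noise), with the informativeness ratio defined as in the abstract:

```python
import numpy as np

# Simulated return series (hypothetical): returns = information + noise,
# where the noise is deliberately correlated with the information component,
# matching the abstract's relaxation of the zero-covariance restriction.
rng = np.random.default_rng(0)
n = 10_000
info = rng.normal(0.0, 0.02, n)                  # information-arrival component
noise = 0.3 * info + rng.normal(0.0, 0.01, n)    # noise correlated with info
returns = info + noise

var_total = returns.var()                 # population variance (ddof=0)
var_info = info.var()
var_noise = noise.var()
cov = np.cov(info, noise, ddof=0)[0, 1]

# Total variance decomposes as Var(info) + Var(noise) + 2*Cov(info, noise).
assert abs(var_total - (var_info + var_noise + 2 * cov)) < 1e-9

# Informativeness: share of total return variance driven by information.
informativeness = var_info / var_total
print(round(informativeness, 3))
```

With a positive information-noise covariance the ratio lies strictly between 0 and 1; in the essays both components are time-varying, so the ratio would be computed period by period rather than once over the full sample.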

Relevance:

30.00%

Publisher:

Abstract:

Conceptual database design is an unusually difficult and error-prone task for novice designers. This study examined how two training approaches, rule-based and pattern-based, might improve performance on database design tasks. A rule-based approach prescribes a sequence of rules for modeling conceptual constructs and the actions to be taken at various stages of developing a conceptual model. A pattern-based approach presents data modeling structures that occur frequently in practice and prescribes guidelines for recognizing and using these structures. This study describes the conceptual framework, experimental design, and results of a laboratory experiment that employed novice designers to compare the effectiveness of the two training approaches (between-subjects) at three levels of task complexity (within-subjects). Results indicate an interaction effect between treatment and task complexity: the rule-based approach was significantly better in the low-complexity and high-complexity cases, while there was no statistical difference in the medium-complexity case. Designer performance fell significantly as complexity increased. Overall, although the rule-based approach was not significantly superior to the pattern-based approach in all instances, it outperformed the pattern-based approach at two of the three complexity levels. The primary contributions of the study are (1) the operationalization of the complexity construct to a degree not addressed in previous studies; (2) the development of a pattern-based instructional approach to database design; and (3) the finding that the effectiveness of a particular training approach may depend on the complexity of the task.

Relevance:

30.00%

Publisher:

Abstract:

Software engineering researchers are challenged to provide increasingly powerful levels of abstraction to address the rising complexity inherent in software solutions. One development paradigm that places models at the forefront of the development process is Model-Driven Software Development (MDSD). MDSD treats models as first-class artifacts, extending engineers' ability to use concepts from the problem domain of discourse to specify appropriate solutions. A key component of MDSD is domain-specific modeling languages (DSMLs), languages with focused expressiveness that target a specific taxonomy of problems. The de facto approach is to first transform DSML models into an intermediate artifact in a high-level language (HLL), e.g., Java or C++, and then execute the resulting code.

Our research group has developed a class of DSMLs, referred to as interpreted DSMLs (i-DSMLs), in which models are directly interpreted by a specialized execution engine whose semantics are based on model changes at runtime. This execution engine uses a layered architecture and is referred to as a domain-specific virtual machine (DSVM). As the domain-specific model being executed descends the layers of the DSVM, the semantic gap between the user-defined model and the services provided by the underlying infrastructure is closed. The focus of this research is the synthesis engine, the layer in the DSVM that transforms i-DSML models into executable scripts for the next lower layer to process.

The appeal of an i-DSML is constrained because it possesses unique semantics contained within the DSVM. Existing DSVMs for i-DSMLs exhibit tight coupling between the implicit model of execution and the semantics of the domain, making it difficult to develop DSVMs for new i-DSMLs without a significant investment of resources. At the onset of this research, only one i-DSML had been created using the aforementioned approach, for the user-centric communication domain: the Communication Modeling Language (CML), whose DSVM is the Communication Virtual Machine (CVM). A major problem with the CVM's synthesis engine is that the domain-specific knowledge (DSK) and the model of execution (MoE) are tightly interwoven; consequently, subsequent DSVMs would need to be developed from inception, with no reuse of expertise.

This dissertation investigates how to decouple the DSK from the MoE and subsequently produce a generic model of execution (GMoE) from the remaining application logic. This GMoE can be reused to instantiate synthesis engines for DSVMs in other domains. The generalized approach to developing the model synthesis component of i-DSML interpreters utilizes a reusable framework loosely coupled to the DSK through swappable framework extensions. The approach involves first creating an i-DSML and its DSVM for a second domain, demand-side smartgrid (microgrid) energy management, and designing the synthesis engine so that the DSK and MoE are easily decoupled. To validate the utility of the approach, synthesis engines are instantiated using the GMoE and the DSKs of the two aforementioned domains, and an empirical study is performed to support our claim of reduced development effort.
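The decoupling idea can be sketched as a generic execution loop that consults a swappable domain-knowledge extension (a strategy-style design). All class and method names below are illustrative stand-ins, not the CVM's actual API:

```python
from abc import ABC, abstractmethod

class DomainKnowledge(ABC):
    """Swappable DSK extension consulted by the generic engine (illustrative)."""
    @abstractmethod
    def synthesize(self, model_change: str) -> str:
        """Translate one model change into a script for the layer below."""

class CommunicationDSK(DomainKnowledge):
    """Stand-in DSK for the user-centric communication domain."""
    def synthesize(self, model_change: str) -> str:
        return f"comm-script({model_change})"

class MicrogridDSK(DomainKnowledge):
    """Stand-in DSK for the microgrid energy-management domain."""
    def synthesize(self, model_change: str) -> str:
        return f"grid-script({model_change})"

class SynthesisEngine:
    """Generic model of execution: the same interpretation loop for any domain."""
    def __init__(self, dsk: DomainKnowledge):
        self.dsk = dsk
    def run(self, model_changes):
        # The loop (GMoE) is domain-agnostic; only the DSK varies.
        return [self.dsk.synthesize(ch) for ch in model_changes]

# The same GMoE is instantiated for two domains by swapping the DSK.
comm_engine = SynthesisEngine(CommunicationDSK())
grid_engine = SynthesisEngine(MicrogridDSK())
print(comm_engine.run(["addConnection"]))
print(grid_engine.run(["shedLoad"]))
```

The point of the sketch is that adding a DSVM for a new domain only requires writing a new `DomainKnowledge` extension, while the execution loop is reused unchanged.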

Relevance:

30.00%

Publisher:

Abstract:

Hydrogeologic variables controlling groundwater exchange with inflow and flow-through lakes were simulated using a three-dimensional numerical model (MODFLOW) to investigate and quantify spatial patterns of lake-bed seepage and hydraulic head distributions in the porous medium surrounding the lakes. The total annual inflow and outflow were also calculated as a percentage of lake volume for the flow-through lake simulations. The general exponential decline of seepage rates with distance offshore was best demonstrated at lower anisotropy ratios (Kh/Kv = 1, 10), with increasing deviation from the exponential pattern as anisotropy was increased to 100 and 1000. 2-D vertical-section models constructed for comparison with the 3-D models showed that groundwater heads and seepage rates were higher in the 3-D simulations. Adding low-conductivity lake sediments decreased seepage rates nearshore and increased seepage rates offshore in inflow lakes, and increased the area of groundwater in-seepage on the beds of flow-through lakes. Introducing heterogeneity into the medium lowered the water table and decreased seepage rates nearshore, and increased seepage rates offshore in inflow lakes. A laterally restricted aquifer located on the downgradient side of a flow-through lake increased the area of out-seepage. Recharge rate, lake depth, and lake-bed slope had relatively little effect on the spatial patterns of seepage rates and on groundwater exchange with lakes.
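The exponential decline described above can be written as q(x) = q0 · exp(-x/L), where q0 is the nearshore seepage rate and L a characteristic decay length. The sketch below uses purely hypothetical parameter values to illustrate the shape; it is not calibrated to the study's simulations:

```python
import math

# Hypothetical parameters: nearshore seepage rate and decay length.
q0 = 5.0    # nearshore seepage rate (arbitrary units)
L = 20.0    # decay length in metres (assumed)

def seepage(x_m: float) -> float:
    """Seepage rate at distance x_m offshore, q(x) = q0 * exp(-x/L)."""
    return q0 * math.exp(-x_m / L)

# Seepage falls by a factor of e every L metres offshore.
for x in (0, 10, 20, 40, 80):
    print(x, round(seepage(x), 3))
```

At high anisotropy ratios (Kh/Kv of 100 or 1000) the study found the profile deviating increasingly from this simple exponential form.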


Relevance:

30.00%

Publisher:

Abstract:

El Niño and the Southern Oscillation (ENSO) is a cycle that is initiated in the equatorial Pacific Ocean and is recognized on interannual timescales by oscillating patterns in tropical Pacific sea surface temperatures (SSTs) and atmospheric circulations. Using correlation and regression analysis of datasets that include SSTs and other interdependent variables, including precipitation, surface winds, and sea level pressure, this research seeks to quantify recent changes in ENSO behavior. Specifically, the amplitude, frequency of occurrence, and spatial characteristics (i.e., events with maximum amplitude in the Central Pacific versus the Eastern Pacific) are investigated. The research is motivated by the question: are the statistics of ENSO changing due to increasing greenhouse gas concentrations? Our hypothesis is that the present-day changes in the amplitude, frequency, and spatial characteristics of ENSO are determined by the natural variability of the ocean-atmosphere climate system, not by the observed changes in radiative forcing due to changing greenhouse gas concentrations. Statistical analysis, including correlation and regression analysis, is performed on observational ocean and atmospheric datasets available from the National Oceanic and Atmospheric Administration (NOAA) and the National Center for Atmospheric Research (NCAR), and on coupled model simulations from the Coupled Model Intercomparison Project, phase 5 (CMIP5). Datasets are analyzed with a particular focus on ENSO over the last thirty years. Understanding the observed changes in the ENSO phenomenon over recent decades has worldwide significance. ENSO is the largest climate signal on timescales of 2-7 years and affects billions of people via atmospheric teleconnections that originate in the tropical Pacific. These teleconnections explain why changes in ENSO can lead to climate variations in areas including North and South America, Asia, and Australia. 
For the United States, El Niño events are linked to a decreased number of hurricanes in the Atlantic basin, reduced precipitation in the Pacific Northwest, and increased precipitation throughout the southern United States during winter months. Understanding variability in the amplitude, frequency, and spatial characteristics of ENSO is crucial for decision makers who must adapt where regional ecology and agriculture are affected by ENSO.
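The correlation/regression workflow described above can be sketched in a few lines. The data here are synthetic stand-ins for an SST anomaly index and a teleconnected precipitation series, not the NOAA/NCAR observational records:

```python
import numpy as np

# Synthetic stand-ins: thirty years of monthly data for a Nino-3.4-style
# SST anomaly index and a regional precipitation response correlated with it.
rng = np.random.default_rng(1)
months = 360
sst_anom = rng.normal(0, 1, months)                  # SST anomaly index
precip = 2.0 * sst_anom + rng.normal(0, 1, months)   # teleconnected response

r = np.corrcoef(sst_anom, precip)[0, 1]              # correlation coefficient
slope, intercept = np.polyfit(sst_anom, precip, 1)   # linear regression fit
print(round(r, 2), round(slope, 2))
```

On real datasets the same correlation and regression maps would be computed grid point by grid point against the index, which is how the teleconnection patterns mentioned in the abstract are quantified.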

