20 results for Modeling Geomorphological Processes
in Digital Commons at Florida International University
Abstract:
This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures. Underpinning most inter-organizational procedures are deontic relations, which concern the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor their counterparties' rights; thus controls may not be needed. The challenge lies in cases where trust may not be assumed. In these cases, the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation. The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method takes a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate control requirements that a documentary procedure must satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, Deontic Petri Nets, which combines multiple modal logics and Petri nets for modeling deontic processes; a set of control principles that represent an initial formal theory of the relationships between deontic processes and documentary procedures; and a working prototype that uses model checking to identify fraud potentials in a deontic process and generate control requirements to limit them.
Fourteen scenarios from two well-known international payment procedures (cash in advance and documentary credit) have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
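The Deontic Petri Net formalism and the prototype itself are specific to the dissertation and are not reproduced here. As a rough, hypothetical illustration of the underlying idea (enumerate the reachable outcomes of an exchange, flag states where one party's obligation is fulfilled and the counterparty's is not, and check which exposures a control eliminates), consider this sketch; the scenario and all names are invented:

```python
from itertools import product

# Hypothetical two-party exchange: the buyer may or may not pay,
# the seller may or may not ship.
def reachable_outcomes():
    return {(buyer_pays, seller_ships)
            for buyer_pays, seller_ships in product([True, False], repeat=2)}

def fraud_potentials(outcomes):
    # Exposure arises when exactly one obligation is fulfilled.
    return {o for o in outcomes if o[0] != o[1]}

outcomes = reachable_outcomes()
risks = fraud_potentials(outcomes)
# A cash-in-advance style control forbids shipment before payment,
# eliminating the seller's exposure (pays=False, ships=True).
controlled = {o for o in risks if not (o[1] and not o[0])}
```

A real deontic process model would track obligations changing over time; this toy only shows the state-enumeration-and-flagging step of model checking.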
Abstract:
This dissertation establishes the foundation for a new 3-D visual interface integrating Magnetic Resonance Imaging (MRI) with Diffusion Tensor Imaging (DTI). The need for such an interface is critical for understanding brain dynamics and for providing more accurate diagnosis of key brain dysfunctions in terms of neuronal connectivity. This work involved two research fronts: (1) the development of new image processing and visualization techniques to accurately establish relational positioning of neuronal fiber tracts and key landmarks in 3-D brain atlases, and (2) the need to address the computational requirements so that processing time remains within the practical bounds of clinical settings. The system was evaluated using data from thirty patients and volunteers at the Brain Institute at Miami Children's Hospital. Innovative visualization mechanisms allow, for the first time, white matter fiber tracts to be displayed alongside key anatomical structures within accurately registered 3-D semi-transparent images of the brain. The segmentation algorithm is based on the calculation of mathematically tuned thresholds and region-detection modules. The uniqueness of the algorithm is its ability to perform fast and accurate segmentation of the ventricles. In contrast to manual selection of the ventricles, which averaged over 12 minutes, the segmentation algorithm averaged less than 10 seconds in execution. The registration algorithm searches and compares MR with DT images of the same subject, with derived correlation measures quantifying the resulting accuracy. Overall, the images were 27% more correlated after registration, while registration, interpolation, and re-slicing of the images, performed together in all dimensions, took an average of only 1.5 seconds.
This interface was fully embedded into a fiber-tracking software system in order to establish an optimal research environment. This highly integrated 3-D visualization system reached a practical level that makes it ready for clinical deployment.
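The dissertation's registration pipeline is not reproduced here, but the correlation measure it uses to quantify registration accuracy can be sketched. The following is a minimal Pearson-correlation implementation over two equally shaped volumes; the toy "volume" is invented for illustration:

```python
import numpy as np

def normalized_correlation(a, b):
    """Pearson correlation between two equally shaped image volumes."""
    a = np.ravel(a) - np.mean(a)   # flatten and center each volume
    b = np.ravel(b) - np.mean(b)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

vol = np.arange(27, dtype=float).reshape(3, 3, 3)  # toy 3x3x3 "volume"
```

A registration step would maximize this measure between the MR and DT volumes; "27% more correlated" corresponds to an increase in such a score after alignment.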
Abstract:
A two-phase, three-dimensional computational model of an intermediate-temperature (120--190°C) proton exchange membrane (PEM) fuel cell is presented. This represents the first attempt to model PEM fuel cells employing intermediate-temperature membranes, in this case phosphoric acid-doped polybenzimidazole (PBI). To date, mathematical modeling of PEM fuel cells has been restricted to low-temperature operation, especially to cells employing Nafion® membranes, while research on PBI as an intermediate-temperature membrane has been solely at the experimental level. This work is an advancement of the state of the art in both these fields of research. With a growing trend toward higher-temperature operation of PEM fuel cells, mathematical modeling of such systems is necessary to help hasten the development of the technology and highlight areas where research should be focused. The mathematical model accounted for all the major transport and polarization processes occurring inside the fuel cell, including the two-phase phenomenon of gas dissolution in the polymer electrolyte. Results were presented for polarization performance, flux distributions, concentration variations in both the gaseous and aqueous phases, and temperature variations for various heat management strategies. The model predictions matched published experimental data well and were self-consistent. The major finding of this research was that, due to the transport limitations imposed by the use of phosphoric acid as a doping agent, namely low solubility and diffusivity of dissolved gases and anion adsorption onto catalyst sites, catalyst utilization is very low (∼1--2%). Significant cost savings were predicted with the use of advanced catalyst deposition techniques that would greatly reduce the eventual thickness of the catalyst layer and subsequently improve catalyst utilization.
The model also predicted that an increase in power output on the order of 50% can be expected if alternative doping agents to phosphoric acid can be found that afford better transport properties of dissolved gases, reduce anion adsorption onto catalyst sites, and maintain stability and conductivity at elevated temperatures.
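The dissertation's full two-phase model is far richer than can be shown here. As a hedged sketch of what "polarization performance" means, the following computes an illustrative polarization curve from a textbook-style relation (open-circuit voltage minus Tafel activation loss minus ohmic loss); every parameter value is a placeholder, not a fitted PBI value:

```python
import numpy as np

# Illustrative polarization curve: V = E_ocv - b*log10(i/i0) - R*i.
# All parameter values below are placeholders, not the dissertation's.
E_ocv, b, i0, R = 1.0, 0.1, 1e-4, 0.25   # V, V/decade, A/cm^2, ohm*cm^2

def cell_voltage(i):
    i = np.asarray(i, dtype=float)
    return E_ocv - b * np.log10(i / i0) - R * i

currents = np.linspace(0.01, 1.0, 50)      # current density, A/cm^2
power = currents * cell_voltage(currents)  # areal power density, W/cm^2
```

A full model replaces each lumped loss term with spatially resolved transport equations; concentration (mass-transport) losses, central to the low-catalyst-utilization finding, are omitted from this sketch.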
Abstract:
A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using Lattice Boltzmann methods (LBMs). These methods contrast with other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed, and implementation issues are discussed. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains, and the solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM's flexibility as a solute transport solver. The LBM is then applied to simulate solute transport and fluid flow in porous media traversed by larger conduits. An LBM-based macroscopic flow solver (based on Darcy's law) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D); however, the new LBM-based model retains the ability to solve the inertial flows that are characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches.
The analogy between Fick's second law (the diffusion equation) and the transient ground water flow equation is used to solve for the transient head distribution. An altered-velocity flow solver with a source/sink term is applied to simulate a drawdown curve. Hydraulic parameters such as transmissivity and the storage coefficient are linked with LB parameters. These capabilities complete the LBM's effective treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
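The dissertation's LB models themselves are not reproduced here, but the basic collide-then-stream structure of a lattice Boltzmann solver can be sketched with a minimal D1Q2 diffusion scheme and verified, in the same spirit as the dissertation's benchmarks, against the analytical variance growth 2Dt of a spreading pulse. All parameters are illustrative lattice units:

```python
import numpy as np

# Minimal D1Q2 lattice Boltzmann diffusion solver (lattice units).
# Two populations stream right/left; BGK collision relaxes toward
# f_eq = C/2. Diffusion coefficient: D = tau - 1/2 (standard D1Q2).
N, tau, steps = 201, 1.0, 50
f = np.zeros((2, N))
f[:, N // 2] = 0.5                       # unit-mass pulse at the center
for _ in range(steps):
    C = f.sum(axis=0)                    # concentration = zeroth moment
    feq = 0.5 * C
    f += (feq - f) / tau                 # BGK collision
    f[0] = np.roll(f[0], 1)              # stream right
    f[1] = np.roll(f[1], -1)             # stream left
C = f.sum(axis=0)
x = np.arange(N) - N // 2
variance = float((C * x**2).sum() / C.sum())   # should approach 2*D*steps
```

The lattice is chosen large enough that the pulse never reaches the periodic boundary, so the comparison against the analytical result is clean.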
Abstract:
Rapid advances in electronic communication devices and technologies have resulted in a shift in the way communication applications are developed. These new development strategies provide abstract views of the underlying communication technologies and lead to so-called user-centric communication applications. One user-centric communication (UCC) initiative is the Communication Virtual Machine (CVM) technology, which uses the Communication Modeling Language (CML) for modeling communication services and the CVM for realizing these services. In communication-intensive domains such as telemedicine and disaster management, there is an increasing need for user-centric communication applications that are domain-specific and that support the dynamic coordination of communication services commonly found in collaborative communication scenarios. However, UCC approaches like the CVM offer little support for the dynamic coordination of communication services resulting from inherent dependencies between the individual steps of a collaboration task. Users either have to coordinate communication services manually or rely on a process modeling technique to build customized solutions for services in a specific domain, which are usually costly, rigidly defined, and technology-specific. This dissertation proposes a domain-specific modeling approach to address this problem by extending the CVM technology with communication-specific abstractions of workflow concepts commonly found in business processes. The extension involves (1) the definition of the Workflow Communication Modeling Language (WF-CML), a superset of CML, and (2) the extension of the functionality of CVM to process communication-specific workflows. The definition of WF-CML includes the meta-model and the dynamic semantics for control constructs and concurrency. We also extended the CVM prototype to handle the modeling and realization of WF-CML models.
A comparative study of the proposed approach with other workflow environments validates the claimed benefits of WF-CML and CVM.
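WF-CML's control constructs and semantics are defined in the dissertation and are not reproduced here. As a generic, hypothetical illustration of the coordination problem it addresses (individual steps of a collaboration task with inherent dependencies that must be realized in order), a dependency-driven scheduler can be sketched with a topological sort; the telemedicine task names are invented:

```python
from graphlib import TopologicalSorter

# Hypothetical telemedicine collaboration: each communication step may
# depend on earlier steps; a coordinator realizes them in dependency
# order rather than asking users to sequence them manually.
steps = {
    "send_patient_record": set(),
    "start_video_consult": {"send_patient_record"},
    "share_xray_stream":   {"start_video_consult"},
    "send_summary_report": {"start_video_consult", "share_xray_stream"},
}
order = list(TopologicalSorter(steps).static_order())
```

A real workflow engine would also handle the concurrency and control constructs (branching, loops) that WF-CML's dynamic semantics define; this sketch covers only sequencing.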
Abstract:
Small-bodied fishes constitute an important assemblage in many wetlands. In wetlands that dry periodically except for small permanent waterbodies, these fishes are quick to respond to change and can undergo large fluctuations in numbers and biomasses. An important aspect of landscapes that are mixtures of marsh and permanent waterbodies is that high rates of biomass production occur in the marshes during flooding phases, while the permanent waterbodies serve as refuges for many biotic components during the dry phases. The temporal and spatial dynamics of the small fishes are ecologically important, as these fishes provide a crucial food base for higher trophic levels, such as wading birds. We develop a simple model that is analytically tractable, describing the main processes of the spatio-temporal dynamics of a population of small-bodied fish in a seasonal wetland environment, consisting of marsh and permanent waterbodies. The population expands into newly flooded areas during the wet season and contracts during declining water levels in the dry season. If the marsh dries completely during these times (a drydown), the fish need refuge in permanent waterbodies. At least three new and general conclusions arise from the model: (1) there is an optimal rate at which fish should expand into a newly flooding area to maximize population production; (2) there is also a fluctuation amplitude of water level that maximizes fish production; and (3) there is an upper limit on the number of fish that can reach a permanent waterbody during a drydown, no matter how large the marsh surface area is that drains into the waterbody. Because water levels can be manipulated in many wetlands, it is useful to have an understanding of the role of these fluctuations.
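The paper's analytical model is not reproduced here. As a hedged numerical caricature of conclusion (3), the sketch below grows a population logistically in the flooded marsh during the wet season and then caps survivors by a finite rate at which fish can reach the refuge during the drydown, so that survivors saturate no matter how large the marsh is. All parameter values are invented:

```python
# Hypothetical wet/dry cycle: logistic growth in the flooded marsh,
# then a drydown in which fish reach the refuge at a finite rate,
# capping survivors regardless of marsh size (cf. conclusion 3).
def survivors(marsh_area, growth=0.05, days_wet=180,
              reach_rate=50.0, days_dry=60):
    capacity = 10.0 * marsh_area          # fish the marsh can support
    n = 0.1 * capacity                    # dry-season refuge stock
    for _ in range(days_wet):             # wet-season logistic growth
        n += growth * n * (1 - n / capacity)
    # Drydown: at most reach_rate fish/day can reach the waterbody.
    return min(n, reach_rate * days_dry)

small_marsh, large_marsh = survivors(100.0), survivors(100000.0)
```

In this caricature the cap is simply reach_rate x days_dry; the actual model derives the bound from the movement dynamics of fish relative to the receding water front.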
Abstract:
This paper demonstrates the usefulness of fluorescence techniques for long-term monitoring and assessment of the dynamics (sources, transport, and fate) of chromophoric dissolved organic matter (CDOM) in highly compartmentalized estuarine regions with non-point water sources. Water samples were collected monthly from a total of 73 sampling stations in the Florida Coastal Everglades (FCE) estuaries during 2001 and 2002. Spatial and seasonal variability of CDOM characteristics was investigated for geomorphologically distinct sub-regions within Florida Bay (FB), the Ten Thousand Islands (TTI), and Whitewater Bay (WWB). These variations were observed in both the quantity and quality of CDOM. TOC concentrations in the FCE estuaries were generally higher during the wet season (June–October), reflecting high freshwater loadings from the Everglades in TTI and high primary productivity of marine biomass in FB. Fluorescence parameters suggested that the CDOM in FB is mainly of marine/microbial origin, while for TTI and WWB a terrestrial origin from Everglades marsh plants and mangroves was evident. Variations in CDOM quality seemed mainly controlled by tidal exchange/mixing of Everglades freshwater with Florida Shelf waters, tidally controlled releases of CDOM from fringe mangroves, primary productivity of marine vegetation in FB, and diagenetic processes such as photodegradation (particularly for WWB). The sources and dynamics of CDOM in these subtropical estuaries are complex and influenced by many factors, including hydrology, geomorphology, vegetation cover, land use, and biogeochemical processes. Simple, easy-to-measure, high-sample-throughput fluorescence parameters for surface waters can add valuable information on CDOM dynamics to long-term water quality studies that cannot be obtained from quantitative determinations alone.
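As an illustration of the kind of fluorescence parameter such studies rely on, the sketch below computes a fluorescence index (the ratio of emission intensity at 450 nm to 500 nm under 370 nm excitation, following the original McKnight et al. definition; higher values indicate microbial/marine-derived CDOM, lower values terrestrial). The exact parameters used in this paper may differ, and the sample intensities below are invented:

```python
# Fluorescence index (FI): emission intensity ratio 450/500 nm at
# 370 nm excitation. Wavelengths follow the original McKnight et al.
# definition; the paper's own parameter set may differ.
def fluorescence_index(emission):
    """emission: dict mapping wavelength (nm) -> intensity at ex. 370 nm."""
    return emission[450] / emission[500]

marine_like = fluorescence_index({450: 1.9, 500: 1.0})      # invented data
terrestrial_like = fluorescence_index({450: 1.3, 500: 1.0})  # invented data
```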
Abstract:
The applications of micro-end-milling operations have increased recently. A Micro-End-Milling Operation Guide and Research Tool (MOGART) package has been developed for the study and monitoring of micro-end-milling operations. It includes an analytical cutting force model, neural network-based data mapping and forecasting processes, and genetic algorithm-based optimization routines. MOGART uses neural networks to estimate tool machinability and forecast tool wear from experimental cutting force data, and genetic algorithms with the analytical model to monitor tool wear, breakage, run-out, and cutting conditions from the cutting force profiles. The performance of MOGART was tested on over 800 experimental cases, and very good agreement was observed between the theoretical and experimental results. The package was applied in the micro-end-milling study of the Engineering Prototype Center of the Radio Technology Division of Motorola Inc.
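MOGART's actual neural networks and genetic algorithm routines are not reproduced here. As a hypothetical stand-in for its evolutionary parameter-fitting loop (fitting an analytical force model to observed cutting forces), the sketch below uses a toy elitist evolutionary search on a one-parameter linear force model; the data and model are invented:

```python
import random

# Toy stand-in for GA-based fitting: estimate k in F = k * chip_area
# from observed forces via an elitist mutate-and-select loop. The real
# package uses full GA routines and neural networks.
random.seed(0)
areas  = [0.5, 1.0, 1.5, 2.0]            # hypothetical chip areas
forces = [1.1, 2.0, 3.1, 3.9]            # hypothetical observed forces

def error(k):
    return sum((k * a - f) ** 2 for a, f in zip(areas, forces))

population = [random.uniform(0.0, 5.0) for _ in range(20)]
for _ in range(50):
    population.sort(key=error)           # select the fittest candidates
    parents = population[:5]
    # keep parents (elitism) and add Gaussian-mutated offspring
    population = parents + [p + random.gauss(0, 0.1) for p in parents * 3]
best_k = min(population, key=error)      # least-squares optimum is k = 2
```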
Abstract:
Recently, researchers have begun to investigate the benefits of cross-training teams. It has been hypothesized that cross-training should help improve team processes and team performance (Cannon-Bowers, Salas, Blickensderfer, & Bowers, 1998; Travillian, Volpe, Cannon-Bowers, & Salas, 1993). The current study extends previous research by examining different methods of cross-training (positional clarification and positional modeling) and the impact they have on team process and performance in both more complex and less complex environments. One hundred and thirty-five psychology undergraduates were placed in 45 three-person teams. Participants were randomly assigned to roles within teams. Teams were asked to “fly” a series of missions on a PC-based helicopter flight simulation. Results suggest that cross-training improves team mental model accuracy and similarity. Accuracy of team mental models was found to be a predictor of coordination quality, but similarity of team mental models was not. Neither similarity nor accuracy of team mental models was found to be a predictor of backup behavior (quality and quantity). As expected, both team coordination (quality) and backup behaviors (quantity and quality) were significant predictors of overall team performance. Contrary to expectations, there was no interaction between cross-training and environmental complexity. Results from this study further cross-training research by establishing positional clarification and positional modeling as training strategies for improving team performance.
Abstract:
Chromium (Cr) is a metal of particular environmental concern, owing to its toxicity and widespread occurrence in groundwater, soil, and soil solution. A combination of hydrological, geochemical, and microbiological processes governs the subsurface migration of Cr. Little effort has been devoted to examining how these biogeochemical reactions combine with hydrologic processes to influence Cr migration. This study focused on the complex problem of predicting Cr transport in laboratory column experiments. A 1-D reactive transport model was developed and evaluated against data obtained from laboratory column experiments. A series of dynamic laboratory column experiments was conducted under abiotic and biotic conditions. Cr(III) was injected into columns packed with β-MnO2-coated sand at different initial concentrations, variable flow rates, and two different pore water pH values (3.0 and 4.0). In biotic anaerobic column experiments, Cr(VI) along with lactate was injected into columns packed with quartz sand or β-MnO2-coated sand and the bacterium Shewanella alga Simidu (BrY-MT). A mathematical model was developed that included advection-dispersion equations for the movement of Cr(III), Cr(VI), dissolved oxygen, lactate, and biomass. The model included first-order rate laws governing the adsorption of each Cr species and lactate. The equations for transport and adsorption were coupled with nonlinear equations for rate-limited oxidation-reduction reactions along with dual-Monod kinetic equations. Kinetic batch experiments were conducted to determine the reduction of Cr(VI) by BrY-MT in three different substrates. Results of the column experiments with Cr(III)-containing influent solutions demonstrate that β-MnO2 effectively catalyzes the oxidation of Cr(III) to Cr(VI). For a given influent concentration and pore water velocity, oxidation rates are higher, and hence effluent concentrations of Cr(VI) are greater, at pH 4 than at pH 3.
Reduction of Cr(VI) by BrY-MT was rapid (within one hour) in columns packed with quartz sand, whereas Cr(VI) reduction by BrY-MT was delayed (57 hours) in the presence of β-MnO2-coated sand. BrY-MT grown in BHIB (brain heart infusion broth) reduced the largest amount of Cr(VI) to Cr(III), followed by TSB (tryptic soy broth) and M9 (minimal media). Comparisons of data and model results from the column experiments show that the depths associated with Cr(III) oxidation and transport within sediments of shallow aquatic systems can strongly influence trends in surface water quality. The results of this study suggest that carefully performed laboratory column experiments are a useful tool for determining the biotransformation of redox-sensitive metals, even in the presence of a strong oxidant such as β-MnO2.
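The dissertation's coupled multi-species model is not reproduced here. As a minimal sketch of its transport backbone, the code below solves the 1-D advection-dispersion equation with first-order decay by an explicit upwind finite-difference scheme; all parameter values are placeholders, not the fitted values from the column experiments:

```python
import numpy as np

# Minimal explicit finite-difference sketch of 1-D advection-dispersion
# with first-order decay: dC/dt = D*d2C/dx2 - v*dC/dx - k*C.
# Parameters are placeholders, not the dissertation's fitted values.
nx, dx, dt = 100, 0.01, 0.001            # grid size, cell size, time step
v, D, k = 0.5, 1e-3, 0.1                 # velocity, dispersion, decay
C = np.zeros(nx)
for _ in range(500):
    C[0] = 1.0                           # constant-concentration inlet
    adv = -v * (C[1:-1] - C[:-2]) / dx   # first-order upwind advection
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
    C[1:-1] += dt * (adv + disp - k * C[1:-1])
    C[-1] = C[-2]                        # zero-gradient outlet
```

The full model couples several such equations through nonlinear reaction and dual-Monod terms; the time step here is chosen to satisfy the explicit scheme's stability limits.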
Abstract:
The purpose of the current study was to attempt to model various cognitive and social processes that are believed to lead to false confessions. More specifically, this study manipulated the variables of experimenter expectancy, guilt or innocence of the suspect, and interrogation techniques using the Russano et al. (2005) paradigm. The primary measure of interest was the likelihood of the participant signing the confession statement. By manipulating experimenter expectancy, the current study sought to further explore the social interactions that may occur in the interrogation room. In addition, in past experiments the interrogator has typically been restricted to the use of one or two interrogation techniques. In the present study, interrogators were permitted to select from 15 different interrogation techniques when attempting to solicit a confession from participants. Consistent with Russano et al. (2005), guilty participants (94%) were more likely to confess to the act of cheating than innocent participants (31%). The variable of experimenter expectancy did not affect confession rates, the length of interrogation, or the type of interrogation techniques used. Path analysis revealed that feelings of pressure and the weighing of consequences on the part of the participant were associated with the signing of the confession statement. The findings suggest that the guilt or innocence of the participant, the participant's perceptions of the interrogation situation, and the length of interrogation play a pivotal role in the signing of the confession statement. Further examination of these variables may provide researchers with a better understanding of the relationship between interrogations and confessions.
Abstract:
This study investigated group processes as potential mediators or moderators of positive development outcome and negative reduction intervention response by evaluating the utility of a group measure modified from a widely known measure of group impact found in the group therapy research literature. Four group processes were of primary interest: (1) Group Impact; (2) Facilitator Impact; (3) Skills Impact; and (4) Exploration Impact, as assessed by the Session Evaluation Form (SEF). Outcome measures included the Personally Expressive Activities Questionnaire (PEAQ), the Erikson Psycho-Social Index (EPSI), and the Zill Behavior Items, Behavior Problem Index (ZBI (BPI)). The sample consisted of 121 multi-ethnic participants drawn from four alternative high schools in the Miami-Dade County Public School system. Utilizing a Latent Growth Curve Modeling approach with Structural Equation Modeling (SEM) statistics, preliminary analyses were conducted to evaluate the psychometric properties of the SEF and its role in the mediation or moderation of intervention outcome. Preliminary results revealed evidence of a single higher-order factor representing a "General" global reaction to the program, hypothesized to be a "Positive Group Climate" construct, as opposed to the four distinct group processes that were initially hypothesized to affect outcomes. In the evaluation of its mediating or moderating role, this single "General" global latent factor (the "Positive Group Climate" construct) did not significantly predict treatment response on any of the outcome variables. Nevertheless, the evidence of an underlying "General" global latent factor has important implications for future research on positive youth development programs as well as group therapy research.
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and formally verified using model checking. Second, Petri net models are automatically mined from event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified based on the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method for mining Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoff between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: (1) a post-prediction analysis method to increase coverage while ensuring precision; (2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
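McPatom's partial-order models and model checking are not reproduced here. As a toy sketch of the access pattern behind atomicity violations (one thread's pair of accesses to a shared variable split by another thread's write to it, as in a read-check followed by a stale write), consider the following trace analyzer; the trace format and the restriction to remote writes are simplifications for illustration:

```python
# Toy detector for one atomicity-violation pattern: a thread accesses a
# shared variable twice, and another thread writes that variable in
# between (e.g. read-check ... remote write ... stale local write).
def find_violations(trace):
    """trace: list of (thread, op, var) events; op is 'r' or 'w'."""
    violations = []
    for i, (t1, op1, var) in enumerate(trace):
        # find the same thread's next access to the same variable
        for j in range(i + 1, len(trace)):
            t2, op2, v2 = trace[j]
            if t2 == t1 and v2 == var:
                # flag if a remote write to var occurs strictly between
                if any(t != t1 and v == var and op == 'w'
                       for t, op, v in trace[i + 1:j]):
                    violations.append((i, j))
                break
    return violations

# Thread A reads x, thread B writes x, thread A writes x back.
trace = [("A", "r", "x"), ("B", "w", "x"), ("A", "w", "x")]
```

A full checker also handles the remote-read-between-writes patterns and reasons over all feasible interleavings via the partial order, not just the one observed trace.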
Abstract:
Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate a lane or lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common. Managed lane demand is usually estimated at the assignment step. Therefore, the key to reliably estimating the demand is the utilization of effective assignment modeling processes. Managed lanes are particularly effective when the road is functioning at near-capacity. Therefore, capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring, and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support the effective utilization of DTA to model managed lane operations. Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.
With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in the different stages of modeling and calibrating managed lanes. Extensive and careful processing of demand, traffic, and toll data, as well as proper definition of performance measures, results in a calibrated and stable model that closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.
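The study's DTA framework is not reproduced here. As a hedged sketch of the route choice component that splits demand between a managed lane and the general-purpose lanes, the following uses a standard binary logit model; the utility coefficients and the example travel times and toll are invented, not calibrated values from the study:

```python
import math

# Hypothetical binary logit split between a managed (tolled) lane and
# the general-purpose lanes. Coefficients are illustrative only.
def managed_lane_share(tt_ml, tt_gp, toll, b_time=-0.1, b_cost=-0.4):
    """Travel times in minutes, toll in dollars; returns ML share."""
    u_ml = b_time * tt_ml + b_cost * toll   # managed-lane utility
    u_gp = b_time * tt_gp                   # general-purpose utility
    return math.exp(u_ml) / (math.exp(u_ml) + math.exp(u_gp))

# With a 10-minute saving and a $2 toll, about half choose the lane.
share = managed_lane_share(tt_ml=20.0, tt_gp=30.0, toll=2.0)
```

In a DTA setting, the travel times fed into such a model vary by departure time and iteration, which is what lets the assignment capture near-capacity dynamics that static assignment misses.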