972 results for Transaction level modeling


Relevance: 30.00%

Abstract:

We proposed and tested a multilevel model, underpinned by empowerment theory, that examines the processes linking high-performance work systems (HPWS) and performance outcomes at the individual and organizational levels of analysis. Data were obtained from 37 branches of 2 banking institutions in Ghana. Results of hierarchical regression analysis revealed that branch-level HPWS relates to empowerment climate. Additionally, results of hierarchical linear modeling that examined the hypothesized cross-level relationships revealed 3 salient findings. First, experienced HPWS and empowerment climate partially mediate the influence of branch-level HPWS on psychological empowerment. Second, psychological empowerment partially mediates the influence of empowerment climate and experienced HPWS on service performance. Third, service orientation moderates the psychological empowerment-service performance relationship such that the relationship is stronger for those high rather than low in service orientation. Last, ordinary least squares regression results revealed that branch-level HPWS influences branch-level market performance through cross-level and individual-level influences on service performance that emerges at the branch level as aggregated service performance. © 2011 American Psychological Association.
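
A minimal sketch of the cross-level modeling approach described above, using a linear mixed-effects model from statsmodels. The column names (service_perf, experienced_hpws, psych_emp, branch) are hypothetical placeholders, not the study's variables, and the specification is illustrative rather than the authors' exact model.

```python
# Illustrative multilevel (mixed-effects) model: individual-level predictors
# with a random intercept for branch. Column names are hypothetical.
import statsmodels.formula.api as smf

def fit_multilevel(df):
    model = smf.mixedlm(
        "service_perf ~ experienced_hpws + psych_emp",  # fixed effects
        data=df,
        groups=df["branch"],                            # branch-level grouping
    )
    return model.fit()
```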

Relevance: 30.00%

Abstract:

Sol-gel-synthesized bioactive glasses may be formed via a hydrolysis-condensation reaction, with silica introduced in the form of tetraethyl orthosilicate (TEOS) and calcium typically added in the form of calcium nitrate. The synthesis reaction proceeds in an aqueous environment; the resultant gel is dried before stabilization by heat treatment. These materials, being amorphous, are complex at the level of their atomic-scale structure, but their bulk properties may only be properly understood on the basis of that structural insight. Thus, a full understanding of their structure-property relationship may only be achieved through the application of a coherent suite of leading-edge experimental probes, coupled with the cogent use of advanced computer simulation methods. Using as an exemplar a calcia-silica sol-gel glass of the kind developed by Larry Hench, to whose memory this paper is dedicated, we illustrate the successful use of high-energy X-ray and neutron scattering (diffraction) methods, magic-angle spinning solid-state NMR, and molecular dynamics simulation as components of a powerful methodology for the study of amorphous materials.

Relevance: 30.00%

Abstract:

Ecological models have often been used to answer questions that are in the limelight of recent research, such as the possible effects of climate change. The methodology of tactical models is a very useful tool compared with complex models that require a relatively large set of input parameters. In this study, a theoretical strategic model (TEGM) was adapted to field data on the basis of a 24-year monitoring database of phytoplankton in the Danube River at the station of Göd, Hungary (at river kilometer 1669, hereafter referred to as "rkm"). The Danubian Phytoplankton Growth Model (DPGM) is able to describe the seasonal dynamics of phytoplankton biomass (mg L−1) based on daily temperature, but takes the availability of light into consideration as well. In order to improve fitting, the 24-year database was split into two parts according to environmental conditions: the period 1979–1990 had a higher level of nutrient excess compared with that of 1991–2002. The authors assume that, in these two periods, phytoplankton responded to temperature in two different ways; thus two submodels were developed, DPGM-sA and DPGM-sB. Observed and simulated data correlated quite well. Findings suggest that a linear temperature rise brings drastic change to phytoplankton only in the case of high nutrient load, and it is mostly realized through an increase in yearly total biomass.
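
To make the temperature-driven dynamics concrete, the sketch below steps a toy logistic growth equation forward in daily increments, with the growth rate peaking at an assumed optimum temperature and scaled by relative light availability. The functional forms and parameters are assumptions for illustration, not the DPGM equations.

```python
import numpy as np

def simulate_biomass(daily_temp, daily_light, b0=0.05, k=20.0,
                     r_max=0.6, t_opt=22.0, sigma=8.0):
    """Toy daily phytoplankton biomass series (mg/L); illustrative only."""
    biomass = [b0]
    for temp, light in zip(daily_temp, daily_light):
        b = biomass[-1]
        # Gaussian temperature response scaled by light availability (0..1)
        r = r_max * np.exp(-((temp - t_opt) ** 2) / (2 * sigma ** 2)) * light
        biomass.append(max(b + r * b * (1.0 - b / k), 0.0))
    return np.array(biomass[1:])
```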

Relevance: 30.00%

Abstract:

This study evaluated the relative fit of both Finn's (1989) Participation-Identification model and Wehlage, Rutter, Smith, Lesko, and Fernandez's (1989) School Membership model of high school completion to a sample of 4,597 eighth graders taken from the National Education Longitudinal Study of 1988 (NELS:88), utilizing structural equation modeling techniques. This study found support for the importance of educational engagement as a factor in understanding academic achievement. The Participation-Identification model fit particularly well when applied to the samples of high school completers, dropouts (both overall and White dropouts), and African-American students. This study also confirmed the contribution of school environmental factors (i.e., size, diversity of economic and ethnic status among students) and family resources (i.e., availability of learning resources in the home and parent educational level) to students' educational engagement. Based on these findings, school social workers will need to be more attentive to utilizing macro-level interventions (i.e., community organization, interagency coordination) to achieve the organizational restructuring needed to address future challenges. The support found for the Participation-Identification model supports a shift in school social workers' attention from reactive attempts to improve the affective-interpersonal lives of students to proactive attention to their academic lives. The model concentrates school social work practice on the central mission of schools, which is educational engagement. School social workers guided by this model would be encouraged to seek changes in school policies and organization that would facilitate educational engagement.
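
As a rough illustration of the SEM approach, the sketch below specifies and fits a small latent-variable model with the third-party semopy package. The indicator and outcome names (partic1, identif1, gpa, and so on) are hypothetical placeholders, not NELS:88 items, and the specification is not the study's actual Participation-Identification model.

```python
# Illustrative SEM fit with semopy; variable names are placeholders.
import pandas as pd
from semopy import Model, calc_stats

MODEL_DESC = """
participation =~ partic1 + partic2 + partic3
identification =~ identif1 + identif2
identification ~ participation
gpa ~ identification
"""

def fit_sem(df: pd.DataFrame):
    model = Model(MODEL_DESC)
    model.fit(df)
    return model.inspect(), calc_stats(model)  # estimates and fit indices
```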

Relevance: 30.00%

Abstract:

A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and the methodology is helpful for studying other distributed and concurrent systems as well. However, providing such a methodology is a challenge because of agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach that verifies models at different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate the synchronous communication between nets, and they naturally capture the dynamic architecture configuration and agent mobility of mobile agent systems. Component properties are verified based on transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the nets involved. Based on the formalism and the analysis method, this researcher formally modeled and analyzed a software architecture of mobile agent systems, and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency, and safety of the medical information processing system. From successfully modeling and analyzing the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool for modeling mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool. The hierarchical analysis method not only reduces the complexity of the analysis, but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, this system demonstrates the high flexibility, efficiency, and low cost of mobile agent technologies.
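
A toy place/transition net, in which an agent token moving between "location" places stands in for agent mobility, gives a feel for the kind of net-based modeling described above. It is a deliberately simplified stand-in for the two-layer PrT-net formalism with dynamic channels, not that formalism itself.

```python
from collections import Counter

class Net:
    """Minimal place/transition net: markings are multisets of tokens."""
    def __init__(self, marking):
        self.marking = Counter(marking)
        self.transitions = {}

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (Counter(inputs), Counter(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        self.marking -= inputs
        self.marking += outputs

# Example: an agent migrating from host A to host B over an open channel.
net = Net({"agent_at_A": 1, "channel_open": 1})
net.add_transition("migrate", ["agent_at_A", "channel_open"],
                   ["agent_at_B", "channel_open"])
net.fire("migrate")
```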

Relevance: 30.00%

Abstract:

This dissertation establishes the foundation for a new 3-D visual interface integrating Magnetic Resonance Imaging (MRI) with Diffusion Tensor Imaging (DTI). The need for such an interface is critical for understanding brain dynamics and for providing more accurate diagnosis of key brain dysfunctions in terms of neuronal connectivity. This work involved two research fronts: (1) the development of new image processing and visualization techniques in order to accurately establish the relational positioning of neuronal fiber tracts and key landmarks in 3-D brain atlases, and (2) the obligation to address the computational requirements such that the processing time is within the practical bounds of clinical settings. The system was evaluated using data from thirty patients and volunteers with the Brain Institute at Miami Children's Hospital. Innovative visualization mechanisms allow, for the first time, white matter fiber tracts to be displayed alongside key anatomical structures within accurately registered 3-D semi-transparent images of the brain. The segmentation algorithm is based on the calculation of mathematically tuned thresholds and region-detection modules. The uniqueness of the algorithm is in its ability to perform fast and accurate segmentation of the ventricles. In contrast to the manual selection of the ventricles, which averaged over 12 minutes, the segmentation algorithm averaged less than 10 seconds in its execution. The registration algorithm searches and compares MR with DT images of the same subject, where derived correlation measures quantify the resulting accuracy. Overall, the images were 27% more correlated after registration, while registration, interpolation, and re-slicing of the images across all given dimensions together took an average of only 1.5 seconds. This interface was fully embedded into a fiber-tracking software system in order to establish an optimal research environment. This highly integrated 3-D visualization system reached a practical level that makes it ready for clinical deployment.
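
The correlation-based accuracy measure mentioned above can be illustrated with a short numpy routine that computes the Pearson correlation over the voxels of two co-registered volumes. This is only a sketch of the general idea, not the dissertation's actual registration implementation.

```python
import numpy as np

def volume_correlation(vol_a: np.ndarray, vol_b: np.ndarray) -> float:
    """Pearson correlation between two co-registered 3-D volumes."""
    a = vol_a.astype(float).ravel()
    b = vol_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0
```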

Relevance: 30.00%

Abstract:

Unified Modeling Language (UML) is the most comprehensive and widely accepted object-oriented modeling language due to its multi-paradigm modeling capabilities and easy-to-use graphical notations, with strong international organizational support and industrial production-quality tool support. However, there is a lack of precise definition of the semantics of individual UML notations as well as of the relationships among multiple UML models, which often introduces incompleteness and inconsistency problems into UML software designs, especially for complex systems. Furthermore, there is a lack of methodologies to ensure a correct implementation from a given UML design. The purpose of this investigation is to verify and validate software designs in UML, and to provide dependability assurance for the realization of a UML design. In my research, an approach is proposed to transform UML diagrams into a semantic domain, which is a formal component-based framework. The framework I proposed consists of components and interactions through message passing, which are modeled by two-layer algebraic high-level nets and transformation rules, respectively. In the transformation approach, class diagrams, state machine diagrams, and activity diagrams are transformed into component models, and transformation rules are extracted from interaction diagrams. By applying transformation rules to component models, a (sub)system model of one or more scenarios can be constructed. Various techniques, such as model checking and Petri net analysis, can be adopted to check whether UML designs are complete or consistent. A new component called the property parser was developed and merged into the tool SAM Parser, which realizes (sub)system models automatically. The property parser generates and weaves runtime monitoring code into system implementations automatically for dependability assurance. The framework in this investigation is creative and flexible since it can not only be used to verify and validate UML designs but also provides an approach to build models for various scenarios. As a result of my research, several kinds of previously ignored behavioral inconsistencies can be detected.
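
The flavor of the transformation can be suggested with a small sketch in which each UML state-machine transition becomes a net transition whose input place is the source state and whose output place is the target state. The data structures are hypothetical simplifications, not the framework's algebraic high-level nets.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UmlTransition:
    source: str
    target: str
    event: str

@dataclass(frozen=True)
class NetTransition:
    name: str
    input_places: tuple
    output_places: tuple

def transform_state_machine(transitions):
    """Map UML state-machine transitions onto net-style transitions."""
    return [NetTransition(t.event, (t.source,), (t.target,))
            for t in transitions]

# Example: a two-state machine, Idle --request--> Busy.
component_model = transform_state_machine(
    [UmlTransition("Idle", "Busy", "request")])
```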

Relevance: 30.00%

Abstract:

A two-phase three-dimensional computational model of an intermediate-temperature (120–190°C) proton exchange membrane (PEM) fuel cell is presented. This represents the first attempt to model PEM fuel cells employing intermediate-temperature membranes, in this case, phosphoric acid-doped polybenzimidazole (PBI). To date, mathematical modeling of PEM fuel cells has been restricted to low-temperature operation, especially to those employing Nafion® membranes, while research on PBI as an intermediate-temperature membrane has been solely at the experimental level. This work is an advancement in the state of the art of both these fields of research. With a growing trend toward higher-temperature operation of PEM fuel cells, mathematical modeling of such systems is necessary to help hasten the development of the technology and highlight areas where research should be focused. This mathematical model accounted for all the major transport and polarization processes occurring inside the fuel cell, including the two-phase phenomenon of gas dissolution in the polymer electrolyte. Results were presented for polarization performance, flux distributions, concentration variations in both the gaseous and aqueous phases, and temperature variations for various heat management strategies. The model predictions matched well with published experimental data and were self-consistent. The major finding of this research was that, due to the transport limitations imposed by the use of phosphoric acid as a doping agent, namely the low solubility and diffusivity of dissolved gases and anion adsorption onto catalyst sites, the catalyst utilization is very low (∼1–2%). Significant cost savings were predicted with the use of advanced catalyst deposition techniques that would greatly reduce the eventual thickness of the catalyst layer and subsequently improve catalyst utilization. The model also predicted that an increase in power output on the order of 50% could be expected if alternative doping agents to phosphoric acid can be found that afford better transport properties of dissolved gases and reduced anion adsorption onto catalyst sites, while maintaining stability and conductivity at elevated temperatures.
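
A back-of-the-envelope polarization curve, combining an open-circuit voltage with Tafel activation, ohmic, and concentration losses, illustrates the kind of polarization performance output mentioned above. The parameter values are placeholders and the expression is a textbook-style sketch, not the two-phase, three-dimensional model itself.

```python
import numpy as np

def cell_voltage(i, e_ocv=1.0, i0=1e-4, b=0.06, r_ohm=0.15, i_lim=1.5):
    """Approximate cell voltage (V) at current density i (A/cm^2)."""
    i = np.clip(i, 1e-8, i_lim * 0.999)
    eta_act = b * np.log10(i / i0)              # Tafel activation loss
    eta_ohm = i * r_ohm                         # ohmic loss
    eta_conc = -0.05 * np.log(1.0 - i / i_lim)  # concentration loss
    return e_ocv - eta_act - eta_ohm - eta_conc

current = np.linspace(0.01, 1.4, 50)
voltage = cell_voltage(current)
```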

Relevance: 30.00%

Abstract:

A Mediation System utilizes a central security mediator that is primarily concerned with securing the internal structure of the Mediation System. The current problem is that clients are unable to have authority and administrative rights over the security of their data during a transaction. In addition, such a Mediation System is unsuited to presenting a metric that measures the level of confidence in security access rights. This creates a black-box perspective from the client towards the Mediation System and also gives no assurance to clients that they have assigned the proper security access rights reflecting the current environment of the Mediation System. This dissertation presents a Collaborative Information System (CIS) that uses an agent-based approach to encapsulate collaborative information and security policies within the Mediation System, under the control of the clients of the Mediation System. In conjunction with the CIS's Stochastic Security Framework, it is possible to take a probabilistic approach to modeling the security access rights of a collaboration transaction. The research results showed that it is feasible to construct a Mediation System utilizing agents and stochastic equations to establish an environment in which the client has authority and administrative control in assigning security access rights to their collaborative data, and which can establish a metric that measures the level of confidence in these assigned rights.
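
The dissertation's stochastic equations are not reproduced here, but one simple way a confidence metric over assigned access rights could be computed is to treat each right's probability of being appropriate for the current environment as independent and report the joint probability. The sketch below is an illustrative assumption, not the actual Stochastic Security Framework.

```python
def transaction_confidence(right_probabilities: dict) -> float:
    """Joint probability that every assigned access right is appropriate."""
    confidence = 1.0
    for right, p in right_probabilities.items():
        if not 0.0 <= p <= 1.0:
            raise ValueError(f"probability for {right} must be in [0, 1]")
        confidence *= p   # independence assumption, for illustration only
    return confidence

# Example: three rights a client assigned to its collaborative data.
print(transaction_confidence({"read": 0.99, "write": 0.9, "share": 0.8}))
```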

Relevance: 30.00%

Abstract:

In topographically flat wetlands, where a shallow water table and conductive soil may develop as a result of wet and dry seasons, the connection between surface water and groundwater is not only present but is perhaps the key factor dominating the magnitude and direction of water flux. Due to their complex characteristics, modeling water flow through wetlands using more realistic process formulations (integrated surface-groundwater flow and vegetative resistance) is a real necessity. This dissertation focused on developing an integrated surface-subsurface hydrologic simulation numerical model by programming and testing the coupling of the USGS MODFLOW-2005 Groundwater Flow Process (GWF) package (USGS, 2005) with the 2D surface water routing model FLO-2D (O'Brien et al., 1993). The coupling included the necessary procedures to numerically integrate and verify both models as a single computational software system, hereafter referred to as WHIMFLO-2D (Wetlands Hydrology Integrated Model). An improved physical formulation of flow resistance through vegetation in shallow waters, based on the concept of drag force, was also implemented for the simulation of floodplains, while the use of classical methods (e.g., Manning, Chezy, Darcy-Weisbach) to calculate flow resistance was maintained for the canals and deeper waters. A preliminary demonstration exercise of WHIMFLO-2D at an existing field site was developed for the Loxahatchee Impoundment Landscape Assessment (LILA), an 80-acre area located at the Arthur R. Marshall Loxahatchee National Wildlife Refuge in Boynton Beach, Florida. After applying a number of simplifying assumptions, the results illustrated the ability of the model to simulate the hydrology of a wetland. In this illustrative case, a comparison between measured and simulated stage levels showed an average error of 0.31% with a maximum error of 2.8%. Comparison of measured and simulated groundwater head levels showed an average error of 0.18% with a maximum of 2.9%. The coupling of the FLO-2D model with the MODFLOW-2005 model and the incorporation of the dynamic effect of flow resistance due to vegetation in the new modeling tool WHIMFLO-2D are important contributions to the field of numerical modeling of hydrologic flow in wetlands.
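
The drag-force concept behind the vegetation resistance term is commonly written as a force per unit volume of the form F = 0.5 * rho * Cd * a * |U| * U, with a the frontal plant area per unit volume; for emergent vegetation the equivalent friction slope is then Cd * a * U * |U| / (2g). The sketch below uses placeholder parameter values and may differ from WHIMFLO-2D's exact formulation.

```python
def vegetation_drag(velocity, cd=1.0, a=0.5, rho=1000.0):
    """Drag force per unit volume (N/m^3) opposing flow at velocity (m/s)."""
    return 0.5 * rho * cd * a * abs(velocity) * velocity

def friction_slope(velocity, cd=1.0, a=0.5, g=9.81):
    """Equivalent friction slope; depth cancels for emergent vegetation."""
    return cd * a * velocity * abs(velocity) / (2.0 * g)

print(vegetation_drag(0.2), friction_slope(0.2))
```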

Relevance: 30.00%

Abstract:

Breast cancer is the second leading cause of cancer death in United States women, estimated to be diagnosed in 1 out of 8 women in their lifetime. Screening mammography detects breast cancer in its pre-clinical stages, when treatment strategies have the greatest chance of success, and is currently the only population-wide prevention method proven to reduce the morbidity and mortality associated with breast cancer. Research has shown that the majority of women are not screened annually, with estimates ranging from 6% to 30% of eligible women receiving all available annual mammograms over a 5-year or greater time frame. Health behavior theorists believe that perception of risk/susceptibility to a disease influences preventive health behavior, in this case, screening mammography. The purpose of this dissertation is to examine the association between breast cancer risk perception and repeat screening mammography using a structural equation modeling (SEM) framework. A series of SEM multivariate regressions were conducted using self-reported, nationally representative data from the 2005 National Health Interview Survey. Interaction contrasts were tested to measure the potential moderating effects of variables that have been shown to be predictive of mammography use (physician recommendation, economic barriers, structural barriers, race/ethnicity) on the association between breast cancer risk perception and repeat mammography, while controlling for the covariates of age, income, region, nativity, and educational level. Of the variables tested for moderation, results of the SEM analyses identify physician recommendation as the only moderator of the relationship between risk perception and repeat mammography, and thus the potentially most effective point of intervention to increase mammography screening and decrease the morbidity and mortality associated with breast cancer. These findings expand the role of the physician from recommendation to one of attenuating the effect of risk perception and increasing repeat screening. The long-range application of this research is the use of the SEM methodology to identify specific points of intervention most likely to increase preventive behavior in population-wide research, allowing for the most effective use of intervention funds.
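
The moderation effect identified above (physician recommendation conditioning the risk perception-repeat mammography association) can be approximated outside the SEM framework by a logistic regression with an interaction term. The column names below are hypothetical and the model is a simplified sketch, not the dissertation's SEM specification.

```python
import statsmodels.formula.api as smf

def test_moderation(df):
    # The coefficient on risk_perception:physician_rec tests moderation.
    model = smf.logit(
        "repeat_mammo ~ risk_perception * physician_rec + age + income + education",
        data=df,
    )
    return model.fit().summary()
```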

Relevance: 30.00%

Abstract:

This research involves the design, development, and theoretical demonstration of models resulting in integrated misbehavior resolution protocols for ad hoc networked devices. Game theory was used to analyze strategic interaction among independent devices with conflicting interests. Packet forwarding at the routing layer of autonomous ad hoc networks was investigated. Unlike existing reputation-based or payment schemes, this model is based on repeated interactions. To enforce cooperation, a community enforcement mechanism was used, whereby selfish nodes that drop packets are punished not only by the victim but also by all nodes in the network. Then, a stochastic packet forwarding game strategy was introduced. Our solution relaxed the uniform traffic demand that was pervasive in other works. To address the concerns of imperfect private monitoring in resource-aware ad hoc networks, a belief-free equilibrium scheme was developed that reduces the impact of noise on cooperation. This scheme also eliminated the need to infer the private history of other nodes. Moreover, it simplified the computation of an optimal strategy. The belief-free approach reduced the node overhead and was easily tractable, making the system operation feasible. Motivated by the versatile nature of evolutionary game theory, the assumption of a rational node is relaxed, leading to the development of a framework for mitigating routing selfishness and misbehavior in multi-hop networks. This is accomplished by setting nodes to play a fixed strategy rather than independently choosing a rational strategy. A range of simulations was carried out that showed improved cooperation between selfish nodes when compared with earlier results. In the absence of a central trusted entity, many security mechanisms and privacy protections require cooperation among ad hoc nodes to protect a network from malicious attacks. Therefore, using game theory and evolutionary game theory, a mathematical framework has been developed that explores trust mechanisms to achieve security in the network. This framework is one of the first steps towards the synthesis of an integrated solution demonstrating that security depends solely on the initial trust level that nodes have for each other.
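
A toy simulation gives a feel for community enforcement in a repeated packet-forwarding game: once any relay is observed dropping a packet, every node refuses to forward for a fixed number of rounds. This is a deliberately simplified stand-in for the stochastic and belief-free strategies described above, with made-up parameters.

```python
import random

def simulate(rounds=200, drop_prob=0.1, punish_rounds=5, seed=0):
    rng = random.Random(seed)
    punishment_left = 0
    delivered = 0
    for _ in range(rounds):
        if punishment_left > 0:
            punishment_left -= 1            # community-wide punishment phase
            continue
        if rng.random() < drop_prob:        # the relay behaves selfishly
            punishment_left = punish_rounds
        else:
            delivered += 1                  # packet forwarded successfully
    return delivered / rounds

print(simulate())
```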

Relevance: 30.00%

Abstract:

Small-bodied fishes constitute an important assemblage in many wetlands. In wetlands that dry periodically except for small permanent waterbodies, these fishes are quick to respond to change and can undergo large fluctuations in numbers and biomasses. An important aspect of landscapes that are mixtures of marsh and permanent waterbodies is that high rates of biomass production occur in the marshes during flooding phases, while the permanent waterbodies serve as refuges for many biotic components during the dry phases. The temporal and spatial dynamics of the small fishes are ecologically important, as these fishes provide a crucial food base for higher trophic levels, such as wading birds. We develop a simple model that is analytically tractable, describing the main processes of the spatio-temporal dynamics of a population of small-bodied fish in a seasonal wetland environment consisting of marsh and permanent waterbodies. The population expands into newly flooded areas during the wet season and contracts during declining water levels in the dry season. If the marsh dries completely during these times (a drydown), the fish need refuge in permanent waterbodies. At least three new and general conclusions arise from the model: (1) there is an optimal rate at which fish should expand into a newly flooded area to maximize population production; (2) there is also a fluctuation amplitude of water level that maximizes fish production; and (3) there is an upper limit on the number of fish that can reach a permanent waterbody during a drydown, no matter how large the marsh surface area is that drains into the waterbody. Because water levels can be manipulated in many wetlands, it is useful to have an understanding of the role of these fluctuations.
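
A minimal difference-equation sketch in the spirit of the model: fish biomass grows logistically over the flooded marsh area during the wet season and is truncated to a refuge capacity during a drydown. The functional forms and parameters are illustrative assumptions, not the authors' analytical model.

```python
def step_population(fish, flooded_area, refuge_capacity,
                    growth_rate=0.05, carrying_density=10.0):
    """One time step of a toy marsh fish population."""
    if flooded_area > 0:
        capacity = carrying_density * flooded_area
        fish += growth_rate * fish * (1.0 - fish / capacity)
    else:
        fish = min(fish, refuge_capacity)   # drydown: only refuge survivors
    return fish
```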

Relevance: 30.00%

Abstract:

The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method involves running several iterations of all-or-nothing capacity-restraint assignment with an adjustment of travel time to reflect delays encountered in the associated iteration. The iterative link time adjustment process is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes and the input capacities are given in hourly volumes, it is necessary to convert the hourly capacities to their daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by a factor called the peak-to-daily ratio, referred to as CONFAC in FSUTMS. The ratio is computed as the highest hourly volume of a day divided by the corresponding total daily volume. While several studies have indicated that CONFAC is a decreasing function of the level of congestion, a constant value is used for each facility type in the current version of FSUTMS. This ignores the different congestion level associated with each roadway and is believed to be one of the culprits of traffic assignment errors. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment method. The assignment results based on constant and variable CONFACs were then compared against the ground counts for three selected networks. It was found that the accuracy of the two assignments was not significantly different, and that the hypothesized improvement in assignment results from the variable CONFAC model was not empirically evident. It was recognized that many other factors beyond the scope and control of this study could contribute to this finding. It was recommended that further studies focus on the use of the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
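
The conversion and volume-delay calculation described above can be written in a few lines: daily capacity is the hourly capacity divided by CONFAC, and the BPR function inflates free-flow time by the volume-to-capacity ratio. The CONFAC default and the alpha/beta values below are customary illustrative numbers, not FSUTMS' calibrated parameters.

```python
def bpr_travel_time(free_flow_time, daily_volume, hourly_capacity,
                    confac=0.095, alpha=0.15, beta=4.0):
    """BPR volume-delay with CONFAC-based hourly-to-daily capacity conversion."""
    daily_capacity = hourly_capacity / confac
    vc_ratio = daily_volume / daily_capacity
    return free_flow_time * (1.0 + alpha * vc_ratio ** beta)

# Example: 5-minute free-flow link, 18,000 veh/day, 1,800 veh/hr capacity.
print(bpr_travel_time(5.0, 18_000, 1_800))
```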

Relevance: 30.00%

Abstract:

The integrated project delivery (IPD) method has recently emerged as an alternative to traditional delivery methods. It has the potential to overcome inefficiencies of traditional delivery methods by enhancing collaboration among project participants. Information and communication technology (ICT) facilitates IPD through effective management, processing, and communication of information within and among organizations. While the benefits of IPD, and the role of ICT in realizing them, have been generally acknowledged, the US public construction sector is very slow in adopting IPD. The reasons are a lack of experience with and an inadequate understanding of IPD among public owners, as confirmed by the results of the questionnaire survey conducted under this research study. The public construction sector should be aware of the value of IPD and should know the essentials for effective implementation of IPD principles; especially, it should be cognizant of the opportunities offered by advancements in ICT to realize this. In order to address this need, an IPD Readiness Assessment Model (IPD-RAM) was developed in this research study. The model was designed with the goal of determining the IPD readiness of a public owner organization by considering selected IPD principles and the ICT levels at which project functions are carried out. Subsequent analysis led to the identification of possible improvements in ICTs that have the potential to increase IPD readiness scores. Termed gap identification, this process was used to formulate improvement strategies. The model was applied to six Florida International University (FIU) construction projects (case studies). The results showed that the IPD readiness of the organization was considerably low and that several project functions could be improved by using higher and/or advanced-level ICT tools and methods. Feedback from a focus group comprising FIU officials and an independent group of experts was received at various stages of this research and was utilized during development and implementation of the model. Focus group input was also helpful for validation of the model and its results. It is hoped that the model developed will be useful to construction owner organizations in assessing their IPD readiness and identifying appropriate ICT improvement strategies.
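
A simple scoring sketch in the spirit of the readiness assessment: the readiness score averages the ICT level (say, on a 1-5 scale) at which each project function is performed, and gap identification flags functions below a target level. The scale, weights, and function names are illustrative assumptions, not the actual IPD-RAM.

```python
def assess_readiness(function_ict_levels: dict, target_level: float = 4.0):
    """Return an overall readiness score and the functions needing improvement."""
    score = sum(function_ict_levels.values()) / len(function_ict_levels)
    gaps = {f: lvl for f, lvl in function_ict_levels.items()
            if lvl < target_level}
    return score, gaps

score, gaps = assess_readiness({
    "design_coordination": 3, "cost_estimating": 2,
    "schedule_management": 4, "document_sharing": 5,
})
print(score, gaps)
```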