803 results for Dynamic information Systems


Relevance: 90.00%

Abstract:

The purpose of this study was to develop, explicate, and validate a comprehensive model to more effectively assess community injury prevention needs, plan and target efforts, identify potential interventions, and provide a framework for an outcome-based evaluation of the effectiveness of interventions. A systems model approach was developed to conceptualize the major components of inputs, efforts, outcomes, and feedback within a community setting. Profiling of multiple data sources demonstrated a community feedback mechanism that increased awareness of priority issues and elicited support from traditional as well as non-traditional injury prevention partners. Injury countermeasures including education, enforcement, engineering, and economic incentives were presented for their potential synergistic effect on the knowledge, attitudes, and behaviors of a targeted population. Levels of outcome data were classified into ultimate, intermediate, and immediate indicators to assist in determining the effectiveness of intervention efforts. A collaboration between business and health care succeeded in securing access to emergency-department-level injury data for monitoring the impact of community interventions. Evaluation of injury events and preventive efforts within the context of a dynamic community systems environment was applied to a study community, with examples detailing actual profiling and trending of injuries. The resulting model of community injury prevention was validated using a community focus group, community injury prevention coordinators, and national injury prevention experts.

Relevance: 90.00%

Abstract:

Today, databases have become an integral part of information systems. In the past two decades, we have seen different database systems being developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining and knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches. However, no single, generally accepted methodology has emerged in academia or industry that provides ubiquitous intelligent data access from heterogeneous, autonomous, distributed information sources.

This thesis describes a heterogeneous database system being developed at the High-performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty of resolving semantic heterogeneity, that is, of identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of the thesis work include: (i) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system, utilizing the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii) a methodology for semantic heterogeneity resolution that investigates the extents of the meta-data constructs of component schemas; this methodology is shown to be correct, complete, and unambiguous; (iii) a semi-automated technique for identifying semantic relations, which is the basis of semantic knowledge for integration and querying, using shared ontologies for context mediation; (iv) resolutions for schematic conflicts and a language for defining global views from a set of component Sem-ODM schemas; (v) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process, which acts as the interface between the integration and query processing modules; (vi) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii) a framework for intelligent computing and communication on the Internet applying the concepts of our work.
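
The ontology-mediated matching in contribution (iii) can be caricatured in a few lines: two component schemas are related wherever their attributes map to the same shared-ontology concept. The schemas, attribute names, and concept labels below are invented for illustration; they are not from the HPDRC system.

```python
# Hypothetical sketch of identifying semantic relations between two
# component schemas via a shared ontology (names are illustrative).

# Shared ontology: local attribute name -> canonical concept.
ONTOLOGY = {
    "emp_name": "person.name",
    "employee": "person.name",
    "salary": "compensation.amount",
    "pay": "compensation.amount",
    "dept": "organization.unit",
}

def semantic_relations(schema_a, schema_b):
    """Return attribute pairs from two component schemas that map to the
    same ontology concept, i.e. candidate integration/join points."""
    relations = []
    for a in schema_a:
        for b in schema_b:
            ca, cb = ONTOLOGY.get(a), ONTOLOGY.get(b)
            if ca is not None and ca == cb:
                relations.append((a, b, ca))
    return relations

print(semantic_relations(["emp_name", "salary"], ["employee", "pay", "dept"]))
# -> [('emp_name', 'employee', 'person.name'), ('salary', 'pay', 'compensation.amount')]
```

The point of the sketch is only the mediation pattern: neither schema needs to know the other's vocabulary, since both are matched against the shared ontology.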

Relevance: 90.00%

Abstract:

Online learning systems (OLS) have taken center stage for corporations and educational institutions as a competitive tool in the knowledge economy. The satisfaction construct has received extensive coverage in the information systems literature as an indicator of effectiveness, but has been criticized for lack of validity; the value construct, by contrast, has been largely ignored, although it has a long history in psychology, sociology, and behavioral science. The purpose of this dissertation is to investigate the value and satisfaction constructs in the context of OLS, and the relationship between them as perceived by learners, as an indication of OLS effectiveness.

First, a qualitative phase was employed to gather OLS values from learners' focus groups, followed by a pilot phase to refine a proposed instrument, and a main phase to validate the survey. Responses were received from 75 students in four focus groups, 141 in the pilot, and 207 in the main survey. Extensive data cleaning and exploratory factor analysis were performed to identify factors of learners' perceived value and satisfaction with OLS. Value-satisfaction grids and the Learners' Value Index of Satisfaction (LeVIS) were then developed as benchmarking tools for OLS. Moreover, Multicriteria Decision Analysis (MCDA) techniques were employed to impute value from satisfaction scores in order to reduce survey response time.

The results yielded four satisfaction factors and four value factors with high reliability (Cronbach's α). Value and satisfaction were found to have low linear and nonlinear correlations, indicating that they are two distinct, uncorrelated constructs; this is consistent with the literature. The value-satisfaction grids and the LeVIS index indicated relatively high effectiveness for technology and support characteristics, relatively low effectiveness for professor characteristics, and average effectiveness for course and learner characteristics.

The main contributions of this study include identifying, defining, and articulating the relationship between the value and satisfaction constructs as an assessment of users' implied IS effectiveness, as well as assessing the accuracy of MCDA procedures in predicting value scores, thus halving the size of the survey questionnaire.
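
The reliability statistic the abstract reports, Cronbach's α, has a simple closed form: α = k/(k−1) · (1 − Σ item variances / variance of the summed score). A minimal sketch follows; the response matrix is made up for illustration and is not the dissertation's data.

```python
# Cronbach's alpha for one factor: rows are respondents, columns are items.
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(rows):
    k = len(rows[0])                              # number of items
    items = list(zip(*rows))                      # column-wise item scores
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(r) for r in rows])  # variance of summed scores
    return (k / (k - 1)) * (1 - item_var / total_var)

responses = [[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]]  # illustrative data
print(round(cronbach_alpha(responses), 3))  # -> 0.975
```

Values near 1 indicate that the items move together, i.e. the factor is internally consistent, which is the sense in which the abstract reports "high reliability".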

Relevance: 90.00%

Abstract:

Access control (AC) limits access to the resources of a system to authorized entities only. Given that information systems today are increasingly interconnected, AC is extremely important. Implementing an AC service is a complicated task, and the requirements for an AC service vary considerably. Accordingly, the design of an AC service should be flexible and extensible in order to save development effort and time. Unfortunately, with conventional object-oriented techniques, when an extension has not been anticipated at design time, the modification it requires is often invasive. Invasive changes destroy design modularity, further deteriorate design extensibility, and, even worse, reduce product reliability.

A concern is crosscutting if it spans multiple object-oriented classes. Invasive changes were identified as stemming from the crosscutting nature of most unplanned extensions. To overcome this problem, an aspect-oriented design approach for AC services was proposed, as aspect-oriented techniques can effectively encapsulate crosscutting concerns. The proposed approach was applied to develop an AC framework supporting the role-based access control model. In the framework, the core role-based access control mechanism is given in an object-oriented design, while each extension is captured as an aspect. The resulting framework is well modularized, flexible, and, most importantly, supports noninvasive adaptation.

In addition, a process to formalize the aspect-oriented design was described, with the purpose of providing high assurance for AC services. Object-Z was used to specify the static structure, and Predicate/Transition nets were used to model the dynamic behavior; Object-Z was extended to facilitate specification in an aspect-oriented style. The process of formal modeling helps designers deepen their understanding of the design and thus detect problems. Furthermore, the specification can be mathematically verified, providing confidence that the design is correct. An example illustrated that the model was ready for formal analysis.
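
The core idea of "core mechanism as object-oriented design, extension as aspect" can be sketched in plain Python: a role-based access control class is written once, and an auditing concern is woven in afterwards without touching the class body. This is a loose imitation of aspect weaving, not the dissertation's framework; all names are invented.

```python
# Core RBAC mechanism, object-oriented, unaware of any extensions.
class RBAC:
    def __init__(self):
        self.role_perms = {}   # role -> set of permissions
        self.user_roles = {}   # user -> set of roles

    def grant(self, role, perm):
        self.role_perms.setdefault(role, set()).add(perm)

    def assign(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def check(self, user, perm):
        return any(perm in self.role_perms.get(r, set())
                   for r in self.user_roles.get(user, ()))

# "Aspect": an audit concern woven around the join point RBAC.check.
audit_log = []

def audit(fn):
    def advice(self, user, perm):
        result = fn(self, user, perm)
        audit_log.append((user, perm, result))  # after-advice: record decision
        return result
    return advice

RBAC.check = audit(RBAC.check)   # weaving step; the class body is untouched

ac = RBAC()
ac.grant("editor", "write")
ac.assign("alice", "editor")
print(ac.check("alice", "write"), ac.check("bob", "write"))  # -> True False
# audit_log is now [("alice", "write", True), ("bob", "write", False)]
```

The adaptation is noninvasive in the sense the abstract means: adding or removing the audit concern never edits the RBAC class itself.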

Relevance: 90.00%

Abstract:

The nation's freeway systems are becoming increasingly congested, and a major contributor to freeway congestion is traffic incidents. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary reduction in roadway capacity; they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic away from incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict. In addition, incident duration is affected by many contributing factors, and determining and understanding these factors can help identify and develop better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, yet with limited success.

This dissertation research attempts to improve on these previous efforts by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid traffic diversion decisions during an ongoing incident. Multiple data mining techniques were applied and evaluated: multiple linear regression and a decision tree based method for the offline models, and a rule-based method and the tree algorithm M5P for the online models.

The results show that the models can in general achieve high prediction accuracy, within acceptable intervals of the actual durations. The research also identifies new contributing factors that were not examined in past studies. As part of the research effort, software was developed to implement the models in the existing software system of FDOT District 4 for actual application.
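
The "online" rule-based idea reduces to mapping incident attributes to a predicted duration through mined if-then rules. The toy rules and minute values below are invented for illustration; the actual rules were mined from the FDOT District 4 incident database.

```python
# Toy rule-based predictor of incident duration (minutes).
# Rules and thresholds are hypothetical, not the dissertation's mined rules.
def predict_duration(incident):
    """Return a predicted incident duration in minutes from its attributes."""
    if incident["type"] == "disabled_vehicle":
        return 20                          # typically cleared quickly
    if incident["type"] == "crash":
        if incident["lanes_blocked"] >= 2 or incident["injuries"]:
            return 90                      # severe crashes clear slowly
        return 45
    return 30                              # default for other incident types

print(predict_duration({"type": "crash", "lanes_blocked": 1, "injuries": False}))  # -> 45
```

A real rule set would be induced from historical data (e.g. by a decision-tree learner) rather than hand-written, but the deployed artifact has exactly this shape: cheap attribute tests that can run in real time when an incident is first reported.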

Relevance: 90.00%

Abstract:

The Bahamas is a small island nation dealing with a shortage of freshwater. All of the country's freshwater is contained in shallow lens aquifers that are recharged solely by rainfall. The country has been struggling to meet water demand through a combination of over-pumping of aquifers, transporting water by barge between islands, and desalinating seawater. In recent decades, new development on New Providence, where the capital city of Nassau is located, has created a large area of impervious surfaces and thereby a substantial amount of runoff, with the result that several of the aquifers are no longer being recharged. A geodatabase was assembled to assess and estimate the quantity of runoff from these impervious surfaces, and potential recharge locations were identified using a combination of Geographic Information Systems (GIS) and remote sensing. The study showed that runoff from impervious surfaces on New Providence represents a large freshwater resource that could potentially be used to recharge the island's lens aquifers.
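
The kind of estimate such a GIS analysis supports is, at its simplest, annual runoff volume = runoff coefficient × rainfall depth × impervious area. The coefficient, area, and rainfall figures below are illustrative assumptions, not the study's measured values.

```python
# Back-of-the-envelope impervious-surface runoff estimate.
# All input numbers are hypothetical, for illustration only.
def annual_runoff_m3(area_km2, rainfall_mm, runoff_coeff=0.9):
    """Annual runoff volume in cubic meters from an impervious area."""
    area_m2 = area_km2 * 1e6        # km^2 -> m^2
    depth_m = rainfall_mm / 1000.0  # mm/yr -> m/yr
    return runoff_coeff * depth_m * area_m2

# e.g. 50 km^2 of impervious surface under 1400 mm/yr of rainfall
print(f"{annual_runoff_m3(50, 1400):.3e} m^3/yr")  # -> 6.300e+07 m^3/yr
```

In the actual workflow, the impervious area term would come from classified satellite imagery and parcel data in the geodatabase rather than a single number, but the volume arithmetic is the same.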

Relevance: 90.00%

Abstract:

Although more than 7,000 properties use lodging yield management systems (LYMSs), practitioners and researchers alike have found it difficult to measure their success. Considerable research was performed in the 1980s to develop success measures for information systems in general. In this work, the author develops success measures specifically for LYMSs.

Relevance: 90.00%

Abstract:

Many systems and applications continuously produce events. These events record the status of a system and trace its behaviors. By examining them, system administrators can check for potential problems; if the temporal dynamics of the systems are investigated further, underlying patterns can be discovered. The uncovered knowledge can be leveraged to predict future system behaviors or to mitigate potential risks, and administrators can use the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more and more useful. Despite recent advances in data mining, however, its application to system event mining is still at a rudimentary stage: most work still focuses on episode mining or frequent pattern discovery. These methods cannot provide a brief yet comprehensible summary that reveals the valuable information from a high-level perspective, and they offer little actionable knowledge to help system administrators better manage their systems. To make better use of the recorded events, more practical techniques are required. From the perspective of data mining, three correlated directions are considered helpful for system management: (1) providing concise yet comprehensive summaries of the running status of the systems; (2) making the systems more intelligent and autonomous; and (3) effectively detecting abnormal system behaviors. Thanks to the richness of the event logs, all of these directions can be pursued in a data-driven manner; in this way, the robustness of the systems can be enhanced and the goal of autonomous management approached.

This dissertation focuses on these directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed: event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are presented to demonstrate the effectiveness of the corresponding solutions.
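
Of the three directions, streaming anomaly detection has the simplest illustrative form: flag an event count that deviates by more than a few standard deviations from a sliding window of recent counts. The window size, threshold, and data below are invented for illustration and are not the dissertation's method or experiments.

```python
# Minimal streaming anomaly detector: online z-score over a sliding window.
from collections import deque
from statistics import mean, stdev

def stream_anomalies(counts, window=5, z_thresh=3.0):
    """Yield indices whose value deviates > z_thresh sigma from the window."""
    recent = deque(maxlen=window)
    for i, x in enumerate(counts):
        if len(recent) == window:
            m, s = mean(recent), stdev(recent)
            if s > 0 and abs(x - m) / s > z_thresh:
                yield i          # anomalous event count
        recent.append(x)         # the window slides regardless

events_per_min = [12, 11, 13, 12, 12, 11, 95, 12, 13]  # synthetic log rates
print(list(stream_anomalies(events_per_min)))  # -> [6]
```

The detector is single-pass and constant-memory, which is the property that matters for event streams; a production version would also handle seasonality and window contamination by the anomaly itself.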

Relevance: 90.00%

Abstract:

The research described here is supported by the award made by the RCUK Digital Economy programme to the dot.rural Digital Economy Research Hub; award reference: EP/G066051/1/.

Relevance: 90.00%

Abstract:

This chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems together with enterprise 2.0. ERP is characterized as achieving efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. It is claimed that enterprise 2.0 can support flexible business process management and thereby incorporate informal and less structured interactions. A traditional view, however, is that efficiency and flexibility are incompatible objectives, pursued separately in different organizational environments. Thus an ERP system, with the primary objective of improving efficiency, and an enterprise 2.0 system, with the primary aim of improving flexibility, may represent a contradiction, and adopting both simultaneously may carry a high risk of failure. The chapter uses case study analysis to investigate the combined use of ERP and enterprise 2.0 in a single enterprise aiming to improve both efficiency and flexibility in its operations. It provides an in-depth analysis of the combination of ERP and enterprise 2.0 based on socio-technical information systems management theory, and it summarizes the benefits of combining ERP systems with enterprise 2.0 and how the combination could contribute to a new generation of business management that unites formal and informal mechanisms. For example, the multiple sites or informal communities of an enterprise could collaborate efficiently on a common platform with a certain level of standardization, while retaining the flexibility to react with agility to internal and external events.

Relevance: 90.00%

Abstract:

This paper examines how the introduction and use of a new information system affects, and is affected by, the values of a diverse professional workforce, using the example of lecture capture systems in a university. Its contribution is to combine two concepts taken from actor-network theory, namely accumulation and inscription, with an integrated framework of diversity management. A model is developed of accumulation cycles in lecture capture usage, involving multiple interacting actants: the broader environment, management commitment to diversity, work group characteristics, individual practices, and the affordances of the technology. Using this model, alternative future inscriptions can be identified - an optimal one, which enhances professional values as a result of a virtuous accumulation cycle, or a sub-optimal one, resulting from a vicious cycle. The paper identifies diversity management as an important influence on whether professional values are enhanced, modified, or destroyed.

Relevance: 90.00%

Abstract:

The protection of cyberspace has become one of the highest security priorities of governments worldwide, and the EU is no exception, given its rapidly developing cyber security policy. Since the 1990s, three broad areas of policy interest have emerged: cyber-crime, critical information infrastructures, and cyber-defence. One of the main trends cutting across these areas is the importance that the private sector has come to assume within them. In the area of critical information infrastructure protection in particular, the private sector is seen as a key stakeholder, given that it currently operates most of the infrastructures in this area. As a result of this operative capacity, the private sector has come to be understood as the expert in network and information systems (NIS) security, whose knowledge is crucial for the regulation of the field. Adopting a Regulatory Capitalism framework, complemented by insights from Network Governance, we can identify the shifting role of the private sector in this field: from a victim in need of protection in the first phase, to a commercial actor bearing responsibility for ensuring network resilience in the second, to an active policy shaper in the third, participating in the regulation of NIS by providing technical expertise. By drawing on these frameworks, we can better understand how private actors are involved in shaping regulatory responses, as well as why they have been incorporated into these regulatory networks.

Relevance: 90.00%

Abstract:

Biological macromolecules can rearrange their interdomain orientations when binding to various partners, and interdomain dynamics serve as a molecular mechanism to guide the transitions between orientations. Our understanding of interdomain dynamics is limited, however, because a useful description of interdomain motions requires an estimate of the probabilities of interdomain conformations, which increases the complexity of the problem.

Staphylococcal protein A (SpA) has five tandem protein-binding domains and four interdomain linkers. The domains enable Staphylococcus aureus to evade the host immune system by binding to multiple host proteins, including antibodies. Here, I present a study of the interdomain motions of two adjacent domains in SpA. NMR spin relaxation experiments identified a six-residue flexible interdomain linker and interdomain motions. To quantify the anisotropy of the distribution of interdomain orientations, we measured residual dipolar couplings (RDCs) from the two domains under multiple alignments. The N-terminal domain was directly aligned by a lanthanide ion and not influenced by interdomain motions, so it acted as a reference frame to achieve motional decoupling. We also applied de novo methods to extract spatial dynamic information from the RDCs and to represent the interdomain motions as a continuous distribution on the 3D rotational space. Significant anisotropy was observed in the distribution, indicating that the motion populates some interdomain orientations more than others. Statistical thermodynamic analysis of the observed orientational distribution suggests that it is among the energetically most favorable orientational distributions for binding to antibodies. Thus, the affinity is enhanced by a pre-posed distribution of interdomain orientations while the flexibility required for function is maintained.
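
The RDC observable that makes this decoupling possible averages the internuclear orientation over the motional distribution; in its standard form (symbols per common NMR convention, not taken from this abstract):

```latex
D_{ij} \;=\; D_{ij}^{\max}\,\left\langle \frac{3\cos^{2}\theta_{ij} - 1}{2} \right\rangle
```

where $\theta_{ij}$ is the angle between the internuclear vector of spins $i$ and $j$ and the magnetic field, and the angular brackets denote the ensemble average over the orientational distribution. It is precisely this averaging that makes RDCs measured under multiple alignments informative about the anisotropy of interdomain motion: an isotropic distribution averages the term toward zero, while a biased distribution leaves a measurable residual coupling.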

The protocol described above can be applied to other biological systems. The protein calmodulin and the RNA trans-activation response element (TAR) also undergo extensive interdomain motions with relatively small intradomain dynamics. Their interdomain motions were studied with our method using published RDC data, and our results were in general consistent with the literature. The differences may stem from previous studies' use of physical models, which contain assumptions about the potential energy and thus introduce non-experimental information into the interpretations.