653 results for multi-theoretical
Abstract:
A novel intelligent online demand side management system is proposed for peak load management in low-voltage distribution networks. The method uses low-cost controllers with low-bandwidth two-way communication, installed in customers' premises and at distribution transformers, to manage the peak load while maximising customer satisfaction. A multi-objective decision-making process is proposed to select the load(s) to be delayed or controlled. The efficacy of the proposed control system is verified by simulation of three different feeder types.
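The abstract does not give the decision rule itself; as a minimal sketch of how a weighted multi-objective selection step might trade peak reduction against customer satisfaction (the load attributes, weights, and field names below are invented for illustration):

```python
# Hypothetical multi-objective load-selection sketch: each controllable load is
# scored on peak-reduction benefit versus customer-discomfort cost, and the
# load with the best weighted score is chosen for delay.
def select_load_to_delay(loads, w_power=0.7, w_discomfort=0.3):
    """Return the load whose delay gives the best benefit/discomfort trade-off."""
    def score(load):
        # Higher shed power is better; higher customer discomfort is worse.
        return w_power * load["power_kw"] - w_discomfort * load["discomfort"]
    return max(loads, key=score)

loads = [
    {"name": "water_heater", "power_kw": 3.6, "discomfort": 2.0},
    {"name": "pool_pump",    "power_kw": 1.1, "discomfort": 0.5},
    {"name": "aircon",       "power_kw": 2.4, "discomfort": 8.0},
]
best = select_load_to_delay(loads)
```

Shifting the weights toward discomfort changes which load is delayed, which is the essence of a multi-objective rule rather than a fixed priority list.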
Abstract:
Organizations from every industry sector seek to enhance their business performance and competitiveness through the deployment of contemporary information systems (IS), such as Enterprise Systems (ERP). Investments in ERP are complex and costly, attracting scrutiny and pressure to justify their cost. Thus, IS researchers highlight the need for systematic evaluation of information system success, or impact, which has resulted in the introduction of varied models for evaluating information systems. One of these systematic measurement approaches is the IS-Impact Model introduced by a team of researchers at Queensland University of Technology (QUT) (Gable, Sedera, & Chan, 2008). The IS-Impact Model is conceptualized as a formative, multidimensional index that consists of four dimensions. Gable et al. (2008) define IS-Impact as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (p. 381). The IT Evaluation Research Program (ITE-Program) at QUT has grown the IS-Impact Research Track with the central goal of conducting further studies to enhance and extend the IS-Impact Model. The overall goal of the IS-Impact Research Track at QUT is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable, 2009). To achieve that, the IS-Impact Research Track advocates programmatic research guided by the principles of tenacity, holism, and generalizability through extension research strategies. This study was conducted within the IS-Impact Research Track to further generalize the IS-Impact Model by extending it to the Saudi Arabian context. According to Hofstede (2012), the national culture of Saudi Arabia is significantly different from the Australian national culture, making Saudi Arabia an interesting context for testing the external validity of the IS-Impact Model.
The study re-visits the IS-Impact Model from the ground up. Rather than assume the existing instrument is valid in the new context, or simply assess its validity through quantitative data collection, the study takes a qualitative, inductive approach to re-assessing the necessity and completeness of the existing dimensions and measures. This is done in two phases: an Exploratory Phase and a Confirmatory Phase. The Exploratory Phase addresses the first research question of the study: "Is the IS-Impact Model complete and able to capture the impact of information systems in Saudi Arabian organizations?". The content analysis, used to analyze the Identification Survey data, indicated that two of the 37 measures of the IS-Impact Model are not applicable for the Saudi Arabian context. Moreover, no new measures or dimensions were identified, evidencing the completeness and content validity of the IS-Impact Model. In addition, the Identification Survey data suggested several concepts related to IS-Impact, the most prominent of which was "Computer Network Quality" (CNQ). The literature supported the existence of a theoretical link between IS-Impact and CNQ, with CNQ viewed as an antecedent of IS-Impact. With the primary goal of validating the IS-Impact Model within its extended nomological network, CNQ was introduced into the research model. The Confirmatory Phase addresses the second research question of the study: "Is the Extended IS-Impact Model valid as a hierarchical multidimensional formative measurement model?". The objective of the Confirmatory Phase was to test the validity of the IS-Impact Model and the CNQ Model. To achieve that, IS-Impact, CNQ, and IS-Satisfaction were operationalized in a survey instrument, and the research model was then assessed by employing the Partial Least Squares (PLS) approach. The CNQ Model was validated as a formative model. Similarly, the IS-Impact Model was validated as a hierarchical multidimensional formative construct.
However, the analysis indicated that one of the IS-Impact Model indicators was insignificant and could be removed from the model. Thus, the resulting Extended IS-Impact Model consists of 4 dimensions and 34 measures. Finally, the structural model was assessed on two aspects: explanatory and predictive power. The analysis revealed that the path coefficient between CNQ and IS-Impact is significant (t = 4.826) and relatively strong (β = 0.426), with CNQ explaining 18% of the variance in IS-Impact. These results supported the hypothesis that CNQ is an antecedent of IS-Impact. The study demonstrates that the quality of the computer network affects the quality of the Enterprise System (ERP) and, consequently, the impacts of the system. Therefore, practitioners should pay attention to computer network quality. Similarly, the path coefficient between IS-Impact and IS-Satisfaction was significant (t = 17.79) and strong (β = 0.744), with IS-Impact alone explaining 55% of the variance in IS-Satisfaction, consistent with the results of the original IS-Impact study (Gable et al., 2008). The research contributions include: (a) supporting the completeness and validity of the IS-Impact Model as a hierarchical multidimensional formative measurement model in the Saudi Arabian context; (b) operationalizing Computer Network Quality as conceptualized in ITU-T Recommendation E.800 (ITU-T, 1993); (c) validating CNQ as a formative measurement model and as an antecedent of IS-Impact; and (d) conceptualizing and validating IS-Satisfaction as a reflective measurement model and as an immediate consequence of IS-Impact. The CNQ Model provides a framework for perceptually measuring computer network quality from multiple perspectives, and features an easy-to-understand, easy-to-use, and economical survey instrument.
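As a quick arithmetic check of the reported figures: in a standardised simple regression with a single predictor, the variance explained equals the squared path coefficient, which is consistent with the values above (0.426 → ~18%; 0.744 → ~55%):

```python
def variance_explained(beta):
    """R-squared for a standardised simple regression with one predictor."""
    return beta ** 2

r2_cnq = variance_explained(0.426)  # CNQ -> IS-Impact
r2_sat = variance_explained(0.744)  # IS-Impact -> IS-Satisfaction
```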
Abstract:
In the modern connected world, pervasive computing has become reality. Thanks to the ubiquity of mobile computing devices and emerging cloud-based services, users stay permanently connected to their data. This introduces a slew of new security challenges, including the problem of multi-device key management and single-sign-on architectures. One solution to this problem is the utilization of secure side-channels for authentication, including the visual channel as a vicinity proof. However, existing approaches often assume confidentiality of the visual channel, or provide only insufficient means of mitigating a man-in-the-middle attack. In this work, we introduce QR-Auth, a two-step, 2D-barcode-based authentication scheme for mobile devices which aims specifically at key management and key sharing across devices in a pervasive environment. It requires minimal user interaction and therefore provides better usability than most existing schemes, without compromising its security. We show how our approach fits in existing authorization delegation and one-time-password generation schemes, and that it is resilient to man-in-the-middle attacks.
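The abstract does not specify QR-Auth's OTP construction; as a hedged illustration of the kind of one-time-password generation scheme it says it fits into, here is the standard RFC 4226 HOTP algorithm (HMAC-SHA1 with dynamic truncation), verified against the RFC's published test vectors:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password (dynamic truncation of HMAC-SHA1)."""
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

In a QR-based flow, the shared secret would typically be provisioned by scanning a 2D barcode, after which both parties can derive matching codes offline.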
Abstract:
Communication processes are vital in the lifecycle of BPM projects. With this in mind, much research has been performed into facilitating this key component between stakeholders. Amongst the methods used to support this process are personalised process visualisations. In this paper, we review the development of this visualisation trend; we then propose a theoretical analysis framework based upon communication theory. We use this framework to provide theoretical support to the conjecture that 3D virtual worlds are powerful tools for communicating personalised visualisations of processes within a workplace. Meta-requirements are then derived and applied, via 3D virtual world functionalities, to generate example visualisations containing personalised aspects, which we believe enhance the process of communication between analysts and stakeholders in BPM process (re)design activities.
Multi-level knowledge transfer in software development outsourcing projects: the agency theory view
Abstract:
In recent years, software development outsourcing has become even more complex. Outsourcing partners have begun 're-outsourcing' components of their projects to other outsourcing companies to minimize cost and gain efficiencies, creating a multi-level hierarchy of outsourcing. This research-in-progress paper presents preliminary findings of a study designed to understand knowledge transfer effectiveness in multi-level software development outsourcing projects. We conceptualize the SD-outsourcing entities using Agency Theory. The study conceptualizes, operationalises and validates the concept of knowledge transfer as a three-phase multidimensional formative index of (1) domain knowledge, (2) communication behaviors, and (3) clarity of requirements. Data analysis identified substantial, significant differences between the Principal and the Agent on two of the three constructs. Using Agency Theory, supported by the preliminary findings, the paper also provides prescriptive guidelines for reducing the friction between the Principal and the Agent in multi-level software outsourcing.
Abstract:
Traffic congestion has a significant impact on the economy and environment. Encouraging the use of multimodal transport (public transport, bicycle, park'n'ride, etc.) has been identified by traffic operators as a good strategy to tackle congestion issues and their detrimental environmental impacts. A multi-modal and multi-objective trip planner provides users with various multi-modal options optimised on the objectives they prefer (cheapest, fastest, safest, etc.) and has the potential to reduce congestion on both a temporal and spatial scale. The computation of multi-modal and multi-objective trips is a complicated mathematical problem: it must integrate and utilise a diverse range of large data sets, including both road network information and public transport schedules, while optimising a number of competing objectives, where fully optimising one objective, such as travel time, can adversely affect another, such as cost. The relationship between these objectives can also be quite subjective, as their priorities will vary from user to user. This paper first outlines the various data requirements and formats needed for the multi-modal multi-objective trip planner to operate, including static information about the physical infrastructure within Brisbane as well as real-time and historical data to predict traffic flow on the road network and the status of public transport. It then presents information on the graph data structures representing the road and public transport networks within Brisbane that are used in the trip planner to calculate optimal routes. This allows for an investigation into the various shortest path algorithms that have been researched over the last few decades, and provides a foundation for the construction of the Multi-modal Multi-objective Trip Planner through the development of innovative new algorithms that can operate on the large, diverse data sets and competing objectives.
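The paper's own algorithms are not given in the abstract; as a toy sketch of one common baseline, a weighted-sum scalarisation collapses competing objectives (here time and cost) into a single edge weight, after which a standard Dijkstra search finds the preferred route. The network and weights below are invented:

```python
import heapq

def best_route(graph, source, target, w_time=1.0, w_cost=0.0):
    """graph: {node: [(neighbour, time, cost), ...]}. Returns the best weighted score."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, t, c in graph.get(node, []):
            nd = d + w_time * t + w_cost * c   # scalarised edge weight
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")

graph = {
    "A": [("B", 10, 2.0), ("C", 4, 1.0)],
    "B": [("D", 5, 1.0)],
    "C": [("D", 12, 0.5)],
}
```

Changing the weights changes the optimal path (time-only favours A→B→D; cost-only favours A→C→D), which illustrates why a single shortest-path run cannot capture the full trade-off and why true multi-objective algorithms are needed.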
Abstract:
Two simple and effective control strategies for a multi-axle heavy truck, modified skyhook damping (MSD) control and proportional-integral-derivative (PID) control, were implemented in a functional virtual prototype (FVP) model and compared in terms of road friendliness and ride comfort. A four-axle heavy truck-road coupling system model was established using FVP technology and validated through a ride comfort test. Appropriate passive air suspensions were then chosen to replace the rear tandem suspensions of the original truck model for preliminary optimization. The mechanical properties and time lag of the dampers were taken into account in simulations of MSD and PID semi-active dampers implemented using MATLAB/Simulink. Through co-simulations with Adams and MATLAB, the effects of semi-active MSD and PID control were analyzed and compared, and the control parameters affording the best comprehensive performance for each strategy were chosen. Simulation results indicate that, compared with the passive air suspension truck, semi-active MSD control improves both ride comfort and road friendliness markedly, with optimization ratios of RMS vertical acceleration and RMS tyre force ranging from 10.1% to 44.8%. However, semi-active PID control only reduces vertical vibration of the driver's seat, by 11.1%, 11.1% and 10.9% on A-, B- and C-level roads respectively. Both strategies are robust to variation of road level.
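The paper's modified skyhook law is not reproduced in the abstract; as a minimal sketch of the classical on-off skyhook switching rule it builds on (gains and sign conventions here are illustrative assumptions), a semi-active damper applies the skyhook force only when its stroke can dissipate energy:

```python
def skyhook_force(c_sky, v_body, v_rel):
    """Demanded damper force under the classical on-off skyhook law.
    v_body: absolute sprung-mass velocity; v_rel: relative velocity across the damper."""
    if v_body * v_rel > 0:   # stroke is dissipative: the damper can realise the demand
        return c_sky * v_body
    return 0.0               # otherwise switch to the minimum (here zero) force

# Example: with skyhook gain 1000 N.s/m, the force is applied only in the
# dissipative quadrant of the (v_body, v_rel) plane.
f_on = skyhook_force(1000.0, 0.5, 0.2)
f_off = skyhook_force(1000.0, 0.5, -0.2)
```

A "modified" skyhook scheme typically replaces the hard switch with a blended or saturated force, and the paper additionally models damper time lag, which this sketch omits.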
Abstract:
This report is the eighth deliverable of the Real Time and Predictive Traveller Information project and the third deliverable of the Arterial Travel Time Information sub-project in the Integrated Traveller Information Research Domain of the Smart Transport Research Centre. The primary objective of the Arterial Travel Time Information sub-project is to develop algorithms for real-time travel time estimation and prediction models for arterial traffic. The Brisbane arterial network is highly equipped with Bluetooth MAC Scanners, which can provide travel time information. The literature offers limited knowledge on the Bluetooth-protocol-based data acquisition process, and on the accuracy and reliability of analyses performed using the data. This report expands the body of knowledge surrounding the use of data from Bluetooth MAC Scanners (BMS) as a complementary traffic data source. A multi-layer simulation model named Traffic and Communication Simulation (TCS) is developed. TCS is utilised to model the theoretical properties of the BMS data and analyse the accuracy and reliability of travel time estimation using the BMS data.
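As an illustrative sketch of the basic BMS travel-time principle (data values and field names are invented): a MAC address detected at an upstream scanner and later at a downstream scanner yields one travel-time observation, and the median over matched pairs gives a robust link travel-time estimate:

```python
import statistics

def link_travel_times(upstream, downstream):
    """upstream/downstream: {mac: detection time in seconds at each scanner}.
    Returns one travel-time observation per MAC seen at both scanners."""
    return [downstream[mac] - upstream[mac]
            for mac in upstream.keys() & downstream.keys()
            if downstream[mac] > upstream[mac]]

up   = {"aa:01": 100.0, "aa:02": 110.0, "aa:03": 130.0}
down = {"aa:01": 190.0, "aa:02": 215.0, "aa:04": 300.0}
times = link_travel_times(up, down)   # two matched observations
estimate = statistics.median(times)   # median is robust to detour outliers
```

Real BMS data adds the complications the report studies: repeated detections within a scanner's coverage zone, missed detections, and devices that stop mid-link, all of which bias this naive matching.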
Abstract:
Compression ignition (CI) engine design is subject to many constraints, which together present a multi-criteria optimisation problem that the engine researcher must solve. In particular, the modern CI engine must not only be efficient, but must also deliver low gaseous, particulate and life cycle greenhouse gas emissions so that its impact on urban air quality, human health, and global warming is minimised. Consequently, this study undertakes a multi-criteria analysis which seeks to identify alternative fuels, injection technologies and combustion strategies that could potentially satisfy these CI engine design constraints. Three datasets are analysed with the Preference Ranking Organization Method for Enrichment Evaluations and Geometrical Analysis for Interactive Aid (PROMETHEE-GAIA) algorithm to explore the impact of (1) an ethanol fumigation system; (2) alternative fuels (20 % biodiesel and synthetic diesel) and alternative injection technologies (mechanical direct injection and common rail injection); and (3) various biodiesel fuels made from three feedstocks (i.e. soy, tallow, and canola) tested at several blend percentages (20-100 %), on the resulting emissions and efficiency profile of the various test engines. The results show that moderate ethanol substitutions (~20 % by energy) at moderate load, high-percentage soy blends (60-100 %), and alternative fuels (biodiesel and synthetic diesel) provide an efficiency and emissions profile that yields the most "preferred" solutions to this multi-criteria engine design problem. Further research is, however, required to reduce Reactive Oxygen Species (ROS) emissions with alternative fuels, and to deliver technologies that do not significantly reduce the median diameter of particle emissions.
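The mechanics of a PROMETHEE-style ranking can be sketched in a few lines. This is a toy with the simple "usual" (step) preference function and invented fuel data, not the study's PROMETHEE-GAIA analysis: alternatives are compared pairwise on each criterion, weighted preference degrees are aggregated, and the net outranking flow ranks the alternatives:

```python
def net_flows(alternatives, weights, maximise):
    """alternatives: {name: [criterion values]}; maximise: per-criterion direction."""
    names = list(alternatives)
    n = len(names)
    def pref(a, b):  # weighted degree to which a is preferred over b
        total = 0.0
        for w, better, x, y in zip(weights, maximise, alternatives[a], alternatives[b]):
            d = (x - y) if better else (y - x)
            total += w * (1.0 if d > 0 else 0.0)   # "usual" criterion: step function
        return total
    return {a: sum(pref(a, b) - pref(b, a) for b in names if b != a) / (n - 1)
            for a in names}

fuels = {  # invented values: [efficiency (maximise), emissions index (minimise)]
    "diesel":      [0.40, 9.0],
    "biodiesel":   [0.38, 6.0],
    "ethanol_fum": [0.41, 7.0],
}
flows = net_flows(fuels, weights=[0.5, 0.5], maximise=[True, False])
```

PROMETHEE proper offers six preference-function shapes with indifference/preference thresholds, and GAIA adds a principal-components view of the criteria; the net-flow aggregation above is the shared core.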
Abstract:
It is a big challenge to find useful associations in databases for user-specific needs. The essential issue is how to provide efficient methods for describing meaningful associations and pruning false or meaningless discoveries. One major obstacle is the overwhelmingly large volume of discovered patterns. This paper discusses an alternative approach called multi-tier granule mining to improve frequent association mining. Rather than using patterns, it uses granules to represent knowledge implicitly contained in databases. It also uses multi-tier structures and association mappings to represent association rules in terms of granules. Consequently, association rules can be quickly accessed and meaningless association rules can be justified according to the association mappings. Moreover, the proposed structure is also a precise compression of patterns which can restore the original supports. The experimental results show that the proposed approach is promising.
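A minimal sketch of the granule idea (attribute names and records are invented, and this is far simpler than the paper's multi-tier structure): records are compressed into granules, i.e. distinct value combinations with counts per tier, and a mapping from a condition tier to a decision tier restores rule support without re-scanning the raw records:

```python
from collections import Counter

def granulate(records, tier):
    """Count each distinct combination of the tier's attribute values."""
    return Counter(tuple(r[a] for a in tier) for r in records)

def association_mapping(records, cond_tier, dec_tier):
    """Joint counts: (condition granule, decision granule) -> support count."""
    return Counter((tuple(r[a] for a in cond_tier), tuple(r[a] for a in dec_tier))
                   for r in records)

records = [
    {"age": "young", "city": "BNE", "buys": "yes"},
    {"age": "young", "city": "BNE", "buys": "yes"},
    {"age": "old",   "city": "SYD", "buys": "no"},
]
g = granulate(records, ["age", "city"])
m = association_mapping(records, ["age", "city"], ["buys"])
```

The support of a rule like (young, BNE) → yes is read directly from the mapping, which is what makes the granule structure a lossless compression of pattern supports.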
Abstract:
Sophisticated models of human social behaviour are fast becoming highly desirable in an increasingly complex and interrelated world. Here, we propose that rather than taking established theories from the physical sciences and naively mapping them into the social world, the advanced concepts and theories of social psychology should be taken as a starting point, and used to develop a new modelling methodology. In order to illustrate how such an approach might be carried out, we attempt to model the low elaboration attitude changes of a society of agents in an evolving social context. We propose a geometric model of an agent in context, where individual agent attitudes are seen to self-organise to form ideologies, which then serve to guide further agent-based attitude changes. A computational implementation of the model is shown to exhibit a number of interesting phenomena, including a tendency for a measure of the entropy in the system to decrease, and a potential for externally guiding a population of agents towards a new desired ideology.
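The abstract's geometric model is not specified, but its reported qualitative behaviour can be sketched with a much simpler toy (all dynamics below are invented assumptions, not the paper's model): agents nudge a scalar attitude toward the population average, and the Shannon entropy of the binned attitude distribution falls as the attitudes self-organise:

```python
import math
import random

def step(attitudes, rate=0.2):
    """Each agent moves a fraction of the way toward the current mean attitude."""
    mean = sum(attitudes) / len(attitudes)
    return [a + rate * (mean - a) for a in attitudes]

def binned_entropy(attitudes, bins=10):
    """Shannon entropy of attitudes histogrammed over [0, 1]."""
    counts = [0] * bins
    for a in attitudes:
        counts[min(int(a * bins), bins - 1)] += 1
    n = len(attitudes)
    return -sum(c / n * math.log(c / n) for c in counts if c)

random.seed(1)
pop = [random.random() for _ in range(200)]
h0 = binned_entropy(pop)
for _ in range(10):
    pop = step(pop)
h1 = binned_entropy(pop)   # entropy decreases as attitudes cluster
```

This toy converges to a single consensus; the paper's model is richer, with multiple emergent ideologies guiding further attitude change, but the decreasing-entropy signature is the same.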
Abstract:
A theoretical framework for a construction management decision evaluation system for project selection is developed by means of a literature review. The theory is developed through examination of the major factors concerning the project selection decision from a deterministic viewpoint, where the decision-maker is assumed to possess 'perfect knowledge' of all the aspects involved. Four fundamental project characteristics are identified, together with three meaningful outcome variables. The relationships within and between these variables are considered, together with some possible solution techniques. The theory is then extended to time-related, dynamic aspects of the problem, leading to the implications of imperfect knowledge and a nondeterministic model. A solution technique is proposed in which Gottinger's sequential machines are utilised to model the decision process.
Abstract:
The overarching aim of this thesis was to investigate how processes of perception and action emerge under changing informational constraints during performance of multi-articular interceptive actions. Interceptive actions provide unique opportunities to study processes of perception and action in dynamic performance environments. The movement model used to exemplify the functionally coupled relationship between perception and action, from an ecological dynamics perspective, was cricket batting. Ecological dynamics conceptualises the human body as a complex system composed of many interacting sub-systems, and perceptual and motor system degrees of freedom, which leads to the emergence of patterns of behaviour under changing task constraints during performance. The series of studies reported in the chapters of this doctoral thesis contributed to the understanding of human behaviour by providing evidence of key properties of complex systems in human movement systems, including self-organisation under constraints and meta-stability. Specifically, the studies: (i) demonstrated how movement organisation (action) and visual strategies (perception) of dynamic human behaviour are constrained by changing ecological (especially informational) task constraints; (ii) provided evidence for the importance of representative design in experiments on perception and action; and (iii) provided a principled theoretical framework to guide learning design in the acquisition of skill in interceptive actions like cricket batting.
Abstract:
A novel Glass Fibre Reinforced Polymer (GFRP) sandwich panel was developed by an Australian manufacturer for civil engineering applications. This research is motivated by new applications of GFRP sandwich structures in civil engineering such as slabs, beams, girders and sleepers. An optimisation methodology is developed in this work to enhance the design of GFRP sandwich beams. The designs of single and glue-laminated GFRP sandwich beams were developed using numerical optimisation. The numerical multi-objective optimisation considered two design objectives simultaneously: cost and mass. It employs the Adaptive Range Multi-objective Genetic Algorithm (ARMOGA) and the Finite Element (FE) method. Trade-offs between the objectives were found during the optimisation process. The multi-objective optimisation shows a core-to-skin mass ratio of 3.68 for the single sandwich beam cross-section optimisation, and an optimum core-to-skin thickness ratio of 11.0.
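The trade-off idea behind the optimisation can be illustrated with a plain non-dominated filter (the design names and values below are invented, and this is ordinary Pareto sorting, not ARMOGA): among candidate beam designs scored on cost and mass, keep only those that no other design beats on both objectives at once:

```python
def pareto_front(designs):
    """designs: {name: (cost, mass)}; both objectives are minimised."""
    def dominated(a, b):
        # b dominates a: no worse on either objective and not identical
        return b[0] <= a[0] and b[1] <= a[1] and b != a
    return {name for name, d in designs.items()
            if not any(dominated(d, other) for other in designs.values())}

designs = {
    "thin_skin":  (120.0, 45.0),
    "thick_skin": (180.0, 30.0),
    "balanced":   (140.0, 38.0),
    "overbuilt":  (200.0, 44.0),   # dominated by "balanced": costlier and heavier
}
front = pareto_front(designs)
```

A genetic algorithm like ARMOGA evolves a population toward exactly such a front, adaptively narrowing the search range; the front itself is what presents the cost/mass trade-off to the designer.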
Abstract:
Recent advances in computational geodynamics are applied to explore the link between Earth's heat, its chemistry and its mechanical behaviour. Computational thermal-mechanical solutions now allow us to understand Earth patterns by solving the basic physics of heat transfer. This approach is currently used to solve basic convection patterns of terrestrial planets. Applying the same methodology to smaller scales delivers promising similarities between observed and predicted structures, which are often the site of mineral deposits. The new approach involves a fully coupled solution to the energy, momentum and continuity equations of the system at all scales, allowing the prediction of fractures, shear zones and other typical geological patterns out of a randomly perturbed initial state. The results of this approach link a global geodynamic mechanical framework, through regional-scale mineral deposits, down to the underlying micro-scale processes. Ongoing work includes the challenge of incorporating chemistry into the formulation.