173 results for Verma modules


Relevance: 10.00%

Abstract:

Public awareness and the nature of highway construction works demand that sustainability measures sit at the top of the development agenda. However, in the current economic climate, individual volition and enthusiasm do not make as strong a case for such high capital investments as a clear financial picture of pursuing sustainability. Some stakeholders consider sustainability to be extra work that costs additional money. Nevertheless, stakeholders recognise its importance in infrastructure development and are keen to identify the available alternatives and their financial implications on a life-cycle basis. Highway infrastructure development is a complex process that requires expertise and tools to evaluate investment options, such as environmentally sustainable features for road and highway development. Life-cycle cost analysis (LCCA) is a valuable approach to investment decision making for construction works, but LCCA applications in highway development are still limited. Current models, for example, focus on economic issues alone and do not deal with sustainability factors, which are more difficult to quantify and encapsulate in estimation modules. This paper reports research that identifies sustainability-related factors in highway construction projects, in the quantitative and qualitative forms of a multi-criteria analysis. These factors are then incorporated into past, proven LCCA models to produce a new long-term decision support model. The research, via questionnaires, model building, the analytic hierarchy process (AHP) and case studies, has identified, evaluated and processed highway sustainability-related cost elements. These cost elements need to be verified by industry before being integrated into further development of the model. The Australian construction industry will then have a practical tool to evaluate investment decisions that provide an optimum balance between financial viability and sustainability deliverables.
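The analytic hierarchy process step can be illustrated with a small sketch. The criteria and pairwise judgements below are hypothetical placeholders, not values from the study; the sketch only shows how AHP priority weights might be derived from a pairwise comparison matrix.

```python
# Minimal AHP sketch: derive criterion weights from a pairwise comparison
# matrix. Criteria and judgements are hypothetical illustrations only.
import numpy as np

criteria = ["initial cost", "maintenance cost", "emissions", "noise"]

# A[i][j] = how much more important criterion i is than criterion j (Saaty scale).
A = np.array([
    [1.0, 2.0, 4.0, 5.0],
    [1/2, 1.0, 3.0, 4.0],
    [1/4, 1/3, 1.0, 2.0],
    [1/5, 1/4, 1/2, 1.0],
])

# Approximate the principal eigenvector by the normalised geometric mean of rows.
geo_mean = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = geo_mean / geo_mean.sum()

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```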

Relevance: 10.00%

Abstract:

This paper considers the pros and cons of using behavioural cloning for the development of low-level helicopter automation modules. Over the course of this project several behavioural cloning approaches have been investigated. The results of the most effective behavioural cloning approach are then compared to PID modules designed for the same aircraft. The comparison takes into consideration development time, reliability, and control performance. It has been found that behavioural cloning techniques employing local approximators and a wide state-space coverage during training can produce stabilising control modules in less time than tuning PID controllers. However, performance and reliability deficits were found with behavioural cloning, attributable largely to the time-variant nature of the dynamics arising from the operating environment and to pilot actions being poorly suited to teaching. The final conclusion drawn here is that tuning PID modules remains superior to behavioural cloning for low-level helicopter automation.
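To make the behavioural cloning idea concrete, a minimal sketch is shown below, assuming a generic nearest-neighbour (local) approximator over logged pilot state-action pairs; the state and action dimensions and the data are hypothetical and do not reflect the project's actual implementation.

```python
# Behavioural cloning sketch: learn a control policy from logged pilot
# demonstrations using a local (k-nearest-neighbour) approximator.
# State/action layout and data are hypothetical, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
# Logged demonstrations: states (e.g. attitudes, rates) and pilot stick inputs.
states = rng.normal(size=(5000, 8))          # 8-dimensional state vector
actions = rng.normal(size=(5000, 4))         # 4 actuator commands

def cloned_policy(x, k=10):
    """Return the demonstration-weighted action for state x."""
    d = np.linalg.norm(states - x, axis=1)
    idx = np.argsort(d)[:k]                  # k nearest logged states
    w = 1.0 / (d[idx] + 1e-6)                # closer demonstrations weigh more
    return (w[:, None] * actions[idx]).sum(axis=0) / w.sum()

print(cloned_policy(np.zeros(8)))
```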

Relevance: 10.00%

Abstract:

Since 2005, QUT, through a number of large Teaching and Learning Grants, has sponsored a range of teamwork learning initiatives to assist students to develop the teamwork skills demanded by industry. After a suite of six online team learning modules was developed, first-year unit coordinators requested an additional module to address the challenges of working with the diverse range of social, cultural and personal values that students from different backgrounds bring to student teams. The Intercultural Teams module asks students to map themselves against a Cultural Orientations Framework so that they can understand their own cultural beliefs. By learning about other cultural orientations and comparing and analysing their effects, team members can develop communication and team process management strategies to leverage their differences and realise effective and creative outcomes. The interactive session will demonstrate the elements of the Intercultural Teams module and ask participants to consider ways the module can be integrated into classroom learning to support the development of students’ intercultural competencies.

Relevance: 10.00%

Abstract:

The Java programming language has potentially significant advantages for wireless sensor nodes, but there is currently no feature-rich, open source virtual machine available. In this paper we present Darjeeling, a system comprising offline tools and a memory-efficient runtime. The offline post-compiler tool analyzes, links and consolidates Java class files into loadable modules. The runtime implements a modified Java VM designed specifically to operate in constrained execution environments such as wireless sensor network nodes, and supports multithreading, inheritance, garbage collection, and loadable modules. We have demonstrated Java running on AVR128 and MSP430 microcontrollers at speeds of up to 70,000 JVM instructions per second.

Relevance: 10.00%

Abstract:

The Java programming language enjoys widespread popularity on platforms ranging from servers to mobile phones. While efforts have been made to run Java on microcontroller platforms, there is currently no feature-rich, open source virtual machine available. In this paper we present Darjeeling, a system comprising offline tools and a memory-efficient runtime. The offline post-compiler tool analyzes, links and consolidates Java class files into loadable modules. The runtime implements a modified Java VM that supports multithreading and is designed specifically to operate in constrained execution environments such as wireless sensor network nodes. Darjeeling improves upon existing work by supporting inheritance, threads, garbage collection, and loadable modules while keeping memory usage to a minimum. We have demonstrated Java running on AVR128 and MSP430 microcontrollers at speeds of up to 70,000 JVM instructions per second.

Relevance: 10.00%

Abstract:

This paper details the development of a machine learning system which uses the helicopter state and the actions of an instructing pilot to synthesise helicopter control modules online. Aggressive destabilisation/restabilisation sequences are used during training so that a wide state-space envelope is covered. The performance of heading, roll, pitch, height and lateral velocity control learning is presented using our Xcell 60 experimental platform. The helicopter is demonstrated to be stabilised on all axes using the “learning from a pilot” technique. To our knowledge, this is the first time a “learning from a pilot” technique has been successfully applied to all axes.
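A minimal sketch of the online “learning from a pilot” idea follows, assuming a simple linear policy updated by stochastic gradient descent as each pilot sample arrives; the dimensions, learning rate and data stream are hypothetical and are not the system described in the paper.

```python
# Online "learning from a pilot" sketch: a linear policy updated by
# stochastic gradient descent as each pilot state/action sample arrives.
# Dimensions, gain and the data stream are hypothetical illustrations.
import numpy as np

n_state, n_act, lr = 8, 4, 1e-3
W = np.zeros((n_act, n_state))               # linear policy: u = W @ x

def observe_pilot_sample(rng):
    x = rng.normal(size=n_state)             # measured helicopter state
    u = rng.normal(size=n_act)               # pilot's control input
    return x, u

rng = np.random.default_rng(1)
for _ in range(10000):                       # runs while the pilot flies
    x, u_pilot = observe_pilot_sample(rng)
    err = W @ x - u_pilot                    # imitation error
    W -= lr * np.outer(err, x)               # gradient step on squared error
```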

Relevance: 10.00%

Abstract:

Over recent years, Unmanned Air Vehicles (UAVs) have become a powerful tool for reconnaissance and surveillance tasks. These vehicles are now available in a broad range of sizes and capabilities and are intended to fly in regions where the presence of onboard human pilots is either too risky or unnecessary. This paper describes the formulation and application of a design framework that supports the complex task of multidisciplinary design optimisation of UAV systems via evolutionary computation. The framework includes a Graphical User Interface (GUI), a robust Evolutionary Algorithm optimiser named HAPEA, several design modules, mesh generators and post-processing capabilities in an integrated platform. Population-based algorithms such as EAs are well suited to problems where the search space can be multi-modal, non-convex or discontinuous, with multiple local minima and with noise, and also to problems where multiple solutions are sought via game theory, namely a Nash equilibrium point or a Pareto set of non-dominated solutions. The application of the methodology is illustrated on conceptual and detailed multi-criteria and multidisciplinary shape design problems. Results indicate the practicality and robustness of the framework in finding optimal shapes and trade-offs between the disciplinary analyses and in producing a set of non-dominated solutions forming an optimal Pareto front for the designer.
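The Pareto-set notion used by the framework can be illustrated with a toy sketch: evaluate a random population on two competing objectives and keep the non-dominated designs. The objectives and population below are hypothetical placeholders, and the code is not HAPEA itself.

```python
# Toy multi-objective EA step: evaluate a population on two competing
# objectives (stand-ins for e.g. drag and structural weight) and extract
# the non-dominated (Pareto) set. This is not the HAPEA optimiser.
import numpy as np

rng = np.random.default_rng(2)
pop = rng.uniform(0.0, 1.0, size=(60, 3))    # 60 candidate shape parameter sets

def objectives(x):
    f1 = (x ** 2).sum()                      # stand-in for drag
    f2 = ((x - 1.0) ** 2).sum()              # stand-in for weight
    return f1, f2

scores = np.array([objectives(x) for x in pop])

def pareto_front(scores):
    keep = []
    for i, s in enumerate(scores):
        dominated = any(np.all(t <= s) and np.any(t < s) for t in scores)
        if not dominated:
            keep.append(i)
    return keep

print("non-dominated designs:", pareto_front(scores))
```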

Relevance: 10.00%

Abstract:

Port land uses are subject to unique anthropogenic activities compared to typical urban land uses. This uniqueness results in distinctive stormwater quality characteristics, a distinction that makes conventional approaches to pollutant load estimation inaccurate. This is also the case for the Port of Brisbane (PoB). The study discussed in this paper was conducted to estimate the pollutant contributions from Port-specific land uses at PoB. For estimation, software modules embedded in Mike URBAN were used. An innovative modelling approach was adopted in which the conventional model calibration step did not need to be performed to generate suitable site-specific parameters. Instead, equations and site-specific parameters that replicate pollutant build-up and wash-off were generated from an extensive field investigation. Models were simulated incorporating site-specific parameters from six different Port-specific land uses and rainfall events from three representative years. The outcomes of the modelling exercise were used to identify the distinct pollutant contributions from the different Port land uses.
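As a rough illustration of the build-up/wash-off modelling, the sketch below uses commonly adopted exponential build-up and wash-off forms with hypothetical parameter values; the study derived its own site-specific equations and parameters from field data, which are not reproduced here.

```python
# Pollutant build-up/wash-off sketch using common exponential forms.
# Parameter values are hypothetical; the study fitted site-specific
# equations and parameters from field data rather than these defaults.
import math

def buildup(days_dry, b_max=60.0, k_b=0.4):
    """Surface load (kg/ha) after a dry period (exponential build-up)."""
    return b_max * (1.0 - math.exp(-k_b * days_dry))

def washoff(load, rain_mm, k_w=0.08):
    """Mass washed off (kg/ha) by a storm of given rainfall depth."""
    return load * (1.0 - math.exp(-k_w * rain_mm))

load = buildup(days_dry=7)
removed = washoff(load, rain_mm=25.0)
print(f"build-up: {load:.1f} kg/ha, washed off: {removed:.1f} kg/ha")
```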

Relevance: 10.00%

Abstract:

Physical infrastructure assets are important components of our society and our economy. They are usually designed to last for many years, are expected to be heavily used during their lifetime, carry considerable load, and are exposed to the natural environment. They are also normally major structures, and therefore represent a heavy investment, requiring constant management over their life cycle to ensure that they perform as required by their owners and users. Given a complex and varied infrastructure life cycle, constraints on available resources, and continuing requirements for effectiveness and efficiency, good management of infrastructure is important. While there is often no one best management approach, the choice of options is improved by better identification and analysis of the issues, by the ability to prioritise objectives, and by a scientific approach to the analysis process. The abilities to better understand the effect of inputs in the infrastructure life cycle on results, to minimise uncertainty, and to better evaluate the effect of decisions in a complex environment, are important in allocating scarce resources and making sound decisions. Through the development of an infrastructure management modelling and analysis methodology, this thesis provides a process that assists the infrastructure manager in the analysis, prioritisation and decision-making process. This is achieved through the use of practical, relatively simple tools, integrated in a modular, flexible framework that aims to provide an understanding of the interactions and issues in the infrastructure management process. The methodology uses a combination of flowcharting and analysis techniques. It first charts the infrastructure management process and its underlying infrastructure life cycle through the time interaction diagram, a graphical flowcharting methodology that is an extension of methodologies for modelling data flows in information systems. This process divides the infrastructure management process over time into self-contained modules that are based on a particular set of activities, the information flows between which are defined by the interfaces and relationships between them. The modular approach also permits more detailed analysis, or aggregation, as the case may be. It also forms the basis of extending the infrastructure modelling and analysis process to infrastructure networks, by using individual infrastructure assets and their related projects as the basis of the network analysis process. It is recognised that the infrastructure manager is required to meet, and balance, a number of different objectives, and therefore a number of high-level outcome goals for the infrastructure management process have been developed, based on common purpose or measurement scales. These goals form the basis of classifying the larger set of multiple objectives for analysis purposes. A two-stage approach that rationalises and then weights objectives, using a paired comparison process, ensures that the objectives to be met are both kept to the minimum number required and fairly weighted. Qualitative variables are incorporated into the weighting and scoring process, with utility functions proposed where there is risk or a trade-off situation applies. Variability is considered important in the infrastructure life cycle, and the approach used is based on analytical principles while incorporating randomness in variables where required.
The modular design of the process permits alternative processes to be used within particular modules, if this is considered a more appropriate way of analysis, provided that boundary conditions and requirements for linkages to other modules are met. Development and use of the methodology has highlighted a number of infrastructure life cycle issues, including data and information aspects and the consequences of change over the life cycle, as well as variability and the other matters discussed above. It has also highlighted the need to use judgment where required, and for organisations that own and manage infrastructure to retain intellectual knowledge regarding that infrastructure. It is considered that the methodology discussed in this thesis, which to the author's knowledge has not been developed elsewhere, may be used for the analysis of alternatives, planning, prioritisation of a number of projects, and identification of the principal issues in the infrastructure life cycle.
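A toy sketch of the weighted-objective scoring with variability follows; the objectives, weights and input ranges are hypothetical and merely illustrate how paired-comparison weights, scores and randomness could combine into a score distribution rather than a single point estimate.

```python
# Toy illustration of weighted-objective scoring with variability:
# objective weights stand in for the output of a paired-comparison
# exercise (values here are hypothetical) and uncertain inputs are
# sampled to give a score distribution rather than a point estimate.
import numpy as np

rng = np.random.default_rng(3)
weights = {"whole-of-life cost": 0.5, "service level": 0.3, "risk": 0.2}

def sample_scores():
    # Each objective scored 0-1; uncertain inputs drawn from assumed ranges.
    return {
        "whole-of-life cost": rng.uniform(0.4, 0.9),
        "service level": rng.uniform(0.6, 1.0),
        "risk": rng.uniform(0.2, 0.8),
    }

totals = [sum(weights[k] * s[k] for k in weights)
          for s in (sample_scores() for _ in range(5000))]
print(f"mean score {np.mean(totals):.2f}, 5th-95th percentile "
      f"{np.percentile(totals, 5):.2f}-{np.percentile(totals, 95):.2f}")
```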

Relevance: 10.00%

Abstract:

Literally, the word compliance suggests conformity in fulfilling official requirements. This thesis presents the results of the analysis and design of a class of protocols called compliant cryptologic protocols (CCP), and puts forward a notion of compliance in cryptosystems that is conducive as a cryptologic goal. CCP are employed in security systems used by at least two mutually mistrusting sets of entities. The individuals in these sets of entities trust only the design of the security system and any trusted third party the security system may include. Such a security system can be thought of as a broker between the mistrusting sets of entities. In order to provide confidence in operation for the mistrusting sets of entities, CCP must provide compliance verification mechanisms. These mechanisms are employed either by all the entities or by a set of authorised entities in the system to verify that the behaviour of the various participating entities complies with the rules of the system. It is often stated that confidentiality, integrity and authentication are the primary interests of cryptology. It is evident from the literature that authentication mechanisms employ confidentiality and integrity services to achieve their goal. Therefore, the fundamental services that any cryptographic algorithm may provide are confidentiality and integrity only. Since controlling the behaviour of the entities is not a feasible cryptologic goal, the verification of the confidentiality of any data is a futile cryptologic exercise. For example, there exists no cryptologic mechanism that would prevent an entity from willingly or unwillingly exposing its private key corresponding to a certified public key. The confidentiality of the data can only be assumed. Therefore, any verification in cryptologic protocols must take the form of integrity verification mechanisms, and compliance verification must likewise take the form of integrity verification. A definition of compliance that is conducive as a cryptologic goal is presented as a guarantee on the confidentiality and integrity services. The definitions are employed to provide a classification mechanism for the various message formats in a cryptologic protocol. The classification assists in the characterisation of protocols, which in turn provides a focus for the goals of the research. The resulting concrete goal of the research is the study of those protocols that employ message formats to provide restricted confidentiality and universal integrity services to selected data. The thesis proposes an informal technique to understand, analyse and synthesise the integrity goals of a protocol system. The thesis contains a study of key recovery, electronic cash, peer-review, electronic auction, and electronic voting protocols. All these protocols contain message formats that provide restricted confidentiality and universal integrity services to selected data. The study of key recovery systems aims to achieve robust key recovery relying only on the certification procedure and without the need for tamper-resistant system modules. The result of this study is a new technique for the design of key recovery systems called hybrid key escrow. The thesis identifies a class of compliant cryptologic protocols called secure selection protocols (SSP). The uniqueness of this class of protocols lies in the similarity of the goals of the member protocols, namely peer-review, electronic auction and electronic voting.
The problem statement describing the goals of these protocols contains a tuple (I, D), where I usually refers to the identity of a participant and D usually refers to the data selected by that participant. SSP are concerned with providing a confidentiality service to the tuple, to hide the relationship between I and D, and an integrity service to the tuple after its formation, to prevent its modification. The thesis provides a schema to solve instances of SSP by employing electronic cash technology. The thesis makes a distinction between electronic cash technology and electronic payment technology: it treats electronic cash technology as a certification mechanism that allows participants to obtain a certificate on their public key without revealing the certificate or the public key to the certifier. The thesis abstracts the certificate and the public key as a data structure called an anonymous token. It proposes design schemes for the peer-review, e-auction and e-voting protocols by employing the schema with the anonymous token abstraction. The thesis concludes by providing a variety of problem statements for future research that would further enrich the literature.
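One classic way to realise an anonymous token of the kind abstracted above is a blind signature; the sketch below uses textbook RSA blinding with tiny, insecure numbers purely for illustration and is not the scheme proposed in the thesis.

```python
# Blind-signature sketch of an "anonymous token": the signer certifies a
# value without ever seeing it. Textbook RSA with tiny insecure numbers,
# purely illustrative; this is not the scheme proposed in the thesis.
p, q = 61, 53
n, e, d = p * q, 17, 2753            # d * e == 1 (mod (p-1)*(q-1))

m = 1234                             # holder's value (e.g. hash of a public key)
r = 7                                # blinding factor, gcd(r, n) == 1

blinded = (m * pow(r, e, n)) % n     # holder blinds m and sends it to the signer
signed_blinded = pow(blinded, d, n)  # signer signs without learning m
s = (signed_blinded * pow(r, -1, n)) % n   # holder unblinds the signature

assert pow(s, e, n) == m             # anyone can verify the token (m, s)
print("anonymous token verified:", (m, s))
```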

Relevance: 10.00%

Abstract:

Many large coal mining operations in Australia rely heavily on the rail network to transport coal from mines to coal terminals at ports for shipment. Over the last few years, due to fast-growing demand, the coal rail network has become one of the worst industrial bottlenecks in Australia. This provides strong incentives to pursue better optimisation and control strategies for the operation of the whole rail transportation system under network and terminal capacity constraints. This PhD research aims to achieve a significant efficiency improvement in a coal rail network through the development of standard modelling approaches and generic solution techniques. Generally, the train scheduling problem can be modelled as a Blocking Parallel-Machine Job-Shop Scheduling (BPMJSS) problem. In a BPMJSS model for train scheduling, trains and sections are synonymous with jobs and machines respectively, and an operation is regarded as the movement/traversal of a train across a section. To begin, an improved shifting bottleneck procedure algorithm combined with metaheuristics has been developed to efficiently solve Parallel-Machine Job-Shop Scheduling (PMJSS) problems without the blocking conditions. Due to the lack of buffer space, real-life train scheduling must consider blocking or hold-while-wait constraints, which means that a track section cannot release a train and must hold it until the next section on the routing becomes available. As a consequence, the problem has been treated as BPMJSS with the blocking conditions. To develop efficient solution techniques for BPMJSS, extensive studies on non-classical scheduling problems regarding various buffer conditions (i.e. blocking, no-wait, limited-buffer, unlimited-buffer and combined-buffer) have been carried out. In this procedure, an alternative graph, as an extension of the classical disjunctive graph, is developed and specially designed for non-classical scheduling problems such as the blocking flow-shop scheduling (BFSS), no-wait flow-shop scheduling (NWFSS), and blocking job-shop scheduling (BJSS) problems. By exploring the blocking characteristics based on the alternative graph, a new algorithm called the topological-sequence algorithm is developed for solving the non-classical scheduling problems. To demonstrate the merit of the proposed algorithm, we compare it with two known algorithms (i.e. Recursive Procedure and Directed Graph) in the literature. Moreover, we define a new type of non-classical scheduling problem, called combined-buffer flow-shop scheduling (CBFSS), which covers four extreme cases: the classical FSS with infinite buffer, the blocking FSS (BFSS) with no buffer, the no-wait FSS (NWFSS) and the limited-buffer FSS (LBFSS). After exploring the structural properties of CBFSS, we propose an innovative constructive algorithm named the LK algorithm to construct feasible CBFSS schedules. Detailed numerical illustrations for the various cases are presented and analysed. By adjusting only the attributes in the data input, the proposed LK algorithm is generic and enables the construction of feasible schedules for many types of non-classical scheduling problems with different buffer constraints.
Inspired by the shifting bottleneck procedure algorithm for PMJSS and the characteristic analysis based on the alternative graph for non-classical scheduling problems, a new constructive algorithm called the Feasibility Satisfaction Procedure (FSP) is proposed to obtain feasible BPMJSS solutions. A real-world train scheduling case is used to illustrate and compare the PMJSS and BPMJSS models. Some real-life applications, including considering the train length, upgrading the track sections, accelerating a tardy train and changing the bottleneck sections, are discussed. Furthermore, the BPMJSS model is generalised to a No-Wait Blocking Parallel-Machine Job-Shop Scheduling (NWBPMJSS) problem for scheduling trains with priorities, in which prioritised trains such as express passenger trains are considered simultaneously with non-prioritised trains such as freight trains. In this case, no-wait conditions, which are more restrictive than blocking constraints, arise when considering the prioritised trains, which should traverse continuously without any interruption or unplanned pauses because of the high cost of waiting during travel. In comparison, non-prioritised trains are allowed to enter the next section immediately if possible, or to remain in a section until the next section on the routing becomes available. Based on the FSP algorithm, a more generic algorithm called the SE algorithm is developed to solve a class of train scheduling problems under different conditions in train scheduling environments. To construct a feasible train schedule, the proposed SE algorithm consists of several individual modules, including the feasibility-satisfaction, time-determination, tune-up and conflict-resolve procedures. To find a good train schedule, a two-stage hybrid heuristic algorithm called the SE-BIH algorithm is developed by combining the constructive heuristic (i.e. the SE algorithm) with the local-search heuristic (i.e. the Best-Insertion-Heuristic algorithm). To optimise the train schedule, a three-stage algorithm called the SE-BIH-TS algorithm is developed by combining the tabu search (TS) metaheuristic with the SE-BIH algorithm. Finally, a case study is performed for a complex real-world coal rail network under network and terminal capacity constraints. The computational results validate that the proposed methodology is very promising, because it can be applied as a fundamental tool for modelling and solving many real-world scheduling problems.
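The blocking (hold-while-wait) constraint at the heart of BPMJSS can be illustrated with a toy simulation in which a train keeps occupying its current section until the next section on its route is free; the routes and traversal times are hypothetical and this is not the FSP or SE algorithm.

```python
# Toy illustration of the blocking ("hold-while-wait") constraint: a train
# keeps occupying its current section until the next section on its route
# is free. Routes and traversal times are hypothetical; this is not the
# FSP/SE algorithm from the thesis.
routes = {"T1": ["S1", "S2", "S3"], "T2": ["S2", "S3", "S4"]}
traverse = 2                                   # time units per section
occupied = {}                                  # section -> train currently holding it
pos = {t: 0 for t in routes}                   # index of current section per train
ready = {t: 0 for t in routes}                 # earliest time the train may move on

for t, r in routes.items():
    occupied[r[0]] = t                         # trains start in their first section

for clock in range(0, 30):
    for t, r in routes.items():
        if pos[t] >= len(r) - 1 or clock < ready[t]:
            continue
        nxt = r[pos[t] + 1]
        if nxt not in occupied:                # move only if the next section is free
            occupied.pop(r[pos[t]], None)      # release the section just left
            occupied[nxt] = t
            pos[t] += 1
            ready[t] = clock + traverse
            print(f"t={clock}: {t} enters {nxt}")
```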

Relevance: 10.00%

Abstract:

Component software has many benefits, most notably increased software re-use; however, the component software process places heavy demands on programming language technology that modern object-oriented programming languages do not address. In particular, software components require specifications that are both sufficiently expressive and sufficiently abstract, and, where possible, these specifications should be checked formally by the programming language. This dissertation presents a programming language called Mentok that provides two novel language features enabling improved specification of stateful component roles. Negotiable interfaces are interface types extended with protocols, allowing specification of changing method availability, including some patterns of out-calls and re-entrance. Type layers are extensions to module signatures that allow specification of abstract control flow constraints through the interfaces of a component-based application. Development of Mentok's unique language features included creation of MentokC, the Mentok compiler, and formalization of key properties of Mentok in mini-languages called MentokP and MentokL.
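As a rough analogue of a negotiable interface, the sketch below enforces changing method availability at run time via a protocol-state check; Mentok performs this checking statically at compile time, and the component, states and decorator shown are hypothetical illustrations only.

```python
# Conceptual analogue of a "negotiable interface": method availability
# changes with the component's protocol state. Mentok checks such
# protocols statically; this sketch can only enforce them at run time,
# and the Connection component is a hypothetical example.
class ProtocolError(Exception):
    pass

def in_state(*allowed):
    """Only permit a call when the component is in one of the allowed states."""
    def wrap(method):
        def checked(self, *args, **kwargs):
            if self.state not in allowed:
                raise ProtocolError(
                    f"{method.__name__} not available in state {self.state!r}")
            return method(self, *args, **kwargs)
        return checked
    return wrap

class Connection:
    def __init__(self):
        self.state = "closed"

    @in_state("closed")
    def open(self):
        self.state = "open"

    @in_state("open")
    def send(self, data):
        print("sending", data)

    @in_state("open")
    def close(self):
        self.state = "closed"

c = Connection()
c.open()
c.send("hello")
c.close()
# c.send("again")  # would raise ProtocolError: send not available in state 'closed'
```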

Relevance: 10.00%

Abstract:

In this paper, we present the design and construction of a prototype target tracking system. The experimental set-up consists of three main modules: one for moving the object, one for detecting the object’s motion, and one for tracking it. The mechanism for moving the object includes the object itself, two stepper motors, and their driving and control circuitry. Detection of the object’s motion is realized by a photo-switch array. The tracking mechanism consists of a laser beam, two DC servomotors and their associated circuitry. The control algorithm is a standard fuzzy logic controller. The system is designed to operate in two modes, such that the roles of target and tracker can be interchanged. Experimental results indicate that the fuzzy controller is capable of controlling the system in both modes.
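A minimal sketch of a fuzzy controller of the general kind described is shown below, assuming a single tracking-error input, triangular membership functions, three rules and centroid defuzzification; the membership ranges, rule base and output levels are hypothetical, not those used in the paper.

```python
# Minimal fuzzy controller sketch for a tracking-error input: triangular
# membership functions, three rules, centroid defuzzification. Membership
# ranges, rule base and output levels are hypothetical, not the paper's.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_control(error):
    # Fuzzify the tracking error (degrees) into three sets.
    neg = tri(error, -20.0, -10.0, 0.0)
    zero = tri(error, -10.0, 0.0, 10.0)
    pos = tri(error, 0.0, 10.0, 20.0)
    # Rule outputs: turn left, hold, turn right (normalised motor commands).
    strengths = [neg, zero, pos]
    outputs = [-1.0, 0.0, 1.0]
    total = sum(strengths)
    if total == 0.0:
        return 0.0
    return sum(s * u for s, u in zip(strengths, outputs)) / total  # centroid

print(fuzzy_control(4.0))   # small positive error -> gentle corrective command
```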

Relevance: 10.00%

Abstract:

Multidisciplinary learning, interdisciplinary learning and transdisciplinary learning are often used with similar meanings, but misunderstanding these terms may lead to a failure to define learner needs and to develop high-quality learning designs. In this article, the three terms are reviewed in relation to learner engagement and are conceptualised according to different types and levels of interactivity. An undergraduate course, named Creative Industries: Making Connections, was designed to deliver various learning modules to over 1200 students from 11 different disciplines in a blended learning mode. A visual communication learning module in the course, in particular, challenges students as well as academic staff to experience transdisciplinary learning. A survey was conducted to evaluate students' learning experience in the visual communication learning module. The results of the survey offer meaningful implications for the realisation of transdisciplinary learning.

Relevance: 10.00%

Abstract:

An elective internship unit within a work integrated learning program in a business faculty is presented as a case study. In the unit, students complete a minimum of 120 hours of work placement over the course of a 13-week semester. The students are majoring in advertising, marketing, or public relations and are placed in corporations, government agencies, and not-for-profit organisations. To support and scaffold the students’ learning in the work environment, a range of classroom and online learning activities form part of the unit. Classroom activities include an introductory workshop to prepare students for placement, an industry panel, and an interview workshop, delivered as three workshops across the semester. Prior to commencing their placement, students complete a suite of online learning modules. The Work Placement Preparation Program assists students in securing a placement and making a successful transition to the work environment. It provides an opportunity for students to source possible work placement sites, prepare competitive applications, develop and rehearse interview skills, deal with workplace issues, and use a student ePortfolio to reflect on their skills and achievements. Students contribute to a reflective blog throughout their placement, with feedback from academic supervisors along the way. Completion of the online learning modules and contribution to the reflective blog are assessed as part of the unit. Other assessment tools include an internship plan and learning contract between the student, industry supervisor, and academic supervisor; a job application including responses to selection criteria; and a presentation to peers, academics and industry representatives at a poster session. The paper discusses the development of the internship unit over three years, particularly its learning activities and assessment. Reflection on and refinement of the unit are informed by a pedagogical framework and by the development of processes to best manage placements for all stakeholders. A model of best practice that can be adapted to a variety of discipline areas is proposed.