Abstract:
New substation technology, such as non-conventional instrument transformers, and a need to reduce design and construction costs are driving the adoption of Ethernet-based digital process bus networks for high voltage substations. Protection and control applications can share a process bus, making more efficient use of the network infrastructure. This paper classifies and defines performance requirements for the protocols used in a process bus on the basis of application. These include GOOSE, SNMP and IEC 61850-9-2 sampled values. A method, based on the Multiple Spanning Tree Protocol (MSTP) and virtual local area networks, is presented that separates management and monitoring traffic from the rest of the process bus. A quantitative investigation of the interaction between the various protocols used in a process bus is described. These tests also validate the effectiveness of the MSTP-based traffic segregation method. While this paper focuses on a substation automation network, the results are applicable to other real-time industrial networks that implement multiple protocols. High-volume sampled value data and time-critical circuit breaker tripping commands do not interact on a full duplex switched Ethernet network, even under very high network load. This enables an efficient digital network to replace a large number of conventional analog connections between control rooms and high voltage switchyards.
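As a concrete illustration of the traffic segregation idea, the following minimal sketch maps process bus traffic classes to VLANs and MST instances so that management traffic rides a separate spanning tree from the time-critical protection traffic. The VLAN IDs, MSTI numbers and priorities are hypothetical placeholders, not values from the paper or from IEC 61850.

```python
# Illustrative sketch only: VLAN IDs, MSTI numbers and priorities below are
# hypothetical, not taken from the paper or from IEC 61850.
PROCESS_BUS_CLASSES = {
    "sampled_values": {"vlan": 100, "msti": 1, "priority": 6},  # IEC 61850-9-2
    "goose":          {"vlan": 101, "msti": 1, "priority": 6},  # trip commands
    "management":     {"vlan": 200, "msti": 2, "priority": 1},  # SNMP monitoring
}

def mst_instance(traffic_class: str) -> int:
    """Return the MST instance carrying a given traffic class, so that
    management/monitoring traffic is kept on a separate spanning tree from
    the time-critical protection traffic."""
    return PROCESS_BUS_CLASSES[traffic_class]["msti"]

# Management traffic must never share a tree with protection traffic.
assert mst_instance("management") != mst_instance("goose")
```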
The creative citizen: understanding the value of design education programs in the knowledge economy
Abstract:
The knowledge economy relies on the diffusion and use of knowledge as well as its creation (Houghton and Sheehan, 2000). The future success of economic activity will depend on the capacity of organisations to transform by increasing their flexibility. In particular, this transformation depends on a decentralised, networked and multi-skilled workforce. To help organisations transition, new strategies and structures for education are required. Education systems need to concentrate less on specialist skills and more on developing adaptable people with broad-based problem-solving skills and the social and interpersonal skills necessary for networking and communication. This paper presents the findings of a ‘Knowledge Economy Market Development Mapping Study’ conducted to identify the value of design education programs from primary through to tertiary level in Queensland, Australia. The relationship of these programs to the development of the capacities mentioned above is explored. The study draws on qualitative and quantitative data gathered through a literature review, focus groups and a survey. Recommendations for the future development of design education programs in Queensland, Australia are proposed, and future research opportunities are presented and discussed.
Abstract:
The election of a national Labor Government in 2007 saw ‘social inclusion’ emerge as Australia’s overarching social policy agenda. Being ‘included’ has since been defined as being able to ‘have the resources, opportunities and capabilities needed to learn, work, engage and have a voice’. Various researchers have adopted the social inclusion concept to construct a multi-dimensional framework for measuring disadvantage, beyond poverty alleviation. This research program has enabled various forms of statistical modelling based on some agreement about what it means to be ‘included’ in society. At the same time it is acknowledged that social inclusion remains open and contestable and can be used in the name of both progressive and more punitive programs and policies. This ambiguity raises questions about whether the social inclusion framework, as it is presently defined, has the potential to be a progressive and transformative discourse. In this paper we examine whether the Australian social inclusion agenda has the capacity to address social inequality in a meaningful way, concluding with a discussion about the need to understand social inequality and social disadvantage in relational terms.
Abstract:
Value Management (VM) is a proven methodology that provides a structured framework, with supporting tools and techniques, to facilitate effective decision-making in many types of projects, thus achieving ‘best value’ for clients. It offers an exceptionally robust approach to exploring the need for and function of projects so that they align with clients’ objectives. The functional analysis and creativity phases of VM are crucial, as they focus on using innovative thinking to understand the objectives of clients’ projects and to provide value-adding solutions at the early discovery stages of projects. There is, however, a perception of VM as just another cost-cutting tool, which has overshadowed the fundamental benefits of the method and limited both its influence and its wider use in the construction industry. This paper describes findings from a series of case studies conducted at the project and corporate levels of current publicly funded infrastructure projects in Malaysia. The study aims to investigate the VM processes practised by the project client organisation and to evaluate the effects of project team involvement in VM workshops during the design stage of these projects. The focus of the study is on how issues related to ‘upstream’ infrastructure design, aimed at improving the ‘downstream’ construction process on-site, are resolved through multi-disciplinary team consideration and decision-making. Findings from the case studies indicate that the mix of disciplines of project team members at a design-stage VM workshop has minimal influence on improving construction processes. However, the degree of interaction, institutionalised thinking, cultural dimensions and the visualisation aids adopted have a significant impact on maximising creativity among project team members during a VM workshop. The case studies conducted for this research focused on infrastructure projects that use the traditional VM workshop as the client’s chosen VM methodology to review and develop designs. Document reviews and semi-structured interviews with project teams were used as data collection techniques for the case studies. The outcomes of this research are expected to offer alternative perspectives for construction professionals and clients to minimise the constraints on, and strengthen strategies for, implementing VM on future projects.
Abstract:
In this work, the thermal expansion properties of carbon nanotube (CNT)-reinforced nanocomposites with CNT content ranging from 1 to 15 wt% were evaluated using a multi-scale numerical approach, in which the effects of two parameters, i.e., temperature and CNT content, were investigated extensively. For all CNT contents, the results clearly revealed thermal contraction within a wide low-temperature range (30°C ~ 62°C), while thermal expansion occurs in a high-temperature range (62°C ~ 120°C). It was found that at any specified CNT content, the thermal expansion properties vary with temperature: as temperature increases, the thermal expansion rate increases linearly. However, at a specified temperature, the absolute value of the thermal expansion rate decreases nonlinearly as the CNT content increases. Moreover, the results provided by the present multi-scale numerical model were in good agreement with those obtained from the corresponding theoretical analyses and experimental measurements in this work, which indicates that this multi-scale numerical approach provides a powerful tool for evaluating the thermal expansion properties of any type of CNT/polymer nanocomposite, and therefore promotes understanding of the thermal behaviors of CNT/polymer nanocomposites for their applications in temperature sensors, nanoelectronic devices, etc.
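The reported linear temperature dependence of the thermal expansion rate lends itself to a simple fit. The sketch below shows the idea with synthetic placeholder data; the coefficients are invented for illustration and are not values from the paper.

```python
import numpy as np

# Synthetic illustration only: the coefficients below are invented
# placeholders, not values from the paper. At fixed CNT content the
# expansion rate is reported to grow linearly with temperature, so a
# linear least-squares fit recovers the trend and the temperature at
# which contraction turns into expansion.
temperature = np.linspace(30.0, 120.0, 10)        # degrees Celsius
expansion_rate = -0.8 + 0.0125 * temperature      # hypothetical linear response

b, a = np.polyfit(temperature, expansion_rate, 1)  # slope, intercept
crossover = -a / b                                 # contraction -> expansion
print(f"slope = {b:.4f} per deg C, crossover near {crossover:.1f} deg C")
```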
Abstract:
Introduction: The motivation for developing megavoltage (and kilovoltage) cone beam CT (MV CBCT) capabilities in the radiotherapy treatment room was primarily the need to improve patient set-up accuracy. There has recently been interest in using the cone beam CT data for treatment planning. Accurate treatment planning, however, requires knowledge of the electron density (ED) of the tissues receiving radiation in order to calculate dose distributions. This is obtained from CT, using a conversion between CT number and the electron density of various tissues. MV CBCT has particular advantages over kilovoltage CT for treatment planning in the presence of high atomic number materials, and requires the conversion of pixel values from the image sets to electron density. Therefore, a study was undertaken to characterise the pixel value to electron density relationship for the Siemens MV CBCT system, MVision, and to determine the effect, if any, of varying the number of monitor units (MU) used for acquisition. If a significant difference with the number of monitor units were seen, then separate pixel value to ED conversions might be required for each of the clinical settings. The calibration of the MV CT images for electron density offers the possibility of a daily recalculation of the dose distribution and the introduction of new adaptive radiotherapy treatment strategies. Methods: A Gammex Electron Density CT Phantom was imaged with the MV CBCT system. The pixel value for each of the sixteen inserts, which ranged from 0.292 to 1.707 in electron density relative to the background solid water, was determined by taking the mean value from within a region of interest centred on the insert, over 5 slices within the centre of the phantom. These results were averaged and plotted against the relative electron densities of each insert, and a linear least-squares fit was performed. This procedure was carried out for images acquired with 5, 8, 15 and 60 monitor units. Results: A linear relationship between MV CBCT pixel value and ED was demonstrated for all monitor unit settings and over a range of electron densities. The number of monitor units was found to have no significant impact on this relationship. Discussion: It was found that the number of MU used does not significantly alter the pixel value obtained for materials of different ED. However, to ensure the most accurate and reproducible MV to ED calibration, one MU setting should be chosen and used routinely, and to ensure accuracy for the clinical situation this MU setting should correspond to that used clinically. If more than one MU setting is used clinically, then an average of the CT values acquired with different numbers of MU could be used without loss of accuracy. Conclusions: No significant differences were found in the pixel value to ED conversion for the Siemens MV CBCT unit with changing monitor units. Thus a single conversion curve can be used for MV CT treatment planning. To fully exploit MV CT imaging for radiotherapy treatment planning, further work will be undertaken to ensure all corrections have been made and dose calculations verified. These dose calculations may be either for treatment planning purposes or for reconstructing the delivered dose distribution from transit dosimetry measurements made using electronic portal imaging devices. This will potentially allow the cumulative dose distribution to be determined through the patient’s multi-fraction treatment, and adaptive treatment strategies to be developed to optimise the tumour response.
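A minimal sketch of the calibration step described in the Methods follows. The first and last ED values match those quoted for the phantom inserts; the remaining EDs and all pixel values are hypothetical stand-ins for the measured ROI means.

```python
import numpy as np

# Sketch of the calibration described above. Only 0.292 and 1.707 come
# from the abstract; the other numbers are illustrative placeholders.
relative_ed = np.array([0.292, 0.680, 1.000, 1.280, 1.707])
roi_mean_pixel = np.array([420.0, 640.0, 980.0, 1210.0, 1580.0])  # hypothetical

# Linear least-squares fit: pixel = slope * ED + intercept
slope, intercept = np.polyfit(relative_ed, roi_mean_pixel, 1)

def pixel_to_ed(pixel_value: float) -> float:
    """Invert the calibration curve to map an MV CBCT pixel value to
    relative electron density."""
    return (pixel_value - intercept) / slope

print(f"ED at pixel 1000: {pixel_to_ed(1000.0):.3f}")
```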
Abstract:
Defining success in mega projects has been a challenging exercise for Australian Defence. The inherent conflict between building national capability and cost efficiency raises questions about how to appropriately define mega project success. Contrary to the traditional output-focused project methodology, the value creation perspective argues for the importance of creating new knowledge, processes, and systems for suppliers and customers. Stakeholder involvement is important in this new perspective, as balancing the competing needs of stakeholders in mega projects becomes a major challenge in managing the value co-creation process. Our earlier study reported interview data from three Australian defence mega projects and found that senior executives have a more complex understanding of project success than traditional iron triangle measures. In these mega defence projects, customers and other stakeholders actively engage in the value creation process, and over time both content and process value are created to increase defence and national capability. Value created and captured during and after projects is the key to true success. We aim to develop a comprehensive theoretical model that captures the value co-creation process as a way of re-conceptualising success in mega projects. We propose a new framework that redefines project value as a multi-dimensional, contextual and temporal construct emerging from the interactions among multiple stakeholders over the complete project life cycle. The framework distinguishes between exploitation and exploration types of projects, and takes into consideration the requisite governance structures.
Abstract:
The ability to identify and assess user engagement with transmedia productions is vital to the success of individual projects and the sustainability of this mode of media production as a whole. It is essential that industry players have access to tools and methodologies that offer the most complete and accurate picture of how audiences/users engage with their productions and which assets generate the most valuable returns on investment. Drawing upon research conducted with Hoodlum Entertainment, a Brisbane-based transmedia producer, this project involved an initial assessment of the way engagement tends to be understood, why standard web analytics tools are ill-suited to measuring it, how a customised tool could offer solutions, and why this question of measuring engagement is so vital to the future of transmedia as a sustainable industry. Working with data provided by Hoodlum Entertainment and Foxtel Marketing, the outcome of the study was a prototype for a custom data visualisation tool that allows access to, and manipulation and presentation of, user engagement data, both historic and predictive. The prototyped interfaces demonstrate how the visualisation tool would collect and organise data specific to multiplatform projects by aggregating data across a number of platform reporting tools. The tool is designed to encompass not only platforms developed by the transmedia producer but also sites developed by fans. The visualisation accounts for multiplatform experience projects whose top level comprises people, platforms and content. People include characters, actors, audience, distributors and creators. Platforms include television, Facebook and other relevant social networks, literature, cinema and other media that might be included in the multiplatform experience. Content refers to the discrete media texts employed within a platform, such as a tweet, a YouTube video, a Facebook post, an email, a television episode, etc. Core content is produced by the creators of the multiplatform experience to advance the narrative, while complementary content generated by audience members offers further contributions to the experience. Equally important is the timing with which the components of the experience are introduced and how they interact with and impact upon each other. By combining, filtering and sorting these elements in multiple ways, we can better understand the value of certain components of a project. The tool also offers insights into the relationship between the timing of the release of components and the user activity associated with them, which further highlights the efficacy (or, indeed, failure) of assets as catalysts for engagement. In collaboration with Hoodlum we have developed a number of design scenarios experimenting with the ways in which data can be visualised and manipulated to tell a more refined story about the value of user engagement with certain project components and activities. This experimentation will serve as the basis for future research.
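One hypothetical rendering of the people/platforms/content model described above is sketched below; the class and field names are illustrative inventions, not Hoodlum's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical sketch of the People / Platforms / Content model described
# above; names are illustrative, not the prototype's real data schema.
@dataclass
class ContentItem:
    platform: str                # e.g. "Twitter", "YouTube", "TV"
    kind: str                    # "core" (creator-made) or "complementary" (fan-made)
    released_at: datetime        # release timing matters for engagement analysis
    engagement_events: int = 0   # aggregated from per-platform reporting tools

@dataclass
class MultiplatformProject:
    people: list = field(default_factory=list)   # characters, actors, audience...
    content: list = field(default_factory=list)

    def engagement_by_platform(self) -> dict:
        """Combine and sort engagement counts per platform, the kind of
        filtering the visualisation tool is described as supporting."""
        totals: dict = {}
        for item in self.content:
            totals[item.platform] = totals.get(item.platform, 0) + item.engagement_events
        return totals
```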
Abstract:
Two key elements of education for sustainability (EfS) are action competence, and the importance of place and of experiencing the natural world. These elements emphasise and depend on the relationship between learners and their real-world contexts, and have been incorporated to some extent into the sustainability cross-curricular perspective of the new Australian curriculum. Given the importance of real-world experiential learning in EfS, what is to be made of the use of multi-user virtual worlds in EfS? We went with our preservice secondary science teachers to the very appealing virtual world Quest Atlantis (QA), which we use in this paper as an example to explore the value of virtual worlds in EfS. In assessing the virtual world of Quest Atlantis against Australia’s Sustainability Curriculum Framework, many areas of coherence are evident, relating to world viewing, systems thinking and futures thinking, knowledge of ecological and human systems, and implementing and reflecting on the consequences of actions. The power and appeal of these virtual experiences in developing these knowledges is undeniable. However, there is some incoherence between the elements of EfS as expressed in the Sustainability Curriculum Framework and the experience of QA, where learners are not acting in their real world or developing a connection with real place. This analysis highlights both the value and some limitations of virtual worlds as a venue for EfS.
Abstract:
This paper discusses the idea, and demonstrates an early prototype, of a novel method of interacting with security surveillance footage using natural user interfaces in place of traditional mouse and keyboard interaction. Current surveillance monitoring stations and systems provide the user with a vast array of video feeds from multiple locations on a video wall, relying on the user’s ability to distinguish the locations of the live feeds from experience or from a list-based key-value pairing of locations and camera IDs. During an incident, this method of interaction may cause the user to spend increased amounts of time obtaining situational and location awareness, which is counter-productive. The system proposed in this paper demonstrates how a multi-touch screen and natural interaction can enable surveillance monitoring station users to quickly identify the location of a security camera and efficiently respond to an incident.
Abstract:
Secure multi-party computation (MPC) protocols enable a set of n mutually distrusting participants P_1, ..., P_n, each with their own private input x_i, to compute a function Y = F(x_1, ..., x_n), such that at the end of the protocol all participants learn the correct value of Y while the secrecy of the private inputs is maintained. Classical results in unconditionally secure MPC indicate that in the presence of an active adversary, every function can be computed if and only if the number of corrupted participants, t_a, is smaller than n/3. Relaxing the requirement of perfect secrecy and utilizing broadcast channels, one can improve this bound to t_a < n/2. All existing MPC protocols assume that uncorrupted participants are truly honest, i.e., they are not even curious to learn other participants’ secret inputs. Based on this assumption, some MPC protocols are designed in such a way that after the elimination of all misbehaving participants, the remaining ones learn all information in the system. This is not consistent with maintaining the privacy of participant inputs. Furthermore, an improvement of the classical results given by Fitzi, Hirt, and Maurer indicates that in addition to t_a actively corrupted participants, the adversary may simultaneously corrupt some participants passively. This is in contrast to the assumption that participants who are not corrupted by an active adversary are truly honest. This paper examines the privacy of MPC protocols and introduces the notion of an omnipresent adversary, which cannot be eliminated from the protocol. The omnipresent adversary can be passive, active or mixed. We assume that up to a minority of participants who are not corrupted by an active adversary can be corrupted passively, with the restriction that at any time the number of corrupted participants does not exceed a predetermined threshold. We also show that the existence of a t-resilient protocol for a group of n participants implies the existence of a t′-private protocol for a group of n′ participants; that is, the elimination of misbehaving participants from a t-resilient protocol leads to the decomposition of the protocol. Our adversary model stipulates that an MPC protocol never operates with a set of truly honest participants (which is a more realistic scenario). Therefore, the privacy of all participants who properly follow the protocol will be maintained. We present a novel disqualification protocol to avoid a loss of privacy of participants who properly follow the protocol.
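To make the MPC setting concrete, here is a minimal toy example of additive secret sharing computing Y = F(x_1, ..., x_n) = x_1 + ... + x_n. It is a generic textbook construction, secure only against passive (curious) participants, and is not any of the protocols analysed in the paper.

```python
import secrets

# Toy additive-sharing MPC for a sum; illustrative only, secure against
# passive (curious) participants and NOT the paper's protocols.
P = 2**61 - 1  # public prime modulus

def share(x: int, n: int) -> list:
    """Split private input x into n additive shares summing to x mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def secure_sum(inputs: list) -> int:
    """Each participant shares its input; participant j locally adds the
    j-th share of every input, and only the total is reconstructed. No
    single share reveals anything about an individual x_i."""
    n = len(inputs)
    all_shares = [share(x, n) for x in inputs]
    partial = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    return sum(partial) % P

assert secure_sum([3, 14, 15]) == 32
```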
Abstract:
A multi-secret sharing scheme allows several secrets to be shared amongst a group of participants. In 2005, Shao and Cao developed a verifiable multi-secret sharing scheme where each participant’s share can be used several times, which reduces the number of interactions between the dealer and the group members. In addition, some secrets may require a higher security level than others, involving the need for different threshold values. Recently, Chan and Chang designed such a scheme, but their construction only allows a single secret to be shared per threshold value. In this article we combine the previous two approaches to design a multiple-time verifiable multi-secret sharing scheme where several secrets can be shared for each threshold value. Since running time is an important factor for practical applications, we also provide a complexity comparison of our combined approach with respect to the previous schemes.
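For readers unfamiliar with threshold sharing, the following minimal sketch shows basic Shamir (t, n)-threshold sharing of a single secret, the primitive such schemes build on. It is a standard textbook construction, not the Shao-Cao or Chan-Chang scheme.

```python
import secrets

# Basic Shamir (t, n)-threshold sharing over a prime field; a generic
# building block, NOT the multi-secret schemes discussed above.
P = 2**61 - 1

def shamir_share(secret: int, t: int, n: int) -> list:
    """Split `secret` into n points on a random degree-(t-1) polynomial;
    any t points reconstruct it, fewer reveal nothing."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def shamir_recover(shares: list) -> int:
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = shamir_share(123456, t=3, n=5)
assert shamir_recover(shares[:3]) == 123456
```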
Abstract:
A dynamic accumulator is an algorithm which merges a large set of elements into a constant-size value such that, for any accumulated element, there is a witness confirming that the element was included in the value, with the property that elements can be dynamically added to and deleted from the original set. Recently, Wang et al. presented a dynamic accumulator for batch updates at ICICS 2007. However, their construction suffers from two serious problems. We analyse them and propose a way to repair their scheme. We then use the accumulator to construct a new scheme for common secure indices with conjunctive keyword-based retrieval.
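The witness idea can be illustrated with a toy RSA-style accumulator. This is the generic textbook construction, not Wang et al.'s batch-update scheme, and the tiny modulus is insecure by design.

```python
# Toy RSA-style accumulator; generic textbook construction, NOT the
# Wang et al. batch-update scheme. The modulus is insecurely small.
N = 2357 * 2551          # product of two toy primes; real N is ~2048 bits
g = 65537 % N            # public base

def accumulate(elements: list) -> int:
    """Fold a set of elements (represented as primes) into one value."""
    acc = g
    for e in elements:
        acc = pow(acc, e, N)
    return acc

def witness(elements: list, target: int) -> int:
    """Witness for `target` = accumulator of all elements except it."""
    return accumulate([e for e in elements if e != target])

def verify(acc: int, target: int, wit: int) -> bool:
    """Membership holds iff wit^target == acc (mod N)."""
    return pow(wit, target, N) == acc

elems = [3, 5, 11]       # elements as small primes (toy representation)
acc = accumulate(elems)
assert verify(acc, 5, witness(elems, 5))
```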
Abstract:
This study adopts the premise that innovation capability underpins a service firm's value creation ability, and that management style, employee behaviors and marketing underpin its innovation capability. The study examines the role of managers and employees in the creation and delivery of superior value to customers via the firm's innovation capability. To test this premise, it examines the role of transformational leadership (TFL), as an aspect of the service firm's management style, in creating and delivering value to customers through its services. The study uses a multi-level design, collecting data from managers, employees and customers of service firms in a Southeast Asian country, Cambodia. The results show that a service firm's innovation capability has a positive effect on the firm's value offering (VO), that the VO has a positive relationship with customer perceived value-in-use (PVI), and that PVI has a positive relationship with firm performance. The study also finds moderating effects of TFL on the relationship between service innovation capability and VO, and of service marketing capability on the relationship between VO and PVI.
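Moderation effects of the kind reported here are commonly tested with an interaction term in a regression model. The sketch below shows that idea on synthetic data; the variable names and coefficients are illustrative inventions, not the study's measures or results.

```python
import numpy as np

# Illustrative only: synthetic data standing in for the survey measures.
# A moderation effect (e.g. TFL moderating innovation capability -> VO)
# is commonly tested via an interaction term in a regression.
rng = np.random.default_rng(0)
n = 200
sic = rng.normal(size=n)            # service innovation capability (hypothetical)
tfl = rng.normal(size=n)            # transformational leadership (hypothetical)
vo = 0.5 * sic + 0.2 * tfl + 0.3 * sic * tfl + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), sic, tfl, sic * tfl])
beta, *_ = np.linalg.lstsq(X, vo, rcond=None)
print(f"interaction coefficient (moderation): {beta[3]:.2f}")
```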
Abstract:
Multi-party key agreement protocols implicitly assume that each principal contributes equally to the final form of the key. In this paper we consider three malleability attacks on multi-party key agreement protocols. The first attack, called strong key control, allows a dishonest principal (or a group of principals) to fix the key to a pre-set value. The second attack is weak key control, in which the key is still random but the set from which it is drawn is much smaller than expected. The third attack is selective key control, in which a dishonest principal (or a group of dishonest principals) is able to remove the contribution of honest principals to the group key. The paper discusses these three attacks on several key agreement protocols, including DH (Diffie-Hellman), BD (Burmester-Desmedt) and JV (Just-Vaudenay). We show that dishonest principals in all three protocols can weakly control the key, and that the only protocol which does not allow strong key control is the DH protocol. The BD and JV protocols allow any pair of neighbouring principals to modify the group key, and this modification remains undetected by honest principals.
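For context, here is a minimal sketch of the unauthenticated BD group key agreement, showing how every principal's random contribution enters the shared key; the toy modulus is insecure, and the attacks discussed above are out of scope of the sketch.

```python
import secrets

# Toy Burmester-Desmedt (BD) group key agreement; illustrative only.
# The modulus is a toy choice and there is no authentication.
p = 2**127 - 1   # Mersenne prime; real deployments use much larger groups
g = 3

def bd_group_key(n: int) -> list:
    r = [secrets.randbelow(p - 2) + 1 for _ in range(n)]   # private exponents
    z = [pow(g, ri, p) for ri in r]                        # round 1 broadcasts
    # Round 2 broadcasts: X_i = (z_{i+1} / z_{i-1})^{r_i} mod p
    X = [pow(z[(i + 1) % n] * pow(z[(i - 1) % n], -1, p) % p, r[i], p)
         for i in range(n)]
    # Each principal computes K_i = z_{i-1}^{n r_i} * X_i^{n-1} * ... * X_{i+n-2}^{1}
    keys = []
    for i in range(n):
        k = pow(z[(i - 1) % n], n * r[i], p)
        for j in range(1, n):
            k = k * pow(X[(i + j - 1) % n], n - j, p) % p
        keys.append(k)
    return keys

keys = bd_group_key(5)
assert len(set(keys)) == 1   # an honest run yields one shared group key
```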