838 results for "end-to-side"
Abstract:
The continuing need for governments to radically improve the delivery of public services has led to a new, holistic government reform strategy labeled "Transformational Government" that strongly emphasizes customer-centricity. Attention has turned to online portals as a cost-effective front-end to deliver services and engage customers, as well as to the corresponding organizational approaches for the back-end to decouple the service interface from the departmental structures. The research presented in this paper makes three contributions. Firstly, a systematic literature review of approaches to the evaluation of online portal models in the public sector is presented. Secondly, the findings of a usability study comparing the online presences of the Queensland Government, the UK Government and the South Australian Government are reported, and the relative strengths and weaknesses of the different approaches are discussed. Thirdly, the limitations of the usability study in the context of a broader "Transformational Government" approach are identified, and service bundling is suggested as an innovative solution to further improve online service delivery.
Abstract:
Deploying wireless networks in networked control systems (NCSs) has become increasingly popular in recent years. As a typical type of real-time control system, an NCS is sensitive to long and nondeterministic time delays and to packet losses. However, the nature of the wireless channel has the potential to degrade the performance of NCS networks in many respects, particularly in time delay and packet loss. Transport layer protocols could play an important role in providing the reliable and fast transmission service needed to fulfill an NCS's real-time transmission requirements. Unfortunately, none of the existing transport protocols, including the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP), was designed for real-time control applications. Moreover, periodic data and sporadic data are two types of real-time traffic with different priorities in an NCS. Owing to the lack of support for prioritized transmission service, the real-time performance of periodic and sporadic data in an NCS network is often degraded significantly, particularly under congested network conditions. To address these problems, a new transport layer protocol called the Reliable Real-Time Transport Protocol (RRTTP) is proposed in this thesis. As a UDP-based protocol, RRTTP inherits UDP's simplicity and fast transmission. To improve reliability, retransmission and acknowledgement mechanisms are designed into RRTTP to compensate for packet losses; they avoid unnecessary retransmission of out-of-date packets, make collisions unlikely, and keep transmission delay small. Moreover, a prioritized transmission mechanism is also designed into RRTTP to improve the real-time performance of NCS networks under congested traffic conditions. Furthermore, the proposed RRTTP is implemented in Network Simulator 2 for comprehensive simulations. The simulation results demonstrate that RRTTP outperforms TCP and UDP in terms of real-time transmission in an NCS over wireless networks.
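Two of the ideas described above — serving sporadic (event) traffic ahead of periodic samples, and suppressing retransmission of out-of-date packets — can be sketched as a small sender-side queue. This is a minimal illustrative sketch, not the RRTTP specification: all class names, priority values and the staleness rule are assumptions for exposition.

```python
import heapq

class RRTTPQueue:
    """Toy sketch of two RRTTP ideas: priority between sporadic and
    periodic traffic, and suppression of stale retransmissions.
    Names and thresholds are illustrative, not from the protocol."""

    PRIO_SPORADIC = 0   # served first (alarms, events)
    PRIO_PERIODIC = 1   # regular sensor samples

    def __init__(self, sample_period):
        self.sample_period = sample_period
        self.queue = []
        self.seq = 0    # tie-breaker preserving FIFO order within a priority

    def enqueue(self, payload, sporadic, timestamp):
        prio = self.PRIO_SPORADIC if sporadic else self.PRIO_PERIODIC
        heapq.heappush(self.queue, (prio, self.seq, timestamp, payload))
        self.seq += 1

    def next_packet(self, now):
        """Pop the highest-priority packet, skipping periodic packets
        that a newer sample has already superseded (out-of-date)."""
        while self.queue:
            prio, _, ts, payload = heapq.heappop(self.queue)
            if prio == self.PRIO_PERIODIC and now - ts >= self.sample_period:
                continue  # stale: retransmitting it would waste bandwidth
            return payload
        return None
```

Dropping a superseded periodic sample rather than retransmitting it is what keeps delay small for control loops: the controller only ever needs the freshest measurement.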
Abstract:
This paper presents a model for the generation of a MAC tag using a stream cipher. The input message is used indirectly to control segments of the keystream that form the MAC tag. Several recent proposals can be considered as instances of this general model, as they all perform message accumulation in this way. However, they use slightly different processes in the message preparation and finalisation phases. We examine the security of this model for different options and against different types of attack, and conclude that the indirect injection model can be used to generate MAC tags securely for certain combinations of options. Careful consideration is required at the design stage to avoid combinations of options that result in susceptibility to forgery attacks. Additionally, some implementations may be vulnerable to side-channel attacks if used in Authenticated Encryption (AE) algorithms. We give design recommendations to provide resistance to these attacks for proposals following this model.
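The "indirect injection" idea — message bits controlling which keystream segments are accumulated into the tag, rather than being mixed in directly — can be illustrated with a toy model. This sketch is purely expository and deliberately insecure: SHA-256 in counter mode stands in for a real stream cipher, and the segment-selection rule is an assumption, not any of the concrete proposals the paper analyses.

```python
import hashlib

def keystream(key, nbytes):
    """Illustrative keystream: SHA-256 in counter mode stands in for a
    real stream cipher. A toy for exposition, not a secure design."""
    out = b""
    ctr = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:nbytes]

def indirect_injection_mac(key, message, tag_len=8):
    """Sketch of the general model: message bits are not fed into the
    tag directly; instead each bit selects which of two candidate
    keystream segments gets accumulated (XORed) into the tag."""
    tag = bytearray(tag_len)
    ks = keystream(key, 2 * 8 * len(message) * tag_len)
    seg = 0
    for byte in message:
        for i in range(8):
            bit = (byte >> i) & 1
            # two candidate segments per message bit; the bit picks one
            chosen = ks[(2 * seg + bit) * tag_len:(2 * seg + bit + 1) * tag_len]
            for j in range(tag_len):
                tag[j] ^= chosen[j]
            seg += 1
    return bytes(tag)
```

Because XOR accumulation is order-insensitive, a toy like this makes it easy to see why the message preparation and finalisation phases matter: without them, certain message modifications can cancel out, which is exactly the kind of forgery-attack surface the paper examines.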
Abstract:
For industrial wireless sensor networks, maintaining a routing path that sustains a high packet delivery ratio is one of the key objectives of network operation. It is important both to provide a high data delivery rate at the sink node and to guarantee timely delivery of data packets to it. Most proactive routing protocols for sensor networks rely on simple periodic updates to distribute routing information. A faulty link therefore causes packet loss and retransmission at the source until the next periodic route update is issued and the link is identified as broken. We propose a new proactive route maintenance process in which the periodic update is backed up by a secondary layer of local updates repeating at shorter periods for timely discovery of broken links. The proposed route maintenance scheme improves the reliability of the network by reducing the packet loss caused by delayed identification of broken links. We show by simulation that the proposed mechanism performs better than existing popular routing protocols (AODV, AOMDV and DSDV) in terms of end-to-end delay, routing overhead and packet reception ratio.
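The two-tier timing idea — a slow global route update backed by faster local probes — can be sketched as a simple neighbour monitor. This is an illustrative sketch only; the class name, periods and miss threshold are assumptions, not values from the proposed scheme.

```python
class LinkMonitor:
    """Sketch of two-tier route maintenance: a slow global update
    (global_period) backed by fast local probes (local_period), so a
    broken link is flagged well before the next global update."""

    def __init__(self, global_period=30.0, local_period=5.0, miss_limit=2):
        self.global_period = global_period
        self.local_period = local_period
        self.miss_limit = miss_limit
        self.last_heard = {}   # neighbour -> time of last probe reply

    def heard_from(self, neighbour, now):
        self.last_heard[neighbour] = now

    def broken_links(self, now):
        """A link is flagged broken after miss_limit consecutive missed
        local probes -- far sooner than waiting a full global period."""
        deadline = self.miss_limit * self.local_period
        return [n for n, t in self.last_heard.items() if now - t > deadline]
```

With the illustrative defaults, a failed link is detected after at most about 10 s of silence instead of up to 30 s, which is the source of the reduced packet loss claimed above.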
Abstract:
Software development settings provide a great opportunity for CSCW researchers to study collaborative work. In this paper, we explore a specific work practice called bug reproduction that is a part of the software bug-fixing process. Bug reproduction is a highly collaborative process by which software developers attempt to locally replicate the 'environment' within which a bug was originally encountered. Customers, who encounter bugs in their everyday use of systems, play an important role in bug reproduction as they provide useful information to developers, in the form of steps for reproduction, software screenshots, trace logs, and other ways to describe a problem. Bug reproduction, however, poses major hurdles in software maintenance as it is often challenging to replicate the contextual aspects that are at play at the customers' end. To study the bug reproduction process from a human-centered perspective, we carried out an ethnographic study at a multinational engineering company. Using semi-structured interviews, a questionnaire and half-a-day observation of sixteen software developers working on different software maintenance projects, we studied bug reproduction. In this paper, we present a holistic view of bug reproduction practices from a real-world setting and discuss implications for designing tools to address the challenges developers face during bug reproduction.
Abstract:
Service-oriented architectures and Web services have matured and become more widely accepted and used by industry. This growing adoption has increased the demand for new ways of using Web service technology. Users have started re-combining and mediating other providers' services in ways that were not anticipated by their original providers. Within organisations and cross-organisational communities, discoverable services are organised in repositories providing convenient access to adaptable end-to-end business processes. This idea is captured in the term "Service Ecosystem". This paper addresses the question of how quality management can be performed in such service ecosystems. Service quality management is a key challenge when services are composed of a dynamic set of heterogeneous sub-services from different service providers. This paper contributes to this important area by developing a reference model of quality management in service ecosystems. We illustrate the application of the reference model in an exploratory case study, showing how it helps to derive requirements for the implementation and support of quality management in an exemplary service ecosystem in public administration.
Acceptability-based QoE management for user-centric mobile video delivery: a field study evaluation
Abstract:
Effective Quality of Experience (QoE) management for mobile video delivery – optimizing overall user experience while adapting to heterogeneous use contexts – remains a major challenge. This paper proposes a mobile video delivery system that uses acceptability as the main indicator of QoE to manage the end-to-end factors in delivering mobile video services. The first contribution is a novel framework for a user-centric mobile video system based on acceptability-based QoE (A-QoE) prediction models, which were derived from comprehensive subjective studies. The second contribution is the results of a field study evaluating the user experience of the proposed system in realistic usage circumstances, addressing the impacts of perceived video quality, loading speed, interest in content, viewing location, network bandwidth, display device, and different video coding approaches, including region-of-interest (ROI) enhancement and center zooming.
Abstract:
This thesis introduces a method of applying Bayesian Networks to combine information from a range of data sources for effective decision support systems. It develops a set of techniques for the development, validation, visualisation and application of Complex Systems models, with a working demonstration in an Australian airport environment. The methods presented here provide a modelling approach that produces highly flexible, informative and applicable interpretations of a system's behaviour under uncertain conditions, together with measures of confidence in those interpretations. These end-to-end techniques are applied to the development of model-based dashboards to support operators and decision makers in the multi-stakeholder airport environment.
Abstract:
Underwater wireless sensor networks (UWSNs) have recently attracted researchers' attention due to their capability to explore underwater areas and support applications for marine discovery and oceanic surveillance. One of the main objectives of any deployed underwater network is discovering an optimized path over the sensor nodes to transmit monitored data to an onshore station. Transmitting data consumes energy at each node, while energy is limited in UWSNs, so energy efficiency is a key challenge in underwater wireless sensor networks. Dual-sink vector-based forwarding (DS-VBF) takes both residual energy and location information into consideration as priority factors to discover an optimized routing path and save energy in underwater networks. The modified routing protocol employs dual sinks on the water surface, which improves network lifetime. With dual sinks deployed, the packet delivery ratio and average end-to-end delay are also improved. In our simulations, compared with VBF, average end-to-end delay was reduced by more than 80%, remaining energy increased by 10%, and the packet reception ratio increased by about 70%.
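The idea of weighing residual energy against location when choosing a forwarder can be sketched as a simple scoring function over candidate neighbours. This is an illustrative sketch only: the scoring formula, the weight `alpha`, and the node representation are assumptions, not the actual DS-VBF specification.

```python
import math

def dsvbf_score(node, sink, alpha=0.5):
    """Illustrative next-hop scoring in the spirit of DS-VBF: combine
    residual energy with closeness to a sink. `alpha` weights the two
    factors; the exact formula in the paper may differ."""
    dist = math.dist(node["pos"], sink)
    return alpha * node["energy"] - (1 - alpha) * dist

def pick_next_hop(neighbours, sinks):
    """With two surface sinks, each candidate is scored against its
    best (nearest) sink and the best-scoring neighbour is chosen."""
    best = max(
        neighbours,
        key=lambda n: max(dsvbf_score(n, s) for s in sinks),
    )
    return best["id"]
```

Scoring against the better of the two sinks is what lets dual sinks shorten paths: a node near either sink becomes an attractive forwarder, which is consistent with the delay reduction reported above.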
Abstract:
Supervisory Control and Data Acquisition (SCADA) systems are one of the key foundations of smart grids. The Distributed Network Protocol version 3 (DNP3) is a standard SCADA protocol designed to facilitate communications in substations and smart grid nodes. The protocol is embedded with a security mechanism called Secure Authentication (DNP3-SA), which ensures that end-to-end communication security is provided in substations. This paper presents a formal model for the behavioural analysis of DNP3-SA using Coloured Petri Nets (CPN). Our DNP3-SA CPN model is capable of testing and verifying various attack scenarios — modification, replay, spoofing and combined complex attacks — along with mitigation strategies. Using the model has revealed a previously unidentified flaw in the DNP3-SA protocol that can be exploited by an attacker with access to the network interconnecting DNP3 devices: the attacker can launch a successful attack on an outstation without possessing the pre-shared keys by replaying a previously authenticated command with arbitrary parameters. We propose an update to the DNP3-SA protocol that removes the flaw and prevents such attacks. The update is validated and verified using our CPN model, demonstrating the effectiveness of the model and the importance of formal protocol analysis.
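The class of flaw described — a tag that does not bind all of a command's parameters, so a captured authentication can be replayed with altered parameters — can be illustrated with a deliberately simplified toy. This is not DNP3-SA's actual message format or authentication procedure; the HMAC construction, field names and the "tag covers only the function code" defect are assumptions chosen purely to make the replay idea concrete.

```python
import hmac
import hashlib

KEY = b"pre-shared-key"  # illustrative; the attacker does NOT know it

def authenticate(command):
    """Toy authentication that (wrongly) covers only the function code,
    not its parameters -- a simplified stand-in for the class of flaw
    revealed by the CPN analysis, not DNP3-SA's real format."""
    return hmac.new(KEY, command["code"].encode(), hashlib.sha256).digest()

def outstation_accepts(command, tag):
    """The outstation recomputes the tag over the same (incomplete) data."""
    expected = hmac.new(KEY, command["code"].encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# Operator sends a legitimate authenticated command.
legit = {"code": "OPERATE", "params": {"breaker": 1, "action": "close"}}
tag = authenticate(legit)

# An attacker on the network replays the captured tag with new parameters;
# since the tag never covered the parameters, the outstation accepts it.
forged = {"code": "OPERATE", "params": {"breaker": 7, "action": "open"}}
```

The fix implied by such an analysis is the obvious one: make the tag cover every field an attacker could usefully change, so any modified replay fails verification.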
Abstract:
Climate change is one of the most important issues confronting the sustainable supply of seafood, with projections suggesting major effects on wild and farmed fisheries worldwide. While climate change has been a consideration for Australian fisheries and aquaculture management, emphasis in both research and adaptation effort has been at the production end of supply chains—impacts further along the chain have been overlooked to date. A holistic biophysical and socio-economic system view of seafood industries, as represented by end-to-end supply chains, may lead to an additional set of options in the face of climate change, thus maximizing opportunities for improved fishery profitability, while also reducing the potential for maladaptation. In this paper, we explore Australian seafood industry stakeholder perspectives on potential options for adaptation along seafood supply chains based on future potential scenarios. Stakeholders, representing wild capture and aquaculture industries, provided a range of actions targeting different stages of the supply chain. Overall, proposed strategies were predominantly related to the production end of the supply chain, suggesting that greater attention in developing adaptation options is needed at post-production stages. However, there are chain-wide adaptation strategies that can present win–win scenarios, where commercial objectives beyond adaptation can also be addressed alongside direct or indirect impacts of climate. Likewise, certain adaptation strategies in place at one stage of the chain may have varying implications for other stages of the chain. These findings represent an important step in understanding the role of supply chains in effective adaptation of fisheries and aquaculture industries to climate change.
Abstract:
Increasingly large-scale applications are generating unprecedented amounts of data. However, the growing gap between computation and I/O capacity on high-end computing (HEC) machines makes data analysis a severe bottleneck. Instead of moving data from its source to output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis incurs much greater computing resource contention with simulations, and such contention severely damages simulation performance on HEC platforms. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the location of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To find the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system is developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, as well as for large-scale data transfer. Two use cases – scientific data compression and remote visualization – have been applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transmission bandwidth and improves application end-to-end transfer performance.
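The placement decision the paper motivates — run analytics in-situ (paying some simulation slowdown but shipping only reduced data) versus shipping raw output for remote processing — can be sketched as a toy cost model. All parameters and the cost formula here are illustrative assumptions, not the actual FlexAnalytics decision logic.

```python
def pick_placement(data_gb, reduction, wan_bw_gbs, local_slowdown_s):
    """Toy cost model in the spirit of analytics placement: compare
    in-situ analytics (pay `local_slowdown_s` of simulation time, then
    ship data reduced by factor `reduction`) against remote analytics
    (ship the raw data). Returns the cheaper option by elapsed time."""
    in_situ_cost = local_slowdown_s + (data_gb * reduction) / wan_bw_gbs
    remote_cost = data_gb / wan_bw_gbs
    return "in-situ" if in_situ_cost < remote_cost else "remote"
```

The toy captures the trade-off behind flexible placement: large outputs with good compressibility favour in-situ reduction, while small outputs are cheaper to ship raw than to slow the simulation for.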
Abstract:
Telomeres are the termini of linear eukaryotic chromosomes, consisting of tandem DNA repeats and the proteins that bind these repeat sequences. Telomeres ensure the complete replication of chromosome ends, protect ends from nucleolytic degradation and end-to-end fusion, and guide the localization of chromosomes within the nucleus. In addition, a combination of genetic, biochemical, and molecular biological approaches has implicated key roles for telomeres in diverse cellular processes such as regulation of gene expression, cell division, cell senescence, and cancer. This review focuses on recent advances in our understanding of the organization of telomeres, telomere replication, the proteins that bind telomeric DNA, and the establishment of telomere length equilibrium.
Abstract:
The Forest Devil: businessman Erik Johan Längman (1799-1863) in the transition of the economic system. In Finnish historiography, Erik Johan Längman (1799-1863) bears a bad reputation all of his own: a mean, profit-seeking businessman who cared little about the methods he used. Although little known, Längman has been praised as one of the pioneers of modern industry in the Grand Duchy of Finland, which belonged to the Russian Empire. From the mid-1830s Längman owned an iron mill and several sawmills around the country. Growing market demand in the 1830s, especially in Great Britain, provided a strong stimulus to the Finnish lumber industry. At the same time, high officials raised demands for stricter regulation of the sawmill industry. The culmination of the conflict, the Forest Act of 1851, brought an end to illegal overproduction. In this biography, particular emphasis is laid on Längman's entrepreneurial behaviour, but also on the effect the entrepreneurs had on the Crown's policies. Conversely, how did the limitations imposed by the Crown guide the actions of the sawmill owners? The solutions adopted by the sawmill owners and the manoeuvring of the government are in constant dialogue in this study. The Finnish sawmill industry experienced a major change in its techniques and methods of acquiring timber during the 1830s. Längman in particular, with his acquisition organisation, was able to find and reach faraway forests with unexpected results. The official regulatory system, with its strict production quotas, could not keep up with the changes. When the battle against the sawmill industry began in earnest, in 1840, it was waged not for the benefit of the iron industry, as previously argued, but to save Crown forests from depletion. After the mid-1840s Längman and the leader of the Finnish nationalist movement, J. V. Snellman, questioned the rationality of the entire regulatory system, and in doing so they also posed a threat to aristocratic power. The influential but now badly provoked chairman of the economic division of the senate, Lars Gabriel von Haartman, attacked the sawmill owners harder than ever and took advantage of the reactionary spirit of imperial Russia to launch the state forest administration. Längman circumvented the conditions of his privileges, felled Crown forests illegally, and was accused of destroying his competitors. The repeated conflicts testified primarily to a superior business idea and organisational ability. Although Längman spent his last years mostly abroad, he still had interests in the Finnish timber business when the sawmill industry was liberalized in 1861. Surprisingly, the antagonism around the Crown forests continued, probably even more heatedly.
Abstract:
Wildlife harvesting has a long history in Australia, including obvious examples of overexploitation. Not surprisingly, there is scepticism that commercial harvesting can be undertaken sustainably. Kangaroo harvesting has been challenged regularly at Administrative Appeals Tribunals and elsewhere over the past three decades. Initially, the concern from conservation groups was sustainability of the harvest. This has been addressed through regular, direct monitoring that now spans more than 30 years and a conservative harvest regime with a low risk of overharvest in the face of uncertainty. Opposition to the harvest now continues from animal rights groups, whose concerns have shifted from overall harvest sustainability to side effects involving animal welfare and changes to community structure, genetic composition and population age structure. Many of these concerns are speculative and difficult to address, requiring expensive data. One concern is that older females are the more successful breeders and teach their daughters optimal habitat and diet selection. The lack of older animals in a harvested population may reduce the fitness of the remaining individuals, implying that population viability would also be compromised. This argument can be countered by the persistence of populations under harvesting without any obvious impairment to reproduction. Nevertheless, an interesting question is how age influences reproductive output. In this study, data collected from a number of red kangaroo populations across eastern Australia indicate that the breeding success of older females is 7-20% higher than that of younger females. This effect is smaller than that of body condition and the environment, which can increase breeding success by up to 30% and 60% respectively. The average age of mature females in a harvested population may be reduced from 9 to 6 years, resulting in a potential reduction in breeding success of 3-4%. This appears to be offset in harvested populations by the improved condition of females resulting from reduced kangaroo density. There is an important recommendation for management: the best insurance policy against overharvest and unwanted side effects is not research, which could be never-ending, but a harvest strategy that includes safeguards against uncertainty such as harvest reserves, conservative quotas and regular monitoring. Research is still important in fine-tuning that strategy and is most usefully incorporated as adaptive management, where it can address the key questions of how populations respond to harvesting.