984 results for Computational-Linguistic resource
Abstract:
Demography theory suggests that high gender diversity leads to high turnover. As turnover is costly for organizations, we examined whether HR policies and practices influence the expected gender diversity-turnover relationship. Survey data were collected from 198 HR decision makers at publicly listed organizations. We found that HR policies and practices that are supportive of diversity moderate the gender diversity-turnover relationship, such that high gender diversity leads to low turnover in organizations with many diversity-supportive policies and practices. Results suggest that organizations can avoid the negative consequences of high gender diversity by implementing diversity-supportive HR policies and practices.
Abstract:
The increasing scarcity of water in the world, along with rapid population increase in urban areas, gives reason for concern and highlights the need for integrating water and wastewater management practices. The uncontrolled growth in urban areas has made planning, management and expansion of water and wastewater infrastructure systems very difficult and expensive. In order to achieve sustainable wastewater treatment and promote the conservation of water and nutrient resources, this chapter advocates the need for a closed-loop treatment system approach, and the transformation of the traditional linear treatment systems into integrated cyclical treatment systems. The recent increased understanding of integrated resource management and a shift towards sustainable management and planning of water and wastewater infrastructure are also discussed.
Abstract:
Many infrastructure and essential-service systems, such as electricity and telecommunications in Europe and North America, used to be operated as monopolies, if not state-owned enterprises. However, they have now been disintegrated into groups of smaller companies managed by different stakeholders. Railways are no exception. Since the early 1980s, there have been reforms, in the shape of restructuring of the national railways, in different parts of the world. Continuous refinements are still being made to allow better utilisation of railway resources and quality of service. There has been a growing interest in the industry in understanding the impacts of these reforms on operational efficiency and constraints. A number of post-evaluations have been conducted by analysing the performance of the stakeholders in terms of their profits (Crompton and Jupe 2003), quality of train service (Shaw 2001) and engineering operations (Watson 2001). Results from these studies are valuable for future improvements to the system, followed by a new cycle of post-evaluations. However, direct implementation of these changes is often costly and the consequences take a long time (e.g. years) to surface. With the advance of fast computing technologies, computer simulation is a cost-effective means of evaluating a hypothetical change to a system prior to actual implementation. For example, simulation suites have been developed to study a variety of traffic control strategies using sophisticated models of train dynamics, traction and power systems (Goodman, Siu and Ho 1998, Ho and Yeung 2001). Unfortunately, under the restructured railway environment, it is by no means easy to model the complex behaviour of the stakeholders and the interactions between them. The multi-agent system (MAS) is a recently developed modelling technique which may be useful in assisting the railway industry to conduct simulations of the restructured railway system.
In MAS, a real-world entity is modelled as a software agent that is autonomous, reactive to changes, and able to initiate proactive actions and social communicative acts. It has been applied in the areas of supply-chain management processes (García-Flores, Wang and Goltz 2000, Jennings et al. 2000a, b) and e-commerce activities (Au, Ngai and Parameswaran 2003, Liu and You 2003), in which the objectives and behaviour of the buyers and sellers are captured by software agents. It is therefore beneficial to investigate the suitability and feasibility of applying agent modelling in railways and the extent to which it might help in developing better resource management strategies. This paper sets out to examine the benefits of using MAS to model the resource management process in railways. Section 2 first describes the business environment after the railway reforms. The problems emerging from the restructuring process are then identified in Section 3. Section 4 describes the realisation of a MAS for railway resource management under the restructured scheme and the feasibility studies expected from the model.
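Such agent interactions can be sketched in a few lines. The sketch below is a hypothetical, highly simplified contract-net-style exchange, not taken from the paper: the agent names, prices and bid rule are all illustrative assumptions. Train operator agents bid for track access slots and an infrastructure-manager agent awards each slot to the highest bidder.

```python
# A hypothetical, highly simplified contract-net-style exchange:
# operator agents bid for track access slots, and the infrastructure
# manager awards each slot to the highest bidder. All names, prices
# and the bid rule are illustrative.

class OperatorAgent:
    def __init__(self, name, budget):
        self.name = name
        self.budget = budget

    def bid(self, slot):
        # Toy bid rule: base price plus a premium scaled by budget.
        # A real agent would evaluate timetables, rolling stock and crew.
        return slot["base_price"] + 0.1 * self.budget

class InfrastructureManagerAgent:
    def __init__(self, slots):
        self.slots = slots

    def allocate(self, operators):
        # Award each slot to the operator submitting the highest bid.
        allocation = {}
        for slot in self.slots:
            bids = {op.name: op.bid(slot) for op in operators}
            winner = max(bids, key=bids.get)
            allocation[slot["id"]] = (winner, bids[winner])
        return allocation

ops = [OperatorAgent("FastRail", 100.0), OperatorAgent("FreightCo", 80.0)]
im = InfrastructureManagerAgent([{"id": "S1", "base_price": 50.0},
                                 {"id": "S2", "base_price": 30.0}])
allocation = im.allocate(ops)
print(allocation)  # FastRail outbids FreightCo on both slots
```

A realistic MAS would add negotiation rounds, agent communication protocols and constraint checking; the point here is only that each stakeholder's objectives live inside its own agent.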
Abstract:
A high performance, low computational complexity rate-based flow control algorithm which can avoid congestion and achieve fairness is important to the ATM available bit rate service. The explicit rate allocation algorithm proposed by Kalampoukas et al. is designed to achieve max–min fairness in ATM networks. It has several attractive features, such as a fixed computational complexity of O(1) and guaranteed convergence to max–min fairness. In this paper, certain drawbacks of the algorithm, such as the severe overload of an outgoing link during the transient period and the non-conforming use of the current cell rate field in a resource management cell, are identified and analysed, and a new algorithm which overcomes these drawbacks is proposed. The proposed algorithm simplifies the rate computation as well. Compared with Kalampoukas's algorithm, it has better performance in terms of congestion avoidance and smoothness of rate allocation.
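Kalampoukas et al.'s O(1) per-cell algorithm is not reproduced here; as background, the sketch below shows the generic progressive-filling computation of max–min fair rates on a single link, which is the fixed point such explicit-rate algorithms converge to (function and session names are illustrative).

```python
# Generic progressive-filling computation of max-min fair rates on a
# single link (illustrative background; not Kalampoukas et al.'s O(1)
# per-cell algorithm, which maintains the allocation incrementally).

def max_min_fair(capacity, demands):
    """Return per-session rates: sessions demanding less than the equal
    share keep their demand; leftover capacity is split among the rest."""
    rates = {}
    remaining = dict(demands)
    cap = float(capacity)
    while remaining:
        share = cap / len(remaining)
        satisfied = [s for s, d in remaining.items() if d <= share]
        if not satisfied:
            # No session is limited by its own demand: split equally.
            for s in remaining:
                rates[s] = share
            break
        for s in satisfied:
            rates[s] = remaining[s]
            cap -= remaining.pop(s)
    return rates

rates = max_min_fair(10, {"a": 2, "b": 5, "c": 8})
print(rates)  # a keeps its demand of 2; b and c split the rest, 4 each
```

An explicit-rate switch cannot afford to recompute this from scratch on every resource management cell, which is why constant-time incremental algorithms such as Kalampoukas's matter.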
Abstract:
The use of feedback technologies, in the form of products such as Smart Meters, is increasingly seen as the means by which 'consumers' can be made aware of their patterns of resource consumption, and to then use this enhanced awareness to change their behaviour to reduce the environmental impacts of their consumption. These technologies tend to be single-resource focused (e.g. on electricity consumption only) and their functionality defined by persons other than end-users (e.g. electricity utilities). This paper presents initial findings of end-users' experiences with a multi-resource feedback technology, within the context of sustainable housing. It proposes that an understanding of user context, supply chain management and market diffusion issues are important design considerations that contribute to technology 'success'.
Abstract:
Real-world business processes are resource-intensive. In work environments human resources usually multitask, both human and non-human resources are typically shared between tasks, and multiple resources are sometimes necessary to undertake a single task. However, current Business Process Management Systems focus on task-resource allocation in terms of individual human resources only and lack support for a full spectrum of resource classes (e.g., human or non-human, application or non-application, individual or teamwork, schedulable or unschedulable) that could contribute to tasks within a business process. In this paper we develop a conceptual data model of resources that takes into account the various resource classes and their interactions. The resulting conceptual resource model is validated using a real-life healthcare scenario.
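One minimal way to express such a multi-class resource model is with plain dataclasses. The field and class names below are illustrative assumptions, not the paper's actual schema; the point is that each resource carries the class dimensions named above (human/non-human, application/non-application, individual/teamwork, schedulable/unschedulable) and that a single task may require several resources at once.

```python
# A minimal sketch (illustrative names, not the paper's schema) of a
# multi-class resource model: each resource carries the class
# dimensions from the abstract, and a task may require several
# resources of different classes simultaneously.

from dataclasses import dataclass
from enum import Enum

class Origin(Enum):
    HUMAN = "human"
    NON_HUMAN = "non-human"

@dataclass(frozen=True)
class Resource:
    name: str
    origin: Origin
    is_application: bool = False   # application vs non-application
    is_team: bool = False          # individual vs teamwork
    schedulable: bool = True       # schedulable vs unschedulable

@dataclass
class Task:
    name: str
    required: tuple  # multiple resources may be needed by one task

surgeon = Resource("Surgeon", Origin.HUMAN)
theatre = Resource("Operating theatre", Origin.NON_HUMAN)
records = Resource("Patient-record system", Origin.NON_HUMAN,
                   is_application=True)
task = Task("Minor procedure", (surgeon, theatre, records))

# Sharing is explicit: the same theatre instance can appear in the
# `required` tuple of more than one task.
```

Current workflow systems that allocate only individual human resources cannot represent the theatre or the record system above; a model like this makes the full spectrum of resource classes first-class citizens.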
Abstract:
The Guardian reportage of the United Kingdom Member of Parliament (MP) expenses scandal of 2009 used crowdsourcing and computational journalism techniques. Computational journalism can be broadly defined as the application of computer science techniques to the activities of journalism. Its foundation lies in computer-assisted reporting techniques, and its importance is increasing due to: (a) the increasing availability of large-scale government datasets for scrutiny; (b) the declining cost, increasing power and ease of use of data mining and filtering software, and of Web 2.0; and (c) the explosion of online public engagement and opinion. This paper provides a case study of the Guardian MP expenses scandal reportage and reveals some key challenges and opportunities for digital journalism. It finds that journalists may increasingly take an active role in understanding, interpreting, verifying and reporting clues or conclusions that arise from the interrogation of datasets (computational journalism). Secondly, a distinction should be made between information reportage and computational journalism in the digital realm, just as a distinction might be made between citizen reporting and citizen journalism. Thirdly, an opportunity exists for online news providers to take a ‘curatorial’ role, selecting and making easily available the best data sources for readers to use (information reportage). These activities have always been fundamental to journalism; however, the way in which they are undertaken may change. Findings from this paper may suggest opportunities and challenges for the implementation of computational journalism techniques in practice by digital Australian media providers, and further areas of research.
Abstract:
We present an approach to automating computationally sound proofs of key exchange protocols based on public-key encryption. We show that satisfying the property called occultness in the Dolev-Yao model guarantees the security of a related key exchange protocol in a simple computational model. Security in this simpler model has been shown to imply security in a Bellare–Rogaway-like model. Furthermore, occultness in the Dolev-Yao model can be checked automatically by a mechanisable procedure. Thus automated proofs for key exchange protocols in the computational model can be achieved. We illustrate the method using the well-known Lowe–Needham–Schroeder protocol.
Abstract:
The authors currently engage in two projects to improve human-computer interaction (HCI) designs that can help conserve resources. The projects explore motivation and persuasion strategies relevant to ubiquitous computing systems that bring real-time consumption data into the homes and hands of residents in Brisbane, Australia. The first project seeks to increase understanding among university staff of the tangible and negative effects that excessive printing has on the workplace and local environment. The second project seeks to shift attitudes toward domestic energy conservation through software and hardware that monitor real-time, in situ electricity consumption in homes across Queensland. The insights drawn from these projects will help develop resource consumption user archetypes, providing a framework linking people to differing interface design requirements.
Abstract:
Sustainability decisions and their impacts may be among the greatest challenges facing the world in the 21st century (Davos 2000). Apart from adaptation on the part of established organizations, these challenges are arguably going to require solutions developed by new actors. However, young ventures have only recently begun generating research interest within the sustainability literature (Shepherd et al. 2009). In particular, little is known about the resource behaviours of these ventures and how they adapt to substantial resource constraints. One promising theory that has been identified as a way some entrepreneurs manage constraints is bricolage: a construct defined as “making do by applying combinations of the resources at hand to new problems and opportunities” (Baker and Nelson 2005: 333). Bricolage may be critical to continued venture success, as these ventures are frequently developed under severe resource constraints owing to higher levels of technical sophistication (Rothaermel and Deeds 2006). Further, they are often developed by entrepreneurs committed to personal and social goals of resourcefulness, including values that focus on conservation rather than consumption of resources (Shepherd et al. 2009). In this paper, using seven novel cases of high-potential sustainability firms from CAUSEE, we consider how constraints impact resource behaviours, and we further illustrate and extend the bricolage domains previously developed by Baker and Nelson (2005), with recommendations provided for theory and practice.
Abstract:
This research shows that gross pollutant traps (GPTs) continue to play an important role in preventing visible street waste—gross pollutants—from contaminating the environment. The demand for these GPTs calls for stringent quality control and this research provides a foundation to rigorously examine the devices. A novel and comprehensive testing approach to examine a dry sump GPT was developed. The GPT is designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. This device has not been previously investigated. Apart from the review of GPTs and gross pollutant data, the testing approach includes four additional aspects to this research, which are: field work and an historical overview of street waste/stormwater pollution, calibration of equipment, hydrodynamic studies and gross pollutant capture/retention investigations. This work is the first comprehensive investigation of its kind and provides valuable practical information for the current research and any future work pertaining to the operations of GPTs and management of street waste in the urban environment. Gross pollutant traps—including patented and registered designs developed by industry—have specific internal configurations and hydrodynamic separation characteristics which demand individual testing and performance assessments. Stormwater devices are usually evaluated by environmental protection agencies (EPAs), professional bodies and water research centres. In the USA, the American Society of Civil Engineers (ASCE) and the Environmental Water Resource Institute (EWRI) are examples of professional and research organisations actively involved in these evaluation/verification programs. These programs largely rely on field evaluations alone that are limited in scope, mainly for cost and logistical reasons. In Australia, evaluation/verification programs of new devices in the stormwater industry are not well established. 
The current limitations in the evaluation methodologies of GPTs have been addressed in this research by establishing a new testing approach. This approach uses a combination of physical and theoretical models to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The physical model consisted of a 50% scale model GPT rig with screen blockages varying from 0 to 100%. This rig was placed in a 20 m flume and various inlet and outflow operating conditions were modelled on observations made during the field monitoring of GPTs. Due to infrequent cleaning, the retaining screens inside the GPTs were often observed to be blocked with organic matter. Blocked screens can radically change the hydrodynamic and gross pollutant capture/retention characteristics of a GPT as shown from this research. This research involved the use of equipment, such as acoustic Doppler velocimeters (ADVs) and dye concentration (Komori) probes, which were deployed for the first time in a dry sump GPT. Hence, it was necessary to rigorously evaluate the capability and performance of these devices, particularly in the case of the custom made Komori probes, about which little was known. The evaluation revealed that the Komori probes have a frequency response of up to 100 Hz —which is dependent upon fluid velocities—and this was adequate to measure the relevant fluctuations of dye introduced into the GPT flow domain. The outcome of this evaluation resulted in establishing methodologies for the hydrodynamic measurements and gross pollutant capture/retention experiments. The hydrodynamic measurements consisted of point-based acoustic Doppler velocimeter (ADV) measurements, flow field particle image velocimetry (PIV) capture, head loss experiments and computational fluid dynamics (CFD) simulation. The gross pollutant capture/retention experiments included the use of anthropogenic litter components, tracer dye and custom modified artificial gross pollutants. 
Anthropogenic litter was limited to tin cans, bottle caps and plastic bags, while the artificial pollutants consisted of 40 mm spheres with a range of four buoyancies. The hydrodynamic results led to the definition of global and local flow features. The gross pollutant capture/retention results showed that when the internal retaining screens are fully blocked, the capture/retention performance of the GPT rapidly deteriorates. The overall results showed that the GPT will operate efficiently until at least 70% of the screens are blocked, particularly at high flow rates. This important finding indicates that cleaning operations could be more effectively planned when the GPT capture/retention performance deteriorates. At lower flow rates, the capture/retention performance trends were reversed. There is little difference in the poor capture/retention performance between a fully blocked GPT and a partially filled or empty GPT with 100% screen blockages. The results also revealed that the GPT is designed with an efficient high flow bypass system to avoid upstream blockages. The capture/retention performance of the GPT at medium to high inlet flow rates is close to maximum efficiency (100%). With regard to the design appraisal of the GPT, a raised inlet offers a better capture/retention performance, particularly at lower flow rates. Further design appraisals of the GPT are recommended.
Abstract:
In a digital world, users’ Personally Identifiable Information (PII) is normally managed with a system called an Identity Management System (IMS). There are many types of IMSs. There are situations when two or more IMSs need to communicate with each other (such as when a service provider needs to obtain some identity information about a user from a trusted identity provider). There could be interoperability issues when communicating parties use different types of IMS. To facilitate interoperability between different IMSs, an Identity Meta System (IMetS) is normally used. An IMetS can, at least theoretically, join various types of IMSs to make them interoperable and give users the illusion that they are interacting with just one IMS. However, due to the complexity of an IMS, attempting to join various types of IMSs is a technically challenging task, let alone assessing how well an IMetS manages to integrate these IMSs. The first contribution of this thesis is the development of a generic IMS model called the Layered Identity Infrastructure Model (LIIM). Using this model, we develop a set of properties that an ideal IMetS should provide. This idealized form is then used as a benchmark to evaluate existing IMetSs. Different types of IMS provide varying levels of privacy protection support. Unfortunately, as observed by Jøsang et al (2007), there is insufficient privacy protection in many of the existing IMSs. In this thesis, we study and extend a type of privacy enhancing technology known as an Anonymous Credential System (ACS). In particular, we extend the ACS which is built on the cryptographic primitives proposed by Camenisch, Lysyanskaya, and Shoup. We call this system the Camenisch, Lysyanskaya, Shoup - Anonymous Credential System (CLS-ACS). The goal of CLS-ACS is to let users be as anonymous as possible. 
Unfortunately, CLS-ACS has problems, including (1) the concentration of power in a single entity - known as the Anonymity Revocation Manager (ARM) - who, if malicious, can trivially reveal a user’s PII (resulting in an illegal revocation of the user’s anonymity), and (2) poor performance due to the resource-intensive cryptographic operations required. The second and third contributions of this thesis are the proposal of two protocols that reduce the trust dependencies on the ARM during users’ anonymity revocation. Both protocols distribute trust from the ARM to a set of n referees (n > 1), resulting in a significant reduction of the probability of an anonymity revocation being performed illegally. The first protocol, called the User Centric Anonymity Revocation Protocol (UCARP), allows a user’s anonymity to be revoked in a user-centric manner (that is, the user is aware that his/her anonymity is about to be revoked). The second protocol, called the Anonymity Revocation Protocol with Re-encryption (ARPR), allows a user’s anonymity to be revoked by a service provider in an accountable manner (that is, there is a clear mechanism to determine which entity can eventually learn - and possibly misuse - the identity of the user). The fourth contribution of this thesis is the proposal of a protocol called the Private Information Escrow bound to Multiple Conditions Protocol (PIEMCP). This protocol is designed to address the performance issue of CLS-ACS by applying CLS-ACS in a federated single sign-on (FSSO) environment. Our analysis shows that PIEMCP can both reduce the number of expensive modular exponentiation operations required and lower the risk of illegal revocation of users’ anonymity. Finally, the protocols proposed in this thesis are complex and need to be formally evaluated to ensure that their required security properties are satisfied. In this thesis, we use Coloured Petri nets (CPNs) and their corresponding state space analysis techniques.
All of the protocols proposed in this thesis have been formally modeled and verified using these formal techniques. The fifth contribution of this thesis is therefore a demonstration of the applicability of CPNs and their corresponding analysis techniques to modeling and verifying privacy enhancing protocols. To our knowledge, this is the first time that CPNs have been comprehensively applied to model and verify privacy enhancing protocols. From our experience, we also propose several CPN modeling approaches, including the modeling of complex cryptographic primitives (such as zero-knowledge proof protocols), attack parameterization, and others. The proposed approaches can be applied to other security protocols, not just privacy enhancing protocols.
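The thesis' UCARP and ARPR protocols are cryptographically involved and are not reproduced here. As background only, a standard (k, n) threshold secret-sharing scheme (Shamir's construction, not taken from the thesis) illustrates the general idea of distributing a revocation capability across n referees so that no single party, such as a lone ARM, can revoke anonymity by itself.

```python
# Standard Shamir (k, n) threshold secret sharing (background sketch;
# not the thesis' UCARP/ARPR constructions): the secret is the constant
# term of a random degree-(k-1) polynomial over a prime field, and any
# k of the n referee shares reconstruct it, while fewer reveal nothing.

import random

P = 2**127 - 1  # a Mersenne prime defining the field GF(P)

def split(secret, n, k):
    # Random polynomial f with f(0) = secret; share i is (i, f(i)).
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation of the polynomial at x = 0 over GF(P).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(12345, n=5, k=3)  # five referees, any three suffice
print(reconstruct(shares[:3]) == reconstruct(shares[2:]) == 12345)
```

Distributing the capability this way is what drives down the probability of an illegal revocation: a malicious ARM would now need to corrupt k referees rather than act alone.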
Abstract:
Resource-based theory posits that firms achieve high performance by controlling resources that are rare, valuable and costly for others to duplicate or work around. Yet scholars have been less successful in understanding the processes and behaviours by which firms develop such resources. We draw on the behavioral theory of bricolage from the entrepreneurship literature to suggest one such mechanism by which firms may develop resource-based advantages. The core of our argument is that the idiosyncratic bundling processes synonymous with bricolage behavior may create advantageous resource positions by (i) allowing resource-constrained firms to allocate more of their limited resources to activities that they view as more strategically important, and (ii) increasing the difficulties other firms face in trying to imitate these advantages. Based on this reasoning we develop several hypotheses, which we test in the context of several samples from a large, longitudinal Australian study of new firm development. The results support our arguments that bricolage will improve a firm’s overall resource position while generating more areas of strong resource advantage and fewer areas of strong resource disadvantage. We find little support, however, for our arguments that bricolage will make a firm’s key resource advantages more difficult for other firms to imitate. We find some support for our argument that the role of bricolage in creating resource advantages is enhanced by the quality of the opportunity with which a firm is engaged.
Abstract:
There are at least four key challenges in the online news environment that computational journalism may address. Firstly, news providers operate in a rapidly evolving environment, and larger businesses are typically slower to adapt to market innovations. Secondly, news consumption patterns have changed, and news providers need to find new ways to capture and retain digital users. Thirdly, declining financial performance has led to cost cuts at mass market newspapers. Finally, investigative reporting is typically slow, high cost and may be tedious, and yet is valuable to the reputation of a news provider. Computational journalism involves the application of software and technologies to the activities of journalism, and it draws from the fields of computer science, social science and communications. New technologies may enhance the traditional aims of journalism, or may require “a new breed of people who are midway between technologists and journalists” (Irfan Essa in Mecklin 2009: 3). Historically referred to as ‘computer assisted reporting’, the use of software in online reportage is increasingly valuable due to three factors: larger datasets are becoming publicly available; software is becoming more sophisticated and ubiquitous; and the Australian digital economy is developing. This paper introduces the key elements of computational journalism: why it is needed; what it involves; its benefits and challenges; and a case study with examples. When correctly used, computational techniques can quickly provide a solid factual basis for original investigative journalism and may increase interaction with readers. It is a major opportunity to enhance the delivery of original investigative journalism, which ultimately may attract and retain readers online.