978 results for Defense information, Classified


Relevance: 20.00%

Abstract:

Reset/inhibitor nets are Petri nets extended with reset arcs and inhibitor arcs. These extensions can be used to model cancellation and blocking. A reset arc allows a transition to remove all tokens from a certain place when the transition fires. An inhibitor arc can stop a transition from being enabled if the place contains one or more tokens. While reset/inhibitor nets increase the expressive power of Petri nets, they also increase the complexity of analysis techniques. One way of speeding up Petri net analysis is to apply reduction rules. Unfortunately, many of the rules defined for classical Petri nets do not hold in the presence of reset and/or inhibitor arcs. Moreover, new rules specific to these arcs can be added. This is the first paper to systematically present a comprehensive set of reduction rules for reset/inhibitor nets. These rules are liveness- and boundedness-preserving and are able to dramatically reduce models and their state spaces. It can be observed that most of the modeling languages used in practice have features related to cancellation and blocking. Therefore, this work is highly relevant for all kinds of application areas where analysis is currently intractable.
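
The firing semantics sketched above are easy to make concrete. Below is a minimal, illustrative Python sketch (not taken from the paper; all place and transition names are invented) of how reset and inhibitor arcs change enabling and firing:

```python
# Hedged sketch: minimal firing semantics for a reset/inhibitor net.
from dataclasses import dataclass, field

@dataclass
class Transition:
    inputs: dict                                   # place -> tokens consumed
    outputs: dict                                  # place -> tokens produced
    resets: set = field(default_factory=set)       # reset arcs: empty these places
    inhibitors: set = field(default_factory=set)   # inhibitor arcs: must be empty

def enabled(marking: dict, t: Transition) -> bool:
    # Enabled iff every input place holds enough tokens
    # and every inhibitor place is empty.
    return (all(marking.get(p, 0) >= n for p, n in t.inputs.items())
            and all(marking.get(p, 0) == 0 for p in t.inhibitors))

def fire(marking: dict, t: Transition) -> dict:
    assert enabled(marking, t)
    m = dict(marking)
    for p, n in t.inputs.items():
        m[p] -= n
    for p in t.resets:          # reset arcs remove ALL tokens, regardless of count
        m[p] = 0
    for p, n in t.outputs.items():
        m[p] = m.get(p, 0) + n
    return m

# 'cancel' drains the 'pending' place and is blocked while 'lock' holds a token.
cancel = Transition(inputs={"trigger": 1}, outputs={"done": 1},
                    resets={"pending"}, inhibitors={"lock"})
print(fire({"trigger": 1, "pending": 3, "lock": 0}, cancel))
# -> {'trigger': 0, 'pending': 0, 'lock': 0, 'done': 1}
```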

Relevance: 20.00%

Abstract:

Each year, The Australian Centre for Philanthropy and Nonprofit Studies (CPNS) at Queensland University of Technology (QUT) collects and analyses statistics on the amount and extent of tax-deductible donations made by Australian individuals to deductible gift recipients (DGRs) and claimed in their individual income tax returns. The information presented here is based on the amount and type of tax-deductible donations made and claimed by Australian individual taxpayers for the period 1 July 2008 to 30 June 2009. This information has been extracted mainly from the Australian Taxation Office's (ATO) publication Taxation Statistics 2008-09, the latest report that has been made publicly available. It represents information in tax returns for the 2008-09 year processed by the ATO as at 31 October 2010.

Relevance: 20.00%

Abstract:

This thesis develops, applies and analyses a collaborative design methodology for branding a tourism destination. The area between the Northern Tablelands and the Mid-North Coast of New South Wales, Australia, was used as a case study for this research. The study applies theoretical concepts of systems thinking and complexity to the real world, and tests the use of design as a social tool to engage multiple stakeholders in planning. In this research I acknowledge that places (and destinations) are socially constructed through people's interactions with their physical and social environments. This study explores a methodology that is explicit about the uncertainties of the destination's system, and that helps to elicit knowledge and system trends. The collective design process used the creation of brand concepts, elements and strategies as instruments to directly engage stakeholders in reflecting on their places and the issues related to tourism activity in the region. The methods applied included individual conversations and collaborative design sessions to elicit knowledge from local stakeholders. Concept maps were used to register and interpret information released throughout the process. An important aspect of the methodology was to bring together different stakeholder groups and translate the information into a common language understandable by all participants. This work helped release significant information as to what kind of tourism activity local stakeholders are prepared to receive and support. It also helped the emergence of a more unified regional identity. The outcomes delivered by the project (brand, communication material and strategies) were of high quality and in line with the desires and expectations of the local hosts. The process also reinforced the local sense of pride, belonging and conservation. Furthermore, interaction between participants from different parts of the region triggered some self-organising activity around the brand they created together. A major contribution of the present work is the articulation of an inclusive methodology to facilitate the involvement of locals in the decision-making process related to tourism planning. Of particular significance is the focus on the social construction of meaning in and through design, showing that design exercises can have significant social impact – not only on the final product, but also on the realities of the people involved in the creative process.

Relevance: 20.00%

Abstract:

Expert elicitation is the process of determining what expert knowledge is relevant to support a quantitative analysis and then eliciting this information in a form that supports analysis or decision-making. The credibility of the overall analysis therefore relies on the credibility of the elicited knowledge. This, in turn, is determined by the rigor of the design and execution of the elicitation methodology, as well as by its clear communication to ensure transparency and repeatability. It is difficult to establish rigor when the elicitation methods are not documented, as often occurs in ecological research. In this chapter, we describe software that can be combined with a well-structured elicitation process to improve the rigor of expert elicitation and the documentation of its results.
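
As a hedged illustration of what documented, repeatable elicitation might look like in practice (this is not the chapter's software; the record format, weights, and pooling rule are assumptions), consider recording each judgment with its provenance and combining the panel with a linear opinion pool:

```python
# Hedged sketch: expert judgments with provenance, pooled by weighted average.
from dataclasses import dataclass
from datetime import date

@dataclass
class Judgment:
    expert: str
    estimate: float   # elicited probability of the event of interest
    weight: float     # calibration-based weight (assumed given)
    rationale: str    # documented so the elicitation is transparent and repeatable
    elicited_on: date

def linear_opinion_pool(judgments: list[Judgment]) -> float:
    """Weighted average of the experts' probability estimates."""
    total = sum(j.weight for j in judgments)
    return sum(j.weight * j.estimate for j in judgments) / total

panel = [
    Judgment("expert_A", 0.30, 1.0, "long-term field observations", date(2009, 5, 1)),
    Judgment("expert_B", 0.50, 0.5, "extrapolated from a related species", date(2009, 5, 3)),
]
print(f"pooled estimate: {linear_opinion_pool(panel):.2f}")  # 0.37
```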

Relevance: 20.00%

Abstract:

This panel discusses the impact of Green IT on information systems and how information systems can meet environmental challenges and ensure sustainability. We wish to highlight the role of green business processes, and specifically the contributions that the management of these processes can make in leveraging the transformative power of IS to create an environmentally sustainable society. The management of business processes has typically been thought of in terms of business improvement along the dimensions of time, cost, quality, and flexibility – the so-called ‘devil’s quadrangle’. Contemporary organizations, however, are increasingly becoming aware of the need to create more sustainable, IT-enabled business processes that are also successful in terms of their economic, ecological, and social impact. Exemplary ecological key performance indicators that increasingly find their way onto the agenda of managers include carbon emissions, data center energy consumption, and renewable energy use (SAP 2010). The key challenge, therefore, is to extend the devil’s quadrangle to a devil’s pentagon, with sustainability as an important fifth dimension in process change.
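
One minimal way to picture the extended pentagon is as a five-dimensional process-performance record; the sketch below is purely illustrative (field names, units, and the comparison are assumptions, not any vendor's KPI schema):

```python
# Hedged sketch: the 'devil's pentagon' as a process-performance record.
from dataclasses import dataclass

@dataclass
class ProcessPerformance:
    time_hours: float          # cycle time
    cost_eur: float
    quality_defect_rate: float
    flexibility_score: float   # e.g. 0..1 rating of process variants supported
    # Fifth dimension: sustainability, e.g. carbon footprint per process instance.
    co2_kg_per_instance: float

def more_sustainable(a: ProcessPerformance, b: ProcessPerformance) -> bool:
    """Compare two process designs on the sustainability dimension alone."""
    return a.co2_kg_per_instance < b.co2_kg_per_instance

as_is = ProcessPerformance(48.0, 120.0, 0.02, 0.6, 5.4)
to_be = ProcessPerformance(40.0, 130.0, 0.02, 0.6, 3.1)
print(more_sustainable(to_be, as_is))  # True: the redesign trades cost for carbon
```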

Relevance: 20.00%

Abstract:

To detect and annotate the key events in live sports videos, we need to bridge the semantic gap in audio-visual information. Previous work has successfully extracted semantics from time-stamped web match reports, which are synchronized with the video contents. However, web and social media articles without time-stamps have not been fully leveraged, even though they are increasingly used to complement the coverage of major sporting tournaments. This paper addresses this limitation with a novel multimodal summarization framework based on sentiment analysis and players' popularity. It uses audio-visual content, web articles, blogs, and commentators' speech to automatically annotate and visualize the key events and key players in a sports tournament coverage. The experimental results demonstrate that the automatically generated video summaries are aligned with the events identified in the official website match reports.
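
A toy sketch of the scoring idea (not the paper's actual model; the weighting scheme and all values are invented) might rank candidate events by combining text sentiment with player popularity:

```python
# Hedged sketch: score candidate events by sentiment strength and popularity.
def event_score(sentiment: float, popularity: float,
                w_sent: float = 0.6, w_pop: float = 0.4) -> float:
    """sentiment in [-1, 1] from web/blog/commentary text; popularity in [0, 1]."""
    return w_sent * abs(sentiment) + w_pop * popularity

events = [
    {"minute": 23, "player": "player_X", "sentiment": 0.9, "popularity": 0.8},
    {"minute": 51, "player": "player_Y", "sentiment": -0.7, "popularity": 0.3},
    {"minute": 78, "player": "player_Z", "sentiment": 0.2, "popularity": 0.9},
]
ranked = sorted(events, key=lambda e: event_score(e["sentiment"], e["popularity"]),
                reverse=True)
for e in ranked[:2]:  # keep the top-scoring events for the video summary
    print(e["minute"], e["player"])
```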

Relevance: 20.00%

Abstract:

Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering which occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only had an impact on traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, Discrete-time Fourier, and Discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
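
As a hedged, minimal illustration of two of the Part I core topics, convolution (here, an FIR moving-average filter) and the Discrete Fourier Transform, in NumPy (all parameters are invented):

```python
# Hedged sketch: filter a noisy tone by convolution, then inspect its DFT.
import numpy as np

fs = 100                      # sampling rate (Hz), illustrative
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)  # noisy 5 Hz tone

h = np.ones(5) / 5            # 5-tap moving-average filter (a simple FIR low-pass)
y = np.convolve(x, h, mode="same")    # convolution: y[n] = sum_k h[k] x[n-k]

X = np.fft.fft(y)             # Discrete Fourier Transform of the filtered signal
freqs = np.fft.fftfreq(y.size, d=1 / fs)
peak = freqs[np.argmax(np.abs(X[: y.size // 2]))]
print(f"dominant frequency ≈ {peak:.1f} Hz")  # ≈ 5 Hz, the embedded tone
```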

Relevance: 20.00%

Abstract:

While recent research has provided valuable information on the composition of laser printer particles and their formation mechanisms, and has explained why some printers are emitters whilst others are low emitters, fundamental questions relating to the potential exposure of office workers remained unanswered. In particular: (i) what impact does the operation of laser printers have on the background particle number concentration (PNC) of an office environment over the duration of a typical working day?; (ii) what is the airborne particle exposure of office workers in the vicinity of laser printers?; (iii) what influence does the office ventilation have upon the transport and concentration of particles?; (iv) is there a need to control the generation and/or transport of particles arising from the operation of laser printers within an office environment?; and (v) what instrumentation and methodology are appropriate for characterising such particles within an office location? We present experimental evidence on temporal and spatial printer PNC during the operation of 107 laser printers within open-plan offices of five buildings. We show for the first time that the eight-hour time-weighted average printer particle exposure is significantly less than the eight-hour time-weighted local background particle exposure, but that peak printer particle exposure can be more than two orders of magnitude higher than local background particle exposure. The particle size range is predominantly ultrafine (< 100 nm diameter). In addition, we have established that office workers are constantly exposed to non-printer-derived particle concentrations, with up to an order of magnitude difference in such exposure amongst offices, and propose that such exposure be controlled along with exposure to printer-derived particles. We also propose, for the first time, that peak particle reference values be calculated for each office area, analogous to the criteria used in Australia and elsewhere for evaluating exposure excursions above occupational hazardous chemical exposure standards. A universal peak particle reference value of 2.0 × 10⁴ particles cm⁻³ has been proposed.
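
The two exposure metrics at the heart of the proposal, the eight-hour time-weighted average and a peak-excursion check, are straightforward to compute. The sketch below is illustrative (the sample concentrations are invented; the 2.0 × 10⁴ cm⁻³ reference is the value proposed above):

```python
# Hedged sketch: eight-hour TWA and a peak-excursion check for particle exposure.
def eight_hour_twa(samples: list[tuple[float, float]]) -> float:
    """samples: (duration_hours, concentration particles/cm^3); durations sum to 8 h."""
    assert abs(sum(d for d, _ in samples) - 8.0) < 1e-9
    return sum(d * c for d, c in samples) / 8.0

PEAK_REFERENCE = 2.0e4  # particles/cm^3, the proposed universal peak reference value

# Invented working day: quiet background, one print-heavy hour, quiet again.
day = [(6.5, 3.0e3), (1.0, 4.5e4), (0.5, 8.0e3)]
print(f"8-h TWA: {eight_hour_twa(day):.0f} particles/cm^3")          # 8562
print("peak excursion above reference:",
      any(c > PEAK_REFERENCE for _, c in day))                       # True
```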

Relevance: 20.00%

Abstract:

The use of bowling machines is common practice in cricket. In an ideal world all batters would face real bowlers in practice sessions, but this is not always possible, for many reasons. The clear advantage of using bowling machines is that they alleviate the workload required from bowlers (Dennis, Finch & Farhart, 2005) and provide relatively consistent and accurate ball delivery which may not otherwise be available to many young batters. Anecdotal evidence suggests that many, if not most, of the world’s greatest players use these methods within their training schedules. For example, Australian internationals Michael Hussey and Matthew Hayden have used bowling machines extensively (Hussey & Sygall, 2007). Bowling machines enable batters to practice for long periods, developing their endurance and concentration. However, despite these obvious benefits, in recent times the use of bowling machines has been questioned by sport scientists, coaches, ex-players and commentators. For example, Hussey’s batting coach comments “…we never went near a bowling machine in [Michael’s] first couple of years, I think there’s something to that …” (Hussey & Sygall, 2007, p. 119). This chapter will discuss the efficacy of using bowling machines with reference to research findings, before reporting new evidence that provides support for an alternative, innovative and possibly more representative practice design. Finally, the chapter will provide advice for coaches on the implications of this research, including a case study approach to demonstrate the practical use of such a design.

Relevance: 20.00%

Abstract:

Business practices vary from one company to another, and they often need to be changed in response to changes in the business environment. To satisfy different business practices, enterprise systems need to be customized; to keep up with ongoing business practice changes, they need to be adapted. Because of rigidity and complexity, the customization and adaptation of enterprise systems often take excessive time, with potential failures and budget shortfalls. Moreover, enterprise systems often hold business back because they cannot be rapidly adapted to support business practice changes. Extensive literature has addressed this issue by identifying success or failure factors, implementation approaches, and project management strategies. Those efforts were aimed at learning lessons from post-implementation experiences to help future projects. This research looks at the issue from a different angle. It attempts to address it by delivering a systematic method for developing flexible enterprise systems which can be easily tailored for different business practices or rapidly adapted when business practices change. First, this research examines the role of system models in the context of enterprise system development, and the relationship of system models with software programs in the contexts of computer-aided software engineering (CASE), model driven architecture (MDA) and workflow management systems (WfMS). Then, by applying the analogical reasoning method, this research initiates a concept of model driven enterprise systems. The novelty of model driven enterprise systems is that they extract system models from software programs and allow system models to stay independent of software programs. In the paradigm of model driven enterprise systems, system models act as instructors to guide and control the behavior of software programs; software programs function by interpreting the instructions in system models. This mechanism exposes the opportunity to tailor such a system by changing its system models. To make this possible, system models should be represented in a language which can be easily understood by human beings and effectively interpreted by computers. In this research, various semantic representations are investigated to support model driven enterprise systems. The significance of this research is 1) the transplantation of the successful structure for flexibility found in modern machines and WfMS to enterprise systems; and 2) the advancement of MDA by extending the role of system models from guiding system development to controlling system behaviors. This research contributes to the enterprise systems area from three perspectives: 1) a new paradigm of enterprise systems, in which enterprise systems consist of two essential elements, system models and software programs, which are loosely coupled and can exist independently; 2) semantic representations, which can effectively represent business entities, entity relationships, business logic and information processing logic in a semantic manner, and which are the key enabling techniques of model driven enterprise systems; and 3) a brand new role for system models: traditionally the role of system models is to guide developers in writing system source code; this research promotes the role of system models to controlling the behaviors of enterprise systems.
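
A toy sketch of the mechanism described above (the model format, step names, and interpreter are invented for illustration): the system model is plain, human-readable data kept separate from the program, and the program behaves by interpreting it, so changing the model re-tailors the system without changing code:

```python
# Hedged sketch: a system model as data, interpreted at run time by the program.
model = {
    "process": "approve_order",
    "steps": [
        {"action": "check",  "field": "amount", "max": 1000},
        {"action": "notify", "recipient": "sales"},
    ],
}

def run(model: dict, order: dict) -> None:
    # The program holds no hard-coded business logic: editing the model above
    # (e.g. a new 'max') changes system behavior without touching this code.
    for step in model["steps"]:
        if step["action"] == "check":
            if order[step["field"]] > step["max"]:
                raise ValueError(f"{step['field']} exceeds {step['max']}")
        elif step["action"] == "notify":
            print(f"notify {step['recipient']}: order {order['id']} approved")

run(model, {"id": 42, "amount": 250})  # -> notify sales: order 42 approved
```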

Relevance: 20.00%

Abstract:

In this review piece, we survey the literature on the cost of equity capital implications of corporate disclosure and conservative accounting policy choice decisions, with the principal objective of providing insights into the design and methodological issues which underlie the empirical investigations. We begin with a review of the analytical studies most typically cited in the empirical research as providing a theoretical foundation. We then turn to the literature that offers insights into the selection of proxies for each of our constructs of interest: cost of equity capital, disclosure quality and accounting conservatism. As a final step, we review selected empirical studies to illustrate the relevant evidence found within the literature. Based on our review, we interpret the literature as providing the researcher with only limited direct guidance on the appropriate choice of measure for each of the constructs of interest. Further, we view the literature as raising questions about both the interpretation of empirical findings in the face of measurement concerns and the suitability of certain theoretical arguments to the research setting. Overall, perhaps the clearest message is that one of the most controversial and fundamental issues underlying the literature is the diversifiability or non-diversifiability of information effects.

Relevance: 20.00%

Abstract:

Aims. This article is a report of a study conducted to identify how renal nurses experience information about renal care and the information practices they use to support everyday practice. Background. What counts as nursing knowledge remains a contested area in the discipline, yet little research has been undertaken. Information practice encompasses a range of activities such as the seeking, evaluation and sharing of information. The ability to make informed judgements depends on nurses being able to identify relevant sources of information that inform their practice, and those sources of information may enable the identification of what knowledge is important to nursing practice. Method. The study was philosophically framed from a practice perspective, informed by Habermas and Schatzki, and employed qualitative research techniques. Using purposive sampling, six registered nurses working in two regional renal units were interviewed during 2009, and the data were thematically analysed. Findings. The information practices of renal nurses involved mapping an information landscape in which they drew on information obtained from epistemic, social and corporeal sources. They also used coupling, a process of drawing together information from a range of sources, to enable them to practice. Conclusion. Exploring how nurses engage with information, and the role information plays in situating and enacting epistemic, social and corporeal knowledge in everyday nursing practice, is instructive because it indicates that nurses must engage with all three modalities in order to perform effectively, efficiently and holistically in the context of patient care.

Relevance: 20.00%

Abstract:

Robust, affine-covariant feature extractors provide a means to extract correspondences between images captured by widely separated cameras. Advances in wide-baseline correspondence extraction require looking beyond the robust feature extraction and matching approach. This study examines new techniques for extracting correspondences that take advantage of the information contained in affine feature matches. Methods of improving the accuracy of a set of putative matches, eliminating incorrect matches, and extracting large numbers of additional correspondences are explored. It is assumed that knowledge of the camera geometry is not available and not immediately recoverable. The new techniques are evaluated by means of an epipolar geometry estimation task. It is shown that these methods enable the computation of camera geometry in many cases where existing feature extractors cannot produce sufficient numbers of accurate correspondences.
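
For a rough picture of the evaluation task mentioned above, a hedged sketch using stock OpenCV components (plain SIFT plus ratio-test matching, not the study's own techniques; filenames and thresholds are illustrative) estimates epipolar geometry from putative matches:

```python
# Hedged sketch: putative feature matches feeding a RANSAC fundamental-matrix
# (epipolar geometry) estimate.
import cv2
import numpy as np

img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)  # illustrative filenames
img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Ratio-test matching yields the set of putative correspondences.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# RANSAC rejects incorrect matches while estimating the fundamental matrix F.
F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
print(f"{int(inlier_mask.sum())} of {len(good)} putative matches are inliers")
```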