962 results for Standard information
Abstract:
The article focuses on evidence-based information practice (EBIP) as applied at the Auraria Library in Denver, Colorado, during the reorganization of its technical services division. The reorganization and redefinition of workflows established collaboration processes across the division. Several factors shaped the redefinition of roles, including personal interests, department needs, and library needs. A collaborative EBIP environment was created in the division by addressing workplace hierarchies, distributing problem solving, and encouraging reflective dialogue.
'Information in context': co-designing workplace structures and systems for organizational learning
Abstract:
With the aim of advancing professional practice through a better understanding of how to create workplace contexts that cultivate individual and collective learning through situated 'information in context' experiences, this paper presents insights gained from three North American collaborative design (co-design) implementations. In the current project at the Auraria Library in Denver, Colorado, USA, participants use collaborative information practices to redesign face-to-face and technology-enabled communication, decision making, and planning systems. Design processes and results to date are described within an appreciative framework which values information sharing and enables knowledge creation through shared leadership.
Abstract:
The price formation of financial assets is a complex process. It extends beyond the standard economic paradigm of supply and demand to an understanding of the dynamic behavior of price variability, the price impact of information, and the implications of market participants' trading behavior for prices. In this thesis, I study aggregate market and individual asset volatility, liquidity dimensions, and causes of mispricing for US equities over a recent sample period. How should volatility forecasts be modeled? What determines intradaily jumps, what causes changes in intradaily volatility, and what drives the premium of traded equity indexes? Are they induced, for example, by the information content of lagged volatility and return parameters, or by macroeconomic news and changes in liquidity and volatility? Besides satisfying our intellectual curiosity, answers to these questions are of direct importance to investors developing trading strategies, to policy makers evaluating macroeconomic policies, and to arbitrageurs exploiting mispricing in exchange-traded funds. Results show that the leverage effect and lagged absolute returns improve forecasts of the continuous components of daily realized volatility as well as of jumps. Implied volatility does not subsume the information content of lagged returns in forecasting realized volatility and its components. The reported results are linked to the heterogeneous market hypothesis and demonstrate the validity of extending the hypothesis to returns. Depth shocks, signed order flow, the number of trades, and resiliency are the most important determinants of intradaily volatility. In contrast, spread shocks and resiliency are predictive of signed intradaily jumps. Fewer macroeconomic news announcement surprises cause extreme price movements or jumps than elevate intradaily volatility. Finally, the premium of exchange-traded funds is significantly associated with momentum in net asset value and with a number of liquidity parameters, including the spread, traded volume, and illiquidity. The mispricing of industry exchange-traded funds suggests that limits to arbitrage are driven by potential illiquidity.
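The forecasting setup described here is in the spirit of heterogeneous-market (HAR-RV) models: next-day realized volatility is regressed on daily, weekly, and monthly volatility averages, augmented with a lagged-return leverage term. Below is a minimal sketch of such a regression; the lookback windows, the leverage proxy, and the use of plain least squares are illustrative assumptions, not the thesis's exact specification.

```python
import numpy as np

def har_rv_forecast(rv, ret):
    """Fit a HAR-RV regression augmented with a leverage term.

    rv  : 1-D array of daily realized volatility
    ret : 1-D array of daily returns (same length as rv)
    Returns the OLS coefficients and the in-sample fitted values.
    """
    # HAR components: yesterday's RV and rolling weekly/monthly averages
    rv_d = rv[21:-1]
    rv_w = np.array([rv[t - 5:t].mean() for t in range(22, len(rv))])
    rv_m = np.array([rv[t - 22:t].mean() for t in range(22, len(rv))])
    # Leverage proxy: magnitude of yesterday's negative return
    lev = np.abs(np.minimum(ret[21:-1], 0.0))
    y = rv[22:]  # next-day target

    X = np.column_stack([np.ones_like(rv_d), rv_d, rv_w, rv_m, lev])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, X @ beta
```

Implied-volatility or macro-announcement regressors could be appended as extra columns of X to test whether they subsume the information in the lagged terms.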
Abstract:
Focuses on various aspects of advances in future information and communication technology and its applications. Presents the latest issues and progress in the area of future information and communication technology. Applicable to both researchers and professionals. These proceedings are based on the 2013 International Conference on Future Information & Communication Engineering (ICFICE 2013), held in Shenyang, China, from June 24-26, 2013. The conference was open to participants from all over the world, with participation from the Asia-Pacific region particularly encouraged. The focus of the conference is on all technical aspects of electronics, information, and communications. ICFICE-13 provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of FICE, and publishes high-quality papers closely related to the various theories and practical applications in FICE. It is expected that the conference and its publications will be a trigger for further related research and technology improvements in this important subject. "This work was supported by the NIPA (National IT Industry Promotion Agency) of Korea Grant funded by the Korean Government (Ministry of Science, ICT & Future Planning)."
Abstract:
Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the 'gold standard' for predicting dose deposition in the patient [1]. This project has three main aims:
1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine.
2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine.
3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP).
The work presented here addresses the first two aims.
Methods: (1a) Plan importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field.
(1b) Dose simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient.
(2) Dose comparison: TPS dose calculations can be obtained either using a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are spatial-resolution independent and able to interpolate for comparisons.
Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes.
Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant. Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
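The gamma evaluation mentioned in the dose comparison step combines a dose-difference criterion with a distance-to-agreement criterion into a single index per point, with gamma <= 1 counted as a pass. The sketch below shows the idea on a 1-D profile; the 3%/3 mm criteria, global normalisation, and brute-force search are illustrative assumptions, whereas the implementations described above are resolution independent and interpolate between points.

```python
import numpy as np

def gamma_index_1d(dose_ref, dose_eval, spacing_mm,
                   dose_tol=0.03, dist_tol_mm=3.0):
    """Brute-force 1-D gamma evaluation of two dose profiles.

    dose_ref, dose_eval : dose profiles sampled on the same grid (Gy)
    spacing_mm          : grid spacing in millimetres
    """
    x = np.arange(len(dose_ref)) * spacing_mm
    d_max = dose_ref.max()  # global dose normalisation
    gamma = np.empty(len(dose_ref))
    for i in range(len(dose_ref)):
        # Squared distance and dose-difference terms against every point
        dist2 = ((x - x[i]) / dist_tol_mm) ** 2
        dose2 = ((dose_eval - dose_ref[i]) / (dose_tol * d_max)) ** 2
        gamma[i] = np.sqrt((dist2 + dose2).min())
    return gamma
```

Extending the search to 3D grids, with interpolation between voxels, yields the resolution-independent comparison behaviour the abstract describes.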
Abstract:
In Angus v Conelius [2007] QCA 190 the Queensland Court of Appeal concluded that the obligations under the Motor Accident Insurance Act 1994 (Qld), and in particular s 45 of the Act (the duty of the claimant to cooperate with the insurer), continue beyond the commencement of court proceedings.
Abstract:
The latest paradigm shift in government, termed Transformational Government, puts the citizen in the centre of attention. Including citizens in the design of online one-stop portals can help governmental organisations become more customer-focussed. This study describes the initial efforts of an Australian state government to develop an information architecture to structure the content of its future one-stop portal. To this end, card sorting exercises were conducted and analysed, utilising contemporary approaches found in academic and non-scientific literature. This paper describes the findings of the card sorting exercises in this particular case and discusses the suitability of the applied approaches in general; these are distinguished into non-statistical, statistical, and hybrid approaches. Thus, on the one hand, this paper contributes to academia by describing the application of different card sorting approaches and discussing their strengths and weaknesses. On the other hand, it contributes to practice by explaining the approach taken by the authors' research partner to develop a customer-focussed governmental one-stop portal, providing decision support for practitioners with regard to the different analysis methods that can be used to complement recent approaches in Transformational Government.
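A common statistical treatment of card-sort data, of the kind surveyed here, is to build a card-by-card co-occurrence matrix (how often two cards land in the same group) and feed its complement into hierarchical clustering; the resulting dendrogram suggests candidate categories for the portal's information architecture. A minimal sketch with invented sorting data follows; the paper's non-statistical and hybrid analyses go beyond this.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Each participant's sort: groups of card ids (invented for illustration).
sorts = [
    [[0, 1], [2, 3, 4]],
    [[0, 1, 2], [3, 4]],
    [[0, 1], [2], [3, 4]],
]
n_cards = 5

# Co-occurrence: fraction of participants grouping each pair together.
co = np.zeros((n_cards, n_cards))
for sort in sorts:
    for group in sort:
        for a in group:
            for b in group:
                co[a, b] += 1
co /= len(sorts)

# Distance = 1 - similarity; average-linkage hierarchical clustering.
tree = linkage(squareform(1.0 - co, checks=False), method="average")
print(fcluster(tree, t=2, criterion="maxclust"))  # two candidate categories
```

Cutting the dendrogram at different heights gives coarser or finer category schemes, which can then be checked against the non-statistical groupings.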
Abstract:
This paper presents a hybrid framework of Swedish cultural practices and Australian grounded theory for organizational development and suggests practical strategies for 'working smarter' in 21st-century libraries. Toward that end, reflective evidence-based practices are offered to incrementally build organizational capacity for asking good questions, selecting authoritative sources, evaluating multiple perspectives, organizing emerging insights, and communicating them to inform, educate, and influence. In addition, to ensure the robust information exchange necessary for collective workplace learning, leadership traits are proposed for ensuring inclusive communication, decision making, and planning processes. These findings emerge from action research projects conducted from 2003 to 2008 in two North American libraries.
Abstract:
As the importance of information literacy has gained increased recognition, so too have academic library professionals intensified their efforts to champion, activate, and advance these capabilities in others. To date, however, little attention has focused on advancing these essential competencies amongst practitioner advocates. This paper helps redress the paucity of professional literature on the topic of workplace information literacy among library professionals.
Abstract:
The use of intelligent transport systems is proliferating across the Australian road network, particularly on major freeways. New technology allows a greater range of signs and messages to be displayed to drivers. While there has been a long history of human factors analyses of signage, no evaluation has been conducted on this novel, sometimes dynamic, signage or on potential interactions when signs are co-located. The purpose of this driving simulator study was to investigate drivers' behavioural changes and comprehension resulting from the co-location of Lane Use Management Systems with static signs and (Enhanced) Variable Message Signs on Queensland motorways. A section of motorway was simulated, and nine scenarios were developed which presented a combination of signage cases across levels of driving task complexity. Two higher-risk road user groups were targeted for this research on an advanced driving simulator: older (65+ years, N=21) and younger (18-22 years, N=20) drivers. Changes in sign co-location and task complexity had a small effect on driver comprehension of the signs and on vehicle dynamics variables, including the difference from the posted speed limit, headway, standard deviation of lane keeping, and brake jerks. However, increasing the amount of information provided to drivers at a given location (by co-locating several signs) increased participants' gaze duration on the signs. With co-location of signs and without added task complexity, a single gaze exceeded 2 s for more than half of the population tested in both groups, and reached up to 6 s for some individuals.
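The vehicle-dynamics measures named in this abstract are derived from logged simulator time series. The sketch below computes plausible versions of them under assumed log fields, sampling rate, and a brake-jerk threshold; the study's exact operational definitions are not given in the abstract.

```python
import numpy as np

def driving_metrics(speed_kmh, lane_pos_m, accel_ms2, lead_gap_m,
                    posted_kmh=100.0, hz=60.0):
    """Summary measures for one simulated drive (assumed 60 Hz log)."""
    speed_ms = speed_kmh / 3.6
    return {
        # Mean difference from the posted speed limit
        "speed_diff_kmh": float(np.mean(speed_kmh - posted_kmh)),
        # Mean time headway to the lead vehicle (s)
        "mean_headway_s": float(np.mean(lead_gap_m / np.maximum(speed_ms, 0.1))),
        # Standard deviation of lateral lane position (lane keeping)
        "sd_lane_keeping_m": float(np.std(lane_pos_m)),
        # Brake jerks: sharp changes in longitudinal acceleration
        # (the 10 m/s^3 threshold is illustrative only)
        "brake_jerks": int(np.sum(np.abs(np.diff(accel_ms2)) * hz > 10.0)),
    }
```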
Abstract:
This study presents a segmentation pipeline that fuses colour and depth information to automatically separate objects of interest in video sequences captured from a quadcopter. Many approaches assume that cameras are static with known position, a condition which cannot be preserved in most outdoor robotic applications. In this study, the authors compute depth information and camera positions from a monocular video sequence using structure from motion and use this information as an additional cue to colour for accurate segmentation. As in standard segmentation routines, the authors model the problem as a Markov random field and perform the segmentation using graph cuts optimisation. Manual intervention is minimised and is only required to determine pixel seeds in the first frame, which are then automatically reprojected into the remaining frames of the sequence. The authors also describe an automated method to adjust the relative weights for colour and depth according to their discriminative properties in each frame. Experimental results are presented for two video sequences captured using a quadcopter. The quality of the segmentation is compared to a ground truth and to other state-of-the-art methods, with consistently accurate results.
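In a graph-cuts formulation like the one described, colour and depth each supply a per-pixel unary cost, blended by the per-frame weight, while a smoothness term links neighbouring pixels; the minimum cut then yields the binary mask. A minimal sketch using the PyMaxflow library follows; the fixed smoothness weight and the precomputed negative log-likelihood costs are simplifying assumptions.

```python
import numpy as np
import maxflow  # PyMaxflow

def segment_frame(colour_cost, depth_cost, w_colour=0.6, smooth=1.5):
    """Binary MRF segmentation fusing colour and depth unary costs.

    colour_cost, depth_cost : H x W x 2 arrays of negative log-likelihoods
                              for [background, foreground]
    w_colour                : per-frame blend weight for the colour cue
    """
    unary = w_colour * colour_cost + (1.0 - w_colour) * depth_cost

    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(unary.shape[:2])
    # 4-connected Potts smoothness term between neighbouring pixels
    g.add_grid_edges(nodes, smooth)
    # Terminal edges carry the blended unary costs
    g.add_grid_tedges(nodes, unary[..., 1], unary[..., 0])
    g.maxflow()
    return g.get_grid_segments(nodes)  # boolean foreground mask
```

In the approach described above, w_colour would come from the automated per-frame estimate of each cue's discriminative power rather than being fixed, and the unary costs would be seeded from the reprojected first-frame pixel seeds.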
Abstract:
In this paper, we present an unsupervised graph cut based object segmentation method using 3D information provided by Structure from Motion (SFM), called GrabCutSFM. Rather than focusing on the segmentation problem using a trained model or human intervention, our approach aims to achieve meaningful segmentation autonomously, with direct application to vision-based robotics. Generally, the object (foreground) and the background have certain discriminative geometric information in 3D space. By exploring the 3D information from multiple views, our proposed method can segment potential objects correctly and automatically, compared to conventional unsupervised segmentation using only 2D visual cues. Experiments with real video data collected from indoor and outdoor environments verify the proposed approach.
Abstract:
I believe that studies of men's gendered experiences of information systems are needed. In order to support this claim, I introduce the area of Masculinity Studies to Information Systems research and, using this, present an exploratory analysis of an internet dating website for gay men – Gaydar. The information system, which forms part of the Gaydar community, is shown to shape, and be shaped by, the members as they accept and challenge aspects of it as related to their identities. In doing this, I show how the intertwined processes of information systems development and use contribute to the creation of diverse interpretations of masculinity within a group of men. In sum, my analysis highlights different kinds of men and different versions of masculinity that can sometimes be associated with different experiences of information systems. The implications of this work centre on the need to expand our knowledge of men's gendered experiences with information systems, to reflect upon processes of technology-facilitated categorisation, and to consider the influences that contribute to the roll-out of particular software features, along with the underlying rationales for market segmentation in the software and software-based services industries.
Abstract:
The papers in this issue focus our attention on packaged software as an increasingly important, but still relatively poorly understood, phenomenon in the information systems research community. The topic is not new: Lucas et al. (1988) wrote a provocative piece focused on the issues with implementing packaged software. A decade later, Carmel (1997) argued that packaged software was both ideally suited for American entrepreneurial activity and rapidly growing. The information systems research community, however, has moved more slowly to engage this change (e.g., Sawyer, 2001). The papers in this special issue represent a significant step in better engaging the issues of packaged software relative to information systems research, and in highlighting opportunities for additional relevant research.
Abstract:
In an attempt to deal with the potential problems presented by existing information systems, there has been a shift towards the implementation of ERP packages. The common view, particularly the one espoused by vendors, is that ERP packages are most successfully implemented when the standard model is adopted. Yet, despite this, customisation activity still occurs, reportedly due to misalignment between the functionality of the package and the requirements of those in the implementing organisation. However, it is recognised that systems development and organisational decision-making are activities influenced by the perspectives of the various groups and individuals involved in the process. Thus, as customisation can be seen as part of systems development, and has to be decided upon, it should be thought about in the same way. In this study, two ERP projects are used to examine different reasons why customisation might take place. These reasons are then built upon through reference to the ERP and more general packaged software literature. The study suggests that whilst a common reason for customising ERP packages might be functionality misfits, it is important to look further into why these occur, as there are clearly other reasons for customisation stemming from the multiplicity of social groups involved in the process.