Abstract:
Early detection surveillance programs aim to find invasions of exotic plant pests and diseases before they are too widespread to eradicate. However, the value of these programs can be difficult to justify when no positive detections are made. To demonstrate the value of pest absence information provided by these programs, we use a hierarchical Bayesian framework to model estimates of incursion extent with and without surveillance. A model for the latent invasion process provides the baseline against which surveillance data are assessed. Ecological knowledge and pest management criteria are introduced into the model using informative priors for invasion parameters. Observation models assimilate information from spatio-temporal presence/absence data to accommodate imperfect detection and generate posterior estimates of pest extent. When applied to an early detection program operating in Queensland, Australia, the framework demonstrates that this typical surveillance regime provides a modest reduction in the estimated probability that a surveyed district is infested. More importantly, the model suggests that early detection surveillance programs can provide a dramatic reduction in the putative area of incursion and therefore offer a substantial benefit to incursion management. By mapping spatial estimates of the point probability of infestation, the model identifies where future surveillance resources can be most effectively deployed.
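A minimal Python sketch of the posterior update that underlies this kind of absence-of-detection reasoning; the prior, detection sensitivity and survey count below are illustrative assumptions, not values from the study:

# Posterior probability that a district is infested after n surveys with no
# detections, assuming independent surveys with imperfect detection.
# All parameter values below are hypothetical.
def posterior_infested(prior: float, sensitivity: float, n_negative: int) -> float:
    # Negative surveys arise either from missed detections (infested) or
    # from true absence; combine the two via Bayes' rule.
    p_all_missed = (1.0 - sensitivity) ** n_negative
    numerator = prior * p_all_missed
    return numerator / (numerator + (1.0 - prior))

# Four negative surveys shrink a 5% prior to roughly 1.2%.
print(posterior_infested(prior=0.05, sensitivity=0.3, n_negative=4))

Each negative survey yields only a modest update on its own, which is consistent with the abstract's observation that the per-district effect is small while the aggregate effect on the putative area of incursion can be large.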
Abstract:
Background: In the last decade, there has been increasing interest in the health effects of sedentary behavior, which is often assessed using self-report sitting-time questions. The aim of this qualitative study was to document older adults’ understanding of sitting-time questions from the International Physical Activity (PA) Questionnaire (IPAQ) and the PA Scale for the Elderly (PASE). Methods: Australian community-dwelling adults aged 65+ years answered the IPAQ and PASE sitting questions in face-to-face semi-structured interviews. IPAQ uses one open-ended question to assess sitting on a weekday in the last 7 days ‘at work, at home, while doing coursework and during leisure time’; PASE uses a three-part closed question about daily leisure-time sitting in the last 7 days. Participants expressed their thoughts out loud while answering each question. They were then probed about their responses. Interviews were recorded, transcribed and coded into themes. Results: Mean age of the 28 male and 27 female participants was 73 years (range 65-89). The most frequently reported activity was watching TV. For both questionnaires, many participants had difficulties understanding what activities to report. Some had difficulty understanding what activities should be classified as ‘leisure-time sitting’. Some assumed they were being asked to only report activities provided as examples. Most reported activities they normally do, rather than those performed on a day in the previous week. Participants used a variety of strategies to select ‘a day’ for which they reported their sitting activities and to calculate sitting time on that day. Therefore, many different ways of estimating sitting time were used. Participants had particular difficulty reporting their daily sitting time when their schedules were not consistent across days. Some participants declared the IPAQ sitting question too difficult to answer. Conclusion: The accuracy of older adults’ self-reported sitting time is questionable given the challenges they have in answering sitting-time questions. Their responses to sitting-time questions may be more accurate if our recommendations for clarifying the sitting domains, providing examples relevant to older adults and suggesting strategies for formulating responses are incorporated. Future quantitative studies should include objective criterion measures to assess validity and reliability of these questions.
Abstract:
This panel discusses the impact of Green IT on information systems and how information systems can meet environmental challenges and ensure sustainability. We wish to highlight the role of green business processes, and specifically the contributions that the management of these processes can make in leveraging the transformative power of IS to create an environmentally sustainable society. The management of business processes has typically been thought of in terms of business improvement along the dimensions of time, cost, quality, and flexibility – the so-called ‘devil’s quadrangle’. Contemporary organizations, however, are increasingly becoming aware of the need to create more sustainable, IT-enabled business processes that are also successful in terms of their economic, ecological, and social impact. Exemplary ecological key performance indicators that are increasingly finding their way onto managers’ agendas include carbon emissions, data center energy use, and renewable energy consumption (SAP 2010). The key challenge, therefore, is to extend the devil’s quadrangle to a devil’s pentagon, with sustainability as an important fifth dimension of process change.
Abstract:
While recent research has provided valuable information on the composition of laser printer particles and their formation mechanisms, and has explained why some printers are high emitters whilst others are low emitters, fundamental questions relating to the potential exposure of office workers remained unanswered. In particular: (i) what impact does the operation of laser printers have on the background particle number concentration (PNC) of an office environment over the duration of a typical working day? (ii) what is the airborne particle exposure of office workers in the vicinity of laser printers? (iii) what influence does the office ventilation have upon the transport and concentration of particles? (iv) is there a need to control the generation and/or transport of particles arising from the operation of laser printers within an office environment? (v) what instrumentation and methodology are relevant for characterising such particles within an office location? We present experimental evidence on temporal and spatial PNC during the operation of 107 laser printers within open-plan offices in five buildings. We show for the first time that the eight-hour time-weighted average printer particle exposure is significantly less than the eight-hour time-weighted local background particle exposure, but that peak printer particle exposure can be more than two orders of magnitude higher than local background particle exposure. The particle size range is predominantly ultrafine (< 100 nm diameter). In addition, we have established that office workers are constantly exposed to non-printer-derived particle concentrations, with up to an order of magnitude difference in such exposure amongst offices, and propose that this exposure be controlled along with exposure to printer-derived particles. We also propose, for the first time, that peak particle reference values be calculated for each office area, analogous to the criteria used in Australia and elsewhere for evaluating excursions above occupational hazardous chemical exposure standards. A universal peak particle reference value of 2.0 × 10⁴ particles cm⁻³ is proposed.
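As a rough illustration of the two exposure metrics contrasted above, the following Python sketch computes an eight-hour time-weighted average from hourly PNC readings and counts excursions above the proposed peak reference value; the sample series is invented:

# Hypothetical hourly mean PNC readings over an 8-hour shift (particles/cm^3).
pnc_samples = [8e3, 9e3, 1.2e4, 4.5e4, 7e3, 6e3, 2.5e5, 8e3]
PEAK_REFERENCE = 2.0e4  # particles/cm^3, the proposed universal reference value

# Equal one-hour intervals, so the time-weighted average is the plain mean.
twa = sum(pnc_samples) / len(pnc_samples)
excursions = [c for c in pnc_samples if c > PEAK_REFERENCE]

print(f"8-h TWA: {twa:.3g} particles/cm^3")              # 4.31e+04
print(f"Excursions above reference: {len(excursions)}")  # 2

The sketch shows how a shift can have a moderate TWA while individual peaks far exceed the reference value, which is why the abstract argues for an excursion criterion alongside the time-weighted average.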
Abstract:
Business practices vary from one company to another, and they often need to be changed in response to changing business environments. To satisfy different business practices, enterprise systems need to be customized; to keep up with ongoing changes in business practice, enterprise systems need to be adapted. Because of their rigidity and complexity, the customization and adaptation of enterprise systems often take excessive time, with potential failures and budget overruns. Moreover, enterprise systems often hold business back because they cannot be rapidly adapted to support changes in business practice. Extensive literature has addressed this issue by identifying success or failure factors, implementation approaches, and project management strategies; those efforts were aimed at learning lessons from post-implementation experiences to help future projects. This research looks at the issue from a different angle: it attempts to deliver a systematic method for developing flexible enterprise systems which can be easily tailored to different business practices or rapidly adapted when business practices change. First, this research examines the role of system models in the context of enterprise system development, and the relationship of system models to software programs in the contexts of computer-aided software engineering (CASE), model driven architecture (MDA) and workflow management systems (WfMS). Then, by applying the analogical reasoning method, this research proposes the concept of model driven enterprise systems. The novelty of model driven enterprise systems is that system models are extracted from software programs and kept independent of them. In the paradigm of model driven enterprise systems, system models act as instructions that guide and control the behavior of software programs; software programs function by interpreting the instructions in system models. This mechanism creates the opportunity to tailor such a system by changing its system models. To make this possible, system models should be represented in a language which can be easily understood by human beings and effectively interpreted by computers. In this research, various semantic representations are investigated to support model driven enterprise systems. The significance of this research is 1) the transplantation of the successful structure for flexibility in modern machines and WfMS to enterprise systems; and 2) the advancement of MDA by extending the role of system models from guiding system development to controlling system behavior. This research contributes to the enterprise systems area from three perspectives: 1) a new paradigm of enterprise systems, in which enterprise systems consist of two essential elements, system models and software programs, which are loosely coupled and can exist independently; 2) semantic representations, which can effectively represent business entities, entity relationships, business logic and information processing logic in a semantic manner, and which are the key enabling techniques of model driven enterprise systems; and 3) a brand new role for system models: traditionally, system models have guided developers in writing system source code; this research promotes the role of system models to controlling the behavior of enterprise systems.
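A toy Python sketch of the interpreter mechanism described above, in which the process model is data interpreted at run time rather than logic compiled into the program; the step names and handlers are invented for illustration:

# The system model is plain data: changing it changes system behavior
# without touching the software program that interprets it.
process_model = ["capture_order", "check_credit", "ship_goods"]

# Hypothetical handlers the interpreting program knows how to execute.
handlers = {
    "capture_order": lambda order: print("order captured:", order),
    "check_credit":  lambda order: print("credit checked:", order),
    "ship_goods":    lambda order: print("goods shipped:", order),
}

def run(model, order):
    # The program's behavior is dictated entirely by the model it interprets.
    for step in model:
        handlers[step](order)

run(process_model, "customer-42")

Tailoring the system to a new business practice then amounts to editing process_model, which is the flexibility the thesis attributes to keeping system models independent of software programs.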
Abstract:
Aims. This article is a report of a study undertaken to identify how renal nurses experience information about renal care and the information practices they use to support everyday practice. Background. What counts as nursing knowledge remains a contested area in the discipline, yet little research has been undertaken. Information practice encompasses a range of activities such as seeking, evaluating and sharing information. The ability to make informed judgements depends on nurses being able to identify relevant sources of information that inform their practice, and those sources may enable the identification of what knowledge is important to nursing practice. Method. The study was philosophically framed from a practice perspective, informed by Habermas and Schatzki, and employed qualitative research techniques. Using purposive sampling, six registered nurses working in two regional renal units were interviewed during 2009, and the data were thematically analysed. Findings. The information practices of renal nurses involved mapping an information landscape in which they drew on information obtained from epistemic, social and corporeal sources. They also used coupling, a process of drawing together information from a range of sources, to enable them to practise. Conclusion. Exploring how nurses engage with information, and the role that information plays in situating and enacting epistemic, social and corporeal knowledge in everyday nursing practice, is instructive because it indicates that nurses must engage with all three modalities in order to perform effectively, efficiently and holistically in the context of patient care.
Abstract:
In this thesis, I advance the understanding of information technology (IT) governance research and corporate governance research by considering the question “How do boards govern IT?” The importance of IT to business has increased over the last decade, but there has been little academic research focused on boards and their role in the governance of IT (Van Grembergen, De Haes and Guldentops, 2004). Most of the research on information technology governance (ITG) has focused on advancing the understanding and measurement of the components of the ITG model (Buckby, Best & Stewart, 2008; Wilkin & Chenhall, 2010), a model recommended by the IT Governance Institute (2003) as ‘best practice’ for boards to use in governing IT. IT governance is considered to be the responsibility of the board and is said to form an important subset of an organisation’s corporate governance processes (Borth & Bradley, 2008). Boards need to govern IT because of the large capital investment in IT resources and organisations’ high dependency on IT. Van Grembergen, De Haes and Guldentops (2004) and De Haes & Van Grembergen (2009) indicate that corporate governance responsibilities cannot be effectively discharged unless IT is governed properly, and call for further specific research on the role of the board in ITG. Researchers also indicate that the link between corporate governance and IT governance has been neglected (Borth & Bradley, 2008; Musson & Jordan, 2005; Bhattacharjya & Chang, 2008). This thesis addresses this gap in the ITG literature by providing a bridge between the ITG and corporate governance literatures. My thesis uses a critical realist epistemology and a mixed-method approach to gather insights into my research question. In the first phase of my research, I developed a survey instrument to assess whether boards consider the components of the ITG model in governing IT. The results of this first study indicated that directors do not conceptualise their role in governing IT using the elements of the ITG model. Thus, I moved to focus on whether prominent corporate governance theories might elucidate how boards govern IT. In the second phase of the research, I used a qualitative, inductive, case-based study to assess whether agency, stewardship and resource dependence theories explain how boards govern IT in Australian universities. As the first in-depth study of university IT governance processes, my research contributes to the ITG research field by revealing that Australian university board governance of IT is characterised by a combination of agency theory and stewardship theory behaviours and processes. The study also identified strong links between a university’s IT structure and evidence of agency and stewardship theories. This link provides insight into the structures element of the emerging enterprise governance of IT framework (Van Grembergen, De Haes & Guldentops, 2004; De Haes & Van Grembergen, 2009; Van Grembergen & De Haes, 2009b; Ko & Fink, 2010). My research makes an important contribution to governance research by identifying a key link between the corporate governance and ITG literatures and by providing insight into board IT governance processes. It should encourage future researchers to continue to explore the links between corporate and IT governance research.
Abstract:
Most web service discovery systems use keyword-based search algorithms and, although partially successful, sometimes fail to satisfy users’ information needs. This has given rise to several semantics-based approaches that seek to go beyond simple attribute matching and capture the semantics of services. However, the results reported in the literature vary and in many cases are worse than those obtained by keyword-based systems. We believe the accuracy of the mechanisms used to extract tokens from the non-natural-language sections of WSDL files directly affects the performance of these techniques, because some of them can be more sensitive to noise. In this paper, three existing tokenization algorithms are evaluated and a new algorithm is introduced that outperforms all those found in the literature.
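To make the tokenization problem concrete, here is a minimal Python baseline that splits delimiter-separated and camelCase identifiers of the kind found in WSDL operation and message names; it illustrates a common baseline approach, not the new algorithm the paper introduces:

import re

def tokenize(identifier: str) -> list[str]:
    # Split on explicit delimiters first, then on camelCase/PascalCase
    # boundaries, keeping runs of capitals (acronyms) together.
    tokens = []
    for part in re.split(r"[_\-\s]+", identifier):
        tokens += re.findall(r"[A-Z]+(?=[A-Z][a-z])|[A-Z]?[a-z]+|[A-Z]+|\d+", part)
    return [t.lower() for t in tokens if t]

print(tokenize("getStockQuoteHTTPResponse"))
# ['get', 'stock', 'quote', 'http', 'response']

Noise sensitivity enters when identifiers contain abbreviations or concatenations with no case or delimiter cues, which a rule-based splitter like this cannot segment.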
Abstract:
We demonstrate a modification of the algorithm of Dani et al. for the online linear optimization problem in the bandit setting, which allows us to achieve an $O(\sqrt{T \ln T})$ regret bound in high probability against an adaptive adversary, as opposed to the in-expectation result against an oblivious adversary of Dani et al. We obtain the same dependence on the dimension as that exhibited by Dani et al. The results of this paper rest firmly on those of Dani et al. and the remarkable technique of Auer et al. for obtaining high-probability bounds via optimistic estimates. This paper answers an open question: it eliminates the gap between the high-probability bounds obtained in the full-information and bandit settings.
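For reference, the quantity being bounded can be stated in standard form (this is the conventional regret definition for this setting, sketched here for clarity rather than quoted from the paper):

\[
R_T = \sum_{t=1}^{T} f_t(x_t) - \min_{x \in K} \sum_{t=1}^{T} f_t(x),
\]

where $K$ is the decision set, $f_t$ are the adversary's linear loss functions, and in the bandit setting only the scalar loss $f_t(x_t)$ is observed. The high-probability guarantee then has the form: with probability at least $1 - \delta$, $R_T = O(\sqrt{T \ln T})$, hiding the dependence on the dimension and on $\ln(1/\delta)$.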
Abstract:
Ocean processes are dynamic, complex, and occur on multiple spatial and temporal scales. To obtain a synoptic view of such processes, ocean scientists collect data over long time periods. Historically, measurements were continually provided by fixed sensors, e.g., moorings, or gathered from ships. Recently, an increase in the utilization of autonomous underwater vehicles has enabled a more dynamic data acquisition approach. However, we still do not utilize the full capabilities of these vehicles. Here we present algorithms that produce persistent monitoring missions for underwater vehicles by balancing path-following accuracy and sampling resolution for a given region of interest, addressing a pressing need among ocean scientists to efficiently and effectively collect high-value data. More specifically, this paper proposes a path planning algorithm and a speed control algorithm for underwater gliders, which together give informative trajectories for the glider to persistently monitor a patch of ocean. We optimize a cost function that blends two competing factors: maximizing the information value along the path while minimizing deviation from the planned path due to ocean currents. Speed is controlled along the planned path by adjusting the pitch angle of the underwater glider, so that higher-resolution samples are collected in areas of higher information value. The resulting paths are closed circuits that can be repeatedly traversed to collect long-term ocean data in dynamic environments. The algorithms were tested during sea trials on an underwater glider operating off the coast of southern California, as well as in Monterey Bay, California. The experimental results show significant improvements in data resolution and path reliability compared with previously executed sampling paths used in the respective regions.
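A compact Python sketch of the kind of blended objective described above, trading information value against expected deviation caused by currents; the weighting and both scoring functions are placeholders, not the authors' implementation:

def info_value(waypoint):
    # Placeholder science-value score at a waypoint, e.g., from an
    # ocean-model variance map; invented for illustration.
    x, y = waypoint
    return 1.0 / (1.0 + x * x + y * y)

def expected_deviation(a, b):
    # Placeholder predicted cross-track error on the leg a -> b under
    # forecast currents; invented for illustration.
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def path_cost(waypoints, alpha=0.5):
    # Blend the two competing factors; minimizing the cost maximizes
    # information value while limiting deviation from the planned path.
    info = sum(info_value(w) for w in waypoints)
    deviation = sum(expected_deviation(a, b) for a, b in zip(waypoints, waypoints[1:]))
    return -alpha * info + (1.0 - alpha) * deviation

print(path_cost([(0, 0), (1, 0), (1, 1)], alpha=0.6))

A planner would search over candidate closed circuits for the one minimizing path_cost, with alpha tuned to how strongly sampling resolution is favored over path reliability.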
Abstract:
Late last year, teaching staff at Griffith University participated in a symposium entitled 'Spotlight on Generic Skills & Flexible Learning'. This event brought together academic staff as well as library staff, learning advisers and other support staff interested in teaching and learning issues. The discussion was based on the premise that the University has a responsibility to ensure that its courses emphasise broad educational values and 'produce highly sought after graduates with globally applicable skills for the international market' (1). It was acknowledged that the University consistently scores very highly with graduates for its development of generic skills. At the same time, however, staff expressed concern at the challenge of developing more flexible, student-centred learning environments that have generic skills embedded across all programs (2). As a result, there has been much debate in the University about which skills are important, how they will be acquired and how they can effectively be built into the curriculum. One outcome of these discussions is the project described in this paper. What follows is an overview of the project, a discussion of the integration and development of information literacy as a generic attribute in the curriculum, and some suggestions on ways forward.