961 results for Information technology.


Relevance: 60.00%

Publisher:

Abstract:

Every university in Australia has a set of policies that guide the institution in its educational practices; however, these policies are often developed in isolation from each other. Now imagine a space where policies are evidence-based, refined annually, cohesively interrelated, and meet stakeholders’ needs. Is this happenstance or the result of good planning? Culturally, Queensland University of Technology (QUT) is a risk-averse institution that takes pride in its financial solvency and is always keen to know “how are we going?” With a twenty-year history of annual reporting that assures the quality of course performance through multiple lines of evidence, QUT’s Learning and Teaching Unit went one step further and strategically aligned a suite of policies that take stakeholders’ needs into consideration, draw on collaboration with other areas across the institution, and use multiple lines of evidence to inform curriculum decision-making. In QUT’s experience, strategic planning can lead to policy that is designed to meet stakeholders’ needs, not manage them; where decision-making is supported by evidence, not rhetoric; where all feedback is incorporated, not ignored; and where policies are cohesively interrelated, not isolated. While many may call this ‘policy nirvana’, QUT has positioned itself to demonstrate good educational practice through Reframe, its evaluation framework. In this case, best practice was achieved through the application of a theory of change and a design-led logic model that allows the framework to transition to other institutions with different cultural specificities. The evaluation approach follows Seldin’s (2003) notion of offering depth and breadth in the evaluation framework, along with Berk’s (2005) concept of multiple lines of evidence. In summary, this paper offers university executives, academics, and planning and quality staff an opportunity to understand the critical steps that lead to the strategic planning and design of evidence-based educational policy that positions a university for best practice in learning and teaching.

Relevance: 60.00%

Publisher:

Abstract:

In recent years, multidimensional data have attracted increasing attention from researchers seeking to build better recommender systems. Additional metadata gives algorithms more detail for understanding the interaction between users and items. While neighbourhood-based Collaborative Filtering (CF) approaches and latent factor models each tackle this task effectively, they utilize only partial structures of the data. In this paper, we delve into the different types of relations in the data to understand the interaction between users and items more holistically. We propose a generic multidimensional CF fusion approach for top-N item recommendations. The proposed approach is capable of incorporating not only localized user-user and item-item relations but also latent interactions between all dimensions of the data. Experimental results show significant improvements by the proposed approach in terms of recommendation accuracy.
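
The paper's fusion mechanics are not spelled out here, so the following is only a minimal sketch of the general idea: combining user-user neighbourhood scores, item-item neighbourhood scores, and a low-rank latent factor model into one fused top-N ranking. The weights, similarity measures, and function names are all assumptions made for illustration.

    # Hypothetical sketch: fusing neighbourhood (user-user, item-item) and
    # latent-factor scores for top-N recommendation. Weights are assumed.
    import numpy as np

    def top_n_fused(R, user, n=10, weights=(0.4, 0.4, 0.2), k=20):
        user_sim = np.corrcoef(R)                     # user-user similarity
        item_sim = np.corrcoef(R.T)                   # item-item similarity
        U, s, Vt = np.linalg.svd(R, full_matrices=False)
        latent = (U[:, :k] * s[:k]) @ Vt[:k, :]       # low-rank latent scores

        score = (weights[0] * (user_sim[user] @ R)    # user neighbourhood
                 + weights[1] * (R[user] @ item_sim)  # item neighbourhood
                 + weights[2] * latent[user])         # latent interaction
        score[R[user] > 0] = -np.inf                  # hide already-seen items
        return np.argsort(score)[::-1][:n]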

Relevance: 60.00%

Publisher:

Abstract:

Industrial control systems (ICS) have been moving from dedicated communications to switched and routed corporate networks, with the result that these devices are increasingly exposed to the Internet. Many ICS were designed with few or weak security features, leaving them vulnerable to attack. Recently, several tools capable of scanning the entire Internet have been developed, including ZMap, Masscan and Shodan. However, little in-depth analysis has been done to compare these Internet-wide scanning techniques, and few Internet-wide scans have been conducted that target ICS devices and protocols. In this paper we present a taxonomy of Internet-wide scanning, a comparison of three popular network scanning tools, and a framework for conducting Internet-wide scans.
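
For a sense of what these tools surface, the snippet below is a small, hedged illustration of querying Shodan (one of the tools compared) for hosts exposing Modbus/TCP, a common ICS protocol on port 502. The API key is a placeholder and the query is only an example, not the paper's scanning framework.

    # Illustrative only: passive lookup of Internet-exposed ICS endpoints via
    # the Shodan API. Requires the `shodan` package and a valid API key.
    import shodan

    api = shodan.Shodan("YOUR_API_KEY")        # placeholder key
    results = api.search("port:502")           # Modbus/TCP, a common ICS port
    print("Exposed hosts:", results["total"])
    for match in results["matches"][:5]:
        print(match["ip_str"], match.get("org", "n/a"))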

Relevance: 60.00%

Publisher:

Abstract:

Understanding the dynamics of disease spread is essential in contexts such as estimating the load on medical services, as well as for risk assessment and intervention policies against large-scale epidemic outbreaks. However, most of the relevant information becomes available only after the outbreak itself, and preemptive assessment is far from trivial. Here, we report on an agent-based model developed to investigate such epidemic events in a stylised urban environment. For most diseases, infection of a new individual may occur through casual contact in crowds as well as through repeated interactions with social partners such as work colleagues or family members. Our model therefore accounts for both phenomena. Given the scale of the system, efficient parallel computing is required. In this presentation, we focus on aspects of parallelisation for large-scale network generation and massively multi-agent simulations.
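
The model itself is only outlined above; as a minimal, assumed sketch of the two infection routes described (casual crowd contact and repeated contact with fixed social partners), a serial toy version might look like the following. The simple SIR dynamics and all parameters are assumptions for illustration.

    # Toy agent-based SIR sketch with two infection routes: random casual
    # contacts and a fixed social-partner network. All parameters assumed.
    import random

    N, P_CASUAL, P_SOCIAL, P_RECOVER = 1000, 0.002, 0.05, 0.1
    state = ["S"] * N
    state[0] = "I"                                        # index case
    partners = [random.sample(range(N), 5) for _ in range(N)]

    for day in range(100):
        for i in [k for k in range(N) if state[k] == "I"]:
            for j in random.sample(range(N), 10):         # casual crowd contacts
                if state[j] == "S" and random.random() < P_CASUAL:
                    state[j] = "I"
            for j in partners[i]:                         # repeated partners
                if state[j] == "S" and random.random() < P_SOCIAL:
                    state[j] = "I"
            if random.random() < P_RECOVER:
                state[i] = "R"
    print(state.count("R"), "recovered after 100 days")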

Relevance: 60.00%

Publisher:

Abstract:

Despite significant improvements in capacity-distortion performance, computationally efficient capacity control is still lacking in recent watermarking schemes. In this paper, we propose an efficient capacity control framework that substantiates the notion that watermarking capacity control is the process of maintaining “acceptable” distortion and running time while attaining the required capacity. Analysis and experimental results on capacity control are reported to address practical aspects of the watermarking capacity problem in dynamic-size payload embedding.
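
The framework itself is not detailed here; purely as an assumed illustration of the underlying trade-off (attaining payload while keeping distortion acceptable), one could embed bits incrementally and stop once a distortion floor is breached. The PSNR metric, threshold, and LSB embedding are assumptions, not the paper's scheme.

    # Hypothetical capacity-control loop: embed payload bits into pixel LSBs
    # and stop when distortion falls below an acceptable PSNR floor.
    import numpy as np

    def psnr(a, b):
        mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
        return 10 * np.log10(255**2 / mse) if mse else float("inf")

    def embed_with_capacity_control(img, bits, min_psnr=45.0):
        flat = img.flatten()                          # copy of pixel values
        for k, bit in enumerate(bits):                # one bit per pixel LSB
            flat[k] = (flat[k] & 0xFE) | bit
            if k % 1024 == 0 and psnr(img, flat.reshape(img.shape)) < min_psnr:
                return flat.reshape(img.shape), k     # capacity reached
        return flat.reshape(img.shape), len(bits)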

Relevance: 60.00%

Publisher:

Abstract:

Business Process Management (BPM) (Dumas et al. 2013) investigates how organizations function and how they can be improved on the basis of their business processes. The starting point for BPM is that organizational performance is a function of process performance. Thus, BPM proposes a set of methods, techniques and tools to discover, analyze, implement, monitor and control business processes, with the ultimate goal of improving these processes. Most importantly, BPM is not just an organizational management discipline. BPM also studies how technology, and particularly information technology, can effectively support the process improvement effort. In the past two decades the field of BPM has been the focus of extensive research, which spans an ever-growing scope and advances technology in various directions. The main international forum for state-of-the-art research in this field is the International Conference on Business Process Management, or “BPM” for short—an annual meeting of the aca ...

Relevance: 60.00%

Publisher:

Abstract:

Decision-making is such an integral part of healthcare routine that the ability to make the right decisions at crucial moments can lead to improvements in patient health. Evidence-based practice (EBP), the paradigm used to make those informed decisions, relies on the use of the current best evidence from systematic research such as randomized controlled trials. Limitations of the outcomes of randomized controlled trials (RCTs), such as the quantity and quality of the evidence generated, have lowered healthcare professionals’ confidence in using EBP. An alternative paradigm, Practice-Based Evidence, has evolved, its key feature being evidence drawn from practice settings. Through the use of health information technology, electronic health records (EHRs) capture relevant clinical practice “evidence”. A data-driven approach is proposed to capitalize on the benefits of EHRs. The issues of data privacy, security and integrity are mitigated by an information accountability concept. A data warehouse architecture completes the data-driven approach by integrating health data from the multi-source systems unique to the healthcare environment.
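
The warehouse architecture is only named above, so the following is merely an assumed toy illustration of the integration step it implies: consolidating extracts from several source systems around a shared patient identifier. The table names, fields, and values are invented for this example.

    # Assumed illustration: integrating multi-source EHR extracts into one
    # warehouse-style fact table keyed on a shared patient identifier.
    import pandas as pd

    admissions = pd.DataFrame({"patient_id": [1, 2], "ward": ["A", "B"]})
    labs = pd.DataFrame({"patient_id": [1, 2], "hba1c": [6.1, 7.4]})
    pharmacy = pd.DataFrame({"patient_id": [1, 2], "drug": ["metformin", "insulin"]})

    fact = (admissions.merge(labs, on="patient_id")
                      .merge(pharmacy, on="patient_id"))
    print(fact)   # one integrated row per patient, ready for analysis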

Relevance: 60.00%

Publisher:

Abstract:

Background: As the increasing adoption of information technology continues to improve remote medical services, the distribution of, and remote access to, digital medical images over public networks continue to grow significantly. Such use of medical images raises serious concerns for their continuous security protection, which digital watermarking has shown great potential to address.
Methods: We present a content-independent embedding scheme for medical image watermarking. We observe that the perceptual content of medical images varies widely with their modalities. Recent medical image watermarking schemes are image-content dependent and thus may suffer from inconsistent embedding capacity and visual artefacts. To attain the image-content-independent embedding property, we generalise the RONI (region of non-interest to medical professionals) selection process and use it for embedding, utilising the RONI’s least significant bit-planes. The proposed scheme thus avoids the need for RONI segmentation, which incurs capacity and computational overheads.
Results: Our experimental results demonstrate that the proposed embedding scheme performs consistently over a dataset of 370 medical images spanning 7 different modalities. The results also show how state-of-the-art reversible schemes can perform inconsistently across different modalities of medical images. Our scheme achieves MSSIM (Mean Structural SIMilarity) larger than 0.999 with a deterministically adaptable embedding capacity.
Conclusions: Our image-content-independent embedding scheme is consistent across modalities and maintains good image quality in the RONI while keeping all other pixels in the image untouched. Thus, with an appropriate watermarking framework (i.e., with consideration of the watermark generation, embedding and detection functions), the proposed scheme is viable for multi-modality medical image applications and remote medical services such as teleradiology and eHealth.
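
As a simplified sketch of the embedding route described (writing watermark bits into least-significant bit-planes of a RONI while leaving every other pixel untouched), and assuming purely for illustration that the image border serves as the RONI:

    # Simplified sketch: embed watermark bits in the LSBs of a border region
    # treated as the RONI, leaving the clinical interior untouched. Treating
    # the border as the RONI is an assumption made for this example.
    import numpy as np

    def embed_border_roni(img, bits, border=8):
        out = img.copy()
        mask = np.zeros(img.shape, dtype=bool)
        mask[:border, :] = mask[-border:, :] = True
        mask[:, :border] = mask[:, -border:] = True
        idx = np.flatnonzero(mask)[: len(bits)]     # embeddable pixel indices
        flat = out.ravel()                          # view into `out`
        flat[idx] = (flat[idx] & 0xFE) | np.asarray(bits, dtype=img.dtype)
        return out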

Relevance: 60.00%

Publisher:

Abstract:

Cloud computing has significantly impacted a broad range of industries, but these technologies and services have been absorbed throughout the marketplace unevenly. Some industries have moved aggressively towards cloud computing, while others have moved much more slowly. For the most part, the energy sector has approached cloud computing in a measured and cautious way, with progress often taking the form of private cloud solutions rather than public ones, or of hybridized information technology systems that combine cloud and existing non-cloud architectures. By moving towards cloud computing so slowly and tentatively, however, the energy industry may prevent itself from reaping the full benefits that a more complete migration to the public cloud has delivered in several other industries. This short communication accordingly offers a high-level overview of cloud computing and argues that the energy sector should make a more complete migration to the public cloud in order to unlock the major system-wide efficiencies that cloud computing can provide. Further, assets within the energy sector should be designed with as much modularity and flexibility as possible so that they are not locked out of cloud-friendly options in the future.

Relevance: 60.00%

Publisher:

Abstract:

STIMulate is a support-for-learning program at the Queensland University of Technology (QUT) in Brisbane, Australia. The program provides assistance in mathematics, science and information technology for undergraduate students. This paper develops personas (archetypal users) that represent the attitudes and motivations of students who use STIMulate, in particular its IT stream. Seven personas were developed from interviews with Peer Learning Facilitators (PLFs), experienced students who have excelled in the relevant subject areas. The personas were then validated by a PLF focus group. Developing the personas enabled us to better understand the characteristics and needs of the students using the STIMulate program, allowing a more critical analysis of the quality of the service provided.

Relevance: 60.00%

Publisher:

Abstract:

Particle swarm optimization (PSO), a population-based algorithm, has recently been applied to multi-robot systems. Although the algorithm is used effectively to solve many optimization problems, it has drawbacks when applied to multi-robot search for a target in a space containing large static obstacles. One of these defects is premature convergence: a basic property of PSO is that particles spread across a search space tend, over time, to converge on a small area. This shortcoming is also evident in a multi-robot search system, particularly when large static obstacles prevent the robots from finding the target easily; as time passes, the robots converge on a small area that may not contain the target and become entrapped there. Another shortcoming is that basic PSO cannot guarantee global convergence: particles initially explore different areas, but in some cases they are poor at exploiting promising areas, which increases the search time. This study proposes a PSO-based method for a multi-robot system searching for a target in a space containing large static obstacles. The method not only overcomes the premature convergence problem but also establishes an efficient balance between exploration and exploitation and guarantees global convergence, reducing the search time by combining PSO with a local search method such as A-star. To validate the effectiveness and usefulness of the algorithms, a simulation environment was developed for conducting simulation-based experiments in different scenarios and reporting experimental results. These results demonstrate that the proposed method overcomes the premature convergence problem and guarantees global convergence.
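
For reference, the basic PSO update underlying both shortcomings moves each particle toward its personal best and the swarm's global best. Below is a minimal sketch of that baseline only; the objective, bounds and coefficients are assumed, and the study's obstacle-aware extensions and A-star combination are not included.

    # Minimal basic PSO sketch: the baseline whose premature convergence the
    # study addresses. Objective, bounds and coefficients are assumed.
    import numpy as np

    def pso(f, dim=2, n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        x = np.random.uniform(-10, 10, (n, dim))    # particle positions
        v = np.zeros((n, dim))                      # particle velocities
        pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
        gbest = pbest[pbest_val.argmin()].copy()    # global best position

        for _ in range(iters):
            r1, r2 = np.random.rand(n, dim), np.random.rand(n, dim)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + v
            vals = np.apply_along_axis(f, 1, x)
            better = vals < pbest_val
            pbest[better], pbest_val[better] = x[better], vals[better]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest

    print(pso(lambda p: float(np.sum(p ** 2))))     # toy sphere objective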

Relevance: 60.00%

Publisher:

Abstract:

In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and the storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, biological futurist Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences.
Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail, comparative overview of recent developments in intellectual property and biotechnology – as a foundation to the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy of Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.

Relevance: 60.00%

Publisher:

Abstract:

Christmas has come early for copyright owners in Australia. The film company Roadshow, the pay television company Foxtel, and Rupert Murdoch's News Corp and News Limited, as well as the wider copyright industries, have been clamoring for new copyright powers and remedies. In the summer break, the Coalition Government responded to such entreaties from its industry supporters and donors with a new package of copyright laws and policies. There has been significant debate over the proposals between the odd couple of Attorney-General George Brandis and the Minister for Communications, Malcolm Turnbull, with deep philosophical differences between the two Ministers over the copyright agenda. The Attorney-General George Brandis has supported a model of copyright maximalism, with strong rights and remedies for the copyright empires in film, television, and publishing. He has shown little empathy for the information technology companies of the digital economy, and has been impatient to press ahead with a copyright regime. The Minister for Communications, Malcolm Turnbull, has been somewhat more circumspect, recognizing the need to ensure that copyright laws do not adversely impact upon competition in the digital economy. The final proposal is a somewhat awkward compromise between the discipline-and-punish regime preferred by Brandis and the responsive regulation model favored by Turnbull. In his new book, Information Doesn't Want to Be Free: Laws for the Internet Age, Cory Doctorow has some sage advice for copyright owners, listing among the 'things that don't make money': complaining about piracy; calling your customers thieves; and treating your customers like thieves. In this context, the push by copyright owners and the Coalition Government for a copyright crackdown may well prove counter-productive to their interests.

Relevance: 60.00%

Publisher:

Abstract:

The venture 23andMe Inc. raises a host of issues for patent law, policy, and practice in respect of lifestyle genetics and personalised medicine. The company observes: ‘We recognize that the availability of personal genetic information raises important issues at the nexus of ethics, law, and public policy’. 23andMe Inc. has tested the boundaries of patent law with its patent applications, which cut across information technology, medicine, and biotechnology. The company’s research raises fundamental issues about patentability, especially in light of the litigation in Bilski v. Kappos, Mayo Collaborative Services v. Prometheus Laboratories Inc. and Association for Molecular Pathology v. United States Patent and Trademark Office and Myriad Genetics Inc. There has been much debate and controversy over 23andMe Inc.’s patent filings – particularly its granted patent on ‘Polymorphisms associated with Parkinson’s Disease’. The direct-to-consumer marketing of genetic testing by 23andMe Inc. has also raised important questions of bioethics and human rights. It is queried whether the terms of service for 23andMe Inc. provide adequate recognition of the concepts of informed consent and benefit-sharing, especially in light of litigation in this area in the United States. Given the patent thickets surrounding genetic testing, the case study of 23andMe Inc. also highlights questions about patent infringement and patent exceptions. The future reform of patent law, policy, and practice needs to take into account new developments in lifestyle genetics and personalised medicine – as exemplified by 23andMe Inc.

Relevance: 60.00%

Publisher:

Abstract:

This article considers the challenges posed to intellectual property law by the emerging field of bioinformatics. It examines the intellectual property strategies of established biotechnology companies, such as Celera Genomics, and of information technology firms entering the marketplace, such as IBM. First, this paper argues that copyright law is not irrelevant to biotechnology, as some commentators would suggest. It claims that the use of copyright law and contract law is fundamental to the protection of biomedical and genomic databases. Second, this article questions whether biotechnology companies are exclusively interested in patenting genes and genetic sequences. Recent evidence suggests that biotechnology companies and IT firms are patenting bioinformatics software and Internet business methods, as well as underlying instrumentation such as microarrays and genechips. Finally, this paper evaluates the impact the privatisation of bioinformatics will have on public research and scientific communication. It raises important questions about integration, interoperability, and the risks of monopoly. It also considers whether open source software such as the Ensembl Project and peer-to-peer technology like DSAS will be able to counter this trend of privatisation.