917 results for Information Technology Adoption


Relevance: 80.00%

Publisher:

Abstract:

There is increasing attention to the importance of Enterprise Systems (ES) and Information Systems (IS) for Small and Medium Enterprises (SMEs). The same attention should be reflected in the IS graduate curriculum. Studies reveal that, despite healthy demand from industry for IS management expertise, most IS graduates are ill-equipped to meet the challenges of modern organizations. The majority of contemporary firms, represented by SMEs, seek employees with a balance of business process knowledge and ES software skills. This article describes a curriculum that teaches Information Technology (IT) and IS management concepts in an SME context. The curriculum conceptualises a 'learn-by-doing' approach to provide students with business process and ES software-specific knowledge. The approach recommends coverage of traditional content related to SMEs' operations, strategies, IT investment and management issues, while providing an increased focus on the strategic use of enterprise IT. The study addresses, to an extent, the perennial challenge of updating the IS curriculum given the rapid pace of technological change.

Relevance: 80.00%

Publisher:

Abstract:

The World Health Organization recommends that data on mortality in its member countries be collected using the Medical Certificate of Cause of Death published in the instruction volume of the ICD-10. However, many countries lack the investment in health information processes needed to promote the use of this certificate and improve mortality information. An appeal for support to make improvements has been launched through the Health Metrics Network's MOVE-IT strategy (Monitoring of Vital Events – Information Technology) [World Health Organization, 2011]. Despite this international spotlight on the need to capture mortality data and to use the ICD-10 to code the data reported on such certificates, there is little cohesion in the way that certifiers of deaths are instructed in how to complete the death certificate, which is the main source document for mortality statistics. Complete and accurate documentation of the immediate, underlying and contributory causes of death on the death certificate is required to produce standardised statistical information and cause-specific mortality statistics that can be compared between populations and across time. This paper reports on a research project conducted to determine the efficacy and accessibility of the certification module of the WHO's newly developed web-based training tool for coders and certifiers of deaths. Involving a population of medical students from the Fiji School of Medicine and a pre- and post-test research design, the study entailed completion of death certificates based on vignettes before and after access to the training tool. The effect of the students' learning from the training tool was measured by their ability to complete the death certificates and by analysis of the completeness and specificity of the ICD-10 coding of the reported causes of death. The quality of death certificate completion was assessed using a Quality Index before and after the participants accessed the training tool. In addition, the participants' views about the accessibility and use of the training tool were elicited using a supplementary questionnaire. The results demonstrated improvement in the participants' ability to complete death certificates completely and accurately according to best practice. The training tool was viewed very positively and its implementation in the curriculum for medical students was encouraged. Participants also recommended that interactive discussions to examine the certification exercises would be an advantage.

Relevance: 80.00%

Publisher:

Abstract:

Quality in construction projects should be regarded as the fulfillment of the expectations of the contributors involved in such projects. Although a significant number of quality practices have been introduced within the industry, attaining reasonable levels of quality in construction projects remains an ongoing problem. To date, some research into the introduction and improvement of quality practices and stakeholder management has been undertaken, but no major studies have comprehensively examined how greater consideration of stakeholders' perspectives of quality can contribute to final project quality outcomes. This paper examines the requirements for developing a framework that leads to more effective involvement of stakeholders in quality planning and practices, ultimately contributing to higher quality outcomes for construction projects. Through an extensive literature review it highlights various perceptions of quality, categorizes quality issues with particular focus on benefits and shortcomings, and examines the viewpoints of major stakeholders on project quality. It proposes a set of criteria to be used as the basis for a quality practice improvement framework, which will provide project managers and owners with the information and strategic direction required to achieve their own and their stakeholders' targets for implementing quality practices, leading to improved quality outcomes on future projects.

Relevance: 80.00%

Publisher:

Abstract:

In this paper we consider the implementation of time- and energy-efficient trajectories on a test-bed autonomous underwater vehicle. The trajectories are loosely connected to the results of applying the maximum principle to the controlled mechanical system. We use a numerical algorithm to compute efficient trajectories designed using geometric control theory to optimize a given cost function. Experimental results are shown for the time minimization problem.
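
As a hedged point of reference (the abstract does not state the paper's exact functional), time and energy efficiency in this setting are commonly combined as a weighted integral cost, with pure time minimization as the special case in which the energy weight vanishes. In the sketch below, u is the control input, T the free final time, and c_t, c_e are assumed non-negative weights.

\[
  J(u) = \int_{0}^{T} \left( c_{t} + c_{e}\,\lVert u(t) \rVert^{2} \right) dt, \qquad c_{t},\ c_{e} \ge 0,
\]

with c_e = 0 recovering the time minimization problem reported in the experiments.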

Relevance: 80.00%

Publisher:

Abstract:

Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research has produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within the circuit since this depends on the software they run. We have developed a static analysis tool that produces SIFA compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study which shows how this new capability supports combined hardware and software dataflow analyses of a security critical communications device.
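
As an illustration of the kind of input such a tool consumes (a hypothetical fragment, not taken from the paper or from SIFA's documentation), consider embedded C code in which a byte read from one memory-mapped register is relayed to another; the extracted dataflow graph would contain the path input register -> buffer -> output register.

#include <stdint.h>

/* Hypothetical memory-mapped I/O registers; the names and the analysed
 * source are illustrative only. */
volatile uint8_t PORTA;   /* input register (assumed)  */
volatile uint8_t PORTB;   /* output register (assumed) */

static uint8_t buf;

void relay_byte(void)
{
    buf = PORTA;      /* dataflow edge: PORTA -> buf  */
    PORTB = buf;      /* dataflow edge: buf -> PORTB  */
}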

Relevance: 80.00%

Publisher:

Abstract:

Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security-critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
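
The following hypothetical C fragment (not from the paper) illustrates why expression-level modelling matters: a conventional dataflow analysis reports that the classified input reaches the return value through all three assignments, whereas only the first can propagate any classified bits at run time.

#include <stdint.h>

/* Illustrative only: expressions of this shape motivate byte- and
 * bit-level modelling of how classified data is blocked or downgraded. */
uint8_t examples(uint8_t secret)
{
    uint8_t a = secret & 0x0F;   /* only the low 4 bits of secret can flow out */
    uint8_t b = secret & 0x00;   /* always 0: no classified bits flow          */
    uint8_t c = secret ^ secret; /* always 0: a false-positive flow path       */
    return (uint8_t)(a | b | c);
}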

Relevance: 80.00%

Publisher:

Abstract:

This article presents a novel approach to confidentiality violation detection based on taint marking. Information flows are dynamically tracked between applications and operating system objects such as files, processes and sockets. A confidentiality policy is defined by labelling sensitive information and specifying which information may leave the local system through network exchanges. Furthermore, per-application profiles can be defined to restrict the sets of information each application may access and/or send through the network. In previous work, we focused on the use of mandatory access control mechanisms for information flow tracking. In this work, we have extended the previous information flow model to track network exchanges, and we are able to define a policy attached to network sockets. We show an example application of this extension in the context of a compromised web browser: our implementation detects a confidentiality violation when the browser attempts to leak private information to a remote host over the network.
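
A minimal sketch of the policy decision described above, assuming a bitmask representation for taint labels and a per-application export profile (names and types are illustrative, not the authors' implementation):

#include <stdbool.h>
#include <stdint.h>

typedef uint32_t taint_set_t;   /* bitmask: one bit per sensitivity label */

struct app_profile {
    taint_set_t may_send;       /* labels this application may export over the network */
};

/* Allow the socket write only if every label carried by the outgoing data
 * is also in the application's network-export profile. */
bool socket_write_allowed(const struct app_profile *profile, taint_set_t data_labels)
{
    return (data_labels & ~profile->may_send) == 0;
}

Under this scheme a write is denied exactly when the data carries at least one label outside the profile, which matches the browser example above: private data labelled as non-exportable triggers a violation when the compromised browser attempts the socket write.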

Relevance: 80.00%

Publisher:

Abstract:

The ICT degrees in most Australian universities have a sequence of up to three programming subjects, or units. BABELnot is an ALTC-funded project that will document the academic standards associated with those three subjects in the six participating universities and, if possible, at other universities. This will necessitate the development of a rich framework for describing the learning goals associated with programming. It will also be necessary to benchmark exam questions that are mapped onto this framework. As part of the project, workshops are planned for ACE 2012, ICER 2012 and ACE 2013, to elicit feedback from the broader Australasian computing education community, and to disseminate the project’s findings. The purpose of this paper is to introduce the project to that broader Australasian computing education community and to invite their active participation.

Relevance: 80.00%

Publisher:

Abstract:

In the third year of the Link the Wiki track, the focus shifted to anchor-to-BEP (best entry point) link discovery. Participants were encouraged to utilize different technologies to address the issue of focused link discovery. Apart from the 2009 Wikipedia collection, the Te Ara collection was introduced for the first time in INEX. For the Link the Wiki tasks, 5000 file-to-file topics were randomly selected and 33 anchor-to-BEP topics were nominated by the participants. The Te Ara collection does not contain hyperlinks, and the task was to cross-link the entire collection. A GUI tool for self-verification of the linking results was distributed; this helps participants verify the location of the anchor and the BEP. The assessment tool and the evaluation tool were revised to improve efficiency. Submission runs were evaluated against the Wikipedia ground truth and the manual result set, respectively. Focus-based evaluation was undertaken using a new metric. Evaluation results are presented and link discovery approaches are described.

Relevance: 80.00%

Publisher:

Abstract:

This paper gives an overview of the INEX 2009 Ad Hoc Track. The main goals of the Ad Hoc Track were three-fold. The first goal was to investigate the impact of collection scale and markup by using a new collection that is again based on the Wikipedia but is over four times larger, with longer articles and additional semantic annotations. For this reason the Ad Hoc Track tasks stayed unchanged, and the Thorough Task of INEX 2002–2006 returned. The second goal was to study the impact of more verbose queries on retrieval effectiveness, by using the available markup as structural constraints (both the Wikipedia's layout-based markup and the enriched semantic markup) and by the use of phrases. The third goal was to compare different result granularities by allowing systems to retrieve XML elements, ranges of XML elements, or arbitrary passages of text. This investigates the value of the internal document structure (as provided by the XML markup) for retrieving relevant information. The INEX 2009 Ad Hoc Track featured four tasks. For the Thorough Task, a ranked list of results (elements or passages) ordered by estimated relevance was required. For the Focused Task, a ranked list of non-overlapping results (elements or passages) was required. For the Relevant in Context Task, non-overlapping results (elements or passages) were returned, grouped by the article from which they came. For the Best in Context Task, a single starting point (element start tag or passage start) for each article was required. We discuss the setup of the track and the results for the four tasks.
