669 results for Setting time
Abstract:
This article was written in 1997. After a 2009 review the content was left mostly unchanged, apart from this rewritten abstract, restructured headings and a table of contents. The article deals directly with professional registration of surveyors, but it also relates to government procurement of professional services. The issues include public service and professional ethics; the setting of professional fees; quality assurance; official corruption; and professional recruitment, education and training. Debate on the Land Surveyors Act 1908 (Qld) and its amendments to 1916 occurred at a time when the industrial unrest of the 1890s and the common market principles of the new Commonwealth were fresh in people's minds. Industrial issues led to a constitutional crisis in Queensland's then bicameral legislature and frustrated a first attempt to pass a Surveyors Bill in 1907. The Bill was re-introduced in 1908 after fresh elections and Kidston's return as state premier. Co-ordinated immigration and land settlement policies of the colonies were discontinued when the Commonwealth gained power over immigration in 1901, and concerns shifted to protecting jobs from foreign competition. Debate on the 1974 amendments to the Act reflected concerns about skill shortages and professional accreditation. However, in times of economic downturn, a so-called 'chronic shortage of surveyors' could rapidly degenerate into oversupply and unemployment. Theorists championed a naïve 'capture theory' in which the professions captured governments to create legislative barriers to entry to the professions, supposedly allowing rent-seeking and monopoly profits through lack of competition. However, historical evidence suggests that governments have been capable of capturing and exploiting surveyors. More enlightened institutional arrangements are needed if the community is to receive benefits commensurate with the sizable co-investments of public and private resources in developing human capital.
Abstract:
Objective: To define the characteristics of all-terrain vehicle (ATV) crashes occurring in north Queensland from March 2004 to June 2007 and to explore associated risk factors. Design: Descriptive analysis of ATV crash data collected by the Rural and Remote Road Safety Study. Setting: Rural and remote north Queensland. Participants: Forty-two ATV drivers and passengers aged 16 years or over hospitalised at Atherton, Cairns, Mount Isa or Townsville for at least 24 hours as a result of a vehicle crash. Main outcome measures: Demographics of participants, reason for travel, nature of crash, injuries sustained and risk factors associated with ATV crashes. Results: The majority of casualties were men aged 16–64 years. Forty-one per cent of crashes occurred while the rider was performing agricultural tasks, and 39% of casualties had less than one year's experience riding ATVs. Over half the casualties were not wearing a helmet at the time of the crash. The most common injuries were to the head and neck and the upper limbs. Rollovers tended to occur during agricultural tasks and most commonly resulted in multiple injuries. Conclusions: Considerable trauma results from ATV crashes in rural and remote north Queensland. These crashes are not included in most general vehicle crash data sets, which are usually limited to events occurring on public roads. Minimal legislation and regulation currently applies to ATV use in agricultural, recreational and commercial settings. Legislation on safer ATV design and mandatory courses for riders is an essential part of addressing the burden of ATV crashes on rural and remote communities.
Abstract:
In this third Quantum Interaction (QI) meeting it is time to examine our failures. One of the weakest elements of QI as a field lies in its continuing lack of models displaying proper evolutionary dynamics. This paper presents an overview of the modern generalised approach to the derivation of time evolution equations in physics, showing how the notion of symmetry is essential to the extraction of operators in quantum theory. The form that symmetry might take in non-physical models is then explored, and a number of viable avenues are identified.
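As a pointer to the kind of derivation this abstract summarises, the sketch below shows the standard textbook route from time-translation symmetry to the quantum time-evolution equation: symmetry forces the evolution maps to form a unitary group, Stone's theorem supplies a Hermitian generator, and differentiation recovers the Schrödinger equation. This is general background, not material drawn from the paper itself.

```latex
% Minimal sketch (standard textbook material, not taken from the paper):
% time-translation symmetry => a one-parameter unitary group, whose
% Hermitian generator H supplies the time-evolution equation.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Time-translation symmetry requires the evolution maps to compose as a group
and to preserve probability, so they form a strongly continuous unitary family
\begin{equation}
  U(t_1)\,U(t_2) = U(t_1 + t_2), \qquad U(t)^\dagger U(t) = I .
\end{equation}
By Stone's theorem such a group has a Hermitian generator $H$,
\begin{equation}
  U(t) = e^{-\,i H t/\hbar},
\end{equation}
and differentiating $\lvert\psi(t)\rangle = U(t)\lvert\psi(0)\rangle$ gives the
time-evolution (Schr\"odinger) equation
\begin{equation}
  i\hbar\,\frac{\partial}{\partial t}\lvert\psi(t)\rangle = H\,\lvert\psi(t)\rangle .
\end{equation}
\end{document}
```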
Abstract:
In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers' behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users' actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step-ahead prediction time series analysis method along with a transfer function method. The time period rarely affects navigational and transactional queries, while rates for transactional queries vary during different periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest ranked results. We discuss implications, including predictive value, and future research.
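To make the method class concrete, the sketch below fits a one-step-ahead forecast with an exogenous input, in the spirit of the transfer-function analysis this abstract describes. The data are synthetic and the variables (hourly sponsored-link clicks driven by query volume) and model order are assumptions for illustration, not the paper's actual log data or specification.

```python
# Illustrative sketch only: one-step-ahead forecasting with an exogenous
# driver, using statsmodels' SARIMAX. Synthetic data; assumed variables.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)

hours = 24 * 14                                   # two weeks of hourly bins
query_volume = (1000 + 200 * np.sin(2 * np.pi * np.arange(hours) / 24)
                + rng.normal(0, 30, hours))       # exogenous input series
sponsored_clicks = 0.08 * query_volume + rng.normal(0, 5, hours)

# ARX-style model: autoregressive clicks with query volume as exogenous input.
model = SARIMAX(sponsored_clicks[:-1], exog=query_volume[:-1], order=(1, 0, 0))
fitted = model.fit(disp=False)

# One-step-ahead forecast for the final hour, given that hour's query volume.
forecast = fitted.forecast(steps=1, exog=query_volume[-1:])
print(f"predicted clicks: {forecast[0]:.1f}, observed: {sponsored_clicks[-1]:.1f}")
```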
Abstract:
The purpose of this article is to raise some concerns over ongoing changes to the nature and scope of the teaching profession. Teaching is a responsible profession, and teachers have always been charged with the job of turning out the next generation of citizens—educated, healthy in mind, and healthy in body. The question is: how far should this responsibility extend? Just what should schools be responsible for? This article proposes some limits to teacher responsibility.
Abstract:
When complex projects go wrong they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics that provide early warning of potential difficulties. The success of a complex project can be assessed by a range of stakeholders over different time scales and against different levels of project results: the project's outputs at the end of the project; the project's outcomes in the months following project completion; and the project's impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells, showing when a project is deviating from plan so that early corrective action can be taken. Different combinations of the leading performance indicators may be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology for evaluating the early parts of this model is described, together with its implications and limitations. This paper describes work in progress.
Abstract:
Purpose – The paper aims to describe a workforce-planning model, developed in-house in an Australian university library, that is based on rigorous environmental scanning of the institution, the profession and the sector. Design/methodology/approach – The paper uses a case study that describes the stages of the planning process undertaken to develop the Library's Workforce Plan and the documentation produced. Findings – While the process has had successful and productive outcomes, workforce planning is an ongoing process. To remain effective, the workforce plan needs to be reviewed annually in the context of the library's overall planning program. This is imperative if the plan is to remain current and to be regarded as a living document that will continue to guide library practice. Research limitations/implications – Although a single case study, the work has been contextualized within the wider research into workforce planning. Practical implications – The paper provides a model that can easily be deployed within a library without external or specialist consultant skills and, due to its scalability, can be applied at the department or wider level. Originality/value – The paper identifies the trends impacting on, and the emerging opportunities for, university libraries and provides a model for workforce planning that recognizes the context and culture of the organization as key drivers in determining workforce planning. Keywords – Australia, University libraries, Academic libraries, Change management, Manpower planning. Paper type – Case study
Abstract:
This paper presents a model to estimate travel time using cumulative plots. Three different cases are considered: i) case-Det, using only detector data; ii) case-DetSig, using detector data and signal controller data; and iii) case-DetSigSFR, using detector data, signal controller data and saturation flow rate. The performance of the model for different detection intervals is evaluated. It is observed that the detection interval is not critical if signal timings are available: comparable accuracy can be obtained from a larger detection interval with signal timings or from a shorter detection interval without signal timings. The performance for case-DetSig and case-DetSigSFR is consistent, with accuracy generally above 95%, whereas case-Det is highly sensitive to the signal phases within the detection interval and its performance is uncertain if the detection interval is an integral multiple of signal cycles.
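The core idea behind cumulative-plot travel time estimation can be sketched as follows. This is a deliberate simplification (assuming first-in-first-out traffic and matched vehicle counts at both detectors), not the paper's model: the horizontal gap between the cumulative arrival curve at the upstream detector and the cumulative departure curve downstream, read at the same vehicle number, is that vehicle's travel time.

```python
# Illustrative sketch of the cumulative-plot idea (not the paper's model):
# under FIFO, the horizontal gap between the upstream cumulative count A(t)
# and the downstream cumulative count D(t) at vehicle number n is the
# travel time of the n-th vehicle.
import numpy as np

def travel_times_from_cumulative_plots(upstream_times, downstream_times):
    """Per-vehicle travel times (seconds) from detector crossing times.

    upstream_times / downstream_times: times at which successive vehicles
    cross the upstream and downstream detectors, respectively.
    """
    up = np.sort(np.asarray(upstream_times, dtype=float))
    down = np.sort(np.asarray(downstream_times, dtype=float))
    n = min(len(up), len(down))        # vehicles observed at both detectors
    return down[:n] - up[:n]           # horizontal gap between A(t) and D(t)

# Hypothetical example: 5 vehicles, downstream crossings delayed by a signal.
upstream = [0.0, 4.0, 8.0, 12.0, 16.0]
downstream = [35.0, 37.0, 39.0, 52.0, 54.0]
print(travel_times_from_cumulative_plots(upstream, downstream))
# -> [35. 33. 31. 40. 38.]  (seconds per vehicle)
```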
Abstract:
Fatigue and overwork are problems experienced by numerous employees in many industry sectors. Focusing on improving work-life balance can reframe the 'problem' of long work hours and help resolve working time duration issues. Flexible work options that re-organise working time arrangements are key to developing an organisational response for delivering work-life balance, and usually involve changing the internal structure of work time. This study examines the effect of compressed long weekly working hours, and the consequent 'long break', on work-life balance. Using Spillover theory and Border theory, this research considers organisational and personal determinants of overwork and fatigue. It concludes that compressed long work hours with a long break provide better work-life balance, and that a long break allows workers to regain 'personal time' and overcome fatigue.
Abstract:
This article observes a paradox in the recent history of the Special Broadcasting Service (SBS). It is argued that, in contrast to the Australian Broadcasting Corporation (ABC), the role and general direction of SBS were not extensively debated as part of the 'culture wars' that occurred during the years of the Howard government. While that made SBS a less fraught space during that period, it may now be a factor in the comparatively weak support given to SBS by the Rudd Labor government relative to the ABC, as some of the 'special' status of SBS has been blunted by its drift towards more mainstream programming and a mixed economy of commercial advertising as well as government funding.
Abstract:
Aims: Influenza is commonly spread by infectious aerosols; however, detection of viruses in aerosols is not sensitive enough to confirm the characteristics of virus aerosols. The aim of this study was to develop an assay for respiratory viruses sufficiently sensitive to be used in epidemiological studies. Method: A two-step, nested real-time PCR assay was developed for MS2 bacteriophage, and for influenza A and B, parainfluenza 1 and human respiratory syncytial virus. Outer primer pairs were designed to nest each existing real-time PCR assay. The sensitivities of the nested real-time PCR assays were compared to those of existing real-time PCR assays. Both assays were applied in an aerosol study to compare their detection limits in air samples. Conclusions: The nested real-time PCR assays were found to be several logs more sensitive than the real-time PCR assays, with lower levels of virus detected at lower Ct values. The nested real-time PCR assay successfully detected MS2 in air samples, whereas the real-time assay did not. Significance and Impact of the Study: The sensitive assays for respiratory viruses will permit further research using air samples from naturally generated virus aerosols. This will inform current knowledge regarding the risks associated with the spread of viruses through aerosol transmission.
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined on the basis of the users' needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches must also deal with low-frequency pattern issues. The measures used by data mining techniques (for example, "support" and "confidence") to learn the profile have turned out to be unsuitable for filtering and can lead to a mismatch problem. This thesis uses rough set-based (term-based) reasoning and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimise information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model for threshold setting have been developed using rough set decision theory. The second stage (pattern mining) aims at solving the problem of information mismatch and is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy, so that the most likely relevant documents are assigned higher scores. Because relatively few documents remain after the first stage, the computational cost is markedly reduced and, at the same time, pattern discovery yields more accurate results. The overall performance of the system was improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms other IF systems, such as the traditional Rocchio IF model, state-of-the-art term-based models including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
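As a rough illustration of the two-stage architecture described in this abstract (a threshold-based topic filter followed by pattern-based re-ranking), the sketch below uses simple term weights and term-set matching. The thesis's actual rough set decision-theoretic threshold and pattern-taxonomy scoring are considerably more sophisticated; all profile data and helper names here are hypothetical.

```python
# Conceptual sketch of the two-stage filtering pipeline described above.
# The scoring and threshold are deliberately simple stand-ins; all names
# and profile contents below are hypothetical.
from collections import Counter

profile_terms = {"filtering": 2.0, "pattern": 1.5, "stream": 1.0}            # assumed term weights
profile_patterns = [{"pattern", "taxonomy"}, {"information", "filtering"}]   # assumed patterns

def stage1_topic_filter(docs, threshold=1.5):
    """Stage 1: drop documents whose term-weight score falls below a threshold."""
    kept = []
    for doc in docs:
        tf = Counter(doc.lower().split())
        score = sum(weight * tf[term] for term, weight in profile_terms.items())
        if score >= threshold:
            kept.append(doc)
    return kept

def stage2_pattern_rank(docs):
    """Stage 2: re-rank the surviving documents by how many profile patterns they contain."""
    def pattern_score(doc):
        words = set(doc.lower().split())
        return sum(1 for pattern in profile_patterns if pattern <= words)
    return sorted(docs, key=pattern_score, reverse=True)

docs = [
    "information filtering over a document stream",
    "pattern taxonomy models for filtering",
    "unrelated news about sports results",
]
print(stage2_pattern_rank(stage1_topic_filter(docs)))
```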