999 results for: consult


Abstract:

This teaching case aims to contribute to understanding the phenomenon of Enterprise Systems (ES) implementations in universities. Through this case, students will gain an understanding of the importance of ‘contextual elements’ for large-scale information systems (IS) implementations, in particular ES. This teaching case illustrates how these contextual factors contribute to the success or failure of such implementations, and how they can influence the decisions that dictate the lifecycle of such systems. The case describes ES implementations at a leading Australian university, and presents a rich account of the institutional, national and industry-sector contexts that have influenced the directions and decisions taken. The journeys of the main Enterprise Systems supporting Financials, Human Resources and Facilities are described, highlighting their lifecycle phases, critical success factors and lessons learnt.

Abstract:

Large deformation analysis is one of the major challenges in the numerical modelling and simulation of metal forming. Because no mesh is used, meshfree methods show good potential for large deformation analysis. In this paper, a local meshfree formulation, based on local weak-forms and the updated Lagrangian (UL) approach, is developed for large deformation analysis. To fully exploit the advantages of meshfree methods, a simple and effective adaptive technique is proposed; this procedure is much easier than re-meshing in FEM. Numerical examples of large deformation analysis are presented to demonstrate the effectiveness of the newly developed nonlinear meshfree approach. The developed meshfree technique has been found to provide superior performance to conventional FEM in dealing with large deformation problems in metal forming.

Abstract:

This paper presents a novel study that aims to contribute to understanding the phenomenon of Enterprise Systems (ES) evaluation in Australasian universities. The proposed study addresses known limitations of arguably the most significant dependent variable in the Information Systems (IS) field – IS Success, or IS-Impact. This study adopts the IS-Impact measurement model, reported by Gable et al. (2008), as the primary commencing theory base and applies the research extension strategy described by Berthon et al. (2002), extending both the theory and the context. This study employs a longitudinal, multi-method research design with two interrelated phases – exploratory and confirmatory. The exploratory phase aims to investigate the applicability and sufficiency of the IS-Impact dimensions and measures in the new context. The confirmatory phase will gather quantitative data to statistically validate the IS-Impact model as a formative index.

Abstract:

A teaching and learning development project is currently under way at Queensland University of Technology to develop advanced technology videotapes for use in the delivery of structural engineering courses. These tapes consist of integrated computer and laboratory simulations of important concepts and behaviour of structures and their components for a number of structural engineering subjects. They will be used as part of the regular lectures and thus will not only improve the quality of lectures and the learning environment, but will also be able to replace the ever-dwindling laboratory teaching in these subjects. The use of these videotapes, developed using advanced computer graphics, data visualization and video technologies, will enrich the learning process of the current diverse engineering student body. This paper presents the details of this new method, the methodology used, and the results and evaluation in relation to one of the structural engineering subjects, steel structures.

Abstract:

This paper examines the algebraic cryptanalysis of small scale variants of LEX-BES, a stream cipher based on the Advanced Encryption Standard (AES) block cipher. LEX is a generic method for constructing a stream cipher from a block cipher, introduced by Biryukov in 2005 at eSTREAM, the ECRYPT Stream Cipher project. The Big Encryption System (BES) is a block cipher introduced at CRYPTO 2002 which facilitates the algebraic analysis of the AES block cipher. In this paper, experiments were conducted to find solutions of the equation systems describing small scale LEX-BES using Gröbner Basis computations. This follows a similar approach to the work by Cid, Murphy and Robshaw at FSE 2005 that investigated algebraic cryptanalysis of small scale variants of the BES. The difference between LEX-BES and BES is that, due to the way the keystream is extracted, the LEX-BES equation systems contain fewer unknowns than those of the BES. To the author's knowledge, this is the first attempt at creating solvable equation systems for stream ciphers based on the LEX method using Gröbner Basis computations.
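The Gröbner Basis machinery referred to above can be illustrated at a toy scale. The sketch below, using SymPy, computes a lexicographic Gröbner basis for a two-variable polynomial system over GF(2); the equations are invented for illustration and are far simpler than the actual small scale LEX-BES systems.

```python
# Toy illustration of solving a polynomial equation system with a
# Groebner basis, in the spirit of the algebraic attacks above.
# The two equations are invented examples over GF(2), not LEX-BES
# equations.
from sympy import symbols, groebner

x, y = symbols('x y')

# "Cipher" equations over GF(2): x*y + x = 0 and x + y + 1 = 0.
system = [x*y + x, x + y + 1]

# A lexicographic Groebner basis triangularises the system, exposing
# univariate relations that can be solved variable by variable.
gb = groebner(system, x, y, modulus=2, order='lex')
print(list(gb.exprs))

# Reducing any generator modulo the basis leaves remainder zero,
# confirming it lies in the ideal described by the basis.
print(gb.reduce(x*y + x)[1])
```

At full cipher scale such computations become infeasible, which is why small scale variants are used for this style of analysis.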

Abstract:

Media articles have promoted the view that cyclists are risk-takers who disregard traffic regulations, but little is known about the contribution of cyclist risk-taking behaviours to crashes. This study examines the role of traffic violations in the 6774 police-reported bicycle crashes in Queensland between January 2000 and December 2008. Of the 6328 crashes involving bicycles and motor vehicles, cyclists were deemed to be at fault in 44.4% of the incidents. When motorists were determined to be at fault, ‘failure to yield’ violations accounted for three of the four most reported contributing factors. In crashes where the cyclist was at fault, inattention and inexperience were the most frequent contributing factors. There were 67 collisions between bicycles and pedestrians, with the cyclist at fault in 65.7%. During the data period, 302 single-bicycle crashes were reported. The most frequent contributing factors were avoidance actions to miss another road user and inattention or negligence.

Abstract:

We show how to construct a certificateless key agreement protocol from the certificateless key encapsulation mechanism introduced by Lippold et al. at ICISC 2009, using the Boyd et al. protocol from ACISP 2008. We introduce the Canetti-Krawczyk (CK) model for certificateless cryptography, give security notions for Type I and Type II adversaries in the CK model, and highlight the differences from the existing e²CK model discussed by Lippold et al. at Pairing 2009. The resulting CK model is more relaxed than the original CK model, thus giving more power to the adversary.

Abstract:

Fishers are faced with multiple risks, including unpredictability of future catch rates, prices and costs. While the latter are largely beyond the control of fisheries managers, effective fisheries management should reduce uncertainty about future catches. Different management instruments are likely to have different impacts on the risk perception of fishers, and this should manifest itself in their implicit discount rate. Assuming licence and quota values represent the net present value of the flow of expected future profits, a proxy for the implicit discount rate of vessels in a fishery can be derived as the ratio of the average level of profits to the average licence/quota value. From this, an indication of risk perception can be derived, assuming higher discount rates reflect higher levels of systematic risk. In this paper, we apply the capital asset pricing model (CAPM) to determine the risk premium implicit in the discount rates for a range of Australian fisheries, and compare this with the set of management instruments in place. We test the assumption that rights-based management instruments lower perceptions of risk in fisheries. We find little evidence to support this assumption, although the analysis was based on only limited data.
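As a rough illustration of the proxy described above, the following sketch computes an implicit discount rate as the ratio of average profit to average licence/quota value, then backs out an implied beta under CAPM. All figures (profit, licence value, risk-free rate, market risk premium) are invented for illustration and are not taken from the paper.

```python
# Sketch of the implicit-discount-rate proxy and CAPM comparison
# described above. All numbers are illustrative assumptions.

def implicit_discount_rate(avg_profit, avg_licence_value):
    """Proxy discount rate: average annual profit over average
    licence/quota value (treating profits as a perpetuity)."""
    return avg_profit / avg_licence_value

def capm_required_return(risk_free, beta, market_return):
    """CAPM: E[r] = rf + beta * (rm - rf)."""
    return risk_free + beta * (market_return - risk_free)

# Hypothetical fishery: $150k average profit, $1m average licence value.
r_implicit = implicit_discount_rate(150_000, 1_000_000)

# Risk premium implicit in the fishery's discount rate, over an
# assumed 5% risk-free rate.
premium = r_implicit - 0.05

# Implied beta under CAPM with an assumed 6% market risk premium;
# a higher beta suggests fishers perceive more systematic risk.
beta = premium / 0.06
print(r_implicit, premium, beta)
```

Comparing implied betas across fisheries with different management instruments is then a way to test whether rights-based management is associated with lower perceived risk.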

Abstract:

Experiments were undertaken to study the drying kinetics of different shaped moist food particulates during heat pump assisted fluidised bed drying. Three geometrical shapes – parallelepipeds, cylinders and spheres – were selected from potatoes (aspect ratio = 1:1, 2:1, 3:1), cut beans (length:diameter = 1:1, 2:1, 3:1) and peas respectively. A batch fluidised bed dryer connected to a heat pump system was used for the experimentation; the heat pump and fluid bed combination was used to increase overall energy efficiency and achieve higher drying rates. Drying kinetics were evaluated using non-dimensional moisture at three drying temperatures of 30, 40 and 50°C. Due to the complex hydrodynamics of fluidised beds, drying kinetics are dryer or material specific. Numerous mathematical models can be used to calculate drying kinetics, ranging from analytical models with simplified assumptions to empirical models built by regression using experimental data. Empirical models are commonly used for various food materials due to their simpler approach; however, problems with accuracy limit their application. Some limitations of empirical models can be reduced by using semi-empirical models based on the heat and mass transfer of the drying operation, one such method being the quasi-stationary approach. In this study, a modified quasi-stationary approach was used to model the drying kinetics of the cylindrical food particles at three drying temperatures.
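The empirical class of models mentioned above can be illustrated with the Page thin-layer model, a widely used regression form for food drying. Note this is a stand-in example of an empirical model, not the modified quasi-stationary model the study actually develops, and the constants below are invented.

```python
import math

# Illustrative sketch of an empirical thin-layer drying model (the
# Page model), of the kind the abstract contrasts with semi-empirical
# approaches. k and n are regression constants fitted to experimental
# data; the values here are invented, not from the study.

def moisture_ratio(t_min, k=0.01, n=1.2):
    """Page model: MR = exp(-k * t^n), t in minutes."""
    return math.exp(-k * t_min ** n)

def nondimensional_moisture(m, m_eq, m_0):
    """Non-dimensional moisture ratio: MR = (M - Me) / (M0 - Me)."""
    return (m - m_eq) / (m_0 - m_eq)

# The moisture ratio decays monotonically from 1 as drying proceeds.
print([moisture_ratio(t) for t in (0, 30, 60, 120)])
```

Such a model is fitted per material and dryer, which is exactly the specificity limitation the semi-empirical quasi-stationary approach aims to reduce.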

Abstract:

Reliability and validity in the testing of spoken language are essential in order to assess learners' English language proficiency as evidence of their readiness to begin courses in tertiary institutions. Research has indicated that the task chosen to elicit language samples can have a marked effect on both the nature of the interaction, including the power differential, and the assessment, raising the issue of ethics. This exploratory study, with a group of 32 students from the People's Republic of China preparing for tertiary study in Singapore, compares test-takers' reactions to the use of an oral proficiency interview and a pair interaction.

Abstract:

Cleaning of sugar mill evaporators is an expensive exercise. Identifying the scale components assists in determining which chemical cleaning agents would result in effective evaporator cleaning. The current methods used for scale characterisation (based on x-ray diffraction techniques, ion exchange/high performance liquid chromatography and thermogravimetry/differential thermal analysis) are difficult, time consuming and expensive, and cannot be performed in a conventional analytical laboratory or by mill staff. The present study has examined the use of simple descriptor tests for the characterisation of Australian sugar mill evaporator scales. Scale samples were obtained from seven Australian sugar mill evaporators by mechanical means. The appearance, texture and colour of the scale were noted before the samples were characterised using x-ray fluorescence and x-ray powder diffraction to determine the compounds present. A number of commercial analytical test kits were used to determine the phosphate and calcium contents of scale samples. Dissolution experiments were carried out on the scale samples with selected cleaning agents to provide relevant information about the effect the cleaning agents have on different evaporator scales. Results have shown that the scale composition can be predicted simply by identifying the colour and appearance of the scale, determining its elemental composition, and knowing from which effect the scale originates. These descriptors and dissolution experiments on scale samples can be used to provide factory staff with an on-site rapid process to predict the most effective chemicals for chemical cleaning of the evaporators.

Abstract:

A membrane filtration plant using suitable micro- or ultra-filtration membranes has the potential to significantly increase pan stage capacity and improve sugar quality. Previous investigations by SRI and others have shown that membranes will remove polysaccharides, turbidity and colloidal impurities and result in lower viscosity syrups and molasses. However, the conclusion from those investigations was that membrane filtration was not economically viable. A comprehensive assessment of current generation membrane technology was undertaken by SRI. With the aid of two pilot plants provided by Applexion and Koch Membrane Systems, extensive trials were conducted at an Australian factory using clarified juice at 80–98°C as feed to each pilot plant. Conditions were varied during the trials to examine the effect of a range of operating parameters on the filtering characteristics of each of the membranes. These parameters included feed temperature and pressure, flow velocity, and soluble solids and impurity concentrations. The data were then combined to develop models to predict the filtration rate (or flux) that could be expected for nominated operating conditions. The models demonstrated very good agreement with the data collected during the trials. The trials also identified those membranes that provided the highest flux levels per unit area of membrane surface for a nominated set of conditions. Cleaning procedures were developed that ensured the water flux level was recovered following a clean-in-place process. Bulk samples of clarified juice and membrane filtered juice from each pilot plant were evaporated to syrup to quantify the gain in pan stage productivity that results from the removal of high molecular weight impurities by membrane filtration. The results are in general agreement with those published by other research groups.

Abstract:

Silhouettes are common features used by many applications in computer vision. For many of these algorithms to perform optimally, accurately segmenting the objects of interest from the background to extract the silhouettes is essential. Motion segmentation is a popular technique to segment moving objects from the background; however, such algorithms can be prone to poor segmentation, particularly in noisy or low contrast conditions. In this paper, the work of [3], which combines motion detection with graph cuts, is extended into two novel implementations that aim to allow greater uncertainty in the output of the motion segmentation, providing a less restricted input to the graph cut algorithm. The proposed algorithms are evaluated on a portion of the ETISEO dataset using hand segmented ground truth data, and an improvement in performance over the motion segmentation alone and the baseline system of [3] is shown.
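One simple way to allow greater uncertainty in a motion segmentation output, in the spirit described above, is to replace a hard motion threshold with soft foreground likelihoods that a graph cut can then weigh against smoothness costs. The logistic mapping and constants below are illustrative assumptions, not the formulation of [3] or its extensions.

```python
import math

# Sketch, under assumptions: converting a hard motion mask into soft
# per-pixel foreground likelihoods, giving a less restricted input to
# a subsequent graph cut. The logistic curve and constants are
# illustrative, not the paper's actual method.

def soft_foreground_prob(frame_diff, threshold=20.0, sharpness=0.2):
    """Map an absolute frame difference to a foreground probability
    via a logistic curve centred on the threshold, instead of a
    hard 0/1 decision."""
    return 1.0 / (1.0 + math.exp(-sharpness * (frame_diff - threshold)))

# A hard threshold at 20 would label a difference of 19 as background
# and 21 as foreground; the soft version keeps both near 0.5, leaving
# the final decision to the graph cut's smoothness term.
print(soft_foreground_prob(19), soft_foreground_prob(21))
```

In a full pipeline these probabilities would become the unary (data) costs of the graph cut, so ambiguous pixels are resolved by their neighbours rather than by the motion detector alone.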

Abstract:

Intelligent surveillance systems typically use a single visual spectrum modality for their input. These systems work well in controlled conditions but often fail when lighting is poor, or when environmental effects such as shadows, dust or smoke are present. Thermal spectrum imagery is not as susceptible to environmental effects; however, thermal imaging sensors are more sensitive to noise and capture only grey scale images, making distinguishing between objects difficult. Several approaches to combining the visual and thermal modalities have been proposed; however, they are limited by the assumption that both modalities are performing equally well. When one modality fails, existing approaches are unable to detect the drop in performance and disregard the under-performing modality. In this paper, a novel middle fusion approach for combining visual and thermal spectrum images for object tracking is proposed. Motion and object detection is performed on each modality, and the object detection results for each modality are fused based on the current performance of each modality. Modality performance is determined by comparing the number of objects tracked by the system with the number detected by each mode, with a small allowance made for objects entering and exiting the scene. The tracking performance of the proposed fusion scheme is compared with the performance of the visual and thermal modes individually, and with a baseline middle fusion scheme. Improvement in tracking performance using the proposed fusion approach is demonstrated. The proposed approach is also shown to be able to detect the failure of an individual modality and disregard its results, ensuring performance is not degraded in such situations.
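A minimal sketch of the performance-weighted fusion idea: each modality is weighted by how closely its detection count matches the tracker's current object count, with a small slack for objects entering or leaving the scene. The weighting function and slack value here are assumptions for illustration, not the paper's exact formulation.

```python
# Sketch, under assumptions, of performance-weighted fusion of two
# detection modalities. A modality whose detection count diverges
# from the tracked object count (beyond a small slack) gets its
# weight driven towards zero, effectively disregarding it.

def modality_weight(tracked, detected, slack=1):
    """Weight in (0, 1]; equals 1 when the counts agree within the
    slack allowance for objects entering/exiting the scene."""
    disagreement = max(abs(tracked - detected) - slack, 0)
    return 1.0 / (1.0 + disagreement)

def fuse_weights(tracked, visual_detected, thermal_detected):
    """Normalised per-modality fusion weights."""
    wv = modality_weight(tracked, visual_detected)
    wt = modality_weight(tracked, thermal_detected)
    total = wv + wt
    return wv / total, wt / total

# Example: the visual modality fails (detects 9 objects when only 3
# are tracked), so fusion leans heavily on the thermal modality.
print(fuse_weights(tracked=3, visual_detected=9, thermal_detected=3))
```

The normalised weights would then scale each modality's detection scores before the fused result is passed to the tracker.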

Abstract:

Objective: To explore the specific factors that impact on nursing resources in relation to the ‘unoccupied bed’.
Design: Descriptive observational study.
Setting: Multiple wards in a single-site, tertiary referral hospital.
Main outcome measure: Identification and classification of tasks related to the unoccupied bed.
Results: Our study identified three main areas of nursing work centred on the ‘unoccupied bed’: 1) bed preparation for admission; 2) temporary transfer; and 3) bed preparation post patient discharge.
Conclusion: The unoccupied bed is not resource neutral and may involve considerable nursing time. The time associated with each of the reasons for the bed being unoccupied remains to be quantified.