915 results for Computer software -- Development -- Congresses


Relevance: 100.00%

Abstract:

The Strolls project was originally devised for a colleague in Finland and a cross-cultural event called AU goes to FI; the core concept is the re-experience and presentation of the ‘everyday’ experience of life rather than the usual cultural icons. The project grew and was presented as a mash-up site with Google Maps (truna aka j.turner & David Browning). The site is now cobwebbed, but some of the participant-made strolls are archived here. The emphasis on the walk and the taking of image stills (as opposed to straightforward video) is based on a notion of partaking of the environment with technology. The process involves a strange and distinct embodiment, as the maker must stop and choose each subsequent shot in order to build up the final animated sequence. The viewer becomes subtly involved in the maker’s decisions.

Relevance: 100.00%

Abstract:

The worldwide installed base of enterprise resource planning (ERP) systems has increased rapidly over the past 10 years, now comprising tens of thousands of installations in large- and medium-sized organizations and millions of licensed users. Similar to traditional information systems (IS), ERP systems must be maintained and upgraded. It is therefore not surprising that ERP maintenance activities have become the largest budget provision in the IS departments of many ERP-using organizations. Yet, there has been limited study of ERP maintenance activities. Are they simply instances of traditional software maintenance activities to which traditional software maintenance research findings can be generalized? Or are they fundamentally different, such that new research, specific to ERP maintenance, is required to help alleviate the ERP maintenance burden? This paper reports a case study of a large organization that implemented ERP (an SAP system) more than three years ago. From the case study and data collected, we observe the following distinctions of ERP maintenance: (1) the ERP-using organization, in addition to addressing internally originated change-requests, also implements maintenance introduced by the vendor; (2) requests for user support concerning the ERP system behavior, function and training constitute a major part of ERP maintenance activity; and (3) similar to the in-house software environment, enhancement is the major maintenance activity in the ERP environment, encompassing almost 64% of the total change-request effort. In light of these and other findings, we ultimately: (1) propose a clear and precise definition of ERP maintenance; (2) conclude that ERP maintenance cannot be sufficiently described by existing software maintenance taxonomies; and (3) propose a benefits-oriented taxonomy that better represents ERP maintenance activities. Three salient dimensions (for characterizing requests) incorporated in the proposed ERP maintenance taxonomy are: (1) who is the maintenance source? (2) why is it important to service the request? and (3) what impact, if any, does implementing the request have on the installed module(s)?
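The paper's three-dimensional taxonomy lends itself to a simple data structure. The sketch below is purely illustrative: the class names and category values are invented here, not taken from the paper.

```python
# Hypothetical sketch of the three taxonomy dimensions as a data structure;
# field and category names are illustrative, not the authors' own schema.
from dataclasses import dataclass
from enum import Enum

class Source(Enum):            # who raised the request?
    INTERNAL_USER = "internal user"
    VENDOR_PATCH = "vendor patch/upgrade"

class Rationale(Enum):         # why is it important to service it?
    CORRECTIVE = "fix faulty behaviour"
    ENHANCEMENT = "add or improve functionality"
    USER_SUPPORT = "explain behaviour / training"

@dataclass
class ChangeRequest:
    request_id: str
    source: Source
    rationale: Rationale
    affects_installed_modules: bool    # the "what" dimension

req = ChangeRequest("CR-1042", Source.VENDOR_PATCH, Rationale.ENHANCEMENT, True)
print(f"{req.request_id}: {req.source.value}, {req.rationale.value}")
```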

Relevance: 100.00%

Abstract:

As computational models in fields such as medicine and engineering become more refined, their resource requirements increase. In the first instance, these needs have been satisfied using parallel computing on HPC clusters. However, such systems are often costly and lack flexibility. HPC users are therefore tempted to move to elastic HPC using cloud services. One difficulty in making this transition is that HPC and cloud systems differ, and performance may vary. The purpose of this study is to evaluate cloud services as a means to minimise both cost and computation time for large-scale simulations, and to identify which system properties have the most significant impact on performance. Our simulation results show that, while virtual CPU (vCPU) performance is satisfactory, network throughput may lead to difficulties.
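The cost/time trade-off being evaluated can be made concrete with a toy model. Everything below (workload size, bandwidth, hourly price, and the linear-scaling assumption) is invented for illustration; only the qualitative point, that network transfer can dominate as node count grows, echoes the study's finding.

```python
# Toy cost/time model for a parallel job on cloud nodes. All figures are
# made up; the assumption is perfect CPU scaling with communication cost
# growing with node count, so the network eventually dominates.
def run_time_hours(n_nodes, cpu_hours=1000.0, comm_gb=500.0, gbps_per_node=1.0):
    compute = cpu_hours / n_nodes                     # assumed linear speed-up
    network = comm_gb * 8 / (gbps_per_node * 3600)    # transfer time, in hours
    return compute + network * n_nodes                # comm grows with nodes

def cost_dollars(n_nodes, price_per_node_hour=0.50):
    return run_time_hours(n_nodes) * n_nodes * price_per_node_hour

for n in (1, 8, 32, 128):
    print(f"{n:4d} nodes: {run_time_hours(n):8.2f} h, ${cost_dollars(n):9.2f}")
```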

Relevance: 100.00%

Abstract:

Most real-life data analysis problems are difficult to solve using exact methods, due to the size of the datasets and the nature of the underlying mechanisms of the system under investigation. As datasets grow even larger, finding the balance between the quality of the approximation and the computing time of the heuristic becomes non-trivial. One solution is to consider parallel methods, using the increased computational power to perform a deeper exploration of the solution space in a similar time. It is, however, difficult to estimate a priori whether parallelisation will provide the expected improvement. In this paper we consider a well-known method, genetic algorithms, and evaluate the behaviour of classic and parallel implementations on two distinct problem types.
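As a hedged sketch of that comparison, the snippet below implements a classic generational GA whose fitness evaluation can optionally be farmed out to worker processes; the one-max objective and all parameter settings are placeholders, not the paper's benchmark problems.

```python
# Classic GA with optional parallel fitness evaluation via multiprocessing.
# Toy objective and settings only, for illustrating the classic/parallel split.
import random
from multiprocessing import Pool

def fitness(ind):                        # toy objective: maximise number of 1s
    return sum(ind)

def evolve(pop_size=100, length=64, generations=50, pool=None):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # the only difference between variants: serial vs parallel evaluation
        scores = pool.map(fitness, pop) if pool else list(map(fitness, pop))
        ranked = [ind for _, ind in sorted(zip(scores, pop),
                                           key=lambda t: t[0], reverse=True)]
        parents = ranked[: pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]                 # one-point crossover
            if random.random() < 0.05:                # bit-flip mutation
                child[random.randrange(length)] ^= 1
            children.append(child)
        pop = children
    return max(pop, key=fitness)

if __name__ == "__main__":
    with Pool() as p:
        best = evolve(pool=p)
    print("best fitness:", fitness(best))
```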

Relevance: 100.00%

Abstract:

This paper presents a technique for the automated removal of noise from process execution logs. Noise is the result of data quality issues such as logging errors and manifests itself in the form of infrequent process behavior. The proposed technique generates an abstract representation of an event log as an automaton capturing the directly-follows relations between event labels. This automaton is then pruned of arcs with low relative frequency and used to remove from the log those events that do not fit the automaton, which are identified as outliers. The technique has been extensively evaluated on top of various automated process discovery algorithms, using both artificial logs with different levels of noise and a variety of real-life logs. The results show that the technique significantly improves the quality of the discovered process model along the dimensions of fitness, appropriateness and simplicity, without negative effects on generalization. Further, the technique scales well to large and complex logs.
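The core idea can be sketched as follows. This is a simplified reconstruction, not the authors' exact algorithm: it counts directly-follows arcs, prunes those below a relative-frequency threshold (the value here is arbitrary), and drops events reachable only via pruned arcs.

```python
# Simplified directly-follows noise filter; threshold and log are illustrative.
from collections import Counter

def filter_log(log, threshold=0.1):
    arcs = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            arcs[(a, b)] += 1                   # count directly-follows pairs
    max_freq = max(arcs.values())
    kept = {arc for arc, n in arcs.items() if n / max_freq >= threshold}
    cleaned = []
    for trace in log:
        out = []
        for event in trace:
            if not out or (out[-1], event) in kept:
                out.append(event)               # drop events on pruned arcs
        cleaned.append(out)
    return cleaned

log = [["a", "b", "c"]] * 20 + [["a", "c"]] * 5 + [["a", "x", "c"]]
print(filter_log(log))                          # rare "x" removed as an outlier
```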

Relevance: 100.00%

Abstract:

The output of a differential scanning fluorimetry (DSF) assay is a series of melt curves, which need to be interpreted to get value from the assay. An application that translates raw thermal melt curve data into more easily assimilated knowledge is described. This program, called “Meltdown,” conducts four main activities—control checks, curve normalization, outlier rejection, and melt temperature (Tm) estimation—and performs optimally in the presence of triplicate (or higher) sample data. The final output is a report that summarizes the results of a DSF experiment. The goal of Meltdown is not to replace human analysis of the raw fluorescence data but to provide a meaningful and comprehensive interpretation of the data to make this useful experimental technique accessible to inexperienced users, as well as providing a starting point for detailed analyses by more experienced users.
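Meltdown's own Tm algorithm is not reproduced here, but a common approach, estimating Tm as the temperature of the steepest rise in the smoothed fluorescence curve, can be sketched with synthetic data:

```python
# Generic Tm estimate from a melt curve: temperature of maximum dF/dT on the
# smoothed signal. Not Meltdown's actual algorithm; the data are synthetic.
import numpy as np

temps = np.arange(25.0, 95.0, 0.5)
tm_true = 55.0
fluor = 1 / (1 + np.exp(-(temps - tm_true) / 2.0))             # idealised sigmoid
fluor += np.random.default_rng(0).normal(0, 0.01, temps.size)  # assay noise

kernel = np.ones(5) / 5                        # simple moving-average smoothing
smooth = np.convolve(fluor, kernel, mode="same")
dfdt = np.gradient(smooth, temps)              # first derivative of the curve
tm_est = temps[np.argmax(dfdt)]
print(f"estimated Tm: {tm_est:.1f} C (true {tm_true:.1f} C)")
```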

Relevance: 100.00%

Abstract:

Self-authored video, where participants are in control of creating their own footage, is a means of generating innovative design material and including all members of a family in design activities. This paper describes our adaptation of this process, called Self-Authored Video Interviews (SAVIs), which we created and prototyped to better understand how families engage with situated technology in the home. We find that the methodology produces unique insights into family dynamics in the home, uncovering assumptions and tensions unlikely to be discovered using more conventional methods. The paper outlines a number of challenges and opportunities associated with the methodology: specifically, maximising the value of the insights gathered by appealing to children to champion the cause, and countering perceptions of the lingering presence of researchers.

Relevance: 100.00%

Abstract:

To remain competitive, many agricultural systems are now being run along business lines. Systems methodologies are being incorporated, and here evolutionary computation is a valuable tool for identifying more profitable or sustainable solutions. However, agricultural models typically pose some of the more challenging problems for optimisation. This chapter outlines these problems, then presents a series of three case studies demonstrating how they can be overcome in practice. Firstly, increasingly complex models of Australian livestock enterprises show that evolutionary computation is the only viable optimisation method for these large and difficult problems. Ongoing research is taking a notably efficient and robust variant, differential evolution, out into real-world systems. Next, models of cropping systems in Australia demonstrate the challenge of dealing with competing objectives, namely maximising farm profit whilst minimising resource degradation. Pareto methods are used to illustrate this trade-off, and the results have proved most useful for farm managers in this industry. Finally, land-use planning in the Netherlands demonstrates the size and spatial complexity of real-world problems. Here, GIS-based optimisation techniques are integrated with Pareto methods, producing better solutions that were acceptable to the competing organizations. These three studies all show that evolutionary computation remains the only feasible method for the optimisation of large, complex agricultural problems. An extra benefit is that the resultant population of candidate solutions illustrates trade-offs, leading to more informed discussions and better education of industry decision-makers.
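For readers unfamiliar with the variant named above, here is a minimal sketch of differential evolution (the common DE/rand/1/bin scheme); the sphere objective and all settings are placeholders, not the livestock or cropping models discussed in the chapter.

```python
# Minimal DE/rand/1/bin sketch; toy objective and parameters for illustration.
import numpy as np

def de(objective, bounds, pop_size=30, f=0.8, cr=0.9, iters=200, seed=1):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, (pop_size, len(bounds)))
    scores = np.array([objective(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            # pick three distinct vectors other than the target
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + f * (b - c), lo, hi)     # differential mutation
            cross = rng.random(len(bounds)) < cr          # binomial crossover
            cross[rng.integers(len(bounds))] = True       # keep >=1 mutant gene
            trial = np.where(cross, mutant, pop[i])
            t_score = objective(trial)
            if t_score < scores[i]:                       # greedy selection
                pop[i], scores[i] = trial, t_score
    return pop[scores.argmin()], scores.min()

best, val = de(lambda x: np.sum(x**2), np.array([[-5.0, 5.0]] * 4))
print(best, val)
```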

Relevance: 100.00%

Abstract:

Genetic mark–recapture requires efficient methods of uniquely identifying individuals. 'Shadows' (individuals with the same genotype at the selected loci) become more likely with increasing sample size, and they bias harvest rate estimates. Finding loci is costly, but better loci reduce analysis costs and improve power. Optimal microsatellite panels minimize shadows, but panel design is a complex optimization process. The programs locuseater and shadowboxer permit power and cost analysis of this process and automate some aspects by simulating the entire experiment from panel design to harvest rate estimation.
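The shadow effect is easy to quantify in outline. The sketch below is a standard back-of-envelope calculation (assuming Hardy-Weinberg equilibrium and invented allele frequencies), not code from locuseater or shadowboxer: the expected number of shadow pairs grows with the square of the sample size.

```python
# Expected shadow pairs from per-locus probability of identity (PI).
# Allele frequencies are invented; HWE and locus independence are assumed.
from itertools import combinations
from math import comb

def prob_identity(freqs):
    """P(two random individuals share a genotype) at one locus under HWE."""
    homo = sum(p**4 for p in freqs)
    het = sum((2 * pi * pj) ** 2 for pi, pj in combinations(freqs, 2))
    return homo + het

loci = [[0.4, 0.3, 0.2, 0.1]] * 8            # 8 loci, 4 alleles each
pi_multi = 1.0
for f in loci:
    pi_multi *= prob_identity(f)             # multilocus PI

for n in (100, 1000, 10000):
    print(f"n={n:6d}: expected shadow pairs ~ {comb(n, 2) * pi_multi:.3f}")
```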

Relevance: 100.00%

Abstract:

APSIM-ORYZA is a new functionality developed in the APSIM framework to simulate rice production while addressing management issues such as fertilisation and transplanting, which are particularly important in Korean agriculture. To validate the model for Korean rice varieties and field conditions, the measured yields and flowering times from three field experiments conducted by the Gyeonggi Agricultural Research and Extension Services (GARES) in Korea were compared against the simulated outputs for different management practices and rice varieties. Simulated yields of early-, mid- and mid-to-late-maturing varieties of rice grown in a continuous rice cropping system from 1997 to 2004 showed close agreement with the measured data. Similar results were also found for yields simulated under seven levels of nitrogen application. When different transplanting times were modelled, simulated flowering times ranged from within 3 days of the measured values for the early-maturing varieties, to up to 9 days after the measured dates for the mid- and especially mid-to-late-maturing varieties. This was associated with highly variable simulated yields which correlated poorly with the measured data. This suggests the need to accurately calibrate the photoperiod sensitivity parameters of the model for the photoperiod-sensitive rice varieties in Korea.

Relevance: 100.00%

Abstract:

Wilmot Senaratne, Bill Palmer and Bob Sutherst recently published their paper 'Applications of CLIMEX modelling leading to improved biological control' in Proceedings of the 16th Australian Weeds Conference. They looked at three examples where modern climate-matching techniques using computer software produce better decisions and results than earlier techniques such as climadiagrams. Assessment of climatic suitability is important at various stages of a biological control project: from initial foreign exploration, to risk assessment in preparation for the release of a particular agent, through to selection of release sites that maximise the agent's chances of initial establishment. It is now also necessary to predict potential future distributions of both target weeds and agents under climate change.

Relevance: 100.00%

Abstract:

Introduction: Extreme heat events (both heat waves and extremely hot days) are increasing in frequency and duration globally, and they cause more deaths in Australia than any other extreme weather event. Numerous studies have demonstrated a link between extreme heat events and an increased risk of morbidity and death. In this study, the researchers sought to identify whether extreme heat events in the Tasmanian population were associated with any changes in emergency department admissions to the Royal Hobart Hospital (RHH) for the period 2003-2010. Methods: Non-identifiable RHH emergency department data and climate data from the Australian Bureau of Meteorology were obtained for the period 2003-2010. Statistical analyses were conducted using the statistical software R, with the distributed lag non-linear model (dlnm) package used to fit a quasi-Poisson generalised linear regression model. Results: This study showed that the relative risk (RR) of admission to the RHH during 2003-2010 was significant at temperatures above 24°C, with a lag effect lasting 12 days and the main effect noted one day after the extreme heat event. Discussion: This study demonstrated that extreme heat events have a significant impact on public hospital admissions. Two limitations were identified: admissions data rather than presentations data were used, and further analysis could be done to compare types of admissions and presentations between heat and non-heat events. Conclusion: With the impacts of climate change already being felt in Australia, public health organisations in Tasmania and the rest of Australia need to implement adaptation strategies that enhance resilience and protect the public from the adverse health effects of heat events and climate change.
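A simplified Python analogue of the modelling step might look as follows; the study itself used R's dlnm package, and the synthetic data, lag structure and coefficients below are illustrative only.

```python
# Quasi-Poisson GLM of daily admissions on current and lagged temperature.
# Synthetic data; a simplified stand-in for the study's R/dlnm analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
days = 2000
temp = 15 + 8 * np.sin(np.arange(days) * 2 * np.pi / 365) + rng.normal(0, 3, days)
admissions = rng.poisson(20 + 0.5 * np.clip(temp - 24, 0, None))  # heat effect

max_lag = 12                                   # study found effects to lag 12
lags = pd.DataFrame({f"temp_lag{l}": pd.Series(temp).shift(l)
                     for l in range(max_lag + 1)})
data = lags.assign(y=admissions).dropna()      # drop rows without full lags

X = sm.add_constant(data.drop(columns="y"))
fit = sm.GLM(data["y"], X, family=sm.families.Poisson()).fit(scale="X2")
print(fit.summary().tables[1])                 # Pearson scale ~ quasi-Poisson
```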

Relevance: 100.00%

Abstract:

Flood extent mapping is a basic tool for flood damage assessment, and it can be done with digital classification techniques using satellite imagery, including data recorded by radar and optical sensors. However, converting the data into the information we need is not a straightforward task. One of the great challenges in the data interpretation is to separate permanent water bodies from flooding regions, including both the fully inundated areas and the wet areas where trees and houses are partly covered with water. This paper adopts a decision fusion technique to combine the mapping results from radar data with the NDVI derived from optical data. An improved capacity to identify permanent or semi-permanent water bodies within flood-inundated areas has been achieved. The software tools MultiSpec and MATLAB were used.
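A decision-fusion rule of the kind described can be sketched as follows; the masks, thresholds and class labels are invented for illustration and are not the paper's actual rule.

```python
# Toy decision fusion of a radar water mask with optical NDVI: radar flags
# water, NDVI separates open water from wet vegetation, and a prior mask
# removes permanent water. All inputs and thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(0)
radar_water = rng.random((4, 4)) > 0.5        # True where radar says "water"
ndvi = rng.uniform(-0.2, 0.8, (4, 4))         # (NIR - Red) / (NIR + Red)
permanent = np.zeros((4, 4), dtype=bool)
permanent[0, :] = True                        # e.g. a known river channel

LOW_NDVI = 0.1                                # illustrative threshold
open_flood = radar_water & (ndvi < LOW_NDVI) & ~permanent   # fully inundated
wet_veg = radar_water & (ndvi >= LOW_NDVI) & ~permanent     # partly covered

classes = np.select([permanent, open_flood, wet_veg], [1, 2, 3], default=0)
print(classes)   # 0 dry, 1 permanent water, 2 open flood, 3 wet vegetation
```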