986 results for libreria, Software, Database, ORM, transazionalità


Relevance:

20.00%

Publisher:

Abstract:

Growing participation is a key challenge for the viability of sustainability initiatives, many of which require enactment at a local community level in order to be effective. This paper undertakes a review of technology-assisted carpooling in order to understand the challenge of designing participation and to consider how mobile social software and interface design can be brought to bear. It was found that while persuasive technology and social networking approaches have roles to play, the critical factors in the design of carpooling are convenience, ease of use and fit with contingent circumstances, all of which require a user-centred approach to designing a technological system and building participation. Moreover, the reach of global, technology-platform-based approaches may be limited if they do not cater to local needs. An approach that focuses on iteratively designing technology to support and grow mobile social ridesharing networks in particular locales is proposed. The paper contributes an understanding of HCI approaches in the context of other approaches to designing participation.

The study focuses on an alluvial plain situated within a large meander of the Logan River at Josephville near Beaudesert, which supports a factory that processes gelatine. The plant draws water from on-site bores, as well as the Logan River, for its production processes and produces approximately 1.5 ML per day of waste water containing high levels of dissolved ions (Douglas Partners, 2004). At present a series of treatment ponds is used to aerate the waste water, reducing the level of organic matter; the water is then used to irrigate grazing land around the site. Within the study the hydrogeology is investigated, a conceptual groundwater model is produced and a numerical groundwater flow model is developed from it. On the site are several bores that access groundwater, plus a network of monitoring bores. Assessment of drilling logs shows the area is formed from a mixture of poorly sorted Quaternary alluvial sediments, with a laterally continuous aquifer comprised of coarse sands and fine gravels that is in contact with the river. This aquifer occurs at a depth of between 11 and 15 metres and is overlain by a heterogeneous mixture of silts, sands and clays. The study investigates the degree of interaction between the river and the groundwater within the fluvially derived sediments, for reasons of both environmental monitoring and sustainability of the potential local groundwater resource. A conceptual hydrogeological model of the site proposes two hydrostratigraphic units: a basal aquifer of coarse-grained materials overlain by a thick semi-confining unit of finer materials. From this, a two-layer groundwater flow model and hydraulic conductivity distribution were developed from bore monitoring and rainfall data using MODFLOW (McDonald and Harbaugh, 1988) and PEST (Doherty, 2004) within the GMS 6.5 software (EMSI, 2008). A second model was also considered, with the alluvium represented as a single hydrogeological unit.
Both models were calibrated to steady-state conditions, and sensitivity analyses of the parameters demonstrated that both models are very stable for changes in the range of ±10% for all parameters and still reasonably stable for changes up to ±20%, with RMS errors in the model always less than 10%. The preferred two-layer model was found to give the more realistic representation of the site: water level variations and the numerical modelling showed that the basal layer of coarse sands and fine gravels is hydraulically connected to the river, while the upper layer, comprising a poorly sorted mixture of silt-rich clays and sands of very low permeability, limits infiltration from the surface to the lower layer. The paucity of historical data has limited the numerical modelling to a steady-state model based on groundwater levels during a drought period, and forecasts for varying hydrological conditions (e.g. short-term as well as prolonged dry and wet conditions) cannot reasonably be made from such a model. If future modelling is to be undertaken, it is necessary to establish a regular program of groundwater monitoring and maintain a long-term database of water levels to enable a transient model to be developed at a later stage. This will require a valid monitoring network to be designed, with additional bores required for adequate coverage of the hydrogeological conditions at the Josephville site. Further investigations would also be enhanced by pump testing to investigate hydrogeological properties of the aquifer.
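Calibration quality of the kind reported above is usually summarised as an RMS error of head residuals normalised by the observed head range. A minimal illustrative sketch in Python (the bore heads and the helper name are hypothetical, not site data or GMS/PEST output):

```python
import math

def normalised_rms_error(observed, simulated):
    """Percent RMS of head residuals, normalised by the observed head range."""
    residuals = [o - s for o, s in zip(observed, simulated)]
    rms = math.sqrt(sum(r * r for r in residuals) / len(residuals))
    head_range = max(observed) - min(observed)
    return 100.0 * rms / head_range

# Hypothetical groundwater heads (metres) at five monitoring bores
observed = [12.4, 11.9, 13.1, 12.0, 11.5]
simulated = [12.5, 11.8, 13.0, 12.1, 11.6]

error_pct = normalised_rms_error(observed, simulated)
print(f"Normalised RMS error: {error_pct:.1f}%")  # well under the 10% criterion
```

A sensitivity analysis then repeats this calculation after perturbing each calibrated parameter by ±10% and ±20% and checks how far the error moves.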

Invited one-hour presentation at Microsoft Tech Ed 2009 about getting students interested in games programming at QUT.

To meet the new challenges of Enterprise Systems, which essentially go beyond the initial implementation, contemporary organizations seek business process experts with software skills. Despite healthy demand from industry for such expertise, recent studies reveal that most Information Systems (IS) graduates are ill-equipped to meet the challenges of modern organizations. This paper shares insights and experiences from a course designed to provide a business-process-centric view of a market-leading Enterprise System. The course, designed for both undergraduate and graduate students, uses two common business processes in a case study that employs both sequential and explorative exercises. Student feedback gained through two longitudinal surveys across two phases of the course shows promising signs for the teaching approach.

Few coherent and logical frameworks exist for the teaching and assessment of programming subjects, and fewer still are sufficiently generic and adaptable to be used outside the particular tertiary institutions in which they were developed. This paper presents the Teaching and Assessment of Software Development (TASD) framework. We describe its development and implementation at an Australian university and demonstrate, with examples and supporting data, how it has been used. Extracts of criteria sheets (grading rubrics) for a variety of assessment tasks are included. The numerous advantages of this new framework are discussed, with comparisons made to those reported in the published literature.

Effective management of groundwater requires stakeholders to have a realistic conceptual understanding of the groundwater systems and hydrological processes. However, groundwater data can be complex, confusing and often difficult for people to comprehend. A powerful way to communicate understanding of groundwater processes, complex subsurface geology and their relationships is through the use of visualisation techniques to create 3D conceptual groundwater models. In addition, the ability to animate, interrogate and interact with 3D models can encourage a higher level of understanding than static images alone. While an increasing number of software tools are available for developing and visualising groundwater conceptual models, these packages are often very expensive and, owing to their complexity, not readily accessible to most people. The Groundwater Visualisation System (GVS) is a software framework that can be used to develop groundwater visualisation tools aimed specifically at non-technical computer users and those who are not groundwater domain experts. A primary aim of GVS is to provide management support for agencies and to enhance community understanding.

The Howard East rural area has experienced rapid growth of small-block subdivisions and horticulture over the last 40 years, based on groundwater supply. Early bores in the area provide part of the water supply for Darwin City and are maintained and monitored by NT Power & Water Corporation. The Territory government (NRETAS) has established a monitoring network, and 48 bores are now monitored. However, there are over 2700 private bores in the area that are unregulated. Although NRETAS has both finite difference (FDM) and finite element (FEM) simulations for the region, community support for potential regulation is sought. To improve stakeholder understanding of the resource, QUT was retained by the TRaCK consortium to develop a 3D visualisation of the groundwater system.

What does it mean when we design for accessibility, inclusivity and "dissolving boundaries" -- particularly those boundaries between the design philosophy, the software/interface actuality and the stated goals? This paper is about the principles underlying a research project called 'The Little Grey Cat engine', or greyCat. The greyCat engine has grown out of our experience in using commercial game engines as production environments for the transmission of culture and experience through the telling of individual stories. The key to this endeavour is the potential of the greyCat software to visualise worlds and the manner in which non-formal stories are intertwined with place. The apparently simple dictum of "show, don't tell" and the use of 3D game engines as a medium disguise an interesting nexus of problematic issues and questions, particularly in the ramifications for cultural dimensions and participatory interaction design. The engine is currently in alpha, and this paper is its background story. In this paper we discuss the problematic, thrown into sharp relief by a particular project, and continue to unpack the concepts and early designs behind greyCat itself.

Decisions made in the earliest stage of architectural design have the greatest impact on the construction, lifecycle cost and environmental footprint of buildings. Yet the building services, one of the largest contributors to cost, complexity, and environmental impact, are rarely considered as an influence on the design at this crucial stage. In order for efficient and environmentally sensitive built environment outcomes to be achieved, a closer collaboration between architects and services engineers is required at the outset of projects. However, in practice, there are a variety of obstacles impeding this transition towards an integrated design approach. This paper firstly presents a critical review of the existing barriers to multidisciplinary design. It then examines current examples of best practice in the building industry to highlight the collaborative strategies being employed and their benefits to the design process. Finally, it discusses a case study project to identify directions for further research.

Last year European Intellectual Property Review published an article comparing the latest version of the proposed US database legislation, the Collections of Information Antipiracy Bill with the UK's Copyright and Rights in Database Regulations 1997. Subsequently a new US Bill, the Consumer and Investor Access to Information Act has emerged, the Antipiracy Bill has been amended and much debate has occurred, but the US seems no closer to enacting database legislation. This article briefly outlines the background to the US legislative efforts, examines the two Bills and draws some comparisons with the UK Regulations. A study of the US Bills clearly demonstrates the starkly divided opinion on database protection held by the Bills' proponents and the principal lobby groups driving the legislative efforts: the Antipiracy Bill is very protective of database producers' interests, whereas the Access Bill is heavily user-oriented. If the US experience is any indication there will be a long horizon involved in achieving any consensus on international harmonisation of this difficult area.

Campylobacter jejuni, followed by Campylobacter coli, contributes substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to the large-scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high-resolution C. jejuni and C. coli genotyping schemes that are convenient for high-throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost-effective interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken-derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi-Locus Sequence Typing Single Nucleotide Polymorphism) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. The present study showed that SNP and binary typing, alone or in combination, are effective at detecting epidemiological linkage between chicken-derived Campylobacter isolates and enable data comparisons with other MLST-based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a set of human isolates previously typed by SNP + binary typing and MLST.
Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long-term epidemiological linkage of the isolates, Aim 2 of this study describes the development of an HRM-based curve analysis method to interrogate the hypervariable Campylobacter flagellin-encoding gene (flaA). The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short-term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step, closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA-specific primers that did not amplify the flaB gene. HRM curve analysis of flaA successfully discriminated the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated with the epidemiological background of the isolates. In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive Hierarchical Resolving Assays Using Nucleic Acids) approach to genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method.
Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called 'SNP Nucleated Minim MLST' or 'Minim typing'. The method involves HRM interrogation of MLST fragments that encompass highly informative "Nucleating SNPs" to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) "Minimum SNPs" and ii) the new 'HRMtype' software packages. Species-specific sets of six "Nucleating SNPs" and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. 'Minim typing' was tested empirically by typing 15 C. jejuni and five C. coli isolates. The clonal complexes (CC) assigned to each isolate by 'Minim typing' and by SNP + binary typing were used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent for both methods. Thus, 'Minim typing' is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or to meet the resolution of, sequence-based MLST gene interrogation. 'Minim typing' in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform that is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high-throughput genotyping of these common food-borne pathogens.
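The resolving power of typing schemes like these is conventionally quantified with Simpson's index of diversity (the Hunter-Gaston formulation): the probability that two isolates drawn at random receive different types. A minimal sketch, using made-up genotype assignments rather than data from this study:

```python
from collections import Counter

def simpsons_diversity(genotypes):
    """Hunter-Gaston discriminatory index: probability that two isolates
    drawn at random from the collection receive different types."""
    n = len(genotypes)
    counts = Counter(genotypes).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical type assignments for the same ten isolates under two schemes
scheme_a = ["ST-524", "ST-524", "ST-48", "ST-48", "ST-48",
            "ST-50", "ST-50", "ST-21", "ST-21", "ST-21"]
scheme_b = ["A", "A", "A", "A", "B", "B", "B", "B", "C", "C"]

print(f"Scheme A D = {simpsons_diversity(scheme_a):.3f}")
print(f"Scheme B D = {simpsons_diversity(scheme_b):.3f}")
```

A combinatorial scheme (e.g. SNP + binary + flaA HRM) is evaluated by treating each unique combination of results as one composite type and recomputing the index.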

With the advancement of Service-Oriented Architecture in the technical and business domain, the management & engineering of services requires a thorough and systematic understanding of the service lifecycle for both business and software services. However, while service-oriented approaches acknowledge the importance of the service ecosystem, service lifecycle models are typically internally focused, paying limited attention to processes related to offering services to or using services from other actors. In this paper, we address this need by discussing the relations between a comprehensive service lifecycle approach for service management & engineering and the sourcing & purchasing of services. In particular we pay attention to the similarities and differences between sourcing business and software services, the alignment between service management & engineering and sourcing & purchasing, the role of sourcing in the transformation of an organization towards a service-oriented paradigm, the role of architectural approaches to sourcing in this transformation, and the sourcing of specific services at different levels of granularity.

Distributed Denial of Service (DDoS) attacks have become one of the biggest threats to resources on the Internet. The purpose of these attacks is to prevent servers from providing services to legitimate users; the attacks are also used to consume network bandwidth. Current intrusion detection systems can detect the attacks but cannot prevent them or trace the location of intruders. Some schemes prevent attacks by simply discarding attack packets, which saves the victim from the attack, but network bandwidth is still wasted. In our opinion, DDoS requires a distributed solution to avoid this waste of resources. This paper presents a system that not only detects such attacks but also traces and blocks the multiple intruders (to save bandwidth as well) using Intelligent Software Agents. The system gives a dynamic response and can be integrated with existing network defence systems without disturbing the existing Internet model. We have implemented an agent-based network monitoring system in this regard.
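The paper does not reproduce its agents' internals here, but one common detection heuristic such an agent might apply is a per-source sliding-window rate check. A toy sketch, with all class names, thresholds and addresses hypothetical:

```python
from collections import defaultdict

class RateDetectionAgent:
    """Flags source IPs whose packet count in a sliding time window exceeds
    a threshold -- a simplified stand-in for one detection-agent heuristic."""

    def __init__(self, window_seconds=1.0, max_packets=100):
        self.window = window_seconds
        self.max_packets = max_packets
        self.arrivals = defaultdict(list)  # src_ip -> recent packet timestamps

    def observe(self, src_ip, timestamp):
        """Record one packet; return True if src_ip now looks like a flooder."""
        times = self.arrivals[src_ip]
        times.append(timestamp)
        # Discard timestamps that have fallen out of the sliding window
        cutoff = timestamp - self.window
        while times and times[0] < cutoff:
            times.pop(0)
        return len(times) > self.max_packets

agent = RateDetectionAgent(window_seconds=1.0, max_packets=100)
# 150 packets from one source within 0.15 s trips the threshold
flagged = any(agent.observe("10.0.0.9", t / 1000.0) for t in range(150))
print("flooder detected:", flagged)
```

In the distributed setting the paper describes, an agent raising this flag would then cooperate with agents on upstream networks to trace and block the sources rather than merely dropping packets at the victim.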

The Dynamic Data eXchange (DDX) is our third-generation platform for building distributed robot controllers. DDX allows a coalition of programs to share data at run-time through an efficient shared-memory mechanism managed by a store. Further, stores on multiple machines can be linked by means of a global catalog, with data moved between the stores on an as-needed basis by multicasting. Heterogeneous computer systems are also supported. We describe the architecture of DDX and the standard clients we have developed that let us rapidly build complex control systems with minimal coding.
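DDX's actual interface is a native library and is not reproduced here, but the store concept -- clients publishing and reading named variables through a shared store -- can be illustrated with a toy in-process analogue (all names and the API are hypothetical, not DDX's):

```python
import threading

class Store:
    """Toy analogue of a DDX store: clients read and write named variables.
    A real store uses shared memory within a machine and multicasts updates
    between linked stores; here a dict plus a lock stands in for both."""

    def __init__(self):
        self._vars = {}
        self._lock = threading.Lock()

    def write(self, name, value):
        with self._lock:
            self._vars[name] = value

    def read(self, name, default=None):
        with self._lock:
            return self._vars.get(name, default)

# One client (say, a localisation process) publishes; another consumes.
store = Store()
store.write("robot_pose", (1.2, 3.4, 0.0))
print(store.read("robot_pose"))
```

The decoupling shown here is the point of the design: producer and consumer share only the variable name, so controller components can be developed, started and stopped independently.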