396 results for distributed transaction processing
Abstract:
Manual calibration of large and dynamic networks of cameras is labour intensive and time consuming. This is a strong motivator for the development of automatic calibration methods. Automatic calibration relies on the ability to find correspondences between multiple views of the same scene. If the cameras are sparsely placed, this can be a very difficult task. This PhD project focuses on the further development of uncalibrated wide baseline matching techniques.
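The correspondence search that such automatic calibration relies on can be illustrated with a minimal sketch, given here only as an illustrative example rather than the project's own method: it uses OpenCV's ORB features, a ratio-test match and RANSAC filtering on the epipolar constraint. The image paths, feature counts and thresholds are placeholders.

```python
# Illustrative sketch only: finding putative correspondences between two views,
# a prerequisite for automatic calibration. Not the thesis's method.
import cv2
import numpy as np

img1 = cv2.imread("view_a.png", cv2.IMREAD_GRAYSCALE)  # placeholder paths
img2 = cv2.imread("view_b.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Ratio test keeps only distinctive matches, which matters under wide baselines.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# RANSAC on the epipolar constraint rejects outliers; F could then feed self-calibration.
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
print(f"{int(mask.sum())} inlier correspondences out of {len(good)}")
```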
Abstract:
This thesis is a documented energy audit and long-term study of energy and water reduction in a ghee factory. Global production of ghee exceeds 4 million tonnes annually. The factory in this study refines dairy products by non-traditional centrifugal separation and produces 99.9% pure, canned, crystallised Anhydrous Milk Fat (Ghee). Ghee is traditionally made by batch processing methods, which are less efficient than centrifugal separation. An in-depth, systematic investigation was conducted of each item of major equipment: ammonia refrigeration, a steam boiler, canning equipment, pumps, heat exchangers and compressed air systems were all fine-tuned. Continuous monitoring of electrical usage showed that not every initiative worked; others had payback periods of less than a year. In 1994-95 energy consumption was 6,582 GJ and in 2003-04 it was 5,552 GJ, down 16% for a similar output. A significant reduction in water usage was achieved by reducing the airflow in the refrigeration evaporative condensers to match the refrigeration load. Water usage has fallen 68%, from 18 ML in 1994-95 to 5.78 ML in 2003-04. The methods reported in this thesis could be applied to other industries that have similar equipment, and to other ghee manufacturers.
Abstract:
Synthetic polymers have attracted much attention in tissue engineering due to their ability to modulate biomechanical properties. This study investigated the feasibility of processing poly(ε-caprolactone) (PCL) homopolymer, PCL-poly(ethylene glycol) (PEG) diblock, and PCL-PEG-PCL triblock copolymers into three-dimensional porous scaffolds. Properties of the various polymers were investigated by dynamic thermal analysis. The scaffolds were manufactured using the desktop robot-based rapid prototyping technique. Gross morphology and internal three-dimensional structure of scaffolds were identified by scanning electron microscopy and micro-computed tomography, which showed excellent fusion at the filament junctions, high uniformity, and complete interconnectivity of pore networks. The influences of process parameters on scaffolds' morphological and mechanical characteristics were studied. Data confirmed that the process parameters directly influenced the pore size, porosity, and, consequently, the mechanical properties of the scaffolds. The in vitro cell culture study was performed to investigate the influence of polymer nature and scaffold architecture on the adhesion of the cells onto the scaffolds using rabbit smooth muscle cells. Light, scanning electron, and confocal laser microscopy showed cell adhesion, proliferation, and extracellular matrix formation on the surface as well as inside the structure of both scaffold groups. The completely interconnected and highly regular honeycomb-like pore morphology supported bridging of the pores via cell-to-cell contact as well as production of extracellular matrix at later time points. The results indicated that the incorporation of hydrophilic PEG into hydrophobic PCL enhanced the overall hydrophilicity and cell culture performance of PCL-PEG copolymer. However, the scaffold architecture did not significantly influence the cell culture performance in this study.
Abstract:
Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems to support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations, including modernised GPS, Galileo, GLONASS, and Compass, with multiple frequencies have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of various isolated operating networks, single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of positioning processes shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous users' requests (reverse RTK)
Based on these four challenges, there are two major requirements for future NRTK data processing: expandable computing power and scalable data sharing/transferring capability. This research explores new approaches to address these future NRTK challenges and requirements using the Grid Computing facility, in particular for large data processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed in this research, which is a layered framework consisting of: 1) a client layer in the form of a Grid portal; 2) a service layer; 3) an execution layer. The user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and also on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed and the results have preliminarily demonstrated the concepts and functionality of the new NRTK framework based on Grid Computing, whilst some aspects of system performance are yet to be improved in future work.
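The layered structure described above can be sketched in a few lines of code. The sketch below is not the thesis's implementation; class names, the round-robin scheduler and the request fields are invented purely to show how a user request might flow from a Grid portal through a service layer to execution-layer nodes.

```python
# Minimal sketch (not the thesis's implementation) of the layered NRTK idea:
# a client request passes through a portal (client layer) to a service layer,
# which schedules simplified RTK jobs onto execution-layer nodes.
from dataclasses import dataclass
import itertools

@dataclass
class RtkRequest:
    user_id: str
    rover_position: tuple        # approximate rover coordinates (placeholder)
    reference_stations: list     # station IDs streaming RTCM data (placeholder)

class ExecutionNode:
    def __init__(self, name):
        self.name = name
    def run_rtk(self, request):
        # Placeholder for the actual RTK computation on downloaded RTCM data.
        return f"corrections for {request.user_id} computed on {self.name}"

class ServiceLayer:
    """Schedules incoming jobs across Grid nodes (round-robin for illustration)."""
    def __init__(self, nodes):
        self._cycle = itertools.cycle(nodes)
    def schedule(self, request):
        return next(self._cycle).run_rtk(request)

class GridPortal:
    """Client layer: the single entry point users submit requests to."""
    def __init__(self, service):
        self.service = service
    def submit(self, request):
        return self.service.schedule(request)

portal = GridPortal(ServiceLayer([ExecutionNode(f"node-{i}") for i in range(5)]))
print(portal.submit(RtkRequest("user-1", (-27.48, 153.03), ["STN_A", "STN_B"])))
```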
Abstract:
This paper looks at the challenges presented for the Australian Library and Information Association by its role as the professional association responsible for ensuring the quality of Australian library technician graduates. There is a particular focus on the issue of course recognition, where the Association's role is complicated by the need to work alongside the national quality assurance processes that have been established by the relevant technical education authorities. The paper describes the history of course recognition in Australia; examines the relationship between course recognition and other quality measures; and describes the process the Association has undertaken recently to ensure appropriate professional scrutiny in a changing environment of accountability.
Abstract:
PPP (Public-Private Partnership) is a new operating mode for infrastructure projects, which usually run over long periods and face various kinds of risks in technology, market, politics, policy, finance, society, natural conditions and cooperation. The government and the private agency should therefore establish a risk-sharing mechanism to ensure the successful implementation of the project. As an important branch of the new institutional economics, transaction cost economics and its analysis method have been proved to be beneficial to the proper allocation of risks between the two parties in PPP projects and to the improvement of the operating efficiency of the PPP risk-sharing mechanism. This paper analyzed the transaction costs of the project risk-sharing method and of both risk carriers. It pointed out that the risk-sharing method of PPP projects not only reflected the spirit of cooperation between the public sector and the private agency, but also minimized the total transaction cost of the risk-sharing mechanism itself. Meanwhile, the risk takers had to strike a balance between costs incurred beforehand and costs incurred afterwards so as to control the cost of risk management. The paper finally suggested three ways which might be useful in reducing transaction costs: choosing an appropriate type of contract for the PPP risk-sharing mechanism, preventing information asymmetry, and establishing mutual trust between the two participants.
Abstract:
In the field of the semantic grid, QoS-based Web service scheduling for workflow optimization is an important problem. However, in a semantic- and service-rich environment like the semantic grid, context constraints on Web services commonly emerge, so scheduling must consider not only the quality properties of Web services but also the inter-service dependencies formed by the context constraints imposed on them. In this paper, we present a repair genetic algorithm, namely a minimal-conflict hill-climbing repair genetic algorithm, to address scheduling optimization problems in workflow applications in the presence of domain constraints and inter-service dependencies. Experimental results demonstrate the scalability and effectiveness of the genetic algorithm.
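As a rough illustration of the kind of algorithm named here (not the authors' implementation), the sketch below embeds a minimal-conflict hill-climbing repair step inside a simple genetic algorithm for service selection. The QoS costs, dependency constraints and all parameters are synthetic placeholders.

```python
# Illustrative sketch of a repair GA for QoS-aware service selection: each task
# picks one candidate service; "dependency" constraints say that choosing
# service s for task i restricts which services task j may use. Infeasible
# offspring are repaired by minimal-conflict hill climbing.
import random

random.seed(1)
N_TASKS, N_CANDIDATES = 6, 4
# QoS cost of candidate c for task t (lower is better); synthetic data.
COST = [[random.uniform(1, 10) for _ in range(N_CANDIDATES)] for _ in range(N_TASKS)]
# (task_i, service_for_i, task_j, allowed_services_for_j)
DEPENDENCIES = [(0, 1, 2, {0, 1}), (3, 2, 4, {2}), (1, 0, 5, {1, 3})]

def conflicts(chrom):
    return [d for d in DEPENDENCIES
            if chrom[d[0]] == d[1] and chrom[d[2]] not in d[3]]

def repair(chrom):
    """Minimal-conflict hill climbing: reassign the gene that resolves most conflicts."""
    for _ in range(20):
        viol = conflicts(chrom)
        if not viol:
            break
        _, _, j, _ = viol[0]
        # Try every candidate for task j; keep the one with fewest remaining conflicts.
        chrom[j] = min(range(N_CANDIDATES),
                       key=lambda c: (len(conflicts(chrom[:j] + [c] + chrom[j + 1:])),
                                      COST[j][c]))
    return chrom

def fitness(chrom):
    return sum(COST[t][c] for t, c in enumerate(chrom))

def evolve(pop_size=30, generations=50):
    pop = [repair([random.randrange(N_CANDIDATES) for _ in range(N_TASKS)])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_TASKS)
            child = a[:cut] + b[cut:]                      # one-point crossover
            if random.random() < 0.2:                      # mutation
                child[random.randrange(N_TASKS)] = random.randrange(N_CANDIDATES)
            children.append(repair(child))
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best schedule:", best, "cost:", round(fitness(best), 2),
      "conflicts:", len(conflicts(best)))
```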
Abstract:
Structural health monitoring (SHM) is the term applied to the procedure of monitoring a structure's performance, assessing its condition and carrying out appropriate retrofitting so that it performs reliably, safely and efficiently. Bridges form an important part of a nation's infrastructure. They deteriorate due to age and changing load patterns, and hence early detection of damage helps in prolonging their lives and preventing catastrophic failures. Monitoring of bridges has traditionally been done by means of visual inspection. With recent developments in sensor technology and the availability of advanced computing resources, newer techniques have emerged for SHM. Acoustic emission (AE) is one such technology that is attracting the attention of engineers and researchers all around the world. This paper discusses the use of AE technology in health monitoring of bridge structures, with a special focus on analysis of recorded data. AE waves are stress waves generated by mechanical deformation of material and can be recorded by means of sensors attached to the surface of the structure. Analysis of the AE signals provides vital information regarding the nature of the source of emission. Signal processing of the AE waveform data can be carried out in several ways and is predominantly based on the time and frequency domains. Short-time Fourier transform and wavelet analysis have proved to be superior alternatives to traditional frequency-based analysis in extracting information from recorded waveforms. Some of the preliminary results of the application of these analysis tools in signal processing of recorded AE data will be presented in this paper.
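The two time-frequency tools named above can be sketched on a synthetic burst-like signal. The example below is not the paper's analysis code: the sampling rate, burst frequency and decay are invented, and it assumes SciPy and PyWavelets are available.

```python
# Minimal sketch of time-frequency analysis of a synthetic AE-like burst:
# short-time Fourier transform via SciPy and a continuous wavelet transform
# via PyWavelets. Signal parameters are invented for illustration.
import numpy as np
from scipy.signal import stft
import pywt

fs = 1_000_000                      # 1 MHz sampling rate (assumed, for illustration)
t = np.arange(0, 2e-3, 1 / fs)      # 2 ms record
# Decaying 150 kHz burst starting at 0.5 ms, buried in noise.
burst = np.exp(-(t - 5e-4).clip(min=0) * 2e4) * np.sin(2 * np.pi * 150e3 * t) * (t > 5e-4)
signal = burst + 0.05 * np.random.randn(t.size)

# Short-time Fourier transform: where in time the burst's energy sits in frequency.
f, tau, Zxx = stft(signal, fs=fs, nperseg=256)
peak_f = f[np.abs(Zxx).max(axis=1).argmax()]
print(f"STFT peak frequency ~ {peak_f / 1e3:.0f} kHz")

# Continuous wavelet transform: better joint time-frequency localisation for transients.
scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)
print(f"CWT peak frequency ~ {freqs[np.abs(coeffs).max(axis=1).argmax()] / 1e3:.0f} kHz")
```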
Abstract:
Human visual processing of simple two-colour patterns was investigated using a delayed match-to-sample paradigm with positron emission tomography (PET). This study is unique in that the authors specifically designed the visual stimuli to be the same for both pattern and colour recognition, with all patterns being abstract shapes, not easily verbally coded, composed of two-colour combinations. The authors did this to explore those brain regions required for both colour and pattern processing and to separate those areas of activation required for one or the other. Ten right-handed male volunteers aged 18–35 years were recruited. The authors found that both tasks activated similar occipital regions, the major difference being more extensive activation in pattern recognition. A right-sided network that involved the inferior parietal lobule, the head of the caudate nucleus, and the pulvinar nucleus of the thalamus was common to both paradigms. Pattern recognition also activated the left temporal pole and right lateral orbital gyrus, whereas colour recognition activated the left fusiform gyrus and several right frontal regions.
Abstract:
Several protocols for isolation of mycobacteria from water exist, but there is no established standard method. This study compared methods of processing potable water samples for the isolation of Mycobacterium avium and Mycobacterium intracellulare using spiked sterilized water and tap water decontaminated using 0.005% cetylpyridinium chloride (CPC). Samples were concentrated by centrifugation or filtration and inoculated onto Middlebrook 7H10 and 7H11 plates and Lowenstein-Jensen slants and into mycobacterial growth indicator tubes with or without polymyxin, azlocillin, nalidixic acid, trimethoprim, and amphotericin B. The solid media were incubated at 32°C, at 35°C, and at 35°C with CO2 and read weekly. The results suggest that filtration of water for the isolation of mycobacteria is a more sensitive method for concentration than centrifugation. The addition of sodium thiosulfate may not be necessary and may reduce the yield. Middlebrook 7H10 and 7H11 were equally sensitive culture media. CPC decontamination, while effective for reducing growth of contaminants, also significantly reduces mycobacterial numbers. There was no difference at 3 weeks between the different incubation temperatures.
Abstract:
A laboratory scale twin screw extruder has been interfaced with a near infrared (NIR) spectrometer via a fibre optic link so that NIR spectra can be collected continuously during the small scale experimental melt state processing of polymeric materials. This system can be used to investigate melt state processes such as reactive extrusion, in real time, in order to explore the kinetics and mechanism of the reaction. A further advantage of the system is that it has the capability to measure apparent viscosity simultaneously which gives important additional information about molecular weight changes and polymer degradation during processing. The system was used to study the melt processing of a nanocomposite consisting of a thermoplastic polyurethane and an organically modified layered silicate.
Abstract:
Modern enterprise knowledge management systems typically require distributed approaches and the integration of numerous heterogeneous sources of information. A powerful foundation for these tasks can be Topic Maps, which not only provide a semantic net-like knowledge representation means and the possibility to use ontologies for modelling knowledge structures, but also offer concepts to link these knowledge structures with unstructured data stored in files, external documents etc. In this paper, we present the architecture and prototypical implementation of a Topic Map application infrastructure, the ‘Topic Grid’, which enables transparent, node-spanning access to different Topic Maps distributed in a network.
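The "transparent, node-spanning access" idea can be pictured with a toy sketch. Nothing below comes from the paper's prototype: the class names, the federating facade and the sample data are invented solely to show how a client could query one entry point while topic data remains distributed across nodes.

```python
# Rough sketch of node-spanning access: a federating facade queries several
# node-local topic stores and merges the occurrences found for a topic.
class TopicMapNode:
    def __init__(self, name, topics):
        self.name = name
        self.topics = topics              # topic id -> list of occurrence URIs

    def occurrences(self, topic_id):
        return self.topics.get(topic_id, [])

class TopicGridFacade:
    """Single access point: clients need not know which node holds which map."""
    def __init__(self, nodes):
        self.nodes = nodes

    def occurrences(self, topic_id):
        merged = []
        for node in self.nodes:
            merged.extend((node.name, uri) for uri in node.occurrences(topic_id))
        return merged

grid = TopicGridFacade([
    TopicMapNode("node-a", {"enterprise-km": ["file://a/report.pdf"]}),
    TopicMapNode("node-b", {"enterprise-km": ["http://b.example/wiki/KM"]}),
])
print(grid.occurrences("enterprise-km"))
```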
Abstract:
This paper considers some of the implications of the rise of design as a master-metaphor of the information age. It compares the terms 'interaction design' and 'mass communication', suggesting that both can be seen as a contradiction in terms, inappropriately preserving an industrial-age division between producers and consumers. With the shift from mass media to interactive media, semiotic and political power seems to be shifting too - from media producers to designers. This paper argues that it is important for the new discipline of 'interaction design' not to fall into habits of thought inherited from the 'mass' industrial era. Instead it argues for the significance, for designers and producers alike, of what I call 'distributed expertise' - including social network markets, a DIY culture, user-led innovation, consumer co-created content, and the use of Web 2.0 affordances for social, scientific and creative purposes as well as for entertainment. It considers the importance of the growth of 'distributed expertise' as part of a new paradigm in the growth of knowledge, which has 'evolved' through a number of phases, from 'abstraction' to 'representation', to 'productivity'. In the context of technologically mediated popular participation in the growth of knowledge and social relationships, the paper argues that design and media-production professions need to cross rather than to maintain the gap between experts and everyone else, enabling all the agents in the system to navigate the shift into the paradigm of mass productivity.
Abstract:
SoundCipher is a software library written in the Java language that adds important music and sound features to the Processing environment that is widely used by media artists and otherwise has an orientation toward computational graphics. This article introduces the SoundCipher library and its features, describes its influences and design intentions, and positions it within the field of computer music programming tools. SoundCipher enables the rich history of algorithmic music techniques to be accessible within one of today’s most popular media art platforms. It also provides an accessible means for learning to create algorithmic music and sound programs.
Abstract:
With the phenomenal growth of electronic data and information, there are many demands for the development of efficient and effective systems (tools) to perform data mining tasks on multidimensional databases. Association rules describe associations between items in the same transaction (intra) or in different transactions (inter). Association mining attempts to find interesting or useful association rules in databases: this is the crucial issue for the application of data mining in the real world. Association mining can be used in many application areas, such as the discovery of associations between customers' locations and shopping behaviours in market basket analysis. Association mining includes two phases. The first phase, called pattern mining, is the discovery of frequent patterns. The second phase, called rule generation, is the discovery of interesting and useful association rules among the discovered patterns. The first phase, however, often takes a long time to find all frequent patterns, which also include much noise. The second phase is also a time-consuming activity that can generate many redundant rules. To improve the quality of association mining in databases, this thesis provides an alternative technique, granule-based association mining, for knowledge discovery in databases, where a granule refers to a predicate that describes common features of a group of transactions. The new technique first transfers transaction databases into basic decision tables, then uses multi-tier structures to integrate pattern mining and rule generation in one phase for both intra- and inter-transaction association rule mining. To evaluate the proposed new technique, this research defines the concept of meaningless rules by considering the correlations between data dimensions for intra-transaction association rule mining. It also uses precision to evaluate the effectiveness of inter-transaction association rules. The experimental results show that the proposed technique is promising.
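The conventional two-phase process the thesis builds on (frequent-pattern discovery followed by rule generation) can be shown in a short sketch. This is a plain Apriori-style illustration on invented market-basket data, not the thesis's granule-based technique; the support and confidence thresholds are arbitrary.

```python
# Minimal sketch of two-phase association mining: Phase 1 finds frequent
# itemsets, Phase 2 generates rules from them. Data and thresholds are invented.
from itertools import combinations

transactions = [{"bread", "milk"}, {"bread", "butter", "milk"},
                {"butter", "milk"}, {"bread", "butter"}, {"bread", "butter", "milk"}]
MIN_SUPPORT, MIN_CONFIDENCE = 0.4, 0.7

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

# Phase 1: pattern mining -- enumerate itemsets that meet minimum support.
items = sorted({i for t in transactions for i in t})
frequent = {}
for size in range(1, len(items) + 1):
    level = {frozenset(c): support(set(c)) for c in combinations(items, size)}
    level = {s: sup for s, sup in level.items() if sup >= MIN_SUPPORT}
    if not level:
        break
    frequent.update(level)

# Phase 2: rule generation -- split each frequent itemset into antecedent => consequent.
for itemset, sup in frequent.items():
    if len(itemset) < 2:
        continue
    for k in range(1, len(itemset)):
        for antecedent in map(frozenset, combinations(itemset, k)):
            confidence = sup / frequent[antecedent]
            if confidence >= MIN_CONFIDENCE:
                print(f"{set(antecedent)} => {set(itemset - antecedent)} "
                      f"(support {sup:.2f}, confidence {confidence:.2f})")
```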