939 results for Continuously Stirred Bioreactor
Abstract:
In 1997, business trend analyst Linda Stone proposed the term "continuous partial attention" to characterise the contemporary experience of wanting to be 'a live node on the network'. She argued that while it can be a positive and functional behaviour, it also has the potential to be disabling, compromising reflective and creative thought. Subsequent studies have explored the ways in which technology has slowly disrupted the idea and experience of a "centred" and "bounded" self. Studies of 'Gen Y' show the ease with which young people accommodate this multiplying of the self as they negotiate their partial friendships and networks of interest with family and work. In teaching and learning circles in tertiary education we talk a lot about problems of student 'disengagement'. In characterising our challenge this way, are we undermining our potential to understand the tendencies of contemporary learners? This paper begins a consideration of how traditional models, frameworks and practices might oppose these partially engaged but continuously connected and interpersonal "dividuals". What questions does this provoke for learning environments that seek to harness yet counterpoint the crisis students might experience, and to recognise but also integrate their multiple selves towards what they aim to become through the process of learning?
Abstract:
Automatic detection of suspicious activities in CCTV camera feeds is crucial to the success of video surveillance systems. Such a capability can help transform dumb CCTV cameras into smart surveillance tools for fighting crime and terror. Learning and classification of basic human actions is a precursor to detecting suspicious activities. Most current approaches rely on the unrealistic assumption that a complete dataset of normal human actions is available. This paper presents a different approach to the problem of understanding human actions in video when no prior information is available. This is achieved by working with an incomplete dataset of basic actions which is continuously updated. Initially, all video segments are represented by the Bag-of-Words (BOW) method using only Term Frequency-Inverse Document Frequency (TF-IDF) features. Then, a data-stream clustering algorithm is applied to update the system's knowledge from the incoming video feeds. Finally, all the actions are classified into different sets. Experiments and comparisons are conducted on the well-known Weizmann and KTH datasets to show the efficacy of the proposed approach.
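As a rough illustration of the pipeline described in this abstract (TF-IDF-weighted BOW vectors per video segment, followed by incremental clustering of the incoming stream into action sets), the following Python sketch uses scikit-learn. The visual-word histograms, vocabulary size, number of clusters and the choice of MiniBatchKMeans are assumptions for illustration only, not the authors' implementation.

# Minimal sketch: BOW + TF-IDF representation of video segments, updated
# with a data-stream clustering algorithm as new feeds arrive.
# Upstream feature extraction (visual-word counts per segment) is assumed.
import numpy as np
from sklearn.feature_extraction.text import TfidfTransformer
from sklearn.cluster import MiniBatchKMeans

VOCAB_SIZE = 500      # size of the visual-word codebook (assumed)
N_ACTION_SETS = 6     # number of action clusters maintained (assumed)

tfidf = TfidfTransformer()
clusterer = MiniBatchKMeans(n_clusters=N_ACTION_SETS, random_state=0)

def process_batch(bow_counts):
    """bow_counts: (n_segments, VOCAB_SIZE) visual-word histograms."""
    # Simplification: IDF is re-fitted per batch rather than maintained online.
    X = tfidf.fit_transform(bow_counts)   # TF-IDF weighting of the BOW vectors
    clusterer.partial_fit(X)              # update cluster model from the stream
    return clusterer.predict(X)           # assign each segment to an action set

# Example: two incoming batches of simulated visual-word histograms
rng = np.random.default_rng(0)
for _ in range(2):
    batch = rng.integers(0, 5, size=(32, VOCAB_SIZE))
    print(process_batch(batch)[:10])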
Abstract:
Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS, and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated operating network and single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• multiple GNSS constellations and multiple frequencies;
• large-scale, wide-area NRTK services with a network of networks;
• complex computation algorithms and processes;
• a greater part of positioning processes shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK).
Based on these four challenges, there are two major requirements for future NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to address these future NRTK challenges and requirements using Grid Computing facilities, in particular for large data processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed in this research, which is a layered framework consisting of: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. The user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and also on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results have preliminarily demonstrated the concepts and functionality of the new NRTK framework based on Grid Computing, while some aspects of the system's performance are yet to be improved in future work.
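To make the layered request flow described above more concrete (client portal, service layer, execution layer, jobs scheduled to Grid nodes), here is an illustrative-only Python sketch. All class and method names are hypothetical and the round-robin scheduler is a placeholder; none of this reflects the actual QUT / Grid Australia implementation or the Ntrip software interface.

# Hypothetical sketch of the layered NRTK request flow:
# GridPortal (client) -> ServiceLayer -> ExecutionLayer -> Grid nodes.
from dataclasses import dataclass
import itertools

@dataclass
class RTKRequest:
    user_id: str
    rover_position: tuple      # approximate rover coordinates
    reference_stations: list   # CORS mount points to pull RTCM data from

class ExecutionLayer:
    """Schedules jobs across Grid nodes (simple round-robin placeholder)."""
    def __init__(self, nodes):
        self._cycle = itertools.cycle(nodes)

    def submit(self, request):
        node = next(self._cycle)
        # In the real framework this step would stage RTCM data and run the
        # RTK computation on the selected node.
        return f"job for {request.user_id} scheduled on {node}"

class ServiceLayer:
    """Validates requests and forwards them to the execution layer."""
    def __init__(self, execution):
        self.execution = execution

    def handle(self, request):
        if not request.reference_stations:
            raise ValueError("no reference stations selected")
        return self.execution.submit(request)

class GridPortal:
    """Client layer: the entry point users interact with."""
    def __init__(self, service):
        self.service = service

    def request_positioning(self, request):
        return self.service.handle(request)

portal = GridPortal(ServiceLayer(ExecutionLayer(
    nodes=["node1", "node2", "node3", "node4", "node5"])))
print(portal.request_positioning(
    RTKRequest("user42", (-27.48, 153.03), ["STATION_A", "STATION_B"])))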
Abstract:
High time resolution aerosol mass spectrometry measurements were conducted during a field campaign at Mace Head Research Station, Ireland, in June 2007. Observations on one particular day of the campaign clearly indicated advection of aerosol from volcanoes and desert plains in Iceland, which could be traced with NOAA HYSPLIT air mass back trajectories and satellite images. In conjunction with this event, elevated levels of sulphate and light-absorbing particles were encountered at Mace Head. While the sulphate concentration was continuously increasing, nitrate levels remained low, indicating no significant contribution from anthropogenic pollutants. The sulphate concentration increased by about 3.8 µg/m3 in comparison with background conditions. The corresponding sulphur flux from volcanic emissions was estimated at about 0.3 Tg S/yr, suggesting that a large amount of the sulphur released from Icelandic volcanoes may be distributed over distances larger than 1000 km. Overall, our results corroborate that transport of volcanogenic sulphate and dust particles can significantly change the chemical composition, size distribution, and optical properties of aerosol over the North Atlantic Ocean and should be considered accordingly by regional climate models.
Abstract:
A laboratory scale twin screw extruder has been interfaced with a near infrared (NIR) spectrometer via a fibre optic link so that NIR spectra can be collected continuously during the small scale experimental melt state processing of polymeric materials. This system can be used to investigate melt state processes such as reactive extrusion, in real time, in order to explore the kinetics and mechanism of the reaction. A further advantage of the system is that it has the capability to measure apparent viscosity simultaneously which gives important additional information about molecular weight changes and polymer degradation during processing. The system was used to study the melt processing of a nanocomposite consisting of a thermoplastic polyurethane and an organically modified layered silicate.
Abstract:
In a competitive environment, companies continuously innovate to offer superior services at lower costs. 'Shared services' have been extensively adopted in practice as one means of improving organisational performance. Shared services are considered most appropriate for support functions and are widely adopted in Human Resource Management, Finance and Accounting, and more recently across the Information Systems function. IS applications and infrastructure are an important enabler and driver of shared services in all functional areas. As computer-based corporate information systems have become de facto and the internet pervasive and increasingly the backbone of administrative systems, the technical impediments to sharing have come down dramatically. As this trend continues, CIOs and IT professionals will need a deeper understanding of the shared services phenomenon and its implications. The advent of shared services has consequential implications for the IS academic discipline. Yet, archival analysis of the IS academic literature reveals that shared services, though mentioned in more than 100 articles, has received little in-depth attention. This paper is the first attempt to investigate and report on the current status of shared services in the IS literature. The paper presents a detailed review of the literature from the main IS journals and conferences, with findings evidencing a lack of focus, and definitions and objectives lacking conceptual rigour. The paper concludes with a tentative operational definition, a list of perceived main objectives of shared services, and an agenda for related future research.
Abstract:
The central aim of the research undertaken in this PhD thesis is to develop a model for simulating water droplet movement on a leaf surface and to compare the model behaviour with experimental observations. A series of five papers is presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of the droplet on the leaf surface is important for understanding how a droplet of water, pesticide, or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially, a laser scanner is used to capture the surface characteristics of two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from or comes to a standstill on the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
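A minimal sketch of the surface-fitting stage only, under assumptions of my own: scattered scan points are triangulated and a thin-plate-spline RBF is fitted over them with SciPy. This illustrates just the RBF ingredient on simulated data; the thesis' hybrid Clough-Tocher/RBF scheme, which enforces the continuously turning normal, is more involved and is not reproduced here.

# Sketch: triangulate scattered "leaf scan" points and fit an RBF surface.
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import RBFInterpolator

# Simulated scattered scanner data: (x, y) positions and surface heights z
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 1.0, size=(400, 2))
z = np.sin(2 * np.pi * xy[:, 0]) * np.cos(np.pi * xy[:, 1])  # stand-in leaf shape

tri = Delaunay(xy)  # triangulation over the chosen internal points
rbf = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=1e-6)

# Evaluate the fitted surface at the triangle centroids as a quick check
centroids = xy[tri.simplices].mean(axis=1)
z_fit = rbf(centroids)
print("max |residual| at data points:", np.abs(rbf(xy) - z).max())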
Abstract:
Buffer overflow vulnerabilities continue to prevail and the sophistication of attacks targeting these vulnerabilities is continuously increasing. As a successful attack of this type has the potential to completely compromise the integrity of the targeted host, early detection is vital. This thesis examines generic approaches for detecting executable payload attacks, without prior knowledge of the implementation of the attack, in such a way that new and previously unseen attacks are detectable. Executable payloads are analysed in detail for attacks targeting the Linux and Windows operating systems executing on an Intel IA-32 architecture. The execution flow of attack payloads is analysed and a generic model of execution is examined. A novel classification scheme for executable attack payloads is presented which allows for characterisation of executable payloads and facilitates vulnerability and threat assessments, and intrusion detection capability assessments for intrusion detection systems. An intrusion detection capability assessment may be utilised to determine whether or not a deployed system is able to detect a specific attack and to identify requirements for intrusion detection functionality for the development of new detection methods. Two novel detection methods are presented capable of detecting new and previously unseen executable attack payloads. The detection methods are capable of identifying and enumerating the executable payload's interactions with the operating system on the targeted host at the time of compromise. The detection methods are further validated using real world data including executable payload attacks.
Abstract:
Providing precise positioning services in regional areas to support the agriculture, mining, and construction sectors depends on the availability of ground-based continuously operating GNSS reference stations and the communications linking these stations to central computers and users. With the support of the CRC for Spatial Information, a comprehensive review has recently been completed to examine the various wired and wireless communication links available for precise positioning services, in particular in regional areas of Queensland. The study covers a wide range of currently available communication technologies, including fixed, mobile wireless, and geostationary and/or low earth orbiting satellite links. These technologies are compared in terms of bandwidth, typical latency, reliability, coverage, and cost. Additionally, some tests were conducted to determine the performance of the different systems in real environments. Finally, based on user application requirements, the paper discusses the suitability of the different communication links.
Abstract:
When communicating emotion in music, composers and performers encode their expressive intentions through the control of basic musical features such as pitch, loudness, timbre, mode, and articulation. The extent to which emotion can be controlled through the systematic manipulation of these features has not been fully examined. In this paper we present CMERS, a Computational Music Emotion Rule System for the control of perceived musical emotion, which modifies features at the levels of both score and performance in real time. CMERS was evaluated in two rounds of perceptual testing. In experiment I, 20 participants continuously rated the perceived emotion of 15 music samples generated by CMERS. Three musical works were used, each with five emotional variations (normal, happy, sad, angry, and tender). The emotion intended by CMERS was correctly identified 78% of the time, with significant shifts in valence and arousal also recorded, regardless of each work's original emotion.
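To show the general shape of a rule system of this kind, here is a toy Python sketch mapping a target emotion to adjustments of basic musical features. The rule table and its values are invented for illustration and are not the published CMERS rules.

# Illustrative-only rule table: target emotion -> feature adjustments.
RULES = {
    "happy":  dict(mode="major", tempo=+0.15, loudness=+0.10, articulation="staccato"),
    "sad":    dict(mode="minor", tempo=-0.20, loudness=-0.15, articulation="legato"),
    "angry":  dict(mode="minor", tempo=+0.20, loudness=+0.20, articulation="staccato"),
    "tender": dict(mode="major", tempo=-0.10, loudness=-0.10, articulation="legato"),
}

def apply_rules(score_features, emotion):
    """Return a modified copy of a piece's features for the target emotion."""
    rule = RULES[emotion]
    out = dict(score_features)
    out["mode"] = rule["mode"]
    out["tempo_bpm"] = score_features["tempo_bpm"] * (1 + rule["tempo"])
    out["loudness_db"] = score_features["loudness_db"] + 10 * rule["loudness"]
    out["articulation"] = rule["articulation"]
    return out

original = {"mode": "major", "tempo_bpm": 100, "loudness_db": -20, "articulation": "legato"}
print(apply_rules(original, "sad"))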
Abstract:
In this thesis, a new technique has been developed for determining the composition of a collection of loads including induction motors. The application would be to provide a representation of the dynamic electrical load of Brisbane so that the ability of the power system to survive a given fault can be predicted. Most of the work on load modelling to date has been on post-disturbance analysis, not on continuous on-line models for loads. The post-disturbance methods are unsuitable for load modelling where the aim is to determine the control action or a safety margin for a specific disturbance. This thesis is based on on-line load models. To validate the approach, Dr Tania Parveen considers 10 induction motors with different power ratings, inertias and torque damping constants, and their composite models are developed with different percentage contributions for each motor. This thesis also shows how measurements of a composite load respond to normal power system variations, and how this information can be used to continuously decompose the load and to characterise it in terms of different sizes and proportions of motor loads.
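A rough sketch of the decomposition idea on synthetic data: given the responses of a set of candidate motor models to a normal system variation, the percentage contribution of each motor to a measured composite load can be estimated by least squares. The data and the plain least-squares fit are my own simplifications; the thesis uses detailed induction motor models, not this toy setup.

# Sketch: estimate motor composition of a composite load by least squares.
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_motors = 200, 10

# Columns: response of each candidate motor model to the same variation
motor_responses = rng.normal(size=(n_samples, n_motors))

# True (unknown) composition used to build a synthetic composite measurement
true_fractions = rng.dirichlet(np.ones(n_motors))
composite = motor_responses @ true_fractions + 0.01 * rng.normal(size=n_samples)

# Estimate the composition from the measurement
est, *_ = np.linalg.lstsq(motor_responses, composite, rcond=None)
est = np.clip(est, 0, None)
est /= est.sum()                      # normalise to percentage contributions

print("estimated contributions (%):", np.round(100 * est, 1))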
Abstract:
Purpose: Students with low vision may be disadvantaged compared with their normally sighted peers, as they frequently work at very short working distances and need to use low vision devices. The aim of this study was to examine the sustained reading rates of students with low vision and compare them with those of their peers with normal vision. The effects of visual acuity, acuity reserve and age on reading rate were also examined. Method: Fifty-six students (10 to 16 years of age), 26 with low vision and 30 with normal vision, were required to read text continuously for 30 minutes. Their position in the text was recorded at two-minute intervals. Distance and near visual acuity, working distance, cause of low vision, reading rates and reading habits were recorded. Results: A total of 80.7 per cent of the students with low vision maintained a constant reading rate during the 30 minutes of reading, although they read at approximately half the rate (104 wpm) of their normally sighted peers (195 wpm). Only four of the low vision subjects could not complete the reading task. Reading rates increased significantly with acuity reserve and with distance and near visual acuity, but there was no significant relationship between age and sustained reading rate. Conclusions: The majority of students with low vision were able to maintain reading rates appropriate for coping in integrated educational settings. Surprisingly, only relatively few subjects (16 per cent) used their prescribed low vision devices, even though the average accommodative demand was 9 D, and generally they revealed a greater dislike of reading compared with students with normal vision.
Abstract:
In this paper, the problems of three-carrier phase ambiguity resolution (TCAR) and position estimation (PE) are generalised as real-time GNSS data processing problems for a continuously observing network on a large scale. In order to describe these problems, a general linear equation system is presented to unify the various geometry-free, geometry-based and geometry-constrained TCAR models, along with the state transition equations between observation times. With this general formulation, generalised TCAR solutions are given to cover different real-time GNSS data processing scenarios, together with various simplified integer solutions, such as geometry-free rounding and geometry-based LAMBDA solutions with single- and multiple-epoch measurements. In fact, the various ambiguity resolution (AR) solutions differ in the float ambiguity estimation and integer ambiguity search processes, but their theoretical equivalence holds under the same observational system models and statistical assumptions. TCAR performance benefits outlined in the data analyses of recent literature are reviewed, showing profound implications for future GNSS development from both technology and application perspectives.
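For reference, and using notation assumed here rather than taken from the paper, the standard mixed integer-real GNSS model and the integer least-squares search solved by LAMBDA can be written as

\[
\mathbf{y} = \mathbf{A}\,\mathbf{a} + \mathbf{B}\,\mathbf{b} + \boldsymbol{\varepsilon}, \qquad \mathbf{a} \in \mathbb{Z}^{n}, \quad \mathbf{b} \in \mathbb{R}^{p},
\]
\[
\check{\mathbf{a}} = \arg\min_{\mathbf{a} \in \mathbb{Z}^{n}} \, (\hat{\mathbf{a}} - \mathbf{a})^{\mathsf{T}} \mathbf{Q}_{\hat{\mathbf{a}}}^{-1} (\hat{\mathbf{a}} - \mathbf{a}),
\]

where \(\mathbf{y}\) stacks the multi-frequency code and carrier phase observations, \(\mathbf{a}\) contains the integer ambiguities, \(\mathbf{b}\) the remaining real-valued parameters (e.g. positions and atmospheric delays), and \(\mathbf{Q}_{\hat{\mathbf{a}}}\) is the covariance of the float ambiguity estimate \(\hat{\mathbf{a}}\). Geometry-free rounding simply rounds each component of \(\hat{\mathbf{a}}\) to the nearest integer, whereas geometry-based LAMBDA performs the full integer search above.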
Abstract:
Successful product innovation and the ability of companies to continuously improve their innovation processes are rapidly becoming essential requirements for competitive advantage and long-term growth in both manufacturing and service industries. It is now recognized that companies must develop innovation capabilities across all stages of the product development, manufacture, and distribution cycle. These Continuous Product Innovation (CPI) capabilities are closely associated with a company's knowledge management systems and processes. Companies must develop mechanisms to continuously improve these capabilities over time. Using the results of an international survey on CPI practices, sets of companies are identified by similarities in specific contingencies related to the complexity of their products, processes, technologies, and customer interfaces. Differences in the learning behaviors present in the company groups, and in the levers used to develop and support these behaviors, are identified and discussed. The paper also discusses appropriate mechanisms for firms with similar complexities, and some approaches they can use to improve their organizational learning and product innovation.