971 results for Employed
Abstract:
Query reformulation is a key user behavior during Web search. Our research goal is to develop predictive models of query reformulation during Web searching. This article reports results from a study in which we automatically classified the query-reformulation patterns for 964,780 Web searching sessions, composed of 1,523,072 queries, to predict the next query reformulation. We employed an n-gram modeling approach to describe the probability of users transitioning from one query-reformulation state to another to predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction, coverage of the dataset, and complexity of the possible pattern set. The results show that Reformulation and Assistance account for approximately 45% of all query reformulations; furthermore, the results demonstrate that the first- and second-order models provide the best predictability, between 28 and 40% overall and higher than 70% for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance.
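For readers who want a concrete picture of the n-gram approach described above, the sketch below builds transition counts over reformulation states and predicts the most probable next state. It is a minimal illustration in Python: the state labels and sessions are invented placeholders, not the article's taxonomy or data.

```python
from collections import Counter, defaultdict

# Hypothetical reformulation-state sequences, one per search session.
# The state labels are illustrative stand-ins for the article's categories.
sessions = [
    ["New", "Reformulation", "Assistance", "Reformulation"],
    ["New", "Specialization", "Reformulation"],
    ["New", "Reformulation", "Reformulation", "Assistance"],
]

def build_ngram_model(sessions, order=2):
    """Count transitions from an (order)-length state history to the next state."""
    counts = defaultdict(Counter)
    for states in sessions:
        for i in range(len(states) - order):
            history = tuple(states[i:i + order])
            counts[history][states[i + order]] += 1
    return counts

def predict_next(model, history):
    """Return the most probable next state and its estimated probability."""
    followers = model.get(tuple(history))
    if not followers:
        return None  # pattern not covered by the training data
    state, count = followers.most_common(1)[0]
    return state, count / sum(followers.values())

model = build_ngram_model(sessions, order=2)  # a second-order model
print(predict_next(model, ["New", "Reformulation"]))
```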
Abstract:
In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers’ behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users’ actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The time period rarely affects navigational queries, while rates for transactional queries vary across periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (i.e., in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
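As a rough illustration of one-step-ahead prediction on transaction-log counts, the sketch below fits a first-order autoregressive model by least squares and issues a single forecast. The paper's actual time series and transfer function models are more elaborate, and the hourly click counts here are invented.

```python
import numpy as np

# Hypothetical hourly counts of sponsored-link clicks (illustrative values only).
clicks = np.array([120, 135, 128, 140, 150, 149, 160, 158, 90, 20, 5, 3], dtype=float)

# Fit y_t = c + phi * y_{t-1} by least squares, the simplest instance of the
# one-step prediction idea used in the paper's second phase.
y_prev, y_next = clicks[:-1], clicks[1:]
A = np.column_stack([np.ones_like(y_prev), y_prev])
(c, phi), *_ = np.linalg.lstsq(A, y_next, rcond=None)

# One-step-ahead forecast from the last observed value.
forecast = c + phi * clicks[-1]
print(f"one-step-ahead forecast: {forecast:.1f} clicks")
```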
Abstract:
This paper reports results from a study in which we automatically classified the query reformulation patterns for 964,780 Web searching sessions (composed of 1,523,072 queries) in order to predict what the next query reformulation would be. We employed an n-gram modeling approach to describe the probability of searchers transitioning from one query reformulation state to another and predict their next state. We developed first-, second-, third-, and fourth-order models and evaluated each model for accuracy of prediction. Findings show that Reformulation and Assistance account for approximately 45 percent of all query reformulations. Searchers seem to seek system searching assistance early in the session or after a content change. The results of our evaluations show that the first- and second-order models provided the best predictability, between 28 and 40 percent overall, and higher than 70 percent for some patterns. Implications are that the n-gram approach can be used for improving searching systems and searching assistance in real time.
Abstract:
The purpose of this study was to identify the pedagogical knowledge relevant to the successful completion of a pie chart item. This purpose was achieved through the identification of the essential fluencies that 12–13-year-olds required for the successful solution of a pie chart item. Fluency relates to ease of solution and is particularly important in mathematics because it impacts on performance. Although the majority of students were successful on this multiple choice item, there was considerable divergence in the strategies they employed. Approximately two-thirds of the students employed efficient multiplicative strategies, which recognised and capitalised on the pie chart as a proportional representation. In contrast, the remaining one-third of students used a less efficient additive strategy that failed to capitalise on the representation of the pie chart. The results of our investigation of students’ performance on the pie chart item during individual interviews revealed that five distinct fluencies were involved in the solution process: conceptual (understanding the question), linguistic (keywords), retrieval (strategy selection), perceptual (orientation of a segment of the pie chart) and graphical (recognising the pie chart as a proportional representation). In addition, some students exhibited mild disfluencies corresponding to the five fluencies identified above. Three major outcomes emerged from the study. First, a model of knowledge of content and students for pie charts was developed. This model can be used to inform instruction about the pie chart and guide strategic support for students. Second, perceptual and graphical fluency were identified as two aspects of the curriculum, which should receive a greater emphasis in the primary years, due to their importance in interpreting pie charts. Finally, a working definition of fluency in mathematics was derived from students’ responses to the pie chart item.
Abstract:
In condition-based maintenance (CBM), effective diagnostics and prognostics are essential tools for maintenance engineers to identify imminent faults and to predict the remaining useful life before components finally fail. This enables remedial actions to be taken in advance and production to be rescheduled if necessary. This paper presents a technique for accurate assessment of the remnant life of machines based on historical failure knowledge embedded in a closed-loop diagnostic and prognostic system. The technique uses a Support Vector Machine (SVM) classifier for both fault diagnosis and evaluation of the health stages of machine degradation. To validate the feasibility of the proposed model, data at five different degradation levels for four typical faults from High Pressure Liquefied Natural Gas (HP-LNG) pumps were used for multi-class fault diagnosis. In addition, two sets of impeller-rub data were analysed and employed to predict the remnant life of the pump based on estimation of health state. The results obtained were very encouraging and showed that the proposed prognosis system has the potential to be used as an estimation tool for machine remnant life prediction in real-life industrial applications.
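A minimal sketch of multi-class SVM fault diagnosis in the spirit of the paper, using scikit-learn; the features, class labels and data below are synthetic stand-ins for the HP-LNG pump measurements and feature set.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic stand-in for vibration features labelled with fault classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 8))        # 8 illustrative features per sample
y = rng.integers(0, 4, size=400)     # 4 hypothetical fault classes
X += y[:, None] * 0.5                # make the classes weakly separable

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An RBF-kernel SVM handles the multi-class diagnosis via one-vs-one voting.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print(f"diagnosis accuracy: {clf.score(X_test, y_test):.2f}")
```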
Abstract:
Classical negotiation models are weak in supporting real-world business negotiations because these models often assume that the preference information of each negotiator is made public. Although parametric learning methods have been proposed for acquiring the preference information of negotiation opponents, these methods suffer from the strong assumptions about the specific utility function and negotiation mechanism employed by the opponents. Consequently, it is difficult to apply these learning methods to the heterogeneous negotiation agents participating in e‑marketplaces. This paper illustrates the design, development, and evaluation of a nonparametric negotiation knowledge discovery method which is underpinned by the well-known Bayesian learning paradigm. According to our empirical testing, the novel knowledge discovery method can speed up the negotiation processes while maintaining negotiation effectiveness. To the best of our knowledge, this is the first nonparametric negotiation knowledge discovery method developed and evaluated in the context of multi-issue bargaining over e‑marketplaces.
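To make the Bayesian learning idea concrete, the sketch below maintains a belief over an opponent's possible reservation prices and updates it after each observed offer. The hypothesis set and likelihood model are illustrative assumptions, not the paper's actual nonparametric formulation.

```python
# Maintain a belief over candidate reservation prices and update it from each
# observed counter-offer (Bayes' rule). All numbers are invented for the example.
hypotheses = [60, 70, 80, 90, 100]            # candidate reservation prices
belief = {h: 1 / len(hypotheses) for h in hypotheses}

def likelihood(offer, reservation):
    # Assumed model: opponents tend to keep offers above their reservation price.
    return 0.9 if offer >= reservation else 0.1

def update(belief, offer):
    posterior = {h: p * likelihood(offer, h) for h, p in belief.items()}
    total = sum(posterior.values())
    return {h: p / total for h, p in posterior.items()}

for observed_offer in [95, 88, 82]:           # opponent's successive offers
    belief = update(belief, observed_offer)

print(max(belief, key=belief.get), belief)    # most probable hypothesis
```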
Abstract:
To allocate and size capacitors in a distribution system, an optimization algorithm called Discrete Particle Swarm Optimization (DPSO) is employed in this paper. The objective is to minimize the transmission line loss cost plus the capacitor cost. During the optimization procedure, the bus voltage, the feeder current, and the reactive power flowing back to the source side must be maintained within standard limits. To validate the proposed method, the semi-urban distribution system connected to bus 2 of the Roy Billinton Test System (RBTS) is used. This 37-bus distribution system has 22 loads located on the secondary side of a distribution substation (33/11 kV). By reducing the transmission line loss in a standard system, in which the loss accounts for only about 6.6 percent of total power, the capabilities of the proposed technique are validated.
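A toy sketch of how a discrete PSO can search over capacitor placements. The cost function is a placeholder: reproducing the paper's line-loss objective and the voltage, current and reverse-reactive-power constraints would require a full load-flow model of the RBTS feeder.

```python
import random

SIZES = [0, 150, 300, 450, 600]               # candidate kvar sizes per bus
N_BUSES, N_PARTICLES, ITERATIONS = 5, 20, 50

def cost(placement):
    # Hypothetical stand-in for loss cost + capacitor cost: penalise
    # deviation from a fictitious optimal placement.
    target = [300, 0, 450, 150, 0]
    return sum(abs(a - b) for a, b in zip(placement, target))

def mutate(placement, rate):
    # Discrete "velocity": each bus re-draws its size with some probability.
    return [random.choice(SIZES) if random.random() < rate else s
            for s in placement]

swarm = [[random.choice(SIZES) for _ in range(N_BUSES)] for _ in range(N_PARTICLES)]
best = min(swarm, key=cost)
for _ in range(ITERATIONS):
    for i, particle in enumerate(swarm):
        candidate = mutate(particle, rate=0.3)
        # Pull the particle toward the global best, one bus at a time.
        candidate = [g if random.random() < 0.2 else c
                     for c, g in zip(candidate, best)]
        if cost(candidate) < cost(particle):
            swarm[i] = candidate
    best = min(swarm + [best], key=cost)
print(best, cost(best))
```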
Abstract:
Spatial information captured by optical remote sensors on board unmanned aerial vehicles (UAVs) has great potential for automatic surveillance of electrical infrastructure. For an automatic vision-based power line inspection system, detecting power lines against a cluttered background is one of the most important and challenging tasks. In this paper, a novel method is proposed specifically for power line detection from aerial images. A pulse coupled neural filter is developed to remove background noise and generate an edge map before the Hough transform is employed to detect straight lines. The Hough transform is then improved by performing knowledge-based line clustering in Hough space to refine the detection results. An experiment on real image data captured from a UAV platform demonstrates that the proposed approach is effective for automatic power line detection.
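The line-detection stage can be illustrated with OpenCV's probabilistic Hough transform, as in the sketch below. Note that a Canny detector stands in for the paper's pulse coupled neural filter, and the input file name is hypothetical.

```python
import cv2
import numpy as np

# Simplified stand-in for the paper's pipeline: Canny replaces the pulse
# coupled neural filter; only the Hough line-detection stage is illustrated.
image = cv2.imread("aerial_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input
edges = cv2.Canny(image, 50, 150)

# Probabilistic Hough transform: long, straight segments are power line
# candidates; knowledge-based clustering in (rho, theta) space would follow.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)
for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    cv2.line(image, (x1, y1), (x2, y2), 255, 2)
cv2.imwrite("detected_lines.png", image)
```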
Abstract:
Light Detection and Ranging (LIDAR) has great potential to assist vegetation management in power line corridors by providing more accurate geometric information about the power line assets and the vegetation along the corridors. However, the development of algorithms for automatic processing of LIDAR point cloud data, in particular for feature extraction and classification of raw point cloud data, is still in its infancy. In this paper, we take advantage of LIDAR intensity and classify ground and non-ground points by statistically analyzing the skewness and kurtosis of the intensity data. Moreover, the Hough transform is employed to detect power lines from the filtered object points. The experimental results show the effectiveness of our methods and indicate that better results were obtained using LIDAR intensity data than elevation data.
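A sketch of intensity-based ground/object separation in the spirit of skewness balancing: the strongest returns are peeled off until the remaining distribution is roughly symmetric, and the peeled returns are treated as candidate object points. The synthetic gamma-distributed intensities and the stopping rule are assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import skew, kurtosis

# Synthetic right-skewed intensity data standing in for LIDAR returns.
intensity = np.random.default_rng(1).gamma(shape=2.0, scale=30.0, size=10_000)

points = np.sort(intensity)
while skew(points) > 0 and len(points) > 100:
    points = points[:-50]                 # drop the 50 strongest returns
threshold = points[-1]

ground = intensity[intensity <= threshold]
objects = intensity[intensity > threshold]
print(f"threshold={threshold:.1f}, kurtosis(ground)={kurtosis(ground):.2f}, "
      f"{len(objects)} candidate object points")
```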
Abstract:
Biotribology, the study of lubrication, wear and friction within the body, has become a topic of high importance in recent times as we continue to encounter debilitating diseases and trauma that destroy the function of the joints. A highly successful surgical procedure to replace the joint with an artificial equivalent alleviates dysfunction and pain. However, the wear of the bearing surfaces in prosthetic joints is a significant clinical problem, and more patients are surviving longer than the life expectancy of the joint replacement. Revision surgery is associated with increased morbidity and mortality and has a far less successful outcome than primary joint replacement. As such, it is essential to ensure that everything possible is done to limit the rate of revision surgery. Past experience indicates that the survival rate of the implant is influenced by many parameters, of primary importance the material properties of the implant, the composition of the synovial fluid and the method of lubrication. In prosthetic joints, effective boundary lubrication is known to take place. The interaction of the boundary lubricant and the bearing material is of utmost importance. Identifying the vital active ingredient within synovial fluid (SF), to which we owe the near-frictionless performance of our articulating joints, has been the quest of researchers for many years. Once it is identified, tribo-tests can determine which materials, and more importantly which surfaces, this fraction of SF functions most optimally with. Surface-Active Phospholipids (SAPL) have been implicated as the body’s natural load-bearing lubricant. Studies in this thesis are the first to fully characterise the adsorbed SAPL detected on the surface of retrieved prostheses and the first to verify the presence of SAPL on knee prostheses. Rinsings from the bearing surfaces of both hip and knee prostheses removed during revision operations were analysed using High Performance Liquid Chromatography (HPLC) to determine the presence and profile of SAPL. Several common prosthetic materials, along with a novel biomaterial, were investigated to determine their tribological interaction with various SAPLs. A pin-on-flat tribometer was used to make comparative friction measurements between the various tribo-pairs. A novel material, Pyrolytic Carbon (PyC), was screened as a potential candidate load-bearing prosthetic material. Friction measurements were also performed on explanted prostheses. SAPL was detected on all retrieved implant bearing surfaces. As a result of the study, eight different species of phosphatidylcholine were identified. The relative concentrations of each species were also determined, indicating that the unsaturated species are dominant. Initial tribo-tests employed a saturated phosphatidylcholine (SPC), and subsequent tests adopted the newly identified major constituents of SAPL, unsaturated phosphatidylcholines (USPC), as the test lubricant. All tribo-tests showed a dramatic reduction in friction when synthetic SAPL was used as the lubricant under boundary lubrication conditions. Some tribo-pairs showed more of an affinity for SAPL than others. PyC performed better than the other prosthetic materials. Friction measurements with explanted prostheses verified the presence and performance of SAPL. SAPL, in particular phosphatidylcholine, plays an essential role in the lubrication of prosthetic joints.
Of particular interest was the ability of SAPLs to reduce friction and ultimately wear of the bearing materials. The identification and knowledge of the lubricating constituents of SF is invaluable not only for the future development of artificial joints but also for developing effective cures for several disease processes in which lubrication may play a role. The tribological interaction of the various tribo-pairs and SAPL is extremely favourable in the context of reducing friction at the bearing interface. PyC is highly recommended as a future candidate material for use in load-bearing prosthetic joints, considering its impressive tribological performance.
Abstract:
Emissions from airport operations are of significant concern because of their potential impact on local air quality and human health. The currently limited scientific knowledge of aircraft emissions is an important issue worldwide when considering air pollution associated with airport operation, and this is especially so for ultrafine particles. This limited knowledge is due to the scientific complexities associated with measuring aircraft emissions during normal operations on the ground. In particular, this type of research has required the development of novel sampling techniques which must take into account aircraft plume dispersion and dilution, as well as the various particle dynamics that can affect measurements of the engine plume from an operational aircraft. In order to address this scientific problem, a novel mobile emission measurement method called the Plume Capture and Analysis System (PCAS) was developed and tested. The PCAS permits the capture and analysis of aircraft exhaust during ground-level operations including landing, taxiing, takeoff and idle. The PCAS uses a sampling bag to temporarily store a sample, providing sufficient time for sensitive but slow instrumental techniques to be employed to measure gas and particle emissions simultaneously and to record detailed particle size distributions. The challenges in the development of the technique include complexities associated with the assessment of the various particle loss and deposition mechanisms which are active during storage in the PCAS. Laboratory-based assessment of the method showed that the bag sampling technique can be used to accurately measure particle emissions (e.g. particle number, mass and size distribution) from a moving aircraft or vehicle. Further assessment of the sensitivity of PCAS results to distance from the source and plume concentration was conducted in the airfield with taxiing aircraft. The results showed that the PCAS is a robust method capable of capturing the plume in only 10 seconds. The PCAS is able to account for aircraft plume dispersion and dilution at distances of 60 to 180 meters downwind of a moving aircraft, along with particle deposition loss mechanisms during the measurements. Characterization of the plume in terms of particle number, mass (PM2.5), gaseous emissions and particle size distribution takes only 5 minutes, allowing large numbers of tests to be completed in a short time. The results were broadly consistent and compared well with the available data. Comprehensive measurements and analyses of aircraft plumes during the various modes of the landing and takeoff (LTO) cycle (e.g. idle, taxi, landing and takeoff) were conducted at Brisbane Airport (BNE). Gaseous (NOx, CO2) emission factors, particle number and mass (PM2.5) emission factors, and size distributions were determined for a range of Boeing and Airbus aircraft as a function of aircraft type and engine thrust level. The scientific complexities, including the analysis of the often multimodal particle size distributions to describe the contributions of different particle source processes during the various stages of aircraft operation, were addressed through comprehensive data analysis and interpretation. The measurement results were used to develop an inventory of aircraft emissions at BNE, including all modes of the aircraft LTO cycle and ground running procedures (GRP).
Measuring the actual duration of aircraft activity in each mode of operation (time-in-mode) and compiling a comprehensive matrix of gas and particle emission rates as a function of aircraft type and engine thrust level for real-world situations were crucial for developing the inventory. The significance of the resulting matrix of emission rates lies in the estimate it provides of the annual particle emissions due to aircraft operations, especially in terms of particle number. In summary, this PhD thesis presents for the first time a comprehensive study of the particle and NOx emission factors and rates, along with the particle size distributions, from aircraft operations, and provides a basis for estimating such emissions at other airports. This is a significant addition to scientific knowledge of particle emissions from aircraft operations, since standard particle number emission rates are not currently available for aircraft activities.
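Plume measurements of this kind are commonly normalised by a carbon balance to obtain per-kilogram-of-fuel emission factors. The sketch below shows that arithmetic under stated assumptions: the plume values are illustrative, and the ~3.16 kg CO2 per kg of jet fuel factor and ambient conditions are standard assumptions, not values taken from the thesis.

```python
# Carbon-balance normalisation: ratio the pollutant excess over background to
# the CO2 excess, then scale by CO2 emitted per kg of fuel burnt.
CO2_PER_KG_FUEL = 3.16e3          # g of CO2 per kg of jet fuel (standard assumption)

delta_pn = 4.0e6                  # particle number above background (particles/cm^3), illustrative
delta_co2 = 150.0                 # CO2 above background (ppm), illustrative

# Convert ppm CO2 to g/cm^3 at ~25 C and 1 atm (44 g/mol, 24,450 cm^3/mol).
co2_g_per_cm3 = delta_co2 * 1e-6 * 44.0 / 24_450.0

# Particles emitted per kg of fuel burnt.
ef_particles = delta_pn / co2_g_per_cm3 * CO2_PER_KG_FUEL
print(f"particle number emission factor: {ef_particles:.2e} particles per kg fuel")
```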
Theoretical and numerical investigation of plasmon nanofocusing in metallic tapered rods and grooves
Abstract:
Effective focusing of electromagnetic (EM) energy to nanoscale regions is one of the major challenges in nano-photonics and plasmonics. The strong localization of optical energy into regions much smaller than allowed by the diffraction limit, also called nanofocusing, offers promising applications in nano-sensor technology, nanofabrication, near-field optics and spectroscopy. One of the most promising solutions to the problem of efficient nanofocusing is related to surface plasmon propagation in metallic structures. Metallic tapered rods, commonly used as probes in near-field microscopy and spectroscopy, are of particular interest. They can provide very strong EM field enhancement at the tip due to surface plasmons (SPs) propagating towards the tip of the tapered metal rod. A large number of studies have been devoted to the manufacturing process of tapered rods or tapered fibers coated by a metal film. On the other hand, structures such as metallic V-grooves or metal wedges can also provide strong electric field enhancement, but manufacturing these structures is still a challenge. It has been shown, however, that the attainable electric field enhancement at the apex of a V-groove is higher than at the tip of a metal tapered rod when the dissipation level in the metal is strong. Metallic V-grooves also have very promising characteristics as plasmonic waveguides. This thesis will present a thorough theoretical and numerical investigation of nanofocusing during plasmon propagation along a metal tapered rod and into a metallic V-groove. Optimal structural parameters, including the optimal taper angle, taper length and shape of the taper, are determined in order to achieve maximum field enhancement factors at the tip of the nanofocusing structure. An analytical investigation of plasmon nanofocusing by metal tapered rods is carried out by means of the geometric optics approximation (GOA), also called adiabatic nanofocusing. However, GOA is applicable only to tapered structures with small taper angles and without a terminating tip structure, so that reflections can be neglected. Rigorous numerical methods are employed for analysing non-adiabatic nanofocusing by tapered rods and V-grooves with larger taper angles and rounded tips. These structures cannot be studied by analytical methods due to the presence of waves reflected from the taper section, the tip and also from (artificial) computational boundaries. A new method is introduced to combine the advantages of GOA and rigorous numerical methods in order to significantly reduce the use of computational resources and yet achieve accurate results for the analysis of large tapered structures within reasonable calculation time. A detailed comparison between GOA and rigorous numerical methods will be carried out in order to find the critical taper angle of the tapered structures at which GOA is still applicable. It will be demonstrated that the optimal taper angles, at which maximum field enhancements occur, coincide with the critical angles at which GOA is still applicable. It will be shown that the applicability of GOA can be substantially expanded to include structures which could previously be analysed only by numerical methods. The influence of the rounded tip, the taper angle and dissipation on the plasmon field distribution along the tapered rod and near the tip will be analysed analytically and numerically in detail.
It will be demonstrated that electric field enhancement factors of up to ~2500 within nanoscale regions are predicted. These are sufficient, for instance, to detect single molecules using surface-enhanced Raman spectroscopy (SERS) with the tip of a tapered rod, an approach also known as tip-enhanced Raman spectroscopy (TERS). The results obtained in this project will be important for applications in which strong local field enhancement factors are crucial to the performance of devices such as near-field microscopes and spectrometers. The optimal design of nanofocusing structures, for which the delivery of electromagnetic energy to the nanometer region is most efficient, will lead to new applications in near-field sensors, near-field measuring technology, and the generation of nanometer-sized energy sources. These include applications in tip-enhanced Raman spectroscopy (TERS); manipulation of nanoparticles and molecules; efficient coupling of optical energy into and out of plasmonic circuits; second harmonic generation in non-linear optics; and delivery of energy to quantum dots, for instance for quantum computations.
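For reference, the GOA applicability discussion above is usually expressed through an adiabatic slowness criterion of the following form. This is the standard condition from the adiabatic nanofocusing literature (Stockman's formulation); the thesis's exact criterion may differ.

```latex
% Adiabatic (GOA) applicability criterion: the local surface plasmon
% wavenumber k(z) must vary slowly along the taper axis z, so that
% reflections from the taper are negligible.
\[
  \delta(z) \;=\; \left| \frac{\mathrm{d}}{\mathrm{d}z}
  \left( \frac{1}{\operatorname{Re} k(z)} \right) \right| \;\ll\; 1
\]
```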
Abstract:
How is contemporary culture 'framed' - understood, promoted, dissected and defended - in the new approaches being employed in university education today? How do these approaches compare with those seen in the public policy process? What are the implications of these differences for future directions in theory, education, activism and policy? Framing Culture looks at cultural and media studies, which are rapidly growing fields through which students are introduced to contemporary cultural industries such as television, film and video. It compares these approaches with those used to frame public policy and finds a striking lack of correspondence between them. Issues such as Australian content on commercial television and in advertising, new technologies and new media, and violence in the media all highlight the gap between contemporary cultural theories and the way culture and communications are debated in public policy. The reasons for this gap must be investigated before closer relations can be established. Framing Culture brings together cultural studies and policy studies in a lively and innovative way. It suggests avenues for cultural activism that have been neglected in cultural theory and practice, and it will provoke debates which are long overdue.
Abstract:
Surveillance networks are typically monitored by a few people viewing several monitors displaying the camera feeds, making it very difficult for a human operator to effectively detect events as they happen. Recently, computer vision research has begun to address ways to automatically process some of this data to assist human operators. Object tracking, event recognition, crowd analysis and human identification at a distance are being pursued as means to aid human operators and improve the security of areas such as transport hubs. The task of object tracking is key to the effective use of more advanced technologies: to recognize an event, people and objects must be tracked, and tracking also enhances the performance of tasks such as crowd analysis and human identification. Before an object can be tracked, it must be detected. Motion segmentation techniques, widely employed in tracking systems, produce a binary image in which objects can be located. However, these techniques are prone to errors caused by shadows and lighting changes. Detection routines often fail, either due to erroneous motion caused by noise and lighting effects, or because they are unable to split occluded regions into their component objects. Particle filters can be used as a self-contained tracking system, making it unnecessary for detection to be carried out separately except for an initial (often manual) detection to initialise the filter. Particle filters use one or more extracted features to evaluate the likelihood of an object existing at a given point in each frame. Such systems, however, do not easily allow multiple objects to be tracked robustly, and do not explicitly maintain the identity of tracked objects. This dissertation investigates improvements to the performance of object tracking algorithms through improved motion segmentation and the use of a particle filter. A novel hybrid motion segmentation / optical flow algorithm, capable of simultaneously extracting multiple layers of foreground and optical flow in surveillance video frames, is proposed. The algorithm is shown to perform well in the presence of adverse lighting conditions, and the optical flow is capable of extracting a moving object. The proposed algorithm is integrated within a tracking system and evaluated using the ETISEO (Evaluation du Traitement et de l'Interpretation de Sequences vidEO; evaluation for video understanding) database, and significant improvement in detection and tracking performance is demonstrated when compared to a baseline system. A Scalable Condensation Filter (SCF), a particle filter designed to work within an existing tracking system, is also developed. The creation and deletion of modes and the maintenance of identity are handled by the underlying tracking system, which benefits from the improved performance a particle filter provides in uncertain conditions arising from occlusion and noise. The system is evaluated using the ETISEO database. The dissertation then investigates fusion schemes for multi-spectral tracking systems. Four fusion schemes for combining a thermal and a visual colour modality are evaluated using the OTCBVS (Object Tracking and Classification in and Beyond the Visible Spectrum) database. It is shown that a middle fusion scheme yields the best results, demonstrating a significant improvement in performance when compared to a system using either mode individually.
Findings from the thesis contribute to improving the performance of semi-automated video processing and therefore to improving security in areas under surveillance.
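As a concrete illustration of the particle filter mechanism underlying the SCF, the sketch below runs a minimal bootstrap (Condensation-style) filter on a single object's 2D position. The motion model, likelihood and measurements are invented for the example; the thesis's multi-modal, tracker-integrated implementation is considerably richer.

```python
import numpy as np

# A minimal bootstrap particle filter for one object's 2D position.
rng = np.random.default_rng(0)
N = 500
particles = rng.uniform(0, 100, size=(N, 2))  # initial spread over the frame
weights = np.full(N, 1.0 / N)

def likelihood(particles, measurement, sigma=5.0):
    # Weight particles by proximity to a feature-based measurement
    # (an assumed Gaussian observation model).
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    return np.exp(-d2 / (2 * sigma ** 2))

for measurement in [np.array([20.0, 30.0]), np.array([22.0, 33.0])]:
    particles += rng.normal(scale=2.0, size=particles.shape)  # diffusion motion model
    weights *= likelihood(particles, measurement)
    weights /= weights.sum()
    # Resample to concentrate particles on high-likelihood regions.
    idx = rng.choice(N, size=N, p=weights)
    particles, weights = particles[idx], np.full(N, 1.0 / N)

print("estimated position:", particles.mean(axis=0))
```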
Abstract:
This paper reports on the research and development of an ICT tool to facilitate the learning of ratio and fractions by adult prisoners. The design of the ICT tool was informed by a semiotic framework for mathematical meaning-making. The ICT tool thus employed multiple semiotic resources, including topological, typological, and social-actional resources. The results showed that an individual semiotic resource could represent only part of the mathematical concept, while at the same time it might signify something else and create a misconception. When multiple semiotic resources were utilised, the mathematical ideas could be better learnt.