115 results for LIMITED SETS


Relevance:

20.00%

Abstract:

Intuitively, any ‘bag of words’ approach in IR should benefit from taking term dependencies into account. Unfortunately, for years the results of exploiting such dependencies have been mixed or inconclusive. To improve the situation, this paper shows how the natural language properties of the target documents can be used to transform and enrich the term dependencies into more useful statistics. This is done in three steps. First, the term co-occurrence statistics of queries and documents are each represented by a Markov chain. The paper proves that such a chain is ergodic, and therefore its asymptotic behavior is unique, stationary, and independent of the initial state. Next, the stationary distribution is taken to model queries and documents, rather than their initial distributions. Finally, ranking is achieved following the customary language modeling paradigm. The main contribution of this paper is to argue why the asymptotic behavior of the document model is a better representation than just the document’s initial distribution. A secondary contribution is to investigate the practical application of this representation as queries become increasingly verbose. In the experiments (based on Lemur’s search engine substrate), the default query model was replaced by the stationary distribution of the query. Just modeling the query this way already resulted in significant improvements over a standard language model baseline. The results were on a par with, or better than, more sophisticated algorithms that use fine-tuned parameters or extensive training. Moreover, the more verbose the query, the more effective the approach seems to become.
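To make the three-step construction concrete, here is a minimal sketch of its core computation: the stationary distribution of a term-transition Markov chain, obtained by power iteration. The co-occurrence counts, smoothing constant, and function names are illustrative assumptions, not Lemur's implementation.

```python
import numpy as np

def stationary_distribution(cooccurrence, tol=1e-10, max_iter=10_000):
    """Stationary distribution of a Markov chain whose states are terms.

    cooccurrence[i, j] counts how often term j co-occurs with term i.
    Rows are normalised into transition probabilities; a small uniform
    smoothing keeps the chain ergodic (irreducible and aperiodic), which
    guarantees a unique stationary distribution.
    """
    n = cooccurrence.shape[0]
    smoothed = cooccurrence + 1e-3                      # ergodicity smoothing (assumed)
    P = smoothed / smoothed.sum(axis=1, keepdims=True)  # row-stochastic matrix
    pi = np.full(n, 1.0 / n)                            # arbitrary initial distribution
    for _ in range(max_iter):
        nxt = pi @ P                                    # one step of the chain
        if np.abs(nxt - pi).sum() < tol:
            break
        pi = nxt
    return pi

# Toy example: three terms with asymmetric co-occurrence counts.
counts = np.array([[0, 4, 1],
                   [2, 0, 3],
                   [1, 1, 0]], dtype=float)
print(stationary_distribution(counts))  # sums to 1; independent of the start state
```

Because the chain is ergodic, the same distribution is reached from any initial state, which is exactly why it can be substituted for the initial query and document distributions.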

Relevance:

20.00%

Abstract:

Tort law reform has resulted in legislation being passed by all Australian jurisdictions in the past decade implementing the recommendations contained in the Ipp Report. The report was a response to a perceived crisis in medical indemnity insurance, and its objective was to restrict and limit liability in negligence actions. This paper considers the extent to which the reforms have affected the liability of health professionals in medical negligence actions. After an analysis of the legislation, it is argued that while there has been some limitation and restriction, courts have generally interpreted the civil liability reforms consistently with the common law. Rather, it has been the impact of statutory limits on the assessment of damages, through thresholds and caps, that has limited the liability of health professionals in medical negligence actions.

Relevance:

20.00%

Abstract:

The Web has become a worldwide repository of information which individuals, companies, and organizations utilize to solve or address various information problems. Many of these Web users employ automated agents to gather this information for them, an approach some assume represents a more sophisticated method of searching. However, there is little research investigating how Web agents search for online information. In this research, we first provide a classification for information agents using stages of information gathering, gathering approaches, and agent architecture. We then examine an implementation of one of the resulting classifications in detail, investigating how agents search for information on Web search engines, including the session, query, term, duration, and frequency of interactions. For this temporal study, we analyzed three data sets of queries and page views from agents interacting with the Excite and AltaVista search engines from 1997 to 2002, examining approximately 900,000 queries submitted by over 3,000 agents. Findings include: (1) agent sessions are extremely interactive, sometimes with hundreds of interactions per second; (2) agent queries are comparable to those of human searchers, with little use of query operators; (3) Web agents search for a relatively limited variety of information, with only 18% of the terms used being unique; and (4) the duration of agent-Web search engine interaction typically spans several hours. We discuss the implications for Web information agents and search engines.
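As a hedged illustration of the kind of log analysis reported here, the sketch below computes per-agent interaction rates and the share of distinct terms from a toy query log. The record layout, field names, and values are invented; they do not reflect the actual Excite or AltaVista log formats.

```python
from collections import Counter, defaultdict

# Hypothetical log records: (agent_id, timestamp_in_seconds, query_string).
log = [
    ("agent-1", 0.00, "cheap flights sydney"),
    ("agent-1", 0.02, "cheap flights melbourne"),
    ("agent-1", 0.05, "cheap flights brisbane"),
    ("agent-2", 1.00, "stock quotes"),
    ("agent-2", 9.50, "stock quotes nasdaq"),
]

sessions = defaultdict(list)
for agent, ts, query in log:
    sessions[agent].append((ts, query))

# One plausible reading of "18% of the terms used are unique":
# distinct terms as a fraction of all term occurrences.
term_counts = Counter(term for _, _, q in log for term in q.split())
distinct_share = len(term_counts) / sum(term_counts.values())
print(f"distinct terms / term occurrences: {distinct_share:.0%}")

for agent, events in sessions.items():
    span = events[-1][0] - events[0][0]
    rate = len(events) / span if span > 0 else float("inf")
    print(f"{agent}: {len(events)} queries, {rate:.1f} interactions/second")
```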

Relevance:

20.00%

Abstract:

The primary objective of the experiments reported here was to demonstrate how opening up the design envelope for auditory alarms affects people's ability to learn the meanings of a set of alarms. Two sets of alarms were tested for the same set of functions: one already extant, and one newly designed according to a rationale set out by the authors, aimed at increasing the heterogeneity of the alarm set and incorporating some well-established principles of alarm design. For both sets of alarms, a similarity-rating experiment was followed by a learning experiment. The results showed that the newly designed set was judged to be more internally dissimilar, and was easier to learn, than the extant set. The design rationale outlined in the paper is useful in a variety of practical domains and shows how alarm designers, even at a relatively late stage in the design process, can improve the efficacy of an alarm set.

Relevance:

20.00%

Abstract:

We address the problem of face recognition in video by employing the recently proposed probabilistic linear discriminant analysis (PLDA). PLDA has been shown to be robust against pose and expression in image-based face recognition. In this research, the method is extended and applied to video, where image-set-to-image-set matching is performed. We investigate two approaches to computing similarities between image sets using PLDA: the closest-pair approach and the holistic-sets approach. To better model face appearances in video, we also propose a heteroscedastic version of PLDA, which learns the within-class covariance of each individual separately. Our experiments on the VidTIMIT and Honda datasets show that the combination of the heteroscedastic PLDA and the closest-pair approach achieves the best performance.
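The two set-matching strategies can be sketched as follows. Cosine similarity stands in for the PLDA likelihood-ratio score, and mean-pooling is only one plausible reading of the holistic-sets comparison; both substitutions are assumptions made for illustration.

```python
import numpy as np

def cosine(a, b):
    # Stand-in similarity; the paper scores pairs with a PLDA likelihood ratio.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def closest_pair_similarity(set_a, set_b, sim=cosine):
    """Closest-pair score: the best single frame-to-frame match
    between two image sets (rows are per-frame feature vectors)."""
    return max(sim(a, b) for a in set_a for b in set_b)

def holistic_similarity(set_a, set_b, sim=cosine):
    """Holistic score: compare each set through a single aggregate
    descriptor (here, the mean feature vector; an assumption)."""
    return sim(set_a.mean(axis=0), set_b.mean(axis=0))

rng = np.random.default_rng(0)
video_a = rng.normal(size=(20, 64))  # 20 frames, 64-dim features per frame
video_b = rng.normal(size=(35, 64))
print(closest_pair_similarity(video_a, video_b))
print(holistic_similarity(video_a, video_b))
```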

Relevance:

20.00%

Abstract:

This chapter proposes a conceptual model for the optimal development of the capabilities needed for the contemporary knowledge economy. We commence by outlining the key capability requirements of the 21st-century knowledge economy, distinguishing these from those suited to its earlier stages. We then discuss the extent to which higher education currently caters to these requirements, before putting forward a new model for effective knowledge economy capability learning. The core of this model is the development of an adaptive and adaptable career identity, created through a reflective process of career self-management that draws upon data from the self and the world of work. In turn, career identity drives the individual’s process of skill and knowledge acquisition, including deep disciplinary knowledge. The professional capability learning thus acquired includes disciplinary skill and knowledge sets, generic skills, and also skills for the knowledge economy, including disciplinary agility, social network capability, and enterprise skills. In the final part of the chapter, we envision higher education systems that embrace the model, and suggest steps that could be taken toward making the development of knowledge economy capabilities an integral part of the university experience.

Relevance:

20.00%

Abstract:

Research over the last two decades has significantly increased our understanding of the evolutionary position of the insects among other arthropods, and of the relationships among the insect orders. Many of these insights have been established through increasingly sophisticated analyses of DNA sequence data from a limited number of genes. Recent results have established the relationships within the Holometabola, but relationships among the hemimetabolous orders have been more difficult to elucidate. A strong consensus on the relationships among the Palaeoptera (Ephemeroptera and Odonata), and on their relationship to the Neoptera, has not emerged, with all three possible resolutions supported by different data sets. While polyneopteran relationships have generally resisted significant resolution, it is now clear that the termites, Isoptera, are nested within the cockroaches, Blattodea. The newly discovered order Mantophasmatodea is difficult to place, with the balance of studies favouring Grylloblattodea as its sister group. While some studies have found the paraneopteran orders (Hemiptera, Thysanoptera, Phthiraptera and Psocoptera) to be monophyletic, evidence suggests that the parasitic lice (Phthiraptera) have evolved from groups within the book and bark lice (Psocoptera), and may represent parallel evolutions of parasitism within two major louse groups. Within the Holometabola, it is now clear that the Hymenoptera are the sister group to the other orders, which, in turn, are divided into two clades, the Neuropteroidea (Coleoptera, Neuroptera and relatives) and the Mecopterida (Trichoptera, Lepidoptera, Diptera and their relatives). The enigmatic order Strepsiptera, the twisted-wing insects, has now been placed firmly near the Coleoptera, rejecting the close relationship to Diptera proposed some 15 years ago primarily on the basis of ribosomal DNA data. Phylogenomic-scale analyses are only beginning to be focused on the relationships of the insect orders, and this is where we expect to see resolution of palaeopteran and polyneopteran relationships. Future research will benefit from greater coordination between intra- and inter-ordinal analyses. This will maximise the opportunities for appropriate outgroup choice at the intraordinal level and provide the background knowledge for interordinal analyses to span the maximum phylogenetic scope within groups.

Relevance:

20.00%

Abstract:

Despite its role in determining both indoor and outdoor human exposure to anthropogenic particles, there is limited information describing vertical profiles of particle concentrations in urban environments, especially for ultrafine particles. Furthermore, the results of the few studies performed have been inconsistent. As such, this study aimed to assess the influence of vehicle emissions and nucleation on particle characteristics (particle number size distribution (PNSD) and PM2.5 concentration) at different heights around three urban office buildings located next to busy roads in Brisbane, Australia, and to place these results in the broader context of the existing literature. Two sets of instruments were used to simultaneously measure PNSD, particle number (PN) and PM2.5 concentrations for up to three weeks at each building. The results showed that both PNSD and PM2.5 concentration around building envelopes were influenced by vehicle emissions and new particle formation, and that they exhibited variability across the three different office buildings. During nucleation events, PN concentration in the size range <30 nm and total PN concentration increased with height (by 7-65% and 5-46%, respectively), while PM2.5 concentration decreased with height (by 36-52%). This study has shown an under-acknowledged role for nucleation in producing particles that can affect large numbers of people, due to the high density and occupancy of urban office buildings and the fact that the vast majority of people's time is spent indoors. These findings highlight important new information related to the previously overlooked role of particle formation in the urban atmosphere and its potential effects on the selection of air intake locations and appropriate filter types when designing or upgrading mechanical ventilation systems in urban office buildings. The results also serve to better define particle behaviour and variability around building envelopes, which has implications for studies of both human exposure and particle dynamics.

Relevance:

20.00%

Abstract:

Even though titanium dioxide photocatalysis has been promoted as a leading green technology for water purification, many issues have hindered its application on a large commercial scale. For the materials scientist, the main issues have centred on the synthesis of more efficient materials and the investigation of degradation mechanisms; for the engineer, the main issues have been the development of appropriate models and the evaluation of intrinsic kinetic parameters that allow the scale-up or re-design of efficient large-scale photocatalytic reactors. To obtain intrinsic kinetic parameters, the reaction must be analysed and modelled considering the influence of the radiation field, pollutant concentrations, and fluid dynamics. In this way, the obtained kinetic parameters are independent of reactor size and configuration and can subsequently be used for scale-up purposes or for the development of entirely new reactor designs. This work investigates the intrinsic kinetics of phenol degradation over a titania film, chosen for the practicality of a fixed-film configuration over a slurry. A flat plate reactor was designed to allow control of reaction parameters including UV irradiance, flow rate, pollutant concentration, and temperature. Particular attention was paid to the investigation of the radiation field over the reactive surface and to the issue of mass-transfer-limited reactions. The ability of different emission models to describe the radiation field was investigated and compared to actinometric measurements; the RAD-LSI model was found to give the best predictions over the conditions tested. Mass transfer often limits fixed-film reactors, so its influence was investigated with specifically planned sets of benzoic acid experiments and with the adoption of the stagnant film model. The phenol mass transfer coefficient in the system was determined to be k_m,phenol = 8.5815 × 10^-7 Re^0.65 m s^-1. The data obtained from a wide range of experimental conditions, together with an appropriate model of the system, enabled the determination of intrinsic kinetic parameters. The experiments were performed at four irradiance levels (70.7, 57.9, 37.1 and 20.4 W m^-2), combined with three initial phenol concentrations (20, 40 and 80 ppm), to give a wide range of final pollutant conversions (from 22% to 85%). The simple model adopted was able to fit this wide range of conditions with only four kinetic parameters: two reaction rate constants (one for phenol and one for the family of intermediates) and their corresponding adsorption constants. The intrinsic kinetic parameter values were determined as k_ph = 0.5226 mmol m^-1 s^-1 W^-1, k_I = 0.120 mmol m^-1 s^-1 W^-1, K_ph = 8.5 × 10^-4 m^3 mmol^-1 and K_I = 2.2 × 10^-3 m^3 mmol^-1. The flat plate reactor allowed the investigation of the reaction under two different illumination configurations: liquid-side and substrate-side. The latter is of particular interest for real-world applications, where light absorption due to turbidity and pollutants in the water stream to be treated could represent a significant issue. The two configurations allowed the investigation of the effects of film thickness and the determination of the optimal catalyst thickness.

The experimental investigation confirmed the predictions of a porous medium model developed to investigate the influence of diffusion, advection, and photocatalytic phenomena inside the porous titania film, with the optimal thickness identified as 5 µm. The model used the intrinsic kinetic parameters obtained from the flat plate reactor to predict the influence of thickness and transport phenomena on the final observed phenol conversion without using any correction factor; the excellent match between predictions and experimental results provided further proof of the quality of the parameters obtained with the proposed method.
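The four reported parameters (two rate constants with matching adsorption constants) are consistent with a Langmuir-Hinshelwood rate law, so the sketch below evaluates such a form, together with the reported mass-transfer correlation, using the values from the abstract. The competitive-adsorption rate expression itself is an assumption, since the abstract does not state the exact model.

```python
# Reported intrinsic parameters (units as given in the abstract).
k_ph = 0.5226   # mmol m^-1 s^-1 W^-1, phenol rate constant
k_I  = 0.120    # mmol m^-1 s^-1 W^-1, lumped-intermediates rate constant
K_ph = 8.5e-4   # m^3 mmol^-1, phenol adsorption constant
K_I  = 2.2e-3   # m^3 mmol^-1, intermediates adsorption constant

def phenol_rate(irradiance, c_ph, c_int):
    """Assumed Langmuir-Hinshelwood form with competitive adsorption:
    rate = k_ph * I * K_ph*C_ph / (1 + K_ph*C_ph + K_I*C_int),
    with irradiance in W m^-2 and concentrations in mmol m^-3;
    the units then combine to give a rate in mmol m^-3 s^-1."""
    coverage = K_ph * c_ph / (1.0 + K_ph * c_ph + K_I * c_int)
    return k_ph * irradiance * coverage

def mass_transfer_coefficient(reynolds):
    """Reported correlation: k_m,phenol = 8.5815e-7 * Re^0.65 (m s^-1)."""
    return 8.5815e-7 * reynolds ** 0.65

# 40 ppm phenol is roughly 425 mmol m^-3 (M = 94.11 g mol^-1), no intermediates yet.
print(phenol_rate(irradiance=57.9, c_ph=425.0, c_int=0.0))
print(mass_transfer_coefficient(reynolds=500))
```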

Relevance:

20.00%

Abstract:

3D models of long bones are being utilised in a number of fields, including orthopaedic implant design. Accurate reconstruction of 3D models is of utmost importance for designing implants that achieve a good alignment between two bone fragments. For this purpose, CT scanners are typically employed to acquire accurate bone data, exposing the individual to a high dose of ionising radiation. Magnetic resonance imaging (MRI) has been shown to be a potential alternative to computed tomography (CT) for scanning volunteers for 3D reconstruction of long bones, essentially avoiding the high radiation dose from CT. In MRI imaging of long bones, artefacts due to random movements of the skeletal system create challenges, as they generate inaccuracies in 3D models reconstructed from data sets containing such artefacts. One of the defects observed during an initial study is a lateral shift artefact in the reconstructed 3D models. This artefact is believed to result from volunteers moving the leg between two successive scanning stages (the lower limb has to be scanned in at least five stages due to the limited scanning length of the scanner). As this artefact creates inaccuracies in implants designed using these models, it needs to be corrected before the 3D models are applied to implant design. Therefore, this study aimed to correct the lateral shift artefact using 3D modelling techniques. The femora of five ovine hind limbs were scanned with a 3T MRI scanner using a 3D VIBE-based protocol. The scanning was conducted in two halves, maintaining a good overlap between them, and a lateral shift was generated by moving the limb several millimetres between the two scanning stages. The 3D models were reconstructed using a multi-threshold segmentation method. The artefact was corrected by aligning the two halves using the robust iterative closest point (ICP) algorithm, with the help of the overlapping region between the two. The corrected models were compared with reference models generated by CT scanning of the same samples. The results indicate that the correction of the artefact was achieved with an average deviation of 0.32 ± 0.02 mm between the corrected model and the reference model. In comparison, the model obtained from a single MRI scan showed an average error of 0.25 ± 0.02 mm when compared with the reference model. An average deviation of 0.34 ± 0.04 mm was seen when models generated after the table was moved were compared with the reference models; thus, movement of the table is also a contributing factor to motion artefacts.
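For illustration, here is a minimal point-to-point ICP sketch (nearest-neighbour matching plus a Kabsch rigid-transform fit) applied to a synthetically shifted point cloud. The study used a robust ICP variant on the overlapping region of the two scan halves, so treat this as a simplified stand-in.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch algorithm on matched point pairs)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(source, target, iterations=50, tol=1e-8):
    """Iterate nearest-neighbour matching and rigid-transform estimation
    until the mean closest-point distance stops improving."""
    tree = cKDTree(target)
    current, prev_err = source.copy(), np.inf
    for _ in range(iterations):
        dist, idx = tree.query(current)              # closest-point matches
        R, t = best_rigid_transform(current, target[idx])
        current = current @ R.T + t
        err = dist.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return current, err

# Synthetic stand-in for the two scan halves: a cloud and a copy that has
# been rotated slightly and shifted laterally, mimicking the artefact.
rng = np.random.default_rng(1)
target = rng.normal(size=(500, 3))
theta = np.deg2rad(5.0)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
source = target @ Rz.T + np.array([0.5, 0.0, 0.0])
aligned, err = icp(source, target)
print(f"mean residual after ICP: {err:.4f}")
```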

Relevance:

20.00%

Abstract:

Cartilage defects heal imperfectly, and osteoarthritic changes frequently develop as a result. Although the existence of specific behaviours of chondrocytes derived from various depth-related zones in vitro has been known for over 20 years, only a relatively small body of in vitro studies has been performed with zonal chondrocytes, and current clinical treatment strategies do not reflect these native depth-dependent (zonal) differences. This is surprising, since mimicking the zonal organization of articular cartilage in neo-tissue through the use of zonal chondrocyte subpopulations could enhance the functionality of the graft. Although some research groups, including our own, have made considerable progress in tailoring culture conditions using specific growth factors and biomechanical loading protocols, we conclude that an optimal regime has not yet been determined. Other unmet challenges include the lack of specific zonal cell sorting protocols and the limited amounts of cells harvested per zone. As a result, the engineering of functional tissue has not yet been realized, and no long-term in vivo studies using zonal chondrocytes have been described. This paper critically reviews the research performed to date and outlines our view of the potential future significance of zonal chondrocyte populations in regenerative approaches for the treatment of cartilage defects. We also briefly discuss the capabilities of additive manufacturing technologies, which can not only create patient-specific grafts directly from medical imaging data sets but could also more accurately reproduce the complex 3D zonal extracellular matrix architecture using techniques such as hydrogel-based cell printing.

Relevance:

20.00%

Abstract:

Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, for which a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data, and the ability to take due account of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time. The assessment of model uncertainty is therefore an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, namely ordinary least squares regression, weighted least squares regression, and Bayesian weighted least squares regression, for estimating the uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates; the stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the Bayesian approach, combined with Monte Carlo simulation, provides a powerful tool that makes the best use of the available knowledge in the prediction, and thereby presents a practical solution to counteract the limitations otherwise imposed on water quality modelling.
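As a rough illustration of the Bayesian-plus-Monte-Carlo idea, the sketch below fits a conjugate Bayesian linear regression to invented build-up observations and propagates the posterior uncertainty into a prediction interval. The data, the logarithmic build-up form, and the fixed observation variance are all assumptions, and the paper's weighted formulation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical build-up observations: antecedent dry days vs pollutant load.
dry_days = np.array([1, 2, 3, 5, 7, 10, 14], dtype=float)
load = np.array([0.8, 1.5, 1.9, 2.6, 3.0, 3.4, 3.7])  # g m^-2 (invented)

# Assumed build-up model: load = b0 + b1 * log(1 + dry_days) + noise.
X = np.column_stack([np.ones_like(dry_days), np.log1p(dry_days)])
sigma2 = 0.1 ** 2                    # assumed observation variance

# Conjugate Gaussian posterior over the coefficients (weak prior).
prior_prec = 1e-4 * np.eye(2)
post_cov = np.linalg.inv(prior_prec + X.T @ X / sigma2)
post_mean = post_cov @ (X.T @ load / sigma2)

# Monte Carlo: propagate parameter and observation uncertainty to 21 dry days.
betas = rng.multivariate_normal(post_mean, post_cov, size=5000)
x_new = np.array([1.0, np.log1p(21.0)])
pred = betas @ x_new + rng.normal(0.0, np.sqrt(sigma2), size=5000)
lo, hi = np.percentile(pred, [2.5, 97.5])
print(f"21-day build-up: {pred.mean():.2f} g/m^2 (95% interval {lo:.2f}-{hi:.2f})")
```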