945 results for Precision Xtra®


Relevance: 10.00%

Abstract:

This paper explores violent urbanism in the recent science-fiction film District 9, which depicts an alien immigration camp, filmed on location in Soweto in 2008 in the midst of a series of violent clashes between indigenous South Africans and a new wave of African immigrants. Violent Urbanism is the state's method of controlling bodies and populations through those precise biological techniques that determine geopolitical sites for the control of cities. The film, while presented as cinéma vérité, speaks to the real invasion of traditional spatio-disciplinary regimes such as corporate-run detention centres, refugee camps, border control and enforced relocation by those imperceptible techniques which violate the body by reducing it to a biological datum, tool or specimen to serve the security agenda of the twenty-first-century nation-state. These techniques are chemical and biological warfare proliferation; genetic engineering; and surveillance systems, such as biometrics, whose purview is no longer limited to the specular but includes the molecular. District 9 evinces a compelling urban image of contemporary biopolitics that disturbs the received historiography of post-apartheid urbanism. Johannesburg is clearly not the only place where this could happen or is happening; the reach of biopolitics is worldwide. District 9 visualises with utter precision the corporate hijacking of the biological realm in contemporary cities, just as it asks the unsettling question: who exactly is the "audience" of Violent Urbanism?

Relevance: 10.00%

Abstract:

Nowadays, everyone can effortlessly access a range of information on the World Wide Web (WWW). As information resources on the web continue to grow tremendously, it becomes progressively more difficult to meet users' high expectations and find relevant information. Although existing search engine technologies can find valuable information, they suffer from the problems of information overload and information mismatch. This paper presents a hybrid Web Information Retrieval approach that allows personalised search using an ontology, a user profile and collaborative filtering. The approach uses the ontology to find the context of the user query with minimal user involvement. Simultaneously, it updates the user profile automatically over time as the user's behaviour changes. Subsequently, it draws on recommendations from similar users via a collaborative filtering technique. The proposed method is evaluated on the FIRE 2010 dataset and a manually generated dataset. Empirical analysis reveals that the Precision, Recall and F-Score of most queries for many users improve under the proposed method.
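A minimal sketch of how the three evidence sources described above might be fused, assuming a simple linear combination; the function names, the half-life decay and the weights are illustrative assumptions, not the paper's implementation:

import time

def profile_score(doc_terms, profile, now=None, half_life_days=30.0):
    # Time-decayed overlap between document terms and the user profile.
    # `profile` maps term -> timestamp of the user's last interest in it,
    # so older interests contribute less (automatic profile ageing).
    now = now or time.time()
    score = 0.0
    for term in doc_terms:
        if term in profile:
            age_days = (now - profile[term]) / 86400.0
            score += 0.5 ** (age_days / half_life_days)
    return score

def hybrid_score(ontology_sim, prof, collab, w=(0.5, 0.3, 0.2)):
    # Linear fusion of ontology context, profile and collaborative-filtering
    # evidence; the weights are purely illustrative.
    return w[0] * ontology_sim + w[1] * prof + w[2] * collab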

Relevance: 10.00%

Abstract:

The Link the Wiki track at INEX 2008 offered two tasks: file-to-file link discovery and anchor-to-BEP link discovery. The former used 6,600 topics and the latter 50. Manual assessment of the anchor-to-BEP runs was performed using a tool developed for the purpose. Runs were evaluated using standard precision and recall measures such as MAP and precision/recall graphs. Ten groups participated, and the approaches they took are discussed. Final evaluation results for all runs are presented.
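For reference, MAP as used in such evaluations can be computed as follows; this is a Python sketch of the standard definition, not the track's evaluation code:

def average_precision(ranked_ids, relevant_ids):
    # Mean of the precision values at the ranks where relevant items occur.
    hits, precisions = 0, []
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(relevant_ids) if relevant_ids else 0.0

def mean_average_precision(run, qrels):
    # `run` maps topic -> ranked doc ids; `qrels` maps topic -> relevant set.
    return sum(average_precision(run[t], qrels[t]) for t in qrels) / len(qrels)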

Relevance: 10.00%

Abstract:

A rule-based approach is presented for classifying previously identified medical concepts in clinical free text into assertion categories. The task defines six categories of assertion: Present, Absent, Possible, Conditional, Hypothetical and Not associated with the patient. The assertion classification algorithms were largely based on extending the popular NegEx and ConText algorithms. In addition, the clinical healthcare terminology SNOMED CT and other publicly available dictionaries were used to classify assertions that did not fit the NegEx/ConText model. The data for this task include discharge summaries from Partners HealthCare and from Beth Israel Deaconess Medical Center, as well as discharge summaries and progress notes from the University of Pittsburgh Medical Center. The set consists of 349 discharge reports, each with pairs of ground-truth concept and assertion files, for system development, and 477 reports for evaluation. The system's performance on the evaluation data set was 0.83, 0.83 and 0.83 for recall, precision and F1-measure, respectively. Although the rule-based system shows promise, further improvements could be made by incorporating machine learning approaches.
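The flavour of NegEx/ConText-style rules can be conveyed with a toy classifier; the cue lists and the crude scope handling below are illustrative assumptions, not the rules used in the paper:

import re

CUES = [
    (r"\b(no|denies|without|absent)\b", "Absent"),
    (r"\b(possible|probable|suspected|may have)\b", "Possible"),
    (r"\b(if|should|in case of)\b", "Conditional"),
    (r"\b(risk of|watch for|return if)\b", "Hypothetical"),
    (r"\b(family history|mother|father|sibling)\b", "Not associated with the patient"),
]

def classify_assertion(sentence, concept):
    # Assign an assertion category to `concept` within `sentence`. A cue
    # anywhere before the concept triggers its category; the real algorithms
    # bound each cue's scope far more carefully.
    prefix = sentence.lower().split(concept.lower())[0]
    for pattern, category in CUES:
        if re.search(pattern, prefix):
            return category
    return "Present"

print(classify_assertion("Patient denies chest pain.", "chest pain"))  # Absent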

Relevance: 10.00%

Abstract:

Purpose. To devise and validate artist-rendered grading scales for contact lens complications. Methods. Each of eight tissue complications of contact lens wear (listed under 'Results') was painted by a skilled ophthalmic artist (Terry R. Tarrant) in five grades of severity: 0 (normal), 1 (trace), 2 (mild), 3 (moderate) and 4 (severe). A representative slit lamp photograph of a tissue response for each of the eight complications was shown to 404 contact lens practitioners who had never before used clinical grading scales. The practitioners were asked to grade each tissue response to the nearest 0.1 grade unit by interpolation. Results. The standard deviation (s.d.) of the 404 responses for each tissue complication was:

Corneal staining            0.5
Endothelial polymegethism   0.7
Epithelial microcysts       0.5
Endothelial blebs           0.4
Stromal edema               …
Conjunctival hyperemia      0.4
Stromal neovascularization  0.4
Papillary conjunctivitis    0.5

The frequency distributions and best-fit normal curves were also plotted. The precision of grading (s.d. × 2) ranged from 0.8 to 1.4, with a mean precision of 1.0. Conclusions. Grading scales afford contact lens practitioners a method of quantifying the severity of adverse tissue responses to contact lens wear. It is noteworthy that the statistically verified precision of grading (1.0 scale unit) concurs precisely with the essential design feature of the grading scales that each grading step of 1.0 corresponds to a clinically significant difference in severity. Thus, as a general rule, a difference or change in grade of > 1.0 can be taken to be both clinically and statistically significant when using these grading scales. Trained observers are likely to achieve even greater grading precision. Supported by Hydron Limited.
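The precision figure is simply twice the standard deviation of the observers' estimates; a sketch of that arithmetic, using invented grades rather than the study's data:

import statistics

def grading_precision(grades):
    # Precision of grading = 2 x s.d. of the interpolated grade estimates.
    return 2 * statistics.stdev(grades)

grades = [2.1, 2.4, 1.9, 2.6, 2.2, 2.0, 2.5, 2.3]  # hypothetical responses
print(round(grading_precision(grades), 2), "grade units")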

Relevance: 10.00%

Abstract:

Vernier acuity, a form of visual hyperacuity, is amongst the most precise forms of spatial vision. Under optimal conditions Vernier thresholds are much finer than the inter-photoreceptor distance. Achievement of such high precision is based substantially on cortical computations, most likely in the primary visual cortex. Using stimuli with added positional noise, we show that Vernier processing is reduced with advancing age across a wide range of noise levels. Using an ideal observer model, we are able to characterize the mechanisms underlying age-related loss, and show that the reduction in Vernier acuity can be mainly attributed to the reduction in efficiency of sampling, with no significant change in the level of internal position noise, or spatial distortion, in the visual system.
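The ideal-observer analysis described here is consistent with a standard equivalent-noise decomposition, in which the observed positional threshold reflects internal noise, external (added) noise and sampling efficiency; the paper's exact formulation may differ:

\[
\sigma_{\mathrm{obs}}^{2} = \frac{\sigma_{\mathrm{int}}^{2} + \sigma_{\mathrm{ext}}^{2}}{k}
\]

where \sigma_{\mathrm{ext}} is the added positional noise, \sigma_{\mathrm{int}} the internal position noise and k the sampling efficiency. On this reading, the reported ageing effect is a fall in k with \sigma_{\mathrm{int}} essentially unchanged.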

Relevance: 10.00%

Abstract:

A priority when designing control strategies for autonomous underwater vehicles is to account for the cost of implementing them on a real vehicle. Indeed, owing to the vehicles' design and the actuation modes usually under consideration for underwater platforms, the number of actuator switchings must be kept small to ensure feasibility and precision. This is the main objective of the algorithm presented in this paper. The theory is illustrated on two examples: one a fully actuated underwater vehicle capable of motion in six degrees of freedom, and one minimally actuated, with control motions in the vertical plane only.

Relevance: 10.00%

Abstract:

Objective - this study examined the clinical utility and precision of routine screening for alcohol and other drug use among women attending a public antenatal service. Study design - a survey of clients and an audit of clinical charts. Participants and setting - clients attending an antenatal clinic of a large tertiary hospital in Queensland, Australia, from October to December 2009. Measurements and findings - data were collected from two sources. First, 32 women who reported use of alcohol or other drugs during pregnancy at initial screening were asked to complete a full substance use survey. Second, data were collected from the charts of 349 new clients who attended the antenatal clinic during the study period. Both sensitivity (86% and 67%) and positive predictive value (100% and 92%) for alcohol and other drug use, respectively, were high. Only 15% of surveyed women were uncomfortable about being screened for substance use in pregnancy, yet the chart audit revealed poor staff compliance: during the study period, 25% of clients were either not screened adequately or not screened at all. Key conclusions and implications for practice - despite recommended universal screening in pregnancy and the apparent acceptance by our participants, alcohol and other drug (A&OD) screening in the antenatal setting remains problematic. Investigation into the reasons behind, and ways to overcome, the low screening rate could improve health outcomes for mothers and children in this at-risk group. Targeted education and training for midwives may form part of the solution, as these clinicians have a key role in implementing prevention and early intervention strategies.
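The reported accuracy figures follow the usual screening definitions; this sketch shows the arithmetic with invented counts, not the study's data:

def sensitivity(tp, fn):
    return tp / (tp + fn)  # true positives / all actual positives

def ppv(tp, fp):
    return tp / (tp + fp)  # true positives / all screen positives

# e.g. 6 detected cases, 1 missed case, 0 false positives:
print(sensitivity(6, 1), ppv(6, 0))  # ~0.86 and 1.0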

Relevance: 10.00%

Abstract:

In this paper, we apply a simulation-based approach to estimating transmission rates of nosocomial pathogens. In particular, the objective is to infer the transmission rate between colonised health-care practitioners and uncolonised patients (and vice versa) solely from routinely collected incidence data. The method, using approximate Bayesian computation, is substantially less computationally intensive and easier to implement than the likelihood-based approaches we refer to here. We find that replacing the likelihood with a comparison of an efficient summary statistic between observed and simulated data loses little of the precision of the estimated transmission rates. Furthermore, we investigate the impact of incorporating uncertainty in previously fixed parameters on the precision of the estimated transmission rates.
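A generic ABC rejection sampler of the kind this approach builds on looks as follows; the simulator, summary statistic, prior and tolerance here are placeholders, not the paper's model:

import random

def abc_rejection(observed_summary, simulate, summarise, prior_sample,
                  tolerance, n_accept):
    # Draw rates from the prior, simulate incidence data, and keep draws
    # whose simulated summary lies within `tolerance` of the observed one.
    accepted = []
    while len(accepted) < n_accept:
        rate = prior_sample()
        if abs(summarise(simulate(rate)) - observed_summary) <= tolerance:
            accepted.append(rate)
    return accepted

# Toy usage: weekly colonisation counts approximated as Normal(r, sqrt(r)).
posterior = abc_rejection(
    observed_summary=12.0,
    simulate=lambda r: [random.gauss(r, r ** 0.5) for _ in range(52)],
    summarise=lambda xs: sum(xs) / len(xs),
    prior_sample=lambda: random.uniform(0.1, 30.0),
    tolerance=1.0,
    n_accept=100,
)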

Relevance: 10.00%

Abstract:

The trafficking of molecules and membranes within cells is a prerequisite for all aspects of cellular immune function, including the delivery and recycling of cell surface proteins, secretion of immune mediators, ingestion of pathogens and activation of lymphocytes. SNARE (soluble N-ethylmaleimide-sensitive factor attachment protein receptor)-family members mediate membrane fusion during all steps of trafficking, and function in almost all aspects of innate and adaptive immune responses. Here, we provide an overview of the roles of SNAREs in immune cells, offering insight into one level at which precision and tight regulation are instilled in immune responses.

Relevance: 10.00%

Abstract:

Over the last decade, ionic liquids (ILs) have been used for the dissolution and derivatization of isolated cellulose. This ability of ILs is now sought for application in the selective dissolution of cellulose from lignocellulosic biomass for the manufacture of cellulosic ethanol. However, there are significant gaps in the understanding of the chemistry of the interaction between biomass and ILs. While imidazolium ILs have been used successfully to dissolve both isolated crystalline cellulose and components of lignocellulosic biomass, phosphonium ILs have not been sufficiently explored for use in the dissolution of lignocellulosic biomass. This thesis reports on the chemistry of sugarcane bagasse with phosphonium ILs.

Qualitative and quantitative measurements of biomass components dissolved in the phosphonium ILs trihexyltetradecylphosphonium chloride ([P66614]Cl) and tributylmethylphosphonium methylsulphate ([P4441]MeSO4) were obtained using attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopy. Absorption bands related to cellulose, hemicellulose and lignin dissolution, monitored in situ in biomass-IL mixtures, indicate lignin dissolution in both ILs and some holocellulose dissolution in the hydrophilic [P4441]MeSO4. The kinetics of lignin dissolution reported here indicate that while dissolution in the hydrophobic IL [P66614]Cl appears to follow an accepted mechanism of acid-catalysed β-aryl ether cleavage, dissolution in the hydrophilic IL [P4441]MeSO4 does not appear to follow this mechanism and may not be followed by condensation reactions (initiated by reactive ketones). The quantitative measurement of lignin dissolution in phosphonium ILs, based on absorbance at 1510 cm-1, has demonstrated utility and greater precision than the conventional Klason lignin method.

Cleavage of lignin β-aryl ether bonds in sugarcane bagasse is achieved with the IL [P66614]Cl in the presence of catalytic amounts of mineral acid (ca. 0.4%). The delignification of bagasse is studied over a range of temperatures (120 °C to 150 °C) by monitoring the production of β-ketones (indicative of cleavage of β-aryl ethers) using FTIR spectroscopy and by compositional analysis of the undissolved fractions. Maximum delignification is obtained at 150 °C, with 52% of the original lignin content of bagasse removed. No delignification is observed in the absence of acid, which suggests that the reaction is acid catalysed, with the IL solubilising the lignin fragments. The rate of delignification was significantly higher at 150 °C, suggesting that crossing the glass transition temperature of lignin affords greater freedom of rotation about the propanoid carbon-carbon bonds and leads to increased cleavage of β-aryl ethers. A probable mechanism for the delignification of bagasse with the phosphonium IL is proposed.

All polymeric components of bagasse, a lignocellulosic biomass, dissolve in the hydrophilic IL tributylmethylphosphonium methylsulphate ([P4441]MeSO4) with and without a catalytic amount of acid (H2SO4, ca. 0.4%). The presence of acid significantly increases the extent of dissolution of bagasse in [P4441]MeSO4 (by ca. 2.5 times under the conditions used here). The dissolved fractions can be partially recovered by the addition of an antisolvent (water) and are significantly enriched in lignin. Unlike acid-catalysed dissolution in the hydrophobic IL trihexyltetradecylphosphonium chloride, there is little evidence of cleavage of the β-aryl ether bonds of lignin dissolving in [P4441]MeSO4 (with or without acid), although this mechanism may play some role in the acid-catalysed dissolution. XRD of the undissolved fractions suggests that the IL may selectively dissolve the amorphous cellulose component, leaving behind crystalline material.
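The 1510 cm-1 quantification amounts to a linear, Beer-Lambert-type calibration from aromatic-band absorbance to lignin content; the slope and intercept below are hypothetical stand-ins for a calibration against standards of known Klason lignin content:

def lignin_percent(a_1510, slope=35.0, intercept=0.5):
    # % lignin = m * A(1510 cm-1) + c, with m and c taken from a calibration
    # curve; the values used here are illustrative, not from the thesis.
    return slope * a_1510 + intercept

print(round(lignin_percent(0.42), 1), "% lignin")  # hypothetical absorbance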

Relevance: 10.00%

Abstract:

Electronic services are a leitmotif in 'hot' topics like Software as a Service, Service Oriented Architecture (SOA), Service Oriented Computing, Cloud Computing, application markets and smart devices. We propose to consider these within what has been termed the Service Ecosystem (SES). The SES encompasses all levels of electronic services and their interaction, with human consumption and initiation on its periphery, in much the same way the 'Web' describes a plethora of technologies that eventuate to connect information and expose it to humans. Presently, the SES is heterogeneous, fragmented and confined to semi-closed systems. A key issue hampering the emergence of an integrated SES is Service Discovery (SD). A SES will be dynamic, with areas of structured and unstructured information within which service providers and 'lay' human consumers interact; until now the two have been disjointed, e.g., SOA-enabled organisations, industries and domains are choreographed by domain experts or 'hard-wired' to smart device application markets and web applications. In a SES, services are accessible, comparable and exchangeable to human consumers, closing the gap to the providers. This requires a new SD with which humans can discover services transparently and effectively, without special knowledge or training.

We propose two modes of discovery: directed search, which follows an agenda, and explorative search, which speculatively expands knowledge of an area of interest by means of categories. Inspired by conceptual space theory from cognitive science, we propose to implement the modes of discovery using concepts to map a lay consumer's service need to terminologically sophisticated descriptions of services. To this end, we reframe SD as an information retrieval task on the information attached to services, such as descriptions, reviews, documentation and web sites - the Service Information Shadow. The Semantic Space model transforms the shadow's unstructured semantic information into a geometric, concept-like representation. We introduce an improved and extended Semantic Space that includes categorization, calling it the Semantic Service Discovery model.

We evaluate our model with a highly relevant, service-related corpus simulating a Service Information Shadow, including manually constructed complex service agendas as well as manual groupings of services. We compare our model against state-of-the-art information retrieval systems and clustering algorithms. By means of an extensive series of empirical evaluations, we establish optimal parameter settings for the semantic space model. The evaluations demonstrate the model's effectiveness for SD in terms of retrieval precision over state-of-the-art information retrieval models (directed search) and in the meaningful, automatic categorization of service-related information, which shows potential to form the basis of a useful, cognitively motivated map of the SES for exploratory search.
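The geometric, concept-like representation can be sketched with a truncated SVD of a term-by-service matrix, ranking services against a lay query by cosine similarity in the reduced space; the dimensionality and fold-in scheme below are illustrative assumptions, not the thesis's implementation:

import numpy as np

def build_semantic_space(term_service, k=2):
    # Truncated SVD of a term-by-service matrix -> k-dimensional concept space.
    u, s, vt = np.linalg.svd(term_service, full_matrices=False)
    return u[:, :k], np.diag(s[:k]), vt[:k, :]

def rank_services(query_vec, u_k, s_k, vt_k):
    # Fold the query vector into concept space, then score each service by
    # cosine similarity and return service indices, best first.
    q = np.linalg.inv(s_k) @ u_k.T @ query_vec
    docs = vt_k.T
    sims = docs @ q / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q) + 1e-12)
    return np.argsort(-sims)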

Relevance: 10.00%

Abstract:

The effects of tumour motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to the moving target, resulting in dose blurring and interplay effects which are a consequence of the combined tumour and beam motion. Prior to this work, reported studies on EDW-based interplay effects were restricted to experimental methods for assessing single-field, non-fractionated treatments. In this work, interplay effects are investigated for EDW treatments: single- and multiple-field treatments are studied using experimental and Monte Carlo (MC) methods.

Initially, this work experimentally studies interplay effects for single-field, non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm2), amplitudes (10-40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumour motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumour and jaw motions are either parallel or perpendicular to each other). For parallel motion, both the amplitude and the period of tumour motion affect the interplay; this becomes more prominent where the collimator and tumour speeds become identical. For perpendicular motion, the amplitude of tumour motion is the dominant factor, whereas varying the period of tumour motion has no observable effect on the dose distribution. The wedge angle results suggest that a large wedge angle generates greater dose variation for both parallel and perpendicular motions. The use of a small field size with a large tumour motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single-field measurements, a motion amplitude and period have been identified which show the poorest agreement between the target motion and the dynamic delivery, and these are used as the 'worst case motion parameters'.

The experimental work is then extended to multiple-field, fractionated treatments. A number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover, a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumour insert over four fractions, using the worst case parameters, i.e., 40 mm amplitude and 6 s period. Analysis of the film doses using gamma analysis at 3%/3 mm indicates that the interplay effects do not average out for this particular study, with a gamma pass rate of 49%.

To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code, recently introduced to model dynamic wedges, is validated and automated. DYNJAWS is commissioned for 6 MV and 10 MV photon energies, and it is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and the dynamic mode is shown to be more accurate. The DYNJAWS-specific input file, which specifies the probability of selection of a subfield and the respective jaw coordinates, has been automated; this simplifies the generation of BEAMnrc input files for DYNJAWS.

The commissioned DYNJAWS model is then used to study multiple-field EDW treatments with MC methods. The 4D CT data of an IMRT phantom with the dummy tumour are used to produce a set of Monte Carlo simulation phantoms, onto which the delivery of single-field and multiple-field EDW treatments is simulated. A number of static and motion multiple-field EDW plans have been simulated. Comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion are in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle.

Finally, to use gel dosimetry as a validation tool, a new technique called the 'zero-scan method' is developed for reading gel dosimeters with x-ray computed tomography (CT). It is shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image, which has a precision similar to that of an image obtained by averaging the CT images, without the additional dose delivered by the CT scans.

In summary, interplay effects have been studied for single- and multiple-field fractionated EDW treatments using experimental and Monte Carlo methods. The DYNJAWS component module of the BEAMnrc code has been validated, automated and used to study the interplay for multiple-field EDW treatments, and the zero-scan method, a new gel dosimetry readout technique, has been developed for reading gel images using x-ray CT without losing precision or accuracy.
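The gamma analysis used above compares dose distributions under combined dose-difference and distance-to-agreement criteria; a simplified 1-D version (real analyses are 2-D or 3-D) might look like this:

import numpy as np

def gamma_1d(ref_dose, ref_x, eval_dose, eval_x, dd=0.03, dta=3.0):
    # dd: dose-difference criterion as a fraction of the reference maximum;
    # dta: distance-to-agreement criterion in mm. Returns gamma per ref point.
    ref_dose, eval_dose = np.asarray(ref_dose, float), np.asarray(eval_dose, float)
    ref_x, eval_x = np.asarray(ref_x, float), np.asarray(eval_x, float)
    d_norm = dd * ref_dose.max()
    gammas = []
    for xr, dr in zip(ref_x, ref_dose):
        g = np.sqrt(((eval_x - xr) / dta) ** 2 +
                    ((eval_dose - dr) / d_norm) ** 2)
        gammas.append(g.min())
    return np.array(gammas)

# Pass rate at 3%/3 mm = fraction of points with gamma <= 1:
# pass_rate = (gamma_1d(ref, x, ev, x) <= 1).mean()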

Relevance: 10.00%

Abstract:

In order to support intelligent transportation system (ITS) road safety applications such as collision avoidance, lane departure warnings and lane keeping, a Global Navigation Satellite System (GNSS) based vehicle positioning system has to provide lane-level (0.5 to 1 m) or even in-lane-level (0.1 to 0.3 m) accurate and reliable positioning information to vehicle users. However, current vehicle navigation systems equipped with a single-frequency GPS receiver can only provide road-level accuracy of 5-10 metres. The positioning accuracy can be improved to sub-metre level or higher with augmented GNSS techniques such as Real Time Kinematic (RTK) and Precise Point Positioning (PPP), which have traditionally been used in land surveying and in slowly moving environments. In these techniques, GNSS correction data generated from a local, regional or global network of GNSS ground stations are broadcast to users via various communication data links, mostly 3G cellular networks and communication satellites.

This research investigated the performance of precise positioning systems operating in high-mobility environments. This involved evaluating the performance of both RTK and PPP techniques using: i) a state-of-the-art dual-frequency GPS receiver; and ii) a low-cost single-frequency GNSS receiver. Additionally, this research evaluated the effectiveness of several operational strategies in reducing the load that correction data transmission places on data communication networks, which may be problematic for future wide-area ITS service deployment. These strategies include the use of different data transmission protocols, different correction data format standards, and correction data transmission at less frequent intervals.

A series of field experiments was designed and conducted for each research task. First, the performance of the RTK and PPP techniques was evaluated in both static and kinematic (highway driving at speeds exceeding 80 km/h) experiments. RTK solutions achieved an RMS precision of 0.09 to 0.2 m in static tests and 0.2 to 0.3 m in kinematic tests, while PPP achieved 0.5 to 1.5 m in static and 1 to 1.8 m in kinematic tests using the RTKLIB software. These RMS precision values could be further improved if better RTK and PPP algorithms were adopted. The test results also showed that RTK may be more suitable for lane-level-accuracy vehicle positioning. The professional-grade (dual-frequency) and mass-market-grade (single-frequency) GNSS receivers were tested for their RTK performance in static and kinematic modes. The analysis showed that mass-market-grade receivers provide good solution continuity, although their overall positioning accuracy is worse than that of professional-grade receivers.

In an attempt to reduce the load on the data communication network, we first evaluated the use of different correction data format standards, namely the RTCM version 2.x and RTCM version 3.0 formats. A 24-hour transmission test was conducted to compare network throughput. The results showed that a 66% reduction in network throughput can be achieved by using the newer RTCM version 3.0 format compared with the older RTCM version 2.x format. Second, experiments were conducted to examine the use of two data transmission protocols, TCP and UDP, for correction data transmission through the Telstra 3G cellular network. The performance of each transmission method was analysed in terms of packet transmission latency, packet dropout, packet throughput and packet retransmission rate. The overall network throughput and latency of UDP data transmission were 76.5% and 83.6% of those of TCP data transmission, while the overall accuracy of the positioning solutions remained at the same level. Additionally, due to the nature of UDP transmission, 0.17% of UDP packets were lost during the kinematic tests, but this loss did not lead to a significant reduction in the quality of the positioning results. The experimental results from the static and kinematic field tests also showed that the mobile network communication may be blocked for a couple of seconds, but the positioning solutions can be kept at the required accuracy level by appropriate setting of the Age of Differential.

Finally, we investigated the effects of using less frequent correction data (transmitted at 1, 5, 10, 15, 20, 30 and 60 second intervals) on the precise positioning system. As the time interval increases, the percentage of ambiguity-fixed solutions gradually decreases, while the positioning error increases from 0.1 to 0.5 m. The results showed that the positioning accuracy can still be kept at the in-lane level (0.1 to 0.3 m) when using correction data transmitted at intervals of up to 20 seconds.
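The Age of Differential behaviour described above can be mimicked with a small UDP receiver that simply discards corrections once they are too old; the port, framing and 20 s limit are assumptions for illustration, not the experiment's configuration:

import socket
import time

MAX_AGE_S = 20.0  # corrections older than this are treated as unusable

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5000))
sock.settimeout(1.0)

latest = None
while True:
    try:
        payload, _addr = sock.recvfrom(4096)
        latest = (time.monotonic(), payload)  # one correction per datagram (assumed)
    except socket.timeout:
        pass  # packet loss or a brief outage: keep using the last correction
    if latest and time.monotonic() - latest[0] > MAX_AGE_S:
        latest = None  # too stale for RTK; fall back until fresh data arrives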

Relevance: 10.00%

Abstract:

Background - this paper presents a novel approach to searching electronic medical records that is based on concept matching rather than keyword matching. Aim - the concept-based approach is intended to overcome specific challenges we identified in searching medical records. Method - queries and documents were transformed from their term-based originals into medical concepts as defined by the SNOMED CT ontology. Results - evaluation on a real-world collection of medical records showed that our concept-based approach outperformed a keyword baseline by 25% in Mean Average Precision. Conclusion - the concept-based approach provides a framework for the further development of inference-based search systems for dealing with medical data.
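A toy version of the concept-matching step conveys the idea: query and document terms are first mapped to concept identifiers, so synonyms match even when surface terms differ. The lookup table stands in for a SNOMED CT lookup, and the identifiers are not real SNOMED codes:

TERM_TO_CONCEPT = {  # hypothetical mapping, not real SNOMED CT codes
    "heart attack": "C001",
    "myocardial infarction": "C001",
    "hypertension": "C002",
    "high blood pressure": "C002",
}

def to_concepts(text):
    text = text.lower()
    return {cid for term, cid in TERM_TO_CONCEPT.items() if term in text}

def concept_match_score(query, document):
    # Fraction of query concepts found in the document.
    q, d = to_concepts(query), to_concepts(document)
    return len(q & d) / len(q) if q else 0.0

print(concept_match_score("heart attack", "Pt with acute myocardial infarction"))
# -> 1.0, although the surface terms differ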