942 results for Subpixel precision
Abstract:
The idea of body weight regulation implies that a biological mechanism exerts control over energy expenditure and food intake. This is a central tenet of energy homeostasis. However, the source and identity of the controlling mechanism have not been identified, although it is often presumed to be some long-acting signal related to body fat, such as leptin. Using a comprehensive experimental platform, we have investigated the relationship between biological and behavioural variables in two separate studies over a 12-week intervention period in obese adults (total n 92). All variables have been measured objectively and with a similar degree of scientific control and precision, including anthropometric factors, body composition, RMR and accumulative energy consumed at individual meals across the whole day. Results showed that meal size and daily energy intake (EI) were significantly correlated with fat-free mass (FFM; P values 0·02–0·05) but not with fat mass (FM) or BMI (P values 0·11–0·45) (study 1, n 58). In study 2 (n 34), FFM (but not FM or BMI) predicted meal size and daily EI under two distinct dietary conditions (high-fat and low-fat). These data appear to indicate that, under these circumstances, some signal associated with lean mass (but not FM) exerts a determining effect over self-selected food consumption. This signal may be postulated to interact with a separate class of signals generated by FM. This finding may have implications for investigations of the molecular control of food intake and body weight and for the management of obesity.
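As a purely illustrative sketch of the kind of correlation analysis summarised above (synthetic data and hypothetical effect sizes, not the study's code or values), the FFM/FM/BMI versus energy-intake relationships could be examined like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 58                                       # study 1 sample size
ffm = rng.normal(60, 8, n)                   # fat-free mass (kg), synthetic
fm = rng.normal(35, 10, n)                   # fat mass (kg), synthetic
bmi = rng.normal(34, 4, n)                   # BMI (kg/m^2), synthetic
daily_ei = 4000 + 60 * ffm + rng.normal(0, 500, n)  # daily energy intake, synthetic

for name, x in [("FFM", ffm), ("FM", fm), ("BMI", bmi)]:
    r, p = stats.pearsonr(x, daily_ei)       # Pearson r and two-sided P value
    print(f"{name}: r = {r:.2f}, P = {p:.3f}")
```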
Abstract:
Automatic species recognition plays an important role in assisting ecologists to monitor the environment. One critical issue in this research area is that software developers need prior knowledge of the specific targets of interest in order to build templates for those targets. This paper proposes a novel approach for automatic species recognition that detects species based on generic knowledge about acoustic events. Acoustic component detection is the most critical and fundamental part of this proposed approach. This paper gives clear definitions of acoustic components and presents three clustering algorithms for detecting four acoustic components in sound recordings: whistles, clicks, slurs, and blocks. The experimental results demonstrate that these acoustic component recognisers achieve high precision and recall rates.
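A hedged sketch of the general idea of acoustic component detection, here approximated by clustering prominent spectrogram peaks with DBSCAN; the thresholds and clustering parameters are assumptions for illustration and are not the three algorithms described in the paper:

```python
import numpy as np
from scipy import signal
from sklearn.cluster import DBSCAN

def detect_components(audio, sr):
    """Cluster prominent time-frequency peaks into candidate acoustic components."""
    f, t, sxx = signal.spectrogram(audio, fs=sr, nperseg=512)
    db = 10 * np.log10(sxx + 1e-12)
    rows, cols = np.nonzero(db > db.mean() + 2 * db.std())   # loud cells only
    if rows.size == 0:
        return []
    points = np.column_stack([t[cols], f[rows] / 1000.0])    # (seconds, kHz)
    labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(points)
    return [points[labels == k] for k in set(labels) if k != -1]

# usage with a synthetic whistle-like chirp
sr = 22050
tt = np.linspace(0, 1, sr, endpoint=False)
audio = signal.chirp(tt, f0=2000, t1=1.0, f1=4000)
print(len(detect_components(audio, sr)), "candidate component(s) found")
```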
Abstract:
In this paper, we present a new algorithm for boosting visual template recall performance through a process of visual expectation. Visual expectation dynamically modifies the recognition thresholds of learnt visual templates based on recently matched templates, improving the recall of sequences of familiar places while keeping precision high, without any feedback from a mapping backend. We demonstrate the performance benefits of visual expectation using two 17 kilometer datasets gathered in an outdoor environment at two times separated by three weeks. The visual expectation algorithm provides up to a 100% improvement in recall. We also combine the visual expectation algorithm with the RatSLAM SLAM system and show how the algorithm enables successful mapping.
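A minimal sketch of the visual-expectation idea as described above: recently matched templates temporarily lower the match thresholds of neighbouring templates in the learnt sequence. The class, parameter names and values below are illustrative assumptions, not the paper's implementation:

```python
class TemplateMatcher:
    """Toy visual-expectation bookkeeping over a sequence of learnt templates."""

    def __init__(self, n_templates, base_threshold=0.75, boost=0.15, radius=2):
        self.base = base_threshold          # default match threshold
        self.boost_amount = boost           # how much expectation relaxes it
        self.radius = radius                # how far ahead expectation extends
        self.boost = [0.0] * n_templates

    def threshold(self, i):
        # an expected template needs a weaker match score to fire
        return self.base - self.boost[i]

    def report_match(self, i):
        # expect the next few templates in the learnt sequence to reappear
        for j in range(i, min(i + self.radius + 1, len(self.boost))):
            self.boost[j] = self.boost_amount

    def decay(self, rate=0.05):
        # expectation fades if no further matches are reported
        self.boost = [max(0.0, b - rate) for b in self.boost]

matcher = TemplateMatcher(n_templates=100)
matcher.report_match(10)
print(matcher.threshold(11), matcher.threshold(50))   # 0.60 vs 0.75
```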
Abstract:
Discovering proper search intents is a vital step in returning desired results, and it has been a consistently active research topic in information retrieval in recent years. Existing methods mainly rely on context-based mining, query expansion and user-profiling techniques, which still suffer from the issue of ambiguity in search queries. In this paper, we introduce a novel ontology-based approach that uses a world knowledge base to construct personalized ontologies for identifying adequate concept levels that match user search intents. An iterative mining algorithm is designed to evaluate potential intents level by level until the best result is reached. The proposed approach is evaluated on the large-volume RCV1 data set, and experimental results indicate a distinct improvement in top precision compared with baseline models.
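An illustrative sketch (not the paper's algorithm) of evaluating candidate concept levels iteratively and stopping once the score no longer improves; the scoring function and example levels are hypothetical:

```python
def best_concept_level(ontology_levels, score):
    """ontology_levels: concept sets ordered from most general to most specific.
    score: callable estimating retrieval quality for a concept set."""
    best_level, best_score = None, float("-inf")
    for level in ontology_levels:
        s = score(level)
        if s <= best_score:              # stop once quality no longer improves
            break
        best_level, best_score = level, s
    return best_level, best_score

# toy usage with a hypothetical scoring function
levels = [{"sport"}, {"football", "tennis"}, {"premier league", "wimbledon"}]
print(best_concept_level(levels, score=lambda c: -abs(len(c) - 2)))
```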
Abstract:
This paper explores violent urbanism in the recent science-fiction film District 9, which depicts an alien immigration camp, filmed on location in Soweto in 2008 in the midst of a series of violent clashes between indigenous South Africans and the new wave of African immigrants. Violent Urbanism is the State's method of controlling bodies and populations by those precise biological techniques that determine geopolitical sites for the control of cities. This film, while presented as cinema verite, speaks to the real invasion of traditional spatio-disciplinary regimes such as corporate-run detention centres, refugee camps, border control and enforced relocation by those imperceptible techniques which violate the body by reducing it to a biological datum, tool, or specimen to serve the security agenda of the twenty-first-century nation-state. These techniques are chemical and biological warfare proliferation; genetic engineering; and surveillance systems, such as biometrics, whose purview is no longer limited to the specular but includes the molecular. District 9 evinces a compelling urban image of contemporary biopolitics that disturbs the received historiography of post-apartheid urbanism. Clearly Johannesburg is not the only place where this could be, or is, happening: the reach of biopolitics is worldwide. District 9 visualises with utter precision the corporate hijacking of the biological realm in contemporary cities, just as it asks the unsettling question: who exactly is the "audience" of Violent Urbanism?
Abstract:
Nowadays, everyone can effortlessly access a range of information on the World Wide Web (WWW). As information resources on the web continue to grow tremendously, it becomes progressively more difficult to meet the high expectations of users and find relevant information. Although existing search engine technologies can find valuable information, they suffer from the problems of information overload and information mismatch. This paper presents a hybrid Web Information Retrieval approach allowing personalised search using ontology, a user profile and collaborative filtering. This approach finds the context of the user query with the least user involvement, using ontology. Simultaneously, it uses time-based automatic user-profile updating to follow the user's changing behaviour. Subsequently, it uses recommendations from similar users through a collaborative filtering technique. The proposed method is evaluated with the FIRE 2010 dataset and a manually generated dataset. Empirical analysis reveals that the Precision, Recall and F-Score of most queries for many users are improved with the proposed method.
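A hedged sketch of how the three signals described above (ontology-based query context, user profile and collaborative filtering) might be combined into one ranking score; the weights and score definitions are illustrative assumptions, not the paper's formulation:

```python
def hybrid_score(doc, query_concepts, user_profile, cf_scores,
                 w_ont=0.5, w_prof=0.3, w_cf=0.2):
    """Weighted mix of ontology overlap, profile overlap and CF recommendation."""
    ont = len(doc["concepts"] & query_concepts) / max(len(query_concepts), 1)
    prof = len(doc["concepts"] & user_profile) / max(len(user_profile), 1)
    cf = cf_scores.get(doc["id"], 0.0)
    return w_ont * ont + w_prof * prof + w_cf * cf

doc = {"id": "d1", "concepts": {"cricket", "australia"}}
print(hybrid_score(doc, {"cricket"}, {"sport", "cricket"}, {"d1": 0.8}))  # 0.81
```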
Abstract:
The Link the Wiki track at INEX 2008 offered two tasks: file-to-file link discovery and anchor-to-BEP link discovery. In the former, 6600 topics were used; in the latter, 50. Manual assessment of the anchor-to-BEP runs was performed using a tool developed for the purpose. Runs were evaluated using standard precision and recall measures such as MAP and precision/recall graphs. Ten groups participated, and the approaches they took are discussed. Final evaluation results for all runs are presented.
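For reference, a minimal sketch of mean average precision (MAP), one of the standard measures mentioned above, computed over ranked runs:

```python
def average_precision(ranked, relevant):
    hits, total = 0, 0.0
    for i, item in enumerate(ranked, start=1):
        if item in relevant:
            hits += 1
            total += hits / i            # precision at each relevant rank
    return total / len(relevant) if relevant else 0.0

def mean_average_precision(runs):
    """runs: list of (ranked_results, relevant_set) pairs, one per topic."""
    return sum(average_precision(r, rel) for r, rel in runs) / len(runs)

print(mean_average_precision([(["a", "b", "c"], {"a", "c"})]))   # ~0.833
```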
Abstract:
A rule-based approach for classifying previously identified medical concepts in clinical free text into an assertion category is presented. There are six different categories of assertions for the task: Present, Absent, Possible, Conditional, Hypothetical and Not associated with the patient. The assertion classification algorithms were largely based on extending the popular NegEx and ConText algorithms. In addition, the clinical health terminology SNOMED CT and other publicly available dictionaries were used to classify assertions which did not fit the NegEx/ConText model. The data for this task include discharge summaries from Partners HealthCare and from Beth Israel Deaconess Medical Centre, as well as discharge summaries and progress notes from the University of Pittsburgh Medical Centre. The set consists of 349 discharge reports, each with pairs of ground-truth concept and assertion files for system development, and 477 reports for evaluation. The system's performance on the evaluation data set was 0.83, 0.83 and 0.83 for recall, precision and F1-measure, respectively. Although the rule-based system shows promise, further improvements can be made by incorporating machine learning approaches.
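A much-simplified, hedged sketch of the NegEx/ConText-style rule idea described above: trigger phrases in a window preceding the concept decide the assertion category. The trigger list, window size and default category are illustrative only:

```python
import re

# illustrative trigger phrases only; real NegEx/ConText lists are much larger
TRIGGERS = [
    (r"\b(no|denies|without|negative for)\b", "Absent"),
    (r"\b(possible|probable|may represent)\b", "Possible"),
    (r"\b(if|should)\b", "Hypothetical"),
    (r"\b(family history of|mother|father)\b", "Not associated with the patient"),
]

def classify_assertion(sentence, concept):
    # look only at the text immediately preceding the concept mention
    window = sentence.lower().split(concept.lower())[0][-60:]
    for pattern, label in TRIGGERS:
        if re.search(pattern, window):
            return label
    return "Present"

print(classify_assertion("The patient denies chest pain.", "chest pain"))  # Absent
```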
Abstract:
Purpose. To devise and validate artist-rendered grading scales for contact lens complications. Methods. Each of eight tissue complications of contact lens wear (listed under 'Results') was painted by a skilled ophthalmic artist (Terry R. Tarrant) in five grades of severity: 0 (normal), 1 (trace), 2 (mild), 3 (moderate) and 4 (severe). A representative slit lamp photograph of a tissue response for each of the eight complications was shown to 404 contact lens practitioners who had never before used clinical grading scales. The practitioners were asked to grade each tissue response to the nearest 0.1 grade unit by interpolation. Results. The eight complications graded were corneal staining, endothelial polymegethism, epithelial microcysts, endothelial blebs, stromal edema, conjunctival hyperemia, stromal neovascularization and papillary conjunctivitis; the standard deviation (s.d.) of the 404 responses for each complication ranged from 0.4 to 0.7 grade units. The frequency distributions and best-fit normal curves were also plotted. The precision of grading (s.d. x 2) ranged from 0.8 to 1.4, with a mean precision of 1.0. Conclusions. Grading scales afford contact lens practitioners a method of quantifying the severity of adverse tissue responses to contact lens wear. It is noteworthy that the statistically verified precision of grading (1.0 scale unit) concurs with the essential design feature of the grading scales that each grading step of 1.0 corresponds to a clinically significant difference in severity. Thus, as a general rule, a difference or change in grade of > 1.0 can be taken to be both clinically and statistically significant when using these grading scales. Trained observers are likely to achieve even greater grading precision. Supported by Hydron Limited.
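For clarity, the precision figures quoted above follow directly from the stated definition, precision = 2 x s.d.:

```latex
\[
  \text{precision} \;=\; 2 \times \text{s.d.}
  \qquad\Rightarrow\qquad
  2 \times 0.4 = 0.8,\quad 2 \times 0.7 = 1.4,\quad 2 \times 0.5 = 1.0 \;(\text{mean}).
\]
```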
Abstract:
Vernier acuity, a form of visual hyperacuity, is amongst the most precise forms of spatial vision. Under optimal conditions Vernier thresholds are much finer than the inter-photoreceptor distance. Achievement of such high precision is based substantially on cortical computations, most likely in the primary visual cortex. Using stimuli with added positional noise, we show that Vernier processing is reduced with advancing age across a wide range of noise levels. Using an ideal observer model, we are able to characterize the mechanisms underlying age-related loss, and show that the reduction in Vernier acuity can be mainly attributed to the reduction in efficiency of sampling, with no significant change in the level of internal position noise, or spatial distortion, in the visual system.
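One common equivalent-noise formulation on which an ideal-observer analysis of this kind can rest (the notation below is assumed for illustration and is not necessarily the paper's exact model) expresses the measured threshold T in terms of external position noise, equivalent internal noise and sampling efficiency:

```latex
\[
  T^{2}(\sigma_{\mathrm{ext}}) \;=\; \frac{\sigma_{\mathrm{int}}^{2} + \sigma_{\mathrm{ext}}^{2}}{E}
\]
```

In this form, a lower sampling efficiency E raises thresholds at every external noise level, whereas a higher internal noise term mainly raises thresholds when the external noise is small, which matches the distinction the abstract draws between the two possible sources of age-related loss.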
Abstract:
A priority when designing control strategies for autonomous underwater vehicles is to emphasize their cost of implementation on a real vehicle. Indeed, due to the vehicles' design and the actuation modes usually under consideration for underwater platforms, the number of actuator switchings must be kept small to ensure feasibility and precision. This is the main objective of the algorithm presented in this paper. The theory is illustrated on two examples: one is a fully actuated underwater vehicle capable of motion in six degrees of freedom, and one is minimally actuated with control motions in the vertical plane only.
Abstract:
Objective - this study examined the clinical utility and precision of routine screening for alcohol and other drug use among women attending a public antenatal service. Study design - a survey of clients and an audit of clinical charts. Participants and setting - clients attending an antenatal clinic of a large tertiary hospital in Queensland, Australia, from October to December 2009. Measurements and findings - data were collected from two sources. First, 32 women who reported use of alcohol or other drugs during pregnancy at initial screening were asked to complete a full substance use survey. Second, data were collected from the charts of 349 new clients who attended the antenatal clinic during the study period. Both sensitivity (86%, 67%) and positive predictive value (100%, 92%), for alcohol and other drug use respectively, were high. Only 15% of surveyed women were uncomfortable about being screened for substance use in pregnancy, yet the chart audit revealed poor staff compliance. During the study period, 25% of clients were either not screened adequately or not at all. Key conclusions and implications for practice - despite recommended universal screening in pregnancy and the apparent acceptance by our participants, alcohol and other drug (A&OD) screening in the antenatal setting remains problematic. Investigation into the reasons behind, and ways to overcome, the low screening rate could improve health outcomes for mothers and children in this at-risk group. Targeted education and training for midwives may form part of the solution, as these clinicians have a key role in implementing prevention and early intervention strategies.
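As a small worked illustration of the screening-accuracy measures quoted above (the counts below are hypothetical, chosen only to show how an 86% sensitivity and 100% positive predictive value could arise; they are not the study's data):

```python
def sensitivity(tp, fn):
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    return tp / (tp + fp)

# hypothetical counts, not the study's data
tp, fp, fn = 19, 0, 3
print(f"sensitivity = {sensitivity(tp, fn):.0%}")                               # 86%
print(f"positive predictive value = {positive_predictive_value(tp, fp):.0%}")   # 100%
```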
Abstract:
In this paper, we apply a simulation-based approach for estimating transmission rates of nosocomial pathogens. In particular, the objective is to infer the transmission rate between colonised health-care practitioners and uncolonised patients (and vice versa) solely from routinely collected incidence data. The method, based on approximate Bayesian computation, is substantially less computationally intensive and easier to implement than the likelihood-based approaches referred to here. We find that, by replacing the likelihood with a comparison of an efficient summary statistic between observed and simulated data, little is lost in the precision of the estimated transmission rates. Furthermore, we investigate the impact of incorporating uncertainty in previously fixed parameters on the precision of the estimated transmission rates.
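A hedged sketch of rejection ABC in the spirit described above: draw candidate transmission rates from a prior, simulate incidence data, and accept rates whose summary statistic lies close to the observed one. The toy ward simulator, prior, summary statistic and tolerance are all illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_summary(beta, n_patients=20, n_days=100):
    """Toy colonisation model; returns total new colonisations as the summary."""
    colonised = np.zeros(n_patients, dtype=bool)
    colonised[0] = True                      # one colonised patient at admission
    new_cases = 0
    for _ in range(n_days):
        pressure = beta * colonised.mean()   # per-day acquisition probability
        acquired = (~colonised) & (rng.random(n_patients) < pressure)
        new_cases += int(acquired.sum())
        colonised |= acquired
    return new_cases

observed_summary = 12                        # illustrative observed count
accepted = [beta for beta in rng.uniform(0, 1, 5000)
            if abs(simulate_summary(beta) - observed_summary) <= 2]

if accepted:
    print(f"accepted {len(accepted)} draws; posterior mean beta ~ {np.mean(accepted):.3f}")
```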
Abstract:
The trafficking of molecules and membranes within cells is a prerequisite for all aspects of cellular immune functions, including the delivery and recycling of cell surface proteins, secretion of immune mediators, ingestion of pathogens and activation of lymphocytes. SNARE (soluble-N-ethylmaleimide-sensitive-factor accessory-protein receptor)-family members mediate membrane fusion during all steps of trafficking, and function in almost all aspects of innate and adaptive immune responses. Here, we provide an overview of the roles of SNAREs in immune cells, offering insight into one level at which precision and tight regulation are instilled on immune responses.
Abstract:
Over the last decade, Ionic Liquids (ILs) have been used for the dissolution and derivatization of isolated cellulose. This ability of ILs is now sought for their application in the selective dissolution of cellulose from lignocellulosic biomass for the manufacture of cellulosic ethanol. However, there are significant knowledge gaps in the understanding of the chemistry of the interaction of biomass and ILs. While imidazolium ILs have been used successfully to dissolve both isolated crystalline cellulose and components of lignocellulosic biomass, phosphonium ILs have not been sufficiently explored for use in the dissolution of lignocellulosic biomass. This thesis reports on the study of the chemistry of sugarcane bagasse with phosphonium ILs. Qualitative and quantitative measurements of biomass components dissolved in the phosphonium ILs trihexyltetradecylphosphonium chloride ([P66614]Cl) and tributylmethylphosphonium methylsulfate ([P4441]MeSO4) are obtained using attenuated total reflectance Fourier transform infrared (FTIR) spectroscopy. Absorption bands related to cellulose, hemicelluloses and lignin dissolution, monitored in situ in biomass-IL mixtures, indicate lignin dissolution in both ILs and some holocellulose dissolution in the hydrophilic [P4441]MeSO4. The kinetics of lignin dissolution reported here indicate that while dissolution in the hydrophobic IL [P66614]Cl appears to follow an accepted mechanism of acid-catalysed β-aryl ether cleavage, dissolution in the hydrophilic IL [P4441]MeSO4 does not appear to follow this mechanism and may not be followed by condensation reactions (initiated by reactive ketones). The quantitative measurement of lignin dissolution in phosphonium ILs based on absorbance at 1510 cm-1 has demonstrated utility and greater precision than the conventional Klason lignin method. The cleavage of lignin β-aryl ether bonds in sugarcane bagasse by the ionic liquid [P66614]Cl is then studied in the presence of catalytic amounts of mineral acid (ca. 0.4 %). The delignification process of bagasse is studied over a range of temperatures (120 °C to 150 °C) by monitoring the production of β-ketones (indicative of cleavage of β-aryl ethers) using FTIR spectroscopy and by compositional analysis of the undissolved fractions. Maximum delignification is obtained at 150 °C, with 52 % of lignin removed from the original lignin content of bagasse. No delignification is observed in the absence of acid, which suggests that the reaction is acid catalysed, with the IL solubilising the lignin fragments. The rate of delignification was significantly higher at 150 °C, suggesting that crossing the glass transition temperature of lignin effects greater freedom of rotation about the propanoid carbon-carbon bonds and leads to increased cleavage of β-aryl ethers. An attempt has been made to propose a probable mechanism of delignification of bagasse with the phosphonium IL. All polymeric components of bagasse, a lignocellulosic biomass, dissolve in the hydrophilic IL tributylmethylphosphonium methylsulfate ([P4441]MeSO4) with and without a catalytic amount of acid (H2SO4, ca. 0.4 %). The presence of acid significantly increases the extent of dissolution of bagasse in [P4441]MeSO4 (by ca. 2.5 times under the conditions used here). The dissolved fractions can be partially recovered by the addition of an antisolvent (water) and are significantly enriched in lignin.
Unlike acid-catalysed dissolution in the hydrophobic IL trihexyltetradecylphosphonium chloride, there is little evidence of cleavage of the β-aryl ether bonds of lignin dissolving in [P4441]MeSO4 (with or without acid), although this mechanism may play some role in the acid-catalysed dissolution. XRD of the undissolved fractions suggests that the IL may selectively dissolve the amorphous cellulose component, leaving behind crystalline material.
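As a hedged illustration of the kind of absorbance-based quantification mentioned above (a simple linear, Beer-Lambert-style calibration at 1510 cm-1; all numbers below are synthetic, not thesis data):

```python
import numpy as np

# calibration standards: known lignin concentration (wt %) vs A(1510 cm-1), synthetic
conc = np.array([0.5, 1.0, 2.0, 4.0])
absorbance = np.array([0.06, 0.11, 0.23, 0.45])

slope, intercept = np.polyfit(conc, absorbance, 1)   # A = slope * c + intercept

def lignin_concentration(a1510):
    """Invert the calibration line to estimate lignin concentration (wt %)."""
    return (a1510 - intercept) / slope

print(f"sample with A(1510) = 0.30 -> {lignin_concentration(0.30):.2f} wt % lignin")
```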