955 results for threshold model


Relevance: 30.00%

Abstract:

Expert searchers engage with information as information brokers, researchers, reference librarians, information architects, faculty who teach advanced search, and in a variety of other information-intensive professions. Their experiences are characterized by a profound understanding of information concepts and skills, and an agile ability to apply this knowledge to interacting with, and having an impact on, the information environment. This study explored the learning experiences of searchers to understand the acquisition of search expertise. The research question was: What can be learned about becoming an expert searcher from the learning experiences of proficient novice searchers and highly experienced searchers? The key objectives were: (1) to explore the existence of threshold concepts in search expertise; and (2) to improve our understanding of how search expertise is acquired and how novice searchers, intent on becoming experts, can learn to search in more expertlike ways.

The participant sample drew from two population groups: (1) highly experienced searchers with a minimum of 20 years of relevant professional experience, including LIS faculty who teach advanced search, information brokers, and search engine developers (11 subjects); and (2) MLIS students who had completed coursework in information retrieval and online searching and demonstrated exceptional ability (9 subjects). Using these two groups allowed a nuanced understanding of the experience of learning to search in expertlike ways, with data from those who search at a very high level as well as from those who may be actively developing expertise. The study used semi-structured interviews, search tasks with think-aloud narratives, and talk-after protocols. Searches were screen-captured with simultaneous audio-recording of the think-aloud narrative. Data were coded and analyzed both manually and using NVivo 9. Grounded theory allowed categories and themes to emerge from the data; categories represented conceptual knowledge and attributes of expert searchers. In accord with grounded theory method, once theoretical saturation was achieved, the data were viewed during the final stage of analysis through the lenses of existing theoretical frameworks. For this study, threshold concept theory (Meyer & Land, 2003) was used to explore which concepts might be threshold concepts. Threshold concepts have been used to explore transformative learning portals in subjects ranging from economics to mathematics. A threshold concept has five defining characteristics: it is transformative (causing a shift in perception), irreversible (unlikely to be forgotten), integrative (unifying separate concepts), troublesome (initially counter-intuitive), and may be bounded.

Themes that emerged provided evidence of four concepts which had the characteristics of threshold concepts. Three of these were: information environment (the total information environment is perceived and understood); information structures (content, index structures, and retrieval algorithms are understood); and information vocabularies (fluency in search behaviors related to language, including natural language, controlled vocabulary, and finesse using proximity, truncation, and other language-based tools). The fourth threshold concept was concept fusion, the integration of the other three threshold concepts, further defined by three properties: visioning (anticipating next moves), being light on one's 'search feet' (the dancing property), and a profound ontological shift (identity as searcher).
In addition to the threshold concepts, findings were reported that were not concept-based, including praxes and traits of expert searchers. A model of search expertise is proposed with the four threshold concepts at its core; the model also integrates the traits and praxes elicited from the study, attributes long recognized in LIS research as present in professional searchers. The research provides a deeper understanding of the transformative learning experiences involved in the acquisition of search expertise. It adds to our understanding of search expertise in the context of today's information environment and has implications for teaching advanced search, for research more broadly within library and information science, and for the methodologies used to explore threshold concepts.

Relevance: 30.00%

Abstract:

Dengue virus (DENV) transmission in Australia is driven by weather factors and imported dengue fever (DF) cases. However, uncertainty remains regarding the threshold effects of high-order interactions among weather factors and imported DF cases and the impact of these factors on autochthonous DF. A time-series regression tree model was used to assess the threshold effects of natural temporal variations of weekly weather factors and weekly imported DF cases in relation to incidence of weekly autochthonous DF from 1 January 2000 to 31 December 2009 in Townsville and Cairns, Australia. In Cairns, mean weekly autochthonous DF incidence increased 16.3-fold when the 3-week lagged moving average maximum temperature was <32 °C, the 4-week lagged moving average minimum temperature was ≥24 °C and the sum of imported DF cases in the previous 2 weeks was >0. When the 3-week lagged moving average maximum temperature was ≥32 °C and the other two conditions mentioned above remained the same, mean weekly autochthonous DF incidence only increased 4.6-fold. In Townsville, the mean weekly incidence of autochthonous DF increased 10-fold when 3-week lagged moving average rainfall was ≥27 mm, but it only increased 1.8-fold when rainfall was <27 mm during January to June. Thus, we found different responses of autochthonous DF incidence to weather factors and imported DF cases in Townsville and Cairns. Imported DF cases may also trigger and enhance local outbreaks under favorable climate conditions.
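The threshold conditions reported above are the kind of rule a regression tree discovers automatically from lagged predictors. The sketch below is illustrative only, not the authors' code: it builds lagged moving-average predictors and fits a shallow regression tree. The column names, moving-average window, and tree depth are assumptions.

```python
# Illustrative sketch (not the authors' model code): derive weekly lagged
# moving averages and fit a shallow regression tree so that threshold splits
# such as "tmax_ma_lag3 < 32" can emerge from the data.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

def lagged_moving_average(series: pd.Series, window: int, lag: int) -> pd.Series:
    """Moving average over `window` weeks, shifted back by `lag` weeks."""
    return series.rolling(window).mean().shift(lag)

def fit_threshold_tree(df: pd.DataFrame) -> DecisionTreeRegressor:
    # df is assumed to have weekly columns: 'tmax', 'tmin', 'rain',
    # 'imported_cases' and the response 'autochthonous_cases'.
    X = pd.DataFrame({
        "tmax_ma_lag3": lagged_moving_average(df["tmax"], window=2, lag=3),
        "tmin_ma_lag4": lagged_moving_average(df["tmin"], window=2, lag=4),
        "rain_ma_lag3": lagged_moving_average(df["rain"], window=2, lag=3),
        "imported_2wk": df["imported_cases"].rolling(2).sum().shift(1),
    })
    y = df["autochthonous_cases"]
    mask = X.notna().all(axis=1)
    # A shallow tree keeps each split an interpretable threshold on one predictor.
    tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=8)
    tree.fit(X[mask], y[mask])
    return tree
```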

Relevance: 30.00%

Abstract:

Cryptosystems based on the hardness of lattice problems have recently acquired much importance due to their average-case to worst-case equivalence, their conjectured resistance to quantum cryptanalysis, their ease of implementation and increasing practicality, and, lately, their promising potential as a platform for constructing advanced functionalities. In this work, we construct “Fuzzy” Identity Based Encryption from the hardness of the Learning With Errors (LWE) problem. We note that for our parameters, the underlying lattice problems (such as gapSVP or SIVP) are assumed to be hard to approximate within subexponential factors for adversaries running in subexponential time. We give CPA and CCA secure variants of our construction, for small and large universes of attributes. All our constructions are secure against selective-identity attacks in the standard model. Our construction is made possible by observing certain special properties that secret sharing schemes need to satisfy in order to be useful for Fuzzy IBE. We also discuss some obstacles towards realizing lattice-based attribute-based encryption (ABE).
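For readers unfamiliar with the underlying hardness assumption, the following minimal sketch shows what a Learning With Errors instance looks like. The parameters are toy values chosen for illustration and are unrelated to those required by the construction.

```python
# Minimal sketch of an LWE instance: distinguishing (A, b) from uniform
# (or recovering s) is the hard problem the construction relies on.
# Toy parameters only, not the paper's parameters.
import numpy as np

def lwe_sample(n=16, m=32, q=3329, sigma=1.5, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    A = rng.integers(0, q, size=(m, n))                      # public random matrix
    s = rng.integers(0, q, size=n)                           # secret vector
    e = np.rint(rng.normal(0, sigma, size=m)).astype(int)    # small Gaussian error
    b = (A @ s + e) % q                                      # noisy inner products
    return A, b, s
```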

Relevance: 30.00%

Abstract:

This project develops and evaluates a model of curriculum design that aims to assist student learning of foundational disciplinary ‘Threshold Concepts’. The project uses phenomenographic action research, cross-institutional peer collaboration and the Variation Theory of Learning to develop and trial the model. Two contrasting disciplines (Physics and Law) and four institutions (two research-intensive and two universities of technology) were involved in the project, to ensure broad applicability of the model across different disciplines and contexts. The Threshold Concepts selected for curriculum design attention were measurement uncertainty in Physics and legal reasoning in Law. Threshold Concepts are key disciplinary concepts that are inherently troublesome, transformative and integrative in nature. Once understood, such concepts transform students’ views of the discipline because they enable students to coherently integrate what were previously seen as unrelated aspects of the subject, providing new ways of thinking about it (Meyer & Land 2003, 2005, 2006; Land et al. 2008). However, the integrative and transformative nature of such threshold concepts makes them inherently difficult for students to learn, and misunderstandings of these concepts are consequently prevalent...

Relevance: 30.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, their threshold determination is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key question in threshold determination is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach by a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling error and approximation error are analysed with simulation data to avoid nuisance biases and unrealistic stochastic model effects. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modeling error. The threshold function method makes fixed failure rate threshold determination feasible for real-time applications.
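A minimal sketch of the approximation step, assuming the standard product formula for the integer bootstrapping success rate: the success rate is computed from the conditional standard deviations and then fed to a rational threshold function. The rational-function coefficients below are placeholders for illustration, not the fitted values from the study.

```python
# Sketch of the approximation procedure: compute the easy-to-calculate IB
# success rate (Teunissen's product formula) and map it to a threshold via
# a rational function. Coefficients are hypothetical placeholders.
import numpy as np
from scipy.stats import norm

def ib_success_rate(cond_std):
    """cond_std: conditional std. devs of the decorrelated ambiguities."""
    cond_std = np.asarray(cond_std)
    return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * cond_std)) - 1.0))

def threshold_function(p_success, a=(1.0, -0.9), b=(1.0, -0.8)):
    """Hypothetical rational model mapping success rate to an FF-difference
    test threshold: (a0 + a1*p) / (b0 + b1*p)."""
    return (a[0] + a[1] * p_success) / (b[0] + b[1] * p_success)

p = ib_success_rate([0.10, 0.15, 0.20])
print(p, threshold_function(p))
```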

Relevance: 30.00%

Abstract:

The Source Monitoring Framework is a promising model of constructive memory, yet it fails because it is connectionist and does not allow content tagging. The Dual-Process Signal Detection Model is an improvement because it reduces mnemic qualia to a single memory signal (or degree of belief), but it still commits itself to non-discrete representation. If ‘tagging’ is taken to mean the assignment of propositional attitudes to aggregates of mnemic characteristics, informed inductively, then a discrete model becomes plausible. A Bayesian model of source monitoring accounts for the continuous variation of inputs and the assignment of prior probabilities to memory content. A modified version of the High-Threshold Dual-Process model is recommended to further source monitoring research.
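As a toy illustration of the Bayesian direction advocated here (not the authors' model), the snippet below applies Bayes' rule to a single continuous memory signal with Gaussian likelihoods for two candidate sources; all parameter values are invented.

```python
# Toy Bayesian source attribution: posterior probability that a memory came
# from source A given one continuous familiarity signal, with Gaussian
# likelihoods as in signal-detection treatments. Parameters are invented.
from scipy.stats import norm

def posterior_source_a(signal, prior_a=0.5, mu_a=1.0, mu_b=0.0, sd=1.0):
    like_a = norm.pdf(signal, mu_a, sd)
    like_b = norm.pdf(signal, mu_b, sd)
    return prior_a * like_a / (prior_a * like_a + (1 - prior_a) * like_b)

print(posterior_source_a(0.8))  # ~0.57: weak evidence the memory came from source A
```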

Relevance: 30.00%

Abstract:

Messenger RNAs (mRNAs) can be repressed and degraded by small non-coding RNA molecules. In this paper, we formulate a coarse-grained Markov-chain description of the post-transcriptional regulation of mRNAs by either small interfering RNAs (siRNAs) or microRNAs (miRNAs). We calculate the probability of an mRNA escaping from its domain before it is repressed by siRNAs/miRNAs via calculation of the mean time to threshold: when the number of bound siRNAs/miRNAs exceeds a certain threshold value, the mRNA is irreversibly repressed. In some cases, the analysis can be reduced to counting certain paths in a reduced Markov model. We obtain explicit expressions when the small RNAs bind irreversibly to the mRNA, and we also discuss the reversible binding case. We apply our models to the study of RNA interference in the nucleus, examining the probability of mRNAs escaping via small nuclear pores before being degraded by siRNAs. Using the same modelling framework, we further investigate the effect of small decoy RNAs (decoys) on the process of post-transcriptional regulation by studying regulation of the tumor suppressor gene PTEN: decoys are able to block binding sites on PTEN mRNAs, thereby reducing the number of sites available to siRNAs/miRNAs and helping to protect them from repression. We calculate the probability of a cytoplasmic PTEN mRNA translocating to the endoplasmic reticulum before being repressed by miRNAs. We support our results with stochastic simulations.
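Under the simplifying assumptions of a single constant escape rate and a constant irreversible binding rate, the escape-before-threshold probability has a closed form; the sketch below compares it with a Monte Carlo estimate. The rates and threshold are illustrative, not values from the paper.

```python
# Minimal sketch of the escape-vs-repression race: an mRNA escapes (rate k_esc)
# while siRNAs/miRNAs bind irreversibly (rate k_bind); it is repressed once N
# are bound. Rates and N are illustrative only.
import numpy as np

def p_escape_analytic(k_bind, k_esc, N):
    # Memoryless races: the mRNA must avoid losing N binding-vs-escape races in a row.
    return 1.0 - (k_bind / (k_bind + k_esc)) ** N

def p_escape_mc(k_bind, k_esc, N, trials=100_000, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    bind_times = rng.exponential(1.0 / k_bind, size=(trials, N)).cumsum(axis=1)
    escape_times = rng.exponential(1.0 / k_esc, size=trials)
    return float(np.mean(escape_times < bind_times[:, -1]))

print(p_escape_analytic(k_bind=2.0, k_esc=0.5, N=5))   # ~0.672
print(p_escape_mc(2.0, 0.5, 5))                        # should agree closely
```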

Relevance: 30.00%

Abstract:

Species distribution modelling (SDM) typically analyses species’ presence together with some form of absence information. Ideally absences comprise observations or are inferred from comprehensive sampling. When such information is not available, then pseudo-absences are often generated from the background locations within the study region of interest containing the presences, or else absence is implied through the comparison of presences to the whole study region, e.g. as is the case in Maximum Entropy (MaxEnt) or Poisson point process modelling. However, the choice of which absence information to include can be both challenging and highly influential on SDM predictions (e.g. Oksanen and Minchin, 2002). In practice, the use of pseudo- or implied absences often leads to an imbalance where absences far outnumber presences. This leaves analysis highly susceptible to ‘naughty-noughts’: absences that occur beyond the envelope of the species, which can exert strong influence on the model and its predictions (Austin and Meyers, 1996). Also known as ‘excess zeros’, naughty noughts can be estimated via an overall proportion in simple hurdle or mixture models (Martin et al., 2005). However, absences, especially those that occur beyond the species envelope, can often be more diverse than presences. Here we consider an extension to excess zero models. The two-staged approach first exploits the compartmentalisation provided by classification trees (CTs) (as in O’Leary, 2008) to identify multiple sources of naughty noughts and simultaneously delineate several species envelopes. Then SDMs can be fit separately within each envelope, and for this stage, we examine both CTs (as in Falk et al., 2014) and the popular MaxEnt (Elith et al., 2006). We introduce a wider range of model performance measures to improve treatment of naughty noughts in SDM. We retain an overall measure of model performance, the area under the curve (AUC) of the Receiver-Operating Curve (ROC), but focus on its constituent measures of false negative rate (FNR) and false positive rate (FPR), and how these relate to the threshold in the predicted probability of presence that delimits predicted presence from absence. We also propose error rates more relevant to users of predictions: false omission rate (FOR), the chance that a predicted absence corresponds to (and hence wastes) an observed presence, and the false discovery rate (FDR), reflecting those predicted (or potential) presences that correspond to absence. A high FDR may be desirable since it could help target future search efforts, whereas zero or low FOR is desirable since it indicates none of the (often valuable) presences have been ignored in the SDM. For illustration, we chose Bradypus variegatus, a species that has previously been published as an exemplar species for MaxEnt, proposed by Phillips et al. (2006). We used CTs to increasingly refine the species envelope, starting with the whole study region (E0), eliminating more and more potential naughty noughts (E1–E3). When combined with an SDM fit within the species envelope, the best CT SDM had similar AUC and FPR to the best MaxEnt SDM, but otherwise performed better. The FNR and FOR were greatly reduced, suggesting that CTs handle absences better. Interestingly, MaxEnt predictions showed low discriminatory performance, with the most common predicted probability of presence being in the same range (0.00-0.20) for both true absences and presences. 
In summary, this example shows that SDMs can be improved by introducing an initial hurdle to identify naughty noughts and partition the envelope before applying SDMs. This improvement was barely detectable via AUC and FPR, yet clearly visible in FOR, FNR, and the comparison of the predicted probability-of-presence distributions for presences and absences.
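The error rates discussed above follow directly from a 2x2 confusion matrix of predicted versus observed presence; the short sketch below computes them for hypothetical counts.

```python
# Confusion-matrix error rates used above; counts are hypothetical.
def sdm_error_rates(tp, fp, fn, tn):
    return {
        "FPR": fp / (fp + tn),   # false positive rate
        "FNR": fn / (fn + tp),   # false negative rate
        "FDR": fp / (fp + tp),   # predicted presences that are actually absences
        "FOR": fn / (fn + tn),   # predicted absences that waste an observed presence
    }

print(sdm_error_rates(tp=80, fp=40, fn=5, tn=875))
# e.g. FOR = 5/880 ~ 0.006 (few presences ignored), FDR = 40/120 ~ 0.33
```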

Relevance: 30.00%

Abstract:

Objectives: To determine the cost-effectiveness of the MobileMums intervention. MobileMums is a 12-week programme which assists mothers with young children to be more physically active, primarily through the use of personalised SMS text-messages.

Design: A cost-effectiveness analysis using a Markov model to estimate and compare the costs and consequences of MobileMums and usual care.

Setting: This study considers the cost-effectiveness of MobileMums in Queensland, Australia.

Participants: A hypothetical cohort of over 36 000 women with a child under 1 year old is considered. These women are expected to be eligible and willing to participate in the intervention in Queensland, Australia.

Data sources: The model was informed by the effectiveness results from a 9-month two-arm community-based randomised controlled trial undertaken in 2011 and registered retrospectively with the Australian Clinical Trials Registry (ACTRN12611000481976). Baseline characteristics for the model cohort, treatment effects and resource utilisation were all informed by this trial.

Main outcome measures: The incremental cost per quality-adjusted life year (QALY) of MobileMums compared with usual care.

Results: The intervention is estimated to lead to an increase of 131 QALYs for an additional cost to the health system of 1.1 million Australian dollars (AUD). The expected incremental cost-effectiveness ratio for MobileMums is 8608 AUD per QALY gained. MobileMums has a 98% probability of being cost-effective at a cost-effectiveness threshold of 64 000 AUD. Varying modelling assumptions has little effect on this result.

Conclusions: At a cost-effectiveness threshold of 64 000 AUD, MobileMums would likely be a cost-effective use of healthcare resources in Queensland, Australia.

Trial registration number: Australian Clinical Trials Registry; ACTRN12611000481976.
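As a back-of-envelope check of the reported result (the abstract's figures are rounded, so agreement is only approximate), the incremental cost-effectiveness ratio is simply the incremental cost divided by the incremental QALYs:

```python
# Back-of-envelope ICER check; the abstract's cost figure is rounded, so the
# result only approximately matches the reported 8608 AUD/QALY.
incremental_cost_aud = 1_100_000   # ~1.1 million AUD
incremental_qalys = 131
icer = incremental_cost_aud / incremental_qalys
print(round(icer))                 # ~8397 AUD/QALY, close to the reported 8608
print(icer < 64_000)               # True: well under the 64 000 AUD/QALY threshold
```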

Relevance: 30.00%

Abstract:

The pion photoproduction processes ^14N_g.s.(γ, π+)^14C and ^14N_g.s.(γ, π−)^14O have been studied in the threshold region. These processes provide an excellent tool to study the corrections to soft-pion theorems and the Kroll-Ruderman limit as applied to nuclear processes. The agreement with the available experimental data for these processes is better with the empirical wave functions, while the shell-model wave functions predict a much higher value. Detailed experimental studies of these reactions at threshold are, it is shown, expected to lead to a better understanding of the shell-model inputs and the radial distributions in the 1p state.

Relevance: 30.00%

Abstract:

Aflatoxin is a potent carcinogen produced by Aspergillus flavus, which frequently contaminates maize (Zea mays L.) in the field between 40° north and 40° south latitudes. A mechanistic model to predict the risk of pre-harvest contamination could assist in the management of this very harmful mycotoxin. In this study we describe an aflatoxin risk prediction model which is integrated with the Agricultural Production Systems Simulator (APSIM) modelling framework. The model computes a temperature function for A. flavus growth and aflatoxin production using a set of three cardinal temperatures determined in the laboratory using culture medium and intact grains. These cardinal temperatures were 11.5 °C as base, 32.5 °C as optimum and 42.5 °C as maximum. The model used a low (≤0.2) crop water supply-to-demand ratio, an index of drought during the grain-filling stage, to simulate the maize crop's susceptibility to A. flavus growth and aflatoxin production. When this low threshold of the index was reached, the model converted the temperature function into an aflatoxin risk index (ARI) to represent the risk of aflatoxin contamination. The model was applied to simulate ARI for two commercial maize hybrids, H513 and H614D, grown in five multi-location field trials in Kenya using site-specific agronomy, weather and soil parameters. The observed mean aflatoxin contamination in these trials varied from <1 to 7143 ppb. ARI simulated by the model explained 99% of the variation (p ≤ 0.001) in a linear relationship with the mean observed aflatoxin contamination. The strong relationship between ARI and aflatoxin contamination suggests that the model could be applied to map risk-prone areas and to monitor in-season risk for genotypes and soils parameterized for APSIM.
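The sketch below is a hedged illustration of the two ingredients described above: a generic piecewise-linear cardinal-temperature response using the reported cardinal temperatures, gated by the reported water supply-to-demand threshold. The functional forms are assumptions; the APSIM-integrated model may differ.

```python
# Illustrative sketch only: generic cardinal-temperature response with the
# reported cardinal temperatures, accruing risk only when the crop is
# drought-stressed (water supply/demand ratio <= 0.2) during grain filling.
def temperature_function(t, t_base=11.5, t_opt=32.5, t_max=42.5):
    if t <= t_base or t >= t_max:
        return 0.0
    if t <= t_opt:
        return (t - t_base) / (t_opt - t_base)
    return (t_max - t) / (t_max - t_opt)

def daily_aflatoxin_risk(temp_c, water_supply_demand_ratio, threshold=0.2):
    if water_supply_demand_ratio > threshold:
        return 0.0
    return temperature_function(temp_c)

# Hypothetical daily (temperature, water ratio) pairs accumulated into an ARI-like index.
ari = sum(daily_aflatoxin_risk(t, w) for t, w in [(30.0, 0.15), (35.0, 0.10), (25.0, 0.50)])
print(ari)
```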

Relevance: 30.00%

Abstract:

A search for new physics using three-lepton (trilepton) data collected with the CDF II detector and corresponding to an integrated luminosity of 976 pb-1 is presented. The standard model predicts a low rate of trilepton events, which makes some supersymmetric processes, such as chargino-neutralino production, measurable in this channel. The mu+mu+l signature is investigated, where l is an electron or a muon, with the additional requirement of large missing transverse energy. In this analysis, the lepton transverse momenta with respect to the beam direction (pT) are as low as 5 GeV/c, a selection that improves the sensitivity both to light particles and to those that produce leptonically decaying tau leptons. At the same time, this low-pT selection presents additional challenges due to the non-negligible heavy-quark background at low lepton momenta. This background is measured with an innovative technique using experimental data. Several dimuon and trilepton control regions are investigated, and good agreement between experimental results and standard-model predictions is observed. In the signal region, we observe one three-muon event and expect 0.4 ± 0.1 mu+mu+l events.
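A simple Poisson cross-check (ignoring the quoted uncertainty on the expectation) shows that observing one event when 0.4 are expected is unremarkable:

```python
# Probability of seeing at least one event for a Poisson expectation of 0.4,
# ignoring the quoted +/-0.1 uncertainty.
import math
mu = 0.4
p_at_least_one = 1.0 - math.exp(-mu)
print(round(p_at_least_one, 2))   # ~0.33
```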

Relevance: 30.00%

Abstract:

A vast literature documents negative skewness and excess kurtosis in stock return distributions on several markets. We approach the issue of negative skewness from a different angle than previous studies by suggesting a model, which we denote the “negative news threshold” hypothesis, that builds on asymmetrically distributed information and symmetric market responses. Our empirical tests reveal that returns for days on which non-scheduled news is disclosed are the source of negative skewness in stock returns. This finding lends solid support to our model and suggests that negative skewness in stock returns is induced by asymmetries in the news disclosure policies of firm management.
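A hedged sketch of the empirical comparison described above: sample skewness of daily returns split by a non-scheduled-news indicator. The data layout and column names are hypothetical, not taken from the study.

```python
# Hypothetical illustration: compare return skewness on news vs. no-news days.
import pandas as pd
from scipy.stats import skew

def skewness_by_news(returns: pd.Series, news_day: pd.Series) -> pd.Series:
    """returns: daily returns; news_day: boolean flag for non-scheduled news days."""
    return returns.groupby(news_day).apply(lambda r: skew(r, bias=False))

# Under the hypothesis, the skewness for news_day == True is markedly negative,
# while the skewness for news_day == False is close to zero.
```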

Relevance: 30.00%

Abstract:

From a detailed re-examination of results in the literature, the effects of microstructural sizes, namely interlamellar spacing, pearlitic colony size and prior austenitic grain size, on the thresholds for fatigue crack growth (ΔKth) and crack closure (Kcl,th) have been illustrated. It is shown that while interlamellar spacing explicitly controls yield strength, a similar effect on ΔKth cannot be expected. On the other hand, the pearlitic colony size is shown to strongly influence ΔKth and Kcl,th through the deflection and retardation of cracks at colony boundaries. Consequently, an increase in ΔKth and Kcl,th with colony size has been found. The development of a theoretical model to illustrate the effects of colony size, shear flow stress in the slip band and macroscopic yield strength on Kcl,th and ΔKth is presented. The model assumes colony boundaries to be potential sites for slip-band pile-up formation and subsequent crack deflection, finally leading to zig-zag crack growth. Using the concepts of roughness-induced crack closure, the magnitude of Kcl,th is quantified as a function of colony size. In deriving the model, the flow stress in the slip band has been considered to represent the work-hardened state of pearlite. Comparison of the theoretically predicted trend with the experimental data demonstrates very good agreement. Further, the intrinsic, or closure-free, component of the fatigue threshold, ΔKeff,th, is found to be insensitive to colony size and interlamellar spacing. Using a criterion for the intrinsic fatigue threshold which considers the attainment of a critical fracture stress over a characteristic distance corresponding to the interlamellar spacing, ΔKth values at high R values can be estimated with reasonable accuracy. The magnitude of ΔKth as a function of colony size is then obtained by summing the average of the experimentally obtained ΔKeff,th values and the predicted Kcl,th values as a function of colony size. Again, very good agreement of the theoretically predicted ΔKth values with those obtained experimentally has been demonstrated.
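A minimal sketch of the final summation step, assuming hypothetical numerical values: the predicted threshold at each colony size is the mean closure-free component plus the colony-size-dependent closure component.

```python
# Sketch of the summation step: Delta K_th(d) = mean(Delta K_eff,th) + K_cl,th(d).
# All numerical values are placeholders, not data from the paper.
import numpy as np

def predicted_delta_k_th(delta_k_eff_th_measurements, k_cl_th_by_colony_size):
    """k_cl_th_by_colony_size: dict mapping colony size (um) to predicted Kcl,th."""
    dk_eff = float(np.mean(delta_k_eff_th_measurements))   # colony-size insensitive
    return {d: dk_eff + k_cl for d, k_cl in k_cl_th_by_colony_size.items()}

example = predicted_delta_k_th([3.0, 3.2, 3.1], {20: 2.5, 50: 4.0, 100: 5.5})
print(example)
```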