908 results for probability and reinforcement proportion
Abstract:
This paper deals with sequences of random variables belonging to a fixed chaos of order q generated by a Poisson random measure on a Polish space. It is investigated whether convergence of the third and fourth moments of such a suitably normalized sequence to the third and fourth moments of a centred Gamma law implies convergence in distribution of the involved random variables. A positive answer is obtained for q = 2 and q = 4. The proof of this four moments theorem is based on a number of new estimates for contraction norms. Applications concern homogeneous sums and U-statistics on the Poisson space.
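For orientation, the moments in question can be written out under the parametrization of the centred Gamma target that is standard in this literature (stated here as an assumption, since the abstract does not fix notation): F(ν) = 2G(ν/2) − ν, where G(a) is a Gamma(a, 1) random variable.

```latex
% Moments of the centred Gamma target F(\nu) = 2G(\nu/2) - \nu, with G(a) \sim \mathrm{Gamma}(a,1)
\mathbb{E}\left[F(\nu)^{2}\right] = 2\nu, \qquad
\mathbb{E}\left[F(\nu)^{3}\right] = 8\nu, \qquad
\mathbb{E}\left[F(\nu)^{4}\right] = 12\nu^{2} + 48\nu .
```

In this notation, the question addressed is whether a normalized sequence in a fixed Poisson chaos whose third and fourth moments converge to 8ν and 12ν² + 48ν must converge in distribution to F(ν); the abstract reports a positive answer for chaoses of order q = 2 and q = 4.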
Abstract:
Limited availability of P in soils to crops may be due to deficiency and/or severe P retention. Earlier studies that drew on large soil profile databases have indicated that it is not (yet) feasible to present meaningful values for "plant-available" soil P, obtained according to comparable analytical methods, that may be linked to soil geographical databases derived from the 1:5 million scale FAO Digital Soil Map of the World, such as the 5 x 5 arc-minute version of the ISRIC-WISE database. Therefore, an alternative solution for studying possible crop responses to fertilizer-P applied to soils, at a broad scale, was sought. The approach described in this report considers the inherent capacity of soils to retain phosphorus (P retention), in various forms. The main controlling factors of P retention processes, at the broad scale under consideration, are considered to be pH, soil mineralogy, and clay content. First, derived values for these properties were used to rate the inferred capacity for P retention of the component soil units of each map unit (or grid cell) using four classes (i.e., Low, Moderate, High, and Very High). Subsequently, the overall soil phosphorus retention potential was assessed for each mapping unit, taking into account the P-ratings and relative proportion of each component soil unit. Each P retention class has been assigned a likely fertilizer P recovery fraction, derived from the literature, thereby permitting spatially more detailed, integrated model-based studies of environmental sustainability and agricultural production at the global and continental level (< 1:5 million). Nonetheless, uncertainties remain high; the present analysis provides an approximation of world soil phosphorus retention potential.
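As a purely illustrative sketch of the aggregation step just described, the recovery fraction for a map unit can be computed as a proportion-weighted combination over its component soil units; the class-to-recovery-fraction mapping and the example data below are hypothetical placeholders, not the literature-derived values used in the report.

```python
# Hypothetical illustration of proportion-weighted aggregation over component soil units.
# Recovery fractions per P retention class are placeholders, not the report's values.
RECOVERY_FRACTION = {
    "Low": 0.25,
    "Moderate": 0.20,
    "High": 0.15,
    "Very High": 0.10,
}

def weighted_recovery(components):
    """components: list of (retention_class, proportion) pairs for one map unit."""
    total = sum(p for _, p in components)
    return sum(RECOVERY_FRACTION[cls] * p for cls, p in components) / total

# Example map unit: 60% of the cell rated High, 40% rated Moderate.
print(weighted_recovery([("High", 0.6), ("Moderate", 0.4)]))  # -> 0.17
```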
Abstract:
This study investigates the expression of epistemic modality in a corpus of Ghanaian Pidgin English (GhaPE). The epistemic expressions are manually identified and thereafter distinguished from each other in terms of grammatical status and their indication of different epistemic and evidential notions. Seven different elements are found: one pre-verbal marker, one adverb, two particles and three complement-taking predicates. The results indicate, in line with existing research, that to differentiate between usage properties of individual modal expressions it may be necessary to subdivide them in terms of not only epistemic but also evidential meanings. Moreover, a functional parallel between the GhaPE particle abi, the Swedish modal particle väl and the Spanish adverbs a lo mejor and igual is demonstrated, with respect to their simultaneous function of expressing epistemic probability and asking the hearer for confirmation. Finally, the results suggest, contrary to previous accounts, that the pre-verbal marker fit may indicate epistemic possibility without the addition of a preceding irrealis marker go. It is proposed that future researchers should make use of larger corpora in order to arrive at a more complete picture of both individual modal categories and their interrelations.
Finite mixture regression model with random effects: application to neonatal hospital length of stay
Abstract:
A two-component mixture regression model that allows simultaneously for heterogeneity and dependency among observations is proposed. By specifying random effects explicitly in the linear predictor of the mixture probability and the mixture components, parameter estimation is achieved by maximising the corresponding best linear unbiased prediction type log-likelihood. Approximate residual maximum likelihood estimates are obtained via an EM algorithm in the manner of a generalised linear mixed model (GLMM). The method can be extended to a g-component mixture regression model with component densities from the exponential family, leading to the development of the class of finite mixture GLMMs. For illustration, the method is applied to analyse neonatal length of stay (LOS). It is shown that identification of pertinent factors that influence hospital LOS can provide important information for health care planning and resource allocation. (C) 2002 Elsevier Science B.V. All rights reserved.
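For intuition, a minimal sketch of the fixed-effects core of this idea, a two-component mixture of linear regressions fitted by EM, is given below; it omits the random effects, the BLUP-type log-likelihood and the REML refinements that are the paper's actual contribution, and all names are illustrative.

```python
import numpy as np

def em_mixture_regression(X, y, n_iter=100):
    """Two-component Gaussian mixture of linear regressions fitted by EM.
    Simplified sketch: no random effects, fixed number of iterations."""
    n, p = X.shape
    rng = np.random.default_rng(0)
    beta = rng.normal(size=(2, p))              # component regression coefficients
    sigma2 = np.array([np.var(y), np.var(y)])   # component residual variances
    pi = np.array([0.5, 0.5])                   # mixing proportions

    for _ in range(n_iter):
        # E-step: posterior probability that each observation belongs to each component
        dens = np.empty((n, 2))
        for k in range(2):
            resid = y - X @ beta[k]
            dens[:, k] = pi[k] * np.exp(-0.5 * resid**2 / sigma2[k]) / np.sqrt(2 * np.pi * sigma2[k])
        w = dens / dens.sum(axis=1, keepdims=True)

        # M-step: weighted least squares per component, then update variances and weights
        for k in range(2):
            W = w[:, k]
            beta[k] = np.linalg.solve((X * W[:, None]).T @ X, (X * W[:, None]).T @ y)
            resid = y - X @ beta[k]
            sigma2[k] = (W * resid**2).sum() / W.sum()
        pi = w.mean(axis=0)
    return beta, sigma2, pi
```

The paper's extension places random effects in both the linear predictor of the mixing probability and the component means; the sketch above only shows the skeleton those random effects are added to.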
Abstract:
The number of known mRNA transcripts in the mouse has been greatly expanded by the RIKEN Mouse Gene Encyclopedia project. Validation of their reproducible expression in a tissue is an important contribution to the study of functional genomics. In this report, we determine the expression profile of 57,931 clones on 20 mouse tissues using cDNA microarrays. Of these 57,931 clones, 22,928 clones correspond to the FANTOM2 clone set. The set represents 20,234 transcriptional units (TUs) out of 33,409 TUs in the FANTOM2 set. We identified 7206 separate clones that satisfied stringent criteria for tissue-specific expression. Gene Ontology terms were assigned for these 7206 clones, and the proportion of 'molecular function' ontology for each tissue-specific clone was examined. These data will provide insights into the function of each tissue. Tissue-specific gene expression profiles obtained using our cDNA microarrays were also compared with the data extracted from the GNF Expression Atlas based on Affymetrix microarrays. One major outcome of the RIKEN transcriptome analysis is the identification of numerous nonprotein-coding mRNAs. The expression profile was also used to obtain evidence of expression for putative noncoding RNAs. In addition, 1926 (70%) of the 2768 clones categorized as unknown ESTs and 1969 (58%) of the 3388 clones categorized as unclassifiable were also shown to be reproducibly expressed.
Abstract:
Queensland fruit fly, Bactrocera (Dacus) tryoni (QFF), is arguably the most costly horticultural insect pest in Australia. Despite this, no model is available to describe its population dynamics and aid in its management. This paper describes a cohort-based model of the population dynamics of the Queensland fruit fly. The model is primarily driven by weather variables, and so can be used at any location where appropriate meteorological data are available. In the model, the life cycle is divided into a number of discrete stages to allow physiological processes to be defined as accurately as possible. Eggs develop and hatch into larvae, which develop into pupae, which emerge as either teneral females or males. Both females and males can enter reproductive and over-wintering life stages, and there is a trapped male life stage to allow model predictions to be compared with trap catch data. All development rates are temperature-dependent. Daily mortality rates are temperature-dependent, but may also be influenced by moisture, density of larvae in fruit, fruit suitability, and age. Eggs, larvae and pupae all have constant establishment mortalities, causing a defined proportion of individuals to die upon entering that life stage. Transfer from one immature stage to the next is based on physiological age. In the adult life stages, transfer between stages may require additional and/or alternative functions. Maximum fecundity is 1400 eggs per female, and the maximum oviposition rate is 80 eggs per female per day. The actual number of eggs laid by a female on any given day is restricted by temperature, density of larvae in fruit, suitability of fruit for oviposition, and female activity. Activity of reproductive females and males, which affects reproduction and trapping, decreases with rainfall. Trapping of reproductive males is determined by activity, temperature and the proportion of males in the active population. Limitations of the model are discussed. Despite these, the model provides a useful agreement with trap catch data, and allows key areas for future research to be identified. These critical gaps in the current state of knowledge exist despite over 50 years of research on this key pest. By explicitly attempting to model the population dynamics of this pest we have clearly identified the research areas that must be addressed before progress can be made in developing the model into an operational tool for the management of Queensland fruit fly. (C) 2003 Published by Elsevier B.V.
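One conventional way to implement transfer "based on physiological age" is degree-day accumulation toward a stage-specific requirement; the sketch below is a generic illustration with invented parameter values, not the fitted QFF development functions used in the paper.

```python
# Illustrative degree-day bookkeeping for stage transfer; the base temperature and
# stage requirements below are placeholders, not the fitted QFF parameters.
BASE_TEMP = 12.0                                                   # assumed lower development threshold (deg C)
STAGE_DEGREE_DAYS = {"egg": 30.0, "larva": 110.0, "pupa": 160.0}   # hypothetical requirements

def physiological_age(daily_mean_temps, stage):
    """Fraction of the stage completed after the given run of daily mean temperatures."""
    dd = sum(max(t - BASE_TEMP, 0.0) for t in daily_mean_temps)
    return dd / STAGE_DEGREE_DAYS[stage]

# A cohort of eggs transfers to the larval stage once its physiological age reaches 1.
print(physiological_age([18, 20, 22, 25], "egg"))  # (6+8+10+13)/30 = 1.23 -> transferred
```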
Abstract:
Background Mental health survey data are now being used proactively to decide how the burden of disease might best be reduced. Aims To study the cost-effectiveness of current and optimal treatments for mental disorders and the proportion of burden avertable by each. Method Data for three affective, four anxiety and two alcohol use disorders and for schizophrenia were compared in terms of cost, burden averted and efficiency of current and optimal treatment. We then calculated the burden unavertable given current knowledge. The unit of health gain was a reduction in the years lived with disability (YLDs). Results Summing across all disorders, current treatment averted 13% of the burden, at an average cost of AUS$30 000 per YLD gained. Optimal treatment at current coverage could avert 20% of the burden, at an average cost of AUS$18 000 per YLD gained. Optimal treatment at optimal coverage could avert 28% of the burden, at AUS$16 000 per YLD gained. Sixty per cent of the burden of mental disorders was deemed to be unavertable. Conclusions The efficiency of treatment varied more than tenfold across disorders. Although coverage of some of the more efficient treatments should be extended, other factors justify continued use of less-efficient treatments for some disorders. Declaration of interest None. Funding detailed in Acknowledgements.
Abstract:
Eastern curlews Numenius madagascariensis spending the nonbreeding season in eastern Australia foraged on three intertidal decapods: soldier crab Mictyris longicarpus, sentinel crab Macrophthalmus crassipes and ghost-shrimp Trypaea australiensis. Due to their ecology, these crustaceans were spatially segregated (i.e. distributed in 'patches') and the curlews intermittently consumed more than one prey type. It was predicted that if the curlews behaved as intake rate maximizers, the time spent foraging on a particular prey (patch) would reflect relative availabilities of the prey types and thus prey-specific intake rates would be equal. During the mid-nonbreeding period (November-December), Mictyris and Macrophthalmus were primarily consumed and prey-specific intake rates were statistically indistinguishable (8.8 versus 10.1 kJ min⁻¹). Prior to migration (February), Mictyris and Trypaea were hunted and the respective intake rates were significantly different (8.9 versus 2.3 kJ min⁻¹). Time allocation to Trypaea-hunting was independent of the availability of Mictyris. Thus, consumption of Trypaea depressed the overall intake rate. Six hypotheses for consuming Trypaea before migration were examined. Five hypotheses (possible error by the predator, prey specialization, observer overestimation of time spent hunting Trypaea, supplementary prey, and the choice of higher quality prey due to a digestive bottleneck) were deemed unsatisfactory. The explanation deemed plausible for consumption of a low intake-rate but high-quality prey (Trypaea) was diet optimisation by the curlews in response to the pre-migratory modulation (decrease in size/processing capacity) of their digestive system. With a seasonal decrease in the average intake rate, the estimated intake per low tide increased from 1233 to 1508 kJ between the mid-nonbreeding and pre-migratory periods, through an increase in the overall time spent on the sandflats and in the proportion of time spent foraging.
Abstract:
An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local false discovery rate is provided for each gene, and it can be implemented so that the implied global false discovery rate is bounded as with the Benjamini-Hochberg methodology based on tail areas. The latter procedure is too conservative, unless it is modified according to the prior probability that a gene is not differentially expressed. An attractive feature of the mixture model approach is that it provides a framework for the estimation of this probability and its subsequent use in forming a decision rule. The rule can also be formed to take the false negative rate into account.
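A minimal sketch of the local false discovery rate under a two-component mixture, written here for z-scores with a theoretical N(0,1) null and a single Gaussian non-null component (an illustrative simplification; in the setting described above, the mixture components and the null proportion are estimated from the data, for example by EM):

```python
import numpy as np
from scipy.stats import norm

def local_fdr(z, pi0, mu1, sigma1):
    """Local FDR under the two-component mixture f(z) = pi0*N(0,1) + (1-pi0)*N(mu1, sigma1^2).
    pi0, mu1, sigma1 would normally be estimated; here they are supplied for illustration."""
    f0 = norm.pdf(z)                         # theoretical null density
    f1 = norm.pdf(z, loc=mu1, scale=sigma1)  # non-null (differentially expressed) density
    f = pi0 * f0 + (1 - pi0) * f1            # marginal mixture density
    return np.clip(pi0 * f0 / f, 0, 1)

z = np.array([0.3, 1.8, 3.5])
print(local_fdr(z, pi0=0.9, mu1=2.5, sigma1=1.0))
```

Genes with local FDR below a chosen threshold are declared differentially expressed; tying that threshold to the estimated proportion of non-differentially-expressed genes is what lets the implied global false discovery rate be bounded, which is the comparison with the Benjamini-Hochberg tail-area procedure drawn in the abstract.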
Abstract:
Background: Trials have shown that mammography screening reduces mortality and probably decreases morbidity related to breast cancer. Methods: We assessed whether the major mammography service in Western Australia (BreastScreen WA) is likely to reduce mortality by comparing prognostic variables between screen-detected and other cases of breast cancer diagnosed in 1999. We assessed likely reductions in morbidity by comparing treatments received by these two groups. To confirm mortality and morbidity reduction, we also compared prognostic variables and treatments with targets. Information on demographic variables, tumour characteristics at presentation and treatments was collected from medical records for all incident cases of breast cancer in Western Australia in 1999. We matched cases with the Western Australian Cancer Registry records to determine which cases had been detected by BreastScreen WA. Results: BreastScreen WA achieved the targets for mortality reduction. Tumours detected by BreastScreen WA were smaller in size, less likely to have vascular invasion, of lower histological grade, and more likely to be ductal carcinoma in situ alone without invasive carcinoma. Oestrogen receptor status was more likely to be positive, the difference in progesterone status was not significant, and lymph node involvement tended to be lower. BreastScreen WA patients were treated more often with local therapy and less often with systemic therapy, and the proportion of patients treated with breast-conserving surgery was close to the target for minimizing morbidity in breast cancer. Conclusion: Mammographic detection of breast cancer by BreastScreen WA is associated with reduced breast cancer morbidity and a more favourable prognosis.
Abstract:
Objective: Inpatient length of stay (LOS) is an important measure of hospital activity, health care resource consumption, and patient acuity. This research work aims at developing an incremental expectation maximization (EM) based learning approach on a mixture of experts (ME) system for on-line prediction of LOS. The use of a batch-mode learning process in most existing artificial neural networks to predict LOS is unrealistic, as the data become available over time and their pattern changes dynamically. In contrast, an on-line process is capable of providing an output whenever a new datum becomes available. This on-the-spot information is therefore more useful and practical for making decisions, especially when one deals with a tremendous amount of data. Methods and material: The proposed approach is illustrated using a real example of gastroenteritis LOS data. The data set was extracted from a retrospective cohort study on all infants born in 1995-1997 and their subsequent admissions for gastroenteritis. The total number of admissions in this data set was n = 692. Linked hospitalization records of the cohort were retrieved retrospectively to derive the outcome measure, patient demographics, and associated co-morbidities information. A comparative study of the incremental learning and the batch-mode learning algorithms is considered. The performances of the learning algorithms are compared based on the mean absolute difference (MAD) between the predictions and the actual LOS, and the proportion of predictions with MAD < 1 day (Prop(MAD < 1)). The significance of the comparison is assessed through a regression analysis. Results: The incremental learning algorithm provides better on-line prediction of LOS when the system has gained sufficient training from more examples (MAD = 1.77 days and Prop(MAD < 1) = 54.3%), compared to that using the batch-mode learning. The regression analysis indicates a significant decrease of MAD (p-value = 0.063) and a significant (p-value = 0.044) increase of Prop(MAD < 1).
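The two performance measures quoted above are simple to compute from a vector of predictions and observed stays; a minimal sketch with illustrative numbers:

```python
import numpy as np

def los_metrics(predicted, actual):
    """Mean absolute difference (days) and proportion of predictions within 1 day of the actual LOS."""
    abs_diff = np.abs(np.asarray(predicted, dtype=float) - np.asarray(actual, dtype=float))
    return abs_diff.mean(), (abs_diff < 1).mean()

mad, prop_within_1 = los_metrics([2.5, 3.0, 6.0], [2.0, 4.5, 5.8])
print(mad, prop_within_1)  # 0.733..., 0.666...
```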
Abstract:
Little is known about the quality of the images transmitted in email telemedicine systems. The present study was designed to survey the quality of images transmitted in the Swinfen Charitable Trust email referral system. Telemedicine cases were examined for a 3 month period in 2002 and a 3 month period in 2006. The number of cases with images attached increased from 8 (38%) to 37 (53%). There were four types of images (clinical photographs, microscope pictures, notes and X-ray images) and the proportion of radiology images increased from 27% to 48%. The cases in 2002 came from four different hospitals and were associated with seven different clinical specialties. In 2006, the cases came from 19 different hospitals and 20 different specialties. The 46 cases (from both study periods) had a total of 159 attached images. The quality of the images was assessed by awarding each image a score in four categories: focus, anatomical perspective, composition and lighting. The images were scored on a five-point scale (1 = very poor to 5 = very good) by a qualified medical photographer. In comparing image quality between the two study periods, there was some evidence that the quality had reduced, although the average size of the attached images had increased. The median score for all images in 2002 was 16 (interquartile range 14-19) and the median score in 2006 was 15 (13-16). The difference was significant (P < 0.001, Mann-Whitney test).
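The between-period comparison of total image scores amounts to a rank-based test on two samples; a minimal sketch, with placeholder score vectors rather than the study data:

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Placeholder total image scores (sum of the four category scores), not the study data.
scores_2002 = np.array([16, 14, 19, 17, 15, 18])
scores_2006 = np.array([15, 13, 16, 14, 15, 12])

u_stat, p_value = mannwhitneyu(scores_2002, scores_2006, alternative="two-sided")
print(np.median(scores_2002), np.median(scores_2006), p_value)
```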
Abstract:
The Operator Choice Model (OCM) was developed to model the behaviour of operators attending to complex tasks involving interdependent concurrent activities, such as in Air Traffic Control (ATC). The purpose of the OCM is to provide a flexible framework for modelling and simulation that can be used for quantitative analyses in human reliability assessment, comparison between human computer interaction (HCI) designs, and analysis of operator workload. The OCM virtual operator is essentially a cycle of four processes: Scan, Classify, Decide Action, Perform Action. Once a cycle is complete, the operator will return to the Scan process. It is also possible to truncate a cycle and return to Scan after each of the processes. These processes are described using Continuous Time Probabilistic Automata (CTPA). The details of the probability and timing models are specific to the domain of application, and need to be specified in consultation with domain experts. We are building an application of the OCM for use in ATC. In order to develop a realistic model we are calibrating the probability and timing models that comprise each process using experimental data from a series of experiments conducted with student subjects. These experiments have identified the factors that influence perception and decision making in simplified conflict detection and resolution tasks. This paper presents an application of the OCM approach to a simple ATC conflict detection experiment. The aim is to calibrate the OCM so that its behaviour resembles that of the experimental subjects when it is challenged with the same task. Its behaviour should also interpolate when challenged with scenarios similar to those used to calibrate it. The approach illustrated here uses logistic regression to model the classifications made by the subjects. This model is fitted to the calibration data, and provides an extrapolation to classifications in scenarios outside of the calibration data. A simple strategy is used to calibrate the timing component of the model, and the results for reaction times are compared between the OCM and the student subjects. While this approach to timing does not capture the full complexity of the reaction time distribution seen in the data from the student subjects, the mean and the tail of the distributions are similar.
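A minimal sketch of the classification-calibration step described above, with hypothetical scenario features standing in for the predictors actually derived from the conflict detection experiments:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical calibration data: one row per presented scenario, with illustrative
# features (e.g. predicted miss distance in NM, time to closest approach in s);
# y is the subject's conflict (1) / no-conflict (0) call.
X = np.array([[0.5, 120], [4.0, 300], [1.0, 90], [6.0, 400], [0.8, 60], [5.5, 350]])
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# The fitted model would then drive the OCM's Classify process, giving the probability
# that the virtual operator flags a new scenario as a conflict.
print(model.predict_proba([[2.0, 150]])[0, 1])
```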
Abstract:
This paper presents a novel method for enabling a robot to determine the direction to a sound source through interacting with its environment. The method uses a new neural network, the Parameter-Less Self-Organizing Map algorithm, and reinforcement learning to achieve rapid and accurate response.
Abstract:
Introduction: The Bolton analysis, which quantifies tooth size, is an important reference for professionals seeking adequate orthodontic finishing. Objective: The objective of this study is to verify whether, in the selected sample, individuals with natural normal occlusion and with Angle Class I and Class II division 1 malocclusions differ from the values found by Bolton, and also to verify whether there is sexual dimorphism. Methods: Three groups, each containing 35 pairs of plaster casts with permanent dentition, separated by occlusion type and belonging to the collection of the graduate program in Orthodontics at the Universidade Metodista de São Paulo, were measured with a digital caliper at the greatest mesiodistal width of each tooth from the right first molar to the left first molar, in the upper and lower arches. The values were tabulated and the Bolton ratio was applied. Results: For groups 1, 2 and 3 respectively, the overall ratio found was 90.36 (SD ±1.70), 91.17 (SD ±2.58) and 90.76 (SD ±2.45), and the anterior ratio was 77.73 (SD ±2.39), 78.01 (SD ±2.66) and 77.30 (SD ±2.65). Conclusion: there was no sexual dimorphism and no statistically significant difference when comparing the values with those suggested by Bolton.
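For reference, the Bolton analysis applied above computes two tooth-size ratios; the formulas below follow the standard definition (the abstract itself does not reproduce them).

```latex
% Bolton tooth-size ratios (sums of mesiodistal crown widths, in mm)
\text{overall ratio} = \frac{\sum \text{mandibular 12 (first molar to first molar)}}
                            {\sum \text{maxillary 12}} \times 100, \qquad
\text{anterior ratio} = \frac{\sum \text{mandibular 6 (canine to canine)}}
                             {\sum \text{maxillary 6}} \times 100 .
```

Bolton's reference means are approximately 91.3% for the overall ratio and 77.2% for the anterior ratio, which is the standard against which the group means of roughly 90-91% and 77-78% reported above are compared.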