939 results for Dynamic Threshold Algorithm
Abstract:
We incorporate the process of enforcement learning by assuming that the agency's current marginal cost is a decreasing function of its past experience in detecting and convicting. The agency accumulates data and information (on criminals, on opportunities for crime), enhancing its ability to apprehend in the future at a lower marginal cost. We focus on the impact of enforcement learning on optimal stationary compliance rules. In particular, we show that the optimal stationary fine could be less than maximal and the optimal stationary probability of detection could be higher than otherwise.
Abstract:
Mating can affect female immunity in multiple ways. On the one hand, the immune system may be activated by pathogens transmitted during mating, sperm and seminal proteins, or wounds inflicted by males. On the other hand, immune defences may also be down-regulated to reallocate resources to reproduction. Ants are interesting models to study post-mating immune regulation because queens mate early in life, store sperm for many years, and use it until their death many years later, while males typically die after mating. This long-term commitment between queens and their mates limits the opportunity for sexual conflict but raises the new constraint of long-term sperm survival. In this study, we examine experimentally the effect of mating on immunity in wood ant queens. Specifically, we compared the phenoloxidase and antibacterial activities of mated and virgin Formica paralugubris queens. Queens had reduced levels of active phenoloxidase after mating, but elevated antibacterial activity 7 days after mating. These results indicate that the process of mating, dealation and ovary activation triggers dynamic patterns of immune regulation in ant queens that probably reflect functional responses to mating and pathogen exposure that are independent of sexual conflict.
Abstract:
One requirement for psychotherapy research is an accurate assessment of therapeutic interventions across studies. This study compared the frequency and depth of therapist interventions from a dynamic perspective across four studies, conducted in four countries, including three treatment arms of psychodynamic psychotherapy and one each of psychoanalysis and CBT. All studies used the Psychodynamic Intervention Rating Scales (PIRS) to identify 10 interventions from transcribed whole sessions early and later in treatment. The PIRS adequately categorized all interventions, except in CBT (only 91-93% categorized). As hypothesized, interpretations were present in all dynamic therapies and relatively absent in CBT. The proportion of interpretations increased over time. Defense interpretations were more common than transference interpretations, which were most prevalent in psychoanalysis. Depth of interpretations also increased over time. These data can serve as norms for measuring where on the supportive-interpretive continuum a dynamic treatment lies, as well as identify potentially mutative interventions for further process and outcome study.
Abstract:
Mosquito community composition in dynamic landscapes from the Atlantic Forest biome (Diptera, Culicidae). Considering that some species of Culicidae are vectors of pathogens, both the knowledge of the diversity of the mosquito fauna and of how some environmental factors influence it are important subjects. In order to address the composition of Culicidae species in a forest reserve in the southern Atlantic Forest, we compared biotic and abiotic environmental determinants and how they were associated with the occurrence of species between sunset and sunrise. The level of conservation of the area was also considered. The investigation was carried out at Reserva Natural do Morro da Mina, in Antonina, state of Paraná, Brazil. We performed sixteen mosquito collections employing Shannon traps at three-hour intervals, from July 2008 to June 2009. The characterization of the area was determined using ecological indices of diversity, evenness, dominance and similarity. We compared the frequency of specimens with abiotic variables, i.e., temperature, relative humidity and pluviosity. Seven thousand four hundred and ten mosquito females were captured, belonging to 48 species in 12 genera. The most abundant genera were Anopheles, Culex, Coquillettidia, Aedes and Runchomyia. Among the species, the most abundant was Anopheles cruzii, the primary vector of Plasmodium spp. in the Atlantic Forest. Results of the analyses showed that the abiotic variables we tested did not influence the occurrence of species, although certain values suggested that there was an optimum range for the occurrence of culicid species. It was possible to detect the presence of species of Culicidae with different epidemiological profiles and habitat preferences.
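The ecological indices mentioned above have standard textbook definitions; a minimal sketch, assuming simple per-species abundance counts (the abstract does not state which exact index variants were used):

```python
import numpy as np

def diversity_indices(counts):
    """Shannon diversity (H'), Pielou evenness (J'), and Simpson dominance (D)
    from a vector of per-species abundances. Standard formulas only; not the
    study's actual computation."""
    n = np.asarray(counts, float)
    p = n[n > 0] / n.sum()                     # relative abundances
    shannon = -(p * np.log(p)).sum()           # H'
    evenness = shannon / np.log(len(p)) if len(p) > 1 else 0.0   # J'
    dominance = (p ** 2).sum()                 # Simpson's D
    return shannon, evenness, dominance
```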
Identification of optimal structural connectivity using functional connectivity and neural modeling.
Abstract:
The complex network dynamics that arise from the interaction of the brain's structural and functional architectures give rise to mental function. Theoretical models demonstrate that the structure-function relation is maximal when the global network dynamics operate at a critical point of state transition. In the present work, we used a dynamic mean-field neural model to fit empirical structural connectivity (SC) and functional connectivity (FC) data acquired in humans and macaques and developed a new iterative-fitting algorithm to optimize the SC matrix based on the FC matrix. A dramatic improvement of the fitting of the matrices was obtained with the addition of a small number of anatomical links, particularly cross-hemispheric connections, and reweighting of existing connections. We suggest that the notion of a critical working point, where the structure-function interplay is maximal, may provide a new way to link behavior and cognition, and a new perspective to understand recovery of function in clinical conditions.
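A minimal sketch of the iterative-fitting idea described above, using a toy linear stand-in for the dynamic mean-field simulation; all functions, parameters, and the update rule here are illustrative assumptions, not the authors' model:

```python
import numpy as np

def simulate_fc(sc, coupling=0.5):
    # Toy stand-in for the neural-model simulation: functional connectivity of
    # a linear system driven by unit noise, with structural connectivity `sc`.
    n = sc.shape[0]
    a = coupling * sc / max(np.abs(np.linalg.eigvals(sc)).max(), 1e-9)
    m = np.linalg.inv(np.eye(n) - a)
    cov = m @ m.T
    d = np.sqrt(np.diag(cov))
    return cov / np.outer(d, d)

def fit_sc_to_fc(sc0, fc_emp, n_iter=100, lr=0.05):
    # Hypothetical update: strengthen (or add) links where simulated FC
    # undershoots empirical FC, weaken them where it overshoots.
    sc = sc0.copy()
    for _ in range(n_iter):
        grad = fc_emp - simulate_fc(sc)
        np.fill_diagonal(grad, 0.0)
        sc = np.clip(sc + lr * grad, 0.0, None)   # keep weights non-negative
    return sc
```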
Abstract:
This paper compares two well-known scan matching algorithms, MbICP and pIC. As a result of the study, the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV), is proposed. The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead-reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions are: 1) using an EKF to estimate the local path traveled by the robot while acquiring the scan, as well as its uncertainty, and 2) a method to group all the data gathered along the robot's path into a single scan with a consistent uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment with satisfactory results.
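A minimal sketch of the "group beams into one scan" step, assuming 2-D poses and ignoring the covariance propagation that the actual method carries out; the helper names are hypothetical:

```python
import numpy as np

def pose_to_matrix(pose):
    # Homogeneous transform for a 2-D pose (x, y, yaw).
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def assemble_scan(beam_points, beam_poses):
    """Express every sonar beam in the frame of the first beam.

    beam_points: list of (2,) arrays, one hit point per beam, in the sensor
                 frame at the time the beam was taken.
    beam_poses:  list of (x, y, yaw) dead-reckoned vehicle poses (e.g. from a
                 DVL/MRU-driven EKF) at each beam time.
    """
    t_ref_inv = np.linalg.inv(pose_to_matrix(beam_poses[0]))
    scan = []
    for p, pose in zip(beam_points, beam_poses):
        t = t_ref_inv @ pose_to_matrix(pose)        # beam frame -> reference frame
        scan.append((t @ np.append(p, 1.0))[:2])
    return np.array(scan)
```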
Abstract:
We investigate dynamics of public perceptions of the 2009 H1N1 influenza pandemic to understand changing patterns of sense-making and blame regarding the outbreak of emerging infectious diseases. We draw on social representation theory combined with a dramaturgical perspective to identify changes in how various collectives are depicted over the course of the pandemic, according to three roles: heroes, villains and victims. Quantitative results based on content analysis of three cross-sectional waves of interviews show a shift from mentions of distant collectives (e.g., far-flung countries) at Wave 1 to local collectives (e.g., risk groups) as the pandemic became of more immediate concern (Wave 2) and declined (Wave 3). Semi-automated content analysis of media coverage shows similar results. Thematic analyses of the discourse associated with collectives revealed that many were consistently perceived as heroes, villains and victims.
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas from Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems may be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, we could solve these reduced problems more efficiently.
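For illustration, a minimal sketch of what the reduced atomic constraints look like when they are ground; this only evaluates given constraints and does not attempt the paper's quadratic-time solving procedure (all names are hypothetical):

```python
def apply_perm(swaps, atom):
    """Apply a permutation, given as a sequence of atom swaps (a b), to an atom."""
    for a, b in reversed(swaps):
        if atom == a:
            atom = b
        elif atom == b:
            atom = a
    return atom

def check_atomic_constraints(constraints):
    """Evaluate ground atomic constraints of the forms
    ('eq', pi, a, b)     meaning  pi·a = b
    ('fresh', pi, a, b)  meaning  a # pi·b, i.e. a != pi·b."""
    for kind, pi, a, b in constraints:
        if kind == "eq" and apply_perm(pi, a) != b:
            return False
        if kind == "fresh" and a == apply_perm(pi, b):
            return False
    return True
```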
Abstract:
Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and who could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients based on the algorithm between the validation and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and 0% died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm for PE that accurately identifies patients who are at low risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
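The rule itself is simple enough to state directly; a minimal sketch of the "no prognostic variable present" check described above, with hypothetical field names and the thresholds taken from the abstract:

```python
def is_low_risk(patient):
    """Return True if a PE patient has none of the algorithm's 10 prognostic
    variables (field names are illustrative; thresholds follow the abstract)."""
    return not any([
        patient["age"] >= 70,
        patient["cancer"],
        patient["heart_failure"],
        patient["chronic_lung_disease"],
        patient["chronic_renal_disease"],
        patient["cerebrovascular_disease"],
        patient["pulse"] >= 110,                # beats/min
        patient["systolic_bp"] < 100,           # mm Hg
        patient["oxygen_saturation"] < 90,      # percent
        patient["altered_mental_status"],
    ])
```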
Abstract:
AIM: To use an animal model to study the aqueous dynamics and the histological findings after deep sclerectomy with (DSCI) and without collagen implant. METHODS: Deep sclerectomy was performed on rabbits' eyes. Eyes were randomly assigned to receive collagen implants. Measurements of intraocular pressure (IOP) and aqueous outflow facility using the constant pressure method through cannulation of the anterior chamber were performed. The system was filled with BSS and cationised ferritin. Histological assessment of the operative site was performed. Sections were stained with haematoxylin and eosin and with Prussian blue. Aqueous drainage vessels were identified by the reaction between ferritin and Prussian blue. All eyes were coded so that the investigator was blind to the type of surgery until the evaluation was completed. RESULTS: A significant decrease in IOP (p<0.05) was observed during the first 6 weeks after DSCI (mean IOP was 13.07 (2.95) mm Hg preoperatively and 9.08 (2.25) mm Hg at 6 weeks); DS without collagen implant showed a significant decrease in IOP at weeks 4 and 8 after surgery (mean IOP 12.57 (3.52) mm Hg preoperatively, 9.45 (3.38) mm Hg at 4 weeks, and 9.22 (3.39) mm Hg at 8 weeks). Outflow facility was significantly increased throughout the 9 months of follow-up in both the DSCI and DS groups (p<0.05). The preoperative outflow facility (OF) was 0.15 (0.02) µl/min/mm Hg. At 9 months, OF was 0.52 (0.28) µl/min/mm Hg and 0.46 (0.07) µl/min/mm Hg for DSCI and DS respectively. Light microscopy studies showed the appearance of new aqueous drainage vessels in the sclera adjacent to the dissection site in DSCI and DS, and the appearance of spindle cells lining the collagen implant in DSCI after 2 months. CONCLUSION: A significant IOP decrease was observed during the first weeks after DSCI and DS. DS with or without collagen implant provided a significant increase in outflow facility throughout the 9 months of follow-up. This might be partly explained by new drainage vessels in the sclera surrounding the operated site. Microscopic studies revealed the appearance of spindle cells lining the collagen implant in DSCI after 2 months.
Abstract:
Neuroimaging studies typically compare experimental conditions using average brain responses, thereby overlooking the stimulus-related information conveyed by distributed spatio-temporal patterns of single-trial responses. Here, we take advantage of this rich information at the single-trial level to decode stimulus-related signals in two event-related potential (ERP) studies. Our method models the statistical distribution of the voltage topographies with a Gaussian Mixture Model (GMM), which reduces the dataset to a number of representative voltage topographies. The degree of presence of these topographies across trials at specific latencies is then used to classify experimental conditions. We tested the algorithm using a cross-validation procedure in two independent EEG datasets. In the first ERP study, we classified left- versus right-hemifield checkerboard stimuli for upper and lower visual hemifields. In a second ERP study, where functional differences cannot be assumed, we classified initial versus repeated presentations of visual objects. With minimal a priori information, the GMM provides neurophysiologically interpretable features - vis-à-vis voltage topographies - as well as dynamic information about brain function. This method can in principle be applied to any ERP dataset to test the functional relevance of specific time periods for stimulus processing, the predictability of subjects' behavior and cognitive states, and the discrimination between healthy and clinical populations.
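A minimal sketch of this kind of pipeline using scikit-learn's GaussianMixture; the data shapes, number of components, and the pooling and averaging choices are assumptions for illustration, not the authors' exact method:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_topography_features(epochs, n_maps=6, seed=0):
    """epochs: array of shape (n_trials, n_times, n_channels) of single-trial ERPs.
    Fits a GMM on all voltage topographies pooled across trials and time points,
    then summarizes each trial by the mean posterior probability ('degree of
    presence') of each representative map over time."""
    n_trials, n_times, n_ch = epochs.shape
    maps = epochs.reshape(-1, n_ch)                      # pool topographies
    gmm = GaussianMixture(n_components=n_maps,
                          covariance_type="diag",
                          random_state=seed).fit(maps)
    post = gmm.predict_proba(maps).reshape(n_trials, n_times, n_maps)
    return post.mean(axis=1)                             # per-trial features

# The resulting per-trial features could then be fed to any standard classifier
# (e.g. sklearn.linear_model.LogisticRegression) under cross-validation.
```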
Abstract:
BACKGROUND: The objectives of the present study were to evaluate AIDS prevention among drug users attending low-threshold centres providing sterile injection equipment in Switzerland, to identify the characteristics of these users, and to monitor the progress of indicators of drug-related harm. METHODS: This paper presents results from a cross-sectional survey carried out in 1994. RESULTS: The mean age of attenders was 28 years, and women represented 27% of the sample. 75% of attenders used a combination of hard drugs (heroin and cocaine). Mean duration of heroin consumption was 8 years, and of cocaine 7 years; 76% of attenders had a fixed abode, but only 34% had stable employment; 45% were being treated with methadone; 9% had shared their injection material in the last 6 months; 24% always used condoms in a stable relationship, and 71% in casual relationships. In a cluster analysis constructed on the basis of multiple correspondence analysis, two distinct profiles of users emerged: highly marginalised users with a high level of consumption (21%), and irregular users, better integrated socially, the majority of whom were under methadone treatment (79%). CONCLUSION: These centres play a major role in AIDS prevention. Nevertheless, efforts to improve the hygiene conditions of drug injection in Switzerland should be pursued and extended. At the same time, prevention of sexual HIV transmission should be reinforced.
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure, in the absence of a priori knowledge about the image configuration, is a uniform field.
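For context, a minimal sketch of the standard MLEM iteration underlying the MLE algorithm mentioned above, starting from a uniform image as the abstract recommends; the FMAPE variant's entropy prior and acceleration are not reproduced here:

```python
import numpy as np

def mlem(system_matrix, counts, n_iter=20):
    """Plain maximum-likelihood EM reconstruction for emission tomography.
    system_matrix: array (n_detectors, n_voxels) of detection probabilities.
    counts: measured detector counts, shape (n_detectors,)."""
    a = np.asarray(system_matrix, float)
    y = np.asarray(counts, float)
    x = np.full(a.shape[1], y.sum() / a.shape[1])     # uniform initial image
    sens = a.sum(axis=0)                              # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = a @ x                                  # forward projection
        ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
        x *= (a.T @ ratio) / np.maximum(sens, 1e-12)  # multiplicative EM update
    return x
```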
Abstract:
A new parameter is introduced: the lightning potential index (LPI), which is a measure of the potential for charge generation and separation that leads to lightning flashes in convective thunderstorms. The LPI is calculated within the charge separation region of clouds, between 0°C and -20°C, where the noninductive mechanism involving collisions of ice and graupel particles in the presence of supercooled water is most effective. As shown in several case studies using the Weather Research and Forecasting (WRF) model with explicit microphysics, the LPI is highly correlated with observed lightning. It is suggested that the LPI may be a useful parameter for predicting lightning as well as a tool for improving weather forecasting of convective storms and heavy rainfall.
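A schematic sketch of this kind of diagnostic, not the published LPI formula: it averages the updraft kinetic energy over the 0°C to -20°C layer, weighted by a factor that peaks where supercooled liquid and ice coexist in comparable amounts. All field names, shapes, and the weighting function are assumptions; consult the paper for the actual definition.

```python
import numpy as np

def lpi_proxy(w, temp_c, q_liquid, q_ice, dz):
    """Schematic LPI-like index for a single model column (NOT the published LPI).
    w, temp_c, q_liquid, q_ice, dz: 1-D vertical profiles of updraft speed,
    temperature (°C), liquid and ice mixing ratios, and layer thickness."""
    w, temp_c = np.asarray(w, float), np.asarray(temp_c, float)
    q_liquid, q_ice, dz = map(lambda v: np.asarray(v, float), (q_liquid, q_ice, dz))
    layer = (temp_c <= 0.0) & (temp_c >= -20.0)       # charge-separation layer
    eps = np.zeros_like(w)
    total = q_liquid + q_ice
    mask = layer & (total > 0)
    eps[mask] = 2.0 * np.sqrt(q_liquid[mask] * q_ice[mask]) / total[mask]
    depth = dz[layer].sum()
    return float((eps * w**2 * dz)[layer].sum() / depth) if depth > 0 else 0.0
```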