1000 results for "neo-Kohlbergian approach based on DIT"
Abstract:
Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using a) real absences, b) pseudo-absences selected randomly from the background, and c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power, best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences and to perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
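As an illustration of strategy b), the following minimal sketch (not from the paper; the single-predictor suitability function and all parameter values are invented) simulates a virtual species, samples presences from its "true" logistic response, draws pseudo-absences at random from the background, and summarises discrimination with a rank-based AUC:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical virtual species: suitability follows a logistic response
# to one environmental predictor, as in a simplified version of the setup.
env = rng.uniform(-3, 3, size=5000)          # background environment
p_true = 1 / (1 + np.exp(-2 * env))          # "true" suitability

presences = env[rng.random(5000) < p_true]   # sampled presence sites
# Strategy b): pseudo-absences drawn at random from the background.
pseudo_abs = rng.uniform(-3, 3, size=len(presences))

# Rank-based AUC: probability that a presence scores higher than a
# pseudo-absence under the true suitability function.
scores_p = 1 / (1 + np.exp(-2 * presences))
scores_a = 1 / (1 + np.exp(-2 * pseudo_abs))
auc = (scores_p[:, None] > scores_a[None, :]).mean()
print(round(auc, 2))
```

Because random background points include many sites where the species is in fact present, the AUC here is high but below 1, consistent with the limited fit the abstract describes.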
Abstract:
Auditory evoked potentials are informative of intact cortical functions in comatose patients. The integrity of auditory functions evaluated using mismatch negativity paradigms has been associated with patients' chances of survival. However, because auditory discrimination is assessed at various delays after coma onset, it is still unclear whether impaired discrimination depends on the time of the recording. We hypothesized that impairment in auditory discrimination capabilities is indicative of coma progression, rather than of the comatose state itself, and that rudimentary auditory discrimination remains intact during acute stages of coma. We studied 30 post-anoxic comatose patients resuscitated from cardiac arrest and five healthy, age-matched controls. Using a mismatch negativity paradigm, we performed two electroencephalography recordings with a standard 19-channel clinical montage: the first within 24 h after coma onset and under mild therapeutic hypothermia, and the second after 1 day and under normothermic conditions. We analysed electroencephalography responses based on a multivariate decoding algorithm that automatically quantifies neural discrimination at the single-patient level. Results showed high average decoding accuracy in discriminating sounds for both control subjects and comatose patients. Importantly, accurate decoding was largely independent of patients' chances of survival. However, the progression of auditory discrimination between the first and second recordings was informative of a patient's chance of survival: a deterioration of auditory discrimination was observed in all non-survivors (equivalent to a 100% positive predictive value for survivors). We show, for the first time, that auditory processing remains intact even in comatose patients who do not survive, and that the progression of sound discrimination over time is informative of a patient's chance of survival.
Tracking auditory discrimination in comatose patients could provide new insight into the chance of awakening, in a quantitative and automatic fashion, during early stages of coma.
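The single-patient decoding step can be sketched in a strongly simplified form. The nearest-mean classifier, the synthetic 19-channel trials, and the effect size below are illustrative assumptions, not the authors' actual multivariate algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical single-patient decoding: EEG trials (19 channels) for two
# sound conditions; leave-one-out cross-validation of a nearest-mean
# classifier quantifies neural discrimination.
n_trials, n_chan = 60, 19
std_trials = rng.standard_normal((n_trials, n_chan))
dev_trials = rng.standard_normal((n_trials, n_chan)) + 0.8  # shifted response
X = np.vstack([std_trials, dev_trials])
y = np.array([0] * n_trials + [1] * n_trials)

correct = 0
for i in range(len(y)):
    train = np.ones(len(y), bool)
    train[i] = False                      # hold out trial i
    m0 = X[train & (y == 0)].mean(axis=0)
    m1 = X[train & (y == 1)].mean(axis=0)
    pred = int(np.linalg.norm(X[i] - m1) < np.linalg.norm(X[i] - m0))
    correct += pred == y[i]

accuracy = correct / len(y)
print(accuracy)       # well above the 0.5 chance level
```

Decoding accuracy near chance (0.5) would indicate no measurable neural discrimination between the two sounds; tracking its change between recordings mirrors the progression measure used in the study.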
Abstract:
BACKGROUND/OBJECTIVES Aging increases the frequency of chronic diseases such as cardiovascular diseases and periodontitis. Here we reproduced an age-dependent model of the periodontium, a fully physiological approach to periodontal conditions, to evaluate the impact of dietary fat type on the gingival tissue of young (6 months old) and old (24 months old) rats. METHODS/FINDINGS Animals were fed life-long on diets based on monounsaturated fatty acids (MUFA), as virgin olive oil; n-6 polyunsaturated fatty acids (n-6PUFA), as sunflower oil; or n-3PUFA, as fish oil. Age-related alveolar bone loss was higher in n-6PUFA-fed rats, probably as a consequence of the ablation of the cells' capacity to adapt to aging. Gene expression analysis suggests that MUFA or n-3PUFA allowed mitochondria to maintain an adequate turnover through induction of biogenesis, autophagy and the antioxidant systems, avoiding alterations of the mitochondrial electron transport system. CONCLUSIONS The main finding is that the enhanced alveolar bone loss associated with age may be targeted by an appropriate dietary treatment. The mechanisms involved in this phenomenon are related to an ablation of the cells' capacity to adapt to aging. Thus, MUFA or n-3PUFA might allow mitochondria to maintain turnover through biogenesis or autophagy. They might also induce the corresponding antioxidant systems to counteract age-related oxidative stress, without inhibiting the mitochondrial electron transport chain. From the nutritional and clinical point of view, it is noteworthy that potential treatments to attenuate alveolar bone loss (a feature of periodontal disease) associated with age could be similar to some of those proposed for the prevention and treatment of cardiovascular diseases, a group of pathologies recently associated with age-related periodontitis.
Abstract:
Anophelines harbour a diverse microbial consortium that may represent an extended gene pool for the host. The proposed effects of the insect microbiota span physiological, metabolic and immune processes. Here we synthesise how current metagenomic tools, combined with classical culture-dependent techniques, provide new insights into the role of the Anopheles-associated microbiota. Many proposed malaria control strategies have been based upon the immunomodulating effects that the bacterial components of the microbiota appear to exert and their ability to express anti-Plasmodium peptides. The number of identified bacterial taxa has increased in the current "omics" era, but the available data are mostly scattered or presented in tables that are difficult to exploit. Published microbiota reports for multiple anopheline species were compiled in an Excel® spreadsheet. We then filtered the microbiota data using a continent-oriented criterion and generated a visual correlation showing the exclusive and shared bacterial genera among four continents. The data suggested the existence of a core group of bacteria associated in a stable manner with their anopheline hosts. However, the lack of data from Neotropical vectors may reduce the possibility of defining the core microbiota and understanding the mosquito-bacteria interactive consortium.
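The continent-oriented comparison of exclusive and shared genera reduces to simple set operations. The genus lists below are hypothetical placeholders, not data from the compiled spreadsheet:

```python
# Hypothetical genus lists per continent (illustrative names only).
microbiota = {
    "Africa":   {"Pseudomonas", "Serratia", "Asaia", "Enterobacter"},
    "Asia":     {"Pseudomonas", "Serratia", "Elizabethkingia"},
    "Americas": {"Pseudomonas", "Asaia", "Pantoea"},
    "Europe":   {"Pseudomonas", "Serratia", "Asaia"},
}

# Core: genera shared by every continent.
core = set.intersection(*microbiota.values())

# Exclusive: genera reported from only one continent.
exclusive = {
    continent: genera
    - set.union(*(g for c, g in microbiota.items() if c != continent))
    for continent, genera in microbiota.items()
}

print(core)               # {'Pseudomonas'}
print(exclusive["Asia"])  # {'Elizabethkingia'}
```

With real data, the same intersection over all sampled regions is what would delimit a candidate core microbiota, and its sensitivity to missing Neotropical records is exactly the caveat the abstract raises.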
Abstract:
A test kit based on living, lyophilized bacterial bioreporters emitting bioluminescence in response to arsenite and arsenate was applied during a field campaign in six villages across Bangladesh. Bioreporter field measurements of arsenic in groundwater from tube wells were in satisfactory agreement with the results of spectroscopic analyses of the same samples conducted in the lab. The practicability of the bioreporter test in terms of logistics and material requirements, suitability for high sample throughput, and waste disposal was much better than that of two commercial chemical test kits that were included as references. The campaigns furthermore demonstrated large local heterogeneity of arsenic in groundwater, underscoring the usefulness of well switching as an effective remedy to avoid high arsenic exposure.
Abstract:
A statistical method for classifying sags according to their origin, downstream or upstream of the recording point, is proposed in this work. The goal is to obtain a statistical model, based on the sag waveforms, that characterises one type of sag and discriminates it from the other type. This model is built on the basis of multi-way principal component analysis and is later used to project the available registers into a new space of lower dimension. Thus, a case base of diagnosed sags is built in the projection space. Classification is then done by comparing new sags against those existing in the case base. Similarity is defined in the projection space using a combination of distances to recover the nearest neighbours of the new sag. Finally, the method assigns the origin of the new sag according to the origin of its neighbours.
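A minimal sketch of the pipeline (PCA projection of a case base, then nearest-neighbour voting) is given below. The synthetic sag waveforms, the class amplitudes, and the use of plain Euclidean distance are assumptions for illustration, not the paper's actual multi-way analysis or distance combination:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical case base: 40 sag waveforms (rows) of 100 samples each,
# labelled by origin (0 = downstream, 1 = upstream).
t = np.linspace(0, 1, 100)
down = 0.4 * np.sin(2 * np.pi * 5 * t) + 0.05 * rng.standard_normal((20, 100))
up   = 0.8 * np.sin(2 * np.pi * 5 * t) + 0.05 * rng.standard_normal((20, 100))
X = np.vstack([down, up])
y = np.array([0] * 20 + [1] * 20)

# PCA via SVD: project the case base onto the first 3 principal components.
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
P = (X - mean) @ Vt[:3].T            # case base in the projection space

def classify(sag, k=3):
    """Assign an origin by majority vote of the k nearest neighbours."""
    z = (sag - mean) @ Vt[:3].T      # project the new sag
    d = np.linalg.norm(P - z, axis=1)
    return int(round(y[np.argsort(d)[:k]].mean()))

new_sag = 0.75 * np.sin(2 * np.pi * 5 * t)  # resembles the upstream class
print(classify(new_sag))                    # 1 (upstream)
```

The projection keeps the comparison in a low-dimensional space, so the nearest-neighbour search stays cheap even for a large case base.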
Abstract:
Full text: http://www.springerlink.com/content/3q68180337551r47/fulltext.pdf
Abstract:
In South America, yellow fever (YF) is an established infectious disease that has been identified outside of its traditional endemic areas, affecting human and nonhuman primate (NHP) populations. In the epidemics that occurred in Argentina between 2007 and 2009, several outbreaks affecting humans and howler monkeys (Alouatta spp.) were reported, highlighting the importance of this disease in the context of conservation medicine and public health policies. Given the lack of information about YF dynamics in New World NHP, our main goal was to apply modelling tools to better understand YF transmission dynamics among endangered brown howler monkey (Alouatta guariba clamitans) populations in northeastern Argentina. Two complementary modelling tools were used to evaluate brown howler population dynamics in the presence of the disease: Vortex, a stochastic demographic simulation model, and Outbreak, a stochastic disease epidemiology simulation. The baseline model of YF disease epidemiology predicted a very high probability of population decline over the next 100 years. We believe the modelling approach discussed here is a reasonable description of the disease and its effects on the howler monkey population and can be useful to support evidence-based decision-making to guide actions at a regional level.
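The kind of stochastic projection such tools perform can be caricatured in a few lines. This is not Vortex or Outbreak, and every rate below is an invented placeholder:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stochastic demographic sketch: a howler population with annual
# binomial births and deaths, plus hypothetical YF epidemic years that
# impose extra mortality; the probability of decline over 100 years is
# estimated across replicates.
def simulate(n0=200, years=100, birth=0.15, death=0.12,
             yf_prob=0.1, yf_mortality=0.5):
    n = n0
    for _ in range(years):
        n += rng.binomial(n, birth) - rng.binomial(n, death)
        if rng.random() < yf_prob:            # epidemic year
            n -= rng.binomial(n, yf_mortality)
        if n <= 0:                            # extinction
            return 0
    return n

runs = [simulate() for _ in range(500)]
p_decline = np.mean([n < 200 for n in runs])
print(p_decline)       # high with these illustrative rates
```

Even with positive mean growth between epidemics, occasional high-mortality years dominate the long-run outcome, which is the qualitative pattern behind the baseline model's high predicted probability of decline.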
Abstract:
This paper focuses on one of the methods for bandwidth allocation in an ATM network: the convolution approach. The convolution approach permits an accurate study of the system load in statistical terms by accumulated calculations, since probabilistic results of the bandwidth allocation can be obtained. Nevertheless, the convolution approach has a high cost in terms of calculation and storage requirements. This makes real-time calculations difficult, so many authors do not consider this approach. With the aim of reducing this cost, we propose to use the multinomial distribution function: the enhanced convolution approach (ECA). This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements and makes a simple deconvolution process possible. The ECA is used in connection acceptance control, and some results are presented.
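The contrast between the accumulated convolution and the multinomial shortcut can be sketched for the homogeneous-source case (the source count, activity probability, and peak rate below are invented, and the binomial is the one-class special case of the multinomial):

```python
import numpy as np
from math import comb

# Hypothetical link: 10 homogeneous on/off sources, each active with
# probability 0.3 and requiring 2 Mb/s when active.
n, p, rate = 10, 0.3, 2

# Classical convolution approach: accumulate the load distribution
# source by source.
dist = np.array([1.0])                 # start: zero load with probability 1
source = np.zeros(rate + 1)
source[0], source[rate] = 1 - p, p     # per-source bandwidth pmf
for _ in range(n):
    dist = np.convolve(dist, source)

# ECA-style shortcut for homogeneous sources: the number of active
# sources is binomial, so the load probabilities follow directly,
# without n successive convolutions.
eca = {k * rate: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

# Both give the same probability of, e.g., exactly 3 active sources.
print(abs(dist[3 * rate] - eca[3 * rate]) < 1e-12)   # True
```

The direct computation also makes "deconvolving" one source out of the aggregate trivial: it amounts to recomputing the closed form with n-1 sources, rather than undoing a numerical convolution.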
Abstract:
Developing a fast, inexpensive, and specific test that reflects the mutations present in Mycobacterium tuberculosis isolates according to geographic region is the main challenge for drug-resistant tuberculosis (TB) control. The objective of this study was to develop a molecular platform for the rapid diagnosis of multidrug-resistant (MDR) and extensively drug-resistant TB based on single nucleotide polymorphism (SNP) mutations present in the rpoB, katG, inhA, ahpC, and gyrA genes from Colombian M. tuberculosis isolates. The amplification and sequencing of each target gene were performed. Capture oligonucleotides, which were tested before being used with isolates to assess performance, were designed for wild-type and mutated codons, and the platform was standardised based on the reverse hybridisation principle. This method was tested on DNA samples extracted from clinical isolates from 160 Colombian patients who had previously been phenotypically and genotypically characterised as having susceptible or MDR M. tuberculosis. For our method, the kappa index relative to the sequencing results was 0.966, 0.825, 0.766, 0.740, and 0.625 for rpoB, katG, inhA, ahpC, and gyrA, respectively. Sensitivity and specificity ranged between 90% and 100% compared with those of phenotypic drug susceptibility testing. Our assay helps pave the way for local implementation and for specifically adapted methods that can simultaneously detect drug-resistance mutations to first- and second-line drugs within a few hours.
Abstract:
BACKGROUND Most textbooks contain messages relating to health. This profuse information requires analysis with regard to its quality. The objective was to identify the scientific evidence on which the health messages in textbooks are based. METHODS The degree of evidence on which such messages are based was identified, and the messages were subsequently classified into three categories: messages with high, medium or low levels of evidence; messages with an unknown level of evidence; and messages with no known evidence. RESULTS 844 messages were studied. Of this total, 61% were classified as messages with an unknown level of evidence. Less than 15% fell into the category where the level of evidence was known, and less than 6% were classified as possessing high levels of evidence. More than 70% of the messages relating to "Balanced Diets and Malnutrition", "Food Hygiene", "Tobacco", "Sexual behaviour and AIDS" and "Rest and ergonomics" are based on an unknown level of evidence. "Oral health" registered the highest percentage of messages based on a high level of evidence (37.5%), followed by "Pregnancy and newly born infants" (35%). Of the total, 24.6% are not based on any known evidence. Two of the messages appeared to contravene known evidence. CONCLUSION Many of the messages included in school textbooks are not based on scientific evidence. Standards must be established to facilitate the production of texts that include messages based on the best available evidence, which can improve children's health more effectively.
Abstract:
In CoDaWork’05, we presented an application of discriminant function analysis (DFA) to 4 different compositional datasets and modelled the first canonical variable using a segmented regression model based solely on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration as discussed previously, another method is proposed for mapping the unknown tephras onto samples of the reference set or onto missing samples between consecutive reference samples. The application of these methodologies is demonstrated with both simulated and real datasets. This newly proposed methodology provides an alternative, more acceptable approach for geologists, as their focus is on mapping the unknown tephra to relevant eruptive events rather than estimating the age of the unknown tephra. Key words: Tephrochronology; Segmented regression
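Segmented regression itself can be sketched by scanning candidate breakpoints and keeping the one that minimises the combined residual sum of squares of two least-squares fits. The two-segment data below are synthetic, not the tephra datasets:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic two-segment data: slope change at x = 4, plus noise.
x = np.linspace(0, 10, 80)
y = np.where(x < 4, 1.0 + 0.5 * x, 3.0 + 2.0 * (x - 4)) \
    + 0.1 * rng.standard_normal(80)

def rss(xs, ys):
    """Residual sum of squares of a straight-line least-squares fit."""
    if len(xs) < 2:
        return np.inf
    coef = np.polyfit(xs, ys, 1)
    return float(np.sum((np.polyval(coef, xs) - ys) ** 2))

# Scan interior candidate breakpoints; fit each side separately.
candidates = x[5:-5]
best_bp = min(candidates,
              key=lambda b: rss(x[x < b], y[x < b]) + rss(x[x >= b], y[x >= b]))
print(round(best_bp, 1))     # near the true breakpoint at x = 4
```

With calibration data of known age, the fitted segments can then be inverted to date an unknown sample, which is the calibration use the abstract refers to.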
Abstract:
A problem in the archaeometric classification of Catalan Renaissance pottery is the fact that the clay supply of the pottery workshops was centrally organised by guilds, and therefore usually all potters of a single production centre produced chemically similar ceramics. However, when analysing the glazes of the ware, a large number of inclusions is usually found in the glaze, which reveal technological differences between single workshops. These inclusions were used by the potters in order to opacify the transparent glaze and to achieve a white background for further decoration. In order to distinguish the different technological preparation procedures of the single workshops, the chemical composition of those inclusions, as well as their size in the two-dimensional cut, is recorded with a scanning electron microscope. Based on the latter, a frequency distribution of the apparent diameters is estimated for each sample and type of inclusion. Following an approach by S.D. Wicksell (1925), it is in principle possible to transform the distributions of the apparent 2D diameters back to those of the true three-dimensional bodies. The applicability of this approach and its practical problems are examined using different ways of kernel density estimation and Monte Carlo tests of the methodology. Finally, it is tested to what extent the obtained frequency distributions can be used to classify the pottery.
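The Wicksell setting can be illustrated with a Monte Carlo draw for the simplest, monodisperse case: spherical inclusions of a single true radius cut by a random section plane, so the apparent 2D radius is systematically smaller than the true one:

```python
import numpy as np

rng = np.random.default_rng(2)

# Monodisperse spheres of true radius R = 1, sectioned by a plane whose
# distance from the sphere centre is uniform (Wicksell's corpuscle problem).
R = 1.0
n = 200_000
z = rng.uniform(0, R, n)            # distance from sphere centre to plane
apparent = np.sqrt(R**2 - z**2)     # apparent 2D radius in the section

# For a fixed true radius, E[apparent radius] = (pi / 4) * R.
print(round(apparent.mean(), 3))    # ~pi/4 ≈ 0.785
```

Recovering the distribution of true diameters from a mixture of such biased observations is the (harder) inverse problem the paper attacks with kernel density estimation.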
Abstract:
The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but with a wider service-by-service composition by subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services like open-id, trust management, monitors and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build Open Negotiation Environments that are stable, a basic condition for predictable and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation style recommender. The ecosystem monitor provides hints to the negotiation style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation style recommender is implemented with a simulated annealing algorithm at a constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
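Simulated annealing at a constant temperature reduces to a Metropolis search over candidate states. The negotiation styles and their instability costs below are invented placeholders, not values from the ONE deployment:

```python
import math
import random

random.seed(0)

# Hypothetical instability cost of each negotiation style; the recommender
# searches for a low-cost style via Metropolis moves at a fixed temperature
# (simulated annealing without a cooling schedule).
cost = {"aggressive": 3.0, "collaborative": 0.5, "neutral": 1.5, "evasive": 2.5}
styles = list(cost)

T = 0.3                       # constant temperature
state = "aggressive"
best = state
for _ in range(2000):
    candidate = random.choice(styles)
    delta = cost[candidate] - cost[state]
    # Accept improvements always; accept worsenings with prob. exp(-delta/T).
    if delta <= 0 or random.random() < math.exp(-delta / T):
        state = candidate
    if cost[state] < cost[best]:
        best = state

print(best)                   # "collaborative"
```

Keeping the temperature constant means the search never freezes: it keeps exploring alternative styles, which suits a recommender operating in a continuously changing ecosystem rather than solving a one-shot optimisation.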
Abstract:
This work shows the use of adaptation techniques in an e-learning system that considers students' learning styles and knowledge states. The e-learning system is built on a multiagent framework designed to examine opportunities to improve teaching and to motivate students to learn what they want in a user-friendly, assisted environment.