987 results for scientific explanations models


Relevance: 30.00%

Abstract:

Global environmental change (GEC) is an increasingly discussed phenomenon in the scientific literature as evidence of its presence and impacts continues to grow. Yet, while documentation of GEC is becoming more readily available, local perceptions of GEC, particularly in small-scale societies, and preferences about how to deal with it, are still largely overlooked. Local knowledge and perceptions of GEC are important in that agents make decisions (including on natural resource management) based on individual perceptions. We carried out a systematic literature review that aims to provide an exhaustive state of the art of the degree to which, and the manner in which, local perceptions of change are being addressed in GEC research. We reviewed 126 articles found in peer-reviewed journals (between 1998 and 2014) that address local perceptions of GEC. We used three particular lenses of analysis that are known to influence local perceptions, namely (i) cognition, (ii) culture and knowledge, and (iii) possibilities for adaptation. We present our findings on the geographical distribution of the current research, the most commonly reported changes, perceived drivers and impacts of change, and local explanations and evaluations of change and impacts. Overall, we found the studies to be geographically biased, lacking in methodological reporting, mostly theory-based with little primary data, and lacking in-depth analysis of the psychological and ontological influences on perception and their implications for adaptation. We provide recommendations for future GEC research and propose the development of a "meta-language" around adaptation, perception, and mediation to encourage a greater appreciation and understanding of the diversity of these phenomena across multiple scales, and improved co-design and facilitation of locally relevant adaptation and mitigation strategies.

Relevance: 30.00%

Abstract:

Economic losses resulting from disease development can be reduced by accurate and early detection of plant pathogens. Early detection can provide the grower with useful information on optimal crop rotation patterns, varietal selections, appropriate control measures, harvest date, and post-harvest handling. Classical methods for the isolation of pathogens are commonly used only after disease symptoms appear. This frequently results in a delay in the application of control measures at potentially important periods in crop production. This paper describes the application of both antibody- and DNA-based systems to monitor the infection risk of air- and soil-borne fungal pathogens, and the use of this information with mathematical models describing the risk of disease associated with environmental parameters.
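The paper couples detection data with risk models driven by environmental parameters. A generic way to express such a model is logistic regression from weather variables to infection probability; the predictors (temperature, leaf-wetness duration) and every coefficient below are invented for illustration, not taken from the paper:

```python
import math

def infection_risk(temp_c, wetness_h, b0=-8.0, b_t=0.25, b_w=0.35):
    """Logistic disease-risk model: probability of infection as a function
    of temperature (deg C) and leaf-wetness duration (hours).
    All coefficients are illustrative placeholders, not fitted values."""
    z = b0 + b_t * temp_c + b_w * wetness_h
    return 1.0 / (1.0 + math.exp(-z))

# Risk rises with warmer, wetter conditions.
low = infection_risk(10, 4)    # cool, short wetness period
high = infection_risk(20, 12)  # warm, prolonged wetness
```

In practice such a model would be fitted to historical infection records; combined with early pathogen detection, it lets the grower time control measures before symptoms appear.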

Relevance: 30.00%

Abstract:

Although tyrosine kinase inhibitors (TKIs) such as imatinib have transformed chronic myelogenous leukemia (CML) into a chronic condition, these therapies are not curative in the majority of cases. Most patients must continue TKI therapy indefinitely, a requirement that is both expensive and that compromises a patient's quality of life. While TKIs are known to reduce leukemic cells' proliferative capacity and to induce apoptosis, their effects on leukemic stem cells, the immune system, and the microenvironment are not fully understood. A more complete understanding of their global therapeutic effects would help us to identify any limitations of TKI monotherapy and to address these issues through novel combination therapies. Mathematical models are a complementary tool to experimental and clinical data that can provide valuable insights into the underlying mechanisms of TKI therapy. Previous modeling efforts have focused on CML patients who show biphasic and triphasic exponential declines in BCR-ABL ratio during therapy. However, our patient data indicate that many patients treated with TKIs show fluctuations in BCR-ABL ratio yet are able to achieve durable remissions. To investigate these fluctuations, we construct a mathematical model that integrates CML with a patient's autologous immune response to the disease. In our model, we define an immune window, which is an intermediate range of leukemic concentrations that lead to an effective immune response against CML. While small leukemic concentrations provide insufficient stimulus, large leukemic concentrations actively suppress a patient's immune system, thus limiting its ability to respond. Our patient data and modeling results suggest that at diagnosis, a patient's high leukemic concentration is able to suppress their immune system. TKI therapy drives the leukemic population into the immune window, allowing the patient's immune cells to expand and eventually mount an efficient response against the residual CML.
This response drives the leukemic population below the immune window, causing the immune population to contract and allowing the leukemia to partially recover. The leukemia eventually reenters the immune window, thus stimulating a sequence of weaker immune responses as the two populations approach equilibrium. We hypothesize that a patient's autologous immune response to CML may explain the fluctuations in BCR-ABL ratio that are regularly seen during TKI therapy. These fluctuations may serve as a signature of a patient's individual immune response to CML. By applying our modeling framework to patient data, we are able to construct an immune profile that can then be used to propose patient-specific combination therapies aimed at further reducing a patient's leukemic burden. Our characterization of a patient's anti-leukemia immune response may be especially valuable in the study of drug resistance, treatment cessation, and combination therapy.
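The abstract does not reproduce the model equations, but the immune-window mechanism can be illustrated with a deliberately simplified two-equation sketch: leukemic cells decline under TKI pressure, while immune stimulation peaks at intermediate leukemic loads and is suppressed at high loads. The equations, window shape, and every parameter value below are illustrative assumptions, not the authors' model:

```python
import numpy as np

def simulate(days=2000, dt=0.1, r=0.01, K=1e6, tki=0.05, mu=1e-3,
             s=0.1, p=2e-5, Lc=1e4, d=0.05, L0=5e5, Z0=2.0):
    """Toy immune-window model. L: leukemic cells (logistic growth r, K;
    killed by TKI at rate tki and by immune cells at rate mu*Z).
    Z: immune cells (source s, death d, stimulation p*L/(1+(L/Lc)^2),
    a window-shaped term that vanishes at both small and large L)."""
    L, Z = L0, Z0
    traj = []
    for _ in range(int(days / dt)):
        stim = p * L / (1.0 + (L / Lc) ** 2)     # peaks near L = Lc
        dL = r * L * (1 - L / K) - tki * L - mu * L * Z
        dZ = s + stim * Z - d * Z
        L = max(L + dt * dL, 0.0)
        Z = max(Z + dt * dZ, 0.0)
        traj.append((L, Z))
    return np.array(traj)

traj = simulate()
```

With these toy parameters, therapy drives L down through the window; the immune population transiently expands there and then relaxes back toward its baseline s/d once the leukemic stimulus is gone, echoing the expand-and-contract narrative above.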

Relevance: 30.00%

Abstract:

Nile perch (Lates niloticus), tilapia (Oreochromis spp.), dagaa (Rastrineobola argentea, silver cyprinid), and haplochromines (Tribe Haplochromini) form the backbone of the commercial fishery on Lake Victoria. These fish stocks account for about 70% of the total catch in the three riparian states, Uganda, Kenya, and Tanzania. The lake fisheries have been poorly managed, in part due to inadequate scientific analysis and management advice. The overall objective of this project was to model the stocks of the commercial fisheries of Lake Victoria with a view to determining reference points and current stock status. The Schaefer biomass model was fitted to available data for each stock (starting in the 1960s or later) in the form of landings, catch per unit effort, acoustic survey indices, and trawl survey indices. In most cases, the Schaefer model did not fit all data components very well, but attempts were made to find the best model for each stock. When the model was fitted to the Nile perch data starting from 1996, the estimated current biomass was 654 kt (95% CI 466–763 kt), below the optimum of 692 kt, and the current harvest rate was 38% (33–73%), close to the optimum of 35%. At best, these can be used as tentative guidelines for the management of these fisheries. The results indicate that there have been strong multispecies interactions in the lake ecosystem. The findings from our study can be used as a baseline reference for future studies using more complex models, which could take these multispecies interactions into account.
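The Schaefer model named above is a standard surplus-production model, so its reference points can be written down directly. In the sketch below the values of r and K are illustrative only: K is chosen so that K/2 matches the reported optimum biomass of 692 kt, and r so that the optimal harvest rate r/2 matches the reported 35%; neither is an estimate from the study.

```python
def schaefer_project(B0, r, K, catches):
    """Project biomass under the Schaefer surplus-production model:
    B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]."""
    B = [B0]
    for C in catches:
        B.append(max(B[-1] + r * B[-1] * (1 - B[-1] / K) - C, 0.0))
    return B

# Reference points follow directly from the model structure:
r, K = 0.7, 1384.0      # illustrative: K/2 = 692 kt, r/2 = 35%
B_msy = K / 2           # biomass giving maximum surplus production
MSY = r * K / 4         # maximum sustainable yield (kt per year)

# Harvesting exactly MSY from B_msy leaves the stock at equilibrium.
B = schaefer_project(B_msy, r, K, [MSY] * 10)
```

Fitting in practice means choosing r, K (and catchability terms for each index series) to best explain the landings and survey data, then reading the reference points off the fitted parameters.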

Relevance: 30.00%

Abstract:

Master's dissertation, Universidade de Brasília, Instituto de Ciências Humanas, Departamento de Filosofia, Programa de Pós-Graduação em Filosofia, 2015.

Relevance: 30.00%

Abstract:

Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Water-level monitoring networks can give information about the dynamics of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physically based mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in system properties and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics, as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied in a case study in an outcrop area of the Guarani Aquifer System (GAS) located in southeastern Brazil. Communication of results in a clear and understandable form, via simulated scenarios, is discussed as an alternative when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring network coverage, such as the GAS.
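A minimal version of the time-series side of such a framework is an autoregressive model fitted to a single well's record; the synthetic seasonal series, AR(1) structure, and all parameter values below are illustrative assumptions, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly water-table levels (m): seasonal signal + AR(1) noise.
n = 240
t = np.arange(n)
levels = 10 + 1.5 * np.sin(2 * np.pi * t / 12)
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.8 * noise[i - 1] + rng.normal(0, 0.2)
levels = levels + noise

# Fit an AR(1) model by least squares: x[t] = c + phi * x[t-1] + eps
x, y = levels[:-1], levels[1:]
A = np.column_stack([np.ones_like(x), x])
(c, phi), *_ = np.linalg.lstsq(A, y, rcond=None)

# One-step-ahead forecast from the last observation.
forecast = c + phi * levels[-1]
```

In the full framework, a model like this would be fitted at each monitored well and the model parameters or predictions interpolated geostatistically between wells, with simulated scenarios used to communicate uncertainty.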

Relevance: 30.00%

Abstract:

In recent decades, the existence of a close relationship between emotional phenomena and rational processes has been firmly established, but we still lack a unified definition of either, or effective models to describe them well. To advance our understanding of the mechanisms governing the behavior of living beings, we must integrate multiple theories, experiments, and models from both fields. In this paper we propose a new theoretical framework for integrating and understanding, from a functional point of view, the emotion-cognition duality. Our reasoning, based on evolutionary principles, adds to the definition and understanding of emotion, justifying its origin, explaining its mission and dynamics, and linking it to higher cognitive processes, mainly attention, cognition, decision-making, and consciousness. According to our theory, emotions are the mechanism for optimizing brain function, besides being the system for prioritizing contingencies and stimuli. As a result of this approach, we have developed a dynamic systems-level model capable of providing plausible explanations for some psychological and behavioral phenomena, and of establishing a new framework for the scientific definition of some fundamental psychological terms.

Relevance: 30.00%

Abstract:

Biogeochemical-Argo is the extension of the Argo array of profiling floats to include floats that are equipped with biogeochemical sensors for pH, oxygen, nitrate, chlorophyll, suspended particles, and downwelling irradiance. Argo is a highly regarded international program that measures the changing ocean temperature (heat content) and salinity with profiling floats distributed throughout the ocean. Newly developed sensors now allow profiling floats to also observe biogeochemical properties with sufficient accuracy for climate studies. This extension of Argo will enable an observing system that can determine the seasonal to decadal-scale variability in biological productivity, the supply of essential plant nutrients from deep waters to the sunlit surface layer, ocean acidification, hypoxia, and ocean uptake of CO2. Biogeochemical-Argo will drive a transformative shift in our ability to observe and predict the effects of climate change on ocean metabolism, carbon uptake, and living marine resource management. Presently, vast areas of the open ocean are sampled only once per decade or less, with sampling occurring mainly in summer. Our ability to detect changes in biogeochemical processes that may occur due to the warming and acidification driven by increasing atmospheric CO2, as well as by natural climate variability, is greatly hindered by this undersampling. In close synergy with satellite systems (which are effective at detecting global patterns for a few biogeochemical parameters, but only very close to the sea surface and in the absence of clouds), a global array of biogeochemical sensors would revolutionize our understanding of ocean carbon uptake, productivity, and deoxygenation. The array would reveal the biological, chemical, and physical events that control these processes.
Such a system would enable a new generation of global ocean prediction systems in support of carbon cycling, acidification, hypoxia, and harmful algal bloom studies, as well as the management of living marine resources. In order to prepare for a global Biogeochemical-Argo array, several prototype profiling float arrays have been developed at the regional scale by various countries and are now operating. Examples include regional arrays in the Southern Ocean (SOCCOM), the North Atlantic Subpolar Gyre (remOcean), the Mediterranean Sea (NAOS), the Kuroshio region of the North Pacific (INBOX), and the Indian Ocean (IOBioArgo). For example, the SOCCOM program is deploying 200 profiling floats with biogeochemical sensors throughout the Southern Ocean, including areas covered seasonally with ice. The resulting data, which are publicly available in real time, are being linked with computer models to better understand the role of the Southern Ocean in influencing CO2 uptake, biological productivity, and nutrient supply to distant regions of the world ocean. The success of these regional projects motivated a planning meeting to discuss the requirements for, and applications of, a global-scale Biogeochemical-Argo program. The meeting was held 11-13 January 2016 in Villefranche-sur-Mer, France, with attendees from the eight nations now deploying Argo floats with biogeochemical sensors. In preparation, computer simulations and a variety of analyses were conducted to assess the resources required for the transition to a global-scale array. Based on these analyses and simulations, it was concluded that an array of about 1000 biogeochemical profiling floats would provide the needed resolution to greatly improve our understanding of biogeochemical processes and to enable significant improvement in ecosystem models.
With an endurance of four years per Biogeochemical-Argo float, this system would require the procurement and deployment of 250 new floats per year to maintain a 1000-float array. The lifetime cost of a Biogeochemical-Argo float, including capital expense, calibration, data management, and data transmission, is about $100,000. A global Biogeochemical-Argo system would thus cost about $25,000,000 annually. In the present Argo paradigm, the US provides half of the profiling floats in the array, while the EU, Austral/Asia, and Canada share most of the remaining half. If this approach is adopted, the US cost for the Biogeochemical-Argo system would be ~$12,500,000 annually, with ~$6,250,000 annually each for the EU and for Austral/Asia and Canada. This includes no direct costs for ship time and presumes that float deployments can be carried out from future research cruises of opportunity, including, for example, the international GO-SHIP program (http://www.go-ship.org). The full-scale implementation of a global Biogeochemical-Argo system with 1000 floats is feasible within a decade. The successful, ongoing pilot projects have provided the foundation and starting point for such a system.
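The cost figures quoted above follow from straightforward arithmetic, which can be checked directly:

```python
# Reproducing the cost arithmetic from the text.
floats_in_array = 1000
endurance_years = 4
lifetime_cost = 100_000                     # USD per float, all-in

floats_per_year = floats_in_array // endurance_years  # replacements/year
annual_cost = floats_per_year * lifetime_cost         # global annual cost

us_share = annual_cost // 2        # US provides half of the array
partner_share = annual_cost // 4   # EU; Austral/Asia and Canada, each
```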

Relevance: 30.00%

Abstract:

This paper is concerned with SIR (susceptible--infected--removed) household epidemic models in which the infection response may be either mild or severe, with the type of response also affecting the infectiousness of an individual. Two different models are analysed. In the first model, the infection status of an individual is predetermined, perhaps due to partial immunity, and in the second, the infection status of an individual depends on the infection status of its infector and on whether the individual was infected by a within- or between-household contact. The first scenario may be modelled using a multitype household epidemic model, and the second scenario by a model we denote by the infector-dependent-severity household epidemic model. Large-population results for the two models are derived, with the focus being on the distribution of the total numbers of mild and severe cases in a typical household, of any given size, in the event that the epidemic becomes established. The aim of the paper is to investigate whether, given final-size household outbreak data containing mild and severe cases, it is possible to determine which of the two underlying explanations is causing the varying response. We conduct numerical studies which show that, given data on sufficiently many households, it is generally possible to discriminate between the two models by comparing the Kullback-Leibler divergence of each fitted model from these data.
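The model-discrimination step can be illustrated with a toy computation: given two fitted final-size distributions over mild/severe household outcomes and an empirical distribution, the Kullback-Leibler divergence identifies the closer model. All distributions below are invented for illustration, not taken from the paper:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical final-size distributions over the outcomes
# (0 mild, 0 severe), (1, 0), (0, 1), (1, 1) for households of size 2,
# as predicted by two candidate models.
model_a = [0.40, 0.30, 0.20, 0.10]   # e.g. multitype model fit
model_b = [0.40, 0.20, 0.30, 0.10]   # e.g. infector-dependent-severity fit

# Empirical outcome frequencies from observed household outbreak data.
observed = [0.41, 0.28, 0.21, 0.10]

# The model with the smaller divergence from the data is preferred.
d_a = kl_divergence(observed, model_a)
d_b = kl_divergence(observed, model_b)
```

Here model_a would be selected, since the observed frequencies sit much closer to it; with many households, sampling noise in the empirical distribution shrinks and the comparison becomes reliable.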

Relevance: 30.00%

Abstract:

Doctoral thesis, Universidade de Brasília, Instituto de Ciências Biológicas, Programa de Pós-Graduação em Ecologia, 2015.

Relevance: 30.00%

Abstract:

The evolution of knowledge and technology in recent decades has brought profound changes in science policy, not only within countries but also in supranational organizations. It has therefore been necessary to adapt scientific institutions to new models in order to achieve greater and better communication between them and the political counterparts responsible for defining the general framework of relations between science and society. The Confederation of Scientific Societies of Spain (COSCE, Confederación de Sociedades Científicas de España) was founded in October 2003 to respond to the urgent need to interact with political institutions and foster a better orientation in the process of making decisions about science policy. Currently COSCE comprises over 70 Spanish scientific societies and more than 40,000 scientists. During its twelve years of active life, COSCE has worked intensely to raise awareness of the real situation of science in Spain, by launching several initiatives (some of which other organizations have since joined) or by joining initiatives proposed by other science-related groups, both in Spain and in European and non-European arenas.

Relevance: 30.00%

Abstract:

Spectral sensors are a wide class of devices that are extremely useful for detecting essential information about the environment and materials with a high degree of selectivity. Recently, they have achieved high degrees of integration and low implementation costs, making them suitable for fast, small, and non-invasive monitoring systems. However, the useful information is hidden in the spectra and is difficult to decode, so mathematical algorithms are needed to infer the value of the variables of interest from the acquired data. Among the different families of predictive modeling, Principal Component Analysis and the techniques stemming from it can provide very good performance with small computational and memory requirements. For these reasons, they allow the implementation of prediction even in embedded and autonomous devices. In this thesis, I will present four practical applications of these algorithms to the prediction of different variables: moisture of soil, moisture of concrete, freshness of anchovies/sardines, and concentration of gases. In all of these cases, the workflow is the same. Initially, an acquisition campaign was performed to acquire both spectra and the variables of interest from samples. These data were then used as input for the creation of the prediction models, to solve both classification and regression problems. From these models, an array of calibration coefficients is derived and used for the implementation of the prediction in an embedded system. The presented results will show that this workflow was successfully applied to very different scientific fields, yielding autonomous and non-invasive devices able to predict the value of chosen physical parameters from new spectral acquisitions.
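The PCA-based workflow described above (spectra in, calibration coefficients out) can be sketched with principal component regression on synthetic spectra. The data, absorption-band shape, and component count below are illustrative assumptions, not the thesis' measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "spectra": 200 samples x 50 wavelengths, where the variable of
# interest (say, moisture) drives a broad absorption feature plus noise.
n, w = 200, 50
moisture = rng.uniform(0, 1, n)
band = np.exp(-0.5 * ((np.arange(w) - 25) / 5.0) ** 2)  # absorption shape
X = np.outer(moisture, band) + rng.normal(0, 0.05, (n, w))
y = moisture

# Principal Component Regression: project onto the leading principal
# components, then fit ordinary least squares in the reduced space.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
scores = Xc @ Vt[:k].T                        # PC scores per sample
coef, *_ = np.linalg.lstsq(
    np.column_stack([np.ones(n), scores]), y, rcond=None)

# The calibration collapses to one coefficient per wavelength, which is
# exactly what an embedded device needs: y_hat = b0 + x_centered @ b
b = Vt[:k].T @ coef[1:]
b0 = coef[0]
y_hat = b0 + Xc @ b
rmse = float(np.sqrt(np.mean((y_hat - y) ** 2)))
```

The key point for embedded deployment is the last step: once the PCA loadings and regression weights are folded into a single coefficient vector per wavelength, prediction on the device is just a dot product.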

Relevance: 30.00%

Abstract:

Long-term monitoring of acoustical environments is gaining popularity thanks to the wealth of scientific and engineering insights it provides. The increasing interest is due to the constant growth of storage capacity and of the computational power available to process large amounts of data. In this perspective, machine learning (ML) provides a broad family of data-driven statistical techniques for dealing with large databases. Nowadays, the conventional praxis of sound level meter measurements limits the global description of a sound scene to an energetic point of view: the equivalent continuous level (Leq) is the main metric used to define an acoustic environment. Finer analyses involve the use of statistical levels. However, acoustic percentiles are based on temporal assumptions, which are not always reliable. A statistical approach based on the study of the occurrences of sound pressure levels would bring a different perspective to the analysis of long-term monitoring. Depicting a sound scene through the most probable sound pressure level, rather than through portions of energy, brings more specific information about the activity carried out during the measurements. The statistical mode of the occurrences can capture typical behaviors of specific kinds of sound sources. The present work aims to propose an ML-based method to identify, separate, and measure coexisting sound sources in real-world scenarios. It is based on long-term monitoring and is addressed to acousticians focused on the analysis of environmental noise in manifold contexts. The presented method is based on clustering analysis. Two algorithms, the Gaussian Mixture Model and K-means clustering, form the core of a process to investigate different active spaces monitored through sound level meters. The procedure has been applied in two different contexts: university lecture halls and offices.
The proposed method shows robust and reliable results in describing the acoustic scenario, and it could represent an important analytical tool for acousticians.
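As a sketch of the clustering idea, K-means applied to a bimodal distribution of sound pressure levels recovers the typical levels of two coexisting sources. The dB values, mixture proportions, and cluster count below are invented for illustration, not taken from the measurement campaigns:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic sound-pressure-level samples (dB): a quiet background source
# and a louder activity source coexisting in the same recording.
levels = np.concatenate([rng.normal(38, 2, 600),    # background, ~38 dB
                         rng.normal(62, 3, 400)])   # activity, ~62 dB

def kmeans_1d(x, k=2, iters=50):
    """Plain K-means on scalar data; returns sorted cluster centers."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return np.sort(centers)

centers = kmeans_1d(levels)
```

Each recovered center approximates the most probable level of one source, which is the occurrence-based description the text argues for; a Gaussian Mixture Model would additionally estimate each source's spread and proportion.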

Relevance: 30.00%

Abstract:

In this thesis, we investigate the role of applied physics in epidemiological surveillance through the application of mathematical models, network science, and machine learning. The spread of a communicable disease depends on many biological, social, and health factors. The large masses of data available make it possible, on the one hand, to monitor the evolution and spread of pathogenic organisms and, on the other, to study the behavior of people, their opinions, and their habits. Presented here are three lines of research in which we attempted to solve real epidemiological problems through data analysis and the use of statistical and mathematical models. In Chapter 1, we applied language-inspired deep learning models to transform influenza protein sequences into vectors encoding their information content. We then attempted to reconstruct the antigenic properties of different viral strains using regression models and to identify the mutations responsible for vaccine escape. In Chapter 2, we constructed a compartmental model to describe the spread of a bacterium within a hospital ward. The model was informed and validated on time series of clinical measurements, and a sensitivity analysis was used to assess the impact of different control measures. Finally, in Chapter 3, we reconstructed the network of retweets among COVID-19-themed Twitter users in the early months of the SARS-CoV-2 pandemic. By means of community detection algorithms and centrality measures, we characterized users' attention shifts in the network, showing that scientific communities, initially the most retweeted, lost influence over time to national political communities. In the Conclusion, we highlight the importance of this work in light of the main contemporary challenges for epidemiological surveillance.
In particular, we present reflections on the importance of nowcasting and forecasting, the relationship between data and scientific research, and the need to unite the different scales of epidemiological surveillance.
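The compartmental approach of Chapter 2 can be illustrated with a deliberately minimal ward-transmission model; the compartments, rates, ward size, and control scenario below are illustrative assumptions, not the thesis model:

```python
import numpy as np

def simulate(days=200, N=20, beta=0.08, gamma=0.05, C0=2):
    """Toy two-compartment ward model: patients are Susceptible or
    Colonized; transmission is density-dependent (beta * S * C / N) and
    colonization is cleared (or the patient replaced) at rate gamma."""
    S, C = float(N - C0), float(C0)
    traj = [C]
    for _ in range(days):
        new_col = beta * S * C / N      # new colonizations per day
        cleared = gamma * C             # clearance / patient turnover
        S += cleared - new_col
        C += new_col - cleared
        traj.append(C)
    return np.array(traj)

baseline = simulate()
# A control measure (e.g. improved hand hygiene) modeled as halving beta:
controlled = simulate(beta=0.04)
```

Comparing trajectories under perturbed parameters, as in the last two lines, is the simplest form of the sensitivity analysis mentioned above: here halving the transmission rate pushes the ward below the invasion threshold and colonization dies out instead of persisting.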

Relevance: 30.00%

Abstract:

Artificial Intelligence (AI) and Machine Learning (ML) are novel data analysis techniques providing very accurate prediction results. They are widely adopted in a variety of industries to improve efficiency and decision-making, and they are also being used to develop intelligent systems. Their success rests upon complex mathematical models whose decisions and rationale are usually difficult for human users to comprehend, to the point of being dubbed black boxes. This is particularly relevant in sensitive and highly regulated domains. To mitigate and possibly solve this issue, the Explainable AI (XAI) field has become prominent in recent years. XAI consists of models and techniques that enable understanding of the intricate patterns discovered by black-box models. In this thesis, we consider model-agnostic XAI techniques applicable to tabular data, with a particular focus on the credit scoring domain. Special attention is dedicated to the LIME framework, for which we propose several modifications to the vanilla algorithm, in particular: a pair of complementary Stability Indices that accurately measure LIME stability, and the OptiLIME policy, which helps the practitioner find the proper balance between the stability and reliability of explanations. We subsequently put forward GLEAMS, a model-agnostic surrogate interpretable model that needs to be trained only once while providing both local and global explanations of the black-box model. GLEAMS produces feature attributions and what-if scenarios, from both the dataset and the model perspective. Finally, we argue that synthetic data are an emerging trend in AI, being used more and more to train complex models in place of original data. To explain the outcomes of such models, we must guarantee that synthetic data are reliable enough for their explanations to translate to real-world individuals. To this end we propose DAISYnt, a suite of tests to measure the quality and privacy of synthetic tabular data.
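The idea behind measuring explanation stability can be illustrated with a minimal LIME-style experiment: fit a weighted local linear surrogate to a black-box model several times with independent perturbation samples, and check how much the feature attributions vary across runs. Everything below (the stand-in black-box model, the kernel, and the crude spread measure) is an illustrative sketch, not the thesis' actual Stability Indices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in black-box model (think: a credit-scoring classifier).
def black_box(X):
    return 1 / (1 + np.exp(-(2 * X[:, 0] - 1 * X[:, 1] + 0.5 * X[:, 2])))

def local_surrogate(x0, n=500, width=0.5):
    """LIME-style local explanation: sample around x0, weight samples by
    proximity, fit a weighted linear surrogate, return its coefficients."""
    X = x0 + rng.normal(0, width, (n, len(x0)))
    y = black_box(X)
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * width ** 2))
    A = np.column_stack([np.ones(n), X - x0]) * np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A, y * np.sqrt(w), rcond=None)
    return coef[1:]               # local feature attributions

x0 = np.array([0.2, -0.1, 0.4])
# Re-running the explanation probes its stability: attributions from
# independent sampling rounds should broadly agree.
runs = np.array([local_surrogate(x0) for _ in range(10)])
spread = runs.std(axis=0)         # crude per-feature instability measure
```

A stable explainer keeps both the signs and the rough magnitudes of the attributions consistent across rounds; large spread would signal that a single LIME run cannot be trusted at this point, which is the practical concern the Stability Indices formalize.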