960 results for Multiple Sources
Abstract:
Interspecific competition, life-history traits, environmental heterogeneity, spatial structure and disturbance are known to shape successful dispersal strategies in metacommunities. However, studies on the direction of these factors' impact on dispersal have yielded contradictory results and have often considered only a few competing dispersal strategies at a time. We used a unifying modeling approach to contrast the combined effects of species traits (adult survival, specialization), environmental heterogeneity and structure (spatial autocorrelation, habitat availability) and disturbance on the selected, maintained and coexisting dispersal strategies in heterogeneous metacommunities. Using a negative exponential dispersal kernel, we allowed both species dispersal distance and dispersal rate to vary. We showed that strong disturbance promotes species with high dispersal abilities, while low local adult survival and low habitat availability select against them. Spatial autocorrelation favors species with higher dispersal ability when adult survival and disturbance rate are low, and selects against them in the opposite situation. Interestingly, several dispersal strategies coexist when disturbance and adult survival act in opposition, for example when a strong disturbance regime favors species with high dispersal abilities while low adult survival selects for species with low dispersal. Our results unify apparently contradictory previous findings and demonstrate that spatial structure, disturbance and adult survival determine the success and diversity of coexisting dispersal strategies in competing metacommunities.
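As a concrete illustration of the dispersal machinery described above, the sketch below draws offspring dispersal distances from a negative exponential kernel by inverse-transform sampling and applies a dispersal rate; the function names and parameter values are illustrative, not taken from the study.

```python
import math
import random

def sample_dispersal_distance(mean_distance, rng=random):
    """Draw a distance from a negative exponential kernel
    f(d) = (1/alpha) * exp(-d/alpha) via inverse-transform sampling."""
    u = rng.random()
    return -mean_distance * math.log(1.0 - u)

def offspring_distances(rate, mean_distance, n, rng=random):
    """A strategy is a (dispersal rate, mean distance) pair: each offspring
    disperses with probability `rate`, travelling a kernel-drawn distance
    when it does, and stays put (distance 0) otherwise."""
    return [sample_dispersal_distance(mean_distance, rng)
            if rng.random() < rate else 0.0
            for _ in range(n)]

random.seed(1)
dists = offspring_distances(rate=0.5, mean_distance=2.0, n=10000)
movers = [d for d in dists if d > 0]
print(len(movers) / len(dists))   # close to the dispersal rate 0.5
print(sum(movers) / len(movers))  # close to the mean distance 2.0
```

Varying both parameters of such a kernel is what lets the model compare many competing strategies at once rather than just a few.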
Abstract:
Uncertainties as to the future supply costs of nonrenewable natural resources, such as oil and gas, raise the issue of the choice of supply sources. In a perfectly deterministic world, an efficient use of multiple sources of supply requires that any given market exhaust the supply it can draw from a low-cost source before moving on to a higher-cost one; supply sources should be exploited in strict sequence of increasing marginal cost, with a high-cost source being left untouched as long as a less costly source is available. We find that this may not be the efficient thing to do in a stochastic world. We show that there exist conditions under which it can be efficient to use a risky supply source in order to conserve a cheaper non-risky source. The benefit of doing so comes from the fact that it leaves open the possibility of using the cheaper source instead of the risky one in the event that the latter's future cost conditions suddenly deteriorate. There are also conditions under which it will be efficient to use a more costly non-risky source while a less costly risky source is still available. The reason is that this conserves the less costly risky source for use in the event of a possible future drop in its cost.
Abstract:
The first part of my work consisted of sampling conducted at nine localities on the Salento peninsula and in Apulia (Italy): Costa Merlata (BR), Punta Penne (BR), Santa Cesarea Terme (LE), Santa Caterina (LE), Torre Inserraglio (LE), Torre Guaceto (BR), Porto Cesareo (LE), Otranto (LE) and Isole Tremiti (FG). I collected percentage-cover data for species of the infralittoral rocky zone, using 50x50 cm quadrats. We considered 3 sites per locality and 10 randomly placed replicates per site. I then combined these data with data collected at the same places over several years, in order to carry out a spatial analysis. I thus started from a data set of 1896 samples, but decided not to treat time as a factor, because I have reason to think that over this period the anthropogenic stressors and their effects (if present) did not change considerably. The response variable I analysed is the percentage cover of 243 species (subsequently merged into 32 functional groups), including seaweeds, invertebrates, sediment and rock. After the sampling, I spent two months at the Hopkins Marine Station of Stanford University in Monterey (California, USA), in Fiorenza Micheli's laboratory, carrying out statistical analyses on my data set with the software PRIMER 6. My exploratory analysis started with an nMDS in PRIMER 6 on the original data matrix, leaving aside, for the moment, the effect of stressors. It shows a good separation between localities, confirming the result of an ANOSIM analysis conducted on the original data matrix. What can be stated is that the separation is not driven by a geographic pattern; something else must underlie the differences.
The presence of at least three groups is clear: one composed of Porto Cesareo, Torre Guaceto and Isole Tremiti (the only marine protected areas considered in this work); another of Otranto; and the last of the remaining small, impacted localities. Within the localities that include MPAs (Marine Protected Areas), a grouping between protected and control areas can also be observed. The SIMPER analysis shows that most of the species driving the differences between populations are not rare species, for example Cystoseira spp., Mytilus sp. and ECR. Moreover, I assigned discrete values (0, 1, 2) for each stressor to all the sites considered, according to the intensity with which the anthropogenic factor affects each locality. I then tried to establish whether there were significant interactions between stressors: using Spearman rank correlation and Spearman tables of significance, with 17 degrees of freedom, the outcome shows some significant stressor interactions. Next I built an nMDS with the stressors as the response variable. The result was positive: localities are well separated by stressors. Consequently, I related the 'localities and species' matrix to the 'localities and stressors' one. The stressor combination explains the variability within my populations at a good significance level. I tried all the possible data transformations (none, square root, fourth root, log(X+1), presence/absence), and the fourth root proved the best, giving the highest level of significance, meaning that rare species can also influence the result. The challenge will be to better characterize which kinds of stressors (including natural ones) act on the ecosystem, to assign them more accurate quantitative values, and to understand how they interact (additively or non-additively).
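The Spearman rank-correlation step used for the stressor interactions can be reproduced with a few lines of standard-library Python; the 0/1/2 stressor scores below are invented for the example, not the thesis data.

```python
def _ranks(values):
    """Average (mid) ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# hypothetical 0/1/2 scores for two stressors across nine sites
urbanisation = [0, 1, 2, 2, 0, 1, 2, 0, 1]
boat_traffic = [0, 1, 2, 1, 0, 2, 2, 0, 1]
print(round(spearman_rho(urbanisation, boat_traffic), 3))  # → 0.833
```

With discrete 0/1/2 scores ties are frequent, which is why the rank function averages ranks within tie groups rather than using the shortcut d²-formula.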
Abstract:
It is argued that the mineral dust found in Greenlandic ice cores during the Holocene stems from multiple source areas. Particles entrained above a more productive, primary source dominate the signal's multi-seasonal average. Data at sub-annual resolution, however, reveal at least one further source. Whereas distinct inputs from the primary source are visible as elevated concentration levels, the various inputs of the secondary source(s) are reflected in multiple maxima in the coarse particle percentage. As long as the dust sources' respective seasonal cycles are preserved, the primary and secondary sources can be distinguished. Since the particles eventually detected from the two sources differ in size, which can be attributed to a difference in atmospheric residence times, it is suggested that the secondary source lies in closer proximity to the drilling site than the primary one.
Abstract:
The primary aim of this dissertation is to develop data mining tools for knowledge discovery in biomedical data when multiple (homogeneous or heterogeneous) sources of data are available. The central hypothesis is that, when information from multiple sources of data is used appropriately and effectively, knowledge discovery can be achieved better than is possible from only a single source. Recent advances in high-throughput technology have enabled biomedical researchers to generate large volumes of diverse types of data on a genome-wide scale. These data include DNA sequences, gene expression measurements, and much more; they provide the motivation for building analysis tools to elucidate the modular organization of the cell. The challenges include efficiently and accurately extracting information from the multiple data sources, representing the information effectively, developing analytical tools, and interpreting the results in the context of the domain. The first part considers the application of feature-level integration to design classifiers that discriminate between soil types. The machine learning tools, SVM and KNN, were used to successfully distinguish between several soil samples. The second part considers clustering using multiple heterogeneous data sources. The resulting Multi-Source Clustering (MSC) algorithm was shown to perform better than clustering methods that use only a single data source or a simple feature-level integration of heterogeneous data sources. The third part proposes a new approach to effectively incorporate incomplete data into clustering analysis. Adapted from the K-means algorithm, the Generalized Constrained Clustering (GCC) algorithm makes use of incomplete data in the form of constraints to perform exploratory analysis. Novel approaches for extracting constraints were proposed. For sufficiently large constraint sets, the GCC algorithm outperformed the MSC algorithm.
The last part considers the problem of providing a theme-specific environment for mining multi-source biomedical data. The database, called PlasmoTFBM and focusing on gene regulation of Plasmodium falciparum, contains diverse information and has a simple interface that allows biologists to explore the data. It provided a framework for comparing different analytical tools for predicting regulatory elements and for designing useful data mining tools. The conclusion is that the experiments reported in this dissertation strongly support the central hypothesis.
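To make the constrained-clustering idea concrete, here is a minimal sketch of K-means with must-link constraints: points known to belong together are grouped and assigned to a centroid as a unit. This illustrates only the general "incomplete data as constraints" idea; it is not the published GCC algorithm, and all names and data are hypothetical.

```python
def constrained_kmeans(points, k, must_link, iters=20):
    """K-means over 2-D points where must-linked points always share a
    cluster: each constraint group goes to the centroid minimising the
    group's summed squared distance. Deterministic init: first k points."""
    # union-find to merge must-linked points into groups
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for a, b in must_link:
        parent[find(a)] = find(b)
    groups = {}
    for i in range(len(points)):
        groups.setdefault(find(i), []).append(i)

    centroids = [points[i] for i in range(k)]
    assign = [0] * len(points)
    for _ in range(iters):
        for g in groups.values():  # assign whole groups, not single points
            costs = [sum((points[i][0] - cx) ** 2 + (points[i][1] - cy) ** 2
                         for i in g) for cx, cy in centroids]
            best = costs.index(min(costs))
            for i in g:
                assign[i] = best
        for c in range(k):  # standard centroid update
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centroids[c] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return assign

pts = [(0, 0), (0.2, 0.1), (5, 5), (5.1, 4.9), (0.1, 0.2), (4.8, 5.2)]
labels = constrained_kmeans(pts, 2, must_link=[(1, 2)])
print(labels)  # point 1 travels with point 2 despite lying near the other cluster
```

The constraint overrides the unconstrained nearest-centroid rule, which is exactly the sense in which side information reshapes the exploratory clustering.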
Abstract:
Surveys can collect important data that inform policy decisions and drive social science research. Large government surveys collect information from the U.S. population on a wide range of topics, including demographics, education, employment, and lifestyle. Analysis of survey data presents unique challenges. In particular, one needs to account for missing data, for complex sampling designs, and for measurement error. Conceptually, a survey organization could devote substantial resources to obtaining high-quality responses from a simple random sample, resulting in survey data that are easy to analyze. However, this scenario often is not realistic. To address these practical issues, survey organizations can leverage the information available from other sources of data. For example, in longitudinal studies that suffer from attrition, they can use the information from refreshment samples to correct for potential attrition bias. They can use information from known marginal distributions or survey design to improve inferences. They can use information from gold standard sources to correct for measurement error.
This thesis presents novel approaches to combining information from multiple sources that address the three problems described above.
The first method addresses nonignorable unit nonresponse and attrition in a panel survey with a refreshment sample. Panel surveys typically suffer from attrition, which can lead to biased inference when analysis is based only on cases that complete all waves of the panel. Unfortunately, the panel data alone cannot inform the extent of the bias due to attrition, so analysts must make strong and untestable assumptions about the missing data mechanism. Many panel studies also include refreshment samples, which are data collected from a random sample of new individuals during some later wave of the panel. Refreshment samples offer information that can be utilized to correct for biases induced by nonignorable attrition while reducing reliance on strong assumptions about the attrition process. To date, these bias correction methods have not dealt with two key practical issues in panel studies: unit nonresponse in the initial wave of the panel and in the refreshment sample itself. As we illustrate, nonignorable unit nonresponse can significantly compromise the analyst's ability to use the refreshment samples for attrition bias correction. Thus, it is crucial for analysts to assess how sensitive their inferences, corrected for panel attrition, are to different assumptions about the nature of the unit nonresponse. We present an approach that facilitates such sensitivity analyses, both for suspected nonignorable unit nonresponse in the initial wave and in the refreshment sample. We illustrate the approach using simulation studies and an analysis of data from the 2007-2008 Associated Press/Yahoo News election panel study.
The second method incorporates informative prior beliefs about marginal probabilities into Bayesian latent class models for categorical data. The basic idea is to append synthetic observations to the original data such that (i) the empirical distributions of the desired margins match those of the prior beliefs, and (ii) the values of the remaining variables are left missing. The degree of prior uncertainty is controlled by the number of augmented records. Posterior inferences can be obtained via typical MCMC algorithms for latent class models, tailored to deal efficiently with the missing values in the concatenated data.
We illustrate the approach using a variety of simulations based on data from the American Community Survey, including an example of how augmented records can be used to fit latent class models to data from stratified samples.
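The augmentation idea in the second method can be sketched directly: each synthetic record carries a margin value drawn from the prior distribution and leaves every other field missing. Variable names and probabilities below are hypothetical, and this shows only the data-construction step, not the latent class MCMC.

```python
import random

def augment_with_margin_prior(data, var, prior_probs, n_aug, seed=0):
    """Append n_aug synthetic records whose value for `var` follows the
    prior marginal distribution and whose other fields are missing (None).
    More augmented records = a tighter (more confident) prior."""
    rng = random.Random(seed)
    fields = data[0].keys()
    levels = list(prior_probs)
    weights = [prior_probs[l] for l in levels]
    synth = []
    for _ in range(n_aug):
        rec = {f: None for f in fields}          # everything missing...
        rec[var] = rng.choices(levels, weights=weights)[0]  # ...except the margin
        synth.append(rec)
    return data + synth

survey = [{"educ": "BA", "employed": "yes"},
          {"educ": "HS", "employed": "no"}]
augmented = augment_with_margin_prior(
    survey, "educ", {"HS": 0.6, "BA": 0.4}, n_aug=1000)
educ_vals = [r["educ"] for r in augmented[len(survey):]]
print(educ_vals.count("HS") / 1000)  # near the prior margin 0.6
```

Because the synthetic records are missing everywhere except the margin, the same missing-data machinery the model already uses handles them with no special casing.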
The third method leverages the information from a gold standard survey to model reporting error. Survey data are subject to reporting error when respondents misunderstand the question or accidentally select the wrong response. Sometimes survey respondents knowingly select the wrong response, for example, by reporting a higher level of education than they actually have attained. We present an approach that allows an analyst to model reporting error by incorporating information from a gold standard survey. The analyst can specify various reporting error models and assess how sensitive their conclusions are to different assumptions about the reporting error process. We illustrate the approach using simulations based on data from the 1993 National Survey of College Graduates. We use the method to impute error-corrected educational attainments in the 2010 American Community Survey using the 2010 National Survey of College Graduates as the gold standard survey.
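A toy version of the gold-standard correction in the third method: estimate an over-report rate from linked gold-standard records, then impute corrected values in the survey. This binary, no-under-reporting setup is one deliberately simple choice of reporting error model, not the model used in the thesis; all data are simulated.

```python
import random

def error_model(gold_reported, gold_true):
    """From gold-standard linked records estimate (i) the over-report rate
    q = P(report degree | no degree) and (ii) true degree prevalence.
    Assumes no under-reporting -- one of many error models one could posit."""
    no_degree = [(r, t) for r, t in zip(gold_reported, gold_true) if t == 0]
    q = sum(r for r, _ in no_degree) / len(no_degree)
    prev = sum(gold_true) / len(gold_true)
    return q, prev

def impute_true(reported, q, prev, seed=0):
    """Flip each reported degree to 'no degree' with the posterior
    probability that it is false: q(1-prev) / (q(1-prev) + prev)."""
    rng = random.Random(seed)
    p_false = q * (1 - prev) / (q * (1 - prev) + prev)
    return [r if r == 0 else (0 if rng.random() < p_false else 1)
            for r in reported]

# toy gold standard: 30% hold degrees, 10% of non-holders over-report
rng = random.Random(1)
gold_true = [1 if rng.random() < 0.3 else 0 for _ in range(5000)]
gold_rep = [1 if t == 1 or rng.random() < 0.1 else 0 for t in gold_true]
q, prev = error_model(gold_rep, gold_true)
corrected = impute_true(gold_rep, q, prev)
print(round(q, 2), round(prev, 2))  # near the simulated 0.1 and 0.3
```

Repeating the imputation under different assumed error models is what supports the sensitivity analysis the abstract describes.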
Abstract:
Whereas numerical modeling using finite-element methods (FEM) can provide the transient temperature distribution in a component with sufficient accuracy, the development of compact dynamic thermal models that can be used for electrothermal simulation is of the utmost importance. While in most cases single power sources are considered, here we focus on the simultaneous presence of multiple sources. The thermal model takes the form of a thermal impedance matrix containing the thermal impedance transfer functions between two arbitrary ports. Each individual transfer function element Z_ij(s) is obtained from the analysis of the temperature transient at node i after a power step at node j. Different options for multiexponential transient analysis are detailed and compared. Among the options explored, small thermal models can be obtained by constrained nonlinear least squares (NLSQ) methods if the order is selected properly using validation signals. The methods are applied to the extraction of dynamic compact thermal models for a new ultrathin chip stack technology (UTCS).
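The amplitudes of a multiexponential (Foster-form) step response enter linearly once the time constants are fixed, which is what makes the transient analysis tractable. The sketch below recovers the amplitudes of a synthetic two-pole transient by ordinary linear least squares; it shows only this separable linear step, not the constrained NLSQ procedure of the paper, and all values are invented.

```python
import math

def step_response(t, amps, taus):
    """Foster-form thermal step response: Zth(t) = sum a_k (1 - exp(-t/tau_k))."""
    return sum(a * (1 - math.exp(-t / tau)) for a, tau in zip(amps, taus))

def fit_amplitudes(ts, zs, taus):
    """For fixed time constants the amplitudes are linear parameters,
    so they solve the (here 2x2) normal equations B^T B a = B^T z exactly."""
    basis = [[1 - math.exp(-t / tau) for tau in taus] for t in ts]
    s11 = sum(b[0] * b[0] for b in basis)
    s12 = sum(b[0] * b[1] for b in basis)
    s22 = sum(b[1] * b[1] for b in basis)
    r1 = sum(b[0] * z for b, z in zip(basis, zs))
    r2 = sum(b[1] * z for b, z in zip(basis, zs))
    det = s11 * s22 - s12 * s12
    return ((s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det)

# synthetic transient from a known two-pole model (arbitrary units)
true_amps, true_taus = (1.5, 0.8), (0.01, 0.5)
ts = [0.001 * 1.3 ** k for k in range(30)]  # log-spaced sample times
zs = [step_response(t, true_amps, true_taus) for t in ts]
fit = fit_amplitudes(ts, zs, true_taus)
print(fit)  # recovers (1.5, 0.8) up to floating-point rounding
```

A full NLSQ fit additionally searches over the time constants and, as the abstract notes, the model order itself must be chosen with validation signals to keep the compact model small.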
Abstract:
Giardia duodenalis is a flagellate protozoan that parasitizes humans and several other mammals. Protozoan contamination has been regularly documented at important environmental sites, although most of these studies were performed at the species level. There is a lack of studies that correlate environmental contamination and clinical infections in the same region. The aim of this study is to evaluate the genetic diversity of a set of clinical and environmental samples and to use the obtained data to characterize the genetic profile of the distribution of G. duodenalis and the potential for zoonotic transmission in a metropolitan region of Brazil. The genetic assemblages and subtypes of G. duodenalis isolates obtained from hospitals, a veterinary clinic, a day-care center and important environmental sites were determined via multilocus sequence-based genotyping using three unlinked gene loci. Cysts of Giardia were detected at all of the environmental sites. Mixed assemblages were detected in 25% of the total samples, and an elevated number of haplotypes was identified. The main haplotypes were shared among the groups, and new subtypes were identified at all loci. Ten multilocus genotypes were identified: 7 for assemblage A and 3 for assemblage B. There is persistent G. duodenalis contamination at important environmental sites in the city. The identified mixed assemblages likely represent mixed infections, suggesting high endemicity of Giardia in these hosts. Most Giardia isolates obtained in this study displayed zoonotic potential. The high degree of genetic diversity in the isolates obtained from both clinical and environmental samples suggests that multiple sources of infection are likely responsible for the detected contamination events. 
The finding that many multilocus genotypes (MLGs) and haplotypes are shared by different groups suggests that these sources of infection may be related and indicates that there is a notable risk of human infection caused by Giardia in this region.
Abstract:
An analysis of the dietary content of haematophagous insects can provide important information about the transmission networks of certain zoonoses. The present study evaluated the potential of polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) analysis of the mitochondrial cytochrome B (cytb) gene to differentiate between the vertebrate species identified as possible sources of sandfly meals. The complete cytb gene sequences of 11 vertebrate species available in the National Center for Biotechnology Information database were digested with Aci I, Alu I, Hae III and Rsa I restriction enzymes in silico using Restriction Mapper software. The cytb gene fragment (358 bp) was amplified from tissue samples of vertebrate species and from the dietary contents of sandflies, and digested with the restriction enzymes. Each vertebrate species presented a restriction fragment profile that differed from those of the other species, with the exception of Canis familiaris and Cerdocyon thous. The 358 bp fragment was identified in 76 sandflies. Of these, 10 were evaluated using the restriction enzymes, and food sources were predicted for four: Homo sapiens (1), Bos taurus (1) and Equus caballus (2). Thus, the PCR-RFLP technique is a potentially useful method for identifying the food sources of arthropods. However, some points regarding the applicability of the method must still be clarified, such as the extent of DNA degradation through intestinal digestion, the possibility of multiple blood-meal sources and the need for greater knowledge of intraspecific variations in mtDNA.
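The in silico digestion step can be illustrated with a small fragment-length calculator. Alu I (AG^CT), Hae III (GG^CC) and Rsa I (GT^AC) are blunt cutters that cleave mid-site; Aci I is omitted here because its cut is off-centre. The 22 bp sequence is invented for the example and is unrelated to the actual cytb fragment.

```python
def digest(seq, site, cut_offset):
    """Return fragment lengths after cutting `seq` at every occurrence of
    the recognition `site`, `cut_offset` bases into the site.
    Single-strand simplification; overlapping sites are all counted."""
    cuts, start = [], 0
    while True:
        i = seq.find(site, start)
        if i == -1:
            break
        cuts.append(i + cut_offset)
        start = i + 1
    bounds = [0] + cuts + [len(seq)]
    return [bounds[k + 1] - bounds[k] for k in range(len(bounds) - 1)]

# blunt cutters with mid-site cleavage: AluI AG^CT, HaeIII GG^CC, RsaI GT^AC
enzymes = {"AluI": ("AGCT", 2), "HaeIII": ("GGCC", 2), "RsaI": ("GTAC", 2)}

fragment = "TTAGCTAAGGCCTTTTAGCTAA"  # made-up 22 bp test sequence
for name, (site, off) in enzymes.items():
    print(name, digest(fragment, site, off))
```

Comparing each species' per-enzyme fragment-length lists is exactly how the RFLP profiles distinguish hosts; species whose lists coincide for all enzymes, like Canis familiaris and Cerdocyon thous here, remain indistinguishable.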
Abstract:
Sleep spindles are approximately 1-s bursts of 10-16 Hz activity that occur during stage 2 sleep. Spindles are highly synchronous across the cortex and thalamus in animals, and across the scalp in humans, implying correspondingly widespread and synchronized cortical generators. However, prior studies have noted occasional dissociations of the magnetoencephalogram (MEG) from the EEG during spindles, although detailed studies of this phenomenon have been lacking. We systematically compared high-density MEG and EEG recordings during naturally occurring spindles in healthy humans. As expected, EEG was highly coherent across the scalp, with consistent topography across spindles. In contrast, the simultaneously recorded MEG was not synchronous, but varied strongly in amplitude and phase across locations and spindles. Overall, average coherence between pairs of EEG sensors was approximately 0.7, whereas MEG coherence was approximately 0.3 during spindles. Whereas two principal components explained approximately 50% of EEG spindle variance, more than 15 were required for MEG. Each PCA component for MEG typically involved several widely distributed locations, which were relatively coherent with each other. These results show that, in contrast to current models based on animal experiments, multiple asynchronous neural generators are active during normal human sleep spindles and are visible to MEG. It is possible that these multiple sources may overlap sufficiently in different EEG sensors to appear synchronous. Alternatively, EEG recordings may reflect diffusely distributed synchronous generators that are less visible to MEG. An intriguing possibility is that MEG preferentially records from the focal core thalamocortical system during spindles, and EEG from the distributed matrix system.
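The coherence contrast reported above can be illustrated with synthetic spindles: sensor pairs of 13 Hz bursts with a fixed phase lag mimic the EEG-like case, and pairs with a random lag per spindle mimic the MEG-like case. This single-frequency, standard-library estimator is a toy, not the analysis pipeline of the study.

```python
import cmath
import math
import random

def fourier_coef(signal, freq, srate):
    """Complex Fourier coefficient of `signal` at `freq` Hz (single-bin DFT)."""
    n = len(signal)
    return sum(x * cmath.exp(-2j * math.pi * freq * k / srate)
               for k, x in enumerate(signal)) / n

def coherence(epochs_a, epochs_b, freq, srate):
    """Magnitude-squared coherence between two sensors at one frequency,
    estimated across epochs: |<A B*>|^2 / (<|A|^2> <|B|^2>)."""
    cross, pa, pb = 0, 0.0, 0.0
    for a, b in zip(epochs_a, epochs_b):
        ca, cb = fourier_coef(a, freq, srate), fourier_coef(b, freq, srate)
        cross += ca * cb.conjugate()
        pa += abs(ca) ** 2
        pb += abs(cb) ** 2
    return abs(cross) ** 2 / (pa * pb)

rng = random.Random(0)
srate, n = 200, 200  # 1-s epochs sampled at 200 Hz

def epoch(phase):
    """A noisy 13 Hz burst with the given phase offset."""
    return [math.sin(2 * math.pi * 13 * k / srate + phase) + 0.1 * rng.gauss(0, 1)
            for k in range(n)]

locked = [(epoch(0.0), epoch(0.4)) for _ in range(40)]                    # constant lag
jittery = [(epoch(0.0), epoch(rng.uniform(0, 2 * math.pi))) for _ in range(40)]
c_locked = coherence(*zip(*locked), 13, srate)
c_jittery = coherence(*zip(*jittery), 13, srate)
print(round(c_locked, 2), round(c_jittery, 2))  # high vs low, like EEG vs MEG
```

A constant phase lag leaves coherence high even though the signals are not identical, which is why coherence, not raw correlation, is the natural statistic for synchrony here.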
Abstract:
A new model has been developed for assessing multiple sources of nitrogen in catchments. The model (INCA) is process based and uses reaction kinetic equations to simulate the principal mechanisms operating. The model allows for plant uptake and for surface and sub-surface pathways, and can simulate up to six land uses simultaneously. It can be applied to a catchment as a semi-distributed simulation and has an inbuilt multi-reach structure for river systems. Sources of nitrogen can be atmospheric deposition, the terrestrial environment (e.g. agriculture, leakage from forest systems, etc.), urban areas, or direct discharges via sewage or intensive farm units. The model runs at a daily time step and can provide information as time series at key sites, as profiles down river systems, or as statistical distributions. The process model is described here, and in a companion paper the model is applied to the River Tywi catchment in South Wales and the Great Ouse in Bedfordshire.
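A toy analogue of the daily mass-balance idea (not the actual INCA equations): a single soil nitrate store fed by atmospheric deposition and a spring fertiliser pulse, and drained by first-order plant uptake and leaching. All rates and inputs below are invented.

```python
def simulate_nitrate(days, deposition, fertiliser, uptake_rate, leach_rate, n0=0.0):
    """Daily mass balance for one land-use soil nitrate store (kg N/ha):
    dN/dt = inputs - (uptake_rate + leach_rate) * N,
    stepped with a one-day explicit Euler scheme. Rates are per day."""
    store, n = [], n0
    for d in range(days):
        inputs = deposition[d] + fertiliser[d]
        n += inputs - (uptake_rate + leach_rate) * n
        store.append(n)
    return store

days = 365
deposition = [0.05] * days  # steady atmospheric input
fertiliser = [2.0 if 90 <= d < 100 else 0.0 for d in range(days)]  # spring application
series = simulate_nitrate(days, deposition, fertiliser,
                          uptake_rate=0.02, leach_rate=0.01)
print(round(max(series), 1))  # peak lands at the end of the fertiliser pulse
```

INCA couples several such kinetic stores per land use and routes the output through a multi-reach river structure; this sketch only shows the daily first-order bookkeeping that such a time series is built from.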