955 results for statistical spatial analysis


Relevance: 80.00%

Abstract:

The origins of early farming and its spread to Europe have been the subject of major interest for some time. The main controversy today concerns the nature of the Neolithic transition in Europe: the extent to which the spread was, for the most part, indigenous and animated by imitation (cultural diffusion) or driven by an influx of dispersing populations (demic diffusion). We analyze the spatiotemporal dynamics of the transition using radiocarbon dates from 735 early Neolithic sites in Europe, the Near East, and Anatolia. We compute great-circle and shortest-path distances from each site to 35 possible agricultural centers of origin: ten based on early sites in the Middle East and 25 hypothetical locations set at 5° latitude/longitude intervals. We perform a linear fit of distance versus age (and vice versa) for each center. For certain centers, high correlation coefficients (R > 0.8) are obtained, implying that a steady rate or speed is a good overall approximation for this historical development. The average rate of the Neolithic spread over Europe is 0.6–1.3 km/y (95% confidence interval), consistent with the prediction of demic diffusion (0.6–1.1 km/y). An interpolative map of correlation coefficients, obtained using shortest-path distances, shows that the origins of agriculture were most likely in the northern Levantine/Mesopotamian area.
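The distance-versus-age regression at the heart of this analysis can be sketched compactly. The code below is a minimal illustration, not the study's pipeline: a haversine great-circle distance feeds a least-squares fit of distance against age, whose slope magnitude estimates the front speed. All coordinates and dates in the usage example are hypothetical.

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Haversine great-circle distance (km) between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

def fit_rate(ages_yr, distances_km):
    """Least-squares slope of distance vs. age, plus the Pearson correlation r.
    |slope| approximates the spread rate in km/yr when |r| is high."""
    n = len(ages_yr)
    mx = sum(ages_yr) / n
    my = sum(distances_km) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(ages_yr, distances_km))
    sxx = sum((x - mx) ** 2 for x in ages_yr)
    syy = sum((y - my) ** 2 for y in distances_km)
    return sxy / sxx, sxy / math.sqrt(sxx * syy)
```

For a perfect front, e.g. sites dated 9000, 8000 and 7000 yr BP at 0, 1000 and 2000 km from a candidate center, the slope is -1.0 km/yr (distance grows as age decreases) and r = -1.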

Relevance: 80.00%

Abstract:

There is a concerted global effort to digitize biodiversity occurrence data from herbarium and museum collections, which together offer an unparalleled archive of life on Earth over the past few centuries. The Global Biodiversity Information Facility provides the largest single gateway to these data; since 2004 it has provided a single point of access to specimen data from databases of biological surveys and collections. Biologists now have rapid access to more than 120 million observations for use in many biological analyses. We investigate the quality and coverage of the digitally available data from the perspective of a biologist seeking distribution data for spatial analysis on a global scale. We present an example of automatic verification of geographic data, using distributions from the International Legume Database and Information Service to test empirically issues of geographic coverage and accuracy. There are over half a million records covering 31% of all legume species, and 84% of these records pass geographic validation. These data are not yet a global biodiversity resource for all species or all countries: a user will encounter many biases and gaps which should be understood before the data are used or analyzed, and the data are notably deficient in many of the world's biodiversity hotspots. These deficiencies in coverage can be resolved by an increased application of resources to digitize and publish data throughout these most diverse regions. But in the push to provide ever more data online, we should not forget that consistent data quality is of paramount importance if the data are to capture a meaningful picture of life on Earth.
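A minimal sketch of the kind of geographic validation described above, assuming only hypothetical country bounding boxes; the actual ILDIS-based check tests against expert-compiled species distributions, not the crude boxes below.

```python
# Hypothetical country bounding boxes (min_lat, max_lat, min_lon, max_lon);
# a production check would test against full country polygons, not boxes.
COUNTRY_BOUNDS = {
    "BR": (-33.8, 5.3, -73.9, -34.8),   # Brazil (approximate)
    "KE": (-4.7, 5.5, 33.9, 41.9),      # Kenya (approximate)
}

def passes_geographic_validation(record):
    """Reject records at the common (0, 0) error location, with an unknown
    country code, or with coordinates outside the claimed country's box."""
    lat, lon, country = record["lat"], record["lon"], record["country"]
    if (lat, lon) == (0.0, 0.0):
        return False
    bounds = COUNTRY_BOUNDS.get(country)
    if bounds is None:
        return False
    min_lat, max_lat, min_lon, max_lon = bounds
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon
```

A record claiming Brazil but carrying a longitude with the wrong sign, for instance, fails the box test even though the latitude is plausible, which is exactly the class of error such screens catch.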

Relevance: 80.00%

Abstract:

Nitrogen flows from European watersheds to coastal marine waters

Executive summary

Nature of the problem
• Most regional watersheds in Europe constitute managed human territories importing large amounts of new reactive nitrogen.
• As a consequence, groundwater, surface freshwater and coastal seawater are undergoing severe nitrogen contamination and/or eutrophication problems.

Approaches
• A comprehensive evaluation of net anthropogenic inputs of reactive nitrogen (NANI) through atmospheric deposition, crop N fixation, fertiliser use and import of food and feed has been carried out for all European watersheds. A database on N, P and Si fluxes delivered at the basin outlets has been assembled.
• A number of modelling approaches, based on either statistical regression analysis or mechanistic description of the processes involved in nitrogen transfer and transformations, have been developed for relating N inputs to watersheds to outputs into coastal marine ecosystems.

Key findings/state of knowledge
• Throughout Europe, NANI represents 3700 kgN/km2/yr (range 0–8400, depending on the watershed), i.e. five times the background rate of natural N2 fixation.
• A mean of approximately 78% of NANI does not reach the basin outlet, but instead is stored (in soils, sediments or groundwater) or eliminated to the atmosphere as reactive N forms or as N2.
• N delivery to the European marine coastal zone totals 810 kgN/km2/yr (range 200–4000, depending on the watershed), about four times the natural background. In areas of limited availability of silica, these inputs cause harmful algal blooms.

Major uncertainties/challenges
• The exact dimension of anthropogenic N inputs to watersheds is still imperfectly known and requires pursuing monitoring programmes and data integration at the international level.
• The exact nature of 'retention' processes, which potentially represent a major management lever for reducing N contamination of water resources, is still poorly understood.
• Coastal marine eutrophication depends to a large degree on local morphological and hydrographic conditions as well as on estuarine processes, which are also imperfectly known.

Recommendations
• Better control and management of the nitrogen cascade at the watershed scale is required to reduce N contamination of ground- and surface water, as well as coastal eutrophication.
• In spite of the potential of these management measures, there is no choice at the European scale but to reduce the primary inputs of reactive nitrogen to watersheds, through changes in agriculture, human diet and other N flows related to human activity.
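The key figures above are mutually consistent, as a one-line budget check shows; the figures are taken from the summary itself, and the function is purely illustrative, not part of the NANI methodology.

```python
def n_delivery(nani_kg_km2_yr, retention_fraction):
    """Reactive N reaching the basin outlet, given the net anthropogenic N
    input and the fraction retained in soils, sediments or groundwater or
    lost to the atmosphere."""
    return nani_kg_km2_yr * (1.0 - retention_fraction)

# European-average figures from the summary: NANI ~3700 kgN/km2/yr, ~78% retained.
delivered = n_delivery(3700, 0.78)  # ~814 kgN/km2/yr, close to the reported ~810
```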

Relevance: 80.00%

Abstract:

The regular use of mouthrinses, particularly when combined with air-powder polishing, could affect the appearance of tooth-colored restorations. The current study evaluated the effect of NaHCO3 powder on the translucency of a microfilled composite resin immersed in different mouthrinses, at distinct evaluation periods. Eighty disk-shaped specimens of composite resin (Durafill VS, Heraeus Kulzer GmbH & Co. KG, Hanau, Germany) were prepared. The specimens were randomly allocated into two groups according to surface treatment (exposure to NaHCO3 powder for 10 seconds, or no exposure), and then randomly assigned to four subgroups according to the mouthrinse employed (n = 10): Periogard (Colgate/Palmolive, Sao Bernardo do Campo, SP, Brazil), Cepacol (Aventis Pharma, Sao Paulo, SP, Brazil), Plax (Colgate/Palmolive), and distilled water (control group). The samples were immersed for 2 minutes daily, 5 days per week, over a 4-month test period. Translucency was measured with a transmission densitometer at seven evaluation periods. Statistical analyses (analysis of variance and Tukey's test) revealed that distilled water presented the highest translucency values (86.72%); Periogard demonstrated the lowest (72.70%); and Plax (74.05%) and Cepacol (73.32%) showed intermediate values that were statistically similar to each other (p > 0.01). NaHCO3 air-powder polishing increased the translucency changes associated with the mouthrinses, while air-powder polishing alone had no effect on material translucency. Translucency decreased gradually from 1 week of immersion up to 4 months. It may be concluded that the NaHCO3 powder and the tested mouthrinses affected the translucency of the microfilled composite resin over the tested periods.
CLINICAL SIGNIFICANCE: Over the last decade, the demand for composite resin restorations has grown considerably; however, controversy persists regarding the effect of surface roughness on color stability.
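The statistical design above, a one-way ANOVA followed by Tukey's test across four mouthrinse groups of n = 10, can be sketched as follows. The readings are simulated around the reported group means; the study's raw densitometer data are not given in the abstract.

```python
import numpy as np
from scipy import stats

# Hypothetical translucency readings (%) per mouthrinse subgroup (n = 10 each),
# simulated around the group means reported in the abstract.
rng = np.random.default_rng(0)
water = rng.normal(86.7, 1.0, 10)
periogard = rng.normal(72.7, 1.0, 10)
plax = rng.normal(74.0, 1.0, 10)
cepacol = rng.normal(73.3, 1.0, 10)

# One-way ANOVA: do the four group means differ overall?
f_stat, p_anova = stats.f_oneway(water, periogard, plax, cepacol)

# Tukey's HSD: which specific pairs of group means differ?
tukey = stats.tukey_hsd(water, periogard, plax, cepacol)
```

With means this far apart relative to the within-group spread, the omnibus test is overwhelmingly significant, and the pairwise Tukey p-values identify water versus each rinse as the clearly separated comparisons.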

Relevance: 80.00%

Abstract:

Tropical rainforests are becoming increasingly fragmented, and understanding the genetic consequences of fragmentation is crucial for conservation of their flora and fauna. We examined populations of the toad Rhinella ornata, a species endemic to the Atlantic Coastal Forest in Brazil, and compared genetic diversity among small and medium forest fragments that were either isolated or connected to large forest areas by corridors. Genetic differentiation, as measured by FST, was not related to geographic distance among study sites, and the size of the fragments did not significantly alter patterns of genetic connectivity. However, population genetic diversity was positively related to fragment size: haplotype diversity was lowest in the smallest fragments, likely due to decreases in population sizes. Spatial analyses of genetic discontinuities among groups of populations showed a higher proportion of barriers to gene flow among small and medium fragments than between populations in continuous forest. Our results underscore that even species with relatively high dispersal capacities may, over time, suffer the negative genetic effects of fragmentation, possibly leading to reduced fitness of populations and cases of localized extinction. (C) 2008 Elsevier Ltd. All rights reserved.
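Haplotype diversity, the fragment-size-dependent quantity reported above, is conventionally computed with Nei's estimator; a small sketch with made-up haplotype samples (the study's actual sequence data are not given in the abstract):

```python
from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's haplotype (gene) diversity: h = n/(n-1) * (1 - sum(p_i^2)),
    where p_i is the frequency of haplotype i in a sample of size n."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    return n / (n - 1) * (1.0 - sum_p2)

# Hypothetical samples: a large fragment with four distinct haplotypes versus
# a small fragment dominated by a single haplotype.
large_fragment = ["A", "B", "C", "D"]
small_fragment = ["A", "A", "A", "B"]
```

Here the all-distinct sample gives h = 1.0 and the dominated sample h = 0.5, mirroring the pattern of lower diversity in the smallest fragments.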

Relevance: 80.00%

Abstract:

Coq10p is a protein required for coenzyme Q function, but its specific role is still unknown. It is a member of the START domain superfamily, which contains a hydrophobic tunnel implicated in the binding of lipophilic molecules. We used site-directed mutagenesis, statistical coupling analysis and molecular modeling to probe structural determinants in the putative Coq10p tunnel. Four point mutations were generated (coq10-K50E, coq10-L96S, coq10-E105K and coq10-K162D) and their biochemical properties and structural consequences were analysed. Our results show that all mutations impaired Coq10p function and, together with molecular modeling, indicate an important role for the putative Coq10p tunnel. (C) 2010 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

Relevance: 80.00%

Abstract:

We examine different phenomenological interaction models for Dark Energy and Dark Matter by performing a statistical joint analysis with observational data from the 182 Gold type Ia supernova samples, the shift parameter of the Cosmic Microwave Background given by the three-year Wilkinson Microwave Anisotropy Probe observations, the baryon acoustic oscillation measurement from the Sloan Digital Sky Survey, and age estimates of 35 galaxies. Including the time-dependent observable adds sensitivity to the measurement and gives complementary results for the fitting. The compatibility among the three different data sets seems to imply that the coupling between dark energy and dark matter is a small positive value, which satisfies the requirement to solve the coincidence problem and the second law of thermodynamics, and is compatible with previous estimates. (c) 2008 Elsevier B.V. All rights reserved.
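A joint analysis of independent probes reduces to summing their chi-square contributions and minimizing over the model parameters. A generic sketch with hypothetical one-point data sets, not the actual supernova, CMB, BAO or galaxy-age likelihoods:

```python
import numpy as np

def joint_chi2(params, datasets):
    """Total chi-square over independent probes; each entry supplies observed
    values, their errors, and a model function of the parameters."""
    total = 0.0
    for obs, err, model in datasets:
        total += np.sum(((model(params) - obs) / err) ** 2)
    return total

# Hypothetical toy probes constraining a single parameter p[0]: each tuple is
# (observations, errors, model prediction).
datasets = [
    (np.array([1.0]), np.array([0.1]), lambda p: np.array([p[0]])),
    (np.array([1.2]), np.array([0.2]), lambda p: np.array([p[0]])),
]
```

Because the probes are independent, their chi-square terms simply add, and tighter error bars automatically give a probe more weight in the combined fit.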

Relevance: 80.00%

Abstract:

The reactions induced by the weakly bound 6Li projectile interacting with the intermediate-mass target 59Co were investigated. Light charged particle singles and alpha-d coincidence measurements were performed at the near-barrier energies E(lab) = 17.4, 21.5, 25.5 and 29.6 MeV. The main contributions of the different competing mechanisms are discussed. A statistical model analysis, Continuum-Discretized Coupled-Channels (CDCC) calculations and two-body kinematics were used as tools to provide information to disentangle the main components of these mechanisms. A significant contribution of the direct breakup was observed through the difference between the experimental sequential breakup cross section and the CDCC prediction for the non-capture breakup cross section. (C) 2009 Elsevier B.V. All rights reserved.

Relevance: 80.00%

Abstract:

Marajo Island is located in a passive continental margin that evolved from rifting associated with the opening of the Equatorial South Atlantic Ocean in the Late Jurassic/Early Cretaceous period. This study, based on remote sensing integrated with sedimentology, as well as subsurface and seismographic data available from the literature, allows discussion of the significance of tectonics during the Quaternary history of marginal basins. Results show that eastern Marajo Island contains channels with evidence of tectonic control. Mapping of straight channels defined four main groups of lineaments (i.e. NNE-SSW, NE-SW, NW-SE and E-W) that parallel main normal and strike-slip fault zones recorded for the Amazon region. Additionally, sedimentological studies of late Quaternary and Holocene deposits indicate numerous ductile and brittle structures within stratigraphic horizons bounded by undeformed strata, related to seismogenic deformation during or shortly after sediment deposition. This conclusion is consistent with subsurface Bouguer mapping suggestive of eastern Marajo Island being still part of the Marajo graben system, where important fault reactivation is recorded up to the Quaternary. Together with the recognition of several phases of fault reactivation, these data suggest that faults developed in association with rift basins might remain active in passive margins, imposing important control on development of depositional systems. Copyright (C) 2007 John Wiley & Sons, Ltd.

Relevance: 80.00%

Abstract:

In recent years, software clones and plagiarism have become an increasing threat to creativity. Clones are the result of copying and reusing others' work. According to the Merriam-Webster dictionary, "a clone is one that appears to be a copy of an original form"; it is a synonym for duplicate. Clones lead to redundancy of code, but not all redundant code is a clone. Given this background, and in order to safeguard ideas and to discourage intentional code duplication that passes off others' work as one's own, software clone detection deserves greater emphasis. The objective of this paper is to review methods for clone detection, to apply those methods to measure the extent of plagiarism among Master's-level computer science theses at Swedish universities, and to analyze the results. The remainder of the paper discusses software plagiarism detection using a data analysis technique, followed by statistical analysis of the results. Plagiarism is the act of stealing and passing off the ideas and words of another person as one's own. Using the data analysis technique, samples (Master's-level computer science thesis reports) were taken from various Swedish universities and processed with the Ephorus anti-plagiarism detection software. Ephorus reports the percentage of plagiarism for each thesis document; from these results, statistical analysis was carried out using Minitab software. The results give a very low percentage of plagiarism among the Swedish universities, which suggests that plagiarism is not a threat to Sweden's standard of education in computer science. This paper is based on data analysis, intelligence techniques, the Ephorus plagiarism detection tool and Minitab statistical analysis software.

Relevance: 80.00%

Abstract:

The idea for organizing a cooperative market on Waterville Main Street was proposed by Aime Schwartz in the fall of 2008. The Co-op would entail an open market located on Main Street to provide fresh, local produce and crafts to town locals. Through shorter delivery distances and agreements with local farmers, the Co-op will theoretically offer consumers lower prices on produce than can be found in conventional grocery stores, as well as an opportunity to support local agriculture. One of the tasks involved with organizing the Co-op is to source all of the produce from among the hundreds of farmers located in Maine. The purpose of this project is to show how Geographic Information System (GIS) tools can be used to help the Co-op and other businesses a) site nearby farms that carry desired produce and products, and b) determine which farms are closest to the business site. Using GIS for this purpose will make it easier and more efficient to source produce suppliers, and reduce the workload on business planners. GIS Network Analyst is a tool that provides network-based spatial analysis, and can be used in conjunction with traditional GIS technologies to determine not only the geometric distance between points, but also distance over existing networks (like roads). We used Network Analyst to find the closest produce suppliers to the Co-op for specific produce items, and to compute how far they are over existing roads. This will enable business planners to sort potential suppliers by distance before contacting individual farmers, allowing for more efficient use of their time and a faster planning process.
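Network Analyst's core computation, shortest-path distance over a road graph rather than straight-line distance, can be illustrated with a plain Dijkstra search. The mini road network below is hypothetical, not the Maine road data used in the project.

```python
import heapq

def road_distance(graph, start, goal):
    """Dijkstra's shortest-path distance over a road network given as
    {node: [(neighbor, km), ...]}: the network-based analogue of what a
    GIS network tool computes, versus plain geometric distance."""
    dist = {start: 0.0}
    visited = set()
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            return d
        for nbr, km in graph.get(node, []):
            nd = d + km
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")

# Hypothetical mini road network around the Co-op (edge weights in km).
roads = {
    "coop": [("junction", 3.0), ("farm_a", 10.0)],
    "junction": [("coop", 3.0), ("farm_a", 4.0), ("farm_b", 6.0)],
    "farm_a": [("junction", 4.0), ("coop", 10.0)],
    "farm_b": [("junction", 6.0)],
}
```

Note that the direct 10 km road to farm_a is not the shortest route: going via the junction totals 7 km, the kind of result a straight-line (geometric) comparison would miss.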

Relevance: 80.00%

Abstract:

The primary objective of this work is to obtain a model of CIO critical competencies that can be applied to Brazilian CIOs. The study was conducted as explanatory research with a quantitative approach. Theoretical studies were carried out to understand the CIO's environment, identity, career, and organizational relationships. The analysis was based on six models of CIO competencies that describe how this professional can achieve better performance; the intention of this approach was to provide a better understanding of the research problem. A meta-model was then constructed, along with a survey. Administered over the Internet, the survey had 111 valid respondents, all Brazilian CIOs. To obtain the final model, statistical factor analysis was applied to the answers. Each factor identified in the model corresponds to a critical competency for the Brazilian CIO. The model was submitted to hypothesis tests to establish the relation between each resulting factor and each respondent's time in the role, as well as his or her company size. This study yielded a critical-competencies model for the Brazilian CIO, associated with good performance in the role.
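The factor-extraction step can be sketched with a principal-component factoring of the item correlation matrix. The survey responses below are simulated, and the study's exact extraction and rotation choices are not stated in the abstract, so this is an unrotated illustration only.

```python
import numpy as np

def factor_loadings(responses, n_factors):
    """Principal-component factor extraction: eigendecompose the item
    correlation matrix and return loadings (eigenvectors scaled by the
    square root of their eigenvalues) for the top factors."""
    corr = np.corrcoef(responses, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)      # eigenvalues in ascending order
    top = np.argsort(eigvals)[::-1][:n_factors]  # indices of the largest ones
    return eigvecs[:, top] * np.sqrt(eigvals[top])

# Simulated survey: 111 respondents rating 6 competency items, driven by two
# latent competencies plus noise (only the sample size mirrors the study).
rng = np.random.default_rng(1)
latent = rng.normal(size=(111, 2))
items = latent @ rng.normal(size=(2, 6)) + 0.3 * rng.normal(size=(111, 6))
loadings = factor_loadings(items, n_factors=2)   # one column per factor
```

Each column of the loading matrix plays the role of one extracted factor; items loading strongly on the same column would be read together as one critical competency.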

Relevance: 80.00%

Abstract:

The main objective of this work is to obtain a model of critical competencies that can be applied to the Brazilian CIO. To this end, explanatory research was carried out using a quantitative approach. Theoretical studies were conducted on the CIO's environment, identity and career development, as well as on the relationship with the organization and with suppliers. For the analysis, this research considered a set of six models found in the international literature that describe the competencies needed for the CIO's professional performance, thereby seeking a better understanding of the research problem. From this, a meta-model of critical competencies was proposed and a questionnaire was drawn up as the research instrument used in this study. Administered over the Internet, the questionnaire obtained 111 valid respondents, all of them Brazilian CIOs. Once the data had been collected, statistical tests based on factor analysis were applied in order to obtain a definitive model. In the resulting model, each identified factor represents a critical competency for the Brazilian CIO. Hypotheses were also tested against the identified model, determining the relation between the importance attributed to the resulting factors and each respondent's time working as a CIO, as well as the size of the companies in which he or she has worked. As a result of the study, a model of critical competencies applicable to the Brazilian CIO, associated with good performance of the role's duties, was established.

Relevance: 80.00%

Abstract:

This work aims to obtain and empirically verify a meta-model that can support and deepen the understanding of the phenomenon of resistance to information systems. It is explanatory, quantitative research in which, through an extensive review of the world literature, the main existing theories and models on the subject are surveyed and consolidated. Thus, seeking a better understanding of the research problem, a meta-model of factors relevant to resistance behavior toward information systems is proposed. This model considers a set of aspects that, although addressed before, had for the most part not yet been tested empirically, namely: (i) the idiosyncratic characteristics of individuals; (ii) the technical aspects inherent to information systems; (iii) the characteristics of socio-technical interaction; (iv) the characteristics of power and political interaction; and, finally, (v) the characteristics of the organizations in which technology and people are embedded and interact. The research instrument used in the work was a structured questionnaire, administered via the Internet, with its questions contextualized to ERP (Enterprise Resource Planning) systems. A total of 169 respondents was obtained, from a sample composed exclusively of Brazilian information technology (IT) managers who had experienced at least one ERP implementation during their careers. Once the data had been collected, statistical tests based on factor analysis were employed in order to arrive at a definitive model. In the new model, validated by the factor analysis, each identified factor represented a cause of resistance behavior toward information systems. Finally, hypotheses were also tested against the new model, verifying the relations between managers' direct perception of resistance and the various factors considered relevant to explaining this behavior. As a result of the study, a model for analyzing resistance behavior toward information systems was consolidated, based on the IT manager's perception and contextualized in ERP systems.