849 results for Exclusion process, Multi-species, Multi-scale modelling


Relevance: 60.00%

Abstract:

Until now, high-resolution mapping of surface water extent from satellites has been available only for a few regions, over limited time periods. Extending the temporal and spatial coverage has been difficult because of the limitations of each remote sensing technique (e.g., the interaction of the radiation with vegetation or clouds for visible observations, or the temporal sampling of synthetic aperture radar (SAR)). The advantages and limitations of the various satellite techniques are reviewed. The need for a global, consistent estimate of water surfaces over long time periods triggered the development of a multi-satellite methodology to obtain consistent surface water estimates over the whole globe, regardless of the environment. The Global Inundation Extent from Multi-satellites (GIEMS) combines the complementary strengths of satellite observations from the visible to the microwave to produce a low-resolution monthly dataset of surface water extent and dynamics. Downscaling algorithms are now being developed and applied to GIEMS, using high-spatial-resolution information from visible, near-infrared, and SAR satellite images, or from digital elevation models. Preliminary products are available down to 500-m spatial resolution. This work bridges the gaps and prepares for the future NASA/CNES Surface Water Ocean Topography (SWOT) mission, to be launched in 2020. SWOT will deliver estimates of surface water extent and water storage with unprecedented spatial resolution and accuracy, thanks to a SAR operating in interferometric mode. When available, the SWOT data will be used to downscale GIEMS, producing a long time series of water surfaces at global scale, consistent with the SWOT observations.
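The general idea of downscaling a coarse inundation product with high-resolution auxiliary data can be sketched as follows. This is only an illustration of the redistribution principle, not the published GIEMS downscaling algorithm: a coarse cell's inundated fraction is spread over its fine-resolution pixels in proportion to a hypothetical suitability weight (e.g., derived from a digital elevation model), while conserving the total water area of the cell.

```python
def downscale_cell(coarse_fraction, weights):
    """Return per-pixel water fractions whose mean equals coarse_fraction.

    coarse_fraction: inundated fraction of the coarse cell (0..1).
    weights: relative flooding suitability of each fine pixel (e.g., lower
             elevation -> higher weight); need not be normalised.
    """
    total = sum(weights)
    if total == 0:  # no auxiliary information: spread water uniformly
        return [coarse_fraction] * len(weights)
    n = len(weights)
    # Scale weights so their mean equals the coarse fraction; clip to [0, 1].
    raw = [coarse_fraction * w * n / total for w in weights]
    clipped = [min(f, 1.0) for f in raw]
    # Redistribute any water lost to clipping among unsaturated pixels,
    # so the coarse-cell water area is conserved.
    deficit = sum(r - c for r, c in zip(raw, clipped))
    while deficit > 1e-12:
        room = [(i, 1.0 - f) for i, f in enumerate(clipped) if f < 1.0]
        if not room:
            break
        share = deficit / len(room)
        deficit = 0.0
        for i, space in room:
            add = min(share, space)
            clipped[i] += add
            deficit += share - add
    return clipped

# Four fine pixels inside one coarse cell that is 25% inundated.
fine = downscale_cell(0.25, weights=[4.0, 2.0, 1.0, 1.0])
# The mean of `fine` equals 0.25: water area is conserved.
```

The clipping-and-redistribution loop matters whenever the weighted allocation would assign more than 100% water to a single fine pixel.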

Relevance: 60.00%

Abstract:

This short communication presents our recent studies implementing numerical simulations of multi-phase flows on top-ranked supercomputer systems with distributed-memory architecture. The numerical model is designed to make full use of the capacity of the hardware. Satisfactory scalability, in terms of both the parallel speed-up rate and the size of the problem, has been obtained on two highly ranked massively parallel systems: the Earth Simulator (Earth Simulator Research Center, Yokohama, Kanagawa, Japan) and the TSUBAME (Tokyo Institute of Technology, Tokyo, Japan) supercomputers.
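The two scalability measures mentioned, speed-up rate and parallel efficiency, follow the standard definitions S(p) = T(1)/T(p) and E(p) = S(p)/p. A minimal bookkeeping sketch (the timings below are hypothetical, not figures from this study):

```python
def speedup_and_efficiency(t1, timings):
    """Strong-scaling metrics from wall-clock times.

    t1:      wall time on one processor.
    timings: {processor_count: wall time} for the same fixed problem size.
    Returns  {processor_count: (speed-up, efficiency)}.
    """
    return {p: (t1 / tp, t1 / (tp * p)) for p, tp in timings.items()}

# Hypothetical wall-clock times (seconds) for a fixed problem size.
metrics = speedup_and_efficiency(100.0, {8: 14.0, 64: 2.0, 512: 0.4})
# e.g. at 64 processors: speed-up 50.0, efficiency 0.78125
```

Weak scaling (the "size of the problem" axis) would instead hold the work per processor fixed and compare wall times directly as the processor count grows.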

Relevance: 60.00%

Abstract:

In use since ancient Greece and now widespread in most countries of the world, the traditional voting system based on paper ballots has several security problems, such as the difficulty of preventing voter coercion, vote selling, and fraudulent voter impersonation. It also has usability problems that lead to ballot-filling errors and a slow tallying process that can last for days. Moreover, the traditional system provides no vote receipt that would allow voters to verify that their vote was correctly counted in the tally. It was initially believed that computerizing the voting system would solve all the problems of the traditional one. However, after its deployment in some countries, electronic voting proved unable to provide irrefutable guarantees that it had not been the target of fraudulent alterations during its development or operation. The bad reputation of electronic systems is mainly associated with the lack of transparency of processes that, in most cases, neither materialize the vote, as checked by the voter for manual-count purposes, nor generate evidence (a receipt) that the voter's ballot was correctly counted. The goal of this work is to propose an electronic voting architecture that securely integrates voter anonymity and authenticity with the confidentiality and integrity of the vote and of the system. The system improves the usability of the paper-based "ThreeBallot" ("Três Cédulas") voting scheme by implementing it computationally. The scheme lends greater credibility to the voting system through vote materialization and receipts, and through resistance to coercion and vote selling. Using asymmetric cryptography and classical computational security, together with an efficient auditing system, the proposal guarantees security and transparency in the processes involved.
The modular architecture distributes responsibility among its entities, making the system robust and enabling large-scale elections. A system prototype built with web services and the Election Markup Language demonstrates the feasibility of the proposal.
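The receipt ("contraprova") idea at the heart of verifiable voting can be illustrated with a hash commitment. This is only a sketch of the general mechanism, not the ThreeBallot protocol or the thesis's actual architecture: the system publishes commit(vote, nonce) on a public bulletin board and hands the voter the nonce and commitment as a receipt, so the voter can later confirm that their commitment appears on the board without the board revealing the vote.

```python
import hashlib
import secrets

def commit(vote: str, nonce: bytes) -> str:
    """Hash commitment binding a vote to a random nonce."""
    return hashlib.sha256(nonce + vote.encode()).hexdigest()

bulletin_board = []  # publicly posted commitments

def cast(vote: str):
    """Record the commitment publicly; return the voter's private receipt."""
    nonce = secrets.token_bytes(16)  # hides the vote from brute-force guessing
    c = commit(vote, nonce)
    bulletin_board.append(c)
    return nonce, c

# "candidate-42" is a placeholder ballot choice for illustration.
receipt_nonce, receipt_hash = cast("candidate-42")

# Verification: the voter recomputes the commitment and checks the board.
assert commit("candidate-42", receipt_nonce) == receipt_hash
assert receipt_hash in bulletin_board
```

A real system layers much more on top (threshold decryption for tallying, signatures for authenticity, coercion resistance), which is exactly the integration the proposed architecture addresses.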

Relevance: 60.00%

Abstract:

Market squid (Loligo opalescens) plays a vital role in the California ecosystem and serves as a major link in the food chain as both a predator and a prey species. For over a century, market squid has also been harvested off the California coast from Monterey to San Pedro. In recent years, expanding global markets, coupled with a decline in squid product from other parts of the world, have fueled rapid expansion of the virtually unregulated California fishery. The lack of regulatory management, combined with dramatic increases in fishing effort and landings, has raised numerous concerns from the scientific, fishing, and regulatory communities. In an effort to address these concerns, the National Oceanic and Atmospheric Administration's (NOAA) Channel Islands National Marine Sanctuary (CINMS) hosted a panel discussion at the October 1997 California Cooperative Oceanic Fisheries Investigations (CalCOFI) Conference; it focused on ecosystem management implications for the burgeoning market squid fishery. Both panel and audience members addressed issues such as: the direct and indirect effects of commercial harvesting upon squid biomass; the effects of harvest and the role of squid in the broader marine community; the effects of environmental variation on squid population dynamics; the sustainability of the fishery from the point of view of both scientists and the fishers themselves; and the conservation management options for what is currently an open-access and unregulated fishery. Herein are the key points of the ecosystem management panel discussion, in the form of a preface, an executive summary, and a transcript. (PDF contains 33 pages.)

Relevance: 60.00%

Abstract:

In the measurement of the Higgs boson decaying into two photons, the parametrization of an appropriate background model is essential for fitting the Higgs signal mass peak over a continuous background. This diphoton background modeling is crucial in the statistical process of calculating exclusion limits and the significance of observations in comparison to a background-only hypothesis. It is therefore ideal to obtain knowledge of the physical shape of the background mass distribution, as the use of an improper function can bias the observed limits. Using an Information-Theoretic (I-T) approach for valid inference, we apply the Akaike Information Criterion (AIC) as a measure of a fitting model's separation from the data. We then implement a multi-model inference ranking method to build a fit model that most closely represents the Standard Model background in 2013 diphoton data recorded by the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC). Potential applications and extensions of this model-selection technique are discussed with reference to CMS detector performance measurements as well as potential physics analyses at future detectors.
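The AIC-based multi-model ranking described above can be sketched in a few lines. AIC = 2k − 2 ln L̂ penalizes each candidate shape by its parameter count k; models are ranked by ΔAIC relative to the best model and combined via Akaike weights. The candidate names, parameter counts, and log-likelihoods below are hypothetical placeholders, not the CMS fit results:

```python
import math

def akaike_table(models):
    """Rank candidate models by AIC.

    models: {name: (n_params, max_log_likelihood)}
    Returns rows (name, AIC, delta_AIC, Akaike weight) sorted best-first.
    """
    aic = {name: 2 * k - 2 * ll for name, (k, ll) in models.items()}
    best = min(aic.values())
    delta = {name: a - best for name, a in aic.items()}
    norm = sum(math.exp(-d / 2) for d in delta.values())
    rows = [(n, aic[n], delta[n], math.exp(-delta[n] / 2) / norm)
            for n in models]
    return sorted(rows, key=lambda r: r[1])

# Hypothetical background shapes for the diphoton mass spectrum.
table = akaike_table({
    "power_law":  (2, -1520.0),
    "exp_poly2":  (3, -1518.5),
    "bernstein4": (5, -1517.9),
})
# The best model has delta_AIC = 0; the weights sum to 1 and quantify
# each model's relative support given the data.
```

Ranking by ΔAIC, rather than picking a single "true" shape, is what makes this an I-T multi-model inference rather than a plain goodness-of-fit choice.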