927 results for peer-to-peer (P2P) computing


Relevance: 40.00%

Publisher:

Abstract:

Coastal managers need accessible, trusted, tailored resources to help them interpret climate information, identify vulnerabilities, and apply climate information to decisions about adaptation on regional and local levels. For decades, climate scientists have studied the impacts that short-term natural climate variability and long-term climate change will have on coastal systems. For example, recent estimates based on Intergovernmental Panel on Climate Change (IPCC) warming scenarios suggest that global sea levels may rise 0.5 to 1.4 meters above 1990 levels by 2100 (Rahmstorf 2007; Grinsted, Moore, and Jevrejeva 2009). Many low-lying coastal ecosystems and communities will experience more frequent saltwater intrusion events, more frequent coastal flooding, and accelerated erosion rates before they experience significant inundation. These changes will affect the ways coastal managers make decisions, such as timing surface and groundwater withdrawals, replacing infrastructure, and planning for changing land use on local and regional levels. Despite its potential value for these decisions, managers' use of scientific information about climate variability and change remains limited in environmental decision-making (Dow and Carbone 2007). The traditional methods scientists use to disseminate climate information, such as peer-reviewed journal articles and conference presentations, are ill suited to decision-makers' needs for accessible, relevant climate information they can apply to their decisions. General guides that help managers scope out vulnerabilities and risks are becoming more common; for example, Snover et al. (2007) outlines a basic process for local and state governments to assess climate change vulnerability and preparedness. However, few tools are available to support more specific decision-making needs. A recent survey of coastal managers in California suggests that boundary institutions can help to fill the gaps between climate science and the coastal decision-making community (Tribbia and Moser 2008). The National Sea Grant College Program, the National Oceanic and Atmospheric Administration's (NOAA) university-based program for supporting research and outreach on coastal resource use and conservation, is one such institution working to bridge these gaps through outreach. Over 80% of Sea Grant's 32 programs are addressing climate issues, and over 60% of programs increased their climate outreach programming between 2006 and 2008 (National Sea Grant Office 2008). One way that Sea Grant is working to assist coastal decision-makers with using climate information is by developing effective methods for coastal climate extension. The purpose of this paper is to discuss climate extension methodologies on regional scales, using the Carolinas Coastal Climate Outreach Initiative (CCCOI) as an example of Sea Grant's growing capacities for climate outreach and extension. (PDF contains 3 pages)

Relevance: 40.00%

Publisher:

Abstract:

This thesis examines the collapse risk of tall steel braced frame buildings using rupture-to-rafters simulations of a suite of San Andreas fault earthquakes. Two key advancements in this work are the development of (i) a rational methodology for assigning scenario earthquake probabilities and (ii) an approach to broadband ground motion simulation that is free of artificial corrections. The work can be divided into the following sections: earthquake source modeling, earthquake probability calculations, ground motion simulations, building response, and performance analysis.

As a first step, kinematic source inversions of past earthquakes in the magnitude range 6-8 are used to simulate 60 scenario earthquakes on the San Andreas fault. For each scenario earthquake, a 30-year occurrence probability is calculated, and we present a rational method for redistributing the forecast earthquake probabilities from the Uniform California Earthquake Rupture Forecast (UCERF) to the simulated scenario earthquakes. We illustrate the inner workings of the method through an example involving earthquakes on the San Andreas fault in southern California.
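The probability redistribution step can be pictured with a short sketch. The weighting below, based on hypocenter proximity and closeness in magnitude, is an illustrative assumption rather than the thesis's exact scheme, and all names and parameters are hypothetical.

    # Minimal sketch: redistribute UCERF 30-year rupture probabilities onto the
    # simulated scenario earthquakes. The proximity/magnitude weighting is an
    # illustrative assumption, not the thesis's exact method.
    import numpy as np

    def redistribute_probabilities(ucerf_probs, ucerf_locs, ucerf_mags,
                                   scen_locs, scen_mags, d0_km=50.0, dm0=0.5):
        """Split each forecast rupture's probability among the scenario
        earthquakes in proportion to a proximity-and-magnitude weight."""
        scen_locs = np.asarray(scen_locs, dtype=float)
        scen_mags = np.asarray(scen_mags, dtype=float)
        scen_probs = np.zeros(len(scen_locs))
        for p, loc, mag in zip(ucerf_probs, ucerf_locs, ucerf_mags):
            d = np.linalg.norm(scen_locs - np.asarray(loc, dtype=float), axis=1)
            w = np.exp(-d / d0_km) * np.exp(-np.abs(scen_mags - mag) / dm0)
            scen_probs += p * w / w.sum()   # nearby scenarios of similar size dominate
        return scen_probs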

Next, three-component broadband ground motion histories are computed at 636 sites in the greater Los Angeles metropolitan area by superposing short-period (0.2 s-2.0 s) empirical Green's function synthetics on long-period (> 2.0 s) seismograms computed from kinematic source models using the spectral element method, producing broadband seismograms.
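The superposition admits a compact illustration: low-pass the spectral element synthetic and high-pass the empirical Green's function synthetic at the 2.0 s crossover period, then sum. The Butterworth matched filter pair below is an assumed choice for illustration only.

    # Minimal sketch: combine long-period (> 2 s) spectral element synthetics with
    # short-period empirical Green's function synthetics into one broadband trace.
    # The 4th-order Butterworth filters at the 0.5 Hz crossover are assumptions.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def combine_broadband(long_period, short_period, dt, crossover_hz=0.5):
        fs = 1.0 / dt
        b_lo, a_lo = butter(4, crossover_hz, btype="low", fs=fs)
        b_hi, a_hi = butter(4, crossover_hz, btype="high", fs=fs)
        return filtfilt(b_lo, a_lo, long_period) + filtfilt(b_hi, a_hi, short_period)

    # Example with placeholder traces sampled at 100 Hz.
    dt = 0.01
    t = np.arange(0.0, 60.0, dt)
    lp = np.sin(2 * np.pi * 0.2 * t)        # stands in for a spectral element synthetic
    sp = 0.1 * np.random.randn(t.size)      # stands in for an EGF synthetic
    bb = combine_broadband(lp, sp, dt)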

Using the ground motions at the 636 sites for the 60 scenario earthquakes, 3-D nonlinear analyses of several variants of an 18-story steel braced frame building, designed for three soil types under the 1994 and 1997 Uniform Building Code provisions, are conducted. Model performance is classified into one of five performance levels: Immediate Occupancy, Life Safety, Collapse Prevention, Red-Tagged, and Model Collapse. These results are combined with the 30-year probabilities of occurrence of the San Andreas scenario earthquakes, using the PEER performance-based earthquake engineering framework, to determine the probability of exceedance of these limit states over the next 30 years.
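The final combination step can be sketched compactly: if each scenario earthquake is treated as an independent event with its redistributed 30-year occurrence probability, the 30-year exceedance probability for a performance level follows from the scenarios whose simulated response exceeds that level. The independence assumption is an illustrative simplification of the framework calculation.

    # Minimal sketch: 30-year probability that a building at one site exceeds a
    # given performance level, treating the scenario earthquakes as independent
    # events (an illustrative simplification).
    import numpy as np

    def prob_exceed_30yr(event_probs, exceeds_level):
        """event_probs: 30-year occurrence probability of each scenario earthquake.
        exceeds_level: True where the simulated response exceeds the level."""
        event_probs = np.asarray(event_probs, dtype=float)
        exceeds_level = np.asarray(exceeds_level, dtype=bool)
        p_none = np.prod(1.0 - event_probs[exceeds_level])  # no exceeding event occurs
        return 1.0 - p_none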

Relevance: 40.00%

Publisher:

Abstract:

Credible source models from large-magnitude past earthquakes are scarce. A stochastic source model generation algorithm thus becomes necessary for robust risk quantification using scenario earthquakes. We present an algorithm that combines the physics of fault ruptures, as imaged in laboratory earthquakes, with stress estimates on the fault constrained by field observations to generate stochastic source models for large-magnitude (Mw 6.0-8.0) strike-slip earthquakes. The algorithm is validated through a statistical comparison of synthetic ground motion histories from a stochastically generated source model for a magnitude 7.9 earthquake and a kinematic finite-source inversion of an equivalent-magnitude past earthquake on a geometrically similar fault. The synthetic dataset comprises three-component ground motion waveforms, computed at 636 sites in southern California, for ten hypothetical rupture scenarios (five hypocenters, each with two rupture directions) on the southern San Andreas fault. A similar validation exercise is conducted for a magnitude 6.0 earthquake, the lower magnitude limit for the algorithm. Additionally, ground motions from the Mw 7.9 earthquake simulations are compared against predictions by the Campbell-Bozorgnia NGA relation as well as the ShakeOut scenario earthquake. The algorithm is then applied to generate fifty source models for a hypothetical magnitude 7.9 earthquake originating at Parkfield, with rupture propagating from north to south (towards Wrightwood), similar to the 1857 Fort Tejon earthquake. Using the spectral element method, three-component ground motion waveforms are computed in the Los Angeles basin for each scenario earthquake, and the sensitivity of ground shaking intensity to seismic source parameters (such as the percentage of asperity area relative to the fault area, rupture speed, and rise time) is studied.
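The comparison against an NGA relation can be sketched as a residual analysis in natural-log space. The attenuation form below is a placeholder standing in for the Campbell-Bozorgnia relation, which is not reproduced here; the bias and dispersion of the residuals summarize how the simulations compare with the empirical prediction.

    # Minimal sketch: residuals of simulated PGVs against a GMPE median in ln space.
    # ln_median_pgv is a placeholder, not the actual Campbell-Bozorgnia NGA relation.
    import numpy as np

    def ln_median_pgv(magnitude, distance_km):
        # Illustrative attenuation form only.
        return 2.0 + 1.0 * (magnitude - 6.0) - 1.3 * np.log(distance_km + 10.0)

    def gmpe_residuals(sim_pgv_cm_s, magnitude, distances_km):
        resid = np.log(np.asarray(sim_pgv_cm_s)) - ln_median_pgv(magnitude, np.asarray(distances_km))
        return resid.mean(), resid.std(ddof=1)   # bias and dispersion of the simulations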

Under plausible San Andreas fault earthquakes in the next 30 years, modeled using the stochastic source algorithm, the performance of two 18-story steel moment frame buildings (UBC 1982 and 1997 designs) in southern California is quantified. The approach integrates rupture-to-rafters simulations into the PEER performance-based earthquake engineering (PBEE) framework. Using stochastic sources and computational seismic wave propagation, three-component ground motion histories at 636 sites in southern California are generated for sixty scenario earthquakes on the San Andreas fault. The ruptures, with moment magnitudes in the range of 6.0-8.0, are assumed to occur at five locations on the southern section of the fault. Two unilateral rupture propagation directions are considered. The 30-year probabilities of all plausible ruptures in this magnitude range and in that section of the fault, as forecast by the United States Geological Survey, are distributed among these 60 earthquakes based on proximity and moment release. The response of the two 18-story buildings, hypothetically located at each of the 636 sites, under three-component shaking from all 60 events is computed using 3-D nonlinear time-history analysis. Using these results, the probability of the structural response exceeding the Immediate Occupancy (IO), Life Safety (LS), and Collapse Prevention (CP) performance levels under San Andreas fault earthquakes over the next 30 years is evaluated.

Furthermore, the conditional and marginal probability distributions of peak ground velocity (PGV) and peak ground displacement (PGD) in Los Angeles and the surrounding basins, due to earthquakes occurring primarily on the mid-section of the southern San Andreas fault, are determined using Bayesian model class identification. Simulated ground motions at sites located 55-75 km from the source, from a suite of 60 earthquakes (Mw 6.0-8.0) rupturing primarily the mid-section of the San Andreas fault, are used as the PGV and PGD data.
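Bayesian model class identification can be sketched with a BIC approximation to the model evidence: candidate probability models for the PGV data are fit by maximum likelihood and ranked by posterior model probability under equal prior plausibilities. The candidate distributions below are illustrative assumptions; the study's actual model classes are not listed in this abstract.

    # Minimal sketch: rank candidate distributions for PGV data by posterior model
    # probability, using the BIC approximation to the log evidence and equal priors.
    import numpy as np
    from scipy import stats

    def posterior_model_probs(data, candidates=("lognorm", "gamma", "weibull_min")):
        data = np.asarray(data, dtype=float)
        n = data.size
        log_ev = []
        for name in candidates:
            dist = getattr(stats, name)
            params = dist.fit(data, floc=0)            # MLE fit, location fixed at zero
            loglik = dist.logpdf(data, *params).sum()
            k = len(params) - 1                        # free parameters (loc is fixed)
            log_ev.append(-0.5 * (k * np.log(n) - 2.0 * loglik))
        log_ev = np.array(log_ev)
        w = np.exp(log_ev - log_ev.max())              # guard against underflow
        return dict(zip(candidates, w / w.sum()))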

Relevance: 40.00%

Publisher:

Abstract:

One of the most challenging problems in mobile broadband networks is how to assign the available radio resources among the different mobile users. Traditionally, research proposals are either specific to some type of traffic or deal with computationally intensive algorithms aimed at optimizing the delivery of general-purpose traffic. Consequently, commercial networks do not incorporate these mechanisms due to the limited hardware resources at the mobile edge. Emerging 5G architectures introduce cloud computing principles to add flexible computational resources to Radio Access Networks. This paper makes use of Mobile Edge Computing concepts to introduce a new element, denoted the Mobile Edge Scheduler, aimed at minimizing the mean delay of general traffic flows in the LTE downlink. This element runs close to the eNodeB and implements a novel flow-aware and channel-aware scheduling policy in order to adapt transmissions to the available channel quality of the end users.
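The abstract does not spell out the scheduling rule, so the sketch below shows only one plausible flow-aware, channel-aware policy: in each TTI, serve the flow with the smallest estimated remaining transmission time, i.e. remaining bytes divided by the rate achievable at the user's current channel quality. All names and parameters are assumptions for illustration, not the paper's Mobile Edge Scheduler algorithm.

    # Minimal sketch of a flow-aware, channel-aware downlink scheduling rule.
    from dataclasses import dataclass

    @dataclass
    class Flow:
        user_id: int
        remaining_bytes: float
        achievable_rate_bps: float      # derived from the user's reported channel quality

    def pick_next_flow(flows):
        """Choose the flow with the shortest estimated remaining transmission time."""
        active = [f for f in flows if f.remaining_bytes > 0]
        if not active:
            return None
        return min(active, key=lambda f: f.remaining_bytes / f.achievable_rate_bps)

    def serve_one_tti(flows, tti_s=0.001):
        """Serve the selected flow for one LTE TTI (1 ms)."""
        f = pick_next_flow(flows)
        if f is not None:
            f.remaining_bytes = max(0.0, f.remaining_bytes - f.achievable_rate_bps * tti_s / 8)
        return f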

Relevance: 40.00%

Publisher:

Abstract:

Nowadays, distributing large volumes of data over corporate TCP/IP networks causes problems such as high network and server utilization, long completion times, and greater sensitivity to failures in the network infrastructure. These problems can be reduced by using peer-to-peer (P2P) networks. The goal of this dissertation is to analyze the performance of the standard BitTorrent protocol in corporate networks and to repeat the analysis after a modification to the protocol's default behavior. In this modification, the tracker identifies the IP address of the peer requesting the swarm's list of IP addresses and returns only the addresses belonging to the same local network, plus that of the original seeder, with the goal of reducing traffic over wide-area links. In typical corporate scenarios, simulations showed that the change reduces average bandwidth consumption and average download times compared with standard BitTorrent, and also makes the distribution more robust to failures of wide-area links. The simulations also showed that in more complex environments, with many clients, where bandwidth restrictions on wide-area links cause congestion and packet drops, the performance of the standard BitTorrent protocol can be similar to that of a client-server distribution. In this last case, the proposed modification showed consistent improvements in distribution performance.
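The modified tracker behaviour described above can be sketched as a filter over the swarm's peer list. The helper below, the /24 prefix length, and the way the original seeder is identified are illustrative assumptions, not the dissertation's implementation.

    # Minimal sketch of the modified tracker rule: given the requesting peer's IP,
    # return only swarm members on the same local network plus the original seeder.
    import ipaddress

    def filter_peer_list(requester_ip, swarm_ips, seeder_ip, prefix_len=24):
        local_net = ipaddress.ip_network(f"{requester_ip}/{prefix_len}", strict=False)
        peers = [ip for ip in swarm_ips
                 if ipaddress.ip_address(ip) in local_net and ip != requester_ip]
        if seeder_ip not in peers:
            peers.append(seeder_ip)      # the original seeder is always announced
        return peers

    # Example: a branch-office peer learns only about same-LAN peers and the seeder.
    print(filter_peer_list("10.1.2.30",
                           ["10.1.2.31", "10.1.2.32", "10.9.9.5"],
                           seeder_ip="192.168.0.10"))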

Relevance: 40.00%

Publisher:

Abstract:

Liza parsia were exposed to a sublethal concentration (0.02 ppm) of DDT for 15 days. The gill responded initially with copious secretion of mucus, oedematous separation of epithelial cells from the basement membrane, and fusion of secondary gill lamellae. Hyperplasia of the cells lining the primary gill lamellae and lamellar telangiectases (or aneurysms) were frequently seen after day 10 of exposure. The kidney exhibited hypertrophy of the epithelial cells lining the proximal convoluted tubules, followed by shrinkage of the glomerular tufts, an increase in Bowman's space, the appearance of amorphous eosinophilic material in the lumina of the tubules, and focal necrosis on day 10 of the treatment. Hyaline droplets and casts were also encountered in the epithelial cells and lumina of the proximal tubules. The liver revealed an initial dilation of the canaliculi and increased secretion of bile. Thereafter, displacement of nuclei towards the periphery of the hepatocytes, disorganization of blood sinusoids, pyknotic changes in nuclei, cytolysis, vacuolation, and focal necrosis were noticed after day 10 of intoxication.

Relevance: 40.00%

Publisher:

Abstract:

Studying chaotic behavior in nonlinear systems requires numerous computations in order to simulate the behavior of such systems. The Standard Map Machine was designed and implemented as a special-purpose computer for performing these intensive computations with high speed and high precision. Its impressive performance is due to its simple architecture, specialized to the numerical computations required of nonlinear systems. This report discusses the design and implementation of the Standard Map Machine and its use in the study of nonlinear mappings, in particular the standard map.
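The standard map referred to above is commonly written as p_{n+1} = p_n + K sin(theta_n), theta_{n+1} = theta_n + p_{n+1}, both taken modulo 2*pi. The sketch below iterates it on a conventional computer; the Standard Map Machine's own arithmetic and architecture are not reproduced.

    # Minimal sketch: iterate the standard map, the computation the Standard Map
    # Machine was built to accelerate. K = 0.971635 is the commonly cited critical
    # value of the stochasticity parameter; any K may be supplied.
    import math

    def standard_map_orbit(theta0, p0, K=0.971635, n_steps=10_000):
        theta, p = theta0, p0
        orbit = []
        for _ in range(n_steps):
            p = (p + K * math.sin(theta)) % (2.0 * math.pi)
            theta = (theta + p) % (2.0 * math.pi)
            orbit.append((theta, p))
        return orbit

    orbit = standard_map_orbit(theta0=1.0, p0=0.5)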