847 results for Vulnerability curve
Abstract:
This paper analyzes the existence of an inflation tax Laffer curve (ITLC) in the context of two standard optimizing monetary models: a cash-in-advance model and a money in the utility function model. Agents’ preferences are characterized in the two models by a constant relative risk aversion utility function. Explosive hyperinflation rules out the presence of an ITLC. In the context of a cash-in-advance economy, this paper shows that explosive hyperinflation is feasible and thus an ITLC is ruled out whenever the relative risk aversion parameter is greater than one. In the context of an optimizing model with money in the utility function, this paper firstly shows that an ITLC is ruled out. Moreover, it is shown that explosive hyperinflations are more likely when the transactions role of money is more important. However, hyperinflationary paths are not feasible in this context unless certain restrictions are imposed.
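For reference, the CRRA preferences referred to above, and the inflation-tax (seigniorage) revenue whose Laffer shape is at issue, can be written in generic notation (an illustrative sketch, not the paper's own notation):

```latex
u(c) = \frac{c^{1-\sigma}}{1-\sigma}, \qquad \sigma > 0,\ \sigma \neq 1,
\qquad\qquad
S(\pi) = \pi\, m(\pi),
```

where σ is the relative risk aversion parameter and m(π) is real money demand; an ITLC requires S(π) to first rise and then fall as inflation increases, which is the possibility the paper rules out under the stated conditions.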
Abstract:
This paper studies the comovement between output and inflation in the EU15 countries. Following den Haan (2000), I use the correlations of VAR forecast errors at different horizons to analyze the output-inflation relationship. The empirical results show that eight countries display a significant positive comovement between output and inflation. Moreover, the empirical evidence suggests that a Phillips curve phenomenon is more likely to be detected in countries where inflation is more stable.
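As a rough illustration of this approach (a minimal sketch, not the paper's code: it approximates the forecast-error correlations with rolling-origin out-of-sample VAR forecasts on simulated data; the lag length and data-generating process are assumptions):

```python
import numpy as np
from statsmodels.tsa.api import VAR

def forecast_error_correlation(data, horizon, lags=2, min_obs=50):
    """Correlate k-step-ahead VAR forecast errors; data is (T, 2) = [output, inflation]."""
    errors = []
    for origin in range(min_obs, data.shape[0] - horizon):
        res = VAR(data[:origin]).fit(lags)
        history = data[origin - res.k_ar:origin]       # last k_ar observations
        forecast = res.forecast(history, horizon)[-1]  # k-step-ahead point forecast
        errors.append(data[origin + horizon - 1] - forecast)
    errors = np.asarray(errors)
    return np.corrcoef(errors[:, 0], errors[:, 1])[0, 1]

# Toy bivariate VAR(1) with positively correlated shocks (illustrative data only)
rng = np.random.default_rng(0)
shocks = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], size=240)
y = np.zeros_like(shocks)
for t in range(1, y.shape[0]):
    y[t] = 0.7 * y[t - 1] + shocks[t]

print([round(forecast_error_correlation(y, k), 2) for k in (1, 4, 8)])
```

Positive correlations at short and long horizons would indicate the kind of output-inflation comovement the paper documents.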
Abstract:
Progressive increases in storm intensities and extreme wave heights have been documented along the U.S. West Coast. Paired with global sea level rise and the potential for an increase in El Niño occurrences, these trends have substantial implications for the vulnerability of coastal communities to natural coastal hazards. Community vulnerability to hazards is characterized by the exposure, sensitivity, and adaptive capacity of human-environmental systems that influence potential impacts. To demonstrate how societal vulnerability to coastal hazards varies with both physical and social factors, we compared community exposure and sensitivity to storm-induced coastal change scenarios in Tillamook (Oregon) and Pacific (Washington) Counties. While both are backed by low-lying coastal dunes, communities in these two counties have experienced different shoreline change histories and have chosen to use the adjacent land in different ways. Therefore, community vulnerability varies significantly between the two counties. Identifying the reasons for this variability can help land-use managers make decisions to increase community resilience and reduce vulnerability in the face of a changing climate.
Abstract:
Coastal hazards such as flooding and erosion threaten many coastal communities and ecosystems. With documented increases in both storm frequency and intensity and a projected acceleration of sea level rise, incorporating the impacts of climate change and variability into coastal vulnerability assessments is becoming a necessary, yet challenging, task. We are developing an integrated approach to probabilistically incorporate the impacts of climate change into coastal vulnerability assessments via a multi-scale, multi-hazard methodology. By examining the combined hazards of episodic flooding/inundation and storm-induced coastal change together with chronic trends under a range of future climate change scenarios, a quantitative framework can be established to promote more science-based decision making in the coastal zone. Our focus here is on an initial application of our method in southern Oregon, United States.
Abstract:
The states bordering the Gulf of Mexico, i.e., Texas, Louisiana, Mississippi, Alabama, and Florida, have been historically devastated by hurricanes and tropical storms. A large number of African Americans live in these southern Gulf States, which have high percentages of minorities in terms of total population. According to the U.S. Census, the total black population in the United States is about 40.7 million, and about one-fourth of them live in these five Gulf States (U.S. Census, 2008). As evidenced by Hurricane Katrina and other major hurricanes, low-income and under-served communities are usually the hardest hit during these disasters. The aim of this study is to identify and visualize the socio-economic vulnerability, at the county level, of the African American population living in the hurricane risk areas of these five Gulf States.
Abstract:
In this thesis, we develop an efficient collapse prediction model, the PFA (Peak Filtered Acceleration) model, for buildings subjected to different types of ground motions.
For the structural system, the PFA model covers modern steel and reinforced concrete moment-resisting frame buildings (potentially reinforced concrete shear wall buildings). For ground motions, the PFA model covers ramp-pulse-like ground motions, long-period ground motions, and short-period ground motions.
To predict whether a building will collapse in response to a given ground motion, we first extract long-period components from the ground motion using a Butterworth low-pass filter with a suggested order and cutoff frequency. The order depends on the type of ground motion, and the cutoff frequency depends on the building's natural frequency and ductility. We then compare the filtered acceleration time history with the capacity of the building. The capacity of the building is a constant for two-dimensional buildings and a limit domain for three-dimensional buildings. If the filtered acceleration exceeds the building's capacity, the building is predicted to collapse. Otherwise, it is expected to survive the ground motion.
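As a rough illustration of this procedure (a minimal sketch with assumed filter order, cutoff frequency, capacity value and synthetic input, not the thesis implementation):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def pfa_collapse_check(accel, dt, cutoff_hz, capacity_g, order=4):
    """Low-pass filter the ground acceleration and compare its peak to capacity.

    accel      : ground acceleration time history in g
    dt         : time step in seconds
    cutoff_hz  : cutoff frequency (tied to the building's period and ductility)
    capacity_g : lateral capacity expressed as an acceleration in g (2-D case)
    """
    b, a = butter(order, cutoff_hz, btype="low", fs=1.0 / dt)
    filtered = filtfilt(b, a, accel)          # zero-phase low-pass filtering
    pfa = np.max(np.abs(filtered))            # peak filtered acceleration
    return pfa, pfa > capacity_g              # True -> predicted collapse

# Toy usage: a synthetic long-period pulse plus high-frequency noise
dt = 0.02
t = np.arange(0, 40, dt)
accel = 0.4 * np.sin(2 * np.pi * 0.3 * t) + 0.1 * np.random.randn(t.size)
print(pfa_collapse_check(accel, dt, cutoff_hz=0.5, capacity_g=0.35))
```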
The parameters used in the PFA model, which include the fundamental period, global ductility and lateral capacity, can be obtained either from numerical analysis or by interpolation based on the reference building system proposed in this thesis.
The PFA collapse prediction model greatly reduces computational complexity while achieving good accuracy. It is verified by FEM simulations of 13 frame building models and 150 ground motion records.
Based on the developed collapse prediction model, we propose to use PFA (Peak Filtered Acceleration) as a new ground motion intensity measure for collapse prediction. We compare PFA with traditional intensity measures PGA, PGV, PGD, and Sa in collapse prediction and find that PFA has the best performance among all the intensity measures.
We also provide a closed form of the PFA collapse prediction model in terms of a vector intensity measure (PGV, PGD) for practical collapse risk assessment.
Abstract:
Curve samplers are sampling algorithms that proceed by viewing the domain as a vector space over a finite field, and randomly picking a low-degree curve in it as the sample. Curve samplers exhibit a nice property besides the sampling property: the restriction of low-degree polynomials over the domain to the sampled curve is still low-degree. This property is often used in combination with the sampling property and has found many applications, including PCP constructions, local decoding of codes, and algebraic PRG constructions.
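For concreteness, the degree-preservation property can be stated as follows (a standard formulation; the notation is illustrative and not necessarily the thesis's):

```latex
C : \mathbb{F}_q \to \mathbb{F}_q^m,\quad C(x) = (c_1(x), \ldots, c_m(x)),\quad \deg c_i \le t
\;\;\Longrightarrow\;\;
\deg (f \circ C) \le d\,t \quad \text{for every } f \in \mathbb{F}_q[X_1,\ldots,X_m] \text{ with } \deg f \le d .
```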
The randomness complexity of curve samplers is a crucial parameter for their applications. It is known that (non-explicit) curve samplers using O(log N + log(1/δ)) random bits exist, where N is the domain size and δ is the confidence error. The question of explicitly constructing randomness-efficient curve samplers was first raised in [TU06], where curve samplers with near-optimal randomness complexity were obtained.
In this thesis, we present an explicit construction of low-degree curve samplers with optimal randomness complexity (up to a constant factor) that sample curves of degree (m · log_q(1/δ))^O(1) in F_q^m. Our construction is a delicate combination of several components, including extractor machinery, limited independence, iterated sampling, and list-recoverable codes.
Abstract:
Growth is one of the most important characteristics of cultured species. The objective of this study was to determine the fit of linear, log-linear, polynomial, exponential and logistic functions to the growth curves of Macrobrachium rosenbergii, obtained by using weekly records of live weight, total length, head length, claw length, and last segment length from 20 to 192 days of age. The models were evaluated according to the coefficient of determination (R2) and the error sum of squares (ESS); such models help in selecting breeders in selective breeding programs. Twenty full-sib families consisting of 400 PLs each were stocked in 20 different hapas and reared for 8 weeks, after which a total of 1200 animals were transferred to earthen ponds and reared up to 192 days. The R2 values of the models ranged from 56 to 96 for overall body weight, with the logistic model being the highest. The R2 value for total length ranged from 62 to 90, with the logistic model being the highest. For head length, the R2 value ranged between 55 and 95, with the logistic model being the highest. The R2 value for claw length ranged from 44 to 94, with the logistic model being the highest. For last segment length, the R2 value ranged from 55 to 80, with the polynomial model being the highest. However, the log-linear model registered the lowest ESS value, followed by the linear model, for overall body weight, while the exponential model showed the lowest ESS value, followed by the log-linear model, for head length. For total length, the lowest ESS value was given by the log-linear model, followed by the logistic model, and for claw length the exponential model showed the lowest ESS value, followed by the log-linear model. For last segment length, the linear model showed the lowest ESS value, followed by the log-linear model. The model that shows the highest R2 value together with a low ESS value is generally considered the best-fit model. Among the five models tested, the logistic, log-linear and linear models were found to be the best models for overall body weight, total length and head length, respectively. For claw length and last segment length, the log-linear model was found to be the best model. These models can be used to predict growth rates in M. rosenbergii. However, further studies need to be conducted with more growth traits taken into consideration.
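As an illustration of this kind of model fitting (a minimal sketch with simulated data and an assumed logistic parameterization, not the study's actual records or code):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, a, b, k):
    """a: asymptotic size, b: shape constant, k: growth rate."""
    return a / (1.0 + b * np.exp(-k * t))

# Illustrative age (days) and body weight (g) data, not the paper's measurements
age = np.arange(20, 193, 7, dtype=float)
weight = logistic(age, 60, 50, 0.04) + np.random.default_rng(1).normal(0, 1.5, age.size)

params, _ = curve_fit(logistic, age, weight, p0=[60, 50, 0.05], maxfev=10000)
pred = logistic(age, *params)
ess = np.sum((weight - pred) ** 2)                        # error sum of squares
r2 = 1.0 - ess / np.sum((weight - weight.mean()) ** 2)    # coefficient of determination
print(f"a={params[0]:.1f}, b={params[1]:.1f}, k={params[2]:.3f}, R2={r2:.3f}, ESS={ess:.1f}")
```

The same fit-and-score loop can be repeated for each candidate function and each trait, with the highest R2 and a low ESS used to pick the best-fit model.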
Abstract:
Background: Jumping to conclusions (JTC) is associated with psychotic disorder and psychotic symptoms. If JTC represents a trait, the rate should (i) be increased in people with elevated levels of psychosis proneness, such as individuals diagnosed with borderline personality disorder (BPD), and (ii) show a degree of stability over time. Methods: The JTC rate was examined in three groups: patients with first-episode psychosis (FEP), BPD patients and controls, using the Beads Task. The PANSS, SIS-R and CAPE scales were used to assess positive psychotic symptoms. Four WAIS III subtests were used to assess IQ. Results: A total of 61 FEP, 26 BPD and 150 controls were evaluated; 29 FEP were re-evaluated after one year. 44% of FEP (OR = 8.4, 95% CI: 3.9-17.9) displayed a JTC reasoning bias, versus 19% of BPD (OR = 2.5, 95% CI: 0.8-7.8) and 9% of controls. JTC was not associated with the level of psychotic symptoms or, specifically, delusionality across the different groups. Differences between FEP and controls were independent of sex, educational level, cannabis use and IQ. After one year, 47.8% of FEP with JTC at baseline again displayed JTC. Conclusions: JTC in part reflects trait vulnerability to develop disorders with expression of psychotic symptoms.
Abstract:
Assessing the vulnerability of stocks to fishing practices in U.S. federal waters was recently highlighted by the National Marine Fisheries Service (NMFS), National Oceanic and Atmospheric Administration, as an important factor to consider when 1) identifying stocks that should be managed and protected under a fishery management plan; 2) grouping data-poor stocks into relevant management complexes; and 3) developing precautionary harvest control rules. To assist the regional fishery management councils in determining vulnerability, NMFS elected to use a modified version of a productivity and susceptibility analysis (PSA) because it can be based on qualitative data, has a history of use in other fisheries, and is recommended by several organizations as a reasonable approach for evaluating risk. A number of productivity and susceptibility attributes for a stock are used in a PSA, and from these attributes index scores and measures of uncertainty are computed and graphically displayed. To demonstrate the utility of the resulting vulnerability evaluation, we evaluated six U.S. fisheries targeting 162 stocks that exhibited varying degrees of productivity and susceptibility, and for which data quality varied. Overall, the PSA was capable of differentiating the vulnerability of stocks along the gradient of susceptibility and productivity indices, although fixed thresholds separating low, moderately, and highly vulnerable species were not observed. The PSA can be used as a flexible tool that can incorporate region-specific information on fishery and management activity.
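As an illustration of how a PSA combines attribute scores (a minimal sketch using the common 1-3 scoring scale and the Euclidean-distance vulnerability index often shown on PSA plots; the weights and attribute values are assumptions, not the NMFS analysis itself):

```python
import math

def psa_score(attribute_scores, weights=None):
    """Weighted mean of attribute scores on the 1 (low) to 3 (high) scale."""
    if weights is None:
        weights = [1.0] * len(attribute_scores)
    return sum(s * w for s, w in zip(attribute_scores, weights)) / sum(weights)

def vulnerability(productivity, susceptibility):
    """Distance from the least-vulnerable corner of the plot (p = 3, s = 1)."""
    return math.hypot(productivity - 3.0, susceptibility - 1.0)

# Illustrative stock: moderately productive, fairly susceptible to the fishery
p = psa_score([3, 2, 2, 1])   # e.g. growth rate, fecundity, age at maturity, max age
s = psa_score([2, 3, 2, 3])   # e.g. areal overlap, selectivity, post-release survival
print(round(vulnerability(p, s), 2))
```

Larger index values place a stock farther toward the high-vulnerability corner of the productivity-susceptibility plot, which is how the gradient described above is read.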