766 results for technology governance risk
Abstract:
This thesis examines the collapse risk of tall steel braced frame buildings using rupture-to-rafters simulations of a suite of San Andreas fault earthquakes. Two key advancements in this work are the development of (i) a rational methodology for assigning scenario earthquake probabilities and (ii) an approach to broadband ground motion simulation that is free of artificial corrections. The work can be divided into the following sections: earthquake source modeling, earthquake probability calculations, ground motion simulations, building response, and performance analysis.
As a first step, kinematic source inversions of past earthquakes in the magnitude range 6-8 are used to simulate 60 scenario earthquakes on the San Andreas fault. For each scenario earthquake a 30-year occurrence probability is calculated, and we present a rational method to redistribute the forecast earthquake probabilities from UCERF to the simulated scenario earthquakes. We illustrate the inner workings of the method through an example involving earthquakes on the San Andreas fault in southern California.
Next, three-component broadband ground motion histories are computed at 636 sites in the greater Los Angeles metropolitan area by superposing short-period (0.2 s-2.0 s) empirical Green's function synthetics on long-period (> 2.0 s) seismograms computed from kinematic source models using the spectral element method, producing broadband seismograms.
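For illustration, here is a minimal sketch of the low-pass/high-pass combination described above, assuming the long-period and short-period synthetics are available as equally sampled NumPy arrays; the 2.0 s crossover period matches the abstract, but the fourth-order Butterworth filters and the placeholder data are assumptions rather than the thesis's actual processing.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def hybrid_broadband(lp_synthetic, hp_synthetic, dt, crossover_period=2.0, order=4):
    """Combine long-period (spectral element) and short-period (empirical
    Green's function) synthetics into one broadband seismogram by low-pass /
    high-pass filtering about a crossover period and summing."""
    fc = 1.0 / crossover_period            # crossover frequency in Hz
    nyq = 0.5 / dt                         # Nyquist frequency
    b_lo, a_lo = butter(order, fc / nyq, btype="low")
    b_hi, a_hi = butter(order, fc / nyq, btype="high")
    low = filtfilt(b_lo, a_lo, lp_synthetic)    # keep periods longer than 2 s
    high = filtfilt(b_hi, a_hi, hp_synthetic)   # keep periods of 0.2-2 s
    return low + high

# Example: combine one component at a single site (placeholder data).
dt = 0.01                                  # 100 samples per second
t = np.arange(0, 200, dt)
lp = np.sin(2 * np.pi * t / 5.0)           # stand-in long-period motion
hp = 0.1 * np.random.randn(t.size)         # stand-in short-period motion
broadband = hybrid_broadband(lp, hp, dt)
```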
Using the ground motions at the 636 sites for the 60 scenario earthquakes, 3-D nonlinear analyses are conducted of several variants of an 18-story steel braced frame building designed for three soil types using the 1994 and 1997 Uniform Building Code provisions. Model performance is classified into one of five performance levels: Immediate Occupancy, Life Safety, Collapse Prevention, Red-Tagged, and Model Collapse. The results are combined with the 30-year occurrence probabilities of the San Andreas scenario earthquakes using the PEER performance-based earthquake engineering framework to determine the probability of exceedance of these limit states over the next 30 years.
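To make the final step concrete, here is a minimal sketch of turning per-scenario results into a 30-year exceedance probability at one site, assuming scenario occurrences are independent and each nonlinear analysis yields a simple exceeded/not-exceeded outcome for a limit state; this indicator-based shortcut is an illustrative reading of the PEER framework, not the thesis's exact computation.

```python
import numpy as np

def prob_limit_state_exceedance(p_occurrence, exceeds):
    """30-year probability that a limit state is exceeded at a site.

    p_occurrence : 30-year occurrence probabilities, one per scenario
    exceeds      : booleans, True if that scenario's analysis exceeded the
                   limit state at this site
    Scenarios are treated as independent, so the site avoids the limit state
    only if no exceedance-causing scenario occurs.
    """
    p_occurrence = np.asarray(p_occurrence, dtype=float)
    exceeds = np.asarray(exceeds, dtype=bool)
    p_no_exceedance = np.prod(1.0 - p_occurrence[exceeds])
    return 1.0 - p_no_exceedance

# Example with three hypothetical scenarios at one site.
p30 = [0.02, 0.05, 0.01]            # 30-year scenario probabilities
cp_exceeded = [False, True, True]   # Collapse Prevention exceeded?
print(prob_limit_state_exceedance(p30, cp_exceeded))
```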
Abstract:
Few credible source models are available from past large-magnitude earthquakes. A stochastic source model generation algorithm thus becomes necessary for robust risk quantification using scenario earthquakes. We present an algorithm that combines the physics of fault rupture, as imaged in laboratory earthquakes, with stress estimates on the fault constrained by field observations to generate stochastic source models for large-magnitude (Mw 6.0-8.0) strike-slip earthquakes. The algorithm is validated through a statistical comparison of synthetic ground motion histories from a stochastically generated source model for a magnitude 7.90 earthquake and a kinematic finite-source inversion of an equivalent-magnitude past earthquake on a geometrically similar fault. The synthetic dataset comprises three-component ground motion waveforms, computed at 636 sites in southern California, for ten hypothetical rupture scenarios (five hypocenters, each with two rupture directions) on the southern San Andreas fault. A similar validation exercise is conducted for a magnitude 6.0 earthquake, the lower magnitude limit for the algorithm. Additionally, ground motions from the Mw 7.9 earthquake simulations are compared against predictions by the Campbell-Bozorgnia NGA relation as well as the ShakeOut scenario earthquake. The algorithm is then applied to generate fifty source models for a hypothetical magnitude 7.9 earthquake originating at Parkfield, with rupture propagating from north to south (towards Wrightwood), similar to the 1857 Fort Tejon earthquake. Using the spectral element method, three-component ground motion waveforms are computed in the Los Angeles basin for each scenario earthquake, and the sensitivity of ground shaking intensity to seismic source parameters (such as the percentage of asperity area relative to the fault area, rupture speed, and risetime) is studied.
Under plausible San Andreas fault earthquakes in the next 30 years, modeled using the stochastic source algorithm, the performance of two 18-story steel moment frame buildings (UBC 1982 and 1997 designs) in southern California is quantified. The approach integrates rupture-to-rafters simulations into the PEER performance-based earthquake engineering (PBEE) framework. Using stochastic sources and computational seismic wave propagation, three-component ground motion histories are generated at 636 sites in southern California for sixty scenario earthquakes on the San Andreas fault. The ruptures, with moment magnitudes in the range 6.0-8.0, are assumed to occur at five locations on the southern section of the fault, and two unilateral rupture propagation directions are considered. The 30-year probabilities of all plausible ruptures in this magnitude range and in that section of the fault, as forecast by the United States Geological Survey, are distributed among these 60 earthquakes based on proximity and moment release. The response of the two 18-story buildings, hypothetically located at each of the 636 sites, under 3-component shaking from all 60 events is computed using 3-D nonlinear time-history analysis. Using these results, the probability of the structural response exceeding the Immediate Occupancy (IO), Life Safety (LS), and Collapse Prevention (CP) performance levels under San Andreas fault earthquakes over the next thirty years is evaluated.
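As a rough illustration of the probability redistribution step, the sketch below splits each forecast rupture's 30-year probability among simulated scenarios in proportion to a combined proximity and moment-similarity weight; the exponential distance weight, the 50 km length scale, and the moment-ratio score are hypothetical choices, not the rule actually used in the study.

```python
import numpy as np

def redistribute_probabilities(p_forecast, distance_km, moment_ratio, length_scale=50.0):
    """Split each forecast rupture's 30-year probability among simulated scenarios.

    p_forecast   : (F,) probabilities of forecast ruptures (e.g. from the USGS forecast)
    distance_km  : (F, S) distance between forecast rupture f and scenario s
    moment_ratio : (F, S) min(M0_f, M0_s) / max(M0_f, M0_s), a moment-similarity score
    Returns (S,) probabilities assigned to the simulated scenarios.
    """
    w = np.exp(-np.asarray(distance_km) / length_scale) * np.asarray(moment_ratio)
    w = w / w.sum(axis=1, keepdims=True)        # each forecast row sums to one
    return np.asarray(p_forecast) @ w           # weighted share per scenario

# Example: two forecast ruptures shared among three simulated scenarios.
p_scen = redistribute_probabilities(
    p_forecast=[0.04, 0.02],
    distance_km=[[5, 40, 120], [80, 10, 30]],
    moment_ratio=[[0.9, 0.7, 0.3], [0.5, 1.0, 0.8]],
)
print(p_scen, p_scen.sum())   # total probability mass is preserved
```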
Furthermore, the conditional and marginal probability distributions of peak ground velocity (PGV) and peak ground displacement (PGD) in Los Angeles and surrounding basins, due to earthquakes occurring primarily on the mid-section of the southern San Andreas fault, are determined using Bayesian model class identification. Simulated ground motions at sites within 55-75 km of the source, from the suite of 60 earthquakes (Mw 6.0-8.0) primarily rupturing the mid-section of the San Andreas fault, provide the PGV and PGD data.
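A minimal sketch of the model-class comparison idea for PGV: fit a few candidate probability models to simulated peak ground velocities and rank them, here using the Bayesian information criterion as a rough stand-in for the model evidence; the candidate distributions and the BIC shortcut are illustrative assumptions and not the Bayesian identification procedure actually applied.

```python
import numpy as np
from scipy import stats

def rank_model_classes(pgv, candidates=("lognorm", "gamma", "weibull_min")):
    """Fit candidate distributions to PGV data and rank them by BIC
    (lower BIC roughly corresponds to higher approximate model evidence)."""
    n = len(pgv)
    results = []
    for name in candidates:
        dist = getattr(stats, name)
        params = dist.fit(pgv, floc=0)                 # fix location at zero
        loglik = np.sum(dist.logpdf(pgv, *params))
        k = len(params) - 1                            # free parameters (loc fixed)
        bic = k * np.log(n) - 2.0 * loglik
        results.append((name, bic))
    return sorted(results, key=lambda r: r[1])

# Example with synthetic PGV values (cm/s) standing in for simulation output.
rng = np.random.default_rng(0)
pgv = rng.lognormal(mean=np.log(30.0), sigma=0.6, size=500)
for name, bic in rank_model_classes(pgv):
    print(f"{name:12s} BIC = {bic:.1f}")
```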
Abstract:
Structural design is a decision-making process in which a wide spectrum of requirements, expectations, and concerns needs to be properly addressed. Engineering design criteria are considered together with societal and client preferences, and most of these design objectives are affected by the uncertainties surrounding a design. Therefore, realistic design frameworks must be able to handle multiple performance objectives and incorporate uncertainties from numerous sources into the process.
In this study, a multi-criteria based design framework for structural design under seismic risk is explored. The emphasis is on reliability-based performance objectives and their interaction with economic objectives. The framework has analysis, evaluation, and revision stages. In the probabilistic response analysis, seismic loading uncertainties as well as modeling uncertainties are incorporated. For evaluation, two approaches are suggested: one based on preference aggregation and the other based on socio-economics. Both implementations of the general framework are illustrated with simple but informative design examples to explore the basic features of the framework.
The first approach uses concepts similar to those found in multi-criteria decision theory, and directly combines reliability-based objectives with others. This approach is implemented in a single-stage design procedure. In the socio-economics based approach, a two-stage design procedure is recommended in which societal preferences are treated through reliability-based engineering performance measures, but emphasis is also given to economic objectives because these are especially important to the structural designer's client. A rational net asset value formulation including losses from uncertain future earthquakes is used to assess the economic performance of a design. A recently developed assembly-based vulnerability analysis is incorporated into the loss estimation.
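As a rough illustration of a net asset value comparison of this kind, the sketch below discounts annual benefits and expected earthquake losses, assuming Poisson event occurrences and a single expected loss per event for each design alternative; the discount rate, event rates, costs, and losses are placeholders, not values from the study.

```python
def net_asset_value(initial_cost, annual_benefit, event_rate, expected_loss_per_event,
                    discount_rate=0.05, horizon_years=50):
    """Present value of a design alternative: discounted benefits minus the
    discounted expected earthquake losses, minus the construction cost."""
    expected_annual_loss = event_rate * expected_loss_per_event
    pv = 0.0
    for year in range(1, horizon_years + 1):
        pv += (annual_benefit - expected_annual_loss) / (1.0 + discount_rate) ** year
    return pv - initial_cost

# Compare a code-minimum design with a stronger (costlier, lower-loss) design.
baseline = net_asset_value(initial_cost=10.0e6, annual_benefit=1.2e6,
                           event_rate=0.02, expected_loss_per_event=8.0e6)
enhanced = net_asset_value(initial_cost=11.0e6, annual_benefit=1.2e6,
                           event_rate=0.02, expected_loss_per_event=2.0e6)
print(baseline, enhanced)
```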
The presented performance-based design framework allows investigation of various design issues and their impact on a structural design. The framework is flexible and readily allows the incorporation of new methods and concepts in seismic hazard specification, structural analysis, and loss estimation.
Abstract:
Genome-wide association studies (GWAS) have identified several low-penetrance susceptibility alleles in chronic lymphocytic leukemia (CLL). Nevertheless, these studies have scarcely examined regions implicated in non-coding molecules such as microRNAs (miRNAs). Abnormalities in miRNAs, such as altered expression patterns and mutations, have been described in CLL, suggesting their involvement in the development of the disease. Genetic variations in miRNAs can affect levels of miRNA expression if present in pre-miRNAs or in miRNA biogenesis genes, or alter miRNA function if present in either target mRNA or miRNA sequences. Therefore, the present study aimed to evaluate whether polymorphisms in pre-miRNAs and/or miRNA processing genes contribute to predisposition to CLL. A total of 91 SNPs in 107 CLL patients and 350 cancer-free controls were successfully analyzed using TaqMan Open Array technology. We found nine statistically significant associations with CLL risk after FDR correction: seven in miRNA processing genes (rs3805500 and rs6877842 in DROSHA, rs1057035 in DICER1, rs17676986 in SND1, rs9611280 in TNRC6B, rs784567 in TRBP, and rs11866002 in CNOT1) and two in pre-miRNAs (rs11614913 in miR196a2 and rs2114358 in miR1206). These findings suggest that polymorphisms in genes involved in the miRNA biogenesis pathway, as well as in pre-miRNAs, contribute to the risk of CLL. Large-scale studies are needed to validate the current findings.
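For context, a minimal sketch of the kind of case-control association test behind such results: a per-SNP 2x2 allele-count table tested with Fisher's exact test, followed by Benjamini-Hochberg FDR adjustment across SNPs; the SNP names and counts below are invented, and the study's own genotyping and statistical pipeline may differ.

```python
from scipy.stats import fisher_exact
from statsmodels.stats.multitest import multipletests

# Hypothetical allele counts per SNP: [minor, major] in cases and in controls.
snp_tables = {
    "rs_example_1": [[40, 174], [90, 610]],
    "rs_example_2": [[25, 189], [80, 620]],
    "rs_example_3": [[60, 154], [190, 510]],
}

names, pvals, odds = [], [], []
for snp, table in snp_tables.items():
    oddsratio, p = fisher_exact(table)       # two-sided allelic test
    names.append(snp)
    pvals.append(p)
    odds.append(oddsratio)

# Benjamini-Hochberg FDR correction across all tested SNPs.
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for snp, oratio, q, sig in zip(names, odds, qvals, reject):
    print(f"{snp}: OR={oratio:.2f}, q={q:.3f}, significant={sig}")
```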
Abstract:
Lake Victoria fisheries face severe environmental stresses. Stocks are declining in a context of increasing population and growing demand for the lake’s resources. Rising competition between users is putting conservation goals and rural livelihoods at risk. While Uganda’s co-management policy framework is well-developed, key resources for implementation are lacking, enforcement is poor, and the relations between stakeholders are unequal. Poor rural resource users face significant challenges to effectively participate in fisheries decision-making. This case study demonstrates the progress that can be made using a collaborative approach to catalyze community-led actions linking public health, sanitation and environmental conservation in difficult circumstances, even over a relatively short time period. Multistakeholder dialogue can bring to light the sources of conflict, pinpoint governance challenges, and identify opportunities for institutional collaboration to address community needs. At the same time, the process can help build trust, confidence in collective action and public accountability.
Abstract:
Apart from the use of statistical quality control charts for variables or attributes of food products in a food processing industry, the application of these charts to attributes of fishery products is explained. The statistical quality control chart for fraction defective is illustrated using the number of defective fish sausages per shift in a sausage plant, while the control chart for number of defectives is illustrated using the number of defective fish cans in each hour of production at a canning plant. The c-chart, another type of control chart, is explained for the number of defects per single fish fillet sampled at random every five minutes in a processing plant. These statistical quality control charts make more economical use of resources, time, and labour than control charts for variables. Control charts for attributes also exhibit the quality history of finished products at different times of production, thereby minimizing the risk of consumer rejection.
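A minimal sketch of the three attribute charts described above: the fraction-defective chart (commonly called a p-chart), the number-of-defectives chart (np-chart), and the c-chart for defects per inspection unit, each with the standard three-sigma control limits; the sample data are invented.

```python
import numpy as np

def p_chart_limits(defectives, sample_size):
    """p-chart: fraction defective per subgroup with 3-sigma limits."""
    p = np.asarray(defectives) / sample_size
    p_bar = p.mean()
    sigma = np.sqrt(p_bar * (1 - p_bar) / sample_size)
    return p_bar, max(p_bar - 3 * sigma, 0.0), p_bar + 3 * sigma

def np_chart_limits(defectives, sample_size):
    """np-chart: number of defectives per subgroup with 3-sigma limits."""
    np_bar = np.mean(defectives)
    p_bar = np_bar / sample_size
    sigma = np.sqrt(np_bar * (1 - p_bar))
    return np_bar, max(np_bar - 3 * sigma, 0.0), np_bar + 3 * sigma

def c_chart_limits(defect_counts):
    """c-chart: defects per inspection unit with 3-sigma limits."""
    c_bar = np.mean(defect_counts)
    sigma = np.sqrt(c_bar)
    return c_bar, max(c_bar - 3 * sigma, 0.0), c_bar + 3 * sigma

# Invented data: defective sausages per shift (n=200), defective cans per hour
# (n=150), and defects found on single fillets sampled every five minutes.
print(p_chart_limits([12, 9, 15, 11, 8], sample_size=200))
print(np_chart_limits([6, 4, 9, 5, 7], sample_size=150))
print(c_chart_limits([2, 0, 3, 1, 4, 2]))
```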
Abstract:
Why do firms acquire external technologies? Previous research indicates that there is a wide variety of motivations. These include the need to acquire valuable knowledge-based resources, to improve strategic flexibility, to experiment, to overcome organisational inertia, to mitigate risk and uncertainty, to reduce costs and development time in new product development, and the perception that the firm has the absorptive capacity to integrate acquisitions. In this paper we provide an in-depth literature review of the motivations for the acquisition of external technologies by firms. We find that these motivations can be broadly classed into four categories: (1) the development of technological capabilities, (2) the development of strategic options, (3) efficiency improvements, and (4) responses to the competitive environment. In light of this categorisation, we comment on how these different motivations connect to the wider issues of technology acquisition. © 2010 IEEE.
Abstract:
The generation of new medicinal products is both a contributor to global economic growth and a source of valuable benefits to human health. Given their direct responsibility for public health, regulatory authorities monitor closely both the development and exploitation of the underlying technologies and the products derived from them. The manner in which such regulation is implemented can result in regulators constraining or facilitating the generation of new products. This paper will study as an example the impact of EU Risk Management Plans (EU-RMPs), which have been mandatory for the approval of new medicines since 2005, on both the industry and regulatory authorities. In interviews, the responses of those who had experience of the implementation of EU-RMPs were mixed. Although the benefits of a more structured and predictable approach to the evaluation of risk were appreciated, some respondents perceived the regulation as an excessive burden on their organisations. The exploration of factors that influence how EU-RMP regulation affects individual firms provides new insights for both regulators and managers, and demonstrates one aspect of the complexity of the process by which new medicinal products are brought to market.
Abstract:
Space heating accounts for a large portion of the world's carbon dioxide emissions. Ground Source Heat Pumps (GSHPs) are a technology that can reduce carbon emissions from heating and cooling. GSHP system performance is, however, highly sensitive to deviations of the actual annual energy extraction/rejection rates from/to the ground from their design values. In order to prevent failure and/or performance deterioration of GSHP systems, it is possible to incorporate a safety factor in the design of the GSHP by over-sizing the ground heat exchanger (GHE). A methodology to evaluate the financial risk involved in over-sizing the GHE is proposed in this paper. A probability-based approach is used to evaluate the economic feasibility of a hypothetical full-size GSHP system as compared to four alternative Heating, Ventilation and Air Conditioning (HVAC) system configurations. The model of the GSHP system is developed in the TRNSYS energy simulation platform and calibrated with data from an actual hybrid GSHP system installed in the Department of Earth Science, University of Oxford, UK. Results of the analysis show that potential savings from a full-size GSHP system largely depend on projected HVAC system efficiencies and gas and electricity prices. Results of the risk analysis also suggest that a full-size GSHP with auxiliary back-up is potentially the most economical system configuration. © 2012 Elsevier Ltd.
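As a rough illustration of the probability-based economic comparison, the sketch below Monte Carlo samples uncertain gas and electricity prices and compares the lifecycle cost of a full-size GSHP against a gas boiler alternative; the capital costs, efficiencies, heating load, and price distributions are placeholder assumptions, not figures from the paper or its TRNSYS model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim, years, discount = 10_000, 20, 0.05
annual_heat_kwh = 50_000.0                       # heating demand (placeholder)

# Uncertain energy prices in GBP/kWh (placeholder lognormal assumptions).
elec_price = rng.lognormal(np.log(0.15), 0.2, n_sim)
gas_price = rng.lognormal(np.log(0.05), 0.2, n_sim)

def lifecycle_cost(capital, annual_energy_cost):
    """Capital cost plus the present value of annual energy costs."""
    pv_factor = sum(1.0 / (1.0 + discount) ** y for y in range(1, years + 1))
    return capital + annual_energy_cost * pv_factor

# Full-size GSHP (seasonal COP ~ 4) versus gas boiler (efficiency ~ 0.9).
gshp_cost = lifecycle_cost(capital=30_000.0,
                           annual_energy_cost=annual_heat_kwh / 4.0 * elec_price)
boiler_cost = lifecycle_cost(capital=5_000.0,
                             annual_energy_cost=annual_heat_kwh / 0.9 * gas_price)

savings = boiler_cost - gshp_cost
print("mean savings:", savings.mean())
print("P(GSHP cheaper):", (savings > 0).mean())
```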
Abstract:
This study is one of the very few investigating the dioxin body burden of a group of child-bearing-aged women at an electronic waste (e-waste) recycling site (Taizhou, Zhejiang Province) (24 +/- 2.83 years of age, 40% primiparae) and a reference site (Lin'an city, Zhejiang Province, about 245 km from Taizhou) (24 +/- 2.35 years of age, 100% primiparae) in China. Five sets of samples (each set consisting of human milk, placenta, and hair) were collected from each site. Body burdens of people from the e-waste processing site (human milk, 21.02 +/- 13.81 pg WHO-TEQ(1998)/g fat (World Health Organization toxic equivalency, 1998); placenta, 31.15 +/- 15.67 pg WHO-TEQ(1998)/g fat; hair, 33.82 +/- 17.74 pg WHO-TEQ(1998)/g dry wt) showed significantly higher levels of polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) than those from the reference site (human milk, 9.35 +/- 7.39 pg WHO-TEQ(1998)/g fat; placenta, 11.91 +/- 7.05 pg WHO-TEQ(1998)/g fat; hair, 5.59 +/- 4.36 pg WHO-TEQ(1998)/g dry wt) and were comparatively higher than those reported in other studies. The difference between the two sites was attributed to e-waste recycling operations, for example open burning, which led to high background levels. Moreover, mothers from the e-waste recycling site consumed more foods of animal origin. The estimated daily intake of PCDD/Fs within 6 months by breast-fed infants from the e-waste processing site was 2 times higher than that from the reference site, and the two values exceeded the WHO tolerable daily intake for adults by at least 25 and 11 times, respectively. Our results indicate that e-waste recycling operations cause elevated PCDD/F levels in the environment and in humans. The elevated body burden may have health implications for the next generation.
Abstract:
For at least two millennia and probably much longer, the traditional vehicle for communicating geographical information to end-users has been the map. With the advent of computers, the means of both producing and consuming maps have been radically transformed, while the inherent nature of the information product has also expanded and diversified rapidly. This has given rise in recent years to the new concept of geovisualisation (GVIS), which draws on the skills of the traditional cartographer but extends them into three spatial dimensions and may also add temporality, photorealistic representations and/or interactivity. Demand for GVIS technologies and their applications has increased significantly in recent years, driven by the need to study complex geographical events, and in particular their associated consequences, and to communicate the results of these studies to a diversity of audiences and stakeholder groups. GVIS involves data integration, multi-dimensional spatial display, advanced modelling techniques, dynamic design and development environments, and field-specific application needs. To meet these needs, GVIS tools should be both powerful and inherently usable, in order to facilitate their role in helping interpret and communicate geographic problems. However, no framework currently exists for ensuring this usability. The research presented here seeks to fill this gap by addressing the challenges of incorporating user requirements in GVIS tool design. It starts from the premise that usability in GVIS should be incorporated and implemented throughout the whole design and development process. To facilitate this, Subject Technology Matching (STM) is proposed as a new approach to assessing and interpreting user requirements. Based on STM, a new design framework called Usability Enhanced Coordination Design (UECD) is then presented with the purpose of improving the overall usability of the design outputs. UECD places GVIS experts in a new key role in the design process, to form a more coordinated and integrated workflow and more focused and interactive usability testing. To prove the concept, these theoretical elements of the framework have been implemented in two test projects: one is the creation of a coastal inundation simulation for Whitegate, Cork, Ireland; the other is a flood mapping tool for Zhushan Town, Jiangsu, China. The two case studies successfully demonstrated the potential merits of the UECD approach when GVIS techniques are applied to geographic problem solving and decision making. The thesis delivers a comprehensive understanding of the development and challenges of GVIS technology, its usability concerns, and the associated user-centred design (UCD); it explores the possibility of applying a UCD framework to GVIS design; it constructs a new theoretical design framework, UECD, which aims to make the whole design process usability driven; and it develops the key concept of STM into a template set to improve the performance of a GVIS design. These key conceptual and procedural foundations can be built on in future research aimed at further refining and developing UECD as a useful design methodology for GVIS scholars and practitioners.
Abstract:
The increasing complexity of new manufacturing processes and the continuously growing range of fabrication options mean that critical decisions about the insertion of new technologies must be made as early as possible in the design process. Mitigating the technology risks under limited knowledge is a key factor and a major requirement for securing the successful development of new technologies. In order to address this challenge, a risk mitigation methodology that incorporates both qualitative and quantitative analysis is required. This paper outlines the methodology being developed under a major UK grand challenge project, 3D-Mintegration. The main focus is on identifying the risks by identifying the product's key characteristics using a product breakdown approach. The assessment of the identified risks uses quantification and prioritisation techniques to evaluate and rank them. Traditional statistical process control, based on process capability and six sigma concepts, is applied to measure the process capability in the presence of the identified risks. This paper also details a numerical approach that can be used to undertake risk analysis. This methodology is based on a computational framework in which modelling and statistical techniques are integrated. An example of the modelling and simulation technique is also given for the focused ion beam process, which is among the manufacturing processes investigated in the project.
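A minimal sketch of the process-capability measurement mentioned above: Cp and Cpk computed from sample measurements of a key product characteristic against its specification limits, with the usual normal-model translation to an approximate out-of-spec rate; the specification limits and measurements are invented, not taken from the 3D-Mintegration project.

```python
import numpy as np
from scipy.stats import norm

def process_capability(samples, lsl, usl):
    """Cp, Cpk, and an approximate defect rate (ppm) assuming normality."""
    samples = np.asarray(samples, dtype=float)
    mu, sigma = samples.mean(), samples.std(ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    # Expected out-of-spec fraction under the fitted normal model.
    p_defect = norm.cdf(lsl, mu, sigma) + norm.sf(usl, mu, sigma)
    return cp, cpk, p_defect * 1e6   # parts per million

# Invented measurements of a key characteristic with spec limits 9.90-10.10.
rng = np.random.default_rng(2)
measurements = rng.normal(10.02, 0.025, size=100)
cp, cpk, ppm = process_capability(measurements, lsl=9.90, usl=10.10)
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}, ~{ppm:.0f} ppm out of spec")
```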