844 results for Failure time data analysis


Relevance:

100.00%

Abstract:

China lies between the circum-Pacific and Mediterranean-Himalayan seismic belts; seismic activity in the country is frequent, as are the slope collapses and landslides that earthquakes trigger. Most of these occur in the mountainous, earthquake-prone west of China, especially in Sichuan and Yunnan Provinces. When a strong earthquake strikes a mountainous area, the damage caused by the geological hazards it triggers (rock collapses, landslides, and debris flows) is often heavier than the damage it causes directly; loss statistics for several representative earthquakes show that fatalities from triggered geological hazards often account for half or more of the total death toll. Such hazards therefore attract wide attention because of their great cost. Based on field geological investigation, engineering geological exploration, and analysis of available data, this study systematically examines the formation mechanisms, key inducing factors, and dynamic characteristics of earthquake-triggered slope collapses and landslides by means of comprehensive engineering geomechanical analysis, finite-difference numerical simulation, laboratory dynamic triaxial shear tests on rock, and discrete element numerical simulation. From the study of a large number of collapses and landslides triggered by the Wenchuan and Xiaonanhai earthquakes, two methods are developed: one recovers the original topography from factors such as lithology and elevation comparison, and the other reconstructs the collapse and sliding process of a slope from the characteristics of the seismotectonic zone, structural fissures, the spatial distribution of debris diameters, and the propagation direction and mechanical properties of the seismic waves. The types, formation mechanisms, and dynamic characteristics of earthquake-induced slope failures are then discussed comprehensively. First, the accumulated collapse and slide deposits are heavily or even completely fragmented. Second, the dynamic process of slope collapse and sliding generally consists of four stages: fragmentation, ejection, crushing, and river blocking. Third, classified by failure form, there are four common types: collapse, landslide, landslide-debris flow, and vibration liquefaction. Fourth, the key inducing factors typically include the characteristics of the seismotectonic belt, the structure of the rock mass, the terrain and geomorphology, the weathering degree of the rock mass, and the mechanical action of the seismic waves. A microscopic study of the initial fracturing of slopes under seismic loading, combined with two distance-dependent trends (the ratio of vertical to horizontal peak ground acceleration as a function of epicentral distance, and the amplification of peak ground acceleration along the slope), identifies the key intrinsic factor of initial slope fracturing at different epicentral distances. In the near field (epicentral distance less than 30 km), the tensile strength of the rock mass is the key intrinsic factor; beyond 30 km, it is the shear strength of the rock mass. In the latter case, finite-difference simulation and laboratory dynamic triaxial shear tests show that the initial fracture always begins at the slope shoulder: the fracture strain and shear strength, which are proportional to the burial depth of the rock mass, are lower at the shoulder than elsewhere, and peak ground acceleration is amplified there, so failure occurs at the shoulder first. The key extrinsic factors at different epicentral distances are obtained from discrete element simulation of the entire collapse and sliding process triggered by the Wenchuan Earthquake. Within 64 km of the initial epicenter along the earthquake-triggering structure, the combined action of P and S waves is the key factor: the vertical tensile action of P waves dominates near the epicenter, and the vertical shear action of S waves gradually takes over as epicentral distance increases within this range. Between 64 km and 216 km, the action of P waves alone becomes the key factor, with their horizontal tensile action gradually replacing the combined vertical and horizontal tensile action as distance increases. In addition, the initial failure triggered by strong earthquakes almost always begins at the slope shoulder; when it begins at the slope toe, this is probably related to the slope gradient and the attitude of the rock. Finally, the onset time of initial slope failure generally increases with epicentral distance, probably because the seismic waves attenuate as they travel from the epicenter along the earthquake-triggering structure. These conclusions are of great theoretical and practical significance for constructing towns and infrastructure in fragile geological environments along seismotectonic belts and for managing the risk of earthquake-triggered geological hazards.
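
The distance thresholds reported above lend themselves to a compact summary. A minimal sketch in Python (the 30, 64, and 216 km cut-offs are taken from the abstract; the function and wording are illustrative, not from the original study):

```python
def key_inducing_factors(epicentral_distance_km):
    """Map epicentral distance to the key factors reported in the abstract.

    Illustrative helper only; the thresholds (30, 64, 216 km) are the values
    reported above for slopes along the earthquake-triggering structure.
    """
    intrinsic = ("tensile strength of rock mass" if epicentral_distance_km < 30
                 else "shear strength of rock mass")
    if epicentral_distance_km < 64:
        extrinsic = ("combined P and S wave action: P-wave vertical tension "
                     "near the epicenter, S-wave vertical shear farther out")
    elif epicentral_distance_km <= 216:
        extrinsic = ("P waves alone, with horizontal tensile action "
                     "dominating as distance increases")
    else:
        extrinsic = "beyond the range characterized in this study"
    return {"intrinsic": intrinsic, "extrinsic": extrinsic}

print(key_inducing_factors(45.0))
```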

Relevance:

100.00%

Abstract:

On the basis of geological analysis and analysis of the toppling deformation and failure mechanism of the rock mass of the Longtan project's left bank slope, a synthetic space-time analysis and influence-factor analysis of the surface monitoring data and deep rock mass monitoring data of Zone B of the left bank slope are carried out. Combining the monitoring data analysis with earlier mechanical analysis results, the deformation state of Zone B is discussed and its stability is evaluated synthetically. The main contents and results are as follows. First, based on analysis of monitoring borehole histograms of the Longtan left bank slope, numerical simulation, and model experiments on bedded counter-inclined steep slopes, a new type of toppling deformation and failure mode, "up-slope warping", is proposed; the deformation and failure modes of bedded counter-inclined steep slopes are then summarized as three types: "down-slope toppling", "up-slope warping", and "complex fold". Second, the synthetic space-time analysis of the surface and deep rock mass monitoring data of Zone B shows that potentially unstable rock mass exists above the 520 m elevation, especially above 560 m, and that the rock mass around the strong-weathering line, or the broken band of creeping rock mass, controls the deformation of the whole slope. Third, from the synthetic space-time and influence-factor analysis of the surface monitoring data of Zone B, a dynamic index, the cumulative acceleration index, is proposed for analyzing the factors that influence slope surface deformation; its principle and method are explained, and the index can be applied to the influence-factor analysis of similar slopes. Finally, summarizing the geological, monitoring, and mechanical analyses, the stability of Zone B is basically good; however, even with drainage and toe-loading works in place, some creep deformation persists in the rock mass above 520 m, especially above 560 m, so improved monitoring and timely maintenance of the drainage system are recommended.

Relevance:

100.00%

Abstract:

The Guangxi Longtan Hydropower Station is a representative project of China's Western Development and West-to-East Power Transmission programs, and is second only to the Three Gorges Project among hydropower stations under construction in China. There are 770 × 10^4 m^3 of creeping rock mass on the left bank slope upstream of the dam, within which nine water inlet tunnels and several underground plant buildings are located. Since the 435 m high excavated slope threatens the security of the dam, its deformation and stability are of great importance to the power station. Based on Autodesk Map 2004, the Longtan Hydropower Station Left Bank Monitoring Information System has been essentially completed. By integrating the monitoring information into a Geographic Information System (GIS) environment, managers and engineers can obtain the deformation information of the slope dynamically by querying symbols, which improves the correctness of the designers' analyses and supports strategic, well-founded decisions. The system helps manage the monitoring data effectively, saves design and construction costs, and reduces the engineers' workload, making it a successful combination of hydropower station monitoring information management and computer information technology. In addition, on the basis of geological analysis and analysis of the toppling deformation and failure mechanism of the rock mass of the Longtan left bank slope, a synthetic space-time analysis and influence-factor analysis of the surface and deep rock mass monitoring data of Zone A of the left bank slope are carried out. They show that the main intrinsic factor affecting the deformation of Zone A is the toppling structure of the interbedded argillite and limestone, and that the main external factors are rainfall and slope excavation. Furthermore, the Degree of Reinforcement Demand (DRD), derived from engineering geomechanics meta-synthesis (EGMS), is used to evaluate the reinforcement of Zone A; the result shows that the slope has been effectively reinforced and is more stable after reinforcement. Finally, after comparison with several forecasting models, a synthetic GRAV forecasting model is presented and used to forecast the deformation of Zone A during the power generation period. The results indicate that the GRAV model has good forecasting precision, strong stability, and practical reliability.

Relevance:

100.00%

Abstract:

Lee, M. H. (1999). Qualitative circuit models in failure analysis reasoning. Artificial Intelligence, 111, 239-276.

Relevance:

100.00%

Abstract:

Urquhart, C., Durbin, J. & Spink, S. (2004). Training needs analysis of healthcare library staff, undertaken for South Yorkshire Workforce Development Confederation. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: South Yorkshire WDC (NHS)

Relevance:

100.00%

Abstract:

This work considers the static calculation of a program's average-case time. The number of systems that currently tackle this research problem is quite small, owing to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is discussed in this work, only one of them forms the basis of this research: MOQA. The MOQA system consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labeling distribution. This research develops and evaluates the MOQA language implementation and adds to the functions already available in the language. Furthermore, the theory behind MOQA is generalised, and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Some of the MOQA applications and extensions suggested in other works are also examined here: for example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses carried out in the course of this research reveal some of MOQA's strengths and weaknesses. This thesis aims to be pragmatic in evaluating the current MOQA theory, the advancements set forth in the following work, and the benefits of MOQA compared to similar systems. Succinctly, this work's significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA's accomplishments and a serious deliberation on the opportunities available to MOQA in the future.
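
MOQA itself is not reproduced here, but the quantity its static analysis derives, the exact average-case cost under a known labeling distribution, can be checked by brute force on small inputs. A minimal sketch (insertion sort over uniformly random inputs; plain Python, not MOQA code):

```python
from itertools import permutations

def insertion_sort_comparisons(seq):
    """Count the comparisons insertion sort makes on one input."""
    a, comparisons = list(seq), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return comparisons

def average_case(n):
    """Exact average comparison count over all n! equally likely inputs."""
    perms = list(permutations(range(n)))
    return sum(insertion_sort_comparisons(p) for p in perms) / len(perms)

for n in range(2, 7):
    print(n, average_case(n))
```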

Relevance:

100.00%

Abstract:

The aging population in many countries brings into focus rising healthcare costs and pressure on conventional healthcare services. Pervasive healthcare has emerged as a viable, technology-driven way to alleviate these problems by allowing healthcare to move from hospital-centred care to self-care, mobile care, and at-home care. The state-of-the-art studies in this field, however, lack a systematic approach to providing comprehensive pervasive healthcare solutions from data collection to data interpretation and from data analysis to data delivery. In this thesis we introduce the Context-aware Real-time Assistant (CARA) architecture, which integrates novel approaches with state-of-the-art technology to provide a full-scale pervasive healthcare solution, with an emphasis on context awareness, to help maintain the well-being of elderly people. CARA collects information about and around the individual in a home environment and enables accurate recognition and continuous monitoring of activities of daily living. It employs an innovative reasoning engine to provide accurate real-time interpretation of the context and assessment of the current situation. Mindful that the system is used for sensitive personal applications, CARA includes several mechanisms to make its sophisticated intelligent components as transparent and accountable as possible, as well as a novel cloud-based component for more effective data analysis. To deliver automated real-time services, CARA supports interactive video and remote consultation based on medical sensors. The proposal has been validated in three application domains rich in pervasive contexts and real-time scenarios: (i) mobile-based activity recognition, (ii) intelligent healthcare decision support systems, and (iii) home-based remote monitoring systems.

Relevance:

100.00%

Abstract:

© 2015, Institute of Mathematical Statistics. All rights reserved. In order to use persistence diagrams as a true statistical tool, it would be very useful to have a good notion of mean and variance for a set of diagrams. In [23], Mileyko and his collaborators made the first study of the properties of the Fréchet mean in (Dp, Wp), the space of persistence diagrams equipped with the p-th Wasserstein metric. In particular, they showed that the Fréchet mean of a finite set of diagrams always exists but is not necessarily unique. Moreover, the means of a continuously varying set of diagrams do not themselves (necessarily) vary continuously, which presents obvious problems when trying to extend the Fréchet mean definition to time-varying persistence diagrams, better known as vineyards. We fix this problem by altering the original definition of the Fréchet mean so that it becomes a probability measure on the set of persistence diagrams; in a nutshell, the mean of a set of diagrams is a weighted sum of atomic measures, where each atom is itself a persistence diagram determined using a perturbation of the input diagrams. This definition gives, for each N, a map (Dp)^N → ℙ(Dp). We show that this map is Hölder continuous on finite diagrams and thus can be used to build a useful statistic on vineyards.
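
For reference, the Fréchet mean discussed above is the usual metric-space notion, specialized to (Dp, Wp); a standard formulation (shown with the squared distance, though conventions vary on whether the distance is squared or raised to the p-th power):

```latex
% Fréchet function over diagrams D_1, ..., D_N in (D_p, W_p);
% the Fréchet mean is its (possibly non-unique) minimizer set.
F(X) = \frac{1}{N} \sum_{i=1}^{N} W_p\left(X, D_i\right)^{2},
\qquad
\operatorname{mean}(D_1, \dots, D_N) = \operatorname*{arg\,min}_{X \in D_p} F(X).
```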

Relevance:

100.00%

Abstract:

PREMISE OF THE STUDY: We investigated the origins of 252 Southern Appalachian woody species representing 158 clades to analyze larger patterns of biogeographic connectivity around the northern hemisphere. We tested biogeographic hypotheses regarding the timing of species disjunctions to eastern Asia and among areas of North America. METHODS: We delimited species into biogeographically informative clades, compiled sister-area data, and generated graphic representations of area connections across clades. We calculated taxon diversity within clades and plotted divergence times. KEY RESULTS: Of the total taxon diversity, 45% were distributed among 25 North American endemic clades. Sister taxa within eastern North America and eastern Asia were proportionally equal in frequency, accounting for over 50% of the sister-area connections. At increasing phylogenetic depth, connections to the Old World dominated. Divergence times for 65 clades with intercontinental disjunctions were continuous, whereas 11 intracontinental disjunctions to western North America and nine to eastern Mexico were temporally congruent. CONCLUSIONS: Over one third of the clades have likely undergone speciation within the region of eastern North America. The biogeographic pattern for the region is asymmetric, consisting of mostly mixed-aged, low-diversity clades connecting to the Old World, and a minority of New World clades. Divergence time data suggest that climate change in the Late Miocene to Early Pliocene generated disjunct patterns within North America. Continuous splitting times during the last 45 million years support the hypothesis that widespread distributions formed repeatedly during favorable periods, with serial cooling trends producing pseudocongruent area disjunctions between eastern North America and eastern Asia.

Relevance:

100.00%

Abstract:

The outcomes of both (i) radiation therapy and (ii) preclinical small animal radiobiology studies depend on the delivery of a known quantity of radiation to a specific, intended location. Adverse effects can result if the dose to the target is too high or too low, or if an incorrect spatial distribution undesirably damages nearby healthy tissue through poor radiation delivery technique. Thus, in mice and humans alike, the spatial dose distributions from radiation sources should be well characterized in terms of the absolute dose and with pinpoint spatial accuracy. With the steep spatial dose gradients found in (i) high dose rate (HDR) brachytherapy and (ii) the small organs and tissue inhomogeneities of mice, obtaining accurate and highly precise dose results can be very challenging, given that commercially available radiation detection tools, such as ion chambers, are often too large for in-vivo use.

In this dissertation two tools are developed and applied for both clinical and preclinical radiation measurement. The first tool is a novel radiation detector for acquiring physical measurements, fabricated from an inorganic nano-crystalline scintillator that has been fixed on an optical fiber terminus. This dosimeter allows for the measurement of point doses to sub-millimeter resolution, and has the ability to be placed in-vivo in humans and small animals. Real-time data is displayed to the user to provide instant quality assurance and dose-rate information. The second tool utilizes an open source Monte Carlo particle transport code, and was applied for small animal dosimetry studies to calculate organ doses and recommend new techniques of dose prescription in mice, as well as to characterize dose to the murine bone marrow compartment with micron-scale resolution.

Hardware design changes reduced the overall fiber diameter of the nano-crystalline scintillator based fiber optic detector (NanoFOD) system to <0.9 mm. The lower limit of device sensitivity was found to be approximately 0.05 cGy/s. The detector was demonstrated to perform quality assurance of clinical 192Ir HDR brachytherapy procedures, providing dose measurements comparable to thermoluminescent dosimeters and accuracy within 20% of the treatment planning software (TPS) for the 27 treatments conducted, with an interquartile range of the measured-to-TPS dose ratio of 0.94 to 1.02 (width 0.08). After removing contaminant signals (Cerenkov and diode background), calibration of the detector enabled accurate dose measurements for vaginal applicator brachytherapy procedures. For 192Ir use, the energy response changed by a factor of 2.25 over SDD values of 3 to 9 cm; a cap of 0.2 mm thick silver reduced the energy dependence to a factor of 1.25 over the same SDD range, at the cost of reducing overall sensitivity by 33%.

For preclinical measurements, the dose accuracy of the NanoFOD was within 1.3% of MOSFET-measured dose values in a cylindrical mouse phantom at 225 kV for x-ray irradiation at angles of 0°, 90°, 180°, and 270°. The NanoFOD exhibited small changes in angular sensitivity, with a coefficient of variation (COV) of 3.6% at 120 kV and 1% at 225 kV. When the NanoFOD was placed alongside a MOSFET in the liver of a sacrificed mouse and treatment was delivered at 225 kV with a 0.3 mm Cu filter, the dose difference was only 1.09% with the 4 × 4 cm collimator and -0.03% with no collimation. Additionally, the NanoFOD used a scintillator of 11 µm thickness to measure small x-ray fields for microbeam radiation therapy (MRT) applications, achieving 2.7% dose accuracy at the microbeam peak in comparison to radiochromic film. Modest differences in the measured full-width at half maximum lateral dimension of the MRT beam were observed between the NanoFOD (420 µm) and radiochromic film (320 µm), but these differences are explained mostly as an artifact of the geometry used and volumetric effects in the scintillator material. The energy dependence of the yttrium-oxide based scintillator material was characterized over the range 40-320 kV (2 mm Al filtration), with maximum device sensitivity at 100 kV. Tissue maximum ratio measurements on a small animal x-ray irradiator at 320 kV showed an average difference of 0.9% compared to a MOSFET dosimeter over depths of 2.5 to 33 cm in tissue-equivalent plastic blocks. Irradiating the NanoFOD fiber and scintillator material to 1600 Gy on a 137Cs gamma irradiator produced no measurable change in light output, suggesting that the NanoFOD system may be re-used without replacement or recalibration over its lifetime.

For small animal irradiator systems, researchers deliver a given dose to a target organ by controlling the exposure time, which is currently calculated by dividing the desired total dose by a single quoted dose rate, independent of the target organ. The studies conducted here used Monte Carlo particle transport codes to justify a new method of dose prescription in mice that considers organ-specific doses. Monte Carlo simulations were performed in the Geant4 Application for Tomographic Emission (GATE) toolkit using a MOBY whole-body mouse phantom. The non-homogeneous phantom comprised 256 × 256 × 800 voxels of size 0.145 × 0.145 × 0.145 mm3. Differences of up to 20-30% in dose to soft-tissue target organs were demonstrated, and methods for alleviating these errors during whole-body irradiation of mice were suggested, namely using organ-specific and x-ray tube filter-specific dose rates for all irradiations.
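
The prescription change argued for here amounts to replacing the single machine-level dose rate with an organ- and filter-specific rate when computing exposure time. A minimal sketch (the rate table values are placeholders, not data from this work):

```python
# Hypothetical organ- and filter-specific dose rates (cGy/s); placeholder values.
DOSE_RATES = {
    ("liver",  "0.3 mm Cu"): 1.10,
    ("lung",   "0.3 mm Cu"): 1.25,
    ("marrow", "0.3 mm Cu"): 0.95,
}

def exposure_time(target_dose_cgy, organ, x_ray_filter):
    """Exposure time = prescribed dose / organ-specific dose rate,
    instead of dividing by one machine-level dose rate."""
    rate = DOSE_RATES[(organ, x_ray_filter)]
    return target_dose_cgy / rate

print(exposure_time(600.0, "liver", "0.3 mm Cu"), "s")
```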

Monte Carlo analysis was performed on 1 µm resolution CT images of a mouse femur and a mouse vertebra to calculate the dose gradients within the bone marrow (BM) compartment of mice for different radiation beam qualities relevant to x-ray and isotope irradiators. The results indicated that soft x-ray beams (160 kV at 0.62 mm Cu HVL and 320 kV at 1 mm Cu HVL) lead to substantially higher dose to BM in close proximity to mineral bone (within about 60 µm) compared to hard x-ray beams (320 kV at 4 mm Cu HVL) and isotope-based gamma irradiators (137Cs). The average dose increases to the BM in the vertebra for these four beam qualities were 31%, 17%, 8%, and 1%, respectively. Both in-vitro and in-vivo experiments confirmed these simulation results, demonstrating that the 320 kV, 1 mm Cu HVL beam caused statistically significant increased killing of BM cells at 6 Gy dose levels compared to both the 320 kV, 4 mm Cu HVL and the 662 keV 137Cs beams.

Relevance:

100.00%

Abstract:

The International Maritime Organisation (IMO) has adopted the use of computer simulation to assist in the assessment of the assembly time for passenger ships. A key parameter required for this analysis and specified as part of the IMO guidelines is the passenger response time distribution. It is demonstrated in this paper that the IMO specified response time distribution assumes an unrealistic mathematical form. This unrealistic mathematical form can lead to serious congestion issues being overlooked in the evacuation analysis and lead to incorrect conclusions concerning the suitability of vessel design. In light of these results, it is vital that IMO undertake research to generate passenger response time data suitable for use in evacuation analysis of passenger ships. Until this type of data becomes readily available, it is strongly recommended that rather than continuing to use the artificial and unrepresentative form of the response time distribution, IMO should adopt plausible and more realistic response time data derived from land based applications. © 2005: Royal Institution of Naval Architects.
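
A lognormal shape is one commonly reported form for response times in land-based evacuation studies, so a replacement along the lines recommended above might be sampled as follows. A minimal sketch (the parameters are illustrative assumptions, not values from IMO or any land-based dataset):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative lognormal response times (seconds); mu and sigma are placeholders.
mu, sigma = np.log(60.0), 0.6
response_times = rng.lognormal(mean=mu, sigma=sigma, size=1000)

# In an evacuation analysis, each sampled value would be assigned as an
# agent's pre-movement delay; here we just summarize the sample.
print(f"median={np.median(response_times):.1f}s  "
      f"90th percentile={np.percentile(response_times, 90):.1f}s")
```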

Relevance:

100.00%

Abstract:

Modeling of on-body propagation channels is of paramount importance to those wishing to evaluate radio channel performance for wearable devices in body area networks (BANs). Difficulties in modeling arise from the highly variable channel conditions related to changes in the user's state and local environment. This study characterizes these influences by using time-series analysis to examine and model signal characteristics for on-body radio channels in stationary and mobile user scenarios in four different locations: an anechoic chamber, an open office area, a hallway, and an outdoor environment. Autocorrelation and cross-correlation functions are reported and shown to depend on body state and surroundings. Autoregressive (AR) transfer functions are used to perform the time-series analysis and develop models of the fading on various on-body links. Because the logarithmically transformed observed signal envelope is non-Gaussian in the majority of mobile user states, a simple method for reproducing the fading based on lognormal and Nakagami statistics is proposed. The validity of the AR models is evaluated using hypothesis testing based on the Ljung-Box statistic, and the estimated distributional parameters of the simulator output are compared with those from experimental results.
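
The modeling chain described above, fitting an AR transfer function to the log-transformed envelope and validating it with the Ljung-Box statistic, can be sketched with standard tooling. A minimal sketch using statsmodels on a synthetic envelope (the data, AR order, and lag choice are placeholders):

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)

# Placeholder for a measured on-body envelope: synthetic lognormal fading.
envelope_db = 10 * np.log10(rng.lognormal(mean=0.0, sigma=0.3, size=2000))

# Fit an AR(p) model to the log-envelope (order chosen for illustration).
model = AutoReg(envelope_db, lags=5).fit()

# Ljung-Box hypothesis test: residuals should be uncorrelated if the AR
# model has captured the channel's temporal structure.
lb = acorr_ljungbox(model.resid, lags=[10], return_df=True)
print(model.params)
print(lb)
```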

Relevance:

100.00%

Abstract:

Wavelet transforms provide basis functions for time-frequency analysis and have properties that are particularly useful for compressing analogue point-on-wave transient and disturbance power system signals. This paper evaluates the compression properties of the discrete wavelet transform using actual power system data. The results presented indicate that reduction ratios of up to 10:1 are achievable with acceptable distortion. The paper discusses the application of the reduction method to expedient fault analysis and protection assessment.
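
The compression scheme evaluated here, wavelet decomposition followed by discarding small coefficients, can be sketched with PyWavelets. A minimal sketch on a synthetic disturbance signal (the signal, wavelet, and threshold are illustrative choices, not the paper's):

```python
import numpy as np
import pywt

# Synthetic 50 Hz waveform with a superimposed decaying transient (placeholder data).
t = np.linspace(0, 0.2, 2048)
signal = np.sin(2 * np.pi * 50 * t)
signal[800:850] += 0.8 * np.exp(-np.arange(50) / 10.0)

# Decompose, zero out small coefficients, reconstruct.
coeffs = pywt.wavedec(signal, "db4", level=5)
arr, slices = pywt.coeffs_to_array(coeffs)
threshold = 0.05 * np.max(np.abs(arr))
kept = np.abs(arr) >= threshold
compressed = np.where(kept, arr, 0.0)
reconstructed = pywt.waverec(
    pywt.array_to_coeffs(compressed, slices, output_format="wavedec"), "db4")

ratio = arr.size / max(kept.sum(), 1)  # coefficient reduction ratio, e.g. ~10:1
distortion = (np.linalg.norm(signal - reconstructed[:signal.size])
              / np.linalg.norm(signal))
print(f"reduction ratio ~ {ratio:.1f}:1, relative distortion = {distortion:.3%}")
```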

Relevance:

100.00%

Abstract:

The purpose of this study was to explore the care processes experienced by community-dwelling adults dying from advanced heart failure, their family caregivers, and their health-care providers. A descriptive qualitative design was used to guide data collection, analysis, and interpretation. The sample comprised 8 patients, 10 informal caregivers, 11 nurses, 3 physicians, and 3 pharmacists. Data analysis revealed that palliative care was influenced by unique contextual factors (i.e., cancer model of palliative care, limited access to resources, prognostication challenges). Patients described choosing interventions and living with fatigue, pain, shortness of breath, and functional decline. Family caregivers described surviving caregiver burden and drawing on their faith. Health professionals described their role as trying to coordinate care, building expertise, managing medications, and optimizing interprofessional collaboration. Participants strove towards 3 outcomes: effective symptom management, satisfaction with care, and a peaceful death. © McGill University School of Nursing.

Relevance:

100.00%

Abstract:

The problem of model selection for a univariate long memory time series is investigated once a semiparametric estimator has been used for the long memory parameter. Standard information criteria are not consistent in this case. A Modified Information Criterion (MIC) that overcomes these difficulties is introduced, and proofs of its asymptotic validity are provided. The results are general and cover a wide range of short memory processes. Simulation evidence compares the new and existing methodologies, and empirical applications to monthly inflation and daily realized volatility are presented.
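
The setting assumed above, in which the long memory parameter has already been estimated semiparametrically, can be illustrated with the classic log-periodogram (GPH) regression. A minimal sketch (a textbook GPH estimator, not the paper's MIC):

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the long-memory parameter d.

    Regress log I(lambda_j) on -log(4 sin^2(lambda_j / 2)) over the first
    m Fourier frequencies; the slope of the regression estimates d.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if m is None:
        m = int(n ** 0.5)  # a common bandwidth choice
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / n
    # Periodogram at the first m Fourier frequencies.
    dft = np.fft.fft(x - x.mean())
    I = (np.abs(dft[1:m + 1]) ** 2) / (2 * np.pi * n)
    X = -np.log(4 * np.sin(lam / 2) ** 2)
    slope, _ = np.polyfit(X, np.log(I), 1)
    return slope

# White noise has d = 0, so the estimate should be near zero.
rng = np.random.default_rng(1)
print(gph_estimate(rng.standard_normal(4096)))
```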