903 results for Structural Risk Analysing method
Abstract:
The purpose of this thesis was to investigate environmental permits of landfills with respect to the appropriateness of risk assessments, focusing on contaminant migration, structures capable of protecting the environment, waste and leachate management, and the existing environmental impacts of landfills. According to the requirements, a risk assessment is always required to demonstrate compliance with environmental protection requirements if the environmental permit decision deviates from the set requirements. However, there is reason to doubt that current risk assessment practices identify all the risk factors relevant to protecting people and the environment. In this dissertation, risk factors were identified in 12 randomly selected landfills. Based on this analysis, a structural risk assessment method was created and verified with two case examples. Several development needs were found in the risk assessments of the environmental permit decisions. The risk analysis equations used in the decisions did not adequately take into account all the determining factors, such as waste prospects, total risk quantification, or human-delineated factors. Instead of focusing on these crucial factors, the environmental protection capability of a landfill is simply expressed via technical parameters such as hydraulic conductivity. This thesis shows that, with adequate risk assessment approaches, the most essential environmental impacts can be taken into account by considering contaminant transport mechanisms, leachate effects, and artificial landfill structures. The developed structural risk analysing (SRA) method shows that landfill structures could be designed in a more cost-efficient way by taking advantage of recycled materials or by-products. Additionally, the research results demonstrate that the environmental protection requirements of landfills should be updated to correspond to the actual capability to protect the environment, instead of the current simplified requirements related to advective transport only.
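To make the criticism of requirements framed around "advective transport only" concrete, a standard textbook form of the one-dimensional advection-dispersion equation is shown below; these are generic formulas given for illustration, not the equations used in the thesis or in the permit decisions.

\[
R\,\frac{\partial C}{\partial t}
  = D\,\frac{\partial^{2} C}{\partial x^{2}}
  - v\,\frac{\partial C}{\partial x},
\qquad
v = \frac{K\,i}{n_e}
\]

Here C is the contaminant concentration, D the hydrodynamic dispersion coefficient, v the seepage velocity, K the hydraulic conductivity, i the hydraulic gradient, n_e the effective porosity, and R the retardation factor; a requirement based on K alone constrains only the advective term and says nothing about dispersion, retardation, or leachate effects.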
Abstract:
Part 17: Risk Analysis
Abstract:
This study investigates topology optimization of energy absorbing structures in which material damage is accounted for in the optimization process. The optimization objective is to design the lightest structures that are able to absorb the required mechanical energy. A structural continuity constraint check is introduced that is able to detect when no feasible load path remains in the finite element model, usually as a result of large scale fracture. This assures that designs do not fail when loaded under the conditions prescribed in the design requirements. This continuity constraint check is automated and requires no intervention from the analyst once the optimization process is initiated. Consequently, the optimization algorithm proceeds towards evolving an energy absorbing structure with the minimum structural mass that is not susceptible to global structural failure. A method is also introduced to determine when the optimization process should halt. The method identifies when the optimization method has plateaued and is no longer likely to provide improved designs if continued for further iterations. This provides the designer with a rational method to determine the necessary time to run the optimization and avoid wasting computational resources on unnecessary iterations. A case study is presented to demonstrate the use of this method.
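A minimal sketch of how such a continuity check could be automated is given below; it treats the finite element mesh as a graph, removes elements flagged as fully damaged, and tests whether any load path still connects the loaded nodes to the supported nodes. The function name, the damage flag, and the breadth-first search are illustrative assumptions, not the study's implementation.

    from collections import deque

    def load_path_exists(elements, damaged, loaded_nodes, support_nodes):
        """Return True if at least one path of intact elements connects a
        loaded node to a supported node (illustrative continuity check).

        elements      : list of tuples of node ids forming each element
        damaged       : set of element indices removed by fracture/damage
        loaded_nodes  : node ids where the prescribed load is applied
        support_nodes : node ids attached to the boundary conditions
        """
        # build node-to-node adjacency using only intact elements
        adjacency = {}
        for e, nodes in enumerate(elements):
            if e in damaged:
                continue
            for a in nodes:
                for b in nodes:
                    if a != b:
                        adjacency.setdefault(a, set()).add(b)

        # breadth-first search from the loaded nodes towards the supports
        frontier = deque(loaded_nodes)
        visited = set(loaded_nodes)
        while frontier:
            node = frontier.popleft()
            if node in support_nodes:
                return True
            for nxt in adjacency.get(node, ()):
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append(nxt)
        return False

    # toy 1D example: elements (0-1)(1-2)(2-3); element 1 has fractured
    print(load_path_exists([(0, 1), (1, 2), (2, 3)], {1}, [0], {3}))  # False

In an optimization loop of the kind described, a candidate design failing this check would be rejected or penalised before its mass objective is evaluated.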
Abstract:
Background: The incidence of venous lesions following transvenous cardiac device implantation is high. Previous implantation of temporary leads ipsilateral to the permanent device and a depressed left ventricular ejection fraction have been associated with an increased risk of venous lesions, though the effects of preventive strategies remain controversial. This randomized trial examined the effects of warfarin in the prevention of these complications in high-risk patients. Method: Between February 2004 and September 2007, we studied 101 adults who underwent a first cardiac device implantation and who had a left ventricular ejection fraction <= 0.40, a temporary pacing system ipsilateral to the permanent implant, or both. After device implantation, the patients were randomly assigned to warfarin, titrated to a target international normalized ratio of 2.0-3.5, or to placebo. Clinical and laboratory evaluations were performed regularly up to 6 months postimplant. Venous lesions were detected at 6 months by digital subtraction venography. Results: Venous obstructions of various degrees were observed in 46 of the 92 patients (50.0%) who underwent venography. The frequency of venous obstruction was 60.4% in the placebo group versus 38.6% in the warfarin group (P = 0.018), corresponding to an absolute risk reduction of 22% (relative risk = 0.63; 95% confidence interval = 0.013-0.42). Conclusions: Warfarin prophylaxis lowered the frequency of venous lesions after transvenous device implantation in high-risk patients. (PACE 2009; 32:S247-S251)
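As a quick check of the reported effect sizes, the absolute and relative risk can be recomputed from the group event rates quoted in the abstract; the snippet below is pure arithmetic on the published percentages, and small rounding differences against the reported relative risk of 0.63 are expected.

    placebo_rate = 0.604    # venous obstruction frequency in the placebo group
    warfarin_rate = 0.386   # venous obstruction frequency in the warfarin group

    absolute_risk_reduction = placebo_rate - warfarin_rate   # ~0.218, i.e. ~22%
    relative_risk = warfarin_rate / placebo_rate             # ~0.64 (reported: 0.63)

    print(f"ARR = {absolute_risk_reduction:.1%}, RR = {relative_risk:.2f}")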
Abstract:
Subcycling, or the use of different timesteps at different nodes, can be an effective way of improving the computational efficiency of explicit transient dynamic structural solutions. The method that has been most widely adopted uses a nodal partition, extending the central difference method, in which small timestep updates are performed by interpolating the displacement at neighbouring large timestep nodes. This approach leads to narrow bands of unstable timesteps, or statistical stability. It can also be in error due to the lack of momentum conservation on the timestep interface. The author has previously proposed energy conserving algorithms that avoid the first problem of statistical stability. However, these sacrifice accuracy to achieve stability. An approach to conserve momentum on an element interface by adding partial velocities is considered here. Applied to extend the central difference method, this approach is simple and has accuracy advantages. The method can be programmed by summing impulses of internal forces, evaluated using local element timesteps, in order to predict a velocity change at a node. However, it is still only statistically stable, so the timestep size needs to be adaptive, with accuracy monitored and the step adjusted if necessary. By replacing the central difference method with the explicit generalized alpha method, it is possible to gain stability by dissipating the high frequency response that leads to stability problems. However, coding the algorithm is less elegant, as the response depends on previous partial accelerations. Extension to implicit integration is shown to be impractical due to the neglect of remote effects of internal forces acting across a timestep interface. (C) 2002 Elsevier Science B.V. All rights reserved.
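The impulse-summation idea can be illustrated with a deliberately simplified one-dimensional sketch: each spring element is integrated with its own small timestep, its internal-force impulses are accumulated at the nodes over one major step, and the summed impulse divided by the nodal mass gives the velocity change. This is a conceptual toy, not the author's algorithm; the interpolation of neighbouring large-timestep displacements and all stability safeguards are omitted.

    import numpy as np

    def major_step(u, v, m, k, dt_elem, DT):
        """Advance a 1D spring chain by one major step DT.

        u, v, m : nodal displacement, velocity and mass arrays (length n+1)
        k       : spring stiffness per element (length n)
        dt_elem : local element timestep per element (each divides DT)
        """
        impulse = np.zeros_like(u)
        for e, (ke, dte) in enumerate(zip(k, dt_elem)):
            ue = u[e:e + 2].copy()          # local copy of the element's end nodes
            n_sub = int(round(DT / dte))
            for _ in range(n_sub):
                f = ke * (ue[1] - ue[0])    # internal spring force
                impulse[e] += f * dte       # impulse on the left node
                impulse[e + 1] -= f * dte   # equal and opposite on the right node
                ue += v[e:e + 2] * dte      # crude local sub-step of end displacements
        v_new = v + impulse / m             # velocity change from the summed impulses
        u_new = u + v_new * DT              # displacement update (central-difference style)
        return u_new, v_new

    # toy usage: 3 nodes, 2 springs, element 1 sub-cycled with a quarter timestep
    u0 = np.array([0.0, 0.01, 0.0]); v0 = np.zeros(3); m = np.ones(3)
    print(major_step(u0, v0, m, k=[100.0, 100.0], dt_elem=[1e-3, 2.5e-4], DT=1e-3))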
Abstract:
Master's dissertation in Sustainable Construction and Rehabilitation (Construção e Reabilitação Sustentáveis)
Abstract:
A computerized handheld procedure is presented in this paper. It is intended as a database-complementary tool to enhance prospective risk analysis in the field of occupational health. The Pendragon forms software (version 3.2) has been used to implement acquisition procedures on Personal Digital Assistants (PDAs) and to transfer data to a computer in an MS-Access format. The data acquisition strategy proposed relies on the risk assessment method practiced at the Institute of Occupational Health Sciences (IST). It involves the use of a systematic hazard list and semi-quantitative risk assessment scales. A set of 7 modular forms has been developed to cover the basic needs of field audits. Despite the minor drawbacks observed, the results obtained so far show that handhelds are adequate to support field risk assessment and follow-up activities. Further improvements must still be made in order to increase the tool's effectiveness and field adequacy.
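The semi-quantitative scales mentioned in the abstract are not reproduced here, but the general idea can be sketched as a probability-by-severity scoring of each hazard on the systematic hazard list; the scale values, labels, and thresholds below are hypothetical.

    # Hypothetical semi-quantitative risk scale of the kind referred to in the
    # abstract (the actual IST scales are not reproduced here): each identified
    # hazard is scored for probability and severity, and the product is binned
    # into an action level.

    SEVERITY = {"minor": 1, "moderate": 2, "serious": 3, "major": 4}
    PROBABILITY = {"rare": 1, "possible": 2, "likely": 3, "frequent": 4}

    def risk_level(severity: str, probability: str) -> str:
        score = SEVERITY[severity] * PROBABILITY[probability]
        if score <= 3:
            return "low - document and monitor"
        if score <= 8:
            return "medium - plan corrective action"
        return "high - immediate action required"

    print(risk_level("serious", "likely"))  # -> "high - immediate action required"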
Abstract:
OBJECTIVE To evaluate the incidence of complications related to the use of peripheral intravenous catheters in neonates and to identify the associated risk factors. METHOD Prospective cohort study conducted in a Neonatal Intensive Care Unit. Participants were hospitalized neonates undergoing peripheral intravenous puncture in the period from February to June 2013. RESULTS The incidence of complications was 63.15%, comprising infiltration/extravasation (69.89%), phlebitis (17.84%) and obstruction (12.27%). The risk factors were the presence of infection (p = 0.0192), weight on the puncture day (p = 0.0093), intermittent infusion associated with continuous infusion (p < 0.0001), endotracheal intubation (p = 0.0008), infusion of basic plan (p = 0.0027), total parenteral nutrition (p = 0.0002), blood transfusion associated with other infusions (p = 0.0003), and other drugs (p = 0.0004). The risk of developing complications was higher in the first 48 hours after puncture. CONCLUSION There was a high rate of complications related to the use of peripheral intravenous catheters, with risk factors associated with infection, weight, drugs and infused solutions, and type of infusion.
Abstract:
OBJECTIVE: Current hypertension guidelines stress the importance of assessing total cardiovascular risk but do not describe precisely how to use ambulatory blood pressures in cardiovascular risk stratification. METHOD: We calculated global cardiovascular risk according to the 2003 European Society of Hypertension/European Society of Cardiology guidelines in 127 patients in whom daytime ambulatory blood pressures were recorded and carotid/femoral ultrasonography was performed. RESULTS: The presence of ambulatory blood pressures >= 135/85 mmHg shifted cardiovascular risk to higher categories, as did the presence of hypercholesterolemia and, even more so, the presence of atherosclerotic plaques. CONCLUSION: Further studies are, however, needed to define the position of ambulatory blood pressures in the assessment of cardiovascular risk.
Abstract:
Electrical Impedance Tomography (EIT) is an imaging method which enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and to assess the resulting improvement in image quality in the case of linear reconstruction for one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor-MRI of the same subject, and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy in the forward model and using it in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation of conductivity changes deep in the brain and those due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality. This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
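The linear reconstruction step referred to in the abstract can be sketched as a regularised inversion of the sensitivity (Jacobian) matrix relating small conductivity changes to boundary voltage changes. The matrix sizes, the Tikhonov regularisation, and the random placeholder data below are assumptions for illustration; in the study the Jacobian would come from the anisotropic FEM forward model.

    import numpy as np

    def linear_reconstruction(J, delta_v, alpha=1e-3):
        """Tikhonov-regularised least-squares solution of J @ delta_sigma = delta_v."""
        n = J.shape[1]
        A = J.T @ J + alpha * np.eye(n)          # regularised normal equations
        return np.linalg.solve(A, J.T @ delta_v)

    # toy example with a random placeholder sensitivity matrix
    rng = np.random.default_rng(0)
    J = rng.normal(size=(208, 500))              # e.g. 208 measurements, 500 conductivity elements
    true_change = np.zeros(500)
    true_change[42] = -0.1                       # a 10% conductivity decrease in one element
    delta_v = J @ true_change
    recon = linear_reconstruction(J, delta_v)
    print(recon.argmin())                        # recovers the perturbed element for this noiseless toy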
Abstract:
The increasing incidence of type 1 diabetes has led researchers on a quest to find the reason behind this phenomenon. The rate of increase is too great to be caused simply by changes in the genetic component, and many environmental factors are under investigation for their possible contribution. These studies require, however, the participation of the individuals most likely to develop the disease, and the approach chosen by many is to screen vast populations to find persons with increased genetic risk factors. The participating individuals are then followed for signs of disease development, and their exposure to suspected environmental factors is studied. The main purpose of this study was to find a suitable tool for easy and inexpensive screening of certain genetic risk markers for type 1 diabetes. The method should be applicable to whole blood dried on sample collection cards as the sample material, since shipping and storage of samples in this format is preferred. However, screening of vast sample libraries of extracted genomic DNA should also be possible, if such a need should arise, for example when studying the effect of newly discovered genetic risk markers. The method developed in this study is based on homogeneous assay chemistry and an asymmetric polymerase chain reaction (PCR). The generated single-stranded PCR product is probed by lanthanide-labelled, LNA (locked nucleic acid)-spiked, short oligonucleotides with exactly complementary sequences. In the case of a perfect match, the probe hybridises to the product. However, if even a single nucleotide difference occurs, the probe binds instead to a complementary quencher oligonucleotide labelled with a dabcyl moiety, causing the signal of the lanthanide label to be quenched. The method was applied to the screening of the well-known type 1 diabetes risk alleles of the HLA-DQB1 gene. The method was shown to be suitable as an initial screening step covering thousands of samples in the scheme used in the TEDDY (The Environmental Determinants of Diabetes in the Young) study to identify individuals at increased genetic risk. The method was further developed into a dry-reagent format to allow an even simpler approach to screening. The reagents needed in the assay were provided in dry form in the reaction vessel, and performing the assay required only the addition of the sample and, if necessary, water to rehydrate the reagents. This allows the assay to be successfully executed even by a person with minimal laboratory experience.
Abstract:
The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and their geometrical interpretation are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand, the problem becomes very challenging because the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Therefore, training problems arising in some real applications with large data sets cannot be loaded into memory and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions, which also establish the stopping criteria for the algorithm. We present previous approaches, as well as results and important details of our implementation of the algorithm, using a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
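For reference, the quadratic programme the abstract refers to is the standard soft-margin SVM dual, written here in its textbook form (not copied from the paper): one variable per data point, a dense quadratic form given by the kernel matrix, box constraints, and one linear equality constraint.

\[
\max_{\alpha}\; \sum_{i=1}^{\ell} \alpha_i
  - \tfrac{1}{2} \sum_{i=1}^{\ell} \sum_{j=1}^{\ell}
    \alpha_i \alpha_j\, y_i y_j\, K(\mathbf{x}_i, \mathbf{x}_j)
\quad \text{subject to} \quad
0 \le \alpha_i \le C, \qquad \sum_{i=1}^{\ell} \alpha_i y_i = 0 .
\]

A decomposition method optimises only a small working set of the \(\alpha_i\) at each iteration while the remaining variables are held fixed, which avoids storing the full \(\ell \times \ell\) kernel matrix in memory.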
Abstract:
Skin cancer is the most common of all cancers, and the increase in its incidence is due, in part, to people's behavior regarding sun exposure. In Brazil, non-melanoma skin cancer is the most frequent in the majority of regions. Dermatoscopy and videodermatoscopy are the main types of examination for the diagnosis of dermatological diseases of the skin. The field that involves the use of computational tools to support or follow up the medical diagnosis of dermatological lesions is very recent. Several methods have been proposed for the automatic classification of skin pathologies using images. The present work aims to present a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape and texture characteristics, using the Wavelet Packet Transform (WPT) and the learning technique called Support Vector Machine (SVM). The Wavelet Packet Transform is applied to extract texture characteristics from the images. The WPT consists of a set of basis functions that represent the image in different frequency bands, each with a distinct resolution corresponding to each scale. In addition, the color characteristics of the lesion are computed, which depend on the visual context and are influenced by the colors surrounding it, and the shape attributes are obtained through Fourier descriptors. The Support Vector Machine is used for the classification task; it is based on the structural risk minimization principle derived from statistical learning theory. The SVM aims to construct optimal hyperplanes that represent the separation between classes. The generated hyperplane is determined by a subset of the samples, called support vectors. For the database used in this work, the results revealed good performance, with an overall accuracy of 92.73% for melanoma and 86% for non-melanoma and benign lesions. The extracted descriptors and the SVM classifier constitute a method capable of recognizing and classifying the analyzed skin lesions.
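A minimal sketch of this kind of pipeline is shown below, assuming the PyWavelets and scikit-learn libraries: wavelet-packet sub-band energies as texture features, a simple mean-colour feature, and an SVM classifier. The wavelet choice, decomposition depth, and feature set are illustrative assumptions and deliberately omit the paper's colour-context modelling and Fourier shape descriptors.

    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def wpt_texture_features(gray_image, wavelet="db2", level=2):
        """Energy of each wavelet-packet sub-band as a texture descriptor."""
        wp = pywt.WaveletPacket2D(data=gray_image, wavelet=wavelet, maxlevel=level)
        nodes = wp.get_level(level, order="natural")
        return np.array([np.sum(np.square(node.data)) for node in nodes])

    def lesion_features(rgb_image):
        """Concatenate texture (WPT energies) and a simple mean-colour feature."""
        gray = rgb_image.mean(axis=2)
        mean_colour = rgb_image.reshape(-1, 3).mean(axis=0)
        return np.concatenate([wpt_texture_features(gray), mean_colour])

    # hypothetical usage with lists of RGB lesion images and 0/1 labels:
    # X = np.array([lesion_features(img) for img in images])
    # y = np.array(labels)
    # clf = SVC(kernel="rbf").fit(X, y)
    # prediction = clf.predict(X[:1])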
Abstract:
Objective: to analyze the prevalence of recurrent wheezing and its risk factors. Method: systematic literature review, guided by the research question “what is the prevalence of recurrent wheezing and its risk factors?”. The search was performed in the MedLine and LILACS databases in April and May 2013. The inclusion criteria were: scientific study, fully available with free access, published between 2002 and 2013. Results: wheezing has a higher prevalence in developing countries, possibly due to poor socioeconomic conditions. Among its risk factors are heredity, the mother’s education level, day nursery attendance, smoking during pregnancy, breastfeeding for less than 3 months, and animals in the child’s household, among others. Conclusion: in Latin America, the prevalence of wheezing is high, and the use of non-standardized instruments hampers its treatment.
Abstract:
Introduction. The number of patients with terminal heart failure has increased faster than the number of available organs, leading to a high mortality rate on the waiting list. The use of marginal and expanded-criteria donors has increased due to the heart shortage. Objective. We analyzed all heart transplantations (HTx) in Sao Paulo state over 8 years with respect to donor profile and recipient risk factors. Method. This multi-institutional review collected HTx data from all institutions in the state of Sao Paulo, Brazil. From 2002 to 2008 (6 years), only 512 (28.8%) of 1777 available heart donors were accepted for transplantation. All medical records were analyzed retrospectively; none of the used donors was excluded, even those considered nonstandard. Results. The hospital mortality rate was 27.9% (n = 143) and the average follow-up time was 29.4 +/- 28.4 months. The survival rate was 55.5% (n = 285) at 6 years after HTx. Univariate analysis examined the impact on survival of the following factors: age (P = .0004), arterial hypertension (P = .4620), norepinephrine use (P = .0450), cardiac arrest (P = .8500), diabetes mellitus (P = .5120), infection (P = .1470), CKMB (creatine kinase MB) (P = .8694), creatinine (P = .7225), and Na+ (P = .3273). On multivariate analysis, only age was significant; logistic regression showed a significant cut-off at 40 years: organs from donors older than 40 years showed a lower late survival rate (P = .0032). Conclusions. Donor age older than 40 years represents an important risk factor for survival after HTx. Neither donor gender nor norepinephrine use negatively affected early survival.