898 results for Viscous Dampers, Five Step Method, Equivalent Static Analysis Procedure, Yielding Frames, Passive Energy Dissipation Systems
Abstract:
This paper presents the results from a study of information behaviors in the context of people's everyday lives, undertaken in order to develop an integrated model of information behavior (IB). 34 participants from across 6 countries maintained a daily information journal or diary – mainly through a secure web log – for two weeks, for an aggregate of 468 participant days over five months. The text-rich diary data were analyzed using a multi-method qualitative-quantitative analysis in the following order: Grounded Theory analysis with manual coding, automated concept analysis using thesaurus-based visualization, and finally a statistical analysis of the coding data. The findings indicate that people engage in several information behaviors simultaneously throughout their everyday lives (including home and work life) and that sense-making is entangled in all aspects of them. Participants engaged in many of the information behaviors in a parallel, distributed, and concurrent fashion: many information behaviors for one information problem, one information behavior across many information problems, and many information behaviors concurrently across many information problems. The findings also indicate that information avoidance – both active and passive – is a common phenomenon, and that information organizing behaviors, or the lack thereof, caused the most problems for participants. An integrated model of information behaviors is presented based on the findings.
Abstract:
We present a hierarchical model for assessing an object-oriented program's security. Security is quantified using structural properties of the program code to identify the ways in which 'classified' data values may be transferred between objects. The model begins with a set of low-level security metrics based on traditional design characteristics of object-oriented classes, such as data encapsulation, cohesion and coupling. These metrics are then used to characterise higher-level properties concerning the overall readability and writability of classified data throughout the program. In turn, these metrics are then mapped to well-known security design principles such as 'assigning the least privilege' and 'reducing the size of the attack surface'. Finally, the entire program's security is summarised as a single security index value. These metrics allow different versions of the same program, or different programs intended to perform the same task, to be compared for their relative security at a number of different abstraction levels. The model is validated via an experiment involving five open source Java programs, using a static analysis tool we have developed to automatically extract the security metrics from compiled Java bytecode.
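To make the flavour of such metrics concrete, here is a minimal sketch, with invented names and weights, of a class-level 'classified data exposure' score aggregated into a single program index; the paper's actual metric definitions and weightings may differ.

```python
# Hypothetical sketch of a low-level security metric: score how visible
# 'classified' fields are per class, then aggregate to one program index.
from dataclasses import dataclass

@dataclass
class FieldInfo:
    name: str
    visibility: str      # 'public', 'protected', 'package', 'private'
    classified: bool     # does this field hold classified data?

# Wider visibility -> larger exposure weight (assumed weighting).
EXPOSURE = {'public': 1.0, 'protected': 0.66, 'package': 0.33, 'private': 0.0}

def class_exposure(fields: list[FieldInfo]) -> float:
    """Mean exposure of the classified fields in one class (0 = best)."""
    classified = [f for f in fields if f.classified]
    if not classified:
        return 0.0
    return sum(EXPOSURE[f.visibility] for f in classified) / len(classified)

def program_security_index(classes: dict[str, list[FieldInfo]]) -> float:
    """Single index in [0, 1]; higher means less classified data is exposed."""
    if not classes:
        return 1.0
    return 1.0 - sum(class_exposure(f) for f in classes.values()) / len(classes)

demo = {'Account': [FieldInfo('pin', 'public', True),
                    FieldInfo('label', 'public', False)],
        'Vault':   [FieldInfo('key', 'private', True)]}
print(program_security_index(demo))  # 0.5: 'pin' is fully exposed, 'key' is not
```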
Abstract:
Sustainable transport has become a necessity rather than an option, to address the problems of congestion and urban sprawl, whose effects include increased trip lengths and travel times. A more sustainable form of development, known as Transit Oriented Development (TOD), is presumed to offer sustainable travel choices with a reduced need to travel to access daily destinations, by providing a mixture of land uses together with a good quality public transport service and infrastructure for walking and cycling. However, performance assessment of these developments with respect to the travel characteristics of their inhabitants is required. This research proposes a five-step methodology for evaluating the transport impacts of TODs. The steps for TOD evaluation are pre-TOD assessment, traffic and travel data collection, determination of traffic impacts, determination of travel impacts, and drawing outcomes. Typically, TODs comprise various land uses and hence have various types of users; assessment of the characteristics of all user groups is essential for obtaining an accurate picture of transport impacts. A case study TOD, Kelvin Grove Urban Village (KGUV), located 2 km north-west of the Brisbane central business district in Australia, was selected for implementing the proposed methodology and evaluating the transport impacts of a TOD from an Australian perspective. The outcomes of this analysis indicated that KGUV generated 27 to 48 percent less traffic than standard published rates specified for homogeneous uses. Further, all user groups of KGUV used more sustainable modes of transport than regional and similarly located suburban users, with higher trip lengths for shopping and education trips. Although the results from this case study development support the transport claims of reduced traffic generation and sustainable travel choices by way of TODs, further investigation is required, considering different styles, scales and locations of TODs. The proposed methodology may be further refined using results from new TODs, and a framework for TOD evaluation may be developed.
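As a purely illustrative aside (all numbers hypothetical), the traffic-impact step amounts to comparing a mixed-use site's observed trip generation with the sum of standard published rates for its component land uses:

```python
# Illustrative only (numbers invented): compare a mixed-use site's observed
# peak-hour trips with the sum of published rates for its component land
# uses, as in step 3 of the methodology (determination of traffic impacts).
published = {'residential': 120, 'retail': 85, 'education': 60}  # trips/hr, assumed
observed_total = 170                                             # surveyed trips/hr

expected_total = sum(published.values())            # 265 under homogeneous-use rates
reduction = 100 * (1 - observed_total / expected_total)
print(f"Traffic reduction vs published rates: {reduction:.0f}%")  # ~36%
```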
Abstract:
Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research has produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within the circuit, since this depends on the software they run. We have developed a static analysis tool that produces SIFA-compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study which shows how this new capability supports combined hardware and software dataflow analyses of a security-critical communications device.
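The core idea of extracting a dataflow graph from program text can be sketched in a few lines; real C analysis, as in the tool described, needs a full parser and must handle pointers, control flow and peripheral registers, so the toy version below only treats simple assignment statements.

```python
# Toy dataflow-graph extraction: each assignment adds edges from the
# variables read on the right-hand side to the variable written.
import re

def dataflow_edges(statements):
    edges = set()
    for stmt in statements:
        dst, expr = stmt.split('=', 1)
        dst = dst.strip()
        for src in re.findall(r'[A-Za-z_]\w*', expr):
            edges.add((src, dst))  # data may flow from src into dst
    return edges

program = ['tmp = key', 'out = tmp + padding']
print(sorted(dataflow_edges(program)))
# [('key', 'tmp'), ('padding', 'out'), ('tmp', 'out')]
```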
Abstract:
Data flow analysis techniques can be used to help assess threats to data confidentiality and integrity in security-critical program code. However, a fundamental weakness of static analysis techniques is that they overestimate the ways in which data may propagate at run time. Discounting large numbers of these false-positive data flow paths wastes an information security evaluator's time and effort. Here we show how to automatically eliminate some false-positive data flow paths by precisely modelling how classified data is blocked by certain expressions in embedded C code. We present a library of detailed data flow models of individual expression elements and an algorithm for introducing these components into conventional data flow graphs. The resulting models can be used to accurately trace byte-level or even bit-level data flow through expressions that are normally treated as atomic. This allows us to identify expressions that safely downgrade their classified inputs and thereby eliminate false-positive data flow paths from the security evaluation process. To validate the approach we have implemented and tested it in an existing data flow analysis toolkit.
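A toy illustration of bit-level blocking, with invented names: a bitwise AND against a constant mask means only the unmasked bits of a classified input can reach the output, so the flow of the remaining bits can be discounted. The paper's expression models are far more detailed than this.

```python
# A bitwise AND with a constant mask blocks the masked bits of the
# classified input, so a path 'secret -> out' can be discounted for them.

def influenced_bits(mask: int) -> set:
    """Bit positions of 'secret' that can influence 'secret & mask'."""
    return {i for i in range(8) if mask & (1 << i)}

print(influenced_bits(0x0F))  # {0, 1, 2, 3}: the high nibble is blocked
```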
Abstract:
PURPOSE: The purpose of this study is to identify risk factors for developing complications following treatment of refractory glaucoma with transscleral diode laser cyclophotocoagulation (cyclodiode), in order to improve the safety profile of this treatment modality. METHOD: A retrospective analysis of 72 eyes from 70 patients who were treated with cyclodiode. RESULTS: The mean pre-treatment intraocular pressure (IOP) was 37.0 mmHg (SD 11.0), with a mean post-treatment reduction in IOP of 19.8 mmHg and a mean IOP at last follow-up of 17.1 mmHg (SD 9.7). The mean total power delivered during treatment was 156.8 Joules (SD 82.7) over a mean of 1.3 treatments (SD 0.6). Sixteen eyes (22.2%) developed complications from the treatment, the most common being hypotony, which occurred in 6 patients, including 4 with neovascular glaucoma. A higher pre-treatment IOP and a higher mean total power delivery were also associated with higher complication rates. CONCLUSIONS: Cyclodiode is an effective treatment option for glaucoma that is refractory to other treatment options. By identifying risk factors for potential complications, cyclodiode treatment can be modified accordingly for each patient to improve safety and efficacy.
Abstract:
Static analysis is an approach for checking the source code or compiled code of applications before it is executed. Chess and McGraw state that static analysis promises to identify common coding problems automatically. While manual code checking is also a form of static analysis, software tools are used in most cases to perform the checks. Chess and McGraw additionally claim that good static checkers can help to spot and eradicate common security bugs.
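A minimal example of the kind of automated check described, sketched here over a Python syntax tree rather than the languages Chess and McGraw discuss: the checker inspects code without executing it and flags a known-dangerous call. Production static checkers apply hundreds of such rules plus dataflow analysis.

```python
# Walk a program's syntax tree (no execution) and flag calls to eval(),
# a common source of code-injection bugs.
import ast

SOURCE = "user_input = input()\nresult = eval(user_input)\n"

for node in ast.walk(ast.parse(SOURCE)):
    if isinstance(node, ast.Call) and getattr(node.func, 'id', '') == 'eval':
        print(f"line {node.lineno}: call to eval() on possibly tainted data")
```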
Abstract:
The deformation of rocks is commonly intimately associated with metamorphic reactions. This paper is a step towards understanding the behaviour of fully coupled, deforming, chemically reacting systems by considering a simple example of the problem comprising a single layer system with elastic-power law viscous constitutive behaviour where the deformation is controlled by the diffusion of a single chemical component that is produced during a metamorphic reaction. Analysis of the problem using the principles of non-equilibrium thermodynamics allows the energy dissipated by the chemical reaction-diffusion processes to be coupled with the energy dissipated during deformation of the layers. This leads to strain-rate softening behaviour and the resultant development of localised deformation which in turn nucleates buckles in the layer. All such diffusion processes, in leading to Herring-Nabarro, Coble or “pressure solution” behaviour, are capable of producing mechanical weakening through the development of a “chemical viscosity”, with the potential for instability in the deformation. For geologically realistic strain rates these chemical feed-back instabilities occur at the centimetre to micron scales, and so produce structures at these scales, as opposed to thermal feed-back instabilities that become important at the 100–1000 m scales.
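For orientation, a textbook elastic-power law (Maxwell-type) constitutive relation, not necessarily the paper's exact formulation, combines an elastic strain-rate term with power-law viscous creep, where $E$ is the elastic modulus and $A$, $n$ are material constants:

```latex
% elastic rate (Hooke) + power-law viscous creep
\dot{\varepsilon} \;=\; \frac{\dot{\sigma}}{E} \;+\; A\,\sigma^{n}
```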
Abstract:
In Australia, as in some other western nations, governments impose accountability measures on educational institutions (Earl, 2005). One such accountability measure is the National Assessment Program – Literacy and Numeracy (NAPLAN), from which high-stakes assessment data are generated. In this article, a practical method of data analysis known as Over Time Assessment Data Analysis (OTADA) is offered as an analytical process by which schools can monitor their current and over-time performance. This analysis, developed by the author, is currently used extensively in schools throughout Queensland. By analysing in this way, teachers, and in particular principals, can obtain a quick and insightful performance overview. For those seeking to track the achievements and progress of year-level cohorts, the OTADA should be considered.
Abstract:
Reliability analysis is crucial to reducing unexpected downtime, severe failures and ever-tightening maintenance budgets for engineering assets. Hazard-based reliability methods are of particular interest as hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution is accurate for the population concerned, and that the effects of covariates on hazards take an assumed form. These two assumptions may be difficult to satisfy and therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations due to the two assumptions of statistical models. With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data or covariates is a natural solution to this challenge. A critical issue for involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, due to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research thus investigates the incomplete covariates problem in reliability analysis. Typical approaches to handling incomplete covariates have been studied to investigate their performance and effects on reliability analysis results. Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainty into reliability analysis, the developed NNHMs are extended to include handling of incomplete covariates as an integral part. The extended versions of the NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical incomplete-covariate handling approaches. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation due to the influence of both engineering degradation and changes in environmental settings. The commonly used covariate extrapolation methods are thus unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, projection of covariate states is conducted in this research. The estimated covariate states and the unknown covariate values in future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill has been conducted, and it demonstrates that this new multi-step reliability analysis procedure is able to generate more accurate analysis results.
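A minimal sketch (architecture and inputs invented) of the idea behind a neural network hazard model: a small feed-forward network maps an asset's condition covariates and age to a non-negative hazard, without committing to a baseline failure distribution or a fixed covariate-effect form.

```python
# Tiny feed-forward hazard model: covariates -> non-negative hazard value.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)   # 3 inputs -> 8 hidden units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> hazard

def hazard(x):
    h = np.tanh(x @ W1 + b1)          # non-linear hidden layer
    return np.exp(h @ W2 + b2)        # exp keeps the hazard positive

x = np.array([0.8, 0.3, 5.0])         # [vibration, temperature, age] (assumed)
print(hazard(x))                      # untrained output; training would fit W, b
```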
Abstract:
Background: Loss of heterozygosity (LOH) is an important marker for one of the 'two hits' required for tumor suppressor gene inactivation. Traditional methods for mapping LOH regions require the comparison of both tumor and patient-matched normal DNA samples. However, for many archival samples, patient-matched normal DNA is not available, leading to the under-utilization of this important resource in LOH studies. Here we describe a new method for LOH analysis that relies on the genome-wide comparison of heterozygosity of single nucleotide polymorphisms (SNPs) between cohorts of cases and unmatched healthy control samples. Regions of LOH are defined by consistent decreases in heterozygosity across a genetic region in the case cohort compared to the control cohort. Methods: DNA was collected from 20 Follicular Lymphoma (FL) tumor samples, 20 Diffuse Large B-cell Lymphoma (DLBCL) tumor samples, neoplastic B-cells of 10 B-cell Chronic Lymphocytic Leukemia (B-CLL) patients, and buccal cell samples matched to 4 of these B-CLL patients. The cohort heterozygosity comparison method was developed and validated using LOH derived in a small cohort of B-CLL by traditional comparisons of tumor and normal DNA samples, and compared to the only alternative method for LOH analysis without patient-matched controls. LOH candidate regions were then generated for enlarged cohorts of B-CLL, FL and DLBCL samples using our cohort heterozygosity comparison method in order to evaluate potential LOH candidate regions in these non-Hodgkin's lymphoma tumor subtypes. Results: Using a small cohort of B-CLL samples with patient-matched normal DNA we have validated the utility of this method and shown that it displays more accuracy and sensitivity in detecting LOH candidate regions compared to the only alternative method, the Hidden Markov Model (HMM) method. Subsequently, using B-CLL, FL and DLBCL tumor samples we have utilised cohort heterozygosity comparisons to localise LOH candidate regions in these subtypes of non-Hodgkin's lymphoma. Detected LOH regions included both previously described regions of LOH as well as novel genomic candidate regions. Conclusions: We have proven the efficacy of the use of cohort heterozygosity comparisons for genome-wide mapping of LOH and shown it to be in many ways superior to the HMM method. Additionally, the use of this method to analyse SNP microarray data from 3 common forms of non-Hodgkin's lymphoma yielded interesting tumor suppressor gene candidates, including the ETV3 gene that was highlighted in both B-CLL and FL.
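The core of the cohort heterozygosity comparison can be sketched as follows (the threshold and the absence of windowing or smoothing are invented simplifications): for each SNP, compare the fraction of heterozygous calls in the case cohort with that in the control cohort, and flag SNPs where case heterozygosity is markedly reduced.

```python
# Per-SNP heterozygosity comparison between case and control cohorts.

def het_fraction(genotypes):
    """genotypes: list of 'AA', 'AB', 'BB' calls for one SNP across a cohort."""
    return sum(g == 'AB' for g in genotypes) / len(genotypes)

def loh_candidates(case_calls, control_calls, drop=0.3):
    """SNP indices where case heterozygosity falls 'drop' below controls."""
    return [i for i, (ca, co) in enumerate(zip(case_calls, control_calls))
            if het_fraction(co) - het_fraction(ca) >= drop]

cases    = [['AA', 'AA', 'BB'], ['AB', 'AA', 'AB']]   # 2 SNPs x 3 cases
controls = [['AB', 'AB', 'AA'], ['AB', 'AA', 'AB']]   # 2 SNPs x 3 controls
print(loh_candidates(cases, controls))  # [0]: SNP 0 lost heterozygosity in cases
```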
Abstract:
Matrix metalloproteinases (MMPs), in particular the gelatinases (MMP-2 and -9), play a significant role in tumour invasion and angiogenesis. The expression and activities of MMPs have not previously been characterised in malignant mesothelioma (MM) tumour samples. In a prospective study, gelatinase activity was evaluated in homogenised supernatants of snap-frozen MM (n = 35), inflamed pleura (IP, n = 12) and uninflamed pleura (UP, n = 14) tissue specimens by semiquantitative gelatin zymography. Matrix metalloproteinases were correlated with clinicopathological factors and with survival using Kaplan-Meier and Cox proportional hazard models. In MM, pro- and active MMP-2 levels were significantly greater than those of MMP-9 (P = 0.006, P < 0.001). Active MMP-2 was significantly greater in MM than in UP (P = 0.04). MMP-2 activity was equivalent between IP and MM, but both pro- and active MMP-9 activities were greater in IP (P = 0.02, P = 0.009). While there were trends towards poor survival with increasing total and pro-MMP-2 activity (P = 0.08) in univariate analysis, both were independent poor prognostic factors in multivariate analysis in conjunction with weight loss (pro-MMP-2 P = 0.03, total MMP-2 P = 0.04). Total and pro-MMP-2 also contributed to the Cancer and Leukemia Group B prognostic groups. MMP-9 activities were not prognostic. Matrix metalloproteinases, and in particular MMP-2, the most abundant gelatinase, may play an important role in MM tumour growth and metastasis. Agents that reduce MMP synthesis and/or activity may have a role to play in the management of MM. © 2003 Cancer Research UK.
Abstract:
Articular cartilage is the load-bearing tissue that consists of proteoglycan macromolecules entrapped between collagen fibrils in a three-dimensional architecture. To date, the search for mathematical models to represent the biomechanics of such a system has continued without yielding a fitting description of its functional response to load at the micro-scale. We believe that the major complication arose when cartilage was first envisaged as a multiphasic model with distinguishable components, and that quantifying those components and searching for the laws that govern their interaction is inadequate. Central to the thesis of this paper, cartilage as a bulk is as much a continuum as is the response of its components to external stimuli. For this reason, we framed the fundamental question as to what would be the mechano-structural functionality of such a system in the total absence of one of its key constituents: proteoglycans. To answer this, hydrated normal and proteoglycan-depleted samples were tested under confined compression while finite element models were reproduced, for the first time, based on the structural microarchitecture of the cross-sectional profile of the matrices. These micro-porous in silico models served as virtual transducers, providing an internal, non-invasive probing mechanism beyond experimental capabilities to render the matrices' micromechanics and several other properties such as permeability and orientation. The results demonstrated that load transfer was closely related to the microarchitecture of the hyperelastic models that represent solid skeleton stress and fluid response based on the state of the collagen network with and without the swollen proteoglycans. In other words, the stress gradient during deformation was a function of the structural pattern of the network and acted in concert with the position-dependent compositional state of the matrix. This reveals that the interaction between indistinguishable components in real cartilage is superimposed by its microarchitectural state, which directly influences macromechanical behavior.
Abstract:
Finite element frame analysis programs targeted for design office application require algorithms that can deliver reliable numerical convergence in a practical timeframe with comparable degrees of accuracy, and a highly desirable attribute is the use of a single element per member, to reduce computational storage as well as data preparation and the interpretation of the results. To this end, a higher-order finite element method including geometric non-linearity is addressed in the paper for the analysis of elastic frames, for which a single element is used to model each member. The geometric non-linearity in the structure is handled using an updated Lagrangian formulation, which takes into consideration the effects of the large translations and rotations that occur at the joints by accumulating their nodal coordinates. Rigid body movements are eliminated from the local member load-displacement relationship, for which the total secant stiffness is formulated for evaluating the large member deformations of an element. The influences of the axial force on the member stiffness and the changes in the member chord length are taken into account using a modified bowing function, which is formulated in the total secant stiffness relationship and in which the coupling of the axial strain and flexural bowing is included. The accuracy and efficiency of the technique are verified by comparisons with a number of plane and spatial structures whose structural response has been reported in independent studies.
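As a textbook illustration of the axial force and flexural stiffness coupling that the paper's modified bowing function captures more exactly: the element stiffness can be written as an elastic bending stiffness plus a geometric stiffness proportional to the axial force, so a compressive force softens the member. This is the standard linearised form, not the paper's total secant formulation.

```python
# Textbook beam-column stiffness: elastic bending + geometric stiffness.
import numpy as np

def beam_stiffness(E, I, L, P):
    """4-DOF (v1, th1, v2, th2) stiffness of a 2D beam-column element."""
    ke = (E * I / L**3) * np.array([[ 12,    6*L,  -12,    6*L],
                                    [6*L, 4*L**2, -6*L, 2*L**2],
                                    [-12,   -6*L,   12,   -6*L],
                                    [6*L, 2*L**2, -6*L, 4*L**2]])
    kg = (P / (30 * L)) * np.array([[ 36,    3*L,  -36,    3*L],
                                    [3*L, 4*L**2, -3*L,  -L**2],
                                    [-36,   -3*L,   36,   -3*L],
                                    [3*L,  -L**2, -3*L, 4*L**2]])
    return ke + kg  # compressive P (negative here) softens the member

k = beam_stiffness(E=200e9, I=8e-6, L=3.0, P=-2e5)
print(np.linalg.eigvalsh(k)[:1])  # smallest eigenvalue drops as |P| grows
```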
Abstract:
Terrorists usually target high-occupancy iconic and public buildings using vehicle-borne incendiary devices in order to claim a maximum number of lives and cause extensive damage to public property. While initial casualties are due to the direct shock of the explosion, the collapse of structural elements may greatly increase the total figure. Most of these buildings have been or are built without consideration of their vulnerability to such events. Therefore, the vulnerability and residual capacity assessment of buildings subjected to deliberately exploded bombs is important for developing mitigation strategies to protect the buildings' occupants and the property. Explosive loads and their effects on a building have therefore attracted significant attention in the recent past, and comprehensive and economical design strategies must be developed for future construction. This research investigates the response and damage of reinforced concrete (RC) framed buildings, together with their load-bearing key structural components, under a near-field blast event. Finite element method (FEM) based analysis was used to investigate the structural framing system and components for global stability, followed by a rigorous analysis of key structural components for damage evaluation, using the codes SAP2000 and LS-DYNA respectively. The research involved four important areas in structural engineering: blast load determination, numerical modelling with FEM techniques, material performance under high strain rates, and non-linear dynamic structural analysis. The response and damage of an RC framed building for different blast load scenarios were investigated. The blast influence region for a two-dimensional RC frame was investigated for different load conditions, and the critical region for each loading case was identified. Two types of design method are recommended for RC columns to provide superior residual capacity: RC columns detailed with multi-layer steel reinforcement cages, and composite columns incorporating a central structural steel core. Both provide post-blast gravity load-resisting capacity, compared with a typical RC column, against catastrophic collapse. Overall, this research broadens the current knowledge of blast and residual capacity analysis of RC framed structures and recommends methods to evaluate and mitigate blast impact on key elements of multi-storey buildings.
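For context on the blast load determination step, a common starting point (standard in the blast literature, not specific to this thesis) is the Hopkinson-Cranz scaled distance, from which empirical charts such as Kingery-Bulmash give peak overpressure; the values below are illustrative, not design figures.

```python
# Hopkinson-Cranz scaled distance: Z = R / W**(1/3).

def scaled_distance(standoff_m: float, charge_kg_tnt: float) -> float:
    """Z in m/kg^(1/3) for standoff R and TNT-equivalent charge mass W."""
    return standoff_m / charge_kg_tnt ** (1 / 3)

print(scaled_distance(standoff_m=10.0, charge_kg_tnt=100.0))  # ~2.15 m/kg^(1/3)
```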