Abstract:
For certain continuum problems, it is desirable and beneficial to combine two different methods in order to exploit their advantages while evading their disadvantages. In this paper, a bridging transition algorithm is developed for combining the meshfree method (MM) with the finite element method (FEM). In this coupled method, the meshfree method is used in the sub-domains where high accuracy is required, and the finite element method is employed in the remaining sub-domains to improve the computational efficiency. The MM domain and the FEM domain are connected by a transition (bridging) region. A modified variational formulation and the Lagrange multiplier method are used to ensure the compatibility of displacements and their gradients. To improve the computational efficiency and reduce the meshing cost in the transition region, regularly distributed transition particles, which are independent of both the meshfree nodes and the FE nodes, can be inserted into the transition region. The newly developed coupled method is applied to the stress analysis of 2D solids and structures in order to investigate its performance and study its parameters. Numerical results show that the present coupled method is convergent, accurate and stable. The coupled method has promising potential for practical applications, because it can take advantage of both the meshfree method and FEM while overcoming their shortcomings.
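To make the coupling idea concrete, the following is a minimal sketch (not the paper's exact formulation) of how a Lagrange multiplier constraint can tie a meshfree sub-domain system to a finite element sub-domain system; the stiffness matrices K_mm and K_fe, the load vectors and the constraint matrix C are hypothetical placeholders.

```python
# Minimal sketch, not the paper's formulation: two sub-domain stiffness systems
# coupled by a Lagrange multiplier constraint C u = 0 that enforces displacement
# compatibility across the transition region.
import numpy as np

def solve_coupled(K_mm, f_mm, K_fe, f_fe, C):
    """Assemble and solve the saddle-point system
        [ K   C^T ] [u]   [f]
        [ C   0   ] [l] = [0]
    where K = blockdiag(K_mm, K_fe) and the rows of C tie together the
    meshfree and FE degrees of freedom in the transition region."""
    K = np.block([
        [K_mm, np.zeros((K_mm.shape[0], K_fe.shape[1]))],
        [np.zeros((K_fe.shape[0], K_mm.shape[1])), K_fe],
    ])
    f = np.concatenate([f_mm, f_fe])
    n, m = K.shape[0], C.shape[0]
    A = np.block([[K, C.T], [C, np.zeros((m, m))]])
    b = np.concatenate([f, np.zeros(m)])
    x = np.linalg.solve(A, b)
    return x[:n], x[n:]      # displacements, Lagrange multipliers
```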
Abstract:
In this article I outline and demonstrate a synthesis of the methods developed by Lemke (1998) and Martin (2000) for analyzing evaluations in English. I demonstrate the synthesis using examples from a 1.3-million-word technology policy corpus drawn from institutions at the local, state, national, and supranational levels. Lemke's (1998) critical model is organized around the broad 'evaluative dimensions' that are deployed to evaluate propositions and proposals in English. Martin's (2000) model is organized, with a more overtly systemic-functional orientation, around the concept of 'encoded feeling'. In applying both of these models at different times, whilst recognizing their individual usefulness and complementarity, I found specific limitations that led me to work towards a synthesis of the two approaches. I also argue for the need to consider genre, media, and institutional aspects more explicitly when claiming intertextual and heteroglossic relations as the basis for inferred evaluations. A basic assertion made in this article is that the perceived Desirability of a process, person, circumstance, or thing is identical to its 'value'. But the Desirability of anything is a socially and thus historically conditioned attribution that requires significant institutional inculcation of other 'types' of value: appropriateness, importance, beauty, power, and so on. I therefore propose a method informed by critical discourse analysis (CDA) that sees evaluation as happening on at least four interdependent levels of abstraction.
Abstract:
Aim: In the current climate of medical education, there is an ever-increasing demand for and emphasis on simulation as both a teaching and training tool. The objective of our study was to compare the realism and practicality of a number of artificial blood products that could be used for high-fidelity simulation. Method: A literature and internet search was performed, and 15 artificial blood products were identified from a variety of sources. One product was excluded due to its potential toxicity risks. Five observers, blinded to the products, performed two assessments on each product using an evaluation tool with 14 predefined criteria, including color, consistency, clotting, and staining potential to manikin skin and clothing. Each criterion was rated using a five-point Likert scale. The products were left for 24 hours, both refrigerated and at room temperature, and then reassessed. Statistical analysis was performed to identify the most suitable products, and both inter- and intra-rater variability were examined. Results: Three products scored consistently well with all five assessors, with one product in particular scoring well in almost every criterion. This highest-rated product had a mean rating of 3.6 out of 5.0 (95% posterior interval 3.4-3.7). Inter-rater variability was minor, with average ratings varying from 3.0 to 3.4 between the highest and lowest scorers. Intra-rater variability was negligible, with good agreement between the first and second ratings as indicated by weighted kappa scores (κ = 0.67). Conclusion: The most realistic and practical form of artificial blood identified was a commercial product called KD151 Flowing Blood Syrup. It was found to be not only realistic in appearance but also practical in terms of storage and stain removal.
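As an aside on the agreement statistic, the sketch below shows how a weighted kappa between a rater's first and second assessments could be computed; the ratings are hypothetical and the quadratic weighting is an assumption, since the abstract does not state the weighting scheme used.

```python
# Illustration only: weighted kappa between a rater's first and second rounds of
# 5-point Likert ratings. Data are hypothetical; quadratic weights are assumed.
from sklearn.metrics import cohen_kappa_score

first_pass  = [4, 3, 5, 2, 4, 3, 5, 1, 3, 4]
second_pass = [4, 3, 4, 2, 5, 3, 5, 2, 3, 4]

kappa = cohen_kappa_score(first_pass, second_pass, weights="quadratic")
print(f"weighted kappa = {kappa:.2f}")
```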
Abstract:
The determination of the most appropriate procurement method for capital works projects is a challenging task for the Department of Housing and Works (DHW) and other Western Australian State Government agencies because of the array of assessment criteria that must be considered and the procurement methods that are available. A number of different procurement systems can be used to deliver capital works projects, such as traditional, design and construct, and management. Sub-classifications of these systems have proliferated and continue to emerge in response to market demands. The selection of an inappropriate procurement method may lead to undesirable project outcomes. To assist DHW in selecting an appropriate procurement method for its capital works projects, a six-step procurement method selection process is presented. The characteristics of the most common forms of procurement used in Australia are described. Case studies where procurement methods have been used for specific types of capital works in Western Australia are offered to provide a reference point and learning opportunity for procurement method selection.
Abstract:
For a sustainable building industry, not only the environmental and economic indicators but also the societal indicators of a building should be evaluated. Current indicators can conflict with one another, which makes it difficult to clearly quantify and assess sustainability in decision making. For a sustainable building, the objectives of decreasing adverse environmental impact and decreasing cost are in conflict. In addition, even when both objectives can be satisfied, building management may present other considerations, such as occupant convenience, building flexibility, or technical maintenance, which are difficult to quantify as exact assessment data. These conflicting problems confronting building managers and planners render building management more difficult. This paper presents a methodology to evaluate a sustainable building considering the socio-economic and environmental characteristics of buildings, and is intended to assist decision making by building planners and practitioners. The suggested methodology employs three main concepts: linguistic variables, fuzzy numbers, and the analytic hierarchy process. The linguistic variables are used to represent the degree of appropriateness of qualitative indicators, which are vague or uncertain. These linguistic variables are then translated into fuzzy numbers to reflect their uncertainties and aggregated into a final fuzzy decision value using a hierarchical structure. Through a case study, the suggested methodology is applied to the evaluation of a building. The result demonstrates that the suggested approach can be a useful tool for evaluating a building for sustainability.
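The sketch below illustrates the general idea behind the three concepts (linguistic ratings mapped to triangular fuzzy numbers, aggregated with weights, then defuzzified); the linguistic scale, fuzzy numbers, indicators and weights are hypothetical, not the paper's actual model.

```python
# Sketch of the general idea only: map linguistic ratings to triangular fuzzy
# numbers, aggregate them with (crisp) AHP-style weights, then defuzzify.
# The scale, weights and indicators below are hypothetical.
import numpy as np

TFN = {                      # triangular fuzzy numbers (l, m, u) on a 0-10 scale
    "poor":      (0, 0, 3),
    "fair":      (2, 5, 8),
    "good":      (5, 7.5, 10),
    "excellent": (7, 10, 10),
}

def aggregate(ratings, weights):
    """Weighted sum of triangular fuzzy numbers (component-wise)."""
    tfns = np.array([TFN[r] for r in ratings], dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return tuple(w @ tfns)   # result is again a triangular fuzzy number

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    l, m, u = tfn
    return (l + m + u) / 3.0

# e.g. three qualitative indicators: occupant comfort, flexibility, maintainability
ratings = ["good", "fair", "excellent"]
weights = [0.5, 0.3, 0.2]    # e.g. obtained from an AHP pairwise comparison
print(defuzzify(aggregate(ratings, weights)))
```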
Abstract:
INTRODUCTION Inflammation is a protective attempt to facilitate the removal of damaged tissue and to initiate the healing response in other tissues. However, after spinal cord injury (SCI), this response is prolonged, leading to secondary degeneration and glial scarring. Here, we investigate the potential of sustained delivery of the pro-inflammatory factors vascular endothelial growth factor (VEGF) and platelet-derived growth factor (PDGF) to increase early inflammatory events and promote inflammatory resolution. Method Animal ethics approval was obtained from the Queensland University of Technology. Adult Wistar-Kyoto rats (12-16 weeks old) were subjected to laminectomies and T10 hemisections. Animals were then randomised to a treatment group (implantation of an osmotic pump (Alzet) loaded with 5 µg VEGF and 5 µg PDGF) or to control groups (lesion control, or lesion plus a pump delivering PBS). Rats were sacrificed at one month and the spinal cords were harvested and examined by immunohistology, using anti-neurofilament-200 (NF200) and anti-ionized calcium binding adapter molecule 1 (Iba1) antibodies. One-way ANOVA was used for statistical analysis. Results At 1 month, active pump-treated cords showed a high level of axonal filament throughout the defects compared with the control groups. The mean lesion size, as measured by NF200, was 0.47 mm² for the lesion control, 0.39 mm² for the vehicle control and 0.078 mm² for the active pump group. Significant differences were detected between the active pump group and the two control groups (AP vs LC, p = 0.017; AP vs VC, p = 0.004). Iba1 staining also showed significant differences in the post-injury inflammatory response. Discussion We have shown that axons and activated microglia are co-located in the lesion of the treated cord. We hypothesise that the delivery of VEGF/PDGF increases local vessel permeability to inflammatory cells and activates these, along with the resident microglia, to a threshold population, which ultimately resolves the prolonged inflammation. Here, we have shown that maintaining the inflammatory signals for at least 7 days improved the morphology of the injured cord. Conclusion This study has shown that boosting inflammation by delivering VEGF/PDGF in the early phase of SCI helps to reduce secondary degeneration and may promote inflammatory resolution. This treatment may provide a platform for other neuro-regenerative therapies.
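For illustration only, a one-way ANOVA of lesion area across the three groups might be run as below; the per-animal values are hypothetical stand-ins chosen near the reported group means, not the study's data.

```python
# Illustration only: one-way ANOVA across the three groups described above.
# The per-animal lesion areas are hypothetical, chosen near the reported group
# means (0.47, 0.39 and 0.078 mm^2); they are not the study's measurements.
from scipy.stats import f_oneway

lesion_control  = [0.45, 0.52, 0.41, 0.50]   # mm^2, hypothetical
vehicle_control = [0.36, 0.42, 0.40, 0.38]   # mm^2, hypothetical
active_pump     = [0.07, 0.09, 0.06, 0.10]   # mm^2, hypothetical

F, p = f_oneway(lesion_control, vehicle_control, active_pump)
print(f"F = {F:.2f}, p = {p:.4f}")
```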
Abstract:
Improving the performance of an incident detection system is essential to minimize the effects of incidents. A new incident detection method is put forward in this paper, based on an in-car terminal consisting of a GPS module, a GSM module and a control module, as well as optional parts such as airbag sensors and a mobile phone positioning system (MPPS) module. When a driver or vehicle detects a freeway incident and initiates an alarm report, the incident location, determined by GPS, MPPS or both, is automatically sent to a transport management center (TMC); the TMC then confirms the incident using closed-circuit television (CCTV) or other approaches. In this method, detection rate (DR), time to detect (TTD) and false alarm rate (FAR) are the key performance targets. Finally, some feasible measures, such as management mode, education mode and suitable incident confirmation approaches, are put forward to improve these targets.
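As a rough illustration of the data such an in-car terminal might transmit to the TMC, the sketch below defines a hypothetical alarm-report message; the field names and JSON-over-GSM format are assumptions, not the paper's specification.

```python
# Hypothetical sketch of the alarm report an in-car terminal might send to the
# transport management center (TMC); field names and format are assumptions.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional

@dataclass
class IncidentReport:
    vehicle_id: str
    timestamp: str
    gps_lat: float               # position from the GPS module
    gps_lon: float
    mpps_lat: Optional[float]    # optional mobile-phone-positioning fix
    mpps_lon: Optional[float]
    airbag_triggered: bool       # optional airbag sensor input

def encode_report(report: IncidentReport) -> bytes:
    """Serialise the report for transmission to the TMC over the GSM link."""
    return json.dumps(asdict(report)).encode("utf-8")

report = IncidentReport(
    vehicle_id="ABC-1234",
    timestamp=datetime.now(timezone.utc).isoformat(),
    gps_lat=-31.9523, gps_lon=115.8613,
    mpps_lat=None, mpps_lon=None,
    airbag_triggered=False,
)
print(encode_report(report))
```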
Abstract:
Differential axial shortening, distortion and deformation in high-rise buildings are a serious concern. They are caused by three time-dependent modes of volume change: "shrinkage", "creep" and "elastic shortening", which take place in every concrete element during and after construction. Vertical concrete components in a high-rise building are sized and designed based on their strength demand to carry gravity and lateral loads. Therefore, columns and walls are sized, shaped and reinforced differently, with varying concrete grades and volume-to-surface-area ratios. These structural components may be subjected to the detrimental effects of differential axial shortening, which escalate with increasing building height and can have an adverse impact on other structural and non-structural elements. Limited procedures are available to quantify axial shortening, and the results obtained from them differ because each procedure is based on different assumptions and is limited to a few parameters. All of this points to the need to develop an accurate numerical procedure to quantify the axial shortening of concrete buildings, taking into account the important time-varying functions of (i) construction sequence, (ii) Young's modulus and (iii) the creep and shrinkage models associated with reinforced concrete. General assumptions are refined to minimize the variability of creep and shrinkage parameters and improve the accuracy of the results. Finite element techniques are used in the procedure, which employs time history analysis along with compression-only elements to simulate staged construction behaviour. This paper presents such a procedure and illustrates it through an example.
Keywords: Differential Axial Shortening, Concrete Buildings, Creep and Shrinkage, Construction Sequence, Finite Element Method.
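The sketch below conveys the staged-construction accumulation idea for a single column; the creep coefficient, shrinkage function, section properties and construction cycle are simple hypothetical placeholders rather than the design-code models and finite element procedure the paper uses.

```python
# Conceptual sketch only: staged-construction accumulation of elastic, creep and
# shrinkage shortening in a single column. All values and the creep/shrinkage
# functions are hypothetical placeholders, not a design-code model.

E = 30e9        # Pa, Young's modulus (in practice itself time dependent)
A = 0.5         # m^2, column cross-sectional area
h = 3.0         # m, storey height
P = 150e3       # N, load added to the column per storey built above
cycle = 14      # days per construction cycle
n = 40          # number of storeys

def phi(t):     # placeholder creep coefficient developed after t days under load
    return 2.0 * t / (t + 100.0)

def eps_sh(t):  # placeholder shrinkage strain after t days of drying
    return 400e-6 * t / (t + 200.0)

t_end = n * cycle                       # end of construction, in days
total = 0.0
for i in range(1, n + 1):               # column segment at storey i
    cast_i = i * cycle                  # day segment i is cast
    for j in range(i + 1, n + 1):       # each storey built above adds a load step
        t_j = j * cycle
        d_eps = (P / A) / E             # elastic strain from that load step
        total += d_eps * (1.0 + phi(t_end - t_j)) * h   # elastic + creep
    total += eps_sh(t_end - cast_i) * h                 # shrinkage of segment i

print(f"estimated axial shortening at roof level: {total * 1000:.1f} mm")
```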
Abstract:
The equations governing saltwater intrusion in coastal aquifers are complex. Backward Euler time stepping approaches are often used to advance the solution to these equations in time, which typically requires that small time steps be taken in order to ensure that an accurate solution is obtained. We show that a method of lines approach incorporating variable order backward differentiation formulas can greatly improve the efficiency of the time stepping process.
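The point generalises beyond the saltwater-intrusion equations themselves; as a stand-in, the sketch below advances a method-of-lines discretisation of a simple 1-D diffusion equation with SciPy's variable-order BDF integrator, which adapts both step size and order rather than taking uniformly small backward Euler steps. The PDE, grid and tolerances are illustrative assumptions, not the density-coupled system studied in the paper.

```python
# Illustration with a stand-in problem: a 1-D diffusion equation discretised in
# space (method of lines) and advanced in time with a variable-order BDF
# integrator. This is not the saltwater-intrusion system itself.
import numpy as np
from scipy.integrate import solve_ivp

N = 200
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
D = 1e-2                                 # diffusion coefficient (assumed)

def rhs(t, u):
    dudt = np.zeros_like(u)
    dudt[1:-1] = D * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # interior nodes
    return dudt                                               # boundaries held fixed

u0 = np.exp(-200 * (x - 0.5) ** 2)       # initial condition

sol = solve_ivp(rhs, (0.0, 10.0), u0, method="BDF", rtol=1e-6, atol=1e-9)
print(f"BDF steps taken: {sol.t.size - 1}")
```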
Abstract:
With the widespread application of electronic learning (e-Learning) technologies to education at all levels, increasing numbers of online educational resources and messages are generated from the corresponding e-Learning environments. Nevertheless, it is quite difficult, if not totally impossible, for instructors to read through and analyze these online messages to predict the progress of their students on the fly. The main contribution of this paper is the illustration of a novel concept map generation mechanism which is underpinned by a fuzzy domain ontology extraction algorithm. The proposed mechanism can automatically construct concept maps based on the messages posted to online discussion forums. By browsing the concept maps, instructors can quickly identify the progress of their students and adjust the pedagogical sequence on the fly. Our initial experimental results reveal that the accuracy and the quality of the automatically generated concept maps are promising. Our research work opens the door to the development and application of intelligent software tools to enhance e-Learning.
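As a toy illustration of the general idea (not the paper's fuzzy domain ontology extraction algorithm), the sketch below scores term co-occurrence across a few forum posts as a fuzzy relation and keeps the strong links as edges of a concept map; the posts, threshold and scoring rule are all assumptions.

```python
# Toy sketch only: score term co-occurrence across forum posts as a fuzzy
# relation in [0, 1] and keep the strong links as concept-map edges.
from collections import Counter
from itertools import combinations
import networkx as nx

posts = [
    "recursion needs a base case",
    "stack overflow happens without a base case",
    "recursion uses the call stack",
]

term_count = Counter()
pair_count = Counter()
for post in posts:
    terms = set(post.split())            # naive tokenisation, for illustration
    term_count.update(terms)
    pair_count.update(combinations(sorted(terms), 2))

G = nx.Graph()
for (a, b), n_ab in pair_count.items():
    membership = n_ab / min(term_count[a], term_count[b])   # fuzzy relation
    if membership >= 0.5:                                   # assumed threshold
        G.add_edge(a, b, weight=membership)

print(G.number_of_nodes(), "concepts,", G.number_of_edges(), "links")
```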
Abstract:
Recognizing the need to offer alternative methods of brief interventions, this study developed correspondence treatments for low-dependent problem drinkers and evaluated their impact. One hundred and twenty-one problem drinkers were recruited by media advertisements and were randomly allocated to a full cognitive behavioural treatment programme (CBT) or to a minimal intervention condition (MI) that gave information regarding alcohol misuse and instructions to record drinking. As predicted, CBT was more effective than MI in reducing alcohol consumption over the 4-month controlled trial period. CBT produced a 50% fall in consumption, bringing the average intake of subjects within recommended maximum levels. Treatment gains at 6 months were well maintained to 12 months. High levels of consumer satisfaction, a high representation of women and a substantial participation from isolated rural areas attested to the feasibility of the correspondence programme as an alternative treatment. However, some drinking occasions still involved high intake for a significant subgroup of subjects, and this issue will be addressed in future programmes. The results supported the use of correspondence delivery as a means of promoting early engagement and equity of access between city and country areas.
Abstract:
A simple and sensitive spectrophotometric method for the simultaneous determination of the acesulfame-K, sodium cyclamate and saccharin sodium sweeteners in foodstuff samples has been developed. This analytical method relies on the different kinetic rates of the analytes in their oxidative reaction with KMnO4, which produces the green manganate product in an alkaline solution. As the kinetic rates of acesulfame-K, sodium cyclamate and saccharin sodium were similar and their kinetic data seriously overlapped, chemometric methods, such as partial least squares (PLS), principal component regression (PCR) and classical least squares (CLS), were applied to resolve the kinetic data. The results showed that the PLS prediction model performed somewhat better than the PCR and CLS models. The proposed method was then applied to the determination of the three sweeteners in foodstuff samples, and the results compared well with those obtained by the reference HPLC method.
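A sketch of the chemometric step is shown below: a PLS model mapping kinetic absorbance profiles to the three sweetener concentrations. The kinetic curves are simulated stand-ins with assumed rate constants, not the paper's measurements.

```python
# Sketch of the chemometric step only: fit a PLS model that maps overlapped
# kinetic absorbance profiles to three analyte concentrations. The data are
# simulated stand-ins with assumed pseudo-first-order rates.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
t = np.linspace(0, 120, 60)                    # s, kinetic measurement times
rates = np.array([0.010, 0.014, 0.020])        # assumed rate constants

n_samples = 40
C = rng.uniform(0.1, 1.0, size=(n_samples, 3))             # concentrations
X = C @ (1 - np.exp(-np.outer(rates, t)))                  # kinetic curves
X += rng.normal(scale=0.005, size=X.shape)                 # measurement noise

pls = PLSRegression(n_components=3)
pls.fit(X[:30], C[:30])                                    # calibration set
rmse = np.sqrt(((pls.predict(X[30:]) - C[30:]) ** 2).mean())
print("prediction RMSE:", rmse)                            # validation set
```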
Abstract:
This study considers the solution of a class of linear systems related to the fractional Poisson equation (FPE) $(-\nabla^2)^{\alpha/2}\phi = g(x,y)$ with nonhomogeneous boundary conditions on a bounded domain. A numerical approximation to the FPE is derived using a matrix representation of the Laplacian to generate a linear system of equations with its matrix $A$ raised to the fractional power $\alpha/2$. The solution of the linear system then requires the action of the matrix function $f(A) = A^{-\alpha/2}$ on a vector $b$. For large, sparse, and symmetric positive definite matrices, the Lanczos approximation generates $f(A)b \approx \beta_0 V_m f(T_m) e_1$. This method works well when both the analytic grade of $A$ with respect to $b$ and the residual for the linear system are sufficiently small. Memory constraints often require restarting the Lanczos decomposition; however, this is not straightforward in the context of matrix function approximation. In this paper, we use the ideas of thick-restart and adaptive preconditioning for solving linear systems to improve the convergence of the Lanczos approximation. We give an error bound for the new method and illustrate its role in solving the FPE. Numerical results are provided to gauge the performance of the proposed method relative to exact analytic solutions.
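A basic, unrestarted Lanczos sketch of the approximation $f(A)b \approx \beta_0 V_m f(T_m) e_1$ with $f(A) = A^{-\alpha/2}$ is given below; the thick-restart strategy and adaptive preconditioning described in the paper are not reproduced, and the small 2-D Laplacian used as $A$ is only a stand-in.

```python
# Basic Lanczos sketch of f(A) b ~= beta0 * V_m f(T_m) e1 with f(A) = A^(-alpha/2).
# No restarting, reorthogonalisation or preconditioning -- illustration only.
import numpy as np

def lanczos_matfunc(A, b, alpha, m):
    n = b.size
    V = np.zeros((n, m))
    diag = np.zeros(m)           # diagonal of T_m
    off = np.zeros(m - 1)        # off-diagonal of T_m
    beta0 = np.linalg.norm(b)
    V[:, 0] = b / beta0
    w = A @ V[:, 0]
    diag[0] = V[:, 0] @ w
    w = w - diag[0] * V[:, 0]
    for j in range(1, m):
        off[j - 1] = np.linalg.norm(w)
        V[:, j] = w / off[j - 1]
        w = A @ V[:, j] - off[j - 1] * V[:, j - 1]
        diag[j] = V[:, j] @ w
        w = w - diag[j] * V[:, j]
    T = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    evals, Q = np.linalg.eigh(T)                     # T_m is small and symmetric
    fT_e1 = Q @ (evals ** (-alpha / 2) * Q[0, :])    # f(T_m) e1, f(x) = x^(-alpha/2)
    return beta0 * (V @ fT_e1)                       # beta0 * V_m f(T_m) e1

# stand-in: a small SPD 2-D Laplacian as the matrix representation of -nabla^2
n1 = 20
L1 = 2 * np.eye(n1) - np.eye(n1, k=1) - np.eye(n1, k=-1)
A = np.kron(np.eye(n1), L1) + np.kron(L1, np.eye(n1))
b = np.ones(n1 * n1)
phi = lanczos_matfunc(A, b, alpha=1.5, m=40)
```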