Abstract:
This project, part of the broader Sustainable Sub-divisions research agenda, addresses the role of natural ventilation in reducing the energy required to cool dwellings.
Abstract:
In the previous phase of this project, 2002-059-B Case-Based Reasoning in Construction and Infrastructure Projects, demonstration software was developed using a case-based reasoning engine to access a number of sources of information on the lifetime of metallic building components. One source of information was data from the Queensland Department of Public Housing relating to maintenance operations over a number of years. Maintenance information is seen as a particularly useful source of data about the service life of building components, as it relates to the actual performance of materials in the working environment. If a building was constructed in 1984 and the maintenance records indicate that the guttering was replaced in 2006, then the service life of the gutters was 22 years in that environment. This phase of the project aims to look more deeply at the Department of Housing data, as an example of maintenance records, and to formulate methods for using this data to inform knowledge of service lifetimes.
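The service-life calculation described above is simple to sketch in code. The record layout and component names below are hypothetical, not the actual structure of the Department of Housing data:

```python
def service_life_years(installed: int, replaced: int) -> int:
    """Service life of a component: years between installation
    (or construction of the building) and recorded replacement."""
    return replaced - installed

# Hypothetical maintenance records: (component, year installed, year replaced)
records = [
    ("guttering", 1984, 2006),
    ("roof sheeting", 1984, 2012),
]

lifetimes = {comp: service_life_years(built, repl) for comp, built, repl in records}
print(lifetimes)  # {'guttering': 22, 'roof sheeting': 28}
```

Aggregating such lifetimes per component and environment is what would feed the case base with observed, rather than nominal, service lives.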
Abstract:
Construction is an information-intensive industry in which the accuracy and timeliness of information are paramount. It has been observed that the main communication issue in construction is providing a method to exchange data between the site operation, the site office and the head office. The information needs under consideration are time-critical and assist in maintaining or improving efficiency at the jobsite; without appropriate computing support, problem solving becomes more difficult. Many researchers have focused on the use of mobile computing devices in the construction industry, believing that mobile computers have the potential to solve some of the construction problems that reduce overall productivity. However, to date very little observation has been conducted of the deployment of mobile computers by construction workers on-site. Providing field workers with accurate, reliable and timely information at the location where it is needed supports effectiveness and efficiency at the job site. Bringing a new technology into the construction industry requires not only a better understanding of the application, but also proper preparation for the allocation of resources such as people and investment. With this in mind, an accurate analysis is needed to provide a clear picture of the overall costs and benefits of the new technology. A cost-benefit analysis is a method of evaluating the relative merits of a proposed investment project in order to achieve an efficient allocation of resources. It is a way of identifying, portraying and assessing the factors which need to be considered in making rational economic choices.
In principle, a cost-benefit analysis is a rigorous, quantitative and data-intensive procedure, which requires identification of all potential effects, categorisation of these effects as costs and benefits, quantitative estimation of the extent of each cost and benefit associated with an action, translation of these into a common metric such as dollars, discounting of future costs and benefits into the terms of a given year, and summation of all costs and benefits to see which is greater. Even though many cost-benefit analysis methodologies are available for a general assessment, there is no specific methodology that can be applied to analysing the costs and benefits of mobile computing devices on the construction site. Hence, the methodology proposed in this document is predominantly adapted from Baker et al. (2000), Department of Finance (1995), and Office of Investment Management (2005). The methodology is divided into four main stages and then detailed in ten steps. It was developed for CRC CI Project 2002-057-C: Enabling Team Collaboration with Pervasive and Mobile Computing, and can be seen in detail in Section 3.
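The discounting and summation steps listed above can be sketched as a net-present-value calculation. This is a generic illustration, not the methodology of Baker et al.; the cash flows and the 7% discount rate are invented for the example:

```python
def present_value(amount: float, year: int, rate: float) -> float:
    """Discount a future cost or benefit back to year-0 dollars."""
    return amount / (1 + rate) ** year

def net_present_value(costs, benefits, rate=0.07):
    """Sum of discounted benefits minus sum of discounted costs.
    costs and benefits are lists of (year, amount) pairs."""
    pv_benefits = sum(present_value(a, y, rate) for y, a in benefits)
    pv_costs = sum(present_value(a, y, rate) for y, a in costs)
    return pv_benefits - pv_costs

# Hypothetical mobile-computing rollout: up-front purchase plus support
# costs, against productivity benefits in later years.
costs = [(0, 100_000), (1, 10_000), (2, 10_000)]
benefits = [(1, 60_000), (2, 60_000), (3, 60_000)]
print(round(net_present_value(costs, benefits), 2))
```

A positive result indicates the discounted benefits exceed the discounted costs, which is the final comparison the procedure above calls for.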
Abstract:
Realistic estimates of short- and long-term (strategic) budgets for maintenance and rehabilitation in road asset management should consider the stochastic characteristics of asset conditions across road networks, so that the overall variability of road asset condition data is taken into account. Probability theory has been used for assessing life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004), and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods for calculating the probable useful life of these infrastructures. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure. It is evident from the review of the literature that there is very limited information on methodology that uses the stochastic characteristics of asset condition data for assessing budgets/costs for road maintenance and rehabilitation (Abaza 2002, Salem et al. 2003, Zhao et al. 2004). Given this gap in the research literature, this report describes and summarises the methodologies presented by each publication and also suggests a methodology for the current research project funded under the Cooperative Research Centre for Construction Innovation, CRC CI project no. 2003-029-C.
Abstract:
Reliable budget/cost estimates for road maintenance and rehabilitation are subject to uncertainty and variability in road asset conditions and the characteristics of road users. The CRC CI research project 2003-029-C ‘Maintenance Cost Prediction for Road’ developed a method for assessing variation and reliability in budget/cost estimates for road maintenance and rehabilitation. The method is based on probability-based reliability theory and statistical methods. The next stage of the current project is to apply the developed method to predict maintenance/rehabilitation budgets/costs of large networks for strategic investment. The first task is to assess the variability of road data, and this report presents initial results of that analysis. A case study for dry non-reactive soil is presented to demonstrate the concept of analysing the variability of road data for large road networks. In assessing this variability, large road networks were divided into categories with common characteristics according to soil and climatic conditions, pavement conditions, pavement types, surface types and annual average daily traffic. The probability distributions, statistical means and standard deviations of asset conditions and annual average daily traffic for each category were quantified. The probability distributions and statistical information obtained in this analysis will be used to assess the variation and reliability in budget/cost estimates at a later stage. Conventionally, the mean values of the asset data in each category are used as input values for investment analysis, and the variability of asset data within each category is not taken into account. This analysis demonstrated a method that can be applied in practice, taking the variability of road data into account when analysing large road networks for maintenance/rehabilitation investment analysis.
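The per-category summary statistics described above can be sketched as follows. The category names and roughness readings are invented for illustration; they are not values from the project's road data:

```python
from statistics import mean, stdev

# Hypothetical roughness readings (IRI, m/km) for road segments already
# grouped into categories sharing soil, climate, pavement and traffic traits.
segments = {
    "dry_nonreactive_sealed": [2.1, 2.4, 1.9, 2.8, 2.2, 2.5],
    "dry_nonreactive_unsealed": [4.0, 3.6, 4.4, 4.0],
}

# Mean and sample standard deviation per category: the inputs a
# probability-based analysis needs beyond the mean value alone.
summary = {
    cat: {"mean": round(mean(vals), 2), "stdev": round(stdev(vals), 2)}
    for cat, vals in segments.items()
}
print(summary)
```

Using the standard deviation (and a fitted distribution) rather than only the category mean is what lets the later budget estimate carry a reliability statement instead of a single point value.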
Abstract:
Enhancing children's self-concepts is widely accepted as a critical educational outcome of schooling and is postulated as a mediating variable that facilitates the attainment of other desired outcomes such as improved academic achievement. Despite considerable advances in self-concept research, there has been limited progress in devising teacher-administered enhancement interventions. This is unfortunate as teachers are crucial change agents during important developmental periods when self-concept is formed. The primary aim of the present investigation is to build on the promising features of previous self-concept enhancement studies by: (a) combining two exciting research directions developed by Burnett and Craven to develop a potentially powerful cognitive-based intervention; (b) incorporating recent developments in theory and measurement to ensure that the multidimensionality of self-concept is accounted for in the research design; (c) fully investigating the effects of a potentially strong cognitive intervention on reading, mathematics, school and learning self-concepts by using a large sample size and a sophisticated research design; (d) evaluating the effects of the intervention on affective and cognitive subcomponents of reading, mathematics, school and learning self-concepts over time to test for differential effects of the intervention; (e) modifying and extending current procedures to maximise the successful implementation of a teacher-mediated intervention in a naturalistic setting by incorporating sophisticated teacher training as suggested by Hattie (1992) and including an assessment of the efficacy of implementation; and (f) examining the durability of effects associated with the intervention.
Abstract:
In this paper, the stability of an autonomous microgrid with multiple distributed generators (DGs) is studied through eigenvalue analysis. It is assumed that all the DGs are connected through Voltage Source Converters (VSCs) and that all connected loads are passive. The VSCs are controlled by state feedback controllers to achieve the desired voltage and current outputs, which are decided by a droop controller. The state-space models of each of the converters with its associated feedback are derived. These are then connected with the state-space models of the droop, network and loads to form a homogeneous model, through which the eigenvalues are evaluated. The system stability is then investigated as a function of the droop controller's real and reactive power coefficients. These observations are then verified through simulation studies using PSCAD/EMTDC. It will be shown that the simulation results closely agree with the stability behavior predicted by the eigenvalue analysis.
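The eigenvalue stability test underlying this kind of analysis can be illustrated on a toy 2x2 state matrix. The matrix entries and the gain k below are invented and bear no relation to the paper's actual droop model; the point is only the criterion that all eigenvalues must have negative real parts:

```python
import cmath

def eig2(a, b, c, d):
    """Eigenvalues of the 2x2 state matrix [[a, b], [c, d]] from the
    characteristic polynomial s^2 - tr*s + det = 0."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def is_stable(eigs):
    """Small-signal stable iff every eigenvalue lies in the left half-plane."""
    return all(e.real < 0 for e in eigs)

# Hypothetical reduced model: raising the gain k pushes an eigenvalue
# across the imaginary axis, destabilising the system.
for k in (0.5, 3.0):
    eigs = eig2(-1.0 + k, 1.0, -1.0, -1.0)
    print(k, is_stable(eigs))
```

Sweeping a controller coefficient and re-evaluating the eigenvalues at each step, as in this loop, is the basic procedure for mapping the stable range of droop gains.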
Abstract:
In the past decade, the use of ambulance data to inform the prevalence of nonfatal heroin overdose has increased. These data can assist public health policymakers, law enforcement agencies, and health providers in planning and allocating resources. This study examined the 672 ambulance attendances at nonfatal heroin overdoses in Queensland, Australia, in 2000. Gender distribution showed a typical 70/30 male-to-female ratio. Equal numbers of persons with nonfatal heroin overdose were aged 15 to 24 years and 25 to 34 years. Police were present in only one in six cases, and 28.1% of patients reported using drugs alone. Ambulance data are proving to be a valuable population-based resource for describing the incidence and characteristics of nonfatal heroin overdose episodes. Future studies could focus on the differences between nonfatal and fatal heroin overdose samples.
Abstract:
LEX is a stream cipher that progressed to Phase 3 of the eSTREAM stream cipher project. In this paper, we show that the security of LEX against algebraic attacks relies on a small equation system not being solvable faster than exhaustive search. We use the byte leakage in LEX to construct a system of 21 equations in 17 variables. This is very close to the requirement for an efficient attack, i.e. a system containing 16 variables. The system requires only 36 bytes of keystream, which is very low.
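Why a small system in few variables is dangerous is easy to see on a toy example. The sketch below has nothing to do with LEX's actual equations (which are nonlinear); it only illustrates Gauss-Jordan elimination over GF(2), the workhorse step once an algebraic system has been linearised, on an invented 3-variable system:

```python
def solve_gf2(equations, nvars):
    """Gauss-Jordan elimination over GF(2).  Each equation is an int:
    bit i (i < nvars) is the coefficient of x_i, and bit nvars is the
    right-hand-side constant.  Assumes a consistent, full-rank system."""
    mask = (1 << nvars) - 1
    rows = list(equations)
    pivot_rows = []
    for col in range(nvars):
        pivot = None
        for i, r in enumerate(rows):
            if (r >> col) & 1:
                pivot = rows.pop(i)
                break
        if pivot is None:
            continue  # free variable (not expected for full rank)
        # XOR the pivot into every other row that has this column set.
        rows = [r ^ pivot if (r >> col) & 1 else r for r in rows]
        pivot_rows = [r ^ pivot if (r >> col) & 1 else r for r in pivot_rows]
        pivot_rows.append(pivot)
    solution = [0] * nvars
    for r in pivot_rows:  # each reduced row now pins down one variable
        solution[(r & mask).bit_length() - 1] = (r >> nvars) & 1
    return solution

# Toy overdefined system with secret (x0, x1, x2) = (1, 0, 1):
# x0+x1=1, x1+x2=1, x0+x2=0, x0+x1+x2=0
eqs = [0b1011, 0b1110, 0b0101, 0b0111]
print(solve_gf2(eqs, 3))  # [1, 0, 1]
```

Elimination like this runs in polynomial time, which is why an attack reducing key recovery to such a system would beat exhaustive search.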
Abstract:
Introduction: Bone mineral density (BMD) is currently the preferred surrogate for bone strength in clinical practice. Finite element analysis (FEA) is a computer simulation technique that can predict the deformation of a structure when a load is applied, providing a measure of stiffness (N mm−1). Finite element analysis of X-ray images (3D-FEXI) is a FEA technique whose analysis is derived from a single 2D radiographic image. Methods: 18 excised human femora had previously been quantitative computed tomography scanned, from which 2D BMD-equivalent radiographic images were derived, and mechanically tested to failure in a stance-loading configuration. A 3D proximal femur shape was generated from each 2D radiographic image and used to construct 3D-FEA models. Results: The coefficient of determination (R2) for predicting failure load was 54.5% for BMD and 80.4% for 3D-FEXI. Conclusions: This ex vivo study demonstrates that 3D-FEXI derived from a conventional 2D radiographic image has the potential to significantly increase the accuracy of failure load assessment of the proximal femur compared with that currently achieved with BMD. This approach may be readily extended to routine clinical BMD images derived by dual energy X-ray absorptiometry. Crown Copyright © 2009 Published by Elsevier Ltd on behalf of IPEM. All rights reserved.
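The coefficient of determination used to compare the two predictors is straightforward to compute. The failure loads and predictions below are invented for illustration, not the study's data:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((y - mean_obs) ** 2 for y in observed)
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))
    return 1 - ss_res / ss_tot

# Hypothetical measured failure loads (kN) and predictions from two
# surrogates; the second tracks the measurements more closely.
measured = [6.0, 8.0, 10.0, 12.0]
pred_a = [7.0, 7.5, 11.5, 10.0]
pred_b = [6.2, 7.8, 10.3, 11.7]
print(round(r_squared(measured, pred_a), 3))
print(round(r_squared(measured, pred_b), 3))
```

A higher R2, as reported for 3D-FEXI over BMD, means a larger share of the variance in measured failure load is explained by the predictor.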
Abstract:
Background: The transition to school is a sensitive period for children in relation to school success. In the early school years, children need to develop positive attitudes to school and have experiences that promote academic, behavioural and social competence. When children begin school there are higher expectations of responsibility and independence, and in the year one class there are more explicit academic goals for literacy and numeracy and more formal instruction. Most importantly, children’s early attitudes to learning and learning styles have an impact on later educational outcomes. Method: Data were drawn from the Longitudinal Study of Australian Children (LSAC), a cross-sequential cohort study funded by the Australian Government. In these analyses, Wave 2 (2006) data for 2499 children in the Kindergarten Cohort were used. Children at Wave 2 were in the first year of formal school, with a mean age of 6.9 years (SD = 0.26). Measures included a 6-item measure of Approaches to Learning (task persistence, independence) and the Academic Rating Scales for language and literacy and mathematical thinking. Teachers rated their relationships with children on the short form of the STRS. Results: Girls were rated by their teachers as doing better than boys on Language and Literacy and on Approaches to Learning, and they had a better relationship with their teacher. Children from an Aboriginal or Torres Strait Islander (ATSI) background were rated as doing less well on Language and Literacy, Mathematical Thinking and Approaches to Learning. Children from high socio-economic position families were rated as doing better on Language and Literacy, Mathematical Thinking and Approaches to Learning, and they had a better relationship with their teacher. Conclusions: Findings highlight the importance of key demographic variables in understanding children’s early school success.
Abstract:
This thesis is a documented energy audit and long-term study of energy and water reduction in a ghee factory. Global production of ghee exceeds 4 million tonnes annually. The factory in this study refines dairy products by non-traditional centrifugal separation and produces 99.9% pure, canned, crystallised Anhydrous Milk Fat (ghee). Ghee is traditionally made by batch processing methods, which are less efficient than centrifugal separation. An in-depth systematic investigation was conducted of each item of major equipment: ammonia refrigeration, a steam boiler, canning equipment, pumps, heat exchangers and compressed air were all fine-tuned. Continuous monitoring of electrical usage showed that not every initiative worked, while others had payback periods of less than a year. In 1994-95 energy consumption was 6,582 GJ and in 2003-04 it was 5,552 GJ, down 16% for a similar output. A significant reduction in water usage was achieved by reducing the airflow in the refrigeration evaporative condensers to match the refrigeration load. Water usage fell 68%, from 18 ML in 1994-95 to 5.78 ML in 2003-04. The methods reported in this thesis could be applied to other industries with similar equipment, and to other ghee manufacturers.
Abstract:
This work aims to take advantage of recent developments in joint factor analysis (JFA) in the context of a phonetically conditioned GMM speaker verification system. Previous work has shown performance advantages through phonetic conditioning, but this has not been shown to date with the JFA framework. Our focus is particularly on strategies for combining the phone-conditioned systems. We show that the classic fusion of the scores is suboptimal when using multiple GMM systems. We investigate several combination strategies in the model space, and demonstrate improvement over score-level combination as well as over a non-phonetic baseline system. This work was conducted during the 2008 CLSP Workshop at Johns Hopkins University.