Abstract:
Roundwood structures have long been used for temporary, low-cost shelters and other short-lived structures. Novel concepts for the use of plantation hardwoods in roundwood form in construction were developed and circulated, along with an electronic questionnaire, to stakeholders representing growers, designers and users of hardwood. Responses indicate a high level of interest in developing products from the emerging small-roundwood resource, and a detailed program of research was supported and recommended by the majority of survey participants. These results indicate strong support for further investigation into the use of plantation hardwood for roundwood components. Respondents representing a wide range of stakeholders indicated that, to gain benefit from a detailed project, they would require: solutions for connection systems and protection from pests and weathering; indications of cost and assurance of ongoing supply for niche applications; data on strength, acoustic damping and thermal insulation properties; acceptance by regulatory authorities; and training for on-site construction.
Abstract:
Efficient and reliable diagnostic tools for the routine indexing and certification of clean propagating material are essential for the management of pospiviroid diseases in horticultural crops. This study describes the development of a true multiplexed diagnostic method for the detection and identification of all nine currently recognized pospiviroid species in one assay using Luminex bead-based suspension array technology. In addition, a new data-driven, statistical method is presented for establishing thresholds for positivity for individual assays within multiplexed arrays. When applied to the multiplexed array data generated in this study, the new method was shown to give better control of false positive and false negative results than two other commonly used approaches for setting thresholds. The 11-plex Luminex MagPlex-TAG pospiviroid array described here has a unique hierarchical assay design, incorporating a near-universal assay in addition to nine species-specific assays, and a co-amplified plant internal control assay for quality assurance purposes. All assays of the multiplexed array were shown to be 100% specific, sensitive and reproducible. The multiplexed array described herein is robust, easy to use, displays unambiguous results and has strong potential for use in routine pospiviroid indexing to improve disease management strategies.
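The abstract does not spell out the statistical thresholding method it develops, but the general idea of a data-driven positivity cut-off can be sketched with a common mean-plus-k-SD rule applied to negative-control readings. This is an illustrative baseline only, not necessarily the method of the study; all names and numbers below are assumptions.

```python
import statistics

def positivity_threshold(negative_mfi, k=3.0):
    """Estimate a per-assay positivity threshold from negative-control
    median fluorescence intensity (MFI) readings: mean + k standard
    deviations of the negative-control distribution."""
    mu = statistics.mean(negative_mfi)
    sigma = statistics.stdev(negative_mfi)
    return mu + k * sigma

# Hypothetical negative-control MFIs for one assay in a multiplexed array
negatives = [52, 61, 48, 57, 55, 60, 49, 53]
cutoff = positivity_threshold(negatives)

# A sample well above the cut-off would be called positive for this assay
sample_mfi = 412
is_positive = sample_mfi > cutoff
```

In a multiplexed array each analyte gets its own threshold, since background fluorescence differs between bead sets; a per-assay rule like this is what makes that practical.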
Abstract:
A new approach for the simultaneous identification of the viruses and vectors responsible for tomato yellow leaf curl disease (TYLCD) epidemics is presented. A panel of quantitative multiplexed real-time PCR assays was developed for the sensitive and reliable detection of Tomato yellow leaf curl virus-Israel (TYLCV-IL), Tomato leaf curl virus (ToLCV), the Bemisia tabaci Middle East Asia Minor 1 species (MEAM1, B biotype) and the B. tabaci Mediterranean species (MED, Q biotype) from either plant or whitefly samples. For quality-assurance purposes, two internal control assays were included in the assay panel for the co-amplification of solanaceous plant DNA or B. tabaci DNA. All assays were shown to be specific and reproducible. The multiplexed assays were able to reliably detect as few as 10 plasmid copies of TYLCV-IL, 100 plasmid copies of ToLCV, 500 fg of B. tabaci MEAM1 DNA and 300 fg of B. tabaci MED DNA. Evaluated methods for the routine testing of field-collected whiteflies are presented, including protocols for processing B. tabaci captured on yellow sticky traps and for bulking multiple B. tabaci individuals prior to DNA extraction. This work assembles all of the essential features of a validated and quality-assured diagnostic method for the identification and discrimination of tomato-infecting begomovirus and B. tabaci vector species in Australia. This flexible panel of assays will facilitate improved quarantine, biosecurity and disease-management programmes both in Australia and worldwide.
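Detection limits stated as plasmid copy numbers relate to DNA mass through the standard copy-number formula, copies = (mass × Avogadro's number) / (plasmid length × 650 g/mol per bp). A quick sketch of that conversion follows; the 4,000 bp plasmid length is an illustrative assumption, not a figure from the abstract.

```python
AVOGADRO = 6.022e23    # molecules per mole
BP_MASS = 650.0        # average g/mol per base pair of double-stranded DNA

def plasmid_copies(mass_ng, plasmid_length_bp):
    """Approximate number of double-stranded plasmid copies in a
    DNA mass given in nanograms."""
    mass_g = mass_ng * 1e-9
    moles = mass_g / (plasmid_length_bp * BP_MASS)
    return moles * AVOGADRO

# For a hypothetical 4,000 bp plasmid, the 10-copy detection limit
# corresponds to roughly 4.3e-8 ng (about 0.043 fg) of plasmid DNA
copies_at_limit = plasmid_copies(4.32e-8, 4000)
```

Conversions like this are how copy-number detection limits are prepared in practice: a quantified plasmid stock is serially diluted to the target copy number per reaction.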
Abstract:
The National Energy Efficient Building Project (NEEBP) Phase One report, published in December 2014, investigated "process issues and systemic failures" in the administration of the energy performance requirements in the National Construction Code. It found that most stakeholders believed that under-compliance with these requirements is widespread across Australia, with similar issues being reported in all states and territories. The report found that many different factors were contributing to this outcome and, as a result, offered many recommendations that together would be expected to remedy the systemic issues reported. To follow up on the Phase 1 report, three additional projects were commissioned as part of Phase 2 of the overall NEEBP project. This report deals with the development and piloting of an Electronic Building Passport (EBP) tool, a project undertaken jointly by pitt&sherry and a team at the Queensland University of Technology (QUT) led by Dr Wendy Miller. The other Phase 2 projects cover audits of Class 1 buildings and issues relating to building alterations and additions. The passport concept aims to provide all stakeholders with (controlled) access to the key documentation and information they need to verify the energy performance of buildings. This trial project deals with residential buildings, but in principle the concept could apply to any building type. Nine councils were recruited to help develop and test a pilot electronic building passport tool. The participation of these councils, across all states, enabled an assessment of the extent to which councils currently use documentation to track the compliance of residential buildings with the energy performance requirements in the National Construction Code (NCC). Overall we found that none of the participating councils are currently compiling all of the energy performance-related documentation that would demonstrate code compliance.
The key reasons for this include: a major lack of clarity on precisely what documentation should be collected; cost and budget pressures; low public/stakeholder demand for the documentation; and a pragmatic judgement that non-compliance with any regulated documentation requirements represents a relatively low risk for them. For example, some councils reported producing documentation, such as certificates of final completion, only on demand. Only three of the nine council participants reported regularly conducting compliance assessments or audits utilising this documentation and/or inspections. Overall we formed the view that the documentation and information tracking processes operating within the building standards and compliance system are not working to assure compliance with the Code's energy performance requirements. In other words, the Code, and its implementation under state and territory regulatory processes, is falling short as a 'quality assurance' system for consumers. As a result, it is likely that the new housing stock is under-performing relative to policy expectations, consuming unnecessary amounts of energy, imposing unnecessarily high energy bills on occupants, and generating unnecessary greenhouse gas emissions. At the same time, councils noted that demand for documentation relating to building energy performance was low. All the participant councils in the EBP pilot agreed that documentation and information processes need to work more effectively if the potential regulatory and market drivers towards energy efficient homes are to be harnessed. These findings are fully consistent with the Phase 1 NEEBP report. It was also agreed that an EBP system could potentially play an important role in improving documentation and information processes. However, only one of the participant councils indicated that they might adopt such a system on a voluntary basis.
The majority felt that such a system would only be taken up if it were:
- a nationally agreed system, imposed as a mandatory requirement under state or national regulation;
- capable of being used by multiple parties, including councils, private certifiers, building regulators, builders and energy assessors in particular; and
- fully integrated into their existing document management systems, or at least seamlessly compatible with them, rather than a separate, unlinked tool.
Further, we note that the value of an EBP in capturing statistical information relating to the energy performance of buildings would be much greater if it were adopted on a nationally consistent basis. Councils were clear that a key impediment to the take-up of an EBP system is that they face very considerable budget and staffing challenges. They report that they are often unable to meet all community demands from the resources available to them, and are therefore unlikely to provide resources to support the roll-out of an EBP system on a voluntary basis. Overall, we conclude from this pilot that the public good would be well served if the Australian, state and territory governments continued to develop and implement an Electronic Building Passport system in a cost-efficient and effective manner. This development should occur with detailed input from building regulators, the Australian Building Codes Board (ABCB), councils and private certifiers in the first instance. This report provides a suite of recommendations (Section 7.2) designed to advance the development and guide the implementation of a national EBP system.
Abstract:
Title insurance companies originating from America have, in the past 15 years, become part of the Australian conveyancing landscape. However, for most residential freehold owners, their activities would be a mystery. A purchaser does not routinely obtain title insurance, with the companies presently focussing on servicing the mortgagee sector. While the lack of penetration in the residential purchaser market may be attributed to the consumer's lack of knowledge, evidence from Ontario and New Zealand illustrates that title insurance is likely to become an additional cost in the conveyancing process in Australia. In this article we highlight the reasons why, and demonstrate how title insurers, by working with the legal profession, have been able to subtly move the risk of responsibility for compensation for loss (at least in the first instance) from the state to the insurer, with the added benefit for the state and the conveyancing agents that the cost of the insurance is ultimately borne by the consumer. In New Zealand this development is being accelerated by the introduction of capped conveyancing title insurance. Whether title insurance will become part of the conveyancing process is no longer the relevant question for Australia (it undoubtedly will); the unknown issues are how title insurance companies will work with conveyancing agents to infiltrate the market, and how this infiltration will affect the state's view of its continued role in the provision of assurance. We suggest that developments from New Zealand in relation to capped conveyancing insurance are likely to be replicated in Australia in the near future, and that the state's role in providing an assurance fund will continue, though the state may seek to expand the areas in which the right to compensation is restricted.
Abstract:
Urban sprawl is outgrowth along the periphery of cities and along highways. Although a precise definition of urban sprawl may be debated, there is consensus that it is characterized by an unplanned and uneven pattern of growth, driven by a multitude of processes and leading to inefficient resource utilization. Urbanization in India has never been as rapid as it is in recent times. As one of the fastest growing economies in the world, India faces stiff challenges in managing urban sprawl while ensuring the effective delivery of basic services in urban areas. Urban areas contribute significantly to the national economy (more than 50% of GDP), while facing critical challenges in accessing basic services and necessary infrastructure, both social and economic. The overall rise in the population of the urban poor, and the increase in travel times due to congestion along road networks, are indicators of the effectiveness of planning and governance in assessing and catering for this demand. Agencies of governance at all levels (local bodies, state governments and the federal government) are facing the brunt of this rapid urban growth. It is imperative for planning and governance to facilitate, augment and service the requisite infrastructure systematically over time. Provision of infrastructure and assurance of the delivery of basic services cannot happen overnight, and hence planning has to facilitate forecasting and service provision with appropriate financial mechanisms.
Abstract:
In order to protect critical electronic equipment/systems against damped sine transient currents induced into their cables by transient electromagnetic fields, switching phenomena, platform resonances, etc., it is necessary to provide proper hardening. The hardness assurance provided can be evaluated in the laboratory as per test CS116 of MIL-STD-461E/F by generating and inducing the necessary damped sine currents into the cables of the Equipment Under Test (EUT). The need for, and the stringent requirements of, a damped sine wave current generator for generating damped sine current transients of very high frequencies (30 MHz and 100 MHz) are presented. A method using LC discharge has been adopted for the generation. This involves building extremely low-value, nearly lossless inductors (about 5 nH and 14 nH), as well as a capacitor and a switch with much lower inductances; a technique for achieving this is described. Two units (one each for 30 MHz and 100 MHz) have been built, and experiments to verify their output are being conducted.
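The oscillation frequency of an LC discharge is set by the resonant relation f = 1/(2π√(LC)), which shows why nanohenry-scale inductors are unavoidable at these frequencies. A minimal sketch follows; the pairing of the ~5 nH inductor with the 100 MHz unit and the ~14 nH inductor with the 30 MHz unit is an assumption for illustration, not stated in the abstract.

```python
import math

def lc_resonant_frequency(L, C):
    """Resonant frequency (Hz) of an LC discharge circuit,
    f = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def required_capacitance(f, L):
    """Capacitance (F) needed to resonate inductance L (H) at f (Hz)."""
    return 1.0 / ((2.0 * math.pi * f) ** 2 * L)

# Assumed pairings: 5 nH inductor -> 100 MHz unit, 14 nH -> 30 MHz unit
C_100MHz = required_capacitance(100e6, 5e-9)   # on the order of 0.5 nF
C_30MHz = required_capacitance(30e6, 14e-9)    # on the order of 2 nF
```

Even at these sub-nanofarad/nanofarad values, any stray inductance in the capacitor and switch adds directly to L and pulls the frequency down, which is why the abstract stresses low-inductance construction for all three components.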
Abstract:
The international trend towards an increasingly standards-based approach to higher education, and the resultant focus on the assurance of learning in tertiary programs, have generated a strong emphasis on the assessment of outcomes across the higher education sector. In legal education, curriculum reform is highly prevalent internationally as a result of various reviews of legal education. As legal education focuses more on the attainment of a broader set of outcomes encompassing soft skills, capabilities and attributes, more authentic assessment appropriate to this new environment will need to be developed, meaning that modes of assessment with strong application in real-life settings should be preferred.
Abstract:
Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge, as the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important because the effect of the radiation on the tissue correlates with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international dosimetry recommendations are not directly applicable to BNCT. The existing dosimetry guidance for BNCT provides recommendations but also calls for the investigation of complementary methods for comparison and improved accuracy. In this thesis the quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found not to be affected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was less than 1%. The weekly stability test with activation detectors has been generally reproducible within the recommended tolerance value of 2%. An established toolkit for epithermal neutron beams for the determination of the dose components is presented and applied in an international dosimetric intercomparison. The quantities measured by the groups in the intercomparison (neutron flux, fast neutron dose and photon dose) were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3% to 30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centres. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years.
The results presented rule out the severe sensitivity changes to thermal neutrons that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry are studied as complementary methods for epithermal neutron beam dosimetry. For microdosimetry, comparison with ionisation chambers and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods, although the disagreement was within the uncertainties. For neutron dose, the simulation and microdosimetry results agreed within 10%, while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution normalised to the dose maximum measured by the MAGIC polymer gel agreed well with the simulated result near the dose maximum, while the spatial difference between the measured and simulated 30% isodose lines was more than 1 cm. In both the BANG-3 and MAGIC gel studies, interpretation of the results was complicated by the presence of high-LET radiation.
Abstract:
The Earth's ecosystems are protected from the dangerous part of solar ultraviolet (UV) radiation by stratospheric ozone, which absorbs most of the harmful UV wavelengths. Severe depletion of stratospheric ozone has been observed in the Antarctic region, and to a lesser extent in the Arctic and midlatitudes. Concern about the effects of increasing UV radiation on human beings and the natural environment has led to ground-based monitoring of UV radiation. In order to achieve high-quality UV time series for scientific analyses, proper quality control (QC) and quality assurance (QA) procedures have to be followed. In this work, QC and QA practices are developed for Brewer spectroradiometers and NILU-UV multifilter radiometers, which measure in the Arctic and Antarctic regions, respectively; these practices are applicable to other UV instruments as well. The spectral features and the effects of different factors affecting UV radiation were studied for the spectral UV time series at Sodankylä. The QA of the Finnish Meteorological Institute's (FMI) two Brewer spectroradiometers included daily maintenance, laboratory characterizations, the calculation of long-term spectral responsivity, data processing and quality assessment. New methods were developed for the cosine correction, the temperature correction and the calculation of long-term changes in spectral responsivity. Reconstructed UV irradiances were used as a QA tool for the spectroradiometer data. The actual cosine correction factor was found to vary within 1.08-1.12 and 1.08-1.13 for the two instruments. The temperature characterization showed a linear dependence between the instrument's internal temperature and the photon counts per cycle. Both Brewers have participated in international spectroradiometer comparisons and have shown good stability; the differences between the Brewers and the portable reference spectroradiometer QASUME have been within 5% during 2002-2010.
The features of the spectral UV radiation time series at Sodankylä were analysed for the period 1990-2001. No statistically significant long-term changes in UV irradiance were found, and the results were strongly dependent on the time period studied. Ozone was the dominant factor affecting UV radiation during springtime, whereas clouds played a more important role during summertime. During this work, the Antarctic NILU-UV multifilter radiometer network was established by the Instituto Nacional de Meteorología (INM) as a joint Spanish-Argentinian-Finnish cooperation project. As part of this work, the QC/QA practices of the network were developed. These included training of the operators, daily maintenance, regular lamp tests and solar comparisons with the travelling reference instrument. Drifts of up to 35% in the sensitivity of the channels of the NILU-UV multifilter radiometers were found during the first four years of operation. This work emphasized the importance of proper QC/QA, including regular lamp tests, for multifilter radiometers as well. The effect of the drifts was corrected by a method that scales the site NILU-UV channels to those of the travelling reference NILU-UV. After correction, the mean ratios of erythemally-weighted UV dose rates measured during solar comparisons between the reference NILU-UV and the site NILU-UVs were 1.007±0.011 and 1.012±0.012 for Ushuaia and Marambio, respectively, for solar zenith angles of up to 80°. Solar comparisons between the NILU-UVs and spectroradiometers showed a ±5% difference near local noon, which can be seen as proof of successful QC/QA procedures and transfer of irradiance scales. This work also showed that UV measurements made in the Arctic and Antarctic can be comparable with each other.
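The abstract does not detail how the site channels are scaled to the travelling reference, but one plausible form of such a correction is a per-channel ratio of simultaneous solar-comparison readings. The sketch below is illustrative only, with made-up readings; it is not necessarily the procedure used in the thesis.

```python
def channel_scale_factors(site_readings, reference_readings):
    """Per-channel scale factors mapping a site radiometer onto the
    travelling reference, from simultaneous solar-comparison readings.
    Each argument is a list of per-channel reading lists."""
    factors = []
    for site, ref in zip(site_readings, reference_readings):
        mean_site = sum(site) / len(site)
        mean_ref = sum(ref) / len(ref)
        factors.append(mean_ref / mean_site)
    return factors

# Hypothetical data: two channels, three simultaneous readings each;
# the first channel has drifted low relative to the reference
site = [[0.80, 0.82, 0.81], [1.10, 1.12, 1.08]]
reference = [[1.00, 1.02, 1.01], [1.10, 1.11, 1.09]]
factors = channel_scale_factors(site, reference)
```

Multiplying a site channel's readings by its factor restores agreement with the reference, which is the sense in which drifts of tens of percent can be corrected after the fact.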
Abstract:
Radiation therapy (RT) currently plays a significant role in the curative treatment of several cancers. External beam RT is carried out mostly using megavoltage beams from linear accelerators. Tumour eradication and normal tissue complications correlate with the dose absorbed in tissues. Normally this dependence is steep, and it is crucial that the actual dose within the patient accurately corresponds to the planned dose. All factors in an RT procedure involve uncertainties, requiring strict quality assurance. From a hospital physicist's point of view, technical quality control (QC), dose calculations and methods for the verification of correct treatment location are the most important subjects. The most important factor in technical QC is verifying that the radiation production of an accelerator, called the output, is within narrow acceptable limits. The output measurements are carried out according to a locally chosen dosimetric QC program defining the measurement time interval and action levels. Dose calculation algorithms need to be configured for the accelerators using measured beam data, and the uncertainty of such data sets the limit for the best achievable calculation accuracy. All these dosimetric measurements require considerable experience, are laborious, take up resources needed for treatments, and are prone to several random and systematic sources of error. Appropriate verification of treatment location is more important in intensity modulated radiation therapy (IMRT) than in conventional RT, because of the steep dose gradients produced within, or close to, healthy tissues located only a few millimetres from the targeted volume. This thesis concentrated on investigating the quality of dosimetric measurements, the efficacy of dosimetric QC programs, the verification of measured beam data, and the effect of positional errors on the dose received by the major salivary glands in head and neck IMRT.
A method was developed for estimating the effect of using different dosimetric QC programs on the overall uncertainty of dose, and data were provided to facilitate the choice of a sufficient QC program. The method takes into account local output stability and the reproducibility of the dosimetric QC measurements, and a method based on model fitting of the QC measurement results was proposed for the estimation of both of these factors. The reduction of random measurement errors and the optimization of the QC procedure were also investigated, and a method and suggestions were presented for these purposes. The accuracy of beam data was evaluated in Finnish RT centres, and a sufficient accuracy level was estimated for the beam data. A method based on the use of reference beam data was developed for the QC of beam data. Dosimetric and geometric accuracy requirements were evaluated for head and neck IMRT when the function of the major salivary glands is intended to be spared; these criteria are based on the dose response obtained for the glands. Random measurement errors could be reduced, enabling the lowering of action levels and the prolongation of the measurement time interval from one month to as much as six months while maintaining dose accuracy. The combined effect of the proposed methods, suggestions and criteria was found to facilitate the avoidance of maximal dose errors of up to about 8%. In addition, their use may make the strictest recommended overall dose accuracy level of 3% (1 SD) achievable.
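The idea of separating output stability from measurement reproducibility by model fitting can be illustrated with the simplest possible model: a linear fit to a series of output QC measurements, where the slope estimates the drift and the residual scatter estimates the reproducibility of the measurement itself. This is a sketch under that assumed linear model, with invented numbers; the thesis's actual model is not specified in the abstract.

```python
import statistics

def fit_output_drift(days, outputs):
    """Least-squares linear fit to accelerator output QC measurements.
    Returns (drift per day, intercept, residual SD); the residual SD
    estimates the reproducibility of the QC measurement."""
    n = len(days)
    mean_t = sum(days) / n
    mean_y = sum(outputs) / n
    sxy = sum((t - mean_t) * (y - mean_y) for t, y in zip(days, outputs))
    sxx = sum((t - mean_t) ** 2 for t in days)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_t
    residuals = [y - (intercept + slope * t) for t, y in zip(days, outputs)]
    return slope, intercept, statistics.stdev(residuals)

# Hypothetical monthly output checks, normalised to 1.0 at calibration
days = [0, 30, 60, 90, 120, 150]
outputs = [1.000, 1.002, 1.003, 1.006, 1.007, 1.010]
drift, intercept, resid_sd = fit_output_drift(days, outputs)
```

If the residual SD is small relative to the action level, action levels can be tightened or the measurement interval lengthened, which is the trade-off the abstract describes.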
Abstract:
The MIT Lincoln Laboratory IDS evaluation methodology is a practical solution for evaluating the performance of Intrusion Detection Systems, and it has contributed tremendously to research progress in that field. The DARPA IDS evaluation dataset has been criticized and considered by many to be a very outdated dataset, unable to accommodate the latest trends in attacks. The question then naturally arises as to whether detection systems have improved beyond detecting these older attacks; if not, is it right to consider this dataset obsolete? The paper presented here tries to provide supporting facts for the continued use of the DARPA IDS evaluation dataset. Two commonly used signature-based IDSs, Snort and Cisco IDS, and two anomaly detectors, PHAD and ALAD, are used for this evaluation, and the results support the usefulness of the DARPA dataset for IDS evaluation.
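Evaluations of this kind are typically summarised by the detection rate and false positive rate over a labelled trace. A minimal sketch of those two metrics follows, on toy labels rather than DARPA data.

```python
def detection_metrics(alerts, ground_truth):
    """Detection rate and false positive rate for an IDS, given
    per-event alert decisions and ground-truth attack labels
    (1 = attack, 0 = normal)."""
    tp = sum(1 for a, g in zip(alerts, ground_truth) if a and g)
    fp = sum(1 for a, g in zip(alerts, ground_truth) if a and not g)
    attacks = sum(ground_truth)
    normals = len(ground_truth) - attacks
    detection_rate = tp / attacks if attacks else 0.0
    false_positive_rate = fp / normals if normals else 0.0
    return detection_rate, false_positive_rate

# Toy labelled trace: the detector catches 3 of 4 attacks and raises
# one false alarm on 6 normal events
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
alerts = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]
dr, fpr = detection_metrics(alerts, truth)
```

Sweeping a detector's threshold and plotting these two rates against each other yields the ROC curves commonly reported in DARPA-based evaluations.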
Abstract:
The motivation behind the fusion of Intrusion Detection Systems was the realization that, with increasing traffic and increasingly complex attacks, no present-day stand-alone Intrusion Detection System can meet the high demand for a very high detection rate together with an extremely low false positive rate. Multi-sensor fusion can meet these requirements by refining the combined response of different Intrusion Detection Systems. In this paper, we show a design technique for sensor fusion that makes the best use of the responses from multiple sensors through an appropriate adjustment of the fusion threshold. The threshold is generally chosen according to past experience or by an expert system; in this paper, we show that choosing the threshold bounds according to the Chebyshev inequality performs better. This approach also helps to solve the problem of scalability and has the advantage of failsafe capability. This paper theoretically models the fusion of Intrusion Detection Systems to prove the improvement in performance, supplemented with empirical evaluation. The combination of complementary sensors is shown to detect more attacks than the individual components. Since the individual sensors chosen detect sufficiently different attacks, their results can be merged for improved performance. The combination is done in different ways: (i) taking all the alarms from each system and avoiding duplications, (ii) taking alarms from each system by fixing threshold bounds, and (iii) rule-based fusion with a priori knowledge of the individual sensor performance. A number of evaluation metrics are used, and the results indicate an overall enhancement in the performance of the combined detector using sensor fusion incorporating the threshold bounds, and significantly better performance using simple rule-based fusion.
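The appeal of a Chebyshev-based threshold is that it bounds the false-alarm probability without assuming any particular distribution for the fused score: P(|X − μ| ≥ kσ) ≤ 1/k². The sketch below applies that bound to a scalar fused anomaly score; it is an illustration of the general technique, not the paper's exact formulation.

```python
import math
import statistics

def chebyshev_threshold(normal_scores, alpha=0.01):
    """Upper alert threshold on a fused anomaly score. By Chebyshev's
    inequality, P(|X - mu| >= k*sigma) <= 1/k**2, so choosing
    k = 1/sqrt(alpha) bounds the false-alarm probability by alpha
    for any score distribution with finite variance."""
    mu = statistics.mean(normal_scores)
    sigma = statistics.stdev(normal_scores)
    k = 1.0 / math.sqrt(alpha)
    return mu + k * sigma

# Hypothetical fused scores observed on attack-free traffic
normal = [0.10, 0.12, 0.08, 0.11, 0.09, 0.10, 0.13, 0.07]
threshold = chebyshev_threshold(normal, alpha=0.04)  # k = 5
alert = 0.60 > threshold
```

Because the bound is distribution-free, the threshold is conservative: the actual false-alarm rate will typically be well below alpha, at some cost in detection sensitivity.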
Abstract:
A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect of highly distributed software development, as found in the Linux kernel project, on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques in which several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
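A fork rate or merge rate can be computed as a count of events per unit time extracted from version control history. The study's exact definitions are not given in the abstract; the sketch below, with hypothetical event dates, shows one straightforward interpretation.

```python
from datetime import date

def event_rate(event_dates, window_days=30):
    """Average number of events (e.g. forks or merges) per
    window_days, over the observed span of the event dates."""
    if not event_dates:
        return 0.0
    span = (max(event_dates) - min(event_dates)).days or 1
    return len(event_dates) * window_days / span

# Hypothetical merge events extracted from a version control history
merges = [date(2009, 1, 5), date(2009, 1, 20), date(2009, 2, 10),
          date(2009, 3, 1), date(2009, 3, 28)]
merge_rate = event_rate(merges, window_days=30)  # merges per 30 days
```

Comparing fork rate against merge rate over time would then indicate whether parallel source trees are proliferating or being reconciled.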