969 results for technology standard
Abstract:
The design of pre-contoured fracture fixation implants (plates and nails) that correctly fit the anatomy of a patient relies on 3D models of long bones with accurate geometric representation. 3D data is usually available from computed tomography (CT) scans of human cadavers, which generally represent the over-60 age group. Thus, although half of the seriously injured population is aged 30 or below, virtually no data exists from these younger age groups to inform the design of implants that optimally fit patients from these groups. Hence, relevant bone data from these age groups is required. The current gold standard for acquiring such data, CT, involves ionising radiation and cannot be used to scan healthy human volunteers. Magnetic resonance imaging (MRI) has been shown to be a potential alternative in previous studies conducted on small bones (tarsal bones) and parts of long bones. However, to use MRI effectively for 3D reconstruction of human long bones, further validation using long bones and appropriate reference standards is required. Accurate reconstruction of 3D models from CT or MRI data sets requires an accurate image segmentation method. Currently available sophisticated segmentation methods involve complex programming and mathematics that many researchers are not trained to perform. Therefore, an accurate but relatively simple method is required for segmenting CT and MRI data. Furthermore, some limitations of 1.5T MRI, such as very long scanning times and poor contrast in articular regions, can potentially be reduced by using higher-field 3T MRI. However, the signal-to-noise ratio (SNR) gain at the bone-soft tissue interface should be quantified; this is not reported in the literature. As MRI scanning of long bones involves very long scanning times, the acquired images are prone to motion artefacts caused by random movements of the subject's limbs. One artefact observed is the step artefact, believed to result from random movements of the volunteer during a scan; it must be corrected before the models can be used for implant design. As the first aim, this study investigated two accurate but simple segmentation methods, intensity thresholding and Canny edge detection, for segmenting MRI and CT data. The second aim was to investigate the usability of MRI as a radiation-free imaging alternative to CT for reconstruction of 3D models of long bones. The third aim was to use 3T MRI to address the poor contrast in articular regions and the long scanning times of current MRI. The fourth and final aim was to minimise the step artefact using 3D modelling techniques. The segmentation methods were investigated using CT scans of five ovine femora. Single-level thresholding was performed using a visually selected threshold level to segment the complete femur. For multilevel thresholding, multiple threshold levels calculated from the threshold selection method were used for the proximal, diaphyseal and distal regions of the femur. Canny edge detection was applied by delineating the outer and inner contours of 2D images and then combining them to generate the 3D model. Models generated from these methods were compared to the reference standard generated from mechanical contact scans of the denuded bones. The second aim was achieved using CT and MRI scans of five ovine femora, segmented using the multilevel threshold method.
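A minimal sketch of these two segmentation approaches is given below (Python with NumPy and scikit-image is assumed; `threshold_multiotsu` stands in for the thesis's own threshold selection method, and the region split and parameter values are illustrative, not the study's actual pipeline):

```python
import numpy as np
from skimage.feature import canny
from skimage.filters import threshold_multiotsu

def segment_region(ct_slices, classes=3):
    """Binary bone mask for one anatomical region (proximal, diaphyseal
    or distal) via multilevel thresholding.

    threshold_multiotsu is a stand-in for the thesis's threshold
    selection method; the brightest class is taken as bone.
    """
    stack = np.stack(ct_slices)                 # (slices, rows, cols)
    thresholds = threshold_multiotsu(stack, classes=classes)
    return stack >= thresholds[-1]              # binary bone mask

def bone_edges(ct_slice, sigma=2.0):
    """Canny edge map of a single 2D slice; the outer and inner cortical
    contours are traced from this map and stacked into a 3D model."""
    return canny(ct_slice.astype(float), sigma=sigma)
```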
A surface geometric comparison was conducted between CT-based, MRI-based and reference models. To quantitatively compare 1.5T images to 3T MRI images, the right lower limbs of five healthy volunteers were scanned using scanners from the same manufacturer. The images, obtained using identical protocols, were compared by means of SNR and contrast-to-noise ratio (CNR) of muscle, bone marrow and bone. To correct the step artefact in the final 3D models, the step was simulated in five ovine femora scanned with a 3T MRI scanner and was then corrected using an alignment method based on the iterative closest point (ICP) algorithm. The present study demonstrated that the multilevel threshold approach, in combination with the threshold selection method, can generate 3D models of long bones with an average deviation of 0.18 mm; the corresponding figure for the single threshold method was 0.24 mm, and the difference in accuracy between the two methods was statistically significant. In comparison, the Canny edge detection method generated an average deviation of 0.20 mm. MRI-based models exhibited an average deviation of 0.23 mm compared with 0.18 mm for CT-based models; this difference was not statistically significant. 3T MRI improved the contrast at the bone-muscle interfaces of most anatomical regions of femora and tibiae, potentially reducing the inaccuracies caused by poor contrast in the articular regions. Using the robust ICP algorithm to align the 3D surfaces, the step artefact caused by the volunteer moving the leg was corrected, with errors of 0.32 ± 0.02 mm when compared with the reference standard. The study concludes that magnetic resonance imaging, together with simple multilevel thresholding segmentation, is able to produce 3D models of long bones with accurate geometric representations. The method is, therefore, a potential alternative to the current gold standard, CT imaging.
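The ICP-based step correction could look roughly like the following (a sketch assuming Python with Open3D; the split of the surface into segments above and below the step, and the 1 mm correspondence threshold, are illustrative assumptions, not the thesis's implementation):

```python
import numpy as np
import open3d as o3d

def correct_step(upper_pts, lower_pts, max_corr_mm=1.0):
    """Realign the surface segment above a step artefact onto the
    segment below it with point-to-point ICP (coordinates in mm).

    upper_pts, lower_pts: (N, 3) float arrays; max_corr_mm is an
    illustrative correspondence-distance threshold.
    """
    source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(upper_pts))
    target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(lower_pts))
    reg = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_mm, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    source.transform(reg.transformation)        # apply the rigid alignment
    return np.asarray(source.points)
```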
Abstract:
This project investigates machine listening and improvisation in interactive music systems, with the goal of improvising musically appropriate accompaniment to an audio stream in real-time. The input audio may be from a live musical ensemble, or playback of a recording for use by a DJ. I present a collection of robust techniques for machine listening in the context of Western popular dance music genres, and strategies of improvisation that allow for intuitive and musically salient interaction in live performance. The findings are embodied in a computational agent, the Jambot, capable of real-time musical improvisation in an ensemble setting. Conceptually the agent’s functionality is split into three domains: reception, analysis and generation. The project has resulted in novel techniques for addressing a range of issues in each of these domains. In the reception domain I present a novel suite of onset detection algorithms for real-time detection and classification of percussive onsets. This suite achieves reasonable discrimination between the kick, snare and hi-hat attacks of a standard drum-kit, with sufficiently low latency to allow perceptually simultaneous triggering of accompaniment notes. The onset detection algorithms are designed to operate in the context of complex polyphonic audio. In the analysis domain I present novel beat-tracking and metre-induction algorithms that operate in real-time and are responsive to change in a live setting. I also present a novel analytic model of rhythm, based on musically salient features. This model informs the generation process, affording intuitive parametric control and allowing for the creation of a broad range of interesting rhythms. In the generation domain I present a novel improvisatory architecture drawing on theories of music perception, which provides a mechanism for the real-time generation of complementary accompaniment in an ensemble setting. All of these innovations are combined in the Jambot, which is capable of producing improvised percussive musical accompaniment to an audio stream in real-time. I situate the architectural philosophy of the Jambot within contemporary debate regarding the nature of cognition and artificial intelligence, and argue for an approach to algorithmic improvisation that privileges the minimisation of cognitive dissonance in human-computer interaction. This thesis contains extensive written discussions of the Jambot and its component algorithms, along with some comparative analyses of aspects of its operation and aesthetic evaluations of its output. The accompanying CD contains the Jambot software, along with video documentation of experiments and performances conducted during the project.
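As a flavour of the kind of processing involved in the reception domain, here is a generic spectral-flux onset detector (a textbook sketch in Python/NumPy, not the Jambot's actual low-latency suite; the frame size, hop and threshold rule are illustrative):

```python
import numpy as np

def spectral_flux_onsets(samples, sr=44100, frame=512, hop=256, k=1.5):
    """Report onset times where the positive spectral change between
    consecutive frames exceeds a simple global threshold."""
    samples = np.asarray(samples, dtype=float)
    window = np.hanning(frame)
    frames = [samples[i:i + frame] * window
              for i in range(0, len(samples) - frame, hop)]
    mags = np.abs(np.fft.rfft(frames, axis=1))          # magnitude spectra
    flux = np.maximum(mags[1:] - mags[:-1], 0.0).sum(axis=1)
    threshold = flux.mean() + k * flux.std()            # illustrative rule
    onset_frames = np.flatnonzero(flux > threshold) + 1
    return onset_frames * hop / sr                      # times in seconds
```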
Abstract:
In recent times, light gauge steel framed (LSF) structures, such as cold-formed steel wall systems, are increasingly used, but without a full understanding of their fire performance. Traditionally, the fire resistance rating of these load-bearing LSF wall systems is based on approximate prescriptive methods developed from limited fire tests, which are very often restricted to standard wall configurations used by the industry. Increased fire rating is provided simply by adding more plasterboards to these walls. This is not an acceptable situation, as it not only inhibits innovation and structural and cost efficiencies but also casts doubt on the fire safety of these wall systems. Hence a detailed fire research study into the performance of LSF wall systems was undertaken using full scale fire tests and extensive numerical studies. A new composite wall panel developed at QUT was also considered in this study, in which the insulation was placed externally between the plasterboards on both sides of the steel wall frame instead of in the cavity. Three full scale fire tests of LSF wall systems built using the new composite panel system were undertaken at a higher load ratio, using a gas furnace designed to deliver heat in accordance with the standard time-temperature curve in AS 1530.4 (SA, 2005). The fire tests included measurements of the load-deformation characteristics of LSF walls until failure, as well as associated time-temperature measurements across the thickness and along the length of all the specimens. Tests of LSF walls under axial compression load showed improved fire performance and fire resistance rating when the new composite panel was used. Hence this research recommends the use of the new composite panel system for cold-formed LSF walls. The numerical study was undertaken using the finite element program ABAQUS. The finite element analyses were conducted under both steady state and transient state conditions, using the measured hot and cold flange temperature distributions from the fire tests. The elevated temperature reduction factors for mechanical properties were based on the equations proposed by Dolamune Kankanamge and Mahendran (2011). These finite element models were first validated by comparing their results with experimental test results from this study and Kolarkar (2010). The developed finite element models were able to predict the failure times to within 5 minutes. The validated model was then used in a detailed numerical study into the strength of cold-formed thin-walled steel channels used in both the conventional and the new composite panel systems, to increase the understanding of their behaviour under non-uniform elevated temperature conditions and to develop fire design rules. The measured time-temperature distributions obtained from the fire tests were used. Since the fire tests showed that the plasterboards provided sufficient lateral restraint until the failure of LSF wall panels, this assumption was also used in the analyses and was further validated by comparison with experimental results. Hence, in this study of LSF wall studs, only flexural buckling about the major axis and local buckling were considered. A new fire design method was proposed using AS/NZS 4600 (SA, 2005), NAS (AISI, 2007) and Eurocode 3 Part 1.3 (ECS, 2006). The importance of considering thermal bowing, magnified thermal bowing and neutral axis shift in the fire design was also investigated.
A spreadsheet-based design tool was developed based on the above design codes to predict the failure load ratio versus time and temperature for varying LSF wall configurations, including insulation. Idealised time-temperature profiles were developed based on the measured temperature values of the studs and were used in a detailed numerical study to fully understand the structural behaviour of LSF wall panels. Appropriate equations were proposed to find the critical temperatures for different composite panels, varying in steel thickness, steel grade and screw spacing, for any load ratio. Hence, useful and simple design rules were proposed based on the current cold-formed steel structures and fire design standards, and their accuracy and advantages were discussed. The results were also used to validate the fire design rules developed based on AS/NZS 4600 (SA, 2005) and Eurocode 3 Part 1.3 (ECS, 2006). This demonstrated the significant improvements of the design method compared to the currently used prescriptive design methods for LSF wall systems under fire conditions. In summary, this research has developed comprehensive experimental and numerical thermal and structural performance data for both the conventional and the proposed new load-bearing LSF wall systems under standard fire conditions. Finite element models were developed to predict the failure times of LSF walls accurately. Idealised hot flange temperature profiles were developed for non-insulated, cavity-insulated and externally insulated load-bearing wall systems. Suitable fire design rules and spreadsheet-based design tools were developed based on the existing standards to predict the ultimate failure load, failure times and failure temperatures of LSF wall studs. Simplified equations were proposed to find the critical temperatures for varying wall panel configurations and load ratios. The results from this research are useful to both structural and fire engineers and researchers. Most importantly, this research has significantly improved the knowledge and understanding of cold-formed LSF load-bearing walls under standard fire conditions.
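The core logic of such a spreadsheet tool can be caricatured in a few lines (a heavily simplified Python sketch; the actual design method also checks flexural and local buckling, thermal bowing and neutral axis shift to AS/NZS 4600 and Eurocode 3 Part 1.3, none of which is reproduced here, and all names are illustrative):

```python
import numpy as np

def failure_time(times, hot_flange_temps, load_ratio, k_y):
    """Estimate stud failure time as the first instant at which the
    yield-stress reduction factor k_y(T) at the hot flange drops below
    the applied load ratio.

    times, hot_flange_temps: arrays forming the idealised
    time-temperature profile; k_y: callable T -> reduction factor,
    e.g. fitted from Dolamune Kankanamge and Mahendran (2011).
    """
    factors = np.array([k_y(t) for t in hot_flange_temps])
    failing = np.flatnonzero(factors < load_ratio)
    return times[failing[0]] if failing.size else None
```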
Abstract:
Generic, flexible social media spaces such as Facebook and Twitter constitute an increasingly important element in our overall media repertoires. They provide a technological basis for instant, world-wide, ad hoc, many-to-many communication, and their effect on global communication patterns has already been highlighted. The short-messaging platform Twitter, for example, caters for uses ranging from interpersonal and quasi-private phatic exchanges to ‘ambient journalism’: ad hoc news reporting and dissemination as major events break. Many such uses have themselves emerged through user-driven processes: even standard Twitter conventions such as the @reply (to publicly address a fellow user) or the #hashtag (to collect related messages in an easily accessible space) are in fact user inventions, and were incorporated into Twitter’s own infrastructure only subsequently. This demonstrates the substantial potential of social, user-led innovation in social media spaces.
Abstract:
Fire safety of buildings has been recognised as very important by the building industry and the community at large. Gypsum plasterboards are widely used to protect light gauge steel frame (LSF) walls all over the world. Plasterboard contains gypsum (CaSO4.2H2O) and calcium carbonate (CaCO3), and gypsum holds free and chemically bound water in its crystal structure. The dehydration of gypsum and the decomposition of calcium carbonate absorb heat, and are thus able to protect LSF walls from fires. Kolarkar and Mahendran (2008) developed an innovative composite wall panel system, in which the insulation was sandwiched between two plasterboards to improve the thermal and structural performance of LSF wall panels under fire conditions. To understand the performance of gypsum plasterboards and LSF wall panels under standard fire conditions, many experiments were conducted in the Fire Research Laboratory of Queensland University of Technology (Kolarkar, 2010). Fire tests were conducted on single, double and triple layers of Type X gypsum plasterboards and on load-bearing LSF wall panels under standard fire conditions. However, suitable numerical models had not been developed to investigate the thermal performance of LSF walls using the innovative composite panels under standard fire conditions, and continued reliance on expensive and time-consuming fire tests is not acceptable. Therefore, this research developed suitable numerical models to investigate the thermal performance of both plasterboard assemblies and load-bearing LSF wall panels. SAFIR, a finite element program, was used to investigate the thermal performance of gypsum plasterboard assemblies and LSF wall panels under standard fire conditions. Appropriate values of important thermal properties were proposed for plasterboards and insulations based on laboratory tests, a literature review, and comparisons of finite element analysis results for small-scale plasterboard assemblies from this research with corresponding experimental results from Kolarkar (2010). The important thermal properties (thermal conductivity, specific heat capacity and density) of gypsum plasterboard and insulation materials were proposed as functions of temperature and used in the numerical models of load-bearing LSF wall panels. Using these thermal properties, the developed finite element models were able to accurately predict the time-temperature profiles of plasterboard assemblies, and predicted them reasonably well for load-bearing LSF wall systems despite the many complexities present in these systems under fire. This thesis presents the details of the finite element models of plasterboard assemblies and load-bearing LSF wall panels, including those with the composite panels developed by Kolarkar and Mahendran (2008). It examines and compares the thermal performance of composite panels based on different insulating materials of varying densities and thicknesses using 11 small-scale tests, and makes suitable recommendations for improved fire performance of stud wall panels protected by these composite panels. It also presents the thermal performance data of LSF wall systems and demonstrates the superior performance of LSF wall systems using the composite panels. Using the developed finite element models of LSF walls, this thesis has proposed new LSF wall systems with increased fire rating.
The developed finite element models are particularly useful for comparing the thermal performance of different wall panel systems without time-consuming and expensive fire tests.
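The central modelling ingredient here, thermal properties defined as functions of temperature, can be illustrated with a toy curve (Python/NumPy; the peak positions and magnitudes below are placeholders evoking the gypsum dehydration reactions, not the property values proposed in the thesis):

```python
import numpy as np

def gypsum_specific_heat(T):
    """Illustrative specific heat (J/(kg.K)) of gypsum plasterboard vs
    temperature (deg C). The two Gaussian bumps stand in for the heat
    absorbed by the dehydration reactions; all numbers are placeholders.
    """
    base = 950.0
    dehydration1 = 18000.0 * np.exp(-((T - 150.0) / 25.0) ** 2)
    dehydration2 = 6000.0 * np.exp(-((T - 210.0) / 20.0) ** 2)
    return base + dehydration1 + dehydration2

# Example: tabulate the curve as a temperature-property input table
# of the kind a finite element program consumes.
temps = np.arange(20.0, 1001.0, 20.0)
table = np.column_stack([temps, gypsum_specific_heat(temps)])
```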
Abstract:
Despite multiple efforts, the level of poverty in Bangladesh has remained alarmingly high by any standard. Two salient characteristics of poverty alleviation efforts in Bangladesh are their poor accessibility to the ‘target’ population (the rural poor) and the lack of co-ordination between government and the Non-Government Organisations (NGOs). Where the state alone is unable to combat poverty, the NGOs come into the picture to fill the void. First Britain as the colonial power, then the East Pakistan Government, and finally the Government of Bangladesh promulgated Ordinances and Regulations for the practical regulation of NGOs. The loopholes and flaws within this legal framework have given the NGOs opportunities to violate the Ordinances and Regulations. A better situation could be achieved by modifying and strictly implementing such state rules, ensuring accountability, effective state control, and meaningful NGO-State collaboration and co-operation.
Abstract:
Information technology and its relationship to organisational performance has long been of interest to researchers. While there is concurrence that IT does contribute to performance, and we are steadily expanding our knowledge of the factors that enable better leveraging of IT resources in organisations, we have done little to understand how these factors interact with technology to produce improved performance. Using a structurational lens that recognises the recursive interaction between technology and people in the presence of social practices, and the norms that inform their ongoing practices, we propose an ethnographic approach to understanding the interaction between technology and resources, aiming to provide richer insight into the nature of the environment that promotes better use of IT resources. Such insights could provide IT users with at least an initial conception of the IT usage platform that they could promote in their organisations to get the most from their IT resources.
Abstract:
The business value of IT has mostly been studied in developed countries; however, because most IT investment in developing countries is derived from external sources, the influence of that investment on business value is likely to be different. We test this notion using a two-layer model: we examine the impact of IT investments on firm processes, and the relationship of these processes to firm performance, in a developing country. Our findings suggest that investment in different areas of IT positively relates to improvements in intermediate business processes, and these intermediate business processes positively relate to the overall financial performance of firms in a developing country.
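The two-layer structure can be expressed as a pair of regression stages (a minimal Python/statsmodels sketch; the study's actual estimation approach and variable set are not specified here, so the variable names and the OLS choice are assumptions):

```python
import statsmodels.api as sm

def two_layer_fit(it_investment, process_measures, performance):
    """Stage 1: regress each intermediate process measure on the IT
    investment variables. Stage 2: regress firm performance on the
    process measures.

    it_investment: (n, p) array, one column per IT investment area;
    process_measures: (n, m) array; performance: (n,) array.
    """
    stage1 = [sm.OLS(process_measures[:, j],
                     sm.add_constant(it_investment)).fit()
              for j in range(process_measures.shape[1])]
    stage2 = sm.OLS(performance, sm.add_constant(process_measures)).fit()
    return stage1, stage2
```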
Abstract:
Organisations devote substantial resources to acquiring information technology (IT), and explaining how IT affects performance has posed a significant challenge to information systems (IS) researchers. Owing to the importance of expanding our understanding of how and where IT and IT-related resources impact organisational performance, this study investigates the differential effects of IT resources and IT-related capabilities, in the presence of platform-related complementarities, on business process performance. We test these relationships empirically via a field survey of 216 firms. The findings suggest that IT resources and IT-related capabilities explain variance in performance. Of interest is the finding that the ability of IT resources and IT-related capabilities to explain variance in business process performance is further enhanced by the presence of the platform-related complementarities. Our findings are largely consistent with the resource-based and complementarity arguments regarding the sources of IT-related business value.
Abstract:
Understanding information technology's (IT's) contribution to business value is an imperative issue, and while we have attempted to untangle the relationship between IT and business value with some success, our knowledge of the specific factors leading to IT's contribution to business value remains limited. In this paper we propose that complementing IT resources with capable organisational resources, by establishing a sound IT platform, may aid IT's ability to contribute to business value. We suggest that performance measurement of this contribution be undertaken at the business process level first, and then mapped through to firm-level performance measurement, to obtain a better understanding of the path of IT business value contribution.
Abstract:
We examine the relationship between the effectiveness of IT steering committee-driven IT governance initiatives and firms' IT management and IT infrastructure-related capabilities. We test these relationships empirically via a field survey of 216 firms. The results suggest that the effectiveness of a firm's IT steering committee-driven IT governance initiatives positively relates to the level of its IT-related capabilities. We also found positive relationships between IT-related capabilities and internal process-level performance, which in turn positively relates to improvement in customer service and firm-level performance. For researchers, we demonstrate that resource-based theory provides a more robust explanation of the determinants of firms' IT governance initiatives; it would likewise be well suited to evaluating the effectiveness of other IT governance initiatives in terms of how they contribute to building performance-differentiating IT-related capabilities. For decision makers, we hope our study has reiterated the notion that IT governance is truly a coordinated effort, embracing all levels of human resources.
Abstract:
This book provides a much-needed international dimension on the payoffs of information technology investments. The bulk of the research on the impact of information technology investments has been undertaken in developed economies, mainly the United States; this research provides an alternative, developing country perspective on how information technology investments impact organizations. Secondly, there has been much debate and controversy about how we measure information technology investment payoffs. This research uses an innovative two-stage model, proposing that information technology investments first impact processes, and that improvements in those processes then impact performance. In doing so, it considers individual sectors of information technology investment rather than treating the investment as a single whole. Finally, almost all prior studies in this area have considered only the tangible impact of information technology investments. This research proposes that the benefits can only be properly understood by looking at both the tangible and intangible benefits.
Abstract:
The decision of Dalton J in Lai v Soineva [2011] QSC 247 has resulted in a change in the latest versions of the Real Estate Institute of Queensland (REIQ) contracts.
Abstract:
This article considers how law schools can facilitate the development of technology skills by using technology to enhance access to mooting in settings that replicate legal practice. The authors conducted research into the use of technology by Australian law schools for mooting, and evaluated an internal mooting competition using Elluminate, an online communication platform available to students through Blackboard. The analysis of the survey results and the Elluminate competition demonstrates that technology can be used in mooting to provide an authentic learning experience. The paper concludes that while it is essential to teach technology skills as part of legal education, it is important that the benefits of using technology are made clear in order for it to be accepted and embraced by students. Technology must also be available to all students, given the widening participation in higher education and the consequent increasing diversity of law students.