Abstract:
This is an initial report of the PolyU SD part of the team studying Pre-fabricated Building Design and Construction Methodology, and it marks the completion of Phase 1. It follows our first notes, prepared for the meeting on 2 February, which identified critical issues including future lifestyles, life expectancy of buildings, sustainability, size, flexibility and planning considerations. It is also an expansion of our presentation in Dongguan on 23 February. It is not a comprehensive survey of existing approaches or possible ways forward; rather, it homes in on certain specific issues and gives concrete examples to make the suggestions specific. It is recommended that more comprehensive research be done to establish previous work and experience internationally. It is also recommended that more research be done on lifestyles as a preliminary to developing at least three concepts for evaluation, before proceeding to the detailed design of one concept for full prototyping and market testing. The goal at this point is not to define a single direction but to suggest several future trajectories for further consideration. By the same token, this report is not intended as an exhaustive description of the considerable base of knowledge and ideas brought by the PolyU team to this exciting task. Before taking on an issue of this magnitude and importance in the definition of Hong Kong's future, one must carry out a thoughtful analysis of the issues at hand and an informed definition of the paradigms, directions, goals and methods whereby our energies can best be used in the next steps. This report is the result of that analysis.
Abstract:
With the advent of Service-Oriented Architecture, Web services have gained tremendous popularity. Given the availability of a large number of Web services, finding an appropriate Web service that meets the user's requirement is a challenge, which warrants an effective and reliable process of Web service discovery. A considerable body of research has emerged on methods to improve the accuracy of Web service discovery in matching the best service. The process of Web service discovery typically suggests many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used to describe the services, together with their input and output parameters, can lead to accurate Web service discovery, and appropriate linking of individually matched services should fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology is proposed. The first phase performs match-making to find semantically similar Web services for a user query. To perform semantic analysis on the content of the Web Services Description Language (WSDL) document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find hidden meanings of the query terms that otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the user's requirement; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase.
Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link-analysis phase, the Web services are modelled as nodes of a graph and an all-pair shortest-path algorithm is applied to find the optimum path at the minimum traversal cost. The third phase, system integration, integrates the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. To evaluate the performance of the proposed method, extensive experimentation was performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of the standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results also show that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
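The link-analysis phase described above can be pictured with a minimal sketch: Web services as graph nodes, edge weights as linking costs, and an all-pair shortest-path algorithm (here Floyd-Warshall) to recover the cheapest composition. The service names and costs below are invented for illustration and are not from the thesis.

```python
# Sketch of the link-analysis phase: services as nodes, linking costs as
# edge weights, Floyd-Warshall for all-pair shortest paths.
INF = float("inf")

def all_pair_shortest_paths(nodes, edges):
    """Floyd-Warshall over edges given as a dict of (u, v) -> cost."""
    dist = {u: {v: (0.0 if u == v else INF) for v in nodes} for u in nodes}
    nxt = {u: {v: None for v in nodes} for u in nodes}
    for (u, v), w in edges.items():
        dist[u][v] = w
        nxt[u][v] = v
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
                    nxt[i][j] = nxt[i][k]
    return dist, nxt

def compose(nxt, src, dst):
    """Recover the cheapest chain of services from src to dst."""
    if src != dst and nxt[src][dst] is None:
        return None  # no composition links the two services
    path = [src]
    while src != dst:
        src = nxt[src][dst]
        path.append(src)
    return path

# Illustrative graph: linking A directly to C costs more than going via B.
services = ["A", "B", "C"]
links = {("A", "B"): 2.0, ("B", "C"): 3.0, ("A", "C"): 10.0}
dist, nxt = all_pair_shortest_paths(services, links)
```

Here `compose(nxt, "A", "C")` yields the two-service chain via B, mirroring how a composition of inter-related services can beat any single partial match.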
Abstract:
In this experimental study the permeability of Australian bagasse chemical pulps obtained from different bagasse fractions was measured in a simple permeability cell, and the results were compared to one another as well as to eucalypt, Argentinean bagasse and pine pulps. The pulps were characterised in terms of the permeability parameters: the specific surface area, Sv, and the swelling factor, α. It was found that the bagasse fraction used affects these parameters. Fractionation of whole bagasse prior to pulping produced pulps with permeability properties that compare favourably with eucalypt pulp. The values of Sv and α for bagasse pulp also depend on whether a constant or a variable Kozeny factor is used.
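The parameters named above are conventionally related through a Kozeny-Carman form of the permeability equation. The sketch below shows that arithmetic under stated assumptions: effective porosity taken as 1 − αc (with c the pad concentration), a constant Kozeny factor, and illustrative numerical values; none of the numbers are data from the study.

```python
# Illustrative Kozeny-Carman permeability of a pulp pad from the parameters
# in the abstract: specific surface area Sv and swelling factor alpha.
# Assumptions (not from the study): effective porosity eps = 1 - alpha * c,
# a constant Kozeny factor, and the example values below.

def kozeny_permeability(sv, conc, alpha, kozeny=5.55):
    """k = eps^3 / (K * Sv^2 * (1 - eps)^2), SI units throughout."""
    eps = 1.0 - alpha * conc          # porosity left after fibre swelling
    if not 0.0 < eps < 1.0:
        raise ValueError("effective porosity must lie in (0, 1)")
    return eps ** 3 / (kozeny * sv ** 2 * (1.0 - eps) ** 2)

# Denser pads (higher concentration) should be less permeable.
k_dilute = kozeny_permeability(sv=5.0e5, conc=0.010, alpha=3.0)
k_dense = kozeny_permeability(sv=5.0e5, conc=0.020, alpha=3.0)
```

The form makes the abstract's closing point concrete: the computed k shifts if the Kozeny factor K is treated as concentration-dependent rather than constant.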
Abstract:
Amphibian is a 10′00″ musical work which explores new musical interfaces and approaches to hybridising performance practices from the popular music, electronic dance music and computer music traditions. The work is designed to be presented in a range of contexts associated with the electro-acoustic, popular and classical music traditions. It is scored for two performers using two synchronised laptops, an electric guitar and a custom-designed gestural interface for vocal performers, the e-Mic (Extended Mic-stand Interface Controller). This interface was developed by one of the co-authors, Donna Hewitt. The e-Mic allows a vocal performer to manipulate the voice in real time through the capture of physical gestures via an array of sensors (pressure, distance, tilt) along with ribbon controllers and an X-Y joystick microphone mount. Performance data are then sent to a computer running audio-processing software, which is used to transform the audio signal from the microphone. In this work, data are also exchanged between performers via a local wireless network, allowing the performers to work with shared data streams. The duo employs the gestural conventions of guitarist and singer (i.e. 'a band' in a popular music context), but transforms these sounds and gestures into new digital music. The gestural language of popular music is deliberately subverted and taken into a new context. The piece thus explores the nexus between the sonic and performative practices of electro-acoustic music and intelligent electronic dance music ('idm'). This work was situated in the research fields of new musical interfacing, interaction design, and experimental music composition and performance. The contexts in which the research was conducted were live musical performance and studio music production. The work investigated new methods for musical interfacing, performance data mapping, and hybrid performance and compositional practices in electronic music. The research methodology was practice-led.
New insights were gained from the iterative experimental workshopping of gestural inputs, musical data mapping, inter-performer data exchange, software patch design, and data and audio processing chains. In respect of interfacing, there were innovations in the design and implementation of a novel sensor-based gestural interface for singers, the e-Mic, one of the only existing gestural controllers for singers. The work explored the compositional potential of sharing real-time performance data between performers and deployed novel methods for inter-performer data exchange and mapping. As regards stylistic and performance innovation, the work explored and demonstrated an approach to hybridising the gestural and sonic language of popular music with recent 'post-digital' approaches to laptop-based experimental music. The development of the work was supported by an Australia Council grant. Research findings have been disseminated via a range of international conference publications, recordings, radio interviews (ABC Classic FM), broadcasts, and performances at international events and festivals. The work was curated into the major Australian international festival Liquid Architecture, and was selected by an international music jury (through blind peer review) for presentation at the International Computer Music Conference in Belfast, N. Ireland.
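The data-mapping work described above follows a pattern common across gestural interfacing: clamp a raw sensor reading, normalise it, and rescale it (optionally through a response curve) into an audio-parameter range. The sketch below is a generic illustration of that pattern; the sensor ranges and parameter names are hypothetical, not the e-Mic's actual specification.

```python
# Generic sensor-to-parameter mapping of the kind used in gestural interfaces:
# clamp, normalise, apply a response curve, rescale. The ranges below are
# hypothetical examples, not the e-Mic's actual sensor specification.

def map_range(value, in_lo, in_hi, out_lo, out_hi, curve=1.0):
    """Map a raw reading into an output range; curve > 1 eases in the low end."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = min(max(t, 0.0), 1.0)          # clamp out-of-range sensor noise
    return out_lo + (t ** curve) * (out_hi - out_lo)

# e.g. a 10-bit pressure sensor driving a filter cutoff in Hz
cutoff = map_range(512, 0, 1023, 200.0, 8000.0, curve=2.0)
```

In a shared-data setting, the same normalised value t can be broadcast over the network so a second performer maps it onto a different parameter entirely.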
Abstract:
For the last two decades heart disease has been the highest single cause of death for the human population. With an alarming number of patients requiring heart transplants, and donations unable to satisfy the demand, treatment looks to mechanical alternatives. Rotary Ventricular Assist Devices (VADs) are miniature pumps which can be implanted alongside the heart to assist its pumping function. These constant-flow devices are smaller, more efficient and promise a longer operational life than the more traditional pulsatile VADs. The development of rotary VADs has focused on single pumps assisting the left ventricle only, to supply blood to the body. In many patients, however, failure of both ventricles demands that an additional pulsatile device be used to support the failing right ventricle. This condition renders them hospital-bound while they wait for an unlikely heart donation. Reported attempts to use two rotary pumps to support both ventricles concurrently have warned of inherent haemodynamic instability: poor balancing of the pumps' flow rates quickly leads to vascular congestion, increasing the risk of oedema, and to ventricular 'suckdown' occluding the inlet to the pump. This thesis introduces a novel Bi-Ventricular Assist Device (BiVAD) configuration in which the pump outputs are passively balanced by vascular pressure. The BiVAD consists of two rotary pumps straddling a mechanical passive controller. Fluctuations in vascular pressure induce small deflections within both pumps, adjusting their outputs and allowing them to maintain arterial pressure. To optimise the passive controller's interaction with the circulation, its dynamic response is tuned with a spring, mass, damper arrangement. This two-part study presents a comprehensive assessment of the prototype's 'viability' as a support device, considered in terms of its sensitivity to pathogenic haemodynamics and the ability of the passive response to maintain healthy circulation.
The first part of the study is an experimental investigation in which a prototype device was designed, built, and then tested in a pulsatile mock circulation loop. The BiVAD was subjected to a range of haemodynamic imbalances as well as a dynamic analysis to assess the functionality of the mechanical damper. The second part introduces the development of a numerical program to simulate human circulation supported by the passively controlled BiVAD. Both investigations showed that the prototype was able to mimic the native baroreceptor response. Simulating hypertension, poor flow balancing and subsequent ventricular failure during BiVAD support allowed the passive controller's response to be assessed. Triggered by the resulting pressure imbalance, the controller responded by passively adjusting the VAD outputs in order to maintain healthy arterial pressures. This baroreceptor-like response demonstrated the inherent stability of the auto-regulating BiVAD prototype. Simulating pulmonary hypertension in the more observable numerical model, however, revealed a serious issue with the passive response: the subsequent decrease in venous return to the left heart went unnoticed by the passive controller. Meanwhile, the coupled nature of the passive response not only decreased RVAD output to reduce pulmonary arterial pressure, but also increased LVAD output. Consequently, the LVAD increased fluid evacuation from the left ventricle (LV) and so actually accelerated the onset of LV collapse. It was concluded that despite the inherently stable baroreceptor-like response of the passive controller, its lack of sensitivity to venous return made it unviable in its present configuration. The study revealed a number of other important findings. Perhaps the most significant was that the reduced pulse experienced during constant-flow support unbalanced the ratio of effective resistances of the two vascular circuits.
Even during steady rotary support, therefore, the resulting ventricle volume imbalance increased the likelihood of suckdown. Additionally, mechanical damping of the passive controller's response successfully filtered out pressure fluctuations from residual ventricular function. Finally, the importance of recognising inertial contributions to blood flow in the atria and ventricles in a numerical simulation was highlighted. This thesis documents the first attempt to create a fully auto-regulated rotary cardiac assist device. Initial results encourage development of an inlet configuration sensitive to low flow, such as collapsible inlet cannulae. Combining this with the existing baroreceptor-like response of the passive controller should yield a highly stable, passively controlled BiVAD configuration. The prototype controller's passive interaction with the vasculature is a significant step towards a highly stable new generation of artificial heart.
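The spring, mass, damper tuning of the passive controller mentioned above can be pictured with a minimal simulation: a single-degree-of-freedom system m·x″ + c·x′ + k·x = F integrated with semi-implicit Euler. The parameter values are illustrative stand-ins, not the prototype controller's.

```python
# Minimal single-degree-of-freedom spring-mass-damper simulation, as a
# stand-in for tuning a passive controller's dynamic response. All
# parameters are illustrative, not the BiVAD prototype's.

def simulate_msd(mass, damping, stiffness, force, dt=1e-3, steps=5000):
    """Semi-implicit Euler integration of m*x'' + c*x' + k*x = F.
    Returns the displacement after `steps` time steps."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        a = (force - damping * v - stiffness * x) / mass
        v += a * dt        # update velocity first (semi-implicit Euler)
        x += v * dt
    return x

# A step load settles toward the static deflection F/k.
x_final = simulate_msd(mass=1.0, damping=2.0, stiffness=50.0, force=10.0)
```

Sweeping the damping coefficient in such a model shows the trade-off the thesis exploits: enough damping to filter pulsatile pressure fluctuations, but not so much that the controller's deflection lags the vascular pressure it must track.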
Abstract:
Construction is an information-intensive industry in which the accuracy and timeliness of information are paramount. The main communication issue in construction is to provide a method of exchanging data between the site operation, the site office and the head office. The information needs under consideration are time-critical and assist in maintaining or improving efficiency at the jobsite; without appropriate computing support, problem solving becomes more difficult. Many researchers focus on the use of mobile computing devices in the construction industry, believing that mobile computers have the potential to solve some of the construction problems that reduce overall productivity. To date, however, very limited observation has been conducted of the deployment of mobile computers for construction workers on-site. Providing field workers with accurate, reliable and timely information at the location where it is needed supports effectiveness and efficiency at the job site. Bringing a new technology into the construction industry requires not only a better understanding of the application, but also proper preparation for the allocation of resources such as people and investment. With this in mind, an accurate analysis is needed to provide a clear idea of the overall costs and benefits of the new technology. A cost benefit analysis is a method of evaluating the relative merits of a proposed investment project in order to achieve an efficient allocation of resources. It is a way of identifying, portraying and assessing the factors which need to be considered in making rational economic choices.
In principle, a cost benefit analysis is a rigorous, quantitative and data-intensive procedure which requires the identification of all potential effects, categorisation of these effects as costs and benefits, quantitative estimation of the extent of each cost and benefit associated with an action, translation of these into a common metric such as dollars, discounting of future costs and benefits into the terms of a given year, and a summation of all costs and benefits to see which is greater. Even though many cost benefit analysis methodologies are available for general assessment, there is no specific methodology that can be applied to analysing the costs and benefits of the application of mobile computing devices on the construction site. Hence, the methodology proposed in this document is predominantly adapted from Baker et al. (2000), the Department of Finance (1995), and the Office of Investment Management (2005). The methodology is divided into four main stages and then detailed in ten steps. It is provided for the CRC CI 2002-057-C Project: Enabling Team Collaboration with Pervasive and Mobile Computing, and can be seen in detail in Section 3.
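The discounting and summation steps described above reduce to a net-present-value calculation. The sketch below shows that arithmetic; the cash flows and the 7% discount rate are invented for illustration and are not figures from the CRC CI project.

```python
# Discounting future costs and benefits to a common (present-value) metric
# and summing them to see which is greater, as in the procedure described.
# All figures and the discount rate are hypothetical illustrations.

def npv(rate, cashflows):
    """Present value of (year, amount) cash flows; year 0 is undiscounted."""
    return sum(amount / (1.0 + rate) ** year for year, amount in cashflows)

costs = [(0, 120000.0), (1, 15000.0), (2, 15000.0)]     # rollout + support
benefits = [(1, 70000.0), (2, 70000.0), (3, 70000.0)]   # productivity gains
net_benefit = npv(0.07, benefits) - npv(0.07, costs)    # > 0 favours investing
```

A positive `net_benefit` is the "which is greater" test of the procedure; sensitivity to the assumed rate is usually checked by recomputing over a range of discount rates.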
Abstract:
In the previous research, CRC CI 2001-010-C “Investment Decision Framework for Infrastructure Asset Management”, a method for assessing variation in cost estimates for road maintenance and rehabilitation was developed. The variability of pavement strength collected from a 92 km national highway was used in the analysis to demonstrate the concept. Further analysis was conducted to identify critical input parameters that significantly affect the prediction of road deterioration. In addition to pavement strength, rut depth, annual traffic loading and initial roughness were found to be critical input parameters for road deterioration. This report presents a method developed to incorporate other critical parameters in the analysis, such as unit costs, which are suspected to contribute to a certain degree to cost estimate variation; accordingly, the variability of unit costs is incorporated in this analysis. The Bruce Highway, located on the tropical east coast of Queensland, has been identified as the network for the analysis. This report presents a step-by-step methodology for assessing variation in road maintenance and rehabilitation cost estimates.
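One common way to propagate unit-cost variability into a cost estimate is Monte Carlo sampling: draw unit costs from an assumed distribution and observe the spread of the resulting totals. The sketch below is a generic illustration of that idea; the distribution, its parameters and the quantity are hypothetical, not Bruce Highway data or the report's method.

```python
# Simple Monte Carlo treatment of unit-cost variability in a maintenance
# cost estimate: unit cost drawn from an assumed normal distribution,
# total cost = unit cost * quantity. Parameters are hypothetical.
import random

def simulate_costs(mean_unit_cost, cv, quantity, n=10000, seed=42):
    """Return (mean, standard deviation) of the simulated total cost,
    with unit cost ~ Normal(mean, cv * mean), truncated at zero."""
    rng = random.Random(seed)
    totals = [max(rng.gauss(mean_unit_cost, cv * mean_unit_cost), 0.0) * quantity
              for _ in range(n)]
    mean = sum(totals) / n
    sd = (sum((t - mean) ** 2 for t in totals) / (n - 1)) ** 0.5
    return mean, sd

# e.g. a treatment at ~$25/m^2 with 20% unit-cost variability over 1000 m^2
mean_total, sd_total = simulate_costs(mean_unit_cost=25.0, cv=0.2, quantity=1000.0)
```

The standard deviation of the totals is the contribution of unit-cost variability to the estimate; adding draws for pavement strength, rut depth and traffic loading would combine the other critical parameters in the same way.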
Abstract:
This paper introduces the Corporate Culture Change Cycle: a continuous innovation methodology for transforming the psychological contract in an organisational context. The eight-step process is based on the action learning model. The purpose of this methodology is to benchmark the psychological contract against eight changing values of the employment relationship, as a basis for facilitating a process of aligning the changing needs of employer and employee. Both the Corporate Culture Change Cycle and the New Employment Relationship Model have been validated in several organisational settings and subsequently refined. This continuous innovation methodology addresses gaps in the psychological contract, change management and continuous innovation research literatures. The approach should therefore be of interest to researchers in these fields of study and, from a practical perspective, to managers wishing to transform their workplace cultures.
Abstract:
Biological tissues are subjected to complex loading states in vivo, and in order to define constitutive equations that effectively simulate their mechanical behaviour under these loads, it is necessary to obtain data on the tissue's response to multiaxial loading. Single-axis and shear testing of biological tissues is often carried out, but biaxial testing is less common. We sought to design and commission a biaxial compression testing device capable of obtaining repeatable data for biological samples. The apparatus comprised a sealed stainless steel pressure vessel specifically designed such that a state of hydrostatic compression could be created on the test specimen while simultaneously unloading the sample along one axis with an equilibrating tensile pressure. Thus a state of equibiaxial compression was created perpendicular to the long axis of a rectangular sample. For the purpose of calibrating and commissioning the vessel, rectangular samples of closed-cell ethylene vinyl acetate (EVA) foam were tested. Each sample was subjected to repeated loading, and nine separate biaxial experiments were carried out to a maximum pressure of 204 kPa (30 psi), with a relaxation time of two hours between them. Calibration testing demonstrated that the force applied to the samples had a maximum error of 0.026 N (0.423% of the maximum applied force). Under repeated loading, the foam sample demonstrated lower stiffness during the first load cycle; following this cycle, a stiffer, repeatable response was observed with successive loading. While the experimental protocol was developed for EVA foam, preliminary results on this material suggest that the device may be capable of providing test data for biological tissue samples. The load response of the foam was characteristic of closed-cell foams, with consolidation during the early loading cycles, then a repeatable load-displacement response upon repeated loading.
The repeatability of the results demonstrated the ability of the device to provide reproducible data, and the low experimental error in the force demonstrated the reliability of the test data.
Abstract:
This paper aims to develop a methodology and strategy for concurrent finite element modeling of civil infrastructure at different scale levels for the analysis of structural deterioration. The modeling strategy and method were investigated to develop a concurrent multi-scale model of structural behavior (CMSM-of-SB) in which the global structural behavior and the nonlinear damage features of local details in a large, complicated structure can be analyzed concurrently, to meet the needs of structural-state evaluation as well as deterioration analysis. In the proposed method, “large-scale” modeling is adopted for the global structure, with linear responses between stress and strain, while “small-scale” modeling is used for nonlinear damage analysis of the local welded details. A longitudinal truss in a steel bridge deck was selected as a case study of how a CMSM-of-SB is developed. A reduced-scale specimen of the longitudinal truss was studied in the laboratory to measure its dynamic and static behavior in the global truss and the local welded details, while multi-scale models using constraint equations and substructuring were developed for numerical simulation. Comparison of the dynamic and static responses calculated by the different models indicated that the proposed multi-scale model was the most efficient and accurate. Verification of the model against results from the tested truss under specific loading showed that responses at the material scale in the vicinity of local details, as well as global structural behavior, could be obtained and fitted well with the measured results. The proposed concurrent multi-scale modeling strategy and implementation procedures were applied to the Runyang cable-stayed bridge (RYCB), and a CMSM-of-SB of the bridge deck system was constructed as a practical application.
Abstract:
This paper is a continuation of the paper titled “Concurrent multi-scale modeling of civil infrastructure for analyses on structural deteriorating—Part I: Modeling methodology and strategy”, with the emphasis on model updating and verification for the developed concurrent multi-scale model. The sensitivity-based parameter updating method was applied, and important issues such as the selection of reference data and model parameters, and the model updating procedure for the multi-scale model, were investigated based on a sensitivity analysis of the selected model parameters. The experimental modal data, as well as static responses in terms of component nominal stresses and hot-spot stresses at the locations of concern, were used for dynamic response- and static response-oriented model updating, respectively. The updated multi-scale model was further verified to act as the baseline model, assumed to be the finite-element model closest to the real state of the structure, for subsequent numerical simulation. Comparison of the dynamic and static responses calculated with the final model against the measured data indicated that the updating and verification methods applied in this paper are reliable and accurate for multi-scale models of frame-like structures. General procedures for multi-scale model updating and verification were finally proposed for nonlinear physics-based modeling of large civil infrastructure, and they were applied to the model verification of a long-span bridge as an engineering application of the proposed procedures.
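The sensitivity-based updating step can be illustrated in miniature: perturb a model parameter, form a finite-difference sensitivity of the response, and correct the parameter so the model reproduces the measurement (a scalar Gauss-Newton iteration). The toy frequency model and all numbers below are stand-ins for illustration, not the paper's finite-element model or data.

```python
# Toy illustration of sensitivity-based model updating: perturb a parameter,
# build a finite-difference sensitivity, and correct the parameter so the
# model response matches the measurement. The 1-DOF frequency model and the
# numbers are hypothetical stand-ins for a finite-element model.
import math

def update_parameter(model, theta, measured, rel_step=1e-6, iters=5):
    """Scalar Gauss-Newton updating: theta <- theta + residual / sensitivity."""
    for _ in range(iters):
        residual = measured - model(theta)
        h = rel_step * abs(theta)              # perturbation sized to theta
        sensitivity = (model(theta + h) - model(theta)) / h
        theta += residual / sensitivity
    return theta

def freq_model(k, m=1000.0):
    """First natural frequency (Hz) of a 1-DOF stand-in model."""
    return math.sqrt(k / m) / (2.0 * math.pi)

# Update the stiffness so the model reproduces a 'measured' 10.5 Hz mode.
k_updated = update_parameter(freq_model, 4.0e6, measured=10.5)
```

With several parameters and several reference responses (modal frequencies, nominal and hot-spot stresses), the scalar sensitivity becomes a matrix and the update a least-squares solve, which is where the selection of reference data and parameters discussed in the paper becomes critical.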
Abstract:
Construction projects face a challenge that must not be underestimated. These projects are increasingly competitive, more complex, and difficult to manage, and they pose problems that are difficult to solve using traditional approaches. Soft Systems Methodology (SSM) is a systems approach used for analysis and problem solving in such complex and messy situations. SSM uses “systems thinking” in a cycle of action research, learning and reflection to help understand the various perceptions that exist in the minds of the different people involved in the situation. This paper examines the benefits of applying SSM to problems of knowledge management in construction project management, especially situations that are challenging to understand and difficult to act upon. It includes five case studies of its use in dealing with confusing situations that incorporate human, organizational and technical aspects.
Abstract:
This paper reports on the early stages of a design experiment in educational assessment that challenges the dichotomous legacy evident in many assessment activities. Combining social networking technologies with the sociology of education, the paper proposes that assessment activities are best understood as a negotiable field of exchange. In this design experiment, students, peers and experts engage in explicit, “front-end” assessment (Wyatt-Smith, 2008) to translate holistic judgments into institutional, and potentially economic, capital without adhering to long lists of pre-set criteria. This approach invites participants to use social networking technologies to judge creative works using scatter graphs, keywords and tag clouds. In doing so, assessors will refine their evaluative expertise and negotiate the characteristics of creative works from which criteria will emerge (Sadler, 2008). The real-time advantages of web-based technologies will aggregate, externalise and democratise this transparent method of assessment for most, if not all, creative works that can be represented in a digital format.