969 results for pump-probe technology
Abstract:
Bovine intestine samples were dried in a heat pump fluidized bed at atmospheric pressure, at temperatures below and above the material's freezing point, using an apparatus equipped with a continuous monitoring system. The drying characteristics were investigated over the temperature range -10 to 25 °C and airflow velocities of 1.5 to 2.5 m/s. Some experiments were conducted as single-temperature drying runs and others as two-stage runs employing two temperatures. An Arrhenius-type equation was used to interpret the influence of the drying air parameters on the effective diffusivity, calculated with the method of slopes, in terms of activation energy, and this was found to be sensitive to temperature. The effective diffusion coefficient of moisture transfer was determined by the Fickian method assuming one-dimensional moisture movement, for both moisture removal by evaporation and by combined sublimation and evaporation. Correlations expressing the effective moisture diffusivity as a function of drying temperature are reported.
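The Arrhenius-type analysis described above can be sketched numerically: linearising D_eff = D0·exp(-Ea/RT) and regressing ln(D_eff) against 1/T yields the activation energy from the slope. The diffusivity and temperature values below are hypothetical illustrations, not data from the study.

```python
import numpy as np

# Arrhenius-type model for effective moisture diffusivity:
#   D_eff = D0 * exp(-Ea / (R * T))
# Linearised: ln(D_eff) = ln(D0) - (Ea / R) * (1 / T)
R = 8.314  # universal gas constant, J/(mol*K)

# Hypothetical effective diffusivities (m^2/s) at several drying
# temperatures (K) -- illustrative values only, not from the study.
T = np.array([263.15, 273.15, 283.15, 298.15])
D_eff = np.array([1.2e-10, 2.5e-10, 4.8e-10, 1.1e-9])

# Linear regression of ln(D_eff) against 1/T: slope = -Ea/R.
slope, intercept = np.polyfit(1.0 / T, np.log(D_eff), 1)
Ea = -slope * R         # activation energy, J/mol
D0 = np.exp(intercept)  # pre-exponential factor, m^2/s

print(f"Activation energy: {Ea / 1000:.1f} kJ/mol")
```

This is the "method of slopes" idea in miniature: the temperature sensitivity of the effective diffusivity is captured entirely by the fitted activation energy.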
Abstract:
Focuses on various aspects of advances in future information communication technology and its applications. Presents the latest issues and progress in the area of future information communication technology. Applicable to both researchers and professionals. These proceedings are based on the 2013 International Conference on Future Information & Communication Engineering (ICFICE 2013), held in Shenyang, China, from June 24-26, 2013. The conference was open to participants from all over the world, with participation from the Asia-Pacific region particularly encouraged. The focus of the conference was on all technical aspects of electronics, information, and communications. ICFICE-13 provided an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of FICE. In addition, the conference published high-quality papers closely related to the various theories and practical applications in FICE. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject. "This work was supported by the NIPA (National IT Industry Promotion Agency) of Korea Grant funded by the Korean Government (Ministry of Science, ICT & Future Planning)."
Abstract:
Knowledge Integration (KI) is one of the major aspects driving innovation within an organisation. In this paper, we attempt to develop a better understanding of the challenges of knowledge integration within the innovation process in technology-based firms. Using four technology-based Australian firms, we investigated how knowledge integration may be managed within the context of innovation in technology firms. The literature highlights the role of four KI tasks that affect the innovation capability of technology-oriented firms, namely team-building capability, capturing tacit knowledge, the role of KM systems, and technological systemic integration. Our findings indicate that, in addition to these four tasks, a strategic approach to integrating knowledge for innovation, as well as leadership and management, are essential to achieving effective KI across multiple levels of engagement. Our findings also offer practical insights into how knowledge can be integrated within the innovation process.
Abstract:
Social media tools are often the result of innovations in Information Technology, developed by IT professionals and innovators. Nevertheless, IT professionals, many of whom are responsible for designing and building social media technologies, have not been studied with respect to how they themselves use or experience social media for professional purposes. This study uses Information Grounds Theory (Pettigrew, 1998) as a framework to study IT professionals' experience of using social media for professional purposes. Information grounds facilitate the opportunistic discovery of information within social settings created temporarily at a place where people gather for a specific purpose (e.g., doctors' waiting rooms, office tea rooms), where the social atmosphere stimulates the spontaneous sharing of information (Pettigrew, 1999). This study proposes that social media has the qualities of a rich information ground: people participate from separate "places" in cyberspace in a synchronous manner, in real time, making it almost as dynamic and unplanned as a physical information ground. There is limited research on how social media platforms are perceived as a "place" (a place to go to, a place to gather, or a place to be seen in) comparable to physical spaces. There is also no empirical study of how IT professionals use or "experience" social media. The data for this study is being collected through a study of IT professionals who currently use Twitter. A digital ethnography approach is being taken, wherein the researcher uses online observations, "follows" the participants online, and observes their behaviours and interactions on social media. Next, a sub-set of participants will be interviewed on their experiences with and within social media, and on how social media compares with traditional information grounds, information communication methods, and collaborative environments.
An Evolved Grounded Theory (Glaser, 1992) approach will be used to analyse the tweet data and interviews, and to map the findings against Information Grounds Theory. Findings from this study will provide a foundational understanding of IT professionals' experiences within social media, and can help both professionals and researchers understand this fast-evolving method of communication.
Abstract:
Cardiovascular diseases are a leading cause of death throughout the developed world. With the demand for donor hearts far exceeding the supply, a bridge-to-transplant or permanent solution is required. This is currently achieved with ventricular assist devices (VADs), which can be used to assist the left ventricle (LVAD), the right ventricle (RVAD), or both ventricles simultaneously (BiVAD). Earlier generations of VADs were large, volume-displacement devices designed for temporary support until a donor heart was found. The latest generation of VADs uses rotary blood pump technology, which improves device lifetime and the quality of life of end-stage heart failure patients. VADs are connected to the heart and great vessels of the patient through specially designed tubes called cannulae. The inflow cannulae, which supply blood to the VAD, are usually attached to the left atrium or ventricle for LVAD support, and to the right atrium or ventricle for RVAD support. Few studies have characterized the haemodynamic differences between the two cannulation sites, particularly with respect to rotary RVAD support. Inflow cannulae are usually made of metal or a semi-rigid polymer to prevent collapse under negative pressures. However, suction, and subsequent collapse, of the cannulated heart chamber can be a frequent occurrence, particularly with the relatively preload-insensitive rotary blood pumps. Suction events may be associated with endocardial damage, pump flow stoppages and ventricular arrhythmias. While several VAD control strategies are under development, these usually rely on potentially inaccurate sensors or somewhat unreliable inferred data to estimate preload. Fixation of the inflow cannula is usually achieved by suturing the cannula, often via a felt sewing ring, to the cannulated chamber. This technique extends the time on cardiopulmonary bypass, which is associated with several postoperative complications.
The overall objective of this thesis was to improve the placement and design of rotary LVAD and RVAD inflow cannulae to achieve enhanced haemodynamic performance, a reduced incidence of suction events, reduced postoperative bleeding and a faster implantation procedure. Specific objectives were:
* in-vitro evaluation of LVAD and RVAD inflow cannula placement;
* design and in-vitro evaluation of a passive mechanism to reduce the potential for heart chamber suction;
* design and in-vitro evaluation of a novel suture-less cannula fixation device.
To complete the in-vitro evaluation of VAD inflow cannulae, a mock circulation loop (MCL) was developed to accurately replicate the haemodynamics of the human systemic and pulmonary circulations. Validation of the MCL's haemodynamic performance, including the form and magnitude of pressure, flow and volume traces, was completed through comparisons with patient data and the literature. The MCL was capable of reproducing almost any healthy or pathological condition, and provided a useful tool to evaluate VAD cannulation and other cardiovascular devices. The MCL was used to evaluate inflow cannula placement for rotary VAD support. Left and right atrial and ventricular cannulation sites were evaluated under conditions of mild and severe heart failure. With a view to long-term LVAD support in the severe left heart failure condition, left ventricular inflow cannulation was preferred due to improved LVAD efficiency and a reduced potential for thrombus formation. In the mild left heart failure condition, left atrial cannulation was preferred, to provide an improved platform for myocardial recovery. Similar trends were observed with RVAD support, though to a lesser degree due to the smaller difference between right atrial and ventricular pressures. A compliant inflow cannula to prevent suction events was then developed and evaluated in the MCL.
As rotary LVAD or RVAD preload was reduced, suction events occurred in all instances with a rigid inflow cannula. Addition of the compliant segment eliminated suction events in all instances. This was due to passive restriction of the compliant segment as preload dropped, which increased the VAD circuit resistance and decreased the VAD flow rate. The compliant inflow cannula therefore acted as a passive flow control / anti-suction system in LVAD and RVAD support. A novel suture-less inflow cannula fixation device was then developed to reduce implantation time and postoperative bleeding. The fixation device was evaluated for LVAD and RVAD support in cadaveric animal and human hearts attached to a MCL. LVAD inflow cannulation was achieved in under two minutes with the suture-less fixation device. No leakage through the interface between the suture-less fixation device and the myocardium was noted. Continued development and in-vivo evaluation of this device may result in an improved inflow cannulation technique with the potential for off-bypass insertion. Continued development of this research, in particular the compliant inflow cannula and the suture-less inflow cannulation device, should result in improved postoperative outcomes, lifespan and quality of life for end-stage heart failure patients.
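The passive flow-control mechanism described above can be sketched as a toy lumped-parameter model: cannula resistance rises steeply as preload falls, throttling VAD flow before the chamber can collapse. The resistance law, pressure values and pump parameters below are invented for illustration and are not taken from the thesis.

```python
# Toy model of the compliant inflow cannula: as preload (inlet pressure)
# falls, the compliant segment narrows, raising circuit resistance and
# throttling VAD flow. All numbers are illustrative, not from the thesis.

def compliant_resistance(preload_mmhg, r_base=1.0, k=2.0):
    """Resistance (mmHg*s/mL) that rises steeply as preload approaches zero."""
    preload = max(preload_mmhg, 0.1)  # guard against division by zero
    return r_base * (1.0 + k / preload)

def vad_flow(preload_mmhg, pump_dp_mmhg=80.0):
    """Flow (mL/s) through the cannula for a fixed pump pressure rise."""
    return (preload_mmhg + pump_dp_mmhg) / compliant_resistance(preload_mmhg)

# Flow falls smoothly as preload drops, instead of the pump sucking the
# chamber flat as a rigid cannula would allow.
for preload in (15.0, 8.0, 3.0, 1.0):
    print(f"preload {preload:4.1f} mmHg -> flow {vad_flow(preload):5.1f} mL/s")
```

The design point the thesis makes is that this behaviour is purely passive: no sensors or inferred preload estimates are needed, unlike the active control strategies it contrasts against.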
Abstract:
Reliability analysis is crucial to reducing unexpected downtime, severe failures and ever-tightening maintenance budgets for engineering assets. Hazard-based reliability methods are of particular interest, as hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution accurately describes the population concerned, and that the effects of covariates on hazards take an assumed form. These two assumptions may be difficult to satisfy and can therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations imposed by the two assumptions of statistical models. With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, due to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research therefore investigates the incomplete covariate problem in reliability analysis. Typical approaches to handling incomplete covariates were studied to investigate their performance and their effects on reliability analysis results.
Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainty into reliability analysis, the developed NNHMs are extended to include the handling of incomplete covariates as an integral part. The extended versions of the NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical approaches to handling incomplete covariates. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation due to the influence of both engineering degradation and changes in environmental settings. The commonly used covariate extrapolation methods are thus unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, projection of covariate states is conducted in this research. The estimated covariate states and the unknown covariate values at future running steps of assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill demonstrates that this new multi-step reliability analysis procedure is able to generate more accurate analysis results.
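The core idea of a neural-network hazard model — a network mapping covariates directly to a non-negative hazard, with no assumed baseline distribution and no assumed form for covariate effects — can be sketched minimally as below. The architecture, hand-set weights and covariate values are invented for illustration; they are not the NNHMs developed in the thesis.

```python
import numpy as np

# Minimal feed-forward hazard model: covariates -> hidden layer -> hazard.
# A softplus output keeps the hazard non-negative, and the network imposes
# neither a baseline failure distribution nor a proportional-effects form.
rng = np.random.default_rng(0)

def softplus(x):
    return np.log1p(np.exp(x))

def nn_hazard(covariates, W1, b1, w2, b2):
    """covariates: (n, d) array -> (n,) non-negative hazard values."""
    hidden = np.tanh(covariates @ W1 + b1)
    return softplus(hidden @ w2 + b2)

d, h = 2, 4                       # 2 covariates (e.g. age, vibration level)
W1 = rng.normal(size=(d, h)) * 0.5  # untrained, illustrative weights
b1 = np.zeros(h)
w2 = rng.normal(size=h)
b2 = -1.0

# Hypothetical assets: (age in years, normalised vibration amplitude).
assets = np.array([[1.0, 0.1],    # young, healthy
                   [8.0, 0.9]])   # old, degraded
hazards = nn_hazard(assets, W1, b1, w2, b2)
print(hazards)
```

In practice such a network would be trained on failure and condition-monitoring data; the sketch only shows the functional form that replaces the two statistical assumptions discussed above.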
Abstract:
This article provides a general overview of some of the plant research being conducted by a number of researchers at the Queensland University of Technology (QUT), Brisbane. Details about student projects and research facilities have been limited to those of relevance to plant structure and systematics. Academics, technicians and research students involved in plant research are in the Faculty of Science and Engineering, mainly in the School of Earth, Environment and Biological Sciences (EEBS), with a few exceptions. Our offices and laboratories are housed in a number of different buildings at the Gardens Point campus (e.g., P, Q, R, S, M Blocks) and we have strong collaborative links with the Queensland Herbarium (BRI) and Mt Coot-tha Botanic Gardens.
Abstract:
The nature and characteristics of how learners learn today are changing. As technology use in learning and teaching continues to grow, its integration to facilitate deep learning and critical thinking becomes a primary consideration. The implications for learner use, implementation strategies, design of integration frameworks and evaluation of their effectiveness in learning environments cannot be overlooked. This study specifically looked at the impact that technology-enhanced learning environments have on different learners’ critical thinking in relation to eductive ability, technological self-efficacy, and approaches to learning and motivation in collaborative groups. These were explored within an instructional design framework called CoLeCTTE (collaborative learning and critical thinking in technology-enhanced environments) which was proposed, revised and used across three cases. The field of investigation was restricted to three key questions: 1) Do learner skill bases (learning approach and eductive ability) influence critical thinking within the proposed CoLeCTTE framework? If so, how?; 2) Do learning technologies influence the facilitation of deep learning and critical thinking within the proposed CoLeCTTE framework? If so, how?; and 3) How might learning be designed to facilitate the acquisition of deep learning and critical thinking within a technology-enabled collaborative environment? The rationale, assumptions and method of research for using a mixed method and naturalistic case study approach are discussed; and three cases are explored and analysed. The study was conducted at the tertiary level (undergraduate and postgraduate) where participants were engaged in critical technical discourse within their own disciplines. Group behaviour was observed and coded, attributes or skill bases were measured, and participants interviewed to acquire deeper insights into their experiences. 
A progressive case study approach was used, allowing case investigation to be implemented in a "ladder-like" manner. Cases 1 and 2 used the proposed CoLeCTTE framework, with more in-depth analysis conducted for Case 2, resulting in a revision of the CoLeCTTE framework. Case 3 used the revised CoLeCTTE framework and in-depth analysis was conducted. The findings led to the final version of the framework. In Cases 1, 2 and 3, content analysis of group work was conducted to determine critical thinking performance. The researcher thus used three small groups in which the learner skill bases of eductive ability, technological self-efficacy, and approaches to learning and motivation were measured. Cases 2 and 3 participants were interviewed, and observations provided more in-depth analysis. The main outcome of this study is an analysis of the nature of critical thinking within collaborative groups and technology-enhanced environments, positioned in a theoretical instructional design framework called CoLeCTTE. The findings of the study revealed the importance of the Achieving Motive dimension of a student's learning approach and showed how direct intervention and strategies can positively influence critical thinking performance. The findings also identified factors that can adversely affect critical thinking performance, including poor learning skills; frustration, stress and poor self-confidence; competing priorities over learning; and inadequate appropriation of group roles and tasks. These findings are set out as instructional design guidelines for the judicious integration of learning technologies into learning and teaching practice in higher education, to support deep learning and critical thinking in collaborative groups. The guidelines are presented in two key areas: technology and tools; and activity design, monitoring, control and feedback.
Abstract:
This thesis examines how the initial institutional and technological aspects of the economy and the reforms that alter these aspects influence long run growth and development. These issues are addressed in the framework of stochastic endogenous growth models and an empirical framework. The thesis is able to explain why developing nations exhibit diverse growth and inequality patterns. Consequently, the thesis raises a number of policy implications regarding how these nations can improve their economic outcomes.
Abstract:
This thesis examined the determinants of consumers’ use of emerging mental health services delivered via mobile phone technology, which promise to provide cost-effective psychotherapeutic support where and when needed. It builds on the Model of Goal-Directed Behaviour by recognising the role that competition between behavioural alternatives plays in influencing consumers’ decision to use these services. The research employed a three-study, mixed-methodological approach.
Abstract:
This paper presents research findings and design strategies that illustrate how digital technology can be applied as a tool for hybrid placemaking in ways that would not be possible in purely digital or physical space. Digital technology has revolutionised the way people learn and gather new information. This trend has challenged the role of the library as a physical place, as well as the interplay of the digital and physical aspects of the library. The paper provides an overview of how the penetration of digital technology into everyday life has affected the library as a place, both as designed by placemakers and as perceived by library users. It then identifies a gap in current library research on the use of digital technology as a tool for placemaking, and reports results from a study of Gelatine – a custom-built user check-in system that displays real-time user information on a set of public screens. Gelatine and its evaluation at The Edge, State Library of Queensland, illustrate how combining the affordances of social, spatial and digital space can improve the connected learning experience of on-site visitors. Future design strategies involving gamification of the user experience in libraries are described, based on Gelatine's infrastructure. The presented design ideas and concepts are relevant to managers and designers of libraries as well as of other informal, social learning environments.
Abstract:
The Web is a steadily evolving resource comprising much more than mere HTML pages. With its ever-growing data sources in a variety of formats, it provides great potential for knowledge discovery. In this article, we shed light on some interesting phenomena of the Web: the deep Web, which surfaces database records as Web pages; the Semantic Web, which defines meaningful data exchange formats; XML, which has established itself as a lingua franca for Web data exchange; and domain-specific markup languages, which are designed based on XML syntax with the goal of preserving semantics in targeted domains. We detail these four developments in Web technology, and explain how they can be used for data mining. Our goal is to show that all these areas can be as useful for knowledge discovery as the HTML-based part of the Web.
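The XML point above — markup that preserves semantics and so hands the data miner structured records rather than scraped HTML — can be sketched briefly. The element names and the toy "catalog" document below are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical domain-specific XML snippet: the kind of structured records
# that would otherwise have to be scraped out of rendered HTML pages.
doc = """
<catalog>
  <paper year="2004"><title>Deep Web Surfacing</title><topic>databases</topic></paper>
  <paper year="2006"><title>Semantic Markup</title><topic>semantic web</topic></paper>
  <paper year="2006"><title>XML Mining</title><topic>databases</topic></paper>
</catalog>
"""

root = ET.fromstring(doc)

# Turn the markup into tabular records ready for knowledge discovery.
records = [
    {"year": int(p.get("year")),
     "title": p.findtext("title"),
     "topic": p.findtext("topic")}
    for p in root.findall("paper")
]

# A trivial "mining" step: count papers per topic.
counts = {}
for r in records:
    counts[r["topic"]] = counts.get(r["topic"], 0) + 1
print(counts)  # {'databases': 2, 'semantic web': 1}
```

Because the tags carry the semantics, no brittle screen-scraping heuristics are needed — which is exactly the advantage the article attributes to XML and domain-specific markup languages over plain HTML.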
Abstract:
In response to current developments in the tertiary education sector, the Queensland University of Technology Library has mounted an intensive course - Advanced Information Retrieval Skills - for higher degree students. In determining the need for such a course, a survey of postgraduate students and their supervisors was conducted. Results of this survey are discussed and details of the four credit point subjects are outlined.
Abstract:
Phenomenography has its roots in educational research (Marton and Booth, 1997), but has since been adopted in other domains including business (Sandberg, 1994), health (Barnard, McCosker and Gerber, 1999), information science (Bruce, 1999a,b) and information technology (Bruce and Pham, 2001), as well as information systems. Emerging phenomenographic research in areas other than education has been interdisciplinary, often bringing together technology, education and a host discipline such as health or business. In Australia, phenomenography has been used in information technology (IT) related research primarily in Victoria and Queensland. These studies have pursued the latter two of three established lines of phenomenographic research: 1) the study of conceptions of learning; 2) the study of conceptions in specific disciplines of study; and 3) the study of how people conceive of various aspects of their everyday world that have not, for them, been the object of formal study (Marton, 1988, p. 189). Information technology researchers have predominantly pursued the latter two lines of research.