954 results for appropriate technology
Abstract:
Pragmatic construction professionals, accustomed to intense price competition and focused on the bottom line, have difficulty justifying investments in advanced technology. Researchers and industry professionals need improved tools to analyze how technology affects the performance of the firm. This paper reports the results of research to begin answering the question, "does technology matter?" The researchers developed a set of five dimensions for technology strategy, collected information regarding these dimensions along with four measures of competitive performance in five bridge construction firms, and analyzed the information to identify relationships between technology strategy and competitive performance. Three technology strategy dimensions (competitive positioning, depth of technology strategy, and organizational fit) showed particularly strong correlations with the competitive performance indicators of absolute growth in contract awards and contract award value per technical employee. These findings indicate that technology does matter. The research also provides managers with ways to analyze options for approaching technology and to relate technology to competitive performance, and it contributes a valuable set of research measures for technology strategy.
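The abstract does not state which correlation method was used; the sketch below illustrates the general form of such an analysis, relating the three key technology strategy dimensions to one performance indicator across five firms, using Spearman rank correlation as an assumed stand-in and wholly invented data:

import pandas as pd
from scipy import stats

# Invented scores for five firms on the three key technology strategy
# dimensions and one competitive performance indicator (illustrative only).
firms = pd.DataFrame({
    "competitive_positioning": [3.2, 4.1, 2.5, 4.6, 3.8],
    "strategy_depth":          [2.9, 4.4, 2.2, 4.8, 3.5],
    "organizational_fit":      [3.0, 4.0, 2.8, 4.5, 3.6],
    "contract_award_growth":   [1.1, 2.3, 0.4, 2.9, 1.7],
})

# Spearman rank correlation is a plausible choice with only five firms;
# the paper may well have used a different statistic.
for dim in firms.columns[:-1]:
    rho, p = stats.spearmanr(firms[dim], firms["contract_award_growth"])
    print(f"{dim}: rho = {rho:.2f} (p = {p:.2f})")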
Abstract:
The pervasiveness of technology in the 21st Century has meant that adults and children live in a world where digital devices are integral to everyday life and participation in society. How we communicate, learn, work, entertain ourselves, and even shop is influenced by technology. Therefore, before children begin school they are potentially exposed to a range of learning opportunities mediated by digital devices. These devices include microwaves, mobile phones, computers, consoles such as Playstations®, and iPods®. In Queensland preparatory classrooms and in the homes of these children, teachers and parents support and scaffold young children's experiences, providing them with access to a range of tools that promote learning and provide entertainment. This paper examines teachers' and parents' perspectives and considers whether they are techno-optimists, who advocate for and promote the inclusion of digital technology, or techno-pessimists, who prefer to exclude digital devices from young children's everyday experiences. An exploratory, single-case study design was used to gather data from three teachers and ten parents of children in the preparatory year. Teacher data were collected through interviews and email correspondence; parent data were collected from questionnaires and focus groups. All parents who responded to the research invitation were mothers. Data analysis identified a misalignment between the adults' perspectives: teachers were identified as techno-optimists and parents as techno-pessimists, with further emergent themes particular to each category established. This is concerning because both teachers and mothers influence young children's experiences and numeracy knowledge; thus, a shared understanding and a common commitment to supporting young children's use of technology would be beneficial. Further research should investigate fathers' perspectives on digital devices and the beneficial and detrimental roles that a range of digital devices, tools, and entertainment gadgets play in 21st Century children's lives.
Abstract:
The antecedents of channel power (e.g. El-Ansary and Stern, 1972) and the impact of channel structure (e.g. Anderson and Narus, 1984) on channel dynamics have long been important topics within the channel literature. In addition to the theoretical and methodological contributions, research in these areas has helped channel managers to understand how power is generated and used in coordinating distribution strategies in different contexts. The study presented in this paper builds upon these streams of literature, which are briefly reviewed below.
Abstract:
This paper investigates virtual reality representations of performance in London’s late sixteenth-century Rose Theatre, a venue that, by means of current technology, can once again challenge perceptions of space, performance, and memory. The VR model of The Rose becomes a Camillo device in that it represents a virtual recreation of this venue in as much detail as possible and attempts to recover graphic demonstrations of the trace memories of the performance modes of the day. The VR model is based on accurate archeological and theatre historical records and is easy to navigate. The introduction of human figures onto The Rose’s stage via motion capture allows us to explore the relationships between space, actor and environment. The combination of venue and actors facilitates a new way of thinking about how the work of early modern playwrights can be stored and recalled. This virtual theatre is thus activated to intersect productively with contemporary studies in performance; as such, our paper provides a perspective on and embodiment of the relation between technology, memory and experience. It is, at its simplest, a useful archiving project for theatrical history, but it is directly relevant to contemporary performance practice as well. Further, it reflects upon how technology and ‘re-enactments’ of sorts mediate the way in which knowledge and experience are transferred, and even what may be considered ‘knowledge.’ Our work provides opportunities to begin addressing what such intermedial confrontations might produce for ‘remembering, experiencing, thinking and imagining.’ We contend that these confrontations will enhance live theatre performance rather than impeding or disrupting contemporary performance practice. This paper intersects with the CFP’s ‘Performing Memory’ and ‘Memory Lab’ themes. Our presentation (which includes a demonstration of the VR model and the motion capture it requires) takes the form of two closely linked papers that share a single abstract. The two papers will be given by two people, one of whom will be physically present in Utrecht, the other participating via Skype.
Abstract:
This talk explores a new opportunity that renewable energy technology presents for society.
Abstract:
The use of appropriate features to represent an output class or object is critical for all classification problems. In this paper, we propose a biologically inspired object descriptor to represent the spectral-texture patterns of image-objects. The proposed descriptor is generated from the pulse spectral frequencies (PSF) of a pulse-coupled neural network (PCNN) and is invariant to rotation, translation and small changes in scale. The proposed method is first evaluated on rotation- and scale-invariant texture classification using the USC-SIPI texture database. It is further evaluated in an application of vegetation species classification for power line corridor monitoring using airborne multi-spectral imagery. The results from the two experiments demonstrate that the PSF feature is effective in representing the spectral-texture patterns of objects and outperforms classic color histogram and texture features.
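The abstract does not give the PCNN formulation used; the following is a minimal sketch of a simplified pulse-coupled neural network whose per-iteration firing counts form a pulse-spectrum signature, the general idea behind a PSF descriptor. All parameter values are illustrative, not those of the paper:

import numpy as np
from scipy.ndimage import convolve

def pcnn_pulse_spectrum(img, n_iter=40, beta=0.2,
                        a_theta=0.2, v_theta=20.0):
    """Simplified pulse-coupled neural network (PCNN).

    Returns a pulse-spectrum signature: the fraction of neurons firing
    at each iteration. Illustrative formulation, not the paper's.
    """
    s = img.astype(float) / (img.max() + 1e-9)    # normalised stimulus
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])          # linking weights
    y = np.zeros_like(s)                          # pulse output
    theta = np.ones_like(s)                       # dynamic threshold
    psf = np.zeros(n_iter)
    for n in range(n_iter):
        l = convolve(y, kernel, mode='constant')  # linking input from neighbours
        u = s * (1.0 + beta * l)                  # internal activity
        y = (u > theta).astype(float)             # neurons that fire this step
        theta = np.exp(-a_theta) * theta + v_theta * y  # decay, then reset fired
        psf[n] = y.sum()                          # pulses at this iteration
    return psf / s.size                           # normalised pulse spectrum

Because the signature counts how many neurons fire at each step, not where they fire, it is unchanged by rotating or translating the input pattern, which is the invariance property the paper exploits.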
Abstract:
Intelligible and accurate risk-based decision-making requires a complex balance of information from different sources, appropriate statistical analysis of this information, and consequent intelligent inference and decisions made on the basis of these analyses. Importantly, this requires an explicit acknowledgement of uncertainty in the inputs and outputs of the statistical model. The aim of this paper is to progress a discussion of these issues in the context of several motivating problems related to the wider scope of agricultural production. These problems include biosecurity surveillance design, pest incursion, environmental monitoring and import risk assessment. The information to be integrated includes observational and experimental data, remotely sensed data and expert information. We describe our efforts in addressing these problems using Bayesian models and Bayesian networks. These approaches provide a coherent and transparent framework for modelling complex systems, combining the different information sources, and allowing for uncertainty in inputs and outputs. While the theory underlying Bayesian modelling has a long and well-established history, its application to complex problems is only now becoming feasible, owing to the increased availability of methodological and computational tools. Of course, there are still hurdles and constraints, which we also address by sharing our endeavours and experiences.
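The paper's actual models are not given in the abstract; as a minimal illustration of the general approach of combining expert information with observational data while keeping uncertainty explicit, consider a conjugate beta-binomial update for pest prevalence in a surveillance setting (all numbers invented):

import numpy as np
from scipy import stats

# Expert elicitation: pest prevalence believed to be around 5%, fairly
# uncertain. A Beta(2, 38) prior has mean 0.05 with wide credible bounds.
prior_a, prior_b = 2.0, 38.0

# Observational survey data: 3 infested sites out of 120 inspected.
infested, inspected = 3, 120

# Conjugate Bayesian update: posterior is Beta(a + k, b + n - k).
post = stats.beta(prior_a + infested, prior_b + inspected - infested)

print(f"posterior mean prevalence: {post.mean():.3f}")
lo, hi = post.interval(0.95)
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")

The posterior carries the uncertainty forward explicitly, which is the property the paper emphasises for risk-based decisions.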
Abstract:
This paper explores the embodiment of agency concepts in tangible user interfaces to create meaningful learning experiences. Current notions of agent-based tangible technology are extended, through the development of low-fidelity prototypes, to include additional flexibility and adaptability. A study involving these prototypes was conducted in a kindergarten environment with nine four-year-old children. Observations of children's interactions with the prototypes produced insightful results which will be used to further refine the product under development.
Abstract:
There has been much conjecture of late as to whether the patentable subject matter standard contains a physicality requirement. The issue came to a head when the Federal Circuit introduced the machine-or-transformation test in In re Bilski and declared it to be the sole test for determining subject matter eligibility. Many commentators criticized the test, arguing that it is inconsistent with Supreme Court precedent and the need for the patent system to respond appropriately to all new and useful innovation in whatever form it arises. Those criticisms were vindicated when, on appeal, the Supreme Court in Bilski v. Kappos dispensed with any suggestion that the patentable subject matter test involves a physicality requirement. In this article, the issue is addressed from a normative perspective: it asks whether the patentable subject matter test should contain a physicality requirement. The conclusion reached is that it should not, because such a limitation is not an appropriate means of encouraging much of the valuable innovation we are likely to witness during the Information Age. It is contended that not only are traditionally recognized mechanical, chemical and industrial manufacturing processes patent eligible, but that patent eligibility extends to non-machine-implemented and non-physical methods that have no connection with a physical device and do not cause a physical transformation of matter. Concerns that there is a trend of overreaching commoditization or propertization, in which the boundaries of patent law have been expanded too far, are unfounded, since the strictures of novelty, nonobviousness and sufficiency of description will exclude undeserving subject matter from patentability. The argument made is that introducing a physicality requirement would have unintended adverse effects in various fields of technology, particularly those emerging technologies that are likely to have a profound social effect in the future.
Abstract:
About 1.6 million students currently study outside their home country. Despite this, and the fact that Australia, the United States, the United Kingdom and many of the other host countries of international students are themselves extremely culturally diverse communities, business education remains essentially mono-cultural in form and Anglo-American in content. Whilst it is true that these international students may want to understand the "Western" way of doing things, they may not be familiar or comfortable with the processes used to facilitate learning. This paper explores a project undertaken to create a tool that provides essential pre-orientation information and advice to students before they leave home. Where cultural adjustment is required, the period before departure is a very effective time to introduce key information about lifestyle, culture and approaches to teaching and learning, helping students with the complex and difficult adjustment to studying abroad so that they can make a smoother transition to their new place of learning. Welcome to Studying Business at QUT is a Data DVD with 19 short videos capturing a student perspective on life and study. Forty percent of the content relates to living and studying, with sections on accommodation, lifestyle, food and transport; the remaining 60% takes an in-depth look at studying business, featuring students and academics discussing issues such as assessment, academic writing and working in groups. This paper outlines the process of developing the DVD and the range of issues addressed.
Abstract:
Doctoral dissertation, Stanford University, California, 1993
Abstract:
A basic element in advertising strategy is the choice of an appeal. In business-to-business (B2B) marketing communication, a long-standing approach relies on literal and factual, benefit-laden messages. Given the highly complex, costly and involved processes of business purchases, such approaches are certainly understandable. This project challenges the traditional B2B approach and asks whether an alternative approach, using symbolic messages that operate at a more intrinsic or emotional level, is effective in the B2B arena. An emerging body of literature asserts that stronger, more enduring results can be achieved through symbolic messages (imagery or text) in an advertisement than through literal (factual) ones. The present study contributes to this stream of research. From a theoretical standpoint, the study explores differences in literal-symbolic message content in B2B advertisements. There has been much discussion, mainly in the consumer literature, on the ability of symbolic messages to motivate a prospect to process advertising information by necessitating more elaborate processing and comprehension. Business buyers are regarded as less receptive to indirect or implicit appeals because their purchase decisions are based on direct evidence of product superiority. It is argued here that these same buyers may be equally influenced by advertising that stimulates internally-directed motivation, feelings and cognitions about the brand. Thus far, studies on the effects of literalism and symbolism are fragmented, and few focus on the B2B market. While there have been many studies about the effects of symbolism, no adequate scale exists to measure the continuum of literalism-symbolism. Therefore, a first task for this study was to develop such a scale. Following scale development, content analysis of 748 B2B print advertisements was undertaken to investigate whether differences in literalism-symbolism led to higher advertising performance. Variations across time and industry were also measured. From a practical perspective, the results challenge the prevailing B2B practice of relying on literal messages. While definitive support was not established for the use of symbolic message content, literal messages also failed to predict advertising performance. If the 'fact, benefit-laden' assumption within B2B advertising cannot be supported, then other approaches used in the business-to-consumer (B2C) sector, such as symbolic messages, may also be appropriate in business markets. Further research will need to test the potential effects of such messages, thereby building a revised foundation that can help drive advances in B2B advertising. Finally, the study offers a contribution to the growing body of knowledge on symbolism in advertising. While the specific focus of the study relates to B2B advertising, the Literalism-Symbolism scale developed here provides a reliable measure to evaluate literal and symbolic message content in all print advertisements. The value of this scale in advancing our understanding of message strategy may be significant in future consumer and business advertising research.
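The abstract does not say how the scale's reliability was established; a routine step in scale development is computing Cronbach's alpha over the scale items, sketched here for a hypothetical five-item literalism-symbolism rating matrix with simulated data (the item count and ratings are invented):

import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (observations x items) rating matrix."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                          # number of scale items
    item_var = ratings.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)

# Simulated ratings of 748 ads on a hypothetical 5-item scale
# (1 = fully literal ... 7 = fully symbolic), correlated by construction.
rng = np.random.default_rng(0)
base = rng.integers(1, 8, size=(748, 1))                  # latent rating
scores = np.clip(base + rng.integers(-1, 2, (748, 5)), 1, 7)
print(f"alpha = {cronbach_alpha(scores):.2f}")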
Abstract:
The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, caused by abnormalities in the properties of the tear film, is one of the most commonly reported eye health problems. Current clinical tools to assess tear film properties have shown certain limitations. The traditional invasive methods for assessing tear film quality, which are used by most clinicians, have been criticized for their lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence, no "gold standard" test is currently available to assess tear film integrity. Improving techniques for the assessment of tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea and its reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern; when the tear film surface presents irregularities, the pattern becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film. However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics. A set of novel routines was developed specifically to quantify changes in the reflected pattern and to extract a time-series estimate of TFSQ from the video recording. The routines extract from each frame of the video recording a maximized area of analysis, within which a TFSQ metric is calculated. Initially, two metrics, based on Gabor filter and Gaussian gradient techniques, were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare-eye and contact-lens-wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing tear break-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. Receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; the DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique identified in this clinical study was a lack of sensitivity in quantifying the build-up/formation phase of the tear film cycle. For that reason, an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-ring pattern into an image of quasi-straight lines from which block statistics are extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully understand the HSV measurement and the instrument's potential limitations. Of special interest was the assessment of the instrument's sensitivity to subtle topographic changes. The theoretical simulations helped provide some understanding of tear film dynamics; for instance, the model extracted for the build-up phase offered insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing). Unfortunately, these techniques do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods; a set of guidelines is proposed here to meet both criteria. Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order so that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of high-speed videokeratoscopy for assessing tear film surface quality. A set of novel image and signal processing techniques has been proposed for tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance in a broad range of conditions (i.e., contact lens wearers, normal and dry eye subjects). As a result, this technique could become a useful clinical tool for assessing tear film surface quality.
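The exact block statistic used in the thesis is not specified in this abstract; the sketch below illustrates the Cartesian-to-polar transformation and block-processing idea, with the dispersion of block-wise gradient statistics standing in as an assumed TFSQ metric (the function, its parameters and the choice of statistic are all illustrative):

import cv2
import numpy as np

def tfsq_block_metric(frame, center, max_radius, block=16):
    """Polar-unwrap a Placido-ring frame and score pattern regularity.

    After the Cartesian-to-polar transform the concentric rings become
    quasi-straight lines, so local gradient statistics within small
    blocks reflect how regular (smooth tear film) or disturbed
    (degraded tear film) the reflected pattern is.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # assumes a colour frame
    polar = cv2.warpPolar(gray, (512, 512), center, max_radius,
                          cv2.INTER_LINEAR + cv2.WARP_POLAR_LINEAR)
    # Gradient across the unwrapped rings (radius runs along the x axis).
    gx = cv2.Sobel(polar, cv2.CV_32F, 1, 0, ksize=3)
    h, w = polar.shape
    scores = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            scores.append(gx[r:r+block, c:c+block].std())
    # Higher dispersion of the block statistics -> more irregular pattern.
    return float(np.std(scores))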
Abstract:
The ability to accurately predict the remaining useful life of machine components is critical for continuous machine operation and can also improve productivity and enhance system safety. In condition-based maintenance (CBM), maintenance is performed based on information collected through condition monitoring and assessment of machine health. Effective diagnostics and prognostics are important aspects of CBM, allowing maintenance engineers to schedule repairs and to acquire replacement components before the components actually fail. Although a variety of prognostic methodologies have been reported recently, their application in industry is still relatively new and mostly focused on the prediction of specific component degradations. Furthermore, they require a sufficient number of fault indicators to accurately prognose component faults, so more effective use of health indicators for interpreting the machine degradation process is still needed. Major challenges for accurate long-term prediction of remaining useful life (RUL) remain to be addressed. Therefore, continuous development and improvement of machine health management systems and accurate long-term prediction of machine remnant life are required in real industry applications. This thesis presents an integrated diagnostics and prognostics framework based on health state probability estimation for accurate, long-term prediction of machine remnant life. In the proposed model, prior empirical (historical) knowledge is embedded in the integrated diagnostics and prognostics system for the classification of impending faults in the machine system and accurate probability estimation of discrete degradation stages (health states). The methodology assumes that machine degradation consists of a series of degraded states (health states) which effectively represent the dynamic and stochastic process of machine failure. The estimation of discrete health state probabilities for the prediction of machine remnant life is performed using classification algorithms. To select the appropriate classifier for health state probability estimation in the proposed model, comparative intelligent diagnostic tests were conducted using five different classifiers applied to progressive fault data for three different faults in a high-pressure liquefied natural gas (HP-LNG) pump. As a result of this comparison study, support vector machines (SVMs) were employed for health state probability estimation in this research. The proposed prognostic methodology has been successfully tested and validated using a number of case studies, from simulation tests to real industry applications. The results from two actual failure case studies using simulations and experiments indicate that accurate estimation of health states is achievable and that the proposed method provides accurate long-term prediction of machine remnant life. In addition, the results of experimental tests show that the proposed model is capable of providing early warning of abnormal machine operating conditions by identifying the transitional states of machine fault conditions. Finally, the proposed prognostic model is validated through two industrial case studies. The optimal number of health states, which minimises model training error without a significant decrease in prediction accuracy, was also examined across several health states of bearing failure. The results were very encouraging and show that the proposed prognostic model based on health state probability estimation has the potential to be used as a generic and scalable asset health estimation tool in industrial machinery.
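The thesis's features, classifiers and training data are not given in this abstract; the following sketch illustrates the general pattern of health state probability estimation with an SVM, followed by a probability-weighted remnant-life estimate (all data, state definitions and RUL values are invented):

import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Illustrative training data: condition-monitoring feature vectors
# labelled with discrete degradation stages 0 (healthy) .. 3 (near failure).
rng = np.random.default_rng(1)
X_train = rng.normal(size=(400, 6)) + np.repeat(np.arange(4), 100)[:, None]
y_train = np.repeat(np.arange(4), 100)

# SVM with probability outputs, standing in for the health state classifier.
clf = make_pipeline(StandardScaler(), SVC(probability=True))
clf.fit(X_train, y_train)

# New observation -> probability of each health state.
x_new = rng.normal(size=(1, 6)) + 2.0
p_state = clf.predict_proba(x_new)[0]

# Hypothetical mean remaining life (hours) for each health state; the
# expected RUL is the probability-weighted average over the states.
rul_per_state = np.array([900.0, 600.0, 250.0, 40.0])
print("state probabilities:", np.round(p_state, 3))
print("expected RUL (h):", float(p_state @ rul_per_state))

Weighting remnant life by the full state-probability vector, rather than committing to the single most likely state, is one way such a framework can carry diagnostic uncertainty through to the prognosis.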