238 results for functionality


Relevance: 10.00%

Abstract:

China has a reputation as an economy based on utility: the large-scale manufacture of low-priced goods. But useful values like functionality, fitness for purpose and efficiency are only part of the story. More important are what Veblen called ‘honorific’ values, arguably the driving force of development, change and value in any economy. To understand the Chinese economy therefore, it is not sufficient to point to its utilitarian aspect. Honorific status-competition is a more fundamental driver than utilitarian cost-competition. We argue that ‘social network markets’ are the expression of these honorific values, relationships and connections that structure and coordinate individual choices. This paper explores how such markets are developing in China in the area of fashion and fashion media. These, we argue, are an expression of ‘risk culture’ for high-end entrepreneurial consumers and producers alike, providing a stimulus to dynamic innovation in the arena of personal taste and comportment, as part of an international cultural system based on constant change. We examine the launch of Vogue China in 2005, and China’s reception as a fashion player among the international editions of Vogue, as an expression of a ‘decisive moment’ in the integration of China into an international social network market based on honorific values.

Relevance: 10.00%

Abstract:

Executive Summary
The objective of this report was to use the Sydney Opera House as a case study of the application of Building Information Modelling (BIM). The Sydney Opera House is a large, complex building with a very irregular configuration, which makes it a challenging test case. A number of key concerns are evident at SOH:
• the building structure is complex, and building service systems - already the major cost of ongoing maintenance - are undergoing technology change, with new computer-based services becoming increasingly important;
• the current “documentation” of the facility comprises several independent, partly overlapping systems, and is inadequate to support the services required now and in the future;
• the building has reached a milestone age in terms of the condition and maintainability of key public areas and service systems, the functionality of spaces, and longer-term strategic management;
• many business functions, such as space or event management, require up-to-date information about the facility that is currently inadequately delivered, and expensive and time-consuming to update and deliver to customers;
• major building upgrades are being planned that will put considerable strain on existing Facilities Portfolio services and their capacity to manage them effectively.
While some of these concerns are unique to the House, many will be common to larger commercial and institutional portfolios. The work described here supported a complementary task which sought to identify whether a building information model - an integrated building database - could be created that would support asset and facility management functions (see Sydney Opera House - FM Exemplar Project, Report Number: 2005-001-C-4 Building Information Modelling for FM at Sydney Opera House), a business strategy that has been well demonstrated. The development of the BIMSS - Open Specification for BIM - has been surprisingly straightforward.
The lack of technical difficulties in converting the House’s existing conventions and standards to the new model-based environment can be related to three key factors:
• SOH Facilities Portfolio - the internal group responsible for asset and facility management - already has well-established building and documentation policies in place. The setting of, and adherence to, well-thought-out operational standards has been driven by the need to create an environment that is understood by all users and that addresses the major business needs of the House.
• The second factor is the nature of the IFC Model Specification used to define the BIM protocol. The IFC standard is based on building practice and nomenclature widely used in the construction industries across the globe. For example, the nomenclature of building parts - e.g. ifcWall - corresponds to normal terminology, while extending the traditional drawing environment currently used for design and documentation. This demonstrates that the international IFC model accurately represents local practice for building data representation and management.
• A BIM environment sets up opportunities for innovative processes that can exploit the rich data in the model and improve services and functions for the House. For example, several high-level processes have been identified that could benefit from standardised Building Information Models, such as maintenance processes using engineering data; business processes using scheduling, venue access and security data; and benchmarking processes using building performance data.
The new technology matches business needs for current and new services. The adoption of IFC-compliant applications opens the way for shared building model collaboration and new processes, a significant new focus of the BIM standards.
In summary, SOH’s current building standards have been successfully drafted for a BIM environment and are confidently expected to be fully developed when BIM is adopted operationally by SOH. These BIM standards and their application to the Opera House are intended as a template for other organisations to adopt for their own procurement and facility management activities. Appendices provide an overview of the IFC Integrated Object Model and an introduction to understanding IFC Model Data.
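The second factor above notes that IFC entity names map closely onto everyday building terminology. A minimal illustrative sketch of that correspondence (IfcWall, IfcDoor, IfcSlab and IfcWindow are genuine IFC entity names; the lookup itself is invented for illustration, not part of the report):

```python
# Simplified illustration: IFC entity names vs everyday building terms.
# The entity names are real IFC entities; this mapping is a sketch only.
ifc_to_common = {
    "IfcWall":   "wall",
    "IfcDoor":   "door",
    "IfcSlab":   "floor/roof slab",
    "IfcWindow": "window",
}

def common_name(ifc_entity):
    """Return the everyday term for an IFC entity, if known."""
    return ifc_to_common.get(ifc_entity, "unknown")

print(common_name("IfcWall"))
```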

Relevance: 10.00%

Abstract:

Objective
The review addresses two distinct sets of issues:
1. specific functionality, interface, and calculation problems that presumably can be fixed or improved; and
2. the more fundamental question of whether the system is close to being ready for ‘commercial prime time’ in the North American market.
Findings
Many of our comments relate to the first set of issues, especially sections B and C. Sections D and E deal with the second set. Overall, we feel that LCADesign represents a very impressive step forward in the ongoing quest to link CAD with LCA tools and, more importantly, to link the world of architectural practice and that of environmental research. From that perspective, it deserves continued financial support as a research project. However, if the decision is whether or not to continue the development program from a purely commercial perspective, we are less bullish. In terms of the North American market, there are no regulatory or other drivers to press design teams to use a tool of this nature. There is certainly interest in this area, but the tools must be very easy to use with little or no training. Understanding the results is as important in this regard as knowing how to apply the tool. Our comments are fairly negative when it comes to that aspect. Our opinion might change to some degree when the ‘fixes’ are made and the functionality improved. However, as discussed in more detail in the following sections, we feel that the multi-step process — CAD to IFC to LCADesign — could pose a serious problem in terms of market acceptance. The CAD to IFC part is impossible for us to judge with the information provided, and we can’t even begin to answer the question about the ease of using the software to import designs, but it appears cumbersome from what we do know. There does appear to be a developing North American market for 3D CAD, with a recent survey indicating that about 50% of the firms use some form of 3D modeling for about 75% of their projects.
However, this does not mean that full 3D CAD is always being used. Our information suggests that AutoDesk accounts for about 75 to 80% of the 3D CAD market, and they are very cautious about any links that do not serve a latent demand. Finally, other systems that link CAD to energy simulation use XML data transfer protocols rather than IFC files, and it is our understanding that the market served by AutoDesk tends in that direction right now. This is a subject outside our area of expertise, so please take these comments as suggestions for more intensive market research rather than as definitive findings.

Relevance: 10.00%

Abstract:

Both in developed and developing economies, major public funding is invested in civil infrastructure assets. The efficiency and comfort of expected and demanded living standards are largely dependent on the management strategies for these assets. Buildings are among the major and vital assets, which need to be maintained primarily to ensure their functionality through effective and efficient delivery of services and to optimise economic benefits. Notwithstanding this, public building infrastructure is not considered in the Infrastructure Report Card published by the Australian Infrastructure Report Card Alliance Partners (2001); the reason appears to be a lack of sufficient data to rate public building infrastructure. The American Infrastructure Report Card (2001) gave “School Buildings” a ‘D-’ rating, which is below ‘poor’. For effective asset management of building infrastructure, a need has emerged to optimise the budget for managing assets, to cope with increased user expectations, to respond effectively to possible asset failures, to deal with ageing assets and ageing populations, and to address other scenarios including technological advancement and non-asset solutions. John (Asset Management, 2001) suggests that the UK, Australia and New Zealand are world leaders in asset management.

Relevance: 10.00%

Abstract:

For the last two decades, heart disease has been the highest single cause of death in the human population. With an alarming number of patients requiring heart transplants, and donations unable to satisfy the demand, treatment looks to mechanical alternatives. Rotary Ventricular Assist Devices (VADs) are miniature pumps which can be implanted alongside the heart to assist its pumping function. These constant-flow devices are smaller, more efficient and promise a longer operational life than more traditional pulsatile VADs. The development of rotary VADs has focused on single pumps assisting only the left ventricle, which supplies blood to the body. In many patients, however, failure of both ventricles demands that an additional pulsatile device be used to support the failing right ventricle. This condition renders such patients hospital-bound while they wait for an unlikely heart donation. Reported attempts to use two rotary pumps to support both ventricles concurrently have warned of inherent haemodynamic instability: poor balancing of the pumps’ flow rates quickly leads to vascular congestion, increasing the risk of oedema, and to ventricular ‘suckdown’ occluding the inlet to the pump. This thesis introduces a novel Bi-Ventricular Assist Device (BiVAD) configuration in which the pump outputs are passively balanced by vascular pressure. The BiVAD consists of two rotary pumps straddling a mechanical passive controller. Fluctuations in vascular pressure induce small deflections within both pumps, adjusting their outputs and allowing them to maintain arterial pressure. To optimise the passive controller’s interaction with the circulation, the controller’s dynamic response is tuned with a spring, mass, damper arrangement. This two-part study presents a comprehensive assessment of the prototype’s ‘viability’ as a support device. Its ‘viability’ was considered based on its sensitivity to pathogenic haemodynamics and the ability of the passive response to maintain healthy circulation.
The first part of the study is an experimental investigation where a prototype device was designed and built, and then tested in a pulsatile mock circulation loop. The BiVAD was subjected to a range of haemodynamic imbalances as well as a dynamic analysis to assess the functionality of the mechanical damper. The second part introduces the development of a numerical program to simulate human circulation supported by the passively controlled BiVAD. Both investigations showed that the prototype was able to mimic the native baroreceptor response. Simulating hypertension, poor flow balancing and subsequent ventricular failure during BiVAD support allowed the passive controller’s response to be assessed. Triggered by the resulting pressure imbalance, the controller responded by passively adjusting the VAD outputs in order to maintain healthy arterial pressures. This baroreceptor-like response demonstrated the inherent stability of the auto regulating BiVAD prototype. Simulating pulmonary hypertension in the more observable numerical model, however, revealed a serious issue with the passive response. The subsequent decrease in venous return into the left heart went unnoticed by the passive controller. Meanwhile the coupled nature of the passive response not only decreased RVAD output to reduce pulmonary arterial pressure, but it also increased LVAD output. Consequently, the LVAD increased fluid evacuation from the left ventricle, LV, and so actually accelerated the onset of LV collapse. It was concluded that despite the inherently stable baroreceptor-like response of the passive controller, its lack of sensitivity to venous return made it unviable in its present configuration. The study revealed a number of other important findings. Perhaps the most significant was that the reduced pulse experienced during constant flow support unbalanced the ratio of effective resistances of both vascular circuits. 
Even during steady rotary support, therefore, the resulting ventricle volume imbalance increased the likelihood of suckdown. Additionally, mechanical damping of the passive controller’s response successfully filtered out pressure fluctuations from residual ventricular function. Finally, the importance of recognising inertial contributions to blood flow in the atria and ventricles in a numerical simulation was highlighted. This thesis documents the first attempt to create a fully auto-regulated rotary cardiac assist device. Initial results encourage development of an inlet configuration sensitive to low flow, such as collapsible inlet cannulae. Combining this with the existing baroreceptor-like response of the passive controller should yield a highly stable, passively controlled BiVAD configuration. The prototype controller’s passive interaction with the vasculature is a significant step towards a highly stable new generation of artificial heart.
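The passive controller’s dynamic response is tuned with a spring, mass, damper arrangement. As a hedged illustration of the second-order behaviour such an arrangement exhibits - not the thesis’s actual model, and with invented parameter values - a minimal simulation of a damped spring-mass system under a constant load:

```python
# Minimal damped spring-mass simulation (illustrative only: the mass,
# stiffness, damping and load values below are assumed, not from the thesis).
def simulate(m=0.05, k=200.0, c=2.0, f=1.0, dt=1e-4, t_end=0.5):
    """Integrate m*x'' + c*x' + k*x = f with semi-implicit Euler.
    The displacement settles toward the static deflection f/k."""
    x, v = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = (f - c * v - k * x) / m   # Newton's second law
        v += a * dt                   # update velocity first (semi-implicit)
        x += v * dt                   # then position, using the new velocity
    return x

print(round(simulate(), 4))  # settles close to f/k = 0.005
```

With these assumed values the damping ratio is about 0.32, so the deflection oscillates briefly before settling, mirroring how a damped passive element filters pressure fluctuations rather than tracking them.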

Relevance: 10.00%

Abstract:

It has been proposed that body image disturbance is a form of cognitive bias wherein schemas for self-relevant information guide the selective processing of appearance-related information in the environment. This threatening information receives disproportionately more attention and memory, as measured by an Emotional Stroop task and an incidental recall task. The aim of this thesis was to expand the literature on cognitive processing biases in non-clinical males and females by incorporating a number of significant methodological refinements. To achieve this aim, three phases of research were conducted. The initial two phases provided preliminary data to inform the development of the main study. Phase One was a qualitative exploration of body image concerns amongst males and females recruited through the general community and from a university. Seventeen participants (eight male; nine female) provided information on their body image and the factors they saw as positively and negatively impacting on their self-evaluations. The importance of self-esteem, mood, health and fitness, and recognition of the social ideal were identified as key themes. These themes were incorporated as psycho-social measures and Stroop word stimuli in subsequent phases of the research. Phase Two involved the selection and testing of stimuli to be used in the Emotional Stroop task. Six experimental categories of words were developed that reflected a broad range of health and body image concerns for males and females: high- and low-calorie food words, positive and negative appearance words, negative emotion words, and physical activity words. Phase Three addressed the central aim of the project by examining cognitive biases for body image information in empirically defined sub-groups.
A national sample of males (N = 55) and females (N = 144), recruited from the general community and universities, completed an Emotional Stroop task, an incidental memory test, and a collection of psycho-social questionnaires. Sub-groups of body image disturbance were sought using a cluster analysis, which identified three sub-groups in males (Normal, Dissatisfied, and Athletic) and four sub-groups in females (Normal, Health Conscious, Dissatisfied, and Symptomatic). No differences were noted between the groups in selective attention, although the time taken to colour-name the words was associated with some of the psycho-social variables. Memory biases found across the whole sample for negative emotion, low-calorie food, and negative appearance words were interpreted as reflecting the current focus on health and the stigma against being unattractive. Collectively, these results have expanded our understanding of processing biases in the general community by demonstrating that processing biases are found within non-clinical samples and that not all processing biases are associated with negative functionality.

Relevance: 10.00%

Abstract:

Principal Topic
High-technology consumer products such as notebooks, digital cameras and DVD players are not introduced into a vacuum. Consumer experience with related earlier-generation technologies, such as PCs, film cameras and VCRs, and the installed base of these products, strongly impacts the market diffusion of the new-generation products. Yet technology substitution has received only sparse attention in the diffusion-of-innovation literature. Research on consumer durables has been dominated by studies of (first purchase) adoption (c.f. Bass 1969) which do not explicitly consider the presence of an existing product/technology. More recently, considerable attention has also been given to replacement purchases (c.f. Kamakura and Balasubramanian 1987). Only a handful of papers explicitly deal with the diffusion of technology/product substitutes (e.g. Norton and Bass, 1987; Bass and Bass, 2004). They propose diffusion-type aggregate-level sales models that are used to forecast the overall sales for successive generations. Lacking household data, these aggregate models are unable to give insights into the decisions of individual households - whether to adopt generation II, and if so, when and why. This paper makes two contributions. First, it is the first large-scale empirical study that collects household data for successive generations of technologies in an effort to understand the drivers of adoption. Second, in comparison to traditional analysis that treats technology substitution as an ‘adoption of innovation’ type process, we propose that from a consumer’s perspective, technology substitution combines elements of both adoption (adopting the new-generation technology) and replacement (replacing the generation I product with generation II). Based on this proposition, we develop and test a number of hypotheses.
Methodology/Key Propositions
In some cases, successive generations are clear ‘substitutes’ for the earlier generation, in that they have almost identical functionality: for example, successive generations of PCs (Pentium I to II to III), or flat-screen TVs substituting for colour TVs. More commonly, however, the new technology (generation II) is a ‘partial substitute’ for the existing technology (generation I). For example, digital cameras substitute for film-based cameras in the sense that they perform the same core function of taking photographs. They have the additional attributes of easier copying and sharing of images; however, the attribute of image quality is inferior. In cases of partial substitution, some consumers will purchase generation II products as substitutes for their generation I product, while other consumers will purchase generation II products as additional products to be used alongside their generation I product. We propose that substitute generation II purchases combine elements of both adoption and replacement, but that additional generation II purchases are a purely adoption-driven process. Extensive research on innovation adoption has consistently shown consumer innovativeness to be the most important consumer characteristic driving adoption timing (Goldsmith et al. 1995; Gielens and Steenkamp 2007). Hence, we expect consumer innovativeness also to influence both additional and substitute generation II purchases.
Hypothesis 1a: More innovative households will make additional generation II purchases earlier.
Hypothesis 1b: More innovative households will make substitute generation II purchases earlier.
Hypothesis 1c: Consumer innovativeness will have a stronger impact on additional generation II purchases than on substitute generation II purchases.
As outlined above, substitute generation II purchases act in part like a replacement purchase for the generation I product.
Prior research (Bayus 1991; Grewal et al. 2004) identified product age as the most dominant factor influencing replacements. Hence, we hypothesise that:
Hypothesis 2: Households with older generation I products will make substitute generation II purchases earlier.
Our survey of 8,077 households investigates their adoption of two new-generation products: notebooks as a technology change from PCs, and DVD players as a technology shift from VCRs. We employ Cox hazard modelling to study factors influencing the timing of a household’s adoption of generation II products. We determine whether this is an additional or substitute purchase by asking whether the generation I product is still used. A separate hazard model is conducted for additional and substitute purchases. Consumer innovativeness is measured as domain innovativeness adapted from the scales of Goldsmith and Hofacker (1991) and Flynn et al. (1996). The age of the generation I product is calculated based on the most recent household purchase of that product. Control variables include age, size and income of the household, and age and education of the primary decision-maker.
Results and Implications
Our preliminary results confirm our hypotheses. Consumer innovativeness has a strong influence on both additional purchases (exp = 1.11) and substitute purchases (exp = 1.09); exp is interpreted as the increased probability of purchase for an increase of 1.0 on a 7-point innovativeness scale. Also consistent with our hypotheses, the age of the generation I product has a dramatic influence on substitute purchases of VCR/DVD (exp = 2.92) and a strong influence for PCs/notebooks (exp = 1.30); here exp is interpreted as the increased probability of purchase for an increase of 10 years in the age of the generation I product. Yet, also as hypothesised, there was no influence on additional purchases. The results lead to two key implications.
First, there is a clear distinction between additional and substitute purchases of generation II products, each with different drivers. Treating these as a single process will mask the true drivers of adoption. For substitute purchases, product age is a key driver. Hence, marketers of high-technology products can utilise data on generation I product age (e.g. from warranty or loyalty programs) to target customers who are more likely to make a purchase.
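The exp values reported above are hazard multipliers on different scales (per innovativeness point, per 10 years of product age). A quick arithmetic check, using only the figures quoted in the abstract, converts the 10-year age multipliers to the per-year multipliers they imply:

```python
# Hazard multipliers as reported in the abstract (taken at face value here).
hr_age_vcr_dvd = 2.92   # per +10 years of VCR (generation I) age
hr_age_pc_note = 1.30   # per +10 years of PC (generation I) age
hr_innovation = 1.11    # per +1 point on a 7-point innovativeness scale

# In a Cox model the hazard scales as exp(beta * delta), so a 10-year
# multiplier M implies a per-year multiplier of M ** (1/10).
per_year_vcr = hr_age_vcr_dvd ** 0.1
per_year_pc = hr_age_pc_note ** 0.1
print(round(per_year_vcr, 3), round(per_year_pc, 3))  # → 1.113 1.027
```

On a per-year basis, then, each extra year of VCR age raises the substitution hazard by roughly 11%, comparable to a one-point innovativeness increase.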

Relevance: 10.00%

Abstract:

Generally, major public funding is invested in civil infrastructure assets. The efficiency and comfort of expected and actual living standards are largely dependent on the management strategies for these assets. Buildings are among the major and vital assets, which need to be maintained primarily to ensure their functionality through effective and efficient delivery of services and to optimise economic benefits. In Australia, billions of dollars are spent annually managing and maintaining built assets. These assets make up the social and economic infrastructure which facilitates essential services to the public and business. Buildings are among the prime and fundamental assets, which need to be managed effectively and efficiently to ensure that related services are delivered economically and sustainably.

Relevance: 10.00%

Abstract:

Maps have been published on the world wide web since its inception (Cartwright, 1999) and are still accessed and viewed by millions of users today (Peterson, 2003). While early web-based GIS products lacked a complete set of cartographic capabilities, the functionality within such systems has significantly increased over recent years. Functionalities once found only in desktop GIS products are now available in web-based GIS applications, for example data entry, basic editing, and analysis. Applications based on web GIS are becoming more widespread, and the web-based GIS environment is replacing traditional desktop GIS platforms in many organizations. Therefore, the development of new cartographic methods for web-based GIS is vital. The broad aim of this project is to examine and discuss the challenges and opportunities of innovative cartographic methods for web-based GIS platforms. The work introduces a recently developed cartographic methodology based on a web-based GIS portal of the Survey of Israel (SOI). It discusses the prospects and constraints of such methods in improving web-GIS interfaces and usability for the end user. The work also presents the preliminary findings of the initial implementation of the web-based GIS cartographic method within the portal of the Survey of Israel, as well as the applicability of those methods elsewhere.

Relevance: 10.00%

Abstract:

Security-critical communications devices must be evaluated to the highest possible standards before they can be deployed. This process includes tracing potential information flow through the device's electronic circuitry, for each of the device's operating modes. Increasingly, however, security functionality is being entrusted to embedded software running on microprocessors within such devices, so new strategies are needed for integrating information flow analyses of embedded program code with hardware analyses. Here we show how standard compiler principles can augment high-integrity security evaluations to allow seamless tracing of information flow through both the hardware and software of embedded systems. This is done by unifying input/output statements in embedded program execution paths with the hardware pins they access, and by associating significant software states with corresponding operating modes of the surrounding electronic circuitry.
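The unification described above - binding I/O statements to the hardware pins they access, and significant software states to circuit operating modes - can be pictured as a pair of lookup tables. A hypothetical sketch (all identifiers are invented; the actual analysis operates on compiled program execution paths and circuit descriptions):

```python
# Hypothetical sketch: bind program-level I/O call sites to hardware pins,
# and significant software states to circuit operating modes, so that an
# information-flow path can be read end to end in hardware terms.
io_to_pin = {
    "read_keypad()":  "PIN_7_KEYPAD_IN",
    "write_crypto()": "PIN_3_CIPHER_OUT",
}
state_to_mode = {
    "KEY_ENTRY":    "MODE_PLAINTEXT",
    "TRANSMITTING": "MODE_CIPHERTEXT",
}

def trace(path):
    """Replace each program-level step with its hardware counterpart."""
    return [io_to_pin.get(step) or state_to_mode.get(step, step) for step in path]

flow = trace(["read_keypad()", "KEY_ENTRY", "TRANSMITTING", "write_crypto()"])
print(flow)
```

The point of the unification is exactly this: once every software step has a hardware counterpart, the evaluator can trace a single flow from input pin to output pin across the hardware/software boundary.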

Relevance: 10.00%

Abstract:

The multi-level current reinjection concept described in the literature is well known to produce high-quality AC current waveforms in high-power, high-voltage self-commutating current source converters. This paper proposes a novel reinjection circuit capable of producing a 7-level reinjection current. It is shown that this reinjection current effectively increases the pulse number of the converter to 72. PSCAD/EMTDC simulation validates the proposed concept, illustrating its effectiveness on both the AC and DC sides of the converter.

Relevance: 10.00%

Abstract:

Many older adults have difficulty using modern consumer products due to their complexity, both in terms of functionality and interface design. It has been observed that older people also have more problems learning new systems. It was hypothesised that designing technological products that are more intuitive for older people to use can solve this problem. An intuitive interface allows a user to employ prior knowledge, thus minimising the learning needed for effective interaction. This paper discusses an experiment investigating the effectiveness of redundancy in interface design. The primary objective of this experiment was to find out whether using more than one modality in a product’s interface improves the speed and intuitiveness of interactions for older adults. Preliminary analysis showed a strong correlation between technology familiarity and time on tasks, but redundancy in interface design improved speed and accuracy of use only for participants with moderate to high technology familiarity.

Relevance: 10.00%

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at centimetre accuracy in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS, and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of various isolated operating network and single-base RTK systems, and of multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services:
• multiple GNSS constellations and multiple frequencies;
• large-scale, wide-area NRTK services with a network of networks;
• complex computation algorithms and processes;
• a greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK).
These four challenges lead to two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transferring capability. This research explores new approaches to addressing these future NRTK challenges and requirements using Grid Computing, in particular for large data-processing burdens and complex computation algorithms.
A Grid Computing based NRTK framework is proposed in this research, which is a layered framework consisting of: 1) Client layer with the form of Grid portal; 2) Service layer; 3) Execution layer. The user’s request is passed through these layers, and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration for the proposed framework is performed in a five-node Grid environment at QUT and also Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed and the results have preliminarily demonstrated the concepts and functionality of the new NRTK framework based on Grid Computing, whilst some aspects of the performance of the system are yet to be improved in future work.
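Ntrip, used above to download real-time RTCM data from reference stations, is an HTTP-like protocol in which a client requests a data stream (a "mountpoint") from a caster. A minimal sketch of constructing such a request (mountpoint and credentials are placeholders, and the header details follow the Ntrip 1.0 convention as commonly implemented, not this project's specific software):

```python
import base64

# Sketch of an Ntrip (v1) client request for a real-time RTCM stream.
# Mountpoint, username and password below are placeholders.
def ntrip_request(mountpoint, user, password):
    """Build the HTTP-style request line and headers an Ntrip caster expects."""
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    return (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        f"User-Agent: NTRIP ExampleClient/1.0\r\n"   # v1 clients identify as "NTRIP ..."
        f"Authorization: Basic {auth}\r\n"
        "\r\n"                                       # blank line ends the headers
    )

req = ntrip_request("MOUNT1", "user", "pass")
print(req.splitlines()[0])
```

After sending this over a TCP socket to the caster, the client receives a confirmation line followed by the binary RTCM stream, which the framework would then hand to its job-scheduling layer.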

Relevance: 10.00%

Abstract:

In the study of complex neurobiological movement systems, measurement indeterminacy has typically been overcome by imposing artificial modelling constraints to reduce the number of unknowns (e.g., reducing all muscle, bone and ligament forces crossing a joint to a single vector). However, this approach prevents human movement scientists from investigating more fully the role, functionality and ubiquity of coordinative structures or functional motor synergies. Advancements in measurement methods and analysis techniques are required if the contribution of individual component parts or degrees of freedom of these task-specific structural units is to be established, thereby effectively solving the indeterminacy problem by reducing the number of unknowns. A further benefit of establishing more of the unknowns is that human movement scientists will be able to gain greater insight into ubiquitous processes of physical self-organising that underpin the formation of coordinative structures and the confluence of organismic, environmental and task constraints that determine the exact morphology of these special-purpose devices.

Relevance: 10.00%

Abstract:

The Brisbane Media Map is both an online resource and a tertiary-level authentic learning project. The Brisbane Media Map is an online database which provides a detailed overview of about 600 media industry organisations in Brisbane, Australia. In addition to providing contact details and synopses for each organisation’s profile, the Brisbane Media Map also includes supplementary information on current issues, trends, and individuals in the media and communication industry sectors. This resource is produced and updated annually by final-year undergraduate Media and Communication students. This article introduces the Brisbane Media Map, its functionality and systems design approach, as well as its alignment with key learning infrastructures. It examines authentic learning as the pedagogical framework underpinning the ongoing development work of the resource and highlights some synergies of this framework with participatory design principles. The Brisbane Media Map is a useful example of an authentic learning approach that successfully engages students of non-traditional and non-design areas of study in human-computer interaction, usability, and participatory design activities.