Abstract:
The role of sustainability in urban design is becoming increasingly important as Australia’s cities continue to grow, putting pressure on existing infrastructure such as water, energy and transport. To optimise an urban design, many different aspects, such as water, energy, transport and costs, need to be taken into account in an integrated way. Integrated software applications that assess urban designs on a wide variety of aspects are scarcely available. With the upcoming next generation of the Internet, often referred to as the Semantic Web, data can become more machine-interpretable through the development of ontologies that can support the development of integrated software systems. Software systems can use these ontologies to perform an intelligent task such as assessing an urban design on a particular aspect. When the ontologies of different applications are aligned, the applications can share information, resulting in interoperability. Inference, such as compliance checking and classification, can support the alignment of these ontologies. A proof-of-concept implementation has been made to demonstrate and validate the usefulness of machine-interpretable ontologies for urban design.
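To make the alignment idea concrete, the following minimal sketch (in Python, using rdflib) shows how an equivalence axiom between two hypothetical application ontologies lets one application's instance data be retrieved under the other's vocabulary. The namespaces and class names are invented for illustration, and a single hand-written inference pass stands in for a full OWL reasoner.

```python
# Hedged sketch of ontology alignment between two hypothetical urban design
# applications (an energy assessor and a water assessor), using rdflib. The
# namespaces and class names are invented, and a hand-written inference pass
# stands in for a full OWL reasoner.
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import OWL, RDF

ENERGY = Namespace("http://example.org/energy#")
WATER = Namespace("http://example.org/water#")

g = Graph()

# Alignment axiom: the two applications' "dwelling" concepts are equivalent
g.add((ENERGY.Dwelling, OWL.equivalentClass, WATER.ResidentialBuilding))

# Instance data asserted by the energy application
g.add((URIRef("http://example.org/design/lot42"), RDF.type, ENERGY.Dwelling))

# Minimal inference: propagate instance types across equivalent classes
for c1, _, c2 in list(g.triples((None, OWL.equivalentClass, None))):
    for instance, _, _ in list(g.triples((None, RDF.type, c1))):
        g.add((instance, RDF.type, c2))

# The water application can now find the same lot under its own vocabulary
print(list(g.subjects(RDF.type, WATER.ResidentialBuilding)))
```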
Abstract:
In Apriaden Pty Ltd v Seacrest Pty Ltd the Victorian Court of Appeal decided that termination of a lease under common law contractual principles following repudiation is an alternative to reliance upon an express forfeiture provision in the lease, and that it falls outside the sphere of the statutory protections given against the enforcement of a forfeiture. The balance of authority supports the first aspect of the decision. This article focuses on the second aspect, which is a significant development in the law of leases. The article considers the implications of this decision for essential terms clauses in leases, argues that common law termination for breach of essential terms should be subject to compliance with those statutory requirements and, as an alternative, suggests a way forward through appropriate law reform, considering whether the recent Victorian reform goes far enough.
Abstract:
A key exchange protocol allows a set of parties to agree upon a secret session key over a public network. Two-party key exchange (2PKE) protocols have been rigorously analyzed under various models considering different adversarial actions. However, the analysis of group key exchange (GKE) protocols has not been as extensive as that of 2PKE protocols. In particular, the security attribute of key compromise impersonation (KCI) resilience has so far been ignored in the case of GKE protocols. We first model the security of GKE protocols addressing KCI attacks by both outsider and insider adversaries. We then show that a few existing protocols are not secure even against outsider KCI attacks. The attacks on these protocols demonstrate the necessity of considering KCI resilience for GKE protocols. Finally, we give a new proof of security for an existing GKE protocol under the revised model, assuming random oracles.
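For readers unfamiliar with key exchange, the sketch below shows an unauthenticated two-party Diffie-Hellman exchange in Python. It is illustrative only, with toy parameters, and is not one of the GKE protocols analysed here; real protocols add authentication precisely to resist attacks such as key compromise impersonation.

```python
# Illustrative, unauthenticated two-party Diffie-Hellman key exchange, included
# only to show what "agreeing on a secret session key over a public network"
# involves. The toy group parameters are insecure, and this is not one of the
# GKE protocols analysed in the paper.
import hashlib
import secrets

# Toy group parameters (a 64-bit prime; real deployments use standardised
# groups of 2048 bits or more)
p = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, far too small for real use
g = 5

# Each party picks a private exponent and publishes g^x mod p
a_private = secrets.randbelow(p - 2) + 1
b_private = secrets.randbelow(p - 2) + 1
a_public = pow(g, a_private, p)
b_public = pow(g, b_private, p)

# Both parties derive the same shared secret from the other's public value
a_shared = pow(b_public, a_private, p)
b_shared = pow(a_public, b_private, p)
assert a_shared == b_shared

# A session key is usually derived by hashing the shared secret with a KDF
session_key = hashlib.sha256(a_shared.to_bytes(8, "big")).hexdigest()
print(session_key)
```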
Abstract:
Despite an ostensibly technology-driven society, the ability to communicate orally is still seen as an essential skill for students at school and university, as it is for graduates in the workplace. The need to develop effective oral communication skills is often tied to future work-related tasks. One tangible way in which educators have assessed proficiency in this area is through prepared oral presentations. While some use the terms oral communication and oral presentation interchangeably, other writers question the role that more formal presentations play in the overall development of oral communication skills. Adding to the discussion, this paper is part of a larger study examining the knowledge and skills students bring into the academy from previous educational experiences. The study examines some of the teaching and assessment methods used in secondary schools to develop oral communication skills through formal oral presentations. Specifically, it looks at assessment models and how these are used as a form of instruction, as well as how they contribute to an accurate evaluation of student abilities. The purpose of this paper is to explore key terms and identify tensions between expectations and practice. Placing the emphasis on the ‘oral’ aspect of this form of communication, the paper looks in particular at the ‘delivery’ element of the process.
Abstract:
Non-Alcoholic Fatty Liver Disease (NAFLD) is a condition that is frequently seen but seldom investigated. Until recently, NAFLD was considered benign, self-limiting and unworthy of further investigation, an opinion based on retrospective studies with relatively small numbers and scant follow-up of histology data (1). The prevalence in adults in the USA is 30%, and NAFLD is recognised as a common and increasing form of liver disease in the paediatric population (1). Australian data from New South Wales suggest a prevalence of NAFLD in “healthy” 15-year-olds of 10% (2). Non-alcoholic fatty liver disease is a condition in which fat progressively invades the liver parenchyma. The degree of infiltration ranges from simple steatosis (fat only), to steatohepatitis (fat and inflammation), to steatohepatitis plus fibrosis (fat, inflammation and fibrosis), to cirrhosis (replacement of the liver architecture by scarred, fibrotic and non-functioning tissue). Non-alcoholic fatty liver is diagnosed by exclusion rather than inclusion: none of the currently available diagnostic techniques (liver biopsy, liver function tests (LFTs), or imaging by ultrasound, computerised tomography (CT) or magnetic resonance imaging (MRI)) is specific for non-alcoholic fatty liver. An association exists between NAFLD, Non-Alcoholic Steatohepatitis (NASH) and irreversible liver damage, cirrhosis and hepatoma. However, a more pervasive aspect of NAFLD is its association with the Metabolic Syndrome. This syndrome is characterised by increased insulin resistance (IR), and NAFLD is thought to be its hepatic representation. Those with NAFLD have an increased risk of death (3), and NAFLD is an independent predictor of atherosclerosis and cardiovascular disease (1). Liver biopsy is considered the gold standard for the diagnosis, grading and staging of non-alcoholic fatty liver disease (4). Fatty liver is diagnosed when there is macrovesicular steatosis with displacement of the nucleus to the edge of the cell and at least 5% of the hepatocytes are seen to contain fat (4). Steatosis represents fat accumulation in liver tissue without inflammation. However, the condition is only called non-alcoholic fatty liver disease when alcohol intake above 20-30 g per day has been excluded (5). Non-alcoholic and alcoholic fatty liver are identical on histology (4). LFTs are indicative, not diagnostic: they indicate that a condition may be present but cannot identify what that condition is. When a patient presents with raised fasting blood glucose, low HDL (high-density lipoprotein) and elevated fasting triacylglycerols, they are likely to have NAFLD (6). Of the imaging techniques, MRI is the least variable and the most reproducible. With CT scanning, liver fat content can be semi-quantitatively estimated: with increasing hepatic steatosis, liver attenuation values decrease by 1.6 Hounsfield units for every milligram of triglyceride deposited per gram of liver tissue (7). Ultrasound permits early detection of fatty liver, often in the preclinical stages before symptoms are present and serum alterations occur. Earlier, accurate reporting of this condition will allow appropriate intervention, resulting in better patient health outcomes.
References:
1. Chalasani N. Does fat alone cause significant liver disease? It remains unclear whether simple steatosis is truly benign. American Gastroenterological Association Perspectives, February/March 2008. www.gastro.org/wmspage.cfm?parm1=5097 (viewed 20 October 2008).
2. Booth M, George J, Denney-Wilson E. The population prevalence of adverse concentrations and associations with adiposity of liver tests among Australian adolescents. Journal of Paediatrics and Child Health. 2008 November.
3. Catalano D, Trovato GM, Martines GF, Randazzo M, Tonzuso A. Bright liver, body composition and insulin resistance changes with nutritional intervention: a follow-up study. Liver Int. 2008 February:1280-9.
4. Choudhury J, Sanyal A. Clinical aspects of fatty liver disease. Semin Liver Dis. 2004;24(4):349-62.
5. Dionysos Study Group. Drinking habits as cofactors of risk for alcohol induced liver damage. Gut. 1997;41:845-50.
6. Preiss D, Sattar N. Non-alcoholic fatty liver disease: an overview of prevalence, diagnosis, pathogenesis and treatment considerations. Clin Sci. 2008;115:141-50.
7. American Gastroenterological Association. Technical review on nonalcoholic fatty liver disease. Gastroenterology. 2002;123:1705-25.
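As a minimal worked example of the CT attenuation relationship quoted above (a decrease of about 1.6 HU per milligram of triglyceride per gram of liver tissue), the sketch below assumes an illustrative baseline liver attenuation of 55 HU; the baseline figure is an assumption, not taken from the cited sources.

```python
# Minimal worked example of the CT attenuation relationship quoted above:
# liver attenuation falls by about 1.6 Hounsfield units for each milligram of
# triglyceride deposited per gram of liver tissue. The 55 HU baseline is an
# assumed, illustrative figure.
def estimated_liver_attenuation(triglyceride_mg_per_g, baseline_hu=55.0):
    """Return the expected liver attenuation (HU) for a given fat load."""
    return baseline_hu - 1.6 * triglyceride_mg_per_g

for fat_load in (0, 5, 10, 20):
    print(f"{fat_load} mg/g triglyceride -> ~{estimated_liver_attenuation(fat_load):.0f} HU")
```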
Abstract:
Changes in fluidization behaviour were characterised for parallelepiped particles with three aspect ratios (1:1, 2:1 and 3:1) and for spherical particles. All drying experiments were conducted at 50°C and 15% RH using a heat pump dehumidifier system. Fluidization experiments were undertaken for bed heights of 100, 80, 60 and 40 mm and at 10 moisture content levels. Due to irregularities in shape, the minimum fluidization velocity of the parallelepiped particulates (potato) could not be fitted to any empirical model, so a generalized equation was used to predict the minimum fluidization velocity. A modified quasi-stationary method (MQSM) is proposed to describe the drying kinetics of parallelepiped particulates, which dry mostly in the falling rate period, at 30°C, 40°C and 50°C in a batch-type fluid bed dryer.
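One common generalized correlation for minimum fluidization velocity is the Wen and Yu (1966) form; the sketch below implements that form with illustrative property values, which are assumptions rather than the experimental data reported in this study.

```python
# Hedged sketch: Wen & Yu-type generalized correlation for minimum fluidization
# velocity. Property values below are illustrative only, not the experimental
# values used in the study.
import math

def minimum_fluidization_velocity(d_p, rho_p, rho_g, mu, g=9.81):
    """Estimate u_mf (m/s) from the Wen & Yu (1966) correlation.

    d_p   : effective particle diameter (m)
    rho_p : particle density (kg/m^3)
    rho_g : gas density (kg/m^3)
    mu    : gas dynamic viscosity (Pa.s)
    """
    # Archimedes number
    ar = d_p**3 * rho_g * (rho_p - rho_g) * g / mu**2
    # Particle Reynolds number at minimum fluidization
    re_mf = math.sqrt(33.7**2 + 0.0408 * ar) - 33.7
    return re_mf * mu / (rho_g * d_p)

# Example: ~8 mm equivalent-diameter potato particle in warm air (assumed values)
print(minimum_fluidization_velocity(d_p=8e-3, rho_p=1080.0, rho_g=1.09, mu=1.95e-5))
```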
Abstract:
The lack of a satisfactory consensus on characterizing system intelligence, and of structured analytical decision models, has prevented developers and practitioners from understanding and configuring optimum intelligent building systems in a fully informed manner. So far, little research has been conducted on this aspect. This research is designed to identify the key intelligence indicators and to develop analytical models for computing the system intelligence score of a smart building system in an intelligent building. The integrated building management system (IBMS) was used as an illustrative example to present the framework. The models presented in this study apply system intelligence theory and the conceptual analytical framework. A total of 16 key intelligence indicators were first identified from a general survey. Two multi-criteria decision making (MCDM) approaches, the analytic hierarchy process (AHP) and the analytic network process (ANP), were then employed to develop the system intelligence analytical models. The top intelligence indicators of the IBMS include self-diagnosis of operational deviations, an adaptive limiting control algorithm, and year-round time schedule performance. The developed conceptual framework was then transformed into a practical model, whose effectiveness was evaluated by means of expert validation. The main contribution of this research is to promote understanding of the intelligence indicators and to lay the foundation for a systemic framework that provides developers and building stakeholders with a consolidated, inclusive tool for the system intelligence evaluation of proposed component design configurations.
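The sketch below illustrates the AHP weighting step used in such models: priority weights for three hypothetical intelligence indicators are derived from a Saaty-scale pairwise comparison matrix via the principal eigenvector, with a consistency check. The comparison values are invented for illustration and are not the expert judgements collected in this research.

```python
# Hedged sketch of the AHP weighting step: derive priority weights for a few
# hypothetical intelligence indicators from a pairwise comparison matrix using
# the principal eigenvector method. The matrix values are illustrative only.
import numpy as np

indicators = ["self-diagnosis", "adaptive limiting control", "time schedule performance"]

# Saaty-scale pairwise comparisons (row i judged against column j)
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix)
lambda_max = eigvals.real.max()
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 0.58

for name, w in zip(indicators, weights):
    print(f"{name}: {w:.3f}")
print(f"consistency ratio: {cr:.3f}")  # < 0.10 is conventionally acceptable
```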
Abstract:
Principal topic: Effectuation theory suggests that entrepreneurs develop their new ventures in an iterative way, selecting possibilities through flexibility and interaction with the market, focusing on affordable loss rather than maximal return on the capital invested, and developing pre-commitments and alliances with stakeholders (Sarasvathy, 2001, 2008; Sarasvathy et al., 2005, 2006). In contrast, causation may be described as a rationalistic reasoning method for creating a company: after a comprehensive market analysis to discover opportunities, the entrepreneur selects the alternative with the highest expected return and implements it through the use of a business plan. However, little is known about the consequences of following either of these two processes. One aspect that remains unclear is the relationship between newness and effectuation. On the one hand, it can be argued that the combination of a means-centred, interactive (through pre-commitments and alliances with stakeholders from the early phases of venture creation) and open-minded process (through the flexibility to exploit contingencies) should encourage and facilitate the development of innovative solutions. On the other hand, having a close relationship with “future first customers” and focusing too much on the resources and knowledge already within the firm may be a constraint that is not conducive to innovation, or at least not to radical innovation. While it has been suggested that an effectuation strategy is more likely to be used by innovative entrepreneurs (Sarasvathy, 2001), this hypothesis has not yet been demonstrated. Method: In our attempt to capture newness in its different aspects, we considered four domains where newness may occur: new product/service; new methods for promotion and sales; new production methods/sourcing; and market creation. We identified how effectuation may be differently associated with these four domains of newness. To test our four sets of hypotheses, a dataset of 1,329 firms (702 nascent and 627 young firms) randomly selected in Australia was examined using ANOVA with the Tukey HSD test. Results and implications: Results indicate the existence of a curvilinear relationship between effectuation and newness, in which low and high levels of newness are associated with low levels of effectuation, while medium levels of newness are associated with high levels of effectuation. Implications for academia, practitioners and policy makers are also discussed.
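For illustration, the sketch below runs a one-way ANOVA and a Tukey HSD post-hoc comparison of effectuation scores across three newness groups using synthetic data; the numbers are invented and are not the 1,329-firm dataset analysed in the study.

```python
# Hedged sketch of the ANOVA / Tukey HSD comparison described above, run on
# synthetic data. The group labels and scores are invented for illustration;
# they are not the study's dataset.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
# Hypothetical effectuation scores for firms grouped by level of newness
groups = {
    "low newness": rng.normal(2.8, 0.6, 100),
    "medium newness": rng.normal(3.6, 0.6, 100),
    "high newness": rng.normal(2.9, 0.6, 100),
}

scores = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])

# One-way ANOVA across the three newness groups
f_stat, p_value = stats.f_oneway(*groups.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Tukey HSD post-hoc pairwise comparisons
print(pairwise_tukeyhsd(endog=scores, groups=labels, alpha=0.05))
```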
Abstract:
This paper reports on a project concerned with the relationship between person and space in the context of achieving a contemplative state. The need for such a study originated with the desire to contribute to the design of multicultural spaces which could be used for a range of activities including prayer and meditation. Given that the words ‘prayer’ and ‘meditation’ are highly value-laden and potentially alienating for some people, it was decided to use the more accessible term ‘contemplative’. While the project is still underway, several findings have emerged that can be reported on and are of relevance to the conference both methodologically and substantively. Informed by phenomenological methodology, data were collected from a diverse group of people using photo-elicitation and interviewing. The technique of photo-elicitation proved to be highly effective in helping people to reveal their everyday lived experience of contemplative spaces. This methodological aspect of the project is described more fully in the paper. The initial stage of analysis produced two categories of data: varying conceptions of contemplation and contemplative space; and common understandings of contemplation and contemplative space. From this it was found that achieving a state of contemplation involves both the person and the environment in a dialectic process of unfolding. The unfolding has various physical, psycho-social and existential dimensions or qualities which operate sequentially and simultaneously. In the paper, these are labelled the unfolding of the core, distinction, manifestation, cleansing, creation and sharing, and they have parallels with Mircea Eliade’s 1959 definition of the sacred as 'something that manifests itself, shows itself, as something wholly different from the profane’. It also connects with the views of Nishida Kitaro from the Kyoto School of Philosophy on the theme of ‘absolute nothingness’: ‘the body-mind is dropped off and we are united with the consciousness of absolute nothingness’ (Kitaro in Heisig, 2001, p. 169). According to Marion (2005), ‘nothingness’ is defined by givenness. In the paper, this fold of givenness is interpreted in the context of the qualities of the environment that accomplish the act of coming forward into visibility through the dialectic relationship with a person. (Eliade, 1959; Heisig, 2001; Marion, 2002)
Abstract:
This paper presents a detailed description of the influence of the critical parameters that govern the vulnerability of columns under lateral impact loads. Numerical simulations are conducted using the finite element program LS-DYNA, incorporating steel reinforcement, material models and strain rate effects. A simplified method based on impact pulses generated from full-scale impact tests is used for impact reconstruction, and the effects of the various pulse loading parameters are investigated under low- to medium-velocity impacts. A constitutive material model that can simulate failure under a tri-axial state of stress is used for the concrete. Confinement effects are also introduced into the numerical simulation, and columns of Grade 30 to 50 concrete under pure axial loading are analysed in detail. This research confirmed that the vulnerability of axially loaded columns can be mitigated by reducing the slenderness ratio and concrete grade, and by choosing the design option with a minimal amount of longitudinal steel. Additionally, it is evident that an increase of approximately 50% in impact capacity can be gained for columns in medium-rise buildings by enhancing the confinement effects alone. Results also indicated that the ductility, as well as the mode of failure under impact, can be changed by varying the volumetric ratio of lateral steel. Moreover, to increase the impact capacity of vulnerable columns, a higher confining stress is required. The general provisions of current design codes do not sufficiently cover this aspect, and hence this research provides additional guidelines to overcome the inadequacies of code provisions.
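The sketch below generates an idealised impact force pulse of the kind used for such impact reconstruction; the triangular shape and the peak force and duration values are assumptions for illustration, not the pulse parameters derived from the full-scale tests.

```python
# Hedged sketch of an idealised impact force pulse for impact reconstruction.
# The triangular shape, peak force and durations are assumptions for
# illustration, not the pulse parameters used in the study.
import numpy as np

def triangular_pulse(t, peak_force, rise_time, total_duration):
    """Force (N) at time t (s) for a triangular impact pulse."""
    t = np.asarray(t, dtype=float)
    rising = (t >= 0) & (t < rise_time)
    falling = (t >= rise_time) & (t <= total_duration)
    force = np.zeros_like(t)
    force[rising] = peak_force * t[rising] / rise_time
    force[falling] = peak_force * (total_duration - t[falling]) / (total_duration - rise_time)
    return force

# Example load history: 500 kN peak, 5 ms rise, 20 ms total duration (assumed)
times = np.linspace(0.0, 0.025, 251)
history = triangular_pulse(times, peak_force=500e3, rise_time=0.005, total_duration=0.020)
# 'history' could then be applied as a nodal load curve in an FE model.
```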
Abstract:
Although internet chat is a significant aspect of many internet users’ lives, the manner in which participants in quasi-synchronous chat situations orient to issues of social and moral order remains to be studied in depth. The research presented here is therefore at the forefront of a continually developing area of study. This work contributes new insights into how members construct and make accountable the social and moral orders of an adult-oriented Internet Relay Chat (IRC) channel by addressing three questions: (1) What conversational resources do participants use in addressing matters of social and moral order? (2) How are these conversational resources deployed within IRC interaction? and (3) What interactional work is locally accomplished through use of these resources? A survey of the literature reveals considerable research in the field of computer-mediated communication, exploring both asynchronous and quasi-synchronous discussion forums. The research discussed represents a range of communication interests including group and collaborative interaction, the linguistic construction of social identity, and the linguistic features of online interaction. It is suggested that the present research differs from previous studies in three ways: (1) it focuses on the interaction itself, rather than the ways in which the medium affects the interaction; (2) it offers turn-by-turn analysis of interaction in situ; and (3) it discusses membership categories only insofar as they are shown to be relevant by participants through their talk. Through consideration of the literature, the present study is firmly situated within the broader computer-mediated communication field. Ethnomethodology, conversation analysis and membership categorization analysis were adopted as appropriate methodological approaches to explore the research focus on interaction in situ, and in particular to investigate the ways in which participants negotiate and co-construct social and moral orders in the course of their interaction. IRC logs collected from one chat room were analysed using a two-pass method, based on a modification of the approaches proposed by Pomerantz and Fehr (1997) and ten Have (1999). From this detailed examination of the data corpus three interaction topics are identified by means of which participants clearly orient to issues of social and moral order: challenges to rule violations, ‘trolling’ for cybersex, and experiences regarding the 9/11 attacks. Instances of these interactional topics are subjected to fine-grained analysis, to demonstrate the ways in which participants draw upon various interactional resources in their negotiation and construction of channel social and moral orders. While these analytical topics stand alone in individual focus, together they illustrate different instances in which participants’ talk serves to negotiate social and moral orders or collaboratively construct new orders. Building on the work of Vallis (2001), Chapter 5 illustrates three ways that rule violation is initiated as a channel discussion topic: (1) through a visible violation in open channel, (2) through an official warning or sanction by a channel operator regarding the violation, and (3) through a complaint or announcement of a rule violation by a non-channel operator participant. Once the topic has been initiated, it is shown to become available as a topic for others, including the perceived violator. 
The fine-grained analysis of challenges to rule violations ultimately demonstrates that channel participants orient to the rules as a resource in developing categorizations of both the rule violation and violator. These categorizations are contextual in that they are locally based and understood within specific contexts and practices. Thus, it is shown that compliance with rules and an orientation to rule violations as inappropriate within the social and moral orders of the channel serves two purposes: (1) to orient the speaker as a group member, and (2) to reinforce the social and moral orders of the group. Chapter 6 explores a particular type of rule violation, solicitations for ‘cybersex’ known in IRC parlance as ‘trolling’. In responding to trolling violations participants are demonstrated to use affiliative and aggressive humour, in particular irony, sarcasm and insults. These conversational resources perform solidarity building within the group, positioning non-Troll respondents as compliant group members. This solidarity work is shown to have three outcomes: (1) consensus building, (2) collaborative construction of group membership, and (3) the continued construction and negotiation of existing social and moral orders. Chapter 7, the final data analysis chapter, offers insight into how participants, in discussing the events of 9/11 on the actual day, collaboratively constructed new social and moral orders, while orienting to issues of appropriate and reasonable emotional responses. This analysis demonstrates how participants go about ‘doing being ordinary’ (Sacks, 1992b) in formulating their ‘first thoughts’ (Jefferson, 2004). Through sharing their initial impressions of the event, participants perform support work within the interaction, in essence working to normalize both the event and their initial misinterpretation of it. Normalising as a support work mechanism is also shown in relation to participants constructing the ‘quiet’ following the event as unusual. Normalising is accomplished by reference to the indexical ‘it’ and location formulations, which participants use both to negotiate who can claim to experience the ‘unnatural quiet’ and to identify the extent of the quiet. Through their talk participants upgrade the quiet from something legitimately experienced by one person in a particular place to something that could be experienced ‘anywhere’, moving the phenomenon from local to global provenance. With its methodological design and detailed analysis and findings, this research contributes to existing knowledge in four ways. First, it shows how rules are used by participants as a resource in negotiating and constructing social and moral orders. Second, it demonstrates that irony, sarcasm and insults are three devices of humour which can be used to perform solidarity work and reinforce existing social and moral orders. Third, it demonstrates how new social and moral orders are collaboratively constructed in relation to extraordinary events, which serve to frame the event and evoke reasonable responses for participants. And last, the detailed analysis and findings further support the use of conversation analysis and membership categorization as valuable methods for approaching quasi-synchronous computer-mediated communication.
Abstract:
Experience plays an important role in building management. “How often will this asset need repair?” or “How much time is this repair going to take?” are the types of questions that project and facility managers face daily in planning activities. Failure or success in developing good schedules, budgets and other project management tasks depends on the project manager's ability to obtain reliable information with which to answer these types of questions. Young practitioners tend to rely on information that is based on regional averages and provided by publishing companies, in contrast to experienced project managers, who tend to rely heavily on personal experience. Another aspect of building management is that many practitioners are seeking to improve available scheduling algorithms, estimating spreadsheets and other project management tools. Such “micro-scale” levels of research are important in providing the tools required for the project manager's tasks. However, even with such tools, low-quality input information will produce inaccurate schedules and budgets as output. Thus, it is also important to take a broader approach to research at a more “macro-scale.” Recent trends show that the Architecture, Engineering and Construction (AEC) industry is experiencing explosive growth in its capability to generate and collect data. A great deal of valuable knowledge can be obtained from the appropriate use of this data, and the need has therefore arisen to analyse this increasing amount of available data. Data Mining can be applied as a powerful tool to extract relevant and useful information from this sea of data. Knowledge Discovery in Databases (KDD) and Data Mining (DM) are tools that allow the identification of valid, useful and previously unknown patterns, so that large amounts of project data may be analysed. These technologies combine techniques from machine learning, artificial intelligence, pattern recognition, statistics, databases and visualization to automatically extract concepts, interrelationships and patterns of interest from large databases. The project involves the development of a prototype tool to support facility managers, building owners and designers. This final report presents the AIMM™ prototype system and documents which data mining techniques can be applied and how, the results of their application, and the benefits gained from the system. The AIMM™ system is capable of searching for useful patterns of knowledge and correlations within existing building maintenance data to support decision making about future maintenance operations. The application of the AIMM™ prototype system to building models and their maintenance data (supplied by industry partners) utilises various data mining algorithms, and the maintenance data is analysed using interactive visual tools. The application of the AIMM™ prototype system to help improve maintenance management and the building life cycle includes: (i) data preparation and cleaning; (ii) integrating meaningful domain attributes; (iii) performing extensive data mining experiments using visual analysis (stacked histograms), classification and clustering techniques, and association rule mining algorithms such as “Apriori”; and (iv) filtering and refining the data mining results, including the potential implications of these results for improving maintenance management.
Maintenance data for a variety of asset types were selected for demonstration, with the aim of discovering meaningful patterns to assist facility managers in strategic planning and providing a knowledge base to help shape future requirements and design briefing. Utilising the prototype system developed here, positive and interesting results regarding the patterns and structure of the data have been obtained.
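The sketch below illustrates the kind of association rule mining (“Apriori”) applied to maintenance records; the library choice (mlxtend) and the toy work-order data are assumptions for illustration and are not the AIMM™ system's own implementation or the industry partners' data.

```python
# Hedged sketch of association rule mining ("Apriori") over maintenance records.
# The library (mlxtend) and the toy work-order data are assumptions for
# illustration, not the AIMM system's implementation or the partners' data.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

# Each "transaction" lists attributes recorded against one maintenance work order
work_orders = [
    ["air-handling unit", "filter blocked", "summer"],
    ["air-handling unit", "filter blocked", "summer", "after-hours call-out"],
    ["pump", "seal failure", "winter"],
    ["air-handling unit", "belt failure", "summer"],
    ["pump", "seal failure", "winter", "after-hours call-out"],
]

encoder = TransactionEncoder()
onehot = pd.DataFrame(encoder.fit(work_orders).transform(work_orders),
                      columns=encoder.columns_)

frequent = apriori(onehot, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```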
Abstract:
China has a reputation as an economy based on utility: the large-scale manufacture of low-priced goods. But useful values like functionality, fitness for purpose and efficiency are only part of the story. More important are what Veblen called ‘honorific’ values, arguably the driving force of development, change and value in any economy. To understand the Chinese economy therefore, it is not sufficient to point to its utilitarian aspect. Honorific status-competition is a more fundamental driver than utilitarian cost-competition. We argue that ‘social network markets’ are the expression of these honorific values, relationships and connections that structure and coordinate individual choices. This paper explores how such markets are developing in China in the area of fashion and fashion media. These, we argue, are an expression of ‘risk culture’ for high-end entrepreneurial consumers and producers alike, providing a stimulus to dynamic innovation in the arena of personal taste and comportment, as part of an international cultural system based on constant change. We examine the launch of Vogue China in 2005, and China’s reception as a fashion player among the international editions of Vogue, as an expression of a ‘decisive moment’ in the integration of China into an international social network market based on honorific values.
Abstract:
Experience plays an important role in building management. “How often will this asset need repair?” or “How much time is this repair going to take?” are the types of questions that project and facility managers face daily in planning activities. Failure or success in developing good schedules, budgets and other project management tasks depends on the project manager's ability to obtain reliable information with which to answer these types of questions. Young practitioners tend to rely on information that is based on regional averages and provided by publishing companies, in contrast to experienced project managers, who tend to rely heavily on personal experience. Another aspect of building management is that many practitioners are seeking to improve available scheduling algorithms, estimating spreadsheets and other project management tools. Such “micro-scale” levels of research are important in providing the tools required for the project manager's tasks. However, even with such tools, low-quality input information will produce inaccurate schedules and budgets as output. Thus, it is also important to take a broader approach to research at a more “macro-scale.” Recent trends show that the Architecture, Engineering and Construction (AEC) industry is experiencing explosive growth in its capability to generate and collect data. A great deal of valuable knowledge can be obtained from the appropriate use of this data, and the need has therefore arisen to analyse this increasing amount of available data. Data Mining can be applied as a powerful tool to extract relevant and useful information from this sea of data. Knowledge Discovery in Databases (KDD) and Data Mining (DM) are tools that allow the identification of valid, useful and previously unknown patterns, so that large amounts of project data may be analysed. These technologies combine techniques from machine learning, artificial intelligence, pattern recognition, statistics, databases and visualization to automatically extract concepts, interrelationships and patterns of interest from large databases. The project involves the development of a prototype tool to support facility managers, building owners and designers. This industry-focused report presents the AIMM™ prototype system and documents which data mining techniques can be applied and how, the results of their application, and the benefits gained from the system. The AIMM™ system is capable of searching for useful patterns of knowledge and correlations within existing building maintenance data to support decision making about future maintenance operations. The application of the AIMM™ prototype system to building models and their maintenance data (supplied by industry partners) utilises various data mining algorithms, and the maintenance data is analysed using interactive visual tools. The application of the AIMM™ prototype system to help improve maintenance management and the building life cycle includes: (i) data preparation and cleaning; (ii) integrating meaningful domain attributes; (iii) performing extensive data mining experiments using visual analysis (stacked histograms), classification and clustering techniques, and association rule mining algorithms such as “Apriori”; and (iv) filtering and refining the data mining results, including the potential implications of these results for improving maintenance management.
Maintenance data for a variety of asset types were selected for demonstration, with the aim of discovering meaningful patterns to assist facility managers in strategic planning and providing a knowledge base to help shape future requirements and design briefing. Utilising the prototype system developed here, positive and interesting results regarding the patterns and structure of the data have been obtained.
Abstract:
An important aspect of designing any product is validation. The virtual design process (VDP) is an alternative to hardware prototyping in which designs can be analysed without manufacturing physical samples. In recent years, VDPs have been generated mainly for animation and filming applications. This paper proposes a virtual reality design process model for one such application, used as a validation tool. The technique is used to generate a complete design guideline and validation tool for product design. To support the design process of a product, a virtual environment and a VDP method were developed that support validation and an initial design cycle performed by a designer. A car carrier product model, for which a virtual design was generated, is used as the illustration. The loading and unloading sequence of the model for the prototype was generated using automated reasoning techniques and was completed by interactively animating the product in the virtual environment before the complete design was built. Using the VDP, critical issues such as loading, unloading, Australian Design Rules (ADR) compliance and clearance analysis were addressed. The process saves time and money in physical sampling and, to a large extent, in complete math model generation. Since only schematic models are required, it saves time in math modelling and in handling larger assemblies, given the complexity of the models. This extension of the VDP to design evaluation is novel and was developed and implemented successfully. In this paper, a Toll Logistics and J Smith and Sons car carrier, developed under the author’s responsibility, is used to illustrate our approach to design validation via the VDP.
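The sketch below illustrates one simple form of automated reasoning for deriving a loading sequence: a topological sort over precedence constraints between carrier positions, followed by a clearance check. The positions, constraints and clearance figures are invented for illustration and are not the actual car carrier data used in the paper.

```python
# Hedged sketch of deriving a loading sequence by topological sort over
# precedence constraints, with a simple clearance check. Positions, constraints
# and clearance figures are invented, not the paper's car carrier data.
from graphlib import TopologicalSorter

# Each position maps to the set of positions that must be loaded before it
precedence = {
    "upper_rear": set(),
    "upper_front": {"upper_rear"},
    "lower_rear": {"upper_rear", "upper_front"},   # upper deck assumed loaded first
    "lower_front": {"lower_rear"},
}

# Available vertical clearance per position vs. vehicle height (metres, assumed)
clearance = {"upper_rear": 1.65, "upper_front": 1.60, "lower_rear": 1.80, "lower_front": 1.75}
vehicle_height = 1.55

loading_order = list(TopologicalSorter(precedence).static_order())

for position in loading_order:
    ok = clearance[position] >= vehicle_height + 0.05   # 50 mm margin, assumed
    print(f"load {position}: clearance {'OK' if ok else 'FAIL'}")
```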