936 results for computer modelling


Relevance:

20.00%

Publisher:

Abstract:

Building Information Modelling (BIM) is an IT-enabled technology that allows storage, management, sharing, access, update and use of all the data relevant to a project throughout the project life-cycle, in the form of a data repository. BIM enables improved interdisciplinary collaboration across distributed teams, intelligent documentation and information retrieval, greater consistency in building data, better conflict detection and enhanced facilities management. While the technology itself may not be new, and similar approaches have been in use in other sectors such as the aircraft and automobile industries for well over a decade, the AEC/FM (Architecture, Engineering and Construction/Facilities Management) industry has yet to catch up in its ability to exploit the benefits of the IT revolution. Although the potential benefits of the technology in terms of knowledge sharing, project management, project coordination and collaboration are all but obvious, the adoption rate has been sluggish, in spite of some well-directed efforts and the availability of supporting commercial tools. Since the technology itself has been well tested over the years in other domains, the plausible causes must be rooted well beyond the explanation offered by the 'bell curve' of innovation adoption. This paper discusses the preliminary findings of an ongoing research project funded by the Cooperative Research Centre for Construction Innovation (CRC-CI), which aims to identify these gaps and produce specifications and guidelines to enable greater adoption of the BIM approach in practice. A detailed literature review is conducted that examines similar research reported in recent years. A desktop audit of existing commercial tools that support BIM has been conducted to identify technological issues and concerns, and a workshop was organised with industry partners and various players in the AEC industry to analyse needs and expectations and to gather feedback on possible deterrents and inhibitions surrounding BIM adoption.


Drivers' ability to react to unpredictable events deteriorates when they are exposed to highly predictable and uneventful driving tasks. Highway design in particular reduces the driving task mainly to lane keeping, which contributes to hypovigilance and road crashes, as drivers are often unaware that their driving behaviour is impaired. Monotony increases fatigue; however, the fatigue community has mainly focused on endogenous factors such as sleep deprivation. This paper focuses on monotony, an exogenous factor contributing to hypovigilance. Objective measurements of the effects of monotonous driving conditions on the driver and on the vehicle's dynamics are systematically reviewed, with the aim of justifying the need for a mathematical framework that could predict hypovigilance in real time. Although electroencephalography (EEG) is one of the most reliable measures of vigilance, it is obtrusive, which suggests predicting hypovigilance from observable variables instead. A vision for future research in modelling driver vigilance decrement due to monotonous driving conditions is outlined, and a mathematical model for predicting drivers' hypovigilance from information such as lane positioning, steering wheel movements and eye blinks is provided. Such modelling of driver vigilance should enable the future development of an in-vehicle device that detects driver hypovigilance in advance, offering the potential to enhance road safety and prevent road crashes.


Computer-aided joint replacement surgery has become very popular in recent years and is being performed in increasing numbers all over the world. The accuracy of the system depends, to a major extent, on accurate registration and on the immobility of the tracker attachment devices in the bone. This study was designed to assess the forces needed to displace the tracker attachment devices in bone simulators. Bone simulators were used to maintain uniformity of the bone structure during the study. The fixation devices tested were a 3 mm diameter self-drilling, self-tapping threaded pin; a 4 mm diameter self-tapping cortical threaded pin; a 5 mm diameter self-tapping cancellous threaded pin; and a triplanar fixation device ('Ortholock') used with three 3 mm pins. All the devices were tested for pull-out, translational and rotational forces in unicortical and bicortical fixation modes. Also tested were the forces generated by a typical knock and by leaning on the devices. The forces required to produce translation increased with increasing pin diameter: 105 N, 185 N and 225 N for unicortical fixation, and 130 N, 200 N and 225 N for bicortical fixation, of the 3 mm, 4 mm and 5 mm diameter pins respectively. The forces required to pull out the pins were 1475 N, 1650 N and 2050 N for unicortical fixation, and 1020 N, 3044 N and 3042 N for bicortical fixation, of the 3 mm, 4 mm and 5 mm diameter pins respectively. The Ortholock was tested to 900 N in translation and 920 N in pull-out without failing. The rotational forces required to displace the tracker on single pins were of the order of 30 N before failure, whereas the Ortholock device withstood rotational forces of up to 135 N without failing. The manual leaning forces and sudden knock forces generated were of the order of 210 N and 150 N respectively. The strength of the fixation pins in translation increases with increasing diameter from 3 mm to 5 mm. There is no significant difference between the pull-out forces of the 4 mm and 5 mm diameter pins, though both exceed that of the 3 mm diameter pin; this reflects failure of the material at that stage rather than of the fixation device. The rotational forces required to displace the tracker on single pins are very small, much less than those that can be produced by the surgeon or assistants. Although the Ortholock device was tested to 135 N in rotation without failing, great care must be taken not to apply any force to the tracker devices during the operation, to ensure the accuracy of the procedure.


This article presents one approach to addressing the important issue of interdisciplinarity in the primary school mathematics curriculum, namely, through realistic mathematical modelling problems. Such problems draw upon other disciplines for their contexts and data. The article initially considers the nature of modelling with complex systems and discusses how such experiences differ from existing problem-solving activities in the primary mathematics curriculum. Principles for designing interdisciplinary modelling problems are then addressed, with reference to two mathematical modelling problems: one based in the scientific domain and the other in the literary domain. Examples of the models children have created in solving these problems follow. A reflection on the differences in the diversity and sophistication of these models raises issues regarding the design of interdisciplinary modelling problems. The article concludes with suggested opportunities for generating multidisciplinary projects within the regular mathematics curriculum.


President's Message

Hello fellow AITPM members,

We've been offered a lot of press lately about the Federal Government's plan for the multibillion-dollar rollout of its high-speed broadband network, which at the moment is being rated at a speed of 100 Mb/s. This seems fantastic in comparison to the not atypical 250 to 500 kb/s that I receive on my metropolitan cable broadband, which incidentally my service provider rates at theoretical speeds of up to 8 Mb/s. I have no doubt that such a scheme will generate significant advantages for business and consumers. However, I also have some reservations.

Only a few years ago I marvelled at my first 256 MB USB stick, which cost my employer about $90. Last month I purchased a 16 GB stick with a free computer carry bag for $80, which on the back of my envelope has given me about 72 times the value of my first USB stick, not including the carry bag! I am pretty sure the technology industry will find a way to eventually push a lot more than 100 Mb/s down the optic fibre network, just as they have done in pushing several Mb/s of ADSL2 down antique copper wire. This makes me wonder about the general problem of in-built obsolescence of all things high-tech due to rapid advances in the tech industry.

As a transport professional I then think to myself that our industry has been moving forward at somewhat of a slower pace. We certainly have had major milestones with significant impacts, such as the move from horse and cart to the self-propelled motor vehicle, sealing and formal geometric design of roads, development of motorways, signalisation of intersections, coordination of networks, and simulation modelling for real-time adaptive control (perhaps major change has come at a frequency of 30 years or so?).

But now, with ITS truly penetrating the transport market, largely thanks to the in-car GPS navigator, smart phone, e-toll and e-ticket, I believe that to avoid our own obsolescence we're going to need to "plan for ITS" rather than just do what we seem to have been doing up until now, that is, getting it out there. And we'll likely need to do it at a faster pace. It will involve understanding how to mine enormous data sets, better understanding the human/machine interface, keeping pace with automotive technology more closely, resolving the ethical and privacy chestnuts, and, in the main, actually planning for ITS to make people's lives easier rather than harder. And amongst all this we'll need to keep pace with the types of technology advances similar to my USB stick example above. All the while we'll be making a brand new set of friends in the disciplines that will morph into ITS along with us. Hopefully these will all be "good" problems for our profession to have.

I should close by reminding everyone again that AITPM's flagship event, the 2009 AITPM National Conference, Traffic Beyond Tomorrow, is being held in Adelaide from 5 to 7 August. www.aitpm.com has all of the details about how to register, sponsor a booth or session, etc.

Best regards all,
Jon Bunker


Multi-resolution modelling has become essential as modern 3D applications demand 3D objects at higher levels of detail (LODs). Multi-modal devices such as PDAs and UMPCs do not have sufficient resources to handle the original 3D objects, and the increased use of collaborative applications has created many challenges for remote manipulation of 3D objects of differing quality. This paper studies how multi-resolution techniques can be improved by performing multi-edge decimation and using annotative commands. It also investigates how devices holding poorer-quality 3D objects can participate in collaborative actions.


The equations governing saltwater intrusion in coastal aquifers are complex. Backward Euler time-stepping approaches are often used to advance the solution of these equations in time, but they typically require small time steps to obtain an accurate solution. We show that a method-of-lines approach incorporating variable-order backward differentiation formulas can greatly improve the efficiency of the time-stepping process.
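The efficiency argument can be made concrete on a simple stiff test equation (not the aquifer equations themselves). The sketch below compares backward Euler with a second-order BDF on y' = λy, with BDF2 standing in for the variable-order family; the problem and step size are purely illustrative:

```python
import math

def backward_euler(lam, y0, h, n):
    # Implicit Euler for y' = lam*y: y_{k+1} = y_k / (1 - h*lam)
    y = y0
    for _ in range(n):
        y = y / (1.0 - h * lam)
    return y

def bdf2(lam, y0, h, n):
    # Second-order BDF for y' = lam*y:
    #   (3*y_{k+1} - 4*y_k + y_{k-1}) / (2h) = lam*y_{k+1}
    # Bootstrap the first step with backward Euler.
    y_prev = y0
    y = y0 / (1.0 - h * lam)
    for _ in range(n - 1):
        y_next = (4.0 * y - y_prev) / (3.0 - 2.0 * h * lam)
        y_prev, y = y, y_next
    return y

lam, y0, T, n = -5.0, 1.0, 1.0, 100
h = T / n
exact = y0 * math.exp(lam * T)
err_be = abs(backward_euler(lam, y0, h, n) - exact)
err_bdf2 = abs(bdf2(lam, y0, h, n) - exact)
print(err_be, err_bdf2)  # the second-order formula is markedly more accurate
```

At the same step size the higher-order formula yields a much smaller error, which is the mechanism by which variable-order BDF codes can take far larger steps for the same accuracy.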


In Web service based systems, new value-added Web services can be constructed by integrating existing Web services. A Web service may have many implementations that are functionally identical but have different Quality of Service (QoS) attributes, such as response time, price, reputation, reliability and availability. A significant research problem in Web service composition is therefore how to select an implementation for each component Web service so that the overall QoS of the composite Web service is optimal: the so-called QoS-aware Web service composition problem. In some composite Web services there are dependencies and conflicts between the Web service implementations, constraints that existing approaches cannot handle. This paper tackles the QoS-aware Web service composition problem with inter-service dependencies and conflicts using a penalty-based genetic algorithm (GA). Experimental results demonstrate the effectiveness and scalability of the penalty-based GA.
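The penalty idea can be sketched as follows: each genome picks one implementation per abstract service, and the fitness subtracts a large penalty per violated dependency or conflict, so infeasible compositions lose tournaments to feasible ones. The QoS values, constraints and GA parameters below are invented for illustration and are not the paper's encoding:

```python
import random

random.seed(42)

# Hypothetical aggregated QoS score (higher is better) per implementation.
QUALITY = [
    [0.6, 0.9, 0.7],  # service 0: implementations 0..2
    [0.8, 0.5, 0.9],  # service 1
    [0.7, 0.6, 0.8],  # service 2
]
# Dependency: choosing impl 1 of service 0 requires impl 2 of service 1.
DEPENDENCIES = [((0, 1), (1, 2))]
# Conflict: impl 2 of service 1 cannot be combined with impl 2 of service 2.
CONFLICTS = [((1, 2), (2, 2))]
PENALTY = 10.0

def violations(genome):
    v = 0
    for (s1, i1), (s2, i2) in DEPENDENCIES:
        if genome[s1] == i1 and genome[s2] != i2:
            v += 1
    for (s1, i1), (s2, i2) in CONFLICTS:
        if genome[s1] == i1 and genome[s2] == i2:
            v += 1
    return v

def fitness(genome):
    # Aggregate QoS minus a penalty for each violated constraint.
    return sum(QUALITY[s][i] for s, i in enumerate(genome)) - PENALTY * violations(genome)

def evolve(pop_size=20, generations=30, mut_rate=0.2):
    pop = [[random.randrange(len(QUALITY[s])) for s in range(len(QUALITY))]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)              # binary tournament
            parent1 = max(a, b, key=fitness)
            a, b = random.sample(pop, 2)
            parent2 = max(a, b, key=fitness)
            cut = random.randrange(1, len(parent1))   # one-point crossover
            child = parent1[:cut] + parent2[cut:]
            if random.random() < mut_rate:            # point mutation
                s = random.randrange(len(child))
                child[s] = random.randrange(len(QUALITY[s]))
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Because the penalty dominates any achievable QoS gain, selection pressure drives the population toward constraint-satisfying compositions without needing a separate repair step.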


With the widespread application of electronic learning (e-Learning) technologies to education at all levels, an increasing number of online educational resources and messages are generated in the corresponding e-Learning environments. Nevertheless, it is quite difficult, if not totally impossible, for instructors to read through and analyse these online messages to gauge the progress of their students on the fly. The main contribution of this paper is a novel concept-map generation mechanism underpinned by a fuzzy domain ontology extraction algorithm. The proposed mechanism can automatically construct concept maps based on the messages posted to online discussion forums. By browsing the concept maps, instructors can quickly identify the progress of their students and adjust the pedagogical sequence on the fly. Our initial experimental results reveal that the accuracy and quality of the automatically generated concept maps are promising. This work opens the door to the development and application of intelligent software tools to enhance e-Learning.
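A greatly simplified sketch of concept-map construction from forum posts is shown below. It uses plain term frequency and within-post co-occurrence rather than the paper's fuzzy domain ontology extraction algorithm, and the posts and stopword list are toy data:

```python
from collections import Counter
from itertools import combinations

# Toy forum posts; a real system would apply the paper's fuzzy ontology
# extraction rather than this simple co-occurrence heuristic.
posts = [
    "recursion uses a base case and a recursive call",
    "a stack overflow happens without a base case",
    "the call stack grows with each recursive call",
]
STOPWORDS = {"a", "the", "and", "uses", "with", "each", "happens", "without"}

def extract_terms(text):
    return [w for w in text.lower().split() if w not in STOPWORDS]

# Concepts = terms appearing in at least two posts;
# edges = co-occurrence counts of concepts within a post.
term_freq = Counter(t for p in posts for t in set(extract_terms(p)))
concepts = {t for t, n in term_freq.items() if n >= 2}
edges = Counter()
for p in posts:
    present = sorted(concepts & set(extract_terms(p)))
    for u, v in combinations(present, 2):
        edges[(u, v)] += 1

for (u, v), w in sorted(edges.items(), key=lambda kv: -kv[1]):
    print(f"{u} --{w}-- {v}")
```

Edge weights give an instructor a quick view of which concepts students discuss together; the fuzzy-ontology approach in the paper additionally grades concept membership rather than using hard counts.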


The driving task requires sustained attention over prolonged periods and may be performed in highly predictable or repetitive environments. Such conditions can induce drowsiness or hypovigilance and impair the ability to react to critical events. Identifying vigilance decrement in monotonous conditions has been a major subject of research, but no research to date has attempted to predict it. This pilot study aims to show that vigilance decrements due to monotonous tasks can be predicted through mathematical modelling. A short vigilance task sensitive to brief lapses of vigilance, the Sustained Attention to Response Task, is used to assess participants' performance; it models the driver's ability to cope with unpredicted events by performing the expected action. A Hidden Markov Model (HMM) is proposed to predict participants' hypovigilance: the driver's vigilance evolution is modelled as a hidden state correlated with an observable variable, the participant's reaction times. The experiment shows that the monotony of the task can lead to a substantial vigilance decline in less than five minutes, and that this impairment can be predicted four minutes in advance with 86% accuracy using HMMs. Mathematical models such as HMMs can therefore efficiently predict hypovigilance through surrogate measures. The presented model could lead to the development of an in-vehicle device that detects driver hypovigilance in advance and warns the driver accordingly, offering the potential to enhance road safety and prevent road crashes.
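The modelling idea can be sketched with a two-state HMM whose hidden state is the driver's vigilance and whose observations are discretised reaction times; filtering gives the current belief, and propagating the transition matrix forward gives a prediction ahead of time. All probabilities below are illustrative, not the fitted parameters from the study:

```python
# Hidden states: 0 = vigilant, 1 = hypovigilant.
# Observations: discretised reaction times, 0 = fast, 1 = slow.
INIT = [0.9, 0.1]
TRANS = [[0.8, 0.2],    # P(next state | current state)
         [0.1, 0.9]]
EMIT = [[0.85, 0.15],   # P(observation | state)
        [0.30, 0.70]]

def forward_filter(obs):
    """Filtered P(state | observations so far), via the forward algorithm."""
    alpha = [INIT[s] * EMIT[s][obs[0]] for s in range(2)]
    norm = sum(alpha)
    alpha = [a / norm for a in alpha]
    for o in obs[1:]:
        alpha = [EMIT[s][o] * sum(alpha[p] * TRANS[p][s] for p in range(2))
                 for s in range(2)]
        norm = sum(alpha)
        alpha = [a / norm for a in alpha]
    return alpha

def predict_ahead(alpha, steps):
    """Propagate the filtered belief forward with no new observations."""
    for _ in range(steps):
        alpha = [sum(alpha[p] * TRANS[p][s] for p in range(2)) for s in range(2)]
    return alpha

reaction_times = [0, 0, 1, 1, 1]          # responses slowing over time
belief = forward_filter(reaction_times)
future = predict_ahead(belief, steps=4)   # e.g. four time steps ahead
print(belief[1], future[1])               # P(hypovigilant) now and ahead
```

A run of slow responses pushes the filtered probability of the hypovigilant state up sharply, and the forward prediction stays elevated, which is the behaviour an early-warning device would act on.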


Vertebroplasty involves injecting cement into a fractured vertebra to provide stabilisation. There is clinical evidence to suggest, however, that vertebroplasty may be associated with a higher risk of adjacent vertebral fracture, which may be due to the change in the material properties of the post-procedure vertebra modifying the transmission of mechanical stresses to adjacent vertebrae.


Business Process Management (BPM) has increased in popularity and maturity in recent years. Large enterprises use process management approaches to model, manage and refine repositories of process models that describe the whole enterprise. These process models can number in the thousands and may contain large hierarchies of tasks and control structures that become cumbersome to maintain. Tools are therefore needed to traverse this process model space efficiently; otherwise the repositories remain hard to use and their effectiveness is diminished. In this paper we analyse a range of BPM tools for their effectiveness in handling large process models. We establish that the present set of commercial tools is lacking in key areas of visualisation of, and interaction with, large process models. We then present six tool functionalities for advanced business process visualisation and interaction, along with a design for a tool that exploits the latest advances in 2D and 3D computer graphics to enable fast and efficient search, traversal and modification of process models.


The Node-based Local Mesh Generation (NLMG) algorithm, which is free of mesh inconsistency, is one of the core algorithms in the Node-based Local Finite Element Method (NLFEM). It achieves a seamless link between mesh generation and stiffness-matrix calculation, which helps to improve the parallel efficiency of FEM. The key to ensuring the efficiency and reliability of NLMG is determining the candidate satellite-node set of a central node quickly and accurately. This paper develops a Fast Local Search Method based on a Uniform Bucket (FLSMUB) and a Fast Local Search Method based on a Multilayer Bucket (FLSMMB), and applies them successfully to this decisive problem, namely presenting the candidate satellite-node set of any central node in the NLMG algorithm. Using FLSMUB or FLSMMB, the NLMG algorithm becomes a practical tool for reducing the parallel computation cost of FEM. Parallel numerical experiments validate that FLSMUB and FLSMMB are each fast, reliable and efficient for their suitable problems, and that they are especially effective on large-scale parallel problems.
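The uniform-bucket idea behind FLSMUB can be sketched in 2D: nodes are hashed into a regular grid, and the candidate set for a central node is read from its cell and the adjacent cells, avoiding a scan of all nodes. The grid size, node counts and coordinates below are illustrative, not the paper's implementation:

```python
from collections import defaultdict
import math
import random

random.seed(1)

def build_buckets(nodes, cell):
    """Hash each node index into a uniform grid cell of side length `cell`."""
    buckets = defaultdict(list)
    for i, (x, y) in enumerate(nodes):
        buckets[(int(x // cell), int(y // cell))].append(i)
    return buckets

def candidate_set(buckets, cell, centre):
    """Candidate satellite nodes: everything in the centre's cell and the
    eight surrounding cells -- the only cells that can contain a node
    within distance `cell` of the centre."""
    cx, cy = int(centre[0] // cell), int(centre[1] // cell)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            out.extend(buckets[(cx + dx, cy + dy)])
    return out

nodes = [(random.random(), random.random()) for _ in range(2000)]
cell = 0.05
buckets = build_buckets(nodes, cell)
centre = (0.47, 0.51)
cand = candidate_set(buckets, cell, centre)

# Brute-force check: every node within one cell width of the centre
# must appear among the candidates.
near = [i for i, (x, y) in enumerate(nodes)
        if math.hypot(x - centre[0], y - centre[1]) <= cell]
print(len(cand), len(near))
```

Each query touches a constant number of cells regardless of the total node count, which is why bucket-based local search scales to the large parallel problems the paper targets; the multilayer variant refines this for non-uniform node densities.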


This work presents an extended Joint Factor Analysis (JFA) model that includes explicit modelling of unwanted within-session variability. The goals of the proposed extended JFA model are to improve verification performance on short utterances by compensating for the effects of limited or imbalanced phonetic coverage, and to produce a flexible JFA model that is effective over a wide range of utterance lengths without adjusting model parameters, such as by retraining session subspaces. Experimental results on the 2006 NIST SRE corpus demonstrate the flexibility of the proposed model, providing competitive results over a wide range of utterance lengths without retraining and yielding modest improvements in a number of conditions over the current state of the art.