399 results for Point Stimulation Method


Relevance: 20.00%

Abstract:

Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost-reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes, and client-facing skills are more critical for Information Systems personnel than ever before. In an attempt to restructure the information systems curriculum accordingly, our view is that information systems students need to develop an appreciation for organizational work systems in order to understand the operation and significance of information systems within such work systems.

Relevance: 20.00%

Abstract:

The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs through early error detection. This is just as true from a software engineering point of view, where models facilitate stakeholder communication and software system design. Research has investigated several proposals for business process model measures, mostly from a correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet design decisions usually have to build on thresholds that can reliably indicate when a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice to determine thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
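The threshold-derivation step described above can be sketched in a few lines. The following is a minimal illustration using Youden's J statistic, one common way to pick a cut-off from a ROC curve; the paper applies its own adaptation of the ROC curves method, so this is not the authors' exact procedure, and the model sizes and error labels below are hypothetical.

```python
# Sketch: derive a threshold for a structural measure (e.g. model size)
# that best separates error-prone from error-free process models.
# Uses Youden's J (sensitivity + specificity - 1); data are hypothetical.

def roc_threshold(measure, has_error):
    """Return the measure value maximizing Youden's J."""
    pos = sum(has_error)           # models with errors
    neg = len(has_error) - pos     # error-free models
    best_t, best_j = None, -1.0
    for t in sorted(set(measure)):
        # predict "error" whenever measure >= t
        tp = sum(1 for m, e in zip(measure, has_error) if m >= t and e)
        fp = sum(1 for m, e in zip(measure, has_error) if m >= t and not e)
        j = tp / pos - fp / neg    # sensitivity - (1 - specificity)
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

sizes     = [12, 18, 25, 33, 41, 48, 55, 60, 72, 90]   # nodes per model
has_error = [0,  0,  0,  0,  1,  0,  1,  1,  1,  1]
t, j = roc_threshold(sizes, has_error)
print(f"threshold: {t} nodes (Youden's J = {j:.2f})")   # threshold: 41 nodes
```

A guideline refined this way would then read, for instance, "models above 41 nodes should be decomposed".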

Relevance: 20.00%

Abstract:

This paper presents an ultrasonic velocity measurement method that investigates the possible effects of high-voltage, high-frequency pulsed power on cortical bone material elasticity. Before applying a pulsed power signal to live bone, it is essential to determine, non-destructively, the safe parameters of pulsed power applied to bone. Therefore, the possible changes in cortical bone material elasticity due to a specified pulsed power excitation have been investigated. A controllable positive buck-boost converter with adjustable output voltage and frequency has been used to generate high-voltage pulses (500 V magnitude at 10 kHz frequency). To determine bone elasticity, ultrasonic velocity measurements were conducted on two groups of cortical bone samples: controls (unexposed to pulsed power but kept in the same environmental conditions) and samples exposed to pulsed power. Young's modulus of the cortical bone samples was determined and compared before and after applying the pulsed power signal. After applying the high-voltage pulses, no significant variation in the elastic properties of the cortical bone specimens was found compared to the controls. The results show that pulsed power with the nominated parameters can be applied to cortical bone tissue without any considerable negative effect on the elasticity of the bone material.
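The elasticity calculation behind ultrasonic velocity measurement can be sketched directly: under a thin-rod approximation, Young's modulus is E = ρv², with ρ the material density and v the ultrasonic wave velocity. The density and velocity values below are illustrative literature-typical numbers, not data from the paper.

```python
# Sketch of the ultrasonic elasticity relation E = rho * v**2
# (thin-rod approximation). Sample values are illustrative only.

def youngs_modulus(density_kg_m3: float, velocity_m_s: float) -> float:
    """Young's modulus (Pa) from density and ultrasonic wave velocity."""
    return density_kg_m3 * velocity_m_s ** 2

rho = 1900.0       # cortical bone density, kg/m^3 (typical literature value)
v_before = 3500.0  # m/s, before pulsed-power exposure (illustrative)
v_after  = 3490.0  # m/s, after exposure (illustrative)

E_before = youngs_modulus(rho, v_before)
E_after  = youngs_modulus(rho, v_after)
change = (E_after - E_before) / E_before * 100
print(f"E before: {E_before/1e9:.1f} GPa, after: {E_after/1e9:.1f} GPa "
      f"({change:+.2f}% change)")
```

Comparing E before and after exposure, as above, is how a "no significant variation" conclusion would be quantified.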

Relevance: 20.00%

Abstract:

This research is one of several ongoing studies conducted within the IT Professional Services (ITPS) research programme at Queensland University of Technology (QUT). In 2003, ITPS introduced the IS-Impact model, a measurement model for measuring information systems success from the viewpoint of multiple stakeholders. The model, along with its instrument, is robust, simple, yet generalisable, and yields results that are comparable across time, stakeholders, different systems and system contexts. The IS-Impact model is defined as “a measure at a point in time, of the stream of net benefits from the Information System (IS), to date and anticipated, as perceived by all key-user-groups”. The model comprises four dimensions: ‘Individual Impact’, ‘Organizational Impact’, ‘Information Quality’ and ‘System Quality’. The two Impact dimensions measure the up-to-date impact of the evaluated system, while the two Quality dimensions act as proxies for probable future impacts (Gable, Sedera & Chan, 2008). To fulfil the goal of ITPS, “to develop the most widely employed model”, this research re-validates and extends the IS-Impact model in a new context. This method/context-extension research aims to test the generalisability of the model by addressing its known limitations. One of these limitations relates to the extent of the model's external validity. In order to gain wide acceptance, a model should be consistent and work well in different contexts. The IS-Impact model, however, was validated only in the Australian context, and packaged software was chosen as the IS under study. Thus, this study is concerned with whether the model can be applied in a different context. Aiming for a robust and standardised measurement model that can be used across different contexts, this research re-validates and extends the IS-Impact model and its instrument to public sector organisations in Malaysia.
The overarching research question (managerial question) of this research is: “How can public sector organisations in Malaysia measure the impact of information systems systematically and effectively?” With two main objectives, the managerial question is broken down into two specific research questions. The first research question addresses the applicability (relevance) of the dimensions and measures of the IS-Impact model in the Malaysian context, as well as the completeness of the model in the new context. Initially, this research assumes that the dimensions and measures of the IS-Impact model are sufficient for the new context. However, some IS researchers suggest that the selection of measures needs to be made purposely for different contextual settings (DeLone & McLean, 1992; Rai, Lang & Welker, 2002). Thus, the first research question is: “Is the IS-Impact model complete for measuring the impact of IS in Malaysian public sector organisations?” [RQ1]. The IS-Impact model is a multidimensional model that consists of four dimensions or constructs. Each dimension is represented by formative measures or indicators. Formative measures are known as composite variables because these measures make up, or form, the construct (in this case, the dimension in the IS-Impact model). These formative measures define different aspects of the dimension; thus, a measurement model of this kind needs to be tested not just on the structural relationships between the constructs but also on the validity of each measure. In a previous study, the IS-Impact model was validated using formative validation techniques proposed in the literature (e.g., Diamantopoulos & Winklhofer, 2001; Diamantopoulos & Siguaw, 2006; Petter, Straub & Rai, 2007). However, there is potential for improving the validation testing of the model by adding more criterion or dependent variables.
This includes identifying a consequence of the IS-Impact construct for the purpose of validation. Moreover, a different approach is employed in this research, whereby the validity of the model is tested using the Partial Least Squares (PLS) method, a component-based structural equation modelling (SEM) technique. Thus, the second research question addresses the construct validation of the IS-Impact model: “Is the IS-Impact model valid as a multidimensional formative construct?” [RQ2]. This study employs two rounds of surveys, each with a different and specific aim. The first is qualitative and exploratory, aiming to investigate the applicability and sufficiency of the IS-Impact dimensions and measures in the new context. This survey was conducted in a state government in Malaysia. A total of 77 valid responses were received, yielding 278 impact statements. The results from the qualitative analysis demonstrate the applicability of most of the IS-Impact measures. The analysis also shows a significant new measure that emerged from the context; this new measure was added as one of the System Quality measures. The second survey is a quantitative survey that aims to operationalise the measures identified from the qualitative analysis and rigorously validate the model. This survey was conducted in four state governments (including the state government involved in the first survey). A total of 254 valid responses were used in the data analysis. Data were analysed using structural equation modelling techniques, following the guidelines for formative construct validation, to test the validity and reliability of the constructs in the model. This is the first study to extend the complete IS-Impact model to a new context that differs in terms of nationality, language and the type of information system (IS). The main contribution of this research is a comprehensive, up-to-date IS-Impact model that has been validated in the new context.
The study has accomplished its purpose of testing the generalisability of the IS-Impact model and continuing IS evaluation research by extending it to the Malaysian context. A further contribution is a validated Malaysian-language IS-Impact measurement instrument. It is hoped that the validated Malaysian IS-Impact instrument will encourage related IS research in Malaysia, and that the demonstrated model validity and generalisability will encourage a cumulative tradition of research previously not possible. The study entailed several methodological improvements on prior work, including: (1) new criterion measures for the overall IS-Impact construct, employed in ‘identification through measurement relations’; (2) a stronger, multi-item ‘Satisfaction’ construct, employed in ‘identification through structural relations’; (3) an alternative version of the main survey instrument in which items are randomised (rather than blocked), compared with the main survey data in attention to possible common method variance (no significant differences between the two survey instruments were observed); (4) a demonstrated validation process for formative indexes of a multidimensional, second-order construct (existing examples mostly involved unidimensional constructs); (5) tests for the presence of suppressor effects that influence the significance of some measures and dimensions in the model; and (6) a demonstration of the effect of an imbalanced number of measures within a construct on the contribution power of each dimension in a multidimensional model.

Relevance: 20.00%

Abstract:

Handling information overload online is, from the user's point of view, a major challenge, especially as the number of websites grows rapidly due to growth in e-commerce and other related activities. Personalization based on user needs is the key to solving the problem of information overload. Personalization methods help in identifying relevant information that a user may like. User profiles and object profiles are the important elements of a personalization system. When creating user and object profiles, most existing methods adopt two-dimensional similarity methods based on vector or matrix models in order to find inter-user and inter-object similarity. Moreover, for recommending similar objects to users, personalization systems use users-users, items-items and users-items similarity measures. In most cases, similarity measures such as Euclidean, Manhattan, cosine and many others based on vector or matrix methods are used to find the similarities. Web logs are high-dimensional datasets, consisting of multiple users and multiple searches, with many attributes for each. Two-dimensional data analysis methods may overlook latent relationships that exist between users and items. In contrast to other studies, this thesis utilises tensors, which are high-dimensional data models, to build user and object profiles and to find the inter-relationships between users-users and users-items. To create an improved personalized Web system, this thesis proposes to build three types of profiles: individual user, group user and object profiles, utilising decomposition factors of tensor data models. A hybrid recommendation approach utilising group profiles (forming the basis of a collaborative filtering method) and object profiles (forming the basis of a content-based method) in conjunction with individual user profiles (forming the basis of a model-based approach) is proposed for making effective recommendations.
A tensor-based clustering method is proposed that utilises the outcomes of popular tensor decomposition techniques such as PARAFAC, Tucker and HOSVD to group similar instances. An individual user profile, showing the user's highest interest, is represented by the top dimension values extracted from the component matrix obtained after tensor decomposition. A group profile, showing similar users and their highest interest, is built by clustering similar users based on tensor-decomposed values. A group profile is represented by the top association rules (containing various unique object combinations) derived from the searches made by the users of the cluster. An object profile is created to represent similar objects clustered on the basis of the similarity of their features. Depending on the category of a user (known, anonymous or frequent visitor to the website), any of the profiles or their combinations is used for making personalized recommendations. A ranking algorithm is also proposed that utilises the personalized information to order and rank the recommendations. The proposed methodology is evaluated on data collected from a real-life car website. Empirical analysis confirms the effectiveness of recommendations made by the proposed approach over other collaborative filtering and content-based recommendation approaches based on two-dimensional data analysis methods.
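A building block common to the decomposition techniques named above (PARAFAC, Tucker, HOSVD) is the matrix "unfolding" (matricization) of the data tensor along one mode. As a minimal, hypothetical illustration (not the thesis's implementation), here is a mode-1 unfolding of a small users × items × sessions tensor, using the common column ordering col = k·J + j:

```python
# Sketch: mode-1 unfolding of a 3-way tensor stored as nested lists,
# X[i][j][k] with i = user, j = item, k = session (hypothetical data).
# The unfolded matrix has one row per user and J*K columns.

def unfold_mode1(tensor):
    """Unfold a 3-way nested-list tensor along mode 1 (users)."""
    I, J, K = len(tensor), len(tensor[0]), len(tensor[0][0])
    return [[tensor[i][j][k] for k in range(K) for j in range(J)]
            for i in range(I)]

# 2 users x 2 items x 2 sessions of (say) click counts
X = [[[1, 2], [3, 4]],
     [[5, 6], [7, 8]]]
M1 = unfold_mode1(X)
print(M1)   # [[1, 3, 2, 4], [5, 7, 6, 8]]
```

An SVD of such an unfolding (as in HOSVD) then yields the per-mode component matrix from which the top dimension values of a user profile are read off.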

Relevance: 20.00%

Abstract:

Background: Whether suicide in China has significant seasonal variations is unclear. The aim of this study is to examine the seasonality of suicide in Shandong, China, and to assess the associations of suicide seasonality with gender, residence, age and method of suicide. Methods: Three types of tests (chi-square, Edwards' T and Roger's log method) were used to detect seasonality in the suicide data extracted from the official mortality data of the Shandong Disease Surveillance Point (DSP) system. Peak/low ratios (PLRs) and 95% confidence intervals (CIs) were calculated to indicate the magnitude of seasonality. Results: A statistically significant seasonality, with a single peak in suicide rates in spring and early summer and a dip in winter, was observed, which remained relatively consistent over the years. Regardless of gender, suicide seasonality was more pronounced in rural areas, in younger age groups and for non-violent methods, in particular self-poisoning by pesticide. Conclusions: There are statistically significant seasonal variations in completed suicide for both men and women in Shandong, China. Differences exist between residences (urban/rural), age groups and suicide methods. The results appear to support a sociological explanation of suicide seasonality.
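The peak/low ratio (PLR) used above as the magnitude indicator is straightforward to compute from monthly counts; the counts below are hypothetical, chosen only to mimic a spring peak and winter dip.

```python
# Sketch: peak/low ratio (PLR) from monthly suicide counts.
# Counts (Jan..Dec) are hypothetical, with a spring peak and winter dip.

def peak_low_ratio(monthly_counts):
    """Ratio of the highest to the lowest monthly count."""
    return max(monthly_counts) / min(monthly_counts)

counts = [40, 42, 55, 68, 72, 65, 58, 50, 46, 44, 41, 38]
plr = peak_low_ratio(counts)
peak_month = counts.index(max(counts)) + 1
low_month = counts.index(min(counts)) + 1
print(f"PLR = {plr:.2f} (peak in month {peak_month}, low in month {low_month})")
```

A PLR near 1 indicates little seasonality; the confidence interval around it (as in the study) tells whether the ratio differs significantly from 1.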

Relevance: 20.00%

Abstract:

We present an iterative hierarchical algorithm for multi-view stereo. The algorithm attempts to utilise as much contextual information as is available to compute highly accurate and robust depth maps. There are three novel aspects to the approach: 1) we incrementally improve the depth fidelity as the algorithm progresses through the image pyramid; 2) we show how to incorporate visual hull information (when available) to constrain depth searches; and 3) we show how to simultaneously enforce the consistency of each depth map by continual comparison with neighbouring depth maps. We show that this approach produces highly accurate depth maps and, since it is essentially a local method, is both extremely fast and simple to implement.
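The coarse-to-fine idea in point 1) can be sketched abstractly: at each pyramid level the depth estimate from the coarser level seeds a narrower search range, so fidelity improves as resolution increases. The cost function below is a stand-in (the paper's cost is photo-consistency between views), and all values are illustrative.

```python
# Sketch of coarse-to-fine depth refinement: sample the search range,
# keep the best candidate, then shrink the range around it for the
# next (finer) pyramid level. The quadratic cost is a placeholder for
# a real photo-consistency measure.

def refine_depth(cost, d_min, d_max, levels, samples=10):
    """Iteratively narrow the depth search around the best sample."""
    for _ in range(levels):
        step = (d_max - d_min) / samples
        candidates = [d_min + i * step for i in range(samples + 1)]
        best = min(candidates, key=cost)
        d_min, d_max = best - step, best + step   # shrink the interval
    return best

true_depth = 3.7
depth = refine_depth(lambda d: (d - true_depth) ** 2, 0.0, 10.0, levels=4)
print(f"estimated depth: {depth:.3f}")
```

Each level multiplies the effective depth resolution by roughly samples/2, which is why the hierarchical scheme is fast compared with one dense sweep.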

Relevance: 20.00%

Abstract:

This paper presents a strategy for delayed research method selection in qualitative interpretivist research. An exemplary case details how explorative interviews were designed and conducted in accordance with a paradigm prior to deciding whether to adopt grounded theory or phenomenology for data analysis. The focus here is to determine the most appropriate research strategy, in this case the methodological framing, to conduct research and represent findings, both of which are detailed. Research addressing current management issues requires both a flexible framework and the capability to consider the research problem from various angles, in order to derive tangible results for academia with immediate application to business demands. Researchers, and in particular novices, often struggle to decide on a research method suitable to address their research problem. This often applies to interpretive qualitative research, where it is not always immediately clear which method is the most appropriate, as the research objectives shift and crystallize over time. This paper uses an exemplary case to reveal how the strategy for delayed research method selection contributed to deciding whether to adopt grounded theory or phenomenology in the initial phase of a PhD research project. In this case, semi-structured interviews were used for data generation, framed in an interpretivist approach situated in a business context. Research questions for this study were thoroughly defined and carefully framed in accordance with the research paradigm's principles, while at the same time ensuring that the requirements of both potential research methods were met. The grounded theory and phenomenology methods were compared and contrasted to determine their suitability and whether they met the research objectives, based on a pilot study.
The strategy proposed in this paper is an alternative to the more ‘traditional’ approach, which initially selects the methodological formulation and then proceeds to data generation. In conclusion, the suggested strategy for delayed research method selection is intended to help researchers identify and apply the method most appropriate to their research. This strategy is based on explorations of data generation and analysis in order to derive faithful results from the data generated.

Relevance: 20.00%

Abstract:

Chronic venous leg ulcers are a detrimental health issue plaguing our society, resulting in long-term pain, immobility and decreased quality of life for a large proportion of sufferers. The frequency of these chronic wounds has led current research to focus on the wound environment to provide important information regarding the prolonged, fluctuating or static healing patterns of these wounds. Disruption of the normal wound healing process results in the release of multiple factors into the wound environment that could correlate with wound chronicity. These biochemical factors can often be detected by non-invasively sampling chronic wound fluid (CWF) from the site of injury. Of note, whilst there are numerous studies comparing acute and chronic wound fluids, there have not been any reports in the literature employing a longitudinal study to track biochemical changes in wound fluid as patients transition from a non-healing to a healed state. Initially, the objective of this study was to identify biochemical changes in CWF associated with wound healing using a proteomic approach. The proteomic approach incorporated a multi-dimensional liquid chromatography fractionation technique coupled with mass spectrometry (MS) to enable identification of proteins present at lower concentrations in CWF. Not surprisingly, many of the proteins identified in wound fluid were acute-phase proteins normally expressed during the inflammatory phase of healing. However, the number of proteins positively identified by MS was quite low. This was attributed to the diverse range of concentrations of protein species in CWF, making it challenging to detect the diagnostically relevant low-molecular-weight proteins. In view of this, SELDI-TOF MS was also explored as a means to target low-molecular-weight proteins in sequential patient CWF samples during the course of healing.
Unfortunately, the results generated did not yield any peaks of interest that were altered as wounds transitioned to a healed state. During the course of the proteomic assessment of CWF, it became evident that a fraction of non-proteinaceous compounds strongly absorbed at 280 nm. Subsequent analyses confirmed that most of these compounds were in fact part of the purine catabolic pathway, possessing distinctive aromatic rings that also result in high absorbance at 254 nm. The accumulation of these purinogenic compounds in CWF suggests that the wound bed is poorly oxygenated, resulting in a switch to anaerobic metabolism and consequently ATP breakdown. In addition, the presence of the terminal purine catabolite, uric acid (UA), indicates that the enzyme xanthine oxidoreductase (XOR) catalyses the reaction of hypoxanthine to xanthine and finally to UA. More importantly, the studies provide evidence for the first time of the exogenous presence of XOR in CWF. XOR is the only enzyme in humans capable of catalysing the production of UA, in conjunction with a burst of the highly reactive superoxide radical and other oxidants such as H2O2. Excessive release of these free radicals in the wound environment can cause cellular damage, disrupting the normal wound healing process. In view of this, a sensitive and specific assay was established for monitoring low concentrations of these catabolites in CWF. This procedure combined high performance liquid chromatography (HPLC) with tandem mass spectrometry and multiple reaction monitoring (MRM). This application was selective, using specific MRM transitions and HPLC separations for each analyte, making it ideal for the detection and quantitation of purine catabolites in CWF. The results demonstrated that elevated levels of UA were detected in wound fluid obtained from patients with clinically worse ulcers. This suggests that XOR is active in the wound site, generating significant amounts of reactive oxygen species (ROS).
In addition, analysis of purine precursors in wound fluid revealed elevated levels in wound fluid from patients with less severe ulcers. Taken together, the results generated in this thesis suggest that monitoring changes in purine catabolites in CWF is likely to provide valuable information regarding the healing patterns of chronic venous leg ulcers. XOR catalysis of purine precursors not only provides a method for monitoring the onset, prognosis and progress of chronic venous leg ulcers, but also offers a potential therapeutic target: inhibiting XOR blocks UA and ROS production. Targeting a combination of these purinogenic compounds and XOR could lead to the development of novel point-of-care diagnostic tests. Therefore, further investigation of these processes during wound healing will be worthwhile and may assist in elucidating the pathogenesis of this disease state, which in turn may lead to the development of new diagnostics and therapies that target these processes.

Relevance: 20.00%

Abstract:

Seat pressure is known to be a major factor in seat comfort in vehicles. In passenger vehicles, research into the seat comfort of rear seat occupants is lacking. As accurate seat pressure measurement requires significant effort, simulation of seat pressure is evolving as a preferred method. However, analytic methods are based on complex finite element modeling and are therefore time-consuming and involve high investment. Based on accurate anthropometric measurements of 64 male subjects and outboard rear seat pressure measurements in three different passenger vehicles, this study investigates whether a set of parameters derived from seat pressure mapping is sensitive enough to differentiate between different seats and whether the parameters correlate with anthropometry in linear models. In addition to the pressure map analysis, H-Points were measured with a coordinate measurement system based on palpated body landmarks, and the range of H-Point locations in the three seats is provided. It was found that, for the cushion, cushion contact area and cushion front area/force could be modeled from subject anthropometry, while only seatback contact area could be modeled from anthropometry for all three vehicles. Major differences were found between the vehicles for the other parameters.
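The kind of linear model described above can be sketched with a single predictor: ordinary least squares relating one anthropometric variable to one pressure-map parameter. The body-mass and contact-area values below are hypothetical, and the study itself fitted models over 64 subjects and multiple measures.

```python
# Sketch: predicting cushion contact area from body mass by ordinary
# least squares (one predictor). All data points are hypothetical.

def ols(x, y):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

mass = [62, 70, 75, 81, 88, 95]              # kg (hypothetical subjects)
area = [1150, 1230, 1290, 1340, 1420, 1490]  # cm^2 cushion contact area
a, b = ols(mass, area)
print(f"contact_area ≈ {a:.1f} * mass + {b:.1f}")
```

Fitting the same model per vehicle and comparing coefficients is one way the sensitivity of a pressure-map parameter across seats can be judged.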

Relevance: 20.00%

Abstract:

The accuracy of marker placement on palpable surface anatomical landmarks is an important consideration in biomechanics. Although marker placement reliability has been studied in some depth, it remains unclear whether or not markers are accurately positioned over the intended landmark when defining the static position and orientation of a segment. A novel method using commonly available X-ray imaging was developed to identify the accuracy of markers placed on the shoe surface by palpating landmarks through the shoe. Anterior–posterior and lateral–medial X-rays were taken of 24 participants with a newly developed marker set applied to both the skin and the shoe. The vector magnitude of both skin- and shoe-mounted markers from the anatomical landmark was calculated, as well as the mean marker offset between skin- and shoe-mounted markers. The accuracy of placing markers on the shoe relative to the skin-mounted markers, accounting for shoe thickness, was less than 5 mm for all markers studied. Further, when using the guidelines developed in this study, the method was deemed reliable (intra-rater ICCs = 0.50–0.92). In conclusion, the method proposed here can reliably assess marker placement accuracy on the shoe surface relative to chosen anatomical landmarks beneath the skin.
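The vector-magnitude measure above is simply the Euclidean distance between a marker and the digitised landmark. A minimal sketch follows; the coordinates are hypothetical, and subtracting shoe thickness as a scalar is one simple way to account for it, not necessarily the paper's exact correction.

```python
# Sketch: offset magnitude between a shoe-mounted marker and the
# underlying anatomical landmark (coordinates in mm, hypothetical),
# with a simple scalar correction for shoe thickness.
import math

def marker_offset(marker_xyz, landmark_xyz, shoe_thickness=0.0):
    """Offset magnitude (mm) after accounting for shoe thickness."""
    d = math.dist(marker_xyz, landmark_xyz)
    return max(d - shoe_thickness, 0.0)

shoe_marker = (102.0, 54.0, 31.0)
landmark    = (100.0, 50.0, 30.0)
offset = marker_offset(shoe_marker, landmark, shoe_thickness=2.0)
print(f"marker offset: {offset:.2f} mm")
```

Averaging such offsets over markers and participants gives the mean marker offset reported in the study.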

Relevance: 20.00%

Abstract:

In recent years, the advent of new tools for musculoskeletal simulation has increased the potential for significantly improving the ergonomic design process and the ergonomic assessment of designs. In this paper we investigate the use of one such tool, ‘The AnyBody Modeling System’, applied to solve a one-parameter yet complex ergonomic design problem. The aim of this paper is to investigate the potential of computer-aided musculoskeletal modelling in the ergonomic design process, in the same way as CAE technology has been applied to engineering design.

Relevance: 20.00%

Abstract:

Background: Although class attendance is linked to academic performance, questions remain about what determines students’ decisions to attend or miss class. Aims: In addition to the constructs of a common decision-making model, the theory of planned behaviour, the present study examined the influence of student role identity and university student (in-group) identification in predicting both the initiation and maintenance of students’ attendance at voluntary peer-assisted study sessions in a statistics subject. Sample: University students enrolled in a statistics subject were invited to complete a questionnaire at two time points across the academic semester. A total of 79 university students completed questionnaires at the first data collection point, with 46 students completing the questionnaire at the second data collection point. Method: Twice during the semester, students’ attitudes, subjective norm, perceived behavioural control, student role identity, in-group identification, and intention to attend study sessions were assessed via online questionnaires. Objective measures of class attendance records for each half-semester (or ‘term’) were obtained. Results: Across both terms, students’ attitudes predicted their attendance intentions, with intentions predicting class attendance. Earlier in the semester, in addition to perceived behavioural control, both student role identity and in-group identification predicted students’ attendance intentions, with only role identity influencing intentions later in the semester. Conclusions: These findings highlight the possible chronology that different identity influences have in determining students’ initial and maintained attendance at voluntary sessions designed to facilitate their learning.

Relevance: 20.00%

Abstract:

Since the availability of 3D full-body scanners and the associated software systems for operations with large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by the representatives of the 3D paradigm need to be critically reviewed, though. 3D anthropometry has advantages as well as shortfalls, which need to be carefully considered. While it is apparent that the measurement of a full-body point cloud allows for easier storage of raw data and improves quality control, the difficulties in calculating standardized measurements from the point cloud are widely underestimated. Early studies that used 3D point clouds to derive anthropometric dimensions showed unacceptable deviations from the standardized results measured manually. While 3D human point clouds provide a valuable tool to replicate specific single persons for further virtual studies, or to personalize garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections. A 1D/2D approach is therefore sufficient to solve an ergonomic design problem. As a consequence, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach to design for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution function of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is to be seen as a two-dimensional integral or distribution function of length and girth, the calculation of combined percentiles (a common ergonomic requirement) is undefined.
Consequently, we suggest critically reviewing the cost and use of 3D anthropometry. We also recommend making proper use of the widely available one- and two-dimensional anthropometric data in ergonomic design.
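The combined-percentile pitfall discussed above can be illustrated numerically: the share of people who fall at or below the 5th percentile on two correlated anthropometric dimensions at once is much smaller than 5%, so a "P5 person" on both dimensions is not a meaningful design target. The data below are synthetic, generated from a simple shared-factor model.

```python
# Sketch: with two correlated anthropometric variables (synthetic),
# the fraction of people below the 5th percentile on BOTH variables
# is well under 5% - the "combined percentile" is not 5%.
import random

random.seed(42)
n = 100_000
stature, girth = [], []
for _ in range(n):
    z = random.gauss(0, 1)                # shared factor -> correlation
    stature.append(1750 + 70 * (0.7 * z + 0.3 * random.gauss(0, 1)))
    girth.append(900 + 80 * (0.7 * z + 0.3 * random.gauss(0, 1)))

def pctl(data, p):
    """Simple empirical percentile (nearest-rank)."""
    s = sorted(data)
    return s[int(p / 100 * len(s))]

p5_s, p5_g = pctl(stature, 5), pctl(girth, 5)
both = sum(1 for s, g in zip(stature, girth) if s <= p5_s and g <= p5_g)
print(f"below P5 on both: {both / n:.1%} (not 5%)")
```

Only for perfectly correlated variables would the joint fraction reach 5%, which is exactly why single- and two-dimensional distributions, not volumes, carry the ergonomic information.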

Relevance: 20.00%

Abstract:

Accurate and detailed road models play an important role in a number of geospatial applications, such as infrastructure planning, traffic monitoring, and driver assistance systems. In this thesis, an integrated approach for the automatic extraction of precise road features from high-resolution aerial images and LiDAR point clouds is presented. A framework of road information modeling has been proposed for rural and urban scenarios respectively, and an integrated system has been developed to deal with road feature extraction using image and LiDAR analysis. For road extraction in rural regions, a hierarchical image analysis is first performed to maximize the exploitation of road characteristics at different resolutions. The rough locations and directions of roads are provided by the road centerlines detected in low-resolution images, both of which can be further employed to facilitate road information generation in high-resolution images. The histogram thresholding method is then chosen to classify road details in high-resolution images, where color space transformation is used for data preparation. After road surface detection, anisotropic Gaussian and Gabor filters are employed to enhance road pavement markings while suppressing other ground objects, such as vegetation and houses. Afterwards, pavement markings are obtained from the filtered image using Otsu's clustering method. The final road model is generated by superimposing the lane markings on the road surfaces, where the digital terrain model (DTM) produced from LiDAR data can also be combined to obtain the 3D road model. As the extraction of roads in urban areas is greatly affected by buildings, shadows, vehicles, and parking lots, we combine high-resolution aerial images and dense LiDAR data to fully exploit the precise spectral and horizontal spatial resolution of aerial images and the accurate vertical information provided by airborne LiDAR.
Object-oriented image analysis methods are employed to perform feature classification and road detection in aerial images. In this process, we first utilize an adaptive mean shift (MS) segmentation algorithm to segment the original images into meaningful object-oriented clusters. Then the support vector machine (SVM) algorithm is applied to the MS-segmented image to extract road objects. The road surface detected in LiDAR intensity images is taken as a mask to remove the effects of shadows and trees. In addition, the normalized DSM (nDSM) obtained from LiDAR is employed to filter out other above-ground objects, such as buildings and vehicles. The proposed road extraction approaches are tested using rural and urban datasets respectively. The rural road extraction method is evaluated using pan-sharpened aerial images of the Bruce Highway, Gympie, Queensland. The road extraction algorithm for urban regions is tested using the datasets of Bundaberg, which combine aerial imagery and LiDAR data. Quantitative evaluation of the extracted road information for both datasets has been carried out. The experiments and evaluation results using the Gympie datasets show that more than 96% of the road surfaces and over 90% of the lane markings are accurately reconstructed, and the false alarm rates for road surfaces and lane markings are below 3% and 2% respectively. For the urban test sites of Bundaberg, more than 93% of the road surface is correctly reconstructed, and the mis-detection rate is below 10%.
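Otsu's method, used in the rural pipeline above to separate bright pavement markings from the darker road surface, picks the grey-level threshold that maximises between-class variance. A minimal self-contained sketch on hypothetical 8-bit pixel values (the thesis applies it to the Gabor-filtered image, not raw pixels):

```python
# Sketch: Otsu's thresholding. Background = pixels <= t; the chosen t
# maximises the between-class variance w_bg * w_fg * (m_bg - m_fg)^2.

def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(levels))
    w_bg = sum_bg = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]                  # background pixel count
        if w_bg == 0:
            continue
        w_fg = total - w_bg              # foreground pixel count
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        m_bg = sum_bg / w_bg             # background mean grey level
        m_fg = (sum_all - sum_bg) / w_fg # foreground mean grey level
        var_between = w_bg * w_fg * (m_bg - m_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# dark road surface (~60) vs bright lane markings (~200), hypothetical
pixels = [58, 60, 62, 61, 59, 198, 202, 200, 199, 201, 60, 63]
t = otsu_threshold(pixels)
print(f"Otsu threshold: {t}")   # pixels above t are classed as markings
```

Because the method only needs the histogram, it runs in O(levels) after one pass over the image, which suits the large high-resolution aerial tiles described above.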