954 results for Driver behavioural models


Relevance: 20.00%

Abstract:

Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, time intervals for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for the determination of sampling windows based on Markov chain Monte Carlo (MCMC) sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determining sampling windows for any nonlinear mixed effects model, although our work focuses on an application to population pharmacokinetic models.
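
As a rough illustration of the idea (a minimal sketch with a hypothetical Gaussian-shaped efficiency function, not the authors' algorithm), a random-walk Metropolis chain can be run over candidate sampling times with a stationary density proportional to design efficiency; a central interval of the retained draws then serves as a window around the optimal time point.

```python
import numpy as np

# Hypothetical design-efficiency function around an optimal sampling time t_opt.
# In practice this would be derived from the Fisher information of the
# population pharmacokinetic (nonlinear mixed effects) model.
def efficiency(t, t_opt=2.0, scale=0.5):
    return np.exp(-0.5 * ((t - t_opt) / scale) ** 2)

def metropolis_window(t_opt=2.0, n_iter=20000, step=0.3, coverage=0.90, seed=0):
    """Random-walk Metropolis chain whose stationary density is proportional to
    the design efficiency; the sampling window is a central interval of the
    retained draws."""
    rng = np.random.default_rng(seed)
    t = t_opt
    draws = np.empty(n_iter)
    for i in range(n_iter):
        prop = t + rng.normal(0.0, step)
        # Symmetric proposal: accept with probability min(1, efficiency ratio).
        if prop > 0 and rng.random() < efficiency(prop, t_opt) / efficiency(t, t_opt):
            t = prop
        draws[i] = t
    burned = draws[n_iter // 10:]                       # discard 10% burn-in
    lo, hi = np.quantile(burned, [(1 - coverage) / 2, (1 + coverage) / 2])
    return lo, hi

print(metropolis_window())   # a 90% window centred near the optimal time of 2.0 h
```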

Relevance: 20.00%

Abstract:

The complex design process of an airport terminal needs to support a wide range of changes in operational facilities for both usual and unusual/emergency events. A process model describes how the activities within a process are connected and also states the logical information flow among those activities. The traditional design process overlooks the necessity of information flow from the process model to the actual building design, which needs to be considered an integral part of building design. This research introduces a generic method for obtaining design-related information from a process model and incorporating it into the design process. Appropriate integration of the process model prior to the design process uncovers relationships that exist between spaces and their relevant functions, which could be missed in the traditional design approach. The current paper examines the available Business Process Model (BPM) and generates a modified Business Process Model (mBPM) of the check-in facilities at Brisbane International Airport. The information derived from the mBPM is then transformed into a possible physical layout using graph theory.
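
To make the graph-theoretic step concrete, here is a minimal, hypothetical sketch (the activity names and the use of networkx are illustrative and not taken from the paper): check-in activities extracted from an mBPM become graph nodes, flows become edges, and a force-directed layout suggests which spaces should sit near one another as a first-cut physical arrangement.

```python
import networkx as nx

# Hypothetical activities read from a modified Business Process Model (mBPM)
# of airport check-in; edges denote information or passenger flow between them.
G = nx.Graph()
flows = [
    ("Queue entry", "Document check"),
    ("Document check", "Check-in counter"),
    ("Check-in counter", "Bag drop"),
    ("Bag drop", "Security screening"),
    ("Check-in counter", "Oversize baggage"),
]
G.add_edges_from(flows)

# A force-directed layout places strongly connected activities close together,
# giving a first-cut spatial adjacency that a designer can then refine.
positions = nx.spring_layout(G, seed=1)
for space, (x, y) in positions.items():
    print(f"{space:20s} -> ({x:+.2f}, {y:+.2f})")
```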

Relevance: 20.00%

Abstract:

This report maps the current state of entrepreneurship in Australia using data from the Global Entrepreneurship Monitor (GEM) for the year 2011. Entrepreneurship is regarded as a crucial driver of economic well-being. Entrepreneurial activity in new and established firms drives innovation and creates jobs. Entrepreneurs also fuel competition, thereby contributing indirectly to market and productivity growth and improving the competitiveness of the national economy. Given the economic landscape that exists as a result of the global financial crisis (GFC), it is probably more important than ever to understand the effects and drivers of entrepreneurial activity and attitudes in Australia.

The central finding of this report is that entrepreneurship is certainly alive and well in Australia. With 10.5 per cent of the adult population involved in setting up a new business or owning a newly founded business, as measured by the total entrepreneurial activity (TEA) rate in 2011, Australia ranks second only to the United States among the innovation-driven (developed) economies. Compared with 2010, the TEA rate has increased by 2.7 percentage points. Furthermore, with regard to the employee entrepreneurial activity (EEA) rate in established firms, Australia ranks above average: according to GEM data, 5 per cent of the adult population is engaged in developing or launching new products, a new business unit or a subsidiary for their employer.

Further analysis of the GEM data also clearly shows that Australia compares well with other major economies in terms of the ‘quality’ of entrepreneurial activities being pursued. Indeed, it is not only the quantity of entrepreneurs but also the level of their aspirations and business goals that drives economic growth. On average, for each business started in Australia because the founder lacked alternatives for generating income from any other source, there are five other businesses started where the founders specifically want to take advantage of a business opportunity that they believe will increase their personal income or independence. With respect to innovativeness, 31 per cent of Australian new businesses offer products or services which they consider to be new to customers, or where very few, or in some cases no, other businesses offer the same product or service. Both of these indicators are higher than the average for innovation-driven economies. Somewhat below average is the international orientation of Australian entrepreneurs, with only 12 per cent aiming to have a substantial share of customers from international markets.

So what drives this high quantity and quality of entrepreneurship in Australia? The analysis of the data suggests it is a combination of both business opportunities and entrepreneurial skills. It seems that around 50 per cent of the Australian population identify opportunities for a start-up venture and believe that they have the necessary skills to start a business. Furthermore, a large majority of the Australian population report that high media attention for entrepreneurship provides successful role models for prospective entrepreneurs. As a result, 12 per cent of respondents have expressed the intention to start a business within the next three years. These numbers are all well above average when compared with the other major economies.

With regard to gender, the GEM survey shows a high proportion of female entrepreneurs: approximately 8.4 per cent of adult females are involved in setting up a business or have recently done so. Although this female TEA rate is slightly down from 2010, Australia ranks second among the innovation-driven economies. This paints a healthy picture of access to entrepreneurial opportunities for Australian women.

Relevance: 20.00%

Abstract:

Light gauge cold-formed steel frame (LSF) structures are increasingly used in industrial, commercial and residential buildings because of their non-combustibility, dimensional stability and ease of installation. Floor-ceiling systems are one example of these applications. LSF floor-ceiling systems must be designed to serve as fire compartment boundaries and provide adequate fire resistance. Fire rated floor-ceiling assemblies formed with new materials and construction methodologies have been increasingly used in buildings. However, limited research has been undertaken in the past and hence a thorough understanding of their fire resistance behaviour is not available. Recently a new composite panel, in which an external insulation layer is used between two plasterboards, has been developed at QUT to provide a higher fire rating for LSF floors under standard fire conditions. However, its increased fire rating could not be determined using currently available design methods. Research on LSF floor systems under fire conditions is relatively recent and the behaviour of floor joists and other components in these systems is not fully understood. The present design methods thus require the use of expensive fire protection materials to protect them from excessive heat increase during a fire, leading to uneconomical and conservative designs. The fire rating of these floor systems is provided simply by adding more plasterboard sheets to the steel joists, and such an approach is totally inefficient. Hence a detailed fire research study was undertaken into the structural and thermal performance of LSF floor systems, including those protected by the new composite panel system, using full scale fire tests and extensive numerical studies.

The experimental study included both the conventional and the new steel floor-ceiling systems under structural and fire loads, using a gas furnace designed to deliver heat in accordance with the standard time-temperature curve in AS 1530.4 (SA, 2005). The fire tests captured the behavioural and deflection characteristics of LSF floor joists until failure, as well as the related time-temperature measurements across the section and along the length of all the specimens. The full scale fire tests showed that the structural and thermal performance of the externally insulated LSF floor system was superior to that of traditional LSF floors with or without cavity insulation. Therefore this research recommends the use of the new composite panel system for cold-formed LSF floor-ceiling systems. The numerical analyses of LSF floor joists were undertaken using the finite element program ABAQUS, based on the measured time-temperature profiles obtained from the fire tests, under both steady state and transient state conditions. Mechanical properties at elevated temperatures were considered based on the equations proposed by Dolamune Kankanamge and Mahendran (2011). Finite element models were calibrated using the full scale test results and used to provide a more detailed understanding of the structural fire behaviour of the LSF floor-ceiling systems. The models also confirmed the superior performance of the new composite panel system. The validated model was then used in a detailed parametric study. The fire tests and numerical studies showed that plasterboards provided sufficient lateral restraint to the LSF floor joists until their failure. Hence only the section moment capacity of LSF floor joists subjected to local buckling effects was considered in this research.
To predict the section moment capacity at elevated temperatures, the effective section modulus of the joists at ambient temperature is generally considered adequate. However, this research has shown that this leads to considerable over-estimation of the local buckling capacity of joists subjected to non-uniform temperature distributions under fire conditions. Therefore new simplified fire design rules were proposed for LSF floor joists to determine the section moment capacity at elevated temperatures based on AS/NZS 4600 (SA, 2005), NAS (AISI, 2007) and Eurocode 3 Part 1.3 (ECS, 2006). The accuracy of the proposed fire design rules was verified against the finite element analysis results. A spreadsheet-based design tool was also developed from these design rules to predict the failure load ratio versus time, and the moment capacity versus time and temperature, for various LSF floor configurations. Idealised time-temperature profiles of LSF floor joists were developed based on the fire test measurements and were used in the detailed parametric study to fully understand the structural and fire behaviour of LSF floor panels. Simple design rules were also proposed to predict both the critical average joist temperatures and the failure times (fire ratings) of LSF floor systems with various floor configurations and structural parameters under any given load ratio. Findings from this research have led to a comprehensive understanding of the structural and fire behaviour of LSF floor systems, including those protected by the new composite panel, and to simple design methods. These design rules were proposed within the guidelines of the Australian/New Zealand, American and European cold-formed steel structures standards and codes of practice. They may also lead to further improvements in fire resistance through suitable modifications to the current composite panel system.
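
As a rough sketch of the kind of calculation such a design tool performs (the reduction factors below are placeholders for illustration only, not the equations of Dolamune Kankanamge and Mahendran (2011) or the proposed design rules), the section moment capacity of a joist at an elevated temperature can be approximated as the effective section modulus multiplied by the temperature-reduced yield stress:

```python
# Illustrative only: section moment capacity of a cold-formed steel joist at an
# elevated hot-flange temperature, using placeholder strength-reduction factors.
# The actual design rules follow AS/NZS 4600, NAS and Eurocode 3 Part 1.3 with
# measured mechanical-property reduction equations.

def yield_strength_reduction(temp_c):
    """Hypothetical piecewise-linear yield-stress reduction factor k_y(T)."""
    points = [(20, 1.00), (300, 0.80), (500, 0.45), (700, 0.10)]
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, k0), (t1, k1) in zip(points, points[1:]):
        if temp_c <= t1:
            return k0 + (k1 - k0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]

def section_moment_capacity(z_eff_mm3, fy_ambient_mpa, hot_flange_temp_c):
    """M_s = Z_eff * k_y(T) * f_y, converted from N.mm to kN.m."""
    return z_eff_mm3 * yield_strength_reduction(hot_flange_temp_c) * fy_ambient_mpa / 1e6

# Example: a joist with Z_eff = 25,000 mm^3 and f_y = 500 MPa, hot flange at 400 C.
print(f"{section_moment_capacity(25_000, 500, 400):.2f} kN.m")
```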

Relevance: 20.00%

Abstract:

A building information model (BIM) is an electronic repository of structured, three-dimensional data that captures both the physical and dynamic functional characteristics of a facility. In addition to its more traditional function as a tool to aid design and construction, a BIM can be used throughout the life cycle of a facility, functioning as a living database that places resources contained within the building in their spatial and temporal context. Through its comprehension of spatial relationships, a BIM can meaningfully represent and integrate previously isolated control and management systems and processes, and thereby provide a more intuitive interface to users. By placing processes in a spatial context, decision-making can be improved, with positive flow-on effects for security and efficiency. In this article, we systematically analyse the authorization requirements involved in the use of BIMs. We introduce the concept of using a BIM as a graphical tool to support spatial access control configuration and management (including physical access control). We also consider authorization requirements for regulating access to the structured data that exists within a BIM as well as to external systems and data repositories that can be accessed via the BIM interface. With a view to addressing these requirements we present a survey of relevant spatiotemporal access control models, focusing on features applicable to BIMs and highlighting capability gaps. Finally, we present a conceptual authorization framework that utilizes BIMs.
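
A highly simplified, hypothetical sketch of the kind of spatial access control a BIM could support (the spaces, roles, containment relation and time windows are invented for illustration): spaces form a containment hierarchy read from the model, and a permission granted on an enclosing space applies to everything it contains during its time window.

```python
from dataclasses import dataclass
from datetime import time

CONTAINS = {                       # child -> parent, as it might be read from a BIM
    "Room 2.14": "Level 2",
    "Plant Room": "Level 2",
    "Level 2": "Building A",
}

@dataclass
class Permission:
    role: str
    space: str
    start: time
    end: time

PERMISSIONS = [
    Permission("maintenance", "Level 2", time(7, 0), time(18, 0)),
    Permission("security", "Building A", time(0, 0), time(23, 59)),
]

def ancestors(space):
    """The space itself plus every enclosing space in the BIM hierarchy."""
    while space is not None:
        yield space
        space = CONTAINS.get(space)

def may_enter(role, space, at):
    """True if some permission covers the space or an enclosing space at time 'at'."""
    enclosing = set(ancestors(space))
    return any(
        p.role == role and p.space in enclosing and p.start <= at <= p.end
        for p in PERMISSIONS
    )

print(may_enter("maintenance", "Room 2.14", time(9, 30)))   # True
print(may_enter("maintenance", "Room 2.14", time(22, 0)))   # False
```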

Relevance: 20.00%

Abstract:

Three-dimensional models and groundwater quality data are combined to better understand and conceptualise groundwater systems in the complex geological settings of the Wairau Plain, Marlborough. Hydrochemical facies, which are characteristic of distinct evolutionary pathways and a common hydrologic history of groundwaters, are identified within geological formations to assess natural water-rock interactions, redox potential and the impact of human agricultural activity on groundwater quality in the Wairau Plain.
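
As a small illustrative aside (not the study's methodology), one simple convention labels a hydrochemical facies by its dominant cation and anion after converting major-ion concentrations to milliequivalents per litre; the sample values below are hypothetical.

```python
# Equivalent weights in g per equivalent (formula mass divided by ionic charge).
EQUIV_WEIGHT = {
    "Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
    "HCO3": 61.02, "SO4": 48.03, "Cl": 35.45,
}
CATIONS = ("Ca", "Mg", "Na", "K")
ANIONS = ("HCO3", "SO4", "Cl")

def facies(sample_mg_per_l):
    """Name the water type by its dominant cation and anion (in meq/L)."""
    meq = {ion: mg / EQUIV_WEIGHT[ion] for ion, mg in sample_mg_per_l.items()}
    cation = max(CATIONS, key=lambda i: meq.get(i, 0.0))
    anion = max(ANIONS, key=lambda i: meq.get(i, 0.0))
    return f"{cation}-{anion}"

# A hypothetical sample from a young, recharge-zone gravel aquifer (mg/L).
sample = {"Ca": 48, "Mg": 6, "Na": 12, "K": 2, "HCO3": 150, "SO4": 12, "Cl": 15}
print(facies(sample))   # Ca-HCO3, i.e. calcium-bicarbonate type water
```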

Relevance: 20.00%

Abstract:

Articular cartilage is a highly resilient tissue located at the ends of long bones. It has a zonal structure, which has functional significance in load-bearing. Cartilage does not spontaneously heal itself when damaged, and untreated cartilage lesions or age-related wear often lead to osteoarthritis (OA). OA is a degenerative condition that is highly prevalent, age-associated, and significantly affects patient mobility and quality of life. There is no cure for OA, and patients usually resort to replacing the biological joint with an artificial prosthesis. An alternative approach is to dynamically regenerate damaged or diseased cartilage through cartilage tissue engineering, where cells, materials and stimuli are combined to form new cartilage. However, despite extensive research, major limitations remain that have prevented the widespread application of tissue-engineered cartilage. Critically, there is a dearth of information on whether autologous chondrocytes obtained from OA patients can be used to successfully generate cartilage tissues with the structural hierarchy typically found in normal articular cartilage. I aim to address these limitations in this thesis by showing that chondrocyte subpopulations isolated from macroscopically normal areas of the cartilage can be used to engineer stratified cartilage tissues, and that compressive loading plays an important role in the zone-dependent biosynthesis of these chondrocytes. I first demonstrate that chondrocyte subpopulations from the superficial (S) and middle/deep (MD) zones of OA cartilage are responsive to compressive stimulation in vitro, and that the effect of compression on construct quality is zone-dependent. I also show that compressive stimulation can influence pericellular matrix production, matrix metalloproteinase secretion, and cytokine expression in zonal chondrocytes in an alginate hydrogel model. Subsequently, I focus on recreating the zonal structure by forming layered constructs using the alginate-released chondrocyte (ARC) method, either with or without polymeric scaffolds. The resulting zonal ARC constructs had hyaline morphology and expressed cartilage matrix molecules such as proteoglycans and collagen type II in both scaffold-free and scaffold-based approaches. Overall, my findings demonstrate that chondrocyte subpopulations obtained from OA joints respond sensitively to compressive stimulation and are able to form cartilaginous constructs with stratified organization similar to native cartilage using the scaffold-free and scaffold-based ARC techniques. The ultimate goal in tissue engineering is to help provide improved treatment options for patients suffering from debilitating conditions such as OA. Further investigations in developing functional cartilage replacement tissues using autologous chondrocytes will bring us a step closer to improving the quality of life for millions of OA patients worldwide.

Relevance: 20.00%

Abstract:

In this paper, the goal of identifying disease subgroups based on differences in observed symptom profile is considered. Commonly referred to as phenotype identification, solutions to this task often involve the application of unsupervised clustering techniques. In this paper, we investigate the application of a Dirichlet Process mixture (DPM) model for this task. This model is defined by the placement of the Dirichlet Process (DP) on the unknown components of a mixture model, allowing for the expression of uncertainty about the partitioning of observed data into homogeneous subgroups. To exemplify this approach, an application to phenotype identification in Parkinson’s disease (PD) is considered, with symptom profiles collected using the Unified Parkinson’s Disease Rating Scale (UPDRS).

Keywords: clustering, Dirichlet Process mixture, Parkinson’s disease, UPDRS.
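
A minimal sketch of DPM-style clustering on synthetic data (this uses scikit-learn's truncated Dirichlet process mixture fitted by variational inference rather than the MCMC machinery usually employed in such studies, and the UPDRS-like scores are simulated, not patient data):

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic stand-in for two-dimensional symptom profiles: two latent subgroups
# differing in tremor-dominant versus gait-dominant scores.
rng = np.random.default_rng(0)
tremor_dominant = rng.normal([30, 10], 4, size=(60, 2))
gait_dominant = rng.normal([12, 28], 4, size=(60, 2))
profiles = np.vstack([tremor_dominant, gait_dominant])

# The Dirichlet process prior on the mixture weights lets the effective number
# of phenotypes be inferred from the data rather than fixed in advance.
dpm = BayesianGaussianMixture(
    n_components=10,                                   # truncation level
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    random_state=0,
).fit(profiles)

labels = dpm.predict(profiles)
print("non-empty clusters:", np.unique(labels).size)
print("mixture weights:", np.round(dpm.weights_, 2))
```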

Relevance: 20.00%

Abstract:

Quality oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers’ expectations by supplying reliable, good quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now being applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses.

Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; then a data capturing algorithm using Bayesian decision making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.

Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor the radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time.

Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart’s signal. This estimate aims to facilitate root cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach enables us to obtain highly informative estimates of change point parameters, since probability-distribution-based results are obtained. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of the shifts, compared with a priori known causes, detected by control charts in monitoring the rates of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then continued in healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention which is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model.

The simulation studies undertaken in each research component, together with the empirical results obtained, indicate that the developed Bayesian estimators are a strong alternative for change point estimation within the quality improvement cycle in healthcare surveillance, as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
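
As a toy illustration of the core idea (a single step change in a Poisson rate with conjugate Gamma priors and an exhaustive posterior over the change time, rather than the hierarchical MCMC treatment described above; the counts are simulated):

```python
import numpy as np
from scipy.special import gammaln

def segment_logml(counts, a=1.0, b=1.0):
    """Log marginal likelihood (up to a tau-independent constant) of Poisson
    counts whose rate has a conjugate Gamma(a, b) prior."""
    s, m = counts.sum(), len(counts)
    return a * np.log(b) - gammaln(a) + gammaln(a + s) - (a + s) * np.log(b + m)

def change_point_posterior(counts):
    """Posterior over the step-change time tau, with a uniform prior on tau."""
    n = len(counts)
    log_post = np.array([
        segment_logml(counts[:tau]) + segment_logml(counts[tau:])
        for tau in range(1, n)
    ])
    log_post -= log_post.max()                 # stabilise before exponentiating
    post = np.exp(log_post)
    return post / post.sum()

# Simulated adverse-event counts: rate 2 per period stepping up to 5 at period 30.
rng = np.random.default_rng(1)
y = np.concatenate([rng.poisson(2, 30), rng.poisson(5, 20)])
post = change_point_posterior(y)
print("most probable change point:", np.argmax(post) + 1)   # close to 30
```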

Relevance: 20.00%

Abstract:

This paper examines the effects of an eco-driving message on driver distraction. Two in-vehicle distracter tasks were compared with an eco-driving task and a baseline task in an advanced driving simulator. Twenty-two subjects (N = 22) were asked to perform an eco-driving task, a CD-changing task and a navigation task while engaged in critical manoeuvres, during which they were expected to respond to a peripheral detection task (PDT); the total duration was 3.5 h. The study involved two sessions over two consecutive days. The results show that drivers’ mental workloads are significantly higher during the navigation and CD-changing tasks than in the two other scenarios. However, the mental workload of eco-driving is still marginally significant (p ≈ .05) across the different manoeuvres. Similarly, the event detection results show that drivers miss significantly more events in the navigation and CD-changing scenarios than in both the baseline and eco-driving scenarios. Analysis of the practice effect shows that the baseline and navigation scenarios place significantly less demand on drivers on the second day. Drivers can also detect significantly more events on the second day in all scenarios. The authors conclude that even reading a simple message while driving could potentially lead to missing an important event, especially when executing critical manoeuvres. However, there is some evidence of a practice effect, which suggests that future research should focus on performance with habitual rather than novel tasks. It is therefore recommended that eco-driving messages analogous to those used in this study should not be delivered to drivers as on-line text while the vehicle is in motion.

Relevance: 20.00%

Abstract:

Dealing with the aggression of other drivers on the road is an important skill, given that driving is a common activity for adults in highly motorised countries. Even though incidents of extreme aggression on the road (such as assault) are reportedly rare, milder forms, some of them dangerous (such as tailgating, or deliberately following too closely), are apparently common and may be increasing. At the very least, this is likely to render the driving environment more stressful; at worst, it elevates the risk of crashing by increasing both the level of risky driving behaviours and the likelihood of responses that escalate the situation. Thus the need for drivers to manage incidents of conflict is likely to become increasingly important. However, little research examines how drivers manage their own or others’ aggressive driving behaviour. Recently, greater attention has been paid to driver cognitions, especially the attributions that drivers make about other drivers, which may then influence their own driving responses, particularly aggressive or risky ones. The study reported below was the first in a larger exploration of aggressive driving that focussed on driver cognitions, emotions and underlying motivations for aggressive behaviours on the road. Qualitative, in-depth interviews of drivers (n = 30, aged 18-49 years) were subjected to thematic analysis to investigate driver experiences with aggressive driving. Two main themes were identified from these accounts: driver management of self, and driver attempts to influence or manage other drivers. This paper describes the subthemes falling under the management of self theme. These subthemes were labelled ‘being magnanimous’, ‘chilling out’, ‘slowing down’ and ‘apology/acknowledgment’. ‘Being magnanimous’ referred to situations where the respondent perceived him/herself to be the recipient of another’s aggressive driving and made a deliberate choice not to respond; a characteristic of this subtheme, however, was that the choice was underpinned by the adoption of a morally superior stance, or sense of magnanimity. ‘Chilling out’ was a more general response to both the milder aggressive behaviours of other drivers and the general frustrations of driving. ‘Slowing down’ referred to reducing one’s speed in response to the perceived aggressive driving, often tailgating, of another; this subtheme appeared to consist of two separate underlying motivations, one a genuine concern for one’s own safety and the other more aimed at “getting back” at the other driver. ‘Apology/acknowledgment’ referred to how drivers modified their more negative reactions and responses when another driver made gestures that acknowledged having made a mistake, indicated an apology, or acknowledged the recipient driver. These subthemes are discussed in relation to their implications for understanding aggressive driving and intervening to reduce it.

Relevance: 20.00%

Abstract:

This research project involved two studies aimed at determining whether drivers who have experienced a traffic crash resulting in a Whiplash Associated Disorder (WAD) are at an elevated risk of a subsequent traffic crash. Using data and records held by the Queensland Motor Accident Insurance Commission (MAIC) and the Queensland Transport Crash Database (QTCD), the first study examined the crash involvement of two samples of drivers subsequent to a crash in which a compensable injury was incurred. One sample was of persons who had suffered a WAD, the second of persons with a soft tissue injury of equivalent severity. Since differentially altered driving exposure following the relevant injury in the two groups could be a potential confound, in the second study such exposure was estimated using survey data obtained from a sample of similarly injured drivers. These studies were supplemented by a brief analysis of qualitative data drawn from open-ended questions in the survey. In addition, a comprehensive review of the literature on impaired driving due to similar medical conditions was undertaken and is reported.

Relevance: 20.00%

Abstract:

In microscopic traffic simulators, the interactions between vehicles are considered, and the dynamics of the system becomes an emergent property of the interactions between its components. Such interactions include lane-changing and car-following behaviours and intersection management. Although such simulators in some cases produce realistic predictions, they do not allow for an important aspect of the dynamics, namely the driver-vehicle interaction. This paper introduces a physically sound vehicle-driver model for realistic microscopic simulation. By building a nanoscopic traffic simulation model that uses steering angle and throttle position as parameters, the model aims to overcome the unrealistic acceleration and deceleration values found in various microscopic simulation tools. A physics engine calculates the driving force of the vehicle, and the preliminary results presented here show that, through a realistic driver-vehicle-environment simulator, it becomes possible to model realistic driver and vehicle behaviours in a traffic simulation.
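
A toy sketch in the spirit of such a driver-vehicle model (a kinematic bicycle model with a lumped drive force and drag, and invented parameter values; not the paper's physics engine): throttle and steering angle are the inputs, and acceleration emerges from the vehicle's mass and resistances rather than being prescribed directly.

```python
import math
from dataclasses import dataclass

@dataclass
class Vehicle:
    x: float = 0.0
    y: float = 0.0
    heading: float = 0.0              # rad
    speed: float = 0.0                # m/s
    mass: float = 1500.0              # kg
    wheelbase: float = 2.7            # m
    max_drive_force: float = 6000.0   # N at full throttle
    drag_coeff: float = 0.8           # lumped aero + rolling resistance, N/(m/s)^2

    def step(self, throttle, steering_angle, dt=0.05):
        """Advance one time step; throttle in [0, 1], steering angle in radians."""
        force = throttle * self.max_drive_force - self.drag_coeff * self.speed ** 2
        self.speed = max(0.0, self.speed + (force / self.mass) * dt)
        # Kinematic bicycle model: yaw rate = v / L * tan(delta).
        self.heading += (self.speed / self.wheelbase) * math.tan(steering_angle) * dt
        self.x += self.speed * math.cos(self.heading) * dt
        self.y += self.speed * math.sin(self.heading) * dt

car = Vehicle()
for _ in range(200):                  # 10 s of half throttle with a slight left steer
    car.step(throttle=0.5, steering_angle=0.02)
print(f"speed = {car.speed:.1f} m/s, position = ({car.x:.1f}, {car.y:.1f}) m")
```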

Relevance: 20.00%

Abstract:

Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, "contextuality", is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, "entanglement", allows cognitive phenomena to be modelled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light...
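
As a small worked illustration of an interference effect of the kind described (a two-dimensional toy model with invented numbers, not an example taken from the book): the probability assigned to an event B when a related question A is left unresolved can differ from the total-probability combination over A and not-A, and the gap is the interference term.

```python
import numpy as np

# Belief state and events represented as unit vectors in a 2-D Hilbert space.
psi = np.array([np.cos(np.pi / 3), np.sin(np.pi / 3)])        # current belief state
A, not_A = np.array([1.0, 0.0]), np.array([0.0, 1.0])         # the resolved question A
theta = np.pi / 5
B = np.array([np.cos(theta), np.sin(theta)])                  # event B in a rotated basis

p_B_direct = (B @ psi) ** 2                                   # judge B with A unresolved
p_B_resolved = (A @ psi) ** 2 * (B @ A) ** 2 + (not_A @ psi) ** 2 * (B @ not_A) ** 2

print(f"P(B), A unresolved : {p_B_direct:.3f}")
print(f"P(B), A resolved   : {p_B_resolved:.3f}")
print(f"interference term  : {p_B_direct - p_B_resolved:+.3f}")
```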