623 results for Stochastic settling time


Relevance: 20.00%

Publisher:

Abstract:

Background: The 2003 Bureau of Labor Statistics American Time Use Survey (ATUS) contains 438 distinct primary activity variables that can be analyzed with regard to how Americans spend their time. The Compendium of Physical Activities is used to code physical activities derived from various surveys, logs, diaries, etc., to facilitate comparison of coded intensity levels across studies.

Methods: This paper describes the methods, challenges, and rationale for linking Compendium estimates of physical activity intensity (METs, metabolic equivalents) with all activities reported in the 2003 ATUS.

Results: The assigned ATUS intensity levels are not intended to compute the energy costs of physical activity in individuals. Instead, they are intended to identify time spent in activities broadly classified by type and intensity. This function will complement public health surveillance systems and aid in policy and health-promotion activities. For example, at least one of the future projects of this process is the descriptive epidemiology of time spent in common physical activity intensity categories.

Conclusions: The process of metabolic coding of the ATUS by linking it with the Compendium of Physical Activities can make important contributions to our understanding of Americans' time spent in health-related physical activity.
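The linking idea can be sketched as a simple code-to-MET lookup. In this sketch the activity codes and MET values are hypothetical placeholders (not taken from the ATUS or the Compendium), and the cutpoints (<3 METs light, 3-6 moderate, >6 vigorous) are the conventional public-health intensity categories:

```python
# Hypothetical ATUS-style activity codes mapped to example MET values.
MET_LOOKUP = {
    "020101": 3.3,   # housework (example value)
    "130112": 7.0,   # jogging (example value)
    "120303": 1.3,   # watching TV (example value)
}

def intensity_category(met):
    """Conventional cutpoints: <3 light, 3-6 moderate, >6 vigorous."""
    if met < 3.0:
        return "light"
    if met <= 6.0:
        return "moderate"
    return "vigorous"

def minutes_by_intensity(diary):
    """diary: list of (activity_code, minutes) tuples from one respondent."""
    totals = {"light": 0, "moderate": 0, "vigorous": 0}
    for code, minutes in diary:
        totals[intensity_category(MET_LOOKUP[code])] += minutes
    return totals

print(minutes_by_intensity([("020101", 60), ("130112", 30), ("120303", 120)]))
# {'light': 120, 'moderate': 60, 'vigorous': 30}
```

This yields time-by-intensity totals of the kind the paper describes, without attempting individual energy-cost estimates.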


Where airports were once the sole responsibility of their governments, the liberalisation of economies has seen administrative interests in airport spaces divested increasingly towards market-led authority. Extant literature suggests that actions in decision spaces can be described under broad idealised forms of governance. However, in looking at a sample of 18 different airports, it is apparent that these classic models are insufficient to appreciate the contextual complexity of each case. Issues of institutional arrangements, privatisation, and management focus are reviewed against existing governance modes to produce a model for informing privatisation decisions, based on the contextual needs of the individual airport and region. Expanding governance modes to include emergent airport arrangements both contributes to the existing literature and provides a framework to assist policy makers, and those charged with the operation of airports, to design effective governance models. In progressing this framework, contributions are made to government decision-makers for the development of new, or the review of existing, privatisation strategies, while the private sector can identify the intent and expectations of privatisation initiatives to make better-informed decisions.


Purpose: The purpose of this paper is to provide a labour process theory interpretation of four case studies within the Australian construction industry. In each case study a working time intervention (a shift to a five-day working week from the industry-standard six days) was implemented as an attempt to improve the work-life balance of employees.

Design/methodology/approach: This paper is based on four mixed-methods case studies. Each case study drew on a variety of data collection methods, including questionnaires, short and long interviews, and focus groups.

Findings: It was found that the complex mix of wage- and salary-earning staff within the construction industry, along with labour market pressures, means that changing to a five-day working week is quite a radical notion within the industry. However, some organisations are willing to explore opportunities for change, with mixed experiences.

Practical implications: The practical implications of this research include understanding the complexity within the Australian construction industry around hours of work and pay systems. Decision-makers within the construction industry must recognise a range of competing pressures that mean that "preferred" managerial styles might not be appropriate.

Originality/value: This paper shows that construction firms must take an active approach to reducing the culture of long working hours. This can only be achieved by addressing issues of project timelines and budgets and ensuring that take-home pay is not reliant on long hours of overtime.


This article examines the relationship between time structure and Macan's process model of time management. The study proposed that time structure ("appraisal of effective time usage") would be a more parsimonious mediator than perceived control over time in the relationship between time management behaviours and outcome variables such as job satisfaction and psychological well-being. Alternative structural models were compared using a sample of 111 university students. Model 1 tested Macan's process model of time management with perceived control over time as the mediator. Model 2 replaced perceived control over time with the construct of time structure. Model 3 examined the possibility of perceived control over time and time structure acting as parallel mediators of the relationships between time management and outcomes. Results showed that Model 1 and Model 2 fitted the data equally well. However, the mediated effects were small and partial in both models. This pattern of results calls for a reassessment of the process model.


This report, It’s About Time: Investing in Transportation to Keep Texas Economically Competitive, updates the February 2009 report by providing an enhanced analysis of the current state of the Texas transportation system, determining the household costs of under-investing in the system, and identifying potential revenue options for funding it. The general conclusion, however, has not changed: there are tremendous needs and high costs associated with “doing nothing new.”


A trend in the design and implementation of modern industrial automation systems is to integrate computing, communication and control into a unified framework at different levels of machine/factory operations and information processing. These distributed control systems are referred to as networked control systems (NCSs). They are composed of sensors, actuators, and controllers interconnected over communication networks. As most communication networks are not designed for NCS applications, the communication requirements of NCSs may not be satisfied. For example, traditional control systems require data to be accurate, timely and lossless; however, because of random transmission delays and packet losses, the performance of a control system may deteriorate badly, and the system may even be rendered unstable. The main challenge of NCS design is to maintain and improve stable control performance of an NCS. To achieve this, both communication and control methodologies have to be designed. In recent decades, Ethernet and 802.11 networks have been introduced into control networks, and have even replaced traditional fieldbus products in some real-time control applications, because of their high bandwidth and good interoperability. As Ethernet and 802.11 networks are not designed for distributed control applications, two aspects of NCS research need to be addressed to make these communication networks suitable for control systems in industrial environments. From the perspective of networking, communication protocols need to be designed to satisfy the communication requirements of NCSs, such as real-time communication and high-precision clock consistency. From the perspective of control, methods to compensate for network-induced delays and packet losses are important for NCS design.
To make Ethernet-based and 802.11 networks suitable for distributed control applications, this thesis develops a high-precision relative clock synchronisation protocol and an analytical model for analysing the real-time performance of 802.11 networks, and designs a new predictive compensation method. Firstly, a hybrid NCS simulation environment based on the NS-2 simulator is designed and implemented. Secondly, a high-precision relative clock synchronisation protocol is designed and implemented. Thirdly, transmission delays in 802.11 networks for soft real-time control applications are modelled using a Markov chain model in which real-time Quality-of-Service parameters are analysed under a periodic traffic pattern. The Markov chain model accurately captures the tradeoff between real-time performance and throughput performance. Furthermore, a cross-layer optimisation scheme, featuring application-layer flow rate adaptation, is designed to achieve a tradeoff between certain real-time and throughput performance characteristics in a typical NCS scenario with a wireless local area network. Fourthly, as a co-design approach for both the network and the controller, a new predictive compensation method for variable delay and packet loss in NCSs is designed, in which simultaneous end-to-end delays and packet losses during packet transmissions from sensors to actuators are tackled. The effectiveness of the proposed predictive compensation approach is demonstrated using our hybrid NCS simulation environment.
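As a rough illustration of the predictive-compensation idea (a hypothetical scalar plant, not the controller designed in the thesis), a delayed sensor reading can be rolled forward through the control inputs applied during the delay:

```python
# Assumed scalar plant model: x[k+1] = A*x[k] + B*u[k] (hypothetical values).
A, B = 0.9, 0.5

def predict_state(x_delayed, u_history, d):
    """Roll a stale measurement x[k-d] forward through the d control
    inputs that were applied while the packet was in transit."""
    x = x_delayed
    for u in u_history[-d:]:     # inputs applied during the delay
        x = A * x + B * u
    return x

# Stale measurement x[k-2] = 1.0; inputs u[k-2], u[k-1] were 0.2, 0.1.
x_hat = predict_state(1.0, [0.2, 0.1], d=2)
print(round(x_hat, 4))  # 0.95
```

The controller then acts on the predicted current state rather than the stale one, which is the basic mechanism any delay-compensation scheme of this kind relies on.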


Automated visual surveillance of crowds is a rapidly growing area of research. In this paper we focus on motion representation for the purpose of abnormality detection in crowded scenes. We propose a novel visual representation called textures of optical flow. The proposed representation measures the uniformity of a flow field in order to detect anomalous objects such as bicycles, vehicles and skateboarders, and can be combined with spatial information to detect other forms of abnormality. We demonstrate that the proposed approach outperforms state-of-the-art anomaly detection algorithms on a large, publicly available dataset.
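A toy uniformity statistic illustrates the intuition (this is an illustrative measure, not the paper's exact definition of textures of optical flow):

```python
import math

def flow_uniformity(flow):
    """Score a window of optical-flow vectors (dx, dy) by how well they
    agree in direction: 1.0 when all vectors point the same way, near 0
    when directions cancel out (as for an erratic anomalous object)."""
    mags = [math.hypot(dx, dy) for dx, dy in flow]
    if sum(mags) == 0:
        return 1.0                       # no motion: trivially uniform
    mx = sum(dx for dx, _ in flow) / len(flow)
    my = sum(dy for _, dy in flow) / len(flow)
    # Ratio of the mean vector's magnitude to the mean magnitude.
    return math.hypot(mx, my) / (sum(mags) / len(mags))

coherent = [(1.0, 0.0)] * 8                      # all moving right
erratic = [(1, 0), (-1, 0), (0, 1), (0, -1)]     # directions cancel
print(flow_uniformity(coherent), flow_uniformity(erratic))  # 1.0 0.0
```

A low uniformity score in a window would then be one cue, combinable with spatial information, for flagging abnormality.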


Current knowledge about the relationship between transport disadvantage and activity space size is limited to urban areas; as a result, very little is known to date about this link in a rural context. In addition, although research has identified transport-disadvantaged groups based on the size of their activity spaces, these studies have not empirically explained such differences, and the result is often poor identification of the problems facing disadvantaged groups. Research has shown that transport disadvantage varies over time, yet the static nature of analysis using the activity space concept in previous studies has lacked the ability to identify transport disadvantage in time. Activity space is a dynamic concept, and it therefore possesses great potential for capturing temporal variations in behaviour and access to opportunities. This research derives measures of the size and fullness of activity spaces for 157 individuals for weekdays, weekends, and a whole week, using weekly activity-travel diary data from three case study areas located in rural Northern Ireland. Four focus groups were also conducted in order to triangulate the quantitative findings and to explain the differences between socio-spatial groups. The findings show that despite having smaller activity spaces, individuals were not disadvantaged, because they were able to access their required activities locally. Car ownership was found to be an important lifeline in rural areas; temporal disaggregation of the data reveals that this is true only on weekends, due to a lack of public transport services. In addition, despite activity spaces being of a similar size, the fullness of the activity spaces of low-income individuals was found to be significantly lower than that of their high-income counterparts.
Focus group data show that financial constraints, and poor connections both between public transport services and between transport routes and opportunities, forced individuals to participate in activities located along the main transport corridors.
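One common proxy for activity-space size, the area of the convex hull around the locations a person visits, can be sketched as follows (the coordinates are hypothetical):

```python
def convex_hull(points):
    """Monotone-chain convex hull; points are (x, y) tuples."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hull_area(points):
    """Shoelace formula over the hull vertices."""
    h = convex_hull(points)
    return 0.5 * abs(sum(h[i][0]*h[(i+1) % len(h)][1] -
                         h[(i+1) % len(h)][0]*h[i][1]
                         for i in range(len(h))))

# A diary visiting five places on a km grid; the interior point does
# not enlarge the hull, so the area is that of the 2x3 rectangle.
print(hull_area([(0, 0), (2, 0), (2, 3), (0, 3), (1, 1)]))  # 6.0
```

Computing this per weekday, weekend, and whole week is what makes the temporal disaggregation described above possible; fullness measures additionally account for how much of the hull is actually used.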


Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic and amino acid sequences, and the results of these investigations in turn inspire further biological research. These biological sequences, which may be considered multiscale sequences, have specific features that require more refined methods to characterise. This project aims to address some of these biological challenges with multiscale analysis methods and a stochastic modelling approach. The first part of the thesis aims to cluster some unknown proteins, and to classify their families as well as their structural classes. A development in proteomic analysis is concerned with the determination of protein functions, and the first step in this development is to classify proteins and predict their families. This motivates us to study some unknown proteins from specific families, and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies, and link them to simulate unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins. The second part of the thesis aims to explore the relationships of proteins through a layered comparison of their components. Many methods are based on protein homology, because resemblance at the protein sequence level normally indicates similarity of functions and structures. However, some proteins may have similar functions despite low sequence identity. We therefore consider protein sequences at a detailed level to investigate the problem of protein comparison.
The comparison is based on the empirical mode decomposition (EMD), and protein sequences are decomposed into intrinsic mode functions. A measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships between proteins. The third part of the thesis aims to investigate the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expression has become a focus of genomic analysis, researchers have tried for many years to understand the mechanisms of the yeast genome, but how cells control gene expression still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene, and modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and the estimated transcriptional rate is also calculated using information from five possible transcriptional regulators. Some regulators of these target genes are verified against the related references. With these results, we construct a transcriptional regulatory network for genes from the yeast Saccharomyces cerevisiae. The construction of the transcriptional regulatory network is useful for uncovering further mechanisms of the yeast cell cycle.
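The modelling idea in the third part can be illustrated with a minimal Euler-Maruyama simulation of a mean-reverting expression profile; the rates below are hypothetical, not the thesis's fitted parameters:

```python
import math
import random

def simulate(r=2.0, d=0.5, s=0.1, x0=0.0, dt=0.01, steps=2000, seed=42):
    """Euler-Maruyama for dx = (r - d*x) dt + s dW: expression level x
    driven by transcription rate r, degradation d, and Gaussian noise."""
    random.seed(seed)
    x = x0
    for _ in range(steps):
        x += (r - d * x) * dt + s * math.sqrt(dt) * random.gauss(0, 1)
    return x

# With modest noise the trajectory settles near the fixed point r/d = 4.
print(round(simulate(), 2))
```

A fitted model of this form yields an estimated transcriptional rate per target gene, which is the quantity the thesis links to candidate regulators.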


This paper discusses the relationship between law and morality. Morality does not necessarily coincide with the law, but it contributes to it. An act may be legal but nevertheless considered immoral in a particular society. For example, the use of pornography may be considered by many to be immoral; nevertheless, the sale and distribution of non-violent, non-child-related, sexually explicit material is legal (or regulated) in many jurisdictions. Many laws are informed by, and even created by, morality. This paper examines the historical influence of morality on the law and on society in general. It aims to develop a theoretical framework for examining legal moralism and the social construction of morality and crime, as well as the relationship between sex, desire and taboo. Here, we refer to the moral temporality of sex and taboo, which examines the way in which moral judgments about sex and what is considered taboo change over time, and the kinds of justifications that are employed in support of changing moralities. It unpacks the way in which abstract and highly tenuous concepts such as "desire", "art" and "entertainment" may be "out of time" with morality, and how morality shapes laws over time, fabricating justifications from within socially constructed communities of practice. This theoretical framework maps the way in which these concepts have become temporally dominated by heteronormative structures such as the family, marriage, reproduction, and longevity. It is argued that the logic of these structures is inexorably tied to the heterosexual life-path, charting individual lives and relationships through explicit phases of childhood, adolescence and adulthood that, in the twenty-first century, delimit the boundaries of taboo surrounding sex more than at any other time in history.


Focusing on the conditions that an optimization problem may comply with, so-called convergence conditions are proposed, and subsequently a stochastic optimization algorithm, named the DSZ algorithm, is presented to deal with both unconstrained and constrained optimization. The principle is discussed through the theoretical model of the DSZ algorithm, from which we derive the practical model of the DSZ algorithm. The efficiency of the practical model is demonstrated by comparison with similar algorithms, such as Enhanced simulated annealing (ESA), Monte Carlo simulated annealing (MCS), Sniffer Global Optimization (SGO), Directed Tabu Search (DTS), and the Genetic Algorithm (GA), using a set of well-known unconstrained and constrained optimization test cases. Further attention is given to strategies for optimizing high-dimensional unconstrained problems using the DSZ algorithm.
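For orientation, one of the named baselines, simulated annealing, can be sketched in a few lines (this is a generic textbook version on the sphere function, not the DSZ algorithm or the ESA/MCS variants):

```python
import math
import random

def simulated_annealing(f, x0, t0=1.0, cooling=0.995, iters=4000, seed=1):
    """Minimise f starting from x0 with Gaussian proposal steps and a
    geometric cooling schedule."""
    random.seed(seed)
    x, fx, t = list(x0), f(x0), t0
    for _ in range(iters):
        cand = [xi + random.gauss(0, 0.1) for xi in x]
        fc = f(cand)
        # Accept downhill moves always, uphill with Boltzmann probability.
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
        t *= cooling
    return x, fx

# Sphere function: global minimum 0 at the origin.
best, val = simulated_annealing(lambda v: sum(xi * xi for xi in v), [3.0, -2.0])
print(round(val, 3))
```

Stochastic optimizers of this family differ mainly in how proposals are generated and accepted, which is exactly where convergence conditions of the kind the paper studies come into play.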


Objective: To quantify the lagged effects of mean temperature on deaths from cardiovascular diseases in Brisbane, Australia.

Design: Polynomial distributed lag models were used to assess the percentage increase in mortality up to 30 days associated with an increase (or decrease) of 1°C above (or below) the threshold temperature.

Setting: Brisbane, Australia.

Patients: 22 805 cardiovascular deaths registered between 1996 and 2004.

Main outcome measures: Deaths from cardiovascular diseases.

Results: The results show a longer lagged effect on cold days and a shorter lagged effect on hot days. For the hot effect, a statistically significant association was observed only for lags of 0–1 days. The percentage increase in mortality was 3.7% (95% CI 0.4% to 7.1%) for people aged ≥65 years and 3.5% (95% CI 0.4% to 6.7%) for all ages, associated with an increase of 1°C above the threshold temperature of 24°C. For the cold effect, a significant effect of temperature was found for lags of 10–15 days. The percentage estimates for older people and all ages were 3.1% (95% CI 0.7% to 5.7%) and 2.8% (95% CI 0.5% to 5.1%), respectively, associated with a decrease of 1°C below the threshold temperature of 24°C.

Conclusions: The lagged effects lasted longer for cold temperatures and were shorter for hot temperatures. There was no substantial difference in the lag effect of temperature on mortality between all ages and those aged ≥65 years in Brisbane, Australia.
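In a log-linear model of this kind, the cumulative percentage change per 1°C is obtained by summing the lag-specific coefficients. The coefficients below are hypothetical values chosen so the totals match the reported estimates, purely to illustrate the arithmetic:

```python
import math

# Hypothetical lag-specific coefficients (log-relative-risk per degree C),
# chosen only so the cumulative totals reproduce the reported figures.
hot_lag_betas = [0.025, 0.011]                                # lags 0-1
cold_lag_betas = [0.006, 0.005, 0.005, 0.005, 0.004, 0.003]   # lags 10-15

def cumulative_pct(betas):
    """Cumulative % change in mortality: (exp(sum of betas) - 1) * 100."""
    return (math.exp(sum(betas)) - 1.0) * 100.0

print(round(cumulative_pct(hot_lag_betas), 1))   # 3.7
print(round(cumulative_pct(cold_lag_betas), 1))  # 2.8
```

The distributed lag model constrains how these coefficients vary across lags (here polynomially), which is what allows effects stretched over many days, as for cold, to be estimated stably.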


Suburbanisation has been a major international phenomenon in recent decades. Suburb-to-suburb trips are now the most widespread road journeys, and this has increased the distances travelled, particularly on faster suburban highways. The design of highways tends to over-simplify the driving task, which can result in decreased alertness. Driving behaviour is consequently impaired, and drivers are then more likely to be involved in road crashes. This is particularly dangerous on highways where the speed limit is high. While effective countermeasures to this decrement in alertness do not currently exist, the development of in-vehicle sensors opens avenues for monitoring driving behaviour in real time. The aim of this study is to evaluate in real time the level of alertness of the driver through surrogate measures that can be collected from in-vehicle sensors. Slow EEG activity is used as a reference to evaluate the driver's alertness. Data are collected in a driving simulator instrumented with an eye-tracking system, a heart rate monitor and an electrodermal activity device (N=25 participants). Four different types of highways (driving scenarios of 40 minutes each) are implemented through variation of the road design (amount of curves and hills) and the roadside environment (amount of buildings and traffic). We show with neural networks that reduced alertness can be detected in real time with an accuracy of 92% using lane positioning, steering wheel movement, head rotation, blink frequency, heart rate variability and skin conductance level. These results show that it is possible to assess a driver's alertness with surrogate measures. This methodology could be used to warn drivers of their alertness level through an in-vehicle device monitoring drivers' behaviour on highways in real time, and could therefore improve road safety.
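A minimal sketch of combining the surrogate measures into an alertness score (a simple logistic combination with hypothetical weights, far simpler than the neural networks used in the study):

```python
import math

# Hypothetical weights on standardized features; signs reflect the
# direction each cue is expected to move with reduced alertness.
WEIGHTS = {
    "lane_sd": 1.2, "steer_reversals": 0.8, "head_rot": 0.5,
    "blink_rate": 1.0, "hrv": -0.9, "scl": -0.7,
}

def p_reduced_alertness(features, bias=-0.5):
    """Logistic score: probability-like value in (0, 1)."""
    z = bias + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

alert = {"lane_sd": -1.0, "steer_reversals": -0.5, "head_rot": 0.0,
         "blink_rate": -0.8, "hrv": 0.6, "scl": 0.4}
drowsy = {k: -v for k, v in alert.items()}
print(p_reduced_alertness(alert) < 0.5 < p_reduced_alertness(drowsy))  # True
```

In the study the mapping from features to an alertness label is learned from data (with slow EEG activity as ground truth) rather than hand-weighted as here.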


This paper presents a method for measuring the in-bucket payload volume on a dragline excavator for the purpose of estimating the material's bulk density in real time. Knowledge of the payload's bulk density can provide feedback to mine planning and scheduling to improve blasting and thereby produce a more uniform bulk density across the excavation site. This allows a single optimal bucket size to be used for maximum overburden removal per dig, in turn reducing costs and emissions in dragline operation and maintenance. The proposed solution uses a range-bearing laser to locate and scan full buckets between the lift and dump stages of the dragline cycle. The bucket is segmented from the scene using cluster analysis, and the pose of the bucket is calculated using the Iterative Closest Point (ICP) algorithm. Payload points are identified using a known model and subsequently converted into a height grid for volume estimation. Results from both scaled and full-scale implementations show that this method can achieve an accuracy above 95%.
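The final volume-estimation step can be sketched once payload points have been binned into a height grid (the grid values and 1 m cell size are hypothetical; the segmentation and ICP stages are not reproduced):

```python
def grid_volume(heights, cell_area=1.0, floor=0.0):
    """Volume from a height grid: cell area times the summed payload
    surface heights above the bucket floor.
    heights: 2D list of per-cell surface heights (m)."""
    return cell_area * sum(max(h - floor, 0.0)
                           for row in heights for h in row)

# Hypothetical 3x3 grid of payload heights over a 1 m^2 cell.
heights = [[0.5, 1.0, 0.5],
           [1.0, 2.0, 1.0],
           [0.5, 1.0, 0.5]]
print(grid_volume(heights))  # 8.0 (cubic metres)
```

Dividing a weighed payload mass by a volume computed this way is what yields the real-time bulk density estimate the paper targets.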


The elastic task model, a significant development in the scheduling of real-time control tasks, provides a mechanism for flexible workload management in uncertain environments. It specifies how to adjust control periods to satisfy workload constraints. However, it is not directly linked to quality-of-control (QoC) management, the ultimate goal of a control system, and as a result it does not specify how to make the best use of system resources to maximize the QoC improvement. To fill this gap, a new feedback scheduling framework, which we refer to as QoC elastic scheduling, is developed in this paper for real-time process control systems. It addresses the QoC directly by embedding both QoC management and workload adaptation in a constrained optimization problem. The resulting solution for period adjustment is in closed form, expressed in terms of QoC measurements, enabling closed-loop feedback of the QoC to the task scheduler. Whenever the QoC elastic scheduler is activated, it improves the QoC as much as possible while still meeting the system constraints. Examples are given to demonstrate the effectiveness of QoC elastic scheduling.
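For context, the compression step of the classic elastic task model, which this framework builds on, can be sketched as follows; the task set is hypothetical, and the sketch omits the iteration needed when a task hits its maximum period:

```python
def elastic_compress(tasks, u_bound):
    """Elastic compression: when total utilization exceeds u_bound, each
    task's utilization U_i = C_i/T_i is reduced in proportion to its
    elastic coefficient E_i. tasks: list of (wcet, nominal_period, E_i).
    Returns the adjusted periods."""
    u0 = [c / t for c, t, _ in tasks]
    overload = sum(u0) - u_bound
    e_sum = sum(e for _, _, e in tasks)
    periods = []
    for (c, t, e), u in zip(tasks, u0):
        u_new = u - overload * e / e_sum if overload > 0 else u
        periods.append(c / u_new)
    return periods

# Three tasks at total utilization 1.1, compressed to the bound 1.0;
# the task with the larger elastic coefficient gives up more utilization.
tasks = [(10, 20, 1.0), (10, 25, 1.0), (10, 50, 2.0)]
print([round(p, 2) for p in elastic_compress(tasks, 1.0)])
# [21.05, 26.67, 66.67]
```

QoC elastic scheduling replaces the purely workload-driven compression above with a closed-form adjustment driven by measured QoC, so the available utilization is spent where it improves control quality most.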