494 results for Dynamic Failure
Abstract:
Construction practitioners often experience unexpected results from their scheduling-related decisions, mainly because the dynamic nature of the construction system is poorly understood. Despite its importance, this issue has received little attention and few empirical studies have addressed it. This paper therefore analyzes the effect of aggressive scheduling, overtime, resource adding, and schedule slippage on construction performance, focusing on workers’ reactions to those scheduling decisions. Survey data from 102 construction practitioners on 38 construction sites are used for the analysis. The results indicate that efforts to increase the work rate through overtime, resource adding, and aggressive scheduling can be offset by losses in productivity and quality. Based on the research findings, practical guidelines are then discussed to help site managers deal effectively with the dynamics of scheduling and improve construction performance.
Abstract:
Previous research on construction innovation has commonly recognized the importance of the organizational climate and of key individuals, often called “champions,” for the success of innovation. However, it rarely focuses on the role of participants at the project level or addresses the dynamics of construction innovation. This paper therefore presents a dynamic innovation model developed using the concept of system dynamics. The model incorporates the influence of several individual and situational factors and highlights two critical elements that drive construction innovations: (1) normative pressure created by project managers through their championing behavior, and (2) instrumental motivation of team members facilitated by a supportive organizational climate. The model is qualified empirically, using the results of a survey of project managers and their project team members working for general contractors in Singapore, by assessing causal relationships among key model variables. Finally, the paper discusses the implications of the model structure for fostering construction innovations.
Abstract:
The objective of this research was to investigate the effects of driving conditions and suspension parameters on the dynamic load-sharing of longitudinal-connected air suspensions of a tri-axle semi-trailer. A novel nonlinear model of a multi-axle semi-trailer with longitudinal-connected air suspension was formulated based on fluid mechanics and thermodynamics and was validated against test results. The effects of driving conditions and suspension parameters on dynamic load-sharing and road-friendliness of the semi-trailer were then analyzed. Simulation results indicate that the road-friendliness metric, the DLC (dynamic load coefficient), is not always in accordance with the load-sharing metric, the DLSC (dynamic load-sharing coefficient). The effect of employing larger air lines and connectors on the DLSC optimization ratio gives varying results as road roughness and driving speed increase. When the vehicle load decreases, or the static pressure increases, the DLSC optimization ratio declines monotonically. The results also indicate that, provided the air line diameter is larger than the connector diameter, the influence of air line diameter on load-sharing is more significant than that of the connector.
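The abstract does not define the two metrics. As a rough illustration of why they can disagree, the sketch below computes a per-axle DLC (standard deviation of the tyre force over its mean, a common definition) and a group DLSC taken here as the RMS deviation of the instantaneous load-sharing coefficient from its ideal value of one. Both formulations, and all numbers, are assumptions for illustration rather than the authors' exact definitions.

```python
import numpy as np

def dlc(force):
    """Dynamic load coefficient: std of the tyre force history over its mean (common definition)."""
    return np.std(force) / np.mean(force)

def dlsc(forces):
    """Dynamic load-sharing coefficient for a group of axles (assumed formulation).

    forces: array of shape (n_axles, n_samples).
    The instantaneous load-sharing coefficient of axle k is
        lsc_k(t) = n * F_k(t) / sum_j F_j(t)   (ideal sharing -> lsc = 1),
    and DLSC is taken here as the RMS deviation of lsc from 1 over axles and time.
    """
    forces = np.asarray(forces, dtype=float)
    n_axles = forces.shape[0]
    lsc = n_axles * forces / forces.sum(axis=0, keepdims=True)
    return np.sqrt(np.mean((lsc - 1.0) ** 2))

# Toy example: three axles carrying similar static loads but different dynamic content.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
forces = np.vstack([
    30e3 + 2.0e3 * np.sin(2 * np.pi * 1.5 * t) + 500 * rng.standard_normal(t.size),
    30e3 + 2.5e3 * np.sin(2 * np.pi * 1.5 * t + 0.4) + 500 * rng.standard_normal(t.size),
    30e3 + 1.5e3 * np.sin(2 * np.pi * 1.5 * t + 0.8) + 500 * rng.standard_normal(t.size),
])
print([round(dlc(f), 4) for f in forces])  # per-axle DLC
print(round(dlsc(forces), 4))              # group DLSC
```

Because DLC looks only at each axle's own force variation while DLSC compares axles against one another, one metric can improve while the other does not, which is consistent with the abstract's observation that the two are not always in accordance.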
Abstract:
The decision of Applegarth J in Heartwood Architectural & Joinery Pty Ltd v Redchip Lawyers [2009] QSC 195 (27 July 2009) involved a costs order against solicitors personally. This decision is but one of several recent decisions in which the court has been persuaded that the circumstances justified costs orders against legal practitioners on the indemnity basis. These decisions serve as a reminder to practitioners of their disclosure obligations when seeking any interlocutory relief in an ex parte application. These obligations are now clearly set out in r 14.4 of the Legal Profession (Solicitors) Rule 2007 and r 25 of the 2007 Barristers Rule. Inexperience or ignorance will not excuse breaches of the duties owed to the court.
Abstract:
This paper considers the problem of reconstructing the motion of a 3D articulated tree from 2D point correspondences subject to some temporal prior. Hitherto, smooth motion has been encouraged using a trajectory basis, yielding a hard combinatorial problem with time complexity growing exponentially in the number of frames. Branch and bound strategies have previously attempted to curb this complexity whilst maintaining global optimality. However, they provide no guarantee of being more efficient than exhaustive search. Inspired by recent work which reconstructs general trajectories using compact high-pass filters, we develop a dynamic programming approach which scales linearly in the number of frames, leveraging the intrinsically local nature of filter interactions. Extension to affine projection enables reconstruction without estimating cameras.
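The abstract only names the approach. A minimal sketch of a dynamic program that chooses one candidate 3D configuration per frame, with a first-difference smoothness term standing in for the compact, local filter interactions, might look as follows; the candidate sets, costs and penalty are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def dp_smooth_path(unary_cost, candidates, lam=1.0):
    """Pick one candidate per frame minimising a data cost plus a smoothness
    (first-difference) penalty -- a stand-in for the local filter interactions.
    Runs in O(T * K^2), i.e. linear in the number of frames T.

    unary_cost: (T, K) data term for each candidate in each frame.
    candidates: (T, K, D) candidate configurations per frame (flattened joints).
    """
    T, K = unary_cost.shape
    cost = unary_cost[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        # pairwise penalty between every candidate at t-1 and every candidate at t
        diff = candidates[t][None, :, :] - candidates[t - 1][:, None, :]
        pair = lam * np.sum(diff ** 2, axis=2)          # (K, K)
        total = cost[:, None] + pair                     # (K, K)
        back[t] = np.argmin(total, axis=0)               # best predecessor per candidate
        cost = total[back[t], np.arange(K)] + unary_cost[t]
    # backtrack the optimal sequence of candidate indices
    path = [int(np.argmin(cost))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Toy usage with random data: 50 frames, 8 candidates per frame, 12-D configurations.
rng = np.random.default_rng(0)
T, K, D = 50, 8, 12
cands = rng.standard_normal((T, K, D))
unary = rng.random((T, K))
print(dp_smooth_path(unary, cands, lam=0.5)[:10])
```

With K candidates per frame and T frames the table grows as O(T·K²), i.e. linearly in the number of frames, which is the scaling property highlighted in the abstract.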
Abstract:
The main aim of this paper is to describe an adaptive re-planning algorithm, based on an RRT and Game Theory, that produces an efficient, collision-free, obstacle-adaptive Mission Path Planner for Search and Rescue (SAR) missions. This will provide UAV autopilots and flight computers with the capability to autonomously avoid static obstacles and No Fly Zones (NFZs) through dynamic adaptive path replanning. The methods and algorithms produce optimal collision-free paths and can be integrated into a decision aid tool and UAV autopilots.
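As a rough illustration of the RRT ingredient only (the game-theoretic layer, NFZ geometry and optimality refinements are not detailed in the abstract and are omitted), a minimal 2D RRT over circular keep-out regions might look like the following; the sampling region, step size and obstacle model are illustrative assumptions.

```python
import math, random

def rrt_2d(start, goal, obstacles, step=0.5, max_iters=5000, goal_tol=0.5):
    """Minimal 2D RRT. obstacles: list of (cx, cy, radius) circles, a stand-in
    for static obstacles / no-fly zones. Returns a list of waypoints or None."""
    def collides(p):
        return any(math.hypot(p[0] - cx, p[1] - cy) <= r for cx, cy, r in obstacles)

    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        # bias 10% of samples towards the goal; otherwise sample a 20x20 workspace
        sample = goal if random.random() < 0.1 else (random.uniform(0, 20), random.uniform(0, 20))
        i_near = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], sample))
        near = nodes[i_near]
        d = math.dist(near, sample)
        if d == 0:
            continue
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if collides(new):
            continue
        parent[len(nodes)] = i_near
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            # walk back up the tree to recover the path
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]
    return None

print(rrt_2d((1.0, 1.0), (18.0, 18.0), [(10.0, 10.0, 3.0)]))
```

Re-running such a planner from the current vehicle position whenever a new obstacle or NFZ is detected is the simplest form of the adaptive replanning behaviour described above.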
Abstract:
Dynamic capabilities are widely considered to incorporate those processes that enable organizations to sustain superior performance over time. In this paper, we argue theoretically and demonstrate empirically that these effects are contingent on organizational structure and the competitive intensity in the market. Results from partial least squares structural equation modeling (PLS-SEM) analyses indicate that organic organizational structures facilitate the impact of dynamic capabilities on organizational performance. Furthermore, we find that the performance effects of dynamic capabilities are contingent on the competitive intensity faced by firms. Our findings demonstrate the performance effects of internal alignment between organizational structure and dynamic capabilities, as well as the external fit of dynamic capabilities with competitive intensity. We outline the advantages of PLS-SEM for modeling latent constructs, such as dynamic capabilities, and conclude with managerial implications.
Abstract:
With the growing importance of IS for organizations and the continuous stream of new IT developments, the IS function in organizations becomes more critical but is also more challenged. For example, how should the IS function deal with the consumerization of IT, and what is the added value of the IS function when it comes to SaaS and the Cloud? In this paper we argue that IS research is in need of a dynamic perspective on the IS function: the IS function has to become more focused on building and adapting IS capabilities in a changing environment. We argue that there has been an overreliance on the Resource-Based View for understanding the IS function and its capabilities, and we introduce Dynamic Capabilities Theory as an additional theoretical perspective that has so far received only limited attention in the IS literature. We present a first conceptualization of the dynamic IS function and discuss IS capabilities frameworks and individual IS capabilities from a dynamic perspective. These initial insights demonstrate the contribution of a dynamic perspective on the IS function itself.
Abstract:
Reasons for performing study: Many domestic horses and ponies are sedentary and obese due to confinement in small paddocks and stables and a diet of infrequent, high-energy rations. Severe health consequences can be associated with this altered lifestyle. Objectives: The aims of this study were to investigate the ability of horses to learn to use a dynamic feeder system and to determine their movement and behavioural responses to the novel system. Methods: A dynamic feed station was developed to encourage horses to exercise in order to access ad libitum hay. Five pairs of horses (n = 10) were studied using a randomised crossover design, with each pair studied in a control paddock containing a standard hay feeder and in an experimental paddock containing the novel hay feeder. Horse movement was monitored by a global positioning system (GPS), and the horses were observed to assess their ability to learn to use the system and their behavioural responses to its use. Results: With initial human intervention, all horses used the novel feeder within 1 h. Some aggressive behaviour was observed between horses not well matched in dominance behaviour. The median distance walked during a 4 h period was less (P = 0.002) in the control paddock (117 [57–185] m) than in the experimental paddock (630 [509–719] m). Conclusions: The use of an automated feeding system promotes increased activity levels in horses housed in small paddocks, compared with a stationary feeder.
Abstract:
Aims: This paper describes the development of a risk adjustment (RA) model predictive of individual lesion treatment failure in percutaneous coronary interventions (PCI), for use in a quality monitoring and improvement program. Methods and results: Prospectively collected data for 3972 consecutive revascularisation procedures (5601 lesions) performed between January 2003 and September 2011 were studied. Data on procedures to September 2009 (n = 3100) were used to identify factors predictive of lesion treatment failure. The factors identified included lesion risk class (p < 0.001), occlusion type (p < 0.001), patient age (p = 0.001), vessel system (p < 0.04), vessel diameter (p < 0.001), unstable angina (p = 0.003) and presence of major cardiac risk factors (p = 0.01). A Bayesian RA model was built using these factors, and its predictive performance was tested on the remaining procedures (area under the receiver operating characteristic curve: 0.765; Hosmer–Lemeshow p value: 0.11). Cumulative sum, exponentially weighted moving average and funnel plots were constructed using the RA model and subjectively evaluated. Conclusion: An RA model was developed and applied to statistical process control (SPC) monitoring for lesion failure in a PCI database. If linked to appropriate quality improvement governance response protocols, SPC using this RA tool might improve quality control and risk management by identifying variation in performance based on a comparison of observed and expected outcomes.
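The abstract mentions cumulative sum (CUSUM) charts driven by the RA model; one standard way to combine the two is a risk-adjusted CUSUM in which each case contributes a log-likelihood-ratio weight based on its predicted failure probability. The sketch below follows that general recipe; the odds multiplier, signalling threshold and restart rule are illustrative choices, not values from the paper.

```python
import math

def risk_adjusted_cusum(outcomes, predicted_probs, odds_ratio=2.0, threshold=4.5):
    """Risk-adjusted CUSUM detecting a shift from the predicted failure odds to
    odds_ratio times those odds.

    outcomes: 1 = lesion treatment failure, 0 = success.
    predicted_probs: failure probabilities from the risk-adjustment model.
    Returns the CUSUM trajectory and the indices at which it crosses the threshold.
    """
    s, path, signals = 0.0, [], []
    for i, (y, p) in enumerate(zip(outcomes, predicted_probs)):
        # log-likelihood ratio weight for this case under the odds-ratio alternative
        denom = 1.0 - p + odds_ratio * p
        w = math.log(odds_ratio / denom) if y == 1 else math.log(1.0 / denom)
        s = max(0.0, s + w)
        path.append(s)
        if s > threshold:
            signals.append(i)
            s = 0.0  # restart after a signal (one common convention)
    return path, signals

# Toy usage: 1 in 10 observed failures against predicted probabilities of 5%.
probs = [0.05] * 200
outcomes = [1 if i % 10 == 0 else 0 for i in range(200)]
traj, alarms = risk_adjusted_cusum(outcomes, probs)
print(alarms)
```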
Abstract:
We propose a multi-layer spectrum sensing optimisation algorithm to maximise sensing efficiency by computing the optimal sensing and transmission durations for a fast-changing, dynamic primary user (PU). Dynamic PU traffic is modelled as a random process in which the primary user changes state during both the sensing period and the transmission period, reflecting a more realistic scenario. Furthermore, we formulate joint constraints to correctly reflect interference to the primary user and the lost opportunity of the secondary user during the transmission period. Finally, we implement a novel duty cycle based detector, optimised with respect to PU traffic, to accurately detect primary user activity during the sensing period. Simulation results show that, unlike currently used detection models, the proposed algorithm can jointly optimise the sensing and transmission durations to simultaneously satisfy the optimisation constraints for the considered primary user traffic.
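The paper's detector and optimisation are not specified in the abstract; purely as an illustration of the duty-cycle idea, the toy detector below splits the sensing window into short slots, applies an energy test per slot, and declares the channel occupied when the estimated primary-user duty cycle exceeds a threshold. The slotting, thresholds and signal model are assumptions for illustration only.

```python
import numpy as np

def duty_cycle_detector(samples, slot_len, noise_power, snr_margin=3.0, dc_threshold=0.2):
    """Toy duty-cycle detector: estimate the fraction of sensing slots in which
    the primary user appears active and compare it to a threshold.

    samples: 1-D array of received samples over the sensing window.
    slot_len: number of samples per slot.
    """
    n_slots = len(samples) // slot_len
    slots = samples[: n_slots * slot_len].reshape(n_slots, slot_len)
    energy = np.mean(np.abs(slots) ** 2, axis=1)
    active = energy > snr_margin * noise_power       # per-slot energy test
    duty_cycle = active.mean()                       # estimated PU duty cycle
    return duty_cycle > dc_threshold, duty_cycle

# Example: the primary user is active only in the middle third of the sensing window.
rng = np.random.default_rng(1)
noise = rng.standard_normal(3000)
signal = np.zeros(3000)
signal[1000:2000] = 2.0 * rng.standard_normal(1000)
occupied, dc = duty_cycle_detector(noise + signal, slot_len=100, noise_power=1.0)
print(occupied, round(dc, 2))
```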
Abstract:
The overarching aim of this thesis was to investigate how processes of perception and action emerge under changing informational constraints during the performance of multi-articular interceptive actions. Interceptive actions provide unique opportunities to study processes of perception and action in dynamic performance environments. The movement model used to exemplify the functionally coupled relationship between perception and action, from an ecological dynamics perspective, was cricket batting. Ecological dynamics conceptualises the human body as a complex system composed of many interacting sub-systems and perceptual and motor system degrees of freedom, from which patterns of behaviour emerge under changing task constraints during performance. The series of studies reported in the chapters of this doctoral thesis contributed to the understanding of human behaviour by providing evidence of key properties of complex systems in human movement systems, including self-organisation under constraints and meta-stability. Specifically, the studies: (i) demonstrated how movement organisation (action) and visual strategies (perception) of dynamic human behaviour are constrained by changing ecological (especially informational) task constraints; (ii) provided evidence for the importance of representative design in experiments on perception and action; and (iii) provided a principled theoretical framework to guide learning design in the acquisition of skill in interceptive actions like cricket batting.
Abstract:
Anisotropic damage distribution and evolution have a profound effect on borehole stress concentrations. Damage evolution is an irreversible process that is not adequately described within classical equilibrium thermodynamics. Therefore, we propose a constitutive model, based on non-equilibrium thermodynamics, that accounts for anisotropic damage distribution, anisotropic damage threshold and anisotropic damage evolution. We implemented this constitutive model numerically, using the finite element method, to calculate stress–strain curves and borehole stresses. The resulting stress–strain curves are distinctively different from linear elastic-brittle and linear elastic-ideal plastic constitutive models and realistically model experimental responses of brittle rocks. We show that the onset of damage evolution leads to an inhomogeneous redistribution of material properties and stresses along the borehole wall. The classical linear elastic-brittle approach to borehole stability analysis systematically overestimates the stress concentrations on the borehole wall, because dissipative strain-softening is underestimated. The proposed damage mechanics approach explicitly models dissipative behaviour and leads to non-conservative mud window estimations. Furthermore, anisotropic rocks with preferential planes of failure, like shales, can be addressed with our model.
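As background on the effective-stress idea that underlies continuum damage mechanics (the authors' specific non-equilibrium, anisotropic formulation is not given in the abstract), a common starting point writes the stress carried by the intact material fraction as

\[
\tilde{\sigma}_{ij} = M_{ijkl}(D)\,\sigma_{kl},
\qquad \text{with the isotropic special case } \tilde{\sigma} = \frac{\sigma}{1-D},
\]

where the damage variable D (a scalar here, a tensor in the anisotropic case) grows from 0 (intact) towards 1 (fully damaged) once a damage threshold is exceeded, degrading the effective stiffness and producing the strain-softening response referred to above.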
Abstract:
In the modern built environment, building construction and demolition consume large amounts of energy and emit greenhouse gases, largely due to widely used conventional construction materials such as reinforced and composite concrete. These materials consume large amounts of natural resources and possess high embodied energy, and more energy is required to recycle or reuse them at the cessation of use. It is therefore very important to use recyclable or reusable new materials in building construction in order to conserve natural resources and reduce the energy and emissions associated with conventional materials. Advancements in materials technology have resulted in the introduction of new composite and hybrid materials in infrastructure construction as alternatives to conventional materials. This research project developed a lightweight, prefabricatable Hybrid Composite Floor Plate System (HCFPS) as an alternative to conventional floor systems, with desirable properties: easy to construct, economical, demountable, recyclable and reusable. The component materials of HCFPS include a central Polyurethane (PU) core, outer layers of Glass-fiber Reinforced Cement (GRC), and steel laminates in the tensile regions. This research explored the structural adequacy and performance characteristics of hybridised GRC, PU and steel laminate for the development of HCFPS. Performance characteristics of HCFPS were investigated using Finite Element (FE) simulations supported by experimental testing. Parametric studies were conducted to develop the HCFPS to satisfy static performance requirements, using sectional configurations, spans, loading and material properties as the parameters. The dynamic response of HCFPS floors was investigated through further parametric studies using material properties, walking frequency and damping as the parameters. The research findings show that HCFPS can be used in office and residential buildings and provides acceptable static and dynamic performance. Design guidelines were developed for this new floor system. HCFPS is easy to construct and economical compared with conventional floor systems because it is lightweight and prefabricatable, and it can be demounted and reused or recycled at the cessation of use owing to its component materials.
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Missed detections of incorrect integers lead to hazardous results and should be strictly controlled; in ambiguity resolution, the missed-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, the criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking up the table. The method has previously been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the ratio test threshold in the fixed failure rate approach are discussed based on extensive data simulations. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
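For reference, the ratio test discussed throughout compares the two best integer candidates. With float ambiguity vector $\hat{a}$, its covariance matrix $Q_{\hat{a}}$, and best and second-best integer candidates $\check{a}_1$ and $\check{a}_2$, the fixed solution is accepted when

\[
\frac{(\hat{a}-\check{a}_2)^{\mathsf{T}} Q_{\hat{a}}^{-1} (\hat{a}-\check{a}_2)}
     {(\hat{a}-\check{a}_1)^{\mathsf{T}} Q_{\hat{a}}^{-1} (\hat{a}-\check{a}_1)} \;\geq\; c .
\]

In the fixed failure rate approach described here, the threshold c is not an empirical constant but is read from the pre-computed criteria table, so that the probability of accepting an incorrect integer vector stays at or below the chosen failure rate; typical target rates quoted in the integer aperture literature are of the order of 0.1% to 1%.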