921 results for Standard method


Relevance:

20.00%

Publisher:

Abstract:

The purpose of this paper is to provide some insights about P2M and, more specifically, to develop some thoughts about Project Management seen as a Mirror, a place for reflection between the Mission of an organisation and its actual creation of Values (with an s: a source of value for people, organisations and society). This place is the realm of complexity, of interactions between multiple variables, each of them having a specific time horizon, occupying a specific place and playing a specific role. Before developing this paper, I would like to borrow from my colleague and friend, Professor Ohara, the following excerpts from a paper to be presented at the IPMA World Congress in New Delhi in November 2005. “P2M is the Japanese version of project & program management, which is the first standard guide for education and certification developed in 2001. A specific finding of P2M is characterized by “mission driven management of projects” or a program which harness complexity of problem solving observed in the interface between technical system and business model.” (Ohara, 2005, IPMA Conference, New Delhi) “The term of “mission” is a key word in the field of corporate strategy, where it expresses raison d’être or “value of business”. It is more specifically used for expressing “the client needs” in terms of a strategic business unit. The concept of mission is deemed to be a useful tool to share essential content of value and needs in message for complex project.” (Ohara, 2005, IPMA Conference, New Delhi) “Mission is considered as a significant “metamodel representation” by several reasons. First, it represents multiple values for aspiration. The central objective of mission initiative is profiling of ideality in the future from reality, which all stakeholders are glad to accept and share. Second, it shall be within a stretch of efforts, and not beyond or outside of the realization. Though it looks like unique, it has to depict a solid foundation. The pragmatic sense of equilibrium between innovation and adaptation is required for the mission. Third, it shall imply a rough sketch for solution to critical issues for problems in reality.” (Ohara, 2005, IPMA Conference, New Delhi) The “project modeling” idea has been introduced in P2M program management: a package of three project models, “scheme”, “system” and “service”, is given as a reference-type program (Ohara, 2005, IPMA Conference, New Delhi). If these quotes apply to P2M, they are fully congruent with the results of the research undertaken, and the resulting meta-model and meta-method developed, by CIMAP, the ESC Lille Research Centre in Project & Program Management, since the 1980s. The paper starts by questioning the common Project Management (PM) paradigm. Then, discussing the concept of Project, it argues that an alternative epistemological position should be taken to capture the very nature of the PM field. On this basis, a development of “the need of modelling to understand” is proposed, grounded in two theoretical roots. This leads to the conclusion that, in order to enable this modelling, a standard approach is necessary, but one that should be understood from the perspective of the Theory of Convention in order to facilitate situational and contextual application.

Relevance:

20.00%

Publisher:

Abstract:

The paper introduces the underlying principles and general features of a meta-method (the MAP method: Management & Analysis of Projects) developed as part of, and used in, various research, education and professional development programmes at ESC Lille. This method aims to provide an effective and efficient structure and process for acting and learning in various complex, uncertain and ambiguous managerial situations (projects, programmes, portfolios). The paper is organized in three parts. In the first part, I propose to revisit the dominant vision of the project management knowledge field, based on the assumptions that it does not adequately address current business and management contexts and situations, and that competencies in the management of entrepreneurial activities are the source of value creation for organisations. Then, grounded in the suggested new perspective, the second part presents the underlying concepts supporting the MAP method, seen as a ‘convention generator’, and how this meta-method inextricably links learning and practice in addressing managerial situations. The third part describes an example of application, illustrating with a brief case study how the method integrates Project Management Governance, and gives a few examples of use in Management Education and Professional Development.

Relevance:

20.00%

Publisher:

Abstract:

The paper investigates a detailed Active Shock Control Bump (SCB) design optimisation on a Natural Laminar Flow (NLF) aerofoil (RAE 5243) to reduce cruise drag at transonic flow conditions, using Evolutionary Algorithms (EAs) coupled to a robust design approach. The position of boundary layer transition (xtr) and the coefficient of lift (Cl) are treated as uncertain design parameters (250 stochastic samples in total). Two robust design methods are considered: the first uses a standard robust design method, which evaluates each design model at 250 stochastic conditions for uncertainty; the second combines the standard robust design method with the concept of hierarchical (multi-population) sampling (250, 50 and 15 samples) for uncertainty. Numerical results show that the evolutionary optimisation method coupled to uncertainty design techniques produces useful and reliable Pareto optimal SCB shapes that have low sensitivity and high aerodynamic performance, with significant total drag reduction. In addition, the results show the benefit of using the hierarchical robust method for detailed uncertainty design optimisation.
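The hierarchical sampling idea can be sketched in outline: cheap, low-sample robust evaluations screen the whole population, and only promising designs are re-evaluated with progressively larger sample sets. The following Python fragment is a minimal sketch under stated assumptions: the surrogate drag function, design variables and promotion rule are invented for illustration, standing in for the paper's CFD evaluation and full EA.

```python
# Minimal sketch of hierarchical robust evaluation; drag() is a hypothetical
# surrogate, not the paper's CFD model.
import random

random.seed(1)

def drag(design, xtr, cl):
    """Hypothetical stand-in for the CFD drag evaluation of one SCB design."""
    bump_height, bump_pos = design
    return 0.02 + (bump_height - 0.4) ** 2 + 0.5 * (bump_pos - xtr) ** 2 + 0.001 * cl

def robust_fitness(design, n_samples):
    """Standard robust evaluation: mean drag over stochastic (xtr, Cl) samples."""
    values = [drag(design, random.uniform(0.4, 0.6), random.uniform(0.5, 0.8))
              for _ in range(n_samples)]
    return sum(values) / n_samples

# Hierarchical (multi-population) sampling: screen with 15 samples, re-evaluate
# survivors with 50, and rank the finalists with the full 250.
population = [(random.uniform(0.2, 0.6), random.uniform(0.3, 0.7)) for _ in range(40)]
for n_samples in (15, 50, 250):
    population.sort(key=lambda d: robust_fitness(d, n_samples))
    population = population[:max(4, len(population) // 3)]  # promote the best third

print("best design after hierarchical screening:", population[0])
```

In a full EA this screening would sit inside the generational loop, so that most of the expensive 250-sample evaluations are spent only on candidates that already look robust at the cheaper levels.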

Relevance:

20.00%

Publisher:

Abstract:

Currently in the Australian higher education sector, the productivity benefits of occupational therapy clinical education placements are a contested issue. This article reports the results of a study that developed a methodology for documenting time use during placements and investigated the productivity changes associated with occupational therapy clinical education placements in Queensland, Australia. Supervisors’ and students’ time use during placements, and how this changed for supervisors compared with pre- and post-placement, are also presented. Methods: Using a cohort survey design, participants were students from two Queensland universities and their supervisors employed by Queensland Health. Time use was recorded in 30-minute blocks according to particular categories. Results: There was a significant change in supervisors’ time spent in patient care activities (F = 94.011; df = 2, 12.37; P < 0.001), with an increase between pre-placement and placement (P < 0.001) and a decrease between placement and post-placement (P < 0.001). The change in supervisors’ time spent in all non-patient care activities was also significant (F = 4.580; df = 2, 16; P = 0.027), increasing between pre-placement and placement (P = 0.028). There was a significant decrease in supervisors’ time spent in placement activities (F = 5.133; df = 2, 19.18; P = 0.016) from placement to post-placement. Students spent more time than supervisors in patient care activities while on placement. Discussion: A novel method for reporting productivity and time-use changes during clinical education programs for occupational therapy has been applied. Supervisors spent considerable time assessing and managing students, and their clinical education role should be seen as core business in standard occupational therapy practice. This paper will contribute to future assessments of the economic…

Relevance:

20.00%

Publisher:

Abstract:

Increasing global competitiveness has forced manufacturing organizations to produce high-quality products more quickly and at competitive cost, which demands continuous improvement techniques. In this paper, we propose a fuzzy-based performance evaluation method for the lean supply chain. To understand the overall performance of a cost-competitive supply chain, we investigate the alignment of market strategy and the positioning of the supply chain. Different competitive strategies can be accommodated by using different weight calculations for different supply chain situations. By identifying optimal performance metrics and applying performance evaluation methods, managers can predict the overall supply chain performance under a lean strategy.
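To make the weighting idea concrete, the sketch below aggregates linguistic metric ratings, expressed as triangular fuzzy numbers, under two different strategy weightings. The metrics, ratings and weights are invented placeholders; the paper's actual metric set and weight-calculation scheme are not reproduced here.

```python
# Sketch of a fuzzy weighted performance score with strategy-dependent weights.

def tfn_scale(tfn, w):
    """Scale a triangular fuzzy number (l, m, u) by a crisp weight."""
    l, m, u = tfn
    return (l * w, m * w, u * w)

def tfn_add(a, b):
    """Add two triangular fuzzy numbers component-wise."""
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    return sum(tfn) / 3.0

# Linguistic ratings of three lean metrics as triangular fuzzy numbers (invented).
ratings = {"cost": (0.6, 0.7, 0.9), "delivery": (0.5, 0.6, 0.8), "quality": (0.7, 0.8, 1.0)}

# Different weightings represent different supply chain situations (assumed values).
weights = {"cost leader": {"cost": 0.6, "delivery": 0.2, "quality": 0.2},
           "responsive": {"cost": 0.2, "delivery": 0.6, "quality": 0.2}}

for strategy, w in weights.items():
    total = (0.0, 0.0, 0.0)
    for metric, tfn in ratings.items():
        total = tfn_add(total, tfn_scale(tfn, w[metric]))
    print(f"{strategy}: overall performance = {defuzzify(total):.3f}")
```

The same ratings thus yield different overall scores depending on the strategy weighting, which is how such a method aligns the evaluation with the chain's market position.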

Relevance:

20.00%

Publisher:

Abstract:

The design of pre-contoured fracture fixation implants (plates and nails) that correctly fit the anatomy of a patient utilises 3D models of long bones with accurate geometric representation. 3D data are usually available from computed tomography (CT) scans of human cadavers, which generally represent the over-60 age group. Thus, despite the fact that half of the seriously injured population is aged 30 years or below, virtually no data exist from these younger age groups to inform the design of implants that optimally fit patients from these groups. Hence, relevant bone data from these age groups are required. The current gold standard for acquiring such data, CT, involves ionising radiation and cannot be used to scan healthy human volunteers. Magnetic resonance imaging (MRI) has been shown to be a potential alternative in previous studies conducted using small bones (tarsal bones) and parts of long bones. However, in order to use MRI effectively for 3D reconstruction of human long bones, further validation using long bones and appropriate reference standards is required. Accurate reconstruction of 3D models from CT or MRI data sets requires an accurate image segmentation method. Currently available sophisticated segmentation methods involve complex programming and mathematics that researchers are not trained to perform. Therefore, an accurate but relatively simple method is required for segmentation of CT and MRI data. Furthermore, some of the limitations of 1.5T MRI, such as very long scanning times and poor contrast in articular regions, can potentially be reduced by using higher-field 3T MRI. However, a quantification of the signal-to-noise ratio (SNR) gain at the bone-soft tissue interface should be performed; this is not reported in the literature. As MRI scanning of long bones involves very long scanning times, the acquired images are prone to motion artefacts caused by random movements of the subject's limbs. One of the artefacts observed is the step artefact, believed to result from random movements of the volunteer during a scan; this needs to be corrected before the models can be used for implant design. As its first aim, this study investigated two accurate but simple segmentation methods, intensity thresholding and Canny edge detection, for the segmentation of MRI and CT data. The second aim was to investigate the usability of MRI as a radiation-free imaging alternative to CT for the reconstruction of 3D models of long bones. The third aim was to use 3T MRI to address the poor contrast in articular regions and the long scanning times of current MRI. The fourth and final aim was to minimise the step artefact using 3D modelling techniques. The segmentation methods were investigated using CT scans of five ovine femora. Single-level thresholding was performed using a visually selected threshold level to segment the complete femur. For multilevel thresholding, multiple threshold levels calculated from the threshold selection method were used for the proximal, diaphyseal and distal regions of the femur. Canny edge detection was applied by delineating the outer and inner contours of the 2D images and then combining them to generate the 3D model. Models generated from these methods were compared to a reference standard generated using mechanical contact scans of the denuded bone. The second aim was achieved using CT and MRI scans of five ovine femora, segmented using the multilevel threshold method.
A surface geometric comparison was conducted between the CT-based, MRI-based and reference models. To quantitatively compare 1.5T and 3T MRI images, the right lower limbs of five healthy volunteers were scanned using scanners from the same manufacturer, and the images obtained using identical protocols were compared by means of the SNR and the contrast-to-noise ratio (CNR) of muscle, bone marrow and bone. In order to correct the step artefact in the final 3D models, the step was simulated in five ovine femora scanned with a 3T MRI scanner and corrected using an aligning method based on the iterative closest point (ICP) algorithm. The present study demonstrated that the multilevel threshold approach, in combination with the threshold selection method, can generate 3D models of long bones with an average deviation of 0.18 mm; the corresponding figure for the single-threshold method was 0.24 mm, a statistically significant difference in accuracy between the two methods. In comparison, the Canny edge detection method generated an average deviation of 0.20 mm. MRI-based models exhibited an average deviation of 0.23 mm, compared with 0.18 mm for CT-based models; this difference was not statistically significant. 3T MRI improved the contrast at the bone-muscle interfaces of most anatomical regions of the femora and tibiae, potentially reducing the inaccuracies conferred by the poor contrast of the articular regions. Using the robust ICP algorithm to align the 3D surfaces, the step artefact caused by the volunteer moving the leg was corrected, with errors of 0.32 ± 0.02 mm when compared with the reference standard. The study concludes that magnetic resonance imaging, together with simple multilevel thresholding segmentation, is able to produce 3D models of long bones with accurate geometric representation. The method is, therefore, a potential alternative to the current gold standard, CT imaging.
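The two segmentation approaches compared in the study can be illustrated on a synthetic 2D slice. The sketch below, assuming scikit-image is available, applies a single automatically selected (Otsu) threshold and Canny edge detection to invented data; the thesis's own threshold selection method and region-wise multilevel thresholds are not reproduced.

```python
# Illustrative comparison of intensity thresholding and Canny edge detection
# on a synthetic slice standing in for CT/MRI data.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.feature import canny

# Synthetic slice: a bright elliptical "bone" cross-section on a noisy background.
yy, xx = np.mgrid[0:128, 0:128]
slice_img = np.exp(-(((xx - 64) / 30.0) ** 2 + ((yy - 64) / 18.0) ** 2))
slice_img += 0.05 * np.random.default_rng(0).standard_normal(slice_img.shape)

# Intensity thresholding: one selected level; the thesis used separate levels
# for the proximal, diaphyseal and distal regions of the femur.
mask = slice_img > threshold_otsu(slice_img)

# Canny edge detection: delineate outer and inner contours per 2D image;
# stacking the per-slice contours yields the 3D model.
edges = canny(slice_img, sigma=2.0)

print("segmented pixels:", int(mask.sum()), "| edge pixels:", int(edges.sum()))
```

Either output (binary masks or contours) can then be stacked across slices and meshed to obtain the 3D surface that is compared against the reference model.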

Relevance:

20.00%

Publisher:

Abstract:

In recent times, light gauge steel framed (LSF) structures, such as cold-formed steel wall systems, are increasingly used, but without a full understanding of their fire performance. Traditionally, the fire resistance rating of these load-bearing LSF wall systems is based on approximate prescriptive methods developed from limited fire tests, and these are very often limited to the standard wall configurations used by industry. Increased fire rating is provided simply by adding more plasterboards to these walls. This is not an acceptable situation, as it not only inhibits innovation and structural and cost efficiencies but also casts doubt over the fire safety of these wall systems. Hence a detailed fire research study into the performance of LSF wall systems was undertaken using full-scale fire tests and extensive numerical studies. A new composite wall panel developed at QUT was also considered, in which the insulation was placed externally between the plasterboards on both sides of the steel wall frame instead of in the cavity. Three full-scale fire tests of LSF wall systems built using the new composite panel system were undertaken at a higher load ratio using a gas furnace designed to deliver heat in accordance with the standard time-temperature curve in AS 1530.4 (SA, 2005). The fire tests included measurement of the load-deformation characteristics of the LSF walls until failure, as well as the associated time-temperature measurements across the thickness and along the length of all the specimens. Tests of LSF walls under axial compression load showed an improvement in their fire performance and fire resistance rating when the new composite panel was used; hence this research recommends the use of the new composite panel system for cold-formed LSF walls. The numerical study was undertaken using the finite element program ABAQUS. The finite element analyses were conducted under both steady state and transient state conditions using the measured hot and cold flange temperature distributions from the fire tests. The elevated temperature reduction factors for mechanical properties were based on the equations proposed by Dolamune Kankanamge and Mahendran (2011). These finite element models were first validated by comparing their results with experimental test results from this study and Kolarkar (2010); the developed models were able to predict the failure times within 5 minutes. The validated model was then used in a detailed numerical study into the strength of cold-formed thin-walled steel channels used in both the conventional and the new composite panel systems, to increase the understanding of their behaviour under non-uniform elevated temperature conditions and to develop fire design rules. The measured time-temperature distributions obtained from the fire tests were used. Since the fire tests showed that the plasterboards provided sufficient lateral restraint until the failure of the LSF wall panels, this assumption was also used in the analyses and was further validated by comparison with experimental results. Hence, in this study of LSF wall studs, only flexural buckling about the major axis and local buckling were considered. A new fire design method was proposed using AS/NZS 4600 (SA, 2005), NAS (AISI, 2007) and Eurocode 3 Part 1.3 (ECS, 2006). The importance of considering thermal bowing, magnified thermal bowing and neutral axis shift in the fire design was also investigated.
A spreadsheet-based design tool was developed based on the above design codes to predict the failure load ratio versus time and temperature for varying LSF wall configurations, including insulation. Idealised time-temperature profiles were developed based on the measured temperature values of the studs and used in a detailed numerical study to fully understand the structural behaviour of LSF wall panels. Appropriate equations were proposed to find the critical temperatures of different composite panels, varying in steel thickness, steel grade and screw spacing, for any load ratio. Hence useful and simple design rules were proposed based on the current cold-formed steel structures and fire design standards, and their accuracy and advantages were discussed. The results were also used to validate the fire design rules developed based on AS/NZS 4600 (SA, 2005) and Eurocode 3 Part 1.3 (ECS, 2006), demonstrating significant improvements over the currently used prescriptive design methods for LSF wall systems under fire conditions. In summary, this research has developed comprehensive experimental and numerical thermal and structural performance data for both the conventional and the proposed new load-bearing LSF wall systems under standard fire conditions. Finite element models were developed to predict the failure times of LSF walls accurately. Idealised hot flange temperature profiles were developed for non-insulated, cavity-insulated and externally insulated load-bearing wall systems. Suitable fire design rules and spreadsheet-based design tools were developed based on the existing standards to predict the ultimate failure load, failure times and failure temperatures of LSF wall studs. Simplified equations were proposed to find the critical temperatures for varying wall panel configurations and load ratios. The results from this research are useful to both structural and fire engineers and researchers. Most importantly, this research has significantly improved the knowledge and understanding of cold-formed LSF load-bearing walls under standard fire conditions.
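The kind of calculation such a design tool performs can be sketched generically: interpolate a mechanical-property reduction factor versus temperature curve and find the critical temperature at which the member's remaining capacity falls to the applied load ratio. In the sketch below the reduction-factor table is an invented placeholder, not the Dolamune Kankanamge and Mahendran (2011) equations or the thesis's proposed design rules.

```python
# Generic critical-temperature lookup: find the hot flange temperature at
# which the yield strength reduction factor equals the applied load ratio.
# The tabulated values are illustrative placeholders only.

temps = [20, 100, 200, 300, 400, 500, 600, 700]            # temperature, deg C
k_y = [1.00, 0.97, 0.90, 0.78, 0.62, 0.44, 0.26, 0.12]     # fy reduction factor

def critical_temperature(load_ratio):
    """Linearly interpolate the temperature where k_y drops to the load ratio."""
    for (t1, k1), (t2, k2) in zip(zip(temps, k_y), zip(temps[1:], k_y[1:])):
        if k2 <= load_ratio <= k1:
            return t1 + (k1 - load_ratio) * (t2 - t1) / (k1 - k2)
    raise ValueError("load ratio outside the tabulated range")

for lr in (0.2, 0.4, 0.6):
    print(f"load ratio {lr}: critical temperature = {critical_temperature(lr):.0f} C")
```

A real design tool would additionally account for the non-uniform temperature across the stud (hot and cold flanges), thermal bowing and neutral axis shift, as described above; this sketch shows only the table-interpolation core.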

Relevance:

20.00%

Publisher:

Abstract:

Monodisperse silica nanoparticles were synthesised by the well-known Stober protocol, dispersed in acetonitrile (ACN) and subsequently added to a bisacetonitrile gold(I) coordination complex ([Au(MeCN)2]+) in ACN. The silica hydroxyl groups were deprotonated in the presence of ACN, generating a formal negative charge on the siloxy groups. This allowed the [Au(MeCN)2]+ complex to undergo ligand exchange with the silica nanoparticles and form a surface coordination complex, with reduction to metallic gold (Au0) proceeding by an inner sphere mechanism. The residual [Au(MeCN)2]+ complex was allowed to react with water, disproportionating into Au0 and Au(III), with the Au0 adding to the reduced gold already bound on the silica surface. The so-formed metallic gold seed surface was found to be suitable for the conventional reduction of Au(III) to Au0 by ascorbic acid (ASC). This process generated a thin and uniform gold coating on the silica nanoparticles. The synthesised silica NP batches ranged in size from 45 to 460 nm; of these, batches in the 400 to 480 nm size range were used for the gold-coating experiments.

Relevance:

20.00%

Publisher:

Abstract:

The decision of Dalton J in Lai v Soineva [2011] QSC 247 has resulted in a change in the latest versions of the Real Estate Institute of Queensland (REIQ) contracts.

Relevance:

20.00%

Publisher:

Abstract:

The ninth release of the Toolbox represents over fifteen years of development and a substantial level of maturity. This version captures a large number of changes and extensions generated over the last two years, which support my new book “Robotics, Vision & Control”. The Toolbox has always provided many functions that are useful for the study and simulation of classical arm-type robotics, such as kinematics, dynamics and trajectory generation. The Toolbox is based on a very general method of representing the kinematics and dynamics of serial-link manipulators. These parameters are encapsulated in MATLAB® objects: robot objects can be created by the user for any serial-link manipulator, and a number of examples are provided for well-known robots such as the Puma 560 and the Stanford arm, amongst others. The Toolbox also provides functions for manipulating and converting between datatypes such as vectors, homogeneous transformations and unit quaternions, which are necessary to represent 3-dimensional position and orientation. This ninth release has been significantly extended to support mobile robots. For ground robots the Toolbox includes standard path planning algorithms (bug, distance transform, D*, PRM), kinodynamic planning (RRT), localization (EKF, particle filter), map building (EKF) and simultaneous localization and mapping (EKF), as well as a Simulink model of a non-holonomic vehicle. The Toolbox also includes a detailed Simulink model of a quadcopter flying robot.
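The serial-link representation the Toolbox encapsulates can be illustrated outside MATLAB. The Python sketch below computes forward kinematics from standard Denavit-Hartenberg parameters via chained homogeneous transforms; the two-link parameters are invented for illustration (they are not the Puma 560's), and this is not the Toolbox's own API.

```python
# Forward kinematics of a serial-link manipulator from standard DH parameters.
import numpy as np

def dh(theta, d, a, alpha):
    """Homogeneous transform for one link (standard DH convention)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def fkine(q, links):
    """Chain the link transforms for joint angles q; links hold (d, a, alpha)."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(q, links):
        T = T @ dh(theta, d, a, alpha)
    return T

# Hypothetical planar two-link arm.
links = [(0.0, 1.0, 0.0), (0.0, 0.8, 0.0)]
T = fkine([np.pi / 4, -np.pi / 6], links)
print("end-effector position:", np.round(T[:3, 3], 3))
```

In the Toolbox itself the same information lives in robot objects built from link parameters, so a single kinematics routine can serve any serial-link manipulator.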

Relevance:

20.00%

Publisher:

Abstract:

Despite an increased focus on proactive policing in recent years, criminal investigation is still perhaps the most important task of any law enforcement agency. As a result, the skills required to carry out a successful investigation, or to be an ‘effective detective’, have been the subject of much attention and debate (Smith and Flanagan, 2000; Dean, 2000; Fahsing and Gottschalk, 2008:652). Stelfox (2008:303) states that “The service’s capacity to carry out investigations comprises almost entirely the expertise of investigators.” In this respect, Dean (2000) highlighted the need to profile criminal investigators in order to promote further understanding of the cognitive approaches they take to the process of criminal investigation. As a result of his research, Dean (2000) produced a theoretical framework of criminal investigation comprising four disparate cognitive or ‘thinking styles’: ‘Method’, ‘Challenge’, ‘Skill’ and ‘Risk’. While the Method and Challenge styles deal with adherence to Standard Operating Procedures (SOPs) and the internal ‘drive’ that keeps an investigator going, the Skill and Risk styles both tap into the concept of creativity in policing. It is these latter two styles that provide the focus for this paper. The paper presents a brief discussion of Dean’s (2000) Skill and Risk styles before giving an overview of the broader literature on creativity in policing. The potential benefits of a creative approach, as well as some hurdles that need to be overcome when proposing the integration of creativity within the policing sector, are then discussed. Finally, the paper concludes by proposing further research into Dean’s (2000) Skill and Risk styles, and by stressing the need for significant changes to the structure and approach of the traditional policing organisation before creativity in policing is given the status it deserves.

Relevance:

20.00%

Publisher:

Abstract:

There have been numerous calls over the years for the development of an accounting standard for not-for-profit entities (NFPEs). Probably the most commonly quoted in this regard is that from the Industry Commission Report No. 45 of 1995, which contained the following recommendation: The Commonwealth government should provide funds to the Australian Accounting Standards Board and the Public Sector Accounting Standards Board to develop within two years suitable accounting standards for Community Social Welfare Organisations. This recommendation was made over five years ago. Why has no action been taken towards its implementation?...

Relevance:

20.00%

Publisher:

Abstract:

We examine methodologies and methods that apply to multi-level research in the learning sciences. In so doing, we describe how multiple theoretical frameworks inform the use of different methods that apply to social levels involving space-time relationships that are not consciously accessible as social life is enacted. Most of the methods involve analyses of video and audio files. Within a framework of interpretive research, we present a methodology of event-oriented social science, which employs video ethnography, narrative, conversation analysis, prosody analysis and facial expression analysis. We illustrate multi-method research with an examination of the role of emotions in teaching and learning, in which conversation and prosody analyses augment facial expression analysis and ethnography. We conclude with an exploration of ways in which multi-level studies can be complemented with neural-level analyses.