268 results for Automate
Abstract:
The purpose of this work is to validate and automate the use of DYNJAWS, a new component module (CM) in the BEAMnrc Monte Carlo (MC) user code. The DYNJAWS CM simulates dynamic wedges and can be used in three modes: dynamic, step-and-shoot and static. The step-and-shoot and dynamic modes require an additional input file defining the positions of the jaw that constitutes the dynamic wedge at regular intervals during its motion. A method for automating the generation of this input file is presented, allowing more efficient use of the DYNJAWS CM. Wedged profiles have been measured and simulated for 6 and 10 MV photons at three field sizes (5 cm x 5 cm, 10 cm x 10 cm and 20 cm x 20 cm) and four wedge angles (15, 30, 45 and 60 degrees), at dmax and at 10 cm depth. Results of this study show agreement between the measured and the MC profiles to within 3% of absolute dose or 3 mm distance to agreement for all wedge angles at both energies and depths. The gamma analysis suggests that the dynamic mode is more accurate than the step-and-shoot mode. The DYNJAWS CM is an important addition to the BEAMnrc code and will enable MC verification of patient treatments involving dynamic wedges.
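As a rough illustration of the kind of automation described, the sketch below generates jaw positions at regular monitor-unit intervals by linear interpolation and writes them to a text file. The output layout and field names are simplified stand-ins, not the actual DYNJAWS input format, which is defined in the BEAMnrc manual.

```python
# Illustrative sketch only: generates jaw positions at regular intervals
# for a dynamic-wedge simulation. The output layout below is a simplified
# stand-in, NOT the actual DYNJAWS file format (see the BEAMnrc manual).
import numpy as np

def jaw_positions(y_start, y_end, n_steps):
    """Linearly interpolate the moving jaw edge over the wedge delivery.

    Each entry pairs a cumulative monitor-unit fraction with a jaw
    position, sampled at regular intervals of the delivery.
    """
    mu_fractions = np.linspace(0.0, 1.0, n_steps)
    positions = y_start + mu_fractions * (y_end - y_start)
    return list(zip(mu_fractions, positions))

def write_input_file(path, y_fixed, y_start, y_end, n_steps=50):
    with open(path, "w") as f:
        f.write(f"{n_steps}\n")  # number of sub-fields
        for mu, y_moving in jaw_positions(y_start, y_end, n_steps):
            # one line per sub-field: MU fraction, fixed jaw, moving jaw
            f.write(f"{mu:.4f} {y_fixed:.3f} {y_moving:.3f}\n")

# e.g. a 10 cm x 10 cm field whose moving jaw sweeps across the field
write_input_file("dynjaws_wedge.txt", y_fixed=-5.0, y_start=5.0, y_end=-4.5)
```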
Abstract:
The literature abounds with descriptions of failures in high-profile projects, and a range of initiatives has been generated to enhance project management practice (e.g., Morris, 2006). Estimating from our own research, there are scores of other project failures that go unrecorded. Many of these failures can be explained using existing project management theory: poor risk management, inaccurate estimating, cultures of optimism dominating decision making, stakeholder mismanagement, inadequate timeframes, and so on. Nevertheless, in spite of extensive discussion and analysis of failures and attention to the presumed causes of failure, projects continue to fail in unexpected ways. In the 1990s, three U.S. state departments of motor vehicles (DMV) cancelled major projects due to time and cost overruns and inability to meet project goals (IT-Cortex, 2010). The California DMV failed to revitalize their drivers' license and registration application process after spending $45 million. The Oregon DMV cancelled their five-year, $50 million project to automate their manual, paper-based operation after three years, when the estimates grew to $123 million, the duration stretched to eight years or more, and the prototype was a complete failure. In 1997, the Washington state DMV cancelled their license application mitigation project because it would have been too big, and obsolete by the time it was estimated to be finished. There are countless similar examples of projects that have been abandoned or that have failed to deliver the requirements.
Abstract:
Virtual methods to assess the fit of a fracture fixation plate have recently been proposed; however, they suffer limitations such as simplified fit criteria or manual data processing. This study aims to automate a fit analysis procedure using clinically based criteria, and then to analyse the results further for borderline fit cases. Three-dimensional (3D) models of 45 bones and of a precontoured distal tibial plate were used to assess the fit of the plate automatically. A Matlab program was developed to automatically measure the shortest distance between the bone and the plate at three regions of interest, as well as a plate-bone angle. The measured values, including the fit assessment results, were recorded in a spreadsheet as part of the batch-process routine. An automated fit analysis procedure will enable the processing of larger bone datasets in a significantly shorter time, providing data that are more representative of the target population for plate shape design and validation. As a result, better-fitting plates can be manufactured and made available to surgeons, thereby reducing the risk and cost associated with complications or corrective procedures. This, in turn, is expected to translate into improved patients' quality of life.
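The study's measurements were made with a Matlab program; the Python sketch below shows the same kind of batch check under stated assumptions: point-cloud inputs, hypothetical region indices, and an illustrative 2 mm gap threshold (none of these are the study's actual values).

```python
# Illustrative Python analogue of the batch fit check described above
# (the study itself used Matlab). Point clouds, region indices and the
# 2 mm fit threshold are assumptions made for this sketch.
import numpy as np
from scipy.spatial import cKDTree

def region_fit(bone_points, plate_points, region_indices, tol_mm=2.0):
    """Shortest bone-plate distance within one region of interest."""
    tree = cKDTree(bone_points)
    dists, _ = tree.query(plate_points[region_indices])
    gap = dists.min()
    return gap, gap <= tol_mm

def assess_plate(bone_points, plate_points, regions):
    """Batch-style assessment: returns a row suitable for a spreadsheet."""
    row = {}
    for name, idx in regions.items():
        gap, ok = region_fit(bone_points, plate_points, idx)
        row[f"{name}_gap_mm"] = round(gap, 2)
        row[f"{name}_fit"] = "fit" if ok else "no fit"
    return row
```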
Abstract:
Emerging from the challenge to reduce energy consumption in buildings is the need for energy simulation to be used more effectively to support integrated decision making in early design. As a critical response to a Green Star case study, we present DEEPA, a parametric modeling framework that enables architects and engineers to work at the same semantic level to generate shared models for energy simulation. A cloud-based toolkit provides web and data services for parametric design software that automate the process of simulating and tracking design alternatives, by linking building geometry more directly to analysis inputs. Data, semantics, models and simulation results can be shared on the fly. This allows the complex relationships between architecture, building services and energy consumption to be explored in an integrated manner, and decisions to be made collaboratively.
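A hypothetical sketch of how a parametric design tool might hand a design alternative to a cloud simulation service of the kind described. The endpoint, payload fields and polling loop are invented for illustration; DEEPA's actual web and data services are not specified in the abstract.

```python
# Hypothetical client for a cloud simulation service; URL, payload
# fields and job states are all assumptions of this sketch.
import time
import requests

SERVICE = "https://example.org/energy-service"  # placeholder URL

def simulate_alternative(geometry, params):
    """Submit one design alternative and block until results arrive."""
    job = requests.post(f"{SERVICE}/simulations",
                        json={"geometry": geometry, "inputs": params}).json()
    while True:
        status = requests.get(f"{SERVICE}/simulations/{job['id']}").json()
        if status["state"] == "done":
            return status["results"]  # e.g. annual energy use per zone
        time.sleep(5)

# Tracking alternatives then reduces to tagging each run, e.g.:
# results = simulate_alternative(model_json, {"glazing_ratio": 0.4})
```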
Abstract:
Many modern business environments employ software to automate the delivery of workflows; however, workflow design and generation remains a laborious technical task for domain specialists. Several different approaches have been proposed for deriving workflow models. Some rely on process data mining, whereas others derive workflow models from operational structures, domain-specific knowledge, or workflow model compositions from knowledge bases. Many approaches draw on principles from automatic planning, but are conceptual in nature and lack mathematical justification. In this paper we present a mathematical framework for deducing tasks in workflow models from plans in mechanistic or strongly controlled work environments, with a focus on automatic plan generation. In addition, we prove the associativity of a composition operator that permits crisp hierarchical task compositions for workflow models through a set of mathematical deduction rules. The result is a logical framework that can be used to deduce tasks in workflow hierarchies from operational information about work processes and machine configurations in controlled or mechanistic work environments.
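The notation below is assumed for illustration (the paper's own symbols are not reproduced here); it states the associativity property that makes unbracketed hierarchical compositions well defined.

```latex
% t_1, t_2, t_3 denote workflow tasks and \circ a hierarchical
% composition operator; both are stand-in symbols for this sketch.
\[
  (t_1 \circ t_2) \circ t_3 \;=\; t_1 \circ (t_2 \circ t_3),
\]
% so any hierarchy t_1 \circ t_2 \circ \cdots \circ t_n can be written
% without bracketing, and differently nested decompositions of the same
% plan denote the same composed task.
```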
Abstract:
Rates of dehydration/rehydration are important quality parameters for dried products. Theoretically, if there are no adverse effects on the integrity of the tissue structure, it should absorb water back to the moisture content the initial product had before drying. The purpose of this work is to semi-automate the detection of cell structure boundaries as a food is dehydrated and rehydrated. This will enable food materials researchers to quantify changes to a material's structure as these processes take place. Images of potato cells were taken with an electron microscope as the cells were dehydrated and rehydrated. Cell boundaries were detected using an image processing algorithm. Average cell area and perimeter at each stage of dehydration were calculated and plotted against time. The results show that the algorithm can successfully identify cell boundaries.
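A hedged sketch of a boundary-detection pipeline of the general kind described: threshold, clean up with morphology, label cells, then average area and perimeter. The study's actual algorithm may differ, and the parameter values here are illustrative.

```python
# Illustrative cell-boundary pipeline; the study's algorithm may differ
# and min_size is an assumed clean-up parameter.
import numpy as np
from skimage import io, filters, morphology, measure

def cell_stats(image_path):
    img = io.imread(image_path, as_gray=True)
    # Otsu threshold separates cell interiors from boundaries/background
    binary = img > filters.threshold_otsu(img)
    binary = morphology.remove_small_objects(binary, min_size=64)
    labels = measure.label(binary)
    regions = measure.regionprops(labels)
    areas = [r.area for r in regions]
    perims = [r.perimeter for r in regions]
    return np.mean(areas), np.mean(perims)

# Averages at each dehydration stage can then be plotted against time.
```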
Abstract:
The design and construction community has shown increasing interest in adopting building information models (BIMs). The richness of information provided by BIMs has the potential to streamline the design and construction processes by enabling enhanced communication, coordination, automation and analysis. However, there are many challenges in extracting construction-specific information out of BIMs. In most cases, construction practitioners have to manually identify the required information, which is inefficient and prone to error, particularly for complex, large-scale projects. This paper describes the process and methods we have formalized to partially automate the extraction and querying of construction-specific information from a BIM. We describe methods for analyzing a BIM to query for spatial information that is relevant for construction practitioners, and that is typically represented implicitly in a BIM. Our approach integrates ifcXML data and other spatial data to develop a richer model for construction users. We employ custom 2D topological XQuery predicates to answer a variety of spatial queries. The validation results demonstrate that this approach provides a richer representation of construction-specific information compared to existing BIM tools.
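The paper implements its topological predicates in XQuery over ifcXML; the sketch below is a Python analogue of one such 2D predicate. Reducing element footprints to axis-aligned rectangles is an assumption made here for brevity, and the extraction of footprints from the ifcXML model is omitted.

```python
# Python analogue of a 2D topological "overlaps" predicate of the kind
# the paper expresses in XQuery. Footprints are axis-aligned rectangles
# (xmin, ymin, xmax, ymax) -- an assumption of this sketch.
def footprint_overlaps(a, b):
    """True if two footprint rectangles share interior floor space."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

# A construction query such as "which work areas share floor space?"
# then reduces to evaluating the predicate over pairs of footprints
# extracted from the ifcXML model (extraction code omitted here).
```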
Abstract:
Damage assessment (damage detection, localization and quantification) in structures, together with appropriate retrofitting, enables structures to function safely and efficiently. In this context, many Vibration Based Damage Identification Techniques (VBDIT) have emerged with potential for accurate damage assessment. VBDITs have attracted significant research interest in recent years, mainly due to their non-destructive nature and their ability to assess inaccessible and invisible damage locations. Damage Index (DI) methods are also vibration based, but they do not rely on a structural model. DI methods are fast and inexpensive compared to model-based methods and have the ability to automate the damage detection process. A DI method analyses the change in the vibration response of the structure between two states so that damage can be identified. Extensive research has been carried out on applying DI methods to assess damage in steel structures. By comparison, there has been very little research on the use of DI methods to assess damage in Reinforced Concrete (RC) structures, owing to the complexity of simulating the predominant damage type, the flexural crack. Flexural cracks in RC beams distribute non-linearly and propagate in all directions. Because of the stress distribution caused by the tensile reinforcement, secondary cracks extend more rapidly along the longitudinal and transverse directions of an RC structure than existing cracks propagate in the depth direction. Simplified damage simulation techniques (such as reductions in the modulus or section depth, or the use of rotational spring elements) that have been used extensively in research on steel structures cannot be applied to simulate flexural cracks in RC elements. This highlights a significant gap in knowledge, and as a consequence VBDITs have not been successfully applied to damage assessment in RC structures. This research addresses that gap by developing and applying a modal strain energy based DI method to assess damage in RC flexural members. Firstly, the research evaluated different damage simulation techniques and recommended an appropriate technique to simulate the post-cracking behaviour of RC structures. The ABAQUS finite element package was used throughout the study with properly validated material models. The damaged plasticity model was recommended as the method that can correctly simulate the post-cracking behaviour of RC structures and was used in the remainder of the study. Four different forms of Modal Strain Energy based Damage Indices (MSEDIs) were proposed to improve damage assessment capability by minimising the number and intensity of false alarms. The developed MSEDIs were then used to automate the damage detection process through programmable algorithms. The developed algorithms can identify common issues associated with vibration properties, such as mode shifting and phase change. To minimise the effect of noise on the DI calculation process, a sequential curve-fitting technique was proposed. Finally, a statistics-based damage assessment scheme was proposed to enhance the reliability of the damage assessment results. The proposed techniques were applied to locate damage in RC beams and in a slab-on-girder bridge model to demonstrate their accuracy and efficiency. The outcomes of this research make a significant contribution to the technical knowledge of VBDITs and enhance the accuracy of damage assessment in RC structures. The application of the research findings to RC flexural members will enable their safe and efficient performance.
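The abstract does not define the four proposed MSEDIs; for orientation only, one widely used single-mode modal strain energy damage index for beam elements (the Stubbs index, on which many MSEDI variants build) takes the form below, where phi'' is the mode shape curvature, an asterisk denotes the damaged state, and beta_j > 1 flags likely damage in element j. This is standard background, not the thesis's own formulation.

```latex
% Common single-mode form of the modal strain energy (Stubbs) damage
% index for element j of a beam of length L; background material only.
\[
  \beta_j \;=\;
  \frac{\left[\int_{j}\bigl(\phi''^{*}\bigr)^{2}\,dx
        +\int_{0}^{L}\bigl(\phi''^{*}\bigr)^{2}\,dx\right]
        \int_{0}^{L}\bigl(\phi''\bigr)^{2}\,dx}
       {\left[\int_{j}\bigl(\phi''\bigr)^{2}\,dx
        +\int_{0}^{L}\bigl(\phi''\bigr)^{2}\,dx\right]
        \int_{0}^{L}\bigl(\phi''^{*}\bigr)^{2}\,dx}
\]
```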
Abstract:
1. Autonomous acoustic recorders are widely available and can provide a highly efficient method of species monitoring, especially when coupled with software to automate data processing. However, the adoption of these techniques is restricted by a lack of direct comparisons with existing manual field surveys.
2. We assessed the performance of autonomous methods by comparing manual and automated examination of acoustic recordings with a field-listening survey, using commercially available autonomous recorders and custom call detection and classification software. We compared the detection capability, time requirements, areal coverage and weather condition bias of these three methods using an established call monitoring programme for a nocturnal bird, the little spotted kiwi (Apteryx owenii).
3. The autonomous recorder methods had very high precision (>98%) and required <3% of the time needed for the field survey. They were less sensitive, with visual spectrogram inspection recovering 80% of the total calls detected and automated call detection 40%, although this recall increased with signal strength. The areal coverage of the spectrogram inspection and automatic detection methods was 85% and 42% of the field survey, respectively. The methods using autonomous recorders were more adversely affected by wind, and did not show the positive association between ground moisture and call rates that was apparent from the field counts. However, all methods produced the same result for the most important conservation information from the survey: the annual change in calling activity.
4. Autonomous monitoring techniques incur different biases from manual surveys and so can yield different ecological conclusions if sampling is not adjusted accordingly. Nevertheless, the sensitivity, robustness and high accuracy of automated acoustic methods demonstrate that they offer a suitable and extremely efficient alternative to field observer point counts for species monitoring.
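The study's detector and classifier are custom software whose internals the abstract does not describe; the sketch below shows the general shape of an energy-threshold call detector. The frequency band and threshold are placeholder values, not the study's settings.

```python
# Generic energy-threshold call detector; band limits and threshold are
# placeholders, not the study's custom detector.
import numpy as np
from scipy import signal

def detect_calls(audio, sr, f_lo=1500, f_hi=4000, thresh_db=-30):
    """Return (start, end) times where band-limited energy exceeds a threshold."""
    f, t, S = signal.spectrogram(audio, fs=sr, nperseg=1024)
    band = S[(f >= f_lo) & (f <= f_hi)].sum(axis=0)
    level = 10 * np.log10(band / (band.max() + 1e-12) + 1e-12)
    active = level > thresh_db
    # group consecutive active frames into call events
    events, start = [], None
    for i, a in enumerate(active):
        if a and start is None:
            start = t[i]
        elif not a and start is not None:
            events.append((start, t[i]))
            start = None
    if start is not None:
        events.append((start, t[-1]))
    return events
```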
Abstract:
This paper presents two algorithms to automate the detection of marine species in aerial imagery. The first, from an initial pilot study, uses morphology operations and colour analysis as the basis of its working principle. The second uses the saturation channel and histogram-based shape profiling. We report the performance of both algorithms on datasets collected from an unmanned aerial system at an altitude of 1000 ft. Early results demonstrate recall values of 48.57% and 51.4%, and precision values of 4.01% and 4.97%, respectively.
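An illustrative sketch combining the two cues the paper names (saturation channel plus morphology operations); the threshold, kernel size and minimum blob area are assumptions, not the paper's tuned values.

```python
# Saturation-threshold + morphology candidate detector; parameter values
# are assumptions of this sketch, not the paper's.
import cv2
import numpy as np

def candidate_detections(bgr_image, sat_thresh=60, min_area=20):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    sat = hsv[:, :, 1]
    _, mask = cv2.threshold(sat, sat_thresh, 255, cv2.THRESH_BINARY)
    # opening removes speckle; closing fills small gaps in blobs
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```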
Abstract:
The ability to automate forced landings in an emergency such as engine failure is essential to improving the safety of Unmanned Aerial Vehicles operating in General Aviation airspace. By using active vision to detect safe landing zones below the aircraft, the reliability and safety of such systems are vastly improved through up-to-the-minute information about the ground environment. This paper presents the Site Detection System, a methodology utilising a downward-facing camera to analyse the ground environment in both 2D and 3D, detect safe landing sites and characterise them according to size, shape, slope and nearby obstacles. A methodology is presented showing the fusion of landing site detection from 2D imagery with a coarse Digital Elevation Map and dense 3D reconstructions using INS-aided Structure-from-Motion to improve accuracy. Results are presented from an experimental flight showing the precision/recall of detected landing sites against a hand-classified ground truth, and improved performance with the integration of 3D analysis from visual Structure-from-Motion.
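A hedged sketch of one characterisation step of the kind described above: estimating the slope of a candidate site by least-squares plane fitting over an elevation patch. The grid spacing and slope limit are illustrative values, not the system's.

```python
# Slope check for a candidate landing site via least-squares plane fit;
# cell size and the 5-degree limit are assumptions of this sketch.
import numpy as np

def site_slope_deg(elev_patch, cell_size_m=1.0):
    """Fit z = ax + by + c to a DEM patch and return slope in degrees."""
    h, w = elev_patch.shape
    ys, xs = np.mgrid[0:h, 0:w] * cell_size_m
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, elev_patch.ravel(), rcond=None)
    a, b = coeffs[0], coeffs[1]
    return np.degrees(np.arctan(np.hypot(a, b)))  # gradient magnitude

def is_safe(elev_patch, max_slope_deg=5.0):
    return site_slope_deg(elev_patch) <= max_slope_deg
```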
Abstract:
Business processes depend on human resources, and managers must regularly evaluate the performance of their employees based on a number of measures, some of which are subjective in nature. As modern organisations use information systems to automate their business processes and record information about process executions in event logs, it becomes possible to obtain objective information about resource behaviour by analysing data recorded in event logs. We present an extensible framework for extracting knowledge from event logs about the behaviour of a human resource and for analysing the dynamics of this behaviour over time. The framework is fully automated and implements a predefined set of behavioural indicators for human resources. It also provides a means for organisations to define their own behavioural indicators, using the standard Structured Query Language (SQL), and to analyse the dynamics of these indicators. The framework's applicability is demonstrated using an event log from a German bank.
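A hedged sketch of a user-defined behavioural indicator expressed in SQL, in the spirit of the framework described. The event-log schema (case_id, resource, completed_at) is an assumption of the sketch, not the framework's actual schema.

```python
# One example behavioural indicator over a hypothetical event-log table
# event_log(case_id, activity, resource, completed_at).
import sqlite3

INDICATOR_SQL = """
SELECT strftime('%Y-%m', completed_at) AS month,
       COUNT(DISTINCT case_id)         AS cases_completed
FROM event_log
WHERE resource = :resource
GROUP BY month
ORDER BY month;
"""

def monthly_workload(db_path, resource):
    """Evaluate the indicator per month, enabling dynamics analysis."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(INDICATOR_SQL, {"resource": resource}).fetchall()
```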
Abstract:
This paper presents a novel framework for the unsupervised alignment of an ensemble of temporal sequences. The approach draws inspiration from the axiom that an ensemble of temporal signals stemming from the same source/class should have lower rank when "aligned" than when "misaligned". Our approach shares similarities with recent state-of-the-art methods for unsupervised image ensemble alignment (e.g., RASL), which break the problem into a set of image alignment problems with well-known solutions (i.e., the Lucas-Kanade algorithm). Similarly, we propose a strategy for decomposing the problem of temporal ensemble alignment into a set of independent sequence problems, which we claim can be solved reliably through Dynamic Time Warping (DTW). We demonstrate the utility of our method on the Cohn-Kanade+ dataset, aligning expression onset across multiple sequences, which allows us to automate the rapid discovery of event annotations.
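A minimal DTW sketch: the pairwise alignment primitive the paper builds on, shown for two 1-D sequences. The ensemble-level rank-minimisation strategy wrapped around it is not reproduced here.

```python
# Textbook Dynamic Time Warping on 1-D sequences; the paper's ensemble
# decomposition around this primitive is not shown.
import numpy as np

def dtw_path(x, y):
    """Return (cost, path): optimal monotonic alignment of x to y."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # backtrack from (n, m) to recover the warping path
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return D[n, m], path[::-1]
```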
Abstract:
4D modeling - the simulation and visualisation of the construction process - is now a common method used during the building construction process, with reasonable support from existing software. The goal of this paper is to examine the information needs required to model the deconstruction/demolition process of a building. The motivation is the need to reduce the impacts on the local environment during the deconstruction process. The focus is on the definition and description of the activities to remove building components, and on the assessment of the noise, dust and vibration implications of these activities for the surrounding environment. The outcomes of the research are:
i. a requirements specification for BIM models to support operational deconstruction process planning,
ii. algorithms for augmenting the BIM with the derived information necessary to automate planning of the deconstruction process with respect to impacts on the surrounding environment,
iii. algorithms to build naive deconstruction activity schedules (sketched below).
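A hedged sketch of what a "naive deconstruction activity schedule" might look like algorithmically: components are removed in reverse dependency order, so nothing is removed while it still carries another component. The supports relation would come from the augmented BIM; here it is a plain dict, which is an assumption of this sketch.

```python
# Naive removal ordering over an assumed supports relation; the paper's
# actual scheduling algorithms are not reproduced here.
from collections import deque

def naive_schedule(supports):
    """supports[a] = components resting on a. A component is removable
    once everything resting on it has been removed (reverse topological
    order over the support graph)."""
    comps = set(supports)
    for carried in supports.values():
        comps |= set(carried)
    remaining = {c: len(supports.get(c, ())) for c in comps}
    supporters_of = {c: [] for c in comps}
    for a, carried in supports.items():
        for x in carried:
            supporters_of[x].append(a)
    queue = deque(c for c in comps if remaining[c] == 0)
    order = []
    while queue:
        x = queue.popleft()
        order.append(x)
        for a in supporters_of[x]:
            remaining[a] -= 1
            if remaining[a] == 0:
                queue.append(a)
    return order

# naive_schedule({"foundation": {"wall"}, "wall": {"roof"}})
# -> one valid order, e.g. ['roof', 'wall', 'foundation']
```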
Abstract:
Enterprise Resource Planning (ERP) software typically takes the form of a package that is licensed for use to those in a client organisation and is sold as being able to automate a wide range of processes within organisations. ERP packages have become an important feature of information and communications technology (ICT) infrastructures in organizations. However, a number of highly publicised failures have been associated with ERP packages too. For example: Hershey, Aero Group and Snap-On have blamed the implementation of ERP packages for negative impacts upon earnings (Scott and Vessey 2000); Cadbury Schweppes implemented plans to fulfil 250 orders where normally they would fulfil 1000, due to the increased complexity and the need to re-train staff post-implementation (August 1999); and FoxMeyer drug company's implementation of an ERP package has been argued to have led to bankruptcy proceedings, resulting in litigation against SAP, the software vendor in question (Bicknell 1998). Some have even rejected a single-vendor approach outright (Light et al. 2001). ERP packages appear to work for some and not for others; they contain contradictions. Indeed, if we start from the position that technologies do not provide their own explanation, then we have to consider the direction of a technological trajectory and why it moves in one way rather than another (Bijker and Law 1994). In other words, ERP appropriation cannot be predetermined as a success, despite the persuasive attempts of vendors via their websites and other marketing channels. Moreover, just because ERP exists, we cannot presume that all will appropriate it in the same fashion, if at all. There is more to the diffusion of innovations than stages of adoption and a simple demarcation between adoption and rejection. The processes that are enacted in appropriation need to be conceptualised as a site of struggle, political and imbued with power (Hislop et al. 2000; Howcroft and Light, 2006). ERP appropriation and rejection can therefore be seen as a paradoxical phenomenon. In this paper we examine these contradictions as a way to shed light on the presence and role of inconsistencies in ERP appropriation and rejection. We argue that much of the reasoning associated with ERP adoption is pro-innovation biased and that deterministic models of the diffusion of innovations, such as Rogers (2003), do not adequately take account of contradictions in the process. Our argument is that a better theoretical understanding of these contradictions is necessary to underpin research and practice in this area. In the next section, we introduce our view of appropriation. Following this is an outline of the idea of contradiction, and the strategies employed to 'cope' with it. Then, we introduce a number of reasons for ERP adoption and identify their inherent contradictions using these perspectives. From this discussion, we draw a framework which illustrates how the interpretive flexibility of reasons to adopt ERP packages leads to contradictions which fuel the enactment of appropriation and rejection.