976 results for project delay estimation
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-Spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
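To make the displacement reconstruction step concrete, here is a minimal sketch (Python/NumPy, not the authors' implementation) of forward Eulerian integration of a non-stationary velocity field followed by a strain computation from the spatial derivatives of the recovered displacement; the 1D velocity field and grid are illustrative placeholders for the B-spline parameterized 4D field.

```python
import numpy as np

def velocity(x, t):
    # Illustrative smooth, time-varying velocity field (stand-in for the
    # B-spline parameterized field used in TDFFD).
    return 0.05 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * t)

# Material points tracked from the reference frame (t = 0).
x0 = np.linspace(0.0, 1.0, 101)
x = x0.copy()

n_steps, dt = 50, 1.0 / 50
for k in range(n_steps):
    t = k * dt
    # Forward Eulerian integration of the non-stationary velocity field.
    x = x + dt * velocity(x, t)

# Displacement field and its spatial derivative (1D deformation gradient).
u = x - x0
du_dx = np.gradient(u, x0)
F = 1.0 + du_dx                          # deformation gradient (1D)
green_lagrange_strain = 0.5 * (F**2 - 1.0)

print("peak strain:", np.abs(green_lagrange_strain).max())
```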
Abstract:
Purpose: The objective of this study is to investigate the feasibility of detecting and quantifying 3D cerebrovascular wall motion from a single 3D rotational x-ray angiography (3DRA) acquisition within a clinically acceptable time, and of deriving, from the estimated motion field, quantities for further biomechanical modeling of the cerebrovascular wall. Methods: The whole motion cycle of the cerebral vasculature is modeled using a 4D B-spline transformation, which is estimated within a 4D to 2D + t image registration framework. The registration is performed by optimizing a single similarity metric between the entire 2D + t measured projection sequence and the corresponding forward projections of the deformed volume at their exact time instants. The joint use of two acceleration strategies, together with their implementation on graphics processing units, is also proposed so as to reach computation times close to clinical requirements. For further characterizing vessel wall properties, an approximation of the wall thickness changes is obtained through a strain calculation. Results: Evaluation on in silico and in vitro pulsating phantom aneurysms demonstrated an accurate estimation of wall motion curves. In general, the error was below 10% of the maximum pulsation, even when a substantially inhomogeneous intensity pattern was present. Experiments on in vivo data provided realistic aneurysm and vessel wall motion estimates, whereas in regions where motion was neither visible nor anatomically possible, no motion was detected. The use of the acceleration strategies enabled completing the estimation process for one entire cycle in 5-10 min without degrading the overall performance. The strain map extracted from our motion estimation provided a realistic deformation measure of the vessel wall. Conclusions: The authors' technique has demonstrated that it can provide accurate and robust 4D estimates of cerebrovascular wall motion within a clinically acceptable time, although it has to be applied to a larger patient population before possible wide application to routine endovascular procedures. In particular, for the first time, this feasibility study has shown that in vivo cerebrovascular motion can be obtained intraprocedurally from a 3DRA acquisition. Results have also shown the potential of performing strain analysis using this imaging modality, thus making possible the future modeling of biomechanical properties of the vascular wall.
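As a rough illustration of the registration objective, the sketch below (Python/NumPy, purely schematic) accumulates a single squared-difference similarity between a measured 2D + t projection sequence and forward projections of a deformed volume over all time instants; the parallel projector and the one-parameter deformation are simplified stand-ins for the cone-beam geometry and the 4D B-spline transform.

```python
import numpy as np

def forward_project(volume):
    # Simplified parallel projection along one axis (stand-in for the
    # cone-beam forward projector used with 3DRA geometry).
    return volume.sum(axis=0)

def deform(volume, params, t):
    # Placeholder deformation: a time-dependent shift controlled by `params`.
    shift = int(round(params[0] * np.sin(2 * np.pi * t)))
    return np.roll(volume, shift, axis=1)

def similarity(params, volume, measured_seq, times):
    # Single metric accumulated over the whole 2D + t projection sequence.
    total = 0.0
    for proj, t in zip(measured_seq, times):
        total += np.sum((forward_project(deform(volume, params, t)) - proj) ** 2)
    return total

# Toy data: a static volume and a "measured" pulsating projection sequence.
rng = np.random.default_rng(0)
volume = rng.random((16, 16, 16))
times = np.linspace(0.0, 1.0, 10, endpoint=False)
measured = [forward_project(deform(volume, [2.0], t)) for t in times]

print(similarity([2.0], volume, measured, times))   # zero at the true parameter
print(similarity([0.0], volume, measured, times))   # larger elsewhere
```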
Abstract:
The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved here by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of a posteriori densities of the unknown parameters and data. Unlike in the above-cited paper, wherein one could evaluate the exact multiuser set posterior density, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers that are based on Sequential Monte Carlo (SMC) methods (“particle filtering”). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
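The SMC approach can be illustrated with a generic bootstrap (SIR) particle filter tracking a single unknown continuous parameter from noisy observations; the toy state-space model below is invented for illustration and is much simpler than the random-finite-set multiuser receiver of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model: slowly drifting continuous parameter observed in noise.
T, n_particles = 100, 500
sigma_x, sigma_y = 0.05, 0.5

true_x = np.cumsum(sigma_x * rng.standard_normal(T)) + 1.0
obs = true_x + sigma_y * rng.standard_normal(T)

# Bootstrap (SIR) particle filter: propagate, weight by likelihood, resample.
particles = rng.normal(1.0, 1.0, n_particles)
estimates = np.empty(T)
for t in range(T):
    particles += sigma_x * rng.standard_normal(n_particles)      # propagate
    w = np.exp(-0.5 * ((obs[t] - particles) / sigma_y) ** 2)      # weight
    w /= w.sum()
    estimates[t] = np.dot(w, particles)                           # MMSE estimate
    idx = rng.choice(n_particles, n_particles, p=w)               # resample
    particles = particles[idx]

print("final RMSE:", np.sqrt(np.mean((estimates - true_x) ** 2)))
```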
Abstract:
Wireless “MIMO” systems, employing multiple transmit and receive antennas, promise a significant increase of channel capacity, while orthogonal frequency-division multiplexing (OFDM) is attracting a good deal of attention due to its robustness to multipath fading. Thus, the combination of both techniques is an attractive proposition for radio transmission. The goal of this paper is the description and analysis of a novel pilot-aided estimator of multipath block-fading channels. Typical models leading to estimation algorithms assume the number of multipath components and delays to be constant (and often known), while their amplitudes are allowed to vary with time. Our estimator is focused instead on the more realistic assumption that the number of channel taps is also unknown and varies with time following a known probabilistic model. The estimation problem arising from these assumptions is solved using Random-Set Theory (RST), whereby one regards the multipath-channel response as a single set-valued random entity. Within this framework, Bayesian recursive equations determine the evolution with time of the channel estimator. Due to the lack of a closed form for the solution of the Bayesian equations, a (Rao–Blackwellized) particle filter (RBPF) implementation of the channel estimator is advocated. Since the resulting estimator exhibits a complexity which grows exponentially with the number of multipath components, a simplified version is also introduced. Simulation results describing the performance of our channel estimator demonstrate its effectiveness.
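A toy version of the "unknown, time-varying number of taps" assumption can be simulated with an independent birth/death process per delay bin; the sketch below (Python/NumPy) is only a crude stand-in for the paper's random-set channel model, with all probabilities invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy birth/death model for the set of active channel taps: each of L delay
# bins turns on/off independently from block to block, and active taps carry
# Rayleigh-fading complex amplitudes.
L, n_blocks = 8, 20
p_birth, p_death = 0.1, 0.2

active = rng.random(L) < 0.3
for b in range(n_blocks):
    birth = (~active) & (rng.random(L) < p_birth)
    death = active & (rng.random(L) < p_death)
    active = (active | birth) & ~death
    amps = np.where(
        active,
        (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2),
        0.0,
    )
    power = np.sum(np.abs(amps) ** 2)
    print(f"block {b:2d}: taps = {np.flatnonzero(active).tolist()}, power = {power:.2f}")
```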
Abstract:
In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that the number of channel taps is also unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), which is a probability theory of finite sets. Due to the lack of a closed form of the optimal filter, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.
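The Rao-Blackwellization rests on the fact that, conditional on a hypothesized set of active taps (one particle), the tap amplitudes are linear-Gaussian and can be handled by a Kalman filter. The sketch below shows one such conditional predict/update step for a hypothetical pilot model y = P h + noise; the matrices and dimensions are illustrative, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Conditional on a hypothesized set of active taps (one RBPF particle), the
# amplitudes h follow a linear-Gaussian model and can be tracked by a Kalman
# filter. Illustrative pilot model: y = P @ h + noise, random-walk prior on h.
n_taps, n_pilots = 3, 16
P = rng.standard_normal((n_pilots, n_taps))      # pilot observation matrix
Q = 0.01 * np.eye(n_taps)                        # process noise covariance
R = 0.1 * np.eye(n_pilots)                       # measurement noise covariance

h_true = rng.standard_normal(n_taps)
y = P @ h_true + 0.3 * rng.standard_normal(n_pilots)

# One Kalman predict/update step.
h_est, C = np.zeros(n_taps), np.eye(n_taps)      # prior mean and covariance
C = C + Q                                        # predict
S = P @ C @ P.T + R                              # innovation covariance
K = C @ P.T @ np.linalg.inv(S)                   # Kalman gain
h_est = h_est + K @ (y - P @ h_est)              # update
C = (np.eye(n_taps) - K @ P) @ C

print("true taps:", np.round(h_true, 2), "estimate:", np.round(h_est, 2))
```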
Abstract:
In Quantitative Microbial Risk Assessment, it is vital to understand how lag times of individual cells are distributed over a bacterial population. Such identified distributions can be used to predict the time by which, in a growth-supporting environment, a few pathogenic cells can multiply to a poisoning concentration level. We model the lag time of a single cell, inoculated into a new environment, by the delay of the growth function characterizing the generated subpopulation. We introduce an easy-to-implement procedure, based on the method of moments, to estimate the parameters of the distribution of single cell lag times. The advantage of the method is especially apparent for cases where the initial number of cells is small and random, and the culture is detectable only in the exponential growth phase.
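As an illustration of the method-of-moments idea (not the paper's specific procedure based on the growth-function delay), the sketch below fits a gamma distribution to simulated single-cell lag times by matching the first two sample moments.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated single-cell lag times (hours); in practice these would be inferred
# from the delay of the subpopulation growth curves.
true_shape, true_scale = 4.0, 0.5
lags = rng.gamma(true_shape, true_scale, size=200)

# Method of moments for a gamma distribution:
#   mean = shape * scale,  variance = shape * scale**2
m, v = lags.mean(), lags.var()
scale_hat = v / m
shape_hat = m / scale_hat

print(f"estimated shape = {shape_hat:.2f}, scale = {scale_hat:.2f}")
```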
Abstract:
Reflective cracking in hot mix asphalt (HMA) overlays has been a common cause of poor pavement performance in Iowa for many years. Reflective cracks commonly occur in HMA overlays when deteriorated portland cement concrete is paved over with HMA. This results in HMA pavement surfaces with poor ride quality and increased transportation maintenance costs. To delay the formation of cracks in HMA overlays, the Iowa Department of Transportation (Iowa DOT) has begun to implement a crack-relief interlayer mix design specification. The crack-relief interlayer is an asphalt-rich, highly flexible HMA that can resist cracking in high strain loading conditions. In this project, the field performance of an HMA overlay using a one inch interlayer was compared to a conventional HMA overlay without an interlayer. Both test sections were constructed on US 169 in Adel, Iowa as part of an Iowa DOT overlay project. The laboratory performance of the interlayer mix design was assessed for resistance to cracking from repeated strains by using the four-point bending beam apparatus. An HMA using a highly polymer modified binder was designed and shown to meet the laboratory performance test criteria. The field performance of the overlay with the interlayer exceeded the performance of the conventional overlay that did not have the interlayer. After one winter season, 29 percent less reflective cracking was measured in the pavement section with the interlayer than the pavement section without the interlayer. The level of cracking severity was also reduced by using the interlayer in the overlay.
Abstract:
Concrete paving is often at a disadvantage in terms of pavement type selection due to the time of curing required prior to opening the pavement to traffic. The State of Iowa has been able to reduce traffic delay constraints through material selection and construction methods to date. Methods for monitoring concrete strength gain and quality have not changed since the first concrete pavements were constructed in Iowa. In 1995, Lee County and the Iowa DOT cooperated in a research project, HR-380, to construct a 7.1 mile (11.43 km) project to evaluate the use of maturity and pulse velocity nondestructive testing (NDT) methods in the estimation of concrete strength gain. The research identified the pros and cons of each method and suggested an instructional memorandum to utilize maturity measurements to meet traffic delay demands. The implementation of maturity measurements and special traffic control measures reduced the traffic delay opening time from 5-7 days to less than 2 days. Recommendations on the development of the maturity curve for each project and the location and monitoring of the maturity thermocouples are included. Examples of equipment that could easily be used by project personnel to estimate the concrete strength using the maturity method are described.
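The maturity method relates strength gain to accumulated temperature-time. A common formulation is the Nurse-Saul maturity index, M = Σ (Ta − T0) Δt; the sketch below computes it from a hypothetical thermocouple log and reads an estimated strength off a hypothetical project calibration curve (both data sets are invented for illustration, not from the HR-380 project).

```python
import numpy as np

# Hypothetical thermocouple log: hours since placement and concrete temperature (deg C).
hours = np.array([0, 6, 12, 18, 24, 30, 36, 42, 48], dtype=float)
temp_c = np.array([20, 32, 38, 35, 30, 27, 25, 24, 23], dtype=float)

# Nurse-Saul maturity index: M = sum((Ta - T0) * dt), datum temperature T0 = -10 deg C.
T0 = -10.0
dt = np.diff(hours)
avg_temp = 0.5 * (temp_c[:-1] + temp_c[1:])
maturity = np.sum((avg_temp - T0) * dt)          # degree-hours (deg C * h)

# Hypothetical project calibration curve mapping maturity to flexural strength (MPa).
calib_maturity = np.array([500, 1000, 2000, 4000, 8000], dtype=float)
calib_strength = np.array([1.5, 2.5, 3.5, 4.3, 5.0])
strength = np.interp(maturity, calib_maturity, calib_strength)

print(f"maturity = {maturity:.0f} degC*h, estimated strength = {strength:.1f} MPa")
```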
Abstract:
This study focuses on developing the procurement function as part of the execution of plant engineering projects. Its empirical background is the project business of Pöyry Oyj, and the perspective chosen for the work is that of a company responsible for project management. The research is highly practical: it starts from the problems of procurement and procurement monitoring and seeks to offer novel solutions to them. At its core the research belongs to industrial engineering and management, although information systems science plays a strong supporting role. The goals and results of the work relate, as is characteristic of industrial engineering and management, to improving company operations, while the tools and solutions employed draw on the possibilities offered by information systems science. The study applies the constructive research approach, in which innovative constructions are created to solve genuine real-world problems, thereby producing contributions to industrial engineering and management. The objective was to organize procurement and its monitoring in large plant projects more efficiently. To this end, the project management and procurement procedures were first revised to better match present-day requirements. Based on these procedures, development of a procurement software system was begun, aiming to cover all the features described in the procedures. Ultimately, the procurement software introduced new features to project management and procurement, and these were incorporated into the procedures. This development work was undertaken so that project management and procurement in plant projects would work better, that is, produce the results required in projects faster, more accurately, and with higher quality at lower cost. The research has three kinds of results: improved procurement methods, the operational and calculation models underlying the procurement software, and, as an implementation, the procurement application itself. The revised project and procurement procedures describe the improved procurement methods. The descriptions produced while designing and developing the procurement software contain new models both for the procurement process and for monitoring procurement in large plant projects. The software itself is, as a result, an implementation based on the improved procurement methods and the new operational and calculation models. The revised project and procurement procedures have been in use at Pöyry Oyj since 1991. Over the years these procedures have supported the execution of hundreds of plant projects and maintained Pöyry Oyj's competitiveness as an international project house. The procurement application, in turn, has been used in numerous projects and has been found to reduce the direct labor costs of procurement in plant projects. The application is also considered to bring indirect cost savings in the form of better procurement decisions, but the magnitude of these savings cannot be reliably estimated.
Abstract:
Software engineering is criticized as not being engineering or 'well-developed' science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes. In this thesis only effort estimation in software projects for project management purposes is discussed. There is a short introduction to the measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution in this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis, but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn. Effort estimation accuracy has significantly improved after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement. The author of this thesis has developed a three-level solution for the estimation model. All currently used size metrics are static in nature, but this new proposed metric is dynamic. It takes advantage of the increased understanding of the nature of the work as specification and design work proceeds; it thus 'grows up' along with software projects. Effort estimation model development is not possible without gathering and analyzing history data. However, there are many problems with data in software engineering. A major roadblock is the amount and quality of data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, estimates are stored, reported and analyzed properly, and they are used for project management activities. A higher-level mechanism, called a measurement framework, is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines; it requires effort even to maintain an achieved level of estimation accuracy. Estimation results in several successive releases are analyzed. It is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity. An example is shown to shed more light on the calibration and the model itself. There are also remarks about the sensitivity of the model. Finally, an example of usage is shown.
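Not the thesis's hierarchical, FPA-based model, but as a minimal illustration of calibrating an effort model from history data, the sketch below fits effort = a * size^b to hypothetical completed projects by log-log regression and uses it for a new estimate; all numbers are invented.

```python
import numpy as np

# Hypothetical history data: software size (e.g., function points) and actual
# effort (person-hours) for completed projects.
size = np.array([120, 250, 400, 650, 900, 1300], dtype=float)
effort = np.array([800, 1900, 2600, 4800, 6200, 9800], dtype=float)

# Calibrate effort = a * size**b by linear regression in log-log space.
b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
a = np.exp(log_a)

def estimate_effort(new_size):
    """Predict effort (person-hours) for a project of the given size."""
    return a * new_size ** b

print(f"calibrated model: effort = {a:.2f} * size^{b:.2f}")
print(f"estimate for a 500-FP project: {estimate_effort(500):.0f} person-hours")
```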
Abstract:
Next-generation sequencing (NGS) technologies have become the standard for data generation in studies of population genomics, such as the 1000 Genomes Project (1000G). However, these techniques are known to be problematic when applied to highly polymorphic genomic regions, such as the human leukocyte antigen (HLA) genes. Because accurate genotype calls and allele frequency estimations are crucial to population genomics analyses, it is important to assess the reliability of NGS data. Here, we evaluate the reliability of genotype calls and allele frequency estimates of the single-nucleotide polymorphisms (SNPs) reported by 1000G (phase I) at five HLA genes (HLA-A, -B, -C, -DRB1, and -DQB1). We take advantage of the availability of HLA Sanger sequencing of 930 of the 1092 1000G samples and use it as a gold standard to benchmark the 1000G data. We document that 18.6% of SNP genotype calls in HLA genes are incorrect and that allele frequencies are estimated with an error greater than ±0.1 at approximately 25% of the SNPs in HLA genes. We found a bias toward overestimation of the reference allele frequency in the 1000G data, indicating that mapping bias is an important cause of error in frequency estimation in this dataset. We provide a list of sites that have poor allele frequency estimates and discuss the outcomes of including those sites in different kinds of analyses. Because the HLA region is the most polymorphic in the human genome, our results provide insights into the challenges of using NGS data at other genomic regions of high diversity.
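A minimal sketch of the benchmarking idea, assuming genotypes coded as 0/1/2 copies of the reference allele: compare NGS-style calls against a gold standard, compute the per-site allele frequency error, and flag sites whose error exceeds 0.1. All data below are simulated, not the 1000G or Sanger data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy genotype matrices (samples x sites), coded as copies of the reference
# allele (0, 1, or 2). `gold` stands for Sanger-based typing, `ngs` for
# sequencing-based calls with some errors injected.
n_samples, n_sites = 930, 50
gold = rng.integers(0, 3, size=(n_samples, n_sites))
errors = rng.random((n_samples, n_sites)) < 0.15
ngs = np.where(errors, rng.integers(0, 3, size=(n_samples, n_sites)), gold)

# Reference allele frequency per site under each call set.
freq_gold = gold.mean(axis=0) / 2.0
freq_ngs = ngs.mean(axis=0) / 2.0

abs_err = np.abs(freq_ngs - freq_gold)
flagged = np.flatnonzero(abs_err > 0.1)

print(f"genotype mismatch rate: {np.mean(ngs != gold):.3f}")
print(f"sites with |frequency error| > 0.1: {flagged.tolist()}")
```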
Abstract:
The goal of this work was to develop the cost estimation process of the engineering unit under study so that the unit's management would in the future have more accurate cost information at its disposal. To make this possible, the unit's working practices, the cost structures of its projects, and the cost attributes first had to be identified. This was made possible by studying the cost history data of the projects and by interviewing experts. The result of the work is a cost estimation process and model compatible with the target unit's other processes. The cost estimation method and model are based on cost attributes, which are defined separately for the environment under study. The cost attributes are found by examining history data, that is, by analyzing completed projects, their cost structures, and the factors that have driven the costs. After this, weights and weight ranges must be defined for the cost attributes. The accuracy of the estimation model can be improved by calibrating the model. The Goal-Question-Metric (GQM) method was used as the framework of the study.
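A minimal sketch of a weighted cost-attribute estimate, assuming the model scales a baseline cost by one multiplier per attribute; the attribute names, ratings, and weights below are invented for illustration and would in practice come from history data and calibration.

```python
# Minimal weighted cost-attribute model: a baseline cost is scaled by a
# multiplier per attribute rating. Attribute names, weights, and rating
# ranges below are invented for illustration, not taken from the thesis.
attributes = {
    # attribute: (rating in [-1, 1], weight = max relative impact on cost)
    "technical_novelty": (0.5, 0.30),
    "supplier_maturity": (-0.25, 0.20),
    "schedule_pressure": (0.75, 0.25),
}

baseline_cost = 100_000.0  # EUR, from a comparable completed project

multiplier = 1.0
for name, (rating, weight) in attributes.items():
    multiplier *= 1.0 + rating * weight

estimate = baseline_cost * multiplier
print(f"cost estimate: {estimate:,.0f} EUR (multiplier {multiplier:.3f})")
```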
Abstract:
Sensor-based robot control allows manipulation in dynamic environments with uncertainties. Vision is a versatile, low-cost sensory modality, but its low sample rate, high sensor delay and uncertain measurements limit its usability, especially in strongly dynamic environments. Force is a complementary sensory modality allowing accurate measurements of local object shape when a tooltip is in contact with the object. In multimodal sensor fusion, several sensors measuring different modalities are combined to give a more accurate estimate of the environment. As force and vision are fundamentally different sensory modalities that do not share a common representation, combining the information from these sensors is not straightforward. In this thesis, methods for fusing proprioception, force and vision together are proposed. By making assumptions about object shape and modeling the uncertainties of the sensors, the measurements can be fused together in an extended Kalman filter. The fusion of force and visual measurements makes it possible to estimate the pose of a moving target with a moving, end-effector-mounted camera at high rate and accuracy. The proposed approach takes the latency of the vision system into account explicitly to provide high-sample-rate estimates. The estimates also allow a smooth transition from vision-based motion control to force control. The velocity of the end-effector can be controlled by estimating the distance to the target by vision and determining a velocity profile that gives a rapid approach and minimal force overshoot. Experiments with a 5-degree-of-freedom parallel hydraulic manipulator and a 6-degree-of-freedom serial manipulator show that integration of several sensor modalities can increase the accuracy of the measurements significantly.
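As a simplified, linear stand-in for the extended Kalman filter fusion described above, the sketch below fuses a high-rate, accurate "contact" position measurement with a low-rate, noisier "vision" measurement for a 1D constant-velocity target; the noise levels, rates, and model are illustrative assumptions, not the thesis's setup.

```python
import numpy as np

rng = np.random.default_rng(6)

# 1D constant-velocity target tracked by fusing two measurement modalities
# with different rates and noise levels.
dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition for (position, velocity)
Q = 1e-4 * np.eye(2)                       # process noise covariance
H = np.array([[1.0, 0.0]])                 # both sensors observe position only
R_contact, R_vision = 1e-4, 4e-2           # measurement noise variances

def kalman_update(x, P, z, R):
    """Standard Kalman measurement update for a scalar observation z."""
    S = (H @ P @ H.T).item() + R
    K = (P @ H.T) / S
    x = x + (K * (z - (H @ x).item())).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0]), np.eye(2)
true_pos, true_vel = 0.0, 0.1
for k in range(200):
    true_pos += true_vel * dt
    x, P = F @ x, F @ P @ F.T + Q                                   # predict
    z_contact = true_pos + np.sqrt(R_contact) * rng.standard_normal()
    x, P = kalman_update(x, P, z_contact, R_contact)                # every step
    if k % 20 == 0:                                                 # vision at 1/20 the rate
        z_vision = true_pos + np.sqrt(R_vision) * rng.standard_normal()
        x, P = kalman_update(x, P, z_vision, R_vision)

print("estimate (pos, vel):", np.round(x, 3), " true:", (round(true_pos, 2), true_vel))
```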
Abstract:
In the current economic situation, companies try to reduce their expenses. One of the solutions is to improve the energy efficiency of their processes. It is known that pumping applications account for 20% to 50% of the energy usage in certain industrial plant operations. Some studies have shown that 30% to 50% of the energy consumed by pump systems could be saved by changing the pump or the flow control method. The aim of this thesis is to create a mobile measurement system that can calculate the working point of a pump drive. This information can be used to determine the efficiency of the pump drive operation and to develop a solution that brings the pump's efficiency to the maximum possible value, which can allow a great reduction in the pump drive's life cycle cost. In the first part of the thesis, a brief introduction to pump drive operation is given, the methods that can be used in the project are presented, and a review of available platforms for the project implementation follows. In the second part of the thesis, the components of the project are presented and a detailed description of each created component is given. Finally, the results of laboratory tests are presented, compared and analyzed. In addition, the operation of the created system is analyzed and suggestions for future development are given.
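One common way to reason about a pump's working point, sketched below with invented curve coefficients (not the thesis's measurement-based method), is to locate the intersection of the pump QH curve with the system curve and evaluate efficiency and power there.

```python
import numpy as np

# Illustrative pump and system curves (coefficients invented, not measured):
#   pump head:    H_pump(Q)   = a0 - a2 * Q**2
#   system head:  H_system(Q) = h_static + k * Q**2
# The working point is where the two curves intersect.
a0, a2 = 40.0, 0.004          # pump curve (m, m/(m3/h)^2)
h_static, k = 10.0, 0.008     # system curve
eta_coeffs = (-2.4e-4, 0.024, 0.05)   # quadratic efficiency curve eta(Q)

Q = np.linspace(0.0, 100.0, 2001)           # flow rate, m3/h
H_pump = a0 - a2 * Q**2
H_system = h_static + k * Q**2

i = np.argmin(np.abs(H_pump - H_system))    # intersection on the sampled grid
Q_op, H_op = Q[i], H_pump[i]
eta_op = np.polyval(eta_coeffs, Q_op)

rho, g = 998.0, 9.81
P_hydraulic = rho * g * (Q_op / 3600.0) * H_op / 1000.0     # kW
P_shaft = P_hydraulic / eta_op

print(f"working point: Q = {Q_op:.1f} m3/h, H = {H_op:.1f} m, eta = {eta_op:.2f}")
print(f"hydraulic power = {P_hydraulic:.1f} kW, shaft power = {P_shaft:.1f} kW")
```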
Abstract:
The objective of this case study is to provide a Finnish solution provider company with an objective, in-depth analysis of its project-based business and especially of its project estimation accuracy. A project and customer profitability analysis is conducted as a complementary addition to describe the profitability of the Case Company's core division. The theoretical framework is constructed on project profitability and customer profitability analysis. Project profitability is approached starting from managing projects, continuing to the project pricing process, and concluding with project success. The empirical part of this study describes the Case Company's project portfolio and, by means of quantitative analysis, shows how the characteristics of a project affect the project's profitability. The findings indicate that scrutinizing installation methods and technical specifications makes a real difference to the project portfolio's estimated and actual profitability. Implications for profitability are gathered into a proposal for a risk assessment tool.