943 results for Software analysis
Abstract:
Population size estimation with discrete or nonparametric mixture models is considered, and reliable ways of constructing the nonparametric mixture model estimator are reviewed and set into perspective. The maximum likelihood estimator of the mixing distribution is constructed for any number of components, up to the global nonparametric maximum likelihood bound, using the EM algorithm. In addition, the estimators of Chao and Zelterman are considered, along with some generalisations of Zelterman's estimator. All computations are done with CAMCR, specialized software developed for population size estimation with mixture models. Several examples and data sets are discussed and the estimators illustrated. Problems with the mixture model-based estimators are highlighted.
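The Chao and Zelterman estimators mentioned above have well-known closed forms based on the singleton and doubleton frequency counts. A minimal Python sketch of the two basic formulas (illustrative only; not the CAMCR implementation, and the frequency counts below are hypothetical):

```python
from math import exp

def chao_estimate(f1: int, f2: int, n_observed: int) -> float:
    """Chao's lower-bound estimator: N = n + f1^2 / (2*f2),
    where f1/f2 count the units observed exactly once/twice."""
    return n_observed + f1 ** 2 / (2 * f2)

def zelterman_estimate(f1: int, f2: int, n_observed: int) -> float:
    """Zelterman's estimator: estimate the Poisson rate as
    lambda = 2*f2/f1, then N = n / (1 - exp(-lambda))."""
    lam = 2 * f2 / f1
    return n_observed / (1 - exp(-lam))

# Hypothetical frequency counts: 50 units seen once, 20 seen twice,
# 85 distinct units observed in total.
print(chao_estimate(50, 20, 85))       # ~147.5
print(zelterman_estimate(50, 20, 85))  # ~154.3
```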
Abstract:
Nested clade phylogeographic analysis (NCPA) is a popular method for reconstructing the demographic history of spatially distributed populations from genetic data. Although some parts of the analysis are automated, there is no unique and widely followed algorithm for doing this in its entirety, beginning with the data, and ending with the inferences drawn from the data. This article describes a method that automates NCPA, thereby providing a framework for replicating analyses in an objective way. To do so, a number of decisions need to be made so that the automated implementation is representative of previous analyses. We review how the NCPA procedure has evolved since its inception and conclude that there is scope for some variability in the manual application of NCPA. We apply the automated software to three published datasets previously analyzed manually and replicate many details of the manual analyses, suggesting that the current algorithm is representative of how a typical user will perform NCPA. We simulate a large number of replicate datasets for geographically distributed, but entirely random-mating, populations. These are then analyzed using the automated NCPA algorithm. Results indicate that NCPA tends to give a high frequency of false positives. In our simulations we observe that 14% of the clades give a conclusive inference that a demographic event has occurred, and that 75% of the datasets have at least one clade that gives such an inference. This is mainly due to the generation of multiple statistics per clade, of which only one is required to be significant to apply the inference key. We survey the inferences that have been made in recent publications and show that the most commonly inferred processes (restricted gene flow with isolation by distance and contiguous range expansion) are those that are commonly inferred in our simulations. However, published datasets typically yield a richer set of inferences with NCPA than obtained in our random-mating simulations, and further testing of NCPA with models of structured populations is necessary to examine its accuracy.
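The false-positive mechanism identified above, several statistics computed per clade with only one required to be significant, is a classic multiple-testing effect. A minimal sketch of the inflation under the idealised assumption of k independent tests at level alpha (the NCPA statistics are in fact correlated, so this is only indicative):

```python
# Family-wise error when any one of k independent tests at level
# alpha suffices to trigger an inference:
#   P(at least one significant) = 1 - (1 - alpha)^k
def familywise_error(alpha: float, k: int) -> float:
    return 1 - (1 - alpha) ** k

for k in (1, 2, 4, 8):
    print(k, round(familywise_error(0.05, k), 3))
# k=4 already gives ~0.185, i.e. the same order as the 14% of
# clades yielding a conclusive inference reported above.
```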
Abstract:
ANeCA is a fully automated implementation of Nested Clade Phylogeographic Analysis (NCPA). NCPA was originally developed by Templeton and colleagues and has been used to infer, from the pattern of gene-sequence polymorphisms in a geographically structured population, the historical demographic processes that have shaped its evolution. Until now it has been necessary to perform large parts of the procedure manually. We provide a program that takes data in Nexus sequential format and directly outputs a set of inferences. The software also includes TCS v1.18 and GeoDis v2.2 as part of the automation.
Abstract:
Thermal non-destructive testing (NDT) is commonly used for assessing aircraft structures. This work evaluates the potential of pulsed (transient) thermography for locating fixtures beneath aircraft skins in order to facilitate accurate automated assembly operations. Representative aluminium and carbon-fibre aircraft skin-fixture assemblies were modelled using thermal modelling software. The assemblies were also investigated experimentally with an integrated pulsed thermographic evaluation system, as well as with a custom-built system incorporating a miniature uncooled camera. Modelling showed that the presence of an air gap between skin and fixture significantly reduced the thermal contrast developed, especially in aluminium. Experimental results show that fixtures can be located to an accuracy of 0.5 mm.
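For context, the surface response of a semi-infinite solid to an instantaneous heat pulse follows the classical one-dimensional solution dT(t) = Q / (e * sqrt(pi * t)), where the effusivity e = sqrt(k * rho * c) governs how quickly the signal decays. A minimal sketch with approximate handbook property values (an idealisation for intuition, not the authors' thermal model):

```python
import math

def effusivity(k: float, rho: float, c: float) -> float:
    """Thermal effusivity e = sqrt(k * rho * c)."""
    return math.sqrt(k * rho * c)

def surface_temperature_rise(q: float, e: float, t: float) -> float:
    """1D response of a semi-infinite solid to a Dirac heat pulse:
    dT(t) = Q / (e * sqrt(pi * t)), with Q in J/m^2."""
    return q / (e * math.sqrt(math.pi * t))

# Approximate handbook values (assumptions, not measured data):
e_alu = effusivity(k=160.0, rho=2700.0, c=900.0)    # aluminium alloy
e_cfrp = effusivity(k=0.8, rho=1600.0, c=1200.0)    # CFRP, through-thickness

q = 5000.0  # illustrative pulse energy density, J/m^2
for t in (0.01, 0.1, 1.0):
    print(t, surface_temperature_rise(q, e_alu, t),
          surface_temperature_rise(q, e_cfrp, t))
# Aluminium's much higher effusivity yields a far smaller surface
# temperature rise, consistent with the lower thermal contrast
# reported above for aluminium assemblies.
```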
Abstract:
A desktop tool for replay and analysis of gaze-enhanced multiparty virtual collaborative sessions is described. We linked three CAVE™-like environments, creating a multiparty collaborative virtual space where avatars are animated with 3D gaze as well as head and hand motions in real time. Log files are recorded for subsequent playback and analysis using the proposed software tool. During replay the user can rotate the viewpoint and navigate in the simulated 3D scene. The playback mechanism relies on multiple distributed log files captured at every site. This structure enables an observer to experience the latencies of movement and information transfer at every site, which is important for conversation analysis. Playback uses an event-replay algorithm, modified to allow fast traversal of the scene by selective rendering of nodes and to simulate fast random access. The tool's analysis module can show each participant's 3D gaze points and areas where gaze has been concentrated.
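Event replay with fast random access is commonly implemented by restoring the nearest earlier snapshot and re-applying only the events up to the target time, skipping hidden nodes for selective rendering. A minimal sketch of that general pattern, using hypothetical Event and session types rather than the tool's actual data structures:

```python
import bisect
from dataclasses import dataclass, field

@dataclass
class Event:
    t: float        # timestamp (seconds)
    node: str       # scene node the event applies to
    payload: dict   # e.g. {"gaze": ..., "head": ..., "hand": ...}

@dataclass
class ReplaySession:
    events: list                                    # Events sorted by t
    _snapshots: dict = field(default_factory=dict)  # time -> scene state

    def seek(self, t: float, visible_nodes: set) -> dict:
        """Random access: start from the nearest snapshot at or before
        t, then re-apply only events for visible nodes (selective
        rendering)."""
        t0 = max((s for s in self._snapshots if s <= t), default=0.0)
        state = dict(self._snapshots.get(t0, {}))
        i = bisect.bisect_left([e.t for e in self.events], t0)
        for e in self.events[i:]:
            if e.t > t:
                break
            if e.node in visible_nodes:   # skip nodes not rendered
                state[e.node] = e.payload
        return state

session = ReplaySession(events=[Event(0.5, "avatarA", {"gaze": (1, 0, 0)}),
                                Event(1.2, "avatarB", {"gaze": (0, 1, 0)})])
print(session.seek(1.0, visible_nodes={"avatarA"}))  # only avatarA applied
```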
Abstract:
Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work on developing tools and utilities that can help application developers understand problems with the supporting software or the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, not only the application but also the underlying middleware and the operating system must be analysed; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder. We believe that one approach to profiling and analysing distributed systems and their applications is via the plethora of log files generated at runtime. In this paper we report on a system (Slogger) that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
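A common pattern for this kind of unification is to map each parsed log line to RDF triples in a shared vocabulary and then query the merged graph. A minimal sketch using the rdflib library, with a hypothetical vocabulary rather than Slogger's actual schema:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

LOG = Namespace("http://example.org/log#")  # hypothetical vocabulary
g = Graph()

def add_entry(g, entry_id, layer, timestamp, message):
    """Map one parsed log line to triples in the shared vocabulary."""
    e = URIRef(f"http://example.org/log/{entry_id}")
    g.add((e, RDF.type, LOG.Entry))
    g.add((e, LOG.layer, Literal(layer)))   # "app", "middleware", "os"
    g.add((e, LOG.timestamp, Literal(timestamp, datatype=XSD.dateTime)))
    g.add((e, LOG.message, Literal(message)))

add_entry(g, 1, "middleware", "2009-01-01T12:00:00", "queue full")
add_entry(g, 2, "app", "2009-01-01T12:00:01", "request timed out")

# Query across all layers in one place:
q = """SELECT ?layer ?msg WHERE {
         ?e a log:Entry ; log:layer ?layer ; log:message ?msg .
       } ORDER BY ?layer"""
for row in g.query(q, initNs={"log": LOG}):
    print(row.layer, row.msg)
```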
Abstract:
A simple and practical technique is described for assessing the risks acquired during the requirements engineering phase of software system development, that is, the potential for error and consequent loss. The technique uses a goal-based requirements analysis as a framework to identify and rate a set of key issues in order to arrive at estimates of the feasibility and adequacy of the requirements. We illustrate the technique and show how it has been applied to a real systems development project, demonstrating how problems in this project could have been identified earlier, thereby avoiding costly additional work and unhappy users.
Abstract:
Context: During development, managers, analysts and designers often need to know whether enough requirements analysis work has been done and whether or not it is safe to proceed to the design stage. Objective: This paper describes a new, simple and practical method for assessing our confidence in a set of requirements. Method: We identified four confidence factors and used a goal-oriented framework with a simple ordinal scale to develop a method for assessing confidence. We illustrate the method and show how it has been applied to a real systems development project. Results: We show how assessing confidence in the requirements could have revealed problems in this project earlier and so saved both time and money. Conclusion: Our meta-level assessment of requirements provides a practical and pragmatic method that can prove useful to managers, analysts and designers who need to know when sufficient requirements analysis has been performed.
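The abstract does not name the four confidence factors, so the sketch below uses hypothetical factor names and a simple ordinal scale to show the general shape of such an assessment; it is not the authors' actual method:

```python
# Hypothetical confidence factors and ordinal scale -- illustrative
# only; the abstract does not name the paper's actual factors.
SCALE = {"low": 1, "medium": 2, "high": 3}
FACTORS = ("completeness", "stakeholder_agreement",
           "volatility", "understanding")

def assess(goal_ratings: dict) -> str:
    """Aggregate per-factor ordinal ratings for one goal by taking
    the weakest rating (a chain is as strong as its weakest link)."""
    worst = min(SCALE[goal_ratings[f]] for f in FACTORS)
    return {1: "do more analysis", 2: "proceed with caution",
            3: "safe to proceed to design"}[worst]

print(assess({"completeness": "high", "stakeholder_agreement": "medium",
              "volatility": "high", "understanding": "high"}))
# -> proceed with caution
```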
Abstract:
Passive samplers have predominantly been used to monitor environmental conditions in single volumes. However, measurements using a calibrated passive sampler, a Solid Phase Microextraction (SPME) fibre, in three houses with cold pitched roofs successfully demonstrated the potential of the SPME fibre as a device for monitoring air movement between two volumes. The roofs monitored were pitched at 15°-30°, with insulation thickness varying between 200 and 300 mm on the ceiling. For effective analysis, two constant sources of volatile organic compounds were diffused steadily in each house. Emission rates and air movement from the house to the roof were predicted using developed algorithms. The airflow rates, which were calibrated against conventional tracer gas techniques, were introduced into a HAM software package to predict the effects of air movement on other varying parameters. On average, the in situ measurements showed that about 20-30% of the air entering the three houses left through gaps and cracks in the ceiling into the roof. Although these field measurements focus on the airflows, they carry energy benefits: if these flows are reduced, energy losses would also be reduced significantly (as modelled), consequently improving the energy efficiency of the house. Other results illustrated that condensation formation risks depended on the airtightness of the building envelopes, including the configurations of their roof constructions.
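The underlying constant-injection tracer principle is a steady-state mass balance: a constant source emitting at rate E into a well-mixed zone with steady concentration C implies an airflow Q = E / C out of the zone. A minimal sketch of that relation with hypothetical numbers (the paper's algorithms for two coupled volumes are more elaborate):

```python
def airflow_constant_injection(emission_rate_mg_h: float,
                               steady_conc_mg_m3: float,
                               background_mg_m3: float = 0.0) -> float:
    """Steady-state, well-mixed zone mass balance:
         E = Q * (C - C_background)  =>  Q = E / (C - C_background)
    Returns airflow in m^3/h."""
    return emission_rate_mg_h / (steady_conc_mg_m3 - background_mg_m3)

# Hypothetical numbers: a source emitting 60 mg/h of tracer and a
# measured steady concentration of 0.5 mg/m^3 imply:
print(airflow_constant_injection(60.0, 0.5))  # 120 m^3/h leaving the zone
```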
Abstract:
Metabolic stable isotope labeling is increasingly employed for accurate protein (and metabolite) quantitation using mass spectrometry (MS). It provides sample-specific isotopologues that can be used to facilitate comparative analysis of two or more samples. Stable Isotope Labeling by Amino acids in Cell culture (SILAC) has been used for almost a decade in proteomic research and analytical software solutions have been established that provide an easy and integrated workflow for elucidating sample abundance ratios for most MS data formats. While SILAC is a discrete labeling method using specific amino acids, global metabolic stable isotope labeling using isotopes such as (15)N labels the entire element content of the sample, i.e. for (15)N the entire peptide backbone in addition to all nitrogen-containing side chains. Although global metabolic labeling can deliver advantages with regard to isotope incorporation and costs, the requirements for data analysis are more demanding because, for instance for polypeptides, the mass difference introduced by the label depends on the amino acid composition. Consequently, there has been less progress on the automation of the data processing and mining steps for this type of protein quantitation. Here, we present a new integrated software solution for the quantitative analysis of protein expression in differential samples and show the benefits of high-resolution MS data in quantitative proteomic analyses.
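The complication noted above can be made concrete: a fully incorporated (15)N label shifts a peptide's mass by one neutron-mass difference per nitrogen atom, and the nitrogen count depends on the amino acid sequence. A minimal sketch assuming complete incorporation:

```python
# Nitrogen atoms per residue: 1 backbone N plus any side-chain N.
SIDE_CHAIN_N = {"N": 1, "Q": 1, "K": 1, "W": 1, "H": 2, "R": 3}
DELTA_15N = 0.997035  # mass difference 15N - 14N, in Da

def n15_mass_shift(peptide: str) -> float:
    """Mass shift of a fully 15N-labelled peptide relative to its
    14N form: one DELTA_15N per nitrogen atom in the sequence."""
    n_atoms = sum(1 + SIDE_CHAIN_N.get(aa, 0) for aa in peptide)
    return n_atoms * DELTA_15N

# Two peptides of equal length can shift very differently, unlike
# the fixed per-label shift in SILAC:
print(round(n15_mass_shift("AGILVSTA"), 3))  # 8 N  -> ~7.976 Da
print(round(n15_mass_shift("RHKNQRWH"), 3))  # 22 N -> ~21.935 Da
```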
Abstract:
Social Networking Sites have recently become a mainstream communications technology for many people around the world. Major IT vendors are releasing social software designed for use in a business/commercial context. These Enterprise 2.0 technologies have impressive collaboration and information sharing functionality, but so far they lack organizational network analysis (ONA) features that reveal patterns of connectivity within business units. This paper shows the impact of organizational network analysis techniques and social networks on organizational performance; we also give an overview of current enterprise social software and, most importantly, highlight how Enterprise 2.0 can help automate an organizational network analysis.
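Organizational network analysis amounts to computing connectivity metrics over a who-talks-to-whom graph, which an Enterprise 2.0 platform could derive automatically from its interaction data. A minimal sketch using the networkx library with hypothetical interaction counts:

```python
import networkx as nx

# Hypothetical interaction counts mined from an enterprise social
# platform (messages, comments, shared documents, ...).
interactions = [("ana", "ben", 12), ("ana", "carl", 3),
                ("ben", "carl", 7), ("carl", "dee", 9),
                ("dee", "ed", 4)]

g = nx.Graph()
g.add_weighted_edges_from(interactions)

# Betweenness centrality highlights who brokers between otherwise
# weakly connected parts of the organization.
for person, score in sorted(nx.betweenness_centrality(g).items(),
                            key=lambda kv: -kv[1]):
    print(f"{person}: broker score {score:.2f}")
```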
Abstract:
A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. To avoid the inaccuracy of high-level, top-down modeling efforts, the model focuses on device details and data routes. In order to compare ESD to a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides ESD and physical distribution options. The ESD method included the calculation of power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO2e of server and networking devices was apportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model, which revealed potential CO2e savings of 83% when ESD was used over physical distribution. Results also highlighted the importance of server efficiency and utilization methods.
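At its core, such a bottom-up model sums the energy used by each device along the delivery path and converts it to CO2e via grid carbon intensity. A minimal sketch with illustrative, hypothetical parameter values, not the paper's calibrated figures:

```python
# Illustrative, hypothetical parameters -- not the paper's
# calibrated values.
GRID_KG_CO2E_PER_KWH = 0.5   # grid carbon intensity
SERVER_KWH_PER_GB = 0.010    # server share, scaled by utilization
ROUTER_KWH_PER_GB = 0.002    # per networking device traversed
CLIENT_KW = 0.060            # client device draw while downloading

def download_co2e_kg(size_gb: float, hops: int,
                     download_hours: float) -> float:
    """Bottom-up energy sum along the delivery path, converted to
    kg CO2e via grid carbon intensity."""
    kwh = (SERVER_KWH_PER_GB * size_gb
           + ROUTER_KWH_PER_GB * hops * size_gb
           + CLIENT_KW * download_hours)
    return kwh * GRID_KG_CO2E_PER_KWH

print(download_co2e_kg(size_gb=4.0, hops=12, download_hours=0.5))
# ~0.08 kg CO2e for this parameter set
```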
Abstract:
Air distribution systems are among the major electrical energy consumers in air-conditioned commercial buildings; they maintain a comfortable indoor thermal environment and air quality by supplying specified amounts of treated air to different zones. The sizes of the air distribution lines affect the energy efficiency of the distribution systems. Equal friction and static regain are two well-known approaches for sizing air distribution lines. To address the life-cycle cost of air distribution systems, the T and IPS methods have been developed. Hitherto, all these methods have been based on static design conditions, so the dynamic performance of the system has not yet been addressed, even though air distribution systems mostly operate under dynamic rather than static conditions. Besides, none of the existing methods considers any aspect of thermal comfort or environmental impact. This study investigates the existing methods for sizing air distribution systems and proposes a dynamic approach for size optimisation of the air distribution lines that takes optimisation criteria such as economic aspects, environmental impacts and technical performance into account. These criteria are addressed through whole-life costing analysis, life cycle assessment and deviation from the set-point temperatures of different zones, respectively. Integrating these criteria into the TRNSYS software produces a novel dynamic optimisation approach for duct sizing. Because the different criteria are integrated into a well-known performance evaluation package, the approach could easily be adopted by designers within the busy nature of design practice. Comparison of this integrated approach with the existing methods reveals that, under the defined criteria, system performance improves by up to 15% compared with the existing methods. This approach is a significant step towards net zero emission buildings in the future.
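As a baseline among the methods discussed, equal-friction sizing picks each duct diameter so that the pressure gradient (Pa per metre) matches a chosen target. A minimal sketch using the Darcy-Weisbach relation with a fixed friction factor (an idealisation; practical sizing iterates the friction factor with Reynolds number):

```python
import math

RHO = 1.2   # air density, kg/m^3
F = 0.02    # fixed Darcy friction factor (idealisation)

def equal_friction_diameter(q_m3_s: float, target_pa_per_m: float) -> float:
    """Equal-friction sizing: choose D so that the Darcy-Weisbach
    pressure gradient  dp/L = 8*f*rho*Q^2 / (pi^2 * D^5)
    equals the target value. Returns D in metres."""
    return (8 * F * RHO * q_m3_s ** 2
            / (math.pi ** 2 * target_pa_per_m)) ** 0.2

# Size a main duct and two branches to the same 1 Pa/m gradient:
for q in (1.0, 0.6, 0.4):   # m^3/s
    d = equal_friction_diameter(q, target_pa_per_m=1.0)
    print(f"Q = {q} m^3/s -> D = {d * 1000:.0f} mm")
```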