995 results for Basic Acceptance Testing
Abstract:
City Audit Report
Abstract:
Other Audit Reports - Municipal Utilities
Abstract:
This project studies and tests the benefits of NX Remote Desktop technology for administrative use in the Finnish Meteorological Institute's existing Linux Terminal Server Project (LTSP) environment. The work was motivated by the criticality of the system, caused by the growing number of users as the LTSP system expands. Although many supporting tasks can be done over a Secure Shell connection, testing graphical programs or desktop behaviour that way is impossible. First, the basic technologies behind NX Remote Desktop were studied; then two candidate programs, FreeNX and NoMachine NX Server, were tested. Functionality and bandwidth demands were first tested in a closed local area network, and the results were analysed. The better candidate was then installed on a virtual server simulating the actual LTSP server at the Finnish Meteorological Institute, and a connection from the Internet was tested to check for problems with firewalls and security policies. The results are reported in this study. Studying and testing the two candidates showed that NoMachine NX Server provides better customer support and documentation. The Institute's security requirements also had to be considered, and since updates and new development tools are announced for the next version of the program, that version was chosen. The studies also show that although NoMachine promises a usable connection over an average bandwidth of 20 kbit/s, at least double that is needed. The project gives an overview of available remote desktop products and their benefits; NX Remote Desktop technology is examined, and installation instructions are included. Testing was done in both the closed and the actual environment, and the problems encountered and suggested solutions are analysed. The installation on the actual LTSP server has not yet been made, but a virtual server was set up in the same position in the network topology. This ensures that, if the administrators are satisfied with the system, installation and setup will proceed as described in this report.
Abstract:
Epidemiological processes leave a fingerprint in the genetic structure of virus populations. Here, we provide a new method to infer epidemiological parameters directly from viral sequence data. The method is based on phylogenetic analysis using a birth-death model (BDM), rather than the commonly used coalescent, as the model for the epidemiological transmission of the pathogen. The BDM has the advantage that transmission and death rates are estimated independently, which for the first time enables estimation of the pathogen's basic reproductive number from sequence data alone, without further assumptions such as the average duration of infection. We apply the method to genetic data from the HIV-1 epidemic in Switzerland.
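For context, a minimal worked relation, assuming the standard epidemiological birth-death parameterisation (the abstract does not spell the rates out):

```latex
% Basic reproductive number under a birth-death transmission model,
% assuming the standard parameterisation:
%   \lambda : per-lineage transmission (birth) rate
%   \mu     : per-lineage becoming-noninfectious (death) rate
R_0 = \frac{\lambda}{\mu}
% The coalescent, by contrast, identifies only a compound parameter
% (effective population size scaled by generation time), so \lambda
% and \mu cannot be separated without outside information such as
% the average duration of infection.
```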
Abstract:
Two different approaches currently prevail for predicting spatial patterns of species assemblages. The first approach (macroecological modelling, MEM) focuses directly on realised properties of species assemblages, whereas the second approach (stacked species distribution modelling, S-SDM) starts with constituent species to approximate assemblage properties. Here, we propose to unify the two approaches in a single 'spatially-explicit species assemblage modelling' (SESAM) framework. This framework uses relevant species source pool designations, macroecological factors, and ecological assembly rules to constrain predictions of the richness and composition of species assemblages obtained by stacking predictions of individual species distributions. We believe that such a framework could prove useful in many theoretical and applied disciplines of ecology and evolution, both for improving our basic understanding of species assembly across spatio-temporal scales and for anticipating expected consequences of local, regional or global environmental changes. In this paper, we propose such a framework and call for further developments and testing across a broad range of community types in a variety of environments.
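To make the stacking-plus-constraint idea concrete, here is a minimal, hypothetical sketch (all names are ours, not the paper's) of one common way to apply such a constraint: take each species' modelled occurrence probability, cap each site's richness at a macroecologically predicted value, and keep the most probable species.

```python
import numpy as np

def sesam_stack(sdm_probs: np.ndarray, mem_richness: np.ndarray) -> np.ndarray:
    """Constrain stacked SDM predictions by a macroecological richness estimate.

    sdm_probs    : (n_sites, n_species) occurrence probabilities from
                   individual species distribution models.
    mem_richness : (n_sites,) site-level richness predicted by a
                   macroecological model (MEM).
    Returns a binary (n_sites, n_species) community matrix in which each
    site holds its MEM-predicted number of species, filled with the
    species whose SDM probabilities are highest (a probability-ranking
    assembly rule, one possible rule among those the framework allows).
    """
    n_sites, n_species = sdm_probs.shape
    community = np.zeros((n_sites, n_species), dtype=int)
    for i in range(n_sites):
        k = int(round(mem_richness[i]))
        k = max(0, min(k, n_species))             # richness cannot exceed the pool
        top = np.argsort(sdm_probs[i])[::-1][:k]  # k most probable species
        community[i, top] = 1
    return community

# Toy example: 2 sites, 4 species.
probs = np.array([[0.9, 0.2, 0.7, 0.1],
                  [0.3, 0.8, 0.4, 0.6]])
richness = np.array([2, 3])
print(sesam_stack(probs, richness))
```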
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it is to be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance-criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of the initial solution that generates different alternative starting solutions of similar quality, attained by applying biased randomization to a classical PFSP heuristic. This diversification of the initial solution avoids poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when parallel computing is used, the top ILS-based metaheuristic can be improved simply by incorporating our biased-randomization process with a high-quality pseudo-random number generator.
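As an illustration of the ILS skeleton the abstract describes, here is a minimal sketch; the concrete operators below (random-swap perturbation, insertion local search, accept-if-no-worse) are simplified stand-ins of our own, not the paper's ILS-ESP operators.

```python
import random

def makespan(seq, proc):
    """Completion time of the last job on the last machine for permutation
    `seq`; proc[j][m] is the processing time of job j on machine m."""
    n_machines = len(proc[0])
    finish = [0.0] * n_machines
    for j in seq:
        finish[0] += proc[j][0]
        for m in range(1, n_machines):
            finish[m] = max(finish[m], finish[m - 1]) + proc[j][m]
    return finish[-1]

def local_search(seq, proc):
    """First-improvement search over a job-reinsertion neighbourhood."""
    improved = True
    while improved:
        improved = False
        best = makespan(seq, proc)
        for i in range(len(seq)):
            for k in range(len(seq)):
                if i == k:
                    continue
                cand = seq[:i] + seq[i + 1:]   # remove job at position i...
                cand.insert(k, seq[i])         # ...and reinsert at position k
                if makespan(cand, proc) < best:
                    seq, best, improved = cand, makespan(cand, proc), True
    return seq

def ils(proc, iters=200, seed=0):
    """Iterated Local Search: perturb, re-optimise, decide acceptance."""
    rng = random.Random(seed)
    n = len(proc)
    current = local_search(list(range(n)), proc)
    best = current[:]
    for _ in range(iters):
        perturbed = current[:]
        i, k = rng.sample(range(n), 2)         # simple random-swap perturbation
        perturbed[i], perturbed[k] = perturbed[k], perturbed[i]
        candidate = local_search(perturbed, proc)
        if makespan(candidate, proc) <= makespan(current, proc):
            current = candidate                # accept-if-no-worse criterion
        if makespan(current, proc) < makespan(best, proc):
            best = current[:]
    return best, makespan(best, proc)

# Toy instance: 4 jobs x 3 machines.
proc = [[3, 2, 4], [2, 5, 1], [4, 1, 3], [1, 3, 2]]
print(ils(proc))
```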
Abstract:
This paper discusses the role of deterministic components in the DGP and in the auxiliary regression model that underlies the implementation of the Fractional Dickey-Fuller (FDF) test for I(1) against I(d) processes with d ∈ [0, 1). This is an important test in many economic applications because I(d) processes with d < 1 are mean-reverting although, when 0.5 ≤ d < 1, they are, like I(1) processes, nonstationary. We show how simple the implementation of the FDF test is in these situations, and argue that it has better properties than LM tests. A simple testing strategy entailing only asymptotically normally distributed tests is also proposed. Finally, an empirical application is provided in which the FDF test allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories and on which there is some controversy.
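As background, a hedged reconstruction of the FDF auxiliary regression from the standard Dolado-Gonzalo-Mayoral formulation (our sketch, not quoted from this paper), with a constant as the deterministic component:

```latex
% FDF auxiliary regression (reconstruction, with a constant included):
\Delta y_t = \mu + \phi \, \Delta^{d} y_{t-1} + \varepsilon_t,
\qquad d \in [0, 1),
% where \Delta^{d} is the fractional difference operator
\Delta^{d} = (1 - L)^{d} = \sum_{k=0}^{\infty} \binom{d}{k} (-L)^{k},
% and one tests H_0: \phi = 0 (the process is I(1)) against
% H_1: \phi < 0 (the process is mean-reverting I(d) with d < 1).
```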
Abstract:
Community College Audit Reports
Abstract:
Other Audit Reports - 28E Organizations
Abstract:
County Audit Report