964 results for Simplified and advanced calculation methods
Abstract:
The research presented in this paper is part of an ongoing investigation into how best to support meaningful lab-based usability evaluations of mobile technologies. In particular, we report on a comparative study of (a) a standard paper prototype of a mobile application used to perform an early-phase seated (static) usability evaluation, and (b) a pseudo-paper prototype created from the paper prototype and used to perform an early-phase, contextually-relevant, mobile usability evaluation. We draw some initial conclusions regarding whether it is worth the added effort of conducting a usability evaluation of a pseudo-paper prototype in a contextually-relevant setting during early-phase user interface development.
Abstract:
Remote sensing data are routinely used in ecology to investigate the relationship between landscape pattern, as characterised by land use and land cover maps, and ecological processes. Multiple factors related to the representation of geographic phenomena have been shown to affect the characterisation of landscape pattern, resulting in spatial uncertainty. This study statistically investigated the effect of the interaction between landscape spatial pattern and geospatial processing methods, unlike most papers, which consider the effect of each factor in isolation only. This is important since data used to calculate landscape metrics typically undergo a series of data abstraction processing tasks that are rarely performed in isolation. The geospatial processing methods tested were the aggregation method and the choice of pixel size used to aggregate data. These were compared to two components of landscape pattern: spatial heterogeneity and the proportion of land cover class area. The interactions and their effect on the final land cover map were described using landscape metrics to measure landscape pattern and classification accuracy (response variables). All landscape metrics and classification accuracy were shown to be affected both by landscape pattern and by processing methods. Large variability in the response variables and interactions between the explanatory variables were observed. However, even though interactions occurred, they affected only the magnitude of the difference in landscape metric values. Thus, provided that the same processing methods are used, landscapes should retain their ranking when their landscape metrics are compared. For example, highly fragmented landscapes will always have larger values for the landscape metric "number of patches" than less fragmented landscapes. But the magnitude of the difference between landscapes may change, and therefore absolute values of landscape metrics may need to be interpreted with caution. The explanatory variables with the largest effects were spatial heterogeneity and pixel size; these tended to produce large main effects and large interactions. The high variability in the response variables and the interaction of the explanatory variables indicate that it would be difficult to make generalisations about the impact of processing on landscape pattern: only two processing methods were tested, and it is likely that untested processing methods will result in even greater spatial uncertainty. © 2013 Elsevier B.V.
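The aggregation effect described above can be made concrete with a small sketch. The following Python snippet (not from the paper; the random land cover map, block sizes, and majority rule are invented for demonstration) counts the "number of patches" metric before and after coarsening the pixel size:

```python
# A minimal illustration (not from the paper) of how pixel-size aggregation
# can change a landscape metric: count the "number of patches" (NP) metric
# on a binary land cover map before and after majority-rule aggregation.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(42)
landcover = (rng.random((120, 120)) < 0.3).astype(int)  # 1 = class of interest

def number_of_patches(grid):
    """Count 4-connected patches of class 1 (the NP landscape metric)."""
    _, n = ndimage.label(grid)
    return n

def aggregate_majority(grid, block):
    """Coarsen the map: each block x block window takes its majority class."""
    h, w = grid.shape
    trimmed = grid[: h - h % block, : w - w % block]
    blocks = trimmed.reshape(h // block, block, -1, block).swapaxes(1, 2)
    return (blocks.mean(axis=(2, 3)) >= 0.5).astype(int)

print("NP at native resolution:", number_of_patches(landcover))
for block in (2, 4, 8):
    coarse = aggregate_majority(landcover, block)
    print(f"NP after {block}x{block} majority aggregation:", number_of_patches(coarse))
```

The absolute NP values shift with pixel size, while the ranking of more versus less fragmented maps is expected to be preserved, which is the study's point about interpreting magnitudes with caution.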
Abstract:
Biological experiments often produce enormous amounts of data, which are usually analyzed by data clustering. Cluster analysis refers to statistical methods that are used to assign data with similar properties into several smaller, more meaningful groups. Two commonly used clustering techniques are introduced in the following section: principal component analysis (PCA) and hierarchical clustering. PCA calculates the variance between variables and groups them into a few uncorrelated groups, or principal components (PCs), that are orthogonal to each other. Hierarchical clustering is carried out by separating data into many clusters and merging similar clusters together. Here, we use an example of human leukocyte antigen (HLA) supertype classification to demonstrate the use of the two methods. Two programs, Generating Optimal Linear Partial Least Square Estimations (GOLPE) and Sybyl, are used for PCA and hierarchical clustering, respectively. However, the reader should bear in mind that these methods have been incorporated into other software as well, such as SIMCA, statistiXL, and R.
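As a rough illustration of the two techniques (not the chapter's GOLPE/Sybyl workflow), the following Python sketch runs PCA and agglomerative hierarchical clustering on random stand-in data; the array shapes are invented:

```python
# A brief sketch of the two clustering approaches the abstract describes,
# using scikit-learn and SciPy. Random data stands in for, e.g., HLA
# binding-specificity descriptors.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 10))  # 30 samples, 10 variables

# PCA: project onto a few orthogonal, uncorrelated principal components.
pca = PCA(n_components=3)
scores = pca.fit_transform(X)
print("Variance explained per PC:", pca.explained_variance_ratio_)

# Agglomerative hierarchical clustering: start from singletons and
# repeatedly merge the most similar clusters (average linkage here).
Z = linkage(scores, method="average")
labels = fcluster(Z, t=4, criterion="maxclust")  # cut the tree into 4 clusters
print("Cluster assignments:", labels)
```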
Abstract:
In this paper, a methodology for evaluating the information security of objects under attack, processed by compression methods, is presented. Two basic parameters for the evaluation of the information security of objects, TIME and SIZE, are chosen, and the characteristics that affect their evaluation are analyzed and estimated. A coefficient of information security of an object is proposed as the mean of the coefficients of the parameters TIME and SIZE. From the simulation experiments that were carried out, the methods with the highest coefficient of information security were determined. Assessments and conclusions for future investigations are proposed.
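The combining rule stated in the abstract (overall coefficient = mean of the TIME and SIZE coefficients) can be sketched as follows; the normalisations below are assumptions, since the paper's exact per-parameter definitions are not given here:

```python
# An illustrative sketch of the stated combining rule: an overall
# information-security coefficient taken as the mean of per-parameter
# coefficients for TIME and SIZE. The normalisations are assumptions.
from dataclasses import dataclass

@dataclass
class CompressionResult:
    time_s: float       # processing time for the protected object (seconds)
    size_ratio: float   # compressed size / original size, in (0, 1]

def security_coefficient(r: CompressionResult, worst_time_s: float) -> float:
    k_time = 1.0 - min(r.time_s / worst_time_s, 1.0)   # faster -> higher score
    k_size = 1.0 - r.size_ratio                        # smaller -> higher score
    return (k_time + k_size) / 2.0                     # mean of the two, per the abstract

candidates = {"method_a": CompressionResult(0.8, 0.35),
              "method_b": CompressionResult(2.1, 0.22)}
best = max(candidates, key=lambda m: security_coefficient(candidates[m], worst_time_s=3.0))
print("Highest coefficient:", best)
```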
Abstract:
The traffic carried by core optical networks grows at a steady but remarkable pace of 30-40% year-over-year. Advances in optical transmission and networking continue to satisfy these traffic requirements by delivering content over the network infrastructure in a cost- and energy-efficient manner. Such core optical networks serve information traffic demands dynamically, in response to traffic demands that shift both temporally (day/night) and spatially (business district/residential). However, as we approach the fundamental spectral-efficiency limits of single-mode fibers, the scientific community has recently been pursuing the development of an innovative, all-optical network architecture that introduces the spatial degree of freedom into the design and operation of future transport networks. Space-division multiplexing, through the use of bundled single-mode fibers, multi-core fibers, and/or few-mode fibers, can offer up to a 100-fold capacity increase in future optical networks. The EU INSPACE project is working on the development of a complete spatial-spectral flexible optical networking solution, offering the ultra-high network capacity, flexibility, and energy efficiency required to meet the challenge of delivering the exponentially growing traffic demands of the internet over the next twenty years. In this paper we present the motivation and main research activities of the INSPACE consortium towards the realization of the overall project solution. © 2014 SPIE.
Abstract:
Adaptive critic methods share common roots as generalizations of dynamic programming for neural reinforcement learning. Since they approximate dynamic programming solutions, they are potentially suitable for learning in noisy, nonlinear, and nonstationary environments. In this study, a novel probabilistic dual heuristic programming (DHP) based adaptive critic controller is proposed. In contrast to current approaches, the proposed probabilistic DHP adaptive critic method takes the uncertainties of the forward model and the inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterized by functional uncertainty. The theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function that satisfies the Bellman equation in a linear quadratic control problem. The target value of the critic network is then calculated and shown to be equal to the analytically derived correct value.
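The validation idea can be sketched numerically: in a linear-quadratic problem the cost-to-go satisfying the Bellman equation is V(x) = x'Px with P solving the discrete algebraic Riccati equation, so a DHP critic's target, the gradient λ(x) = ∂V/∂x = 2Px, has a closed form to compare against. The system matrices below are invented and the paper's probabilistic treatment of model uncertainty is omitted:

```python
# Hedged sketch: verify the Bellman equation for an LQ problem and form the
# analytic DHP critic target lambda(x) = 2 P x. Matrices are invented.
import numpy as np
from scipy.linalg import solve_discrete_are

A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
Q = np.eye(2)
R = np.array([[0.5]])

P = solve_discrete_are(A, B, Q, R)                 # analytic value matrix
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # optimal feedback gain

x = np.array([[1.0], [-0.5]])
u = -K @ x
x_next = A @ x + B @ u

# Bellman consistency at the optimum: V(x) = x'Qx + u'Ru + V(x_next)
lhs = (x.T @ P @ x).item()
rhs = (x.T @ Q @ x + u.T @ R @ u + x_next.T @ P @ x_next).item()
print("Bellman residual:", lhs - rhs)              # ~0 up to round-off

# DHP critic target: the value-function gradient (costate)
print("Critic target lambda(x):", (2 * P @ x).ravel())
```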
Abstract:
In everyday life we encounter countless situations in which the demand for a good exceeds the available supply. Examples include compensation claims, the claims of a bankrupt firm's creditors, the queue of patients waiting for an organ transplant, and so on. In such situations, the scarce quantity is divided among the claimants according to some procedure. It is customary to distinguish between deterministic and stochastic rationing methods, although in many cases only deterministic methods are applied. From a fairness perspective, however, stochastic rationing methods are also frequently used, as the United States Army did when withdrawing its soldiers stationed abroad after the end of the Second World War, and when selecting draftees during the Vietnam War. / === / We investigated the minimal variance methods introduced in Tasnádi [6] on the basis of seven popular axioms. We proved that if a deterministic rationing method satisfies demand monotonicity, resource monotonicity, equal treatment of equals, and self-duality, then the minimal variance methods associated with the given deterministic rationing method also satisfy demand monotonicity, resource monotonicity, equal treatment of equals, and self-duality. Furthermore, we found that the consistency, lower composition, and upper composition of a deterministic rationing method do not imply the consistency, lower composition, and upper composition of a minimal variance method associated with the given deterministic rationing method.
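For readers unfamiliar with rationing rules, here is a toy sketch of a deterministic rationing method and one of the axioms named above. This is illustrative only; it is the classical proportional rule, not Tasnádi's minimal variance construction:

```python
# Illustrative deterministic rationing: the proportional rule divides a
# scarce supply in proportion to claims. The assertion demonstrates the
# "equal treatment of equals" axiom: equal claims receive equal awards.
def proportional_rule(claims, supply):
    total = sum(claims)
    if total <= supply:          # no shortage: everyone is fully served
        return list(claims)
    return [supply * c / total for c in claims]

claims = [100.0, 50.0, 50.0]     # e.g., creditors of a bankrupt firm
awards = proportional_rule(claims, supply=120.0)
print(awards)                    # [60.0, 30.0, 30.0]
assert awards[1] == awards[2]    # equal treatment of equals
```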
Abstract:
Accurate knowledge of the time since death, or postmortem interval (PMI), has enormous legal, criminological, and psychological impact. In this study, an investigation was made to determine whether the relationship between the degradation of the human cardiac structural protein cardiac troponin T (cTnT) and PMI could be used as an indicator of time since death, thus providing a rapid, high-resolution, sensitive, and automated methodology for the determination of PMI. The use of cTnT, a protein found in heart tissue, as a selective marker for cardiac muscle damage has shown great promise in the determination of PMI. An optimized conventional immunoassay method was developed to quantify intact and fragmented cTnT. A small sample of cardiac tissue, which is less affected than other tissues by external factors, was taken, homogenized, extracted with magnetic microparticles, separated by SDS-PAGE, and visualized by Western blot, probing with a monoclonal antibody against cTnT. This step was followed by labeling and imaging with available scanners. This conventional immunoassay provides proper detection and quantitation of the cTnT protein in cardiac tissue as a complex matrix; however, it does not provide the analyst with immediate results. Therefore, a competitive separation method using capillary electrophoresis with laser-induced fluorescence (CE-LIF) was developed to study the interaction between human cTnT protein and monoclonal anti-troponin T antibody. Analysis of the results revealed a linear relationship between the percent of degraded cTnT and the log of the PMI, indicating that intact cTnT could be detected in human heart tissue up to 10 days postmortem at room temperature and beyond two weeks at 4°C. The data presented demonstrate that this technique can provide an extended time range during which PMI can be more accurately estimated compared to currently used methods, and that it represents a major advance in time-of-death determination through a fast and reliable, semi-quantitative measurement of a biochemical marker from an organ protected from outside factors.
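The stated calibration, percent degraded cTnT linear in log(PMI), implies the fitted line can be inverted to estimate time since death. The sketch below uses invented calibration points, not the study's data or coefficients:

```python
# Sketch of the log-linear calibration idea: fit pct_degraded against
# log10(PMI), then invert the line to estimate PMI from a new measurement.
# All numbers below are hypothetical.
import numpy as np

pmi_hours = np.array([12, 24, 48, 96, 192, 240])   # known PMIs (hypothetical)
pct_degraded = np.array([22, 35, 48, 61, 74, 79])  # measured degradation (%)

slope, intercept = np.polyfit(np.log10(pmi_hours), pct_degraded, 1)

def estimate_pmi(pct):
    """Invert pct = slope*log10(PMI) + intercept to get PMI in hours."""
    return 10 ** ((pct - intercept) / slope)

print(f"Estimated PMI for 55% degraded: {estimate_pmi(55.0):.0f} h")
```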
Abstract:
Energy saving, reduction of greenhouse gases, and increased use of renewables are key policies for achieving the European 2020 targets. In particular, distributed renewable energy sources, integrated with spatial planning, require novel methods to optimise supply and demand. In contrast with large-scale wind turbines, small and medium wind turbines (SMWTs) have a less extensive impact on the use of space and on the power system; nevertheless, a significant spatial footprint is still present and good spatial planning remains a necessity. To optimise the location of SMWTs, detailed knowledge of the spatial distribution of the average wind speed is essential. In this article, therefore, wind measurements and roughness maps were used to create a reliable annual mean wind speed map of Flanders at 10 m above the Earth's surface. Via roughness transformation, the surface wind speed measurements were converted into meso- and macroscale wind data. The data were further processed using seven different spatial interpolation methods in order to develop regional wind resource maps. Based on statistical analysis, it was found that the transformation into mesoscale wind, in combination with Simple Kriging, was the most adequate method for creating reliable maps for decision-making on optimal production sites for SMWTs in Flanders (Belgium).
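Simple Kriging, the interpolator the study found most adequate, predicts at an unsampled point as the known mean plus a covariance-weighted combination of station residuals. The compact sketch below uses invented station coordinates, wind speeds, and covariance parameters; the article fitted its models to Flemish measurements:

```python
# Compact Simple Kriging sketch: estimate mean wind speed at a target point
# from nearby stations, assuming a known mean and a Gaussian covariance
# model. All data and parameters are invented for illustration.
import numpy as np

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # km
wind = np.array([5.1, 5.9, 4.7, 6.2])   # mean wind speed at 10 m (m/s)
mean = 5.5                               # known mean, required by Simple Kriging
sill, range_km = 0.6, 8.0                # covariance parameters (assumed)

def cov(h):
    """Gaussian covariance model C(h) = sill * exp(-(h/range)^2)."""
    return sill * np.exp(-((h / range_km) ** 2))

d = np.linalg.norm(stations[:, None, :] - stations[None, :, :], axis=-1)
C = cov(d)                               # station-to-station covariances

target = np.array([4.0, 6.0])
c0 = cov(np.linalg.norm(stations - target, axis=1))
w = np.linalg.solve(C, c0)               # simple kriging weights

estimate = mean + w @ (wind - mean)
print(f"Kriged wind speed at {target}: {estimate:.2f} m/s")
```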
Abstract:
Based on optical imaging and spectroscopy of the Type II-Plateau SN 2013eq, we present a comparative study of commonly used distance determination methods based on Type II supernovae. The occurrence of SN 2013eq in the Hubble flow (z = 0.041 ± 0.001) prompted us to investigate the implications of the difference between "angular" and "luminosity" distances within the framework of the expanding photosphere method (EPM), which relies upon a relation between flux and angular size to yield a distance. Following a re-derivation of the basic equations of the EPM for SNe at non-negligible redshifts, we conclude that the EPM results in an angular distance. The observed flux should be converted into the SN rest frame, and the angular size, θ, has to be corrected by a factor of (1 + z)². Alternatively, the EPM angular distance can be converted to a luminosity distance by implementing a modification of the angular size. For SN 2013eq, we find EPM luminosity distances of D_L = 151 ± 18 Mpc and D_L = 164 ± 20 Mpc by making use of different sets of dilution factors taken from the literature. Application of the standardized candle method for Type II-P SNe results in an independent luminosity distance estimate (D_L = 168 ± 16 Mpc) that is consistent with the EPM estimate. Spectra of SN 2013eq are available in the Weizmann Interactive Supernova data REPository (WISeREP): http://wiserep.weizmann.ac.il
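The conversion in play is the standard distance-duality relation, D_L = (1 + z)² D_A. A quick numeric check using the quoted values (the implied angular distance below is back-computed for illustration, not taken from the paper):

```python
# Numeric illustration of the distance relation used above:
# a luminosity distance follows from an angular distance via
# D_L = (1 + z)^2 * D_A.
z = 0.041
D_L_quoted = 151.0                       # Mpc, one EPM result from the abstract
D_A = D_L_quoted / (1 + z) ** 2          # implied angular distance, ~139 Mpc
print(f"Implied D_A = {D_A:.1f} Mpc")
print(f"Round trip D_L = {(1 + z) ** 2 * D_A:.1f} Mpc")  # recovers 151.0
```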
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
Chionanthus pygmaeus Small (pygmy fringetree) (Oleaceae) is an endemic and rare Florida species whose attractive, small habit gives it great potential for use in managed landscapes. Members of the genus Chionanthus are difficult to propagate via cuttings and possess complex seed dormancies that are not well understood. Conservation of pygmy fringetree, and its potential for commercial propagation for use in managed landscapes, is contingent on a better understanding of its complex seed dormancy and enhancement of its propagation. I conducted two experiments to assess sexual and asexual propagation methods for pygmy fringetree. The first experiment was conducted to determine which factors are involved in overcoming seed dormancy. Various scarification treatments, which mimicked conditions seeds are exposed to in the wild, were investigated to determine their effects on the germination of 20-year-old seeds originally collected from the species' native range. Treatments included endocarp removal, sulfuric acid, boiling water, and smoke water. Prior to treatment initiation, seed viability was estimated to be 12%. Treated seeds went through two cold- and two warm-stratification periods of 4°C and 25°C, respectively, in a dark growth chamber. After 180 days, none of the treatments had induced early germination. Seeds were then retested for viability, which was 11%. Seed dormancy in this species is apparently complex, allowing some of the seeds to retain some degree of viability even when their dormancy requirements are not satisfied. The second experiment was conducted to assess whether pygmy fringetree could be successfully propagated via hardwood or root cuttings if the appropriate combination of environmental conditions and hormones were applied. Hardwood and root cuttings were treated with 1000 ppm IBA talc, 8000 ppm IBA talc, or inert talc. All cuttings were placed on a mist bench in a greenhouse for 9 weeks; hardwood cuttings were supplemented with bottom heat at 24°C. No treatment succeeded in inducing adventitious root formation. I conclude that pygmy fringetree seeds possess complex dormancy that could not be overcome by the treatments utilized; however, this result is confounded by the age of the seeds used in the experiment. I also conclude that vegetative propagation of pygmy fringetree is highly dependent on the time of year at which cuttings are harvested. Both seed and asexual propagation methods need further research before pygmy fringetree can be propagated on a commercial scale.
Abstract:
There is still a lack of an engineering approach to building Web systems, and the field of measuring the Web is not yet mature. In particular, there is uncertainty in the selection of evaluation methods, and there is a risk of standardizing inadequate evaluation practices. It is important to know whether we are evaluating the Web as a whole or specific website(s). We need a new categorization system, a different focus on evaluation methods, and an in-depth analysis that reveals the strengths and weaknesses of each method. As a contribution to the field of Web evaluation, this study proposes a novel approach to viewing and selecting evaluation methods based on the purpose and platforms of the evaluation. It is shown that the choice of the appropriate evaluation method(s) depends greatly on the purpose of the evaluation.
Abstract:
International audience