928 results for automation of fit analysis


Relevance: 100.00%

Abstract:

"First published in 1940."

Relevance: 100.00%

Abstract:

"February 1985."

Relevance: 100.00%

Abstract:

The report is preceded by the Bureau of Ships' Analysis of Arthur Andersen and Company shipbuilding cost study (22 leaves).

Relevance: 100.00%

Abstract:

Reprint of An analysis of speculative trading in grain futures, Technical bulletin no. 1001, issued Oct. 1949.

Relevance: 100.00%

Abstract:

Includes index.

Relevance: 100.00%

Abstract:

"May 1988."

Relevance: 100.00%

Abstract:

Includes bibliographical references.

Relevance: 100.00%

Abstract:

Cybercrime and related malicious activity in our increasingly digital world have become more prevalent and sophisticated, evading traditional security mechanisms. Digital forensics has been proposed to help investigate, understand and eventually mitigate such attacks. The practice of digital forensics, however, is still fraught with various challenges, the most prominent of which include the increasing amounts of data and the diversity of digital evidence sources appearing in digital investigations. Mobile devices and cloud infrastructures are an interesting case, as they inherently exhibit these challenging circumstances and are becoming more prevalent in digital investigations today. Additionally, they embody further characteristics such as large volumes of data from multiple sources, dynamic sharing of resources, limited individual device capabilities and the presence of sensitive data. This combined set of circumstances makes digital investigations in mobile and cloud environments particularly challenging. This is not helped by the fact that digital forensics today still involves manual, time-consuming tasks within the processes of identifying evidence, performing evidence acquisition and correlating multiple diverse sources of evidence in the analysis phase. Furthermore, industry-standard tools are largely evidence-oriented, have limited support for evidence integration and only automate certain precursory tasks, such as indexing and text searching. In this study, efficiency, in the form of reduced time and human labour, is sought in digital investigations in highly networked environments through the automation of certain activities in the digital forensic process. To this end, requirements are outlined and an architecture is designed for an automated system that performs digital forensics in highly networked mobile and cloud environments.
Part of the remote evidence acquisition activity of this architecture is built and tested on several mobile devices in terms of speed and reliability. A method for integrating multiple diverse evidence sources in an automated manner, supporting correlation and automated reasoning, is developed and tested. Finally, the proposed architecture is reviewed and enhancements are proposed to further automate it by introducing decentralization, particularly within the storage and processing functionality. This decentralization also improves machine-to-machine communication, supporting several digital investigation processes enabled by the architecture by harnessing the properties of various peer-to-peer overlays. Remote evidence acquisition improves the efficiency (time and effort involved) of digital investigations by removing the need for proximity to the evidence. Experiments show that a single-TCP-connection client-server paradigm does not offer the required scalability and reliability for remote evidence acquisition and that a multi-TCP-connection paradigm is required. The automated integration, correlation and reasoning over multiple diverse evidence sources demonstrated in the experiments improves speed and reduces the human effort needed in the analysis phase by removing the need for time-consuming manual correlation. Finally, informed by published scientific literature, the proposed enhancements for further decentralizing the Live Evidence Information Aggregator (LEIA) architecture offer a platform for increased machine-to-machine communication, thereby enabling automation and reducing the need for manual human intervention.
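The multi-connection acquisition finding above can be illustrated with a minimal sketch: an evidence image is split into byte ranges that are fetched over several parallel connections and reassembled in order. This is not the LEIA implementation; `read_chunk` is a hypothetical stand-in for a per-connection network read.

```python
# Illustrative sketch (not the LEIA implementation): fetch a remote evidence
# image as byte-range chunks over multiple parallel workers, then reassemble.
# `read_chunk(offset, size)` stands in for a per-connection network read.
from concurrent.futures import ThreadPoolExecutor

def acquire_image(read_chunk, total_size, chunk_size=4, workers=4):
    """Fetch [offset, offset+size) ranges concurrently; reassemble in order."""
    offsets = range(0, total_size, chunk_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves input order, so the joined result is byte-exact.
        parts = pool.map(
            lambda off: read_chunk(off, min(chunk_size, total_size - off)),
            offsets)
        return b"".join(parts)

# Demo against an in-memory "device image" standing in for the remote source.
image = bytes(range(32))
reader = lambda off, size: image[off:off + size]
```

In a real acquisition each worker would hold its own TCP connection, which is what gives the scalability and reliability advantage the abstract reports over a single-connection transfer.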

Relevance: 100.00%

Abstract:

This report presents the results of stratigraphic analysis of the southwestern quadrant of the Cedar Hills Regional Landfill (CHRLF). My report incorporates the recent Area 8 borehole data into the pre-existing analyses. This analysis was conducted during the preparation of the Area 8 Hydrogeologic Report, but it is my independent investigation and does not represent the opinion of UEC or their associates. The CHRLF, in Maple Valley, WA, south of Squak Mountain, is a municipal solid waste landfill that has been in operation since the 1960s. A network of borings, the product of previous investigations, exists for the study area. I utilized the compiled boring logs, previous investigations, and the recently acquired data to produce a series of interpretive cross-sections for the study area. I recognized nine distinct stratigraphic units, including fill. My interpreted stratigraphic units are similar to those identified in previous investigations such as the Area 7 Hydrogeologic investigation (HDR Engineering and Associates, 2008). These units include pre-Olympia-aged non-glacial alluvium, glacial alluvium, and glacial till. Additionally, younger, Vashon-aged deposits of glacial till, recessional outwash, recessional lacustrine, and ice-contact deposits were observed. An isolated “till-like” deposit was observed below the Vashon till; this may represent an older till as mapped by Sweet Edwards (1985) and Booth (1995). I cite the continuity of the lower contact of the Vashon till (Unit 5, Table 2) and the upper contact of pre-Vashon non-glacial fluvial deposits (Unit 9, Table 2) as evidence that faults or other structural features do not offset the deposits in the study area. This conclusion supports the findings of the pre-existing body of work within the landfill property and the nearby Queen City Farms property.

Relevance: 100.00%

Abstract:

In this paper we analyzed the adsorption of a large number of gases and vapors on graphitized thermal carbon black. The Henry constant was used to determine the adsorbate-adsorbent interaction energy, which is found to decrease modestly with temperature. Analysis of the complete adsorption isotherm over a wider range of pressure yields information on the monolayer coverage concentration and the adsorbate-adsorbate interaction energy. Among the various equations tested, the Hill-de Boer equation accounting for BET-postulated multilayer formation describes the adsorption isotherms of all adsorbates well. On average, the adsorbate-adsorbate interaction energy in the adsorbed phase is less than that in the bulk phase, suggesting that the distance between adsorbed molecules in the first layer of the adsorbed phase is slightly less than the equilibrium distance between two adsorbate molecules in the bulk phase. This indicates that the first layer is in a compressed state, owing to the attraction of the adsorbent surface. The monolayer concentration determined by fitting the Hill-de Boer equation to experimental data is slightly larger than the values calculated from the molecular projection area, suggesting that molecules can be oriented such that a larger number of molecules can be accommodated on the carbon black surface. This further supports the shorter distance between adsorbate molecules in the adsorbed phase.
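The fitting step described above can be sketched numerically. The sketch below assumes the common form of the Hill-de Boer isotherm, K·P = θ/(1−θ)·exp(θ/(1−θ) − c·θ), and fits K and c to a synthetic isotherm by least-squares grid search; the constants and coverages are invented, and this is not the authors' fitting procedure.

```python
# Hedged sketch of fitting a Hill-de Boer isotherm to coverage/pressure data.
# Assumed form: K*P = t * exp(t - c*theta), with t = theta/(1-theta);
# K is the Henry-law-like constant, c the adsorbate-adsorbate interaction term.
import math

def hill_de_boer_pressure(theta, K, c):
    """Pressure implied by fractional coverage theta for constants K, c."""
    t = theta / (1.0 - theta)
    return t * math.exp(t - c * theta) / K

def fit(data, K_grid, c_grid):
    """Least-squares grid search for (K, c) over pressure residuals."""
    def sse(K, c):
        return sum((hill_de_boer_pressure(th, K, c) - p) ** 2 for th, p in data)
    return min(((K, c) for K in K_grid for c in c_grid), key=lambda kc: sse(*kc))

# Synthetic isotherm generated from assumed constants K=2.0, c=3.0.
coverages = [0.05, 0.1, 0.2, 0.3, 0.4, 0.5]
data = [(th, hill_de_boer_pressure(th, 2.0, 3.0)) for th in coverages]
```

The fitted c plays the role of the adsorbate-adsorbate interaction energy discussed in the abstract; comparing the fitted monolayer capacity against a projection-area estimate is the step that revealed the compressed first layer.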

Relevance: 100.00%

Abstract:

QTL detection experiments in livestock species commonly use the half-sib design: each male is mated to a number of females, each female producing a limited number of progeny. Analysis consists of attempting to detect associations between phenotype and genotype measured on the progeny. When family sizes are limiting, experimenters may wish to incorporate as much information as possible into a single analysis. However, combining information across sires is problematic because of incomplete linkage disequilibrium between the markers and the QTL in the population. This study describes formulae for obtaining maximum likelihood estimates (MLEs) via the expectation-maximization (EM) algorithm for use in a multiple-trait, multiple-family analysis. A model specifying a QTL with only two alleles and a common within-sire error variance is assumed. Compared to single-family analyses, power can be improved up to fourfold with multi-family analyses. The accuracy and precision of QTL location estimates are also substantially improved. With small family sizes, the multi-family, multi-trait analyses substantially reduce, but do not totally remove, biases in QTL effect estimates. In situations where multiple QTL alleles are segregating, the multi-family analysis will average out the effects of the different QTL alleles.
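The model above, a two-allele QTL with a common error variance, makes progeny phenotypes a two-component normal mixture, which is the setting where EM applies. The toy iteration below is a generic textbook EM for two normal means with equal mixing, not the paper's multi-trait, multi-family formulae; the phenotype values are invented.

```python
# Toy EM for a two-allele QTL model: phenotypes are a 50/50 mixture of two
# normals (one per inherited QTL allele) with a common, known error SD.
# Generic textbook sketch, not the paper's multi-family estimator.
import math

def em_two_normal(x, mu1, mu2, sigma, iters=50):
    """Estimate the two component means; mixing proportion fixed at 0.5."""
    for _ in range(iters):
        def dens(xi, mu):
            # Unnormalized normal density (constants cancel in the ratio).
            return math.exp(-0.5 * ((xi - mu) / sigma) ** 2)
        # E-step: posterior probability each progeny carries allele 1.
        w = [dens(xi, mu1) / (dens(xi, mu1) + dens(xi, mu2)) for xi in x]
        # M-step: posterior-weighted means.
        mu1 = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
        mu2 = sum((1 - wi) * xi for wi, xi in zip(w, x)) / sum(1 - wi for wi in w)
    return mu1, mu2

# Hypothetical progeny phenotypes clustering around 10 and 14.
phenotypes = [9.8, 10.1, 10.2, 9.9, 14.0, 13.8, 14.2, 14.1]
```

The paper's contribution is extending this style of iteration across multiple traits and multiple sire families at once, which is where the reported fourfold power gain comes from.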

Relevance: 100.00%

Abstract:

This study represents the first application of multi-way calibration by N-PLS and multi-way curve resolution by PARAFAC to 2D diffusion-edited 1H NMR spectra. The aim of the analysis was to evaluate the potential for quantification of lipoprotein main and subfractions in human plasma samples. Multi-way N-PLS calibrations relating the methyl and methylene peaks of lipoprotein lipids to concentrations of the four main lipoprotein fractions as well as 11 subfractions were developed with high correlations (R = 0.75-0.98). Furthermore, a PARAFAC model with four chemically meaningful components was calculated from the 2D diffusion-edited spectra of the methylene peak of lipids. Although the four extracted PARAFAC components represent molecules of sizes that correspond to the four main fractions of lipoproteins, the corresponding concentrations of the four PARAFAC components proved not to be correlated to the reference concentrations of these four fractions in the plasma samples as determined by ultracentrifugation. These results indicate that NMR provides complementary information on the classification of lipoprotein fractions compared to ultracentrifugation. (C) 2004 Elsevier B.V. All rights reserved.

Relevance: 100.00%

Abstract:

In some applications of data envelopment analysis (DEA), there may be doubt as to whether all the DMUs form a single group with a common efficiency distribution. The Mann-Whitney rank statistic has been used both to evaluate whether two groups of DMUs come from a common efficiency distribution under the assumption that they share a common frontier, and to test whether the two groups have a common frontier. These procedures have subsequently been extended using the Kruskal-Wallis rank statistic to consider more than two groups. This technical note identifies problems with the second of these applications of both the Mann-Whitney and Kruskal-Wallis rank statistics. It also considers possible alternative methods of testing whether groups have a common frontier, and the difficulties of disaggregating managerial and programmatic efficiency within a non-parametric framework. © 2007 Springer Science+Business Media, LLC.
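The first of the two tests discussed above reduces to computing the Mann-Whitney U statistic on the two groups' efficiency scores. A minimal sketch, with made-up efficiency scores and midranks for ties:

```python
# Mann-Whitney U statistic for two groups of DEA efficiency scores.
# The scores below are hypothetical; ties are handled with midranks.
def mann_whitney_u(a, b):
    """Return the U statistic for sample `a` against sample `b`."""
    combined = sorted(a + b)
    def rank(v):
        # Midrank of value v in the pooled sample.
        less = sum(1 for x in combined if x < v)
        equal = sum(1 for x in combined if x == v)
        return less + (equal + 1) / 2.0
    r_a = sum(rank(v) for v in a)            # rank sum of group a
    return r_a - len(a) * (len(a) + 1) / 2.0  # U = R_a - n_a(n_a+1)/2

group1 = [0.92, 0.85, 0.78, 0.99, 0.88]  # hypothetical efficiency scores
group2 = [0.61, 0.70, 0.66, 0.74, 0.81]
```

U counts the pairs in which a group-1 score exceeds a group-2 score; values far from n1·n2/2 suggest the two efficiency distributions differ. The note's point is that the validity of this comparison depends on whether efficiencies were measured against a common frontier, which is the questionable second application.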

Relevance: 100.00%

Abstract:

In many models of edge analysis in biological vision, the initial stage is a linear 2nd-derivative operation. Such models predict that adding a linear luminance ramp to an edge will have no effect on the edge's appearance, since the ramp has no effect on the 2nd derivative. Our experiments did not support this prediction: adding a negative-going ramp to a positive-going edge (or vice versa) greatly reduced the perceived blur and contrast of the edge. The effects on a fairly sharp edge were accurately predicted by a nonlinear multi-scale model of edge processing [Georgeson, M. A., May, K. A., Freeman, T. C. A., & Hesse, G. S. (in press). From filters to features: Scale-space analysis of edge and blur coding in human vision. Journal of Vision], in which a half-wave rectifier comes after the 1st-derivative filter. But we also found that the ramp affected perceived blur more profoundly when the edge blur was large, and this greater effect was not predicted by the existing model. The model's fit to these data was much improved when the simple half-wave rectifier was replaced by a threshold-like transducer [May, K. A. & Georgeson, M. A. (2007). Blurred edges look faint, and faint edges look sharp: The effect of a gradient threshold in a multi-scale edge coding model. Vision Research, 47, 1705-1720]. This modified model correctly predicted that the interaction between ramp gradient and edge scale would be much larger for blur perception than for contrast perception. In our model, the ramp narrows an internal representation of the gradient profile, leading to a reduction in perceived blur. This in turn reduces perceived contrast, because estimated blur plays a role in the model's estimation of contrast. Interestingly, the model predicts that analogous effects should occur when the width of the window containing the edge is made narrower. This has already been confirmed for blur perception; here, we further support the model by showing a similar effect for contrast perception. © 2007 Elsevier Ltd. All rights reserved.
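The linear-model prediction that the experiments falsified can be verified numerically: a discrete 2nd derivative (second difference) of a luminance profile is unchanged by adding a linear ramp, even though perception is not. The edge samples below are illustrative values, not the stimuli used in the study.

```python
# Check the linear prediction: adding a ramp leaves the discrete 2nd
# derivative of a luminance profile unchanged (up to floating-point error).
def second_difference(signal):
    """Discrete 2nd derivative: s[i-1] - 2*s[i] + s[i+1] at interior points."""
    return [signal[i - 1] - 2 * signal[i] + signal[i + 1]
            for i in range(1, len(signal) - 1)]

# A blurred positive-going edge sampled at 9 points (illustrative values).
edge = [0.0, 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0, 1.0]
ramp_slope = -0.05  # negative-going ramp, as in the experiment
with_ramp = [v + ramp_slope * i for i, v in enumerate(edge)]
```

Since the two second-difference profiles are identical, any purely linear 2nd-derivative front end cannot explain the observed change in perceived blur and contrast, motivating the nonlinearity (rectifier, then threshold-like transducer) in the authors' model.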

Relevance: 100.00%

Abstract:

PURPOSE: To assess the repeatability of an objective image analysis technique to determine intraocular lens (IOL) rotation and centration. SETTING: Six ophthalmology clinics across Europe. METHODS: One hundred seven patients implanted with Akreos AO aspheric IOLs with orientation marks were imaged. Image quality was rated by a masked observer. The axis of rotation was determined from a line bisecting the IOL orientation marks. This was normalized for rotation of the eye between visits using the axis bisecting 2 consistent conjunctival vessels or iris features. The centers of ovals overlaid to circumscribe the IOL optic edge and the pupil or limbus were compared to determine IOL centration. Intrasession repeatability was assessed in 40 eyes and the variability of repeated analysis examined. RESULTS: Intrasession rotational stability of the IOL was ±0.79 degrees (SD), and centration was ±0.10 mm horizontally and ±0.10 mm vertically. Repeated analysis variability of the same image was ±0.70 degrees for rotation, and ±0.20 mm horizontally and ±0.31 mm vertically for centration. Eye rotation (absolute) between visits was 2.23 ± 1.84 degrees (10% > 5 degrees rotation) using one set of consistent conjunctival vessels or iris features and 2.03 ± 1.66 degrees (7% > 5 degrees rotation) using the average of 2 sets (P = .13). Poorer image quality resulted in larger apparent absolute IOL rotation (r = -0.45, P < .001). CONCLUSIONS: Objective analysis of digital retroillumination images allows sensitive assessment of IOL rotation and centration stability. Eye rotation between images can lead to significant errors if not taken into account. Image quality is important to analysis accuracy.
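The geometry described in the methods can be sketched in a few lines: the IOL axis is the line through its two orientation marks, and the apparent rotation between visits is corrected by the eye's own rotation measured from two fixed ocular landmarks. All coordinates below are invented for illustration; this is not the study's analysis software.

```python
# Sketch of the rotation measurement: IOL axis from its two orientation
# marks, normalized by eye rotation measured from two conjunctival/iris
# landmarks. Coordinates are hypothetical image-plane points (x, y).
import math

def axis_degrees(p1, p2):
    """Orientation (degrees) of the line through two image points."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def iol_rotation(marks_v1, marks_v2, landmarks_v1, landmarks_v2):
    """Apparent IOL rotation between visits, minus the eye's own rotation."""
    apparent = axis_degrees(*marks_v2) - axis_degrees(*marks_v1)
    eye = axis_degrees(*landmarks_v2) - axis_degrees(*landmarks_v1)
    return apparent - eye

# Visit 1: marks horizontal (0 deg). Visit 2: IOL rotated 5 deg, eye 2 deg.
r5, r2 = math.radians(5.0), math.radians(2.0)
marks_v1 = ((-1.0, 0.0), (1.0, 0.0))
marks_v2 = ((-math.cos(r5), -math.sin(r5)), (math.cos(r5), math.sin(r5)))
landmarks_v1 = ((0.0, -1.0), (0.0, 1.0))
landmarks_v2 = ((math.sin(r2), -math.cos(r2)), (-math.sin(r2), math.cos(r2)))
```

Without the landmark correction, the 2 degrees of eye rotation would be misread as IOL movement, which is exactly the error source the abstract warns about.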