999 results for measuring professionalism


Relevance:

20.00%

Publisher:

Abstract:

A remote temperature-measurement system for the HIFRL-CSR circulating cooling water, based on one-wire-bus digital temperature sensors, is introduced, and the hardware modules and software design of the temperature measurement centered on the DT400 module are described. The system features high measurement accuracy, easy extensibility, low cost, low power consumption, high reliability, and strong resistance to interference, and it can be applied to a variety of temperature-measurement systems according to need.
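As a hedged illustration of the sensing side only, the sketch below reads a one-wire-bus digital temperature sensor through the Linux w1 sysfs interface. A DS18B20-class sensor and the sysfs path are assumptions; the paper's DT400-based hardware and software are not reproduced here.

    import glob

    def read_temp_c(device_dir):
        # The w1 kernel driver exposes each sensor as a directory whose
        # w1_slave file ends line 1 with "YES" when the CRC passed and
        # carries "t=<milli-degrees C>" on line 2.
        with open(device_dir + "/w1_slave") as f:
            lines = f.read().splitlines()
        if not lines[0].strip().endswith("YES"):
            raise IOError("CRC check failed for " + device_dir)
        return int(lines[1].split("t=")[1]) / 1000.0

    # Family code 28 is the DS18B20; poll every matching sensor once.
    for dev in glob.glob("/sys/bus/w1/devices/28-*"):
        print(dev, read_temp_c(dev), "degC")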

Relevance:

20.00%

Publisher:

Abstract:

Chinese Academy of Sciences, ISCAS Laboratory for Internet Software Technologies

Relevance:

20.00%

Publisher:

Abstract:

Several methods have been used to measure the electronic decay constant (beta) of organic molecules, but each has some disadvantages. Here, for the first time, electrochemical impedance spectroscopy (EIS) was used to obtain the beta value by measuring the tunneling resistance through alkanedithiols. The tunneling resistance through alkanedithiols increases exponentially with molecular length, consistent with a mechanism of coherent nonresonant tunneling. The measured beta was 0.51 +/- 0.01 per carbon.
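The exponential length dependence stated above can be turned into a one-line fit: with R = R0 * exp(beta * n) for n carbons, beta is the slope of ln R versus n. The sketch below uses synthetic resistances, not the paper's data.

    import numpy as np

    n = np.array([6, 8, 10, 12])               # hypothetical carbon counts
    R = 1e3 * np.exp(0.51 * n)                 # synthetic R = R0 * exp(beta * n)
    beta, ln_R0 = np.polyfit(n, np.log(R), 1)  # slope of ln R vs. n is beta
    print(round(beta, 2))                      # -> 0.51 per carbon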

Relevance:

20.00%

Publisher:

Abstract:

A new method of measuring the mean size of solvent clusters in a swollen polymer membrane is presented. The method is based on a combination of inverse gas chromatography (IGC) and equilibrium swelling. Its basis is that the weight-fraction activity coefficient of a solvent in a swollen polymer is influenced by the size of its clusters: the mean cluster size of the solvent can be calculated as the weight-fraction activity coefficient of the clustering system divided by that of the non-clustering system. In this work, the weight-fraction activity coefficient of the non-clustering system was measured by IGC. Methanol-polyimide and ethanol-polyimide systems were tested with the new method at three temperatures: 20, 40, and 60 °C. The mean cluster size of methanol in polyimide was five, four, and three at these temperatures, respectively; ethanol did not form clusters (the mean cluster size was one). In contrast to the inherently narrow temperature range of DSC, XRD, and FTIR methods, the combination of IGC and equilibrium swelling covers a broad temperature range, so the new method can detect clusters in a solvent-polymer system at higher temperatures.
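A minimal sketch of the quotient described above, with illustrative numbers rather than the paper's measurements:

    def mean_cluster_size(omega_clustering, omega_non_clustering):
        # Mean cluster size = weight-fraction activity coefficient of the
        # clustering (equilibrium-swelling) system divided by that of the
        # non-clustering system measured by IGC.
        return omega_clustering / omega_non_clustering

    print(mean_cluster_size(25.0, 5.0))  # -> 5.0, e.g. methanol/polyimide at 20 degC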

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces a new method to estimate the diffusion coefficient and transference number of a salt or an electroactive ion in a solution with little or no supporting electrolyte. The above two parameters can be obtained from a single potential step experiment without previous knowledge of either one. It would appear that the method could also be used in the study of ion transport in a high-viscosity solvent or a solid electrolyte.

Relevance:

20.00%

Publisher:

Abstract:

This paper deals with the correction of the mode II strain energy release rate, G(II), of composite laminates measured with the end-notched flexure (ENF) specimen. A derivation is given of the expressions for compliance and strain energy release rate, in which …
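For orientation only, the uncorrected classical beam-theory expressions for the ENF specimen are sketched below (half-span L, crack length a, width b, half-thickness h, flexural modulus E1, load P); the corrections the paper derives are not reproduced here.

    def enf_compliance(a, L, b, h, E1):
        # Classical beam theory: C = (2L^3 + 3a^3) / (8 E1 b h^3)
        return (2 * L**3 + 3 * a**3) / (8 * E1 * b * h**3)

    def enf_mode2_G(P, a, b, h, E1):
        # G_II = (P^2 / 2b) * dC/da = 9 P^2 a^2 / (16 E1 b^2 h^3)
        return 9 * P**2 * a**2 / (16 * E1 * b**2 * h**3)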

Relevance:

20.00%

Publisher:

Abstract:

Alexander, N., Rhodes, M., & Myers, H. (2007). International market selection: measuring actions instead of intentions. Journal of Services Marketing, 21(6), 424-434.

Relevance:

20.00%

Publisher:

Abstract:

The quality of available network connections can often have a large impact on the performance of distributed applications. For example, document transfer applications such as FTP, Gopher and the World Wide Web suffer increased response times as a result of network congestion. For these applications, the document transfer time is directly related to the available bandwidth of the connection. Available bandwidth depends on two things: 1) the underlying capacity of the path from client to server, which is limited by the bottleneck link; and 2) the amount of other traffic competing for links on the path. If measurements of these quantities were available to the application, the current utilization of connections could be calculated. Network utilization could then be used as a basis for selection from a set of alternative connections or servers, thus providing reduced response time. Such a dynamic server selection scheme would be especially important in a mobile computing environment, in which the set of available servers changes frequently. In order to provide these measurements at the application level, we introduce two tools: bprobe, which provides an estimate of the uncongested bandwidth of a path; and cprobe, which gives an estimate of the current congestion along a path. These two measures may be used in combination to provide the application with an estimate of available bandwidth between server and client, thereby enabling application-level congestion avoidance. In this paper we discuss the design and implementation of our probe tools, specifically illustrating the techniques used to achieve accuracy and robustness. We present validation studies for both tools, which demonstrate their reliability in the face of actual Internet conditions, and we give results of a survey of available bandwidth to a random set of WWW servers as a sample application of our probe technique. We conclude with descriptions of other applications of our measurement tools, several of which are currently under development.
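A stylized sketch of how an application might combine the two estimates (the names below are stand-ins, not the actual tool interfaces): a bprobe-style capacity estimate minus a cprobe-style competing-traffic estimate gives available bandwidth, which can then drive server selection.

    def available_bw(capacity_bps, cross_traffic_bps):
        # Available bandwidth = bottleneck capacity minus competing traffic.
        return max(capacity_bps - cross_traffic_bps, 0.0)

    def pick_server(estimates):
        # estimates: {server: (capacity_bps, cross_traffic_bps)}
        return max(estimates, key=lambda s: available_bw(*estimates[s]))

    servers = {"mirror-a": (10e6, 7e6),   # 3 Mb/s available
               "mirror-b": (5e6, 1e6)}    # 4 Mb/s available
    print(pick_server(servers))           # -> mirror-b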

Relevance:

20.00%

Publisher:

Abstract:

Server performance has become a crucial issue for improving the overall performance of the World-Wide Web. This paper describes Webmonitor, a tool for evaluating and understanding server performance, and presents new results for a realistic workload. Webmonitor measures activity and resource consumption, both within the kernel and in HTTP processes running in user space. Webmonitor is implemented using an efficient combination of sampling and event-driven techniques that exhibit low overhead. Our initial implementation is for the Apache World-Wide Web server running on the Linux operating system. We demonstrate the utility of Webmonitor by measuring and understanding the performance of a Pentium-based PC acting as a dedicated WWW server. Our workload uses a file size distribution with a heavy tail. This captures the fact that Web servers must concurrently handle some requests for large audio and video files, and a large number of requests for small documents, containing text or images. Our results show that in a Web server saturated by client requests, over 90% of the time spent handling HTTP requests is spent in the kernel. Furthermore, keeping TCP connections open, as required by TCP, causes a factor of 2-9 increase in the elapsed time required to service an HTTP request. Data gathered from Webmonitor provide insight into the causes of this performance penalty. Specifically, we observe a significant increase in resource consumption along three dimensions: the number of HTTP processes running at the same time, CPU utilization, and memory utilization. These results emphasize the important role of operating system and network protocol implementation in determining Web server performance.
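The heavy-tailed file-size distribution mentioned above can be mimicked with a Pareto draw; the shape and bounds here are illustrative, not the paper's workload parameters.

    import random

    def pareto_file_size(alpha=1.1, xmin=1_000, cap=100_000_000):
        # Inverse-CDF sampling of a Pareto distribution, capped at `cap`:
        # many small documents, a few very large audio/video files.
        u = 1.0 - random.random()          # uniform in (0, 1]
        return min(int(xmin * u ** (-1.0 / alpha)), cap)

    sizes = [pareto_file_size() for _ in range(100_000)]
    print(max(sizes), sum(sizes) // len(sizes))   # the tail dominates the mean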

Relevance:

20.00%

Publisher:

Abstract:

Accurate measurement of network bandwidth is crucial for flexible Internet applications and protocols which actively manage and dynamically adapt to changing utilization of network resources. These applications must do so to perform tasks such as distributing and delivering high-bandwidth media, scheduling service requests and performing admission control. Extensive work has focused on two approaches to measuring bandwidth: measuring it hop-by-hop, and measuring it end-to-end along a path. Unfortunately, best-practice techniques for the former are inefficient and techniques for the latter are only able to observe bottlenecks visible at end-to-end scope. In this paper, we develop and simulate end-to-end probing methods which can measure bottleneck bandwidth along arbitrary, targeted subpaths of a path in the network, including subpaths shared by a set of flows. As another important contribution, we describe a number of practical applications which we foresee as standing to benefit from solutions to this problem, especially in emerging, flexible network architectures such as overlay networks, ad-hoc networks, peer-to-peer architectures and massively accessed content servers.
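The basic end-to-end version of this style of probing is easy to state: back-to-back packets are spread out by the bottleneck link, so capacity is roughly packet size over the observed dispersion. The subpath extensions the paper develops build on this idea; only the simple end-to-end estimate is sketched below.

    def bottleneck_bw(packet_size_bytes, dispersion_s):
        # Packet-pair estimate: capacity ~ packet size / inter-arrival gap.
        return 8 * packet_size_bytes / dispersion_s   # bits per second

    print(bottleneck_bw(1500, 0.0012))   # ~10 Mb/s for a 1.2 ms gap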

Relevance:

20.00%

Publisher:

Abstract:

One of the most vexing questions facing researchers interested in the World Wide Web is why users often experience long delays in document retrieval. The Internet's size, complexity, and continued growth make this a difficult question to answer. We describe the Wide Area Web Measurement project (WAWM) which uses an infrastructure distributed across the Internet to study Web performance. The infrastructure enables simultaneous measurements of Web client performance, network performance and Web server performance. The infrastructure uses a Web traffic generator to create representative workloads on servers, and both active and passive tools to measure performance characteristics. Initial results based on a prototype installation of the infrastructure are presented in this paper.
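One of the simplest active measurements such an infrastructure takes is client-perceived document retrieval time; a minimal sketch follows (the URL is a placeholder, and the WAWM tools themselves are not reproduced).

    import time
    import urllib.request

    def retrieval_time(url):
        # Wall-clock time to fetch a document, as a client perceives it.
        start = time.monotonic()
        with urllib.request.urlopen(url) as resp:
            body = resp.read()
        return time.monotonic() - start, len(body)

    elapsed, nbytes = retrieval_time("http://example.com/")
    print(nbytes, "bytes in", round(elapsed, 3), "s")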

Relevance:

20.00%

Publisher:

Abstract:

This paper develops a framework for estimating household preferences for school and neighborhood attributes in the presence of sorting. It embeds a boundary discontinuity design in a heterogeneous residential choice model, addressing the endogeneity of school and neighborhood characteristics. The model is estimated using restricted-access Census data from a large metropolitan area, yielding a number of new results. First, households are willing to pay less than 1 percent more in house prices (substantially lower than previous estimates) when the average performance of the local school increases by 5 percent. Second, much of the apparent willingness to pay for more educated and wealthier neighbors is explained by the correlation of these sociodemographic measures with unobserved neighborhood quality. Third, neighborhood race is not capitalized directly into housing prices; instead, the negative correlation of neighborhood percent black and housing prices is due entirely to the fact that blacks live in unobservably lower-quality neighborhoods. Finally, there is considerable heterogeneity in preferences for schools and neighbors, with households preferring to self-segregate on the basis of both race and education.
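A stylized sketch of the boundary-discontinuity idea, on synthetic data rather than the restricted-access Census sample: including a fixed effect for each boundary segment identifies the school-quality coefficient from variation across the boundary, stripping out the confounded neighborhood component.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000
    boundary = rng.integers(0, 50, n)                  # boundary-segment id
    quality = 0.1 * boundary + rng.normal(size=n)      # school score, confounded with place
    log_price = 0.02 * quality + 0.05 * boundary + rng.normal(scale=0.1, size=n)
    df = pd.DataFrame({"log_price": log_price, "quality": quality,
                       "boundary": boundary})
    fit = smf.ols("log_price ~ quality + C(boundary)", data=df).fit()
    print(fit.params["quality"])                       # ~0.02 despite the confound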

Relevance:

20.00%

Publisher:

Abstract:

We present measurements of morphological features in a thick turbid sample using light-scattering spectroscopy (LSS) and Fourier-domain low-coherence interferometry (fLCI) by processing with the dual-window (DW) method. A parallel frequency-domain optical coherence tomography (OCT) system with a white-light source is used to image a two-layer phantom containing polystyrene beads of diameters 4.00 and 6.98 μm on the top and bottom layers, respectively. The DW method decomposes each OCT A-scan into a time-frequency distribution with simultaneously high spectral and spatial resolution. The spectral information from localized regions in the sample is used to determine scatterer structure. The results show that the two scatterer populations can be differentiated using LSS and fLCI.
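The dual-window construction can be sketched directly: form two short-time Fourier transforms of the same A-scan on a common grid, one with a narrow Gaussian window (sharp in depth) and one with a wide window (sharp in frequency), and multiply them. The window widths and the test signal below are illustrative, not the paper's processing parameters.

    import numpy as np

    def gaussian_stft(x, sigma):
        # STFT on a common N x N grid: a Gaussian window of width `sigma`
        # centered at each sample, followed by a full-length FFT.
        N = len(x)
        n = np.arange(N)
        out = np.empty((N, N), dtype=complex)
        for k in range(N):
            w = np.exp(-0.5 * ((n - k) / sigma) ** 2)
            out[k] = np.fft.fft(x * w)
        return out

    x = np.cos(0.3 * np.pi * np.arange(256)) * np.hanning(256)  # toy A-scan
    # Dual-window TFD: narrow window (spatial detail) times wide window
    # (spectral detail) retains both resolutions simultaneously.
    tfd = np.abs(gaussian_stft(x, 4)) * np.abs(gaussian_stft(x, 64))
    print(tfd.shape)   # (depth index, frequency index)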