963 results for Area measurement.
Abstract:
An electrochemical technique for the real-time detection of hydrogen peroxide (H2O2) was employed to characterize the respiratory burst activity (RBA) of phagocytes in plasma, which can be used to evaluate immune capability and disease resistance. The method is based on changes in electric current produced by the redox reaction, at a platinum electrode held at 680 mV direct current (d.c.), of extracellular H2O2 released from phagocytes stimulated by zymosan. Compared with the control, activation of the respiratory burst by zymosan particles produced a strong amperometric response, with a current peak observed during the monitoring process. Addition of Cu2+ and other controls confirmed that the peak current resulted from the intense release of H2O2 from phagocytes. The peak area was calculated and used to evaluate the quantity of effective H2O2, i.e. the H2O2 remaining beyond the clearance capacity of the related enzymes in plasma. According to Faraday's law, the ability of prawn phagocytes to generate effective H2O2 was evaluated at 1.253 × 10⁻¹⁴ to 6.146 × 10⁻¹⁴ mol/cell, and that of carp phagocytes at 1.689 × 10⁻¹⁵ to 7.873 × 10⁻¹⁵ mol/cell. This method provides sensitive and rapid detection of extracellular effective H2O2 in plasma and reflects the capacity of phagocytes under natural conditions, and it could be applied in aquaculture to select species and broodstock with high immunity. (c) 2007 Elsevier Ltd. All rights reserved.
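As a rough illustration of the Faraday's-law step described above, the sketch below integrates a synthetic amperometric peak to obtain charge and converts it to moles of H2O2 per cell (two electrons per H2O2 oxidized at the platinum electrode). The current trace and cell count are hypothetical placeholders, not data from the study.

```python
import numpy as np

F = 96485.0        # Faraday constant, C/mol
N_ELECTRONS = 2    # H2O2 -> O2 + 2H+ + 2e- at the Pt electrode

t = np.linspace(0, 600, 601)                       # s, 10 min monitoring window
baseline = 2e-8                                    # A, background current (assumed)
current = baseline + 5e-7 * np.exp(-((t - 300) / 60) ** 2)  # A, synthetic burst peak

# rectangle-rule integration of the peak area above baseline -> charge in coulombs
charge = float(np.sum((current - baseline)[:-1] * np.diff(t)))
moles_h2o2 = charge / (N_ELECTRONS * F)            # mol of effective H2O2
n_cells = 1e6                                      # hypothetical phagocyte count
print(f"effective H2O2 per cell: {moles_h2o2 / n_cells:.3e} mol/cell")
```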
Abstract:
There is a need to obtain hydrologic data, including ocean currents, waves and temperature, in the South China Sea. A new profiling instrument, designed to avoid damage from natural forces or from incidents with passing ships, is under development to acquire data from this area. The device is based on a taut, single-point, mid-water mooring system. It incorporates a small, instrumented, vertically profiling float attached via an electromechanical cable to a winch integrated with the main subsurface flotation. On a pre-set schedule, the instrument float with its sensors is winched up to the surface when an on-board miniature sonar detects no passing ship, and it is immediately winched back down to a safe depth if the sonar detects an approaching vessel. Because logistics allow the site to be visited only infrequently, while a minimum of 10 profiles per day is desired, energy demands are severe. To address these constraints, the system has been designed to conserve a substantial portion of the potential energy released during the ascent phase of each profile and subsequently use this energy to pull the instrument back down. Compared with the previous single-point layered measuring mode, this design is more capable and more economical. Finally, the paper describes a sea trial in the South China Sea.
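The energy-recovery claim lends itself to a back-of-the-envelope budget. The sketch below accounts for the potential energy a buoyant float releases per ascent and the fraction assumed to be recovered for the pull-down; every number (net buoyancy, depth, recovery efficiency) is a hypothetical placeholder, not a figure from the paper.

```python
net_buoyancy = 50.0    # N, hypothetical net upward force on the float
depth = 400.0          # m, hypothetical profiling range
recovery_eff = 0.6     # assumed fraction of ascent energy recovered by the winch
profiles_per_day = 10  # the minimum profiling rate cited in the abstract

ascent_energy = net_buoyancy * depth     # J released on each ascent
pulldown_cost = net_buoyancy * depth     # J needed for each descent (ideal winch)
battery_per_profile = pulldown_cost - recovery_eff * ascent_energy
print(f"battery energy per profile: {battery_per_profile / 1e3:.1f} kJ")
print(f"daily battery demand: {profiles_per_day * battery_per_profile / 1e3:.0f} kJ")
```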
Abstract:
Psychometrics is a term within the statistical literature that encompasses the development and evaluation of psychological tests and measures, an area of increasing importance within applied psychology specifically and the behavioral sciences more broadly. Confusion persists regarding the fundamental tenets of psychometric evaluation and the application of the appropriate statistical tests and procedures. The purpose of this paper is to highlight the main psychometric elements that need to be considered in both the development and the evaluation of an instrument or tool used within the context of posttraumatic stress disorder (PTSD). The psychometric profile should also be considered for established tools already used in PTSD screening. A “standard” for the application and reporting of psychometric data and approaches is emphasized, the goal of which is to ensure that the key psychometric parameters are considered in relation to the selection and use of PTSD screening tools.
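One of the key psychometric parameters such an evaluation reports is internal consistency. As a minimal sketch of how it is computed, the code below calculates Cronbach's alpha for a hypothetical five-item scale with synthetic responses; no real PTSD instrument or dataset is implied.

```python
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=200)                        # simulated trait level per respondent
items = np.column_stack([latent + rng.normal(scale=0.8, size=200)
                         for _ in range(5)])         # 5 correlated item responses

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)                # variance of each item
total_var = items.sum(axis=1).var(ddof=1)            # variance of the summed scale
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")              # ~0.7+ is commonly read as acceptable
```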
Abstract:
One of the most vexing questions facing researchers interested in the World Wide Web is why users often experience long delays in document retrieval. The Internet's size, complexity, and continued growth make this a difficult question to answer. We describe the Wide Area Web Measurement project (WAWM), which uses an infrastructure distributed across the Internet to study Web performance. The infrastructure enables simultaneous measurement of Web client performance, network performance and Web server performance. It uses a Web traffic generator to create representative workloads on servers, and both active and passive tools to measure performance characteristics. Initial results based on a prototype installation of the infrastructure are presented in this paper.
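As a minimal sketch of the active-measurement side (not WAWM's actual tooling), the probe below times the TCP-connect and total-transfer phases of one Web retrieval; the target host is a placeholder.

```python
import socket
import time

host, path = "example.com", "/"     # placeholder target, not a WAWM server
t0 = time.perf_counter()
sock = socket.create_connection((host, 80), timeout=5)
t_connect = time.perf_counter()
sock.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
body = b""
while chunk := sock.recv(4096):     # HTTP/1.0 server closes when done
    body += chunk
t_done = time.perf_counter()
sock.close()
print(f"TCP connect: {(t_connect - t0) * 1e3:.1f} ms, "
      f"total retrieval: {(t_done - t0) * 1e3:.1f} ms, {len(body)} bytes")
```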
Abstract:
The cost and complexity of deploying measurement infrastructure in the Internet for the purpose of analyzing its structure and behavior are considerable. Basic questions about the utility of increasing the number of measurements and/or measurement sites have not yet been addressed, which has led to a "more is better" approach to wide-area measurements. In this paper, we quantify the marginal utility of performing wide-area measurements in the context of Internet topology discovery. We characterize topology in terms of nodes, links, node degree distribution, and end-to-end flows using statistical and information-theoretic techniques. We classify nodes discovered on the routes between a set of 8 sources and 1277 destinations to differentiate nodes that make up the so-called "backbone" from those that border the backbone and those on links between border nodes and destination nodes. This process includes resolving nodes that advertise multiple interfaces to single IP addresses. We show that the utility of adding sources declines significantly after the second source from the perspective of interface, node, link and node degree discovery. We show that the utility of adding destinations is constant for interfaces, nodes, links and node degree, indicating that it is more important to add destinations than sources. Finally, we analyze paths through the backbone and show that the shared-link distribution approximates a power law, indicating that a small number of backbone links in our study are very heavily utilized.
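The diminishing-returns effect is easy to reproduce in miniature. The sketch below unions synthetic per-source link sets (stand-ins for traceroute-derived topologies) and reports each source's marginal contribution; the set sizes and overlap are invented for illustration.

```python
import random

random.seed(1)
universe = range(5000)                                # hypothetical link IDs
# each "source" sees a heavily overlapping sample of the same backbone
sources = [set(random.sample(universe, 2000)) for _ in range(8)]

seen = set()
for i, links in enumerate(sources, start=1):
    new = len(links - seen)                           # marginal utility of this source
    seen |= links
    print(f"source {i}: {new} new links, {len(seen)} total discovered")
# With heavy overlap the marginal gain collapses after the first couple of
# sources -- the diminishing-returns effect the paper reports.
```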
Abstract:
Traditional motion capture techniques, for instance those employing optical technology, have long been used in rehabilitation, sports medicine and performance analysis, where accurately capturing biomechanical data is of crucial importance. However, their size, cost, complexity and lack of portability mean that their use is often impractical. Low-cost MEMS inertial sensors, when combined and assembled into a Wireless Inertial Measurement Unit (WIMU), present a possible solution for low-cost and highly portable motion capture. However, due to the large variability inherent in MEMS sensors, such a system needs extensive characterization to calibrate each sensor and ensure good-quality data capture. A completely calibrated WIMU system would allow motion capture in a wider range of real-world, non-laboratory applications. Calibration can be a complex task, particularly for newer inertial sensors supporting multiple sensing ranges. We therefore present an automated system for quickly and easily calibrating inertial sensors in a packaged WIMU, demonstrating some of the accuracy improvements attainable.
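As a minimal sketch of one calibration step such a rig can automate (an assumed example, not taken from the paper), the code below solves for an accelerometer axis's bias and scale factor from two static orientations, with the axis held up and then down against gravity; the raw readings are hypothetical ADC-like values.

```python
G = 9.81  # m/s^2, gravitational reference

def calibrate_axis(reading_up: float, reading_down: float):
    """Return (bias, scale) so that (raw - bias) / scale yields m/s^2."""
    bias = (reading_up + reading_down) / 2.0       # offset at zero acceleration
    scale = (reading_up - reading_down) / (2.0 * G)  # counts per m/s^2
    return bias, scale

bias, scale = calibrate_axis(reading_up=512.0, reading_down=-498.0)
raw = 260.0                                        # some later raw measurement
print(f"bias={bias:.1f}, scale={scale:.3f}, accel={(raw - bias) / scale:.2f} m/s^2")
```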
Abstract:
This work employs a custom-built body area network of wireless inertial measurement technology to conduct a biomechanical analysis of precision targeted throwing in competitive and recreational darts. The solution is shown to be capable of measuring key biomechanical factors including speed, acceleration and timing. These parameters are subsequently correlated with scoring performance to determine the effect each variable has on outcome. For validation purposes, an optical 3D motion capture system provides a complete kinematic model of the subject and enables concurrent benchmarking of the 'gold standard' optical system against the more affordable and portable wireless inertial measurement solution developed as part of this work.
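The correlation step is straightforward; as a sketch, the code below computes the Pearson correlation between one hypothetical per-throw variable (release speed) and score, on synthetic data rather than the study's WIMU recordings.

```python
import numpy as np

rng = np.random.default_rng(42)
release_speed = rng.normal(5.5, 0.4, size=60)                  # m/s, hypothetical throws
score = 40 + 8 * (release_speed - 5.5) + rng.normal(0, 5, size=60)  # synthetic outcomes

r = np.corrcoef(release_speed, score)[0, 1]
print(f"Pearson r between release speed and score: {r:.2f}")
```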
Abstract:
The spatial variability of aerosol number and mass along roads was determined in different regions (urban, rural and coastal-marine) of the Netherlands. A condensation particle counter (CPC) and an optical aerosol spectrometer (LAS-X) were installed in a van along with a global positioning system (GPS). Concentrations were measured at high time resolution while driving, allowing investigations not possible with stationary equipment. In particular, this approach proves useful for identifying locations where number and mass attain high levels ('hot spots'). In general, number and mass concentrations of particulate matter increase with the degree of urbanisation, with number concentration being the more sensitive indicator. The lowest particle numbers and PM1 concentrations are encountered in coastal and rural areas: <5000 cm⁻³ and 6 μg m⁻³, respectively. The presence of sea-salt material along the North Sea coast enhances PM>1 concentrations compared with inland levels. High particle numbers are encountered on motorways, correlating with traffic intensity; the largest average number concentration is measured on the ring motorway around Amsterdam: about 160,000 cm⁻³ (traffic intensity 100,000 veh day⁻¹). Peak values occur in tunnels, where numbers exceed 10⁶ cm⁻³. Enhanced PM1 levels (i.e. larger than 9 μg m⁻³) exist on motorways, major traffic roads and in tunnels. The concentrations of PM>1 appear rather uniformly distributed (below 6 μg m⁻³ for most observations). On the urban scale, (large) spatial variations in concentration can be explained by varying traffic intensities and driving patterns. The highest particle numbers are measured in traffic congestion or when driving behind a heavy diesel vehicle (up to 600 × 10³ cm⁻³). Relatively high numbers are observed during the passage of crossings and, at decreasing rates, on main roads with much traffic, quiet streets and residential areas with limited traffic. The number concentration exhibits a larger variability than mass: the mass concentration on city roads with much traffic is 12% higher than in a residential area at the edge of the same city, while the number of particles changes by a factor of two, due to the presence of ultrafine particles (aerodynamic diameter <100 nm). It is further indicated that people residing some 100 m downwind of a major traffic source are still exposed to 40% more particles than those living in urban background areas. © 2004 Elsevier Ltd. All rights reserved.
Abstract:
Purpose: Environmental turbulence, including rapid changes in technology and markets, has resulted in the need for new approaches to performance measurement and benchmarking. There is a need for studies that attempt to measure and benchmark upstream, leading or developmental aspects of organizations. Therefore, the aim of this paper is twofold. The first is to conduct an in-depth case analysis of lead performance measurement and benchmarking, leading to the further development of a conceptual model derived from the extant literature and initial survey data. The second is to outline future research agendas that could further develop the framework and the subject area.
Design/methodology/approach: A multiple case analysis involving repeated in-depth interviews with managers in organisational areas of upstream influence in the case organisations.
Findings: It was found that the effect of external drivers for lead performance measurement and benchmarking was mediated by organisational context factors such as the level of progression in business improvement methods. Moreover, although the business improvement methods used for this purpose were typical, their legitimated use had been extended beyond their original purpose through the development of bespoke sets of lead measures.
Practical implications: Examples of methods and lead measures are given that can be used by organizations in developing a programme of lead performance measurement and benchmarking.
Originality/value: There is a paucity of in-depth studies relating to the theory and practice of lead performance measurement and benchmarking in organisations.
Abstract:
Using seven strategically placed, time-synchronized body-worn receivers covering the head, upper front and back torso, and the limbs, we have investigated the effect of user state (stationary or mobile) and local environment (anechoic chamber, open office area and hallway) upon the first- and second-order statistics of on-body fading channels. Three candidate models were considered: Nakagami, Rice and lognormal. Using maximum likelihood estimation and the Akaike information criterion, it was established that the Nakagami-m distribution best described small-scale fading for the majority of on-body channels over all the measurement scenarios. When the user was stationary, Nakagami-m parameters were found to be much greater than 1, irrespective of local surroundings. For mobile channels, Nakagami-m parameters decreased significantly, with channels in the open office area and hallway experiencing the worst fading conditions.
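A minimal sketch of this model-selection procedure: fit each candidate distribution to envelope samples by maximum likelihood and rank the fits by AIC. The samples here are synthetic (Nakagami-generated), standing in for the measured on-body channel data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# synthetic fading-envelope samples drawn from a Nakagami-m model (m = 1.8)
env = stats.nakagami.rvs(1.8, scale=1.0, size=2000, random_state=rng)

candidates = {"Nakagami": stats.nakagami,
              "Rice": stats.rice,
              "Lognormal": stats.lognorm}
for name, dist in candidates.items():
    params = dist.fit(env, floc=0)              # MLE with location pinned at 0
    loglik = dist.logpdf(env, *params).sum()
    k = len(params) - 1                         # loc was fixed, not estimated
    aic = 2 * k - 2 * loglik                    # lower AIC = preferred model
    print(f"{name:9s} AIC = {aic:.1f}")
```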
Abstract:
In this letter, we investigate the distribution of the phase component of the complex received signal observed in practical experiments using body area networks. Two phase distributions, the recently proposed kappa-mu and eta-mu probability densities, which together encompass the most widely used fading models, namely Semi-Gaussian, Rayleigh, Hoyt, Rice, and Nakagami-m, were compared with measurement data. The kappa-mu distribution was found to provide the best fit over a range of on-body links while the user was mobile. The experiments were carried out in two dissimilar indoor environments at opposite ends of the multipath spectrum. It was also found that the uniform phase distribution did not arise in any of the experiments.
Abstract:
Context: Shared care models integrating family physician services with interdisciplinary palliative care specialist teams are critical to improve access to quality palliative home care and to address multiple domains of end-of-life issues and needs. Objectives: To examine the impact of a shared care pilot program on the primary outcomes of symptom severity and emotional distress (patient and family separately) over time and, secondarily, the concordance between patient preferences and place of death. Methods: An inception cohort of patients (n = 95) with advanced, progressive disease, expected to die within six months, was recruited from three rural family physician group practices (21 physicians) and followed prospectively until death or the end of the pilot. Serial measurements of symptoms, emotional distress (patient and family), and preferences for place of death were performed, with changes in distress outcomes analyzed using t-tests and general linear models. Results: Symptoms trended toward improvement, with a significant reduction in anxiety noted from baseline to day 14. Symptom and emotional distress were maintained below high severity (7-10), and a high rate of home death compared with population norms was observed. Conclusion: Future controlled studies with comparison groups are needed to examine outcomes for shared care models. Shared care models build on family physician capacity and as such are promising for the development of palliative home care programs that improve access to quality palliative home care and foster health system integration. © 2011 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
Abstract:
This paper presents a case study of a PMU application with PSS support in a real large-scale Chinese power system to suppress inter-area oscillations. The paper uses PMU-measured feedback signals as the PSS input signal for dynamic torque analysis (DTA). A mathematical model of the multi-machine power system is described, followed by the formulation of the residue and DTA indices. Simulations with a large-scale power system model are used to demonstrate the role of the PSS and the equivalence of the DTA and residue indices.
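For readers unfamiliar with residue indices, the sketch below computes modal residues for a linearized system dx/dt = Ax + Bu, y = Cx, where the residue of mode i for an input/output pair is (C·v_i)(w_i·B), with v_i and w_i the right and left eigenvectors of A. The 4-state matrix and channel vectors are toy stand-ins, not the paper's system model.

```python
import numpy as np

A = np.array([[ 0.0,   1.0,  0.0,   0.0 ],
              [-1.2,  -0.1,  0.3,   0.0 ],
              [ 0.0,   0.0,  0.0,   1.0 ],
              [ 0.4,   0.0, -0.9,  -0.05]])   # toy linearized state matrix
B = np.array([0.0, 1.0, 0.0, 0.0])            # assumed PSS input channel
C = np.array([0.0, 0.0, 0.0, 1.0])            # assumed measured feedback signal

eigvals, V = np.linalg.eig(A)                 # columns of V: right eigenvectors
W = np.linalg.inv(V)                          # rows of W: left eigenvectors
for lam, v, w in zip(eigvals, V.T, W):
    residue = (C @ v) * (w @ B)               # controllability x observability
    print(f"mode {lam:.3f}: |residue| = {abs(residue):.4f}")
```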
Abstract:
The inertia of fixed-speed wind turbine generators (WTGs) helps to mitigate under-frequency transients, promotes fault ride-through and damps inter-area oscillations. It is therefore important to quantify this inertia. The authors use measured wind farm responses during under-frequency transients to provide this information. They discuss the extent of the data and the criteria used to select certain events for further analysis. The estimation of WTG inertia is based on an induction generator model. The basis of the model is described. The manner in which the model is applied to estimate the inertia from the measured data is then explained. Finally, the implications of the results for power system operation are assessed.
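The standard swing-equation relationship behind this kind of estimate (a generic sketch, not necessarily the paper's induction-generator formulation): the per-unit inertial power injection during a frequency ramp satisfies ΔP ≈ -2H(df/dt)/f_n, so H follows from the measured power change and rate of change of frequency. All numbers below are hypothetical.

```python
F_NOM = 50.0  # Hz, nominal system frequency

def estimate_inertia(delta_p_pu: float, dfdt_hz_per_s: float) -> float:
    """Inertia constant H (s) from per-unit power change and measured df/dt."""
    return -delta_p_pu * F_NOM / (2.0 * dfdt_hz_per_s)

# e.g. a 0.04 pu power surge observed while frequency falls at 0.5 Hz/s
H = estimate_inertia(delta_p_pu=0.04, dfdt_hz_per_s=-0.5)
print(f"estimated inertia constant H = {H:.1f} s")   # 2.0 s for these inputs
```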
Abstract:
Experiments were undertaken to characterize a noninvasive, chronic model of nasal congestion in which nasal patency is measured using acoustic rhinometry. Compound 48/80 was administered intranasally to elicit nasal congestion in five beagle dogs, either by syringe (0.5 ml) in thiopental sodium-anesthetized animals or as a mist (0.25 ml) in the same animals in the conscious state. The effects of mast cell degranulation on nasal cavity volume, as well as on minimal cross-sectional area (Amin) and the intranasal distance to Amin (Dmin), were studied. Compound 48/80 caused a dose-related decrease in nasal cavity volume and Amin together with a variable increase in Dmin. Maximal responses were seen at 90-120 min. Compound 48/80 was less effective in producing nasal congestion in conscious animals, which also had significantly larger basal nasal cavity volumes. These results demonstrate the utility of acoustic rhinometry for measuring parameters of nasal patency in dogs and suggest that this model may prove useful in studies of the actions of decongestant drugs.