873 results for National Measurement System for Time and Frequency.
Abstract:
The autonomic nervous system maintains homeostasis, the state of balance in the body. That balance can be assessed simply and noninvasively by evaluating heart rate variability (HRV). However, independently of autonomic control of the heart, HRV can be influenced by other factors, such as respiratory parameters. Little is known about the relationship between HRV and spirometric indices. In this study, our objective was to determine whether HRV correlates with spirometric indices in adults without cardiopulmonary disease, considering the main confounders (e.g., smoking and physical inactivity). In a sample of 119 asymptomatic adults (age 20-80 years), we evaluated forced vital capacity (FVC) and forced expiratory volume in 1 s (FEV1). We evaluated resting HRV indices within a 5-min window in the middle of a 10-min recording period, then analyzed the time and frequency domains. To evaluate daily physical activity, we instructed participants to wear a triaxial accelerometer for 7 days. Physical inactivity was defined as <150 min/week of moderate to intense physical activity. We found that FVC and FEV1, respectively, correlated significantly with the following aspects of the RR interval series: standard deviation of the RR intervals (r = 0.31 and 0.35), low-frequency component (r = 0.38 and 0.40), and Poincaré plot SD2 (r = 0.34 and 0.36). Multivariate regression analysis, adjusted for age, sex, smoking, physical inactivity, and cardiovascular risk, identified SD2 and dyslipidemia as independent predictors of FVC and FEV1 (R² = 0.125 and 0.180, respectively). We conclude that pulmonary function is influenced by autonomic control of cardiovascular function, independently of the main confounders.
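As a concrete illustration of the time-domain and Poincaré indices named above, the following is a minimal sketch in Python, assuming only an array of RR intervals in milliseconds; the function name and the synthetic data are hypothetical, not taken from the study:

```python
import numpy as np

def hrv_indices(rr_ms):
    """Compute SDNN and Poincare SD1/SD2 from RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                       # standard deviation of RR intervals
    diff = np.diff(rr)                          # successive differences
    sd1 = np.sqrt(np.var(diff, ddof=1) / 2.0)   # short-term Poincare axis
    sd2 = np.sqrt(2.0 * np.var(rr, ddof=1) - np.var(diff, ddof=1) / 2.0)  # long-term axis
    return sdnn, sd1, sd2

# Example: a 5-min window of synthetic RR intervals around 800 ms
rng = np.random.default_rng(0)
rr = 800 + 50 * rng.standard_normal(375)
print(hrv_indices(rr))
```

The SD1/SD2 expressions follow the standard identities relating the Poincaré ellipse axes to the variances of the RR series and its successive differences.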
Abstract:
In this thesis the basic structure and operating principles of single- and multi-junction solar cells are considered and discussed. The main properties and characteristics of solar cells are briefly described. Modified equipment for measuring the quantum efficiency of multi-junction solar cells is presented. Results of experimental research on single- and multi-junction solar cells are described.
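For context, external quantum efficiency is conventionally the ratio of electrons collected to photons incident at each wavelength. A minimal sketch of that conversion, assuming measured photocurrent and incident optical power per wavelength point; the names and numbers are illustrative, not from the thesis:

```python
import numpy as np

Q = 1.602176634e-19   # elementary charge, C
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s

def external_qe(wavelength_nm, photocurrent_a, incident_power_w):
    """EQE = (electrons collected per second) / (photons incident per second)."""
    wl = np.asarray(wavelength_nm) * 1e-9                   # wavelength in m
    electrons = np.asarray(photocurrent_a) / Q              # electrons per second
    photons = np.asarray(incident_power_w) * wl / (H * C)   # photons per second
    return electrons / photons

# Example: a single wavelength point
print(external_qe([550.0], [1.2e-3], [3.0e-3]))  # ~0.9, i.e. 90% EQE
```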
Abstract:
Phospholipids in water form lamellar phases made up of alternating layers of water and bimolecular lipid leaflets. Three complementary methods, osmotic, mechanical, and vapour pressures, were used to measure the work of removing water from lamellar phases composed of frozen dipalmitoylphosphatidylcholine (DPPC), melted DPPC, egg phosphatidylethanolamine, or equimolar mixtures of DPPC and cholesterol (DPPC/CHOL). Concurrently, the structural changes that resulted from this water removal were measured using X-ray diffraction. The work was divided into that which forces the bilayers together and that which compresses the molecules together within the bilayers. A large repulsive force exists between bilayers composed of each of the lipids studied, and this force increases exponentially as bilayer separation is decreased. This repulsive force is affected by the nature of the head groups, the conformation of the acyl chains, and the heterogeneity of these chains. In general, all of the melted phosphatidylcholines (melted DPPC, egg lecithin and DPPC/CHOL) have large equilibrium separations in excess water, resulting from large repulsive hydration forces between these bilayers. By comparison, egg PE has an increased attractive force, and frozen DPPC has a decreased hydration force; each results in smaller separations in water for these two lipids. The chemical potentials of the water between the bilayers for all these lipids lie on a continuum, indicating that interbilayer water cannot be characterized by two discrete states, usually referred to as "bound" or "non-bound". For all lipids studied, a maximum of 25% of the total work done on the system goes into deforming the bilayers. The method used here to separate repulsion from deformation, developed for us by V. A. Parsegian, provides a unique means of measuring the lateral pressure of a bilayer and its modulus of deformability (Y). Lateral pressure is affected by the nature of the head group and by the conformation and heterogeneity of the acyl chains. For small changes in molecular surface area (A) near equilibrium, both melted and frozen DPPC have similar values of the deformability modulus. Thus, in this regime, it requires about the same force to change the angle of tilt of frozen chains as it does to compress the fluid bilayer. The introduction of cholesterol into bilayers of DPPC dramatically reduces the lateral pressure of the bilayers over a large range of molecular surface areas (A). The variation in the magnitude of bilayer repulsion with different phospholipids provides a basis for the mechanism of lipid segregation in mixed lipid systems and suggests that interacting heterogeneous membranes may influence or modulate the composition of the opposing membrane. The measurement of bilayer deformabilities provides a direct comparison with the properties of monolayers.
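The exponential growth of the interbilayer repulsion described above is commonly summarized as P(d) = P0·exp(-d/λ). A minimal sketch of recovering P0 and the decay length λ from pressure-separation data by a log-linear fit; the numbers below are illustrative, not measurements from the thesis:

```python
import numpy as np

def fit_hydration_force(separation_nm, pressure_pa):
    """Fit log P = log P0 - d/lambda by linear least squares."""
    d = np.asarray(separation_nm, dtype=float)
    logp = np.log(np.asarray(pressure_pa, dtype=float))
    slope, intercept = np.polyfit(d, logp, 1)
    return np.exp(intercept), -1.0 / slope   # P0 (Pa), decay length (nm)

# Illustrative data: repulsion decaying with a ~0.2 nm decay length
d = np.array([0.4, 0.8, 1.2, 1.6, 2.0])
p = 1e9 * np.exp(-d / 0.2)
print(fit_hydration_force(d, p))   # recovers ~(1e9, 0.2)
```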
Abstract:
Theory has not yet adequately described how Fermi patches in quasi-two-dimensional charge density waves (CDW) connect to superconductivity (SC). The connection between CDW and SC in the quasi-two-dimensional material CuxTiSe2 is an interesting one which might reveal mechanisms at work in unconventional superconductors. A previous Brock graduate student grew crystals of CuxTiSe2, but the precise doping of the samples was not known. In order to determine the doping parameter x in CuxTiSe2, a sensitive resistivity measurement system was necessary. A new resistivity measurement system was designed and implemented using an Infrared Labs HDL-10 He-3 cryostat. By comparison with data from the literature, the doping of two samples was investigated using the new measurement system and a Quantum Design Magnetic Property Measurement System (MPMS). The methods for determining the doping revealed that the old resistivity system could not determine the CDW transition temperature of highly doped samples, or the doping of elongated samples, because of electronic noise. The doping of one sample was found to lie between x = 0.06 and x = 0.065. The doping values obtained for the second sample showed a discrepancy, which could be explained by incorrect sample orientation.
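One common way to extract a CDW transition temperature from resistivity-versus-temperature data is to locate the peak of the resistive anomaly above a smooth background. A minimal sketch of that criterion follows; the criterion, function names, and toy data are illustrative assumptions, not the thesis' actual procedure:

```python
import numpy as np

def cdw_transition_temperature(temperature_k, resistivity_ohm_m):
    """Estimate T_CDW as the peak of the resistive anomaly above a crude linear baseline."""
    t = np.asarray(temperature_k, dtype=float)
    rho = np.asarray(resistivity_ohm_m, dtype=float)
    baseline = np.polyval(np.polyfit(t, rho, 1), t)   # crude linear background
    return t[np.argmax(rho - baseline)]

# Illustrative rho(T): metallic background plus a broad hump near 180 K
t = np.linspace(50, 300, 251)
rho = 1e-6 * (1 + 0.5 * np.exp(-((t - 180) / 30) ** 2) + 0.002 * t)
print(cdw_transition_temperature(t, rho))   # ~180 K for this toy curve
```

A noisy measurement system obscures exactly this anomaly, which is why sensitivity mattered for the highly doped samples.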
Abstract:
Sonar signal processing comprises a large number of signal processing algorithms implementing functions such as target detection, localisation, classification, tracking and parameter estimation. Current implementations of these functions rely on conventional techniques largely based on Fourier methods, which are primarily meant for stationary signals. Interestingly, the signals received by sonar sensors are often non-stationary, and hence processing methods capable of handling this non-stationarity should fare better than Fourier-transform-based methods. Time-frequency methods (TFMs) are among the best DSP tools for non-stationary signal processing, allowing signals to be analyzed in the time and frequency domains simultaneously. However, apart from the STFT, TFMs have largely been confined to academic research because of the complexity of the algorithms and the limitations of computing power. With the availability of fast processors, many applications of TFMs have been reported in the fields of speech and image processing and in biomedical applications, but few in sonar processing. A structured effort to fill this lacuna by exploring the potential of TFMs in sonar applications is the net outcome of this thesis. To this end, four TFMs have been explored in detail, viz. the Wavelet Transform, the Fractional Fourier Transform, the Wigner-Ville Distribution and the Ambiguity Function, and their potential in implementing five major sonar functions has been demonstrated with very promising results. What has been conclusively brought out in this thesis is that there is no "one best TFM" for all applications, but there is "one best TFM" for each application. Accordingly, the TFM has to be adapted and tailored in many ways in order to develop specific algorithms for each of the applications.
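To give a flavour of one of the four TFMs above, here is a minimal sketch of a discrete (pseudo) Wigner-Ville Distribution applied to an analytic linear chirp. This is a generic textbook formulation in Python, not the thesis' implementation, and the test signal is hypothetical:

```python
import numpy as np
from scipy.signal import chirp, hilbert

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution of an analytic signal.

    Row t is the DFT over lag m of x[t+m] * conj(x[t-m]); a tone at
    normalized frequency f0 therefore peaks at bin 2*f0*N."""
    x = np.asarray(x, dtype=complex)
    n = len(x)
    wvd = np.zeros((n, n))
    for t in range(n):
        lag_max = min(t, n - 1 - t)            # largest symmetric lag at time t
        lags = np.arange(-lag_max, lag_max + 1)
        kernel = np.zeros(n, dtype=complex)
        kernel[lags % n] = x[t + lags] * np.conj(x[t - lags])
        wvd[t] = np.fft.fft(kernel).real       # Hermitian kernel -> real spectrum
    return wvd                                  # rows: time, columns: frequency bins

t = np.linspace(0, 1, 256, endpoint=False)      # fs = 256 Hz
sig = hilbert(chirp(t, f0=10, t1=1.0, f1=60))   # analytic 10 -> 60 Hz linear chirp
tfr = wigner_ville(sig)
print(tfr.shape, tfr[128].argmax())             # peak near bin 70 = 2 * 35 Hz at mid-record
```

The ridge of the distribution tracks the chirp's instantaneous frequency, which is exactly the non-stationary structure that a plain Fourier spectrum averages away.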
Abstract:
Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems, or anomalies, arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows. Many methods have been devised to detect and prevent the anomalous situations that arise from buffer overflows. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to guard against buffer overflow attacks. For protection, stack guards and heap guards are also in wide use. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that a Bayesian network over the frequency variations responds effectively to induced buffer overflows. It can also help administrators to detect deviations in program flow introduced by errors.
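To make the frequency-profiling idea concrete, a minimal sketch is shown below. The L1 distance used as the anomaly score here is a simple stand-in for the dissertation's sequence-set and Bayesian scoring, and all traces and names are hypothetical:

```python
from collections import Counter

def frequency_profile(trace):
    """Relative frequency of each system call in a trace (list of call names)."""
    counts = Counter(trace)
    total = sum(counts.values())
    return {call: c / total for call, c in counts.items()}

def anomaly_score(profile_normal, trace):
    """L1 deviation of a trace's call frequencies from the normal profile.

    A large score means the trace exercises system calls in proportions
    never seen during normal operation, e.g. after a buffer overflow."""
    observed = frequency_profile(trace)
    calls = set(profile_normal) | set(observed)
    return sum(abs(observed.get(c, 0.0) - profile_normal.get(c, 0.0)) for c in calls)

normal = frequency_profile(["open", "read", "read", "write", "close"])
attack = ["open", "execve", "execve", "mmap", "mmap", "mmap"]  # hypothetical payload
print(anomaly_score(normal, attack))   # high score -> flag as anomalous
```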
Abstract:
Poor project planning, implementation and control, and the subsequent cost and time overruns, are ubiquitous features that have been posing serious concern at all levels - state, national and international. They result in wastage of the nation's scarce resources and retard socio-economic progress. Although several studies peripheral to project overruns have been made at the national level, no serious attempt has been made at the state level to identify the magnitude of overruns, their causes and their impacts on industrial projects. The present study, "Time and Cost Overruns of Industrial Projects in Kerala", is an earnest attempt to probe in depth the time and cost overruns and their impact on industrial projects. The study places emphasis on the identification of the real reasons behind the cost and time overruns. It also covers the present project management practices of industrial projects in Kerala.
Abstract:
The incorporation of space allows the establishment of a more precise relationship between a contaminating input, a contaminating byproduct, and the emissions that reach the final receptor. However, the presence of asymmetric information impedes the implementation of the first-best policy. As a solution to this problem, a site-specific deposit-refund system for the contaminating input and the contaminating byproduct is proposed. Moreover, successive optimization, first over space and then over time, enables definition of the optimal intertemporal site-specific deposit-refund system.
Abstract:
Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, one which takes advantage of computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks whether a new generation of models is needed to tackle these problems.
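For readers unfamiliar with adaptive mesh refinement, a toy one-dimensional refinement criterion is sketched below: cells are flagged wherever the local gradient exceeds a threshold, so resolution follows sharp features rather than being uniform. This is only a schematic of the idea, not a climate-model implementation; the names, threshold, and test field are hypothetical:

```python
import numpy as np

def refine_flags(x, u, threshold):
    """Flag cells whose local solution gradient exceeds a threshold.

    A toy 1D criterion; production AMR (block-structured or tree-based)
    also handles nesting, load balancing and conservative interpolation."""
    grad = np.abs(np.gradient(u, x))
    return grad > threshold

x = np.linspace(0.0, 1.0, 101)
u = np.tanh((x - 0.5) / 0.02)          # sharp front mimicking a multiscale feature
print(np.flatnonzero(refine_flags(x, u, 10.0)))   # only cells near the front are flagged
```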
Abstract:
The shamba system involves farmers tending tree saplings on state-owned forest land in return for being permitted to intercrop perennial food crops until canopy closure. At one time the system was used throughout all state-owned forest lands in Kenya, accounting for a large proportion of some 160,000 ha. The system should theoretically be mutually beneficial to both local people and the government. However, the system has had a chequered past in Kenya due to widespread malpractice and associated environmental degradation. It was last banned in 2003, but in early 2008 field trials were initiated for its reintroduction. This study aimed to: assess the benefits and limitations of the shamba system in Kenya; assess the main influences on the extent to which the limitations and benefits are realised; and consider the management and policy requirements for the system's successful and sustainable operation. Information was obtained from 133 questionnaires, using mainly open-ended questions, and six participatory workshops carried out in forest-adjacent communities on the western slopes of Mount Kenya in Nyeri district. In addition, interviews were conducted with key informants from communities and organisations. There was a strong desire amongst local people for the system's reintroduction, given that it had provided significant food, income and employment. Local perceptions of the failings of the system included, firstly, mismanagement by government or forest authorities and, secondly, abuse of the system by shamba farmers and outsiders. Improvements local people considered necessary for the shamba system to work included more accountability and transparency in administration and better rules with respect to plot allocation and stewardship. Ninety-seven percent of respondents said they would like to be more involved in management of the forest, and 80% that they were willing to pay for the use of a plot. The study concludes that the structural framework laid down by the 2005 Forests Act, which includes provision for the reimplementation of the shamba system under the new plantation establishment and livelihood improvement scheme (PELIS) [it should be noted that whilst the shamba system was re-branded in 2008 under the acronym PELIS, for the sake of simplicity the authors continue to refer to the 'shamba system' and 'shamba farmers' throughout this paper], is weakened because insufficient power is likely to be devolved to local people, casting them merely as 'forest users' and the shamba system as a 'forest user right'. In so doing, the system's potential both to facilitate and to embody the participation of local people in forest management is limited, and the long-term sustainability of the new system is questionable. Suggested instruments to address this include some degree of sharing of profits from forest timber, performance-related guarantees for farmers to gain a new plot, and the use of joint committees consisting of local people and the forest authorities for the long-term management of forests.
Abstract:
This research examines the dynamics associated with new representational technologies in complex organizations through a study of the use of a Single Model Environment and of prototyping and simulation tools in the mega-project to construct Terminal 5 at Heathrow Airport, London. The ambition of the client, BAA, was to change industrial practices, reducing project costs and time to delivery through new contractual arrangements and new digitally-enabled collaborative ways of working. The research highlights changes over time and addresses two areas of 'turbulence' in the use of: 1) technologies, where there is a dynamic tension between desires to constantly improve, change and update digital technologies and the need to standardise practices, maintaining and defending the overall integrity of the system; and 2) representations, where dynamics result from the responsibilities and liabilities associated with the sharing of digital representations and a lack of trust in the validity of data from other firms. These dynamics are tracked across three stages of this well-managed and innovative project and indicate the generic need to treat digital infrastructure as an ongoing strategic issue.
Abstract:
Tremor is a clinical feature characterized by oscillations of a part of the body. The detection and study of tremor is an important step in investigations seeking to explain the underlying control strategies of the central nervous system under natural (or physiological) and pathological conditions. It is well established that tremorous activity is composed of deterministic and stochastic components. For this reason, the use of digital signal processing (DSP) techniques which take into account the nonlinearity and nonstationarity of such signals may bring into the analysis new information that is often obscured by traditional linear techniques (e.g. Fourier analysis). In this context, this paper introduces the application of the empirical mode decomposition (EMD) and the Hilbert spectrum (HS), which are relatively new DSP techniques for the analysis of nonlinear and nonstationary time series, to the study of tremor. Our results, obtained from the analysis of experimental signals collected from 31 patients with different neurological conditions, showed that the EMD could automatically decompose acquired signals into basic components, called intrinsic mode functions (IMFs), representing tremorous and voluntary activity. The identification of a physical meaning for IMFs in the context of tremor analysis suggests an alternative and new way of detecting tremorous activity. These results may be relevant for applications requiring automatic detection of tremor. Furthermore, the energy of the IMFs was visualized as a function of time and frequency by means of the HS. This analysis showed that the variation in energy of tremorous and voluntary activity could be distinguished and characterized on the HS. Such results may be relevant for applications aiming to identify neurological disorders. In general, both the HS and the EMD proved very useful for objective analysis of any kind of tremor and can therefore potentially be used for functional assessment.
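A minimal sketch of the EMD-plus-Hilbert pipeline described above, assuming the third-party PyEMD package as one available EMD implementation; the synthetic two-component signal and all parameter values are hypothetical stand-ins, not patient data:

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # assumed third-party package (pip install EMD-signal)

fs = 100.0                               # sampling rate, Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
# Toy record: a 5 Hz "tremor" component riding on a slow "voluntary" drift
signal = 0.3 * np.sin(2 * np.pi * 0.2 * t) + 0.1 * np.sin(2 * np.pi * 5.0 * t)

imfs = EMD().emd(signal)                 # intrinsic mode functions, fastest first

for i, imf in enumerate(imfs):
    analytic = hilbert(imf)                           # analytic signal of the IMF
    amplitude = np.abs(analytic)                      # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)     # instantaneous frequency, Hz
    print(f"IMF {i}: mean freq {inst_freq.mean():5.2f} Hz, "
          f"mean amplitude {amplitude.mean():.3f}")
```

Plotting instantaneous amplitude against time and instantaneous frequency for each IMF is one way to build the Hilbert spectrum on which tremorous and voluntary energy can be distinguished.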