918 results for Measure of riskiness
Abstract:
Background: Respiratory irregularity has previously been reported in patients with panic disorder using time domain measures. However, the respiratory signal is not entirely linear, and a few previous studies used approximate entropy (APEN), a measure of the regularity of a time series. We have been studying APEN and other nonlinear measures, including a measure of chaos, the largest Lyapunov exponent (LLE) of heart rate time series, in some detail. In this study, we used these measures of respiration to compare normal controls (n = 18) and patients with panic disorder (n = 22), in addition to the traditional time domain measures of respiratory rate and tidal volume. Methods: The respiratory signal was obtained with the Respitrace system using a thoracic and an abdominal belt, and was digitized at 500 Hz. The time series were later constructed at 4 Hz, as the highest frequency in this signal is limited to 0.5 Hz. We used 256 s of data (1,024 points) during supine and standing postures under normal breathing and controlled breathing at 12 breaths/min. Results: APEN was significantly higher in patients in the standing posture during normal as well as controlled breathing (p = 0.002 and 0.02, respectively). LLE was also significantly higher in the standing posture during normal breathing (p = 0.009). Similarly, the time domain measures of the standard deviation and coefficient of variation (COV) of tidal volume (TV) were significantly higher in the patient group (p = 0.02 and 0.004, respectively). The frequency of sighs was also higher in the patient group in the standing posture (p = 0.02). In the standing posture, LLE (p < 0.05) as well as APEN (p < 0.01) contributed significantly toward the separation of the two groups over and beyond the linear measure, i.e. the COV of TV.
Conclusion: These findings support the previously described respiratory irregularity in patients with panic disorder. They also illustrate the utility of nonlinear measures such as APEN and LLE as additional tools toward a better understanding of the abnormalities of respiratory physiology in similar patient populations, since the correlations between LLE, APEN and the time domain measures explained only up to 50-60% of the variation. Copyright (C) 2002 S. Karger AG, Basel.
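The approximate entropy statistic used in the abstract above is a standard regularity measure for time series. A minimal sketch (not the authors' implementation; the conventional defaults m = 2 and tolerance r = 0.2 times the series' standard deviation are assumptions) could look like this:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D time series.

    Higher values indicate a more irregular series. m is the embedding
    (template) length; r is the similarity tolerance, defaulting to the
    conventional 0.2 * standard deviation of the series.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def phi(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-coordinate) distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r of each template
        # (self-matches included, so the fraction is never zero).
        frac = np.mean(dist <= r, axis=1)
        return np.mean(np.log(frac))

    return phi(m) - phi(m + 1)
```

A regular signal such as a sine wave yields a low ApEn, while white noise yields a high one; the abstract's finding of higher APEN in patients corresponds to more irregular breathing.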
Abstract:
Fragility is viewed as a measure of the loss of rigidity of a glass structure above its glass transition temperature. It is attributed to the weakness of directional bonding and to the presence of a high density of low-energy configurational states. An a priori fragility function of electronegativities and bond distances is proposed which quite remarkably reproduces the entire range of reported fragilities and demonstrates that the fragility of a melt is indeed encrypted in the chemistry of the parent material. It has also been shown that the use of fragility-modified activation barriers in the Arrhenius function accounts for the whole gamut of viscosity behavior of liquids. It is shown that fragility can serve as a universal scaling parameter to collapse all viscosity curves onto a master plot.
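For context, the kinetic fragility referred to above is conventionally defined by Angell's steepness index (standard glass-science background, not the paper's a priori function), and a fragility-modified Arrhenius law makes the activation barrier temperature dependent:

```latex
% Angell's kinetic fragility (steepness index), evaluated at the glass transition:
m = \left. \frac{\mathrm{d}\,\log_{10}\eta(T)}{\mathrm{d}\,(T_g/T)} \right|_{T = T_g}
% Arrhenius form with a fragility-modified activation barrier B(T):
\eta(T) = \eta_0 \, \exp\!\left( \frac{B(T)}{k_B T} \right)
```

A strong liquid has a nearly constant B(T) (Arrhenius behavior, low m), while in a fragile liquid B(T) grows as T approaches T_g, steepening the curve on the T_g/T master plot.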
Abstract:
The sulfur atom in the substrates leads to modest enhancements in the title phenomena: these are essentially derived from favourable enthalpies of activation, the negative entropies of activation possibly indicating a measure of stereoelectronic control.
Abstract:
Tricyclic antidepressants have notable cardiac side effects, and this issue has become important due to recent reports of increased cardiovascular mortality in patients with depression and anxiety. Several previous studies indicate that serotonin reuptake inhibitors (SRIs) do not appear to have such adverse effects. Apart from the effects of these drugs on the routine 12-lead ECG, their effects on beat-to-beat heart rate (HR) and QT interval time series provide more information on the side effects related to cardiac autonomic function. In this study, we evaluated the effects of two antidepressants, nortriptyline (n = 13), a tricyclic, and paroxetine (n = 16), an SRI, on HR variability in patients with panic disorder, applying a measure of chaos, the largest Lyapunov exponent (LLE), to pre- and posttreatment HR time series. Our results show that nortriptyline is associated with a decrease in the LLE of the high-frequency (HF: 0.15-0.5 Hz) filtered series, which is most likely due to its anticholinergic effect, while paroxetine had no such effect. Paroxetine significantly decreased sympathovagal ratios, as measured by a decrease in the LLE of LF/HF. These results suggest that paroxetine appears to be safer with regard to cardiovascular effects compared to nortriptyline in this group of patients. (C) 2003 Elsevier Inc. All rights reserved.
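The LF/HF sympathovagal balance mentioned above is conventionally computed from band powers of an evenly resampled HR series. The abstract applies the LLE to the filtered series; as background, a minimal periodogram-based sketch of the plain spectral ratio (the function name and defaults are illustrative, using the abstract's HF band of 0.15-0.5 Hz and the conventional LF band of 0.04-0.15 Hz, not the authors' LLE-based method):

```python
import numpy as np

def lf_hf_ratio(hr, fs=4.0, lf=(0.04, 0.15), hf=(0.15, 0.5)):
    """LF/HF band-power ratio of an evenly resampled heart-rate series.

    lf and hf are (low, high) band edges in Hz; fs is the sampling rate.
    """
    x = np.asarray(hr, dtype=float)
    x = x - x.mean()                          # remove the DC component
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2       # unnormalised periodogram
    lf_power = power[(freqs >= lf[0]) & (freqs < lf[1])].sum()
    hf_power = power[(freqs >= hf[0]) & (freqs < hf[1])].sum()
    return lf_power / hf_power
```

A ratio above 1 indicates dominance of low-frequency (sympathetically influenced) power; a decrease in the ratio, as reported for paroxetine, indicates a shift toward vagal dominance.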
Abstract:
MEMS resonators have potential applications in areas such as RF MEMS, clock oscillators and ultrasound transducers. The important characteristics of a resonator are its resonant frequency and Q-factor (a measure of damping). Usually, the large damping in macro structures makes it difficult to excite and measure their higher modes. In contrast, MEMS resonators seem amenable to excitation in higher modes. In this paper, 28 modes of vibration of an electrothermal actuator are experimentally captured, perhaps the highest number of modes experimentally captured so far. We verify these modes with FEM simulations and report that all the measured frequencies are within 5% of the theoretically predicted values.
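A common way to extract the Q-factor mentioned above from a measured resonance peak is the half-power (-3 dB) bandwidth, Q = f0/(f2 - f1). A generic sketch (not the authors' procedure; `q_factor` is a hypothetical helper assuming a single, well-resolved peak):

```python
import numpy as np

def q_factor(freqs, amplitude):
    """Estimate Q from an amplitude-vs-frequency resonance curve
    via the half-power bandwidth: Q = f0 / (f2 - f1)."""
    f = np.asarray(freqs, dtype=float)
    a = np.asarray(amplitude, dtype=float)
    i0 = np.argmax(a)                         # resonance peak index
    half = a[i0] / np.sqrt(2.0)               # -3 dB amplitude level
    left = f[:i0 + 1][a[:i0 + 1] >= half]     # lower half-power crossing
    right = f[i0:][a[i0:] >= half]            # upper half-power crossing
    return f[i0] / (right[-1] - left[0])
```

For a lightly damped resonator the half-power bandwidth is approximately f0/Q, so the estimate recovers the underlying damping directly from the measured sweep.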
Abstract:
The efficient operation of a single-source, single-sink wireless network is considered with the diversity-multiplexing gain tradeoff (DMT) as the measure of performance. Whereas in the case of a point-to-point MIMO channel the DMT is determined by the fading statistics, in the case of a network the DMT is additionally a function of the time schedule according to which the network is operated, as well as of the protocol that dictates the mode of operation of the intermediate relays. In general, it is at present only possible to provide upper bounds on the DMT of the network in terms of the DMT of the MIMO channels appearing across cuts in the network. This paper presents a tutorial overview of the DMT of half-duplex multi-hop wireless networks that also attempts to identify, where possible, codes that achieve the DMT. For example, it is shown how one can construct codes that achieve the DMT of a network under a given schedule and either an amplify-and-forward or decode-and-forward protocol. Also contained in the paper are discussions on the DMT of the multiple-access channel as well as the impact of feedback on the DMT of a MIMO channel.
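For reference, the point-to-point baseline against which the network bounds are stated is the Zheng-Tse DMT of an m x n i.i.d. Rayleigh MIMO channel (standard background, quoted here for context):

```latex
% Optimal DMT of an m x n i.i.d. Rayleigh MIMO channel (Zheng-Tse):
% d^*(r) is the piecewise-linear curve joining the corner points
\bigl(r,\, d^{*}(r)\bigr) = \bigl(k,\, (m-k)(n-k)\bigr),
\qquad k = 0, 1, \ldots, \min(m, n).
```

At multiplexing gain r = 0 this gives the full diversity mn, and the diversity falls to zero at the maximal multiplexing gain r = min(m, n).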
Abstract:
Freshwater ecosystems vary in size and composition and contain a wide range of organisms which interact with each other and with the environment. These interactions between organisms and the environment include nutrient cycling, biomass formation and transfer, maintenance of the internal environment and exchanges with the external environment. The range of organisms present in aquatic communities determines the generation and transfer of biomass, which defines and characterises the system. These organisms have distinct roles as they occupy particular trophic levels, forming an interconnected system in a food chain. The availability of resources and competition primarily determine the balance of individual species within the food web, which in turn influences the variety and proportions of the different organisms, with important implications for the overall functioning of the system. This dynamic and diverse relationship determines the physical, chemical and biological elements across spatial and temporal scales in the aquatic ecosystem, which can be recorded by regular inventorying and monitoring to maintain the integrity of, and conserve, the ecosystem. Regular environmental monitoring, particularly water quality monitoring, allows us to detect, assess and manage the overall impacts on rivers. The appreciation of water quality is in constant flux. Water quality assessments derived through biotic indices, i.e. assessments based on observations of the resident floral and faunal communities, have gained importance in recent years. Biological evaluations provide a description of the water quality that is often not achievable from elemental analyses alone. A biological indicator (or bioindicator) is a taxon or group of taxa selected based on its sensitivity to a particular attribute, and then assessed to make inferences about that attribute. In other words, bioindicators are a substitute for directly measuring abiotic features or other biota.
Bioindicators are evaluated through presence or absence, condition, relative abundance, reproductive success, community structure (i.e. composition and diversity), community function (i.e. trophic structure), or any combination thereof. Biological communities reflect overall ecological integrity by integrating various stresses, thus providing a broad measure of their synergistic impacts. Aquatic communities, both plants and animals, integrate and reflect the effects of chemical and physical disturbances that occur over extended periods of time. Monitoring procedures based on the biota measure the health of a river and the ability of aquatic ecosystems to support life, as opposed to simply characterising the chemical and physical components of a particular system. This is the central purpose of assessing the biological condition of the aquatic communities of a river. Diatoms (Bacillariophyceae), blue-green algae (Cyanophyceae), green algae (Chlorophyceae) and red algae (Rhodophyceae) are the main groups of algae in flowing water. These organisms are widely used as biological indicators of environmental health in aquatic ecosystems because algae occupy the most basic level in the transfer of energy through natural aquatic systems. The distribution of algae in an aquatic ecosystem is directly related to fundamental physical, chemical and biological factors. Soft algae (all the algal groups except diatoms) have also been used as indicators of biological integrity, but they may be less effective than diatoms in this respect due to their highly variable morphology. The diatoms (Bacillariophyceae) comprise a ubiquitous, highly successful and distinctive group of unicellular algae whose most obvious distinguishing characteristic is their siliceous cell walls (frustules). The photosynthetic organisms living within the photic zone are responsible for about one-half of global primary productivity.
The most successful of these organisms are thought to be photosynthetic prokaryotes (cyanobacteria and prochlorophytes) and a class of eukaryotic unicellular algae known as diatoms. Diatoms are likely to have arisen around 240 million years ago following an endosymbiotic event between a red eukaryotic alga and a heterotrophic flagellate related to the Oomycetes. The importance of algae to riverine ecology is easily appreciated when one considers that they are primary producers that convert inorganic nutrients into biologically active organic compounds while providing physical habitat for other organisms. As primary producers, algae transform solar energy into food from which many invertebrates obtain their energy. Algae also transform inorganic nutrients, such as atmospheric nitrogen, into organic forms such as ammonia and amino acids that can be used by other organisms. Algae stabilise the substrate and create mats that form structural habitats for fish and invertebrates. Algae are a source of organic matter and provide habitat for other organisms such as non-photosynthetic bacteria, protists, invertebrates and fish. The crucial role of algae in stream ecosystems and their excellent indicator properties make them an important component of environmental studies to assess the effects of human activities on stream health. Diatoms are used as biological indicators for a number of reasons: 1. They occur in all types of aquatic ecosystems. 2. They collectively show a broad range of tolerance along a gradient of aquatic productivity, and individual species have specific water chemistry requirements. 3. They have one of the shortest generation times of all biological indicators (~2 weeks); they reproduce and respond rapidly to environmental change and provide early measures of both pollution impacts and habitat restoration. 4. Changes are reflected to a measurable extent in the assemblage composition within two to three weeks.
Abstract:
In this paper we analyze a novel Micro Opto Electro Mechanical Systems (MOEMS) race-track-resonator-based vibration sensor. In this vibration sensor, the straight portion of a race track resonator is located at the foot of a cantilever beam with a proof mass. As the beam deflects due to vibration, the stress-induced refractive index change in the waveguide located over the beam leads to a wavelength shift that provides the measure of vibration. A wavelength shift of 3.19 pm/g over a range of 280 g has been obtained for a cantilever beam of 1750 μm × 450 μm × 20 μm. The maximum (breakdown) acceleration for these dimensions is 2900 g when a safety factor of 2 is taken into account. Since the wavelength of operation is around 1.55 μm, hybrid integration of the source and detector is possible on the same substrate. The sensor is also less susceptible to noise, as the wavelength shift provides the sensor signal. This type of sensor can be used for aerospace applications and in other harsh environments with a suitable design.
Abstract:
A substantial increase in competition compels design firms to develop new products at an increasingly rapid pace. This situation pressures engineering teams to develop better products and at the same time develop products faster [1]. Continuous innovation is a key factor enabling a company to generate profit on a continued basis through the introduction of new products in the market, a prime intention of Product Lifecycle Management. Creativity, affecting a wide spectrum of business portfolios, is regarded as a crucial factor in designing products. A central goal of product development is to create products that are sufficiently novel and useful. This research focuses on the determination of the novelty of engineering products. Determining novelty is important for ascertaining the newness of a product, deciding on the patentability of a design, comparing designers' capability of solving problems and ascertaining the potential market of a product. A few attempts at measuring novelty are available in the literature [2, 3, 4], but more in-depth research is required for assessing the degree of novelty of products. A measure of novelty has been developed by which the degree of novelty of products can be ascertained, and an empirical study has been conducted to determine the validity of this method.
Solute-solute and solvent-solute interactions in solid solutions of Cu+Sn, Au+Sn and Cu+Au+Sn alloys
Abstract:
The chemical potentials of tin in its α-solid solutions with Cu, Au and Cu + Au alloys have been measured using a gas-solid equilibration technique. The variation of the excess chemical potential of tin with its composition in the alloy is related to the solute-solute repulsive interaction, while the excess chemical potential at infinite dilution of the solute is a measure of the solvent-solute interaction energies. It is shown that the solute-solute interaction is primarily determined by the concentration of (s + p) electrons in the conduction band, although the interaction energies are smaller than those predicted by either the rigid band model or calculations based on Friedel oscillations in the potential function. Finally, the variation of the solvent-solute interaction with solvent composition in the ternary system can be accounted for in terms of a quasi-chemical treatment which takes into account the clustering of the solvent atoms around the solute.
Abstract:
Natural hazards such as landslides are triggered by numerous factors such as ground movement, rock falls, slope failure, debris flows and slope instability. Changes in slope stability happen due to human intervention, anthropogenic activities, changes in soil structure, loss or absence of vegetation (changes in land cover), etc. Loss of vegetation happens when forest is fragmented by anthropogenic activities. Hence land cover mapping with forest fragmentation can provide vital information for visualising the regions that require immediate attention from a slope stability perspective. The main objective of this paper is to understand the rate of change in the forest landscape from 1973 to 2004 through multi-sensor remote sensing data analysis. The forest fragmentation index presented here is based on temporal land use information and a forest fragmentation model, in which forest pixels are classified as patch, transitional, edge, perforated and interior, giving a measure of forest continuity. The analysis, carried out for five prominent watersheds of Uttara Kannada district (Aganashini, Bedthi, Kali, Sharavathi and Venkatpura), revealed that interior forest is continuously decreasing while patch, transitional, edge and perforated forest show an increasing trend. The effect of forest fragmentation on landslide occurrence was visualised by overlaying the landslide occurrence points on the classified image and the forest fragmentation map. The increasing patch and transitional forest on hill slopes mark the areas prone to landslides, evident from field verification, indicating that deforestation is a major triggering factor for landslides. This emphasises the need for immediate conservation measures for sustainable management of the landscape. Quantifying and describing land use and land cover change and fragmentation is crucial for assessing the effect of land management policies and environmental protection decisions.
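The patch/transitional/edge/perforated/interior classes above follow the widely used window-based fragmentation model, in which each forest pixel is labelled from two moving-window statistics: the forest proportion Pf and the forest connectivity Pff. A minimal sketch of that classification rule (the thresholds are the commonly quoted ones and are an assumption, not necessarily those used in this paper):

```python
def fragmentation_class(pf, pff):
    """Classify a forest pixel from two moving-window statistics:
    pf  - proportion of forest pixels in the window,
    pff - conditional probability that a forest pixel's neighbour
          is also forest (a connectivity measure).
    """
    if pf == 1.0:
        return "interior"          # window entirely forested
    if pf < 0.4:
        return "patch"             # forest is a small minority in the window
    if pf < 0.6:
        return "transitional"
    # pf >= 0.6: forest dominates; connectivity separates the two cases
    return "perforated" if pf > pff else "edge"
```

Applying this rule per pixel over the temporal land use maps yields the fragmentation maps on which the landslide occurrence points were overlaid.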
Abstract:
Many knowledge-based systems (KBS) transform situation information into an appropriate decision using an in-built knowledge base. As knowledge in real-world situations is often uncertain, the degree of truth of a proposition provides a measure of the uncertainty in the underlying knowledge. This uncertainty can be evaluated by collecting `evidence' about the truth or falsehood of the proposition from multiple sources. In this paper we propose a simple framework for representing uncertainty using the notion of an evidence space.
Abstract:
The radius of direct attraction of a discrete neural network is a measure of the stability of the network. It is known that Hopfield networks designed using Hebb's rule have a radius of direct attraction of Omega(n/p), where n is the size of the input patterns and p is the number of them. This lower bound is tight if p is no larger than 4. We construct a family of such networks with radius of direct attraction Omega(n/sqrt(p log p)), for any p greater than or equal to 5. The techniques used to prove the result led us to the first polynomial-time algorithm for designing a neural network with maximum radius of direct attraction around arbitrary input patterns. The optimal synaptic matrix is computed using the ellipsoid method of linear programming in conjunction with an efficient separation oracle. Restrictions of symmetry and non-negative diagonal entries in the synaptic matrix can be accommodated within this scheme.
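Hebb's rule, the baseline design referred to above, builds the synaptic matrix as the sum of outer products of the stored ±1 patterns. A minimal sketch (textbook Hopfield construction, not the paper's ellipsoid-based algorithm; `hebb_matrix` and `recall` are illustrative names):

```python
import numpy as np

def hebb_matrix(patterns):
    """Hebbian synaptic matrix for a Hopfield network storing +/-1 patterns."""
    P = np.asarray(patterns, dtype=float)   # shape (p, n)
    W = P.T @ P                             # sum of outer products
    np.fill_diagonal(W, 0.0)                # no self-connections
    return W

def recall(W, x, steps=10):
    """Synchronous updates x <- sign(W x) until a fixed point is reached."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        nxt = np.sign(W @ x)
        nxt[nxt == 0] = 1.0                 # break ties toward +1
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x
```

A probe within the radius of direct attraction of a stored pattern converges back to that pattern; the abstract's result concerns how large that radius can be made as a function of n and p.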
Abstract:
In this work, we propose an approach for reducing the radiated noise from `light' fluid-loaded structures, such as vibrating structures in air. In this approach, we optimize the structure so as to minimize its dynamic compliance (defined as the input power). We show that minimizing the dynamic compliance results in substantial reductions in the radiated sound power from the structure. The main advantage of this approach is that the redesign to minimize the dynamic compliance moves the natural frequencies of the structure away from the driving frequency, thereby reducing the vibration levels of the structure, which in turn results in a reduction in the radiated sound power as an indirect benefit. Thus, the need for an acoustic analysis and the associated sensitivity analysis is completely bypassed (although, in this work, we do carry out an acoustic analysis to demonstrate the reduction in sound power levels), making the strategy efficient compared to existing strategies in the literature which try to minimize some measure of noise directly. We show the effectiveness of the proposed approach by means of several examples involving both topology and stiffener optimization, for vibrating beam, plate and shell-type structures.
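For reference, the input-power objective described above takes the standard time-averaged form under time-harmonic excitation (standard structural-acoustics background, not an equation quoted from the paper):

```latex
% Time-averaged power input by a harmonic force F(t) = \mathrm{Re}\{\hat F e^{i\omega t}\}
% with collocated velocity v(t) = \mathrm{Re}\{\hat v e^{i\omega t}\}, where \hat v = i\omega \hat u:
\Pi_{\mathrm{in}} = \tfrac{1}{2}\,\mathrm{Re}\bigl\{ \hat F\, \hat v^{*} \bigr\}
                  = \tfrac{\omega}{2}\,\mathrm{Im}\bigl\{ \hat F\, \hat u^{*} \bigr\}
```

Minimizing this quantity at the driving frequency suppresses the velocity response at the drive point, which is why the radiated sound power falls as an indirect benefit without an explicit acoustic objective.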
Abstract:
We investigate the evolution of quantum correlations in ensembles of two-qubit nuclear spin systems via nuclear magnetic resonance techniques. We use discord as a measure of quantum correlations and the Werner state as an explicit example. We first introduce different ways of measuring discord and geometric discord in two-qubit systems and then describe the following experimental studies: (a) We quantitatively measure discord for Werner-like states prepared using an entangling pulse sequence. An initial thermal state with zero discord is gradually and periodically transformed into a mixed state with maximum discord. The experimental and simulated behavior of the rise and fall of discord agree fairly well. (b) We examine the efficiency of dynamical decoupling sequences in preserving quantum correlations. In our experimental setup, the dynamical decoupling sequences preserved the traceless parts of the density matrices at high fidelity, but they could not maintain the purity of the quantum states and so were unable to keep the discord from decaying. (c) We observe the evolution of discord for a singlet-triplet mixed state during a radio-frequency spin-lock. A simple relaxation model describes the evolution of discord, and the accompanying evolution of the fidelity of the long-lived singlet state, reasonably well.