883 results for 300802 Wildlife and Habitat Management
Abstract:
The World Wide Web provides the opportunity for a radically changed and much more efficient communication process for scientific results. A survey in the closely related domains of construction information technology and construction management was conducted in February 2000, aimed at measuring to what extent these opportunities are already changing scientific information exchange and how researchers feel about the changes. The paper presents the results, based on 236 replies to an extensive Web-based questionnaire. 65% of the respondents stated their primary research interest as IT in A/E/C and 20% as construction management and economics. The questions dealt with how researchers find, access and read different sources; how many and which publications they read; how often and to which conferences they travel; how much they publish; and what criteria determine where they eventually decide to publish. Some of the questions contrasted traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download from the Web half of the material that they read. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's website. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available entirely free on the Web, where the costs could be covered by, for instance, professional societies or the publishing university. The shift that the Web is causing seems to be towards "just in time" reading of literature. Also, frequent users of the Web rely less on scientific publications and tend to read fewer articles. If available with little effort, papers published in traditional journals are preferred; if not, the papers should be on the Web.
In these circumstances, the role of paper-based journals published by established publishers is shifting from the core "information exchange" to the building of authors' prestige. The respondents feel they should build up their reputations by publishing in journals and relevant conferences, but then make their work freely available on the Web.
Abstract:
Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require accurate volatility estimates. The task has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options exhibit two patterns: the volatility smirk (skew) and the volatility term structure, which, when examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of the options markets' empirical regularities. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. The constant-volatility model fails to explain the variation in the rich IVS. It is then found that three factors can explain about 69-88% of the variance in the IVS: on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model of the contemporaneous asymmetric return-volatility relationship, generalizing the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distribution, which increases monotonically from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates this relationship at the upper quantiles.
Additionally, the asymmetric relationship is more pronounced for the smirk-adjusted (skew-adjusted) volatility index measure than for the old volatility index measure. The volatility indices rank in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are then backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for daily VaR forecasts for a portfolio of the DAX30 index, and implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, string market model calibration results show that the model can efficiently reproduce (and forecast) the volatility surface of each swaptions market.
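The quantile regressions in the second essay minimize the pinball (check) loss rather than squared error, which is what lets them trace the return-volatility relationship at the 95% quantile instead of only at the mean. A minimal numpy sketch of that objective (illustrative data and function names only, not the thesis's estimator): minimizing the check loss over a constant recovers the empirical quantile.

```python
import numpy as np

def pinball_loss(u, tau):
    """Check (pinball) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def constant_quantile_fit(y, tau):
    """Brute-force the constant c minimizing sum of rho_tau(y - c).
    The minimizer always lies at one of the data points."""
    grid = np.sort(y)
    losses = [pinball_loss(y - c, tau).sum() for c in grid]
    return grid[int(np.argmin(losses))]

rng = np.random.default_rng(0)
y = rng.normal(size=1001)
for tau in (0.5, 0.95):
    c = constant_quantile_fit(y, tau)
    # the minimizer coincides (up to one order statistic) with the tau-quantile
    assert abs(c - np.quantile(y, tau)) < 0.05
```

Replacing the constant with a linear function of returns gives the quantile regression used in the essay; OLS, by contrast, minimizes squared error and so can only describe the conditional mean.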
Abstract:
In this thesis we deal with the concept of risk. The objective is to bring together, and draw conclusions from, some normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets as rising or falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. The serial dependency behaves differently in bear and bull markets, however: it is strongly positive in rising markets, whereas in bear markets it is closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. The results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates, and that lower sampling frequency and non-constant volatility increase the variation of the error between the estimated variance and the variance of the underlying process. From these essays we conclude that volatility is not easily estimated, even from high-frequency data: it is not well behaved in terms of either stability or dependency over time. Based on these observations, we recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe.
In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
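The realized-variance estimator studied in essays two and three can be sketched in a few lines: sum the squared intraday returns over a day. A minimal simulation under constant volatility (an assumed setup for illustration, not the thesis's German stock data) shows the estimator recovering the true volatility when sampling is frequent; the microstructure effects discussed above bias or inflate exactly this quantity.

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 0.2          # true annualized volatility of the underlying process
n_intraday = 390     # e.g. one return per minute of a trading day
dt = 1.0 / (252 * n_intraday)

# simulate one day of intraday log-returns with zero drift
r = rng.normal(0.0, sigma * np.sqrt(dt), size=n_intraday)

# realized variance: sum of squared intraday returns, then annualize
rv_daily = np.sum(r ** 2)
rv_annualized = rv_daily * 252

# with frequent sampling, sqrt(RV) is close to the true volatility
assert abs(np.sqrt(rv_annualized) - sigma) < 0.05
```

Lowering `n_intraday`, or adding autocorrelation to `r`, reproduces the error inflation and bias that the third essay quantifies.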
Abstract:
The movement and habitat utilization patterns were studied in an Asian elephant population during 1981-83 within a 1130 km2 area in southern India (11° 30' N to 12° 0' N and 76° 50' E to 77° 15' E). The study area encompasses a diversity of vegetation types, from dry thorn forest (250-400 m) through deciduous forest (400-1400 m) to stunted evergreen shola forest and grassland (1400-1800 m). Home range sizes of some identified elephants were between 105 and 320 km2. Based on the dry season distribution, five different elephant clans, each consisting of between 50 and 200 individuals and having overlapping home ranges, could be defined within the study area. Seasonal habitat preferences were related to the availability of water and the palatability of food plants. During the dry months (January-April) elephants congregated at high densities of up to five individuals per km2 in river valleys, where browse plants had a much higher protein content than the coarse tall grasses on hill slopes. With the onset of rains of the first wet season (May-August) they dispersed over a wider area at lower densities, largely into the tall grass forests, to feed on the fresh grasses, which then had a high protein value. During the second wet season (September-December), when the tall grasses became fibrous, they moved into lower-elevation short grass open forests. The normal movement pattern could be upset during years of adverse environmental conditions. However, the movement pattern of elephants in this region has not basically changed for over a century, as inferred from descriptions recorded during the nineteenth century.
Abstract:
This paper reviews integrated economic and ecological models that address impacts of and adaptation to climate change in the forest sector. Early economic model studies considered forests as one of many possible impacts of climate change, while ecological model studies tended to limit the economic impacts to fixed-price assumptions. More recent studies include broader representations of both systems, but there are still few studies that can be regarded as fully integrated. Full integration of ecological and economic models is needed to address forest management under climate change appropriately. The conclusion so far is that there are vast uncertainties about how climate change affects forests. This is partly due to the limited knowledge about the global implications of the social and economic adaptation to the effects of climate change on forests.
Abstract:
One of the most challenging tasks in building language resources is copyright license management. There are several reasons for this. First of all, the current European copyright system is designed to a large extent to satisfy commercial actors, e.g. publishers and record companies. This means that the scope and duration of the rights are very extensive, and there are even certain forms of protection that do not exist elsewhere in the world, e.g. the database right. On the other hand, the exceptions for research and teaching are typically very narrow.
Abstract:
Small mammals were sampled in two natural habitats (montane stunted evergreen forest and montane grassland) and four anthropogenic habitats (tea, wattle, bluegum and pine plantation) in the Upper Nilgiris in southern India. Of the species trapped, eight were in montane evergreen forests and three were in other habitats. Habitat discrimination was studied in the rodents Rattus rattus and Mus famulus and the shrew Suncus montanus in the montane forest habitat. Multivariate tests on five variables (canopy cover, midstorey density, ground cover, tree density, canopy height) showed that R. rattus uses areas of higher tree density and lower canopy cover. Suncus montanus and M. famulus use habitat with higher tree density and ground cover and lower canopy height. Multivariate tests did not discriminate habitat use between the species. Univariate tests, however, showed that M. famulus uses areas of higher tree density than R. rattus and S. montanus. Rattus rattus was the dominant species in the montane forest, comprising 60.9% of total density, while the rodent Millardia meltada was the dominant species in the grassland. Studies of spatial interaction between these two species in habitats where they coexisted showed neither overlap nor avoidance between the species. Rattus rattus, however, did use areas of lower ground cover than did M. meltada. The analysis of spatial interactions between the species, habitat discrimination and use, and the removal experiments suggest that interspecific competition may not be a strong force in structuring these small mammal communities. There are distinct patterns in the use of different habitats by some species, but microhabitat selection and segregation are weak. Other factors such as intraspecific competition may play a more important role in these communities.
Abstract:
This paper describes a dynamic voltage and frequency management (DVFM) control scheme for a 256 × 64 SRAM block for reducing energy in active mode and stand-by mode. The DVFM control system monitors the external clock and changes the supply voltage and the body bias so as to achieve a significant reduction in energy. A behavioral model of the proposed DVFM control algorithm is described and simulated in HDL, using delay and energy parameters obtained through SPICE simulation. The frequency range dictated by an external controller is 100 MHz to 1 GHz. The supply voltage of the complete memory system is varied in steps of 50 mV over the range of 500 mV to 1 V. The threshold voltage range of operation is ±100 mV around the nominal value, achieving an 83.4% energy reduction in the active mode and 86.7% in the stand-by mode. The paper also proposes an energy replica that is used in the energy monitor subsystem of the DVFM system.
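The magnitude of the savings is consistent with first-order dynamic-power scaling, where switching energy per operation grows quadratically with supply voltage (E ≈ C·Vdd²). A back-of-the-envelope sketch with illustrative constants (not the paper's SPICE-derived parameters; the paper's larger figures also reflect body-bias adjustment, which this first-order model omits):

```python
def dynamic_energy(c_eff, vdd):
    """First-order CMOS switching energy per operation: E = C_eff * Vdd^2."""
    return c_eff * vdd ** 2

C_EFF = 1e-12  # effective switched capacitance in farads (illustrative)

e_high = dynamic_energy(C_EFF, 1.0)   # Vdd = 1 V
e_low = dynamic_energy(C_EFF, 0.5)    # Vdd = 0.5 V

# halving Vdd cuts dynamic energy per operation by 75%
reduction = 1.0 - e_low / e_high
assert abs(reduction - 0.75) < 1e-12
```

This is why a DVFM controller that lowers Vdd whenever the external clock allows it recovers most of its energy savings from the voltage term rather than the frequency term.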
Abstract:
Interactions of the major activities involved in airfleet operations, maintenance, and logistics are investigated in the framework of closed queuing networks with a finite number of customers. The system is viewed at three levels, namely: operations at the flying-base, maintenance at the repair-depot, and logistics for subsystems, together with their interactions in achieving the system objectives. Several performance measures (e.g., availability of aircraft at the flying-base, mean number of aircraft on the ground at different stages of repair, utilization of repair facilities, and mean time an aircraft spends in various stages of repair) can easily be computed in this framework. At the subsystem level the quantities of interest are the unavailability (probability of stockout) of a spare and the duration of its unavailability. Repair-depot capability is affected by the unavailability of a spare, which, in turn, adversely affects the availability of aircraft at the flying-base level. Examples illustrate the utility of the proposed models.
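The simplest instance of a finite-customer closed queue of this kind is the classic machine-repairman model, which already yields availability-style measures like those listed above. A hedged sketch (one exponential repair server and illustrative rates, a deliberate simplification of the paper's multi-level network):

```python
from math import factorial

def machine_repairman(K, lam, mu):
    """Steady-state of K aircraft, each failing at rate lam, with one
    repair server of rate mu. Returns (distribution over number failed,
    per-aircraft availability) from the birth-death balance equations."""
    # unnormalized probabilities: pi_n ∝ K!/(K-n)! * (lam/mu)^n
    w = [factorial(K) // factorial(K - n) * (lam / mu) ** n
         for n in range(K + 1)]
    total = sum(w)
    pi = [x / total for x in w]
    # mean number operational divided by K = availability of an aircraft
    availability = sum((K - n) * p for n, p in enumerate(pi)) / K
    return pi, availability

pi, avail = machine_repairman(K=5, lam=0.1, mu=1.0)
assert abs(sum(pi) - 1.0) < 1e-9
assert 0.0 < avail < 1.0
```

Raising `lam` (more frequent failures) or adding stockout delays at the spares level lowers `avail`, mirroring the flying-base/repair-depot/logistics coupling described in the abstract.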
Abstract:
In this paper, we study duty cycling and power management in a network of energy harvesting sensor (EHS) nodes. We consider a one-hop network in which K EHS nodes send data to a destination over a wireless fading channel. The goal is to find the optimum duty cycling and power scheduling across the nodes that maximize the average sum data rate, subject to energy neutrality at each node. We adopt a two-stage approach to simplify the problem. In the inner stage, we solve the problem of optimal duty cycling of the nodes, subject to the short-term power constraint set by the outer stage. The outer stage sets the short-term power constraints on the inner stage to maximize the long-term expected sum data rate, subject to long-term energy neutrality at each node. Albeit suboptimal, our solutions turn out to have a surprisingly simple form: the duty cycle allotted to each node by the inner stage is simply that node's allotted power as a fraction of the total allotted power. The total power allotted is a clipped version of the sum harvested power across all the nodes. The average sum throughput thus ultimately depends only on the sum harvested power and its statistics. We illustrate the performance improvement offered by the proposed solution compared to other naive schemes via Monte Carlo simulations.
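The reported structure of the solution (duty cycle equal to fractional allotted power, total power a clipped sum of harvested power) can be sketched directly. In this hedged illustration the per-node power split is simply taken proportional to each node's harvest, an assumption for concreteness rather than the paper's optimal outer-stage policy; the clipping threshold is likewise made up:

```python
def allocate(harvested, p_max):
    """Sketch of the reported allocation structure:
    total power = clipped sum of harvested power;
    duty_i = node i's allotted power / total allotted power."""
    p_sum = min(sum(harvested), p_max)       # clipped sum power
    total = sum(harvested)
    # assumed split: each node's budget proportional to its harvest
    powers = [p_sum * h / total for h in harvested]
    duty = [p / p_sum for p in powers]       # fractional allotted power
    return duty, powers

duty, powers = allocate([2.0, 1.0, 1.0], p_max=3.0)
assert abs(sum(duty) - 1.0) < 1e-12    # nodes time-share the channel
assert abs(sum(powers) - 3.0) < 1e-12  # total respects the clipped budget
```

Because the duty cycles always sum to one and the budget depends only on the sum harvest, the average sum throughput is a function of the sum harvested power alone, as the abstract notes.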
Abstract:
Sepsophis punctatus Beddome 1870, the only species of a monotypic genus, was described based on a single specimen from the Eastern Ghats of India. We rediscovered the species, after a gap of 137 years, based on specimens from the states of Odisha and Andhra Pradesh, India, including four specimens from close to the type locality. The holotype was studied in detail, and we present additional morphological characters of the species with details on its natural history, habitat and diet. The morphological characters of the holotype, along with those of two additional specimens collected by Beddome, are compared with those of the specimens collected by us. We also briefly discuss the distribution of other members of the subfamily Scincinae and their evolutionary affinities.
Abstract:
Multi-GPU machines are being increasingly used in high-performance computing. Each GPU in such a machine has its own memory and does not share an address space with either the host CPU or the other GPUs. Hence, applications utilizing multiple GPUs have to manually allocate and manage data on each GPU. Existing works that propose to automate data allocation for GPUs have limitations and inefficiencies in terms of allocation sizes, exploiting reuse, transfer costs, and scalability. We propose a scalable and fully automatic data allocation and buffer management scheme for affine loop nests on multi-GPU machines, called the Bounding-Box-based Memory Manager (BBMM). At runtime, BBMM can perform standard set operations such as union, intersection, and difference, and find subset and superset relations, on hyperrectangular regions of array data (bounding boxes). It uses these operations, along with some compiler assistance, to identify, allocate, and manage the data required by applications in terms of disjoint bounding boxes. This allows it to (1) allocate exactly or nearly as much data as is required by the computations running on each GPU, (2) efficiently track buffer allocations, and hence maximize data reuse across tiles and minimize data transfer overhead, and (3) as a result, maximize utilization of the combined memory on multi-GPU machines. BBMM can work with any choice of parallelizing transformations, computation placement, and scheduling schemes, whether static or dynamic. Experiments run on a four-GPU machine with various scientific programs showed that BBMM reduces data allocation on each GPU by up to 75% compared to current allocation schemes, yields performance of at least 88% of that of manually written code, and allows excellent weak scaling.
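The hyperrectangle set operations BBMM relies on are straightforward to sketch. A minimal, hypothetical illustration (not BBMM's actual implementation) of intersection and subset tests on bounding boxes, which is the core of deciding whether a tile's data is already resident on a GPU:

```python
class Box:
    """Hyperrectangle as per-dimension [lo, hi] integer index ranges."""
    def __init__(self, lo, hi):
        self.lo, self.hi = list(lo), list(hi)

    def intersect(self, other):
        """Intersection of two boxes, or None if they are disjoint."""
        lo = [max(a, b) for a, b in zip(self.lo, other.lo)]
        hi = [min(a, b) for a, b in zip(self.hi, other.hi)]
        if any(l > h for l, h in zip(lo, hi)):
            return None
        return Box(lo, hi)

    def contains(self, other):
        """True if `other` is a subset of this box."""
        return all(a <= b for a, b in zip(self.lo, other.lo)) and \
               all(a >= b for a, b in zip(self.hi, other.hi))

a = Box([0, 0], [10, 10])
b = Box([5, 5], [15, 15])
c = a.intersect(b)
assert c.lo == [5, 5] and c.hi == [10, 10]
assert a.contains(c) and b.contains(c)
assert a.intersect(Box([20, 20], [30, 30])) is None
```

A buffer manager in this style keeps a set of disjoint resident boxes per GPU: a subset hit means the data is reusable in place, a partial intersection means only the non-overlapping region needs transferring.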