72 results for Large Mammals
Abstract:
Many terrestrial and marine systems are experiencing accelerating decline due to the effects of global change. This situation has raised concern about the consequences of biodiversity loss for ecosystem function, ecosystem service provision, and human well-being. Coastal marine habitats are a main focus of attention because they harbour high biological diversity, are among the most productive systems in the world, and experience high levels of anthropogenic interaction. Because marine biodiversity is a dynamic entity and this study was concerned with global change impacts, it focused on benthic biodiversity trends over large spatial and long temporal scales. The main aim of this project was to investigate the current extent of biodiversity of the highly diverse benthic coralligenous community in the Mediterranean Sea, detect its changes, and predict its future changes over broad spatial and long temporal scales. These marine communities are characterized by structural species with low growth rates and long life spans; they are therefore considered particularly sensitive to disturbances. For this purpose, this project analyzed permanent photographic plots over time at four locations in the NW Mediterranean Sea. The spatial scale of this study provided information on the level of species similarity between these locations, thus offering a solid background on the amount of large-scale variability in coralligenous communities, whereas the temporal scale was fundamental for determining natural variability in order to discriminate between changes due to natural factors and those related to the impact of disturbances (e.g. mass mortality events related to positive thermal anomalies, extreme catastrophic events).
This study directly addressed the challenging task of analyzing quantitative biodiversity data for these highly diverse marine benthic communities. Overall, the scientific knowledge gained through this research project will improve our understanding of the functioning of marine ecosystems and their trajectories under global change.
Abstract:
One of the first useful products from the human genome will be a set of predicted genes. Besides its intrinsic scientific interest, the accuracy and completeness of this data set is of considerable importance for human health and medicine. Though progress has been made on computational gene identification in terms of both methods and accuracy evaluation measures, most of the sequence sets in which the programs are tested are short genomic sequences, and there is concern that these accuracy measures may not extrapolate well to larger, more challenging data sets. Given the absence of experimentally verified large genomic data sets, we constructed a semiartificial test set comprising a number of short single-gene genomic sequences with randomly generated intergenic regions. This test set, which should still present an easier problem than real human genomic sequence, mimics the approximately 200-kb-long BACs being sequenced. In our experiments with these longer genomic sequences, the accuracy of GENSCAN, one of the most accurate ab initio gene prediction programs, dropped significantly, although its sensitivity remained high. Conversely, the accuracy of similarity-based programs, such as GENEWISE, PROCRUSTES, and BLASTX, was not affected significantly by the presence of random intergenic sequence, but depended on the strength of the similarity to the protein homolog. As expected, the accuracy dropped if the models were built using more distant homologs, and we were able to quantitatively estimate this decline. However, the specificities of these techniques are still rather good even when the similarity is weak, which is a desirable characteristic for driving expensive follow-up experiments.
Our experiments suggest that though gene prediction will improve with every new protein that is discovered and through improvements in the current set of tools, we still have a long way to go before we can decipher the precise exonic structure of every gene in the human genome using purely computational methodology.
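The semiartificial construction described above can be sketched as follows. This is a toy illustration only, not the authors' actual pipeline; the function names, the GC-content default, and the spacer length are illustrative assumptions:

```python
import random

def random_intergenic(length, gc=0.41):
    """Random intergenic spacer with roughly human-like GC content."""
    weights = [(1 - gc) / 2, gc / 2, gc / 2, (1 - gc) / 2]  # A, C, G, T
    return "".join(random.choices("ACGT", weights=weights, k=length))

def build_semiartificial_sequence(gene_seqs, spacer_len=20000, seed=0):
    """Concatenate single-gene genomic sequences, separating (and flanking)
    them with randomly generated intergenic regions, to mimic a BAC-sized
    test sequence."""
    random.seed(seed)
    parts = [random_intergenic(spacer_len)]
    for gene in gene_seqs:
        parts.append(gene)
        parts.append(random_intergenic(spacer_len))
    return "".join(parts)

# Toy example: three 900-bp "genes" padded into one artificial contig of
# 4 spacers x 20,000 bp + 3 genes x 900 bp = 82,700 bp
genes = ["ATGAAATGA" * 100, "ATGCCCTGA" * 100, "ATGGGGTAA" * 100]
contig = build_semiartificial_sequence(genes)
```

Gene predictors can then be run on `contig`, with accuracy scored against the known exon coordinates of the embedded genes.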
Abstract:
This report presents systematic empirical annotation of transcript products from 399 annotated protein-coding loci across the 1% of the human genome targeted by the Encyclopedia of DNA Elements (ENCODE) pilot project, using a combination of 5' rapid amplification of cDNA ends (RACE) and high-density tiling arrays. We identified previously unannotated and often tissue- or cell-line-specific transcribed fragments (RACEfrags), both 5' distal to the annotated 5' terminus and internal to the annotated gene bounds, for the vast majority (81.5%) of the tested genes. Half of the distal RACEfrags span large segments of genomic sequence away from the main portion of the coding transcript and often overlap with the upstream-annotated gene(s). Notably, at least 20% of the resultant novel transcripts have changes in their open reading frames (ORFs), most of them fusing ORFs of adjacent transcripts. A significant fraction of distal RACEfrags show expression levels comparable to those of known exons of the same locus, suggesting that they are not part of very minor splice forms. These results have significant implications concerning (1) our current understanding of the architecture of protein-coding genes; (2) our views on locations of regulatory regions in the genome; and (3) the interpretation of sequence polymorphisms mapping to regions hitherto considered to be "noncoding," ultimately relating to the identification of disease-related sequence alterations.
Abstract:
We study the minimum mean square error (MMSE) and the multiuser efficiency η of large dynamic multiple access communication systems in which optimal multiuser detection is performed at the receiver, as the number and the identities of active users are allowed to change at each transmission time. The system dynamics are ruled by a Markov model describing the evolution of the channel occupancy, and a large-system analysis is performed when the number of observations grows large. Starting from the equivalent scalar channel and the fixed-point equation tying multiuser efficiency and MMSE, we extend it to the case of a dynamic channel, and derive lower and upper bounds for the MMSE (and, thus, for η as well) holding true in the limit of large signal-to-noise ratios and increasingly large observation time T.
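The fixed-point relation between multiuser efficiency and MMSE can be illustrated numerically. The sketch below is a minimal illustration of the static, Gaussian-input, equal-power case, where one common form of the equation is 1/η = 1 + β·snr·mmse(η·snr) with mmse(γ) = 1/(1 + γ) for Gaussian inputs; the function names and the load/SNR values are assumptions, and the dynamic Markov channel studied in the paper is not modeled here:

```python
def mmse_gaussian(gamma):
    """MMSE of estimating a unit-power Gaussian input over a scalar
    Gaussian channel with effective SNR gamma."""
    return 1.0 / (1.0 + gamma)

def multiuser_efficiency(beta, snr, tol=1e-12, max_iter=10000):
    """Solve 1/eta = 1 + beta * snr * mmse(eta * snr) by fixed-point
    iteration (static, Gaussian-input, equal-power case)."""
    eta = 1.0  # start from the interference-free value
    for _ in range(max_iter):
        new_eta = 1.0 / (1.0 + beta * snr * mmse_gaussian(eta * snr))
        if abs(new_eta - eta) < tol:
            return new_eta
        eta = new_eta
    return eta

# Load beta = 0.5 users per dimension at a linear SNR of 10 (i.e. 10 dB)
eta = multiuser_efficiency(beta=0.5, snr=10.0)
```

The returned η lies in (0, 1]: the degradation of each user's effective SNR caused by the presence of the other users.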
Abstract:
Large law firms seem to prefer hourly fees over contingent fees. This paper provides a moral hazard explanation for this pattern of behavior. Contingent legal fees align the interests of the attorney with those of the client, but not necessarily with those of the partnership. We show that the choice of hourly fees is a solution to an agency problem with multiple principals, where the interests of one principal (law firm) collide with the interests of the other principal (client).
Abstract:
We evaluate conditional predictive densities for U.S. output growth and inflation using a number of commonly used forecasting models that rely on a large number of macroeconomic predictors. More specifically, we evaluate how well conditional predictive densities based on the commonly used normality assumption fit actual realizations out-of-sample. Our focus on predictive densities acknowledges the possibility that, although some predictors can improve or deteriorate point forecasts, they might have the opposite effect on higher moments. We find that normality is rejected for most models in some dimension according to at least one of the tests we use. Interestingly, however, combinations of predictive densities appear to be correctly approximated by a normal density: the simple, equal average when predicting output growth and the Bayesian model average when predicting inflation.
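A standard diagnostic underlying out-of-sample density evaluations of this kind is the probability integral transform (PIT): if the normal predictive densities are correctly specified, the PITs of the realizations are i.i.d. uniform on [0, 1]. A minimal sketch, not the authors' specific tests; the function names are illustrative:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2), computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pit_values(realizations, means, stds):
    """Probability integral transforms of realizations under normal
    predictive densities; approximately uniform if the densities fit."""
    return [normal_cdf(y, m, s) for y, m, s in zip(realizations, means, stds)]

# Realizations that equal the predictive means map to PIT = 0.5 exactly
pits = pit_values([1.0, 2.0], [1.0, 2.0], [0.5, 1.0])
```

Departures of the empirical PIT distribution from uniformity (in the tails, say) then indicate which dimension of the normal predictive density is misspecified.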
Abstract:
We study a novel class of noisy rational expectations equilibria in markets with a large number of agents. We show that, as long as noise increases with the number of agents in the economy, the limiting competitive equilibrium is well-defined and leads to non-trivial information acquisition, perfect information aggregation, and partially revealing prices, even if per capita noise tends to zero. We find that in such an equilibrium risk sharing and price revelation play different roles than in the standard limiting economy in which per capita noise is not negligible. We apply our model to study information sales by a monopolist, information acquisition in multi-asset markets, and derivatives trading. The limiting equilibria are shown to be perfectly competitive, even when a strategic solution concept is used.
Abstract:
This paper analyzes whether standard covariance matrix tests work when dimensionality is large, and in particular larger than sample size. In the latter case, the singularity of the sample covariance matrix makes likelihood ratio tests degenerate, but other tests based on quadratic forms of sample covariance matrix eigenvalues remain well-defined. We study the consistency property and limiting distribution of these tests as dimensionality and sample size go to infinity together, with their ratio converging to a finite non-zero limit. We find that the existing test for sphericity is robust against high dimensionality, but not the test for equality of the covariance matrix to a given matrix. For the latter test, we develop a new correction to the existing test statistic that makes it robust against high dimensionality.
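The kind of eigenvalue-based quadratic-form statistic this literature builds on can be illustrated with John's sphericity statistic, U = (1/p) tr[(S / ((1/p) tr S) - I)^2], which is zero exactly when the sample covariance S is proportional to the identity. A small pure-Python sketch (illustrative only; this is the classical statistic, not the paper's corrected version):

```python
def sample_covariance(X):
    """Sample covariance (1/n normalization) of data X, rows = observations."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in X) / n
             for j in range(p)] for i in range(p)]

def john_sphericity_statistic(X):
    """John's U = (1/p) * tr[(S / ((1/p) tr S) - I)^2]; zero iff the
    sample covariance is proportional to the identity."""
    S = sample_covariance(X)
    p = len(S)
    scale = sum(S[i][i] for i in range(p)) / p  # (1/p) tr S
    total = 0.0
    for i in range(p):
        for j in range(p):
            # For symmetric M, tr(M^2) is the sum of squared entries of M
            m = S[i][j] / scale - (1.0 if i == j else 0.0)
            total += m * m
    return total / p

# A perfectly spherical sample (covariance proportional to I) gives U = 0.0
U0 = john_sphericity_statistic([[1, 0], [-1, 0], [0, 1], [0, -1]])
```

Because U depends on S only through ratios of eigenvalue moments, it stays well-defined even when S is singular (p greater than n), which is why statistics of this family survive in the high-dimensional regime.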
Abstract:
The responsiveness of long-term household debt to the interest rate is a crucial parameter for assessing the effectiveness of public policies aimed at promoting specific types of saving. This paper estimates the effect of a reform of Credito Bonificado, a large program in Portugal that subsidized mortgage interest rates, on long-term household debt. The reform established a ceiling on the price of the house that could be financed through the program, and provides plausibly exogenous variation in incentives. Using a unique dataset of matched household survey data and administrative records of debt, we document a large decrease in the probability of signing a new loan after the removal of the subsidy.
Abstract:
In recent years, Spain has received unprecedented immigration flows. Between 2001 and 2006 the fraction of the population born abroad more than doubled, increasing from 4.8% to 10.8%. For Spanish provinces with above-median inflows (relative to population), immigration increased the number of high school dropouts by 24% while only increasing college graduates by 11%. We study different channels by which regional labor markets have absorbed the large increase in the relative supply of low-educated workers. We identify the exogenous supply shock using historical immigrant settlement patterns by country of origin. Using data from the Labor Force Survey and the decennial Census, we find a large expansion of employment in high-immigration regions. Disaggregating by industry, the absorption operated through large increases in the share of low-educated workers, compared to the same industry in low-immigration regions. We do not find changes in sectoral specialization. Overall, and perhaps surprisingly, the pattern of absorption is very similar to the one found in the US.
Abstract:
The paper explores an efficiency hypothesis regarding the contractual process between large retailers, such as Wal-Mart and Carrefour, and their suppliers. The empirical evidence presented supports the idea that large retailers play a quasi-judicial role, acting as "courts of first instance" in their relationships with suppliers. In this role, large retailers adjust the terms of trade to ongoing changes and sanction performance failures, sometimes by delaying payments. A potential abuse of this position is limited by the need for re-contracting and for preserving their reputations. Suppliers renew their confidence in their retailers on a yearly basis by writing new contracts. This renewal contradicts the alternative hypothesis that suppliers are expropriated by large retailers as a consequence of specific investments.
Abstract:
Using comprehensive administrative data on France's single largest financial aid program, this paper provides new evidence on the impact of large-scale need-based grant programs on the college enrollment decisions, persistence and graduation rates of low-income students. We exploit sharp discontinuities in the grant eligibility formula to identify the impact of aid on student outcomes at different levels of study. We find that eligibility for an annual cash allowance of 1,500 euros increases college enrollment rates by up to 5 percentage points. Moreover, we show that need-based grants have positive effects on student persistence and degree completion.
Abstract:
We present an approach for creating image mosaics using navigation data consisting of 3D position estimates provided by sensors, such as LBL, available in deep-water surveys. A central issue with acoustic 3D positioning is that its accuracy is far too low to composite the images with reasonable accuracy.
Abstract:
A method to determine the thermal cross section of a deep level from capacitance measurements is reported. The results enable us to explain the nonexponential behavior of the capacitance versus capture time when the trap concentration is not negligible with respect to that of the shallow level and Debye tail effects are taken into account. A figure of merit for the nonexponential behavior of the capture process is shown and discussed for different doping and applied-bias conditions. We have also considered the influence of the position of the trap level's energy on the nonexponentiality of the capture transient. Experimental results are given for the gold acceptor level in silicon and for the DX center in Al0.55Ga0.45As, and they are in good agreement with the developed theory.
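As a baseline for the nonexponential behavior discussed above, the idealized single-exponential capture kinetics usually assumed in such measurements can be written as (standard textbook form, not a result of this paper):

```latex
c_n = \sigma_n \, \langle v_{\mathrm{th}} \rangle \, n, \qquad
n_T(t) = N_T \left( 1 - e^{-c_n t} \right)
```

where \(\sigma_n\) is the thermal capture cross section, \(\langle v_{\mathrm{th}} \rangle\) the electron thermal velocity, \(n\) the free-carrier concentration, and \(N_T\) the trap concentration. The paper's point is that this single-exponential form breaks down when \(N_T\) is not negligible with respect to the shallow doping and Debye-tail effects are included.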