6 results for "removing caveat from land title"

at Duke University


Relevance: 100.00%

Abstract:

The Miyun Reservoir, the only surface water source for Beijing city, has experienced a decline in water supply in recent decades. Previous studies suggest that both land use change and climate contribute to changes in water supply in this critical watershed, but the specific causes of the decline remain debated under the non-stationary climate of the past four decades. The central objective of this study was to quantify the separate and collective contributions of land use change and climate variability to the decreasing inflow into the Miyun Reservoir during 1961–2008. Unlike previous studies of this watershed, we used a comprehensive approach to quantify the timing of changes in hydrology and associated environmental variables, drawing on long-term historical hydrometeorological records and remote-sensing-based land use records. To quantify the respective impacts of climate variation and land use change on streamflow during different sub-periods, an annual water balance model (AWB), a climate elasticity model (CEM), and a rainfall–runoff model (RRM) were employed in combination for the attribution analysis. We found a significant (p < 0.01) decrease in annual streamflow, a significant (p < 0.01) positive trend in annual potential evapotranspiration, and an insignificant (p > 0.1) negative trend in annual precipitation during 1961–2008. We identified two streamflow breakpoints, 1983 and 1999, using the sequential Mann–Kendall test and the double-mass curve. Climate variability alone did not explain the decrease in inflow to the Miyun Reservoir. The reduction in water yield was closely related to an increase in actual evapotranspiration driven by the expansion of forestland and the reduction in cropland and grassland, and was likely exacerbated by increased water consumption for domestic and industrial uses in the basin. The contribution of land use change to the observed streamflow decline fell from 64–92% during 1984–1999 to 36–58% during 2000–2008, whereas the contribution of climate variation climbed from 8–36% during 1984–1999 to 42–64% during 2000–2008. Model uncertainty analysis further demonstrated that climate warming played a dominant role in streamflow reduction in the most recent decade (i.e., the 2000s). We conclude that future climate change and variability will further challenge the capacity of the Miyun Reservoir to meet water demand. A comprehensive watershed management strategy for the basin must therefore consider climate variability alongside vegetation management.
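
The climate elasticity approach named above (the CEM) can be made concrete with a short sketch. The snippet below is not the authors' code; it implements the widely used nonparametric precipitation-elasticity estimator, eps_P = median((Q_t - Qbar)/(P_t - Pbar) * Pbar/Qbar), on hypothetical annual series, assuming only NumPy.

```python
import numpy as np

def precipitation_elasticity(streamflow, precipitation):
    """Nonparametric precipitation elasticity of streamflow:
    median of (Q_t - Qbar)/(P_t - Pbar) * Pbar/Qbar over all years."""
    q = np.asarray(streamflow, dtype=float)
    p = np.asarray(precipitation, dtype=float)
    q_mean, p_mean = q.mean(), p.mean()
    mask = p != p_mean                      # avoid division by zero
    ratios = (q[mask] - q_mean) / (p[mask] - p_mean) * (p_mean / q_mean)
    return np.median(ratios)

# Hypothetical annual precipitation and runoff series (mm), for illustration only.
p_annual = np.array([600, 640, 580, 610, 550, 590, 620, 570, 560, 600], dtype=float)
q_annual = np.array([120, 135, 105, 118,  90, 108, 125, 100,  95, 115], dtype=float)
print(precipitation_elasticity(q_annual, p_annual))  # > 1 here: runoff is sensitive to precipitation
```

An elasticity of, say, 2 would mean that a 10% drop in precipitation translates into roughly a 20% drop in streamflow, which is how an elasticity model apportions part of an observed decline to climate.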

Relevance: 100.00%

Abstract:

Scholarly publishing, and scholarly communication more generally, is based on patterns established over many decades and even centuries. Some of these patterns are clearly valuable and intimately related to core values of the academy, but others were based on the exigencies of the past, and new opportunities have brought into question whether it makes sense to persist in supporting old models. New technologies and new publishing models raise the question of how we should fund and operate scholarly publishing and scholarly communication in the future, moving away from a scarcity model based on the exchange of physical goods that restricts access to scholarly literature unless a market-based exchange takes place. This essay describes emerging models that attempt to shift scholarly communication to a more open-access, mission-based approach and that try to keep control of scholarship in the hands of academics and the institutions and scholarly societies that support them. It explores changing practices for funding scholarly journals and changing services provided by academic libraries, changes instituted with the end goal of providing more access to more readers, stimulating new scholarship, and removing inefficiencies from a system ready for change. © 2014 by the American Anthropological Association.

Relevance: 100.00%

Abstract:

Organisms in the wild develop with varying food availability. During periods of nutritional scarcity, development may slow or arrest until conditions improve. The ability to modulate developmental programs in response to poor nutritional conditions requires a means of sensing the changing nutritional environment and limiting tissue growth. The mechanisms by which organisms accomplish this adaptation are not well understood. We sought to study this question by examining the effects of nutrient deprivation on Caenorhabditis elegans development during the late larval stages, L3 and L4, a period of extensive tissue growth and morphogenesis. By removing animals from food at different times, we show here that specific checkpoints exist in the early L3 and early L4 stages that systemically arrest the development of diverse tissues and cellular processes. These checkpoints occur once in each larval stage after molting and prior to initiation of the subsequent molting cycle. DAF-2, the insulin/insulin-like growth factor receptor, regulates passage through the L3 and L4 checkpoints in response to nutrition. The FOXO transcription factor DAF-16, a major target of insulin-like signaling, functions cell-nonautonomously in the hypodermis (skin) to arrest development upon nutrient removal. The effects of DAF-16 on progression through the L3 and L4 stages are mediated by DAF-9, a cytochrome P450 ortholog involved in the production of C. elegans steroid hormones. Our results identify a novel mode of C. elegans growth in which development progresses from one checkpoint to the next. At each checkpoint, nutritional conditions determine whether animals remain arrested or continue development to the next checkpoint.

Relevance: 100.00%

Abstract:

A common challenge for users of academic databases is making sense of their query outputs for knowledge discovery, a challenge exacerbated by the size and growth of modern databases. PubMed, a central index of biomedical literature, contains over 25 million citations and can return search results containing hundreds of thousands of citations. Under these conditions, efficient knowledge discovery requires a different data structure than a chronological list of articles: a method of conveying what the important ideas are, where they are located, and how they are connected, so that users can see the underlying topical structure of their search. This paper presents VizMaps, a PubMed search interface that addresses some of these problems. Given search terms, our main backend pipeline extracts relevant words from the titles and abstracts of the returned citations and clusters them into discovered topics using Bayesian topic models, in particular Latent Dirichlet Allocation (LDA). It then outputs a visual, navigable map of the query results.
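
As a rough illustration of the topic-clustering step described above (this is not the VizMaps code; the toy corpus, topic count, and vectorizer settings are placeholder assumptions), the sketch below fits an LDA model to a few stand-in abstracts with scikit-learn and prints the top words of each discovered topic.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Placeholder "abstracts" standing in for the titles/abstracts of query results.
docs = [
    "insulin signaling regulates larval development and growth arrest",
    "streamflow decline attributed to land use change and climate variability",
    "topic models summarize large collections of biomedical literature",
    "persistent homology quantifies periodic structure in climate time series",
]

# Bag-of-words counts, dropping common English stop words.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(docs)

# Fit a small LDA model; n_components is the number of discovered topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)   # per-document topic proportions

# Show the highest-weight words in each topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```

In a VizMaps-style interface, the per-document topic proportions (doc_topics) and the per-topic word weights would then drive the layout of the navigable map.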

Relevance: 100.00%

Abstract:

Articular cartilage consists of chondrocytes and two major extracellular matrix components: a collagen-rich framework and highly abundant proteoglycans. Most prior studies defining the zonal distribution of cartilage proteins have extracted proteins with guanidine-HCl, but this leaves an unextracted, collagen-rich residue. In addition, the high abundance of anionic polysaccharide molecules extracted from cartilage adversely affects chromatographic separation. In this study, we established a method for removing chondrocytes from cartilage sections with minimal loss of extracellular matrix proteins. Adding surfactant to the guanidine-HCl extraction buffer improved protein solubility, and ultrafiltration removed interference from polysaccharides and salts. Almost four times more collagen peptides were extracted by the in situ trypsin digestion method, whereas, as expected, proteoglycans were more abundant in the guanidine-HCl extraction. These methods were used to extract cartilage sections from different cartilage layers (superficial, intermediate, and deep), joint types (knee and hip), and disease states (healthy and osteoarthritic), and the extracts were evaluated by quantitative and qualitative proteomic analyses. The results led to the identification of potential biomarkers of osteoarthritis (OA) and OA progression, as well as joint-specific biomarkers.

Relevance: 100.00%

Abstract:

Highlights of Data Expedition:

• Students explored daily observations of local climate data spanning the past 35 years.
• Topological Data Analysis, or TDA for short, provides cutting-edge tools for studying the geometry of data in arbitrarily high dimensions.
• Using TDA tools, students discovered intrinsic dynamical features of the data and learned how to quantify periodic phenomena in a time series.
• Since nature invariably produces noisy data that rarely has exact periodicity, students also considered the theoretical basis of almost-periodicity and even invented and tested new mathematical definitions of almost-periodic functions.

Summary:

The dataset used for this data expedition comes from the Global Historical Climatology Network: “GHCN (Global Historical Climatology Network)-Daily is an integrated database of daily climate summaries from land surface stations across the globe.” Source: https://www.ncdc.noaa.gov/oa/climate/ghcn-daily/ We focused on the daily maximum and minimum temperatures from January 1, 1980 to April 1, 2015 collected at RDU International Airport.

Through a guided series of exercises designed to be performed in Matlab, students explore these time series, initially through direct visualization and basic statistical techniques. Students are then guided through a sliding-window construction that transforms a time series into a high-dimensional geometric curve. These curves can be visualized by projecting down to lower dimensions (Figure 1), but our focus here was to use persistent homology to study the high-dimensional embedding directly. The shape of these curves carries meaningful information, yet how one describes the “shape” of data depends on the scale at which the data are considered, and choosing the appropriate scale is rarely obvious. Persistent homology overcomes this obstacle by allowing us to quantitatively study geometric features of the data across multiple scales. Through this data expedition, students are introduced to numerically computing persistent homology using the Rips collapse algorithm and to interpreting the results. In the specific context of sliding-window constructions, 1-dimensional persistent homology can reveal the nature of periodic structure in the original data. I created a special technique to study how these high-dimensional sliding-window curves form loops in order to quantify the periodicity, and students are guided through this construction and learn how to visualize and interpret this information.

Climate data is extremely complex (as anyone who has suffered from a bad weather prediction can attest), and numerous variables play a role in determining daily weather and temperatures. This complexity, coupled with imperfections of measuring devices, results in very noisy data, so the annual seasonal periodicity is far from exact. To this end, I have students explore existing theoretical notions of almost-periodicity and test them on the data. They find that some existing definitions are inadequate in this context, so I challenged them to invent new mathematics by proposing and testing their own definitions; the students rose to the challenge and suggested a number of creative ones. While autocorrelation and spectral methods based on Fourier analysis are often used to explore periodicity, the construction here provides an alternative paradigm for quantifying periodic structure in almost-periodic signals using tools from topological data analysis.
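
A minimal Python sketch of the sliding-window construction described above is shown below (the course exercises themselves are in Matlab; the window dimension, delay, and synthetic signal here are illustrative assumptions, and the optional persistence step assumes the third-party ripser package).

```python
import numpy as np

def sliding_window(x, dim, tau):
    """Delay embedding: map time series x to the points
    (x[i], x[i+tau], ..., x[i+(dim-1)*tau]) in R^dim."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i : i + dim * tau : tau] for i in range(n)])

# Synthetic stand-in for daily temperatures: a yearly cycle plus noise.
t = np.arange(3 * 365)
temps = 15 + 10 * np.sin(2 * np.pi * t / 365) + np.random.normal(0, 2, t.size)

# Embed with illustrative parameters; dim and tau would be tuned in practice.
cloud = sliding_window(temps, dim=20, tau=9)
print(cloud.shape)   # (number of windows, embedding dimension)

# Optional: 1-dimensional persistent homology of the embedded curve,
# assuming the ripser package (pip install ripser) is available:
# from ripser import ripser
# h1 = ripser(cloud, maxdim=1)["dgms"][1]   # birth/death pairs of loops
```

A strongly periodic signal traces out a loop in the embedded point cloud, so a long-lived class in the 1-dimensional persistence diagram is the quantitative signature of periodicity that the expedition asks students to measure.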