941 results for test case optimization


Relevance: 30.00%

Abstract:

Although nickel is essential to plants, it can be toxic to both plants and animals, and it is assimilated mainly through food ingestion. However, information about the average levels of elements (including Ni) in edible vegetables from different regions is still scarce in Brazil. The objectives of this study were to: (a) evaluate and optimize a method for the preparation of vegetable tissue samples for Ni determination; (b) optimize the analytical procedures for determination by Flame Atomic Absorption Spectrometry (FAAS) and by Electrothermal Atomic Absorption Spectrometry (ETAAS) in vegetable samples; and (c) determine the Ni concentration in vegetables consumed in the cities of Lorena and Taubaté in the Vale do Paraíba, State of São Paulo, Brazil. For both ETAAS and FAAS determinations, results were validated by analyte addition and recovery tests. The most viable method tested for quantification of this element was HClO4-HNO3 wet digestion. All samples except the carrot tissue collected in Lorena contained Ni levels above the limits permitted by the Brazilian Ministry of Health. The most disturbing results, requiring more detailed studies, were the Ni concentrations measured in carrot samples from Taubaté, where levels were five times higher than permitted by Brazilian regulations.
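The analyte addition and recovery validation used here reduces to a simple calculation: spike a sample with a known amount of Ni, re-measure, and check that the recovered fraction falls in an acceptance window. A minimal sketch; the 80-120 % window and all numbers below are illustrative assumptions, not values from the study.

```python
def recovery_percent(unspiked, spiked, added):
    """Percent recovery for an analyte-addition (spike) test.
    All concentrations must share the same units (e.g. mg kg-1)."""
    if added <= 0:
        raise ValueError("added must be positive")
    return 100.0 * (spiked - unspiked) / added

def passes_recovery(unspiked, spiked, added, low=80.0, high=120.0):
    """True if recovery falls inside the acceptance window; the 80-120 %
    window is a common convention, assumed here, not taken from the study."""
    return low <= recovery_percent(unspiked, spiked, added) <= high

# Illustrative numbers: sample at 0.50 mg kg-1 Ni, spiked with 1.00 mg kg-1,
# spiked sample measured at 1.45 mg kg-1 -> about 95 % recovery.
```

A recovery far outside the window signals analyte loss during digestion or a matrix interference in the spectrometric determination.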

Relevance: 30.00%

Abstract:

Molecular shape has long been known to be an important property for the process of molecular recognition. Previous studies postulated the existence of a drug-like shape space that could be used to artificially bias the composition of screening libraries, with the aim of increasing the chance of success in Hit Identification. In this work, it was analysed to what extent this assumption holds true. Normalized Principal Moments of Inertia Ratios (NPRs) were used to describe the molecular shape of small molecules, and it was investigated whether active molecules of diverse targets are located in preferred subspaces of the NPR shape space. The results showed significantly stronger clustering than could be expected by chance, with parts of the space unlikely to be occupied by active compounds. Furthermore, a strong enrichment of elongated, rather flat shapes was observed, while globular compounds were highly underrepresented. This was confirmed for a wide range of small-molecule datasets from different origins. Active compounds exhibited a high overlap in their shape distributions across different targets, making a purely shape-based discrimination very difficult. An additional perspective was provided by comparing the shapes of protein binding pockets with those of their respective ligands. Although more globular than their ligands, binding-site shapes exhibited a similarly skewed distribution in shape space: spherical shapes were highly underrepresented. This was different for unoccupied binding pockets of smaller size, which, on the contrary, possessed a more globular shape. The relation between shape complementarity and exhibited bioactivity was also analysed; a moderate correlation between bioactivity and parameters including pocket coverage, distance in shape space, and others was identified, reflecting the importance of shape complementarity.
However, this also suggests that other aspects are of relevance for molecular recognition. A subsequent analysis assessed if and how shape and volume information retrieved from pockets or respective reference ligands could be used as a pre-filter in a virtual screening approach. In Lead Optimization, compounds need to be optimized with respect to a variety of parameters. Here, the availability of past success stories is very valuable, as they can guide medicinal chemists during their analogue synthesis plans. However, although of tremendous interest for the public domain, so far only large corporations had the ability to mine historical knowledge in their proprietary databases. With the aim of providing such information, the SwissBioisostere database was developed and released during this thesis. This database contains information on 21,293,355 performed substructural exchanges, corresponding to 5,586,462 unique replacements that have been measured in 35,039 assays against 1,948 molecular targets representing 30 target classes, and on their impact on bioactivity. A user-friendly interface was developed that provides facile access to these data and is accessible at http://www.swissbioisostere.ch. The ChEMBL database was used as the primary source of bioactivity information. Matched molecular pairs were identified in the extracted and cleaned data using a modified version of the algorithm of Hussain and Rea. Success-based scores were developed and integrated into the database to allow re-ranking of proposed replacements by their past outcomes. It was analysed to what degree these scores correlate with the chemical similarity of the underlying fragments. An unexpectedly weak relationship was detected and further investigated.
Use cases of this database were envisioned, and functionalities implemented accordingly: replacement outcomes are aggregatable at the assay level, and it was shown that an aggregation at the target or target-class level could also be performed, but should be accompanied by a careful case-by-case assessment. It was furthermore observed that replacement success depends on the activity of the starting compound A within a matched molecular pair A-B: with increasing potency, the probability of losing bioactivity through any substructural exchange was significantly higher than in low-affinity binders. The potential existence of a publication bias could be refuted. Furthermore, frequently performed medicinal chemistry strategies for structure-activity-relationship exploration were analysed using the acquired data. Finally, data originating from pharmaceutical companies were compared with those reported in the literature. It could be seen that industrial medicinal chemists can access replacement information not available in the public domain; in contrast, a large number of replacements often performed within companies could also be identified in the literature data. Preferences for particular replacements differed between these two sources. The value of combining different endpoints (e.g. bioactivity and metabolic stability) in an evaluation of molecular replacements was also investigated. The studies performed furthermore highlighted that there seems to exist no universal substructural replacement that always retains bioactivity irrespective of the biological environment. A generalization of bioisosteric replacements therefore does not seem possible.
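The NPR descriptors used in this abstract are straightforward to compute: diagonalize the inertia tensor of the atomic coordinates, sort the principal moments I1 <= I2 <= I3, and form NPR1 = I1/I3, NPR2 = I2/I3. Unit atomic masses are assumed below for brevity; the actual workflow presumably weights by atomic mass and uses generated 3D conformers.

```python
import numpy as np

def npr(coords):
    """Normalized principal moments of inertia ratios (NPR1, NPR2).

    coords: (N, 3) array of atomic positions; unit point masses are
    assumed here for simplicity.
    """
    x = np.asarray(coords, dtype=float)
    x = x - x.mean(axis=0)                 # shift to the centre of mass
    r2 = (x ** 2).sum(axis=1)
    I = np.eye(3) * r2.sum() - x.T @ x     # inertia tensor, unit masses
    eig = np.sort(np.linalg.eigvalsh(I))   # I1 <= I2 <= I3
    return eig[0] / eig[2], eig[1] / eig[2]

# In the NPR triangle, rod-like shapes sit near (0, 1), discs near
# (0.5, 0.5) and spheres near (1, 1).
rod = [[float(i), 0.0, 0.0] for i in range(5)]
n1, n2 = npr(rod)   # a perfectly linear "molecule": n1 = 0, n2 = 1
```

Plotting (NPR1, NPR2) for a library then reveals the enrichment of elongated, flat shapes described above.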

Relevance: 30.00%

Abstract:

One way of classifying water quality is by means of indices, in which a series of analyzed parameters are combined into a single value, facilitating the interpretation of extensive lists of variables or indicators that underlie the classification of water quality. The objective of this study was to develop a statistically based Irrigation Water Quality Index (IWQI) to evaluate the ionic composition of water for use in irrigation and to classify it by its source. For this purpose, the database generated during the Technology Generation and Adaptation (GAT) program was used, in which, as of 1988, water samples were collected monthly from water sources in the states of Paraíba, Rio Grande do Norte and Ceará. To evaluate water quality, the electrical conductivity (EC) of irrigation water was taken as a reference, with values corresponding to 0.7 dS m-1. The chemical variables used in this study were pH, EC, Ca, Mg, Na, K, Cl, HCO3, CO3, and SO4. The data for all characteristics evaluated were standardized, and data normality was confirmed by the Lilliefors test. The irrigation water quality index was then determined by an equation that relates the standardized value of each variable to the number of characteristics evaluated. The IWQI was classified on the basis of these indices, assuming a normal distribution, and the indices were finally subjected to regression analysis. The proposed method allowed a satisfactory classification of irrigation water quality and was able to estimate it as a function of EC for the three water sources. Variation in ionic composition was observed among the three sources and within a single source. Although water quality differed, it was good in most cases, with the classification IWQI II.
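The standardization step can be sketched as follows. Note that the aggregation into a single index value is shown here as a plain mean of z-scores; that choice is an assumption on my part, since the abstract does not give the exact published equation relating the standardized values to the number of characteristics.

```python
import statistics

def standardize(values):
    """Z-score standardization of one water-quality variable across samples."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

def iwqi_scores(samples):
    """Illustrative index: mean of the standardized values of the n
    characteristics of each sample (assumed aggregation, not the
    published IWQI formula).

    samples: list of per-sample variable lists, e.g. [pH, EC] per sample.
    """
    variables = list(zip(*samples))                  # one tuple per variable
    z = [standardize(list(col)) for col in variables]
    n = len(variables)
    return [sum(zs) / n for zs in zip(*z)]
```

With the scores in hand, class boundaries (e.g. IWQI II) would be set from quantiles of the assumed normal distribution.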

Relevance: 30.00%

Abstract:

Gradients of variation, or clines, have always intrigued biologists. Classically, they have been interpreted as the outcome of antagonistic interactions between selection and gene flow. Alternatively, clines may also establish neutrally, through isolation by distance (IBD) or secondary contact between previously isolated populations. The relative importance of natural selection and these two neutral processes in the establishment of clinal variation can be tested by comparing genetic differentiation at neutral genetic markers and at the studied trait. A third neutral process, the surfing of a newly arisen mutation during the colonization of a new habitat, is more difficult to test. Here, we designed a spatially explicit approximate Bayesian computation (ABC) simulation framework to evaluate whether the strong cline in the genetically based reddish coloration observed in the European barn owl (Tyto alba) arose as a by-product of a range expansion, or whether selection has to be invoked to explain this colour cline, for which we have previously ruled out the action of IBD or secondary contact. Using ABC simulations and genetic data on 390 individuals from 20 locations genotyped at 22 microsatellite loci, we first determined how barn owls colonized Europe after the last glaciation. Using these results in new simulations on the evolution of the colour phenotype, and assuming various genetic architectures for the colour trait, we demonstrate that the observed colour cline cannot be due to the surfing of a neutral mutation. Taking advantage of spatially explicit ABC, which proved to be a powerful method for disentangling the respective roles of selection and drift in range expansions, we conclude that the formation of the colour cline observed in the barn owl must be due to natural selection.
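Stripped of the spatially explicit machinery, the ABC logic used here is rejection sampling: draw parameters from the prior, simulate summary statistics, and keep draws that land close to the observed statistics. A generic sketch on a toy model (the toy model and all names are illustrative, not the study's simulator):

```python
import random

def rejection_abc(observed, simulate, prior_sample, distance,
                  n_draws=10000, tolerance=0.1, seed=0):
    """Minimal rejection ABC: return the accepted parameter draws.

    simulate(theta, rng) must return a summary statistic comparable with
    `observed`; the spatially explicit machinery of the study (range
    expansion, 22 microsatellite loci) is abstracted away entirely.
    """
    rng = random.Random(seed)           # reproducible draws
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)       # draw from the prior
        if distance(simulate(theta, rng), observed) <= tolerance:
            accepted.append(theta)      # keep draws close to the data
    return accepted

# Toy model: infer the mean of a normal (sd = 1) from the sample mean of
# 50 points; the observed sample mean is 0.0.
def simulate(theta, rng):
    return sum(rng.gauss(theta, 1.0) for _ in range(50)) / 50

post = rejection_abc(
    observed=0.0,
    simulate=simulate,
    prior_sample=lambda rng: rng.uniform(-5.0, 5.0),
    distance=lambda s, o: abs(s - o),
    n_draws=2000,
    tolerance=0.2,
)
# The accepted draws approximate the posterior; their mean sits near 0.
```

Model choice between "neutral surfing" and "selection" scenarios then amounts to comparing acceptance rates (or posterior probabilities) of competing simulators against the same observed statistics.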

Relevance: 30.00%

Abstract:

Executive Summary The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields of economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model addressing some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings.
The empirical investigation supporting these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the proposed model from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on the ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies that are optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with visual inspection, allowed us to demonstrate that our way of aggregating performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to a single measure, for example the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls over a range of quantiles.
Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the proposed algorithm leads to a portfolio return distribution that second-order stochastically dominates those obtained under virtually all performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market: the bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in attempting to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
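The two orderings used in Chapter 2 can be checked directly on equally sized samples of realized returns. A minimal sketch (my own illustration, not the thesis code): first-order dominance compares order statistics, and second-order dominance compares their running sums, which is exactly the absolute Lorenz curve comparison described above.

```python
def fosd(sample_a, sample_b):
    """True if a first-order stochastically dominates b: with equal
    sample sizes, the empirical CDF of a lies weakly below that of b
    everywhere iff every order statistic of a is >= the matching one of b."""
    a, b = sorted(sample_a), sorted(sample_b)
    assert len(a) == len(b), "equal sample sizes assumed"
    return all(x >= y for x, y in zip(a, b))

def sosd(sample_a, sample_b):
    """True if a second-order stochastically dominates b: the running sums
    of the order statistics (the absolute Lorenz curve, i.e. cumulative
    expected shortfall) of a are everywhere at least as high as b's."""
    a, b = sorted(sample_a), sorted(sample_b)
    assert len(a) == len(b), "equal sample sizes assumed"
    ca = cb = 0.0
    for x, y in zip(a, b):
        ca += x
        cb += y
        if ca < cb:
            return False
    return True
```

Note that first-order dominance implies second-order dominance but not conversely, which is why the weaker ordering can still rank strategies the stronger one cannot separate.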

Relevance: 30.00%

Abstract:

In Switzerland, the land management regime is characterized by a liberal attitude towards the institution of property rights, which is guaranteed by the Constitution. Under the present Swiss constitutional arrangement, authorities (municipalities) are required to take landowners' interests into account when implementing their spatial planning policy. In other words, the institution of property rights cannot easily be restricted in order to implement zoning plans and planning projects. This situation causes many problems. One of them is the gap between the way land is actually used by landowners and the way it should be used according to zoning plans. In fact, zoning plans only describe how landowners should use their property; there is no sufficient provision for handling cases where the use does not conform to the zoning plan. In particular, landowners may not be expropriated for a non-conforming use of the land. This situation often leads to the opening of new building areas in greenfields and to urban sprawl, which contradicts the goals set out in the Federal Law on Spatial Planning. In order to identify legal strategies of intervention to solve this problem, our paper is structured in three main parts. First, we give a short description of the Swiss land management regime. We then focus on an innovative land management approach designed to implement zoning plans in accordance with property rights. Finally, we present a case study that shows the usefulness of the presented land management approach in practice. We develop three main results. First, the land management approach provides a mechanism for involving landowners in planning projects; the principle of coordination between spatial planning goals and landowners' interests is the cornerstone of the whole process. Second, land use is improved in terms of both space and time. Finally, the institution of property rights is not challenged, since there is no expropriation and the market stays free.

Relevance: 30.00%

Abstract:

We present a procedure for the optical characterization of thin-film stacks from spectrophotometric data. The procedure overcomes the intrinsic limitations arising in the numerical determination of many parameters from reflectance or transmittance spectra measurements. The key point is to use all the information available from the manufacturing process in a single global optimization process. The method is illustrated by a case study of sol-gel applications.
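The "single global optimization" idea can be illustrated with a toy model: one merit function sums the squared residuals over all measured spectral points, so every fitted parameter is constrained by all the data at once. The two-beam interference formula and the grid search below are deliberate simplifications of my own, not the paper's multilayer model or optimizer:

```python
import math

def model_reflectance(wavelength_nm, n_film, d_nm, r1=0.2, r2=0.2):
    """Toy two-beam interference reflectance of a single thin film.
    r1 and r2 are assumed interface amplitude coefficients; the real
    study fits full multilayer stacks, which this sketch does not."""
    phase = 4.0 * math.pi * n_film * d_nm / wavelength_nm
    return r1 ** 2 + r2 ** 2 + 2.0 * r1 * r2 * math.cos(phase)

def fit_film(wavelengths, measured, n_grid, d_grid):
    """Grid-search minimizer of the summed squared residuals over all
    supplied spectral points (the single 'global' merit function)."""
    best = None
    for n in n_grid:
        for d in d_grid:
            err = sum((model_reflectance(w, n, d) - m) ** 2
                      for w, m in zip(wavelengths, measured))
            if best is None or err < best[0]:
                best = (err, n, d)
    return best[1], best[2]
```

In the paper's setting the same principle applies with a physically complete stack model and a proper optimizer; the benefit is that parameters that are degenerate in a single spectrum become identifiable when several spectra from the same manufacturing run share them.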

Relevance: 30.00%

Abstract:

INTRODUCTION: Myasthenia gravis is an autoimmune disease characterized by fluctuating muscle weakness. It is often associated with other autoimmune disorders, such as thyroid disease, rheumatoid arthritis, systemic lupus erythematosus, and antiphospholipid syndrome. Many aspects of autoimmune diseases are not completely understood, particularly when they occur in association, which suggests a common pathogenetic mechanism. CASE PRESENTATION: We report the case of a 42-year-old Caucasian woman with antiphospholipid syndrome who developed myasthenia gravis years later. She tested negative for antibodies against both the acetylcholine receptor and muscle-specific receptor tyrosine kinase, but had typical decremental responses on repetitive nerve stimulation testing, so generalized myasthenia gravis was diagnosed. Her thromboplastin time and activated partial thromboplastin time were prolonged, and anticardiolipin and anti-β2 glycoprotein-I antibodies were slightly elevated, as manifestations of the antiphospholipid syndrome. She had a good clinical response to treatment with a combination of pyridostigmine, prednisone and azathioprine. CONCLUSIONS: Many patients with myasthenia gravis test positive for a large variety of auto-antibodies, testifying to an immune dysregulation, and some display mild T-cell lymphopenia associated with hypergammaglobulinemia and B-cell hyper-reactivity. Both of these mechanisms could explain the occurrence of another autoimmune condition, such as antiphospholipid syndrome, but further studies are necessary to shed light on this matter. Clinicians should be aware that in patients with an autoimmune diagnosis such as antiphospholipid syndrome, the development of signs and neurological symptoms suggestive of myasthenia gravis should prompt urgent evaluation by a specialist.

Relevance: 30.00%

Abstract:

INTRODUCTION: We describe a case of diffuse nesidioblastosis in an adult patient who presented with exclusively fasting symptoms and a focal pancreatic 111In-pentetreotide uptake mimicking an insulinoma. CASE PRESENTATION: A 23-year-old Caucasian man had severe daily fasting hypoglycemia, with glucose levels below 2 mmol/L. Apart from rare neuroglycopenic symptoms (confusion, sleepiness), he was largely asymptomatic. His investigations revealed low venous plasma glucose levels, high insulin and C-peptide levels, and a 72-hour fast test result, all highly suggestive of an insulinoma. Abdominal computed tomography and magnetic resonance imaging did not reveal any lesions. The only imaging finding compatible with an insulinoma was a 111In-somatostatin receptor scintigraphy showing a faint but definite focal tracer uptake between the head and the body of the pancreas. However, this lesion could not be confirmed by endoscopic ultrasonography of the pancreas. Following duodenopancreatectomy, the histological findings were consistent with diffuse nesidioblastosis. Postoperatively, the patient continued to present with fasting hypoglycemia and was successfully treated with diazoxide. CONCLUSION: In the absence of gastrointestinal surgery, nesidioblastosis is very rare in adults. In addition, nesidioblastosis is usually characterized by post-prandial hypoglycemia, whereas this patient presented with fasting hypoglycemia. This case also illustrates the risk of a false-positive result of 111In-pentetreotide scintigraphy in the case of nesidioblastosis. Selective arterial calcium stimulation with venous sampling is the most reliable procedure for the positive diagnosis of insulinoma or nesidioblastosis and should be used to confirm any suspicion based on imaging modalities.

Relevance: 30.00%

Abstract:

This paper derives the HJB (Hamilton-Jacobi-Bellman) equation for sophisticated agents in a finite-horizon dynamic optimization problem with non-constant discounting in a continuous setting, using a dynamic programming approach. A simple example illustrates the applicability of this HJB equation by suggesting a method for constructing the subgame-perfect equilibrium solution to the problem. Conditions for observational equivalence with an associated problem with constant discounting are analyzed. Special attention is paid to the case of free terminal time. Strotz's model (an eating-cake problem of a nonrenewable resource with non-constant discounting) is revisited.
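As a hedged sketch of the form such a modified HJB equation takes (the notation and exact statement are my assumptions, checked only against the constant-discounting limit, not against the paper):

```latex
% Problem, from the viewpoint of time t, with discount function \theta,
% \theta(0)=1, payoff F, salvage value S, dynamics \dot x = f(x,u,s):
%   \max_u \int_t^T \theta(s-t)\,F(x(s),u(s),s)\,ds + \theta(T-t)\,S(x(T)).
% Modified HJB for the sophisticated agent's value function V:
V_t(x,t) + \max_u \bigl\{ F(x,u,t) + V_x(x,t)\,f(x,u,t) \bigr\} = -K(x,t),
\qquad
K(x,t) = \int_t^T \theta'(s-t)\,F\bigl(x^*(s),u^*(s),s\bigr)\,ds
       + \theta'(T-t)\,S\bigl(x^*(T)\bigr),
```

where the non-local term K is evaluated along the agent's own equilibrium path (x*, u*). Sanity check: for constant discounting theta(tau) = e^{-rho tau}, theta' = -rho theta, so K = -rho V and the equation collapses to the standard current-value HJB, rho V = V_t + max_u {F + V_x f}.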

Relevance: 30.00%

Abstract:

For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether performance levels were affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: on the one hand, the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria, whereas the average consensus, split fit, and most similar supertree methods showed poorer performance or at least did not behave in the same way as the total evidence tree. Results for the super distance matrix, that is, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set size and missing data.
Results also showed that the MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
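The MRP coding evaluated above (the Baum-Ragan matrix representation) is easy to sketch: each nontrivial clade of each input tree becomes one binary character, with taxa absent from that tree scored as missing. A minimal illustration of the encoding only, not the study's actual pipeline or the subsequent parsimony search:

```python
def mrp_matrix(trees, taxa):
    """Baum-Ragan matrix representation of input trees for MRP.

    trees: list of (tree_taxa, clades) pairs, where clades is a list of
    sets of taxon names (nontrivial clades only).
    Returns {taxon: character string}, with '1' = in clade, '0' = in the
    tree but outside the clade, '?' = taxon absent from that tree.
    """
    rows = {t: [] for t in taxa}
    for tree_taxa, clades in trees:
        for clade in clades:
            for t in taxa:
                if t not in tree_taxa:
                    rows[t].append("?")   # missing: taxon not in this tree
                elif t in clade:
                    rows[t].append("1")
                else:
                    rows[t].append("0")
    return {t: "".join(r) for t, r in rows.items()}

# Two small input trees, ((A,B),C) and ((B,C),D), each contributing one
# nontrivial clade; MRP would then run a parsimony search on this matrix.
trees = [
    ({"A", "B", "C"}, [{"A", "B"}]),
    ({"B", "C", "D"}, [{"B", "C"}]),
]
m = mrp_matrix(trees, ["A", "B", "C", "D"])
```

Variants such as the Purvis coding scheme mentioned above change how the '0' states are assigned, which is precisely where its performance differed in this study.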

Relevance: 30.00%

Abstract:

There is a debate on whether the influence of biotic interactions on species distributions can be detected at macro-scale levels. Whereas the influence of biotic interactions on spatial arrangements is beginning to be studied at local scales, similar studies at macro-scale levels are scarce, and there is no example disentangling the influence of predator-prey interactions on species distributions at macro-scale levels from other similarities with related species. In this study we aimed to disentangle predator-prey interactions from species distribution data following an experimental approach with a factorial design. As a case study we selected the short-toed eagle because of its known specialization on certain reptile prey. We used presence-absence data at a 100 km2 spatial resolution to extract the explanatory capacity of different environmental predictors (five abiotic and two biotic predictors) on the short-toed eagle's distribution in peninsular Spain. Abiotic predictors were relevant climatic and topographic variables, and the biotic predictors were prey richness and forest density. In addition to the short-toed eagle, we also obtained the predictors' explanatory capacities for (i) species of the same family, Accipitridae (as references), (ii) other birds of different families (as controls) and (iii) species with randomly selected presences (as null models). We ran 650 models to test for similarities of the short-toed eagle, controls, and null models with the reference species, assessed by regressions of explanatory capacities. We found higher similarities between the short-toed eagle and other species of the family Accipitridae than for the other two groups. Once corrected for the family effect, our analyses revealed a signal of predator-prey interaction embedded in species distribution data.
This result was corroborated by additional analyses testing for differences in the concordance between the distributions of different bird categories and the distributions of either prey or non-prey species of the short-toed eagle. Our analyses were thus able to disentangle a signal of predator-prey interactions from species distribution data at a macro-scale. This study highlights the importance of separating species-specific features from the variation shared at a given taxonomic level.
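The core comparison above, an observed explanatory capacity checked against null models built from randomly relocated presences, can be sketched as follows. The squared-correlation capacity measure and all data values here are illustrative assumptions standing in for the study's actual distribution models:

```python
import random

def explanatory_capacity(predictor, presence):
    """Squared Pearson correlation between one environmental predictor
    and presence/absence values -- a simple stand-in (assumption) for
    a model-based explanatory capacity."""
    n = len(predictor)
    mx = sum(predictor) / n
    my = sum(presence) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(predictor, presence))
    vx = sum((x - mx) ** 2 for x in predictor)
    vy = sum((y - my) ** 2 for y in presence)
    return cov * cov / (vx * vy) if vx and vy else 0.0

def null_capacities(predictor, presence, n_models=100, seed=0):
    """Null models: reshuffle the presences across cells and recompute
    the capacity, giving the distribution expected by chance alone."""
    rng = random.Random(seed)
    shuffled = list(presence)
    caps = []
    for _ in range(n_models):
        rng.shuffle(shuffled)
        caps.append(explanatory_capacity(predictor, shuffled))
    return caps

# Toy grid cells: a biotic predictor (prey richness) and eagle presence
prey = [1, 3, 5, 7, 2, 8, 6, 4]
eagle = [0, 0, 1, 1, 0, 1, 1, 0]
observed = explanatory_capacity(prey, eagle)
null = null_capacities(prey, eagle)
```

An observed capacity well above the bulk of the null distribution indicates signal beyond chance, which is the logic behind the study's comparison of reference, control and null-model groups.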

Abstract:

This research project examined the economic benefits and costs associated with alternative strategies for abandoning low-volume rural highways and bridges. Three test counties in Iowa were studied, each 100 square miles in size: Hamilton County, with a high agricultural tax base, a high percentage of paved roads and few bridges; Shelby County, with a relatively low agricultural tax base, hilly terrain, a low percentage of paved roads and many bridges; and Linn County, with a high agricultural tax base, a high percentage of paved roads and a large number of non-farm households. A questionnaire survey was undertaken to develop estimates of farm and household travel patterns. Benefits and costs associated with the abandonment of various segments of rural highway and bridge mileage in each county were then calculated: "benefits" were the reduced future reconstruction and maintenance costs, whereas "costs" were the added travel costs resulting from the reduced mileage. The findings suggest limited cost savings from the abandonment of county roads with no property access in areas with a large non-farm rural population; relatively high cost savings from the abandonment of roads with no property access in areas with a small rural population; and the largest savings from the conversion of public dead-end gravel roads with property or residence access to private drives.
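The benefit-cost comparison described above (avoided reconstruction and maintenance versus added travel cost) can be sketched as a net-present-value calculation. The parameter names, the discount rate, the horizon and the dollar figures are all illustrative assumptions, not the study's actual model:

```python
def net_benefit_of_abandonment(
        avoided_reconstruction,    # one-time avoided reconstruction cost ($)
        annual_maintenance_saved,  # avoided yearly maintenance ($/yr)
        added_miles_per_trip,      # detour length caused by the closure (mi)
        trips_per_year,            # rerouted trips (from a travel survey)
        cost_per_mile,             # vehicle operating cost ($/mi)
        years=20, discount_rate=0.07):
    """NPV of abandoning one segment: avoided agency costs minus the
    added user travel cost, both discounted over the analysis horizon."""
    def pv(annual):  # present value of a constant annual amount
        return sum(annual / (1 + discount_rate) ** t
                   for t in range(1, years + 1))
    benefits = avoided_reconstruction + pv(annual_maintenance_saved)
    costs = pv(added_miles_per_trip * trips_per_year * cost_per_mile)
    return benefits - costs

# A dead-end gravel road with no property access reroutes no trips...
no_access = net_benefit_of_abandonment(50_000, 2_000, 0.0, 0, 0.60)
# ...whereas a segment serving many rural households imposes large detour costs
busy_road = net_benefit_of_abandonment(50_000, 2_000, 2.0, 20_000, 0.60)
```

This illustrates the study's qualitative finding: segments that reroute few or no trips yield clearly positive net benefits, while segments serving a large non-farm population do not.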

Abstract:

The present study explores the statistical properties of a randomization test based on the random assignment of the intervention point in a two-phase (AB) single-case design. The focus is on randomization distributions constructed from the values of the test statistic for all possible random assignments and used to obtain p-values. The shape of those distributions is investigated for each specific data division defined by the moment at which the intervention is introduced. A further aim of the study was to test the detection of nonexistent effects (i.e., the production of false alarms) in autocorrelated data series, in which the assumption of exchangeability between observations may be untenable. In this way, it was possible to compare nominal and empirical Type I error rates in order to obtain evidence on the statistical validity of the randomization test for each individual data division. The results suggest that when either of the two phases has considerably fewer measurement occasions, Type I errors may become too probable and, hence, the decision-making process carried out by applied researchers may be jeopardized.
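The randomization test described above can be sketched as follows, using the absolute difference between phase means as the test statistic (the choice of statistic and the toy series are assumptions for illustration):

```python
def ab_randomization_test(data, observed_split, min_phase=1):
    """Randomization test for a two-phase (AB) single-case design.

    The randomization distribution holds the statistic for every
    admissible intervention point (each phase keeping at least
    `min_phase` observations); the p-value is the proportion of splits
    whose statistic is at least as extreme as the one observed.
    """
    def stat(k):  # intervention introduced before observation k
        a, b = data[:k], data[k:]
        return abs(sum(b) / len(b) - sum(a) / len(a))

    splits = range(min_phase, len(data) - min_phase + 1)
    distribution = [stat(k) for k in splits]
    observed = stat(observed_split)
    p = sum(s >= observed for s in distribution) / len(distribution)
    return observed, p

# Series with a clear level change after the 5th observation
series = [2, 3, 2, 3, 2, 7, 8, 7, 8, 7]
obs, p = ab_randomization_test(series, observed_split=5)
```

Note how the number of admissible splits caps the smallest attainable p-value (here 1/9): with very short phases the distribution is coarse, which connects to the Type I error problems reported in the abstract.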

Abstract:

Blowing and drifting snow is a major concern for transportation efficiency and road safety in regions where these phenomena are common. One common way to mitigate snow drift on roadways is to install plastic snow fences. Correct design of snow fences is critical for road safety, for keeping roads open during winter in the US Midwest and other regions affected by large winter snow events, and for keeping the costs of snow removal and road repair to a minimum. Of critical importance for road safety is protection against snow drifting in regions with narrow rights of way, where standard fences cannot be deployed at the recommended distance from the road. Designing snow fences requires sound engineering judgment and a thorough evaluation of the potential for snow blowing and drifting at the construction site. The evaluation includes site-specific design parameters typically obtained with semi-empirical relations characterizing the local transport conditions. Among the critical parameters involved in fence design, and in the assessment of post-construction efficiency, is the quantification of snow accumulation at fence sites. The present study proposes a joint experimental and numerical approach to monitor snow deposits around snow fences, quantitatively estimate snow deposits in the field, assess fence efficiency and improve fence design. Snow deposit profiles were mapped using GPS-based real-time kinematic (RTK) surveys conducted at the monitored field site during and after snow storms. The monitored site allowed different snow fence designs to be tested under close to identical conditions over four winter seasons. The study also describes the detailed monitoring system and the analysis of weather forecasts and meteorological conditions at the monitored sites. 
A main goal of the present study was to assess the performance of lightweight plastic snow fences with a lower porosity than the 50% porosity used in standard designs of such fences. The field data collected during the first winter were used to identify the best design for snow fences with a porosity of 50%. Flow fields obtained from numerical simulations showed that the fence design that performed best during the first winter induced the formation of an elongated region of low velocity magnitude close to the ground. This information was used to identify other candidates for the optimum design of fences with a lower porosity. Two designs with a fence porosity of 30% that performed well in the numerical simulations were tested in the field during the second winter, along with the best-performing 50%-porosity design. Field data showed that the length of the snow deposit away from the fence was reduced by about 30% for the two proposed lower-porosity (30%) fence designs compared with the best 50%-porosity design. Moreover, one of the lower-porosity designs tested in the field showed no significant snow deposition within the bottom-gap region beneath the fence. Thus, a major outcome of this study is the recommendation to use plastic snow fences with a porosity of 30%. This lower-porosity design is expected to continue to work well for more severe snow events and for successive snow events occurring during the same winter. The approach advocated in the present study allowed general recommendations to be made for optimizing the design of lower-porosity plastic snow fences, and it can be extended to improve the design of other types of snow fences. Some preliminary work on living snow fences is also discussed. 
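The RTK-surveyed deposit profiles mentioned above can be reduced to a cross-sectional deposit area per transect by trapezoidal integration of snow depth (snow surface minus a pre-season bare-ground survey). A minimal sketch, with hypothetical survey values rather than the study's field data:

```python
def deposit_cross_section(distances, snow_surface, ground):
    """Cross-sectional area of a snow deposit along one survey transect.

    distances:    along-transect positions of the RTK points (m)
    snow_surface: surveyed snow-surface elevations (m)
    ground:       bare-ground elevations from a reference survey (m)

    Trapezoidal integration of depth = surface - ground; small negative
    depths (survey noise) are clipped to zero.
    """
    depth = [max(s - g, 0.0) for s, g in zip(snow_surface, ground)]
    area = 0.0
    for i in range(1, len(distances)):
        dx = distances[i] - distances[i - 1]
        area += 0.5 * (depth[i] + depth[i - 1]) * dx
    return area

# One transect downwind of a fence (hypothetical numbers, metres)
x      = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
snow   = [100.1, 100.9, 101.2, 100.8, 100.3, 100.0]
ground = [100.0, 100.0, 100.0, 100.0, 100.0, 100.0]
area = deposit_cross_section(x, snow, ground)
```

Comparing such areas, and the along-wind extent of the deposit, across transects surveyed for different fence designs is one way the relative performance of the 50% and 30% porosity fences can be quantified.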
Another major contribution of this study is to propose, develop protocols for, and test a novel technique based on close-range photogrammetry (CRP) to quantify the snow deposits trapped by snow fences. As image data can be acquired continuously, the time evolution of the volume of snow retained by a snow fence during a storm, or over a whole winter season, can in principle be obtained. Moreover, CRP is a non-intrusive method that eliminates the need to perform manual measurements during storms, which are difficult and sometimes dangerous to carry out. At present, much of snow fence design is empirical, owing to a lack of data on fence storage capacity, on how snow deposits change with fence design and snowstorm characteristics, and on the main parameters used by state DOTs to design snow fences at a given site. The availability of such information from CRP measurements should provide critical data for evaluating the performance of a given snow fence design tested by the Iowa DOT (IDOT). As part of the present study, the novel CRP method was tested at several sites. The present study also discusses preliminary work to determine the snow relocation coefficient, one of the main variables that must be estimated by IDOT engineers when using the standard snow fence design software (Snow Drift Profiler; Tabler, 2006). Our analysis showed that standard empirical formulas did not produce reasonable values when applied at the Iowa test sites monitored as part of the present study, and that simple methods for estimating this variable are not reliable. The present study makes recommendations for the development of a new methodology, based on Large-Scale Particle Image Velocimetry, that can directly measure snow drift fluxes and the amount of snow relocated by the fence.
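The CRP volume estimates described above amount to differencing a photogrammetric snow surface against a bare-ground reference over a grid; a minimal sketch with hypothetical elevations (the gridded-surface representation and cell size are assumptions for illustration):

```python
def snow_volume(dem_snow, dem_bare, cell_area):
    """Volume of snow trapped by a fence from two gridded surfaces.

    dem_snow:  photogrammetric surface during/after the storm (m)
    dem_bare:  reference bare-ground surface (m)
    cell_area: area of one grid cell (m^2)

    Summing the positive elevation differences over the grid gives the
    deposit volume; repeating this for successive image acquisitions
    yields the time evolution of the retained snow.
    """
    volume = 0.0
    for row_snow, row_bare in zip(dem_snow, dem_bare):
        for s, b in zip(row_snow, row_bare):
            volume += max(s - b, 0.0) * cell_area
    return volume

# 3x3 toy grids with 0.25 m^2 cells (hypothetical values, metres)
bare = [[10.0] * 3 for _ in range(3)]
snow = [[10.2, 10.4, 10.2],
        [10.1, 10.3, 10.1],
        [10.0, 10.1, 10.0]]
v = snow_volume(snow, bare, cell_area=0.25)
```

A sequence of such volumes across acquisitions would give the storage-capacity data the abstract identifies as missing from current fence design practice.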