841 results for Sectoral and territorial approaches
Abstract:
Improving the genetic base of cultivars that underpin commercial mango production is generally recognized as necessary for long-term industry stability. Genetic improvement can take many approaches, each with its own advantages and disadvantages. This paper will discuss several approaches used in the genetic improvement of mangoes in Australia, including varietal introductions, selection of monoembryonic progeny, selection within polyembryonic populations, assisted open pollination and controlled closed pollination. The current activities of the Australian National Mango Breeding Program will be outlined, and the analysis and use of hybrid phenotype data from the project for selection of next-generation parents will be discussed. Some of the important traits that will enhance the competitiveness of future cultivars will be introduced, and the challenges in achieving them discussed. The use of a genomics approach and its impact on future mango breeding is examined.
Abstract:
This study addresses the following question: how should we think about ethics in a technological world? The question is treated first thematically by framing central issues in the relationship between ethics and technology. This relationship has three distinct facets: i) technological advance poses new challenges for ethics, ii) traditional ethics may become poorly applicable in a technologically transformed world, and iii) progress in science and technology has altered the concept of rationality in ways that undermine ethical thinking itself. The thematic treatment is followed by the description and analysis of three approaches to the questions framed. First, Hans Jonas's thinking on the ontology of life and the imperative of responsibility is studied. In Jonas's analysis modern culture is found to be nihilistic because it is unable to understand organic life, to find meaning in reality, and to justify morals. At the root of nihilism Jonas finds dualism, the traditional Western way of seeing consciousness as radically separate from the material world. Jonas attempts to create a metaphysical grounding for an ethic that would take the technologically increased human powers into account and make responsibility for future generations meaningful and justified. The second approach is Albert Borgmann's philosophy of technology, which mainly assesses the ways in which technological development has affected everyday life. Borgmann admits that modern technology has liberated humans from toil, disease, and danger. Furthermore, liberal democracy, possibilities for self-realization, and many of the freedoms we now enjoy would not be possible on a large scale without technology. Borgmann, however, argues that modern technology in itself does not provide a whole and meaningful life. In fact, technological conditions are often detrimental to the good life.
Integrity in life, according to him, is to be sought among things and practices that evade technoscientific objectification and commodification. Larry Hickman's Deweyan philosophy of technology is the third approach under scrutiny. Central to Hickman's thinking is a broad definition of technology that is nearly equal to Deweyan inquiry. Inquiry refers to the reflective and experiential way humans adapt to their environment by modifying their habits and beliefs. In Hickman's work, technology consists of all kinds of activities that, through experimentation and/or reflection, aim at improving human techniques and habits. Thus, in addition to research and development, many arts and political reforms are technological for Hickman. He argues for recasting such distinctions as fact/value, poiesis/praxis/theoria, and individual/society. Finally, Hickman does not admit a categorical difference between ethics and technology: moral values and norms, like all other notions, need to be submitted to experiential inquiry. This study mainly argues for an interdisciplinary approach to the ethics of technology. This approach should make use of the potentialities of the research traditions in applied ethics, the philosophy of technology, and the social studies of science and technology, and attempt to overcome their limitations. The study also advocates a mid-level ethics that concentrates on the practices, institutions, and policies of temporal human life. Mid-level describes the realm between the instantaneous and individualistic micro-level and the universal and global macro-level.
Abstract:
Nucleation is the first step of a first-order phase transition; in nucleation a new phase always springs into existence. The two main categories of nucleation are homogeneous nucleation, where the new phase forms within a uniform substance, and heterogeneous nucleation, where nucleation occurs on a pre-existing surface. In this thesis the main attention is paid to heterogeneous nucleation. The thesis treats nucleation phenomena from two theoretical perspectives: the classical nucleation theory and the statistical mechanical approach. The formulation of the classical nucleation theory relies on equilibrium thermodynamics and the use of macroscopically determined quantities to describe the properties of small nuclei, sometimes consisting of just a few molecules. The statistical mechanical approach is based on interactions between single molecules and does not rest on the same assumptions as the classical theory. This work gathers the present theoretical knowledge of heterogeneous nucleation and utilizes it in computational model studies. A new exact molecular approach to heterogeneous nucleation was introduced and tested by Monte Carlo simulations. The results obtained from the molecular simulations were interpreted by means of the concepts of the classical nucleation theory. Numerical calculations were carried out for a variety of substances nucleating on different substrates. The classical theory of heterogeneous nucleation was employed in calculations of one-component nucleation of water on newsprint paper, Teflon and cellulose film, and of binary nucleation of water-n-propanol and water-sulphuric acid mixtures on silver nanoparticles. The results were compared with experimental results. The molecular simulation studies involved homogeneous nucleation of argon and heterogeneous nucleation of argon on a planar platinum surface.
It was found that using a microscopic contact angle as a fitting parameter in calculations based on the classical theory of heterogeneous nucleation leads to fair agreement between theoretical predictions and experimental results. In the presented cases the microscopic angle was always smaller than the contact angle obtained from macroscopic measurements. Furthermore, molecular Monte Carlo simulations revealed that the geometric contact parameter in heterogeneous nucleation calculations can work surprisingly well even for very small clusters.
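As an illustrative sketch of the classical-theory machinery the abstract refers to (not code from the thesis), the standard geometric factor for a cap-shaped nucleus on a planar surface scales the homogeneous nucleation barrier by a function of the contact angle, which is why fitting a smaller microscopic angle lowers the predicted barrier and raises the predicted rate:

```python
import math

def geometric_factor(theta_deg):
    """Classical geometric factor f(theta) = (2 + cos t)(1 - cos t)^2 / 4
    for a cap-shaped nucleus on a planar substrate; the heterogeneous
    barrier is then DeltaG*_het = f(theta) * DeltaG*_hom."""
    c = math.cos(math.radians(theta_deg))
    return (2.0 + c) * (1.0 - c) ** 2 / 4.0

# f runs from 0 (complete wetting, theta = 0, no barrier) up to 1
# (non-wetting, theta = 180, the surface gives no help at all).
for angle in (0, 60, 90, 120, 180):
    print(angle, geometric_factor(angle))
```

The monotone decrease of f with better wetting is the whole mechanism behind using the contact angle as a fitting parameter.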
Abstract:
Representing and quantifying uncertainty in climate change impact studies is a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows the allocation of a probability mass to sets or intervals, and can hence handle both aleatory (stochastic) uncertainty and epistemic (subjective) uncertainty. This paper shows how the D-S theory can be used to represent beliefs in hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules with different approaches to modeling conflict. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications.
These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change. (C) 2010 Elsevier Ltd. All rights reserved.
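As a minimal sketch of how such a Dempster-Shafer structure behaves (the bpa values and source names below are hypothetical, not the paper's data), Dempster's rule combines two basic probability assignments over set-valued focal elements, after which belief and plausibility bound the support for a hypothesis such as drought:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (bpa's) whose focal
    elements are frozensets, using Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

def belief(m, hypothesis):
    # Bel(A): total mass of focal elements wholly contained in A
    return sum(w for s, w in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    # Pl(A): total mass of focal elements intersecting A
    return sum(w for s, w in m.items() if s & hypothesis)

# Hypothetical bpa's from two GCM-based evidence sources over the frame
# {drought, normal, wet}; mass on the full frame represents ignorance.
DROUGHT, NORMAL, WET = "drought", "normal", "wet"
frame = frozenset({DROUGHT, NORMAL, WET})
m_gcm1 = {frozenset({DROUGHT}): 0.6, frame: 0.4}
m_gcm2 = {frozenset({DROUGHT, NORMAL}): 0.7, frame: 0.3}

m = dempster_combine(m_gcm1, m_gcm2)
print(belief(m, frozenset({DROUGHT})))       # approx. 0.6: committed support
print(plausibility(m, frozenset({DROUGHT}))) # approx. 1.0: nothing contradicts it
```

The gap between belief and plausibility is the quantitative measure of ignorance the abstract mentions; a purely probabilistic treatment would collapse it to a single number.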
Abstract:
Protein misfolding is a general cause of classical conformational diseases and of many pathogenic changes that result from structural conversion. Here I review recent progress in clinical and computational approaches for each stage of the misfolding process, aiming to present readers with an outline for swift comprehension of this field.
Abstract:
Biochemical (electrophoresis and mitochondrial DNA) and morphological analyses are important tools for the characterization of strains. Reference is made to studies conducted within the framework of the Genetic Improvement of Farmed Tilapias project to establish a new base tilapia population for culture purposes, describing the basic concepts of electrophoresis and morphometric analysis.
Abstract:
The WorldFish Center was tasked with undertaking a study to assess, collate and develop background materials to produce an internationally linked and Africa-wide perspective on sectorally relevant policy issues. The specific objective of the study was to assess and define conditions and impact pathways, in Africa or elsewhere, where markets, policies, resources and technologies have combined to promote steady and sustainable growth of aquaculture, and where there have been clear direct impacts on food supply, income, employment and consumption opportunities, as well as increases in supply that have led to stabilised prices. The study also aimed to provide guidelines for scaling up the implementation of the synthesis study via Afri-FishNet (CAADP Fish Expert Pools) at the national and regional levels.
Abstract:
The use of L1 regularisation for sparse learning has generated immense research interest, with successful applications in areas as diverse as signal acquisition, image coding, genomics and collaborative filtering. While existing work highlights the many advantages of L1 methods, in this paper we find that L1 regularisation often dramatically underperforms in terms of predictive performance when compared with other methods for inferring sparsity. We focus on unsupervised latent variable models, and develop L1-minimising factor models, Bayesian variants of "L1", and Bayesian models with a stronger L0-like sparsity induced through spike-and-slab distributions. These spike-and-slab Bayesian factor models encourage sparsity while accounting for uncertainty in a principled manner and avoiding unnecessary shrinkage of non-zero values. We demonstrate on a number of data sets that in practice spike-and-slab Bayesian methods outperform L1 minimisation, even on a fixed computational budget. We thus highlight the need to re-assess the wide use of L1 methods in sparsity-reliant applications, particularly when we care about generalising to previously unseen data, and provide an alternative that, over many varying conditions, provides improved generalisation performance.
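The shrinkage effect behind this comparison can be seen in one dimension. As a sketch using standard proximal operators (not the authors' factor models), the L1 step subtracts the penalty from every surviving coefficient, whereas an L0-style hard threshold keeps survivors at full size, which is exactly the "unnecessary shrinkage of non-zero values" that spike-and-slab priors avoid:

```python
import numpy as np

def soft_threshold(z, lam):
    # Proximal operator of the L1 penalty: shrink-and-kill.
    # Every coefficient that survives is pulled toward zero by lam.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def hard_threshold(z, lam):
    # An L0-style operator: kill-or-keep, with no shrinkage of survivors.
    return np.where(np.abs(z) > lam, z, 0.0)

z = np.array([0.2, -0.5, 3.0])
print(soft_threshold(z, 1.0))  # [0. 0. 2.]: the large coefficient is shrunk
print(hard_threshold(z, 1.0))  # [0. 0. 3.]: the large coefficient is intact
```

Both operators zero out the small coefficients, but only the L1 version biases the one clearly non-zero value, which is the source of the predictive underperformance the paper reports.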
Abstract:
Computational Intelligence and Feature Selection provides readers with both the background and the fundamental ideas behind feature selection, with an emphasis on techniques based on rough and fuzzy sets, including their hybridizations. It introduces set theory, fuzzy set theory, rough set theory, and fuzzy-rough set theory, and illustrates the power and efficacy of the feature selection methods described through real-world applications and worked examples. Program files implementing the major algorithms covered, together with the necessary instructions and datasets, are available on the Web.
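As a toy illustration of the rough-set machinery such books build on (the decision table below is hypothetical, not from the book), the dependency degree measures how completely a feature subset determines the decision attribute; greedy reduct-search algorithms in the rough-set tradition add features until this degree stops improving:

```python
from collections import defaultdict

def dependency(rows, features, decision):
    """Rough-set dependency degree gamma(features -> decision): the share
    of objects in the positive region, i.e. objects whose value signature
    on `features` maps to exactly one decision value."""
    sig_decisions = defaultdict(set)
    sig_counts = defaultdict(int)
    for row in rows:
        sig = tuple(row[f] for f in features)
        sig_decisions[sig].add(row[decision])
        sig_counts[sig] += 1
    positive = sum(n for sig, n in sig_counts.items()
                   if len(sig_decisions[sig]) == 1)
    return positive / len(rows)

# Hypothetical toy decision table
rows = [
    {"outlook": "sunny", "windy": True,  "play": "no"},
    {"outlook": "sunny", "windy": False, "play": "yes"},
    {"outlook": "rain",  "windy": True,  "play": "no"},
    {"outlook": "rain",  "windy": False, "play": "no"},
]
print(dependency(rows, ["outlook"], "play"))           # 0.5: outlook alone is ambiguous
print(dependency(rows, ["outlook", "windy"], "play"))  # 1.0: the pair is a reduct here
```

Fuzzy-rough extensions replace the crisp equality of signatures with graded similarity, but the dependency-maximising search has the same shape.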
Abstract:
Recent decades have witnessed a shift in studies on Spanish rural commons, in line with changes in the international literature as a whole. The focus in the 1970s was on the land privatization process known as disentailment (Desamortización), considered one of the essential dimensions of the transition to capitalism. The recent revival of interest in rural commons has focused less on privatization than on the actual functioning of the commons and the social relations articulated around them. A further focus of interest is the interaction between rural society and the State, mainly through the study of forestry policy and its effects on different regions. A third field of interest is the emergence of conflicts around rural commons, not only distributive but also environmental and political. Accordingly, these new approaches go beyond the old image of a fatal destiny to analyze in depth the environmental and social interactions of rural commons dynamics.