628 results for Maximizing
Abstract:
We examine the question of the optimal number of reserves that should be established to maximize the persistence of a species. We assume that the mean time to extinction of a single population increases as a power of the habitat area, that there is a certain amount of habitat to be reserved, and that the aim is to determine how this habitat is most efficiently divided. The optimal configuration depends on whether the management objective is to maximize the mean time to extinction or minimize the risk of extinction. When maximizing the mean time to extinction, the optimal number of independent reserves does not depend on the amount of available habitat for the reserve system. In contrast, the risk of extinction is minimized when individual reserves are equal to the optimal patch size, making the optimal number of reserves linearly proportional to the amount of available habitat. A model that includes dispersal and correlation in the incidence of extinction demonstrates the importance of considering the relative rate at which these two factors decrease with distance between reserves. A small number of reserves is optimal when the mean time to extinction increases rapidly with habitat area or when risks of extinction are high.
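The trade-off described above can be illustrated with a toy model (a hedged sketch, not the authors' formulation): assume each reserve's mean time to extinction scales as c·a^x for patch area a, split a total habitat A into n equal, independent reserves with exponentially distributed extinction times, and compare the two management objectives numerically. The constants c, x, A and the horizon t below are illustrative.

```python
import math

def mean_persistence(A, n, c=1.0, x=1.5):
    # mean time until ALL n reserves are extinct: each patch of area A/n
    # has an exponential extinction time with mean m = c*(A/n)**x, and the
    # maximum of n i.i.d. exponentials has mean m * H_n (harmonic number)
    m = c * (A / n) ** x
    H_n = sum(1.0 / k for k in range(1, n + 1))
    return m * H_n

def extinction_risk(A, n, t, c=1.0, x=1.5):
    # probability that every one of the n independent reserves is extinct by time t
    m = c * (A / n) ** x
    return (1.0 - math.exp(-t / m)) ** n

best_n_mean = max(range(1, 21), key=lambda n: mean_persistence(10.0, n))
best_n_risk = min(range(1, 21), key=lambda n: extinction_risk(10.0, n, t=5.0))
```

Because mean_persistence factors as c·A^x·H_n/n^x, its argmax over n does not depend on A, consistent with the first result above; the risk-minimizing n, by contrast, shifts with A and the horizon t.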
Abstract:
Translocation is an important tool for the conservation of species that have suffered severe range reductions. The success of a translocation should be measured not only by the survival of released animals, but by the reproductive output of individuals and hence the establishment of a self-sustaining population. The bridled nailtail wallaby is an endangered Australian macropod that suffered an extensive range contraction to a single remaining wild population. A translocated population was established and subsequently monitored over a four-year period. The aim of this study was to measure the reproductive success of released males using genetic tools and to determine the factors that predicted reproductive success. Captive-bred and wild-caught animals were released, and we found significant variation in male reproductive success among release groups. Variation in reproductive success was best explained by individual male weight, survival and release location rather than origin. Only 26% of candidate males were observed to sire an offspring during the study. The bridled nailtail wallaby is a sexually dimorphic, polygynous macropod, and reproductive success is skewed toward large males. Males over 5800 g were six times more likely to sire an offspring than males below this weight. This study highlights the importance of considering mating system when choosing animals for translocation. Translocation programs for polygynous species should release a greater proportion of females, and only release males of high breeding potential. By maximizing the reproductive output of released animals, conservation managers will reduce the costs of translocation and increase the chance of successfully establishing a self-sustaining population.
Abstract:
This paper proposes three models of adding relations to an organization structure that is a complete K-ary tree of height H: (i) a model of adding an edge between two nodes with the same depth N, (ii) a model of adding edges between every pair of nodes with the same depth N, and (iii) a model of adding edges between every pair of siblings with the same depth N. For each of the three models, an optimal depth N* is obtained by maximizing the total path-length shortening, that is, the sum of the reductions in shortest-path length between every pair of nodes.
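A brute-force sketch of model (i) on a small tree (the paper derives the optimal depth analytically; K, H and the chosen node pair here are illustrative): build a complete K-ary tree, add one edge between two nodes of equal depth, and measure the total shortening of shortest paths by BFS.

```python
from collections import deque

def complete_kary_tree(K, H):
    # adjacency sets of a complete K-ary tree of height H, root = node 0,
    # children of node v are K*v + 1 ... K*v + K
    n_nodes = sum(K ** d for d in range(H + 1))
    adj = {v: set() for v in range(n_nodes)}
    for v in range(n_nodes):
        for c in range(K * v + 1, K * v + K + 1):
            if c < n_nodes:
                adj[v].add(c)
                adj[c].add(v)
    return adj

def total_path_length(adj):
    # sum of shortest-path lengths over all unordered node pairs (BFS from each node)
    total = 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(dist.values())
    return total // 2

adj = complete_kary_tree(2, 3)          # binary tree of height 3, 15 nodes
base = total_path_length(adj)
# model (i): add a single edge between two nodes at depth N = 2 (nodes 3 and 6)
adj[3].add(6); adj[6].add(3)
shortening = base - total_path_length(adj)
```

The total shortening obtained this way is the quantity the paper maximizes over the depth N at which the relation is added.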
Abstract:
The first step in conservation planning is to identify objectives. Most stated objectives for conservation, such as to maximize biodiversity outcomes, are too vague to be useful within a decision-making framework. One way to clarify the issue is to define objectives in terms of the risk of extinction for multiple species. Although the assessment of extinction risk for single species is common, few researchers have formulated an objective function that combines the extinction risks of multiple species. We sought to translate the broad goal of maximizing the viability of species into explicit objectives for use in a decision-theoretic approach to conservation planning. We formulated several objective functions based on extinction risk across many species and illustrated the differences between these objectives with simple examples. Each objective function was the mathematical representation of an approach to conservation and emphasized different levels of threat. Our objectives included minimizing the joint probability of one or more extinctions, minimizing the expected number of extinctions, and minimizing the increase in risk of extinction from the best-case scenario. With objective functions based on joint probabilities of extinction across species, any correlations in extinction probabilities had to be known or the resultant decisions were potentially misleading. Additive objectives, such as the expected number of extinctions, did not produce the same anomalies. We demonstrated that the choice of objective function is central to the decision-making process because alternative objective functions can lead to a different ranking of management options. Therefore, decision makers need to think carefully in selecting and defining their conservation goals.
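The central point, that different objective functions can rank the same management options differently, can be seen with two species and hypothetical per-species extinction probabilities (the numbers are invented for illustration, and independence between species is assumed):

```python
from math import prod

def prob_at_least_one_extinction(ps):
    # joint objective: P(one or more extinctions), assuming independent species
    return 1.0 - prod(1.0 - p for p in ps)

def expected_extinctions(ps):
    # additive objective: expected number of extinctions
    return sum(ps)

option_a = [0.5, 0.5]    # hypothetical risks under management option A
option_b = [0.9, 0.05]   # hypothetical risks under management option B
```

Here the additive objective prefers option B (0.95 vs 1.0 expected extinctions), while the joint objective prefers option A (a 0.75 vs 0.905 chance of losing at least one species), so the choice of objective reverses the ranking.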
Abstract:
The Internet of Things (IoT) consists of a worldwide “network of networks,” composed of billions of interconnected heterogeneous devices denoted as things or “Smart Objects” (SOs). Significant research efforts have been dedicated to porting the experience gained in the design of the Internet to the IoT, with the goal of maximizing interoperability, using the Internet Protocol (IP) and designing specific protocols like the Constrained Application Protocol (CoAP), which have been widely accepted as drivers for the effective evolution of the IoT. This first wave of standardization can be considered successfully concluded, and we can assume that communication with and between SOs is no longer an issue. At this time, to favor the widespread adoption of the IoT, it is crucial to provide mechanisms that facilitate IoT data management and the development of services enabling a real interaction with things. Several reference IoT scenarios have real-time or predictable-latency requirements and deal with billions of devices collecting and sending an enormous quantity of data. These features create a need for architectures specifically designed to handle this scenario, here denoted as “Big Stream.” In this thesis a new Big Stream Listener-based Graph architecture is proposed. Another important step is to build more applications around the Web model, bringing about the Web of Things (WoT). As most IoT testbeds have focused on evaluating lower-layer communication aspects, this thesis proposes a new WoT Testbed that allows developers to work at a high level of abstraction, without worrying about low-level details. Finally, an innovative SO-driven User Interface (UI) generation paradigm for mobile applications in heterogeneous IoT networks is proposed, to simplify interactions between users and things.
Abstract:
This thesis presents a survey of the wide range of modern dense matching algorithms, which are spreading across different application and research fields, with particular attention to the innovative “Semi-Global” matching techniques. The choice to develop a semi-global numerical code was motivated by the need to gain insight into the variables and strategies that affect the algorithm's performance, with the primary objective of maximizing the method's accuracy and efficiency and the completeness of its results. The dissertation consists in the metrological characterization of our proprietary implementation of the semi-global matching algorithm, evaluating the influence of several matching variables and functions implemented in the process, and comparing the accuracy and completeness of the different results (digital surface models, disparity maps and 2D displacement fields) obtained using our code and other commercial and open-source matching programs across a wide variety of application fields.
Abstract:
Since its creation, the internet has received from advertisers replicas of advertisements developed for other media; over the years, however, the internet has gained ground and attracted the attention of advertisers eager to use its strengths in favour of selling their products. Nevertheless, very little is known about the language characteristics specific to the internet, and the existing literature is scarce and inconclusive. Against this background, this study compared advertising videos developed specifically for the internet with others originally produced for television and then published on the internet, in order to identify points of parity and convergence between them with regard to language. The work was based on the theories of Cyberculture and the Network Society, on the studies of Marshall McLuhan, and on a comparative analysis of the videos in order to understand their technical, discursive and visual language. The research identified differences between the two groups investigated, above all with regard to interactivity, viral potential and maximization of views. The study also found that, with regard to technical-discursive language, the videos showed little differentiation, although the videos developed for the internet proved to be more recent, which points to the beginning of a process of adaptation of advertising language to the internet medium.
Abstract:
An unsupervised learning procedure based on maximizing the mutual information between the outputs of two networks receiving different but statistically dependent inputs is analyzed (Becker S. and Hinton G., Nature, 355 (1992) 161). By exploiting a formal analogy to supervised learning in parity machines, the theory of zero-temperature Gibbs learning for the unsupervised procedure is presented for the case that the networks are perceptrons and for the case of fully connected committees.
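The quantity being maximized can be estimated directly from samples. A minimal sketch (not the Becker–Hinton networks themselves): the empirical mutual information between two correlated ±1 output sequences, standing in for the outputs of two networks receiving statistically dependent inputs. The 90% agreement rate and sample size are illustrative.

```python
import math
import random
from collections import Counter

def empirical_mi(xs, ys):
    # empirical mutual information (in bits) between two discrete sequences
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in joint.items())

# stand-in for two dependent network outputs: ys agrees with xs 90% of the time
rng = random.Random(1)
xs = [rng.choice([-1, 1]) for _ in range(10000)]
ys = [x if rng.random() < 0.9 else -x for x in xs]
mi = empirical_mi(xs, ys)
```

For a symmetric 10% disagreement rate the true value is 1 − H2(0.1) ≈ 0.53 bits; a learning rule of the kind analyzed above would adjust the two networks' weights to push this quantity up.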
Abstract:
A novel dissolution method was developed, suitable for powder mixtures, based on the USP basket apparatus. The baskets were modified such that the powder mixtures were retained within the baskets and not dispersed, a potential difficulty that may arise when using conventional USP basket and paddle apparatus. The advantages of this method were that the components of the mixtures were maintained in close proximity, maximizing any drug:excipient interaction and leading to more linear dissolution profiles. Two weakly acidic model drugs, ibuprofen and acetaminophen, and a selection of pharmaceutical excipients, including potential dissolution-enhancing alkalizing agents, were chosen for investigation. Dissolution profiles were obtained for simple physical mixtures. The f1 fit factor values, calculated using pure drug as the reference material, demonstrated a trend in line with expectations, with several dissolution enhancers apparent for both drugs. Also, the dissolution rates were linear over substantial parts of the profiles. For both drugs, a rank order comparison between the f1 fit factor and calculated dissolution rate, obtained from the linear section of the dissolution profile, demonstrated a correlation using a significance level of P=0.05. The method was proven to be suitable for discriminating between the effects of excipients on the dissolution of the model drugs. The method design produced dissolution profiles where the dissolution rate was linear for a substantial time, allowing determination of the dissolution rate without mathematical transformation of the data. This method may be suitable as a preliminary excipient-screening tool in the drug formulation development process.
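The f1 difference factor used above has a standard closed form, f1 = 100 · Σ|R_t − T_t| / Σ R_t, where R_t and T_t are the percent dissolved of the reference (here, pure drug) and the test mixture at each sampling time. A sketch with invented dissolution values:

```python
def f1_fit_factor(reference, test):
    # difference factor: f1 = 100 * sum(|R_t - T_t|) / sum(R_t)
    num = sum(abs(r - t) for r, t in zip(reference, test))
    return 100.0 * num / sum(reference)

pure_drug = [5, 12, 20, 30, 41]         # hypothetical % dissolved at each time point
with_excipient = [9, 20, 33, 47, 60]    # hypothetical enhanced profile
f1 = f1_fit_factor(pure_drug, with_excipient)
```

An identical test and reference profile gives f1 = 0; larger values indicate a stronger excipient effect, which is why the factor can rank dissolution enhancers as described above.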
Abstract:
This paper develops and applies an integrated multiple criteria decision making approach to optimize the facility location-allocation problem in the contemporary customer-driven supply chain. Unlike traditional optimization techniques, the proposed approach, combining the analytic hierarchy process (AHP) and the goal programming (GP) model, considers both quantitative and qualitative factors, and also aims at maximizing the benefits to both deliverer and customers. In the integrated approach, the AHP is used first to determine the relative importance weightings or priorities of alternative locations with respect to both deliverer-oriented and customer-oriented criteria. Then the GP model, incorporating the constraints of system, resource, and AHP priority, is formulated to select the best locations for setting up the warehouses without exceeding the limited available resources. In this paper, a real case study is used to demonstrate how the integrated approach can be applied to the facility location-allocation problem, and it is shown that the integrated approach outperforms the traditional cost-based approach.
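The first stage, deriving AHP priority weights from a pairwise comparison matrix, can be sketched as follows (the comparison values for three candidate warehouse locations are invented, and a production AHP would also check the consistency ratio):

```python
def ahp_weights(pairwise):
    # priority weights via the normalized-column-average approximation
    # of the principal eigenvector of the pairwise comparison matrix
    n = len(pairwise)
    col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
    normalized = [[pairwise[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    return [sum(normalized[i]) / n for i in range(n)]

# hypothetical pairwise comparisons of three candidate locations:
# entry [i][j] says how strongly location i is preferred to location j
matrix = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = ahp_weights(matrix)
```

The resulting weights would then enter the GP model as coefficients of the AHP-priority goal constraint.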
Abstract:
In for-profit organizations, efficiency measurement with reference to the potential for profit augmentation is particularly important, as is its decomposition into technical and allocative components. Different profit efficiency approaches can be found in the literature to measure and decompose overall profit efficiency. In this paper, we highlight some problems within existing approaches and propose a new measure of profit efficiency based on a geometric mean of the input/output adjustments needed for maximizing profits. Overall profit efficiency is calculated through this efficiency measure and is decomposed into its technical and allocative components. Technical efficiency is calculated based on a non-oriented geometric distance function (GDF) that is able to incorporate all the sources of inefficiency, while allocative efficiency is retrieved residually. We also define a measure of profitability efficiency which complements profit efficiency in that it makes it possible to retrieve the scale efficiency of a unit as a component of its profitability efficiency. In addition, the measure of profitability efficiency allows for a dual profitability interpretation of the GDF measure of technical efficiency. The concepts introduced in the paper are illustrated using a numerical example.
Abstract:
A novel, direction-sensitive bending sensor based on an asymmetric fiber Bragg grating (FBG) inscribed by an infrared femtosecond laser was demonstrated. The technique is based on tight transverse confinement of the femto-inscribed structures and can be directly applied in conventional, untreated single-mode fibers. The FBG structure was inscribed by an amplified titanium sapphire laser system. The grating cross-section was elongated along the direction of the laser beam, with transverse dimensions of approximately 1 by 2 μm. It was suggested that the sensitivity of the device can be improved by inscribing smaller spatial features and by implementing more complex grating designs aimed at maximizing the effect of strain.
Abstract:
Colouring sparse graphs under various restrictions is a theoretical problem of significant practical relevance. Here we consider the problem of maximizing the number of different colours available at the nodes and their neighbourhoods, given a predetermined number of colours. In the analytical framework of a tree approximation, carried out at both zero and finite temperatures, solutions obtained by population dynamics give rise to estimates of the threshold connectivity for the incomplete to complete transition, which are consistent with those of existing algorithms. The nature of the transition as well as the validity of the tree approximation are investigated.
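As a concrete (if much simpler) counterpart to the message-passing analysis, a greedy heuristic for the same objective: colour each node with a colour not yet present among its already-coloured neighbours, then score the total number of distinct colours across all closed neighbourhoods. The test graph, the number of colours q, and the heuristic itself are illustrative, not the algorithms evaluated above.

```python
import random

def diversity_score(adj, colour):
    # total number of distinct colours seen across all closed neighbourhoods
    return sum(len({colour[v]} | {colour[u] for u in adj[v]}) for v in adj)

def greedy_colouring(adj, q, seed=0):
    # assign each node, in turn, a colour absent from its already-coloured
    # neighbours when possible (q colours available, random tie-breaking)
    rng = random.Random(seed)
    colour = {}
    for v in adj:
        seen = {colour[u] for u in adj[v] if u in colour}
        candidates = [c for c in range(q) if c not in seen] or list(range(q))
        colour[v] = rng.choice(candidates)
    return colour

# small sparse test graph: a 6-cycle, coloured with q = 3 colours
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
col = greedy_colouring(adj, q=3)
score = diversity_score(adj, col)
```

On this graph the maximum achievable score is 18 (three distinct colours in every closed neighbourhood of three nodes); the gap between a heuristic's score and that ceiling is one way to see the incomplete-to-complete transition the abstract estimates.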
Abstract:
This thesis examines the present provisions for pre-conception care and the views of the providers of services. Pre-conception care is seen by some clinicians and health educators as a means of making any necessary changes in life style, correcting imbalances in the nutritional status of the prospective mother (and father) and assessing any medical problems, thus maximizing the likelihood of the normal development of the baby. Pre-conception care may be described as a service to bridge the gap between the family planning clinic and the first ante-natal booking appointment. There were three separate foci for the empirical research: the Foresight organisation (a charity which has pioneered pre-conception care in Britain); the pre-conception care clinic at the West London Hospital, Hammersmith; and the West Midlands Regional Health Authority. The six main sources of data were: twenty-five clinicians operating Foresight pre-conception clinics, couples attending pre-conception clinics, committee members of the Foresight organisation, staff of the West London Hospital pre-conception clinic, Hammersmith, District Health Education Officers working in the West Midlands Regional Health Authority, and the members of the Ante-Natal Care Action Group, a sub-group of the Regional Health Advisory Group on Health Promotion and Preventive Medicine. A range of research methods was adopted: questionnaires and report forms used in co-operation with the Foresight clinicians; interviews; participant observation; discussions and informal meetings; and, finally, literature and official documentation. The research findings illustrated that pre-conception care services provided at the predominantly private Foresight clinics were of a rather 'ad hoc' nature. The type of provision varied considerably and clearly reflected the views held by its providers.
The protocol which had been developed to assist in the standardization of results was not followed by the clinicians. The pre-conception service provided at the West London Hospital shared some similarities in its approach with the Foresight provision; a major difference was that it did not advocate the use of routine hair trace metal analysis. Interviews with District Health Education Officers and with members of the Ante Natal Care Action Group revealed a tentative and cautious approach to pre-conception care generally and to the Foresight approach in particular. The thesis concludes with a consideration of the future of pre-conception care and the prospects for the establishment of a comprehensive pre-conception care service.
Abstract:
In the last 15 years, 80% of all recombinant proteins reported in the literature were produced in the bacterium, Escherichia coli, or the yeast, Pichia pastoris. Nonetheless, developing effective general strategies for producing recombinant eukaryotic membrane proteins in these organisms remains a particular challenge. Using a validated screening procedure together with accurate yield quantitation, we therefore wished to establish the critical steps contributing to high yields of recombinant eukaryotic membrane protein in P. pastoris. Whilst the use of fusion partners to generate chimeric constructs and directed mutagenesis have previously been shown to be effective in bacterial hosts, we conclude that this approach is not transferable to yeast. Rather, codon optimization and the preparation and selection of high-yielding P. pastoris clones are effective strategies for maximizing yields of human aquaporins.