986 results for thumb
Abstract:
Evaluating progress towards eradication is critically important because weed eradication programs are very expensive and may take more than 10 years to complete. The degree of confidence that can be placed in any measure of eradication progress is a function of the effort that has been invested in finding new infestations and in monitoring known infestations. Determining eradication endpoints is particularly difficult, since plants may be extremely difficult to detect at low densities and it is virtually impossible to demonstrate seed bank exhaustion. Recent work suggests that an economic approach to this problem should be adopted, and proposes rules of thumb to determine whether to continue an eradication program or switch to an alternative management strategy.
Abstract:
The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or by setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of consecutive years of surveys without detection required to minimize the net expected cost. Given that detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution obtained using stochastic dynamic programming. Application of the approach to the eradication programme for Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter.
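The economic stopping rule this abstract describes can be sketched in a few lines. Everything below (the Bayesian detection model, the cost figures, and the parameter values) is an illustrative assumption of ours, not taken from the paper:

```python
def prob_still_present(n_absent_surveys, p_present0=0.5, p_detect=0.8):
    """Posterior probability the weed is still present after n consecutive
    surveys have failed to detect it (Bayes' rule with imperfect detection).
    The prior and per-survey detection probability are placeholders."""
    miss = (1.0 - p_detect) ** n_absent_surveys
    return p_present0 * miss / (p_present0 * miss + (1.0 - p_present0))

def years_until_stopping(survey_cost=10_000, escape_damage=1_000_000,
                         p_present0=0.5, p_detect=0.8, max_years=50):
    """Smallest number of consecutive absence-only surveys after which the
    expected damage from declaring eradication now falls below the cost
    of surveying for one more year."""
    for n in range(max_years + 1):
        if prob_still_present(n, p_present0, p_detect) * escape_damage <= survey_cost:
            return n
    return max_years
```

With these placeholder numbers the rule says to stop after three consecutive absent surveys; making surveys cheaper or escape damage larger pushes the stopping time later, which is exactly the trade-off the abstract describes.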
Abstract:
The prioritisation of potential agents on the basis of likely efficacy is an important step in biological control because it can increase the probability of a successful biocontrol program, and reduce risks and costs. In this introductory paper we define success in biological control, review how agent selection has been approached historically, and outline the approach to agent selection that underpins the structure of this special issue on agent selection. Developing criteria by which to judge the success of a biocontrol agent (or program) provides the basis for agent selection decisions. Criteria will depend on the weed, on the ecological and management context in which that weed occurs, and on the negative impacts that biocontrol is seeking to redress. Predicting which potential agents are most likely to be successful poses enormous scientific challenges. 'Rules of thumb', 'scoring systems' and various conceptual and quantitative modelling approaches have been proposed to aid agent selection. However, most attempts have met with limited success due to the diversity and complexity of the systems in question. This special issue presents a series of papers that deconstruct the question of agent choice with the aim of progressively improving the success rate of biological control. Specifically, they ask: (i) what potential agents are available and what should we know about them? (ii) what type, timing and degree of damage are required to achieve success? and (iii) which potential agent will reach the necessary density, at the right time, to exert the required damage in the target environment?
Abstract:
This catalogue essay was written to accompany Parallel Park's 2016 exhibition at Cut Thumb, Brisbane, 'Tandem'. It discusses the collaboration of Holly Bates and Tayla Haggarty, and their exploration of relationship dynamics alongside the role of documentation in performance art. The essay provides a critical framework for the exhibition that utilises installation, performance and moving image to highlight the potential for subjective experience to become a critical tool for engagement.
Abstract:
Measures of transit accessibility are important in evaluating transit services, planning future services, and guiding investment in land-use development. Existing tools measure transit accessibility using averaged walking distance or walking time to public transit. Although mode captivity may have significant implications for one's willingness to walk to public transit, this has not been addressed in the literature to date. Failing to distinguish transit-captive users may lead to overestimates of ridership and of the spatial coverage of transit services. The aim of this research is to integrate the concept of transit captivity into the analysis of walking access to public transit. A preliminary analysis showed no significant difference in walking times between the conventionally defined "captive" and "choice" transit users. A cluster analysis technique is therefore used to further divide "choice" users according to three main factors: age group, labour force status, and personal income. After eliminating "true captive" users, defined as those without a driver's licence or without a car in their household, "non-true captive" users were classified into a total of eight groups with similar socio-economic characteristics. The analysis revealed significant differences in walking times and patterns by level of captivity to public transit. This paper challenges the rule of thumb of a 400 m walking distance to bus stops. On average, people's willingness to walk dropped drastically at 268 m and continued to decline steadily until the 670 m mark, where another drastic drop of 17% left only 10% of bus riders willing to walk 670 m or more. This research found that mothers working part time had the lowest transit captivity and were thus the most sensitive to walking time, followed by high-income earners and the elderly.
The level of captivity increases for public transit users with lower incomes, such as students and students working part time.
Abstract:
Double-stranded RNA (dsRNA) viruses encode only a single protein species that contains RNA-dependent RNA polymerase (RdRP) motifs. This protein is a central component in the life cycle of a dsRNA virus, carrying out both RNA transcription and replication. The architecture of viral RdRPs resembles that of a 'cupped right hand' with fingers, palm and thumb domains. Those applying de novo initiation have additional structural features, including a flexible C-terminal domain that constitutes the priming platform. Moreover, viral RdRPs must be able to interact with the incoming 3'-terminus of the template and position it so that a productive binary complex is formed. Bacteriophage phi6 of the Cystoviridae family is to date one of the best-studied dsRNA viruses. The purified recombinant phi6 RdRP is highly active in vitro and possesses both RNA replication and transcription activities. The extensive biochemical observations and the atomic-level crystal structure of the phi6 RdRP provide an excellent platform for in-depth studies of RNA replication in vitro. In this thesis, targeted structure-based mutagenesis, enzymatic assays and molecular mapping of phi6 RdRP and its RNA were used to elucidate the formation of productive RNA-polymerase binary complexes. The positively charged rim of the template tunnel was shown to have a significant role in the engagement of highly structured ssRNA molecules, whereas specific interactions further down in the template tunnel promote ssRNA entry to the catalytic site. This work demonstrated that by aiding the formation of a stable binary complex with optimized RNA templates, the overall polymerization activity of the phi6 RdRP can be greatly enhanced. Furthermore, proteolyzed phi6 RdRPs that possess a nick in the polypeptide chain at the hinge region, which is part of the extended loop, were better suited for catalysis at higher temperatures whilst favouring back-primed initiation.
The clipped C-terminus remains associated with the main body of the polymerase, and the hinge region, although structurally disordered, is involved in the control of C-terminal domain displacement. The accumulated know-how on bacteriophage phi6 was utilized in the development of two technologies for the production of dsRNA: (i) an in vitro system that combines the T7 RNA polymerase and the phi6 RdRP to generate dsRNA molecules of practically unlimited length, and (ii) an in vivo RNA replication system based on restricted infection with phi6 polymerase complexes in bacterial cells to produce virtually unlimited amounts of dsRNA. The pools of small interfering RNAs derived from dsRNA produced by these systems were validated and shown to efficiently decrease the expression of both exogenous and endogenous targets.
Abstract:
The increasing focus of relationship marketing and customer relationship management (CRM) studies on issues of customer profitability has led to the emergence of an area of research on profitable customer management. Nevertheless, there is a notable lack of empirical research examining the current practices of firms specifically with regard to the profitable management of customer relationships according to the approaches suggested in theory. This thesis fills this research gap by exploring profitable customer management in the retail banking sector. Several topics are covered, including marketing metrics and accountability; challenges in the implementation of profitable customer management approaches in practice; analytic versus heuristic ('rule of thumb') decision making; and the modification of costly customer behavior in order to increase customer profitability, customer lifetime value (CLV), and customer equity, i.e. the financial value of the customer base. The thesis critically reviews the concept of customer equity and proposes a Customer Equity Scorecard, providing a starting point for a constructive dialog between marketing and finance concerning the development of appropriate metrics to measure marketing outcomes. Since customer management and measurement issues go hand in hand, profitable customer management is contingent on both marketing management skills and financial measurement skills. A clear gap between marketing theory and practice regarding profitable customer management is also identified. The findings show that key customer management aspects that have been proposed within the literature on profitable customer management for many years are not being actively applied by the banks included in the research. Instead, several areas of customer management decision making are found to be influenced by heuristics.
This dilemma for marketing accountability is addressed by emphasizing that CLV and customer equity, which are aggregate metrics, only provide certain indications regarding the relative value of customers and the approximate value of the customer base (or groups of customers), respectively. The value created by marketing manifests itself in the effect of marketing actions on customer perceptions, behavior, and ultimately the components of CLV, namely revenues, costs, risk, and retention, as well as additional components of customer equity, such as customer acquisition. The thesis also points out that although costs are a crucial component of CLV, they have largely been neglected in prior CRM research. Cost-cutting has often been viewed negatively in customer-focused marketing literature on service quality and customer profitability, but the case studies in this thesis demonstrate that reduced costs do not necessarily have to lead to lower service quality, customer retention, and customer-related revenues. Consequently, this thesis provides an expanded foundation upon which marketers can stake their claim for accountability. By focusing on the range of drivers and all of the components of CLV and customer equity, marketing has the potential to provide specific evidence concerning how various activities have affected the drivers and components of CLV within different groups of customers, and the implications for customer equity on a customer base level.
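The CLV components named above (revenues, costs, retention, and a discount rate standing in for risk) combine in the standard textbook formula. The sketch below is that generic formula, not the thesis' own model, and all numbers are illustrative:

```python
def customer_lifetime_value(annual_revenue, annual_cost, retention_rate,
                            discount_rate, horizon_years=10):
    """Textbook CLV: sum of discounted, retention-weighted margins,
    CLV = sum over t of (revenue - cost) * r^t / (1 + d)^t."""
    margin = annual_revenue - annual_cost
    return sum(margin * retention_rate ** t / (1 + discount_rate) ** t
               for t in range(horizon_years))
```

Note that reducing annual_cost raises CLV just as directly as raising revenue or retention does, which is the neglected cost component the thesis highlights.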
Abstract:
Indigenous peoples with a historical continuity of resource-use practices often possess a broad knowledge base of the behavior of complex ecological systems in their own localities. This knowledge has accumulated through a long series of observations transmitted from generation to generation. Such "diachronic" observations can be of great value and complement the "synchronic" observations on which western science is based. Where indigenous peoples have depended, for long periods of time, on local environments for the provision of a variety of resources, they have developed a stake in conserving, and in some cases, enhancing, biodiversity. They are aware that biological diversity is a crucial factor in generating the ecological services and natural resources on which they depend. Some indigenous groups manipulate the local landscape to augment its heterogeneity, and some have been found to be motivated to restore biodiversity in degraded landscapes. Their practices for the conservation of biodiversity were grounded in a series of rules of thumb which are apparently arrived at through a trial and error process over a long historical time period. This implies that their knowledge base is indefinite and their implementation involves an intimate relationship with the belief system. Such knowledge is difficult for western science to understand. It is vital, however, that the value of the knowledge-practice-belief complex of indigenous peoples relating to conservation of biodiversity is fully recognized if ecosystems and biodiversity are to be managed sustainably. Conserving this knowledge would be most appropriately accomplished through promoting the community-based resource-management systems of indigenous peoples.
Abstract:
Design, analysis, and technology for the integrity enhancement of damaged or under-designed structures continue to pose an engineering challenge. Bonded composite patch repair of metallic structures has received increased attention in recent years. It offers various advantages over riveted doublers, particularly for airframe repairs. This paper presents an experimental investigation of the residual strength and fatigue crack-growth life of an edge-cracked aluminium specimen repaired using a glass-epoxy composite patch. The investigation begins with the evaluation of three different surface treatments from a bond-strength viewpoint. A simple rule-of-thumb formula is employed to estimate the patch size. Cracked and repaired specimens are tested under static and fatigue loading. The patch appears to restore the original strength of the undamaged specimen and to enhance the fatigue crack-growth life by an order of magnitude. (C) 1999 Elsevier Science Ltd. All rights reserved.
Abstract:
Management of coastal development in Hawaii is based on the location of the certified shoreline, which represents the upper limit of marine inundation within the last several years. Though the certified shoreline location is significantly more variable than long-term erosion indicators, its migration will still follow the coastline's general trend. The long-term migration of Hawaii's coasts will be significantly controlled by rising sea level. However, land-use decisions adjacent to the shoreline and the shape and nature of the nearshore environment are also important controls on coastal migration. Though each of the islands has experienced local sea-level rise over the course of the last century, there are still locations across the islands of Kauai, Oahu, and Maui that show long-term accretion or anomalously high erosion rates relative to their regions. As a result, engineering rules of thumb such as the Bruun rule do not always predict coastal migration and beach-profile equilibrium in Hawaii. With coastlines facing all points of the compass rose, anthropogenic alteration of the coasts, complex coastal environments such as coral reefs, and a limited capacity to predict coastal change, Hawaii will require a more robust suite of proactive coastal management policies to weather future changes to its coastline. Continuing to use the current certified shoreline, adopting more stringent coastal setback rules similar to those of Kauai County, adding realistic sea-level-rise components to all types of coastal planning, and developing regional beach management plans are some of the recommended adaptation strategies for Hawaii. (PDF contains 4 pages)
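For reference, the Bruun rule criticized in this abstract is a one-line equilibrium-profile relation. The sketch below is the standard textbook form with illustrative values, not a calculation from the study:

```python
def bruun_retreat(sea_level_rise, profile_width, berm_height, closure_depth):
    """Classic Bruun rule: shoreline retreat R = S * L / (B + h), where
    S is sea-level rise, L the cross-shore width of the active profile,
    B the berm height, and h the closure depth (all in metres)."""
    return sea_level_rise * profile_width / (berm_height + closure_depth)

# Example: 0.5 m of rise over a 500 m active profile with B = 2 m and
# h = 8 m predicts 25 m of retreat, a 50:1 retreat-to-rise ratio.
```

The rule assumes a simple two-dimensional sandy profile in equilibrium, which is precisely the assumption that reef-fronted and heavily engineered Hawaiian coasts can violate.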
Abstract:
There is a growing amount of experimental evidence suggesting that people often deviate from the predictions of game theory. Some scholars attempt to explain the observations by introducing errors into behavioral models. However, most of these modifications are situation-dependent and do not generalize. A new theory, called the rational novice model, is introduced as an attempt to provide a general theory that accounts for erroneous behavior. The rational novice model is based on two central principles. The first is that people systematically make inaccurate guesses when evaluating their options in a game-like situation. The second is that people treat their decisions as a portfolio problem. As a result, actions that are non-optimal in a game-theoretic sense may be included in the rational novice strategy profile with positive weights.
The rational novice model can be divided into two parts: the behavioral model and the equilibrium concept. In a theoretical chapter, the mathematics of the behavioral model and the equilibrium concept are introduced. The existence of the equilibrium is established. In addition, the Nash equilibrium is shown to be a special case of the rational novice equilibrium. In another chapter, the rational novice model is applied to a voluntary contribution game. Numerical methods were used to obtain the solution. The model is estimated with data obtained from the Palfrey and Prisbrey experimental study of the voluntary contribution game. It is found that the rational novice model explains the data better than the Nash model. Although a formal statistical test was not used, pseudo R^2 analysis indicates that the rational novice model is better than a Probit model similar to the one used in the Palfrey and Prisbrey study.
The rational novice model is also applied to a first price sealed bid auction. Again, computing techniques were used to obtain a numerical solution. The data obtained from the Chen and Plott study were used to estimate the model. The rational novice model outperforms the CRRAM, the primary Nash model studied in the Chen and Plott study. However, the rational novice model is not the best amongst all models. A sophisticated rule-of-thumb, called the SOPAM, offers the best explanation of the data.
Abstract:
For most fisheries applications, the shape of a length-frequency distribution is much more important than its mean length or variance. This makes it difficult to evaluate at which point a sample size is adequate. By estimating the coefficient of variation of the counts in each length class and taking a weighted mean of these, a measure of precision was obtained that takes the precision of all length classes into account. The precision estimates were closely associated with the ratio of the sample size to the number of size classes in each sample. As a rule of thumb, a minimum sample size of 10 times the number of length classes in the sample is suggested, because precision deteriorates rapidly for smaller sample sizes. In the absence of such a rule of thumb, samplers have previously underestimated the required sample size for samples with large fish, while over-sampling small fish of the same species.
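A minimal sketch of the two quantities in this abstract: the weighted-mean-CV precision measure and the 10x rule of thumb. Approximating each class's CV as 1/sqrt(count) is a Poisson-style assumption of ours, not necessarily the estimator the authors used:

```python
import math

def weighted_mean_cv(counts_per_class):
    """Weighted mean of per-length-class coefficients of variation,
    using CV ~ 1/sqrt(n) for a class with count n (Poisson assumption)
    and the counts themselves as weights."""
    total = sum(counts_per_class)
    return sum(n * (1.0 / math.sqrt(n)) for n in counts_per_class if n > 0) / total

def minimum_sample_size(n_length_classes):
    """The abstract's rule of thumb: sample at least 10 fish per
    length class present in the sample."""
    return 10 * n_length_classes
```

Under this approximation, a sample with 12 length classes needs at least 120 fish, and the weighted-mean CV shrinks as counts grow, mirroring the precision gain the abstract reports.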
Abstract:
Non-destructive testing is an essential tool when an item of equipment, a device, or a component cannot be subjected to destructive or invasive procedures for reasons of safety, high cost, or other physical or logistical constraints. Within this framework, transmission radiography with gamma rays and with thermal neutrons are unique techniques for inspecting an object and revealing its internal structure, owing to their ability to penetrate the wide range of materials used in industry. Roughly speaking, gamma rays are more strongly attenuated by heavy materials, whereas thermal neutrons are more strongly attenuated by lighter materials, making the two techniques complementary. This work presents the results obtained in the inspection of several mechanical components by transmission radiography with thermal neutrons and gamma rays. The thermal neutron flux of 4.46×10^5 n·cm^-2·s^-1 available at the main channel of the Argonauta research reactor of the Instituto de Engenharia Nuclear was used as the source for the neutron radiographic images. Gamma rays emitted by 198Au, also produced in the reactor, were used as the radiation source for the gamma radiographs. Imaging plates, produced specifically for operation with thermal neutrons or with X-rays, were employed as detectors and as image capture and storage devices for each of these radiations. These devices offer several advantages over conventional radiographic film: in addition to higher sensitivity and reusability, no darkroom or chemical development is required. Instead, the plate is read by a laser beam that releases electrons trapped in the crystal lattice during exposure to radiation, yielding a final digital image.
The performance of both image-acquisition systems so constituted was evaluated with respect to sensitivity, spatial resolution, linearity, and dynamic range, including a comparison with neutron radiographic systems employing film and gadolinium foils as converters of neutrons into charged particles. Beyond this characterization, several pieces of equipment and components were radiographed with both systems to assess their ability to reveal the internal structure of these objects and to detect abnormal structures and conditions. Within this approach, a neutron radiograph detected residual ceramic material, used as a mould during manufacture, in the cooling channels of a stator vane of a turbofan engine, which should have been free of such material. The damaged rheostat of an automotive pressure sensor was identified by neutron radiography, although in this case the gamma radiograph also accomplished the task, with better resolution, thus corroborating the spatial-resolution curves obtained in the characterization of the two systems. The homogeneity of the distribution of the material encapsulated in a lead explosive gasket used in the aerospace industry was likewise verified by neutron radiography, because that metal is relatively transparent to neutrons whereas the hydrogen-rich explosive is sufficiently opaque. Several other instruments and components, such as a variometer, an altimeter, an aeronautical compass, an automotive fuel injector, a photographic camera, a computer hard disk, a stepper motor, electronic connectors, and projectiles, were radiographed with both systems to assess their ability to reveal different features depending on the interrogating radiation.
Abstract:
A simple mathematical model of stack ventilation flows in multi-compartment buildings is developed with a view to providing an intuitive understanding of the physical processes governing the movement of air and heat through naturally ventilated buildings. Rules of thumb for preliminary design can be ascertained from a qualitative examination of the governing equations of flow, which elucidate the relationships between 'core' variables - flow rates, air temperatures, heat inputs and building geometry. The model is applied to an example three-storey office building with an inlet plenum and atrium. An examination of the governing equations of flow is used to predict the behaviour of steady flows and to provide a number of preliminary design suggestions. It is shown that control of ventilation flows must be shared between all ventilation openings within the building in order to minimise the disparity in flow rates between storeys, and ensure adequate fresh air supply rates for all occupants. © 2013 Elsevier Ltd.
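The kind of rule of thumb such a model yields can be illustrated with the standard single-zone stack-effect relation. This is a textbook formula with illustrative numbers, not the paper's multi-compartment model:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def stack_flow_rate(effective_area, stack_height, delta_T, T_ambient=293.0):
    """Buoyancy-driven volume flow Q = A* sqrt(2 g h dT / T) in m^3/s,
    for effective opening area A* (m^2), stack height h (m), and an
    indoor-outdoor temperature difference dT (K) about ambient T (K)."""
    return effective_area * math.sqrt(
        2.0 * G * stack_height * delta_T / T_ambient)
```

Because Q scales linearly with opening area but only with the square root of height times temperature difference, upper storeys sharing an atrium outlet have less driving height available, one physical reason the paper finds that opening control must be shared across storeys to equalize flow rates.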
Abstract:
Classical swine fever virus (CSFV) non-structural protein 5B (NS5B) encodes an RNA-dependent RNA polymerase (RdRp), a key enzyme that initiates RNA replication by a de novo mechanism without a primer and is a potential target for antiviral therapy. We expressed the NS5B protein in Escherichia coli. rGTP can stimulate de novo initiation of RNA synthesis, and mutation of the GDD (Gly-Asp-Asp) motif to GAA (Gly-Ala-Ala) abolishes RNA synthesis. To better understand the mechanism of viral RNA synthesis in CSFV, a three-dimensional model was built by homology modeling based on the alignment with several viral RdRps. The model contains 605 residues folded into the characteristic fingers, palm and thumb domains. The fingers domain contains an N-terminal region that plays an important role in conformational change. We propose that the experimentally observed promotion of polymerase efficiency by rGTP is probably due to conformational changes of the polymerase caused by binding of rGTP. Mutation of GDD to GAA interferes with the interaction between the residues at the polymerase active site and metal ions, and thus renders the polymerase inactive. (c) 2005 Elsevier B.V. All rights reserved.