Abstract:
Common ash (Fraxinus excelsior L.) is a medium-sized deciduous tree with large compound leaves that develop relatively late in spring. It flowers before the leaf buds burst, and trees can carry male, female, or hermaphrodite flowers, or combinations of these flower types. It grows throughout the European temperate zone but is absent from the driest Mediterranean areas, because it does not tolerate extended summer drought, and from the northern boreal regions, where its seedlings in particular are vulnerable to late spring frost. Soils exert a strong control on the local distribution of common ash: the species grows best on fertile soils where soil pH exceeds 5.5. It rarely forms pure stands; more often it is found in small groups in mixed stands. Ash trees produce high-quality timber that combines light weight, strength, and flexibility. Before the mass use of steel, it was used for a wide range of purposes, from agricultural implements to the construction of boat and car frames. Today…
Abstract:
The European larch (Larix decidua Mill.) is a pioneer, very long-lived, fast-growing conifer that occurs in the central and eastern mountains of Europe, forming open forests or pasture woods at the upper tree limits. Larch is the only deciduous conifer in Europe, an adaptation to continental alpine climates: it tolerates very cold winter temperatures and, by shedding its needles, avoids foliage desiccation. It is a transitional species, colonising open terrain after natural disturbances. It forms pure stands, but more often it is found with other alpine tree species, which tend to replace it if no further disturbances occur. Thanks to its adaptability and the durability of its wood, the European larch is an important silvicultural tree species in the alpine regions, planted even outside its natural range. Its wood is widely used for carpentry, furniture, and pulp for paper. At lower altitudes or under high precipitation, larch is more susceptible to fungal diseases. Outbreaks of insect defoliators, principally the larch bud moth (Zeiraphera diniana), can limit tree development and cause economic losses in plantations, but they rarely kill the trees.
Abstract:
Among the coniferous species, Norway spruce (Picea abies (L.) Karst.) is one of the most important trees in Europe, both economically and ecologically, with a long tradition of cultivation. It is a large tree, reaching 50-60 m in height, with a straight, regular trunk; its wood is used especially for timber construction, pulpwood for paper, and furniture. This widespread species dominates the boreal forests of Northern Europe and the subalpine areas of the Alps and the Carpathian Mountains. Thanks to its strong performance across different site conditions, it is also found outside its natural distribution, at lower elevations in more temperate forests. Norway spruce has been planted massively up to its niche limits, where, owing to its shallow root system, it is particularly susceptible to heat and drought. For this reason it is expected to be severely affected by global warming. Disturbed and weakened trees are easily attacked by rot fungi such as Heterobasidion annosum and Armillaria, or by the bark beetle Ips typographus, one of the most destructive spruce forest pests.
Abstract:
Juglans regia L., commonly known as common, English or Persian walnut, is an economically very important tree species, prized both for its nuts and for its attractive high-quality timber. It is the most widespread nut tree worldwide.
Abstract:
Purpose. To evaluate the use of the Legionella Urinary Antigen Test as a cost-effective method for diagnosing Legionnaires' disease in five San Antonio hospitals from January 2007 to December 2009. Methods. The data reported by five San Antonio hospitals to the San Antonio Metropolitan Health District during a 3-year retrospective study (January 2007 to December 2009) were evaluated for the frequency of non-specific pneumonia infections, the number of Legionella Urinary Antigen Tests performed, and the percentage of positive cases of Legionnaires' disease diagnosed by the test. Results. A total of 7,087 cases of non-specific pneumonia were reported across the five San Antonio hospitals studied from 2007 to 2009. A total of 5,371 Legionella Urinary Antigen Tests were performed over the same period, identifying 38 positive cases of Legionnaires' disease (a positivity rate of roughly 0.7%). Conclusions. In spite of the limitations of this study in obtaining sufficient data to evaluate the cost-effectiveness of the Legionella Urinary Antigen Test in diagnosing Legionnaires' disease, the test is simple, accurate, and fast, with results available within minutes to hours; it is also convenient, because it can be performed in the emergency department on any patient who presents with clinical signs or symptoms of pneumonia. It remains to be shown whether, over the long run, this test can decrease mortality, lower total medical costs by reducing the number of broad-spectrum antibiotics prescribed, shorten patient wait times and hospital stays, reduce unnecessary ancillary testing, and improve overall patient outcomes.
Abstract:
Diarrheal disease associated with enterotoxigenic Escherichia coli (ETEC) infection is one of the major public health problems in many developing countries, especially among infants and young children. Because tests suitable for field laboratories have been developed only relatively recently, the literature on the environmental risk factors associated with ETEC is not as complete as for many other pathogens or for diarrhea of unspecified etiology. Data from a diarrheal disease surveillance project in rural Egypt, in which stool samples were tested for a variety of pathogens and an environmental questionnaire was completed for the same study households, provided an opportunity to test for an association between ETEC and various risk factors present in those households. ETEC laboratory-positive specimens were compared with ETEC laboratory-negative specimens for both symptomatic and asymptomatic children less than three years of age, at the individual and household (HH) level, using a case-comparison design. Individual children more likely to have LT (heat-labile toxin-producing ETEC) infection were those who lived in HHs that had cooked food stored for subsequent consumption at the time of the visit, where caretakers used water but not soap to clean an infant after a diarrheal stool, and that had an indoor, private water source. LT was more common in HHs where the caretaker did not clean an infant with soap after a diarrheal stool, and where a sleeping infant was not covered with a net. At both the individual and HH level, LT was significantly associated with a good water supply in terms of quantity and storage. ST (heat-stable toxin-producing ETEC) was isolated more frequently at the individual level where a sleeping infant was covered with a net, where large animals were kept in or around the house, where water was always available but not potable, and where the water container was not covered. At the HH level, the absence of a toilet or latrine and the indiscriminate disposal of animal waste decreased risk. Using animal feces for fertilizer, the presence of large animals, and poor water quality were associated with ST at both the individual and HH level. These findings are mostly consistent with those of other studies and/or are biologically plausible, with the obvious exception of the finding that poorer water supplies are associated with less infection, at least in the case of LT. More direct observation of how animal ownership and feces disposal relate to different types of water supply and usage might clarify mechanisms through which some ETEC infection could be prevented in similar settings.
Abstract:
In an increasing number of applications (e.g., in embedded, real-time, or mobile systems) it is important or even essential to ensure conformance with respect to a specification expressing resource usages, such as execution time, memory, energy, or user-defined resources. In previous work we presented a novel framework for data size-aware, static resource usage verification. Specifications can include both lower- and upper-bound resource usage functions. In order to statically check such specifications, upper- and lower-bound resource usage functions (on input data sizes) approximating the actual resource usage of the program are automatically inferred and compared against the specification. The outcome of the static checking of assertions can express intervals of input data sizes such that a given specification is proved for some intervals but disproved for others. After an overview of the approach, this paper provides a number of novel contributions: we present a full formalization, and we report on and provide results from an implementation within the Ciao/CiaoPP framework (which provides a general, unified platform for static and run-time verification, as well as unit testing). We also generalize the checking of assertions to allow preconditions expressing intervals within which the input data size of a program is supposed to lie (i.e., intervals for which each assertion is applicable), and we extend the class of resource usage functions that can be checked.
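To make the interval-based checking concrete, here is a minimal Python sketch (not the Ciao/CiaoPP implementation; the bound functions and the size range are invented for illustration). An automatically inferred upper-bound function is compared against the upper bound stated in an assertion, partitioning input data sizes into intervals on which the assertion is proved or disproved:

```python
# Toy illustration of interval-based assertion checking: an automatically
# inferred upper-bound cost function is compared against a specified upper
# bound, and input-data sizes are partitioned into maximal intervals where
# the assertion is proved (inferred <= spec) or disproved.

def inferred_upper(n: int) -> int:
    """Hypothetical inferred upper bound on resource usage for input size n."""
    return 2 * n * n + 3 * n

def spec_upper(n: int) -> int:
    """Hypothetical upper bound stated in the user's assertion."""
    return 100 * n

def check_intervals(max_size: int):
    """Partition [0, max_size] into maximal 'proved'/'disproved' intervals."""
    intervals, start, status = [], 0, None
    for n in range(max_size + 1):
        label = "proved" if inferred_upper(n) <= spec_upper(n) else "disproved"
        if status is None:
            status = label
        elif label != status:
            intervals.append((start, n - 1, status))
            start, status = n, label
    intervals.append((start, max_size, status))
    return intervals

print(check_intervals(60))
# [(0, 48, 'proved'), (49, 60, 'disproved')] -- 2n^2+3n <= 100n iff n <= 48.5
```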
Abstract:
Automatic cost analysis of programs has traditionally concentrated on a reduced number of resources such as execution steps, time, or memory. However, the increasing relevance of analysis applications such as static debugging and/or certification of user-level properties (including for mobile code) makes it interesting to develop analyses for resource notions that are actually application-dependent. This may include, for example, bytes sent or received by an application, number of files left open, number of SMSs sent or received, number of accesses to a database, money spent, energy consumption, etc. We present a fully automated analysis for inferring upper bounds on the usage that a Java bytecode program makes of a set of application programmer-definable resources. In our context, a resource is defined by programmer-provided annotations which state the basic consumption that certain program elements make of that resource. From these definitions our analysis derives functions which return an upper bound on the usage that the whole program (and individual blocks) make of that resource for any given set of input data sizes. The analysis proposed is independent of the particular resource. We also present some experimental results from a prototype implementation of the approach covering a significant set of interesting resources.
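As an illustration of the annotation mechanism, the sketch below mimics it in Python rather than the paper's Java-bytecode setting; the resource name, cost expressions, and operations are all invented. Each annotated operation declares its basic consumption of a user-defined resource:

```python
# Illustrative analogue (not the paper's actual annotation language) of
# programmer-provided resource annotations: each annotated operation declares
# its basic consumption of a user-defined resource, here "bytes_sent".

RESOURCE_TABLE = {}  # resource name -> {operation name -> cost expression}

def cost(resource: str, expr):
    """Annotate a function with its basic consumption of a resource."""
    def wrap(fn):
        RESOURCE_TABLE.setdefault(resource, {})[fn.__name__] = expr
        return fn
    return wrap

@cost("bytes_sent", lambda n: n + 16)   # payload plus a fixed 16-byte header
def send_packet(payload: bytes):
    ...  # actual network send elided

@cost("bytes_sent", lambda n: 0)        # purely local, sends nothing
def log_locally(msg: str):
    ...

print(RESOURCE_TABLE["bytes_sent"]["send_packet"](1024))  # -> 1040
```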
Abstract:
Automatic cost analysis of programs has been traditionally studied in terms of a number of concrete, predefined resources such as execution steps, time, or memory. However, the increasing relevance of analysis applications such as static debugging and/or certification of user-level properties (including for mobile code) makes it interesting to develop analyses for resource notions that are actually application-dependent. This may include, for example, bytes sent or received by an application, number of files left open, number of SMSs sent or received, number of accesses to a database, money spent, energy consumption, etc. We present a fully automated analysis for inferring upper bounds on the usage that a Java bytecode program makes of a set of application programmer-definable resources. In our context, a resource is defined by programmer-provided annotations which state the basic consumption that certain program elements make of that resource. From these definitions our analysis derives functions which return an upper bound on the usage that the whole program (and individual blocks) make of that resource for any given set of input data sizes. The analysis proposed is independent of the particular resource. We also present some experimental results from a prototype implementation of the approach covering an ample set of interesting resources.
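A hypothetical sketch of the compositional step that follows from such definitions: per-operation costs are combined with a bound on loop iterations (as a function of input data sizes) to obtain an upper-bound function for a whole block. The numbers and the cost model are assumptions, not the paper's actual analysis:

```python
# Sketch of the compositional step: given per-operation costs (from
# annotations) and a loop whose iteration count is bounded by an input data
# size, derive an upper-bound function for the whole block.

def op_cost(n: int) -> int:
    """Basic consumption of one send_packet call on an n-byte payload
    (assumed cost model: payload plus a fixed 16-byte header)."""
    return n + 16

def block_upper_bound(num_items: int, item_size: int) -> int:
    """Upper bound on bytes_sent for a block that sends one packet per item:
    cost(loop) <= iterations * max per-iteration cost."""
    return num_items * op_cost(item_size)

# For inputs of up to 50 items of at most 512 bytes each, the analysis
# would report bytes_sent <= 50 * (512 + 16) = 26400.
print(block_upper_bound(50, 512))  # -> 26400
```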
Abstract:
It has been shown that cloud computing brings cost benefits and promotes efficiency in the operations of organizations, regardless of their type or size. However, few public organizations are benefiting from this paradigm shift in the way organizations consume and manage computational resources. The objective of this thesis is to analyze both internal and external factors that may influence the adoption of cloud computing by public organizations, and to propose possible strategies that can assist these organizations on their path to cloud usage. To achieve this objective, a SWOT analysis was conducted, identifying internal factors (strengths and weaknesses) and external factors (opportunities and threats) that can influence the adoption of a governmental cloud. By combining the internal and external factors through a TOWS matrix, a list of possible strategies was formulated to serve as a guide for decision-making related to the transition to a cloud environment.
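A minimal sketch of the TOWS pairing logic described above, with invented example factors; a real analysis derives strategies by expert judgement, not mechanical enumeration:

```python
# Sketch of a TOWS matrix: internal factors (strengths, weaknesses) are
# crossed with external factors (opportunities, threats) to prompt candidate
# strategies in four quadrants (SO, ST, WO, WT). Factors below are invented.

internal = {
    "S": ["existing IT staff", "centralised data centre"],
    "W": ["limited cloud expertise"],
}
external = {
    "O": ["government cloud framework agreements"],
    "T": ["data-sovereignty regulation"],
}

def tows(internal, external):
    """Yield (quadrant, internal factor, external factor) pairings."""
    for i_key, i_factors in internal.items():
        for e_key, e_factors in external.items():
            for i_f in i_factors:
                for e_f in e_factors:
                    yield (i_key + e_key, i_f, e_f)

for quadrant, i_f, e_f in tows(internal, external):
    print(f"{quadrant}: combine '{i_f}' with '{e_f}'")
# SO pairs suggest offensive strategies, WT pairs defensive ones, etc.
```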
Abstract:
The psbA gene of the chloroplast genome has a codon usage that is unusual for plant chloroplast genes. In the present study the evolutionary status of this codon usage is tested by reconstructing putative ancestral psbA sequences to determine the pattern of change in codon bias during angiosperm divergence. It is shown that the codon biases of the ancestral genes are much stronger than those of all extant flowering plant psbA genes. This relates to previous work that demonstrated a significant increase in synonymous substitution in psbA relative to other chloroplast genes. Based on these two lines of evidence, it is suggested that the codon bias of this gene is not currently being maintained by selection. Rather, the atypical codon bias may simply be a remnant of an ancestral codon bias that is now being degraded by the mutation bias of the chloroplast genome; in other words, the psbA gene is not at equilibrium. A model for the evolution of selective pressure on the codon usage of plant chloroplast genes is discussed.
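Codon bias of the kind discussed here is typically quantified with measures such as relative synonymous codon usage (RSCU): the observed count of a codon divided by the count expected if all synonyms for its amino acid were used equally. A minimal sketch, restricted to two synonymous families for brevity (a real analysis uses the full codon table):

```python
# Sketch of quantifying codon-usage bias via RSCU: observed codon count
# divided by the count expected under equal use of all synonymous codons.

from collections import Counter

FAMILIES = {
    "Phe": ["TTT", "TTC"],
    "Gly": ["GGT", "GGC", "GGA", "GGG"],
}

def rscu(seq: str) -> dict:
    codons = Counter(seq[i:i + 3] for i in range(0, len(seq) - 2, 3))
    out = {}
    for aa, syns in FAMILIES.items():
        total = sum(codons[c] for c in syns)
        if total == 0:
            continue
        expected = total / len(syns)  # equal use of all synonyms
        for c in syns:
            out[c] = codons[c] / expected
    return out

# A strongly biased toy sequence: TTC and GGT are heavily preferred.
print(rscu("TTCTTCTTCTTTGGTGGTGGTGGC"))
# {'TTT': 0.5, 'TTC': 1.5, 'GGT': 3.0, 'GGC': 1.0, 'GGA': 0.0, 'GGG': 0.0}
```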
Abstract:
With more than 10 fully sequenced, publicly available prokaryotic genomes, it is now becoming possible to gain useful insights into genome evolution. Before the genome era, many evolutionary processes were evaluated from limited data sets and evolutionary models were constructed on the basis of small amounts of evidence. In this paper, I show that genes on the Borrelia burgdorferi genome have two separate, distinct, and significantly different codon usages, depending on whether the gene is transcribed on the leading or lagging strand of replication. Asymmetrical replication is the major source of codon usage variation. Replicational selection is responsible for the higher number of genes on the leading strands, and transcriptional selection appears to be responsible for the enrichment of highly expressed genes on these strands. Replicational–transcriptional selection, therefore, has an influence on the codon usage of a gene. This is a new paradigm of codon selection in prokaryotes.
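A sketch of the preprocessing step such an analysis requires: classifying each gene as leading- or lagging-strand. Replication proceeds bidirectionally from the origin, so a gene transcribed in the same direction as its local replication fork lies on the leading strand. The origin position and gene coordinates below are invented for illustration:

```python
# Classify genes as leading- or lagging-strand given the replication origin.
# Replication forks move away from the origin in both directions; a gene
# co-oriented with its fork is on the leading strand.

ORIGIN = 450_000  # assumed position of the replication origin (bp)

def replication_class(start: int, strand: str) -> str:
    """strand is '+' or '-'; returns 'leading' or 'lagging'."""
    fork_rightward = start >= ORIGIN          # right replichore: fork moves right
    co_oriented = (strand == "+") == fork_rightward
    return "leading" if co_oriented else "lagging"

genes = [("dnaA", 460_000, "+"), ("recA", 200_000, "-"), ("flaB", 150_000, "+")]
for name, start, strand in genes:
    print(name, replication_class(start, strand))
# dnaA leading / recA leading / flaB lagging -- codon usage would then be
# compared between the two gene classes.
```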
Abstract:
Objectives: To explore whether the presence of online tables of contents (TOC) in an online catalog affects circulation (checkouts and in-house use). Two major questions were posed: (1) did the presence of online tables of contents for books increase use? and (2) if so, what factors might explain the increase?
Abstract:
Understanding the factors responsible for variations in mutation patterns and selection efficacy along chromosomes is a prerequisite for deciphering genome sequences. Population genetics models predict a positive correlation between the efficacy of selection at a given locus and the local rate of recombination because of Hill–Robertson effects. Codon usage is considered one of the most striking examples that support this prediction at the molecular level. In a wide range of species including Caenorhabditis elegans and Drosophila melanogaster, codon usage is essentially shaped by selection acting for translational efficiency. Codon usage bias correlates positively with recombination rate in Drosophila, apparently supporting the hypothesis that selection on codon usage is improved by recombination. Here we present an exhaustive analysis of codon usage in the complete genomes of C. elegans and D. melanogaster. We show that in both genomes there is a positive correlation between recombination rate and the frequency of optimal codons. However, we demonstrate that in both species this effect is due to a mutational bias toward G and C bases in regions of high recombination rate, possibly as a direct consequence of the recombination process. The correlation between codon usage bias and recombination rate in these species appears to be essentially determined by recombination-dependent mutational patterns, rather than selective effects. This result highlights that the mutagenic effect of recombination must be taken into account in order to understand its evolutionary role and impact.
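The statistical argument can be sketched as follows: correlate the frequency of optimal codons (Fop) with recombination rate, then compute the partial correlation controlling for GC content at third codon positions (GC3). If the raw correlation largely vanishes once GC3 is controlled for, a recombination-associated mutational bias toward G and C explains the pattern. The toy data below are invented; with them the raw correlation is about 0.99 and the partial correlation drops to about 0.6:

```python
# Raw vs. GC3-controlled correlation between Fop and recombination rate.

from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

fop    = [0.42, 0.48, 0.55, 0.61, 0.66]  # per-gene frequency of optimal codons
recomb = [0.5, 1.1, 2.0, 3.2, 4.0]       # local recombination rate (cM/Mb)
gc3    = [0.40, 0.47, 0.55, 0.62, 0.67]  # third-position GC content

print(round(pearson(fop, recomb), 2))         # strong raw correlation (~0.99)
print(round(partial_corr(fop, recomb, gc3), 2))  # attenuated given GC3 (~0.6)
```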
Abstract:
The correction of presbyopia and restoration of true accommodative function to the ageing eye is the focus of much ongoing research and clinical work. A range of accommodating intraocular lenses (AIOLs) implanted during cataract surgery has been developed; they are designed to change either their position or their shape in response to ciliary muscle contraction, generating an increase in dioptric power. Two main design concepts exist. First, axial-shift concepts rely on anterior axial movement of one or two optics to create accommodative ability. Second, curvature-change designs aim to provide significant amplitudes of accommodation with little physical displacement. Single-optic devices have been used most widely, although the true accommodative ability provided by forward shift of the optic appears limited, and recent findings indicate that alternative factors, such as flexing of the optic altering ocular aberrations, may be responsible for the enhanced near vision reported in published studies. Techniques for analysing the performance of AIOLs have not been standardised, and clinical studies have reported findings using a wide range of both subjective and objective methods, making it difficult to gauge the success of these implants. There is a need for longitudinal studies using objective methods to assess the long-term performance of AIOLs and to determine whether true accommodation is restored by the available designs. While dual-optic and curvature-change IOLs are designed to provide greater amplitudes of accommodation than is possible with single-optic devices, several of these implants are in the early stages of development and require significant further work before human use is possible. A number of challenges remain and must be addressed before the ultimate goal of restoring youthful levels of accommodation to the presbyopic eye can be achieved.
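A rough paraxial vergence sketch (a reduced-eye model with assumed, illustrative values for corneal power, IOL power, and axial length; not a clinical model) of why forward shift of a single optic yields only limited accommodation: a full 1 mm of anterior movement buys only about 1.4 D here:

```python
# Paraxial sketch of the accommodation gained by shifting an IOL forward.
# All model-eye values are assumptions chosen for illustration.

N_AQUEOUS = 1.336   # refractive index of aqueous/vitreous
P_CORNEA = 42.0     # corneal power (D), assumed
P_IOL = 21.5        # IOL power (D), assumed
AXIAL_LEN = 0.0239  # corneal vertex to retina (m), assumed

def in_focus_object_vergence(iol_depth: float) -> float:
    """Object vergence at the cornea (D) that is imaged onto the retina
    for an IOL sitting iol_depth metres behind the cornea."""
    v_retina = N_AQUEOUS / (AXIAL_LEN - iol_depth)  # vergence needed at IOL exit
    v_before_iol = v_retina - P_IOL                 # remove IOL power
    # back-propagate the vergence from the IOL plane to the corneal plane
    v_after_cornea = v_before_iol / (1 + iol_depth * v_before_iol / N_AQUEOUS)
    return v_after_cornea - P_CORNEA                # remove corneal power

base = in_focus_object_vergence(0.0045)     # IOL 4.5 mm behind cornea
shifted = in_focus_object_vergence(0.0035)  # after a 1 mm forward shift
print(round(base - shifted, 2))             # accommodation gained (D), ~1.4
```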