995 results for 280301 Programming Techniques


Relevance: 20.00%

Abstract:

The aim of this study was to identify and map the weed population in a no-tillage area. Geostatistical techniques were used in the mapping in order to assess this information as a tool for the localized application of herbicides. The study area covers 58.08 hectares and was sampled on a fixed square grid (points spaced 50 m apart, 232 points in total) using a GPS receiver. At each point, the weed species and population were assessed within a fixed 0.25 m² quadrat. The species Ipomoea grandifolia, Gnaphalium spicatum, Richardia spp. and Emilia sonchifolia presented no spatial dependence, whereas Conyza spp., Cenchrus echinatus and Eleusine indica showed spatial correlation. Among the models tested, the spherical model gave the best fit for Conyza spp. and E. indica, and the Gaussian model for C. echinatus. All three species have a clumped spatial distribution. The mapping of weeds can be a tool for localized control, making herbicide use more rational, effective and economical.
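Since the abstract names the spherical and Gaussian semivariogram models, the following minimal sketch shows how such models can be fitted to empirical semivariances; the lag distances, semivariance values and starting parameters are hypothetical placeholders, not the study's data.

```python
# Minimal sketch: fitting spherical and Gaussian variogram models to
# empirical semivariances. All numeric values below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def spherical(h, c0, c, a):
    """Spherical model: nugget c0, partial sill c, range a."""
    h = np.asarray(h, dtype=float)
    inside = c0 + c * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h <= a, inside, c0 + c)

def gaussian(h, c0, c, a):
    """Gaussian model: approaches the sill c0 + c asymptotically."""
    return c0 + c * (1.0 - np.exp(-(np.asarray(h, dtype=float) / a) ** 2))

lags = np.array([25, 50, 75, 100, 150, 200, 250], dtype=float)  # lag distances (m)
gamma = np.array([0.8, 1.4, 1.9, 2.2, 2.4, 2.5, 2.5])           # semivariances

for name, model in [("spherical", spherical), ("gaussian", gaussian)]:
    params, _ = curve_fit(model, lags, gamma, p0=[0.5, 2.0, 150.0])
    rss = np.sum((gamma - model(lags, *params)) ** 2)
    print(f"{name}: nugget={params[0]:.2f} sill={params[0] + params[1]:.2f} "
          f"range={params[2]:.1f} m, RSS={rss:.3f}")
```

Comparing the residual sums of squares is one simple way to decide, as the study did per species, which model fits a given weed's semivariogram better.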

Relevance: 20.00%

Abstract:

Phytotoxic effects of the invasive weed Parthenium hysterophorus were studied using whole-plant, leaf and root aqueous extracts at 0, 2.5, 5.0, 7.5 and 10% (w/v) concentrations against the germination and early seedling growth of wheat and canola. Studies were carried out both in Petri plates with filter paper as the substratum under controlled conditions and in soil-filled plastic pots placed in an open environment. Pronounced variation was noted in the phytotoxic activity of the different parthenium plant parts, aqueous extract concentrations, test species, and bioassay techniques. Aqueous parthenium extracts either inhibited or delayed germination and suppressed seedling growth of the test species relative to the control. For both test species, all germination attributes were suppressed to a greater extent in Petri plates than in plastic pots. Leaf extracts were more suppressive to germination of the test species than whole-plant and root extracts. Increasing the extract concentration beyond 2.5% caused a significant reduction in seedling dry biomass of both test species. Aqueous parthenium extract diminished the chlorophyll contents of wheat and canola by 32-63% and 29-69%, respectively. Nevertheless, an increase of 9-172% and 22-60% in the phenolic contents of wheat and canola was recorded. Canola appeared to be more susceptible than wheat at all extract concentrations. The present study concluded that bioassays conducted under controlled conditions using filter paper as the substratum may be misleading due to overestimation of the allelopathic response and variation in the potential of receiver and donor species. Furthermore, it implies that the threshold concentrations of allelochemicals for the test species in Petri plates are rarely reached under field conditions.

Relevance: 20.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 20.00%

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field: digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independence of the nodes also implies that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes run concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language which in the general case requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run time, while most decisions are pre-calculated. The result is then an as-small-as-possible set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking.

The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications. The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
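To make the execution model concrete, the sketch below shows a miniature dataflow network in Python: actors connected by FIFO queues that fire only when enough input tokens are available, driven by a trivial dynamic scheduler. It illustrates the paradigm only; it is not RVC-CAL, and all names here are invented for the example.

```python
# A minimal dataflow sketch: actors communicate solely through FIFO queues
# and fire only when their input queues hold enough tokens.
from collections import deque

class Queue:
    def __init__(self):
        self.tokens = deque()

class Actor:
    def __init__(self, inputs, outputs, consume, fire):
        self.inputs, self.outputs = inputs, outputs
        self.consume = consume   # tokens needed per input queue to fire
        self.fire = fire         # function: list of input token lists -> outputs

    def can_fire(self):
        return all(len(q.tokens) >= n for q, n in zip(self.inputs, self.consume))

    def step(self):
        args = [[q.tokens.popleft() for _ in range(n)]
                for q, n in zip(self.inputs, self.consume)]
        for q, token in zip(self.outputs, self.fire(args)):
            q.tokens.append(token)

# Network: a -> double -> b
a, b = Queue(), Queue()
double = Actor([a], [b], [1], lambda ins: [ins[0][0] * 2])

a.tokens.extend([1, 2, 3])
# A trivial dynamic scheduler: fire any ready actor until none can fire.
while double.can_fire():
    double.step()
print(list(b.tokens))   # [2, 4, 6]
```

The `while can_fire()` loop is exactly the run-time firing-rule evaluation the thesis seeks to minimize: quasi-static scheduling would replace most of these checks with pre-computed static firing sequences.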

Relevance: 20.00%

Abstract:

Chromosome abnormalities and the mitotic index in lymphocyte cultures, and micronuclei in buccal mucosa cells, were investigated in a sample of underground coal miners from Southern Brazil. A decreased mitotic index, an excess of micronuclei and a higher frequency of chromosome abnormalities (fragments, polyploidy and overall chromosome alterations) were observed in the miners when compared to age-matched normal controls from the same area. An alternative assay for clastogenesis in occupational exposure was tested by exposing lymphocytes from non-exposed individuals to a pool of plasmas from the exposed population. This assay proved to be very convenient, as the lymphocytes obtained from the same individuals can be used as target as well as control cells. It also yielded a larger number of metaphases and of successful cultures than common lymphocyte cultures from miners. A significantly higher frequency of chromatid gaps, fragments and overall alterations was observed when lymphocytes from control subjects were exposed to miner plasma pools. Control plasma pools did not significantly induce any type of chromosome alteration in the cultures of normal subjects, thus indicating that the results are not due to the effect of the addition of plasma pools per se.

Relevance: 20.00%

Abstract:

Methods previously described by Canovai et al. (Caryologia 47: 241-247, 1994), which produced C and ASG bands in the mitotic chromosomes of Ceratitis capitata, were applied to the chromosomes of several Anastrepha species. Metaphase plate yield was substantially increased by the use of imaginal disks together with cerebral ganglia. The C-bands were quite prominent, allowing the resolution of tiny blocks of heterochromatin. The ASG method produced G-like banded chromosomes, which permitted the recognition of each individual chromosome. These simple techniques do not require special equipment and may be valuable for karyotype variability studies in fruit flies and other Diptera.

Relevance: 20.00%

Abstract:

Standard techniques for radioautography used in biological and medical research can be classified into three categories: macroscopic radioautography, light microscopic radioautography and electron microscopic radioautography. The routine techniques used in these three procedures are described. With regard to macroscopic radioautography, whole-body radioautography is a standard technique which employs freezing and cryosectioning and can demonstrate the organ distributions of both soluble and insoluble compounds. In contrast, in light and electron microscopic radioautography, soluble and insoluble techniques are separate. In order to demonstrate insoluble labeled compounds, conventional chemical fixations, such as formalin for light microscopy or buffered glutaraldehyde and osmium tetroxide for both light and electron microscopy, followed by dehydration, embedding and wet-mounting application of radioautographic emulsions, can be used. For the demonstration of soluble labeled compounds, however, cryotechniques such as cryofixation, cryosectioning, freeze-drying and freeze-substitution, followed by dry-sectioning and dry-mounting radioautography, should be employed for both light and electron microscopy. The outlines of these techniques, which should be utilized in various fields of biological and medical research, are described in detail.

Relevance: 20.00%

Abstract:

To assess the relationships between neuropeptide-binding sites and receptor proteins in rat brain, the distribution of radioautographically labeled somatostatin- and neurotensin-binding sites was compared to that of immunolabeled sst2A and NTRH receptor subtypes, respectively. By light microscopy, immunoreactive sst2A receptors were either confined to neuronal perikarya and dendrites or diffusely distributed in the tissue. By electron microscopy, areas expressing somatodendritic sst2A receptors displayed only low proportions of membrane-associated, as compared to intracellular, receptors. Conversely, regions displaying diffuse sst2A labeling exhibited higher proportions of membrane-associated than intracellular receptors. Furthermore, the former showed only low levels of radioautographically labeled somatostatin-binding sites, whereas the latter contained high densities of somatostatin-binding sites, suggesting that membrane-associated receptors are preferentially recognized by the radioligand. In the case of NTRH receptors, there was a close correspondence between the light microscopic distribution of NTRH immunoreactivity and that of labeled neurotensin-binding sites. Within the substantia nigra, the bulk of the immuno- and radioautographically labeled receptors was associated with the cell bodies and dendrites of presumptive DA neurons. By electron microscopy, both markers were detected inside as well as on the surface of labeled neurons. At the level of the plasma membrane, their distributions were highly correlated and characterized by a lack of enrichment at synaptic junctions and by a homogeneous distribution along the remaining neuronal surface, in conformity with the hypothesis of an extra-synaptic action of this neuropeptide. Inside labeled dendrites, there was a proportionally higher content of immunoreactive than radiolabeled receptors. Some of the immunolabeled receptors not recognized by the radioligand were found in endosome-like organelles, suggesting that, as in the case of sst2A receptors, they may have undergone endocytosis subsequent to binding of the endogenous peptide.

Relevance: 20.00%

Abstract:

In the present study, using noise-free simulated signals, we performed a comparative examination of several preprocessing techniques that are used to transform the cardiac event series into a regularly sampled time series appropriate for spectral analysis of heart rhythm variability (HRV). First, a group of noise-free simulated point event series, representing time series of heartbeats, was generated by an integral pulse frequency modulation model. In order to evaluate the performance of the preprocessing methods, the differences between the spectra of the preprocessed simulated signals and the true spectrum (the spectrum of the model's input modulating signals) were surveyed by visual analysis and by contrasting merit indices. It is desired that the estimated spectra match the true spectrum as closely as possible, showing a minimum of harmonic components and other artifacts. The merit indices proposed to quantify these mismatches were the leakage rate, defined as a measure of leakage components (located outside narrow windows centered at the frequencies of the model's input modulating signals) with respect to the whole spectral content, and the numbers of leakage components with amplitudes greater than 1%, 5% and 10% of the total spectral content. Our data, obtained from a noise-free simulation, indicate that using heart rate values instead of heart period values in the derivation of signals representative of heart rhythm results in more accurate spectra. Furthermore, our data support the efficiency of the widely used preprocessing technique based on the convolution of inverse interval function values with a rectangular window, and suggest the preprocessing technique based on cubic polynomial interpolation of inverse interval function values and subsequent spectral analysis as another efficient and fast method for the analysis of HRV signals.
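A minimal sketch of one preprocessing route discussed above: deriving the inverse interval function (instantaneous heart rate), resampling it onto a regular grid by cubic interpolation, and estimating the spectrum. The beat series is simulated with a single known modulation; all parameter values are illustrative, not those used in the study.

```python
# Sketch: cubic interpolation of inverse interval (heart rate) values,
# then spectral estimation of the regularly resampled series.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import welch

# Simulated RR intervals (s), modulated at ~0.25 Hz in time
n = np.arange(300)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * (0.8 * n))
beats = np.cumsum(rr)                      # beat occurrence times (s)

hr = 1.0 / np.diff(beats)                  # inverse interval function (Hz)
t_hr = beats[1:]                           # each rate assigned to interval end

fs = 4.0                                   # resampling rate (Hz)
t_reg = np.arange(t_hr[0], t_hr[-1], 1 / fs)
hr_reg = CubicSpline(t_hr, hr)(t_reg)      # cubic interpolation to regular grid

f, pxx = welch(hr_reg - hr_reg.mean(), fs=fs, nperseg=256)
print(f"spectral peak at {f[np.argmax(pxx)]:.2f} Hz")   # ~0.25 Hz expected
```

The quality criterion in the study corresponds to how cleanly the estimated spectrum recovers the known modulation frequency without spurious leakage components.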

Relevance: 20.00%

Abstract:

The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data is rapidly growing, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is a method commonly used to make the original incomplete data complete, thus making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as those using gene microarray techniques. Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasant layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within the regular force-directed graph layout algorithm.
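To illustrate the k-NN imputation baseline mentioned above, here is a minimal sketch: a missing entry is replaced by the average of that column's value in the k most similar rows, with similarity computed over mutually observed columns. The expression matrix is a small invented example, not data from the thesis.

```python
# Sketch of k-NN missing value imputation for an expression matrix
# (rows = genes, columns = samples); NaN marks a missing entry.
import numpy as np

def knn_impute(x, k=2):
    x = x.copy()
    for i, row in enumerate(x):
        miss = np.isnan(row)
        if not miss.any():
            continue
        # Distance to every other row over columns observed in both rows
        d = np.full(len(x), np.inf)
        for j, other in enumerate(x):
            if j == i:
                continue
            both = ~miss & ~np.isnan(other)
            if both.any():
                d[j] = np.sqrt(np.mean((row[both] - other[both]) ** 2))
        neighbors = np.argsort(d)[:k]
        for c in np.where(miss)[0]:
            vals = x[neighbors, c]
            vals = vals[~np.isnan(vals)]
            if vals.size:
                x[i, c] = vals.mean()   # neighbor average fills the gap
    return x

data = np.array([[1.0, 2.0, np.nan],
                 [1.1, 2.1, 3.0],
                 [0.9, 1.9, 2.8],
                 [5.0, 6.0, 7.0]])
print(knn_impute(data, k=2))   # missing entry becomes (3.0 + 2.8) / 2
```

More advanced methods such as BPCA replace this local neighbor average with a probabilistic low-rank model of the whole matrix, which is why they can win on data sets where local similarity is a poor guide.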

Relevance: 20.00%

Abstract:

Switching power supplies are usually implemented with control circuitry that uses a constant clock frequency to turn the power semiconductor switches on and off. A drawback of this customary operating principle is that the switching frequency and its harmonics are present in both the conducted and the radiated EMI spectrum of the power converter. Various variable-frequency techniques have been introduced during the last decade to overcome this EMC problem. The main objective of this study was to compare the EMI and steady-state performance of a switch-mode power supply under different spread-spectrum/variable-frequency methods. Another goal was to find suitable tools for variable-frequency EMI analysis. This thesis can be divided into three main parts: firstly, some aspects of spectral estimation and measurement are presented; secondly, selected spread-spectrum generation techniques are presented with simulations and background information; finally, simulations and prototype measurements of the EMC and steady-state performance are carried out. A combination of the autocorrelation function, the Welch spectrum estimate and the spectrogram was used as a substitute for ordinary Fourier methods in the EMC analysis. It was also shown that the switching function can be used in a preliminary EMC analysis of an SMPS, and that the spectrum and autocorrelation sequence of the switching function correlate with the final EMI spectrum. This work is based on numerous simulations and measurements made with a prototype, all performed on a boost DC/DC converter. Four different variable-frequency modulation techniques in six different configurations were analyzed and their EMI performance was compared to constant-frequency operation. Output voltage and input current waveforms were also analyzed in the time domain to see the effect of spread-spectrum operation on these quantities. According to the results presented in this work, spread-spectrum modulation can be utilized in a power converter for EMI mitigation. The results of the steady-state voltage measurements show that variable-frequency operation of the SMPS has an effect on the voltage ripple, but the ripple measured from the prototype is still acceptable for some applications. Both current and voltage ripple can be controlled with proper main circuit and controller design.
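The claim that the switching function predicts the EMI spectrum can be illustrated with a short simulation: a fixed-frequency switching signal concentrates energy at the switching harmonic, while randomizing the period spreads it. This is a hedged sketch with invented operating values, not the prototype's configuration or measurement setup.

```python
# Sketch: Welch spectrum of a constant-frequency vs. a randomized-period
# (spread-spectrum) switching function.
import numpy as np
from scipy.signal import welch

fs = 1e6                        # analysis sample rate (Hz)
f_sw, duty, t_end = 20e3, 0.4, 0.05   # nominal switching freq, duty, duration
rng = np.random.default_rng(1)

def switching_function(randomize):
    t, edges = 0.0, []
    while t < t_end:
        jitter = rng.uniform(-0.1, 0.1) if randomize else 0.0
        period = (1 + jitter) / f_sw        # +/-10% period randomization
        edges.append((t, t + duty * period))
        t += period
    sig = np.zeros(int(t_end * fs))
    for on, off in edges:
        sig[int(on * fs):int(off * fs)] = 1.0
    return sig

for name, s in [("constant", switching_function(False)),
                ("spread", switching_function(True))]:
    f, pxx = welch(s, fs=fs, nperseg=8192)
    k = np.argmin(np.abs(f - f_sw))
    print(f"{name}: PSD near {f_sw / 1e3:.0f} kHz = {pxx[k]:.2e}")
```

The spread-spectrum variant shows a visibly lower peak at the nominal switching frequency, at the cost of energy smeared across neighboring bins, which is the EMI/ripple trade-off the measurements in this work quantify.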

Relevance: 20.00%

Abstract:

Environmental issues, including global warming, are serious challenges recognized worldwide, and they have become particularly important for iron and steel manufacturers during the last decades. Many sites have been shut down in developed countries due to environmental regulation and pollution prevention, while a large number of production plants have been established in developing countries, which has changed the economy of this business. Sustainable development is a concept which today affects economic growth, environmental protection, and social progress in setting up the basis for future ecosystems. A sustainable headway may attempt to preserve natural resources, recycle and reuse materials, prevent pollution, enhance yield and increase profitability. To achieve these objectives, numerous alternatives should be examined in sustainable process design. Conventional engineering work cannot address all of these alternatives effectively and efficiently to find an optimal processing route. A systematic framework is needed as a tool to guide designers in making decisions based on an overall view of the system, identifying the key bottlenecks and opportunities that lead to an optimal design and operation of the system. Since the 1980s, researchers have made great efforts to develop tools for what today is referred to as Process Integration. Advanced mathematics has been used in simulation models to evaluate the various available alternatives, considering physical, economic and environmental constraints. Improvements in feed material and operation, a competitive energy market, environmental restrictions and the role of Nordic steelworks as energy suppliers (electricity and district heat) provide strong motivation for integration among industries towards more sustainable operation, which could increase overall energy efficiency and decrease environmental impacts. In this study, a model is developed in several steps for primary steelmaking, with the Finnish steel sector as a reference, to evaluate future operation concepts of a steelmaking site with regard to sustainability. The research started with a study of the potential for increasing energy efficiency and reducing carbon dioxide emissions through the integration of steelworks with chemical plants, with possible utilization of the off-gases available in the system for chemical products. These off-gases from the blast furnace, basic oxygen furnace and coke oven consist mainly of carbon monoxide, carbon dioxide, hydrogen, nitrogen and partly methane (in coke oven gas); they have a relatively low heating value and are currently used as fuel within these industries. A nonlinear optimization technique is used to assess integration with a methanol plant under novel blast furnace technologies and (partial) substitution of coal with other reducing agents and fuels, such as heavy oil, natural gas and biomass. The technical aspects of integration and its effect on blast furnace operation, regardless of the capital expenditure of new operational units, are studied to evaluate the feasibility of the idea behind the research. Later, the concept of a polygeneration system was added and a superstructure was generated with alternative routes for off-gas pretreatment and further utilization in a polygeneration system producing electricity, district heat and methanol.

(Vacuum) pressure swing adsorption, membrane technology and chemical absorption for gas separation; partial oxidation, carbon dioxide reforming and steam methane reforming for methane gasification; and gas- and liquid-phase methanol synthesis are the main alternative process units considered in the superstructure. Due to the high degree of integration in process synthesis and the optimization techniques used, equation-oriented modeling is chosen over the previous sequential modeling strategy as an effective approach for analyzing the suggested superstructure. A mixed integer nonlinear programming (MINLP) model is developed to study the behavior of the integrated system under different economic and environmental scenarios. Net present value and specific carbon dioxide emissions are used to compare the economic and environmental aspects of the integrated system, respectively, for different fuel systems, alternative blast furnace reductants, the implementation of new blast furnace technologies, and carbon dioxide emission penalties. Sensitivity analysis, carbon distribution and the effect of external seasonal energy demand are investigated with different optimization techniques. This tool can provide useful techno-environmental and economic information for decision-making and can estimate the optimal operating conditions of current and future primary steelmaking under alternative scenarios. The results of the work demonstrate that it is possible to develop steelmaking towards more sustainable operation in the future.
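The full study formulates a mixed integer nonlinear program; purely as a flavor of that formulation style, the toy sketch below poses a linearized reductant-mix-plus-investment decision as a MILP with scipy. Every coefficient, bound, variable name and constraint here is an invented placeholder, not data or structure from the thesis.

```python
# Toy MILP sketch: choose a reductant mix (continuous) and whether to build
# a methanol unit (binary), minimizing fuel cost plus a CO2 penalty minus
# methanol revenue. All numbers are hypothetical.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Variables: x = [coal, gas, biomass] (energy units), y = build methanol unit
cost = np.array([10.0, 14.0, 12.0])    # cost per energy unit (hypothetical)
co2 = np.array([9.0, 5.0, 1.0])        # t CO2 per energy unit (hypothetical)
co2_penalty = 3.0                      # cost per t CO2 (hypothetical)
methanol_profit = 40.0                 # net revenue if the unit is built

c = np.concatenate([cost + co2_penalty * co2, [-methanol_profit]])

# Supply exactly 100 energy units of reductant; the methanol unit needs at
# least 30 units of gas-derived feed when built (big-M style linking row).
A = np.array([[1, 1, 1, 0],            # coal + gas + biomass = 100
              [0, 1, 0, -30]])         # gas - 30*y >= 0
cons = LinearConstraint(A, lb=[100, 0], ub=[100, np.inf])

res = milp(c=c, constraints=cons, integrality=np.array([0, 0, 0, 1]),
           bounds=Bounds([0, 0, 0, 0], [80, 60, 40, 1]))
coal, gas, bio, build = res.x
print(f"coal={coal:.0f} gas={gas:.0f} biomass={bio:.0f} "
      f"build_methanol={build:.0f} objective={res.fun:.1f}")
```

Raising `co2_penalty` shifts the optimum from coal towards biomass and gas, which is the same kind of scenario lever (emission penalties, fuel systems) the actual model explores at far greater fidelity with nonlinear process constraints.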

Relevance: 20.00%

Abstract:

This article reports on the design and characteristics of substrate mimetics in protease-catalyzed reactions. Firstly, the basis of protease-catalyzed peptide synthesis and the general advantages of substrate mimetics over common acyl donor components are described. The binding behavior of these artificial substrates and the mechanism of catalysis are further discussed on the basis of hydrolysis, acyl transfer, protein-ligand docking, and molecular dynamics studies on the trypsin model. The general validity of the substrate mimetic concept is illustrated by the expansion of this strategy to trypsin-like, glutamic acid-specific, and hydrophobic amino acid-specific proteases. Finally, opportunities for combining the substrate mimetic strategy with chemical solid-phase peptide synthesis and for using substrate mimetics in non-peptide organic amide synthesis are presented.

Relevance: 20.00%

Abstract:

Graphene is a material with extraordinary properties. Its mechanical and electrical properties are unparalleled, but the difficulties in its production are hindering its breakthrough in applications. Graphene is a two-dimensional material made entirely of carbon atoms, and it is only a single atom thick. In this work, the properties of graphene and graphene-based materials are described, together with their common preparation techniques and related challenges. This thesis concentrates on top-down techniques, in which natural graphite is used as a precursor for graphene production. Graphite consists of graphene sheets stacked tightly together. In the top-down techniques, various physical or chemical routes are used to overcome the forces keeping the graphene sheets together, and many of them are described in the thesis. The most common chemical method is the oxidation of graphite with strong oxidants, which creates water-soluble graphene oxide. The properties of graphene oxide differ significantly from those of pristine graphene and, therefore, graphene oxide is often reduced to form materials collectively known as reduced graphene oxide. In the experimental part, the main focus is on the chemical and electrochemical reduction of graphene oxide. A novel chemical route using vanadium is introduced and compared to other common chemical graphene oxide reduction methods. A strong emphasis is placed on the electrochemical reduction of graphene oxide in various solvents. Raman and infrared spectroscopy are both used for in situ spectroelectrochemistry to closely monitor the spectral changes during the reduction process. These in situ techniques allow precise control over the reduction process, and even small changes in the material can be detected. Graphene and few-layer graphene were also prepared using physical force to separate these materials from graphite. Special adsorbate molecules in aqueous solutions, together with sonic treatment, produce stable dispersions of graphene and few-layer graphene sheets in water. This mechanical exfoliation method damages the graphene sheets considerably less than the chemical methods, although it suffers from a lower yield.

Relevance: 20.00%

Abstract:

Acute promyelocytic leukemia (AML M3) is a well-defined subtype of leukemia with specific and peculiar characteristics. Immediate identification of t(15;17) or the PML/RARA gene rearrangement is fundamental for treatment. The objective of the present study was to compare fluorescence in situ hybridization (FISH), reverse transcriptase-polymerase chain reaction (RT-PCR) and karyotyping in 18 samples (12 at diagnosis and 6 after treatment) from 13 AML M3 patients. Bone marrow samples were submitted to G-band karyotyping, FISH and RT-PCR. At diagnosis, cytogenetics was successful in 10 of 12 samples, 8 with t(15;17) and 2 without. FISH was positive in 11/12 cases (one had no cells for analysis), with positivity varying from 25 to 93% (mean: 56%). RT-PCR was performed in 6/12 cases and all were positive: 4 of the 8 patients with t(15;17), plus the 2 without metaphases. The lack of RT-PCR results in the other samples was due to poor-quality RNA. When the three tests were compared at diagnosis, karyotyping detected the translocation in 80% of the tested samples, while FISH and RT-PCR showed the PML/RARA rearrangement in 100% of them. Of the 6 samples evaluated after treatment, 3 showed a normal karyotype, 1 showed persistence of an abnormal clone and 2 yielded no metaphases. FISH was negative in the 4 samples studied and 2 had no material for analysis. RT-PCR was positive in 4 (2 of which had negative FISH, indicating residual disease) and negative in 2. When the three tests were compared after treatment, they were concordant in 2 of the 6 samples or, when there were not enough cells for all tests, concordant between karyotype and RT-PCR in one. At remission, RT-PCR was the most sensitive test for detecting residual disease, as expected (positive in 4/6 samples). An incidence of about 40% of 5' breaks and 60% of 3' breaks (bcr3 and bcr1/bcr2, respectively) was observed.