982 results for semi-recursive method


Relevance:

30.00%

Publisher:

Abstract:

This study evaluated the potential use of commercial mycoinsecticide formulations against red-gum lerp psyllid Glycaspis brimblecombei Moore in semi-field conditions. Eucalypt seedlings infested with psyllid nymphs were sprayed with different formulations and two concentrations of each product. Conidial deposits were evaluated after spraying for control efficiency. The conidial deposit was affected by the pathogen species and by formulation types. Higher conidial deposits were associated with mycoinsecticide formulation concentrates of lower granulometry and oil dispersion. However, some products with low deposits of conidia were highly efficient against psyllid nymphs. The results showed that the use of entomopathogenic fungi is a promising alternative method for controlling the red-gum lerp psyllid.


Graduate Program in Physics - IFT


The competitive regime faced by individuals is fundamental to modelling the evolution of social organization. In this paper, we assess the relative importance of contest and scramble food competition on the social dynamics of a provisioned semi-free-ranging Cebus apella group (n=18). Individuals competed directly for provisioned and clumped foods. Effects of indirect competition were apparent with individuals foraging in different areas and with increased group dispersion during periods of low food abundance. We suggest that both forms of competition can act simultaneously and to some extent synergistically in their influence on social dynamics; the combination of social and ecological opportunities for competition and how those opportunities are exploited both influence the nature of the relationships within social groups of primates and underlie the evolved social structure. Copyright (c) 2008 S. Karger AG, Basel


Semi-supervised learning is an important topic in machine learning, concerned with pattern classification when only a small subset of the data is labeled. In this paper, a new network-based (graph-based) semi-supervised classification model is proposed. It employs a combined random-greedy walk of particles, with competition and cooperation mechanisms, to propagate class labels to the whole network. Due to the competition mechanism, the proposed model spreads labels in a local fashion: each particle visits only a portion of the nodes potentially belonging to it, and is not allowed to visit nodes definitely occupied by particles of other classes. In this way, a "divide-and-conquer" effect is naturally embedded in the model. As a result, the proposed model achieves a good classification rate while exhibiting a lower computational complexity order than other network-based semi-supervised algorithms. Computer simulations carried out on synthetic and real-world data sets provide a numeric quantification of the performance of the method.
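The random-greedy walk with competition described in this abstract can be sketched in a few lines. The following is a simplified illustration, not the authors' algorithm: the particle dynamics (per-node domination counters, repulsion from rival labeled nodes, reset to the home node) are reduced to bare essentials, and the function name and constants are invented for the example.

```python
import random

def particle_competition(adj, labels, steps=2000, seed=1):
    """Toy particle-competition label propagation on a graph.

    adj    : dict node -> list of neighbour nodes
    labels : dict node -> class label for the few labeled nodes
    Each labeled node releases one particle; visits raise that class's
    domination of a node, and a particle entering a node firmly held by
    a rival class is bounced back to its home node."""
    rng = random.Random(seed)
    classes = sorted(set(labels.values()))
    dom = {v: {c: 0.0 for c in classes} for v in adj}
    for v, c in labels.items():
        dom[v][c] = float('inf')           # labeled nodes are fixed territory
    particles = list(labels.items())       # (home node, class) per particle
    pos = [home for home, _ in particles]
    for _ in range(steps):
        for i, (home, c) in enumerate(particles):
            nxt = rng.choice(adj[pos[i]])
            best = max(dom[nxt], key=dom[nxt].get)
            if dom[nxt][best] == float('inf') and best != c:
                pos[i] = home              # repelled by a rival's territory
            else:
                dom[nxt][c] += 1.0         # reinforce own class at this node
                pos[i] = nxt
    # each node takes the class that dominates it most
    return {v: max(dom[v], key=dom[v].get) for v in adj}
```

On a toy graph of two triangles joined by a single bridge edge, with one labeled node per triangle, the surviving domination counters assign each triangle to the nearer label.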


Little is known about the situational contexts in which individuals consume processed sources of dietary sugars. This study aimed to describe the situational contexts associated with the consumption of sweetened food and drink products in a Catholic Middle Eastern Canadian community. A two-stage exploratory sequential mixed-method design was employed with a rationale of triangulation. In stage 1 (n = 62), items and themes describing the situational contexts of sweetened food and drink product consumption were identified from semi-structured interviews and used to develop the content of the Situational Context Instrument for Sweetened Product Consumption (SCISPC). Face validity, readability and cultural relevance of the instrument were assessed. In stage 2 (n = 192), a cross-sectional study was conducted and exploratory factor analysis was used to examine the structure of the themes that emerged from the qualitative analysis, as a means of furthering construct validation. The reliability of the SCISPC and its predictive validity for the daily consumption of sweetened products were also assessed. In stage 1, six themes and 40 items describing the situational contexts of sweetened product consumption emerged from the qualitative analysis and were used to construct the first draft of the SCISPC. In stage 2, factor analysis enabled the clarification and expansion of the instrument's initial thematic structure. The revised SCISPC has seven factors and 31 items describing the situational contexts of sweetened product consumption. Initial validation indicated that the instrument has excellent internal consistency and adequate test-retest reliability. Two factors of the SCISPC (Snacking and Energy demands) had predictive validity for the daily consumption of total sugar from sweetened products, while the other factors (Socialization, Indulgence, Constraints, Visual Stimuli and Emotional needs) were instead associated with occasional consumption of these products.


In this study, a dynamic programming approach to the unconstrained two-dimensional non-guillotine cutting problem is presented. The method extends the recently introduced recursive partitioning approach for the manufacturer's pallet loading problem. The approach involves two phases and uses bounds based on unconstrained two-staged and non-staged guillotine cutting. The method finds the optimal cutting pattern for a large number of moderately sized problem instances known in the literature, and no counterexample was found for which the approach fails to find a known optimal solution. For instances whose required computer runtime is excessive, the approach is combined with simple heuristics to reduce its running time. Detailed numerical experiments show the reliability of the method. Journal of the Operational Research Society (2012) 63, 183-200. doi: 10.1057/jors.2011.6 Published online 17 August 2011
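As a rough illustration of the guillotine-cutting bounds mentioned in this abstract, a textbook dynamic program for unconstrained, non-staged guillotine cutting can be written directly. This is a generic sketch, not the paper's recursive partitioning method, and it ignores piece rotation.

```python
from functools import lru_cache

def guillotine_value(L, W, pieces):
    """Best value obtainable from an L x W plate by edge-to-edge
    (guillotine) cuts, where each piece type (l, w, v) may be used any
    number of times (the unconstrained problem).  Rotation not allowed."""
    @lru_cache(maxsize=None)
    def best(x, y):
        # option 1: place one piece that fits and stop cutting this plate
        v = max([p[2] for p in pieces if p[0] <= x and p[1] <= y], default=0)
        # option 2: one vertical guillotine cut, solve both halves
        for cut in range(1, x // 2 + 1):
            v = max(v, best(cut, y) + best(x - cut, y))
        # option 3: one horizontal guillotine cut, solve both halves
        for cut in range(1, y // 2 + 1):
            v = max(v, best(x, cut) + best(x, y - cut))
        return v
    return best(L, W)
```

For example, a 10x10 plate with a single 5x10 piece type of value 1 yields 2, obtained by one vertical edge-to-edge cut.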


Semi-supervised learning techniques have gained increasing attention in the machine learning community as a result of two main factors: (1) the amount of available data is increasing exponentially; (2) the task of data labeling is cumbersome and expensive, involving human experts in the process. In this paper, we propose a network-based semi-supervised learning method inspired by the modularity greedy algorithm, which was originally applied to unsupervised learning. The modularity maximization process is modified so that the model propagates labels throughout the network. Furthermore, a network reduction technique is introduced, together with an extensive analysis of its impact on the network. Computer simulations are performed on artificial and real-world databases, providing a quantitative basis for the performance of the proposed method.
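A minimal sketch of the idea, assuming nothing beyond this abstract: greedy modularity merging is constrained so that communities carrying different known labels can never merge, and each final community inherits its label. The merge criterion and bookkeeping below are illustrative, not the paper's exact formulation.

```python
def modularity_labels(edges, labels):
    """Semi-supervised greedy modularity merging on an unweighted graph.

    edges  : list of (u, v) pairs
    labels : dict node -> known class label (a few nodes only)"""
    nodes = sorted({v for e in edges for v in e})
    m = float(len(edges))
    deg = {v: 0 for v in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    comm = {v: {v} for v in nodes}            # community id -> member set
    lab = {v: labels.get(v) for v in nodes}   # community id -> known label
    def gain(c1, c2):
        # modularity gain of merging: e12/m - d1*d2/(2m)^2 * 2
        e12 = sum(1 for a, b in edges
                  if (a in comm[c1] and b in comm[c2])
                  or (a in comm[c2] and b in comm[c1]))
        d1 = sum(deg[v] for v in comm[c1])
        d2 = sum(deg[v] for v in comm[c2])
        return e12 / m - d1 * d2 / (2.0 * m * m)
    while True:
        ids = sorted(comm)
        best, pair = 0.0, None
        for i, c1 in enumerate(ids):
            for c2 in ids[i + 1:]:
                if lab[c1] and lab[c2] and lab[c1] != lab[c2]:
                    continue                  # rival labels: merge forbidden
                g = gain(c1, c2)
                if g > best:
                    best, pair = g, (c1, c2)
        if pair is None:                      # no positive-gain merge left
            break
        c1, c2 = pair
        comm[c1] |= comm.pop(c2)
        l2 = lab.pop(c2)
        lab[c1] = lab[c1] or l2               # merged community keeps its label
    return {v: lab[c] for c in comm for v in comm[c]}
```

On two triangles joined by a bridge, with one labeled node per triangle, the merges stop exactly at the two labeled communities.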


Cefadroxil is a semi-synthetic first-generation oral cephalosporin used in the treatment of mild to moderate infections of the respiratory and urinary tracts and of skin and soft tissue. In this work a simple, rapid, economical and sensitive HPLC-UV method is described for the quantitative determination of cefadroxil in human plasma samples using lamivudine as internal standard. Sample pre-treatment was accomplished through protein precipitation with acetonitrile, and chromatographic separation was performed with a mobile phase consisting of a mixture of sodium dihydrogen phosphate monohydrate solution, methanol and acetonitrile in the ratio of 90:8:2 (v/v/v) at a flow rate of 1.0 mL/min. The proposed method is linear from 0.4 to 40.0 µg/mL, with an average recovery of 102.21% for cefadroxil and 97.94% for lamivudine. The method is simple, sensitive, reproducible and fast for the determination of cefadroxil in human plasma, and can therefore be recommended for pharmacokinetic studies, including bioavailability and bioequivalence studies.


Dimensionality reduction is employed in visual data analysis as a way of obtaining reduced spaces for high-dimensional data or of mapping data directly into 2D or 3D spaces. Although techniques have evolved to improve data segregation in reduced or visual spaces, they have limited capabilities for adjusting the results according to the user's knowledge. In this paper, we propose a novel approach that handles both dimensionality reduction and visualization of high-dimensional data while taking the user's input into account. It employs Partial Least Squares (PLS), a statistical tool that retrieves latent spaces focused on the discriminability of the data. The method uses a training set to build a highly precise model that can then be applied very effectively to a much larger data set. The reduced data set can be exhibited using various existing visualization techniques. The training data is what encodes the user's knowledge into the loop; however, this work also devises a strategy for calculating PLS reduced spaces when no training data is available. The approach produces increasingly precise visual mappings as the user feeds back his or her knowledge, and is capable of working with small and unbalanced training sets.
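A bare-bones NIPALS-style PLS1 sketch shows how the latent directions are steered by the labels: each direction is the covariance between the (deflated) features and the class indicator, so the reduced space favours class separation. This is a generic illustration of PLS dimensionality reduction, not the authors' implementation.

```python
def pls_scores(X, y, n_comp=1):
    """Project rows of X onto n_comp PLS1 latent directions.

    X : list of feature rows, y : list of class indicators (e.g. +1/-1).
    Assumes at least one feature covaries with y (weight norm nonzero)."""
    n, p = len(X), len(X[0])
    mx = [sum(row[j] for row in X) / n for j in range(p)]
    my = sum(y) / n
    X = [[row[j] - mx[j] for j in range(p)] for row in X]   # center columns
    y = [v - my for v in y]
    scores = [[0.0] * n_comp for _ in range(n)]
    for k in range(n_comp):
        # weight vector = covariance of each (deflated) feature with y
        w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
        norm = sum(v * v for v in w) ** 0.5
        w = [v / norm for v in w]
        t = [sum(X[i][j] * w[j] for j in range(p)) for i in range(n)]
        tt = sum(v * v for v in t)
        loadings = [sum(t[i] * X[i][j] for i in range(n)) / tt
                    for j in range(p)]
        for i in range(n):
            for j in range(p):
                X[i][j] -= t[i] * loadings[j]               # deflate X
            scores[i][k] = t[i]
    return scores
```

Projecting onto the first one or two score columns gives the reduced space that a scatter-plot visualization would then display.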


The "sustainability" concept relates to prolonging human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship, and practices that conserve resources so that growth and development can be sustained long-term without degrading the environment, are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe and in the U.S. are large. The asphalt industry continues to develop technological improvements that reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures than hot mix asphalt (HMA), while aiming to maintain the desired post-construction properties of traditional HMA. Lowering the production temperature reduces fuel usage and emissions, thereby improving conditions for workers and supporting sustainable development. The crumb-rubber modifier (CRM), produced from shredded automobile tires and used in the United States since the mid-1980s, has also proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is relevant not only environmentally but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project aims to demonstrate the dual value of these asphalt mixes with regard to environmental and mechanical performance, and to suggest a low-environmental-impact design procedure. In fact, the use of eco-friendly materials is the first step towards an eco-compatible design, but it cannot be the only one.
The eco-compatible approach should also be extended to the design method and material characterization, because only with these phases can the maximum potential of the materials be exploited. Appropriate asphalt concrete characterization is essential for realistic performance prediction of asphalt concrete pavements. Volumetric (mix design) and mechanical (permanent deformation and fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to use the material correctly. A Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under different traffic and environmental conditions, was chosen. In particular, this study focuses on CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain related to surface cracking and rutting, respectively. It works in time increments and, using the output from one increment recursively as input to the next, predicts the pavement condition in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation under defined traffic and environmental conditions. The asphalt mixes studied were used as a 60 mm thick surface layer of a pavement structure. The performance of this pavement was compared with that of the same structure with different kinds of asphalt concrete as the surface layer. Three eco-friendly materials, two warm mix asphalts and a rubberized asphalt concrete, were analyzed in comparison with a conventional asphalt concrete.
The first two chapters summarize the steps needed to satisfy the sustainable pavement design procedure. Chapter I introduces the problem of eco-compatible asphalt pavement design; the low-environmental-impact materials, Warm Mix Asphalt and Rubberized Asphalt Concrete, are described in detail, and the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of deep laboratory characterization based on appropriate material selection and performance evaluation. Chapter III introduces CalME through an explanation of the different design approaches it provides, and of the I-R procedure in particular. Chapter IV presents the experimental program, with an explanation of the laboratory test devices adopted. The fatigue and rutting performances of the study mixes are shown in Chapters V and VI, respectively. From these laboratory test data, the CalME I-R model parameters for the master curve, fatigue damage and permanent shear strain were evaluated. Lastly, Chapter VII reports the results of simulations of asphalt pavement structures with different surface layers. For each pavement structure, the total surface cracking, total rutting, fatigue damage and rut depth in each bound layer were analyzed.
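The incremental-recursive idea described above can be caricatured in a few lines: each time increment computes the response with the current (damaged) modulus, accumulates damage from that response, and feeds the reduced modulus recursively into the next increment. The damage law and all constants below are invented for illustration and bear no relation to CalME's calibrated models.

```python
def incremental_recursive(e0, increments, load, k=0.01, alpha=2.0):
    """Toy I-R loop: modulus history of a single layer.

    e0         : undamaged layer modulus
    increments : number of time steps to simulate
    load       : constant traffic load per increment (arbitrary units)
    k, alpha   : hypothetical damage-law constants."""
    modulus, damage, history = e0, 0.0, []
    for _ in range(increments):
        strain = load / modulus                  # response with current stiffness
        damage = min(1.0, damage + k * strain ** alpha)
        modulus = e0 * (1.0 - damage)            # recursive input to next step
        history.append(modulus)
    return history
```

Because the damaged modulus raises the strain of the next increment, damage accumulation accelerates over time, which is the qualitative behaviour the I-R procedure captures.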


The diagnosis, grading and classification of tumours has benefited considerably from the development of DCE-MRI, which is now essential to the adequate clinical management of many tumour types owing to its ability to detect active angiogenesis. Several strategies have been proposed for DCE-MRI evaluation. Visual inspection of contrast-agent concentration curves versus time is very simple yet operator-dependent, so more objective approaches have been developed to facilitate comparison between studies. In so-called model-free approaches, descriptive or heuristic information extracted from the raw time series is used for tissue classification. The main issue with these schemes is that they have no direct interpretation in terms of physiological tissue properties. Model-based investigations, on the other hand, typically involve compartmental tracer kinetic modelling and pixel-by-pixel estimation of kinetic parameters via nonlinear regression applied to regions of interest suitably selected by the physician. This approach has the advantage of providing parameters directly related to pathophysiological tissue properties such as vessel permeability, local regional blood flow, extraction fraction and the concentration gradient between plasma and the extravascular-extracellular space. However, nonlinear modelling is computationally demanding, and the accuracy of the estimates can be affected by the signal-to-noise ratio and by the initial solutions. The principal aim of this thesis is to investigate the use of semi-quantitative and quantitative parameters for the segmentation and classification of breast lesions.
The objectives can be subdivided as follows: to describe the principal techniques for evaluating the time-intensity curve in DCE-MRI, with a focus on the kinetic models proposed in the literature; to evaluate the influence of the parametrization choice for a classic bi-compartmental kinetic model; to evaluate the performance of a method for simultaneous tracer kinetic modelling and pixel classification; and to evaluate the performance of machine learning techniques for the segmentation and classification of breast lesions.
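For the bi-compartmental (standard Tofts) model mentioned above, the tissue concentration is the plasma curve convolved with an exponential, Ct(t) = Ktrans ∫ Cp(τ) exp(-kep (t-τ)) dτ. The sketch below simulates and refits this model; the brute-force grid search stands in for the nonlinear regression used in practice, and the plasma input function is a toy placeholder.

```python
import math

def tofts_ct(ktrans, kep, cp, dt):
    """Discrete standard Tofts model: Ktrans times the plasma curve cp
    (sampled every dt minutes) convolved with exp(-kep * t)."""
    return [ktrans * dt * sum(cp[m] * math.exp(-kep * (n - m) * dt)
                              for m in range(n + 1))
            for n in range(len(cp))]

def fit_tofts(ct, cp, dt, kt_grid, kep_grid):
    """Recover (Ktrans, kep) from a tissue curve by least-squares grid
    search -- a stand-in for the pixel-by-pixel nonlinear regression."""
    def sse(kt, kep):
        return sum((a - b) ** 2
                   for a, b in zip(tofts_ct(kt, kep, cp, dt), ct))
    return min(((kt, kep) for kt in kt_grid for kep in kep_grid),
               key=lambda pair: sse(*pair))
```

In a noiseless round trip the grid search recovers the simulated parameters exactly, which is a useful sanity check before fitting measured curves.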


The biogenic production of NO in the soil accounts for between 10% and 40% of the global total. A large part of the uncertainty in the estimation of biogenic emissions stems from a shortage of measurements in arid regions, which comprise 40% of the Earth's land surface area. This study examined the emission of NO from three ecosystems in southern Africa covering an aridity gradient from semi-arid savannas in South Africa to the hyper-arid Namib Desert in Namibia. A laboratory method was used to determine the release of NO as a function of soil moisture and soil temperature. Various methods were used to up-scale the net potential NO emissions determined in the laboratory to the vegetation patch, landscape or regional level; the importance of landscape, vegetation and climatic characteristics is emphasized. The first study took place in a semi-arid savanna region in South Africa, where soils were sampled from four landscape positions in the Kruger National Park. The maximum NO emission occurred at soil moisture contents of 10%-20% water-filled pore space (WFPS). The highest net potential NO emissions came from the low-lying landscape positions, which have the largest nitrogen (N) stocks and the largest input of N. Net potential NO fluxes obtained in the laboratory were converted into field fluxes for the period 2003-2005 for the four landscape positions, using soil moisture and temperature data obtained in situ at the Kruger National Park flux tower site. The NO emissions ranged from 1.5 to 8.5 kg ha⁻¹ a⁻¹. The field fluxes were up-scaled to a regional basis using geographic information system (GIS) based techniques; this indicated that the highest NO emissions came from the Midslope positions due to their large geographical extent in the research area. Total emissions ranged from 20×10³ kg in 2004 to 34×10³ kg in 2003 for the 56,000 ha Skukuza land type. The second study took place in an arid savanna ecosystem in the Kalahari, Botswana.
In this study I collected soils from four different vegetation patch types: Pan, Annual Grassland, Perennial Grassland and Bush Encroached patches. The maximum net potential NO fluxes ranged from 0.27 ng m⁻² s⁻¹ in the Pan patches to 2.95 ng m⁻² s⁻¹ in the Perennial Grassland patches. The net potential NO emissions were up-scaled for the year December 2005-November 2006. This was done using 1) the net potential NO emissions determined in the laboratory, 2) the vegetation patch distribution obtained from LANDSAT NDVI measurements, 3) soil moisture contents estimated from ENVISAT ASAR measurements and 4) soil surface temperatures from MODIS 8-day land surface temperature measurements. This up-scaling procedure gave NO fluxes ranging from 1.8 g ha⁻¹ month⁻¹ in the winter months (June and July) to 323 g ha⁻¹ month⁻¹ in the summer months (January-March). Differences occurred between the vegetation patches: the highest NO fluxes came from the Perennial Grassland patches and the lowest from the Pan patches. Over the course of the year, the mean up-scaled NO emission for the studied region was 0.54 kg ha⁻¹ a⁻¹, which accounts for a loss of approximately 7.4% of the estimated N input to the region. The third study took place in the hyper-arid Namib Desert in Namibia. Soils were sampled from three ecosystems: Dunes, Gravel Plains and the riparian zone of the Kuiseb River. The net potential NO flux measured in the laboratory was used to estimate the NO flux of the Namib Desert for 2006, using modelled soil moisture and temperature data from the European Centre for Medium-Range Weather Forecasts (ECMWF) operational model at a 36 km x 35 km spatial resolution. The maximum net potential NO production occurred at low soil moisture contents (<10% WFPS), and the optimal temperature was 25°C in the Dune and Riparian ecosystems and 35°C in the Gravel Plain ecosystems.
The maximum net potential NO fluxes ranged from 3.0 ng m⁻² s⁻¹ in the Riparian ecosystem to 6.2 ng m⁻² s⁻¹ in the Gravel Plains ecosystem. Up-scaling the net potential NO flux gave NO fluxes of up to 0.062 kg ha⁻¹ a⁻¹ in the Dune ecosystem and 0.544 kg ha⁻¹ a⁻¹ in the Gravel Plain ecosystem. These studies show that NO is emitted ubiquitously from terrestrial ecosystems; as such, the NO emission potential of deserts and scrublands should be taken into account in global NO models. The emission of NO is influenced by various factors such as landscape, vegetation and climate. This study examines the potential emissions from certain arid and semi-arid environments in southern Africa and other parts of the world, and discusses some of the important factors controlling the emission of NO from the soil.
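The up-scaling chain described in these studies (a lab-derived flux response applied to gridded soil moisture and temperature, then summed over area and time) can be sketched as follows. The response-surface shape and every constant here are hypothetical placeholders, not the study's fitted functions.

```python
import math

def no_flux(wfps, temp, f_max=3.0, wfps_opt=15.0, t_opt=25.0):
    """Hypothetical net potential NO flux (ng m^-2 s^-1): peaks at an
    optimum water-filled pore space and an optimum temperature and falls
    off on either side, as the lab curves qualitatively do."""
    moisture = (wfps / wfps_opt) * math.exp(1.0 - wfps / wfps_opt)
    temperature = math.exp(-((temp - t_opt) / 10.0) ** 2)
    return f_max * moisture * temperature

def upscale_kg(cells, cell_area_ha, seconds):
    """Sum cell fluxes over a grid of (wfps, temp) pairs to a total
    emission in kg: ng m^-2 s^-1 * m^2 * s, then ng -> kg."""
    total_ng = (sum(no_flux(w, t) for w, t in cells)
                * cell_area_ha * 1.0e4 * seconds)   # 1 ha = 1e4 m^2
    return total_ng * 1.0e-12                       # 1 kg = 1e12 ng
```

With real inputs, `cells` would come from the remotely sensed moisture and temperature grids, one pair per pixel and time step.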


Physico-chemical characterization, structure-pharmacokinetic and metabolism studies of new semi-synthetic analogues of natural bile acids (BAs) as drug candidates have been performed. Recent studies discovered a role of BAs as agonists of the FXR and TGR5 receptors, opening new therapeutic targets for the treatment of liver diseases and metabolic disorders. Up to twenty new semi-synthetic analogues have been synthesized and studied in order to find promising novel drug candidates. To define the BAs' structure-activity relationship, their main physico-chemical properties (solubility, detergency, lipophilicity and affinity for serum albumin) were measured with validated analytical methodologies. Their metabolism and biodistribution were studied in the "bile fistula rat" model, in which each BA is acutely administered through duodenal and femoral infusion and bile is collected at different time intervals, allowing the relationships between structure and intestinal absorption, hepatic uptake, metabolism and systemic spill-over to be defined. One of the studied analogues, 6α-ethyl-3α,7α-dihydroxy-5β-cholanic acid, an analogue of CDCA (INT 747, Obeticholic Acid (OCA)) recently under approval for the treatment of cholestatic liver diseases, requires additional studies to ensure its safety and lack of toxicity when administered to patients with strong liver impairment. For this purpose, an animal model of hepatic decompensation (cirrhosis) induced by CCl4 inhalation in the rat was developed and used to compare the biodistribution of OCA with that in control animals, in order to determine whether peripheral tissues might also be exposed as a result of toxic plasma levels of OCA; the biodistribution of endogenous BAs was evaluated as well. An accurate and sensitive HPLC-ES-MS/MS method was developed to identify and quantify all BAs in biological matrices (bile, plasma, urine, liver, kidney, intestinal content and tissue), for which a sample pre-treatment was optimized.


The aim of this work is to develop new approaches for the preparation of structured composite particles in aqueous media, which can be regarded as the formation of precisely defined heterogeneous structures in colloidal systems. In general, two different approaches were developed, which differ in the origin of the heterogeneous structures formed: heterogeneity or homogeneity. The first approach is based on the aggregation of heterogeneous phases to form structured colloidal particles with heterogeneity in the underlying chemistry, while the second approach relies on the formation of heterogeneous phases in colloidal particles from homogeneous mixtures through controlled phase separation. In detail, the first part of the dissertation deals with a new preparation method for semi-crystalline composite colloidal particles of high stability, based on the aggregation of liquid monomer droplets onto semi-crystalline polyacrylonitrile particles. After aggregation, highly stable dispersions consisting of structured, semi-crystalline composite particles were obtained by free-radical polymerization, whereas directly mixing the PAN dispersions with methacrylate polymer dispersions led to immediate coagulation. Depending on the glass-transition temperature of the methacrylate polymer, the subsequent free-radical polymerization leads to the formation of raspberry or core-shell particles. The particles prepared in this way are able to form continuous films with embedded semi-crystalline phases, which can be applied as oxygen barriers. The second part of the dissertation describes a new method for the preparation of structured thermoset-thermoplastic composite colloidal particles.
The formation of a thermoset network with a thermoplastic shell was achieved in two steps by different, separate polymerization mechanisms: polyaddition and free-radical polymerization. Stable miniemulsions consisting of bisphenol-F-based epoxy resin, phenalkamine-based hardener and vinyl monomers were obtained. They were prepared by ultrasonication, with subsequent curing at different temperatures, as so-called seed emulsions. Further vinyl monomers were added and subsequently polymerized, leading to the formation of core-shell, i.e. thermoset-thermoplastic, colloidal particles. In both cases, a chemically induced phase separation takes place between the thermoset and the thermoplastic phase, which is essential for the formation of heterogeneous structures. The composite particles prepared in this way are able to form transparent films which, under suitable conditions, provide significantly improved mechanical properties compared with pure thermoset films.


In this work, a novel experimental setup -- the γ3 experiment -- for the measurement of photon-induced nuclear dipole excitations in stable isotopes was designed and installed at the High Intensity γ-Ray Source (HIγS) at Duke University. The high energy resolution and high detection efficiency of the detector setup, which consists of a combination of LaBr scintillators and high-purity germanium detectors, allow, for the first time, the efficient measurement of γ-γ coincidences in combination with the method of nuclear resonance fluorescence. This method opens access to the decay behaviour of the excited dipole states as an additional observable, enabling a more detailed understanding of the underlying structure of these excitations. The detector setup has already been used successfully in two experimental campaigns in 2012 and 2013 for the investigation of 13 different isotopes. The focus of this work was the analysis of the pygmy dipole resonance (PDR) in the nucleus 140Ce in the energy range from 5.2 MeV to 8.3 MeV, based on the data measured with the γ3 setup. In particular, the decay behaviour of the states participating in the PDR was investigated. The experimental setup, the details of the analysis and the results are presented in this work. Furthermore, a comparison of the results with theoretical calculations in the quasi-particle phonon model (QPM) allows an interpretation of the observed decay behaviour.