862 results for Branch-and-bound algorithm


Relevance:

100.00%

Publisher:

Abstract:

Uncertainty in decision-making about patients' risk of re-admission arises from non-uniform data and incomplete knowledge of health-system variables. Knowing the impact of risk factors will give clinicians better decision support and help reduce the number of patients admitted to hospital. Traditional approaches cannot account for the uncertain nature of hospital re-admission risk, and the problem is compounded by the large amount of uncertain information involved. Patients can be at high, medium or low risk of re-admission, and these strata have ill-defined boundaries. We believe that our model, which adapts the fuzzy regression method, offers a novel approach to handling uncertain data and uncertain relationships between health-system variables and the risk of re-admission. Because the boundaries of the risk bands are ill-defined, this approach allows clinicians to target individuals at those boundaries. Targeting individuals at boundaries and providing them with proper care may help move patients from the high-risk to the low-risk band. In developing this algorithm, we aimed to help potential users assess patients at various risk-score thresholds and avoid re-admission of high-risk patients through proper interventions. A model that predicts patients at high risk of re-admission enables interventions to be targeted before costs have been incurred and health status has deteriorated. A risk-score cut-off level would flag patients and could result in net savings even where per-patient intervention costs are high. Preventing hospital re-admissions is important for patients, and our algorithm may also affect hospital income.
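The abstract does not specify its membership functions; as a hedged illustration of overlapping risk bands on a hypothetical 0-100 re-admission score (the shapes and break-points below are assumptions, not the authors' model), a minimal Python sketch:

    import numpy as np

    def trapezoid(x, a, b, c, d):
        """Trapezoidal membership: 0 below a, 1 on [b, c], back to 0 at d."""
        return np.clip(np.minimum((x - a) / (b - a + 1e-12),
                                  (d - x) / (d - c + 1e-12)), 0.0, 1.0)

    def risk_band_memberships(score):
        """Hypothetical overlapping low/medium/high bands on a 0-100 risk score."""
        return {
            "low":    trapezoid(score, -1, 0, 25, 45),
            "medium": trapezoid(score, 25, 45, 55, 75),
            "high":   trapezoid(score, 55, 75, 100, 101),
        }

    # A patient near a band boundary belongs partially to two bands,
    # which is the group the abstract suggests targeting for intervention.
    print(risk_band_memberships(60.0))   # non-zero "medium" and "high" degrees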

Relevance:

100.00%

Publisher:

Abstract:

Advanced glycation end-products (AGEs) are linked to aging and related diseases. The aim of the present study was to evaluate oxidative-stress-related parameters in J774A.1 murine macrophage cells during chronic exposure to a subtoxic concentration of AGE (5% ribose-glycated serum (GS)) and, subsequently, for 48 h to a higher dose (10% GS). No effects on cell viability were evident in either experimental condition. During chronic treatment, glycative markers (free and bound pentosidine) increased significantly in the intra- and extracellular environments, but the production and release of thiobarbituric acid reactive substances (TBARs), an index of lipid peroxidation, decreased in a time-dependent manner. On exposure to 10% GS, glycative markers rose further, while the TBARs response indicated a cellular defence against oxidative stress. Nonadapted cultures showed an accumulation of AGEs, marked oxidative stress, and a loss of viability. During 10% GS exposure, reduced glutathione levels in adapted cultures remained constant, as did the oxidized-to-reduced glutathione ratio, whereas nonadapted cells showed a markedly increased redox ratio. A constant increase of heat shock protein 70 (HSP70) mRNA was observed in all experimental conditions. In contrast, HSP70 expression became undetectable at longer exposure times; this could be due to the direct involvement of HSP70 in the refolding of damaged proteins. Our findings suggest an adaptive response of macrophages to subtoxic doses of AGE, which could constitute an important factor in the spread of damage to other cell types during aging. Key words: in vitro cytotoxicity, AGE, pentosidine, glycoxidation, oxidative stress, TBARs.

Relevance:

100.00%

Publisher:

Abstract:

Ellerman Bombs (EBs) are often found to be co-spatial with bipolar photospheric magnetic fields. We use Hα imaging spectroscopy along with Fe I 6302.5 Å spectropolarimetry from the Swedish 1 m Solar Telescope (SST), combined with data from the Solar Dynamics Observatory, to study EBs and the evolution of the local magnetic fields at EB locations. EBs are found via an EB detection and tracking algorithm. Using NICOLE inversions of the spectropolarimetric data, we find that, on average, (3.43 ± 0.49) × 10²⁴ erg of stored magnetic energy disappears from the bipolar region during EB burning. The inversions also show flux cancellation rates of 10¹⁴–10¹⁵ Mx s⁻¹ and temperature enhancements of 200 K at the detection footpoints. We investigate the near-simultaneous flaring of EBs due to co-temporal flux emergence from a sunspot, which shows a decrease in transverse velocity when interacting with an existing, stationary area of opposite-polarity magnetic flux, resulting in the formation of the EBs. We also show that these EBs can be fuelled further by additional, faster-moving regions of negative magnetic flux.

Relevance:

100.00%

Publisher:

Abstract:

Research in ubiquitous and pervasive technologies has made it possible to recognise activities of daily living through non-intrusive sensors. The data captured by these sensors must be classified using machine learning or knowledge-driven techniques to infer and recognise activities. Discovering the activities and the activity-object patterns from sensors tagged to objects, as those objects are used, is critical to recognising the activities. In this paper, we propose a topic-model-based process for discovering activities and activity-object patterns from the interactions recorded by low-level state-change sensors. We also develop a recognition and segmentation algorithm to recognise activities and detect activity boundaries. The experimental results we present validate our framework and show that it is comparable to existing approaches.
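The paper's exact topic-model formulation is not reproduced here; as a minimal sketch of the general idea, the snippet below treats hypothetical time windows of object-tagged sensor firings as documents and fits an off-the-shelf LDA model (scikit-learn), so that recurring activity-object patterns surface as topics:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Hypothetical time windows of state-change sensor events, one "document" per window.
    windows = [
        "kettle cup fridge spoon kettle",          # making tea
        "toothbrush tap tap toothbrush",           # brushing teeth
        "fridge pan stove plate fork stove",       # cooking
        "cup kettle spoon sugar",                  # making tea again
        "tap soap tap towel",                      # washing hands
    ]

    vec = CountVectorizer()
    X = vec.fit_transform(windows)

    lda = LatentDirichletAllocation(n_components=3, random_state=0)
    doc_topic = lda.fit_transform(X)               # per-window activity mixture

    # Top sensors (objects) per discovered topic, i.e. a candidate activity-object pattern.
    terms = vec.get_feature_names_out()
    for k, comp in enumerate(lda.components_):
        top = comp.argsort()[::-1][:3]
        print(f"topic {k}:", [terms[i] for i in top])

A per-window topic mixture such as doc_topic is the kind of representation on which a recognition and segmentation step could then operate.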

Relevance:

100.00%

Publisher:

Abstract:

The present document deals with the shape optimization of aerodynamic profiles. The objective is to reduce the drag coefficient of a given profile without penalising the lift coefficient. A set of control points defining the geometry is passed in and parameterized as a B-spline curve. These points are modified automatically by means of CFD analysis. A given shape is defined by the user, and a valid volumetric CFD domain is constructed from this planar data and a set of user-defined parameters. The construction process involves 2D and 3D meshing algorithms that were coupled into in-house code. The volume of air surrounding the airfoil and the mesh quality are also defined parametrically. Some standard NACA profiles were used to test the algorithm, their control points being obtained first. The Navier-Stokes equations were solved for turbulent, steady-state flow of compressible fluids using the k-epsilon model and the SIMPLE algorithm. In order to provide data for the optimization process, a utility to extract drag and lift data from the CFD simulation was added. After a simulation is run, the drag and lift data are passed to the optimization process. A gradient-based method using steepest descent was implemented to define the magnitude and direction of the displacement of each control point. The control points and the other parameters defined as design variables are modified iteratively in order to reach an optimum. Preliminary results on conceptual examples show a decrease in drag and a change in geometry consistent with aerodynamic principles.
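As a hedged sketch of the gradient-based loop described above (the CFD evaluation is replaced here by a dummy surrogate; a real run would rebuild the B-spline, remesh the domain and solve the k-epsilon/SIMPLE model to obtain the drag coefficient):

    import numpy as np

    def evaluate_drag(control_points):
        """Placeholder for the CFD pipeline described in the abstract.
        Here it is only a dummy surrogate so the optimisation loop is runnable."""
        return float(np.sum(control_points[:, 1] ** 2))

    def drag_gradient(cp, h=1e-4):
        """Finite-difference gradient of Cd with respect to each control point coordinate."""
        grad = np.zeros_like(cp)
        base = evaluate_drag(cp)
        for idx in np.ndindex(cp.shape):
            pert = cp.copy()
            pert[idx] += h
            grad[idx] = (evaluate_drag(pert) - base) / h
        return grad

    # Steepest descent on the design variables (control point coordinates).
    cp = np.array([[0.0, 0.00], [0.3, 0.08], [0.7, 0.05], [1.0, 0.00]])
    step = 0.05
    for it in range(20):
        cp -= step * drag_gradient(cp)

    print("final control points:\n", cp)

In practice the leading- and trailing-edge points and a lift constraint would be held fixed, which this sketch omits.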

Relevance:

100.00%

Publisher:

Abstract:

This work presents a case study of the Simulated Annealing and Genetic Algorithm heuristics for a highly relevant problem found in port systems, the Berth Allocation Problem. The problem concerns the scheduling and allocation of ships to mooring areas along a quay. The model used in this research is the one presented by Mauri (2008) [28], which treats the problem as a Vehicle Routing Problem with Multiple Depots and no Time Windows. An appropriate simulation test environment was developed, in which the analysis scenario was built from real situations found in the ship scheduling of a container terminal. The computational tests carried out show the performance of the heuristics with respect to the objective function and the computational time, in order to assess which technique yields the better results.
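The objective function and neighbourhood moves are those of Mauri (2008) and are not reproduced here; a generic simulated annealing skeleton of the kind compared in the study might look as follows (cost and neighbor are placeholders):

    import math
    import random

    def simulated_annealing(initial, cost, neighbor,
                            t0=100.0, t_min=1e-3, alpha=0.95, iters_per_t=50):
        """Generic simulated annealing skeleton; in the berth-allocation setting,
        `cost` would encode the scheduling objective and `neighbor` a move such as
        reassigning a ship to another berth or position."""
        current, best = initial, initial
        t = t0
        while t > t_min:
            for _ in range(iters_per_t):
                cand = neighbor(current)
                delta = cost(cand) - cost(current)
                if delta < 0 or random.random() < math.exp(-delta / t):
                    current = cand
                    if cost(current) < cost(best):
                        best = current
            t *= alpha
        return best

    # Toy usage: minimise a quadratic just to show the loop runs.
    print(simulated_annealing(10.0,
                              cost=lambda x: (x - 3) ** 2,
                              neighbor=lambda x: x + random.uniform(-1, 1)))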

Relevance:

100.00%

Publisher:

Abstract:

This work proposes a heuristic methodology for the Arc Covering Problem applied to sanitation services, specifically to water-meter reading. In this context, an application was developed that allows routes to be planned so that the costs in distance travelled are reduced and remain approximately the same across all routes. The methodology was divided into stages. In the first stage, to understand the problem better, a field survey was carried out to organise the data provided by a sanitation company. The second stage consisted of determining points at the middle of each block segment and at street intersections, which were registered on a georeferenced map. This map covered the region chosen for the study, and the registered points were used to determine and subsequently assign the related medians, which constitutes the third stage. For this, the Teitz and Bart algorithm modified by CADP and an adapted version of the Gillett and Johnson assignment algorithm were used, respectively. At the end of this stage, subsectors were formed within a specific sector. In the final stage, the routes of each subsector were obtained with a genetic algorithm. The application developed allowed flexibility of actions, giving the user autonomy in choosing the calculation options. Its graphical interface made it possible to produce maps and to visualise the routes in each subsector. In addition, the application minimised the routes and distributed the subsectors with similar distances. The efficiency of the heuristics underlying the application was confirmed by the tests performed, which yielded results of good quality.
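The study uses the Teitz and Bart algorithm as modified by CADP; that modification is not reproduced here, so the sketch below shows only the classic vertex-substitution idea for locating the p medians among the registered points:

    import numpy as np

    def teitz_bart(dist, p, seed=0):
        """Classic Teitz & Bart vertex-substitution heuristic for the p-median
        problem: repeatedly swap a median with a non-median vertex whenever that
        lowers the total assignment distance."""
        rng = np.random.default_rng(seed)
        n = dist.shape[0]
        medians = set(rng.choice(n, size=p, replace=False).tolist())

        def total_cost(meds):
            return dist[:, sorted(meds)].min(axis=1).sum()

        improved = True
        while improved:
            improved = False
            for v in range(n):
                if v in medians:
                    continue
                for m in list(medians):
                    cand = (medians - {m}) | {v}
                    if total_cost(cand) < total_cost(medians):
                        medians = cand
                        improved = True
        return sorted(medians), total_cost(medians)

    # Toy symmetric distance matrix over 6 registered points.
    pts = np.random.default_rng(1).random((6, 2))
    D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    print(teitz_bart(D, p=2))

The assignment of the remaining points to the chosen medians (Gillett and Johnson step) and the genetic-algorithm routing are separate stages not shown here.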

Relevance:

100.00%

Publisher:

Abstract:

This article verifies the existence of spatial autocorrelation in the proportion of people living in extreme poverty in the municipalities of the department of Antioquia, Colombia. Moran's I test is used for this purpose, and an algorithm is proposed to rule out the possibility that the spatial dependence is spurious. The results demonstrate the need to take spatial econometrics into account when determining the optimal allocation of social spending aimed at effectively addressing extreme poverty in the municipalities of the department of Antioquia.
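Moran's I itself has a standard form; the sketch below computes it for a toy contiguity structure (the article's algorithm for ruling out spurious dependence is not reproduced; in practice significance is usually assessed with a permutation test):

    import numpy as np

    def morans_i(x, w):
        """Global Moran's I for values x and a spatial weight matrix w
        (e.g. contiguity between municipalities)."""
        x = np.asarray(x, dtype=float)
        z = x - x.mean()
        n = x.size
        s0 = w.sum()
        return (n / s0) * (z @ w @ z) / (z @ z)

    # Toy example: 4 areas on a line, rook contiguity.
    w = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    poverty_share = [0.60, 0.55, 0.20, 0.15]
    print(morans_i(poverty_share, w))   # positive value => spatially clustered shares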

Relevance:

100.00%

Publisher:

Abstract:

The limited information and the little scientific work carried out on medium-power biomass combustion systems make the objectives proposed in this work important. The scientific work presented here provides the basis for developing appropriate operating conditions for biomass combustion systems, increasing the efficiency and the economic profitability of this type of energy system. The main objective of this work was the application of monitoring methodologies that allow the combustion system to be characterised and its efficiency improved, the implementation of the chosen methods, and the monitoring of the operating conditions of an industrial biomass-fired boiler, in particular: (i) monitoring of the biomass feed rates to the boiler, delivered by screw-feeder systems; (ii) analysis and monitoring of temperatures and pressure; (iii) monitoring of the combustion air flow rate; (iv) monitoring of the exhaust gas flow rate; (v) monitoring of the thermal power; (vi) monitoring of the composition of the flue gas. The physical characterisation of biomass samples, the testing of different types of biomass under different operating conditions, and the collection of combustion ash samples for physico-chemical characterisation were further monitoring and characterisation methods applied. A control test of the feeding system in manual operation mode was also developed and compared with the control of the feeding system in automatic operation mode. The study leads to the conclusion that a furnace control and operation algorithm should be developed and implemented to allow more adequate dosing of the fuel and combustion-air flow rates, in order to improve the performance of the combustion system.
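The furnace control algorithm recommended above is left open by the study; purely as a hypothetical illustration of air dosing, a minimal PI trim toward a flue-gas O2 setpoint could look like this (gains and setpoint are assumptions, not values from the work):

    def pi_air_trim(o2_setpoint, o2_measured, integral, kp=0.8, ki=0.05, dt=1.0):
        """One step of a hypothetical PI controller trimming the combustion-air flow
        (as a correction factor on the fan command) toward a flue-gas O2 target."""
        error = o2_setpoint - o2_measured
        integral += error * dt
        correction = kp * error + ki * integral
        return correction, integral

    # Toy loop: measured O2 drifts, the controller accumulates a correction.
    integral = 0.0
    for o2 in [7.5, 7.0, 6.2, 5.8]:            # % O2 in the exhaust gas
        corr, integral = pi_air_trim(6.0, o2, integral)
        print(f"O2={o2:.1f}%  air-flow correction={corr:+.2f}")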

Relevance:

100.00%

Publisher:

Abstract:

This work presents mixed convection heat transfer inside a lid-driven cavity heated from below and filled with a heterogeneous or a homogeneous porous medium. In the heterogeneous approach, the solid domain is represented by equally spaced, heat-conducting blocks; the fluid phase surrounds the blocks and is bounded by the cavity walls. The homogeneous, or pore-continuum, approach is characterized by the cavity porosity and permeability. Generalized mass, momentum and energy conservation equations are obtained in dimensionless form to represent both the continuum and the pore-continuum models. The numerical solution is obtained via the finite volume method. The QUICK interpolation scheme is used for the numerical treatment of the advection terms, and the SIMPLE algorithm is applied for pressure-velocity coupling. To remain in the laminar regime, the flow parameters are kept in the ranges 10² ≤ Re ≤ 10³ and 10³ ≤ Ra ≤ 10⁶ for both the heterogeneous and homogeneous approaches. In the configurations tested for the continuum model, 9, 16, 36, and 64 blocks are considered for each combination of Re and Ra, with the microscopic porosity held constant at φ = 0.64. For the pore-continuum model, the Darcy number (Da) is set according to the number of blocks in the heterogeneous cavity and to φ. Numerical results of the comparative study between the microscopic and macroscopic approaches are presented. As a result, average Nusselt number equations as functions of Ra and Re are obtained for the continuum and pore-continuum models.
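The governing dimensionless groups quoted above can be evaluated directly; a small sketch with illustrative, air-like property values (not taken from the paper):

    def dimensionless_groups(rho, mu, k, cp, beta, g, U, L, dT, K):
        """Standard groups used above: Re (lid velocity), Ra (bottom heating),
        Pr, Ri = Ra/(Pr*Re**2) for mixed convection, and Da = K/L**2 for the
        pore-continuum model."""
        nu = mu / rho                 # kinematic viscosity
        alpha = k / (rho * cp)        # thermal diffusivity
        Re = U * L / nu
        Pr = nu / alpha
        Ra = g * beta * dT * L**3 / (nu * alpha)
        Ri = Ra / (Pr * Re**2)        # equivalent to Gr/Re^2
        Da = K / L**2
        return dict(Re=Re, Pr=Pr, Ra=Ra, Ri=Ri, Da=Da)

    # Air-like toy properties in a 0.1 m cavity (illustrative values only).
    print(dimensionless_groups(rho=1.2, mu=1.8e-5, k=0.026, cp=1005.0,
                               beta=3.4e-3, g=9.81, U=0.15, L=0.1, dT=10.0, K=1e-7))

With these illustrative values Re is about 10³ and Ra about 10⁶, i.e. at the upper end of the ranges studied.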

Relevance:

100.00%

Publisher:

Abstract:

Is phraseology the third articulation of language? Fresh insights into a theoretical conundrum
Jean-Pierre Colson, University of Louvain (Louvain-la-Neuve, Belgium)

Although the notion of phraseology is now used across a wide range of linguistic disciplines, its definition and the classification of phraseological units remain a subject of intense debate. It is generally agreed that phraseology implies polylexicality, but this term is problematic as well, because it brings us back to one of the most controversial topics in modern linguistics: the definition of a word. On the other hand, another widely accepted principle of language is the double articulation or duality of patterning (Martinet 1960): the first articulation consists of morphemes and the second of phonemes. The very definition of morphemes, however, also poses several problems, and the situation becomes even more confused if we wish to take phraseology into account.

In this contribution, I will take the view that a corpus-based and computational approach to phraseology may shed some new light on this theoretical conundrum. A better understanding of the basic units of meaning is necessary for more efficient language learning and translation, especially in the case of machine translation. Previous research (Colson 2011, 2012, 2013, 2014; Corpas Pastor 2000, 2007, 2008, 2013, 2015; Corpas Pastor & Leiva Rojo 2011; Leiva Rojo 2013) has shown the paramount importance of phraseology for translation. A tentative step towards a coherent explanation of the role of phraseology in language has been proposed by Mejri (2006): it is postulated that a third articulation of language intervenes at the level of words, including simple morphemes, sequences of free and bound morphemes, but also phraseological units.

I will present results from experiments with statistical associations of morphemes across several languages, and point out that (mainly) isolating languages such as Chinese are interesting for a better understanding of the interplay between morphemes and phraseological units. Named entities, in particular, are an extreme example of intertwining cultural, statistical and linguistic elements. Other examples show that the many borrowings and influences that characterize European languages tend to give a somewhat blurred vision of the interplay between morphology and phraseology. From a statistical point of view, the cpr-score (Colson 2016) provides a methodology for adapting the automatic extraction of phraseological units to the morphological structure of each language. The results obtained can therefore be used for testing hypotheses about the interaction between morphology, phraseology and culture. Experiments with the cpr-score on the extraction of Chinese phraseological units show that the results depend on how the basic units of meaning are defined: a morpheme-based approach yields good results, which corroborates the claim by Beck and Mel'čuk (2011) that the association of morphemes into words may be similar to the association of words into phraseological units. A cross-linguistic experiment carried out for English, French, Spanish and Chinese also reveals that the results are quite compatible with Mejri's (2006) hypothesis of a third articulation of language. Such findings, if confirmed, also corroborate the notion of statistical semantics in language.
To illustrate this point, I will present the PhraseoRobot (Colson 2016), a computational tool for extracting phraseological associations around key words from the media, such as Brexit. The results confirm a previous study on the term globalization (Colson 2016): a significant part of sociolinguistic associations prevailing in the media is related to phraseology in the broad sense, and can therefore be partly extracted by means of statistical scores.

References
Beck, D. & I. Mel'čuk (2011). Morphological phrasemes and Totonacan verbal morphology. Linguistics 49/1: 175-228.
Colson, J.-P. (2011). La traduction spécialisée basée sur les corpus : une expérience dans le domaine informatique. In: Sfar, I. & S. Mejri (eds.), La traduction de textes spécialisés : retour sur des lieux communs. Synergies Tunisie n° 2. Gerflint, Agence universitaire de la Francophonie, p. 115-123.
Colson, J.-P. (2012). Traduire le figement en langue de spécialité : une expérience de phraséologie informatique. In: Mogorrón Huerta, P. & S. Mejri (dirs.), Lenguas de especialidad, traducción, fijación / Langues spécialisées, figement et traduction. Encuentros Mediterráneos / Rencontres Méditerranéennes, N°4. Universidad de Alicante, p. 159-171.
Colson, J.-P. (2013). Pratique traduisante et idiomaticité : l'importance des structures semi-figées. In: Mogorrón Huerta, P., Gallego Hernández, D., Masseau, P. & Tolosa Igualada, M. (eds.), Fraseología, Opacidad y Traducción. Studien zur romanischen Sprachwissenschaft und interkulturellen Kommunikation (Herausgegeben von Gerd Wotjak). Frankfurt am Main: Peter Lang, p. 207-218.
Colson, J.-P. (2014). La phraséologie et les corpus dans les recherches traductologiques. Communication lors du colloque international Europhras 2014, Association Européenne de Phraséologie. Université de Paris Sorbonne, 10-12 septembre 2014.
Colson, J.-P. (2016). Set phrases around globalization: an experiment in corpus-based computational phraseology. In: F. Alonso Almeida, I. Ortega Barrera, E. Quintana Toledo & M. Sánchez Cuervo (eds.), Input a Word, Analyse the World: Selected Approaches to Corpus Linguistics. Newcastle upon Tyne: Cambridge Scholars Publishing, p. 141-152.
Corpas Pastor, G. (2000). Acerca de la (in)traducibilidad de la fraseología. In: G. Corpas Pastor (ed.), Las lenguas de Europa: Estudios de fraseología, fraseografía y traducción. Granada: Comares, p. 483-522.
Corpas Pastor, G. (2007). Europäismen - von Natur aus phraseologische Äquivalente? Von blauem Blut und sangre azul. In: M. Emsel & J. Cuartero Otal (eds.), Brücken: Übersetzen und interkulturelle Kommunikationen. Festschrift für Gerd Wotjak zum 65. Geburtstag. Fráncfort: Peter Lang, p. 65-77.
Corpas Pastor, G. (2008). Investigar con corpus en traducción: los retos de un nuevo paradigma [Studien zur romanische Sprachwissenschaft und interkulturellen Kommunikation, 49]. Fráncfort: Peter Lang.
Corpas Pastor, G. (2013). Detección, descripción y contraste de las unidades fraseológicas mediante tecnologías lingüísticas. In: Olza, I. & R. Elvira Manero (eds.), Fraseopragmática. Berlin: Frank & Timme, p. 335-373.
Leiva Rojo, J. (2013). La traducción de unidades fraseológicas (alemán-español/español-alemán) como parámetro para la evaluación y revisión de traducciones. In: Mellado Blanco, C., Buján, P., Iglesias, N.M., Losada, M.C. & A. Mansilla (eds.), La fraseología del alemán y el español: lexicografía y traducción. ELS, Etudes Linguistiques / Linguistische Studien, Band 11. München: Peniope, p. 31-42.
Leiva Rojo, J. & G. Corpas Pastor (2011). Placing Italian idioms in a foreign milieu: a case study. In: Pamies Bertrán, A., Luque Nadal, L., Bretana, J. & M. Pazos (eds.), Multilingual Phraseography. Second Language Learning and Translation Applications. Baltmannsweiler: Schneider Verlag (Colección: Phraseologie und Parömiologie, 28), p. 289-298.
Martinet, A. (1966). Eléments de linguistique générale. Paris: Colin.
Mejri, S. (2006). Polylexicalité, monolexicalité et double articulation. Cahiers de Lexicologie 2: 209-221.
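The cpr-score (Colson 2016) is not reproduced here; as a generic, hedged illustration of the kind of corpus-based association scoring the abstract relies on, the sketch below ranks adjacent unit pairs by pointwise mutual information in a toy corpus:

    import math
    from collections import Counter

    def pmi_scores(tokens):
        """Pointwise mutual information for adjacent unit pairs (words, or morphemes
        for isolating languages such as Chinese); high-PMI pairs are candidate
        phraseological units. This is a generic score, not the cpr-score."""
        unigrams = Counter(tokens)
        bigrams = Counter(zip(tokens, tokens[1:]))
        n_uni, n_bi = sum(unigrams.values()), sum(bigrams.values())
        scores = {}
        for (a, b), c in bigrams.items():
            p_ab = c / n_bi
            p_a, p_b = unigrams[a] / n_uni, unigrams[b] / n_uni
            scores[(a, b)] = math.log2(p_ab / (p_a * p_b))
        return scores

    corpus = "hard Brexit deal soft Brexit deal hard Brexit talks".split()
    for pair, score in sorted(pmi_scores(corpus).items(), key=lambda kv: -kv[1]):
        print(pair, round(score, 2))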

Relevance:

100.00%

Publisher:

Abstract:

The reduction of administered doses, or even the complete cessation of a chemotherapy treatment, is often the consequence of a decrease in the number of neutrophils, the most abundant white blood cells in the blood. This reduction in the absolute neutrophil count, also known as myelosuppression, is precipitated by the non-specific lethal effects of anti-cancer drugs, which, alongside their therapeutic effect, are also toxic to healthy cells. To mitigate this myelosuppressive impact, patients are given recombinant human granulocyte colony-stimulating factor (rhG-CSF), an exogenous form of G-CSF, the hormone responsible for stimulating neutrophil production and release into the bloodstream. Although the benefits of prophylactic G-CSF treatment during chemotherapy are well established, the administration protocols remain poorly defined and are frequently determined ad libitum by clinicians. With the aim of improving therapeutic dosing and rationalising the use of rhG-CSF during chemotherapy, we developed a physiological model of granulopoiesis that incorporates current state-of-the-art knowledge of neutrophil production from hematopoietic stem cells in the bone marrow. Into this physiological model we integrated pharmacokinetic/pharmacodynamic (PK/PD) models of two drugs: PM00104 (Zalypsis®), an anti-cancer drug, and rhG-CSF (filgrastim). Relying on the fundamental principles underlying the physiology, we estimated the parameters exhaustively without resorting to data fitting, which allowed us to predict clinical data from 172 patients undergoing the CHOP14 protocol (6 chemotherapy cycles with a 14-day period, rhG-CSF being administered from day 4 to day 13 post-chemotherapy). Using this physio-PK/PD model, we showed that the number of rhG-CSF administrations could be reduced from ten (current practice) to four or even three, provided that the start of prophylactic rhG-CSF treatment is delayed. With the clinical applicability of our modelling approach in mind, we investigated the impact of the PK variability present in a patient population on the model predictions by integrating population PK (Pop-PK) models of the two drugs. Considering cohorts of 500 in silico patients for each of five plausible variability scenarios, and using three clinical markers, namely the time to neutrophil nadir, the nadir value, and the area under the concentration-effect curve, we established that there was no significant difference in the model predictions between the typical patient and the population. This demonstrates the robustness of the approach we developed, which is akin to a quantitative systems pharmacology (QSP) approach. Motivated by the use of rhG-CSF in the treatment of other diseases, such as periodic pathologies like cyclic neutropenia, we then extended the study of the model to the context of dynamic diseases.

Having shown that the cytokine-feedback paradigm is not valid for the exogenous administration of G-CSF mimetics, we developed a novel physiological PK/PD model comprising the free and bound concentrations of G-CSF. This new PK model also required changes to the PD model, since it allowed us to track the concentrations of G-CSF bound to neutrophils. We demonstrated that the underlying assumption of equilibrium between the free and bound concentrations, according to the law of mass action, no longer holds for G-CSF at endogenous concentrations and would in fact lead to an overestimation of the renal clearance of the drug. In doing so, we managed to reproduce clinical data obtained under various conditions (exogenous G-CSF administration, PM00104 administration, CHOP14). We also provided a consistent explanation of the mechanisms responsible for the physiological response to the two drugs. Finally, to highlight the integrative approach to pharmacology adopted in this thesis, we demonstrated its invaluable contribution to the elucidation and reconstruction of complex living systems, drawing a parallel with other scientific disciplines such as palaeontology and forensics, where a similar approach has largely proven its worth. We also discussed the potential of quantitative systems pharmacology applied to drug development and translational medicine, using the physio-PK/PD model we developed.
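The thesis' physiological free/bound G-CSF model is not reproduced here; purely as a hedged illustration of the PK building blocks mentioned above, a generic one-compartment model with subcutaneous absorption can be integrated as follows (all parameter values are assumptions, not the thesis' estimates):

    import numpy as np
    from scipy.integrate import solve_ivp

    def one_compartment_sc(t, y, ka, ke, V):
        """Generic subcutaneous one-compartment PK model (not the physiological
        free/bound G-CSF model of the thesis): depot -> plasma -> elimination."""
        depot, conc = y
        return [-ka * depot,
                ka * depot / V - ke * conc]

    # Hypothetical parameters for a single filgrastim-like dose, illustration only.
    ka, ke, V, dose = 0.5, 0.3, 2.5, 300.0   # 1/h, 1/h, L, micrograms
    sol = solve_ivp(one_compartment_sc, (0.0, 48.0), [dose, 0.0],
                    args=(ka, ke, V), dense_output=True)

    t = np.linspace(0, 48, 7)
    print(np.round(sol.sol(t)[1], 2))   # plasma concentration over 48 h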

Relevance:

100.00%

Publisher:

Abstract:

Active response systems aim to execute a response against an intrusion automatically. However, executing a response automatically is not a trivial task, since the cost of executing a response could be greater than the effect caused by the intrusion itself. The system must also have a broad set of response actions and an algorithm that selects the optimal response. This article proposes a response toolkit to be integrated into an ontology-based IRS (Intrusion Response System) to allow automatic execution of the best response when an intrusion is detected. A set of host-based and network-based responses that can be executed by the IRS is presented; their execution is carried out by plugin-based agents distributed across the network. Finally, the proposed system is verified, taking a defacement attack as a use case, with satisfactory results.
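The ontology-based selection logic is not reproduced here; as a hedged sketch of the cost-versus-damage trade-off the article highlights, a response could be chosen from a hypothetical toolkit like this:

    def select_response(responses, intrusion_damage):
        """Pick the response whose expected benefit (damage avoided, weighted by its
        estimated effectiveness) most exceeds its own cost; refuse to act if every
        response would cost more than the intrusion itself, as the article warns."""
        best, best_net = None, 0.0
        for r in responses:
            net = r["effectiveness"] * intrusion_damage - r["cost"]
            if net > best_net:
                best, best_net = r, net
        return best

    # Hypothetical response toolkit entries for a web defacement alert.
    toolkit = [
        {"name": "block_ip_firewall",   "cost": 10.0, "effectiveness": 0.6},
        {"name": "restore_from_backup", "cost": 40.0, "effectiveness": 0.9},
        {"name": "shutdown_web_server", "cost": 90.0, "effectiveness": 1.0},
    ]
    print(select_response(toolkit, intrusion_damage=70.0))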

Relevance:

100.00%

Publisher:

Abstract:

It has been reported that cuttings of Camellia sinensis have a low capacity to form roots, motivating basic studies to optimise propagation by cuttings. Accordingly, the present work aimed to quantify the rooting potential of different genotypes and the effect of the position of the cutting along the branch, of an incision at the base, of the substrate, of the container size, and of indolebutyric acid (IBA) on the rooting of semi-hardwood cuttings of this species. To this end, branches of the genotypes IAC 259, F15 and Comum were collected in Pariquera-Açu, São Paulo state, in the winter of 2010. The cuttings, each containing one bud and one leaf, were then prepared and kept in a nursery with 70% shading. Cuttings from the basal and middle positions of the branches are the most suitable for propagation, owing to lower mortality and higher rooting. Wounding the base of the cutting does not affect cutting mortality or rooting, but it does induce callus formation. There were also no differences in mortality or rooting when the cuttings were kept in containers of 50, 90 or 120 cm³. Compared with vermiculite, sand and carbonised rice husk, soil was the best substrate for the cuttings, and in the presence of wounding, together with treatment of the cuttings with 10 g L⁻¹ IBA, it produced the highest rooting percentage. Even under these conditions, however, the mean mortality of the cuttings was 42%. The rooting potential of the Comum genotype was higher than that of IAC 259 and F15.

Relevance:

100.00%

Publisher:

Abstract:

We present a brief historical note on the evolution of line transect sampling and its theoretical developments. We describe line transect sampling theory as proposed by Buckland (1992), and present the most relevant issues about modelling the detection function. We give a description of the CDM principle (Rissanen, 1978) and its application to histogram density estimation (Kontkanen and Myllymäki, 2006), with a practical example using a mixture of densities. We then apply it to estimate the probability of detection and the animal population density in the context of line transect sampling. Two classical examples from the distance-sampling literature are analysed and the results compared. In order to evaluate the proposed methodology, we carry out a simulation study based on the wooden stakes example, using the half-normal, hazard-rate, exponential and uniform-with-cosine detection functions. The results were obtained with the program DISTANCE (Thomas et al., in press) and an algorithm written in the C language, kindly provided by Professor Petri Kontkanen (Department of Computer Science, University of Helsinki). Programs were also developed to estimate confidence intervals using the bootstrap technique (Efron, 1978). Finally, the results are presented and discussed, with suggestions for future developments.
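For the half-normal detection function used in the simulations, the standard distance-sampling estimator has a closed form; the sketch below implements it together with a simplified bootstrap interval (the CDM/histogram estimator studied in the thesis is not reproduced):

    import numpy as np

    def halfnormal_density_estimate(distances, L):
        """Line-transect density estimate with a half-normal detection function
        g(x) = exp(-x^2 / (2 sigma^2)): the MLE of sigma^2 is mean(x^2), the
        effective strip half-width is mu = sigma * sqrt(pi/2), and
        D_hat = n / (2 * L * mu). Distances and L must share the same length unit."""
        x = np.asarray(distances, dtype=float)
        n = x.size
        sigma = np.sqrt(np.mean(x ** 2))
        mu = sigma * np.sqrt(np.pi / 2.0)        # effective strip half-width
        return n / (2.0 * L * mu)

    def bootstrap_ci(distances, L, B=2000, alpha=0.05, seed=0):
        """Nonparametric bootstrap (Efron) interval; simplified in that it resamples
        individual detections rather than whole transects."""
        rng = np.random.default_rng(seed)
        x = np.asarray(distances, dtype=float)
        est = [halfnormal_density_estimate(rng.choice(x, size=x.size, replace=True), L)
               for _ in range(B)]
        return np.quantile(est, [alpha / 2, 1 - alpha / 2])

    # Toy perpendicular distances (m) along 1 km of transect, illustration only.
    dists = [1.2, 0.4, 3.5, 2.1, 0.0, 1.8, 4.2, 0.9]
    print(halfnormal_density_estimate(dists, L=1000.0))
    print(bootstrap_ci(dists, L=1000.0))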