995 results for LENGTHS
Abstract:
To create a clinically relevant gold nanoparticle (AuNP) treatment, the surface must be functionalized with multiple ligands such as drugs, antifouling agents and targeting moieties. However, attaching several ligands of differing chemistries and lengths, while ensuring they all retain their biological functionality, remains a challenge. This review compares the two most widely employed methods of surface co-functionalization, namely mixed monolayers and hetero-bifunctional linkers. While there are numerous in vitro studies successfully utilizing both surface arrangements, there is little consensus regarding their relative merits. Animal and preclinical studies have demonstrated the effectiveness of mixed-monolayer functionalization, and while some promising in vitro results have been reported for PEG-linker-capped AuNPs, any potential benefits of the approach are not yet fully understood.
Abstract:
Objectives: There is evidence from neuroscience, cognitive psychology and educational research that delivering a stimulus in a spaced format (over time) rather than a massed format (all at once) leads to more effective learning. This project aimed to pilot spaced learning materials using various spacing lengths for GCSE science, by exploring the feasibility of introducing spaced learning into regular classrooms and by evaluating teacher fidelity to the materials. The spaced learning methods will then be compared with traditional science revision techniques and a programme manual will be produced. Design: A feasibility study. Methods: A pilot study (4 schools) was carried out to examine feasibility and teacher fidelity to the materials, using pupil workshops and teacher interviews. A subsequent random-assignment experimental study (12 schools) will involve pre- and post-testing of students on a science attainment measure and a post-test implementation questionnaire. Results: The literature review found that longer spacing intervals between repetitions of material (>24 hours) may be more effective for long-term memory formation than shorter intervals. A logic model was developed to inform the design of the various programme variants for the pilot and experimental study. This paper will report qualitative data from the initial pilot study. Conclusions: The paper uses this research project as an example to explain the importance of conducting pilot work and small-scale experimental studies to explore the feasibility and inform the design of educational interventions, rather than moving prematurely to RCT-type studies.
Abstract:
The research presented investigates the optimal set of operational codes (opcodes) that creates a robust indicator of malicious software (malware), and also determines the program execution duration required for accurate classification of benign and malicious software. The features extracted from the dataset are opcode density histograms, captured during program execution. The classifier used is a support vector machine, configured to select the features that produce the optimal classification of malware over different program run lengths. The findings demonstrate that malware can be detected using dynamic analysis with relatively few opcodes.
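As a rough illustration of the classification setup described above, the following sketch trains a support vector machine on opcode-density histograms with a simple feature-selection step; the data, feature counts and SVM parameters are placeholders, not those used in the study.

```python
# Hypothetical sketch: SVM classification of opcode-density histograms.
# X is an (n_samples, n_opcodes) matrix of opcode densities and y holds
# labels (0 = benign, 1 = malware); the values below are placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((200, 150))          # placeholder opcode-density histograms
y = rng.integers(0, 2, size=200)    # placeholder benign/malware labels

# Select the most discriminative opcodes, then classify with an SVM.
model = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=20),   # keep the 20 most informative opcodes
    SVC(kernel="rbf", C=1.0),
)
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```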
Abstract:
The pine wood nematode Bursaphelenchus xylophilus reproduces bisexually: a haploid sperm fertilizes a haploid oocyte, and the two pronuclei rearrange, move together, fuse, and begin diploid development. Early embryonic events in the B. xylophilus embryo are similar to those of Caenorhabditis elegans, although the anterior-posterior axis appears to be determined oppositely to that observed for C. elegans. That is, in the B. xylophilus embryo, the male pronucleus emerges at the future anterior end, whereas the female pronucleus appears laterally. To understand the evolution of nematode developmental systems, we cloned the full-length Bx-tbb-1 (beta-tubulin) gene from B. xylophilus cDNA and attempted to apply reverse-genetics analysis to B. xylophilus. Double-stranded RNAs (dsRNA) of several lengths for the Bx-tbb-1 gene were synthesized by in vitro transcription, and both B. xylophilus and C. elegans were soaked in dsRNA for RNAi. Both nematodes were able to ingest the dsRNA, and we could detect the abnormal phenotypes caused by Bx-tbb-1 dsRNA in C. elegans, but not in B. xylophilus. We suspect that systemic RNAi might be suppressed in B. xylophilus and are attempting to establish other methods for functionally analysing B. xylophilus genes.
Abstract:
This work investigates new methodologies for next-generation optical access networks (NG-OAN). The work is divided into four research topics: network design, numerical models for nonlinear fibre effects, the impact of nonlinear fibre effects, and network optimization. The optical access network investigated in this work is designed to meet user-density and coverage requirements, i.e., to support many users (~1000) with high dedicated connection speeds (~1 Gb/s) occupying a narrow band of the spectrum (~25 nm) and optical fibre lengths of up to 100 km. The scenarios are based on ultra-dense wavelength-division multiplexing passive optical networks (UDWDM-PON) using coherent transmitters/receivers at the network terminals. The network is evaluated for several transmission rates using advanced modulation formats, per-user bandwidth requirements, and band sharing with traditional passive optical network (PON) technologies. Numerical models based on Volterra series transfer functions (VSTF) are demonstrated both for the analysis of nonlinear fibre effects and for the evaluation of overall network performance. The ranges of power and transmission distance over which the Volterra series yields results similar to the experimentally validated Split-Step Fourier (SSF) reference model for overall network performance are presented. In addition, an algorithm that skips spectral components of zero intensity is proposed to speed up the computation of the series. The VSTF model is extended to identify the most relevant nonlinear fibre effects in the investigated scenario: self-phase modulation (SPM), cross-phase modulation (XPM) and four-wave mixing (FWM). Numerical simulations are presented to identify the isolated impact of each nonlinear fibre effect (SPM, XPM and FWM) on the performance of the coherent-detection UDWDM-PON carrying channels with M-ary phase-shift keying (PSK) or M-ary quadrature amplitude modulation (QAM). The numerical analysis is extended to different standard single-mode fibre (SSMF) lengths, per-channel powers and per-channel transmission rates. Analytical expressions are then extrapolated to describe the evolution of SPM, XPM and FWM as a function of power and transmission distance in NG-OAN scenarios. The network performance is optimized by partially minimizing the FWM interference (via unequal channel spacing), which in this case is the most relevant nonlinear fibre effect. Directions for further performance improvements are presented for scenarios in which XPM is relevant, i.e., networks carrying QAM modulation formats; the solution in that case is based on digital signal processing techniques.
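As a point of reference for the Split-Step Fourier model mentioned above, here is a minimal, illustrative SSF propagation of a single channel over standard single-mode fibre; the fibre parameters and the input signal are placeholders and do not reproduce the UDWDM-PON scenarios studied in the thesis.

```python
# Illustrative symmetric split-step Fourier solver for the scalar NLSE:
#   dA/dz = -(alpha/2) A - i (beta2/2) d^2A/dt^2 + i gamma |A|^2 A
import numpy as np

def split_step_fourier(A, dt, length, dz, alpha, beta2, gamma):
    """Propagate the field envelope A (complex array) over 'length' metres."""
    n = A.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=dt)            # angular frequencies
    lin = np.exp((-alpha / 2 + 1j * beta2 / 2 * w**2) * dz / 2)
    for _ in range(int(round(length / dz))):
        A = np.fft.ifft(np.fft.fft(A) * lin)            # half linear step
        A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)  # full nonlinear step
        A = np.fft.ifft(np.fft.fft(A) * lin)            # half linear step
    return A

# Placeholder parameters roughly typical of SSMF at 1550 nm.
dt = 1e-12                                   # 1 ps sampling interval
t = (np.arange(1024) - 512) * dt
A0 = np.sqrt(1e-3) * np.exp(-t**2 / (2 * (20e-12)**2))   # 1 mW Gaussian pulse
A = split_step_fourier(A0, dt, length=10e3, dz=100.0,
                       alpha=0.2 / 4.343 / 1e3,           # 0.2 dB/km -> 1/m
                       beta2=-21e-27, gamma=1.3e-3)
print("output peak power (W):", np.max(np.abs(A)**2))
```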
Abstract:
The common cuttlefish, Sepia officinalis, is a necto-benthic cephalopod that can live in coastal ecosystems under strong anthropogenic pressure and is therefore vulnerable to exposure to various types of contaminants. The cuttlefish is a species of great importance to the local economy of Aveiro, considering the overall catch data for this species in the Ria de Aveiro. However, studies on this species in the Ria de Aveiro are scarce, so the present study aims to fill this information gap. The cuttlefish enters the Ria de Aveiro in spring and summer to reproduce, returning to deeper waters in winter. In terms of abundance, the eastern and central regions of the lagoon, closer to the sea, showed the highest values, while the northern and southern regions of the main channel had the lowest. This may be related to abiotic factors such as depth, salinity and temperature. At the southernmost point of the Ria de Aveiro (Areão), which had the lowest salinity and depth values, no cuttlefish were caught. Growth is allometric, with females being heavier than males at mantle lengths greater than 82.4 mm. Males reach sexual maturity earlier than females. In the Ria de Aveiro, a single parental generation was found. The cuttlefish is an opportunistic predator, consuming a wide variety of prey from different taxa. The diet was similar across sampling locations, but significant differences were observed between seasons. S. officinalis was captured at 10 sites in the Ria de Aveiro with different anthropogenic sources of contamination. The levels of the metals analysed were similar at all sampling sites, with the exception of a restricted area, Laranjo, which showed higher values. The cuttlefish has the ability to accumulate metals in its body. The levels of Fe, Zn, Cu, Cd, Pb and Hg found in the digestive gland and mantle reflect a differential accumulation of metals in the tissues, related to the type and function of the tissue analysed and to the type of metal (essential or non-essential). Metal concentrations in the digestive gland are higher than in the mantle, with the exception of mercury; this may be due to the high affinity of the mantle for the incorporation of methylmercury (MeHg), the most abundant form of mercury. The accumulation of metals can vary over the lifetime, depending on the metal: concentrations of Zn, Cd and Hg increase throughout life, while Pb decreases and essential metals such as Fe and Cu remain constant. The data collected suggest that the cuttlefish (Sepia officinalis) can be used as a bioindicator of environmental contamination for some metals.
Abstract:
Master's dissertation, Water and Coastal Management, Universidade do Algarve, 2007
Abstract:
Master's dissertation, Marine Biology, specialization in Ecology and Conservation, Faculdade de Ciências do Mar e do Ambiente, Universidade do Algarve, 2007
Abstract:
For the subscriber access network too, "ring" and "branching" networks have recently been proposed alongside today's usual "star network", and the question is being discussed whether lower costs can be expected from them. Using concepts from graph theory, structures such as star, ring and tree are defined here. A notional local network is then divided into square areas of side length l containing M subscribers each. For various structures of the conductor network at the subscriber level, the minimum lengths of the conductors and of the cable ducts are calculated. Among other things, it turns out that, independently of the structure of the conductor network, the cable ducts, a dominant cost component at the subscriber level, are of practically equal length, namely l/M^0.5 per subscriber.
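The quoted per-subscriber duct length can be motivated by a short back-of-the-envelope argument; the following is an illustrative heuristic under the simplifying assumption of uniformly distributed subscribers, not a reproduction of the paper's derivation.

```latex
% Sketch: with M subscribers spread uniformly over a square of side l,
% the typical spacing between neighbouring subscribers is l/\sqrt{M},
% so the minimum total duct length and the duct length per subscriber are
\[
  L_{\text{duct,total}} \approx M \cdot \frac{l}{\sqrt{M}} = l\sqrt{M},
  \qquad
  \frac{L_{\text{duct,total}}}{M} \approx \frac{l}{\sqrt{M}} = \frac{l}{M^{0.5}},
\]
% independently of whether the conductor network is a star, ring or tree.
```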
Abstract:
Data compression is the computing technique that aims to reduce the size of information in order to minimize the storage space required and to speed up data transmission over bandwidth-limited networks. Several compression techniques, such as LZ77 and its variants, suffer from a problem that we call the redundancy caused by the multiplicity of encodings. Multiplicity of encodings (ME) means that the source data can be encoded in different ways. In its simplest case, ME occurs when a compression technique has the possibility, during the encoding process, of coding a symbol in different ways. The bit-recycling compression technique was introduced by D. Dubé and V. Beaudoin to minimize the redundancy caused by ME. Variants of bit recycling have been applied to LZ77, and the experimental results obtained lead to better compression (a reduction of about 9% in the size of files compressed by Gzip, by exploiting ME). Dubé and Beaudoin pointed out that their technique might not perfectly minimize the redundancy caused by ME, because it is built on Huffman coding, which cannot handle codewords of fractional lengths; that is, it only generates codewords of integral lengths. Moreover, Huffman-based bit recycling (HuBR) imposes additional constraints to avoid certain situations that degrade its performance. Unlike Huffman codes, arithmetic coding (AC) can handle codewords of fractional lengths. Furthermore, over recent decades arithmetic codes have attracted many researchers because they are more powerful and more flexible than Huffman codes. Consequently, this work aims to adapt bit recycling to arithmetic codes in order to improve coding efficiency and flexibility. We addressed this problem through our four (published) contributions, which are presented in this thesis and can be summarized as follows. First, we propose a new technique for adapting Huffman-based bit recycling (HuBR) to arithmetic coding, named arithmetic-coding-based bit recycling (ACBR); it describes the framework and the principles of adapting HuBR to ACBR. We also present the theoretical analysis needed to estimate the redundancy that can be reduced by HuBR and ACBR for applications that suffer from ME. This analysis shows that ACBR achieves perfect recycling in all cases, whereas HuBR achieves such performance only in very specific cases. Second, the problem with the aforementioned ACBR technique is that it requires arbitrary-precision arithmetic, which demands unlimited (or infinite) resources. To make it usable, we propose a new finite-precision version, which is efficient and applicable on computers with conventional fixed-size registers and can easily be interfaced with applications that suffer from ME. Third, we propose the use of HuBR and ACBR as a means of reducing redundancy in order to obtain a variable-to-fixed binary code. We have shown theoretically and experimentally that both techniques provide a significant improvement (less redundancy). In this respect, ACBR outperforms HuBR and covers a broader class of binary sources that can benefit from a plurally parsable dictionary. In addition, we show that ACBR is more flexible than HuBR in practice. Fourth, we use HuBR to reduce the redundancy of the balanced codes generated by Knuth's algorithm. To compare the performance of HuBR and ACBR, the corresponding theoretical results for both are presented. The results show that the two techniques achieve almost the same redundancy reduction on the balanced codes generated by Knuth's algorithm.
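To illustrate the limitation discussed above, namely that Huffman codewords are restricted to whole-bit lengths while the ideal lengths -log2(p) are fractional (which an arithmetic coder can approach), here is a small self-contained sketch; the probability distribution is arbitrary and the helper function is hypothetical, not code from the thesis.

```python
# Compare Huffman codeword lengths (integers) with the ideal fractional
# lengths -log2(p) that an arithmetic coder can effectively approach.
import heapq
import math

def huffman_code_lengths(probs):
    """Return a codeword length for each symbol of the distribution."""
    # Each heap entry: (probability, tie-breaker, list of symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:                 # every merge adds one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.3, 0.15, 0.05]
huff = huffman_code_lengths(probs)
ideal = [-math.log2(p) for p in probs]
avg_huff = sum(p * l for p, l in zip(probs, huff))
entropy = sum(p * l for p, l in zip(probs, ideal))
print("Huffman lengths:", huff)                      # integral lengths only
print("ideal lengths  :", [round(l, 3) for l in ideal])
print("bits/symbol: Huffman =", avg_huff, " entropy =", round(entropy, 3))
```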
Abstract:
Previous work on Betula spp. (birch) in the UK and at five sites in Europe has shown that pollen seasons for this taxon have tended to become earlier by about 5-10 days per decade in most regions investigated over the last 30 years. This pattern has been linked to the trend towards warmer winters and springs in recent years. However, little work has been done to investigate the changes in the pollen seasons of the early flowering trees. Several of these, such as Alnus spp. and Corylus spp., have allergens that cross-react with those of Betula spp. and so have a priming effect on allergic people. This paper investigates pollen seasons for Alnus spp. and Corylus spp. for the years 1996-2005 at Worcester, in the West Midlands, United Kingdom. Pollen data for daily average counts were collected using a Burkard volumetric trap sited on the exposed roof of a three-storey building. The climate is western maritime. Meteorological data for daily temperatures (maximum and minimum) and rainfall were obtained from the local monitoring sites. The local area up to approximately 10 km surrounding the site is mostly level terrain with some undulating hills and valleys. The local vegetation is mixed farmland and deciduous woodland. The pollen seasons for the two taxa investigated typically run from late December or early January to late March. Various ways of defining the start and end of the pollen seasons were considered for these taxa, but the most useful was the 1% method, whereby the season is deemed to have started when 1% of the total catch is achieved and to have ended when 99% is reached. The cumulative catches (in grains/m3) for Alnus spp. varied from 698 (2001) to 3,467 (2004). For Corylus spp., they varied from 65 (2001) to 4,933 (2004). The start dates for Alnus spp. showed a 39-day difference across the 10 years (earliest 2000, day 21; latest 1996, day 60). The end dates differed by 26 days and the length of season differed by 15 days. The last 4 years in the set had notably higher cumulative counts than the first 2, but there was no trend towards earlier starts. For Corylus spp., start days also differed by 39 days (earliest 1999, day 5; latest 1996, day 44). The end dates differed by 35 days and the length of season by 26 days. Cumulative counts and lengths of season showed a distinct pattern of alternate high (long) and low (short) years. There is some evidence of a synchronous pattern for Alnus spp. These patterns show some significant correlations with temperature and rainfall through the autumn, winter and early spring, and some relationships with growth degree days and chill units, but the series is too short to discern trends. The analysis has provided insight into the variation in the seasons for these early flowering trees and will form a basis for future work on building predictive models for these taxa.
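A minimal sketch of the 1%/99% season definition described above, using synthetic daily counts rather than the Worcester data; the function name and data are illustrative only.

```python
# Season start = first day the cumulative catch reaches 1% of the annual
# total; season end = first day it reaches 99% (the "1% method" above).
import numpy as np

def pollen_season(daily_counts, start_frac=0.01, end_frac=0.99):
    counts = np.asarray(daily_counts, dtype=float)
    cumulative = np.cumsum(counts)
    total = cumulative[-1]
    start_day = int(np.argmax(cumulative >= start_frac * total))  # 0-based day index
    end_day = int(np.argmax(cumulative >= end_frac * total))
    return start_day, end_day, end_day - start_day + 1

# Illustrative daily average counts (grains/m3), not real Worcester data.
rng = np.random.default_rng(1)
daily = np.concatenate([np.zeros(20), rng.poisson(30, 60), np.zeros(20)])
start, end, length = pollen_season(daily)
print(f"season: day {start} to day {end} ({length} days)")
```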
Abstract:
Master's thesis. Biology (Molecular Biology and Genetics). Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
Quantifying the topography of rivers and their associated bedforms has been a fundamental concern of fluvial geomorphology for decades. Such data, acquired at high temporal and spatial resolutions, are increasingly in demand for process-oriented investigations of flow hydraulics, sediment dynamics and in-stream habitat. In these riverine environments, the most challenging region for topographic measurement is the wetted, submerged channel. Generally, dry bed topography and submerged bathymetry are measured using different methods and technology. This adds to the costs, logistical challenges and data processing requirements of comprehensive river surveys. However, some technologies are capable of measuring the submerged topography. Through-water photogrammetry and bathymetric LiDAR are capable of reasonably accurate measurements of channel beds in clear water. Whilst the cost of bathymetric LiDAR remains high and its resolution relatively coarse, recent developments in photogrammetry using Structure from Motion (SfM) algorithms promise a fundamental shift in the accessibility of topographic data for a wide range of settings. Here we present results demonstrating the potential of so-called SfM photogrammetry for quantifying both exposed and submerged fluvial topography at the mesohabitat scale. We show that imagery acquired from a rotary-wing Unmanned Aerial System (UAS) can be processed to produce digital elevation models (DEMs) with hyperspatial resolutions (c. 0.02 m) for two different river systems over channel lengths of 50-100 m. Errors in submerged areas range from 0.016 m to 0.089 m, which can be reduced to between 0.008 m and 0.053 m with the application of a simple refraction correction. This work therefore demonstrates the potential of UAS platforms and SfM photogrammetry as a single technique for surveying fluvial topography at the mesoscale (defined as lengths of channel from c. 10 m to a few hundred metres).
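For illustration, a minimal sketch of a simple refraction correction of the kind referred to above, in which apparent through-water depths are scaled by the refractive index of clear water (about 1.34); this is an assumed, simplified form of the correction, not necessarily the exact procedure used in the study.

```python
# Simple refraction correction for through-water photogrammetry:
# apparent depths are underestimated by roughly the refractive index of
# water, so scale them to recover corrected bed elevations.
REFRACTIVE_INDEX_WATER = 1.34  # clear water, approximate

def correct_bed_elevation(water_surface_elev, apparent_bed_elev,
                          n=REFRACTIVE_INDEX_WATER):
    """Return a refraction-corrected bed elevation for a submerged point."""
    apparent_depth = water_surface_elev - apparent_bed_elev
    true_depth = apparent_depth * n           # small-angle approximation
    return water_surface_elev - true_depth

# Example: a bed point that appears 0.40 m below a 10.00 m water surface.
print(correct_bed_elevation(10.00, 9.60))     # corrected elevation ~9.464 m
```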
Abstract:
The UMTS turbo encoder is composed of the parallel concatenation of two Recursive Systematic Convolutional (RSC) encoders which start and end in a known state. This trellis termination directly affects the performance of turbo codes. This paper presents a performance analysis of multi-point trellis termination of turbo codes, in which the RSC encoders are terminated at more than one point of the current frame while keeping the interleaver length the same. For long interleaver lengths, this approach allows a data frame to be divided into sub-frames which can be treated as independent blocks. A novel decoding architecture using multi-point trellis termination and collision-free interleavers is presented. Collision-free interleavers are used to solve the memory collision problems encountered in parallel decoding of turbo codes. The proposed parallel decoding architecture reduces the decoding delay caused by the iterative nature and forward-backward metric computations of turbo decoding algorithms. Our simulations verified that this turbo encoding and decoding scheme shows Bit Error Rate (BER) performance very close to that of UMTS turbo coding while providing almost 50% time saving for 2-point termination and 80% time saving for 5-point termination.
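For context, the sketch below implements a single UMTS-style RSC constituent encoder (generator polynomials g0 = 1 + D^2 + D^3 and g1 = 1 + D + D^3, as specified in 3GPP TS 25.212) with conventional single-point trellis termination; the multi-point termination scheme analysed in the paper would apply the same flushing step at several positions within the frame, which this simplified sketch does not reproduce.

```python
# Rate-1/2 RSC encoder used as a UMTS turbo-code constituent
# (feedback g0 = 1 + D^2 + D^3, feedforward g1 = 1 + D + D^3),
# followed by the usual 3-bit trellis termination (flush to state 000).
def rsc_encode(bits):
    s = [0, 0, 0]                        # shift register, s[0] = newest
    systematic, parity = [], []
    for u in bits:
        d = u ^ s[1] ^ s[2]              # feedback per g0
        parity.append(d ^ s[0] ^ s[2])   # parity output per g1
        systematic.append(u)
        s = [d, s[0], s[1]]
    # Termination: feed u = s[1] ^ s[2] so that d = 0 for three steps,
    # driving the encoder back to the all-zero state.
    tail = []
    for _ in range(3):
        u = s[1] ^ s[2]
        d = 0
        tail.append((u, d ^ s[0] ^ s[2]))
        s = [d, s[0], s[1]]
    return systematic, parity, tail, s   # s is [0, 0, 0] after termination

sys_bits, par_bits, tail_bits, final_state = rsc_encode([1, 0, 1, 1, 0])
print("parity:", par_bits, "tail (u, p):", tail_bits, "final state:", final_state)
```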
Abstract:
An experimental and Finite Element study was performed on the bending behaviour of wood beams of the Pinus pinaster species repaired with adhesively-bonded carbon–epoxy patches after sustaining damage by cross-grain failure. This damage is characterized by crack growth at a small angle to the beam's longitudinal axis, due to misalignment between the wood fibres and the beam axis. Cross-grain failure can occur on a large scale in a wood member when trees that have grown spirally or with a pronounced taper are cut for lumber. Three patch lengths were tested. The simulations include the possibility of cohesive fracture of the adhesive layer, failure within the wood beam in two propagation planes, and patch interlaminar failure, through the use of cohesive zone modelling. The respective cohesive properties were estimated either by an inverse method or from the literature. The comparison with the tests allowed the validation of the proposed methodology, opening a good prospect for reducing the cost of extensive experimentation in the design stages of these repairs.
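As background to the cohesive zone modelling mentioned above, the sketch below evaluates a bilinear (triangular) traction-separation law of the kind commonly used for adhesive layers; the stiffness, strength and fracture-energy values are placeholders, not the cohesive properties estimated in the study.

```python
# Bilinear (triangular) traction-separation law often used in cohesive
# zone models: linear elastic up to the cohesive strength, then linear
# softening until the fracture energy Gc has been dissipated.
def bilinear_traction(delta, K=1e6, t0=10.0, Gc=0.5):
    """Traction (MPa) at separation delta (mm); placeholder parameters:
    K in MPa/mm, t0 (strength) in MPa, Gc (toughness) in N/mm."""
    delta0 = t0 / K          # separation at damage onset
    deltaf = 2.0 * Gc / t0   # separation at complete failure
    if delta <= delta0:
        return K * delta                                   # elastic branch
    if delta < deltaf:
        return t0 * (deltaf - delta) / (deltaf - delta0)   # softening branch
    return 0.0                                             # fully cracked

for d in (0.0, 1e-5, 5e-2, 0.12):
    print(d, round(bilinear_traction(d), 3))
```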