939 results for location analysis


Relevance: 30.00%

Abstract:

Multiple sound sources often contain harmonics that overlap and may be degraded by environmental noise. The auditory system is capable of teasing apart these sources into distinct mental objects, or streams. Such an "auditory scene analysis" enables the brain to solve the cocktail party problem. A neural network model of auditory scene analysis, called the AIRSTREAM model, is presented to propose how the brain accomplishes this feat. The model clarifies how the frequency components that correspond to a given acoustic source may be coherently grouped together into distinct streams based on pitch and spatial cues. The model also clarifies how multiple streams may be distinguished and separated by the brain. Streams are formed as spectral-pitch resonances that emerge through feedback interactions between frequency-specific spectral representations of a sound source and its pitch. First, the model transforms a sound into a spatial pattern of frequency-specific activation across a spectral stream layer. The sound has multiple parallel representations at this layer. A sound's spectral representation activates a bottom-up filter that is sensitive to harmonics of the sound's pitch. The filter activates a pitch category which, in turn, activates a top-down expectation that allows one voice or instrument to be tracked through a noisy multiple-source environment. Spectral components are suppressed if they do not match harmonics of the top-down expectation that is read out by the selected pitch, thereby allowing another stream to capture these components, as in the "old-plus-new heuristic" of Bregman. Multiple simultaneously occurring spectral-pitch resonances can thereby emerge. These resonance and matching mechanisms are specialized versions of Adaptive Resonance Theory, or ART, which clarifies how pitch representations can self-organize during learning of harmonic bottom-up filters and top-down expectations. The model also clarifies how spatial location cues can help to disambiguate two sources with similar spectral cues. Data are simulated from psychophysical grouping experiments, such as how a tone sweeping upwards in frequency creates a bounce percept by grouping with a downward-sweeping tone due to proximity in frequency, even if noise replaces the tones at their intersection point. Illusory auditory percepts are also simulated, such as the auditory continuity illusion of a tone continuing through a noise burst even if the tone is not present during the noise, and the scale illusion of Deutsch whereby downward and upward scales presented alternately to the two ears are regrouped based on frequency proximity, leading to a bounce percept. Since related sorts of resonances have been used to quantitatively simulate psychophysical data about speech perception, the model strengthens the hypothesis that ART-like mechanisms are used at multiple levels of the auditory system. Proposals for developing the model to explain more complex streaming data are also provided.
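The core grouping step described above, in which spectral components that match harmonics of a selected pitch are captured by one stream and the remainder are released to other streams, can be illustrated with a minimal sketch. This is not the AIRSTREAM model itself (there is no resonance dynamics or learning here); the component frequencies, pitch value, tolerance and harmonic count are illustrative assumptions.

import numpy as np

def group_by_pitch(component_freqs, component_amps, pitch_hz, tol=0.03, n_harmonics=10):
    """Split spectral components into a stream matching the selected pitch's
    harmonics and a residual stream, echoing the 'old-plus-new' idea: components
    captured by the top-down harmonic expectation are removed, freeing the rest
    to be grouped by another stream."""
    harmonics = pitch_hz * np.arange(1, n_harmonics + 1)
    matched = np.zeros(len(component_freqs), dtype=bool)
    for i, f in enumerate(component_freqs):
        # A component matches if it lies within a relative tolerance of any harmonic.
        matched[i] = np.any(np.abs(f - harmonics) / harmonics < tol)
    stream = [(f, a) for f, a, m in zip(component_freqs, component_amps, matched) if m]
    residual = [(f, a) for f, a, m in zip(component_freqs, component_amps, matched) if not m]
    return stream, residual

# Example: two overlapping harmonic sources (200 Hz and 310 Hz fundamentals).
freqs = np.array([200, 310, 400, 600, 620, 800, 930, 1000])
amps = np.ones_like(freqs, dtype=float)
stream_200, rest = group_by_pitch(freqs, amps, pitch_hz=200)
print("captured by 200 Hz stream:", [f for f, _ in stream_200])
print("left for other streams:   ", [f for f, _ in rest])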

Relevance: 30.00%

Abstract:

Countries across the world are being challenged to decarbonise their energy systems in response to diminishing fossil fuel reserves, rising GHG emissions and the dangerous threat of climate change. There has been a renewed interest in energy efficiency, renewable energy and low carbon energy as policy-makers seek to identify and put in place the most robust sustainable energy system that can address this challenge. This thesis seeks to improve the evidence base underpinning energy policy decisions in Ireland with a particular focus on natural gas, which in 2011 grew to have a 30% share of Ireland's TPER. Natural gas is used in all sectors of the Irish economy and is seen by many as a transition fuel to a low-carbon energy system; it is also a uniquely excellent source of data for many aspects of energy consumption. A detailed decomposition analysis of natural gas consumption in the residential sector quantifies many of the structural drivers of change, with activity (R2 = 0.97) and intensity (R2 = 0.69) being the best explainers of changing gas demand. The 2002 residential building regulations are subject to an ex-post evaluation, which, using empirical data, finds a 44 ± 9.5% shortfall in expected energy savings as well as a 13 ± 1.6% level of non-compliance. A detailed energy demand model of the entire Irish energy system is presented together with scenario analysis of a large number of energy efficiency policies, which show an aggregate reduction in TFC of 8.9% compared to a reference scenario. The role for natural gas as a transition fuel over a long time horizon (2005-2050) is analysed using an energy systems model and a decomposition analysis, which shows the contribution of fuel switching to natural gas to be worth 12 percentage points of an overall 80% reduction in CO2 emissions. Finally, an analysis of the potential for CCS in Ireland finds gas CCS to be more robust than coal CCS for changes in fuel prices, capital costs and emissions reduction, and the cost-optimal location for a gas CCS plant in Ireland is found to be in Cork with sequestration in the depleted gas field of Kinsale.
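Decomposing a change in gas demand into activity and intensity effects of the kind quantified above is commonly done with an additive LMDI (logarithmic mean Divisia index) decomposition. The sketch below shows that arithmetic for the simple two-factor identity demand = activity x intensity; the identity, the factor definitions and the example numbers are illustrative assumptions, not the thesis's data or necessarily its chosen decomposition method.

import math

def log_mean(a, b):
    """Logarithmic mean used by the additive LMDI-I decomposition."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

def lmdi_two_factor(activity0, intensity0, activity1, intensity1):
    """Decompose the change in demand D = activity * intensity into an
    activity effect and an intensity effect (additive LMDI-I); the two
    effects sum exactly to the total change."""
    d0, d1 = activity0 * intensity0, activity1 * intensity1
    w = log_mean(d1, d0)
    activity_effect = w * math.log(activity1 / activity0)
    intensity_effect = w * math.log(intensity1 / intensity0)
    return d1 - d0, activity_effect, intensity_effect

# Illustrative numbers: more gas-connected dwellings, each using slightly less gas (MWh).
total, act, inten = lmdi_two_factor(activity0=500_000, intensity0=14.0,
                                    activity1=650_000, intensity1=12.5)
print(f"total change {total:,.0f} MWh = activity {act:,.0f} + intensity {inten:,.0f}")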

Relevance: 30.00%

Abstract:

Bacteriophages, viruses infecting bacteria, are uniformly present in any location where there are high numbers of bacteria, both in the external environment and in the human body. Knowledge of their diversity is limited by the difficulty of culturing the host species and by the lack of a universal marker gene present in all viruses. Metagenomics is a powerful tool that can be used to analyse viral communities in their natural environments. The aim of this study was to investigate diverse populations of uncultured viruses from a clinical sample (sputum from a patient with cystic fibrosis, CF) and an environmental sample (sludge from a dairy food wastewater treatment plant), both containing rich bacterial populations, using genetic and metagenomic analyses. Metagenomic sequencing of viruses obtained from these samples revealed that the majority of the metagenomic reads (97-99%) were novel when compared to the NCBI protein database using BLAST. A large proportion of assembled contigs were assignable as novel phages or uncharacterised prophages, the next largest assignable group being single-stranded eukaryotic virus genomes. Sputum from the cystic fibrosis patient contained DNA typical of phages of bacteria that are traditionally involved in CF lung infections and of other bacteria that are part of the normal oral flora. The only eukaryotic virus detected in the CF sputum was Torque Teno virus (TTV). A substantial number of assigned sequences from dairy wastewater could be affiliated with phages of bacteria that are typically found in soil and aquatic environments, including wastewater. Eukaryotic viral sequences were dominated by plant pathogens from the Geminiviridae and Nanoviridae families, and animal pathogens from the Circoviridae family. Antibiotic resistance genes were detected in both metagenomes, suggesting phages could be a source of transmissible antimicrobial resistance. Overall, the diversity of viruses in the CF sputum was low, with 89 distinct viral genotypes predicted, and higher (409 genotypes) in the wastewater. Function-based screening of a metagenomic library constructed from DNA extracted from dairy food wastewater viruses revealed candidate promoter sequences that have the ability to drive expression of GFP in a promoter-trap vector in Escherichia coli. The majority of the cloned DNA sequences selected by the assay were related to ssDNA circular eukaryotic viruses and phages, which formed a minority of the metagenome assembly, and many lacked any significant homology to known database sequences. The natural diversity of bacteriophages in wastewater samples was also examined by PCR amplification of the major capsid protein sequences, conserved within T4-type bacteriophages from the Myoviridae family. Phylogenetic analysis of capsid sequences revealed that dairy wastewater contained mainly diverse and uncharacterized phages, while some showed a high level of similarity with phages from geographically distant environments.
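The "novel read" fraction reported above comes from classifying reads by whether they have any significant protein-database hit. A minimal sketch of that classification step is below; it assumes standard BLAST tabular output (-outfmt 6) and an illustrative e-value cut-off, and the file name and read-id list are placeholders rather than anything from the study.

import csv

def novel_fraction(blast_tsv_path, read_ids, evalue_cutoff=1e-3):
    """Classify reads as 'known' if they have at least one BLAST hit below the
    e-value cut-off, otherwise as 'novel'. Assumes BLAST tabular output
    (-outfmt 6), where column 1 is the query id and column 11 is the e-value."""
    known = set()
    with open(blast_tsv_path) as handle:
        for row in csv.reader(handle, delimiter="\t"):
            query, evalue = row[0], float(row[10])
            if evalue <= evalue_cutoff:
                known.add(query)
    novel = [r for r in read_ids if r not in known]
    return len(novel) / len(read_ids)

# Usage (paths and ids are placeholders):
# frac = novel_fraction("reads_vs_nr.blastx.tsv", all_read_ids)
# print(f"{frac:.1%} of reads had no significant protein hit")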

Relevance: 30.00%

Abstract:

BACKGROUND: Injuries represent a significant and growing public health concern in the developing world, yet their impact on patients and the emergency health-care system in the countries of East Africa has received limited attention. This study evaluates the magnitude and scope of injury-related disorders in the population presenting to a referral hospital emergency department in northern Tanzania. METHODS: A retrospective chart review of patients presenting to the emergency department at Kilimanjaro Christian Medical Centre was performed. A standardized data collection form was used for data abstraction from the emergency department logbook and the complete medical record for all injured patients. Patient demographics, mechanism of injury, location, type and outcomes were recorded. RESULTS: Ten thousand six hundred twenty-two patients presented to the emergency department for evaluation and treatment during the 7-month study period. One thousand two hundred twenty-four patients (11.5%) had injuries. Males and individuals aged 15 to 44 years were most frequently injured, representing 73.4% and 57.8%, respectively. Road traffic injuries were the most common mechanism of injury, representing 43.9% of injuries. Head injuries (36.5%) and extremity injuries (59.5%) were the most common locations of injury. The majority of injured patients (59.3%) were admitted from the emergency department to the hospital wards, and 5.6% required admission to an intensive care unit. Death occurred in 5.4% of injured patients. CONCLUSIONS: These data give a detailed and more robust picture of the patient demographics, mechanisms of injury, types of injury and patient outcomes than previously available from similar resource-limited settings.

Relevance: 30.00%

Abstract:

The outcomes of both (i) radiation therapy and (ii) preclinical small-animal radiobiology studies depend on the delivery of a known quantity of radiation to a specific and intentional location. Adverse effects can result from these procedures if the dose to the target is too high or too low, and can also result from an incorrect spatial distribution in which nearby normal healthy tissue is undesirably damaged by poor radiation delivery techniques. Thus, in mice and humans alike, the spatial dose distributions from radiation sources should be well characterized in terms of the absolute dose quantity, and with pin-point accuracy. When dealing with the steep spatial dose gradients that accompany either (i) high dose rate (HDR) brachytherapy or (ii) the small organs and tissue inhomogeneities of mice, obtaining accurate and highly precise dose results can be very challenging, since commercially available radiation detection tools, such as ion chambers, are often too large for in-vivo use.

In this dissertation, two tools are developed and applied for both clinical and preclinical radiation measurement. The first tool is a novel radiation detector for acquiring physical measurements, fabricated from an inorganic nano-crystalline scintillator fixed on an optical fiber terminus. This dosimeter allows the measurement of point doses at sub-millimeter resolution and can be placed in-vivo in humans and small animals. Real-time data are displayed to the user to provide instant quality assurance and dose-rate information. The second tool utilizes an open source Monte Carlo particle transport code, and was applied in small animal dosimetry studies to calculate organ doses and recommend new techniques of dose prescription in mice, as well as to characterize dose to the murine bone marrow compartment with micron-scale resolution.

Hardware design changes were implemented to reduce the overall fiber diameter to <0.9 mm for the nano-crystalline scintillator based fiber optic detector (NanoFOD) system. The lower limit of device sensitivity was found to be approximately 0.05 cGy/s. Herein, this detector was demonstrated to perform quality assurance of clinical 192Ir HDR brachytherapy procedures, providing dose measurements comparable to thermo-luminescent dosimeters and accuracy within 20% of the treatment planning software (TPS) for the 27 treatments conducted, with an inter-quartile range of the ratio to the TPS dose value of 0.94 to 1.02 (a width of 0.08). After removing contaminant signals (Cerenkov and diode background), calibration of the detector enabled accurate dose measurements for vaginal applicator brachytherapy procedures. For 192Ir use, the energy response changed by a factor of 2.25 over SDD values of 3 to 9 cm; however, a cap made of 0.2 mm thick silver reduced the energy dependence to a factor of 1.25 over the same SDD range, at the cost of reducing overall sensitivity by 33%.

For preclinical measurements, dose accuracy of the NanoFOD was within 1.3% of MOSFET-measured dose values in a cylindrical mouse phantom at 225 kV for x-ray irradiation at angles of 0, 90, 180, and 270°. The NanoFOD exhibited small changes in angular sensitivity, with a coefficient of variation (COV) of 3.6% at 120 kV and 1% at 225 kV. When the NanoFOD was placed alongside a MOSFET in the liver of a sacrificed mouse and treatment was delivered at 225 kV with a 0.3 mm Cu filter, the dose difference was only 1.09% with use of the 4x4 cm collimator, and -0.03% with no collimation. Additionally, the NanoFOD utilized a scintillator of 11 µm thickness to measure small x-ray fields for microbeam radiation therapy (MRT) applications, and achieved 2.7% dose accuracy of the microbeam peak in comparison to radiochromic film. Modest differences in the measured full-width at half maximum lateral dimension of the MRT beam were observed between the NanoFOD (420 µm) and radiochromic film (320 µm), but these differences are explained mostly as an artifact of the geometry used and volumetric effects in the scintillator material. Characterization of the energy dependence of the yttrium-oxide based scintillator material was performed in the range of 40-320 kV (2 mm Al filtration), and the maximum device sensitivity was achieved at 100 kV. Tissue maximum ratio measurements were carried out on a small animal x-ray irradiator system at 320 kV and demonstrated an average difference of 0.9% compared to a MOSFET dosimeter over the range of 2.5 to 33 cm depth in tissue-equivalent plastic blocks. Irradiation of the NanoFOD fiber and scintillator material on a 137Cs gamma irradiator to 1600 Gy did not produce any measurable change in light output, suggesting that the NanoFOD system may be re-used without the need for replacement or recalibration over its lifetime.

For small animal irradiator systems, researchers can deliver a given dose to a target organ by controlling exposure time. Currently, researchers calculate this exposure time by dividing the total dose that they wish to deliver by a single provided dose-rate value. This method is independent of the target organ. Studies conducted here used Monte Carlo particle transport codes to justify a new method of dose prescription in mice that considers organ-specific doses. Monte Carlo simulations were performed in the Geant4 Application for Tomographic Emission (GATE) toolkit using a MOBY mouse whole-body phantom. The non-homogeneous phantom comprised 256x256x800 voxels of size 0.145x0.145x0.145 mm3. Differences of up to 20-30% in dose to soft-tissue target organs were demonstrated, and methods for alleviating these errors during whole-body irradiation of mice were suggested, utilizing organ-specific and x-ray-tube-filter-specific dose rates for all irradiations.
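The prescription change argued for above amounts to replacing a single machine dose rate with an organ- and filter-specific dose rate when converting a prescribed dose into an exposure time. A minimal sketch of that arithmetic is shown below; the dose-rate table values are illustrative assumptions, not the dissertation's calibration data, and in practice they would come from Monte Carlo simulation or measurement for the specific tube setting and geometry.

# Illustrative organ- and filter-specific dose rates (cGy/s) for a hypothetical
# small-animal irradiator setting.
DOSE_RATES_CGY_PER_S = {
    ("liver", "0.3mm_Cu"): 1.10,
    ("lung", "0.3mm_Cu"): 1.25,
    ("bone_marrow", "0.3mm_Cu"): 1.35,
}

def exposure_time_s(prescribed_dose_cgy, organ, x_ray_filter):
    """Exposure time = prescribed dose / organ-specific dose rate, instead of
    dividing by a single whole-body dose-rate value."""
    rate = DOSE_RATES_CGY_PER_S[(organ, x_ray_filter)]
    return prescribed_dose_cgy / rate

print(f"{exposure_time_s(600, 'liver', '0.3mm_Cu'):.0f} s to deliver 6 Gy to the liver")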

Monte Carlo analysis was used on 1 µm resolution CT images of a mouse femur and a mouse vertebra to calculate the dose gradients within the bone marrow (BM) compartment of mice for different radiation beam qualities relevant to x-ray and isotope-type irradiators. Results indicated that soft x-ray beams (160 kV at 0.62 mm Cu HVL and 320 kV at 1 mm Cu HVL) lead to substantially higher dose to BM within close proximity to mineral bone (within about 60 µm) as compared to hard x-ray beams (320 kV at 4 mm Cu HVL) and isotope-based gamma irradiators (137Cs). The average dose increases to the BM in the vertebra for these four radiation beam qualities were found to be 31%, 17%, 8%, and 1%, respectively. Both in-vitro and in-vivo experimental studies confirmed these simulation results, demonstrating that the 320 kV, 1 mm Cu HVL beam caused statistically significantly increased killing of BM cells at 6 Gy dose levels in comparison to both the 320 kV, 4 mm Cu HVL and the 662 keV, 137Cs beams.

Relevance: 30.00%

Abstract:

The adoption of antisense gene silencing as a novel disinfectant for prokaryotic organisms is hindered by poor silencing efficiencies. Few studies have considered the effects of off-targets on silencing efficiencies, especially in prokaryotic organisms. In this computational study, a novel algorithm was developed that determined and sorted the number of off-targets as a function of alignment length in Escherichia coli K-12 MG1655 and Mycobacterium tuberculosis H37Rv. The mean number of off-targets per single target location was calculated to be 14.1 ± 13.3 and 36.1 ± 58.5 for the genomes of E. coli K-12 MG1655 and M. tuberculosis H37Rv, respectively. Furthermore, when the entire transcriptome was analyzed, it was found that there was no general gene location that could be targeted to minimize or maximize the number of off-targets. To determine the effects of off-targets on silencing efficiencies, previously published studies were used. Analyses with acpP, ino1, and marORAB revealed a statistically significant relationship between the number of short-alignment-length off-target hybrids and the efficacy of antisense gene silencing, suggesting that minimizing off-targets may be beneficial for antisense gene silencing in prokaryotic organisms.
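The counting idea described above can be illustrated with a minimal sketch: for a chosen target site, count how often each subsequence of a given alignment length recurs elsewhere in the genome. The exact-match, single-strand scoring and the toy genome string below are illustrative assumptions, not the study's published algorithm.

def off_target_counts(genome, target_start, target_len, alignment_lengths):
    """For each alignment length L, count exact occurrences elsewhere in the
    genome of every length-L window of the target site (potential off-target
    hybridisation sites for an antisense oligo covering that site)."""
    target = genome[target_start:target_start + target_len]
    counts = {}
    for L in alignment_lengths:
        n = 0
        for i in range(len(target) - L + 1):
            word = target[i:i + L]
            # Count all occurrences of this window, then subtract the on-target one.
            hits, pos = 0, genome.find(word)
            while pos != -1:
                hits += 1
                pos = genome.find(word, pos + 1)
            n += max(hits - 1, 0)
        counts[L] = n
    return counts

# Toy example; a real analysis would scan the full E. coli or M. tuberculosis genome.
genome = "ATGGCTAGCTAGGATCCGATCGATGGCTAGCTAGGTTACCGGTA"
print(off_target_counts(genome, target_start=0, target_len=12, alignment_lengths=[6, 8, 10]))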

Relevance: 30.00%

Abstract:

A nested heuristic approach that uses route length approximation is proposed to solve the location-routing problem. A new estimation formula for route length approximation is also developed. The heuristic is evaluated empirically against the sequential method and a recently developed nested method for location-routing problems. This testing is carried out on a set of problems with 400 customers and around 15 to 25 depots, with good results.
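The abstract's new estimation formula is not reproduced here. As a hedged illustration of how a route-length approximation is used inside a nested location-routing heuristic, the sketch below applies the classical continuous approximation L ≈ k·sqrt(n·A) (tour length grows with the square root of customer count times service area) to score depot assignments without solving any vehicle routes; the constant k, the assignment and the areas are illustrative assumptions.

import math

def approx_route_length(n_customers, area_km2, k=0.75):
    """Classical continuous approximation of tour length for n points scattered
    over an area A: L ≈ k * sqrt(n * A). Values of k quoted in the literature
    for uniform random points are roughly in the 0.71-0.77 range."""
    return k * math.sqrt(n_customers * area_km2)

def evaluate_depot_assignment(assignment, areas_km2):
    """Nested-heuristic style evaluation: given customers already assigned to
    depots, estimate total routing cost from the approximation alone."""
    return sum(approx_route_length(len(custs), areas_km2[d])
               for d, custs in assignment.items())

# Illustrative assignment of 400 customer ids to two candidate depots.
assignment = {"depot_A": list(range(180)), "depot_B": list(range(180, 400))}
areas = {"depot_A": 95.0, "depot_B": 140.0}
print(f"estimated routing cost: {evaluate_depot_assignment(assignment, areas):.1f} km")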

Relevance: 30.00%

Abstract:

The Aircraft Accident Statistics and Knowledge (AASK) database is a repository of survivor accounts from aviation accidents. Its main purpose is to store observational and anecdotal data from the actual interviews of the occupants involved in aircraft accidents. The database has wide application to aviation safety analysis, being a source of factual data regarding the evacuation process. It is also key to the development of aircraft evacuation models such as airEXODUS, where insight into how people actually behave during evacuation from survivable aircraft crashes is required. This paper describes recent developments of the database leading to AASK v3.0. These include significantly increasing the number of passenger accounts in the database, the introduction of cabin crew accounts, the introduction of fatality information, improved functionality through the seat plan viewer utility and improved ease of access to the database via the internet. In addition, the paper demonstrates the use of the database by investigating a number of important issues associated with aircraft evacuation. These include social bonding and evacuation, the relationship between the number of crew and evacuation efficiency, the frequency of exit/slide failures in accidents, and possible relationships between seating location and chances of survival. Finally, the passenger behavioural trends described in analysis undertaken with the earlier database are confirmed with the wider data set.

Relevance: 30.00%

Abstract:

This paper demonstrates a modeling and design approach that couples computational mechanics techniques with numerical optimisation and statistical models for virtual prototyping and testing in different application areas concerning the reliability of electronic packages. The integrated software modules provide a design engineer in the electronic manufacturing sector with fast design and process solutions by optimising key parameters and taking into account the complexity of certain operational conditions. The integrated modeling framework is obtained by coupling the multi-physics finite element framework, PHYSICA, with the numerical optimisation tool, VisualDOC, into a fully automated design tool for solutions of electronic packaging problems. Response Surface Modeling methodology and Design of Experiments statistical tools, plus numerical optimisation techniques, are demonstrated as part of the modeling framework. Two different problems are discussed and solved using the integrated numerical FEM-optimisation tool. First, an example of thermal management of an electronic package on a board is illustrated. The location of the device is optimised to ensure reduced junction temperature and stress in the die, subject to a certain cooling air profile and other heat-dissipating active components. In the second example, thermo-mechanical simulations of solder creep deformations are presented to predict flip-chip reliability and are subsequently used to optimise the lifetime of solder interconnects under thermal cycling.
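The Design of Experiments plus Response Surface Modelling workflow described above can be sketched in a few lines: sample device locations (the DoE), evaluate the expensive simulation at each sample, fit a cheap quadratic surrogate, and optimise on the surrogate. The stand-in "simulation" function and the parameter ranges below are illustrative assumptions; the paper couples PHYSICA and VisualDOC rather than anything shown here.

import numpy as np

def simulate_junction_temp(x, y):
    """Stand-in for a thermal FE run: junction temperature (degC) for a device
    placed at board coordinates (x, y) in mm."""
    return 70 + 0.02 * (x - 35) ** 2 + 0.03 * (y - 20) ** 2 + 0.01 * x * y

# Design of Experiments: a small full-factorial sample over the allowed region.
xs, ys = np.meshgrid(np.linspace(10, 60, 5), np.linspace(5, 40, 5))
X = np.column_stack([xs.ravel(), ys.ravel()])
T = np.array([simulate_junction_temp(x, y) for x, y in X])

# Response surface: fit a full quadratic T ~ b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*xy.
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coeffs, *_ = np.linalg.lstsq(A, T, rcond=None)

# Optimise on the cheap surrogate instead of the expensive simulation.
grid = np.array([(x, y) for x in np.linspace(10, 60, 200) for y in np.linspace(5, 40, 200)])
G = np.column_stack([np.ones(len(grid)), grid[:, 0], grid[:, 1],
                     grid[:, 0] ** 2, grid[:, 1] ** 2, grid[:, 0] * grid[:, 1]])
best = grid[np.argmin(G @ coeffs)]
print(f"predicted best device location: x = {best[0]:.1f} mm, y = {best[1]:.1f} mm")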

Relevance: 30.00%

Abstract:

In the current paper, the authors present an analysis of the structural characteristics of an intermediate rail vehicle and their effects on the crash performance of the vehicle. Theirs is a simulation-based analysis involving four stages. First, the crashworthiness of the vehicle is assessed by simulating an impact of the vehicle with a rigid wall. Second, the structural characteristics of the vehicle are analysed based on the structural behaviour during this impact, and the structure is then modified. Third, the modified vehicle is tested again in the same impact scenario with a rigid wall. Finally, the modified vehicle is subjected to a modelled head-on impact which mirrors the real-life impact interface between two intermediate vehicles in a train impact. The emphasis of the current study is on the structural characteristics of the intermediate vehicle and the differences compared to an impact of a leading vehicle. The study shows that, similar to a leading vehicle, bending, or jackknifing, is a main form of failure in this conventionally designed intermediate vehicle. It has also been found that the location of the door openings creates a major difference in the behaviour of an intermediate vehicle: it causes instability of the vehicle in the door area and leads to high stresses at the joint of the end beam with the solebar and shear stresses at the joint of the inner pillar with the cantrail. Apart from this, the shapes of the vehicle ends and impact interfaces are also different and have an effect on the crash performance of the vehicles. The simulation results allow the identification of the structural characteristics and show the effectiveness of relevant modifications. The conclusions have general relevance for the crashworthiness of rail vehicle design.

Relevance: 30.00%

Abstract:

The trend towards miniaturization of electronic products leads to the need for very small solder joints. There is therefore a higher reliability risk that too large a fraction of the solder joint will transform into Intermetallic Compounds (IMCs) at the solder interface. In this paper, a fracture mechanics study of the IMC layer for SnPb and Pb-free solder joints was carried out using finite element modelling. It is assumed that only one crack is present in the IMC layer. A Linear Elastic Fracture Mechanics (LEFM) approach is used for a parametric study of the Stress Intensity Factors (SIF, KI and KII) at the predefined crack in the IMC layer of a solder butt-joint tensile sample. Contrary to intuition, it is revealed that a thicker IMC layer in fact increases the reliability of the solder joint for a cracked IMC. The values of KI and KII are found to decrease as the crack is located further away from the solder interface while the other parameters are held constant. Solder thickness and strain rate were also found to have a significant influence on the SIF values. It has been found that the soft solder matrix generates non-uniform plastic deformation across the solder-IMC interface near the crack tip, which is responsible for the higher KI and KII.
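As a reminder of the LEFM quantity being studied parametrically above, the sketch below evaluates the textbook mode-I stress intensity factor K_I = Y·sigma·sqrt(pi·a). The geometry factor, stress and crack length are illustrative assumptions; a finite element model, as used in the paper, is needed for a crack embedded in a thin IMC layer of a real joint.

import math

def mode_I_sif(stress_mpa, crack_length_m, geometry_factor=1.12):
    """Textbook LEFM estimate K_I = Y * sigma * sqrt(pi * a), in MPa*m^0.5.
    Y ~ 1.12 is the usual factor for a shallow edge crack under remote tension."""
    return geometry_factor * stress_mpa * math.sqrt(math.pi * crack_length_m)

# Illustrative: a 2 micron crack under 40 MPa remote tensile stress.
print(f"K_I ~ {mode_I_sif(40.0, 2e-6):.3f} MPa*m^0.5")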

Relevance: 30.00%

Abstract:

A common problem faced by fire safety engineers in the field of evacuation analysis concerns the optimal design of an arbitrarily complex structure in order to minimise evacuation times. How does the engineer determine the best solution? In this study we introduce the concept of numerical optimisation techniques to address this problem. The study makes use of the buildingEXODUS evacuation model coupled with classical optimisation theory, including Design of Experiments (DoE) and Response Surface Models (RSM). We demonstrate the technique using a relatively simple problem: determining the optimal location for a single exit in a square room.
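For the single-exit problem this reduces to one design variable, so the DoE/RSM loop can be sketched very compactly: sample a few exit positions along one wall, evaluate evacuation time, fit a quadratic response surface, and read off the minimising position. The stand-in evacuation-time function below is an illustrative assumption in place of a buildingEXODUS run.

import numpy as np

def evacuation_time_s(exit_pos_m, wall_len_m=10.0):
    """Stand-in for a buildingEXODUS run: evacuation time for a single exit
    located exit_pos_m along one wall of a square room (noisy and, in reality,
    expensive to evaluate, which is why a surrogate is fitted)."""
    return 80 + 0.8 * (exit_pos_m - wall_len_m / 2) ** 2 + np.random.normal(0, 0.5)

np.random.seed(1)
design = np.linspace(0.5, 9.5, 7)                      # DoE: 7 exit positions
times = np.array([evacuation_time_s(p) for p in design])

# Response surface: quadratic fit t ~ c2*p^2 + c1*p + c0, then its analytic minimum.
c2, c1, c0 = np.polyfit(design, times, 2)
best_pos = -c1 / (2 * c2)
print(f"predicted optimal exit position: {best_pos:.2f} m from the corner")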

Relevance: 30.00%

Abstract:

During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A scoping study was therefore commissioned by the Joint Nature Conservation Committee, as part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the archived material and the procedure and cost of undertaking further work. The results of the scoping study are:
1. NMBL archives hold 106 videotapes (reel-to-reel Sony HD format) and 59 video cassettes (including 15 from the Irish Sea) in VHS format, together with 90 rolls of 35 mm colour transparency film (various lengths up to about 240 frames per film). These are stored in the Archive Room, either in a storage cabinet or in the original film canisters.
2. Reel-to-reel material is extensive and had already been selectively copied to VHS cassettes. The cost of transferring it to an accepted 'long-life' medium (Betamax) would be approximately £15,000. It was not possible to view the tapes as a suitable machine was not located. The value of the tapes is uncertain, but they are likely to become beyond salvation within one to two years.
3. Video cassette material is in good condition and is expected to remain so for at least several more years. Images viewed were generally of poor quality and the speed of tow often makes pictures blurred. No immediate action is required.
4. Colour transparency films are in good condition and the images are very clear. They provide the best source of information for mapping seabed biotopes. They should be scanned to digital format, but inexpensive fast copying is problematic as there are no between-frame breaks between images and machines need to centre the image based on between-frame breaks. The minimum cost to scan all of the images commercially is approximately £6,000 and could be as much as £40,000 on some quotations. There is a further cost in coding and databasing each image and, all in all, it would seem most economic to purchase a 'continuous film' scanner and undertake the work in-house.
5. Positional information in ships' logs has been matched to films and to video tapes. Decca Chain co-ordinates recorded in the logbooks have been converted to latitude and longitude (degrees, minutes and seconds) and a further routine developed to convert to the decimal degrees required for GIS mapping (a conversion sketch follows this list). However, it is unclear whether corrections to Decca positions were applied at the time the position was noted. Tow tracks have been mapped onto an electronic copy of a Hydrographic Office chart.
6. The positions of the start and end of each tow were entered into a spreadsheet so that they can be displayed on GIS or on a Hydrographic Office chart backdrop. The cost of the Hydrographic Office chart backdrop at a scale of 1:75,000 for the whole area was £458 incl. VAT.
7. Viewing all of the video cassettes to note habitats and biological communities, even by an experienced marine biologist, would take at least in the order of 200 hours and is not recommended.
8. Once colour transparencies are scanned and indexed, viewing them to identify seabed habitats and biological communities would probably take about 100 hours for an experienced marine biologist and is recommended.
9. It is expected that identifying biotopes along approximately 1 km lengths of each tow would be feasible, although uncertainties about Decca co-ordinate corrections and the exact positions of images most likely give a ±250 m position error. More work to locate each image accurately and to resolve the Decca correction question would improve the accuracy of image location.
10. Using codings (produced by Holme to identify different seabed types), and some viewing of video and transparency material, 10 biotopes have been identified, although more would be added as a result of full analysis.
11. Using the data available from the Holme archive, it is possible to populate various fields within the Marine Recorder database. The overall 'survey' will be 'English Channel towed video sled survey'. The 'events' become the 104 tows. Each tow could be described as four samples, i.e. the start and end of the tow and two areas in the middle to give examples along the length of the tow. These samples would have their own latitude/longitude co-ordinates. The four samples would link to a GIS map.
12. Stills and video clips, together with text information, could be incorporated into a multimedia presentation to demonstrate the range of level seabed types found along a part of the northern English Channel. More recent images taken during SCUBA diving of reef habitats in the same area as the towed sledge surveys could be added to the Holme images.
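A minimal sketch of the co-ordinate conversion mentioned in item 5: degrees, minutes and seconds to the signed decimal degrees that GIS packages expect. Converting the raw Decca Chain readings themselves requires chain-specific correction tables and is not attempted here; the example position is illustrative.

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees, minutes, seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees for GIS mapping."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if hemisphere in ("S", "W") else value

# Example: a position in the English Channel, 50 deg 10' 30" N, 3 deg 45' 12" W.
lat = dms_to_decimal(50, 10, 30, "N")
lon = dms_to_decimal(3, 45, 12, "W")
print(f"{lat:.5f}, {lon:.5f}")   # 50.17500, -3.75333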

Relevance: 30.00%

Abstract:

The resolution of the SSU rRNA gene for phylogenetic analysis in the diatoms has been evaluated by Theriot et al., who claimed that the SSU rRNA gene could not be used to resolve the monophyly of the three diatom classes described by Medlin and Kaczmarska. However, they used only bolidomonads and heterokonts as outgroups and did not explore outgroups further away than the heterokonts. In this study, the use of multiple outgroups inside and outside the heterokonts with the rRNA gene is evaluated for recovering the three monophyletic clades at the class level. Trees with multiple outgroups ranging from only bolidophytes to Bacteria and Archaea were analyzed with Bayesian and Maximum Likelihood analyses, and two data sets were recovered with the classes being monophyletic. Other data sets were analyzed with non-weighted and weighted maximum parsimony. The latter reduced the number of clades and lengthened branch lengths between the clades. One data set using a weighted analysis recovered the three classes as monophyletic. Taking the bolidophytes as the only outgroup never produced monophyletic clades. Multiple outgroups including many heterokonts and certain members of the crown group radiation recovered monophyletic clades. The three classes can be defined by clear morphological differences based primarily on auxospore ontogeny and envelope structure, the presence or absence of a structure (tube process or sternum) associated with the annulus, and the location of the cribrum in those genera with loculate areolae. A cladistic analysis of some of these features is presented and recovers the three classes.

Relevance: 30.00%

Abstract:

This paper describes the flow characteristics in the near-throat region of a poppet valve under steady flow conditions. An experimental and theoretical procedure was undertaken to determine the total pressure at the assumed throat region of the valve, and also at a downstream location. Experiments of this type can be used to accurately determine the flow performance of a particular induction system. The static pressure recovery was calculated from the near-throat region of the valve to the downstream location and was shown to be dependent on valve lift. Total pressure profiles suggest that, for this particular induction system, the majority of the pressure loss occurs downstream of the valve for lift/diameter ratios up to 0.1, and upstream of the valve for lift/diameter ratios greater than 0.1. Negligible pressure recovery was shown to exist from the cylindrical periphery of the valve head to the downstream location for all valve lifts, indicating that the flow had probably separated completely from the trailing edge of the valve seating face. The calculated discharge coefficients, based on the geometric throat static pressure measurements on the seating face, were in general less than those determined using the downstream static pressure, by as much as 12% in some instances towards the valve's lower mass flow rate range.
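The discharge coefficients discussed above compare the measured mass flow to the ideal mass flow implied by a pressure drop across a reference area. A hedged sketch of that calculation for incompressible steady flow is given below; the incompressible form, the reference area and the example numbers are illustrative assumptions, and compressible-flow corrections would be needed at higher pressure ratios.

import math

def discharge_coefficient(m_dot_measured, area_m2, delta_p_pa, rho=1.2):
    """Cd = measured mass flow / ideal mass flow, with the ideal flow taken as
    rho * A * sqrt(2*dp/rho) for incompressible flow through the reference area."""
    ideal = rho * area_m2 * math.sqrt(2.0 * delta_p_pa / rho)
    return m_dot_measured / ideal

# Illustrative: valve reference area at a given lift, 2.5 kPa pressure drop,
# 0.055 kg/s measured air flow.
area = 1.4e-3            # m^2
print(f"Cd ~ {discharge_coefficient(0.055, area, 2500.0):.2f}")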