899 results for High-efficiency Transformation
Abstract:
BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts was organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users, guideline developers and implementers, were invited to register and to complete a questionnaire evaluating their perceptions of the proposed process. RESULTS: The ADAPTE process consists of three phases (set-up, adaptation, finalisation), nine modules and 24 steps. The adaptation phase involves identifying specific clinical questions; searching for, retrieving and assessing available guidelines; and preparing the draft adapted guideline. Among 330 registered individuals (46 countries), 144 completed the questionnaire. A majority found the ADAPTE process clear (78%), comprehensive (69%) and feasible (60%), and the manual useful (79%). However, 21% found the ADAPTE process complex, and 44% feared that they would not find appropriate, high-quality source guidelines. DISCUSSION: A comprehensive framework for guideline adaptation has been developed to meet the challenges of timely guideline development and implementation. The ADAPTE process generated considerable interest among guideline developers and implementers. The majority perceived the ADAPTE process to be feasible, useful and conducive to improved methodological rigour and guideline quality. However, some de novo development might be needed if no high-quality guideline exists for a given topic.
Abstract:
T cells move randomly ("random walk"), a characteristic thought to be integral to their function. Using migration assays and time-lapse microscopy, we found that CD8+ T cells lacking the lymph node homing receptors CCR7 and CD62L migrate more efficiently in transwell assays, and that these same cells show a high frequency of random crawling activity under culture conditions mimicking the interstitial/extravascular milieu, but not when examined on endothelial cells. To assess the energy efficiency of cells crawling at high frequency, we measured mRNA expression of genes key to mitochondrial energy metabolism (peroxisome proliferator-activated receptor gamma coactivator 1beta [PGC-1beta], estrogen-related receptor alpha [ERRalpha], cytochrome C, ATP synthase, and the uncoupling proteins [UCPs] UCP-2 and -3), quantified ATP content, and performed calorimetric analyses. Together, these assays indicated high energy efficiency in the high-crawling-frequency CD8+ T-cell population and identified differentially regulated heat production between nonlymphoid- and lymphoid-homing CD8+ T cells.
Abstract:
The aim of this thesis was to produce information for estimating the flow balance of wood resin in mechanical pulping and to demonstrate the possibilities for improving the efficiency of deresination in practice. It was observed that chemical changes in wood resin take place only during peroxide bleaching, that a significant amount of water-dispersed wood resin is retained in the pulp mat during dewatering, and that the amount of wood resin in the solid phase of the process filtrates is very small. On the basis of this information, three parameters related to the behaviour of wood resin determine the flow balance in the process: (1) the liberation of wood resin into the pulp water phase, (2) the retention of water-dispersed wood resin during dewatering, and (3) the proportion of wood resin degraded in peroxide bleaching. The effect of different factors on these parameters was evaluated with the help of laboratory studies and a literature survey. In addition, information on the values of these parameters in existing processes was obtained in mill measurements. With this information, it was possible to evaluate the deresination efficiency, and the effect of different factors on it, in a pulping plant producing low-freeness mechanical pulp. This evaluation showed that the wood resin content of mechanical pulp can be decreased significantly if the process includes a peroxide bleaching stage followed by a washing stage. With an optimal process configuration, a deresination efficiency as high as 85% appears possible at a water usage level of 8 m³/o.d.t.
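To make the role of these three parameters concrete, the following is a minimal, purely illustrative single-stage balance (not the flow-balance model developed in the thesis; the function name, parameter names and mixing assumptions are hypothetical):

```python
def resin_in_pulp(resin_in_wood_kg_per_t,
                  f_liberated,      # fraction of wood resin liberated to the pulp water phase
                  f_retained,       # fraction of the dispersed resin retained in the pulp mat on dewatering
                  f_degraded):      # fraction of wood resin degraded in peroxide bleaching
    """Illustrative single-stage balance: resin that survives bleaching either stays
    bound in the fibres or, if liberated, is retained in the mat or leaves with the filtrate."""
    surviving = resin_in_wood_kg_per_t * (1.0 - f_degraded)
    bound = surviving * (1.0 - f_liberated)          # never liberated, follows the pulp
    retained = surviving * f_liberated * f_retained  # liberated but caught in the pulp mat
    return bound + retained                          # kg resin per tonne of pulp

# Example with made-up numbers: 10 kg/t resin in the wood, 60 % liberated,
# 50 % of the dispersed resin retained on dewatering, 30 % degraded in bleaching.
print(resin_in_pulp(10.0, 0.60, 0.50, 0.30))  # -> 4.9 kg/t, i.e. 51 % deresination
```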
Abstract:
This work was carried out in the Laboratory of Fluid Dynamics at Lappeenranta University of Technology during the years 1991-1996. The research was part of a larger high-speed technology development programme. The starting point was the idea of building high-speed machinery applications around the Brayton cycle; there was a clear need to deepen the knowledge of the cycle itself and to take a new approach in this field of research. The removal of water from humid air also seemed very interesting. The goal of this work was to study methods of designing high-speed machinery for the reversed Brayton cycle, from theoretical principles to practical applications. The reversed Brayton cycle can be employed as an air dryer, a heat pump or a refrigerating machine. In this research, the use of humid air as a working fluid also offers an environmental advantage. A new calculation method for the Brayton cycle is developed. In this method, the expansion process in the turbine is especially important because of the condensation of water vapour in the humid air. This physical phenomenon can have significant effects on the performance of the application. The influence of calculating the process with actual, achievable process equipment efficiencies is also essential for the development of future machinery. The theoretical calculations are confirmed with two different laboratory prototypes. The high-speed machinery concept allows an application to be built with only one rotating shaft carrying all the major parts: the high-speed motor, the compressor and the turbine wheel. The use of oil-free bearings and high rotational speeds gives several advantages compared with conventional machinery: light weight, compact structure, safe operation and higher efficiency over a large operating region. There are always problems when theory is applied in practice. The pressure, temperature and humidity probes were calibrated with care, but measurement errors were still not negligible. Several different separators were examined, and in all cases the amount of separated water could not be determined exactly. Owing to the compact sizes and structures of the prototypes, process measurement was somewhat difficult. The experimental results agree well with the theoretical calculations. These experiments demonstrate the operation of the process and lay the groundwork for further development. The results of this work open very promising possibilities for the design of new, commercially competitive applications that use high-speed machinery and the reversed Brayton cycle.
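As a rough sketch of the kind of cycle calculation involved, the following evaluates a reversed Brayton refrigeration cycle for dry air with assumed compressor and turbine isentropic efficiencies; it deliberately omits the humid-air condensation effects that the thesis identifies as significant, and all numerical values are illustrative assumptions:

```python
def reversed_brayton_cop(T_cold_in=293.0,   # K, air entering the compressor
                         T_hot_in=313.0,    # K, air entering the turbine after heat rejection
                         pressure_ratio=2.5,
                         eta_c=0.80,        # compressor isentropic efficiency
                         eta_t=0.80,        # turbine isentropic efficiency
                         gamma=1.4, cp=1005.0):
    """Ideal-gas reversed Brayton (air cycle) refrigerator with non-ideal components."""
    k = (gamma - 1.0) / gamma
    # Compression 1 -> 2: isentropic temperature rise divided by compressor efficiency
    T2 = T_cold_in * (1.0 + (pressure_ratio**k - 1.0) / eta_c)
    # Expansion 3 -> 4: isentropic temperature drop multiplied by turbine efficiency
    T4 = T_hot_in * (1.0 - eta_t * (1.0 - pressure_ratio**(-k)))
    q_cold = cp * (T_cold_in - T4)                         # heat absorbed on the cold side, J/kg
    w_net = cp * (T2 - T_cold_in) - cp * (T_hot_in - T4)   # compressor work minus turbine work, J/kg
    return q_cold / w_net

print(round(reversed_brayton_cop(), 2))   # roughly 0.7 for these assumed values
```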
Abstract:
The last economic crisis raised huge challenges for nonprofit organizations. It is now critical for nonprofit organizations to demonstrate not only their social legitimacy but also their efficiency and competency when applying for grants (Kearns, Bell, Deem, & McShane, 2012). High Performance Work Practices (HPWP) are a way to foster performance and thus to answer the challenges nonprofit organizations are currently facing. However, such practices have so far only been considered in the corporate world. The entire philosophy behind nonprofit organizations contrasts radically with that of the for-profit sector, and human resources management in particular may differ as well. The aim of this article is precisely to analyze the challenges of implementing HPWP in nonprofit organizations. To explore those challenges, we study the HR practices of a UK-based nonprofit organization that fights poverty. The discussion of results highlights good practices that should be applied across the nonprofit sector.
Abstract:
A patent foramen ovale (PFO), present in ∼40% of the general population, is a potential source of right-to-left shunt that can impair pulmonary gas exchange efficiency [i.e., increase the alveolar-to-arterial Po2 difference (A-aDO2)]. Prior studies of human acclimatization to high altitude with A-aDO2 as a key parameter have not examined differences between subjects with (PFO+) or without a PFO (PFO-). We hypothesized that in PFO+ subjects A-aDO2 would not improve (i.e., decrease) after acclimatization to high altitude compared with PFO- subjects. Twenty-one (11 PFO+) healthy sea-level residents were studied at rest and during cycle ergometer exercise at the highest iso-workload achieved at sea level (SL), after acute transport to 5,260 m (ALT1), and again at 5,260 m after 16 days of high-altitude acclimatization (ALT16). In contrast to PFO- subjects, PFO+ subjects had 1) no improvement in A-aDO2 at rest and during exercise at ALT16 compared with ALT1, 2) no significant increase in resting alveolar ventilation, or alveolar Po2, at ALT16 compared with ALT1, and consequently had 3) an increased arterial Pco2 and decreased arterial Po2 and arterial O2 saturation at rest at ALT16. Furthermore, PFO+ subjects had an increased incidence of acute mountain sickness (AMS) at ALT1 concomitant with significantly lower peripheral O2 saturation (SpO2). These data suggest that PFO+ subjects have increased susceptibility to AMS when not taking prophylactic treatments, that right-to-left shunt through a PFO impairs pulmonary gas exchange efficiency even after acclimatization to high altitude, and that PFO+ subjects have blunted ventilatory acclimatization after 16 days at altitude compared with PFO- subjects.
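For reference, the gas exchange efficiency index used here is conventionally obtained from the (simplified) alveolar gas equation; the notation below is standard respiratory physiology, not anything specific to this study:

```latex
P_{A\mathrm{O}_2} \approx F_{I\mathrm{O}_2}\,(P_B - P_{\mathrm{H_2O}}) - \frac{P_{a\mathrm{CO}_2}}{R},
\qquad
\mathrm{A\mbox{-}aDO_2} = P_{A\mathrm{O}_2} - P_{a\mathrm{O}_2}
```

A right-to-left shunt lowers arterial Po2 without changing alveolar Po2, which is why it widens this difference.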
Abstract:
Clinical experience and experimental data suggest that intradialytic hemodynamic profiles could be influenced by the characteristics of the dialysis membranes. Even within the widely used polysulfone family, intolerance to specific membranes has occasionally been reported. The aim of this study was to compare the hemodynamic behavior of some of the polysulfone dialyzers commonly used in Switzerland. We performed an open-label, randomized, cross-over trial including 25 hemodialysis patients. Four polysulfone dialyzers were compared: A (Revaclear high-flux, Gambro, Stockholm, Sweden), B (Helixone high-flux, Fresenius, Bad Homburg vor der Höhe, Germany), C (Xevonta high-flux, BBraun, Melsungen, Germany), and D (Helixone low-flux, Fresenius). The hemodynamic profile was assessed and patients were asked to provide tolerance feedback. The mean score (±SD) subjectively assigned to dialysis quality on a 1-10 scale was A 8.4 ± 1.3, B 8.6 ± 1.3, C 8.5 ± 1.6, D 8.5 ± 1.5. Kt/V was A 1.58 ± 0.30, B 1.67 ± 0.33, C 1.62 ± 0.32, D 1.45 ± 0.31. The low-flux membrane, compared with the high-flux membranes, was associated with higher systolic (128.1 ± 13.1 vs. 125.6 ± 12.1 mmHg, P < 0.01) and diastolic (76.8 ± 8.7 vs. 75.3 ± 9.0 mmHg; P < 0.05) pressures, higher peripheral resistance (1.44 ± 0.19 vs. 1.40 ± 0.18 s × mmHg/mL; P < 0.05) and lower cardiac output (3.76 ± 0.62 vs. 3.82 ± 0.59 L/min; P < 0.05). Hypotensive events (decrease in systolic blood pressure by >20 mmHg) numbered 70 with A, 87 with B, 73 with C, and 75 with D (P < 0.01 B vs. A, 0.05 B vs. C and 0.07 B vs. D). The low-flux membrane was thus associated with higher blood pressure levels than the high-flux ones. The Helixone high-flux membrane provided the best efficiency; however, the very same dialyzer was also associated with a higher incidence of hypotensive episodes.
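For context, a single-pool Kt/V of the kind reported here is commonly estimated from pre- and post-dialysis urea measurements, for example with the second-generation Daugirdas formula below (the abstract does not state which method was actually used):

```latex
Kt/V \;=\; -\ln\!\left(R - 0.008\,t\right) + \left(4 - 3.5\,R\right)\frac{UF}{W}
```

where R is the post-/pre-dialysis urea ratio, t the session length in hours, UF the ultrafiltration volume in litres, and W the post-dialysis weight in kilograms.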
Abstract:
Rapid progress in recent years has accelerated the development of new drugs. Combinatorial chemistry has made it possible to synthesize large collections of structurally diverse molecules, so-called combinatorial libraries, for biological screening. In screening, the structure-related activity of the molecules is examined in a variety of biological assays to find potential "hits", some of which may later be developed into new drug substances. For the results of the biological studies to be reliable, the synthesized compounds must be as pure as possible. High-throughput (HTP) purification is therefore needed to guarantee high-quality compounds and reliable biological data. Continuously increasing throughput requirements have led to the automation and parallelization of these purification techniques. Preparative LC/MS is well suited to the fast and efficient purification of combinatorial libraries. Many factors, such as the properties of the separation column and the flow gradient, affect the efficiency of the preparative LC/MS purification process, and these parameters must be optimized to obtain the best result. In this work, basic compounds were studied under different flow conditions. A method for determining the purity level of combinatorial libraries after LC/MS purification was optimized, and the purity of selected compounds from different libraries was determined before purification.
Abstract:
The aim of this work was to determine the profitability of sprinkler (wet) storage of birch in the pulp industry. In addition, it was studied how sprinkling affects the wood material during storage and how sprinkler storage of birch affects debarking and chipping, cooking behaviour, bleachability and pulp quality. A sprinkler storage area was built in the wood yard of the Enocell mill, where 40,000 m³ (solid over bark) of birch was stored. Sprinkling was on from April until October. The effect of sprinkling on changes in the wood material was evaluated by decay analyses, and mill trials compared fresh, sprinkler-stored and dry-stored birch. The wood material remained almost unchanged during one summer of sprinkler storage: for sprinkled birch, the proportion of sound wood was over 85% at the end of the summer, compared with less than 20% for dry-stored birch. Wood losses in debarking decrease clearly with sprinkled birch, and the chip quality was also better than with dry-stored birch; the bark content of sprinkled-birch chips was only 0.13%. The dry-matter content of the bark was 12 percentage points lower than for dry birch, but the difference in the heating value of the bark corresponded to only 1 €/ADt. The storage method had no effect on the cooking behaviour of the chips, but fresh wood cooked better than stored wood. The acetone extractives content of the pulp was at the same level for fresh and sprinkled wood. For dry autumn wood the extractives level was higher, even though the resin soap dosage was increased by 10 kg/ADt. The betulinol level of sprinkled wood was very low because the wood debarked so well. The bleachability of sprinkled and fresh wood was better than that of dry wood, and the consumption of active chlorine was 3-4 kg/ADt lower than with dry autumn wood. Wood storage had no effect on pulp quality. The profitability of sprinkler storage of birch is very good. Production costs were determined for fresh, sprinkled, recirculated and dry birch. Sprinkler storage reduces production costs by about 10 €/ADt compared with recirculated birch, while the use of dry-stored wood increases production costs by about 5 €/ADt compared with sprinkled birch; the cost difference between recirculated and dry-stored wood is due to the recirculation costs. For the sprinkler storage used in the summer of 2004, the payback period is only 0.4 years. If the target payback period were two years, the basic investment in a storage area for 80,000 m³ (solid over bark) could cost about 370 k€.
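As a quick check of the payback figures, the arithmetic can be reproduced with the simple undiscounted payback relation below; the implied annual saving is back-calculated from the abstract's two-year target and 370 k€ investment and is an assumption, not a reported number:

```python
def payback_years(investment_keur, annual_saving_keur):
    # Simple (undiscounted) payback period
    return investment_keur / annual_saving_keur

# If a 370 k€ investment in an 80,000 m³ (solid over bark) storage is to pay back
# in two years, the implied saving is about 185 k€ per year.
implied_annual_saving = 370.0 / 2.0
print(payback_years(370.0, implied_annual_saving))  # -> 2.0
```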
Abstract:
Coronary artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on coronary artery disease in patients. The major challenge of cardiac MRI, though, is dealing with all the sources of motion that can corrupt the images and degrade the diagnostic information they provide. The current thesis therefore focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and tissue information. The proposed approaches help advance coronary magnetic resonance angiography (MRA) towards an easy-to-use, multipurpose tool that can be translated to the clinical environment. The first part of the thesis focused on the study of coronary artery motion with a gold-standard imaging technique (x-ray angiography) in patients, in order to measure the precision with which the coronary arteries assume the same position beat after beat (coronary artery repositioning). We learned that intervals with minimal coronary artery repositioning occur in peak systole and in mid-diastole, and we responded with a new pulse sequence (T2-post) that is able to provide peak-systolic imaging. This sequence was tested in healthy volunteers, and the image quality comparison showed that the proposed approach provides coronary artery visualization and contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR). The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI.
The proposed technique acquires data continuously (free-running) instead of being ECG-triggered, thus increasing the efficiency of the acquisition and providing four-dimensional (4D) images of the whole heart, while respiratory self-navigation allows the scan to be performed during free breathing. This enabling technology allows anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle based 3D radial trajectory, which allows continuous sampling of k-space and a retrospective selection of the timing parameters of the reconstructed dataset. The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset while also improving overall image quality by removing undersampling artifacts. The resulting 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow assessment of cardiac function from a single free-breathing scan. The quality of the coronary arteries in the frames of the free-running 4D acquisition is in line with that obtained with the standard ECG-triggered acquisition, and the cardiac function evaluation matched that obtained with the gold-standard stack of 2D cine acquisitions. Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging allows detection of coronary artery plaque calcification ex vivo, since it is able to capture the short-T2 components of the calcification; heart motion, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated, 3D radial, triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are acquired simultaneously to extract the short-T2 components of the calcification, while a water-fat separation technique allows proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence shows great potential for the in vivo visualization of coronary artery calcification. In conclusion, the thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease. These approaches provide new anatomical and functional information in four dimensions and support tissue characterization of coronary artery plaques.
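To illustrate the key property of golden-angle ordering that makes the retrospective selection possible, the sketch below uses the simpler 2D golden angle (the thesis uses a 3D radial, golden-angle based variant, so this is only an analogy; all names are hypothetical):

```python
import math

GOLDEN_ANGLE_DEG = 180.0 * (math.sqrt(5.0) - 1.0) / 2.0   # ~111.25 degrees

def radial_spoke_angles(n_spokes):
    """Azimuthal angle of each readout when spokes are ordered by the 2D golden angle."""
    return [(i * GOLDEN_ANGLE_DEG) % 360.0 for i in range(n_spokes)]

# Any contiguous window of readouts -- e.g. those retrospectively assigned to one
# cardiac phase -- covers k-space almost uniformly:
angles = radial_spoke_angles(1000)
window = sorted(a % 180.0 for a in angles[200:250])   # 50 consecutive spokes; a spoke direction repeats every 180 degrees
gaps = [b - a for a, b in zip(window, window[1:])]
gaps.append(window[0] + 180.0 - window[-1])           # wrap-around gap
print(max(gaps))   # the largest angular gap stays small, i.e. coverage is nearly uniform
```

Because any contiguous window of readouts covers k-space nearly uniformly, the cardiac phase and temporal resolution can be chosen after the scan.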
Abstract:
Within the last decade, high-speed motor technology has been applied increasingly commonly in the medium and large power range. In particular, applications involving gas movement and compression appear to be the most important area in which high-speed machines are used. By manufacturing the induction motor rotor core from a single piece of steel, an extremely rigid rotor construction can be achieved for the high-speed motor. In a mechanical sense, the solid rotor may be the best possible rotor construction. Unfortunately, the electromagnetic properties of a solid rotor are poorer than those of the traditional laminated rotor of an induction motor. This thesis analyses methods for improving the electromagnetic properties of a solid-rotor induction machine. The slip of the solid rotor is reduced notably if the rotor is axially slitted. The slitting patterns of the solid rotor are examined, and it is shown how the slitting parameters affect the produced torque. Methods for decreasing the harmonic eddy currents on the surface of the rotor are also examined; the motivation is to improve the efficiency of the motor towards the efficiency standard of a laminated-rotor induction motor. To carry out these research tasks, finite element analysis is used. An analytical calculation method for solid rotors, based on the multi-layer transfer-matrix method, is developed especially for the calculation of axially slitted solid rotors equipped with well-conducting end rings. The calculation results are verified using finite element analysis and laboratory measurements. Prototype motors of 250-300 kW at 140 Hz were tested to verify the results. Utilization factor data are given for several other prototypes, the largest of which delivers 1000 kW at 12,000 min-1.
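The link between slip and efficiency that motivates the slitting is the standard induction-machine power split, in which a fraction of the air-gap power equal to the slip is dissipated in the rotor:

```latex
P_{\mathrm{mech}} = (1 - s)\,P_{\delta}, \qquad
P_{\mathrm{Cu,rotor}} = s\,P_{\delta}
\;\;\Longrightarrow\;\; \eta \le 1 - s
```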
Abstract:
We present an algorithm for the computation of reducible invariant tori of discrete dynamical systems that is suitable for tori of dimension larger than one. It is based on a quadratically convergent scheme that approximates, at the same time, the Fourier series of the torus, its Floquet transformation, and its Floquet matrix. The Floquet matrix describes the linearization of the dynamics around the torus and, hence, its linear stability. The algorithm exhibits a high degree of parallelism, and the computational effort grows linearly with the number of Fourier modes needed to represent the solution. For these reasons it is a very good option for computing quasi-periodic solutions with several basic frequencies. The paper includes some examples (flows) to show the efficiency of the method on a parallel computer; in these flows we compute invariant tori of dimension up to 5 by taking suitable sections.
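In the usual formulation (sketched here for orientation; the notation is generic rather than taken from the paper), the torus is sought as a parameterization K of the d-dimensional torus with rotation vector ω satisfying the invariance equation, and reducibility means that a Floquet change of variables P renders the linearized dynamics constant:

```latex
F\big(K(\theta)\big) = K(\theta + \omega), \qquad
P(\theta + \omega)^{-1}\, DF\big(K(\theta)\big)\, P(\theta) = \Lambda
```

The eigenvalues of the constant Floquet matrix Λ then determine the linear stability of the torus, and the quadratically convergent scheme can be read as a Newton-type iteration on these equations in Fourier space.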
Abstract:
The adequate selection of indicator groups of biodiversity is an important aspect of systematic conservation planning. However, such assessments differ in spatial scale, in the methods used, and in the groups considered, which generally produces contradictory results. Quantifying the spatial congruence of species richness and complementarity among different taxonomic groups is a fundamental step in identifying potential indicator groups. Using a constructive approach, the main purpose of this study was to evaluate the performance and efficiency of eight potential indicator groups in representing amphibian diversity in the Brazilian Atlantic Forest. Data on the geographic ranges of amphibian species occurring in the Brazilian Atlantic Forest were overlaid on the full geographic extent of the biome, which was divided into a regular equal-area grid. Optimization routines based on the concept of complementarity were applied to verify the performance of each selected indicator group in representing the amphibians of the Brazilian Atlantic Forest as a whole; these routines were solved with the "simulated annealing" algorithm implemented in the MARXAN software. Some indicator groups were substantially more effective than others in representing the taxonomic groups assessed, which was confirmed by the highly significant result (F = 312.76; p < 0.01). Leiuperidae was considered the best indicator group among the families analyzed, as it performed well, representing 71% of the amphibian species in the Brazilian Atlantic Forest (i.e. 290 species), which may be associated with the diffuse geographic distribution of its species. This study promotes understanding of how amphibian diversity patterns can inform systematic conservation planning at a regional scale.
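As a rough illustration of the kind of optimization MARXAN performs (this is a generic toy reserve-selection annealer written for this summary, not MARXAN itself; the data structures and penalty weight are hypothetical):

```python
import math
import random

def representation_shortfall(selected, presence, targets):
    """Total unmet representation over all species for a candidate set of sites."""
    shortfall = 0
    for species, target in targets.items():
        held = sum(1 for site in selected if species in presence[site])
        shortfall += max(0, target - held)
    return shortfall

def anneal(sites, presence, targets, cost, penalty=100.0,
           n_iter=20000, t_start=1.0, cooling=0.9995):
    """Minimise total site cost plus a penalty for unrepresented species by
    randomly flipping sites in and out of the solution (simulated annealing)."""
    def objective(sel):
        return sum(cost[s] for s in sel) + penalty * representation_shortfall(sel, presence, targets)

    selected = set()
    current = objective(selected)
    temperature = t_start
    for _ in range(n_iter):
        trial = set(selected)
        trial.symmetric_difference_update({random.choice(sites)})  # flip one site in or out
        value = objective(trial)
        if value < current or random.random() < math.exp((current - value) / temperature):
            selected, current = trial, value
        temperature *= cooling
    return selected

# Toy example: four candidate grid cells, three species, each to be represented at least once.
sites = ["c1", "c2", "c3", "c4"]
presence = {"c1": {"A", "B"}, "c2": {"B"}, "c3": {"C"}, "c4": {"A", "C"}}
targets = {"A": 1, "B": 1, "C": 1}
cost = {s: 1.0 for s in sites}
print(anneal(sites, presence, targets, cost))   # typically a minimal covering set such as {'c1', 'c3'}
```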
Abstract:
We report on a field-effect light-emitting device based on silicon nanocrystals in silicon oxide deposited by plasma-enhanced chemical vapor deposition. The device shows high power efficiency and long lifetime. The power efficiency is enhanced up to 0.1% by the presence of a silicon nitride control layer. The leakage current reduction induced by this nitride buffer effectively increases the power efficiency by two orders of magnitude with respect to similarly processed devices with oxide only. In addition, the nitride cools down the electrons that reach the polycrystalline silicon gate, lowering the formation of defects and thereby significantly reducing device degradation.
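Power efficiency here is presumably the usual wall-plug definition, i.e. optical output power over electrical input power, so the reported 0.1% corresponds to:

```latex
\eta_{\mathrm{power}} = \frac{P_{\mathrm{opt}}}{P_{\mathrm{elec}}} \approx 10^{-3}
```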