858 results for Subtropical design and architecture


Relevance:

100.00%

Publisher:

Abstract:

Coupled Electromechanical Analysis, MEMS Modeling, MEMS, RF MEMS Switches, Defected Ground Structures, Reconfigurable Resonator

Relevance:

100.00%

Publisher:

Abstract:

Magdeburg, Univ., Fak. für Elektrotechnik und Informationstechnik, Diss., 2010

Relevance:

100.00%

Publisher:

Abstract:

Magdeburg, Univ., Fak. für Elektrotechnik und Informationstechnik, Diss., 2013

Relevance:

100.00%

Publisher:

Abstract:

An active, solvent-free solid sampler was developed for the collection of 1,6-hexamethylene diisocyanate (HDI) aerosol and prepolymers. The sampler was made of a filter impregnated with 1-(2-methoxyphenyl)piperazine contained in a filter holder. Interferences with HDI were observed when a set of cellulose acetate filters and a polystyrene filter holder were used; a glass fiber filter and a polypropylene filter cassette gave better results. The applicability of the sampling and analytical procedure was validated with a test chamber constructed for the dynamic generation of HDI aerosol and prepolymers in commercial two-component spray paints (Desmodur® N75) used in car refinishing. The particle size distribution, temporal stability, and spatial uniformity of the simulated aerosol were established in order to test the sampler. The monitoring of aerosol concentrations was conducted with the solid sampler paired with the reference impinger technique (impinger flasks contained 10 mL of 0.5 mg/mL 1-(2-methoxyphenyl)piperazine in toluene) under a controlled atmosphere in the test chamber. Analyses of derivatized HDI and prepolymers were carried out by high-performance liquid chromatography with ultraviolet detection. The correlation between the solvent-free and the impinger techniques appeared fairly good (Y = 0.979X - 0.161; R = 0.978) when the tests were conducted in the range of 0.1 to 10 times the threshold limit value (TLV) for HDI monomer and up to 60 µg/m³ (3 U.K. TLVs) for total –N=C=O groups.
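
The reported agreement between the two techniques is just an ordinary least-squares comparison of paired measurements. A minimal sketch of that calculation is given below; the paired concentration arrays are hypothetical placeholders, not the study's data.

```python
# Minimal sketch: comparing a solvent-free sampler against the reference
# impinger technique with an ordinary least-squares fit, in the spirit of the
# abstract's Y = aX + b / R statistics. The arrays are hypothetical
# placeholders, NOT the study's measurements.
import numpy as np

# Paired HDI concentrations (ug/m3): X = impinger reference, Y = solid sampler
impinger = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 20.0, 40.0, 60.0])  # hypothetical
solid = np.array([0.4, 0.9, 2.3, 4.8, 9.6, 19.5, 39.0, 58.5])      # hypothetical

slope, intercept = np.polyfit(impinger, solid, 1)  # least-squares line Y = aX + b
r = np.corrcoef(impinger, solid)[0, 1]             # Pearson correlation coefficient

print(f"Y = {slope:.3f}X + {intercept:.3f}, R = {r:.3f}")
```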

Relevance:

100.00%

Publisher:

Abstract:

SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences through steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on immunoprecipitation of chromatin followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and previously unrecognized artifacts of the method. The distribution of sequence tags in the genome is not uniform, and we found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool for ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
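
A minimal sketch of the two analysis ideas described above, hot-spot filtering and unbiased random subsampling of sequence tags, is given below. The bin size, cut-off, and data layout are illustrative assumptions, not the thesis' actual pipeline.

```python
# Illustrative sketch (not the thesis' actual pipeline): filter artifactual
# tag hot-spots and draw an unbiased random subsample of ChIP-Seq tags.
import random
from collections import Counter

def filter_hotspots(tags, bin_size=200, max_tags_per_bin=50):
    """Drop tags falling in genomic bins with implausibly high tag counts.

    `tags` is a list of (chromosome, position) tuples; the bin size and
    cut-off are arbitrary illustrative values.
    """
    bins = Counter((chrom, pos // bin_size) for chrom, pos in tags)
    return [(chrom, pos) for chrom, pos in tags
            if bins[(chrom, pos // bin_size)] <= max_tags_per_bin]

def random_subsample(tags, n, seed=0):
    """Unbiased random sample of up to n tags (without replacement)."""
    rng = random.Random(seed)
    return rng.sample(tags, min(n, len(tags)))

# Usage with toy data:
tags = [("chr1", random.randint(0, 10_000)) for _ in range(5_000)]
clean = filter_hotspots(tags)
subset = random_subsample(clean, 1_000)
print(len(clean), len(subset))
```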

Relevance:

100.00%

Publisher:

Abstract:

The goal of this paper is to reexamine the optimal design and efficiency of loyalty rewards in markets for final consumption goods. While the literature has emphasized the role of loyalty rewards as endogenous switching costs (which distort the efficient allocation of consumers), in this paper I analyze the ability of alternative designs to foster consumer participation and increase total surplus. First, the efficiency of loyalty rewards depends on their specific design. A commitment to the price of repeat purchases can involve substantial efficiency gains by reducing price-cost margins. However, discount policies imply higher future regular prices and are likely to reduce total surplus. Second, firms may prefer to set up inefficient rewards (discounts), especially in those circumstances where a commitment to the price of repeat purchases triggers Coasian dynamics.

Relevance:

100.00%

Publisher:

Abstract:

All-optical label swapping (AOLS) is a key technology for implementing all-optical packet switching (AOPS) nodes for the future optical Internet. The capital expenditure of deploying AOLS increases with the size of the label space (i.e., the number of labels used), since a special optical device is needed for each recognized label at every node. Label space sizes are affected by the way in which demands are routed: for instance, shortest-path routing leads to fewer labels but high link utilization, while minimum interference routing leads to the opposite. This paper studies all-optical label stacking (AOLStack), an extension of the AOLS architecture that aims to reduce label spaces while easing the trade-off with link utilization. An integer linear program is proposed to analyze how AOLStack softens this trade-off, and a heuristic that finds good solutions in polynomial time is proposed as well. Simulation results show that AOLStack either a) reduces the label spaces with only a small increase in link utilization or, equivalently, b) makes better use of the residual bandwidth to decrease the number of labels even further.
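
The paper's exact formulation is not reproduced in this abstract, so the toy integer linear program below (written with the PuLP modelling library) only illustrates the flavour of the trade-off: each demand selects one candidate path, a selected path forces its label onto every node it traverses, and the objective weighs the total label-space size against the worst-case link load. Topology, paths, loads and weights are all invented for illustration.

```python
# Toy ILP sketch of the label-space vs. link-utilisation trade-off.
# This is NOT the paper's formulation; topology, paths and weights are
# invented purely for illustration. Requires the PuLP library.
import pulp

# Candidate paths per demand: (links, nodes, label) tuples.
demands = {
    "d1": [((("A", "B"),), ("A", "B"), "L1"),
           ((("A", "C"), ("C", "B")), ("A", "C", "B"), "L2")],
    "d2": [((("A", "B"),), ("A", "B"), "L1"),
           ((("A", "D"), ("D", "B")), ("A", "D", "B"), "L3")],
}
capacity = 1.0   # normalised link capacity
alpha = 2.0      # weight on the utilisation term

prob = pulp.LpProblem("aols_toy", pulp.LpMinimize)

# y[d, i] = 1 if demand d uses its i-th candidate path
y = {(d, i): pulp.LpVariable(f"y_{d}_{i}", cat="Binary")
     for d, paths in demands.items() for i in range(len(paths))}
# z[n, l] = 1 if label l must be recognised at node n
labels = {(n, p[2]) for paths in demands.values() for p in paths for n in p[1]}
z = {nl: pulp.LpVariable(f"z_{nl[0]}_{nl[1]}", cat="Binary") for nl in labels}
u = pulp.LpVariable("max_utilisation", lowBound=0)

# Each demand is routed on exactly one candidate path.
for d, paths in demands.items():
    prob += pulp.lpSum(y[d, i] for i in range(len(paths))) == 1

# A selected path forces its label onto every node it traverses.
for d, paths in demands.items():
    for i, (links, nodes, label) in enumerate(paths):
        for n in nodes:
            prob += z[n, label] >= y[d, i]

# Link load (each demand assumed to carry 0.5 units) bounded by u <= capacity.
links_all = {l for paths in demands.values() for p in paths for l in p[0]}
for link in links_all:
    prob += pulp.lpSum(0.5 * y[d, i]
                       for d, paths in demands.items()
                       for i, p in enumerate(paths) if link in p[0]) <= u
prob += u <= capacity

# Objective: total label-space size plus weighted worst-case utilisation.
prob += pulp.lpSum(z.values()) + alpha * u

prob.solve(pulp.PULP_CBC_CMD(msg=0))
print("labels installed:", sum(int(v.value()) for v in z.values()),
      "worst link load:", u.value())
```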

Relevance:

100.00%

Publisher:

Abstract:

Bacteria have long been the targets for genetic manipulation, but more recently they have been synthetically designed to carry out specific tasks. Among the simplest of these tasks is chemical compound and toxicity detection coupled to the production of a quantifiable reporter signal. In this Review, we describe the current design of bacterial bioreporters and their use in a range of assays to measure the presence of harmful chemicals in water, air, soil, food or biological specimens. New trends for integrating synthetic biology and microengineering into the design of bacterial bioreporter platforms are also highlighted.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this project is to design and construct an RTM mould for a loose flange of glass fibre reinforced plastic (GRP). The design phase aims to produce a simple, high-quality design of the RTM mould, so that the construction phase is straightforward and economical and the loose flange achieves the best possible characteristics. This RTM mould will be a prototype, to be developed in the future into an RTM mould capable of supporting loose flange production. The more of these steps the project completes, the closer that future objective becomes.

Relevance:

100.00%

Publisher:

Abstract:

MicroRNAs (miRNA) are recognized posttranscriptional gene repressors involved in the control of almost every biological process. Allelic variants in these regions may be an important source of phenotypic diversity and contribute to disease susceptibility. We analyzed the genomic organization of 325 human miRNAs (release 7.1, miRBase) to construct a panel of 768 single-nucleotide polymorphisms (SNPs) covering approximately 1 Mb of genomic DNA, including 131 isolated miRNAs (40%) and 194 miRNAs arranged in 48 miRNA clusters, as well as their 5-kb flanking regions. Of these miRNAs, 37% were inside known protein-coding genes, which were significantly associated with biological functions regarding neurological, psychological or nutritional disorders. SNP coverage analysis revealed a lower SNP density in miRNAs compared with the average of the genome, with only 24 SNPs located in the 325 miRNAs studied. Further genotyping of 340 unrelated Spanish individuals showed that more than half of the SNPs in miRNAs were either rare or monomorphic, in agreement with the reported selective constraint on human miRNAs. A comparison of the minor allele frequencies between Spanish and HapMap population samples confirmed the applicability of this SNP panel to the study of complex disorders among the Spanish population, and revealed two miRNA regions, hsa-mir-26a-2 in the CTDSP2 gene and hsa-mir-128-1 in the R3HDM1 gene, showing geographical allelic frequency variation among the four HapMap populations, probably because of differences in natural selection. The designed miRNA SNP panel could help to identify still hidden links between miRNAs and human disease.
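
Two of the quantities discussed above, SNP density within a region and the spread of minor allele frequencies across populations, are straightforward to compute. The sketch below uses the 24-SNPs-in-325-miRNAs figure from the abstract together with an assumed hairpin length and hypothetical HapMap-style frequencies.

```python
# Illustrative sketch: SNP density of a region and the allele-frequency
# spread used to flag geographically variable miRNA loci. The hairpin
# length and the frequencies below are assumptions, not the study's data.
def snp_density(n_snps: int, region_bp: int) -> float:
    """SNPs per kilobase."""
    return 1000.0 * n_snps / region_bp

def frequency_spread(minor_allele_freqs: dict[str, float]) -> float:
    """Largest difference in minor allele frequency across populations."""
    return max(minor_allele_freqs.values()) - min(minor_allele_freqs.values())

# 24 SNPs inside 325 miRNA hairpins (figures from the abstract); a hairpin
# span of ~80 bp per miRNA is an assumed round number for illustration.
print(snp_density(24, 325 * 80))          # SNPs/kb within miRNAs

# Hypothetical HapMap-style minor allele frequencies for one SNP:
maf = {"CEU": 0.05, "YRI": 0.32, "CHB": 0.10, "JPT": 0.12}
if frequency_spread(maf) > 0.2:           # arbitrary illustrative threshold
    print("candidate for population-differentiated selection")
```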

Relevance:

100.00%

Publisher:

Abstract:

State Highway Departments and local street and road agencies are currently faced with aging highway systems and a need to extend the life of some of the pavements. The agency engineer should have the opportunity to explore the use of multiple surface types in the selection of a preferred rehabilitation strategy. This study was designed to look at the portland cement concrete overlay alternative and especially the design of overlays for existing composite (portland cement concrete and asphaltic concrete) pavements. Existing design procedures for portland cement concrete overlays deal primarily with an existing asphaltic concrete pavement with an underlying granular or stabilized base. This study reviewed those design methods and moved to the development of a design for overlays of composite pavements. It deals directly with existing portland cement concrete pavements that have been overlaid with successive asphaltic concrete overlays and are in need of another overlay due to poor performance of the existing surface. The results of this study provide the engineer with a way to use existing deflection technology, coupled with materials testing and a combination of existing overlay design methods, to determine the design thickness of the portland cement concrete overlay. The design methodology provides guidance for the engineer, from the evaluation of the existing pavement condition through the construction of the overlay. It also provides a structural analysis of the effect of various joint and widening patterns on the performance of such designs. This work provides the engineer with a portland cement concrete overlay solution for composite pavements or conventional asphaltic concrete pavements that are in need of surface rehabilitation.
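
The report's specific procedure is not reproduced here; as a hedged illustration of deflection-based overlay design, the sketch below implements the common structural-deficiency form of concrete overlay thickness selection (bonded versus unbonded cases), with all input values invented.

```python
# Illustrative structural-deficiency overlay thickness calculation
# (AASHTO-style bonded/unbonded PCC overlay equations); NOT the report's
# specific procedure, and all numbers below are hypothetical.
import math

def pcc_overlay_thickness(d_future: float, d_effective: float, bonded: bool) -> float:
    """Required overlay thickness (inches).

    d_future    -- slab thickness a new pavement would need for future traffic
    d_effective -- effective thickness of the existing pavement (from deflection
                   testing and condition survey)
    """
    if bonded:
        return max(d_future - d_effective, 0.0)
    # Unbonded overlays benefit from the separation layer, hence the square-root form.
    return math.sqrt(max(d_future**2 - d_effective**2, 0.0))

# Hypothetical example: future design needs 10 in.; the existing composite
# pavement is rated at an effective 7 in.
print(round(pcc_overlay_thickness(10.0, 7.0, bonded=False), 1))  # ~7.1 in. unbonded
print(round(pcc_overlay_thickness(10.0, 7.0, bonded=True), 1))   # 3.0 in. bonded
```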

Relevance:

100.00%

Publisher:

Abstract:

Granular shoulders are an important element of the transportation system and are constantly subjected to performance problems due to wind- and water-induced erosion, rutting, edge drop-off, and slope irregularities. Such problems can directly affect drivers’ safety and often require regular maintenance. The present research study was undertaken to investigate the factors contributing to these performance problems and to propose new ideas to design and maintain granular shoulders while keeping ownership costs low. This report includes observations made during a field reconnaissance study, findings from an effort to stabilize the granular and subgrade layer at six shoulder test sections, and the results of a laboratory box study where a shoulder section overlying a soft foundation layer was simulated. Based on the research described in this report, the following changes are proposed to the construction and maintenance methods for granular shoulders:

• A minimum CBR value for the granular and subgrade layer should be selected to alleviate edge drop-off and rutting formation.

• For those constructing new shoulder sections, the design charts provided in this report can be used as a rapid guide based on an allowable rut depth. The charts can also be used to predict the behavior of existing shoulders.

• In the case of existing shoulder sections overlying soft foundations, the use of geogrid or fly ash stabilization proved to be an effective technique for mitigating shoulder rutting.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: Stroke registries are valuable tools for obtaining information about stroke epidemiology and management. The Acute STroke Registry and Analysis of Lausanne (ASTRAL) prospectively collects epidemiological, clinical, laboratory and multimodal brain imaging data of acute ischemic stroke patients in the Centre Hospitalier Universitaire Vaudois (CHUV). Here, we provide the design and methods used to create ASTRAL and present baseline data of our patients (2003 to 2008). METHODS: All consecutive patients admitted to CHUV between January 1, 2003 and December 31, 2008 with acute ischemic stroke within 24 hours of symptom onset were included in ASTRAL. Patients arriving beyond 24 hours, or with transient ischemic attack, intracerebral hemorrhage, subarachnoid hemorrhage, or cerebral sinus venous thrombosis, were excluded. Recurrent ischemic strokes were registered as new events. RESULTS: Between 2003 and 2008, 1633 patients and 1742 events were registered in ASTRAL. There was a preponderance of males, even in the elderly. Cardioembolic stroke was the most frequent type of stroke. Most strokes were of minor severity (National Institutes of Health Stroke Scale [NIHSS] score ≤ 4 in 40.8% of patients). Cardioembolic strokes and dissections presented with the most severe clinical picture. There was a significant number of patients with unknown-onset stroke, including wake-up stroke (n=568, 33.1%). Median time from last-well time to hospital arrival was 142 minutes for known-onset and 759 minutes for unknown-onset stroke. The rate of intravenous or intra-arterial thrombolysis between 2003 and 2008 increased from 10.8% to 20.8% in patients admitted within 24 hours of last-well time. Acute brain imaging was performed in 1695 patients (97.3%) within 24 hours. Of the 1358 patients (78%) who underwent acute computed tomography angiography, 717 (52.8%) had significant abnormalities. Of the 1068 supratentorial stroke patients who underwent acute perfusion computed tomography (61.3%), focal hypoperfusion was demonstrated in 786 (73.6%). CONCLUSIONS: This hospital-based prospective registry of consecutive acute ischemic strokes incorporates demographic, clinical, metabolic, acute perfusion, and arterial imaging data. It is characterized by a high proportion of minor and unknown-onset strokes, short onset-to-admission times for known-onset patients, rapidly increasing thrombolysis rates, and significant vascular and perfusion imaging abnormalities in the majority of patients.
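
Registry data of this kind lend themselves to simple tabulations such as yearly thrombolysis rates and median onset-to-admission times. The sketch below shows one way such figures might be derived with pandas; the column names and rows are invented and do not reflect ASTRAL's schema or data.

```python
# Illustrative registry tabulation (invented column names and toy rows,
# not ASTRAL's actual schema or data).
import pandas as pd

events = pd.DataFrame({
    "year": [2003, 2003, 2008, 2008, 2008],
    "thrombolysis": [False, True, True, False, True],
    "onset_known": [True, True, False, True, True],
    "onset_to_door_min": [150, 130, None, 140, 120],
})

# Proportion of events treated with thrombolysis, per year.
rate_by_year = events.groupby("year")["thrombolysis"].mean()

# Median onset-to-admission time, restricted to known-onset strokes.
median_delay = events.loc[events["onset_known"], "onset_to_door_min"].median()

print(rate_by_year)
print("median onset-to-door (known onset):", median_delay, "min")
```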

Relevance:

100.00%

Publisher:

Abstract:

Cost systems have been shown to have developed considerably in recent years, and activity-based costing (ABC) has been shown to be a contribution to cost management, particularly in service businesses. The public sector is composed to a very great extent of service functions, yet considerably less has been reported of the use of ABC to support cost management in this sector.

In Spain, cost systems are essential for city councils as they are obliged to calculate the cost of the services subject to taxation (e.g. waste collection). City councils must have a cost system in place to calculate the cost of services, as they are legally required not to profit from these services.

This paper examines the development of systems to support cost management in the Spanish Public Sector. Through semi-structured interviews with 28 subjects within one City Council, it contains a case study of cost management. The paper contains extracts from interviews, and a number of factors are identified which contribute to the successful development of the cost management system.

Following the case study, a number of other City Councils were identified where activity-based techniques had either failed or stalled. Based on the factors identified in the single case study, a further enquiry is reported. The paper includes a summary using statistical analysis which draws attention to change management, funding and political incentives as factors which had an influence on system success or failure.
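
Activity-based costing itself is mechanically simple: resource costs are traced to activities, and activity costs are assigned to services through cost-driver rates. The toy sketch below illustrates such an allocation for a waste-collection service; all figures are invented and are not drawn from the councils studied.

```python
# Toy activity-based costing allocation for a municipal service
# (all activities, drivers and figures invented for illustration).
activity_costs = {               # annual cost pools traced to activities (EUR)
    "route planning": 40_000,
    "collection rounds": 260_000,
    "vehicle maintenance": 80_000,
}
driver_volumes = {               # total cost-driver volume per activity
    "route planning": 50,          # routes planned
    "collection rounds": 5_200,    # rounds driven
    "vehicle maintenance": 400,    # workshop hours
}
# Driver consumption attributable to the waste-collection service:
service_usage = {"route planning": 50, "collection rounds": 5_200,
                 "vehicle maintenance": 360}

# Cost-driver rate per activity, then the service cost it implies.
driver_rates = {a: activity_costs[a] / driver_volumes[a] for a in activity_costs}
service_cost = sum(driver_rates[a] * service_usage[a] for a in service_usage)
print(f"ABC cost of waste collection: EUR {service_cost:,.0f}")
```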