846 results for Listener friendly


Relevance: 10.00%

Abstract:

This study addresses the relationship between university students' communication styles, their engagement with the university, and their level of psychosocial adjustment. Different communication styles are analyzed in relation to the degree of university engagement and their influence on levels of anxiety, dysthymia, alcohol consumption, and substance dependence. Data were obtained through a questionnaire administered to a representative sample of 529 university students. The results indicate gender differences in some communication patterns but not in university engagement. A statistically significant, though modest, relationship is also found between communication styles and students' capacity to engage in the university context. Both communication styles and university engagement contribute to explaining affective symptomatology, but only the contentious and friendly communication styles contribute to explaining substance use.

Relevance: 10.00%

Abstract:

East Okoboji Beach was platted on April 20, 1961 and comprises over 90.4 acres with 489 lots. The East Okoboji Beach project includes a complete storm-water discharge system, incorporating low-impact development (LID) practices and reconstruction of the roadways in East Okoboji Beach. It is the first Dickinson County project to retrofit LID practices, lake-friendly storm-water drainage systems, and roadway reconstruction throughout an existing subdivision. This cooperative project between the DNR, Dickinson County, and EOB landowners includes engineered retention ponds, rain gardens, bio-swales, and other LID practices to reduce the nutrient and sediment pollutants flowing directly into East Lake Okoboji. The problem stems from the original plat, in which small lots were platted and developed without planning for storm-water discharge and without considering the effects of filling in and developing over the many wetland areas in EOB. Its scope covers the entire 90.4 acres of East Okoboji Beach, the DNR-owned land, and the farmed land to the east: storm-water runoff flows through the watershed into East Okoboji Beach, down self-made paths, and into East Lake Okoboji, dumping nutrient and sediment pollutants directly into the lake. The expected result of this project is a new roadway and drainage system engineered to protect East Lake Okoboji and the land and homes in East Okoboji Beach; the benefit will be improved water quality and reduced siltation in East Lake Okoboji.

Relevance: 10.00%

Abstract:

Background: Oscillatory activity, which can be separated into background activity and oscillatory burst patterns, is thought to reflect local synchronies of neural assemblies. Oscillatory burst events should consequently play a specific functional role, distinct from background EEG activity, especially in cognitive tasks (e.g. working memory tasks), binding mechanisms and perceptual dynamics (e.g. visual binding), and clinical contexts (e.g. effects of brain disorders). However, extracting oscillatory events from single trials with a reliable and consistent method is not a simple task. Results: In this work we propose a user-friendly stand-alone toolbox that fits, in reasonable time, a bump time-frequency model to the wavelet representations of a set of signals. The software comes with a Matlab toolbox that can compute the wavelet representations before automatically calling the stand-alone application. Conclusion: The tool is publicly available as freeware at http://www.bsp.brain.riken.jp/bumptoolbox/toolbox_home.html
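The wavelet step that precedes the bump modeling can be sketched compactly. The following is a minimal, hypothetical illustration (not the toolbox's code) of computing single-trial oscillatory power with a complex Morlet wavelet; peaks in this power map are the burst candidates that a bump model would then fit:

```python
import numpy as np

def morlet_power(signal, fs, freq, n_cycles=7):
    """Power of `signal` at `freq` (Hz) via convolution with a complex
    Morlet wavelet; oscillatory bursts appear as peaks in this power."""
    t = np.arange(-1, 1, 1 / fs)
    s = n_cycles / (2 * np.pi * freq)            # Gaussian width in seconds
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * s**2))
    wavelet /= np.abs(wavelet).sum()             # normalize the envelope
    return np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
```

For example, a 10 Hz burst embedded in an otherwise flat trace produces a clear power peak at the burst location, which a threshold can then isolate.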

Relevance: 10.00%

Abstract:

Mixture proportioning is routinely a matter of reusing a recipe based on a previously produced concrete, rather than adjusting the proportions to the needs of the mixture and the locally available materials. As budgets grow tighter and increasing attention is paid to sustainability metrics, more effort is being focused on making mixtures that use materials more efficiently without compromising engineering performance. A performance-based mixture proportioning method is therefore needed to provide the desired concrete properties for a given project specification. Such a method should be user-friendly, easy to apply in practice, and flexible enough to allow a wide range of material selections. The objective of this study is to further develop an innovative performance-based mixture proportioning method by analyzing the relationships between selected mix characteristics and their effects on tested properties. The proposed method provides step-by-step instructions to guide the selection of the required aggregate and paste systems based on the performance requirements. Although the guidance in this report is aimed primarily at concrete pavements, the same approach can be applied to other concrete applications.
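As a toy illustration of the kind of computation a performance-based method rests on, the absolute-volume paste fraction of a mixture can be computed and checked against a performance target instead of being inherited from a recipe. The function and the specific-gravity defaults below are illustrative assumptions, not values from the report:

```python
def paste_volume_fraction(cement_kg, scm_kg, water_kg, air_frac,
                          sg_cement=3.15, sg_scm=2.2, sg_water=1.0):
    """Absolute-volume paste fraction of 1 m^3 of concrete:
    paste = cementitious materials + water + entrained air.
    Specific gravities are typical values, not project-specific ones."""
    return (cement_kg / (sg_cement * 1000)     # cement volume, m^3
            + scm_kg / (sg_scm * 1000)         # SCM (e.g. fly ash) volume
            + water_kg / (sg_water * 1000)     # water volume
            + air_frac)                        # air as a volume fraction
```

For a hypothetical mix of 300 kg cement, 60 kg fly ash, 150 kg water and 6% air, this gives a paste fraction of roughly one third, which the method would compare against the voids of the selected aggregate system.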

Relevance: 10.00%

Abstract:

A prominent categorization of Indian classical music is the Hindustani and Carnatic traditions, the two styles having evolved under distinctly different historical and cultural influences. Both styles are grounded in the melodic and rhythmic framework of raga and tala. The styles differ along dimensions such as instrumentation, aesthetics and voice production. In particular, Carnatic music is perceived as being more ornamented. The hypothesis that style distinctions are embedded in the melodic contour is validated via subjective classification tests. Melodic features representing the distinctive characteristics are extracted from the audio. Previous work based on the extent of stable pitch regions is supported by measurements of musicians' annotations of stable notes. Further, a new feature is introduced that captures the presence of specific pitch modulations characteristic of ornamentation in Indian classical music. The combined features show high classification accuracy on a database of vocal music of prominent artistes. The misclassifications are seen to match actual listener confusions.
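A crude stand-in for a pitch-modulation feature is the rate at which the melodic contour changes direction: heavily ornamented singing turns more often per second than held notes. This sketch is a hypothetical illustration, not the feature proposed in the paper:

```python
import numpy as np

def modulation_rate(pitch_cents, fs):
    """Turning points of a pitch contour per second: a rough proxy for
    how heavily ornamented a melodic line is. `pitch_cents` is a sampled
    pitch track, `fs` its frame rate in Hz. Hypothetical feature."""
    d = np.diff(pitch_cents)
    d = d[d != 0]                                   # drop flat segments
    turns = np.sum(np.sign(d[1:]) != np.sign(d[:-1]))
    return turns * fs / len(pitch_cents)            # turns per second
```

A perfectly steady note yields a rate of 0, while a 5 Hz pitch oscillation yields about 10 direction changes per second (two per cycle).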

Relevance: 10.00%

Abstract:

The concept of co-operativism is analysed in this report. As an introduction, the values that socially friendly businesses work with are presented, along with the basic differences with respect to capitalist companies. To broaden the model, the seven basic principles that drive such companies are analysed. Recently, owing to the economic crisis, many capitalist companies have changed the way they run their business and opted for the cooperative form, so the steps they must follow to become cooperatives are also considered. To introduce a more critical view, the benefits and drawbacks of cooperative companies are weighed. In addition, no theoretical model is complete without real examples, so the final part studies four companies: the first, a company that has always been part of a group of cooperatives and has enjoyed consistently positive results; the second, a company that has experienced the benefits of leaving the group; the third, the cornerstone of that group, whose efforts to relaunch the company have failed; and the last, an acquired company whose future is uncertain because of its parent company's decline. To conclude, the final section highlights the problems that cooperatives face and that may have compromised their status as alternative models to capitalism.

Relevance: 10.00%

Abstract:

The Federal Highway Administration (FHWA) mandated using the Load and Resistance Factor Design (LRFD) approach for all new bridges initiated in the United States after October 1, 2007. As a result, there has been a progressive move among state Departments of Transportation (DOTs) toward increased use of LRFD in geotechnical design practice. For these reasons, the Iowa Highway Research Board (IHRB) sponsored three research projects: TR-573, TR-583 and TR-584. The research information is summarized on the project web site (http://srg.cce.iastate.edu/lrfd/). Two of the four report volumes have been published: Volume I by Roling et al. (2010) described the development of a user-friendly electronic database (PILOT), and Volume II by Ng et al. (2011) summarized the 10 full-scale field tests conducted throughout Iowa and the data analyses. This report presents the development of regionally calibrated LRFD resistance factors for bridge pile foundations in Iowa based on reliability theory, focusing on the strength limit states and incorporating construction control aspects and soil setup into the design process. The calibration framework follows the guidelines provided by the American Association of State Highway and Transportation Officials (AASHTO), taking current local practices into consideration. Resistance factors were developed for the general and in-house static analysis methods used in the design of pile foundations, as well as for the dynamic analysis methods and dynamic formulas used for construction control.
The project yielded the following notable benefits to bridge foundation design: 1) comprehensive design tables and charts were developed to facilitate implementation of the LRFD approach, ensuring uniform reliability and consistency in the design and construction of bridge pile foundations; 2) the results showed a substantial gain in factored capacity compared with the 2008 AASHTO-LRFD recommendations; and 3) the findings contribute to existing knowledge, advancing foundation design and construction practices in Iowa and the nation.
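The reliability-theory calibration behind such resistance factors can be illustrated with a toy Monte Carlo check of a candidate factor: a smaller factor forces a larger nominal resistance and thus a lower failure probability. The lognormal models and every bias/COV number below are placeholder assumptions for illustration, not the Iowa statistics:

```python
import numpy as np

def failure_probability(phi, bias_r=1.1, cov_r=0.4, bias_q=1.05, cov_q=0.2,
                        n=200_000, seed=0):
    """Monte Carlo failure probability for the limit state g = R - Q with
    lognormal resistance R and load Q. Illustrative single-load design
    equation with gamma = 1 and nominal load Q_n = 1, so R_n = 1 / phi.
    Calibration would pick the largest phi whose reliability index
    beta = -Phi^-1(pf) still meets the target."""
    rng = np.random.default_rng(seed)

    def lognormal(mean, cov, size):
        s = np.sqrt(np.log(1 + cov**2))
        return rng.lognormal(np.log(mean) - s**2 / 2, s, size)

    R = lognormal(bias_r / phi, cov_r, n)   # actual resistance ~ bias * R_n
    Q = lognormal(bias_q, cov_q, n)         # actual load ~ bias * Q_n
    return np.mean(R <= Q)
```

Sweeping `phi` and converting each failure probability to a reliability index reproduces, in miniature, the trade-off the calibration framework formalizes.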

Relevance: 10.00%

Abstract:

Two portable Radio Frequency IDentification (RFID) systems (made by Texas Instruments and HiTAG) were developed and tested for bridge scour monitoring by the Department of Civil and Environmental Engineering at the University of Iowa (UI). Both systems consist of three similar components: 1) a passive cylindrical transponder (transmitter/responder) 2.2 cm in length; 2) a low-frequency reader (~134.2 kHz); and 3) an antenna (rectangular or hexagonal loop). The Texas Instruments system can read only one smart particle at a time, while the HiTAG system was successfully modified at UI by adding an anti-collision feature. The HiTAG system was equipped with four antennas and could simultaneously detect thousands of smart particles in close proximity. A computer code was written in C++ at UI for the HiTAG system to allow simultaneous, multiple readouts of smart particles under different flow conditions. The code, written for the Windows XP operating system, has a user-friendly Windows interface that provides detailed information on each smart particle: identification number, location (orientation in x, y, z), and the instant the particle was detected. These systems were examined within this innovative research to identify the RFID system best suited to autonomous bridge scour monitoring. A comprehensive laboratory study comprising 142 experimental runs, plus limited field testing, was performed to test the code and determine the performance of each system in terms of transponder orientation, transponder housing material, maximum antenna-transponder detection distance, minimum inter-particle distance, and antenna sweep angle. The two RFID systems' capabilities to predict scour depth were also examined using pier models.
The findings can be summarized as follows: 1) The first system (Texas Instruments) read one smart particle at a time, and its effective read range was about 3 ft (~1 m). The second system (HiTAG) had similar detection ranges but permitted the addition of an anti-collision system to facilitate the simultaneous identification of multiple smart particles (transponders placed into marbles). It was therefore concluded that the HiTAG system with the anti-collision feature (or a system with similar features) would be preferable to a single-readout system for bridge scour monitoring, as it can provide repetitive readings at multiple locations, which helps in predicting the scour-hole bathymetry along with the maximum scour depth. 2) The HiTAG system provided reliable measures of the scour depth (z-direction) and the locations of the smart particles on the x-y plane within a distance of about 3 ft (~1 m) from the four antennas. A Multiplexer HTM4-I allowed the simultaneous use of four antennas with the HiTAG system. The four hexagonal-loop antennas permitted the complete identification of the smart particles in an x, y, z orthogonal system as a function of time. The HiTAG system can also be used to measure the rate of sediment movement (in kg/s or tonnes/hr). 3) The maximum detection distance of the antenna did not change significantly for buried particles compared to particles tested in air. Thus, low-frequency RFID systems (~134.2 kHz) are appropriate for monitoring bridge scour because their waves can penetrate water and sand bodies without significant loss of signal strength. 4) The pier-model experiments in a flume with the first RFID system showed that the system was able to successfully predict the maximum scour depth when used with a single particle in the vicinity of the pier model, where the scour hole was expected.
The pier-model experiments with the second RFID system, performed in a sandbox, showed that the system was able to successfully predict the maximum scour depth when two scour balls were used in the vicinity of the pier model, where the scour hole developed. 5) The preliminary field experiments with the second RFID system at the Raccoon River, IA, near the railroad bridge (located upstream of the 360th Street bridge, near Booneville), showed that the RFID technology is transferable to the field. A practical method still needs to be developed to facilitate the placement of the smart particles within the river bed; it must be straightforward enough for Department of Transportation (DOT) and county road crews to implement at different locations. 6) Since the inception of this project, further research has shown significant progress in RFID technology, including the availability of waterproof RFID systems with passive or active transponders and detection ranges up to 60 ft (~20 m) within the water-sediment column. These systems have anti-collision capability and can support up to 8 antennas, which can significantly increase the detection range. Such systems need to be further considered and modified for automatic bridge scour monitoring, and the knowledge gained from the two systems tested here, including the software, needs to be adapted to them.

Relevance: 10.00%

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies
This thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence concerned with developing techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? Most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can solve classification, regression and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical coordinates and additional relevant spatially referenced variables ("geo-features"). They are well suited for implementation as predictive engines in decision support systems, for purposes of environmental data mining ranging from pattern recognition to modeling and prediction to automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces.
The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from the theoretical description of the concepts to their software implementation. The main algorithms and models considered are the multilayer perceptron (MLP, the workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation.
Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps detect spatial patterns describable by two-point statistics.
A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. General regression neural networks are proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; classification of soil types and hydrogeological units; decision-oriented mapping with uncertainties; and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well; the software is user-friendly and easy to use.
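The GRNN proposed here for automatic mapping is, at its core, a Gaussian-kernel weighted average of training targets (the Nadaraya-Watson form). A minimal numpy sketch, with `sigma` as the single smoothing parameter to be tuned (function name and interface are illustrative):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=1.0):
    """General Regression Neural Network prediction: each query point
    receives a Gaussian-kernel weighted average of the training targets,
    so nearby training samples dominate the estimate."""
    # squared distances between every query point and every training point
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))          # kernel weights, shape (q, n)
    return (w @ y_train) / w.sum(axis=1)        # weighted average per query
```

With coordinates as `X_train` and measured values as `y_train`, sweeping `sigma` (e.g. by cross-validation) is the entire model selection step, which is what makes GRNN attractive for automatic mapping.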

Relevance: 10.00%

Abstract:

Background: Enzymatic biodiesel is becoming an increasingly popular topic in the bioenergy literature because of its potential to overcome the problems posed by chemical processes. However, the high cost of the enzymatic process remains the main obstacle to its industrial application, mostly because of the high price of refined oils. Unfortunately, low-cost substrates such as crude soybean oil often yield a product that barely meets the final required biodiesel specifications and needs an additional pretreatment for gum removal. To reduce costs and make the enzymatic process more efficient, we developed an innovative system for enzymatic biodiesel production combining a lipase with two phospholipases. This allows the enzymatic degumming and transesterification to be performed in a single step, using crude soybean oil as feedstock and converting part of the phospholipids into biodiesel. Since the two processes have never been studied together, an accurate analysis of the different reaction components and conditions was carried out. Results: Crude soybean oil, used as a low-cost feedstock, is characterized by a high phospholipid content (900 ppm of phosphorus). However, after the combined activity of different phospholipases and the liquid lipase Callera Trans L, complete conversion into fatty acid methyl esters (FAME >95%) and a good reduction of phosphorus (P <5 ppm) were achieved. The combination of enzymes avoided the acid treatment required for gum removal, the consequent caustic neutralization, and the high temperatures commonly used in degumming systems, making the overall process more eco-friendly and higher yielding. Once the conditions were established, the process was also tested with different vegetable oils of variable phosphorus content. Conclusions: The use of the liquid lipase Callera Trans L in biodiesel production can provide numerous, sustainable benefits.
Besides reducing the costs of enzyme immobilization, the lipase can be used in combination with other enzymes such as phospholipases for gum removal, thus allowing the use of much cheaper, non-refined oils. The possibility of performing degumming and transesterification in a single tank brings a great efficiency increase to the new era of enzymatic biodiesel production at industrial scale.

Relevance: 10.00%

Abstract:

Genotypic frequencies at codominant marker loci in population samples convey information on mating systems. A classical way to extract this information is to measure heterozygote deficiencies (FIS) and obtain the selfing rate s from FIS = s/(2 - s), assuming inbreeding equilibrium. A major drawback is that heterozygote deficiencies are often present without selfing, owing largely to technical artefacts such as null alleles or partial dominance. We show here that, in the absence of gametic disequilibrium, the multilocus structure can be used to derive estimates of s independent of FIS and free of technical biases. Their statistical power and precision are comparable to those of FIS, although they are sensitive to certain types of gametic disequilibria, a bias shared with progeny-array methods but not with FIS. We analyse four real data sets spanning a range of mating systems. In two examples, we obtain s = 0 despite positive FIS, strongly suggesting that the latter are artefactual. In the remaining examples, all estimates are consistent. All the computations have been implemented in an open-access, user-friendly software package called rmes (robust multilocus estimate of selfing), available at http://ftp.cefe.cnrs.fr, which can be used on any multilocus data. By extracting the reliable information from imperfect data, our method opens the way to using the ever-growing number of published population genetic studies, in addition to the more demanding progeny-array approaches, to investigate selfing rates.
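The classical estimator mentioned above can be inverted in one line: from FIS = s/(2 - s), the selfing rate is s = 2·FIS/(1 + FIS). A minimal sketch (function name is illustrative; this is the textbook equilibrium formula, not rmes's multilocus estimator):

```python
def selfing_rate_from_fis(fis):
    """Invert FIS = s / (2 - s) to recover the selfing rate s,
    assuming inbreeding equilibrium (the classical approach)."""
    return 2 * fis / (1 + fis)
```

For example, FIS = 1/3 corresponds to a selfing rate of 0.5; the paper's point is that a positive FIS caused by null alleles would be wrongly converted to a positive s by this formula, whereas the multilocus estimator is immune to that artefact.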

Relevance: 10.00%

Abstract:

This report summarizes progress made in Phase 1 of the GIS-based Accident Location and Analysis System (GIS-ALAS) project. The GIS-ALAS project builds on several longstanding efforts by the Iowa Department of Transportation (DOT), law enforcement agencies, Iowa State University, and several other entities to create a locationally-referenced highway accident database for Iowa. Most notable of these efforts is the Iowa DOT’s development of a PC-based accident location and analysis system (PC-ALAS), a system that has been well received by users since it was introduced in 1989. With its pull-down menu structure, PC-ALAS is more portable and user-friendly than its mainframe predecessor. Users can obtain accident statistics for locations during specified time periods. Searches may be refined to identify accidents of specific types or involving drivers with certain characteristics. Output can be viewed on a computer screen, sent to a file, or printed using pre-defined formats.
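The kind of location-and-time query PC-ALAS exposes through its menus can be sketched as a simple filter over accident records; the record layout below is hypothetical, not the actual ALAS schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Accident:
    """Hypothetical accident record: route, milepost location,
    date, and accident type (stand-ins for the ALAS fields)."""
    route: str
    milepost: float
    when: date
    accident_type: str

def query(accidents, route, mp_lo, mp_hi, start, end, accident_type=None):
    """Filter accidents by route segment and time period, optionally
    refined to a specific accident type, as PC-ALAS-style searches do."""
    return [a for a in accidents
            if a.route == route and mp_lo <= a.milepost <= mp_hi
            and start <= a.when <= end
            and (accident_type is None or a.accident_type == accident_type)]
```

Output from such a filter could then be counted, summarized, or formatted for the screen, a file, or a printed report, as described above.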

Relevance: 10.00%

Abstract:

BACKGROUND: Colonic endoscopic submucosal dissection (ESD) is challenging as a result of the limited ability of conventional endoscopic instruments to achieve traction and exposure. The aim of this study was to evaluate the feasibility of colonic ESD in a porcine model using a novel endoscopic surgical platform, the Anubiscope (Karl Storz, Tuttlingen, Germany), equipped with two working channels for surgical instruments with four degrees of freedom, offering surgical triangulation. METHODS: Nine ESDs were performed with the Anubiscope by a surgeon without any ESD experience in three swine, at 25, 15, and 10 cm above the anal verge. Sixteen ESDs were performed by an experienced endoscopist in five swine using conventional endoscopic instruments. Major ESD steps included the following for both groups: scoring the area, submucosal injection of glycerol, precut, and submucosal dissection. Outcomes measured were as follows: dissection time and speed, specimen size, en bloc dissection, and complications. RESULTS: No perforations occurred in the Anubis group, while there were eight perforations (50%) in the conventional group (p = 0.02). Complete and en bloc dissections were achieved in all cases in the Anubis group. Mean dissection time for completed cases was significantly shorter in the Anubis group (32.3 ± 16.1 vs. 55.87 ± 7.66 min; p = 0.0019). Mean specimen size was larger in the conventional group (1321 ± 230 vs. 927.77 ± 229.96 mm²; p = 0.003), but mean dissection speed was similar (35.95 ± 18.93 vs. 23.98 ± 5.02 mm²/min in the Anubis and conventional groups, respectively; p = 0.1). CONCLUSIONS: Colonic ESD was feasible in pig models with the Anubiscope. This surgical endoscopic platform is promising for endoluminal surgical procedures such as ESD, as it is user-friendly, effective, and safe.

Relevance: 10.00%

Abstract:

The R package EasyStrata facilitates the evaluation and visualization of stratified genome-wide association meta-analysis (GWAMA) results. It provides (i) statistical methods to test and account for between-strata differences as a means of tackling gene-strata interaction effects and (ii) extended graphical features tailored to stratified GWAMA results. The software also provides features suitable for general GWAMAs, including functions to annotate, exclude or highlight specific loci in plots and to extract independent subsets of loci from genome-wide datasets. It is freely available and includes a user-friendly scripting interface that simplifies data handling and allows statistical and graphical functions to be combined in a flexible fashion. AVAILABILITY: EasyStrata is available for free (under the GNU General Public License v3) from our web site www.genepi-regensburg.de/easystrata and from the CRAN R package repository cran.r-project.org/web/packages/EasyStrata/. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
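In its simplest form for independent strata (e.g. men and women from separate samples), a between-strata difference test of the kind referenced in (i) is a z-test on the difference of the per-stratum effect estimates. A minimal sketch (not EasyStrata's implementation, which offers further options):

```python
import math

def strata_difference_test(beta1, se1, beta2, se2):
    """Two-sided test of a between-strata effect difference for one
    variant, assuming the two strata are independent:
        z = (beta1 - beta2) / sqrt(se1^2 + se2^2).
    Returns the z statistic and the two-sided normal p-value."""
    z = (beta1 - beta2) / math.sqrt(se1**2 + se2**2)
    p = math.erfc(abs(z) / math.sqrt(2))   # P(|Z| > |z|) for standard normal
    return z, p
```

Applied genome-wide, such p-values flag loci with potential gene-strata interaction, which can then be highlighted in the package's stratified plots.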

Relevance: 10.00%

Abstract:

In the welding industry, maintaining and improving competitiveness requires raising welding efficiency. The speed, accuracy, consistent quality and deep penetration achievable with laser welding have established the method's firm position as an efficient manufacturing process. Rising electricity and helium prices have forced industry to consider acquiring more efficient and more environmentally friendly laser sources. The fiber laser's high efficiency, good beam quality, high power and low operating costs have aroused interest in the laser welding industry. This thesis focused on the application of fiber laser welding. Its goals were to improve the understanding of the fiber laser welding process, to develop a view of how welding parameter values are chosen, and to assess whether the fiber laser is suitable for industrial production. The study sought, through basic experiments, the optimal welding parameters for producing a well-penetrated weld with few pores and good external quality, as well as an optimal weld penetration profile. Finally, the welding parameters were tested in welding a product. The fiber laser is excellently suited to welding carbon steel, and well suited to welding ultra-high-strength steels when the carbon and sulfur contents of the steel are low. It has a wide parameter window. The most common welding defects are incomplete penetration and porosity. This thesis concentrated on finding the optimal fiber laser welding parameters for one manufactured product. The effect of the fiber laser's laser and process parameters on the weld has scarcely been studied. Based on the experiments in this thesis, further research on different materials would be worthwhile regarding the effect on the weld of joint preparation, such as the oxide layer and air gap of the joint, and of the shielding gas. The fiber laser's good beam quality and other laser parameters have introduced new phenomena into the process that warrant further study.