893 results for DIFFERENT GENETIC MODELS


Relevance: 80.00%

Abstract:

In June 2009 a study was completed that had been commissioned by Knowledge Exchange and written by Professor John Houghton, Victoria University, Australia. The report on the study was titled "Open Access – What are the economic benefits? A comparison of the United Kingdom, Netherlands and Denmark." It was based on the findings of studies in which John Houghton had modelled the costs and benefits of Open Access in three countries; these studies had been undertaken in the UK by JISC, in the Netherlands by SURF and in Denmark by DEFF. In the three national studies the costs and benefits of scholarly communication were compared on the basis of three different publication models. The modelling revealed that the greatest advantage would be offered by the Open Access model, in which the research institution or the party financing the research pays for publication and the article is then freely accessible. Adopting this model could lead to annual savings of around EUR 70 million in Denmark, EUR 133 million in the Netherlands and EUR 480 million in the UK. The report concludes that the advantages would not just appear in the long term; in the transitional phase too, more open access to research results would have positive effects, and in this case the benefits would also outweigh the costs.

Relevance: 80.00%

Abstract:

This work represents ongoing efforts to study high-enthalpy carbon dioxide flows in anticipation of the upcoming Mars Science Laboratory (MSL) and future missions to the red planet. The work is motivated by observed anomalies between experimental and numerical studies in hypervelocity impulse facilities for high-enthalpy carbon dioxide flows. In this work, experiments are conducted in the Hypervelocity Expansion Tube (HET) which, by virtue of its flow acceleration process, exhibits minimal freestream dissociation in comparison to reflected shock tunnels. This simplifies the comparison with computational results, as freestream dissociation and considerable thermochemical excitation can be neglected. Shock shapes of the MSL aeroshell and spherical geometries are compared with numerical simulations incorporating detailed CO2 thermochemical modeling. The shock stand-off distance has been identified in the past as sensitive to the thermochemical state and, as such, is used here as an experimental measurable for comparison with CFD and two different theoretical models. It is seen that models based upon binary scaling assumptions are not applicable for the low-density, small-scale conditions of the current work. Mars Science Laboratory shock shapes at zero angle of attack are also in good agreement with available data from the LENS X expansion tunnel facility, confirming that results are facility-independent for the same type of flow acceleration and indicating that the flow velocity is a suitable first-order matching parameter for comparative testing. In an effort to address surface chemistry issues arising from high-enthalpy carbon dioxide ground-test experiments, spherical stagnation point and aeroshell heat transfer distributions are also compared with simulation. Very good agreement between experiment and CFD is seen for all shock shapes, and heat transfer distributions fall within the non-catalytic and super-catalytic solutions. We also examine spatial temperature profiles in the non-equilibrium relaxation region behind a stationary shock wave in a hypervelocity Mach 7.42 air freestream. The normal shock wave is established through a Mach reflection from an opposing wedge arrangement. Schlieren images confirm that the shock configuration is steady and its location repeatable. Emission spectroscopy is used to identify dissociated species and to make vibrational temperature measurements using both the nitric oxide and the hydroxyl radical A-X band sequences. Temperature measurements are presented at selected locations behind the normal shock. LIFBASE is used as the simulation spectrum software for OH temperature fitting; however, the need to access higher vibrational and rotational levels for NO leads to the use of an in-house developed algorithm. For NO, the results demonstrate the contribution of higher vibrational and rotational levels to the spectra at the conditions of this study. Very good agreement is achieved between the experimentally measured NO vibrational temperatures and calculations performed using an existing state-resolved, three-dimensional forced harmonic oscillator thermochemical model. The measured NO A-X vibrational temperatures are significantly higher than the OH A-X temperatures.
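As a rough illustration of how shock stand-off distance scales with freestream conditions, the sketch below evaluates Billig's classic perfect-gas correlation for spheres; this is not one of the theoretical models compared in the thesis, and the Mach numbers are arbitrary.

```python
# Hedged illustration, not a thesis model: Billig's perfect-gas correlation for the
# shock stand-off distance of a sphere, delta/R ~= 0.143 * exp(3.24 / M^2). It only
# shows how stand-off shrinks as Mach number grows; high-enthalpy CO2 flows require
# the detailed thermochemical modeling described in the abstract.
import math

def standoff_over_radius_sphere(mach):
    """Approximate shock stand-off distance normalized by sphere radius (perfect gas)."""
    return 0.143 * math.exp(3.24 / mach**2)

for M in (4.0, 6.0, 8.0):
    print(f"M = {M}: delta/R ~ {standoff_over_radius_sphere(M):.3f}")
```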

Relevance: 80.00%

Abstract:

The use of digital image processing techniques is prominent in medical settings for the automatic diagnosis of diseases. Glaucoma is the second leading cause of blindness in the world and it has no cure. Currently, there are treatments to prevent vision loss, but the disease must be detected in its early stages. Thus, the objective of this work is to develop an automatic method for detecting glaucoma in retinal images. The methodology used in the study was: acquisition of an image database, optic disc segmentation, texture feature extraction in different color models, and classification of the images as glaucomatous or not. We obtained an accuracy of 93%.
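A minimal sketch of this kind of pipeline, assuming scikit-image and scikit-learn are available; the per-channel statistics and the random-forest classifier are illustrative stand-ins, not the texture features or classifier actually used in the work.

```python
# Hypothetical sketch: simple per-channel statistics in two color models (RGB and HSV)
# on segmented optic-disc crops, followed by a generic classifier. Not the authors' pipeline.
import numpy as np
from skimage.color import rgb2hsv
from sklearn.ensemble import RandomForestClassifier

def channel_stats(img_rgb):
    """Crude texture-style descriptors per channel in RGB and HSV."""
    feats = []
    for space in (img_rgb, rgb2hsv(img_rgb)):
        for c in range(3):
            ch = space[..., c].ravel()
            feats += [ch.mean(), ch.std(), np.percentile(ch, 90)]
    return np.array(feats)

def train(disc_crops, labels):
    """disc_crops: list of (H, W, 3) float images in [0, 1]; labels: 1 = glaucomatous."""
    X = np.stack([channel_stats(im) for im in disc_crops])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    return clf.fit(X, labels)

# toy usage with placeholder crops
X_imgs = [np.random.rand(64, 64, 3) for _ in range(10)]
model = train(X_imgs, [0, 1] * 5)
```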

Relevance: 80.00%

Abstract:

Edge-labeled graphs have proliferated rapidly over the last decade due to the increased popularity of social networks and the Semantic Web. In social networks, relationships between people are represented by edges and each edge is labeled with a semantic annotation. Hence, a huge single graph can express many different relationships between entities. The Semantic Web represents each single fragment of knowledge as a triple (subject, predicate, object), which is conceptually identical to an edge from subject to object labeled with the predicate. A set of triples constitutes an edge-labeled graph on which knowledge inference is performed. Subgraph matching has been extensively used as a query language for patterns in the context of edge-labeled graphs. For example, in social networks, users can specify a subgraph matching query to find all people that have certain neighborhood relationships. Heavily used fragments of the SPARQL query language for the Semantic Web and the graph queries of other graph DBMSs can also be viewed as subgraph matching over large graphs. Though subgraph matching has been extensively studied as a query paradigm in the Semantic Web and in social networks, a user can get a large number of answers in response to a query. These answers can be shown to the user in accordance with an importance ranking. In this thesis proposal, we present four different scoring models along with scalable algorithms to find the top-k answers via a suite of intelligent pruning techniques. The suggested models cover a practically important subset of the SPARQL query language augmented with some additional useful features. The first model, called Substitution Importance Query (SIQ), identifies the top-k answers whose scores are calculated from the matched vertices' properties in each answer, in accordance with a user-specified notion of importance. The second model, called Vertex Importance Query (VIQ), identifies important vertices in accordance with a user-defined scoring method that builds on top of various subgraphs articulated by the user. Approximate Importance Query (AIQ), our third model, allows partial and inexact matchings and returns the top-k of them according to user-specified approximation terms and scoring functions. In the fourth model, called Probabilistic Importance Query (PIQ), a query consists of several sub-blocks: one mandatory block that must be mapped and other blocks that can be opportunistically mapped. The probability is calculated from various aspects of the answers, such as the number of mapped blocks and the vertices' properties in each block, and the top-k most probable answers are returned. An important distinguishing feature of our work is that we allow the user a huge amount of freedom in specifying: (i) which patterns and approximations they consider important, (ii) how to score answers - irrespective of whether they are vertices or substitutions, and (iii) how to combine and aggregate scores generated by multiple patterns and/or multiple substitutions. Because so much power is given to the user, indexing is more challenging than in situations where additional restrictions are imposed on the queries the user can ask. The proposed algorithms for the first model can also be used for answering SPARQL queries with ORDER BY and LIMIT, and the method for the second model also works for SPARQL queries with GROUP BY, ORDER BY and LIMIT. We test our algorithms on multiple real-world graph databases, showing that our algorithms are far more efficient than popular triple stores.
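The following sketch only illustrates the flavor of SIQ-style top-k ranking; candidate_matches, vertex_props and importance are hypothetical names, and the thesis algorithms additionally rely on indexing and pruning, which are omitted here.

```python
# Hypothetical illustration of SIQ-style ranking: given candidate matches (substitutions
# of query variables to graph vertices), score each answer from the matched vertices'
# properties and keep only the k best. Not the thesis algorithms.
import heapq

def top_k_answers(candidate_matches, vertex_props, importance, k=10):
    """candidate_matches: iterable of dicts {query_var: graph_vertex};
    vertex_props: {graph_vertex: {prop: value}};
    importance: function mapping a property dict to a numeric score."""
    heap = []  # min-heap of (score, index, answer) keeping the k highest scores
    for i, match in enumerate(candidate_matches):
        score = sum(importance(vertex_props[v]) for v in match.values())
        if len(heap) < k:
            heapq.heappush(heap, (score, i, match))
        elif score > heap[0][0]:
            heapq.heapreplace(heap, (score, i, match))
    return [m for _, _, m in sorted(heap, reverse=True)]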

Relevance: 80.00%

Abstract:

Drilling fluids present thixotropic behavior and usually gel when at rest. The sol-gel transition is fundamental to prevent the deposition of rock fragments, generated by drilling the well, over the drill bit during occasional stops. Under those conditions, high pressures are then required in order to break up the gel when circulation is resumed. Moreover, very high pressures can damage the rock formation at the bottom of the well. Thus, a better understanding of thixotropy and of the behavior of thixotropic materials becomes increasingly important for process control. The mechanisms that control thixotropy are not yet well defined and modeling is still a challenge. The objective of this work is to develop a mathematical model to study pressure transmission in drilling fluids. This work presents a review of thixotropy and of different mathematical models found in the literature that are used to predict this characteristic, as well as a review of transient flows of compressible fluids. The problem is modeled as the flow through the drillpipe and the annular region (the space between the well wall and the external part of the drillpipe). The equations that describe the problem (mass conservation, momentum balance, constitutive and state equations) are then discretized and numerically solved using a computational algorithm written in Fortran. The model is validated against experimental and numerical data obtained from the literature. Comparisons between experimental data obtained from Petrobras and values calculated by three viscoplastic models and one pseudoplastic model are conducted. The viscoplastic fluids, due to the yield stress, do not fully transmit the pressure to the outlet of the annular space. Sensitivity analyses are then conducted in order to evaluate the thixotropic effect on pressure transmission.
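A back-of-the-envelope sketch of why a yield stress limits pressure transmission, assuming a simple force balance on a gelled column in a uniform pipe; this is not the compressible, thixotropic model developed in the work, and the numbers are invented.

```python
# Hedged sketch: for a fluid with yield stress tau_y at rest in a pipe section, flow
# restarts only once the applied pressure difference makes the wall shear stress exceed
# tau_y, i.e. dP > 4 * tau_y * L / D (from dP * pi D^2 / 4 = tau_w * pi D L).
def min_restart_pressure(tau_y, length, diameter):
    """Minimum pressure difference [Pa] needed to break a gelled column in a pipe."""
    return 4.0 * tau_y * length / diameter

# Example: 10 Pa gel strength over a 2000 m pipe of 0.1 m inner diameter
print(min_restart_pressure(10.0, 2000.0, 0.1))  # 800000 Pa, i.e. about 8 bar
```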

Relevance: 80.00%

Abstract:

Heterogeneous computing systems have become common in modern processor architectures. These systems, such as those released by AMD, Intel, and Nvidia, include both CPU and GPU cores on a single die, with reduced communication overhead compared to their discrete predecessors. Discrete CPU/GPU systems are currently limited, requiring large, regular, highly parallel workloads to overcome the communication costs of the system. Without the traditional communication delay assumed between GPUs and CPUs, we believe non-traditional workloads could be targeted for GPU execution. Specifically, this thesis focuses on the execution model of nested parallel workloads on heterogeneous systems. We have designed a simulation flow which utilizes widely used CPU and GPU simulators to model heterogeneous computing architectures. We then applied this simulator to non-traditional GPU workloads using different execution models. We also propose a new execution model for nested parallelism, allowing users to exploit these heterogeneous systems to reduce execution time.
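A toy sketch of what a nested parallel workload looks like: coarse outer tasks each launching finer-grained inner work. In the thesis the inner level would map to GPU kernels; here plain Python thread pools are used purely for illustration, and the data are invented.

```python
# Hypothetical nested parallel workload: each outer task spawns further data-parallel
# work. On discrete GPUs every inner launch would pay a host-device transfer; on a
# fused CPU/GPU die that overhead shrinks, which is what motivates the thesis.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def inner_parallel(row):
    # fine-grained data-parallel work (a GPU kernel in the heterogeneous setting)
    return np.sqrt(row).sum()

def outer_task(block):
    # coarse-grained task that launches many inner work items
    with ThreadPoolExecutor(max_workers=4) as inner_pool:
        return sum(inner_pool.map(inner_parallel, block))

blocks = [np.random.rand(8, 200_000) for _ in range(4)]
with ThreadPoolExecutor(max_workers=4) as outer_pool:
    print(list(outer_pool.map(outer_task, blocks)))
```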

Relevance: 80.00%

Abstract:

This research develops four case studies on small-scale fisheries in Central America located within indigenous territories: the Ngöbe-Buglé Conte Burica territory in the south of Costa Rica, the Garífuna territory in Nueva Armenia, Honduras, the Rama territory in Nicaragua, and the Ngöbe-Buglé territory in Bocas del Toro, Panamá. This is one of the first studies focusing on indigenous territories, artisanal fisheries and the SSF guidelines. The cases are a first approach to discussing and analyzing relevant social and human rights issues related to the conservation of marine resources and fisheries management in these territories. Among other issues of interest, the cases discuss the relationships between marine protected areas under different governance models and issues related to the strengthening of the small-scale fisheries of these indigenous populations and marine fishing territories. They highlight sustainability, governance, land tenure and access to fishing resources, gender, the importance of traditional knowledge, and new challenges such as climate change.

Relevance: 80.00%

Abstract:

The research aims to answer a fundamental question: which of the disability models currently in use is optimal for creating "accessible tourism-oriented" amenities? It also addresses more detailed problems: (1) what is disability and what determines the different disability models? (2) what types of tourism market supply available for the disabled do the different disability models suggest? (3) are the disability models complementary or mutually exclusive? (4) is the idea of social integration and inclusion of people with disabilities (PWD) on tourist trips supported by society? Data sources comprise selected literature and the results of a survey conducted using the face-to-face method and the SurveyMonkey website from May 2013 to July 2014. The surveyed group included 619 people (82% were Polish, the other 18% were foreigners from Russia, Germany, Portugal, Slovakia, Canada, Tunisia and the United Kingdom). The research showed that the different disability models – medical, social, geographical and economic – are useful when creating the tourism supply for PWD. Using the research results, the authors suggest a model of "diversification of the tourism market supply structure available for the disabled", which includes different types of supply – from specialist to universal. This model has practical use and can help entrepreneurs with the segmentation of tourism offers addressed to PWD. The work is innovative, both in its theoretical approach (the review of disability models and their practical application in creating tourism supply) and in its empirical value – it provides current data on social attitudes towards the development of PWD tourism. The presentation of a wide range of perceptions of disability, as well as the simple classification of tourism supply that meets the varied needs of PWD, is a particular novelty of this chapter.

Relevance: 80.00%

Abstract:

Datacenters have emerged as the dominant form of computing infrastructure over the last two decades. The tremendous increase in data analysis requirements has led to a proportional increase in power consumption, and datacenters are now one of the fastest growing electricity consumers in the United States. Another rising concern is the loss of throughput due to network congestion. Scheduling models that do not explicitly account for data placement may lead to the transfer of large amounts of data over the network, causing unacceptable delays. In this dissertation, we study different scheduling models that are inspired by the dual objectives of minimizing energy costs and network congestion in a datacenter. As datacenters are equipped to handle peak workloads, the average server utilization in most datacenters is very low; as a result, one can achieve huge energy savings by selectively shutting down machines when demand is low. In this dissertation, we introduce the network-aware machine activation problem to find a schedule that simultaneously minimizes the number of machines necessary and the congestion incurred in the network. Our model significantly generalizes well-studied combinatorial optimization problems such as hard-capacitated hypergraph covering and is thus strongly NP-hard; as a result, we focus on finding good approximation algorithms. Data-parallel computation frameworks such as MapReduce have popularized the design of applications that require a large amount of communication between different machines. Efficient scheduling of these communication demands is essential to guarantee efficient execution of the different applications. In the second part of the thesis, we study the approximability of the co-flow scheduling problem that was recently introduced to capture these application-level demands. Finally, we also study the question, "In what order should one process jobs?" Often, precedence constraints specify a partial order over the set of jobs and the objective is to find suitable schedules that satisfy the partial order. However, in the presence of hard deadline constraints, it may be impossible to find a schedule that satisfies all precedence constraints. In this thesis we formalize different variants of job scheduling with soft precedence constraints and conduct the first systematic study of these problems.
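As a greatly simplified illustration of machine activation, the sketch below greedily switches on the fewest machines whose capacities cover an aggregate demand; the network-congestion term that makes the thesis problem a hard-capacitated covering generalization is deliberately ignored, and the function and data are invented.

```python
# Hypothetical, heavily simplified machine activation: pick the fewest machines whose
# capacities cover the total job demand. The thesis problem additionally minimizes
# network congestion and is strongly NP-hard; this greedy is only an illustration.
def activate_machines(capacities, demand):
    """capacities: {machine_id: capacity}; returns the set of machines to switch on."""
    active, covered = set(), 0.0
    for machine, cap in sorted(capacities.items(), key=lambda kv: -kv[1]):
        if covered >= demand:
            break
        active.add(machine)
        covered += cap
    return active

print(activate_machines({"m1": 8, "m2": 4, "m3": 4, "m4": 2}, demand=10))  # m1 and m2
```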

Relevance: 80.00%

Abstract:

Due to their unique physicochemical properties, including superparamagnetism, iron oxide nanoparticles (ION) have a number of interesting applications, especially in the biomedical field, that make them one of the most fascinating nanomaterials. They are used as contrast agents for magnetic resonance imaging, in targeted drug delivery, and for induced hyperthermia cancer treatments. Together with these valuable uses, concerns regarding the onset of unexpected adverse health effects following exposure have also been raised. Nevertheless, despite the numerous ION purposes being explored, currently available information on their potential toxicity is still scarce and controversial data have been reported. Although ION have traditionally been considered biocompatible - mainly on the basis of viability test results - the influence of nanoparticle surface coating, size, or dose, and of other experimental factors such as treatment time or cell type, has been shown to be important for the manifestation of ION toxicity in vitro. In vivo studies have shown distribution of ION to different tissues and organs, including the brain after passing the blood-brain barrier; nevertheless, results from acute toxicity, genotoxicity, immunotoxicity, neurotoxicity and reproductive toxicity investigations in different animal models do not yet provide a clear overview of ION safety, and epidemiological studies are almost nonexistent. Much work still has to be done to fully understand how these nanomaterials interact with cellular systems and what, if any, potential adverse health consequences can derive from ION exposure.

Relevance: 80.00%

Abstract:

This thesis develops and tests various transient and steady-state computational models such as direct numerical simulation (DNS), large eddy simulation (LES), filtered unsteady Reynolds-averaged Navier-Stokes (URANS) and steady Reynolds-averaged Navier-Stokes (RANS), with and without magnetic field, to investigate turbulent flows in canonical geometries as well as in the nozzle and mold geometries of the continuous casting process. The direct numerical simulations are first performed in channel, square and 2:1 aspect ratio rectangular ducts to investigate the effect of the magnetic field on turbulent flows. The rectangular duct is a more practical geometry for the continuous casting nozzle and mold and offers the option of applying the magnetic field either perpendicular to the broader side or to the shorter side. This work forms part of the development of a graphics processing unit (GPU) based CFD code (CU-FLOW) for magnetohydrodynamic (MHD) turbulent flows. The DNS results revealed interesting effects of the magnetic field and its orientation on primary and secondary flows (instantaneous and mean), Reynolds stresses, turbulent kinetic energy (TKE) budgets, momentum budgets and frictional losses, besides providing a DNS database for two-wall-bounded square and rectangular duct MHD turbulent flows. Further, low- and high-Reynolds-number RANS models (k-ε and Reynolds stress models) are developed and tested against the DNS databases for channel and square duct flows with and without magnetic field. The MHD sink terms in the k- and ε-equations are implemented as proposed by Kenjereš and Hanjalić using a user-defined function (UDF) in FLUENT. This work revealed varying accuracies of the different RANS models at different levels and is useful for industry, including continuous casting, to understand the accuracy of these models. After assessing the accuracy and computational cost of the RANS models, the steady-state k-ε model is then combined with particle image velocimetry (PIV) and impeller-probe velocity measurements in a 1/3rd-scale water model to study the flow quality coming out of well- and mountain-bottom nozzles and the effect of stopper-rod misalignment on the fluid flow. The mountain-bottom nozzle was found to be more prone to long-time asymmetries and higher surface velocities. Left misalignment of the stopper gave a higher surface velocity on the right, leading to a significantly larger number of vortices forming behind the nozzle on the left. Later, transient and steady-state models such as LES, filtered URANS and steady RANS are combined with ultrasonic Doppler velocimetry (UDV) measurements in a GaInSn model of a typical continuous casting process. LES-CU-FLOW is the fastest and the most accurate model, owing to its much finer mesh and smaller timestep. This work provided a good understanding of the performance of these models. The behavior of the instantaneous flows, the Reynolds stresses and proper orthogonal decomposition (POD) analysis quantified the nozzle bottom swirl and its importance for the turbulent flow in the mold. Afterwards, the aforementioned work in the GaInSn model is extended with electromagnetic braking (EMBr) to help optimize a ruler-type brake and its location for the continuous casting process. The magnetic field suppressed turbulence and promoted vortical structures with their axes aligned with the magnetic field, suggesting a tendency towards two-dimensional turbulence. The stronger magnetic field at the nozzle well and around the jet region created large-scale, lower-frequency flow behavior by suppressing the nozzle bottom swirl and its front-back alternation. Based on this work, it is advised to avoid a stronger magnetic field around the jet and nozzle bottom to obtain a more stable and less defect-prone flow.
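A minimal sketch of snapshot POD via the singular value decomposition, assuming NumPy; it is only meant to illustrate the decomposition mentioned above, not the thesis post-processing, and the snapshot data here are random placeholders.

```python
# Hedged sketch of snapshot POD: stack velocity snapshots as columns, subtract the mean,
# and take an SVD. The left singular vectors are the spatial POD modes and the squared
# singular values rank their share of the fluctuation energy.
import numpy as np

def pod_modes(snapshots, n_modes=5):
    """snapshots: array (n_points, n_snapshots) of one velocity component."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    energy = s**2 / np.sum(s**2)   # fraction of fluctuation energy per mode
    return U[:, :n_modes], energy[:n_modes]

modes, energy = pod_modes(np.random.rand(1000, 200))
print(energy)
```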

Relevance: 80.00%

Abstract:

Blogging is one of the most common forms of social media today. Blogs have become a powerful medium and bloggers are established stakeholders for marketers. Commercialization of the blogosphere has enabled an increasing number of bloggers to professionalize and blog as a full-time occupation. The purpose of this study is to understand the professionalization process of a blogger, from amateur blogger to professional actor. The following sub-questions were used to further elaborate the topic: What have been the meaningful events and developments fostering professionalization? What are the prerequisites for popularity in blogging? Are there any key success factors to acknowledge in order to be able to make a business out of a blog? The theoretical framework of this study was formed on the basis of the two chosen focus areas for professionalization: social drivers and business drivers. The framework draws on literature from the fields of marketing and social sciences, as well as previous research on social media, blogging and professionalization. The study is a qualitative case study and the research data were collected in a semi-structured interview. The case chosen for this study is a lifestyle blog whose writer has been able to develop her blog into a full-time professional occupation. Based on the results, the professionalization process of a blogger is not a defined process, but is instead comprised of coincidental events as well as considered advancements. Success in blogging is based on the blogger's own motivation and passion for writing and expressing oneself in the form of a blog, rather than on the systematic construction of a successful career in blogging. Networking with other bloggers as well as with affiliates was seen as an important success factor. Popularity in the blogosphere and a high number of followers enable professionalization, as marketers actively seek to collaborate with popular bloggers with strong personal brands. Bloggers with strong personal brands are especially attractive due to their opinion leadership in their reference group. A blogger can act professionally either as an entrepreneur or by blogging for a commercial webpage. According to the results of this study, it is beneficial for the blogger's professional development as well as career progress to work under different operating models.

Relevance: 80.00%

Abstract:

This study investigates the structure and intensity of the surface pathways connecting to and from the central areas of the large-scale convergence regions of the eastern Pacific Ocean. Surface waters are traced with numerical Lagrangian particles transported in the velocity field of three different ocean models with horizontal resolutions that range from 1/4° to 1/32°. The connections resulting from the large-scale convergent Ekman dynamics agree qualitatively but are strongly modulated by eddy variability that introduces meridional asymmetry in the amplitude of transport. Lagrangian forward-in-time integrations are used to analyze the fate of particles originating from the central regions of the convergence zones and highlight specific outflows not yet reported for the southeastern Pacific when using the currents at the highest resolutions (1/12° and 1/32°). The meridional scales of these outflows are comparable to the characteristic width of the fine-scale striation of mean currents.
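A minimal sketch of forward-in-time Lagrangian particle advection with a second-order Runge-Kutta step; the analytic velocity field here is an invented stand-in for the interpolated model velocities actually used in the study.

```python
# Hedged sketch: advect particles forward in time in a prescribed 2-D velocity field
# with an RK2 (midpoint) step. The study instead uses ocean-model velocity fields at
# up to 1/32 degree resolution and far more particles.
import numpy as np

def velocity(x, y, t):
    # stand-in analytic velocity field [m/s]; replace with interpolated model output
    return -0.1 * y, 0.1 * x

def advect(x, y, t0, dt, n_steps):
    for step in range(n_steps):
        t = t0 + step * dt
        u1, v1 = velocity(x, y, t)
        u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t + 0.5 * dt)  # midpoint
        x, y = x + dt * u2, y + dt * v2
    return x, y

x0 = np.linspace(-1e5, 1e5, 5)   # initial particle positions [m]
y0 = np.zeros_like(x0)
print(advect(x0, y0, t0=0.0, dt=3600.0, n_steps=24))
```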

Relevance: 80.00%

Abstract:

This article presents a new identification method for non-minimum-phase systems based on the step response. The proposed approach provides an approximate second-order model while avoiding complex experimental designs. The method is a closed-form identification algorithm based on characteristic points of the step response of second-order non-minimum-phase systems. It is validated using different linear models, whose inverse response ranges between 3.5% and 38% of the steady-state response. Simulations show that satisfactory results can be obtained with the proposed identification procedure, the identified parameters exhibiting lower mean relative errors than those obtained with Balaguer's method.
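As a hedged illustration of the kind of system the method targets (not the identification algorithm itself), the sketch below simulates the step response of a second-order non-minimum-phase transfer function with an assumed right-half-plane zero, whose characteristic undershoot is what a characteristic-point approach exploits; all parameter values are invented.

```python
# Hedged illustration: step response of a non-minimum-phase second-order model
# G(s) = K (1 - a s) / ((tau1 s + 1)(tau2 s + 1)); the RHP zero produces the initial
# inverse (undershoot) response mentioned in the abstract.
import numpy as np
from scipy import signal

K, a, tau1, tau2 = 1.0, 0.5, 2.0, 1.0                 # illustrative parameters
num = [-K * a, K]                                     # K (1 - a s)
den = np.polymul([tau1, 1.0], [tau2, 1.0])            # (tau1 s + 1)(tau2 s + 1)
t, y = signal.step(signal.TransferFunction(num, den), T=np.linspace(0, 20, 500))
print(f"undershoot: {y.min():.3f}, steady state: {y[-1]:.3f}")
```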

Relevance: 80.00%

Abstract:

Preeclampsia is a multifactorial disease of unknown etiology that presents with a wide range of clinical symptoms, from mild preeclampsia to severe forms such as eclampsia and HELLP syndrome. As a complex disease, preeclampsia is also influenced by genetic and environmental factors. Aiming to identify preeclampsia susceptibility genes, we genotyped a total of 22 genetic markers (single nucleotide polymorphisms, SNPs) distributed across six candidate genes (ACVR2A, FLT1, ERAP1, ERAP2, LNPEP and CRHBP). Using a case-control approach, the genotypic frequencies were compared between normotensive women (control group) and preeclamptic women. The case group was classified according to the clinical form of the disease into preeclampsia, eclampsia and HELLP syndrome. We found the following genetic associations: 1) ACVR2A and preeclampsia; 2) FLT1 and severe preeclampsia; 3) ERAP1 and eclampsia; 4) FLT1 and HELLP syndrome. When the preeclampsia group was stratified according to symptom severity (mild and severe preeclampsia) or according to time of onset (early and late preeclampsia), early-onset preeclampsia was found to be strongly associated with risk. Preeclampsia, eclampsia and HELLP syndrome appear to have different genetic bases, although the FLT1 gene seems to be involved in the pathophysiology of both preeclampsia and HELLP syndrome.
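A minimal sketch of a single-SNP case-control comparison of genotypic frequencies using a chi-square test of independence; the genotype counts below are invented for illustration, and the study's actual analysis covers 22 markers across six genes.

```python
# Hedged sketch: compare genotype counts between cases (preeclampsia) and controls
# (normotensive) for one hypothetical SNP with a chi-square test of independence.
from scipy.stats import chi2_contingency

#               genotype:  AA   Aa   aa
counts = [[120, 200,  80],   # cases (preeclamptic women)
          [160, 190,  50]]   # controls (normotensive women)

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```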