851 results for quick-EXAFS


Relevance:

10.00%

Publisher:

Abstract:

Identification of Fusarium species has always been difficult due to confusing phenotypic classification systems. We have developed a fluorescent-based polymerase chain reaction assay that allows for rapid and reliable identification of five toxigenic and pathogenic Fusarium species: Fusarium avenaceum, F. culmorum, F. equiseti, F. oxysporum and F. sambucinum. The method is based on the PCR amplification of species-specific DNA fragments using fluorescent oligonucleotide primers, which were designed based on sequence divergence within the internal transcribed spacer region of nuclear ribosomal DNA. Besides providing an accurate, reliable and quick diagnosis of these Fusaria, another advantage of this method is that it reduces the potential for exposure to carcinogenic chemicals, as it substitutes fluorescent dyes for ethidium bromide. Apart from its multidisciplinary importance and usefulness, it also obviates the need for gel electrophoresis. (C) 2002 Published by Elsevier Science B.V. on behalf of the Federation of European Microbiological Societies.

Relevance:

10.00%

Publisher:

Abstract:

It has become evident that the mystery of life will not be deciphered just by decoding its blueprint, the genetic code. In the life and biomedical sciences, research efforts are now shifting from pure gene analysis to the analysis of all biomolecules involved in the machinery of life. One of these postgenomic research fields is proteomics. Although proteomics, which basically encompasses the analysis of proteins, is not a new concept, it is far from being a research field that can rely on routine and large-scale analyses. When the term proteomics was coined, a gold-rush mentality was created, promising vast and quick riches (i.e., solutions to the immensely complex questions of life and disease). Predictably, the reality has been quite different. The complexity of proteomes and the wide variations in the abundances and chemical properties of their constituents have rendered systematic analytical approaches only partially successful, and biologically meaningful results have been slow to arrive. However, to learn more about how cells, and hence life, work, it is essential to understand the proteins and their complex interactions in their native environment. This is why proteomics will be an important part of the biomedical sciences for the foreseeable future. Therefore, any advances in providing tools that make protein analysis a more routine and large-scale business, ideally using automated and rapid analytical procedures, are highly sought after. This review provides some basics, thoughts and ideas on the exploitation of matrix-assisted laser desorption/ionization in biological mass spectrometry - one of the most commonly used analytical tools in proteomics - for high-throughput analyses.

Relevance:

10.00%

Publisher:

Abstract:

This paper discusses experimental and theoretical investigations and Computational Fluid Dynamics (CFD) modelling considerations to evaluate the performance of a square-section wind catcher system connected to the top of a test room for the purpose of natural ventilation. The magnitude and distribution of pressure coefficients (C-p) around the wind catcher and the air flow into the test room were analysed. The modelling results indicated that air was supplied into the test room through the wind catcher quadrants with positive external pressure coefficients and extracted out of the test room through the quadrants with negative pressure coefficients. The air flow achieved through the wind catcher depends on the speed and direction of the wind. The results obtained using the explicit and AIDA implicit calculation procedures and the CFX code correlate relatively well with the experimental results at lower wind speeds and with wind incident at an angle of 0 degrees. Variations in the C-p and air flow results were observed, particularly at a wind direction of 45 degrees. The explicit and implicit calculation procedures were found to be quick and easy to use in obtaining results, whereas the wind tunnel tests were more expensive in terms of effort, cost and time. CFD codes are developing rapidly and are widely available, especially with the decreasing prices of computer hardware. However, results obtained using CFD codes must be considered with care, particularly in the absence of empirical data.
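
To make the pressure-coefficient mechanism above concrete, the following sketch illustrates the kind of explicit envelope-flow estimate referred to in the abstract: the wind-induced pressure difference across each wind-catcher quadrant is taken from its pressure coefficient and converted to a volume flow with a simple orifice equation. This is a minimal illustration in Python, not the AIDA procedure or the authors' model; the air density, discharge coefficient, opening area, wind speed, C-p values and internal pressure are all assumed purely for demonstration.

```python
import math

RHO = 1.2  # air density in kg/m^3 (assumed)

def quadrant_flow(cp_ext, cp_int, wind_speed, area, cd=0.61):
    """Volume flow (m^3/s) through one wind-catcher quadrant.

    The driving pressure is dP = 0.5 * rho * U^2 * (Cp_ext - Cp_int);
    positive flow means supply into the room, negative means extraction.
    """
    dp = 0.5 * RHO * wind_speed**2 * (cp_ext - cp_int)
    q = cd * area * math.sqrt(2.0 * abs(dp) / RHO)
    return math.copysign(q, dp)

# Illustrative external pressure coefficients for the four quadrants at
# one wind direction (assumed values, not measured data).
cp_quadrants = [0.7, -0.3, -0.4, -0.3]
cp_internal = 0.0   # internal pressure coefficient (assumed, held fixed)
u_ref = 3.0         # reference wind speed in m/s (assumed)
a_quadrant = 0.25   # free area of each quadrant in m^2 (assumed)

flows = [quadrant_flow(cp, cp_internal, u_ref, a_quadrant) for cp in cp_quadrants]
print("Quadrant flows (m^3/s):", [round(q, 3) for q in flows])
print("Net imbalance (m^3/s): ", round(sum(flows), 3))
```

In a full explicit calculation the internal pressure would be iterated until supply and extraction through the quadrants balance; it is held fixed here only to show how quadrants with positive external pressure coefficients supply air while those with negative coefficients extract it, mirroring the behaviour described above.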

Relevance:

10.00%

Publisher:

Abstract:

Eye gaze is an important conversational resource that until now could only be supported across a distance if people were rooted to the spot. We introduce EyeCVE, the world's first tele-presence system that allows people in different physical locations not only to see what each other is doing but to follow each other's eyes, even when walking about. Projected into each space are avatar representations of remote participants that reproduce not only body, head and hand movements, but also those of the eyes. Spatial and temporal alignment of remote spaces allows the focus of gaze, as well as activity and gesture, to be used as a resource for non-verbal communication. The temporal challenge met was to reproduce eye movements quickly enough and often enough to interpret their focus during a multi-way interaction, along with communicating other verbal and non-verbal language. The spatial challenge met was to maintain communicational eye gaze while allowing free movement of participants within a virtually shared common frame of reference. This paper reports on the technical and especially the temporal characteristics of the system.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we consider hybrid (fast stochastic approximation and deterministic refinement) algorithms for Matrix Inversion (MI) and Solving Systems of Linear Equations (SLAE). Monte Carlo methods are used for the stochastic approximation, since they are known to be very efficient in finding a quick rough approximation of an element or a row of the inverse matrix, or a component of the solution vector. We show how the stochastic approximation of the MI can be combined with a deterministic refinement procedure to obtain the MI with the required precision, and how the SLAE can then be solved using the MI. We employ a splitting A = D - C of a given non-singular matrix A, where D is a diagonally dominant matrix and C is a diagonal matrix. In our algorithm for solving SLAE and MI, different choices of D can be considered in order to control the norm of the matrix T = D⁻¹C of the resulting SLAE and to minimize the number of Markov chains required to reach a given precision. Further, we run the algorithms on a mini-Grid and investigate their efficiency depending on the granularity. Corresponding experimental results are presented.
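
As a rough sketch of the hybrid idea described above (in Python, and not the authors' implementation): the stochastic stage below approximates the inverse by averaging randomly truncated partial sums of the Neumann series of T = D⁻¹C, standing in for the Markov-chain Monte Carlo estimator, and the deterministic stage refines it with a Newton-Schulz iteration, which is one common refinement choice; the abstract does not specify which refinement procedure the authors use. The matrix size, scaling and iteration counts are illustrative assumptions.

```python
import numpy as np

def rough_inverse_mc(A, n_samples=200, max_len=10, seed=None):
    """Quick, rough stochastic approximation of A^{-1}.

    Uses the splitting A = D - C with D = diag(A), so that
    A^{-1} = (I + T + T^2 + ...) D^{-1} with T = D^{-1} C, and averages
    randomly truncated partial sums of this Neumann series (a simplified
    stand-in for the Markov chain estimator described in the abstract).
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    D = np.diag(np.diag(A))
    C = D - A                      # A = D - C  =>  C = D - A
    T = np.linalg.solve(D, C)      # T = D^{-1} C
    acc = np.zeros((n, n))
    for _ in range(n_samples):
        term = np.eye(n)
        partial = np.eye(n)
        for _ in range(rng.integers(1, max_len + 1)):
            term = term @ T
            partial = partial + term
        acc += partial
    return (acc / n_samples) @ np.linalg.inv(D)

def refine(A, X, iters=20):
    """Deterministic refinement by Newton-Schulz iteration: X <- X (2I - A X)."""
    I = np.eye(A.shape[0])
    for _ in range(iters):
        X = X @ (2 * I - A @ X)
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    # Test matrix with a strong diagonal so that the norm of T = D^{-1}C is well below 1.
    A = np.eye(n) * 2.0 + 0.05 * rng.standard_normal((n, n))
    X = refine(A, rough_inverse_mc(A, seed=1))   # MI with the required precision
    b = rng.standard_normal(n)
    x = X @ b                                    # solve the SLAE A x = b via the MI
    print("inverse residual: ", np.linalg.norm(A @ X - np.eye(n)))
    print("solution residual:", np.linalg.norm(A @ x - b))
```

The stochastic stage supplies a quick rough inverse and the deterministic refinement drives it to the required precision; the SLAE is then solved by a single matrix-vector product with the refined inverse.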

Relevance:

10.00%

Publisher:

Abstract:

Functional verification (FV) is paramount within the hardware design cycle. With so many new techniques available today to help with FV, which techniques should we really use? The answer is not straightforward, and making the wrong choice is often confusing and costly. The tools and techniques to be used in a project have to be decided upon early in the design cycle to get the best value from these new verification methods. This paper gives a quick survey of FV, establishes the difference between verification and validation, describes the bottlenecks that appear in the verification process, examines the challenges in FV and presents current FV technologies and trends.

Relevance:

10.00%

Publisher:

Abstract:

This study explores the implications of an organization moving toward service-dominant logic (S-D logic) for the sales function. Driven by its customers’ needs, a service orientation by its nature requires personal interaction, and sales personnel are in an ideal position to develop offerings with the customer. However, the development of S-D logic may require sales staff to develop additional skills. Employing a single case study, the study identified that sales personnel are quick to appreciate the advantages of S-D logic for customer satisfaction, and six specific skills were highlighted and explored. Further, three propositions were identified: in an organization adopting S-D logic, the sales process needs to elicit needs at both embedded-value and value-in-use levels, and it needs to coproduce not just goods and service attributes but also attributes of the customer’s usage processes.

Relevance:

10.00%

Publisher:

Abstract:

This paper offers an alternative viewpoint on why people choose to engage in artisanal mining – the low-tech mineral extraction and processing of mainly precious metals and stones – for extended periods in sub-Saharan Africa. Drawing upon experiences from Akwatia, Ghana’s epicentre of diamond production since the mid-1920s, the analysis challenges the commonly-held view that the region’s people are drawn to artisanal mining solely because of a desire ‘to get rich quick’. A combination of events, including the recent closure of Ghana Consolidated Diamonds Ltd’s industrial-scale operation and decreased foreign investment in the country’s diamond industry over concerns that it potentially harbours ‘conflict’ stones from neighbouring Côte d’Ivoire, has had a debilitating economic impact on Akwatia. In an attempt to alleviate their hardships, many of the town’s so-called ‘lifetime’ diamond miners have managed to secure employment in neighbouring artisanal gold mining camps. But their decision has been condemned by many of the country’s policymakers and traditional leaders, who see it solely as a move to secure ‘fast money’. It is argued here, however, that these people pursue work in surrounding artisanal gold mining communities mainly because of poverty, and that their decision has more to do with a desire to immerse themselves in activities with which they are familiar, which offer stable employment and consistent salaries, and which provide immediate debt relief. Misdiagnosis of cases such as Akwatia underscores how unfamiliar policymakers and donors are with the dynamics of artisanal and small-scale mining (ASM) in sub-Saharan Africa.

Relevance:

10.00%

Publisher:

Abstract:

As the challenges and opportunities posed by climate change become increasingly apparent, the need to facilitate successful adaptation and enhance adaptive capacity within the context of sustainable development is clear. With adaptation high on the agenda, the notion of limits and barriers to adaptation has recently received much attention within both academic and policymaking spheres. While the emerging literature has been quick to depict limits and barriers in terms of natural, financial, or technological processes, there is a clear shortfall in acknowledging social barriers to adaptation. It is against such a backdrop that this paper sets out to expose and explore some of the underlying features of social barriers to adaptation, drawing on insights from two case studies in western Nepal. The paper exposes the significant role of cognitive, normative and institutional factors in both influencing and prescribing adaptation. It explores how restrictive social environments can limit adaptation actions and influence adaptive capacity at the local level, particularly for the marginalised and socially excluded. The findings suggest a need for greater recognition of the diversity and complexity of social barriers, for strategic planning and incorporation at national and local levels, and for an emphasis on tackling the underlying drivers of vulnerability and social exclusion.

Relevance:

10.00%

Publisher:

Abstract:

Statistical graphics are a fundamental, yet often overlooked, set of components in the repertoire of data analytic tools. Graphs are quick, efficient and simple instruments for the preliminary exploration of a dataset, helping to reveal its structure and providing insight into influential aspects of inference such as departures from assumptions and latent patterns. In this paper, we present and assess a graphical device for choosing a method for estimating population size in capture-recapture studies of closed populations. The basic concept is derived from the homogeneous Poisson distribution, for which the ratio of neighboring Poisson probabilities multiplied by the value of the larger neighboring count is constant. This property extends to the zero-truncated Poisson distribution, which is of fundamental importance in capture-recapture studies. In practice, however, this distributional property is often violated. The graphical device developed here, the ratio plot, can be used for assessing specific departures from a Poisson distribution. For example, simple contaminations of an otherwise homogeneous Poisson model can be easily detected and a robust estimator for the population size can be suggested. Several robust estimators are developed and a simulation study is provided to give some guidance on which should be used in practice. More systematic departures can also easily be detected using the ratio plot. In this paper, the focus is on Gamma mixtures of the Poisson distribution, which lead to a linear pattern (called structured heterogeneity) in the ratio plot. More generally, the paper shows that the ratio plot is monotone for arbitrary mixtures of power series densities.
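
The Poisson property the ratio plot relies on can be written as (x+1)·p_{x+1}/p_x = λ for every count x, so the empirical ratios r_x = (x+1)·f_{x+1}/f_x computed from the observed frequencies f_x should be roughly constant under a homogeneous Poisson model, while a linear trend in x points to structured heterogeneity such as a Gamma mixture. The short Python sketch below computes these ratio-plot ordinates; the frequency counts are made-up illustrative values (roughly zero-truncated Poisson with λ ≈ 1.5), not data from the paper.

```python
from collections import Counter

def ratio_plot_values(counts):
    """Empirical ratio-plot ordinates r_x = (x + 1) * f_{x+1} / f_x.

    `counts` maps an observed capture count x (x >= 1, zero-truncated)
    to its frequency f_x.  Under a homogeneous Poisson model the r_x are
    approximately constant (equal to lambda); an increasing linear trend
    suggests a Gamma mixture, i.e. structured heterogeneity.
    """
    return {x: (x + 1) * counts[x + 1] / counts[x]
            for x in sorted(counts)
            if counts[x] > 0 and counts.get(x + 1, 0) > 0}

# Illustrative zero-truncated capture frequencies (assumed data, roughly
# Poisson with lambda = 1.5): f_x = number of individuals caught x times.
freq = Counter({1: 150, 2: 113, 3: 56, 4: 21, 5: 6, 6: 2})

for x, r in ratio_plot_values(freq).items():
    print(f"x = {x}: r_x = {r:.2f}")

# A roughly flat sequence of r_x supports the homogeneous Poisson model;
# its common level estimates lambda, from which the number of unseen
# (x = 0) individuals, and hence the population size, can be derived.
```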

Relevance:

10.00%

Publisher:

Abstract:

It is well known that gut bacteria contribute significantly to host homeostasis, providing a range of benefits such as immune protection and vitamin synthesis. They also supply the host with a considerable amount of nutrients, making this ecosystem an essential metabolic organ. In the context of increasing evidence of the link between the gut flora and the metabolic syndrome, understanding the metabolic interactions between the host and its gut microbiota is becoming an important challenge of modern biology [1-4]. Colonization (also referred to as the normalization process) designates the establishment of micro-organisms in a formerly germ-free animal. While it is a natural process occurring at birth, it is also used in adult germ-free animals to control the gut floral ecosystem and further determine its impact on the host metabolism. A common procedure to control the colonization process is gavage with a single micro-organism or a mixture of micro-organisms. This method results in very quick colonization but has the disadvantage of being extremely stressful [5]. It is therefore useful to minimize the stress and to obtain a slower colonization process, so that the impact of bacterial establishment on the host metabolism can be observed gradually. In this manuscript, we describe a procedure to assess the modification of hepatic metabolism during a gradual colonization process using a non-destructive metabolic profiling technique. We propose to monitor gut microbial colonization by assessing the gut microbial metabolic activity reflected in the urinary excretion of microbial co-metabolites, using 1H NMR-based metabolic profiling. This allows an appreciation of the stability of gut microbial activity beyond the stable establishment of the gut microbial ecosystem, which is usually assessed by monitoring fecal bacteria by DGGE (denaturing gradient gel electrophoresis) [6]. The colonization takes place in a conventional open environment and is initiated by dirty litter soiled by conventional animals, which also serve as controls. Since rodents are coprophagous, this ensures a homogeneous colonization, as previously described [7]. Hepatic metabolic profiling is measured directly from an intact liver biopsy using 1H high-resolution magic angle spinning NMR spectroscopy. This semi-quantitative technique offers a quick way to assess, without damaging the cell structure, major metabolites such as triglycerides, glucose and glycogen, in order to further estimate the complex interaction between the colonization process and hepatic metabolism [7-10]. This method can also be applied to any tissue biopsy [11,12].

Relevance:

10.00%

Publisher:

Abstract:

Housebuilding is frequently viewed as an industry full of small firms. However, large firms exist in many countries. Here, a comparative analysis is made of the housebuilding industries in Australia, Britain and the USA. Housebuilding output is found to be much higher in Australia and the USA than in Britain when measured on a per capita basis. At the same time, the degree of market concentration in Australia and the USA is relatively low, but in Britain it is far greater, with a few firms having quite substantial market shares. Investigation of the size distribution of the top 100 or so firms ranked by output also shows that the decline in firm size from the largest downwards is more rapid in Britain than elsewhere. The exceptionalism of the British case is put down to two principal reasons. First, the close proximity of Britain’s regions enables housebuilders to diversify successfully across different markets. The gains from such diversification are best achieved by large firms, because they can gain scale benefits in any particular market segment. Second, land shortages induced by a restrictive planning system encourage firms to take over each other as a quick and beneficial means of acquiring land. The institutional rules of planning also make it difficult for new entrants to come in at the bottom end of the size hierarchy. In this way, concentration grows and a handful of large producers emerge. These conditions do not hold in the other two countries, so their industries are less concentrated. Given the degree of rivalry between firms over land purchases and takeovers, it is difficult to envisage them behaving in a long-term collusive manner, so competition in British housebuilding is probably not unduly compromised by the exceptional degree of firm concentration. Reforms that lower the restrictions, improve the slow responsiveness and reduce the uncertainties associated with the British planning system’s role in housing supply are likely to greatly improve the ability of new firms to enter housebuilding and the ability of all firms to increase output in response to rising housing demand. Such reforms would also probably lower overall housebuilding firm concentration over time.

Relevance:

10.00%

Publisher:

Abstract:

The radiation of the mammals provides a 165-million-year test case for evolutionary theories of how species occupy and then fill ecological niches. It is widely assumed that species often diverge rapidly early in their evolution, and that this is followed by a longer, drawn-out period of slower evolutionary fine-tuning as natural selection fits organisms into an increasingly occupied niche space [1,2]. But recent studies have hinted that the process may not be so simple [3-5]. Here we apply statistical methods that automatically detect temporal shifts in the rate of evolution through time to a comprehensive mammalian phylogeny [6] and data set [7] of body sizes of 3,185 extant species. Unexpectedly, the majority of mammal species, including two of the most speciose orders (Rodentia and Chiroptera), have no history of substantial and sustained increases in the rates of evolution. Instead, a subset of the mammals has experienced an explosive increase (between 10- and 52-fold) in the rate of evolution along the single branch leading to the common ancestor of their monophyletic group (for example Chiroptera), followed by a quick return to lower or background levels. The remaining species are a taxonomically diverse assemblage showing a significant, sustained increase or decrease in their rates of evolution. These results necessarily decouple morphological diversification from speciation and suggest that the processes that give rise to the morphological diversity of a class of animals are far more free to vary than previously considered. Niches do not seem to fill up, and diversity seems to arise whenever, wherever and at whatever rate it is advantageous.

Relevance:

10.00%

Publisher:

Abstract:

Consistent with a negativity bias account, neuroscientific and behavioral evidence demonstrates modulation of even early sensory processes by unpleasant, potentially threat-relevant information. The aim of this research is to assess the extent to which pleasant and unpleasant visual stimuli presented extrafoveally capture attention and impact eye movement control. We report an experiment examining deviations in saccade metrics in the presence of emotional image distractors located close to a nonemotional target. We additionally manipulate saccade latency to test when the emotional distractor has its biggest impact on oculomotor control. The results demonstrate that saccade landing position was pulled toward unpleasant distractors, and that this pull was due to the quick saccade responses. Overall, these findings support a negativity bias account of early attentional control and call for the need to consider the time course of motivated attention when affect is implicit.