965 results for concept analysis


Relevance: 30.00%

Abstract:

Evidence exists that many natural phenomena are better described as fractals. Although fractals are very useful for describing nature, it is also appropriate to review the concept of the random fractal in finance. Given the extraordinary importance of Brownian motion in physics, chemistry and biology, we consider its generalization, fractional Brownian motion, introduced by Mandelbrot. The main goal of this work is to analyse the existence of long-range dependence in instantaneous forward rates of different financial markets. Concretely, we perform an empirical analysis of the Spanish, Mexican and U.S. interbank interest rates. We work with three time series of daily data corresponding to 1-day operations from 28 March 1996 to 21 May 2002. Among the existing tests for long-range dependence, we apply the methodology proposed by Taqqu, Teverovsky and Willinger (1995).
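The Taqqu, Teverovsky and Willinger (1995) survey covers several estimators of long-range dependence; the sketch below implements just one of them, the aggregated-variance estimator of the Hurst exponent, on a synthetic series. The series, block sizes and function name are illustrative assumptions, not the paper's data or code.

```python
import numpy as np

def hurst_aggregated_variance(x, block_sizes):
    """Estimate the Hurst exponent H by the aggregated-variance method.

    For each block size m, the series is averaged over non-overlapping
    blocks; for long-range dependent data the variance of these block
    means scales as m**(2H - 2), so H is read off a log-log regression.
    """
    x = np.asarray(x, dtype=float)
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        if n_blocks < 2:
            continue
        means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(means.var()))
    slope, _ = np.polyfit(log_m, log_var, 1)
    return 1.0 + slope / 2.0   # slope = 2H - 2

# Illustrative use on white noise (expected H close to 0.5).
rng = np.random.default_rng(0)
noise = rng.standard_normal(1500)
print(hurst_aggregated_variance(noise, block_sizes=[2, 4, 8, 16, 32, 64]))
```

An estimate clearly above 0.5 for an interest-rate series would point to long-range dependence of the kind discussed in the abstract.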

Relevance: 30.00%

Abstract:

The function of DNA-binding proteins is controlled not just by their abundance, but mainly at the level of their activity, in terms of their interactions with DNA and protein targets. Moreover, the affinity of such transcription factors for their target sequences is often controlled by co-factors and/or modifications that are not easily assessed from biological samples. Here, we describe a scalable method for monitoring protein-DNA interactions on a microarray surface. This approach was designed to determine the DNA-binding activity of proteins in crude cell extracts, complementing conventional expression profiling arrays. Enzymatic labeling of the DNA enables direct normalization of protein binding to the microarray, allowing the estimation of relative binding affinities. Using DNA sequences covering a range of affinities, we show that the new microarray-based method yields binding strength estimates similar to those of low-throughput gel mobility-shift assays. The microarray is also highly sensitive, as it allows the detection of a rare DNA-binding protein from breast cancer cells, the human tumor suppressor AP-2. This approach thus enables precise and robust assessment of the activity of DNA-binding proteins and takes current DNA-binding assays to a high-throughput level.
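As a rough illustration of the normalization idea, the hedged sketch below divides a hypothetical bound-protein signal by the labeled-DNA signal of each spot and expresses binding relative to a reference sequence; the spot values, names and normalization scheme are assumptions for illustration, not the published protocol.

```python
import numpy as np

# Hypothetical spot-level signals; the values and the normalization
# scheme are assumptions for illustration, not the published protocol.
protein_signal = np.array([1800.0, 950.0, 240.0])    # bound-protein signal per spot
dna_signal     = np.array([1.00e4, 0.95e4, 1.05e4])  # labeled-DNA signal per spot
sequences      = ["site_ref", "site_mutA", "site_mutB"]

# Normalizing the protein signal by the amount of spotted DNA puts
# spots with different DNA densities on a comparable scale.
normalized = protein_signal / dna_signal

# Relative binding strength of each sequence versus the reference site.
relative_affinity = normalized / normalized[0]
for name, rel in zip(sequences, relative_affinity):
    print(f"{name}: {rel:.2f}")
```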

Relevance: 30.00%

Abstract:

Genome-wide association studies (GWAS) are designed to identify the single-nucleotide polymorphisms (SNPs) in genome sequences that are associated with a complex trait. Strategies based on the gene-list enrichment concept are currently applied for the functional analysis of GWAS, according to which a significant overrepresentation of candidate genes associated with a biological pathway is used as a proxy to infer overrepresentation of candidate SNPs in that pathway. Here we show that such an inference is not always valid and introduce the program SNP2GO, which implements a new method to properly test for the overrepresentation of candidate SNPs in biological pathways.
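For context, the sketch below shows the conventional gene-list enrichment calculation (a one-sided hypergeometric test) that the paper argues cannot simply be transferred to SNPs; it is not the SNP2GO algorithm, and all numbers are invented.

```python
from scipy.stats import hypergeom

def gene_enrichment_p(total_genes, pathway_genes, candidate_genes, overlap):
    """One-sided hypergeometric test for over-representation of candidate
    genes in a pathway: P(X >= overlap) when drawing `candidate_genes`
    genes from a genome of `total_genes`, of which `pathway_genes`
    belong to the pathway."""
    return hypergeom.sf(overlap - 1, total_genes, pathway_genes, candidate_genes)

# Illustrative numbers (not from the paper): 20,000 genes, a pathway of
# 150 genes, 400 candidate genes, 12 of which fall in the pathway.
print(gene_enrichment_p(20000, 150, 400, 12))
```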

Relevance: 30.00%

Abstract:

This study proposes a new concept for upscaling local information on failure surfaces derived from geophysical data, in order to extend the spatial information and quickly estimate the magnitude and intensity of a landslide. A new approach to seismic interpretation on landslides is also demonstrated, taking into account basic geomorphic information with a numerical method based on the Sloping Local Base Level (SLBL). The SLBL is a generalization of the base level defined in geomorphology applied to landslides, and it allows the calculation of the potential geometry of the landslide failure surface. This approach was applied to a large-scale landslide formed mainly in gypsum and situated in a former glacial valley along the Rhone within the Western European Alps. Previous studies identified the existence of two sliding surfaces that may continue below the level of the valley. In this study, seismic refraction-reflection surveys were carried out to verify the existence of these failure surfaces. The analysis of the seismic data provides a four-layer model in which three velocity layers (<1000 m/s, 1500 m/s and 3000 m/s) are interpreted as the mobilized mass at different levels of weathering and compaction. The highest-velocity layer (>4000 m/s), with a maximum depth of ~58 m, is interpreted as the stable anhydrite bedrock. Two failure surfaces were interpreted from the seismic surveys: an upper failure surface and a much deeper one (respectively 25 and 50 m deep). The upper failure surface depth deduced from geophysics is slightly different from the results obtained using the SLBL, and the deeper failure surface depth calculated with the SLBL method is underestimated in comparison with the geophysical interpretations. Optimal results were therefore obtained by including the seismic data in the SLBL calculations according to the geomorphic limits of the landslide (maximum volume of mobilized mass = 7.5 x 10^6 m^3).
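The following is a schematic 1-D illustration of the SLBL principle, assuming the common formulation in which each node is iteratively lowered to the mean of its neighbours minus a tolerance; the profile and tolerance are invented, and the real calculation is performed on a 2-D grid constrained by the landslide limits and, in this study, by the seismic data.

```python
import numpy as np

def slbl_1d(profile, tolerance=0.0, n_iter=5000):
    """Schematic 1-D Sloping Local Base Level (SLBL) computation.

    Starting from the topographic profile, each interior node is lowered
    to the mean of its two neighbours minus a tolerance whenever that
    value lies below the current one.  Iterating to convergence yields a
    smooth potential failure surface; the difference between topography
    and this surface approximates the mobilizable thickness.
    """
    z = np.asarray(profile, dtype=float).copy()
    for _ in range(n_iter):
        target = 0.5 * (z[:-2] + z[2:]) - tolerance
        updated = np.minimum(z[1:-1], target)
        if np.allclose(updated, z[1:-1]):
            break
        z[1:-1] = updated
    return z

# Illustrative profile (m): a bump between two stable valley flanks.
topo = np.array([100, 102, 110, 118, 121, 119, 112, 104, 100], dtype=float)
failure_surface = slbl_1d(topo, tolerance=0.5)
thickness = topo - failure_surface
print(failure_surface.round(1))
print("max thickness (m):", thickness.max().round(1))
```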

Relevance: 30.00%

Abstract:

The prediction of rockfall travel distance below a rock cliff is an indispensable activity in rockfall susceptibility, hazard and risk assessment. Although the size of the detached rock mass may differ considerably at each specific rock cliff, small rockfalls (<100 m^3) are the most frequent process. Empirical models may provide suitable information for predicting the travel distance of small rockfalls over an extensive area at a medium scale (1:100,000 to 1:25,000). "Solà d'Andorra la Vella" is a rocky slope located close to the town of Andorra la Vella, where the government has been documenting rockfalls since 1999. This documentation consists of mapping the release point and the individual fallen blocks immediately after each event. Documentation of historical rockfalls through morphological analysis, eye-witness accounts and historical images serves to increase the available information. In total, data from twenty small rockfalls have been gathered, comprising about one hundred individual fallen rock blocks. The data acquired have been used to check the reliability of the main widely adopted empirical models (the reach and shadow angle models) and to analyse the influence of parameters affecting the travel distance (rockfall size, height of fall along the rock cliff and volume of the individual fallen rock block). For predicting travel distances on medium-scale maps, a method based on the "reach probability" concept has been proposed. The accuracy of the results has been tested against the line joining the farthest fallen boulders, which represents the maximum travel distance of past rockfalls. The paper concludes with a discussion of the application of both empirical models to other study areas.
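As a minimal illustration of the angle-based empirical models, the sketch below converts an assumed fall height and reach (or shadow) angle into a horizontal runout distance, L = H / tan(angle); the numbers are invented and do not come from the Solà d'Andorra inventory.

```python
import math

def max_runout_distance(fall_height_m, angle_deg):
    """Horizontal travel distance implied by an empirical angle model.

    The reach (or shadow) angle is the inclination of the line joining
    the release point (or talus apex) to the stopping point of the
    farthest block, so the horizontal runout is H / tan(angle).
    """
    return fall_height_m / math.tan(math.radians(angle_deg))

# Illustrative values: a 60 m release height and a 28 degree reach angle.
print(f"{max_runout_distance(60.0, 28.0):.1f} m")
```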

Relevance: 30.00%

Abstract:

The focus of my PhD research was the concept of modularity. In the last 15 years, modularity has become a classic term in different fields of biology. On the conceptual level, a module is a set of interacting elements that remain mostly independent from the elements outside the module. I used modular analysis techniques to study gene expression evolution in vertebrates. In particular, I identified "natural" modules of gene expression in mouse and human, and I showed that the expression of organ-specific and system-specific genes tends to be conserved between such distant vertebrates as mammals and fishes. Also using a modular approach, I studied patterns of developmental constraints on transcriptome evolution. I showed that neither of the two commonly accepted models of the evolution of embryonic development ("evo-devo") is exclusively valid. In particular, I found that the conservation of the sequences of regulatory regions is highest during mid-development of zebrafish, which supports the "hourglass model". In contrast, events of gene duplication and new gene introduction are rarest in early development, which supports the "early conservation model". In addition to the biological insights on transcriptome evolution, I have also discussed in detail the advantages of modular approaches in large-scale data analysis. Moreover, I re-analyzed several studies (published in high-ranking journals) and showed that their conclusions do not hold up under detailed analysis. This demonstrates that complex analysis of high-throughput data requires cooperation between biologists, bioinformaticians and statisticians.

Relevance: 30.00%

Abstract:

The aim of this work is to present a new concept, called on-line desorption of dried blood spots (on-line DBS), allowing the direct analysis of a dried blood spot coupled to a liquid chromatography-mass spectrometry (LC/MS) device. The system is based on a stainless-steel cell which receives a blood sample (10 microL) previously spotted on filter paper. The cell is then integrated into the LC/MS system, where the analytes are desorbed from the paper towards a column-switching system ensuring the purification and separation of the compounds before their detection on a single-quadrupole MS coupled to an atmospheric pressure chemical ionisation (APCI) source. With the described procedure, no pretreatment is necessary even though the analysis is based on a whole-blood sample. To demonstrate the applicability of the concept, saquinavir, imipramine and verapamil were chosen. Despite the use of a small sampling volume and a single-quadrupole detector, on-line DBS allowed the analysis of these three compounds over their therapeutic concentration ranges, from 50 to 500 ng/mL for imipramine and verapamil and from 100 to 1000 ng/mL for saquinavir. Moreover, the method showed good repeatability, with relative standard deviations (RSD) lower than 15% at two concentration levels (low and high). The response functions were found to be linear over the therapeutic concentration range for each compound and were used to determine the concentrations in real patient samples for saquinavir. Comparison of the values found with those of a validated method used routinely in a reference laboratory showed a good correlation between the two methods. Moreover, good selectivity was observed, ensuring that no endogenous or chemical components interfered with the quantitation of the analytes. This work demonstrates the feasibility and applicability of the on-line DBS procedure for bioanalysis.
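A hedged sketch of how a linear response function can be used to back-calculate patient concentrations is given below; the calibration levels, peak areas and sample value are invented and only stand in for the kind of data produced by on-line DBS-LC/MS runs.

```python
import numpy as np

# Hypothetical calibration data for one analyte (imipramine is used only
# as a label; the concentrations and peak areas below are invented).
conc_ng_ml = np.array([50, 100, 200, 300, 400, 500], dtype=float)
peak_area  = np.array([1.1e4, 2.3e4, 4.4e4, 6.8e4, 8.9e4, 11.2e4])

# Least-squares fit of the response function (area = slope*conc + intercept).
slope, intercept = np.polyfit(conc_ng_ml, peak_area, 1)

# Back-calculate the concentration of an unknown sample from its peak area.
sample_area = 5.6e4
sample_conc = (sample_area - intercept) / slope
print(f"estimated concentration: {sample_conc:.0f} ng/mL")
```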

Relevance: 30.00%

Abstract:

In this paper, we present a critical analysis of the state of the art in the definition and typologies of paraphrasing. This analysis shows that there exists no characterization of paraphrasing that is at the same time comprehensive, linguistically based and computationally tractable. We then set out to define and delimit the concept on the basis of propositional content. We present a general, inclusive and computationally oriented typology of the linguistic mechanisms that give rise to form variations between paraphrase pairs.

Relevance: 30.00%

Abstract:

BACKGROUND: Scientists have long been trying to understand the molecular mechanisms of diseases in order to design preventive and therapeutic strategies. For some diseases, it has become evident that it is not enough to obtain a catalogue of the disease-related genes; one must also uncover how disruptions of molecular networks in the cell give rise to disease phenotypes. Moreover, with the unprecedented wealth of information available, even obtaining such a catalogue is extremely difficult.

PRINCIPAL FINDINGS: We developed a comprehensive gene-disease association database by integrating associations from several sources that cover different biomedical aspects of diseases. In particular, we focus on the current knowledge of human genetic diseases, including Mendelian, complex and environmental diseases. To assess the concept of modularity of human diseases, we performed a systematic study of the emergent properties of human gene-disease networks by means of network topology and functional annotation analysis. The results indicate a highly shared genetic origin of human diseases and show that for most diseases, including Mendelian, complex and environmental diseases, functional modules exist. Moreover, a core set of biological pathways is found to be associated with most human diseases. We obtained similar results when studying clusters of diseases, suggesting that related diseases might arise from dysfunction of common biological processes in the cell.

CONCLUSIONS: For the first time, we include Mendelian, complex and environmental diseases in an integrated gene-disease association database and show that the concept of modularity applies to all of them. We furthermore provide a functional analysis of disease-related modules that yields important new biological insights, which might not be discovered when considering each of the gene-disease association repositories independently. Hence, we present a suitable framework for the study of how genetic and environmental factors, such as drugs, contribute to diseases.

AVAILABILITY: The gene-disease networks used in this study and part of the analysis are available at http://ibi.imim.es/DisGeNET/DisGeNETweb.html#Download
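A minimal sketch of the kind of gene-disease network analysis described here is shown below, using a few invented associations (not DisGeNET records): a bipartite gene-disease graph is built and projected onto diseases, so that two diseases become linked when they share an associated gene.

```python
import networkx as nx
from networkx.algorithms import bipartite

# Toy gene-disease associations (invented, not DisGeNET records).
associations = [
    ("BRCA1", "breast carcinoma"),
    ("TP53",  "breast carcinoma"),
    ("TP53",  "colorectal cancer"),
    ("APC",   "colorectal cancer"),
    ("LEP",   "obesity"),
]

# Bipartite gene-disease graph.
G = nx.Graph()
genes    = {g for g, _ in associations}
diseases = {d for _, d in associations}
G.add_nodes_from(genes, bipartite="gene")
G.add_nodes_from(diseases, bipartite="disease")
G.add_edges_from(associations)

# Disease projection: two diseases are linked when they share a gene,
# one simple way to look for shared genetic origin and disease modules.
disease_net = bipartite.projected_graph(G, diseases)
print(list(disease_net.edges()))

# Simple topological summary of the projection.
print(dict(disease_net.degree()))
```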

Relevance: 30.00%

Abstract:

OBJECTIVE: Standard cardiopulmonary bypass (CPB) circuits, with their large surface area and volume, contribute to postoperative systemic inflammatory reaction and hemodilution. To minimize these problems, a new approach has been developed, resulting in a single disposable, compact arterio-venous loop with integral kinetic-assist pumping, oxygenation, air removal and gross filtration capabilities (CardioVention Inc., Santa Clara, CA, USA). The impact of this system on gas exchange capacity, blood elements and hemolysis is compared to that of a conventional circuit in a model of prolonged perfusion. METHODS: Twelve calves (mean body weight: 72.2 ± 3.7 kg) were placed on cardiopulmonary bypass for 6 h at a flow of 5 l/min and randomly assigned to the CardioVention system (n=6) or a standard CPB circuit (n=6). A standard battery of blood samples was taken before and throughout bypass. Analysis of variance was used for comparison. RESULTS: The hematocrit remained stable throughout the experiment in the CardioVention group, whereas it dropped in the standard group in the early phase of perfusion. When normalized to prebypass values, the two profiles differed significantly (P<0.01). Both O2 and CO2 transfer were significantly improved in the CardioVention group (P=0.04 and P<0.001, respectively). There was a slightly higher pressure drop in the CardioVention group, but no single value exceeded 112 mmHg. No hemolysis could be detected in either group, with all free plasma Hb values below 15 mg/l. The thrombocyte count, when corrected for hematocrit and normalized to prebypass values, exhibited a greater drop in the standard group (P=0.03). CONCLUSION: The CardioVention system, with its concept of limited priming volume and exposed foreign surface area, improves gas exchange, probably because of the absence of detectable hemodilution, and appears to limit the decrease in thrombocyte count, which may be ascribed to the reduced surface. Despite the volume and surface constraints, no hemolysis could be detected throughout the 6 h full-flow perfusion period.

Relevance: 30.00%

Abstract:

BACKGROUND: Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences when used for a binary classification of subjects into a group who should be treated and a group who should not. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. METHODS: We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. RESULTS: We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative for assessing the benefit achieved by a model, being invariant with respect to the coding of the outcome and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Finally, we show that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. CONCLUSIONS: We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
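A minimal sketch of the net-benefit calculations is given below, assuming the usual decision-curve weighting by the threshold odds; the data are invented, and the formula shown for the untreated is stated only as an illustration of the distinction made in the paper, not as its exact implementation.

```python
import numpy as np

def net_benefits(y_true, risk_score, threshold):
    """Net benefit of classifying subjects as 'treat' when their
    predicted risk exceeds `threshold` (probability scale).

    Returns the classical net benefit for the treated and an analogous
    quantity for the untreated, following the usual decision-curve
    weighting by the odds at the threshold.
    """
    y_true = np.asarray(y_true, dtype=bool)
    treat = np.asarray(risk_score) >= threshold
    n = len(y_true)
    tp = np.sum(treat & y_true)
    fp = np.sum(treat & ~y_true)
    tn = np.sum(~treat & ~y_true)
    fn = np.sum(~treat & y_true)
    w = threshold / (1.0 - threshold)          # odds at the threshold
    nb_treated = (tp - w * fp) / n
    nb_untreated = (tn - fn / w) / n
    return nb_treated, nb_untreated

# Illustrative data: outcomes and predicted risks for ten subjects.
y = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
p = [0.9, 0.7, 0.6, 0.2, 0.8, 0.3, 0.1, 0.4, 0.5, 0.2]
print(net_benefits(y, p, threshold=0.5))
```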

Relevance: 30.00%

Abstract:

Due to the low workability of slipform concrete mixtures, the science of rheology is not strictly applicable to such concrete. However, the concept of rheological behavior may still be considered useful. A novel workability test method (the Vibrating Kelly Ball, or VKelly test), designed to quantitatively assess the responsiveness of a dry concrete mixture to vibration, as is desired of a mixture suitable for slipform paving, was developed and evaluated. The test method is intended to be cost-effective, portable and repeatable while indicating the suitability of a mixture for use in slipform paving. The work to evaluate and refine the test was conducted in three phases: 1. Assess whether the VKelly test can signal variations in laboratory mixtures with a range of materials and proportions; 2. Run the VKelly test in the field at a number of construction sites; 3. Validate the VKelly test results against the Box Test developed at Oklahoma State University for slipform paving concrete. The data collected to date indicate that the VKelly test is suitable for assessing a mixture's response to vibration (workability) with low multiple-operator variability. A new parameter, the VKelly Index, is introduced and defined; a mixture appears suitable for slipform paving when its VKelly Index falls in the range of 0.8 to 1.2 in./√s.
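The sketch below assumes that the VKelly Index is obtained as the slope of ball penetration versus the square root of vibration time, which is consistent with its in./√s units; the readings are invented.

```python
import numpy as np

# Hypothetical VKelly readings: vibration time (s) and ball penetration (in.).
# The index is taken here as the slope of penetration versus sqrt(time),
# consistent with its in./sqrt(s) units; all values are invented.
time_s      = np.array([6, 12, 18, 24, 30], dtype=float)
penetration = np.array([2.4, 3.5, 4.3, 5.0, 5.6])  # inches

vkelly_index, _ = np.polyfit(np.sqrt(time_s), penetration, 1)
print(f"VKelly Index = {vkelly_index:.2f} in./sqrt(s)")
suitable = 0.8 <= vkelly_index <= 1.2
print("within the 0.8-1.2 range suggested for slipform paving:", suitable)
```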

Relevance: 30.00%

Abstract:

In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings. While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used; they are not oriented towards helping find which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what is available rather than helping determine how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools are not very scalable: a system-level method of analysis seldom works at the project level, and vice versa.

In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or a county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded and in what sequence? Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective?

In considering these issues and questions, the author concluded that it would be of value to identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and take a broad viewpoint. After trying out a number of concepts, a promising approach emerged: view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) roads are but one sub-system of a much larger "Road Based Transportation System"; 2) the size and activity level of the overall system are determined by market forces; 3) the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from those market forces represents the total cost of transportation; and 4) the economic purpose of making road improvements is to minimize that total cost.

To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed. This involved creating a physical model to represent the size, characteristics, activity levels, and the rates at which the activities take place, developing a companion economic cost model, and then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used at full-system, partial-system, single-segment, project, and general design guide levels of analysis. The method appeared capable of remedying many of the defects of existing work methods and of answering society's transportation questions from a new perspective.
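A toy sketch of the total-cost viewpoint follows: two hypothetical alternatives are compared by summing agency and user costs over an analysis period (discounting omitted for brevity). Every cost category and number is invented; this is not the project's spreadsheet model, only an illustration of choosing the alternative that minimizes the total system cost rather than the construction cost alone.

```python
# Schematic "Total Cost of Transportation" comparison of two road
# alternatives.  All categories and numbers are invented; the point is
# only that the preferred alternative minimizes the total system cost
# (agency plus user costs), not just the construction cost.

ANALYSIS_YEARS = 20

alternatives = {
    "keep gravel road": {
        "construction": 0.0,           # $ up-front
        "annual_maintenance": 45_000,  # $/year, agency cost
        "annual_user_cost": 310_000,   # $/year, vehicle operation + time
    },
    "pave road": {
        "construction": 1_200_000,
        "annual_maintenance": 20_000,
        "annual_user_cost": 240_000,
    },
}

def total_cost(alt):
    recurring = alt["annual_maintenance"] + alt["annual_user_cost"]
    return alt["construction"] + ANALYSIS_YEARS * recurring

for name, alt in alternatives.items():
    print(f"{name}: total system cost over {ANALYSIS_YEARS} years = ${total_cost(alt):,.0f}")

best = min(alternatives, key=lambda k: total_cost(alternatives[k]))
print("alternative with the lowest total cost:", best)
```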

Relevance: 30.00%

Abstract:

A prior project, HR-388 (entitled "Total Cost of Transportation analysis of road and highway issues"), explored the use of a total-economic-cost basis for the evaluation of road-based transportation issues. It was conducted as a proof-of-concept effort between 1996 and 2002, with the final report presented in May 2002. TR-477 rebuilt the analytical model using current data and then performed general, system-level, county-level, and road-segment-level analyses. The results are presented herein and will be distributed to all county engineers for information and local use.

Relevance: 30.00%

Abstract:

The measurement of fat balance (fat input minus fat output) involves the accurate estimation of both metabolizable fat intake and total fat oxidation. This is possible mostly under laboratory conditions and not yet in free-living conditions. In the latter situation, net fat retention or mobilization can be estimated from precise and accurate sequential body composition measurements. In the case of a positive balance, lipids stored in adipose tissue can originate from dietary (exogenous) lipids or from non-lipid precursors, mainly carbohydrates (CHO) but also ethanol, through a process known as de novo lipogenesis (DNL). Basic equations are provided in this review to facilitate the interpretation of the different subcomponents of fat balance (endogenous vs exogenous) under different nutritional circumstances. One difficulty is methodological: total DNL is difficult to measure quantitatively in man; for example, indirect calorimetry only tracks net DNL, not total DNL. Although the numerous factors (mostly exogenous) influencing DNL have been studied, in particular the effect of CHO overfeeding, there is little information on the rate of DNL under habitual conditions of life, that is, with large day-to-day fluctuations in CHO intake, different types of CHO ingested with different glycemic indexes, alcohol combined with excess CHO intake, and so on. Three issues, which are still controversial today, are addressed: (1) Is the increase in fat mass induced by CHO overfeeding explained by DNL alone, by decreased endogenous fat oxidation, or both? (2) Is DNL different in overweight and obese individuals as compared to their lean counterparts? (3) Does DNL occur both in the liver and in adipose tissue? Recent studies have demonstrated that acute CHO overfeeding influences adipose tissue lipogenic gene expression and that CHO may stimulate DNL in skeletal muscle, at least in vitro. The role of DNL and its importance in health and disease remain to be further clarified, in particular the putative effect of DNL on the control of energy intake and energy expenditure, as well as the occurrence of DNL in other tissues (such as myocytes) in addition to hepatocytes and adipocytes.
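The hedged sketch below restates the basic bookkeeping described in the text: fat balance as intake minus oxidation and, for a positive balance, stored lipid as the sum of retained dietary fat and fat newly synthesized by net DNL. All numbers are invented and the functions are illustrative, not the review's equations.

```python
def fat_balance(fat_intake_g, fat_oxidation_g):
    """Fat balance (g/day): metabolizable fat intake minus total fat oxidation."""
    return fat_intake_g - fat_oxidation_g

def lipid_stored(exogenous_fat_stored_g, net_dnl_g):
    """With a positive balance, stored lipid is the sum of retained dietary
    (exogenous) fat and fat newly synthesized from non-lipid precursors
    (net de novo lipogenesis), as sketched in the text."""
    return exogenous_fat_stored_g + net_dnl_g

# Illustrative day of carbohydrate overfeeding (all numbers invented):
# 100 g of fat eaten, 70 g oxidized, 15 g of fat made de novo from CHO.
balance = fat_balance(100, 70)
print("fat balance (g/day):", balance)
print("total lipid stored (g/day):", lipid_stored(balance, 15))
```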