973 results for Reserve Selection Procedures
Abstract:
Microarray-based global gene expression profiling, with the use of sophisticated statistical algorithms, is providing new insights into the pathogenesis of autoimmune diseases. We have applied a novel statistical technique for gene selection, based on machine learning approaches, to analyze microarray expression data gathered from patients with systemic lupus erythematosus (SLE) and primary antiphospholipid syndrome (PAPS), two autoimmune diseases of unknown genetic origin that share many common features. The methodology combined three data discretization policies, a consensus gene selection method, and a multivariate correlation measurement. A set of 150 genes was found to discriminate SLE and PAPS patients from healthy individuals. Statistical validations demonstrate the relevance of this gene set from both a univariate and a multivariate perspective. Moreover, functional characterization of these genes identified an interferon-regulated gene signature, consistent with previous reports. It also revealed the existence of other regulatory pathways, including those regulated by PTEN, TNF, and BCL-2, which are altered in SLE and PAPS. Remarkably, a significant number of these genes carry E2F binding motifs in their promoters, suggesting a role for E2F in the regulation of autoimmunity.
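As a hedged illustration of the kind of consensus gene selection the abstract describes (multiple discretization policies, keep only genes ranked highly under all of them), here is a minimal sketch. The gene names, expression values, group sizes, and the separation score are all hypothetical simplifications, not the authors' actual algorithm:

```python
from statistics import mean

# Toy expression matrix: gene -> values (first 3 samples are patients,
# last 3 are healthy controls). All names and numbers are hypothetical.
EXPRESSION = {
    "IFI27": [9.1, 8.7, 9.4, 2.1, 2.3, 1.9],
    "PTEN":  [3.0, 2.8, 3.1, 6.9, 7.2, 7.0],
    "ACTB":  [5.0, 5.1, 4.9, 5.0, 5.2, 4.8],
}
PATIENTS, CONTROLS = slice(0, 3), slice(3, 6)

def discretize(values, n_bins):
    """Equal-width binning: map each expression value to a bin index."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

def separation_score(bins):
    """How differently the discretized gene behaves in the two groups."""
    return abs(mean(bins[PATIENTS]) - mean(bins[CONTROLS]))

def consensus_genes(expression, policies=(2, 3, 4), top_k=2):
    """Keep genes ranked in the top_k under every discretization policy."""
    selected = None
    for n_bins in policies:
        scores = {g: separation_score(discretize(v, n_bins))
                  for g, v in expression.items()}
        top = set(sorted(scores, key=scores.get, reverse=True)[:top_k])
        selected = top if selected is None else selected & top
    return selected

print(consensus_genes(EXPRESSION))
```

The housekeeping-like gene (`ACTB`) scores near zero under every policy and is excluded, while genes that separate the groups survive the intersection.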
Abstract:
Geologic storage of carbon dioxide (CO2) has been proposed as a viable means for reducing anthropogenic CO2 emissions. Once injection begins, a program for measurement, monitoring, and verification (MMV) of CO2 distribution is required in order to: a) research key features, effects and processes needed for risk assessment; b) manage the injection process; c) delineate and identify leakage risk and surface escape; d) provide early warnings of failure near the reservoir; and e) verify storage for accounting and crediting. The selection of the monitoring methodology (characterization of the site, and control and verification in the post-injection phase) is influenced by economic and technological variables. Multiple Criteria Decision Making (MCDM) refers to a methodology developed for making decisions in the presence of multiple criteria. MCDM as a discipline has only a relatively short history of 40 years, and it has been closely related to advancements in computer technology. Evaluation methods and multicriteria decisions include the selection of a set of feasible alternatives, the simultaneous optimization of several objective functions, and a decision-making process and evaluation procedures that must be rational and consistent. The application of a mathematical model of decision-making will help to find the best solution, establishing mechanisms to facilitate the management of information generated by a number of disciplines. Problems in which the decision alternatives are finite are called discrete multicriteria decision problems. Such problems are the most common in practice, and this is the scenario applied here to the problem of site selection for storing CO2. Discrete MCDM is used to assess and decide on issues that by nature or design support a finite number of alternative solutions.
Recently, Multicriteria Decision Analysis has been applied to rank policy incentives for CCS, to assess the role of CCS, and to select potential areas that could be suitable for storage. For those reasons, MCDM has been considered in the monitoring phase of CO2 storage, in order to select suitable technologies that could be techno-economically viable. In this paper, we identify techniques for subsurface gas measurement that are currently applied in the characterization (pre-injection) phase; MCDM will help decision-makers rank the techniques best suited to monitoring each specific physico-chemical parameter.
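A minimal sketch of the discrete MCDM ranking step such a monitoring study might use is a weighted sum over normalised criterion scores. The technique names, criterion scores, and weights below are purely illustrative, not values from the paper:

```python
# Criteria are benefit-oriented after normalisation (for "cost", a higher
# score means cheaper). Weights reflect hypothetical decision-maker priorities.
CRITERIA = ["cost", "spatial coverage", "sensitivity"]
WEIGHTS = [0.5, 0.3, 0.2]            # sum to 1

ALTERNATIVES = {                      # hypothetical monitoring techniques
    "soil gas flux chambers": [0.8, 0.3, 0.7],
    "4D seismic survey":      [0.2, 0.9, 0.8],
    "groundwater sampling":   [0.6, 0.4, 0.5],
}

def weighted_sum(scores, weights):
    """Additive aggregation of one alternative's criterion scores."""
    return sum(s * w for s, w in zip(scores, weights))

ranking = sorted(ALTERNATIVES.items(),
                 key=lambda kv: weighted_sum(kv[1], WEIGHTS),
                 reverse=True)
for name, scores in ranking:
    print(f"{name}: {weighted_sum(scores, WEIGHTS):.2f}")
```

With these illustrative weights the cheap, sensitive technique wins; shifting weight toward spatial coverage would favour the seismic survey, which is exactly the sensitivity analysis a decision-maker would run.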
Abstract:
Road accidents are a very relevant issue in many countries, and macroeconomic models are frequently applied by academia and administrations to reduce their frequency and consequences. For the selection of explanatory variables and the response transformation parameter within the Bayesian framework, TIM and 3IM (two-input and three-input model) procedures are proposed. The procedure also uses the DIC and pseudo-R2 goodness-of-fit criteria. The model to which the methodology is applied is a dynamic regression model with a Box-Cox transformation (BCT) for the explanatory variables and an autoregressive (AR) structure for the response. An initial set of 22 explanatory variables is identified, and the effects of these factors on the fatal accident frequency in Spain during 2000-2012 are estimated. The dependent variable is constructed considering the stochastic trend component.
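The Box-Cox transformation mentioned above can be sketched directly, including a simple profile-likelihood grid search for the transformation parameter. This is a generic illustration of BCT, not the paper's Bayesian selection procedure; the example data are hypothetical:

```python
import math

def box_cox(values, lam):
    """Box-Cox transform: (x**lam - 1)/lam, or log(x) when lam == 0.
    Defined only for strictly positive inputs."""
    if any(v <= 0 for v in values):
        raise ValueError("Box-Cox requires positive values")
    if lam == 0:
        return [math.log(v) for v in values]
    return [(v ** lam - 1.0) / lam for v in values]

def box_cox_loglik(values, lam):
    """Profile log-likelihood of lambda under a normal model."""
    n = len(values)
    t = box_cox(values, lam)
    m = sum(t) / n
    var = sum((x - m) ** 2 for x in t) / n
    return -0.5 * n * math.log(var) + (lam - 1.0) * sum(math.log(v) for v in values)

def best_lambda(values, grid=(-1.0, -0.5, 0.0, 0.5, 1.0)):
    """Pick the transformation parameter maximising the profile likelihood."""
    return max(grid, key=lambda lam: box_cox_loglik(values, lam))

# Data on an exponential scale should favour the log transform (lambda = 0).
data = [math.exp(x) for x in (0.1, 0.5, 1.0, 1.5, 2.0, 0.8, 1.2)]
print(best_lambda(data))
```

In practice the parameter would be estimated jointly with the regression (here within the Bayesian framework); the grid search only illustrates what the transformation parameter controls.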
Abstract:
Fusion phage libraries expressing single-chain Fv antibodies were constructed from the peripheral blood lymphocytes of two melanoma patients who had been immunized with autologous melanoma cells transduced with the gamma-interferon gene to enhance immunogenicity, in a trial conducted at another institution. Anti-melanoma antibodies were selected from each library by panning the phage against live cultures of the autologous tumor. After two or three rounds of panning, clones of the phage were tested by ELISA for binding to the autologous tumor cells; > 90% of the clones tested showed a strong ELISA reaction, demonstrating the effectiveness of the panning procedure for selecting anti-melanoma antibodies. The panned phage population was extensively absorbed against normal melanocytes to enrich for antibodies that react with melanoma cells but not with melanocytes. The unabsorbed phage were cloned, and the specificities of the expressed antibodies were individually tested by ELISA with a panel of cultured human cells. The first tests were done with normal endothelial and fibroblast cells to identify antibodies that do not react, or react weakly, with two normal cell types, indicating some degree of specificity for melanoma cells. The proportion of phage clones expressing such antibodies was approximately 1%. Those phage were further tested by ELISA with melanocytes, several melanoma lines, and eight other tumor lines, including a glioma line derived from glial cells that share a common lineage with melanocytes. The ELISA tests identified three classes of anti-melanoma antibodies, as follows: (i) a melanoma-specific class that reacts almost exclusively with the melanoma lines; (ii) a tumor-specific class that reacts with melanoma and other tumor lines but does not react with the normal melanocyte, endothelial and fibroblast cells; and (iii) a lineage-specific class that reacts with the melanoma lines, melanocytes, and the glioma line but does not react with the other lines.
These are rare classes from the immunized patients' repertoires of anti-melanoma antibodies, most of which are relatively nonspecific anti-self antibodies. The melanoma-specific class was isolated from one patient, and the lineage-specific class was isolated from the other patient, indicating that different patients can have markedly different responses to the same immunization protocol. The procedures described here can be used to screen the antibody repertoire of any person with cancer, providing access to an enormous untapped pool of human monoclonal anti-tumor antibodies with clinical and research potential.
Abstract:
Peripheral blood leukocytes incubated with a semisynthetic phage antibody library and fluorochrome-labeled CD3 and CD20 antibodies were used to isolate human single-chain Fv antibodies specific for subsets of blood leukocytes by flow cytometry. Isolated phage antibodies showed exclusive binding to the subpopulation used for selection or displayed additional binding to a restricted population of other cells in the mixture. At least two phage antibodies appeared to display hitherto-unknown staining patterns of B-lineage cells. This approach provides a subtractive procedure to rapidly obtain human antibodies against known and novel surface antigens in their native configuration, expressed on phenotypically defined subpopulations of cells. This approach does not depend on immunization procedures or the necessity to repeatedly construct phage antibody libraries.
Abstract:
May 1990.
Abstract:
Mode of access: Internet.
A study of load support and other criteria appropriate to the selection of industrial conveyor belts
Abstract:
A study of conveying practice demonstrates that belt conveyors provide a versatile and much-used method of transporting bulk materials, but a review of belting manufacturers' design procedures shows that belt design and selection rules are often based on experience with all-cotton belts no longer in common use, and are not completely relevant to modern synthetic constructions. In particular, provision of the property "load support", which was not critical with cotton belts, is shown to determine the outcome of most belt selection exercises and to lead to gross over-specification of other design properties in many cases. The results of an original experimental investigation into this property, carried out to determine the belt and conveyor parameters that affect it, show the major role that belt stiffness plays in its provision; the basis for a belt stiffness test relevant to service conditions is given. A proposal for a more rational method of specifying load support data results from the work, but correlation of the test results with service performance is necessary before the absolute load support capability required from a belt for given working conditions can be quantified. A study to attain this correlation is the major proposal for future work resulting from the present investigation, but a full review of the literature on conveyor design and a study of present practice within the belting industry demonstrate other, less critical, factors that could profitably be investigated. It is suggested that the most suitable method of studying these would be a rational data collection system to provide information on various facets of belt service behaviour; a basis for such a system is proposed. In addition to the work above, proposals for simplifying the present belt selection methods are made, and a strain transducer suitable for use in future experimental investigations is developed.
Abstract:
Businesses are seen as the next stage in delivering biodiversity improvements linked to local and UK Biodiversity Action Plans. Global discussion of biodiversity continues to grow, with the Millennium Ecosystem Assessment, updates to the Convention on Biological Diversity and The Economics of Ecosystems and Biodiversity being published during the time of this project. These publications and others detail the importance of biodiversity protection and also the lack of strategies to deliver this at an operational level. Pressure on UK landholding businesses is combined with significant business opportunities associated with biodiversity engagement. However, the measurement and reporting of biodiversity by business is currently limited by the complexity of the term and the lack of suitable procedures for the selection of metrics. Literature reviews identified confusion surrounding biodiversity as a term, and limited academic literature regarding business and the choice of biodiversity indicators. The aim of the project was to develop a methodology to enable companies to identify, quantify and monitor biodiversity. Research case-study interviews were undertaken with 10 collaborating organisations, selected to represent 'best practice' examples and various situations. Information gained through the case studies was combined with that from the existing literature and used to develop a methodology for the selection of biodiversity indicators for company landholdings. The indicator selection methodology was discussed during a second stage of case-study interviews with 4 collaborating companies. The information and opinions gained during this research were used to modify the methodology and provide the final biodiversity indicator selection methodology. The methodology was then tested through implementation at a mineral extraction site operated by a multi-national aggregates company.
The methodology was found to be a suitable process for implementing global and national systems and conceptual frameworks at the practitioner scale. Further testing of its robustness by independent parties is recommended to improve the system.
Abstract:
A model of multiple criteria decision making is presented for selecting the “best” of a finite number of alternatives. Techniques of scoring the alternatives and weighting the criteria are combined with different evaluating procedures and amalgamated in an interactive algorithm. Application of this method for choosing the best tender in a competitive bidding is discussed and a case is presented in some detail.
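The scoring-and-weighting step of such a tender selection can be sketched as follows. Unlike the previous MCDM example, raw criterion values here are first normalised to [0, 1], with price treated as a cost criterion; all tenders, figures, and weights are hypothetical, and the interactive re-weighting loop of the actual algorithm is omitted:

```python
TENDERS = {                 # (price in k-euro, quality score, delivery weeks)
    "tender A": (120, 8.0, 10),
    "tender B": (100, 6.5, 8),
    "tender C": (140, 9.0, 12),
}
WEIGHTS = {"price": 0.4, "quality": 0.4, "delivery": 0.2}

def normalise(values, cost=False):
    """Linear rescaling to [0, 1]; cost criteria are inverted (lower = better)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(hi - v) / span if cost else (v - lo) / span for v in values]

names = list(TENDERS)
price = normalise([TENDERS[n][0] for n in names], cost=True)
quality = normalise([TENDERS[n][1] for n in names])
delivery = normalise([TENDERS[n][2] for n in names], cost=True)

scores = {n: WEIGHTS["price"] * p + WEIGHTS["quality"] * q + WEIGHTS["delivery"] * d
          for n, p, q, d in zip(names, price, quality, delivery)}
best = max(scores, key=scores.get)
print(best, scores)
```

In an interactive algorithm like the one the abstract describes, the decision-maker would inspect this ranking, adjust the weights, and repeat until the result is acceptable.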
Abstract:
A decision-maker, when faced with a limited and fixed budget to collect data in support of a multiple attribute selection decision, must decide how many samples to observe from each alternative and attribute. This allocation decision is of particular importance when the information gained leads to uncertain estimates of the attribute values as with sample data collected from observations such as measurements, experimental evaluations, or simulation runs. For example, when the U.S. Department of Homeland Security must decide upon a radiation detection system to acquire, a number of performance attributes are of interest and must be measured in order to characterize each of the considered systems. We identified and evaluated several approaches to incorporate the uncertainty in the attribute value estimates into a normative model for a multiple attribute selection decision. Assuming an additive multiple attribute value model, we demonstrated the idea of propagating the attribute value uncertainty and describing the decision values for each alternative as probability distributions. These distributions were used to select an alternative. With the goal of maximizing the probability of correct selection we developed and evaluated, under several different sets of assumptions, procedures to allocate the fixed experimental budget across the multiple attributes and alternatives. Through a series of simulation studies, we compared the performance of these allocation procedures to the simple, but common, allocation procedure that distributed the sample budget equally across the alternatives and attributes. We found the allocation procedures that were developed based on the inclusion of decision-maker knowledge, such as knowledge of the decision model, outperformed those that neglected such information. 
Beginning with general knowledge of the attribute values provided by Bayesian prior distributions, and updating this knowledge with each observed sample, the sequential allocation procedure performed particularly well. These observations demonstrate that managing projects focused on a selection decision so that the decision modeling and the experimental planning are done jointly, rather than in isolation, can improve the overall selection results.
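The core idea of propagating attribute-value uncertainty into a probability of correct selection can be sketched with a small Monte Carlo study. This is a simplified illustration under stated assumptions (two alternatives, two attributes, known Gaussian measurement noise, hypothetical means, weights, and sample budgets), not the procedures developed in the work:

```python
import random

random.seed(0)

WEIGHTS = (0.9, 0.1)                 # additive value-model weights (hypothetical)
TRUE_MEANS = {"A": (0.70, 0.50), "B": (0.65, 0.60)}
NOISE_SD = 0.1                       # std. dev. of one noisy observation

def true_value(alt):
    return sum(w * m for w, m in zip(WEIGHTS, TRUE_MEANS[alt]))

def sampled_value(alt, n_per_attr):
    """Additive value computed from noisy attribute-sample means."""
    estimates = [sum(random.gauss(m, NOISE_SD) for _ in range(n)) / n
                 for m, n in zip(TRUE_MEANS[alt], n_per_attr)]
    return sum(w * e for w, e in zip(WEIGHTS, estimates))

def prob_correct_selection(alloc, trials=10000):
    """Monte Carlo estimate of P(the sampled winner is the true winner)."""
    best = max(TRUE_MEANS, key=true_value)
    hits = sum(
        max(TRUE_MEANS, key=lambda a: sampled_value(a, alloc[a])) == best
        for _ in range(trials))
    return hits / trials

equal    = {"A": (5, 5), "B": (5, 5)}   # budget of 10 per alternative, split evenly
weighted = {"A": (8, 2), "B": (8, 2)}   # more samples on the heavily weighted attribute

p_equal = prob_correct_selection(equal)
p_weighted = prob_correct_selection(weighted)
print(p_equal, p_weighted)
```

Because the first attribute dominates the value model, spending more of the fixed budget on it reduces the variance of the decision values and raises the probability of correct selection, mirroring the abstract's finding that allocations using decision-model knowledge beat the equal split.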
Abstract:
Agricultural crops can be damaged by fungi, insects, worms and other organisms that cause diseases and decrease production yield. The effect of these damaging agents can be reduced using pesticides. Among them, triazole compounds are effective substances against fungi, for example Oidium. Nevertheless, it has been detected that the residues of these fungicides in foods, as well as in derived products, can affect the health of consumers. Therefore, the European Union has established several regulations fixing maximum pesticide residue levels in a wide range of foods in order to assure consumer safety. Hence, it is very important to develop adequate methods to determine these pesticide compounds. In most cases, gas or liquid chromatographic (GC, LC) separations are used in the analysis of the samples. But first, it is necessary to use proper sample treatments in order to preconcentrate and isolate the target analytes. To reach this aim, microextraction techniques are very effective tools, because they allow both preconcentration and extraction of the analytes in one simple step, which considerably reduces the sources of error. With these objectives, two remarkable techniques have been widely used during the last years: solid phase microextraction (SPME) and liquid phase microextraction (LPME), with its different variants. Both techniques, which avoid the use of or reduce the amount of toxic solvents, are conveniently coupled to chromatographic equipment, providing good quantitative results for a wide number of matrices and compounds. In this work, simple and reliable methods have been developed using SPME and ultrasound-assisted emulsification microextraction (USAEME) coupled to GC or LC for triazole fungicide determination. The proposed methods allow triazole concentrations on the order of μg L−1 to be confidently determined in different fruit samples. Chemometric tools have been used to accomplish successful determinations.
Firstly, in the selection and optimization of the variables involved in the microextraction processes; and secondly, to overcome the problems related to overlapping peaks. Different fractional factorial designs have been used for the screening of the experimental variables, and central composite designs have been carried out to obtain the best experimental conditions. To solve the overlapping-peak problems, multivariate calibration methods have been used. Parallel Factor Analysis 2 (PARAFAC2), Multivariate Curve Resolution (MCR) and Parallel Factor Analysis with Linear Dependencies (PARALIND) have been proposed; the adequate algorithms have been used according to the data characteristics, and the results have been compared. Because of their occurrence in the Basque Country and their relevance in the production of cider and txakoli regional wines, grape and apple samples were selected. These crops are often treated with triazole compounds to control the problems caused by fungi. The peel and pulp from grape and apple, their juices, and some commercial products such as musts, juice and cider have been analysed, showing the adequacy of the developed methods for triazole determination in this kind of fruit sample.
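The screening step with a two-level fractional factorial design can be illustrated generically. The sketch below generates a 2^(4-1) design in coded levels (-1/+1), aliasing the fourth factor to the three-way interaction; the choice of generator and the number of factors are illustrative, not those used in the thesis:

```python
from itertools import product

def fractional_factorial_2_4_1():
    """2^(4-1) screening design: full 2^3 design in factors A, B, C,
    with the fourth factor D aliased to the ABC interaction (D = A*B*C)."""
    runs = []
    for a, b, c in product((-1, 1), repeat=3):
        runs.append((a, b, c, a * b * c))
    return runs

design = fractional_factorial_2_4_1()
# 8 runs instead of the 16 a full 2^4 design would need.
for run in design:
    print(run)
```

Halving the run count is what makes such designs attractive for screening many microextraction variables; the cost is that the main effect of D is confounded with the ABC interaction, which is usually acceptable at the screening stage.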
Abstract:
Sequential panel selection methods (spsms — procedures that sequentially use conventional panel unit root tests to identify I(0) time series in panels) are increasingly used in the empirical literature. We check the reliability of spsms by using Monte Carlo simulations based on generating directly the individual asymptotic p values to be combined into the panel unit root tests, in this way isolating the classification abilities of the procedures from the small-sample properties of the underlying univariate unit root tests. The simulations consider both independent and cross-dependent individual test statistics. Results suggest that spsms may offer advantages over time series tests only under special conditions.
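One common ingredient of such panel tests, combining the members' individual p values into a single panel statistic, is the Fisher-type combination, which can be sketched directly. The p values below are illustrative, and the spsm loop (drop the series with the smallest p value, retest the remainder) is only described, not implemented:

```python
import math

def fisher_panel_statistic(p_values):
    """Fisher combination P = -2 * sum(ln p_i); under the joint null of a
    unit root in every series it is chi-square with 2N degrees of freedom."""
    return -2.0 * sum(math.log(p) for p in p_values)

p_values = [0.01, 0.20, 0.65, 0.04, 0.30]   # illustrative individual p values
stat = fisher_panel_statistic(p_values)

# 5% critical value of the chi-square distribution with 2*5 = 10 d.o.f.
CRIT_5PCT_DF10 = 18.307
reject_joint_null = stat > CRIT_5PCT_DF10
print(stat, reject_joint_null)
```

In an spsm, a rejection like this one would trigger the sequential step: classify the series with the smallest individual p value as I(0), remove it, and rerun the panel test on the remaining series until the joint null is no longer rejected.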
Abstract:
The 2007 Indigent Defense Act provides that each county must elect its representative(s) from the active licensed attorneys who reside within each county to serve on its Circuit Public Defender Selection Panel. The procedures included in this document have been adopted by the Commission on Indigent Defense for the Election of the Circuit Public Defender Selection Panels and the Nomination of Circuit Public Defenders.