44 results for Selection Analysis


Relevance:

30.00%

Publisher:

Abstract:

Artifact selection decisions typically involve choosing one option from a number of candidates (decision alternatives). In order to support such decisions, it is important to identify and recognize relevant key issues of problem solving and decision making (Albers, 1996; Harris, 1998a, 1998b; Jacobs & Holten, 1995; Loch & Conger, 1996; Rumble, 1991; Sauter, 1999; Simon, 1986). Sauter classifies four problem-solving/decision-making styles: (1) left-brain style, (2) right-brain style, (3) accommodating, and (4) integrated (Sauter, 1999). The left-brain style employs analytical and quantitative techniques and relies on rational and logical reasoning. In an effort to achieve predictability and minimize uncertainty, problems are explicitly defined, solution methods are determined, orderly information searches are conducted, and analysis is increasingly refined. Left-brain style decision making works best when it is possible to predict/control, measure, and quantify all relevant variables, and when information is complete. In direct contrast, right-brain style decision making is based on intuitive techniques and places more emphasis on feelings than facts. Accommodating decision makers use their non-dominant style when they realize that it will work best in a given situation. Lastly, integrated style decision makers are able to combine the left- and right-brain styles: they use analytical processes to filter information and intuition to contend with uncertainty and complexity.

Relevance:

30.00%

Publisher:

Abstract:

The existing method of pipeline monitoring, which requires an entire pipeline to be inspected periodically, wastes time and is expensive. A risk-based model that reduces the amount of time spent on inspection has been developed. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction method and logical insurance plans. The risk-based model uses the analytic hierarchy process, a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and to analyze their effects by determining the probabilities of the risk factors. The severity of failure is determined through consequence analysis, which establishes the effect of a failure in terms of the cost caused by each risk factor and determines the cumulative effect of failure through probability analysis.
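
The analytic hierarchy process step can be illustrated with a short sketch (a generic illustration, not the paper's model): pairwise comparisons of the risk factors affecting a pipeline segment are collected in a matrix, the principal eigenvector of that matrix gives the factor weights, and a consistency ratio checks the coherence of the judgements. The factors and Saaty-scale judgement values below are invented for illustration.

```python
import numpy as np

# Illustrative pairwise comparison matrix for three hypothetical risk factors
# (e.g. corrosion, third-party damage, operational error); Saaty-scale values,
# not data from the paper.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector of A gives the priority weights of the risk factors.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), compared with the random
# index RI for n = 3; a consistency ratio below 0.10 is conventionally acceptable.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.58  # RI for a 3x3 matrix

print("factor weights:", np.round(weights, 3))
print("consistency ratio:", round(CR, 3))
```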

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to investigate the effects of circularity, comorbidity, prevalence and presentation variation on the accuracy of differential diagnoses made in optometric primary care using a modified form of naïve Bayesian sequential analysis. No such investigation has been reported before. Data were collected for 1422 cases seen over one year. Positive test outcomes were recorded for case history (ethnicity, age, symptoms and ocular and medical history) and clinical signs in relation to each diagnosis. For this reason, only positive likelihood ratios were used in this modified form of Bayesian analysis, which was carried out with Laplacian correction and Chi-square filtration. Accuracy was expressed as the percentage of cases for which the diagnoses made by the clinician appeared at the top of a list generated by the Bayesian analysis. Preliminary analyses were carried out on 10 diagnoses and 15 test outcomes. Accuracy of 100% was achieved in the absence of presentation variation but dropped by 6% when variation existed. Circularity artificially elevated accuracy by 0.5%. Surprisingly, removal of Chi-square filtering increased accuracy by 0.4%. Decision tree analysis showed that accuracy was influenced primarily by prevalence, followed by presentation variation and comorbidity. Analysis of 35 diagnoses and 105 test outcomes followed. This explored the use of positive likelihood ratios, derived from the case history, to recommend signs to look for. Accuracy of 72% was achieved when all clinical signs were entered. The drop in accuracy, compared to the preliminary analysis, was attributed to the fact that some diagnoses lacked strong diagnostic signs; accuracy increased by 1% when only recommended signs were entered. Chi-square filtering improved recommended test selection. Decision tree analysis showed that accuracy was again influenced primarily by prevalence, followed by comorbidity and presentation variation. Future work will explore the use of likelihood ratios based on positive and negative test findings prior to considering naïve Bayesian analysis as a form of artificial intelligence in optometric practice.
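
A minimal sketch of this kind of likelihood-ratio scoring (not the authors' implementation) is shown below: each diagnosis starts with prior odds derived from its prevalence, the odds are multiplied by the positive likelihood ratio of every positive finding, and diagnoses are ranked by posterior probability. The diagnoses, prevalences and likelihood ratios are invented for illustration.

```python
# Toy ranking of diagnoses from positive findings using positive likelihood ratios.
# Prevalences and LR+ values are invented, not taken from the study.
prevalence = {"dry eye": 0.15, "glaucoma suspect": 0.03, "cataract": 0.10}

lr_positive = {
    "dry eye":          {"burning symptoms": 6.0, "reduced tear break-up time": 8.0},
    "glaucoma suspect": {"raised IOP": 12.0, "enlarged cup-disc ratio": 9.0},
    "cataract":         {"gradual blur": 4.0, "lens opacity": 20.0},
}

def rank_diagnoses(positive_findings):
    """Return diagnoses sorted by posterior probability given the positive findings."""
    posteriors = {}
    for dx, prev in prevalence.items():
        odds = prev / (1.0 - prev)                      # prior odds from prevalence
        for finding in positive_findings:
            odds *= lr_positive[dx].get(finding, 1.0)   # multiply the LR+ of each positive test
        posteriors[dx] = odds / (1.0 + odds)            # convert odds back to a probability
    return sorted(posteriors.items(), key=lambda kv: kv[1], reverse=True)

print(rank_diagnoses(["gradual blur", "lens opacity"]))
```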

Relevance:

30.00%

Publisher:

Abstract:

To solve multi-objective problems, multiple reward signals are often scalarized into a single value and further processed using established single-objective problem-solving techniques. While the field of multi-objective optimization has made many advances in applying scalarization techniques to obtain good solution trade-offs, the utility of applying these techniques in the multi-objective multi-agent learning domain has not yet been thoroughly investigated. Agents can learn the value of their decisions by linearly scalarizing their reward signals at the local level, while still producing acceptable system-wide behaviour. However, the non-linear relationship between the weighting parameters of the scalarization function and the learned policy makes the discovery of system-wide trade-offs time-consuming. Our first contribution is a thorough analysis of well-known scalarization schemes within the multi-objective multi-agent reinforcement learning setup. The analysed approaches intelligently explore the weight space in order to find a wider range of system trade-offs. In our second contribution, we propose a novel adaptive-weight algorithm which interacts with the underlying local multi-objective solvers and allows for better coverage of the Pareto front. Our third contribution is the experimental validation of our approach by learning bi-objective policies in self-organising smart camera networks. We note that our algorithm (i) explores the objective space faster on many problem instances, (ii) obtains solutions that exhibit a larger hypervolume, and (iii) achieves a greater spread in the objective space.
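
Linear scalarization itself is simple, and a toy weight sweep shows why discovering trade-offs is awkward (a generic sketch, not the adaptive-weight algorithm proposed in the paper; all policies and returns are invented):

```python
import numpy as np

def scalarize(reward_vector, weights):
    """Linear scalarization: collapse a vector reward into a single value."""
    return float(np.dot(weights, reward_vector))

# Toy bi-objective problem: each candidate policy is summarised by its expected
# return on the two objectives (values are invented).
candidate_returns = {
    "policy_a": np.array([0.90, 0.20]),
    "policy_b": np.array([0.60, 0.60]),
    "policy_c": np.array([0.10, 0.95]),
}

# Sweeping the weight of objective 1 selects different policies, i.e. different
# points of the trade-off front; because the mapping from weights to the selected
# policy is non-linear, a uniform sweep can waste many evaluations on one region.
for w1 in np.linspace(0.0, 1.0, 5):
    w = np.array([w1, 1.0 - w1])
    best = max(candidate_returns, key=lambda p: scalarize(candidate_returns[p], w))
    print(f"weights {w.round(2)} -> {best}")
```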

Relevance:

30.00%

Publisher:

Abstract:

Strategic sourcing plays an important role in organisations' performance. It has been researched extensively through empirical studies as well as review work, covering its importance, issues and challenges, processes, source selection criteria and frameworks. However, there is no research on critical success factors for strategic sourcing specific to an industry and country. This research aims to qualitatively evaluate and understand the current role of strategic sourcing, the critical success factors for business performance and their relationship with strategic sourcing, and strategic supplier evaluation criteria from multiple stakeholders' perspectives, specific to industry and country. The research studies twenty organisations from Germany and the United Kingdom (UK) covering two industry sectors - electronics manufacturing and construction - with five organisations from each industry sector in each country. The findings from the twenty case studies provide a comparative analysis of the strategic sourcing practices of the two countries and two industries.

Relevance:

30.00%

Publisher:

Abstract:

Battery energy storage systems have traditionally been manufactured from new batteries with good reliability. The high cost of such systems has led to investigations into using second-life transportation batteries to provide an alternative energy storage capability. However, the reliability and performance of these batteries are unclear, and multi-modular power electronics with redundancy have been suggested as a means of addressing this issue. This paper reviews work already undertaken on battery failure rates to suggest suitable figures for use in reliability calculations. The paper then uses reliability analysis and a numerical example to investigate six different multi-modular topologies and to suggest how the number of series battery strings and the degree of power electronic module redundancy should be determined for the lowest hardware cost. The results reveal that the cascaded dc-side modular topology with a single inverter is the lowest-cost solution for a range of battery failure rates.
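
The type of reliability calculation involved can be sketched generically (this is not the paper's cost model): with a constant failure rate λ, one module survives a mission time t with probability exp(-λt), and a bank of N identical modules that needs at least K survivors follows the k-out-of-N binomial expression. The failure rate, mission time and module counts below are illustrative only.

```python
import math

def module_reliability(failure_rate_per_hour, hours):
    """Reliability of one module under a constant failure rate (exponential model)."""
    return math.exp(-failure_rate_per_hour * hours)

def k_out_of_n_reliability(n_modules, k_required, r_module):
    """Probability that at least k_required of n independent, identical modules survive."""
    return sum(
        math.comb(n_modules, k) * r_module**k * (1.0 - r_module)**(n_modules - k)
        for k in range(k_required, n_modules + 1)
    )

# Illustrative numbers only: 1e-6 failures/hour per module, 10 years of operation,
# 6 modules installed where 5 are needed (one redundant module).
r = module_reliability(1e-6, 10 * 8760)
print("module reliability:", round(r, 4))
print("5-out-of-6 system reliability:", round(k_out_of_n_reliability(6, 5, r), 4))
```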

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis is to answer the question: "How are informal decisions reached by screeners when filtering out undesirable job applications?" Grounded theory techniques were employed in the field to observe and analyse informal decisions at the source, as made by screeners, in three distinct empirical studies. Whilst grounded theory provided the method for case and cross-case analysis, literature from academic and non-academic sources was evaluated and integrated to strengthen this research and create a foundation for understanding informal decisions. As informal decisions in early hiring processes have been under-researched, this thesis contributes to current knowledge in several ways. First, it locates the Cycle of Employment, which enhances Robertson and Smith's (1993) Selection Paradigm through the integration of the stages that individuals occupy whilst seeking employment. Secondly, a general depiction of the Workflow of General Hiring Processes provides a template for practitioners to map and further develop their organisational processes. Finally, it highlights the emergence of the Locality Effect, a geographically driven heuristic and bias that can significantly affect recruitment and informal decisions. Although screeners make informal decisions using multiple variables, informal decisions are made in stages, as evidenced in the Cycle of Employment. Moreover, informal decisions can be erroneous as a result of majority and minority influence, the weighting of information, the injection of inappropriate information and criteria, and the influence of an assessor. This thesis considers these faults and develops a basic framework for understanding informal decisions from which future research can be launched.

Relevance:

30.00%

Publisher:

Abstract:

The human immunodeficiency virus type-1 (HIV-1) genome contains multiple, highly conserved structural RNA domains that play key roles in essential viral processes. Interference with the function of these RNA domains, either by disrupting their structures or by blocking their interaction with viral or cellular factors, may seriously compromise HIV-1 viability. RNA aptamers are amongst the most promising synthetic molecules able to interact with structural domains of viral genomes. However, shortening aptamers to their minimal active domain is usually necessary for scaling up production, which requires very time-consuming, trial-and-error approaches. Here we report on the in vitro selection of 64 nt-long specific aptamers against the complete 5'-untranslated region of the HIV-1 genome, which inhibit more than 75% of HIV-1 production in a human cell line. Analysis of the selected sequences and structures allowed for the identification of a highly conserved 16 nt-long stem-loop motif containing a common 8 nt-long apical loop. Based on this result, an in silico designed 16 nt-long RNA aptamer, termed RNApt16, was synthesized, with the sequence 5'-CCCCGGCAAGGAGGGG-3'. The HIV-1 inhibition efficiency of this aptamer was close to 85%, making it the shortest RNA molecule so far described that efficiently interferes with HIV-1 replication.

Relevance:

30.00%

Publisher:

Abstract:

Background: During the last decade the use of ECG recordings in biometric recognition studies has increased. The characteristics of the ECG make it suitable for subject identification: it is unique, present in all living individuals, and hard to forge. However, in spite of the great number of approaches found in the literature, no agreement exists on the most appropriate methodology. This study aimed at providing a survey of the techniques used so far in ECG-based human identification. Specifically, a pattern-recognition perspective is proposed here, providing a unifying framework in which to appreciate previous studies and, hopefully, guide future research. Methods: We searched for papers on the subject from the earliest available date using relevant electronic databases (Medline, IEEEXplore, Scopus, and Web of Knowledge). The following terms were used in different combinations: electrocardiogram, ECG, human identification, biometric, authentication and individual variability. The electronic sources were last searched on 1st March 2015. Our selection included published research in peer-reviewed journals, book chapters and conference proceedings. The search was restricted to English-language documents. Results: 100 pertinent papers were found. The number of subjects involved in the journal studies ranges from 10 to 502, ages range from 16 to 86, and male and female subjects are generally present. The number of analysed leads varies, as do the recording conditions. Identification performance differs widely, as does the verification rate. Many studies refer to publicly available databases (the Physionet ECG databases repository) while others rely on proprietary recordings, making them difficult to compare. As a measure of overall accuracy we computed a weighted average of the identification rate and of the equal error rate in authentication scenarios. The identification rate was 94.95% and the equal error rate 0.92%. Conclusions: Biometric recognition is a mature field of research. Nevertheless, the use of physiological signal features, such as ECG traits, needs further improvement. ECG features have the potential to be used in daily activities such as access control and patient handling, as well as in wearable electronics applications. However, some barriers still limit its growth. Further analysis should address the use of single-lead recordings and the study of features that do not depend on the recording sites (e.g. fingers, hand palms). Moreover, it is expected that new techniques will be developed using fiducial and non-fiducial based features in order to capture the best of both approaches. ECG recognition in pathological subjects is also worthy of additional investigation.
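
The overall accuracy figures quoted above are weighted averages across studies; a minimal sketch of that aggregation, assuming each study is weighted by its number of subjects (the study sizes and rates below are invented), is:

```python
# Weighted average of identification rates reported by individual studies,
# weighting each study by its number of subjects (figures are invented).
studies = [
    {"subjects": 20,  "identification_rate": 0.98},
    {"subjects": 90,  "identification_rate": 0.96},
    {"subjects": 502, "identification_rate": 0.94},
]

total_subjects = sum(s["subjects"] for s in studies)
weighted_rate = sum(s["subjects"] * s["identification_rate"] for s in studies) / total_subjects
print(f"weighted identification rate: {100 * weighted_rate:.2f} %")
```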

Relevance:

30.00%

Publisher:

Abstract:

The hepatitis C virus (HCV) is able to persist as a chronic infection, which can lead to cirrhosis and liver cancer. There is evidence that clearance of HCV is linked to strong responses by CD8 cytotoxic T lymphocytes (CTLs), suggesting that eliciting CTL responses against HCV through an epitope-based vaccine could prove an effective means of immunization. However, HCV genomic plasticity, as well as the polymorphisms of the HLA I molecules restricting CD8 T-cell responses, challenges the selection of epitopes for a widely protective vaccine. Here, we devised an approach to overcome these limitations. From available databases, we first collected a set of 245 HCV-specific CD8 T-cell epitopes, all known to be targeted in the course of natural infection in humans. After a sequence variability analysis, we next identified 17 highly invariant epitopes. Subsequently, we predicted the epitope HLA I binding profiles that determine their potential presentation and recognition. Finally, using the relevant HLA I genetic frequencies, we identified various epitope subsets, each encompassing 6 conserved HCV-specific CTL epitopes predicted to elicit an effective T-cell response in any individual regardless of their HLA I background. We implemented this epitope selection approach for free public use at the EPISOPT web server. © 2013 Magdalena Molero-Abraham et al.
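
The final selection step can be illustrated with a generic greedy sketch (not the EPISOPT algorithm itself): each conserved epitope is annotated with the HLA I alleles predicted to present it, carrier frequencies estimate the fraction of the population each allele covers, and epitopes are added while coverage improves. The epitope names, binding profiles and frequencies below are invented, and treating allele carrier frequencies as independent is a deliberate simplification.

```python
# Greedy illustration of choosing epitopes to maximise HLA I population coverage.
hla_carrier_freq = {"A*02:01": 0.45, "A*01:01": 0.25, "B*07:02": 0.20, "B*08:01": 0.15}

epitope_binders = {
    "epitope_1": {"A*02:01"},
    "epitope_2": {"A*01:01", "B*08:01"},
    "epitope_3": {"B*07:02"},
}

def coverage(alleles):
    """Fraction of the population carrying at least one of the given alleles
    (assumes independent carrier frequencies)."""
    miss = 1.0
    for allele in alleles:
        miss *= 1.0 - hla_carrier_freq[allele]
    return 1.0 - miss

selected, covered = [], set()
while len(selected) < len(epitope_binders):
    best = max(
        (e for e in epitope_binders if e not in selected),
        key=lambda e: coverage(covered | epitope_binders[e]),
    )
    if coverage(covered | epitope_binders[best]) <= coverage(covered):
        break  # no remaining epitope improves coverage
    selected.append(best)
    covered |= epitope_binders[best]

print("selected epitopes:", selected, "estimated coverage:", round(coverage(covered), 3))
```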

Relevance:

30.00%

Publisher:

Abstract:

Selection in privatization is a decision-making process of choosing state-owned enterprises (SOEs), prioritizing and sequencing privatizing events, and determining the extent of private ownership in partial privatization. We investigate this process in the important but rarely studied case of China. Based on the SOE population over 1998-2008, we track 49,456 wholly state-owned firms and identify 9,359 privatization cases over time. Our econometric analysis concludes that: (i) privatization selection is a complex decision-making process in which local governments balance various economic, financial and political objectives; (ii) in the recent Chinese privatization, firm performance relates to the selection, staging and sequencing of privatization in an inverted-U fashion. The worst- and best-performing SOEs are more likely to remain state-owned, maintain higher state holdings when privatized, and are less likely to be privatized later in time. These patterns suggest a slowdown of the privatization reform and underlying changes in the privatization policy.
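
The inverted-U pattern can be made concrete with a stylized sketch: if the probability of privatization is modelled as a logistic function that is quadratic in standardized firm performance, a negative quadratic coefficient makes privatization most likely for mid-performing firms. The coefficients below are invented purely to illustrate the shape and are not estimates from the study.

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# Invented coefficients: a negative quadratic term produces an inverted-U
# relationship between performance and the probability of privatization.
b0, b1, b2 = -1.0, 0.0, -1.5

for perf in [-2.0, -1.0, 0.0, 1.0, 2.0]:
    p = logistic(b0 + b1 * perf + b2 * perf**2)
    print(f"standardized performance {perf:+.1f} -> privatization probability {p:.2f}")
```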

Relevance:

30.00%

Publisher:

Abstract:

Transaction cost theory is one of the most widely used theories in marketing, management, and economics. The focus of the theory is on explaining how firms organize transactions. The set of rules by which transactions are organized is called governance. A wide variety of strategic decisions of firms, such as outsourcing, the mode of organizing exports, the use of crowdsourcing, or partner selection efforts, can be analyzed and understood using transaction cost theory. The basic argument of transaction cost theory is that firms economize on costs by choosing a form of governance that minimizes production and transaction costs. We discuss the origins and uses of the theory, its critical variables, assumptions, and limitations.

Relevance:

30.00%

Publisher:

Abstract:

Quantitative analysis of solid-state processes from isothermal microcalorimetric data is straightforward if data for the total process have been recorded, and problematic (the more likely case) when they have not. Data are usually plotted as a function of the fraction reacted (α); for calorimetric data, this requires knowledge of the total heat change (Q) upon completion of the process. Determination of Q is difficult in cases where the process is fast (initial data missing) or slow (final data missing). Here we introduce several mathematical methods that allow the direct calculation of Q by selection of data points when only partial data are present, based on analysis with the Pérez-Maqueda model. In addition, all methods allow direct determination of the reaction mechanism descriptors m and n and, from these, the rate constant k. The validity of the methods is tested with the use of simulated calorimetric data, and we introduce a graphical method for generating solid-state power-time data. The methods are then applied to the crystallization of indomethacin from a glass. All methods correctly recovered the total reaction enthalpy (16.6 J) and suggested that the crystallization followed an Avrami model. The rate constants for crystallization were determined to be 3.98 × 10⁻⁶, 4.13 × 10⁻⁶, and 3.98 × 10⁻⁶ s⁻¹ with methods 1, 2, and 3, respectively. © 2010 American Chemical Society.
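
A generic sketch of this kind of fit is given below, under two stated assumptions: simulated rather than experimental data, and an off-the-shelf least-squares solver (scipy's curve_fit); it is not a reproduction of the paper's three methods. With the Pérez-Maqueda form f(α) = α^m (1-α)^n and α = q/Q, the power signal can be written as Φ = Q·k·(q/Q)^m·(1-q/Q)^n, so Q, k, m and n can be estimated directly from partial power versus cumulative-heat data.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_model(q, Q, k, m, n):
    """Calorimetric power as a function of cumulative heat q, for f(alpha) = alpha^m (1-alpha)^n."""
    alpha = np.clip(q / Q, 1e-9, 1 - 1e-9)
    return Q * k * alpha**m * (1 - alpha)**n

# Simulate partial data (middle of the process only); 'true' values are loosely
# based on the figures quoted above, the exponents are invented.
Q_true, k_true, m_true, n_true = 16.6, 4.0e-6, 0.7, 1.2            # J, 1/s, dimensionless
q_obs = np.linspace(0.2 * Q_true, 0.8 * Q_true, 60)                 # initial and final data 'missing'
phi_obs = power_model(q_obs, Q_true, k_true, m_true, n_true)
phi_obs += np.random.default_rng(0).normal(0.0, 1e-7, q_obs.size)   # measurement noise

# Fit Q, k, m and n directly from the partial data.
popt, _ = curve_fit(power_model, q_obs, phi_obs, p0=[20.0, 1e-5, 1.0, 1.0], maxfev=20000)
print("estimated Q (J), k (1/s), m, n:", np.round(popt, 7))
```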

Relevance:

30.00%

Publisher:

Abstract:

To compare the accuracy of different forecasting approaches, an error measure is required. Many error measures have been proposed in the literature; however, in practice there are situations where different measures yield different decisions on forecasting approach selection, and there is no agreement on which should be used. Generally, forecasting error measures are ratios or percentages that provide an overall picture of how well the forecasting technique fits the observations. This paper proposes a multiplicative Data Envelopment Analysis (DEA) model in order to rank several forecasting techniques. We demonstrate the proposed model by applying it to the set of yearly time series of the M3 competition. The usefulness of the proposed approach has been tested using the M3 competition, where five error measures were applied and aggregated into a single DEA score.
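
The motivating problem, that different error measures can rank forecasting approaches differently, is easy to see on a toy example (the series and forecasts below are invented, and the measures shown are MAE, RMSE and MAPE rather than the five measures used in the paper):

```python
import numpy as np

def mae(actual, forecast):
    return np.mean(np.abs(actual - forecast))

def rmse(actual, forecast):
    return np.sqrt(np.mean((actual - forecast) ** 2))

def mape(actual, forecast):
    return 100 * np.mean(np.abs((actual - forecast) / actual))

# Invented series and two competing forecasts: method A makes several small errors,
# method B is mostly exact but makes one large error on a small observation.
actual     = np.array([100.0, 110.0, 120.0,  10.0, 140.0])
forecast_a = np.array([ 95.0, 115.0, 114.0,  11.0, 133.0])
forecast_b = np.array([100.0, 110.0, 120.0,  22.0, 140.0])

for name, f in [("A", forecast_a), ("B", forecast_b)]:
    print(name, "MAE", round(mae(actual, f), 2),
          "RMSE", round(rmse(actual, f), 2),
          "MAPE", round(mape(actual, f), 1), "%")
```

Here method B wins on MAE while method A wins on RMSE and MAPE, which is exactly the kind of disagreement the DEA aggregation is meant to resolve.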