937 results for Markov chains, uniformization, inexact methods, relaxed matrix-vector
Abstract:
Data fluctuation across repeated measurements in Laser-Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on a Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance analysis accuracy is to improve the quality and consistency of the emission signal, for example by averaging the spectral signals or standardizing the spectra over a number of laser shots. The proposed method focuses instead on enhancing the robustness of the quantitative-analysis regression model. The RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual-error calculation based on the statistical distribution of the measured spectral data. Through the improved segmented weighting function, the information in spectral data that follow the normal distribution is retained in the regression model, while the information in outliers is down-weighted or removed. Copper concentration analysis experiments were carried out on 16 certified standard brass samples. The average relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness than quantitative analysis methods based on Partial Least Squares (PLS) regression, the standard Support Vector Machine (SVM), and the WLS-SVM. The improved weighting function also showed better overall performance in model robustness and convergence speed than four known weighting functions.
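A minimal sketch of the kind of segmented weighting the abstract describes, using the classical three-segment WLS-SVM weighting of Suykens et al. as a stand-in (the paper's improved function is not specified here); the thresholds c1 and c2, the MAD-based scale estimate, and the floor value are illustrative assumptions:

```python
import numpy as np

def segmented_weights(residuals, c1=2.5, c2=3.0, floor=1e-4):
    """Piecewise (segmented) robust weights for WLS-SVM-style regression.

    Classical scheme: full weight for residuals that look normally
    distributed, a linearly decaying weight in a transition band, and a
    near-zero weight for outliers.  All constants are illustrative
    defaults, not the values from the paper.
    """
    # Robust scale estimate via the median absolute deviation (MAD).
    s = 1.4826 * np.median(np.abs(residuals - np.median(residuals)))
    s = max(s, 1e-12)                       # guard against zero scale
    z = np.abs(residuals) / s
    return np.where(z <= c1, 1.0,
           np.where(z <= c2, (c2 - z) / (c2 - c1), floor))

# The clear outlier (3.5) receives the floor weight.
print(segmented_weights(np.array([0.1, -0.2, 0.05, 3.5])))
```

In a WLS-SVM these weights multiply the squared-error terms of the objective and the model is re-solved until the weights stabilize, so spectra whose residuals fall outside the normal-distribution band contribute little to the final regression.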
Abstract:
This is a review of methodology for the algorithmic study of some useful models in point process and queueing theory, as discussed in three lectures at the Summer Institute at Sozopol, Bulgaria. We provide references to sources where the extensive details of this work are found. For future investigation, some open problems and new methodological approaches are proposed.
Abstract:
MSC subject classification: 65C05, 65U05.
Abstract:
2000 Mathematics Subject Classification: 62H30, 62J20, 62P12, 68T99
Abstract:
Given polynomials f, g ∈ Z[x] of degrees n, m, respectively, with n > m, three new and easy-to-understand methods, along with more efficient variants of the last two, are presented for the computation of their subresultant polynomial remainder sequence (prs). All three methods evaluate a single determinant (subresultant) of an appropriate submatrix of sylvester1, Sylvester's widely known and used matrix of 1840 of dimension (m + n) × (m + n), in order to compute the correct sign of each polynomial in the sequence and, except for the second method, to force its coefficients to become subresultants. Of interest is the fact that only the first method uses pseudo-remainders. The second method uses regular remainders and performs operations in Q[x], whereas the third triangularizes sylvester2, Sylvester's little-known and hardly ever used matrix of 1853 of dimension 2n × 2n. All methods mentioned in this paper (along with their supporting functions) have been implemented in SymPy and can be downloaded from http://inf-server.inf.uth.gr/~akritas/publications/subresultants.py
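As a point of reference, the sketch below builds sylvester1, Sylvester's 1840 matrix, for two polynomials in SymPy and evaluates its determinant, which is the resultant (the last subresultant up to sign); determinants of appropriate submatrices yield the remaining subresultants. This is an independent illustration, not the authors' implementation from the link above:

```python
from sympy import symbols, Matrix, Poly

def sylvester1(f, g, x):
    """Sylvester's 1840 matrix of f (degree n) and g (degree m),
    of dimension (m + n) x (m + n)."""
    fp, gp = Poly(f, x), Poly(g, x)
    n, m = fp.degree(), gp.degree()
    fc, gc = fp.all_coeffs(), gp.all_coeffs()
    rows = []
    for i in range(m):                      # m shifted rows of f's coefficients
        rows.append([0] * i + fc + [0] * (m - 1 - i))
    for i in range(n):                      # n shifted rows of g's coefficients
        rows.append([0] * i + gc + [0] * (n - 1 - i))
    return Matrix(rows)

x = symbols('x')
S = sylvester1(x**3 - 7*x + 7, 3*x**2 - 7, x)
print(S.det())    # determinant of the full matrix = res(f, g)
```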
Abstract:
2000 Mathematics Subject Classification: 65H10.
Abstract:
Background: DNA-binding proteins play a pivotal role in various intra- and extracellular activities, ranging from DNA replication to gene expression control. Identification of DNA-binding proteins is one of the major challenges in genome annotation. Several computational methods have been proposed in the literature for DNA-binding protein identification; however, most of them do not provide a valuable knowledge base for our understanding of DNA-protein interactions. Results: We first present a new protein sequence encoding method called PSSM Distance Transformation, and then construct a DNA-binding protein identification method (SVM-PSSM-DT) by combining PSSM Distance Transformation with a support vector machine (SVM). First, the PSSM profiles are generated by using the PSI-BLAST program to search the non-redundant (NR) database. Next, the PSSM profiles are transformed into uniform numeric representations by the distance transformation scheme. Lastly, the resulting uniform numeric representations are fed into an SVM classifier for prediction, so that whether a sequence binds DNA can be determined. In a benchmark test on 525 DNA-binding and 550 non-DNA-binding proteins using jackknife validation, the present model achieved an ACC of 79.96%, an MCC of 0.622, and an AUC of 86.50%. This performance is considerably better than that of most existing state-of-the-art predictive methods. When tested on a recently constructed independent dataset, PDB186, SVM-PSSM-DT again achieved the best performance, with an ACC of 80.00%, an MCC of 0.647, and an AUC of 87.40%, outperforming existing state-of-the-art methods. Conclusions: The experimental results demonstrate that PSSM Distance Transformation is an effective protein sequence encoding method and SVM-PSSM-DT is a useful tool for identifying DNA-binding proteins. A user-friendly web server for SVM-PSSM-DT was constructed and is freely accessible at http://bioinformatics.hitsz.edu.cn/PSSM-DT/.
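A sketch of one common way to realize a distance-transformation encoding of a PSSM, pairing score columns at increasing sequence separations; the lags, the normalization, and the feature ordering here are illustrative assumptions, not necessarily the exact SVM-PSSM-DT scheme:

```python
import numpy as np

def pssm_dt_features(pssm, max_d=3):
    """Distance-transformation features from an (L x 20) PSSM.

    For every lag d = 1..max_d and every ordered pair of residue columns
    (i, j), accumulate products of scores d positions apart, normalized by
    the number of terms, giving a fixed-length vector of max_d * 400 values.
    """
    L, A = pssm.shape
    feats = []
    for d in range(1, max_d + 1):
        for i in range(A):
            for j in range(A):
                feats.append(np.dot(pssm[:L - d, i], pssm[d:, j]) / (L - d))
    return np.asarray(feats)

# Toy usage with a random stand-in "PSSM" for a 60-residue protein.
rng = np.random.default_rng(0)
print(pssm_dt_features(rng.normal(size=(60, 20))).shape)   # (1200,)
```

Because the vector length is independent of the protein length L, the encoding can be fed to any SVM implementation directly.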
Abstract:
The problem of ranking alternatives compared in pairs arises in social choice theory, statistics, scientometrics, psychology, and sport. Based on the international literature, we review the solution concepts in detail, together with ways of handling the questions that arise in practical applications and of building mathematical models matching real data. Particular attention is paid to the definition of the paired comparison matrix, to the main scoring procedures, and to their relations. The paper gives a theoretical analysis of the invariant, fair bets, and PageRank methods, which are founded on the Perron-Frobenius theorem, as well as of the internal slackening and positional power procedures proposed for ranking the nodes of a directed graph. For choosing among them, the axiomatic approach is recommended: we present characterizations of the invariant and fair bets methods, and we also discuss the violation of some properties, i.e. the methods' main weaknesses.
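For concreteness, a small sketch of the invariant (Perron-Frobenius) scoring method mentioned above: the scores are the principal eigenvector of the nonnegative paired-comparison matrix, computed here by power iteration on illustrative data:

```python
import numpy as np

def invariant_ranking(A, tol=1e-10, max_iter=10000):
    """Invariant scores of a nonnegative paired-comparison matrix A:
    the Perron (principal) eigenvector, found by power iteration."""
    w = np.ones(A.shape[0])
    for _ in range(max_iter):
        w_new = A @ w
        w_new /= w_new.sum()                # normalize scores to sum to 1
        if np.linalg.norm(w_new - w, 1) < tol:
            break
        w = w_new
    return w_new

# Three alternatives; A[i, j] = number of wins of i over j (made-up data).
A = np.array([[0., 2., 1.],
              [1., 0., 3.],
              [2., 0., 0.]])
print(invariant_ranking(A + 1e-6))   # small shift keeps the matrix positive
```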
Abstract:
In this paper we apply a new approach to determining the priority vector of a pairwise comparison matrix, which plays an important role in decision theory. The divergence between the pairwise comparison matrix A and the consistent matrix B defined by the priority vector is measured with the Kullback-Leibler relative entropy function. Minimizing this divergence leads to a convex programming problem in the case of a completely filled matrix, and to a fixed-point problem in the case of an incompletely filled matrix. The priority vector minimizing the divergence also has the property that the difference between the sum of the elements of A and the sum of the elements of B is exactly n times the minimum of the divergence function, where n is the dimension of the problem. Thus the minimum value of the divergence function can serve, for two reasons, as a measure of the inconsistency of the matrix A.
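A minimal sketch of the complete-matrix case: minimizing a generalized Kullback-Leibler divergence between A and the consistent matrix B[i, j] = w[i] / w[j] over log-parametrized weights, which makes the program convex. The exact divergence and normalization used in the paper may differ:

```python
import numpy as np
from scipy.optimize import minimize

def kl_priority_vector(A):
    """Priority vector minimizing a generalized KL divergence between the
    positive pairwise-comparison matrix A and B[i, j] = w[i] / w[j];
    parametrizing w = exp(u) makes the objective convex in u."""
    n = A.shape[0]

    def objective(u):
        B = np.exp(u[:, None] - u[None, :])
        return np.sum(A * np.log(A / B) - A + B)

    res = minimize(objective, np.zeros(n))
    w = np.exp(res.x)
    return w / w.sum(), res.fun    # normalized priorities, minimum divergence

# A consistent 3x3 reciprocal matrix: the minimum divergence is ~0
# and the priorities are proportional to (4, 2, 1).
A = np.array([[1.,   2.,  4.],
              [1/2., 1.,  2.],
              [1/4., 1/2., 1.]])
print(kl_priority_vector(A))
```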
Abstract:
Henry Gantt's bar chart (Gantt, 1910) was created more than a hundred years ago, and Kelley (Kelley, 1961) and Walker (Walker, 1959) published their critical path method more than sixty years ago. Are the cost- and resource-planning methods built on them adequate for today's challenges? This study presents the fruit of several years of research. One of its most important goals was to examine to what extent existing project planning tools meet the demands of today's projects, and where and in which areas these methods need to be improved or superseded. The author introduces matrix-based project planning methods that lead well beyond the hitherto mainly operative tasks of project planning and turn our attention to strategic questions such as: which activities, subprojects, or projects should be completed; which should be dropped or scheduled into a later project; and how the implementation and importance of projects should be ranked and prioritized. The new matrix-based method can rank project or multi-project scenarios under different kinds of target functions and supports not only traditional but also agile project management approaches. The author shows the methods used in an expert module and how this expert module can be integrated into a traditional project management system.
Abstract:
Accurate knowledge of the time since death, or postmortem interval (PMI), has enormous legal, criminological, and psychological impact. In this study, an investigation was made to determine whether the relationship between the degradation of the human cardiac structural protein Cardiac Troponin T (cTnT) and the PMI could be used as an indicator of time since death, thus providing a rapid, high-resolution, sensitive, and automated methodology for the determination of PMI. The use of cTnT, a protein found in heart tissue, as a selective marker for cardiac muscle damage has shown great promise in the determination of PMI. An optimized conventional immunoassay method was developed to quantify intact and fragmented cTnT. A small sample of cardiac tissue, which is less affected than other tissues by external factors, was taken, homogenized, extracted with magnetic microparticles, separated by SDS-PAGE, and visualized by Western blot, probing with a monoclonal antibody against cTnT, followed by labeling and imaging with available scanners. This conventional immunoassay provides proper detection and quantitation of the cTnT protein in cardiac tissue as a complex matrix; however, it does not provide the analyst with immediate results. Therefore, a competitive separation method using capillary electrophoresis with laser-induced fluorescence (CE-LIF) was developed to study the interaction between the human cTnT protein and a monoclonal anti-Troponin T antibody. Analysis of the results revealed a linear relationship between the percent of degraded cTnT and the log of the PMI, indicating that intact cTnT could be detected in human heart tissue up to 10 days postmortem at room temperature and beyond two weeks at 4°C. The data presented demonstrate that this technique can provide an extended time range during which the PMI can be estimated more accurately than with currently used methods, and that it represents a major advance in time-of-death determination through a fast and reliable, semi-quantitative measurement of a biochemical marker from an organ protected from outside factors.
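The reported calibration is linear in the logarithm of the PMI, so estimating time since death amounts to fitting pct = a + b·log(PMI) and inverting it. The sketch below does exactly that on made-up calibration points; the numbers are illustrative only, not data from the study:

```python
import numpy as np

# Hypothetical calibration: percent of degraded cTnT vs. known PMI (hours).
pmi_hours = np.array([12., 24., 48., 96., 192., 240.])
pct_degraded = np.array([22., 35., 48., 61., 74., 79.])

# Fit pct = a + b * log(PMI), the linear relation reported in the abstract.
b, a = np.polyfit(np.log(pmi_hours), pct_degraded, 1)

def estimate_pmi(pct):
    """Invert the calibration to estimate the PMI (in hours) from
    the observed percent of degraded cTnT."""
    return float(np.exp((pct - a) / b))

print(estimate_pmi(55.0))   # roughly a few days postmortem on this toy fit
```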
Abstract:
This dissertation established a state-of-the-art programming tool for designing and training artificial neural networks (ANNs) and showed its applicability to brain research. The developed tool, called NeuralStudio, allows users without programming skills to conduct ANN-based studies through a powerful and very user-friendly interface. A series of unique features was implemented in NeuralStudio, such as ROC analysis, cross-validation, network averaging, topology optimization, and optimization of the activation functions' slopes. It also includes a Support Vector Machines module for comparison purposes. Once the tool was fully developed, it was applied to two studies in brain research. In the first study, the goal was to create and train an ANN to detect epileptic seizures from subdural EEG; this analysis involved extracting features from the spectral power in the gamma frequencies. In the second application, a unique method was devised to link EEG recordings to epileptic and non-epileptic subjects. The contribution of this method was a descriptor matrix that can represent any EEG file regardless of its duration and number of electrodes. The first study showed that the inter-electrode mean of the spectral power in the gamma frequencies, together with its duration above a specific threshold, performs better than the other frequency bands in seizure detection, exhibiting an accuracy of 95.90%, a sensitivity of 92.59%, and a specificity of 96.84%. The second study showed that Hjorth's activity parameter is sufficient to accurately relate EEG recordings to epileptic and non-epileptic subjects: after testing, the accuracy, sensitivity, and specificity of the classifier were all above 0.9667, and statistical tests established the superiority of activity at over 99.99% certainty. It was demonstrated that (1) the spectral power in the gamma frequencies is highly effective for locating seizures in EEG and (2) activity can be used to link EEG recordings to epileptic and non-epileptic subjects. Both studies required a high computational load and could be addressed thanks to NeuralStudio; from a medical perspective, both proved its merits in brain research applications. For its outstanding features, NeuralStudio was recently awarded a patent (US patent No. 7502763).
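Both discriminating features mentioned above are simple to compute: Hjorth's activity is the variance of the signal, and gamma-band power can be estimated from a Welch periodogram. A sketch, with the sampling rate and band edges as illustrative assumptions rather than the study's settings:

```python
import numpy as np
from scipy.signal import welch

def hjorth_activity(x):
    """Hjorth's 'activity' parameter: the variance of the signal."""
    return np.var(x)

def gamma_power(x, fs=256.0, band=(30.0, 80.0)):
    """Mean spectral power in the gamma band, estimated with Welch's method.
    fs and the band edges are illustrative defaults."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 1024))
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].mean()

# Toy usage on a synthetic stand-in "EEG" channel.
rng = np.random.default_rng(1)
eeg = rng.normal(size=2048)
print(hjorth_activity(eeg), gamma_power(eeg))
```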
Abstract:
This dissertation delivers a framework to diagnose the bull-whip effect (BWE) in supply chains and then identify methods to minimize it. Such a framework is needed because, in spite of the significant amount of literature discussing the bull-whip effect, many companies continue to experience the wide variations in demand that are indicative of it. While the theory and knowledge of the bull-whip effect are well established, an engineering framework and method to systematically identify the problem, diagnose its causes, and identify remedies is still lacking. The present work seeks to fill this gap by providing a holistic, systems perspective on bull-whip identification and diagnosis. The framework employs the SCOR reference model to examine the supply chain processes with a baseline measure of demand amplification; the structural and behavioral features of the supply chain are then investigated by means of system dynamics modeling. The contribution of the diagnostic framework, called the Demand Amplification Protocol (DAMP), relies not only on the improvement of existing methods but also on original developments introduced to accomplish successful diagnosis. DAMP contributes a comprehensive methodology that captures the dynamic complexities of supply chain processes, a BWE measurement method that is suitable for actual supply chains because of its low data requirements, and a BWE scorecard for relating established causes to a central BWE metric. In addition, the dissertation makes a methodological contribution to the analysis of system dynamics models with a statistical screening technique called SS-Opt, which determines the inputs with the greatest impact on the bull-whip effect by means of perturbation analysis and subsequent multivariate optimization. The dissertation describes the implementation of the DAMP framework in an actual case study, presenting the approach, analysis, results, and conclusions. The case study suggests that a solution balancing costs and demand amplification can better serve both the firms' and the supply chain's interests; insights point to supplier network redesign, postponement in manufacturing operations, and collaborative forecasting agreements with the main distributors.
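For orientation, the textbook bull-whip measure compares the variability of orders leaving a supply-chain stage with the variability of the demand entering it; DAMP's low-data-requirement metric is described only qualitatively above, so the sketch below uses the classical coefficient-of-variation ratio as a stand-in:

```python
import numpy as np

def bullwhip_ratio(orders, demand):
    """Classical bull-whip measure: the coefficient of variation of orders
    placed upstream divided by that of the demand received.  Values > 1
    indicate amplification.  This is the textbook metric, not DAMP's."""
    cv = lambda x: np.std(x) / np.mean(x)
    return cv(orders) / cv(demand)

# Made-up series: orders swing more widely than the demand they serve.
demand = np.array([100, 102, 98, 101, 99, 103, 97, 100.])
orders = np.array([100, 110, 90, 108, 92, 112, 88, 104.])
print(bullwhip_ratio(orders, demand))   # > 1 signals the bull-whip effect
```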
Abstract:
The environmental dynamics of dissolved organic matter (DOM) were characterized for a shallow, subtropical, seagrass-dominated estuarine bay, namely Florida Bay, USA. Large spatial and seasonal variations in DOM quantity and quality were assessed using dissolved organic C (DOC) measurements and spectrophotometric properties including excitation emission matrix (EEM) fluorescence with parallel factor analysis (PARAFAC). Surface water samples were collected monthly for 2 years across the bay. DOM characteristics were statistically different across the bay, and the bay was spatially characterized into four basins based on chemical characteristics of DOM as determined by EEM-PARAFAC. Differences between zones were explained based on hydrology, geomorphology, and primary productivity of the local seagrass community. In addition, potential disturbance effects from a very active hurricane season were identified. Although the overall seasonal patterns of DOM variations were not significantly affected on a bay-wide scale by this disturbance, enhanced freshwater delivery and associated P and DOM inputs (both quantity and quality) were suggested as potential drivers for the appearance of algal blooms in high impact areas. The application of EEM-PARAFAC proved to be ideally suited for studies requiring high sample throughput methods to assess spatial and temporal ecological drivers and to determine disturbance-induced impacts in aquatic ecosystems.
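A sketch of the EEM-PARAFAC step on a synthetic data cube, using tensorly's non-negative PARAFAC (non-negativity being customary for fluorescence); the tensor shape, the rank of four components, and tensorly itself (recent versions return a weights/factors pair) are assumptions, not the study's toolchain:

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac

# EEM data cube: samples x excitation wavelengths x emission wavelengths.
# Random stand-in here; a real study would load measured, scatter-corrected EEMs.
rng = np.random.default_rng(2)
eem = tl.tensor(np.abs(rng.normal(size=(24, 40, 60))))

# PARAFAC with non-negativity; rank = assumed number of fluorophore
# components (4 is arbitrary, normally chosen by split-half validation).
weights, factors = non_negative_parafac(eem, rank=4)
scores, excitation, emission = factors   # per-sample loadings, spectral shapes
print(scores.shape, excitation.shape, emission.shape)
```

The per-sample component loadings are what get compared across basins and seasons; the excitation and emission factor matrices characterize the underlying fluorophores.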
Abstract:
The presence of inhibitory substances in biological forensic samples has affected, and continues to affect, the quality of the data generated by DNA typing processes. Although the chemistries used in these procedures have been enhanced to mitigate the effects of these deleterious compounds, some challenges remain. Inhibitors can be components of the samples themselves, of the substrate where the samples were deposited, or of chemicals associated with the DNA purification step. Therefore, a thorough understanding of the extraction processes and of their ability to handle the various types of inhibitory substances can help define the best analytical processing for any given sample. A series of experiments was conducted to establish the inhibition tolerance of quantification and amplification kits using common inhibitory substances, in order to determine whether current laboratory practices are optimal for identifying potential problems associated with inhibition. DART mass spectrometry was used to determine the amount of inhibitor carryover after sample purification, its correlation with the initial inhibitor input in the sample, and its overall effect on the results. Finally, a novel alternative for gathering investigative leads from samples that would otherwise be unsuitable for DNA typing, because of large amounts of inhibitory substances and/or environmental degradation, was tested; this included generating data associated with microbial peak signatures to identify the locations of clandestine human graves. The results demonstrate that the current methods for assessing inhibition are not necessarily accurate: samples that appear inhibited in the quantification process can yield full DNA profiles, while those that do not indicate inhibition may suffer from lowered amplification efficiency or PCR artifacts. The extraction methods tested were able to remove more than 90% of the inhibitors from all samples, with the exception of phenol, which was present in variable amounts whenever the organic extraction approach was used. Although the results suggest that most inhibitors have minimal effect on downstream applications, analysts should exercise caution when selecting the best extraction method for particular samples, as casework DNA samples are often present in small quantities and can contain overwhelming amounts of inhibitory substances.