877 results for "new method"


Relevance: 60.00%

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover.
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses.
Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
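As a toy illustration of the proof obligations behind invariant-based programming (not the Socos/PVS machinery itself), the following sketch checks the three standard obligations for a summing loop at runtime; the program and its invariant are invented for the example:

```python
# A minimal illustration of the proof obligations behind an invariant:
# for a loop computing 0 + 1 + ... + (n-1), the invariant
# "s == i*(i-1)//2" must hold initially, be preserved by the body,
# and (with the exit condition) imply the postcondition.

def invariant(s, i):
    return s == i * (i - 1) // 2

def check_obligations(n):
    # 1) Initiation: the invariant holds before the loop starts.
    assert invariant(0, 0)
    s, i = 0, 0
    while i < n:
        # 2) Preservation: executing the body keeps the invariant.
        assert invariant(s, i)
        s, i = s + i, i + 1
        assert invariant(s, i)
    # 3) Finalization: invariant plus exit condition give the result.
    assert invariant(s, i) and i == n
    return s

result = check_obligations(10)   # sum of 0..9
```

A verifier discharges these obligations symbolically for all states at once; the runtime checks above only test them on one execution.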

Relevance: 60.00%

Abstract:

Preference relations, and their modeling, have played a crucial role in both the social sciences and applied mathematics. A special category is that of cardinal preference relations, which are relations that also take into account the degree of preference. Preference relations play a pivotal role in most multi-criteria decision making methods and in operations research. This thesis aims at presenting some recent advances in their methodology. There are a number of open issues in this field, and the contributions presented in this thesis can be grouped accordingly. The first issue regards the estimation of a weight vector from a given preference relation. A new and efficient algorithm for estimating the priority vector of a reciprocal relation, i.e. a special type of preference relation, is presented. The same section contains a proof that twenty methods already proposed in the literature lead to unsatisfactory results, as they employ a conflicting constraint in their optimization model. The second area of interest concerns consistency evaluation, and it is possibly the kernel of the thesis. This thesis contains proofs that some indices are equivalent and that, therefore, some seemingly different formulae end up leading to the very same result. Moreover, some numerical simulations are presented. The section ends with some considerations on a new method for fairly evaluating consistency. The third matter regards incomplete relations and how to estimate missing comparisons. This section reports a numerical study of the methods already proposed in the literature and analyzes their behavior in different situations. The fourth, and last, topic proposes a way to deal with group decision making by connecting preference relations with social network analysis.
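The priority-vector estimation problem can be illustrated with the classical row geometric mean estimator for a multiplicative pairwise comparison matrix; this is a generic sketch, not the thesis' own algorithm, and the comparison matrix is invented:

```python
# Hypothetical illustration: estimating a priority (weight) vector
# from a pairwise comparison matrix A where A[i][j] approximates
# w[i]/w[j].  The row geometric mean is one classical estimator.

import math

def priority_vector(A):
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]   # row geometric means
    total = sum(gm)
    return [g / total for g in gm]                    # normalised weights

# A perfectly consistent matrix built from weights in ratio 4:2:1.
A = [[1,   2,   4],
     [1/2, 1,   2],
     [1/4, 1/2, 1]]
w = priority_vector(A)
```

For a consistent matrix every reasonable estimator recovers the underlying ratios exactly; the methods diverge (and the consistency indices of the thesis matter) only when the matrix contains contradictory judgments.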

Relevance: 60.00%

Abstract:

Some beetle species can have devastating economic impacts on the forest and nursery industries. A recent example is Anoplophora glabripennis, a species known in the United States as the "Asian longhorned beetle", which has damaged many American forests and is a threat that could unintentionally reach South American countries, including Brazil. This work presents a new method based on X-ray computerized tomography (CT) and image processing for beetle injury detection in forests. Its results show a set of images with correct identification of the location of beetles in living trees as well as damage evaluation over time.
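The detection idea can be caricatured in a few lines: beetle galleries are air-filled voids whose X-ray attenuation is far below that of sound wood, so thresholding a CT slice exposes them. The density grid and threshold below are invented stand-ins for calibrated CT data, not the paper's processing pipeline:

```python
# Toy sketch of CT-based injury detection: galleries show up as
# low-density pixels in a cross-sectional slice of the trunk.

WOOD, VOID = 1.0, 0.05          # arbitrary relative densities

slice_ = [[WOOD] * 6 for _ in range(6)]
slice_[2][2] = slice_[2][3] = slice_[3][3] = VOID   # a small gallery

def find_galleries(grid, threshold=0.5):
    """Return (row, col) pixels whose density falls below threshold."""
    return [(r, c)
            for r, row in enumerate(grid)
            for c, v in enumerate(row)
            if v < threshold]

damage = find_galleries(slice_)
```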

Relevance: 60.00%

Abstract:

The drug discovery process is facing new challenges in the evaluation of lead compounds as the number of newly synthesized compounds increases. The potency of test compounds is most frequently assayed through the binding of the test compound to the target molecule or receptor, or by measuring functional secondary effects caused by the test compound in target model cells, tissues or organisms. Modern homogeneous high-throughput screening (HTS) assays for purified estrogen receptors (ER) utilize various luminescence-based detection methods. Fluorescence polarization (FP) is a standard method for the ER ligand binding assay. It was used to demonstrate the performance of two-photon excitation of fluorescence (TPFE) vs. the conventional one-photon excitation method. As a result, the TPFE method showed improved dynamics, was found to be comparable with the conventional method, and held potential for efficient miniaturization. Other luminescence-based ER assays utilize energy transfer from a long-lifetime luminescent label, e.g. lanthanide chelates (Eu, Tb), to a prompt luminescent label, the signal being read in a time-resolved mode. As an alternative to this method, a new single-label (Eu) time-resolved detection method was developed, based on the quenching of the label by a soluble quencher molecule when displaced from the receptor to the solution phase by an unlabeled competing ligand. The new method was compared with the standard FP method. It was shown to yield comparable results and to hold a significantly higher signal-to-background ratio than FP. Cell-based functional assays for determining the extent of cell surface adhesion molecule (CAM) expression, combined with microscopy analysis of the target molecules, would provide improved information content compared to an expression level assay alone.
In this work, an immune response was simulated by exposing endothelial cells to cytokine stimulation, and the resulting increase in adhesion molecule expression was analyzed on fixed cells by immunocytochemistry, utilizing specific long-lifetime luminophore labeled antibodies against the chosen adhesion molecules. The results showed that the method can be used in a multi-parametric assay for the protein expression levels of several CAMs simultaneously, combined with analysis of the cellular localization of the chosen adhesion molecules through time-resolved luminescence microscopy.
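The single-label quenching principle can be sketched numerically: receptor-bound label is protected, displaced label is quenched, so the signal tracks the bound fraction. The one-site competition model, the IC50 and the quenching efficiency below are illustrative assumptions, not data from the study:

```python
# Hedged numerical sketch of the quenching-based displacement assay:
# europium-labelled ligand on the receptor emits fully, while ligand
# displaced into solution is quenched by the soluble quencher.

def bound_fraction(competitor_nM, ic50_nM=10.0):
    # Simple one-site competition curve (invented IC50).
    return 1.0 / (1.0 + competitor_nM / ic50_nM)

def signal(competitor_nM, quench_efficiency=0.95):
    b = bound_fraction(competitor_nM)
    # Protected (bound) label emits fully; displaced label is quenched.
    return b + (1.0 - b) * (1.0 - quench_efficiency)

s_low  = signal(0.0)     # no competitor: maximal signal
s_high = signal(1e6)     # saturating competitor: background level
```

The signal-to-background ratio of such an assay is governed by the quenching efficiency: the closer it is to 1, the lower the background at full displacement.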

Relevance: 60.00%

Abstract:

This work examines the fault tolerance analysis requirements of the new YVL guides and develops a method by which compliance with the requirements can be assessed using probabilistic risk analysis (PRA). The work covers the most important parts of a risk analysis, the importance measures obtained as its results, and the applications of these measures. The importance measures have also been used as input values of the developed method. To guarantee the safety of a nuclear power plant, the systems performing the most important safety functions must be able to carry out their task even if any single component of the system is inoperable and any component affecting the safety function is simultaneously out of use due to repair or maintenance. This requires that, to ensure fault tolerance, the most important safety functions are secured, wherever possible, by systems based on the redundancy and diversity principles, and that these systems are independent of each other. The developed method, together with a new added-value measure of fault tolerance, makes it possible to identify dependencies between systems and to assess compliance with the required safety factors.
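The role of PRA importance measures as inputs can be illustrated with the Birnbaum measure for a minimal two-train redundancy model; the system structure and failure probabilities below are invented, not the thesis' plant model:

```python
# Illustrative sketch of a PRA importance measure for a system of two
# redundant trains: the system is unavailable only if both trains fail.

def system_unavailability(q1, q2):
    return q1 * q2                      # both redundant trains down

def birnbaum(component, q1, q2):
    """Birnbaum importance: system risk with the component failed
    minus system risk with the component perfectly working."""
    if component == 1:
        return system_unavailability(1.0, q2) - system_unavailability(0.0, q2)
    return system_unavailability(q1, 1.0) - system_unavailability(q1, 0.0)

q1, q2 = 1e-3, 2e-3                     # invented train unavailabilities
base = system_unavailability(q1, q2)
b1 = birnbaum(1, q1, q2)                # equals q2 for this structure
```

For this structure the Birnbaum importance of one train equals the unavailability of the other, which captures the redundancy argument: a train matters exactly to the extent that its partner can fail.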

Relevance: 60.00%

Abstract:

The large and growing number of digital images is making manual image search laborious. Only a fraction of the images contain metadata that can be used to search for a particular type of image. Thus, the main research question of this thesis is whether it is possible to learn visual object categories directly from images. Computers process images as long lists of pixels that do not have a clear connection to the high-level semantics which could be used in image search. Various methods have been introduced in the literature to extract low-level image features, as well as approaches to connect these low-level features with high-level semantics. One of these approaches, called Bag-of-Features, is studied in this thesis. In the Bag-of-Features approach, the images are described using a visual codebook. The codebook is built from the descriptions of the image patches using clustering. The images are described by matching descriptions of image patches with the visual codebook and computing the number of matches for each code. In this thesis, unsupervised visual object categorisation using the Bag-of-Features approach is studied. The goal is to find groups of similar images, e.g., images that contain an object from the same category. The standard Bag-of-Features approach is improved by using spatial information and visual saliency. It was found that the performance of the visual object categorisation can be improved by using spatial information of local features to verify the matches. However, this process is computationally heavy, and thus the number of images must be limited in the spatial matching, for example, by using the Bag-of-Features method as in this study. Different approaches for saliency detection are studied and a new method based on the Hessian-Affine local feature detector is proposed. The new method achieves results comparable with the current state of the art.
The visual object categorisation performance was improved by using foreground segmentation based on saliency information, especially when the background could be considered as clutter.
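The core Bag-of-Features step, mapping patch descriptors to their nearest visual word and counting, can be sketched as follows; the hand-made two-dimensional codebook stands in for one produced by clustering real descriptors:

```python
# Minimal Bag-of-Features sketch: each image patch descriptor is
# assigned to its nearest codebook entry (visual word), and the image
# is represented as the histogram of word counts.

def nearest_code(desc, codebook):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(codebook)), key=lambda k: dist2(desc, codebook[k]))

def bag_of_features(descriptors, codebook):
    hist = [0] * len(codebook)
    for d in descriptors:
        hist[nearest_code(d, codebook)] += 1
    return hist

codebook = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0)]           # 3 visual words
patches = [(0.1, 0.0), (0.9, 1.1), (0.2, 0.1), (0.0, 0.8)]
hist = bag_of_features(patches, codebook)
```

Real descriptors (e.g. from a Hessian-Affine detector) are high-dimensional and the codebook holds hundreds or thousands of words, but the matching-and-counting logic is the same.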

Relevance: 60.00%

Abstract:

Inventory and prediction of cork harvest over time and space is important to forest managers, who must plan and organize harvest logistics (transport, storage, etc.). Common field inventory methods, including the stem density, diameter and height structure, are costly and generally point (plot) based. Furthermore, the irregular horizontal structure of cork oak stands makes it difficult, if not impossible, to interpolate between points. We propose a new method to estimate cork production using digital multispectral aerial imagery. We study the spectral response of individual trees in the visible and near-infrared spectra and then correlate that response with cork production prior to harvest. We use ground measurements of individual trees' production to evaluate the model's predictive capacity. We propose 14 candidate variables to predict cork production based on crown size in combination with different NDVI derivatives, and use the Akaike Information Criterion to choose the best among them. The best model is composed of combinations of different NDVI derivatives that include the red, green, and blue channels. The proposed model is 15% more accurate than a model that includes only the crown projection without any spectral information.
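Two ingredients of the approach, the NDVI and AIC-based model selection, can be sketched as follows; the reflectance values, sample size and residual sums of squares are invented for illustration and are not results from the study:

```python
# Sketch of the two ingredients the abstract combines: the NDVI from
# red and near-infrared reflectance, and the Akaike Information
# Criterion used to pick among candidate predictor sets.

import math

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def aic(n, rss, k):
    # Least-squares form of AIC: n*ln(RSS/n) + 2k (lower is better).
    return n * math.log(rss / n) + 2 * k

v = ndvi(0.45, 0.09)                      # vigorous crown: high NDVI

# Two hypothetical candidate models fitted to n = 30 trees:
aic_crown_only = aic(30, rss=5.1, k=2)    # crown projection alone
aic_crown_ndvi = aic(30, rss=3.4, k=4)    # crown + NDVI derivatives
best_is_spectral = aic_crown_ndvi < aic_crown_only
```

AIC trades goodness of fit against the number of parameters, which is what lets a 14-variable candidate pool be pruned without overfitting.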

Relevance: 60.00%

Abstract:

The three main topics of this work are independent systems and chains of word equations, parametric solutions of word equations on three unknowns, and unique decipherability in the monoid of regular languages. The most important result about independent systems is a new method giving an upper bound for their sizes in the case of three unknowns. The bound depends on the length of the shortest equation. This result has generalizations for decreasing chains and for more than three unknowns. The method also leads to shorter proofs and generalizations of some old results. Hmelevskii's theorem states that every word equation on three unknowns has a parametric solution. We give a significantly simplified proof of this theorem. As a new result, we estimate the lengths of parametric solutions and obtain a bound for the length of the minimal nontrivial solution and for the complexity of deciding whether such a solution exists. The unique decipherability problem asks whether given elements of some monoid form a code, that is, whether they satisfy no nontrivial relation. We give characterizations of when a collection of unary regular languages is a code. We also prove that it is undecidable whether a collection of binary regular languages is a code.
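For finite sets of words, unique decipherability is decidable with the classical Sardinas–Patterson algorithm, sketched below; the thesis treats the harder regular-language setting, which this finite-word illustration does not cover:

```python
# Sardinas-Patterson test: a finite set of words is a code iff no
# iterated set of "dangling suffixes" ever contains a codeword
# (equivalently, the empty word never appears in the next round).

def is_code(words):
    words = set(words)

    def residuals(S):
        out = set()
        for w in words:
            for s in S:
                if s.startswith(w) and s != w:
                    out.add(s[len(w):])     # leftover after reading w
                if w.startswith(s) and w != s:
                    out.add(w[len(s):])     # leftover after reading s
        return out

    seen = set()
    current = residuals(words)              # dangling suffixes of pairs
    while current:
        if current & words:                 # a suffix equals a codeword
            return False
        frozen = frozenset(current)
        if frozen in seen:                  # cycle: no ambiguity possible
            return True
        seen.add(frozen)
        current = residuals(current)
    return True
```

For example, {a, ab, ba} is not a code because a·ba and ab·a spell the same word, while every prefix code such as {0, 10, 110} trivially passes.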

Relevance: 60.00%

Abstract:

CHARGE syndrome, Sotos syndrome and 3p deletion syndrome are examples of rare inherited syndromes that have been recognized for decades but for which molecular diagnostics have only been made possible by recent advances in genomic research. Despite these advances, the development of diagnostic tests for rare syndromes has been hindered by diagnostic laboratories having limited funds for test development and prioritizing tests for which a (relatively) high demand can be expected. In this study, molecular diagnostic tests for CHARGE syndrome and Sotos syndrome were developed, resulting in their successful translation into routine diagnostic testing in the laboratory of Medical Genetics (UTUlab). A mutation was identified in 40.5% of the patients in the CHARGE syndrome group and in 34% in the Sotos syndrome group, reflecting the use of the tests in routine differential diagnostics. In CHARGE syndrome, the low prevalence of structural aberrations was also confirmed. In 3p deletion syndrome, it was shown that small terminal deletions are not causative for the syndrome, and that testing with array-based analysis provides a reliable estimate of the deletion size, although benign copy number variants complicate result interpretation. During the development of the tests, it was discovered that finding an optimal molecular diagnostic strategy for a given syndrome is always a compromise between the sensitivity, specificity and feasibility of applying a new method. In addition, the clinical utility of the test should be considered prior to test development: sometimes a test performing well in a laboratory has limited utility for the patient, whereas a test performing poorly in the laboratory may have a great impact on the patient and their family. At present, the development of next generation sequencing methods is changing the concept of molecular diagnostics of rare diseases from single tests towards whole-genome analysis.

Relevance: 60.00%

Abstract:

Natural orifice transluminal endoscopic surgery (NOTES) is an emerging experimental alternative to conventional surgery that eliminates abdominal incisions and incision-related complications by combining endoscopic and laparoscopic techniques to diagnose and treat abdominal pathology. Natural orifice transluminal endoscopic surgery refers to the method of accessing the abdominal cavity through a natural orifice under endoscopic visualization. Since its introduction in 2004, numerous reports have been published describing different surgical interventions. Recently, a group of expert laparoscopic surgeons and endoscopists outlined the limitations of this approach and issued recommendations for progress toward human trials. Transluminal surgery is a new method for accessing the abdomen under direct endoscopic visualization. Preliminary studies have demonstrated the feasibility of this technique in animal models; however, further research is warranted to validate its safety in humans.

Relevance: 60.00%

Abstract:

Bacteria can exist as planktonic cells, the lifestyle in which single cells live in suspension, and as biofilms, which are surface-attached bacterial communities embedded in a self-produced matrix. Most antibiotics and antimicrobial testing methods have been developed for planktonic bacteria. However, the majority of bacteria in natural habitats live as biofilms. Biofilms develop high resistance towards conventional antibacterial treatments dauntingly fast, and thus there is a great need for effective anti-biofilm therapy. This thesis project attempted to fill the gap in anti-biofilm screening methods by developing a platform of assays that evaluate the effect that screened compounds have on the total biomass, viability and extracellular polysaccharide (EPS) layer of biofilms. Additionally, a new method for studying biofilms and their interactions with compounds in a continuous flow system was developed using capillary electrochromatography (CEC). The screening platform was put to use in a screening campaign on a small library of cinchona alkaloids. The assays were optimized to be statistically robust enough for screening. The first assay, based on crystal violet staining, measures total biofilm biomass; it was automated using a liquid handling workstation to decrease the manual workload and signal variation. The second assay, based on resazurin staining, measures the viability of the biofilm; after thorough optimization for the strain used, it is a very simple and fast method for primary screening. The fluorescent resazurin probe is not toxic to the biofilms. In fact, it was also shown in this project that staining the biofilms with resazurin prior to staining with crystal violet had no effect on the latter, so the two stains can be used in sequence on the same screening plate.
This sequential staining reduced the use of reagents and consumables and shortened the working time. As a third assay in the platform, a wheat germ agglutinin based assay was added to evaluate the effect a compound has on the EPS layer. Using this assay it was found that even when compounds have a clear effect on both biomass and viability, the EPS layer can be left untouched or even increased. This clearly demonstrates the importance of using several assays in order to find true hits in a screening setting. In the pilot screening for antimicrobial and anti-biofilm effects using a cinchona alkaloid library, one compound was found to have an antimicrobial effect against planktonic bacteria and to prevent biofilm formation at low micromolar concentrations. To eradicate biofilms, a higher concentration was needed. It was also shown that the chemical space occupied by the active compound was slightly different from that of the rest of the cinchona alkaloids, as well as of the other compounds used for validation screening during the optimization of the separate assays.
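Statistical robustness of screening assays is conventionally quantified with the Z'-factor computed from positive and negative controls; a minimal sketch with invented control readings, not data from this thesis:

```python
# Z'-factor for assay robustness:
#   Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|
# Values above roughly 0.5 are conventionally considered excellent
# for screening.  The control readings below are invented.

import statistics

def z_prime(positive, negative):
    sp, sn = statistics.stdev(positive), statistics.stdev(negative)
    mp, mn = statistics.mean(positive), statistics.mean(negative)
    return 1.0 - 3.0 * (sp + sn) / abs(mp - mn)

untreated = [98, 102, 100, 101, 99]      # full biofilm signal
killed    = [10, 12, 9, 11, 8]           # maximal-effect controls
z = z_prime(untreated, killed)
```

A wide separation band between the control distributions (large mean difference, small spreads) is what allows single-point primary screening to be trusted.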

Relevance: 60.00%

Abstract:

Knowledge of the behaviour of cellulose, hemicelluloses, and lignin during wood and pulp processing is essential for understanding and controlling the processes. Determination of the monosaccharide composition gives information about the structural polysaccharide composition of wood material and helps when determining the quality of fibrous products. In addition, monitoring of the acidic degradation products gives information on the extent of degradation of lignin and polysaccharides. This work describes two capillary electrophoretic methods developed for the analysis of monosaccharides and for the determination of aliphatic carboxylic acids in alkaline oxidation solutions of lignin and wood. Capillary electrophoresis (CE), in its many variants, is an alternative separation technique to chromatographic methods. In capillary zone electrophoresis (CZE) the fused silica capillary is filled with an electrolyte solution. An applied voltage generates a field across the capillary. The movement of the ions under the electric field is based on the charge and hydrodynamic radius of the ions. Carbohydrates contain hydroxyl groups that are ionised only under strongly alkaline conditions. After ionisation, the structures are suitable for electrophoretic analysis and identification through either indirect UV detection or electrochemical detection. The current work presents a new capillary zone electrophoretic method relying on an in-capillary reaction and direct UV detection at a wavelength of 270 nm. The method has been used for the simultaneous separation of neutral carbohydrates, including mono- and disaccharides and sugar alcohols. The in-capillary reaction produces negatively charged and UV-absorbing compounds. The optimised method was applied to real samples. The methodology is fast, since no sample preparation other than dilution is required. A new method for aliphatic carboxylic acids in highly alkaline process liquids was also developed.
The goal was to develop a method for the simultaneous analysis of the dicarboxylic acids, hydroxy acids and volatile acids that are oxidation and degradation products of lignin and wood polysaccharides. The CZE method was applied to three process cases. First, the fate of lignin under alkaline oxidation conditions was monitored by determining the level of carboxylic acids in the process solutions. In the second application, the degradation of spruce wood by alkaline and catalysed alkaline oxidation was compared by determining the carboxylic acids in the process solutions. In addition, the effectiveness of membrane filtration and preparative liquid chromatography in the enrichment of hydroxy acids from black liquor was evaluated by analysing the effluents with capillary electrophoresis.
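The quantity underlying CZE separations, the electrophoretic mobility, follows directly from capillary geometry, applied voltage and migration time; a textbook-style sketch with illustrative numbers, not measurements from this work:

```python
# Apparent mobility in CZE: mu_app = (L_d * L_t) / (V * t), where L_d
# is the capillary length to the detector, L_t the total length, V the
# applied voltage and t the migration time.  Subtracting the mobility
# of a neutral EOF marker gives the effective mobility of the analyte.

def apparent_mobility(L_total_cm, L_detector_cm, voltage_V, t_migration_s):
    """Apparent mobility in cm^2 V^-1 s^-1."""
    return (L_detector_cm * L_total_cm) / (voltage_V * t_migration_s)

def effective_mobility(mu_app, mu_eof):
    # Electroosmotic flow is measured with a neutral marker.
    return mu_app - mu_eof

mu_app = apparent_mobility(60.0, 50.0, 25000.0, 300.0)   # slow analyte
mu_eof = apparent_mobility(60.0, 50.0, 25000.0, 150.0)   # neutral marker
mu_eff = effective_mobility(mu_app, mu_eof)   # negative: anionic analyte
```

A negative effective mobility is exactly what is expected for the carboxylate and ionised-carbohydrate anions of this work, which migrate against the electroosmotic flow.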

Relevance: 60.00%

Abstract:

The dissertation 'I knit, therefore I am! Learning and identity in informal space' has two main purposes. The first is to investigate how new value attributions and thinking can generate novel and usable knowledge for the field of craftsmanship; the second is to display a different and overlooked philosophical and cultural potential in a reflexive mode of expression, one able to reflect the normative comprehension of craftsmanship. The dissertation focuses on learning and identity in informal spaces of learning and on how such a learning perspective can be related to crafts training in educational establishments. The empirical foundation of the dissertation is 'craftivism'. Activists from the Nordic countries were interviewed about what they do when they put up their textile graffiti on lamp posts and house walls. Three research problems are presented: 1) What stories do people who work as crafts activists tell about ways of relating and methods of action when they make crafts? 2) What do these stories tell about learning and identity? 3) How may the research results influence training and education in craftsmanship? These questions are asked in order to acquire new knowledge in two respects: first, knowledge about crafts in relation to techniques, tradition and the objects of crafts, and second, knowledge about learning and identity in informal spaces of learning. The dissertation's theoretical foundation is poststructuralist and sociocultural, combined with hermeneutically inspired qualitative interviews. The author's position and pre-understanding are discussed in relation to the informants, the performing activists, as the background of both is craftsmanship.
Starting from cultural studies, it is possible to see the activist subject's conditions of possibility in the culture, as the craft activism of this subcultural phenomenon is illuminated through a performative approach to individuals' actions. The research material was first analysed for events of textile graffiti and possible themes in those events, after which the results were summarised. Next, the material was analysed for events concerning learning and identity, reflecting the author's wish to understand the background of, and motivational force behind, the activism. The analysis is divided into main perspectives with different dimensions. The results show the activist subject's construction of an individual who actively takes part in a community, e.g. by creating joy, changing the world's perception of sustainability or feminizing the public space. By taking crafts across borders (and away from the classroom), crafts become contextualized in a novel fashion and thus obtain an independent status. In this way the dissertation inscribes itself in a new method of comprehending and performing traditional craftsmanship techniques.

Relevance: 60.00%

Abstract:

Energy efficiency is one of the major objectives that must be achieved in order to use the world's limited energy resources sustainably. Since radiative heat transfer is the dominant heat transfer mechanism in most fossil fuel combustion systems, more accurate insight and models may improve the energy efficiency of newly designed combustion systems. The radiative properties of combustion gases are highly wavelength dependent, and better models for calculating them are needed for the modeling of large-scale industrial combustion systems. With detailed knowledge of the spectral radiative properties of gases, the modeling of combustion processes in different applications can be made more accurate. In order to propose a new method for effective non-gray modeling of radiative heat transfer in combustion systems, different models for the spectral properties of gases, including the SNBM, EWBM, and WSGGM, have been studied in this research. Using this detailed analysis of the different approaches, the thesis presents new methods for gray and non-gray radiative heat transfer modeling in homogeneous and inhomogeneous H2O–CO2 mixtures at atmospheric pressure. The proposed method is able to support the modeling of a wide range of combustion systems, including the oxy-fired combustion scenario. The new methods are based on implementing pre-obtained correlations for the total emissivity and band absorption coefficient of H2O–CO2 mixtures at different temperatures, gas compositions, and optical path lengths. They can easily be used within any commercial CFD software for radiative heat transfer modeling, resulting in more accurate, simple, and fast calculations. The new methods were successfully used in CFD modeling by applying them to an industrial-scale backpass channel under oxy-fired conditions.
The developed approaches are more accurate compared with other methods; moreover, they can provide complete explanation and detailed analysis of the radiation heat transfer in different systems under different combustion conditions. The methods were verified by applying them to some benchmarks, and they showed a good level of accuracy and computational speed compared to other methods. Furthermore, the implementation of the suggested banded approach in CFD software is very easy and straightforward.
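The WSGGM referenced above represents total emissivity as a weighted sum of a few gray gases; a minimal sketch in which the weights and absorption coefficients are placeholders, not the fitted correlations developed in the thesis:

```python
# Weighted-sum-of-gray-gases (WSGG) total emissivity:
#   eps(pL) = sum_i a_i * (1 - exp(-k_i * pL))
# where pL is the partial-pressure path length, a_i are temperature-
# dependent weights and k_i gray-gas absorption coefficients.  The
# coefficients below are placeholders, not fitted H2O-CO2 correlations.

import math

def wsgg_emissivity(weights, ks, pL):
    """Total emissivity for partial-pressure path length pL [atm*m]."""
    return sum(a * (1.0 - math.exp(-k * pL))
               for a, k in zip(weights, ks))

weights = [0.4, 0.3, 0.2]        # gray-gas weights (clear gas gets 0.1)
ks      = [0.5, 5.0, 50.0]       # absorption coefficients [1/(atm*m)]
eps_short = wsgg_emissivity(weights, ks, pL=0.1)
eps_long  = wsgg_emissivity(weights, ks, pL=10.0)
```

The mixture of weak and strong gray gases reproduces the characteristic behaviour of real gas radiation: emissivity rises quickly at short paths and saturates below one at long paths.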

Relevance: 60.00%

Abstract:

This study examines the structure of the Russian Reflexive Marker (-ся/-сь) and offers a usage-based model building on Construction Grammar and a probabilistic view of linguistic structure. Traditionally, reflexive verbs are accounted for relative to non-reflexive verbs. These accounts assume that linguistic structures emerge as pairs, and they assume a directionality whereby the semantics and structure of a reflexive verb can be derived from the non-reflexive verb. However, this directionality does not necessarily hold diachronically, and the semantics and the patterns associated with a particular reflexive verb are not always shared with the non-reflexive verb. Thus, a model is proposed that can accommodate the traditional pairs as well as the possible deviations without postulating different systems. A random sample of 2000 instances marked with the Reflexive Marker was extracted from the Russian National Corpus; the sample used in this study contains 819 unique reflexive verbs. This study moves away from the traditional pair account and introduces the concept of the Neighbor Verb. A neighbor verb exists for a reflexive verb if they share the same phonological form excluding the Reflexive Marker. It is claimed here that the Reflexive Marker constitutes a system in Russian and that the relation between reflexive and neighbor verbs constitutes a cross-paradigmatic relation. Furthermore, the relation between the reflexive and the neighbor verb is argued to be one of symbolic connectivity rather than directionality. Effectively, the relation holding between particular instantiations can vary, and the theoretical basis of the present study builds on this assumption. Several new variables are examined in order to systematically model the variability of this symbolic connectivity, specifically the degree and strength of connectivity between items. In usage-based models, the lexicon does not constitute an unstructured list of items.
Instead, items are assumed to be interconnected in a network. This interconnectedness is defined as Neighborhood in this study. Additionally, each verb carves its own niche within the Neighborhood and this interconnectedness is modeled through rhyme verbs constituting the degree of connectivity of a particular verb in the lexicon. The second component of the degree of connectivity concerns the status of a particular verb relative to its rhyme verbs. The connectivity within the neighborhood of a particular verb varies and this variability is quantified by using the Levenshtein distance. The second property of the lexical network is the strength of connectivity between items. Frequency of use has been one of the primary variables in functional linguistics used to probe this. In addition, a new variable called Constructional Entropy is introduced in this study building on information theory. It is a quantification of the amount of information carried by a particular reflexive verb in one or more argument constructions. The results of the lexical connectivity indicate that the reflexive verbs have statistically greater neighborhood distances than the neighbor verbs. This distributional property can be used to motivate the traditional observation that the reflexive verbs tend to have idiosyncratic properties. A set of argument constructions, generalizations over usage patterns, are proposed for the reflexive verbs in this study. In addition to the variables associated with the lexical connectivity, a number of variables proposed in the literature are explored and used as predictors in the model. The second part of this study introduces the use of a machine learning algorithm called Random Forests. The performance of the model indicates that it is capable, up to a degree, of disambiguating the proposed argument construction types of the Russian Reflexive Marker. Additionally, a global ranking of the predictors used in the model is offered. 
Finally, most construction grammars assume that argument constructions form a network structure. A new method is proposed that establishes generalizations over the argument constructions, referred to as the Linking Construction. In sum, this study explores the structural properties of the Russian Reflexive Marker and sets forth a new model that can accommodate both the traditional pairs and potential deviations from them in a principled manner.
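Two of the quantities the study relies on, the Levenshtein distance used for lexical connectivity and Shannon entropy over a verb's distribution across argument constructions ("Constructional Entropy"), have compact standard implementations; the verb pair and construction counts below are invented for illustration:

```python
# Levenshtein (edit) distance via the standard dynamic program, and
# Shannon entropy (bits) over argument-construction frequencies.

import math

def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def constructional_entropy(counts):
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

d = levenshtein("мыться", "мыть")     # reflexive vs. its neighbor verb
h = constructional_entropy([50, 30, 20])   # invented usage counts
```

Low entropy means a verb is concentrated in one construction (carrying little distributional information); maximal entropy means its uses are spread evenly across constructions.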