918 results for Kaski, Antti: The security complex: a theoretical analysis and the Baltic case


Relevance:

100.00%

Publisher:

Abstract:

MHC-peptide multimers containing biotinylated MHC-peptide complexes bound to phycoerythrin (PE) streptavidin (SA) are widely used for analyzing and sorting antigen-specific T cells. Here we describe alternative T cell-staining reagents that are superior to conventional reagents. They are built on reversible chelate complexes of Ni(2+)-nitrilotriacetic acid (NTA) with oligohistidines. We synthesized biotinylated linear mono-, di-, and tetra-NTA compounds using conventional solid-phase peptide chemistry and studied their interaction with HLA-A*0201-peptide complexes containing a His(6), His(12), or 2×His(6) tag by surface plasmon resonance on SA-coated sensor chips and by equilibrium dialysis. The binding avidity increased in the orders His(6) < His(12) < 2×His(6) and NTA(1) < NTA(2) < NTA(4), depending on the configuration of the NTA moieties, and reached picomolar K(D) for the combination of a 2×His(6) tag and a 2×Ni(2+)-NTA(2). We demonstrate that HLA-A2-2×His(6)-peptide multimers built either on Ni(2+)-NTA(4)-biotin and PE-SA or on PE-NTA(4) stained influenza- and Melan-A-specific CD8+ T cells as well as or better than conventional multimers. Although these complexes were highly stable, they dissociated very rapidly in the presence of imidazole, which allowed sorting of bona fide antigen-specific CD8+ T cells without inducing T cell death, as well as assessment of HLA-A2-peptide monomer dissociation kinetics on CD8+ T cells.
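For orientation, the reported avidities follow the standard 1:1 binding equilibrium; the relations below are a generic reminder, not taken from the paper:

```latex
K_D \;=\; \frac{k_{\mathrm{off}}}{k_{\mathrm{on}}}
\;=\; \frac{[\mathrm{NTA}]\,[\mathrm{His}]}{[\mathrm{NTA{\cdot}His}]},
\qquad
\theta \;=\; \frac{[L]}{K_D + [L]}
```

where θ is the fractional occupancy at free ligand concentration [L]; a picomolar K(D) thus implies near-complete occupancy even at very low reagent concentrations, which is what makes the 2×His(6)/2×Ni(2+)-NTA(2) combination attractive for stable staining.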

Relevance:

100.00%

Publisher:

Abstract:

This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning requires extensive quantitative information along with the means for dealing with the technicalities of the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering before focusing on particular numerical probability assignments. Analyses are proposed that intend to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations shows that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute. This case serves to exemplify the possible usage of data from various sources in casework and helps to discuss the difficulty of reconciling the depth of theoretical likelihood ratio developments with the limited degree to which these developments can actually be applied in practice.
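As a minimal sketch of the "relative values suffice" argument (the numbers and function are illustrative, not from the article): any common scaling constant applied to both conditional probability assignments cancels in the ratio.

```python
# Hedged sketch: a source-level likelihood ratio LR = Pr(E|Hp) / Pr(E|Hd).
# If probabilities are known only up to a common constant c (relative values),
# c cancels, so the LR is unchanged -- the point made in the article's first part.

def likelihood_ratio(p_e_given_hp: float, p_e_given_hd: float) -> float:
    """LR for evidence E under prosecution (Hp) and defence (Hd) propositions."""
    return p_e_given_hp / p_e_given_hd

print(likelihood_ratio(0.9, 0.01))            # 90.0, absolute assignments
c = 1e-3                                      # unknown common scaling factor
print(likelihood_ratio(0.9 * c, 0.01 * c))    # 90.0 again: c cancels
```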

Relevance:

100.00%

Publisher:

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data but not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top method-complexity approach was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for extreme-value modeling through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening data classification. Within the classification methods, probabilistic neural networks (PNN) proved better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), by contrast, performed well under balanced category conditions.
In general, it was concluded that no single prediction or estimation method is superior under all conditions of scale and neighborhood definition. Simulations should form the basis, while other methods can provide complementary information to support efficient indoor radon decision making.
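As an illustration of the kind of exploratory tool mentioned above, here is a minimal KNN spatial interpolation sketch; the coordinates and radon values are synthetic stand-ins, not the Swiss dataset.

```python
# Hedged sketch: KNN interpolation of indoor radon over synthetic locations.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(500, 2))       # x, y locations (synthetic)
radon = np.exp(rng.normal(4.0, 1.0, size=500))    # lognormal-like Bq/m3 values

model = KNeighborsRegressor(n_neighbors=10, weights="distance")
model.fit(coords, radon)

# estimate the radon surface on a regular grid
grid = np.array([[x, y] for x in range(0, 101, 5) for y in range(0, 101, 5)])
estimate = model.predict(grid)
```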

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: The aim of this study was to develop models based on kernel regression and probability estimation in order to predict and map IRC in Switzerland by taking into account all of the following: architectural factors, spatial relationships between the measurements, and geological information. METHODS: We looked at about 240,000 IRC measurements carried out in about 150,000 houses. As predictor variables we included building type, foundation type, year of construction, detector type, geographical coordinates, altitude, temperature and lithology in the kernel estimation models. We developed predictive maps as well as a map of the local probability of exceeding 300 Bq/m(3). Additionally, we developed a map of a confidence index in order to estimate the reliability of the probability map. RESULTS: Our models were able to explain 28% of the variation in the IRC data. All variables added information to the model. The model estimation yielded a bandwidth for each variable, making it possible to characterize the influence of each variable on the IRC estimate. Furthermore, we assessed the mapping characteristics of kernel estimation overall as well as by municipality. Overall, our model reproduces spatial IRC patterns that were already obtained earlier. At the municipal level, we could show that our model accounts well for IRC trends within municipal boundaries. Finally, we found that different building characteristics result in different IRC maps. Maps corresponding to detached houses with concrete foundations systematically indicate lower IRC than maps corresponding to farms with earth foundations. CONCLUSIONS: IRC mapping based on kernel estimation is a powerful tool for predicting and analyzing IRC at large scale as well as at the local level. This approach makes it possible to develop tailor-made maps for different architectural elements and measurement conditions while at the same time accounting for geological information and spatial relations between IRC measurements.
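A minimal Nadaraya-Watson sketch of kernel regression with one bandwidth per predictor may clarify how a fitted bandwidth characterizes each variable's influence; the predictors, data and bandwidths below are invented, not the study's.

```python
# Hedged sketch: product-Gaussian kernel regression with per-variable bandwidths.
import numpy as np

def kernel_regression(X, y, x_query, bandwidths):
    """Nadaraya-Watson estimate of E[y | x_query]."""
    z = (X - x_query) / bandwidths              # scale each predictor by its bandwidth
    w = np.exp(-0.5 * np.sum(z**2, axis=1))     # product of 1-D Gaussian kernels
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(1)
X = rng.random((1000, 4))                       # e.g. easting, northing, altitude, year
y = rng.lognormal(4.0, 1.0, size=1000)          # synthetic IRC values in Bq/m3
h = np.array([0.10, 0.10, 0.20, 0.30])          # one bandwidth per variable
print(kernel_regression(X, y, X[0], h))
```

A large fitted bandwidth effectively smooths a variable away, while a small one means the estimate reacts sharply to it, which is one way to read the "influence of each variable" result above.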

Relevance:

100.00%

Publisher:

Abstract:

Industry's growing need for higher productivity is placing new demands on mechanisms connected to electric motors, because these can easily lead to vibration problems due to fast dynamics. Furthermore, the nonlinear effects caused by a motor frequently reduce servo stability, which diminishes the controller's ability to predict and maintain speed. Hence, the flexibility of a mechanism and its control has become an important area of research. The basic approach in control system engineering is to assume that the mechanism connected to a motor is rigid, so that vibrations in the tool mechanism, reel, gripper or any apparatus connected to the motor are not taken into account. This may reduce the ability of the machine system to carry out its assignment and shorten the lifetime of the equipment. It is therefore usually more important to know how the mechanism, in other words the load on the motor, behaves. A nonlinear load control method for a permanent magnet linear synchronous motor is developed and implemented in the thesis. The purpose of the controller is to track a flexible load to the desired velocity reference as fast as possible and without undesirable oscillations. The control method is based on an adaptive backstepping algorithm, with its stability ensured by the Lyapunov stability theorem. As a reference controller for the backstepping method, a hybrid neural controller is introduced in which the linear motor itself is controlled by a conventional PI velocity controller and the vibration of the associated flexible mechanism is suppressed from an outer control loop using a compensation signal from a multilayer perceptron network. To avoid the local minimum problem of neural networks, the initial weights are searched for offline by means of a differential evolution algorithm. The states of the mechanical system required by the controllers are estimated using a Kalman filter. The theoretical results obtained from the control design are validated with a lumped-mass model of the mechanism. Generalization of the mechanism allows the methods derived here to be widely implemented in machine automation. The control algorithms are first designed in a specially introduced nonlinear simulation model and then implemented in the physical linear motor using a DSP (Digital Signal Processor) application. The measurements prove that both controllers are capable of suppressing vibration, but that the backstepping method is superior due to its response accuracy and stability properties.
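The textbook pattern behind the thesis's controller can be shown on a double-integrator plant; this is a generic integrator-backstepping sketch (no adaptation, no flexible-load model), so it illustrates only the Lyapunov-based recursion, not the actual motor controller.

```python
# Hedged sketch: integrator backstepping for x1' = x2, x2' = u, tracking r(t).
# With e1 = x1 - r and e2 = x2 - alpha, the control law below gives
# Vdot = -k1*e1^2 - k2*e2^2 for V = (e1^2 + e2^2)/2 (Lyapunov stability).
import numpy as np

k1, k2 = 5.0, 5.0
dt, T = 1e-3, 3.0
x1, x2 = 0.0, 0.0

def ref(t):
    """Velocity reference and its first two derivatives."""
    return np.sin(t), np.cos(t), -np.sin(t)

for step in range(int(T / dt)):
    r, rd, rdd = ref(step * dt)
    e1 = x1 - r
    alpha = rd - k1 * e1              # virtual control for x2
    e2 = x2 - alpha
    alpha_dot = rdd - k1 * (x2 - rd)
    u = alpha_dot - e1 - k2 * e2      # backstepping control law
    x1 += x2 * dt                     # Euler step of the plant
    x2 += u * dt

print(abs(x1 - ref(T)[0]))            # small tracking error after 3 s
```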

Relevance:

100.00%

Publisher:

Abstract:

Data transmission between an electric motor and a frequency converter is required in variable-speed electric drives because of sensors installed at the motor. Sensor information can be used in various useful applications to improve system reliability and properties. Traditionally, the communication medium is implemented with additional cabling. However, the cost of the traditional method may be an obstacle to the wider application of data transmission between a motor and a frequency converter. In any case, a power cable is always installed between a motor and a frequency converter for the power supply, and hence it may be applied as a communication medium for sensor-level data. This thesis considers power line communication (PLC) in inverter-fed motor power cables. The motor cable is studied as a communication channel in the frequency band of 100 kHz−30 MHz. The communication channel and noise characteristics are described. All the individual components included in a variable-speed electric drive are presented in detail. A channel model is developed and verified by measurements. A theoretical channel information capacity analysis is carried out to estimate the potential of the communication medium. Suitable communication and forward error correction (FEC) methods are suggested. A general method to implement a broadband, Ethernet-based communication medium between a motor and a frequency converter is proposed. A coupling interface is also developed that allows the communication device to be safely connected to a three-phase inverter-fed motor power cable. Practical tests are carried out, and the results are analyzed. Possible applications of the proposed method are presented. A speed-feedback motor control application is verified in detail by simulations and laboratory tests because of the restrictions that the PLC-induced delay places on the feedback loop. Other possible applications are discussed at a more general level.
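For reference, a channel capacity analysis of this kind is conventionally based on Shannon's formula integrated over the studied band; this generic form is a reminder, not the thesis's derivation:

```latex
C \;=\; \int_{100\,\mathrm{kHz}}^{30\,\mathrm{MHz}} \log_2\!\bigl(1 + \mathrm{SNR}(f)\bigr)\,\mathrm{d}f
```

where SNR(f) follows from the measured channel transfer function and noise spectrum.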

Relevance:

100.00%

Publisher:

Abstract:

The identification of biomarkers of vascular cognitive impairment is urgent for its early diagnosis. The aim of this study was to detect and monitor changes in brain structure and connectivity, and to correlate them with the decline in executive function. We examined the feasibility of early diagnostic magnetic resonance imaging (MRI) to predict cognitive impairment before onset in an animal model of chronic hypertension: spontaneously hypertensive rats. Cognitive performance was tested in an operant conditioning paradigm that evaluated learning, memory, and behavioral flexibility skills. Behavioral tests were coupled with longitudinal diffusion-weighted imaging acquired with 126 diffusion gradient directions and 0.3 mm(3) isometric resolution at 10, 14, 18, 22, 26, and 40 weeks after birth. Diffusion-weighted imaging was analyzed in two different ways: by regional characterization of diffusion tensor imaging (DTI) indices, and by assessing changes in structural brain network organization based on Q-Ball tractography. Already at the first evaluated time points, DTI scalar maps revealed significant differences in many regions, suggesting loss of integrity in the white and gray matter of spontaneously hypertensive rats when compared to normotensive control rats. In addition, graph theory analysis of the structural brain network demonstrated a significant decrease in hierarchical modularity and in global and local efficiency, with predictive value as shown by a regional three-fold cross-validation study. Moreover, these decreases were significantly correlated with the behavioral performance deficits observed at subsequent time points, suggesting that diffusion-weighted imaging and connectivity studies can unravel neuroimaging alterations even before overt signs of cognitive impairment become apparent.
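The network measures named above are standard graph-theory quantities; a minimal sketch on a synthetic graph (not the rats' connectomes) shows how they are computed:

```python
# Hedged sketch: global and local efficiency of a stand-in 'connectome'.
import networkx as nx

G = nx.watts_strogatz_graph(n=90, k=6, p=0.1, seed=1)  # synthetic brain-like graph
print(nx.global_efficiency(G))   # mean inverse shortest-path length over node pairs
print(nx.local_efficiency(G))    # mean efficiency of each node's neighbourhood
```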

Relevance:

100.00%

Publisher:

Abstract:

The present investigation reports on the interaction of C/O triplet atoms encapsulated inside [60]fullerene (C60) with small polar molecules (H2O, CH3OH, HF, NH3) using Density Functional Theory (DFT) calculations. The calculations show that in all the computed cases the complexes with the polar molecules are more stable when an atom is encapsulated than when the fullerene contains no internal atom.
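The stability comparison presumably rests on the usual complexation-energy bookkeeping; the generic form (our notation, not necessarily the paper's) is:

```latex
\Delta E_{\mathrm{bind}}
= E\bigl(\mathrm{X@C_{60}\cdots M}\bigr)
- E\bigl(\mathrm{X@C_{60}}\bigr)
- E\bigl(\mathrm{M}\bigr),
\qquad
\mathrm{X} \in \{\mathrm{C},\mathrm{O}\},\;
\mathrm{M} \in \{\mathrm{H_2O},\,\mathrm{CH_3OH},\,\mathrm{HF},\,\mathrm{NH_3}\}
```

with a more negative binding energy for X@C60 than for empty C60 indicating the extra stabilization reported.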

Relevance:

100.00%

Publisher:

Abstract:

This work investigates the Bullwhip Effect, one of the most important phenomena in contemporary supply chain management. The author uses the most recent theoretical apparatus to analyze the operational activities of a leading FMCG company, British American Tobacco Eastern Europe. The paper investigates and describes the supply chain management process at BAT and considers the impact of the Bullwhip Effect together with the potential risks threatening the company's operations. Emergence of the Bullwhip Effect leads to supply chain inefficiency. The paper contains methodological recommendations for supply chain risk mitigation, the description of a real case study, and an analytical study of internal and external supply chain processes.
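A toy simulation can make the mechanism concrete; everything below (demand process, policy, parameters) is invented for illustration and has nothing to do with BAT's data.

```python
# Hedged sketch: order variability amplification under a simple replenishment
# policy with an inventory-gap correction and a two-period lead time.
import random
import statistics

random.seed(1)
lead = 2
demand = [100 + random.gauss(0, 10) for _ in range(500)]

inventory, target = 200.0, 200.0
pipeline = [100.0] * lead            # orders placed but not yet received
orders = []

for d in demand:
    inventory += pipeline.pop(0) - d              # receive oldest order, ship demand
    o = max(0.0, d + 0.5 * (target - inventory))  # replenish + close inventory gap
    pipeline.append(o)
    orders.append(o)

print(statistics.pstdev(demand[50:]))  # variability of end-customer demand
print(statistics.pstdev(orders[50:]))  # order variability: typically larger (bullwhip)
```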

Relevance:

100.00%

Publisher:

Abstract:

Communications play a key role in modern smart grids. The new functionalities that make the grids 'smart' require the communication network to function properly. Data transmission between intelligent electric devices (IEDs) in the rectifier and the customer-end inverters (CEIs) used for power conversion is also required in the smart grid concept of the low-voltage direct current (LVDC) distribution network. Smart grid applications such as smart metering, demand-side management (DSM), and communications-based grid protection are all installed in the LVDC system. Thus, besides a remote connection to the databases of the grid operators, a local communication network in the LVDC network is needed. One solution for implementing the communication medium in power distribution grids is power line communication (PLC). There are power cables in the distribution grids, and hence they may be applied as a communication channel for the distribution-level data. This doctoral thesis proposes an IP-based high-frequency (HF) band PLC data transmission concept for the LVDC network. A general method to implement the Ethernet-based PLC concept between the public distribution rectifier and the customer-end inverters in the LVDC grid is introduced. Low-voltage cables are studied as the communication channel in the frequency band of 100 kHz–30 MHz. The communication channel characteristics and the noise in the channel are described. All individual components in the channel are presented in detail, and a channel model, comprising models for each channel component, is developed and verified by measurements. The channel noise is also studied by measurements. Theoretical signal-to-noise ratio (SNR) and channel capacity analyses and practical data transmission tests are carried out to evaluate the applicability of the PLC concept against the requirements set by the smart grid applications in the LVDC system. The main results concerning the applicability of the PLC concept and its limitations are presented, and suggestions for future research are proposed.
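A back-of-envelope version of the SNR-to-capacity step might look as follows; the SNR profile is invented, whereas the thesis derives it from the measured channel and noise models.

```python
# Hedged sketch: aggregate Shannon capacity of the 100 kHz - 30 MHz band,
# approximated as a sum over narrow subchannels with assumed SNRs.
import numpy as np

f = np.linspace(100e3, 30e6, 300)          # subchannel centre frequencies (Hz)
df = f[1] - f[0]                           # subchannel width (Hz)
snr_db = 30.0 - 25.0 * (f / 30e6)          # assumed SNR falling with frequency
snr = 10 ** (snr_db / 10)
capacity = np.sum(df * np.log2(1 + snr))   # bit/s
print(capacity / 1e6, "Mbit/s")
```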

Relevance:

100.00%

Publisher:

Abstract:

Crude extracts of house dust mites are used clinically for the diagnosis and immunotherapy of allergic diseases, including bronchial asthma, perennial rhinitis, and atopic dermatitis. However, crude extracts are complex mixtures that contain non-allergenic antigens and lack effective concentrations of important allergens, resulting in several side effects. Dermatophagoides farinae (Hughes; Acari: Pyroglyphidae) is one of the predominant sources of dust mite allergens, with more than 30 allergen groups. The cDNA coding for the group 5 allergen of D. farinae from China was cloned, sequenced and expressed. According to an alignment performed with the VECTOR NTI 9.0 software, there were eight mismatched nucleotides in five cDNA clones, resulting in seven incompatible amino acid residues and suggesting that the Der f 5 allergen might show sequence polymorphism. Bioinformatics analysis revealed that the mature Der f 5 allergen has a molecular mass of 13604.03 Da and a theoretical pI of 5.43, and is probably hydrophobic and cytoplasmic. The similarities in amino acid sequence between Der f 5 and allergens of other domestic mite species, viz. Der p 5, Blo t 5, Sui m 5, and Lep d 5, were 79, 48, 53, and 37%, respectively. Phylogenetic analysis indicated that Der f 5 and Der p 5 clustered together. Blo t 5 and Ale o 5 also clustered together, although Blomia tropicalis and Aleuroglyphus ovatus belong to different mite families, viz. Echimyopodidae and Acaridae, respectively.
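The reported mass, pI and hydropathy are the kind of quantities standard tools compute directly from the sequence; a Biopython sketch with a placeholder sequence (NOT the real Der f 5 sequence) looks like this:

```python
# Hedged sketch: ProtParam-style protein characterization.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

seq = "MKFIILALCLLAVAAAEHKE"        # hypothetical stand-in, not Der f 5
pa = ProteinAnalysis(seq)
print(pa.molecular_weight())        # molecular mass in Da
print(pa.isoelectric_point())       # theoretical pI
print(pa.gravy())                   # hydropathy (positive = hydrophobic)
```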

Relevance:

100.00%

Publisher:

Abstract:

In 'Avalanche', an object is lowered, with players staying in contact with it throughout. Normally the task is easily accomplished. However, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. With more players, sets of balancing loops interact, and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model give insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. The problematic behaviour is seen to arise only when the geometric effects apply (number of players greater than the degrees of freedom of the object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations; the Prisoners' Dilemma; and integrated bargaining situations. Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building, and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create a deep understanding of social phenomena.
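A deliberately crude sketch can illustrate the loop-dominance argument; every equation and number below is invented and far simpler than the paper's model, which also includes the HPB effects.

```python
# Hedged sketch: a balancing 'lower the object' loop plus a per-player upward
# contact-keeping push. With enough players the summed push dominates and the
# object settles well above the goal -- qualitatively a 'problematic' mode.
def simulate(players, steps=400, dt=0.05):
    height, goal = 1.0, 0.0
    for _ in range(steps):
        lowering = -0.6 * (height - goal)   # balancing loop toward the goal
        push = 0.05 * players               # upward push from keeping contact
        height += (lowering + push) * dt
    return height

print(simulate(players=2))    # settles near the goal: 'desired' mode
print(simulate(players=20))   # settles far above it: counter-intuitive mode
```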

Relevance:

100.00%

Publisher:

Abstract:

Social tagging has become very popular on the Internet as well as in research. The main idea behind tagging is to allow users to attach metadata to web content from their own perspective in order to facilitate categorization and retrieval. Many factors influence users' tag choice, and many studies have been conducted to reveal these factors by analysing tagging data. This paper uses two theories to identify these factors, namely semiotics and activity theory. The former treats tags as signs and the latter treats tagging as an activity. The paper uses both theories to analyse tagging behaviour by explaining all aspects of a tagging system, including tags, tagging system components and the tagging activity. The theoretical analysis produced a framework that was used to identify a number of factors. These factors can be considered as categories that can be consulted to redirect users' tag choice in order to support particular tagging behaviour, such as cross-lingual tagging.

Relevance:

100.00%

Publisher:

Abstract:

We demonstrate that stakeholder-oriented multi-criteria analysis (MCA) can adequately address a variety of sustainable development dilemmas in decision-making, especially when applied to complex project evaluations involving multiple objectives and multiple stakeholder groups. Such evaluations are typically geared towards simultaneously satisfying private economic goals, broader social objectives and environmental targets. We show that, under specific conditions, a variety of stakeholder-oriented MCA approaches may be able to contribute substantively to the resolution or improved governance of societal conflicts and the pursuit of the public good in the form of sustainable development. We contrast the potential usefulness of these stakeholder-oriented approaches, in terms of their ability to contribute to sustainable development, with that of more conventional MCA approaches and social cost-benefit analysis.
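A minimal weighted-sum sketch shows how stakeholder-specific weights can reverse a ranking; the alternatives, criteria and weights are invented, and real stakeholder-oriented MCA typically uses richer aggregation rules than this.

```python
# Hedged sketch: normalized criterion scores per option; each stakeholder
# group applies its own weights and may rank the options differently.
criteria = ["economic", "social", "environmental"]
scores = {
    "option_A": [0.9, 0.4, 0.3],
    "option_B": [0.5, 0.7, 0.8],
}
weights = {
    "investors": [0.6, 0.2, 0.2],
    "residents": [0.2, 0.5, 0.3],
    "regulator": [0.3, 0.2, 0.5],
}
for group, w in weights.items():
    ranked = sorted(scores, key=lambda o: -sum(s * wi for s, wi in zip(scores[o], w)))
    print(group, ranked)
```

Here the investors rank option_A first (weighted score 0.68 vs 0.60), while the residents and the regulator rank option_B first; surfacing and governing exactly this kind of conflict is what the stakeholder-oriented approaches are meant to do.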