941 results for Algorithms, Properties, the KCube Graphs
Abstract:
This paper presents the German version of the Short Understanding of Substance Abuse Scale (SUSS) [Humphreys et al.: Psychol Addict Behav 1996;10:38-44], the Verständnis von Störungen durch Substanzkonsum (VSS), and evaluates its psychometric properties. The VSS assesses clinicians' beliefs about the nature and treatment of substance use disorders, particularly their endorsement of psychosocial and disease orientation. The VSS was administered to 160 treatment staff members at 12 substance use disorder treatment programs in the German-speaking part of Switzerland. Because the confirmatory factor analysis of the VSS did not completely replicate the factorial structure of the SUSS, an exploratory factor analysis was undertaken. This analysis identified two factors: the Psychosocial model factor and a slightly different Disease model factor. The VSS Disease and Psychosocial subscales showed convergent and discriminant validity, as well as sufficient reliability.
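The extraction of a two-factor solution as described above can be illustrated with a minimal principal-axis sketch. Everything below (200 respondents, 6 items, the disease/psychosocial split, the loadings) is synthetic stand-in data, not the VSS items or results:

```python
import numpy as np

# Illustrative principal-axis step of an exploratory factor analysis.
# Synthetic data: items 0-2 tap one latent orientation, items 3-5 another.
rng = np.random.default_rng(0)
disease = rng.normal(size=(200, 1))        # hypothetical "disease model" score
psychosocial = rng.normal(size=(200, 1))   # hypothetical "psychosocial" score
items = np.hstack([
    disease + 0.5 * rng.normal(size=(200, 3)),       # items tapping factor 1
    psychosocial + 0.5 * rng.normal(size=(200, 3)),  # items tapping factor 2
])

# Eigendecompose the item correlation matrix and keep the two largest
# factors; loadings are eigenvectors scaled by the square root of the
# corresponding eigenvalues.
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1][:2]
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])
print(loadings.shape)
```

The unrotated loadings of the two retained factors would, in practice, be rotated and interpreted against the questionnaire's subscales.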
Abstract:
In this dissertation, the problem of designing effective control algorithms for large-scale Adaptive Optics (AO) systems for the new generation of giant optical telescopes is addressed. The effectiveness of AO control algorithms is evaluated in several respects: computational complexity, compensation error rejection, and robustness, i.e. reasonable insensitivity to system imperfections. The results of this research are summarized as follows:
1. Robustness study of the Sparse Minimum Variance Pseudo Open Loop Controller (POLC) for multi-conjugate adaptive optics (MCAO). An AO system model that accounts for various system errors was developed and applied to check the stability and performance of the POLC algorithm, one of the most promising approaches for future AO system control. Numerous simulations show that, despite the initial assumption that exact system knowledge is necessary for the POLC algorithm to work, it is highly robust against various system errors.
2. Predictive Kalman Filter (KF) and Minimum Variance (MV) control algorithms for MCAO. The limiting performance of the non-dynamic Minimum Variance and the dynamic KF-based phase estimation algorithms for MCAO was evaluated via Monte Carlo simulations. The validity of a simple near-Markov autoregressive model of phase dynamics was tested, and its ability to predict the turbulence phase was demonstrated for both single-conjugate and multi-conjugate AO. It was also shown that the more complicated KF approach yields no performance improvement over the much simpler MV algorithm in the case of MCAO.
3. Sparse predictive Minimum Variance control algorithm for MCAO. A temporal prediction stage was added to the non-dynamic MV control algorithm in such a way that no additional computational burden is introduced. Simulations confirm that phase prediction makes it possible to significantly reduce the system sampling rate, and thus the overall computational complexity, while keeping the system stable and effectively compensating for measurement and control latencies.
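The near-Markov autoregressive prediction idea of point 2 can be sketched for a single phase mode. The AR coefficient below is illustrative, not the dissertation's fitted turbulence model:

```python
import numpy as np

# Hypothetical first-order (near-Markov) AR model of one turbulent phase
# mode: phi[t+1] = a * phi[t] + w[t]. The coefficient `a` is illustrative.
rng = np.random.default_rng(1)
a, n = 0.8, 20_000
w = rng.normal(scale=np.sqrt(1 - a**2), size=n)  # keeps stationary variance = 1
phi = np.zeros(n)
for t in range(n - 1):
    phi[t + 1] = a * phi[t] + w[t]

# For this model the one-step minimum-variance prediction is simply a*phi[t].
# Compare its residual variance against a zero-order hold, i.e. reusing the
# last phase estimate to compensate the one-frame latency.
pred_err = phi[1:] - a * phi[:-1]
hold_err = phi[1:] - phi[:-1]
print(pred_err.var(), hold_err.var())
```

The prediction residual approaches the theoretical value 1 - a², below the zero-order-hold residual 2(1 - a), which is the sense in which even a simple AR predictor can relax the latency (and hence sampling-rate) requirement.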
Abstract:
Ab initio Hartree-Fock (HF), density functional theory (DFT), and hybrid potentials were employed to compute the optimized lattice parameters and elastic properties of perovskite 3-d transition metal oxides. The optimized lattice parameters and elastic properties are interdependent in these materials. An interaction is observed between the electronic charge, spin, and lattice degrees of freedom in 3-d transition metal oxides. This coupling between the electronic charge, spin, and lattice structures originates from the localization of the d atomic orbitals, and it also contributes to the ferroelectric and ferromagnetic properties of perovskites. The cubic and tetragonal crystalline structures of ABO3 perovskite transition metal oxides are studied. The electronic structure and physics of 3-d perovskite materials are complex and less well understood. Moreover, the novelty of the electronic structure and properties of these perovskite transition metal oxides exceeds the challenge offered by their complex crystalline structures. To understand the structure-property relationship of these materials, a first-principles computational method is employed. The CRYSTAL09 code is used for computing the crystalline structure and the elastic, ferromagnetic, and other electronic properties. Second-order elastic constants (SOEC) and bulk moduli (B) are computed in an automated process using the ELASTCON (elastic constants) and EOS (equation of state) programs of the CRYSTAL09 code. ELASTCON, EOS, and other computational algorithms are utilized to determine the elastic properties of tetragonal BaTiO3, rutile TiO2, and cubic and tetragonal BaFeO3, and the ferromagnetic properties of 3-d transition metal oxides. Multiple methods are employed to cross-check the consistency of our computational results, which motivated us to explore the ferromagnetic properties of 3-d transition metal oxides.
Billyscript and the CRYSTAL09 code are employed to compute the optimized geometry of the cubic and tetragonal crystalline structures of the transition metal oxides of Sc to Cu. The cubic crystalline structure is initially chosen to determine the effect of lattice strains on the ferromagnetism due to the spin angular momentum of the electron. The 3-d transition metals and their oxides are challenging because the basis functions and potentials are not fully developed to address their complex physics. Moreover, perovskite crystalline structures are extremely demanding with respect to the quality of the computations, which require well-established methods. Ferroelectric and ferromagnetic properties of bulk materials, surfaces, and interfaces are explored using the CRYSTAL09 code. Our computations on the cubic TMOs of Sc-Fe show a coupling between the crystalline structure and FM/AFM spin polarization. Strained crystalline structures of 3-d transition metal oxides undergo changes in their electromagnetic and electronic properties. The electronic structure and properties of bulk materials, composites, and surfaces of 3-d transition metal oxides are computed successfully.
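The equation-of-state route to the bulk modulus mentioned above can be sketched generically: fit the total energy E(V) near the equilibrium volume V0 and evaluate B = V · d²E/dV² at the minimum. The code below is not CRYSTAL09's EOS program; it fits synthetic harmonic data that stand in for ab initio total energies, with arbitrary units:

```python
import numpy as np

# Illustrative EOS-style bulk-modulus extraction. V0 and B_true are
# hypothetical values in arbitrary units, not computed material data.
V0, B_true = 64.0, 0.5
V = np.linspace(0.94 * V0, 1.06 * V0, 11)     # volumes around equilibrium
E = 0.5 * (B_true / V0) * (V - V0) ** 2       # harmonic E(V) about V0

c2, c1, c0 = np.polyfit(V, E, 2)              # fit E ~ c2*V^2 + c1*V + c0
V_min = -c1 / (2 * c2)                        # fitted equilibrium volume
B_fit = V_min * 2 * c2                        # B = V * E''(V) at the minimum
print(V_min, B_fit)
```

In practice a Birch-Murnaghan or similar anharmonic form is fitted over a wider volume range, but the quadratic case shows the principle.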
Abstract:
In 1969, Lovász asked whether every connected, vertex-transitive graph has a Hamilton path. This question has generated a considerable amount of interest, yet remains wide open. To date, no connected, vertex-transitive graph is known that does not possess a Hamilton path. For Cayley graphs, a subclass of vertex-transitive graphs, the following conjecture was made: Weak Lovász Conjecture: Every nontrivial, finite, connected Cayley graph is hamiltonian. The Chen-Quimpo Theorem proves that Cayley graphs on abelian groups flourish with Hamilton cycles, which prompted Alspach to make the following conjecture: Alspach Conjecture: Every 2k-regular, connected Cayley graph on a finite abelian group has a Hamilton decomposition. Alspach's conjecture is true for k = 1 and 2, but even the case k = 3 is still open. It is this case that this thesis addresses. Chapters 1 and 2 give introductory material and past work on the conjecture. Chapter 3 investigates the relationship between 6-regular Cayley graphs and associated quotient graphs, and gives a proof of Alspach's conjecture for the odd order case when k = 3. Chapter 4 provides a proof of the conjecture for even order graphs with 3-element connection sets that have an element generating a subgroup of index 2 and a linear dependency among the other generators. Chapter 5 shows that if Γ = Cay(A, {s1, s2, s3}) is a connected, 6-regular, abelian Cayley graph of even order, and for some 1 ≤ i ≤ 3, Δi = Cay(A/(si), {sj1, sj2}) is 4-regular, and Δi ≄ Cay(ℤ3, {1, 1}), then Γ has a Hamilton decomposition. Alternatively stated, if Γ = Cay(A, S) is a connected, 6-regular, abelian Cayley graph of even order, then Γ has a Hamilton decomposition if S has no involutions, and for some s ∈ S, Cay(A/(s), S) is 4-regular and of order at least 4.
Finally, the Appendices give computational data resulting from C and MAGMA programs used to generate Hamilton decompositions of certain non-isomorphic Cayley graphs on low order abelian groups.
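As a small sketch of the kind of object studied here, the code below builds a connected, 6-regular Cayley graph over ℤ_n with a hypothetical connection set and checks those two properties; it does not reproduce the thesis's C and MAGMA decomposition programs:

```python
from collections import deque

def cayley_graph_zn(n, connection):
    """Cay(Z_n, S): close the connection set under negation, then join each
    vertex v to v + s (mod n) for every s in the closed set."""
    S = set(connection) | {(-s) % n for s in connection}
    return {v: sorted((v + s) % n for s in S) for v in range(n)}

def is_connected(adj):
    """Breadth-first search from vertex 0."""
    seen, queue = {0}, deque([0])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == len(adj)

# Hypothetical example: Cay(Z_12, {1, 3, 5}). No generator is an involution
# (only 6 would be, in Z_12), so the graph is 6-regular; gcd(1, 12) = 1
# makes it connected, placing it in the family the thesis studies.
G = cayley_graph_zn(12, {1, 3, 5})
print(all(len(nbrs) == 6 for nbrs in G.values()), is_connected(G))
```

Searching such graphs for a decomposition into three edge-disjoint Hamilton cycles is the computational task the Appendices' programs perform.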
Abstract:
More than 3000 years ago, men began quenching and tempering tools to improve their physical properties. The ancient people found that iron was easier to shape and form in a heated condition. Charcoal was used as the fuel, and when the shaping process was completed, the smiths cooled the piece in the most obvious way, quenching in water. Quite unintentionally, these people stumbled on the process for improving the properties of iron, and the art of blacksmithing began.
Abstract:
The central question of this paper is how to improve the production process by closing the gap between industrial designers and software engineers of television (TV)-based User Interfaces (UIs) in an industrial environment. Software engineers are highly interested in whether one UI design can be converted into several fully functional UIs for TV products with different screen properties. Their aim is to apply automatic layout and scaling in order to speed up and improve the production process. The question, however, is whether a UI design lends itself to such automatic layout and scaling. This is investigated by analysing a prototype UI design produced by industrial designers. In a first requirements study, industrial designers created meta-annotations on top of their UI design in order to disclose their design rationale for discussions with software engineers. In a second study, five (out of ten) industrial designers assessed the potential of four different meta-annotation approaches. The question was which annotation method industrial designers would prefer and whether it could satisfy the technical requirements of the software engineering process. One main result is that the industrial designers preferred the method they were already familiar with, which therefore seems to be the most effective one, although the main objective of automatic layout and scaling could still not be achieved.
Abstract:
The article introduces the E-learning Circle, a tool developed to assure the quality of the software design process of e-learning systems, considering pedagogical principles as well as technology. The E-learning Circle consists of a number of concentric circles which are divided into three sectors. The content of the inner circles is based on pedagogical principles, while the outer circle specifies how the pedagogical principles may be implemented with technology. The circle’s centre is dedicated to the subject taught, ensuring focus on the specific subject’s properties. The three sectors represent the student, the teacher and the learning objectives. The strengths of the E-learning Circle are the compact presentation combined with the overview it provides, as well as the usefulness of a design tool dealing with complexity, providing a common language and embedding best practice. The E-learning Circle is not a prescriptive method, but is useful in several design models and processes. The article presents two projects where the E-learning Circle was used as a design tool.
Abstract:
Meteorological or climatological extremes are rare, and hence studying them requires long meteorological data sets. Moreover, for addressing the underlying atmospheric processes, detailed three-dimensional data are desired. Until recently the two requirements were incompatible, as long meteorological series were only available for a few locations, whereas detailed three-dimensional data sets such as reanalyses were limited to the past few decades. In 2011, the "Twentieth Century Reanalysis" (20CR) was released, a 6-hourly global atmospheric data set covering the past 140 years, thus combining the two properties. The collection of short papers in this volume contains case studies of individual extreme events in the 20CR data set. In this overview paper we introduce the first six cases and summarise some common findings. All of the events are represented in 20CR in a physically consistent way, allowing further meteorological interpretations and process studies. However, for most of the events, the magnitudes are underestimated in the ensemble mean; possible causes are addressed. For interpreting extremes it may be necessary to examine individual ensemble members. Also, the density of observations underlying 20CR should be considered. Finally, we point to problems in wind speeds over the Arctic and the northern North Pacific in 20CR prior to the 1950s.
Abstract:
Purpose Femoral fracture is a common medical problem in osteoporotic individuals. Bone mineral density (BMD) is the gold standard measure to evaluate fracture risk in vivo. Quantitative computed tomography (QCT)-based homogenized voxel finite element (hvFE) models have been shown to be more accurate predictors of femoral strength than BMD, as they add geometrical and material properties. The aim of this study was to evaluate the ability of hvFE models to predict femoral stiffness, strength, and failure location for a large number of pairs of human femora tested in two different loading scenarios. Methods Thirty-six pairs of femora were scanned with QCT, and total proximal BMD and BMC were evaluated. For each pair, one femur was positioned in a one-legged stance configuration (STANCE) and the other in a sideways configuration (SIDE). Nonlinear hvFE models were generated from the QCT images, reproducing the same loading configurations imposed in the experiments. For both experiments and models, the structural properties (stiffness and ultimate load), the failure location, and the motion of the femoral head were computed and compared. Results In both configurations, hvFE models predicted both stiffness (R2=0.82 for STANCE and R2=0.74 for SIDE) and femoral ultimate load (R2=0.80 for STANCE and R2=0.85 for SIDE) better than BMD and BMC. Moreover, the models predicted the failure location (66% of cases) and the motion of the femoral head qualitatively well. Conclusions The subject-specific QCT-based nonlinear hvFE model can not only predict femoral apparent mechanical properties better than densitometric measures, but can additionally provide useful qualitative information about failure location.
Abstract:
Reproducing the characteristics and the functional responses of the blood-brain barrier (BBB) in vitro represents an important task for the research community and would be a critical biotechnological breakthrough. The pharmaceutical and biotechnology industries have a strong demand for inexpensive and easy-to-handle in vitro BBB models to screen novel drug candidates. Recently, it was shown that canonical Wnt signaling is responsible for the induction of BBB properties in the neonatal brain microvasculature in vivo. In the present study, following on from earlier observations, we have developed a novel in vitro model of the BBB that may be suitable for large-scale screening assays. This model is based on immortalized endothelial cell lines derived from murine and human brain, with no need for co-culture with astrocytes. To maintain the BBB endothelial cell properties, the cell lines are cultured in the presence of Wnt3a or drugs that stabilize β-catenin, or they are infected with a transcriptionally active form of β-catenin. Upon these treatments, the cell lines maintain the expression of BBB-specific markers, which results in elevated transendothelial electrical resistance and reduced cell permeability. Importantly, these properties are retained for several passages in culture, and they can be reproduced and maintained in different laboratories over time. We conclude that the brain-derived endothelial cell lines that we have investigated gain their specialized characteristics upon activation of the canonical Wnt pathway. This model may thus be suitable to test BBB permeability to chemicals or large-molecular-weight proteins, transmigration of inflammatory cells, treatments with cytokines, and genetic manipulation.
Abstract:
The variability of results from different automated methods of detection and tracking of extratropical cyclones is assessed in order to identify uncertainties related to the choice of method. Fifteen international teams applied their own algorithms to the same dataset - the period 1989-2009 of the interim European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-Interim) data. This experiment is part of the community project Intercomparison of Mid Latitude Storm Diagnostics (IMILAST; see www.proclim.ch/imilast/index.html). The spread of results for cyclone frequency, intensity, life cycle, and track location is presented to illustrate the impact of using different methods. In general, methods agree well on the geographical distribution in large oceanic regions, the interannual variability of cyclone numbers, the geographical patterns of strong trends, and the distribution shape for many life cycle characteristics. In contrast, the largest disparities exist for the total numbers of cyclones, the detection of weak cyclones, and the distribution in some densely populated regions. Consistency between methods is better for strong cyclones than for shallow ones. Two case studies of relatively large, intense cyclones reveal that the identification of the most intense part of the life cycle of these events is robust between methods, but considerable differences exist during the development and dissolution phases.
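One detection step shared by many cyclone-identification schemes, flagging sea-level-pressure local minima, can be sketched on a synthetic field. The grid, noise level, and 5 hPa depth threshold below are illustrative, not ERA-Interim data or any team's actual criterion:

```python
import numpy as np

# Synthetic sea-level pressure (SLP) field in hPa: a quiet background with
# one planted deep low. Purely illustrative, not reanalysis data.
rng = np.random.default_rng(2)
slp = 1012.0 + rng.normal(scale=1.0, size=(40, 60))
slp[20, 30] = 960.0

def local_minima(field, depth=5.0):
    """Interior points lower than all 8 neighbours by at least `depth` hPa."""
    hits = []
    for i in range(1, field.shape[0] - 1):
        for j in range(1, field.shape[1] - 1):
            nbhd = field[i-1:i+2, j-1:j+2].copy()
            centre = nbhd[1, 1]
            nbhd[1, 1] = np.inf          # exclude the centre itself
            if centre <= nbhd.min() - depth:
                hits.append((i, j))
    return hits

print(local_minima(slp))
```

The intercomparison's spread arises largely from how each scheme varies choices like this one: the detection variable, the depth or gradient threshold, smoothing, and the subsequent track-linking rules.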
Abstract:
Aquatic ecosystems are confronted with multiple stress factors. Current approaches to assess the risk of anthropogenic stressors to aquatic ecosystems are developed for single stressors and determine stressor effects primarily as a function of stressor properties. The cumulative impact of several stressors, however, may differ markedly from the impact of the single stressors and can result in nonlinear effects and ecological surprises. To meet the challenge of diagnosing and predicting multiple stressor impacts, assessment strategies should focus on properties of the biological receptors rather than on stressor properties. This change of paradigm is required because (i) multiple stressors affect multiple biological targets at multiple organizational levels, (ii) biological receptors differ in their sensitivities, vulnerabilities, and response dynamics to the individual stressors, and (iii) biological receptors function as networks, so that actions of stressors at disparate sites within the network can lead, via indirect or cascading effects, to unexpected outcomes.
Abstract:
INTRODUCTION The gastrin-releasing peptide receptor (GRPR) has been shown to be expressed at high density on several types of cancer, and radiolabeled peptides for imaging and targeted radionuclide therapy have been developed. In this study, we evaluated the potential of statine-based bombesin antagonists conjugated to 1,4,7,10-tetraazacyclododecane-1,4,7,10-tetraacetic acid (DOTA) through oligoethylene glycol spacers and labeled with (177)Lu, and we determined the effect of polyethylene glycol (PEG) spacer length on their in vitro and in vivo properties. METHODS The bombesin antagonists were synthesized on solid phase using Fmoc chemistry; the spacers Fmoc-dPEGx-OH (x=2, 4, 6 and 12) and DOTA(tBu)3 were coupled using a standard procedure. The peptides were labeled with (177)Lu and evaluated in vitro (lipophilicity, serum stability, internalization, and binding affinity assays). Biodistribution studies were performed in PC-3 tumor-bearing nude mice. RESULTS The solid-phase synthesis was straightforward, with an overall yield of 30% to 35% based on the first Fmoc cleavage. Hydrophilicity increased with spacer length (logD: -1.95 vs -2.22 for the PEG2 and PEG12 analogs, respectively). There is a tendency toward increased serum stability with increasing spacer length (T1/2=246±4 and 584±20 for the PEG2 and PEG6 analogs, respectively), which seems to reverse with the PEG12 analog. The IC50 values are similar, the only significant difference being that of the PEG12 analog. The (177)Lu-labeled PEG4 and PEG6 conjugates showed similar pharmacokinetics, with high tumor uptake and excellent tumor-to-kidney ratios (7.8 and 9.7 at 4 h for the PEG4 and PEG6 derivatives, respectively). Pancreas uptake was relatively high at 1 h but showed fast washout (0.46±0.02% IA/g and 0.29±0.08% IA/g already at 4 h). CONCLUSION Among all the studied analogs, the PEG4 and PEG6 conjugates showed significantly better properties. The very high tumor-to-non-target-organ ratios, in particular the tumor-to-kidney ratios, already at early time points will be important with regard to safety concerning kidney toxicity.
Abstract:
The two small asteroid-like bodies orbiting Mars, Phobos and Deimos, have low albedos and exhibit similar visible to near-infrared spectra. Determining the origin of these moons is closely tied to determining their composition. From the available spectroscopic data, Phobos exhibits two distinct types of materials across its surface, and data from both Mars Express and Mars Reconnaissance Orbiter have provided additional details about the properties of these materials and their spatial relation to one another. Although no prominent diagnostic absorptions have been detected, systematic weak features are seen in some data. An extensive regolith is observed to have developed on both moons, with characteristics that may be unique due to their special environment in Mars orbit. Understanding the character and evolution of the regolith of Phobos and Deimos is central to interpreting the moons' physical and optical properties. The cumulative data available for compositional analyses across the surfaces of Phobos and Deimos, however, remain incomplete in scope and character and ambiguous in interpretation. Consequently, the composition of the moons of Mars remains uncertain.
Abstract:
Long-term electrocardiogram (ECG) recordings often suffer from relevant noise. Baseline wander in particular is pronounced in ECG recordings made with dry or esophageal electrodes, which are intended for prolonged registration. While analog high-pass filters introduce phase distortions, reliable offline filtering of the baseline wander implies a computational burden that has to be weighed against the increase in signal-to-baseline ratio (SBR). Here we present a graphics processing unit (GPU) based parallelization method to speed up offline baseline wander filter algorithms, namely the wavelet, finite and infinite impulse response, moving-mean, and moving-median filters. Individual filter parameters were optimized with respect to the SBR increase, based on ECGs from the Physionet database superimposed with real baseline wander modeled as an autoregressive process. A Monte Carlo simulation showed that for low input SBR the moving-median filter outperforms any other method but negatively affects ECG wave detection, whereas the infinite impulse response filter is preferred in the case of high input SBR. However, the parallelized wavelet filter is processed 500 and 4 times faster than these two algorithms on the GPU, respectively, and offers superior baseline wander suppression in low-SBR situations. Using a signal segment of 64 megasamples that is filtered as an entire unit, wavelet filtering of a 7-day high-resolution ECG is computed in less than 3 seconds. Given this high filtering speed, the GPU wavelet filter is the most efficient method to remove baseline wander from long-term ECGs, strongly reducing the computational burden.
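The moving-median approach evaluated in the paper can be sketched serially, without the GPU parallelization; the sampling rate, window length, and test signal below are illustrative, not the paper's optimized parameters:

```python
import numpy as np

def moving_median_filter(signal, window):
    """Estimate the baseline as a centred running median and subtract it.
    `window` must be odd; edges are handled by edge-value padding."""
    half = window // 2
    padded = np.pad(signal, half, mode="edge")
    baseline = np.array([np.median(padded[i:i + window])
                         for i in range(len(signal))])
    return signal - baseline

# Toy test signal: slow sinusoidal baseline drift plus crude
# once-per-second "R peaks". All parameters are hypothetical.
fs = 250                                      # sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
wander = 0.5 * np.sin(2 * np.pi * 0.15 * t)   # slow baseline drift
ecg = np.zeros_like(t)
ecg[fs // 2 :: fs] = 1.0                      # sparse spikes stand in for QRS
filtered = moving_median_filter(ecg + wander, window=2 * fs + 1)
print(float(np.abs(filtered - ecg).max()))    # residual after wander removal
```

The median window spans far more baseline samples than QRS samples, so the sparse spikes barely shift the baseline estimate; the per-sample median over long windows is also what makes this filter expensive and a natural target for GPU parallelization.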