884 results for 080403 Data Structures


Relevance: 30.00%

Abstract:

The aim of the thesis is to propose Bayesian estimation, through Markov chain Monte Carlo, of multidimensional item response theory models for graded responses with complex structures and correlated traits. In particular, this work focuses on the multiunidimensional and the additive underlying latent structures: the first is widely used and represents a classical approach in multidimensional item response analysis, while the second is able to reflect the complexity of real interactions between items and respondents. A simulation study is conducted to evaluate parameter recovery for the proposed models under different conditions (sample size, test and subtest length, number of response categories, and correlation structure). The results show that parameter recovery is particularly sensitive to sample size, due to the model complexity and the large number of parameters to be estimated. For a sufficiently large sample size, the parameters of the multiunidimensional and additive graded response models are well reproduced. The results are also affected by the trade-off between the number of items constituting the test and the number of item categories. An application of the proposed models to response data collected to investigate Romagna and San Marino residents' perceptions of and attitudes towards the tourism industry is also presented.
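For orientation, both latent structures build on Samejima's graded response model; a common parameterization (assumed here, not quoted from the thesis) gives the probability that respondent $i$ answers item $j$ in category $k$ or above as

\[
P(Y_{ij} \ge k \mid \boldsymbol{\theta}_i) = \frac{\exp\!\left(\mathbf{a}_j^{\top}\boldsymbol{\theta}_i - b_{jk}\right)}{1 + \exp\!\left(\mathbf{a}_j^{\top}\boldsymbol{\theta}_i - b_{jk}\right)}, \qquad k = 1, \dots, K_j - 1,
\]

where $\boldsymbol{\theta}_i$ collects the correlated traits. In the multiunidimensional structure the discrimination vector $\mathbf{a}_j$ has a single nonzero entry (each item measures exactly one trait), whereas in the additive structure several traits contribute to the same item.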

Relevance: 30.00%

Abstract:

The present work deals with the synthesis and characterization of polymers bearing redox-functional phenothiazine side chains. Phenothiazine and its derivatives are small redox units whose reversible redox behavior is associated with electrochromic properties. A distinctive feature of phenothiazines is the formation of stable radical cations in the oxidized state. Phenothiazines can therefore act as bistable molecules, switching between two stable redox states; this switching process is accompanied by a color change.

This work describes the synthesis of novel phenothiazine polymers by radical polymerization. Phenothiazine derivatives were covalently bound to aliphatic and aromatic polymer chains via two different synthetic routes. The first route uses vinyl monomers with phenothiazine functionality for direct polymerization. The second route uses amine-modified phenothiazine derivatives to functionalize polymers bearing active-ester side chains in a polymer-analogous reaction.

Owing to their electron-donating properties, polymers with redox-functional phenothiazine side chains are suitable candidates for use as cathode materials. To test this suitability, phenothiazine polymers were employed as electrode materials in lithium battery cells. The polymers showed good capacities of about 50-90 Ah/kg as well as fast charging times in the battery cell; in particular, the charging rates are 5-10 times those of conventional lithium batteries. With regard to the number of charge and discharge cycles, the polymers achieved good results in long-term stability tests, surviving 500 charging cycles with only small changes in the initial charging times and capacities. This long-term stability is directly linked to the stability of the radical. Stabilization of the radical cations was achieved by lengthening the side chain between the nitrogen atom of the phenothiazine and the polymer backbone; such alkyl substitution increases the radical stability through enhanced interaction with the aromatic ring and thus improves battery performance with respect to stability over charge and discharge cycles.

Furthermore, the practical application of bistable phenothiazine polymers as a storage medium for high data densities was investigated. To this end, thin films of the polymer on conductive substrates were electrochemically oxidized using atomic force microscopy in combination with conductive tips. With this technique it was possible to oxidize the polymer surface on the nanoscale and thereby change the local conductivity, so that patterns of different sizes could be written lithographically and detected through the change in their conductivity. The writing process changed only the local conductivity, without affecting the topography of the polymer film, and the written patterns proved particularly stable, both mechanically and over time.

Finally, new synthetic strategies were developed to produce surfaces that are both mechanically stable and redox-functional. With the aid of surface-initiated atom transfer radical polymerization, grafted polymer brushes with redox-functional phenothiazine side chains were prepared and analyzed by X-ray methods and atomic force microscopy. One of the synthetic strategies starts from grafted active-ester brushes, which can then be modified with redox-functional groups in a subsequent step. This approach is particularly promising, as it allows different functional groups to be anchored to the active-ester brushes; by also incorporating cross-linking groups, the mechanical stability of such polymer films can be optimized alongside their redox properties.

Relevance: 30.00%

Abstract:

The present work is aimed at the study and analysis of the defects detected in civil structures that are the object of civil litigation, in order to create instruments capable of helping the different actors involved in the building process. It is divided into three main sections. The first part focuses on the collection of data related to the civil proceedings of 2012 and on an in-depth analysis of the main aspects regarding defects in existing buildings. The research center "Osservatorio Claudio Ceccoli" developed a system for collecting the information coming from the civil proceedings of the Court of Bologna; statistical analyses were performed and the results are shown and discussed in the first chapters. The second part analyzes the main issues that emerged during the study of the real cases, related to the activities of the technical consultant. The idea is to create documents, called "focuses", intended to clarify and codify specific problems, in order to develop guidelines that help the technical consultant in drafting the technical advice. The third part is centered on the evaluation of the methods used for data collection. The first results show that these are not efficient. Critical analysis of the database, of the results, and of the experience gained throughout allowed the data-collection system to be improved.

Relevance: 30.00%

Abstract:

Potential energy curves have been calculated for CnH2^2+ (n = 2−9) ions and results have been compared with data on unimolecular charge-separation reactions obtained by Rabrenović and Beynon. Geometry-optimized, minimum-energy, linear CnH2^2+ structures have been computed for ground and low-lying excited states. These carbodications exist in stable configurations with well depths greater than 3 eV. Decomposition pathways into singly charged fragment ions lead to products with computed kinetic energies in excess of 1 eV. A high degree of correlation exists between experimental information and results computed for linear CnH2^2+ structures having hydrogen atoms on each end. The exception involves C4H2^2+ reactions, where a low-lying doubly charged isomer must be invoked to rationalize the experimental data.

Relevance: 30.00%

Abstract:

In-stream structures including cross-vanes, J-hooks, rock vanes, and W-weirs are widely used in river restoration to limit bank erosion, prevent changes in channel gradient, and improve aquatic habitat. During this investigation, a rapid assessment protocol was combined with post-project monitoring data to assess factors influencing the performance of more than 558 in-stream structures and rootwads in North Carolina. Cross-sectional survey data examined for 221 cross sections from 26 sites showed that channel adjustments were highly variable from site to site, but approximately 60 % of the sites underwent at least a 20 % net change in channel capacity. Evaluation of in-stream structures ranging from 1 to 8 years in age showed that about half of the structures were impaired at 10 of the 26 sites. Major structural damage was often associated with floods of low to moderate frequency and magnitude. Failure mechanisms varied between sites and structure types, but included: (1) erosion of the channel bed and banks (outflanking); (2) movement of rock materials during floods; and (3) burial of the structures in the channel bed. Sites with reconstructed channels that exhibited large changes in channel capacity possessed the highest rates of structural impairment, suggesting that channel adjustments between structures led to the degradation of their function. The data call into question whether currently used in-stream structures are capable of stabilizing reconfigured channels for even short periods when applied to dynamic rivers.

Relevance: 30.00%

Abstract:

The calculation of projection structures (PSs) from Protein Data Bank (PDB) coordinate files of membrane proteins is not well established. Reports on such attempts exist but are rare. In addition, the different procedures are barely described and thus difficult, if not impossible, to reproduce. Here we present a simple, fast and well-documented method for the calculation and visualization of PSs from PDB coordinate files of membrane proteins: the projection structure visualization (PSV) method. The PSV method was successfully validated using the PS of aquaporin-1 (AQP1) from 2D crystals and cryo-transmission electron microscopy (cryo-TEM), and the PDB coordinate file of AQP1 determined from 3D crystals and X-ray crystallography. Besides AQP1, which is a relatively rigid protein, we also studied a flexible membrane transport protein, the L-arginine/agmatine antiporter AdiC. Comparison of PSs calculated from the existing PDB coordinate files of substrate-free and L-arginine-bound AdiC indicated that conformational changes are detected in projection. Importantly, structural differences were found between the PSs calculated with the PSV method for the detergent-solubilized AdiC proteins and the PS from cryo-TEM of membrane-embedded AdiC. These differences are particularly exciting since they may reflect a different conformation of AdiC induced by the lateral pressure in the lipid bilayer.
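The abstract does not spell out the calculation itself; as a rough illustration of the general idea (projecting atomic coordinates from a PDB file onto a plane to obtain a 2D density map), a minimal Python sketch might look as follows. The file name, grid spacing, and Gaussian low-pass step are illustrative assumptions, not the published PSV procedure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def read_pdb_atoms(path):
    """Parse x, y, z coordinates from ATOM/HETATM records of a PDB file."""
    coords = []
    with open(path) as fh:
        for line in fh:
            if line.startswith(("ATOM", "HETATM")):
                # PDB fixed columns (1-based): x = 31-38, y = 39-46, z = 47-54
                coords.append([float(line[30:38]),
                               float(line[38:46]),
                               float(line[46:54])])
    return np.array(coords)

def projection_map(coords, pixel=1.0, sigma=2.0):
    """Project atoms onto the x-y plane (taken as the membrane plane)
    and return a smoothed 2D density map."""
    xy = coords[:, :2] - coords[:, :2].min(axis=0)
    nx, ny = np.ceil(xy.max(axis=0) / pixel).astype(int) + 1
    density, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=[nx, ny],
                                   range=[[0, nx * pixel], [0, ny * pixel]])
    # crude low-pass filter mimicking the limited resolution of cryo-TEM maps
    return gaussian_filter(density, sigma=sigma)

# e.g. with a hypothetical local copy of an AQP1 structure file:
# density = projection_map(read_pdb_atoms("aqp1.pdb"))
```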

Relevance: 30.00%

Abstract:

This project addresses the unreliability of operating system code, in particular device drivers. Device driver software is the interface between the operating system and the device's hardware. Device drivers are written in low-level code, making them difficult to understand. Almost all device drivers are written in the programming language C, which allows direct manipulation of memory. Due to the complexity of manual movement of data, most mistakes in operating systems occur in device driver code. The programming language Clay can be used to check device driver code at compile time. Clay does most of its error checking statically, to minimize the overhead of run-time checks and stay competitive with C's performance. The Clay compiler can detect many more types of errors than the C compiler, such as buffer overflows, kernel stack overflows, NULL pointer uses, uses of freed memory, and aliasing errors. Clay code that successfully compiles is guaranteed to run without failing on errors that Clay can detect. Even though C is unsafe, most device drivers are currently written in it. Not only are device drivers the part of the operating system most likely to fail, they are also the largest part of the operating system. As rewriting every existing device driver in Clay by hand would be impractical, this thesis is part of a project to automate the translation of existing drivers from C to Clay. Although C and Clay both allow low-level manipulation of data and fill the same niche for developing low-level code, they have different syntax, type systems, and paradigms. This paper explores how C can be translated into Clay. It identifies which parts of C device drivers cannot be translated into Clay and what information drivers in Clay will require that C cannot provide. It also explains how these translations occur, by explaining how each C structure is represented in the compiler and how these structures are changed to represent a Clay structure.

Relevance: 30.00%

Abstract:

We are concerned with the estimation of the exterior surface of tube-shaped anatomical structures. This interest is motivated by two distinct scientific goals, one dealing with the distribution of HIV microbicide in the colon and the other with measuring degradation in white-matter tracts in the brain. Our problem is posed as the estimation of the support of a distribution in three dimensions from a sample from that distribution, possibly measured with error. We propose a novel tube-fitting algorithm to construct such estimators. Further, we conduct a simulation study to aid in the choice of a key parameter of the algorithm, and we test our algorithm with a validation study tailored to the motivating data sets. Finally, we apply the tube-fitting algorithm to a colon image produced by single photon emission computed tomography (SPECT) and to a white-matter tract image produced using diffusion tensor imaging (DTI).
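The authors' algorithm itself is not described in the abstract; purely to make the problem concrete, here is a generic slice-based tube fit (a sketch under stated assumptions, not the published method): order the points along their principal axis, then estimate a centerline point and radius per slice.

```python
import numpy as np

def fit_tube(points, n_slices=25, min_pts=5):
    """Generic slice-based tube fit for a roughly straight, single-branch
    3D point cloud; returns centerline points and per-slice radii."""
    mean = points.mean(axis=0)
    centered = points - mean
    # principal axis via SVD; projections onto it order the slices
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    t = centered @ vt[0]
    edges = np.linspace(t.min(), t.max(), n_slices + 1)
    centers, radii = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sl = centered[(t >= lo) & (t <= hi)]
        if len(sl) < min_pts:
            continue
        c = sl.mean(axis=0)
        centers.append(c + mean)
        # robust radius: 95th-percentile distance to the slice center
        radii.append(np.percentile(np.linalg.norm(sl - c, axis=1), 95))
    return np.array(centers), np.array(radii)
```

A curved-centerline or measurement-error-aware version would need more machinery (e.g. a smoothing spline through the slice centers), which is where a dedicated algorithm such as the authors' comes in.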

Relevance: 30.00%

Abstract:

OBJECTIVE: The aim of this study was to evaluate soft tissue image quality of a mobile cone-beam computed tomography (CBCT) scanner with an integrated flat-panel detector. STUDY DESIGN: Eight fresh human cadavers were used in this study. For evaluation of soft tissue visualization, CBCT data sets and corresponding computed tomography (CT) and magnetic resonance imaging (MRI) data sets were acquired. Evaluation was performed with the help of 10 defined cervical anatomical structures. RESULTS: The statistical analysis of the scoring results of 3 examiners revealed the CBCT images to be of inferior quality regarding the visualization of most of the predefined structures. Visualization without a significant difference was found regarding the demarcation of the vertebral bodies and the pyramidal cartilages, the arteriosclerosis of the carotids (compared with CT), and the laryngeal skeleton (compared with MRI). Regarding arteriosclerosis of the carotids compared with MRI, CBCT proved to be superior. CONCLUSIONS: The integration of a flat-panel detector improves soft tissue visualization using a mobile CBCT scanner.

Relevance: 30.00%

Abstract:

Certain fatty acid N-alkyl amides from the medicinal plant Echinacea activate cannabinoid type-2 (CB2) receptors. In this study we show that the CB2-binding Echinacea constituents dodeca-2E,4E-dienoic acid isobutylamide (1) and dodeca-2E,4E,8Z,10Z-tetraenoic acid isobutylamide (2) form micelles in aqueous medium. In contrast, micelle formation is not observed for undeca-2E-ene-8,10-diynoic acid isobutylamide (3), which does not bind to CB2, or structurally related endogenous cannabinoids, such as arachidonoyl ethanolamine (anandamide). The critical micelle concentration (CMC) range of 1 and 2 was determined by fluorescence spectroscopy as 200-300 and 7400-10000 nM, respectively. The size of premicelle aggregates, micelles, and supermicelles was studied by dynamic light scattering. Microscopy images show that compound 1, but not 2, forms globular and rod-like supermicelles with radii of approximately 75 nm. The self-assembling N-alkyl amides partition between themselves and the CB2 receptor, and aggregation of N-alkyl amides thus determines their in vitro pharmacological effects. Molecular mechanics by Monte Carlo simulations of the aggregation process support the experimental data, suggesting that both 1 and 2 can readily aggregate into premicelles, but only 1 spontaneously assembles into larger aggregates. These findings have important implications for biological studies with this class of compounds.
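As background on the fluorescence-based CMC determination mentioned above: a standard approach (assumed here for illustration; the paper's exact protocol may differ) is to record the probe signal over a concentration series and fit a sigmoid in log concentration, taking its midpoint as the CMC.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(logc, low, high, log_cmc, slope):
    """Boltzmann-type transition commonly fitted to fluorescence-vs-log(C) data."""
    return low + (high - low) / (1.0 + np.exp((log_cmc - logc) / slope))

# synthetic example: signal rises once micelles form near ~250 nM
rng = np.random.default_rng(0)
conc_nM = np.logspace(0, 4, 40)                       # 1 nM .. 10 uM
signal = sigmoid(np.log10(conc_nM), 1.0, 5.0, np.log10(250), 0.15)
signal += rng.normal(0.0, 0.05, signal.size)

params, _ = curve_fit(sigmoid, np.log10(conc_nM), signal, p0=(1, 5, 2, 0.2))
print(f"estimated CMC ~ {10 ** params[2]:.0f} nM")
```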

Relevance: 30.00%

Abstract:

Ancient Lake Ohrid is a steep-sided, oligotrophic karst lake that was most likely formed tectonically during the Pliocene and is often referred to as a hotspot of endemic biodiversity. This study aims at tracing significant lake-level fluctuations at Lake Ohrid using high-resolution acoustic data in combination with lithological, geochemical, and chronological information from two sediment cores recovered from sub-aquatic terrace levels at ca. 32 and 60 m water depth. According to our data, significant lake-level fluctuations with prominent lowstands of ca. 60 and 35 m below the present water level occurred during Marine Isotope Stage (MIS) 6 and MIS 5, respectively. The effect of these lowstands on biodiversity in most coastal parts of the lake is negligible, due to only small changes in lake surface area, coastline, and habitat. In contrast, biodiversity in shallower areas was more severely affected, due to the disconnection of today's sublacustrine springs from the main water body. Multichannel seismic data from deeper parts of the lake clearly image several clinoform structures stacked on top of each other. These stacked clinoforms indicate significantly lower lake levels prior to MIS 6 and a stepwise rise of the water level, with intermittent stillstands, since the lake's existence as a water-filled body, which might have caused enhanced expansion of endemic species within Lake Ohrid.

Relevance: 30.00%

Abstract:

The importance of long-term historical information derived from paleoecological studies has long been recognized as a fundamental aspect of effective conservation. However, there remains some uncertainty regarding the extent to which paleoecology can inform on specific issues of high conservation priority, at the scale at which conservation policy decisions often take place. Here we review to what extent the past occurrence of three fundamental aspects of forest conservation can be assessed using paleoecological data, with a focus on northern Europe. These aspects are (1) tree species composition, (2) old/large trees and coarse woody debris, and (3) natural disturbances. We begin by evaluating the types of relevant historical information available from contemporary forests, then evaluate common paleoecological techniques, namely dendrochronology, pollen, macrofossil, charcoal, and fossil insect and wood analyses. We conclude that whereas contemporary forests can be used to estimate historical, natural occurrences of several of the aspects addressed here (e.g. old/large trees), paleoecological techniques are capable of providing much greater temporal depth, as well as robust quantitative data for tree species composition and fire disturbance, qualitative insights regarding old/large trees and woody debris, but only limited indications of past windstorms and insect outbreaks. We also find that studies of fossil wood and paleoentomology are perhaps the most underutilized sources of information. Not only can paleoentomology provide species-specific information, it also enables the reconstruction of former environmental conditions otherwise unavailable. Despite this potential, the majority of conservation-relevant paleoecological studies primarily focus on describing historical forest conditions in broad terms and at large spatial scales, addressing former climate, land use, and landscape developments, often in the absence of a specific conservation context. In contrast, relatively few studies address the most pressing conservation issues in northern Europe, which often require data on the presence or quantities of dead wood, large trees, or specific tree species at the scale of the stand or reserve. Furthermore, even fewer examples exist of detailed paleoecological data being used for conservation planning, or for the setting of operative restorative baseline conditions at local scales. If ecologists and conservation biologists are to benefit to the full extent possible from the ever-advancing techniques developed by the paleoecological sciences, further integration of these disciplines is desirable.

Relevance: 30.00%

Abstract:

Atmospheric circulation modes are important concepts in understanding the variability of atmospheric dynamics. Assuming their spatial patterns to be fixed, such modes are often described by simple indices from rather short observational data sets. The increasing length of reanalysis products allows these concepts and assumptions to be scrutinised. Here we investigate the stability of the spatial patterns of Northern Hemisphere teleconnections by using the Twentieth Century Reanalysis as well as several control and transient millennium-scale simulations with coupled models. The observed and simulated centres of action of the two major teleconnection patterns, the North Atlantic Oscillation (NAO) and, to some extent, the Pacific North American (PNA) pattern, are not stable in time. The currently observed dipole pattern of the NAO, with its centres of action over Iceland and the Azores, split into a north–south dipole pattern in the western Atlantic and a wave-train pattern in the eastern part, connecting the British Isles with West Greenland and the eastern Mediterranean, during the period 1940–1969 AD. The PNA centre of action over Canada was shifted southwards, and that over Florida into the Gulf of Mexico, during the period 1915–1944 AD. The analysis further shows that shifts in the centres of action of either teleconnection pattern are not related to changes in the external forcing applied in transient simulations of the last millennium. Such shifts in the centres of action are accompanied by changes in the relation of local precipitation and temperature to the overlying atmospheric mode. These findings further undermine the assumption of stationarity between local climate/proxy variability and large-scale dynamics that is inherent in proxy-based reconstructions of atmospheric modes, and call for a more robust understanding of atmospheric variability on decadal timescales.
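Teleconnection patterns and their centres of action are conventionally identified as leading empirical orthogonal functions (EOFs) of sea-level-pressure anomalies; a minimal numpy sketch of that standard step (array shapes, the latitude weighting, and the moving-window idea are assumptions for illustration) is:

```python
import numpy as np

def leading_eof(slp, lat):
    """Leading EOF of sea-level-pressure anomalies.

    slp: array (time, nlat, nlon); lat: 1-D latitudes in degrees.
    Returns the spatial pattern (nlat, nlon) and its principal-component series.
    """
    anom = slp - slp.mean(axis=0)                        # remove the time mean
    w = np.sqrt(np.cos(np.deg2rad(lat)))[None, :, None]  # area weighting
    x = (anom * w).reshape(slp.shape[0], -1)
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    pattern = vt[0].reshape(slp.shape[1:]) / w[0]        # undo the weighting
    return pattern, u[:, 0] * s[0]

# Recomputing the pattern in moving 30-year windows (as for 1940-1969 above)
# would then show whether the centres of action stay put.
```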

Relevance: 30.00%

Abstract:

Intensity modulated radiation therapy (IMRT) is a technique that delivers a highly conformal dose distribution to a target volume while attempting to maximally spare the surrounding normal tissues. IMRT is a common treatment modality for head and neck (H&N) cancers, and the presence of many critical structures in this region requires accurate treatment delivery. The Radiological Physics Center (RPC) acts as both a remote and an on-site quality assurance agency that credentials institutions participating in clinical trials. To date, about 30% of all IMRT participants have failed the RPC's remote audit using the IMRT H&N phantom. The purpose of this project is to evaluate possible causes of the H&N IMRT delivery errors observed by the RPC, specifically IMRT treatment plan complexity and the use of improper dosimetry data from machines that were thought to be matched but in reality were not. Eight H&N IMRT plans with a range of complexity, defined by total MU (1460-3466), number of segments (54-225), and modulation complexity score (MCS) (0.181-0.609), were created in Pinnacle v.8m. These plans were delivered to the RPC's H&N phantom on a single Varian Clinac. One of the IMRT plans (1851 MU, 88 segments, MCS = 0.469) was equivalent to the median H&N plan from 130 previous RPC H&N phantom irradiations. This average IMRT plan was also delivered on four matched Varian Clinac machines, and the dose distribution was calculated using a different 6 MV beam model. Radiochromic film and TLD within the phantom were used to analyze the dose profiles and absolute doses, respectively. The measured and calculated dose distributions were compared to evaluate the dosimetric accuracy. All deliveries met the RPC acceptance criteria of ±7% absolute dose difference and 4 mm distance-to-agreement (DTA). Additionally, gamma index analysis was performed for all deliveries using ±7%/4 mm and ±5%/3 mm criteria. Increasing the treatment plan complexity by varying the MU, the number of segments, or the MCS resulted in no clear trend toward an increase in dosimetric error as determined by the absolute dose difference, DTA, or gamma index. Varying the delivery machine as well as the beam model (a Clinac 6EX 6 MV beam model vs. a Clinac 21EX 6 MV model) also showed no clear trend toward increased dosimetric error using the same criteria.
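For reference, the gamma index used above (Low et al.) folds the dose-difference and distance-to-agreement criteria into a single pass/fail quantity: a measured point $\mathbf{r}_m$ passes when $\gamma(\mathbf{r}_m) \le 1$, with

\[
\gamma(\mathbf{r}_m) = \min_{\mathbf{r}_c} \sqrt{\frac{\lVert \mathbf{r}_c - \mathbf{r}_m \rVert^2}{\Delta d^{2}} + \frac{\left[ D_c(\mathbf{r}_c) - D_m(\mathbf{r}_m) \right]^2}{\Delta D^{2}}},
\]

where $D_m$ and $D_c$ are the measured and calculated dose distributions, $\Delta d$ is the DTA criterion (here 4 mm or 3 mm), and $\Delta D$ the dose-difference criterion (here 7% or 5%).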

Relevance: 30.00%

Abstract:

In numerous intervention studies and education field trials, random assignment to treatment occurs in clusters rather than at the level of observation. This departure from random assignment of individual units may be due to logistics, political feasibility, or ecological validity. Data within the same cluster or grouping are often correlated. Application of traditional regression techniques, which assume independence between observations, to clustered data produces consistent parameter estimates; however, such estimators are often inefficient compared with methods that incorporate the clustered nature of the data into the estimation procedure (Neuhaus 1993).1 Multilevel models, also known as random effects or random components models, can be used to account for the clustering of data by estimating higher-level (group) as well as lower-level (individual) variation. Designing a study in which the unit of observation is nested within higher-level groupings requires the determination of sample sizes at each level. This study investigates the design and analysis of various sampling strategies for a 3-level repeated measures design, and their effect on the parameter estimates, when the outcome variable of interest follows a Poisson distribution.

The results of the study suggest that second-order PQL estimation produces the least biased estimates in the 3-level multilevel Poisson model, followed by first-order PQL and then second- and first-order MQL. The MQL estimates of both fixed and random parameters are generally satisfactory when the level-2 and level-3 variation is less than 0.10; however, as the higher-level error variance increases, the MQL estimates become increasingly biased. If convergence of the estimation algorithm is not obtained by the PQL procedure and the higher-level error variance is large, the estimates may be significantly biased; in this case bias-correction techniques such as bootstrapping should be considered as an alternative procedure. For larger sample sizes, structures with 20 or more units sampled at the levels with normally distributed random errors produced more stable estimates with less sampling variance than structures with an increased number of level-1 units. For small sample sizes, sampling fewer units at the level with Poisson variation produces less sampling variation; this criterion is no longer important when sample sizes are large.

1. Neuhaus J (1993). "Estimation Efficiency and Tests of Covariate Effects with Clustered Binary Data". Biometrics, 49, 989–996.
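A common way to write the 3-level Poisson model studied here (notation assumed, since the abstract does not give it) is

\[
y_{ijk} \mid u_{jk}, v_k \sim \mathrm{Poisson}(\mu_{ijk}), \qquad \log \mu_{ijk} = \mathbf{x}_{ijk}^{\top}\boldsymbol{\beta} + u_{jk} + v_k,
\]
\[
u_{jk} \sim N(0, \sigma_2^2), \qquad v_k \sim N(0, \sigma_3^2),
\]

where $i$ indexes repeated measures within level-2 units $j$ nested in level-3 units $k$. MQL and PQL are alternative linearizations of this model's intractable marginal likelihood, which is why their bias grows with the higher-level variances $\sigma_2^2$ and $\sigma_3^2$.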