851 results for Television -- Antennas -- Design and construction -- Data processing


Relevance:

100.00%

Publisher:

Abstract:

A wealth of genetic associations for cardiovascular and metabolic phenotypes in humans has been accumulating over the last decade, in particular a large number of loci derived from recent genome-wide association studies (GWAS). True complex disease-associated loci often exert modest effects, so their delineation currently requires integration of diverse phenotypic data from large studies to ensure robust meta-analyses. We have designed a gene-centric 50K single nucleotide polymorphism (SNP) array to assess potentially relevant loci across a range of cardiovascular, metabolic and inflammatory syndromes. The array utilizes a "cosmopolitan" tagging approach to capture the genetic diversity across approximately 2,000 loci in populations represented in the HapMap and SeattleSNPs projects. The array content is informed by GWAS of vascular and inflammatory disease, by expression quantitative trait loci implicated in atherosclerosis, by pathway-based approaches and by comprehensive literature searching. The custom flexibility of the array platform facilitated interrogation of loci at differing stringencies, according to a gene prioritization strategy that allows saturation of high-priority loci with a greater density of markers than existing GWAS tools, particularly in African HapMap samples. We also demonstrate that the IBC array can be used to complement GWAS, increasing coverage in high-priority CVD-related loci across all major HapMap populations. DNA from over 200,000 extensively phenotyped individuals will be genotyped with this array, and a significant portion of the generated data will be released into the academic domain, facilitating in silico replication attempts, analyses of rare variants and cross-cohort meta-analyses in diverse populations. These datasets will also facilitate more robust secondary analyses, such as explorations of alternative genetic models, epistasis and gene-environment interactions.
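As a hedged illustration of the tagging idea behind such an array (a minimal sketch, not the IBC selection pipeline), the following Python fragment greedily picks tag SNPs so that every SNP in a hypothetical pairwise r-squared matrix is captured at or above a chosen threshold; the matrix, the threshold and the greedy loop are all illustrative assumptions.

import numpy as np

def greedy_tag_snps(r2, threshold=0.8):
    """Greedily pick tag SNPs so every SNP has r^2 >= threshold with at least
    one tag. r2 is a symmetric (n, n) matrix with ones on its diagonal."""
    n = r2.shape[0]
    covered = np.zeros(n, dtype=bool)
    tags = []
    while not covered.all():
        # number of still-uncovered SNPs each candidate tag would capture
        gain = ((r2 >= threshold) & ~covered).sum(axis=1)
        best = int(np.argmax(gain))
        tags.append(best)
        covered |= r2[best] >= threshold
    return tags

# toy example: a random symmetric "LD" matrix for 5 SNPs
rng = np.random.default_rng(0)
ld = rng.uniform(0.0, 1.0, size=(5, 5))
ld = (ld + ld.T) / 2.0
np.fill_diagonal(ld, 1.0)
print(greedy_tag_snps(ld))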

Relevance:

100.00%

Publisher:

Abstract:

In numerous intervention studies and education field trials, random assignment to treatment occurs in clusters rather than at the level of observation. This departure from random assignment of individual units may be due to logistics, political feasibility, or ecological validity. Data within the same cluster or grouping are often correlated. Applying traditional regression techniques, which assume independence between observations, to clustered data produces consistent parameter estimates. However, such estimators are often inefficient compared to methods that incorporate the clustered nature of the data into the estimation procedure (Neuhaus 1993). Multilevel models, also known as random-effects or random-components models, can be used to account for the clustering of data by estimating higher-level (group) as well as lower-level (individual) variation. Designing a study in which the unit of observation is nested within higher-level groupings requires the determination of sample sizes at each level. This study investigates the effect of various sampling strategies for a 3-level repeated-measures design on the parameter estimates when the outcome variable of interest follows a Poisson distribution. Results of this study suggest that second-order PQL estimation produces the least biased estimates in the 3-level multilevel Poisson model, followed by first-order PQL and then second- and first-order MQL. The MQL estimates of both fixed and random parameters are generally satisfactory when the level-2 and level-3 variation is less than 0.10. However, as the higher-level error variance increases, the MQL estimates become increasingly biased. If the PQL procedure does not converge and the higher-level error variance is large, the estimates may be significantly biased; in this case, bias-correction techniques such as bootstrapping should be considered as an alternative. For larger sample sizes, structures with 20 or more units sampled at the levels with normally distributed random errors produced more stable estimates with less sampling variance than structures with an increased number of level-1 units. For small sample sizes, sampling fewer units at the level with Poisson variation produces less sampling variation; however, this criterion is no longer important when sample sizes are large. Reference: Neuhaus J (1993). "Estimation Efficiency and Tests of Covariate Effects with Clustered Binary Data." Biometrics, 49, 989–996.
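To make the sampling structure concrete, the following Python sketch simulates a 3-level repeated-measures Poisson outcome with random intercepts at levels 2 and 3. It is a minimal illustration, not the dissertation's code; the cluster counts, variance components and intercept below are assumed values.

import numpy as np

rng = np.random.default_rng(42)
n3, n2, n1 = 20, 20, 5                 # level-3 groups, level-2 units per group, repeated measures
sigma2_l3, sigma2_l2 = 0.10, 0.10      # assumed random-intercept variances
beta0 = np.log(2.0)                    # assumed fixed intercept on the log scale

u3 = rng.normal(0.0, np.sqrt(sigma2_l3), size=n3)          # level-3 random intercepts
u2 = rng.normal(0.0, np.sqrt(sigma2_l2), size=(n3, n2))    # level-2 random intercepts
eta = beta0 + u3[:, None, None] + u2[:, :, None]           # linear predictor, shape (n3, n2, 1)
mu = np.broadcast_to(np.exp(eta), (n3, n2, n1))            # same rate for each repeated measure
y = rng.poisson(mu)                                        # level-1 Poisson counts

print(y.shape, y.mean())   # (20, 20, 5); mean roughly exp(beta0 + total variance / 2)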

Relevance:

100.00%

Publisher:

Abstract:

Using stress and coping as a unifying theoretical concept, a series of five models was developed to synthesize the survey questions and to classify information. These models identified the question, listed the research study, described measurements, listed workplace data, and listed industry and national reference data. A set of 38 instrument questions was developed within the five coping-correlate categories, together with a set of 22 stress symptoms. The study was conducted with two groups, police and professors, on a large university campus. The groups were selected because their occupations were diverse while they shared the same macroenvironment. The premise was that police officers would be more highly stressed than professors. Of a total study group of 80, there were 37 respondents. A difference in mean stress responses between the two groups was observable. Not only were the responses similar within each group, but the stress level of the responses was also similar within each group. While the response to the survey instrument was good, only 3 respondents answered the stress-symptom survey properly. It was determined that none of the 37 respondents believed that they were ill. This perception of being well was also evidenced by the grand mean of the stress scores of 2.76 (3.0 = moderate stress), which also caused fewer independent variables to be entered in the multiple regression model. The survey instrument was carefully designed to be universal. Universality is the ability to transcend occupational or regional definitions as applied to stress: the ability to measure responses within broad categories such as physiological, emotional, behavioral, social, and cognitive functions without losing the ability to measure the detail within individual questions, or the relationships between questions and categories. Replication is much easier to achieve with standardized categories, questions, and measurement procedures such as those developed for the universal survey instrument. Because the survey instrument is universal, it can be used as an analytical device, an assessment device, a basic tool for planning, and a follow-up instrument to measure individual response to planned reductions in occupational stress. (Abstract shortened with permission of author.)

Relevance:

100.00%

Publisher:

Abstract:

Resting-state functional connectivity (FC) fMRI (rs-fcMRI) offers an appealing approach to mapping the brain's intrinsic functional organization. Blood oxygen level dependent (BOLD) and arterial spin labeling (ASL) imaging are the two main rs-fcMRI approaches used to assess alterations in brain networks associated with individual differences, behavior and psychopathology. While the BOLD signal is stronger and has higher temporal resolution, ASL provides quantitative, direct measures of the physiology and metabolism of specific networks. This study systematically investigated the similarity and reliability of resting brain networks (RBNs) in BOLD and ASL. A 2×2×2 factorial design was employed in which each subject underwent repeated BOLD and ASL rs-fcMRI scans on two occasions on each of two MRI scanners. Both independent and joint FC analyses revealed common RBNs in ASL and BOLD rs-fcMRI with a moderate to high level of spatial overlap, verified by Dice Similarity Coefficients. Test-retest analyses indicated more reliable spatial network patterns in BOLD (average modal Intraclass Correlation Coefficients: 0.905±0.033 between sessions; 0.885±0.052 between scanners) than in ASL (0.545±0.048; 0.575±0.059). Nevertheless, ASL provided highly reproducible (0.955±0.021; 0.970±0.011) network-specific cerebral blood flow (CBF) measurements. Moreover, we observed positive correlations between regional CBF and FC in core areas of all RBNs, indicating a relationship between network connectivity and baseline metabolism. Taken together, the combination of ASL and BOLD rs-fcMRI provides a powerful tool for characterizing the spatiotemporal and quantitative properties of RBNs. These findings pave the way for future BOLD and ASL rs-fcMRI studies in clinical populations that are carried out across time and scanners.
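The two agreement measures named above, the Dice Similarity Coefficient and the Intraclass Correlation Coefficient, can each be computed in a few lines. The Python sketch below is illustrative only, not the study's analysis pipeline; the binarization threshold, the toy data and the choice of the ICC(2,1) variant are assumptions.

import numpy as np

def dice(mask_a, mask_b):
    """Dice Similarity Coefficient of two boolean arrays of the same shape."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measures,
    for an (n_targets, k_raters) matrix (e.g. subjects x sessions)."""
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

rng = np.random.default_rng(1)
maps = rng.normal(size=(2, 1000))             # two toy network maps (e.g. session 1 and 2)
print(dice(maps[0] > 1.0, maps[1] > 1.0))     # spatial overlap above an assumed threshold
print(icc_2_1(rng.normal(size=(20, 2))))      # toy test-retest matrix: 20 subjects x 2 sessions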

Relevance:

100.00%

Publisher:

Abstract:

In order to harness the unique properties of nanoparticles for novel clinical applications and to modulate their uptake into specific immune cells, we designed a new library of homo- and hetero-functional fluorescence-encoded gold nanoparticles (Au-NPs) using different poly(vinyl alcohol)- and poly(ethylene glycol)-based polymers for particle coating and stabilization. The encoded particles were fully characterized by UV-Vis and fluorescence spectroscopy, zeta potential, and dynamic light scattering. Uptake by human monocyte-derived dendritic cells in vitro was studied by confocal laser scanning microscopy and quantified by fluorescence-activated cell sorting and inductively coupled plasma atomic emission spectroscopy. We show how the chemical modification of particle surfaces, for instance by attaching fluorescent dyes, can conceal fundamental particle properties and modulate cellular uptake. In order to mask the influence of fluorescent dyes on cellular uptake while still exploiting their fluorescence for detection, we created hetero-functionalized Au-NPs, which again show typical particle-dependent cellular interactions. Our study clearly shows that thorough characterization of nanoparticles at each modification step in the engineering process is absolutely essential, and that substantial adjustments of the particles can be necessary in order to obtain reliable cellular uptake data that truly reflect particle properties.

Relevance:

100.00%

Publisher:

Abstract:

This thesis consists of four essays on the design and disclosure of compensation contracts. Essays 1, 2 and 3 focus on behavioral aspects of mandatory compensation disclosure rules and of contract negotiations in agency relationships. The three experimental studies develop psychology-based theory and present results that deviate from standard economic predictions. Furthermore, the results of Essays 1 and 2 also have implications for firms' discretion in how to communicate their top management's incentives to the capital market. Essay 4 analyzes the role of fairness perceptions in the evaluation of executive compensation. For this purpose, two surveys targeting representative eligible voters as well as investment professionals were conducted.

Essay 1 investigates the role of the detailed 'Compensation Discussion and Analysis', which is part of the Securities and Exchange Commission's 2006 regulation, in investors' evaluations of executive performance. Compensation disclosure complying with this regulation clarifies the relationship between realized reported compensation and the underlying performance measures and their target achievement levels. The experimental findings suggest that the salient presentation of executives' incentives inherent in the 'Compensation Discussion and Analysis' makes investors' performance evaluations less outcome dependent. Therefore, investors' judgment and investment decisions might be less affected by noisy environmental factors that drive financial performance. The results also suggest that fairness perceptions of compensation contracts are essential for investors' performance evaluations, in that more transparent disclosure increases the perceived fairness of compensation and the performance evaluation of managers who are not responsible for a bad financial performance. These results have important practical implications, as firms might choose to communicate their top management's incentive compensation more transparently in order to benefit from less volatile expectations about their future performance.

Similar to the first experiment, the experiment described in Essay 2 addresses the question of more transparent compensation disclosure. However, unlike the first experiment, it does not analyze the effect of a more salient presentation of contract information but the informational effect of the contract information itself. For this purpose, the experiment tests two conditions in which assessing the compensation contract's incentive compatibility, which determines executive effort, is either possible or not. On the one hand, the results suggest that the quality of investors' expectations about executive effort is improved; on the other hand, investors might over-adjust their prior expectations about executive effort when confronted with an unexpected financial performance and under-adjust when the financial performance confirms their prior expectations. Therefore, in the experiment, more transparent compensation disclosure does not lead to more correct overall judgments of executive effort, and it even lowers the processing quality of outcome information. These results add to the disclosure literature, which predominantly advocates more transparency. The findings of the experiment, however, identify decreased information-processing quality as a relevant disclosure cost category. Firms might therefore carefully evaluate the additional costs and benefits of more transparent compensation disclosure. Together with the results from the experiment in Essay 1, the two experiments on compensation disclosure imply that firms should rather focus on their discretion in how to present their compensation disclosure, in order to benefit from investors' improved fairness perceptions and their spill-over onto performance evaluation.

Essay 3 studies the behavioral effects of contextual factors in recruitment processes that, from a standard economic perspective, do not affect the employer's or the applicant's bargaining power. In particular, the experiment studies two common characteristics of recruitment processes: pre-contractual competition among job applicants and job applicants' non-binding effort announcements, as they might be made during job interviews. Despite the standard economic irrelevance of these factors, the essay develops theory regarding their behavioral effects on employees' subsequent effort provision and on employers' contract design choices. The experimental findings largely support the predictions. More specifically, the results suggest that firms can benefit from increased effort and, therefore, may generate higher profits. Further, firms may seize a larger share of the employment relationship's profit by highlighting the competitive aspects of the recruitment process and by requiring applicants to make announcements about their future effort.

Finally, Essay 4 studies the role of fairness perceptions in the public evaluation of executive compensation. Although economic criteria for the design of incentive compensation generally do not make restrictive recommendations with regard to the amount of compensation, fairness perceptions might be relevant from the perspective of firms and standard setters, because behavioral theory has identified fairness as an important determinant of individuals' judgments and decisions. However, although fairness concerns about executive compensation are often stated in the popular media and even in the literature, evidence on the meaning of fairness in the context of executive compensation is scarce and ambiguous. In order to inform practitioners and standard setters whether fairness concerns are exclusive to non-professionals or relevant for investment professionals as well, the two surveys presented in Essay 4 aim to find commonalities in the opinions of representative eligible voters and investment professionals. The results suggest that fairness is an important criterion for both groups. In particular, exposure to risk in the form of the variable compensation share is a criterion shared by both groups: the higher the assumed variable share, the higher the compensation amount that is perceived as fair. However, to a large extent, opinions on executive compensation depend on personality characteristics, and to some extent, investment professionals' perceptions deviate systematically from those of non-professionals. The findings imply that firms might benefit from emphasizing the riskiness of their managers' variable pay components and are therefore also in line with those of Essay 1.

Relevance:

100.00%

Publisher:

Abstract:

XENON is a dark matter direct detection project, consisting of a time projection chamber (TPC) filled with liquid xenon as the detection medium. The construction of the next-generation detector, XENON1T, is presently taking place at the Laboratori Nazionali del Gran Sasso (LNGS) in Italy. It aims at a sensitivity to spin-independent cross sections of 2 × 10⁻⁴⁷ cm² for WIMP masses around 50 GeV/c², which requires a background reduction by two orders of magnitude compared to XENON100, the current-generation detector. An active system that is able to tag muons and muon-induced backgrounds is critical for this goal. A water Cherenkov detector of ~10 m height and diameter has therefore been developed, equipped with 8-inch photomultipliers and clad with a reflective foil. We present the design and optimization study for this detector, which has been carried out with a series of Monte Carlo simulations. The muon veto will reach very high detection efficiencies for muons (>99.5%) and for showers of secondary particles from muon interactions in the rock (>70%). Similar efficiencies will be obtained for XENONnT, the upgrade of XENON1T, which will later improve the WIMP sensitivity by another order of magnitude. With the Cherenkov water shield studied here, the background from muon-induced neutrons in XENON1T is negligible.
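As a toy illustration of how such a tagging efficiency is estimated from simulation (this is not the XENON1T Monte Carlo), the Python sketch below counts the fraction of simulated muon events whose detected photoelectron yield exceeds an assumed trigger threshold; the light yield and threshold are invented parameters.

import numpy as np

rng = np.random.default_rng(7)
n_events = 100_000               # number of simulated muon crossings
mean_pe_per_muon = 50.0          # assumed average photoelectrons collected per muon
threshold_pe = 8                 # assumed trigger threshold in photoelectrons

detected_pe = rng.poisson(mean_pe_per_muon, size=n_events)
efficiency = np.mean(detected_pe >= threshold_pe)
print(f"tagging efficiency: {efficiency:.4f}")   # close to 1.0 for these assumed numbers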

Relevance:

100.00%

Publisher:

Abstract:

Cleverly designed molecular building blocks provide chemists with the tools of a powerful molecular-scale construction set, enabling them to engineer materials with a predictable order and useful solid-state properties. Hence, it is in the realm of supramolecular chemistry to follow a strategy for synthesizing materials that combine a selected set of properties, for instance from the areas of magnetism, photophysics and electronics. As a successful approach, host/guest solids based on extended anionic, homo- and bimetallic oxalato-bridged transition-metal compounds with two- and three-dimensional connectivities have been investigated. In this report, a brief review of the structural aspects of this class of compounds is given, followed by a thermal and magnetic study of two distinct heterometallic oxalato-bridged layer compounds.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND Sutureless aortic valve replacement (SU-AVR) is an innovative approach that shortens cardiopulmonary bypass and cross-clamp durations and may facilitate a minimally invasive approach. Evidence outlining its safety, efficacy, hemodynamic profile and potential complications consists largely of small-volume observational studies and few comparative publications. METHODS Minimally invasive aortic valve surgery and high-volume SU-AVR centers were contacted for recruitment into a global collaborative coalition dedicated to sutureless valve research. A Research Steering Committee was formed to direct research and support the mission of providing the registry evidence warranted for SU-AVR. RESULTS The International Valvular Surgery Study Group (IVSSG) was formed under the auspices of the Research Steering Committee, comprising 36 expert valvular surgeons from 27 major centers across the globe. IVSSG Sutureless Projects currently under way include the Retrospective and Prospective Phases of the SU-AVR International Registry (SU-AVR-IR). CONCLUSIONS The global pooling of data by the IVSSG Sutureless Projects will provide the robust clinical evidence required on the safety, efficacy and hemodynamic outcomes of SU-AVR.

Relevance:

100.00%

Publisher:

Abstract:

In a large health care system, accurate and timely feedback about performance is necessary at many levels, from senior management to service-level managers, for valid decision-making. The implementation of dashboards is one way to remedy the problem of data overload by providing up-to-date, accurate, and concise information. As this health care system seeks to have an organized, systematic review mechanism in place, dashboards are being created in a variety of hospital service departments to monitor performance indicators. The Infection Control Administration of this health care system does not currently utilize a dashboard but seeks to implement one. The purpose of this project is to research and design a clinical dashboard for the Infection Control Administration. The intent is that the implementation and use of the clinical dashboard will translate into improved measurement of health care quality.
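As a minimal sketch of the kind of indicator such a dashboard could display (the column names, indicator and counts are invented for illustration, not taken from this health care system), the following Python/pandas fragment aggregates a hypothetical infection rate per 1,000 device-days by month.

import pandas as pd

# hypothetical monthly counts reported by two units
events = pd.DataFrame({
    "month":       ["2024-01", "2024-01", "2024-02", "2024-02"],
    "unit":        ["ICU", "Med/Surg", "ICU", "Med/Surg"],
    "infections":  [2, 1, 1, 0],
    "device_days": [310, 540, 295, 560],
})

dashboard = (
    events.groupby("month")[["infections", "device_days"]].sum()
          .assign(rate_per_1000=lambda d: 1000 * d["infections"] / d["device_days"])
)
print(dashboard)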

Relevance:

100.00%

Publisher:

Abstract:

This study used a retrospective design and secondary data from the National Child Abuse and Neglect Data System (NCANDS), provided by the National Data Archive on Child Abuse and Neglect at the Family Life Development Center, administered by Cornell University. The dataset contained information for the year 2005 on children from birth to 18 years of age. Child abuse and neglect among children with disabilities was evaluated in depth in the present study. Descriptive and statistical analyses were performed comparing children with and without disabilities. It was found that children with disabilities have a lower rate of substantiation, which likely indicates that their disability interferes with reporting. The results of this research demonstrate the important need to teach professionals and laypersons alike how to recognize and substantiate abuse among children with disabilities.
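A minimal sketch of the kind of comparison described above, using hypothetical counts rather than NCANDS data, is shown below: substantiation rates for children with and without disabilities, compared with a chi-square test.

from scipy.stats import chi2_contingency

#        substantiated, not substantiated  (hypothetical counts, not NCANDS figures)
table = [[1200, 4800],      # children with disabilities
         [9000, 27000]]     # children without disabilities

chi2, p, dof, expected = chi2_contingency(table)
rate_dis = table[0][0] / sum(table[0])
rate_nodis = table[1][0] / sum(table[1])
print(f"substantiation rate with disability: {rate_dis:.3f}, without: {rate_nodis:.3f}, p = {p:.3g}")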

Relevance:

100.00%

Publisher:

Abstract:

This doctoral thesis, entitled "Contribution to the analysis, design and assessment of compact antenna test ranges at millimeter wavelengths", aims to deepen the knowledge of a particular antenna measurement system: the compact range, operating in the millimeter-wavelength frequency bands. The thesis has been developed at the Radiation Group (GR), an antenna laboratory belonging to the Signals, Systems and Radiocommunications department (SSR) of the Technical University of Madrid (UPM). The Radiation Group has extensive experience in antenna measurements and at present runs four facilities operating in different configurations: a Gregorian compact antenna test range, a spherical near-field range, a planar near-field range and a semi-anechoic arch system. The research work performed for this thesis contributes to the knowledge of the first of these measurement configurations at higher frequencies, beyond the microwave region where the Radiation Group already offers customer-level performance. To reach this high-level purpose, a set of scientific tasks was carried out sequentially; they are succinctly described in the following paragraphs.

The first step dealt with the state-of-the-art review. The study of the scientific literature covered measurement practices in compact antenna test ranges together with the particularities of millimeter-wavelength technologies. Where such measurement facilities are of interest, the joint study of both fields of knowledge converges on a series of technological challenges that become serious bottlenecks at different stages: analysis, design and assessment.

Second, after the overview study, the focus was set on electromagnetic analysis algorithms. These formulations make it possible to evaluate certain electromagnetic features of interest, such as the field distribution phase or the stray signals of particular structures when they interact with sources of electromagnetic waves. A properly operated CATR facility features collimation optics that are large in terms of wavelengths. Accordingly, the electromagnetic analysis introduces a large number of mathematical unknowns, which grows with frequency following polynomial laws of different order depending on the algorithm used. In particular, the optics configuration of interest here is the reflection-type serrated-edge collimator. The analysis of these devices requires flexible handling of almost arbitrary scattering geometries, and this flexibility becomes the nucleus of the algorithm's ability to support the subsequent design tasks. This thesis' contribution to this field of knowledge consists in a formulation that is powerful both in dealing with various analysis geometries and in computational terms. Two algorithms were developed; while based on the same hybridization principle, they reach different orders of physical accuracy at different computational cost. Their CATR design capabilities were inter-compared, reaching both qualitative and quantitative conclusions on their scope.

Third, interest shifted from analysis and design tasks towards range assessment. Millimeter wavelengths imply strict mechanical tolerances and fine setup adjustment. In addition, the large number of unknowns already faced in the analysis stage appears again in the in-chamber field probing stage. The reduced dynamic range available from semiconductor millimeter-wave sources additionally requires longer integration times at each probing point. These peculiarities rapidly increase the difficulty of performing assessment processes in CATR facilities beyond microwaves. The bottleneck becomes so tight that it compromises range characterization beyond a certain limit frequency, which typically lies in the lowest segment of the millimeter-wavelength range, whereas the value of range assessment lies, on the contrary, in the highest segment. This thesis contributes to this technological scenario by developing quiet-zone probing techniques which achieve substantial data-reduction ratios. Collaterally, they increase the robustness of the results to noise, which amounts to a virtual increase of the setup's available dynamic range.

Fourth, the environmental sensitivity of millimeter wavelengths was addressed. The drift of electromagnetic experiments due to the dependence of the results on the surrounding environment is well known. At millimeter wavelengths, this feature relegates many industrial practices that are routine at microwave frequencies to the experimental stage. In particular, the evolution of the atmosphere, even within acceptable conditioning bounds, produces drift phenomena which can completely mask the experimental results. The contribution of this thesis in this respect consists in electrically modeling the indoor atmosphere existing in a CATR as a function of the environmental variables which affect the range's performance. A simple model was developed that relates high-level phenomena, such as feed-probe phase drift, to low-level magnitudes that are easy to sample: relative humidity and temperature. With this model, environmental compensation can be performed and chamber conditioning is effectively extended towards higher frequencies.

In summary, the purpose of this thesis is to deepen the knowledge of compact antenna test ranges at millimeter wavelengths. This knowledge is presented through the sequential stages of a CATR's conception, from early low-level electromagnetic analysis to the assessment of an operative facility; for each of these stages, bottleneck phenomena exist today that seriously compromise antenna measurement practice at millimeter wavelengths.
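As one hedged way to make such an atmosphere model concrete (a sketch under assumed conditions, not the thesis' actual model), the following Python fragment relates feed-probe phase drift to temperature and relative humidity through the standard radio refractivity of moist air; the pressure, path length and frequency used in the example are illustrative assumptions.

import math

def refractivity_ppm(temp_c, rel_hum_pct, pressure_hpa=1013.25):
    """Radio refractivity N of moist air, in parts per million (ITU-R P.453 form)."""
    t_k = temp_c + 273.15
    e_sat = 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))   # Magnus formula, hPa
    e = rel_hum_pct / 100.0 * e_sat                                # water-vapour partial pressure, hPa
    return 77.6 / t_k * (pressure_hpa + 4810.0 * e / t_k)

def phase_drift_deg(freq_hz, path_m, temp0, rh0, temp1, rh1):
    """Feed-probe phase change (degrees) when the chamber atmosphere drifts
    from (temp0, rh0) to (temp1, rh1) over a path of path_m metres."""
    c = 299_792_458.0
    d_n = (refractivity_ppm(temp1, rh1) - refractivity_ppm(temp0, rh0)) * 1e-6
    return 360.0 * freq_hz * path_m * d_n / c

# e.g. a 6 m feed-probe path at 100 GHz, with the chamber warming 1 C and drying by 5% RH
print(phase_drift_deg(100e9, 6.0, 21.0, 45.0, 22.0, 40.0))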

Relevance:

100.00%

Publisher:

Abstract:

The manipulation and handling of an ever-increasing volume of data by current data-intensive applications requires novel techniques for efficient data management. Despite recent advances in every aspect of data management (storage, access, querying, analysis, mining), future applications are expected to scale to even higher degrees, not only in terms of the volumes of data handled but also in terms of users and resources, often making use of multiple pre-existing, autonomous, distributed or heterogeneous resources.

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, more and more base stations are equipped with active conformal antennas. These antenna designs combine phase-shift systems with multibeam networks, providing multi-beam capability and interference rejection, which optimizes multiple-channel systems. GEODA is a conformal adaptive antenna system designed for satellite communications. Operating at 1.7 GHz with circular polarization, the antenna can track and communicate with several satellites at once thanks to its adaptive beam. The antenna is based on a set of similar triangular arrays that are divided into subarrays of three elements called 'cells'. Transmit/Receive (T/R) modules manage beam steering by shifting the element phases. More accurate steering of the GEODA antenna could be achieved by using a multibeam network. Several multibeam network designs based on the Butler matrix are presented.
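As a hedged illustration of phase-shift beam steering (a minimal sketch, not GEODA's control software), the Python fragment below computes the phase each element of a small three-element 'cell' would need to steer the beam towards a chosen direction; the element spacing and steering angles are assumptions.

import numpy as np

C = 299_792_458.0
FREQ = 1.7e9                    # GEODA operating frequency, Hz
LAMBDA = C / FREQ
K = 2.0 * np.pi / LAMBDA        # free-space wavenumber

def steering_phases_deg(xy_positions_m, theta_deg, phi_deg):
    """Phase (degrees) to apply at each element, located at (x, y) in metres,
    so that the array's main beam points towards (theta, phi)."""
    theta, phi = np.radians(theta_deg), np.radians(phi_deg)
    u = np.sin(theta) * np.cos(phi)       # direction cosines of the desired beam
    v = np.sin(theta) * np.sin(phi)
    xy = np.asarray(xy_positions_m)
    phases = -K * (xy[:, 0] * u + xy[:, 1] * v)
    return np.degrees(phases) % 360.0

# a three-element triangular "cell" with an assumed 0.6-wavelength side
d = 0.6 * LAMBDA
cell = [(0.0, 0.0), (d, 0.0), (d / 2.0, d * np.sqrt(3.0) / 2.0)]
print(steering_phases_deg(cell, theta_deg=30.0, phi_deg=0.0))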