899 results for "Integrated circuits; Very large scale integration; Design and construction".
Abstract:
Nitrogen oxides play a crucial role in the budget of tropospheric ozone (O3) and the formation of the hydroxyl radical. Anthropogenic activities and boreal wildfires are large sources of emissions in the atmosphere. However, the influence of the transport of these emissions on nitrogen oxides and O3 levels at hemispheric scales is not well understood, in particular due to a lack of nitrogen oxide measurements in remote regions. In order to address these deficiencies, measurements of NO, NO2 and NOy (total reactive nitrogen oxides) were made in the lower free troposphere (FT) over the central North Atlantic region (Pico Mountain station, 38°N, 28°W, 2.3 km asl) from July 2002 to August 2005. These measurements reveal a well-defined seasonal cycle of nitrogen oxides (NOx = NO + NO2, and NOy) in the background central North Atlantic lower FT, with higher mixing ratios during the summertime. Observed NOx and NOy levels are consistent with long-range transport of emissions, but with significant removal en route to the measurement site. Reactive nitrogen largely exists in the form of PAN and HNO3 (~80-90% of NOy) all year round. A shift in the composition of NOy from dominance of PAN to dominance of HNO3 occurs from winter-spring to summer-fall, as a result of changes in temperature and photochemistry over the region. Analysis of the impact of long-range transport of boreal wildfire emissions on nitrogen oxides provides evidence of the very large-scale impacts of boreal wildfires on the tropospheric NOx and O3 budgets. Boreal wildfire emissions are responsible for significant shifts in the nitrogen oxide distributions toward higher levels during the summer, with medians of NOy (117-175 pptv) and NOx (9-30 pptv) greater in the presence of boreal wildfire emissions. Extreme levels of NOx (up to 150 pptv) and NOy (up to 1100 pptv) observed in boreal wildfire plumes suggest that decomposition of PAN to NOx is a significant source of NOx, and imply that O3 formation occurs during transport. Ozone levels are also significantly enhanced in boreal wildfire plumes. However, O3 exhibits complex behavior in the plumes, varying from significant O3 production, to weaker production, to O3 destruction. Long-range transport of anthropogenic emissions from North America also has a significant influence on the regional NOx and O3 budgets. Transport of pollution from North America causes significant enhancements in nitrogen oxides year-round. Enhancements of CO, NOy and NOx indicate that, consistent with previous studies, more than 95% of the NOx emitted over the U.S. is removed before and during export out of the U.S. boundary layer. However, about 30% of the NOx emissions exported out of the U.S. boundary layer remain in the airmasses. Since the lifetime of NOx is shorter than the transport timescale, PAN decomposition and potentially photolysis of HNO3 provide a supply of NOx over the central North Atlantic lower FT. Observed ΔO3/ΔNOy ratios and the large NOy levels remaining in the North American plumes suggest potential O3 formation well downwind of North America. Finally, a comparison of the nitrogen oxide measurements with results from the global chemical transport (GCT) model GEOS-Chem identifies differences between the observations and the model.
GEOS-Chem reproduces the seasonal variation of nitrogen oxides over the central North Atlantic lower FT, but does not capture the magnitude of the cycles. Improvements in our understanding of nitrogen oxides chemistry in the remote FT and emission sources are necessary for the current GCT models to adequately estimate the impacts of emissions on tropospheric NO sub(x) and the resulting impacts on the O sub(3) budget.
DESIGN AND IMPLEMENTATION OF DYNAMIC PROGRAMMING BASED DISCRETE POWER LEVEL SMART HOME SCHEDULING USING FPGA
Abstract:
With the development and capabilities of Smart Home systems, people today are entering an era in which household appliances are no longer just controlled by people, but also operated by a smart system. This results in a more efficient, convenient, comfortable, and environmentally friendly living environment. A critical part of the Smart Home system is Home Automation, which means that a Micro-Controller Unit (MCU) controls all the household appliances and schedules their operating times. This reduces electricity bills by shifting power consumption from on-peak hours to off-peak hours, according to the varying "hour price". In this paper, we propose an algorithm for scheduling multi-user power consumption and implement it on an FPGA board, using the FPGA as the MCU. This algorithm for scheduling discrete-power-level tasks is based on dynamic programming and finds a scheduling solution close to the optimal one. We chose an FPGA as our system's controller because FPGAs have low complexity, parallel processing capability, and a large number of I/O interfaces for further development, and are programmable in both software and hardware. In conclusion, the algorithm runs quickly on the FPGA board and the solution obtained is good enough for consumers.
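The abstract does not spell out the recurrence, so the following is only a minimal sketch of the kind of dynamic program described: one task must receive a fixed amount of energy, each hour offers a set of allowed discrete power levels, and hourly prices are known. The function name, prices, levels, and energy demand are all illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (assumed setup, not the paper's code): choose a discrete power
# level for each hour so the task receives its required energy at minimum cost.
def schedule(prices, levels, energy_needed):
    """prices: cost per energy unit in each hour; levels: allowed discrete
    power draws per hour; energy_needed: total units the task must receive."""
    INF = float("inf")
    # best[e] = cheapest cost so far to have delivered e energy units
    best = [0.0] + [INF] * energy_needed
    for price in prices:                    # process hours in order
        nxt = [INF] * (energy_needed + 1)
        for e, cost in enumerate(best):
            if cost == INF:
                continue
            for lvl in levels:              # pick this hour's discrete power level
                e2 = min(energy_needed, e + lvl)
                cand = cost + price * lvl
                if cand < nxt[e2]:
                    nxt[e2] = cand
        best = nxt
    return best[energy_needed]              # inf if the demand is infeasible

# Example: cheap off-peak hours at the ends, expensive on-peak hours in the middle.
print(schedule(prices=[1.0, 5.0, 5.0, 1.5], levels=[0, 1, 2], energy_needed=4))  # -> 5.0
```

Extending the state to several users or appliances enlarges the table but keeps the same hour-by-hour structure, which is what makes this kind of approach amenable to a parallel FPGA implementation.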
Abstract:
Projects in the area of architectural design and urban planning typically engage several architects as well as experts from other professions. While the design and review meetings thus often involve a large number of cooperating participants, the actual design is still done by individuals in the time between those meetings, using desktop PCs and CAD applications. A real collaborative approach to architectural design and urban planning is often limited to early paper-based sketches. In order to overcome these limitations, we designed and realized the ARTHUR system, an Augmented Reality (AR) enhanced round table to support complex design and planning decisions for architects. While AR has been applied to this area earlier, our approach does not try to replace the use of CAD systems but rather integrates them seamlessly into the collaborative AR environment. The approach is enhanced by intuitive interaction mechanisms that can be easily configured for different application scenarios.
Abstract:
Post-Fordist economies come along with post-welfarist societies marked by intensified cultural individualism and increased structural inequalities. These conditions are commonly held to be conducive to relative deprivation and, thereby, anomic crime. At the same time, post-welfarist societies develop a new 'balance of power' between institutions providing for welfare regulation, such as the family, the state and the (labour) market, and also the penal system. These institutions are generally expected to improve social integration, ensure conformity and thus reduce anomic crime. Combining both perspectives, we analyse the effects of moral individualism, social inequality, and different integration strategies on crime rates in contemporary societies through the lens of anomie theory. To test our hypotheses, we draw on time-series cross-section data compiled from different data sources (OECD, UN, WHO, WDI) for twenty developed countries over the period 1970-2004, and run multiple regressions that control for country-specific effects. Although we find some evidence that the mismatch between cultural ideal (individual inclusion) and structural reality (stratified exclusion) increases anomic pressure, whereas conservative (i.e. family-based), social-democratic (i.e. state-based) and liberal (i.e. market-based) integration strategies prove effective to a certain extent in controlling the incidence of crime, the results are not very robust. Moreover, reservations have to be made regarding the effects of "market" income inequality as well as familialist, unionist and liberalist employment policies, which show reversed effects in our sample: the former reducing, the latter occasionally increasing anomic crime. Nevertheless, we conclude that the new cult of the individual undermines the effectiveness of conservative and social-democratic integration strategies and drives societies towards more "liberal" regimes that build on incentive as well as punitive elements.
Abstract:
Quality of education should remain stable or permanently increase, even as the number of students rises. Quality of education is often related to possibilities for active learning and individual facilitation. This paper deals with the question of how high-quality learning can be enabled within oversized courses, and it presents the approach of e-flashcards, which enables active learning and individual facilitation within large-scale university courses.
Abstract:
BACKGROUND: Enterococcus faecalis has emerged as a major hospital pathogen. To explore its diversity, we sequenced E. faecalis strain OG1RF, which is commonly used for molecular manipulation and virulence studies. RESULTS: The 2,739,625 base pair chromosome of OG1RF was found to contain approximately 232 kilobases unique to this strain compared to V583, the only publicly available sequenced strain. Almost no mobile genetic elements were found in OG1RF. The 64 areas of divergence were classified into three categories. First, OG1RF carries 39 unique regions, including 2 CRISPR loci and a new WxL locus. Second, we found nine replacements where a sequence specific to V583 was substituted by a sequence specific to OG1RF. For example, the iol operon of OG1RF replaces a possible prophage and the vanB transposon in V583. Finally, we found 16 regions that were present in V583 but missing from OG1RF, including the proposed pathogenicity island, several probable prophages, and the cpsCDEFGHIJK capsular polysaccharide operon. OG1RF was more rapidly but less frequently lethal than V583 in the mouse peritonitis model and considerably outcompeted V583 in a murine model of urinary tract infections. CONCLUSION: E. faecalis OG1RF carries a number of unique loci compared to V583, but the almost complete lack of mobile genetic elements demonstrates that this is not a defining feature of the species. Additionally, OG1RF's effects in experimental models suggest that mediators of virulence may be diverse between different E. faecalis strains and that virulence is not dependent on the presence of mobile genetic elements.
Abstract:
In numerous intervention studies and education field trials, random assignment to treatment occurs in clusters rather than at the level of observation. This departure from random assignment of individual units may be due to logistics, political feasibility, or ecological validity. Data within the same cluster or grouping are often correlated. Application of traditional regression techniques, which assume independence between observations, to clustered data produces consistent parameter estimates. However, such estimators are often inefficient compared to methods which incorporate the clustered nature of the data into the estimation procedure (Neuhaus 1993). Multilevel models, also known as random effects or random components models, can be used to account for the clustering of data by estimating higher level (group) as well as lower level (individual) variation. Designing a study in which the unit of observation is nested within higher level groupings requires the determination of sample sizes at each level. This study investigates the design and analysis of various sampling strategies for a 3-level repeated measures design, and their effects on the parameter estimates, when the outcome variable of interest follows a Poisson distribution. Results of the study suggest that second-order PQL estimation produces the least biased estimates in the 3-level multilevel Poisson model, followed by first-order PQL and then second- and first-order MQL. The MQL estimates of both fixed and random parameters are generally satisfactory when the level 2 and level 3 variation is less than 0.10. However, as the higher level error variance increases, the MQL estimates become increasingly biased. If convergence of the estimation algorithm is not obtained by the PQL procedure and the higher level error variance is large, the estimates may be significantly biased. In this case, bias correction techniques such as bootstrapping should be considered as an alternative procedure. For larger sample sizes, structures with 20 or more units sampled at the levels with normally distributed random errors produced more stable estimates with less sampling variance than structures with an increased number of level 1 units. For small sample sizes, sampling fewer units at the level with Poisson variation produces less sampling variation; however, this criterion is no longer important when sample sizes are large. Neuhaus J (1993). "Estimation Efficiency and Tests of Covariate Effects with Clustered Binary Data". Biometrics, 49, 989-996.
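As a concrete illustration of the data structure this design question concerns, here is a minimal simulation sketch of a balanced 3-level Poisson model with random intercepts at levels 2 and 3. The sample sizes, intercept, and variance components are assumptions chosen for the example (variances of 0.10 match the regime in which the abstract reports MQL to be satisfactory), not values from the study.

```python
# Minimal sketch (assumed parameters): simulate counts from a 3-level
# Poisson model with random intercepts at levels 2 and 3.
import numpy as np

rng = np.random.default_rng(0)
n3, n2, n1 = 20, 20, 5            # level-3 groups, level-2 units per group, repeated obs
beta0 = 0.5                       # fixed intercept on the log scale (assumed)
var3, var2 = 0.10, 0.10           # level-3 and level-2 random-intercept variances (assumed)

u3 = rng.normal(0.0, np.sqrt(var3), n3)               # level-3 random intercepts
u2 = rng.normal(0.0, np.sqrt(var2), (n3, n2))         # level-2 random intercepts
log_mu = beta0 + u3[:, None, None] + u2[:, :, None]   # linear predictor, shape (n3, n2, 1)
y = rng.poisson(np.exp(log_mu), size=(n3, n2, n1))    # Poisson counts, shape (n3, n2, n1)

print(y.shape, y.mean())          # simulated data, ready for PQL/MQL fitting elsewhere
```

Repeating such simulations across sampling structures (more clusters versus more observations per cluster) is how the bias and sampling-variance comparisons described above can be examined.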
Abstract:
Using stress and coping as a unifying theoretical concept, a series of five models was developed in order to synthesize the survey questions and to classify information. These models identified the question, listed the research study, described measurements, listed workplace data, and listed industry and national reference data. A set of 38 instrument questions was developed within the five coping correlate categories. In addition, a set of 22 stress symptoms was also developed. The study was conducted within two groups, police and professors, on a large university campus. The groups were selected because their occupations were diverse, but they were a part of the same macroenvironment. The premise was that police officers would be more highly stressed than professors. Of a total study group of 80, there were 37 respondents. The difference in the mean stress responses was observable between the two groups. Not only were the responses similar within each group, but the stress level of response was also similar within each group. While the response to the survey instrument was good, only 3 respondents answered the stress symptom survey properly. It was determined that none of the 37 respondents believed that they were ill. This perception of being well was also evidenced by the grand mean of the stress scores of 2.76 (3.0 = moderate stress). This also caused fewer independent variables to be entered in the multiple regression model. The survey instrument was carefully designed to be universal. Universality is the ability to transcend occupational or regional definitions as applied to stress. It is the ability to measure responses within broad categories such as physiological, emotional, behavioral, social, and cognitive functions without losing the ability to measure the detail within the individual questions, or the relationships between questions and categories. Replication is much easier to achieve with standardized categories, questions, and measurement procedures such as those developed for the universal survey instrument. Because the survey instrument is universal, it can be used as an analytical device, an assessment device, a basic tool for planning, and a follow-up instrument to measure individual response to planned reductions in occupational stress. (Abstract shortened with permission of author.)
Abstract:
This paper analyses local geographical contexts targeted by transnational large-scale land acquisitions (>200 ha per deal) in order to understand how emerging patterns of socio-ecological characteristics can be related to processes of large-scale foreign investment in land. Using a sample of 139 land deals georeferenced with high spatial accuracy, we first analyse their target contexts in terms of land cover, population density, accessibility, and indicators for agricultural potential. Three distinct patterns emerge from the analysis: densely populated and easily accessible croplands (35% of land deals); remote forestlands with lower population densities (34% of land deals); and moderately populated and moderately accessible shrub- or grasslands (26% of land deals). These patterns are consistent with processes described in the relevant case study literature, and they each involve distinct types of stakeholders and associated competition over land. We then repeat the often-cited analysis that postulates a link between land investments and target countries with abundant so-called “idle” or “marginal” lands as measured by yield gap and available suitable but uncultivated land; our methods differ from the earlier approach, however, in that we examine local context (10-km radius) rather than countries as a whole. The results show that earlier findings are disputable in terms of concepts, methods, and contents. Further, we reflect on methodologies for exploring linkages between socioecological patterns and land investment processes. Improving and enhancing large datasets of georeferenced land deals is an important next step; at the same time, careful choice of the spatial scale of analysis is crucial for ensuring compatibility between the spatial accuracy of land deal locations and the resolution of available geospatial data layers. Finally, we argue that new approaches and methods must be developed to empirically link socio-ecological patterns in target contexts to key determinants of land investment processes. This would help to improve the validity and the reach of our findings as an input for evidence-informed policy debates.
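As an illustration of how such target-context patterns can be recovered from deal-level indicators, the sketch below clusters synthetic deals by land cover, population density, and accessibility. The variables, values, and three-cluster choice mirror the patterns named above but are otherwise assumptions, not the paper's actual pipeline or data.

```python
# Minimal sketch (synthetic data): cluster georeferenced land deals by
# context indicators to recover patterns like "accessible croplands"
# vs "remote forestlands" vs "moderately accessible shrub/grasslands".
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-deal indicators within a 10-km radius of each deal:
# [cropland share, forest share, population density, travel time to city (h)]
rng = np.random.default_rng(1)
deals = np.vstack([
    rng.normal([0.7, 0.1, 300, 1.5], [0.10, 0.05, 80, 0.5], (50, 4)),  # cropland-like
    rng.normal([0.1, 0.8,  20, 8.0], [0.05, 0.10, 10, 2.0], (47, 4)),  # forestland-like
    rng.normal([0.3, 0.2,  90, 4.0], [0.10, 0.10, 30, 1.0], (42, 4)),  # shrub/grass-like
])  # 139 deals in total, matching the sample size above

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(deals))     # standardize before clustering
for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} deals")
```

Standardizing the indicators first matters because population density and land-cover shares live on very different scales; without it, the clustering would be dominated by a single variable.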
Abstract:
This thesis consists of four essays on the design and disclosure of compensation contracts. Essays 1, 2 and 3 focus on behavioral aspects of mandatory compensation disclosure rules and of contract negotiations in agency relationships. The three experimental studies develop psychology-based theory and present results that deviate from standard economic predictions. Furthermore, the results of Essays 1 and 2 also have implications for firms' discretion in how to communicate their top management's incentives to the capital market. Essay 4 analyzes the role of fairness perceptions in the evaluation of executive compensation. For this purpose, two surveys targeting representative eligible voters as well as investment professionals were conducted. Essay 1 investigates the role of the detailed 'Compensation Discussion and Analysis', which is part of the Securities and Exchange Commission's 2006 regulation, in investors' evaluations of executive performance. Compensation disclosure complying with this regulation clarifies the relationship between realized reported compensation and the underlying performance measures and their target achievement levels. The experimental findings suggest that the salient presentation of executives' incentives inherent in the 'Compensation Discussion and Analysis' makes investors' performance evaluations less outcome dependent. Therefore, investors' judgment and investment decisions might be less affected by noisy environmental factors that drive financial performance. The results also suggest that fairness perceptions of compensation contracts are essential for investors' performance evaluations, in that more transparent disclosure increases the perceived fairness of compensation and the performance evaluation of managers who are not responsible for a bad financial performance. These results have important practical implications, as firms might choose to communicate their top management's incentive compensation more transparently in order to benefit from less volatile expectations about their future performance. Similar to the first experiment, the experiment described in Essay 2 addresses the question of more transparent compensation disclosure. However, unlike the first experiment, the second experiment does not analyze the effect of a more salient presentation of contract information but the informational effect of the contract information itself. For this purpose, the experiment tests two conditions in which the assessment of the compensation contracts' incentive compatibility, which determines executive effort, is either possible or not. On the one hand, the results suggest that the quality of investors' expectations about executive effort is improved; on the other hand, investors might over-adjust their prior expectations about executive effort when confronted with an unexpected financial performance and under-adjust when the financial performance confirms their prior expectations. Therefore, in the experiment, more transparent compensation disclosure does not lead to more accurate overall judgments of executive effort, and even lowers the processing quality of outcome information. These results add to the literature on disclosure, which predominantly advocates more transparency. The findings of the experiment, however, identify decreased information processing quality as a relevant disclosure cost category. Firms might therefore carefully evaluate the additional costs and benefits of more transparent compensation disclosure.
Together with the results from the experiment in Essay 1, the two experiments on compensation disclosure imply that firms should rather focus on their discretion in how to present their compensation disclosure, in order to benefit from investors' improved fairness perceptions and their spill-over on performance evaluation. Essay 3 studies the behavioral effects of contextual factors in recruitment processes that, from a standard economic perspective, do not affect the employer's or the applicant's bargaining power. In particular, the experiment studies two common characteristics of recruitment processes: pre-contractual competition among job applicants, and job applicants' non-binding effort announcements as they might be made during job interviews. Despite the standard economic irrelevance of these factors, the experiment develops theory regarding their behavioral effects on employees' subsequent effort provision and on employers' contract design choices. The experimental findings largely support the predictions. More specifically, the results suggest that firms can benefit from increased effort and, therefore, may generate higher profits. Further, firms may seize a larger share of the employment relationship's profit by highlighting the competitive aspects of the recruitment process and by requiring applicants to make announcements about their future effort. Finally, Essay 4 studies the role of fairness perceptions in the public evaluation of executive compensation. Although economic criteria for the design of incentive compensation generally do not make restrictive recommendations with regard to the amount of compensation, fairness perceptions might be relevant from the perspective of firms and standard setters. This is because behavioral theory has identified fairness as an important determinant of individuals' judgments and decisions. However, although fairness concerns about executive compensation are often voiced in the popular media and even in the literature, evidence on the meaning of fairness in the context of executive compensation is scarce and ambiguous. In order to inform practitioners and standard setters whether fairness concerns are exclusive to non-professionals or relevant for investment professionals as well, the two surveys presented in Essay 4 aim to find commonalities in the opinions of representative eligible voters and investment professionals. The results suggest that fairness is an important criterion for both groups. In particular, exposure to risk in the form of the variable compensation share is an important criterion shared by both groups: the higher the assumed variable share, the higher the compensation amount perceived as fair. However, to a large extent, opinions on executive compensation depend on personality characteristics, and to some extent, investment professionals' perceptions deviate systematically from those of non-professionals. The findings imply that firms might benefit from emphasizing the riskiness of their managers' variable pay components; the findings are therefore also in line with those of Essay 1.
Abstract:
INTRODUCTION Even though arthroplasty of the ankle joint is considered an established procedure, only about 1,300 endoprostheses are implanted in Germany annually. Arthrodeses of the ankle joint are performed almost three times more often. This may be due to the availability of the procedure (more than twice as many providers perform arthrodesis) as well as the high frequency of revision procedures of arthroplasties postulated in the literature. In those publications, however, there is often no clear differentiation between revision surgery with exchange of components, subsequent interventions due to complications, and subsequent surgery not associated with complications. The German Orthopaedic Foot and Ankle Association's (D.A.F.) registry for total ankle replacement collects data pertaining to perioperative complications as well as the cause, nature and extent of subsequent interventions, and postoperative patient satisfaction. MATERIAL AND METHODS The D.A.F.'s total ankle replacement registry is a nation-wide, voluntary registry. After giving written informed consent, patients can be added to the database by participating providers. Data are collected during the hospital stay for surgical treatment, during routine follow-up examinations, and in the context of revision surgery. The information can be submitted in paper-based or online formats. The survey instruments are available as minimum data sets or scientific questionnaires which include patient-reported outcome measures (PROMs). The pseudonymized clinical data are collected and evaluated at the Institute for Evaluative Research in Medicine, University of Bern, Switzerland (IEFM). The patient-related data remain on the registry's module server in North Rhine-Westphalia, Germany. The registry's methodology as well as the results on revisions and patient satisfaction for 115 patients with a two-year follow-up period are presented. Statistical analyses are performed with SAS™ (Version 9.4, SAS Institute, Inc., Cary, NC, USA). RESULTS About 2½ years after the registry was launched, 621 datasets on primary implantations, 1,427 on follow-ups and 121 records on re-operations are available. 49% of the patients received their implants due to post-traumatic osteoarthritis, 27% because of primary osteoarthritis, and 15% of patients suffered from a rheumatic disease. More than 90% of the primary interventions proceeded without complications. Subsequent interventions were recorded for 84 patients, which corresponds to a rate of 13.5% with respect to the primary implantations. It should be noted that these secondary procedures also include two-stage procedures not due to a complication. "True revisions" are interventions with exchange of components due to mechanical complications and/or infection, and were present in 7.6% of patients. 415 of the patients commented on their satisfaction with the operative result during the last follow-up: 89.9% of patients evaluated their outcome as excellent or good, 9.4% as moderate, and only 0.7% (3 patients) as poor. In these three cases, component loosening or symptomatic osteoarthritis of the subtalar joint (USG) was present. Two-year follow-up data using the American Orthopedic Foot and Ankle Society Ankle and Hindfoot Scale (AOFAS-AHS) are already available for 115 patients. The median AOFAS-AHS score increased from 33 points preoperatively to more than 80 points three to six months postoperatively. This increase remained nearly constant over the entire two-year follow-up period.
CONCLUSION Covering less than 10% of the approximately 240 providers in Germany and approximately 12% of the annually implanted total ankle replacements, the D.A.F. registry is still far from being a comprehensive national registry. Nevertheless, its geographical coverage and the inclusion of "high-volume" surgeons (more than 100 total ankle replacements a year) and "low-volume" surgeons (fewer than 5 total ankle replacements a year) make the registry representative for Germany. The registry data show that the number of subsequent interventions, and in particular of "true revision" procedures, is markedly lower than the 20% often postulated in the literature. In addition, a high level of patient satisfaction over the short and medium term is recorded. From the perspective of the authors, these results indicate that total ankle arthroplasty, given a correct indication and appropriate selection of patients, is not inferior to ankle arthrodesis concerning patients' satisfaction and function. The first valid survival rates can be expected about 10 years after the registry's start.
Abstract:
XENON is a dark matter direct detection project, consisting of a time projection chamber (TPC) filled with liquid xenon as the detection medium. The construction of the next-generation detector, XENON1T, is presently taking place at the Laboratori Nazionali del Gran Sasso (LNGS) in Italy. It aims at a sensitivity to spin-independent cross sections of 2 × 10⁻⁴⁷ cm² for WIMP masses around 50 GeV/c², which requires a background reduction by two orders of magnitude compared to XENON100, the current-generation detector. An active system that is able to tag muons and muon-induced backgrounds is critical for this goal. A water Cherenkov detector of ~10 m height and diameter has therefore been developed, equipped with 8-inch photomultipliers and clad with a reflective foil. We present the design and optimization study for this detector, which has been carried out with a series of Monte Carlo simulations. The muon veto will reach very high detection efficiencies for muons (>99.5%) and for showers of secondary particles from muon interactions in the rock (>70%). Similar efficiencies will be obtained for XENONnT, the upgrade of XENON1T, which will later improve the WIMP sensitivity by another order of magnitude. With the Cherenkov water shield studied here, the background from muon-induced neutrons in XENON1T is negligible.
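For context on what a Monte Carlo efficiency figure like ">99.5%" involves, here is a minimal sketch of turning simulated tagged/total counts into an efficiency with a binomial (Wilson) confidence interval; the counts are invented for illustration and are not the collaboration's results.

```python
# Minimal sketch (invented counts): estimate a veto efficiency from Monte
# Carlo tagged/total counts with a Wilson score confidence interval.
import math

def wilson_interval(tagged, total, z=1.96):
    """Wilson score interval for an efficiency tagged/total at ~95% CL."""
    p = tagged / total
    denom = 1 + z**2 / total
    center = (p + z**2 / (2 * total)) / denom
    half = z * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return center - half, center + half

# Hypothetical MC outcome: 9,972 of 10,000 simulated muons fire the veto.
lo, hi = wilson_interval(9972, 10000)
print(f"muon veto efficiency ~ {9972/10000:.4f} (95% CI {lo:.4f}-{hi:.4f})")
```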
Abstract:
Cleverly designed molecular building blocks provide chemists with the tools of a powerful molecular-scale construction set. They enable chemists to engineer materials having a predictable order and useful solid-state properties. Hence, it is in the realm of supramolecular chemistry to follow a strategy for synthesizing materials which combine a selected set of properties, for instance from the areas of magnetism, photophysics and electronics. As a successful approach, host/guest solids based on extended anionic, homo- and bimetallic oxalato-bridged transition-metal compounds with two- and three-dimensional connectivities have been investigated. In this report, a brief review is given of the structural aspects of this class of compounds, followed by a presentation of a thermal and magnetic study of two distinct, heterometallic oxalato-bridged layer compounds.
Abstract:
Large-scale land acquisition, or "land grabbing", has become a key research topic among scholars interested in agrarian change, development, and the environment. The term "land acquisitions" refers to a highly contested process in terms of governance and impacts on livelihoods and human rights. This book focuses on South-East Asia. A series of thematic and in-depth case studies put "land grabbing" into specific historical and institutional contexts. The volume also offers a human rights analysis of the phenomenon, examining the potential and limits of human rights mechanisms aimed at preventing and mitigating land grabs' negative consequences.