903 results for Twin coronet porphyrins


Relevance:

10.00%

Publisher:

Abstract:

In this paper, we demonstrate through computer simulation and experiment a novel subcarrier coding scheme combined with pre-electrical dispersion compensation (pre-EDC) for fiber nonlinearity mitigation in coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. As the frequency spacing in CO-OFDM systems is usually small (tens of MHz), neighbouring subcarriers tend to experience correlated nonlinear distortions after propagation over a fiber link. As a consequence, nonlinearity mitigation can be achieved by encoding and processing neighbouring OFDM subcarriers simultaneously. Herein, we propose to adopt the concept of dual phase conjugated twin wave for CO-OFDM transmission. Simulation and experimental results show that this simple technique combined with 50% pre-EDC can effectively offer up to 1.5 and 0.8 dB performance gains in CO-OFDM systems with BPSK and QPSK modulation formats, respectively.
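
As a rough illustration of the principle behind the phase-conjugated twin-wave idea (a toy model, not the paper's simulation or experimental setup), the Python sketch below pairs each subcarrier symbol with a conjugate twin on a neighbouring subcarrier; because the correlated nonlinear phase flips sign under conjugation, coherently combining the received pair cancels the distortion to first order. The channel model, noise levels and variable names are assumptions.

    # Toy sketch of phase-conjugated twin-wave coding on neighbouring subcarriers,
    # assuming the two subcarriers see the same nonlinear phase distortion.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    # QPSK symbols on the "data" subcarriers
    data = (rng.choice([1, -1], n) + 1j * rng.choice([1, -1], n)) / np.sqrt(2)
    twin = np.conj(data)                      # neighbouring subcarrier carries the conjugate

    phi = 0.3 * rng.standard_normal(n)        # correlated nonlinear phase on both subcarriers

    def awgn():
        return 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

    rx_data = data * np.exp(1j * phi) + awgn()
    rx_twin = twin * np.exp(1j * phi) + awgn()

    # Conjugating the received twin flips the sign of the common phase distortion,
    # so coherent averaging cancels it to first order.
    recovered = 0.5 * (rx_data + np.conj(rx_twin))
    print("error std before:", np.std(rx_data - data), "after:", np.std(recovered - data))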

Relevance:

10.00%

Publisher:

Abstract:

We discuss recent progress on the use of optical and digital phase conjugation techniques for nonlinearity compensation in optical fiber links. We compare the achievable performance gain of the phase-conjugated twin wave applied in two polarization states and time segments with mid-link optical phase conjugation and digital back propagation. For multicarrier transmission schemes such as orthogonal frequency division multiplexing, two recently proposed schemes, namely phase-conjugated pilots and phase-conjugated subcarrier coding, are reviewed.

Relevance:

10.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to present a conceptual framework for analysing and understanding the twin developments of successful microeconomic reform on the one hand and failed macroeconomic stabilisation attempts on the other in Hungary. The case study also explores why Hungarian policymakers were willing to initiate reforms in the micro sphere but reluctant to initiate major changes in public finances, both before and after the regime change of 1989/1990. Design/methodology/approach – The paper applies a path-dependent approach by carefully analysing Hungary's Communist and post-Communist economic development. The study restricts itself to a positive analysis, but normative statements can also be drawn accordingly. Findings – The study demonstrates that Hungary's deteriorating economic performance is not a recent phenomenon. By providing a path-dependent explanation, it argues that both Communist and post-Communist governments used the general budget as a buffer to compensate the losers of economic reforms, especially microeconomic restructuring. The country's gradualist success – dating back to at least 1968 – in liberalisation, marketisation and privatisation was accompanied by constant overspending in the general government. Practical implications – Hungary has been one of the worst-hit countries of the 2008/2009 financial crisis, not just in Central and Eastern Europe but in the whole world. The capacity and opportunity for strengthening international investors' confidence are, however, not without doubt. The current deterioration is deeply rooted in failed past macroeconomic management. Dissolving fiscal laxity and state paternalism in a broader context therefore requires an all-encompassing reform of the general government, which may trigger serious challenges to the political regime as well. Originality/value – The study shows that a relatively high ratio of redistribution, a high and persistent public deficit and accelerating indebtedness are not recent phenomena in Hungary. In fact, these trends characterised the country well before the transformation of 1989/1990 and have continued in the post-socialist years, too. To explain this, the study argues that over the last couple of decades the hardening of firms' budget constraints has come at the cost of maintaining the soft budget constraint of the state.

Relevance:

10.00%

Publisher:

Abstract:

The history of planning and strategy-making spans more than half a century. Over this period we have witnessed the evolution of both theory and practice. MBA textbooks of the last third of the 20th century typically presented this process as a progression from finance-centred budget planning, through forecast-based planning and strategic planning, to strategic management. There may be controversy about the naming, characteristics and timing of these stages, but there is agreement that the changes of the last decades can, on the whole, be traced back to changes in the business environment or, in some outstanding cases (like 9/11), be attributed to the corporate ability to foresee and to adapt to a vision of the future. The main purpose of the research is to provide a summary picture of how planning and strategy-making have changed over recent decades, together with their milestones and periods, and to explore and systematise the key aspects of these changes. The events of the first decade of the new millennium are particularly interesting when we consider their real effect on the theory and practice of strategic management. Let us remember the euphoria around the year 2000 and the predictions of "new technologies", a "new economy", "new organization" and "new leadership". We have already referred to the destruction of the twin towers of the World Trade Center, which marked a new era and a new quality of international terrorism, with its consequences (Afghanistan, Iraq). But the "product" of this decade is the strategic aim that companies now focus on: social responsibility in the face of the unavoidable long-run effects of climate change. During the research the big question arose of how well the science of strategic management performed in predicting the global monetary and economic crisis, and what solutions it has to offer for handling and overcoming the crisis. Do the answers to these questions imply that a change of paradigm is necessary, that a new quality is needed, or that we have arrived at a new crossroads in the development process of strategic management? (...)

Relevance:

10.00%

Publisher:

Abstract:

Long-span bridges are flexible and therefore sensitive to wind-induced effects. One way to improve the stability of long-span bridges against flutter is to use cross-sections that involve twin side-by-side decks. However, this can amplify responses due to vortex-induced oscillations. Wind tunnel testing is a well-established practice for evaluating the stability of bridges against wind loads. In order to study the response of the prototype in the laboratory, dynamic similarity requirements should be satisfied. One of the parameters that is normally violated in wind tunnel testing is the Reynolds number. In this dissertation, the effects of Reynolds number on the aerodynamics of a double-deck bridge were evaluated by measuring fluctuating forces on a motionless sectional model of a bridge at different wind speeds representing different Reynolds regimes. The efficacy of vortex mitigation devices was also evaluated in different Reynolds number regimes. Another parameter that is frequently ignored in wind tunnel studies is the correct simulation of turbulence characteristics. Due to the difficulties in simulating flow with a large turbulence length scale on a sectional model, wind tunnel tests are often performed in smooth flow as a conservative approach. The validity of the simplifying assumptions in the calculation of buffeting loads, as the direct impact of turbulence, needs to be verified for twin-deck bridges. The effects of turbulence characteristics were investigated by testing sectional models of a twin-deck bridge under two different turbulent flow conditions. Not only do the flow properties play an important role in the aerodynamic response of the bridge, but the geometry of the cross-section is also expected to have significant effects. In this dissertation, the effects of deck details, such as the width of the gap between the twin decks and the traffic barriers, on the aerodynamic characteristics of a twin-deck bridge were investigated, particularly the vortex shedding forces, with the aim of clarifying how these shape details can alter the wind-induced responses. Finally, a summary of the issues involved in designing a dynamic test rig for high Reynolds number tests is given, using the studied cross-section as an example.
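
As a rough illustration of why Reynolds number similarity is normally violated in such tests, the short Python sketch below compares the Reynolds number of a hypothetical full-scale twin-deck bridge with that of a geometrically scaled section model; all speeds, dimensions and the 1:50 scale are invented for illustration and are not taken from the dissertation.

    # Reynolds number mismatch between an assumed prototype bridge deck and a
    # 1:50 wind-tunnel section model (all numbers are illustrative).
    NU_AIR = 1.5e-5          # kinematic viscosity of air, m^2/s

    def reynolds(wind_speed, length, nu=NU_AIR):
        """Re = U * L / nu, with L a characteristic length such as deck depth."""
        return wind_speed * length / nu

    proto_Re = reynolds(wind_speed=40.0, length=4.0)      # prototype: 40 m/s, 4 m deck depth
    model_Re = reynolds(wind_speed=15.0, length=4.0 / 50) # scaled model at 15 m/s

    print(f"prototype Re ~ {proto_Re:.1e}, model Re ~ {model_Re:.1e}, "
          f"ratio ~ {proto_Re / model_Re:.0f}")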

Relevance:

10.00%

Publisher:

Abstract:

The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables and to define best practices for archiving ocean acidification data.

Relevance:

10.00%

Publisher:

Abstract:

With the intention of studying and developing the design process based on a specific methodology, the aim of this work is to present the design of a gated condominium community in Natal based on the application of the principles of shape grammar in its design process. Shape grammar is a design method developed in the 1970s by George Stiny and James Gips. It is used for the analysis of a project as well as for its synthesis, with the goal of creating a "formal vocabulary" through mathematical and/or geometrical operations. Here, the methodology was used in the synthesis stage of the design process, through the relationship between formal subtractions and the houses' architectural planning. As a result, five dwelling configurations were proposed, each differing from the others in shape and architectural programme, distributed in three twin groups, which are repeated up to a final total of nine architectural volumes. In addition to studies of the condominium's ventilation and simulations of the buildings' shading, studies of spatial flexibility and acoustic performance were also performed. The mapping of the design process, one of the specific objectives of the dissertation, comprised not only the record of formal constraints (the preparation and application of rules) but also physical, environmental, legal and sustainability aspects relating, on one hand, to the optimisation of shading and passive ventilation for hot and humid climates and, on the other hand, to the modulation and rationalisation of the construction.
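
As a hypothetical illustration of a shape-grammar-style subtraction rule of the kind described above (not the rules actually developed in the dissertation), the Python sketch below removes a rectangular corner from a block volume in plan and returns the remaining pieces; all dimensions are invented.

    # Hypothetical sketch of a shape-grammar subtraction rule in the spirit of
    # Stiny and Gips; dimensions and names are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Box:
        x: float      # plan position, metres
        y: float
        w: float      # width (x extent)
        d: float      # depth (y extent)

    def subtract_corner(block, cut_w, cut_d):
        """Rule: remove a cut_w x cut_d rectangle from the far corner of the block,
        returning the two axis-aligned boxes that make up the remaining L-shape."""
        return [
            Box(block.x, block.y, block.w - cut_w, block.d),
            Box(block.x + block.w - cut_w, block.y, cut_w, block.d - cut_d),
        ]

    house = Box(0, 0, 12, 8)              # starting block volume in plan
    print(subtract_corner(house, cut_w=4, cut_d=3))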

Relevance:

10.00%

Publisher:

Abstract:

Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.

Relevance:

10.00%

Publisher:

Abstract:

Background - Specific language impairment (SLI) is a common neurodevelopmental disorder, observed in 5–10% of children. Family and twin studies suggest a strong genetic component, but relatively few candidate genes have been reported to date. A recent genome-wide association study (GWAS) described the first statistically significant association specifically for an SLI cohort, between a missense variant (rs4280164) in the NOP9 gene and language-related phenotypes under a parent-of-origin model. Replication of these findings is particularly challenging because the availability of parental DNA is required. Methods - We used two independent family-based cohorts characterised with reading- and language-related traits: a longitudinal cohort (n = 106 informative families) including children with language and reading difficulties, and a nuclear family cohort (n = 264 families) selected for dyslexia. Results - We observed association with language-related measures when modelling for parent-of-origin effects at the NOP9 locus in both cohorts: minimum P = 0.001 for phonological awareness with a paternal effect in the first cohort, and minimum P = 0.0004 for irregular word reading with a maternal effect in the second cohort. Allelic and parental trends were not consistent when compared to the original study. Conclusions - A parent-of-origin effect at this locus was detected in both cohorts, albeit with different trends. These findings contribute to interpreting the original GWAS report and support further investigation of the NOP9 locus and its role in language-related traits. A systematic evaluation of parent-of-origin effects in genetic association studies has the potential to reveal novel mechanisms underlying complex traits.
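
To illustrate the general idea of a parent-of-origin contrast, in a much simplified form that is not the family-based analysis pipeline used in the study, the Python sketch below compares a simulated language phenotype between heterozygous children according to which parent transmitted the minor allele; the variable names, effect size and test are assumptions.

    # Simplified parent-of-origin contrast on simulated data (illustrative only):
    # among heterozygous children, compare the phenotype by transmitting parent.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 200
    origin = rng.choice(["paternal", "maternal"], n)   # which parent transmitted the allele
    phenotype = rng.standard_normal(n) + np.where(origin == "paternal", 0.3, 0.0)

    pat = phenotype[origin == "paternal"]
    mat = phenotype[origin == "maternal"]
    t, p = stats.ttest_ind(pat, mat)
    print(f"paternal mean {pat.mean():.2f}, maternal mean {mat.mean():.2f}, p = {p:.3g}")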

Relevance:

10.00%

Publisher:

Abstract:

Germanium (Ge) nanowires are of current research interest for high-speed nanoelectronic devices due to their lower band gap, high carrier mobility compatible with high-K dielectrics, and larger excitonic Bohr radius, which results in a more pronounced quantum confinement effect [1-6]. A general way to grow Ge nanowires is to use liquid or solid growth promoters in a bottom-up approach, which allows control of the aspect ratio, diameter and structure of 1D crystals via external parameters such as precursor feedstock, temperature, operating pressure and precursor flow rate [3, 7-11]. Solid-phase seeding is preferred for more controlled processing of nanomaterials, for potential suppression of the unintentional incorporation of high dopant concentrations in semiconductor nanowires, and for avoiding unwanted compositional tailing at the seed-nanowire interface [2, 5, 9, 12]. There are therefore distinct features of the solid-phase seeding mechanism that potentially offer opportunities for the controlled processing of nanomaterials with new physical properties. Superior control over the growth kinetics of nanowires could be achieved by controlling the inherent growth constraints instead of external parameters, which always carry instrumental inaccuracy. High dopant concentrations in semiconductor nanowires can result from unintentional incorporation of atoms from the metal seed material, as described for the Al-catalysed VLS growth of Si nanowires [13], which can in turn be suppressed by solid-phase seeding. In addition, the creation of very sharp interfaces between group IV semiconductor segments has been achieved with solid seeds [14], whereas the traditionally used liquid Au particles often lead to compositional tailing of the interface [15]. Korgel et al. also described the superior size retention of metal seeds in an SFSS nanowire growth process when compared to an SFLS process using Au colloids [12]. In this work we have used silver and alloy seed particles with different compositions to manipulate the growth of nanowires in the sub-eutectic regime. The solid seeding approach also gives an opportunity to influence the crystallinity of the nanowires independently of the substrate. Taking advantage of the ready formation of stacking faults in metal nanoparticles, lamellar twins could be formed in the nanowires.

Relevance:

10.00%

Publisher:

Abstract:

An Aerosol Time-Of-Flight Mass Spectrometer (ATOFMS) was deployed to investigate the size-resolved chemical composition of single particles at an urban background site in Paris, France, as part of the MEGAPOLI winter campaign in January/February 2010. ATOFMS particle counts were scaled to match coincident Twin Differential Mobility Particle Sizer (TDMPS) data in order to generate hourly size-resolved mass concentrations for the single particle classes observed. The total scaled ATOFMS particle mass concentration in the size range 150–1067 nm was found to agree very well with the sum of concurrent High-Resolution Time-of-Flight Aerosol Mass Spectrometer (HR-ToF-AMS) and Multi-Angle Absorption Photometer (MAAP) mass concentration measurements of organic carbon (OC), inorganic ions and black carbon (BC) (R2 = 0.91). Clustering analysis of the ATOFMS single particle mass spectra allowed the separation of elemental carbon (EC) particles into four classes: (i) EC attributed to biomass burning (ECbiomass), (ii) EC attributed to traffic (ECtraffic), (iii) EC internally mixed with OC and ammonium sulfate (ECOCSOx), and (iv) EC internally mixed with OC and ammonium nitrate (ECOCNOx). Average hourly mass concentrations for EC-containing particles detected by the ATOFMS were found to agree reasonably well with semi-continuous quantitative thermal/optical EC and optical BC measurements (r2 = 0.61 and 0.65–0.68 respectively, n = 552). The EC particle mass assigned to fossil fuel and biomass burning sources also agreed reasonably well with BC mass fractions assigned to the same sources using seven-wavelength aethalometer data (r2 = 0.60 and 0.48, respectively, n = 568). Agreement between the ATOFMS and other instrumentation improved noticeably when a period influenced by significantly aged, internally mixed EC particles was removed from the intercomparison. 88% and 12% of EC particle mass was apportioned to fossil fuel and biomass burning respectively using the ATOFMS data compared with 85% and 15% respectively for BC estimated from the aethalometer model. On average, the mass size distribution for EC particles is bimodal; the smaller mode is attributed to locally emitted, mostly externally mixed EC particles, while the larger mode is dominated by aged, internally mixed ECOCNOx particles associated with continental transport events. Periods of continental influence were identified using the Lagrangian Particle Dispersion Model (LPDM) "FLEXPART". A consistent minimum between the two EC mass size modes was observed at approximately 400 nm for the measurement period. EC particles below this size are attributed to local emissions using chemical mixing state information and contribute 79% of the scaled ATOFMS EC particle mass, while particles above this size are attributed to continental transport events and contribute 21% of the EC particle mass. These results clearly demonstrate the potential benefit of monitoring size-resolved mass concentrations for the separation of local and continental EC emissions. Knowledge of the relative input of these emissions is essential for assessing the effectiveness of local abatement strategies.
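
A toy sketch of the kind of count-to-mass scaling described above, not the campaign's actual processing chain: per size bin, ATOFMS counts are scaled by coincident mobility-sizer number concentrations and converted to mass assuming spherical particles. All bin edges, counts and the effective density below are invented for illustration.

    # Illustrative per-bin scaling of ATOFMS counts to mass concentrations.
    import numpy as np

    bin_edges_nm = np.array([150, 300, 500, 800, 1067])       # size bin edges, nm
    atofms_counts = np.array([120.0, 260.0, 90.0, 15.0])      # hits per hour per bin (assumed)
    mobility_number = np.array([4.0e3, 1.5e3, 2.0e2, 1.5e1])  # particles/cm^3 per bin (assumed)

    scale = mobility_number / atofms_counts                   # per-bin scaling factor
    d_mid_cm = np.sqrt(bin_edges_nm[:-1] * bin_edges_nm[1:]) * 1e-7   # geometric mid-diameter, cm
    volume_cm3 = np.pi / 6.0 * d_mid_cm**3                    # spherical particle volume
    density = 1.5                                             # assumed effective density, g/cm^3

    # scaled number (particles/cm^3) * volume * density -> g/cm^3, then to ug/m^3
    mass_ug_m3 = atofms_counts * scale * volume_cm3 * density * 1e12
    print(dict(zip(["150-300", "300-500", "500-800", "800-1067"], mass_ug_m3.round(3))))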

Relevance:

10.00%

Publisher:

Abstract:

The science of genetics is undergoing a paradigm shift. Recent discoveries, including the activity of retrotransposons, the extent of copy number variations, somatic and chromosomal mosaicism, and the nature of the epigenome as a regulator of DNA expressivity, are challenging a series of dogmas concerning the nature of the genome and the relationship between genotype and phenotype. DNA, once held to be the unchanging template of heredity, now appears subject to a good deal of environmental change; once considered to be identical in all cells and tissues of the body, there is growing evidence that somatic mosaicism is the normal human condition; and although DNA was treated as the sole biological agent of heritability, we now know that the epigenome, which regulates gene expressivity, can be inherited via the germline. These developments are particularly significant for behavior genetics for at least three reasons: first, these phenomena appear to be particularly prevalent in the human brain and are likely involved in much of human behavior; second, they have important implications for the validity of heritability and gene association studies, the methodologies that largely define the discipline of behavior genetics; and third, they appear to play a critical role in development during the perinatal period, and in enabling phenotypic plasticity in offspring in particular. I examine one of the central claims to emerge from the use of heritability studies in the behavioral sciences, the principle of “minimal shared maternal effects,” in light of the growing awareness that the maternal perinatal environment is a critical venue for the exercise of adaptive phenotypic plasticity. This consideration has important implications for both developmental and evolutionary biology.

Relevance:

10.00%

Publisher:

Abstract:

My dissertation investigates twin financial interventions—urban development and emergency management—in a single small town. Once a thriving city drawing blacks as blue-collar workers during the Great Migration, Benton Harbor, Michigan has suffered from waves of out-migration, debt, and alleged poor management. Benton Harbor’s emphasis on high-end economic development to attract white-collar workers and tourism, amidst the poverty, unemployment, and disenfranchisement of black residents, highlights an extreme case of American urban inequality. At the same time, many bystanders and representative observers argue that this urban redevelopment scheme and the city’s takeover by the state represent Benton Harbor residents’ only hope for a better life. I interviewed 44 key players and observers in local politics and development, attended 20 public meetings, conducted three months of observations, and collected extensive archival data. Examining Benton Harbor’s time under emergency management and its luxury golf course development as two exemplars of a larger relationship, I find that the top-down processes allegedly intended to alleviate Benton Harbor’s inequality actually reproduce and deepen the city’s problems. I propose that the beneficiaries of both plans constitute a white urban regime active in Benton Harbor. I show how the white urban regime serves its interests by operating an extraction machine in the city, which serves to reproduce local poverty and wealth by directing resources toward the white urban regime and away from the city.

Relevance:

10.00%

Publisher:

Abstract:

Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.