Abstract:
Urban planning in Europe has its roots in the social reform movements of the 18th and 19th centuries and, in the UK, evolved into the state-backed comprehensive planning system established as a pillar of the welfare state in 1947. This new planning system played a key role in meeting the pressing social needs of the early post-war period, through, for example, an ambitious new town programme. However, from the late 1970s onwards the main priorities of the planning system have shifted as the UK state has withdrawn support for welfare and reasserted market values. One consequence has been increased inequality in access to many of the resources that planning seeks to regulate, including affordable housing, local services and environmental quality.
Drawing on evidence from the recent literature on equality, including Wilkinson and Pickett's The Spirit Level, this paper will question the role of planning in an era of post-politics and a neo-liberal state. It will review some of the consequences for the governance and practice of planning and ask what this means for the core values of the planning profession. Finally, the paper will discuss the rise of the Healthy Urban Planning Movement in the US and Europe and ask whether this provides any potential for reasserting the public interest in the planning process.
Abstract:
In security and surveillance applications there is an increasing need to process image data efficiently and effectively, either at source or in large data networks. Whilst Field Programmable Gate Arrays (FPGAs) have been seen as a key technology for enabling this, they are typically programmed using high-level synthesis and/or hardware description language (HDL) approaches; this is a major disadvantage in terms of the time needed to design or program them and to verify correct operation, and it considerably reduces the programmability of any technique based on this technology. The work here proposes a different approach: optimised soft-core processors which can be programmed in software. In particular, the paper proposes a design tool chain for programming such processors that uses the CAL Actor Language as a starting point for describing an image processing algorithm and targets its implementation to these custom-designed, soft-core processors on FPGA. The main purpose is to exploit task and data parallelism in order to achieve the same parallelism as a previous HDL implementation while avoiding the design time, verification and debugging steps associated with such approaches.
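To illustrate the dataflow style that CAL encourages, the following is a minimal sketch in Python (not CAL, and not the authors' tool chain) of actors that communicate only through FIFO channels; because each actor owns no shared state, each one can in principle be mapped to its own soft-core processor:

    from queue import Queue
    from threading import Thread

    def actor(fire, inbox, outbox):
        # A CAL-style actor: repeatedly consume one token, fire, emit one token.
        while True:
            token = inbox.get()
            if token is None:          # end-of-stream marker
                outbox.put(None)
                return
            outbox.put(fire(token))

    # Two stages of a toy image pipeline, each operating on one image row.
    threshold = lambda row: [255 if p > 128 else 0 for p in row]
    invert    = lambda row: [255 - p for p in row]

    src, mid, sink = Queue(), Queue(), Queue()
    Thread(target=actor, args=(threshold, src, mid)).start()
    Thread(target=actor, args=(invert, mid, sink)).start()

    for row in ([10, 200, 130], [255, 0, 90]):   # a 2x3 test image
        src.put(row)
    src.put(None)

    while (row := sink.get()) is not None:
        print(row)   # stages run concurrently; tokens stay in order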
Abstract:
At its core, Duverger’s Law—holding that the number of viable parties in first-past-the-post systems should not exceed two—applies primarily at the district level. While the number of parties nationally may exceed two, district-level party system fragmentation should not. Given that a growing body of research shows that district-level party system fragmentation can indeed exceed two in first-past-the-post systems, I explore whether the major alternative explanation for party system fragmentation—the social cleavage approach—can explain such violations of Duverger’s Law. Testing this argument in several West European elections prior to the adoption of proportional representation, I find evidence favouring a social cleavage explanation: with the expansion of the class cleavage, the average district-level party system eventually came to violate the two-party predictions associated with Duverger’s Law. This suggests that sufficient social cleavage diversity may produce multiparty systems in other first-past-the-post systems.
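The abstract does not name its fragmentation measure, but in this literature district-level fragmentation is conventionally quantified with the Laakso-Taagepera effective number of parties, under which Duverger's Law predicts a value of roughly two or less:

    N_{\mathrm{eff}} = \frac{1}{\sum_{i=1}^{n} v_i^{2}}

where v_i is party i's share of the district vote. For example, two parties splitting a district 50/50 give N_eff = 2, while three parties at 45/35/20 give N_eff ≈ 2.7, the kind of district-level violation at issue here.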
Abstract:
An experimental study on strengthening prestressed concrete (PC) hollow-core slabs was conducted. Nine PC hollow-core slabs were tested, including three unstrengthened reference slabs and six slabs strengthened with bamboo plates. The results show that, compared with the unstrengthened slabs, the cracking loads of the strengthened slabs increase by 5% to 96% (average 41%), the loads at allowable deflection increase by 8% to 76% (average 35%), and the ultimate loads increase by 83% to 184% (average 123%). The improvements in cracking load, allowable load and ultimate load all increase with the thickness and width of the bamboo plates. As the load increases, the strain distribution along the height of the strengthened slabs at mid-span remains essentially consistent with the plane-section assumption. With increasing thickness and width of the bamboo plates, both the bamboo tensile strain on the tension face and the concrete compressive strain on the compression face of the strengthened slabs decrease at the same load level.
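For reference, the plane-section (Bernoulli) assumption mentioned above means that strain varies linearly over the section depth,

    \varepsilon(y) = \kappa\,(y - y_0),

where κ is the sectional curvature and y_0 the neutral-axis depth; linear mid-span strain profiles up to high load levels are the usual indication that the slab and the bonded bamboo plate act compositely.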
Abstract:
In the reinsurance market, the risks that natural catastrophes pose to portfolios of properties must be quantified so that they can be priced and insurance offered. The analysis of such risks at the portfolio level requires a simulation of up to 800,000 trials with an average of 1,000 catastrophic events per trial. This is sufficient to capture risk for a global multi-peril reinsurance portfolio covering a range of perils including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge, riverine flooding and wildfire. Such simulations are both computation and data intensive, making the application of high-performance computing techniques desirable.
In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand-tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units (GPUs) and Intel Xeon Phi many-core accelerators.
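As a minimal Python sketch of the simulation structure described (event probabilities and loss figures below are invented), a year is simulated by sampling which events occur (primary uncertainty) and then perturbing each event's loss (secondary uncertainty); the probable maximum loss is a tail quantile of the resulting annual-loss distribution:

    import random

    random.seed(42)
    EVENTS = [  # (annual occurrence probability, mean loss, loss std-dev)
        (0.10, 50e6, 20e6),    # e.g. hurricane landfall
        (0.02, 200e6, 80e6),   # e.g. major earthquake
        (0.30, 5e6, 2e6),      # e.g. severe thunderstorm
    ]

    def simulate_year():
        loss = 0.0
        for p_occ, mu, sigma in EVENTS:
            if random.random() < p_occ:                    # primary uncertainty
                loss += max(0.0, random.gauss(mu, sigma))  # secondary uncertainty
        return loss

    trials = sorted(simulate_year() for _ in range(100_000))
    # Probable maximum loss at the 1-in-250-year return period:
    pml_250 = trials[int(len(trials) * (1 - 1 / 250))]
    print(f"PML(250y) = {pml_250 / 1e6:.1f}M")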
Abstract:
In the last decade, mobile wireless communications have witnessed explosive growth in user penetration rates and widespread deployment around the globe. In particular, a research topic of particular relevance in telecommunications today is the design and implementation of fourth-generation (4G) mobile communication systems. 4G networks will be characterized by the support of multiple radio access technologies in a core network fully compliant with the Internet Protocol (the all-IP paradigm). Such networks will sustain the stringent quality-of-service (QoS) requirements and the high data rates expected from the type of multimedia applications (e.g. YouTube and Skype) to be available in the near future. 4G wireless communication systems will therefore be of paramount importance to the development of the information society. As 4G wireless services continue to grow, they will put more and more pressure on spectrum availability. There is worldwide recognition that current methods of spectrum management have reached their limit and are no longer optimal, so new paradigms must be sought. Studies show that most of the assigned spectrum is under-utilized; the problem in most cases is therefore inefficient spectrum management rather than spectrum shortage. Current trends towards a more liberalized approach to spectrum management are tightly linked to what is commonly termed Cognitive Radio (CR). Furthermore, conventional deployments of 4G wireless systems (one base station per cell, with mobiles deployed around it) are known to have problems in providing fairness (users close to the BS benefit more than cell-edge users) and in covering zones affected by shadowing, so the use of relays has been proposed as a solution. Software tools are normally used to evaluate and analyse the performance of 4G wireless systems. Such tools have matured in recent years, and their role in providing a high-level evaluation of proposed algorithms and protocols is now more important than ever. System-level simulation (SLS) tools provide a fundamental and flexible way to test all the envisioned algorithms and protocols under realistic conditions, without the need to deal with the problems of live networks or reduced-scope prototypes. Furthermore, they allow network designers to rapidly collect a wide range of performance metrics that are useful for the analysis and optimization of different algorithms. This dissertation presents the design and implementation of a conventional system-level simulator, which is then enhanced for the 4G wireless technologies of cognitive radio (IEEE 802.22) and relays (IEEE 802.16j). The SLS is then used for the analysis of the proposed algorithms and protocols.
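As a minimal illustration of what a system-level simulator computes (all parameters below are invented, not the dissertation's), a single snapshot drops users uniformly in a cell, computes downlink SINR against interfering base stations, and maps SINR to spectral efficiency:

    import math, random

    random.seed(1)
    R = 500.0                    # cell radius in metres (assumed)
    P_TX, NOISE = 10.0, 1e-13    # transmit power (W) and noise power (W), assumed
    ALPHA = 3.5                  # path-loss exponent (assumed)

    def rx_power(d):
        return P_TX * d ** -ALPHA        # simple distance-based path loss

    def snapshot(n_users=1000):
        se = []
        for _ in range(n_users):
            d_serving = max(1.0, R * math.sqrt(random.random()))  # uniform over the disc
            interf = sum(rx_power(random.uniform(R, 3 * R)) for _ in range(2))
            sinr = rx_power(d_serving) / (interf + NOISE)
            se.append(math.log2(1 + sinr))               # Shannon bound, bit/s/Hz
        return sum(se) / len(se)

    print(f"mean spectral efficiency = {snapshot():.2f} bit/s/Hz")

A real SLS adds fading, scheduling, mobility and protocol state on top of this loop, and in the cognitive-radio and relay variants the interference and serving-node terms change accordingly.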
Abstract:
The Localism Act 2011 created an opportunity for local communities to form neighbourhood forums and to prepare their own neighbourhood development plans in urban and rural areas in England. Initial reactions suggested that, rather than leading to the development of more housing, these initiatives would confirm all the stereotypes of local residents blocking unwanted development in their defined neighbourhoods. However, neighbourhood plans need to be in general conformity with the core strategies of higher-tier plans and often make provision for more new homes than planned before 2011. This article discusses the role and purpose of neighbourhood plans, the evidence base on which they are founded and some of the legal challenges which have helped clarify procedures. It then identifies two types of plan based on the ways housing strategies and evidence of need are reflected in a sample of 10 plans which have been made to date. It concludes that the voluntary nature of localism to date tends to favour more rural and affluent areas and ends with an assessment of the impact of neighbourhood plans on the planning process. It suggests that the implications for spatial planning may be far-reaching.
Abstract:
Collaborative networks are typically formed by heterogeneous and autonomous entities, and thus it is natural that each member has its own set of core-values. Since these values to a large extent drive the behaviour of the involved entities, the ability to quickly identify partners with compatible or common core-values represents an important element for the success of collaborative networks. However, tools to assess or measure the level of alignment of core-values are lacking. Since the concept of 'alignment' in this context is still ill-defined and multifaceted, three perspectives are discussed. The first uses a causal maps approach to capture, structure, and represent the influence relationships among core-values. This representation provides the basis to measure alignment in terms of the structural similarity and influence among value systems. The second perspective considers the compatibility and incompatibility among core-values in order to define the alignment level. Under this perspective we propose a fuzzy inference system to estimate the alignment level, since this approach allows dealing with variables that are vaguely defined and whose inter-relationships are difficult to specify. A further advantage of this method is the possibility of incorporating expert human judgment in the definition of the alignment level. The last perspective uses a Bayesian belief network method, selected in order to assess the alignment level based on members' past behaviour. An example application is presented in which the details of each method are discussed.
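As an illustration of the second, fuzzy-inference perspective (the rules and membership functions below are invented for the sketch, not taken from the paper), an alignment level can be estimated from vague compatibility scores in [0, 1]:

    def high(x): return max(0.0, min(1.0, x))   # membership in "high"
    def low(x):  return 1.0 - high(x)           # membership in "low"

    def alignment(compat, incompat):
        # Rule 1: compatibility high AND incompatibility low -> alignment high
        r_high = min(high(compat), low(incompat))
        # Rule 2: compatibility low OR incompatibility high -> alignment low
        r_low = max(low(compat), high(incompat))
        # Defuzzify: weighted average of the rule outputs (high=1.0, low=0.0)
        return r_high / (r_high + r_low) if (r_high + r_low) else 0.5

    print(alignment(0.8, 0.1))   # -> 0.8: mostly compatible partners

Expert judgment enters by reshaping the membership functions and adding rules, which is precisely what makes the fuzzy approach attractive for vaguely defined value systems.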
Abstract:
There is broad agreement today on the need for a textual dimension in the teaching of written language. The objective of our research is to test a pedagogical approach for teaching written comprehension/expression that draws on text typology and adopts a strategic approach. Given that Koreans learn French as a second foreign language after English, we conducted our research in a multilingual learning context (Korean, French and English). The research was carried out in Montreal. We selected twenty-one Korean learners aged 14 to 15 through interviews on their school backgrounds and language-learning experiences. All possessed a solid educational grounding in English, but their levels of French varied (i.e. seven beginners, seven intermediate and seven advanced). Our research is based on three experiments. In the first, we focus on the role of text typology with beginners, whose characteristics are representative of Korean learners who are grammatically and lexically weak in French. We mobilized textual knowledge through texts in English and then measured whether the participants could apply it to texts in French. We verified this transfer by comparing results on the perception of how written French works before and after the mobilization of textual knowledge. The empirical data reveal that Korean learners who have not yet mastered basic skills succeed in perceiving how written French works thanks to textual knowledge previously mobilized in English. In the second experiment, we examine the effect of teaching text typology on strategic reading in multilingual learning. We offered a strategic reading course using a text in French and examined the effect of this practice. By comparing comprehension results before and after the course, we verify that the strategic reading course is effective not only for the perception of how written language works, but also for the learning of grammar and vocabulary. We also verify cross-linguistic influence from French to English. In the third experiment, we examine the effect of teaching text typology on the written production process in French. We collected the participants' written work before and after the writing course and analysed it with the same coding grids for typological form and cultural meaning. We observe that writers who have the opportunity to explicitly mobilize their textual knowledge achieve higher performance in typological form as well as cultural meaning after the production process. We conclude that teaching based on text typology is fully relevant to multilingual learning and that the strategic approach can stimulate the use of text typology to grasp written language at the textual level in both reading and writing.
Abstract:
The distribution of toxic metals in sediment cores is an important area of research for environmental impact studies. Sediment cores were collected from two prominent regions (C1 and C2) of CE and subjected to geochemical analysis to determine the distribution of toxic metals (Cd, Co, Cr, Cu and Pb), texture characteristics, total organic carbon (TOC) and CHNS. Statistical analysis was carried out to understand the interrelationships between the components. In the studied cores, metal contamination was identified for Pb and Cu in C1 and for Cr in C2. The metal distribution depends on granulometric factors, geogenic mineral components and anthropogenic input. Correlation analysis (CA) and principal component analysis (PCA) also support these results.
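A minimal sketch of the multivariate step (with invented concentrations, not the study's data), using SVD-based principal component analysis on a depth-by-metal matrix:

    import numpy as np

    np.random.seed(0)
    metals = ["Cd", "Co", "Cr", "Cu", "Pb"]
    X = np.random.lognormal(mean=1.0, sigma=0.4, size=(20, 5))  # 20 core depths

    Xc = X - X.mean(axis=0)              # centre each metal's concentrations
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = S**2 / np.sum(S**2)      # variance explained per component

    print("PC1 loadings:", dict(zip(metals, np.round(Vt[0], 2))))
    print(f"PC1 explains {explained[0]:.0%} of variance")

Metals loading together on the same component suggest a common control (granulometric, geogenic or anthropogenic), which is how PCA supports the source attribution described above.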
Abstract:
The evolution of a coast through geological time depends on the transgression-regression events that follow rises or falls of sea level. These events are documented by investigating vertical sediment deposition patterns and their interrelationships for palaeo-environmental reconstruction. Different methods, such as sedimentological (grain size and micro-morphological) and geochemical (elemental relationship) analyses as well as radiocarbon dating, are generally used to decipher the sea-level changes and palaeoclimatic conditions of Quaternary sediment sequences. For the Indian coast, with a coastline length of about 7500 km, studies on geological and geomorphological signatures of Quaternary sea-level changes have been reported in general terms over the last two decades. However, for the southwest coast of India, particularly Kerala, which is famous for coastal landforms comprising estuaries, lagoons, backwaters, coastal plains, cliffs and barrier beaches, studies of marine transgression-regression events in the southern region are limited. The Neendakara-Kayamkulam coastal stretch in central Kerala, where the coast is flanked by the shore-parallel Kayamkulam Lagoon on one side and the shore-perpendicular Ashtamudi Estuary on the other, indicating an uplifted prograded coastal margin followed by barrier beaches, backwater channels, and ridge-and-runnel topography, is an ideal site for studying such events. Hence the present study was taken up to address this gap. The locations for the collection of core samples representing the coastal plain, estuary-lagoon and offshore regions were identified on the basis of published literature and available sedimentary records. The objectives of the research work are:
- to study the lithological variations and depositional environments of sediment cores along the coastal plain, estuary-lagoon and offshore regions between Kollam and Kayamkulam on the central Kerala coast;
- to study the transportation and diagenetic history of sediments in the area;
- to investigate the geochemical characterization of the sediments and to elucidate the source-sink relationship;
- to understand the marine transgression-regression events and to propose a conceptual model for the region.
The thesis comprises eight chapters. The first chapter sets out the rationale for and significance of the research, introducing the study area with details of its physiography, geology, geomorphology, rainfall and climate. A review of the literature, compiling research on aspects such as physico-chemical characteristics, geomorphology, tectonics and transgression-regression events, is presented in the second chapter, broadly classified as international, national and Kerala-specific. The field data collection and laboratory analyses adopted in the research are discussed in the third chapter. For the collection of sediment cores from the coastal plains, rotary drilling was employed, whereas for the estuary-lagoon and offshore locations the gravity/piston corer method was adopted. The subsurface samples were analysed for texture, surface micro-texture and elemental composition, by XRD, and by radiocarbon dating for age determination. The fourth chapter deals with the textural analysis of the core samples collected from the predefined locations of the study area.
The results reveal that the Ashtamudi Estuary is composed of silty clay to clayey sediments, whereas the offshore cores are carpeted with silty clay to relict sand. Investigation of the source of sediments deposited in the coastal plain on either side of the estuary indicates a terrigenous-to-marine origin in the southern region, whereas towards the north the sediments are predominantly of marine origin. Further, the hydrodynamic conditions and depositional environments of the sediment cores are elucidated from statistical parameters that decipher the deposition pattern at the various locations, viz. the coastal plain (open to closed basin), the Ashtamudi Estuary (partially open to restricted estuary to closed basin) and offshore (open channel). The intensity of clay minerals is also discussed. From the results of radiocarbon dating, the sediment depositional environments were deciphered. The results of the micro-textural study of sediment samples (quartz grains) using the Scanning Electron Microscope (SEM) are presented in the fifth chapter. These results throw light on the transport processes and diagenetic history of the detrital sediments. Based on the lithological variations, selected quartz grains from different environments were also analysed. The study indicates that the southern coastal-plain sediments were transported and deposited mechanically under a fluvial environment, followed by diagenesis under prolonged marine incursion. In the case of the northern coastal plain, the sediments were transported and deposited under a littoral environment, indicating the dominance of marine incursion through mechanical as well as chemical processes. The quartz grains of the Ashtamudi Estuary indicate a fluvial origin. The surface-texture features of the offshore sediments suggest that the quartz grains are of littoral origin and represent relict beach deposits. The geochemical characterization of the sediment cores, covering geochemical classification, sediment maturity, palaeo-weathering and provenance in the different environments, is discussed in the sixth chapter. In the seventh chapter the integration of multiproxy data along with the radiocarbon dates is presented, and the evolution and depositional history based on transgression-regression events is deciphered. The eighth chapter summarizes the major findings and conclusions of the study, with recommendations for future work.
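The grain-size "statistical parameters" invoked here are conventionally the Folk and Ward graphic measures (the abstract does not list them, so this is the standard formulation), computed from percentiles of the grain-size distribution in phi units:

    M_z = \frac{\phi_{16} + \phi_{50} + \phi_{84}}{3}, \qquad
    \sigma_I = \frac{\phi_{84} - \phi_{16}}{4} + \frac{\phi_{95} - \phi_{5}}{6.6}

where M_z is the graphic mean grain size and σ_I the inclusive graphic standard deviation (sorting); together with the analogous skewness and kurtosis, these measures are what allow fluvial, estuarine and littoral deposition patterns to be discriminated.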
Abstract:
To reduce costs, many companies outsource services that are not part of their core competence to external service providers, a process known as outsourcing. The resulting dependencies on external providers are contractually regulated by means of Service Level Agreements (SLAs). The task of Service Level Management (SLM) is to monitor and ensure compliance with the contractually fixed quality-of-service parameters. Automated processing therefore requires a formal specification of SLAs. Since the market has produced a multitude of different SLM tools, practical problems arise from proprietary SLA formats and missing specification methods, resulting in tool dependency and limited reusability of already-specified SLAs. This thesis develops an approach for platform-independent service level management. The goal is to unify modelling so that different management approaches can be integrated and a separation between the problem domain and the technology domain is achieved. Platform independence also gives the resulting models high temporal stability. Further goals are to guarantee the reusability of modelled SLAs and to provide a process-oriented modelling methodology. Automated deployment of modelled SLAs is of decisive relevance for practical use. To achieve these goals, the principles of Model Driven Architecture (MDA) are applied to the problem domain of service level management. The central idea of the thesis is the definition of SLA patterns, which are configuration-independent abstractions of service level agreements. These SLA patterns correspond to the Platform Independent Model (PIM) of MDA. A suitable model transformation generates from an SLA pattern an SLA instance that contains all necessary configuration information and is already in the format of the target platform. An SLA instance thus corresponds to the Platform Specific Model (PSM) of MDA. The deployment of the SLA instances and the resulting configuration of the management system corresponds to the Platform Specific Code (PSC) of MDA. After this step, the management system is able to autonomously monitor the quality-of-service parameters agreed in the SLA. As part of this work, a UML extension was defined that enables the modelling of SLA patterns with a UML tool; the modelling can be done purely graphically or by incorporating the Object Constraint Language (OCL). For the practical realization of the approach, a management architecture was developed and implemented as a prototype. The overall approach was evaluated in a case study.
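As a toy sketch of the central PIM-to-PSM idea (names and target format below are invented, not the thesis's prototype), an SLA pattern leaves configuration values open, and a model transformation binds them while rendering the target platform's format:

    # SLA pattern: a configuration-independent abstraction (the PIM).
    AVAILABILITY_PATTERN = {
        "metric": "availability",
        "unit": "percent",
        "threshold": None,   # deliberately left open by the pattern
        "window": None,
    }

    def transform(pattern, *, threshold, window, platform="toy-xml"):
        # Model transformation: bind configuration values and render the
        # platform-specific SLA instance (the PSM).
        inst = dict(pattern, threshold=threshold, window=window)
        if platform == "toy-xml":   # invented target format for illustration
            unit = "%" if inst["unit"] == "percent" else ""
            return (f'<sla metric="{inst["metric"]}" '
                    f'threshold="{inst["threshold"]}{unit}" '
                    f'window="{inst["window"]}"/>')
        raise ValueError(f"unknown platform: {platform}")

    print(transform(AVAILABILITY_PATTERN, threshold=99.9, window="monthly"))
    # -> <sla metric="availability" threshold="99.9%" window="monthly"/>

Adding a second branch per monitoring platform is what makes the pattern reusable across tools: the pattern is written once, and only the rendering step is platform-specific.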
Abstract:
Introduction: This article presents an alternative intervention for the prevention and control of back pain among the workers of a plant producing geotextiles for construction, who are exposed to manual handling and awkward postures, through the implementation of a Back School based on the CORE technique. This technique is understood as training of the stabilizing musculature of the spine; its benefit is to give the muscular complex of the back stability, to prevent musculoskeletal injuries and to improve posture. Objective: To present the results of implementing the Back School using the CORE technique for the prevention of back pain in a population of forty-eight male workers. Materials and methods: The Back School began with awareness talks by the occupational health physician, explaining its objectives and benefits to all participants. Once this activity was completed, all plant employees were evaluated to establish their health status using the PAR-Q questionnaire, surveyed on their perception of pain using the visual analogue scale (VAS), and assessed for spinal stability using the CORE assessment, in order to determine the training plan. Re-evaluations were then carried out every six months, together with a participant perception survey to identify the impact of the Back School on the two variables of interest (pain perception and spinal stability). Results: According to the VAS, the number of asymptomatic workers increased by 12%; in the satisfaction survey, 94% of the population reported that this technique decreased muscle fatigue at the lumbar level, and 96% reported an improvement in the performance of their work activities. Discussion: Analysis of the results indicates that Back School practice using the CORE technique contributes to the prevention and/or control of symptoms at the lumbar level in workers of the productive sector exposed to risks derived from physical load, provided that its development is continuous and supervised by a competent professional.
Abstract:
The simulated annealing approach to structure solution from powder diffraction data, as implemented in the DASH program, is easily amenable to parallelization at the level of individual runs. Modest increases in execution speed can therefore be achieved by executing individual DASH runs on the individual cores of multi-core CPUs.
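A minimal Python sketch of this run-level parallelism (the cost function is a toy stand-in, not DASH's structure-solution figure of merit): simulated-annealing runs are independent, so they map one-to-one onto CPU cores and the best solution is kept:

    from multiprocessing import Pool
    import math, random

    def cost(x):
        return (x - 1.3) ** 2 + 0.1 * math.sin(25 * x)   # toy landscape

    def sa_run(seed, steps=20_000, t0=1.0):
        # One independent simulated-annealing run with its own RNG seed.
        rng = random.Random(seed)
        x = best = rng.uniform(-5, 5)
        for k in range(steps):
            t = t0 * (1 - k / steps) + 1e-9              # linear cooling
            cand = x + rng.gauss(0, 0.5)
            # Metropolis acceptance: always accept improvements,
            # accept worsenings with probability exp(delta / t).
            if rng.random() < math.exp(min(0.0, (cost(x) - cost(cand)) / t)):
                x = cand
                best = min(best, x, key=cost)
        return cost(best), best

    if __name__ == "__main__":
        with Pool() as pool:                 # one independent run per core
            results = pool.map(sa_run, range(8))
        print("best of 8 runs:", min(results))

Because the runs never communicate, the speed-up is limited only by the number of cores and the usual per-run overheads, which matches the "modest increases" the abstract reports.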
Abstract:
A high-resolution record of sea-level change spanning the past 1000 years is derived from foraminiferal and chronological analyses of a 2 m thick salt-marsh peat sequence at Chezzetcook, Nova Scotia, Canada. Former mean tide levels are reconstructed with a precision of +/- 0.055 m using a transfer function derived from distributions of modern salt-marsh foraminifera. Our age model for the core section older than 300 years is based on 19 AMS C-14 ages and takes into account the individual probability distributions of calibrated radiocarbon ages. The past 300 years is dated by pollen and the isotopes Pb-206, Pb-207, Pb-210, Cs-137 and Am-241. Between AD 1000 and AD 1800, relative sea level rose at a mean rate of 17 cm per century. Apparent pre-industrial rises of sea level dated at AD 1500-1550 and AD 1700-1800 cannot be clearly distinguished when radiocarbon age errors are taken into account; furthermore, they may be an artefact of fluctuations in atmospheric C-14 production. In the 19th century sea level rose at a mean rate of 1.6 mm/yr. Between AD 1900 and AD 1920, sea-level rise accelerated to the modern mean rate of 3.2 mm/yr. This acceleration corresponds in time with global temperature rise and may therefore be associated with recent global warming.
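For comparison across the quoted units, the pre-industrial rate converts directly:

    \frac{17\ \mathrm{cm}}{100\ \mathrm{yr}} = \frac{170\ \mathrm{mm}}{100\ \mathrm{yr}} = 1.7\ \mathrm{mm/yr},

so the 19th-century mean of 1.6 mm/yr is close to the long-term pre-industrial rate, while the post-1920 rate of 3.2 mm/yr is roughly double it.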