971 results for layered medium theory


Relevance:

30.00%

Publisher:

Abstract:

The E01-011 experiment at Jefferson Laboratory (JLab) studied light-to-medium mass Λ hypernuclei via the ^A Z + e → [special characters omitted] + e' + K+ electroproduction reaction. Precise measurement of hypernuclear ground state masses and excitation energies provides information about the nature of hyperon-nucleon interactions. Until recently, hypernuclei were studied at accelerator facilities with intense π+ and K- meson beams. The poor quality of these beams limited the resolution of the hypernuclear excitation energy spectra to about 1.5 MeV (FWHM). This resolution is not sufficient for resolving the rich structure observed in the excitation spectra. By using a high-quality electron beam and employing a new high-resolution spectrometer system, this study aims to improve the resolution to a few hundred keV with an absolute precision of about 100 keV for excitation energies. In this work, the high-resolution excitation spectra of [special characters omitted], and [special characters omitted] hypernuclei are presented. In an attempt to emphasize the presence of the core-excited states, we introduced a novel likelihood approach to particle identification (PID) to serve as an alternative to the commonly used standard hard-cut PID. The new method produced missing mass spectra nearly identical to those obtained with the standard approach. An energy resolution of approximately 400–500 keV (FWHM) has been achieved, an unprecedented value in hypernuclear reaction spectroscopy. For [special characters omitted], the core-excited configuration has been clearly observed with significant statistics. The embedded Λ hyperon increases the excitation energies of the 11B nuclear core by 0.5–1 MeV. The [special characters omitted] spectrum has been observed with significant statistics for the first time. The ground state is bound deeper by roughly 400 keV than currently predicted by theory. An indication of the core-excited doublet, which is unbound in the core itself, is observed. The measurement of [special characters omitted] provides the first study of a d-shell hypernucleus with sub-MeV resolution. Discrepancies of up to 2 MeV between measured and theoretically predicted binding energies are found. Similar disagreement exists when comparing to the [special characters omitted] mirror hypernucleus. Also, the core-excited structure observed between the major s-, p- and d-shell Λ orbits is not consistent with the available theoretical calculations. In conclusion, the discrepancies found in this study will provide valuable input for the further development of theoretical models.
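As context for the likelihood-based PID mentioned above, the sketch below contrasts it with hard-cut PID in schematic form. It is not a reproduction of the E01-011 analysis: the detector variables (a time-of-flight mass-squared and a Cherenkov photoelectron yield), the Gaussian response parameters, and the cut values are all hypothetical.

```python
from scipy.stats import norm

# Hypothetical per-hypothesis response models: (mean, width) for a
# time-of-flight mass-squared (GeV^2) and a Cherenkov photoelectron yield.
# Values are illustrative only, not experimental calibrations.
RESPONSE = {
    "kaon":   {"m2": (0.244, 0.030), "npe": (0.5, 0.7)},
    "pion":   {"m2": (0.019, 0.030), "npe": (8.0, 2.5)},
    "proton": {"m2": (0.880, 0.040), "npe": (0.2, 0.5)},
}

def hard_cut_is_kaon(m2, npe):
    """Standard hard-cut PID: independent rectangular cuts on each variable."""
    return (0.15 < m2 < 0.35) and (npe < 2.0)

def likelihood_is_kaon(m2, npe, threshold=0.9):
    """Likelihood PID: combine all variables into per-hypothesis likelihoods
    and keep the track only if the kaon hypothesis dominates."""
    like = {h: norm.pdf(m2, *p["m2"]) * norm.pdf(npe, *p["npe"])
            for h, p in RESPONSE.items()}
    total = sum(like.values())
    return total > 0 and like["kaon"] / total > threshold

# A candidate near the kaon band passes both selections in this toy example.
print(hard_cut_is_kaon(0.26, 1.1), likelihood_is_kaon(0.26, 1.1))
```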

Relevance:

30.00%

Publisher:

Abstract:

Several different mechanisms leading to the formation of (substituted) naphthalene and azanaphthalenes were examined using theoretical quantum chemical calculations. As a result, a series of novel synthetic routes to Polycyclic Aromatic Hydrocarbons (PAHs) and Nitrogen-Containing Polycyclic Aromatic Compounds (N-PACs) have been proposed. On Earth, these aromatic compounds originate from incomplete combustion and are released into our environment, where they are known to be major pollutants, often with carcinogenic properties. In the atmosphere of Saturn's moon Titan, these PAHs and N-PACs are believed to play a critical role in organic haze formation, as well as acting as chemical precursors to biologically relevant molecules. The theoretical calculations were performed by employing the ab initio G3(MP2,CC)/B3LYP/6-311G** method to effectively probe the Potential Energy Surfaces (PES) relevant to PAH and N-PAC formation. Following the construction of the PES, Rice-Ramsperger-Kassel-Marcus (RRKM) theory was used to evaluate all unimolecular rate constants as a function of collision energy under single-collision conditions. Branching ratios were then evaluated by solving phenomenological rate expressions for the various product concentrations. The most viable pathways to PAH and N-PAC formation were found to be those where the initial attack by the ethynyl (C2H) or cyano (CN) radical toward an unsaturated hydrocarbon molecule led to the formation of an intermediate which could not effectively lose a hydrogen atom. It is not until ring cyclization has occurred that hydrogen elimination leads to a closed-shell product. By quenching the possibility of the initial hydrogen atom elimination, one of the most competitive processes preventing PAH or N-PAC formation was avoided, and the PAH or N-PAC formation was allowed to proceed. It is concluded that these considerations should be taken into account when attempting to explore any other potential routes towards aromatic compounds in cold environments, such as on Titan or in the interstellar medium.
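For reference, the microcanonical rate constants referred to above are conventionally computed from the standard RRKM expression; the form below is the textbook version, supplied as background rather than a quantity taken from this abstract.

```latex
k(E) = \frac{\sigma\, N^{\ddagger}(E - E_0)}{h\, \rho(E)}
```

Here N^{\ddagger}(E - E_0) is the sum of states of the transition state above the barrier E_0, \rho(E) is the density of states of the reacting intermediate, \sigma is the reaction path degeneracy, and h is Planck's constant. Branching ratios then follow from solving the coupled first-order rate equations for the intermediate and product populations.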

Relevance:

30.00%

Publisher:

Abstract:

This research is funded by UK Medical Research Council grant number MR/L011115/1. We would like to thank the 105 experts in behaviour change who have committed their time and offered their expertise for study 2 of this research. We are also very grateful to all those who sent us peer-reviewed behaviour change intervention descriptions for study 1. Finally, we would like to thank Dr. Emma Beard and Dr. Dan Dediu for their statistical input, and all the researchers, particularly Holly Walton, who have assisted in the coding of papers for study 1.

Relevance:

30.00%

Publisher:

Abstract:

Funding for this study was received from the Chief Scientist Office for Scotland. We would like to thank Asthma UK and Asthma UK Scotland for facilitating the advertisement of the study pilot and consultative user group. Thanks to Dr Mark Grindle for his helpful discussions concerning narrative. Thanks also to Mr Mark Haldane who designed the characters, backgrounds, and user interface used within the 3D computer animation. Particular thanks to the participants of the consultative user group for their enthusiasm, comments, and suggestions at all stages of the intervention design.

Relevance:

30.00%

Publisher:

Abstract:

As the world population continues to grow past seven billion people and global challenges persist, including resource availability, biodiversity loss, climate change and human well-being, a new science is required that can address the integrated nature of these challenges and the multiple scales on which they are manifest. Sustainability science has emerged to fill this role. In the fifteen years since it was first called for in the pages of Science, it has rapidly matured; however, its place in the history of science and the way it is practiced today must be continually evaluated. In Part I, two chapters address this theoretical and practical grounding. Part II transitions to the applied practice of sustainability science in addressing the urban heat island (UHI) challenge, wherein urban areas are warmer than their surrounding rural environs. The UHI has become increasingly important within the study of earth sciences given the increased focus on climate change and because the majority of humans now live in urban areas.

In Chapter 2, a novel contribution to the historical context of sustainability is argued. Sustainability, as a concept characterizing the relationship between humans and nature, emerged in the mid-to-late 20th century as a response to the same findings used to characterize the Anthropocene. Evidence is provided that sustainability emerged from the human-nature relationships that came before it, was enabled by technology and a reorientation of world-view, and is unique in its global boundary, systematic approach, and ambition for both well-being and the continued availability of resources and Earth system function. Sustainability is also an ambition with wide appeal, making it one of the first normative concepts of the Anthropocene.

Despite its widespread emergence and adoption, sustainability science continues to suffer from definitional ambiguity within academia. In Chapter 3, a review of efforts to provide direction and structure to the science reveals a continuum of approaches anchored at either end by differing visions of how the science interfaces with practice (solutions). At one end, basic science of societally defined problems informs decisions about possible solutions and their application. At the other end, applied research directly affects the options available to decision makers. While this dichotomy is clear in the literature, survey data suggest it is less apparent in the minds of practitioners.

In Chapter 4, the UHI is first addressed at the synoptic mesoscale. Urban climate is the most immediate manifestation of the warming global climate for the majority of people on Earth. Nearly half of those people live in small to medium-sized cities, an understudied scale in urban climate research. Widespread characterization would be useful to decision makers in planning and design. Using a multi-method approach, the mesoscale UHI in the study region is characterized and the secular trend over the last sixty years evaluated. Under isolated, ideal conditions the findings indicate a UHI of 5.3 ± 0.97 °C in the study area, the magnitude of which is growing over time.

Although urban heat islands (UHI) are well studied, there remain no panaceas for local scale mitigation and adaptation methods; therefore, continued attention to characterization of the phenomenon in urban centers of different scales around the globe is required. In Chapter 5, a local scale analysis of the canopy layer and surface UHI in a medium-sized city in North Carolina, USA is conducted using multiple methods including stationary urban sensors, mobile transects and remote sensing. Focusing on the ideal conditions for UHI development during an anticyclonic summer heat event, the study observes a range of UHI intensities depending on the method of observation: 8.7 °C from the stationary urban sensors; 6.9 °C from mobile transects; and 2.2 °C from remote sensing. Additional attention is paid to the diurnal dynamics of the UHI and its correlation with vegetation indices, dewpoint and albedo. Evapotranspiration is shown to drive UHI dynamics in the study region.
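As an illustration of how the canopy-layer intensities above are conventionally derived (urban minus rural air temperature from paired stations), a minimal sketch follows. The file, station, and column names are hypothetical, and the reduction shown is only one plausible approach, not the study's actual pipeline.

```python
import pandas as pd

# Hypothetical hourly air temperatures (deg C) from one urban core sensor and
# one rural reference station; file and column names are assumed for the example.
obs = pd.read_csv("station_temps.csv", parse_dates=["time"])
hourly = obs.pivot_table(index="time", columns="station", values="temp_c")

# Canopy-layer UHI intensity: urban minus rural air temperature.
uhi = hourly["urban_core"] - hourly["rural_reference"]

print(uhi.groupby(uhi.index.hour).mean())        # mean intensity by hour of day
print(uhi.between_time("22:00", "05:00").max())  # peak nocturnal intensity
```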

Finally, recognizing that a bridge must be established between the physical science community studying the Urban Heat Island (UHI) effect, and the planning community and decision makers implementing urban form and development policies, Chapter 6 evaluates multiple urban form characterization methods. Methods evaluated include local climate zones (LCZ), national land cover database (NLCD) classes and urban cluster analysis (UCA), to determine their utility in describing the distribution of the UHI based on three standard observation types: 1) fixed urban temperature sensors, 2) mobile transects and 3) remote sensing. Bivariate, regression and ANOVA tests are used to conduct the analyses. Findings indicate that the NLCD classes are best correlated to the UHI intensity and distribution in the study area. Further, while the UCA method is not useful directly, the variables included in the method are predictive based on regression analysis, so the potential for better model design exists. Land cover variables including albedo, impervious surface fraction and pervious surface fraction are found to dominate the distribution of the UHI in the study area regardless of observation method.
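A regression of the kind described in Chapter 6 could be sketched as below: UHI intensity at each fixed sensor regressed on land cover variables. The data file and variable names are hypothetical; this is only a schematic of the analysis type (ordinary least squares with statsmodels), not the study's actual model.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-site table: mean UHI intensity (deg C) plus land cover
# descriptors around each fixed temperature sensor.
df = pd.read_csv("site_landcover.csv")

X = sm.add_constant(df[["albedo", "impervious_frac", "pervious_frac"]])
model = sm.OLS(df["uhi_intensity"], X).fit()

# Coefficients indicate which land cover variables dominate the UHI distribution.
print(model.summary())
```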

Chapter 7 provides a summary of findings and offers a brief analysis of their implications for both the scientific discourse generally, and the study area specifically. In general, the work undertaken does not achieve the full ambition of sustainability science; additional work is required to translate findings to practice and more fully evaluate adoption. The implications for planning and development in the local region are addressed in the context of a major light-rail infrastructure project, including several systems-level considerations such as human health and development. Finally, several avenues for future work are outlined. Within the theoretical development of sustainability science, these pathways include more robust evaluations of the theoretical and actual practice. Within the UHI context, these include development of an integrated urban form characterization model, application of the study methodology in other geographic areas and at different scales, and use of novel experimental methods including distributed sensor networks and citizen science.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is to explore the role of Quality Management (QM) theory and practice using a contingency theory perspective. The study is grounded in the role of QM in improving strategic alignment within Small and Medium-Sized Enterprises (SMEs) using Contingency Theory rather than adopting best practice approaches. An inductive theory-building research methodology was used, involving multiple case analyses of five SMEs with repeat interviews (n=45), focus groups (n=5) and document analysis. From the findings, Contingency Variables (strategy, culture, lifecycle and customer focus) and their respective typologies were found to interact with QM practices in helping to shape strategic alignment between the SMEs and their environments. This shaping process based on contingency approaches occurred in a manner unique to each SME and their respective environments rather than in an overarching best practice manner.

Relevance:

30.00%

Publisher:

Abstract:

Purpose of this paper:
Recent literature indicates that around one third of perishable products end up as waste (Mena et al., 2014): 60% of this waste can be classified as avoidable (EC, 2010), suggesting logistics and operational inefficiencies along the supply chain. In developed countries, perishable products are predominantly wasted in wholesale and retail (Gustavsson et al., 2011) due to customer demand uncertainty and to errors and delays in the supply chain (Fernie and Sparks, 2014). While research on the logistics of large retail supply chains is well documented, research on retail small and medium enterprises’ (SMEs) capabilities to prevent and manage waste of perishable products is in its infancy (cf. Ellegaard, 2008) and needs further exploration. In our study, we investigate the retail logistics practice of small food retailers, the factors that contribute to perishable product waste, and the barriers and opportunities for SMEs in retail logistics to preserve product quality and participate in reverse logistics flows.

Design/methodology/approach:
As research on waste of perishable products for SMEs is scattered, we first focus on identifying key variables that contribute to the creation of avoidable waste. Second, we identify patterns of waste creation at the retail level and the possibilities for value added recovery. We use explorative case studies (Eisenhardt, 1989) and compare four SMEs and one large retailer that operate in a developed market. To get insights into the specificities of SMEs that affect retail logistics practice, we select two types of food retailers: specialised (e.g. greengrocers and bakers) and general (e.g. convenience stores that sell perishable products as part of the assortment).

Findings:
Our preliminary findings indicate that there is a difference between large retailers and SME retailers in the factors that contribute to waste creation, as well as in the opportunities for value added recovery of products. While more factors appear to affect waste creation and management at large retailers, a small number of specific factors appear to affect SMEs. Similarly, large retailers utilise a range of practices to reduce risks of product perishability and short shelf life, manage demand, and manage reverse logistics practices. Retail SMEs, on the other hand, have limited options to address waste creation and value added recovery. However, our findings show that specialist SMEs could successfully minimize waste and even create possibilities for value added recovery of perishable products. Data indicate that the business orientation of the SME, the buyer-supplier relationship, and the extent of adoption of lean principles in retail, coupled with SME resources, product-specific regulations and support from local authorities for waste management or partnerships with other organizations, determine the extent of successful preservation of product quality and value added recovery.

Value:
Our contribution to the SCM academic literature is threefold: first, we identify major factors that contribute to the generation of perishable product waste in the retail environment; second, we identify possibilities for value added recovery for perishable products; and third, we present opportunities and challenges for SME retailers to manage or participate in activities of value added recovery. Our findings contribute to theory by filling a gap in the literature that considers product quality preservation and value added recovery in the context of retail logistics and SMEs.

Research limitations/implications:
Our findings are limited to insights from five case studies of retail companies that operate within a developed market. To improve generalisability, we intend to increase the number of cases and include data obtained from suppliers and organizations involved in reverse logistics flows (e.g. local authorities, charities, etc.).

Practical implications:
With this paper, we contribute to the improvement of retail logistics and operations in SMEs, which constitute over 99% of business activities in the UK (Rhodes, 2015). Our findings will help retail managers and owners to better understand the possibilities for value added recovery, investigate a range of logistics and retail strategies suitable for the specificities of the SME environment and, ultimately, improve their profitability and sustainability.

Relevance:

30.00%

Publisher:

Abstract:

This paper aims to crystallize recent research performed at the University of Worcester to investigate the feasibility of using the commercial game engine ‘Unreal Tournament 2004’ (UT2004) to produce ‘Educational Immersive Environments’ (EIEs) suitable for education and training. Our research has been supported by the UK Higher Education Academy. We discuss both practical and theoretical aspects of EIEs. The practical aspects include the production of EIEs to support high school physics education, the education of architects, and the learning of literacy by primary school children. This research is based on the development of our novel instructional medium, ‘UnrealPowerPoint’. Our fundamental guiding principles are that, first, pedagogy must inform technology, and second, that both teachers and pupils should be empowered to produce educational materials. Our work is informed by current educational theories such as constructivism, experiential learning and socio-cultural approaches as well as elements of instructional design and game principles.

Relevance:

30.00%

Publisher:

Abstract:

Observations of H3+ in the Galactic diffuse interstellar medium (ISM) have led to various surprising results, including the conclusion that the cosmic-ray ionization rate (ζ2) is about one order of magnitude larger than previously thought. The present survey expands the sample of diffuse cloud sight lines with H3+ observations to 50, with detections in 21 of those. Ionization rates inferred from these detections are in the range (1.7 ± 1.0) × 10^-16 s^-1 < ζ2 < (10.6 ± 6.8) × 10^-16 s^-1, with a mean value of ζ2 = (3.3 ± 0.4) × 10^-16 s^-1. Upper limits (3σ) derived from non-detections of H3+ are as low as ζ2 < 0.4 × 10^-16 s^-1. These low upper limits, in combination with the wide range of inferred cosmic-ray ionization rates, indicate variations in ζ2 between different diffuse cloud sight lines. Calculations of the cosmic-ray ionization rate from theoretical cosmic-ray spectra require a large flux of low-energy (MeV) particles to reproduce values inferred from observations. Given the relatively short range of low-energy cosmic rays (those most efficient at ionization), the proximity of a cloud to a site of particle acceleration may set its ionization rate. Variations in ζ2 are thus likely due to variations in the cosmic-ray spectrum at low energies resulting from the effects of particle propagation. To test this theory, H3+ was observed in sight lines passing through diffuse molecular clouds known to be interacting with the supernova remnant IC 443, a probable site of particle acceleration. Where H3+ is detected, ionization rates of ζ2 = (20 ± 10) × 10^-16 s^-1 are inferred, higher than for any other diffuse cloud. These results support both the concept that supernova remnants act as particle accelerators and the hypothesis that propagation effects are responsible for causing spatial variations in the cosmic-ray spectrum and ionization rate. Future observations of H3+ near other supernova remnants and in sight lines where complementary ionization tracers (OH+, H2O+, H3O+) have been observed will further our understanding of the subject.
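For context, ionization rates of this kind are usually inferred from a simple steady-state argument: H3+ is produced by cosmic-ray ionization of H2 and destroyed mainly by dissociative recombination with electrons. A commonly used form of that relation is sketched below; it is supplied as background rather than quoted from this abstract.

```latex
\zeta_2\, n(\mathrm{H_2}) = k_e\, n(e)\, n(\mathrm{H_3^+})
\quad\Longrightarrow\quad
\zeta_2 \approx k_e\, x(e)\, n_{\mathrm{H}}\, \frac{N(\mathrm{H_3^+})}{N(\mathrm{H_2})}
```

Here k_e is the H3+ dissociative recombination rate coefficient, x(e) the electron fraction, n_H the hydrogen number density, and N(X) the observed column densities along the sight line.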

Relevance:

30.00%

Publisher:

Abstract:

In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based but human-understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the “tyranny” of classes. They attribute a number of problems in information modeling to inherent classification, or the disregard for the fact that instances can be conceptualized independently of any class assignment. By faceting instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins both the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotation, but are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling, through the layered information model, can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema and the horizons of conventional classification and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve and interoperate in a networked environment, knowledge organization specialists must consider a semantic class independence like Parsons and Wand propose for information modeling.
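As a rough illustration of the instance/class separation discussed above, the sketch below records instances purely as bundles of properties and layers classes on top as membership predicates, so an instance can be classified or reclassified without being created "inside" a class. The records and class definitions are invented for the example and are not drawn from Parsons and Wand's paper.

```python
# Instances exist independently of any class: each is just a bundle of properties.
instances = [
    {"id": 1, "title": "On the Origin of Species", "medium": "print", "subject": "biology"},
    {"id": 2, "title": "Interstellar Medium Lectures", "medium": "video", "subject": "astronomy"},
]

# Classes form a separate layer: each is a membership predicate over instances,
# so classification is applied after the fact and can evolve without touching the data.
classes = {
    "Book": lambda i: i.get("medium") == "print",
    "AstronomyResource": lambda i: i.get("subject") == "astronomy",
}

def members(class_name):
    """Classify on demand; an instance may belong to any number of classes."""
    return [i["id"] for i in instances if classes[class_name](i)]

print(members("Book"), members("AstronomyResource"))
```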

Relevance:

20.00%

Publisher:

Abstract:

Women with a disability continue to experience social oppression and domestic violence as a consequence of gender and disability dimensions. Current explanations of domestic violence and disability inadequately explain several features that lead women who have a disability to experience violent situations. This article incorporates both disability and material feminist theory as an alternative explanation to the dominant approaches (psychological and sociological traditions) of conceptualising domestic violence. This paper is informed by a study which was concerned with examining the nature and perceptions of violence against women with a physical impairment. The emerging analytical framework integrating material feminist interpretations and disability theory provided a basis for exploring gender and disability dimensions. Insight was also provided by the women who identified as having a disability in the study and who explained domestic violence in terms of a gendered and disabling experience. The article argues that material feminist interpretations and disability theory, with their emphasis on gender relations, disablism and poverty, should be used as an alternative tool for exploring the nature and consequences of violence against women with a disability.