925 results for Four Level Frame Work


Relevance:

30.00%

Publisher:

Abstract:

Bifidobacterium bifidum NCIMB41171 carries four genes encoding different beta-galactosidases. One of them, named bbgIII, comprised an open reading frame encoding a protein of 1,935 amino acid (aa) residues with the multidomain structure commonly identified in cell-wall-bound enzymes, having a signal peptide, a membrane anchor, FIVAR domains, and immunoglobulin (Ig)-like and discoidin-like domains. The other three genes, termed bbgI, bbgII and bbgIV, encoded proteins of 1,291, 689 and 1,052 aa residues, respectively, which were most probably located intracellularly. Two cases of protein evolution between strains of the same species were identified when the aa sequences of BbgI and BbgIII were compared with homologous proteins from B. bifidum DSM20215. The homologous proteins differed in their C-terminal regions, owing to a single-nucleotide insertion and to the insertion of a whole DNA sequence, respectively. The bbgIV gene lay within a gene organisation flanked by divergently transcribed genes putatively involved in sugar transport (a galactoside symporter) and gene regulation (a LacI-type transcriptional regulator), a structure found to be highly conserved in B. longum, B. adolescentis and B. infantis, suggesting an optimal organisation shared amongst those species.
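
The C-terminal differentiation attributed to a single-nucleotide insertion follows from the reading-frame shift it causes. The toy Python sketch below, with invented placeholder sequences and a deliberately tiny codon table (not the B. bifidum data), illustrates the effect: translation is identical up to the insertion point and diverges downstream.

```python
# Illustrative only: shows how a single-nucleotide insertion shifts the reading
# frame and changes every codon downstream, so two otherwise homologous genes
# translate to proteins that differ only in their C-terminal part.
# The sequences below are invented placeholders, not B. bifidum data.

CODON_TABLE = {
    "ATG": "M", "GCT": "A", "GGT": "G", "AAA": "K", "CTG": "L",
    "GAA": "E", "TTC": "F", "TAA": "*", "ACT": "T", "GTT": "V",
}

def translate(dna: str) -> str:
    """Translate an in-frame DNA string codon by codon, stopping at '*'."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE.get(dna[i:i + 3], "X")  # 'X' = codon not in the toy table
        if aa == "*":
            break
        protein.append(aa)
    return "".join(protein)

reference = "ATGGCTGGTAAACTGGAATTCACTGTTTAA"
# Insert one nucleotide ('A') after the 12th base: everything downstream is reframed.
frameshifted = reference[:12] + "A" + reference[12:]

print(translate(reference))     # MAGKLEFTV
print(translate(frameshifted))  # identical N-terminus, divergent C-terminus
```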

Relevance:

30.00%

Publisher:

Abstract:

Background: Consistency of performance across tasks that assess syntactic comprehension in aphasia has clinical and theoretical relevance. In this paper we add to the relatively sparse previous work on how sentence comprehension abilities are influenced by the nature of the assessment task. Aims: Our aims are: (1) to compare linguistic performance across sentence-picture matching, enactment, and truth-value judgement tasks; (2) to investigate the impact of pictorial stimuli on syntactic comprehension. Methods & Procedures: We tested a group of 10 aphasic speakers (3 with fluent and 7 with non-fluent aphasia) in three tasks (Experiment 1): (i) sentence-picture matching with four pictures, (ii) sentence-picture matching with two pictures, and (iii) enactment. A further truth-value judgement task was given to a subgroup of those speakers (n=5, Experiment 2). Similar sentence types were used across all tasks, including canonical (actives, subject clefts) and non-canonical (passives, object clefts) sentences. We undertook two types of analyses: (a) we compared canonical and non-canonical sentences in each task; (b) we compared performance between (i) actives and passives and (ii) subject and object clefts in each task. We examined the results of all participants as a group and as a case series. Outcomes & Results: Several task effects emerged. Overall, the two-picture sentence-picture matching and enactment tasks were more discriminating than the four-picture condition. Group performance in the truth-value judgement task was similar to that in two-picture sentence-picture matching and enactment. At the individual level, performance across tasks contrasted with some of the group results. Conclusions: Our findings revealed task effects across participants. We discuss reasons that could explain the diverse profiles of performance and the implications for clinical practice.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we introduce a novel high-level visual content descriptor devised for performing semantic-based image classification and retrieval. The work can be viewed as an attempt to bridge the so-called "semantic gap". The proposed image feature vector model is fundamentally underpinned by an automatic image labelling framework, called Collaterally Cued Labelling (CCL), which combines collateral knowledge extracted from the texts accompanying the images with state-of-the-art low-level visual feature extraction techniques to automatically assign textual keywords to image regions. A subset of the Corel image collection was used for evaluating the proposed method. The experimental results indicate that our semantic-level visual content descriptors outperform both conventional visual and textual image feature models.
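
As one way to picture how a semantic-level descriptor can be used once regions have been labelled with keywords, the hedged Python sketch below represents an image as a normalised keyword-frequency vector and compares images by cosine similarity. The vocabulary, function names and labelling step are illustrative placeholders, not the CCL implementation.

```python
# A minimal sketch (not the authors' CCL implementation) of a semantic-level
# image descriptor: once each region has been auto-labelled with a keyword,
# the image is represented by a normalised keyword-frequency vector that can
# be compared with cosine similarity for retrieval.
from collections import Counter
import math

VOCABULARY = ["sky", "water", "grass", "building", "person", "animal"]  # placeholder vocabulary

def semantic_descriptor(region_labels: list[str]) -> list[float]:
    """Map per-region keywords to a normalised frequency vector over VOCABULARY."""
    counts = Counter(region_labels)
    total = sum(counts[w] for w in VOCABULARY) or 1
    return [counts[w] / total for w in VOCABULARY]

def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two descriptors (0 when orthogonal, 1 when identical)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(b * b for b in v)) or 1.0
    return dot / (nu * nv)

query = semantic_descriptor(["sky", "water", "water", "grass"])
candidate = semantic_descriptor(["sky", "water", "building"])
print(round(cosine(query, candidate), 3))
```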

Relevance:

30.00%

Publisher:

Abstract:

The ability of four operational weather forecast models [ECMWF, Action de Recherche Petite Echelle Grande Echelle model (ARPEGE), Regional Atmospheric Climate Model (RACMO), and Met Office] to generate a cloud at the right location and time (the cloud frequency of occurrence) is assessed in the present paper using a two-year time series of observations collected by profiling ground-based active remote sensors (cloud radar and lidar) located at three different sites in western Europe (Cabauw, Netherlands; Chilbolton, United Kingdom; and Palaiseau, France). Particular attention is given to potential biases that may arise from instrumentation differences (especially sensitivity) from one site to another and from intermittent sampling. In a second step, the statistical properties of the cloud variables involved in the most advanced cloud schemes of numerical weather forecast models (ice water content and cloud fraction) are characterized and compared with their counterparts in the models. The two years of observations are first considered as a whole in order to evaluate the accuracy of the statistical representation of the cloud variables in each model. It is shown that all models tend to produce too many high-level clouds, with too-high cloud fraction and ice water content. The midlevel and low-level cloud occurrence is also generally overestimated, with too-low cloud fraction but a correct ice water content. The dataset is then divided into seasons to evaluate the potential of the models to generate different cloud situations in response to different large-scale forcings. Strong variations in cloud occurrence are found in the observations from one season to the same season the following year, as well as in the seasonal cycle. Overall, the model biases observed using the whole dataset are still found at the seasonal scale, but the models generally manage to reproduce well the observed seasonal variations in cloud occurrence. Overall, the models do not generate the same cloud fraction distributions, and these distributions do not agree with the observations. Another general conclusion is that the use of continuous ground-based radar and lidar observations is definitely a powerful tool for evaluating model cloud schemes and for a responsive assessment of the benefit achieved by changing or tuning a model cloud scheme.
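
As an illustration of the basic quantity being compared, the sketch below (synthetic random data, not the paper's processing chain) computes the cloud frequency of occurrence per level from boolean cloud masks on a common time-height grid and the resulting model-minus-observation bias.

```python
# Illustrative sketch: given cloud masks on a common time x height grid, the
# cloud frequency of occurrence at each level is the fraction of time steps
# flagged cloudy, and the model bias is model frequency minus observed
# frequency. The data are random placeholders standing in for radar/lidar
# retrievals and model output.
import numpy as np

rng = np.random.default_rng(0)
n_time, n_levels = 24 * 60, 20              # hourly profiles over two months, 20 levels

obs_mask = rng.random((n_time, n_levels)) < 0.25    # True where a cloud is observed
model_mask = rng.random((n_time, n_levels)) < 0.30  # True where the model has cloud

obs_freq = obs_mask.mean(axis=0)            # frequency of occurrence per level (0..1)
model_freq = model_mask.mean(axis=0)
bias = model_freq - obs_freq                # > 0 means the model produces cloud too often

for k in range(n_levels):
    print(f"level {k:2d}: obs {obs_freq[k]:.2f}  model {model_freq[k]:.2f}  bias {bias[k]:+.2f}")
```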

Relevance:

30.00%

Publisher:

Abstract:

A theoretical framework is developed for the evolution of baroclinic waves with latent heat release parameterized in terms of vertical velocity. Both wave–conditional instability of the second kind (CISK) and large-scale rain approaches are included. The new quasigeostrophic framework covers evolution from general initial conditions on zonal flows with vertical shear, planetary vorticity gradient, a lower boundary, and a tropopause. The formulation is given completely in terms of potential vorticity, enabling the partition of perturbations into Rossby wave components, just as for the dry problem. Both modal and nonmodal development can be understood to a good approximation in terms of propagation and interaction between these components alone. The key change with moisture is that growing normal modes are described in terms of four counterpropagating Rossby wave (CRW) components rather than two. Moist CRWs exist above and below the maximum in latent heating, in addition to the upper- and lower-level CRWs of dry theory. Four classifications of baroclinic development are defined by quantifying the strength of interaction between the four components and identifying the dominant pairs, which range from essentially dry instability to instability in the limit of strong heating far from boundaries, with type-C cyclogenesis and diabatic Rossby waves being intermediate types. General initial conditions must also include passively advected residual PV, as in the dry problem.
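
To make the interaction picture concrete, a schematic and heavily simplified form of CRW amplitude and phase evolution equations is sketched below. This is not the paper's exact formulation: the interaction coefficients, phase speeds and sign conventions all depend on the chosen normalisation. In dry theory the indices run over the upper and lower components (i, j in {1, 2}); in the moist framework two further components, above and below the heating maximum, are added (i, j in {1, ..., 4}).

```latex
% Schematic CRW amplitude/phase equations (illustrative, convention-dependent)
\begin{align}
  \frac{\mathrm{d}a_i}{\mathrm{d}t} &= \sum_{j \neq i} \gamma_{ij}\, a_j \,\sin\!\left(\epsilon_j - \epsilon_i\right), \\
  \frac{\mathrm{d}\epsilon_i}{\mathrm{d}t} &= -k\,c_i + \sum_{j \neq i} \gamma_{ij}\, \frac{a_j}{a_i}\,\cos\!\left(\epsilon_j - \epsilon_i\right),
\end{align}
% a_i, \epsilon_i : amplitude and phase of the i-th CRW component
% c_i             : intrinsic phase speed (propagation) of component i
% \gamma_{ij}     : interaction coefficient between components i and j
% k               : zonal wavenumber
```

Growth of a normal mode then corresponds to the components phase-locking in a mutually amplifying configuration, and the classification of development types in the paper can be read as identifying which pairs of components dominate these interaction terms.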

Relevance:

30.00%

Publisher:

Abstract:

Classical measures of network connectivity are the number of disjoint paths between a pair of nodes and the size of a minimum cut. For standard graphs, these measures can be computed efficiently using network flow techniques. However, in the Internet at the level of autonomous systems (ASs), referred to as the AS-level Internet, routing policies impose restrictions on the paths that traffic can take in the network. These restrictions can be captured by the valley-free path model, which assumes a special directed graph model in which edge types represent relationships between ASs. We consider the adaptation of the classical connectivity measures to the valley-free path model, where it is NP-hard to compute them. Our first main contribution consists of presenting algorithms for the computation of disjoint paths and minimum cuts in the valley-free path model. These algorithms are useful for ASs that want to evaluate different options for selecting upstream providers to improve the robustness of their connection to the Internet. Our second main contribution is an experimental evaluation of our algorithms on four types of directed graph models of the AS-level Internet produced by different inference algorithms. Most importantly, the evaluation shows that our algorithms are able to compute optimal solutions to instances of realistic size of the connectivity problems in the valley-free path model in reasonable time. Furthermore, our experimental results provide information about the characteristics of the directed graph models of the AS-level Internet produced by different inference algorithms. It turns out that (i) we can quantify the difference between the undirected AS-level topology and the directed graph models with respect to fundamental connectivity measures, and (ii) the different inference algorithms yield topologies that are similar with respect to connectivity and different with respect to the types of paths that exist between pairs of ASs.
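
For readers unfamiliar with the path model, the short Python sketch below checks the valley-free property of a single path given its sequence of AS-relationship edge types: a valid path climbs customer-to-provider links first, may cross at most one peer-to-peer link, and then only descends provider-to-customer. The relationship labels and the example path are placeholders; the paper's connectivity algorithms operate under this kind of restriction rather than on unrestricted paths.

```python
# A minimal sketch, under assumed edge labels, of the valley-free rule:
# uphill (customer->provider) links first, at most one peer->peer link,
# then only downhill (provider->customer) links.
UP, PEER, DOWN = "customer->provider", "peer->peer", "provider->customer"

def is_valley_free(edge_types: list[str]) -> bool:
    """Check the valley-free property for a path given as a list of edge types."""
    phase = 0  # 0 = climbing, 1 = after the (single) peer edge, 2 = descending
    for t in edge_types:
        if t == UP:
            if phase != 0:          # cannot climb again after peering or descending
                return False
        elif t == PEER:
            if phase != 0:          # at most one peer edge, only at the top of the path
                return False
            phase = 1
        elif t == DOWN:
            phase = 2               # from now on only provider->customer is allowed
        else:
            raise ValueError(f"unknown edge type: {t}")
    return True

print(is_valley_free([UP, UP, PEER, DOWN, DOWN]))   # True
print(is_valley_free([UP, DOWN, UP]))               # False: the path contains a 'valley'
```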

Relevance:

30.00%

Publisher:

Abstract:

The work presented in this report is part of the effort to define the landscape state and diversity indicator within the framework of COM (2006) 508, "Development of agri-environmental indicators for monitoring the integration of environmental concerns into the common agricultural policy". The Communication classifies the indicators according to their level of development, which, for the landscape indicator, is "in need of substantial improvements in order to become fully operational". For this reason a full re-definition of the indicator has been carried out, following the initial proposal presented within the framework of the IRENA operation ("Indicator Reporting on the Integration of Environmental Concerns into Agricultural Policy"). The new proposal for the landscape state and diversity indicator is structured in three components: the first concerns the degree of naturalness, the second landscape structure, and the third the societal appreciation of the rural landscape. While the first two components rely on a substantial body of existing literature, the development of the methodology has made evident the need for further analysis of the third component, which is based on a newly proposed top-down approach. This report presents an in-depth analysis of this component of the indicator and the effort to include a social dimension in large-scale landscape assessment.

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the extent to which a structured undergraduate research intervention, UROP, permits undergraduate students early access to legitimate peripheral participation (LPP) in a research community of practice. Accounts of placement experiences suggest that UROP affords rich possibilities for engagement with research practice. Undergraduates tread a path of gaining access to mature practice while also building their own independence, participating in work that they see matters to the community, and making gains in the use of a shared research repertoire. Students place UROP experiences in a contrasting frame to the research exercises experienced during degree programmes; their sense of the authenticity of the research experienced through UROP emerges as a key element of these accounts. The data raise the interesting question of whether the degree of engagement with mature practice may account for more of the gain from UROP than simply the quantity of contact with other researchers.

Relevance:

30.00%

Publisher:

Abstract:

The United Nations Intergovernmental Panel on Climate Change (IPCC) makes it clear that climate change is due to human activities, and it recognises buildings as a distinct sector among the seven analysed in its 2007 Fourth Assessment Report. Global concerns have escalated regarding carbon emissions and sustainability in the built environment. The built environment is a human-made setting to accommodate human activities, including buildings and transport, and covers an interdisciplinary field addressing design, construction, operation and management. Specifically, sustainable buildings are expected to achieve high performance throughout the life cycle of siting, design, construction, operation, maintenance and demolition, in the following areas:
• energy and resource efficiency;
• cost effectiveness;
• minimisation of emissions that negatively impact global warming, indoor air quality and acid rain;
• minimisation of waste discharges; and
• maximisation of the fulfilment of occupants' health and wellbeing requirements.
Professionals in the built environment sector, for example urban planners, architects, building scientists, engineers, facilities managers, performance assessors and policy makers, will play a significant role in delivering a sustainable built environment. Delivering a sustainable built environment needs an integrated approach, and so it is essential for built environment professionals to have interdisciplinary knowledge in building design and management. Building and urban designers need a good understanding of the planning, design and management of buildings in terms of low carbon and energy efficiency. There are a limited number of traditional engineers who know how to design environmental (services) systems in great detail. Yet there is a very large market for technologists with multi-disciplinary skills who are able to identify the need for, envision and manage the deployment of a wide range of sustainable technologies, both passive (architectural) and active (engineering systems), and to select the appropriate approach. Employers seek applicants with skills in analysis, decision-making/assessment, computer simulation and project implementation. An integrated approach is expected in practice, which encourages built environment professionals to think 'out of the box' and learn to analyse real problems using the most relevant approach, irrespective of discipline.
The Design and Management of Sustainable Built Environment book aims to produce readers able to apply fundamental scientific research to solve real-world problems in the general area of sustainability in the built environment. The book contains twenty chapters covering climate change and sustainability, urban design and assessment (planning, travel systems, urban environment), urban management (drainage and waste), buildings (indoor environment, architectural design and renewable energy), simulation techniques (energy and airflow), management (end-user behaviour, facilities and information), assessment (materials and tools), procurement, and case studies (BRE Science Park).
Chapters one and two present general global issues of climate change and sustainability in the built environment. Chapter one illustrates that applying the concepts of sustainability to the urban environment (buildings, infrastructure, transport) raises some key issues for tackling climate change, resource depletion and energy supply.
Buildings, and the way we operate them, play a vital role in tackling global greenhouse gas emissions. Holistic thinking and an integrated approach to delivering a sustainable built environment are highlighted. Chapter two demonstrates the important role that buildings (their services and appliances) and building energy policies play in this area. Substantial investment is required to implement such policies, much of which will earn a good return.
Chapters three and four discuss urban planning and transport. Chapter three stresses the importance of using modelling techniques at an early stage for the strategic master-planning of a new development and a retrofit programme. A general framework for sustainable urban-scale master planning is introduced. This chapter also addresses the need to develop a more holistic and pragmatic view of how the built environment performs and, in particular, of how people plan, design and use it, in order to produce tools that help design for a higher level of sustainability. Chapter four discusses microcirculation, an emerging and challenging area relating to changing travel behaviour in the quest for urban sustainability. The chapter outlines the main drivers for travel behaviour and choices, the workings of the transport system and its interaction with urban land use. It also covers the new approach to managing urban traffic to maximise economic, social and environmental benefits.
Chapters five and six present topics related to urban microclimates, including thermal and acoustic issues. Chapter five discusses urban microclimates and the urban heat island, as well as the interrelationship of urban design (urban forms and textures) with energy consumption and urban thermal comfort. It introduces models that can be used to analyse microclimates as part of a careful and considered approach to planning sustainable cities. Chapter six discusses urban acoustics, focusing on urban noise evaluation and mitigation. Various prediction and simulation methods for sound propagation in micro-scale urban areas, as well as techniques for large-scale urban noise mapping, are presented.
Chapters seven and eight discuss urban drainage and waste management. The growing demand for housing and commercial developments in the 21st century, as well as the environmental pressure caused by climate change, has increased the focus on sustainable urban drainage systems (SUDS). Chapter seven discusses the SUDS concept, which is an integrated approach to surface water management. It takes into consideration quality, quantity and amenity aspects to provide a more pleasant habitat for people as well as to increase the biodiversity value of the local environment. Chapter eight discusses the main issues in urban waste management. It points out that population increases, land-use pressures, and technical and socio-economic influences have become inextricably interwoven, making it ever more challenging to ensure a safe means of dealing with humanity's waste.
Sustainable building design needs to consider healthy indoor environments, minimising energy for heating, cooling and lighting, and maximising the utilisation of renewable energy. Chapter nine considers how people respond to the physical environment and how that response is used in the design of indoor environments. It considers environmental components such as thermal, acoustic and visual conditions, air quality and vibration, and their interaction and integration.
Chapter ten introduces the concept of passive building design and its relevant strategies, including passive solar heating, shading, natural ventilation, daylighting and thermal mass, in order to minimise the heating and cooling load as well as energy consumption for artificial lighting. Chapter eleven discusses the growing importance of integrating Renewable Energy Technologies (RETs) into buildings, the range of technologies currently available and what to consider during the technology selection process in order to minimise carbon emissions from burning fossil fuels. The chapter draws to a close by highlighting the issues concerning system design, the need for careful integration and management of RETs once installed, and the need for home owners and operators to understand the characteristics of the technology in their building.
Computer simulation tools play a significant role in sustainable building design because, as modern built environment design (buildings and systems) becomes more complex, tools are required to assist in the design process. Chapter twelve gives an overview of the primary benefits and users of simulation programs and the role of simulation in the construction process, and examines the validity and interpretation of simulation results. Chapter thirteen focuses in particular on the Computational Fluid Dynamics (CFD) simulation method used for the optimisation and performance assessment of technologies and solutions for sustainable building design, and on its application through a series of case studies.
People and building performance are intimately linked. A better understanding of occupants' interaction with the indoor environment is essential to building energy and facilities management. Chapter fourteen focuses on the issue of occupant behaviour: principally its impact on building performance, and the influence of building performance on occupants. Chapter fifteen explores the discipline of facilities management and the contribution that this emerging profession makes to securing sustainable building performance. The chapter highlights a much greater diversity of opportunities in sustainable building design that extend well into the operational life. Chapter sixteen reviews the concepts of modelling information flows and the use of Building Information Modelling (BIM), describing these techniques and how these aspects of information management can help drive sustainability. An explanation is offered of why information management is the key to 'life-cycle' thinking in sustainable building and construction.
Measurement of building performance and sustainability is a key issue in delivering a sustainable built environment. Chapter seventeen identifies the means by which construction materials can be evaluated with respect to their sustainability. It identifies the key issues that affect the sustainability of construction materials and the methodologies commonly used to assess them. Chapter eighteen focuses on the topics of green building assessment, green building materials, and sustainable construction and operation. Commonly used assessment tools such as the BRE Environmental Assessment Method (BREEAM), Leadership in Energy and Environmental Design (LEED) and others are introduced. Chapter nineteen discusses sustainable procurement, one of the areas to have emerged naturally from the overall sustainable development agenda. It aims to ensure that the current use of resources does not compromise the ability of future generations to meet their own needs.
Chapter twenty is a best-practice exemplar: the BRE Innovation Park, which features a number of demonstration buildings that have been built to the UK Government's Code for Sustainable Homes. It showcases the very latest innovative methods of construction and cutting-edge technology for sustainable buildings.
In summary, the Design and Management of Sustainable Built Environment book is the result of the co-operation and dedication of the individual chapter authors. We hope readers benefit from gaining a broad interdisciplinary knowledge of design and management in the built environment in the context of sustainability. We believe that the knowledge and insights of our academic and professional colleagues from different institutions and disciplines illuminate a way of delivering a sustainable built environment through holistic, integrated design and management approaches. Last, but not least, I would like to take this opportunity to thank all the chapter authors for their contributions. I would also like to thank David Lim for his assistance with the editorial work and proofreading.

Relevance:

30.00%

Publisher:

Abstract:

English teachers in England have experienced a lengthy period of external constraint that has increasingly controlled their practice. This constraint originated in the 1989 National Curriculum. Although in its first version it was in harmony with practice, its numerous revisions have moved it a long way from teachers' own values and beliefs. This move is illustrated through research into the teaching of literature, which is seen by English teachers as often arid and driven by examinations alone. This period has been increasingly dominated by high-stakes testing, school league tables and frequent school inspections. Another powerful element has been the introduction of Standards for teachers at every career level, from student teachers to Advanced Skills Teachers. Research demonstrates that this introduction of Standards has had some beneficial effects. However, research also shows that the government's decision to replace all these hierarchically structured standards with a single standard is seen by many teachers as a retrograde step. Evidence from Advanced Skills Teachers of English shows that the government's additional proposal to bring in a Master Teacher standard is equally problematic. The decline of the National Association for the Teaching of English (NATE), the key subject association for English teachers, is discussed in relation to this increasingly negative and constraining environment, concluding that many English teachers are choosing a form of local resistance which, while understandable, weakens the credibility of the profession and erodes the influence of its key voice, NATE.

Relevance:

30.00%

Publisher:

Abstract:

This article looks at how four British-based poets born in the Caribbean exploit the rich language repertoire available to them in their work for children and young people. Following initial consideration of questions of definition and terminology, poetry collections by James Berry, John Agard, Grace Nichols and Valerie Bloom are discussed, with a focus on the interplay and creative tension between the different varieties of Caribbean creoles (“Bad Talk”) and standard English evident in their work. Variation both between the four poets’ usage and within each individual poet’s work is considered, and a trend over time towards the inclusion of fewer creole-influenced poems is noted. This and other issues, such as the labelling of the four poets’ work as ‘performance poetry’ and the nature of the poets’ contribution to British children’s literature, are considered in the conclusion.

Relevance:

30.00%

Publisher:

Abstract:

The UK Government's Department of Energy and Climate Change has been investigating the feasibility of developing a national energy efficiency data framework covering both domestic and non-domestic buildings. Working closely with the Energy Saving Trust and energy suppliers, the aim is to develop a data framework to monitor changes in energy efficiency, develop and evaluate programmes, and improve the information available to consumers. Key applications of the framework are to understand trends in built-stock energy use, identify drivers and evaluate the success of different policies. For energy suppliers, it could identify which energy uses are growing, in which sectors and why. This would help with market segmentation and the design of products. For building professionals, it could supplement energy audits and modelling of end-use consumption with real data and support the generation of accurate and comprehensive benchmarks. This paper critically examines the results of the first phase of work to construct a national energy efficiency data framework for the domestic sector, focusing on two specific issues: (a) drivers of domestic energy consumption in terms of the physical nature of the dwellings and the socio-economic characteristics of occupants, and (b) the impact of energy efficiency measures on energy consumption.

Relevance:

30.00%

Publisher:

Abstract:

Sub-seasonal variability, including equatorial waves, significantly influences the dehydration and transport processes in the tropical tropopause layer (TTL). This study investigates the wave activity in the TTL in 7 reanalysis data sets (RAs; NCEP1, NCEP2, ERA40, ERA-Interim, JRA25, MERRA, and CFSR) and 4 chemistry climate models (CCMs; CCSRNIES, CMAM, MRI, and WACCM) using the zonal wavenumber-frequency spectral analysis method with equatorially symmetric-antisymmetric decomposition. Analyses are made for temperature and horizontal winds at 100 hPa in the RAs and CCMs, and for outgoing longwave radiation (OLR), which is a proxy for the convective activity that generates tropopause-level disturbances, in satellite data and the CCMs. Particular focus is placed on equatorial Kelvin waves, mixed Rossby-gravity (MRG) waves, and the Madden-Julian Oscillation (MJO). The wave activity is defined as the variance, i.e., the power spectral density integrated over a particular zonal wavenumber-frequency region. It is found that the TTL wave activities show significant differences among the RAs, ranging from ∼0.7 (for NCEP1 and NCEP2) to ∼1.4 (for ERA-Interim, MERRA, and CFSR) with respect to the averages from the RAs. The TTL wave activities in the CCMs generally lie within the range of those in the RAs, with a few exceptions. However, the spectral features in OLR for all the CCMs are very different from those in the observations, and the OLR wave activities are too low for CCSRNIES, CMAM, and MRI. It is concluded that the broad range of wave activity found in the different RAs decreases our confidence in their validity, and in particular in their value for validation of CCM performance in the TTL, thereby limiting our quantitative understanding of the dehydration and transport processes in the TTL.
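
To make the wave-activity definition concrete, the sketch below (synthetic data, illustrative box limits) applies a two-dimensional FFT to a longitude-time field, forms the power spectral density, and integrates it over a chosen zonal wavenumber-frequency box. The study's full method additionally involves the equatorially symmetric-antisymmetric decomposition and careful spectral preprocessing (detrending, tapering, background removal), which are omitted here.

```python
# A minimal sketch (not the study's full method) of zonal wavenumber-frequency
# analysis: 2-D FFT of a longitude-time field (a stand-in for 100-hPa
# temperature along the equator), then integrate the PSD over a chosen
# wavenumber-frequency box to obtain a variance-based "wave activity".
import numpy as np

rng = np.random.default_rng(1)
n_lon, n_time = 144, 720                        # 2.5-degree grid, 720 daily samples
field = rng.standard_normal((n_time, n_lon))    # stand-in for T'(time, longitude)

spec = np.fft.fft2(field) / (n_time * n_lon)
psd = np.abs(spec) ** 2
freq = np.fft.fftfreq(n_time, d=1.0)            # frequency in cycles per day
wavenum = np.fft.fftfreq(n_lon, d=1.0 / n_lon)  # zonal wavenumber (cycles per latitude circle)

# Integrate the PSD over an illustrative box: wavenumbers 1-10, periods 3-20 days.
# Which quadrant corresponds to eastward propagation depends on the FFT sign convention.
k_grid, f_grid = np.meshgrid(wavenum, freq)
box = (k_grid >= 1) & (k_grid <= 10) & (f_grid >= 1 / 20) & (f_grid <= 1 / 3)
wave_activity = psd[box].sum()                  # the "variance" in that spectral region
print(wave_activity)
```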

Relevance:

30.00%

Publisher:

Abstract:

Introduction. Feature usage is a pre-requisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use', and specifying it as a formative construct, has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. The core features identified from the literature are then refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would serve as indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology for validating the content of this dependent variable. This is the first step of instrument development prior to statistical confirmation with a larger sample.

Relevance:

30.00%

Publisher:

Abstract:

It is estimated that the adult human brain contains 100 billion neurons, with 5–10 times as many astrocytes. Although the astrocyte has generally been considered a simple supportive cell to the neuron, recent research has revealed new functionality of the astrocyte in the form of information transfer to neurons of the brain. In our previous work we developed a protocol to pattern the hNT neuron (derived from the human teratocarcinoma cell line, hNT) on parylene-C/SiO2 substrates. In this work, we report how we have managed to pattern hNT astrocytes on parylene-C/SiO2 substrates to single-cell resolution. This article disseminates the nanofabrication and cell-culturing steps necessary for the patterning of such cells. In addition, it reports the parylene-C strip lengths and strip widths that encourage high degrees of cellular coverage and single-cell isolation for this cell type. The significance of patterning the hNT astrocyte on a silicon chip is that it will help enable single-cell and network studies of the undiscovered functionality of this interesting cell, thus contributing to closer pathological studies of the human brain.