913 results for global strategic networks of linkages
Abstract:
Background: Infant development is adversely affected in the context of postnatal depression. This relationship may be mediated by both the nature of early mother-infant interactions and the quality of the home environment. Aim: To establish the usefulness of the Global Ratings Scales of Mother-Infant Interaction and the Infant-Toddler version of the Home Observation for the Measurement of the Environment (IT-HOME), and to test expected associations of the measures with characteristics of the social context and with major or minor depression. Method: Both assessments were administered postnatally in four European centres; 144 mothers were assessed with the Global Ratings Scales and 114 with the IT-HOME. Affective disorder was assessed by means of the Structured Clinical Interview for DSM-IV Disorders. Results: Analyses of mother-infant interaction indicated no main effect of depression, but maternal sensitivity to infant behaviour was associated with better infant communication, especially for women who were not depressed. Poor overall emotional support also reduced sensitivity scores. Poor support was likewise related to poorer IT-HOME scores, but there was no effect of depression. Conclusions: The Global Ratings Scales were effectively applied, but there was less evidence of the usefulness of the IT-HOME. Declaration of interest: None.
Abstract:
It is usually expected that the intelligent controlling mechanism of a robot is a computer system. Research is now ongoing, however, in which biological neural networks are being cultured and trained to act as the brain of an interactive real-world robot, thereby either completely replacing or operating in a cooperative fashion with a computer system. Studying such neural systems can give a distinct insight into biological neural structures, and such research therefore has immediate medical implications. In particular, the use of rodent primary dissociated cultured neuronal networks for the control of mobile 'animats' (artificial animals; the term is a contraction of animal and materials) is a novel approach to discovering the computational capabilities of networks of biological neurones. A dissociated culture of this nature requires embodiment in some form, to enable development in a controlled environment within which stimuli may be received via sensory data while ultimate influence over motor actions is retained. The principal aims of the present research are to assess the computational and learning capacity of dissociated cultured neuronal networks, with a view to advancing network-level processing in artificial neural networks. This is approached through the creation of an artificial hybrid system (animat) involving closed-loop control of a mobile robot by a dissociated culture of rat neurons. This closed-loop interaction with the environment, through both sensing and effecting, enables investigation of the culture's learning capacity. This paper details the components of the overall animat closed-loop system and reports on the evaluation of results from the experiments being carried out with regard to robot behaviour.
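The abstract describes a sense-stimulate-record-actuate loop but gives no implementation detail. The following is a minimal illustrative sketch of such a closed loop; the interface names (read_sonar, stimulate_culture, read_firing_rate, set_wheel_speeds) and the mappings between them are hypothetical stand-ins, not taken from the paper.

```python
import random
import time

# --- Hypothetical interfaces (assumptions, not from the paper) ---

def read_sonar() -> float:
    """Distance to the nearest obstacle in metres (simulated here)."""
    return random.uniform(0.1, 2.0)

def stimulate_culture(frequency_hz: float) -> None:
    """Deliver electrical stimulation to the culture at a given rate."""
    pass  # would drive the multi-electrode array hardware

def read_firing_rate() -> float:
    """Culture response: spikes per second on recording electrodes (simulated)."""
    return random.uniform(0.0, 50.0)

def set_wheel_speeds(left: float, right: float) -> None:
    """Send motor commands to the mobile robot."""
    print(f"wheels: left={left:.2f}, right={right:.2f}")

# --- Closed loop: sensing -> stimulation -> recording -> actuation ---

def animat_loop(steps: int = 10, dt: float = 0.1) -> None:
    for _ in range(steps):
        distance = read_sonar()
        # Map proximity to stimulation frequency: a closer obstacle
        # gives a stronger stimulus (an arbitrary illustrative mapping).
        stimulate_culture(frequency_hz=1.0 / max(distance, 0.05))
        rate = read_firing_rate()
        # Map culture activity to a turning response: high firing
        # slows one wheel, steering the robot away from the obstacle.
        turn = min(rate / 50.0, 1.0)
        set_wheel_speeds(left=1.0 - turn, right=1.0)
        time.sleep(dt)

if __name__ == "__main__":
    animat_loop()
```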
Abstract:
We study the complex formation of a peptide, βAβAKLVFF, previously developed by our group, with Aβ(1–42) in aqueous solution. Circular dichroism spectroscopy is used to probe the interactions between βAβAKLVFF and Aβ(1–42), and to study the secondary structure of the species in solution. Thioflavin T fluorescence spectroscopy shows that the population of fibers is higher in βAβAKLVFF/Aβ(1–42) mixtures compared to pure Aβ(1–42) solutions. TEM and cryo-TEM demonstrate that co-incubation of βAβAKLVFF with Aβ(1–42) causes the formation of extended dense networks of branched fibrils, very different from the straight fibrils observed for Aβ(1–42) alone. Neurotoxicity assays show that although βAβAKLVFF alters the fibrillization of Aβ(1–42), it does not decrease the neurotoxicity, which suggests that toxic oligomeric Aβ(1–42) species are still present in the βAβAKLVFF/Aβ(1–42) mixtures. Our results show that our designed peptide binds to Aβ(1–42) and changes the amyloid fibril morphology. This is shown to not necessarily translate into reduced toxicity.
Abstract:
As integrated software solutions reshape project delivery, they alter the bases for collaboration and competition across firms in complex industries. This paper synthesises and extends literatures on strategy in project-based industries and digitally-integrated work to understand how project-based firms interact with digital infrastructures for project delivery. Four identified strategies are to: 1) develop and use capabilities to shape the integrated software solutions that are used in projects; 2) co-specialize, developing complementary assets to work repeatedly with a particular integrator firm; 3) retain flexibility by developing and maintaining capabilities in multiple digital technologies and processes; and 4) manage interfaces, translating work into project formats for coordination while hiding proprietary data and capabilities in internal systems. The paper articulates the strategic importance of digital infrastructures for delivery as well as product architectures. It concludes by discussing managerial implications of the identified strategies and areas for further research.
Abstract:
Associative memory networks such as Radial Basis Functions, Neurofuzzy and Fuzzy Logic, used for modelling nonlinear processes, suffer from the curse of dimensionality (COD): as the input dimension increases, the parameterization, computation cost, training data requirements, etc. increase exponentially. Here a new algorithm is introduced for the construction of Delaunay input space partitioned optimal piecewise locally linear models, to overcome the COD and to generate locally linear models directly amenable to linear control and estimation algorithms. The training of the model is configured as a new mixture-of-experts network with a fast decision rule derived using convex set theory. A very fast simulated reannealing (VFSR) algorithm is utilized to search for a globally optimal Delaunay input space partition. A benchmark nonlinear time series is used to demonstrate the new approach.
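The core idea of partitioning the input space with a Delaunay triangulation and fitting a local affine model per simplex can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the anchor points are fixed at random rather than optimized by VFSR, and the mixture-of-experts decision rule is replaced by scipy's simplex lookup.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)

# Toy nonlinear target on a 2-D input space.
X = rng.uniform(-1.0, 1.0, size=(500, 2))
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])

# Partition the input space by a Delaunay triangulation of anchor
# points. Here the anchors are random; the paper optimizes this
# partition with very fast simulated reannealing (VFSR).
anchors = rng.uniform(-1.0, 1.0, size=(20, 2))
tri = Delaunay(anchors)

# Fit one affine model per simplex from the training points inside it.
region = tri.find_simplex(X)
models = {}
for s in range(tri.nsimplex):
    mask = region == s
    if mask.sum() >= 3:  # need at least 3 points for a 2-D affine fit
        A = np.hstack([X[mask], np.ones((mask.sum(), 1))])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        models[s] = coef

def predict(x):
    s = int(tri.find_simplex(x[None, :])[0])
    if s < 0 or s not in models:
        return 0.0  # outside the triangulated hull, or an unfitted region
    return float(np.append(x, 1.0) @ models[s])

x_test = np.array([0.2, -0.3])
print(predict(x_test), np.sin(3 * 0.2) * np.cos(2 * -0.3))
```

Because each region's model is affine, any region's dynamics can be handed directly to linear control or estimation machinery, which is the property the abstract emphasizes.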
Abstract:
Scenarios are used to explore the consequences of different adaptation and mitigation strategies under uncertainty. In this paper, two scenarios are used to explore developments with (1) no mitigation leading to an increase of global mean temperature of 4 °C by 2100 and (2) an ambitious mitigation strategy leading to 2 °C increase by 2100. For the second scenario, uncertainties in the climate system imply that a global mean temperature increase of 3 °C or more cannot be ruled out. Our analysis shows that, in many cases, adaptation and mitigation are not trade-offs but supplements. For example, the number of people exposed to increased water resource stress due to climate change can be substantially reduced in the mitigation scenario, but adaptation will still be required for the remaining large numbers of people exposed to increased stress. Another example is sea level rise, for which, from a global and purely monetary perspective, adaptation (up to 2100) seems more effective than mitigation. From the perspective of poorer and small island countries, however, stringent mitigation is necessary to keep risks at manageable levels. For agriculture, only a scenario based on a combination of adaptation and mitigation is able to avoid serious climate change impacts.
Abstract:
The ozone-ethene reaction has been investigated at low pressure in a flow-tube interfaced to a u.v. photoelectron spectrometer. Photoelectron spectra recorded as a function of reaction time have been used to estimate partial pressures of the reagents and products, using photoionization cross-sections for selected photoelectron bands of the reagents and products, which have been measured separately. Product yields compare favourably with the results of other studies, and the production of oxygen and acetaldehyde has been measured as a function of time for the first time. A reaction scheme developed for the ozone-ethene reaction has been used to simulate the reagents and products as a function of time, and the results obtained are in good agreement with the experimental measurements. For each of the observed products, the simulations allow the main reaction (or reactions) producing that product to be established. The product yields have been used in a global model to estimate the products' global annual emissions in the atmosphere. Of particular interest are the calculated global annual emissions of formaldehyde (0.96 ± 0.10 Tg) and formic acid (0.05 ± 0.01 Tg), which are estimated as 0.04% and 0.7% of the total annual emission, respectively.
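The conversion from photoelectron band intensity to partial pressure is, to first order, a division by the band's photoionization cross-section. A minimal sketch under that assumption follows; the species, intensities and cross-section values are made up for illustration and are not the paper's data.

```python
# Relative partial pressure p_i is proportional to I_i / sigma_i, where
# I_i is the integrated photoelectron band intensity and sigma_i the
# band's photoionization cross-section. Values are illustrative only.
band_intensity = {"O3": 120.0, "C2H4": 340.0, "HCHO": 45.0, "O2": 80.0}
cross_section = {"O3": 15.0, "C2H4": 30.0, "HCHO": 11.0, "O2": 9.0}  # Mb

relative = {s: band_intensity[s] / cross_section[s] for s in band_intensity}
total = sum(relative.values())
for species, value in relative.items():
    print(f"{species}: mole fraction ~ {value / total:.3f}")
```

Repeating this calculation for spectra taken at successive reaction times yields the concentration-versus-time profiles that the abstract compares against the reaction-scheme simulations.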
Abstract:
A developing polar low is targeted with dropsonde observations to improve the forecast of its landfall. Accurately forecasting a polar low's strength and location remains a challenge: polar lows form over the ocean in poorly observed regions, so initial-condition errors may contribute significantly to forecast error. The targeted polar low formed in the Norwegian Sea on 3 March 2008, during the Norwegian IPY-THORPEX field campaign. Two flights, six hours apart, released dense networks of dropsondes into a sensitive region covering the polar low and the Arctic front to its west. The impact of the targeted observations is assessed using the limited-area Met Office Unified Model and its three-dimensional variational (3D-Var) data assimilation scheme. Forecasts were verified against ECMWF analysis data, which show good agreement with both dropsonde data from a flight through the mature polar low and 10 m QuikSCAT winds. The impact of the targeted data moved southwards with the polar low as it developed and then hit the Norwegian coast after 24 hours. The results show that the forecast of the polar low is sensitive to the initial conditions: targeted observations from the first flight did not improve the forecast, but those from the second flight clearly improved the forecast polar low position and intensity. However, caution should be applied in attributing the forecast improvement to the assimilation of the targeted observations from a single case-study, especially as the forecast improvement here is moderate relative to the spread of an operational ensemble forecast.
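For reference, 3D-Var produces the analysis by minimizing a cost function that weighs the background state against the observations. A minimal toy sketch with a linear observation operator is given below; the dimensions and covariance values are synthetic, not the Met Office configuration.

```python
import numpy as np

# 3D-Var cost: J(x) = 1/2 (x-xb)^T B^-1 (x-xb) + 1/2 (y-Hx)^T R^-1 (y-Hx)
# For linear H the minimizer has the closed form
#   xa = xb + B H^T (H B H^T + R)^-1 (y - H xb)

n, m = 4, 2                                # state size, observation count
xb = np.array([1.0, 2.0, 3.0, 4.0])        # background state
B = 0.5 * np.eye(n)                        # background error covariance
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])       # observe components 1 and 3
R = 0.1 * np.eye(m)                        # observation error covariance
y = np.array([1.4, 2.6])                   # observations (e.g. dropsondes)

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
xa = xb + K @ (y - H @ xb)                     # analysis state
print("analysis:", xa)
```

The relative sizes of B and R control how far the dropsonde observations pull the analysis away from the background, which is why assimilating targeted data in a sensitive region can change the forecast track and intensity.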
Abstract:
Ice cloud representation in general circulation models remains a challenging task, due to the lack of accurate observations and the complexity of microphysical processes. In this article, we evaluate the ice water content (IWC) and ice cloud fraction statistical distributions from the numerical weather prediction models of the European Centre for Medium-Range Weather Forecasts (ECMWF) and the UK Met Office, exploiting the synergy between the CloudSat radar and the CALIPSO lidar. Using the last three weeks of July 2006, we analyse the global ice cloud occurrence as a function of temperature and latitude and show that the models capture the main geographical and temperature-dependent distributions, but overestimate ice cloud occurrence in the Tropics in the temperature range from −60 °C to −20 °C and in the Antarctic for temperatures higher than −20 °C, while underestimating it at very low temperatures. A global statistical comparison of the occurrence of grid-box mean IWC at different temperatures shows that both the mean and the range of IWC increase with increasing temperature. Globally, the models capture most of the IWC variability in the temperature range between −60 °C and −5 °C, and also reproduce the observed latitudinal dependencies in the IWC distribution due to different meteorological regimes. Two versions of the ECMWF model are assessed. The recent operational version, with a diagnostic representation of precipitating snow and mixed-phase ice cloud, fails to represent the IWC distribution in the −20 °C to 0 °C range, but a new version with prognostic variables for liquid water, ice and snow is much closer to the observed distribution. The comparison of models and observations provides a much-needed analysis of the vertical distribution of IWC across the globe, highlighting the ability of the models to reproduce much of the observed variability as well as the deficiencies where further improvements are required.
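The occurrence-versus-temperature statistics described above amount to binning a cloud-detection flag by temperature. A minimal sketch of that kind of comparison, using synthetic profiles and an arbitrary detection threshold rather than CloudSat/CALIPSO data:

```python
import numpy as np

# Synthetic stand-ins for collocated profiles: a temperature and an
# ice-cloud flag (IWC above a detection threshold). Illustrative only.
rng = np.random.default_rng(1)
temperature = rng.uniform(-80, 0, size=100_000)                # deg C
iwc = rng.lognormal(mean=-9 + 0.05 * temperature, sigma=1.5)   # kg m^-3
cloud = iwc > 1e-6                                             # detection

# Occurrence frequency of ice cloud per 5-degree temperature bin.
bins = np.arange(-80, 5, 5)
idx = np.digitize(temperature, bins)
for k in range(1, len(bins)):
    in_bin = idx == k
    if in_bin.any():
        print(f"{bins[k-1]:4d} to {bins[k]:4d} C: "
              f"occurrence = {cloud[in_bin].mean():.2f}")
```

Running the same binning on model output and on observations, then differencing the two curves, gives exactly the over/underestimation pattern the abstract reports by temperature range.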
Abstract:
Multiple versions of information, and the problems associated with them, are well documented in both academic research and industry best practice. Many solutions have proposed a single version of the truth, with Business Intelligence being adopted by many organizations. Business Intelligence (BI), however, is largely based on the collection of data and the processing and presentation of information to meet different stakeholders' requirements. This paper reviews Enterprise Intelligence, which promises to support decision-making based on a defined strategic understanding of the organization's goals and a unified version of the truth.
Abstract:
This paper describes a simplified dynamic thermal model which simulates the energy and overheating performance of windows. To calculate artificial lighting energy use within a room, the model employs the average illuminance method, which takes into account the daylight energy reaching the room, using hourly climate data. The tool describes the main aspects of thermal performance (heating, cooling and overheating risk) resulting from a proposed window design. The inputs are fewer and simpler than those required by complicated simulation programmes. The method is suited to use by architects and engineers at the strategic phase of design, when little detailed information is available.
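In essence, the average illuminance method reduces to an hourly check of whether daylight alone meets the design illuminance. A minimal sketch under that reading; the daylight-factor model and all parameter values are illustrative assumptions, not the paper's tool.

```python
# Hourly artificial-lighting energy from an average-illuminance test.
# All parameters below are illustrative assumptions.
daylight_factor = 0.02        # average interior/exterior illuminance ratio
setpoint_lux = 500.0          # design illuminance for the room
lighting_power_w = 300.0      # installed lighting load

# Hourly exterior horizontal illuminance (lux) from climate data; a
# crude synthetic day-night cycle stands in for a real weather file.
exterior_lux = [0.0] * 7 + [20_000.0] * 10 + [0.0] * 7   # 24 hours

energy_wh = 0.0
for e_lux in exterior_lux:
    interior_lux = daylight_factor * e_lux
    if interior_lux < setpoint_lux:      # daylight insufficient: lights on
        energy_wh += lighting_power_w    # one hour at full load
print(f"daily artificial lighting energy: {energy_wh / 1000:.1f} kWh")
```

Because glazing choice changes both the daylight factor and the solar gains, the same hourly climate data can drive the heating, cooling and overheating checks alongside this lighting calculation.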
Abstract:
In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of atmosphere. Using 2D fields at the top of atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes to only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic—but conceptually correct—description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m−2 K−1, with discrepancies on the order of 5%, and CMs’ baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m−2, which implies that the derived inequality is rather stringent. When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
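A rough numerical reading of the 2D diagnostic: given maps of net TOA radiation and an emission temperature derived from the outgoing longwave flux, the horizontal (transport-related) entropy production can be estimated as the area-weighted mean of -R_TOA/T_E, since heat is exported from warm regions and deposited in cold ones. The sketch below assumes that form and uses synthetic zonal-mean fields; it illustrates the kind of calculation involved, not the paper's exact formulae.

```python
import numpy as np

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Synthetic zonal-mean 2D fields on a latitude grid (illustrative only).
lat = np.linspace(-89.5, 89.5, 180)
w = np.cos(np.radians(lat))                        # area weights
olr = 240.0 - 60.0 * np.sin(np.radians(lat))**2    # outgoing longwave, W m^-2
asr = 340.0 * 0.7 * np.cos(np.radians(lat))        # absorbed solar, W m^-2
r_toa = asr - olr                                  # net TOA radiation

# Remove the global imbalance so that transport alone closes the budget.
r_toa -= np.average(r_toa, weights=w)

t_e = (olr / SIGMA) ** 0.25                        # emission temperature, K

# Horizontal material entropy production proxy (assumed form): positive
# because heat flows from high-T_E (tropics) to low-T_E (poles) regions.
s_horiz = np.average(-r_toa / t_e, weights=w)      # W m^-2 K^-1
print(f"horizontal entropy production ~ {s_horiz * 1000:.1f} mW m^-2 K^-1")
```

A value of this order, set against the total of roughly 55 mW m^-2 K^-1 quoted in the abstract, is consistent with the finding that horizontal processes account for only about 10% of the material entropy production.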
Abstract:
Despite the growing intensity of the debate about environmental management, it is only recently that rural practice surveyors have become aware of its significance and potential. Consequently, few surveyors are yet in a position to offer professional advice, despite evidence from the RICS's client needs survey that nearly half of all existing clients require more advice on environmental matters. As a prerequisite to becoming involved in environmental management, it is clear that chartered surveyors have to develop new skills alongside new perceptions of their work. Rather than being conterminous, however, the alignment of these attributes reflects a fundamental tension. This is focused on the dichotomy between the strategic construction of the environment as a basis for realigning corporate policy and the more limited evocation of environmentalism as potential new business. This paper seeks to explore the nature and policy context of sustainable development, in the process examining its significance for rural chartered surveyors. In doing so, the paper will seek to contrast the essentially anthropocentric utilitarianism of surveyors' current attitudes with the radical agenda implied by a more ecocentric, sustainable development approach to professional management and advice. The paper will conclude with a discussion of how far the principles of sustainable development can be incorporated into the management of surveying businesses, and what this implies for the future of the rural practice chartered surveyor as land manager.
Abstract:
The idea of a community of practice (CoP) has been offered as the engine to unlock the potential of organizational resources, mainly knowledge and people, to achieve the strategic goal of sustained competitiveness. The relevance and application of CoPs in large UK contracting companies were investigated using two case studies. Contrasting variations in the understanding of the concept between the two contracting companies were observed. While a CoP was applied in one company with strategic intent, the concept was not fully understood in the other. In one company, only a third of CoP members surveyed agreed that CoPs were a vehicle for driving best practice and innovation throughout the business; this compared with more than 60% in agreement in the other contracting firm. The higher agreement and satisfaction of CoP members in the latter case study was the result of the management's understanding and commitment. CoPs require time and organizational support to mature. The strategic inception and management support of CoP application is vital for their maturation and progress. Although the construction industry change discourses portray CoPs as fostering an environment of trust, and hence serving as enablers of innovation and competitiveness, their potential contribution to contracting firms does not yet present a compelling case and hence merits further research.