839 results for Spontaneous generation
Abstract:
We consider the relation between so-called continuous localization models, i.e. non-linear stochastic Schrödinger evolutions, and the discrete GRW model of wave-function collapse. The former can be understood as a scaling limit of the GRW process. The proof relies on a stochastic Trotter formula, which is of interest in its own right. Our Trotter formula also allows us to complement results on the existence theory of stochastic Schrödinger evolutions due to Holevo and Mora/Rebolledo.
Abstract:
In this paper we draw on the theory of dynamic capabilities to examine the development of the only surviving family-owned Liverpool shipping company. The Bibby Line was founded in 1807 to take advantage of the growing sea trade based in Liverpool. The company remained in shipping until the mid-1960s, when a series of external crises led the owner, Derek Bibby, to begin a process of diversification. In the last 50 years, the Bibby Line has grown into a £1bn business with interests in retail, distribution and financial services, as well as a continuing commitment to shipping. Our intention is to demonstrate how multigenerational ownership contributes to the creation of dynamic capabilities in family firms. The distinctive nature of Bibby as a long-standing family business is related to unique assets such as patient capital and flexible governance structures, as well as the ability to mobilise social and human capital.
Abstract:
Traditional vaccines, such as inactivated or live attenuated vaccines, are gradually giving way to more biochemically defined vaccines that are most often based on a recombinant antigen known to possess neutralizing epitopes. Such vaccines can offer improvements in speed, safety and manufacturing process, but an inevitable consequence of their high degree of purification is that immunogenicity is reduced through the lack of the innate triggering molecules present in more complex preparations. Targeting recombinant vaccines to antigen-presenting cells (APCs), such as dendritic cells, can however improve immunogenicity by ensuring that antigen processing is as efficient as possible. Immune complexes, one of a number of routes of APC targeting, are mimicked by a recombinant approach, crystallizable fragment (Fc) fusion proteins, in which the target immunogen is linked directly to an antibody effector domain capable of interacting with Fc receptors (FcR) on the APC cell surface. A number of viral Fc fusion proteins have been expressed in insect cells using the baculovirus expression system and shown to be efficiently produced and purified. Their use for immunization alongside non-Fc-tagged equivalents shows that they are powerfully immunogenic in the absence of added adjuvant and that immune stimulation is the result of the Fc-FcR interaction.
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change appear mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what resolution, both horizontal and vertical, in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
Abstract:
Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data broadly representative of an assumed stationary climate. Recent developments have made available ‘morphed’ equivalents of these years by shifting and stretching the measured variables using change factors produced by the UKCIP02 climate projections. The release of the latest, probabilistic, climate projections of UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity to generate new design summer years that can be used in risk-based decision-making. There are many possible methods for the production of design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period that are comparable with those in current use. Four methodologies for the generation of future years are described, and their output is related to the future (deterministic) years that are currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will enable designers who have access to the output of the UKCP09 weather generator (WG) to generate Design Summer Year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
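A minimal sketch of how a cooling-degree-hour warmth metric of this kind can be computed and used to rank candidate years is given below; the 22 °C base temperature and the simple linear exceedance weighting are illustrative assumptions rather than the exact formulation employed in the article.

```python
# Sketch: ranking candidate summers by (weighted) cooling degree hours.
# The 22 degC base temperature and the linear weighting are illustrative
# assumptions; the article's exact weighting scheme may differ.

def weighted_cdh(hourly_temps_c, base_c=22.0, weight=lambda excess: excess):
    """Sum the weighted exceedance of hourly dry-bulb temperature above a base."""
    return sum(weight(t - base_c) for t in hourly_temps_c if t > base_c)

def rank_years(years):
    """years: dict mapping a year label to its hourly temperature series."""
    scores = {label: weighted_cdh(temps) for label, temps in years.items()}
    # A near-extreme design summer year is typically taken from near the upper
    # end of this ranking rather than the single hottest year.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```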
Abstract:
As electricity systems incorporate increasing levels of variable renewable generation, conventional plant will be required to operate more flexibly, with potential impacts on economic viability and reliability. Northern Ireland is pursuing an ambitious target of 40% of electricity to be supplied from renewable sources by 2020. The dominant source of this energy is anticipated to be inherently variable wind power, one of the most mature renewable technologies. Conventional thermal generators will have a significant role to play in maintaining security of supply. However, running conventional generation more flexibly in order to cater for a wind-led regime can reduce its efficiency, as well as shorten its lifespan and increase O&M costs. This paper examines the impacts of variable operation on existing fossil-fuel-based generators, with a particular focus on Northern Ireland. Access to plant operators and industry experts has provided insight not currently evident in the energy literature. Characteristics of plant operation and the market framework are identified that present significant challenges in moving to the proposed levels of wind penetration. Opportunities for increasing flexible operation are proposed and future research needs are identified.
Abstract:
Meteorological (met) station data is used as the basis for a number of influential studies into the impacts of the variability of renewable resources. Real turbine output data is often not easy to acquire, whereas meteorological wind data, supplied at a standardised height of 10 m, is widely available. This data can be extrapolated to a standard turbine height using the wind profile power law and used to simulate the hypothetical power output of a turbine. Utilising a number of met sites in this manner can yield a model of future wind generation output. However, the accuracy of this extrapolation is strongly dependent on the choice of the wind shear exponent alpha. This paper investigates the accuracy of the simulated generation output compared with reality, using a wind farm in North Rhins, Scotland and a nearby met station at West Freugh. The results show that while a single annual average value for alpha may be selected to accurately represent the long-term energy generation from a simulated wind farm, there are significant differences between simulation and reality on an hourly power generation basis, with implications for understanding the impact of the variability of renewables on short timescales, particularly for system balancing and for the way that conventional generation may be asked to respond to a high level of variable renewable generation on the grid in the future.
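The extrapolation and simulation steps described above can be sketched as follows; the shear exponent, the 80 m hub height and the generic power-curve parameters are illustrative assumptions, not values taken from the study.

```python
# Sketch: extrapolating 10 m met-station wind speed to hub height with the
# wind profile power law, then mapping speed to power with a generic turbine
# curve. alpha, hub height and the power-curve parameters are assumptions.

def hub_height_speed(v10, alpha=0.14, hub_height=80.0, ref_height=10.0):
    """Wind profile power law: v(h) = v_ref * (h / h_ref) ** alpha."""
    return v10 * (hub_height / ref_height) ** alpha

def turbine_power(v, cut_in=3.5, rated_speed=13.0, cut_out=25.0, rated_kw=2000.0):
    """Simplified power curve: cubic ramp between cut-in and rated speed."""
    if v < cut_in or v > cut_out:
        return 0.0
    if v >= rated_speed:
        return rated_kw
    return rated_kw * ((v ** 3 - cut_in ** 3) / (rated_speed ** 3 - cut_in ** 3))

# Hourly simulation from a 10 m wind-speed series (m/s):
v10_series = [4.2, 6.8, 9.1, 12.5, 2.9]
power_kw = [turbine_power(hub_height_speed(v)) for v in v10_series]
```

Because the hub-height speed scales as a power of alpha and the power curve is strongly non-linear, a single annual-average alpha can be tuned so that summed energy matches while individual hours still differ substantially, which is the discrepancy the paper quantifies.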
Abstract:
Reaction of [Cu(pic)2]·2H2O (where pic stands for 2-picolinato) with 2-({[2-(dimethylamino)ethyl]amino}methyl)phenol (HL1) produces the square-pyramidal complex [CuL1(pic)] (1), which crystallizes as a conglomerate (namely a mixture of optically pure crystals) in the Sohncke space group P212121. The use of the ligand methylated at the benzylic position, i.e. (±)-2-(1-{[2-(dimethylamino)ethyl]amino}ethyl)phenol (HL2), yields the analogous five-coordinate complex [CuL2(pic)] (2), which crystallizes as a true racemate (namely the crystals contain both enantiomers) in the centrosymmetric space group P21/c. Density functional theory (DFT) calculations indicate that the presence of the methyl group indeed leads to a distinct crystallization behaviour, not only through intramolecular steric effects but also because its involvement in non-covalent C–H···π and hydrophobic intermolecular contacts appears to be an important factor contributing to the (stabilizing) crystal-lattice energy of 2.
Abstract:
As wind generation increases, system impact studies rely on predictions of future generation and effective representation of wind variability. A well-established approach to investigating the impact of wind variability is to simulate generation using observations from 10 m meteorological mast data. However, there are problems with relying purely on historical wind-speed records or generation histories: mast data is often incomplete, not sited at relevant wind generation sites, and recorded at the wrong altitude above ground (usually 10 m), each of which may distort the generation profile. A possible complementary approach is to use reanalysis data, where data assimilation techniques are combined with state-of-the-art weather forecast models to produce complete gridded wind time-series over an area. Previous investigations of reanalysis datasets have placed an emphasis on comparing reanalysis to meteorological site records, whereas this paper compares wind generation simulated using reanalysis data directly against historic wind generation records. Importantly, this comparison is conducted using raw reanalysis data (typical resolution ∼50 km), without relying on a computationally expensive “dynamical downscaling” for a particular target region. Although the raw reanalysis data cannot, by the nature of its construction, represent the site-specific effects of sub-gridscale topography, it is nevertheless shown to be comparable to or better than the mast-based simulation in the region considered, and it is therefore argued that raw reanalysis data may offer a number of significant advantages as a data source.
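A simple way to make the comparison described above concrete is to score each simulated series against the recorded generation; the sketch below uses correlation and RMSE as generic skill metrics, which are assumptions rather than the paper's own evaluation protocol.

```python
# Sketch: scoring mast-based and reanalysis-based generation simulations
# against recorded wind-farm output. The metrics (Pearson correlation, RMSE)
# are generic choices; the paper's own evaluation may differ.
import math

def pearson_r(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def score(simulated, recorded):
    return {"r": pearson_r(simulated, recorded), "rmse": rmse(simulated, recorded)}

# Usage: compare score(mast_based_series, recorded_series) with
# score(reanalysis_series, recorded_series) over the same hours.
```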
Abstract:
Control and optimization of flavor is the ultimate challenge for the food and flavor industry. The major route to flavor formation during thermal processing is the Maillard reaction, which is a complex cascade of interdependent reactions initiated by the reaction between a reducing sugar and an amino compound. The complexity of the reaction means that researchers turn to kinetic modeling in order to understand the control points of the reaction and to manipulate the flavor profile. Studies of the kinetics of flavor formation have developed over the past 30 years from single-response empirical models of binary aqueous systems to sophisticated multi-response models in food matrices, based on the underlying chemistry, with the power to predict the formation of some key aroma compounds. This paper discusses in detail the development of kinetic models of thermal generation of flavor and looks at the challenges involved in predicting flavor.
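As a rough illustration of what a multi-response kinetic model involves, the sketch below integrates a toy two-step scheme (reducing sugar plus amino compound forming an intermediate that decays to an aroma compound); the scheme, rate constants and time step are illustrative assumptions only, not the models reviewed in the paper.

```python
# Sketch: a toy multi-response kinetic model of thermal flavor formation,
# sugar + amino -> intermediate -> aroma, integrated with forward Euler.
# The two-step scheme and rate constants are illustrative assumptions only.

def simulate(k1=0.02, k2=0.01, sugar0=1.0, amino0=1.0, t_end=300.0, dt=0.1):
    sugar, amino, inter, aroma = sugar0, amino0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        r1 = k1 * sugar * amino        # second-order condensation step
        r2 = k2 * inter                # first-order breakdown to aroma compound
        sugar -= r1 * dt
        amino -= r1 * dt
        inter += (r1 - r2) * dt
        aroma += r2 * dt
        t += dt
    return {"sugar": sugar, "amino": amino, "intermediate": inter, "aroma": aroma}
```

In a genuine multi-response fit, the rate constants would be estimated against measured concentrations of several species at once rather than a single response.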
Abstract:
Purpose: Increasing costs of health care, fuelled by demand for high-quality, cost-effective care, have driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adoption of Clinical Pathways (CPs) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and as a result can influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics, capturing knowledge from the syntactic, semantic and pragmatic levels through to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables the generation of a semantically rich representation of CP knowledge. The Semantic Analysis Method (SAM) is applied to explicitly represent the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. The Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and rules that govern the actions identified on the ontology chart. Information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes can best be tackled by analyzing stakeholders, which we treat as social agents, their goals and their patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur. To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CPs generated from SAM together with norms will enrich the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, the ultimate power of decision making in exceptional circumstances.
Abstract:
Stroke is a medical emergency and can cause neurological damage, affecting the motor and sensory systems. Harnessing brain plasticity should make it possible to reconstruct the closed loop between the brain and the body, i.e., associating the generation of the motor command with the somatic sensory feedback might enhance motor recovery. In order to aid reconstruction of this loop with a robotic device, it is necessary to assist the paretic side of the body at the right moment, so that the motor command and the feedback signal to the somatic sensory area of the brain occur simultaneously. To this end, we propose an integrated EEG-driven assistive robotic system for stroke rehabilitation. Depending on the level of motor recovery, it is important to provide adequate stimulation for upper limb motion. Thus, we propose an assist arm incorporating a Magnetic Levitation Joint that can generate compliant motion owing to its levitation and mechanical redundancy. This paper reports on a feasibility study carried out to verify the validity of the robot sensing, and on EEG measurements conducted with healthy volunteers performing spontaneous arm flexion/extension movements. A characteristic feature was found in the temporal evolution of the EEG signal in single motions prior to the executed motion, which can aid in coordinating the timing of the onset of robotic arm assistance.
Abstract:
Spontaneous activity of the brain at rest has frequently been considered a mere backdrop to the salient activity evoked by external stimuli or tasks. However, the resting state of the brain consumes most of its energy budget, which suggests a far more important role. An intriguing hint comes from experimental observations of spontaneous activity patterns, which closely resemble those evoked by visual stimulation with oriented gratings, except that the cortex appeared to cycle between different orientation maps. Moreover, patterns similar to those evoked by the behaviorally most relevant horizontal and vertical orientations occurred more often than those corresponding to oblique angles. We hypothesize that this kind of spontaneous activity develops at least to some degree autonomously, providing a dynamical reservoir of cortical states, which are then associated with visual stimuli through learning. To test this hypothesis, we use a biologically inspired neural mass model to simulate a patch of cat visual cortex. Spontaneous transitions between orientation states were induced by modest modifications of the neural connectivity, establishing a stable heteroclinic channel. Significantly, the experimentally observed greater frequency of states representing the behaviorally important horizontal and vertical orientations emerged spontaneously from these simulations. We then applied bar-shaped inputs to the model cortex and used Hebbian learning rules to modify the corresponding synaptic strengths. After unsupervised learning, different bar inputs reliably and exclusively evoked their associated orientation state, whereas in the absence of input the model cortex resumed its spontaneous cycling. We conclude that the experimentally observed similarities between spontaneous and evoked activity in visual cortex can be explained as the outcome of a learning process that associates external stimuli with a preexisting reservoir of autonomous neural activity states. Our findings hence demonstrate how cortical connectivity can link the maintenance of spontaneous activity in the brain mechanistically to its core cognitive functions.
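The association step can be illustrated with a plain Hebbian weight update; the learning rate and the vector encoding of inputs and cortical states below are illustrative assumptions, and the paper's actual model is a neural mass simulation rather than this toy rule.

```python
# Sketch: associating an input pattern with a cortical orientation state via a
# plain Hebbian update, delta_w = eta * post * pre. The rate eta and the vector
# encoding of inputs and states are illustrative assumptions only.

def hebbian_step(weights, pre, post, eta=0.01):
    """weights[i][j] connects input unit j to state unit i; returns updated weights."""
    return [[w + eta * post[i] * pre[j] for j, w in enumerate(row)]
            for i, row in enumerate(weights)]
```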
Lost in flatlands: will the next generation of page layout programs give us back our sense of space?
Abstract:
Background: Cortical cultures grown long-term on multi-electrode arrays (MEAs) are frequently and extensively used as models of cortical networks in studies of neuronal firing activity, neuropharmacology, toxicology and the mechanisms underlying synaptic plasticity. However, in contrast to the predominantly asynchronous neuronal firing activity exhibited by intact cortex, the electrophysiological activity of mature cortical cultures is dominated by spontaneous epileptiform-like global burst events, which hinders their effective use in network-level studies, particularly for neurally-controlled animat (‘artificial animal’) applications. Thus, the identification of culture features that can be exploited to produce neuronal activity more representative of that seen in vivo could increase the utility and relevance of studies that employ these preparations. Acetylcholine has a recognised neuromodulatory role affecting excitability, rhythmicity, plasticity and information flow in vivo, although its endogenous production by cortical cultures and its subsequent functional influence upon neuronal excitability remain unknown. Results: Consequently, using MEA electrophysiological recording supported by immunohistochemical and RT-qPCR methods, we demonstrate, for the first time, the presence of intrinsic cholinergic neurons and significant, endogenous cholinergic tone in cortical cultures, with a characterisation of the muscarinic and nicotinic components that underlie modulation of spontaneous neuronal activity. We found that tonic muscarinic ACh receptor (mAChR) activation affects global excitability and burst event regularity in a culture age-dependent manner whilst, in contrast, tonic nicotinic ACh receptor (nAChR) activation can modulate burst duration and the proportion of spikes occurring within bursts in a spatio-temporal fashion. Conclusions: We suggest that the presence of significant endogenous cholinergic tone in cortical cultures and the comparability of its modulatory effects to those seen in intact brain tissues support emerging, exploitable commonalities between in vivo and in vitro preparations. We conclude that experimental manipulation of endogenous cholinergic tone could offer a novel opportunity to improve the use of cortical cultures for studies of network-level mechanisms in a manner that remains largely consistent with its functional role.