916 results for continuous-resource model
Abstract:
Propagation of discharges in cortical and thalamic systems, which is used as a probe for examining network circuitry, is studied by constructing a one-dimensional model of integrate-and-fire neurons that are coupled by excitatory synapses with delay. Each neuron fires only one spike. The velocity and stability of propagating continuous pulses are calculated analytically. Above a certain critical value of the constant delay, these pulses lose stability. Instead, lurching pulses propagate with discontinuous and periodic spatio-temporal characteristics. The parameter regime for which lurching occurs is strongly affected by the footprint (connectivity) shape; bistability may occur with a square footprint shape but not with an exponential footprint shape. For strong synaptic coupling, the velocity of both continuous and lurching pulses increases logarithmically with the synaptic coupling strength g_syn for an exponential footprint shape, and is bounded for a step footprint shape. We conclude that the differences in velocity and shape between the front of thalamic spindle waves in vitro and cortical paroxysmal discharges stem from their different effective delays; in thalamic networks, the large effective delay between inhibitory neurons arises from their interaction via the excitatory cells, which display postinhibitory rebound.
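The delayed-excitation mechanism described above can be sketched numerically. Below is a minimal, illustrative simulation (not the paper's analytical model): a 1-D chain of integrate-and-fire neurons, each firing at most one spike, coupled by delayed excitatory pulses with an exponential footprint. All parameter values are assumptions chosen for demonstration.

```python
import numpy as np

def simulate_chain(n=200, dx=0.05, sigma=1.0, g=3.0, delay=1.0,
                   tau=1.0, theta=1.0, dt=0.01, t_max=60.0):
    """Return positions and spike times (NaN where a neuron never fired)."""
    x = np.arange(n) * dx
    spike_t = np.full(n, np.nan)
    spike_t[0] = 0.0                       # stimulate the left edge
    v = np.zeros(n)
    for step in range(1, int(t_max / dt)):
        t = step * dt
        fired = ~np.isnan(spike_t)
        # deliver each spike's delayed pulse exactly once, with an
        # exponential footprint w(d) = g/(2*sigma) * exp(-|d|/sigma)
        active = fired & (np.abs(t - spike_t - delay) < dt / 2)
        drive = np.zeros(n)
        for j in np.where(active)[0]:
            drive += g / (2 * sigma) * np.exp(-np.abs(x - x[j]) / sigma)
        v += dt * (-v / tau) + drive       # leaky integration plus pulses
        new = (~fired) & (v >= theta)      # each neuron fires only once
        spike_t[new] = t
    return x, spike_t
```

Fitting position against spike time gives an estimate of the propagation velocity; varying the delay or the footprint shape changes whether the front advances smoothly or in lurching cohorts.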
Abstract:
When Ca2+ is released from internal stores in living cells, the resulting wave of increased concentration can travel without deformation (continuous propagation) or with burst-like behavior (saltatory propagation). We analyze the “fire–diffuse–fire” model in order to illuminate the differences between these two modes of propagation. We show that the Ca2+ release wave in immature Xenopus oocytes and cardiac myocytes is saltatory, whereas the fertilization wave in the mature oocyte is continuous.
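The fire-diffuse-fire mechanism lends itself to a compact sketch. The grid, site spacing, release amount and threshold below are illustrative assumptions, not values fitted to oocyte or myocyte data: Ca2+ diffuses on a 1-D grid, and each discrete release site injects a fixed amount exactly once, when the local concentration crosses a threshold.

```python
import numpy as np

def fdf_wave(L=50.0, dx=0.1, D=1.0, site_spacing=2.0, amount=5.0,
             threshold=0.1, dt=0.002, t_max=50.0):
    """Return release-site positions and firing times (NaN if never fired)."""
    n = int(L / dx)
    c = np.zeros(n)                        # Ca2+ concentration on the grid
    step_sites = int(site_spacing / dx)
    sites = np.arange(step_sites, n - 1, step_sites)
    fired = np.zeros(len(sites), dtype=bool)
    fire_time = np.full(len(sites), np.nan)
    c[sites[0]] += amount / dx             # trigger the first release site
    fired[0], fire_time[0] = True, 0.0
    lam = D * dt / dx**2                   # explicit scheme: needs lam < 0.5
    for step in range(1, int(t_max / dt)):
        c[1:-1] += lam * (c[2:] - 2 * c[1:-1] + c[:-2])
        hot = (~fired) & (c[sites] >= threshold)
        c[sites[hot]] += amount / dx       # each site releases exactly once
        fire_time[hot] = step * dt
        fired |= hot
    return sites * dx, fire_time
```

Sparse, strong release sites give the burst-like (saltatory) timing; denser or weaker sites approach continuous propagation.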
The Zebrafish Information Network (ZFIN): a resource for genetic, genomic and developmental research
Abstract:
The Zebrafish Information Network, ZFIN, is a WWW community resource of zebrafish genetic, genomic and developmental research information (http://zfin.org). ZFIN provides an anatomical atlas and dictionary, developmental staging criteria, research methods, pathology information and a link to the ZFIN relational database (http://zfin.org/ZFIN/). The database, built on a relational, object-oriented model, provides integrated information about mutants, genes, genetic markers, mapping panels, publications and contact information for the zebrafish research community. The database is populated with curated published data, user-submitted data and large dataset uploads. The data span a broad range of types, including text, images, graphical representations and genetic maps. ZFIN incorporates links to other genomic resources that provide sequence and ortholog data. Zebrafish nomenclature guidelines and an automated registration mechanism for new names are provided. Extensive usability testing has resulted in a forms interface that is easy to learn and use, with complex searching capabilities.
Abstract:
Grazed pastures are the backbone of the Brazilian livestock industry, and grasses of the genus Brachiaria (syn. Urochloa) are some of the most used tropical forages in the country. Although the dependence on the forage resource is high, grazing management is often empirical and based on broad, non-specific guidelines. Mulato II brachiariagrass (Convert HD 364, Dow AgroSciences, São Paulo, Brazil) (B. brizantha × B. ruziziensis × B. decumbens), a new Brachiaria hybrid, was released as an option for a broad range of environmental conditions. There is no scientific information on specific management practices for Mulato II under continuous stocking in Brazil. The objectives of this research were to describe and explain variations in carbon assimilation, herbage accumulation (HA), plant-part accumulation, nutritive value, and grazing efficiency (GE) of Mulato II brachiariagrass as affected by canopy height and growth rate, the latter imposed by N fertilization rate, under continuous stocking. An experiment was carried out in Piracicaba, SP, Brazil, during two summer grazing seasons. The experimental design was a randomized complete block with a 3 × 2 factorial arrangement, corresponding to three steady-state canopy heights (10, 25 and 40 cm) maintained by mimicked continuous stocking and two growth rates (imposed as 50 and 250 kg N ha⁻¹ yr⁻¹), with three replications. There were no height × N rate interactions for most of the responses studied. The HA of Mulato II increased linearly (8640 to 13400 kg DM ha⁻¹ yr⁻¹), the in vitro digestible organic matter (IVDOM) decreased linearly (652 to 586 g kg⁻¹), and the GE decreased (65 to 44%) as canopy height increased. Thus, although GE and IVDOM were greatest at the 10-cm height, HA was 36% less for the 10- than for the 40-cm height. Leaf carbon assimilation was greater for the shortest canopy (10 cm), but canopy assimilation was less than in taller canopies, likely a result of lower leaf area index (LAI).
The reductions in HA, plant-part accumulation, and LAI were not associated with other signs of stand deterioration. Leaf was the main plant part accumulated, at a rate that increased from 70 to 100 kg DM ha⁻¹ d⁻¹ as canopy height increased from 10 to 40 cm. Mulato II was less productive (7940 vs. 13380 kg ha⁻¹ yr⁻¹) and had lower IVDOM (581 vs. 652 g kg⁻¹) at the lower N rate. The increase in N rate affected plant growth, increasing carbon assimilation, LAI, rates of plant-part accumulation (leaf, stem, and dead), and HA. The results indicate that the increase in the rate of dead material accumulation with more N applied results from an overall increase in the accumulation rates of all plant parts. Taller canopies (25 or 40 cm) are advantageous for herbage accumulation of Mulato II, but nutritive value and GE were greater at 25 cm, suggesting that maintaining a ∼25-cm canopy height is optimal for continuously stocked Mulato II.
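The linear height responses reported above can be made concrete with a small interpolation. It uses only the endpoint values quoted in the abstract and its statement that the responses were linear; intermediate values are interpolated, not measured.

```python
# Endpoints from the abstract: HA 8640-13400 kg DM/ha/yr, IVDOM 652-586 g/kg,
# GE 65-44%, across canopy heights of 10-40 cm.  Linearity is assumed as
# stated; values at 25 cm are interpolated for illustration only.
def interp(height_cm, lo, hi):
    """Linear interpolation between the reported 10 cm and 40 cm values."""
    frac = (height_cm - 10) / (40 - 10)
    return lo + frac * (hi - lo)

for h in (10, 25, 40):
    ha = interp(h, 8640, 13400)      # herbage accumulation, kg DM/ha/yr
    ivdom = interp(h, 652, 586)      # digestibility, g/kg
    ge = interp(h, 65, 44)           # grazing efficiency, %
    print(f"{h:2d} cm: HA={ha:7.0f}  IVDOM={ivdom:5.1f}  GE={ge:4.1f}")
```

The interpolated 25-cm values sit between the extremes on all three axes, which is the trade-off behind the recommended ∼25-cm target.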
Abstract:
In this paper we describe Fénix, a data model for exchanging information between Natural Language Processing applications. The proposed format is intended to be flexible enough to cover both current and future data structures employed in the field of Computational Linguistics. The Fénix architecture is divided into four separate layers: conceptual, logical, persistence and physical. This division provides a simple interface that shields users from low-level implementation details, such as the programming languages and data storage employed, allowing them to focus on the concepts and processes to be modelled. The Fénix architecture is accompanied by a set of programming libraries to facilitate access to and manipulation of the structures created in this framework. We also show how this architecture has already been applied successfully in different research projects.
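The four-layer separation can be illustrated with a toy sketch. The class and method names below are hypothetical, since the abstract does not specify Fénix's actual API; the point is only that a user manipulates conceptual objects while the logical structure, serialization and storage remain swappable behind them.

```python
import json

class Annotation:                      # conceptual layer: a domain object
    def __init__(self, span, label):
        self.span, self.label = span, label

class LogicalModel:                    # logical layer: typed record structure
    def to_record(self, ann):
        return {"span": list(ann.span), "label": ann.label}
    def from_record(self, rec):
        return Annotation(tuple(rec["span"]), rec["label"])

class JsonPersistence:                 # persistence layer: serialization
    def dumps(self, records):
        return json.dumps(records)
    def loads(self, text):
        return json.loads(text)

class InMemoryStore:                   # physical layer: storage medium
    def __init__(self):
        self.blob = None
    def write(self, blob):
        self.blob = blob
    def read(self):
        return self.blob

def save(anns, logical, persist, store):
    store.write(persist.dumps([logical.to_record(a) for a in anns]))

def load(logical, persist, store):
    return [logical.from_record(r) for r in persist.loads(store.read())]
```

Replacing `JsonPersistence` or `InMemoryStore` (say, with an XML serializer or a file store) would not touch code written against `Annotation`, which is the abstraction benefit the layering claims.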
Abstract:
The continuous improvement of management and assessment processes for curricular external internships has led a group of university teachers specialised in this area to develop a mixed measurement model that combines the verification of skill acquisition by students choosing external internships with the satisfaction of the parties involved in that process: academics, educational tutors from companies and organisations, and administration and services personnel. The experience, developed at the University of Alicante, has been carried out in the degrees of Business Administration and Management, Business Studies, Economics, Advertising and Public Relations, Sociology and Social Work, all part of the Faculty of Economics and Business. By designing and managing closed standardised interviews and other research tools, validated outside the centre, a system of continuous improvement and quality assurance has been created, clearly contributing to the gradual increase in the number of students with internships in this Faculty, as well as to the improvement in satisfaction, efficiency and efficacy indicators at a global level. As this experience of educational innovation has shown, the acquisition of curricular knowledge, skills, abilities and competences by the students is directly correlated with the satisfaction of the parties involved in a process that takes the student beyond the physical borders of a university campus. Ensuring the latter is made easier by the implementation of a mixed assessment method, combining continuous and final assessment and characterised by its rigour and simple management. This report presents that model, itself subject to persistent and continuous control, in which all parties involved in the external internships take part.
Its short-term results imply an increase, estimated at 15% for the last academic year, in the number of students choosing curricular internships and, in the medium and long term, a closer interweaving between the academic world and its social and productive environment, in both the business and institutional spheres. The potential of this assessment model lies not only in the quality of its measurement tools, but also in the effects of its use on the various groups and in the actions carried out as a result of its implementation, which, as shown below, are the real guarantee of continuous improvement.
Abstract:
An empirical model based on constant flux is presented for chloride transport through concrete under atmospheric exposure conditions. A continuous supply of chlorides is assumed as a constant mass flux at the exposed concrete surface. The model is applied to experimental chloride profiles obtained from a real marine structure, and the results are compared with the classical error-function model. The proposed model shows some advantages: it yields a better predictive capacity than the classical error-function model, and the previously observed increases in chloride surface concentration are compatible with it. Nevertheless, the predictive capacity of the model can fail if the concrete microstructure changes with time. The model seems appropriate for well-matured concretes exposed to a marine environment under atmospheric conditions.
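Both boundary conditions correspond to standard solutions of Fick's second law for a semi-infinite medium (Crank's classical solutions). A sketch, with illustrative parameter values not taken from the study:

```python
import math

def c_constant_surface(x, t, cs, D):
    """Classical model: fixed surface concentration cs (erfc solution)."""
    return cs * math.erfc(x / (2 * math.sqrt(D * t)))

def c_constant_flux(x, t, flux, D):
    """Constant-flux model: a constant mass flux at the exposed surface.
    The surface concentration then grows as sqrt(t), consistent with the
    observed build-up of surface chloride mentioned above."""
    s = 2 * math.sqrt(D * t)
    return (flux / D) * (2 * math.sqrt(D * t / math.pi)
                         * math.exp(-(x / s) ** 2)
                         - x * math.erfc(x / s))
```

At x = 0 the constant-flux model reduces to 2·flux·sqrt(t/(pi·D)), so the surface concentration grows with the square root of exposure time instead of staying fixed, which is the key qualitative difference from the error-function model.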
Abstract:
We present an extension of the logic outer-approximation algorithm for dealing with disjunctive discrete-continuous optimal control problems whose dynamic behavior is modeled in terms of differential-algebraic equations. Although the proposed algorithm can be applied to a wide variety of discrete-continuous optimal control problems, we are mainly interested in problems where disjunctions are also present. Disjunctions are included to take into account only those parts of the underlying model that become relevant under certain processing conditions. By doing so, the numerical robustness of the optimization algorithm improves, since the parts of the model that are not active are discarded, leading to a reduced-size problem and avoiding potential model singularities. We test the proposed algorithm on three examples with different kinds of complex dynamic behavior. In all the case studies the number of iterations and the computational effort required to obtain the optimal solutions are modest, and the solutions are relatively easy to find.
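The idea of discarding inactive model parts can be shown with a toy disjunction. This illustrates only the modelling concept, not the outer-approximation algorithm itself: a tank whose overflow equations enter the model only when the level exceeds the weir height, so the inactive branch is never evaluated. All values are illustrative.

```python
def simulate_tank(q_in=2.0, area=1.0, h_weir=1.5, k_weir=3.0,
                  dt=0.01, t_max=10.0):
    """Euler integration of a tank level with a disjunctive overflow term."""
    h, trace = 0.0, []
    for _ in range(int(t_max / dt)):
        if h > h_weir:                       # disjunct Y: overflow active
            q_out = k_weir * (h - h_weir) ** 1.5
        else:                                # disjunct not-Y: the weir
            q_out = 0.0                      # equation is simply absent
        h += dt * (q_in - q_out) / area
        trace.append(h)
    return trace
```

Below the weir, the overflow equation (with its fractional power, singular in derivative at h = h_weir) never enters the active model, which is the flavour of the robustness argument: inactive parts cannot introduce singularities.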
Abstract:
The high hopes for rapid convergence of Eastern and Southern EU member states are increasingly being disappointed. With the onset of the Eurocrisis, convergence has given way to divergence in the southern members, and many Eastern members have made little headway in closing the development gap. The EU's performance compares unfavourably with East Asian success cases as well as with Western Europe's own rapid catch-up to the USA after 1945. Historical experience indicates that successful catch-up requires that less-developed economies be allowed, to some extent, to free-ride on an open international economic order. However, the EU's model is based on the principle of a level playing field, which militates against such a form of economic integration. The EU's developmental model thus contrasts with the various strategies that have enabled successful catch-up by industrial latecomers. Instead, the EU's current approach is more and more reminiscent of the relations between the pre-1945 European empires and their dependent territories. One reason for this unfortunate historical continuity is that the EU appears to have become entangled in its own myths. In the EU's own interpretation, European integration is a peace project designed to overcome the almost continuous warfare that characterised the Westphalian system. As the sovereign state is identified as the root cause of all evil, any project to curtail its room for manoeuvre must ultimately benefit the common good. Yet the existence of a Westphalian system of nation states is a myth. Empires, not states, were the dominant actors in the international system for at least the last three centuries. If anything, the dawn of the age of the sovereign state in Western Europe occurred after 1945 with the disintegration of the colonial empires, and thus historically coincided with the birth of European integration.
Abstract:
Addressing high and volatile natural resource prices, uncertain supply prospects, reindustrialization attempts and environmental damages related to resource use, resource efficiency has evolved into a highly debated proposal among academia, policy makers, firms and international financial institutions (IFIs). In 2011, the European Union (EU) declared resource efficiency one of the seven flagship initiatives of its Europe 2020 strategy. This paper contributes to the discussions by assessing its key initiative, the Roadmap to a Resource Efficient Europe (EC 2011 571), following two streams of evaluation. In a first step, resource efficiency is linked to two theoretical frameworks regarding sustainability: (i) the sustainability triangle (consisting of economic, social and ecological dimensions) and (ii) balanced sustainability (combining weak and strong sustainability). Both sustainability frameworks are then used to assess the degree to which the Roadmap follows the concept of sustainability. It can be concluded that it partially respects the sustainability triangle as well as balanced sustainability, primarily lacking a social dimension. In a second step, following Steger and Bleischwitz (2009), the impact of resource efficiency on competitiveness as advocated in the Roadmap is evaluated empirically. An Arellano–Bond dynamic panel data model reveals no robust impact of resource efficiency on competitiveness in the EU between 2004 and 2009 – a puzzling result. Further empirical research and enhanced data availability are needed to better understand the impacts of resource efficiency on competitiveness at the macroeconomic, microeconomic and industry levels. In that regard, strengthening the methodologies behind resource indicators seems essential. Last but certainly not least, political will is required to achieve the transition of the EU economy into a resource-efficient future.
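The estimator named above can be sketched in stripped-down form. This shows only the identification idea behind Arellano–Bond (first-differencing away the fixed effects, then instrumenting the endogenous lagged difference with a deeper lagged level), applied to simulated data as just-identified IV rather than the full GMM estimator; the data and parameters are invented for illustration.

```python
import random

def simulate_panel(n=2000, t=6, rho=0.5, seed=7):
    """Simulated dynamic panel: y_it = rho*y_i,t-1 + alpha_i + eps_it."""
    rng = random.Random(seed)
    panel = []
    for _ in range(n):
        alpha = rng.gauss(0, 1)            # unit fixed effect
        y = [alpha + rng.gauss(0, 1)]
        for _ in range(t - 1):
            y.append(rho * y[-1] + alpha + rng.gauss(0, 1))
        panel.append(y)
    return panel

def ab_iv_rho(panel):
    """First-difference to remove alpha_i, then instrument the endogenous
    lagged difference (y_t-1 - y_t-2) with the level y_t-2."""
    num = den = 0.0
    for y in panel:
        for s in range(2, len(y)):
            dy = y[s] - y[s - 1]
            dy_lag = y[s - 1] - y[s - 2]
            num += y[s - 2] * dy
            den += y[s - 2] * dy_lag
    return num / den
```

The lagged level y_t-2 is a valid instrument because it is correlated with the lagged difference but uncorrelated with the differenced error, which is why first-difference IV/GMM recovers the dynamic coefficient that simple OLS on differences would bias.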
Abstract:
This dataset contains continuous time series of land surface temperature (LST) at a spatial resolution of 300 m around the 12 experimental sites of the PAGE21 project (grant agreement number 282700, funded by the EC Seventh Framework Programme theme FP7-ENV-2011). The dataset was produced from hourly LST time series at 25 km scale, retrieved from SSM/I data (André et al., 2015, doi:10.1016/j.rse.2015.01.028) and downscaled to 300 m using a dynamic model and a particle smoothing approach. The methodology rests on two main assumptions: first, that LST spatial variability is mostly explained by land cover and soil hydric state; second, that LST is unique for a land cover class within the low-resolution pixel. Given these hypotheses, LST can be estimated using a land cover map and a physically based land surface model constrained with observations through a data assimilation process. This methodology, described in Mechri et al. (2014, doi:10.1002/2013JD020354), was applied to the ORCHIDEE land surface model (Krinner et al., 2005, doi:10.1029/2003GB002199) to estimate prior values of each land cover class provided by the ESA CCI-Land Cover product (Bontemps et al., 2013) at 300 m resolution. The assimilation process (particle smoother) consists in simulating ensembles of LST time series for each land cover class and for a large number of parameter sets. For each parameter set, the resulting temperatures are aggregated according to the grid fraction of each land cover class and compared to the coarse observations. Minimizing the distance between the aggregated model solutions and the observations allows us to select the simulated LST and the corresponding parameter sets that fit the observations most closely. The retained parameter sets are then duplicated and randomly perturbed before simulating the next time window. At the end, the most likely LST of each land cover class is estimated and used to reconstruct LST maps at 300 m resolution using ESA CCI-Land Cover.
The resulting temperature maps, on which ice pixels were masked, are provided at a daily time step over the nine-year analysis period (2000-2009).
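The selection-duplication-perturbation cycle of the particle smoother described above can be sketched schematically. The class names, fractions and temperatures below are illustrative, and this is not the ORCHIDEE/SSM-I pipeline: each "particle" is one parameter set yielding one LST per land-cover class, and particles whose cover-fraction-weighted aggregate best matches the coarse observation survive to seed the next window.

```python
import random

def select_particles(particles, fractions, coarse_obs, keep=0.25):
    """Keep the particles whose cover-weighted aggregate LST is closest to
    the coarse observation.  particles: list of {class_name: lst};
    fractions: {class_name: grid fraction}, summing to one."""
    def aggregate(p):
        return sum(fractions[c] * p[c] for c in fractions)
    ranked = sorted(particles, key=lambda p: abs(aggregate(p) - coarse_obs))
    return ranked[: max(1, int(len(ranked) * keep))]

def resample(survivors, n_out, jitter=0.5, seed=0):
    """Duplicate the retained parameter sets and randomly perturb them
    before simulating the next time window."""
    rng = random.Random(seed)
    return [{c: v + rng.gauss(0, jitter)
             for c, v in survivors[i % len(survivors)].items()}
            for i in range(n_out)]
```

The per-class LST of the surviving particles is what gets mapped back onto the 300 m land-cover grid in the final reconstruction step.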
Abstract:
Abrupt climate changes from 18 to 15 thousand years before present (kyr BP) associated with Heinrich Event 1 (HE1) had a strong impact on vegetation patterns not only at high latitudes of the Northern Hemisphere, but also in the tropical regions around the Atlantic Ocean. To gain a better understanding of the linkage between high and low latitudes, we used the University of Victoria (UVic) Earth System-Climate Model (ESCM) with dynamical vegetation and land surface components to simulate four scenarios of climate-vegetation interaction: the pre-industrial era, the Last Glacial Maximum (LGM), and a Heinrich-like event with two different climate backgrounds (interglacial and glacial). We calculated mega-biomes from the plant-functional types (PFTs) generated by the model to allow for a direct comparison between model results and palynological vegetation reconstructions. Our calculated mega-biomes for the pre-industrial period and the LGM corresponded well with biome reconstructions of the modern and LGM time slices, respectively, except that our pre-industrial simulation predicted the dominance of grassland in southern Europe and our LGM simulation resulted in more forest cover in tropical and sub-tropical South America. The HE1-like simulation with a glacial climate background produced sea-surface temperature patterns and enhanced inter-hemispheric thermal gradients in accordance with the "bipolar seesaw" hypothesis. We found that the cooling of the Northern Hemisphere caused a southward shift of those PFTs that are indicative of increased desertification and a retreat of broadleaf forests in West Africa and northern South America. The mega-biomes from our HE1 simulation agreed well with paleovegetation data from tropical Africa and northern South America.
Thus, according to our model-data comparison, the reconstructed vegetation changes for the tropical regions around the Atlantic Ocean were physically consistent with the remote effects of a Heinrich event under a glacial climate background.
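The mega-biome calculation described above amounts to aggregating PFT cover fractions into coarser classes and taking the dominant one. The PFT names and the grouping below are illustrative; the paper's exact classification rules are not reproduced here.

```python
# Hypothetical PFT-to-mega-biome grouping, for illustration only.
MEGA_BIOME = {
    "tropical broadleaf evergreen": "tropical forest",
    "tropical broadleaf raingreen": "tropical forest",
    "temperate broadleaf":          "temperate forest",
    "boreal needleleaf":            "boreal forest",
    "C3 grass":                     "grassland",
    "C4 grass":                     "grassland",
    "bare ground":                  "desert",
}

def dominant_mega_biome(pft_fractions):
    """Sum PFT cover fractions per mega-biome and return the largest."""
    totals = {}
    for pft, frac in pft_fractions.items():
        biome = MEGA_BIOME[pft]
        totals[biome] = totals.get(biome, 0.0) + frac
    return max(totals, key=totals.get)
```

Collapsing model PFTs this way is what makes gridded model output directly comparable with pollen-based biome reconstructions, which resolve biomes rather than individual plant types.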