52 results for Theories and models


Relevance:

90.00%

Publisher:

Abstract:

Crumpets are made by heating fermented batter on a hot plate at around 230°C. The characteristic structure dominated by vertical pores develops rapidly: structure has developed throughout around 75% of the product height within 30 s, which is far faster than might be expected from transient heat conduction through the batter. Cooking is complete within around 3 min. Image analysis based on results from X-ray tomography shows that the voidage fraction is approximately constant and that there is continual coalescence between the larger pores throughout the product, although there is also a steady level of small bubbles trapped within the solidified batter. We report here experimental studies which shed light on some of the mechanisms responsible for this structure, together with some models of key phenomena. Three aspects are discussed here: the role of gas (carbon dioxide and nitrogen) nuclei in initiating structure development; convective heat transfer inside the developing pores; and the kinetics of setting the batter into an elastic solid structure. It is shown conclusively that the small bubbles of carbon dioxide resulting from the fermentation stage play a crucial role as nuclei for pore development: without these nuclei, the result is not a porous structure but rather a solid, elastic, inedible, gelatinized product. These nuclei are also responsible for the tiny bubbles which are set in the final product. The nuclei form the source of the dominant pore structure, which is largely driven by the initially explosive release of water vapour from the batter together with the desorption of dissolved carbon dioxide. It is argued that the rapid evaporation, transport and condensation of steam within the growing pores provides an important mechanism, as in a heat pipe, for rapid heat transfer, and models for this process are developed and tested. The setting of the continuous batter phase is essential for final product quality: studies using differential scanning calorimetry and on the kinetics of change in the visco-elastic properties of the batter suggest that this process is driven by the kinetics of gelatinization. Unlike many thermally driven food processes, the rates of heating are such that gelatinization kinetics cannot be neglected. The implications of these results for modelling and for the development of novel structures are discussed.
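The claim that the structure develops far faster than transient conduction would allow can be checked with an order-of-magnitude estimate of the diffusive timescale t ~ L²/α. A minimal sketch, assuming representative values for penetration depth and batter thermal diffusivity (neither is given in the abstract):

```python
# Order-of-magnitude check: how long would heat take to penetrate the batter
# by conduction alone? Uses the diffusive timescale t ~ L^2 / alpha.
# The depth and diffusivity below are illustrative assumptions, not values
# reported in the paper.

L = 0.01          # depth penetrated, m (assumed ~10 mm, i.e. most of the product height)
alpha = 1.4e-7    # thermal diffusivity of a wet starch batter, m^2/s (assumed, close to water)

t_conduction = L**2 / alpha
print(f"Conduction timescale ~ {t_conduction:.0f} s (~{t_conduction/60:.1f} min)")
# ~700 s, i.e. over ten minutes -- far longer than the ~30 s over which the pore
# structure is observed to develop, consistent with the need for a faster
# mechanism such as heat-pipe-like evaporation/condensation of steam.
```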

Relevance:

90.00%

Publisher:

Abstract:

This paper argues the need for the information communication technology (ICT), labor exchange (job boards), and Human Capital ontology engineers (ontoEngineers) to jointly design and socialize an upper-level meta-ontology for people readiness and career portability. These enticing ontology research topics have yielded "independent" results, but have yet to meet the broader, or "universal", requirement that emerging frameworks demand. This paper will focus on the need to universally develop an upper-level ontology and provide the reader with concepts and models that can be transformed into marketable solutions.

Relevance:

90.00%

Publisher:

Abstract:

The SPE taxonomy of evolving software systems, first proposed by Lehman in 1980, is re-examined in this work. The primary concepts of software evolution are related to generic theories of evolution, particularly Dawkins' concept of a replicator, to the hermeneutic tradition in philosophy and to Kuhn's concept of paradigm. These concepts provide the foundations that are needed for understanding the phenomenon of software evolution and for refining the definitions of the SPE categories. In particular, this work argues that a software system should be defined as of type P if its controlling stakeholders have made a strategic decision that the system must comply with a single paradigm in its representation of domain knowledge. The proposed refinement of SPE is expected to provide a more productive basis for developing testable hypotheses and models about possible differences in the evolution of E- and P-type systems than is provided by the original scheme. Copyright (C) 2005 John Wiley & Sons, Ltd.

Relevance:

90.00%

Publisher:

Abstract:

The quality of information provision considerably influences knowledge construction driven by individual users’ needs. In the design of information systems for e-learning, personal information requirements should be incorporated to determine a selection of suitable learning content, instructive sequencing of learning content, and effective presentation of learning content. This is considered an important part of instructional design for a personalised information package. The current research reveals that there is a lack of means by which individual users’ information requirements can be effectively incorporated to support personal knowledge construction. This paper presents a method which enables the articulation of users’ requirements, grounded in established learning theories and requirements engineering paradigms. The user’s information requirements can be systematically encapsulated in a user profile (i.e. the user requirements space) and further transformed into instructional design specifications (i.e. the information space). These two spaces allow the discovery of information requirements patterns for self-maintaining and self-adapting personalisation that enhances experience in the knowledge construction process.
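The mapping from a user requirements space to an information space can be illustrated with a small data-structure sketch. The profile fields, content tags and selection rule below are hypothetical examples, not the paper's actual specification:

```python
# Illustrative sketch: a user profile (requirements space) is transformed into an
# instructional design specification (information space). Field names, tags and
# the selection/sequencing rule are hypothetical.

from dataclasses import dataclass

@dataclass
class UserProfile:                 # user requirements space
    prior_knowledge: str           # e.g. "novice", "intermediate", "advanced"
    learning_goal: str             # e.g. "databases"
    preferred_media: str           # e.g. "video", "text"

@dataclass
class LearningItem:                # candidate learning content
    topic: str
    level: str
    media: str

def design_specification(profile: UserProfile, items: list[LearningItem]) -> list[LearningItem]:
    """Select and sequence content to match the profile (information space)."""
    level_order = {"novice": 0, "intermediate": 1, "advanced": 2}
    selected = [i for i in items
                if i.topic == profile.learning_goal
                and i.media == profile.preferred_media
                and level_order[i.level] >= level_order[profile.prior_knowledge]]
    # Sequence from the user's current level upwards.
    return sorted(selected, key=lambda i: level_order[i.level])

items = [LearningItem("databases", "novice", "video"),
         LearningItem("databases", "advanced", "video"),
         LearningItem("databases", "intermediate", "video")]
print(design_specification(UserProfile("intermediate", "databases", "video"), items))
```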

Relevance:

90.00%

Publisher:

Abstract:

Molecular dynamics simulations of the events after the photodissociation of CO in the myoglobin mutant L29F, in which leucine 29 is replaced by phenylalanine, are reported. Using both classical and mixed quantum-classical molecular dynamics calculations, we observed the rapid motion of CO away from the distal heme pocket to other regions of the protein, in agreement with recent experimental results. The experimentally observed and calculated infrared spectra of CO after dissociation are also in good agreement. We compared the results with data from simulations of wild-type (WT) myoglobin. As the time resolution of experimental techniques is increased, theoretical methods and models can be validated at the atomic scale by direct comparison with experiment.
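One common route from an MD trajectory to an infrared spectrum is the Fourier transform of the dipole-moment autocorrelation function; the abstract does not state the exact scheme used here, so the sketch below is a generic illustration on synthetic data, not the paper's method:

```python
# Generic sketch: IR spectrum as the Fourier transform of the dipole-moment
# autocorrelation function along an MD trajectory. The "trajectory" is synthetic
# (a damped ~2100 cm^-1 oscillation standing in for the CO stretch); the paper's
# actual post-processing may differ.

import numpy as np

dt_fs = 0.5                                   # time step, fs
n_steps = 2**13
t = np.arange(n_steps) * dt_fs                # time axis, fs
c = 2.9979e-5                                 # speed of light, cm/fs

nu_cm = 2100.0                                # assumed CO-stretch wavenumber, cm^-1
mu = np.cos(2 * np.pi * c * nu_cm * t) * np.exp(-t / 2000.0)   # synthetic dipole signal

# One-sided autocorrelation of the dipole moment, normalized at zero lag.
acf = np.correlate(mu, mu, mode="full")[n_steps - 1:]
acf /= acf[0]

# Spectrum: magnitude of the FFT of the windowed autocorrelation function.
spectrum = np.abs(np.fft.rfft(acf * np.hanning(n_steps)))
freqs_cm = np.fft.rfftfreq(n_steps, d=dt_fs) / c   # convert 1/fs to cm^-1

peak = freqs_cm[np.argmax(spectrum)]
print(f"Peak near {peak:.0f} cm^-1")          # recovers the ~2100 cm^-1 input band
```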

Relevance:

90.00%

Publisher:

Abstract:

Accurate observations of cloud microphysical properties are needed for evaluating and improving the representation of cloud processes in climate models and for better estimates of the Earth's radiative budget. However, large differences are found in current cloud products retrieved from ground-based remote sensing measurements using various retrieval algorithms. Understanding these differences is an important step towards addressing uncertainties in the cloud retrievals. In this study, an in-depth analysis of nine existing ground-based cloud retrievals using ARM remote sensing measurements is carried out. We place emphasis on boundary-layer overcast clouds and high-level ice clouds, which are the focus of many current retrieval development efforts due to their radiative importance and relatively simple structure. Large systematic discrepancies in cloud microphysical properties are found in these two types of clouds among the nine cloud retrieval products, particularly for the cloud liquid and ice particle effective radius. Notably, the differences among some retrieval products are even larger than the prescribed uncertainties reported by the retrieval algorithm developers. It is shown that most of these large differences have their roots in the retrievals' theoretical bases and assumptions, as well as in their input and constraint parameters. This study suggests the need to further validate current retrieval theories and assumptions, and even to develop new retrieval algorithms, with more observations under different cloud regimes.
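The effective radius around which the retrieval products disagree is simply the ratio of the third to the second moment of the particle size distribution, r_eff = ∫r³n(r)dr / ∫r²n(r)dr. A minimal sketch with an assumed lognormal droplet distribution (the retrievals themselves are far more involved):

```python
# Effective radius r_eff = <r^3> / <r^2> over the droplet size distribution.
# The lognormal parameters are illustrative, not taken from any of the nine
# retrieval products discussed in the paper.

import numpy as np

r = np.linspace(0.1, 50.0, 5000)          # droplet radius grid, micrometres (uniform spacing)
r_median, sigma_g = 8.0, 1.4              # assumed median radius and geometric standard deviation
n = np.exp(-0.5 * (np.log(r / r_median) / np.log(sigma_g))**2) / r   # lognormal number distribution

# Uniform grid, so the constant dr cancels in the ratio of the two integrals.
r_eff = (r**3 * n).sum() / (r**2 * n).sum()
print(f"Effective radius ~ {r_eff:.1f} micrometres")   # ~10-11 um, a plausible liquid-cloud value
```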

Relevance:

90.00%

Publisher:

Abstract:

This is one of the first papers in which arguments are given for treating code-switching and borrowing as similar phenomena. It is argued that distinguishing the two phenomena is theoretically undesirable and empirically very problematic. A probabilistic account of code-switching and a hierarchy of switched constituents (similar to hierarchies of borrowability) are proposed, which account for the fact that some constituents are more likely to be borrowed/switched than others. It is argued that the same kinds of constraints apply to both code-switching and borrowing.

Relevance:

90.00%

Publisher:

Abstract:

During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3,chem) over the whole simulation is greater than the net physical processing (ΔO3,phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3,chem than on ΔO3,phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases. Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). Therefore, it is a useful new method to simulate air mass evolution and variability, and its sensitivity to process parameters.
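The budget framework reduces to a simple accounting identity: the observed change equals the sum of the chemical and physical contributions, ΔO3 = ΔO3,chem + ΔO3,phys. A minimal sketch with invented numbers (the paper's actual values are not reproduced here):

```python
# Toy budget: partition the net O3 change along a Lagrangian air-mass trajectory
# into chemical and physical (mixing/deposition/emission-driven) contributions.
# All numbers are invented for illustration.

days = 5
net_photochemistry_ppbv_per_day = -5.0        # net photochemical destruction (export case)
mixing_ppbv = [+2.0, +1.5, -1.0, +0.5, +1.0]  # daily O3 change from mixing with the background

delta_chem = net_photochemistry_ppbv_per_day * days
delta_phys = sum(mixing_ppbv)
delta_total = delta_chem + delta_phys

print(f"dO3_chem = {delta_chem:+.1f} ppbv")
print(f"dO3_phys = {delta_phys:+.1f} ppbv")
print(f"dO3      = {delta_total:+.1f} ppbv (what the repeated aircraft intercepts constrain)")
```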

Relevance:

90.00%

Publisher:

Abstract:

Urban metabolism considers a city as a system with flows of energy and material between it and the environment. Recent advances in the bio-physical sciences provide methods and models to estimate local-scale energy, water, carbon and pollutant fluxes. However, good communication is required to convey this new knowledge and its implications to end-users (such as urban planners, architects and engineers). The FP7 project BRIDGE (sustainaBle uRban plannIng Decision support accountinG for urban mEtabolism) aimed to address this gap by illustrating the advantages of considering these issues in urban planning. The BRIDGE Decision Support System (DSS) aids the evaluation of the sustainability of urban planning interventions. The Multi-Criteria Analysis approach adopted provides a method to cope with the complexity of urban metabolism. In consultation with targeted end-users, objectives were defined in relation to the interactions between the environmental elements (fluxes of energy, water, carbon and pollutants) and socioeconomic components (investment costs, housing, employment, etc.) of urban sustainability. The tool was tested in five case study cities: Helsinki, Athens, London, Florence and Gliwice; and sub-models were evaluated using selected flux data. This overview of the BRIDGE project covers the methods and tools used to measure and model the physical flows, the selected set of sustainability indicators, the methodological framework for evaluating urban planning alternatives and the resulting DSS prototype.
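Multi-Criteria Analysis of the kind the DSS adopts can be illustrated with a simple weighted-sum scoring of planning alternatives; the indicators, weights and scores below are hypothetical, not BRIDGE outputs:

```python
# Illustrative Multi-Criteria Analysis: score planning alternatives against a mix
# of environmental and socioeconomic indicators. Indicators, weights and the
# normalized scores (0-1, higher is better) are invented for this sketch.

weights = {"energy_flux": 0.25, "water_flux": 0.15, "carbon_flux": 0.25,
           "investment_cost": 0.20, "employment": 0.15}

alternatives = {
    "green_roofs":      {"energy_flux": 0.7, "water_flux": 0.8, "carbon_flux": 0.6,
                         "investment_cost": 0.5, "employment": 0.4},
    "district_heating": {"energy_flux": 0.9, "water_flux": 0.4, "carbon_flux": 0.8,
                         "investment_cost": 0.3, "employment": 0.6},
}

def mca_score(scores: dict, weights: dict) -> float:
    """Weighted sum of normalized indicator scores."""
    return sum(weights[k] * scores[k] for k in weights)

for name, scores in alternatives.items():
    print(f"{name}: {mca_score(scores, weights):.2f}")
```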

Relevance:

90.00%

Publisher:

Abstract:

Question: What plant properties might define plant functional types (PFTs) for the analysis of global vegetation responses to climate change, and what aspects of the physical environment might be expected to predict the distributions of PFTs? Methods: We review principles to explain the distribution of key plant traits as a function of bioclimatic variables. We focus on those whole-plant and leaf traits that are commonly used to define biomes and PFTs in global maps and models. Results: Raunkiær's plant life forms (underlying most later classifications) describe different adaptive strategies for surviving low temperature or drought, while satisfying requirements for reproduction and growth. Simple conceptual models and published observations are used to quantify the adaptive significance of leaf size for temperature regulation, leaf consistency for maintaining transpiration under drought, and phenology for the optimization of annual carbon balance. A new compilation of experimental data supports the functional definition of tropical, warm-temperate, temperate and boreal phanerophytes based on mechanisms for withstanding low temperature extremes. Chilling requirements are less well quantified, but are a necessary adjunct to cold tolerance. Functional traits generally confer both advantages and restrictions; the existence of trade-offs contributes to the diversity of plants along bioclimatic gradients. Conclusions: Quantitative analysis of plant trait distributions against bioclimatic variables is becoming possible; this opens up new opportunities for PFT classification. A PFT classification based on bioclimatic responses will need to be enhanced by information on traits related to competition, successional dynamics and disturbance.
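The functional definition of phanerophyte classes by their tolerance of low-temperature extremes lends itself to a simple threshold classification; the temperature thresholds below are illustrative assumptions, not the fitted limits from the paper's data compilation:

```python
# Illustrative bioclimatic classification of woody plants (phanerophytes) by the
# absolute minimum temperature they can withstand. Thresholds are assumed for
# this sketch and are not the limits derived in the paper.

def phanerophyte_class(t_min_celsius: float) -> str:
    """Assign a broad phanerophyte class from the coldest temperature experienced."""
    if t_min_celsius > 0:
        return "tropical"        # no tolerance of freezing
    if t_min_celsius > -15:
        return "warm-temperate"  # tolerates moderate frost
    if t_min_celsius > -40:
        return "temperate"       # survives deep frost
    return "boreal"              # survives extreme cold

for site, t_min in {"Singapore": 20.0, "Rome": -5.0, "Beijing": -20.0, "Yakutsk": -55.0}.items():
    print(f"{site}: {phanerophyte_class(t_min)}")
```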

Relevance:

90.00%

Publisher:

Abstract:

Objectives. This paper considers the intersection of Corporate Social Responsibility (CSR) and social entrepreneurship in South Africa through the lens of institutional theories and draws upon a number of illustrative case study examples. In particular it: (1) charts the historically evolving relationship between CSR and social entrepreneurship in South Africa, and how this relationship has been informed by institutional changes since the end of apartheid, particularly over the last few years; (2) identifies different interactional relationship forms between social enterprises and corporates engaging in CSR, with an emphasis on new innovative multi-stakeholder partnerships; and (3) considers internal engagements with social responsibility by SME social enterprises in South Africa. Prior Work. Reflecting South Africa’s history of division, the controversial role of business during apartheid, and the ongoing legacies of that period, the South African government has been particularly pro-active in encouraging companies to contribute to development and societal transformation through CSR and Black Economic Empowerment (BEE). Accordingly, a substantial body of work now exists examining and critically reflecting upon CSR and BEE across a range of sectors. In response to perceived problems with BEE, efforts have recently been made to foster broader-based economic empowerment. However, the implications of these transitions for the relationship between CSR and social entrepreneurship in South Africa have received scant academic attention. Approach. Analysis is undertaken of legislative and policy changes in South Africa with a bearing on CSR and social entrepreneurship. Data collected during fieldwork in South Africa working with six social enterprise case studies is utilised, including qualitative data from key informant interviews, focus groups with stakeholders and observational research. Results. The paper considers the historically evolving relationship between CSR and social entrepreneurship in South Africa informed by institutional change. Five different relationship forms are identified and illustrated with reference to case examples. Finally, internal engagement with social responsibility concerns by small and medium social enterprises is critically discussed. Implications. This paper sheds light on some of the innovative partnerships emerging between corporates and social enterprises in South Africa. It reflects on some of the strengths and weaknesses of South Africa’s policy and legislative approaches. Value. The paper provides insights useful for academic and practitioner audiences. It also has policy relevance, in particular for other African countries potentially looking to follow South Africa’s example, in the development of legislative and policy frameworks to promote corporate responsibility, empowerment and transformation.

Relevance:

90.00%

Publisher:

Abstract:

In recent years, research into the impact of genetic abnormalities on cognitive development, including language, has become recognized for its potential to make valuable contributions to our understanding of the brain–behaviour relationships underlying language acquisition as well as to understanding the cognitive architecture of the human mind. The publication of Fodor’s (1983) book The Modularity of Mind has had a profound impact on the study of language and the cognitive architecture of the human mind. Its central claim is that many of the processes involved in comprehension are undertaken by special brain systems termed ‘modules’. This domain specificity of language, or modularity, has become a fundamental feature that differentiates competing theories and accounts of language acquisition (Fodor 1983, 1985; Levy 1994; Karmiloff-Smith 1998). However, although the fact that the adult brain is modularized is hardly disputed, there are different views of how brain regions become specialized for specific functions. A question of some interest to theorists is whether the human brain is modularized from the outset (nativist view) or whether these distinct brain regions develop as a result of biological maturation and environmental input (neuroconstructivist view). One source of insight into these issues has been the study of developmental disorders, and in particular genetic syndromes, such as Williams syndrome (WS) and Down syndrome (DS). Because of their uneven profiles characterized by dissociations of different cognitive skills, these syndromes can help us address theoretically significant questions. Investigations into the linguistic and cognitive profiles of individuals with these genetic abnormalities have been used as evidence to advance theoretical views about innate modularity and the cognitive architecture of the human mind. The present chapter will be organized as follows. To begin, two different theoretical proposals in the modularity debate will be presented. Then studies of linguistic abilities in WS and in DS will be reviewed. Here, the emphasis will be mainly on WS due to the fact that theoretical debates have focused primarily on WS, there is a larger body of literature on WS, and DS subjects have typically been used for the purposes of comparison. Finally, the modularity debate will be revisited in light of the literature review of both WS and DS. Conclusions will be drawn regarding the contribution of these two genetic syndromes to the issue of cognitive modularity, and in particular innate modularity.

Relevance:

90.00%

Publisher:

Abstract:

The response of the Southern Ocean to a repeating seasonal cycle of ozone loss is studied in two coupled climate models and found to comprise both fast and slow processes. The fast response is similar to the inter-annual signature of the Southern Annular Mode (SAM) on Sea Surface Temperature (SST), on to which the ozone-hole forcing projects in the summer. It comprises enhanced northward Ekman drift inducing negative summertime SST anomalies around Antarctica, earlier sea ice freeze-up the following winter, and northward expansion of the sea ice edge year-round. The enhanced northward Ekman drift, however, results in upwelling of warm waters from below the mixed layer in the region of seasonal sea ice. With sustained bursts of westerly winds induced by ozone depletion, this warming from below eventually dominates over the cooling from anomalous Ekman drift. The resulting slow-timescale response (years to decades) leads to warming of SSTs around Antarctica and ultimately a reduction in sea ice cover year-round. This two-timescale behavior, rapid cooling followed by slow but persistent warming, is found in the two coupled models analysed: one with an idealized geometry, the other a complex global climate model with realistic geometry. Processes that control the timescale of the transition from cooling to warming, and their uncertainties, are described. Finally, we discuss the implications of our results for rationalizing previous studies of the effect of the ozone hole on SST and sea ice extent.
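The competition between fast Ekman-driven cooling and slow warming from anomalous upwelling can be caricatured as the sum of two step responses with very different timescales; the amplitudes and timescales below are illustrative only and not taken from either model:

```python
# Caricature of the two-timescale SST response to a step increase in the westerlies:
# a fast cooling term (anomalous Ekman heat transport) plus a slow warming term
# (upwelling of warm water from below the mixed layer). Amplitudes and timescales
# are illustrative, not fitted to either coupled model.

import numpy as np

t = np.linspace(0.0, 40.0, 401)          # years since the ozone-hole forcing is switched on
tau_fast, tau_slow = 1.0, 15.0           # assumed response timescales, years
a_fast, a_slow = -0.6, 0.8               # assumed equilibrium amplitudes, K

sst_anom = a_fast * (1 - np.exp(-t / tau_fast)) + a_slow * (1 - np.exp(-t / tau_slow))

crossover = t[np.argmax(sst_anom > 0)]   # first time the anomaly turns positive
print(f"SST anomaly is negative (cooling) until ~year {crossover:.0f}, then warming dominates")
```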

Relevance:

90.00%

Publisher:

Abstract:

Though many global aerosol models prognose surface deposition, only a few models have been used to directly simulate the radiative effect from black carbon (BC) deposition to snow and sea ice. Here, we apply aerosol deposition fields from 25 models contributing to two phases of the Aerosol Comparisons between Observations and Models (AeroCom) project to simulate and evaluate within-snow BC concentrations and radiative effect in the Arctic. We accomplish this by driving the offline land and sea ice components of the Community Earth System Model with different deposition fields and meteorological conditions from 2004 to 2009, during which an extensive field campaign of BC measurements in Arctic snow occurred. We find that models generally underestimate BC concentrations in snow in northern Russia and Norway, while overestimating BC amounts elsewhere in the Arctic. Although simulated BC distributions in snow are poorly correlated with measurements, mean values are reasonable. The multi-model mean (range) bias in BC concentrations, sampled over the same grid cells, snow depths, and months of measurements, is −4.4 (−13.2 to +10.7) ng g⁻¹ for an earlier phase of AeroCom models (phase I), and +4.1 (−13.0 to +21.4) ng g⁻¹ for a more recent phase of AeroCom models (phase II), compared to the observational mean of 19.2 ng g⁻¹. Factors determining model BC concentrations in Arctic snow include Arctic BC emissions, transport of extra-Arctic aerosols, precipitation, deposition efficiency of aerosols within the Arctic, and meltwater removal of particles in snow. Sensitivity studies show that the model–measurement evaluation is only weakly affected by meltwater scavenging efficiency because most measurements were conducted in non-melting snow. The Arctic (60–90° N) atmospheric residence time for BC in phase II models ranges from 3.7 to 23.2 days, implying large inter-model variation in local BC deposition efficiency. Combined with the fact that most Arctic BC deposition originates from extra-Arctic emissions, these results suggest that aerosol removal processes are a leading source of variation in model performance. The multi-model mean (full range) of Arctic radiative effect from BC in snow is 0.15 (0.07–0.25) W m⁻² and 0.18 (0.06–0.28) W m⁻² in phase I and phase II models, respectively. After correcting for model biases relative to observed BC concentrations in different regions of the Arctic, we obtain a multi-model mean Arctic radiative effect of 0.17 W m⁻² for the combined AeroCom ensembles. Finally, there is a high correlation between modeled BC concentrations sampled over the observational sites and the Arctic as a whole, indicating that the field campaign provided a reasonable sample of the Arctic.
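The multi-model mean and range bias quoted above is a straightforward statistic once each model is sampled at the observation points; a minimal sketch with invented per-model values (only the observational mean is taken from the abstract):

```python
# Sketch of the multi-model bias statistic: each model's simulated BC-in-snow
# concentration is sampled at the same grid cells, snow depths and months as the
# observations, then compared with the observational mean. The per-model values
# below are invented for illustration.

import numpy as np

obs_mean_ng_per_g = 19.2                          # observational mean from the field campaign

# Hypothetical per-model means at the observation points (ng g^-1), one ensemble.
model_means = np.array([10.5, 14.8, 23.0, 30.1, 17.2, 6.3, 25.6])

biases = model_means - obs_mean_ng_per_g
print(f"multi-model mean bias: {biases.mean():+.1f} ng g^-1 "
      f"(range {biases.min():+.1f} to {biases.max():+.1f})")
```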