964 results for Add-06_4
Abstract:
We describe experiments designed to explore the possibility of using amyloid fibrils as new nanoscale biomaterials for promoting and exploiting cell adhesion, migration and differentiation in vitro. We created peptides that add the biological cell adhesion sequence (RGD) or a control sequence (RAD) to the C-terminus of an 11-residue peptide corresponding to residues 105-115 of the amyloidogenic protein transthyretin. These peptides readily self-assemble in aqueous solution to form amyloid fibrils, and X-ray fibre diffraction shows that they possess the same strand and sheet spacing in the characteristic cross-beta structure as do fibrils formed by the parent peptide. We report that the fibrils containing the RGD sequence are bioactive and that these fibrils interact specifically with cells via the RGD group displayed on the fibril surface. As the design of such functionalized fibrils can be systematically altered, these findings suggest that it will be possible to generate nanomaterials based on amyloid fibrils that are tailored to promote interactions with a wide variety of cell types. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Modern organisms are adapted to a wide variety of habitats and lifestyles. The processes of evolution have led to the complex, interdependent, well-designed mechanisms of today's world, and the research challenge is to transpose these innovative solutions to resolve problems in the context of architectural design practice, e.g., to relate design by nature to design by humans. In a design-by-human environment, design synthesis can be performed with rapid prototyping techniques that make it possible to transform almost instantaneously any 2D design representation into a physical three-dimensional model using a rapid prototyping printer. Rapid prototyping processes add layers of material one on top of another until a complete model is built, and an analogy can be established with design by nature, where the natural laying down of earth layers shapes the earth's surface, a natural process occurring repeatedly over long periods of time. Concurrence in design will particularly benefit from rapid prototyping techniques, as the prime purpose of physical prototyping is to promptly assist iterative design, enabling design participants to work with a three-dimensional hardcopy and use it to validate their design ideas. Concurrent design is a systematic approach aiming to facilitate the simultaneous involvement and commitment of all participants in the building design process, enabling both an effective reduction of time and costs at the design phase and a quality improvement of the design product. This paper presents the results of an exploratory survey investigating both how computer-aided design systems help designers to fully define the shape of their design ideas and the extent to which design practices apply rapid prototyping technologies coupled with Internet facilities. The findings suggest that design practitioners recognize that these technologies can greatly enhance concurrence in design, while acknowledging a lack of knowledge in relation to rapid prototyping.
Abstract:
Background: Consistency of performance across tasks that assess syntactic comprehension in aphasia has clinical and theoretical relevance. In this paper we add to the relatively sparse previous work on how sentence comprehension abilities are influenced by the nature of the assessment task. Aims: Our aims are: (1) to compare linguistic performance across sentence-picture matching, enactment, and truth-value judgement tasks; (2) to investigate the impact of pictorial stimuli on syntactic comprehension. Methods & Procedures: We tested a group of 10 aphasic speakers (3 with fluent and 7 with non-fluent aphasia) on three tasks (Experiment 1): (i) sentence-picture matching with four pictures, (ii) sentence-picture matching with two pictures, and (iii) enactment. A further task of truth-value judgement was given to a subgroup of these speakers (n=5, Experiment 2). Similar sentence types were used across all tasks and included canonical (actives, subject clefts) and non-canonical (passives, object clefts) sentences. We undertook two types of analyses: (a) we compared canonical and non-canonical sentences in each task; (b) we compared performance between (i) actives and passives, and (ii) subject and object clefts in each task. We examined the results of all participants as a group and as a case series. Outcomes & Results: Several task effects emerged. Overall, the two-picture sentence-picture matching and enactment tasks were more discriminating than the four-picture condition. Group performance in the truth-value judgement task was similar to two-picture sentence-picture matching and enactment. At the individual level, performance across tasks contrasted with some group results. Conclusions: Our findings revealed task effects across participants. We discuss reasons that could explain the diverse profiles of performance and the implications for clinical practice.
Abstract:
There are still major challenges in the area of automatic indexing and retrieval of digital data. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieval of such data based on the semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. Research has been ongoing for a few years in the field of ontological engineering with the aim of using ontologies to add knowledge to information. In this paper we describe the architecture of a system designed to automatically and intelligently index huge repositories of special effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval.
Abstract:
Automatic indexing and retrieval of digital data poses major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieval of such data based on the semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system (Dynamic REtrieval Analysis and semantic metadata Management (DREAM)) designed to automatically and intelligently index huge repositories of special effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase, supporting the storage, indexing and retrieval of large data sets of special effects video clips as an exemplar application domain. This paper provides its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes. (C) 2009 Published by Elsevier B.V.
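The abstracts above do not spell out the indexing mechanics, but the core idea (a network of ontologies so that retrieval follows semantic relations rather than exact keyword matches) can be illustrated with a deliberately tiny sketch. Everything below — the toy ontology, the tags, and the clip identifiers — is hypothetical and is not drawn from the DREAM system itself.

```python
from collections import deque

# Toy "ontology": each concept maps to semantically related concepts.
# (Purely hypothetical labels -- a real ontology network is far richer.)
ONTOLOGY = {
    "explosion": ["fire", "smoke", "debris"],
    "fire": ["smoke", "flame"],
    "water": ["splash", "rain", "flood"],
    "flood": ["water", "debris"],
}

# Toy clip index: clip id -> semantic tags assigned at ingest time.
CLIP_INDEX = {
    "clip_001": {"fire", "smoke"},
    "clip_002": {"splash", "rain"},
    "clip_003": {"debris", "flood"},
}


def expand_query(term, max_depth=2):
    """Breadth-first expansion of a query term over the ontology graph."""
    seen = {term}
    frontier = deque([(term, 0)])
    while frontier:
        concept, depth = frontier.popleft()
        if depth >= max_depth:
            continue
        for related in ONTOLOGY.get(concept, []):
            if related not in seen:
                seen.add(related)
                frontier.append((related, depth + 1))
    return seen


def retrieve(term):
    """Return clips whose tags overlap the expanded concept set."""
    concepts = expand_query(term)
    return sorted(cid for cid, tags in CLIP_INDEX.items() if tags & concepts)


if __name__ == "__main__":
    # A query for "explosion" also retrieves clips tagged fire/smoke/debris,
    # even though none of them carries the literal keyword.
    print(retrieve("explosion"))
```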
Abstract:
The credit arrangements between the three Edwards and Italian merchants were crucial for financing England’s ambitious foreign policies and ensuring the smooth running of governmental administration. The functioning of this credit system can be followed in detail through the well-kept but mostly unpublished records of the English Exchequer. This volume combines a transcription of the most important surviving accounts between the merchants and the Crown, with a parallel abstract presenting the core data in a double-entry format as credits to or debits from the king's account. This dual format was chosen to facilitate the interpretation of the source while still retaining the language and, as far as possible, the structure of the original documents. The wealth of evidence presented here has much value to add to our understanding of the financing of medieval government and the early development of banking services provided by Italian merchant societies. In particular, although the relationship between king and banker was, for the most part, mutually profitable, the English kings also acquired a reputation for defaulting on their debts and thus 'breaking' a succession of merchant societies. These documents provide an essential basis for a re-examination of the 'credit rating' of the medieval English Crown.
Abstract:
We consider problems of splitting and connectivity augmentation in hypergraphs. In a hypergraph G = (V + s, E), to split two edges su, sv is to replace them with a single edge uv. We are interested in doing this in such a way as to preserve a defined level of connectivity in V. The splitting technique is often used as a way of adding new edges into a graph or hypergraph, so as to augment the connectivity to some prescribed level. We begin by providing a short history of work done in this area. Then several preliminary results are given in a general form so that they may be used to tackle several problems. We then analyse the hypergraphs G = (V + s, E) for which there is no split preserving the local edge-connectivity present in V. We provide two structural theorems, one of which implies a slight extension to Mader's classical splitting theorem. We also provide a characterisation of the hypergraphs for which there is no such "good" split and a splitting result concerned with a specialisation of the local-connectivity function. We then use our splitting results to provide an upper bound on the smallest number of size-two edges we must add to any given hypergraph to ensure that in the resulting hypergraph we have λ(x, y) ≥ r(x, y) for all x, y in V, where r is an integer-valued, symmetric requirement function on V × V. This is the so-called "local edge-connectivity augmentation problem" for hypergraphs. We also provide an extension to a theorem of Szigeti about augmenting to satisfy a requirement r, but using hyperedges. Next, in a result born of collaborative work with Zoltán Király from Budapest, we show that the local-connectivity augmentation problem is NP-complete for hypergraphs. Lastly we concern ourselves with an augmentation problem that includes a locational constraint. The premise is that we are given a hypergraph H = (V, E) with a bipartition P = {P1, P2} of V and asked to augment it with size-two edges, so that the result is k-edge-connected and has no new edge contained in some Pi. We consider the splitting technique and describe the obstacles that prevent us from forming "good" splits. From this we deduce results about which hypergraphs have a complete Pk-split. This leads to a minimax result on the optimal number of edges required and a polynomial algorithm to provide an optimal augmentation.
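As a purely illustrative aside, the splitting operation described above (replacing the two edges su and sv with the single edge uv) is easy to express in code. The sketch below uses a made-up list-of-frozensets representation of the edge multiset and omits the max-flow style computation that would be needed to check that a split actually preserves λ(x, y) ≥ r(x, y).

```python
# Minimal sketch of the splitting-off operation on a hypergraph whose edges
# are stored as a list of frozensets; the vertex s is kept as a plain label.

def split_off(edges, s, u, v):
    """Replace the two edges {s,u} and {s,v} with the single edge {u,v}."""
    edges = list(edges)
    for pair in (frozenset({s, u}), frozenset({s, v})):
        edges.remove(pair)          # raises ValueError if the edge is absent
    edges.append(frozenset({u, v}))
    return edges


if __name__ == "__main__":
    E = [frozenset({"s", "a"}), frozenset({"s", "b"}),
         frozenset({"a", "b", "c"})]          # one genuine hyperedge
    print(split_off(E, "s", "a", "b"))        # {s,a} and {s,b} become {a,b}
```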
Abstract:
Major construction clients are increasingly looking to procure built facilities on the basis of added value, rather than capital cost. Recent advances in the procurement of construction projects have emphasised a whole-life value approach to meeting the client's objectives, with strategies put in place to encourage long-term commitment and through-life service provision. Construction firms are therefore increasingly required to take on responsibility for the operation and maintenance of the construction project on the client's behalf, with the emphasis on value and service. This inevitably throws up a host of challenges, not the least of which is the need for construction firms to manage and accommodate the new emphasis on service. Indeed, these 'service-led' projects represent a new realm of construction projects where the rationale for the project is driven by the client's objectives, with some aspect of service provision. This vision of downstream service delivery increases the number of stakeholders, adds to project complexity and challenges deeply ingrained working practices. Ultimately it presents a major challenge for the construction sector. This paper sets out to unravel some of the many implications that this change brings with it. It draws upon ongoing research investigating how construction firms can adapt to a more service-orientated built environment and add value in project-based environments. The conclusions lay bare the challenges that firms face when trying to compete on the basis of added value and service delivery, in particular how this affects deeply ingrained working practices and established relationships in the sector.
Abstract:
A novel extension to Kohonen's self-organising map, called the plastic self-organising map (PSOM), is presented. The PSOM is unlike other networks in that it has only one phase of operation: unlike the SOM and its variants, it does not go through a training cycle before testing. Each pattern is thus treated identically for all time. The algorithm uses a graph structure to represent data and can add or remove neurons to learn dynamic nonstationary pattern sets. The network is tested on a real-world radar application and an artificial nonstationary problem.
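The abstract does not give the PSOM update rules, so the following is only a loose, hypothetical sketch of an online map that grows and prunes nodes while processing a non-stationary stream. The thresholds and update steps are invented for illustration and are not the published PSOM algorithm.

```python
import math
import random

# Illustrative growing/pruning online map: add a node when an input is poorly
# represented, move the winning node towards the input otherwise, and prune
# nodes that have not won for a long time (so old regimes are forgotten).

class GrowingMap:
    def __init__(self, new_node_dist=1.0, lr=0.2, max_age=200):
        self.nodes = []                 # each node is [weight_vector, last_win_step]
        self.new_node_dist = new_node_dist
        self.lr = lr
        self.max_age = max_age
        self.step = 0

    def update(self, x):
        self.step += 1
        if not self.nodes:
            self.nodes.append([list(x), self.step])
            return
        best = min(self.nodes, key=lambda n: math.dist(n[0], x))
        if math.dist(best[0], x) > self.new_node_dist:
            # Input lies far from every node: grow the map at the input.
            self.nodes.append([list(x), self.step])
        else:
            # Move the winner towards the input and mark it as recently used.
            best[0] = [w + self.lr * (xi - w) for w, xi in zip(best[0], x)]
            best[1] = self.step
        # Prune stale nodes.
        self.nodes = [n for n in self.nodes if self.step - n[1] <= self.max_age]


if __name__ == "__main__":
    random.seed(0)
    m = GrowingMap()
    # A non-stationary stream: the cluster centre jumps halfway through.
    for t in range(1000):
        centre = (0.0, 0.0) if t < 500 else (5.0, 5.0)
        x = [centre[0] + random.gauss(0, 0.3), centre[1] + random.gauss(0, 0.3)]
        m.update(x)
    print(len(m.nodes), [[round(w, 2) for w in n[0]] for n in m.nodes[:3]])
```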
Abstract:
The volume–volatility relationship during the dissemination stages of information flow is examined by analyzing various theories relating volume and volatility as complementary rather than competing models. The mixture of distributions hypothesis, the sequential arrival of information hypothesis, the dispersion of beliefs hypothesis, and the noise trader hypothesis all add to the understanding of how volume and volatility interact for different types of futures traders. An integrated picture of the volume–volatility relationship is provided by investigating the dynamic linear and nonlinear associations between volatility and the volume of informed (institutional) and uninformed (the general public) traders. In particular, the trading behavior explanation for the persistence of futures volatility, the effect of the timing of private information arrival, and the response of institutional traders to excess noise trading risk are examined.
Abstract:
The ASTER Global Digital Elevation Model (GDEM) has made elevation data at 30 m spatial resolution freely available, enabling reinvestigation of morphometric relationships derived from limited field data using much larger sample sizes. These data are used to analyse a range of morphometric relationships derived for dunes (between dune height, spacing, and equivalent sand thickness) in the Namib Sand Sea, which was chosen because there are a number of extant studies that could be used for comparison with the results. The relative accuracy of GDEM for capturing dune height and shape was tested against multiple individual ASTER DEM scenes and against field surveys, highlighting the smoothing of the dune crest and resultant underestimation of dune height, and the omission of the smallest dunes, because of the 30 m sampling of ASTER DEM products. It is demonstrated that morphometric relationships derived from GDEM data are broadly comparable with relationships derived by previous methods, across a range of different dune types. The data confirm patterns of dune height, spacing and equivalent sand thickness mapped previously in the Namib Sand Sea, but add new detail to these patterns.
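As an illustration of the kind of morphometric measurement involved, the sketch below extracts dune height and crest-to-crest spacing from a single elevation transect sampled at the 30 m grid spacing mentioned above. The simple peak-finding rules are assumptions made for the example, not the procedure used in the study.

```python
# Hedged sketch: dune height and crest spacing from one 30 m-resolution
# elevation profile (a transect across the dunes, elevations in metres).

def crest_indices(profile):
    """Indices of simple local maxima in the elevation profile."""
    return [i for i in range(1, len(profile) - 1)
            if profile[i] > profile[i - 1] and profile[i] > profile[i + 1]]


def dune_metrics(profile, cell=30.0):
    """Crest heights above the deeper adjacent interdune low, and crest spacing."""
    crests = crest_indices(profile)
    heights, spacings = [], []
    for i, c in enumerate(crests):
        lo = crests[i - 1] if i > 0 else 0
        hi = crests[i + 1] if i + 1 < len(crests) else len(profile) - 1
        trough = min(min(profile[lo:c + 1]), min(profile[c:hi + 1]))
        heights.append(profile[c] - trough)
        if i > 0:
            spacings.append((c - crests[i - 1]) * cell)
    return heights, spacings


if __name__ == "__main__":
    profile = [10, 12, 18, 14, 11, 13, 22, 16, 12, 15, 20, 13]
    heights, spacings = dune_metrics(profile)
    print(heights, spacings)   # heights in metres, spacings in metres along the transect
```

Note that, as the abstract points out, the 30 m sampling smooths crests and misses the smallest dunes, so heights estimated this way are lower bounds on the true relief.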
Abstract:
This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator skewness parameters only control tail entropy and an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for moments, generating function and quantiles. The model parameters are estimated by maximum likelihood and the usefulness of the new class is illustrated by means of some real data sets.
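To make the construction concrete, the sketch below draws samples from a GBG distribution with a standard-normal parent, using the common parameterisation of this family in which the generator has parameters (a, b, c) and a draw can be obtained as X = G⁻¹(V^(1/c)) with V ~ Beta(a, b). The parameter values and the choice of a normal parent are arbitrary illustrations, not taken from the article.

```python
import random
from statistics import NormalDist

# Under the usual GBG parameterisation the density is
#   f(x) = c * g(x) * G(x)**(a*c - 1) * (1 - G(x)**c)**(b - 1) / B(a, b),
# where G is the parent CDF with density g.  Equivalently, if V ~ Beta(a, b)
# then X = G^{-1}(V**(1/c)) follows the GBG distribution, which is what we use.
PARENT = NormalDist(mu=0.0, sigma=1.0)   # example parent; any CDF G would do


def gbg_sample(a, b, c, n, rng=random):
    """Draw n samples from the GBG distribution with a standard-normal parent."""
    samples = []
    for _ in range(n):
        v = rng.betavariate(a, b)                   # V ~ Beta(a, b)
        samples.append(PARENT.inv_cdf(v ** (1.0 / c)))
    return samples


if __name__ == "__main__":
    random.seed(1)
    xs = gbg_sample(a=2.0, b=3.0, c=1.5, n=5)
    print([round(x, 3) for x in xs])
    # With a = b = c = 1 the construction reduces to the parent distribution itself.
```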
Abstract:
This paper sets out progress during the first eighteen months of doctoral research into the City of London office market. The overall aim of the research is to explore relationships between office rents and the economy in the UK over the last 150 years. To do this, a database of lettings has been created from which a long run index of City office rents can be constructed. With this index, it should then be possible to analyse trends in rents and relationships with their long run determinants. The focus of this paper is on the creation of the rent database. First, it considers the existing secondary sources of long run rental data for the UK. This highlights a lack of information for years prior to 1970 and the need for primary data collection if earlier periods are to be studied. The paper then discusses the selection of the City of London and of the time period chosen for research. After this, it describes how a dataset covering the period 1860-1960 has been assembled using the records of property companies active in the City office market. It is hoped that, if successful, this research will contribute to existing knowledge on the long run characteristics of commercial real estate. In particular, it should add a price dimension (rents) to the existing long run information on stock/supply and investment. Hence, it should enable a more complete picture of the development and performance of commercial real estate through time to be gained.
Abstract:
Modern Portfolio Theory (MPT) has been advocated as a more rational approach to the construction of real estate portfolios. The application of MPT can now be achieved with relative ease using the powerful facilities of modern spreadsheets, and does not necessarily need specialist software. This capability is to be found in an add-in tool, now included in several spreadsheets, called an Optimiser or Solver. The value of using this kind of more sophisticated analysis feature of spreadsheets is increasingly difficult to ignore. This paper examines the use of the spreadsheet Optimiser in handling asset allocation problems. Using the Markowitz mean-variance approach, the paper introduces the necessary calculations and shows, by means of an elementary example implemented in Microsoft Excel, how the Optimiser may be used. Emphasis is placed on understanding the inputs to and outputs from the portfolio optimisation process, and the danger of treating the Optimiser as a black box is discussed.
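To show the kind of calculation the Optimiser performs outside a spreadsheet, the sketch below solves a small Markowitz problem: minimum portfolio variance for a required return, with the weights summing to one (short sales allowed here for simplicity, and solved in closed form via the first-order conditions rather than by an iterative Solver). The three assets and their statistics are invented for illustration.

```python
import numpy as np

# Hypothetical inputs: expected returns and covariance matrix for three assets.
mu = np.array([0.08, 0.10, 0.13])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
target = 0.10                      # required portfolio return


def min_variance_weights(mu, cov, target):
    """Solve: min w'Σw  s.t.  w'μ = target and w'1 = 1 (no sign constraints)."""
    n = len(mu)
    ones = np.ones(n)
    # Lagrangian first-order conditions give a linear system in (w, λ1, λ2).
    top = np.hstack([2 * cov, mu[:, None], ones[:, None]])
    mid = np.hstack([mu, [0.0, 0.0]])
    bot = np.hstack([ones, [0.0, 0.0]])
    A = np.vstack([top, mid, bot])
    b = np.concatenate([np.zeros(n), [target, 1.0]])
    return np.linalg.solve(A, b)[:n]


if __name__ == "__main__":
    w = min_variance_weights(mu, cov, target)
    print("weights:", np.round(w, 4))
    print("return :", round(float(w @ mu), 4))
    print("st.dev.:", round(float(np.sqrt(w @ cov @ w)), 4))
```

A spreadsheet Optimiser reaches the same answer numerically, and adding bounds on the weights (for example, no short selling) turns the problem into the constrained form that genuinely requires such an iterative solver.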
Abstract:
The paper conceptualises and explores the links between cities, commerce, urbanism and cultural planning by drawing on Temple Bar in Dublin as an example of how, by linking these concepts to practice in real, concrete situations, urban life or urban culture can be created and/or revitalised. Temple Bar is Dublin's emerging cultural quarter, an experiment in urban revitalisation that is deliberately focused on culture and urbanism as ways of rediscovering the good city. It has attracted considerable interest from across Europe, and has secured EC funding to kick-start the process of renewal. The author was appointed by the Irish Government to prepare the area management and development strategy for Temple Bar in 1990. Wary of the dangers of property-led regeneration and of the destructive impacts of sudden or cataclysmic change, the agencies in Temple Bar have deliberately adopted a strategic management approach to the area. This is referred to as 'urban stewardship', a process of looking after and respecting a place, and helping it to help itself. The paper explores whether there is a 'culture of cities' and whether it is possible to recreate an urban culture. Following Raymond Williams, an anthropological definition of culture is employed: "... a particular way of life, which expresses certain meaning and values not only in art and learning but also in institutional and ordinary behaviour". Rather than being simply an add-on to the serious concerns of economic development and the built environment, culture has both helped shape, and continues to develop in, the streets, spaces and buildings of the city.